Science.gov

Sample records for video ccd camera

  1. CCD Camera

    DOEpatents

    Roth, Roger R. (Minnetonka, MN)

    1983-01-01

A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  2. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  3. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

Two charge coupled device (CCD) camera systems are introduced, with brief descriptions of the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development continues in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  4. Biofeedback control analysis using a synchronized system of two CCD video cameras and a force-plate sensor

    NASA Astrophysics Data System (ADS)

    Tsuruoka, Masako; Shibasaki, Ryosuke; Murai, Shunji

    1999-01-01

The biofeedback control analysis of human movement has become increasingly important in rehabilitation, sports medicine, and physical fitness. In this study, a synchronized system was developed for acquiring sequential data of a person's movement. The setup employs a video recorder system linked with two CCD video cameras and a force-plate sensor system, which are configured to start and stop simultaneously. The feedback-controlled movement of postural stability was selected as the subject for analysis. The person's center of body gravity (COG) was calculated from the measured 3-D coordinates of major joints, obtained by videometry with bundle adjustment and self-calibration. The raw serial data of the COG and of the foot pressure measured by the force-plate sensor are difficult to analyze directly because of their complex fluctuations. Autoregressive modeling of the power spectrum and the impulse response of the movement factors enables analysis of their dynamic relations. This new biomedical engineering approach provides efficient information for medical evaluation of a person's stability.
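The autoregressive step of the analysis above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the use of the Yule-Walker equations, and the AR order in the example are assumptions.

```python
import numpy as np

def ar_fit_yule_walker(x, order):
    """Fit an AR(p) model to a series via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Biased autocovariance estimates r[0..p]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])          # AR coefficients a_1..a_p
    sigma2 = r[0] - a @ r[1:]              # innovation variance
    return a, sigma2

def ar_power_spectrum(a, sigma2, freqs, fs=1.0):
    """Power spectrum of the fitted AR model at the given frequencies (Hz)."""
    w = 2.0 * np.pi * np.asarray(freqs) / fs
    H = 1.0 - sum(ak * np.exp(-1j * w * (k + 1)) for k, ak in enumerate(a))
    return sigma2 / np.abs(H) ** 2
```

Fitting an AR model to a sway time series (e.g. the COG trace) and evaluating its spectrum in this way gives the kind of power-spectrum description of "movement factors" the abstract refers to.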

  5. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H. (Lafayette, CA)

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  6. Calibration Tests of Industrial and Scientific CCD Cameras

    NASA Technical Reports Server (NTRS)

    Shortis, M. R.; Burner, A. W.; Snow, W. L.; Goad, W. K.

    1991-01-01

Small-format, medium-resolution CCD cameras are at present widely used for industrial metrology applications. Large-format, high-resolution CCD cameras are primarily in use for scientific applications, but in due course should increase both the range of applications and the object-space accuracy achievable by close-range measurement. Slow-scan, cooled scientific CCD cameras provide the further benefit of additional quantisation levels, which enables improved radiometric resolution. Calibration of all types of CCD cameras is necessary in order to characterize the geometry of the sensors and lenses. A number of different types of CCD cameras have been calibrated at the NASA Langley Research Center using self-calibration and a small test object. The results of these calibration tests will be described, with particular emphasis on the differences between standard CCD video cameras and scientific slow-scan CCD cameras.
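Calibrations of this kind typically estimate lens parameters such as radial distortion coefficients. A minimal sketch of the standard Brown radial model is given below as context; the coefficient names k1 and k2 are conventional, not taken from the paper.

```python
def apply_radial_distortion(x, y, k1, k2, cx=0.0, cy=0.0):
    """Map an ideal image point to its radially distorted position using the
    Brown model: the point is scaled by (1 + k1*r^2 + k2*r^4) about the
    principal point (cx, cy), where r is the distance from that point."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return cx + dx * scale, cy + dy * scale
```

Self-calibration fits parameters like k1 and k2 (together with the interior orientation) so that this model best explains the observed image coordinates of the test-object targets.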

  7. A control system for LAMOST CCD cameras

    NASA Astrophysics Data System (ADS)

    Deng, Xiaochao; Wang, Jian; Dong, Jian; Luo, Yu; Liu, Guangcao; Yuan, Hailong; Jin, Ge

    2010-07-01

32 scientific CCD cameras within the 16 low-dispersion spectrographs of LAMOST are used to record object spectra. This paper introduces the CCD Master system designed for camera management and control, based on the UCAM controller. The layers of the Master, UDP, and CCD-end daemons are described in detail, and the commands, statuses, user interface, and spectra viewer are discussed.

  8. An auto-focusing CCD camera mount

    NASA Astrophysics Data System (ADS)

    Arbour, R. W.

    1994-08-01

The traditional methods of focusing a CCD camera are either time-consuming, difficult, or, more importantly, indecisive. This paper describes a device designed to give the observer confidence that the camera will always be properly focused, by sensing a selected star image and automatically adjusting the camera's focal position.
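The paper's device senses a star image; as a generic illustration of contrast-based focus sensing (an analogue, not the paper's actual algorithm), candidate focus positions can be scored by the variance of a Laplacian response, with the maximum taken as best focus:

```python
import numpy as np

def focus_metric(img):
    """Contrast score: variance of a 4-neighbour Laplacian response; a
    sharply focused image yields a larger score than a defocused one."""
    img = np.asarray(img, dtype=float)
    lap = (-4.0 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(np.var(lap))

def best_focus(images):
    """Index of the focus position whose image maximizes the contrast score."""
    return int(np.argmax([focus_metric(im) for im in images]))
```

Stepping the focuser through a range, grabbing a frame at each position, and keeping the position with the highest score is the standard closed-loop version of this idea.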

  9. CCD Video Imaging Of Cardiac Activity

    NASA Astrophysics Data System (ADS)

    Nassif, G.; Fillette, F.; Lascault, Aouate G.; Grosgogeat, Y.

    1988-06-01

Helium-neon laser spectrometry of the fluorescent dye WW 781 bound to heart tissues makes it possible to collect optical signals representative of the electrical and electromechanical activity. It is also possible to image the electrical activity of a myocardial surface stained with WW 781 and illuminated with a direct or an unfocused helium-neon laser beam, using a charge-coupled device (CCD) video camera. Sheep ventricular fragments and mouse right atria were stained with Tyrode solutions containing 0.4 to 1 g/l WW 781. Focused or unfocused illumination was performed with a 2 mW helium-neon laser through a lens or an optical fiber. Direct illumination was performed with nine 5 mW helium-neon lasers, permitting mapping of the observed surface. The CCD video camera was fitted with a 70-220 zoom telelens and placed behind a 665 nm high-pass optical filter. Video signals were amplified, recorded on an NTSC B.V.U. professional video recorder, and studied frame by frame. Fluorescent emissions from illuminated areas were monitored with a 200 um diameter optical-fiber optrode using a monochromator-photomultiplier set. Direct illumination permitted mapping at nine points and following the propagation of electrical activity on sheep ventricular epicardium, revealing a 2/1 conduction block in a limited area. Fluorescent signals representative of electrical activity were recorded simultaneously. Unfocused lighting permitted following the depolarization of a 1.2 mm spot on sheep ventricular endocardium and the propagation of the fluorescence over a 2 mm diameter area on mouse atrium. Such a technique appears to be of great interest in the study of arrhythmias in experimental models, with foreseeable pharmacological applications.

  10. Solid state television camera (CCD-buried channel), revision 1

    NASA Technical Reports Server (NTRS)

    1977-01-01

    An all solid state television camera was designed which uses a buried channel charge coupled device (CCD) as the image sensor. A 380 x 488 element CCD array is utilized to ensure compatibility with 525-line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (1) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (2) techniques for the elimination or suppression of CCD blemish effects, and (3) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.
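The automatic light control and video gain control mentioned in item (3) can be sketched as a simple multiplicative servo. This is an illustrative assumption, not the camera's documented algorithm: the target level, smoothing factor, and gain limits below are invented for the example.

```python
def agc_step(frame_mean, gain, target_mean, alpha=0.25,
             gain_min=1.0, gain_max=16.0):
    """One automatic-gain-control update: move the video gain a fraction of
    the way (geometrically) toward the value that would map the measured
    frame mean onto the target, then clamp to the allowed gain range."""
    desired = gain * target_mean / max(frame_mean, 1e-9)
    new_gain = gain * (desired / gain) ** alpha   # smooth geometric step
    return min(gain_max, max(gain_min, new_gain))
```

Run once per frame, this converges on the gain that holds the mean video level at the target, and the clamp keeps bright objects from driving the loop outside the sensor's usable range.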

  11. Application of the CCD camera in medical imaging

    NASA Astrophysics Data System (ADS)

    Chu, Wei-Kom; Smith, Chuck; Bunting, Ralph; Knoll, Paul; Wobig, Randy; Thacker, Rod

    1999-04-01

Medical fluoroscopy is a set of radiological procedures used in medical imaging for functional and dynamic studies of the digestive system. Major components in the imaging chain include an image intensifier, which converts x-ray information into an intensity pattern on its output screen, and a CCTV camera, which converts the output-screen intensity pattern into video information to be displayed on a TV monitor. Responding properly to such a wide dynamic range on a real-time basis, as in a fluoroscopy procedure, is very challenging. Also, as in all other medical imaging studies, detail resolution is of great importance, and without proper contrast, spatial resolution is compromised. The many inherent advantages of the CCD make it a suitable choice for dynamic studies, and CCD cameras have recently been introduced as the camera of choice for medical fluoroscopy imaging systems. The objective of our project was to investigate a newly installed CCD fluoroscopy system in the areas of contrast resolution, detail, and radiation dose.

  12. Solid state television camera (CCD-buried channel)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The development of an all solid state television camera, which uses a buried channel charge coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array is utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control (i.e., ALC and AGC) techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.

  13. Vacuum compatible miniature CCD camera head

    SciTech Connect

    Conder, A.D.

    2000-06-20

    A charge-coupled device (CCD) camera head is disclosed which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04 inches for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  14. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D.

    2000-01-01

A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  15. Integration design of FPGA software for a miniaturizing CCD remote sensing camera

    NASA Astrophysics Data System (ADS)

    Yin, Na; Li, Qiang; Rong, Peng; Lei, Ning; Wan, Min

    2014-09-01

The video signal processor (VSP) is an important part of a CCD remote sensing camera and is also key to lightweight, miniaturized camera design. FPGAs are applied to raise the level of integration and simplify the video signal processor circuit. This paper introduces in detail an integrated FPGA software design for the video signal processor of a space remote sensing camera. The design accomplishes CCD timing control, integration time control, CCD data formatting, and CCD image processing and correction on a single FPGA chip, which solves the miniaturization problem for video signal processors in remote sensing cameras. The camera has already been launched successfully and has obtained high-quality remote sensing images, contributing to the miniaturization of remote sensing cameras.

  16. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  18. CCD camera-based diagnostics of optically dense pulsed plasma with account of self-absorption

    NASA Astrophysics Data System (ADS)

    Nikonchuk, I. S.; Chumakov, A. N.

    2016-01-01

We present a system for diagnostics of optically dense pulsed plasma, based on a digital video camera with a CCD matrix, that provides determination of spectral brightness and optical density with self-absorption taken into account.

  19. Typical effects of laser dazzling CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhang, Jianmin; Shao, Bibo; Cheng, Deyan; Ye, Xisheng; Feng, Guobin

    2015-05-01

This article gives an overview of laser dazzling effects in buried-channel CCD cameras. CCDs are sorted into staring and scanning types; the former includes frame-transfer and interline-transfer architectures, the latter linear and time-delay-integration types. Every CCD must perform four primary tasks in generating an image: charge generation, charge collection, charge transfer, and charge measurement. In a camera, a lens delivers the optical signal to the CCD sensor, using techniques to suppress stray light, and electronic circuits process the CCD's output signal. The dazzling effects are the joint result of light-distribution distortion and charge-distribution distortion, which derive from the lens and the sensor respectively. Strictly speaking, the lens does not distort the light distribution: lenses are generally so well designed and fabricated that their stray light can be neglected, but a laser is intense enough to make its stray light obvious. In the CCD image sensor, a laser can induce very large electron generation; charge-transfer inefficiency and charge blooming then distort the charge distribution. Commonly, the largest signal output from the CCD sensor is restricted by the capacity of the CCD's collection well and cannot exceed the dynamic range within which the subsequent electronic circuits operate normally, so the signal is not distorted in the post-processing circuits. However, some circuit techniques can make dazzling effects appear as different phenomena in the final image.
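Charge blooming, one of the distortions described above, can be illustrated with a toy one-dimensional column model. The full-well value and the equal split of the excess are simplifying assumptions: charge above the full well spills equally to the two neighbouring pixels, and charge spilling past the ends of the column is lost.

```python
def bloom_column(charges, full_well, max_iters=1000):
    """Toy vertical-blooming model: repeatedly move charge in excess of the
    full well to the two neighbouring pixels until no pixel overflows."""
    q = [float(c) for c in charges]
    for _ in range(max_iters):
        overflow = False
        for i, v in enumerate(q):
            if v > full_well:
                excess = v - full_well
                q[i] = full_well
                if i > 0:
                    q[i - 1] += excess / 2.0   # spill up the column
                if i < len(q) - 1:
                    q[i + 1] += excess / 2.0   # spill down the column
                # at the column ends, half of the excess is simply lost
                overflow = True
        if not overflow:
            break
    return q
```

A single laser-saturated pixel thus grows into a bright vertical streak, which is the characteristic blooming signature in a dazzled image.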

  20. High-performance digital color video camera

    NASA Astrophysics Data System (ADS)

    Parulski, Kenneth A.; D'Luna, Lionel J.; Benamati, Brian L.; Shelley, Paul R.

    1992-01-01

Typical one-chip color cameras use analog video processing circuits. An improved digital camera architecture has been developed using a dual-slope A/D conversion technique and two full-custom CMOS digital video processing integrated circuits: the color filter array (CFA) processor and the RGB postprocessor. The system uses a 768 X 484 active-element interline-transfer CCD with a new field-staggered 3G color filter pattern and a lenslet overlay, which doubles the sensitivity of the camera. The industrial-quality digital camera design offers improved image quality, reliability, and manufacturability, while meeting aggressive size, power, and cost constraints. The CFA processor digital VLSI chip includes color filter interpolation processing, an optical black clamp, defect correction, white balance, and gain control. The RGB postprocessor digital integrated circuit includes a color correction matrix, gamma correction, 2-D edge enhancement, and circuits to control the black balance, lens aperture, and focus.
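Color filter interpolation, the first job of the CFA processor, can be illustrated on the common RGGB Bayer layout. This is an assumption for the sake of example: the paper's field-staggered 3G pattern is a different, proprietary layout, and real CFA processors use more elaborate interpolation than the bilinear scheme below.

```python
import numpy as np

def demosaic_green_bilinear(raw):
    """Fill in the green plane of an RGGB Bayer mosaic by averaging the four
    green neighbours at each red/blue site (border pixels are left as-is)."""
    g = raw.astype(float).copy()
    h, w = raw.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (y + x) % 2 == 0:           # red/blue site in an RGGB layout
                g[y, x] = (raw[y - 1, x] + raw[y + 1, x]
                           + raw[y, x - 1] + raw[y, x + 1]) / 4.0
    return g
```

The red and blue planes are reconstructed analogously from their diagonal and axial neighbours, after which the white balance, color matrix, and gamma stages of the pipeline operate on full-resolution RGB.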

  1. High-speed optical shutter coupled to fast-readout CCD camera

    NASA Astrophysics Data System (ADS)

    Yates, George J.; Pena, Claudine R.; McDonald, Thomas E., Jr.; Gallegos, Robert A.; Numkena, Dustin M.; Turko, Bojan T.; Ziska, George; Millaud, Jacques E.; Diaz, Rick; Buckley, John; Anthony, Glen; Araki, Takae; Larson, Eric D.

    1999-04-01

A high-frame-rate, optically shuttered CCD camera for radiometric imaging of transient optical phenomena has been designed, and several prototypes have been fabricated, which are now in the evaluation phase. The camera design incorporates stripline-geometry image intensifiers for ultra-fast image shutters capable of 200 ps exposures. The intensifiers are fiber-optically coupled to a multiport CCD capable of 75 MHz pixel clocking to achieve a 4 kHz frame rate for 512 X 512 pixels from simultaneous readout of 16 individual segments of the CCD array. The intensifier, a Philips XX1412MH/E03, is generically a Generation II proximity-focused microchannel-plate intensifier (MCPII), redesigned for high-speed gating by Los Alamos National Laboratory and manufactured by Philips Components. The CCD is a Reticon HSO512 split-storage device with bi-directional vertical readout architecture. The camera mainframe is designed using a multilayer motherboard that transports CCD video signals and clocks via embedded stripline buses designed for 100 MHz operation. The MCPII gate duration and gain variables are controlled and measured in real time and updated for data logging each frame, with 10-bit resolution, selectable either locally or by computer. The camera provides both analog and 10-bit digital video. The camera's architecture, salient design characteristics, and current test data depicting resolution, dynamic range, shutter sequences, and image reconstruction will be presented and discussed.

  2. Tools and Techniques for Measuring Asteroid Occultations with DSLR and CCD Cameras

    NASA Astrophysics Data System (ADS)

    Hoot, John E.

    2012-05-01

    Currently most asteroid occultations are measured with video equipment. This technique is limited to stars bright enough to be measured at 30 frames per second and limits participation to observers that have portable low-light video and time standard tagging equipment. This paper presents new observation tools and analysis methods that allow the larger community of astroimagers to make precise occultation measurements with tracking telescopes and DSLR or CCD integrating cameras.
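The essence of an occultation measurement is timing the drop in the star's light curve. A minimal sketch follows; the threshold choice and the conversion from frame count to seconds are assumptions for illustration, not the paper's analysis method.

```python
def occultation_interval(flux, threshold):
    """Return (first, last) sample indices where the flux is below the
    threshold (the occultation), or None if the star never dims."""
    below = [i for i, f in enumerate(flux) if f < threshold]
    if not below:
        return None
    return below[0], below[-1]

def occultation_duration(flux, threshold, frame_time_s):
    """Occultation duration in seconds, given the per-sample exposure time."""
    interval = occultation_interval(flux, threshold)
    if interval is None:
        return 0.0
    first, last = interval
    return (last - first + 1) * frame_time_s
```

With an integrating camera the per-sample exposure time is much longer than a video field, which is why precise absolute timestamps for each exposure matter more than raw frame rate.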

  3. Solid state, CCD-buried channel, television camera study and design

    NASA Technical Reports Server (NTRS)

    Hoagland, K. A.; Balopole, H.

    1976-01-01

    An investigation of an all solid state television camera design, which uses a buried channel charge-coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array was utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a design which addresses the program requirements for a deliverable solid state TV camera.

  4. Printed circuit board for a CCD camera head

    DOEpatents

    Conder, Alan D.

    2002-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military industrial, and medical imaging applications.

  5. Video camera use at nuclear power plants

    SciTech Connect

Estabrook, M.L.; Langan, M.O.; Owen, D.E.

    1990-08-01

A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations and to determine the equipment used and the benefits realized. Basic closed-circuit television (CCTV) camera systems are described and video camera operating principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use, mainly reduced radiation exposure and increased productivity, are discussed and quantified. 15 refs., 6 figs.

  6. Solid-State Video Camera for the Accelerator Environment

    SciTech Connect

    Brown, R

    2004-05-27

Solid-state video cameras employing CMOS technology have been developed and tested for several years in the SLAC accelerator, notably the PEPII (BaBar) injection lines. They have proven much more robust than their CCD counterparts in radiation areas. Repair is simple and inexpensive, and generates very little radioactive waste.

  7. High-resolution CCD camera family with a PC host

    NASA Astrophysics Data System (ADS)

    Raanes, Chris A.; Bottenberg, Les

    1993-05-01

EG&G Reticon and Adaptive Optics Associates have developed a family of high-resolution CCD cameras with a PC/AT host to fulfill imaging applications from medical science to industrial inspection. The MC4000 family of CCD cameras encompasses resolutions of 512 X 512, 1024 X 1024, and 2048 X 2048 pixels. All three of these high-performance cameras interface to the SB4000 PC/AT controller, which serves as a frame buffer with up to 64 MBytes of storage and provides all the required control and setup parameters while the camera head is remotely located at distances of up to 100 ft. All of the MC4000 high-resolution cameras employ MPP clocking to achieve high dynamic range without cooling the CCD sensor. The use of this low-power clocking technique, surface-mount components, an electronic shutter, and clever packaging has allowed Reticon to deliver the MC4000 cameras in convenient, rugged, small housings. The MC4000 family provides users with a total imaging solution, from leading-edge sensors and electronics in ruggedized housings to cables, power supplies, and a PC/AT frame buffer and controller card. All the components are designed to function together as a turn-key, self-contained system, or individual components can become part of a user's larger system. The MC4000 CCD camera family makes high-resolution electronic imaging an accessible tool for a wide range of applications.

  8. Video camera as a visual sensor, part 2

    NASA Astrophysics Data System (ADS)

    Gomi, Hiromi

    1994-03-01

A video camera was calibrated with one-to-one correspondence between the picture elements of the CCD and of the digital image by synchronizing the VD/HD signals of the video camera with an A/D converter. Measurements were made of the spatial resolution and sensitivity of the picture elements, the stability of the video signal, and the lens characteristics. A picture element leaked output to the adjacent picture elements, and this interference between outputs was linear. There were no defective picture elements. A focusing mechanism and a 'contrast' algorithm made it possible to measure distance at a spatial resolution of 2 x 2 picture elements.

  9. CTK: A new CCD Camera at the University Observatory Jena

    NASA Astrophysics Data System (ADS)

    Mugrauer, M.

    2009-05-01

The Cassegrain-Teleskop-Kamera (CTK) is a new CCD imager that has been operated at the University Observatory Jena since the beginning of 2006. This article describes the main characteristics of the new camera: the properties of the CCD detector, the CTK image quality, and its detection limits for all filters. The results are based on observations obtained with telescopes of the University Observatory Jena, which is operated by the Astrophysical Institute of the Friedrich-Schiller-University.

  10. Using a CCD Camera for Double Star Astrometry

    NASA Astrophysics Data System (ADS)

    Carro, Joseph

    2014-05-01

    This paper describes the use of a CCD camera for double star astrometry, and it will serve as a planning guide for the installation and use of such equipment. The advantages and disadvantages of the suggested equipment plus examples of the photographs produced by such equipment are included. This author has successfully used a CCD camera for more than two years, and has published papers in three journals, namely the Journal of Double Star Observations, El Observatorio de Estrellas Dobles, and il Bollettino de Stelle Doppie.

  11. Driving techniques for high frame rate CCD camera

    NASA Astrophysics Data System (ADS)

    Guo, Weiqiang; Jin, Longxu; Xiong, Jingwu

    2008-03-01

This paper describes a high-frame-rate CCD camera capable of operating at 100 frames/s. The camera utilizes a Kodak KAI-0340, an interline-transfer CCD with 640 (vertical) × 480 (horizontal) pixels. Two output ports are used to read out the CCD data, with pixel rates approaching 30 MHz. Because the vertical charge-transfer registers are not perfectly opaque, an interline-transfer CCD can produce undesired image artifacts, such as random white spots and smear generated in the registers. To increase the frame rate, a speed-up structure has been incorporated inside the KAI-0340, which makes it vulnerable to a vertical-stripe effect. These phenomena may severely impair image quality, so electronic methods of eliminating the artifacts are adopted: a special clocking mode dumps the unwanted charge quickly, and fast readout of the images, cleared of smear, follows immediately; an amplifier senses and corrects the delay mismatch between the dual-phase vertical clock pulses, so that the transition edges become nearly coincident and the vertical stripes disappear. Results obtained with the CCD camera are shown.

  12. New method for testing the objectives for miniature CCD cameras

    NASA Astrophysics Data System (ADS)

    Lesniewski, Marcin; Pawlowski, Michal E.

    2003-07-01

In this paper we present a new method for testing objectives for CCD cameras. The method is based on determination of the modulation transfer function. The results were verified by experiments performed on a specially designed automated test station. The method, the accompanying software, and the hardware are applied in the teaching of photonics at the Institute of Micromechanics and Photonics, Warsaw University of Technology.
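An MTF determination of the sort described can be sketched as the normalized Fourier magnitude of a measured line spread function; the sampling details below are assumptions, not the paper's test-station specifics.

```python
import numpy as np

def mtf_from_lsf(lsf, sample_pitch):
    """MTF as the magnitude of the Fourier transform of the line spread
    function, normalized so that MTF(0) = 1; frequencies are returned in
    cycles per unit of sample_pitch."""
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf / lsf.sum()              # normalize so the DC term is 1
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(len(lsf), d=sample_pitch)
    return freqs, mtf
```

In practice the LSF is obtained by imaging a narrow slit or by differentiating an edge-spread function, and the objective's quality is read off as the spatial frequency at which the MTF falls below a chosen contrast level.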

  13. Color measurements using a colorimeter and a CCD camera

    SciTech Connect

    Spratlin, T.L.; Simpson, M.L.

    1992-02-01

Two new techniques are introduced for measuring the color content of printed graphic images, with applications to web inspection such as detecting color flaws and measuring color quality. The techniques involve the development of algorithms for combining the information obtained from commercially available CCD color cameras and colorimeters to produce a colorimeter system with pixel resolution. 9 refs.
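One plausible way such information can be combined, shown here purely as an illustrative assumption rather than the authors' algorithm: convert camera RGB to tristimulus values with a 3x3 matrix, then rescale each channel so the spatial mean over the imaged patch agrees with the colorimeter's spot reading, which yields per-pixel colorimetry anchored to the instrument.

```python
import numpy as np

def calibrate_to_colorimeter(rgb_image, colorimeter_xyz, M):
    """Per-pixel colorimetry sketch: map camera RGB to XYZ through a 3x3
    matrix M, then rescale each channel so that the spatial mean over the
    imaged patch equals the colorimeter's reading for that patch."""
    h, w, _ = rgb_image.shape
    xyz = rgb_image.reshape(-1, 3).astype(float) @ M.T
    gain = np.asarray(colorimeter_xyz, dtype=float) / xyz.mean(axis=0)
    return (xyz * gain).reshape(h, w, 3)
```

The matrix M would itself come from a characterization step (imaging patches of known tristimulus values); the per-patch rescale then transfers the colorimeter's absolute accuracy onto the camera's spatial resolution.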

  14. Developments in the EM-CCD camera for OGRE

    NASA Astrophysics Data System (ADS)

    Tutt, James H.; McEntaffer, Randall L.; DeRoo, Casey; Schultz, Ted; Miles, Drew M.; Zhang, William; Murray, Neil J.; Holland, Andrew D.; Cash, Webster; Rogers, Thomas; O'Dell, Steve; Gaskin, Jessica; Kolodziejczak, Jeff; Evagora, Anthony M.; Holland, Karen; Colebrook, David

    2014-07-01

    The Off-plane Grating Rocket Experiment (OGRE) is a sub-orbital rocket payload designed to advance the development of several emerging technologies for use on space missions. The payload consists of a high resolution soft X-ray spectrometer based around an optic made from precision cut and ground, single crystal silicon mirrors, a module of off-plane gratings and a camera array based around Electron Multiplying CCD (EM-CCD) technology. This paper gives an overview of OGRE with emphasis on the detector array; specifically this paper will address the reasons that EM-CCDs are the detector of choice and the advantages and disadvantages that this technology offers.

  15. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many kinds of equipment and applications. If such a system is tested with a traditional infrared-camera test system and a separate visible-CCD test system, two rounds of installation and alignment are needed. The large-aperture test system for infrared and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces the cost and the time for installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position with changing environmental temperature, improving both the image quality of the large-field-of-view collimator and the test accuracy. Its performance equals that of its foreign counterparts at a much lower price, and it should find a good market.
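The multiple-frame averaging step can be sketched directly; the noise level and frame count in the example are illustrative, not values from the paper.

```python
import numpy as np

def average_frames(frames):
    """Average a stack of frames; zero-mean random noise is reduced by a
    factor of about sqrt(N) for N frames, while the scene is unchanged."""
    stack = np.stack([np.asarray(f, dtype=float) for f in frames], axis=0)
    return stack.mean(axis=0)
```

Averaging 16 frames, for instance, cuts random noise roughly fourfold, which is what makes the technique attractive in a test bench where the target is static and extra acquisition time is cheap.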

  16. Design and application of TEC controller Using in CCD camera

    NASA Astrophysics Data System (ADS)

    Gan, Yu-quan; Ge, Wei; Qiao, Wei-dong; Lu, Di; Lv, Juan

    2011-08-01

    A thermoelectric cooler (TEC) is a solid-state heat pump based on the Peltier effect; it is small, light, and noiseless. When the temperature difference between the hot and cold sides is stable, the cooling capacity is proportional to the TEC working current, so the heating and cooling can be controlled by changing the magnitude and direction of the current through the TEC. Thermoelectric cooling is therefore well suited to cooling CCD devices. E2V's scientific image sensor CCD47-20 integrates the TEC and CCD in a single package, which simplifies the electrical design. The hardware and software of a TEC controller were designed around this integrally cooled CCD47-20. In hardware, an 80C51 MCU serves as the CPU, and an 8-bit ADC and an 8-bit DAC form the closed-loop control path: the controlled quantity is computed from the temperature sampled by a thermistor in the CCD, and the TEC is driven by a MOSFET constant-current circuit. In software, the control precision and convergence speed are improved by a PID algorithm with tuned proportional, integral, and differential coefficients. The results show that, provided the hot side of the TEC is well enough heat-sunk to keep its temperature stable, with a 2 s sampling period the cooling rate is 5°C/min, and a temperature difference of -40°C can be reached with a control precision of 0.3°C. When the hot-side temperature is stable at °C, the CCD temperature can reach -°C, and the thermal noise of the CCD is less than 1 e-/pix/s. The control system suppresses the dark-current noise of the CCD and increases the SNR of the camera system.
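
    The closed-loop PID behaviour described above can be sketched in a few lines. The first-order thermal model, gains, and clamp limits below are illustrative assumptions for simulation only, not the paper's hardware values (the real loop runs on an 80C51 with an 8-bit ADC/DAC):

    ```python
    def simulate_tec_pid(setpoint=-40.0, ambient=20.0, steps=600, dt=2.0):
        """Toy simulation of a PID-controlled TEC cooling a CCD package."""
        kp, ki, kd = 0.1, 0.02, 0.05        # hypothetical PID gains
        tau, gain = 30.0, 80.0              # plant: time constant (s), degC per unit drive
        temp, integ, prev_err = ambient, 0.0, None
        for _ in range(steps):
            err = setpoint - temp
            integ = max(-50.0, min(50.0, integ + err * dt))          # anti-windup clamp
            deriv = 0.0 if prev_err is None else (err - prev_err) / dt
            drive = max(-1.0, min(1.0, kp * err + ki * integ + kd * deriv))
            prev_err = err
            # first-order thermal response toward ambient plus the TEC's contribution
            temp += dt / tau * (ambient + gain * drive - temp)
        return temp
    ```

    With a 2 s sampling period, as in the abstract, the simulated cold-side temperature settles close to the -40°C setpoint; the integral term is what removes the steady-state offset.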

  17. CCD camera for dual-energy digital subtraction angiography.

    PubMed

    Molloi, S; Ersahin, A; Qian, Y J

    1995-01-01

    A motion-immune dual-energy subtraction technique, in which the X-ray tube voltage and beam filtration were switched at 30 Hz between 60 kVp (2.0 mm Al filter) and 120 kVp (2.0 mm Al + 2.5 mm Cu filter), was previously reported. In this study the effect of camera lag on the dual-energy iodine signal is investigated. The temporal lag of the lead oxide vidicon tested reduced the dual-energy iodine signal by a factor of 2.3, compared with a mode that included 4 scrub frames between low- and high-energy images, for an iodine phantom with thicknesses of 0-86.0 mg/cm(2) imaged over a 15 cm thick Lucite phantom. The charge-coupled device (CCD) camera, on the other hand, has inherently no temporal lag, and its versatile scanning characteristics make it nearly ideal for dual-energy DSA. The CCD camera eliminates the reduction of the dual-energy iodine signal, since it does not mix low- and high-energy image data. Another benefit of the CCD camera is that the separation time between low- and high-energy images is not limited to the frame period, as it is for the lead oxide vidicon; time differences as small as 5 msec are possible. The short interval between low- and high-energy images minimizes motion misregistration artifacts. Owing to these advantages, the CCD camera significantly improves the utility of dual-energy DSA. PMID:18215878

  18. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
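
    The "slightly more than 136 years" repeat period matches what an unsigned 32-bit count of whole seconds would give; the field width is an assumption here, since the abstract does not specify the Geo-TimeCode format:

    ```python
    SECONDS_PER_YEAR = 365.25 * 24 * 3600.0      # Julian year in seconds

    def rollover_years(bits):
        # years until an unsigned counter of whole seconds wraps around
        return 2 ** bits / SECONDS_PER_YEAR

    span = rollover_years(32)   # about 136.1 years for a 32-bit seconds counter
    ```

    By contrast, a time code that repeats daily only needs to span 86,400 seconds, which is why 24-hour codes are so much more compact and so much less suited to long deployments.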

  19. Development of an all-in-one gamma camera/CCD system for safeguard verification

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Il; An, Su Jung; Chung, Yong Hyun; Kwak, Sung-Woo

    2014-12-01

    For the purpose of monitoring and verifying efforts at safeguarding radioactive materials in various fields, a new all-in-one gamma camera/charge-coupled device (CCD) system was developed. The combined system consists of a gamma camera, which gathers energy and position information on gamma-ray sources, and a CCD camera, which identifies the specific location in a monitored area; 2-D image information and quantitative information on gamma-ray sources can therefore be obtained from the fused images. The gamma camera consists of a diverging collimator, a 22 × 22 array of pixelated CsI(Na) scintillation crystals with a pixel size of 2 × 2 × 6 mm³, and a Hamamatsu H8500 position-sensitive photomultiplier tube (PSPMT). A Basler scA640-70gc CCD camera, which delivers 70 frames per second at video graphics array (VGA) resolution, was employed. Performance tests were carried out using a Co-57 point source 30 cm from the detector. The measured spatial resolution and sensitivity were 4.77 mm full width at half maximum (FWHM) and 7.78 cps/MBq, respectively. The energy resolution was 18% at 122 keV. These results demonstrate that the combined system has considerable potential for radiation monitoring.

  20. The SXI: CCD camera onboard the NeXT mission

    NASA Astrophysics Data System (ADS)

    Tsunemi, Hiroshi; Tsuru, Takeshi Go; Dotani, Tadayasu; Hayashida, Kiyoshi; Bautz, Marshall W.

    2008-07-01

    The Soft X-ray Imager (SXI) is the X-ray CCD camera on board the NeXT mission, which is to be launched around 2013. We are going to employ CCD chips developed at Hamamatsu Photonics, K.K. We have been developing two types of CCD: an N-channel chip and a P-channel chip. The effective area of the detector system will be 5-6 cm square with a depletion layer of 100-200 μm. The P-channel chip will have a thicker depletion layer, which makes it easier to develop into a back-illuminated CCD. It will take a year or so to reach a final conclusion on which type will be available. Based on the Suzaku experience, we will incorporate a charge injection gate so that we can reduce the proton damage. Furthermore, we will employ a mechanical cooler to keep the CCD working temperature down to -120°C even though NeXT will be in low Earth orbit, so we expect the radiation damage to our system to be very small. The CCD will have an Al coat on the chip to prevent optical photons from entering; this also eliminates the vacuum-tight chamber and the door-opening mechanism. We are planning to employ a custom-made analog ASIC that will reduce the power consumption and the size. The ASIC may shorten the frame time if we can use a multi-node CCD. With a focal length of 6 m, the SXI will fully function with optics of around 20" resolution. We report the current plan of the SXI in detail.

  1. Video indirect ophthalmoscopy using a hand-held video camera

    PubMed Central

    Shanmugam, Mahesh P

    2011-01-01

    Fundus photography in adults and cooperative children is possible with a fundus camera or a slit-lamp-mounted digital camera. A RetCam or a video indirect ophthalmoscope is necessary for fundus imaging in infants and young children under anesthesia. Herein, a technique of converting a digital video camera into a video indirect ophthalmoscope for fundus imaging is described. This device will allow anyone with a hand-held video camera to obtain fundus images. Limitations of this technique are a learning curve and the inability to perform scleral depression. PMID:21157075

  2. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (Inventor)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  3. Use Of A C.C.D. Array In An X-Ray Pinhole Camera

    NASA Astrophysics Data System (ADS)

    Cavailler, C.; Henry, Ph.; Launspach, J.; Mens, A.; Rostaing, M.; Sauneuf, R.

    1985-02-01

    X-ray imaging adapted to laser-matter interaction experiments consists of recording plasma images from the plasma's X-ray emission; these phenomena last between 100 ps and a few nanoseconds. When only spatial information on the 1-10 keV X-ray emission is needed, the simplest imaging device is the pinhole camera; the two-dimensional image of the plasma is temporally integrated by an X-ray-sensitive detector. Until now, X-ray film was used. Its operation and processing were long and tedious, so we replaced it with a television camera built around a Charge Coupled Device (C.C.D.). This camera is directly integrated into the pinhole camera. The X-ray detection is made by the silicon substrate of a C.C.D. without an input window, working in the vacuum of the experiment chamber; a compact camera head (40 mm diameter, 120 mm length) located near the C.C.D. (1 to 2 cm) performs the charge/voltage conversion and the signal amplification. Immediate use of the images is provided by an image acquisition and processing unit after digitizing the video signal on 8 bits. From measurements made on a continuous X-ray source (5.4 keV), we found that a THOMSON-CSF THX 31135 CCD is 10 times more sensitive than the X-ray SB2 KODAK film that we use in pinhole cameras. The dynamic range measured under these conditions was about 300. The first experimental results obtained on a pulsed X-ray source are presented.

  4. High-frame-rate CCD cameras with fast optical shutters for military and medical imaging applications

    NASA Astrophysics Data System (ADS)

    King, Nicholas S. P.; Albright, Kevin L.; Jaramillo, Steven A.; McDonald, Thomas E.; Yates, George J.; Turko, Bojan T.

    1994-10-01

    Los Alamos National Laboratory (LANL) has designed and prototyped high-frame-rate intensified/shuttered charge-coupled-device (CCD) cameras capable of operating at kilohertz frame rates (non-interlaced mode) with optical shutters capable of acquiring nanosecond-to-microsecond exposures each frame. These cameras utilize an interline-transfer CCD, Loral Fairchild CCD-222 with 244 (vertical) x 380 (horizontal) pixels, operated at pixel rates approaching 100 MHz. Initial prototype designs demonstrated single-port serial readout rates exceeding 2.97 kilohertz with greater than 5 lp/mm spatial resolution at shutter speeds as short as 5 ns. Readout was achieved by using a truncated format of 128 x 128 pixels, obtained by partial masking of the CCD and then subclocking the array at approximately a 65 MHz pixel rate. Shuttering was accomplished with a proximity-focused microchannel plate (MCP) image intensifier (MCPII) that incorporated a high-strip-current MCP (28 μA/cm²) and a LANL design modification for high-speed stripline gating geometry to provide both fast shuttering and high repetition rate capabilities. Later camera designs use a close-packed quadrupole head geometry fabricated from an array of four separate CCDs (a pseudo 4-port device). This design provides four video outputs with optional parallel or time-phased sequential readout modes. Parallel readout exploits the full potential of both the CCD and MCPII with reduced performance, whereas sequential readout permits 4X slower operation with improved performance by multiplexing, but requires individual shuttering of each CCD. The quad head format was designed with flexibility for coupling to various image intensifier configurations, including individual intensifiers for each CCD imager, a single intensifier with a fiber-optic or lens/prism-coupled fanout of the input image shared by the four CCD imagers, or the large-diameter phosphor screen of a gateable framing-type intensifier for time-sequential relaying of a complete new input image to each CCD imager. Camera designs and their potential use in ongoing military and medical time-resolved imaging applications are discussed.
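
    The frame-rate figures above can be sanity-checked: the upper bound set by serial readout alone is the pixel clock divided by the number of pixels read per frame, and line-transfer and blanking overheads explain why the measured 2.97 kHz sits below this bound. A one-line sketch:

    ```python
    def max_frame_rate(pixel_rate_hz, rows, cols):
        # readout-limited bound; real cameras lose some of this to overheads
        return pixel_rate_hz / (rows * cols)

    bound = max_frame_rate(65e6, 128, 128)   # truncated 128 x 128 format at ~65 MHz
    ```

    For the truncated 128 x 128 format at a 65 MHz pixel rate, the bound is just under 4 kHz, comfortably above the demonstrated 2.97 kHz readout.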

  5. High frame rate CCD camera with fast optical shutter

    SciTech Connect

    Yates, G.J.; McDonald, T.E. Jr.; Turko, B.T.

    1998-09-01

    A high-frame-rate CCD camera coupled with a fast optical shutter has been designed for high-repetition-rate imaging applications. The design uses state-of-the-art microchannel plate image intensifier (MCPII) technology fostered/developed by Los Alamos National Laboratory to support nuclear, military, and medical research requiring high-speed imagery. Key design features include asynchronous resetting of the camera to acquire random transient images; patented real-time analog signal processing with 10-bit digitization at 40-75 MHz pixel rates; synchronized shutter exposures as short as 200 ps; and sustained continuous readout of 512 x 512 pixels per frame at 1-5 Hz rates via parallel multiport (16-port CCD) data transfer. Salient characterization/performance test data for the prototype camera are presented; temporally and spatially resolved images obtained from range-gated LADAR field testing are included; and an alternative system configuration, using several cameras sequenced to deliver discrete numbers of consecutive frames at effective burst rates up to 5 GHz (accomplished by time-phasing of consecutive MCPII shutter gates without overlap), is discussed. Potential applications including dynamic radiography and optical correlation are also presented.

  6. Two-wavelength microscopic speckle interferometry using colour CCD camera

    NASA Astrophysics Data System (ADS)

    Upputuri, Paul K.; Pramanik, Manojit; Kothiyal, Mahendra P.; Nandigana, Krishna M.

    2015-03-01

    Single-wavelength microscopic speckle interferometry is widely used for deformation, shape, and non-destructive testing (NDT) of engineering structures. However, the single-wavelength configuration fails to quantify large deformations because of fringe overcrowding, and it cannot provide the shape of a specimen under test. In this paper, we discuss two-wavelength microscopic speckle interferometry using a single-chip colour CCD camera for the characterization of microsamples. The colour CCD allows simultaneous acquisition of speckle patterns at two different wavelengths, making the data acquisition as simple as in the single-wavelength case. For quantitative measurement, an error-compensating 8-step phase-shifting algorithm is used. The system allows quantification of large deformation and of the shape of a specimen with a rough surface. The design of the system, along with a few experimental results on small-scale rough specimens, is presented.
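
    The extended deformation range of a two-wavelength configuration comes from the synthetic (beat) wavelength, Lambda = lambda1 * lambda2 / |lambda1 - lambda2|, which can be far longer than either optical wavelength. A small sketch (the laser lines are assumed for illustration; the abstract does not state the wavelengths used):

    ```python
    def synthetic_wavelength(lam1_nm, lam2_nm):
        # beat wavelength of a two-wavelength interferometer, in nm
        return lam1_nm * lam2_nm / abs(lam1_nm - lam2_nm)

    lam = synthetic_wavelength(632.8, 532.0)   # ~3.3 um for this hypothetical pair
    ```

    The closer the two wavelengths, the longer the synthetic wavelength, and hence the larger the deformation that can be measured without fringe ambiguity.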

  7. STK: A new CCD camera at the University Observatory Jena

    NASA Astrophysics Data System (ADS)

    Mugrauer, M.; Berthold, T.

    2010-04-01

    The Schmidt-Teleskop-Kamera (STK) is a new CCD imager which has been in operation since the beginning of 2009 at the University Observatory Jena. This article describes the main characteristics of the new camera. The properties of the STK detector, the astrometry and image quality of the STK, and its detection limits at the 0.9 m telescope of the University Observatory Jena are presented. Based on observations obtained with telescopes of the University Observatory Jena, which is operated by the Astrophysical Institute of the Friedrich-Schiller-University.

  8. Close-Range Photogrammetry with Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1983-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  9. Close-range photogrammetry with video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1985-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  10. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  11. Research of fiber position measurement by multi CCD cameras

    NASA Astrophysics Data System (ADS)

    Zhou, Zengxiang; Hu, Hongzhuan; Wang, Jianping; Zhai, Chao; Chu, Jiaru; Liu, Zhigang

    2014-07-01

    The parallel-controlled fiber positioner, an efficient observation system, has been used in LAMOST for four years and has been proposed for ngCFHT and the rebuilt Mayall telescope. The fiber positioner research group at USTC has designed a new-generation prototype built from close-packed modules of robotic positioner mechanisms. The prototype includes about 150 fiber positioning modules plugged into a 1 m diameter honeycombed focal plane, each module carrying 37 fiber positioners of 12 mm diameter. Furthermore, the new system tightens the positioning accuracy from 40 um in LAMOST to 10 um in MSDESI, which poses a new challenge for measurement. A closed-loop control system is to be used: a CCD camera captures images of the fiber tips across the focal plane, calculates their precise positions, and feeds the results back to the control system; after the positioners have rotated through several loops, the accuracy of all positioners is confined to less than 10 um. We report the component development and performance measurement program of the new measuring system, which uses multiple CCD cameras. With stereo vision and image processing methods, we precisely measure the 3-dimensional position of the fiber tip carried by each positioner. Finally, we present baseline parameters for fiber positioner measurement as a reference for next-generation survey telescope designs.
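
    The stereo-vision position measurement can be sketched as linear (DLT) triangulation from two calibrated cameras. The intrinsics, baseline, and fiber-tip coordinates below are hypothetical stand-ins, not the instrument's calibration:

    ```python
    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        """DLT triangulation of one point from two 3x4 projection matrices."""
        A = np.vstack([
            uv1[0] * P1[2] - P1[0],
            uv1[1] * P1[2] - P1[1],
            uv2[0] * P2[2] - P2[0],
            uv2[1] * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)      # null vector of A is the homogeneous point
        X = vt[-1]
        return X[:3] / X[3]

    K = np.array([[1000.0, 0.0, 320.0],
                  [0.0, 1000.0, 240.0],
                  [0.0, 0.0, 1.0]])                  # assumed camera intrinsics
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-200.0], [0.0], [0.0]])])  # 200 mm baseline

    X_true = np.array([50.0, 30.0, 1000.0, 1.0])     # hypothetical fiber tip (mm)
    uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]       # pixel position in camera 1
    uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]       # pixel position in camera 2
    X_est = triangulate(P1, P2, uv1, uv2)            # recovers X_true[:3]
    ```

    With noise-free projections the tip position is recovered essentially exactly; in practice, centroiding error and calibration residuals set the achievable accuracy against the 10 um budget.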

  12. Calibration tool for a CCD-camera-based vision system

    NASA Astrophysics Data System (ADS)

    Xu, Gan; Tan, Siew Leng; Low, Siok Pheng; Heng, Yee S.; Lai, Weng C.; Du, Xianhe

    2000-11-01

    A special calibration tool has been developed for a CCD-camera-based vision system in an automatic assembly machine. The machine is used to attach orifice plates onto a silicon wafer in a production process. The center locations of the positioning circular holes on the plate must be controlled accurately to coincide with those on the wafer die before the two are attached together by UV curing. Although CCD-camera-based vision systems are widely used for accurate positioning and dimensional measurement in the precision engineering, electronics, and semiconductor industries, they are normally calibrated with artefacts bearing plane patterns, which restricts them to two-dimensional measurements. The calibration tool we developed checks the positioning accuracy of circular objects in a two-layered structure. It can also be used to determine parallax errors, non-linearity, and spatial non-uniformity errors, as well as the repeatability of the vision system, with an uncertainty at the sub-micrometer level. The design, calibration, and performance of the tool are described in detail in this paper.

  13. Optical system based on a CCD camera for ethanol detection

    NASA Astrophysics Data System (ADS)

    Martínez-Hipatl, C.; Muñoz-Aguirre, S.; Muñoz-Guerrero, R.; Castillo-Mixcóatl, J.; Beltrán-Pérez, G.; Gutiérrez-Salgado, J. M.

    2013-10-01

    This work reports the optimization of an optical system used to detect and quantify volatile organic compounds (VOCs). The sensor consisted of a polydimethylsiloxane (PDMS) sensing film deposited on a glass substrate by the spin-coating technique. PDMS has the property of swelling and/or changing its refractive index when it interacts with VOC molecules in the vapor phase. To measure the PDMS swelling, a charge-coupled device (CCD) camera was employed to evaluate the interference fringe shift in a Pohl interferometric arrangement. With this approach, each pixel of the CCD camera can be used as a single photodetector in the arrangement. Computer algorithms were developed to acquire and process the data, and the improvements in the system allowed one data point per second to be acquired and plotted. The steady-state responses of the PDMS sensors in the presence of ethanol vapor were analyzed. The results showed that the noise level was reduced by approximately a factor of three after data processing.
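
    Treating each CCD row as a line of photodetectors, the fringe shift can be read out as the phase of the dominant spatial-frequency component of the row. A sketch with synthetic fringes (the fringe frequency, contrast, and phase step are assumed values, not measurements from the paper):

    ```python
    import numpy as np

    N = 640                          # pixels in one CCD row
    k = 12                           # assumed fringe frequency (cycles per row)
    x = np.arange(N)
    true_shift = 0.8                 # assumed phase change (rad) from PDMS swelling

    row_ref = 100 + 50 * np.cos(2 * np.pi * k * x / N)               # reference row
    row_voc = 100 + 50 * np.cos(2 * np.pi * k * x / N + true_shift)  # after exposure

    def fringe_phase(row, k):
        # phase of the k-th spatial-frequency component of one row
        return np.angle(np.fft.rfft(row)[k])

    shift = fringe_phase(row_voc, k) - fringe_phase(row_ref, k)
    ```

    Because the phase estimate averages over the whole row, it is far less sensitive to per-pixel noise than tracking a single fringe edge would be, which is consistent with the noise reduction the authors obtain from data processing.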

  14. Thomson scattering stray light reduction techniques using a CCD camera

    SciTech Connect

    Nilson, D.G.; Hill, D.N.; Evans, J.C.

    1996-02-01

    The DIII-D Thomson scattering system has been expanded to measure divertor plasma temperatures from 1-500 eV and densities from 0.05 to 8 × 10²⁰ m⁻³. To complete this system, a difficult stray light problem was overcome to allow for an accurate Rayleigh scattering density calibration. The initial stray light levels were over 500 times higher than the expected Rayleigh scattered signal. Using a CCD camera, various portions of the vessel interior were examined while the laser was fired through the vessel in air at atmospheric pressure. Image relaying, exit window tilting, entrance and exit baffle modifications, and a beam polarizer were then used to reduce the stray light to acceptable levels. The CCD camera gave prompt feedback on the effectiveness of each modification, without the need to re-establish the vacuum conditions required when using the normal avalanche photodiode detectors (APDs). Once the stray light was sufficiently reduced, the APD detectors provided the signal time history to more accurately identify the source location. We have also found that certain types of high-reflectance dielectric coatings produce 10 to 15 times more scatter than other, more conventional coatings. By using low-scatter mirror coatings and these new stray light reduction techniques, we now have more flexibility in the design of the complex Thomson scattering configurations required to probe the central core and the new radiative divertor regions of the DIII-D vessel.

  15. Development of high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    Presented in this paper is an outline of the R&D activities on high-speed video cameras that have been carried out at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabrück and other organizations. Extensive market research has been done (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searching journals and websites. Both support the need to develop a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996, using the same sensor as the previous camera; its frame rate is 50 million fps for triple framing and 4,500 fps for triple-light-wave framing, including color image capture. The idea of a 1-million-fps video camera based on an ISIS, In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved since. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, a prototype ISIS is being designed and will, hopefully, be fabricated in the near future. Epoch-making cameras in the history of high-speed video camera development by others are also briefly reviewed.

  16. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  17. Video Analysis with a Web Camera

    ERIC Educational Resources Information Center

    Wyrembeck, Edward P.

    2009-01-01

    Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as…

  19. Video imagers with low speed CCD and LC based on temporal compressed

    NASA Astrophysics Data System (ADS)

    Zhong, Xiaoming; Li, Huan; Zhao, Haibo; Liu, Yanli

    2015-08-01

    Traditional video imagers require a high-speed CCD. We present a new method to implement a video imager with a low-speed CCD detector based on temporal video compression. Using a low-speed CCD detector and a transmissive liquid crystal (LC) modulator instead of a high-speed CCD, the system acquires a coded data cube; with an appropriate data processing method, the compressed video data are reconstructed with high precision. Theoretical analysis and experimental results show that this approach not only preserves video imaging quality but also greatly reduces the frame rate required of the detector and the complexity of the video imaging system.
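
    The idea of trading frame rate for computation can be illustrated per pixel: the low-speed CCD reads out a few mask-coded sums of many frames, and the time series is recovered by regularized inversion. The mask statistics, compression ratio, and smoothness prior below are assumptions for illustration, not the authors' reconstruction method:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    T, K = 8, 4                                      # 8 video frames, 4 coded readouts
    M = rng.integers(0, 2, (K, T)).astype(float)     # known binary LC mask codes

    x_true = np.linspace(10.0, 20.0, T)              # pixel brightness, varying in time
    y = M @ x_true                                   # what the low-speed CCD reads out

    # invert the underdetermined system with a temporal-smoothness prior
    D = np.diff(np.eye(T), n=2, axis=0)              # second-difference operator
    lam = 0.1
    x_hat = np.linalg.solve(M.T @ M + lam * D.T @ D, M.T @ y)
    ```

    Because the assumed time series is linear in time, it lies in the null space of the second-difference operator and is recovered exactly from half as many readouts; real scenes need stronger space-time priors than this.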

  20. Photogrammetric Applications of Immersive Video Cameras

    NASA Astrophysics Data System (ADS)

    Kwiatek, K.; Tokarczyk, R.

    2014-05-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This causes problems when stitching together individual video frames from the particular cameras, but there are ways to overcome it, and applying immersive cameras in photogrammetry offers new potential. The paper presents two applications of immersive video in photogrammetry. The first is the creation of a low-cost mobile mapping system based on a Ladybug®3 camera and a GPS device. The number of panoramas is far higher than photogrammetry requires, as the baseline between spherical panoramas is around 1 metre. More than 92 000 panoramas were recorded in the Polish region of Czarny Dunajec, and measurements from the panoramas enable the user to measure outdoor advertising structures and billboards. A new law is being created to limit the number of illegal advertising structures in the Polish landscape, and immersive video recorded in a short period of time is a candidate for economical and flexible off-site measurement. The second application is the generation of 3D video-based reconstructions of heritage sites from immersive video (structure from immersive video). A mobile camera mounted on a tripod dolly was used to record an interior scene, and the immersive video, separated into thousands of still panoramas, was converted into 3D objects using Agisoft PhotoScan Professional. The findings from these experiments demonstrate that immersive photogrammetry is a flexible and prompt method of 3D modelling and provides promising features for mobile mapping systems.

  1. Stereovision measurement technology with rotation CCD camera of multi-object

    NASA Astrophysics Data System (ADS)

    Li, Xiaofeng; Jin, Jing; Li, Weimin

    2011-10-01

To address the problems of real-time, high-accuracy measurement of multiple targets over a large area, we propose a vision measuring method using a CCD Rotation Ranging System (CRRS). The CRRS consists of a mechanical rotating platform and an area-scan CCD camera. The camera is fixed on the rotating platform, which is driven by a motor, and is used to measure the positions of targets. First, the CCD cameras are calibrated before measurement. Second, the two CCD cameras are rotated and pitched by their platforms to cover a given field within the large area with high precision; together they form a binocular stereo vision system that measures the local coordinates of the target. Third, the poses of the two cameras are obtained from the rotating platforms, so the local coordinates of the target can be transformed into global coordinates. Experimental results show that the measurement device measures targets precisely and efficiently, without contact, using the rotating CCD probe.
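The third step above, mapping target coordinates from the local camera frame into global coordinates using the platform pose, can be sketched as follows. This is a minimal illustration only; the pan/tilt rotation convention, axis assignments, and all names are assumptions for the sketch, not taken from the paper:

```python
import numpy as np

def rot_z(a):
    """Rotation about the vertical (pan) axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(a):
    """Rotation about the horizontal (pitch) axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def local_to_global(p_local, pan, tilt, t):
    """Map a point measured in the camera's local frame into the global
    frame, given the platform pan/tilt angles and the camera position t."""
    R = rot_z(pan) @ rot_x(tilt)
    return R @ np.asarray(p_local, dtype=float) + np.asarray(t, dtype=float)
```

With both platform angles read from encoders, each stereo measurement is rotated and translated into the common global frame before targets from different fields are combined.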

  2. A CCD CAMERA-BASED HYPERSPECTRAL IMAGING SYSTEM FOR STATIONARY AND AIRBORNE APPLICATIONS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes a charge coupled device (CCD) camera-based hyperspectral imaging system designed for both stationary and airborne remote sensing applications. The system consists of a high performance digital CCD camera, an imaging spectrograph, an optional focal plane scanner, and a PC comput...

  3. Video Analysis with a Web Camera

    NASA Astrophysics Data System (ADS)

    Wyrembeck, Edward P.

    2009-01-01

Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's2 Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as Videopoint3 and Tracker,4 a freely downloadable program by Doug Brown, could also be used. I purchased Logitech's5 QuickCam Pro 4000 web camera for $99 after Rick Sorensen6 at Vernier Software and Technology recommended it for computers using a Windows platform. Once I had mounted the web camera on a mobile computer with Velcro and installed the software, I was ready to capture motion video and analyze it.

  4. Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Hardware

    NASA Astrophysics Data System (ADS)

    Kang, Y.-W.; Byun, Y. I.; Rhee, J. H.; Oh, S. H.; Kim, D. K.

    2007-12-01

We designed and developed a multi-purpose CCD camera system for three kinds of CCDs made by Kodak: the KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472). The system supports a fast USB port as well as a parallel port for data I/O and control signals. The packaging is based on two-stage circuit boards for size reduction and contains a built-in filter wheel. Basic hardware components include the clock pattern circuit, A/D conversion circuit, CCD data flow control circuit, and CCD temperature control unit. The CCD temperature can be controlled with an accuracy of approximately 0.4°C over a maximum temperature range of Δ33°C. The system has a readout noise of 6 e⁻ and a system gain of 5 e⁻/ADU. A total of 10 CCD camera systems were produced, and our tests show that all of them perform acceptably.

  5. VME image acquisition and processing using standard TV CCD cameras

    NASA Astrophysics Data System (ADS)

    Epaud, F.; Verdier, P.

    1994-12-01

The ESRF has released the first version of a low-cost image acquisition and processing system based on an industrial VME board and commercial CCD TV cameras. Images from standard CCIR (625-line) or EIA (525-line) inputs are digitised with 8-bit dynamic range and stored in a general-purpose frame buffer to be processed by the embedded firmware. They can also be transferred to a UNIX workstation through the network for display in an X11 window, or stored in a file for off-line processing with image analysis packages such as KHOROS, IDL, etc. The front-end VME acquisition system can be controlled through a Graphical User Interface (GUI) based on X11/Motif running under UNIX. The first release of the system is in operation and allows one to observe and analyse beam spots around the accelerators. The system has been extended to make it possible to position a micro-sample (less than 10 μm²) not visible to the naked eye. It is a general-purpose image acquisition system which may have wider applications.

  6. Analysis of focusing accuracy for multispectral CCD camera based on satellite

    NASA Astrophysics Data System (ADS)

    Lv, Shiliang; Liu, Jinguo

    2015-10-01

As a key technology for improving the imaging quality of a remote-sensing multispectral CCD camera, the performance of a focusing system for such a camera is presented in detail in this paper. First, the required focusing precision was calculated for the optical system, and a method of directly adjusting the multispectral CCD focal plane, suited to this camera's optics, was proposed. Second, we developed a focusing system with the advantages of low constructional complexity, easy hardware implementation, and high focusing sensitivity. Finally, an experimental test was constructed to evaluate the focusing precision of the system. The measured focusing precision is 3.62 μm (3σ) over a focusing range of ±2.5 mm. The experimental result shows that the proposed focusing system is reasonable, reliable, and stable, and meets the focusing precision requirements for the multispectral CCD camera.

  7. Use of CCD cameras for the differential restitution of photogrammetric snapshots

    NASA Astrophysics Data System (ADS)

    Behr, Franz-Josef

The use of Charge Coupled Device (CCD) cameras for the production of orthophotos is described. The integrated use of optical and electronic system components is identified as hybrid orthophoto production, which the video equipment makes possible. The radiometric and geometric properties of the system must be considered in the design of the hybrid orthophoto system in order to obtain a high-quality product. The system is based on a photo-processing base package which allows efficient software development as well as control and further handling (radiometric adjustment, contour generation). A series of examples of the differential restitution of individual objects, such as bridges or large-surface buildings, is presented. It is shown that, using appropriate software, an analytical plotter equipped with standardized photo-processing hardware can be used as an orthophoto projector.

  8. Abilities of Russian digital CCD cameras of serial manufacture for astronomical applications

    NASA Astrophysics Data System (ADS)

    Komarov, Vladimir V.; Komarov, Anton V.

    2007-05-01

We present the results of investigating recent Russian serially manufactured high-sensitivity black-and-white CCD cameras for optical telescope applications. Using the SDU-259 camera (OOO "Specteletehnika", Moscow) as an example, its capability as a digitized TV guiding camera for large optical telescopes is demonstrated. At SAO RAS, an SDU-259C camera equipped with a thermoelectric cooler was constructed. The parameters of the SDU-259C CCD camera and the results of its tests with a 10-inch Meade LXD-55 telescope are given.

  9. A new testing method of SNR for cooled CCD imaging camera based on stationary wavelet transform

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Liu, Qianshun; Yu, Feihong

    2013-08-01

Cooled CCD (charge coupled device) imaging cameras have found wide application in astronomy, color photometry, spectroscopy, medical imaging, densitometry, chemiluminescence, and epifluorescence imaging. A cooled CCD (CCCD) imaging camera differs from a traditional CCD/CMOS camera in that it can produce high-resolution images even in low-illumination environments. The signal-to-noise ratio (SNR) is the most popular parameter for evaluating digital image quality. Many researchers have proposed SNR testing methods for traditional CCD cameras, but these are seldom suitable for cooled CCD cameras because the dominant noise sources differ. In this paper, a new SNR testing method is proposed to evaluate the quality of images captured by a cooled CCD. The Stationary Wavelet Transform (SWT) is introduced into the testing method to obtain a more exact image SNR value; taking full advantage of the SWT in the image processing makes the experimental results accurate and reliable. To further refine the SNR testing results, the relation between SNR and integration time is also analyzed. The experimental results indicate that the proposed testing method accords with the SNR model of the CCCD; moreover, the testing values for a given system cluster around a single value, showing that the method is robust.
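The idea behind a wavelet-based SNR estimate can be sketched with a single level of an undecimated Haar transform: for an image carrying i.i.d. noise of standard deviation σ, the finest-scale diagonal detail coefficients also have standard deviation σ, so a robust statistic of those coefficients estimates the noise floor. This is a generic illustration of the technique, not the authors' actual algorithm:

```python
import numpy as np

def estimate_snr_db(img):
    """Estimate image SNR (dB): signal = mean level, noise = robust std of
    the level-1 undecimated Haar diagonal detail coefficients."""
    x = np.asarray(img, dtype=float)
    # Diagonal Haar detail at the finest scale; for i.i.d. noise of std
    # sigma these coefficients have std sigma as well.
    d = (x[:-1, :-1] - x[:-1, 1:] - x[1:, :-1] + x[1:, 1:]) / 2.0
    # Median absolute deviation gives a robust sigma estimate that is
    # insensitive to image structure leaking into the details.
    sigma = np.median(np.abs(d - np.median(d))) / 0.6745
    return 20.0 * np.log10(x.mean() / sigma)
```

A full SWT (e.g. PyWavelets' `swt2`) would give multiple scales to work with; the single-scale version above captures the principle.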

  10. Applying CCD Cameras in Stereo Panorama Systems for 3d Environment Reconstruction

    NASA Astrophysics Data System (ADS)

    Ashamini, A. Sh.; Varshosaz, M.; Saadatseresht, M.

    2012-07-01

Proper reconstruction of 3D environments is needed today by many organizations and applications. In addition to conventional methods, stereo panoramas are an appropriate technique due to their simplicity, low cost, and the ability to view an environment the way it is in reality. This paper investigates the ability of stereo CCD cameras to reconstruct and present a 3D environment and to perform geometric measurements within it. For this purpose, a rotating stereo panorama system was established using two CCDs with a baseline of 350 mm and a DVR (digital video recorder) box. The stereo system was first calibrated using a 3D test field and then used to perform accurate measurements. Tests in a real environment showed that although these cameras produce noisy images and lack good geometric stability, they can be easily synchronized and well controlled, and reasonable accuracy (about 40 mm for objects at 12 m from the camera) can be achieved.
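The reported accuracy is of the order predicted by standard stereo depth-error propagation, σ_Z = Z² σ_p / (B f), where σ_p is the matching precision expressed as a length on the sensor. A quick check under assumed optics; the focal length, pixel pitch, and matching precision below are illustrative guesses, not values from the paper:

```python
def depth_precision(Z, B, f, sigma_p):
    """Standard stereo error propagation: sigma_Z = Z^2 * sigma_p / (B * f).
    Z: object distance [m], B: baseline [m], f: focal length [m],
    sigma_p: matching precision on the sensor [m]."""
    return Z**2 * sigma_p / (B * f)

# Assumed (not from the paper): 8 mm lens, 4.65 um pixels, 0.2 px matching.
sigma_z = depth_precision(Z=12.0, B=0.35, f=0.008, sigma_p=0.2 * 4.65e-6)
```

Under these assumptions σ_Z comes out near 50 mm at 12 m, the same order as the 40 mm the authors report.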

  11. Measuring neutron fluences and gamma/x-ray fluxes with CCD cameras

    SciTech Connect

    Yates, G.J.; Smith, G.W.; Zagarino, P.; Thomas, M.C.

    1991-12-01

The capability to measure bursts of neutron fluence and gamma/x-ray flux directly with charge coupled device (CCD) cameras, while distinguishing between the video signals produced by the two types of radiation even when they occur simultaneously, has been demonstrated. Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) frame-transfer (FT) CCDs irradiated with pulsed neutrons (14 MeV) and Bremsstrahlung photons (4-12 MeV endpoint) are used to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate a responsivity of approximately 0.05 V/rad, with ≥1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or "peaks", binned by area and amplitude as functions of fluence in the 10⁵ to 10⁷ n/cm² range, indicate smearing over approximately 1 to 10% of the CCD array, with charge per pixel ranging between noise and saturation levels.

  12. Nios II implementation in CCD camera for Pi of the Sky experiment

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Rybka, Dominik; Romaniuk, Ryszard S.; Pozniak, Krzysztof T.

    2008-01-01

The concept of implementing an Altera Nios II embedded processor inside the Field Programmable Gate Array (FPGA) of the CCD camera for the "Pi of the Sky" experiment is presented. The digital board of the CCD camera, its most important components, the current implementation of the firmware (VHDL) inside the FPGA, and the role of the external 8051 microcontroller are briefly described. The main goal of the presented work is to eliminate the external microcontroller and to design a new system with the Nios II processor built into the FPGA chip. Constraints on implementing the design in the existing camera boards are discussed, and new possibilities offered by a larger FPGA for the next generation of cameras are considered.

  13. Synchronizing A Stroboscope With A Video Camera

    NASA Technical Reports Server (NTRS)

    Rhodes, David B.; Franke, John M.; Jones, Stephen B.; Dismond, Harriet R.

    1993-01-01

    Circuit synchronizes flash of light from stroboscope with frame and field periods of video camera. Sync stripper sends vertical-synchronization signal to delay generator, which generates trigger signal. Flashlamp power supply accepts delayed trigger signal and sends pulse of power to flash lamp. Designed for use in making short-exposure images that "freeze" flow in wind tunnel. Also used for making longer-exposure images obtained by use of continuous intense illumination.

  14. The image pretreatment based on the FPGA inside digital CCD camera

    NASA Astrophysics Data System (ADS)

    Tian, Rui; Liu, Yan-ying

    2009-07-01

For a space project, a digital CCD camera was required that could image clearly in a 1 lux light environment. The camera uses the Sony ICX285AL CCD sensor, with an FPGA (Field Programmable Gate Array) chip, the XQR2V1000, serving as the timing generator and signal processor inside the camera. In the low-light environment, however, two kinds of random noise become apparent as the camera's variable gain is increased: dark current noise in the image background and vertical transfer noise. This paper introduces a real-time method for eliminating this noise in the FPGA inside the camera. The causes and characteristics of the noise are analyzed. First, several approaches for eliminating dark current noise were proposed and then simulated in VC++ to compare their speed and effect; a Gaussian filter was chosen for its filtering performance. The vertical transfer noise has the property that the noise points occupy fixed columns in the two-dimensional image coordinates, and their gray values are 16-20 levels lower than the surrounding pixels. Based on these characteristics, a local median filter is used to remove the vertical noise. Finally, the algorithms were ported to the FPGA chip inside the camera. Extensive experiments proved that the pretreatment performs well in real time and improves the camera's signal-to-noise ratio by 3-5 dB in the low-light environment.
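The column-noise removal step can be sketched generically: flag pixels that sit a fixed 16-20 gray levels below their horizontal neighbours, then replace them with a local median. This is a plain-Python illustration of the idea, not the authors' FPGA implementation, and the window size is an assumption:

```python
import numpy as np

def remove_vertical_noise(img, low=16, high=20, win=3):
    """Replace pixels sitting a fixed 16-20 gray levels below their
    horizontal neighbours (vertical-transfer noise) with a local median."""
    x = np.asarray(img, dtype=float)
    out = x.copy()
    # Mean of left/right neighbours as the local reference level
    # (columns wrap at the edges, acceptable for this sketch).
    ref = (np.roll(x, 1, axis=1) + np.roll(x, -1, axis=1)) / 2.0
    mask = (ref - x >= low) & (ref - x <= high)
    h, w = x.shape
    for r, c in zip(*np.nonzero(mask)):
        r0, r1 = max(r - win, 0), min(r + win + 1, h)
        c0, c1 = max(c - win, 0), min(c + win + 1, w)
        out[r, c] = np.median(x[r0:r1, c0:c1])
    return out
```

Because the defective columns are fixed, a hardware version can precompute the column mask and apply only the median stage per frame.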

  15. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia; the purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors, and the present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral-density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
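The core of such a calibration, building the measured response curve (output signal versus known input brightness) and then inverting it to photometer new images, can be sketched as follows. The function names and the synthetic gamma-like response in the test are illustrative assumptions, not the authors' software:

```python
import numpy as np

def build_response(brightness, signal):
    """Sort calibration samples by measured signal so the response curve
    can be used as an interpolation table (signal -> brightness)."""
    order = np.argsort(signal)
    return np.asarray(signal, dtype=float)[order], np.asarray(brightness, dtype=float)[order]

def signal_to_brightness(measured, sig_curve, bri_curve):
    """Invert the (generally nonlinear) response by interpolating a
    measured integrated signal back onto the calibration brightness axis."""
    return np.interp(measured, sig_curve, bri_curve)
```

Because the inversion is a lookup rather than a parametric model, any monotone nonlinearity the camera exhibits is absorbed automatically, which is the point of the end-to-end approach.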

  16. An RS-170 to 700 frame per second CCD camera

    SciTech Connect

    Albright, K.L.; King, N.S.P.; Yates, G.J.; McDonald, T.E.; Turko, B.T.

    1993-08-01

    A versatile new camera, the Los Alamos National Laboratory (LANL) model GY6, is described. It operates at a wide variety of frame rates, from RS-170 to 700 frames per second. The camera operates as an NTSC compatible black and white camera when operating at RS-170 rates. When used for variable high-frame rates, a simple substitution is made of the RS-170 sync/clock generator circuit card with a high speed emitter-coupled logic (ECL) circuit card.

  17. Evaluation of intraoral CCD camera for dental examination in forensic inspection.

    PubMed

    Tsuzuki, Tamiyuki; Ueno, Asao; Kajiwara, Masahiro; Hanaoka, Yoichi; Uchiyama, Hideki; Agawa, Yukihisa; Takagi, Tetsuya; Sato, Yoshinobu

    2002-03-01

This study was performed to assess the effectiveness of an intraoral CCD camera for dental examinations when sufficient jaw opening or adequate lighting cannot be obtained. A handpiece-type intraoral CCD camera (Crystal Cam; GC Corp., Japan) was used for the study. Because a full view taken by the intraoral CCD camera covers only one or two teeth, all the teeth were individually photographed and a view of the dentition assembled on a personal computer. To simulate a jaw that could not be opened widely enough to inspect an occlusal view, mouth opening of a dry skull and of a volunteer was restricted, and all the teeth were photographed with the intraoral CCD camera. These images were compared to intraoral photographs taken by the conventional method using a single-lens reflex camera and mirror. When the intraoral CCD camera was used to photograph teeth, the color tone of metal restorations could be readily identified, but special care was required to identify carious lesions, discoloration of tooth structure, and esthetic restorations. The dentition photographs assembled from the original intraoral CCD images were transferred via the Internet as e-mail attachment files to allow preparation of the dental chart at the destination. Based on the transferred images, it was possible to prepare a dental chart agreeing satisfactorily with actual oral conditions. The easy transfer of digital images provides various advantages in evaluating and discussing certain cases in cooperation with other forensic odontologists via the Internet. The camera could be made more effective and useful through improvement of the tip portion and of the entire system, to achieve a more compact design and better portability. PMID:12935691

  18. Radiation damage of the PCO Pixelfly VGA CCD camera of the BES system on KSTAR tokamak

    NASA Astrophysics Data System (ADS)

Náfrádi, Gábor; Kovácsik, Ákos; Pór, Gábor; Lampert, Máté; Un Nam, Yong; Zoletnik, Sándor

    2015-01-01

A PCO Pixelfly VGA CCD camera, part of the Beam Emission Spectroscopy (BES) diagnostic system on the Korea Superconducting Tokamak Advanced Research (KSTAR) device and used for spatial calibrations, suffered serious radiation damage: white pixel defects were generated in it. The main goal of this work was to identify the origin of the radiation damage and to propose solutions to avoid it. A Monte Carlo N-Particle eXtended (MCNPX) model was built using the Monte Carlo Modeling Interface Program (MCAM), and calculations were carried out to predict the neutron and gamma-ray fields at the camera position. In addition to the MCNPX calculations, pure gamma-ray irradiations of the CCD camera were carried out in the Training Reactor of BME. Before, during, and after the irradiations, numerous frames were taken with the camera at 5 s exposure times. Evaluation of these frames showed that even at the high gamma-ray dose (1.7 Gy) and dose-rate levels (up to 2 Gy/h) applied, the number of white pixels did not increase. We found that the origin of the white pixel generation was neutron-induced thermal hopping of the electrons, which means that in the future only neutron shielding is necessary around the CCD camera. Another solution could be to replace the CCD camera with a more radiation-tolerant one, for example a suitable CMOS camera, or to apply both solutions simultaneously.
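The frame evaluation described, counting white pixel defects in dark frames before and after irradiation, can be sketched generically; the threshold choice and function names here are assumptions, not the authors' analysis code:

```python
import numpy as np

def count_white_pixels(frame, threshold):
    """Count pixels whose dark-frame signal exceeds the defect threshold."""
    return int(np.count_nonzero(np.asarray(frame) > threshold))

def new_defects(before, after, threshold):
    """Boolean map of pixels hot after irradiation but not before."""
    return (np.asarray(after) > threshold) & ~(np.asarray(before) > threshold)
```

Tracking `count_white_pixels` across a series of long-exposure dark frames, and mapping `new_defects` per irradiation, is enough to attribute defect creation to one radiation field or another.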

  19. Interline Transfer CCD Camera for Gated Broadband Coherent Anti-Stokes Raman-Scattering Measurements.

    PubMed

    Roy, S; Ray, G; Lucht, R P

    2001-11-20

    Use of an interline transfer CCD camera for the acquisition of broadband coherent anti-Stokes Raman-scattering (CARS) spectra is demonstrated. The interline transfer CCD has alternating columns of imaging and storage pixels that allow one to acquire two successive images by shifting the first image in the storage pixels and immediately acquiring the second image. We have used this dual-image mode for gated CARS measurements by acquiring a CARS spectral image and shifting it rapidly from the imaging pixel columns to the storage pixel columns. We have demonstrated the use of this dual-image mode for gated single-laser-shot measurement of hydrogen and nitrogen CARS spectra at room temperature and in atmospheric pressure flames. The performance of the interline transfer CCD for these CARS measurements is compared directly with the performance of a back-illuminated unintensified CCD camera. PMID:18364895

  20. The In-flight Spectroscopic Performance of the Swift XRT CCD Camera During 2006-2007

    NASA Technical Reports Server (NTRS)

    Godet, O.; Beardmore, A.P.; Abbey, A.F.; Osborne, J.P.; Page, K.L.; Evans, P.; Starling, R.; Wells, A.A.; Angelini, L.; Burrows, D.N.; Kennea, J.; Campana, S.; Chincarini, G.; Citterio, O.; Cusumano, G.; LaParola, V.; Mangano, V.; Mineo, T.; Giommi, P.; Perri, M.; Capalbi, M.; Tamburelli, F.

    2007-01-01

    The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 135 eV FWHM at 5.9 keV as measured before launch. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Comparison of observed spectra with models folded through the instrument response produces negative residuals around and below the Oxygen edge. We discuss several possible causes for such residuals. Traps created by proton damage on the CCD increase the charge transfer inefficiency (CTI) over time. We describe the evolution of the CTI since the launch and its effect on the CCD spectral resolution and the gain.

Automated CCD camera characterization. 1998 summer research program for high school juniors at the University of Rochester's Laboratory for Laser Energetics: Student research reports

    SciTech Connect

    Silbermann, J.

    1999-03-01

The OMEGA system uses CCD cameras for a broad range of applications. Over 100 video-rate CCD cameras are used for purposes such as targeting, aligning, and monitoring areas like the target chamber, laser bay, and viewing gallery. Approximately 14 scientific-grade CCD cameras on the system are used to obtain precise photometric results from the laser beam as well as target diagnostics. It is very important that these scientific-grade CCDs be properly characterized so that their results can be evaluated appropriately. Currently, characterization is a tedious process done by hand: the operator must operate the camera and light source simultaneously, and because more exposures yield more accurate information about the camera, the characterization tests can become very lengthy affairs. Sometimes it takes an entire day to complete a single plot. Characterization requires testing many aspects of the camera's operation, including the following: variance vs. mean signal level, which should be proportional due to the Poisson statistics of the incident photon flux; linearity, the ability of the CCD to produce signals proportional to the light received; signal-to-noise ratio, the relative magnitude of the signal vs. the uncertainty in that signal; and dark current, the noise due to thermal generation of electrons (cooling lowers this noise contribution significantly). These tests, as well as many others, must be conducted in order to properly understand a CCD camera. The goal of this project was to construct an apparatus that could characterize a camera automatically.
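The first test listed, variance versus mean signal level, is the classic photon transfer curve: because photon arrivals obey Poisson statistics, variance in ADU is proportional to mean in ADU, and the slope of that line is the reciprocal of the system gain in e⁻/ADU. A minimal sketch of extracting the gain, not the apparatus software:

```python
import numpy as np

def ptc_gain(means, variances):
    """Photon transfer curve: fit variance = mean / gain + const.
    The fitted slope is 1/gain, so gain [e-/ADU] = 1 / slope."""
    slope, _intercept = np.polyfit(means, variances, 1)
    return 1.0 / slope
```

In practice each (mean, variance) pair comes from a flat-field exposure pair at one light level, with the variance taken from the difference image to cancel fixed-pattern noise.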

  2. Cooled video camera for optical investigations below 1 mK

    NASA Astrophysics Data System (ADS)

    Alles, H.; Ruutu, J. P.; Babkin, A. V.; Hakonen, P. J.; Manninen, A. J.; Pekola, J. P.

    1994-05-01

A millikelvin-temperature imaging system based on a cooled CCD operating at real-time video speed was constructed to obtain images of the superfluid phases of He. A black-and-white surveillance camera capable of operating down to T = 60 K is utilized. To compensate for the phase shift, two CMOS gates with a total delay of 10 ns are inserted into the clock signal line of the CCD read-out control. To decrease the heat leak into the nuclear demagnetization cryostat, laser light of low power is preferred, and the millikelvin components should be shielded from rf radiation.

  3. Development of CCD Cameras for Soft X-ray Imaging at the National Ignition Facility

    SciTech Connect

    Teruya, A. T.; Palmer, N. E.; Schneider, M. B.; Bell, P. M.; Sims, G.; Toerne, K.; Rodenburg, K.; Croft, M.; Haugh, M. J.; Charest, M. R.; Romano, E. D.; Jacoby, K. D.

    2013-09-01

    The Static X-Ray Imager (SXI) is a National Ignition Facility (NIF) diagnostic that uses a CCD camera to record time-integrated X-ray images of target features such as the laser entrance hole of hohlraums. SXI has two dedicated positioners on the NIF target chamber for viewing the target from above and below, and the X-ray energies of interest are 870 eV for the “soft” channel and 3 – 5 keV for the “hard” channels. The original cameras utilize a large format back-illuminated 2048 x 2048 CCD sensor with 24 micron pixels. Since the original sensor is no longer available, an effort was recently undertaken to build replacement cameras with suitable new sensors. Three of the new cameras use a commercially available front-illuminated CCD of similar size to the original, which has adequate sensitivity for the hard X-ray channels but not for the soft. For sensitivity below 1 keV, Lawrence Livermore National Laboratory (LLNL) had additional CCDs back-thinned and converted to back-illumination for use in the other two new cameras. In this paper we describe the characteristics of the new cameras and present performance data (quantum efficiency, flat field, and dynamic range) for the front- and back-illuminated cameras, with comparisons to the original cameras.

  4. Wilbur: A low-cost CCD camera system for MDM Observatory

    NASA Technical Reports Server (NTRS)

    Metzger, M. R.; Luppino, G. A.; Tonry, J. L.

    1992-01-01

The recent availability of several 'off-the-shelf' components, particularly CCD control electronics from SDSU, has made it possible to put together a flexible CCD camera system at relatively low cost and effort. The authors describe Wilbur, a complete CCD camera system constructed for the Michigan-Dartmouth-MIT Observatory. The hardware consists of a Loral 2048² CCD controlled by the SDSU electronics, an existing dewar design modified for use at MDM, a Sun SPARCstation 2 with a commercial high-speed parallel controller, and a simple custom interface between the controller and the SDSU electronics. The camera is controlled from the SPARCstation by software that provides low-level I/O in real time, collection of additional information from the telescope, and a simple command interface for use by an observer. Readout of the 2048² array is complete in under two minutes at 5 e⁻ read noise, and readout time can be decreased at the cost of increased noise. The system can be easily expanded to handle multiple CCDs and multiple readouts, and can control other dewars/CCDs using the same host software.

  5. Preliminary results from a single-photon imaging X-ray charge coupled device /CCD/ camera

    NASA Technical Reports Server (NTRS)

    Griffiths, R. E.; Polucci, G.; Mak, A.; Murray, S. S.; Schwartz, D. A.; Zombeck, M. V.

    1981-01-01

    A CCD camera is described which has been designed for single-photon X-ray imaging in the 1-10 keV energy range. Preliminary results are presented from the front-side illuminated Fairchild CCD 211, which has been shown to image well at 3 keV. The problem of charge-spreading above 4 keV is discussed by analogy with a similar problem at infrared wavelengths. The total system noise is discussed and compared with values obtained by other CCD users.

  6. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral-density (ND) filtering in front of the theodolite's telescope and then safely view and measure the laser's boresight through the telescope without great risk to one's eyes. This method, usable for a Class II visible-wavelength laser, is not acceptable to even consider attempting for a Class IV laser and is not applicable to an infrared (IR) laser: if one chooses insufficient attenuation or forgets to use the filters, looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available; it is a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. The other equipment needed to operate the camera consists of power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. An additional advantage of a cheap black-and-white CCD camera is that it is sensitive to infrared as well as visible light; hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.
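The "calculated amount" of neutral density mentioned above follows from the definition of optical density, OD = log₁₀(P_in/P_safe); each OD unit attenuates the beam tenfold. A one-line sketch, with power levels that are illustrative assumptions rather than values from the article:

```python
import math

def required_od(beam_power, safe_power):
    """Optical density needed to attenuate a beam from beam_power down to
    safe_power (same units): OD = log10(P_in / P_safe)."""
    return math.log10(beam_power / safe_power)

# e.g. attenuating a 1 W beam to 0.1 mW calls for OD 4 of stacked filters
od = required_od(1000.0, 0.1)  # powers in mW
```

This arithmetic is exactly what the camera-equipped theodolite makes unnecessary for direct viewing, since only the sensor, not an eye, is exposed.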

  7. Video-Based Point Cloud Generation Using Multiple Action Cameras

    NASA Astrophysics Data System (ADS)

    Teo, T.

    2015-05-01

    Due to the development of action cameras, the use of video technology for collecting geo-spatial data has become an important trend. The objective of this study is to compare the image mode and video mode of multiple action cameras for 3D point cloud generation. Frame images are acquired from discrete camera stations, while videos are taken along continuous trajectories. The proposed method includes five major parts: (1) camera calibration, (2) video conversion and alignment, (3) orientation modelling, (4) dense matching, and (5) evaluation. As action cameras usually have a large FOV in wide viewing mode, camera calibration plays an important role in correcting the effect of lens distortion before image matching. Once the cameras had been calibrated, the authors used them to take video in an indoor environment. The videos were then converted into multiple frame images based on the frame rates. In order to overcome time synchronization issues between videos from different viewpoints, an additional timer app is used to determine the time-shift factor between cameras for time alignment. A structure from motion (SfM) technique is utilized to obtain the image orientations. Then, the semi-global matching (SGM) algorithm is adopted to obtain dense 3D point clouds. The preliminary results indicated that the 3D points from 4K video are similar to those from 12 MP images, but the data acquisition performance of 4K video is more efficient than that of 12 MP digital images.
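    The time-shift alignment between videos reduces to mapping a common event time to per-camera frame indices. A minimal sketch of that mapping, assuming a measured start-time offset between cameras (the function and variable names are illustrative, not from the paper):

```python
def frame_index(t_event, t_shift, fps):
    """Map an event time (seconds on the reference clock) to a frame index
    for a camera whose recording started t_shift seconds after the
    reference camera, at the given frame rate."""
    return round((t_event - t_shift) * fps)

# Reference camera starts at t = 0; a second camera starts 0.25 s later.
# The same event at t = 2.0 s lands on different frame indices:
fps = 30.0
i_ref = frame_index(2.0, 0.0, fps)    # frame 60 in the reference video
i_cam2 = frame_index(2.0, 0.25, fps)  # frame 52 in the delayed video
print(i_ref, i_cam2)
```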

  8. Inexpensive range camera operating at video speed.

    PubMed

    Kramer, J; Seitz, P; Baltes, H

    1993-05-01

    An optoelectronic device has been developed and built that acquires and displays the range data of an object surface in space in video real time. The recovery of depth is performed with active triangulation. A galvanometer scanner system sweeps a sheet of light across the object at a video field rate of 50 Hz. High-speed signal processing is achieved through the use of a special optical sensor and hardware implementation of the simple electronic-processing steps. Fifty range maps are generated per second and converted into a European standard video signal where the depth is encoded in gray levels or color. The image resolution currently is 128 x 500 pixels with a depth accuracy of 1.5% of the depth range. The present setup uses a 500-mW diode laser for the generation of the light sheet. A 45-mm imaging lens covers a measurement volume of 93 mm x 61 mm x 63 mm at a medium distance of 250 mm from the camera, but this can easily be adapted to other dimensions. PMID:20820391
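    Active triangulation with a swept light sheet reduces, per pixel, to intersecting the camera's viewing ray with the known plane of the laser sheet. A minimal sketch of that geometry under an assumed coordinate frame (not the paper's implementation):

```python
import math

def depth_from_sheet(x_over_f, baseline, theta):
    """Depth Z of the intersection of a camera ray with the laser sheet.

    The camera at the origin looks along +Z; a pixel defines the ray
    direction (x/f, 1) in the XZ plane. The light sheet passes through
    the laser source at (baseline, 0) and makes angle theta (radians)
    with the baseline, i.e. the plane satisfies X = baseline - Z/tan(theta).
    Solving Z*(x/f) = baseline - Z/tan(theta) for Z gives:
    """
    return baseline / (x_over_f + 1.0 / math.tan(theta))

# Example: source 3 units to the side, sheet at 45 degrees; a pixel with
# x/f = 0.5 sees the sheet at depth Z = 3 / (0.5 + 1) = 2.
print(depth_from_sheet(0.5, 3.0, math.pi / 4))  # -> 2.0
```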

  9. Optical synthesizer for a large quadrant-array CCD camera: Center director's discretionary fund

    NASA Technical Reports Server (NTRS)

    Hagyard, Mona J.

    1992-01-01

    The objective of this program was to design and develop an optical device, an optical synthesizer, that focuses four contiguous quadrants of a solar image on four spatially separated CCD arrays that are part of a unique CCD camera system. This camera and the optical synthesizer will be part of the new NASA-Marshall Experimental Vector Magnetograph, an instrument developed to measure the Sun's magnetic field as accurately as present technology allows. The tasks undertaken in the program are outlined and the final detailed optical design is presented.

  10. Research on detecting heterogeneous fibre from cotton based on linear CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Xian-bin; Cao, Bing; Zhang, Xin-peng; Shi, Wei

    2009-07-01

    Heterogeneous fibre in cotton has a great impact on the production of cotton textiles: it degrades product quality and thereby the economic benefit and market competitiveness of the producer. Detecting and eliminating heterogeneous fibre is therefore particularly important for improving cotton machining processes, advancing the quality of cotton textiles, and reducing production cost, and the technology has favorable market value and development prospects. An optical detecting system has found widespread application here. In this system, we use a linear CCD camera to scan the running cotton; the video signals are then fed into a computer and processed according to grayscale differences. If there is heterogeneous fibre in the cotton, the computer sends a command to drive a gas nozzle that eliminates it. In this paper, we adopt a monochrome LED array as the new detecting light source; its flicker, luminous-intensity stability, lumen depreciation, and useful life are all superior to those of fluorescent light. We first analyse the reflection spectra of cotton and various heterogeneous fibres, then select an appropriate frequency for the light source, finally adopting a violet LED array as the new detecting light source. The whole hardware structure and the software design are introduced in this paper.
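    The grayscale-difference step can be illustrated with a minimal sketch: threshold each scan line from the linear CCD against the expected bright cotton background and report pixel runs that deviate, which would trigger the ejection nozzle. The background level, tolerance, and array layout below are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def detect_foreign_fiber(scanline, background=200, tolerance=40):
    """Return (start, stop) index runs whose gray level deviates from the
    bright cotton background by more than `tolerance` -- candidate
    heterogeneous fibres."""
    mask = np.abs(scanline.astype(int) - background) > tolerance
    # Pad and difference the boolean mask to find run boundaries.
    padded = np.concatenate(([False], mask, [False])).astype(np.int8)
    edges = np.flatnonzero(np.diff(padded))
    return list(zip(edges[::2], edges[1::2]))

line = np.full(100, 200, dtype=np.uint8)
line[40:45] = 90          # a dark foreign fibre in the scan line
print(detect_foreign_fiber(line))  # -> [(40, 45)]
```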

  11. Auto-measurement system of aerial camera lens' resolution based on orthogonal linear CCD

    NASA Astrophysics Data System (ADS)

    Zhao, Yu-liang; Zhang, Yu-ye; Ding, Hong-yi

    2010-10-01

    The resolution of an aerial camera lens is one of the most important indexes of camera performance, and its measurement and calibration are important test items in camera maintenance. The traditional method, observing the resolution panel of a collimator by eye through a reading microscope and doing some computation, is inefficient and susceptible to human factors, and its results are unstable. An auto-measurement system for aerial camera lens resolution is introduced that uses an orthogonal linear CCD sensor as the detector in place of the reading microscope; the system measures automatically and shows results in real time. In order to measure the smallest identifiable pattern diameter on the resolution panel, two orthogonal linear CCDs are laid on the imaging plane of the measured lens, forming four intersection points on the CCDs. A coordinate system is determined by the origin of the linear CCDs, and a circle is determined by the four intersection points. To obtain the circle's radius, the image of the resolution panel is first transformed into the pulse widths of an electric signal, which is sent to the computer through an amplifying circuit, a threshold comparator, and a counter. Second, the smallest circle is extracted for measurement; the circle extraction uses the wavelet transform, which is localized in both time and frequency and is capable of multi-scale analysis. Lastly, the resolution of the measured lens is obtained from the standard resolution formula. The measuring precision in practical measurement is analyzed, and the results indicate that precision improves when a linear CCD is used instead of a reading microscope. Moreover, the system error is determined by the pixel size of the CCD; as CCD technology develops and pixels become smaller, the system error will be greatly reduced.
    The auto-measuring system thus has high practical value and wide application prospects.
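    The circle determined by the four intersection points on the two orthogonal linear CCDs has a closed-form solution: the perpendicular bisector of each chord passes through the center, so the chord midpoints give the center coordinates directly. A sketch of that computation, assuming the two CCDs lie along the coordinate axes (an illustrative convention, not necessarily the paper's):

```python
import math

def circle_from_orthogonal_chords(x1, x2, y1, y2):
    """Center and radius of the circle crossing the x-axis CCD at x1, x2
    and the y-axis CCD at y1, y2. Each chord midpoint gives one
    coordinate of the center; the radius follows from any crossing."""
    cx = (x1 + x2) / 2.0
    cy = (y1 + y2) / 2.0
    r = math.hypot(x1 - cx, cy)  # distance from the center to (x1, 0)
    return cx, cy, r

# A circle centered at (3, 4) with radius 5 crosses the axes at
# x = 0, 6 and y = 0, 8:
print(circle_from_orthogonal_chords(0.0, 6.0, 0.0, 8.0))  # -> (3.0, 4.0, 5.0)
```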

  12. A Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Software

    NASA Astrophysics Data System (ADS)

    Oh, S. H.; Kang, Y. W.; Byun, Y. I.

    2007-12-01

    We present software which we developed for the multi-purpose CCD camera. This software can be used with all 3 types of CCD made by KODAK: KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472). For efficient CCD camera control, the software runs as two independent processes: the CCD control program and the temperature/shutter operation program. The software is designed for fully automatic as well as manual operation under LINUX, and is controlled via LINUX user signal handling. We plan to use this software for an all-sky survey system and also for night-sky monitoring and sky observation. In our results, the read-out times of the CCDs are about 15 sec, 64 sec, and 134 sec for the KAF-0401E, KAF-1602E, and KAF-3200E respectively, because these times are limited by the data transmission speed of the parallel port. Larger-format CCDs require higher-speed data transmission, so we are considering porting this control software to the USB port for faster transfer.

  13. A CCD Camera with Electron Decelerator for Intermediate Voltage Electron Microscopy

    SciTech Connect

    Downing, Kenneth H; Downing, Kenneth H.; Mooney, Paul E.

    2008-03-17

    Electron microscopists are increasingly turning to Intermediate Voltage Electron Microscopes (IVEMs) operating at 300 - 400 kV for a wide range of studies. They are also increasingly taking advantage of slow-scan charge coupled device (CCD) cameras, which have become widely used on electron microscopes. Under some conditions CCDs provide an improvement in data quality over photographic film, as well as the many advantages of direct digital readout. However, CCD performance is seriously degraded on IVEMs compared to the more conventional 100 kV microscopes. In order to increase the efficiency and quality of data recording on IVEMs, we have developed a CCD camera system in which the electrons are decelerated to below 100 kV before impacting the camera, resulting in greatly improved performance in both signal quality and resolution compared to other CCDs used in electron microscopy. These improvements will allow high-quality image and diffraction data to be collected directly with the CCD, enabling improvements in data collection for applications including high-resolution electron crystallography, single-particle reconstruction of protein structures, tomographic studies of cell ultrastructure and remote microscope operation. This approach will enable us to use even larger format CCD chips that are being developed with smaller pixels.

  14. Color video camera capable of 1,000,000 fps with triple ultrahigh-speed image sensors

    NASA Astrophysics Data System (ADS)

    Maruyama, Hirotaka; Ohtake, Hiroshi; Hayashida, Tetsuya; Yamada, Masato; Kitamura, Kazuya; Arai, Toshiki; Tanioka, Kenkichi; Etoh, Takeharu G.; Namiki, Jun; Yoshida, Tetsuo; Maruno, Hiromasa; Kondo, Yasushi; Ozaki, Takao; Kanayama, Shigehiro

    2005-03-01

    We developed an ultrahigh-speed, high-sensitivity, color camera that captures moving images of phenomena too fast to be perceived by the human eye. The camera operates well even under restricted lighting conditions. It incorporates a special CCD device that is capable of ultrahigh-speed shots while retaining its high sensitivity. Its ultrahigh-speed shooting capability is made possible by directly connecting CCD storages, which record video images, to photodiodes of individual pixels. Its large photodiode area together with the low-noise characteristic of the CCD contributes to its high sensitivity. The camera can clearly capture events even under poor light conditions, such as during a baseball game at night. Our camera can record the very moment the bat hits the ball.

  15. Frequency analysis for roughness of optical surface by focal plane CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Jianbai; Ying, Aihan; Li, Xiaoyun; Zhang, Xiaolin; Zao, Anqing

    1997-09-01

    In this paper, a new method for evaluating and measuring the roughness of an optical surface by Fourier spatial-frequency analysis is presented. The authors have built equipment in which electron photomicrographs of the optical surface are scanned and analyzed by a CCD camera-microcomputer system. The new method has good vertical and lateral resolution, fast scanning speed, and better measuring accuracy.

  16. Demonstrations of Optical Spectra with a Video Camera

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2012-01-01

    The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera

  17. Time-Resolved Spectra of Dense Plasma Focus Using Spectrometer, Streak Camera, CCD Combination

    SciTech Connect

    F. J. Goldin, B. T. Meehan, E. C. Hagen, P. R. Wilkins

    2010-10-01

    A time-resolving spectrographic instrument has been assembled with the primary components of a spectrometer, image-converting streak camera, and CCD recording camera, for the primary purpose of diagnosing highly dynamic plasmas. A collection lens defines the sampled region and couples light from the plasma into a step index, multimode fiber which leads to the spectrometer. The output spectrum is focused onto the photocathode of the streak camera, the output of which is proximity-coupled to the CCD. The spectrometer configuration is essentially Czerny-Turner, but off-the-shelf Nikon refraction lenses, rather than mirrors, are used for practicality and flexibility. Only recently assembled, the instrument requires significant refinement, but has now taken data on both bridge wire and dense plasma focus experiments.

  18. Time-resolved spectra of dense plasma focus using spectrometer, streak camera, and CCD combination.

    PubMed

    Goldin, F J; Meehan, B T; Hagen, E C; Wilkins, P R

    2010-10-01

    A time-resolving spectrographic instrument has been assembled with the primary components of a spectrometer, image-converting streak camera, and CCD recording camera, for the primary purpose of diagnosing highly dynamic plasmas. A collection lens defines the sampled region and couples light from the plasma into a step index, multimode fiber which leads to the spectrometer. The output spectrum is focused onto the photocathode of the streak camera, the output of which is proximity-coupled to the CCD. The spectrometer configuration is essentially Czerny-Turner, but off-the-shelf Nikon refraction lenses, rather than mirrors, are used for practicality and flexibility. Only recently assembled, the instrument requires significant refinement, but has now taken data on both bridge wire and dense plasma focus experiments. PMID:21034059

  19. A Simple Approach of CCD Camera Calibration for Optical Diagnostics Instrumentation

    NASA Technical Reports Server (NTRS)

    Cha, Soyoung Stephen; Leslie, Fred W.; Ramachandran, Narayanan; Rose, M. Franklin (Technical Monitor)

    2001-01-01

    Solid-state array sensors are ubiquitous nowadays for obtaining gross field images in numerous scientific and engineering applications, including optical diagnostics and instrumentation. Linear responses of these sensors are often required, as in interferometry, light scattering and attenuation measurements, and photometry. In most applications, linearity is usually taken for granted without thorough quantitative assessment or correction through calibration. Upper-grade, high-priced CCD cameras may offer better linearity; however, they too require linearity checking and correction if necessary, and intermediate- or low-grade CCD cameras are more likely to need calibration for linearity. Here, we present two very simple approaches: one for quickly checking camera linearity without any additional setup, and one for precisely correcting nonlinear sensor responses. It is believed that after calibration, sensors of intermediate or low grade can function as effectively as their expensive counterparts.
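    A nonlinear-response correction of the kind described can be sketched as a polynomial calibration: record the raw sensor output at known relative exposures, fit the inverse response curve, and apply the fit to linearize readings. The synthetic data and polynomial degree below are illustrative assumptions, not the authors' procedure:

```python
import numpy as np

# Known relative exposures and a synthetic, mildly compressive raw response.
exposure = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
raw      = np.array([0.0, 0.198, 0.392, 0.582, 0.768, 0.95])

# Fit raw -> exposure with a cubic; applying the fit linearizes readings.
coeffs = np.polyfit(raw, exposure, deg=3)
linearize = np.poly1d(coeffs)

corrected = linearize(raw)
# After correction, the response tracks the true exposure far more closely
# than the raw output did.
print(np.max(np.abs(corrected - exposure)))
```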

  20. Time-resolved spectra of dense plasma focus using spectrometer, streak camera, and CCD combination

    SciTech Connect

    Goldin, F. J.; Meehan, B. T.; Hagen, E. C.; Wilkins, P. R.

    2010-10-15

    A time-resolving spectrographic instrument has been assembled with the primary components of a spectrometer, image-converting streak camera, and CCD recording camera, for the primary purpose of diagnosing highly dynamic plasmas. A collection lens defines the sampled region and couples light from the plasma into a step index, multimode fiber which leads to the spectrometer. The output spectrum is focused onto the photocathode of the streak camera, the output of which is proximity-coupled to the CCD. The spectrometer configuration is essentially Czerny-Turner, but off-the-shelf Nikon refraction lenses, rather than mirrors, are used for practicality and flexibility. Only recently assembled, the instrument requires significant refinement, but has now taken data on both bridge wire and dense plasma focus experiments.

  1. The University of Hawaii Institute for Astronomy CCD camera control system

    NASA Technical Reports Server (NTRS)

    Jim, K. T. C.; Yamada, H. T.; Luppino, G. A.; Hlivak, R. J.

    1992-01-01

    The University of Hawaii Institute for Astronomy CCD Camera Control System consists of a NeXT workstation, a graphical user interface, and a fiber optics communications interface which is connected to a San Diego State University CCD controller. The UH system employs the NeXT-resident Motorola DSP 56001 as a real time hardware controller. The DSP 56001 is interfaced to the Mach-based UNIX of the NeXT workstation by DMA and multithreading. Since the SDSU controller also uses the DSP 56001, the NeXT is used as a development platform for the embedded control software. The fiber optic interface links the two DSP 56001's through their Synchronous Serial Interfaces. The user interface is based on the NeXTStep windowing system. It is easy to use and features real-time display of image data and control over all camera functions. Both Loral and Tektronix 2048 x 2048 CCD's have been driven at full readout speeds, and the system is intended to be capable of simultaneous readout of four such CCD's. The total hardware package is compact enough to be quite portable and has been used on five different telescopes on Mauna Kea. The complete CCD control system can be assembled for a very low cost. The hardware and software of the control system have proven to be quite reliable, well adapted to the needs of astronomers, and extensible to increasingly complicated control requirements.

  2. PIV camera response to high frequency signal: comparison of CCD and CMOS cameras using particle image simulation

    NASA Astrophysics Data System (ADS)

    Abdelsalam, D. G.; Stanislas, M.; Coudert, S.

    2014-08-01

    We present a quantitative comparison of the responses of FlowMaster3 CCD and Phantom V9.1 CMOS cameras in the scope of application to particle image velocimetry (PIV). First, the subpixel response is characterized using a specifically designed set-up. The crosstalk between adjacent pixels for the two cameras is then estimated and compared. The camera response is then experimentally characterized using particle image simulation. Based on a three-point Gaussian peak fit, the bias and RMS errors between the locations of simulated and real images for the two cameras are accurately calculated using a homemade program. The results show that, although the pixel response is not perfect, the optical crosstalk between adjacent pixels stays relatively low and the accuracy of the position determination of an ideal PIV particle image is much better than expected.
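    The three-point Gaussian peak fit mentioned above has a standard closed form in PIV processing: fit a parabola to the logarithm of the three samples around the intensity maximum and take its vertex as the sub-pixel peak location. A minimal sketch on synthetic data (not the authors' program):

```python
import numpy as np

def gaussian_peak_subpixel(intensity):
    """Sub-pixel peak location of a 1-D intensity profile using the
    three-point Gaussian estimator common in PIV processing."""
    i = int(np.argmax(intensity))
    a, b, c = np.log(intensity[i - 1 : i + 2])
    # Vertex of the parabola through (-1, a), (0, b), (1, c):
    return i + (a - c) / (2.0 * (a - 2.0 * b + c))

# Samples of a Gaussian particle image centered at x = 5.3 pixels:
x = np.arange(10)
profile = np.exp(-((x - 5.3) ** 2) / 2.0)
print(gaussian_peak_subpixel(profile))  # -> 5.3 (exact for a pure Gaussian)
```

For noise-free samples of a true Gaussian, the estimator is exact, which is why it is the workhorse for sub-pixel displacement in PIV correlation peaks.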

  3. Calibration of CCD-Cameras for Machine Vision and Robotics

    NASA Astrophysics Data System (ADS)

    Beyer, Horst A.

    1990-02-01

    The basic mathematical formulation of a general solution to the extraction of three-dimensional information from images and camera calibration is presented. Standard photogrammetric algorithms for the least-squares estimation of the relevant parameters are outlined together with terms and principal aspects of calibration and quality assessment. A second-generation prototype system for "Real-Time Photogrammetry", developed as part of the "Digital Photogrammetric Station" of the Institute of Geodesy and Photogrammetry of ETH-Zurich, is described. Two calibration tests with three-dimensional testfields and independently determined reference coordinates for quality assessment are presented. In a laboratory calibration with off-the-shelf equipment, an accuracy of 1/20th and 1/50th of the pixel spacing in the row and column directions respectively has been achieved. Problems of the hardware used in the test are outlined. The calibration of the vision system of a ping-pong-playing high-speed robot led to an improvement in the accuracy of object coordinates by a factor of over 8. The vision system tracks table-tennis balls at a 50 Hz rate.

  4. Initial laboratory evaluation of color video cameras: Phase 2

    SciTech Connect

    Terry, P.L.

    1993-07-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than to identify intruders. The monochrome cameras were selected over color cameras because they have greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Because color camera technology is rapidly changing and because color information is useful for identification purposes, Sandia National Laboratories has established an on-going program to evaluate the newest color solid-state cameras. Phase One of the Sandia program resulted in the SAND91-2579/1 report titled: Initial Laboratory Evaluation of Color Video Cameras. The report briefly discusses imager chips, color cameras, and monitors, describes the camera selection, details traditional test parameters and procedures, and gives the results reached by evaluating 12 cameras. Here, in Phase Two of the report, we tested 6 additional cameras using traditional methods. In addition, all 18 cameras were tested by newly developed methods. This Phase 2 report details those newly developed test parameters and procedures, and evaluates the results.

  5. Optics design of laser spotter camera for ex-CCD sensor

    NASA Astrophysics Data System (ADS)

    Nautiyal, R. P.; Mishra, V. K.; Sharma, P. K.

    2015-06-01

    The development of laser-based instruments such as laser range finders and laser designators has gained prominence in modern military applications. Aiming the laser at the target is done with the help of a boresighted graticule, as the human eye cannot see the laser beam directly. To view the laser spot, two types of detectors are available, the InGaAs detector and the Ex-CCD detector, the latter being the cost-effective solution. In this paper, the optics design for an Ex-CCD-based camera is discussed. The designed system is lightweight and compact and is able to see a 1064 nm pulsed laser spot at a range of up to 5 km.

  6. Design and realization of an AEC&AGC system for the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Hao yun

    2015-08-01

    An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. Conventional AEC and AGC algorithms are not suitable for an aerial camera, since the camera takes high-resolution photographs while moving at high speed. The AEC and AGC system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. An automatic gamma correction is applied before the image is output, so that the image is better suited for viewing and analysis by human eyes. The AEC and AGC system avoids underexposure, overexposure, and image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment speed, high adaptability, and high reliability in severe, complex environments.
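    An exposure/gain control of this kind can be sketched as a simple feedback rule: drive the mean image brightness toward a target, preferring shutter changes until the motion-blur limit on exposure time is reached, then raising gain. The target level, blur limit, and gamma value below are illustrative assumptions, not the paper's parameters:

```python
import math

def auto_exposure_step(mean_level, shutter_us, gain_db,
                       target=128.0, max_shutter_us=2000.0):
    """One control iteration: scale the shutter toward the target mean
    brightness; once the shutter hits the motion-blur limit, the gain
    supplies the remaining factor (6 dB of gain doubles the signal)."""
    ratio = target / max(mean_level, 1.0)
    new_shutter = min(shutter_us * ratio, max_shutter_us)
    leftover = ratio * shutter_us / new_shutter  # factor gain must supply
    new_gain = gain_db + 20.0 * math.log10(leftover)
    return new_shutter, new_gain

def gamma_correct(level, gamma=0.45):
    """Map a linear 0-255 level through a display gamma curve."""
    return 255.0 * (level / 255.0) ** gamma

# Underexposed frame (mean 32) with the shutter already near its limit:
shutter, gain = auto_exposure_step(32.0, 1000.0, 0.0)
print(shutter, gain)  # shutter clamps at 2000 us; gain supplies ~6 dB
```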

  7. BroCam: a versatile PC-based CCD camera system

    NASA Astrophysics Data System (ADS)

    Klougart, Jens

    1995-03-01

    At the Copenhagen University, we have developed a compact CCD camera system for single and mosaic CCDs. Camera control and data acquisition are performed by a 486-type PC via a frame buffer located in one ISA-bus slot, communicating with the camera electronics over two optical fibers. The PC can run special-purpose DOS programs as well as operate in a more general mode under LINUX, a UNIX-like operating system. In the latter mode, standard software packages such as SAOimage and Gnuplot are utilized extensively, thereby reducing the amount of camera-specific software, while the observer feels at ease with the system in an IRAF-like environment. Finally, the LINUX version enables the camera to be remotely controlled.

  8. Camera Control and Geo-Registration for Video Sensor Networks

    NASA Astrophysics Data System (ADS)

    Davis, James W.

    With the use of large video networks, there is a need to coordinate and interpret the video imagery for decision support systems with the goal of reducing the cognitive and perceptual overload of human operators. We present computer vision strategies that enable efficient control and management of cameras to effectively monitor wide-coverage areas, and examine the framework within an actual multi-camera outdoor urban video surveillance network. First, we construct a robust and precise camera control model for commercial pan-tilt-zoom (PTZ) video cameras. In addition to providing a complete functional control mapping for PTZ repositioning, the model can be used to generate wide-view spherical panoramic viewspaces for the cameras. Using the individual camera control models, we next individually map the spherical panoramic viewspace of each camera to a large aerial orthophotograph of the scene. The result provides a unified geo-referenced map representation to permit automatic (and manual) video control and exploitation of cameras in a coordinated manner. The combined framework provides new capabilities for video sensor networks that are of significance and benefit to the broad surveillance/security community.

  9. Measuring high-resolution sky luminance distributions with a CCD camera.

    PubMed

    Tohsing, Korntip; Schrempf, Michael; Riechelmann, Stefan; Schilke, Holger; Seckmeyer, Gunther

    2013-03-10

    We describe how sky luminance can be derived from a newly developed hemispherical sky imager (HSI) system. The system contains a commercial compact charge-coupled-device (CCD) camera equipped with a fish-eye lens. The projection of the camera system has been found to be nearly equidistant. The luminance from the high-dynamic-range images has been calculated and then validated against luminance data measured by a CCD array spectroradiometer. The deviation between the two datasets is less than 10% for cloudless and completely overcast skies, and no more than 20% for all sky conditions. The global illuminance derived from the HSI pictures deviates by less than 5% and 20% under cloudless and cloudy skies, respectively, for solar zenith angles less than 80°. The system is therefore capable of measuring sky luminance with a high spatial resolution of more than a million pixels and a temporal resolution of 20 s. PMID:23478758
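    The near-equidistant projection reported for the fish-eye lens means the radial pixel distance from the image center grows linearly with zenith angle, which is what allows each pixel to be assigned a sky direction. A sketch of that mapping, with an assumed illustrative scale constant:

```python
import math

def pixel_to_sky(px, py, cx, cy, pixels_per_radian):
    """Map an image pixel to (zenith, azimuth) in radians under an ideal
    equidistant fish-eye projection, where the radial distance from the
    image center (cx, cy) is r = k * zenith."""
    dx, dy = px - cx, py - cy
    zenith = math.hypot(dx, dy) / pixels_per_radian
    azimuth = math.atan2(dy, dx)
    return zenith, azimuth

# With k = 600 px/rad, a pixel 300 px from the center lies at a
# zenith angle of 0.5 rad (about 28.6 degrees):
z, a = pixel_to_sky(900.0, 600.0, 600.0, 600.0, 600.0)
print(z, math.degrees(z))
```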

  10. Deflection Measurements of a Thermally Simulated Nuclear Core Using a High-Resolution CCD-Camera

    NASA Technical Reports Server (NTRS)

    Stanojev, B. J.; Houts, M.

    2004-01-01

    Space fission systems under consideration for near-term missions all use compact, fast-spectrum reactor cores. Reactor dimensional change with increasing temperature, which affects neutron leakage, is the dominant source of reactivity feedback in these systems. Accurately measuring core dimensional changes during realistic non-nuclear testing is therefore necessary for predicting the system's nuclear-equivalent behavior. This paper discusses one key technique being evaluated for measuring such changes: using a charge-coupled device (CCD) sensor to obtain deformation readings of an electrically heated, prototypic reactor core geometry. The paper introduces a technique by which a single high-spatial-resolution CCD camera is used to measure core deformation in real time (RT). Initial system checkout results are presented, along with a discussion of how additional cameras could be used to achieve a three-dimensional deformation profile of the core during testing.

  11. White light single-shot interferometry with colour CCD camera for optical inspection of microsystems

    NASA Astrophysics Data System (ADS)

    Upputuri, Paul Kumar; Pramanik, Manojit; Nandigana, Krishna Mohan; Kothiyal, Mahendra Prasad

    2015-07-01

    White light interferometry is a well-established optical tool for surface metrology of reflective samples. In this work, we discuss a single-shot white light interferometer based on a single-chip color CCD camera and the Hilbert transformation. The approach makes the measurement dynamic, faster, easier, and cost-effective for industrial applications. Here we acquire only one white light interferogram using the colour CCD camera and then decompose it into its individual color components in software. We present a simple Hilbert-transformation approach to remove the non-uniform bias associated with the interference signal; the phases at the individual wavelengths are then calculated using the Hilbert transformation. The use of the Hilbert transformation introduces a phase error which depends on the number of fringe cycles, and we discuss these errors. Experimental results on reflective micro-scale samples for surface profiling are presented.
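    Hilbert-transform phase recovery of this kind can be sketched with SciPy: form the analytic signal of a bias-removed fringe pattern and take its angle. The synthetic carrier fringe below is an illustration, not the authors' data:

```python
import numpy as np
from scipy.signal import hilbert

x = np.arange(512)
true_phase = 2 * np.pi * 0.05 * x + 0.7      # linear carrier plus offset
fringe = 100 + 40 * np.cos(true_phase)        # biased interference signal

# Remove the bias, then the analytic signal's angle gives the phase.
analytic = hilbert(fringe - fringe.mean())
phase = np.unwrap(np.angle(analytic))

# Away from the edges, the recovered phase slope matches the carrier
# frequency (edge effects are one source of the phase error the
# abstract mentions).
slope = np.mean(np.diff(phase[50:-50]))
print(slope, 2 * np.pi * 0.05)
```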

  12. Development of a portable 3CCD camera system for multispectral imaging of biological samples.

    PubMed

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  13. The L3Vision CCD220 with its OCam test camera for AO applications in Europe

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Stadler, Eric; Magnard, Yves; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Skegg, Michael; Pool, Peter; Bell, Ray; Burt, David; Reyes, Javier; Meyer, Manfred; Hubin, Norbert; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz Garcia, Jose Javier

    2008-07-01

    ESO and JRA2 OPTICON have jointly funded e2v technologies to develop a custom CCD for Adaptive Optics Wavefront Sensor (AO WFS) applications. The device, called CCD220, is a compact Peltier-cooled 240×240-pixel frame-transfer 8-output back-illuminated sensor. Using the electron-multiplying technology of L3Vision detectors, the device is designed to achieve sub-electron read noise at frame rates from 25 Hz to 1,500 Hz and dark current lower than 0.01 e-/pixel/frame. The development has many unique features. To obtain high frame rates, multiple EMCCD gain registers and metal buttressing of row clock lines are used. The baseline device is built in standard silicon. In addition, two speculative variants have been built: deep-depletion silicon devices to improve red response, and devices with an electronic shutter to extend use to Rayleigh and Pulsed Laser Guide Star applications. These are all firsts for L3Vision CCDs. These CCD220 detectors have now been fabricated by e2v technologies. This paper describes the design of the device, technology trade-offs, and progress to date. A test camera, called "OCam", has been specially designed and built for these sensors. The main features of the OCam camera are extensively described in this paper, together with first light images obtained with the CCD220.

  14. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  15. The development of a high-speed 100 fps CCD camera

    SciTech Connect

    Hoffberg, M.; Laird, R.; Lenkzsus, F.; Liu, Chuande; Rodricks, B.; Gelbart, A.

    1996-09-01

    This paper describes the development of a high-speed CCD digital camera system. The system has been designed to use CCDs from various manufacturers with minimal modifications. The first camera built on this design utilizes a Thomson 512x512 pixel CCD as its sensor, which is read out from two parallel outputs at a speed of 15 MHz/pixel/output. The data undergo correlated double sampling, after which they are digitized into 12 bits. The throughput of the system translates into 60 MB/second, which is either stored directly in a PC or transferred to a custom-designed VXI module. The PC data acquisition version of the camera can collect sustained data in real time, limited only by the memory installed in the PC. The VXI version of the camera, also controlled by a PC, stores 512 MB of real-time data before it must be read out to the PC disk storage. The uncooled CCD can be used either with lenses for visible light imaging or with a phosphor screen for x-ray imaging. This camera has been tested with a phosphor screen coupled to a fiber-optic face plate for high-resolution, high-speed x-ray imaging. The camera is controlled through a custom event-driven user-friendly Windows package. The pixel clock speed can be changed from 1 MHz to 15 MHz. The noise was measured to be 1.05 bits at a 13.3 MHz pixel clock. This paper will describe the electronics, software, and characterizations that have been performed using both visible and x-ray photons.
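The quoted 60 MB/s throughput can be checked with a line of arithmetic. A minimal sketch, assuming the 12-bit samples are stored as 16-bit (2-byte) words, which the abstract does not state explicitly:

```python
# Sanity check of the quoted 60 MB/s figure for the Thomson 512x512 CCD.
# Assumption (not stated in the abstract): 12-bit samples occupy 16-bit words.
outputs = 2                 # parallel readout ports
pixel_rate_hz = 15e6        # pixels per second per output
bytes_per_sample = 2        # 12-bit value padded to 16 bits

throughput_bytes = outputs * pixel_rate_hz * bytes_per_sample
print(throughput_bytes / 1e6, "MB/s")      # 60.0 MB/s

# Implied burst frame rate for a 512x512 sensor, consistent with "100 fps":
frames_per_sec = outputs * pixel_rate_hz / (512 * 512)
print(round(frames_per_sec, 1), "fps")     # ~114 fps
```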

  16. Analysis of unstructured video based on camera motion

    NASA Astrophysics Data System (ADS)

    Abdollahian, Golnaz; Delp, Edward J.

    2007-01-01

    Although considerable work has been done in the management of "structured" video such as movies, sports, and television programs that have known scene structures, "unstructured" video analysis is still a challenging problem due to its unrestricted nature. The purpose of this paper is to address issues in the analysis of unstructured video, in particular video shot by a typical unprofessional user (i.e., home video). We describe how one can make use of camera motion information for unstructured video analysis. A new concept, "camera viewing direction," is introduced as the building block of home video analysis. Motion displacement vectors are employed to temporally segment the video based on this concept. We then relate the camera behavior in each segment to the subjective importance of its information, and describe how different patterns in the camera motion can indicate levels of interest in a particular object or scene. By extracting these patterns, the most representative frames (keyframes) for the scenes are determined and aggregated to summarize the video sequence.

  17. Are Video Cameras the Key to School Safety?

    ERIC Educational Resources Information Center

    Maranzano, Chuck

    1998-01-01

    Describes one high school's use of video cameras as a preventive tool in stemming theft and violent episodes within schools. The top 10 design tips for preventing crime on campus are highlighted. (GR)

  18. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  19. DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER OF THE MLP - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  20. Subtractive imaging in confocal scanning microscopy using a CCD camera as a detector.

    PubMed

    Sánchez-Ortiga, Emilio; Sheppard, Colin J R; Saavedra, Genaro; Martínez-Corral, Manuel; Doblas, Ana; Calatayud, Arnau

    2012-04-01

    We report a scheme for the detector system of confocal microscopes in which the pinhole and a large-area detector are substituted by a CCD camera. The numerical integration of the intensities acquired by the active pixels emulates the signal passing through the pinhole. We demonstrate the imaging capability and the optical sectioning of the system. Subtractive-imaging confocal microscopy can be implemented in a simple manner, providing superresolution and improving optical sectioning. PMID:22466221
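The pinhole-emulation step (numerically integrating the intensities of the active CCD pixels) and the subtractive combination can be sketched as below; the pinhole radii, the weight `gamma`, and the toy frame are illustrative assumptions, not values from the paper:

```python
import numpy as np

def virtual_pinhole_signal(frame, center, radius):
    """Sum CCD pixel intensities inside a circular 'virtual pinhole'."""
    yy, xx = np.indices(frame.shape)
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    return frame[mask].sum()

def subtractive_signal(frame, center, r_small=2, r_large=6, gamma=0.3):
    """Subtractive confocal signal: small-pinhole value minus a weighted
    larger-pinhole value (radii and gamma are illustrative)."""
    s_small = virtual_pinhole_signal(frame, center, r_small)
    s_large = virtual_pinhole_signal(frame, center, r_large)
    return s_small - gamma * s_large

# Toy frame: a bright focal spot on a dim uniform background
frame = np.ones((32, 32))
frame[16, 16] = 100.0
print(subtractive_signal(frame, (16, 16)))   # ~48.4, positive net signal
```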

  1. Outer planet investigations using a CCD camera system. [Saturn disk photometry

    NASA Technical Reports Server (NTRS)

    Price, M. J.

    1980-01-01

    Problems related to analog noise, data transfer from the camera buffer to the storage computer, and loss of sensitivity of a two dimensional charge coupled device imaging system are reported. To calibrate the CCD system, calibrated UBV pinhole scans of the Saturn disk were obtained with a photoelectric area scanning photometer. Atmospheric point spread functions were also obtained. The UBV observations and models of the Saturn atmosphere are analyzed.

  2. A range-resolved bistatic lidar using a high-sensitive CCD-camera

    NASA Technical Reports Server (NTRS)

    Yamaguchi, K.; Nomura, A.; Saito, Y.; Kano, T.

    1992-01-01

    Until now, monostatic lidar systems have mainly been utilized in the field of lidar measurements of the atmosphere. We propose here a range-resolved bistatic lidar system using a high-sensitivity cooled charge coupled device (CCD) camera. This system has the ability to measure the three-dimensional distributions of aerosol, atmospheric density, and cloud by processing the image data of the laser beam trajectory obtained by a CCD camera. This lidar system also has a feature that allows dual utilization of continuous wave (CW) lasers and pulsed lasers. The scheme of measurement with this bistatic lidar is shown. A laser beam is emitted vertically and the image of its trajectory is taken with a remote high-sensitivity CCD detector using an interference filter and a camera lens. The specifications of the bistatic lidar system used in the experiments are shown. The preliminary experimental results of our range-resolved bistatic lidar system suggest potential applications in the field of lidar measurements of the atmosphere.

  3. Research on simulation and verification system of satellite remote sensing camera video processor based on dual-FPGA

    NASA Astrophysics Data System (ADS)

    Ma, Fei; Liu, Qi; Cui, Xuenan

    2014-09-01

    To satisfy the needs for testing the video processors of satellite remote sensing cameras, a design is provided that achieves a simulation and verification system for satellite remote sensing camera video processors based on dual FPGAs. The correctness of the video processor FPGA logic can be verified even without CCD signals or an analog-to-digital converter. Two Xilinx Virtex FPGAs are adopted to make a center unit, and the logic for A/D digital data generation and data processing is developed in VHDL. The RS-232 interface is used to receive commands from the host computer, and different types of data are generated and output depending on the commands. Experimental results show that the simulation and verification system is flexible and works well. The system meets the requirements of testing video processors for several different types of satellite remote sensing cameras.

  4. Modeling of the over-exposed pixel area of CCD cameras caused by laser dazzling

    NASA Astrophysics Data System (ADS)

    Benoist, Koen W.; Schleijpen, Ric H. M. A.

    2014-10-01

    A simple model has been developed and implemented in Matlab code, predicting the over-exposed pixel area of cameras caused by laser dazzling. Inputs of this model are the laser irradiance on the front optics of the camera, the Point Spread Function (PSF) of the used optics, the integration time of the camera, and camera sensor specifications like pixel size, quantum efficiency and full well capacity. Effects of the read-out circuit of the camera are not incorporated. The model was evaluated with laser dazzle experiments on CCD cameras using a 532 nm CW laser dazzler and shows good agreement. For relatively low laser irradiance the model predicts the over-exposed laser spot area quite accurately and shows the cube root dependency of spot diameter on laser irradiance, caused by the PSF as demonstrated before for IR cameras. For higher laser power levels the laser induced spot diameter increases more rapidly than predicted, which probably can be attributed to scatter effects in the camera. Some first attempts to model scatter contributions, using a simple scatter power function f(?), show good resemblance with experiments. Using this model, a tool is available which can assess the performance of observation sensor systems while being subjected to laser countermeasures.
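The cube-root dependency of spot diameter on irradiance mentioned above follows from saturating a PSF whose wings fall off as the inverse cube of radius. A minimal sketch in arbitrary units (the constants are illustrative, not values from the paper):

```python
def saturated_radius(irradiance, psf_amp=1.0, full_well=1.0, t_int=1.0):
    """Radius (arbitrary units) at which the pixel exposure
    irradiance * psf_amp * r**-3 * t_int reaches the full-well capacity,
    for a PSF with r**-3 wings. All constants are illustrative."""
    return (irradiance * psf_amp * t_int / full_well) ** (1.0 / 3.0)

# Doubling the laser irradiance grows the over-exposed spot radius
# by a factor of 2**(1/3), i.e. ~1.26: the cube-root dependency.
r1 = saturated_radius(1.0)
r2 = saturated_radius(2.0)
print(r2 / r1)   # ~1.26
```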

  5. Video camera system for locating bullet holes in targets at a ballistics tunnel

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Rummler, D. R.; Goad, W. K.

    1990-01-01

    A system consisting of a single charge coupled device (CCD) video camera, computer controlled video digitizer, and software to automate the measurement was developed to measure the location of bullet holes in targets at the International Shooters Development Fund (ISDF)/NASA Ballistics Tunnel. The camera/digitizer system is a crucial component of a highly instrumented indoor 50 meter rifle range which is being constructed to support development of wind resistant, ultra match ammunition. The system was designed to take data rapidly (10 sec between shots) and automatically with little operator intervention. The system description, measurement concept, and procedure are presented along with laboratory tests of repeatability and bias error. The long term (1 hour) repeatability of the system was found to be 4 microns (one standard deviation) at the target and the bias error was found to be less than 50 microns. An analysis of potential errors and a technique for calibration of the system are presented.

  6. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  7. Demonstrations of Optical Spectra with a Video Camera

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2012-01-01

    The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera…

  8. Traceability of a CCD-Camera System for High-Temperature Measurements

    NASA Astrophysics Data System (ADS)

    Bünger, L.; Anhalt, K.; Taubert, R. D.; Krüger, U.; Schmidt, F.

    2015-08-01

    A CCD camera, which has been specially equipped with narrow-band interference filters in the visible spectral range for temperature measurements above 1200 K, was characterized with respect to its temperature response traceable to ITS-90 and with respect to absolute spectral radiance responsivity. The calibration traceable to ITS-90 was performed at a high-temperature blackbody source using a radiation thermometer as a transfer standard. Use of Planck's law and the absolute spectral radiance responsivity of the camera system allows the determination of the thermodynamic temperature. For the determination of the absolute spectral radiance responsivity, a monochromator-based setup with a supercontinuum white-light laser source was developed. The CCD-camera system was characterized with respect to the dark-signal-non-uniformity, the photo-response-non-uniformity, the non-linearity, and the size-of-source effect. The influence of these parameters on the calibration and measurement was evaluated and is considered for the uncertainty budget. The results of the two different calibration schemes for the investigated temperature range from 1200 K to 1800 K are in good agreement considering the expanded uncertainty. The uncertainty for the absolute spectral responsivity of the camera is 0.56 %.
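The determination of thermodynamic temperature from Planck's law and the absolute spectral radiance responsivity can be illustrated, in a monochromatic approximation at a filter's effective wavelength, by inverting Planck's law; the 650 nm wavelength and 1500 K test value below are arbitrary illustrative choices:

```python
import math

C1 = 1.191042e-16  # W m^2 sr^-1, first radiation constant for radiance
C2 = 1.4388e-2     # m K, second radiation constant

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance from Planck's law (W m^-3 sr^-1)."""
    return C1 / wavelength_m**5 / (math.exp(C2 / (wavelength_m * temp_k)) - 1.0)

def invert_planck(wavelength_m, radiance):
    """Thermodynamic temperature from a measured spectral radiance,
    monochromatic approximation at the filter's effective wavelength."""
    return C2 / (wavelength_m * math.log(C1 / (wavelength_m**5 * radiance) + 1.0))

wl = 650e-9                       # a visible interference filter (illustrative)
L = planck_radiance(wl, 1500.0)   # simulate the radiance of a 1500 K blackbody
print(round(invert_planck(wl, L), 6), "K")   # recovers 1500.0 K
```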

  9. Video Cameras in the Ondrejov Flare Spectrograph: Results and Prospects

    NASA Astrophysics Data System (ADS)

    Kotrc, P.

    Since 1991, video cameras have been widely used both in the image and in the spectral data acquisition of the Ondrejov Multichannel Flare Spectrograph. In addition to classical photographic data registration, this kind of detector brought new possibilities, especially for observations of dynamical solar phenomena, and imposed new requirements on digitization, archiving and data-processing techniques. The unique complex video system consisting of four video cameras and auxiliary equipment was mostly developed, implemented and used at the Ondrejov observatory. The main advantages and limitations of the system are briefly described from the points of view of its scientific philosophy, intents and outputs. Some obtained results, experience and future prospects are discussed.

  10. CCD camera baseline calibration and its effects on imaging processing and laser beam analysis

    NASA Astrophysics Data System (ADS)

    Roundy, Carlos B.

    1997-09-01

    CCD cameras are commonly used for many imaging applications, as well as in optical instrumentation applications. These cameras have many excellent characteristics for both scene imaging and laser beam analysis. However, CCD cameras have two characteristics that limit their potential performance. The first limiting factor is the baseline drift of the camera. If the baseline drifts below the digitizer zero, data in the background is lost and is uncorrectable. If the baseline drifts above the digitizer zero, then a false background is introduced into the scene. This false background is partially correctable by taking a background frame with no input image and then subtracting that from each imaged frame. ('Partially correctable' is explained in detail later.) The second characteristic that limits CCD cameras is their high level of random noise. A typical CCD camera used with an 8-bit digitizer yielding 256 counts has 2 to 6 counts of random noise in the baseline. The noise is typically Gaussian, and goes both positive and negative about a mean or average baseline level. When normal baseline subtraction occurs, the negative noise components are truncated, leaving only the positive components. These lost negative noise components can distort measurements that rely on a low-intensity background. Situations exist in which the baseline offset and lost negative noise components are very significant. For example, in image processing, when attempting to distinguish data with a very low contrast between objects, the contrast is compromised by the loss of the negative noise. Second, the measurement of laser beam widths requires analysis of very low intensity signals far out into the wings of the beam. The intensity is low, but the area is large, and so even small distortions can create significant errors in measuring beam width. The effect of baseline error is particularly significant on the measurement of a laser beam width. 
This measurement is very important because it gives the size of the beam at the measurement point, it is used in laser divergence measurement, and it is critical for realistic measurement of M2, the ultimate criterion for the quality of a laser beam. One measurement of laser beam width, called second moment, or D4σ, which is the ISO definition of a true laser beam width, is especially sensitive to noise in the baseline. The D4σ measurement method integrates all signals far out into the wings of the beam, and gives particular weight to the noise and signal in the wings. It is impossible to make this measurement without the negative noise components, and without other special algorithms to limit the effect of noise in the wings.
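The second-moment width discussed above can be sketched as follows; the computation must run on signed, baseline-subtracted data, which is why truncating the negative noise components biases the wings:

```python
import numpy as np

def d4sigma_width(img):
    """ISO second-moment (D4-sigma) beam width along x, computed on
    baseline-subtracted data that KEEPS negative noise values so that
    wing noise averages toward zero instead of accumulating."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    y, x = np.indices(img.shape)
    cx = (x * img).sum() / total                   # intensity centroid in x
    var_x = (((x - cx) ** 2) * img).sum() / total  # second moment about centroid
    return 4.0 * np.sqrt(var_x)

# Noise-free Gaussian beam: D4-sigma should recover 4 * sigma
yy, xx = np.indices((201, 201))
sigma = 10.0
beam = np.exp(-((xx - 100) ** 2 + (yy - 100) ** 2) / (2 * sigma ** 2))
print(d4sigma_width(beam))   # ~40
```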

  11. A new algorithm for automatic white balance based on CCD camera

    NASA Astrophysics Data System (ADS)

    Xu, Zhaohui; Li, Han; Tian, Yan; Jiao, Guohua

    2009-10-01

    Auto white balance plays a key role in a digital camera system and determines image quality to a large extent. If white balance is not considered in the development of a CCD camera, chromatic aberration will occur under different color temperatures. A new, effective automatic white balance algorithm for digital cameras is proposed in this paper. With a new color temperature estimation method based on the extraction of both skin and white regions, the algorithm can find more suitable pixels with which to calculate the averaged chromatic aberration, improving the precision of the estimated color temperature. To some extent, the algorithm also solves the problem that classical automatic white balance algorithms fail to estimate color temperature for images that contain no white regions.
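Whatever estimator finds the reference pixels, the final gain-adjustment step is common to automatic white balance schemes. A generic sketch of that step only (this is not the paper's skin/white-region color-temperature algorithm):

```python
import numpy as np

def awb_gains_from_reference(img, mask):
    """Per-channel gains that map the mean color of a reference region
    (e.g. detected white or skin pixels) to neutral gray. This is the
    generic gain step that any AWB estimator feeds, not the paper's
    specific color-temperature method."""
    ref = img[mask].reshape(-1, 3).mean(axis=0)   # mean R, G, B of reference
    return ref.mean() / ref                        # gains for R, G, B

# Toy image with a bluish cast; the reference region is a top-left patch
img = np.ones((8, 8, 3)) * np.array([0.8, 1.0, 1.2])
mask = np.zeros((8, 8), dtype=bool)
mask[:4, :4] = True
gains = awb_gains_from_reference(img, mask)
balanced = img * gains
print(gains, balanced[0, 0])   # gains ~[1.25, 1.0, 0.83]; pixel -> [1, 1, 1]
```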

  12. Color balancing in CCD color cameras using analog signal processors made by Kodak

    NASA Astrophysics Data System (ADS)

    Kannegundla, Ram

    1995-03-01

    The green, red, and blue color filters used for CCD sensors generally have different responses. It is often necessary to balance these three colors for displaying a high-quality image on the monitor. The color filter arrays on sensors have different architectures. A CCD with a standard G R G B pattern is considered for the present discussion. A simple method is presented for separating the colors using the CDS/H that is part of the KASPs (Analog Signal Processors made by Kodak) and for color balancing using the gain control, which is also part of the KASPs. The colors are separated from the video output of the sensor by using three KASPs, one each for green, red, and blue, and by using alternate sample pulses for green and 1-in-4 pulses for red and blue. The gain of each separated color is adjusted either automatically or manually and sent to the monitor for direct display in the analog mode, or through an A/D converter digitally to memory. This method of color balancing demands high-quality ASPs. Kodak has designed four different chips with varying levels of power consumption and speed for analog signal processing of the video output of CCD sensors. The analog ASICs have been characterized for noise, clock feedthrough, acquisition time, linearity, variable gain, line rate clamp, black muxing, the effect of temperature variations on chip performance, and droop. The ASP chips have met their design specifications.
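The sampling scheme described (alternate sample pulses for green, 1-in-4 pulses for red and blue on a G R G B stream) amounts to demultiplexing the serial video signal; a toy sketch of the indexing:

```python
def demux_grgb(stream):
    """Split a serial G R G B video stream into color channels, mimicking
    'alternate sample pulses for green and 1-in-4 pulses for red and blue'.
    The stream here is a toy list standing in for sampled analog values."""
    green = stream[0::2]   # every other sample is green
    red   = stream[1::4]   # one sample in four is red
    blue  = stream[3::4]   # one sample in four, offset by two, is blue
    return green, red, blue

line = ["G0", "R0", "G1", "B0", "G2", "R1", "G3", "B1"]
print(demux_grgb(line))
```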

  13. Thermal modeling of cooled instrument: from the WIRCam IR camera to CCD Peltier cooled compact packages

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Stadler, Eric; Downing, Mark; Hurrell, Steve; Wheeler, Patrick; Gach, Jean-Luc; Magnard, Yves; Balard, Philippe; Guillaume, Christian; Hubin, Norbert; Diaz, José Javier; Suske, Wolfgang; Jorden, Paul

    2006-06-01

    In the past decade, new thermal modelling tools have been offered to system designers. These modelling tools have rarely been used for cooled instruments in ground-based astronomy. In addition to an overwhelming increase in PC computing capabilities, these tools are now mature enough to drive the design of complex cooled astronomical instruments. This is the case for WIRCam, the new wide-field infrared camera installed on the CFHT in Hawaii on the Mauna Kea summit. This camera uses four 2K×2K Rockwell Hawaii-2RG infrared detectors and includes 2 optical barrels and 2 filter wheels. This camera is mounted at the prime focus of the 3.6m CFHT telescope. The mass to be cooled is close to 100 kg. The camera uses a Gifford-McMahon closed-cycle cryo-cooler. The capabilities of the I-deas thermal module (TMG) are demonstrated for our particular application: predicted performances are presented and compared to real measurements after integration on the telescope in December 2004. In addition, we present thermal modelling of small Peltier-cooled CCD packages, including the thermal model of the CCD220 Peltier package (fabricated by e2v technologies) and cold head. ESO and the OPTICON European network have funded e2v technologies to develop a compact packaged Peltier-cooled 8-output back-illuminated L3Vision CCD. The device will achieve sub-electron read noise at frame rates up to 1.5 kHz. The development, fully dedicated to the latest generation of adaptive optics wavefront sensors, has many unique features. Among them, the ultra-compactness offered by a Peltier package integrated in a small cold head including the detector drive electronics is a way to achieve remarkable performance for adaptive optics systems. All these models were carried out using an ordinary PC laptop.

  14. Design of an Event-Driven Random-Access-Windowing CCD-Based Camera

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P.; Lam, Raymond K.; Portillo, Angel A.; Ortiz, Gerardo G.

    2003-01-01

    Commercially available cameras are not designed for the combination of single-frame and high-speed streaming digital video with real-time control of the size and location of multiple regions-of-interest (ROI). A new control paradigm is defined to eliminate the tight coupling between the camera logic and the host controller. This functionality is achieved by defining the indivisible pixel readout operation on a per-ROI basis with in-camera timekeeping capability. This methodology provides a Random Access, Real-Time, Event-driven (RARE) camera for adaptive camera control and is well suited for target tracking applications requiring autonomous control of multiple ROIs. This methodology additionally provides reduced ROI readout time and higher frame rates compared to the original architecture by avoiding external control intervention during the ROI readout process.

  15. Optical readout of a two phase liquid argon TPC using CCD camera and THGEMs

    NASA Astrophysics Data System (ADS)

    Mavrokoridis, K.; Ball, F.; Carroll, J.; Lazos, M.; McCormick, K. J.; Smith, N. A.; Touramanis, C.; Walker, J.

    2014-02-01

    This paper presents a preliminary study into the use of CCDs to image secondary scintillation light generated by THick Gas Electron Multipliers (THGEMs) in a two-phase LAr TPC. A Sony ICX285AL CCD chip was mounted above a double THGEM in the gas phase of a 40 litre two-phase LAr TPC, with the majority of the camera electronics positioned externally via a feedthrough. An Am-241 source was mounted on a rotatable motion feedthrough allowing the positioning of the alpha source either inside or outside of the field cage. A novel high-voltage feedthrough featuring LAr insulation was developed for and incorporated into the TPC design. Furthermore, a range of webcams were tested for operation in cryogenics as an internal detector monitoring tool. Of the range of webcams tested, the Microsoft HD-3000 (model no: 1456) webcam was found to be superior in terms of noise and lowest operating temperature. In ambient-temperature, atmospheric-pressure 1 ppm pure argon gas, the THGEM gain was ~1000, and using a 1 ms exposure the CCD captured single alpha tracks. Successful operation of the CCD camera in two-phase cryogenic mode was also achieved. Using a 10 s exposure, a photograph of secondary scintillation light induced by the Am-241 source in LAr has been captured for the first time.

  16. OCam with CCD220, the Fastest and Most Sensitive Camera to Date for AO Wavefront Sensing

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz Garcia, José Javier

    2011-03-01

    For the first time, subelectron readout noise has been achieved with a camera dedicated to astronomical wavefront-sensing applications. The OCam system demonstrated this performance at a 1300 Hz frame rate and with 240 × 240 pixel frame size. ESO and JRA2 OPTICON jointly funded e2v Technologies to develop a custom CCD for adaptive optics (AO) wavefront-sensing applications. The device, called CCD220, is a compact Peltier-cooled 240 × 240 pixel frame-transfer eight-output back-illuminated sensor using the EMCCD technology. This article demonstrates, for the first time, subelectron readout noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e-/pixel/frame. It reports on the quantitative performance characterization of OCam and the CCD220, including readout noise, dark current, multiplication gain, quantum efficiency, and charge transfer efficiency. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks, and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, has been designed to offer enhanced performance, a completely sealed camera package, and an additional Peltier stage to facilitate operation on a telescope or environmentally challenging applications. New features of OCam2 are presented in this article. This instrumental development will strongly impact the performance of the most advanced AO systems to come.

  17. Proton radiation damage experiment on P-Channel CCD for an X-ray CCD camera onboard the ASTRO-H satellite

    NASA Astrophysics Data System (ADS)

    Mori, Koji; Nishioka, Yusuke; Ohura, Satoshi; Koura, Yoshiaki; Yamauchi, Makoto; Nakajima, Hiroshi; Ueda, Shutaro; Kan, Hiroaki; Anabuki, Naohisa; Nagino, Ryo; Hayashida, Kiyoshi; Tsunemi, Hiroshi; Kohmura, Takayoshi; Ikeda, Shoma; Murakami, Hiroshi; Ozaki, Masanobu; Dotani, Tadayasu; Maeda, Yukie; Sagara, Kenshi

    2013-12-01

    We report on a proton radiation damage experiment on a P-channel CCD newly developed for an X-ray CCD camera onboard the ASTRO-H satellite. The device was exposed to up to 10⁹ protons cm⁻² at 6.7 MeV. The charge transfer inefficiency (CTI) was measured as a function of radiation dose. In comparison with the CTI currently measured in the CCD camera onboard the Suzaku satellite, operated for 6 years, we confirmed that the new type of P-channel CCD is radiation tolerant enough for space use. We also confirmed that a charge-injection technique and lowering the operating temperature work efficiently to reduce the CTI for our device. A comparison with other P-channel CCD experiments is also discussed. We performed a proton radiation damage experiment on a new P-channel CCD. The device was exposed to up to 10⁹ protons cm⁻² at 6.7 MeV. We confirmed that it is radiation tolerant enough for space use. We confirmed that a charge-injection technique reduces the CTI. We confirmed that lowering the operating temperature also reduces the CTI.

  18. Design and realization of an image mosaic system on the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Wang, Peng; Zhu, Hai bin; Li, Yan; Zhang, Shao jun

    2015-08-01

    In aerial photography it has long been difficult to stitch multi-route images into a panoramic image in real time for a multi-route flight framing CCD camera, given the very large amounts of data and the high accuracy requirements. An automatic aerial image mosaic system based on a GPU development platform is described in this paper. Parallel computation of the SIFT feature extraction and matching algorithm module is achieved by using CUDA technology for motion model parameter estimation on the platform, which makes it possible to stitch multiple CCD images in real time. Aerial tests proved that the mosaic system meets the user's requirements, with 99% accuracy and a 30- to 50-fold speed improvement over a conventional mosaic system.
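The registration stage in the paper uses GPU-accelerated SIFT feature matching; as a self-contained stand-in for that step, the sketch below recovers the offset between two overlapping tiles by phase correlation, a simpler registration technique than SIFT:

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) translation mapping image b onto
    image a. A simple stand-in for the paper's SIFT matching and
    motion-model parameter estimation."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fa * np.conj(fb)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped peak indices to signed shifts
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

rng = np.random.default_rng(0)
tile = rng.random((64, 64))
shifted = np.roll(tile, (5, -8), axis=(0, 1))  # overlapping tile, known offset
print(phase_correlation_shift(shifted, tile))  # (5, -8)
```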

  19. 800 x 800 charge-coupled device /CCD/ camera for the Galileo Jupiter Orbiter mission

    NASA Technical Reports Server (NTRS)

    Clary, M. C.; Klaasen, K. P.; Snyder, L. M.; Wang, P. K.

    1979-01-01

    During January 1982 the NASA space transportation system will launch a Galileo spacecraft composed of an orbiting bus and an atmospheric entry probe to arrive at the planet Jupiter in July 1985. A prime element of the orbiter's scientific instrument payload will be a new generation slow-scan planetary imaging system based on a newly developed 800 x 800 charge-coupled device (CCD) image sensor. Following Jupiter orbit insertion, the single, narrow-angle, CCD camera, designated the Solid State Imaging (SSI) Subsystem, will operate for 20 months as the orbiter makes repeated encounters with Jupiter and its Galilean Satellites. During this period the SSI will acquire 40,000 images of Jupiter's atmosphere and the surfaces of the Galilean Satellites. This paper describes the SSI, its operational modes, and science objectives.

  20. Experimental research on femto-second laser damaging array CCD cameras

    NASA Astrophysics Data System (ADS)

    Shao, Junfeng; Guo, Jin; Wang, Ting-feng; Wang, Ming

    2013-05-01

    Charged Coupled Devices (CCD) are widely used in military and security applications, such as airborne and ship based surveillance, satellite reconnaissance and so on. Homeland security requires effective means to negate these advanced overseeing systems. Researches show that CCD based EO systems can be significantly dazzled or even damaged by high-repetition rate pulsed lasers. Here, we report femto - second laser interaction with CCD camera, which is probable of great importance in future. Femto - second laser is quite fresh new lasers, which has unique characteristics, such as extremely short pulse width (1 fs = 10-15 s), extremely high peak power (1 TW = 1012W), and especially its unique features when interacting with matters. Researches in femto second laser interaction with materials (metals, dielectrics) clearly indicate non-thermal effect dominates the process, which is of vast difference from that of long pulses interaction with matters. Firstly, the damage threshold test are performed with femto second laser acting on the CCD camera. An 800nm, 500?J, 100fs laser pulse is used to irradiate interline CCD solid-state image sensor in the experiment. In order to focus laser energy onto tiny CCD active cells, an optical system of F/5.6 is used. A Sony production CCDs are chose as typical targets. The damage threshold is evaluated with multiple test data. Point damage, line damage and full array damage were observed when the irradiated pulse energy continuously increase during the experiment. The point damage threshold is found 151.2 mJ/cm2.The line damage threshold is found 508.2 mJ/cm2.The full-array damage threshold is found to be 5.91 J/cm2. Although the phenomenon is almost the same as that of nano laser interaction with CCD, these damage thresholds are substantially lower than that of data obtained from nano second laser interaction with CCD. 
Then the electrical characteristics after different degrees of damage were tested with a multimeter. The resistance values between clock signal lines were measured. Comparing the resistance values of the CCD before and after damage, it was found that the resistances between the vertical transfer clock signal lines decreased significantly. The same result was found between the vertical transfer clock signal lines and the earth electrode (ground). Finally, the damage position and the damage mechanism were analyzed using the above results and SEM morphological experiments. Point damage results from the laser destroying material, and shows no macroscopic electrical influence. Line damage is quite different from point damage, showing a deeper material-corroding effect; more importantly, short circuits are found between vertical clock lines. Full-array damage appears even more severe than line damage under SEM, although no electrical features different from those of line damage are found. Further research into the mechanism of femtosecond-laser-induced CCD damage with more advanced tools is anticipated. This research is valuable in EO countermeasure and/or laser shielding applications.
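
The reported thresholds lend themselves to a quick fluence calculation and damage classification. A minimal sketch, assuming an illustrative 30 μm focal spot diameter (this value is not taken from the experiment):

```python
# Sketch: classify the expected CCD damage regime from laser fluence,
# using the thresholds reported above. The spot diameter passed to
# fluence_mj_cm2 is an assumed illustrative value.
import math

POINT_MJ_CM2 = 151.2     # point-damage threshold (mJ/cm^2)
LINE_MJ_CM2 = 508.2      # line-damage threshold (mJ/cm^2)
FULL_MJ_CM2 = 5910.0     # full-array threshold, 5.91 J/cm^2

def fluence_mj_cm2(pulse_energy_uj, spot_diameter_um):
    """Average fluence of a pulse focused to a circular spot."""
    area_cm2 = math.pi * (spot_diameter_um * 1e-4 / 2) ** 2
    return (pulse_energy_uj * 1e-3) / area_cm2  # uJ -> mJ

def damage_regime(fluence):
    if fluence >= FULL_MJ_CM2:
        return "full-array damage"
    if fluence >= LINE_MJ_CM2:
        return "line damage"
    if fluence >= POINT_MJ_CM2:
        return "point damage"
    return "no permanent damage"
```

With these assumptions a single 500 μJ pulse focused to 30 μm lands far above the full-array threshold, consistent with the need to attenuate the beam in threshold tests.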

  1. CQUEAN: New CCD Camera System For The Otto Struve Telescope At The McDonald Observatory

    NASA Astrophysics Data System (ADS)

    Pak, Soojong; Park, W.; Im, M.

    2012-01-01

We describe the overall characteristics and the performance of an optical CCD camera system, the Camera for QUasars in EArly uNiverse (CQUEAN), which has been in use at the 2.1 m Otto Struve Telescope of the McDonald Observatory since August 2010. CQUEAN was developed for follow-up imaging observations of near-infrared bright sources such as high-redshift quasar candidates (z > 4.5), gamma-ray bursts, brown dwarfs, and young stellar objects. For efficient observations of these red objects, CQUEAN has a science camera with a deep-depletion CCD chip. By employing an auto-guiding system and a focal reducer to enhance the field of view at the classical Cassegrain focus, we achieved stable guiding in 20 minute exposures, an imaging quality with FWHM ~ 0.6 arcsec over the whole field (4.8 × 4.8 arcmin), and a limiting magnitude of z = 23.4 AB mag at 5-sigma with one hour of integration.

  2. An intensified/shuttered cooled CCD camera for dynamic proton radiography

    SciTech Connect

    Yates, G.J.; Albright, K.L.; Alrick, K.R.

    1998-12-31

An intensified/shuttered cooled PC-based CCD camera system was designed and successfully fielded on proton radiography experiments at the Los Alamos National Laboratory LANSCE facility using 800-MeV protons. The four-camera detector system used front-illuminated full-frame CCD arrays (two 1,024 x 1,024 pixels and two 512 x 512 pixels) fiber-optically coupled to either 25-mm diameter planar diode or microchannel plate image intensifiers, which provided optical shuttering for time-resolved imaging of shock propagation in high explosives. The intensifiers also provided wavelength shifting and optical gain. Typical sequences, consisting of four images corresponding to consecutive exposures of about 500 ns duration for 40-ns proton burst images (from a fast scintillating fiber array) separated by approximately 1 microsecond, were taken during the radiography experiments. Camera design goals and measured performance characteristics including resolution, dynamic range, responsivity, system detective quantum efficiency (DQE), and signal-to-noise ratio are discussed.

  3. Upwelling radiance at 976 nm measured from space using the OPALS CCD camera on the ISS

    NASA Astrophysics Data System (ADS)

    Biswas, Abhijit; Kovalik, Joseph M.; Oaida, Bogdan V.; Abrahamson, Matthew; Wright, Malcolm W.

    2015-03-01

The Optical Payload for Lasercomm Science (OPALS) Flight System on-board the International Space Station uses a charge-coupled device (CCD) camera to detect a beacon laser from Earth. Relative measurements of the background contributed by upwelling radiance under diverse illumination conditions and varying surface terrain are presented. In some cases, clouds in the field of view allowed a comparison of terrestrial and cloud-top upwelling radiance. In this paper we report these measurements and examine the extent of agreement with atmospheric model predictions.

  4. The measurement of astronomical parallaxes with CCD imaging cameras on small telescopes

    SciTech Connect

Ratcliff, S.J.; Balonek, T.J.; Marschall, L.A.; DuPuy, D.L.; Pennypacker, C.R.; Verma, R.; Alexov, A.; Bonney, V.

    1993-03-01

    Small telescopes equipped with charge-coupled device (CCD) imaging cameras are well suited to introductory laboratory exercises in positional astronomy (astrometry). An elegant example is the determination of the parallax of extraterrestrial objects, such as asteroids. For laboratory exercises suitable for introductory students, the astronomical hardware needs are relatively modest, and, under the best circumstances, the analysis requires little more than arithmetic and a microcomputer with image display capabilities. Results from the first such coordinated parallax observations of asteroids ever made are presented. In addition, procedures for several related experiments, involving single-site observations and/or parallaxes of earth-orbiting artificial satellites, are outlined.
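
The arithmetic behind such a parallax exercise is indeed little more than the small-angle formula: two sites a known baseline apart image the asteroid against the background stars, and the angular shift between the two frames gives the distance. A sketch with illustrative numbers (not data from the coordinated observations):

```python
# Small-angle parallax: distance = baseline / parallax angle (radians).
# The baseline and shift values used below are purely illustrative.
import math

ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0  # ~206264.8 arcsec per radian

def distance_km(baseline_km, shift_arcsec):
    """Distance to the target from the measured angular shift."""
    return baseline_km / (shift_arcsec / ARCSEC_PER_RAD)
```

For example, a 1 arcsec shift over a 1000 km baseline corresponds to roughly 2 × 10⁸ km, i.e. on the order of an astronomical unit, which is why near-Earth asteroids are convenient targets.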

  5. Optical characterization of the SOFIA telescope using fast EM-CCD cameras

    NASA Astrophysics Data System (ADS)

Pfüller, Enrico; Wolf, Jürgen; Hall, Helen; Röser, Hans-Peter

    2012-09-01

The Stratospheric Observatory for Infrared Astronomy (SOFIA) has recently demonstrated its scientific capabilities in a first series of astronomical observing flights. In parallel, special measurements and engineering flights were conducted aiming at the characterization and the commissioning of the telescope and the complete airborne observatory. To support the characterization measurements, two commercial Andor iXon EM-CCD cameras have been used: a DU-888 dubbed Fast Diagnostic Camera (FDC) running at frame rates up to about 400 fps, and a DU-860 as a Super Fast Diagnostic Camera (SFDC) providing 2000 fps. Both cameras have been mounted to the telescope's Focal Plane Imager (FPI) flange in lieu of the standard FPI tracking camera. Their fast image sequences have been used to analyze and to improve the telescope's pointing stability, especially to help tune the active mass dampers that suppress eigenfrequencies in the telescope system, to characterize and optimize the chopping secondary mirror, and to investigate the structure and behavior of the shear layer that forms over the open telescope cavity in flight. In June 2011, a collaboration between the HIPO science instrument team, MIT's stellar occultation group and the FDC team led to the first SOFIA observation of a stellar occultation by the dwarf planet Pluto over the Pacific.

  6. HERSCHEL/SCORE, imaging the solar corona in visible and EUV light: CCD camera characterization.

    PubMed

    Pancrazzi, M; Focardi, M; Landini, F; Romoli, M; Fineschi, S; Gherardi, A; Pace, E; Massone, G; Antonucci, E; Moses, D; Newmark, J; Wang, D; Rossi, G

    2010-07-01

    The HERSCHEL (helium resonant scattering in the corona and heliosphere) experiment is a rocket mission that was successfully launched last September from White Sands Missile Range, New Mexico, USA. HERSCHEL was conceived to investigate the solar corona in the extreme UV (EUV) and in the visible broadband polarized brightness and provided, for the first time, a global map of helium in the solar environment. The HERSCHEL payload consisted of a telescope, HERSCHEL EUV Imaging Telescope (HEIT), and two coronagraphs, HECOR (helium coronagraph) and SCORE (sounding coronagraph experiment). The SCORE instrument was designed and developed mainly by Italian research institutes and it is an imaging coronagraph to observe the solar corona from 1.4 to 4 solar radii. SCORE has two detectors for the EUV lines at 121.6 nm (HI) and 30.4 nm (HeII) and the visible broadband polarized brightness. The SCORE UV detector is an intensified CCD with a microchannel plate coupled to a CCD through a fiber-optic bundle. The SCORE visible light detector is a frame-transfer CCD coupled to a polarimeter based on a liquid crystal variable retarder plate. The SCORE coronagraph is described together with the performances of the cameras for imaging the solar corona. PMID:20428852

  7. CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli

    2011-02-01

Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebrovasculature and is the third leading cause of death in the United States. More than 80 percent of strokes are ischemic, due to blockage of an artery in the brain by thrombosis or arterial embolism. Hence, development of an imaging technique to image or monitor cerebral ischemia and the effect of anti-stroke therapy is more than necessary. Near-infrared (NIR) optical tomographic techniques have great potential as non-invasive imaging tools (due to their low cost and portability) to image embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebrovascular hemodynamics and brain injury. Compared to a fiber-based diffuse optical tomographic system, a CCD-camera-based system is more suitable for pre-clinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions based on tissue-phantom experimental data. We are able to obtain good reconstructed images with two recently developed algorithms: (1) a depth compensation algorithm (DCA) and (2) a globally convergent method (GCM). We will demonstrate the volumetric tomographic reconstruction results taken from a tissue phantom; the latter method has great potential to determine and monitor the effect of anti-stroke therapies.

  8. Multi-scale algorithm for improved scintillation detection in a CCD-based gamma camera.

    PubMed

    Korevaar, Marc A N; Heemskerk, Jan W T; Goorden, Marlies C; Beekman, Freek J

    2009-02-21

Gamma cameras based on charge-coupled devices (CCDs) and micro-columnar CsI scintillators can reach high spatial resolutions. However, the gamma interaction probability of these scintillators is low (typically <30% at 141 keV) due to the limited thickness of presently available micro-columnar scintillators. Continuous scintillators can improve the interaction probability but suffer from increased light spread compared to columnar scintillators. In addition, for both types of scintillators, gamma photons incident at an oblique angle reduce the spatial resolution due to the variable depth of interaction (DOI). To improve the spatial resolution and spectral characteristics of these detectors, we have developed a fast analytic scintillation detection algorithm that makes use of a depth-dependent light spread model and as a result is able to estimate the DOI in the scintillator. This algorithm, performing multi-scale frame analysis, was tested for an electron-multiplying CCD (EM-CCD) optically coupled to CsI(Tl) scintillators of different thicknesses. For the thickest scintillator (2.6 mm) a spatial resolution of 148 microm full width half maximum (FWHM) was obtained with an energy resolution of 46% FWHM for perpendicularly incident gamma photons (interaction probability 61% at 141 keV). The multi-scale algorithm improves the spatial resolution by up to 11%, the energy resolution by up to 36% and the signal-to-background counts ratio by up to 46% compared to a previously implemented algorithm that did not model the depth-dependent light spread. In addition, the multi-scale algorithm can accurately estimate DOI. As a result, degradation of the spatial resolution due to the variable DOI for gamma photons incident at a 45-degree angle was improved from 2.0 × 10³ to 448 microm FWHM. We conclude that the multi-scale algorithm significantly improves CCD-based gamma cameras as can be applied in future SPECT systems. PMID:19141886

  9. A toolkit for the characterization of CCD cameras for transmission electron microscopy.

    PubMed

    Vulovic, M; Rieger, B; van Vliet, L J; Koster, A J; Ravelli, R B G

    2010-01-01

    Charge-coupled devices (CCD) are nowadays commonly utilized in transmission electron microscopy (TEM) for applications in life sciences. Direct access to digitized images has revolutionized the use of electron microscopy, sparking developments such as automated collection of tomographic data, focal series, random conical tilt pairs and ultralarge single-particle data sets. Nevertheless, for ultrahigh-resolution work photographic plates are often still preferred. In the ideal case, the quality of the recorded image of a vitrified biological sample would solely be determined by the counting statistics of the limited electron dose the sample can withstand before beam-induced alterations dominate. Unfortunately, the image is degraded by the non-ideal point-spread function of the detector, as a result of a scintillator coupled by fibre optics to a CCD, and the addition of several inherent noise components. Different detector manufacturers provide different types of figures of merit when advertising the quality of their detector. It is hard for most laboratories to verify whether all of the anticipated specifications are met. In this report, a set of algorithms is presented to characterize on-axis slow-scan large-area CCD-based TEM detectors. These tools have been added to a publicly available image-processing toolbox for MATLAB. Three in-house CCD cameras were carefully characterized, yielding, among others, statistics for hot and bad pixels, the modulation transfer function, the conversion factor, the effective gain and the detective quantum efficiency. These statistics will aid data-collection strategy programs and provide prior information for quantitative imaging. The relative performance of the characterized detectors is discussed and a comparison is made with similar detectors that are used in the field of X-ray crystallography. PMID:20057054
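
One of the statistics such a toolkit estimates, the conversion factor (gain in e-/ADU), can be sketched with the classic mean-variance (photon transfer) method: for Poisson-limited flat fields, variance = mean / gain in ADU, so a line fit of variance against mean yields the gain. This is a generic illustration on synthetic frames, not the toolbox's own MATLAB code:

```python
# Mean-variance sketch of conversion-factor estimation. Synthetic
# Gaussian "flat fields" stand in for real CCD data; gain_true is an
# assumed value used only to generate them.
import random

random.seed(0)
gain_true = 4.0  # e-/ADU, assumed for the synthetic frames

means, variances = [], []
for electrons in (1000, 4000, 16000):
    # Poisson signal approximated by a Gaussian, converted to ADU.
    samples = [random.gauss(electrons, electrons ** 0.5) / gain_true
               for _ in range(20000)]
    m = sum(samples) / len(samples)
    v = sum((s - m) ** 2 for s in samples) / (len(samples) - 1)
    means.append(m)
    variances.append(v)

# Least-squares slope through the origin: variance = mean / gain.
slope = sum(m * v for m, v in zip(means, variances)) / sum(m * m for m in means)
gain_est = 1.0 / slope
```

Real measurements additionally subtract read noise and fixed-pattern contributions before the fit, which this sketch omits.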

  10. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument currently being developed by NASA's Marshall Space Flight Center (MSFC), the National Astronomical Observatory of Japan (NAOJ), and other partners. The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6 nm) light emitted by the Sun's chromosphere. The polarized spectrum imaged by the CCD cameras will capture information about the local magnetic field, allowing for measurements of magnetic strength and structure. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, read noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of 2.0 ± 0.5 e-/DN, a read noise level less than 25 e-, a dark current level less than 10 e-/pixel/s, and a residual non-linearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.
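
The quoted requirements amount to a simple acceptance check per camera. A minimal sketch, with the measured values below being illustrative placeholders rather than CLASP test data:

```python
# Acceptance check against the CLASP camera requirements quoted above:
# gain 2.0 +/- 0.5 e-/DN, read noise < 25 e-, dark current < 10 e-/pixel/s,
# residual non-linearity < 1%.
def meets_requirements(gain, read_noise, dark_current, nonlinearity):
    return (abs(gain - 2.0) <= 0.5      # e-/DN
            and read_noise < 25.0       # e-
            and dark_current < 10.0     # e-/pixel/s
            and nonlinearity < 0.01)    # fractional residual
```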

  11. Lights, Camera, Action! Using Video Recordings to Evaluate Teachers

    ERIC Educational Resources Information Center

    Petrilli, Michael J.

    2011-01-01

Teachers and their unions do not want test scores to count for everything; classroom observations are key, too. But planning a couple of visits from the principal is hardly sufficient. These visits may "change the teacher's behavior"; furthermore, principals may not be the best judges of effective teaching. So why not put video cameras in…

  12. Using a Digital Video Camera to Study Motion

    ERIC Educational Resources Information Center

    Abisdris, Gil; Phaneuf, Alain

    2007-01-01

    To illustrate how a digital video camera can be used to analyze various types of motion, this simple activity analyzes the motion and measures the acceleration due to gravity of a basketball in free fall. Although many excellent commercially available data loggers and software can accomplish this task, this activity requires almost no financial…

  14. 67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST OF ASSISTANT LAUNCH CONDUCTOR PANEL SHOWN IN CA-133-1-A-66 - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  15. Soft x-ray response of the x-ray CCD camera directly coated with optical blocking layer

    NASA Astrophysics Data System (ADS)

Ikeda, S.; Kohmura, T.; Kawai, K.; Kaneko, K.; Watanabe, T.; Tsunemi, H.; Hayashida, K.; Anabuki, N.; Nakajima, H.; Ueda, S.; Tsuru, T. G.; Dotani, T.; Ozaki, M.; Matsuta, K.; Fujinaga, T.; Kitamoto, S.; Murakami, H.; Hiraga, J.; Mori, K.; ASTRO-H SXI Team

    2012-03-01

We have developed a back-illuminated X-ray CCD camera (BI-CCD) to observe X-rays in space. Since an X-ray CCD is sensitive not only to X-rays but also to optical and UV light, it has to be equipped with a filter to cut off optical as well as UV light. The X-ray Imaging Spectrometer (XIS) onboard the Suzaku satellite was equipped with a thin film (OBF: Optical Blocking Filter) to cut off optical and UV light. An OBF is always in danger of tearing from acoustic noise or vibration during launch, and it is difficult to handle on the ground because of its thinness. Instead of an OBF, we have newly developed and produced an OBL (Optical Blocking Layer), which is coated directly onto the X-ray CCD surface.

  16. CTK-II & RTK: The CCD-cameras operated at the auxiliary telescopes of the University Observatory Jena

    NASA Astrophysics Data System (ADS)

    Mugrauer, M.

    2016-02-01

    The Cassegrain-Teleskop-Kamera (CTK-II) and the Refraktor-Teleskop-Kamera (RTK) are two CCD-imagers which are operated at the 25 cm Cassegrain and 20 cm refractor auxiliary telescopes of the University Observatory Jena. This article describes the main characteristics of these instruments. The properties of the CCD-detectors, the astrometry, the image quality, and the detection limits of both CCD-cameras, as well as some results of ongoing observing projects, carried out with these instruments, are presented. Based on observations obtained with telescopes of the University Observatory Jena, which is operated by the Astrophysical Institute of the Friedrich-Schiller-University.

  17. Stereo Imaging Velocimetry Technique Using Standard Off-the-Shelf CCD Cameras

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2004-01-01

    Stereo imaging velocimetry is a fluid physics technique for measuring three-dimensional (3D) velocities at a plurality of points. This technique provides full-field 3D analysis of any optically clear fluid or gas experiment seeded with tracer particles. Unlike current 3D particle imaging velocimetry systems that rely primarily on laser-based systems, stereo imaging velocimetry uses standard off-the-shelf charge-coupled device (CCD) cameras to provide accurate and reproducible 3D velocity profiles for experiments that require 3D analysis. Using two cameras aligned orthogonally, we present a closed mathematical solution resulting in an accurate 3D approximation of the observation volume. The stereo imaging velocimetry technique is divided into four phases: 3D camera calibration, particle overlap decomposition, particle tracking, and stereo matching. Each phase is explained in detail. In addition to being utilized for space shuttle experiments, stereo imaging velocimetry has been applied to the fields of fluid physics, bioscience, and colloidal microscopy.
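
The orthogonal-camera geometry described above can be illustrated in idealized form: each camera contributes two coordinates, and the axis seen by both cameras is measured twice. This sketch deliberately ignores the calibration, particle overlap decomposition, and tracking phases; it is a geometric illustration only, not the paper's closed solution:

```python
# Idealized orthogonal two-camera reconstruction: one camera looks along
# the z-axis (sees x, y), the other along the x-axis (sees z, y). The
# redundant y measurement is averaged.
def reconstruct(front_view, side_view):
    """front_view: (x, y) from the z-axis camera; side_view: (z, y)."""
    x, y_front = front_view
    z, y_side = side_view
    return (x, (y_front + y_side) / 2.0, z)
```

In a real setup the two y readings also provide a consistency check for the stereo-matching phase: a large disagreement flags a mismatched particle pair.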

  18. Charge-coupled device (CCD) television camera for NASA's Galileo mission to Jupiter

    NASA Technical Reports Server (NTRS)

    Klaasen, K. P.; Clary, M. C.; Janesick, J. R.

    1982-01-01

The CCD detector under construction for use in the slow-scan television camera for the NASA Galileo Jupiter orbiter to be launched in 1985 is presented. The science objectives and the design constraints imposed by the earth telemetry link, platform residual motion, and the Jovian radiation environment are discussed. Camera optics are inherited from Voyager; filter wavelengths are chosen to enable discrimination of Galilean-satellite surface chemical composition. The CCD design, an 800 by 800-element 'virtual-phase' solid-state silicon image-sensor array with supporting electronics, is described with detailed discussion of the thermally generated dark current, quantum efficiency, signal-to-noise ratio, and resolution. Tests of the effect of ionizing radiation were performed and are analyzed statistically. An imaging mode using a 2-1/3-sec frame time and on-chip summation of the signal in 2 x 2 blocks of adjacent pixels is designed to limit the effects of the most extreme Jovian radiation. Smearing due to spacecraft/target relative velocity and platform instability will be corrected for via an algorithm maximizing spatial resolution at a given signal-to-noise level. The camera is expected to produce 40,000 images of Jupiter and its satellites during the 20-month mission.
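
The 2 x 2 on-chip summation mode can be emulated in software to show the trade it makes. A sketch (not the flight implementation, and assuming an even-sized frame):

```python
# Emulated 2x2 binning: each 2x2 block of adjacent pixels is summed into
# one output pixel, quartering the pixel count while pooling signal.
def bin2x2(frame):
    rows, cols = len(frame), len(frame[0])  # both assumed even
    return [[frame[r][c] + frame[r][c + 1] +
             frame[r + 1][c] + frame[r + 1][c + 1]
             for c in range(0, cols, 2)]
            for r in range(0, rows, 2)]
```

Summing on-chip before readout means the read noise is paid once per binned pixel rather than four times, which is what makes the mode attractive in a high-radiation, low-signal regime.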

  19. Benchmarking of Back Thinned 512x512 X-ray CCD Camera Measurements with DEF X-ray film

    NASA Astrophysics Data System (ADS)

    Shambo, N. A.; Workman, J.; Kyrala, G.; Hurry, T.; Gonzales, R.; Evans, S. C.

    1999-11-01

Using the Trident Laser Facility at Los Alamos National Laboratory, 25-micron-thick, 2 mm diameter titanium disks were illuminated with 527 nm (green) laser light to measure X-ray yield. 1.0 mil and 0.5 mil aluminum steps were used to test the linearity of the CCD camera, and DEF X-ray film was used to test the calibration of the CCD camera response at 4.75 keV. Both laser spot size and incident laser intensity were constrained to give consistency to the experimental data. This poster will discuss both the experimental design and the results.

  20. Performance of front-end mixed-signal ASIC for onboard CCD cameras

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Inoue, Shota; Nagino, Ryo; Anabuki, Naohisa; Hayashida, Kiyoshi; Tsunemi, Hiroshi; Doty, John P.; Ikeda, Hirokazu

    2014-07-01

We report on the development status of the readout ASIC for an onboard X-ray CCD camera. Quick low-noise readout is essential for pile-up-free imaging spectroscopy with future highly sensitive telescopes. The dedicated ASIC for ASTRO-H/SXI has sufficient noise performance only at the slow pixel rate of 68 kHz. We have therefore been developing an upgraded ASIC with fourth-order ΔΣ modulators. Upgrading the order of the modulator enables us to oversample the CCD signals fewer times, so that the pixel rate can be increased. The digitized pulse height is a serial bit stream that is decoded with a decimation filter. The weighting coefficients of the filter are optimized to maximize the signal-to-noise ratio by simulation. We present performances such as the input equivalent noise (IEN), gain, and effective signal range. The digitized pulse-height data are successfully obtained in the first functional test up to 625 kHz. IEN is almost the same as that obtained with the chip for ASTRO-H/SXI. The residual from the gain function is about 0.1%, which is better than that of the conventional ASIC by a factor of two. Assuming that the gain of the CCD is the same as that for ASTRO-H, the effective range is 30 keV in the case of the maximum gain. By changing the gain it can handle signal charges of 100 ke-. These results will be fed back into the optimization of the pulse-height decoding filter.
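
The modulate-then-decimate idea can be illustrated with a first-order delta-sigma modulator, far simpler than the fourth-order modulator in the ASIC, and a plain moving-average decimator standing in for the SNR-optimized weighting filter. A hedged sketch of the principle only:

```python
# First-order delta-sigma sketch: a constant input in [0, 1] becomes a
# 1-bit stream whose density of ones encodes the amplitude; a decimation
# filter (here a simple average) recovers it.
def delta_sigma_bits(x, n):
    """Encode constant input x in [0, 1] as n one-bit samples."""
    acc, bits = 0.0, []
    for _ in range(n):
        acc += x
        bit = 1 if acc >= 1.0 else 0
        acc -= bit
        bits.append(bit)
    return bits

def decimate(bits):
    """Moving-average decimation: bit density approximates the input."""
    return sum(bits) / len(bits)
```

Raising the modulator order, as the abstract describes, shapes the quantization noise away from the signal band so that far fewer bits per pixel are needed for the same resolution.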

  1. Advantages of the CCD camera measurements for profile and wear of cutting tools

    NASA Astrophysics Data System (ADS)

Varga, G.; Balajti, Z.; Dudás, I.

    2005-01-01

In our paper we prepared an evaluation study whose conclusions point in two main directions for our fields of research. On the one hand, this means the measurement of fixed, stationary workpieces; on the other hand, the geometrical measurement of moving tools. The first case seems largely solved in many respects (in general cases), but the second is not completely worked out according to the relevant literature. The monitoring of tool wear and the determination of geometrical parameters (mainly in the case of gear-generating tools) are not really widespread yet, especially if optical parameters influence the evaluation procedure (e.g. examination of the profiles of grinding wheels). We show the elaboration of a process for the practical application of measuring techniques performed by image-processing CCD cameras on the basis of wear criteria of different cutting tools (drilling tool, turning tool). We have developed a profile and cutting-tool-wear measuring program.

  2. Imaging of blood vessels with CCD-camera based three-dimensional photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Nuster, Robert; Slezak, Paul; Paltauf, Guenther

    2014-03-01

An optical phase-contrast full-field detection setup in combination with a CCD camera is presented to record acoustic fields for real-time projection and fast three-dimensional imaging. When recording projection images of the wave pattern around the imaging object, the three-dimensional photoacoustic imaging problem is reduced to a set of two-dimensional reconstructions, and the measurement setup requires only a single axis of rotation. Using a 10 Hz pulsed laser system for photoacoustic excitation, a three-dimensional image can be obtained in less than 1 min. The sensitivity and resolution of the detection system were experimentally estimated to be 5 kPa mm and 75 μm, respectively. Experiments on biological samples show the applicability of this technique for imaging blood vessel distributions.

  3. High resolution three-dimensional photoacoutic tomography with CCD-camera based ultrasound detection.

    PubMed

    Nuster, Robert; Slezak, Paul; Paltauf, Guenther

    2014-08-01

A photoacoustic tomograph based on optical ultrasound detection is demonstrated, which is capable of high resolution real-time projection imaging and fast three-dimensional (3D) imaging. Snapshots of the pressure field outside the imaged object are taken at defined delay times after photoacoustic excitation by use of a charge coupled device (CCD) camera in combination with an optical phase contrast method. From the obtained wave patterns photoacoustic projection images are reconstructed using a back propagation Fourier domain reconstruction algorithm. Applying the inverse Radon transform to a set of projections recorded over a half rotation of the sample provides 3D photoacoustic tomography images in less than one minute with a resolution below 100 μm. The sensitivity of the device was experimentally determined to be 5.1 kPa over a projection length of 1 mm. In vivo images of the vasculature of a mouse demonstrate the potential of the developed method for biomedical applications. PMID:25136491

  4. High resolution three-dimensional photoacoutic tomography with CCD-camera based ultrasound detection

    PubMed Central

    Nuster, Robert; Slezak, Paul; Paltauf, Guenther

    2014-01-01

A photoacoustic tomograph based on optical ultrasound detection is demonstrated, which is capable of high resolution real-time projection imaging and fast three-dimensional (3D) imaging. Snapshots of the pressure field outside the imaged object are taken at defined delay times after photoacoustic excitation by use of a charge coupled device (CCD) camera in combination with an optical phase contrast method. From the obtained wave patterns photoacoustic projection images are reconstructed using a back propagation Fourier domain reconstruction algorithm. Applying the inverse Radon transform to a set of projections recorded over a half rotation of the sample provides 3D photoacoustic tomography images in less than one minute with a resolution below 100 μm. The sensitivity of the device was experimentally determined to be 5.1 kPa over a projection length of 1 mm. In vivo images of the vasculature of a mouse demonstrate the potential of the developed method for biomedical applications. PMID:25136491

  5. A reflectance model for non-contact mapping of venous oxygen saturation using a CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Jun; Dunmire, Barbrina; Beach, Kirk W.; Leotta, Daniel F.

    2013-11-01

A method of non-contact mapping of venous oxygen saturation (SvO2) is presented. A CCD camera is used to image skin tissue illuminated alternately by a red (660 nm) and an infrared (800 nm) LED light source. Low cuff pressures of 30-40 mmHg are applied to induce a venous blood volume change with negligible change in the arterial blood volume. A hybrid model combining the Beer-Lambert law and the light diffusion model is developed and used to convert the change in light intensity to the change in the skin tissue absorption coefficient. A simulation study incorporating the full light diffusion model is used to verify the hybrid model and to correct a calculation bias. SvO2 values in the fingers, palm, and forearm of five volunteers are presented and compared with results in the published literature. Two-dimensional maps of venous oxygen saturation are given for the three anatomical regions.
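
The two-wavelength Beer-Lambert step at the heart of such a model can be sketched as follows: the cuff-induced change in absorption at each wavelength is a saturation-weighted mix of the Hb and HbO2 extinction coefficients, and the ratio of the two changes can be solved for the saturation. The extinction coefficients below are approximate literature values used as assumptions, and this is a generic illustration rather than the paper's hybrid model:

```python
# Two-wavelength oximetry sketch. Model (per wavelength l):
#   d_mua(l) = [S * eHbO2(l) + (1 - S) * eHb(l)] * dC
# where S is oxygen saturation and dC the (unknown) hemoglobin change.
# Extinction coefficients in cm^-1 / M, approximate literature values.
EXT = {
    660: {"Hb": 3227.0, "HbO2": 319.6},
    800: {"Hb": 762.0, "HbO2": 816.0},
}

def saturation(dmua_660, dmua_800):
    """Solve the two-wavelength system for S; dC cancels in the ratio."""
    r = dmua_800 / dmua_660
    e_h1, e_o1 = EXT[660]["Hb"], EXT[660]["HbO2"]
    e_h2, e_o2 = EXT[800]["Hb"], EXT[800]["HbO2"]
    return (e_h2 - r * e_h1) / (r * (e_o1 - e_h1) - (e_o2 - e_h2))
```

The 660/800 nm pair works because Hb and HbO2 differ strongly at 660 nm while 800 nm sits near the isosbestic point, anchoring the total hemoglobin change.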

  6. Improvement of relief algorithm to prevent inpatient's downfall accident with night-vision CCD camera

    NASA Astrophysics Data System (ADS)

    Matsuda, Noriyuki; Yamamoto, Takeshi; Miwa, Masafumi; Nukumi, Shinobu; Mori, Kumiko; Kuinose, Yuko; Maeda, Etuko; Miura, Hirokazu; Taki, Hirokazu; Hori, Satoshi; Abe, Norihiro

    2005-12-01

    "ROSAI" hospital, Wakayama City in Japan, reported that inpatient's bed-downfall is one of the most serious accidents in hospital at night. Many inpatients have been having serious damages from downfall accidents from a bed. To prevent accidents, the hospital tested several sensors in a sickroom to send warning-signal of inpatient's downfall accidents to a nurse. However, it sent too much inadequate wrong warning about inpatients' sleeping situation. To send a nurse useful information, precise automatic detection for an inpatient's sleeping situation is necessary. In this paper, we focus on a clustering-algorithm which evaluates inpatient's situation from multiple angles by several kinds of sensor including night-vision CCD camera. This paper indicates new relief algorithm to improve the weakness about exceptional cases.

  7. Method for searching the mapping relationship between space points and their image points in CCD camera

    NASA Astrophysics Data System (ADS)

    Sun, Yuchen; Ge, Baozhen; Lu, Qieni; Zou, Jin; Zhang, Yimo

    2005-01-01

    A BP Neural Network Method and a Linear Partition Method are proposed to search the mapping relationship between space points and their image points in CCD cameras; the methods can be adopted to calibrate three-dimensional digitization systems based on optical methods. Both methods need only the coordinates of the calibration points and the coordinates of their corresponding image points as parameters. The principle of the calibration techniques, including the formulas and solution procedure, is deduced in detail. Calibration experiments indicate that applying the Linear Partition Method to coplanar points yields a mean relative measurement error of 0.44 percent, and applying the BP Neural Network Method to non-coplanar points yields a testing accuracy of 0.5-0.6 pixels.

  8. 0.25mm-thick CCD packaging for the Dark Energy Survey Camera array

    SciTech Connect

    Derylo, Greg; Diehl, H.Thomas; Estrada, Juan; /Fermilab

    2006-06-01

    The Dark Energy Survey Camera focal plane array will consist of 62 2k x 4k CCDs with a pixel size of 15 microns and a silicon thickness of 250 microns for use at wavelengths between 400 and 1000 nm. Bare CCD die will be received from the Lawrence Berkeley National Laboratory (LBNL). At the Fermi National Accelerator Laboratory, the bare die will be packaged into a custom back-side-illuminated module design. Cold probe data from LBNL will be used to select the CCDs to be packaged. The module design utilizes an aluminum nitride readout board and spacer and an Invar foot. A module flatness of 3 microns over small (1 sqcm) areas and less than 10 microns over neighboring areas on a CCD are required for uniform images over the focal plane. A confocal chromatic inspection system is being developed to precisely measure flatness over a grid up to 300 x 300 mm. This system will be utilized to inspect not only room-temperature modules, but also cold individual modules and partial arrays through flat dewar windows.

  9. Picosecond Raman spectroscopy with a fast intensified CCD camera for depth analysis of diffusely scattering media.

    PubMed

    Ariese, Freek; Meuzelaar, Heleen; Kerssens, Marleen M; Buijs, Joost B; Gooijer, Cees

    2009-06-01

    A spectroscopic depth profiling approach is demonstrated for layers of non-transparent, diffusely scattering materials. The technique is based on the temporal discrimination between Raman photons emitted from the surface and Raman photons originating from a deeper layer. Excitation was carried out with a frequency-doubled, 3 ps Ti:sapphire laser system (398 nm; 76 MHz repetition rate). Time-resolved detection was carried out with an intensified CCD camera that can be gated with a 250 ps gate width. The performance of the system was assessed using 1 mm and 2 mm pathlength cuvettes with powdered PMMA and trans-stilbene (TS) crystals, respectively, or solid white polymer blocks: Arnite (polyethylene terephthalate), Delrin (polyoxymethylene), polythene (polyethylene) and Teflon (polytetrafluoroethylene). These samples were pressed together in different configurations and Raman photons were collected in backscatter mode in order to study the time difference in such media corresponding to several mm of extra net photon migration distance. We also studied the lateral contrast between two different second layers. The results demonstrate that by means of a picosecond laser system and the time discrimination of a gated intensified CCD camera, molecular spectroscopic information can be obtained through a turbid surface layer. In the case of the PMMA/TS two-layer system, time-resolved detection with a 400 ps delay improved the relative intensity of the Raman bands of the second layer by a factor of 124 in comparison with the spectrum recorded with a 100 ps delay (which is more selective for the first layer) and by a factor of 14 in comparison with a non-gated setup. Possible applications are discussed, as well as advantages/disadvantages over other Raman techniques for diffusely scattering media. PMID:19475147

  10. Unmanned Vehicle Guidance Using Video Camera/Vehicle Model

    NASA Technical Reports Server (NTRS)

    Sutherland, T.

    1999-01-01

    A video guidance sensor (VGS) system has flown on both STS-87 and STS-95 to validate a single camera/target concept for vehicle navigation. The main part of the image algorithm was the subtraction of two consecutive images using software. For a nominal size image of 256 x 256 pixels this subtraction can take a large portion of the time between successive frames in standard rate video leaving very little time for other computations. The purpose of this project was to integrate the software subtraction into hardware to speed up the subtraction process and allow for more complex algorithms to be performed, both in hardware and software.
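
    The software frame subtraction that this project moved into hardware can be emulated in a few lines; the threshold value and frame size below are illustrative, not from the report:

```python
import numpy as np

def frame_difference(prev, curr, threshold=20):
    """Absolute difference of two consecutive 8-bit frames, thresholded to a
    binary change mask -- the per-pixel operation the VGS work accelerated.

    Widening to int16 before subtracting avoids uint8 wrap-around.
    """
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > threshold).astype(np.uint8)
```

    For a nominal 256 x 256 frame this is 65,536 subtract-compare operations per frame pair, which is exactly the kind of regular, data-parallel workload that maps well onto hardware.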

  11. In-flight Video Captured by External Tank Camera System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In this July 26, 2005 video, Earth slowly fades into the background as the STS-114 Space Shuttle Discovery climbs into space until the External Tank (ET) separates from the orbiter. An ET Camera System featuring a Sony XC-999 model camera provided never-before-seen footage of the launch and tank separation. The camera was installed in the ET LO2 Feedline Fairing. From this position, the camera had a 40% field of view with a 3.5 mm lens. The field of view showed some of the Bipod area, a portion of the LH2 tank and Intertank flange area, and some of the bottom of the shuttle orbiter. Contained in an electronic box, the battery pack and transmitter were mounted on top of the Solid Rocket Booster (SRB) crossbeam inside the ET. The battery pack included 20 Nickel-Metal Hydride batteries (similar to cordless phone battery packs) totaling 28 volts DC and could supply about 70 minutes of video. Located 95 degrees apart on the exterior of the Intertank opposite the orbiter side, there were 2 blade S-Band antennas about 2 1/2 inches long that transmitted a 10 watt signal to the ground stations. The camera turned on approximately 10 minutes prior to launch and operated for 15 minutes following liftoff. The complete camera system weighed about 32 pounds. Marshall Space Flight Center (MSFC), Johnson Space Center (JSC), Goddard Space Flight Center (GSFC), and Kennedy Space Center (KSC) participated in the design, development, and testing of the ET camera system.

  12. Fast roadway detection using car cabin video camera

    NASA Astrophysics Data System (ADS)

    Krokhina, Daria; Blinov, Veniamin; Gladilin, Sergey; Tarhanov, Ivan; Postnikov, Vassili

    2015-12-01

    We describe a fast method for road detection in images from a vehicle cabin camera. Straight sections of roadway are detected using the Fast Hough Transform and dynamic programming. We assume that the location of the horizon line in the image and the road pattern are known. The developed method is fast enough to detect the roadway in each frame of the video stream in real time and may be further accelerated by the use of tracking.
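
    A plain (rho, theta) Hough transform, standing in for the Fast Hough Transform named in the abstract, can sketch the line-detection step; the accumulator resolution and the point format are arbitrary choices for illustration:

```python
import numpy as np

def hough_strongest_line(points, img_shape, n_theta=180, n_rho=200):
    """Vote (row, col) edge points into a (rho, theta) accumulator and
    return the strongest line, parameterized as rho = x*cos(t) + y*sin(t)."""
    h, w = img_shape
    diag = np.hypot(h, w)                       # |rho| is bounded by the diagonal
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_rho, n_theta), dtype=np.int32)
    for y, x in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = ((rhos + diag) / (2 * diag) * (n_rho - 1)).round().astype(int)
        acc[idx, np.arange(n_theta)] += 1       # one vote per theta bin
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    rho = r / (n_rho - 1) * 2 * diag - diag
    return rho, thetas[t]
```

    A known horizon line, as assumed in the paper, would simply restrict which edge points are fed into the accumulator.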

  13. Real-time road traffic classification using mobile video cameras

    NASA Astrophysics Data System (ADS)

    Lapeyronnie, A.; Parisot, C.; Meessen, J.; Desurmont, X.; Delaigle, J.-F.

    2008-02-01

    On-board video analysis has attracted a lot of interest over the last two decades, with the main goal of improving safety by detecting obstacles or assisting the driver. Our study aims at providing a real-time understanding of urban road traffic. Considering a video camera fixed on the front of a public bus, we propose a cost-effective approach to estimate the speed of the vehicles on the adjacent lanes when the bus operates on a dedicated lane. We work on 1-D segments drawn in the image space, aligned with the road lanes. The relative speed of the vehicles is computed by detecting and tracking features along each of these segments. The absolute speed can be estimated from the relative speed if the camera speed is known, e.g. thanks to an odometer and/or GPS. Using pre-defined speed thresholds, the traffic can be classified into different categories such as 'fluid', 'congestion', etc. The solution offers both good performance and low computing complexity and is compatible with cheap video cameras, which allows its adoption by city traffic management authorities.
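
    The final classification step amounts to adding the camera speed to the tracked relative speed and applying thresholds. In this sketch the class names 'fluid' and 'congestion' follow the abstract, but the numeric thresholds and the middle class are invented for illustration:

```python
def absolute_speed(relative_kmh, camera_kmh):
    """Tracked relative speed plus the bus's own speed (odometer/GPS)."""
    return relative_kmh + camera_kmh

def classify_traffic(speed_kmh, fluid_above=30.0, congested_below=10.0):
    """Map an estimated absolute vehicle speed to a traffic category."""
    if speed_kmh >= fluid_above:
        return "fluid"
    if speed_kmh <= congested_below:
        return "congestion"
    return "intermediate"
```

    For example, a vehicle tracked at -5 km/h relative to a bus travelling at 40 km/h has an absolute speed of 35 km/h and would be classified as fluid traffic under these thresholds.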

  14. Robust camera calibration for sport videos using court models

    NASA Astrophysics Data System (ADS)

    Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang

    2003-12-01

    We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.
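
    Since the court is planar, the frame-to-court mapping that this calibration recovers is a homography. A minimal direct linear transform (DLT) estimator, a standard technique rather than the paper's full pipeline, could look like this:

```python
import numpy as np

def fit_homography(src, dst):
    """Least-squares DLT estimate of the 3x3 homography mapping planar
    court-model points (src) to image points (dst); needs >= 4 pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography (up to scale) is the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def apply_homography(H, pt):
    """Map a 2-D point through H using homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

    In the paper's setting, the point correspondences would come from intersections of the matched court lines rather than being given directly.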

  15. Identifying sports videos using replay, text, and camera motion features

    NASA Astrophysics Data System (ADS)

    Kobla, Vikrant; DeMenthon, Daniel; Doermann, David S.

    1999-12-01

    Automated classification of digital video is emerging as an important piece of the puzzle in the design of content management systems for digital libraries. The ability to classify videos into various classes such as sports, news, movies, or documentaries increases the efficiency of indexing, browsing, and retrieval of video in large databases. In this paper, we discuss the extraction of features that enable identification of sports videos directly from the compressed domain of MPEG video. These features include detecting the presence of action replays, determining the amount of scene text in the video, and calculating various statistics on camera and/or object motion. The features are derived from the macroblock, motion, and bit-rate information that is readily accessible from MPEG video with very minimal decoding, leading to substantial gains in processing speed. Full decoding of selected frames is required only for text analysis. A decision tree classifier built using these features is able to identify sports clips with an accuracy of about 93 percent.
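
    A hand-written stand-in for the learned decision tree might combine the three feature families like this; the thresholds are invented, and the actual classifier in the paper is trained from labelled data:

```python
def is_sports_clip(replay_count, text_ratio, camera_motion):
    """Toy decision rule over replay, scene-text, and camera-motion
    features (all thresholds hypothetical)."""
    if replay_count > 0:          # action replays are a strong sports cue
        return True
    # sports tend to show strong camera motion with little overlaid text
    return camera_motion > 0.5 and text_ratio < 0.2
```

    The point of the compressed-domain approach is that all three inputs can be computed from macroblock, motion-vector, and bit-rate data without fully decoding the MPEG stream.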

  16. Scientific CCD technology at JPL

    NASA Technical Reports Server (NTRS)

    Janesick, J.; Collins, S. A.; Fossum, E. R.

    1991-01-01

    Charge-coupled devices (CCD's) were recognized for their potential as an imaging technology almost immediately following their conception in 1970. Twenty years later, they are firmly established as the technology of choice for visible imaging. While consumer applications of CCD's, especially the emerging home video camera market, dominated manufacturing activity, the scientific market for CCD imagers has become significant. Activity of the Jet Propulsion Laboratory and its industrial partners in the area of CCD imagers for space scientific instruments is described. Requirements for scientific imagers are significantly different from those needed for home video cameras, and are described. An imager for an instrument on the CRAF/Cassini mission is described in detail to highlight achieved levels of performance.

  17. Status of the CCD camera for the eROSITA space telescope

    NASA Astrophysics Data System (ADS)

    Meidinger, Norbert; Andritschke, Robert; Elbs, Johannes; Granato, Stefanie; Hälker, Olaf; Hartner, Gisela; Herrmann, Sven; Miessner, Danilo; Pietschner, Daniel; Predehl, Peter; Reiffers, Jonas; Rommerskirchen, Tanja; Schmaler, Gabriele; Strüder, Lothar; Tiedemann, Lars

    2011-09-01

    The approved German X-ray telescope eROSITA (extended ROentgen Survey with an Imaging Telescope Array) is the core instrument on the Russian Spektrum-Roentgen-Gamma (SRG) mission. After the satellite's launch to the Lagrangian point L2 in the near future, eROSITA will perform a survey of the entire X-ray sky. In the soft band (0.5 keV - 2 keV), it will be about 30 times more sensitive than ROSAT, while in the hard band (2 keV - 8 keV) it will provide the first complete imaging survey of the sky. The design-driving science is the detection of 100,000 clusters of galaxies up to redshift z ~ 1.3 in order to study the large scale structure in the Universe and test cosmological models including Dark Energy. Detection of single X-ray photons with information about their energy, arrival angle and time is accomplished by an array of seven identical and independent PNCCD cameras. Each camera is assigned to a dedicated mirror system of Wolter-I type. The key component of the camera is a 5 cm × 3 cm large, back-illuminated, 450 μm thick and fully depleted frame store PNCCD chip. It is a further development of the sensor type which has been in operation aboard the XMM-Newton satellite since 1999. Development and production of the CCDs for the eROSITA project were performed in the semiconductor laboratory of the Max-Planck-Institutes for Physics and Extraterrestrial Physics, the MPI Halbleiterlabor. By means of a unique so-called 'cold-chuck probe station', we have characterized the performance of each PNCCD sensor on chip level. Various tests were carried out for a detailed characterization of the CCD and its custom-made analog readout ASIC. This includes in particular the evaluation of the optimum detector operating conditions in terms of operating sequence, supply voltages and operating temperature in order to achieve optimum performance. In the course of the eROSITA camera development, an engineering model of the eROSITA flight detector was assembled and has been used for tests since 2010. Based on these results and on the extensive tests with lab model detectors, the design of the front-end electronics has meanwhile been finalized for the flight cameras. Furthermore, the specifications for the other supply and control electronics were precisely determined on the basis of the experimental tests.

  18. Statistical estimates of fundamental constraints on the use of commercial CCD cameras in TV guide systems of large optical telescopes

    NASA Astrophysics Data System (ADS)

    Komarov, V. V.; Fomenko, A. F.

    2007-03-01

    The parameters of TV guide cameras of the BTA and Zeiss-1000 telescopes are analyzed. The formation of optical images by the atmosphere + telescope system is analyzed with allowance for the laws of photoelectron statistics in order to justify the applicability of commercial CCD cameras in the guiding systems of large optical telescopes. The analysis focuses on the estimates of fundamental constraints imposed on the method of TV observations of the sky through a turbulent atmosphere. The possible ways of reducing the main constraining factors in the case of the use of highly sensitive commercially produced CCDs in TV guide cameras are outlined.

  19. OP09O-OP404-9 Wide Field Camera 3 CCD Quantum Efficiency Hysteresis

    NASA Technical Reports Server (NTRS)

    Collins, Nick

    2009-01-01

    The HST/Wide Field Camera (WFC) 3 UV/visible channel CCD detectors have exhibited an unanticipated quantum efficiency hysteresis (QEH) behavior. At the nominal operating temperature of -83C, the QEH feature contrast was typically 0.1-0.2% or less. The behavior was replicated using flight spare detectors. A visible light flat-field (540nm) with a several times full-well signal level can pin the detectors at both optical (600nm) and near-UV (230nm) wavelengths, suppressing the QEH behavior. We are characterizing the timescale for the detectors to become unpinned and developing a protocol for flashing the WFC3 CCDs with the instrument's internal calibration system in flight. The HST/Wide Field Camera 3 UV/visible channel CCD detectors have exhibited an unanticipated quantum efficiency hysteresis (QEH) behavior. The first observed manifestation of QEH was the presence in a small percentage of flat-field images of a bowtie-shaped contrast that spanned the width of each chip. At the nominal operating temperature of -83C, the contrast observed for this feature was typically 0.1-0.2% or less, though at warmer temperatures contrasts up to 5% (at -50C) have been observed. The bowtie morphology was replicated using flight spare detectors in tests at the GSFC Detector Characterization Laboratory by power cycling the detector while cold. Continued investigation revealed that a clearly-related global QE suppression at the approximately 5% level can be produced by cooling the detector in the dark; subsequent flat-field exposures at a constant illumination show asymptotically increasing response. This QE "pinning" can be achieved with a single high signal flat-field or a series of lower signal flats; a visible light (500-580nm) flat-field with a signal level of several hundred thousand electrons per pixel is sufficient for QE pinning at both optical (600nm) and near-UV (230nm) wavelengths. 
We are characterizing the timescale for the detectors to become unpinned and developing a protocol for flashing the WFC3 CCDs with the instrument's internal calibration system in flight. A preliminary estimate of the decay timescale for one detector is that a drop of 0.1-0.2% occurs over a ten day period, indicating that relatively infrequent cal lamp exposures can mitigate the behavior to extremely low levels.

  20. Measuring the Flatness of Focal Plane for Very Large Mosaic CCD Camera

    SciTech Connect

    Hao, Jiangang; Estrada, Juan; Cease, Herman; Diehl, H.Thomas; Flaugher, Brenna L.; Kubik, Donna; Kuk, Keivin; Kuropatkine, Nickolai; Lin, Huan; Montes, Jorge; Scarpine, Vic

    2010-06-08

    Large mosaic multi-CCD cameras are the key instruments for modern digital sky surveys. DECam is an extremely red-sensitive 520-megapixel camera designed for the upcoming Dark Energy Survey (DES). It consists of sixty-two 4k x 2k and twelve 2k x 2k 250-micron-thick fully depleted CCDs, with a focal plane of 44 cm in diameter and a field of view of 2.2 square degrees. It will be attached to the Blanco 4-meter telescope at CTIO. The DES will cover 5000 square degrees of the southern galactic cap in 5 color bands (g, r, i, z, Y) over 5 years starting in 2011. To achieve the science goal of constraining the evolution of Dark Energy, stringent requirements are laid down for the design of DECam. Among them, the flatness of the focal plane needs to be controlled within a 60-micron envelope in order to achieve the specified PSF variation limit. It is very challenging to measure the flatness of the focal plane to such precision when it is placed in a high-vacuum dewar at 173 K. We developed two image-based techniques to measure the flatness of the focal plane. By imaging a regular grid of dots on the focal plane, the CCD offset along the optical axis is converted into a variation of the grid spacings at different positions on the focal plane. After extracting the patterns and comparing the changes in spacing, we can measure the flatness to high precision. In method 1, the regular dots are kept to high sub-micron precision and cover the whole focal plane. In method 2, no high precision for the grid is required. Instead, we use a precise XY stage to move the pattern across the whole focal plane and compare the variations of the spacing when it is imaged by different CCDs. Simulations and real measurements show that the two methods work very well for our purpose and are in good agreement with direct optical measurements.

  1. A new paradigm for video cameras: optical sensors

    NASA Astrophysics Data System (ADS)

    Grottle, Kevin; Nathan, Anoo; Smith, Catherine

    2007-04-01

    This paper presents a new paradigm for the utilization of video surveillance cameras as optical sensors to augment and significantly improve the reliability and responsiveness of chemical monitoring systems. Incorporated into a hierarchical tiered sensing architecture, cameras serve as 'Tier 1' or 'trigger' sensors monitoring for visible indications after a release of warfare or industrial toxic chemical agents. No single sensor today detects the full range of these agents, but exposure is harmful and yields visible 'duress' behaviors. Duress behaviors range from simple to complex types of observable signatures. By incorporating optical sensors in a tiered sensing architecture, the resulting alarm signals based on these behavioral signatures increase the range of detectable toxic chemical agent releases and allow timely confirmation of an agent release. Given the rapid onset of duress-type symptoms, an optical sensor can detect the presence of a release almost immediately. This provides cues for a monitoring system to send air samples to a higher-tiered chemical sensor, quickly launch protective mitigation steps, and notify an operator to inspect the area using the camera's video signal well before the chemical agent can disperse widely throughout a building.

  2. Refocusing images and videos with a conventional compact camera

    NASA Astrophysics Data System (ADS)

    Kang, Lai; Wu, Lingda; Wei, Yingmei; Song, Hanchen; Yang, Zheng

    2015-03-01

    Digital refocusing is an interesting and useful tool for generating dynamic depth-of-field (DOF) effects in many types of photography such as portraits and creative photography. Since most existing digital refocusing methods rely on a four-dimensional light field captured by special, precisely manufactured devices or on a sequence of images captured by a single camera, existing systems are either too expensive for wide practical use or incapable of handling dynamic scenes. We present a low-cost approach for refocusing high-resolution (up to 8-megapixel) images and videos based on a single shot using an easy-to-build camera-mirror stereo system. Our proposed method consists of four main steps, namely system calibration, image rectification, disparity estimation, and refocusing rendering. The effectiveness of our proposed method has been evaluated extensively using both static and dynamic scenes with various depth ranges. Promising experimental results demonstrate that our method is able to simulate various controllable, realistic DOF effects. To the best of our knowledge, our method is the first that allows one to refocus high-resolution images and videos of dynamic scenes captured by a conventional compact camera.
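
    The refocusing-rendering step typically blurs each pixel in proportion to its distance, in disparity, from the chosen focus plane. A minimal sketch of that per-pixel blur-radius computation, under a thin-lens-style assumption rather than the paper's exact renderer:

```python
import numpy as np

def blur_radius_map(disparity, focus_disparity, aperture=2.0):
    """Per-pixel blur radius for synthetic refocusing: pixels at the chosen
    focus disparity stay sharp, others blur in proportion to their
    disparity offset. The aperture scale is a free rendering parameter."""
    return aperture * np.abs(np.asarray(disparity, dtype=float) - focus_disparity)
```

    A renderer would then convolve each pixel with a kernel of the computed radius; varying focus_disparity over time produces the dynamic DOF effects the abstract mentions.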

  3. Fast auto-acquisition tomography tilt series by using HD video camera in ultra-high voltage electron microscope.

    PubMed

    Nishi, Ryuji; Cao, Meng; Kanaji, Atsuko; Nishida, Tomoki; Yoshida, Kiyokazu; Isakozawa, Shigeto

    2014-11-01

    The ultra-high voltage electron microscope (UHVEM) H-3000, with the world's highest acceleration voltage of 3 MV, can observe remarkable three-dimensional microstructures of microns-thick samples[1]. Acquiring a tilt series for electron tomography is laborious work, and thus an automatic technique is highly desired. We proposed the Auto-Focus system using image Sharpness (AFS)[2,3] for UHVEM tomography tilt series acquisition. In the method, five images with different defocus values are first acquired and their image sharpness is calculated. The sharpness values are then fitted to a quasi-Gaussian function to decide the best focus value[3]. Defocused images acquired by the slow-scan CCD (SS-CCD) camera (Hitachi F486BK) are of high quality, but one minute is taken for acquisition of five defocused images. In this study, we introduce a high-definition video camera (HD video camera; Hamamatsu Photonics K.K. C9721S) for fast acquisition of images[4]. It is an analog camera, but the camera image is captured by a PC and the effective image resolution is 1280 × 1023 pixels. This resolution is lower than that of the SS-CCD camera of 4096 × 4096 pixels. However, the HD video camera captures one image in only 1/30 second. In exchange for the faster acquisition, the S/N of the images is low. To improve the S/N, 22 captured frames are integrated so that each image sharpness value has a sufficiently low fitting error. As a countermeasure against the low resolution, we selected a large defocus step, typically five times the manual defocus step, to discriminate between different defocused images. By using the HD video camera for the autofocus process, the time consumed by each autofocus procedure was reduced to about six seconds. It took one second to correct the image position, so the total correction time was seven seconds, shorter by one order of magnitude than that using the SS-CCD camera. When we used the SS-CCD camera for final image capture, it took 30 seconds to record one tilt image. We can obtain a tilt series of 61 images within 30 minutes. Accuracy and repeatability were good enough for practical use (Figure 1). We successfully reduced the total acquisition time of a tomography tilt series to half of what it was before. Fig. 1. Objective lens current change with tilt angle during acquisition of a tomography series (sample: a rat hepatocyte, thickness: 2 μm, magnification: 4k, acc. voltage: 2 MV). Tilt angle range is 60 degrees with a 2 degree step angle. Two series were acquired in the same area. Both data sets were almost the same and the deviation was smaller than the minimum manual step, so auto-focus worked well. We also developed computer-aided three-dimensional (3D) visualization and analysis software for electron tomography, "HawkC", which can sectionalize 3D data semi-automatically[5,6]. If this auto-acquisition system is used with the IMOD reconstruction software[7] and the HawkC software, we will be able to do on-line UHVEM tomography. The system would help pathology examination in the future. This work was supported by the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan, under a Grant-in-Aid for Scientific Research (Grant No. 23560024, 23560786), and SENTAN, Japan Science and Technology Agency, Japan. PMID:25359822
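
    The AFS fitting step, which models sharpness as a (quasi-)Gaussian function of defocus, can be sketched by fitting a parabola to log-sharpness; the exact functional form used in the paper may differ:

```python
import numpy as np

def best_focus(defocus, sharpness):
    """Estimate the best focus from a handful of (defocus, sharpness)
    samples by fitting log-sharpness with a parabola, i.e. treating
    sharpness as a Gaussian of defocus."""
    a, b, _ = np.polyfit(np.asarray(defocus, dtype=float),
                         np.log(np.asarray(sharpness, dtype=float)), 2)
    return -b / (2 * a)  # vertex of the parabola = Gaussian centre
```

    With five defocus samples, as in the abstract, this reduces autofocus to five image captures, five sharpness evaluations, and one closed-form fit.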

  4. Measurement of time varying temperature fields using visible imaging CCD cameras

    SciTech Connect

    Keanini, R.G.; Allgood, C.L.

    1996-12-31

    A method for measuring time-varying surface temperature distributions using high frame rate visible imaging CCD cameras is described. The technique is based on an ad hoc model relating measured radiance to local surface temperature. This approach is based on the fairly non-restrictive assumptions that atmospheric scattering and absorption, and secondary emission and reflection are negligible. In order to assess performance, both concurrent and non-concurrent calibration and measurement, performed under dynamic thermal conditions, are examined. It is found that measurement accuracy is comparable to the theoretical accuracy predicted for infrared-based systems. In addition, performance tests indicate that in the experimental system, real-time calibration can be achieved while real-time whole-field temperature measurements require relatively coarse spatial resolution. The principal advantages of the proposed method are its simplicity and low cost. In addition, since independent temperature measurements are used for calibration, emissivity remains unspecified, so that a potentially significant source of error is eliminated.
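
    The concurrent-calibration idea, relating measured radiance to independently measured temperature, can be sketched with a simple fit; the paper's ad hoc model form is not specified in the abstract, so the polynomial used here is an assumption:

```python
import numpy as np

def calibrate(radiance, temperature, degree=2):
    """Fit a radiance-to-temperature mapping from concurrent calibration
    samples (a polynomial stands in for the paper's ad hoc model)."""
    return np.polyfit(radiance, temperature, degree)

def to_temperature(coeffs, radiance):
    """Convert measured radiance to temperature via the fitted model."""
    return np.polyval(coeffs, radiance)
```

    Because the mapping is fitted against independent temperature measurements, the surface emissivity never has to be specified explicitly, which is the point the abstract makes.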

  5. On the performance of optical filters for the XMM focal plane CCD-camera EPIC

    NASA Astrophysics Data System (ADS)

    Stephan, K.-H.; Reppin, C.; Maier, H. J.; Frischke, D.; Fuchs, D.; Müller, P.; Moeller, S.; Gürtler, P.

    1995-02-01

    Optical filters have been developed for the X-ray astronomy project XMM (X-ray Multi Mirror Mission) [1] of ESA, where specific CCDs will serve as focal plane cameras on board the observatory. These detectors are sensitive from the X-ray to the NIR (near infrared) spectral region. For observations in X-ray astronomy an optical filter must be placed in front of the CCD, suppressing visible and UV (ultraviolet) radiation of stars by more than 6 orders of magnitude while being highly transparent at photon energies above 100 eV. The flight model filter is designed to have an effective area of 73 mm diameter without making use of a supporting grid. Efforts have been made to utilize plastic foils to tailor filters meeting these specific requirements. It was found, that a typical filter could be composed, e.g., of a polypropylene foil of 20 μg/cm2 thickness serving as a carrier, coated with metallic films of Al or Al and Sn of about 20-25 μg/cm2 thickness. Other possible carriers are polycarbonate (Lexan, Macrolon) and poly-para-xylylene (Parylene N) films of similar thicknesses. The preparation and characterization of these three types of carrier foils as well as of two sample filters is described, including mechanical tests as well as optical transmission measurements in the photon energy range from 1 eV to 2 keV.

  6. Simultaneous Camera Path Optimization and Distraction Removal for Improving Amateur Video.

    PubMed

    Zhang, Fang-Lue; Wang, Jue; Zhao, Han; Martin, Ralph R; Hu, Shi-Min

    2015-12-01

    A major difference between amateur and professional video lies in the quality of camera paths. Previous work on video stabilization has considered how to improve amateur video by smoothing the camera path. In this paper, we show that additional changes to the camera path can further improve video aesthetics. Our new optimization method achieves multiple simultaneous goals: 1) stabilizing video content over short time scales; 2) ensuring simple and consistent camera paths over longer time scales; and 3) improving scene composition by automatically removing distractions, a common occurrence in amateur video. Our approach uses an L1 camera path optimization framework, extended to handle multiple constraints. Two passes of optimization are used to address both low-level and high-level constraints on the camera path. The experimental and user study results show that our approach outputs video that is perceptually better than the input, or the results of using stabilization only. PMID:26513791

  7. Field-programmable gate array-based hardware architecture for high-speed camera with KAI-0340 CCD image sensor

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Yan, Su; Zhou, Zuofeng; Cao, Jianzhong; Yan, Aqi; Tang, Linao; Lei, Yangjie

    2013-08-01

    We present a field-programmable gate array (FPGA)-based hardware architecture for a high-speed camera with fast auto-exposure control and colour filter array (CFA) demosaicing. The proposed hardware architecture includes the design of the charge-coupled device (CCD) drive circuits, image processing circuits, and power supply circuits. The CCD drive circuits convert the TTL (Transistor-Transistor Logic) level timing sequences produced by the image processing circuits into the timing sequences under which the CCD image sensor can output analog image signals. The image processing circuits convert the analog signals to digital signals, which are then processed further; the TTL timing, auto-exposure control, CFA demosaicing, and gamma correction are accomplished in this module. The power supply circuits provide power for the whole system, which is very important for image quality: power supply noise affects image quality directly, and we reduce it by hardware means, which is very effective. In this system, the CCD is the KAI-0340, which can output 210 full-resolution frames per second, and our camera works outstandingly in this mode. Traditional auto-exposure control algorithms reach a proper exposure level so slowly that a fast auto-exposure control method is necessary, so we present a new auto-exposure algorithm suited to high-speed cameras. Color demosaicing is critical for digital cameras because it converts the Bayer sensor mosaic output to a full color image, which determines the output image quality of the camera. Complex algorithms can achieve high quality but cannot be implemented in hardware, so a low-complexity demosaicing method is presented which can be implemented in hardware and satisfies the quality requirements. Experimental results are given at the end of this paper.

  8. Social Justice through Literacy: Integrating Digital Video Cameras in Reading Summaries and Responses

    ERIC Educational Resources Information Center

    Liu, Rong; Unger, John A.; Scullion, Vicki A.

    2014-01-01

    Drawing data from an action-oriented research project for integrating digital video cameras into the reading process in pre-college courses, this study proposes using digital video cameras in reading summaries and responses to promote critical thinking and to teach social justice concepts. The digital video research project is founded on…

  9. Automatic radial distortion correction in zoom lens video camera

    NASA Astrophysics Data System (ADS)

    Kim, Daehyun; Shin, Hyoungchul; Oh, Juhyun; Sohn, Kwanghoon

    2010-10-01

    We present a novel method for automatically correcting the radial lens distortion in a zoom lens video camera system. We first define the zoom lens distortion model using an inherent characteristic of the zoom lens. Next, we sample some video frames with different focal lengths and estimate their radial distortion parameters and focal lengths. We then optimize the zoom lens distortion model with preestimated parameter pairs using the least-squares method. For more robust optimization, we divide the sample images into two groups according to distortion types (i.e., barrel and pincushion) and then separately optimize the zoom lens distortion models with respect to divided groups. Our results show that the zoom lens distortion model can accurately represent the radial distortion of a zoom lens.
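    The paper's zoom-lens model ties the distortion parameters to focal length; those details are not reproduced in the abstract. As a minimal sketch, the widely used even-order polynomial radial model below shows what correcting a single frame means once a distortion coefficient has been estimated (the coefficients are illustrative, not values from the paper):

    ```python
    def undistort_point(xd, yd, k1, k2=0.0, cx=0.0, cy=0.0):
        """Map a distorted image point to its corrected position using the
        even-order polynomial radial model r_u = r_d*(1 + k1*r^2 + k2*r^4),
        with (cx, cy) the distortion centre. The sign of k1 distinguishes
        barrel from pincushion distortion, matching the two groups the
        abstract optimizes separately."""
        x, y = xd - cx, yd - cy
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        return cx + x * scale, cy + y * scale
    ```

    Fitting k1 (and k2) against focal length over the sampled frames, as the abstract describes, is then an ordinary least-squares problem.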

  10. Study of pixel damages in CCD cameras irradiated at the neutron tomography facility of IPEN-CNEN/SP

    NASA Astrophysics Data System (ADS)

    Pugliesi, R.; Andrade, M. L. G.; Dias, M. S.; Siqueira, P. T. D.; Pereira, M. A. S.

    2015-12-01

    A methodology is proposed for investigating the damage caused to CCD sensors by the radiation beams of neutron tomography facilities. The methodology was developed at the facility installed at the nuclear research reactor of IPEN-CNEN/SP, and the damage was evaluated by counting white spots in images. The damage production rate at the main camera position was evaluated to be in the range of 0.008 to 0.040 damages per second. For this range, only 4 to 20 CCD pixels are damaged per tomography, assuring high-quality images for hundreds of tomograms. Since the present methodology can quantify the damage production rate for each type of radiation, it can also be used at other facilities to improve the radiation shielding close to the CCD sensors.
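    The two quoted ranges are mutually consistent if one tomography involves roughly 500 s of beam exposure; that acquisition time is an inference for illustration, not a figure stated in the record. A quick check:

    ```python
    def damaged_pixels(rate_per_s, acquisition_s):
        """Expected number of newly damaged pixels (white spots)
        accumulated during one tomography at the given damage rate."""
        return rate_per_s * acquisition_s

    # An assumed ~500 s acquisition reproduces the quoted 4-20 damaged
    # pixels per tomography from the quoted 0.008-0.040 damages/s rate.
    low = damaged_pixels(0.008, 500.0)
    high = damaged_pixels(0.040, 500.0)
    ```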

  11. A dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. I - Optical and detector systems

    NASA Technical Reports Server (NTRS)

    Meyer, S. S.; Ricker, G. R.

    1980-01-01

    The MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes), an instrument capable of simultaneously performing both direct imaging and spectrometry of faint objects, is examined. An optical layout is given of the instrument which uses two CCD's mounted on the same temperature regulated detector block. Two sources of noise on the signal are discussed: (1) the CCD readout noise, which results in a constant uncertainty in the number of electrons collected from each pixel; and (2) the photon counting noise. The sensitivity of the device is limited by the sky brightness, the overall quantum efficiency, the resolution, and the readout noise of the CCD. Therefore, total system efficiency is calculated at about 15%.
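    The two noise sources listed above combine in the standard CCD signal-to-noise expression. The sketch below (ignoring dark current, a simplification not stated in the record) shows how readout noise contributes a constant per-pixel variance term while photon counting noise scales with the collected source and sky signal:

    ```python
    import math

    def ccd_snr(source_e, sky_e_per_px, read_noise_e, n_pixels):
        """Signal-to-noise ratio for a source summed over n_pixels:
        Poisson noise from source and sky electrons plus a constant
        readout-noise variance of read_noise_e**2 per pixel."""
        variance = (source_e
                    + n_pixels * sky_e_per_px        # sky photon noise
                    + n_pixels * read_noise_e ** 2)  # readout noise term
        return source_e / math.sqrt(variance)
    ```

    For bright sources the ratio approaches sqrt(source_e), while for faint sources the sky and readout terms dominate, which is exactly the sensitivity limit the abstract describes.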

  12. Non-mydriatic, wide field, fundus video camera

    NASA Astrophysics Data System (ADS)

    Hoeher, Bernhard; Voigtmann, Peter; Michelson, Georg; Schmauss, Bernhard

    2014-02-01

    We describe a method we call "stripe field imaging" that is capable of capturing wide-field color fundus videos and images of the human eye at pupil sizes of 2 mm. This means that it can be used with a non-dilated pupil even in bright ambient light. We realized a mobile demonstrator to prove the method, and we successfully acquired color fundus videos of subjects. We designed the demonstrator as a low-cost device consisting of mass-market components to show that no major additional technical outlay is needed to realize the improvements we propose. The technical core idea of our method is breaking the rotational symmetry of the optical design found in many conventional fundus cameras. By this measure we could extend the possible field of view (FOV) at a pupil size of 2 mm from a circular field 20° in diameter to a field of 68° by 18°. We acquired a fundus video while the subject was slightly touching and releasing the lid. The resulting video showed changes at vessels in the region of the papilla and a change in the paleness of the papilla.

  13. Scientists Behind the Camera - Increasing Video Documentation in the Field

    NASA Astrophysics Data System (ADS)

    Thomson, S.; Wolfe, J.

    2013-12-01

    Over the last two years, Skypunch Creative has designed and implemented a number of pilot projects to increase the amount of video captured by scientists in the field. The major barrier to success that we tackled with the pilot projects was the conflict between the demands on scientists' time, space, and storage in the field and the demands of shooting high-quality video. Our pilots involved providing scientists with equipment, varying levels of instruction on shooting in the field, and post-production resources (editing and motion graphics). In each project, the scientific team was provided with cameras (or additional equipment if they owned their own), tripods, and sometimes sound equipment, as well as an external hard drive to return the footage to us. Upon receiving the footage we professionally filmed follow-up interviews and created animations and motion graphics to illustrate their points. We also helped with the distribution of the final product (http://climatescience.tv/2012/05/the-story-of-a-flying-hippo-the-hiaper-pole-to-pole-observation-project/ and http://climatescience.tv/2013/01/bogged-down-in-alaska/). The pilot projects were a success. Most of the scientists returned asking for additional gear and support for future field work. Moving out of the pilot phase, to continue the project we have produced a 14-page guide for scientists shooting in the field based on lessons learned; it contains key tips and best-practice techniques for shooting high-quality footage in the field. We have also expanded the project and are now testing the use of video cameras that can be synced with sensors so that the footage is useful both scientifically and artistically. Extract from "A Scientist's Guide to Shooting Video in the Field".

  14. CCD Video Observation of Microgravity Crystallization of Lysozyme and Correlation with Accelerometer Data

    NASA Technical Reports Server (NTRS)

    Snell, E. H.; Boggon, T. J.; Helliwell, J. R.; Moskowitz, M. E.; Nadarajah, A.

    1997-01-01

    Lysozyme has been crystallized using the ESA Advanced Protein Crystallization Facility onboard the NASA Space Shuttle Orbiter during the IML-2 mission. CCD video monitoring was used to follow the crystallization process and evaluate the growth rate. During the mission some tetragonal crystals were observed moving over distances of up to 200 micrometers. This was correlated with microgravity disturbances caused by firings of vernier jets on the Orbiter. Growth-rate measurements of a stationary crystal (which had nucleated on the growth reactor wall) showed spurts and lulls correlated with an onboard activity: astronaut exercise. The stepped growth rates may be responsible for the residual mosaic block structure seen in crystal mosaicity and topography measurements.

  15. Photometric correction and reflectance calculation for lunar images from the Chang'E-1 CCD stereo camera.

    PubMed

    Chen, Chao; Qin, Qiming; Chen, Li; Zheng, Hong; Fa, Wenzhe; Ghulam, Abduwasit; Zhang, Chengye

    2015-12-01

    Photometric correction and reflectance calculation are two important processes in the scientific analysis and application of Chang'E-1 (CE-1) charge-coupled device (CCD) stereo camera data. In this paper, the methods of photometric correction and reflectance calculation were developed. On the one hand, in considering the specificity of datasets acquired by the CE-1 CCD stereo camera, photometric correction was conducted based on the digital number value directly using the revised Lommel-Seeliger factor. On the other hand, on the basis of laboratory-measured bidirectional reflectances, the relative reflectance was then calculated using the empirical linear model. The presented approach can be used to identify landing sites, obtain global images, and produce topographic maps of the lunar surface. PMID:26831395
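    The paper's revised Lommel-Seeliger factor is not reproduced in the abstract. As an illustrative sketch, the classic Lommel-Seeliger disk function below shows how a digital number observed at incidence angle i and emission angle e can be normalized to a standard photometric geometry; the i = 30°, e = 0° standard is a common lunar convention assumed here, not taken from the paper.

    ```python
    import math

    def lommel_seeliger(i_deg, e_deg):
        """Classic Lommel-Seeliger disk function cos(i) / (cos(i) + cos(e))."""
        ci = math.cos(math.radians(i_deg))
        ce = math.cos(math.radians(e_deg))
        return ci / (ci + ce)

    def photometric_correction(dn, i_deg, e_deg, i0_deg=30.0, e0_deg=0.0):
        """Scale a DN observed at geometry (i, e) to the standard geometry
        (i0, e0) so pixels imaged under different illumination compare."""
        return dn * lommel_seeliger(i0_deg, e0_deg) / lommel_seeliger(i_deg, e_deg)
    ```

    Applying the correction directly to DN values, as the abstract describes, defers absolute calibration to the separate empirical-linear reflectance step.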

  16. Using the Separation of Double Stars to Obtain the Plate Scale of a Telescope with a CCD Camera Attached

    NASA Astrophysics Data System (ADS)

    Muller, R. J.; Cersosimo, J. C.; Centeno, D.; Miranda, V.; Rivera-Rivera, L.; Franco, E.; Morales, K.

    2008-07-01

    A new CCD Camera was coupled to the NURO telescope in March 2006. We used the separation of selected binary stars in the Washington Double Star Catalog to calculate the new plate scale. The value of the plate scale obtained was, within the error bar, in agreement with the design (theoretical) value. We also report the position angle and separation obtained for these selected stars.
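    The computation behind such a calibration is compact: the catalogued angular separation of a pair divided by the separation measured in pixels gives the plate scale directly. A minimal sketch (the coordinates in the test are made up for illustration):

    ```python
    import math

    def plate_scale(sep_arcsec, x1, y1, x2, y2):
        """Plate scale in arcsec/pixel from a double star of known angular
        separation whose components are measured at pixel positions
        (x1, y1) and (x2, y2) on the CCD frame."""
        sep_pixels = math.hypot(x2 - x1, y2 - y1)
        return sep_arcsec / sep_pixels
    ```

    Averaging over many catalogued pairs, as the record does with the Washington Double Star Catalog, reduces measurement noise and yields the error bar quoted in the abstract.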

  17. Miniature, vacuum compatible 1024 {times} 1024 CCD camera for x-ray, ultra-violet, or optical imaging

    SciTech Connect

    Conder, A.D.; Dunn, J.; Young, B.K.F.

    1994-05-01

    We have developed a very compact (60 × 60 × 75 mm³), vacuum compatible, large format (25 × 25 mm², 1024 × 1024 pixels) CCD camera for digital imaging of visible and ultraviolet radiation, soft to penetrating x-rays (≤20 keV), and charged particles. This camera provides a suitable replacement for film, with a linear response, dynamic range, and intrinsic signal-to-noise response superior to current x-ray film, and provides real-time access to the data. The spatial resolution of the camera (<25 μm) is similar to typical digitization slit or step sizes used in processing film data. This new large format CCD camera has immediate applications as the recording device for streak cameras or gated microchannel-plate diagnostics, or, when used directly, as the detector for x-ray, xuv, or optical signals. This is especially important in studying high-energy plasmas produced in pulsed-power, ICF, and high-powered laser-plasma experiments, as well as in other medical and industrial applications.

  18. MOA-cam3: a wide-field mosaic CCD camera for a gravitational microlensing survey in New Zealand

    NASA Astrophysics Data System (ADS)

    Sako, T.; Sekiguchi, T.; Sasaki, M.; Okajima, K.; Abe, F.; Bond, I. A.; Hearnshaw, J. B.; Itow, Y.; Kamiya, K.; Kilmartin, P. M.; Masuda, K.; Matsubara, Y.; Muraki, Y.; Rattenbury, N. J.; Sullivan, D. J.; Sumi, T.; Tristram, P.; Yanagisawa, T.; Yock, P. C. M.

    2008-10-01

    We have developed a wide-field mosaic CCD camera, MOA-cam3, mounted at the prime focus of the Microlensing Observations in Astrophysics (MOA) 1.8-m telescope. The camera consists of ten E2V CCD4482 chips, each having 2k × 4k pixels, and covers a 2.2 deg² field of view with a single exposure. The optical system is well optimized to realize uniform image quality over this wide field. The chips are constantly cooled by a cryocooler at -80°C, at which temperature dark current noise is negligible for a typical 1-3 min exposure. The CCD output charge is converted to a 16-bit digital signal by the GenIII system (Astronomical Research Cameras Inc.) and readout is within 25 s. Readout noise of 2-3 ADU (rms) is also negligible. We prepared a wide-band red filter for an effective microlensing survey and also Bessell V, I filters for standard astronomical studies. Microlensing studies have entered into a new era, which requires more statistics, and more rapid alerts to catch exotic light curves. Our new system is a powerful tool to realize both these requirements.

  19. Classification of volcanic ash particles from Sakurajima volcano using CCD camera image and cluster analysis

    NASA Astrophysics Data System (ADS)

    Miwa, T.; Shimano, T.; Nishimura, T.

    2012-12-01

    Quantitative and speedy characterization of volcanic ash particles is needed for petrologic monitoring of ongoing eruptions. We develop a new simple system that uses CCD camera images to characterize ash properties quantitatively, and apply it to volcanic ash collected at Sakurajima. Our method characterizes volcanic ash particles by 1) apparent luminance through RGB filters and 2) a quasi-fractal dimension of the particle shape. Using a monochromatic CCD camera (Starshoot by Orion Co. LTD.) attached to a stereoscopic microscope, we capture digital images of ash particles that are set on a glass plate, under which white colored paper or a polarizing plate is placed. The images of 1390 x 1080 pixels are taken through three color filters (red, green and blue) under incident light and under light transmitted through the polarizing plate. Brightness of the light sources is kept constant, and luminance is calibrated against white and black colored papers. About fifteen ash particles are set on the plate at a time, and their images are saved in bitmap format. We first extract the outlines of particles from the image taken under light transmitted through the polarizing plate. Then, luminances for each color are represented by 256 tones at each pixel within the particles, and the average and its standard deviation are calculated for each ash particle. We also measure the quasi-fractal dimension (qfd) of ash particles. We perform box counting with boxes of 1 x 1 and 128 x 128 pixels that cover the area of the ash particle, and estimate the qfd as the ratio of the former count to the latter. These parameters are calculated using the software R. We characterize volcanic ash from the Showa crater of Sakurajima collected on two days (Feb 9, 2009, and Jan 13, 2010), and apply cluster analyses.
    Dendrograms are formed from the qfd and the following four parameters calculated from the luminance: Rf=R/(R+G+B), Gf=G/(R+G+B), Bf=B/(R+G+B), and total luminance=(R+G+B)/665. We classify the volcanic ash particles from the dendrograms into three groups based on Euclidean distance. The groups are named Group A, B and C in order of increasing average total luminance. The classification shows that the numbers of particles belonging to Groups A, B and C are 77, 25 and 6 in the Feb 9, 2009 sample, and 102, 19 and 6 in the Jan 13, 2010 sample, respectively. Examination under the stereoscopic microscope suggests that Groups A, B and C mainly correspond to juvenile, altered and free-crystal particles, respectively. The classification result of the present method therefore demonstrates a difference in the contribution of juvenile material between the two days. To evaluate the reliability of our classification, we classify pseudo-samples in which errors of 10% are added to the measured parameters. We apply our method to one thousand pseudo-samples, and the result shows that the numbers of particles classified into the three groups vary by less than 20% of the total of 235 particles. Our system can classify 120 particles within 6 minutes, so we can easily increase the number of ash particles, enabling us to improve the reliability and resolution of the classification and to speedily capture temporal changes in the properties of ash from active volcanoes.
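    The quasi-fractal dimension described above is just a ratio of box counts at two scales. A minimal sketch of that computation on a binary particle mask (the box sizes follow the record; the mask representation is an assumption for illustration):

    ```python
    def count_boxes(mask, box):
        """Count the box x box cells that contain at least one particle
        pixel. `mask` is a list of equal-length rows of 0/1 values."""
        rows, cols = len(mask), len(mask[0])
        hits = 0
        for top in range(0, rows, box):
            for left in range(0, cols, box):
                if any(mask[y][x]
                       for y in range(top, min(top + box, rows))
                       for x in range(left, min(left + box, cols))):
                    hits += 1
        return hits

    def quasi_fractal_dimension(mask, small=1, large=128):
        """Ratio of the small-box count to the large-box count, as the
        record defines the qfd."""
        return count_boxes(mask, small) / count_boxes(mask, large)
    ```

    Rougher, more convoluted outlines fill proportionally more of the small boxes than of the large ones, so the ratio discriminates particle shape.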

  20. Developing a CCD camera with high spatial resolution for RIXS in the soft X-ray range

    NASA Astrophysics Data System (ADS)

    Soman, M. R.; Hall, D. J.; Tutt, J. H.; Murray, N. J.; Holland, A. D.; Schmitt, T.; Raabe, J.; Schmitt, B.

    2013-12-01

    The Super Advanced X-ray Emission Spectrometer (SAXES) at the Swiss Light Source contains a high resolution Charge-Coupled Device (CCD) camera used for Resonant Inelastic X-ray Scattering (RIXS). Using the current CCD-based camera system, the energy-dispersive spectrometer has an energy resolution (E/ΔE) of approximately 12,000 at 930 eV. A recent study predicted that through an upgrade to the grating and camera system, the energy resolution could be improved by a factor of 2. In order to achieve this goal in the spectral domain, the spatial resolution of the CCD must be improved to better than 5 μm from the current 24 μm spatial resolution (FWHM). The 400 eV-1600 eV energy X-rays detected by this spectrometer primarily interact within the field free region of the CCD, producing electron clouds which will diffuse isotropically until they reach the depleted region and buried channel. This diffusion of the charge leads to events which are split across several pixels. Through the analysis of the charge distribution across the pixels, various centroiding techniques can be used to pinpoint the spatial location of the X-ray interaction to the sub-pixel level, greatly improving the spatial resolution achieved. Using the PolLux soft X-ray microspectroscopy endstation at the Swiss Light Source, a beam of X-rays of energies from 200 eV to 1400 eV can be focused down to a spot size of approximately 20 nm. Scanning this spot across the 16 μm square pixels allows the sub-pixel response to be investigated. Previous work has demonstrated the potential improvement in spatial resolution achievable by centroiding events in a standard CCD. An Electron-Multiplying CCD (EM-CCD) has been used to improve the signal to effective readout noise ratio achieved resulting in a worst-case spatial resolution measurement of 4.5±0.2 μm and 3.9±0.1 μm at 530 eV and 680 eV respectively. 
A method is described that allows the contribution of the X-ray spot size to be deconvolved from these worst-case resolution measurements, estimating the spatial resolution to be approximately 3.5 μm and 3.0 μm at 530 eV and 680 eV, well below the resolution limit of 5 μm required to improve the spectral resolution by a factor of 2.
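    The centroiding step described above can be as simple as a charge-weighted mean over the pixel window containing a split event; more elaborate eta-function or model-fitting estimators exist, but the sketch below shows the basic idea (the window contents in the test are illustrative):

    ```python
    def event_centroid(window, x0=0.0, y0=0.0):
        """Sub-pixel position of a split X-ray event: charge-weighted
        centroid over a small window (e.g. 3x3) around the brightest
        pixel, whose top-left pixel sits at image coordinates (x0, y0)."""
        total = sx = sy = 0.0
        for i, row in enumerate(window):
            for j, charge in enumerate(row):
                total += charge
                sx += charge * (x0 + j)
                sy += charge * (y0 + i)
        return sx / total, sy / total
    ```

    The estimate improves as the charge cloud spreads over more pixels and as readout noise falls, which is why the EM-CCD's higher signal-to-effective-noise ratio tightens the spatial resolution.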

  1. Imaging tissues with a polarized light video camera

    NASA Astrophysics Data System (ADS)

    Jacques, Steven L.; Lee, Kenneth

    1999-09-01

    A method for imaging the superficial epidermal and papillary dermal layers of the skin is needed when assessing many skin lesions. We have developed an imaging modality using a video camera whose mechanism of contrast is the reflectance of polarized light from superficial skin. By selecting only polarized light to create the image, one rejects the large amount of diffusely reflected light from the deeper dermis. The specular reflectance (or glare) from the skin surface is also avoided in the setup. The resulting polarization picture maximally accents the details of the superficial layer of the skin and removes the effects of melanin pigmentation from the image. For example, freckles simply disappear and nevi lose their dark pigmentation to reveal the details of abnormal cellular growth. An initial clinical study demonstrated that the polarization camera could identify the margins of a sclerosing basal cell carcinoma that the doctor's unaided eye underestimated: the camera identified an 11-mm-diameter lesion while the unaided eye identified a 6-mm-diameter lesion.
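    Polarization imaging of this kind is commonly implemented by capturing one frame through an analyzer parallel to the illumination polarization and one through a crossed analyzer, then forming a normalized per-pixel difference. The record does not give its exact image arithmetic, so the sketch below is a generic version of that scheme:

    ```python
    def polarization_image(i_par, i_per):
        """Per-pixel normalized difference (Ipar - Iper) / (Ipar + Iper).
        Superficially reflected light retains the source polarization and
        survives the subtraction; multiply scattered light from the deeper
        dermis is depolarized, contributes equally to both frames, and
        cancels. Frames are lists of equal-length rows of intensities."""
        return [[(p - s) / (p + s) if (p + s) else 0.0
                 for p, s in zip(row_par, row_per)]
                for row_par, row_per in zip(i_par, i_per)]
    ```

    Normalizing by the total intensity also divides out melanin absorption, which is consistent with the record's observation that freckles disappear from the image.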

  2. Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source

    SciTech Connect

    M. J. Haugh and M. B. Schneider

    2008-10-31

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 μm square pixels, and 15 μm thick. A multi-anode Manson X-ray source, operating up to 10 kV and 10 W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 10. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within 1% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.

  3. Video-Camera-Based Position-Measuring System

    NASA Technical Reports Server (NTRS)

    Lane, John; Immer, Christopher; Brink, Jeffrey; Youngquist, Robert

    2005-01-01

    A prototype optoelectronic system measures the three-dimensional relative coordinates of objects of interest or of targets affixed to objects of interest in a workspace. The system includes a charge-coupled-device video camera mounted in a known position and orientation in the workspace, a frame grabber, and a personal computer running image-data-processing software. Relative to conventional optical surveying equipment, this system can be built and operated at much lower cost; however, it is less accurate. It is also much easier to operate than are conventional instrumentation systems. In addition, there is no need to establish a coordinate system through cooperative action by a team of surveyors. The system operates in real time at around 30 frames per second (limited mostly by the frame rate of the camera). It continuously tracks targets as long as they remain in the field of the camera. In this respect, it emulates more expensive, elaborate laser tracking equipment that costs on the order of 100 times as much. Unlike laser tracking equipment, this system does not pose a hazard of laser exposure. Images acquired by the camera are digitized and processed to extract all valid targets in the field of view. The three-dimensional coordinates (x, y, and z) of each target are computed from the pixel coordinates of the targets in the images to an accuracy on the order of millimeters over distances on the order of meters. The system was originally intended specifically for real-time position measurement of payload transfers from payload canisters into the payload bay of the Space Shuttle Orbiters (see Figure 1). The system may be easily adapted to other applications that involve similar coordinate-measuring requirements. Examples of such applications include manufacturing, construction, preliminary approximate land surveying, and aerial surveying.
For some applications with rectangular symmetry, it is feasible and desirable to attach a target composed of black and white squares to an object of interest (see Figure 2). For other situations, where circular symmetry is more desirable, circular targets also can be created. Such a target can readily be generated and modified by use of commercially available software and printed by use of a standard office printer. All three relative coordinates (x, y, and z) of each target can be determined by processing the video image of the target. Because of the unique design of corresponding image-processing filters and targets, the vision-based position- measurement system is extremely robust and tolerant of widely varying fields of view, lighting conditions, and varying background imagery.
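    The record does not detail the pixel-to-coordinate computation. Under a simple pinhole model with a target of known physical size, range follows from apparent size and the transverse coordinates from back-projection; the sketch below makes those assumptions explicit (the focal length in pixels and the principal point are illustrative calibration values, not figures from the record):

    ```python
    def target_position(u, v, width_px, target_width_m, f_px, cx, cy):
        """Rough 3-D position of a target of known physical width seen by
        a single calibrated pinhole camera: depth from apparent size,
        then back-projection of the target's pixel coordinates (u, v)."""
        z = f_px * target_width_m / width_px  # range along the optical axis
        x = (u - cx) * z / f_px
        y = (v - cy) * z / f_px
        return x, y, z
    ```

    Millimeter-level accuracy over meter-scale distances, as quoted above, then comes down to how precisely the target edges can be located in the image.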

  4. Deep-Sea Video Cameras Without Pressure Housings

    NASA Technical Reports Server (NTRS)

    Cunningham, Thomas

    2004-01-01

    Underwater video cameras of a proposed type (and, optionally, their light sources) would not be housed in pressure vessels. Conventional underwater cameras and their light sources are housed in pods that keep the contents dry and maintain interior pressures of about 1 atmosphere (~0.1 MPa). Pods strong enough to withstand the pressures at great ocean depths are bulky, heavy, and expensive. Elimination of the pods would make it possible to build camera/light-source units that would be significantly smaller, lighter, and less expensive. The depth ratings of the proposed camera/light-source units would be essentially unlimited because the strengths of their housings would no longer be an issue. A camera according to the proposal would contain an active-pixel image sensor and readout circuits, all in the form of a single silicon-based complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. As long as none of the circuitry and none of the electrical leads were exposed to seawater, which is electrically conductive, silicon integrated-circuit chips could withstand the hydrostatic pressure of even the deepest ocean. The pressure would change the semiconductor band gap by only a slight amount, not enough to degrade imaging performance significantly. Electrical contact with seawater would be prevented by potting the integrated-circuit chip in a transparent plastic case. The electrical leads for supplying power to the chip and extracting the video signal would also be potted, though not necessarily in the same transparent plastic. The hydrostatic pressure would tend to compress the plastic case and the chip equally on all sides; there would be no need for great strength because there would be no need to hold back high pressure on one side against low pressure on the other side. A light source suitable for use with the camera could consist of light-emitting diodes (LEDs). Like integrated-circuit chips, LEDs can withstand very large hydrostatic pressures. If power-supply regulators or filter capacitors were needed, these could be attached in chip form directly onto the back of, and potted with, the imager chip. Because CMOS imagers dissipate little power, the potting would not result in overheating. To minimize the cost of the camera, a fixed lens could be fabricated as part of the plastic case. For improved optical performance at greater cost, an adjustable glass achromatic lens would be mounted in a reservoir that would be filled with transparent oil and subject to the full hydrostatic pressure, and the reservoir would be mounted on the case to position the lens in front of the image sensor. The lens would be adjusted for focus by use of a motor inside the reservoir (oil-filled motors already exist).

  5. Characterization of OCam and CCD220: the fastest and most sensitive camera to date for AO wavefront sensing

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz-Garcia, José Javier

    2010-07-01

    For the first time, sub-electron read noise has been achieved with a camera suitable for astronomical wavefront-sensing (WFS) applications. The OCam system has demonstrated this performance at a 1300 Hz frame rate with a 240×240-pixel frame size. ESO and JRA2 OPTICON2 have jointly funded e2v technologies to develop a custom CCD for Adaptive Optics (AO) wavefront sensing applications. The device, called CCD220, is a compact Peltier-cooled 240×240 pixel frame-transfer 8-output back-illuminated sensor using EMCCD technology. This paper demonstrates sub-electron read noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e-/pixel/frame. It reports on the comprehensive, quantitative performance characterization of OCam and the CCD220, covering readout noise, dark current, multiplication gain, quantum efficiency, charge transfer efficiency, and more. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks, and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, was designed to offer enhanced performance, a completely sealed camera package, and an additional Peltier stage to facilitate operation on a telescope or in environmentally rugged applications. OCam2 offers two types of built-in data link to the Real Time Computer: the CameraLink industry-standard interface and various fiber link options such as the sFPDP interface. OCam2 also includes a modified mechanical design to ease the integration of microlens arrays for use of this camera in all types of wavefront-sensing AO systems. The front cover of OCam2 can be customized to include a microlens exchange mechanism.

  6. Nighttime Near Infrared Observations of Augustine Volcano Jan-Apr, 2006 Recorded With a Small Astronomical CCD Camera

    NASA Astrophysics Data System (ADS)

    Sentman, D.; McNutt, S.; Reyes, C.; Stenbaek-Nielsen, H.; Deroin, N.

    2006-12-01

    Nighttime observations of Augustine Volcano were made during Jan-Apr, 2006 using a small, unfiltered, astronomical CCD camera operating from Homer, Alaska. Time-lapse images of the volcano were made looking across the open water of Cook Inlet over a slant range of ~105 km. A variety of volcanic activities were observed that originated in near-infrared (NIR) 0.9-1.1 micron emissions, which were detectable at the upper limit of the camera passband but were otherwise invisible to the naked eye. These activities included various types of steam releases, pyroclastic flows, rockfalls, and debris flows that correlated very closely with seismic measurements made by instruments located within 4 km on the volcanic island. Specifically, flow events to the east (towards the camera) produced high amplitudes on the eastern seismic stations, and events presumably to the west were stronger on western stations. The ability to detect nighttime volcanic emissions in the NIR over large horizontal distances using standard silicon CCD technology, even in the presence of weak intervening fog, came as a surprise, and is due to a confluence of several mutually reinforcing factors: (1) thermal emission from the volcano hot enough (~1000 K) that the short-wavelength portion of the Planck radiation curve overlaps the upper portion (0.9-1.1 micron) of the sensitivity range of silicon CCD detectors; (2) the existence of several atmospheric transmission windows within the NIR passband of the camera, allowing the emissions to propagate with relatively small attenuation through more than 10 atmospheres of path length; and (3) in the case of fog, forward Mie scattering.
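    Factor (1) can be checked with Planck's law: at 1.0 micron, near the edge of a silicon CCD's response, the spectral radiance of ~1000 K material dwarfs that of cooler surroundings. A quick numerical sketch (the 600 K comparison temperature is an arbitrary choice for illustration):

    ```python
    import math

    H = 6.626e-34    # Planck constant, J*s
    C = 2.998e8      # speed of light, m/s
    KB = 1.381e-23   # Boltzmann constant, J/K

    def planck(lam_m, temp_k):
        """Blackbody spectral radiance B(lambda, T) in W / (m^2 * sr * m)."""
        return (2.0 * H * C**2 / lam_m**5) / math.expm1(
            H * C / (lam_m * KB * temp_k))

    # At 1.0 micron, ~1000 K material outshines 600 K material by roughly
    # four orders of magnitude, which is why hot flows stand out at night.
    ratio = planck(1.0e-6, 1000.0) / planck(1.0e-6, 600.0)
    ```

    This steep short-wavelength falloff of the Planck curve is why even modest temperature differences produce enormous contrast in the 0.9-1.1 micron band.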

  7. The Camera Is Not a Methodology: Towards a Framework for Understanding Young Children's Use of Video Cameras

    ERIC Educational Resources Information Center

    Bird, Jo; Colliver, Yeshe; Edwards, Susan

    2014-01-01

    Participatory research methods argue that young children should be enabled to contribute their perspectives on research seeking to understand their worldviews. Visual research methods, including the use of still and video cameras with young children have been viewed as particularly suited to this aim because cameras have been considered easy and…

  9. ATR/OTR-SY Tank Camera Purge System and in Tank Color Video Imaging System

    SciTech Connect

    Werry, S.M.

    1995-06-06

    This procedure will document the satisfactory operation of the 101-SY tank Camera Purge System (CPS) and the 101-SY in-tank Color Camera Video Imaging System (CCVIS). Included in the CPS is the nitrogen purging system safety interlock, which shuts down all the color video imaging system electronics within the 101-SY tank vapor space during loss of nitrogen purge pressure.

  10. Frequency Identification of Vibration Signals Using Video Camera Image Data

    PubMed Central

    Jeng, Yih-Nen; Wu, Chia-Hung

    2012-01-01

    This study showed that an image data acquisition system connecting a high-speed camera or webcam to a notebook or personal computer (PC) can precisely capture most of the dominant modes of a vibration signal, but may introduce non-physical modes induced by insufficient frame rates. Using a simple model, the frequencies of these modes are properly predicted and excluded. Two experimental designs, involving an LED light source and a vibration exciter, are proposed to demonstrate the performance. First, the original gray-level resolution of a video camera (for instance, 0 to 255 levels) was enhanced by summing the gray-level data of all pixels in a small region around the point of interest. The image signal was further enhanced by attaching a white paper sheet marked with a black line to the surface of the vibrating system in operation to increase the gray-level resolution. Experimental results showed that the Prosilica CV640C CMOS high-speed camera has a critical frequency of 60 Hz above which false modes are induced, whereas that of the webcam is 7.8 Hz. Several factors were shown to partially suppress the non-physical modes, but none can eliminate them completely. Two examples, whose prominent vibration modes are below the associated critical frequencies, are examined to demonstrate the performance of the proposed systems. In general, the experimental data show that non-contact image data acquisition systems are promising tools for collecting the low-frequency vibration signal of a system. PMID:23202026
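The false-mode mechanism described above is ordinary aliasing: any vibration component above half the frame rate folds back into the measured band. A minimal sketch (the frame rate and frequencies here are illustrative, not the paper's setup):

```python
import numpy as np

def dominant_frequency(samples, fps):
    """Return the dominant nonzero frequency (Hz) of a sampled intensity signal."""
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

fps = 60.0                       # hypothetical camera frame rate
t = np.arange(0, 10, 1.0 / fps)  # 10 s of frames

# A 12 Hz vibration (below the 30 Hz Nyquist limit) is recovered correctly...
true_mode = dominant_frequency(np.sin(2 * np.pi * 12.0 * t), fps)

# ...but a 70 Hz vibration folds back to |70 - 60| = 10 Hz: a non-physical mode.
false_mode = dominant_frequency(np.sin(2 * np.pi * 70.0 * t), fps)
print(true_mode, false_mode)
```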

  11. Photon-counting gamma camera based on columnar CsI(Tl) optically coupled to a back-illuminated CCD

    NASA Astrophysics Data System (ADS)

    Miller, Brian W.; Barber, H. Bradford; Barrett, Harrison H.; Chen, Liying; Taylor, Sean J.

    2007-03-01

    Recent advances have been made in a new class of CCD-based, single-photon-counting gamma-ray detectors which offer sub-100 µm intrinsic resolutions [1-7]. These detectors show great promise in small-animal SPECT and molecular imaging and exist in a variety of configurations. Typically, a columnar CsI(Tl) scintillator or a radiography screen (Gd2O2S:Tb) is imaged onto the CCD. Gamma-ray interactions are seen as clusters of signal spread over multiple pixels. When the detector is operated in charge-integration mode, signal spread across pixels results in spatial-resolution degradation. However, if the detector is operated in photon-counting mode, the gamma-ray interaction position can be estimated using either Anger (centroid) estimation or maximum-likelihood position estimation, resulting in a substantial improvement in spatial resolution [2]. Due to the low-light-level nature of the scintillation process, CCD-based gamma cameras implement an amplification stage in the CCD via electron multiplication (EMCCDs) [8-10] or via an image intensifier prior to the optical path [1]. We have applied ideas and techniques from previous systems to our high-resolution LumiSPECT detector [11, 12]. LumiSPECT is a dual-modality optical/SPECT small-animal imaging system which was originally designed to operate in charge-integration mode. It employs a cryogenically cooled, high-quantum-efficiency, back-illuminated large-format CCD and operates in single-photon-counting mode without any intermediate amplification process. Operating in photon-counting mode, the detector has an intrinsic spatial resolution of 64 µm, compared to 134 µm in integrating mode.
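Anger (centroid) estimation mentioned above reduces to an intensity-weighted mean of pixel coordinates over the event cluster. A minimal sketch on a synthetic light splash (not the LumiSPECT processing chain; the cluster shape and position are invented):

```python
import numpy as np

def anger_centroid(cluster):
    """Centroid (Anger) estimate of the interaction position from a pixel cluster.

    cluster: 2D array of background-subtracted signal from one scintillation event.
    Returns (row, col) in fractional (sub-pixel) coordinates.
    """
    total = cluster.sum()
    rows, cols = np.indices(cluster.shape)
    return (rows * cluster).sum() / total, (cols * cluster).sum() / total

# Synthetic scintillation light splash: a Gaussian centred at (10.3, 7.6) pixels.
rows, cols = np.indices((21, 21))
true_r, true_c = 10.3, 7.6
cluster = np.exp(-((rows - true_r) ** 2 + (cols - true_c) ** 2) / (2 * 2.0 ** 2))

est_r, est_c = anger_centroid(cluster)
print(est_r, est_c)  # sub-pixel position recovered from whole-pixel samples
```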

  12. Optical measurement of the pointing stability of the SOFIA Telescope using a fast EM-CCD camera

    NASA Astrophysics Data System (ADS)

    Pfüller, Enrico; Wolf, Jürgen; Röser, Hans-Peter

    2010-07-01

    The goal of the Stratospheric Observatory for Infrared Astronomy (SOFIA) is to point its airborne telescope at astronomical targets with a stability of 0.2 arcseconds (rms). However, the pointing stability will be affected in flight by aircraft vibrations and movements and by constantly changing aerodynamic conditions within the open telescope compartment. Model calculations indicate that initially the deviations from targets may be of the order of several arcseconds. The plan is to carefully analyse and characterize all disturbances and then gradually fine-tune the telescope's attitude control system to improve the pointing stability. To optically measure how star images change their position in the focal plane, an Andor DU-888 electron-multiplying (EM) CCD camera will be mounted to the telescope in place of its standard tracking camera. The new camera, dubbed the Fast Diagnostic Camera (FDC), has been extensively tested and characterized in the laboratory and on ground-based telescopes. In ground tests on the SOFIA telescope system it proved its capabilities by sampling star images at frame rates of up to 400 frames per second. From these data the star's location (centroid) in the focal plane can be calculated every 1/400th of a second, and by means of a Fourier transformation the star's movement power spectrum can be derived for frequencies up to 200 Hz. Eigenfrequencies and the overall shape of the measured spectrum confirm the previous model calculations. With known disturbances introduced to the telescope's fine drive system, the FDC data can be used to determine the system's transfer function. These data, when measured in flight, will be critical for the refinement of the attitude control system. Another subsystem of the telescope that was characterized using FDC data was the chopping secondary mirror. By monitoring a star centroid at high speed while chopping, the chopping mechanism and its properties could be analyzed. This paper will describe the EM-CCD camera and its characteristics and will report on the tests that led up to its first use in a SOFIA flight.

  13. Reliable camera motion estimation from compressed MPEG videos using machine learning approach

    NASA Astrophysics Data System (ADS)

    Wang, Zheng; Ren, Jinchang; Wang, Yubin; Sun, Meijun; Jiang, Jianmin

    2013-05-01

    As an important feature in characterizing video content, camera motion has been widely applied in various multimedia and computer vision applications. A novel method for fast and reliable estimation of camera motion from MPEG videos is proposed, using a support vector machine for estimation in a regression model trained on a synthesized sequence. Experiments conducted on real sequences show that the proposed method yields much improved results in estimating camera motions, while the difficulty of selecting valid macroblocks and motion vectors is avoided.
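The paper trains a support vector machine in a regression model; as a hedged illustration of the underlying idea (regressing global camera-motion parameters from a field of macroblock motion vectors), the sketch below substitutes plain least squares on a simple pan/tilt/zoom model. All values are synthetic:

```python
import numpy as np

# Simple global motion model: each macroblock centre (x, y) moves by
#   dx = pan + zoom * x,  dy = tilt + zoom * y
# The paper uses support-vector regression; plain least squares is used here
# only to illustrate fitting camera motion to a field of MPEG motion vectors.
rng = np.random.default_rng(0)
xs = rng.uniform(-1, 1, size=(200, 2))            # normalized block centres
pan, tilt, zoom = 0.8, -0.3, 0.05                 # synthetic camera motion
mv = np.column_stack([pan + zoom * xs[:, 0],
                      tilt + zoom * xs[:, 1]])
mv += rng.normal(scale=0.02, size=mv.shape)       # noisy motion vectors

# Stack both components into one least-squares system for (pan, tilt, zoom).
n = len(xs)
A = np.zeros((2 * n, 3))
A[:n, 0] = 1.0;  A[:n, 2] = xs[:, 0]              # dx rows
A[n:, 1] = 1.0;  A[n:, 2] = xs[:, 1]              # dy rows
b = np.concatenate([mv[:, 0], mv[:, 1]])
est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(est)   # approximately [pan, tilt, zoom]
```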

  14. CCD Mosaics

    NASA Astrophysics Data System (ADS)

    George, D.

    2002-05-01

    First-generation CCD cameras were relatively small (32k pixels to 90k pixels). Photographic film, in comparison, has literally millions of individual grains. In a practical sense, however, the resolution of a typical film image is about six megapixels. Nevertheless, this is over 60X larger than a first-generation camera. So while the high sensitivity of a CCD camera made it very effective for high resolution imaging, it could not compare with photographic film for wide field imaging. Commercially available CCD cameras now range up to 16 megapixels. Unfortunately, these large arrays come at an extremely high cost. Fortunately, 1-2 megapixel cameras are available in a price range within reach of many amateur astronomers. With this size of array, building 6-20 megapixel or larger mosaics is quite possible using suitable equipment.

  15. On the Complexity of Digital Video Cameras in/as Research: Perspectives and Agencements

    ERIC Educational Resources Information Center

    Bangou, Francis

    2014-01-01

    The goal of this article is to consider the potential for digital video cameras to produce as part of a research agencement. Our reflection will be guided by the current literature on the use of video recordings in research, as well as by the rhizoanalysis of two vignettes. The first of these vignettes is associated with a short video clip shot by…

  17. An exploration of the potential use of a CCD camera for absorption spectroscopy in scattered light at the rainbow

    NASA Astrophysics Data System (ADS)

    Card, J. B. A.; Jones, A. R.

    2003-02-01

    The advent of the CCD camera has made it possible to record light intensity as a function of two dimensions. In this paper, we have explored the camera's potential to measure spectra in scattered light at the rainbow by recording the intensity as a function of angle and wavelength. To this end, white light from a xenon arc lamp was scattered from water sprays containing various concentrations of water-soluble food dyes. Comparisons were made with theoretical spectra calculated using Mie theory. Qualitative agreement was excellent. Quantitative agreement was reasonable, but there were some discrepancies yet to be explained. Although the main rainbow is insensitive to the particle size distribution, if the concentration of the absorbing species is to be recovered accurately, an independent means of determining the particle sizes will be necessary.

  18. A high resolution Small Field Of View (SFOV) gamma camera: a columnar scintillator coated CCD imager for medical applications

    NASA Astrophysics Data System (ADS)

    Lees, J. E.; Bassford, D. J.; Blake, O. E.; Blackshaw, P. E.; Perkins, A. C.

    2011-12-01

    We describe a high resolution, small field of view (SFOV), Charge Coupled Device (CCD) based camera for imaging small volumes of radionuclide uptake in tissues. The Mini Gamma Ray Camera (MGRC) is a collimated, scintillator-coated, low cost, high performance imager using low noise CCDs. The prototype MGRC has a 600 µm thick layer of columnar CsI(Tl) and operates in photon counting mode, using a thermoelectric cooler to achieve an operating temperature of -10 °C. Collimation was performed using a pinhole collimator. We have measured the spatial resolution, energy resolution and efficiency using a number of radioisotope sources, including 140 keV gamma-rays from 99mTc in a specially designed phantom. We also describe our first imaging of a volunteer patient.

  19. [Research Award providing funds for a tracking video camera

    NASA Technical Reports Server (NTRS)

    Collett, Thomas

    2000-01-01

    The award provided funds for a tracking video camera. The camera has been installed and the system calibrated. It has enabled us to follow in real time the tracks of individual wood ants (Formica rufa) within a 3 m square arena as they navigate singly indoors guided by visual cues. To date we have been using the system on two projects. The first is an analysis of the navigational strategies that ants use when guided by an extended landmark (a low wall) to a feeding site. After a brief training period, ants are able to keep a defined distance and angle from the wall, using their memory of the wall's height on the retina as a controlling parameter. By training with walls of one height and length and testing with walls of different heights and lengths, we can show that ants adjust their distance from the wall so as to keep the wall at the height that they learned during training. Thus, their distance from the base of a tall wall is further than it is from the training wall, and the distance is shorter when the wall is low. The stopping point of the trajectory is defined precisely by the angle that the far end of the wall makes with the trajectory. Thus, ants walk further if the wall is extended in length and not so far if the wall is shortened. These experiments represent the first case in which the controlling parameters of an extended trajectory can be defined with some certainty. It raises many questions for future research that we are now pursuing.

  20. Electro-optical testing of fully depleted CCD image sensors for the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Doherty, Peter E.; Antilogus, Pierre; Astier, Pierre; Chiang, James; Gilmore, D. Kirk; Guyonnet, Augustin; Huang, Dajun; Kelly, Heather; Kotov, Ivan; Kubanek, Petr; Nomerotski, Andrei; O'Connor, Paul; Rasmussen, Andrew; Riot, Vincent J.; Stubbs, Christopher W.; Takacs, Peter; Tyson, J. Anthony; Vetter, Kurt

    2014-07-01

    The LSST Camera science sensor array will incorporate 189 large format Charge Coupled Device (CCD) image sensors. Each CCD will include over 16 million pixels and will be divided into 16 equally sized segments, each read through a separate output amplifier. The science goals of the project require CCD sensors with state-of-the-art performance in many aspects. The broad survey wavelength coverage requires fully depleted, 100-micrometer-thick, high-resistivity bulk silicon as the imager substrate. Image quality requirements place strict limits on the image degradation that may be caused by sensor effects: optical, electronic, and mechanical. In this paper we discuss the design of the prototype sensors, the hardware and software that has been used to perform electro-optic testing of the sensors, and a selection of the results of the testing to date. The architectural features that lead to internal electrostatic fields, the various effects on charge collection and transport that are caused by them, including charge diffusion and redistribution, effects on delivered PSF, and potential impacts on delivered science data quality are addressed.

  1. Acquisition of 3-D data by focus sensing utilizing the moire effect of CCD cameras.

    PubMed

    Engelhardt, K

    1991-04-10

    A new technique was recently presented for finding the 3-D shape of diffusely reflecting object surfaces [Appl. Opt. 27, 4684-4689 (1988)]. The technique is based on grid projection with small depth of focus, confocal observation, and focus sensing by evaluation of the local grating contrast. In this paper a modification of the measuring system is demonstrated that allows the use of a 2-D CCD array instead of a vidicon as the detector. In this modification the moire effect between the projected grid and the CCD array is utilized. Depth resolution is increased, and almost no unwanted moire terms arise in the detector output as long as the grating frequency is chosen above the Nyquist frequency of the array. The technique can be useful in robot vision. PMID:20700297

  2. Implementation of a parallel-beam optical-CT apparatus for three-dimensional radiation dosimetry using a high-resolution CCD camera

    NASA Astrophysics Data System (ADS)

    Huang, Wen-Tzeng; Chen, Chin-Hsing; Hung, Chao-Nan; Tuan, Chiu-Ching; Chang, Yuan-Jen

    2015-06-01

    In this study, a charge-coupled device (CCD) camera with 2-megapixel (1920 × 1080 pixels) and 12-bit resolution was developed for optical computed tomography (optical CT). The signal-to-noise ratio (SNR) of our system was 30.12 dB, better than that of commercially available CCD cameras (25.31 dB). The 50% modulation transfer function (MTF50) of our 1920 × 1080-pixel camera gave a line width per picture height (LW/PH) of 745, which is 73% of the diffraction-limited resolution. Compared with a commercially available 1-megapixel CCD camera (1296 × 966 pixels) with LW/PH = 358 and 46.6% of the diffraction-limited resolution, our camera system provided higher spatial resolution and better image quality. The NIPAM gel dosimeter was used to evaluate the optical CT with the 2-megapixel CCD. A clinical five-field irradiation treatment plan was generated using the Eclipse planning system (Varian Corp., Palo Alto, CA, USA). The gel phantom was irradiated using a 6-MV Varian Clinac iX linear accelerator (Varian). The measured NIPAM gel dose distributions and the calculated dose distributions, generated by the treatment planning software (TPS), were compared using the 3% dose-difference and 3 mm distance-to-agreement criteria. The gamma pass rate was as high as 98.2% when the 2-megapixel CCD camera was used in optical CT, but only 96.0% when the commercially available 1-megapixel CCD camera was used.
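The 3%/3 mm gamma comparison used above can be sketched in one dimension. This is a simplified illustration of the criterion, not the Eclipse/TPS implementation, and the dose profiles are synthetic:

```python
import numpy as np

def gamma_pass_rate(measured, calculated, spacing_mm, dd=0.03, dta_mm=3.0):
    """Simplified 1D gamma analysis (global 3%/3 mm criteria by default).

    For each calculated point, take the minimum over all measured points of the
    combined dose-difference / distance-to-agreement metric; the point passes
    if that minimum gamma value is <= 1.
    """
    norm = calculated.max()                        # global dose normalization
    positions = np.arange(len(measured)) * spacing_mm
    passed = 0
    for i, d_calc in enumerate(calculated):
        dose_term = (measured - d_calc) / (dd * norm)
        dist_term = (positions - positions[i]) / dta_mm
        gamma = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
        passed += int(gamma <= 1.0)
    return passed / len(calculated)

x = np.linspace(0.0, 60.0, 121)                    # 0.5 mm grid, 60 mm profile
calc = np.exp(-(((x - 30.0) / 12.0) ** 2))         # synthetic planned dose profile
meas_same = calc.copy()
meas_shift = np.exp(-(((x - 31.0) / 12.0) ** 2))   # same profile shifted by 1 mm

rate_same = gamma_pass_rate(meas_same, calc, 0.5)
rate_shift = gamma_pass_rate(meas_shift, calc, 0.5)
print(rate_same, rate_shift)
```

A 1 mm shift stays well inside the 3 mm distance-to-agreement tolerance, so both synthetic cases pass at a high rate.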

  3. Testing the e2v CCD47-20 as the new sensor for the SOFIA target acquisition and tracking cameras

    NASA Astrophysics Data System (ADS)

    Wiedemann, Manuel; Wolf, Jürgen; Röser, Hans-Peter

    2010-07-01

    The telescope of the Stratospheric Observatory for Infrared Astronomy (SOFIA) has three target acquisition and tracking cameras, the Wide Field Imager (WFI), Fine Field Imager (FFI) and Focal Plane Imager (FPI). All three cameras use Thompson TH7888A CCD sensors (now offered by e2v) which are quite suitable in terms of their geometry and readout speed. However, their quantum efficiency and dark current rate are not comparable to newer CCD sensors now widely used in astronomy. The Deutsches SOFIA Institut (DSI) under contract of the German Aerospace Center (DLR) has therefore initiated an upgrade project of the cameras with high-sensitivity and low dark current CCD sensors, the e2v CCD47-20 BI AIMO. The back-illuminated architecture allows for high quantum efficiency, while the inverted mode operation lowers the dark current significantly. Both features enable the cameras to use fainter stars for tracking. The expected improvements in sensitivity range between 1.2 and 2.5 stellar magnitudes for the three cameras. In this paper we present results of laboratory and on-sky tests with the new sensor, obtained with a commercial camera platform.

  4. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and Image J software was developed. This method was used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, without any structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to the key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by MTS in steps from 0 N to 500 N, simulating a double-leg standing stance. The anterior and lateral images of the specimen were obtained through two CCD cameras. Based on Image J, digital image processing software which can be freely downloaded from the National Institutes of Health, digital 8-bit images were processed. The procedure includes recognition of the digital marker, image inversion, sub-pixel reconstruction, image segmentation, and a center of mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and micro-angular rotation of the sacroiliac joint in the lateral view were calculated according to the marker movement. The results of digital image measurement showed the following: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576-pixel image (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels, and the comparative error could reach 1.11‰. The average vertical displacement of S1 of the pelvis was 0.8356 ± 0.2830 mm under a vertical load of 500 N, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°. The load-displacement curves obtained from our optical measurement system matched the clinical results. Digital image measurement of specimen deformation based on CCD cameras and Image J software has good prospects for application in biomechanical research, with the advantages of a simple optical setup, no contact, high precision, and no special requirements on the test environment.
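The marker-correlation step can be illustrated with a normalized cross-correlation search for a marker patch between two frames. The patch, shift, and use of the paper's 0.68 mm pixel size below are illustrative, not the study's data:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient between two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def marker_shift(ref_patch, image, search=5):
    """Locate ref_patch in image by maximizing NCC over integer shifts."""
    h, w = ref_patch.shape
    best = (-2.0, 0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r0, c0 = search + dr, search + dc
            score = ncc(ref_patch, image[r0:r0 + h, c0:c0 + w])
            best = max(best, (score, dr, dc))
    return best  # (correlation coefficient, row shift, col shift)

rng = np.random.default_rng(1)
frame = rng.uniform(size=(40, 40))            # textured marker region, frame 1
ref = frame[5:25, 5:25].copy()                # 20x20 template from frame 1
moved = np.roll(np.roll(frame, 2, axis=0), 3, axis=1)  # frame 2: 2 px down, 3 px right
corr, dr, dc = marker_shift(ref, moved)

pixel_mm = 0.68                               # pixel size from the paper's setup
print(corr, dr * pixel_mm, dc * pixel_mm)     # displacement in millimetres
```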

  5. Station Cameras Capture New Videos of Hurricane Katia - Duration: 5 minutes, 36 seconds.

    NASA Video Gallery

    Aboard the International Space Station, external cameras captured new video of Hurricane Katia as it moved northwest across the western Atlantic north of Puerto Rico at 10:35 a.m. EDT on September ...

  6. Fused Six-Camera Video of STS-134 Launch - Duration: 79 seconds.

    NASA Video Gallery

    Imaging experts funded by the Space Shuttle Program and located at NASA's Ames Research Center prepared this video by merging nearly 20,000 photographs taken by a set of six cameras capturing 250 i...

  7. Using a Video Camera to Measure the Radius of the Earth

    ERIC Educational Resources Information Center

    Carroll, Joshua; Hughes, Stephen

    2013-01-01

    A simple but accurate method for measuring the Earth's radius using a video camera is described. A video camera was used to capture a shadow rising up the wall of a tall building at sunset. A free program called ImageJ was used to measure the time it took the shadow to rise a known distance up the building. The time, distance and length of…

  9. Performances of a solid streak camera based on conventional CCD with nanosecond time resolution

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Bai, Yonglin; Zhu, Bingli; Gou, Yongsheng; Xu, Peng; Bai, XiaoHong; Liu, Baiyu; Qin, Junjun

    2015-02-01

    Imaging systems with high temporal resolution are needed to study rapid physical phenomena ranging from shock waves, including extracorporeal shock waves used for surgery, to diagnostics of laser fusion and fuel injection in internal combustion engines. However, conventional streak cameras use a vacuum tube, making them fragile, cumbersome and expensive. Here we report a CMOS streak camera project that consists in reproducing this streak camera functionality completely with a single CMOS chip. By changing the charge-transfer mode of the CMOS image sensor, fast photoelectric diagnostics of a single point with a linear CMOS sensor and high-speed line scanning with an array CMOS sensor can be achieved, respectively. A fast photoelectric diagnostics system has been designed and fabricated to investigate the feasibility of this method. Finally, the dynamic operation of the sensors is presented. Measurements show a sample time of 500 ps and a time resolution better than 2 ns.

  10. MISR Level 1A CCD Science data, all cameras (MIL1A_V1)

    NASA Technical Reports Server (NTRS)

    Diner, David J. (Principal Investigator)

    The Level 1A data are raw MISR data that are decommutated, reformatted 12-bit Level 0 data shifted to byte boundaries, i.e., reversal of square-root encoding applied and converted to 16 bit, and annotated (e.g., with time information). These data are used by the Level 1B1 processing algorithm to generate calibrated radiances. The science data output preserves the spatial sampling rate of the Level 0 raw MISR CCD science data. CCD data are collected during routine science observations of the sunlit portion of the Earth. Each product represents one 'granule' of data. A 'granule' is defined to be the smallest unit of data required for MISR processing. Also, included in the Level 1A product are pointers to calibration coefficient files provided for Level 1B processing. [Location=GLOBAL] [Temporal_Coverage: Start_Date=2000-02-24; Stop_Date=] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180].

  11. MISR Level 1A CCD Science data, all cameras (MIL1A_V2)

    NASA Technical Reports Server (NTRS)

    Diner, David J. (Principal Investigator)

    The Level 1A data are raw MISR data that are decommutated, reformatted 12-bit Level 0 data shifted to byte boundaries, i.e., reversal of square-root encoding applied and converted to 16 bit, and annotated (e.g., with time information). These data are used by the Level 1B1 processing algorithm to generate calibrated radiances. The science data output preserves the spatial sampling rate of the Level 0 raw MISR CCD science data. CCD data are collected during routine science observations of the sunlit portion of the Earth. Each product represents one 'granule' of data. A 'granule' is defined to be the smallest unit of data required for MISR processing. Also, included in the Level 1A product are pointers to calibration coefficient files provided for Level 1B processing. [Location=GLOBAL] [Temporal_Coverage: Start_Date=2000-02-24; Stop_Date=] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180].

  12. Assessing the capabilities of a 4kx4k CCD camera for electron cryo-microscopy at 300kV.

    PubMed

    Booth, Christopher R; Jakana, Joanita; Chiu, Wah

    2006-12-01

    CCD cameras have numerous advantages over photographic film for detecting electrons; however, the point spread function of these cameras has not been sufficient for single particle data collection to subnanometer resolution with 300kV microscopes. We have adopted the spectral signal to noise ratio (SNR) as a parameter for assessing detector quality for single particle imaging. The robustness of this parameter is confirmed under a variety of experimental conditions. Using this parameter, we demonstrate that the SNR of images of either amorphous carbon film or ice embedded virus particles collected on a new commercially available 4kx4k CCD camera are slightly better than photographic film at low spatial frequency (<1/5 Nyquist frequency), and as good as photographic film out to half of the Nyquist frequency. In addition, it is slightly easier to visualize ice embedded particles on this CCD camera than on photographic film. Based on this analysis it is realistic to collect images containing subnanometer resolution data (6-9 Å) using this CCD camera at an effective magnification of approximately 112000x on a 300kV electron microscope. PMID:17067819
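A spectral SNR can be estimated from two recordings of the same specimen: the cross-spectrum isolates the signal (independent noise averages out) while the difference isolates the noise. A minimal 1D sketch with synthetic data, not the authors' measurement pipeline:

```python
import numpy as np

def spectral_powers(a, b):
    """Per-frequency signal and noise power estimates from two recordings.

    With independent noise, the cross-spectrum of the two recordings estimates
    the signal power, while half the power spectrum of their difference
    estimates the noise power of a single recording.
    """
    fa, fb = np.fft.rfft(a), np.fft.rfft(b)
    signal_power = np.real(fa * np.conj(fb))
    noise_power = 0.5 * np.abs(np.fft.rfft(a - b)) ** 2
    return signal_power, noise_power

rng = np.random.default_rng(2)
n = 4096
x = np.arange(n)
truth = 50.0 * np.exp(-(((x - n / 2) / 200.0) ** 2))  # smooth, low-frequency object
a = truth + rng.normal(scale=1.0, size=n)             # two noisy exposures
b = truth + rng.normal(scale=1.0, size=n)

sig, noise = spectral_powers(a, b)
low = sig[1:50].mean() / noise[1:50].mean()     # SNR at low spatial frequency
high = sig[-500:].mean() / noise[-500:].mean()  # SNR near the Nyquist frequency
print(low, high)
```

As the abstract's comparison suggests, a detector's usefulness hinges on how quickly this SNR falls off with spatial frequency; here the smooth object gives a large low-frequency SNR and essentially zero near Nyquist.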

  13. Requisites for the remote-controlled wide-view CCD camera unit for natural orifice transluminal endoscopic surgery placed in the intraperitoneal cavity.

    PubMed

    Ohdaira, Takeshi; Yasuda, Yoshikazu; Hashizume, Makoto

    2010-04-01

    In natural orifice transluminal endoscopic surgery (NOTES) using a single endoscope, the visual field moves unstably and a wide blind space is formed. We used two wireless CCD cameras (270,000 and 380,000 pixels) placed on the abdominal wall of pigs together with a conventional endoscope (410,000 pixels) to assess whether it was possible to observe the entire process of sigmoidectomy by NOTES. A titanium dioxide-coated lens was used as an antifogging apparatus. To control the CCD image frames, a magnetic body was affixed to the back of the CCD camera unit. To select a suitable visual transmitter, three frequency bands were assessed: 0.07 GHz, 1.2 GHz, and 2.4 GHz. The cameras showed good performance for monitoring all procedures of the sigmoidectomy. The magnetic force most suitable to control the cameras was found to be 360 mT, and the best transmission frequency was 1.2 GHz. The battery could be used for up to 4 hours with intermittent use. The issue of lens fogging could be resolved by a water supply into the anal canal and more than 12 hours of ultraviolet irradiation. We verified that the CCD camera with the titanium dioxide-coated lens may be useful as the second eye in NOTES. PMID:20437343

  14. Development of Measurement Device of Working Radius of Crane Based on Single CCD Camera and Laser Range Finder

    NASA Astrophysics Data System (ADS)

    Nara, Shunsuke; Takahashi, Satoru

    In this paper, we develop an observation device to measure the working radius of a crane truck. The device has a single CCD camera, a laser range finder and two AC servo motors. First, in order to measure the working radius, we need an algorithm for crane hook recognition. We therefore attach a cross mark to the crane hook and, instead of the hook itself, recognize the cross mark. Further, for the observation device, we construct a PI control system with an extended Kalman filter to track the moving cross mark. Through experiments, we show the usefulness of our device, including the new mark-tracking control system.
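The abstract's mark tracking combines an extended Kalman filter with PI control; as a hedged, simplified stand-in, the sketch below runs a plain linear constant-velocity Kalman filter on noisy mark positions. All parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 0.02                                   # hypothetical 50 Hz camera update
F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity state transition
H = np.array([[1.0, 0.0]])                  # we observe position only
Q = np.diag([1e-5, 1e-3])                   # process noise covariance
R = np.array([[0.01]])                      # measurement noise variance

# Truth: the cross mark moves at constant velocity; the camera reports noisy positions.
steps = 500
truth = 0.5 + 1.2 * dt * np.arange(steps)
meas = truth + rng.normal(scale=0.1, size=steps)

x = np.array([meas[0], 0.0])                # state: [position, velocity]
P = np.eye(2)
est = []
for z in meas:
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new measurement.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    est.append(x[0])

est = np.array(est)
rmse_kf = np.sqrt(np.mean((est[100:] - truth[100:]) ** 2))
rmse_raw = np.sqrt(np.mean((meas[100:] - truth[100:]) ** 2))
print(rmse_kf, rmse_raw)                    # filtered track is smoother than raw
```

In the paper's setup the filtered mark position would then feed the PI controller driving the two AC servo motors.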

  15. Range-Gated LADAR Coherent Imaging Using Parametric Up-Conversion of IR and NIR Light for Imaging with a Visible-Range Fast-Shuttered Intensified Digital CCD Camera

    SciTech Connect

    YATES,GEORGE J.; MCDONALD,THOMAS E. JR.; BLISS,DAVID E.; CAMERON,STEWART M.; ZUTAVERN,FRED J.

    2000-12-20

    Research is presented on infrared (IR) and near infrared (NIR) sensitive sensor technologies for use in a high speed shuttered/intensified digital video camera system for range-gated imaging at ''eye-safe'' wavelengths in the region of 1.5 microns. The study is based upon nonlinear crystals used for second harmonic generation (SHG) in optical parametric oscillators (OPOs) for conversion of NIR and IR laser light to visible range light for detection with generic S-20 photocathodes. The intensifiers are ''stripline'' geometry 18-mm diameter microchannel plate intensifiers (MCPIIs), designed by Los Alamos National Laboratory and manufactured by Philips Photonics. The MCPIIs are designed for fast optical shuttering with exposures in the 100-200 ps range, and are coupled to a fast readout CCD camera. Conversion efficiency and resolution for the wavelength conversion process are reported. Experimental set-ups for the wavelength shifting and the optical configurations for producing and transporting laser reflectance images are discussed.

  16. Method for separating video camera motion from scene motion for constrained 3D displacement measurements

    NASA Astrophysics Data System (ADS)

    Gauthier, L. R.; Jansen, M. E.; Meyer, J. R.

    2014-09-01

    Camera motion is a potential problem when a video camera is used to perform dynamic displacement measurements. If the scene camera moves at the wrong time, the apparent motion of the object under study can easily be confused with the real motion of the object. In some cases it is practically impossible to prevent camera motion, for instance when a camera is used outdoors in windy conditions. A method to address this challenge is described that provides an objective means of measuring the displacement of an object of interest in the scene, even when the camera itself is moving in an unpredictable fashion at the same time. The main idea is to synchronously measure the motion of the camera and to use those data ex post facto to subtract out the apparent motion in the scene that is caused by the camera motion. The motion of the scene camera is measured with a reference camera that is rigidly attached to the scene camera and oriented towards a stationary reference object, for instance a point on the ground that is known to be stationary. It is necessary to calibrate the reference camera by simultaneously recording the scene images and the reference images at times when the scene object is known to be stationary while the camera is moving. These data are used to map camera movement to apparent scene movement in pixel space, and subsequently to remove the camera movement from the scene measurements.
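    The ex post facto subtraction described above can be illustrated with a hypothetical 1-D sketch: during calibration the scene object is known to be stationary, so a linear map from reference-camera motion to apparent scene motion can be fitted and later subtracted. All numbers are invented for the demo:

    ```python
    # Hypothetical 1-D sketch of the calibrate-then-subtract idea; not the
    # authors' implementation.
    def fit_linear(ref, scene):
        """Least-squares fit scene ~ a*ref + b (calibration phase)."""
        n = len(ref)
        mr = sum(ref) / n
        ms = sum(scene) / n
        a = sum((r - mr) * (s - ms) for r, s in zip(ref, scene)) / \
            sum((r - mr) ** 2 for r in ref)
        return a, ms - a * mr

    # Calibration: camera shakes, object stationary, so all apparent scene
    # motion is camera-induced (here exactly 2x the reference motion).
    ref_cal = [0.0, 1.0, -2.0, 3.0, 0.5]
    scene_cal = [2 * r for r in ref_cal]
    a, b = fit_linear(ref_cal, scene_cal)

    # Measurement: the object really moves by `true` while the camera shakes.
    true = [0.0, 0.5, 1.0, 1.5, 2.0]
    ref = [1.0, -1.0, 2.0, 0.0, -0.5]
    scene = [t + 2 * r for t, r in zip(true, ref)]   # observed = real + induced
    corrected = [s - (a * r + b) for s, r in zip(scene, ref)]
    ```

    After subtraction, the corrected displacements recover the true object motion despite the simulated camera shake.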

  17. Close infrared thermography using an intensified CCD camera: application in nondestructive high resolution evaluation of electrothermally actuated MEMS

    NASA Astrophysics Data System (ADS)

    Serio, B.; Hunsinger, J. J.; Conseil, F.; Derderian, P.; Collard, D.; Buchaillot, L.; Ravat, M. F.

    2005-06-01

    This communication describes an optical method for the thermal characterization of MEMS devices. The method is based on the use of an intensified CCD camera to record the thermal radiation emitted by the studied device in the spectral domain from 600 nm to about 850 nm. The camera consists of an intensifier coupled to a CCD sensor. The intensification allows very low signal levels to be amplified and detected. We used a standard optical microscope to image the device with sub-micron resolution. Since, in the close infrared at very small scale and low temperature (typically 250 °C for thermal MEMS, Micro-Electro-Mechanical Systems), the thermal radiation is very weak, we used image integration in order to increase the signal-to-noise ratio. Knowing the emissivity of the imaged materials, the temperature is obtained using Planck's law. In order to evaluate the system performance we made micro-thermographies of a micro-relay thermal actuator. This device is a "U-shaped" Al/SiO2 bimorph cantilever micro-relay with a gold-to-gold electrical contact, designed for secured harsh-environment applications. The initial beam curvature resulting from residual stresses ensures a large gap between the contacts of the micro-relay. The current flow through the metallic layer heats the bimorph by the Joule effect, and the differential expansion provides the vertical displacement for contact. The experimental results are compared with FEM and analytical simulations, and good agreement was obtained between them.
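    The radiometric step the abstract describes (temperature from measured radiance via Planck's law, given emissivity) can be sketched for a single wavelength band; the wavelength and emissivity below are illustrative assumptions, not values from the paper:

    ```python
    import math

    # Sketch: invert Planck's law for a grey body observed in one band.
    H = 6.62607015e-34   # Planck constant [J s]
    C = 2.99792458e8     # speed of light [m/s]
    KB = 1.380649e-23    # Boltzmann constant [J/K]

    def planck_radiance(lam, T, eps=1.0):
        """Spectral radiance [W m^-3 sr^-1] of a grey body at wavelength lam [m]."""
        return eps * (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * T)) - 1.0)

    def temperature_from_radiance(lam, L, eps):
        """Solve Planck's law for temperature given measured radiance L."""
        return (H * C / (lam * KB)) / math.log(1.0 + eps * 2 * H * C**2 / (lam**5 * L))

    lam = 750e-9        # mid-band wavelength of the 600-850 nm window (assumed)
    T_true = 523.15     # 250 degC, the regime quoted in the abstract
    L = planck_radiance(lam, T_true, eps=0.3)    # emissivity 0.3 is illustrative
    T_est = temperature_from_radiance(lam, L, eps=0.3)
    ```

    The inversion is exact for a single band, so the round trip recovers the input temperature.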

  18. High-sensitive radiography system utilizing a pulse x-ray generator and a night-vision CCD camera (MLX)

    NASA Astrophysics Data System (ADS)

    Sato, Eiichi; Sagae, Michiaki; Tanaka, Etsuro; Mori, Hidezo; Kawai, Toshiaki; Inoue, Takashi; Ogawa, Akira; Sato, Shigehiro; Ichimaru, Toshio; Takayama, Kazuyoshi

    2007-01-01

    A high-sensitivity radiography system utilizing a kilohertz-range stroboscopic x-ray generator and a night-vision CCD camera (MLX) is described. The x-ray generator consists of the following major components: a main controller, a condenser unit with a Cockcroft-Walton circuit, and an x-ray tube unit in conjunction with a grid controller. The main condenser of about 500 nF in the unit is charged up to 100 kV by the circuit, and the electric charges in the condenser are discharged to the triode by the grid control circuit. The maximum tube current and the repetition rate are approximately 0.5 A and 50 kHz, respectively. The x-ray pulse width ranges from 0.01 to 1.0 ms, and the maximum shot number is 32. At a charging voltage of 60 kV and a width of 1.0 ms, the x-ray intensity obtained without filtering was 6.04 μGy at 1.0 m per pulse. In radiography, an object is exposed with the pulsed x-ray generator, and a radiogram is formed on an image intensifier. The intensified image is captured by the CCD camera, and a stop-motion image is stored by a flash memory device using a trigger delay device. The image quality improved with increases in the x-ray duration, and single-shot radiography was performed with durations of less than 1.0 ms.

  19. Progress of the x-ray CCD camera development for the eROSITA telescope

    NASA Astrophysics Data System (ADS)

    Meidinger, Norbert; Andritschke, Robert; Aschauer, Florian; Bornemann, Walter; Emberger, Valentin; Eraerds, Tanja; Fürmetz, Maria; Hälker, Olaf; Hartner, Gisela; Kink, Walter; Müller, Siegfried; Pietschner, Daniel; Predehl, Peter; Reiffers, Jonas; Walther, Sabine; Weidenspointner, Georg

    2013-09-01

    The eROSITA space telescope is presently being developed for the determination of cosmological parameters and the equation of state of dark energy via the evolution of galaxy clusters. In addition, it will perform a census of obscured black hole growth in the Universe. The instrument development was also strongly motivated by the intention of performing the first imaging X-ray all-sky survey above an energy of 2 keV. eROSITA is a scientific payload on the Russian research satellite SRG, and the mission duration is scheduled for 7.5 years. The instrument comprises an array of seven identical and parallel-aligned telescopes. The mirror system is of Wolter-I type and the focal plane is equipped with a PNCCD camera for each of the telescopes. This instrumentation permits spectroscopy and imaging of X-rays in the energy band from 0.3 keV to 10 keV with a field of view of 1.0 degree. The camera development is done at the Max-Planck-Institute for Extraterrestrial Physics, and in particular the key component, the PNCCD sensor, has been designed and fabricated at the semiconductor laboratory of the Max Planck Society. All produced devices have been tested and the best ones selected for the eROSITA project. Based on calculations, simulations, and experimental testing of prototype systems, the flight cameras have been configured. We describe the detector and its performance, the camera design and electronics, and the thermal system, and report on the latest estimates of the expected radiation damage, taking into account the generation of secondary neutrons. The most recent test results are presented as well as the status of the instrument development.

  20. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument which is currently being developed by NASA's Marshall Space Flight Center (MSFC) and the National Astronomical Observatory of Japan (NAOJ). The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6nm) light emitted by the Sun's Chromosphere to make measurements of the magnetic field in this region. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of no greater than 2 e-/DN, a noise level less than 25 e-, a dark current level which is less than 10 e-/pixel/s, and a residual nonlinearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.
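    As a side note on how a gain requirement like "2 e-/DN" is commonly verified: a standard photon-transfer (mean-variance) test, not necessarily the exact CLASP procedure, exploits the fact that for shot-noise-limited flat fields the gain K [e-/DN] is the slope of mean signal versus variance. A synthetic sketch:

    ```python
    import random

    # Photon-transfer sketch: variance[DN] = mean[DN]/K for shot-noise-limited
    # data, so K is recoverable from the slope of mean vs. variance.
    random.seed(42)
    K_true = 1.8          # electrons per DN (assumed)

    def flat_field(mean_e, n=20000):
        """Simulate n pixels: Gaussian approximation of shot noise, then DN."""
        return [random.gauss(mean_e, mean_e ** 0.5) / K_true for _ in range(n)]

    means, variances = [], []
    for mean_e in (500.0, 1000.0, 2000.0, 4000.0):
        px = flat_field(mean_e)
        m = sum(px) / len(px)
        v = sum((p - m) ** 2 for p in px) / (len(px) - 1)
        means.append(m)
        variances.append(v)

    # Slope of mean(DN) against variance(DN^2) estimates the gain K [e-/DN].
    K_est = (means[-1] - means[0]) / (variances[-1] - variances[0])
    ```

    With a few tens of thousands of simulated pixels per flat field, the recovered gain lands within a few percent of the true value.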

  1. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Astrophysics Data System (ADS)

    Joiner, R. K.; Kobayashi, K.; Winebarger, A. R.; Champey, P. R.

    2014-12-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument which is currently being developed by NASA's Marshall Space Flight Center (MSFC) and the National Astronomical Observatory of Japan (NAOJ). The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6nm) light emitted by the Sun's Chromosphere to make measurements of the magnetic field in this region. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of no greater than 2 e-/DN, a noise level less than 25 e-, a dark current level which is less than 10 e-/pixel/s, and a residual non-linearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.

  2. Engineering task plan for Tanks 241-AN-103, 104, 105 color video camera systems

    SciTech Connect

    Kohlman, E.H.

    1994-11-17

    This Engineering Task Plan (ETP) describes the design, fabrication, assembly, and installation of the video camera systems into the vapor space within tanks 241-AN-103, 104, and 105. The single-camera, remotely operated color video systems will be used to observe and record the activities within the vapor space. Activities may include, but are not limited to, core sampling, auger activities, crust layer examination, and monitoring of equipment installation/removal. The objective of this task is to provide a single camera system in each of the tanks for the Flammable Gas Tank Safety Program.

  3. Lights! Camera! Action! Handling Your First Video Assignment.

    ERIC Educational Resources Information Center

    Thomas, Marjorie Bekaert

    1989-01-01

    The author discusses points to consider when hiring and working with a video production company to develop a video for human resources purposes. Questions to ask the consultants are included, as is information on the role of the company liaison and on how to avoid expensive, time-wasting pitfalls. (CH)

  4. Feasibility study of transmission of OTV camera control information in the video vertical blanking interval

    NASA Technical Reports Server (NTRS)

    White, Preston A., III

    1994-01-01

    The Operational Television system at Kennedy Space Center operates hundreds of video cameras, many remotely controllable, in support of the operations at the center. This study was undertaken to determine if commercial NABTS (North American Basic Teletext System) teletext transmission in the vertical blanking interval of the genlock signals distributed to the cameras could be used to send remote control commands to the cameras and the associated pan and tilt platforms. Wavelength division multiplexed fiberoptic links are being installed in the OTV system to obtain RS-250 short-haul quality. It was demonstrated that the NABTS transmission could be sent over the fiberoptic cable plant without excessive video quality degradation and that video cameras could be controlled using NABTS transmissions over multimode fiberoptic paths as long as 1.2 km.

  5. The Study of Timing Relationships Which Arise When Using a Television CCD Camera Watec WAT-902H2 Supreme in Astronomical Investigations

    NASA Astrophysics Data System (ADS)

    Dragomiretsky, V. V.; Ryabov, A. V.; Koshkin, N. I.

    The present paper describes the study of timing relationships which arise when using the analogue CCD camera Watec WAT-902H2 Supreme in astronomical investigations, particularly in time-domain measurements of LEO satellites, which move rapidly against the stellar background.

  6. ON RELATIVISTIC DISK SPECTROSCOPY IN COMPACT OBJECTS WITH X-RAY CCD CAMERAS

    SciTech Connect

    Miller, J. M.; Cackett, E. M.; D'Ai, A.; Bautz, M. W.; Nowak, M. A.; Bhattacharyya, S.; Burrows, D. N.; Kennea, J.; Fabian, A. C.; Reis, R. C.; Freyberg, M. J.; Haberl, F.; Strohmayer, T. E.; Tsujimoto, M.

    2010-12-01

    X-ray charge-coupled devices (CCDs) are the workhorse detectors of modern X-ray astronomy. Typically covering the 0.3-10.0 keV energy range, CCDs are able to detect photoelectric absorption edges and K shell lines from most abundant metals. New CCDs also offer resolutions of 30-50 (E/ΔE), which is sufficient to detect lines in hot plasmas and to resolve many lines shaped by dynamical processes in accretion flows. The spectral capabilities of X-ray CCDs have been particularly important in detecting relativistic emission lines from the inner disks around accreting neutron stars and black holes. One drawback of X-ray CCDs is that spectra can be distorted by photon 'pile-up', wherein two or more photons may be registered as a single event during one frame time. We have conducted a large number of simulations using a statistical model of photon pile-up to assess its impacts on relativistic disk line and continuum spectra from stellar-mass black holes and neutron stars. The simulations cover the range of current X-ray CCD spectrometers and operational modes typically used to observe neutron stars and black holes in X-ray binaries. Our results suggest that severe photon pile-up acts to falsely narrow emission lines, leading to falsely large disk radii and falsely low spin values. In contrast, our simulations suggest that disk continua affected by severe pile-up are measured to have falsely low flux values, leading to falsely small radii and falsely high spin values. The results of these simulations and existing data appear to suggest that relativistic disk spectroscopy is generally robust against pile-up when this effect is modest.
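    The pile-up mechanism described above can be quantified with elementary Poisson statistics; this is a back-of-envelope illustration, not the authors' statistical model:

    ```python
    import math

    # If photons arrive at a mean rate of lam photons per pixel per frame,
    # the fraction of *recorded* events that are really two or more photons
    # piled up is P(k>=2 | k>=1) = (1 - e^-lam - lam*e^-lam) / (1 - e^-lam).
    def pileup_fraction(lam):
        p_ge1 = 1.0 - math.exp(-lam)
        p_ge2 = p_ge1 - lam * math.exp(-lam)
        return p_ge2 / p_ge1

    # Even a modest 0.1 photons/pixel/frame piles up roughly 5% of recorded
    # events, enough to distort a relativistic line profile.
    f = pileup_fraction(0.1)
    ```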

  7. Experimental Comparison of the High-Speed Imaging Performance of an EM-CCD and sCMOS Camera in a Dynamic Live-Cell Imaging Test Case

    PubMed Central

    Beier, Hope T.; Ibey, Bennett L.

    2014-01-01

    The study of living cells may require advanced imaging techniques to track weak and rapidly changing signals. Fundamental to this need is the recent advancement in camera technology. Two camera types, specifically sCMOS and EM-CCD, promise both high signal-to-noise and high speed (>100 fps), leaving researchers with a critical decision when determining the best technology for their application. In this article, we compare two cameras using a live-cell imaging test case in which small changes in cellular fluorescence must be rapidly detected with high spatial resolution. The EM-CCD maintained an advantage of being able to acquire discernible images with a lower number of photons due to its EM-enhancement. However, if high-resolution images at speeds approaching or exceeding 1000 fps are desired, the flexibility of the full-frame imaging capabilities of sCMOS is superior. PMID:24404178
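    The trade-off the study observes falls out of the textbook per-pixel SNR model: EM gain suppresses read noise but introduces an excess-noise factor F² ≈ 2, while sCMOS pays only its small read noise. The detector numbers below are typical catalog values, not the ones measured in the paper:

    ```python
    import math

    # Simple per-pixel SNR model (signal and noise in electrons).
    def snr_emccd(signal_e, read_e=50.0, em_gain=300.0, F2=2.0):
        """EM gain divides read noise but multiplies shot noise by F^2 ~ 2."""
        return signal_e / math.sqrt(F2 * signal_e + (read_e / em_gain) ** 2)

    def snr_scmos(signal_e, read_e=1.5):
        return signal_e / math.sqrt(signal_e + read_e ** 2)

    # At very low light the EM-CCD wins; once shot noise dominates, the
    # excess-noise factor makes the sCMOS the better choice.
    low = (snr_emccd(1.0), snr_scmos(1.0))        # ~1 photoelectron/pixel
    high = (snr_emccd(100.0), snr_scmos(100.0))   # ~100 photoelectrons/pixel
    ```

    The crossover near a few photoelectrons per pixel is consistent with the qualitative conclusion of the comparison above.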

  8. The Importance of Camera Calibration and Distortion Correction to Obtain Measurements with Video Surveillance Systems

    NASA Astrophysics Data System (ADS)

    Cattaneo, C.; Mainetti, G.; Sala, R.

    2015-11-01

    Video surveillance systems are commonly used as important sources of quantitative information, and from the acquired images it is possible to obtain a large amount of metric information. Yet, several methodological issues must be considered in order to perform accurate measurements using images. The most important one is camera calibration, i.e., the estimation of the parameters defining the camera model. One of the most widely used camera calibration methods is Zhang's method, which allows the estimation of the linear parameters of the camera model. This method is popular because it requires a simple setup and allows cameras to be calibrated with a simple and fast procedure, but it does not consider lens distortions, which must be taken into account with the short-focal-length lenses commonly used in video surveillance systems. In order to perform accurate measurements, the linear camera model and Zhang's method are extended to take nonlinear parameters into account and compensate for the distortion contribution. In this paper we first describe the pinhole camera model, which treats cameras as central projection systems. After a brief introduction to the camera calibration process, and in particular Zhang's method, we describe the different types of lens distortion and the techniques used for distortion compensation. Finally, some numerical examples are shown in order to demonstrate the importance of distortion compensation for obtaining accurate measurements.
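    A minimal sketch of the kind of distortion compensation the paper argues for, using the common one-term radial model on normalized image coordinates (the coefficient value is hypothetical):

    ```python
    # Radial model x_d = x*(1 + k1*r^2), inverted by fixed-point iteration.
    K1 = -0.25  # barrel-distortion coefficient (hypothetical)

    def distort(x, y, k1=K1):
        f = 1.0 + k1 * (x * x + y * y)
        return x * f, y * f

    def undistort(xd, yd, k1=K1, iters=20):
        """Invert the radial model by iterating x = xd / (1 + k1*r(x)^2)."""
        x, y = xd, yd
        for _ in range(iters):
            f = 1.0 + k1 * (x * x + y * y)
            x, y = xd / f, yd / f
        return x, y

    # A point near the image corner is shifted noticeably by distortion;
    # the compensation recovers the true normalized coordinates.
    x_true, y_true = 0.6, 0.4
    xd, yd = distort(x_true, y_true)
    x_hat, y_hat = undistort(xd, yd)
    ```

    The fixed-point iteration contracts quickly for moderate distortion, which is why this simple inversion is adequate for the short-focal-length lenses discussed above.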

  9. Video geographic information system using mobile mapping in mobilephone camera

    NASA Astrophysics Data System (ADS)

    Kang, Jinsuk; Lee, Jae-Joon

    2013-12-01

    The aim of this paper is to develop core technologies such as automatic shape extraction from images (video), spatio-temporal data processing, and efficient modeling, and thereby make it inexpensive and fast to build and process huge 3D geographic datasets. Upgrades and maintenance of the technologies are also easy due to the component-based system architecture. We therefore designed and implemented a video mobile GIS using a real-time database system, which consists of a real-time GIS engine, a middleware, and a mobile client.

  10. Students behind the Video Camera. Augmenting the Vocational Curriculum.

    ERIC Educational Resources Information Center

    Coughlin, Matthew; Carey, Peter

    1987-01-01

    A dropout prevention program operated by the Federation Employment and Guidance Service has developed a communications arts class within the vocational curriculum that focuses on video production. Much of the classwork provides support to other classes in the vocational program. (CH)

  11. Fast-neutron radiation effects in a silica-core optical fiber studied by a CCD-camera spectrometer

    SciTech Connect

    Griscom, D.L.; Gingerich, M.E.; Friebele, E.J.; Putnam, M.; Unruh, W.

    1994-02-20

    A simple CCD-camera spectrometer was deployed at the Los Alamos Spallation Radiation Effects Facility to characterize fast-neutron irradiation effects in several silica-based optical fibers over the wavelength range ~450-1100 nm. The experimental arrangement allowed optical loss spectra to be developed from remotely recovered frame grabs at various times during irradiation without it being necessary to resort to cutback methods. Data recorded for a pure-silica-core/F-doped-silica-clad fiber displayed a peculiar artifact, which is described and mathematically modeled in terms of leaky modes propagating in an optical cladding that is substantially less susceptible to radiation-induced optical attenuation than is the core. Evidence from optical time-domain reflectometry supports the postulate that mode leakage into the cladding may be a result of light scattering from the tracks of ions displaced by the 14-MeV neutrons. These results suggest that fibers with fluorine doping in the core, as well as in the cladding, would be relatively resistant to radiation-induced attenuation in the UV-visible spectral region.

  12. Camera/Video Phones in Schools: Law and Practice

    ERIC Educational Resources Information Center

    Parry, Gareth

    2005-01-01

    The emergence of mobile phones with built-in digital cameras is creating legal and ethical concerns for school systems throughout the world. Users of such phones can instantly email, print or post pictures to other MMS phones or websites. Local authorities and schools in Britain, Europe, USA, Canada, Australia and elsewhere have introduced…

  13. BOREAS RSS-3 Imagery and Snapshots from a Helicopter-Mounted Video Camera

    NASA Technical Reports Server (NTRS)

    Walthall, Charles L.; Loechel, Sara; Nickeson, Jaime (Editor); Hall, Forrest G. (Editor)

    2000-01-01

    The BOREAS RSS-3 team collected helicopter-based video coverage of forested sites acquired during BOREAS as well as single-frame "snapshots" processed to still images. Helicopter data used in this analysis were collected during all three 1994 IFCs (24-May to 16-Jun, 19-Jul to 10-Aug, and 30-Aug to 19-Sep), at numerous tower and auxiliary sites in both the NSA and the SSA. The VHS-camera observations correspond to other coincident helicopter measurements. The field of view of the camera is unknown. The video tapes are in both VHS and Beta format. The still images are stored in JPEG format.

  14. A measurement system based on visual servoing and error correction method using multiple CCD camera module and structured target for three dimensional measurement

    NASA Astrophysics Data System (ADS)

    Noh, Dong-Ki; Kim, Sung-Han; Park, Young-Jun; Choi, Doo Jin

    2007-10-01

    In the shipyard, the required error margin is less than ±2 mm when producing panels of 20000 mm by 20000 mm. This paper proposes a measurement system and an error correction method that use several cameras and consecutive image data for large-scale panels in order to satisfy the requested precision bounds. The purpose of this paper is the correction of measurement error using the matching of four consecutive camera images acquired with four CCD camera modules. Each module consists of a CCD camera, a rotary stage and a rotary stage controller. The error correction method is established using the midpoint of the direction vectors from each camera and the relation between the origin camera and the others. The relation is estimated using corresponding points between the image planes. The direction vector from each CCD camera is measured using the change in the angle of the rotary stage. In particular, to measure the dimensions of the shape efficiently, a structured target must be kept at the center of the image plane. By visual servoing, the target is moved to the center of the image plane; that is, the motion of the measurement system (the change of the rotary stage angle) is controlled by an image-based feedback system. An advantage of this method is the achievable measurement accuracy. With this advantage, we propose an error correction algorithm using four consecutive images to correct the measurement error. In order to evaluate the proposed algorithm, experiments were performed in a real shipbuilding environment.
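    The "midpoint of direction vectors" step resembles classic two-ray midpoint triangulation, which can be sketched as follows (the geometry values are invented for the demo, and only two of the four cameras are shown):

    ```python
    # Each camera contributes a ray (origin + direction); the 3-D point is
    # estimated as the midpoint of the shortest segment between two rays.
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def midpoint_triangulate(p1, d1, p2, d2):
        """Closest-approach midpoint of rays p1 + t*d1 and p2 + s*d2."""
        w = [a - b for a, b in zip(p1, p2)]
        a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
        d, e = dot(d1, w), dot(d2, w)
        den = a * c - b * b              # zero only for parallel rays
        t = (b * e - c * d) / den
        s = (a * e - b * d) / den
        q1 = [p + t * v for p, v in zip(p1, d1)]
        q2 = [p + s * v for p, v in zip(p2, d2)]
        return [(u + v) / 2.0 for u, v in zip(q1, q2)]

    # Two camera modules 10 m apart both sighting a target at (4, 3, 12).
    target = [4.0, 3.0, 12.0]
    p1, p2 = [0.0, 0.0, 0.0], [10.0, 0.0, 0.0]
    d1 = [t - p for t, p in zip(target, p1)]
    d2 = [t - p for t, p in zip(target, p2)]
    est = midpoint_triangulate(p1, d1, p2, d2)
    ```

    With noisy direction vectors the two rays become skew, and the midpoint of their closest-approach segment gives a least-disagreement estimate of the target position.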

  15. A 5.5 megapixel high-performance low-light military video camera

    NASA Astrophysics Data System (ADS)

    Heim, Gerald B.; Biesterfeld, Brian; Burkepile, Jon; Frame, Wayne W.; Harding, Tyson; Harris, Joshua; Merschel, Steve; Mork, Ryan; Shimonek, Jordan; Smith, Phillips; Wu, Minming

    2009-05-01

    Ball Aerospace & Technologies Corp. has combined the results of recent advances in CMOS imaging sensor, signal processing and embedded computing technologies to produce a new high performance military video camera. In this paper we present the design features and performance characteristics of this new, large format camera which was developed for use in military airborne intelligence, surveillance and reconnaissance (ISR), targeting and pilotage applications. This camera utilizes a high sensitivity CMOS detector array with low read noise, low dark current and large well capacity to provide high quality image data under low-light and high intra-scene dynamic range illumination conditions. The camera utilizes sensor control electronics and an advanced digital video processing chain to maximize the quality and utility of the digital images produced by the CMOS device. Key features of this camera include: rugged, small physical size, wide operating temperature range, low operating power, high frame rate and automatic gain control for all-light-level applications. This camera also features a novel pixel decimation filter to provide custom image sizes and video output formats.

  16. Lights! Camera! Action!: video projects in the classroom.

    PubMed

    Epstein, Carol Diane; Hovancsek, Marcella T; Dolan, Pamela L; Durner, Erin; La Rocco, Nicole; Preiszig, Patricia; Winnen, Caitlin

    2003-12-01

    We report on two classroom video projects intended to promote active student involvement in their classroom experience during a year-long medical-surgical nursing course. We implemented two types of projects, Nursing Grand Rounds and FPBTV. The projects are templates that can be applied to any nursing specialty and can be implemented without the use of video technology. During the course of several years, both projects have proven effective in encouraging students to promote pattern recognition of characteristic features of common illnesses, to develop teamwork strategies, and to practice their presentation skills in a safe environment among their peers. The projects appealed to students because they increased retention of information and immersed students in the experience of becoming experts about an illness or a family of medications. These projects have enabled students to become engaged and invested in their own learning in the classroom. PMID:14694997

  17. Laser Imaging Video Camera Sees Through Fire, Fog, Smoke

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a series of SBIR contracts with Langley Research Center, inventor Richard Billmers refined a prototype for a laser imaging camera capable of seeing through fire, fog, smoke, and other obscurants. Now, Canton, Ohio-based Laser Imaging through Obscurants (LITO) Technologies Inc. is demonstrating the technology as a perimeter security system at Glenn Research Center and planning its future use in aviation, shipping, emergency response, and other fields.

  18. Observation of hydrothermal flows with acoustic video camera

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Asada, A.; Tamaki, K.; Scientific Team Of Yk09-13 Leg 1

    2010-12-01

    Evaluating hydrothermal discharge and its diffusion process along the ocean ridge is necessary for understanding the balance of mass and flux in the ocean, the ecosystem around hydrothermal fields, and so on. However, it has been difficult to measure hydrothermal activity without disturbance caused by the observation platform (submersible, ROV, AUV). We wanted an observational method that could capture hydrothermal discharge behavior as it is. DIDSON (Dual-Frequency IDentification SONar) is an acoustic lens-based sonar. It has sufficiently high resolution and a rapid enough refresh rate that it can substitute for optical systems in turbid or dark water where optical systems fail. DIDSON operates at two frequencies, 1.8 MHz or 1.1 MHz, and forms 96 beams spaced 0.3° apart or 48 beams spaced 0.6° apart, respectively. It images out to 12 m at 1.8 MHz and 40 m at 1.1 MHz. The transmit and receive beams are formed with acoustic lenses with rectangular apertures, made of polymethylpentene plastic and FC-70 liquid. This physical beam forming allows DIDSON to consume only 30 W of power. DIDSON updates its image at between 20 and 1 frames/s depending on the operating frequency and the maximum range imaged, and communicates with its host using Ethernet. The Institute of Industrial Science, University of Tokyo (IIS) has recognized DIDSON's superior performance and sought new ways of utilizing it. The observation systems that IIS has developed based on DIDSON include a waterside surveillance system, an automatic measurement system for fish length, an automatic system for fish counting, a diagnosis system for deterioration of underwater structures, and so on. The next challenge is to develop an observation method based on DIDSON for hydrothermal discharge from seafloor vents. We expected DIDSON to reveal the whole image of a hydrothermal plume as well as detail inside the plume.
    In October 2009, we conducted seafloor reconnaissance using the manned deep-sea submersible Shinkai6500 in the Central Indian Ridge at 18°-20° S, where hydrothermal plume signatures had previously been perceived. DIDSON was mounted on the top of Shinkai6500 in order to obtain acoustic video images of hydrothermal plumes. In this cruise, seven dives of Shinkai6500 were conducted, and acoustic video images of hydrothermal plumes were captured in three of the seven dives. These are among only a few acoustic video images of hydrothermal plumes in existence. Processing and analysis of the acoustic video image data are ongoing. We will report an overview of the acoustic video images of the hydrothermal plumes and discuss the possibility of DIDSON as an observation tool for seafloor hydrothermal activity.

  19. Temperature monitoring of Nd:YAG laser cladding (CW and PP) by advanced pyrometry and CCD-camera-based diagnostic tool

    NASA Astrophysics Data System (ADS)

    Doubenskaia, M.; Bertrand, Ph.; Smurov, Igor Y.

    2004-04-01

    A set of original pyrometers and a special diagnostic CCD camera were applied to the monitoring of Nd:YAG laser cladding (pulsed-periodic and continuous-wave) with coaxial powder injection, and to on-line measurement of the clad layer temperature. The experiments were carried out in the course of developing wear-resistant coatings using various powder blends (WC-Co, CuSn, Mo, Stellite grade 12, etc.) under variation of different process parameters: laser power, cladding velocity, powder feeding rate, etc. The surface temperature distribution along the cladding seam and the overall temperature mapping were registered. The CCD-camera-based diagnostic tool was applied for: (1) monitoring of the flux of hot particles and its instability; (2) measurement of in-flight particle size and velocity; (3) monitoring of particle collision with the clad in the interaction zone.

  20. STS-29 Discovery, OV-103, MS Bagian uses video camera on forward flight deck

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Mission Specialist James P. Bagian points a video camera out forward flight deck window W2 while free-floating above the commander's station seat and controls. An unsecured seat belt drifts below Bagian's elbows. Bagian films Earth's surface while onboard Discovery, Orbiter Vehicle (OV) 103, during Mission STS-29.

  1. Nyquist Sampling Theorem: Understanding the Illusion of a Spinning Wheel Captured with a Video Camera

    ERIC Educational Resources Information Center

    Levesque, Luc

    2014-01-01

    Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the…

  2. Nyquist Sampling Theorem: Understanding the Illusion of a Spinning Wheel Captured with a Video Camera

    ERIC Educational Resources Information Center

    Levesque, Luc

    2014-01-01

    Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the…
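The wagon-wheel illusion these two records describe follows directly from aliasing: rotation between frames is only observed modulo one revolution. A minimal Python sketch (our own illustration, not from the paper) folds the true rotation rate into the Nyquist band to predict the apparent motion:

```python
# Wagon-wheel effect: apparent spoke rotation when a camera under-samples.
# A spoke advancing by some fraction of a revolution per frame is perceived
# modulo 1, folded into the range [-0.5, 0.5) rev/frame (the Nyquist limit).

def apparent_rate(true_rev_per_s, fps):
    """Apparent rotation (rev/s) seen by a camera sampling at fps."""
    per_frame = true_rev_per_s / fps          # revolutions between frames
    folded = (per_frame + 0.5) % 1.0 - 0.5    # alias into [-0.5, 0.5)
    return folded * fps

# A wheel spinning at 27 rev/s filmed at 30 frames/s:
print(apparent_rate(27.0, 30.0))  # appears to spin slowly backwards (about -3 rev/s)
print(apparent_rate(6.0, 30.0))   # well below Nyquist: perceived correctly
```

Sampling at more than twice the rotation rate (here, anything spinning slower than 15 rev/s at 30 frames/s) is perceived correctly; faster wheels appear slowed, stopped, or reversed.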

  3. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  4. Video content analysis on body-worn cameras for retrospective investigation

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; Baan, Jan; ter Haar, Frank B.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Burghouts, Gertjan J.; Wijn, Remco; van den Broek, Sebastiaan P.; van Rest, Jeroen H. C.

    2015-10-01

    In the security domain, cameras are important to assess critical situations. Apart from fixed surveillance cameras we observe an increasing number of sensors on mobile platforms, such as drones, vehicles and persons. Mobile cameras allow rapid and local deployment, enabling many novel applications and effects, such as the reduction of violence between police and citizens. However, the increased use of bodycams also creates potential challenges. For example: how can end-users extract information from the abundance of video, how can the information be presented, and how can an officer retrieve information efficiently? Nevertheless, such video gives the opportunity to stimulate the professionals' memory, and support complete and accurate reporting. In this paper, we show how video content analysis (VCA) can address these challenges and seize these opportunities. To this end, we focus on methods for creating a complete summary of the video, which allows quick retrieval of relevant fragments. The content analysis for summarization consists of several components, such as stabilization, scene selection, motion estimation, localization, pedestrian tracking and action recognition in the video from a bodycam. The different components and visual representations of summaries are presented for retrospective investigation.

  5. Temporal evolution of thermocavitation bubbles using high speed video camera

    NASA Astrophysics Data System (ADS)

    Padilla-Martinez, J. P.; Aguilar, G.; Ramirez-San-Juan, J. C.; Ramos-García, R.

    2011-10-01

In this work, we present a novel method of cavitation, thermocavitation, induced by CW low-power laser radiation in a highly absorbing solution of copper nitrate (CuNO4) dissolved in deionized water. The high absorption coefficient of the solution (α = 135 cm⁻¹) produces an overheated region (~300 °C) followed by an explosive phase transition and, consequently, the formation of an expanding vapor bubble, which later collapses very rapidly, emitting intense acoustic shock waves. We study the dynamic behavior of bubbles formed in contact with a solid interface as a function of laser power using high-speed video recording at rates of ~10⁵ fps. The bubble grows regularly, without any significant deviation from its hemispherical shape, until it reaches its maximum radius, but it deforms in the final stage of the collapse, probably due to bubble adhesion to the surface. We also show that the maximum bubble radius and the shock-wave energy scale inversely with the beam intensity.

  6. Acceptance/operational test procedure 101-AW tank camera purge system and 101-AW video camera system

    SciTech Connect

    Castleberry, J.L.

    1994-09-19

This procedure will document the satisfactory operation of the 101-AW Tank Camera Purge System (CPS) and the 101-AW Video Camera System. The safety interlock, which shuts down all the electronics inside the 101-AW vapor space during loss of purge pressure, will be in place and tested to ensure reliable performance. This procedure is separated into four sections. Section 6.1 is performed in the 306 building prior to delivery to the 200 East Tank Farms and involves checking all fittings on the 101-AW Purge Panel for leakage using a Snoop solution and resolving any leakage. Section 7.1 verifies that PR-1, the regulator which maintains a positive pressure within the volume (cameras and pneumatic lines), is properly set. In addition, the green light (PRESSURIZED), located on the Purge Control Panel, is verified to turn on above 10 in. w.g. and after the time delay relay (TDR) has timed out. Section 7.2 verifies that the purge cycle functions properly, that the red light (PURGE ON) comes on, and that the correct flowrate is obtained to meet the requirements of the National Fire Protection Association. Section 7.3 verifies that the pan and tilt, camera, and associated controls and components operate correctly. This section also verifies that the safety interlock system operates correctly during loss of purge pressure, during which the illumination of the amber light (PURGE FAILED) will be verified.
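The interlock behavior the procedure tests can be sketched as a small state machine: power is considered established only after pressure has held above the setpoint through the time delay, and a pressure loss trips the PURGE FAILED lamp. This is our own illustration; the setpoint, the sample-based stand-in for the TDR, and the lamp logic are illustrative, not taken from the procedure:

```python
# Hedged sketch of a purge-interlock: the PRESSURIZED lamp lights only after
# purge pressure has stayed above the setpoint for the full time delay; on
# loss of pressure the interlock trips and the PURGE FAILED lamp lights.
# Threshold and delay values are illustrative stand-ins.

SETPOINT_IN_WG = 10.0   # pressurized-lamp threshold, inches water gauge
DELAY_SAMPLES = 3       # stand-in for the time-delay relay (TDR)

def interlock_states(pressure_samples):
    """Yield (pressurized_lamp, purge_failed_lamp) for each pressure sample."""
    held = 0
    for p in pressure_samples:
        held = held + 1 if p > SETPOINT_IN_WG else 0
        pressurized = held >= DELAY_SAMPLES
        failed = held == 0           # pressure currently below setpoint
        yield pressurized, failed

readings = [12.0, 12.5, 12.2, 12.1, 4.0]   # purge pressure lost at the end
for lamps in interlock_states(readings):
    print(lamps)
```

The green lamp comes on only at the third consecutive in-range sample, mirroring the delayed turn-on verified in Section 7.1, and the final low reading immediately trips the amber lamp as in Section 7.3.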

  7. Free-viewpoint image generation from a video captured by a handheld camera

    NASA Astrophysics Data System (ADS)

    Takeuchi, Kota; Fukushima, Norishige; Yendo, Tomohiro; Panahpour Tehrani, Mehrdad; Fujii, Toshiaki; Tanimoto, Masayuki

    2011-03-01

In general, free-viewpoint images are generated from images captured by a camera array aligned on a straight line or a circle. A camera array can capture a synchronized dynamic scene, but it is expensive and requires great care to align exactly. In contrast, a handheld camera is easily available and can easily capture a static scene. We propose a method that generates free-viewpoint images from a video of a static scene captured by a handheld camera. To generate free-viewpoint images, view images from several viewpoints and the camera pose/position of each viewpoint are needed. In one previous work, a checkerboard pattern had to be captured in every frame to calculate these parameters; in another, a pseudo-perspective projection was assumed, which limits the camera movement. In this paper, we instead calculate these parameters by structure from motion. Additionally, we propose a method to select reference images from the many captured frames, and a method that uses projective block matching and a graph-cuts algorithm with reconstructed feature points to estimate the depth map of a virtual viewpoint.

  8. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1993-01-01

A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  9. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1991-01-01

A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  10. A Refrigerated Web Camera for Photogrammetric Video Measurement inside Biomass Boilers and Combustion Analysis

    PubMed Central

    Porteiro, Jacobo; Riveiro, Belén; Granada, Enrique; Armesto, Julia; Eguía, Pablo; Collazo, Joaquín

    2011-01-01

    This paper describes a prototype instrumentation system for photogrammetric measuring of bed and ash layers, as well as for flying particle detection and pursuit using a single device (CCD) web camera. The system was designed to obtain images of the combustion process in the interior of a domestic boiler. It includes a cooling system, needed because of the high temperatures in the combustion chamber of the boiler. The cooling system was designed using CFD simulations to ensure effectiveness. This method allows more complete and real-time monitoring of the combustion process taking place inside a boiler. The information gained from this system may facilitate the optimisation of boiler processes. PMID:22319349

  11. Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source (HTPD 08 paper)

    SciTech Connect

    Haugh, M; Schneider, M B

    2008-04-28

The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 μm square pixels, and 15 μm thick. A multi-anode Manson X-ray source, operating up to 10 kV and 10 W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 10. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within ±1% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.

  12. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras.

    PubMed

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127
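The block-match step the paper uses to cut motion-estimation time can be illustrated with a minimal exhaustive sum-of-absolute-differences (SAD) search. This is a generic sketch of block matching, not the authors' implementation; the frame layout and function names are ours:

```python
# Minimal block-matching motion estimation via sum of absolute differences.
# Frames are plain 2-D lists of grey levels; an exhaustive search over a
# small window finds the displacement that best aligns a block between
# the current frame and a reference frame.

def sad(ref, cur, bx, by, dx, dy, bs):
    """SAD between a bs x bs block at (bx, by) in cur and the block at
    (bx + dx, by + dy) in ref."""
    total = 0
    for y in range(bs):
        for x in range(bs):
            total += abs(cur[by + y][bx + x] - ref[by + dy + y][bx + dx + x])
    return total

def best_motion(ref, cur, bx, by, bs=2, search=2):
    """Exhaustive search in a (2*search+1)^2 window; returns (dx, dy)."""
    h, w = len(ref), len(ref[0])
    best, best_v = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if 0 <= bx + dx and bx + dx + bs <= w and 0 <= by + dy and by + dy + bs <= h:
                v = sad(ref, cur, bx, by, dx, dy, bs)
                if v < best_v:
                    best, best_v = (dx, dy), v
    return best

# A bright 2x2 patch at (1, 1) in ref moves one pixel right in cur:
ref = [[0] * 5 for _ in range(5)]
cur = [[0] * 5 for _ in range(5)]
for y in (1, 2):
    for x in (1, 2):
        ref[y][x] = 9
        cur[y][x + 1] = 9
print(best_motion(ref, cur, 2, 1))  # the cur block matches ref shifted by (-1, 0)
```

Replacing a full search over all candidate frames with a windowed block match like this is what yields the reported reduction in motion-estimation time.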

  13. Hardware-based smart camera for recovering high dynamic range video from multiple exposures

    NASA Astrophysics Data System (ADS)

    Lapray, Pierre-Jean; Heyrman, Barthélémy; Ginhac, Dominique

    2014-10-01

    In many applications such as video surveillance or defect detection, the perception of information related to a scene is limited in areas with strong contrasts. The high dynamic range (HDR) capture technique can deal with these limitations. The proposed method has the advantage of automatically selecting multiple exposure times to make outputs more visible than fixed exposure ones. A real-time hardware implementation of the HDR technique that shows more details both in dark and bright areas of a scene is an important line of research. For this purpose, we built a dedicated smart camera that performs both capturing and HDR video processing from three exposures. What is new in our work is shown through the following points: HDR video capture through multiple exposure control, HDR memory management, HDR frame generation, and representation under a hardware context. Our camera achieves a real-time HDR video output at 60 fps at 1.3 megapixels and demonstrates the efficiency of our technique through an experimental result. Applications of this HDR smart camera include the movie industry, the mass-consumer market, military, automotive industry, and surveillance.
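Fusing multiple exposures into one HDR value is commonly done with a weighting that trusts mid-range pixels most, since saturated and near-dark pixels carry little information. The sketch below illustrates that general idea for a single pixel under a linear-response assumption; it is not the camera's actual hardware pipeline, and the hat weighting is a common generic choice rather than the paper's method:

```python
# Sketch of fusing three exposures of one pixel into a radiance estimate,
# in the spirit of multiple-exposure HDR capture. Weights favour mid-range
# values (a triangle "hat" function); saturated or dark pixels contribute
# little. A linear sensor response is assumed for simplicity.

def hat_weight(z, z_min=0, z_max=255):
    """Triangle weight: largest at mid-scale, zero at the extremes."""
    mid = 0.5 * (z_min + z_max)
    return (z - z_min) if z <= mid else (z_max - z)

def merge_pixel(values, exposure_times):
    """Weighted average of the per-exposure radiance estimates value / t."""
    num = den = 0.0
    for z, t in zip(values, exposure_times):
        w = hat_weight(z)
        num += w * (z / t)
        den += w
    return num / den if den else 0.0

# The same scene radiance seen through exposures of 1, 4 and 16 ms:
print(merge_pixel([10, 40, 160], [1.0, 4.0, 16.0]))  # all three agree on ~10
```

Each exposure votes for the radiance it implies, and the weighting lets the well-exposed samples dominate, which is why the merged frame shows detail in both dark and bright regions.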

  14. Surgical video recording with a modified GoPro Hero 4 camera

    PubMed Central

    Lin, Lily Koo

    2016-01-01

Background Surgical videography can provide analytical self-examination for the surgeon, teaching opportunities for trainees, and allow for surgical case presentations. This study examined whether a modified GoPro Hero 4 camera with a 25 mm lens could prove to be a cost-effective method of surgical videography with enough detail for oculoplastic and strabismus surgery. Method The stock lens mount and lens were removed from a GoPro Hero 4 camera, and the camera was refitted with a Peau Productions SuperMount and 25 mm lens. The modified GoPro Hero 4 camera was then fixed to an overhead surgical light. Results Camera settings were set to 1080p video resolution. The 25 mm lens allowed for nine times the magnification of the GoPro stock lens. There was no noticeable video distortion. The entire cost was less than 600 USD. Conclusion The adapted GoPro Hero 4 with a 25 mm lens allows for high-definition, cost-effective, portable video capture of oculoplastic and strabismus surgery. The 25 mm lens allows for detailed videography that can enhance surgical teaching and self-examination. PMID:26834455

  15. Highly flexible and Internet-programmable CCD camera with a frequency-selectable read-out for imaging and spectroscopy applications

    NASA Astrophysics Data System (ADS)

    Gori, Luca; Pace, Emanuele; Tommasi, Leonardo; Sarocchi, D.; Bagnoli, V.; Sozzi, M.; Puri, S.

    2001-12-01

A new-concept CCD camera is currently being realized at the XUV Lab of the Department of Astronomy and Space Science of the University of Florence. The main features we aim for are a high level of versatility and a fast pixel rate. Within this project, a versatile CCD sequencer has been realized with interesting and innovative features. Based on a microcontroller and complex programmable logic devices (CPLDs), it allows the selection of all the parameters related to charge transfer and CCD readout (number, duration and overlapping of serial and parallel transfer clocks, number of output nodes, pixel transfer rate), and therefore it allows the use of virtually any CCD sensor. Compared to a common DSP-based sequencer, it is immune to jitter noise and it can also reach pixel rates greater than 40 MHz. The software interface is based on LabVIEW 6i and will allow both local and remote control and display. Furthermore, it will be possible to debug the system remotely and to upgrade the LabVIEW interface itself, the microcontroller resident program, and the CPLD-implemented schemes.

  16. Video and acoustic camera techniques for studying fish under ice: a review and comparison

    SciTech Connect

    Mueller, Robert P.; Brown, Richard S.; Hop, Haakon H.; Moulton, Larry

    2006-09-05

Researchers attempting to study the presence, abundance, size, and behavior of fish species in northern and arctic climates during winter face many challenges, including the presence of thick ice cover, snow cover, and, sometimes, extremely low temperatures. This paper describes and compares the use of video and acoustic cameras for determining fish presence and behavior in lakes, rivers, and streams with ice cover. Methods are provided for determining fish density and size, identifying species, and measuring swimming speed, and successful applications from previous surveys of fish under the ice are described. These methods include drilling ice holes, selecting batteries and generators, deploying pan-and-tilt cameras, and using paired colored lasers to determine fish size and habitat associations. We also discuss the use of infrared and white light to enhance image-capturing capabilities, the deployment of digital recording systems and time-lapse techniques, and the use of imaging software. Data are presented from initial surveys with video and acoustic cameras in the Sagavanirktok River Delta, Alaska, during late winter 2004. These surveys represent the first known successful application of a dual-frequency identification sonar (DIDSON) acoustic camera under the ice that achieved fish detection and sizing at camera ranges up to 16 m. Feasibility tests of video and acoustic cameras for determining fish size and density at various turbidity levels are also presented. Comparisons are made of the different techniques in terms of suitability for achieving various fisheries research objectives. This information is intended to assist researchers in choosing the equipment that best meets their study needs.

  17. A novel method to reduce time investment when processing videos from camera trap studies.

    PubMed

    Swinnen, Kristijn R R; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species, but instead empty recordings or other species (together non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch the recordings, in order to reduce workload. Discrimination between recordings of target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we supposed that recordings with the target species contain on average much more movements than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step in the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs. PMID:24918777

  18. A Novel Method to Reduce Time Investment When Processing Videos from Camera Trap Studies

    PubMed Central

    Swinnen, Kristijn R. R.; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species, but instead empty recordings or other species (together non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch the recordings, in order to reduce workload. Discrimination between recordings of target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we supposed that recordings with the target species contain on average much more movements than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step in the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs. PMID:24918777
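The core of the filtering idea in the two records above — score each clip by its frame-to-frame pixel variation and discard low-motion clips as likely non-target recordings — can be sketched in a few lines. The frame layout and the threshold below are our own illustrative choices, not the paper's calibrated values:

```python
# Sketch of a pixel-variation filter for camera-trap clips: clips whose
# mean absolute frame-to-frame pixel difference falls below a threshold
# are flagged as probable non-target recordings and can be skipped.
# Frames are flat lists of grey levels; the threshold is illustrative.

def motion_score(frames):
    """Mean absolute frame-to-frame pixel difference across a clip."""
    diffs = []
    for prev, cur in zip(frames, frames[1:]):
        diffs.append(sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur))
    return sum(diffs) / len(diffs) if diffs else 0.0

def keep_clip(frames, threshold=5.0):
    """Flag clips worth watching; everything below threshold is skippable."""
    return motion_score(frames) >= threshold

beaver = [[0, 0, 0, 0], [0, 90, 90, 0], [90, 90, 0, 0]]            # large moving blob
empty = [[10, 10, 10, 10], [11, 10, 10, 10], [10, 10, 10, 11]]     # sensor noise only
print(keep_clip(beaver), keep_clip(empty))
```

The trade-off the paper quantifies lives in the threshold: raising it discards more empty clips but also loses some genuine target recordings, hence the reported 5% to 20% accepted loss.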

  19. A passive terahertz video camera based on lumped element kinetic inductance detectors

    NASA Astrophysics Data System (ADS)

    Rowe, Sam; Pascale, Enzo; Doyle, Simon; Dunscombe, Chris; Hargrave, Peter; Papageorgio, Andreas; Wood, Ken; Ade, Peter A. R.; Barry, Peter; Bideaud, Aurélien; Brien, Tom; Dodd, Chris; Grainger, William; House, Julian; Mauskopf, Philip; Moseley, Paul; Spencer, Locke; Sudiwala, Rashmi; Tucker, Carole; Walker, Ian

    2016-03-01

We have developed a passive 350 GHz (850 μm) video camera to demonstrate lumped element kinetic inductance detectors (LEKIDs)—designed originally for far-infrared astronomy—as an option for general-purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ~0.1 K, which is close to the background limit. The 152-element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low-noise amplifier and room-temperature frequency-domain multiplexing electronics.

  20. A digital underwater video camera system for aquatic research in regulated rivers

    USGS Publications Warehouse

    Martin, Benjamin M.; Irwin, Elise R.

    2010-01-01

    We designed a digital underwater video camera system to monitor nesting centrarchid behavior in the Tallapoosa River, Alabama, 20 km below a peaking hydropower dam with a highly variable flow regime. Major components of the system included a digital video recorder, multiple underwater cameras, and specially fabricated substrate stakes. The innovative design of the substrate stakes allowed us to effectively observe nesting redbreast sunfish Lepomis auritus in a highly regulated river. Substrate stakes, which were constructed for the specific substratum complex (i.e., sand, gravel, and cobble) identified at our study site, were able to withstand a discharge level of approximately 300 m3/s and allowed us to simultaneously record 10 active nests before and during water releases from the dam. We believe our technique will be valuable for other researchers that work in regulated rivers to quantify behavior of aquatic fauna in response to a discharge disturbance.

  1. Development of a compact fast CCD camera and resonant soft x-ray scattering endstation for time-resolved pump-probe experiments

    NASA Astrophysics Data System (ADS)

    Doering, D.; Chuang, Y.-D.; Andresen, N.; Chow, K.; Contarato, D.; Cummings, C.; Domning, E.; Joseph, J.; Pepper, J. S.; Smith, B.; Zizka, G.; Ford, C.; Lee, W. S.; Weaver, M.; Patthey, L.; Weizeorick, J.; Hussain, Z.; Denes, P.

    2011-07-01

The designs of a compact, fast CCD (cFCCD) camera, together with a resonant soft x-ray scattering endstation, are presented. The cFCCD camera consists of a highly parallel, custom, thick, high-resistivity CCD, read out by a custom 16-channel application-specific integrated circuit to reach the maximum readout rate of 200 frames per second. The camera is mounted on a virtual-axis flip stage inside the RSXS chamber. When this flip stage is coupled to a differentially pumped rotary seal, the detector assembly can rotate about 100°/360° in the vertical/horizontal scattering planes. With a six-degrees-of-freedom cryogenic sample goniometer, this endstation has the capability to detect the superlattice reflections from the electronic orderings showing up in the lower hemisphere. The complete system has been tested at the Advanced Light Source, Lawrence Berkeley National Laboratory, and has been used in multiple experiments at the Linac Coherent Light Source, SLAC National Accelerator Laboratory.

  2. Development of a compact fast CCD camera and resonant soft x-ray scattering endstation for time-resolved pump-probe experiments.

    PubMed

    Doering, D; Chuang, Y-D; Andresen, N; Chow, K; Contarato, D; Cummings, C; Domning, E; Joseph, J; Pepper, J S; Smith, B; Zizka, G; Ford, C; Lee, W S; Weaver, M; Patthey, L; Weizeorick, J; Hussain, Z; Denes, P

    2011-07-01

The designs of a compact, fast CCD (cFCCD) camera, together with a resonant soft x-ray scattering endstation, are presented. The cFCCD camera consists of a highly parallel, custom, thick, high-resistivity CCD, read out by a custom 16-channel application-specific integrated circuit to reach the maximum readout rate of 200 frames per second. The camera is mounted on a virtual-axis flip stage inside the RSXS chamber. When this flip stage is coupled to a differentially pumped rotary seal, the detector assembly can rotate about 100°/360° in the vertical/horizontal scattering planes. With a six-degrees-of-freedom cryogenic sample goniometer, this endstation has the capability to detect the superlattice reflections from the electronic orderings showing up in the lower hemisphere. The complete system has been tested at the Advanced Light Source, Lawrence Berkeley National Laboratory, and has been used in multiple experiments at the Linac Coherent Light Source, SLAC National Accelerator Laboratory. PMID:21806178

  3. Operation and maintenance manual for the high resolution stereoscopic video camera system (HRSVS) system 6230

    SciTech Connect

    Pardini, A.F., Westinghouse Hanford

    1996-07-16

The High Resolution Stereoscopic Video Camera System (HRSVS), system 6230, is a stereoscopic camera system that will be used as an end effector on the LDUA to perform surveillance and inspection activities within Hanford waste tanks. It is attached to the LDUA by means of a Tool Interface Plate (TIP), which provides a feed-through for all electrical and pneumatic utilities needed by the end effector to operate.

  4. Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge coupled device. The camera consists of a X-ray sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.

  5. Evaluation of lens distortion errors using an underwater camera system for video-based motion analysis

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.

    1994-01-01

Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. This data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
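The way distortion error grows toward the edge of a wide-angle lens can be illustrated with the standard one-term radial model r' = r(1 + k·r²). The coefficient below is invented to roughly reproduce the reported 8 percent worst case; it is not a value measured from the WETF cameras:

```python
# Illustration of radial lens distortion: position error as a percentage
# of the true radius under the single-coefficient model r' = r(1 + k r^2).
# k = 0.08 is an invented coefficient chosen so the corner error is ~8%.

def radial_error_pct(r_norm, k=0.08):
    """Percent position error at normalized radius r_norm
    (0 = image center, 1 = image corner)."""
    if r_norm == 0.0:
        return 0.0
    r_distorted = r_norm * (1.0 + k * r_norm ** 2)
    return 100.0 * abs(r_distorted - r_norm) / r_norm

for r in (0.25, 0.5, 0.75, 1.0):
    print(f"r = {r:4.2f}  error = {radial_error_pct(r):.1f}%")
```

Because the error grows with the square of the radius, restricting measurements to the central region of the frame, as the study recommends, cuts the error sharply: half the radius means a quarter of the percentage error.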

  6. Precise color images: a high-speed color video camera system with three intensified sensors

    NASA Astrophysics Data System (ADS)

    Oki, Sachio; Yamakawa, Masafumi; Gohda, Susumu; Etoh, Takeharu G.

    1999-06-01

High-speed imaging systems are used in many fields of science and engineering. Although high-speed camera systems have been improved to high performance, most of their applications only capture high-speed motion pictures. However, in some fields of science and technology, it is useful to obtain other information as well, such as the temperature of combustion flames, thermal plasmas and molten materials. Recent digital high-speed video imaging technology should be able to extract such information from those objects. For this purpose, we have already developed a high-speed video camera system with three intensified sensors and a cubic-prism image splitter. The maximum frame rate is 40,500 pps (pictures per second) at 64 x 64 pixels and 4,500 pps at 256 x 256 pixels, with 256-level (8 bit) intensity resolution for each pixel. The camera system can store more than 1,000 pictures continuously in solid-state memory. In order to obtain precise color images from this camera system, we need to develop a digital technique, consisting of a computer program and ancillary instruments, to adjust the displacement of images taken from the two or three image sensors and to calibrate the relationship between incident light intensity and the corresponding digital output signals. In this paper, the digital technique for pixel-based displacement adjustment is proposed. Although the displacement of the corresponding circle was more than 8 pixels in the original image, the displacement was adjusted to within 0.2 pixels at most by this method.

  7. Design and evaluation of controls for drift, video gain, and color balance in spaceborne facsimile cameras

    NASA Technical Reports Server (NTRS)

    Katzberg, S. J.; Kelly, W. L., IV; Rowland, C. W.; Burcher, E. E.

    1973-01-01

    The facsimile camera is an optical-mechanical scanning device which has become an attractive candidate as an imaging system for planetary landers and rovers. This paper presents electronic techniques which permit the acquisition and reconstruction of high quality images with this device, even under varying lighting conditions. These techniques include a control for low frequency noise and drift, an automatic gain control, a pulse-duration light modulation scheme, and a relative spectral gain control. Taken together, these techniques allow the reconstruction of radiometrically accurate and properly balanced color images from facsimile camera video data. These techniques have been incorporated into a facsimile camera and reproduction system, and experimental results are presented for each technique and for the complete system.

  8. Structural analysis of color video camera installation on tank 241AW101 (2 Volumes)

    SciTech Connect

    Strehlow, J.P.

    1994-08-24

    A video camera is planned to be installed on the radioactive storage tank 241AW101 at the DOE's Hanford Site in Richland, Washington. The camera will occupy the 20 inch port of the Multiport Flange riser which is to be installed on riser 5B of the 241AW101 (3,5,10). The objective of the project reported herein was to perform a seismic analysis and evaluation of the structural components of the camera for a postulated Design Basis Earthquake (DBE) per the reference Structural Design Specification (SDS) document (6). The details of the supporting engineering calculations are documented in URS/Blume Calculation No. 66481-01-CA-03 (1).

  9. Video camera observation for assessing overland flow patterns during rainfall events

    NASA Astrophysics Data System (ADS)

    Silasari, Rasmiaditya; Oismüller, Markus; Blöschl, Günter

    2015-04-01

    Physically based hydrological models have been widely used to model overland flow propagation in cases such as flood inundation and dam break flow. The capability of such models to simulate the formation of overland flow by spatial and temporal discretization of the empirical equations makes it possible for hydrologists to trace overland flow generation both spatially and temporally across surface and subsurface domains. As the upscaling methods that transform hydrological process spatial patterns from the small observed scale to the larger catchment scale are still being progressively developed, physically based hydrological models have become a convenient tool to assess these patterns and their behavior, which are crucial in determining the upscaling process. Related studies in the past have successfully used these models and utilized field observation data for model verification. The observation data commonly used for this verification are overland flow discharge during natural rainfall events and camera observations during synthetic events (staged field experiments), while the use of camera observations during natural events is hardly discussed in publications. This study explores the potential of video camera observations of overland flow generation during natural rainfall events to support the verification of physically based hydrological models and the assessment of overland flow spatial patterns. The study is conducted within a 64 ha catchment located at Petzenkirchen, Lower Austria, known as HOAL (Hydrological Open Air Laboratory). The catchment land cover is dominated by arable land (87%), with small portions (13%) of forest, pasture and paved surfaces. A 600 m stream runs through the southeast of the catchment, flowing southward, and is equipped with flumes and pressure transducers that measure water level at one-minute intervals at various inlets along the stream (i.e. drainages, surface runoff, springs), from which flow discharge is calculated.
A video camera with 10x optical zoom is installed 7 m above the ground in the middle of the catchment, overlooking the west hillslope area of the stream. Images are taken every minute during daylight, while video recording is triggered by raindrop movement. The observed images and videos are analyzed in accordance with the overland flow signals captured by the assigned pressure transducers and the rainfall intensities measured by four rain gauges across the catchment. The results show that the video camera observations enable us to assess the spatial and temporal development of overland flow generation during natural events, showing potential for use in model verification as well as in spatial pattern analysis.

  10. A New Remote Sensing Filter Radiometer Employing a Fabry-Perot Etalon and a CCD Camera for Column Measurements of Methane in the Earth Atmosphere

    NASA Technical Reports Server (NTRS)

    Georgieva, E. M.; Huang, W.; Heaps, W. S.

    2012-01-01

    A portable remote sensing system for precision column measurements of methane has been developed, built and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera, and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions with an uncooled InGaAs detector array sensitive up to 1.7 micrometers. Custom software was developed, in addition to the basic graphical user interface X-Control provided by the company, to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes of the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.
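For context, the transmission of an idealized lossless air-gapped Fabry-Perot etalon follows the Airy function; the mirror reflectance and gap values below are assumptions for illustration, not the instrument's actual design parameters:

```python
import math

def etalon_transmission(wavelength_um, gap_um, reflectance, n=1.0):
    """Airy transmission of an ideal lossless etalon at normal incidence."""
    delta = 4.0 * math.pi * n * gap_um / wavelength_um  # round-trip phase
    F = 4.0 * reflectance / (1.0 - reflectance) ** 2    # coefficient of finesse
    return 1.0 / (1.0 + F * math.sin(delta / 2.0) ** 2)

# Transmission peaks occur where the gap is an integer number of
# half-wavelengths; e.g. near the 1.636-1.646 um methane band:
peak = etalon_transmission(1.640, 820.0, 0.9)     # 820 um = 1000 * lambda/2
trough = etalon_transmission(1.640, 820.41, 0.9)  # quarter-wave off the peak
print(round(peak, 6), round(trough, 4))
```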

  11. Compact full-motion video hyperspectral cameras: development, image processing, and applications

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.

    2015-10-01

    The emergence of spectral pixel-level color filters has enabled the development of hyperspectral Full Motion Video (FMV) sensors operating at visible (EO) and infrared (IR) wavelengths. This new class of hyperspectral cameras opens broad possibilities for military and industrial use. Indeed, such cameras are able to classify materials as well as detect and track spectral signatures continuously in real time while simultaneously providing an operator the benefit of enhanced-discrimination color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation, which provides essential spectral content analysis, e.g. detection or classification. The second is presentation of the video to an operator, which can offer the best display of the content depending on the task performed, e.g. providing spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel or can utilize each other's results. Spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally-sampled spectral bands has scarcely been explored. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several of its concepts of operation, including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to combined multi-frame and multi-band processing.
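As a toy illustration of the mosaic-array geometry only (not the paper's multi-band super-resolution method), a 3x3 (9-band) filter array can be demosaicked by nearest-sample replication: each pixel of the mosaic samples exactly one band, and every band plane is filled from the nearest mosaic pixel carrying that band.

```python
def demosaic_nearest(mosaic, n=3):
    """Fill each of the n*n band planes from the nearest sample of that band."""
    h, w = len(mosaic), len(mosaic[0])
    planes = []
    for b in range(n * n):
        by, bx = divmod(b, n)  # position of band b within the n x n super-pixel
        plane = [[0] * w for _ in range(h)]
        for y in range(h):
            # nearest sampled row congruent to by (mod n), clamped to the image
            sy = min(max((y - by) // n * n + by, by), h - 1)
            for x in range(w):
                sx = min(max((x - bx) // n * n + bx, bx), w - 1)
                plane[y][x] = mosaic[sy][sx]
        planes.append(plane)
    return planes

# Synthetic check: if every mosaic pixel's value equals its band index,
# each reconstructed plane should be constant and equal to that index.
mosaic = [[(y % 3) * 3 + (x % 3) for x in range(6)] for y in range(6)]
planes = demosaic_nearest(mosaic)
print(planes[4][0][0], planes[8][5][5])
```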

  12. Holographic combiner design to obtain uniform symbol brightness at head-up display (HUD) video camera

    NASA Astrophysics Data System (ADS)

    Battey, David E.; Melzer, James E.

    1989-03-01

    A typical head-up display (HUD) system incorporates a video camera for recording the HUD symbology and the scene outside the cockpit. When using a HUD video camera (HVC) with a zero-power holographic combiner, the brightness of the HUD symbology seen by the camera changes significantly as a function of vertical field angle because the holographic combiner's reflectance characteristics are angularly sensitive and optimized for the pilot's eye position. A holographic combiner design is presented that overcomes this problem while simultaneously maintaining high reflectance of the phosphor's light to the pilot and high visual transmittance. The combiner contains an additional holographic layer tuned to the blue emission of the P53 phosphor as viewed from the HVC, taking advantage of the HVC's high sensitivity in the blue. The reflectance of the additional hologram is tapered to achieve minimum brightness variation at the HVC. The response of the additional hologram as viewed by the pilot shifts towards the ultra-violet and is thus nearly invisible. Theoretical and measured performance of the combiner are presented.

  13. A compact high-definition low-cost digital stereoscopic video camera for rapid robotic surgery development.

    PubMed

    Carlson, Jay; Kowalczuk, Jędrzej; Psota, Eric; Pérez, Lance C

    2012-01-01

    Robotic surgical platforms require vision feedback systems, which often consist of low-resolution, expensive, single-imager analog cameras. These systems are retooled for 3D display by simply doubling the cameras and outboard control units. Here, a fully-integrated digital stereoscopic video camera employing high-definition sensors and a class-compliant USB video interface is presented. This system can be used with low-cost PC hardware and consumer-level 3D displays for tele-medical surgical applications including military medical support, disaster relief, and space exploration. PMID:22356964

  14. A Semantic Autonomous Video Surveillance System for Dense Camera Networks in Smart Cities

    PubMed Central

    Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio

    2012-01-01

    This paper presents a proposal of an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed on the system, and therefore making it suitable for its usage as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, and is performed using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language easy to understand for human operators, capable of raising enriched alarms with descriptions of what is happening on the image, and to automate reactions to them such as alerting the appropriate emergency services using the Smart City safety network. PMID:23112607

  15. A semantic autonomous video surveillance system for dense camera networks in Smart Cities.

    PubMed

    Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio

    2012-01-01

    This paper presents a proposal of an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed on the system, and therefore making it suitable for its usage as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, and is performed using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language easy to understand for human operators, capable of raising enriched alarms with descriptions of what is happening on the image, and to automate reactions to them such as alerting the appropriate emergency services using the Smart City safety network. PMID:23112607

  16. VideoWeb Dataset for Multi-camera Activities and Non-verbal Communication

    NASA Astrophysics Data System (ADS)

    Denina, Giovanni; Bhanu, Bir; Nguyen, Hoang Thanh; Ding, Chong; Kamal, Ahmed; Ravishankar, Chinya; Roy-Chowdhury, Amit; Ivers, Allen; Varda, Brenda

    Human-activity recognition is one of the most challenging problems in computer vision. Researchers from around the world have tried to solve this problem and have come a long way in recognizing simple motions and atomic activities. As the computer vision community heads toward fully recognizing human activities, a challenging and labeled dataset is needed. To respond to that need, we collected a dataset of realistic scenarios in a multi-camera network environment (VideoWeb) involving multiple persons performing dozens of different repetitive and non-repetitive activities. This chapter describes the details of the dataset. We believe that this VideoWeb Activities dataset is unique and it is one of the most challenging datasets available today. The dataset is publicly available online at http://vwdata.ee.ucr.edu/ along with the data annotation.

  17. Acute gastroenteritis and video camera surveillance: a cruise ship case report.

    PubMed

    Diskin, Arthur L; Caro, Gina M; Dahl, Eilif

    2014-01-01

    A 'faecal accident' was discovered in front of a passenger cabin of a cruise ship. After proper cleaning of the area the passenger was approached, but denied having any gastrointestinal symptoms. However, when confronted with surveillance camera evidence, she admitted having the accident and even bringing the towel stained with diarrhoea back to the pool towel bin. She was isolated until the next port, where she was disembarked. Acute gastroenteritis (AGE) caused by Norovirus is very contagious and easily transmitted from person to person on cruise ships. The main purpose of isolation is to avoid public vomiting and faecal accidents. To quickly identify and isolate contagious passengers and crew and ensure their compliance are key elements in outbreak prevention and control, but this is difficult if ill persons deny symptoms. All passenger ships visiting US ports now have surveillance video cameras, which under certain circumstances can assist in finding potential index cases for AGE outbreaks. PMID:24677123

  18. System design description for the LDUA high resolution stereoscopic video camera system (HRSVS)

    SciTech Connect

    Pardini, A.F.

    1998-01-27

    The High Resolution Stereoscopic Video Camera System (HRSVS), system 6230, was designed to be used as an end effector on the LDUA to perform surveillance and inspection activities within a waste tank. It is attached to the LDUA by means of a Tool Interface Plate (TIP), which provides a feed-through for all electrical and pneumatic utilities needed by the end effector to operate. Designed to perform up-close weld and corrosion inspection roles in UST operations, the HRSVS will support and supplement the Light Duty Utility Arm (LDUA) and provide the crucial inspection tasks needed to ascertain waste tank condition.

  19. Fresnel hologram generation using an HD resolution depth range video camera

    NASA Astrophysics Data System (ADS)

    Oi, Ryutaro; Mishina, Tomoyuki; Yamamoto, Kenji; Senoh, Takanori; Kurita, Taiichiro

    2010-02-01

    Holography is considered an ideal 3D display method. We generated a hologram under white light. The infrared depth camera we used captures the depth information as well as color video of the scene with an accuracy of 20 mm at an object distance of 2 m. In this research, we developed a software converter to convert the HD resolution depth map to a hologram. In this conversion method, each elemental diffraction pattern on the hologram plane was calculated beforehand according to the object distance and the maximum diffraction angle determined by the reconstruction SLM device (a high resolution LCOS). The reconstructed 3D image was observed.
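The precomputation of elemental diffraction patterns can be sketched as the Fresnel zone pattern of an on-axis point source sampled on the hologram plane; the pixel pitch and wavelength below are assumed values typical of an LCOS SLM, not the paper's actual parameters:

```python
import math, cmath

def elemental_pattern(size, pitch_um, wavelength_um, z_um):
    """Complex Fresnel pattern (paraxial approximation) of an on-axis
    point source at distance z, sampled on a size x size pixel grid."""
    c = size // 2
    pat = [[0j] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            r2 = ((x - c) * pitch_um) ** 2 + ((y - c) * pitch_um) ** 2
            phase = math.pi * r2 / (wavelength_um * z_um)  # Fresnel phase
            pat[y][x] = cmath.exp(1j * phase)
    return pat

# Assumed parameters: 8 um pixel pitch, 532 nm light, object at z = 2 m.
pat = elemental_pattern(64, 8.0, 0.532, 2e6)
print(abs(pat[32][32]))  # on-axis sample: unit magnitude, zero phase
```

One such pattern per quantized object distance can be cached and accumulated into the hologram for every depth-map pixel, which is the precomputation idea the abstract describes.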

  20. Scalable software architecture for on-line multi-camera video processing

    NASA Astrophysics Data System (ADS)

    Camplani, Massimo; Salgado, Luis

    2011-03-01

    In this paper we present a scalable software architecture for on-line multi-camera video processing that guarantees a good trade-off between computational power, scalability and flexibility. The software system is modular and its main blocks are the Processing Units (PUs) and the Central Unit. The Central Unit works as a supervisor of the running PUs, and each PU manages the acquisition phase and the processing phase. Furthermore, an approach to easily parallelize the desired processing application is presented. In this paper, as a case study, we apply the proposed software architecture to a multi-camera system in order to efficiently manage multiple 2D object detection modules in a real-time scenario. System performance has been evaluated under different load conditions, such as the number of cameras and image sizes. The results show that the software architecture scales well with the number of cameras and can easily work with different image formats while respecting the real-time constraints. Moreover, the parallelization approach can be used to speed up the processing tasks with a low level of overhead.
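The PU/Central Unit split can be sketched with one worker thread per camera feed and a supervising unit that collects results through a shared queue; the class names and the stand-in "processing" step are illustrative assumptions, not the paper's implementation:

```python
import queue
import threading

class ProcessingUnit(threading.Thread):
    """One PU per camera: acquisition phase followed by a processing phase."""
    def __init__(self, camera_id, frames, results):
        super().__init__()
        self.camera_id, self.frames, self.results = camera_id, frames, results

    def run(self):
        for frame in self.frames:         # acquisition phase (simulated feed)
            detections = sum(frame)       # stand-in for 2D object detection
            self.results.put((self.camera_id, detections))

class CentralUnit:
    """Supervisor: starts the PUs, waits for them, gathers their results."""
    def __init__(self, feeds):
        self.results = queue.Queue()
        self.units = [ProcessingUnit(i, f, self.results)
                      for i, f in enumerate(feeds)]

    def run(self):
        for u in self.units:
            u.start()
        for u in self.units:
            u.join()
        out = {}
        while not self.results.empty():
            cam, det = self.results.get()
            out.setdefault(cam, []).append(det)
        return out

# Two simulated camera feeds of three frames each:
feeds = [[[1, 2], [3, 4], [5, 6]], [[0, 1], [1, 1], [2, 2]]]
print(CentralUnit(feeds).run())
```

Per-camera ordering is preserved because each feed is handled by a single thread; scaling to more cameras just means more PU threads (or processes) under the same supervisor.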

  1. Mounted Video Camera Captures Launch of STS-112, Shuttle Orbiter Atlantis

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A color video camera mounted to the top of the External Tank (ET) provided this spectacular never-before-seen view of the STS-112 mission as the Space Shuttle Orbiter Atlantis lifted off in the afternoon of October 7, 2002. The camera provided views as the orbiter began its ascent until it reached near-orbital speed, about 56 miles above the Earth, including a view of the front and belly of the orbiter, a portion of the Solid Rocket Booster, and ET. The video was downlinked during flight to several NASA data-receiving sites, offering the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. Atlantis carried the S1 Integrated Truss Structure and the Crew and Equipment Translation Aid (CETA) Cart. The CETA is the first of two human-powered carts that will ride along the International Space Station's railway providing a mobile work platform for future extravehicular activities by astronauts. Landing on October 18, 2002, the Orbiter Atlantis ended its 11-day mission.

  2. Mounted Video Camera Captures Launch of STS-112, Shuttle Orbiter Atlantis

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A color video camera mounted to the top of the External Tank (ET) provided this spectacular never-before-seen view of the STS-112 mission as the Space Shuttle Orbiter Atlantis lifted off in the afternoon of October 7, 2002. The camera provided views as the orbiter began its ascent until it reached near-orbital speed, about 56 miles above the Earth, including a view of the front and belly of the orbiter, a portion of the Solid Rocket Booster, and ET. The video was downlinked during flight to several NASA data-receiving sites, offering the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. Atlantis carried the S1 Integrated Truss Structure and the Crew and Equipment Translation Aid (CETA) Cart. The CETA is the first of two human-powered carts that will ride along the International Space Station's railway providing a mobile work platform for future extravehicular activities by astronauts. Landing on October 18, 2002, the Orbiter Atlantis ended its 11-day mission.

  3. A Simple Method Based on the Application of a CCD Camera as a Sensor to Detect Low Concentrations of Barium Sulfate in Suspension

    PubMed Central

    de Sena, Rodrigo Caciano; Soares, Matheus; Pereira, Maria Luiza Oliveira; da Silva, Rogério Cruz Domingues; do Rosário, Francisca Ferreira; da Silva, João Francisco Cajaiba

    2011-01-01

    The development of a simple, rapid and low cost method based on video image analysis and aimed at the detection of low concentrations of precipitated barium sulfate is described. The proposed system is basically composed of a webcam with a CCD sensor and a conventional dichroic lamp. For this purpose, software for processing and analyzing the digital images based on the RGB (Red, Green and Blue) color system was developed. The proposed method showed very good repeatability and linearity and also presented higher sensitivity than the standard turbidimetric method. The developed method is presented as a simple alternative for future applications in the study of the precipitation of inorganic salts and also for detecting the crystallization of organic compounds. PMID:22346607
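The image-analysis idea reduces to averaging the RGB intensities over a region of interest and fitting a calibration line against concentration; the numbers below are invented for illustration and are not the paper's calibration data:

```python
def mean_intensity(pixels):
    """Mean of (R+G+B)/3 over an iterable of RGB tuples."""
    vals = [(r + g + b) / 3.0 for r, g, b in pixels]
    return sum(vals) / len(vals)

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented calibration: light scattered by the suspended BaSO4 brightens
# the region of interest roughly linearly with concentration.
conc = [0.0, 5.0, 10.0, 20.0]          # mg/L
intensity = [10.0, 30.0, 50.0, 90.0]   # mean ROI intensity (8-bit counts)
slope, intercept = fit_line(conc, intensity)
print(slope, intercept)  # → 4.0 10.0
```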

  4. Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras

    USGS Publications Warehouse

    Harris, A.J.L.; Thornber, C.R.

    1999-01-01

    GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatically alerting major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
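The spike criterion used above, samples elevated more than two standard deviations above the mean radiance, can be sketched directly; the time series below is synthetic, not GOES data:

```python
from statistics import mean, pstdev

def find_spikes(series, k=2.0):
    """Indices of samples exceeding the series mean by more than k sigma."""
    m, s = mean(series), pstdev(series)
    return [i for i, v in enumerate(series) if v > m + k * s]

# Synthetic radiance series with two short bursts; only the larger burst
# clears the 2-sigma threshold here, since the bursts inflate the deviation.
radiance = [1.0, 1.1, 0.9, 1.0, 5.0, 1.0, 1.1, 0.9, 6.0, 1.0]
print(find_spikes(radiance))
```

In practice such a detector would run on the near-real-time radiance stream, with flagged indices handed off for confirmation against video and ground-observer reports as described above.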

  5. Calibration grooming and alignment for LDUA High Resolution Stereoscopic Video Camera System (HRSVS)

    SciTech Connect

    Pardini, A.F.

    1998-01-27

    The High Resolution Stereoscopic Video Camera System (HRSVS) was designed by the Savannah River Technology Center (SRTC) to provide routine and troubleshooting views of tank interiors during characterization and remediation phases of underground storage tank (UST) processing. The HRSVS is a dual color camera system designed to provide stereo viewing of the interior of the tanks, including the tank wall, in a Class 1, Division 1, flammable atmosphere. The HRSVS was designed with a modular philosophy for easy maintenance and configuration modifications. During operation of the system with the LDUA, the control of the camera system will be performed by the LDUA supervisory data acquisition system (SDAS). Video and control status will be displayed on monitors within the LDUA control center. All control functions are accessible from the front panel of the control box located within the Operations Control Trailer (OCT). The LDUA will provide all positioning functions within the waste tank for the end effector. Various electronic measurement instruments will be used to perform CG and A activities. The instruments may include a digital volt meter, oscilloscope, signal generator, and other electronic repair equipment. None of these instruments will need to be calibrated beyond what comes from the manufacturer. During CG and A, a temperature indicating device will be used to measure the temperature of the outside of the HRSVS from initial startup until the temperature has stabilized. This device will not need to be in calibration during CG and A but will have to have a current calibration sticker from the Standards Laboratory during any acceptance testing.

  6. A stroboscopic technique for using CCD cameras in flow visualization systems for continuous viewing and stop action photography

    NASA Technical Reports Server (NTRS)

    Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.

    1992-01-01

    A technique for synchronizing a pulse light source to charge coupled device cameras is presented. The technique permits the use of pulse light sources for continuous as well as stop action flow visualization. The technique has eliminated the need to provide separate lighting systems at facilities requiring continuous and stop action viewing or photography.

  7. Visual fatigue modeling for stereoscopic video shot based on camera motion

    NASA Astrophysics Data System (ADS)

    Shi, Guozhong; Sang, Xinzhu; Yu, Xunbo; Liu, Yangdong; Liu, Jing

    2014-11-01

    As three-dimensional television (3-DTV) and 3-D movies become popular, visual discomfort limits further applications of 3D display technology. The causes of visual discomfort from stereoscopic video include conflicts between accommodation and convergence, excessive binocular parallax, fast motion of objects, and so on. Here, a novel method for evaluating visual fatigue is demonstrated. Influence factors including spatial structure, motion scale and the comfortable viewing zone are analyzed. According to the human visual system (HVS), people only need to converge their eyes to specific objects when the camera and background are static. Relative motion should be considered for different camera conditions, which determine different factor coefficients and weights. Compared with the traditional visual fatigue prediction model, a novel visual fatigue prediction model is presented. The visual fatigue degree is predicted using a multiple linear regression method combined with subjective evaluation. Consequently, each factor can reflect the characteristics of the scene, and the total visual fatigue score can be computed according to the proposed algorithm. Compared with conventional algorithms that ignore the status of the camera, our approach exhibits reliable performance in terms of correlation with subjective test results.
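The prediction step, a multiple linear regression of the fatigue score on per-shot factors, can be sketched as follows; the factor names, data and resulting coefficients are invented for illustration, not the paper's fitted model:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small linear systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def fit_mlr(X, y):
    """Least squares via the normal equations X'X w = X'y (with intercept)."""
    Xa = [[1.0] + row for row in X]
    n = len(Xa[0])
    XtX = [[sum(r[i] * r[j] for r in Xa) for j in range(n)] for i in range(n)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xa, y)) for i in range(n)]
    return solve(XtX, Xty)

# Invented shot factors [motion scale, mean disparity] and subjective scores
# generated from score = 1 + 2*motion + 1*disparity, so the fit recovers them.
X = [[0.1, 0.2], [0.5, 0.1], [0.9, 0.6], [0.3, 0.9]]
y = [1.4, 2.1, 3.4, 2.5]
w = fit_mlr(X, y)  # [intercept, motion weight, disparity weight]
print([round(v, 3) for v in w])  # → [1.0, 2.0, 1.0]
```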

  8. Gain, Level, And Exposure Control For A Television Camera

    NASA Technical Reports Server (NTRS)

    Major, Geoffrey J.; Hetherington, Rolfe W.

    1992-01-01

    Automatic-level-control/automatic-gain-control (ALC/AGC) system for charge-coupled-device (CCD) color television camera prevents overloading in bright scenes using technique for measuring brightness of scene from red, green, and blue output signals and processing these into adjustments of video amplifiers and iris on camera lens. System faster, does not distort video brightness signals, and built with smaller components.
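A digital caricature of the ALC/AGC loop (the actual NASA system is an analog circuit): estimate scene brightness from the R, G and B signals with standard Rec. 601 luma weights (an assumption, not the brief's stated weighting) and nudge the gain toward a target video level each field:

```python
def scene_brightness(r, g, b):
    """Luma estimate from the camera's R, G, B outputs (Rec. 601 weights)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def update_gain(gain, r, g, b, target=0.5, loop_gain=0.5):
    """Move the video gain a fraction of the way toward the target level."""
    y = gain * scene_brightness(r, g, b)  # video level after amplification
    if y <= 0:
        return gain
    desired = gain * target / y           # gain that would hit the target
    return gain + loop_gain * (desired - gain)

# A bright (overexposed) scene drives the gain down over successive fields:
gain = 1.0
for _ in range(6):
    gain = update_gain(gain, 0.9, 0.9, 0.9)
print(round(gain, 4))  # → 0.5625
```

The fractional loop gain gives the loop the gradual, non-distorting response the brief emphasizes, rather than a hard clamp on the brightness signal.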

  9. Front-illuminated versus back-illuminated photon-counting CCD-based gamma camera: important consequences for spatial resolution and energy resolution.

    PubMed

    Heemskerk, Jan W T; Westra, Albert H; Linotte, Peter M; Ligtvoet, Kees M; Zbijewski, Wojciech; Beekman, Freek J

    2007-04-21

    Charge-coupled devices (CCDs) coupled to scintillation crystals can be used for high-resolution imaging with x-rays and gamma rays. When the CCD images can be read out fast enough, the energy and interaction position of individual gamma quanta can be estimated by a real-time image analysis of the scintillation light flashes ('photon-counting mode'). The electron-multiplying CCD (EMCCD) is well suited for fast read out, since even at high frame rates it has extremely low read-out noise. Back-illuminated (BI) EMCCDs have much higher quantum efficiency than front-illuminated (FI) EMCCDs. Here we compare the spatial and energy resolution of gamma cameras based on FI and BI EMCCDs. The CCDs are coupled to a 1000 microm thick columnar CsI(Tl) crystal for the purpose of Tc-99m and I-125 imaging. Intrinsic spatial resolutions of 44 microm for I-125 and 49 microm for Tc-99m were obtained when using a BI EMCCD, which is an improvement by a factor of about 1.2-2 over the FI EMCCD. Furthermore, in the energy spectrum of the BI EMCCD, the I-125 signal could be clearly separated from the background noise, which was not the case for the FI EMCCD. The energy resolution of a BI EMCCD for Tc-99m was estimated to be approximately 36 keV, full width at half maximum, at 141 keV. The excellent results for the BI EMCCD encouraged us to investigate the cooling requirements for our setup. We have found that for the BI EMCCD, the spatial and energy resolution, as well as image noise, remained stable over a range of temperatures from -50 degrees C to -15 degrees C. This is a significant advantage over the FI EMCCD, which suffered from loss of spatial and especially energy resolution at temperatures as low as -40 degrees C. We conclude that the use of BI EMCCDs may significantly improve the imaging capabilities and the cost efficiency of CCD-based high-resolution gamma cameras. PMID:17404450

  10. Identifying predators and fates of grassland passerine nests using miniature video cameras

    USGS Publications Warehouse

    Pietz, P.J.; Granfors, D.A.

    2000-01-01

    Nest fates, causes of nest failure, and identities of nest predators are difficult to determine for grassland passerines. We developed a miniature video-camera system for use in grasslands and deployed it at 69 nests of 10 passerine species in North Dakota during 1996-97. Abandonment rates were higher at nests 1 day or night (22-116 hr) at 6 nests, 5 of which were depredated by ground squirrels or mice. For nests without cameras, estimated predation rates were lower for ground nests than aboveground nests (P = 0.055), but did not differ between open and covered nests (P = 0.74). Open and covered nests differed, however, when predation risk (estimated by initial-predation rate) was examined separately for day and night using camera-monitored nests; the frequency of initial predations that occurred during the day was higher for open nests than covered nests (P = 0.015). Thus, vulnerability of some nest types may depend on the relative importance of nocturnal and diurnal predators. Predation risk increased with nestling age from 0 to 8 days (P = 0.07). Up to 15% of fates assigned to camera-monitored nests were wrong when based solely on evidence that would have been available from periodic nest visits. There was no evidence of disturbance at nearly half the depredated nests, including all 5 depredated by large mammals. Overlap in types of sign left by different predator species, and variability of sign within species, suggests that evidence at nests is unreliable for identifying predators of grassland passerines.

  11. ProgRes 3000: a digital color camera with a 2-D array CCD sensor and programmable resolution up to 2994 x 2320 picture elements

    NASA Astrophysics Data System (ADS)

    Lenz, Reimar K.; Lenz, Udo

    1990-11-01

    A newly developed imaging principle, two-dimensional microscanning with Piezo-controlled Aperture Displacement (PAD), allows for high image resolutions. The advantages of line scanners (high resolution) are combined with those of CCD area sensors (high light sensitivity, geometrical accuracy and stability, easy focusing, illumination control and selection of field of view by means of TV real-time imaging). A custom designed sensor, optimized for small sensor element apertures and color fidelity, eliminates the need for color filter revolvers or mechanical shutters and guarantees good color convergence. By altering the computer controlled microscan patterns, spatial and temporal resolution become interchangeable, their product being a constant. The highest temporal resolution is TV real-time (50 fields/sec); the highest spatial resolution is 2994 x 2320 picture elements (Pels) for each of the three color channels (28 MBytes of raw image data in 8 sec). Thus, for the first time it becomes possible to take 35mm-slide-quality still color images of natural 3D scenes by purely electronic means. Nearly "square" Pels as well as hexagonal sampling schemes are possible. Excellent geometrical accuracy and low noise are guaranteed by sensor element (Sel) synchronous analog-to-digital conversion within the camera head. The camera's principle of operation and the procedure to calibrate the two-dimensional piezo-mechanical motion with an accuracy of better than 0.2 micrometers RMSE in image space are explained. The remaining positioning inaccuracy may be further

  12. Television automatic video-line tester

    NASA Astrophysics Data System (ADS)

    Ge, Zhaoxiang; Tang, Dongsheng; Feng, Binghua

    1998-08-01

    The linearity of a telescope video-line is an important characteristic of geodetic instruments and micrometer telescopes. The 1-inch video-line tester, invented by the University of Shanghai for Science and Technology, has been adopted in the related instrument criterion and national metering regulation. However, optical and chemical reading with visual alignment introduces subjective error and cannot provide detailed data. In this paper, the authors propose an improved video-line tester that uses a CCD TV camera, displays and processes the CCD signal through a computer, and performs the test automatically, offering objectivity, reliability, rapid measurement, and reduced focusing error.

  13. A miniature spectrometer using a color CCD and a frame calculus technique

    NASA Astrophysics Data System (ADS)

    Wan, Wei; Zhang, Guoping; Chen, Minghong; Liu, Minmin

    2005-01-01

    A design for a spectrometer is presented, which uses a holographic grating and a two-dimensional color CCD camera connected to a PC via a video-format port. In the image post-processing, a real-time frame calculus technique and a nonlinear filter are applied to provide higher image quality and better resistance to background noise. With an improved zoom mechanism design, the device has a wide resolution dynamic range and high acquisition frequency, since it can gather more spectral information than a linear black-and-white CCD. Spectrum analysis experiments for water quality detection indicate that the device can meet varied analysis requirements at low cost.
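The frame calculus step above can be sketched as frame differencing against a dark reference, followed by a nonlinear (median) filter. The frame sizes, background level, and 3x3 kernel below are illustrative assumptions, not details from the paper:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def median3(img):
    """3x3 median filter (edge-padded): a simple nonlinear noise filter."""
    padded = np.pad(img, 1, mode="edge")
    windows = sliding_window_view(padded, (3, 3))
    return np.median(windows, axis=(2, 3))

def frame_calculus(lit, dark):
    """Subtract a dark reference frame, clip negatives, then denoise."""
    diff = np.clip(lit.astype(np.int32) - dark.astype(np.int32), 0, None)
    return median3(diff)

dark = np.full((8, 16), 10, dtype=np.uint16)   # constant background level
lit = dark.copy()
lit[:, 5:8] += 100                             # a 3-pixel-wide spectral band
lit[2, 11] += 40                               # an isolated noise spike
clean = frame_calculus(lit, dark)              # band kept, spike removed
```

The median filter removes single-pixel spikes while leaving the (wider) spectral band intact, which is why a nonlinear filter resists background noise better than plain averaging.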

  14. Plant iodine-131 uptake in relation to root concentration as measured in minirhizotron by video camera:

    SciTech Connect

    Moss, K.J.

    1990-09-01

    Glass viewing tubes (minirhizotrons) were placed in the soil beneath native perennial bunchgrass (Agropyron spicatum). The tubes provided access for observing and quantifying plant roots with a miniature video camera and soil moisture estimates by neutron hydroprobe. The radiotracer I-131 was delivered to the root zone at three depths with differing root concentrations. The plant was subsequently sampled and analyzed for I-131. Plant uptake was greater when I-131 was applied at soil depths with higher root concentrations. When I-131 was applied at soil depths with lower root concentrations, plant uptake was less. However, the relationship between root concentration and plant uptake was not a direct one. When I-131 was delivered to deeper soil depths with low root concentrations, the quantity of roots there appeared to be less effective in uptake than the same quantity of roots at shallow soil depths with high root concentration. 29 refs., 6 figs., 11 tabs.

  15. Embedded FIR filter design for real-time refocusing using a standard plenoptic video camera

    NASA Astrophysics Data System (ADS)

    Hahne, Christopher; Aggoun, Amar

    2014-03-01

    A novel and low-cost embedded hardware architecture for real-time refocusing based on a standard plenoptic camera is presented in this study. The proposed layout design synthesizes refocusing slices directly from micro images, omitting the commonly used sub-aperture extraction step. To this end, intellectual property cores containing switch-controlled Finite Impulse Response (FIR) filters are developed and applied to the Field Programmable Gate Array (FPGA) XC6SLX45 from Xilinx. To make the hardware design economical, the FIR filters combine stored products with upsampling and interpolation techniques in order to achieve an ideal trade-off between image resolution, delay time, power consumption and the demand for logic gates. The video output is transmitted via High-Definition Multimedia Interface (HDMI) at a resolution of 720p and a frame rate of 60 fps, conforming to the HD-ready standard. Examples of the synthesized refocusing slices are presented.
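Synthesizing a refocusing slice from micro images can be sketched as a shift-and-accumulate operation, the same multiply-accumulate pattern a bank of FIR filters can realize in hardware. This NumPy sketch uses integer shifts and a toy micro-image array, both simplifying assumptions rather than the paper's design:

```python
import numpy as np

def refocus_slice(micro_images, shift):
    """Shift each micro image in proportion to its (u, v) aperture
    position and average: a simplified software analogue of the
    hardware refocusing pipeline.
    micro_images: array of shape (U, V, H, W)."""
    U, V, H, W = micro_images.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            du = int(round((u - U // 2) * shift))
            dv = int(round((v - V // 2) * shift))
            out += np.roll(micro_images[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)

micro = np.ones((3, 3, 8, 8))        # toy 3x3 grid of 8x8 micro images
slice0 = refocus_slice(micro, 0.0)   # zero shift: nominal focal plane
slice1 = refocus_slice(micro, 1.0)   # one pixel per aperture step
```

Varying `shift` selects the synthetic focal plane; in the FPGA design this parameter would correspond to switching between stored FIR coefficient sets.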

  16. Dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. II - Data handling and control systems

    NASA Technical Reports Server (NTRS)

    Dewey, D.; Ricker, G. R.

    1980-01-01

    The data collection system for the MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes) is described. The system relies on an RCA 1802 microprocessor-based controller, which serves to collect and format data, to present data to a scan converter, and to operate a device communication bus. A NOVA minicomputer is used to record and recall frame images and to perform refined image processing. The RCA 1802 also provides instrument mode control for the MASCOT. Commands are issued using STOIC, a FORTH-like language. Sufficient flexibility has been provided so that a variety of CCDs can be accommodated.

  17. Quantitative underwater 3D motion analysis using submerged video cameras: accuracy analysis and trajectory reconstruction.

    PubMed

    Silvatti, Amanda P; Cerveri, Pietro; Telles, Thiago; Dias, Fábio A S; Baroni, Guido; Barros, Ricardo M L

    2013-01-01

    In this study we aim at investigating the applicability of underwater 3D motion capture based on submerged video cameras in terms of 3D accuracy analysis and trajectory reconstruction. Static points with classical direct linear transform (DLT) solution, a moving wand with bundle adjustment and a moving 2D plate with Zhang's method were considered for camera calibration. As an example of the final application, we reconstructed the hand motion trajectories in different swimming styles and qualitatively compared this with Maglischo's model. Four highly trained male swimmers performed butterfly, breaststroke and freestyle tasks. The middle fingertip trajectories of both hands in the underwater phase were considered. The accuracy (mean absolute error) of the two calibration approaches (wand: 0.96 mm - 2D plate: 0.73 mm) was comparable to out of water results and highly superior to the classical DLT results (9.74 mm). Among all the swimmers, the hands' trajectories of the expert swimmer in the style were almost symmetric and in good agreement with Maglischo's model. The kinematic results highlight symmetry or asymmetry between the two hand sides, intra- and inter-subject variability in terms of the motion patterns and agreement or disagreement with the model. The two outcomes, calibration results and trajectory reconstruction, both move towards the quantitative 3D underwater motion analysis. PMID:22435960
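The classical DLT calibration used as a baseline above reduces to a linear least-squares solve for 11 parameters from known control points. The synthetic camera (800 px focal length, principal point (320, 240)) and control points below are invented for illustration:

```python
import numpy as np

def dlt_calibrate(world, image):
    """Solve the 11 classical DLT parameters from >= 6 control points.
    world: (N, 3) object-space coordinates; image: (N, 2) pixel coordinates."""
    A, b = [], []
    for (X, Y, Z), (u, v) in zip(world, image):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]); b.append(u)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]); b.append(v)
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float),
                            rcond=None)
    return L

def dlt_project(L, world):
    """Reproject object points through the 11-parameter DLT model."""
    world = np.atleast_2d(world)
    den = world @ L[8:11] + 1.0
    return np.stack([(world @ L[0:3] + L[3]) / den,
                     (world @ L[4:7] + L[7]) / den], axis=1)

# hypothetical pinhole camera; the +1 in the projection denominator keeps
# the DLT constant term nonzero so the normalized model is well posed
world = np.array([[0.1, 0.2, 2.0], [-0.3, 0.1, 3.0], [0.4, -0.2, 2.5],
                  [0.0, 0.0, 4.0], [0.5, 0.5, 3.5], [-0.4, -0.3, 2.2],
                  [0.2, -0.4, 3.8]])
den = world[:, 2] + 1.0
image = np.stack([800 * world[:, 0] / den + 320,
                  800 * world[:, 1] / den + 240], axis=1)
L = dlt_calibrate(world, image)
reproj = dlt_project(L, world)
```

With noise-free synthetic points the reprojection is exact to numerical precision; refraction at the water interface is what degrades this solution underwater, motivating the wand and 2D-plate calibrations compared in the study.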

  18. The design and realization of a three-dimensional video system by means of a CCD array

    NASA Astrophysics Data System (ADS)

    Boizard, J. L.

    1985-12-01

    Design features, principles, and initial tests of a prototype three-dimensional robot vision system based on a laser source and a CCD detector array are described. The use of a laser as a coherent illumination source permits the determination of relief using a single emitter, since the location of the source is a known quantity with low distortion. The CCD detector array furnishes an acceptable signal/noise ratio and, when wired to an appropriate signal processing system, provides real-time data on the return signals, i.e., the characteristic points of an object being scanned. Signal processing involves integration of 29 kB of data per 100 samples, with sampling occurring at a rate of 5 MHz (the CCDs) and yielding an image every 12 msec. Algorithms for filtering errors from the data stream are discussed.

  19. Dual-modality imaging in vivo with an NIR and gamma emitter using an intensified CCD camera and a conventional gamma camera

    NASA Astrophysics Data System (ADS)

    Houston, Jessica P.; Ke, Shi; Wang, Wei; Li, Chun; Sevick-Muraca, Eva M.

    2005-04-01

    Fluorescence-enhanced optical imaging measurements and conventional gamma camera images of human M21 melanoma xenografts were acquired for a "dual-modality" molecular imaging study. The αvβ3 integrin cell surface receptors were imaged using the cyclic pentapeptide cyclo(lys-Arg-Gly-Asp-phe) [c(KRGDf)], a probe known to target the membrane receptor. The probe, dual-labeled with a radiotracer, 111Indium, for gamma scintigraphy as well as with a near-infrared dye, IRDye800, was injected into six nude mice at a dose equivalent to 90 µCi of 111In and 5 nanomoles of near-infrared (NIR) dye. A 15-min gamma scan and an 800-millisecond NIR-sensitive ICCD optical photograph were collected 24 hours after injection of the dual-labeled probe. The image quality of the nuclear and optical data was compared, with the results showing similar target-to-background ratios (TBR) based on the origin of fluorescence and gamma emissions at the targeted tumor site. Furthermore, an analysis of SNR versus contrast showed greater sensitivity of optical over nuclear imaging for the subcutaneous tumor targets measured by surface regions of interest.

  20. A cooled CCD camera-based protocol provides an effective solution for in vitro monitoring of luciferase.

    PubMed

    Afshari, Amirali; Uhde-Stone, Claudia; Lu, Biao

    2015-03-13

    Luciferase assays have become an increasingly important technique to monitor a wide range of biological processes. However, the mainstay protocols require a luminometer to acquire and process the data, limiting their application to specialized research labs. To overcome this limitation, we have developed an alternative protocol that utilizes a commonly available cooled charge-coupled device (CCCD) camera, instead of a luminometer, for data acquisition and processing. By measuring activities of different luciferases, we characterized their substrate specificity, assay linearity, signal-to-noise levels, and fold-changes via CCCD. Next, we defined the assay parameters that are critical for appropriate use of the CCCD for different luciferases. To demonstrate the usefulness in cultured mammalian cells, we conducted a case study to examine NF-κB gene activation in response to inflammatory signals in human embryonic kidney cells (HEK293 cells). We found that data collected by the CCCD camera were equivalent to those acquired by a luminometer, thus validating the assay protocol. In comparison, the CCCD-based protocol is readily amenable to live-cell and high-throughput applications, offering fast simultaneous data acquisition and visual and quantitative data presentation. In conclusion, the CCCD-based protocol provides a useful alternative for monitoring luciferase reporters. The wide availability of CCCD cameras will enable more researchers to use luciferases to monitor and quantify biological processes. PMID:25677617

  1. A unified and efficient framework for court-net sports video analysis using 3D camera modeling

    NASA Astrophysics Data System (ADS)

    Han, Jungong; de With, Peter H. N.

    2007-01-01

    The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable to more than one sports type to come to a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes, which are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e. players). The algorithm can track up to four players simultaneously. The complete system contributes to summarization by various forms of information, of which the most important are the moving trajectory and real speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it for a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and event detection about 90%.

  2. Compensating for Camera Translation in Video Eye Movement Recordings by Tracking a Representative Landmark Selected Automatically by a Genetic Algorithm

    PubMed Central

    Karmali, Faisal; Shelhamer, Mark

    2013-01-01

    It is common in oculomotor and vestibular research to use video or still cameras to acquire data on eye movements. Unfortunately, such data are often contaminated by unwanted motion of the face relative to the camera, especially during experiments in dynamic motion environments. We develop a method for estimating the motion of a camera relative to a highly deformable surface, specifically the movement of a camera relative to the face and eyes. A small rectangular region of interest (ROI) on the face is automatically selected and tracked throughout a set of video frames as a measure of vertical camera translation. The specific goal is to present a process based on a genetic algorithm that selects a suitable ROI for tracking: one whose translation within the camera image accurately matches the actual relative motion of the camera. We find that co-correlation, a statistic describing the time series of a large group of ROIs, predicts the accuracy of the ROIs, and can be used to select the best ROI from a group. After the genetic algorithm finds the best ROIs from a group, it uses recombination to form a new generation of ROIs that inherits properties of the ROIs from the previous generation. We show that the algorithm can select an ROI that will estimate camera translation and determine the direction that the eye is looking with an average accuracy of 0.75, even with camera translations of 2.5 mm at a viewing distance of 120 mm, which would cause an error of 11 without correction. PMID:18835407
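The select-recombine-mutate loop described above can be sketched as a minimal generational genetic algorithm over candidate ROI positions. The fitness function here is a toy stand-in for the co-correlation statistic, and all parameters (population size, mutation range) are illustrative assumptions:

```python
import random

def evolve_rois(fitness, width, height, pop=40, gens=60, seed=1):
    """Minimal generational GA over candidate ROI corner positions:
    truncation selection keeps the better half, recombination averages
    two parents, and a small mutation perturbs the result."""
    rng = random.Random(seed)
    popn = [(rng.randrange(width), rng.randrange(height)) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda p: fitness(*p), reverse=True)
        parents = popn[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            (x1, y1), (x2, y2) = rng.sample(parents, 2)
            x = (x1 + x2) // 2 + rng.randint(-2, 2)   # recombine + mutate
            y = (y1 + y2) // 2 + rng.randint(-2, 2)
            children.append((min(max(x, 0), width - 1),
                             min(max(y, 0), height - 1)))
        popn = parents + children
    return max(popn, key=lambda p: fitness(*p))

# toy fitness peaked at ROI corner (120, 80), standing in for co-correlation
best = evolve_rois(lambda x, y: -((x - 120) ** 2 + (y - 80) ** 2), 320, 240)
```

In the paper's setting the fitness would instead be computed from tracked video frames; this sketch only shows the generational structure (selection, recombination, inheritance) the abstract describes.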

  3. Evaluation of camera requirements for comparative genomic hybridization.

    PubMed

    Tirkkonen, M; Karhu, R; Kallioniemi, O; Isola, J

    1996-12-01

    Comparative genomic hybridization (CGH) is based on quantitative digital image analysis of fluorescence intensities from metaphase chromosomes. High-quality CCD cameras are commonly used for image acquisition, but the minimal requirements of CCD cameras have not been determined. We first evaluated minimal camera requirements by artificially reducing spatial and dynamic resolution of images produced by a scientific-grade CCD camera (Xillix MicroImager). The results showed that reduction of dynamic resolution from 4,096 to 256 gray levels (12-bit image transformed to an 8-bit image) had negligible effect on CGH profiles and no effect on their interpretation. Similarly, CGH profiles obtained from spatially reduced images (from 1,340 x 1,035 to 670 x 517 pixels) were virtually identical to those obtained from the original image. For a practical test, we compared two 8-bit frame-integrating video-rate CCD cameras (Cohu 4910 and Photometrics ImagePoint) to the Xillix MicroImager in a real CGH setting. Images collected from the same metaphase cells with all three cameras resulted in the identification of the same genetic changes in the samples studied. We conclude that requirements for camera resolution in CGH analysis are not stringent, and therefore that low-priced video-rate cameras capable of frame integration are sufficient for comparative genomic hybridization. PMID:8946148
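The two reductions tested above, dynamic (12-bit to 8-bit) and spatial (2x binning), can be sketched in NumPy together with a simplified per-column ratio profile; the image contents and the profile definition are illustrative assumptions, not the study's analysis pipeline:

```python
import numpy as np

def reduce_dynamic(img12):
    """Collapse a 12-bit image (0..4095) to 8 bits (0..255)."""
    return (img12 >> 4).astype(np.uint8)

def reduce_spatial(img):
    """Halve spatial resolution by 2x2 mean binning."""
    h, w = img.shape
    return img[: h // 2 * 2, : w // 2 * 2].reshape(
        h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def ratio_profile(tumor_img, ref_img, axis=0):
    """Per-column tumor/reference fluorescence ratio (a crude CGH profile)."""
    return tumor_img.mean(axis=axis) / (ref_img.mean(axis=axis) + 1e-9)

tumor = np.full((64, 64), 2048, dtype=np.uint16)   # test (tumor) channel
ref = np.full((64, 64), 1024, dtype=np.uint16)     # reference channel
r12 = ratio_profile(tumor, ref)
r8 = ratio_profile(reduce_dynamic(tumor), reduce_dynamic(ref))
```

Because the CGH profile is a ratio of averaged intensities, quantization and binning largely cancel between the two channels, which is consistent with the paper's finding that the reductions barely change the profiles.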

  4. A versatile digital video engine for safeguards and security applications

    SciTech Connect

    Hale, W.R.; Johnson, C.S.; DeKeyser, P.

    1996-08-01

    The capture and storage of video images have been major engineering challenges for safeguards and security applications since the video camera provided a method to observe remote operations. The problem of designing reliable video cameras was solved in the early 1980s with the introduction of the CCD (charge-coupled device) camera. The first CCD cameras cost in the thousands of dollars but have now been replaced by cameras costing in the hundreds. The remaining problem of storing and viewing video images in both attended and unattended video surveillance systems and remote monitoring systems is being solved by sophisticated digital compression systems. One such system is the PC-104 three-card set, which is literally a "video engine" that can power video storage systems. The use of digital images in surveillance systems makes it possible to develop remote monitoring systems, portable video surveillance units, image review stations, and authenticated camera modules. This paper discusses the video card set and how it can be used in many applications.

  5. The Automatically Triggered Video or Imaging Station (ATVIS): An Inexpensive Way to Catch Geomorphic Events on Camera

    NASA Astrophysics Data System (ADS)

    Wickert, A. D.

    2010-12-01

    To understand how single events can affect landscape change, we must catch the landscape in the act. Direct observations are rare and often dangerous. While video is a good alternative, commercially available video systems for field installation cost $11,000, weigh ~100 pounds (45 kg), and shoot 640x480 pixel video at 4 frames per second. This is the same resolution as a cheap point-and-shoot camera, with a frame rate that is nearly an order of magnitude worse. To overcome these limitations of resolution, cost, and portability, I designed and built a new observation station. This system, called ATVIS (Automatically Triggered Video or Imaging Station), costs $450-500 and weighs about 15 pounds. It can take roughly 3 hours of 1280x720 pixel video, 6.5 hours of 640x480 video, or 98,000 1600x1200 pixel photos (one photo every 7 seconds for 8 days). The design calls for a simple Canon point-and-shoot camera fitted with custom firmware that allows 5V pulses through its USB cable to trigger it to take a picture or to initiate or stop video recording. These pulses are provided by a programmable microcontroller that can take input from either sensors or a data logger. The design is easily modifiable to a variety of camera and sensor types, and can also be used for continuous time-lapse imagery. We currently have prototypes set up at a gully near West Bijou Creek on the Colorado high plains and at tributaries to Marble Canyon in northern Arizona. Hopefully, a relatively inexpensive and portable system such as this will allow geomorphologists to supplement sensor networks with photo or video monitoring and allow them to see, and better quantify, the fantastic array of processes that modify landscapes as they unfold. Camera station set up at Badger Canyon, Arizona. Inset: view into box. Clockwise from bottom right: camera, microcontroller (blue), DC converter (red), solar charge controller, 12V battery.
Materials and installation assistance courtesy of Ron Griffiths and the USGS Grand Canyon Monitoring and Research Center.

  6. Optimal camera exposure for video surveillance systems by predictive control of shutter speed, aperture, and gain

    NASA Astrophysics Data System (ADS)

    Torres, Juan; Menéndez, José Manuel

    2015-02-01

    This paper establishes a real-time auto-exposure method to guarantee that surveillance cameras in uncontrolled light conditions take advantage of their whole dynamic range while providing neither under- nor overexposed images. State-of-the-art auto-exposure methods base their control on the brightness of the image measured in a limited region where the foreground objects are mostly located. Unlike these methods, the proposed algorithm establishes a set of indicators based on the image histogram that define its shape and position. Furthermore, the location of the objects to be inspected is usually unknown in surveillance applications; thus, the whole image is monitored in this approach. To control the camera settings, we defined a parameter function (Ef) that depends linearly on the shutter speed and the electronic gain, and is inversely proportional to the square of the lens aperture diameter. When the current acquired image is not overexposed, our algorithm computes the value of Ef that would move the histogram to the maximum value that does not overexpose the capture. When the current acquired image is overexposed, it computes the value of Ef that would move the histogram to a value that does not underexpose the capture and remains close to the overexposed region. If the image is both under- and overexposed, the whole dynamic range of the camera is already in use, and a default value of Ef that does not overexpose the capture is selected. This decision follows the idea that underexposed images are preferable to overexposed ones, because the noise produced in the lower regions of the histogram can be removed in a post-processing step, while the saturated pixels of the higher regions cannot be recovered. The proposed algorithm was tested on a video surveillance camera placed at an outdoor parking lot surrounded by buildings and trees, which produce moving shadows on the ground. During the daytime of seven days, the algorithm ran alternately with a representative auto-exposure algorithm from the recent literature. Besides the sunrises and nightfalls, multiple weather conditions occurred that produced light changes in the scene: sunny hours that produced sharp shadows and highlights; cloud cover that softened the shadows; and cloudy and rainy hours that dimmed the scene. Several indicators were used to measure the performance of the algorithms. They provided objective quality measures of the time the algorithms take to recover from under- or overexposure, the brightness stability, and the deviation from the optimal exposure. The results demonstrated that our algorithm reacts faster to all light changes than the selected state-of-the-art algorithm. It is also capable of acquiring well-exposed images and maintaining brightness stability for longer periods. Summing up, we concluded that the proposed algorithm provides a fast and stable auto-exposure method that maintains an optimal exposure for video surveillance applications. Future work will involve the evaluation of this algorithm in robotics.
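The parameter function and the preference for underexposure can be sketched as follows; the scaling policy, headroom constant, and saturation level are illustrative assumptions, not the authors' exact control law:

```python
def exposure_factor(shutter_s, gain, aperture_d):
    """Ef as described: linear in exposure time and electronic gain,
    inversely proportional to the squared lens aperture diameter."""
    return shutter_s * gain / aperture_d ** 2

def target_ef(current_ef, hist_max, overexposed, sat_level=255, headroom=0.9):
    """Scale Ef so the top of the histogram approaches, but never reaches,
    saturation; when highlights clip, back off to protect them."""
    if overexposed:
        return current_ef * headroom   # pull the histogram down
    # not overexposed: push the histogram top toward (90% of) saturation
    return current_ef * (sat_level * headroom) / max(hist_max, 1)

ef = exposure_factor(0.01, 2.0, 4.0)          # 1/100 s, gain 2, d = 4 (units assumed)
brighter = target_ef(ef, 128, False)          # dim scene: raise Ef
```

The asymmetry (always multiply down when overexposed, scale up toward a headroom below saturation otherwise) encodes the paper's rule that underexposure is recoverable in post-processing while saturated pixels are not.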

  7. Surgeon point-of-view recording: Using a high-definition head-mounted video camera in the operating room

    PubMed Central

    Nair, Akshay Gopinathan; Kamal, Saurabh; Dave, Tarjani Vivek; Mishra, Kapil; Reddy, Harsha S; Rocca, David Della; Rocca, Robert C Della; Andron, Aleza; Jain, Vandana

    2015-01-01

    Objective: To study the utility of a commercially available small, portable ultra-high definition (HD) camera (GoPro Hero 4) for intraoperative recording. Methods: A head mount was used to fix the camera on the operating surgeon's head. Due care was taken to protect the patient's identity. The recorded video was subsequently edited and used as a teaching tool. This retrospective, noncomparative study was conducted at three tertiary eye care centers. The surgeries recorded were ptosis correction, ectropion correction, dacryocystorhinostomy, angular dermoid excision, enucleation, blepharoplasty and lid tear repair surgery (one each). The recorded videos were reviewed, edited, and checked for clarity, resolution, and reproducibility. Results: The recorded videos were found to be high quality, which allowed for zooming and visualization of the surgical anatomy clearly. Minimal distortion is a drawback that can be effectively addressed during postproduction. The camera, owing to its lightweight and small size, can be mounted on the surgeon's head, thus offering a unique surgeon point-of-view. In our experience, the results were of good quality and reproducible. Conclusions: A head-mounted ultra-HD video recording system is a cheap, high quality, and unobtrusive technique to record surgery and can be a useful teaching tool in external facial and ophthalmic plastic surgery. PMID:26655001

  8. CCD TV focal plane guider development and comparison to SIRTF applications

    NASA Technical Reports Server (NTRS)

    Rank, David M.

    1989-01-01

    It is expected that the SIRTF payload will use a CCD TV focal-plane fine-guidance sensor to provide acquisition of sources and tracking stability of the telescope. CCD TV cameras and guiders have been developed at Lick Observatory for several years, producing state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory to characterize present CCD autoguiding technology and relate it to SIRTF applications are presented. Two different designs of CCD cameras were constructed, using virtual-phase and buried-channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon, and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.

  9. Nyquist sampling theorem: understanding the illusion of a spinning wheel captured with a video camera

    NASA Astrophysics Data System (ADS)

    Lévesque, Luc

    2014-11-01

    Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the sampling time is chosen judiciously, then it is possible to accurately determine the frequency of a signal varying periodically with time. This paper is of educational value as it presents the principles of sampling during data acquisition. The concept of the Nyquist sampling theorem is usually introduced very briefly in the literature, with few practical examples to convey its importance during data acquisition. Through a series of carefully chosen examples, we present data sampling from the elementary conceptual idea and try to lead the reader naturally to the Nyquist sampling theorem, so as to understand more clearly why a signal can be interpreted incorrectly during a data acquisition procedure in the case of undersampling.
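The spinning-wheel illusion follows directly from frequency folding: sampling at rate f_s maps a true rotation frequency into the band [-f_s/2, +f_s/2]. A small sketch (the wheel speeds are invented examples):

```python
def apparent_frequency(f_true, f_sample):
    """Fold a true rotation frequency into the Nyquist band
    [-f_sample/2, +f_sample/2]; a negative result means the wheel
    appears to spin backwards."""
    k = round(f_true / f_sample)
    return f_true - k * f_sample

# a wheel turning at 27 Hz filmed at 25 frames/s seems to creep at 2 Hz,
# at 23 Hz it seems to roll backwards, and at exactly 50 Hz it looks frozen
slow = apparent_frequency(27, 25)
backward = apparent_frequency(23, 25)
frozen = apparent_frequency(50, 25)
```

Only when f_true stays below f_sample/2 (the Nyquist limit) does the apparent frequency equal the true one, which is the theorem's practical message for data acquisition.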

  10. Modeling camera orientation and 3D structure from a sequence of images taken by a perambulating commercial video camera

    NASA Astrophysics Data System (ADS)

    M-Rouhani, Behrouz; Anderson, James A. D. W.

    1997-04-01

    In this paper we report the degree of reliability of image sequences taken by off-the-shelf TV cameras for modeling camera rotation and reconstructing 3D structure using computer vision techniques. This is done in spite of the fact that computer vision systems usually use imaging devices that are specifically designed for human vision. Our scenario consists of a static scene and a mobile camera moving through the scene. The scene is any long axial building dominated by features along the three principal orientations and with at least one wall containing prominent repetitive planar features such as doors, windows, bricks, etc. The camera is an ordinary commercial camcorder moving along the axial axis of the scene and is allowed to rotate freely within the range +/- 10 degrees in all directions. This makes it possible for the camera to be held by a walking unprofessional cameraman with normal gait, or to be mounted on a mobile robot. The system has been tested successfully on sequences of images of a variety of structured, but fairly cluttered, scenes taken by different walking cameramen. The potential application areas of the system include medicine, robotics and photogrammetry.

  11. Single-Camera Panoramic-Imaging Systems

    NASA Technical Reports Server (NTRS)

    Lindner, Jeffrey L.; Gilbert, John

    2007-01-01

    Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled-device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.

  12. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side-by-side comparison of three nanosecond frame cameras, examining them for both performance and operational characteristics. The cameras comprise three combinations of gating and data recording: Micro-Channel Plate/CCD, Image Diode/CCD, and Image Diode/Film. The advantages and disadvantages of each device will be discussed.

  13. Study of secondary flow in centrifugal blood pumps using a flow visualization method with a high-speed video camera.

    PubMed

    Sakuma, I; Fukui, Y; Dohi, T

    1996-06-01

    Four pump models with different vane configurations were evaluated with flow visualization techniques using a high-speed video camera. These models also were evaluated through in vivo hemolysis tests using bovine blood. The impeller having the greatest fluid velocity relative to the impeller, the largest velocity variance, and the most irregular local flow patterns in the flow passage caused the most hemolysis. Even if the pumps were operated at almost the same speed (rpm) at the same output, the impeller showing more irregular flow patterns had a statistically greater rate of hemolysis. This fact confirms that the existence of local irregular flow patterns in a centrifugal blood pump deteriorates its hemolytic performance. Thus, to optimize the design of the pump, it is very important to examine the secondary flow patterns in the centrifugal blood pump in detail using flow visualization with a high-speed video camera. PMID:8817952

  14. Application of video-cameras for quality control and sampling optimisation of hydrological and erosion measurements in a catchment

    NASA Astrophysics Data System (ADS)

    Lora-Millán, Julio S.; Taguas, Encarnacion V.; Gomez, Jose A.; Perez, Rafael

    2014-05-01

    Long-term soil erosion studies imply substantial efforts, particularly when continuous measurements need to be maintained. High costs are associated with the upkeep of field equipment and the quality control of data collection. Energy supply and/or electronic failures, vandalism and burglary are common causes of gaps in datasets, reducing their usefulness in many cases. In this work, a system of three video cameras, a recorder and a transmission modem (3G technology) has been set up at a gauging station where rainfall, runoff flow and sediment concentration are monitored. The gauging station is located at the outlet of an olive orchard catchment of 6.4 ha. Rainfall is measured with an automatic raingauge that records intensity at one-minute intervals. The discharge is measured by a flume of critical flow depth, where the water level is recorded by an ultrasonic sensor. When the water level rises to a predetermined level, the automatic sampler turns on and fills a bottle at different intervals according to a program that depends on the antecedent precipitation. A data logger controls the instruments' functions and records the data. The purpose of the video camera system is to improve the quality of the dataset by i) visual analysis of the measurement conditions of flow into the flume; ii) optimisation of the sampling programs. The cameras are positioned to record the flow at the approach section and the throat of the flume. In order to check the values of the ultrasonic sensor, a third camera records the flow level against a measuring tape. This system is activated when the ultrasonic sensor detects a height threshold, equivalent to an electric intensity level. Thus, the video cameras record the event only when there is enough flow. This simplifies post-processing and reduces the cost of downloading recordings. The preliminary comparison analysis will be presented, as well as the main improvements to the sampling program.

  15. A Numerical Analysis of a Frame Calibration Method for Video-based All-Sky Camera Systems

    NASA Astrophysics Data System (ADS)

    Bannister, Steven M.; Boucheron, Laura E.; Voelz, David G.

    2013-09-01

    The field of meteor monitoring has grown considerably over the past 20 years with the development of affordable, automated video camera systems. We describe a method for calibrating video all-sky cameras in terms of local zenith and azimuth angles. The method involves the observation of known training points (stars) and is based on an approach developed by Ceplecha & Borovička. We use a simplified equation set, incorporate a quadratic expression for modeling the lens response, and utilize a nonlinear solver to obtain the calibration parameters. Simulation results with synthetic star data are presented to examine the effect of a limited number of training points, training point location, and initial parameter values on the calibration. Assumed simulation parameters are consistent with expectations for cameras in the NMSU SkySentinel network. Our modified calibration approach is shown to be stable over a broad range of calibration parameters with typical azimuth and zenith residual errors of less than 1°. Example calibration results for three camera nodes in the SkySentinel network are presented.
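
    A minimal sketch of the lens-response part of such a calibration: if the radial pixel distance of a star is modeled as a quadratic function of its zenith angle, the two coefficients can be recovered from training stars by linear least squares. The full method also solves for the projection centre and azimuth rotation with a nonlinear solver; the model and numbers below are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

# Hypothetical quadratic lens response: pixel radius r = a*z + b*z**2,
# where z is a training star's zenith angle in degrees.
rng = np.random.default_rng(0)
a_true, b_true = 4.0, -0.01                 # assumed "camera" parameters
z = rng.uniform(5, 85, 40)                  # zenith angles of 40 synthetic stars
r = a_true * z + b_true * z**2 + rng.normal(0, 0.2, z.size)  # noisy radii

A = np.column_stack([z, z**2])              # design matrix, linear in (a, b)
(a_fit, b_fit), *_ = np.linalg.lstsq(A, r, rcond=None)
print(a_fit, b_fit)                         # close to (4.0, -0.01)
```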

  16. Research on portable intelligent monitoring system based on video server

    NASA Astrophysics Data System (ADS)

    Song, Gui-cai; Na, Yan-xiang; Yang, Fei-yu; Cao, Shi-hao

    2011-08-01

    The intelligent video surveillance system studied in this paper consists of CCD cameras, a pyroelectric infrared sensor, a stepping motor, and a computer. The portable monitoring system is studied in depth from both the hardware and software perspectives. On hardware, CCD and CMOS image sensors are compared and analysed, with the main focus on the characteristics and performance indicators of the CCD; the structure and characteristics of the pyroelectric infrared sensor are investigated, and methods for further improving its performance and responsivity are put forward. On software, a moving-object detection algorithm controls the stepping motor so that the camera can track the target and record video in real time. The system uses the pyroelectric infrared sensor as an access switch, ensuring the reliability of the site safety monitoring system.
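
    The moving-object detection that steers the stepping motor can be illustrated with basic frame differencing; the frame sizes, thresholds, and motor interface below are assumptions for the sketch, not the paper's implementation:

```python
import numpy as np

# Frame-differencing motion detection: threshold the absolute difference of
# consecutive grayscale frames, then use the x-centroid of the changed
# pixels to choose a pan direction for the stepping motor.

def pan_command(prev, curr, diff_thresh=30, centre_band=5):
    moved = np.abs(curr.astype(int) - prev.astype(int)) > diff_thresh
    if not moved.any():
        return "hold"                        # no motion detected
    cx = moved.nonzero()[1].mean()           # x-centroid of moving pixels
    centre = curr.shape[1] / 2
    if cx < centre - centre_band:
        return "pan_left"
    if cx > centre + centre_band:
        return "pan_right"
    return "hold"

prev = np.zeros((48, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:20, 50:60] = 255                     # an object appears on the right
print(pan_command(prev, curr))               # -> pan_right
```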

  17. Lori Losey - The Woman Behind the Video Camera - Duration: 3 minutes, 36 seconds.

    NASA Video Gallery

    The often-spectacular aerial video imagery of NASA flight research, airborne science missions and space satellite launches doesn't just happen. Much of it is the work of Lori Losey, senior video pr...

  18. HDR {sup 192}Ir source speed measurements using a high speed video camera

    SciTech Connect

    Fonseca, Gabriel P.; Rubo, Rodrigo A.; Sales, Camila P. de; Verhaegen, Frank

    2015-01-15

    Purpose: The dose delivered with an HDR {sup 192}Ir afterloader can be separated into a dwell component and a transit component resulting from the source movement. The transit component depends directly on the source speed profile, and it is the goal of this study to measure accurate source speed profiles. Methods: A high-speed video camera was used to record the movement of a {sup 192}Ir source (Nucletron, an Elekta company, Stockholm, Sweden) for interdwell distances of 0.25–5 cm with dwell times of 0.1, 1, and 2 s. Transit dose distributions were calculated using a Monte Carlo code simulating the source movement. Results: The source stops at each dwell position, oscillating around the desired position for a duration of up to (0.026 ± 0.005) s. The source speed profile shows variations between 0 and 81 cm/s, with an average speed of ∼33 cm/s for most of the interdwell distances. The source stops for up to (0.005 ± 0.001) s at nonprogrammed positions in between two programmed dwell positions. The dwell time correction applied by the manufacturer compensates for the transit dose between the dwell positions, leading to a maximum overdose of 41 mGy for the considered cases, assuming an air-kerma strength of 48 000 U. The transit dose component is not uniformly distributed, leading to over- and underdoses that are within 1.4% of commonly prescribed doses (3–10 Gy). Conclusions: The source maintains its speed even for the short interdwell distances. Dose variations due to the transit dose component are much lower than the prescribed treatment doses for brachytherapy, although the transit dose component should be evaluated individually for clinical cases.
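
    Once per-frame source positions have been extracted from the video, a speed profile is simply a finite difference of position scaled by the frame rate. A toy version with synthetic positions (the frame rate and motion below are assumptions, not the measured data):

```python
import numpy as np

fps = 1000.0                                   # assumed camera frame rate (Hz)
t = np.arange(0, 0.1, 1 / fps)                 # 100 ms of frames
# Synthetic source position: moves at 33 cm/s for 50 ms, then dwells.
pos = np.where(t < 0.05, 33.0 * t, 33.0 * 0.05)

speed = np.diff(pos) * fps                     # cm/s between consecutive frames
print(speed.max())                             # ~33 during transit
print(speed[-1])                               # 0 during the dwell
```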

  19. Determination of visible coordinates of the low-orbit space objects and their photometry by the CCD camera with the analogue output. Initial image processing

    NASA Astrophysics Data System (ADS)

    Shakun, L. S.; Koshkin, N. I.

    2014-06-01

    The number of artificial space objects in low Earth orbit is continuously increasing, which raises the requirements for the accuracy of measurement of their coordinates and for the precision of the prediction of their motion. The accuracy of the prediction can be improved if the actual current orientation of a non-spherical satellite is taken into account; in so doing, it also becomes possible to directly determine the atmospheric density along the orbit. The solution is to regularly conduct photometric surveillance of a large number of satellites and monitor the parameters of their rotation around the centre of mass. To do that, it is necessary to acquire and promptly process large video datasets containing images of a satellite against the background stars. In the present paper, a method is considered for the simultaneous measurement of coordinates and brightness of low-Earth-orbit space objects against the background stars as they are tracked by the KT-50 telescope, with a mirror diameter of 50 cm, and the WAT-209H2 video camera. The problem of determining the moments of exposure of images is examined in detail. The accuracy of measuring both the apparent coordinates of stars and their photometry is estimated using observations of an open star cluster. In the presented observations, the standard deviation of a single position measurement is 1″, the accuracy of determining the moment of exposure is better than 0.0001 s, and the estimated standard deviation of a single brightness measurement is 0.1 mag. Some examples of the results of satellite surveillance are also presented.

  20. Lights, Camera, Action: Advancing Learning, Research, and Program Evaluation through Video Production in Educational Leadership Preparation

    ERIC Educational Resources Information Center

    Friend, Jennifer; Militello, Matthew

    2015-01-01

    This article analyzes specific uses of digital video production in the field of educational leadership preparation, advancing a three-part framework that includes the use of video in (a) teaching and learning, (b) research methods, and (c) program evaluation and service to the profession. The first category within the framework examines videos…

  1. New fully electronic streak camera based on intensified picosecond tube

    NASA Astrophysics Data System (ADS)

    Imhoff, Claude; Eumurian, Gregoire M.; Pastre, Jean-Luc

    1993-10-01

    Thomson-CSF is introducing a new camera, the NUCAM TSN 906, designed especially for the measurement of ultra-fast light phenomena. The camera achieves a temporal resolution of less than three picoseconds over a broad light spectrum through the use of an S20, S25, or S1 photocathode. It is designed for ease of use in industrial as well as laboratory environments. In previous camera generations, the image was either acquired on photographic film or through adaptation of an external video camera. With the TSN 906, electronic image acquisition is standard. NUCAM integrates an intensified image converter tube, a 512 × 512 pixel CCD sensor, image memory and a GPIB interface in the same case. This original design produces a very compact and low-cost camera. The camera can be used locally by displaying the image on a video monitor, or remote-controlled via a microcomputer fitted with an interface board and control software.

  2. High speed cooled CCD experiments

    SciTech Connect

    Pena, C.R.; Albright, K.L.; Yates, G.J.

    1998-12-31

    Experiments were conducted using cooled and intensified CCD cameras. Two different cameras were tested identically using different optical test stimulus variables. Camera gain and dynamic range were measured by varying microchannel plate (MCP) voltages and controlling light flux with neutral density (ND) filters to yield analog-to-digital units (ADU), the digitized values of each CCD pixel's analog charge. A xenon strobe (5 {micro}s FWHM, blue light, 430 nm) and a frequency-doubled Nd:YAG laser (10 ns FWHM, green light, 532 nm) were both used as pulsed illumination sources for the cameras. Images were captured on a desktop PC using commercial software. Camera gain and integration time values were adjusted using the camera software. Mean values of camera volts versus input flux were also obtained by performing line scans through regions of interest. Experiments and results will be discussed.

  3. Development of a 300,000-pixel ultrahigh-speed high-sensitivity CCD

    NASA Astrophysics Data System (ADS)

    Ohtake, H.; Hayashida, T.; Kitamura, K.; Arai, T.; Yonai, J.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Poggemann, D.; Ruckelshausen, A.; van Kuijk, H.; Bosiers, Jan T.

    2006-02-01

    We are developing an ultrahigh-speed, high-sensitivity broadcast camera that is capable of capturing clear, smooth slow-motion video even where lighting is limited, such as at professional baseball games played at night. In earlier work, we developed an ultrahigh-speed broadcast color camera [1] using three 80,000-pixel ultrahigh-speed, high-sensitivity CCDs [2]. This camera had about ten times the sensitivity of standard high-speed cameras, and enabled an entirely new style of presentation for sports broadcasts and science programs. Increasing the pixel count is crucially important for applying ultrahigh-speed, high-sensitivity CCDs to HDTV broadcasting. This paper summarizes our experimental development aimed at improving the resolution even further: a new ultrahigh-speed, high-sensitivity CCD that increases the pixel count four-fold, to 300,000 pixels.

  4. On the use of Video Camera Systems in the Detection of Kuiper Belt Objects by Stellar Occultations

    NASA Astrophysics Data System (ADS)

    Subasinghe, Dilini

    2012-10-01

    Due to the distance between us and the Kuiper Belt, direct detection of Kuiper Belt Objects (KBOs) less than 10 km in diameter is not currently possible. Indirect methods such as stellar occultations must be employed to remotely probe these bodies. The size and shape of a body, as well as information about any atmosphere or ring system, can be collected through observations of stellar occultations. This method has previously been used with some success: Roques et al. (2006) detected 3 Trans-Neptunian objects, and Schlichting et al. (2009) detected a single object in archival data. However, previous assessments of KBO occultation detection rates have been calculated only for telescopes; we extend this method to video camera systems. Building on Roques & Moncuquet (2000), we present a derivation that can be applied to any video camera system, taking into account camera specifications and diffraction effects. This allows for a determination of the number of observable KBO occultations per night. Example calculations are presented for some of the automated meteor camera systems currently in use at the University of Western Ontario. The results of this project will allow us to refine and improve our own camera system, as well as allow others to enhance their systems for KBO detection. Roques, F., Doressoundiram, A., Dhillon, V., Marsh, T., Bickerton, S., Kavelaars, J. J., Moncuquet, M., Auvergne, M., Belskaya, I., Chevreton, M., Colas, F., Fernandez, A., Fitzsimmons, A., Lecacheux, J., Mousis, O., Pau, S., Peixinho, N., & Tozzi, G. P. (2006). The Astronomical Journal, 132(2), 819-822. Roques, F., & Moncuquet, M. (2000). Icarus, 147(2), 530-544. Schlichting, H. E., Ofek, E. O., Wenz, M., Sari, R., Gal-Yam, A., Livio, M., Nelan, E., & Zucker, S. (2009). Nature, 462(7275), 895-897.

  5. SU-C-18A-02: Image-Based Camera Tracking: Towards Registration of Endoscopic Video to CT

    SciTech Connect

    Ingram, S; Rao, A; Wendt, R; Castillo, R; Court, L; Yang, J; Beadle, B

    2014-06-01

    Purpose: Endoscopic examinations are routinely performed on head and neck and esophageal cancer patients. However, these images are underutilized for radiation therapy because there is currently no way to register them to a CT of the patient. The purpose of this work is to develop a method to track the motion of an endoscope within a structure using images from standard clinical equipment. This method will be incorporated into a broader endoscopy/CT registration framework. Methods: We developed a software algorithm to track the motion of an endoscope within an arbitrary structure. We computed frame-to-frame rotation and translation of the camera by tracking surface points across the video sequence and utilizing two-camera epipolar geometry. The resulting 3D camera path was used to recover the surrounding structure via triangulation methods. We tested this algorithm on a rigid cylindrical phantom with a pattern spray-painted on the inside. We did not constrain the motion of the endoscope while recording, and we did not constrain our measurements using the known structure of the phantom. Results: Our software algorithm can successfully track the general motion of the endoscope as it moves through the phantom. However, our preliminary data do not show a high degree of accuracy in the triangulation of 3D point locations. More rigorous data will be presented at the annual meeting. Conclusion: Image-based camera tracking is a promising method for endoscopy/CT image registration, and it requires only standard clinical equipment. It is one of two major components needed to achieve endoscopy/CT registration, the second of which is tying the camera path to absolute patient geometry. In addition to this second component, future work will focus on validating our camera tracking algorithm in the presence of clinical imaging features such as patient motion, erratic camera motion, and dynamic scene illumination.
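
    The triangulation step mentioned above, recovering a 3D point once two camera poses are known, can be sketched with standard linear (DLT) triangulation; the intrinsics and geometry below are illustrative assumptions, not the study's endoscope parameters:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen at pixel x1 in camera 1
    and x2 in camera 2, given 3x4 projection matrices P1 and P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                       # dehomogenise

K = np.diag([500.0, 500.0, 1.0])              # assumed pinhole intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # camera 1 at the origin
t = np.array([[1.0], [0.0], [0.0]])                 # camera 2 moved sideways
P2 = K @ np.hstack([np.eye(3), -t])

X_true = np.array([0.2, -0.1, 4.0])           # a point in front of both cameras
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))            # ~ [0.2, -0.1, 4.0]
```

    With noise-free correspondences the linear solution is exact; with real tracked features it serves as the initial estimate that the statistical refinement improves.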

  6. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, Bojan T. (Moraga, CA); Yates, George J. (Santa Fe, NM)

    1992-01-01

    An electronic method for eliminating artifacts in a video camera (10) employing a charge coupled device (CCD) (12) as an image sensor. The method comprises the step of initializing the camera (10) prior to normal read out and includes a first dump cycle period (76) for transferring radiation generated charge into the horizontal register (28) while the decaying image on the phosphor (39) being imaged is being integrated in the photosites, and a second dump cycle period (78), occurring after the phosphor (39) image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers (32). Image charge is then transferred from the photosites (36) and (38) to the vertical registers (32) and read out in conventional fashion. The inventive method allows the video camera (10) to be used in environments having high ionizing radiation content, and to capture images of events of very short duration occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost, streak and smear phenomena caused by insufficient opacity of the registers (28) and (32), and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites (36) and (38).

  7. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, B.T.; Yates, G.J.

    1992-06-09

    An electronic method for eliminating artifacts in a video camera employing a charge coupled device (CCD) as an image sensor is disclosed. The method comprises the step of initializing the camera prior to normal read out and includes a first dump cycle period for transferring radiation generated charge into the horizontal register while the decaying image on the phosphor being imaged is being integrated in the photosites, and a second dump cycle period, occurring after the phosphor image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers. Image charge is then transferred from the photosites to the vertical registers and read out in conventional fashion. The inventive method allows the video camera to be used in environments having high ionizing radiation content, and to capture images of events of very short duration occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost, streak and smear phenomena caused by insufficient opacity of the registers, and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites. 3 figs.

  8. Lights, Camera, Action! Learning about Management with Student-Produced Video Assignments

    ERIC Educational Resources Information Center

    Schultz, Patrick L.; Quinn, Andrew S.

    2014-01-01

    In this article, we present a proposal for fostering learning in the management classroom through the use of student-produced video assignments. We describe the potential for video technology to create active learning environments focused on problem solving, authentic and direct experiences, and interaction and collaboration to promote student…

  9. Lights, Camera, Action! Learning about Management with Student-Produced Video Assignments

    ERIC Educational Resources Information Center

    Schultz, Patrick L.; Quinn, Andrew S.

    2014-01-01

    In this article, we present a proposal for fostering learning in the management classroom through the use of student-produced video assignments. We describe the potential for video technology to create active learning environments focused on problem solving, authentic and direct experiences, and interaction and collaboration to promote student…

  10. Lights, Camera, Action: Advancing Learning, Research, and Program Evaluation through Video Production in Educational Leadership Preparation

    ERIC Educational Resources Information Center

    Friend, Jennifer; Militello, Matthew

    2015-01-01

    This article analyzes specific uses of digital video production in the field of educational leadership preparation, advancing a three-part framework that includes the use of video in (a) teaching and learning, (b) research methods, and (c) program evaluation and service to the profession. The first category within the framework examines videos…

  11. 241-AZ-101 Waste Tank Color Video Camera System Shop Acceptance Test Report

    SciTech Connect

    WERRY, S.M.

    2000-03-23

    This report includes shop acceptance test results. The test was performed prior to installation at tank AZ-101. Both the camera system and camera purge system were originally sought and procured as a part of initial waste retrieval project W-151.

  12. CCD Memory

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliot, Tom; Norris, Dave; Vescelus, Fred

    1987-01-01

    CCD memory device yields over 6.4 × 10^8 levels of information on a single chip. A charge-coupled device (CCD) has been demonstrated to operate as either a read-only memory (ROM) or a photon-programmable memory with a capacity of 640,000 bits, each bit capable of being weighted to more than 1,000 discrete analog levels. Larger memory capacities are now possible using the proposed approach in conjunction with CCDs now being fabricated, which yield over 4 × 10^9 discrete levels of information on a single chip.

  13. In-situ measurements of alloy oxidation/corrosion/erosion using a video camera and proximity sensor with microcomputer control

    NASA Technical Reports Server (NTRS)

    Deadmore, D. L.

    1984-01-01

    Two noncontacting and nondestructive, remotely controlled methods of measuring the progress of oxidation/corrosion/erosion of metal alloys exposed to flame test conditions are described. The external diameter of a sample under test in a flame was measured by a video camera width-measurement system. An eddy current proximity probe system, for measurements outside of the flame, was also developed and tested. The two techniques were applied to the measurement of the oxidation of 304 stainless steel at 910 C using a Mach 0.3 flame. The eddy current probe system yielded a recession rate of 0.41 mils of diameter loss per hour, and the video system gave 0.27 mils per hour.
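
    A recession rate like the ones quoted can be obtained from a series of diameter measurements by a linear fit of diameter against exposure time; the numbers below are synthetic, not the paper's data:

```python
import numpy as np

hours = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
diam_mils = 250.0 - 0.41 * hours          # synthetic diameters, 0.41 mil/h loss

slope, intercept = np.polyfit(hours, diam_mils, 1)
rate = -slope                             # mils of diameter lost per hour
print(round(rate, 4))                     # -> 0.41
```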

  14. Hand-gesture extraction and recognition from the video sequence acquired by a dynamic camera using condensation algorithm

    NASA Astrophysics Data System (ADS)

    Luo, Dan; Ohya, Jun

    2009-01-01

    To achieve environments in which humans and mobile robots co-exist, technologies for recognizing hand gestures from the video sequence acquired by a dynamic camera could be useful for human-to-robot interface systems. Most conventional hand-gesture technologies deal only with images from a static camera. This paper proposes a very simple and stable method for extracting hand motion trajectories based on the Human-Following Local Coordinate System (HFLC System), which is obtained from the located human face and both hands. We then apply the Condensation algorithm to the extracted hand trajectories so that the hand motion is recognized. We demonstrate the effectiveness of the proposed method by conducting experiments on 35 kinds of sign-language-based hand gestures.

  15. The MMT all-sky camera

    NASA Astrophysics Data System (ADS)

    Pickering, T. E.

    2006-06-01

    The MMT all-sky camera is a low-cost, wide-angle camera system that takes images of the sky every 10 seconds, day and night. It is based on an Adirondack Video Astronomy StellaCam II video camera and utilizes an auto-iris fish-eye lens to allow safe operation under all lighting conditions, even direct sunlight. This, combined with the anti-blooming characteristics of the StellaCam's detector, allows useful images to be obtained during sunny days as well as on brightly moonlit nights. Under dark skies the system can detect stars as faint as 6th magnitude as well as very thin cirrus and low-surface-brightness zodiacal features such as gegenschein. The total hardware cost of the system was less than $3500 including computer and framegrabber card, a fraction of the cost of comparable systems utilizing traditional CCD cameras.

  16. Studying complex decision making in natural settings: using a head-mounted video camera to study competitive orienteering.

    PubMed

    Omodei, M M; McLennan, J

    1994-12-01

    Head-mounted video recording is described as a potentially powerful method for studying decision making in natural settings. Most alternative data-collection procedures are intrusive and disruptive of the decision-making processes involved while conventional video-recording procedures are either impractical or impossible. As a severe test of the robustness of the methodology we studied the decision making of 6 experienced orienteers who carried a head-mounted light-weight video camera as they navigated, running as fast as possible, around a set of control points in a forest. Use of the Wilcoxon matched-pairs signed-ranks test indicated that compared with free recall, video-assisted recall evoked (a) significantly greater experiential immersion in the recall, (b) significantly more specific recollections of navigation-related thoughts and feelings, (c) significantly more realizations of map and terrain features and aspects of running speed which were not noticed at the time of actual competition, and (d) significantly greater insight into specific navigational errors and the intrusion of distracting thoughts into the decision-making process. Potential applications of the technique in (a) the environments of emergency services, (b) therapeutic contexts, (c) education and training, and (d) sports psychology are discussed. PMID:7870526

  17. Activity profiles and hook-tool use of New Caledonian crows recorded by bird-borne video cameras.

    PubMed

    Troscianko, Jolyon; Rutz, Christian

    2015-12-01

    New Caledonian crows are renowned for their unusually sophisticated tool behaviour. Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds. To obtain first estimates of activity budgets, as well as close-up observations of tool-assisted foraging, we equipped 19 wild crows with self-developed miniature video cameras, yielding more than 10 h of analysable video footage for 10 subjects. While only four crows used tools during recording sessions, they did so extensively: across all 10 birds, we conservatively estimate that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour. Our video-loggers provided first footage of crows manufacturing, and using, one of their most complex tool types-hooked stick tools-under completely natural foraging conditions. We recorded manufacture from live branches of paperbark (Melaleuca sp.) and another tree species (thought to be Acacia spirorbis), and deployment of tools in a range of contexts, including on the forest floor. Taken together, our video recordings reveal an 'expanded' foraging niche for hooked stick tools, and highlight more generally how crows routinely switch between tool- and bill-assisted foraging. PMID:26701755

  18. Activity profiles and hook-tool use of New Caledonian crows recorded by bird-borne video cameras

    PubMed Central

    Troscianko, Jolyon; Rutz, Christian

    2015-01-01

    New Caledonian crows are renowned for their unusually sophisticated tool behaviour. Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds. To obtain first estimates of activity budgets, as well as close-up observations of tool-assisted foraging, we equipped 19 wild crows with self-developed miniature video cameras, yielding more than 10 h of analysable video footage for 10 subjects. While only four crows used tools during recording sessions, they did so extensively: across all 10 birds, we conservatively estimate that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour. Our video-loggers provided first footage of crows manufacturing, and using, one of their most complex tool types—hooked stick tools—under completely natural foraging conditions. We recorded manufacture from live branches of paperbark (Melaleuca sp.) and another tree species (thought to be Acacia spirorbis), and deployment of tools in a range of contexts, including on the forest floor. Taken together, our video recordings reveal an ‘expanded’ foraging niche for hooked stick tools, and highlight more generally how crows routinely switch between tool- and bill-assisted foraging. PMID:26701755

  19. Lights, Camera: Learning! Findings from studies of video in formal and informal science education

    NASA Astrophysics Data System (ADS)

    Borland, J.

    2013-12-01

    As part of the panel, media researcher Jennifer Borland will highlight findings from a variety of studies of videos across the spectrum of formal to informal learning, including schools, museums, and viewers' homes. In her presentation, Borland will assert that the viewing context matters a great deal, but that there are some general take-aways that can be extrapolated to the use of educational video in a variety of settings. Borland has served as an evaluator on several video-related projects funded by NASA and the National Science Foundation, including: data visualization videos and space shows developed by the American Museum of Natural History, DragonflyTV, Earth: The Operators' Manual, The Music Instinct, and Time Team America.

  20. Lights, camera, action…critique? Submit videos to AGU communications workshop

    NASA Astrophysics Data System (ADS)

    Viñas, Maria-José

    2011-08-01

    What does it take to create a science video that engages the audience and draws thousands of views on YouTube? Those interested in finding out should submit their research-related videos to AGU's Fall Meeting science film analysis workshop, led by oceanographer turned documentary director Randy Olson. Olson, writer-director of two films (Flock of Dodos: The Evolution-Intelligent Design Circus and Sizzle: A Global Warming Comedy) and author of the book Don't Be Such a Scientist: Talking Substance in an Age of Style, will provide constructive criticism on 10 selected video submissions, followed by moderated discussion with the audience. To submit your science video (5 minutes or shorter), post it on YouTube and send the link to the workshop coordinator, Maria-José Viñas (mjvinas@agu.org), with the following subject line: Video submission for Olson workshop. AGU will be accepting submissions from researchers and media officers of scientific institutions until 6:00 P.M. eastern time on Friday, 4 November. Those whose videos are selected to be screened will be notified by Friday, 18 November. All are welcome to attend the workshop at the Fall Meeting.

  1. Ground and aerial use of an infrared video camera with a mid-infrared filter (1.45 to 2.0 microns)

    NASA Astrophysics Data System (ADS)

    Everitt, J. H.; Escobar, D. E.; Nixon, P. R.; Hussey, M. A.; Blazquez, C. H.

    1986-01-01

    A black-and-white infrared (0.9 to 2.2 micron) video camera, filtered to record radiation within the 1.45 to 2.0 micron mid-infrared water absorption region, was evaluated in ground and aerial studies. Imagery of single leaves of seven plant species (four succulent, three nonsucculent) showed that succulent leaves were easily distinguishable from nonsucculent leaves. Spectrophotometric leaf reflectance measurements made over the 1.45 to 2.0 micron region confirmed the imagery results. Ground-based video recordings also showed that severely drought-stressed buffelgrass (Cenchrus ciliaris L.) plants were distinguishable from nonstressed and moderately stressed plants. Moreover, the camera provided airborne imagery that clearly differentiated between irrigated and nonirrigated grass plots. Because of the lower radiation intensity in the mid-infrared spectral region and the low sensitivity of the camera's tube, these video images were not as sharp as those obtained by visible or visible/near-infrared video cameras. Nevertheless, these results showed that a video camera with mid-infrared sensitivity has potential for use in remote sensing research and applications.

  2. Development of observation method for hydrothermal flows with acoustic video camera

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Asada, A.; Kinoshita, M.; Tamura, H.; Tamaki, K.

    2011-12-01

    DIDSON (Dual-Frequency IDentification SONar) is an acoustic lens-based sonar. It has sufficiently high resolution and a rapid enough refresh rate that it can substitute for an optical system in turbid or dark water where optical systems fail. The Institute of Industrial Science, University of Tokyo (IIS), recognizing DIDSON's superior performance, has been developing a new DIDSON-based method for observing hydrothermal discharge from seafloor vents. We expected DIDSON to reveal the whole image of a hydrothermal plume as well as details of its interior. In October 2009, we conducted a seafloor reconnaissance using the manned deep-sea submersible Shinkai6500 at the Central Indian Ridge, 18-20 deg. S, where hydrothermal plume signatures had previously been perceived. DIDSON was mounted on top of Shinkai6500 to obtain acoustic video images of hydrothermal plumes. Acoustic video images of the plumes were captured in three of seven dives; these are among the few acoustic video images of hydrothermal plumes in existence. We could identify shadings inside the acoustic video images of the plumes. The silhouettes of the plumes varied from second to second, and the shadings inside them changed shape as well; these variations correspond to the internal structures and flows of the plumes. We are analyzing the acoustic video images to deduce information on internal structure and flow. In parallel, we are preparing a tank experiment to acquire acoustic video images of water flow at controlled flow rates; its purpose is to understand the relation between flow rate and acoustic video image quantitatively. Results from this experiment will support the image analysis of the hydrothermal plume data from the Central Indian Ridge. We will report an overview of the image analysis and the tank experiments, and discuss the possibility of DIDSON as an observation tool for seafloor hydrothermal activity.

  3. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE PAGES Beta

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; Yuan, Xin; Carin, Lawrence; Browning, Nigel D.

    2015-08-13

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
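
    The coded-aperture acquisition described above can be sketched in a few lines: each of T sub-frames is multiplied by its own random binary mask, and the products are integrated into a single readout frame; the statistical CS inversion that recovers the sub-frames is the expensive step omitted here. The array sizes and mask statistics below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

T, H, W = 8, 32, 32                      # 8 sub-frames folded into one readout
video = rng.random((T, H, W))            # dynamic scene (hypothetical data)

# One random binary mask per sub-frame (the coded aperture pattern).
masks = rng.integers(0, 2, size=(T, H, W)).astype(float)

# Forward model: the sensor integrates the masked sub-frames into a
# single coded frame during one exposure.
coded_frame = (masks * video).sum(axis=0)

# Recovering the T sub-frames from this single frame is an
# underdetermined inverse problem, solved in the paper by a
# sparsity-based statistical CS inversion.
```

    With T = 8 sub-frames per readout, a successful inversion would multiply the effective frame rate of the camera by 8, which is the sense in which "any camera" gains speed from the modification.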

  5. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only the raw images are available. However, the equation provided ignores the contribution of dark current. For CCD or CMOS cameras cooled well below room temperature this is not a problem, but the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation and a test of the method.
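
    A minimal sketch of the idea, assuming the standard mean-variance (photon transfer) formulation: the gain is estimated from a pair of flat-field frames, and the dark-current contribution is removed by subtracting a pair of dark frames from both the signal and the noise terms. The gain value and electron counts below are invented for the simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

K_true = 2.5                      # conversion gain in e-/DN (hypothetical)
shape = (512, 512)
photo_e, dark_e = 8000.0, 500.0   # mean photo- and dark-current electrons

def frame(mean_electrons):
    """Shot-noise-limited frame, digitized at K_true electrons per DN."""
    return rng.poisson(mean_electrons, shape) / K_true

# Two flat-field frames (photo + dark signal) and two dark frames.
F1, F2 = frame(photo_e + dark_e), frame(photo_e + dark_e)
D1, D2 = frame(dark_e), frame(dark_e)

# Mean-variance (photon transfer) gain estimate, with the dark-current
# contribution removed from both the signal and the noise terms:
#   K = (mean F1 + mean F2 - mean D1 - mean D2)
#       / (var(F1 - F2) - var(D1 - D2))
num = F1.mean() + F2.mean() - D1.mean() - D2.mean()
den = (F1 - F2).var() - (D1 - D2).var()
K_est = num / den                 # recovers ~2.5 e-/DN
```

    Differencing two frames of each type cancels fixed-pattern noise, and subtracting the dark terms removes the dark-current bias that would otherwise inflate both numerator and denominator in a warm camera.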

  6. Visual surveys can reveal rather different 'pictures' of fish densities: Comparison of trawl and video camera surveys in the Rockall Bank, NE Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    McIntyre, F. D.; Neat, F.; Collie, N.; Stewart, M.; Fernandes, P. G.

    2015-01-01

    Visual surveys allow non-invasive sampling of organisms in the marine environment, which is of particular importance in deep-sea habitats that are vulnerable to damage caused by destructive sampling devices such as bottom trawls. To enable visual surveying at depths greater than 200 m we used a deep towed video camera system to survey large areas around the Rockall Bank in the North East Atlantic. The area of seabed sampled was similar to that sampled by a bottom trawl, enabling samples from the towed video camera system to be compared with trawl sampling to quantitatively assess the numerical density of deep-water fish populations. The two survey methods provided different results for certain fish taxa and comparable results for others. Fish that exhibited a detectable avoidance behaviour to the towed video camera system, such as the Chimaeridae, yielded mean density estimates significantly lower (121 fish/km²) than those determined by trawl sampling (839 fish/km²). On the other hand, skates and rays showed no reaction to the lights in the towed body of the camera system, and mean density estimates of these were an order of magnitude higher (64 fish/km²) than from the trawl (5 fish/km²). This is probably because these fish, with their flat body shape lying close to the seabed, can pass under the footrope of the trawl but are easily detected by the benign towed video camera system. For other species, such as Molva sp., estimates of mean density were comparable between the two survey methods (towed camera, 62 fish/km²; trawl, 73 fish/km²). The towed video camera system presented here can be used as an alternative benign method for providing indices of abundance for species such as ling in areas closed to trawling, or for fish that are poorly monitored by trawl surveying in any area, such as the skates and rays.
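
    Density figures of this kind follow from simple strip-transect arithmetic: fish counted on the video, divided by the area swept by the tow. The numbers below are made up for illustration, not taken from the survey.

```python
# Strip-transect density: count / (tow length x field-of-view width).
count = 31                 # fish counted on the video transect (invented)
transect_km = 2.5          # tow length, km (invented)
swath_km = 0.002           # 2 m camera field-of-view width, in km (invented)

area_km2 = transect_km * swath_km   # area of seabed sampled
density = count / area_km2          # fish per km^2
```

    The trawl-based estimate is computed the same way, with the swept area defined by the trawl door or wing spread, which is why the two methods can be compared per unit area.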

  7. Lights! Camera! Action! Producing Library Instruction Video Tutorials Using Camtasia Studio

    ERIC Educational Resources Information Center

    Charnigo, Laurie

    2009-01-01

    From Web guides to online tutorials, academic librarians are increasingly experimenting with many different technologies in order to meet the needs of today's growing distance education populations. In this article, the author discusses one librarian's experience using Camtasia Studio to create subject specific video tutorials. Benefits, as well

  8. "Lights, Camera, Reflection": Using Peer Video to Promote Reflective Dialogue among Student Teachers

    ERIC Educational Resources Information Center

    Harford, Judith; MacRuairc, Gerry; McCartan, Dermot

    2010-01-01

    This paper examines the use of peer-videoing in the classroom as a means of promoting reflection among student teachers. Ten pre-service teachers participating in a teacher education programme in a university in the Republic of Ireland and ten pre-service teachers participating in a teacher education programme in a university in the North of

  9. Evaluation of a 0.9- to 2.2-microns sensitive video camera with a mid-infrared filter (1.45- to 2.0-microns)

    NASA Astrophysics Data System (ADS)

    Everitt, J. H.; Escobar, D. E.; Nixon, P. R.; Blazquez, C. H.; Hussey, M. A.

    The application of 0.9- to 2.2-micron sensitive black and white IR video cameras to remote sensing is examined. Field and laboratory recordings of the upper and lower surfaces of peperomia leaves, succulent prickly pear, and buffelgrass are evaluated; the reflectance, phytomass, green weight, and water content of the samples were measured. The data reveal that 0.9- to 2.2-micron video cameras are effective tools for laboratory and field research; however, the resolution and image quality of the data are poor compared to visible and near-IR images.

  10. Video-based realtime IMU-camera calibration for robot navigation

    NASA Astrophysics Data System (ADS)

    Petersen, Arne; Koch, Reinhard

    2012-06-01

    This paper introduces a new method for fast calibration of inertial measurement units (IMUs) rigidly coupled to cameras. That is, the relative rotation and translation between the IMU and the camera are estimated, allowing IMU data to be transferred into the camera's coordinate frame. Moreover, the IMU's nuisance parameters (biases and scales) and the horizontal alignment of the initial camera frame are determined. Since an iterated Kalman filter is used for estimation, information on the estimation's precision is also available. Such calibrations are crucial for IMU-aided visual robot navigation, i.e. SLAM, since wrong calibrations cause biases and drifts in the estimated position and orientation. As the estimation is performed in realtime, the calibration can be done using a freehand movement and the estimated parameters can be validated just in time. This provides the opportunity to optimize the trajectory online, increasing the quality and minimizing the time needed for calibration. Except for a marker pattern used for visual tracking, no additional hardware is required. As will be shown, the system is capable of estimating the calibration within a short period of time: depending on the requested precision, trajectories of 30 seconds to a few minutes are sufficient. This allows the system to be calibrated at startup, so that deviations in the calibration due to transport and storage can be compensated. The estimation quality and consistency are evaluated as functions of the traveled trajectories and the amount of IMU-camera displacement and rotation misalignment. It is analyzed how different types of visual markers, i.e. 2- and 3-dimensional patterns, affect the estimation. Moreover, the method is applied to mono and stereo vision systems, providing information on its applicability to robot systems. The algorithm is implemented in a modular software framework, so that it can be adapted to altered conditions easily.
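
    Once such a calibration has been estimated, applying it amounts to removing the nuisance parameters and rotating the measurement into the camera frame. The sketch below assumes a common gyro measurement model of the form scale * (raw - bias); the rotation matrix and all numeric values are invented, not results from the paper.

```python
import numpy as np

# Hypothetical calibration result: rotation from the IMU frame to the
# camera frame, plus gyro bias and scale (the IMU's nuisance parameters).
R_ci = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])          # 90 degrees about the z axis
bias = np.array([0.01, -0.02, 0.005])        # rad/s
scale = np.array([1.02, 0.98, 1.00])

def gyro_to_camera(omega_raw):
    """Remove bias and scale, then rotate the rate into the camera frame."""
    return R_ci @ (scale * (omega_raw - bias))

w_cam = gyro_to_camera(np.array([0.11, 0.08, 0.105]))
```

    In a SLAM pipeline this transfer runs on every IMU sample, which is why errors in R_ci or the bias estimate accumulate directly into position and orientation drift.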

  11. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide-Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of its own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  12. High-speed flow visualization with a new digital video camera

    NASA Astrophysics Data System (ADS)

    Volpe, Jason

    2005-11-01

    Scientific photography opened new vistas upon high-speed physics in the previous century. Now, high-speed digital cameras are becoming available to replace the older photographic technology with similar speed, resolution, and light sensitivity but vastly better utility and user-friendliness. Here we apply a Photron Fastcam APX-RS digital camera that is capable of megapixel image resolution at 3000 frames/sec up to 250,000 frames/sec at lower resolution. Frame exposure is separately adjustable down to 1 microsecond. Several of the ``icons'' of high-speed flow visualization are repeated here, including firecracker and gram-range explosions, popping a champagne cork, vortex rings, shock emergence from a shock tube, the splash of a milk drop, and the burst of a toy balloon. Many of these visualizations utilize traditional schlieren or shadowgraph optics to show shock wave propagation. Still frames and brief movies will be shown.

  13. A risk-based coverage model for video surveillance camera control optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua

    2015-12-01

    Visual surveillance systems for law enforcement or police case investigation differ from traditional applications, because they are designed to monitor pedestrians, vehicles, or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about monitored targets and events, and risk entropy is introduced to model the requirements a police surveillance task places on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.

  14. Adaptive compressive sensing camera

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simplest concept that each pixel is a charge bucket whose charge comes from the Einstein photoelectric conversion effect. Applying the manufactory design principle, we allow altering each working component by at most one step. We then simulated what such a camera can do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data storage savings are immense, and the order of magnitude of the saving is inversely proportional to target angular speed. We designed two new CCD camera components. Owing to the matured CMOS (complementary metal-oxide-semiconductor) technology, the on-chip sample-and-hold (SAH) circuitry can be designed as a dual photon detector (PD) analog circuit for change detection that predicts skipping or going forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [?] is implemented at each bucket pixel: the charge-transport bias voltage is steered toward neighboring buckets or, if not, to the ground drainage. Since the snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor the powerful WaveNet wrapper, at the sensor level. We shall compare (i) pre-processing by FFT, thresholding of significant Fourier mode components, and inverse FFT to check PSNR; and (ii) post-processing image recovery done selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii), the SAH circuitry must determine, in new-frame selection, the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data a la [?]M,N: M(t) = K(t) Log N(t).
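
    The closing relation M(t) = K(t) Log N(t) can be read as the usual compressive-sensing measurement budget: the number of random measurements M needed scales with the sparsity K times the logarithm of the ambient dimension N. A tiny helper, assuming the natural logarithm (the abstract does not specify the base):

```python
import math

def measurements_needed(K, N):
    """CS measurement budget M = K * log(N): K is the sparsity (degree of
    information), N the ambient dimension. Natural log assumed."""
    return math.ceil(K * math.log(N))

# A 1-megapixel frame that is ~500-sparse in some basis (invented figures):
M = measurements_needed(500, 1_000_000)
```

    The point of the adaptive scheme is that K(t) varies with scene activity, so a quiet frame earns a much smaller measurement budget than a busy one.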

  15. Study of recognizing multiple persons' complicated hand gestures from the video sequence acquired by a moving camera

    NASA Astrophysics Data System (ADS)

    Dan, Luo; Ohya, Jun

    2010-02-01

    Recognizing hand gestures from video acquired by a moving camera could provide a useful interface between humans and mobile robots. We develop a state-based approach to extracting and recognizing hand gestures from moving camera images. We improved the Human-Following Local Coordinate (HFLC) system, a very simple and stable method for extracting hand motion trajectories, which is obtained from the located human face, body part, and hand blob changing factor. The Condensation algorithm and a PCA-based algorithm were applied to recognize the extracted hand trajectories. In previous research, the Condensation-based method was applied only to one person's hand gestures. In this paper, we propose a principal component analysis (PCA) based approach to improve the recognition accuracy. For further improvement, temporal changes in the observed hand-area changing factor are utilized as new image features, stored in the database after being analyzed by PCA. Every hand gesture trajectory in the database is classified into one-hand gesture categories, two-hand gesture categories, or temporal changes in hand blob changes. We demonstrate the effectiveness of the proposed method by conducting experiments on 45 kinds of Japanese and American Sign Language gestures obtained from 5 people. Our experimental results show that better recognition performance is obtained by the PCA-based approach than by the Condensation-based method.

  16. DrugCam(®)-An intelligent video camera system to make safe cytotoxic drug preparations.

    PubMed

    Benizri, Frédéric; Dalifard, Benoit; Zemmour, Christophe; Henriquet, Maxime; Fougereau, Emmanuelle; Le Franc, Benoit

    2016-04-11

    DrugCam(®) is a new approach to controlling chemotherapy preparations: an intelligent video system that enables automatic verification during the critical stages of preparation, combined with a posteriori control through partial or total review of the video recording of the preparations. The assessment covered recognition of anticancer drug vials (qualitative analysis) and of syringe volumes (quantitative analysis). The qualitative analysis was conducted on a total of 120 vials, with sensitivity of 100% for 84.2% of the vials and at least 97% for all vials tested. Accuracy was at least 98.5% for all vials. The quantitative analysis was assessed by taking 10 measures of each graduation for the syringes. The identification error rate was 2.1% (244/11,640), i.e. almost 94% of errors fell on the next graduation. Only 3% (35/1164) of the graduations tested, i.e. 23/35 for volumes <0.13 ml of 1 ml syringes, presented a volume error outside the admissible limit of ±5% of a confidence band constructed around the estimated linear regression line for each syringe. In addition to the vial detection model, barcodes can also be read when they are present on vials. DrugCam(®) offers an innovative approach to controlling chemotherapy preparations and constitutes an optimized application of telepharmacy. PMID:26923317

  17. The evolution of the scientific CCD

    NASA Astrophysics Data System (ADS)

    Blouke, M. M.

    2011-03-01

    There is little doubt that the Charge-Coupled Device (CCD) and its cousin the CMOS Active Pixel Sensor (APS) have completely revolutionized the imaging field. It is becoming more and more difficult to obtain film for cameras, while multimegapixel digital cameras are available to everyone from cell phone users to professional photographers. This paper explores some of the origins of the CCD as a scientific sensor.

  18. Introducing Contactless Blood Pressure Assessment Using a High Speed Video Camera.

    PubMed

    Jeong, In Cheol; Finkelstein, Joseph

    2016-04-01

    Recent studies demonstrated that blood pressure (BP) can be estimated using pulse transit time (PTT). For PTT calculation, a photoplethysmogram (PPG) is usually used to detect a time lag in pulse wave propagation which is correlated with BP. Until now, PTT and PPG were registered using a set of body-worn sensors. In this study a new methodology is introduced allowing contactless registration of PTT and PPG using a high speed camera, resulting in corresponding image-based PTT (iPTT) and image-based PPG (iPPG) generation. The iPTT value can potentially be utilized for blood pressure estimation; however, the extent of correlation between iPTT and BP is unknown. The goal of this preliminary feasibility study was to introduce the methodology for contactless generation of iPPG and iPTT and to make an initial estimation of the extent of correlation between iPTT and BP "in vivo." A short cycling exercise was used to generate BP changes in healthy adult volunteers in three consecutive visits. BP was measured by a verified BP monitor simultaneously with iPTT registration at three exercise points: rest, exercise peak, and recovery. iPPG was simultaneously registered at two body locations during the exercise using a high speed camera at 420 frames per second. iPTT was calculated as the time lag between pulse waves obtained from two iPPGs registered by simultaneous recording of the head and palm areas. The average inter-person correlation between PTT and iPTT was 0.85 ± 0.08. The range of inter-person correlations between PTT and iPTT was from 0.70 to 0.95 (p < 0.05). The average inter-person coefficient of correlation between SBP and iPTT was -0.80 ± 0.12. The range of correlations between systolic BP and iPTT was from 0.632 to 0.960, with p < 0.05 for most of the participants. Preliminary data indicated that a high speed camera can potentially be utilized for unobtrusive contactless monitoring of abrupt blood pressure changes in a variety of settings. The initial prototype system was able to successfully generate an approximation of pulse transit time and showed high intra-individual correlation between iPTT and BP. Further investigation of the proposed approach is warranted. PMID:26791993
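
    The iPTT computation described above, a time lag between two pulse waveforms, can be sketched as the lag that maximizes the cross-correlation of the two iPPG signals. The synthetic Gaussian pulses below stand in for real head and palm iPPG traces; only the 420 frames/s rate comes from the study, everything else is invented.

```python
import numpy as np

fs = 420.0                                   # camera frame rate from the study
t = np.arange(0, 2.0, 1.0 / fs)              # 2 s of video, 840 frames
true_lag = 0.050                             # 50 ms transit time (synthetic)

# Synthetic iPPG pulse waveforms: the palm pulse is the head pulse
# delayed by the pulse transit time (Gaussian pulses as stand-ins).
head = np.exp(-((t - 0.50) ** 2) / (2 * 0.05 ** 2))
palm = np.exp(-((t - 0.50 - true_lag) ** 2) / (2 * 0.05 ** 2))

# iPTT = lag (in seconds) that maximizes the cross-correlation.
corr = np.correlate(palm, head, mode="full")
lags = np.arange(-len(head) + 1, len(palm))
iptt = lags[np.argmax(corr)] / fs
```

    At 420 frames/s the lag resolution is about 2.4 ms per sample, which bounds how finely a camera-based system can resolve transit-time changes without interpolation.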

  19. Measuring multivariate subjective image quality for still and video cameras and image processing system components

    NASA Astrophysics Data System (ADS)

    Nyman, Göte; Leisti, Tuomas; Lindroos, Paul; Radun, Jenni; Suomi, Sini; Virtanen, Toni; Olives, Jean-Luc; Oja, Joni; Vuori, Tero

    2008-01-01

    The subjective quality of an image is a non-linear product of several, simultaneously contributing subjective factors such as the experienced naturalness, colorfulness, lightness, and clarity. We have studied subjective image quality by using a hybrid qualitative/quantitative method in order to disclose relevant attributes to experienced image quality. We describe our approach in mapping the image quality attribute space in three cases: still studio image, video clips of a talking head and moving objects, and in the use of image processing pipes for 15 still image contents. Naive observers participated in three image quality research contexts in which they were asked to freely and spontaneously describe the quality of the presented test images. Standard viewing conditions were used. The data shows which attributes are most relevant for each test context, and how they differentiate between the selected image contents and processing systems. The role of non-HVS based image quality analysis is discussed.

  20. A simple, inexpensive video camera setup for the study of avian nest activity

    USGS Publications Warehouse

    Sabine, J.B.; Meyers, J.M.; Schweitzer, S.H.

    2005-01-01

    Time-lapse video photography has become a valuable tool for collecting data on avian nest activity and depredation; however, commercially available systems are expensive (>USA $4000/unit). We designed an inexpensive system to identify causes of nest failure of American Oystercatchers (Haematopus palliatus) and assessed its utility at Cumberland Island National Seashore, Georgia. We successfully identified raccoon (Procyon lotor), bobcat (Lynx rufus), American Crow (Corvus brachyrhynchos), and ghost crab (Ocypode quadrata) predation on oystercatcher nests. Other detected causes of nest failure included tidal overwash, horse trampling, abandonment, and human destruction. System failure rates were comparable with commercially available units. Our system's efficacy and low cost (<$800) provided useful data for the management and conservation of the American Oystercatcher.

  1. CCD based beam loss monitor for ion accelerators

    NASA Astrophysics Data System (ADS)

    Belousov, A.; Mustafin, E.; Ensinger, W.

    2014-04-01

    Beam loss monitoring is an important aspect of proper accelerator functioning. There is a variety of existing solutions, but each has its own disadvantages, e.g. unsuitable dynamic range or time resolution, high cost, or short lifetime. Therefore, new options are looked for. This paper shows a method of applying a charge-coupled device (CCD) video camera as a beam loss monitor (BLM) for ion beam accelerators. The system was tested with a 500 MeV/u N7+ ion beam interacting with an aluminum target. The algorithms for camera signal processing with LabVIEW-based code and beam loss measurement are explained. Limits of applicability of this monitor system are discussed.
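
    The camera-to-monitor signal chain can be sketched as reducing each background-subtracted frame to one scalar loss value. This is a hypothetical stand-in for the LabVIEW processing mentioned in the abstract, not the authors' actual algorithm; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def beam_loss_signal(frame, dark, threshold=5.0):
    """Reduce a CCD frame to a scalar loss signal: subtract the dark
    reference, zero sub-threshold pixels (noise floor), and sum."""
    net = frame.astype(float) - dark
    net[net < threshold] = 0.0
    return net.sum()

dark = np.full((16, 16), 10.0)               # dark/pedestal reference frame
quiet = dark + rng.normal(0, 1, (16, 16))    # no beam loss: readout noise only
hit = quiet.copy()
hit[4:8, 4:8] += 200.0                       # radiation-induced bright cluster

assert beam_loss_signal(quiet, dark) < beam_loss_signal(hit, dark)
```

    Thresholding before summing keeps readout noise from accumulating into a false loss signal, at the cost of missing losses below the noise floor, one of the dynamic-range limits such a monitor runs into.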

  2. Evaluation of a high dynamic range video camera with non-regular sensor

    NASA Astrophysics Data System (ADS)

    Schöberl, Michael; Keinert, Joachim; Ziegler, Matthias; Seiler, Jürgen; Niehaus, Marco; Schuller, Gerald; Kaup, André; Foessel, Siegfried

    2013-01-01

    Although there is steady progress in sensor technology, imaging with a high dynamic range (HDR) is still difficult for motion imaging with high image quality. This paper presents our new approach for video acquisition with high dynamic range. The principle is based on optical attenuation of some of the pixels of an existing image sensor. This well known method traditionally trades spatial resolution for an increase in dynamic range. In contrast to existing work, we use a non-regular pattern of optical ND filters for attenuation. This allows for an image reconstruction that is able to recover high resolution images. The reconstruction is based on the assumption that natural images can be represented nearly sparse in transform domains, which allows for recovery of scenes with high detail. The proposed combination of non-regular sampling and image reconstruction leads to a system with an increase in dynamic range without sacrificing spatial resolution. In this paper, a further evaluation is presented on the achievable image quality. In our prototype we found that crosstalk is present and significant. The discussion thus shows the limits of the proposed imaging system.

  3. Characterization of insect vision based collision avoidance models using a video camera

    NASA Astrophysics Data System (ADS)

    Guzinski, R.; Nguyen, K.; Yong, Z. H.; Rajesh, S.; O'Carroll, D. C.; Abbott, D.

    2006-01-01

    Insects have very efficient vision algorithms that allow them to perform complex manoeuvres in real time, while using a very limited processing power. In this paper we study some of the properties of these algorithms with the aim of implementing them in microchip devices. To achieve this we simulate insect vision using our software, which utilises the Horridge Template Model, to detect the angular velocity of a moving object. The motion is simulated using a number of rotating images showing both artificial constructs and real life scenes and is captured with a CMOS camera. We investigate the effects of texel density, contrast, luminance and chrominance properties of the moving images. Pre and post template filtering and different threshold settings are used to improve the accuracy of the estimated angular velocity. We then further analyse and compare the results obtained. We will then implement an efficient velocity estimation algorithm that produces reliable results. Lastly, we will also look into developing the estimation of time to impact algorithm.

  4. Development of CCD controller for scientific application

    NASA Astrophysics Data System (ADS)

    Khan, M. S.; Pathan, F. M.; Shah, U. V., Prof; Makwana, D. H., Prof; Anandarao, B. G., Prof

    2010-02-01

    Photoelectric equipment has wide applications, such as spectroscopy, temperature measurement in the infrared region, and astronomical research. A photoelectric transducer converts radiant energy into electrical energy. Two types of photoelectric transducers, the photomultiplier tube (PMT) and the charge-coupled device (CCD), are used to convert radiant energy into an electrical signal; modern instruments now use CCD technology. We have designed and developed a CCD camera controller using the Marconi CCD47-10 camera chip, which has 1K × 1K pixels, for space application.

  5. Jellyfish Support High Energy Intake of Leatherback Sea Turtles (Dermochelys coriacea): Video Evidence from Animal-Borne Cameras

    PubMed Central

    Heaslip, Susan G.; Iverson, Sara J.; Bowen, W. Don; James, Michael C.

    2012-01-01

    The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08-3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83-100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ d⁻¹ but were as high as 167,797 kJ d⁻¹, corresponding to turtles consuming an average of 330 kg wet mass d⁻¹ (up to 840 kg d⁻¹) or approximately 261 (up to 664) jellyfish d⁻¹. Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass d⁻¹, equating to an average energy intake of 3-7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to southward migration. PMID:22438906

  6. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  7. General purpose solid state camera for SERTS

    NASA Astrophysics Data System (ADS)

    Payne, Leslie J.; Haas, J. Patrick

    1996-11-01

    The Laboratory for Astronomy and Solar Physics at Goddard Space Flight Center uses a variety of CCDs and other solid state imaging sensors for its instrumentation programs. Traditionally, custom camera systems are built around the imaging device to optimize the circuitry for the particular sensor. This usually produces a camera that is small, uses little power and is elegant. Although these are desirable characteristics, this approach is also expensive and time consuming. An alternative approach is to design a `universal' camera that is easily customized to meet specific mission requirements. This is the approach our team used for SERTS. The camera supporting the SERTS mission is a general purpose design derived from an existing camera on the SOHO spacecraft. This camera is designed to be rugged, modest in power requirements and flexible. The base design supports quadrant CCD devices with up to 4 phases; imaging devices with simpler architectures are in general supportable. The basic camera is comprised of a main electronics box which performs all timing generation, voltage level control, data processing and compression. A second unit, placed close to the detector head, is responsible for driving the imaging device's control electrodes and amplifying the multichannel detector video. Programmable high voltage units are used for the single stage MCP type intensifier. The detector head is customized for each sensor type supported. Auxiliary equipment includes a frame buffer that works either as a multi-frame storage unit or as a photon counting accumulation unit; this unit also performs interface buffering so that the camera may appear as a piece of GPIB instrumentation.

  8. Micro-rheology Using Multi Speckle DWS with Video Camera. Application to Film Formation, Drying and Rheological Stability

    NASA Astrophysics Data System (ADS)

    Brunel, Laurent; Dihang, Hélène

    2008-07-01

    We present in this work two applications of microrheology: the monitoring of film formation and the rheological stability. Microrheology is based on the Diffusing Wave Spectroscopy (DWS) method [1] that relates the particle dynamics to the speckle field dynamics, and further to the visco-elastic moduli G′ and G″ with respect to frequency [2]. Our technology uses the Multi Speckle DWS (MS-DWS) set-up in backscattering with a video camera. For the film formation and drying application, we present a new algorithm called "Adaptive Speckle Imaging Interferometry" (ASII) that extracts a simple kinetics from the speckle field dynamics [3,4]. Different film-forming and drying systems have been investigated (water-based, solvent-based and solvent-free paints, inks, adhesives, varnishes, etc.) on various types of substrates and at different thicknesses (a few to hundreds of microns). For rheological stability we show that the robust measurement of speckle correlation using the inter-image distance [3] can bring useful information to industry on viscoelasticity variations over a wide range of frequencies without additional parameters.
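
    The inter-image distance the abstract relies on compares consecutive speckle frames: the faster the sample dynamics, the more the speckle decorrelates between frames. A minimal sketch of one such distance (a normalized mean absolute difference — a simple stand-in, not the published estimator):

    ```python
    import numpy as np

    def inter_image_distance(frame_a, frame_b):
        """Normalized mean absolute difference between two speckle frames:
        a simple proxy for a speckle-decorrelation measure.  Faster sample
        dynamics -> larger distance between consecutive frames."""
        a = frame_a.astype(float)
        b = frame_b.astype(float)
        return np.mean(np.abs(a - b)) / (np.mean(a) + np.mean(b))

    rng = np.random.default_rng(0)
    static = rng.integers(0, 255, (64, 64))   # frozen speckle (no dynamics)
    moving = rng.integers(0, 255, (64, 64))   # fully decorrelated speckle
    d_static = inter_image_distance(static, static)
    d_moving = inter_image_distance(static, moving)
    ```

    Tracking this distance over frame lag gives the correlation decay from which DWS recovers particle dynamics.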

  9. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82, and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
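
    The comparison the abstract describes — a univariate regression against Chl a versus a bivariate regression that adds true color — can be sketched on synthetic numbers (all data below are invented for illustration, not the study's measurements):

    ```python
    import numpy as np

    # Synthetic stand-ins: Chl a and true color driving an upwelling band ratio.
    rng = np.random.default_rng(1)
    n = 40
    chl = rng.uniform(2.0, 30.0, n)            # "ground-truthed" Chl a (ug/L)
    color = rng.uniform(10.0, 80.0, n)         # true color (Pt-Co units, assumed)
    ratio = 0.02 * chl - 0.005 * color + rng.normal(0.0, 0.02, n)

    def r_squared(X, y):
        """Coefficient of determination for an ordinary least-squares fit."""
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
        return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

    ones = np.ones(n)
    r2_uni = r_squared(np.column_stack([ones, chl]), ratio)        # Chl a only
    r2_bi = r_squared(np.column_stack([ones, chl, color]), ratio)  # + true color
    # As in the study, the bivariate model explains more of the variation.
    ```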

  10. Bird-Borne Video-Cameras Show That Seabird Movement Patterns Relate to Previously Unrevealed Proximate Environment, Not Prey

    PubMed Central

    Tremblay, Yann; Thiebault, Andréa; Mullers, Ralf; Pistorius, Pierre

    2014-01-01

    The study of ecological and behavioral processes has been revolutionized in the last two decades with the rapid development of biologging science. Recently, using image-capturing devices, some pilot studies demonstrated the potential of understanding marine vertebrate movement patterns in relation to their proximate, as opposed to remotely sensed, environmental contexts. Here, using miniaturized video cameras and GPS tracking recorders simultaneously, we show for the first time that information on the immediate visual surroundings of a foraging seabird, the Cape gannet, is fundamental in understanding the origins of its movement patterns. We found that movement patterns were related to specific stimuli which were mostly other predators such as gannets, dolphins or fishing boats. Contrary to a widely accepted idea, our data suggest that foraging seabirds are not directly looking for prey. Instead, they search for indicators of the presence of prey, the latter being targeted at the very last moment and at a very small scale. We demonstrate that movement patterns of foraging seabirds can be heavily driven by processes unobservable with conventional methodology. Except perhaps for large scale processes, local enhancement seems to be the only ruling mechanism; this has profound implications for ecosystem-based management of marine areas. PMID:24523892

  11. Linear CCD attitude measurement system based on the identification of the auxiliary array CCD

    NASA Astrophysics Data System (ADS)

    Hu, Yinghui; Yuan, Feng; Li, Kai; Wang, Yan

    2015-10-01

    To address the problem of high-precision attitude measurement of a flying target over a large space and wide field of view, and after comparing existing measurement methods, a system is proposed in which two array CCDs assist three linear CCDs in identifying a multi-point cooperative target. This avoids the nonlinear system errors, the large number of calibration parameters, and the overly complicated constraints among camera positions of a nine-linear-CCD spectroscopic test system. Mathematical models of the binocular-vision and three-linear-CCD subsystems are established. Three red LED point lights form a triangular cooperative target whose point coordinates are given in advance by a coordinate measuring machine. Three blue LED point lights are added along the sides of the triangle as auxiliaries so that the array CCDs can more easily identify the three red points, while the linear CCD cameras are fitted with red filters to block the blue points and reduce stray light. The array CCDs measure the spots, identifying the red LEDs and computing their spatial coordinates, while the linear CCDs measure the same three red spots to solve the linear-CCD test system, from which 27 solutions can be drawn. Using the array-CCD coordinates to assist the linear CCDs achieves spot identification and solves the difficult problem of multi-target identification with linear CCDs. Exploiting the imaging characteristics of linear CCDs, a special cylindrical lens system with a telecentric optical design was developed; within the depth-of-convergence range, the energy center of the spot position changes little in the direction perpendicular to the optical axis, ensuring high-precision image quality. The complete test system improves both the speed and the precision of spatial attitude measurement.
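
    The core operation the auxiliary stereo (binocular) array-CCD pair performs — locating each red LED in 3-D from two views — is standard linear (DLT) triangulation. A minimal sketch with toy projection matrices (the camera geometry below is invented for illustration):

    ```python
    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """Linear (DLT) triangulation of one 3-D point from two calibrated
        views.  P1, P2 are 3x4 projection matrices; x1, x2 are the measured
        pixel coordinates (u, v) of the same spot in each view."""
        A = np.vstack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)       # null vector of A = homogeneous point
        X = vt[-1]
        return X[:3] / X[3]

    def proj(P, X):
        h = P @ np.append(X, 1.0)
        return h[:2] / h[2]

    # Two toy cameras: identity pose, and a 1-unit baseline along x.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
    led = np.array([0.2, -0.1, 5.0])      # one red LED in space
    recovered = triangulate(P1, P2, proj(P1, led), proj(P2, led))
    ```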

  12. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Technical Reports Server (NTRS)

    Bolas, M. T.; Fisher, S. S.

    1990-01-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.

  13. Video photographic considerations for measuring the proximity of a probe aircraft with a smoke seeded trailing vortex

    NASA Technical Reports Server (NTRS)

    Childers, Brooks A.; Snow, Walter L.

    1990-01-01

    Considerations for acquiring and analyzing 30 Hz video frames from charge coupled device (CCD) cameras mounted in the wing tips of a Beech T-34 aircraft are described. Particular attention is given to the characterization and correction of optical distortions inherent in the data.
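
    Optical distortion in such wing-tip camera frames is commonly characterized with the Brown radial model (the paper's exact model is not stated; the coefficients and intrinsics below are illustrative assumptions):

    ```python
    def distort_radial(x, y, k1, k2, cx, cy, f):
        """Brown radial model (first two terms): maps an ideal (undistorted)
        pixel to where the lens actually images it.  Characterizing a camera
        means estimating k1, k2; correcting a frame means inverting this map.
        All parameter values here are illustrative."""
        xn, yn = (x - cx) / f, (y - cy) / f        # normalized coordinates
        r2 = xn * xn + yn * yn
        s = 1.0 + k1 * r2 + k2 * r2 * r2           # radial scale factor
        return cx + f * xn * s, cy + f * yn * s
    ```

    With k1 = k2 = 0 the mapping is the identity, and the principal point (cx, cy) is always a fixed point, which makes the model easy to sanity-check against a calibration target.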

  14. Evaluating the Effects of Camera Perspective in Video Modeling for Children with Autism: Point of View versus Scene Modeling

    ERIC Educational Resources Information Center

    Cotter, Courtney

    2010-01-01

    Video modeling has been used effectively to teach a variety of skills to children with autism. This body of literature is characterized by a variety of procedural variations including the characteristics of the video model (e.g., self vs. other, adult vs. peer). Traditionally, most video models have been filmed using third person perspective

  15. Improvement in the light sensitivity of the ultrahigh-speed high-sensitivity CCD with a microlens array

    NASA Astrophysics Data System (ADS)

    Hayashida, T.; Yonai, J.; Kitamura, K.; Arai, T.; Kurita, T.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Kitagawa, S.; Hatade, K.; Yamaguchi, T.; Takeuchi, H.; Iida, K.

    2008-02-01

    We are advancing the development of ultrahigh-speed, high-sensitivity CCDs for broadcast use that are capable of capturing smooth slow-motion videos in vivid colors even where lighting is limited, such as at professional baseball games played at night. We have already developed a 300,000 pixel, ultrahigh-speed CCD, and a single CCD color camera that has been used for sports broadcasts and science programs using this CCD. However, there are cases where even higher sensitivity is required, such as when using a telephoto lens during a baseball broadcast or a high-magnification microscope during science programs. This paper provides a summary of our experimental development aimed at further increasing the sensitivity of CCDs using the light-collecting effects of a microlens array.

  16. Automatic CCD Imaging Systems for Time-series CCD Photometry

    NASA Astrophysics Data System (ADS)

    Caton, D. B.; Pollock, J. T.; Davis, S. A.

    2004-12-01

    CCDs allow precision photometry to be done with small telescopes and at sites with less than ideal seeing conditions. The addition of an automatic observing mode makes it easy to do time-series CCD photometry of variable stars and AGN/QSOs. At Appalachian State University's Dark Sky Observatory (DSO), we have implemented automatic imaging systems for image acquisition, scripted filter changing, data storage and quick-look online photometry at two different telescopes, the 32-inch and the 18-inch. The camera at the 18-inch allows a simple system in which the data acquisition PC controls a DFM Engineering filter wheel and a Photometrics/Roper camera. The 32-inch system is the more complex, with three computers communicating in order to make good use of the camera's 30-second CCD-read time for filter changes. Both telescopes use macros written in the PMIS software (GKR Computer Consulting). Both systems allow automatic data capture with only minimal attention from the observer. Indeed, one observer can easily run both telescopes simultaneously. The efficiency and reliability of these systems also reduce observer errors. The only unresolved problem is an occasional but rare camera-read error (the PC is apparently interrupted). We also sometimes experience a crash of the PMIS software, probably due to its 16-bit code now running in the 32-bit Windows 2000 environment. We gratefully acknowledge the support of the National Science Foundation through grants AST-0089248 and AST-9119750, the Dunham Fund for Astrophysical Research, and the ASU Research Council.
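
    The scripted observing loop described above (change filter, expose, repeat) can be sketched in Python. The actual DSO automation is written as PMIS macros; the driver classes below are hypothetical stand-ins, not the real DFM or Photometrics/Roper APIs:

    ```python
    class FilterWheel:
        """Stand-in for a filter-wheel driver (hypothetical API)."""
        def __init__(self, filters):
            self.filters = filters
            self.position = filters[0]
        def select(self, name):
            self.position = name

    class Camera:
        """Stand-in for a CCD camera driver (hypothetical API)."""
        def __init__(self):
            self.log = []
        def expose(self, seconds, filter_name):
            # A real driver would start the exposure and read the CCD here.
            self.log.append((filter_name, seconds))

    def run_time_series(wheel, camera, sequence, n_cycles):
        """Scripted time-series photometry: cycle through (filter, exposure)
        pairs, changing the filter before each exposure."""
        for _ in range(n_cycles):
            for filter_name, seconds in sequence:
                wheel.select(filter_name)
                camera.expose(seconds, wheel.position)

    wheel = FilterWheel(["B", "V", "R"])
    cam = Camera()
    run_time_series(wheel, cam, [("B", 60), ("V", 30), ("R", 20)], n_cycles=2)
    ```

    In a system like the 32-inch's, the filter change would be issued during the CCD read so the two operations overlap.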

  17. Design of video interface conversion system based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhao, Heng; Wang, Xiang-jun

    2014-11-01

    This paper presents an FPGA-based video interface conversion system that enables inter-conversion between digital and analog video. A Cyclone IV series EP4CE22F17C chip from Altera Corporation is used as the main video processing chip, and a single-chip microcontroller is used as the information interaction control unit between the FPGA and the PC. The system is able to encode/decode messages from the PC. Technologies including the video decoding/encoding circuits, the bus communication protocol, data stream de-interleaving and de-interlacing, color space conversion and the Camera Link timing generator module of the FPGA are introduced. The system converts the Composite Video Broadcast Signal (CVBS) from the CCD camera into Low Voltage Differential Signaling (LVDS), which is collected by the video processing unit through the Camera Link interface. The processed video signals are then fed to the system output board and displayed on the monitor. Experiments show that the system achieves high-quality video conversion with minimal board size.
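
    The color space conversion step mentioned above, between the decoded analog stream and the digital output, is typically the ITU-R BT.601 RGB-to-YCbCr transform. A reference sketch (full-range BT.601 coefficients; whether this is the exact matrix used in the paper's FPGA is an assumption):

    ```python
    def rgb_to_ycbcr(r, g, b):
        """Full-range ITU-R BT.601 RGB -> YCbCr conversion, the kind of
        color-space transform implemented as a fixed-point matrix multiply
        in video-conversion FPGAs.  Inputs and outputs are 0-255 floats."""
        y  =  0.299    * r + 0.587    * g + 0.114    * b
        cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128.0
        cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128.0
        return y, cb, cr
    ```

    In hardware the same matrix is usually implemented with scaled integer coefficients and shifts rather than floating point.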

  18. Low-light-level EMCCD color camera

    NASA Astrophysics Data System (ADS)

    Heim, Gerald B.; Burkepile, Jon; Frame, Wayne W.

    2006-05-01

    Video cameras have increased in usefulness in military applications over the past four decades. This is a result of many advances in technology and because no one portion of the spectrum reigns supreme under all environmental and operating conditions. The visible portion of the spectrum has the clear advantage of ease of information interpretation, requiring little or no training. This advantage extends into the Near IR (NIR) spectral region to silicon cutoff with little difficulty. Inclusion of the NIR region is of particular importance due to the rich photon content of natural night illumination. The addition of color capability offers another dimension to target/situation discrimination and hence is highly desirable. A military camera must be small, lightweight and low power. Limiting resolution and sensitivity cannot be sacrificed to achieve color capability. Newly developed electron-multiplication CCD sensors (EMCCDs) open the door to a practical low-light/all-light color camera without an image intensifier. Ball Aerospace & Technologies Corp (BATC) has developed a unique color camera that allows the addition of color with a very small impact on low light level performance and negligible impact on limiting resolution. The approach, which includes the NIR portion of the spectrum along with the visible, requires no moving parts and is based on the addition of a sparse sampling color filter to the surface of an EMCCD. It renders the correct hue in a real time, video rate image with negligible latency. Furthermore, camera size and power impact is slight.

  19. Cryostat and CCD for MEGARA at GTC

    NASA Astrophysics Data System (ADS)

    Castillo-Domínguez, E.; Ferrusca, D.; Tulloch, S.; Velázquez, M.; Carrasco, E.; Gallego, J.; Gil de Paz, A.; Sánchez, F. M.; Vílchez Medina, J. M.

    2012-09-01

    MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is the new integral field unit (IFU) and multi-object spectrograph (MOS) instrument for the GTC. The spectrograph subsystems include the pseudo-slit, the shutter, the collimator with a focusing mechanism, pupil elements on a volume phase holographic grating (VPH) wheel and the camera joined to the cryostat through the last lens, with a CCD detector inside. In this paper we describe the full preliminary design of the cryostat which will harbor the CCD detector for the spectrograph. The selected cryogenic device is an LN2 open-cycle cryostat which has been designed by the "Astronomical Instrumentation Lab for Millimeter Wavelengths" at INAOE. A complete description of the cryostat main body and CCD head is presented as well as all the vacuum and temperature sub-systems to operate it. The CCD is surrounded by a radiation shield to improve its performance and is placed in a custom made mechanical mounting which will allow physical adjustments for alignment with the spectrograph camera. The 4k x 4k pixel CCD231 is our selection for the cryogenically cooled detector of MEGARA. The characteristics of this CCD, the internal cryostat cabling and CCD controller hardware are discussed. Finally, static structural finite element modeling and thermal analysis results are shown to validate the cryostat model.

  20. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

    The paper describes a new generation of high-performance, remotely controlled CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned differently than before, resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4-Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized, reconfigurable, and modular in comparison with the first-generation solution, due to the application of open software solutions and an FPGA circuit, the Altera Cyclone EP1C6. New algorithms were implemented in the FPGA chip. The camera system uses the following advanced electronic circuits: the CY7C68013a microcontroller (8051 core) by Cypress, the AD9826 image processor by Analog Devices, the RTL8169s Gigabit Ethernet interface by Realtek, AT45DB642 memory by Atmel, and the ARM926EJ-S AT91SAM9260 microprocessor by ARM and Atmel. Software solutions for the camera, its remote control, and image data acquisition are based entirely on open-source platforms, using the ISI image interface and the V4L2 API, the AMBA AHB data bus, and the INDI protocol. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide-angle observations of the sky in the Pi-of-the-Sky research program.

  1. CCD calibration method for wheel set wear online measurement

    NASA Astrophysics Data System (ADS)

    Chen, Jiang; Wu, Kaihua

    2010-11-01

    CCD calibration is carried out for online measurement of wheel set wear. Calibration precision influences the accuracy of the wheel set measurement. A CCD calibration method is designed based on the perspective projection of a pinhole camera and an iterative algorithm. The space angles and the object distance of the CCD are obtained by calibration, and the transformation relation between physical coordinates and pixel coordinates is determined by these two parameters. Experimental results showed that the space angle was determined with an error of 0.1 and the object distance with an error of 2 mm. The calibration method meets the accuracy demands of wheel set wear measurement.
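
    The pinhole perspective projection underlying this calibration maps a world point through the camera pose (the space angles and object distance being calibrated) and the intrinsics to pixel coordinates. A minimal sketch; the focal length, principal point, and 2000 mm object distance below are invented values:

    ```python
    import numpy as np

    def project(K, R, t, X):
        """Pinhole-camera perspective projection: world point X -> pixel (u, v).
        K holds the intrinsics (focal length in pixels, principal point);
        R, t encode the pose the calibration recovers."""
        x_cam = R @ X + t          # world -> camera coordinates
        uvw = K @ x_cam            # camera -> homogeneous pixel coordinates
        return uvw[:2] / uvw[2]    # perspective divide

    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                        # camera aligned with world axes
    t = np.array([0.0, 0.0, 2000.0])     # object distance: 2000 mm (assumed)
    u, v = project(K, R, t, np.array([100.0, 50.0, 0.0]))
    ```

    Calibration runs this model in reverse: given measured pixel/world correspondences, it iteratively adjusts R and t until the reprojection error is minimized.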

  2. Are traditional methods of determining nest predators and nest fates reliable? An experiment with Wood Thrushes (Hylocichla mustelina) using miniature video cameras

    USGS Publications Warehouse

    Williams, Gary E.; Wood, P.B.

    2002-01-01

    We used miniature infrared video cameras to monitor Wood Thrush (Hylocichla mustelina) nests during 1998-2000. We documented nest predators and examined whether evidence at nests can be used to predict predator identities and nest fates. Fifty-six nests were monitored; 26 failed, with 3 abandoned and 23 depredated. We predicted predator class (avian, mammalian, snake) prior to review of video footage and were incorrect 57% of the time. Birds and mammals were underrepresented whereas snakes were over-represented in our predictions. We documented ≥9 nest-predator species, with the southern flying squirrel (Glaucomys volans) taking the most nests (n = 8). During 2000, we predicted fate (fledge or fail) of 27 nests; 23 were classified correctly. Traditional methods of monitoring nests appear to be effective for classifying success or failure of nests, but ineffective at classifying nest predators.

  3. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ≈0.5 m diameter focal plane of 62 2k × 4k CCDs. The camera vessel including the optical window cell, focal plate, focal plate mounts, cooling system and thermal controls is described. As part of the development of the mechanical and cooling design, a full scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  4. IR CCD staring imaging system

    NASA Astrophysics Data System (ADS)

    Zhou, Qibo

    1991-12-01

    An infrared staring imaging system in the 3 to 5 micrometers spectral band has been developed by SITP laboratories in China. The sensor utilized is a Pt-Si Schottky-barrier infrared CCD focal plane array. The digital video processing electronics is designed for 32 X 64, 64 X 64, and 128 X 128 pixel formats and contains the elimination of fixed pattern noise and the correction of response nonuniformity in real time and provides the high-quality IR image. The standard TV compatible and portable features are part of the design. In this paper, we describe the design and performance of this prototype system and present some experimental examples of IR imagery. The results demonstrate that the Pt-Si IR CCD imaging system has good performance, high reliability, low cost, and can be suitable for a variety of commercial applications.

  5. Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic

    NASA Astrophysics Data System (ADS)

    Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Kondo, Yasushi

    2008-11-01

    Visualization of explosion phenomena is very important and essential to evaluate the performance of explosive effects. The phenomena, however, generate blast waves and fragments from cases. We must protect our visualizing equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in severe blast environment, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to the images taken by the camera with the lens directly coupled to the camera head. It could be confirmed that this system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

  6. Testing fully depleted CCD

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated, with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s; this is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE on the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges; the origin of this pattern may be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs exactly in the same position, so our guess is that the pattern is due to electrical fields. Finally, and in just two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.
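
    The photon transfer curve (PTC) mentioned among the basic tests exploits the fact that, for a shot-noise-limited CCD, the variance of a flat field grows linearly with mean signal, and the slope gives the inverse gain. A synthetic sketch (the gain value and signal levels are invented, not Hamamatsu data):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    gain = 2.0                                   # e-/ADU, the value to recover
    means, variances = [], []
    for electrons in [1000, 5000, 10000, 20000, 40000]:
        # Two flats at the same level; differencing cancels fixed-pattern noise.
        flat1 = rng.poisson(electrons, 100_000) / gain
        flat2 = rng.poisson(electrons, 100_000) / gain
        means.append((flat1.mean() + flat2.mean()) / 2.0)
        # Variance of the difference is twice the shot-noise variance.
        variances.append(np.var(flat1 - flat2) / 2.0)

    slope, _ = np.polyfit(means, variances, 1)   # var [ADU^2] vs mean [ADU]
    gain_estimate = 1.0 / slope                  # e-/ADU
    ```

    In a real PTC the read noise adds a constant offset and the curve rolls over at full well; the linear mid-range is what determines the gain.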

  7. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  8. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  10. Mapping herbage biomass and nitrogen status in an Italian ryegrass (Lolium multiflorum L.) field using a digital video camera with balloon system

    NASA Astrophysics Data System (ADS)

    Kawamura, Kensuke; Sakuno, Yuji; Tanaka, Yoshikazu; Lee, Hyo-Jin; Lim, Jihyun; Kurokawa, Yuzo; Watanabe, Nariyasu

    2011-01-01

    Improving current precision nutrient management requires practical tools to aid the collection of site specific data. Recent technological developments in commercial digital video cameras and the miniaturization of systems on board low-altitude platforms offer cost effective, real time applications for efficient nutrient management. We tested the potential use of commercial digital video camera imagery acquired by a balloon system for mapping herbage biomass (BM), nitrogen (N) concentration, and herbage mass of N (Nmass) in an Italian ryegrass (Lolium multiflorum L.) meadow. The field measurements were made at the Setouchi Field Science Center, Hiroshima University, Japan on June 5 and 6, 2009. The field consists of two 1.0 ha Italian ryegrass meadows, which are located in an east-facing slope area (230 to 240 m above sea level). Plant samples were obtained at 20 sites in the field. A captive balloon was used for obtaining digital video data from a height of approximately 50 m (approximately 15 cm spatial resolution). We tested several statistical methods, including simple and multivariate regressions, using forage parameters (BM, N, and Nmass) and three visible color bands or color indices based on the ratio vegetation index and the normalized difference vegetation index. Of the various investigations, a multiple linear regression (MLR) model showed the best cross-validated coefficients of determination (R²) and minimum root-mean-squared error (RMSECV) values between observed and predicted herbage BM (R² = 0.56, RMSECV = 51.54), Nmass (R² = 0.65, RMSECV = 0.93), and N concentration (R² = 0.33, RMSECV = 0.24). Applying these MLR models to mosaic images, the spatial distributions of the herbage BM and N status within the Italian ryegrass field were successfully displayed at high resolution. Such fine-scale maps showed higher values of BM and N status at the bottom of the slope, with lower values at the top.
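
    The RMSECV criterion used to rank the models is a cross-validated RMSE: refit the regression with each sampling site held out in turn and score the held-out predictions. A sketch on synthetic band data (the 20 sites match the study, but all values and the visible-band index are invented for illustration):

    ```python
    import numpy as np

    def loo_rmsecv(X, y):
        """Leave-one-out cross-validated RMSE for a multiple linear
        regression.  X already includes a column of ones for the intercept."""
        errs = []
        for i in range(len(y)):
            keep = np.arange(len(y)) != i
            coef, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
            errs.append(y[i] - X[i] @ coef)          # held-out residual
        return float(np.sqrt(np.mean(np.square(errs))))

    rng = np.random.default_rng(3)
    n = 20                                     # 20 sampling sites, as in the study
    red, green = rng.uniform(50, 200, (2, n))  # visible-band digital numbers
    index = (green - red) / (green + red)      # NDVI-like visible-band index
    biomass = 300.0 + 400.0 * index + rng.normal(0.0, 10.0, n)
    X = np.column_stack([np.ones(n), red, green])
    rmsecv = loo_rmsecv(X, biomass)
    ```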

  11. CCD Photometry of the Polar BY Cam

    NASA Astrophysics Data System (ADS)

    Jessop, H.; Chin, V.; Spear, G.

    1992-12-01

    BY Cam (=H0538+608), a very erratic member of the AM Herculis-type binaries also known as polars, was observed at the University of Arizona 40-inch telescope, with the Sonoma State University Astrolink CCD camera for six nights during November 1991. Two additional nights of CCD photometry were obtained during September 1992 at the Sonoma State Observatory, with the 25-cm Epoch Automated Telescope and SSU Astrolink CCD camera. These data comprise one of the most extensive sets of photometry acquired for this object. We will present the results of these observations, and discuss their relevance towards the further determination of some of the system's parameters. This work has been supported by a California State University Pre-Doctoral Award and Pre-Doctoral Summer Internship Award, and a Grant-In-Aid from the Sigma Xi Scientific Research Society.

  12. A video precipitation sensor for imaging and velocimetry of hydrometeors

    NASA Astrophysics Data System (ADS)

    Liu, X. C.; Gao, T. C.; Liu, L.

    2014-07-01

    A new method to determine the shape and fall velocity of hydrometeors using a single CCD camera is proposed in this paper, and a prototype video precipitation sensor (VPS) is developed. The instrument consists of an optical unit (collimated light source with a multi-mode fibre cluster), an imaging unit (planar array CCD sensor), an acquisition and control unit, and a data processing unit. The cylindrical space between the optical unit and the imaging unit is the sampling volume (300 mm × 40 mm × 30 mm). As precipitation particles fall through the sampling volume, the CCD camera exposes twice in a single frame, so that a double exposure of the particle images is obtained. Size and shape can be obtained from the particle images; fall velocity can be calculated from the particle displacement in the double-exposure image and the interval time; the drop size distribution and velocity distribution, precipitation intensity, and accumulated precipitation amount can be calculated by time integration. The innovation of the VPS is that the shape, size, and velocity of precipitation particles can be measured with a single planar array CCD sensor, which addresses the disadvantages of linear-scan CCD disdrometers and impact disdrometers. Field measurements of rainfall demonstrate the VPS's capability to measure the micro-physical properties of single particles and the integral parameters of precipitation.
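
    The velocity computation is simple: the particle appears twice in one frame, so its displacement in pixels, the pixel scale, and the known exposure interval give the fall speed. A sketch (the pixel scale and interval values are invented, not the VPS specifications):

    ```python
    def fall_velocity(y1_px, y2_px, pixel_mm, interval_s):
        """Fall speed (m/s) from a double-exposure frame: the particle's two
        images are displaced by (y2 - y1) pixels; with the pixel scale (mm per
        pixel) and the exposure interval (s) this gives velocity directly."""
        displacement_mm = (y2_px - y1_px) * pixel_mm
        return displacement_mm / interval_s / 1000.0   # mm/s -> m/s

    # 400-pixel drop between exposures, 0.1 mm/pixel, 10 ms interval (assumed).
    v = fall_velocity(y1_px=120, y2_px=520, pixel_mm=0.1, interval_s=0.01)
    ```

    Accumulating many such single-particle measurements over time yields the drop size and velocity distributions the abstract mentions.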

  13. CCD Double Star Measures: Jack Jones Observatory Report #2

    NASA Astrophysics Data System (ADS)

    Jones, James L.

    2009-10-01

    This paper submits 44 CCD measurements of 41 multiple star systems for inclusion in the WDS. Observations were made during the calendar year 2008. Measurements were made using a CCD camera and an 11" Schmidt-Cassegrain telescope. Brief discussions of pertinent observations are included.

  14. Identification of Prey Captures in Australian Fur Seals (Arctocephalus pusillus doriferus) Using Head-Mounted Accelerometers: Field Validation with Animal-Borne Video Cameras

    PubMed Central

    Volpov, Beth L.; Hoskins, Andrew J.; Battaile, Brian C.; Viviant, Morgane; Wheatley, Kathryn E.; Marshall, Greg; Abernathy, Kyler; Arnould, John P. Y.

    2015-01-01

    This study investigated prey captures in free-ranging adult female Australian fur seals (Arctocephalus pusillus doriferus) using head-mounted 3-axis accelerometers and animal-borne video cameras. Acceleration data were used to identify individual attempted prey captures (APC), and video data were used to independently verify APC and prey types. Results demonstrated that head-mounted accelerometers could detect individual APC but were unable to distinguish among prey types (fish, cephalopod, stingray) or between successful captures and unsuccessful capture attempts. The mean detection rate (true positive rate) on individual animals in the testing subset ranged from 67-100%, and the mean detection rate on the testing subset averaged across 4 animals ranged from 82-97%. The mean false positive (FP) rate ranged from 15-67% individually in the testing subset, and 26-59% averaged across 4 animals. Surge and sway had significantly greater detection rates, but conversely also greater FP rates, than heave. Video data also indicated that some head movements recorded by the accelerometers were unrelated to APC and that a peak in acceleration variance did not always equate to an individual prey item. The results of the present study indicate that head-mounted accelerometers provide a complementary tool for investigating foraging behaviour in pinnipeds, but that detection and FP correction factors need to be applied for reliable field application. PMID:26107647
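The detection principle, peaks in acceleration variance flagged as attempted captures, can be sketched generically (window length, threshold, and data here are illustrative, not the values used in the study):

```python
import numpy as np

def detect_apc(acc, window=10, threshold=1.5):
    """Flag samples whose moving variance exceeds a threshold -- a generic
    variance-peak detector for attempted-prey-capture (APC) events."""
    var = np.array([acc[max(0, i - window):i + 1].var() for i in range(len(acc))])
    return var > threshold

acc = np.zeros(100)                     # quiet swimming
acc[40:50] = np.array([3.0, -3.0] * 5)  # burst of head motion
events = detect_apc(acc)
print(bool(events[45]), bool(events[5]))  # -> True False
```

As the paper notes, such a detector needs a false-positive correction, since head movements unrelated to prey capture also produce variance peaks.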

  16. Fast measurement of temporal noise of digital camera's photosensors

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.

    2015-10-01

    Photo- and videocameras are currently widespread parts of both scientific experimental setups and consumer applications. They are used in optics, radiophysics, astrophotography, chemistry, and various other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and videocameras is the noise of the photosensor pixels. A camera photosensor's noise can be divided into random and pattern components: temporal noise comprises the random component, while spatial noise comprises the pattern component. The spatial component is usually several times lower in magnitude than the temporal one, so to a first approximation spatial noise can be neglected. Earlier we proposed a modification of the automatic segmentation of non-uniform targets (ASNT) method for measuring the temporal noise of photo- and videocameras; only two frames are sufficient for noise measurement with the modified method, which should therefore allow fast and accurate measurement of temporal noise. In this paper, we estimated the light and dark temporal noise of four cameras of different types using the modified ASNT method with only several frames. These cameras are: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PLB781F (CMOS, 6.6 MP, 10-bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). The experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. We also measured the time elapsed in processing the shots used for temporal noise estimation. The results demonstrate that the dependence of a camera's full temporal noise on signal value can be obtained quickly with the proposed ASNT modification.
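The underlying two-frame idea can be illustrated with the generic photon-transfer approach (a sketch of the principle, not the authors' exact ASNT pipeline): subtracting two frames of the same scene cancels the fixed spatial pattern, and the standard deviation of the difference is sqrt(2) times the per-frame temporal noise.

```python
import numpy as np

def temporal_noise_two_frames(f1, f2):
    """Estimate per-frame temporal noise from two frames of the same scene:
    the fixed (spatial) pattern cancels in the difference, whose std is
    sqrt(2) times the temporal noise of one frame."""
    d = f1.astype(float) - f2.astype(float)
    return d.std() / np.sqrt(2)

rng = np.random.default_rng(0)
pattern = rng.normal(100, 5, (64, 64))     # fixed spatial pattern (sigma = 5)
f1 = pattern + rng.normal(0, 2, (64, 64))  # temporal noise, sigma = 2
f2 = pattern + rng.normal(0, 2, (64, 64))
print(round(temporal_noise_two_frames(f1, f2), 1))  # close to 2: spatial part cancelled
```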

  17. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish the intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers, using the GAITRite walkway and a single video camera, between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants aged 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose-fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited. PMID:25230893

  18. Digital video.

    PubMed

    Johnson, Don; Johnson, Mike

    2004-04-01

    The process of digitally capturing, editing, and archiving video has become an important aspect of documenting arthroscopic surgery. Recording the arthroscopic findings before and after surgery is an essential part of the patient's medical record. The hardware and software have become more affordable, but the learning curve to master the software is steep. Digital video is captured at the time of arthroscopy to a hard disk, and written to a CD at the end of the operative procedure. The process of obtaining video of open procedures is more complex. Outside video of the procedure is recorded on digital tape with a digital video camera. The camera must be plugged into a computer to capture the video on the hard disk. Adobe Premiere software is used to edit the video and render the finished video to the hard drive. This finished video is burned onto a CD. We outline the choice of computer hardware and software for the manipulation of digital video. The techniques of backing up and archiving the completed projects and files are also outlined. The uses of digital video for education and the formats that can be used in PowerPoint presentations are discussed. PMID:15123920

  19. Multi-station Video Orbits of Minor Meteor Showers

    NASA Astrophysics Data System (ADS)

    Madiedo, José M.; Trigo-Rodríguez, Josep M.

    2008-06-01

    During 2006 the SPanish Meteor Network (SPMN) set up three automated video stations in Andalusia to increase the atmospheric coverage of the already existing low-scan-rate all-sky CCD systems. Although initially conceived as complementary instruments, the sensitive video cameras have been employed to set up an automatic meteor detection system that provides valuable real-time information on unusual meteor activity and remarkable fireball events. In fact, during 2006 the SPMN video stations participated in the detection of two unexpected meteor outbursts: the Orionids and the Comae Berenicids. The three new SPMN stations guarantee almost continuous monitoring of meteor and fireball activity in Andalusia (Spain) and also increase the chance of future meteorite recoveries. A description of the main characteristics of these new observing video stations and some examples of the trajectory, radiant and orbital data obtained so far are presented here.

  20. Lines identification in the emission spectrum and orbital elements of a sporadic video meteor

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Zamorano, J.; Ocaña, F.; Izquierdo, J.; Sánchez de Miguel, A.; Trigo-Rodriguez, J. M.; Toscano, F. M.

    2011-10-01

    Since 2006 the SPanish Meteor Network (SPMN) has employed high-sensitivity CCD video cameras to monitor meteor and fireball activity over the Iberian Peninsula and neighboring areas. These allow us to obtain the trajectory and orbit for multi-station events and, when combined with holographic diffraction gratings, also provide information about the chemical composition of the corresponding meteoroids. In this context, we analyze here the emission spectrum, trajectory and orbital parameters of a sporadic bolide imaged in 2010.

  1. An electronic pan/tilt/zoom camera system

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steve; Martin, H. L.

    1992-01-01

    A small camera system is described for remote viewing applications that employs fisheye optics and electronic processing to provide pan, tilt, zoom, and rotational movements. The fisheye lens is designed to give a complete hemispherical FOV with significant peripheral distortion that is corrected with high-speed electronic circuitry. Flexible control of the viewing requirements is provided by a programmable transformation processor, so that pan/tilt/rotation/zoom functions can be accomplished without mechanical movements. Images are presented that were taken with a prototype system using a CCD camera; 5 frames/sec can be acquired from a 180-deg FOV. The image-transformation device can provide multiple images with different magnifications and pan/tilt/rotation sequences at frame rates compatible with conventional video devices. The system is of interest for object tracking, surveillance, and viewing in constrained environments that would otherwise require the use of several cameras.
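The dewarping at the heart of such a system is a fixed geometric mapping: each pan/tilt viewing direction corresponds to one point in the fisheye image, so a virtual camera view can be resampled without moving parts. A sketch for an ideal equidistant fisheye (an assumed lens model; the actual optics and transformation processor are not described in the abstract):

```python
import math

def fisheye_pixel(pan_deg, tilt_deg, f_px, cx, cy):
    """Pixel where a given pan/tilt viewing direction lands in an
    equidistant-projection fisheye image (r = f * theta). An electronic
    pan/tilt/zoom transform evaluates the inverse of this mapping for
    every output pixel."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    # unit direction vector; camera looks along +z
    x = math.cos(tilt) * math.sin(pan)
    y = math.sin(tilt)
    z = math.cos(tilt) * math.cos(pan)
    theta = math.acos(z)   # angle off the optical axis
    r = f_px * theta       # equidistant projection
    phi = math.atan2(y, x)
    return cx + r * math.cos(phi), cy + r * math.sin(phi)

# the optical axis maps to the image centre
print(fisheye_pixel(0, 0, 300, 640, 480))  # -> (640.0, 480.0)
```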

  2. CCD correlation techniques

    NASA Technical Reports Server (NTRS)

    Hewes, C. R.; Bosshart, P. W.; Eversole, W. L.; Dewit, M.; Buss, D. D.

    1976-01-01

    Two CCD techniques were discussed for performing an N-point sampled data correlation between an input signal and an electronically programmable reference function. The design and experimental performance of an implementation of the direct time correlator utilizing two analog CCDs and MOS multipliers on a single IC were evaluated. The performance of a CCD implementation of the chirp z transform was described, and the design of a new CCD integrated circuit for performing correlation by multiplication in the frequency domain was presented. This chip provides a discrete Fourier transform (DFT) or inverse DFT, multipliers, and complete support circuitry for the CCD CZT. The two correlation techniques are compared.
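The frequency-domain route mentioned last, correlation as multiplication of spectra, is mathematically the following operation (shown here with an ordinary FFT in place of the CCD chirp-z-transform hardware):

```python
import numpy as np

def correlate_freq(signal, reference):
    """Circular cross-correlation via the frequency domain:
    corr = IDFT( DFT(signal) * conj(DFT(reference)) )."""
    S = np.fft.fft(signal)
    R = np.fft.fft(reference)
    return np.fft.ifft(S * np.conj(R)).real

ref = np.array([1.0, 2.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0])
sig = np.roll(ref, 3)  # the reference, delayed by 3 samples
lag = int(np.argmax(correlate_freq(sig, ref)))
print(lag)  # -> 3: the correlation peak recovers the delay
```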

  3. Evaluation of the performance of the 576 × 384 Thomson CCD for astronomical use

    NASA Astrophysics Data System (ADS)

    Mellier, Y.; Cailloux, M.; Dupin, J. P.; Fort, B.; Lours, C.

    1986-03-01

    A slow-scan CCD camera system was built at the Toulouse Observatory and used to evaluate the performance of the 576 × 384 CCD from Thomson-CSF (the new THX 31133) at low temperatures (150 K). The authors have emphasized the optimization of the most important parameters for astronomical applications and have compared the Thomson CCD with other CCDs now currently used in astronomy.

  4. Overview of a hybrid underwater camera system

    NASA Astrophysics Data System (ADS)

    Church, Philip; Hou, Weilin; Fournier, Georges; Dalgleish, Fraser; Butler, Derek; Pari, Sergio; Jamieson, Michael; Pike, David

    2014-05-01

    The paper provides an overview of a Hybrid Underwater Camera (HUC) system combining sonar with a range-gated laser camera system. The sonar is the BlueView P900-45, operating at 900 kHz with a field of view of 45 degrees and a ranging capability of 60 m. The range-gated laser camera system is based on the third-generation LUCIE (Laser Underwater Camera Image Enhancer) sensor originally developed by Defence Research and Development Canada. LUCIE uses an eye-safe laser generating 1 ns pulses at a wavelength of 532 nm and at a rate of 25 kHz. An intensified CCD camera operates with a gating mechanism synchronized with the laser pulse. The gate opens to let the camera capture photons from a given range of interest and can be set from a minimum delay of 5 ns in increments of 200 ps. The output of the sensor is a 30 Hz video signal. Automatic ranging is achieved using a sonar altimeter. The BlueView sonar and LUCIE sensors are integrated with an underwater computer that controls the sensor parameters and displays the real-time data from the sonar and the laser camera. As an initial step toward data integration, graphics overlays representing the laser camera's field of view, along with the gate position and width, are overlaid on the sonar display. The HUC system can be handled manually by a diver and can also be controlled from a surface vessel through an umbilical cord. Recent test data obtained from the HUC system operated in a controlled underwater environment will be presented along with measured performance characteristics.
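The range gating described above amounts to timing the gate to the round-trip travel time of the laser pulse in water. A sketch of that arithmetic (assumed refractive index of 1.33; fixed electronics delays ignored):

```python
def gate_delay_ns(range_m, n_water=1.33):
    """Delay after the laser pulse at which an intensified CCD's gate
    should open to image a slice at `range_m` (round trip, with light
    slowed by the water's refractive index)."""
    c_vacuum = 299_792_458.0  # m/s
    return 2 * range_m * n_water / c_vacuum * 1e9  # nanoseconds

# a target 5 m away calls for a gate delay of roughly 44 ns,
# adjustable in steps like the 200 ps increments quoted above
print(round(gate_delay_ns(5.0), 1))  # -> 44.4
```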

  5. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

    This paper describes image evaluation techniques used to standardize camera system characterizations. The author's group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine whether a particular system meets experimental requirements.

  6. AXAF CCD Imaging Spectrometer (ACIS)

    NASA Astrophysics Data System (ADS)

    Garmire, G. P.

    1997-05-01

    The ACIS is an advanced X-ray camera for AXAF, scheduled to be launched in 1998. The camera is composed of two arrays of CCDs: one optimized for imaging, using four CCDs abutted in a square array, and a linear array of six CCDs optimized for imaging the dispersed spectrum formed by the High and Medium Energy Transmission Grating Spectrometers. The imaging array is tipped with respect to the optical axis to better approximate the curved focal surface formed by the AXAF Wolter Type I optics. The spectroscopic array has a slight tilt to follow the Rowland circle of the grating focus. The CCD camera and electronics were built at the MIT Center for Space Research and Lincoln Laboratory. Much of the thermal and mechanical design, as well as the power system, was carried out at Lockheed-Martin in Denver, Colorado. The CCDs have been calibrated at MIT and at the BESSY synchrotron in Berlin, Germany. The entire flight instrument has been calibrated at the XRCF facility at Marshall Space Flight Center in Huntsville, Alabama. The anticipated instrument performance characteristics based on the calibration results will be presented. A few examples of possible observations will serve to illustrate the great scientific capabilities of AXAF.

  7. Emission spectrum and orbital elements of a sporadic video meteor with a cometary origin

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Ortiz, J. L.; Castro-Tirado, A. J.; Cabrera, J.; Trigo-Rodriguez, J. M.; Toscano, F. M.

    2012-09-01

    The SPanish Meteor Network (SPMN) monitors meteor and fireball activity over the Iberian Peninsula and neighboring areas by using, among other systems, high-sensitivity CCD video cameras. In this way, we can obtain the atmospheric trajectory and orbit in the Solar System for multi-station events. In addition, holographic diffraction gratings attached to some of our cameras provide information about the chemical composition of the corresponding meteoroids. In this context, we analyze here the emission spectrum, trajectory and orbital parameters of a sporadic bolide imaged in 2011.

  8. Concerning the Video Drift Method to Measure Double Stars

    NASA Astrophysics Data System (ADS)

    Nugent, Richard L.; Iverson, Ernest W.

    2015-05-01

    Classical methods to measure the position angles and separations of double stars rely on just a few measurements, either from visual observations or photographic means. Visual and photographic CCD observations are subject to errors from the following sources: misalignments of the eyepiece/camera/Barlow lens/micrometer/focal reducers, systematic errors from uncorrected optical distortions, aberrations of the telescope system, camera tilt, and magnitude and color effects. Conventional video methods rely on calibration doubles and graphically calculating the east-west direction, plus careful choice of select video frames stacked for measurement. Atmospheric motion, on the order of 0.5-1.5 arcseconds, is one of the larger sources of error in any exposure/measurement method. Ideally, if a data set from a short video could be used to derive position angle and separation, with each data set self-calibrating independently of any calibration doubles or star catalogues, this would provide measurements of high systematic accuracy. These aims are achieved by the video drift method first proposed by the authors in 2011. This self-calibrating video method automatically analyzes thousands of measurements from a short video clip.
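The self-calibration works because a star trail recorded with the telescope drive off must point due east-west, and its known drift rate (15.041 arcsec/s of sidereal motion, scaled by cos of the declination) fixes the plate scale. A simplified numerical sketch (an illustration of the idea, not the authors' software):

```python
import math

SIDEREAL_RATE = 15.041  # arcsec of drift per second at the celestial equator

def drift_calibration(track, dec_deg=0.0):
    """Camera orientation and plate scale from a drift trail: `track` is a
    list of (x_px, y_px, t_s) samples of one star with the drive off."""
    (x0, y0, t0), (x1, y1, t1) = track[0], track[-1]
    camera_angle = math.degrees(math.atan2(y1 - y0, x1 - x0))  # E-W axis on chip
    pixels = math.hypot(x1 - x0, y1 - y0)
    scale = SIDEREAL_RATE * math.cos(math.radians(dec_deg)) * (t1 - t0) / pixels
    return camera_angle, scale  # degrees, arcsec per pixel

angle, scale = drift_calibration([(100.0, 200.0, 0.0), (400.0, 200.0, 20.0)])
print(angle, round(scale, 3))  # -> 0.0 1.003 (horizontal trail, ~1 arcsec/px)
```

With orientation and scale known, the position angle and separation of the pair follow from the measured pixel offsets, with no external calibration doubles.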

  9. On the development of new SPMN diurnal video systems for daylight fireball monitoring

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.

    2008-09-01

    High-sensitivity video devices are commonly used for studying the activity of meteor streams during the night, providing useful data for the determination of radiant, orbital and photometric parameters ([1] to [7]). With this aim, during 2006 three automated video stations supported by Universidad de Huelva were set up in Andalusia within the framework of the SPanish Meteor Network (SPMN). These are endowed with 8-9 high-sensitivity wide-field video cameras that achieve a meteor limiting magnitude of about +3. The stations have increased the coverage performed by the low-scan all-sky CCD systems operated by the SPMN and, besides, achieve a time accuracy of about 0.01 s for determining the appearance of meteor and fireball events. Despite these nocturnal monitoring efforts, we realised the need to set up stations for daylight fireball detection, an effort also motivated by the two recent meteorite-dropping events of Villalbeto de la Peña [8,9] and Puerto Lápice [10]. Although the Villalbeto de la Peña event was casually videotaped and photographed, no direct pictures or videos were obtained for the Puerto Lápice event. Consequently, in order to perform a continuous recording of daylight fireball events, we set up new automated systems based on CCD video cameras. However, the development of these video stations raises several issues with respect to nocturnal systems that must be properly solved to achieve optimal operation. The first of these video stations, also supported by the University of Huelva, was set up in Sevilla (Andalusia) during May 2007. Of course, fireball association is unequivocal only when two or more stations record the fireball, so that the geocentric radiant can be accurately determined.
    With this aim, a second diurnal video station is being set up in Andalusia in the facilities of the Centro Internacional de Estudios y Convenciones Ecológicas y Medioambientales (CIECEM, University of Huelva), in the environment of Doñana Natural Park (Huelva province). In this way both stations, which are separated by a distance of 75 km, will work as a double-station system providing trajectory and orbit information for major bolides and thus increasing the chance of meteorite recovery in the Iberian Peninsula. The new diurnal SPMN video stations are endowed with different models of Mintron cameras (Mintron Enterprise Co., Ltd.). These are high-sensitivity devices that employ a colour 1/2" Sony interline-transfer CCD image sensor. Aspherical lenses are attached to the video cameras in order to maximize image quality. The use of fast lenses, however, is not a priority here: while most of our nocturnal cameras use f0.8 or f1.0 lenses in order to detect meteors as faint as magnitude +3, diurnal systems employ in most cases f1.4 to f2.0 lenses. Their focal lengths range from 3.8 to 12 mm to cover different atmospheric volumes. The cameras are arranged in such a way that the whole sky is monitored from every observing station. Figure 1 shows a daylight event recorded from Sevilla on May 26, 2008 at 4h30m05.4s ± 0.1s UT. The way our diurnal video cameras work is similar to the operation of our nocturnal systems [1]: diurnal stations are automatically switched on and off at sunrise and sunset, respectively. The images, taken at 25 fps with a resolution of 720x576 pixels, are continuously sent to PC computers through a video capture device. The computers run software (UFOCapture, by SonotaCo, Japan) that automatically registers meteor trails and stores the corresponding video frames on hard disk.
    Besides, before the signal from the cameras reaches the computers, a video time inserter that employs a GPS device (KIWI-OSD, by PFD Systems) inserts time information on every video frame, which allows us to measure time precisely (to about 0.01 s) along the whole fireball path. One issue with respect to nocturnal observing stations, however, is the high number of false detections caused by several factors: higher activity of birds and insects, reflection of sunlight on planes and helicopters, etc. Some of these false events follow a pattern very similar to fireball trails, which makes the use of a second station absolutely necessary to discriminate between them. Another key issue is the passage of the Sun across the field of view of some of the cameras; special care is necessary here to avoid any damage to the CCD sensor. Besides, depending on atmospheric conditions (dust or moisture, for instance), the Sun may saturate most of the video frame. To solve this, our automated system determines which camera is pointing towards the Sun at a given moment and disconnects it. As the cameras are endowed with auto-iris lenses, disconnection means that the optics is fully closed and the CCD sensor is protected. This, of course, means that the atmospheric volume covered by the corresponding camera is not monitored meanwhile. It must also be taken into account that operation temperatures are generally higher for diurnal cameras. This results in higher thermal noise and thus poses some difficulties for the detection software. To minimize this effect it is necessary to employ CCD video cameras with a proper signal-to-noise ratio; refrigeration of the CCD sensor with, for instance, a Peltier system can also be considered.
    The astrometric reduction procedure is also somewhat different for daytime events: it requires reference objects located within the field of view of every camera in order to calibrate the corresponding images. This is done by allowing every camera to capture distant buildings which, by means of said calibration, allow us to obtain the equatorial coordinates of the fireball along its path by measuring its X and Y positions on every video frame. The calibration can be performed from star positions measured on nocturnal images taken with the same cameras; once made, if the cameras are not moved, it is possible to estimate the equatorial coordinates of any future fireball event. We do not use any software for automatic astrometry of the images: this crucial step is made via direct measurement of pixel positions, as in all our previous work. From these astrometric measurements, our software estimates the atmospheric trajectory and radiant for each fireball ([10] to [13]). During 2007 and 2008 the SPMN has also set up other diurnal stations based on 1/3" progressive-scan CMOS sensors attached to modified wide-field lenses covering a 120x80 degree FOV. They are placed in Andalusia: El Arenosillo (Huelva), La Mayora (Málaga) and Murtas (Granada). They also have night sensitivity thanks to an infrared cut filter (ICR), which enables the cameras to perform well in both high- and low-light conditions in colour, as well as to provide IR-sensitive black/white video at night. In conclusion, first detections of daylight fireballs by CCD video cameras are being achieved in the SPMN framework. The future establishment of additional diurnal SPMN stations will allow an increase in the number of daytime fireballs detected and will also increase our chance of meteorite recovery. (EPSC Abstracts, Vol. 3, EPSC2008-A-00319, European Planetary Science Congress, 2008.)
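The astrometric calibration step described above, measuring reference objects to map pixel (X, Y) positions to sky coordinates, is in essence a linear plate solution. A least-squares sketch with synthetic numbers (illustrative only, not the SPMN reduction code):

```python
import numpy as np

def fit_plate_constants(xy, std_coords):
    """Fit an affine plate solution mapping pixel (x, y) to standard
    coordinates (xi, eta) from reference objects -- stars at night or
    distant buildings by day."""
    x, y = np.asarray(xy, dtype=float).T
    A = np.column_stack([x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(std_coords, dtype=float), rcond=None)
    return coef  # 3x2 matrix: (x, y, 1) @ coef -> (xi, eta)

# synthetic calibration field: xi = 0.01*x + 0.1, eta = -0.01*y + 0.2
xy = [(0, 0), (100, 0), (0, 100), (100, 100)]
std_coords = [(0.1, 0.2), (1.1, 0.2), (0.1, -0.8), (1.1, -0.8)]
coef = fit_plate_constants(xy, std_coords)
xi, eta = np.array([50.0, 50.0, 1.0]) @ coef
print(round(float(xi), 3), round(float(eta), 3))  # -> 0.6 -0.3
```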

  10. The future scientific CCD

    NASA Technical Reports Server (NTRS)

    Janesick, J. R.; Elliott, T.; Collins, S.; Marsh, H.; Blouke, M. M.

    1984-01-01

    Since the first introduction of charge-coupled devices (CCDs) in 1970, CCDs have been considered for applications related to memories, logic circuits, and the detection of visible radiation. It is pointed out, however, that the mass market orientation of CCD development has left largely untapped the enormous potential of these devices for advanced scientific instrumentation. The present paper has, therefore, the objective to introduce the CCD characteristics to the scientific community, taking into account prospects for further improvement. Attention is given to evaluation criteria, a summary of current CCDs, CCD performance characteristics, absolute calibration tools, quantum efficiency, aspects of charge collection, charge transfer efficiency, read noise, and predictions regarding the characteristics of the next generation of silicon scientific CCD imagers.

  11. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in "High-Frame-Rate CCD Camera Having Subwindow Capability" (NPO-30564), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between the camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field-programmable gate array (FPGA) controller card.
These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable assembly. The FPGA controller card is connected to the host computer via a standard peripheral component interface (PCI).

  12. Application of a Two Camera Video Imaging System to Three-Dimensional Vortex Tracking in the 80- by 120-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.; Bennett, Mark S.

    1993-01-01

    A description is presented of two enhancements for a two-camera video imaging system that increase the accuracy and efficiency of the system when applied to the determination of three-dimensional locations of points along a continuous line. These enhancements increase the utility of the system when extracting quantitative data from surface and off-body flow visualizations. The first enhancement utilizes epipolar geometry to resolve the stereo "correspondence" problem: the problem of determining, unambiguously, corresponding points in the stereo images of objects that do not have visible reference points. The second enhancement is a method to automatically identify and trace the core of a vortex in a digital image, accomplished by means of an adaptive template matching algorithm. The system was used to determine the trajectory of a vortex generated by the Leading-Edge eXtension (LEX) of a full-scale F/A-18 aircraft tested in the NASA Ames 80- by 120-Foot Wind Tunnel. The system accuracy for resolving the vortex trajectories is estimated to be ±2 inches over a distance of 60 feet. Stereo images of some of the vortex trajectories are presented. The system was also used to determine the point where the LEX vortex "bursts". The vortex burst point locations are compared with those measured in small-scale tests and in flight and are found to be in good agreement.
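Once the correspondence problem is resolved along the epipolar lines, each matched pixel pair yields a 3-D point by triangulation. A minimal linear (DLT) triangulation sketch with synthetic cameras (an illustration of the underlying geometry, not the NASA system's code):

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover the 3-D point whose projections
    through camera matrices P1, P2 are the pixel positions uv1, uv2."""
    (u1, v1), (u2, v2) = uv1, uv2
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)   # null vector of A, up to scale
    X = Vt[-1]
    return X[:3] / X[3]

# two synthetic pinhole cameras one unit apart along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, uv1, uv2))  # recovers [0.5, 0.2, 4.0]
```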

  13. BVR photometry and CCD spectroscopy of nova Del 2013

    NASA Astrophysics Data System (ADS)

    Santangelo, M. M. M.; Pasquini, M.

    2013-08-01

    In the course of the CATS (Capannori Astronomical Transient Survey) project, M.M.M. Santangelo and M. Pasquini performed BVR photoelectric photometry and low-resolution CCD long-slit spectrometry of nova Delphini 2013. The measurements were made with an Optec SSP-5A single-channel photoelectric photometer (with a Hamamatsu R6358 photomultiplier tube) and with an SBIG SGS spectrometer + ST-7XME CCD camera attached to OAC's 0.30-m f/10 Schmidt-Cassegrain telescope.

  14. Television applications of interline-transfer CCD arrays

    NASA Technical Reports Server (NTRS)

    Hoagland, K. A.

    1976-01-01

    The design features and characteristics of interline-transfer (ILT) CCD arrays with 190 x 244 and 380 x 488 image elements are reviewed, with emphasis on optional operating modes and system application considerations. It is shown that the observed horizontal resolution for a TV system using an ILT image sensor can approach the aperture response limit determined by the photosensor site width, resulting in enhanced resolution for moving images. Preferred camera configurations and readout clocking modes for maximum resolution and low-light sensitivity are discussed, including a very-low-light-level intensifier CCD concept. Several camera designs utilizing ILT-CCD arrays are described. These cameras demonstrate feasibility in applications where small size, low-power/low-voltage operation, high sensitivity, and extreme ruggedness are either desired or mandatory system requirements.

  15. Video-Level Monitor

    NASA Technical Reports Server (NTRS)

    Gregory, Ray W.

    1993-01-01

    A video-level monitor was developed to provide full-scene monitoring of video and indicate the level of the brightest portion. The circuit design is nonspecific and can be inserted in any closed-circuit camera system utilizing RS-170 or RS-330 synchronization and standard CCTV video levels. The system is made of readily available, off-the-shelf components. Several units are in service.

  16. Design of a CCD controller optimized for mosaics

    NASA Astrophysics Data System (ADS)

    Leach, Robert W.

    1988-10-01

    A controller for operating Thomson-CSF CCDs in a 2 x N mosaic is described. It is designed around a monolithic Digital Signal Processor, a bank of digital-to-analog converters for clock generation, a simple video processor, and a fiber-optic serial data link communicating with an instrument control computer. The controller is compact, low power, low cost, fast, and easily programmable to generate waveforms of arbitrary timing whose voltages are also software controlled. Up to 16 CCDs can be efficiently controlled, and each CCD has its own set of clock drivers and a video processor, allowing the readout of each device to be customized.
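
    The idea of software-defined clock waveforms generated through DACs can be sketched as a table of codes streamed out at fixed dwell times. The DAC width, reference voltage, and function names below are assumptions for illustration, not details of this controller:

    ```python
    DAC_BITS = 12
    VREF = 10.0  # assumed full-scale DAC output in volts

    def volts_to_code(v):
        """Convert a desired clock voltage level to a DAC code."""
        code = round(v / VREF * (2 ** DAC_BITS - 1))
        return max(0, min(2 ** DAC_BITS - 1, code))

    def build_waveform(levels, dwell_ticks):
        """Expand (voltage, duration) pairs into a flat list of DAC codes,
        one per timing tick -- the kind of table a DSP would stream to the
        clock-generation DACs, so both timing and voltages stay in software."""
        table = []
        for v, ticks in zip(levels, dwell_ticks):
            table.extend([volts_to_code(v)] * ticks)
        return table

    # Example: a parallel clock phase that sits low, pulses high, returns low.
    parallel_phase = build_waveform([0.0, 8.0, 0.0], [4, 10, 4])
    ```
    
    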

  17. The use of video for air pollution source monitoring

    SciTech Connect

    Ferreira, F.; Camara, A.

    1999-07-01

    The evaluation of air pollution impacts from single industrial emission sources is a complex environmental engineering problem. Recent developments in the multimedia technologies used by personal computers have improved the digitizing and processing of digital video sequences. This paper proposes a methodology where statistical analysis of both meteorological and air quality data, combined with digital video images, is used for monitoring air pollution sources. One of the objectives of this paper is to present the use of image processing algorithms in air pollution source monitoring. CCD amateur video cameras capture images that are further processed by computer. The use of video as a remote sensing system was implemented with the goal of determining particular parameters, either meteorological or related to air quality monitoring and modeling of point sources. These parameters include the remote calculation of wind direction, wind speed, stack gas outlet velocity, and effective stack emission height. The characteristics and behavior of a visible pollutant plume are also studied. Different sequences of relatively simple image processing operations are applied to the images gathered by the different cameras to segment the plume. The algorithms are selected depending on the atmospheric and lighting conditions. The developed system was applied to a 1,000-MW fuel power plant located at Setúbal, Portugal. The methodology presented shows that digital video can be an inexpensive way to obtain useful air pollution related data for monitoring and modeling purposes.
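
    One of the "relatively simple image processing operations" for plume segmentation can be sketched as background differencing with a lighting-dependent threshold. This is an assumed minimal pipeline, not the paper's exact sequence of operations:

    ```python
    def segment_plume(image, background, threshold):
        """Subtract a clear-sky background frame and keep pixels whose
        absolute difference exceeds a threshold chosen for the current
        lighting conditions. Images are lists of rows of grey levels;
        returns a binary mask of the same shape."""
        mask = []
        for img_row, bg_row in zip(image, background):
            mask.append([1 if abs(p - b) > threshold else 0
                         for p, b in zip(img_row, bg_row)])
        return mask

    def plume_pixel_count(mask):
        """Number of plume pixels; with a pixel-to-metre calibration this
        scales to an apparent plume cross-section."""
        return sum(sum(row) for row in mask)
    ```
    
    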

  18. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras, and other types of video analytics, benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method based entirely on pedestrian detections in surveillance video from multiple non-overlapping cameras. In this paper, we present the two main components of automatic calibration. The first is intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length, and camera height; this is important for the conversion from pixels to meters and vice versa. The second is inter-camera topology inference, which leads to an estimate of the distance between cameras; this is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
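
    The pixels-to-meters step that tilt, focal length, and camera height enable can be sketched with a flat-ground pinhole model. This is a generic geometric sketch under those assumptions, not the paper's estimator:

    ```python
    import math

    def ground_distance(v_pix, f_pix, tilt_rad, cam_height_m):
        """Horizontal distance from the camera to a ground point imaged
        v_pix pixels below the principal point, for a camera mounted at
        cam_height_m and tilted down by tilt_rad from the horizontal.
        Assumes flat ground and a pinhole camera with focal length f_pix
        expressed in pixels."""
        angle_below_horizon = tilt_rad + math.atan2(v_pix, f_pix)
        return cam_height_m / math.tan(angle_below_horizon)
    ```

    With the tilt, focal length, and height estimated from pedestrian detections, this mapping converts a detected foot position in pixels into a ground-plane distance in meters.
    
    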

  19. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 µm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and µm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  20. Automatic processing method for astronomical CCD images

    NASA Astrophysics Data System (ADS)

    Li, Binhua; Yang, Lei; Mao, Wei

    2002-12-01

    Since several hundred CCD images are obtained with the CCD camera of the Lower Latitude Meridian Circle (LLMC) every observational night, it is essential to adopt an automatic processing method to find the initial position of each object in these images, to center the detected objects, and to calculate their magnitudes. In this paper, several existing algorithms for automatically searching for objects in astronomical CCD images are reviewed. Our automatic search algorithm is described, which includes five steps: background calculation, filtering, object detection and identification, and defect elimination. Several existing two-dimensional centering algorithms are also reviewed, and our modified two-dimensional moment algorithm and an empirical formula for the centering threshold are presented. An algorithm for determining the magnitudes of objects is also presented. All these algorithms are programmed in the VC++ programming language. Finally, our method is tested with CCD images from the 1-m RCC telescope at Yunnan Observatory, and some preliminary results are given.
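
    The centering step can be sketched as an intensity-weighted moment centroid. The simple fixed threshold below is an assumption for illustration; the paper's modified algorithm uses an empirical formula for the threshold that is not reproduced here:

    ```python
    def moment_centroid(image, background, threshold):
        """Two-dimensional intensity-weighted moment centroid of a star
        image. Pixels are background-subtracted and only values above
        `threshold` contribute; returns (x, y) in pixel coordinates, or
        None if no pixel passes the threshold."""
        sx = sy = s = 0.0
        for y, row in enumerate(image):
            for x, p in enumerate(row):
                w = p - background
                if w > threshold:
                    sx += w * x
                    sy += w * y
                    s += w
        if s == 0:
            return None
        return sx / s, sy / s
    ```
    
    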

  1. Upgrades to NDSF Vehicle Camera Systems and Development of a Prototype System for Migrating and Archiving Video Data in the National Deep Submergence Facility Archives at WHOI

    NASA Astrophysics Data System (ADS)

    Fornari, D.; Howland, J.; Lerner, S.; Gegg, S.; Walden, B.; Bowen, A.; Lamont, M.; Kelley, D.

    2003-12-01

    In recent years, considerable effort has been made to improve the visual recording capabilities of Alvin and ROV Jason. This has culminated in the routine use of digital cameras, both internal and external, on these vehicles, which has greatly expanded the scientific recording capabilities of the NDSF. The UNOLS National Deep Submergence Facility (NDSF) archives maintained at Woods Hole Oceanographic Institution (WHOI) are the repository for the diverse suite of photographic still images (both 35mm and recently digital), video imagery, vehicle data and navigation, and near-bottom side-looking sonar data obtained by the facility vehicles. These data comprise a unique set of information from a wide range of seafloor environments over the more than 25 years of NDSF operations in support of science. Included in the holdings are Alvin data plus data from the tethered vehicles: ROV Jason, Argo II, and the DSL-120 side scan sonar. This information conservatively represents an outlay in facilities and science costs well in excess of $100 million. Several archive-related improvement issues have become evident over the past few years. The most critical are: 1. migration and better access to the 35mm Alvin and Jason still images through digitization and proper cataloging with relevant meta-data, 2. assessing Alvin data logger data, migrating data on older media no longer in common use, and properly labeling and evaluating vehicle attitude and navigation data, 3. migrating older Alvin and Jason video data, especially data recorded on Hi-8 tape that is very susceptible to degradation on each replay, to newer digital format media such as DVD, 4. 
improving the capabilities of the NDSF archives to better serve the increasingly complex needs of the oceanographic community, including researchers involved in focused programs like Ridge2000 and MARGINS, where viable distributed databases in various disciplinary topics will form an important component of the data management structure. We report on an archiving effort to transfer video footage currently on Hi-8 and VHS tape to digital media (DVD). At the same time as this is being done, frame grab imagery at reasonable resolution (640x480) at 30 sec. intervals will be compiled and the images will be integrated, as much as possible with vehicle attitude/navigation data and provided to the user community in a web-browser format, such as has already been done for the recent Jason and Alvin frame grabbed imagery. The frame-grabbed images will be tagged with time, thereby permitting integration of vehicle attitude and navigation data once that is available. In order to prototype this system, we plan to utilize data from the East Pacific Rise and Juan de Fuca Ridge which are field areas selected by the community as Ridge2000 Integrated Study Sites. There are over 500 Alvin dives in both these areas and having frame-grabbed, synoptic views of the terrains covered during those dives will be invaluable for scientific and outreach use as part of Ridge2000. We plan to coordinate this activity with the Ridge2000 Data Management Office at LDEO.

  2. Cone penetrometer deployed in situ video microscope for characterizing sub-surface soil properties

    SciTech Connect

    Lieberman, S.H.; Knowles, D.S.; Kertesz, J.

    1997-12-31

    In this paper we report on the development and field testing of an in situ video microscope that has been integrated with a cone penetrometer probe in order to provide a real-time method for characterizing subsurface soil properties. The video microscope system consists of a miniature CCD color camera coupled with appropriate magnification and focusing optics to provide a field of view with a coverage of approximately 20 mm. The camera/optics system is mounted in a cone penetrometer probe so that the camera views the soil that is in contact with a sapphire window mounted on the side of the probe. The soil outside the window is illuminated by diffuse light provided through the window by an optical fiber illumination system connected to a white light source at the surface. The video signal from the camera is returned to the surface, where it can be displayed in real time on a video monitor, recorded on a video cassette recorder (VCR), and/or captured digitally with a frame grabber installed in a microcomputer system. In its highest resolution configuration, the in situ camera system has demonstrated a capability to resolve particle sizes as small as 10 µm. By using other lens systems to increase the magnification factor, smaller particles could be resolved; however, the field of view would be reduced. Initial field tests have demonstrated the ability of the camera system to provide real-time qualitative characterization of soil particle sizes. In situ video images also reveal information on the porosity of the soil matrix and the presence of water in the saturated zone. Current efforts are focused on the development of automated image processing techniques as a means of extracting quantitative information on soil particle size distributions. Data will be presented comparing results derived from digital images with conventional sieve/hydrometer analyses.

  3. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1983-08-02

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid. 4 figs.

  4. Video flowmeter

    DOEpatents

    Lord, David E.; Carter, Gary W.; Petrini, Richard R.

    1983-01-01

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid (10) containing entrained particles (12) is formed and positioned by a rod optic lens assembly (31) on the raster area of a low-light level television camera (20). The particles (12) are illuminated by light transmitted through a bundle of glass fibers (32) surrounding the rod optic lens assembly (31). Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen (40). The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid (10).

  5. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1981-06-10

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid.
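
    The three patent records above state that the flowmeter is calibrated so that the locus of positions of origin of the video picture determines the volumetric flow rate. A minimal sketch of such a calibration, assuming a hypothetical linearised relation Q = a·x + b between a measured locus position x and the flow rate (the patents do not specify this functional form):

    ```python
    def fit_calibration(positions, known_flows):
        """Least-squares fit of Q = a * x + b relating a measured locus
        position x (e.g. mean raster position of picture origins) to known
        volumetric flow rates from calibration runs."""
        n = len(positions)
        mx = sum(positions) / n
        mq = sum(known_flows) / n
        a = (sum((x - mx) * (q - mq) for x, q in zip(positions, known_flows))
             / sum((x - mx) ** 2 for x in positions))
        b = mq - a * mx
        return a, b

    def flow_rate(x, calib):
        """Volumetric flow rate predicted from a measured locus position."""
        a, b = calib
        return a * x + b
    ```
    
    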

  6. Megapixel imaging camera for expanded H⁻ beam measurements

    SciTech Connect

    Simmons, J.E.; Lillberg, J.W.; McKee, R.J.; Slice, R.W.; Torrez, J.H.; McCurnin, T.W.; Sanchez, P.G.

    1994-02-01

    A charge coupled device (CCD) imaging camera system has been developed as part of the Ground Test Accelerator project at the Los Alamos National Laboratory to measure the properties of a large diameter, neutral particle beam. The camera is designed to operate in the accelerator vacuum system for extended periods of time. It would normally be cooled to reduce dark current. The CCD contains 1024 × 1024 pixels with pixel size of 19 × 19 µm² and with four-phase parallel clocking and two-phase serial clocking. The serial clock rate is 2.5 × 10⁵ pixels per second. Clock sequence and timing are controlled by an external logic-word generator. The DC bias voltages are likewise located externally. The camera contains circuitry to generate the analog clocks for the CCD and also contains the output video signal amplifier. Reset switching noise is removed by an external signal processor that employs delay elements to provide noise suppression by the method of double-correlated sampling. The video signal is digitized to 12 bits in an analog to digital converter (ADC) module controlled by a central processor module. Both modules are located in a VME-type computer crate that communicates via ethernet with a separate workstation where overall control is exercised and image processing occurs. Under cooled conditions the camera shows good linearity with dynamic range of 2000 and with dark noise fluctuations of about ±1/2 ADC count. Full well capacity is about 5 × 10⁵ electron charges.
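
    The double-correlated sampling mentioned above removes reset (kTC) noise because the same random reset offset appears in both the reset sample and the signal sample, so their difference cancels it. A numerical sketch of that principle (arbitrary units, amplifier gains omitted):

    ```python
    import random

    def correlated_double_sample(reset_level, signal_electrons, ktc_sigma, rng):
        """Simulate one pixel readout with double-correlated sampling.
        The random kTC offset is common to both samples, so subtracting
        the reset sample from the signal sample cancels it exactly."""
        ktc_noise = rng.gauss(0.0, ktc_sigma)
        sample_reset = reset_level + ktc_noise
        sample_signal = reset_level + ktc_noise + signal_electrons
        return sample_signal - sample_reset  # reset noise removed

    rng = random.Random(42)
    # The recovered value equals the signal regardless of the reset noise draw.
    recovered = correlated_double_sample(1000.0, 250.0, 50.0, rng)
    ```
    
    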

  7. Development of a driving method suitable for ultrahigh-speed shooting in a 2M-fps 300k-pixel single-chip color camera

    NASA Astrophysics Data System (ADS)

    Yonai, J.; Arai, T.; Hayashida, T.; Ohtake, H.; Namiki, J.; Yoshida, T.; Etoh, T. Goji

    2012-03-01

    We have developed an ultrahigh-speed CCD camera that can capture instantaneous phenomena not visible to the human eye and impossible to capture with a regular video camera. The ultrahigh-speed CCD was specially constructed so that the CCD memory between the photodiode and the vertical transfer path of each pixel can store 144 frames each. For every one-frame shot, the electric charges generated from the photodiodes are transferred in one step to the memory of all the parallel pixels, making ultrahigh-speed shooting possible. Earlier, we experimentally manufactured a 1M-fps ultrahigh-speed camera and tested it for broadcasting applications. Through those tests, we learned that there are cases that require shooting speeds (frame rate) of more than 1M fps; hence we aimed to develop a new ultrahigh-speed camera that will enable much faster shooting speeds than what is currently possible. Since shooting at speeds of more than 200,000 fps results in decreased image quality and abrupt heating of the image sensor and drive circuit board, faster speeds cannot be achieved merely by increasing the drive frequency. We therefore had to improve the image sensor wiring layout and the driving method to develop a new 2M-fps, 300k-pixel ultrahigh-speed single-chip color camera for broadcasting purposes.
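
    The burst-memory architecture described above, where each pixel stores its last 144 frames in on-chip CCD memory, behaves like a per-pixel ring buffer that is filled during the burst and read out slowly afterwards. An illustrative software analogue only (the class and names are not part of the camera design):

    ```python
    from collections import deque

    class BurstPixel:
        """Per-pixel ring buffer of the most recent `depth` samples;
        144 is the in-pixel memory depth quoted for the ultrahigh-speed CCD."""
        def __init__(self, depth=144):
            self.mem = deque(maxlen=depth)

        def capture(self, charge):
            # One-step transfer from photodiode to in-pixel memory,
            # performed for all pixels in parallel each frame.
            self.mem.append(charge)

        def read_out(self):
            # Slow readout after the burst ends: oldest stored frame first.
            return list(self.mem)

    px = BurstPixel(depth=144)
    for frame in range(1000):   # a burst longer than the memory depth
        px.capture(frame)
    history = px.read_out()     # only the last 144 frames survive
    ```
    
    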

  8. Application of PLZT electro-optical shutter to diaphragm of visible and mid-infrared cameras

    NASA Astrophysics Data System (ADS)

    Fukuyama, Yoshiyuki; Nishioka, Shunji; Chonan, Takao; Sugii, Masakatsu; Shirahata, Hiromichi

    1997-04-01

    Pb0.91La0.09(Zr0.65Ti0.35)0.9775O3 (PLZT 9/65/35), commonly used as an electro-optical shutter, exhibits large phase retardation with low applied voltage. This shutter's features are as follows: (1) high shutter speed, (2) wide optical transmittance, and (3) high optical density in the 'OFF' state. If the shutter is applied to the diaphragm of a video camera, it could protect the camera's sensor from intense light. We have tested the basic characteristics of the PLZT electro-optical shutter and its imaging resolving power. The ratio of optical transmittance between the 'ON' and 'OFF' states was 1.1 × 10³. The response time of the PLZT shutter from the 'ON' state to the 'OFF' state was 10 µs. The MTF reduction when putting the PLZT shutter in front of the visible video-camera lens was only 12 percent at a spatial frequency of 38 cycles/mm, which is the sensor resolution of the video camera. Moreover, we took visible images with the Si-CCD video camera. A He-Ne laser ghost image was observed in the 'ON' state; on the contrary, the ghost image was totally shut out in the 'OFF' state. From these tests, it has been found that the PLZT shutter is useful for the diaphragm of a visible video camera. The measured optical transmittance of a PLZT wafer with no antireflection coating was 78 percent over the range from 2 to 6 microns.

  9. High temporal resolution video imaging of intracellular calcium.

    PubMed

    Takamatsu, T; Wier, W G

    1990-01-01

    We have developed a system for imaging intracellular free calcium ion concentration ([Ca2+]i) at the highest rate possible with conventional video equipment. The system is intended to facilitate quantitative study of rapid changes in [Ca2+]i in cells that move. It utilizes intensified video cameras with nearly ideal properties and digital image processing to produce two images that can be ratioed without artifacts. Two dichroic mirrors direct images of cellular Indo-1 fluorescence at two different wavelengths to two synchronized video cameras, each consisting of a fast micro-channel plate image intensifier optically coupled by a tapered fiber optic bundle to a CCD image sensor. The critical technical issues in this dual-image system are: (1) minimization and correction of the small geometric and other types of differences in the images provided by the two cameras; and (2) the signal-to-noise ratio that can be achieved in single frames. We have used this system to obtain images of [Ca2+]i at 16.7 ms intervals in voltage-clamped single cardiac cells perfused internally with Indo-1 (pentapotassium salt). The images indicate that, except for the nuclear regions, [Ca2+]i is uniform during normal excitation-contraction coupling. In contrast, changes in [Ca2+]i propagate in rapid 'waves' during the spontaneous release of Ca2+ that accompanies certain 'Ca2+-overload' conditions. PMID:2354495
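
    The ratioing step above is the standard ratiometric approach for dyes like Indo-1: the pixel-by-pixel ratio of the two wavelength images is converted to [Ca2+] with the Grynkiewicz equation. The calibration constants in the sketch are placeholders, not values from the paper:

    ```python
    def ca_from_ratio(R, Rmin, Rmax, Kd, beta):
        """Standard ratiometric (Grynkiewicz) conversion of a fluorescence
        ratio R to calcium concentration:
        [Ca2+] = Kd * beta * (R - Rmin) / (Rmax - R),
        where Rmin/Rmax are the ratios at zero and saturating calcium and
        beta is the free/bound intensity ratio at the denominator wavelength."""
        return Kd * beta * (R - Rmin) / (Rmax - R)

    def ratio_image(img_a, img_b, bg_a=0.0, bg_b=0.0):
        """Pixel-by-pixel ratio of the two camera images after background
        subtraction -- the artifact-free ratioing step the system performs.
        Images are lists of rows of intensities."""
        return [[(a - bg_a) / (b - bg_b) for a, b in zip(ra, rb)]
                for ra, rb in zip(img_a, img_b)]
    ```
    
    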

  10. Extreme ultraviolet response of a Tektronix 1024 x 1024 CCD

    NASA Astrophysics Data System (ADS)

    Moses, Daniel J.; Hochedez, Jean-Francois E.; Howard, Russell A.; Au, Benjamin D.; Wang, Dennis; Blouke, Morley

    1992-08-01

    The goal of the detector development program for the Solar and Heliospheric Observatory (SOHO) EUV Imaging Telescope (EIT) is an extreme-ultraviolet (EUV) CCD (charge-coupled device) camera. The Naval Research Laboratory (NRL) SOHO CCD group has developed a design for the EIT camera and is screening CCDs for flight application. Tektronix Inc. has fabricated 1024 x 1024 CCDs for the EIT program. As part of the CCD screening effort, the quantum efficiency (QE) of a prototype CCD has been measured in the NRL EUV laboratory over the wavelength range of 256 to 735 Angstroms. A simplified model has been applied to these QE measurements to illustrate the relevant physical processes that determine the performance of the detector.
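
    A QE measurement of this kind compares the electrons the CCD collects with the number of incident photons, the latter inferred from a calibrated photodiode. A generic sketch of that arithmetic, assuming equal illuminated areas and a diode calibrated in A/W (argument names are illustrative, not from the paper):

    ```python
    def quantum_efficiency(signal_electrons, diode_current_a,
                           responsivity_a_per_w, wavelength_m, exposure_s):
        """Estimate CCD quantum efficiency: infer the beam power from the
        photodiode current and its responsivity, convert power to a photon
        rate at the given wavelength, and divide the CCD's collected
        electrons by the photons incident during the exposure."""
        h = 6.62607015e-34   # Planck constant, J*s
        c = 2.99792458e8     # speed of light, m/s
        power_w = diode_current_a / responsivity_a_per_w
        photon_rate = power_w * wavelength_m / (h * c)  # photons per second
        return signal_electrons / (photon_rate * exposure_s)
    ```
    
    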

  11. Major Features of the CCD Photometry Software Data Acquisition and Reduction Package Distributed by Optec, Inc.

    NASA Astrophysics Data System (ADS)

    Dickinson, Jon

    The CCD Photometry Software Data Acquisition and Reduction Package, by Optec Inc., is a three-program software package designed for collecting and analyzing images made by most commercially available CCD cameras. The software runs on any IBM AT-class computer with EGA or better graphics and a hard drive. Some of the important features of the programs are discussed.

  12. A possible theoretical explanation of CCD interference patterns.

    NASA Astrophysics Data System (ADS)

    Jankov, S.; Vince, I.; Kubičela, A.; Jevremović, D.; Popović, L. Č.

    1996-07-01

    A rather strong interference disturbance was noticed in the authors' SBIG ST-6 CCD camera when it was applied to solar spectrophotometry. This behaviour is explained quantitatively by the effect of spatial incoherence produced by an extended light source such as the Sun.

  13. Competitive and Mature CCD Imaging Systems for Planetary Raman Spectrometers

    NASA Astrophysics Data System (ADS)

    Ingley, R.; Hutchinson, I.; Harris, L. V.; McHugh, M.; Edwards, H. G. M.; Waltham, N. R.; Brown, P.; Pool, P.

    2014-06-01

    Progress on the design of a CCD-based imaging system is presented. The camera system, provided by the UK, uses space-qualified and mature technology and is included in the ExoMars RLS instrument due for launch in 2018.

  14. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex Cameras (DSLR) have appeared. As a consequence we can buy cameras of higher and higher pixel number, and mass production has resulted in the great reduction of prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way—intelligent programs are created for after-processing and other supplementary works. Nowadays we can find a digital camera in almost every household, most of these cameras are DSLR ones. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative for the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these for their own circumstances.
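
    The two double-star parameters named above can be computed from calibrated pixel coordinates. The sketch assumes a north-up, east-left frame orientation and a known plate scale; for a real DSLR setup both must be calibrated, and the formulae here are the generic ones, not the author's specific software:

    ```python
    import math

    def separation_and_pa(p1, p2, arcsec_per_pixel):
        """Separation (arcsec) and position angle (degrees, N through E)
        of companion p2 relative to primary p1, given (x, y) pixel
        coordinates with image rows growing downward, north up, east left."""
        dx = p2[0] - p1[0]          # +x toward west
        dy = p2[1] - p1[1]          # +y toward south
        sep = arcsec_per_pixel * math.hypot(dx, dy)
        north = -dy                 # component toward north
        east = -dx                  # component toward east
        pa = math.degrees(math.atan2(east, north)) % 360.0
        return sep, pa
    ```
    
    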

  15. Scientific CCD characterisation at Universidad Complutense LICA Laboratory

    NASA Astrophysics Data System (ADS)

    Tulloch, S.; Gil de Paz, A.; Gallego, J.; Zamorano, J.; Tapia, Carlos

    2012-07-01

    A CCD test-bench has been built at the Universidad Complutense LICA laboratory. It is initially intended for commissioning of the MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) instrument but can be considered a general-purpose scientific CCD test-bench. The test-bench uses an incandescent broad-band light source in combination with a monochromator and two filter wheels to provide programmable narrow-band illumination across the visible band. Light from the monochromator can be directed to an integrating sphere for flat-field measurements or sent via a small aperture directly onto the CCD under test for high-accuracy diode-mode quantum efficiency measurements. Point spread function measurements can also be performed by interposing additional optics between the sphere and the CCD under test. The whole system is under LabVIEW control via a clickable GUI. Automated measurement scans of quantum efficiency can be performed, requiring only that the user replace the CCD under test with a calibrated photodiode after each measurement run. A 20-cm diameter cryostat with a 10-cm window and a Brooks Polycold PCC closed-cycle cooler also form part of the test-bench. This cryostat is large enough to accommodate almost all scientific CCD formats and has initially been used to house an E2V CCD230 in order to fully prove the test-bench functionality. This device is read out using an Astronomical Research Cameras controller connected to the UKATC's UCAM data acquisition system.

  16. CCD image sensor induced error in PIV applications

    NASA Astrophysics Data System (ADS)

    Legrand, M.; Nogueira, J.; Vargas, A. A.; Ventas, R.; Rodríguez-Hidalgo, M. C.

    2014-06-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely used in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (~0.1 pixels). This is the order of magnitude that other typical PIV errors, such as peak-locking, may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a model for the magnitude of the CCD readout bias error. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, which can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described.
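
    The correction pattern described above can be sketched generically: each exposure's particle images shift by a bias that depends on its illumination, and only the difference between the two exposures' biases contaminates the displacement. The two-constant linear form below is a hypothetical stand-in; the paper's actual calibration model is not reproduced here:

    ```python
    def readout_bias(intensity_ratio, c1, c2):
        """Hypothetical two-constant bias model: position bias in pixels as
        a function of an inter-exposure intensity ratio. The paper fits two
        calibration constants; this linear form is an illustrative assumption."""
        return c1 + c2 * intensity_ratio

    def corrected_displacement(measured_px, ratio_frame_a, ratio_frame_b, c1, c2):
        """Remove the readout-induced velocity bias: each frame's particle
        images shift by their own bias, so only the bias difference between
        the two exposures contaminates the measured displacement."""
        return measured_px - (readout_bias(ratio_frame_a, c1, c2)
                              - readout_bias(ratio_frame_b, c1, c2))
    ```
    
    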

  17. CCD high-speed videography system with new concepts and techniques

    NASA Astrophysics Data System (ADS)

    Zheng, Zengrong; Zhao, Wenyi; Wu, Zhiqiang

    1997-05-01

    A novel CCD high-speed videography system with brand-new concepts and techniques has recently been developed by Zhejiang University. The system sends a series of short flash pulses to the moving object. All of the parameters, such as flash number, flash durations, flash intervals, flash intensities, and flash colors, can be controlled as needed by the computer. A series of moving-object images frozen by the flash pulses, carrying information about the moving object, is recorded by a CCD video camera, and the resulting images are sent to a computer to be frozen, recognized, and processed with special hardware and software. The obtained parameters can be displayed, output as remote control signals, or written to CD. The highest videography frequency is 30,000 images per second. The shortest image freezing time is several microseconds. The system has been applied to a wide range of fields: energy, chemistry, medicine, biological engineering, aerodynamics, explosions, multi-phase flow, mechanics, vibration, athletic training, weapon development, and national defense engineering. It can also be used on production lines to carry out online, real-time monitoring and control.

  18. Video-based beam position monitoring at CHESS

    NASA Astrophysics Data System (ADS)

    Revesz, Peter; Pauling, Alan; Krawczyk, Thomas; Kelly, Kevin J.

    2012-10-01

    CHESS has pioneered the development of X-ray Video Beam Position Monitors (VBPMs). Unlike traditional photoelectron beam position monitors, which rely on photoelectrons generated by the fringe edges of the X-ray beam, VBPMs collect information from the whole cross-section of the X-ray beam and can also give real-time shape/size information. We have developed three types of VBPMs: (1) VBPMs based on helium luminescence from the intense white X-ray beam; in this case the CCD camera views the luminescence from the side. (2) VBPMs based on the luminescence of a thin (~50 micron) CVD diamond sheet as the white beam passes through it; the CCD camera is placed outside the beam line vacuum and views the diamond fluorescence through a viewport. (3) Scatter-based VBPMs, in which the white X-ray beam passes through a thin graphite filter or Be window and the scattered X-rays, passing through a slit, form an image of the beam's footprint on an X-ray-sensitive fluorescent screen outside the beam line vacuum. All VBPMs use relatively inexpensive 1.3-megapixel CCD cameras connected via USB to a Windows host for image acquisition and analysis. The VBPM host computers are networked and provide live images of the beam, as well as streams of data about the beam position, profile and intensity, to CHESS's signal logging system and to the CHESS operator. In operational use, VBPMs have shown a great advantage over traditional BPMs by providing direct visual input for the CHESS operator. The VBPM precision is in most cases on the order of ~0.1 micron. On the downside, the data acquisition period (50-1000 ms) is inferior to that of photoelectron-based BPMs. In the future, with more expensive fast cameras, we will be able to create VBPMs working at the few-hundred-Hz scale.

  19. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  20. Characterization of the series 1000 camera system

    SciTech Connect

    Kimbrough, J.R.; Moody, J.D.; Bell, P.M.; Landen, O.L.

    2004-10-01

    The National Ignition Facility requires a compact network addressable scientific grade charge coupled device (CCD) camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, four analog outputs, and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  1. Tests of commercial colour CMOS cameras for astronomical applications

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Reshetnyk, V. M.; Zhilyaev, B. E.

    2013-12-01

    We present some results of testing commercial colour CMOS cameras for astronomical applications. Colour CMOS sensors allow photometry to be performed in three filters simultaneously, which is a great advantage over monochrome CCD detectors. The Bayer BGR colour system realized in colour CMOS sensors is close to the astronomical Johnson BVR system. The basic camera characteristics, read noise (e^{-}/pix), thermal noise (e^{-}/pix/sec) and electronic gain (e^{-}/ADU), are presented for the commercial digital camera Canon 5D Mark III, along with the same characteristics for the high-performance cooled scientific CCD camera system ALTA E47. The test results for the Canon 5D Mark III and the CCD ALTA E47 show that present-day commercial colour CMOS cameras can seriously compete with scientific CCD cameras in deep astronomical imaging.
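
Camera characteristics like those quoted above (read noise, gain in e-/ADU) are commonly measured with the photon-transfer method. The sketch below is an illustration of that method, not the authors' procedure; the function name and the simulated numbers are invented for the example.

```python
import numpy as np

def estimate_gain_e_per_adu(flat1, flat2, bias_level=0.0):
    """Estimate electronic gain (e-/ADU) from a pair of flat fields.

    Photon-transfer method: for a shot-noise-limited signal, the
    variance in ADU^2 equals mean/gain, so gain = mean / variance.
    Differencing two flats cancels fixed-pattern noise, and
    var(flat1 - flat2) = 2 * var_single.
    """
    f1 = flat1.astype(float) - bias_level
    f2 = flat2.astype(float) - bias_level
    mean_signal = 0.5 * (f1.mean() + f2.mean())
    var_single = np.var(f1 - f2) / 2.0
    return mean_signal / var_single

# Simulated check: ~10000 e- mean signal digitized at a true gain of 2 e-/ADU
rng = np.random.default_rng(0)
true_gain = 2.0
electrons1 = rng.poisson(10000.0, size=(512, 512))
electrons2 = rng.poisson(10000.0, size=(512, 512))
g = estimate_gain_e_per_adu(electrons1 / true_gain, electrons2 / true_gain)
```

In practice the mean-variance pair is measured at several illumination levels and fitted with a line, which also yields the read noise from the intercept.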

  2. Rail head wear measurements using the CCD photonic system

    NASA Astrophysics Data System (ADS)

    Popov, Dmitry V.; Titov, Evgeny V.; Mikhailov, Sergey S.

    1999-10-01

    Comprehensive studies currently exist in the field of railway track condition monitoring systems and the development of non-contact photonic systems based on digital CCD cameras, high-speed on-board computers and powerful software. Such systems allow preventive track maintenance to be carried out in good time and the effects of vibration from wavy rail defects on a wheel set to be avoided. As a result, running safety and the durability of the permanent way and rolling stock are increased, and maintenance costs are reduced. The system developed consists of four special digital matrix CCD cameras and four laser stripe illuminators. An electronic interface linking the computer with the cameras and contour-extraction models of the rail profile have been developed, and an analysis of the input-output ports has been carried out. Within the profile-measurement algorithms, a cut-off method and a tangent method have been compared.

  3. CCD Stability Monitor

    NASA Astrophysics Data System (ADS)

    Mack, Jennifer

    2009-07-01

    This program will verify that the low frequency flat fielding, the photometry, and the geometric distortion are stable in time and across the field of view of the CCD arrays. A moderately crowded stellar field in the cluster 47 Tuc is observed with the HRC {at the cluster core} and WFC {6 arcmin West of the cluster core} using the full suite of broad and narrow band imaging filters. The positions and magnitudes of objects will be used to monitor local and large scale variations in the plate scale and the sensitivity of the detectors and to derive an independent measure of the detector CTE. The UV sensitivity for the SBC and HRC will be addressed in the UV contamination monitor program {11886, PI=Smith}. One additional orbit at the beginning of the cycle will allow a verification of the CCD gain ratios for WFC using gains 2.0, 1.4, 1.0 and 0.5, and for HRC using gains 4.0 and 2.0. In addition, one subarray exposure with the WFC will verify that photometry obtained in full-frame and in sub-array modes is repeatable to better than 1%. This test is important for the ACS Photometric Cross-Calibration program {11889, PI=Bohlin}, which uses sub-array exposures.

  4. L3 CCD Technology

    NASA Astrophysics Data System (ADS)

    Tulloch, S.

    2003-12-01

    Low Light Level (L3) CCD technology is a recent development from E2V that opens up interesting new observational regimes. The technology allows the production of scientific CCDs in which the read noise of the on-chip amplifier becomes negligibly low. Additionally, this effectively zero-noise performance is decoupled from readout speed and holds up to frame rates of 1 kHz. E2V achieve this by using an avalanche multiplication mechanism in the horizontal register of the CCD. A single photo-electron entering this register exits as a substantial charge packet, the exact gain being variable and determined by the level of a high-voltage multiplication clock. At gain levels of around 500 it becomes possible to identify individual photon events in the image. The downside of L3 technology is that the multiplication process degrades the SNR at higher signal levels by a factor of 2^{1/2}. There is also a small additional noise contribution from spuriously generated electrons within the device.
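
The 2^{1/2} penalty is usually expressed as an excess noise factor F on the shot noise, while the read noise is divided by the multiplication gain G. A minimal sketch of this standard EMCCD noise model follows; the function and the parameter values are illustrative, not taken from the paper.

```python
import math

def l3ccd_snr(signal_e, read_noise_e, em_gain, excess_noise_factor=math.sqrt(2)):
    """Approximate per-pixel SNR of an electron-multiplying (L3) CCD.

    The stochastic avalanche gain multiplies the shot-noise variance by
    F^2 ~ 2, but the amplifier read noise is suppressed by the gain G.
    """
    shot_var = excess_noise_factor ** 2 * signal_e  # amplified shot noise
    read_var = (read_noise_e / em_gain) ** 2        # effectively negligible at G ~ 500
    return signal_e / math.sqrt(shot_var + read_var)

# At faint signal levels the multiplied mode wins despite the sqrt(2) penalty:
faint = l3ccd_snr(25, read_noise_e=10, em_gain=500)  # ~3.5
conventional = 25 / math.sqrt(25 + 10 ** 2)          # ~2.2, same pixel without gain
```

At high signal levels the comparison reverses: shot noise dominates both cases, and the F ~ sqrt(2) factor makes the multiplied output noisier, which is the SNR degradation the abstract describes.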

  5. CCD technique for longitude/latitude astronomy

    NASA Astrophysics Data System (ADS)

    Damljanović, G.; Gerstbach, G.; de Biasi, M. S.; Pejović, N.

    2003-10-01

    We report on CCD (Charge Coupled Device) experiments with astrometric and geodetic instruments for longitude and latitude determination. At the Technical University of Vienna (TU Vienna), a mobile zenith camera "G1" was developed, based on the CCD MX916 (Starlight Xpress) and F=20 cm photo optics. With the Hipparcos/Tycho Catalogue, the first results show an accuracy of up to 0."5 for latitude/longitude. The PC-guided observations can be completed within 10 minutes. The G1 camera (near 4 kg) is used for astro-geodesy (geoid, Earth's crust, etc.). At the Belgrade Astronomical Observatory (AOB), the accuracy of (mean values of) latitude/longitude determinations can reach a few 0."01 using zenith stars, the Tycho-2 Catalogue and an ST-8 of SBIG (Santa Barbara Instrument Group) with the zenith-telescope BLZ (D=11 cm, F=128.7 cm). The same equipment with the PIP instrument (D=20 cm and F=457.7 cm, Punta Indio PZT, near La Plata) yields slightly better accuracy than the BLZ. Both instruments, BLZ and PIP, were in the list of the Bureau International de l'Heure (BIH). The mentioned instruments offer good possibilities for semi- or fully automatic observations.

  6. Classical astrometry longitude and latitude determination by using CCD technique

    NASA Astrophysics Data System (ADS)

    Damljanović, G.; de Biasi, M. S.; Gerstbach, G.

    At the AOB there is the zenith-telescope (D=11 cm, F=128.7 cm, denoted BLZ in the list of the Bureau International de l'Heure - BIH), and at Punta Indio (near La Plata) the photographic zenith tube (D=20 cm, F=457.7 cm, denoted PIP in the list of the BIH). At the AOB there is a CCD camera ST-8 of the Santa Barbara Instrument Group (SBIG) with 1530×1020 pixels, 9×9 micron pixel size and 13.8×9.2 mm array dimensions. We investigated the possibilities for longitude (λ) and latitude (φ) determination using the ST-8 with BLZ and PIP, and our predicted level of accuracy is a few 0."01 from one CCD processing of zenith stars with the Tycho-2 Catalogue. Astro-geodesy has also gained new practicability with CCDs (to reach a good accuracy of geoid determination via astro-geodetic λ and φ observations). At the TU Wien there is the CCD MX916 of Starlight Xpress (with 752×580 pixels, 11×12 micron pixels, 8.7×6.5 mm active area). Our predicted level of accuracy for λ and φ measurements is a few 0."1 from one CCD MX916 processing of zenith stars, with small optics (20 cm focal length, since the instrument is mobile rather than fixed) and Tycho-2. A transportable zenith camera with a CCD is under development at the TU Wien for astro-geodetic purposes.

  7. An advanced CCD emulator with 32MB image memory

    NASA Astrophysics Data System (ADS)

    O'Connor, P.; Fried, J.; Kotov, I.

    2012-07-01

    As part of the LSST sensor development program we have developed an advanced CCD emulator for testing new multichannel readout electronics. The emulator, based on an Altera Stratix II FPGA for timing and control, produces 4 channels of simulated video waveforms in response to an appropriate sequence of horizontal and vertical clocks. It features 40MHz, 16-bit DACs for reset and video generation, 32MB of image memory for storage of arbitrary grayscale bitmaps, and provision to simulate reset and clock feedthrough ("glitches") on the video channels. Clock inputs are qualified for proper sequences and levels before video output is generated. Binning, region of interest, and reverse clock sequences are correctly recognized and appropriate video output will be produced. Clock transitions are timestamped and can be played back to a control PC. A simplified user interface is provided via a daughter card having an ARM M3 Cortex microprocessor and miniature color LCD display and joystick. The user can select video modes from stored bitmap images, or flat, gradient, bar, chirp, or checkerboard test patterns; set clock thresholds and video output levels; and set row/column formats for image outputs. Multiple emulators can be operated in parallel to simulate complex CCDs or CCD arrays.

  8. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera being designed through the joint effort of the Institute of Radioelectronics and the Institute of Optoelectronics (both large units of the Warsaw University of Technology within the Faculty of Electronics and Information Technology) combines various contemporary hardware and software technologies. In hardware, it integrates a high-speed stripe projector and a stripe camera together with a standard high-definition video camera. In software, it is supported by sophisticated calibration techniques which enable the development of advanced applications such as a real-time 3D viewer of moving objects with a free viewpoint, or a 3D modeller for still objects.

  9. CCD-based guiding sensor resolution enhancement

    NASA Astrophysics Data System (ADS)

    Sobotka, Milos; Prochazka, Ivan; Hamal, Karel; Blazej, Josef

    1998-04-01

    The paper presents achievements in the research and development of a compact optical CCD-based guiding system. The ultimate goal of the project is a modest-size, low-mass and rugged system for sub-arcsecond optical ground-space guiding and tracking. The system includes the optics, the CCD sensor with its readout, and an image-processing algorithm. The optics consist of a diffraction-limited objective, a four-element lens system with an effective input aperture of 90 millimeters. The objective focal length of 300 mm is extended by additional relay optics, giving an effective focal length of 3000 millimeters and a focal spot size of 65 micrometers (Airy disc diameter). The combination of the diffraction-limited objective design, focal extender and mechanical construction kept the overall length below 600 millimeters and the total mass below 5 kilograms while maintaining ruggedness at the one arc-second level. The sensor is a Texas Instruments CCD chip with 192 x 164 pixels of 15 micrometers. Custom-designed readout and data-processing hardware has been developed; parallel communication maintains an image download time of 0.6 seconds at 12-bit amplitude resolution. The data acquisition and image-processing software package, running under MS Windows 95 or NT, provides all functions for camera control, data acquisition and image processing for precise target position evaluation. The position is evaluated as the center of mass of a square neighborhood of the brightest CCD pixel. Indoor tests of the ultimate position resolution using diffraction-limited images of different sizes are described. An image position resolution of +/- 0.03 pixel has been achieved, corresponding to 0.03 arc seconds of angular resolution for the entire guiding sensor.
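
The position estimator described, the center of mass of a square neighborhood around the brightest pixel, can be sketched as follows. The window half-width and the test image are assumptions for illustration, not values from the paper.

```python
import numpy as np

def spot_centroid(image, half_width=5):
    """Sub-pixel spot position: intensity center of mass of the square
    neighborhood around the brightest pixel."""
    img = np.asarray(image, dtype=float)
    peak_y, peak_x = np.unravel_index(np.argmax(img), img.shape)
    # Clamp the window at the array edges; slicing clamps the far side.
    y0 = max(peak_y - half_width, 0)
    x0 = max(peak_x - half_width, 0)
    window = img[y0:y0 + 2 * half_width + 1, x0:x0 + 2 * half_width + 1]
    ys, xs = np.mgrid[0:window.shape[0], 0:window.shape[1]]
    total = window.sum()
    return (ys * window).sum() / total + y0, (xs * window).sum() / total + x0
```

With a well-sampled diffraction spot and adequate SNR, a centroid of this kind can plausibly reach the few-hundredths-of-a-pixel repeatability the paper reports.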

  10. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

  11. Based on line scan CCD print image detection system

    NASA Astrophysics Data System (ADS)

    Zhang, Lifeng; Xie, Kai; Li, Tong

    2015-12-01

    In this paper, a new machine-vision method is proposed to address the shortcomings of traditional manual inspection of printed-matter quality. A line-scan CCD camera is used for image acquisition, with a stepper motor driving the sampling. By improving the driver circuit, images of different sizes or precisions can be acquired. For image processing, a standard image registration algorithm is applied; because of the characteristics of line-scan CCD image acquisition, a rigid-body transformation is usually sufficient for the registration, enabling defect detection in the printed image.

  12. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample-and-hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs: one is a general-purpose industry-standard port and the other a high-speed, high-performance application-specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  13. STIS-01 CCD Functional

    NASA Astrophysics Data System (ADS)

    Valenti, Jeff

    2001-07-01

    This activity measures the baseline performance and commandability of the CCD subsystem. Only primary amplifier D is used. Bias, Dark, and Flat Field exposures are taken in order to measure read noise, dark current, CTE, and gain. Numerous bias frames are taken to permit construction of "superbias" frames in which the effects of read noise have been rendered negligible. Dark exposures are made outside the SAA. Full frame and binned observations are made, with binning factors of 1x1 and 2x2. Finally, tungsten lamp exposures are taken through narrow slits to confirm the slit positions in the current database. All exposures are internals. This is a reincarnation of SM3A proposal 8502 with some unnecessary tests removed from the program.

  14. CCD observations of Phoebe

    NASA Astrophysics Data System (ADS)

    Veiga, C. H.; Vieira Martins, R.; Andrei, A. H.

    2000-02-01

    Astrometric CCD positions of the Saturnian satellite Phoebe, obtained from 60 frames taken on 10 nights, are presented. The observations were distributed over 5 missions in the years 1995 to 1997. The USNO-A2.0 Catalogue is used for the astrometric calibration. All positions are compared with those calculated by Jacobson (1998a) and Bec-Borsenberger & Rocher (1982). The residuals have mean and standard deviation smaller than 0."5 in the x and y directions. The distribution of residuals suggests the need for an improvement of the orbit calculations. Based on observations made at Laboratório Nacional de Astrofísica/CNPq/MCT, Itajubá, Brazil. Please send offprint requests to C.H. Veiga. Table 1 is only available at http://www.edpsciences.org

  15. Measurement of marine picoplankton cell size by using a cooled, charge-coupled device camera with image-analyzed fluorescence microscopy

    SciTech Connect

    Viles, C.L.; Sieracki, M.E.

    1992-02-01

    Accurate measurement of the biomass and size distribution of picoplankton cells (0.2 to 2.0 μm) is paramount in characterizing their contribution to the oceanic food web and global biogeochemical cycling. Image-analyzed fluorescence microscopy, usually based on video camera technology, allows detailed measurements of individual cells. The application of an imaging system employing a cooled, slow-scan charge-coupled device (CCD) camera to automated counting and sizing of individual picoplankton cells from natural marine samples is described. The slow-scan CCD-based camera was compared to a video camera and was superior for detecting and sizing very small, dim particles such as fluorochrome-stained bacteria. Several edge-detection methods for accurately measuring picoplankton cells were evaluated, using standard fluorescent microspheres and a Sargasso Sea surface-water picoplankton population. Global thresholding was inappropriate for these samples, and methods used previously in image analysis of nanoplankton cells (2 to 20 μm) also did not work well with the smaller picoplankton cells. A method combining an edge detector and an adaptive edge-strength operator worked best for rapidly generating accurate cell sizes. A complete sample analysis of more than 1,000 cells takes about 50 min and yields size, shape, and fluorescence data for each cell. With this system, the entire size range of picoplankton can be counted and measured.
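
The abstract does not specify the adaptive edge-strength operator used. As a generic illustration of the edge-strength idea, here is a Sobel gradient-magnitude operator in plain NumPy, a common stand-in rather than the authors' method.

```python
import numpy as np

def sobel_edge_strength(image):
    """Gradient-magnitude edge strength via 3x3 Sobel kernels
    (valid region only, so the output is 2 pixels smaller per axis)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for dy in range(3):          # small explicit convolution, no SciPy needed
        for dx in range(3):
            patch = img[dy:dy + h - 2, dx:dx + w - 2]
            gx += kx[dy, dx] * patch
            gy += ky[dy, dx] * patch
    return np.hypot(gx, gy)
```

A cell boundary would then be placed where this edge strength peaks; an adaptive variant thresholds the response relative to local statistics rather than a single global value, which is why global thresholding failed on these dim, small cells.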

  16. CCD Hot Pixel Annealing

    NASA Astrophysics Data System (ADS)

    Cox, Colin

    2006-07-01

    Hot pixel annealing will continue to be performed once every 4 weeks. The CCD TECs will be turned off and heaters will be activated to bring the detector temperatures to about +20C. This state will be held for approximately 6 hours, after which the heaters are turned off, the TECs turned on, and the CCDs returned to normal operating condition. To assess the effectiveness of the annealing, a bias and four dark images will be taken before and after the annealing procedure for both WFC and HRC. The HRC darks are taken in parallel with the WFC darks. The charge transfer efficiency {CTE} of the ACS CCD detectors declines as damage due to on-orbit radiation exposure accumulates. This degradation has been closely monitored at regular intervals, because it is likely to determine the useful lifetime of the CCDs. We combine the annealing activity with the charge transfer efficiency monitoring and also merge it into the routine dark image collection. To this end, the CTE monitoring exposures have been moved into this proposal. All the data for this program is acquired using internal targets {lamps} only, so all of the exposures should be taken during Earth occultation time {but not during SAA passages}. This program emulates the ACS pre-flight ground calibration and post-launch SMOV testing {program 8948}, so that results from each epoch can be directly compared. Extended Pixel Edge Response {EPER} and First Pixel Response {FPR} data will be obtained over a range of signal levels for both the Wide Field Channel {WFC} and the High Resolution Channel {HRC}.

  17. CCD Hot Pixel Annealing

    NASA Astrophysics Data System (ADS)

    Cox, Colin

    2004-07-01

    Hot pixel annealing will continue to be performed once every 4 weeks. The CCD TECs will be turned off and heaters will be activated to bring the detector temperatures to about +20C. This state will be held for approximately 12 hours, after which the heaters are turned off, the TECs turned on, and the CCDs returned to normal operating condition. To assess the effectiveness of the annealing, a bias and four dark images will be taken before and after the annealing procedure for both WFC and HRC. The HRC darks are taken in parallel with the WFC darks. The charge transfer efficiency {CTE} of the ACS CCD detectors declines as damage due to on-orbit radiation exposure accumulates. This degradation has been closely monitored at regular intervals, because it is likely to determine the useful lifetime of the CCDs. We will now combine the annealing activity with the charge transfer efficiency monitoring and also merge it into the routine dark image collection. To this end, the CTE monitoring exposures have been moved into this proposal. All the data for this program is acquired using internal targets {lamps} only, so all of the exposures should be taken during Earth occultation time {but not during SAA passages}. This program emulates the ACS pre-flight ground calibration and post-launch SMOV testing {program 8948}, so that results from each epoch can be directly compared. Extended Pixel Edge Response {EPER} and First Pixel Response {FPR} data will be obtained over a range of signal levels for both the Wide Field Channel {WFC} and the High Resolution Channel {HRC}.

  18. CCD Hot Pixel Annealing

    NASA Astrophysics Data System (ADS)

    Cox, Colin

    2005-07-01

    Hot pixel annealing will continue to be performed once every 4 weeks. The CCD TECs will be turned off and heaters will be activated to bring the detector temperatures to about +20C. This state will be held for approximately 6 hours, after which the heaters are turned off, the TECs turned on, and the CCDs returned to normal operating condition. To assess the effectiveness of the annealing, a bias and four dark images will be taken before and after the annealing procedure for both WFC and HRC. The HRC darks are taken in parallel with the WFC darks. The charge transfer efficiency {CTE} of the ACS CCD detectors declines as damage due to on-orbit radiation exposure accumulates. This degradation has been closely monitored at regular intervals, because it is likely to determine the useful lifetime of the CCDs. We combine the annealing activity with the charge transfer efficiency monitoring and also merge it into the routine dark image collection. To this end, the CTE monitoring exposures have been moved into this proposal. All the data for this program is acquired using internal targets {lamps} only, so all of the exposures should be taken during Earth occultation time {but not during SAA passages}. This program emulates the ACS pre-flight ground calibration and post-launch SMOV testing {program 8948}, so that results from each epoch can be directly compared. Extended Pixel Edge Response {EPER} and First Pixel Response {FPR} data will be obtained over a range of signal levels for both the Wide Field Channel {WFC} and the High Resolution Channel {HRC}.

  19. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  1. Video monitoring system for car seat

    NASA Technical Reports Server (NTRS)

    Elrod, Susan Vinz (Inventor); Dabney, Richard W. (Inventor)

    2004-01-01

    A video monitoring system for use with a child car seat has video camera(s) mounted in the car seat. The video images are wirelessly transmitted to a remote receiver/display encased in a portable housing that can be removably mounted in the vehicle in which the car seat is installed.

  2. ccdproc: CCD data reduction software

    NASA Astrophysics Data System (ADS)

    Crawford, S. M.; Craig, M. W.; ccdproc Contributors

    2015-10-01

    Ccdproc is an affiliated package of AstroPy for basic data reduction of CCD images. The ccdproc package provides many of the necessary tools for processing CCD images, built on a framework that provides error propagation and bad-pixel tracking throughout the reduction process.
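
For reference, the core arithmetic behind such a reduction pipeline can be sketched in plain NumPy. This shows the usual bias/dark/flat steps only; the function and its argument names are illustrative and are not ccdproc's actual API, which additionally propagates uncertainties and masks.

```python
import numpy as np

def reduce_frame(raw, bias, dark, flat, exptime, dark_exptime):
    """Basic CCD reduction: bias subtraction, exposure-scaled dark
    subtraction, and division by a unit-mean normalized flat."""
    debiased = raw.astype(float) - bias
    dark_current = (dark - bias) * (exptime / dark_exptime)  # scale dark to exposure
    science = debiased - dark_current
    flat_field = flat.astype(float) - bias
    return science / (flat_field / flat_field.mean())
```

Dividing by a unit-mean flat removes pixel-to-pixel sensitivity variations without changing the overall flux scale of the frame.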

  3. Practical performance evaluation of a 10k 10k CCD for electron cryo-microscopy.

    PubMed

    Bammes, Benjamin E; Rochat, Ryan H; Jakana, Joanita; Chiu, Wah

    2011-09-01

    Electron cryo-microscopy (cryo-EM) images are commonly collected using either charge-coupled devices (CCD) or photographic film. Both film and the current generation of 16-megapixel (4k x 4k) CCD cameras have yielded high-resolution structures. Yet, despite the many advantages of CCD cameras, more than twice as many structures of biological macromolecules have been published in recent years using photographic film. The continued preference for film, especially for subnanometer-resolution structures, may be partially influenced by the finer sampling and larger effective specimen imaging area it offers. Large-format digital cameras may finally allow CCDs to overtake film as the preferred detector for cryo-EM. We have evaluated a 111-megapixel (10k x 10k) CCD camera with a 9 μm pixel size. The spectral signal-to-noise ratios of low-dose images of carbon film indicate that this detector is capable of providing signal up to at least 2/5 of the Nyquist frequency that is potentially retrievable for 3D reconstructions of biological specimens, resulting in more than double the effective specimen imaging area of existing 4k x 4k CCD cameras. We verified our estimates using frozen-hydrated epsilon15 bacteriophage as a biological test specimen with a previously determined structure, yielding a ~7 Å resolution single-particle reconstruction from only 80 CCD frames. Finally, we explored the limits of current CCD technology by comparing the performance of this detector to various CCD cameras used for recording data yielding subnanometer-resolution cryo-EM structures submitted to the electron microscopy data bank (http://www.emdatabank.org/). PMID:21619932

  4. Guerrilla Video: A New Protocol for Producing Classroom Video

    ERIC Educational Resources Information Center

    Fadde, Peter; Rich, Peter

    2010-01-01

    Contemporary changes in pedagogy point to the need for a higher level of video production value in most classroom video, replacing the default video protocol of an unattended camera in the back of the classroom. The rich and complex environment of today's classroom can be captured more fully using the higher level, but still easily manageable,

  5. System for control of cooled CCD and image data processing for plasma spectroscopy

    SciTech Connect

    Mimura, M.; Kakeda, T.; Inoko, A.

    1995-12-31

    A spectroscopic measurement system with spatial resolution is important for plasma studies. This is especially true for measurements of a plasma without axial symmetry, like the LHD plasma. Several years ago, we developed an imaging spectroscopy system using a CCD camera and an image-memory board in a personal computer. It was very powerful for studying plasma-gas interaction phenomena. In that system, however, an ordinary CCD was used, so the dark-current noise of the CCD prevented measurement of faint spectral lines. Recently, cooled CCD systems have become available for high-sensitivity measurements, but such systems are still very expensive. The cooled CCD itself, as a component, can be purchased cheaply, because amateur astronomers have begun to use it to take pictures of heavenly bodies. We therefore developed an imaging spectroscopy system using such a cheap cooled CCD for plasma experiments.

  6. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward (Hercules, CA)

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.

  7. Representing videos in tangible products

    NASA Astrophysics Data System (ADS)

    Fageth, Reiner; Weiting, Ralf

    2014-03-01

    Videos can be taken with nearly every camera, digital point and shoot cameras, DSLRs as well as smartphones and more and more with so-called action cameras mounted on sports devices. The implementation of videos while generating QR codes and relevant pictures out of the video stream via a software implementation was contents in last years' paper. This year we present first data about what contents is displayed and how the users represent their videos in printed products, e.g. CEWE PHOTOBOOKS and greeting cards. We report the share of the different video formats used, the number of images extracted out of the video in order to represent the video, the positions in the book and different design strategies compared to regular books.

  8. Improved Optical Techniques for Studying Sonic and Supersonic Injection into MACH-3 Flow. Video Supplement E-10853-V

    NASA Technical Reports Server (NTRS)

    Buggele, A. E.; Seasholtz, R. G.

    1997-01-01

    This video supplements a report examining optical techniques for studying sonic and supersonic injection into Mach 3 flow. The study used an injection-seeded, frequency-doubled Nd:YAG pulsed laser to illuminate a transverse section of the injectant plume. Rayleigh-scattered light was passed through an iodine absorption cell to suppress stray laser light and was imaged onto a cooled CCD camera. The scattering was based on condensation of water vapor in the injectant flow. High-speed shadowgraph flow visualization images were obtained with several video camera systems. Roof and floor static pressure data are presented in several ways for the three injection-design configurations, with and without helium and/or air injection into Mach 3 flow.

  9. Video Mosaicking for Inspection of Gas Pipelines

    NASA Technical Reports Server (NTRS)

    Magruder, Darby; Chien, Chiun-Hong

    2005-01-01

    A vision system that includes a specially designed video camera and an image-data-processing computer is under development as a prototype of robotic systems for visual inspection of the interior surfaces of pipes and especially of gas pipelines. The system is capable of providing both forward views and mosaicked radial views that can be displayed in real time or after inspection. To avoid the complexities associated with moving parts and to provide simultaneous forward and radial views, the video camera is equipped with a wide-angle (>165°) fish-eye lens aimed along the axis of a pipe to be inspected. Nine white-light-emitting diodes (LEDs) placed just outside the field of view of the lens (see Figure 1) provide ample diffuse illumination for a high-contrast image of the interior pipe wall. The video camera contains a 2/3-in. (1.7-cm) charge-coupled-device (CCD) photodetector array and functions according to the National Television Standards Committee (NTSC) standard. The video output of the camera is sent to an off-the-shelf video capture board (frame grabber) by use of a peripheral component interconnect (PCI) interface in the computer, which is of the 400-MHz, Pentium II (or equivalent) class. Prior video-mosaicking techniques are applicable to narrow-field-of-view (low-distortion) images of evenly illuminated, relatively flat surfaces viewed along approximately perpendicular lines by cameras that do not rotate and that move approximately parallel to the viewed surfaces. One such technique for real-time creation of mosaic images of the ocean floor involves the use of visual correspondences based on area correlation, during both the acquisition of separate images of adjacent areas and the consolidation (equivalently, integration) of the separate images into a mosaic image, in order to ensure that there are no gaps in the mosaic image.
The data-processing technique used for mosaicking in the present system also involves area correlation, but with several notable differences: Because the wide-angle lens introduces considerable distortion, the image data must be processed to effectively unwarp the images (see Figure 2). The computer executes special software that includes an unwarping algorithm that takes explicit account of the cylindrical pipe geometry. To reduce the processing time needed for unwarping, parameters of the geometric mapping between the circular view of a fisheye lens and pipe wall are determined in advance from calibration images and compiled into an electronic lookup table. The software incorporates the assumption that the optical axis of the camera is parallel (rather than perpendicular) to the direction of motion of the camera. The software also compensates for the decrease in illumination with distance from the ring of LEDs.
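    The precomputed-lookup-table unwarping can be sketched as follows. The annulus-to-rectangle geometry here is a simplification of the calibrated cylindrical mapping described above, and all parameter values are illustrative:

```python
import numpy as np

def build_unwarp_lut(out_w, out_h, cx, cy, r_inner, r_outer):
    # One-time calibration step: for every pixel of the unwarped
    # (axial x azimuth) image, store the source pixel coordinates in
    # the circular fisheye view.
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radius = np.linspace(r_inner, r_outer, out_h)
    rr, tt = np.meshgrid(radius, theta, indexing="ij")
    xs = np.round(cx + rr * np.cos(tt)).astype(np.intp)
    ys = np.round(cy + rr * np.sin(tt)).astype(np.intp)
    return xs, ys

def unwarp(fisheye_img, lut):
    # Per-frame step: a pure table lookup, no trigonometry at runtime.
    xs, ys = lut
    return fisheye_img[ys, xs]
```

The design point matches the text: all the trigonometric work is paid once when the table is built, so each video frame needs only array indexing.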

  10. World's fastest and most sensitive astronomical camera

    NASA Astrophysics Data System (ADS)

    2009-06-01

    The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240 pixel images with the world's fastest high precision faint light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). (ESO PR Photos 22a/09 and 22b/09 show the CCD220 detector and the OCam camera; ESO PR Video 22a/09 shows OCam images.) "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is needed as an essential component for the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets, but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second. 
The new generation instruments require these corrections to be done at an even higher rate, more than one thousand times a second, and this is where OCam is essential. "The quality of the adaptive optics correction strongly depends on the speed of the camera and on its sensitivity," says Philippe Feautrier from the LAOG, France, who coordinated the whole project. "But these are a priori contradictory requirements, as in general the faster a camera is, the less sensitive it is." This is why cameras normally used for very high frame-rate movies require extremely powerful illumination, which is of course not an option for astronomical cameras. OCam and its CCD220 detector, developed by the British manufacturer e2v technologies, solve this dilemma, by being not only the fastest available, but also very sensitive, making a significant jump in performance for such cameras. Because of imperfect operation of any physical electronic devices, a CCD camera suffers from so-called readout noise. OCam has a readout noise ten times smaller than the detectors currently used on the VLT, making it much more sensitive and able to take pictures of the faintest of sources. "Thanks to this technology, all the new generation instruments of ESO's Very Large Telescope will be able to produce the best possible images, with an unequalled sharpness," declares Jean-Luc Gach, from the Laboratoire d'Astrophysique de Marseille, France, who led the team that built the camera. "Plans are now underway to develop the adaptive optics detectors required for ESO's planned 42-metre European Extremely Large Telescope, together with our research partners and the industry," says Hubin. Using sensitive detectors developed in the UK, with a control system developed in France, with German and Spanish participation, OCam is truly an outcome of a European collaboration that will be widely used and commercially produced. 
More information The three French laboratories involved are the Laboratoire d'Astrophysique de Marseille (LAM/INSU/CNRS, Université de Provence; Observatoire Astronomique de Marseille Provence), the Laboratoire d'Astrophysique de Grenoble (LAOG/INSU/CNRS, Université Joseph Fourier; Observatoire des Sciences de l'Univers de Grenoble), and the Observatoire de Haute Provence (OHP/INSU/CNRS; Observatoire Astronomique de Marseille Provence). OCam and the CCD220 are the result of five years work, financed by the European commission, ESO and CNRS-INSU, within the OPTICON project of the 6th Research and Development Framework Programme of the European Union. The development of the CCD220, supervised by ESO, was undertaken by the British company e2v technologies, one of the world leaders in the manufacture of scientific detectors. The corresponding OPTICON activity was led by the Laboratoire d'Astrophysique de Grenoble, France. The OCam camera was built by a team of French engineers from the Laboratoire d'Astrophysique de Marseille, the Laboratoire d'Astrophysique de Grenoble and the Observatoire de Haute Provence. In order to secure the continuation of this successful project a new OPTICON project started in June 2009 as part of the 7th Research and Development Framework Programme of the European Union with the same partners, with the aim of developing a detector and camera with even more powerful functionality for use with an artificial laser star. This development is necessary to ensure the image quality of the future 42-metre European Extremely Large Telescope. ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. 
ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world's most advanced visible-light astronomical observatory. ESO is the European partner of a revolutionary astronomical telescope ALMA, the largest astronomical project in existence. ESO is currently planning a 42-metre European Extremely Large optical/near-infrared Telescope, the E-ELT, which will become "the world's biggest eye on the sky".

  11. Adaptive optics flood-illumination camera for high speed retinal imaging

    NASA Astrophysics Data System (ADS)

    Rha, Jungtae; Jonnal, Ravi S.; Thorn, Karen E.; Qu, Junle; Zhang, Yan; Miller, Donald T.

    2006-05-01

    Current adaptive optics flood-illumination retina cameras operate at low frame rates, acquiring retinal images below 7 Hz, which restricts their research and clinical utility. Here we investigate a novel bench-top flood-illumination camera that achieves significantly higher frame rates using strobing fiber-coupled superluminescent and laser diodes in conjunction with a scientific-grade CCD. Source strength was sufficient to obviate frame averaging, even for exposures as short as 1/3 msec. Continuous frame rates of 10, 30, and 60 Hz were achieved for imaging 1.8, 0.8, and 0.4 deg retinal patches, respectively. Short-burst imaging up to 500 Hz was also achieved by temporarily storing sequences of images on the CCD. High frame rates, short exposure durations (1 msec), and correction of the most significant aberrations of the eye were found necessary for individuating retinal blood cells and directly measuring cellular flow in capillaries. Cone videos of dark-adapted eyes showed a surprisingly rapid fluctuation (~1 Hz) in the reflectance of single cones. As a further demonstration of the value of the camera, we evaluated the tradeoff between exposure duration and image blur associated with retinal motion.

  12. Echelle spectroscopy with a charge-coupled device /CCD/

    NASA Technical Reports Server (NTRS)

    York, D. G.; Jenkins, E. B.; Zucchino, P.; Lowrance, J. L.; Long, D.; Songaila, A.

    1981-01-01

    The recent availability of large-format CCDs with high quantum efficiency makes it possible to achieve significant advances in high-dispersion astronomical spectroscopy. An echelle-CCD combination equals or surpasses other presently available techniques, and offers the advantage of complete spectral coverage of several thousand Angstroms in a single exposure. Attention is given to experiments which were conducted with a CCD camera head and an echelle spectrograph on a 4-meter telescope. It was found possible to achieve a signal-to-noise ratio of 150/1 on a 13th-magnitude star at 6000 A in a two-hour exposure at 0.16 A/pixel, limited primarily by photon statistics. For fainter objects, readout noise is the limiting factor in precision. For 20 electron rms readout noise, an S/N of 15/1 at 18th magnitude is expected, all other things being equal.
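    The photon-statistics versus readout-noise trade-off described here follows from the standard CCD signal-to-noise expression. A minimal sketch, with sky background and dark current simplified into a single optional term:

```python
import math

def ccd_snr(signal_e, sky_e=0.0, npix=1, read_noise_e=20.0):
    # S/N = S / sqrt(S + sky + npix * RN^2): shot-noise limited for
    # bright sources, readout-noise limited for faint ones.
    return signal_e / math.sqrt(signal_e + sky_e + npix * read_noise_e ** 2)
```

With read noise set to zero, 22,500 detected electrons give S/N = 150, the photon-statistics-limited regime of the bright-star measurement; for faint sources the npix * RN^2 term dominates, matching the abstract's readout-noise-limited case.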

  13. The test of the 10k x 10k CCD for Antarctic Survey Telescopes (AST3)

    NASA Astrophysics Data System (ADS)

    Ma, Bin; Shang, Zhaohui; Wang, Lifan; Boggs, Kasey; Hu, Yi; Liu, Qiang; Song, Qian; Xue, Suijian

    2012-09-01

    A 10k x 10k single-chip CCD camera was installed on the first Antarctic Survey Telescope (AST3-1) at Dome A, Antarctica in January 2012. The pixel size is 9 μm, corresponding to 1 arcsec on the focal plane. The CCD runs without a shutter, in frame-transfer mode, and is cooled by a thermoelectric cooler (TEC) to take advantage of the low air temperature at Dome A. We tested the performance of the camera in detail, including the gain, linearity, readout noise, dark current, charge transfer efficiency, etc. Because this camera is designed to work at Dome A, where the air temperature can drop to -80 °C in winter, we tested cooling not only the CCD chip but also the controller, which is usually operated at normal temperatures for ground-based telescopes. We found that the performance of the camera changes only slightly when the controller is cooled.
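    Gain tests of the kind listed are commonly performed with the photon-transfer (flat-pair) method; whether the authors used exactly this method is not stated, so the sketch below is a generic illustration assuming shot-noise-limited flat fields:

```python
import numpy as np

def gain_from_flat_pair(flat1, flat2):
    # Gain (e-/ADU) = mean / variance for Poisson-limited flats.
    # Differencing the pair cancels fixed-pattern noise; the factor of 2
    # restores the single-frame variance.
    mean_adu = 0.5 * (flat1.mean() + flat2.mean())
    var_adu = np.var(flat1 - flat2) / 2.0
    return mean_adu / var_adu
```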

  14. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near-IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
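    The quoted pixel scales and array size are mutually consistent with the stated FOVs, as a small-angle check shows (lens distortion ignored):

```python
import math

def fov_degrees(pixel_scale_mrad, n_pixels):
    # Full field of view across the array, small-angle approximation.
    return math.degrees(pixel_scale_mrad * 1e-3 * n_pixels)

navcam = fov_degrees(0.82, 1024)  # close to the stated 45-degree FOV
hazcam = fov_degrees(2.1, 1024)   # close to the stated 124-degree FOV
```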

  15. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15 μm x 15 μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  16. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel-1. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
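    The 570-megapixel figure follows directly from the CCD inventory quoted in both abstracts:

```python
# 62 imaging CCDs of 2k x 4k plus 12 guide/focus CCDs of 2k x 2k
imaging_pixels = 62 * 2048 * 4096
guide_pixels = 12 * 2048 * 2048
total_pixels = imaging_pixels + guide_pixels  # about 570 million pixels
```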

  17. Use of a high-resolution profiling sonar and a towed video camera to map a Zostera marina bed, Solent, UK

    NASA Astrophysics Data System (ADS)

    Lefebvre, A.; Thompson, C. E. L.; Collins, K. J.; Amos, C. L.

    2009-04-01

    Seagrasses are flowering plants that develop into extensive underwater meadows and play a key role in the coastal ecosystem. In the last few years, several techniques have been developed to map and monitor seagrass beds in order to protect them. Here, we present the results of a survey using a profiling sonar, the Sediment Imager Sonar (SIS), and a towed video sledge to study a Zostera marina bed in the Solent, southern UK. The survey aimed to test the instruments for seagrass detection and to describe the area for the first time. In the acoustic data, the bed produced the strongest backscatter along a beam. High backscatter above the bottom indicated the presence of seagrass. The results of an algorithm developed to detect seagrass from the sonar data were tested against video data. Four parameters were calculated from the SIS data: water depth, a Seagrass Index (average backscatter 10-15 cm above the bed), canopy height (height above the bed where the backscatter crosses a threshold limit) and patchiness (percentage of beams in a sweep where the backscatter 10-15 cm above the bed is greater than a threshold limit). From the video, Zostera density was estimated together with macroalgae abundance and bottom type. Patchiness calculated from the SIS data was strongly correlated to seagrass density evaluated from the video, indicating that this parameter could be used for seagrass detection. The survey area has been classified based upon seagrass density, macroalgae abundance and bottom type. Only a small area was occupied by a dense canopy whereas most of the survey area was characterised by patchy seagrass. Results indicated that Zostera marina developed only on sandy bottoms and was not found in regions of gravel. Furthermore, it was limited to a depth shallower than 1.5 m below the level of Lowest Astronomical Tide and present in small patches across the intertidal zone. The average canopy height was 15 cm and the highest density was 150 shoots m⁻².
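    The patchiness parameter defined above reduces to a thresholded fraction of beams per sweep; a minimal sketch with an illustrative backscatter array:

```python
import numpy as np

def patchiness_percent(backscatter_10_15cm, threshold):
    # Percentage of beams in a sweep whose backscatter 10-15 cm above
    # the bed exceeds the threshold, per the parameter defined above.
    return 100.0 * float(np.mean(backscatter_10_15cm > threshold))
```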

  18. Video sensor with range measurement capability

    NASA Technical Reports Server (NTRS)

    Briscoe, Jeri M. (Inventor); Corder, Eric L. (Inventor); Howard, Richard T. (Inventor); Broderick, David J. (Inventor)

    2008-01-01

    A video sensor device is provided which incorporates a rangefinder function. The device includes a single video camera and a fixed laser spaced a predetermined distance from the camera for, when activated, producing a laser beam. A diffractive optic element divides the beam so that multiple light spots are produced on a target object. A processor calculates the range to the object based on the known spacing and angles determined from the light spots on the video images produced by the camera.
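    The range calculation is plain triangulation from the known camera-laser spacing. The sketch below is a simplification that assumes the laser beam runs parallel to the optical axis, which is one common configuration rather than the patent's exact geometry:

```python
import math

def range_from_spot_angle(baseline_m, spot_angle_rad):
    # The laser spot appears at an angular offset from the optical axis
    # in the image; with a beam parallel to that axis, the target range
    # is baseline / tan(offset angle).
    return baseline_m / math.tan(spot_angle_rad)
```

A spot seen at the angle subtended by a 0.1 m baseline at 2 m, for example, triangulates back to a 2 m range, and the diffractive element's multiple spots allow the same computation at several points on the target.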

  19. IR Hiding: A Method to Prevent Video Re-shooting by Exploiting Differences between Human Perceptions and Recording Device Characteristics

    NASA Astrophysics Data System (ADS)

    Yamada, Takayuki; Gohshi, Seiichi; Echizen, Isao

    A method is described to prevent images and videos displayed on screens from being re-shot by digital cameras and camcorders. Conventional methods using digital watermarking for re-shooting prevention embed content IDs into images and videos, and they help to identify the place and time where the actual content was shot. However, these methods do not actually prevent digital content from being re-shot by camcorders. We developed countermeasures to stop re-shooting by exploiting the differences between the sensory characteristics of humans and devices. The countermeasures require no additional functions on user-side devices. The method uses infrared light (IR) to corrupt the content recorded by CCD or CMOS devices, so that re-shot content will be unusable. To validate the method, we developed a prototype system and implemented it on a 100-inch cinema screen. Experimental evaluations showed that the method effectively prevents re-shooting.

  20. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.
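    The gain rule described (a single streak-tube photoelectron recorded at about five times the recording-system noise) translates into a simple dynamic-range estimate. The sketch below is illustrative arithmetic, not the paper's procedure, and the full-scale value is an assumed example:

```python
def single_pe_target(system_noise, k=5.0):
    # Recorded exposure for one photoelectron: k x system noise (k ~ 5
    # per the optimization described above).
    return k * system_noise

def pe_dynamic_range(full_scale, system_noise, k=5.0):
    # Photoelectrons recordable before the recording system saturates,
    # under that gain setting.
    return full_scale / single_pe_target(system_noise, k)
```

For an assumed 12-bit recorder (full scale 4095 counts) with 2-count rms noise, this gives roughly 410 photoelectrons of dynamic range; raising the intensifier gain beyond the k ~ 5 point only consumes that range, which is the trade-off the procedure optimizes.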