Science.gov

Sample records for video ccd camera

  1. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  2. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  3. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated with low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronic devices that emit infrared light during operation. CCD camera also used to identify very clearly parts that have failed, where luminescence typically found.

  4. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development continues on advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  5. Biofeedback control analysis using a synchronized system of two CCD video cameras and a force-plate sensor

    NASA Astrophysics Data System (ADS)

    Tsuruoka, Masako; Shibasaki, Ryosuke; Murai, Shunji

    1999-01-01

    The biofeedback control analysis of human movement has become increasingly important in rehabilitation, sports medicine and physical fitness. In this study, a synchronized system was developed for acquiring sequential data of a person's movement. The setup employs a video recorder system linked with two CCD video cameras and a force-plate sensor system, which are configured to stop and start simultaneously. The feedback-controlled movement of postural stability was selected as the subject for analysis. The person's center of body gravity (COG) was calculated from measured 3-D coordinates of major joints using videometry with bundle adjustment and self-calibration. The raw serial data of COG and of foot pressure measured by the force-plate sensor are difficult to analyze directly because of their complex fluctuations. Utilizing autoregressive modeling, the power spectrum and the impulse response of movement factors enable analysis of their dynamic relations. This new biomedical engineering approach provides efficient information for medical evaluation of a person's stability.
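
    As a rough illustration of the autoregressive approach described above, the sketch below fits AR(2) coefficients to a synthetic sway-like series via the Yule-Walker equations and locates the spectral peak of the fitted model. The data, model order, and parameter values are hypothetical, not taken from the paper:

```python
import numpy as np

def yule_walker_ar(x, order):
    """Estimate AR coefficients from biased autocorrelation estimates."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

# Synthetic stand-in for a centre-of-gravity sway record: a slow
# oscillation at 0.05 cycles/sample plus measurement noise
rng = np.random.default_rng(0)
t = np.arange(2000)
x = np.sin(2 * np.pi * 0.05 * t) + 0.05 * rng.standard_normal(t.size)

phi = yule_walker_ar(x, order=2)
# AR(2) power spectrum is proportional to 1 / |1 - phi1*z - phi2*z^2|^2
# with z = exp(-i 2 pi f); its peak recovers the dominant sway frequency
f = np.linspace(0.01, 0.5, 500)
z = np.exp(-2j * np.pi * f)
peak_f = f[np.argmin(np.abs(1 - phi[0] * z - phi[1] * z ** 2) ** 2)]
```

    With a clean oscillation the spectral peak of the fitted model lands near the true 0.05 cycles/sample, which is the kind of compact frequency-domain summary the authors extract from the raw COG fluctuations.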

  6. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  7. Wide Dynamic Range CCD Camera

    NASA Astrophysics Data System (ADS)

    Younse, J. M.; Gove, R. J.; Penz, P. A.; Russell, D. E.

    1984-11-01

    A liquid crystal attenuator (LCA) operated as a variable neutral density filter has been attached to a charge-coupled device (CCD) imager to extend the dynamic range of a solid-state TV camera by an order of magnitude. Many applications are best served by a camera with a dynamic range of several thousand. For example, outside security systems must operate unattended with "dawn-to-dusk" lighting conditions. Although this can be achieved with available auto-iris lens assemblies, more elegant solutions which provide the small size, low power, and high reliability advantages of solid state technology are now available. This paper will describe one such unique way of achieving these dynamic ranges using standard optics by making the CCD imager's glass cover a controllable neutral density filter. The liquid crystal attenuator's structure and theoretical properties for this application will be described along with measured transmittance. A small integrated TV camera which utilizes a "virtual-phase" CCD sensor coupled to an LCA will be described and test results for a number of the camera's optical and electrical parameters will be given. These include the following camera parameters: dynamic range, Modulation Transfer Function (MTF), spectral response, and uniformity. Also described will be circuitry which senses the ambient scene illuminance and automatically provides feedback signals to appropriately adjust the transmittance of the LCA. Finally, image photographs using this camera, under various scene illuminations, will be shown.
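
    The closed-loop behaviour described (sense scene illuminance, feed back a transmittance adjustment to the attenuator) can be sketched with a toy model. The set point, gain, units, and attenuator range below are hypothetical, not the paper's values:

```python
def settle_attenuator(illuminance, setpoint=0.5, full_well=1.0,
                      t_min=1e-4, t_max=1.0, steps=50):
    """Drive a variable attenuator so the sensed signal approaches the set point."""
    t = t_max
    for _ in range(steps):
        signal = min(illuminance * t, full_well)   # sensor clips at full well
        t = t * (setpoint / signal) ** 0.5         # damped multiplicative update
        t = min(max(t, t_min), t_max)              # attenuator's physical range
    return t

for lux in (0.6, 10.0, 500.0):   # "dawn-to-dusk" span, arbitrary units
    t = settle_attenuator(lux)
    print(f"illuminance {lux:6.1f} -> transmittance {t:.4f}")
```

    A multiplicative update settles in a few tens of steps even when the sensor starts saturated, which is why the loop holds the signal near the set point across roughly three decades of illuminance.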

  8. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.
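
    To see what the two epochs quoted above imply, one can convert each (separation, position angle) pair to Cartesian offsets using the standard convention (position angle measured from north through east); the helper itself is just an illustration, not part of the chapter:

```python
import math

def pa_sep_to_xy(sep_arcsec, pa_deg):
    """Convert separation/position angle to (east, north) offsets in arcsec."""
    pa = math.radians(pa_deg)
    return sep_arcsec * math.sin(pa), sep_arcsec * math.cos(pa)

e1906, n1906 = pa_sep_to_xy(4.8, 186.2)   # 1906: almost due south
e1918, n1918 = pa_sep_to_xy(6.0, 162.4)   # 1918: south-by-southeast
motion = math.hypot(e1918 - e1906, n1918 - n1906)  # total shift, arcsec
```

    The secondary's apparent position shifted by roughly 2.5 arcsec between 1906 and 1918; further epochs plotted in these coordinates reveal whether the path is straight (chance alignment) or curved (orbital motion).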

  9. 3-D eye movement measurements on four Comex's divers using video CCD cameras, during high pressure diving.

    PubMed

    Guillemant, P; Ulmer, E; Freyss, G

    1995-01-01

    Previous studies have shown the vulnerability of the vestibular system to barotrauma (1), and deep diving may induce immediate neurological changes (2). These extreme conditions (high pressure, limited examination time, restricted space, hydrogen-oxygen mixture, communication difficulties, etc.) require adapted technology and a fast associated experimental procedure. We were able to solve these problems by developing a new system for on-line analysis of 3-D ocular movements by means of a video camera. This analyser uses image processing and form-recognition software that allows non-invasive video-frequency calculation of eye movements, including the torsional component. As this system is immediately ready for use, we were able to perform the subsequent examinations in a maximum time of 8 min for each diver: oculomotor tests including saccadic, slow and optokinetic traditional automatic measurements; vestibular tests regarding spontaneous and positional nystagmus, and reactional nystagmus in the pendular test. For pendular induced nystagmus we used appropriate head positions to stimulate the lateral and the posterior semicircular canals separately, and we measured the gain by operating successively in visible light and complete darkness. Recordings were done during a simulated onshore dive to an ambient pressure corresponding to a depth of 350 m. The above examinations were completed on the first and last days by caloric tests with the same video analyser system. The results of the investigations demonstrated perfect tolerance of the oculomotor and vestibular systems of these 4 divers, thus fulfilling the preventive conditions defined by Comex Co. We were able to overcome the limitations due to low-cost PC computer operation and cameras (the necessity of adaptation to pressure, focus difficulties and direct-light eye reflections). We still obtained accurate on-line measurements, even of the torsional component of the eye movement.
Due to this technological efficiency

  10. Linejitter and geometric calibration of CCD-cameras

    NASA Astrophysics Data System (ADS)

    Beyer, Horst A.

    Precise radiometric and geometric transmission of images from CCD sensor to memory is a fundamental aspect of CCD camera calibration. Linejitter and other degradations occurring during transmission are major limiting factors of the precision attainable with most current CCD cameras and framegrabbers. The video signal, synchronisation signals and principal electronic components involved in synchronisation and transmission are analysed, and their influence on linejitter is discussed. A method for signal transmission that eliminates linejitter and other degradations is shown. Methods for the determination and correction of linejitter are discussed.
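
    One way to picture what linejitter does, and how a correction can be estimated, is to model each video line as a reference pattern displaced by a per-line offset and recover the offsets by circular cross-correlation. This sketch is purely illustrative and not the method of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
width, n_lines = 200, 50
reference = rng.standard_normal(width)       # a non-repeating reference line
jitter = rng.integers(-3, 4, size=n_lines)   # per-line offsets in pixels
lines = np.stack([np.roll(reference, s) for s in jitter])

def estimate_shift(row, ref):
    """Locate the peak of the circular cross-correlation (FFT-based)."""
    corr = np.fft.ifft(np.fft.fft(row) * np.conj(np.fft.fft(ref))).real
    k = int(np.argmax(corr))
    return k if k <= len(ref) // 2 else k - len(ref)

recovered = np.array([estimate_shift(row, reference) for row in lines])
# shifting each line back by its recovered offset would realign the image
```

    Sub-pixel jitter, which is what real framegrabbers exhibit, needs interpolation around the correlation peak, but the integer case already shows how a line-locked reference exposes the offsets.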

  11. The CTIO CCD-TV acquisition camera

    NASA Astrophysics Data System (ADS)

    Walker, Alistair R.; Schmidt, Ricardo

    A prototype CCD-TV camera has been built at CTIO, conceptually similar to the cameras in use at Lick Observatory. A GEC CCD is used as the detector, cooled thermo-electrically to -45 °C. Pictures are displayed via an IBM PC clone computer and an ITI image display board. Results of tests at the CTIO telescopes are discussed, including comparisons with the RCA ISIT cameras used at present for acquisition and guiding.

  12. Application of the CCD camera in medical imaging

    NASA Astrophysics Data System (ADS)

    Chu, Wei-Kom; Smith, Chuck; Bunting, Ralph; Knoll, Paul; Wobig, Randy; Thacker, Rod

    1999-04-01

    Medical fluoroscopy is a set of radiological procedures used in medical imaging for functional and dynamic studies of the digestive system. Major components in the imaging chain include an image intensifier, which converts x-ray information into an intensity pattern on its output screen, and a CCTV camera, which converts the output-screen intensity pattern into video information to be displayed on a TV monitor. Responding properly to such a wide dynamic range in real time, as a fluoroscopy procedure demands, is very challenging. Also, as in all other medical imaging studies, detail resolution is of great importance; without proper contrast, spatial resolution is compromised. The many inherent advantages of the CCD make it a suitable choice for dynamic studies. Recently, CCD cameras have been introduced as the camera of choice for medical fluoroscopy imaging systems. The objective of our project was to investigate a newly installed CCD fluoroscopy system in the areas of contrast resolution, detail, and radiation dose.

  13. Solid state television camera (CCD-buried channel)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The development of an all solid state television camera, which uses a buried channel charge coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array is utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control (i.e., ALC and AGC) techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.

  14. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D.

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  15. Vacuum compatible miniature CCD camera head

    SciTech Connect

    Conder, A.D.

    2000-06-20

    A charge-coupled device (CCD) camera head is disclosed which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04 inches for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  16. Integration design of FPGA software for a miniaturizing CCD remote sensing camera

    NASA Astrophysics Data System (ADS)

    Yin, Na; Li, Qiang; Rong, Peng; Lei, Ning; Wan, Min

    2014-09-01

    The video signal processor (VSP) is an important part of a CCD remote sensing camera and the key to its lightweight, miniaturized design. FPGAs are applied to improve the level of integration and simplify the video signal processor circuit. This paper introduces in detail an integrated FPGA software design for the video signal processor of a space remote sensing camera. The design accomplishes CCD timing control, integration-time control, CCD data formatting, and CCD image processing and correction on a single FPGA chip, which resolves the problem of miniaturizing the video signal processor in remote sensing cameras. This camera has since launched successfully and obtained high-quality remote sensing images, contributing to the miniaturization of remote sensing cameras.

  17. Automatic stage calibration with CCD cameras

    NASA Astrophysics Data System (ADS)

    Ge, Renyan

    1994-03-01

    The instrumental errors of an analytical plotter derive mainly from its stage, so stage calibration is a very important quality index for evaluating the measurement accuracy of the analytical plotter. With the help of CCD images, high-precision positioning and measurement have become basic capabilities of machine vision and real-time photogrammetry systems. A software system that employs image processing algorithms for automatic stage calibration with a CCD camera, based on an analytical plotter researched and developed by SOKKIA, is discussed. Its reliability and validity are also discussed.

  18. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of a video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  19. Typical effects of laser dazzling CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhang, Jianmin; Shao, Bibo; Cheng, Deyan; Ye, Xisheng; Feng, Guobin

    2015-05-01

    In this article, an overview of laser dazzling effects on buried-channel CCD cameras is given. CCDs are sorted into staring and scanning types; the former includes frame-transfer and interline-transfer types, the latter linear and time-delay-integration types. All CCDs must perform four primary tasks in generating an image: charge generation, charge collection, charge transfer and charge measurement. In a camera, a lens is needed to deliver the optical signal to the CCD sensor, and techniques for suppressing stray light are used in the lens; electronic circuits, employing many electronic techniques, are needed to process the CCD output signal. The dazzling effects are the joint result of light-distribution distortion and charge-distribution distortion, which derive from the lens and the sensor, respectively. Strictly speaking, the light distribution is not distorted in the lens: in general a lens is so well designed and fabricated that its stray light can be neglected, but a laser is intense enough to make the stray light obvious. In the CCD image sensor, a laser can induce very large charge generation; charge-transfer inefficiency and charge blooming then distort the charge distribution. Commonly, the largest signal output from the CCD sensor is restricted by the capacity of the CCD collection well and cannot exceed the dynamic range within which the subsequent electronic circuits maintain normal operation, so the signal is not distorted in the post-processing circuits. However, some techniques in the circuits can make dazzling effects present different phenomena in the final image.
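
    The charge blooming mentioned above can be caricatured in a few lines: charge above a pixel's full well spills into its vertical neighbours along the transfer column. The numbers are arbitrary and the model is deliberately crude, not the paper's analysis:

```python
import numpy as np

def bloom_column(charge, full_well=100_000.0):
    """Redistribute charge above full well to vertical neighbours (toy model)."""
    col = np.asarray(charge, dtype=float).copy()
    for _ in range(100):                       # iterate until the spill settles
        excess = np.maximum(col - full_well, 0.0)
        if not excess.any():
            break
        col -= excess                          # each pixel keeps its full well
        col[:-1] += 0.5 * excess[1:]           # half the spill goes up ...
        col[1:] += 0.5 * excess[:-1]           # ... and half goes down
    return np.minimum(col, full_well)

column = np.full(9, 1_000.0)
column[4] = 500_000.0     # a "laser spot": five times full well in one pixel
bloomed = bloom_column(column)
# the overload saturates the centre pixel and brightens its neighbours
```

    Even this crude model reproduces the signature effect: a single overdriven pixel turns into a saturated vertical streak, which is why blooming distorts the charge distribution rather than just the one dazzled pixel.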

  20. Design, development, and performance of the STEREO SECCHI CCD cameras

    NASA Astrophysics Data System (ADS)

    Waltham, Nick; Eyles, Chris

    2007-09-01

    We report the design, development and performance of the SECCHI (Sun Earth Connection Coronal and Heliospheric Investigation) CCD camera electronics on NASA's Solar Terrestrial Relations Observatory (STEREO). STEREO consists of two nearly identical space-based observatories; one ahead of Earth in its orbit, the other trailing behind to provide the first-ever stereoscopic (3D) measurements to study the Sun and the nature of its coronal mass ejections. The SECCHI instrument suite consists of five telescopes that will observe the solar corona, and inner heliosphere all the way from the surface of the Sun to the orbit of the Earth, and beyond. Each telescope contains a large-format science-grade CCD; two within the Heliospheric Imager (HI) instrument, and three in a separate instrument package (SCIP) consisting of two coronagraphs and an EUV imager. The CCDs are operated from two Camera Electronics Boxes. Constraints on the size, mass, and power available for the camera electronics required the development of a miniaturised solution employing digital and mixed-signal ASICs, FPGAs, and compact surface-mount construction. Operating more than one CCD from a single box also provides economy on the number of DC-DC converters and interface electronics required. We describe the requirements for the overall design and implementation, and in particular the design and performance of the camera's space-saving mixed-signal CCD video processing ASIC. The performance of the camera is reviewed together with sample images obtained since the STEREO mission was successfully launched on October 25, 2006 from Cape Canaveral.

  1. Design of a multifunction astronomical CCD camera

    NASA Astrophysics Data System (ADS)

    Yao, Dalei; Wen, Desheng; Xue, Jianru; Chen, Zhi; Wen, Yan; Jiang, Baotan; Xi, Jiangbo

    2015-07-01

    To satisfy the requirements of astronomical observation, a novel timing sequence for a frame-transfer CCD is proposed. Multiple functions, such as adjustment of the work pattern, exposure time and frame frequency, are achieved. There are four work patterns: normal, standby, zero exposure and test. The exposure-time adjustment can set multiple exposure times according to the astronomical observation. The frame frequency can be adjusted when a dark target is imaged and the maximum exposure time cannot satisfy the requirement. In the video-processing design, offset correction and adjustment of multiple gains are proposed. Offset correction is used to eliminate the fixed-pattern noise of the CCD, and a three-gain pattern can improve the signal-to-noise ratio of astronomical observation. Finally, images in different situations are collected and the system readout noise is calculated. The calculation results show that the designs in this paper are practicable.

  2. Event Pileup in AXAF's ACIS CCD Camera

    NASA Technical Reports Server (NTRS)

    McNamara, Brian R.

    1998-01-01

    AXAF's high resolution mirrors will focus a point source near the optical axis to a spot that is contained within a radius of about two pixels on the ACIS Charge Coupled Device (CCD) camera. Because of the small spot size, the accuracy to which fluxes and spectral energy distributions of bright point sources can be measured will be degraded by event pileup. Event pileup occurs when two or more X-ray photons arrive simultaneously in a single detection cell on a CCD readout frame. When pileup occurs, ACIS's event detection algorithm registers the photons as a single X-ray event. The pulse height channel of the event will correspond to an energy E ≈ E_1 + E_2 + ... + E_n, where n is the number of photons registered per detection cell per readout frame. As a result, pileup artificially hardens the observed spectral energy distribution. I will discuss the effort at the AXAF Science Center to calibrate pileup in ACIS using a focused, nearly monochromatic X-ray source. I will discuss techniques for modeling and correcting pileup effects in polychromatic spectra.
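
    A quick Monte-Carlo sketch shows the hardening effect described above: if photon arrivals per detection cell per frame are Poisson, frames with two or more photons register as single events with summed energy. The source energy and arrival rate below are illustrative, not calibration values:

```python
import numpy as np

rng = np.random.default_rng(1)
true_energy = 1.0     # keV, monochromatic source (illustrative)
rate = 0.3            # mean photons per detection cell per readout frame
counts = rng.poisson(rate, size=100_000)

events = counts[counts > 0]            # each such frame yields one registered event
registered = events * true_energy      # pileup sums the photon energies
piled_fraction = np.mean(events > 1)   # events built from 2+ photons
mean_registered = registered.mean()    # exceeds the true 1.0 keV
```

    For this rate the analytic mean registered energy is lambda / (1 - exp(-lambda)) ≈ 1.16 keV with about 14% of events piled, so the spectrum hardens even though the source is monochromatic.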

  3. High-speed optical shutter coupled to fast-readout CCD camera

    NASA Astrophysics Data System (ADS)

    Yates, George J.; Pena, Claudine R.; McDonald, Thomas E., Jr.; Gallegos, Robert A.; Numkena, Dustin M.; Turko, Bojan T.; Ziska, George; Millaud, Jacques E.; Diaz, Rick; Buckley, John; Anthony, Glen; Araki, Takae; Larson, Eric D.

    1999-04-01

    A high-frame-rate, optically shuttered CCD camera for radiometric imaging of transient optical phenomena has been designed, and several prototypes fabricated, which are now in the evaluation phase. The camera design incorporates stripline-geometry image intensifiers for ultrafast image shutters capable of 200 ps exposures. The intensifiers are fiber-optically coupled to a multiport CCD capable of 75 MHz pixel clocking to achieve a 4 kHz frame rate for 512 x 512 pixels from simultaneous readout of 16 individual segments of the CCD array. The intensifier, a Philips XX1412MH/E03, is generically a Generation II proximity-focused microchannel plate intensifier (MCPII) redesigned for high speed gating by Los Alamos National Laboratory and manufactured by Philips Components. The CCD is a Reticon HSO512 split-storage device with bi-directional vertical readout architecture. The camera main frame is designed utilizing a multilayer motherboard for transporting CCD video signals and clocks via embedded stripline buses designed for 100 MHz operation. The MCPII gate duration and gain variables are controlled and measured in real time and updated for data logging each frame, with 10-bit resolution, selectable either locally or by computer. The camera provides both analog and 10-bit digital video. The camera's architecture, salient design characteristics, and current test data depicting resolution, dynamic range, shutter sequences, and image reconstruction will be presented and discussed.

  4. Ultrahigh-speed, high-sensitivity color camera with 300,000-pixel single CCD

    NASA Astrophysics Data System (ADS)

    Kitamura, K.; Arai, T.; Yonai, J.; Hayashida, T.; Ohtake, H.; Kurita, T.; Tanioka, K.; Maruyama, H.; Namiki, J.; Yanagi, T.; Yoshida, T.; van Kuijk, H.; Bosiers, Jan T.; Etoh, T. G.

    2007-01-01

    We have developed an ultrahigh-speed, high-sensitivity portable color camera with a new 300,000-pixel single CCD. The 300,000-pixel CCD, which has four times the number of pixels of our initial model, was developed by seamlessly joining two 150,000-pixel CCDs. A green-red-green-blue (GRGB) Bayer filter is used to realize a color camera with the single-chip CCD. The camera is capable of ultrahigh-speed video recording at up to 1,000,000 frames/sec, and is small enough to be handheld. We also developed a technology for dividing the CCD output signal to enable parallel, high-speed readout and recording in external memory; this makes possible long, continuous shots up to 1,000 frames/second. In an experiment, video footage was captured at an athletics meet. Because of the high-speed shooting, even detailed movements of the athletes' muscles were captured. This camera can capture clear slow-motion videos, enabling previously impossible live footage to be captured for various TV broadcasting programs.

  5. Streak Camera Performance with Large-Format CCD Readout

    SciTech Connect

    Lerche, R A; Andrews, D S; Bell, P M; Griffith, R L; McDonald, J W; Torres, P III; Vergel de Dios, G

    2003-07-08

    The ICF program at Livermore has a large inventory of optical streak cameras that were built in the 1970s and 1980s. The cameras include micro-channel plate image-intensifier tubes (IIT) that provide signal amplification and early lens-coupled CCD readouts. Today, these cameras are still very functional, but some replacement parts such as the original streak tube, CCD, and IIT are scarce and obsolete. This article describes recent efforts to improve the performance of these cameras using today's advanced CCD readout technologies. Very sensitive, large-format CCD arrays with efficient fiber-optic input faceplates are now available for direct coupling with the streak tube. Measurements of camera performance characteristics including linearity, spatial and temporal resolution, line-spread function, contrast transfer ratio (CTR), and dynamic range have been made for several different camera configurations: CCD coupled directly to the streak tube, CCD directly coupled to the IIT, and the original configuration with a smaller CCD lens coupled to the IIT output. Spatial resolution (limiting visual) with and without the IIT is 8 and 20 lp/mm, respectively, for photocathode current density up to 25% of the Child-Langmuir (C-L) space-charge limit. Temporal resolution (fwhm) deteriorates by about 20% when the cathode current density reaches 10% of the C-L space charge limit. Streak tube operation with large average tube current was observed by illuminating the entire slit region through a Ronchi ruling and measuring the CTR. Sensitivity (CCD electrons per streak tube photoelectron) for the various configurations ranged from 7.5 to 2,700 with read noise of 7.5 to 10.5 electrons. Optimum spatial resolution is achieved when the IIT is removed. Maximum dynamic range requires a configuration where a single photoelectron from the photocathode produces a signal that is 3 to 5 times the read noise.

  6. Printed circuit board for a CCD camera head

    DOEpatents

    Conder, Alan D.

    2002-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  7. Video camera use at nuclear power plants

    SciTech Connect

    Estabrook, M.L.; Langan, M.O.; Owen, D.E. )

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations, and determine equipment used and the benefits realized. Basic closed circuit television camera (CCTV) systems are described and video camera operation principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use --- mainly reduced radiation exposure and increased productivity --- are discussed and quantified. 15 refs., 6 figs.

  8. Compression of CCD raw images for digital still cameras

    NASA Astrophysics Data System (ADS)

    Sriram, Parthasarathy; Sudharsanan, Subramania

    2005-03-01

    Lossless compression of raw CCD images captured using color filter arrays has several benefits. The benefits include improved storage capacity, reduced memory bandwidth, and lower power consumption for digital still camera processors. The paper discusses the benefits in detail and proposes the use of a computationally efficient block adaptive scheme for lossless compression. Experimental results are provided that indicate that the scheme performs well for CCD raw images attaining compression factors of more than two. The block adaptive method also compares favorably with JPEG-LS. A discussion is provided indicating how the proposed lossless coding scheme can be incorporated into digital still camera processors enabling lower memory bandwidth and storage requirements.
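
    The gain available from predictive coding of CCD raw data can be illustrated by comparing the entropy of simple prediction residuals against the raw bit depth. The synthetic 10-bit plane and the first-order horizontal predictor below are hypothetical stand-ins for the paper's block-adaptive scheme:

```python
import numpy as np

def residual_entropy_bits(plane):
    """Shannon entropy (bits/pixel) of horizontal first-difference residuals."""
    resid = np.diff(plane.astype(np.int32), axis=1)
    _, counts = np.unique(resid, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Synthetic smooth 10-bit plane standing in for one Bayer colour plane;
# real CFA data would be split into colour planes before prediction
rng = np.random.default_rng(0)
y, x = np.mgrid[0:256, 0:256]
plane = (512 + 300 * np.sin(x / 40) + 200 * np.cos(y / 60)
         + rng.integers(-2, 3, (256, 256))).astype(np.int32)

bits = residual_entropy_bits(plane)
factor = 10.0 / bits    # ideal compression factor vs. 10-bit raw storage
```

    Because neighbouring same-colour samples are highly correlated, the residual entropy falls well below the 10-bit raw depth, consistent with the compression factors above two reported in the paper; a block-adaptive coder additionally retunes its code parameters per block to track local statistics.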

  9. Low-noise video amplifiers for imaging CCD's

    NASA Technical Reports Server (NTRS)

    Scinicariello, F.

    1976-01-01

    Various techniques were developed which enable the CCD (charge coupled device) imaging array user to obtain optimum performance from the device. A CCD video channel was described, and detector-preamplifier interface requirements were examined. A noise model for the system was discussed at length and laboratory data presented and compared to predicted results.

  10. Prospects for a Wide Field CCD Camera Aboard NGST

    NASA Astrophysics Data System (ADS)

    Golimowski, D. A.; Ford, H. C.; Tsvetanov, Z. I.; Burrows, C. J.; Krist, J. E.; White, R. L.; Clampin, M.; Rafal, M.; Hartig, G.

    1998-05-01

    The importance of a Next Generation Space Telescope (NGST) for studying the infrared universe has often overshadowed NGST's potential benefit to optical astronomy. As currently envisioned, NGST could also provide views of the visible universe with resolution and sensitivity that are unmatched by any existing ground- or space-based observatory. We discuss the scientific advantages and technical feasibility of placing a wide-field CCD camera aboard NGST. Using simulated data, we compare the imaging performance of such a camera with that achieved or expected with the Keck Telescope and the HST Advanced Camera for Surveys. Finally, we discuss the technical challenges of temperature regulation and radiation shielding for a CCD camera in the NGST environment.

  11. Driving techniques for high frame rate CCD camera

    NASA Astrophysics Data System (ADS)

    Guo, Weiqiang; Jin, Longxu; Xiong, Jingwu

    2008-03-01

    This paper describes a high frame rate CCD camera capable of operating at 100 frames/s. The camera uses the Kodak KAI-0340, an interline transfer CCD with 640 (horizontal) × 480 (vertical) pixels. Two output ports are used to read out the CCD data, with pixel rates approaching 30 MHz. Because the vertical charge-transfer registers of an interline transfer CCD are not perfectly opaque, the device can produce undesired image artifacts such as random white spots and smear generated in the registers. In addition, the speed-up structure incorporated in the KAI-0340 to increase frame rate makes it vulnerable to a vertical stripe effect. These artifacts can severely impair image quality, so several electronic methods are adopted to eliminate them. A special clocking mode dumps the unwanted charge quickly, and a fast readout of the images, cleared of smear, follows immediately. An amplifier senses and corrects the delay mismatch between the dual-phase vertical clock pulses so that the transition edges become nearly coincident and the vertical stripes disappear. Results obtained with the CCD camera are shown.
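
The quoted frame and pixel rates can be sanity-checked with simple arithmetic (overheads such as blanking and dummy pixels are ignored here):

```python
def pixel_rates(h_pixels, v_pixels, fps, ports):
    """Aggregate pixel rate needed for a target frame rate, and the
    share carried by each output port of a multi-port readout."""
    aggregate = h_pixels * v_pixels * fps
    return aggregate, aggregate / ports

total, per_port = pixel_rates(640, 480, 100, ports=2)
```

640 x 480 pixels at 100 frames/s is an aggregate of about 30.7 Mpixel/s, consistent with "pixel rates approaching 30 MHz"; split over the two ports, each carries roughly 15.4 Mpixel/s.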

  12. Visual enhancement of laparoscopic nephrectomies using the 3-CCD camera

    NASA Astrophysics Data System (ADS)

    Crane, Nicole J.; Kansal, Neil S.; Dhanani, Nadeem; Alemozaffar, Mehrdad; Kirk, Allan D.; Pinto, Peter A.; Elster, Eric A.; Huffman, Scott W.; Levin, Ira W.

    2006-02-01

    Many surgical techniques are currently shifting from the more conventional, open approach towards minimally invasive laparoscopic procedures. Laparoscopy results in smaller incisions, potentially leading to less postoperative pain and more rapid recoveries. One key disadvantage of laparoscopic surgery is the loss of three-dimensional assessment of organs and tissue perfusion. Advances in laparoscopic technology include high-definition monitors for improved visualization and the upgrade of single charge coupled device (CCD) detectors to 3-CCD cameras, which provide a larger, more sensitive color palette and increase the perception of detail. In this discussion, we further advance existing laparoscopic technology to create greater enhancement of images obtained during radical and partial nephrectomies, in which the assessment of tissue perfusion is crucial but limited with current 3-CCD cameras. By separating the signals received by each CCD in the 3-CCD camera and introducing a straightforward algorithm, rapid differentiation of renal vessels and perfusion is accomplished and could be performed in real time. The newly acquired images are overlaid onto conventional images for reference and comparison. This affords the surgeon the ability to accurately detect changes in tissue oxygenation despite inherent limitations of the visible light image. Such additional capability should impact procedures in which visual assessment of organ vitality is critical.
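
One way such a channel-separation algorithm could look is a normalized red-excess map; this is an illustrative sketch only, not the authors' published algorithm:

```python
import numpy as np

def red_excess(r, g, b, eps=1e-6):
    """Hypothetical channel-ratio enhancement: well-perfused
    (oxygenated) tissue reflects relatively more red light, so a
    normalized red excess highlights perfused regions. The r, g, b
    arrays are the three separated CCD channel images."""
    r, g, b = (c.astype(np.float64) for c in (r, g, b))
    return (r - (g + b) / 2.0) / (r + g + b + eps)
```

The normalization by total intensity makes the map insensitive to overall illumination, so a well-perfused region scores high while a pale gray region scores near zero.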

  13. Color measurements using a colorimeter and a CCD camera

    SciTech Connect

    Spratlin, T.L.; Simpson, M.L.

    1992-02-01

    Two new techniques are introduced for measuring the color content of printed graphic images with applications to web inspection such as color flaws and measurement of color quality. The techniques involve the development of algorithms for combining the information obtained from commercially available CCD color cameras and colorimeters to produce a colorimeter system with pixel resolution. 9 refs.

  14. A novel calibration method of CCD camera for LAMOST

    NASA Astrophysics Data System (ADS)

    Gu, Yonggang; Jin, Yi; Zhai, Chao

    2012-09-01

    The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST), with a 1.75 m diameter focal plane on which 4000 optical fibers are arranged, is one of the major scientific projects in China. During a LAMOST survey, the optical imaging system images the astrometric objects onto the focal plane, and the optical fiber positioning system aligns the 4000 fibers with these objects to obtain their spectra. To correct the positioning error of these optical fibers, a CCD camera is used to detect the fibers' positions by close-range photogrammetry. The calibration quality of the CCD camera is one of the most important factors for detection precision. However, camera calibration faces two problems in the field work of LAMOST. First, the camera parameters are not stable, owing to changes in the on-site work environment and to vibration during movement, so the CCD camera must be calibrated on line. Second, because the focal plane is very large, a large, high-precision calibration target would be needed; making such a target is difficult and costly, and it would be hard to mount on LAMOST because of space constraints. In this paper, an improved bundle adjustment self-calibration method is proposed to solve these two problems. Experimental results indicate that this novel calibration method needs only a few control points, whereas traditional calibration methods need many more control points to reach the same accuracy. The method can therefore realize on-line, high-precision calibration of the CCD camera for LAMOST.

  15. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many types of equipment. If such a system is tested using a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed. The large-aperture test system for infrared and visible CCD cameras shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces both the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position as the environmental temperature changes, which also improves the image quality of the wide-field collimator and the test accuracy. The system's performance matches that of comparable foreign products at a much lower cost, so it should find a good market.
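
The multiple-frame averaging step can be sketched as below; averaging N frames leaves the signal unchanged while reducing uncorrelated random noise by a factor of 1/sqrt(N):

```python
import numpy as np

def average_frames(frames):
    """Average a sequence of frames of identical shape. The signal
    is preserved; uncorrelated random noise falls as 1/sqrt(N)."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return stack.mean(axis=0)
```

Averaging 16 frames, for example, cuts the random noise to a quarter of its single-frame value.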

  16. Developments in the EM-CCD camera for OGRE

    NASA Astrophysics Data System (ADS)

    Tutt, James H.; McEntaffer, Randall L.; DeRoo, Casey; Schultz, Ted; Miles, Drew M.; Zhang, William; Murray, Neil J.; Holland, Andrew D.; Cash, Webster; Rogers, Thomas; O'Dell, Steve; Gaskin, Jessica; Kolodziejczak, Jeff; Evagora, Anthony M.; Holland, Karen; Colebrook, David

    2014-07-01

    The Off-plane Grating Rocket Experiment (OGRE) is a sub-orbital rocket payload designed to advance the development of several emerging technologies for use on space missions. The payload consists of a high resolution soft X-ray spectrometer based around an optic made from precision cut and ground, single crystal silicon mirrors, a module of off-plane gratings and a camera array based around Electron Multiplying CCD (EM-CCD) technology. This paper gives an overview of OGRE with emphasis on the detector array; specifically this paper will address the reasons that EM-CCDs are the detector of choice and the advantages and disadvantages that this technology offers.

  17. Design and application of TEC controller Using in CCD camera

    NASA Astrophysics Data System (ADS)

    Gan, Yu-quan; Ge, Wei; Qiao, Wei-dong; Lu, Di; Lv, Juan

    2011-08-01

    A thermoelectric cooler (TEC) is a solid-state heat pump based on the Peltier effect. It is small, light, and noiseless. The cooling capacity is proportional to the TEC working current when the temperature difference between the hot side and the cold side remains stable, and the heating and cooling can be controlled by changing the magnitude and direction of the current through the TEC. Thermoelectric cooling is therefore well suited to cooling CCD devices. The E2V scientific image sensor CCD47-20 integrates the TEC and the CCD in one package, which simplifies the electrical design. The software and hardware of the TEC controller are designed around the CCD47-20, which is packaged with an integral solid-state Peltier cooler. In the hardware, an 80C51 MCU is used as the CPU, and an 8-bit ADC and an 8-bit DAC form the closed control loop. The control quantity is computed by sampling the temperature from a thermistor in the CCD. The TEC is driven by a MOSFET in a constant-current driving circuit. In the software, improved control precision and convergence speed are obtained by using a PID control algorithm and tuning the proportional, integral, and differential coefficients. The results show that if the heat dissipation on the hot side of the TEC is good enough to keep the temperature stable, then with a 2 second sampling period the temperature control slew rate is 5°C/min, a temperature difference of -40°C can be reached, and the control precision achieves 0.3°C. When the hot side temperature is stable at °C, the CCD temperature can reach -°C, and the thermal noise of the CCD is less than 1 e-/pixel/s. The control system restricts the dark-current noise of the CCD and increases the SNR of the camera system.
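
A minimal sketch of such a PID temperature loop is shown below, closed around a toy first-order thermal plant. All gains and plant constants here are illustrative assumptions, not values from the paper:

```python
class PID:
    """Discrete PID with output clamping and conditional integration
    (anti-windup): the integral is frozen while the drive saturates."""
    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_err = None

    def step(self, setpoint, measured, dt):
        err = measured - setpoint          # positive -> need more cooling
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        if self.out_min < out < self.out_max:
            self.integral += err * dt      # integrate only when unsaturated
            out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return min(self.out_max, max(self.out_min, out))

def simulate_tec(setpoint=-40.0, ambient=20.0, dt=2.0, steps=1500):
    """Toy first-order thermal model: full TEC drive removes heat at
    1 degC/s; heat leaks back from ambient with a 100 s time constant."""
    pid = PID(kp=0.1, ki=0.01, kd=0.0)
    temp = ambient
    for _ in range(steps):
        u = pid.step(setpoint, temp, dt)   # TEC drive fraction, 0..1
        temp += dt * ((ambient - temp) / 100.0 - 1.0 * u)
    return temp
```

With the 2 s sampling period mentioned in the abstract, this loop settles at the setpoint; the integral term supplies the steady-state drive that the proportional term alone cannot.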

  18. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
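
The "slightly more than 136 years" figure is consistent with a free-running 32-bit counter ticking once per second, although the article does not state the counter width; the arithmetic:

```python
def rollover_years(counter_bits, tick_hz=1.0):
    """Years before a free-running binary time-code counter repeats."""
    seconds_per_year = 365.25 * 24 * 3600
    return (2 ** counter_bits / tick_hz) / seconds_per_year
```

A 32-bit one-second tick counter repeats after about 136.1 years, versus 24 hours for a conventional SMPTE-style time-of-day code.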

  19. Development of an all-in-one gamma camera/CCD system for safeguard verification

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Il; An, Su Jung; Chung, Yong Hyun; Kwak, Sung-Woo

    2014-12-01

    For the purpose of monitoring and verifying efforts at safeguarding radioactive materials in various fields, a new all-in-one gamma camera/charged coupled device (CCD) system was developed. This combined system consists of a gamma camera, which gathers energy and position information on gamma-ray sources, and a CCD camera, which identifies the specific location in a monitored area. Therefore, 2-D image information and quantitative information regarding gamma-ray sources can be obtained using fused images. A gamma camera consists of a diverging collimator, a 22 × 22 array CsI(Na) pixelated scintillation crystal with a pixel size of 2 × 2 × 6 mm3 and Hamamatsu H8500 position-sensitive photomultiplier tube (PSPMT). The Basler scA640-70gc CCD camera, which delivers 70 frames per second at video graphics array (VGA) resolution, was employed. Performance testing was performed using a Co-57 point source 30 cm from the detector. The measured spatial resolution and sensitivity were 4.77 mm full width at half maximum (FWHM) and 7.78 cps/MBq, respectively. The energy resolution was 18% at 122 keV. These results demonstrate that the combined system has considerable potential for radiation monitoring.

  20. Suppression of multiple scattering with a CCD camera detection scheme

    NASA Astrophysics Data System (ADS)

    Zakharov, Pavel; Schurtenberger, Peter; Scheffold, Frank

    2005-06-01

    We introduce a CCD camera detection scheme in dynamic light scattering that provides information on the single-scattered auto-correlation function even for fairly turbid samples. Our approach allows access to the extensive range of systems that show low-order scattering by selective detection of the singly scattered light. Model experiments on slowly relaxing suspensions of latex spheres in glycerol were carried out to verify validity range of our approach.

  1. CCD camera full range pH sensor array.

    PubMed

    Safavi, A; Maleki, N; Rostamzadeh, A; Maesum, S

    2007-01-15

    Changes in colors of an array of optical sensors that responds in full pH range were recorded using a CCD camera. The data of the camera were transferred to the computer through a capture card. Simple software was written to read the specific color of each sensor. In order to associate sensor array responses with pH values, a number of different mathematics and chemometrics methods were investigated and compared. The results show that the use of "Microsoft Excel's Solver" provides results which are in very good agreement with those obtained with chemometric methods such as artificial neural network (ANN) and partial least square (PLS) methods. PMID:19071333

  2. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (Inventor)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  3. Wide dynamic range video camera

    NASA Technical Reports Server (NTRS)

    Craig, G. D. (Inventor)

    1985-01-01

    A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

  4. High frame rate CCD camera with fast optical shutter

    SciTech Connect

    Yates, G.J.; McDonald, T.E. Jr.; Turko, B.T.

    1998-09-01

    A high frame rate CCD camera coupled with a fast optical shutter has been designed for high repetition rate imaging applications. The design uses state-of-the-art microchannel plate image intensifier (MCPII) technology fostered/developed by Los Alamos National Laboratory to support nuclear, military, and medical research requiring high-speed imagery. Key design features include asynchronous resetting of the camera to acquire random transient images, patented real-time analog signal processing with 10-bit digitization at 40--75 MHz pixel rates, synchronized shutter exposures as short as 200 ps, and sustained continuous readout of 512 x 512 pixels per frame at 1--5 Hz rates via parallel multiport (16-port CCD) data transfer. Salient characterization/performance test data for the prototype camera are presented, and temporally and spatially resolved images obtained from range-gated LADAR field testing are included. An alternative system configuration using several cameras sequenced to deliver discrete numbers of consecutive frames at effective burst rates up to 5 GHz (accomplished by time-phasing of consecutive MCPII shutter gates without overlap) is discussed. Potential applications, including dynamic radiography and optical correlation, are also presented.
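
The 5 GHz burst figure follows directly from the 200 ps gate width: phasing several cameras with consecutive, non-overlapping gates spaces the frames one gate width apart, independent of each camera's 1--5 Hz sustained readout. A small sketch of the arithmetic:

```python
def burst_sequence(gate_width_s, n_cameras):
    """Effective burst rate and record length when n shuttered
    cameras fire with consecutive, non-overlapping gates."""
    rate_hz = 1.0 / gate_width_s        # 200 ps gates -> 5 GHz
    frames = n_cameras                  # one frame per phased camera
    window_s = n_cameras * gate_width_s # total burst duration
    return rate_hz, frames, window_s
```

Four phased cameras with 200 ps gates thus capture four consecutive frames spanning only 800 ps.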

  5. Initial laboratory evaluation of color video cameras

    SciTech Connect

    Terry, P L

    1991-01-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than identify an intruder. Monochrome cameras are adequate for that application and were selected over color cameras because of their greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Color information is useful for identification purposes, and color camera technology is rapidly changing. Thus, Sandia National Laboratories established an ongoing program to evaluate color solid-state cameras. Phase one resulted in the publishing of a report titled 'Initial Laboratory Evaluation of Color Video Cameras' (SAND-91-2579). It gave a brief discussion of imager chips and color cameras and monitors, described the camera selection, detailed traditional test parameters and procedures, and gave the results of the evaluation of twelve cameras. In phase two, six additional cameras were tested by the traditional methods and all eighteen cameras were tested by newly developed methods. This report details both the traditional and newly developed test parameters and procedures, and gives the results of both evaluations.

  6. Initial laboratory evaluation of color video cameras

    NASA Astrophysics Data System (ADS)

    Terry, P. L.

    1991-12-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than identify an intruder. Monochrome cameras are adequate for that application and were selected over color cameras because of their greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Color information is useful for identification purposes, and color camera technology is rapidly changing. Thus, Sandia National Laboratories established an ongoing program to evaluate color solid-state cameras. Phase one resulted in the publishing of a report titled, 'Initial Laboratory Evaluation of Color Video Cameras (SAND--91-2579).' It gave a brief discussion of imager chips and color cameras and monitors, described the camera selection, detailed traditional test parameters and procedures, and gave the results of the evaluation of twelve cameras. In phase two, six additional cameras were tested by the traditional methods and all eighteen cameras were tested by newly developed methods. This report details both the traditional and newly developed test parameters and procedures, and gives the results of both evaluations.

  7. Design of 300 frames per second 16-port CCD video processing circuit

    NASA Astrophysics Data System (ADS)

    Yang, Shao-hua; Guo, Ming-an; Li, Bin-kang; Xia, Jing-tao; Wang, Qunshu

    2011-08-01

    It is hard to achieve speeds of hundreds of frames per second in high-resolution charge coupled device (CCD) cameras, because the pixel charge must be read out serially, which takes a lot of time. Multiple-port CCD technology is an efficient new way to realize high frame rate, high-resolution solid-state imaging systems: the pixel charge is read out through several ports in parallel, which decreases the readout time. However, the video processing circuit for a multiple-port CCD is hard to design, and real-time high-speed image data acquisition is also a knotty problem. A 16-port high frame rate CCD video processing circuit based on a Complex Programmable Logic Device (CPLD) and the VSP5010 has been developed around a specialized back-illuminated, 512 x 512 pixel, 400 fps (frames per second) frame transfer CCD sensor from Sarnoff Ltd. The CPLD produces a high-precision sample clock and timing, and accurate sampling of the CCD video voltage is achieved with Correlated Double Sampling (CDS) technology. Eight VSP5010 chips with CDS capability sample and digitize the CCD analog signals into 12-bit digital image data; the 16 analog CCD outputs are thus digitized into 192-bit-wide 6.67 MHz parallel data. The CPLD and Time Division Multiplexing (TDM) technology then encode the 192-bit-wide data into two 640 MHz serial streams transmitted to a remote data acquisition module via two fibers. The acquisition module decodes the serial data into the original image data and stores it in a frame cache, and software reads the data from the frame cache over USB 2.0 and stores it on a hard disk. The 12-bit-per-pixel digital image data were collected and displayed with the system software. The results show that the 16-port 300 fps CCD output signals can be digitized and transmitted by the video processing circuit, and remote data acquisition has been realized.
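
The rates quoted above are mutually consistent, as a quick check shows (the 6.67 MHz port clock slightly exceeds the bare 6.55 MHz pixel rate, presumably to cover readout overhead):

```python
def port_clock_hz(h, v, fps, ports):
    """Per-port pixel clock for an evenly split multi-port readout."""
    return h * v * fps / ports

def fiber_rate_mbps(bits_per_pixel, ports, clock_hz, fibers):
    """Serial line rate per fiber after TDM-encoding all ports."""
    return bits_per_pixel * ports * clock_hz / fibers / 1e6

clk = port_clock_hz(512, 512, 400, 16)     # ~6.55 MHz bare pixel rate
rate = fiber_rate_mbps(12, 16, 6.67e6, 2)  # ~640 Mbit/s per fiber
```

Sixteen 12-bit streams at 6.67 MHz total about 1.28 Gbit/s, so splitting them over two fibers yields the 640 MHz serial rate stated in the abstract.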

  8. Close-Range Photogrammetry with Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1983-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  9. Close-range photogrammetry with video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1985-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.
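
A bilinear correction of the kind described in these two records can be sketched as a least-squares fit over matched grid points; the function names and fitting procedure here are illustrative assumptions, not the authors' code:

```python
import numpy as np

def fit_bilinear(measured, reference):
    """Least-squares fit of u' = a0 + a1*u + a2*v + a3*u*v for each
    image coordinate, from matched measured/reference grid points
    (rows are (u, v) pairs)."""
    u, v = measured[:, 0], measured[:, 1]
    A = np.column_stack([np.ones_like(u), u, v, u * v])
    cx, *_ = np.linalg.lstsq(A, reference[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, reference[:, 1], rcond=None)
    return cx, cy

def apply_bilinear(points, cx, cy):
    """Map measured points through the fitted bilinear model."""
    u, v = points[:, 0], points[:, 1]
    A = np.column_stack([np.ones_like(u), u, v, u * v])
    return np.column_stack([A @ cx, A @ cy])
```

In practice the grid of reference points would come from imaging a known target; lens distortion would then be handled separately, e.g. by the plumb line method mentioned above.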

  10. Research of fiber position measurement by multi CCD cameras

    NASA Astrophysics Data System (ADS)

    Zhou, Zengxiang; Hu, Hongzhuan; Wang, Jianping; Zhai, Chao; Chu, Jiaru; Liu, Zhigang

    2014-07-01

    The parallel-controlled fiber positioner, an efficient observation system, has been used in LAMOST for four years and has been proposed for ngCFHT and the rebuilt Mayall telescope. The fiber positioner research group at USTC has designed a new-generation prototype based on close-packed modules of robotic positioner mechanisms. The prototype includes about 150 fiber positioning modules plugged into a 1-meter-diameter honeycombed focal plane, each module carrying 37 fiber positioners of 12 mm diameter. Furthermore, the new system improves the required accuracy from 40 um in LAMOST to 10 um in MSDESI, which poses a new challenge for measurement. A closed-loop control system is to be used in the new system: a CCD camera captures images of the fiber tips across the focal plane, calculates precise position information, and feeds it back to the control system. After the positioners have rotated through several loops, the accuracy of all positioners is confined to less than 10 um. We report our component development and performance measurement program for the new measuring system using multiple CCD cameras. With stereo vision and image processing methods, we precisely measure the three-dimensional position of the fiber tip carried by each fiber positioner. Finally, we present baseline parameters for fiber positioner measurement as a reference for next-generation survey telescope design.

  11. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  12. Development of high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    Presented in this paper is an outline of the R and D activities on high-speed video cameras that have been carried out at Kinki University for more than ten years and currently proceed as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searching journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors, using the same sensor as the previous camera, followed in 1996; its frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS, or In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor developed in early 2000 successfully captured images at 62,500 fps. Design of a prototype ISIS is currently under way and, hopefully, it will be fabricated in the near future. Epoch-making cameras in the history of high-speed video camera development by others are also briefly reviewed.

  13. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  14. Video Analysis with a Web Camera

    ERIC Educational Resources Information Center

    Wyrembeck, Edward P.

    2009-01-01

    Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as…

  15. Neural network method for characterizing video cameras

    NASA Astrophysics Data System (ADS)

    Zhou, Shuangquan; Zhao, Dazun

    1998-08-01

    This paper presents a neural network method for characterizing a color video camera. A multilayer feedforward network, trained with the error back-propagation learning rule, is used as a nonlinear transformer to model the camera, realizing a mapping from the CIELAB color space to RGB color space. With a SONY video camera, D65 illuminant, a Pritchard spectroradiometer, 410 JIS color charts as training data, and 36 charts as testing data, results show that the mean error on the training data is 2.9 and that on the testing data is 4.0 in a 256^3 RGB space.

  16. Experimental evaluation of CCD and CMOS cameras in low-light-level conditions

    NASA Astrophysics Data System (ADS)

    Laitinen, Jyrki; Ailisto, Heikki J.

    1999-09-01

    In this research, the characteristics of standard commercial CCD and CMOS cameras are evaluated experimentally and compared. Special attention is paid to the operation of these devices in low-light-level conditions, which are typical of many surveillance and consumer electronics applications. One emerging application utilizing inexpensive image sensors under variable illumination is the UMTS (Universal Mobile Telecommunications System), which will deliver, for example, pictures, graphics, and video wirelessly from the year 2002. In this study, the determination of system performance is based on imaging a calibrated gray-scale test chart under varying illumination. At each illumination level, the system response is characterized by a signal-to-random-noise figure. The signal is calculated as the difference between the system responses to the lightest and darkest areas of the gray scale. The random noise is measured as the standard deviation of the gray values in the difference of two successive images of the test pattern. The standard deviation is calculated from 10-bit digitized images for small groups of pixels (36 × 36) corresponding to the different areas of the gray scale in the test-pattern images. If the random noise is plotted as a function of signal (encoded in digital numbers, DN) for a small group of pixels, a photon transfer curve is obtained; this is one of the basic performance standards for CCD sensors. However, if camera systems with a nonlinear response or AGC are evaluated, the variation of the system response at different signal levels should be included in the performance measure; in these cases the signal-to-noise curve is useful. The signal-to-random-noise curves were determined for a CCD and a CMOS camera with similar specifications. The comparison between the two camera systems shows that considerable differences can exist between the operation of these devices, especially in low-light-level conditions. It was found that approximately
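
    The frame-differencing noise measurement described above is easy to sketch (the numbers below are illustrative, not the paper's data): subtracting two successive exposures of the same pattern cancels fixed-pattern structure, and dividing the standard deviation of the difference by the square root of 2 recovers the per-frame random noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def signal_and_noise(frame1, frame2, patch):
    """Mean signal and random-noise sigma for one gray-scale patch.

    Differencing two frames doubles the noise variance, hence /sqrt(2)."""
    r, c, size = patch
    p1 = frame1[r:r + size, c:c + size].astype(float)
    p2 = frame2[r:r + size, c:c + size].astype(float)
    signal = 0.5 * (p1.mean() + p2.mean())
    noise = (p1 - p2).std() / np.sqrt(2.0)
    return signal, noise

# Two simulated 10-bit exposures of a flat 36 x 36 patch with Gaussian noise.
true_level, sigma = 600.0, 8.0
f1 = rng.normal(true_level, sigma, (36, 36))
f2 = rng.normal(true_level, sigma, (36, 36))
sig, noi = signal_and_noise(f1, f2, (0, 0, 36))  # recovers ~600 DN, ~8 DN noise
```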

  17. PN-CCD camera for XMM and ABRIXAS: design of the camera system

    NASA Astrophysics Data System (ADS)

    Pfeffermann, Elmar; Braeuninger, Heinrich W.; Bihler, Edgar; Briel, Ulrich G.; Hippmann, Horst; Holl, Peter; Kemmer, Josef; Kendziorra, Eckhard; Kettenring, Guenther; Kretschmar, Baerbel; Kuster, Markus; Meidinger, Norbert; Metzner, Gerd; Pflueger, Bernhard; Reppin, Claus; Soltau, Heike; Stephan, Karl-Heinz; Strueder, Lothar; Truemper, Joachim; von Zanthier, Christoph

    1999-10-01

    The pn-Charge Coupled Device (pn-CCD) camera was developed as one of the focal plane instruments for the European Photon Imaging Camera on board the X-ray Multi-Mirror Mission (XMM). An identical camera was foreseen on board ABRIXAS, a German X-ray satellite. The pn-CCD camera is an imaging X-ray detector for single-photon counting, operating at a temperature below -80 degrees C. Due to the 0.3 mm depletion depth of the CCDs, the detector has a high quantum efficiency up to 15 keV. The effective area of the instrument is 6 cm × 6 cm, with 12 CCDs monolithically integrated on a single silicon wafer. The camera includes a filter wheel with different filters for suppression of optical and UV light. A radioactive source provides an in-orbit calibration. In this paper we give an overview of the mechanical, thermal, and electrical design of the instrument and a description of the different readout and test modes. More detailed information about the performance and calibration of the instrument can be found in companion papers.

  18. Photogrammetric Applications of Immersive Video Cameras

    NASA Astrophysics Data System (ADS)

    Kwiatek, K.; Tokarczyk, R.

    2014-05-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This offset causes problems when stitching together individual video frames from the separate cameras; however, there are ways to overcome it, and applying immersive cameras in photogrammetry offers new potential. The paper presents two applications of immersive video in photogrammetry. First, the creation of a low-cost mobile mapping system based on a Ladybug®3 camera and a GPS device is discussed. The number of panoramas is much higher than needed for photogrammetric purposes, as the baseline between spherical panoramas is around 1 metre. More than 92,000 panoramas were recorded in the Polish region of Czarny Dunajec, and measurements from the panoramas enable the user to measure the area of outdoor advertising structures and billboards. A new law is being created to limit the number of illegal advertising structures in the Polish landscape, and immersive video recorded over a short period of time is a candidate for economical and flexible off-site measurement. The second approach is the generation of 3D video-based reconstructions of heritage sites from immersive video (structure from immersive video). A mobile camera mounted on a tripod dolly was used to record the interior scene, and the immersive video, separated into thousands of still panoramas, was converted into 3D objects using Agisoft PhotoScan Professional. The findings from these experiments demonstrate that immersive photogrammetry is a flexible and prompt method of 3D modelling and offers promising features for mobile mapping systems.

  19. Video Analysis with a Web Camera

    NASA Astrophysics Data System (ADS)

    Wyrembeck, Edward P.

    2009-01-01

    Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as Videopoint and Tracker by Doug Brown, which is freely downloadable, could also be used. I purchased Logitech's QuickCam Pro 4000 web camera for $99 after Rick Sorensen at Vernier Software and Technology recommended it for computers using a Windows platform. Once I had mounted the web camera on a mobile computer with Velcro and installed the software, I was ready to capture motion video and analyze it.

  20. Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Hardware

    NASA Astrophysics Data System (ADS)

    Kang, Y.-W.; Byun, Y. I.; Rhee, J. H.; Oh, S. H.; Kim, D. K.

    2007-12-01

    We designed and developed a multi-purpose CCD camera system for three kinds of CCDs: KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472), made by Kodak. The system supports a fast USB port as well as a parallel port for data I/O and control signals. The packaging is based on two-stage circuit boards for size reduction and contains a built-in filter wheel. Basic hardware components include the clock pattern circuit, A/D conversion circuit, CCD data flow control circuit, and CCD temperature control unit. The CCD temperature can be controlled to an accuracy of approximately 0.4°C over a maximum temperature range of Δ33°C. The camera system has a readout noise of 6 e^{-} and a system gain of 5 e^{-}/ADU. A total of 10 CCD camera systems were produced, and our tests show that all of them perform acceptably.

  1. Use of CCD cameras for the differential restitution of photogrammetric snapshots

    NASA Astrophysics Data System (ADS)

    Behr, Franz-Josef

    The use of Charge Coupled Device (CCD) cameras for the production of orthophotos is described. The integrated use of optical and electronic system components, made possible by the video equipment, is identified as hybrid orthophoto production. The radiometric and geometric properties of the system must be considered in the design of the hybrid orthophoto system in order to obtain a high-quality product. The system is based on a photoprocessing base package which allows efficient software development as well as control and further handling (radiometric adjustment, contour generation). A series of examples of the differential restitution of individual objects, such as bridges or large-surface buildings, is presented. It is shown that, using appropriate software, an analytical plotter equipped with standardized photoprocessing hardware can be used as an orthophoto projector.

  2. A new testing method of SNR for cooled CCD imaging camera based on stationary wavelet transform

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Liu, Qianshun; Yu, Feihong

    2013-08-01

    Cooled CCD (charge-coupled device) imaging cameras have found wide application in astronomy, color photometry, spectroscopy, medical imaging, densitometry, and chemiluminescence and epifluorescence imaging. A cooled CCD (CCCD) imaging camera differs from a traditional CCD/CMOS imaging camera in that it can capture high-resolution images even in low-illumination environments. SNR (signal-to-noise ratio) is the most popular parameter for digital image quality evaluation. Many researchers have proposed SNR testing methods for traditional CCD imaging cameras, but these are seldom suitable for cooled CCD imaging cameras because the main noise sources differ. In this paper, a new SNR testing method is proposed to evaluate the quality of images captured by a cooled CCD. The Stationary Wavelet Transform (SWT) is introduced into the testing method to obtain a more exact image SNR value; taking full advantage of the SWT in image processing makes the experimental results accurate and reliable. To further refine the SNR testing results, the relation between SNR and integration time is also analyzed. The experimental results indicate that the proposed testing method accords with the SNR model of the CCCD. In addition, repeated measurements for one system cluster around a single value, which shows that the proposed testing method is robust.
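
    The core idea, estimating random noise from stationary-wavelet detail coefficients, can be sketched without a wavelet library by computing first-level undecimated Haar detail coefficients directly. This is a stand-in for the paper's SWT, not the authors' algorithm, and the robust MAD/0.6745 sigma estimator is a standard choice rather than one the paper specifies.

```python
import numpy as np

def swt_noise_sigma(img):
    """Robust noise estimate from undecimated diagonal Haar details.

    The 2x2 kernel (+1, -1, -1, +1)/2 has unit energy, so for i.i.d. noise
    the detail coefficients share the noise sigma; the median absolute
    deviation divided by 0.6745 makes the estimate robust to image content."""
    d = (img[:-1, :-1] - img[:-1, 1:] - img[1:, :-1] + img[1:, 1:]) / 2.0
    return np.median(np.abs(d)) / 0.6745

def snr_db(img):
    # SNR of a (nominally uniform) region: mean signal over noise sigma.
    return 20.0 * np.log10(img.mean() / swt_noise_sigma(img))

rng = np.random.default_rng(2)
flat = rng.normal(500.0, 5.0, (128, 128))  # uniform patch, sigma = 5 DN
est_sigma = float(swt_noise_sigma(flat))   # ~5 DN
est_snr = float(snr_db(flat))              # ~20*log10(500/5) = 40 dB
```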

  3. A New CCD Camera at the Molėtai Observatory

    NASA Astrophysics Data System (ADS)

    Zdanavičius, J.; Zdanavičius, K.

    The results of the first testing of a new CCD camera at the Molėtai Observatory are given. Linearity and flat-field corrections of good accuracy are determined using shifted star-field exposures.

  4. Auto-measuring system of aero-camera lens focus using linear CCD

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-ye; Zhao, Yu-liang; Wang, Shu-juan

    2014-09-01

    Automatic and accurate focal length measurement of aviation camera lenses is of great significance and practical value. The traditional measurement method relies on the human eye to read the scribed lines on the focal plane of a parallel light pipe through a reading microscope; this method is inefficient, and the results are easily influenced by human factors. Our method uses a linear-array solid-state image sensor instead of a reading microscope to convert the image size of a specific object into an electrical pulse width, and uses a computer to measure the focal length automatically. During measurement, the lens under test is placed in front of the objective of the parallel light tube. A pair of scribed lines on the parallel light pipe's focal plane is imaged onto the focal plane of the lens under test. With the linear CCD drive circuit placed on this image plane, the linear CCD converts the one-dimensional light intensity distribution into a time series of electrical signals. One signal path is brought directly to a video monitor through an image acquisition card for optical path adjustment and focusing; the other path is processed electronically to obtain the pulse width corresponding to the scribed lines. The computer processes the pulse width and outputs the focal length measurement result. Practical measurements showed a relative error of about 0.10%, in good agreement with theory.
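
    The relation underlying the measurement is a similar-triangles scaling: the collimator and the lens under test image the same reticle, so f_test = f_collimator × (image size / reticle size), with the image size taken from the CCD pulse width. A sketch with illustrative numbers (the collimator focal length, reticle spacing, and pixel pitch below are hypothetical, not the paper's hardware values):

```python
def focal_length_mm(f_collimator_mm, reticle_spacing_mm,
                    pulse_width_pixels, pixel_pitch_um):
    """Focal length of the lens under test from the linear-CCD pulse width."""
    y_image_mm = pulse_width_pixels * pixel_pitch_um / 1000.0
    return f_collimator_mm * y_image_mm / reticle_spacing_mm

# Hypothetical example: 1000 mm collimator, 10 mm reticle spacing,
# linear CCD with 7 um pixels reporting a 2143-pixel pulse width.
f = focal_length_mm(1000.0, 10.0, 2143, 7.0)  # ~1500.1 mm
```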

  5. Measuring neutron fluences and gamma/x-ray fluxes with CCD cameras

    SciTech Connect

    Yates, G.J.; Smith, G.W.; Zagarino, P.; Thomas, M.C.

    1991-12-01

    The capability to measure bursts of neutron fluences and gamma/x-ray fluxes directly with charge coupled device (CCD) cameras, while being able to distinguish between the video signals produced by these two types of radiation even when they occur simultaneously, has been demonstrated. Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) Frame Transfer (FT) charge coupled devices (CCDs), from irradiation with pulsed neutrons (14 MeV) and Bremsstrahlung photons (4-12 MeV endpoint), are utilized to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate approximately 0.05 V/rad responsivity, with at least 1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or "peaks", binned by area and amplitude as functions of fluence in the 10^5 to 10^7 n/cm^2 range, indicate smearing over approximately 1 to 10% of the CCD array, with charge per pixel ranging between noise and saturation levels.

  6. An unmanned watching system using video cameras

    SciTech Connect

    Kaneda, K.; Nakamae, E.; Takahashi, E.; Yazawa, K.

    1990-04-01

    Techniques for detecting intruders at a remote location, such as a power plant or substation, or in an unmanned building at night, are significant in the field of unmanned watching systems. This article describes an unmanned watching system to detect trespassers in real time, applicable both indoors and outdoors, based on image processing. The main part of the proposed system consists of a video camera, an image processor and a microprocessor. Images are input from the video camera to the image processor every 1/60 second, and objects which enter the image are detected by measuring changes of intensity level in selected sensor areas. This article discusses the system configuration and the detection method. Experimental results under a range of environmental conditions are given.
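
    The detection step described above amounts to comparing mean intensity levels in selected sensor areas between successive frames; a minimal sketch of that principle (region sizes and the threshold are illustrative, not the article's parameters):

```python
import numpy as np

def detect_intruder(prev_frame, curr_frame, sensor_areas, threshold=15.0):
    """Return the sensor areas whose mean intensity changed beyond threshold."""
    triggered = []
    for (r, c, h, w) in sensor_areas:
        prev_mean = prev_frame[r:r + h, c:c + w].mean()
        curr_mean = curr_frame[r:r + h, c:c + w].mean()
        if abs(curr_mean - prev_mean) > threshold:
            triggered.append((r, c, h, w))
    return triggered

background = np.full((120, 160), 100.0)  # quiescent scene
frame = background.copy()
frame[40:60, 50:70] = 200.0              # a bright object enters one region

areas = [(0, 0, 40, 40), (40, 50, 20, 20), (80, 100, 40, 40)]
hits = detect_intruder(background, frame, areas)  # only the middle area fires
```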

  7. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    There are many methods of acquiring multispectral images, but a dynamic band-selective, area-scan multispectral camera has not yet been developed. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame, and electric circuitry for parallel image-signal processing. In addition, firmware and application software were developed. Remarkable improvements over a conventional 3CCD camera are the redesigned image splitter and the filter-exchangeable frame. Computer simulation was required to visualize the ray path inside the prism when redesigning the image splitter; the dimensions of the splitter were then determined by simulation with options of BK7 glass and non-dichroic coating, chosen so that rays of the full wavelength range reach all film planes. The image splitter was verified with two narrow-waveband line lasers. The filter-exchangeable frame is designed so that bandpass filters can be swapped without changing the positions of the image sensors on the film plane. The developed 3CCD camera was evaluated in an application detecting scab and bruises on Fuji apples. The results show that a filter-exchangeable 3CCD camera can provide meaningful functionality for various multispectral applications that require exchanging bandpass filters.
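
    The prism ray-path simulation mentioned above rests on Snell's law at each glass interface. A toy check for a BK7 air-glass boundary (the refractive index is the standard nd value for BK7; this is an illustrative calculation, not the authors' simulation code):

```python
import math

N_BK7 = 1.5168  # refractive index of BK7 at 587 nm (nd)

def refraction_angle_deg(incidence_deg, n1=1.0, n2=N_BK7):
    """Refracted angle from Snell's law: n1 sin(t1) = n2 sin(t2)."""
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection")
    return math.degrees(math.asin(s))

theta_t = refraction_angle_deg(30.0)  # entering BK7 from air at 30 degrees
```

    A full splitter simulation would chain this refraction across every prism face and repeat it per wavelength, since the index varies with wavelength.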

  8. A Low Noise, High QE, Large Format CCD Camera System for the NASA MIGHTI Instrument

    NASA Astrophysics Data System (ADS)

    Hancock, J. J.; Cardon, J.; Watson, M.; Cook, J.; Whiteley, M.; Beukers, J.; Englert, C. R.; Brown, C. M.; Harlander, J.

    2015-12-01

    The Michelson Interferometer for Global High-resolution Thermospheric Imaging (MIGHTI) instrument is part of the NASA Ionospheric Connection Explorer (ICON) mission, designed to uncover the mysteries of the extreme variability of the Earth's ionosphere. MIGHTI consists of two identical units positioned to observe the Earth's low-latitude thermosphere from perpendicular viewing directions. The MIGHTI instrument is a spatial heterodyne spectrometer and requires a low-noise, high-QE, large-format camera system to detect the slight phase changes in the fringe patterns that reveal the neutral wind velocity. The MIGHTI camera system uses a single control electronics box to operate two identical CCD camera heads and communicate with the ICON payload electronics. The control electronics are carefully designed for a low-noise implementation of CCD biases, clocking, and CCD output digitization. The camera heads each contain a 2k by 2k, back-illuminated, frame-transfer CCD provided by e2v. The CCDs are TEC-cooled and have butcher-block filters mounted in close proximity to the active area. The CCDs are nominally operated in binned mode; the control electronics register settings provide flexibility for binning and gain control. An engineering model of the camera system has been assembled and tested, and its characterization meets all performance requirements. Performance highlights include a measured read noise of 5.7 electrons and a dark current of 0.01 electrons/pixel/second. The camera system design and characterization results will be presented.

  9. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to
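
    The end-to-end idea can be sketched as a measured response curve that is inverted by interpolation, which automatically folds any nonlinearity of the chain into the calibration. All numbers below are illustrative stand-ins, not the article's data or its saturation-extension technique:

```python
import numpy as np

# Hypothetical measured end-to-end response: digital number (DN) recorded for
# calibration sources of known brightness, nonlinear and compressing near 255.
known_brightness = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])
measured_dn      = np.array([12., 25., 60., 110., 180., 240., 252.])

def dn_to_brightness(dn):
    # The response is monotonic, so it can be inverted by interpolation;
    # any nonlinearity of the chain is captured by the measured curve itself.
    return np.interp(dn, measured_dn, known_brightness)

b = float(dn_to_brightness(110.0))  # recovers 10.0 brightness units
```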

  10. High-speed video recording system using multiple CCD imagers and digital storage

    NASA Astrophysics Data System (ADS)

    Racca, Roberto G.; Clements, Reginald M.

    1995-05-01

    This paper describes a fully solid state high speed video recording system. Its principle of operation is based on the use of several independent CCD imagers and an array of liquid crystal light valves that control which imager receives the light from the subject. The imagers are exposed in rapid succession and are then read out sequentially at standard video rate into digital memory, generating a time-resolved sequence with as many frames as there are imagers. This design allows the use of inexpensive, consumer-grade camera modules and electronics. A microprocessor-based controller, designed to accept up to ten imagers, handles all phases of the recording: exposure timing, image digitization and storage, and sequential playback onto a standard video monitor. The system is capable of recording full screen black and white images with spatial resolution similar to that of standard television, at rates of about 10,000 images per second in pulsed illumination mode. We have designed and built two optical configurations for the imager multiplexing system. The first one involves permanently splitting the subject light into multiple channels and placing a liquid crystal shutter in front of each imager. A prototype with three CCD imagers and shutters based on this configuration has allowed successful three-image video recordings of phenomena such as the action of an air rifle pellet shattering a piece of glass, using a high-intensity pulsed light emitting diode as the light source. The second configuration is more light-efficient in that it routes the entire subject light to each individual imager in sequence by using the liquid crystal cells as selectable binary switches. Despite some operational limitations, this method offers a solution when the available light, if subdivided among all the imagers, would not allow a sufficiently short exposure time.

  11. Radiation damage of the PCO Pixelfly VGA CCD camera of the BES system on KSTAR tokamak

    NASA Astrophysics Data System (ADS)

    Náfrádi, Gábor; Kovácsik, Ákos; Pór, Gábor; Lampert, Máté; Un Nam, Yong; Zoletnik, Sándor

    2015-01-01

    A PCO Pixelfly VGA CCD camera, which is part of the Beam Emission Spectroscopy (BES) diagnostic system of the Korea Superconducting Tokamak Advanced Research (KSTAR) device used for spatial calibrations, suffered serious radiation damage: white pixel defects were generated in it. The main goal of this work was to identify the origin of the radiation damage and to propose solutions to avoid it. A Monte Carlo N-Particle eXtended (MCNPX) model was built using the Monte Carlo Modeling Interface Program (MCAM), and calculations were carried out to predict the neutron and gamma-ray fields at the camera position. Besides the MCNPX calculations, pure gamma-ray irradiations of the CCD camera were carried out in the Training Reactor of BME. Before, during, and after the irradiations, numerous frames were taken with the camera using 5 s exposure times. Evaluation of these frames showed that at the applied high gamma-ray dose (1.7 Gy) and dose-rate levels (up to 2 Gy/h), the number of white pixels did not increase. We found that the origin of the white pixel generation was neutron-induced thermal hopping of electrons, which means that in the future only neutron shielding is necessary around the CCD camera. Another solution could be to replace the CCD camera with a more radiation-tolerant one, for example a suitable CMOS camera, or to apply both solutions simultaneously.
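
    The frame-evaluation step, counting white pixel defects in dark exposures, can be sketched as thresholding against the frame median (the threshold, dark level, and sensor size below are illustrative, not the paper's values):

```python
import numpy as np

def count_white_pixels(dark_frame, threshold_dn=50.0):
    """Count pixels far above the frame median in a long dark exposure."""
    return int(np.count_nonzero(dark_frame > np.median(dark_frame) + threshold_dn))

rng = np.random.default_rng(3)
frame = rng.normal(20.0, 3.0, (480, 640))     # nominal dark frame (VGA size)
defects = [(10, 17), (100, 200), (300, 555)]  # simulated damaged pixels
for r, c in defects:
    frame[r, c] = 400.0

n_white = count_white_pixels(frame)  # finds the 3 injected defects
```

    Tracking this count across frames taken before, during, and after irradiation is what reveals whether a given radiation field creates new defects.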

  12. Interline Transfer CCD Camera for Gated Broadband Coherent Anti-Stokes Raman-Scattering Measurements.

    PubMed

    Roy, S; Ray, G; Lucht, R P

    2001-11-20

    Use of an interline transfer CCD camera for the acquisition of broadband coherent anti-Stokes Raman-scattering (CARS) spectra is demonstrated. The interline transfer CCD has alternating columns of imaging and storage pixels that allow one to acquire two successive images by shifting the first image into the storage pixels and immediately acquiring the second. We have used this dual-image mode for gated CARS measurements by acquiring a CARS spectral image and shifting it rapidly from the imaging pixel columns to the storage pixel columns. We have demonstrated the use of this dual-image mode for gated single-laser-shot measurement of hydrogen and nitrogen CARS spectra at room temperature and in atmospheric-pressure flames. The performance of the interline transfer CCD for these CARS measurements is compared directly with that of a back-illuminated unintensified CCD camera. PMID:18364895

  13. The In-flight Spectroscopic Performance of the Swift XRT CCD Camera During 2006-2007

    NASA Technical Reports Server (NTRS)

    Godet, O.; Beardmore, A.P.; Abbey, A.F.; Osborne, J.P.; Page, K.L.; Evans, P.; Starling, R.; Wells, A.A.; Angelini, L.; Burrows, D.N.; Kennea, J.; Campana, S.; Chincarini, G.; Citterio, O.; Cusumano, G.; LaParola, V.; Mangano, V.; Mineo, T.; Giommi, P.; Perri, M.; Capalbi, M.; Tamburelli, F.

    2007-01-01

    The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 135 eV FWHM at 5.9 keV as measured before launch. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Comparison of observed spectra with models folded through the instrument response produces negative residuals around and below the Oxygen edge. We discuss several possible causes for such residuals. Traps created by proton damage on the CCD increase the charge transfer inefficiency (CTI) over time. We describe the evolution of the CTI since the launch and its effect on the CCD spectral resolution and the gain.

  14. Automated CCD camera characterization. 1998 summer research program for high school juniors at the University of Rochester's Laboratory for Laser Energetics: Student research reports

    SciTech Connect

    Silbermann, J.

    1999-03-01

    The OMEGA system uses CCD cameras for a broad range of applications. Over 100 video-rate CCD cameras are used for purposes such as targeting, aligning, and monitoring areas such as the target chamber, laser bay, and viewing gallery. There are approximately 14 scientific-grade CCD cameras on the system, used to obtain precise photometric results from the laser beam as well as target diagnostics. It is very important that these scientific-grade CCDs are properly characterized so that the results received from them can be evaluated appropriately. Currently, characterization is a tedious process done by hand: the operator must operate the camera and light source simultaneously, and because more exposures mean more accurate information on the camera, the characterization tests can become very lengthy affairs; sometimes it takes an entire day to complete just a single plot. Characterization requires testing many aspects of the camera's operation, including: variance vs. mean signal level, which should be proportional due to the Poisson statistics of the incident photon flux; linearity, the ability of the CCD to produce signals proportional to the light it receives; signal-to-noise ratio, the relative magnitude of the signal vs. the uncertainty in that signal; and dark current, the noise due to thermal generation of electrons (cooling lowers this noise contribution significantly). These tests, as well as many others, must be conducted in order to properly understand a CCD camera. The goal of this project was to construct an apparatus that could characterize a camera automatically.
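
    The variance-vs-mean test mentioned above is the classic photon-transfer method: for shot-noise-limited data, variance(ADU) = mean(ADU)/K, where K is the system gain in e-/ADU, so the slope of a linear fit yields the gain. A synthetic sketch (not OMEGA data; the gain and illumination levels are made up):

```python
import numpy as np

rng = np.random.default_rng(4)
K = 5.0  # "true" system gain, e-/ADU (illustrative)

means, variances = [], []
for level_e in [200, 500, 1000, 2000, 5000, 10000]:  # mean electrons/pixel
    electrons = rng.poisson(level_e, 100_000)  # Poisson photon/electron stats
    adu = electrons / K                        # digitized signal
    means.append(adu.mean())
    variances.append(adu.var())

# For Poisson statistics, variance = mean / K, so the fitted slope is 1/K.
slope = np.polyfit(means, variances, 1)[0]
gain_estimate = 1.0 / slope  # ~5 e-/ADU
```

    An automated characterization rig would sweep the light source through these levels and run exactly this fit without operator involvement.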

  15. High performance CCD camera system for digitalisation of 2D DIGE gels.

    PubMed

    Strijkstra, Annemieke; Trautwein, Kathleen; Roesler, Stefan; Feenders, Christoph; Danzer, Daniel; Riemenschneider, Udo; Blasius, Bernd; Rabus, Ralf

    2016-07-01

    An essential step in 2D DIGE-based analysis of differential proteome profiles is the accurate and sensitive digitalisation of 2D DIGE gels. The performance progress of commercially available charge-coupled device (CCD) camera-based systems combined with light emitting diodes (LED) opens up a new possibility for this type of digitalisation. Here, we assessed the performance of a CCD camera system (Intas Advanced 2D Imager) as alternative to a traditionally employed, high-end laser scanner system (Typhoon 9400) for digitalisation of differential protein profiles from three different environmental bacteria. Overall, the performance of the CCD camera system was comparable to the laser scanner, as evident from very similar protein abundance changes (irrespective of spot position and volume), as well as from linear range and limit of detection. PMID:27252121

  16. Wilbur: A low-cost CCD camera system for MDM Observatory

    NASA Technical Reports Server (NTRS)

    Metzger, M. R.; Luppino, G. A.; Tonry, J. L.

    1992-01-01

    The recent availability of several 'off-the-shelf' components, particularly CCD control electronics from SDSU, has made it possible to put together a flexible CCD camera system at relatively low cost and effort. The authors describe Wilbur, a complete CCD camera system constructed for the Michigan-Dartmouth-MIT Observatory. The hardware consists of a Loral 2048^2 CCD controlled by the SDSU electronics, an existing dewar design modified for use at MDM, a Sun Sparcstation 2 with a commercial high-speed parallel controller, and a simple custom interface between the controller and the SDSU electronics. The camera is controlled from the Sparcstation by software that provides low-level I/O in real time, collection of additional information from the telescope, and a simple command interface for use by an observer. Readout of the 2048^2 array is complete in under two minutes at 5 e^{-} read noise, and readout time can be decreased at the cost of increased noise. The system can be easily expanded to handle multiple CCDs/multiple readouts, and can control other dewars/CCDs using the same host software.

  17. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

    The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral-density (ND) filtering in front of the theodolite's telescope and then safely view and measure the laser's boresight through the telescope without great risk to one's eyes. This method, workable for a Class II visible-wavelength laser, is not acceptable even to attempt for a Class IV laser and is not applicable to an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available: a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera includes power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. An additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as visible light; hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.

  18. Inexpensive range camera operating at video speed.

    PubMed

    Kramer, J; Seitz, P; Baltes, H

    1993-05-01

    An optoelectronic device has been developed and built that acquires and displays the range data of an object surface in space in video real time. The recovery of depth is performed with active triangulation. A galvanometer scanner system sweeps a sheet of light across the object at a video field rate of 50 Hz. High-speed signal processing is achieved through the use of a special optical sensor and hardware implementation of the simple electronic-processing steps. Fifty range maps are generated per second and converted into a European standard video signal where the depth is encoded in gray levels or color. The image resolution currently is 128 x 500 pixels with a depth accuracy of 1.5% of the depth range. The present setup uses a 500-mW diode laser for the generation of the light sheet. A 45-mm imaging lens covers a measurement volume of 93 mm x 61 mm x 63 mm at a medium distance of 250 mm from the camera, but this can easily be adapted to other dimensions. PMID:20820391
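The active-triangulation step described above amounts to intersecting each camera pixel's viewing ray with the known light-sheet plane. A minimal sketch (the geometry values are hypothetical, not the paper's calibration):

```python
def triangulate(ray_dir, plane_point, plane_normal):
    """Point where the camera ray (from the origin, along ray_dir)
    pierces the light-sheet plane given in point-normal form."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-12:
        raise ValueError("viewing ray is parallel to the light sheet")
    t = dot(plane_point, plane_normal) / denom
    return tuple(t * d for d in ray_dir)

# light sheet at x = 100 mm, pixel ray direction (0.4, 0, 1):
# the ray meets the sheet at roughly (100, 0, 250) mm
p = triangulate((0.4, 0.0, 1.0), (100.0, 0.0, 0.0), (1.0, 0.0, 0.0))
```

In the real instrument the plane changes with the galvanometer angle on every scan line, so `plane_point` and `plane_normal` would be functions of the scan phase.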

  19. Optical synthesizer for a large quadrant-array CCD camera: Center director's discretionary fund

    NASA Technical Reports Server (NTRS)

    Hagyard, Mona J.

    1992-01-01

The objective of this program was to design and develop an optical device, an optical synthesizer, that focuses four contiguous quadrants of a solar image on four spatially separated CCD arrays that are part of a unique CCD camera system. This camera and the optical synthesizer will be part of the new NASA-Marshall Experimental Vector Magnetograph, an instrument developed to measure the Sun's magnetic field as accurately as present technology allows. The tasks undertaken in the program are outlined and the final detailed optical design is presented.

  20. Comparison of Kodak Professional Digital Camera System images to conventional film, still video, and freeze-frame images

    NASA Astrophysics Data System (ADS)

    Kent, Richard A.; McGlone, John T.; Zoltowski, Norbert W.

    1991-06-01

Electronic cameras provide near-real-time image evaluation, with the benefits of digital storage for rapid transmission or computer processing and enhancement of images. But how does their image quality compare to that of conventional film? A standard Nikon F-3™ 35 mm SLR camera was transformed into an electro-optical camera by replacing the film back with Kodak's KAF-1400V (or KAF-1300L) megapixel CCD array detector back and a processing accessory. Images taken with these Kodak electronic cameras were compared to those using conventional films and to several still video cameras, using both quantitative and qualitative methods. Images captured on conventional analog video systems provide a maximum of 450-500 TV lines of resolution, depending upon the camera resolution, storage method, and viewing-system resolution. The Kodak Professional Digital Camera System™ exceeded this resolution and more closely approached that of film.

  1. Research on detecting heterogeneous fibre from cotton based on linear CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Xian-bin; Cao, Bing; Zhang, Xin-peng; Shi, Wei

    2009-07-01

Heterogeneous fibres in cotton have a great impact on cotton textile production: they degrade product quality and thereby hurt the economic benefits and market competitiveness of the producer. Detecting and eliminating heterogeneous fibre is therefore particularly important for improving cotton processing, raising the quality of cotton textiles, and reducing production cost, and the technology has favorable market value and development prospects. Optical detecting systems of this kind have found widespread application. In our system, a linear CCD camera scans the running cotton; the video signals are sent to a computer and processed according to grayscale differences, and if heterogeneous fibre is present the computer commands a gas nozzle to eliminate it. In this paper we adopt a monochrome LED array as the new detecting light source: its flicker, luminous-intensity stability, lumen depreciation, and useful life are all superior to those of fluorescent light. We first analyse the reflection spectra of cotton and various heterogeneous fibres, then select an appropriate frequency for the light source, finally adopting a violet LED array as the new detecting light source. The overall hardware structure and software design are introduced in this paper.
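The grayscale-difference processing described could be sketched as follows; the background level and threshold here are illustrative placeholders, not values from the paper:

```python
def detect_fibre(line, background, threshold=30):
    """Group pixels of one linear-CCD scan line that deviate from the
    cotton background by more than `threshold` gray levels into
    candidate heterogeneous-fibre segments."""
    segments = []
    run_start = None
    for i, v in enumerate(line):
        if abs(v - background) > threshold:
            if run_start is None:
                run_start = i  # a new deviating run begins
        elif run_start is not None:
            segments.append((run_start, i - 1))
            run_start = None
    if run_start is not None:
        segments.append((run_start, len(line) - 1))
    return segments  # each segment would trigger the air-nozzle ejector

# a dark foreign fibre (gray ~90) on bright cotton (gray ~200)
segs = detect_fibre([200] * 10 + [90, 85, 88] + [200] * 10, background=200)
# -> [(10, 12)]
```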

  2. Auto-measurement system of aerial camera lens' resolution based on orthogonal linear CCD

    NASA Astrophysics Data System (ADS)

    Zhao, Yu-liang; Zhang, Yu-ye; Ding, Hong-yi

    2010-10-01

The resolution of an aerial camera lens is one of the camera's most important performance indexes, and its measurement and calibration are important test items in camera maintenance. The traditional method, in which an observer views the collimator's resolution panel through a reading microscope and computes the result by eye, is inefficient, susceptible to human factors, and gives unstable results. An auto-measurement system for aerial camera lens resolution is introduced that uses an orthogonal linear CCD sensor as the detector in place of the reading microscope; it measures automatically and displays results in real time. To measure the smallest identifiable element of the resolution panel, two orthogonal linear CCDs are laid on the imaging plane of the measured lens, forming four intersection points on the CCDs. A coordinate system is defined by the origin of the linear CCDs, and a circle is determined by the four intersection points. To obtain the circle's radius, the image of the resolution panel is first converted to electric-signal pulse widths, which are sent to the computer through an amplifying circuit, threshold comparator, and counter. The smallest circle is then extracted for measurement; circle extraction uses the wavelet transform, which is localized in both time and frequency and supports multi-scale analysis. Finally, the resolution of the measured lens is obtained from the lens-resolution formula. Analysis of the measuring precision in practical measurement indicates that precision improves when the linear CCD replaces the reading microscope; moreover, the remaining system error is determined by the CCD's pixel size, so as CCD technology develops and pixels shrink, the system error will be reduced further. So the auto
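Because a chord's perpendicular bisector passes through the circle's center, the two intersection points on each linear CCD directly yield one coordinate of the center. A sketch of that geometry (the coordinates are hypothetical):

```python
import math

def circle_from_chords(x1, x2, y_line, y1, y2, x_line):
    """Circle through (x1, y_line), (x2, y_line) on the horizontal
    linear CCD and (x_line, y1), (x_line, y2) on the vertical one.
    Each chord's midpoint gives one coordinate of the center."""
    cx = (x1 + x2) / 2.0
    cy = (y1 + y2) / 2.0
    r = math.hypot(x1 - cx, y_line - cy)
    return cx, cy, r

# hypothetical intersections for a circle of radius 5 centered at (2, 3),
# read out on the CCD lines y = 0 and x = 0
cx, cy, r = circle_from_chords(-2.0, 6.0, 0.0,
                               3.0 - math.sqrt(21.0),
                               3.0 + math.sqrt(21.0), 0.0)
```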

  3. Characterization of the CCD and CMOS cameras for grating-based phase-contrast tomography

    NASA Astrophysics Data System (ADS)

    Lytaev, Pavel; Hipp, Alexander; Lottermoser, Lars; Herzen, Julia; Greving, Imke; Khokhriakov, Igor; Meyer-Loges, Stephan; Plewka, Jörn; Burmester, Jörg; Caselle, Michele; Vogelgesang, Matthias; Chilingaryan, Suren; Kopmann, Andreas; Balzer, Matthias; Schreyer, Andreas; Beckmann, Felix

    2014-09-01

In this article we present the quantitative characterization of the CCD and CMOS sensors used at the microtomography experiments operated by HZG at PETRA III at DESY in Hamburg, Germany. A standard commercial CCD camera is compared to a camera based on a CMOS sensor; the CMOS camera is modified for grating-based differential phase-contrast tomography. The main goal of the project is to quantify and optimize the statistical parameters of this camera system. Key performance parameters such as readout noise, conversion gain and full-well capacity are used to define an optimized measurement for grating-based phase contrast. First results are shown.
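Conversion gain of the kind quantified here is commonly extracted from a photon-transfer curve, plotting the variance of flat-field signal against its mean. A sketch with synthetic data (this is the standard method, not necessarily the authors' exact procedure):

```python
def photon_transfer_gain(means, variances):
    """Conversion gain from a photon-transfer curve: for a
    shot-noise-limited sensor var = mean / g, so the gain g (e-/ADU)
    is the reciprocal of the least-squares slope of var vs mean."""
    n = len(means)
    mx = sum(means) / n
    vy = sum(variances) / n
    num = sum((m - mx) * (v - vy) for m, v in zip(means, variances))
    den = sum((m - mx) ** 2 for m in means)
    return den / num  # 1 / slope

# synthetic flat-field statistics for a sensor with g = 2 e-/ADU
g = photon_transfer_gain([100.0, 200.0, 400.0, 800.0],
                         [50.0, 100.0, 200.0, 400.0])  # -> 2.0
```

Read noise would appear as the variance intercept at zero signal, and full-well capacity as the point where the curve rolls off.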

  4. A Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Software

    NASA Astrophysics Data System (ADS)

    Oh, S. H.; Kang, Y. W.; Byun, Y. I.

    2007-12-01

We present software developed for the multi-purpose CCD camera. The software supports all three CCD types made by Kodak: the KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472). For efficient camera control, the software runs as two independent processes, a CCD control program and a temperature/shutter operation program, coordinated by Linux user signals. It is designed for fully automatic as well as manual operation under Linux. We plan to use this software for an all-sky survey system and also for night-sky monitoring and sky observation. The read-out times of the CCDs are about 15 s, 64 s, and 134 s for the KAF-0401E, KAF-1602E, and KAF-3200E respectively, because they are limited by the data-transmission speed of the parallel port. Larger-format CCDs require higher-speed data transmission, so we are considering porting the control software to USB.

  5. A CCD Camera with Electron Decelerator for Intermediate Voltage Electron Microscopy

    SciTech Connect

    Downing, Kenneth H; Downing, Kenneth H.; Mooney, Paul E.

    2008-03-17

    Electron microscopists are increasingly turning to Intermediate Voltage Electron Microscopes (IVEMs) operating at 300 - 400 kV for a wide range of studies. They are also increasingly taking advantage of slow-scan charge coupled device (CCD) cameras, which have become widely used on electron microscopes. Under some conditions CCDs provide an improvement in data quality over photographic film, as well as the many advantages of direct digital readout. However, CCD performance is seriously degraded on IVEMs compared to the more conventional 100 kV microscopes. In order to increase the efficiency and quality of data recording on IVEMs, we have developed a CCD camera system in which the electrons are decelerated to below 100 kV before impacting the camera, resulting in greatly improved performance in both signal quality and resolution compared to other CCDs used in electron microscopy. These improvements will allow high-quality image and diffraction data to be collected directly with the CCD, enabling improvements in data collection for applications including high-resolution electron crystallography, single-particle reconstruction of protein structures, tomographic studies of cell ultrastructure and remote microscope operation. This approach will enable us to use even larger format CCD chips that are being developed with smaller pixels.

  6. Color video camera capable of 1,000,000 fps with triple ultrahigh-speed image sensors

    NASA Astrophysics Data System (ADS)

    Maruyama, Hirotaka; Ohtake, Hiroshi; Hayashida, Tetsuya; Yamada, Masato; Kitamura, Kazuya; Arai, Toshiki; Tanioka, Kenkichi; Etoh, Takeharu G.; Namiki, Jun; Yoshida, Tetsuo; Maruno, Hiromasa; Kondo, Yasushi; Ozaki, Takao; Kanayama, Shigehiro

    2005-03-01

    We developed an ultrahigh-speed, high-sensitivity, color camera that captures moving images of phenomena too fast to be perceived by the human eye. The camera operates well even under restricted lighting conditions. It incorporates a special CCD device that is capable of ultrahigh-speed shots while retaining its high sensitivity. Its ultrahigh-speed shooting capability is made possible by directly connecting CCD storages, which record video images, to photodiodes of individual pixels. Its large photodiode area together with the low-noise characteristic of the CCD contributes to its high sensitivity. The camera can clearly capture events even under poor light conditions, such as during a baseball game at night. Our camera can record the very moment the bat hits the ball.

  7. Perfecting the Photometric Calibration of the ACS CCD Cameras

    NASA Astrophysics Data System (ADS)

    Bohlin, Ralph C.

    2016-09-01

Newly acquired data and improved data reduction algorithms mandate a fresh look at the absolute flux calibration of the charge-coupled device cameras on the Hubble Space Telescope (HST) Advanced Camera for Surveys (ACS). The goals are to achieve a 1% accuracy and to make this calibration more accessible to the HST guest investigator. Absolute fluxes from the CALSPEC database for three primary hot 30,000–60,000 K WDs define the sensitivity calibrations for the Wide Field Channel (WFC) and High Resolution Channel (HRC) filters. The external uncertainty for the absolute flux is ∼1%, while the internal consistency of the sensitivities in the broadband ACS filters is ∼0.3% among the three primary WD flux standards. For stars as cool as K type, the agreement with the CALSPEC standards is within 1% at the WFC1-1K subarray position, which achieves the 1% precision goal for the first time. After making a small adjustment to the filter bandpass for F814W, the 1% precision goal is achieved over the full F814W WFC field of view for stars of K type and hotter. New encircled energies and absolute sensitivities replace the seminal results of Sirianni et al. that were published in 2005. After implementing the throughput updates, synthetic predictions of the WFC and HRC count rates for the average of the three primary WD standard stars agree with the observations to 0.1%.
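A synthetic count-rate prediction of the kind compared against observations integrates the standard-star flux against the instrument throughput: rate = (A/hc) ∫ F_λ(λ) T(λ) λ dλ. A bare-bones sketch (trapezoid rule, hypothetical inputs; the real calculation uses the CALSPEC spectra and the ACS throughput tables):

```python
H = 6.62607015e-27  # Planck constant, erg s
C = 2.99792458e10   # speed of light, cm/s

def count_rate(wl_cm, flux, throughput, area_cm2):
    """Predicted count rate (e-/s): A/(h*c) times the trapezoid-rule
    integral of F_lambda * T * lambda.  wl_cm: wavelengths [cm];
    flux: F_lambda [erg/s/cm^2/cm]; throughput: dimensionless."""
    integrand = [f * t * w for f, t, w in zip(flux, throughput, wl_cm)]
    integral = sum((integrand[i] + integrand[i + 1]) / 2.0 *
                   (wl_cm[i + 1] - wl_cm[i])
                   for i in range(len(wl_cm) - 1))
    return area_cm2 * integral / (H * C)

# hypothetical flat spectrum over 400-600 nm, 50% throughput,
# and an HST-sized 4.5e4 cm^2 collecting area
rate = count_rate([4.0e-5, 5.0e-5, 6.0e-5],
                  [1.0e-9] * 3, [0.5] * 3, 45000.0)
```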

  9. Time-Resolved Spectra of Dense Plasma Focus Using Spectrometer, Streak Camera, CCD Combination

    SciTech Connect

    F. J. Goldin, B. T. Meehan, E. C. Hagen, P. R. Wilkins

    2010-10-01

    A time-resolving spectrographic instrument has been assembled with the primary components of a spectrometer, image-converting streak camera, and CCD recording camera, for the primary purpose of diagnosing highly dynamic plasmas. A collection lens defines the sampled region and couples light from the plasma into a step index, multimode fiber which leads to the spectrometer. The output spectrum is focused onto the photocathode of the streak camera, the output of which is proximity-coupled to the CCD. The spectrometer configuration is essentially Czerny–Turner, but off-the-shelf Nikon refraction lenses, rather than mirrors, are used for practicality and flexibility. Only recently assembled, the instrument requires significant refinement, but has now taken data on both bridge wire and dense plasma focus experiments.

  11. An EEV large-format CCD camera on the WHT ISIS spectrograph

    NASA Astrophysics Data System (ADS)

    Jorden, Paul R.

    1990-07-01

    A 770 x 1152-element CCD camera being developed for use with the ISIS spectrograph on the 4.2-m William Herschel Telescope at Observatorio del Roque de los Muchachos is described and illustrated with diagrams and graphs of test data. The super-grade EEV 88200 CCDs selected for the ISIS red-arm (401-1060 nm) camera have pixel size 22.5 microns square and rms readout noise 3.5 electrons at operating temperature 150 K; peak quantum efficiency of 50 percent is measured at wavelength 650 nm, and the spatial resolution obtained with ISIS is less than 1.2 pixels FWHM. A similar camera is to be constructed for the blue arm of ISIS. Also provided is a brief overview of the CCD controller hardware and software, summarizing the detailed descriptions of Bregman and Waltham (1986) and Bregman and Doorduin (1986).

  12. The University of Hawaii Institute for Astronomy CCD camera control system

    NASA Technical Reports Server (NTRS)

    Jim, K. T. C.; Yamada, H. T.; Luppino, G. A.; Hlivak, R. J.

    1992-01-01

The University of Hawaii Institute for Astronomy CCD Camera Control System consists of a NeXT workstation, a graphical user interface, and a fiber-optic communications interface connected to a San Diego State University CCD controller. The UH system employs the NeXT-resident Motorola DSP 56001 as a real-time hardware controller. The DSP 56001 is interfaced to the Mach-based UNIX of the NeXT workstation by DMA and multithreading. Since the SDSU controller also uses the DSP 56001, the NeXT is used as a development platform for the embedded control software. The fiber-optic interface links the two DSP 56001s through their Synchronous Serial Interfaces. The user interface is based on the NeXTStep windowing system; it is easy to use and features real-time display of image data and control over all camera functions. Both Loral and Tektronix 2048 x 2048 CCDs have been driven at full readout speeds, and the system is intended to be capable of simultaneous readout of four such CCDs. The total hardware package is compact enough to be quite portable and has been used on five different telescopes on Mauna Kea. The complete CCD control system can be assembled for a very low cost. The hardware and software of the control system have proven to be quite reliable, well adapted to the needs of astronomers, and extensible to increasingly complicated control requirements.

  13. Multi-spectral CCD camera system for ocean water color and seacoast observation

    NASA Astrophysics Data System (ADS)

    Zhu, Min; Chen, Shiping; Wu, Yanlin; Huang, Qiaolin; Jin, Weiqi

    2001-10-01

The multi-spectral CCD camera system, one of the earth-observing instruments on the HY-1 satellite to be launched in 2001, was developed by the Beijing Institute of Space Mechanics & Electricity (BISME), Chinese Academy of Space Technology (CAST). From its 798 km orbit the system provides images with 250 m ground resolution and a 500 km swath. It is mainly used for coastal-zone dynamic mapping and ocean water-color monitoring, covering pollution of the offshore and coastal zone, plant cover, water color, ice, underwater terrain, suspended sediment, mudflat, soil and vapor gross. The multi-spectral camera system is composed of four monochrome line-array, push-broom CCD cameras, one per spectral band. The camera system adopts field-of-view registration: each camera scans the same region at the same moment. Each camera contains optics, a focal-plane assembly, electrical circuits, mounting structure, a calibration system, thermal control and so on. The primary features of the camera system are: (1) offset of the central wavelength better than 5 nm; (2) degree of polarization less than 0.5%; (3) signal-to-noise ratio of about 1000; (4) dynamic range better than 2000:1; (5) registration precision better than 0.3 pixel; (6) quantization of 12 bits.

  14. Applications of visible CCD cameras on the Alcator C-Mod tokamak

    NASA Astrophysics Data System (ADS)

    Boswell, C. J.; Terry, J. L.; Lipschultz, B.; Stillerman, J.

    2001-01-01

Five 7-mm-diameter remote-head visible charge-coupled device (CCD) cameras are being used on Alcator C-Mod for several different diagnostic purposes. All of the cameras' detectors and optics are placed inside a magnetic field of up to 4 T. Images from the cameras are recorded simultaneously using two three-channel color framegrabber cards. Two CCD cameras are typically used to generate two-dimensional emissivity profiles of deuterium line radiation from the divertor. Interference filters select the spectral line to be measured, and the local emissivity is obtained by inverting the measured brightnesses assuming toroidal symmetry of the emission. Another use of the cameras is the identification and localization of impurity sources generated by the ion cyclotron radio frequency (ICRF) antennas, which supply the auxiliary heating on Alcator C-Mod. Impurities generated by the antennas are identified by correlating in time the injections seen by the cameras with measurements made with core diagnostics. Fibers whose views are aligned with the camera views, and whose outputs are coupled to a visible spectrometer, are also used to identify the species of the injected impurities.
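Inverting chord-integrated brightness to local emissivity under a symmetry assumption is a discrete Abel inversion. An onion-peeling sketch for nested circular shells (the discretization and ray layout are illustrative, not the C-Mod analysis code):

```python
import math

def onion_peel(brightness, dr):
    """Discrete Abel inversion by onion peeling: recover per-shell
    emissivities from chord-integrated brightness, assuming circular
    symmetry.  brightness[i] is the line integral along the chord at
    impact parameter p_i = i*dr; shell j spans radii [j*dr, (j+1)*dr]."""
    def chord(p, r_in, r_out):
        # length of the chord at impact parameter p inside one shell
        return 2.0 * (math.sqrt(max(r_out * r_out - p * p, 0.0)) -
                      math.sqrt(max(r_in * r_in - p * p, 0.0)))
    n = len(brightness)
    eps = [0.0] * n
    for i in reversed(range(n)):  # peel from the outermost chord inward
        p = i * dr
        outer = sum(chord(p, j * dr, (j + 1) * dr) * eps[j]
                    for j in range(i + 1, n))
        eps[i] = (brightness[i] - outer) / chord(p, i * dr, (i + 1) * dr)
    return eps
```

The sketch can be checked by forward-projecting a known emissivity profile and verifying that the inversion recovers it.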

  15. Optics design of laser spotter camera for ex-CCD sensor

    NASA Astrophysics Data System (ADS)

    Nautiyal, R. P.; Mishra, V. K.; Sharma, P. K.

    2015-06-01

Development of laser-based instruments such as laser range finders and laser designators has gained prominence in modern military applications. Aiming the laser at the target is done with a boresighted graticule, since the human eye cannot see the laser beam directly. Two types of detectors are available for viewing a laser spot, the InGaAs detector and the Ex-CCD detector, the latter being the cost-effective solution. In this paper the optics design for an Ex-CCD-based camera is discussed. The designed system is lightweight and compact and can see a 1064 nm pulsed laser spot up to a range of 5 km.

  16. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

The emergence of curved CCD detectors, as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras, represents a major optical design advancement that greatly enhances the scientific potential of such instruments. Changing the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the CCD's shape precisely to the contour of the telescope's curved focal plane, yields a major increase in the amount of light transmitted through the system at various wavelengths. This in turn enables multi-spectral, ultra-sensitive imaging with the much greater spatial resolution required for large and very large telescope applications, including infrared image acquisition and spectroscopy conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors simplifies the telescope's adjoining optics, reducing the number of optical elements and the optical aberrations associated with the large corrective optics needed to conform to flat detectors. New astronomical experiments may be devised around curved CCD applications, in conjunction with large-format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions.
Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associated wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than

  17. Initial laboratory evaluation of color video cameras: Phase 2

    SciTech Connect

    Terry, P.L.

    1993-07-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than to identify intruders. The monochrome cameras were selected over color cameras because they have greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Because color camera technology is rapidly changing and because color information is useful for identification purposes, Sandia National Laboratories has established an on-going program to evaluate the newest color solid-state cameras. Phase One of the Sandia program resulted in the SAND91-2579/1 report titled: Initial Laboratory Evaluation of Color Video Cameras. The report briefly discusses imager chips, color cameras, and monitors, describes the camera selection, details traditional test parameters and procedures, and gives the results reached by evaluating 12 cameras. Here, in Phase Two of the report, we tested 6 additional cameras using traditional methods. In addition, all 18 cameras were tested by newly developed methods. This Phase 2 report details those newly developed test parameters and procedures, and evaluates the results.

  18. Design and realization of an AEC&AGC system for the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Hao yun

    2015-08-01

An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. Conventional AEC and AGC algorithms are not suitable for an aerial camera, which always takes high-resolution photographs while moving at high speed. The AEC and AGC system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. An automatic gamma correction is applied before the image is output, making the image better suited to viewing and analysis by the human eye. The AEC and AGC system avoids underexposure, overexposure, and image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment, high adaptability, and high reliability in severe, complex environments.
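One plausible form of such a control law adjusts the shutter first and makes up the remainder with gain once the shutter hits its motion-blur limit. All parameters and the dB convention below are assumptions for illustration, not the paper's algorithm:

```python
import math

def aec_agc_step(mean_level, shutter_us, gain_db,
                 target=128.0, max_shutter_us=500.0, max_gain_db=18.0):
    """One control iteration: scale the electronic shutter toward the
    target mean brightness; once the shutter is pinned at the
    motion-blur limit, cover the remaining factor with analog gain."""
    ratio = target / max(mean_level, 1e-6)     # brightness factor wanted
    shutter = min(shutter_us * ratio, max_shutter_us)
    residual = shutter_us * ratio / shutter    # factor the shutter missed
    gain = gain_db + 20.0 * math.log10(residual)
    return shutter, min(max(gain, 0.0), max_gain_db)

# dark scene: shutter saturates at the blur limit, gain covers the rest
s, g = aec_agc_step(mean_level=32.0, shutter_us=200.0, gain_db=0.0)
```

In flight, `max_shutter_us` would shrink as aircraft speed rises, which is how the speed dependence described above enters the loop.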

  19. Scintillator-CCD camera system light output response to dosimetry parameters for proton beam range measurement

    NASA Astrophysics Data System (ADS)

    Daftari, Inder K.; Castaneda, Carlos M.; Essert, Timothy; Phillips, Theodore L.; Mishra, Kavita K.

    2012-09-01

The purpose of this study is to investigate the luminescence light output response of a plastic scintillator irradiated by a 67.5 MeV proton beam for various dosimetry parameters. The relationship of the visible scintillation light to the beam current (dose rate), aperture size and the thickness of water in the water column was studied. Images captured with a CCD camera system were used to determine optimal dosimetry parameters for measuring the range of a clinical proton beam. The method was developed as a simple quality-assurance tool to measure the range of the proton beam and compare it to (a) measurements using two segmented ionization chambers with a water column between them, and (b) ionization chamber (IC-18) measurements in water. We used a 5×5×5 cm³ block of plastic scintillator to record the visible light generated by the 67.5 MeV proton beam. A high-definition digital video camera (Moticam 2300), connected to a PC via a USB 2.0 channel, recorded images of the scintillation luminescence. The brightness of the visible light was measured while changing the beam current and aperture size, and the results were analyzed to obtain the range and compared with Bragg peak measurements from an ionization chamber. The luminescence light from the scintillator increased linearly with proton beam current, and also increased linearly with aperture size. The relationship between the proton range in the scintillator and the thickness of the water column showed good linearity, with a precision of 0.33 mm (SD) in proton range measurement. For the 67.5 MeV proton beam used, the optimal parameters for scintillator light output were found to be 15 nA (16 Gy/min) and an aperture size of 15 mm, with an image integration time of 100 ms. The Bragg peak depth-brightness distribution was compared with the depth-dose distribution from ionization chamber measurements and good agreement was observed. The peak
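Extracting the Bragg peak depth from a measured depth-brightness profile can be done with sub-sample parabolic interpolation around the brightest pixel. A sketch (an illustrative generic technique; the paper does not specify its peak-finding method):

```python
def bragg_peak_depth(depths, brightness):
    """Locate the Bragg peak in a depth-brightness profile by fitting
    a parabola through the brightest sample and its two neighbours."""
    i = max(range(len(brightness)), key=brightness.__getitem__)
    if i == 0 or i == len(brightness) - 1:
        return depths[i]  # peak at the edge: no interpolation possible
    y0, y1, y2 = brightness[i - 1], brightness[i], brightness[i + 1]
    # vertex offset (in samples) of the parabola through the 3 points
    offset = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    step = depths[i + 1] - depths[i]
    return depths[i] + offset * step

# synthetic profile peaked between the samples, at depth 3.2
depths = [float(d) for d in range(7)]
peak = bragg_peak_depth(depths, [10.0 - (d - 3.2) ** 2 for d in depths])
```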

  20. Grayscale adjustment method for CCD mosaic camera in surface defect detection system

    NASA Astrophysics Data System (ADS)

    Yan, Lu; Yang, Yongying; Wang, Xiaodan; Wang, Shitong; Cao, Pin; Li, Lu; Liu, Dong

    2014-09-01

Based on microscopic imaging and sub-aperture stitching, the surface defect detection system realizes automatic quantitative detection of submicron defects on the macroscopic surfaces of optical components, solving quality-control problems for the numerous large-aperture precision optics in ICF (Inertial Confinement Fusion) systems. To improve testing efficiency and reduce the number of sub-aperture images, a large-format CCD (charge-coupled device) camera is employed to expand the system's field of view. Large-format CCD cameras are usually mosaicked from multi-channel CCD chips, but differences among the intensity-to-grayscale responses of the channels lead to an obvious gray gap between regions of the image. This can shorten or break up defects during image binarization and thereby lead to misjudgment of defects. This paper analyzes the gray characteristics of such unbalanced images, establishes a gray model of the image pixels, and proposes a new method to correct the CCD's gray gap self-adaptively. First, the background threshold is set by solving for the inflection point of the pixel-level curve in the gray histogram of the original image, and the image background is obtained; second, pixels are sampled from the background to calculate the gray gap among different regions of the image; finally, the gray gap is compensated. The method was tested by adjusting 96 dual-channel images from the measurement of a fused silica sample with a 180 mm × 120 mm aperture. The results show that the gray gap between channels is reduced from 3.64 to 0.70 gray levels on average. The method can also be applied to other CCD mosaic cameras.
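The background-sampling compensation described might look like the following for a dual-channel (left/right) image; the fixed threshold and image layout are simplified assumptions (the paper derives the threshold from a histogram inflection point):

```python
def balance_dual_channel(image, bg_threshold):
    """Equalize the background gray level of the two readout channels
    (left/right halves) of a mosaicked CCD image.  Pixels at or below
    bg_threshold are treated as background; the right half is offset
    so that the two channel backgrounds match."""
    half = len(image[0]) // 2
    def bg_mean(cols):
        vals = [row[c] for row in image for c in cols
                if row[c] <= bg_threshold]
        return sum(vals) / len(vals)
    delta = bg_mean(range(half)) - bg_mean(range(half, len(image[0])))
    return [row[:half] + [v + delta for v in row[half:]] for row in image]

# a bright defect pixel (50) on a background with a 4-level gray gap
balanced = balance_dual_channel([[10, 10, 14, 14],
                                 [10, 50, 14, 14]], bg_threshold=20)
```

Offsetting the channels to a common background keeps a defect that straddles the channel seam from fracturing when the image is later binarized.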

  1. Deflection Measurements of a Thermally Simulated Nuclear Core Using a High-Resolution CCD-Camera

    NASA Technical Reports Server (NTRS)

    Stanojev, B. J.; Houts, M.

    2004-01-01

Space fission systems under consideration for near-term missions all use compact, fast-spectrum reactor cores. Reactor dimensional change with increasing temperature, which affects neutron leakage, is the dominant source of reactivity feedback in these systems. Accurately measuring core dimensional changes during realistic non-nuclear testing is therefore necessary for predicting the system's nuclear-equivalent behavior. This paper discusses one key technique being evaluated for measuring such changes. The proposed technique is to use a Charge-Coupled Device (CCD) sensor to obtain deformation readings of an electrically heated, prototypic reactor core geometry. This paper introduces a technique by which a single high-spatial-resolution CCD camera is used to measure core deformation in Real Time (RT). Initial system checkout results are presented, along with a discussion of how additional cameras could be used to achieve a three-dimensional deformation profile of the core during testing.

  2. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  3. Development of a portable 3CCD camera system for multispectral imaging of biological samples.

    PubMed

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  4. The development of a high-speed 100 fps CCD camera

    SciTech Connect

    Hoffberg, M.; Laird, R.; Lenkzsus, F. Liu, Chuande; Rodricks, B.; Gelbart, A.

    1996-09-01

This paper describes the development of a high-speed CCD digital camera system. The system has been designed to use CCDs from various manufacturers with minimal modifications. The first camera built on this design utilizes a Thomson 512 x 512 pixel CCD as its sensor, which is read out from two parallel outputs at a speed of 15 MHz/pixel/output. The data undergo correlated double sampling, after which they are digitized into 12 bits. The throughput of the system translates into 60 MB/second, which is either stored directly in a PC or transferred to a custom-designed VXI module. The PC data acquisition version of the camera can collect sustained data in real time, limited only by the memory installed in the PC. The VXI version of the camera, also controlled by a PC, stores 512 MB of real-time data before it must be read out to the PC's disk storage. The uncooled CCD can be used either with lenses for visible-light imaging or with a phosphor screen for x-ray imaging. This camera has been tested with a phosphor screen coupled to a fiber-optic face plate for high-resolution, high-speed x-ray imaging. The camera is controlled through a custom event-driven, user-friendly Windows package. The pixel clock speed can be changed from 1 MHz to 15 MHz. The noise was measured to be 1.05 bits at a 13.3 MHz pixel clock. This paper will describe the electronics, software, and characterizations that have been performed using both visible and x-ray photons.
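The quoted throughput and speed figures can be sanity-checked with a little arithmetic, assuming the 12-bit samples are stored in 16-bit words (which is what 60 MB/second at two 15 MHz outputs implies):

```python
# Back-of-envelope check of the quoted figures for the Thomson-CCD camera:
# two outputs read in parallel at 15 Mpixel/s each, 12-bit samples assumed
# padded to 16-bit (2-byte) words.
pixels = 512 * 512                    # frame size
pixel_rate = 2 * 15_000_000           # two parallel outputs, 15 MHz each
bytes_per_pixel = 2                   # 12 bits stored in a 16-bit word

throughput = pixel_rate * bytes_per_pixel  # bytes per second
frame_rate = pixel_rate / pixels           # frames per second

print(throughput / 1e6)   # matches the stated 60 MB/second
print(round(frame_rate))  # comfortably above the 100 fps of the title
```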

  5. Star-field identification algorithm. [for implementation on CCD-based imaging camera

    NASA Technical Reports Server (NTRS)

    Scholl, M. S.

    1993-01-01

    A description of a new star-field identification algorithm that is suitable for implementation on CCD-based imaging cameras is presented. The minimum identifiable star pattern element consists of an oriented star triplet defined by three stars, their celestial coordinates, and their visual magnitudes. The algorithm incorporates tolerance to faulty input data, errors in the reference catalog, and instrument-induced systematic errors.
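A toy sketch of the oriented-triplet idea, assuming planar positions as a stand-in for celestial coordinates and omitting the visual magnitudes and orientation sign the algorithm also uses; all names are illustrative, not the paper's method:

```python
import itertools
import math

def pair_distance(p, q):
    """Separation between two star positions (a planar stand-in for
    the angular separation on the sky)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def triplet_features(stars):
    """One feature per star triplet: the sorted pairwise separations.
    Sorting makes the feature invariant to rotation and star order."""
    feats = []
    for (i, a), (j, b), (k, c) in itertools.combinations(enumerate(stars), 3):
        seps = sorted([pair_distance(a, b), pair_distance(b, c),
                       pair_distance(a, c)])
        feats.append(((i, j, k), tuple(seps)))
    return feats

def match_triplet(feature, catalog, tol=1e-3):
    """Return catalog triplets whose separations all agree within tol,
    giving some tolerance to noisy input data and catalog errors."""
    return [ids for ids, seps in catalog
            if all(abs(s - f) <= tol for s, f in zip(seps, feature))]
```

Because the feature is built only from separations, a rotated view of the same star field still matches its catalog triplet.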

  6. Fused Six-Camera Video of STS-134 Launch

    NASA Video Gallery

    Imaging experts funded by the Space Shuttle Program and located at NASA's Ames Research Center prepared this video by merging nearly 20,000 photographs taken by a set of six cameras capturing 250 i...

  7. DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER OF THE MLP - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  8. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM ESOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  9. Station Cameras Capture New Videos of Hurricane Katia

    NASA Video Gallery

    Aboard the International Space Station, external cameras captured new video of Hurricane Katia as it moved northwest across the western Atlantic north of Puerto Rico at 10:35 a.m. EDT on September ...

  10. A USB 2.0 computer interface for the UCO/Lick CCD cameras

    NASA Astrophysics Data System (ADS)

    Wei, Mingzhi; Stover, Richard J.

    2004-09-01

The new UCO/Lick Observatory CCD camera uses a 200 MHz fiber optic cable to transmit image data and an RS232 serial line for low-speed bidirectional command and control. Increasingly, RS232 is a legacy interface supported on fewer computers. The fiber optic cable requires either a custom interface board, plugged into the mainboard of the image acquisition computer, that accepts the fiber directly, or an interface converter that translates the fiber data onto a widely used standard interface. We present here a simple USB 2.0 interface for the UCO/Lick camera. A single USB cable connects to the image acquisition computer, and the camera's RS232 serial and fiber optic cables plug into the USB interface. Since most computers now support USB 2.0, the Lick interface makes it possible to use the camera on essentially any modern computer that has the supporting software. No hardware modifications or additions to the computer are needed. The necessary device driver software has been written for the Linux operating system, which is now widely used at Lick Observatory. The complete data acquisition software for the Lick CCD camera runs on a variety of PC-style computers as well as an HP laptop.

  11. The ARGOS wavefront sensor pnCCD camera for an ELT: characteristics, limitations and applications

    NASA Astrophysics Data System (ADS)

    de Xivry, G. Orban; Ihle, S.; Ziegleder, J.; Barl, L.; Hartmann, R.; Rabien, S.; Soltau, H.; Strueder, L.

    2011-09-01

From low-order to high-order AO, future wavefront sensors on ELTs require large, fast, and low-noise detectors with high quantum efficiency and low dark current. While a detector for a high-order Shack-Hartmann WFS does not exist yet, current CCD technology pushed to its limits already provides several solutions for the ELT AO detector requirements. One of these devices is the new WFS pnCCD camera of ARGOS, the Ground-Layer Adaptive Optics (GLAO) system for LUCIFER at the LBT. Indeed, with its 264 x 264 pixels, 48 μm pixel size and 1 kHz frame rate, this camera provides a technological solution to different needs of the AO systems for ELTs, such as low-order correction but possibly also higher-order correction using pyramid wavefront sensing. In this contribution, we present the newly developed WFS pnCCD camera of ARGOS and how it fulfills the future detector needs of AO on ELTs.

  12. High-resolution image digitizing through 12x3-bit RGB-filtered CCD camera

    NASA Astrophysics Data System (ADS)

    Cheng, Andrew Y. S.; Pau, Michael C. Y.

    1996-09-01

A high-resolution computer-controlled CCD image capturing system has been developed using a 12-bit, 1024 by 1024 pixel CCD camera and motorized RGB filters to capture an image with color depth up to 36 bits. The filters distinguish the major components of color and collect them separately, while the CCD camera maintains the spatial resolution and detector filling factor; the color separation is thus done optically rather than electronically. Operation is simple: the objects to be captured, such as color photos, slides and even x-ray transparencies, are placed under the camera system, and the necessary parameters such as integration time, mixing level and light intensity are automatically adjusted by an on-line expert system. This greatly reduces the restrictions on what can be captured. This unique approach can save considerable time in adjusting the image quality and gives much more flexibility in manipulating the captured object, even a 3D object, with minimal setup fixtures. In addition, the cross-sectional dimensions of a 3D object can be analyzed by adapting a fiber-optic ring light source, which is particularly useful in non-contact metrology of a 3D structure. The digitized information can be stored in an easily transferable format, and users can also perform a special LUT mapping automatically or manually. Applications of the system include medical image archiving, printing quality control, 3D machine vision, etc.

  13. Accuracy potential of large-format still-video cameras

    NASA Astrophysics Data System (ADS)

    Maas, Hans-Gerd; Niederoest, Markus

    1997-07-01

High-resolution digital still-video cameras have found wide interest in digital close-range photogrammetry in the last five years. They can be considered fully autonomous digital image acquisition systems, without the requirement of a permanent connection to an external power supply and a host computer for camera control and data storage, thus allowing for convenient data acquisition in many applications of digital photogrammetry. The accuracy potential of still-video cameras has been extensively discussed. While large-format CCD sensors themselves can be considered very accurate measurement devices, the lenses, camera bodies and sensor mounts of still-video cameras are not necessarily as stable, and some cameras employ compression techniques in image storage, which may also affect the accuracy potential. This presentation shows recent experiences from accuracy tests with a number of large-format still-video cameras, including a modified Kodak DCS200, a Kodak DCS460, a Nikon E2 and a Polaroid PDC-2000. The tests of the cameras include absolute and relative measurements and were performed using strong photogrammetric networks and good external reference. The results of the tests indicate that very high accuracies can be achieved with large blocks of still-video imagery, especially in deformation measurements. In absolute measurements, however, the accuracy potential of the large-format CCD sensors is partly ruined by a lack of stability of the cameras.

  14. Modeling of the over-exposed pixel area of CCD cameras caused by laser dazzling

    NASA Astrophysics Data System (ADS)

    Benoist, Koen W.; Schleijpen, Ric H. M. A.

    2014-10-01

A simple model has been developed and implemented in Matlab code, predicting the over-exposed pixel area of cameras caused by laser dazzling. Inputs of this model are the laser irradiance on the front optics of the camera, the Point Spread Function (PSF) of the optics used, the integration time of the camera, and camera sensor specifications such as pixel size, quantum efficiency and full-well capacity. Effects of the read-out circuit of the camera are not incorporated. The model was evaluated with laser dazzle experiments on CCD cameras using a 532 nm CW laser dazzler and shows good agreement. For relatively low laser irradiance the model predicts the over-exposed laser spot area quite accurately and shows the cube-root dependency of spot diameter on laser irradiance caused by the PSF, as demonstrated before for IR cameras. For higher laser power levels the laser-induced spot diameter increases more rapidly than predicted, which can probably be attributed to scatter effects in the camera. Some first attempts to model scatter contributions, using a simple scatter power function f(θ), show good agreement with experiments. Using this model, a tool is available which can assess the performance of observation sensor systems while they are subjected to laser countermeasures.
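The cube-root dependence reported above falls out of a simple saturation argument: if the PSF tail decays as 1/r³, the radius at which the collected signal just fills the pixel well scales as the cube root of irradiance. A hypothetical sketch of that scaling (not the authors' Matlab model; all constants are illustrative):

```python
import math

def saturated_radius(irradiance, full_well, t_int, k=1.0):
    """Radius (in pixels) inside which the PSF tail fills the pixel well.
    A power-law PSF tail S(r) = k / r**3 is assumed, which reproduces the
    cube-root dependence of spot size on laser irradiance; saturation
    occurs where irradiance * t_int * k / r**3 >= full_well."""
    return (irradiance * t_int * k / full_well) ** (1.0 / 3.0)
```

With this tail, a 1000-fold increase in irradiance grows the over-exposed spot radius by only a factor of 10.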

  15. Research on simulation and verification system of satellite remote sensing camera video processor based on dual-FPGA

    NASA Astrophysics Data System (ADS)

    Ma, Fei; Liu, Qi; Cui, Xuenan

    2014-09-01

To satisfy the need for testing the video processors of satellite remote sensing cameras, a simulation and verification system for such video processors based on dual FPGAs is designed. The correctness of the video processor FPGA logic can be verified even without CCD signals or an analog-to-digital converter. Two Xilinx Virtex FPGAs are adopted to form the central unit, and the logic for A/D data generation and data processing is developed in VHDL. An RS-232 interface is used to receive commands from the host computer, and different types of data are generated and output depending on the commands. Experimental results show that the simulation and verification system is flexible and works well. The system meets the requirements for testing the video processors of several different types of satellite remote sensing cameras.

  16. Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems

    NASA Technical Reports Server (NTRS)

    Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.

    2011-01-01

    The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.

  17. Video camera system for locating bullet holes in targets at a ballistics tunnel

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Rummler, D. R.; Goad, W. K.

    1990-01-01

A system consisting of a single charge-coupled device (CCD) video camera, a computer-controlled video digitizer, and software to automate the measurement was developed to measure the location of bullet holes in targets at the International Shooters Development Fund (ISDF)/NASA Ballistics Tunnel. The camera/digitizer system is a crucial component of a highly instrumented indoor 50-meter rifle range which is being constructed to support the development of wind-resistant, ultra-match ammunition. The system was designed to take data rapidly (10 s between shots) and automatically, with little operator intervention. The system description, measurement concept, and procedure are presented along with laboratory tests of repeatability and bias error. The long-term (1 hour) repeatability of the system was found to be 4 microns (one standard deviation) at the target, and the bias error was found to be less than 50 microns. An analysis of potential errors and a technique for calibration of the system are presented.

  18. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  19. Demonstrations of Optical Spectra with a Video Camera

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2012-01-01

    The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera…

  20. Traceability of a CCD-Camera System for High-Temperature Measurements

    NASA Astrophysics Data System (ADS)

    Bünger, L.; Anhalt, K.; Taubert, R. D.; Krüger, U.; Schmidt, F.

    2015-08-01

A CCD camera, which has been specially equipped with narrow-band interference filters in the visible spectral range for temperature measurements above 1200 K, was characterized with respect to its temperature response traceable to ITS-90 and with respect to absolute spectral radiance responsivity. The calibration traceable to ITS-90 was performed at a high-temperature blackbody source using a radiation thermometer as a transfer standard. Use of Planck's law and the absolute spectral radiance responsivity of the camera system allows the determination of the thermodynamic temperature. For the determination of the absolute spectral radiance responsivity, a monochromator-based setup with a supercontinuum white-light laser source was developed. The CCD-camera system was characterized with respect to the dark-signal non-uniformity, the photo-response non-uniformity, the non-linearity, and the size-of-source effect. The influence of these parameters on the calibration and measurement was evaluated and is considered for the uncertainty budget. The results of the two different calibration schemes for the investigated temperature range from 1200 K to 1800 K are in good agreement considering the expanded uncertainty. The uncertainty for the absolute spectral responsivity of the camera is 0.56 %.
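The Planck's-law step described above can be sketched directly: with the absolute spectral radiance responsivity known, the radiance measured at the narrow-band filter wavelength is inverted for the thermodynamic temperature. A minimal sketch using an illustrative filter wavelength (the actual filters are not specified here):

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 299792458.0      # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength, temperature):
    """Blackbody spectral radiance L(lambda, T) in W m^-3 sr^-1."""
    a = 2.0 * H * C ** 2 / wavelength ** 5
    return a / math.expm1(H * C / (wavelength * KB * temperature))

def temperature_from_radiance(wavelength, radiance):
    """Invert Planck's law at a single (narrow-band filter) wavelength
    to obtain the thermodynamic temperature; exact inversion, no Wien
    approximation needed."""
    a = 2.0 * H * C ** 2 / wavelength ** 5
    return H * C / (wavelength * KB * math.log1p(a / radiance))
```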

  1. Data acquisition system based on the Nios II for a CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Binhua; Hu, Keliang; Wang, Chunrong; Liu, Yangbing; He, Chun

    2006-06-01

The FPGA with Avalon Bus architecture and Nios soft-core processor developed by Altera Corporation is an advanced embedded solution for control and interface systems. A CCD data acquisition system with an Ethernet terminal port based on the TCP/IP protocol has been implemented at NAOC. It is composed of an interface board with an Altera FPGA, 32 MB of SDRAM and some other accessory devices integrated on it, and two packages of control software used in the Nios II embedded processor and the remote host PC, respectively. The system is used to replace a 7200-series image acquisition card inserted in a control and data acquisition PC, to download commands to an existing CCD camera, and to collect image data from the camera to the PC. The embedded chip in the system is a Cyclone FPGA with a configurable Nios II soft-core processor. The hardware structure of the system, the configuration of the embedded soft-core processor, and the peripherals of the processor in the FPGA are described. The C program run in the Nios II embedded system is built with the Nios II IDE kit, and the C++ program used on the PC is developed in Microsoft's Visual C++ environment. Some key techniques in the design and implementation of the C and VC++ programs are presented, including the downloading of camera commands, initialization of the camera, DMA control, TCP/IP communication and UDP data uploading.

  2. Video Cameras in the Ondrejov Flare Spectrograph Results and Prospects

    NASA Astrophysics Data System (ADS)

    Kotrc, P.

Since 1991, video cameras have been widely used both in image and in spectral data acquisition with the Ondrejov Multichannel Flare Spectrograph. In addition to classical photographic data registration, this kind of detector brought new possibilities, especially for observations of dynamical solar phenomena, and put new requirements on digitization, archiving and data processing techniques. The unique complex video system consisting of four video cameras and auxiliary equipment was mostly developed, implemented and used at the Ondrejov observatory. The main advantages and limitations of the system are briefly described from the point of view of its scientific philosophy, intents and outputs. Some obtained results, experience and future prospects are discussed.

  3. Design of an Event-Driven Random-Access-Windowing CCD-Based Camera

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P.; Lam, Raymond K.; Portillo, Angel A.; Ortiz, Gerardo G.

    2003-01-01

Commercially available cameras are not designed for the combination of single-frame and high-speed streaming digital video with real-time control of the size and location of multiple regions of interest (ROIs). A new control paradigm is defined to eliminate the tight coupling between the camera logic and the host controller. This functionality is achieved by defining the indivisible pixel readout operation on a per-ROI basis, with in-camera timekeeping capability. This methodology provides a Random-Access, Real-Time, Event-driven (RARE) camera for adaptive camera control and is well suited for target-tracking applications requiring autonomous control of multiple ROIs. It additionally provides reduced ROI readout time and higher frame rates compared to the original architecture by avoiding external control intervention during the ROI readout process.
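A minimal sketch of the per-ROI indivisible readout with in-camera timekeeping; the structure and names are illustrative, not the paper's interface:

```python
import time
from dataclasses import dataclass

@dataclass
class ROI:
    x: int
    y: int
    w: int
    h: int

def read_rois(frame, rois, clock=time.monotonic):
    """Each ROI is read out as one indivisible operation and stamped by
    the camera's own clock, so the host may add, move or drop ROIs
    between reads without ever interrupting a read in progress."""
    out = []
    for roi in rois:  # one atomic readout per ROI
        pixels = [row[roi.x : roi.x + roi.w]
                  for row in frame[roi.y : roi.y + roi.h]]
        out.append((clock(), roi, pixels))  # in-camera timekeeping
    return out
```

Decoupling the ROI list from the readout loop is the point: the host only edits descriptors, never drives the pixel transfer itself.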

  4. Optical readout of a two phase liquid argon TPC using CCD camera and THGEMs

    NASA Astrophysics Data System (ADS)

    Mavrokoridis, K.; Ball, F.; Carroll, J.; Lazos, M.; McCormick, K. J.; Smith, N. A.; Touramanis, C.; Walker, J.

    2014-02-01

This paper presents a preliminary study into the use of CCDs to image secondary scintillation light generated by THick Gas Electron Multipliers (THGEMs) in a two-phase LAr TPC. A Sony ICX285AL CCD chip was mounted above a double THGEM in the gas phase of a 40-litre two-phase LAr TPC, with the majority of the camera electronics positioned externally via a feedthrough. An Am-241 source was mounted on a rotatable motion feedthrough, allowing the positioning of the alpha source either inside or outside of the field cage. A novel high-voltage feedthrough featuring LAr insulation was developed for and incorporated into the TPC design. Furthermore, a range of webcams was tested for operation in cryogenics as an internal detector monitoring tool. Of the webcams tested, the Microsoft HD-3000 (model no. 1456) was found to be superior in terms of noise and lowest operating temperature. In 1 ppm pure argon gas at ambient temperature and atmospheric pressure, the THGEM gain was ≈ 1000, and using a 1 ms exposure the CCD captured single alpha tracks. Successful operation of the CCD camera in two-phase cryogenic mode was also achieved. Using a 10 s exposure, a photograph of secondary scintillation light induced by the Am-241 source in LAr was captured for the first time.

  5. OCam with CCD220, the Fastest and Most Sensitive Camera to Date for AO Wavefront Sensing

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz Garcia, José Javier

    2011-03-01

For the first time, subelectron readout noise has been achieved with a camera dedicated to astronomical wavefront-sensing applications. The OCam system demonstrated this performance at a 1300 Hz frame rate and with a 240 × 240 pixel frame size. ESO and JRA2 OPTICON jointly funded e2v Technologies to develop a custom CCD for adaptive optics (AO) wavefront-sensing applications. The device, called CCD220, is a compact Peltier-cooled 240 × 240 pixel frame-transfer eight-output back-illuminated sensor using EMCCD technology. This article demonstrates, for the first time, subelectron readout noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e-/pixel/frame. It reports on the quantitative performance characterization of OCam and the CCD220, including readout noise, dark current, multiplication gain, quantum efficiency, and charge transfer efficiency. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks, and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, has been designed to offer enhanced performance, a completely sealed camera package, and an additional Peltier stage to facilitate operation on a telescope or in environmentally challenging applications. New features of OCam2 are presented in this article. This instrumental development will strongly impact the performance of the most advanced AO systems to come.
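Why EMCCD technology reaches subelectron readout noise can be shown in one line: the charge signal is multiplied before it reaches the output amplifier, so the amplifier's noise, referred back to the input, is divided by the multiplication gain. The numbers below are illustrative, not e2v's measured values:

```python
def effective_read_noise(amp_noise_e, em_gain):
    """Input-referred read noise of an EMCCD output: the amplifier
    noise (in electrons rms) divided by the multiplication gain
    applied before readout."""
    return amp_noise_e / em_gain

# With illustrative values, a 50 e- rms amplifier read at high speed
# still yields subelectron effective noise at a gain of a few hundred.
print(effective_read_noise(50.0, 400.0))  # 0.125
```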

  6. Design and realization of an image mosaic system on the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Wang, Peng; Zhu, Hai bin; Li, Yan; Zhang, Shao jun

    2015-08-01

Stitching the multi-route images from a multi-route-flight framing CCD camera into a panoramic image in real time has long been a difficulty in aerial photography, owing to the very large amount of data and high accuracy requirements. An automatic aerial image mosaic system based on a GPU development platform is described in this paper. Parallel computing of the SIFT feature extraction and matching algorithm module is achieved by using CUDA technology for motion model parameter estimation on the platform, which makes it possible to stitch multiple CCD images in real time. Aerial tests proved that the mosaic system meets the user's requirements, with 99% accuracy and a 30- to 50-fold speed improvement over a conventional mosaic system.

  7. A pnCCD-based, fast direct single electron imaging camera for TEM and STEM

    NASA Astrophysics Data System (ADS)

    Ryll, H.; Simson, M.; Hartmann, R.; Holl, P.; Huth, M.; Ihle, S.; Kondo, Y.; Kotula, P.; Liebel, A.; Müller-Caspary, K.; Rosenauer, A.; Sagawa, R.; Schmidt, J.; Soltau, H.; Strüder, L.

    2016-04-01

We report on a new camera based on a pnCCD sensor for applications in scanning transmission electron microscopy. Emerging new microscopy techniques demand improved detectors with regard to readout rate, sensitivity and radiation hardness, especially in scanning mode. The pnCCD is a 2D imaging sensor that meets these requirements. Its intrinsic radiation hardness permits direct detection of electrons. The pnCCD is read out at a rate of 1,150 frames per second with an image area of 264 x 264 pixels. In binning or windowing modes, the readout rate increases almost linearly, for example to 4,000 frames per second at 4x binning (264 x 66 pixels). Single electrons with energies from 300 keV down to 5 keV can be distinguished due to the high sensitivity of the detector. Three applications in scanning transmission electron microscopy are highlighted to demonstrate that the pnCCD satisfies experimental requirements, especially fast recording of 2D images. In the first application, 65,536 2D diffraction patterns were recorded in 70 s, and STEM images corresponding to the intensities of various diffraction peaks were reconstructed. In the second application, the microscope was operated in a Lorentz-like mode: magnetic domains were imaged over an area of 256 x 256 sample points in less than 37 seconds, for a total of 65,536 images, each with 264 x 132 pixels. Due to the information provided by the two-dimensional images, not only the amplitude but also the direction of the magnetic field could be determined. In the third application, millisecond images of a semiconductor nanostructure were recorded to determine the lattice strain in the sample. A speed-up in measurement time by a factor of 200 was achieved compared to a previously used camera system.
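The "almost linear" scaling of readout rate with binning can be checked against the quoted numbers, under the simplifying assumption that frame time is proportional to the number of (binned) rows read out and that fixed per-frame overhead is ignored:

```python
# Frame rate vs. row binning for the pnCCD: fewer binned rows per frame
# means fewer readout cycles, so the rate scales roughly with binning.
rows_full, fps_full = 264, 1150

def ideal_fps(binning):
    """Overhead-free frame rate at a given row-binning factor."""
    return fps_full * rows_full / (rows_full // binning)

for b in (1, 2, 4):
    print(b, rows_full // b, ideal_fps(b))
# 4x binning reads 264 x 66 pixels; the overhead-free 4600 fps bounds
# the reported 4000 fps, the shortfall being fixed per-frame costs --
# hence "almost linearly".
```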

  8. Development of the analog ASIC for multi-channel readout X-ray CCD camera

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Matsuura, Daisuke; Idehara, Toshihiro; Anabuki, Naohisa; Tsunemi, Hiroshi; Doty, John P.; Ikeda, Hirokazu; Katayama, Haruyoshi; Kitamura, Hisashi; Uchihori, Yukio

    2011-03-01

We report on the performance of an analog application-specific integrated circuit (ASIC) developed for the front-end electronics of the X-ray CCD camera system onboard the next X-ray astronomical satellite, ASTRO-H. It has four identical channels that simultaneously process the CCD signals. Its distinctive analog-to-digital conversion capability enables us to construct a CCD camera body that outputs only digital signals. In front-end electronics tests, it works properly with a low input noise of ≤ 30 μV at pixel rates below 100 kHz. The power consumption is sufficiently low, ~150 mW/chip. The input signal range of ±20 mV covers the effective energy range of a typical X-ray photon-counting CCD (up to 20 keV). The integrated non-linearity is 0.2%, similar to those of conventional CCDs in orbit. We also performed radiation tolerance tests against the total ionizing dose (TID) effect and single event effects. Irradiation tests using 60Co and a proton beam showed that the ASIC has sufficient tolerance against TID up to 200 krad, which far exceeds the dose expected during operation in a low-inclination low-earth orbit. Irradiation with Fe ions at a fluence of 5.2 × 10^8 ions/cm^2 resulted in no single event latchup (SEL), although there were some possible single event upsets. The threshold against SEL is higher than 1.68 MeV cm^2/mg, which is sufficiently high that SEL events should not be a major cause of instrument downtime in orbit.

  9. CQUEAN: New CCD Camera System For The Otto Struve Telescope At The McDonald Observatory

    NASA Astrophysics Data System (ADS)

    Pak, Soojong; Park, W.; Im, M.

    2012-01-01

    We describe the overall characteristics and performance of an optical CCD camera system, the Camera for QUasars in EArly uNiverse (CQUEAN), which has been in use at the 2.1 m Otto Struve Telescope of the McDonald Observatory since August 2010. CQUEAN was developed for follow-up imaging observations of near-infrared bright sources such as high-redshift quasar candidates (z > 4.5), gamma-ray bursts, brown dwarfs, and young stellar objects. For efficient observations of these red objects, CQUEAN has a science camera with a deep-depletion CCD chip. By employing an auto-guiding system and a focal reducer to enhance the field of view at the classical Cassegrain focus, we achieved stable guiding in 20-minute exposures, image quality with FWHM > 0.6 arcsec over the whole field (4.8 × 4.8 arcmin), and a limiting magnitude of z = 23.4 AB mag at 5-sigma with one hour of integration.

  10. An intensified/shuttered cooled CCD camera for dynamic proton radiography

    SciTech Connect

    Yates, G.J.; Albright, K.L.; Alrick, K.R.

    1998-12-31

    An intensified/shuttered cooled PC-based CCD camera system was designed and successfully fielded on proton radiography experiments at the Los Alamos National Laboratory LANSCE facility using 800-MeV protons. The four-camera detector system used front-illuminated full-frame CCD arrays (two of 1,024 x 1,024 pixels and two of 512 x 512 pixels) fiber-optically coupled to either 25-mm-diameter planar diode or microchannel-plate image intensifiers, which provided optical shuttering for time-resolved imaging of shock propagation in high explosives. The intensifiers also provided wavelength shifting and optical gain. Typical sequences, consisting of four images corresponding to consecutive exposures of about 500 ns duration for 40-ns proton burst images (from a fast scintillating fiber array) separated by approximately 1 microsecond, were taken during the radiography experiments. Camera design goals and measured performance characteristics, including resolution, dynamic range, responsivity, system detective quantum efficiency (DQE), and signal-to-noise ratio, are discussed.

  11. Court Reconstruction for Camera Calibration in Broadcast Basketball Videos.

    PubMed

    Wen, Pei-Chih; Cheng, Wei-Chih; Wang, Yu-Shuen; Chu, Hung-Kuo; Tang, Nick C; Liao, Hong-Yuan Mark

    2016-05-01

    We introduce a technique for calibrating camera motion in basketball videos. Our method transforms player positions to standard basketball court coordinates and enables applications such as tactical analysis and semantic basketball video retrieval. To achieve a robust calibration, we reconstruct the panoramic basketball court from a video and then warp the panoramic court to a standard one. As opposed to previous approaches, which individually detect the court lines and corners of each video frame, our technique considers all video frames simultaneously to achieve calibration; hence, it is robust to illumination changes and player occlusions. To demonstrate the feasibility of our technique, we present a stroke-based system that allows users to retrieve basketball videos. Our system tracks player trajectories in broadcast basketball videos and then rectifies the trajectories to a standard basketball court using our camera calibration method. Consequently, users can apply stroke queries to indicate how the players move in gameplay during retrieval. The main advantage of this interface is that it enables explicit queries of basketball videos, so that unwanted outcomes can be prevented. We show the results in Figs. 1, 7, 9, 10 and our accompanying video to exhibit the feasibility of our technique. PMID:27504515

  12. Upwelling radiance at 976 nm measured from space using the OPALS CCD camera on the ISS

    NASA Astrophysics Data System (ADS)

    Biswas, Abhijit; Kovalik, Joseph M.; Oaida, Bogdan V.; Abrahamson, Matthew; Wright, Malcolm W.

    2015-03-01

    The Optical Payload for Lasercomm Science (OPALS) flight system on board the International Space Station uses a charge-coupled device (CCD) camera to detect a beacon laser from Earth. Relative measurements of the background contributed by upwelling radiance under diverse illumination conditions and varying surface terrain are presented. In some cases, clouds in the field of view allowed a comparison of terrestrial and cloud-top upwelling radiance. In this paper we report these measurements and examine the extent of their agreement with atmospheric model predictions.

  13. The measurement of astronomical parallaxes with CCD imaging cameras on small telescopes

    SciTech Connect

    Ratcliff, S.J.; Balonek, T.J.; Marschall, L.A.; DuPuy, D.L.; Pennypacker, C.R.; Verma, R.; Alexov, A.; Bonney, V.

    1993-03-01

    Small telescopes equipped with charge-coupled device (CCD) imaging cameras are well suited to introductory laboratory exercises in positional astronomy (astrometry). An elegant example is the determination of the parallax of extraterrestrial objects, such as asteroids. For laboratory exercises suitable for introductory students, the astronomical hardware needs are relatively modest, and, under the best circumstances, the analysis requires little more than arithmetic and a microcomputer with image display capabilities. Results from the first such coordinated parallax observations of asteroids ever made are presented. In addition, procedures for several related experiments, involving single-site observations and/or parallaxes of earth-orbiting artificial satellites, are outlined.

  14. The Laboratory Radiometric Calibration of the CCD Stereo Camera for the Optical Payload of the Lunar Explorer Project

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Li, Chun-Lai; Zhao, Bao-Chang

    2007-03-01

    The optical payload system for the Lunar Explorer includes a CCD stereo camera and an imaging interferometer. The former is designed to obtain stereo images of the lunar surface, working together with a laser altimeter. The camera's working principle, the purpose and content of the calibration, bare-chip testing, and the processes of relative and absolute calibration in the laboratory are introduced.

  15. Digital monochrome CCD camera for robust pixel correspondence, data compression, and preprocessing in an integrated PC-based image-processing environment

    NASA Astrophysics Data System (ADS)

    Arshad, Norhashim M.; Harvey, David M.; Hobson, Clifford A.

    1996-12-01

    This paper describes the development of a compact digital CCD camera which performs image digitization and processing and interfaces to a personal computer (PC) via a standard enhanced parallel port. Precise digitization of pixel samples, coupled with a single-chip FPGA for data processing, forms the main digital stage of the camera before data are sent to the PC. A compression scheme is applied so that the digital images can be transferred within the existing parallel-port bandwidth. The data are decompressed in the PC environment for real-time display of the video images using only native processor resources. Frame capture is built into the camera so that a full uncompressed digital image can be sent for special processing.

  16. ULTRASPEC: an electron multiplication CCD camera for very low light level high speed astronomical spectrometry

    NASA Astrophysics Data System (ADS)

    Ives, Derek; Bezawada, Nagaraja; Dhillon, Vik; Marsh, Tom

    2008-07-01

    We present the design, characteristics and astronomical results for ULTRASPEC, a high-speed electron-multiplication CCD (EMCCD) camera using an E2V CCD201 (a 1K frame-transfer device), developed to prove the performance of this new optical detector technology in astronomical spectrometry, particularly in the high-speed, low-light-level regime. We present both modelled and real data for these detectors, with particular regard to avalanche gain and clock-induced charge (CIC). We present first-light results from the camera as used on the EFOSC-2 instrument at the ESO 3.6-metre telescope at La Silla. We also present the design for a proposed new 4K x 2K frame-transfer EMCCD.

  17. Synchronizing Light Pulses With Video Camera

    NASA Technical Reports Server (NTRS)

    Kalshoven, James E., Jr.; Tierney, Michael; Dabney, Philip

    1993-01-01

    Interface circuit triggers laser or other external source of light to flash in proper frame and field (at proper time) for video recording and playback in "pause" mode. Also increases speed of electronic shutter (if any) during affected frame to reduce visibility of background illumination relative to that of laser illumination.

  18. LAIWO: a new wide-field CCD camera for Wise Observatory

    NASA Astrophysics Data System (ADS)

    Baumeister, Harald; Afonso, Cristina; Marien, Karl-Heinz; Klein, Ralf

    2006-06-01

    LAIWO is a new CCD wide-field camera for the 40-inch Ritchey-Chretien telescope at the Wise Observatory in Mitzpe Ramon, Israel. The telescope is identical to the 40-inch telescope at Las Campanas Observatory, Chile, which is described in [2]. LAIWO was designed and built at the Max Planck Institute for Astronomy in Heidelberg, Germany. The scientific aim of the instrument is to detect Jupiter-sized extrasolar planets around I = 14-15 magnitude stars with the transit method, which relies on the temporary drop in brightness of the parent star harboring the planet. LAIWO observes a 1.4 x 1.4 degree field of view with four CCDs of 4096 x 4096 pixels each. The Fairchild Imaging CCDs have a pixel size of 15 microns. Since they are not two-side buttable, they are arranged with a spacing between the chips equal to the size of a single CCD minus a small overlap. The CCDs are cooled by liquid nitrogen to a temperature of about -100 °C. The four science CCDs and the guider CCD are mounted on a common cryogenic plate which can be adjusted in three degrees of freedom; each of these detectors can also be adjusted independently by a similar mechanism. The instrument contains large shutter and filter mechanisms, both designed in a modular way for fast exchange and easy maintenance.

  19. Origins of the instrumental background of the x-ray CCD camera in space studied with Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Murakami, Hiroshi; Kitsunezuka, Masaki; Ozaki, Masanobu; Dotani, Tadayasu; Anada, Takayasu

    2006-06-01

    We report on the origins of the instrumental background of X-ray CCD cameras in space, obtained from Monte Carlo simulations with GEANT4. In the space environment, a CCD detects many non-X-ray events, which are produced by the interactions of high-energy particles with the materials surrounding the CCD. Most of these events are rejected through analysis of the charge-split pattern, but some remain as background. Such instrumental background needs to be reduced to achieve higher sensitivity, especially above several keV. We simulated the interactions of cosmic rays with the CCD housing and extracted the background events that escaped the charge-split-pattern screening. The Monte Carlo simulation reproduced the observed spectral shape of the instrumental background of the Suzaku XIS on orbit, which means the simulation successfully duplicates the background production process in space. From the simulation, we found that the major component of the background in the front-illuminated CCD is recoil electrons produced by Compton scattering of hard X-ray photons in the CCD. For the back-illuminated CCD, on the other hand, the dominant contribution comes from low-energy electrons produced by the interactions of cosmic-ray protons or hard X-rays with the housing. These results may be important for the design of X-ray CCD cameras for future missions, such as NeXT.

  20. CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli

    2011-02-01

    Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebral vasculature and the third leading cause of death in the United States. More than 80 percent of strokes are ischemic, caused by blockage of an artery in the brain by thrombosis or arterial embolism. Hence, developing an imaging technique to monitor cerebral ischemia and the effect of anti-stroke therapy is highly desirable. Near-infrared (NIR) optical tomography has great potential as a non-invasive imaging tool (due to its low cost and portability) for imaging embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebrovascular hemodynamics and brain injury. Compared to a fiber-based diffuse optical tomography system, a CCD-camera-based system is more suitable for preclinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions using tissue-phantom experimental data, obtaining good reconstructed images with two recently developed algorithms: (1) a depth compensation algorithm (DCA) and (2) a globally convergent method (GCM). We demonstrate volumetric tomographic reconstructions from the tissue phantom; the approach has great potential for determining and monitoring the effect of anti-stroke therapies.

  1. Performance of a slow-scan CCD camera for macromolecular imaging in a 400 kV electron cryomicroscope.

    PubMed

    Sherman, M B; Brink, J; Chiu, W

    1996-04-01

    The feasibility and limitations of a 1024 x 1024 slow-scan charge-coupled device (CCD) camera were evaluated for imaging in a 400 kV electron cryomicroscope. Catalase crystals and amorphous carbon film were used as test specimens. Using catalase crystals, it was found that the finite (24 micron) pixel size of the slow-scan CCD camera governs the ultimate resolution in the acquired images. For instance, spot-scan images of ice-embedded catalase crystals showed resolutions of 8 A and 4 A at effective magnifications of 67,000x and 132,000x, respectively. Using an amorphous carbon film, the damping effect of the modulation transfer function (MTF) of the slow-scan CCD camera on the specimen's Fourier spectrum, relative to that of photographic film, was evaluated. The MTF of the slow-scan CCD camera fell off more rapidly than that of photographic film, reaching a value of 0.2 at the Nyquist frequency. Despite this attenuation, the signal-to-noise ratio of the CCD data, as determined from reflections of negatively stained catalase crystals, decreased only to approximately 50% of that of the photographic film data. The phases computed from images of the same negatively stained catalase crystals recorded consecutively on both the slow-scan CCD camera and photographic film were found to agree within 12 degrees. Ways of minimizing the effect of the MTF of the slow-scan CCD camera on the acquired images are also presented. PMID:8858867

  2. A toolkit for the characterization of CCD cameras for transmission electron microscopy.

    PubMed

    Vulovic, M; Rieger, B; van Vliet, L J; Koster, A J; Ravelli, R B G

    2010-01-01

    Charge-coupled devices (CCD) are nowadays commonly utilized in transmission electron microscopy (TEM) for applications in life sciences. Direct access to digitized images has revolutionized the use of electron microscopy, sparking developments such as automated collection of tomographic data, focal series, random conical tilt pairs and ultralarge single-particle data sets. Nevertheless, for ultrahigh-resolution work photographic plates are often still preferred. In the ideal case, the quality of the recorded image of a vitrified biological sample would solely be determined by the counting statistics of the limited electron dose the sample can withstand before beam-induced alterations dominate. Unfortunately, the image is degraded by the non-ideal point-spread function of the detector, as a result of a scintillator coupled by fibre optics to a CCD, and the addition of several inherent noise components. Different detector manufacturers provide different types of figures of merit when advertising the quality of their detector. It is hard for most laboratories to verify whether all of the anticipated specifications are met. In this report, a set of algorithms is presented to characterize on-axis slow-scan large-area CCD-based TEM detectors. These tools have been added to a publicly available image-processing toolbox for MATLAB. Three in-house CCD cameras were carefully characterized, yielding, among others, statistics for hot and bad pixels, the modulation transfer function, the conversion factor, the effective gain and the detective quantum efficiency. These statistics will aid data-collection strategy programs and provide prior information for quantitative imaging. The relative performance of the characterized detectors is discussed and a comparison is made with similar detectors that are used in the field of X-ray crystallography. PMID:20057054

  3. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument currently being developed by NASA's Marshall Space Flight Center (MSFC), the National Astronomical Observatory of Japan (NAOJ), and other partners. The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-alpha UV (121.6 nm) light emitted by the Sun's chromosphere. The polarized spectrum imaged by the CCD cameras will capture information about the local magnetic field, allowing measurements of magnetic strength and structure. In order to measure this effect accurately, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements: quantum efficiency, gain, dark current, read noise, and linearity must each satisfy predetermined limits to achieve satisfactory performance for the mission. The cameras must operate with a gain of 2.0 ± 0.5 e-/DN, a read noise of less than 25 e-, a dark current of less than 10 e-/pixel/s, and a residual non-linearity of less than 1%. Determining these characteristics involves a series of tests with each camera in a high-vacuum environment. Here we present the methods and results of these performance tests for the CLASP flight cameras.
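A camera gain in e-/DN such as the one specified above is commonly verified with the photon-transfer method. The following is a minimal sketch under the assumption of shot-noise-limited flat fields; it does not model read noise or the CLASP team's actual procedure:

```python
import numpy as np

def photon_transfer_gain(flat1, flat2, bias=0.0):
    """Estimate camera gain in e-/DN from a pair of identical flat-field
    exposures. Differencing the two frames cancels fixed-pattern noise;
    half the variance of the difference is the per-frame temporal variance.
    For shot-noise-limited signal, gain = mean / variance (both in DN)."""
    f1 = flat1.astype(float)
    f2 = flat2.astype(float)
    mean_dn = (f1.mean() + f2.mean()) / 2.0 - bias
    var_dn = np.var(f1 - f2) / 2.0
    return mean_dn / var_dn

# Synthetic check: simulate Poisson photoelectrons at a known gain of 2 e-/DN.
rng = np.random.default_rng(0)
gain = 2.0
electrons1 = rng.poisson(10000, size=(256, 256))
electrons2 = rng.poisson(10000, size=(256, 256))
estimate = photon_transfer_gain(electrons1 / gain, electrons2 / gain)  # ~2.0
```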

  4. Using a Digital Video Camera to Study Motion

    ERIC Educational Resources Information Center

    Abisdris, Gil; Phaneuf, Alain

    2007-01-01

    To illustrate how a digital video camera can be used to analyze various types of motion, this simple activity analyzes the motion and measures the acceleration due to gravity of a basketball in free fall. Although many excellent commercially available data loggers and software can accomplish this task, this activity requires almost no financial…

  5. 67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST OF ASSISTANT LAUNCH CONDUCTOR PANEL SHOWN IN CA-133-1-A-66 - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  6. Compact 3D flash lidar video cameras and applications

    NASA Astrophysics Data System (ADS)

    Stettner, Roger

    2010-04-01

    The theory and operation of Advanced Scientific Concepts, Inc.'s (ASC) latest compact 3D Flash LIDAR Video Cameras (3D FLVCs) and a growing number of technical problems and solutions are discussed. The solutions range from space shuttle docking, planetary entry, descent and landing, surveillance, and autonomous and manned ground vehicle navigation to 3D imaging through particle obscurants.

  7. Lights, Camera, Action! Using Video Recordings to Evaluate Teachers

    ERIC Educational Resources Information Center

    Petrilli, Michael J.

    2011-01-01

    Teachers and their unions do not want test scores to count for everything; classroom observations are key, too. But planning a couple of visits from the principal is hardly sufficient. These visits may "change the teacher's behavior"; furthermore, principals may not be the best judges of effective teaching. So why not put video cameras in…

  8. CameraCast: flexible access to remote video sensors

    NASA Astrophysics Data System (ADS)

    Kong, Jiantao; Ganev, Ivan; Schwan, Karsten; Widener, Patrick

    2007-01-01

    New applications like remote surveillance and online environmental or traffic monitoring are making it increasingly important to provide flexible and protected access to remote video sensor devices. Current systems use application-level codes like web-based solutions to provide such access. This requires adherence to user-level APIs provided by such services, access to remote video information through given application-specific service and server topologies, and that the data being captured and distributed is manipulated by third party service codes. CameraCast is a simple, easily used system-level solution to remote video access. It provides a logical device API so that an application can identically operate on local vs. remote video sensor devices, using its own service and server topologies. In addition, the application can take advantage of API enhancements to protect remote video information, using a capability-based model for differential data protection that offers fine grain control over the information made available to specific codes or machines, thereby limiting their ability to violate privacy or security constraints. Experimental evaluations of CameraCast show that the performance of accessing remote video information approximates that of accesses to local devices, given sufficient networking resources. High performance is also attained when protection restrictions are enforced, due to an efficient kernel-level realization of differential data protection.

  9. Stereo Imaging Velocimetry Technique Using Standard Off-the-Shelf CCD Cameras

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2004-01-01

    Stereo imaging velocimetry is a fluid physics technique for measuring three-dimensional (3D) velocities at a plurality of points. This technique provides full-field 3D analysis of any optically clear fluid or gas experiment seeded with tracer particles. Unlike current 3D particle imaging velocimetry systems that rely primarily on laser-based systems, stereo imaging velocimetry uses standard off-the-shelf charge-coupled device (CCD) cameras to provide accurate and reproducible 3D velocity profiles for experiments that require 3D analysis. Using two cameras aligned orthogonally, we present a closed mathematical solution resulting in an accurate 3D approximation of the observation volume. The stereo imaging velocimetry technique is divided into four phases: 3D camera calibration, particle overlap decomposition, particle tracking, and stereo matching. Each phase is explained in detail. In addition to being utilized for space shuttle experiments, stereo imaging velocimetry has been applied to the fields of fluid physics, bioscience, and colloidal microscopy.
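The geometric idea behind two orthogonally aligned cameras can be illustrated with a toy sketch. This is an assumption-laden simplification for ideal geometry, not the paper's closed-form calibration, which must also handle real camera placement, lens effects, and the particle-matching phases described above:

```python
def reconstruct_3d(xy_view, yz_view):
    """Idealized orthogonal-camera geometry: camera 1 looks along +z and
    images the (x, y) plane; camera 2 looks along +x and images the (y, z)
    plane. The y coordinate, seen by both cameras, is averaged to reduce
    measurement noise; it also serves as the stereo-matching cue, since a
    correctly matched particle must show consistent y in both views."""
    x, y_from_cam1 = xy_view
    y_from_cam2, z = yz_view
    return (x, (y_from_cam1 + y_from_cam2) / 2.0, z)

# A tracer particle at (1.0, 2.0, 3.0) appears at (1.0, 2.0) in camera 1
# and at (2.0, 3.0) in camera 2.
point = reconstruct_3d((1.0, 2.0), (2.0, 3.0))  # -> (1.0, 2.0, 3.0)
```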

  10. Charge-coupled device (CCD) television camera for NASA's Galileo mission to Jupiter

    NASA Technical Reports Server (NTRS)

    Klaasen, K. P.; Clary, M. C.; Janesick, J. R.

    1982-01-01

    The CCD detector under construction for use in the slow-scan television camera for the NASA Galileo Jupiter orbiter, to be launched in 1985, is presented. The science objectives and the design constraints imposed by the Earth telemetry link, platform residual motion, and the Jovian radiation environment are discussed. Camera optics are inherited from Voyager; filter wavelengths are chosen to enable discrimination of Galilean-satellite surface chemical composition. The CCD design, an 800 x 800-element 'virtual-phase' solid-state silicon image-sensor array with supporting electronics, is described, with detailed discussion of thermally generated dark current, quantum efficiency, signal-to-noise ratio, and resolution. Tests of the effect of ionizing radiation were performed and are analyzed statistically. An imaging mode using a 2-1/3-sec frame time and on-chip summation of the signal in 2 x 2 blocks of adjacent pixels is designed to limit the effects of the most extreme Jovian radiation. Smearing due to spacecraft/target relative velocity and platform instability will be corrected via an algorithm maximizing spatial resolution at a given signal-to-noise level. The camera is expected to produce 40,000 images of Jupiter and its satellites during the 20-month mission.

  11. Benchmarking of Back Thinned 512x512 X-ray CCD Camera Measurements with DEF X-ray film

    NASA Astrophysics Data System (ADS)

    Shambo, N. A.; Workman, J.; Kyrala, G.; Hurry, T.; Gonzales, R.; Evans, S. C.

    1999-11-01

    Using the Trident laser facility at Los Alamos National Laboratory, 25-micron-thick, 2-mm-diameter titanium disks were shot with 527 nm (green) laser light to measure X-ray yield. Aluminum steps of 1.0 mil and 0.5 mil were used to test the linearity of the CCD camera, and DEF X-ray film was used to test the calibration of the CCD camera response at 4.75 keV. Both the laser spot size and the incident laser intensity were constrained to ensure consistency of the experimental data. This poster discusses both the experimental design and the results.

  12. Retrieval of the optical depth using an all-sky CCD camera.

    PubMed

    Olmo, Francisco J; Cazorla, Alberto; Alados-Arboledas, Lucas; López-Alvarez, Miguel A; Hernández-Andrés, Javier; Romero, Javier

    2008-12-01

    A new method is presented for retrieval of the aerosol and cloud optical depth using a CCD camera equipped with a fish-eye lens (an all-sky imager system). In a first step, the proposed method retrieves the spectral radiance from sky images acquired by the all-sky imager using a linear pseudoinverse algorithm. The aerosol or cloud optical depth at 500 nm is then obtained as the value that minimizes the residuals between the zenith spectral radiance retrieved from the sky images and that estimated by a radiative transfer code. The method is tested under extreme situations, including the presence of nonspherical aerosol particles. A comparison of optical depths derived from the all-sky imager with those retrieved by a sunphotometer operated side by side shows differences similar to the nominal error claimed for aerosol optical depth retrievals from sunphotometer networks. PMID:19037341
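The residual-minimization step of such a retrieval can be sketched with a toy forward model. The exponential stand-in below is an assumption for illustration; the actual method compares against zenith radiances computed by a full radiative transfer code:

```python
import math

def retrieve_optical_depth(measured_zenith_radiance, model, taus):
    """Pick the optical depth whose modeled zenith radiance best matches
    the measurement, by minimizing the squared residual over a candidate
    grid. `model` stands in for a radiative transfer code."""
    return min(taus, key=lambda t: (model(t) - measured_zenith_radiance) ** 2)

# Toy forward model: radiance decays exponentially with optical depth.
model = lambda tau: math.exp(-tau)
measured = math.exp(-0.3)
taus = [i / 100 for i in range(101)]
best_tau = retrieve_optical_depth(measured, model, taus)  # -> 0.3
```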

  13. Radiometric calibration of frame transfer CCD camera with uniform source system

    NASA Astrophysics Data System (ADS)

    Zhou, Jiankang; Shi, Rongbao; Chen, Yuheng; Zhou, Yuying; Shen, Weimin

    2010-08-01

    This paper presents a radiometric calibration method based on the visibility function and a uniform source system. The uniform source system mainly comprises an integrating sphere and a monitoring silicon detector. The current of the silicon detector, fitted with a visibility-function filter, corresponds to the luminance at the exit port of the integrating sphere through transfer from a standard luminance meter. The radiance at the camera entrance pupil is calculated for different solar zenith angles and Earth surface albedos with the MODTRAN atmospheric code. To simplify the calibration process, the radiance at the entrance pupil is integrated over the visibility function. The shift smear of the frame-transfer CCD is removed through the radiometric calibration, and a correction ratio factor is introduced into the retrieval method. An imaging experiment verifies the reliability of the calibration method and retrieves a good-quality image.

  14. A reflectance model for non-contact mapping of venous oxygen saturation using a CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Jun; Dunmire, Barbrina; Beach, Kirk W.; Leotta, Daniel F.

    2013-11-01

    A method of non-contact mapping of venous oxygen saturation (SvO2) is presented. A CCD camera is used to image skin tissue illuminated alternately by a red (660 nm) and an infrared (800 nm) LED light source. Low cuff pressures of 30-40 mmHg are applied to induce a venous blood volume change with negligible change in the arterial blood volume. A hybrid model combining the Beer-Lambert law and the light diffusion model is developed and used to convert the change in light intensity to the change in the skin tissue absorption coefficient. A simulation study incorporating the full light diffusion model is used to verify the hybrid model and to correct a calculation bias. SvO2 values in the fingers, palm, and forearm of five volunteers are presented and compared with results in the published literature. Two-dimensional maps of venous oxygen saturation are given for the three anatomical regions.
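The final step, turning two-wavelength absorption changes into a saturation value, can be sketched as a small Beer-Lambert inversion. The extinction coefficients below are illustrative placeholders in arbitrary consistent units, not the paper's calibration data, and the sketch omits the diffusion-model correction described above:

```python
import numpy as np

# Illustrative extinction coefficients (eps_HbO2, eps_Hb) at each wavelength;
# placeholder values, NOT the paper's calibration data.
EPS = {
    660: (320.0, 3227.0),
    800: (816.0, 762.0),
}

def venous_saturation(d_mua_660, d_mua_800):
    """Solve the two-wavelength Beer-Lambert system for the occlusion-induced
    changes in oxy- and deoxyhemoglobin concentration, then form
    SvO2 = dHbO2 / (dHbO2 + dHb)."""
    A = np.array([EPS[660], EPS[800]], dtype=float)
    d_hbo2, d_hb = np.linalg.solve(A, [d_mua_660, d_mua_800])
    return d_hbo2 / (d_hbo2 + d_hb)

# Round trip: a 70%-saturated venous volume change should be recovered.
d_hbo2, d_hb = 0.7, 0.3
m660 = EPS[660][0] * d_hbo2 + EPS[660][1] * d_hb
m800 = EPS[800][0] * d_hbo2 + EPS[800][1] * d_hb
svo2 = venous_saturation(m660, m800)  # ~0.7
```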

  15. A novel method to measure the ambient aerosol phase function based on dual ccd-camera

    NASA Astrophysics Data System (ADS)

    Bian, Yuxuan; Zhao, Chunsheng; Tao, Jiangchuan; Kuang, Ye; Zhao, Gang

    2016-04-01

    The aerosol scattering phase function is a measure of the light intensity scattered from particles as a function of scattering angle. It is important for understanding aerosol climate effects and for remote sensing inversion analysis. In this study, a novel method to measure the ambient aerosol phase function is developed, based on a dual charge-coupled device (CCD) camera laser detection system. An integrating nephelometer is used to correct the inversion result. The instrument was validated with both field and laboratory measurements of atmospheric aerosols. A Mie theory model, driven by measurements of the particle number size distribution and the mass concentration of black carbon, was used to simulate the aerosol phase function for comparison with values from the instrument. The comparison shows good consistency.

  16. Improvement of relief algorithm to prevent inpatient's downfall accident with night-vision CCD camera

    NASA Astrophysics Data System (ADS)

    Matsuda, Noriyuki; Yamamoto, Takeshi; Miwa, Masafumi; Nukumi, Shinobu; Mori, Kumiko; Kuinose, Yuko; Maeda, Etuko; Miura, Hirokazu; Taki, Hirokazu; Hori, Satoshi; Abe, Norihiro

    2005-12-01

    "ROSAI" hospital in Wakayama City, Japan, reported that falls from beds are among the most serious accidents occurring in the hospital at night, and many inpatients have suffered serious injuries in such falls. To prevent these accidents, the hospital tested several sensors in a sickroom to send a warning signal of an inpatient's fall to a nurse. However, the system sent too many false warnings about inpatients' sleeping state. To give a nurse useful information, precise automatic detection of an inpatient's sleeping situation is necessary. In this paper, we focus on a clustering algorithm that evaluates an inpatient's situation from multiple angles using several kinds of sensors, including a night-vision CCD camera. The paper presents a new relief algorithm that addresses the weakness of the previous approach in exceptional cases.

  17. Construction and Use of the CCD Camera on the Automated Patrol Telescope.

    NASA Astrophysics Data System (ADS)

    Brooks, Paul Westley

    This thesis describes the construction, commissioning and use of a Charge-Coupled Device (CCD) detector system on the Automated Patrol Telescope--a converted Baker-Nunn satellite tracking camera donated to the School of Physics at the University of New South Wales. The work is divided into three distinct areas, entitled "Hardware", "Software" and "Observations". "Hardware" covers the construction, operation and measurement of the electronic performance of the CCD camera and the CCD/telescope combination thus formed. The evolution of the system to a working configuration producing images of scientific quality is presented. "Software" describes two distinct applications and software libraries that were developed for IBM-compatible PC computers running the MS-DOS operating system. One of these is IMLIB, a generic image processing package that implements the low-level details of accessing and viewing FITS-format two-dimensional images, which is the standard format for transferring astronomical images between institutions. The other is the APT CONSOLE, which provides the user-interface for operating and controlling the APT sub-systems, from source selection and tracking to image acquisition, and provides the foundations for operation of the APT in an unattended, automated mode. "Observations" covers the observation programme and research conducted with the APT during testing. The optical and photometric performance of the telescope is measured using the Harvard E-region photometric standard star fields. Research into algorithms and methods for use in transient object detection, particularly in searches for supernovae, novae and/or asteroids, is described using asteroids as "test novae". A supernova search atlas containing images of nearby galaxies for use as comparison standards was begun. Mosaic images of the Small Magellanic Cloud (SMC) in B, V and Kron-Cousins R, I bands, covering twenty-five square degrees with eight arc-second resolution, demonstrate the power of

  18. Unmanned Vehicle Guidance Using Video Camera/Vehicle Model

    NASA Technical Reports Server (NTRS)

    Sutherland, T.

    1999-01-01

    A video guidance sensor (VGS) system has flown on both STS-87 and STS-95 to validate a single camera/target concept for vehicle navigation. The main part of the image algorithm was the subtraction of two consecutive images using software. For a nominal size image of 256 x 256 pixels this subtraction can take a large portion of the time between successive frames in standard rate video leaving very little time for other computations. The purpose of this project was to integrate the software subtraction into hardware to speed up the subtraction process and allow for more complex algorithms to be performed, both in hardware and software.
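    The software step described above, subtracting two consecutive 256 x 256 frames, can be sketched in a few lines of NumPy (toy frames and a hypothetical change threshold, not the VGS flight code):

```python
import numpy as np

def frame_difference(prev, curr, threshold=20):
    """Absolute difference of two consecutive 8-bit frames, thresholded
    to a binary change mask (the step the project moved into hardware)."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Toy 256x256 frames: a bright 'target' blob moves by a few pixels.
prev = np.zeros((256, 256), np.uint8)
curr = np.zeros((256, 256), np.uint8)
prev[100:110, 100:110] = 200
curr[103:113, 103:113] = 200
mask = frame_difference(prev, curr)
print(int(mask.sum()))  # changed pixels: 102 (symmetric difference of the blobs)
```

    In a software-only loop this full-frame subtraction dominates the inter-frame budget, which is the motivation for moving it into hardware.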

  19. Electron detection characteristics of a slow-scan CCD camera, imaging plates and film, and electron image restoration.

    PubMed

    Zuo, J M

    2000-05-01

    Electron detection characteristics are summarized for the slow scan CCD (SSC) camera, imaging plates, and film. The advantage of each detector is demonstrated with the selected examples of electron diffraction and imaging. The Richardson-Lucy algorithm for image restoration is described and tested for images recorded with the SSC camera. The effectiveness of image restoration is demonstrated for the recorded high-resolution lattice image, energy-loss spectrum, and convergent beam electron diffraction (CBED) pattern. PMID:10816266
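    The Richardson-Lucy iteration tested in the paper can be sketched in one dimension; the update rule below is the standard one, while the two-spike "lattice" signal and Gaussian PSF are purely illustrative:

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=50):
    """Minimal 1-D Richardson-Lucy deconvolution using 'same'-mode convolution."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)  # guard against divide-by-zero
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: a two-spike 'lattice' signal blurred by a Gaussian PSF.
x = np.arange(-3, 4)
psf = np.exp(-x**2 / 2.0)
truth = np.zeros(64); truth[20] = 1.0; truth[40] = 0.5
observed = np.convolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(observed, psf, n_iter=200)
print(int(np.argmax(restored)))  # -> 20, the stronger spike is recovered
```

    In practice the PSF would be measured from the SSC camera's response rather than assumed.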

  20. In-flight Video Captured by External Tank Camera System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In this July 26, 2005 video, Earth slowly fades into the background as the STS-114 Space Shuttle Discovery climbs into space until the External Tank (ET) separates from the orbiter. An ET Camera System featuring a Sony XC-999 model camera provided never-before-seen footage of the launch and tank separation. The camera was installed in the ET LO2 Feedline Fairing. From this position, the camera had a 40° field of view with a 3.5 mm lens. The field of view showed some of the Bipod area, a portion of the LH2 tank and Intertank flange area, and some of the bottom of the shuttle orbiter. Contained in an electronic box, the battery pack and transmitter were mounted on top of the Solid Rocket Booster (SRB) crossbeam inside the ET. The battery pack included 20 Nickel-Metal Hydride batteries (similar to cordless phone battery packs) totaling 28 volts DC and could supply about 70 minutes of video. Located 95 degrees apart on the exterior of the Intertank, opposite the orbiter side, were two blade S-band antennas, each about 2 1/2 inches long, that transmitted a 10-watt signal to the ground stations. The camera turned on approximately 10 minutes prior to launch and operated for 15 minutes following liftoff. The complete camera system weighed about 32 pounds. Marshall Space Flight Center (MSFC), Johnson Space Center (JSC), Goddard Space Flight Center (GSFC), and Kennedy Space Center (KSC) participated in the design, development, and testing of the ET camera system.

  1. Color measurement in standard CIELAB coordinates using a 3CCD camera: correction for the influence of the light source

    NASA Astrophysics Data System (ADS)

    Corbalan-Fuertes, Montserrat; Millan, Maria S.; Yzuel, Maria J.

    2000-06-01

    We have analyzed the accuracy of the compensating performance of the white-balance mechanism of a 3CCD camera for the three common types of light--fluorescent (F), incandescent (I), and daylight (D). We study the behavior of the camera using the RGB and CIELAB coordinates for a wide set of color samples covering the visible spectrum. CIELAB coordinates are obtained from the tristimulus XYZ. Using linear methods, we obtain the XYZ values from the (RGB)CCD values acquired by a 3CCD camera. We propose two different approaches: the first is specific for each particular light source (F, I, and D); the second considers the equienergetic spectral light source as an approximation for white lighting. We measure the mean color difference in the CIELAB space under a change of illuminant and compare the results in order to evaluate the performance of the response of a 3CCD camera. The transformations that are specific for a given light source allow an improved response under change of illuminant in terms of color constancy.
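    The chain of conversions described above, a linear (RGB)CCD-to-XYZ mapping followed by the standard XYZ-to-CIELAB formulas, can be sketched as below; the matrix M is a placeholder for illustration, not one of the paper's fitted transformations:

```python
import numpy as np

# Hypothetical linear RGB->XYZ matrix (the paper estimates such a matrix
# per light source; these values are illustrative).
M = np.array([[0.41, 0.36, 0.18],
              [0.21, 0.72, 0.07],
              [0.02, 0.12, 0.95]])

def rgb_to_lab(rgb, white_xyz):
    """Linear (RGB)_CCD -> XYZ -> CIELAB, using the standard CIE formulas."""
    xyz = M @ np.asarray(rgb, float)
    t = xyz / white_xyz
    f = np.where(t > (6/29)**3, np.cbrt(t), t / (3 * (6/29)**2) + 4/29)
    L = 116 * f[1] - 16
    a = 500 * (f[0] - f[1])
    b = 200 * (f[1] - f[2])
    return L, a, b

white = M @ np.ones(3)  # camera white taken as the reference white
print([round(float(v), 3) for v in rgb_to_lab([1, 1, 1], white)])  # -> [100.0, 0.0, 0.0]
```

    With the reference white set to the camera's own white, a white-balanced (1, 1, 1) input maps to L* = 100, a* = b* = 0, which is the behavior the white-balance analysis probes.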

  2. Fast roadway detection using car cabin video camera

    NASA Astrophysics Data System (ADS)

    Krokhina, Daria; Blinov, Veniamin; Gladilin, Sergey; Tarhanov, Ivan; Postnikov, Vassili

    2015-12-01

    We describe a fast method for road detection in images from a vehicle cabin camera. A straight section of roadway is detected using the Fast Hough Transform and dynamic programming. We assume that the location of the horizon line in the image and the road pattern are known. The developed method is fast enough to detect the roadway on each frame of the video stream in real time and may be further accelerated by the use of tracking.

  3. Design of Digital Controller for a CCD Camera with Dual-Speed Tracking Imaging on Same Frame

    NASA Astrophysics Data System (ADS)

    Wang, Hui-Juan; Li, Bin-Hua; Li, Yong-Ming; He, Chun

    2007-12-01

    High-performance CCD cameras have been widely used in astronomical observations. The techniques for observing either moving objects or still objects independently are mature. However, when both moving objects (such as satellites, debris, and asteroids) and still objects (such as stars) are observed at the same time with the same CCD camera, the images of one kind of object are elongated most of the time. To solve this problem, the authors developed a novel imaging technique and a corresponding observation method. The photosensitive areas in some CCD arrays are physically divided into two or more zones. Based on these CCD arrays, the new idea can be implemented: one half of the photosensitive area is used to image the still objects in stare mode, and the other half to image the moving objects in drift-scan mode. This means that both moving and still objects can be tracked at the same time without elongation of their images on the same CCD frame; the new technique is therefore called Dual-Speed Tracking Imaging on Same Frame (DSTIS). This paper briefly introduces the operation principle of the DSTIS CCD camera. After a discussion of the requirements for a digital controller for the camera, the design philosophy and basic structure of the controller are presented. Some simulation and testing results are then shown, and problems encountered during simulation and testing are analyzed in detail and solved. The results of the software simulation and hardware testing verify the correctness of the design.

  4. Robust camera calibration for sport videos using court models

    NASA Astrophysics Data System (ADS)

    Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang

    2003-12-01

    We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a-priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.
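    The Hough-transform step that extracts court-line candidates can be sketched with a minimal accumulator over a binary line mask; the toy mask below stands in for the output of the color/texture pixel tests described above:

```python
import numpy as np

def hough_lines(mask, n_theta=180):
    """Minimal Hough transform over a binary court-line mask: returns the
    (rho, theta) accumulator; line candidates are its local maxima."""
    h, w = mask.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(mask)
    for theta_idx, t in enumerate(thetas):
        # rho = x*cos(theta) + y*sin(theta), shifted so indices are non-negative
        rhos = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int) + diag
        np.add.at(acc, (rhos, theta_idx), 1)
    return acc, thetas, diag

# Toy mask with one horizontal 'court line' at y = 30.
mask = np.zeros((100, 100), bool)
mask[30, 10:90] = True
acc, thetas, diag = hough_lines(mask)
rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
print(rho_idx - diag, round(float(np.rad2deg(thetas[theta_idx]))))  # -> 30 90
```

    The recovered (rho, theta) peaks would then be matched combinatorially against the user-supplied court model.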

  5. Identifying sports videos using replay, text, and camera motion features

    NASA Astrophysics Data System (ADS)

    Kobla, Vikrant; DeMenthon, Daniel; Doermann, David S.

    1999-12-01

    Automated classification of digital video is emerging as an important piece of the puzzle in the design of content management systems for digital libraries. The ability to classify videos into various classes such as sports, news, movies, or documentaries increases the efficiency of indexing, browsing, and retrieval of video in large databases. In this paper, we discuss the extraction of features that enable identification of sports videos directly from the compressed domain of MPEG video. These features include detecting the presence of action replays, determining the amount of scene text in video, and calculating various statistics on camera and/or object motion. The features are derived from the macroblock, motion, and bit-rate information that is readily accessible from MPEG video with very minimal decoding, leading to substantial gains in processing speed. Full decoding of selected frames is required only for text analysis. A decision tree classifier built using these features is able to identify sports clips with an accuracy of about 93 percent.

  6. Improving Photometric Calibration of Meteor Video Camera Systems

    NASA Technical Reports Server (NTRS)

    Ehlert, Steven; Kingery, Aaron; Cooke, William

    2016-01-01

    Current optical observations of meteors are commonly limited by systematic uncertainties in photometric calibration at the level of approximately 0.5 mag or higher. Future improvements to meteor ablation models, luminous efficiency models, or emission spectra will hinge on new camera systems and techniques that significantly reduce calibration uncertainties and can reliably perform absolute photometric measurements of meteors. In this talk we discuss the algorithms and tests that NASA's Meteoroid Environment Office (MEO) has developed to better calibrate photometric measurements for the existing All-Sky and Wide-Field video camera networks as well as for a newly deployed four-camera system for measuring meteor colors in Johnson-Cousins BV RI filters. In particular we will emphasize how the MEO has been able to address two long-standing concerns with the traditional procedure, discussed in more detail below.

  7. Scientific CCD technology at JPL

    NASA Technical Reports Server (NTRS)

    Janesick, J.; Collins, S. A.; Fossum, E. R.

    1991-01-01

    Charge-coupled devices (CCD's) were recognized for their potential as an imaging technology almost immediately following their conception in 1970. Twenty years later, they are firmly established as the technology of choice for visible imaging. While consumer applications of CCD's, especially the emerging home video camera market, dominated manufacturing activity, the scientific market for CCD imagers has become significant. Activity of the Jet Propulsion Laboratory and its industrial partners in the area of CCD imagers for space scientific instruments is described. Requirements for scientific imagers are significantly different from those needed for home video cameras, and are described. An imager for an instrument on the CRAF/Cassini mission is described in detail to highlight achieved levels of performance.

  8. Pixel-to-pixel correspondence alignment method of a 2CCD camera by using absolute phase map

    NASA Astrophysics Data System (ADS)

    Huang, Shujun; Liu, Yue; Bai, Xuefei; Wang, Zhangying; Zhang, Zonghua

    2015-06-01

    An alignment method of a 2CCD camera to build pixel-to-pixel correspondence between the infrared (IR) CCD sensor and the visible CCD sensor by using the absolute phase data is presented. Vertical and horizontal sinusoidal fringe patterns are generated by software and displayed on a liquid crystal display screen. The displayed fringe patterns are captured simultaneously by the IR sensor and the visible sensor of the 2CCD camera. The absolute phase values of each pixel at IR and visible channels are calculated from the captured fringe pattern images by using Fourier transform and the optimum three-fringe number selection method. The accurate pixel corresponding relationship between the two sensors can be determined along the vertical and the horizontal directions by comparing the obtained absolute phase data in IR and visible channels. Experimental results show the high accuracy, effectiveness, and validity of the proposed 2CCD alignment method. By using the continuous absolute phase information, this method can determine the pixel-to-pixel correspondence with high resolution.

  9. Measuring the Flatness of Focal Plane for Very Large Mosaic CCD Camera

    SciTech Connect

    Hao, Jiangang; Estrada, Juan; Cease, Herman; Diehl, H.Thomas; Flaugher, Brenna L.; Kubik, Donna; Kuk, Keivin; Kuropatkine, Nickolai; Lin, Huan; Montes, Jorge; Scarpine, Vic; /Fermilab

    2010-06-08

    A large mosaic multi-CCD camera is the key instrument for a modern digital sky survey. DECam is an extremely red-sensitive 520-megapixel camera designed for the upcoming Dark Energy Survey (DES). It consists of sixty-two 4k x 2k and twelve 2k x 2k 250-micron-thick fully depleted CCDs, with a focal plane 44 cm in diameter and a field of view of 2.2 square degrees. It will be attached to the Blanco 4-meter telescope at CTIO. The DES will cover 5000 square degrees of the southern galactic cap in 5 color bands (g, r, i, z, Y) over 5 years starting in 2011. To achieve the science goal of constraining the Dark Energy evolution, stringent requirements are laid down for the design of DECam. Among them, the flatness of the focal plane needs to be controlled within a 60-micron envelope in order to achieve the specified PSF variation limit. It is very challenging to measure the flatness of the focal plane to such precision when it is placed in a high-vacuum dewar at 173 K. We developed two image-based techniques to measure the flatness of the focal plane. By imaging a regular grid of dots on the focal plane, the CCD offset along the optical axis is converted to a variation of the grid spacings at different positions on the focal plane. After extracting the patterns and comparing the change in spacings, we can measure the flatness to high precision. In method 1, the regular dots are kept at high sub-micron precision and cover the whole focal plane. In method 2, no high precision for the grid is required; instead, a precise XY stage moves the pattern across the whole focal plane, and we compare the variations of the spacing when it is imaged by different CCDs. Simulation and real measurements show that the two methods work very well for our purpose and are in good agreement with direct optical measurements.
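    The geometric idea behind converting a CCD height offset into a grid-spacing change reduces to a similar-triangles relation between the projection distance and the axial offset; the numbers below are illustrative, not DECam's actual projection geometry:

```python
# A CCD segment displaced by dz along the optical axis sees the projected
# dot grid at a slightly different magnification, so the measured spacing
# changes by the ratio (L + dz) / L, where L is the projection distance.
# All values are illustrative assumptions.
L = 500.0   # mm, distance from the projection point to the focal plane
s = 1.0     # mm, nominal grid spacing on an in-plane CCD
dz = 0.060  # mm, a 60-micron height error (the DECam flatness budget)

s_measured = s * (L + dz) / L                    # spacing seen by the offset CCD
offset_recovered = L * (s_measured / s - 1.0)    # invert to recover the offset
print(round(offset_recovered * 1000, 3))  # -> 60.0 (microns)
```

    Because the fractional spacing change is dz/L, sub-micron spacing measurements translate into micron-level flatness sensitivity.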

  10. OP09O-OP404-9 Wide Field Camera 3 CCD Quantum Efficiency Hysteresis

    NASA Technical Reports Server (NTRS)

    Collins, Nick

    2009-01-01

    The HST/Wide Field Camera 3 (WFC3) UV/visible channel CCD detectors have exhibited an unanticipated quantum efficiency hysteresis (QEH) behavior. The first observed manifestation of QEH was the presence, in a small percentage of flat-field images, of a bowtie-shaped contrast that spanned the width of each chip. At the nominal operating temperature of -83C, the contrast observed for this feature was typically 0.1-0.2% or less, though at warmer temperatures contrasts up to 5% (at -50C) have been observed. The bowtie morphology was replicated using flight spare detectors in tests at the GSFC Detector Characterization Laboratory by power cycling the detector while cold. Continued investigation revealed that a clearly related global QE suppression at the approximately 5% level can be produced by cooling the detector in the dark; subsequent flat-field exposures at a constant illumination show asymptotically increasing response. This QE "pinning" can be achieved with a single high-signal flat-field or a series of lower-signal flats; a visible-light (500-580nm) flat-field with a signal level of several hundred thousand electrons per pixel is sufficient for QE pinning at both optical (600nm) and near-UV (230nm) wavelengths. We are characterizing the timescale for the detectors to become unpinned and developing a protocol for flashing the WFC3 CCDs with the instrument's internal calibration system in flight.

  11. A fast auto-focusing technique for the long focal lens TDI CCD camera in remote sensing applications

    NASA Astrophysics Data System (ADS)

    Wang, Dejiang; Ding, Xu; Zhang, Tao; Kuang, Haipeng

    2013-02-01

    The key issue in automatic focus adjustment for a long focal length TDI CCD camera in remote sensing applications is to reach the optimum focus position as fast as possible. Existing auto-focusing techniques consume too much time because the mechanical focusing parts of the camera move in steps during the search procedure. In this paper, we demonstrate a fast auto-focusing technique which employs the internal optical elements and the TDI CCD itself to directly sense deviations in the back focal distance of the lens and restore the imaging system to the best available focus. It is particularly advantageous for determination of the focus because the relative motion between the TDI CCD and the focusing element can proceed without interruption. Moreover, theoretical formulas describing the effect of image motion on the focusing precision and the effective focusing range are also developed. Finally, an experimental setup is constructed to evaluate the performance of the proposed technique. The results of the experiment show a ±5 μm auto-focusing precision over a range of ±500 μm defocus, and the search procedure can be accomplished within 0.125 s, which leads to a remarkable improvement in the real-time imaging capability of high-resolution TDI CCD cameras in remote sensing applications.

  12. Characterization and field use of a CCD camera system for retrieval of bidirectional reflectance distribution function

    NASA Astrophysics Data System (ADS)

    Nandy, P.; Thome, K.; Biggar, S.

    2001-06-01

    Vicarious calibration and field validation is a critical aspect of NASA's Earth Observing System program. As part of calibration and validation research related to this project, the Remote Sensing Group (RSG) of the Optical Science Center at the University of Arizona has developed an imaging radiometer for ground-based measurements of directional reflectance. The system relies on a commercially available 1024×1024 pixel, silicon CCD array. Angular measurements are accomplished using a fish-eye lens that has a full 180° field of view with each pixel on the CCD array having a nominal 0.2° field of view. Spectral selection is through four interference filters centered at 470, 575, 660, and 835 nm. The system is designed such that the entire 180° field is collected at one time with a complete multispectral data set collected in under 2 min. The results of laboratory experiments have been used to determine the gain and offset of each detector element as well as the effects of the lens on the system response. Measurements of a stable source using multiple integration times and at multiple distances for a set integration time indicate the system is linear to better than 0.5% over the upper 88% of the dynamic range of the system. The point spread function (PSF) of the lens system was measured for several field angles, and the signal level was found to fall to less than 1% of the peak signal within 1.5° for the on-axis case. The effect of this PSF on the retrieval of modeled BRDFs is shown to be less than 0.2% out to view angles of 70°. The degree of polarization of the system is shown to be negligible for on-axis imaging but to have up to a 20% effect at a field angle of 70°. The effect of the system polarization on the retrieval of modeled BRDFs is shown to be up to 3% for field angles of 70° off nadir and with a solar zenith angle of 70°. Field measurements are made by mounting the camera to a boom mounted to a large tripod that is aligned toward south. This

  13. A new paradigm for video cameras: optical sensors

    NASA Astrophysics Data System (ADS)

    Grottle, Kevin; Nathan, Anoo; Smith, Catherine

    2007-04-01

    This paper presents a new paradigm for the utilization of video surveillance cameras as optical sensors to augment and significantly improve the reliability and responsiveness of chemical monitoring systems. Incorporated into a hierarchical tiered sensing architecture, cameras serve as 'Tier 1' or 'trigger' sensors monitoring for visible indications after a release of warfare or industrial toxic chemical agents. No single sensor today yet detects the full range of these agents, but the result of exposure is harmful and yields visible 'duress' behaviors. Duress behaviors range from simple to complex types of observable signatures. By incorporating optical sensors in a tiered sensing architecture, the resulting alarm signals based on these behavioral signatures increases the range of detectable toxic chemical agent releases and allows timely confirmation of an agent release. Given the rapid onset of duress type symptoms, an optical sensor can detect the presence of a release almost immediately. This provides cues for a monitoring system to send air samples to a higher-tiered chemical sensor, quickly launch protective mitigation steps, and notify an operator to inspect the area using the camera's video signal well before the chemical agent can disperse widely throughout a building.

  14. Measurement of time varying temperature fields using visible imaging CCD cameras

    SciTech Connect

    Keanini, R.G.; Allgood, C.L.

    1996-12-31

    A method for measuring time-varying surface temperature distributions using high frame rate visible imaging CCD cameras is described. The technique is based on an ad hoc model relating measured radiance to local surface temperature. This approach is based on the fairly non-restrictive assumptions that atmospheric scattering and absorption, and secondary emission and reflection are negligible. In order to assess performance, both concurrent and non-concurrent calibration and measurement, performed under dynamic thermal conditions, are examined. It is found that measurement accuracy is comparable to the theoretical accuracy predicted for infrared-based systems. In addition, performance tests indicate that in the experimental system, real-time calibration can be achieved while real-time whole-field temperature measurements require relatively coarse spatial resolution. The principal advantages of the proposed method are its simplicity and low cost. In addition, since independent temperature measurements are used for calibration, emissivity remains unspecified, so that a potentially significant source of error is eliminated.
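    An ad hoc radiance-to-temperature calibration of the kind described can be sketched by fitting a simple model to independently measured temperature/radiance pairs and then inverting it for new readings; the data and the logarithmic model form below are assumptions for illustration, not the paper's actual model:

```python
import numpy as np

# Hypothetical calibration: pair independently measured temperatures with
# CCD radiance readings, then fit T = a + b*ln(radiance) by least squares.
temps = np.array([300.0, 350.0, 400.0, 450.0, 500.0])  # K, reference probe
radiance = np.exp((temps - 250.0) / 40.0)              # synthetic CCD counts

b, a = np.polyfit(np.log(radiance), temps, 1)          # slope, intercept

# Apply the fitted mapping to a new radiance reading (here one whose
# true temperature is 420 K by construction).
predicted = a + b * np.log(np.exp((420.0 - 250.0) / 40.0))
print(round(float(predicted), 1))  # -> 420.0
```

    Because the calibration uses independent temperature measurements, the surface emissivity never has to be specified, which is the error-elimination point made in the abstract.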

  15. Development of proton CT imaging system using plastic scintillator and CCD camera.

    PubMed

    Tanaka, Sodai; Nishio, Teiji; Matsushita, Keiichiro; Tsuneda, Masato; Kabuki, Shigeto; Uesaka, Mitsuru

    2016-06-01

    A proton computed tomography (pCT) imaging system was constructed for evaluation of the error of an x-ray CT (xCT)-to-WEL (water-equivalent length) conversion in treatment planning for proton therapy. In this system, the scintillation light integrated along the beam direction is obtained by photography using the CCD camera, which enables fast and easy data acquisition. The light intensity is converted to the range of the proton beam using a light-to-range conversion table made beforehand, and a pCT image is reconstructed. An experiment for demonstration of the pCT system was performed using a 70 MeV proton beam provided by the AVF930 cyclotron at the National Institute of Radiological Sciences. Three-dimensional pCT images were reconstructed from the experimental data. A thin structure of approximately 1 mm was clearly observed, with spatial resolution of pCT images at the same level as that of xCT images. The pCT images of various substances were reconstructed to evaluate the pixel value of pCT images. The image quality was investigated with regard to deterioration including multiple Coulomb scattering. PMID:27191962

  16. Development of proton CT imaging system using plastic scintillator and CCD camera

    NASA Astrophysics Data System (ADS)

    Tanaka, Sodai; Nishio, Teiji; Matsushita, Keiichiro; Tsuneda, Masato; Kabuki, Shigeto; Uesaka, Mitsuru

    2016-06-01

    A proton computed tomography (pCT) imaging system was constructed for evaluation of the error of an x-ray CT (xCT)-to-WEL (water-equivalent length) conversion in treatment planning for proton therapy. In this system, the scintillation light integrated along the beam direction is obtained by photography using the CCD camera, which enables fast and easy data acquisition. The light intensity is converted to the range of the proton beam using a light-to-range conversion table made beforehand, and a pCT image is reconstructed. An experiment for demonstration of the pCT system was performed using a 70 MeV proton beam provided by the AVF930 cyclotron at the National Institute of Radiological Sciences. Three-dimensional pCT images were reconstructed from the experimental data. A thin structure of approximately 1 mm was clearly observed, with spatial resolution of pCT images at the same level as that of xCT images. The pCT images of various substances were reconstructed to evaluate the pixel value of pCT images. The image quality was investigated with regard to deterioration including multiple Coulomb scattering.
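    The light-to-range conversion table described in both records above amounts to a lookup with interpolation; the sketch below uses illustrative table values, not the measured conversion table:

```python
import numpy as np

# Hypothetical conversion table made beforehand: integrated scintillation
# light intensity -> water-equivalent length (WEL) of the proton beam.
light = np.array([0.0, 0.25, 0.5, 0.75, 1.0])  # normalized light intensity
wel = np.array([40.0, 30.0, 20.0, 10.0, 0.0])  # water-equivalent length, mm

def light_to_range(measured):
    """Interpolate a measured light intensity onto the WEL table."""
    return np.interp(measured, light, wel)

print(float(light_to_range(0.625)))  # -> 15.0
```

    Each CCD photograph yields one such intensity per beam path, so the conversion is applied pixel-wise before tomographic reconstruction.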

  17. Social Justice through Literacy: Integrating Digital Video Cameras in Reading Summaries and Responses

    ERIC Educational Resources Information Center

    Liu, Rong; Unger, John A.; Scullion, Vicki A.

    2014-01-01

    Drawing data from an action-oriented research project for integrating digital video cameras into the reading process in pre-college courses, this study proposes using digital video cameras in reading summaries and responses to promote critical thinking and to teach social justice concepts. The digital video research project is founded on…

  18. Study of pixel damages in CCD cameras irradiated at the neutron tomography facility of IPEN-CNEN/SP

    NASA Astrophysics Data System (ADS)

    Pugliesi, R.; Andrade, M. L. G.; Dias, M. S.; Siqueira, P. T. D.; Pereira, M. A. S.

    2015-12-01

    A methodology to investigate damage in CCD sensors caused by the radiation beams of neutron tomography facilities is proposed. The methodology was developed at the facility installed at the nuclear research reactor of IPEN-CNEN/SP, and the damage was evaluated by counting white spots in images. The damage production rate at the main camera position was evaluated to be in the range between 0.008 and 0.040 damaged pixels per second. For this range, only 4 to 20 CCD pixels are damaged per tomography, assuring high-quality images for hundreds of tomographies. Since the present methodology is capable of quantifying the damage production rate for each type of radiation, it can also be used at other facilities to improve the radiation shielding close to the CCD sensors.
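    The white-spot counting used to quantify damage can be sketched by thresholding a dark frame and labeling connected bright regions; the frame and threshold below are toy values, with `scipy.ndimage` doing the component labeling:

```python
import numpy as np
from scipy import ndimage

def count_white_spots(image, threshold):
    """Count bright clusters (damaged-pixel candidates) by thresholding
    a dark frame and labeling connected components."""
    mask = image > threshold
    _, n_spots = ndimage.label(mask)
    return n_spots

# Toy dark frame: flat background with three hot 'damaged pixel' clusters.
dark = np.full((64, 64), 10.0)
dark[5, 5] = 200.0
dark[20:22, 30:32] = 180.0
dark[50, 10] = 150.0
print(count_white_spots(dark, threshold=100))  # -> 3
```

    Tracking this count across exposures gives the per-second damage production rate reported in the abstract.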

  19. Automatic radial distortion correction in zoom lens video camera

    NASA Astrophysics Data System (ADS)

    Kim, Daehyun; Shin, Hyoungchul; Oh, Juhyun; Sohn, Kwanghoon

    2010-10-01

    We present a novel method for automatically correcting the radial lens distortion in a zoom lens video camera system. We first define the zoom lens distortion model using an inherent characteristic of the zoom lens. Next, we sample some video frames with different focal lengths and estimate their radial distortion parameters and focal lengths. We then optimize the zoom lens distortion model with preestimated parameter pairs using the least-squares method. For more robust optimization, we divide the sample images into two groups according to distortion types (i.e., barrel and pincushion) and then separately optimize the zoom lens distortion models with respect to divided groups. Our results show that the zoom lens distortion model can accurately represent the radial distortion of a zoom lens.
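    A one-parameter radial model of the kind fitted per focal length can be sketched as follows; the model form and coefficient are illustrative, not the paper's estimated zoom-lens model:

```python
import numpy as np

def undistort_points(pts, k1, center):
    """Apply a one-parameter radial model r_u = r_d * (1 + k1 * r_d**2)
    to image points (a common simple form of radial distortion correction)."""
    p = np.asarray(pts, float) - center
    r_d = np.hypot(p[:, 0], p[:, 1])          # distorted radius per point
    scale = 1.0 + k1 * r_d**2                 # k1 < 0: barrel, k1 > 0: pincushion
    return p * scale[:, None] + center

center = np.array([320.0, 240.0])
distorted = np.array([[100.0, 60.0], [320.0, 240.0]])
corrected = undistort_points(distorted, k1=-1e-7, center=center)
print(np.round(corrected[1], 1).tolist())  # the center point is unchanged -> [320.0, 240.0]
```

    In the paper's setting, k1 itself becomes a function of the estimated focal length, which is what the zoom-lens distortion model captures.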

  20. Non-mydriatic, wide field, fundus video camera

    NASA Astrophysics Data System (ADS)

    Hoeher, Bernhard; Voigtmann, Peter; Michelson, Georg; Schmauss, Bernhard

    2014-02-01

    We describe a method we call "stripe field imaging" that is capable of capturing wide field color fundus videos and images of the human eye at pupil sizes of 2 mm. This means that it can be used with a non-dilated pupil even with bright ambient light. We realized a mobile demonstrator to prove the method and we could acquire color fundus videos of subjects successfully. We designed the demonstrator as a low-cost device consisting of mass market components to show that there is no major additional technical outlay to realize the improvements we propose. The technical core idea of our method is breaking the rotational symmetry in the optical design that is given in many conventional fundus cameras. By this measure we could extend the possible field of view (FOV) at a pupil size of 2 mm from a circular field 20° in diameter to a square field 68° by 18° in size. We acquired a fundus video while the subject was slightly touching and releasing the lid. The resulting video showed changes at vessels in the region of the papilla and a change of the paleness of the papilla.

  1. Scientists Behind the Camera - Increasing Video Documentation in the Field

    NASA Astrophysics Data System (ADS)

    Thomson, S.; Wolfe, J.

    2013-12-01

    Over the last two years, Skypunch Creative has designed and implemented a number of pilot projects to increase the amount of video captured by scientists in the field. The major barrier to success that we tackled with the pilot projects was the conflicting demands of the time, space, storage needs of scientists in the field and the demands of shooting high quality video. Our pilots involved providing scientists with equipment, varying levels of instruction on shooting in the field and post-production resources (editing and motion graphics). In each project, the scientific team was provided with cameras (or additional equipment if they owned their own), tripods, and sometimes sound equipment, as well as an external hard drive to return the footage to us. Upon receiving the footage we professionally filmed follow-up interviews and created animations and motion graphics to illustrate their points. We also helped with the distribution of the final product (http://climatescience.tv/2012/05/the-story-of-a-flying-hippo-the-hiaper-pole-to-pole-observation-project/ and http://climatescience.tv/2013/01/bogged-down-in-alaska/). The pilot projects were a success. Most of the scientists returned asking for additional gear and support for future field work. Moving out of the pilot phase, to continue the project, we have produced a 14 page guide for scientists shooting in the field based on lessons learned - it contains key tips and best practice techniques for shooting high quality footage in the field. We have also expanded the project and are now testing the use of video cameras that can be synced with sensors so that the footage is useful both scientifically and artistically. Extract from A Scientist's Guide to Shooting Video in the Field

  2. Photometric correction and reflectance calculation for lunar images from the Chang'E-1 CCD stereo camera.

    PubMed

    Chen, Chao; Qin, Qiming; Chen, Li; Zheng, Hong; Fa, Wenzhe; Ghulam, Abduwasit; Zhang, Chengye

    2015-12-01

Photometric correction and reflectance calculation are two important processes in the scientific analysis and application of Chang'E-1 (CE-1) charge-coupled device (CCD) stereo camera data. In this paper, methods for both processes were developed. On the one hand, considering the specific characteristics of the datasets acquired by the CE-1 CCD stereo camera, photometric correction was applied directly to the digital number values using the revised Lommel-Seeliger factor. On the other hand, the relative reflectance was then calculated from laboratory-measured bidirectional reflectances using an empirical linear model. The presented approach can be used to identify landing sites, obtain global images, and produce topographic maps of the lunar surface. PMID:26831395
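As a rough illustration of the Lommel-Seeliger factor mentioned above, here is a minimal sketch. The abstract does not give the revised factor's exact form, so the plain Lommel-Seeliger law and the standard geometry (i = 30°, e = 0°, a common lunar convention) are assumptions:

```python
import numpy as np

def lommel_seeliger(i_deg, e_deg):
    """Lommel-Seeliger factor mu0 / (mu0 + mu) for incidence i and emission e."""
    mu0 = np.cos(np.radians(i_deg))  # cosine of incidence angle
    mu = np.cos(np.radians(e_deg))   # cosine of emission angle
    return mu0 / (mu0 + mu)

def photometric_correction(dn, i_deg, e_deg, i_std=30.0, e_std=0.0):
    """Normalize an observed digital number (DN) to a standard viewing geometry.

    Assumed first-order form: DN_std = DN * f(i_std, e_std) / f(i, e).
    """
    return dn * lommel_seeliger(i_std, e_std) / lommel_seeliger(i_deg, e_deg)
```

At the standard geometry the correction is the identity; pixels observed at grazing incidence are brightened toward the standard-geometry value.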

  3. Wide field NEO survey 1.0-m telescope with 10 2k×4k mosaic CCD camera

    NASA Astrophysics Data System (ADS)

    Isobe, Syuzo; Asami, Atsuo; Asher, David J.; Hashimoto, Toshiyasu; Nakano, Shi-ichi; Nishiyama, Kota; Ohshima, Yoshiaki; Terazono, Junya; Umehara, Hiroaki; Yoshikawa, Makoto

    2002-12-01

We developed a new 1.0 m telescope with a 3 degree flat focal plane to which a mosaic CCD camera with 10 2k×4k chips is fixed. The system was set up in February 2002, and is now undergoing the final fine adjustments. Since the telescope has a focal length of 3 m, a field of 7.5 square degrees is covered in one image. In good seeing conditions (1.5 arc seconds) at the site, located in Bisei town, Okayama prefecture, Japan, we can expect to detect stars down to 20th magnitude with an exposure time of 60 seconds. Allowing for the 46-second read-out time of the CCD camera, one image is taken every two minutes, and about 2,100 square degrees of sky can be covered in one clear night. This system is very effective for survey work, especially for Near-Earth-Asteroid detection.
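The quoted cadence and nightly coverage can be sanity-checked with simple arithmetic (this ignores slewing and other overheads, which the abstract does not quantify):

```python
# Cadence and nightly coverage implied by the numbers in the abstract.
field_per_image_sq_deg = 7.5                       # 3 deg flat focal plane, f = 3 m
exposure_s, readout_s = 60, 46
cadence_s = exposure_s + readout_s                 # one image every ~2 minutes
images_per_night = 2100 / field_per_image_sq_deg   # for 2,100 sq deg per clear night
hours_observing = images_per_night * cadence_s / 3600.0  # open-shutter time needed
```

About 280 images at a 106 s cadence come to roughly 8.2 hours, a plausible length for a clear night of observing.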

  4. CCD video observation of microgravity crystallization of lysozyme and correlation with accelerometer data.

    PubMed

    Snell, E H; Boggon, T J; Helliwell, J R; Moskowitz, M E; Nadarajah, A

    1997-11-01

    Lysozyme has been crystallized using the ESA Advanced Protein Crystallization Facility onboard the NASA Space Shuttle Orbiter during the IML-2 mission. CCD video monitoring was used to follow the crystallization process and evaluate the growth rate. During the mission some tetragonal crystals were observed moving over distances of up to 200 micrometers. This was correlated with microgravity disturbances caused by firings of vernier jets on the Orbiter. Growth-rate measurement of a stationary crystal (which had nucleated on the growth reactor wall) showed spurts and lulls correlated with an onboard activity: astronaut exercise. The stepped growth rates may be responsible for the residual mosaic block structure seen in crystal mosaicity and topography measurements. PMID:11540584

  5. CCD Video Observation of Microgravity Crystallization of Lysozyme and Correlation with Accelerometer Data

    NASA Technical Reports Server (NTRS)

    Snell, E. H.; Boggon, T. J.; Helliwell, J. R.; Moskowitz, M. E.; Nadarajah, A.

    1997-01-01

Lysozyme has been crystallized using the ESA Advanced Protein Crystallization Facility onboard the NASA Space Shuttle Orbiter during the IML-2 mission. CCD video monitoring was used to follow the crystallization process and evaluate the growth rate. During the mission some tetragonal crystals were observed moving over distances of up to 200 micrometers. This was correlated with microgravity disturbances caused by firings of vernier jets on the Orbiter. Growth-rate measurement of a stationary crystal (which had nucleated on the growth reactor wall) showed spurts and lulls correlated with an onboard activity: astronaut exercise. The stepped growth rates may be responsible for the residual mosaic block structure seen in crystal mosaicity and topography measurements.

  6. Preliminary Performance Measurements for a Streak Camera with a Large-Format Direct-Coupled CCD Readout

    SciTech Connect

    Lerche, R A; McDonald, J W; Griffith, R L; de Dios, G V; Andrews, D S; Huey, A W; Bell, P M; Landen, O L; Jaanimagi, P A; Boni, R

    2004-04-13

Livermore's ICF Program has a large inventory of optical streak cameras built in the 1970s and 1980s. The cameras are still very functional, but difficult to maintain because many of their parts are obsolete, including the original streak tube and image-intensifier tube. The University of Rochester's Laboratory for Laser Energetics is leading an effort to develop a fully automated, large-format streak camera that incorporates modern technology. Preliminary characterization of a prototype camera shows spatial resolution better than 20 lp/mm, temporal resolution of 12 ps, line-spread function of 40 μm (FWHM), contrast transfer ratio (CTR) of 60% at 10 lp/mm, and system sensitivity of 16 CCD electrons per photoelectron. A dynamic range of 60 for a 2 ns window is determined from system noise, linearity and sensitivity measurements.

  7. A Large Panel Two-CCD Camera Coordinate System with an Alternate-Eight-Matrix Look-Up Table Algorithm

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Lu, An-Tsung; Hsu, Yuen-Chang; Tien, Chuen-Lin; Chen, Der-Chin

In this study, a novel positioning model for a double-CCD camera calibration system with an Alternate-Eight-Matrix (AEM) Look-Up-Table (LUT) was proposed. Two CCD cameras were fixed on either side of a large-scale screen to overcome Field Of View (FOV) problems. The first to fourth AEMLUTs were used to compute the corresponding positions of intermediate blocks on the screen captured by the right-side camera; in these tables, the coordinate mapping data of the target in a specific space were stored in two matrices, while the gray-level threshold values of different positions were stored in the other two. Similarly, the fifth to eighth AEMLUTs were used to compute the corresponding positions of intermediate blocks captured by the left-side camera. Experimental results showed that the problems of dead angles and non-uniform light fields were solved. In addition, rapid and precise positioning results can be obtained with the proposed method.

  8. Classification of volcanic ash particles from Sakurajima volcano using CCD camera image and cluster analysis

    NASA Astrophysics Data System (ADS)

    Miwa, T.; Shimano, T.; Nishimura, T.

    2012-12-01

Quantitative and speedy characterization of volcanic ash particles is needed to conduct petrologic monitoring of an ongoing eruption. We develop a new simple system using CCD camera images for quantitatively characterizing ash properties, and apply it to volcanic ash collected at Sakurajima. Our method characterizes volcanic ash particles by 1) apparent luminance through RGB filters and 2) a quasi-fractal dimension of the shape of particles. Using a monochromatic CCD camera (Starshoot by Orion Co. LTD.) attached to a stereoscopic microscope, we capture digital images of ash particles that are set on a glass plate, under which white paper or a polarizing plate is placed. The images of 1390 x 1080 pixels are taken through three color filters (Red, Green and Blue) under incident light and under light transmitted through the polarizing plate. Brightness of the light sources is set to be constant, and luminance is calibrated with white and black papers. About fifteen ash particles are set on the plate at a time, and their images are saved in bitmap format. We first extract the outlines of particles from the image taken under light transmitted through the polarizing plate. Then, luminances for each color are represented by 256 tones at each pixel within the particles, and the average and its standard deviation are calculated for each ash particle. We also measure the quasi-fractal dimension (qfd) of ash particles. We perform box counting, counting the number of 1×1-pixel and 128×128-pixel boxes that cover the area of the ash particle; the qfd is estimated as the ratio of the former number to the latter. These parameters are calculated using the software R. We characterize volcanic ash from the Showa crater of Sakurajima collected on two days (Feb 09, 2009, and Jan 13, 2010), and apply cluster analyses.
Dendrograms are formed from the qfd and the following four parameters calculated from the luminance: Rf=R/(R+G+B), Gf=G/(R+G+B), Bf=B/(R+G+B), and

  9. Computer-vision-based weed identification of images acquired by 3CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Yun; He, Yong; Fang, Hui

    2006-09-01

Selective application of herbicide to weeds at an early stage of crop growth is an important aspect of site-specific management of field crops. To develop more adaptive on-line weed detection, many researchers are studying image-processing techniques for the intensive computation and feature-extraction tasks needed to distinguish weeds from crops and the soil background. This paper investigated the potential of digital images acquired by the MegaPlus TM MS3100 3-CCD camera for segmenting the background soil from the plants in question and further recognizing weeds among the crops, using the Matlab script language. The image of the near-infrared waveband (center 800 nm; width 65 nm) was selected principally for segmenting soil, and the cottons were identified from the thistles based on their respective relative areas (pixel counts) in the whole image. The results show adequate recognition: the pixel proportions of soil, cotton leaves and thistle leaves were 78.24% (-0.20% deviation), 16.66% (+2.71% SD) and 4.68% (-4.19% SD). However, problems still exist in separating and locating single plants because of their clustering in the images. The information in the images acquired via the other two channels, i.e., the green and red bands, needs to be extracted to help the crop/weed discrimination. More optical specimens should be acquired for calibration and validation to establish a weed-detection model that can be effectively applied in the field.

  10. Developing a CCD camera with high spatial resolution for RIXS in the soft X-ray range

    NASA Astrophysics Data System (ADS)

    Soman, M. R.; Hall, D. J.; Tutt, J. H.; Murray, N. J.; Holland, A. D.; Schmitt, T.; Raabe, J.; Schmitt, B.

    2013-12-01

    The Super Advanced X-ray Emission Spectrometer (SAXES) at the Swiss Light Source contains a high resolution Charge-Coupled Device (CCD) camera used for Resonant Inelastic X-ray Scattering (RIXS). Using the current CCD-based camera system, the energy-dispersive spectrometer has an energy resolution (E/ΔE) of approximately 12,000 at 930 eV. A recent study predicted that through an upgrade to the grating and camera system, the energy resolution could be improved by a factor of 2. In order to achieve this goal in the spectral domain, the spatial resolution of the CCD must be improved to better than 5 μm from the current 24 μm spatial resolution (FWHM). The 400 eV-1600 eV energy X-rays detected by this spectrometer primarily interact within the field free region of the CCD, producing electron clouds which will diffuse isotropically until they reach the depleted region and buried channel. This diffusion of the charge leads to events which are split across several pixels. Through the analysis of the charge distribution across the pixels, various centroiding techniques can be used to pinpoint the spatial location of the X-ray interaction to the sub-pixel level, greatly improving the spatial resolution achieved. Using the PolLux soft X-ray microspectroscopy endstation at the Swiss Light Source, a beam of X-rays of energies from 200 eV to 1400 eV can be focused down to a spot size of approximately 20 nm. Scanning this spot across the 16 μm square pixels allows the sub-pixel response to be investigated. Previous work has demonstrated the potential improvement in spatial resolution achievable by centroiding events in a standard CCD. An Electron-Multiplying CCD (EM-CCD) has been used to improve the signal to effective readout noise ratio achieved resulting in a worst-case spatial resolution measurement of 4.5±0.2 μm and 3.9±0.1 μm at 530 eV and 680 eV respectively. A method is described that allows the contribution of the X-ray spot size to be deconvolved from these
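A minimal sketch of the Anger (centroid) estimation mentioned above, applied to a small pixel patch containing one split event. The thresholding of sub-noise pixels is an assumption, and a real pipeline would also apply the maximum-likelihood alternative the abstract mentions:

```python
import numpy as np

def centroid_event(patch, threshold=0.0):
    """Anger (centroid) estimate of the X-ray interaction position, in pixel
    units, for a patch of CCD pixels containing one split event."""
    # Zero out pixels at or below the noise threshold before weighting.
    p = np.where(patch > threshold, patch, 0.0).astype(float)
    total = p.sum()
    rows, cols = np.indices(p.shape)
    # Signal-weighted mean position along each axis.
    return (rows * p).sum() / total, (cols * p).sum() / total
```

Because the charge cloud spreads over several pixels, the weighted mean locates the interaction to a fraction of a pixel, which is what drives the sub-5 μm resolution goal.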

  11. Laboratory x-ray CCD camera electronics: a test bed for the Swift X-Ray Telescope

    NASA Astrophysics Data System (ADS)

    Hill, Joanne E.; Zugger, Michael E.; Shoemaker, Jason; Witherite, Mark E.; Koch, T. Scott; Chou, Lester L.; Case, Traci; Burrows, David N.

    2000-12-01

The Penn State University Department of Astronomy and Astrophysics has been active in the design of X-ray CCD cameras for astronomy for over two decades, including sounding rocket systems, the CUBIC instrument on the SAC-B satellite and the ACIS camera on the Chandra satellite. Currently the group is designing and building an X-ray telescope (XRT), which will comprise part of the Swift Gamma-Ray Burst Explorer satellite. The Swift satellite, selected in October 1999 as one of two winners of NASA Explorer contracts, will -- within one minute -- detect, locate, and observe gamma-ray bursts simultaneously in the optical, ultraviolet, X-ray, and gamma-ray wavelengths using three co-aligned telescopes. The XRT electronics is required to read out the telescope's CCD sensor in a number of different ways depending on the observing mode selected. Immediately after the satellite re-orients to observe a newly detected burst, the XRT will enter an imaging mode to determine the exact position of the burst. The location will then be transmitted to the ground, and the XRT will autonomously enter other modes as the X-ray intensity of the burst waxes and wanes. This paper will discuss the electronics for a laboratory X-ray CCD camera, which serves as a test bed for development of the Swift XRT camera. It will also touch upon the preliminary design of the flight camera, which is closely related. A major challenge is achieving performance and reliability goals within the cost constraints of an Explorer mission.

  12. Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source

    SciTech Connect

    M. J. Haugh and M. B. Schneider

    2008-10-31

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 μm square pixels, and 15 μm thick. A multi-anode Manson X-ray source, operating up to 10kV and 10W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE≈10. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within ±1% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.

  13. Video-Camera-Based Position-Measuring System

    NASA Technical Reports Server (NTRS)

    Lane, John; Immer, Christopher; Brink, Jeffrey; Youngquist, Robert

    2005-01-01

A prototype optoelectronic system measures the three-dimensional relative coordinates of objects of interest or of targets affixed to objects of interest in a workspace. The system includes a charge-coupled-device video camera mounted in a known position and orientation in the workspace, a frame grabber, and a personal computer running image-data-processing software. Relative to conventional optical surveying equipment, this system can be built and operated at much lower cost; however, it is less accurate. It is also much easier to operate than are conventional instrumentation systems. In addition, there is no need to establish a coordinate system through cooperative action by a team of surveyors. The system operates in real time at around 30 frames per second (limited mostly by the frame rate of the camera). It continuously tracks targets as long as they remain in the field of the camera. In this respect, it emulates more expensive, elaborate laser tracking equipment that costs on the order of 100 times as much. Unlike laser tracking equipment, this system does not pose a hazard of laser exposure. Images acquired by the camera are digitized and processed to extract all valid targets in the field of view. The three-dimensional coordinates (x, y, and z) of each target are computed from the pixel coordinates of the targets in the images to accuracy of the order of millimeters over distances of the order of meters. The system was originally intended specifically for real-time position measurement of payload transfers from payload canisters into the payload bay of the Space Shuttle Orbiters (see Figure 1). The system may be easily adapted to other applications that involve similar coordinate-measuring requirements. Examples of such applications include manufacturing, construction, preliminary approximate land surveying, and aerial surveying. For some applications with rectangular symmetry, it is feasible and desirable to attach a target composed of black and white

  14. Nighttime Near Infrared Observations of Augustine Volcano Jan-Apr, 2006 Recorded With a Small Astronomical CCD Camera

    NASA Astrophysics Data System (ADS)

    Sentman, D.; McNutt, S.; Reyes, C.; Stenbaek-Nielsen, H.; Deroin, N.

    2006-12-01

    Nighttime observations of Augustine Volcano were made during Jan-Apr, 2006 using a small, unfiltered, astronomical CCD camera operating from Homer, Alaska. Time-lapse images of the volcano were made looking across the open water of the Cook Inlet over a slant range of ~105 km. A variety of volcano activities were observed that originated in near-infrared (NIR) 0.9-1.1 micron emissions, which were detectable at the upper limit of the camera passband but were otherwise invisible to the naked eye. These activities included various types of steam releases, pyroclastic flows, rockfalls and debris flows that were correlated very closely with seismic measurements made from instruments located within 4 km on the volcanic island. Specifically, flow events to the east (towards the camera) produced high amplitudes on the eastern seismic stations and events presumably to the west were stronger on western stations. The ability to detect nighttime volcanic emissions in the NIR over large horizontal distances using standard silicon CCD technology, even in the presence of weak intervening fog, came as a surprise, and is due to a confluence of several mutually reinforcing factors: (1) Hot enough (~1000K) thermal emissions from the volcano that the short wavelength portion of the Planck radiation curve overlaps the upper portions (0.9-1.1 micron) of the sensitivity of the silicon CCD detectors, and could thus be detected, (2) The existence of several atmospheric transmission windows within the NIR passband of the camera for the emissions to propagate with relatively small attenuation through more than 10 atmospheres, and (3) in the case of fog, forward Mie scattering.

  15. Characterization of OCam and CCD220: the fastest and most sensitive camera to date for AO wavefront sensing

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz-Garcia, José Javier

    2010-07-01

For the first time, sub-electron read noise has been achieved with a camera suitable for astronomical wavefront-sensing (WFS) applications. The OCam system has demonstrated this performance at a 1300 Hz frame rate with a 240×240-pixel format. ESO and JRA2 OPTICON2 have jointly funded e2v technologies to develop a custom CCD for Adaptive Optics (AO) wavefront sensing applications. The device, called CCD220, is a compact Peltier-cooled 240×240 pixel frame-transfer 8-output back-illuminated sensor using the EMCCD technology. This paper demonstrates sub-electron read noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e-/pixel/frame. It reports on the comprehensive, quantitative performance characterization of OCam and the CCD220, such as readout noise, dark current, multiplication gain, quantum efficiency and charge transfer efficiency. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, was designed offering enhanced performance, a completely sealed camera package and an additional Peltier stage to facilitate operation on a telescope or in environmentally rugged applications. OCam2 offers two types of built-in data link to the Real Time Computer: the CameraLink industry-standard interface and various fiber link options such as the sFPDP interface. OCam2 also includes a modified mechanical design to ease the integration of microlens arrays for use of this camera in all types of wavefront-sensing AO systems. The front cover of OCam2 can be customized to include a microlens exchange mechanism.

  16. Deep-Sea Video Cameras Without Pressure Housings

    NASA Technical Reports Server (NTRS)

    Cunningham, Thomas

    2004-01-01

Underwater video cameras of a proposed type (and, optionally, their light sources) would not be housed in pressure vessels. Conventional underwater cameras and their light sources are housed in pods that keep the contents dry and maintain interior pressures of about 1 atmosphere (~0.1 MPa). Pods strong enough to withstand the pressures at great ocean depths are bulky, heavy, and expensive. Elimination of the pods would make it possible to build camera/light-source units that would be significantly smaller, lighter, and less expensive. The depth ratings of the proposed camera/light-source units would be essentially unlimited because the strengths of their housings would no longer be an issue. A camera according to the proposal would contain an active-pixel image sensor and readout circuits, all in the form of a single silicon-based complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. As long as none of the circuitry and none of the electrical leads were exposed to seawater, which is electrically conductive, silicon integrated-circuit chips could withstand the hydrostatic pressure of even the deepest ocean. The pressure would change the semiconductor band gap by only a slight amount, not enough to degrade imaging performance significantly. Electrical contact with seawater would be prevented by potting the integrated-circuit chip in a transparent plastic case. The electrical leads for supplying power to the chip and extracting the video signal would also be potted, though not necessarily in the same transparent plastic. The hydrostatic pressure would tend to compress the plastic case and the chip equally on all sides; there would be no need for great strength because there would be no need to hold back high pressure on one side against low pressure on the other side. A light source suitable for use with the camera could consist of light-emitting diodes (LEDs).
Like integrated-circuit chips, LEDs can withstand very large hydrostatic pressures. If

  17. The Camera Is Not a Methodology: Towards a Framework for Understanding Young Children's Use of Video Cameras

    ERIC Educational Resources Information Center

    Bird, Jo; Colliver, Yeshe; Edwards, Susan

    2014-01-01

    Participatory research methods argue that young children should be enabled to contribute their perspectives on research seeking to understand their worldviews. Visual research methods, including the use of still and video cameras with young children have been viewed as particularly suited to this aim because cameras have been considered easy and…

  18. ATR/OTR-SY Tank Camera Purge System and in Tank Color Video Imaging System

    SciTech Connect

    Werry, S.M.

    1995-06-06

This procedure will document the satisfactory operation of the 101-SY tank Camera Purge System (CPS) and the 101-SY in-tank Color Camera Video Imaging System (CCVIS). Included in the CPS is the nitrogen purging system safety interlock, which shuts down all the color video imaging system electronics within the 101-SY tank vapor space upon loss of nitrogen purge pressure.

  19. Frequency Identification of Vibration Signals Using Video Camera Image Data

    PubMed Central

    Jeng, Yih-Nen; Wu, Chia-Hung

    2012-01-01

    This study showed that an image data acquisition system connecting a high-speed camera or webcam to a notebook or personal computer (PC) can precisely capture most dominant modes of vibration signal, but may involve the non-physical modes induced by the insufficient frame rates. Using a simple model, frequencies of these modes are properly predicted and excluded. Two experimental designs, which involve using an LED light source and a vibration exciter, are proposed to demonstrate the performance. First, the original gray-level resolution of a video camera from, for instance, 0 to 256 levels, was enhanced by summing gray-level data of all pixels in a small region around the point of interest. The image signal was further enhanced by attaching a white paper sheet marked with a black line on the surface of the vibration system in operation to increase the gray-level resolution. Experimental results showed that the Prosilica CV640C CMOS high-speed camera has the critical frequency of inducing the false mode at 60 Hz, whereas that of the webcam is 7.8 Hz. Several factors were proven to have the effect of partially suppressing the non-physical modes, but they cannot eliminate them completely. Two examples, the prominent vibration modes of which are less than the associated critical frequencies, are examined to demonstrate the performances of the proposed systems. In general, the experimental data show that the non-contact type image data acquisition systems are potential tools for collecting the low-frequency vibration signal of a system. PMID:23202026
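The "non-physical modes induced by insufficient frame rates" can be predicted with the standard aliasing (frequency-folding) model; the paper's own simple model is not reproduced in the abstract, so the following is an illustration, and `is_false_mode` is a hypothetical helper:

```python
def aliased_frequency(f_true, frame_rate):
    """Apparent frequency of a sinusoid at f_true Hz sampled at frame_rate Hz,
    folded into the Nyquist band [0, frame_rate / 2]."""
    f = f_true % frame_rate
    return min(f, frame_rate - f)

def is_false_mode(candidate, known_sources, frame_rate, tol=0.5):
    """Flag a spectral peak that matches the alias of a known high-frequency
    source (e.g. lamp flicker) rather than a physical vibration mode."""
    return any(abs(candidate - aliased_frequency(s, frame_rate)) < tol
               for s in known_sources)
```

For a 60 Hz camera, a 70 Hz disturbance appears as a spurious 10 Hz peak, which is the kind of mode the study predicts and excludes.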

  20. Frequency identification of vibration signals using video camera image data.

    PubMed

    Jeng, Yih-Nen; Wu, Chia-Hung

    2012-01-01

    This study showed that an image data acquisition system connecting a high-speed camera or webcam to a notebook or personal computer (PC) can precisely capture most dominant modes of vibration signal, but may involve the non-physical modes induced by the insufficient frame rates. Using a simple model, frequencies of these modes are properly predicted and excluded. Two experimental designs, which involve using an LED light source and a vibration exciter, are proposed to demonstrate the performance. First, the original gray-level resolution of a video camera from, for instance, 0 to 256 levels, was enhanced by summing gray-level data of all pixels in a small region around the point of interest. The image signal was further enhanced by attaching a white paper sheet marked with a black line on the surface of the vibration system in operation to increase the gray-level resolution. Experimental results showed that the Prosilica CV640C CMOS high-speed camera has the critical frequency of inducing the false mode at 60 Hz, whereas that of the webcam is 7.8 Hz. Several factors were proven to have the effect of partially suppressing the non-physical modes, but they cannot eliminate them completely. Two examples, the prominent vibration modes of which are less than the associated critical frequencies, are examined to demonstrate the performances of the proposed systems. In general, the experimental data show that the non-contact type image data acquisition systems are potential tools for collecting the low-frequency vibration signal of a system. PMID:23202026

  1. Photon-counting gamma camera based on columnar CsI(Tl) optically coupled to a back-illuminated CCD

    PubMed Central

    Miller, Brian W.; Barber, H. Bradford; Barrett, Harrison H.; Chen, Liying; Taylor, Sean J.

    2010-01-01

    Recent advances have been made in a new class of CCD-based, single-photon-counting gamma-ray detectors which offer sub-100 μm intrinsic resolutions.1–7 These detectors show great promise in small-animal SPECT and molecular imaging and exist in a variety of configurations. Typically, a columnar CsI(Tl) scintillator or a radiography screen (Gd2O2S:Tb) is imaged onto the CCD. Gamma-ray interactions are seen as clusters of signal spread over multiple pixels. When the detector is operated in a charge-integration mode, signal spread across pixels results in spatial-resolution degradation. However, if the detector is operated in photon-counting mode, the gamma-ray interaction position can be estimated using either Anger (centroid) estimation or maximum-likelihood position estimation resulting in a substantial improvement in spatial resolution.2 Due to the low-light-level nature of the scintillation process, CCD-based gamma cameras implement an amplification stage in the CCD via electron multiplying (EMCCDs)8–10 or via an image intensifier prior to the optical path.1 We have applied ideas and techniques from previous systems to our high-resolution LumiSPECT detector.11, 12 LumiSPECT is a dual-modality optical/SPECT small-animal imaging system which was originally designed to operate in charge-integration mode. It employs a cryogenically cooled, high-quantum-efficiency, back-illuminated large-format CCD and operates in single-photon-counting mode without any intermediate amplification process. Operating in photon-counting mode, the detector has an intrinsic spatial resolution of 64 μm compared to 134 μm in integrating mode. PMID:20890397

  2. The cloud cover fraction obtained from a ground CCD camera and its effect on a radiative transfer model

    NASA Astrophysics Data System (ADS)

    Souza, M. P.; Pereira, E. B.; Martins, F. R.; Chagas, R. C.; Freitas, W. S., Jr.

    2003-04-01

    Clouds are the major factor governing solar irradiance at Earth's surface. They interact with solar radiation in the shortwave spectrum and with terrestrial radiation emitted by Earth's surface in the longwave range. Information about cloud cover is very important input data for radiative transfer models, and great effort is being made to improve methods of obtaining it. This paper reports the effects on a radiative transfer model of using the simple cloud fraction obtained by a ground-based CCD camera instead of the satellite-derived cloud index. The BRASIL-SR model is a radiative transfer model that calculates surface solar irradiance using a normalized cloud index determined by statistical analyses of satellite images and climatological values of temperature and albedo. Cloud fraction was obtained from digital images collected by a ground-based CCD (Charge Coupled Device) camera in the visible range (0.4 µm - 0.7 µm) as RGB (Red - Green - Blue) compositions. The method initially transforms the image attributes from the RGB space to the IHS (Intensity - Hue - Saturation) space. The algorithm defines threshold values for the saturation component of the IHS system to classify a pixel as cloudy or clear sky. Clear skies are identified by high values of saturation in the visible range, while cloudy conditions present a mixture of several wavelengths and consequently lower saturation values. Results from the CCD camera and from the satellite were compared with the Kt and Kd from pyranometer data obtained from a local BSRN radiation station at Florianópolis (27º 28'S, 48º 29'W). They show that cloud fraction alone is poor information about the state of the cloudy sky, since it bears no information on the cloud optical depth needed by most radiative transfer models, including the one used in this paper (BRASIL-SR).
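
    The per-pixel classification step described above can be sketched as follows. The threshold value is an illustrative assumption (the paper derives its own thresholds), and HSV saturation from the standard library is used as a stand-in for the IHS transform.

```python
import colorsys

def is_cloudy(r, g, b, sat_threshold=0.25):
    """Classify an RGB pixel (0-255 channels) as cloudy when its
    saturation is low: clouds mix many wavelengths and so appear
    desaturated, while clear sky is a strongly saturated blue.
    HSV saturation stands in for the IHS saturation component;
    the threshold here is illustrative, not the paper's value."""
    _, s, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return s < sat_threshold

print(is_cloudy(210, 210, 215))  # near-white cloud pixel
print(is_cloudy(40, 90, 200))    # saturated blue-sky pixel
```

    The cloud fraction is then simply the count of cloudy pixels divided by the total number of sky pixels in the image.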

  3. Reliable camera motion estimation from compressed MPEG videos using machine learning approach

    NASA Astrophysics Data System (ADS)

    Wang, Zheng; Ren, Jinchang; Wang, Yubin; Sun, Meijun; Jiang, Jianmin

    2013-05-01

    As an important feature in characterizing video content, camera motion has been widely applied in various multimedia and computer vision applications. A novel method for fast and reliable estimation of camera motion from MPEG videos is proposed, using a support vector machine for estimation in a regression model trained on a synthesized sequence. Experiments conducted on real sequences show that the proposed method yields much improved results in estimating camera motion while avoiding the difficult step of selecting valid macroblocks and motion vectors.

  4. On the Complexity of Digital Video Cameras in/as Research: Perspectives and Agencements

    ERIC Educational Resources Information Center

    Bangou, Francis

    2014-01-01

    The goal of this article is to consider the potential for digital video cameras to produce as part of a research agencement. Our reflection will be guided by the current literature on the use of video recordings in research, as well as by the rhizoanalysis of two vignettes. The first of these vignettes is associated with a short video clip shot by…

  5. Full-disk solar Dopplergrams observed with a one-megapixel CCD camera and a sodium magneto-optical filter

    NASA Technical Reports Server (NTRS)

    Rhodes, Edward J., Jr.; Cacciani, Alessandro; Tomczyk, Steven

    1987-01-01

    The paper presents the first two full-disk solar Dopplergrams obtained with the new 1024 x 1024-pixel CCD camera recently installed at the 60-Foot Tower Telescope of the Mt. Wilson Observatory. These Dopplergrams have a spatial resolution of 2.2 arcseconds and were obtained in a total of one minute. The Dopplergrams were obtained with a magneto-optical filter designed to obtain images in the two Na D lines. The filter and the camera were operated together as part of the development of a solar oscillations imager experiment currently being designed at JPL for the joint NASA/ESA Solar and Heliospheric Observatory mission. Two difference images, obtained by subtracting two pairs of the Dopplergrams from the initial time series, are also included.

  6. Real-time air quality monitoring by using internet video surveillance camera

    NASA Astrophysics Data System (ADS)

    Wong, C. J.; Lim, H. S.; MatJafri, M. Z.; Abdullah, K.; Low, K. L.

    2007-04-01

    Nowadays internet video surveillance cameras are widely used in security monitoring, and the number of installed cameras continues to grow. This paper reports that internet video surveillance cameras can be applied as remote sensors for monitoring the concentration of particulate matter smaller than 10 microns (PM10), so that real-time air quality can be monitored at multiple locations simultaneously. An algorithm was developed based on regression analysis of the relationship between the measured reflectance components from a surface material and the atmosphere. This algorithm converts multispectral image pixel values acquired from these cameras into quantitative values of PM10 concentration. These computed PM10 values were compared to standard values measured by a DustTrak meter. The correlation results showed that the newly developed algorithm produced a high degree of accuracy, as indicated by high correlation coefficient (R2) and low root-mean-square error (RMS) values. The preliminary results showed that the accuracy produced by this internet video surveillance camera is slightly better than that of an internet protocol (IP) camera, essentially because the spatial resolution of the IP camera's images was poorer: the IP camera's images had been compressed, whereas the images from the internet video surveillance camera had not.

  7. Electro-optical testing of fully depleted CCD image sensors for the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Doherty, Peter E.; Antilogus, Pierre; Astier, Pierre; Chiang, James; Gilmore, D. Kirk; Guyonnet, Augustin; Huang, Dajun; Kelly, Heather; Kotov, Ivan; Kubanek, Petr; Nomerotski, Andrei; O'Connor, Paul; Rasmussen, Andrew; Riot, Vincent J.; Stubbs, Christopher W.; Takacs, Peter; Tyson, J. Anthony; Vetter, Kurt

    2014-07-01

    The LSST Camera science sensor array will incorporate 189 large-format Charge Coupled Device (CCD) image sensors. Each CCD will include over 16 million pixels, divided into 16 equally sized segments, with each segment read through a separate output amplifier. The science goals of the project require CCD sensors with state-of-the-art performance in many respects. The broad survey wavelength coverage requires fully depleted, 100 micrometer thick, high-resistivity bulk silicon as the imager substrate. Image quality requirements place strict limits on the image degradation that may be caused by sensor effects: optical, electronic, and mechanical. In this paper we discuss the design of the prototype sensors, the hardware and software that have been used to perform electro-optic testing of the sensors, and a selection of the test results to date. We address the architectural features that lead to internal electrostatic fields, the various effects on charge collection and transport that they cause, including charge diffusion and redistribution, the effects on the delivered PSF, and the potential impacts on delivered science data quality.

  8. [Research Award providing funds for a tracking video camera

    NASA Technical Reports Server (NTRS)

    Collett, Thomas

    2000-01-01

    The award provided funds for a tracking video camera. The camera has been installed and the system calibrated. It has enabled us to follow in real time the tracks of individual wood ants (Formica rufa) within a 3m square arena as they navigate singly in-doors guided by visual cues. To date we have been using the system on two projects. The first is an analysis of the navigational strategies that ants use when guided by an extended landmark (a low wall) to a feeding site. After a brief training period, ants are able to keep a defined distance and angle from the wall, using their memory of the wall's height on the retina as a controlling parameter. By training with walls of one height and length and testing with walls of different heights and lengths, we can show that ants adjust their distance from the wall so as to keep the wall at the height that they learned during training. Thus, their distance from the base of a tall wall is further than it is from the training wall, and the distance is shorter when the wall is low. The stopping point of the trajectory is defined precisely by the angle that the far end of the wall makes with the trajectory. Thus, ants walk further if the wall is extended in length and not so far if the wall is shortened. These experiments represent the first case in which the controlling parameters of an extended trajectory can be defined with some certainty. It raises many questions for future research that we are now pursuing.

  9. Liquid-crystal-display projector-based modulation transfer function measurements of charge-coupled-device video camera systems.

    PubMed

    Teipen, B T; MacFarlane, D L

    2000-02-01

    We demonstrate the ability to measure the system modulation transfer function (MTF) of both color and monochrome charge-coupled-device (CCD) video camera systems with a liquid-crystal-display (LCD) projector. Test matrices programmed to the LCD projector were chosen primarily to have a flat power spectral density (PSD) when averaged along one dimension. We explored several matrices and present results for a matrix produced with a random-number generator, a matrix of sequency-ordered Walsh functions, a pseudorandom Hadamard matrix, and a pseudorandom uniformly redundant array. All results are in agreement with expected filtering. The Walsh matrix and the Hadamard matrix show excellent agreement with the matrix from the random-number generator. We show that shift-variant effects between the LCD array and the CCD array can be kept small. This projector test method offers convenient measurement of the MTF of a low-cost video system. Such characterization is useful for an increasing number of machine vision applications and metrology applications. PMID:18337921
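
    The measurement principle above relies on a test pattern whose averaged power spectral density (PSD) is flat, so that the output spectrum directly reveals the system's filtering. A minimal sketch of the computation follows; the 3-pixel boxcar blur standing in for the camera, and all array sizes, are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
pattern = rng.random((256, 512))   # random test matrix: flat PSD on average

# Stand-in for the camera system: a 3-pixel boxcar blur along each row.
kernel = np.ones(3) / 3.0
blurred = np.apply_along_axis(
    lambda row: np.convolve(row, kernel, mode="same"), 1, pattern)

def mean_psd(img):
    """Power spectral density of each row, averaged over all rows."""
    return (np.abs(np.fft.rfft(img, axis=1)) ** 2).mean(axis=0)

# System MTF estimate: ratio of output to input amplitude spectra.
mtf = np.sqrt(mean_psd(blurred) / mean_psd(pattern))
```

    For this boxcar stand-in the estimate approaches 1 at DC and falls toward 1/3 at the Nyquist frequency, matching the analytic response (1 + 2 cos ω)/3 of a 3-tap average.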

  10. Implementation of a parallel-beam optical-CT apparatus for three-dimensional radiation dosimetry using a high-resolution CCD camera

    NASA Astrophysics Data System (ADS)

    Huang, Wen-Tzeng; Chen, Chin-Hsing; Hung, Chao-Nan; Tuan, Chiu-Ching; Chang, Yuan-Jen

    2015-06-01

    In this study, a charge-coupled device (CCD) camera with 2-megapixel (1920×1080-pixel) and 12-bit resolution was developed for optical computed tomography (optical CT). The signal-to-noise ratio (SNR) of our system was 30.12 dB, better than that of commercially available CCD cameras (25.31 dB). The 50% modulation transfer function (MTF50) of our 1920×1080-pixel camera gave a line width per picture height (LW/PH) of 745, which is 73% of the diffraction-limited resolution. Compared with a commercially available 1-megapixel CCD camera (1296×966-pixel) with LW/PH=358 and 46.6% of the diffraction-limited resolution, our camera system provided higher spatial resolution and better image quality. The NIPAM gel dosimeter was used to evaluate the optical CT with the 2-megapixel CCD. A clinical five-field irradiation treatment plan was generated using the Eclipse planning system (Varian Corp., Palo Alto, CA, USA). The gel phantom was irradiated using a 6-MV Varian Clinac IX linear accelerator (Varian). The measured NIPAM gel dose distributions and the calculated dose distributions generated by the treatment planning software (TPS) were compared using the 3% dose-difference and 3 mm distance-to-agreement criteria. The gamma pass rate was as high as 98.2% when the 2-megapixel CCD camera was used in the optical CT, but only 96.0% when a commercially available 1-megapixel CCD camera was used.

  11. Variable high-resolution color CCD camera system with online capability for professional photo studio application

    NASA Astrophysics Data System (ADS)

    Breitfelder, Stefan; Reichel, Frank R.; Gaertner, Ernst; Hacker, Erich J.; Cappellaro, Markus; Rudolf, Peter; Voelk, Ute

    1998-04-01

    Digital cameras are of increasing significance for professional applications in photo studios where fashion, portrait, product and catalog photographs or advertising photos of high quality have to be taken. The eyelike is a digital camera system which has been developed for such applications. It is capable of working online with high frame rates and images of full sensor size, and it provides a resolution that can be varied between 2048 by 2048 and 6144 by 6144 pixels at an RGB color depth of 12 bits per channel, with an exposure time variable from 1/60 s to 1 s. With an exposure time of 100 ms, digitization takes approximately 2 seconds for an image of 2048 by 2048 pixels (12 MByte), 8 seconds for an image of 4096 by 4096 pixels (48 MByte), and 40 seconds for an image of 6144 by 6144 pixels (108 MByte). The eyelike can be used in various configurations. Used as a camera body, it accepts most commercial lenses via existing lens adaptors; alternatively, it can be used as a back on most commercial 4 x 5 inch view cameras. This paper describes the eyelike camera concept with the essential system components and finishes with a description of the software needed to bring the high quality of the camera to the user.

  12. Acceptance/operational test procedure 241-AN-107 Video Camera System

    SciTech Connect

    Pedersen, L.T.

    1994-11-18

    This procedure will document the satisfactory operation of the 241-AN-107 Video Camera System. The camera assembly, including camera mast, pan-and-tilt unit, camera, and lights, will be installed in Tank 241-AN-107 to monitor activities during the Caustic Addition Project. The camera focus, zoom, and iris remote controls will be functionally tested. The resolution and color rendition of the camera will be verified using standard reference charts. The pan-and-tilt unit will be tested for required ranges of motion, and the camera lights will be functionally tested. The master control station equipment, including the monitor, VCRs, printer, character generator, and video micrometer will be set up and performance tested in accordance with original equipment manufacturer's specifications. The accuracy of the video micrometer to measure objects in the range of 0.25 inches to 67 inches will be verified. The gas drying distribution system will be tested to ensure that a drying gas can be flowed over the camera and lens in the event that condensation forms on these components. This test will be performed by attaching the gas input connector, located in the upper junction box, to a pressurized gas supply and verifying that the check valve, located in the camera housing, opens to exhaust the compressed gas. The 241-AN-107 camera system will also be tested to assure acceptable resolution of the camera imaging components utilizing the camera system lights.

  13. Dynamic imaging with a triggered and intensified CCD camera system in a high-intensity neutron beam

    NASA Astrophysics Data System (ADS)

    Vontobel, P.; Frei, G.; Brunner, J.; Gildemeister, A. E.; Engelhardt, M.

    2005-04-01

    When time-dependent processes within metallic structures are to be inspected and visualized, neutrons are well suited due to their high penetration through Al, Ag, Ti or even steel. It then becomes possible to inspect the propagation, distribution and evaporation of organic liquids such as lubricants, fuel or water. The principal set-up of a suitable real-time system was implemented and tested at the radiography facility NEUTRA of PSI. The highest beam intensity there is 2×10⁷ cm⁻² s⁻¹, which enables observation of sequences in reasonable time and quality. The heart of the detection system is the MCP-intensified CCD camera PI-Max with a Peltier-cooled chip (1300×1340 pixels). The intensifier was used for both gating and image enhancement, while the information was accumulated over many single frames on the chip before readout. Although a 16-bit dynamic range is advertised by the camera manufacturer, the effective range must be less due to the inherent noise level from the intensifier. The results obtained should be seen as a starting point for meeting the different requirements of car producers with respect to fuel injection, lubricant distribution, mechanical stability and operation control. Similar inspections will be possible for all devices with a repetitive operating principle. Here, we report on two measurements dealing with the lubricant distribution in a running motorcycle motor turning at 1200 rpm. We monitored the periodic stationary movements of the piston, valves and camshaft with a micro-channel-plate-intensified CCD camera system (PI-Max 1300RB, Princeton Instruments) triggered at exactly chosen time points.

  14. Lori Losey - The Woman Behind the Video Camera

    NASA Video Gallery

    The often-spectacular aerial video imagery of NASA flight research, airborne science missions and space satellite launches doesn't just happen. Much of it is the work of Lori Losey, senior video pr...

  15. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and Image J software was developed. This method was used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, without any structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to the key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by MTS in increments from 0 N to 500 N, simulating the double-foot standing stance. The anterior and lateral images of the specimen were obtained through two CCD cameras. Digital 8-bit images were processed with Image J, digital image processing software that can be freely downloaded from the National Institutes of Health. The procedure includes recognition of the digital marker, image inversion, sub-pixel reconstruction, image segmentation, and a center of mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and the micro-angular rotation of the sacroiliac joint in the lateral view were calculated from the marker movement. The results of digital image measurement showed the following: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels, and the relative error could reach 1.11‰. The average vertical displacement of S1 of the pelvis was 0.8356 ± 0.2830 mm under a vertical load of 500 Newtons, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°. The load-displacement curves obtained from our optical measure system
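
    The gray-weighted center-of-mass step and the pixel-to-millimetre conversion described above can be sketched as follows; the function names are illustrative, the pixel size is taken from the abstract, and marker recognition and segmentation are assumed already done.

```python
import numpy as np

PIXEL_SIZE_MM = 0.68  # pixel size reported for the 768 x 576 images

def marker_center(gray):
    """Sub-pixel marker position: center of mass weighted by pixel
    gray values, returned as (row, col) in pixel coordinates."""
    gray = np.asarray(gray, dtype=float)
    rows, cols = np.indices(gray.shape)
    total = gray.sum()
    return np.array([(rows * gray).sum(), (cols * gray).sum()]) / total

def displacement_mm(before, after):
    """Marker displacement between two frames, in millimetres."""
    return (marker_center(after) - marker_center(before)) * PIXEL_SIZE_MM
```

    Averaging over all pixels of the marker blob is what pushes the precision below one pixel, which is consistent with the ~0.018-pixel figure quoted above.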

  16. Operational test procedure 241-AZ-101 waste tank color video camera system

    SciTech Connect

    Robinson, R.S.

    1996-10-30

    The purpose of this procedure is to provide a documented means of verifying that all of the functional components of the 241-AZ- 101 Waste Tank Video Camera System operate properly before and after installation.

  17. Engineering task plan for flammable gas atmosphere mobile color video camera systems

    SciTech Connect

    Kohlman, E.H.

    1995-01-25

    This Engineering Task Plan (ETP) describes the design, fabrication, assembly, and testing of the mobile video camera systems. The color video camera systems will be used to observe and record the activities within the vapor space of a tank on a limited-exposure basis. The units will be fully mobile and designed for operation in the single-shell flammable-gas-producing tanks. The objective of this task is to provide two mobile camera systems for use in flammable-gas-producing single-shell tanks (SSTs) for the Flammable Gas Tank Safety Program. The camera systems will provide observation, video recording, and monitoring of the activities that occur in the vapor space of the tanks to which they are applied. The camera systems will be designed to be totally mobile, capable of deployment up to 6.1 meters into a 4 inch (minimum) riser.

  18. Performances of a solid streak camera based on conventional CCD with nanosecond time resolution

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Bai, Yonglin; Zhu, Bingli; Gou, Yongsheng; Xu, Peng; Bai, XiaoHong; Liu, Baiyu; Qin, Junjun

    2015-02-01

    Imaging systems with high temporal resolution are needed to study rapid physical phenomena ranging from shock waves, including extracorporeal shock waves used for surgery, to diagnostics of laser fusion and fuel injection in internal combustion engines. However, conventional streak cameras use a vacuum tube, making them fragile, cumbersome and expensive. Here we report a CMOS streak camera project that reproduces this streak camera functionality completely with a single CMOS chip. By changing the charge-transfer mode of the CMOS image sensor, fast photoelectric diagnostics of a single point with a linear CMOS sensor and high-speed line scanning with an array CMOS sensor can be achieved, respectively. A fast photoelectric diagnostics system has been designed and fabricated to investigate the feasibility of this method. Finally, the dynamic operation of the sensors is presented. Measurements show a sample time of 500 ps and a time resolution better than 2 ns.

  19. Using a Video Camera to Measure the Radius of the Earth

    ERIC Educational Resources Information Center

    Carroll, Joshua; Hughes, Stephen

    2013-01-01

    A simple but accurate method for measuring the Earth's radius using a video camera is described. A video camera was used to capture a shadow rising up the wall of a tall building at sunset. A free program called ImageJ was used to measure the time it took the shadow to rise a known distance up the building. The time, distance and length of…

  20. MISR Level 1A CCD Science data, all cameras (MIL1A_V2)

    NASA Technical Reports Server (NTRS)

    Diner, David J. (Principal Investigator)

    The Level 1A data are raw MISR data that have been decommutated, reformatted (12-bit Level 0 data shifted to byte boundaries, i.e., the square-root encoding reversed and the data converted to 16 bits), and annotated (e.g., with time information). These data are used by the Level 1B1 processing algorithm to generate calibrated radiances. The science data output preserves the spatial sampling rate of the Level 0 raw MISR CCD science data. CCD data are collected during routine science observations of the sunlit portion of the Earth. Each product represents one 'granule' of data, defined as the smallest unit of data required for MISR processing. Also included in the Level 1A product are pointers to calibration coefficient files provided for Level 1B processing. [Location=GLOBAL] [Temporal_Coverage: Start_Date=2000-02-24; Stop_Date=] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180].

  1. Development of Measurement Device of Working Radius of Crane Based on Single CCD Camera and Laser Range Finder

    NASA Astrophysics Data System (ADS)

    Nara, Shunsuke; Takahashi, Satoru

    In this paper, we develop an observation device to measure the working radius of a crane truck. The device has a single CCD camera, a laser range finder and two AC servo motors. First, in order to measure the working radius, we need an algorithm for crane hook recognition. We therefore attach a cross mark to the crane hook and, instead of the hook itself, recognize the cross mark. Further, for the observation device, we construct a PI control system with an extended Kalman filter to track the moving cross mark. Through experiments, we show the usefulness of our device, including the new mark-tracking control system.

  2. Infrared imaging spectrometry by the use of bundled chalcogenide glass fibers and a PtSi CCD camera

    NASA Astrophysics Data System (ADS)

    Saito, Mitsunori; Kikuchi, Katsuhiro; Tanaka, Chinari; Sone, Hiroshi; Morimoto, Shozo; Yamashita, Toshiharu T.; Nishii, Junji

    1999-10-01

    A coherent fiber bundle for infrared image transmission was prepared by arranging 8400 chalcogenide (AsS) glass fibers. The fiber bundle, 1 m in length, is transmissive in the infrared spectral region of 1 - 6 micrometer. A remote spectroscopic imaging system was constructed with the fiber bundle and an infrared PtSi CCD camera. The system was used for the real-time observation (frame time: 1/60 s) of gas distribution. Infrared light from a SiC heater was delivered to a gas cell through a chalcogenide fiber, and transmitted light was observed through the fiber bundle. A band-pass filter was used for the selection of gas species. A He-Ne laser of 3.4 micrometer wavelength was also used for the observation of hydrocarbon gases. Gases bursting from a nozzle were observed successfully by a remote imaging system.

  3. Range-Gated LADAR Coherent Imaging Using Parametric Up-Conversion of IR and NIR Light for Imaging with a Visible-Range Fast-Shuttered Intensified Digital CCD Camera

    SciTech Connect

    YATES,GEORGE J.; MCDONALD,THOMAS E. JR.; BLISS,DAVID E.; CAMERON,STEWART M.; ZUTAVERN,FRED J.

    2000-12-20

    Research is presented on infrared (IR) and near infrared (NIR) sensitive sensor technologies for use in a high-speed shuttered/intensified digital video camera system for range-gated imaging at ''eye-safe'' wavelengths in the region of 1.5 microns. The study is based upon nonlinear crystals used for second harmonic generation (SHG) in optical parametric oscillators (OPOs) for conversion of NIR and IR laser light to visible-range light for detection with generic S-20 photocathodes. The intensifiers are ''stripline'' geometry 18-mm diameter microchannel plate intensifiers (MCPIIs), designed by Los Alamos National Laboratory and manufactured by Philips Photonics. The MCPIIs are designed for fast optical shuttering with exposures in the 100-200 ps range, and are coupled to a fast readout CCD camera. Conversion efficiency and resolution for the wavelength conversion process are reported. Experimental set-ups for the wavelength shifting and the optical configurations for producing and transporting laser reflectance images are discussed.

  4. Close infrared thermography using an intensified CCD camera: application in nondestructive high resolution evaluation of electrothermally actuated MEMS

    NASA Astrophysics Data System (ADS)

    Serio, B.; Hunsinger, J. J.; Conseil, F.; Derderian, P.; Collard, D.; Buchaillot, L.; Ravat, M. F.

    2005-06-01

    This communication describes an optical method for thermal characterization of MEMS devices. The method is based on the use of an intensified CCD camera to record the thermal radiation emitted by the studied device in the spectral domain from 600 nm to about 850 nm. The camera consists of an intensifier coupled to a CCD sensor. The intensification allows very low signal levels to be amplified and detected. We used a standard optical microscope to image the device with sub-micron resolution. Since, in the near infrared, at very small scale and low temperature (typically 250°C for thermal MEMS (Micro-Electro-Mechanical Systems)), the thermal radiation is very weak, we used image integration to increase the signal-to-noise ratio. Knowing the emissivity of the imaged materials, the temperature is obtained using Planck's law. In order to evaluate the system performance we made micro-thermographies of a micro-relay thermal actuator. This device is a "U-shape" Al/SiO2 bimorph cantilever micro-relay with a gold-to-gold electrical contact, designed for secured harsh-environment applications. The initial beam curvature resulting from residual stresses ensures a large gap between the contacts of the micro-relay. The current flow through the metallic layer heats the bimorph by the Joule effect, and the differential expansion provides the vertical displacement for contact. The experimental results are compared with FEM and analytical simulations; good agreement was obtained between experiment and simulation.
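
    Recovering temperature from the recorded radiance, as described above, amounts to inverting Planck's law at a known emissivity. A minimal single-wavelength sketch follows; the paper's actual calibration integrates over the 600-850 nm band, so this gray-body, monochromatic form is illustrative only.

```python
import math

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
K = 1.380649e-23     # Boltzmann constant (J/K)

def planck_radiance(wavelength, temperature, emissivity=1.0):
    """Spectral radiance (W sr^-1 m^-3) of a gray body at the given
    wavelength (m) and temperature (K)."""
    a = 2.0 * H * C**2 / wavelength**5
    b = H * C / (wavelength * K * temperature)
    return emissivity * a / math.expm1(b)

def temperature_from_radiance(wavelength, radiance, emissivity=1.0):
    """Invert Planck's law for temperature, given the emissivity."""
    a = 2.0 * H * C**2 / wavelength**5
    return H * C / (wavelength * K * math.log1p(emissivity * a / radiance))
```

    At 250°C and 700 nm the exponent hc/(λkT) is large, so the signal is indeed very weak, which is why the intensifier and frame integration described above are needed.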

  5. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  6. Still-Video Photography: Tomorrow's Electronic Cameras in the Hands of Today's Photojournalists.

    ERIC Educational Resources Information Center

    Foss, Kurt; Kahan, Robert S.

    This paper examines the still-video camera and its potential impact by looking at recent experiments and by gathering information from some of the few people knowledgeable about the new technology. The paper briefly traces the evolution of the tools and processes of still-video photography, examining how photographers and their work have been…

  7. LED characterization for development of on-board calibration unit of CCD-based advanced wide-field sensor camera of Resourcesat-2A

    NASA Astrophysics Data System (ADS)

    Chatterjee, Abhijit; Verma, Anurag

    2016-05-01

    The Advanced Wide Field Sensor (AWiFS) camera caters to the high-temporal-resolution requirement of the Resourcesat-2A mission, with a revisit period of 5 days. The AWiFS camera consists of four spectral bands, three in the visible and near IR and one in the short-wave infrared. The imaging concept in the VNIR bands is based on push-broom scanning using a linear-array silicon charge coupled device (CCD) based Focal Plane Array (FPA). The On-Board Calibration unit for these CCD-based FPAs is used to monitor any degradation in the FPA during the entire mission life. Four LEDs are operated in constant-current mode, and 16 different light intensity levels are generated by electronically changing the exposure of the CCD throughout the calibration cycle. This paper describes the experimental setup and characterization results of various flight-model visible LEDs (λP=650nm) for development of the On-Board Calibration unit of the Advanced Wide Field Sensor (AWiFS) camera of RESOURCESAT-2A. Various LED configurations have been studied to cover the dynamic range of the 6000-pixel silicon CCD-based focal plane array from 20% to 60% of saturation during the night pass of the satellite, in order to identify degradation of detector elements. The paper also compares simulation and experimental results of the CCD output profile for different LED combinations in constant-current mode.

  8. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument currently being developed by NASA's Marshall Space Flight Center (MSFC) and the National Astronomical Observatory of Japan (NAOJ). The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-alpha UV (121.6 nm) light emitted by the Sun's chromosphere, in order to measure the magnetic field in this region. To make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include quantum efficiency, gain, dark current, noise, and linearity, each of which must meet predetermined requirements to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of no greater than 2 e-/DN, a noise level less than 25 e-, a dark current level less than 10 e-/pixel/s, and a residual nonlinearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.
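
    The pass/fail criteria quoted in this abstract (gain ≤ 2 e-/DN, read noise < 25 e-, dark current < 10 e-/pixel/s, residual nonlinearity < 1%) can be expressed as a simple requirements check; the "measured" values below are placeholders, not actual CLASP test results.

    ```python
    # Hedged sketch: checking measured camera characteristics against the
    # CLASP requirements stated in the abstract. The example measurements
    # are hypothetical.

    REQUIREMENTS = {
        "gain_e_per_dn":    lambda v: v <= 2.0,   # gain no greater than 2 e-/DN
        "read_noise_e":     lambda v: v < 25.0,   # noise less than 25 e-
        "dark_e_pix_s":     lambda v: v < 10.0,   # dark current < 10 e-/pixel/s
        "nonlinearity_pct": lambda v: v < 1.0,    # residual nonlinearity < 1%
    }

    def check_camera(measured):
        """Return a dict mapping each requirement to True (met) or False."""
        return {name: test(measured[name]) for name, test in REQUIREMENTS.items()}

    example = {"gain_e_per_dn": 1.8, "read_noise_e": 12.0,
               "dark_e_pix_s": 3.5, "nonlinearity_pct": 0.4}
    results = check_camera(example)
    ```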

  9. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Astrophysics Data System (ADS)

    Joiner, R. K.; Kobayashi, K.; Winebarger, A. R.; Champey, P. R.

    2014-12-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument currently being developed by NASA's Marshall Space Flight Center (MSFC) and the National Astronomical Observatory of Japan (NAOJ). The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-alpha UV (121.6 nm) light emitted by the Sun's chromosphere, in order to measure the magnetic field in this region. To make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include quantum efficiency, gain, dark current, noise, and linearity, each of which must meet predetermined requirements to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of no greater than 2 e-/DN, a noise level less than 25 e-, a dark current level less than 10 e-/pixel/s, and a residual nonlinearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.

  10. Engineering task plan for Tanks 241-AN-103, 104, 105 color video camera systems

    SciTech Connect

    Kohlman, E.H.

    1994-11-17

    This Engineering Task Plan (ETP) describes the design, fabrication, assembly, and installation of video camera systems into the vapor space within tanks 241-AN-103, 104, and 105. The single-camera, remotely operated color video systems will be used to observe and record activities within the vapor space, including but not limited to core sampling, auger activities, crust layer examination, and monitoring of equipment installation/removal. The objective of this task is to provide a single camera system in each of the tanks for the Flammable Gas Tank Safety Program.

  11. Lights! Camera! Action! Handling Your First Video Assignment.

    ERIC Educational Resources Information Center

    Thomas, Marjorie Bekaert

    1989-01-01

    The author discusses points to consider when hiring and working with a video production company to develop a video for human resources purposes. Questions to ask the consultants are included, as is information on the role of the company liaison and on how to avoid expensive, time-wasting pitfalls. (CH)

  12. Lights, Cameras, Pencils! Using Descriptive Video to Enhance Writing

    ERIC Educational Resources Information Center

    Hoffner, Helen; Baker, Eileen; Quinn, Kathleen Benson

    2008-01-01

    Students of various ages and abilities can increase their comprehension and build vocabulary with the help of a new technology, Descriptive Video. Descriptive Video (also known as described programming) was developed to give individuals with visual impairments access to visual media such as television programs and films. Described programs,…

  13. Feasibility study of transmission of OTV camera control information in the video vertical blanking interval

    NASA Technical Reports Server (NTRS)

    White, Preston A., III

    1994-01-01

    The Operational Television system at Kennedy Space Center operates hundreds of video cameras, many remotely controllable, in support of the operations at the center. This study was undertaken to determine if commercial NABTS (North American Basic Teletext System) teletext transmission in the vertical blanking interval of the genlock signals distributed to the cameras could be used to send remote control commands to the cameras and the associated pan and tilt platforms. Wavelength division multiplexed fiberoptic links are being installed in the OTV system to obtain RS-250 short-haul quality. It was demonstrated that the NABTS transmission could be sent over the fiberoptic cable plant without excessive video quality degradation and that video cameras could be controlled using NABTS transmissions over multimode fiberoptic paths as long as 1.2 km.

  14. Low cost referenced luminescent imaging of oxygen and pH with a 2-CCD colour near infrared camera.

    PubMed

    Ehgartner, Josef; Wiltsche, Helmar; Borisov, Sergey M; Mayr, Torsten

    2014-10-01

    A low cost imaging set-up for optical chemical sensors based on NIR-emitting dyes is presented. It is based on a commercially available 2-CCD colour near infrared camera, LEDs and tailor-made optical sensing materials for oxygen and pH. The set-up extends common ratiometric RGB imaging based on the red, green and blue channels of colour cameras by an additional NIR channel. The hardware and software of the camera were adapted to perform ratiometric imaging. A series of new planar sensing foils were introduced to image oxygen, pH and both parameters simultaneously. The used NIR-emitting indicators are based on benzoporphyrins and aza-BODIPYs for oxygen and pH, respectively. Moreover, a wide dynamic range oxygen sensor is presented. It allows accurate imaging of oxygen from trace levels up to ambient air concentrations. The imaging set-up in combination with the normal range ratiometric oxygen sensor showed a resolution of 4-5 hPa at low oxygen concentrations (<50 hPa) and 10-15 hPa at ambient air oxygen concentrations; the trace range oxygen sensor (<20 hPa) revealed a resolution of about 0.5-1.8 hPa. The working range of the pH-sensor was in the physiological region from pH 6.0 up to pH 8.0 and showed an apparent pKa-value of 7.3 with a resolution of about 0.1 pH units. The performance of the dual parameter oxygen/pH sensor was comparable to the single analyte pH and normal range oxygen sensors. PMID:25096329
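
    The ratiometric oxygen readout this abstract describes (an NIR indicator channel referenced against a second colour channel) is commonly modelled with a Stern-Volmer calibration. A minimal sketch, assuming the simple single-site model I0/I = 1 + Ksv·pO2; the constants Ksv and R0 are placeholder values, not those of the paper's sensing foils.

    ```python
    # Sketch of ratiometric luminescent oxygen imaging with a Stern-Volmer
    # calibration. KSV and R0 are hypothetical calibration constants.

    KSV = 0.02   # assumed Stern-Volmer constant, 1/hPa
    R0 = 2.5     # assumed indicator/reference intensity ratio at pO2 = 0

    def po2_from_ratio(nir_indicator, reference):
        """Convert per-pixel intensities to oxygen partial pressure (hPa)."""
        r = nir_indicator / reference        # ratiometric signal
        return (R0 / r - 1.0) / KSV
    ```

    At zero oxygen the ratio equals R0 and the model returns 0 hPa; as quenching reduces the indicator intensity, the recovered pO2 rises.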

  15. Million-frame-per-second CCD camera with 16 frames of storage

    NASA Astrophysics Data System (ADS)

    Howard, Nathan E.; Gardner, David W.; Snyder, Donald R.

    1997-12-01

    Ultrafast imaging is an important need for the development, control, and evaluation of modern air-deliverable weapons systems. Recent advances in optical imaging such as speckle interferometry can potentially improve DoD capability to deliver munitions and armaments to targets at long ranges, and under adverse seeing conditions. Moderate density arrays of at least 100 by 100 pixels and frame rates of at least 1 MHz are required. Ultrafast imaging is also required for flow field optical image analysis for hypersonic propulsion systems. Silicon Mountain Design (SMD) has built such an imager so that high quality images can be obtained for relatively low cost. The SMD-64k1M camera is capable of imaging 1,000,000 frames per second using a 256 by 256 array with the ability to store 16 frames with true 12 bits of dynamic range. This camera allows researchers to capture multiple high speed events using solid state technology housed in a 53 cubic inch package. A brief technical overview of the imager and results are presented in this paper.

  16. Evaluation of imaging performance of a taper optics CCD 'FReLoN' camera designed for medical imaging.

    PubMed

    Coan, Paola; Peterzol, Angela; Fiedler, Stefan; Ponchut, Cyril; Labiche, Jean Claude; Bravin, Alberto

    2006-05-01

    The purpose of this work was to assess the imaging performance of an indirect conversion detector (taper optics CCD 'FReLoN' camera) in terms of the modulation transfer function (MTF), normalized noise power spectrum (NNPS) and detective quantum efficiency (DQE). Measurements were made with a synchrotron radiation laminar beam at various monochromatic energies in the 20-51.5 keV range for gadolinium-based fluorescent screens of varying thickness; data acquisition and analysis were performed by adapting protocols used for conventional cone beams to this beam geometry. The pre-sampled MTFs of the systems were measured using an edge method. The NNPS of the systems were determined for a range of exposure levels by two-dimensional Fourier analysis of uniformly exposed radiographs. The DQEs were assessed from the measured MTF, NNPS, exposure and incoming number of photons. The MTF, for a given screen, was found to be almost energy independent and, for a given energy, higher for the thinnest screen. At 33 keV and for the 40 (100) μm screen, the MTF reaches 10% at 9.2 (8.6) line-pairs mm(-1). The NNPS was found to differ between the two analyzed directions as a function of frequency. The highest DQE value was found for the combination of the 100 μm screen and 25 keV (0.5); it was still equal to 0.4 at 51.5 keV (above the gadolinium K-edge). The DQE is limited by the phosphor screen conversion yield and by the CCD efficiency. Finally, the FReLoN characterization results are compared with those of a selected number of detectors presented in the literature. PMID:16645252
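
    The three quantities this abstract relates are tied together by the standard frequency-dependent DQE expression, DQE(f) = MTF(f)² / (q · NNPS(f)), where NNPS is the noise power spectrum normalized by the squared mean signal and q is the incident photon fluence. A minimal sketch of that computation (values illustrative):

    ```python
    # Sketch of the standard DQE computation from measured MTF and NNPS.
    # q (photon fluence) and the example arrays below are illustrative.
    import numpy as np

    def dqe(mtf, nnps, photon_fluence):
        """DQE(f) = MTF(f)^2 / (q * NNPS(f)), elementwise over frequency."""
        mtf = np.asarray(mtf, dtype=float)
        nnps = np.asarray(nnps, dtype=float)
        return mtf**2 / (photon_fluence * nnps)

    # An ideal photon-counting detector (MTF = 1, NNPS = 1/q) yields DQE = 1:
    ideal = dqe(1.0, 0.01, 100.0)
    ```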

  17. Effects of point-spread function on calibration and radiometric accuracy of CCD camera.

    PubMed

    Du, Hong; Voss, Kenneth J

    2004-01-20

    The point-spread function (PSF) of a camera can seriously affect the accuracy of radiometric calibration and measurement. We found that the PSF can produce a 3.7% difference between the apparent measured radiance of two plaques of different sizes under the same illumination. This difference can be removed by deconvolution with the measured PSF. To determine the PSF, many images of a collimated beam from a He-Ne laser are averaged. Since our optical system is focused at infinity, it should focus this source to a single pixel. Although the measured PSF is very sharp, dropping 4 and 6 orders of magnitude at 8 and 100 pixels from the point source, respectively, we show that the effect of the PSF as far as 100 pixels away cannot be ignored without introducing an appreciable error into the calibration. We believe that the PSF should be taken into account in all optical systems to obtain accurate radiometric measurements. PMID:14765928
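
    The PSF correction this abstract describes amounts to a deconvolution of the image with the measured PSF. A minimal Fourier-domain sketch, not the authors' actual algorithm; the regularization term `eps` is an assumption to stabilize frequencies where the PSF response is tiny, and real radiometric work would use a noise-dependent filter.

    ```python
    # Wiener-style deconvolution sketch: divide out the PSF's frequency
    # response, regularized by eps. This illustrates the principle only.
    import numpy as np

    def deconvolve(image, psf, eps=1e-3):
        """Remove PSF blur from image (both 2-D arrays, circular convolution)."""
        H = np.fft.fft2(psf, s=image.shape)   # PSF frequency response
        G = np.fft.fft2(image)
        F = G * np.conj(H) / (np.abs(H)**2 + eps)
        return np.real(np.fft.ifft2(F))
    ```

    With a well-conditioned PSF and small `eps`, blurring a scene and deconvolving recovers it almost exactly; with real noisy data, `eps` trades residual blur against noise amplification.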

  18. Experimental Comparison of the High-Speed Imaging Performance of an EM-CCD and sCMOS Camera in a Dynamic Live-Cell Imaging Test Case

    PubMed Central

    Beier, Hope T.; Ibey, Bennett L.

    2014-01-01

    The study of living cells may require advanced imaging techniques to track weak and rapidly changing signals. Fundamental to this need is the recent advancement in camera technology. Two camera types, specifically sCMOS and EM-CCD, promise both high signal-to-noise and high speed (>100 fps), leaving researchers with a critical decision when determining the best technology for their application. In this article, we compare two cameras using a live-cell imaging test case in which small changes in cellular fluorescence must be rapidly detected with high spatial resolution. The EM-CCD maintained an advantage of being able to acquire discernible images with a lower number of photons due to its EM-enhancement. However, if high-resolution images at speeds approaching or exceeding 1000 fps are desired, the flexibility of the full-frame imaging capabilities of sCMOS is superior. PMID:24404178

  19. CCD and CMOS sensors

    NASA Astrophysics Data System (ADS)

    Waltham, Nick

    The charge-coupled device (CCD) has been developed primarily as a compact image sensor for consumer and industrial markets, but is now also the preeminent visible and ultraviolet wavelength image sensor in many fields of scientific research including space-science and both Earth and planetary remote sensing. Today's scientific or science-grade CCD will strive to maximise pixel count, focal plane coverage, photon detection efficiency over the broadest spectral range and signal dynamic range whilst maintaining the lowest possible readout noise. The relatively recent emergence of complementary metal oxide semiconductor (CMOS) image sensor technology is arguably the most important development in solid-state imaging since the invention of the CCD. CMOS technology enables the integration on a single silicon chip of a large array of photodiode pixels alongside all of the ancillary electronics needed to address the array and digitise the resulting analogue video signal. Compared to the CCD, CMOS promises a more compact, lower mass, lower power and potentially more radiation tolerant camera.

  20. Measurement accuracy of stereovision systems based on CCD video-photographic equipment in application to agricultural and environmental surveys

    NASA Astrophysics Data System (ADS)

    Menesatti, Paolo

    1996-12-01

    Artificial vision and image analysis are playing an increasing role in agriculture. Using systems based on stereoscopic vision, it is possible to attach a three-dimensional space reference to image information, and hence to measure the distance between the vision system and any point of the observed scene, or to calculate relative positions between different subjects in the same image. This work evaluates the feasibility, capability and accuracy of stereovision systems for environmental and agricultural surveys. The analysis was performed theoretically as a function of the characteristics of several CCD image acquisition devices available on the video-photographic market, and of environmental parameters (field-of-view width, and the linear distance z between the video system and the subject). Good accuracy is obtainable even with a 'standard' system (500-pixel resolution) at a distance z of 100 m with a 5 m baseline between the two cameras. In a similar situation, high-performance equipment (3060-pixel resolution) can achieve centimeter-level accuracy.
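
    The accuracy figures in this abstract follow from the standard stereo depth-error estimate: with depth z = f·B/d (f the focal length in pixels, B the baseline, d the disparity), a disparity uncertainty δd maps to a depth uncertainty δz ≈ z²·δd / (f·B). A minimal sketch with illustrative camera parameters (not those of the devices analysed in the paper):

    ```python
    # Stereo depth-error sketch: dz ~= z^2 * dd / (f * B).
    # focal_px and disparity_err_px are assumed example values.

    def depth_error(z, baseline, focal_px, disparity_err_px=1.0):
        """Approximate depth uncertainty at range z (same units as z)."""
        return z**2 * disparity_err_px / (focal_px * baseline)

    # e.g. z = 100 m, 5 m baseline, 1000 px focal length, 1 px disparity error
    err_at_100m = depth_error(100.0, 5.0, 1000.0)
    ```

    The quadratic growth with z is why a longer baseline or a higher-resolution sensor (larger f in pixels) is needed for accurate long-range measurement.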

  1. A Comparison of Techniques for Camera Selection and Hand-Off in a Video Network

    NASA Astrophysics Data System (ADS)

    Li, Yiming; Bhanu, Bir

    Video networks are becoming increasingly important for solving many real-world problems. Multiple video sensors require collaboration when performing various tasks. One of the most basic tasks is the tracking of objects, which requires mechanisms to select a camera for a certain object and hand-off this object from one camera to another so as to accomplish seamless tracking. In this chapter, we provide a comprehensive comparison of current and emerging camera selection and hand-off techniques. We consider geometry-, statistics-, and game theory-based approaches and provide both theoretical and experimental comparison using centralized and distributed computational models. We provide simulation and experimental results using real data for various scenarios of a large number of cameras and objects for in-depth understanding of strengths and weaknesses of these techniques.

  2. Kids behind the Camera: Education for the Video Age.

    ERIC Educational Resources Information Center

    Berwick, Beverly

    1994-01-01

    Some San Diego teachers created the Montgomery Media Institute to tap the varied talents of young people attending area high schools and junior high schools. Featuring courses in video programming and production, photography, and journalism, this program engages students' interest while introducing them to fields with current employment…

  3. Passive millimeter-wave video camera for aviation applications

    NASA Astrophysics Data System (ADS)

    Fornaca, Steven W.; Shoucri, Merit; Yujiri, Larry

    1998-07-01

    Passive Millimeter Wave (PMMW) imaging technology offers significant safety benefits to world aviation. Made possible by recent technological breakthroughs, PMMW imaging sensors provide visual-like images of objects under low visibility conditions (e.g., fog, clouds, snow, sandstorms, and smoke) which blind visual and infrared sensors. TRW has developed an advanced, demonstrator version of a PMMW imaging camera that, when front-mounted on an aircraft, gives images of the forward scene at a rate and quality sufficient to enhance aircrew vision and situational awareness under low visibility conditions. Potential aviation uses for a PMMW camera are numerous and include: (1) Enhanced vision for autonomous take- off, landing, and surface operations in Category III weather on Category I and non-precision runways; (2) Enhanced situational awareness during initial and final approach, including Controlled Flight Into Terrain (CFIT) mitigation; (3) Ground traffic control in low visibility; (4) Enhanced airport security. TRW leads a consortium which began flight tests with the demonstration PMMW camera in September 1997. Flight testing will continue in 1998. We discuss the characteristics of PMMW images, the current state of the technology, the integration of the camera with other flight avionics to form an enhanced vision system, and other aviation applications.

  4. Camera/Video Phones in Schools: Law and Practice

    ERIC Educational Resources Information Center

    Parry, Gareth

    2005-01-01

    The emergence of mobile phones with built-in digital cameras is creating legal and ethical concerns for school systems throughout the world. Users of such phones can instantly email, print or post pictures to other MMS1 phones or websites. Local authorities and schools in Britain, Europe, USA, Canada, Australia and elsewhere have introduced…

  5. BOREAS RSS-3 Imagery and Snapshots from a Helicopter-Mounted Video Camera

    NASA Technical Reports Server (NTRS)

    Walthall, Charles L.; Loechel, Sara; Nickeson, Jaime (Editor); Hall, Forrest G. (Editor)

    2000-01-01

    The BOREAS RSS-3 team collected helicopter-based video coverage of forested sites acquired during BOREAS as well as single-frame "snapshots" processed to still images. Helicopter data used in this analysis were collected during all three 1994 IFCs (24-May to 16-Jun, 19-Jul to 10-Aug, and 30-Aug to 19-Sep), at numerous tower and auxiliary sites in both the NSA and the SSA. The VHS-camera observations correspond to other coincident helicopter measurements. The field of view of the camera is unknown. The video tapes are in both VHS and Beta format. The still images are stored in JPEG format.

  6. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras.

    PubMed

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-01-01

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system. PMID:27347961

  7. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras

    PubMed Central

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-01-01

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system. PMID:27347961

  8. Using hand-held point and shoot video cameras in clinical education.

    PubMed

    Stoten, Sharon

    2011-02-01

    Clinical educators are challenged to design and implement creative instructional strategies to provide employees with optimal clinical practice learning opportunities. Using hand-held video cameras to capture patient encounters or skills demonstrations involves employees in active learning and can increase dialogue between employees and clinical educators. The video that is created also can be used for evaluation and feedback. Hands-on experiences may energize employees with different talents and styles of learning. PMID:21323214

  9. Laser Imaging Video Camera Sees Through Fire, Fog, Smoke

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a series of SBIR contracts with Langley Research Center, inventor Richard Billmers refined a prototype for a laser imaging camera capable of seeing through fire, fog, smoke, and other obscurants. Now, Canton, Ohio-based Laser Imaging through Obscurants (LITO) Technologies Inc. is demonstrating the technology as a perimeter security system at Glenn Research Center and planning its future use in aviation, shipping, emergency response, and other fields.

  10. Observation of hydrothermal flows with acoustic video camera

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Asada, A.; Tamaki, K.; Scientific Team Of Yk09-13 Leg 1

    2010-12-01

    Ridge 18-20deg.S, where hydrothermal plume signatures had previously been detected. DIDSON was mounted on top of Shinkai 6500 in order to obtain acoustic video images of hydrothermal plumes. Seven dives of Shinkai 6500 were conducted during this cruise, and acoustic video images of hydrothermal plumes were captured in three of them. These are among the only acoustic video images of hydrothermal plumes obtained to date. Processing and analysis of the acoustic video image data are ongoing. We will report an overview of the acoustic video images of the hydrothermal plumes and discuss the potential of DIDSON as an observation tool for seafloor hydrothermal activity.

  11. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  12. Nyquist Sampling Theorem: Understanding the Illusion of a Spinning Wheel Captured with a Video Camera

    ERIC Educational Resources Information Center

    Levesque, Luc

    2014-01-01

    Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the…
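
    The spinning-wheel illusion this abstract discusses is plain temporal aliasing: a wheel rotating at f_true revolutions per second, sampled by a camera at fs frames per second, appears to rotate at the true frequency folded into the range [-fs/2, fs/2). A minimal sketch of that folding:

    ```python
    # Wagon-wheel aliasing sketch: fold the true rotation frequency into
    # the Nyquist band [-fs/2, fs/2). Negative results mean the wheel
    # appears to spin backwards.

    def apparent_frequency(f_true, fs):
        """Aliased rotation frequency seen by a camera sampling at fs fps."""
        return (f_true + fs / 2.0) % fs - fs / 2.0
    ```

    For example, a wheel at 24 rev/s filmed at 25 fps appears to turn backwards at 1 rev/s, while one at exactly 25 rev/s appears stationary.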

  13. High-resolution application of YAG:Ce and LuAG:Ce imaging detectors with a CCD X-ray camera

    NASA Astrophysics Data System (ADS)

    Touš, Jan; Horváth, Martin; Pína, Ladislav; Blažek, Karel; Sopko, Bruno

    2008-06-01

    A high-resolution CCD X-ray camera based on YAG:Ce or LuAG:Ce thin scintillators is presented. High resolution with low-energy X-ray radiation is demonstrated using several test objects; the spatial resolution achieved in the images is about 1 μm. The high-resolution imaging system is a combination of a high-sensitivity digital CCD camera and an optical system with a thin scintillator imaging screen. The screen can consist of YAG:Ce or LuAG:Ce inorganic scintillator [J.A. Mares, Radiat. Meas. 38 (2004) 353]. These materials have the advantages of mechanical and chemical stability and non-hygroscopicity. The high-resolution imaging system can be used with different types of radiation (X-ray, electrons, UV, and VUV [M. Nikl, Meas. Sci. Technol. 17 (2006) R37]). The objects used for the imaging tests were grids and small animals with features several microns in size. The resolution capabilities were tested using different types of CCD cameras and scintillation imaging screens.

  14. Video content analysis on body-worn cameras for retrospective investigation

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; Baan, Jan; ter Haar, Frank B.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Burghouts, Gertjan J.; Wijn, Remco; van den Broek, Sebastiaan P.; van Rest, Jeroen H. C.

    2015-10-01

    In the security domain, cameras are important to assess critical situations. Apart from fixed surveillance cameras we observe an increasing number of sensors on mobile platforms, such as drones, vehicles and persons. Mobile cameras allow rapid and local deployment, enabling many novel applications and effects, such as the reduction of violence between police and citizens. However, the increased use of bodycams also creates potential challenges. For example: how can end-users extract information from the abundance of video, how can the information be presented, and how can an officer retrieve information efficiently? Nevertheless, such video gives the opportunity to stimulate the professionals' memory, and support complete and accurate reporting. In this paper, we show how video content analysis (VCA) can address these challenges and seize these opportunities. To this end, we focus on methods for creating a complete summary of the video, which allows quick retrieval of relevant fragments. The content analysis for summarization consists of several components, such as stabilization, scene selection, motion estimation, localization, pedestrian tracking and action recognition in the video from a bodycam. The different components and visual representations of summaries are presented for retrospective investigation.

  15. A small CCD zenith camera (ZC-G1) - developed for rapid geoid monitoring in difficult projects

    NASA Astrophysics Data System (ADS)

    Gerstbach, G.; Pichler, H.

    2003-10-01

    Modern geodesy by terrestrial or space methods is accurate to millimetres or even better. This requires very exact system definitions, together with astronomy and physics, and a geoid at the cm level. To reach this precision, astrogeodetic vertical deflections are more effective than gravimetry or other methods, as shown by the first author in 1996 in many projects in different European countries and landscapes. While classical astrogeodesy is rather complicated (time consuming, heavy instruments, and dependent on the observer's experience), new electro-optical methods are semi-automatic and fill our "geoid gap" between satellite resolution (150 km) and local requirements (2-10 km): with CCD we can speed up and achieve high accuracy almost without observer's experience. In Vienna we are constructing a mobile zenith camera guided by notebook and GPS: made of Dur-Al, f=20 cm, with a Starlite MX sensor (752×580 pixels at 11 μm). Accuracy ±1" within 10 min, mounted on a usual survey tripod. Weight only 4 kg for a special vertical axis, controlled by springs (4×90°) and 2 levels (2002) or a sensor (2003). Applications 2003: improving parts of the Austrian geoid (±4 cm→2 cm); automatic astro-points in alpine surveys (vertical deflection effects 3-15 cm per km); transforming GPS heights to ±1 cm; a tunneling study: heighting up to ±0.1 mm without external control, combining astro-topographic and geological data. Plans 2004: astro control of polygons and networks, to raise accuracy and economy by ~40% (Sun azimuths of ±3"; additional effort only 10-20%). Planned with servo theodolites and open co-operation groups.

  16. Acceptance/operational test procedure 101-AW tank camera purge system and 101-AW video camera system

    SciTech Connect

    Castleberry, J.L.

    1994-09-19

    This procedure will document the satisfactory operation of the 101-AW Tank Camera Purge System (CPS) and the 101-AW Video Camera System. The safety interlock, which shuts down all the electronics inside the 101-AW vapor space on loss of purge pressure, will be in place and tested to ensure reliable performance. This procedure is separated into four sections. Section 6.1 is performed in the 306 building prior to delivery to the 200 East Tank Farms and involves leak-checking all fittings on the 101-AW Purge Panel using a Snoop solution and resolving any leakage found. Section 7.1 verifies that PR-1, the regulator which maintains a positive pressure within the volume (cameras and pneumatic lines), is properly set. In addition, the green light (PRESSURIZED), located on the Purge Control Panel, is verified to turn on above 10 in. w.g. and after the time delay relay (TDR) has timed out. Section 7.2 verifies that the purge cycle functions properly, that the red light (PURGE ON) comes on, and that the correct flowrate is obtained to meet the requirements of the National Fire Protection Association. Section 7.3 verifies that the pan and tilt, camera, and associated controls and components operate correctly. This section also verifies that the safety interlock system operates correctly during loss of purge pressure, during which the illumination of the amber light (PURGE FAILED) will be verified.

  17. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1991-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  18. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1993-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  19. Current Geoid Studies in Turkey and the need for Local High-Precision Astrogeodetic Geoid Determination Using CCD/Zenith Cameras

    NASA Astrophysics Data System (ADS)

    Halicioglu, K.; Ozener, H.; Deniz, R.

    2008-12-01

    During the last few years, the development of CCD image sensors at a reasonable price has made astrogeodetic observation instruments usable for local high-precision astrogeodetic geoid and gravity field determination. Generally, the geoids of most European countries reach centimeter-level accuracy except in mountainous regions. The Turkish geoid also has accuracy problems in mountainous regions, especially in the eastern parts of Anatolia and around the boundaries of the Marmara Sea. Studies performed in Europe in the last decade indicate that, to reach centimeter-level accuracy in mountainous areas, astrogeodetic vertical deflections are more effective than gravimetric and other geoid determination methods. Turkey started its geoid determination studies in 1976 with 13 absolute gravity points. The Turkish National Fundamental Gravity Network (TNFGRN) was densified with 66,245 first- and second-order gravity points in the Potsdam gravity datum. TG03 has a final internal precision of 1 cm at the observation points, and its external accuracy is at the decimeter level. High-precision astrogeodetic geoid determination techniques using CCD/zenith cameras have been published by only a few universities in Europe. Various zenith camera systems have been developed as state-of-the-art instruments, using CCD sensors for imaging stellar objects and GPS receivers for ellipsoidal coordinates, in order to determine the direction of the plumb line. These systems are designed and tested where conventional techniques are not sufficient. This study addresses increasing the accuracy of the Turkish geoid using CCD/zenith cameras in the province of Istanbul. The planned test area will use the data available from the GPS/leveling geoid of Istanbul and produce astrogeodetic data along a profile starting from the north shore of the Marmara region and passing through the Marmara Sea to the south.
The astrogeodetic instruments will be designed for engineering studies that are needed to determine

  20. A Refrigerated Web Camera for Photogrammetric Video Measurement inside Biomass Boilers and Combustion Analysis

    PubMed Central

    Porteiro, Jacobo; Riveiro, Belén; Granada, Enrique; Armesto, Julia; Eguía, Pablo; Collazo, Joaquín

    2011-01-01

    This paper describes a prototype instrumentation system for photogrammetric measuring of bed and ash layers, as well as for flying particle detection and pursuit using a single device (CCD) web camera. The system was designed to obtain images of the combustion process in the interior of a domestic boiler. It includes a cooling system, needed because of the high temperatures in the combustion chamber of the boiler. The cooling system was designed using CFD simulations to ensure effectiveness. This method allows more complete and real-time monitoring of the combustion process taking place inside a boiler. The information gained from this system may facilitate the optimisation of boiler processes. PMID:22319349

  1. Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source (HTPD 08 paper)

    SciTech Connect

    Haugh, M; Schneider, M B

    2008-04-28

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 μm square pixels, and 15 μm thick. A multi-anode Manson X-ray source, operating up to 10 kV and 10 W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 10. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within ±1% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.

  2. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras

    PubMed Central

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127
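The block-match step described above can be illustrated with an exhaustive-search sketch; this is not the authors' implementation, and the block size, search window, and SAD cost are assumptions for illustration:

```python
import numpy as np

def block_match(ref, cur, block=8, search=4):
    """Estimate one motion vector per block of `cur` by minimizing the
    sum of absolute differences (SAD) over a +/-`search` pixel window
    in `ref` (exhaustive search)."""
    h, w = ref.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            tgt = cur[by:by + block, bx:bx + block].astype(np.int32)
            best, best_mv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate window falls outside the frame
                    cand = ref[y:y + block, x:x + block].astype(np.int32)
                    sad = int(np.abs(tgt - cand).sum())
                    if best is None or sad < best:
                        best, best_mv = sad, (dy, dx)
            vectors[(by, bx)] = best_mv
    return vectors
```

The multi-frame variant in the paper would apply this matching against several reference frames rather than one; the speedup it reports comes from replacing a costlier motion search with this block match.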

  3. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras.

    PubMed

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127

  4. Hardware-based smart camera for recovering high dynamic range video from multiple exposures

    NASA Astrophysics Data System (ADS)

    Lapray, Pierre-Jean; Heyrman, Barthélémy; Ginhac, Dominique

    2014-10-01

    In many applications such as video surveillance or defect detection, the perception of information related to a scene is limited in areas with strong contrasts. The high dynamic range (HDR) capture technique can deal with these limitations. The proposed method has the advantage of automatically selecting multiple exposure times to make outputs more visible than fixed exposure ones. A real-time hardware implementation of the HDR technique that shows more details both in dark and bright areas of a scene is an important line of research. For this purpose, we built a dedicated smart camera that performs both capturing and HDR video processing from three exposures. What is new in our work is shown through the following points: HDR video capture through multiple exposure control, HDR memory management, HDR frame generation, and representation under a hardware context. Our camera achieves a real-time HDR video output at 60 fps at 1.3 megapixels and demonstrates the efficiency of our technique through an experimental result. Applications of this HDR smart camera include the movie industry, the mass-consumer market, military, automotive industry, and surveillance.
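The multiple-exposure HDR generation step can be sketched in software, assuming a linear sensor response and a hat-shaped weighting function; the camera above does this in hardware from three exposures, and the values here are illustrative:

```python
import numpy as np

def merge_hdr(frames, exposure_times):
    """Merge differently exposed 8-bit frames of the same scene into one
    radiance map, weighting mid-range pixels most (hat weighting), so
    saturated and underexposed pixels contribute least."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    wsum = np.zeros_like(acc)
    for img, t in zip(frames, exposure_times):
        z = img.astype(np.float64) / 255.0      # normalized pixel value
        w = 1.0 - np.abs(2.0 * z - 1.0)         # 0 at extremes, 1 at mid-gray
        acc += w * (z / t)                      # per-frame radiance estimate
        wsum += w
    return acc / np.maximum(wsum, 1e-9)
```

A tone-mapping step would normally follow to display the radiance map on a standard monitor.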

  5. Surgical video recording with a modified GoPro Hero 4 camera

    PubMed Central

    Lin, Lily Koo

    2016-01-01

    Background Surgical videography can provide analytical self-examination for the surgeon, teaching opportunities for trainees, and allow for surgical case presentations. This study examined if a modified GoPro Hero 4 camera with a 25 mm lens could prove to be a cost-effective method of surgical videography with enough detail for oculoplastic and strabismus surgery. Method The stock lens mount and lens were removed from a GoPro Hero 4 camera, and was refitted with a Peau Productions SuperMount and 25 mm lens. The modified GoPro Hero 4 camera was then fixed to an overhead surgical light. Results Camera settings were set to 1080p video resolution. The 25 mm lens allowed for nine times the magnification as the GoPro stock lens. There was no noticeable video distortion. The entire cost was less than 600 USD. Conclusion The adapted GoPro Hero 4 with a 25 mm lens allows for high-definition, cost-effective, portable video capture of oculoplastic and strabismus surgery. The 25 mm lens allows for detailed videography that can enhance surgical teaching and self-examination. PMID:26834455

  6. Video and acoustic camera techniques for studying fish under ice: a review and comparison

    SciTech Connect

    Mueller, Robert P.; Brown, Richard S.; Hop, Haakon H.; Moulton, Larry

    2006-09-05

    Researchers attempting to study the presence, abundance, size, and behavior of fish species in northern and arctic climates during winter face many challenges, including the presence of thick ice cover, snow cover, and, sometimes, extremely low temperatures. This paper describes and compares the use of video and acoustic cameras for determining fish presence and behavior in lakes, rivers, and streams with ice cover. Methods are provided for determining fish density and size, identifying species, and measuring swimming speed, and successful applications from previous surveys of fish under ice are described. These methods include drilling ice holes, selecting batteries and generators, deploying pan-and-tilt cameras, and using paired colored lasers to determine fish size and habitat associations. We also discuss the use of infrared and white light to enhance image-capturing capabilities, deployment of digital recording systems and time-lapse techniques, and the use of imaging software. Data are presented from initial surveys with video and acoustic cameras in the Sagavanirktok River Delta, Alaska, during late winter 2004. These surveys represent the first known successful application of a dual-frequency identification sonar (DIDSON) acoustic camera under ice, achieving fish detection and sizing at camera ranges up to 16 m. Feasibility tests of video and acoustic cameras for determining fish size and density at various turbidity levels are also presented. Comparisons are made of the different techniques in terms of their suitability for achieving various fisheries research objectives. This information is intended to assist researchers in choosing the equipment that best meets their study needs.

  7. A Novel Method to Reduce Time Investment When Processing Videos from Camera Trap Studies

    PubMed Central

    Swinnen, Kristijn R. R.; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

    Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record the presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species, but instead were empty or contained other species (together, non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch them, in order to reduce the workload. Discrimination between recordings of the target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we assumed that recordings with the target species contain, on average, much more movement than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values, and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step to the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs.
PMID:24918777
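The variation-based filter idea, scoring a recording by its frame-to-frame pixel changes and discarding low-scoring recordings, can be sketched as follows; the score and threshold are hypothetical, not the authors' calibrated values:

```python
import numpy as np

def motion_score(frames):
    """Mean absolute pixel change between consecutive grayscale frames.
    Recordings of a large target species should score much higher than
    empty or small-animal (non-target) recordings."""
    diffs = [np.abs(a.astype(np.int16) - b.astype(np.int16)).mean()
             for a, b in zip(frames[:-1], frames[1:])]
    return float(np.mean(diffs))

def keep_recording(frames, threshold=2.0):
    """Flag a recording for manual review only if it shows enough motion."""
    return motion_score(frames) >= threshold
```

Tuning the threshold trades lost target recordings against discarded non-target ones, mirroring the 5-20% loss versus 53-76% savings reported above.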

  8. A passive terahertz video camera based on lumped element kinetic inductance detectors

    NASA Astrophysics Data System (ADS)

    Rowe, Sam; Pascale, Enzo; Doyle, Simon; Dunscombe, Chris; Hargrave, Peter; Papageorgio, Andreas; Wood, Ken; Ade, Peter A. R.; Barry, Peter; Bideaud, Aurélien; Brien, Tom; Dodd, Chris; Grainger, William; House, Julian; Mauskopf, Philip; Moseley, Paul; Spencer, Locke; Sudiwala, Rashmi; Tucker, Carole; Walker, Ian

    2016-03-01

    We have developed a passive 350 GHz (850 μm) video-camera to demonstrate lumped element kinetic inductance detectors (LEKIDs)—designed originally for far-infrared astronomy—as an option for general purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ˜0.1 K, which is close to the background limit. The 152 element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low noise amplifier and room temperature frequency domain multiplexing electronics.

  9. A digital underwater video camera system for aquatic research in regulated rivers

    USGS Publications Warehouse

    Martin, Benjamin M.; Irwin, Elise R.

    2010-01-01

    We designed a digital underwater video camera system to monitor nesting centrarchid behavior in the Tallapoosa River, Alabama, 20 km below a peaking hydropower dam with a highly variable flow regime. Major components of the system included a digital video recorder, multiple underwater cameras, and specially fabricated substrate stakes. The innovative design of the substrate stakes allowed us to effectively observe nesting redbreast sunfish Lepomis auritus in a highly regulated river. Substrate stakes, which were constructed for the specific substratum complex (i.e., sand, gravel, and cobble) identified at our study site, were able to withstand a discharge level of approximately 300 m³/s and allowed us to simultaneously record 10 active nests before and during water releases from the dam. We believe our technique will be valuable for other researchers who work in regulated rivers to quantify the behavior of aquatic fauna in response to a discharge disturbance.

  10. A passive terahertz video camera based on lumped element kinetic inductance detectors.

    PubMed

    Rowe, Sam; Pascale, Enzo; Doyle, Simon; Dunscombe, Chris; Hargrave, Peter; Papageorgio, Andreas; Wood, Ken; Ade, Peter A R; Barry, Peter; Bideaud, Aurélien; Brien, Tom; Dodd, Chris; Grainger, William; House, Julian; Mauskopf, Philip; Moseley, Paul; Spencer, Locke; Sudiwala, Rashmi; Tucker, Carole; Walker, Ian

    2016-03-01

    We have developed a passive 350 GHz (850 μm) video-camera to demonstrate lumped element kinetic inductance detectors (LEKIDs)--designed originally for far-infrared astronomy--as an option for general purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ∼0.1 K, which is close to the background limit. The 152 element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low noise amplifier and room temperature frequency domain multiplexing electronics. PMID:27036756

  11. Development of a compact fast CCD camera and resonant soft x-ray scattering endstation for time-resolved pump-probe experiments.

    PubMed

    Doering, D; Chuang, Y-D; Andresen, N; Chow, K; Contarato, D; Cummings, C; Domning, E; Joseph, J; Pepper, J S; Smith, B; Zizka, G; Ford, C; Lee, W S; Weaver, M; Patthey, L; Weizeorick, J; Hussain, Z; Denes, P

    2011-07-01

    The designs of a compact, fast CCD (cFCCD) camera, together with a resonant soft x-ray scattering endstation, are presented. The cFCCD camera consists of a highly parallel, custom, thick, high-resistivity CCD, readout by a custom 16-channel application specific integrated circuit to reach the maximum readout rate of 200 frames per second. The camera is mounted on a virtual-axis flip stage inside the RSXS chamber. When this flip stage is coupled to a differentially pumped rotary seal, the detector assembly can rotate about 100°/360° in the vertical/horizontal scattering planes. With a six-degrees-of-freedom cryogenic sample goniometer, this endstation has the capability to detect the superlattice reflections from the electronic orderings showing up in the lower hemisphere. The complete system has been tested at the Advanced Light Source, Lawrence Berkeley National Laboratory, and has been used in multiple experiments at the Linac Coherent Light Source, SLAC National Accelerator Laboratory. PMID:21806178

  12. Operation and maintenance manual for the high resolution stereoscopic video camera system (HRSVS) system 6230

    SciTech Connect

    Pardini, A.F., Westinghouse Hanford

    1996-07-16

    The High Resolution Stereoscopic Video Camera System (HRSVS), system 6230, is a stereoscopic camera system that will be used as an end effector on the LDUA to perform surveillance and inspection activities within Hanford waste tanks. It is attached to the LDUA by means of a Tool Interface Plate (TIP), which provides a feedthrough for all electrical and pneumatic utilities needed by the end effector to operate.

  13. Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge coupled device. The camera consists of a X-ray sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.

  14. Improvement of Measurement Accuracy of Strain of Thin Film by CCD Camera with a Template Matching Method Using the 2nd-Order Polynomial Interpolation

    NASA Astrophysics Data System (ADS)

    Park, Jun-Hyub; Shin, Myung-Soo; Kang, Dong-Joong; Lim, Sung-Jo; Ha, Jong-Eun

    In this study, a system for non-contact in-situ measurement of strain during tensile testing of thin films, using a CCD camera as the sensing device with the specimen surface marked by black pen, was implemented. To improve measurement accuracy when a CCD camera is used, this paper proposes a new method for measuring strain during tensile testing of micrometer-sized specimens. The pixel size of the CCD camera determines the measurement resolution, but pixel size alone cannot provide the resolution required in a tensile test of a thin film, because the extension of the specimen during the test is very small. To increase resolution, the suggested method performs accurate subpixel matching by applying a 2nd-order polynomial interpolation to conventional template matching. An algorithm was developed to calculate the subpixel location giving the best matching value by performing one-dimensional polynomial interpolation on the results of pixel-based matching in a local region of the image. The measurement resolution was less than 0.01 times the original pixel size. To verify the reliability of the system, a tensile test was performed on BeNi thin film, which is widely used as a material in micro-probe tips. Tensile tests were performed and strains were measured using both the proposed method and, for comparison, a capacitance-type displacement sensor. It is demonstrated that the new strain measurement system can effectively describe the behavior of materials after yield during tensile testing of microscale specimens, with easy setup and better accuracy.
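The 2nd-order polynomial (parabolic) refinement can be sketched from three matching scores around the best integer pixel; the fractional peak offset follows from setting the fitted parabola's derivative to zero. This is a generic sketch of the technique, not the authors' code:

```python
def subpixel_peak(c_m1, c_0, c_p1):
    """Refine an integer-pixel best-match location using a 2nd-order
    polynomial through the matching scores at offsets -1, 0, +1.
    Returns the fractional offset of the parabola's extremum,
    in the range (-0.5, +0.5) when c_0 is the best score."""
    denom = c_m1 - 2.0 * c_0 + c_p1
    if denom == 0.0:
        return 0.0  # degenerate (flat) fit: keep the integer location
    return 0.5 * (c_m1 - c_p1) / denom

# e.g. correlation scores 0.90, 1.00, 0.95 around the best pixel:
dx = subpixel_peak(0.90, 1.00, 0.95)  # peak shifted toward the 0.95 side
```

For a 2-D template match, the same one-dimensional refinement is applied independently along each image axis.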

  15. Evaluation of lens distortion errors using an underwater camera system for video-based motion analysis

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.

    1994-01-01

    Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. This data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero-gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
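The error metric described, the distance from each known grid coordinate to its digitized coordinate, can be sketched as follows; normalizing by the field-of-view size to get a percentage is an assumption for illustration:

```python
import math

def distortion_errors(known_pts, measured_pts, field_size):
    """Per-point error = Euclidean distance between the known grid
    coordinate and the digitized coordinate, reported as a percentage
    of the field of view. Returns (max_error_pct, mean_error_pct)."""
    errs = [math.dist(k, m) / field_size * 100.0
            for k, m in zip(known_pts, measured_pts)]
    return max(errs), sum(errs) / len(errs)

# Hypothetical grid: one point digitized 0.8 units off in a 10-unit field
worst, mean = distortion_errors([(0, 0), (10, 0)],
                                [(0, 0), (10.8, 0)], field_size=10.0)
```

Points near the lens periphery would dominate the maximum, consistent with the advice above to avoid the outermost regions of a wide-angle lens.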

  16. Precise color images: a high-speed color video camera system with three intensified sensors

    NASA Astrophysics Data System (ADS)

    Oki, Sachio; Yamakawa, Masafumi; Gohda, Susumu; Etoh, Takeharu G.

    1999-06-01

    High speed imaging systems have been used in many fields of science and engineering. Although high speed camera systems have been improved to high performance, most of their applications only obtain high speed motion pictures. However, in some fields of science and technology, it is useful to obtain other information as well, such as the temperature of combustion flames, thermal plasmas and molten materials. Recent digital high speed video imaging technology should be able to extract such information from those objects. For this purpose, we have already developed a high speed video camera system with three intensified sensors and a cubic prism image splitter. The maximum frame rate is 40,500 pps (pictures per second) at 64 × 64 pixels and 4,500 pps at 256 × 256 pixels with 256 (8 bit) intensity resolution for each pixel. The camera system can store more than 1,000 pictures continuously in solid state memory. In order to obtain precise color images from this camera system, we need to develop a digital technique, consisting of a computer program and ancillary instruments, to adjust the displacement of images taken from two or three image sensors and to calibrate the relationship between incident light intensity and the corresponding digital output signals. In this paper, the digital technique for pixel-based displacement adjustment is proposed. Although the displacement of the corresponding circle was more than 8 pixels in the original image, the displacement was adjusted to within 0.2 pixels at most by this method.

  17. Structural analysis of color video camera installation on tank 241AW101 (2 Volumes)

    SciTech Connect

    Strehlow, J.P.

    1994-08-24

    A video camera is planned to be installed on the radioactive storage tank 241AW101 at the DOE's Hanford Site in Richland, Washington. The camera will occupy the 20 inch port of the Multiport Flange riser which is to be installed on riser 5B of the 241AW101 (3,5,10). The objective of the project reported herein was to perform a seismic analysis and evaluation of the structural components of the camera for a postulated Design Basis Earthquake (DBE) per the referenced Structural Design Specification (SDS) document (6). The details of the supporting engineering calculations are documented in URS/Blume Calculation No. 66481-01-CA-03 (1).

  18. Object Occlusion Detection Using Automatic Camera Calibration for a Wide-Area Video Surveillance System

    PubMed Central

    Jung, Jaehoon; Yoon, Inhye; Paik, Joonki

    2016-01-01

    This paper presents an object occlusion detection algorithm using object depth information that is estimated by automatic camera calibration. The object occlusion problem is a major factor to degrade the performance of object tracking and recognition. To detect an object occlusion, the proposed algorithm consists of three steps: (i) automatic camera calibration using both moving objects and a background structure; (ii) object depth estimation; and (iii) detection of occluded regions. The proposed algorithm estimates the depth of the object without extra sensors but with a generic red, green and blue (RGB) camera. As a result, the proposed algorithm can be applied to improve the performance of object tracking and object recognition algorithms for video surveillance systems. PMID:27347978

  19. Low-complexity camera digital signal imaging for video document projection system

    NASA Astrophysics Data System (ADS)

    Hsia, Shih-Chang; Tsai, Po-Shien

    2011-04-01

    We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.

  20. A Large-panel Two-CCD Camera Coordinate System with an Alternate-Eight-Matrix Look-Up-Table Method

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Lu, An-Tsung; Hsu, Yuen-Chang; Tien, Chuen-Lin; Chen, Der-Chin; Chang, Nin-Chun

    2012-03-01

    This study proposed a novel positioning model composed of a two-camera calibration system and an Alternate-Eight-Matrix (AEM) Look-Up-Table (LUT). Two video cameras were fixed on two sides of a large screen to solve the field-of-view problem. The first to fourth LUTs were used to compute the corresponding positions of specified regions on the screen captured by the camera on the right side. In these four LUTs, the coordinate mapping data of the target were stored in two matrices, while the gray-level threshold values of different positions were stored in the other matrices. Similarly, the fifth to eighth LUTs were used to compute the corresponding positions of the specified regions on the screen captured by the camera on the left side. Experimental results showed that the proposed model can solve the problems of dead zones and non-uniform light fields, while achieving rapid and precise positioning results.
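The LUT idea, precomputed matrices mapping each camera pixel to a screen coordinate plus per-position gray-level thresholds, can be sketched as follows; `map_fn`, the array sizes, and the threshold values are hypothetical:

```python
import numpy as np

def build_luts(h, w, map_fn):
    """Precompute coordinate LUTs for one camera: lut_x/lut_y give, for
    each camera pixel (row, col), the corresponding screen coordinate
    according to a calibration mapping `map_fn(row, col) -> (y, x)`."""
    lut_x = np.empty((h, w), dtype=np.float32)
    lut_y = np.empty((h, w), dtype=np.float32)
    for r in range(h):
        for c in range(w):
            lut_y[r, c], lut_x[r, c] = map_fn(r, c)
    return lut_x, lut_y

def locate(pixel_rc, lut_x, lut_y, thresholds, intensity):
    """Look up the screen position of a detected pixel, accepting it
    only if its intensity exceeds the local gray-level threshold
    (which handles the non-uniform light field)."""
    r, c = pixel_rc
    if intensity < thresholds[r, c]:
        return None  # below local threshold: likely noise, not the target
    return float(lut_x[r, c]), float(lut_y[r, c])
```

With one LUT set per camera, positions falling in one camera's dead zone can be resolved by the other camera's tables.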

  1. Video camera observation for assessing overland flow patterns during rainfall events

    NASA Astrophysics Data System (ADS)

    Silasari, Rasmiaditya; Oismüller, Markus; Blöschl, Günter

    2015-04-01

    Physically based hydrological models have been widely used in various studies to model overland flow propagation in cases such as flood inundation and dam-break flow. The capability of such models to simulate the formation of overland flow by spatial and temporal discretization of the empirical equations makes it possible for hydrologists to trace overland flow generation both spatially and temporally across surface and subsurface domains. As the upscaling methods transforming hydrological process spatial patterns from the small observed scale to the larger catchment scale are still being progressively developed, physically based hydrological models have become a convenient tool to assess the patterns and their behaviors crucial in determining the upscaling process. Related studies in the past have successfully used these models as well as field observation data for model verification. The common observation data used for this verification are overland flow discharge during natural rainfall events and camera observations during synthetic events (staged field experiments), while the use of camera observations during natural events is hardly discussed in publications. This study advances the exploration of video camera observations of overland flow generation during natural rainfall events to support the verification of physically based hydrological models and the assessment of overland flow spatial patterns. The study is conducted within a 64 ha catchment located at Petzenkirchen, Lower Austria, known as HOAL (Hydrological Open Air Laboratory). The catchment land cover is dominated by arable land (87%) with small portions (13%) of forest, pasture and paved surfaces. A 600 m stream runs in the southeast of the catchment, flowing southward, and is equipped with flumes and pressure transducers measuring water level on a minute-by-minute basis from various inlets along the stream (i.e., drainages, surface runoff, springs) to be converted into flow discharge. A

  2. A New Remote Sensing Filter Radiometer Employing a Fabry-Perot Etalon and a CCD Camera for Column Measurements of Methane in the Earth Atmosphere

    NASA Technical Reports Server (NTRS)

    Georgieva, E. M.; Huang, W.; Heaps, W. S.

    2012-01-01

    A portable remote sensing system for precision column measurements of methane has been developed, built and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera, and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions, which incorporates an uncooled InGaAs detector array sensitive up to 1.7 micrometers. Custom software was developed, in addition to the basic graphical user interface (X-Control) provided by the company, to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes to the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.

  3. Compact full-motion video hyperspectral cameras: development, image processing, and applications

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.

    2015-10-01

    Emergence of spectral pixel-level color filters has enabled development of hyperspectral Full Motion Video (FMV) sensors operating in visible (EO) and infrared (IR) wavelengths. This new class of hyperspectral cameras opens broad possibilities for military and industrial use. Indeed, such cameras can classify materials as well as detect and track spectral signatures continuously in real time, while simultaneously providing the operator the benefit of enhanced-discrimination color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation, which provides essential spectral content analysis, e.g. detection or classification. The second is presentation of the video to an operator, offering the best display of the content for the task at hand, e.g. spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel or can utilize each other's results. Spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally-sampled spectral bands has scarcely been explored. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several concepts of operation, including detection and tracking. We also compare the demosaicking results to those of multi-frame super-resolution as well as to combined multi-frame and multi-band processing.

  4. An explanation for camera perspective bias in voluntariness judgment for video-recorded confession: Suggestion of cognitive frame.

    PubMed

    Park, Kwangbai; Pyo, Jimin

    2012-06-01

    Three experiments were conducted to test the hypothesis that difference in voluntariness judgment for a custodial confession filmed in different camera focuses ("camera perspective bias") could occur because a particular camera focus conveys a suggestion of a particular cognitive frame. In Experiment 1, 146 juror eligible adults in Korea showed a camera perspective bias in voluntariness judgment with a simulated confession filmed with two cameras of different focuses, one on the suspect and the other on the detective. In Experiment 2, the same bias in voluntariness judgment emerged without cameras when the participants were cognitively framed, prior to listening to the audio track of the videos used in Experiment 1, by instructions to make either a voluntariness judgment for a confession or a coerciveness judgment for an interrogation. In Experiment 3, the camera perspective bias in voluntariness judgment disappeared when the participants viewing the video focused on the suspect were initially framed to make coerciveness judgment for the interrogation and the participants viewing the video focused on the detective were initially framed to make voluntariness judgment for the confession. The results in combination indicated that a particular camera focus may convey a suggestion of a particular cognitive frame in which a video-recorded confession/interrogation is initially represented. Some forensic and policy implications were discussed. PMID:22667808

  5. Video Capture of Perforator Flap Harvesting Procedure with a Full High-definition Wearable Camera.

    PubMed

    Miyamoto, Shimpei

    2016-06-01

    Recent advances in wearable recording technology have enabled high-quality video recording of several surgical procedures from the surgeon's perspective. However, the available wearable cameras are not optimal for recording the harvesting of perforator flaps because they are too heavy and cannot be attached to the surgical loupe. The Ecous is a small high-resolution camera that was specially developed for recording loupe magnification surgery. This study investigated the use of the Ecous for recording perforator flap harvesting procedures. The Ecous SC MiCron is a high-resolution camera that can be mounted directly on the surgical loupe. The camera is light (30 g) and measures only 28 × 32 × 60 mm. We recorded 23 perforator flap harvesting procedures with the Ecous connected to a laptop through a USB cable. The elevated flaps included 9 deep inferior epigastric artery perforator flaps, 7 thoracodorsal artery perforator flaps, 4 anterolateral thigh flaps, and 3 superficial inferior epigastric artery flaps. All procedures were recorded with no equipment failure. The Ecous recorded the technical details of the perforator dissection at a high-resolution level. The surgeon did not feel any extra stress or interference when wearing the Ecous. The Ecous is an ideal camera for recording perforator flap harvesting procedures. It fits onto the surgical loupe perfectly without creating additional stress on the surgeon. High-quality video from the surgeon's perspective makes accurate documentation of the procedures possible, thereby enhancing surgical education and allowing critical self-reflection. PMID:27482504

  7. First results from newly developed automatic video system MAIA and comparison with older analogue cameras

    NASA Astrophysics Data System (ADS)

    Koten, P.; Páta, P.; Fliegel, K.; Vítek, S.

    2013-09-01

    A new automatic video system for meteor observations, MAIA, was developed in recent years [1]. The goal is to replace the older analogue cameras and provide a platform for continuous year-round observations from two different stations. Here we present the first results obtained during the testing phase as well as the first double-station observations. A comparison with the older analogue cameras is provided too. MAIA (Meteor Automatic Imager and Analyzer) is based on the digital monochrome camera JAI CM-040 and the well-proven image intensifier XX1332 (Figure 1). The camera provides a spatial resolution of 776 x 582 pixels. The maximum frame rate is 61.15 frames per second. A fast Pentax SMC FA 1.4/50 mm lens is used as the input element of the optical system. The resulting field of view is about 50º in diameter. The new system was first used in a semiautomatic regime for the observation of the Draconid outburst on 8 October 2011. Both cameras recorded more than 160 meteors. Additional hardware and software were developed in 2012 to enable automatic observation and basic processing of the data. The system usually records video sequences for the whole night. During the daytime it searches the records for moving objects, saves them as short sequences, and clears the hard drives to allow further observations. Initial laboratory measurements [2] and simultaneous observations with the older system show significant improvement of the obtained data. Table 1 compares the basic parameters of both systems. In this paper we will present a comparison of the double-station data obtained using both systems.

  8. Flat Field Anomalies in an X-Ray CCD Camera Measured Using a Manson X-Ray Source

    SciTech Connect

    Michael Haugh

    2008-03-01

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. It determines how accurately NIF can point the laser beams and is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray-sensitive silicon sensor with a large-format array (2k x 2k), 24 μm square pixels, and a 15 μm thickness. A multi-anode Manson X-ray source, operating up to 10 kV and 2 mA, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 12. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within ±1.5% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. The efficiency pattern follows the properties of Si. The maximum quantum efficiency is 0.71. We observed an energy-dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation was >8% at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was less than the measurement uncertainty below 4 keV. We were also able to observe debris on the CCD chip. The debris showed maximum contrast at the lowest energy used, 930 eV, and disappeared by 4 keV. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.

  9. Real Time Speed Estimation of Moving Vehicles from Side View Images from an Uncalibrated Video Camera

    PubMed Central

    Doğan, Sedat; Temiz, Mahir Serhan; Külür, Sıtkı

    2010-01-01

    In order to estimate the speed of a moving vehicle from side-view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found using frame images. This procedure involves two main steps. In the first step, a sufficient number of points on the vehicle are selected, and these points must be accurately tracked over at least two successive video frames. In the second step, the velocity vectors of those points are computed from the displacement vectors of the tracked points and the elapsed time. The computed velocity vectors are defined in the video image coordinate system, and the displacement vectors are measured in pixel units. The magnitudes of the computed vectors in image space must then be transformed to object space to find their absolute values. This transformation requires a mathematical relationship between image and object space, which is established by means of the calibration and orientation parameters of the video frame images. This paper presents solutions to these problems of using side-view camera images. PMID:22399909
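    The pixel-displacement-to-speed computation described above can be sketched as follows. This is a minimal illustration: the point coordinates, frame interval, and metres-per-pixel scale below are made up, and the paper's actual image-to-object transformation uses full calibration and orientation parameters rather than a single scale factor.

```python
import math

def estimate_speed(pts_t0, pts_t1, dt, m_per_px):
    """Estimate vehicle speed from points tracked across two frames.

    pts_t0, pts_t1 : lists of (x, y) pixel coordinates of the same
                     reference points in two successive frames
    dt             : elapsed time between the frames, in seconds
    m_per_px       : image-to-object scale (assumed known from calibration)
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(pts_t0, pts_t1):
        disp_px = math.hypot(x1 - x0, y1 - y0)   # displacement in pixels
        speeds.append(disp_px * m_per_px / dt)   # metres per second
    return sum(speeds) / len(speeds)             # average over all points

# Example: two points each moving 30 px between frames at 25 fps, 0.02 m/px
v = estimate_speed([(100, 200), (150, 210)], [(130, 200), (180, 210)],
                   dt=1 / 25, m_per_px=0.02)
print(round(v * 3.6, 1))  # 54.0 km/h
```

    Averaging over several tracked points, as the abstract requires, damps individual tracking errors.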

  10. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

    Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drift, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic for checking for and accurately classifying these individual problems. Some of these issues can occur in tandem or can be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus in wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are applied is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
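    As one example of the kind of no-reference check mentioned, focus drift is commonly detected via the variance of a frame's Laplacian response, which collapses for defocused images. This sketch is generic, not the paper's algorithm, and the threshold value is illustrative; in practice it would be tuned per camera fleet.

```python
import numpy as np

def focus_measure(gray):
    """Variance of the 3x3 Laplacian response; low values suggest defocus."""
    g = np.asarray(gray, dtype=float)
    # Laplacian at each interior pixel: sum of 4 neighbours minus 4x centre
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return lap.var()

def is_defocused(gray, threshold=50.0):
    # threshold is illustrative and would be calibrated against known-good frames
    return focus_measure(gray) < threshold

# A flat (featureless) frame scores 0; a sharp checkerboard scores high
flat = np.full((8, 8), 128)
checker = np.indices((8, 8)).sum(axis=0) % 2 * 255
print(focus_measure(flat), focus_measure(checker) > 1000)  # 0.0 True
```

    A fleet-wide diagnostic would run such a metric per frame and flag cameras whose scores drift below their historical baseline.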

  11. Falling-incident detection and throughput enhancement in a multi-camera video-surveillance system.

    PubMed

    Shieh, Wann-Yun; Huang, Ju-Chin

    2012-09-01

    For most elderly people, unpredictable falling incidents may occur at the corner of stairs or in a long corridor due to body frailty. If the rescue of a fallen elder, who may be fainting, is delayed, more serious injury may occur. Traditional security or video surveillance systems require caregivers to monitor a centralized screen continuously, or require the elder to wear sensors to detect falls, which wastes much human power or causes inconvenience for elders. In this paper, we propose an automatic falling-detection algorithm and implement it in a multi-camera video surveillance system. The algorithm uses each camera to fetch images from the regions to be monitored. It then uses a falling-pattern recognition algorithm to determine whether a falling incident has occurred. If so, the system sends short messages to the designated contacts. The algorithm has been implemented on a DSP-based hardware acceleration board as a proof of functionality. Simulation results show that the accuracy of falling detection reaches at least 90% and the throughput of a four-camera surveillance system can be improved by about 2.1 times. PMID:22154761
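    The abstract does not spell out its falling-pattern recognition algorithm. One common heuristic in this class of system flags a sustained change in the tracked person's bounding-box aspect ratio (a standing person is tall, a fallen one wide); the sketch below uses that heuristic with illustrative thresholds, and is not necessarily the authors' method.

```python
def looks_like_fall(bbox_history, ratio_thresh=1.2, frames=5):
    """Flag a fall when the tracked person's bounding box stays wider
    than tall (width/height above ratio_thresh) for several consecutive
    frames, suppressing momentary false triggers such as bending over.

    bbox_history : list of (width, height) box sizes, most recent last
    """
    recent = bbox_history[-frames:]
    return (len(recent) == frames
            and all(w / h > ratio_thresh for w, h in recent))

# Tall (standing) boxes followed by a sustained horizontal posture
history = [(40, 120)] * 10 + [(130, 50)] * 5
print(looks_like_fall(history))  # True
```

    Requiring the ratio to persist over several frames trades a small detection delay for fewer false alarms.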

  12. A compact high-definition low-cost digital stereoscopic video camera for rapid robotic surgery development.

    PubMed

    Carlson, Jay; Kowalczuk, Jędrzej; Psota, Eric; Pérez, Lance C

    2012-01-01

    Robotic surgical platforms require vision feedback systems, which often consist of low-resolution, expensive, single-imager analog cameras. These systems are retooled for 3D display by simply doubling the cameras and outboard control units. Here, a fully-integrated digital stereoscopic video camera employing high-definition sensors and a class-compliant USB video interface is presented. This system can be used with low-cost PC hardware and consumer-level 3D displays for tele-medical surgical applications including military medical support, disaster relief, and space exploration. PMID:22356964

  13. A Semantic Autonomous Video Surveillance System for Dense Camera Networks in Smart Cities

    PubMed Central

    Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio

    2012-01-01

    This paper presents a proposal for an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed in the system, and therefore making it suitable for use as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language that is easy for human operators to understand, is capable of raising enriched alarms with descriptions of what is happening in the image, and can automate reactions to them, such as alerting the appropriate emergency services through the Smart City safety network. PMID:23112607

  14. VideoWeb Dataset for Multi-camera Activities and Non-verbal Communication

    NASA Astrophysics Data System (ADS)

    Denina, Giovanni; Bhanu, Bir; Nguyen, Hoang Thanh; Ding, Chong; Kamal, Ahmed; Ravishankar, Chinya; Roy-Chowdhury, Amit; Ivers, Allen; Varda, Brenda

    Human-activity recognition is one of the most challenging problems in computer vision. Researchers from around the world have tried to solve this problem and have come a long way in recognizing simple motions and atomic activities. As the computer vision community heads toward fully recognizing human activities, a challenging and labeled dataset is needed. To respond to that need, we collected a dataset of realistic scenarios in a multi-camera network environment (VideoWeb) involving multiple persons performing dozens of different repetitive and non-repetitive activities. This chapter describes the details of the dataset. We believe that this VideoWeb Activities dataset is unique and it is one of the most challenging datasets available today. The dataset is publicly available online at http://vwdata.ee.ucr.edu/ along with the data annotation.

  16. System design description for the LDUA high resolution stereoscopic video camera system (HRSVS)

    SciTech Connect

    Pardini, A.F.

    1998-01-27

    The High Resolution Stereoscopic Video Camera System (HRSVS), system 6230, was designed to be used as an end effector on the LDUA to perform surveillance and inspection activities within a waste tank. It is attached to the LDUA by means of a Tool Interface Plate (TIP), which provides a feed-through for all electrical and pneumatic utilities needed by the end effector to operate. Designed to perform up-close weld and corrosion inspection in UST operations, the HRSVS will support and supplement the Light Duty Utility Arm (LDUA) and provide the crucial inspection tasks needed to ascertain waste tank condition.

  17. MOEMS-based time-of-flight camera for 3D video capturing

    NASA Astrophysics Data System (ADS)

    You, Jang-Woo; Park, Yong-Hwa; Cho, Yong-Chul; Park, Chang-Young; Yoon, Heesun; Lee, Sang-Hun; Lee, Seung-Wan

    2013-03-01

    We suggest a Time-of-Flight (TOF) video camera capturing real-time depth images (a.k.a. depth maps), generated from fast-modulated IR images using a novel MOEMS modulator with a switching speed of 20 MHz. In general, 3 or 4 independent IR (e.g. 850 nm) images are required to generate a single frame of the depth image. Captured video of a moving object frequently shows motion drag between sequentially captured IR images, which results in the so-called 'motion blur' problem even when the frame rate of the depth image is fast (e.g. 30 to 60 Hz). We propose a novel 'single shot' TOF 3D camera architecture generating a single depth image out of synchronously captured IR images. The imaging system consists of a 2x2 imaging lens array, MOEMS optical shutters (modulators) placed on each lens aperture, and a standard CMOS image sensor. The IR light reflected from the object is modulated by the optical shutters on the apertures of the 2x2 lens array, and the transmitted images are captured on the image sensor, resulting in 2x2 sub-IR images. As a result, the depth image is generated from the four simultaneously captured independent sub-IR images, so the motion blur problem is canceled. The resulting performance is very useful in applications of the 3D camera to human-machine interaction devices, such as user interfaces for TVs, monitors, or handheld devices, and motion capture of the human body. In addition, we show that the presented 3D camera can be modified to capture color together with the depth image simultaneously at the 'single shot' frame rate.
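    For context, a conventional TOF sensor recovers depth from four IR samples taken at 0/90/180/270-degree modulation phase offsets. The sketch below is the standard four-bucket demodulation at the paper's 20 MHz modulation frequency, not the authors' exact pipeline (their single-shot design captures the four samples simultaneously through the 2x2 lens array, and sign conventions vary between sensors).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a90, a180, a270, f_mod=20e6):
    """Depth from four IR samples at 0/90/180/270-degree modulation
    phase offsets (standard four-bucket TOF demodulation)."""
    phase = math.atan2(a90 - a270, a0 - a180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)

# Synthetic pixel: background 100, modulation amplitude 50, range 1.5 m
phi = 4 * math.pi * 20e6 * 1.5 / C          # true phase delay
a = [100 + 50 * math.cos(phi - k * math.pi / 2) for k in range(4)]
print(round(tof_depth(*a), 3))  # 1.5
```

    At 20 MHz the unambiguous range is C / (2 * f_mod) = 7.5 m, which suits indoor human-machine interaction scenarios like the ones the abstract targets.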

  18. CCD camera systems and support electronics for a White Light Coronagraph and X-ray XUV solar telescope

    NASA Technical Reports Server (NTRS)

    Harrison, D. C.; Kubierschky, K.; Staples, M. H.; Carpenter, C. H.

    1980-01-01

    Two instruments, a White Light Coronagraph and an X-ray XUV telescope built into the same housing, share several electronic functions. Each instrument uses a CCD as an imaging detector, but due to different spectral requirements, each uses a different type. Hardware reduction, required by the stringent weight and volume allocations of the interplanetary mission, is made possible by the use of a microprocessor. Most instrument functions are software controlled with the end use circuits treated as peripherals to the microprocessor. The instruments are being developed for the International Solar Polar Mission.

  19. Optical design of high resolution and large format CCD airborne remote sensing camera on unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Qian, Yixian; Cheng, Xiaowei; Shao, Jie

    2010-11-01

    Unmanned aerial vehicle remote sensing (UAVRS) is low in cost, flexible in task arrangement, and highly automated in application; it has been widely used for mapping, surveillance, reconnaissance and city planning. Airborne remote sensing missions require sensors with both high resolution and large fields of view, and large-format CCD digital airborne imaging systems are now a reality. A refractive system was designed to meet these requirements with the help of Code V software. It has a focal length of 150 mm, an F-number of 5.6, a waveband of 0.45-0.7 μm, and a field of view of 20°. It is shown that the modulation transfer function is higher than 0.5 at 55 lp/mm, distortion is less than 0.1%, and image quality reaches the diffraction limit. The system, with its large-format CCD and wide field, can satisfy the demands of wide ground coverage and high resolution. The optical system, with its simple structure, small size and light weight, is well suited to airborne remote sensing.
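    To illustrate the resolution such a design delivers, the ground sample distance (GSD) for a nadir view follows directly from the 150 mm focal length; note that the pixel pitch and flight altitude below are assumptions for the example, not values stated in the abstract.

```python
def ground_sample_distance(pixel_pitch_m, focal_length_m, altitude_m):
    """GSD: ground footprint of one pixel for a nadir-looking camera."""
    return pixel_pitch_m * altitude_m / focal_length_m

# f = 150 mm from the design; 9 um pixel pitch and 2000 m altitude are assumed
print(round(ground_sample_distance(9e-6, 0.150, 2000), 2))  # 0.12 m per pixel
```

    Under these assumptions each pixel covers about 12 cm of ground, which is the sense in which a long focal length trades field of view for resolution.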

  20. Mounted Video Camera Captures Launch of STS-112, Shuttle Orbiter Atlantis

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A color video camera mounted to the top of the External Tank (ET) provided this spectacular never-before-seen view of the STS-112 mission as the Space Shuttle Orbiter Atlantis lifted off in the afternoon of October 7, 2002. The camera provided views as the orbiter began its ascent until it reached near-orbital speed, about 56 miles above the Earth, including a view of the front and belly of the orbiter, a portion of the Solid Rocket Booster, and ET. The video was downlinked during flight to several NASA data-receiving sites, offering the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. Atlantis carried the S1 Integrated Truss Structure and the Crew and Equipment Translation Aid (CETA) Cart. The CETA is the first of two human-powered carts that will ride along the International Space Station's railway providing a mobile work platform for future extravehicular activities by astronauts. Landing on October 18, 2002, the Orbiter Atlantis ended its 11-day mission.

  2. A Simple Method Based on the Application of a CCD Camera as a Sensor to Detect Low Concentrations of Barium Sulfate in Suspension

    PubMed Central

    de Sena, Rodrigo Caciano; Soares, Matheus; Pereira, Maria Luiza Oliveira; da Silva, Rogério Cruz Domingues; do Rosário, Francisca Ferreira; da Silva, Joao Francisco Cajaiba

    2011-01-01

    The development of a simple, rapid and low-cost method based on video image analysis and aimed at the detection of low concentrations of precipitated barium sulfate is described. The proposed system is basically composed of a webcam with a CCD sensor and a conventional dichroic lamp. For this purpose, software for processing and analyzing the digital images based on the RGB (Red, Green and Blue) color system was developed. The proposed method showed very good repeatability and linearity and also presented higher sensitivity than the standard turbidimetric method. The developed method is presented as a simple alternative for future applications in the study of precipitation of inorganic salts and also for detecting the crystallization of organic compounds. PMID:22346607
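    A minimal sketch of the kind of RGB image analysis described: average the colour channels over a region of interest and fit a linear calibration between intensity and concentration. The calibration points below are hypothetical; the paper's actual software and calibration data are not reproduced here.

```python
import numpy as np

def mean_intensity(rgb_frame):
    """Average the R, G and B channels over a region of interest."""
    return float(np.asarray(rgb_frame, dtype=float).mean())

def calibrate(intensities, concentrations):
    """Least-squares line mapping image intensity to BaSO4 concentration."""
    slope, intercept = np.polyfit(intensities, concentrations, 1)
    return slope, intercept

# Hypothetical calibration: brighter scattering at higher concentration
I = [10.0, 20.0, 30.0, 40.0]        # mean intensities of standards
c = [0.0, 5.0, 10.0, 15.0]          # known concentrations (mg/L)
m, b = calibrate(I, c)
print(round(m * 25.0 + b, 1))  # 7.5 -> concentration of an unknown sample
```

    The linearity the abstract reports is exactly what makes such a single-line calibration usable.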

  3. A stroboscopic technique for using CCD cameras in flow visualization systems for continuous viewing and stop action photography

    NASA Technical Reports Server (NTRS)

    Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.

    1992-01-01

    A technique for synchronizing a pulse light source to charge coupled device cameras is presented. The technique permits the use of pulse light sources for continuous as well as stop action flow visualization. The technique has eliminated the need to provide separate lighting systems at facilities requiring continuous and stop action viewing or photography.

  4. Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras

    USGS Publications Warehouse

    Harris, A.J.L.; Thornber, C.R.

    1999-01-01

    GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatic alerting of major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
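    The two-standard-deviation spike criterion applied to the radiance time series can be sketched as follows (the series below is synthetic, standing in for the GOES radiance values):

```python
import numpy as np

def radiance_spikes(radiance, n_sigma=2.0):
    """Indices where a radiance time series rises more than n_sigma
    standard deviations above its mean -- the spike criterion used to
    flag short effusive bursts."""
    r = np.asarray(radiance, dtype=float)
    threshold = r.mean() + n_sigma * r.std()
    return np.flatnonzero(r > threshold)

# Quiet background radiance with two short effusive bursts
series = [1.0] * 20 + [9.0] + [1.0] * 20 + [12.0] + [1.0] * 20
print(radiance_spikes(series).tolist())  # [20, 41]
```

    Run on data reception, such a check supports the automatic-alerting function the abstract describes; the flagged indices would then be cross-checked against video camera records.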

  5. Calibration grooming and alignment for LDUA High Resolution Stereoscopic Video Camera System (HRSVS)

    SciTech Connect

    Pardini, A.F.

    1998-01-27

    The High Resolution Stereoscopic Video Camera System (HRSVS) was designed by the Savannah River Technology Center (SRTC) to provide routine and troubleshooting views of tank interiors during characterization and remediation phases of underground storage tank (UST) processing. The HRSVS is a dual color camera system designed to provide stereo viewing of the interior of the tanks, including the tank wall, in a Class 1, Division 1, flammable atmosphere. The HRSVS was designed with a modular philosophy for easy maintenance and configuration modifications. During operation of the system with the LDUA, control of the camera system will be performed by the LDUA supervisory data acquisition system (SDAS). Video and control status will be displayed on monitors within the LDUA control center. All control functions are accessible from the front panel of the control box located within the Operations Control Trailer (OCT). The LDUA will provide all positioning functions within the waste tank for the end effector. Various electronic measurement instruments will be used to perform CG and A activities. The instruments may include a digital volt meter, oscilloscope, signal generator, and other electronic repair equipment. None of these instruments will need to be calibrated beyond what comes from the manufacturer. During CG and A, a temperature-indicating device will be used to measure the temperature of the outside of the HRSVS from initial startup until the temperature has stabilized. This device will not need to be in calibration during CG and A but will have to have a current calibration sticker from the Standards Laboratory during any acceptance testing.

  6. Visual fatigue modeling for stereoscopic video shot based on camera motion

    NASA Astrophysics Data System (ADS)

    Shi, Guozhong; Sang, Xinzhu; Yu, Xunbo; Liu, Yangdong; Liu, Jing

    2014-11-01

    As three-dimensional television (3-DTV) and 3-D movies become popular, visual discomfort limits further applications of 3D display technology. The causes of visual discomfort from stereoscopic video include conflicts between accommodation and convergence, excessive binocular parallax, fast motion of objects, and so on. Here, a novel method for evaluating visual fatigue is demonstrated. Influence factors including spatial structure, motion scale and comfortable zone are analyzed. According to the human visual system (HVS), people only need to converge their eyes on specific objects when the camera and background are static; relative motion should be considered for other camera conditions, which determine different factor coefficients and weights. Compared with the traditional visual fatigue prediction model, a novel visual fatigue prediction model is presented. The visual fatigue degree is predicted using a multiple linear regression method combined with subjective evaluation. Consequently, each factor can reflect the characteristics of the scene, and the total visual fatigue score can be computed according to the proposed algorithm. Compared with conventional algorithms, which ignore the status of the camera, our approach exhibits reliable performance in terms of correlation with subjective test results.
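
    The regression step described above can be sketched in a few lines. This is a generic illustration of fitting and applying a multiple-linear-regression fatigue model; the factor scores, ratings, and resulting weights below are invented, not the paper's data:

```python
import numpy as np

# Hypothetical per-shot factor scores (spatial structure, motion scale,
# comfort-zone occupancy) and subjective fatigue ratings -- all invented.
X = np.array([
    [0.2, 0.1, 0.9],
    [0.5, 0.3, 0.6],
    [0.7, 0.8, 0.3],
    [0.9, 0.6, 0.2],
])
y = np.array([1.2, 2.0, 3.5, 4.1])  # subjective fatigue scores

# Ordinary least squares with an intercept column: the core of a
# multiple-linear-regression predictor.
A = np.hstack([X, np.ones((X.shape[0], 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_fatigue(factors):
    """Predict a fatigue score from a 3-element factor vector."""
    return float(np.dot(np.append(factors, 1.0), w))
```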

  7. Photometric characteristics of the Vega 1 and Vega 2 CCD cameras for the observation of Comet Halley.

    PubMed

    Abergel, A; Bertaux, J L; Avanessov, G A; Tarnopolsky, V I; Zhukov, B S

    1987-10-15

    The first pictures of the nucleus of Comet Halley were returned from the CCD TV system (TVS) placed onboard the two Soviet spacecraft Vega 1 and 2. Comet Halley was observed from 4 to 11 Mar. 1986, and ~1500 images were transmitted to the earth. The raw data are given in digital numbers which must be converted into units of brightness. After a brief description of the experiment, the on-ground calibration tests are discussed. Many images were registered and processed to obtain standard correcting images and absolute calibration. Photometric performance could also be checked during flight with observations of Jupiter; in-flight and on-ground performances are compared. PMID:20523385

  8. Gain, Level, And Exposure Control For A Television Camera

    NASA Technical Reports Server (NTRS)

    Major, Geoffrey J.; Hetherington, Rolfe W.

    1992-01-01

    An automatic-level-control/automatic-gain-control (ALC/AGC) system for a charge-coupled-device (CCD) color television camera prevents overloading in bright scenes by measuring scene brightness from the red, green, and blue output signals and processing these into adjustments of the video amplifiers and the iris on the camera lens. The system is faster, does not distort video brightness signals, and is built with smaller components.
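
    The measure-and-adjust loop can be illustrated in software (the NASA system itself is camera hardware; the luma weights and control constants below are assumptions, not taken from the brief):

```python
def scene_brightness(r, g, b):
    """Estimate scene brightness from red, green, and blue signal levels.

    Rec. 601 luma weights are used as a stand-in for the (unspecified)
    weighting in the actual system.
    """
    return 0.299 * r + 0.587 * g + 0.114 * b

def adjust_gain(current_gain, brightness, target=0.5, k=0.5,
                min_gain=0.1, max_gain=8.0):
    """One proportional control step nudging the gain toward the target
    brightness, clamped to the amplifier's range."""
    if brightness <= 0:
        return max_gain  # scene is black: open up fully
    new_gain = current_gain * (target / brightness) ** k
    return min(max(new_gain, min_gain), max_gain)
```

    A bright scene (brightness above target) drives the gain down, preventing the overload the abstract describes; in the real camera the same correction is split between the amplifiers and the lens iris.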

  9. Identifying predators and fates of grassland passerine nests using miniature video cameras

    USGS Publications Warehouse

    Pietz, P.J.; Granfors, D.A.

    2000-01-01

    Nest fates, causes of nest failure, and identities of nest predators are difficult to determine for grassland passerines. We developed a miniature video-camera system for use in grasslands and deployed it at 69 nests of 10 passerine species in North Dakota during 1996-97. Abandonment rates were higher at nests with cameras. Predation events lasted >1 day or night (22-116 hr) at 6 nests, 5 of which were depredated by ground squirrels or mice. For nests without cameras, estimated predation rates were lower for ground nests than aboveground nests (P = 0.055), but did not differ between open and covered nests (P = 0.74). Open and covered nests differed, however, when predation risk (estimated by initial-predation rate) was examined separately for day and night using camera-monitored nests; the frequency of initial predations that occurred during the day was higher for open nests than covered nests (P = 0.015). Thus, vulnerability of some nest types may depend on the relative importance of nocturnal and diurnal predators. Predation risk increased with nestling age from 0 to 8 days (P = 0.07). Up to 15% of fates assigned to camera-monitored nests were wrong when based solely on evidence that would have been available from periodic nest visits. There was no evidence of disturbance at nearly half the depredated nests, including all 5 depredated by large mammals. Overlap in the types of sign left by different predator species, and variability of sign within species, suggest that evidence at nests is unreliable for identifying predators of grassland passerines.

  10. Optimizing Detection Rate and Characterization of Subtle Paroxysmal Neonatal Abnormal Facial Movements with Multi-Camera Video-Electroencephalogram Recordings.

    PubMed

    Pisani, Francesco; Pavlidis, Elena; Cattani, Luca; Ferrari, Gianluigi; Raheli, Riccardo; Spagnoli, Carlotta

    2016-06-01

    Objectives: We retrospectively analyzed the diagnostic accuracy for paroxysmal abnormal facial movements, comparing a one-camera versus a multi-camera approach. Background: Polygraphic video-electroencephalogram (vEEG) recording is the current gold standard for brain monitoring in high-risk newborns, especially when neonatal seizures are suspected. One camera synchronized with the EEG is commonly used. Methods: Since mid-June 2012, we have been using multiple cameras, one of which points toward the newborn's face. We evaluated vEEGs recorded in newborns between mid-June 2012 and the end of September 2014 and compared, for each recording, the diagnostic accuracies obtained with the one-camera and multi-camera approaches. Results: We recorded 147 vEEGs from 87 newborns and found 73 episodes of paroxysmal abnormal facial movements in 18 vEEGs of 11 newborns with the multi-camera approach. With the single-camera approach, only 28.8% of these events were identified (21/73). Ten vEEGs that were positive with the multi-camera approach, containing 52 paroxysmal abnormal facial movements (52/73, 71.2%), would have been considered negative with the single-camera approach. Conclusions: The use of one additional facial camera can significantly increase the diagnostic accuracy of vEEGs in the detection of paroxysmal abnormal facial movements in newborns. PMID:27111027

  11. CCD image acquisition for multispectral teledetection

    NASA Astrophysics Data System (ADS)

    Peralta-Fabi, R.; Peralta, A.; Prado, Jorge M.; Vicente, Esau; Navarette, M.

    1992-08-01

    A low-cost, high-reliability multispectral video system has been developed for airborne remote sensing. Three low-weight CCD cameras are mounted together with a photographic camera in a Kevlar composite self-contained structure. The CCD cameras are remotely controlled, have spectral filters (80 nm at 50% T) placed in front of their optical systems, and all cameras are aligned to capture the same image field. Filters may be changed so as to adjust the spectral bands according to the object's reflectance properties, but a set of bands common to most remote sensing aircraft and satellites is usually used, covering the visible and near IR. This paper presents results obtained with this system and some comparisons as to the cost, resolution, and atmospheric correction advantages with respect to other, more costly devices. A brief description of the Remotely Piloted Vehicle (RPV) project, where the camera system will be mounted, is also given. The images so obtained replace the costlier ones obtained by satellites in several specific applications. Other applications under development include fire monitoring, identification of vegetation in the field and in the laboratory, discrimination of objects by color for industrial applications, and geological and engineering surveys.

  12. Television automatic video-line tester

    NASA Astrophysics Data System (ADS)

    Ge, Zhaoxiang; Tang, Dongsheng; Feng, Binghua

    1998-08-01

    The linearity of the telescope video-line is an important characteristic of geodetic instruments and micrometer telescopes. The 1-inch video-line tester, invented by the University of Shanghai for Science and Technology, has been adopted in the relevant instrument criteria and national metering regulations. However, with optical and chemical readout and visual alignment, it is prone to subjective error and cannot provide detailed data. In this paper, the authors put forward an improvement to the video-line tester: using a CCD TV camera, displaying and processing the CCD signal through a computer, and automating the testing, with the advantages of objectivity, reliability, speed, and reduced focusing error.

  13. Embedded FIR filter design for real-time refocusing using a standard plenoptic video camera

    NASA Astrophysics Data System (ADS)

    Hahne, Christopher; Aggoun, Amar

    2014-03-01

    A novel and low-cost embedded hardware architecture for real-time refocusing based on a standard plenoptic camera is presented in this study. The proposed layout design synthesizes refocusing slices directly from micro images by omitting the process for the commonly used sub-aperture extraction. Therefore, intellectual property cores, containing switch controlled Finite Impulse Response (FIR) filters, are developed and applied to the Field Programmable Gate Array (FPGA) XC6SLX45 from Xilinx. Enabling the hardware design to work economically, the FIR filters are composed of stored product as well as upsampling and interpolation techniques in order to achieve an ideal relation between image resolution, delay time, power consumption and the demand of logic gates. The video output is transmitted via High-Definition Multimedia Interface (HDMI) with a resolution of 720p at a frame rate of 60 fps conforming to the HD ready standard. Examples of the synthesized refocusing slices are presented.
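
    As a minimal software model of what a switch-controlled FIR core computes (not the authors' FPGA design), a direct-form FIR filter is a sliding weighted sum of input samples:

```python
def fir_filter(samples, coeffs):
    """Direct-form FIR: out[n] = sum_k coeffs[k] * samples[n - k]."""
    out = []
    for n in range(len(samples)):
        acc = 0.0
        for k, c in enumerate(coeffs):
            if n - k >= 0:  # skip taps before the start of the signal
                acc += c * samples[n - k]
        out.append(acc)
    return out
```

    In the hardware version the products are stored rather than multiplied on the fly and the accumulation is pipelined, but the arithmetic per output sample is the same.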

  14. A two camera video imaging system with application to parafoil angle of attack measurements

    NASA Astrophysics Data System (ADS)

    Meyn, Larry A.; Bennett, Mark S.

    1991-01-01

    This paper describes the development of a two-camera, video imaging system for the determination of three-dimensional spatial coordinates from stereo images. This system successfully measured angle of attack at several span-wise locations for large-scale parafoils tested in the NASA Ames 80- by 120-Foot Wind Tunnel. Measurement uncertainty for angle of attack was less than 0.6 deg. The stereo ranging system was the primary source for angle of attack measurements since inclinometers sewn into the fabric ribs of the parafoils had unknown angle offsets acquired during installation. This paper includes discussions of the basic theory and operation of the stereo ranging system, system measurement uncertainty, experimental set-up, calibration results, and test results. Planned improvements and enhancements to the system are also discussed.
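
    The core of such a stereo ranging system is triangulation: each camera defines a ray through the matched image point, and the 3-D coordinate is taken as the midpoint of the shortest segment between the two rays. The following is a generic sketch (not the paper's calibrated implementation), assuming known camera centres and ray directions:

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Midpoint of the shortest segment between two viewing rays.

    c1, c2: camera centres; d1, d2: direction vectors of the rays
    through the matched image points (3-vectors). Rays must not be parallel.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b            # zero only for parallel rays
    s = (b * e - c * d) / denom      # parameter along ray 1
    t = (a * e - b * d) / denom      # parameter along ray 2
    return (c1 + s * d1 + c2 + t * d2) / 2.0
```

    With perfectly intersecting rays the midpoint is the intersection itself; with measurement noise it is the least-squares compromise between the two rays.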

  15. Plant iodine-131 uptake in relation to root concentration as measured in minirhizotron by video camera

    SciTech Connect

    Moss, K.J.

    1990-09-01

    Glass viewing tubes (minirhizotrons) were placed in the soil beneath native perennial bunchgrass (Agropyron spicatum). The tubes provided access for observing and quantifying plant roots with a miniature video camera and soil moisture estimates by neutron hydroprobe. The radiotracer I-131 was delivered to the root zone at three depths with differing root concentrations. The plant was subsequently sampled and analyzed for I-131. Plant uptake was greater when I-131 was applied at soil depths with higher root concentrations. When I-131 was applied at soil depths with lower root concentrations, plant uptake was less. However, the relationship between root concentration and plant uptake was not a direct one. When I-131 was delivered to deeper soil depths with low root concentrations, the quantity of roots there appeared to be less effective in uptake than the same quantity of roots at shallow soil depths with high root concentration. 29 refs., 6 figs., 11 tabs.

  16. Autonomous video camera system for monitoring impacts to benthic habitats from demersal fishing gear, including longlines

    NASA Astrophysics Data System (ADS)

    Kilpatrick, Robert; Ewing, Graeme; Lamb, Tim; Welsford, Dirk; Constable, Andrew

    2011-04-01

    Studies of the interactions of demersal fishing gear with the benthic environment are needed in order to manage conservation of benthic habitats. There has been limited direct assessment of these interactions through deployment of cameras on commercial fishing gear, especially on demersal longlines. A compact, autonomous deep-sea video system was designed and constructed by the Australian Antarctic Division (AAD) for deployment on commercial fishing gear to observe interactions with benthos in the Southern Ocean finfish fisheries (targeting toothfish, Dissostichus spp.). The Benthic Impacts Camera System (BICS) is capable of withstanding depths to 2500 m, has been successfully fitted to both longline and demersal trawl fishing gear, and is suitable for routine deployment by non-experts such as fisheries observers or crew. The system is entirely autonomous, robust, compact, easy to operate, and has minimal effect on the performance of the fishing gear it is attached to. To date, the system has successfully captured footage that demonstrates the interactions between demersal fishing gear and the benthos during routine commercial operations. It provides the first footage demonstrating the nature of the interaction between demersal longlines and benthic habitats in the Southern Ocean, as well as showing potential as a tool for rapidly assessing habitat types and the presence of mobile biota such as krill (Euphausia superba).

  17. The design and realization of a three-dimensional video system by means of a CCD array

    NASA Astrophysics Data System (ADS)

    Boizard, J. L.

    1985-12-01

    Design features, principles, and initial tests of a prototype three-dimensional robot vision system based on a laser source and a CCD detector array are described. The use of a laser as a coherent illumination source permits determination of relief with a single emitter, since the location of the source is a known quantity with low distortion. The CCD detector array furnishes an acceptable signal/noise ratio and, when wired to an appropriate signal processing system, furnishes real-time data on the return signals, i.e., the characteristic points of an object being scanned. Signal processing involves integration of 29 kB of data per 100 samples, with sampling occurring at a rate of 5 MHz (the CCDs) and yielding an image every 12 msec. Algorithms for filtering errors from the data stream are discussed.

  18. Automatic sorting installation based on two CCD cameras for measuring gauge diameter and ellipticity of pulp extractors

    NASA Astrophysics Data System (ADS)

    Ovchinnikov, Dmitry L.; Akhtiamov, Rishad A.; Dorogov, Nikolai V.; Morozov, Oleg G.; Nureev, Ilnur I.; Yusupov, Alfred Y.

    2000-12-01

    The specific problem addressed in this paper is the measurement of pulp extractor diameter 1 mm from the operating end, and the automatic sorting of the extractors. The range of measured sizes is 180-260 micrometers, and the necessary measurement accuracy is 1 micrometer. Sorting is carried out into 8 subranges of 10 micrometers each. The ellipticity of a pulp extractor is additionally analyzed and used as a quality index. A comparative analysis of different tools with respect to metrological performance and implementation cost showed that a television system is the class on the basis of which the required system should be built. Problems solved in building the system include: the use of short-focus lenses with high magnification so that pulse durations can be resolved for measurement to within a 1 micrometer error; accounting for the influence of lens aberrations on the measurement error; the use of cameras with pixel sizes of 0.7-1 micrometers; determination of the line number that corresponds to the gauge diameter of a pulp extractor; statistical averaging and extrapolation of the measurement data; and analysis of system variants with the aim of simplification and cost reduction.

  19. A unified and efficient framework for court-net sports video analysis using 3D camera modeling

    NASA Astrophysics Data System (ADS)

    Han, Jungong; de With, Peter H. N.

    2007-01-01

    The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable to more than one sports type in order to come to a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes that are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e. players). The algorithm can track up to four players simultaneously. The complete system contributes to summarization by various forms of information, of which the most important are the moving trajectory and real speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it for a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and event detection about 90%.

  20. Single event effect characterization of the mixed-signal ASIC developed for CCD camera in space use

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Fujikawa, Mari; Mori, Hideki; Kan, Hiroaki; Ueda, Shutaro; Kosugi, Hiroko; Anabuki, Naohisa; Hayashida, Kiyoshi; Tsunemi, Hiroshi; Doty, John P.; Ikeda, Hirokazu; Kitamura, Hisashi; Uchihori, Yukio

    2013-12-01

    We present the single event effect (SEE) tolerance of a mixed-signal application-specific integrated circuit (ASIC) developed for a charge-coupled device camera onboard a future X-ray astronomical mission. We used proton and heavy ion beams at HIMAC/NIRS in Japan. Particles with a high linear energy transfer (LET) of 57.9 MeV cm2/mg were used to measure the single event latch-up (SEL) tolerance, which results in a sufficiently low cross-section of σSEL<4.2×10-11 cm2/(Ion×ASIC). The single event upset (SEU) tolerance was estimated with various species over a wide range of energies. Taking into account that some of the protons create recoil heavy ions with higher LET than the incident protons, we derived the probability of an SEU event as a function of LET. The SEE event rate in a low-earth orbit was then estimated using a simulated LET spectrum. The SEL rate is below once per 49 years, which satisfies the required latch-up tolerance. The upper limit of the SEU rate is derived to be 1.3×10-3 events/s. Although SEU events cannot be distinguished from the signals of X-ray photons from astronomical objects, the derived SEU rate is below 1.3% of the expected non-X-ray background rate of the detector, and hence these events should not be a major component of the instrumental background.
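
    Schematically, the in-orbit event-rate estimate folds the measured cross-section curve with the orbital LET spectrum: the rate is the sum over LET bins of cross-section times particle flux. All numbers below are invented placeholders, not the paper's data:

```python
# Hypothetical cross-section per LET bin (cm^2 per ASIC) and orbital
# particle flux per bin (particles / cm^2 / s) -- invented placeholders.
let_bins = [1.0, 5.0, 10.0, 30.0, 60.0]            # MeV cm^2/mg
cross_section = [0.0, 1e-12, 5e-12, 2e-11, 4e-11]  # cm^2 per device
flux = [1e-2, 1e-4, 1e-5, 1e-7, 1e-9]              # particles / cm^2 / s

def event_rate(sigma, phi):
    """Total SEE rate: sum over LET bins of sigma(LET) * flux(LET)."""
    return sum(s * f for s, f in zip(sigma, phi))

rate = event_rate(cross_section, flux)  # events/s
```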

  1. A versatile digital video engine for safeguards and security applications

    SciTech Connect

    Hale, W.R.; Johnson, C.S.; DeKeyser, P.

    1996-08-01

    The capture and storage of video images have been major engineering challenges for safeguards and security applications ever since the video camera provided a method to observe remote operations. The problems of designing reliable video cameras were solved in the early 1980s with the introduction of the CCD (charge-coupled device) camera. The first CCD cameras cost thousands of dollars but have now been replaced by cameras costing in the hundreds. The remaining problem of storing and viewing video images in both attended and unattended video surveillance systems and remote monitoring systems is being solved by sophisticated digital compression systems. One such system is the PC-104 three-card set, which is literally a "video engine" that can power video storage systems. The use of digital images in surveillance systems makes it possible to develop remote monitoring systems, portable video surveillance units, image review stations, and authenticated camera modules. This paper discusses the video card set and how it can be used in many applications.

  2. Search for Trans-Neptunian Objects: a new MIDAS context confronted with some results obtained with the UH 8k CCD Mosaic Camera

    NASA Astrophysics Data System (ADS)

    Rousselot, P.; Lombard, F.; Moreels, G.

    1998-09-01

    We present the results obtained with a new program dedicated to the automatic detection of trans-Neptunian objects (TNOs) in standard sets of images of the same field of view. This program has the key advantage, compared to other similar software, of being designed for use with one of the main astronomical data processing packages, the Munich Image Data Analysis System (MIDAS), developed by the European Southern Observatory (ESO). It is freely available from the World Wide Web server of the Observatory of Besançon (http://www.obs-besancon/www/publi/philippe/tno.html). The program was tested on observational data collected with the UH 8k CCD Mosaic Camera, used during two nights, on October 25 and 26, 1997, at the prime focus of the CFH telescope (Mauna Kea, Hawaii). The purpose of these observations was to detect new TNOs, and a previous analysis by the classical method of blinking had led to a first detection of a new TNO. This object appears close to the detection limit of the images (i.e., close to the 24th magnitude) and has an unusual orbital inclination (i ≈ 33 deg). It allowed efficient and successful testing of the program on faint moving objects, demonstrating its ability to detect objects close to the sky background noise with a very limited number of false detections.
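
    A typical screening step in such automatic moving-object detection is to require that three detections from equally spaced exposures be consistent with constant-rate linear motion, as expected for a slow-moving TNO. This is a generic sketch, not the program's actual algorithm:

```python
def linear_motion(p1, p2, p3, tol=1.0):
    """True if three (x, y) detections from equally spaced exposures are
    consistent with constant-rate linear motion, within tol pixels."""
    dx1, dy1 = p2[0] - p1[0], p2[1] - p1[1]
    dx2, dy2 = p3[0] - p2[0], p3[1] - p2[1]
    # The displacement between consecutive exposures should be (nearly)
    # constant; large deviations indicate noise or unrelated detections.
    return abs(dx1 - dx2) <= tol and abs(dy1 - dy2) <= tol
```

    Tightening the tolerance suppresses false detections from background noise, at the cost of rejecting objects whose measured centroids scatter by more than the tolerance.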

  3. Optimal camera exposure for video surveillance systems by predictive control of shutter speed, aperture, and gain

    NASA Astrophysics Data System (ADS)

    Torres, Juan; Menéndez, José Manuel

    2015-02-01

    This paper establishes a real-time auto-exposure method to guarantee that surveillance cameras in uncontrolled light conditions take advantage of their whole dynamic range while providing neither under- nor overexposed images. State-of-the-art auto-exposure methods base their control on the brightness of the image measured in a limited region where the foreground objects are mostly located. Unlike these methods, the proposed algorithm establishes a set of indicators, based on the image histogram, that define its shape and position. Furthermore, the location of the objects to be inspected is usually unknown in surveillance applications, so the whole image is monitored in this approach. To control the camera settings, we defined a parameters function (Ef) that depends linearly on the shutter speed and the electronic gain, and is inversely proportional to the square of the lens aperture diameter. When the current acquired image is not overexposed, our algorithm computes the value of Ef that would move the histogram to the maximum value that does not overexpose the capture. When the current acquired image is overexposed, it computes the value of Ef that would move the histogram to a value that does not underexpose the capture and remains close to the overexposed region. If the image is both under- and overexposed, the whole dynamic range of the camera is already in use, and a default value of Ef that does not overexpose the capture is selected. This decision follows the idea that underexposed images are preferable to overexposed ones, because the noise produced in the lower regions of the histogram can be removed in a post-processing step, while the saturated pixels of the higher regions cannot be recovered. The proposed algorithm was tested on a video surveillance camera placed in an outdoor parking lot surrounded by buildings and trees, which produce moving shadows on the ground, running alternately during the daytime over seven days.
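
    The parameters function follows directly from its stated dependence: linear in shutter speed and electronic gain, inversely proportional to the square of the aperture diameter (the proportionality constant is taken as 1 in this sketch):

```python
def exposure_parameter(shutter_time, gain, aperture_diameter):
    """Parameters function Ef: linear in shutter time and electronic gain,
    inversely proportional to the squared aperture diameter."""
    return shutter_time * gain / aperture_diameter ** 2
```

    Doubling either the shutter time or the gain doubles Ef, while doubling the aperture diameter quarters it, so the controller can trade the three settings against each other at constant Ef.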

  4. Surgeon point-of-view recording: Using a high-definition head-mounted video camera in the operating room

    PubMed Central

    Nair, Akshay Gopinathan; Kamal, Saurabh; Dave, Tarjani Vivek; Mishra, Kapil; Reddy, Harsha S; Rocca, David Della; Rocca, Robert C Della; Andron, Aleza; Jain, Vandana

    2015-01-01

    Objective: To study the utility of a commercially available small, portable ultra-high definition (HD) camera (GoPro Hero 4) for intraoperative recording. Methods: A head mount was used to fix the camera on the operating surgeon's head. Due care was taken to protect the patient's identity. The recorded video was subsequently edited and used as a teaching tool. This retrospective, noncomparative study was conducted at three tertiary eye care centers. The surgeries recorded were ptosis correction, ectropion correction, dacryocystorhinostomy, angular dermoid excision, enucleation, blepharoplasty, and lid tear repair (one each). The recorded videos were reviewed, edited, and checked for clarity, resolution, and reproducibility. Results: The recorded videos were found to be of high quality, which allowed for zooming and clear visualization of the surgical anatomy. Minimal distortion is a drawback that can be effectively addressed during postproduction. The camera, owing to its light weight and small size, can be mounted on the surgeon's head, thus offering a unique surgeon point of view. In our experience, the results were of good quality and reproducible. Conclusions: A head-mounted ultra-HD video recording system is a cheap, high-quality, and unobtrusive technique to record surgery and can be a useful teaching tool in external facial and ophthalmic plastic surgery. PMID:26655001

  5. CCD TV focal plane guider development and comparison to SIRTF applications

    NASA Technical Reports Server (NTRS)

    Rank, David M.

    1989-01-01

    It is expected that the SIRTF payload will use a CCD TV focal plane fine guidance sensor to provide acquisition of sources and tracking stability of the telescope. CCD TV cameras and guiders have been developed at Lick Observatory for several years, producing state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory to characterize present CCD autoguiding technology and relate it to SIRTF applications are presented. Two different designs of CCD cameras were constructed, using virtual phase and buried channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon, and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.

  6. A lateral chromatic aberration correction system for ultrahigh-definition color video camera

    NASA Astrophysics Data System (ADS)

    Yamashita, Takayuki; Shimamoto, Hiroshi; Funatsu, Ryohei; Mitani, Kohji; Nojiri, Yuji

    2006-02-01

    We have developed a color camera for an 8k x 4k-pixel ultrahigh-definition video system, called Super Hi-Vision, with a 5x zoom lens and a signal-processing system incorporating a function for real-time lateral chromatic aberration correction. The chromatic aberration of the lens degrades color image resolution, so in order to develop a compact zoom lens consistent with ultrahigh-resolution characteristics, we incorporated a real-time correction function in the signal-processing system. The signal-processing system has eight memory tables to store the correction data at eight focal length points on the blue and red channels. When the focal length data is input from the lens control unit, the relevant correction data are interpolated from two of the eight correction data tables. The system then performs geometrical conversion on both channels using this correction data. This paper shows that the correction function successfully reduces the lateral chromatic aberration to an amount small enough to ensure the desired image resolution over the entire range of the lens in real time.
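
    The table-interpolation step can be sketched as follows; the focal-length grid and the two-element coefficient vectors are placeholders (the real system interpolates full geometric-correction tables per channel):

```python
def interp_correction(focal_length, grid, tables):
    """Linearly interpolate correction coefficients between the two stored
    tables that bracket the current focal length.

    grid: sorted focal lengths at which tables were measured.
    tables: one list of correction coefficients per grid point.
    """
    if focal_length <= grid[0]:
        return list(tables[0])
    if focal_length >= grid[-1]:
        return list(tables[-1])
    for i in range(len(grid) - 1):
        if grid[i] <= focal_length <= grid[i + 1]:
            # Blend the bracketing tables by the fractional position.
            t = (focal_length - grid[i]) / (grid[i + 1] - grid[i])
            return [(1 - t) * a + t * b
                    for a, b in zip(tables[i], tables[i + 1])]
```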

  7. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side-by-side comparison of three nanosecond frame cameras, examining them for both performance and operational characteristics. The cameras comprise three combinations of gating and data recording: Micro-Channel Plate/CCD, Image Diode/CCD, and Image Diode/Film. The advantages and disadvantages of each device are discussed.

  8. Application of video-cameras for quality control and sampling optimisation of hydrological and erosion measurements in a catchment

    NASA Astrophysics Data System (ADS)

    Lora-Millán, Julio S.; Taguas, Encarnacion V.; Gomez, Jose A.; Perez, Rafael

    2014-05-01

    Long-term soil erosion studies imply substantial effort, particularly when continuous measurements must be maintained. High costs are associated with maintaining field equipment and with quality control of data collection. Energy supply and/or electronic failures, vandalism, and burglary are common causes of gaps in datasets, in many cases reducing their reach. In this work, a system of three video cameras, a recorder, and a transmission modem (3G technology) has been set up in a gauging station where rainfall, runoff flow, and sediment concentration are monitored. The gauging station is located at the outlet of an olive orchard catchment of 6.4 ha. Rainfall is measured with an automatic raingauge that records intensity at one-minute intervals. Discharge is measured by a critical-depth flume, where the water level is recorded by an ultrasonic sensor. When the water level rises to a predetermined level, an automatic sampler turns on and fills a bottle at different intervals according to a program that depends on the antecedent precipitation. A data logger controls the instruments' functions and records the data. The purpose of the video-camera system is to improve the quality of the dataset by i) visual analysis of the flow conditions in the flume, and ii) optimisation of the sampling programs. The cameras are positioned to record the flow at the approach and at the throat of the flume. To cross-check the readings of the ultrasonic sensor, a third camera records the flow level against a measuring tape. The system is activated when the ultrasonic sensor detects a height threshold, equivalent to an electric intensity level; thus, the video cameras record only when there is enough flow. This simplifies post-processing and reduces the cost of downloading recordings. The preliminary cross-check analysis will be presented, as well as the main improvements to the sampling program.

  9. A multiframe soft x-ray camera with fast video capture for the LSX field reversed configuration (FRC) experiment

    SciTech Connect

    Crawford, E.A. )

    1992-10-01

    Soft x-ray pinhole imaging has proven to be an exceptionally useful diagnostic for qualitative observation of impurity radiation from field reversed configuration plasmas. We used a four frame device, similar in design to those discussed in an earlier paper (E. A. Crawford, D. P. Taggart, and A. D. Bailey III, Rev. Sci. Instrum. 61, 2795 (1990)), as a routine diagnostic during the last six months of the Large s Experiment (LSX) program. Our camera is an improvement over earlier implementations in several significant aspects. It was designed and used from the onset of the LSX experiments with a video frame capture system, so that an instant visual record of the shot was available to the machine operator, as well as facilitating quantitative interpretation of intensity information recorded in the images. The camera was installed in the end region of the LSX, on axis, approximately 5.5 m from the plasma midplane. Experience with bolometers on LSX showed serious problems with "particle dumps" at this axial location at various times during the plasma discharge. Therefore, the initial implementation of the camera included an effective magnetic sweeper assembly. Overall performance of the camera, video capture system, and sweeper is discussed.

  10. Determination of visible coordinates of the low-orbit space objects and their photometry by the CCD camera with the analogue output. Initial image processing

    NASA Astrophysics Data System (ADS)

    Shakun, L. S.; Koshkin, N. I.

    2014-06-01

    The number of artificial space objects in low Earth orbit is continuously increasing, which raises the requirements for the accuracy of measurement of their coordinates and for the precision of the prediction of their motion. The accuracy of the prediction can be improved if the actual current orientation of a non-spherical satellite is taken into account; in so doing, it also becomes possible to directly determine the atmospheric density along the orbit. Solving this problem requires regular photometric surveillance of a large number of satellites and monitoring of the parameters of their rotation around the centre of mass. To do that, it is necessary to acquire and promptly process large video arrays containing images of a satellite against the background stars. In the present paper, a method for the simultaneous measurement of coordinates and brightness of low Earth orbit space objects against the background stars is considered, for objects tracked by the KT-50 telescope with a mirror diameter of 50 cm and the WAT-209H2 video camera. The problem of determining the moments of exposure of images is examined in detail. The accuracy of measuring both the apparent coordinates of stars and their photometry is estimated using observations of an open star cluster. In the presented observations, the standard deviation of a single position measurement is 1σ, and the accuracy of determination of the moment of exposure of images is better than 0.0001 s. The estimated standard deviation of a single brightness measurement is 0.1 mag. Some examples of results of satellite surveillance are also presented.

  11. HDR {sup 192}Ir source speed measurements using a high speed video camera

    SciTech Connect

    Fonseca, Gabriel P.; Rubo, Rodrigo A.; Sales, Camila P. de; Verhaegen, Frank

    2015-01-15

    Purpose: The dose delivered with an HDR {sup 192}Ir afterloader can be separated into a dwell component and a transit component resulting from the source movement. The transit component depends directly on the source speed profile, and it is the goal of this study to measure accurate source speed profiles. Methods: A high speed video camera was used to record the movement of a {sup 192}Ir source (Nucletron, an Elekta company, Stockholm, Sweden) for interdwell distances of 0.25–5 cm with dwell times of 0.1, 1, and 2 s. Transit dose distributions were calculated using a Monte Carlo code simulating the source movement. Results: The source stops at each dwell position, oscillating around the desired position for a duration of up to (0.026 ± 0.005) s. The source speed profile shows variations between 0 and 81 cm/s, with an average speed of ∼33 cm/s for most of the interdwell distances. The source stops for up to (0.005 ± 0.001) s at nonprogrammed positions between two programmed dwell positions. The dwell time correction applied by the manufacturer compensates for the transit dose between the dwell positions, leading to a maximum overdose of 41 mGy for the considered cases, assuming an air-kerma strength of 48 000 U. The transit dose component is not uniformly distributed, leading to over- and underdoses that remain within 1.4% of commonly prescribed doses (3–10 Gy). Conclusions: The source maintains its speed even for short interdwell distances. Dose variations due to the transit component are much lower than prescribed brachytherapy treatment doses, although the transit dose component should be evaluated individually for clinical cases.

  12. Development of a 300,000-pixel ultrahigh-speed high-sensitivity CCD

    NASA Astrophysics Data System (ADS)

    Ohtake, H.; Hayashida, T.; Kitamura, K.; Arai, T.; Yonai, J.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Poggemann, D.; Ruckelshausen, A.; van Kuijk, H.; Bosiers, Jan T.

    2006-02-01

    We are developing an ultrahigh-speed, high-sensitivity broadcast camera that is capable of capturing clear, smooth slow-motion video even where lighting is limited, such as at professional baseball games played at night. In earlier work, we developed an ultrahigh-speed broadcast color camera [1] using three 80,000-pixel ultrahigh-speed, high-sensitivity CCDs [2]. This camera had about ten times the sensitivity of standard high-speed cameras and enabled an entirely new style of presentation for sports broadcasts and science programs. Increasing the pixel count is crucially important for applying ultrahigh-speed, high-sensitivity CCDs to HDTV broadcasting. This paper summarizes our experimental development aimed at improving the resolution of the CCD even further: a new ultrahigh-speed, high-sensitivity CCD that increases the pixel count four-fold, to 300,000 pixels.

  13. On the use of Video Camera Systems in the Detection of Kuiper Belt Objects by Stellar Occultations

    NASA Astrophysics Data System (ADS)

    Subasinghe, Dilini

    2012-10-01

    Due to the distance between us and the Kuiper Belt, direct detection of Kuiper Belt Objects (KBOs) less than 10 km in diameter is not currently possible, so indirect methods such as stellar occultations must be employed to remotely probe these bodies. The size and shape of a body, as well as information about any atmosphere or ring system, can be collected through observations of stellar occultations. This method has previously been used with some success: Roques et al. (2006) detected 3 Trans-Neptunian objects, and Schlichting et al. (2009) detected a single object in archival data. However, previous assessments of KBO occultation detection rates have been calculated only for telescopes; we extend this method to video camera systems. Building on Roques & Moncuquet (2000), we present a derivation that can be applied to any video camera system, taking into account camera specifications and diffraction effects. This allows a determination of the number of observable KBO occultations per night. Example calculations are presented for some of the automated meteor camera systems currently in use at the University of Western Ontario. The results of this project will allow us to refine and improve our own camera system, as well as allow others to enhance their systems for KBO detection. Roques, F., Doressoundiram, A., Dhillon, V., Marsh, T., Bickerton, S., Kavelaars, J. J., Moncuquet, M., Auvergne, M., Belskaya, I., Chevreton, M., Colas, F., Fernandez, A., Fitzsimmons, A., Lecacheux, J., Mousis, O., Pau, S., Peixinho, N., & Tozzi, G. P. (2006). The Astronomical Journal, 132(2), 819-822. Roques, F., & Moncuquet, M. (2000). Icarus, 147(2), 530-544. Schlichting, H. E., Ofek, E. O., Wenz, M., Sari, R., Gal-Yam, A., Livio, M., Nelan, E., & Zucker, S. (2009). Nature, 462(7275), 895-897.
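
    The diffraction effects mentioned in the abstract are governed by the Fresnel scale. A back-of-envelope sketch of why video frame rates matter here, using assumed illustrative values (500 nm light, 40 AU distance, 25 km/s shadow velocity) rather than numbers from the paper:

```python
import math

AU = 1.496e11           # astronomical unit in metres
wavelength = 5.0e-7     # mid-visible light, ~500 nm (assumed)
distance = 40 * AU      # typical Kuiper Belt distance (assumed)

# Fresnel scale: objects near or below this size produce
# diffraction-dominated occultation signatures (Roques & Moncuquet 2000)
fresnel = math.sqrt(wavelength * distance / 2)

# Shadow velocity at opposition is roughly Earth's orbital speed (assumed)
v_shadow = 25e3                      # m/s
duration = 2 * fresnel / v_shadow    # crude event-duration estimate

print(f"Fresnel scale: {fresnel / 1000:.2f} km")    # ~1.22 km
print(f"event duration: {duration * 1000:.0f} ms")  # ~98 ms
```

    Sub-second events like this are why the camera's frame rate and per-frame sensitivity both enter any detection-rate calculation for video systems.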

  14. SU-C-18A-02: Image-Based Camera Tracking: Towards Registration of Endoscopic Video to CT

    SciTech Connect

    Ingram, S; Rao, A; Wendt, R; Castillo, R; Court, L; Yang, J; Beadle, B

    2014-06-01

    Purpose: Endoscopic examinations are routinely performed on head and neck and esophageal cancer patients. However, these images are underutilized for radiation therapy because there is currently no way to register them to a CT of the patient. The purpose of this work is to develop a method to track the motion of an endoscope within a structure using images from standard clinical equipment. This method will be incorporated into a broader endoscopy/CT registration framework. Methods: We developed a software algorithm to track the motion of an endoscope within an arbitrary structure. We computed frame-to-frame rotation and translation of the camera by tracking surface points across the video sequence and utilizing two-camera epipolar geometry. The resulting 3D camera path was used to recover the surrounding structure via triangulation methods. We tested this algorithm on a rigid cylindrical phantom with a pattern spray-painted on the inside. We did not constrain the motion of the endoscope while recording, and we did not constrain our measurements using the known structure of the phantom. Results: Our software algorithm can successfully track the general motion of the endoscope as it moves through the phantom. However, our preliminary data do not show a high degree of accuracy in the triangulation of 3D point locations. More rigorous data will be presented at the annual meeting. Conclusion: Image-based camera tracking is a promising method for endoscopy/CT image registration, and it requires only standard clinical equipment. It is one of two major components needed to achieve endoscopy/CT registration, the second of which is tying the camera path to absolute patient geometry. In addition to this second component, future work will focus on validating our camera tracking algorithm in the presence of clinical imaging features such as patient motion, erratic camera motion, and dynamic scene illumination.
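
    The two-view machinery the abstract relies on (frame-to-frame rotation/translation, the epipolar constraint, and triangulation of surface points) can be illustrated with synthetic data. The pose and point below are arbitrary choices for the sketch, not values from the study:

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]x, so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

# Assumed relative pose: camera 2 = rotate 5 deg about y, small translation
theta = np.deg2rad(5)
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([0.1, 0.0, 0.0])

X = np.array([0.2, -0.1, 2.0])   # a 3D surface point in camera-1 coordinates
X2 = R @ X + t                   # the same point in camera-2 coordinates

x1 = X / X[2]                    # normalized homogeneous image coordinates
x2 = X2 / X2[2]

E = skew(t) @ R                  # essential matrix for this pose
print("epipolar residual:", x2 @ E @ x1)   # ~0: x2' E x1 = 0 holds

# Linear (DLT) triangulation of the point from the two views
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([R, t.reshape(3, 1)])
A = np.vstack([x1[0] * P1[2] - P1[0],
               x1[1] * P1[2] - P1[1],
               x2[0] * P2[2] - P2[0],
               x2[1] * P2[2] - P2[1]])
_, _, Vt = np.linalg.svd(A)
Xh = Vt[-1]
X_rec = Xh[:3] / Xh[3]
print("recovered point:", X_rec)   # ~[0.2, -0.1, 2.0]
```

    In the real system the pose is unknown and must itself be estimated from tracked feature correspondences (e.g. via the essential matrix), which is where the triangulation accuracy issues reported in the abstract arise.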

  15. Improved design of an ISIS for a video camera of 1,000,000 pps

    NASA Astrophysics Data System (ADS)

    Etoh, Takeharu G.; Mutoh, Hideki; Takehara, Kohsei; Okinaka, Tomoo

    1999-05-01

    The ISIS, or In-situ Storage Image Sensor, may achieve frame rates higher than 1,000,000 pps. Technical targets in the development of the ISIS are listed. A layout of the ISIS that covers the major targets is presented, employing slanted CCD storage and amplified CMOS readout. The layout has two different sets of orthogonal axis systems: one mechanical and the other functional. Photodiodes, CCD registers and all the gates are designed parallel to the mechanical axis system. The squares on which pixels are placed form the functional axis system. The axis systems are inclined with respect to each other. To reproduce a moving image, at least fifty consecutive images are necessary for a ten-second replay at 5 pps. The inclined design inlays straight CCD storage registers for more than fifty images in the photo-receptive area of the sensor. The amplified CMOS readout circuits built into all the pixels eliminate line defects in reproduced images, which are inherent to CCD image sensors. FPN (Fixed Pattern Noise) introduced by the individual amplification is easily suppressed by digital post-processing of the images, which is commonly employed in scientific and engineering applications. The yield rate is significantly improved by the elimination of the line defects.

  16. Advanced Video Data-Acquisition System For Flight Research

    NASA Technical Reports Server (NTRS)

    Miller, Geoffrey; Richwine, David M.; Hass, Neal E.

    1996-01-01

    An advanced video data-acquisition system (AVDAS) was developed to satisfy a variety of requirements for in-flight video documentation, ranging from providing images for visualization of airflows around fighter airplanes at high angles of attack to obtaining safety-of-flight documentation. The F/A-18 AVDAS takes advantage of very capable systems like the NITE Hawk forward-looking infrared (FLIR) pod and of recent video developments like miniature charge-coupled-device (CCD) color video cameras and other flight-qualified video hardware.

  17. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, B.T.; Yates, G.J.

    1992-06-09

    An electronic method for eliminating artifacts in a video camera employing a charge coupled device (CCD) as an image sensor is disclosed. The method comprises the step of initializing the camera prior to normal read out and includes a first dump cycle period for transferring radiation generated charge into the horizontal register while the decaying image on the phosphor being imaged is being integrated in the photosites, and a second dump cycle period, occurring after the phosphor image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers. Image charge is then transferred from the photosites to the vertical registers and read out in conventional fashion. The inventive method allows the video camera to be used in environments having high ionizing radiation content, and to capture images of events of very short duration occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers, and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites. 3 figs.

  18. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, Bojan T.; Yates, George J.

    1992-01-01

    An electronic method for eliminating artifacts in a video camera (10) employing a charge coupled device (CCD) (12) as an image sensor. The method comprises the step of initializing the camera (10) prior to normal read out and includes a first dump cycle period (76) for transferring radiation generated charge into the horizontal register (28) while the decaying image on the phosphor (39) being imaged is being integrated in the photosites, and a second dump cycle period (78), occurring after the phosphor (39) image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers (32). Image charge is then transferred from the photosites (36) and (38) to the vertical registers (32) and read out in conventional fashion. The inventive method allows the video camera (10) to be used in environments having high ionizing radiation content, and to capture images of events of very short duration occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers (28) and (32), and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites (36) and (37).

  19. Lights, Camera, Action! Learning about Management with Student-Produced Video Assignments

    ERIC Educational Resources Information Center

    Schultz, Patrick L.; Quinn, Andrew S.

    2014-01-01

    In this article, we present a proposal for fostering learning in the management classroom through the use of student-produced video assignments. We describe the potential for video technology to create active learning environments focused on problem solving, authentic and direct experiences, and interaction and collaboration to promote student…

  20. Lights, Camera, Action: Advancing Learning, Research, and Program Evaluation through Video Production in Educational Leadership Preparation

    ERIC Educational Resources Information Center

    Friend, Jennifer; Militello, Matthew

    2015-01-01

    This article analyzes specific uses of digital video production in the field of educational leadership preparation, advancing a three-part framework that includes the use of video in (a) teaching and learning, (b) research methods, and (c) program evaluation and service to the profession. The first category within the framework examines videos…

  1. 241-AZ-101 Waste Tank Color Video Camera System Shop Acceptance Test Report

    SciTech Connect

    WERRY, S.M.

    2000-03-23

    This report includes shop acceptance test results. The test was performed prior to installation at tank AZ-101. Both the camera system and the camera purge system were originally sought and procured as part of the initial waste retrieval project, W-151.

  2. In-situ measurements of alloy oxidation/corrosion/erosion using a video camera and proximity sensor with microcomputer control

    NASA Technical Reports Server (NTRS)

    Deadmore, D. L.

    1984-01-01

    Two noncontacting and nondestructive, remotely controlled methods of measuring the progress of oxidation/corrosion/erosion of metal alloys exposed to flame test conditions are described. The external diameter of a sample under test in a flame was measured by a video camera width-measurement system. An eddy current proximity probe system, for measurements outside of the flame, was also developed and tested. The two techniques were applied to the measurement of the oxidation of 304 stainless steel at 910 C using a Mach 0.3 flame. The eddy current probe system yielded a recession rate of 0.41 mils diameter loss per hour, and the video system gave 0.27 mils per hour.

  3. Hand-gesture extraction and recognition from the video sequence acquired by a dynamic camera using condensation algorithm

    NASA Astrophysics Data System (ADS)

    Luo, Dan; Ohya, Jun

    2009-01-01

    To achieve environments in which humans and mobile robots co-exist, technologies for recognizing hand gestures from the video sequence acquired by a dynamic camera could be useful for human-to-robot interface systems. Most conventional hand gesture technologies deal only with still camera images. This paper proposes a very simple and stable method for extracting hand motion trajectories based on the Human-Following Local Coordinate System (HFLC System), which is obtained from the located human face and both hands. We then apply the Condensation Algorithm to the extracted hand trajectories so that the hand motion is recognized. We demonstrate the effectiveness of the proposed method by conducting experiments on 35 kinds of sign-language-based hand gestures.

  4. Activity profiles and hook-tool use of New Caledonian crows recorded by bird-borne video cameras.

    PubMed

    Troscianko, Jolyon; Rutz, Christian

    2015-12-01

    New Caledonian crows are renowned for their unusually sophisticated tool behaviour. Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds. To obtain first estimates of activity budgets, as well as close-up observations of tool-assisted foraging, we equipped 19 wild crows with self-developed miniature video cameras, yielding more than 10 h of analysable video footage for 10 subjects. While only four crows used tools during recording sessions, they did so extensively: across all 10 birds, we conservatively estimate that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour. Our video-loggers provided first footage of crows manufacturing, and using, one of their most complex tool types--hooked stick tools--under completely natural foraging conditions. We recorded manufacture from live branches of paperbark (Melaleuca sp.) and another tree species (thought to be Acacia spirorbis), and deployment of tools in a range of contexts, including on the forest floor. Taken together, our video recordings reveal an 'expanded' foraging niche for hooked stick tools, and highlight more generally how crows routinely switch between tool- and bill-assisted foraging. PMID:26701755

  5. Activity profiles and hook-tool use of New Caledonian crows recorded by bird-borne video cameras

    PubMed Central

    Troscianko, Jolyon; Rutz, Christian

    2015-01-01

    New Caledonian crows are renowned for their unusually sophisticated tool behaviour. Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds. To obtain first estimates of activity budgets, as well as close-up observations of tool-assisted foraging, we equipped 19 wild crows with self-developed miniature video cameras, yielding more than 10 h of analysable video footage for 10 subjects. While only four crows used tools during recording sessions, they did so extensively: across all 10 birds, we conservatively estimate that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour. Our video-loggers provided first footage of crows manufacturing, and using, one of their most complex tool types—hooked stick tools—under completely natural foraging conditions. We recorded manufacture from live branches of paperbark (Melaleuca sp.) and another tree species (thought to be Acacia spirorbis), and deployment of tools in a range of contexts, including on the forest floor. Taken together, our video recordings reveal an ‘expanded’ foraging niche for hooked stick tools, and highlight more generally how crows routinely switch between tool- and bill-assisted foraging. PMID:26701755

  6. Lights, Camera: Learning! Findings from studies of video in formal and informal science education

    NASA Astrophysics Data System (ADS)

    Borland, J.

    2013-12-01

    As part of the panel, media researcher Jennifer Borland will highlight findings from a variety of studies of videos across the spectrum of formal to informal learning, including schools, museums, and viewers' homes. In her presentation, Borland will assert that the viewing context matters a great deal, but that there are some general take-aways that can be extrapolated to the use of educational video in a variety of settings. Borland has served as an evaluator on several video-related projects funded by NASA and the National Science Foundation, including: Data Visualization videos and Space Shows developed by the American Museum of Natural History, DragonflyTV, Earth: The Operators' Manual, The Music Instinct, and Time Team America.

  7. Lights, camera, action…critique? Submit videos to AGU communications workshop

    NASA Astrophysics Data System (ADS)

    Viñas, Maria-José

    2011-08-01

    What does it take to create a science video that engages the audience and draws thousands of views on YouTube? Those interested in finding out should submit their research-related videos to AGU's Fall Meeting science film analysis workshop, led by oceanographer turned documentary director Randy Olson. Olson, writer-director of two films (Flock of Dodos: The Evolution-Intelligent Design Circus and Sizzle: A Global Warming Comedy) and author of the book Don't Be Such a Scientist: Talking Substance in an Age of Style, will provide constructive criticism on 10 selected video submissions, followed by moderated discussion with the audience. To submit your science video (5 minutes or shorter), post it on YouTube and send the link to the workshop coordinator, Maria-José Viñas (mjvinas@agu.org), with the following subject line: Video submission for Olson workshop. AGU will be accepting submissions from researchers and media officers of scientific institutions until 6:00 P.M. eastern time on Friday, 4 November. Those whose videos are selected to be screened will be notified by Friday, 18 November. All are welcome to attend the workshop at the Fall Meeting.

  8. Measurements of double stars with the 76 cm refractor of the Côte d'Azur Observatory with CCD and EMCCD cameras, and the new orbit of KUI 103. 1st part. (French title: Mesures d'étoiles doubles effectuées au grand équatorial de l'observatoire de la Côte d'Azur, avec des caméras CCD et EMCCD - 1ère partie)

    NASA Astrophysics Data System (ADS)

    Gili, R.; Agati, J. L.

    2009-10-01

    A series of 730 measurements of 661 double star systems was obtained with a fast CCD camera and an EMCCD camera mounted on the 76 cm refractor of the Observatory of the Côte d'Azur. The methods used were Lucky Imaging and Speckle Interferometry. The new orbit of KUI 103 was calculated.

  9. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE PAGES Beta

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; Yuan, Xin; Carin, Lawrence; Browning, Nigel D.

    2015-08-13

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.

  10. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    SciTech Connect

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; Yuan, Xin; Carin, Lawrence; Browning, Nigel D.

    2015-08-13

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
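
    The forward model behind coded-aperture temporal CS is easy to sketch: each of T sub-frames is multiplied by a per-pixel binary exposure mask before the sensor integrates them into a single readout frame. The sizes, mask statistics, and toy "video" below are assumptions for illustration, and the statistical inversion used for reconstruction in the paper is beyond this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, W = 8, 16, 16                  # sub-frames per camera frame, image size

# Synthetic "video": a bright dot moving one pixel per sub-frame
video = np.zeros((T, H, W))
for k in range(T):
    video[k, 8, 4 + k] = 1.0

# Per-pixel binary exposure code: each pixel is "open" during a random
# subset of the T sub-frames (the coded-aperture modulation)
mask = rng.integers(0, 2, size=(T, H, W)).astype(float)

# The sensor integrates the masked sub-frames into ONE readout frame
coded_frame = (mask * video).sum(axis=0)

# A conventional exposure integrates everything, blurring motion into a streak
plain_frame = video.sum(axis=0)

print("nonzero pixels, plain:", int((plain_frame > 0).sum()))  # 8 (full streak)
print("nonzero pixels, coded:", int((coded_frame > 0).sum()))  # code-dependent subset
```

    Because the code is known, each coded pixel carries information about *which* sub-frames contributed to it; the statistical CS inversion exploits this to recover the T sub-frames from the single readout.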

  11. Compact all-CMOS spatiotemporal compressive sensing video camera with pixel-wise coded exposure.

    PubMed

    Zhang, Jie; Xiong, Tao; Tran, Trac; Chin, Sang; Etienne-Cummings, Ralph

    2016-04-18

    We present a low power all-CMOS implementation of temporal compressive sensing with pixel-wise coded exposure. This image sensor can increase video pixel resolution and frame rate simultaneously while reducing data readout speed. Compared to previous architectures, this system modulates pixel exposure at the individual photodiode electronically, without external optical components; it thus provides a reduction in size and power compared to previous optics-based implementations. The prototype image sensor (127 × 90 pixels) can reconstruct 100 fps videos from coded images sampled at 5 fps. With a 20× reduction in readout speed, our CMOS image sensor consumes only 14 μW to provide 100 fps videos. PMID:27137331

  12. Internet Telepresence by Real-Time View-Dependent Image Generation with Omnidirectional Video Camera

    NASA Astrophysics Data System (ADS)

    Morita, Shinji; Yamazawa, Kazumasa; Yokoya, Naokazu

    2003-01-01

    This paper describes a new networked telepresence system which realizes virtual tours into a visualized dynamic real world without significant time delay. Our system is realized by the following three steps: (1) video-rate omnidirectional image acquisition, (2) transport of the omnidirectional video stream via the internet, and (3) real-time view-dependent perspective image generation from the omnidirectional video stream. Our system is applicable to real-time telepresence in situations where the real world to be seen is far from the observation site, because the time delay from a change in the user's viewing direction to the change of the displayed image is small and does not depend on the actual distance between the sites. Moreover, multiple users can look around from a single viewpoint in a visualized dynamic real world in different directions at the same time. In experiments, we have shown that the proposed system is useful for internet telepresence.
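
    Step (3), view-dependent perspective image generation, amounts to resampling the omnidirectional frame along pinhole-camera rays. The function below is a nearest-neighbour sketch that assumes an equirectangular panorama format; it illustrates the geometry, not the authors' implementation:

```python
import numpy as np

def perspective_from_equirect(pano, yaw, pitch, fov_deg, out_w, out_h):
    """Sample a pinhole-perspective view from an equirectangular panorama.
    pano: (H, W) or (H, W, 3) array covering 360 x 180 degrees."""
    H, W = pano.shape[:2]
    f = (out_w / 2) / np.tan(np.deg2rad(fov_deg) / 2)  # focal length in pixels
    xs = np.arange(out_w) - out_w / 2
    ys = np.arange(out_h) - out_h / 2
    x, y = np.meshgrid(xs, ys)
    # Ray directions in camera coordinates (z forward, x right, y down)
    d = np.stack([x, y, np.full_like(x, f)], axis=-1)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    # Rotate rays by the viewing direction (yaw about y, pitch about x)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    d = d @ (Ry @ Rx).T
    # Direction -> longitude/latitude -> panorama pixel coordinates
    lon = np.arctan2(d[..., 0], d[..., 2])        # -pi .. pi
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))    # -pi/2 .. pi/2
    u = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    v = ((lat / np.pi + 0.5) * (H - 1)).astype(int)
    return pano[v, u]

# Smoke test: a panorama whose pixel value encodes longitude
pano = np.tile(np.linspace(0, 1, 360), (180, 1))
view = perspective_from_equirect(pano, yaw=0.0, pitch=0.0,
                                 fov_deg=60, out_w=64, out_h=48)
print(view.shape)   # (48, 64)
```

    Because this resampling is purely local, each user can be served a different (yaw, pitch) view from the same omnidirectional stream, which is how the system supports multiple simultaneous viewers.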

  13. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only the raw images are available. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; for room-temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation, and a test of the method.
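
    A mean-variance (photon-transfer) sketch of the issue: in a warm sensor, dark-current shot noise inflates the frame-difference variance and biases the naive gain estimate low; subtracting paired dark-frame statistics restores it. All numbers are simulated, and this is an illustrative correction, not necessarily the exact adjustment derived in the article:

```python
import numpy as np

rng = np.random.default_rng(1)
K_TRUE = 4.0         # "true" gain in electrons per DN (assumed, simulation only)
N_SIGNAL = 20000.0   # mean photo-electrons per pixel in a flat-field frame
N_DARK = 5000.0      # mean dark-current electrons (large at room temperature)
SHAPE = (256, 256)

def frame(mean_electrons):
    """Simulated frame in digital numbers: Poisson shot noise / gain."""
    return rng.poisson(mean_electrons, SHAPE) / K_TRUE

flat1 = frame(N_SIGNAL + N_DARK)
flat2 = frame(N_SIGNAL + N_DARK)
dark1 = frame(N_DARK)
dark2 = frame(N_DARK)

# Signal means with the dark level subtracted (standard practice)...
sig_mean = (flat1.mean() - dark1.mean()) + (flat2.mean() - dark2.mean())

# ...but leaving the dark shot noise in the variance biases the gain low:
K_naive = sig_mean / np.var(flat1 - flat2)

# Subtracting the dark-frame-pair variance as well restores the estimate:
K_corr = sig_mean / (np.var(flat1 - flat2) - np.var(dark1 - dark2))

print(f"naive gain:     {K_naive:.2f} e-/DN")   # ~3.2, biased low
print(f"corrected gain: {K_corr:.2f} e-/DN")    # ~4.0
```

    The frame-pair differencing removes fixed-pattern structure, and the dark pair plays the same role for the dark-current term; with dark current at 20% of the signal, the naive estimate here is off by the same 20%.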

  14. Visual surveys can reveal rather different 'pictures' of fish densities: Comparison of trawl and video camera surveys in the Rockall Bank, NE Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    McIntyre, F. D.; Neat, F.; Collie, N.; Stewart, M.; Fernandes, P. G.

    2015-01-01

    Visual surveys allow non-invasive sampling of organisms in the marine environment, which is of particular importance in deep-sea habitats that are vulnerable to damage caused by destructive sampling devices such as bottom trawls. To enable visual surveying at depths greater than 200 m, we used a deep towed video camera system to survey large areas around the Rockall Bank in the North East Atlantic. The area of seabed sampled was similar to that sampled by a bottom trawl, enabling samples from the towed video camera system to be compared with trawl sampling to quantitatively assess the numerical density of deep-water fish populations. The two survey methods provided different results for certain fish taxa and comparable results for others. Fish that exhibited a detectable avoidance behaviour to the towed video camera system, such as the Chimaeridae, resulted in mean density estimates that were significantly lower (121 fish/km2) than those determined by trawl sampling (839 fish/km2). On the other hand, skates and rays showed no reaction to the lights in the towed body of the camera system, and mean density estimates of these were an order of magnitude higher (64 fish/km2) than those from the trawl (5 fish/km2). This is probably because these fish can pass under the footrope of the trawl due to their flat body shape lying close to the seabed, but are easily detected by the benign towed video camera system. For other species, such as Molva sp., estimates of mean density were comparable between the two survey methods (towed camera, 62 fish/km2; trawl, 73 fish/km2). The towed video camera system presented here can be used as an alternative benign method for providing indices of abundance for species such as ling in areas closed to trawling, or for those fish that are poorly monitored by trawl surveying in any area, such as the skates and rays.

  15. Lights! Camera! Action! Producing Library Instruction Video Tutorials Using Camtasia Studio

    ERIC Educational Resources Information Center

    Charnigo, Laurie

    2009-01-01

    From Web guides to online tutorials, academic librarians are increasingly experimenting with many different technologies in order to meet the needs of today's growing distance education populations. In this article, the author discusses one librarian's experience using Camtasia Studio to create subject specific video tutorials. Benefits, as well…

  16. Use of a UAV-mounted video camera to assess feeding behavior of Raramuri Criollo cows

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Interest in use of unmanned aerial vehicles in science has increased in recent years. It is predicted that they will be a preferred remote sensing platform for applications that inform sustainable rangeland management in the future. The objective of this study was to determine whether UAV video moni...

  17. "Lights, Camera, Reflection": Using Peer Video to Promote Reflective Dialogue among Student Teachers

    ERIC Educational Resources Information Center

    Harford, Judith; MacRuairc, Gerry; McCartan, Dermot

    2010-01-01

    This paper examines the use of peer-videoing in the classroom as a means of promoting reflection among student teachers. Ten pre-service teachers participating in a teacher education programme in a university in the Republic of Ireland and ten pre-service teachers participating in a teacher education programme in a university in the North of…

  18. Making the Most of Your Video Camera. Technology in Language Learning Series.

    ERIC Educational Resources Information Center

    Lonergan, Jack

    A practical guide for language teachers illustrates the different ways in which cameras can be employed in language work, with suggestions and advice taken from current experience. Teachers can be involved by making their own language training videotapes and focusing on an area of language, literature, or thematic interest directly applicable to…

  19. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second-generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  20. Video-based realtime IMU-camera calibration for robot navigation

    NASA Astrophysics Data System (ADS)

    Petersen, Arne; Koch, Reinhard

    2012-06-01

    This paper introduces a new method for fast calibration of inertial measurement units (IMUs) rigidly coupled to cameras. That is, the relative rotation and translation between the IMU and the camera are estimated, allowing for the transfer of IMU data to the camera's coordinate frame. Moreover, the IMU's nuisance parameters (biases and scales) and the horizontal alignment of the initial camera frame are determined. Since an iterated Kalman filter is used for estimation, information on the estimation's precision is also available. Such calibrations are crucial for IMU-aided visual robot navigation, i.e. SLAM, since wrong calibrations cause biases and drifts in the estimated position and orientation. As the estimation is performed in real time, the calibration can be done using a freehand movement and the estimated parameters can be validated just in time. This provides the opportunity to optimize the trajectory used online, increasing the quality and minimizing the time effort of calibration. Except for a marker pattern, used for visual tracking, no additional hardware is required. As will be shown, the system is capable of estimating the calibration within a short period of time. Depending on the requested precision, trajectories of 30 seconds to a few minutes are sufficient. This allows for calibrating the system at startup. By this, deviations in the calibration due to transport and storage can be compensated. The estimation quality and consistency are evaluated in dependency of the traveled trajectories and the amount of IMU-camera displacement and rotation misalignment. It is analyzed how different types of visual markers, i.e. 2- and 3-dimensional patterns, affect the estimation. Moreover, the method is applied to mono and stereo vision systems, providing information on the applicability to robot systems. The algorithm is implemented using a modular software framework, such that it can be adapted to altered conditions easily.
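    The core of such an estimator is the Kalman measurement update; a plain linear sketch (the paper's filter is an iterated, linearized variant, and all names here are illustrative):

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """One linear Kalman measurement update: fuse measurement z (with noise
    covariance R and measurement matrix H) into state x with covariance P."""
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ y                     # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P  # corrected covariance
    return x_new, P_new
```

    An iterated Kalman filter re-linearizes the measurement model about the updated state and repeats this step until convergence, which helps with large initial calibration errors.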

  1. Video imaging system and thermal mapping of the molten hearth in an electron beam melting furnace

    SciTech Connect

    Miszkiel, M.E.; Davis, R.A.; Van Den Avyle, J.A.

    1995-12-31

    This project was initiated to develop an enhanced video imaging system for the Liquid Metal Processing Laboratory Electron Beam (EB) Melting Furnace at Sandia and to use color video images to map the temperature distribution of the surface of the molten hearth. In a series of test melts, the color output of the video image was calibrated against temperatures measured by an optical pyrometer and CCD camera viewing port above the molten pool. To prevent potential metal vapor deposition onto line-of-sight optical surfaces above the pool, argon backfill was used along with a pinhole aperture to obtain the video image. The geometry of the optical port to the hearth set the limits for the field of view of the focus lens and CCD camera. Initial melts were completed with the pyrometer and pinhole aperture port in a fixed position. Using commercially available vacuum components, a second flange assembly was constructed to provide flexibility in choosing pyrometer target sights on the hearth and to adjust the field of view for the focus lens/CCD combination. RGB video images processed from the melts verified that red-wavelength light captured with the video camera could be calibrated against the optical pyrometer target temperatures and used to generate temperature maps of the hearth surface. Two-color ratio thermal mapping using red and green video images, which has theoretical advantages, was less successful due to probable camera non-linearities in the red and green image intensities.
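    The red-channel calibration described here reduces to fitting a curve through paired (red intensity, pyrometer temperature) readings and applying it per pixel; a minimal sketch, with hypothetical names and data:

```python
import numpy as np

def fit_red_to_temp(red_values, pyro_temps_K, deg=2):
    """Fit a polynomial mapping red-channel intensity to pyrometer
    temperature (K); returns a callable usable on whole images."""
    return np.poly1d(np.polyfit(red_values, pyro_temps_K, deg))

def thermal_map(red_channel, calib):
    """Apply the calibration to an entire red-channel image, yielding a
    per-pixel temperature map of the hearth surface."""
    return calib(red_channel.astype(float))
```

    A usage example: fit on a handful of pyrometer spot readings, then evaluate the polynomial over the full red channel of each video frame.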

  2. Real-time multi-camera video acquisition and processing platform for ADAS

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio

    2016-04-01

    The paper presents the design of a real-time and low-cost embedded system for image acquisition and processing in Advanced Driver Assistance Systems (ADAS). The system adopts a multi-camera architecture to provide a panoramic view of the objects surrounding the vehicle. Fish-eye lenses are used to achieve a large Field of View (FOV). Since they introduce radial distortion of the images projected on the sensors, a real-time algorithm for their correction is also implemented in a pre-processor. An FPGA-based hardware implementation, re-using IP macrocells for several ADAS algorithms, allows for real-time processing of input streams from VGA automotive CMOS cameras.
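    One common form of the radial correction mentioned above is a polynomial distortion model inverted by fixed-point iteration; the model and coefficient names below are assumptions for illustration, not the paper's exact algorithm:

```python
import numpy as np

def undistort_points(pts, center, k1, k2):
    """Invert the radial model x_d = x_u * (1 + k1*r^2 + k2*r^4) by
    fixed-point iteration; pts is an (N, 2) array of distorted points."""
    d = pts - center
    u = d.copy()              # initial guess: undistorted = distorted
    for _ in range(10):       # a few iterations suffice for mild distortion
        r2 = np.sum(u * u, axis=-1, keepdims=True)
        u = d / (1.0 + k1 * r2 + k2 * r2 * r2)
    return center + u
```

    An FPGA pre-processor would typically bake the same mapping into a lookup table rather than iterate per frame.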

  3. A risk-based coverage model for video surveillance camera control optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua

    2015-12-01

    Visual surveillance systems for law enforcement or police case investigation differ from traditional applications, as they are designed to monitor pedestrians, vehicles, or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about the targets and events monitored, and risk entropy is introduced to model the requirements a police surveillance task places on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.

  4. Measurement and processing of signatures in the visible range using a calibrated video camera and the CAMDET software package

    NASA Astrophysics Data System (ADS)

    Sheffer, Dan

    1997-06-01

    A procedure for calibration of a color video camera has been developed at EORD. The RGB values of standard samples, together with the spectral radiance values of the samples, are used to calculate a transformation matrix between the RGB and CIEXYZ color spaces. The transformation matrix is then used to calculate the XYZ color coordinates of distant objects imaged in the field. These, in turn, are used to calculate the CIELAB color coordinates of the objects. Good agreement between the calculated coordinates and those obtained from spectroradiometric data is achieved. Processing of the RGB values of pixels in the digital image of a scene using the CAMDET software package, which was developed at EORD, results in 'Painting Maps' in which the true apparent CIELAB color coordinates are used. The paper discusses the calibration procedure, its advantages and shortcomings, and suggests a definition for the visible signature of objects. The CAMDET software package is described and some examples are given.
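    The two computational steps described (a least-squares RGB-to-XYZ matrix from standard samples, then XYZ to CIELAB) can be sketched as follows; the function names are illustrative and the white point is an assumption:

```python
import numpy as np

def fit_rgb_to_xyz(rgb_samples, xyz_samples):
    """Least-squares fit of a 3x3 matrix M such that XYZ ~= M @ RGB,
    from (N, 3) arrays of corresponding standard-sample measurements."""
    # Solve rgb @ M.T = xyz in the least-squares sense.
    M_T, *_ = np.linalg.lstsq(rgb_samples, xyz_samples, rcond=None)
    return M_T.T

def xyz_to_lab(xyz, white):
    """Convert CIEXYZ to CIELAB relative to a reference white."""
    def f(t):
        return np.where(t > (6 / 29) ** 3,
                        np.cbrt(t),
                        t / (3 * (6 / 29) ** 2) + 4 / 29)
    fx = f(xyz[..., 0] / white[0])
    fy = f(xyz[..., 1] / white[1])
    fz = f(xyz[..., 2] / white[2])
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b = 200 * (fy - fz)
    return np.stack([L, a, b], axis=-1)
```

    With the matrix fitted once on the standard samples, field pixels are converted RGB → XYZ → CIELAB to produce the apparent-color maps.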

  5. Adaptive compressive sensing camera

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simple concept that each pixel is a charge bucket whose charges come from the Einstein photoelectric conversion effect. Applying the manufacturing design principle, we allow altering each working component by at most one step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data storage savings are immense, and the order of magnitude of the saving is inversely proportional to target angular speed. We designed two new CCD camera components. Owing to mature CMOS (Complementary Metal-Oxide-Semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as a dual Photon Detector (PD) analog circuit for change detection that decides whether to skip or admit a frame at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [Φ] is implemented at the bucket-pixel level: the charge transport bias voltage steers charge toward neighboring buckets or, otherwise, to the ground drainage. Since the snapshot image is not a video, we cannot apply the usual MPEG video compression and Huffman entropy codec, or the powerful WaveNet wrapper, at the sensor level. We compare (i) pre-processing with an FFT, a threshold on significant Fourier mode components, and an inverse FFT to check the PSNR; and (ii) post-processing image recovery, done selectively by the CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii), the SAH circuitry must determine, in selecting new frames, the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data via [Φ]M,N: M(t) = K(t) log N(t).
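    The per-pixel random routing of charge described above amounts to multiplying the flattened image by a sparse random 0/1 measurement matrix; a minimal software analogue with illustrative names, not the authors' circuit:

```python
import numpy as np

def sparse_measurement_matrix(m, n, density=0.1, seed=None):
    """Random sparse 0/1 matrix Phi: entry (i, j) = 1 means pixel j's
    charge contributes to measurement bucket i; 0 means it is drained."""
    rng = np.random.default_rng(seed)
    return (rng.random((m, n)) < density).astype(float)

def measure(phi, image):
    """Compressed measurements y = Phi @ x of a flattened image x."""
    return phi @ image.ravel()
```

    Recovery from y would then use an L1-minimization solver, as in step (ii) of the abstract.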

  6. Study of recognizing multiple persons' complicated hand gestures from the video sequence acquired by a moving camera

    NASA Astrophysics Data System (ADS)

    Dan, Luo; Ohya, Jun

    2010-02-01

    Recognizing hand gestures from the video sequence acquired by a dynamic camera could be a useful interface between humans and mobile robots. We develop a state-based approach to extract and recognize hand gestures from moving camera images. We improved the Human-Following Local Coordinate (HFLC) System, a very simple and stable method for extracting hand motion trajectories, which is obtained from the located human face, body part and hand blob changing factor. A Condensation algorithm and a PCA-based algorithm were applied to recognize the extracted hand trajectories. In previous research, the Condensation-algorithm-based method was applied only to one person's hand gestures. In this paper, we propose a principal component analysis (PCA) based approach to improve the recognition accuracy. For further improvement, temporal changes in the observed hand area changing factor are utilized as new image features to be stored in the database after being analyzed by PCA. Every hand gesture trajectory in the database is classified into one-hand gesture categories, two-hand gesture categories, or temporal changes in hand blob. We demonstrate the effectiveness of the proposed method by conducting experiments on 45 kinds of Japanese and American Sign Language gestures obtained from 5 people. Our experimental recognition results show that better performance is obtained by the PCA-based approach than by the Condensation-algorithm-based method.

  7. Design and analysis of filter-based optical systems for spectral responsivity estimation of digital video cameras

    NASA Astrophysics Data System (ADS)

    Chang, Gao-Wei; Jian, Hong-Da; Yeh, Zong-Mu; Cheng, Chin-Pao

    2004-02-01

    In this paper, a filter-based optical system with sophisticated filter selection is designed for estimating the spectral responsivities of digital video cameras. The choice of filters in the presence of noise is central to the optical system's design, since the spectral filters primarily prescribe the structure of the perturbed system. A theoretical basis is presented to confirm that sophisticated filter selection can make this system as insensitive to noise as possible. Also, we propose a filter selection method based on the orthogonal-triangular (QR) decomposition with column pivoting (QRCP). To investigate the noise effects, we assess the estimation errors between the actual and estimated spectral responsivities at different signal-to-noise ratio (SNR) levels of an eight-bit/channel camera. Simulation results indicate that the proposed method yields satisfactory estimation accuracy. That is, the filter-based optical system with the spectral filters selected by the QRCP-based method is much less sensitive to noise than those with filters from other selections.
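    The QRCP column-selection idea can be sketched with a greedy pivoted Gram-Schmidt pass, which reproduces the column ordering that QR with column pivoting yields; the matrix layout (wavelengths × candidate filters) and names are assumptions:

```python
import numpy as np

def qrcp_select(A, k):
    """Pick k column indices of A by greedy column-pivoted QR: repeatedly
    take the column with the largest residual norm, then remove its
    component from all remaining columns.  Leading columns are as linearly
    independent as possible, keeping the estimation well conditioned."""
    R = A.astype(float).copy()
    chosen = []
    for _ in range(k):
        norms = np.linalg.norm(R, axis=0)
        norms[chosen] = -1.0            # never re-pick a chosen column
        j = int(np.argmax(norms))
        chosen.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)         # deflate along the chosen direction
    return chosen
```

    Columns of A would hold the candidate filters' spectral transmittances sampled over wavelength; near-duplicate filters are automatically skipped because their residual vanishes after deflation.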

  8. Technologies to develop a video camera with a frame rate higher than 100 Mfps

    NASA Astrophysics Data System (ADS)

    Vo Le, Cuong; Nguyen, H. D.; Dao, V. T. S.; Takehara, K.; Etoh, T. G.; Akino, T.; Nishi, K.; Kitamura, K.; Arai, T.; Maruyama, H.

    2008-11-01

    A feasibility study is presented for an image sensor capable of image capturing at 100 mega-frames per second (Mfps). The basic structure of the sensor is the backside-illuminated ISIS, the in-situ storage image sensor, with slanted linear CCD memories, which has already achieved 1 Mfps with very high sensitivity. There are many potential technical barriers to further increasing the frame rate up to 100 Mfps, such as the traveling time of electrons within a pixel, Resistive-Capacitive (RC) delay in driving voltage transfer, heat generation, heavy electromagnetic noise, etc. For each of the barriers, a countermeasure is newly proposed and the technical and practical possibility is examined, mainly by simulations. The new technical proposals include a special wafer with n and p double epitaxial layers with smoothly changing doping profiles, a design method with curves, the thunderbolt bus lines, and digital noiseless image capturing by the ISIS with solely sinusoidal driving voltages. It is confirmed that the integration of these technologies is very promising for realizing a practical image sensor with the ultra-high frame rate.

  9. Introducing Contactless Blood Pressure Assessment Using a High Speed Video Camera.

    PubMed

    Jeong, In Cheol; Finkelstein, Joseph

    2016-04-01

    Recent studies demonstrated that blood pressure (BP) can be estimated using pulse transit time (PTT). For PTT calculation, a photoplethysmogram (PPG) is usually used to detect a time lag in pulse wave propagation which is correlated with BP. Until now, PTT and PPG were registered using a set of body-worn sensors. In this study a new methodology is introduced allowing contactless registration of PTT and PPG using a high speed camera, resulting in corresponding image-based PTT (iPTT) and image-based PPG (iPPG) generation. The iPTT value can potentially be utilized for blood pressure estimation; however, the extent of correlation between iPTT and BP is unknown. The goal of this preliminary feasibility study was to introduce the methodology for contactless generation of iPPG and iPTT and to make an initial estimation of the extent of correlation between iPTT and BP "in vivo." A short cycling exercise was used to generate BP changes in healthy adult volunteers in three consecutive visits. BP was measured by a verified BP monitor simultaneously with iPTT registration at three exercise points: rest, exercise peak, and recovery. iPPG was simultaneously registered at two body locations during the exercise using a high speed camera at 420 frames per second. iPTT was calculated as the time lag between pulse waves obtained as two iPPGs registered from simultaneous recording of head and palm areas. The average inter-person correlation between PTT and iPTT was 0.85 ± 0.08. The range of inter-person correlations between PTT and iPTT was from 0.70 to 0.95 (p < 0.05). The average inter-person coefficient of correlation between SBP and iPTT was -0.80 ± 0.12. The range of correlations between systolic BP and iPTT was from 0.632 to 0.960 with p < 0.05 for most of the participants. Preliminary data indicated that a high speed camera can potentially be utilized for unobtrusive contactless monitoring of abrupt blood pressure changes in a variety of settings. The initial prototype system was able to
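    The lag computation described (time shift between two iPPG waveforms sampled at 420 fps) is, in essence, a cross-correlation peak search; a minimal sketch with illustrative names, since the study's exact signal processing is not given here:

```python
import numpy as np

FPS = 420  # high-speed camera frame rate used in the study

def iptt_from_ippg(head_ppg, palm_ppg, fps=FPS, max_lag_s=0.5):
    """Estimate iPTT in seconds as the lag maximizing the cross-correlation
    of two iPPG signals; positive means the palm signal lags the head."""
    h = head_ppg - np.mean(head_ppg)   # remove DC level
    p = palm_ppg - np.mean(palm_ppg)
    xcorr = np.correlate(p, h, mode="full")
    lags = np.arange(-len(h) + 1, len(p))
    max_lag = int(max_lag_s * fps)     # physiologically plausible window
    mask = np.abs(lags) <= max_lag
    best = lags[mask][np.argmax(xcorr[mask])]
    return best / fps
```

    Restricting the search window keeps the periodic pulse waveform from locking onto a peak one heartbeat away.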

  10. A simple, inexpensive video camera setup for the study of avian nest activity

    USGS Publications Warehouse

    Sabine, J.B.; Meyers, J.M.; Schweitzer, Sara H.

    2005-01-01

    Time-lapse video photography has become a valuable tool for collecting data on avian nest activity and depredation; however, commercially available systems are expensive (>USA $4000/unit). We designed an inexpensive system to identify causes of nest failure of American Oystercatchers (Haematopus palliatus) and assessed its utility at Cumberland Island National Seashore, Georgia. We successfully identified raccoon (Procyon lotor), bobcat (Lynx rufus), American Crow (Corvus brachyrhynchos), and ghost crab (Ocypode quadrata) predation on oystercatcher nests. Other detected causes of nest failure included tidal overwash, horse trampling, abandonment, and human destruction. System failure rates were comparable with commercially available units. Our system's efficacy and low cost (<$800) provided useful data for the management and conservation of the American Oystercatcher.

  11. Development of CCD controller for scientific application

    NASA Astrophysics Data System (ADS)

    Khan, M. S.; Pathan, F. M.; Shah, U. V., Prof; Makwana, D. H., Prof; Anandarao, B. G., Prof

    2010-02-01

    Photoelectric equipment has wide applications, such as spectroscopy, temperature measurement in the infrared region, astronomical research, etc. A photoelectric transducer converts radiant energy into electrical energy. Two types of photoelectric transducers, namely the photo-multiplier tube (PMT) and the charge-coupled device (CCD), are used to convert radiant energy into an electrical signal. Nearly all modern instruments now use CCD technology. We have designed and developed a CCD camera controller using the Marconi camera chip CCD47-10, which has 1K × 1K pixels, specifically for space applications.

  12. Jellyfish Support High Energy Intake of Leatherback Sea Turtles (Dermochelys coriacea): Video Evidence from Animal-Borne Cameras

    PubMed Central

    Heaslip, Susan G.; Iverson, Sara J.; Bowen, W. Don; James, Michael C.

    2012-01-01

    The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08–3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83–100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ•d−1 but were as high as 167,797 kJ•d−1 corresponding to turtles consuming an average of 330 kg wet mass•d−1 (up to 840 kg•d−1) or approximately 261 (up to 664) jellyfish•d-1. Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass•d−1 equating to an average energy intake of 3–7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to

  13. Jellyfish support high energy intake of leatherback sea turtles (Dermochelys coriacea): video evidence from animal-borne cameras.

    PubMed

    Heaslip, Susan G; Iverson, Sara J; Bowen, W Don; James, Michael C

    2012-01-01

    The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08-3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83-100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ • d(-1) but were as high as 167,797 kJ • d(-1) corresponding to turtles consuming an average of 330 kg wet mass • d(-1) (up to 840 kg • d(-1)) or approximately 261 (up to 664) jellyfish • d(-1). Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass • d(-1) equating to an average energy intake of 3-7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to

  14. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  15. General purpose solid state camera for SERTS

    NASA Astrophysics Data System (ADS)

    Payne, Leslie J.; Haas, J. Patrick

    1996-11-01

    The Laboratory for Astronomy and Solar Physics at Goddard Space Flight Center uses a variety of CCDs and other solid state imaging sensors for its instrumentation programs. Traditionally, custom camera systems are built around the imaging device to optimize the circuitry for the particular sensor. This usually produces a camera that is small, uses little power and is elegant. Although these are desirable characteristics, this approach is also expensive and time consuming. An alternative approach is to design a 'universal' camera that is easily customized to meet specific mission requirements. This is the approach our team used for SERTS. The camera design used to support the SERTS mission is a general purpose camera design that is derived from an existing camera on the SOHO spacecraft. This camera is designed to be rugged, modest in power requirements and flexible. The base design of the camera supports quadrant CCD devices with up to 4 phases. Imaging devices with simpler architectures are in general supportable. The basic camera is comprised of a main electronics box which performs all timing generation, voltage level control, data processing and compression. A second unit, placed close to the detector head, is responsible for driving the imaging device's control electrodes and amplifying the multichannel detector video. Programmable high voltage units are used for the single stage MCP type intensifier. The detector head is customized for each sensor type supported. Auxiliary equipment includes a frame buffer that works either as a multi-frame storage unit or as a photon counting accumulation unit. This unit also performs interface buffering so that the camera may appear as a piece of GPIB instrumentation.

  16. Bird-Borne Video-Cameras Show That Seabird Movement Patterns Relate to Previously Unrevealed Proximate Environment, Not Prey

    PubMed Central

    Tremblay, Yann; Thiebault, Andréa; Mullers, Ralf; Pistorius, Pierre

    2014-01-01

    The study of ecological and behavioral processes has been revolutionized in the last two decades with the rapid development of biologging-science. Recently, using image-capturing devices, some pilot studies demonstrated the potential of understanding marine vertebrate movement patterns in relation to their proximate, as opposed to remotely sensed, environmental contexts. Here, using miniaturized video cameras and GPS tracking recorders simultaneously, we show for the first time that information on the immediate visual surroundings of a foraging seabird, the Cape gannet, is fundamental in understanding the origins of its movement patterns. We found that movement patterns were related to specific stimuli which were mostly other predators such as gannets, dolphins or fishing boats. Contrary to a widely accepted idea, our data suggest that foraging seabirds are not directly looking for prey. Instead, they search for indicators of the presence of prey, the latter being targeted at the very last moment and at a very small scale. We demonstrate that movement patterns of foraging seabirds can be heavily driven by processes unobservable with conventional methodology. Except perhaps for large scale processes, local-enhancement seems to be the only ruling mechanism; this has profound implications for ecosystem-based management of marine areas. PMID:24523892

  17. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
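    The bivariate model described (two predictors plus an intercept, scored by the coefficient of determination R²) is ordinary least squares; a minimal sketch with illustrative variable names:

```python
import numpy as np

def fit_bivariate(x1, x2, y):
    """OLS fit of y ~ b0 + b1*x1 + b2*x2; returns the coefficients and
    the coefficient of determination R^2."""
    X = np.column_stack([np.ones_like(x1), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    pred = X @ beta
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return beta, 1.0 - ss_res / ss_tot
```

    Here x1 and x2 would stand for the band-ratio and true-color predictors, and y for the upwelling spectral energy.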

  18. Linear CCD attitude measurement system based on the identification of the auxiliary array CCD

    NASA Astrophysics Data System (ADS)

    Hu, Yinghui; Yuan, Feng; Li, Kai; Wang, Yan

    2015-10-01

    To address the high-precision attitude measurement of flying targets over a large space and a large field of view, and after comparing existing measurement methods, the idea is proposed of using two array CCDs to assist identification in a three-linear-CCD, multi-cooperative-target attitude measurement system. This addresses the nonlinear system errors and excessive calibration parameters of existing approaches, as well as the overly complicated constraints among camera positions in a nine-linear-CCD spectroscopic test system. Mathematical models of the binocular vision system and the three-linear-CCD test system are established. Three red LED light points form a cooperative triangular target whose coordinates are given in advance by a coordinate measuring machine. Three blue LED light points are added along the sides of the triangle as an aid, so that the array CCDs can more easily identify the three red LED points, and each linear CCD camera is fitted with a red filter to block the blue LED points while also reducing stray light. The array CCDs measure the spots, identifying the red LED points and calculating their spatial coordinates, while the linear CCDs measure the three red spots to solve the linear CCD test system, from which 27 solutions can be drawn. Using array-CCD coordinates to assist the linear CCDs achieves spot identification and solves the difficult problem of multi-target identification with linear CCDs. Exploiting the imaging characteristics of linear CCDs, a special cylindrical lens system was developed using a telecentric optical design: the position of each spot's energy center converges within the depth range, with only small changes in the direction perpendicular to the optical axis, ensuring high-precision image quality. The entire test system improves the speed and precision of spatial object attitude measurement.

  19. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Technical Reports Server (NTRS)

    Bolas, M. T.; Fisher, S. S.

    1990-01-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer-controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported, with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.

  20. Video photographic considerations for measuring the proximity of a probe aircraft with a smoke seeded trailing vortex

    NASA Technical Reports Server (NTRS)

    Childers, Brooks A.; Snow, Walter L.

    1990-01-01

    Considerations for acquiring and analyzing 30 Hz video frames from charge coupled device (CCD) cameras mounted in the wing tips of a Beech T-34 aircraft are described. Particular attention is given to the characterization and correction of optical distortions inherent in the data.

  1. Evaluating the Effects of Camera Perspective in Video Modeling for Children with Autism: Point of View versus Scene Modeling

    ERIC Educational Resources Information Center

    Cotter, Courtney

    2010-01-01

    Video modeling has been used effectively to teach a variety of skills to children with autism. This body of literature is characterized by a variety of procedural variations including the characteristics of the video model (e.g., self vs. other, adult vs. peer). Traditionally, most video models have been filmed using third person perspective…

  2. Improvement in the light sensitivity of the ultrahigh-speed high-sensitivity CCD with a microlens array

    NASA Astrophysics Data System (ADS)

    Hayashida, T.; Yonai, J.; Kitamura, K.; Arai, T.; Kurita, T.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Kitagawa, S.; Hatade, K.; Yamaguchi, T.; Takeuchi, H.; Iida, K.

    2008-02-01

    We are advancing the development of ultrahigh-speed, high-sensitivity CCDs for broadcast use that are capable of capturing smooth slow-motion videos in vivid colors even where lighting is limited, such as at professional baseball games played at night. We have already developed a 300,000 pixel, ultrahigh-speed CCD, and a single CCD color camera that has been used for sports broadcasts and science programs using this CCD. However, there are cases where even higher sensitivity is required, such as when using a telephoto lens during a baseball broadcast or a high-magnification microscope during science programs. This paper provides a summary of our experimental development aimed at further increasing the sensitivity of CCDs using the light-collecting effects of a microlens array.

  3. Automatic CCD Imaging Systems for Time-series CCD Photometry

    NASA Astrophysics Data System (ADS)

    Caton, D. B.; Pollock, J. T.; Davis, S. A.

    2004-12-01

    CCDs allow precision photometry to be done with small telescopes and at sites with less than ideal seeing conditions. The addition of an automatic observing mode makes it easy to do time-series CCD photometry of variable stars and AGN/QSOs. At Appalachian State University's Dark Sky Observatory (DSO), we have implemented automatic imaging systems for image acquisition, scripted filter changing, data storage, and quick-look online photometry on two different telescopes, the 32-inch and the 18-inch. The camera at the 18-inch allows a simple system in which the data acquisition PC controls a DFM Engineering filter wheel and a Photometrics/Roper camera. The 32-inch system is more complex, with three computers communicating in order to make good use of the camera's 30-second CCD-read time for filter changes. Both telescopes use macros written in the PMIS software (GKR Computer Consulting). Both systems allow automatic data capture with only minimal attention from the observer; indeed, one observer can easily run both telescopes simultaneously. The efficiency and reliability of these systems also reduce observer errors. The only unresolved problem is an occasional but rare camera-read error (the PC is apparently interrupted). We also sometimes experience a crash of the PMIS software, probably because its 16-bit code now runs in the 32-bit Windows 2000 environment. We gratefully acknowledge the support of the National Science Foundation through grant numbers AST-0089248 and AST-9119750, the Dunham Fund for Astrophysical Research, and the ASU Research Council.

  4. Design of video interface conversion system based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhao, Heng; Wang, Xiang-jun

    2014-11-01

    This paper presents an FPGA-based video interface conversion system that enables inter-conversion between digital and analog video. A Cyclone IV series EP4CE22F17C chip from Altera Corporation is used as the main video processing chip, and a single-chip microcontroller serves as the control unit for information exchange between the FPGA and a PC, allowing the system to encode/decode messages from the PC. Technologies including the video decoding/encoding circuits, the bus communication protocol, data stream de-interleaving and de-interlacing, color space conversion, and the Camera Link timing generator module of the FPGA are introduced. The system converts the Composite Video Broadcast Signal (CVBS) from a CCD camera into Low Voltage Differential Signaling (LVDS), which is then collected by a video processing unit with a Camera Link interface. The processed video signals are passed to the system output board and displayed on a monitor. Experiments show that the system achieves high-quality video conversion with a minimal board size.
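The color space conversion mentioned above is typically an RGB-to-YCbCr matrix multiply. A minimal per-pixel sketch using the standard full-range BT.601 coefficients (the abstract does not state which standard the system implements, so this is an assumption):

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB -> YCbCr for one 8-bit pixel."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return round(y), round(cb), round(cr)

print(rgb_to_ycbcr(255, 255, 255))  # white -> (255, 128, 128)
print(rgb_to_ycbcr(0, 0, 0))        # black -> (0, 128, 128)
```

In an FPGA such as the one described, the same arithmetic would be done in fixed point, but the coefficients are identical.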

  5. Cryostat and CCD for MEGARA at GTC

    NASA Astrophysics Data System (ADS)

    Castillo-Domínguez, E.; Ferrusca, D.; Tulloch, S.; Velázquez, M.; Carrasco, E.; Gallego, J.; Gil de Paz, A.; Sánchez, F. M.; Vílchez Medina, J. M.

    2012-09-01

    MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is the new integral field unit (IFU) and multi-object spectrograph (MOS) instrument for the GTC. The spectrograph subsystems include the pseudo-slit, the shutter, the collimator with a focusing mechanism, pupil elements on a volume phase holographic grating (VPH) wheel and the camera joined to the cryostat through the last lens, with a CCD detector inside. In this paper we describe the full preliminary design of the cryostat which will harbor the CCD detector for the spectrograph. The selected cryogenic device is an LN2 open-cycle cryostat which has been designed by the "Astronomical Instrumentation Lab for Millimeter Wavelengths" at INAOE. A complete description of the cryostat main body and CCD head is presented as well as all the vacuum and temperature sub-systems to operate it. The CCD is surrounded by a radiation shield to improve its performance and is placed in a custom made mechanical mounting which will allow physical adjustments for alignment with the spectrograph camera. The 4k x 4k pixel CCD231 is our selection for the cryogenically cooled detector of MEGARA. The characteristics of this CCD, the internal cryostat cabling and CCD controller hardware are discussed. Finally, static structural finite element modeling and thermal analysis results are shown to validate the cryostat model.

  6. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

    The paper describes a new generation of high-performance, remotely controlled CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested, and commissioned. The CCD chip was positioned differently than in the previous design, resulting in better performance of the astronomical video data acquisition system. The camera was built around a low-noise, 4-Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized, reconfigurable, and modular in comparison with the first-generation solution, thanks to the use of open software and an Altera Cyclone EP1C6 FPGA, into which new algorithms were implemented. The camera system uses the following advanced electronic components: a Cypress CY7C68013a microcontroller (8051 core), an Analog Devices AD9826 image processor, a Realtek RTL8169s Gigabit Ethernet interface, Atmel AT45DB642 memory, and an ARM926EJ-S-core AT91SAM9260 CPU by ARM and Atmel. Software for the camera, its remote control, and image data acquisition is based entirely on open-source platforms, using the ISI and V4L2 image interfaces, the AMBA AHB bus, and the INDI protocol. The camera will be replicated in 20 units and is designed for continuous on-line, wide-angle observations of the sky in the Pi-of-the-Sky research program.

  7. Are traditional methods of determining nest predators and nest fates reliable? An experiment with Wood Thrushes (Hylocichla mustelina) using miniature video cameras

    USGS Publications Warehouse

    Williams, Gary E.; Wood, P.B.

    2002-01-01

    We used miniature infrared video cameras to monitor Wood Thrush (Hylocichla mustelina) nests during 1998-2000. We documented nest predators and examined whether evidence at nests can be used to predict predator identities and nest fates. Fifty-six nests were monitored; 26 failed, with 3 abandoned and 23 depredated. We predicted predator class (avian, mammalian, snake) prior to review of video footage and were incorrect 57% of the time. Birds and mammals were underrepresented whereas snakes were overrepresented in our predictions. We documented ≥9 nest-predator species, with the southern flying squirrel (Glaucomys volans) taking the most nests (n = 8). During 2000, we predicted fate (fledge or fail) of 27 nests; 23 were classified correctly. Traditional methods of monitoring nests appear to be effective for classifying success or failure of nests, but ineffective at classifying nest predators.

  8. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses an approximately 0.5 m diameter focal plane of 62 2k x 4k CCDs. The camera vessel, including the optical window cell, focal plate, focal plate mounts, cooling system, and thermal controls, is described. As part of the development of the mechanical and cooling design, a full-scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  9. Optical signal processing of video surveillance for recognizing and measuring the location of railway infrastructure elements

    NASA Astrophysics Data System (ADS)

    Diyazitdinov, Rinat R.; Vasin, Nikolay N.

    2016-03-01

    Processing the optical signals received from the CCD sensors of video cameras makes it possible to extend the functionality of video surveillance systems. Traditional video surveillance systems save, transmit, and preprocess video content from the monitored objects. Processing the video signal with analytics systems yields additional information about an object's location and movement and about the flow of technological processes, and allows other parameters to be measured. For example, signal processing in video surveillance systems installed on track-inspection (laboratory) cars is used to obtain information about certain parameters of the railways. This article describes two video processing algorithms that recognize pedestrian crossings of the railways and measure the location of the so-called "anchor marks" used to monitor the mechanical stresses of continuous welded rail track. The algorithms are based on the principle of first determining a region of interest (ROI) and then analyzing the fragments inside this ROI.
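The ROI-then-analyze principle the article describes can be sketched in a few lines. This is an illustrative stand-in, not the authors' algorithm; the brightness-threshold fragment analysis and all parameter values are assumptions:

```python
import numpy as np

def analyze_roi(frame, top, left, height, width, threshold=128):
    """Crop a region of interest from a grayscale frame and report the
    fraction of pixels above a brightness threshold (a stand-in for the
    fragment-analysis step; the threshold value is an assumption)."""
    roi = frame[top:top + height, left:left + width]
    return float((roi > threshold).mean())

frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:110, 200:210] = 255              # a bright 10x10 patch
print(analyze_roi(frame, 100, 200, 10, 10))  # -> 1.0
```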

  10. Proactive PTZ Camera Control

    NASA Astrophysics Data System (ADS)

    Qureshi, Faisal Z.; Terzopoulos, Demetri

    We present a visual sensor network—comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras—capable of automatically capturing closeup video of selected pedestrians in a designated area. The passive cameras can track multiple pedestrians simultaneously and any PTZ camera can observe a single pedestrian at a time. We propose a strategy for proactive PTZ camera control where cameras plan ahead to select optimal camera assignment and handoff with respect to predefined observational goals. The passive cameras supply tracking information that is used to control the PTZ cameras.

  11. High Precision CCD Imaging Polarimetry

    NASA Astrophysics Data System (ADS)

    Magalhaes, A. M.; Rodrigues, C. V.; Margoniner, V. E.; Pereyra, A.; Heathcote, S.; Coyne, G. V.

    1994-12-01

    We describe a recent modification to the direct CCD cameras at the CTIO and LNA (Brazil) observatories that allows for high-precision optical polarimetry. We use a rotating achromatic half-wave plate as the retarder and a Savart plate as the analyser. Cancellation of sky polarization and independence from the CCD flat-field correction are among the advantages of the arrangement. We show preliminary data that indicate the high polarimetric precision achievable with the method for non-extended sources, and we give a brief description of the ongoing observational programs employing the technique. Polarimetry of extended objects can be performed by using a Polaroid sheet in place of the Savart plate. Use of the Savart plate with such fields can also be valuable in the reduction and analysis of the extended-source images, as it provides polarization data on the non-extended objects in the field.

  12. Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic

    NASA Astrophysics Data System (ADS)

    Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Yasushi, Kondo

    2008-11-01

    Visualization of explosion phenomena is very important and essential to evaluating the performance of explosive effects. The phenomena, however, generate blast waves and fragments from cases, so we must protect our visualizing equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in severe blast environments, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to images taken with the lens directly coupled to the camera head. This confirmed that the system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

  13. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  14. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  15. Identification of Prey Captures in Australian Fur Seals (Arctocephalus pusillus doriferus) Using Head-Mounted Accelerometers: Field Validation with Animal-Borne Video Cameras

    PubMed Central

    Volpov, Beth L.; Hoskins, Andrew J.; Battaile, Brian C.; Viviant, Morgane; Wheatley, Kathryn E.; Marshall, Greg; Abernathy, Kyler; Arnould, John P. Y.

    2015-01-01

    This study investigated prey captures in free-ranging adult female Australian fur seals (Arctocephalus pusillus doriferus) using head-mounted 3-axis accelerometers and animal-borne video cameras. Acceleration data were used to identify individual attempted prey captures (APC), and video data were used to independently verify APC and prey types. Results demonstrated that head-mounted accelerometers could detect individual APC but were unable to distinguish among prey types (fish, cephalopod, stingray) or between successful captures and unsuccessful capture attempts. The mean detection rate (true-positive rate) on individual animals in the testing subset ranged from 67% to 100%, and mean detection on the testing subset averaged across 4 animals ranged from 82% to 97%. The mean false-positive (FP) rate ranged from 15% to 67% individually in the testing subset, and from 26% to 59% averaged across 4 animals. Surge and sway had significantly greater detection rates, but conversely also greater FP rates, than heave. Video data also indicated that some head movements recorded by the accelerometers were unrelated to APC and that a peak in acceleration variance did not always equate to an individual prey item. The results of the present study indicate that head-mounted accelerometers provide a complementary tool for investigating foraging behaviour in pinnipeds, but detection and FP correction factors need to be applied for reliable field application. PMID:26107647
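The accelerometer-based APC detection described above keys on peaks in acceleration variance. A minimal sketch of such a detector (the window length and threshold factor k are illustrative assumptions, not the authors' calibrated values):

```python
import numpy as np

def detect_events(accel, window=10, k=3.0):
    """Return indices of sliding windows whose acceleration variance
    exceeds k times the median window variance: candidate capture
    attempts. Window length and k are illustrative assumptions."""
    var = np.array([accel[i:i + window].var()
                    for i in range(len(accel) - window + 1)])
    return np.where(var > k * np.median(var))[0]

trace = np.zeros(100)
trace[50:55] = [5, -5, 5, -5, 5]   # a short burst of head movement
print(detect_events(trace))         # window indices overlapping the burst
```

As the abstract notes, not every variance peak is a prey item, so in practice such a detector needs the video-derived false-positive correction.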

  16. CCD Double Star Measures: Jack Jones Observatory Report #2

    NASA Astrophysics Data System (ADS)

    Jones, James L.

    2009-10-01

    This paper submits 44 CCD measurements of 41 multiple star systems for inclusion in the WDS. Observations were made during the calendar year 2008. Measurements were made using a CCD camera and an 11" Schmidt-Cassegrain telescope. Brief discussions of pertinent observations are included.

  17. Large area CCD image sensors for space astronomy

    NASA Technical Reports Server (NTRS)

    Schwarzschild, M.

    1979-01-01

    The Defense Advanced Research Projects Agency (DARPA) has a substantial program to develop a 2200 x 2200 pixel CCD (Charge Coupled Device) mosaic array made up of 400 individual CCDs, each 110 x 110 pixels square. This type of image sensor appeared to have applications in space and ground-based astronomy. Under this grant a CCD television camera system was built that was capable of operating an array of 4 CCDs, in order to explore the suitability of the CCD for astronomical applications. Two individually packaged CCDs were received and evaluated. Evaluation of the basic characteristics of the best individual chips was encouraging, but the manufacturer found that their manufacturing yield for this design is too low to supply sufficient CCDs for the DARPA mosaic array. The potential utility of large mosaic arrays in astronomy is still substantial, and continued monitoring of the manufacturer's progress in the coming year is recommended.

  18. Fast measurement of temporal noise of digital camera's photosensors

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.

    2015-10-01

    Photo- and video cameras are now widespread parts of both scientific experimental setups and consumer applications. They are used in optics, radiophysics, astrophotography, chemistry, and various other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and video cameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: temporal noise includes the random component, while spatial noise includes the pattern component. The spatial component is usually several times smaller in magnitude than the temporal one, so to a first approximation spatial noise can be neglected. Earlier we proposed a modification of the automatic segmentation of non-uniform targets (ASNT) method for measuring the temporal noise of photo- and video cameras; only two frames are sufficient for noise measurement with the modified method, so it should allow fast and accurate measurement of temporal noise. In this paper, we estimated the light and dark temporal noise of four cameras of different types using the modified ASNT method with only a few frames: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PLB781F (CMOS, 6.6 MP, 10-bit ADC), and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). The experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. We also measured the time required to process the shots used for temporal noise estimation. The results demonstrate that the dependency of a camera's full temporal noise on signal value can be obtained quickly with the proposed ASNT modification.
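The two-frame idea behind the modified method can be illustrated simply: for a static scene, differencing two frames cancels the fixed spatial pattern, leaving only temporal noise inflated by a factor of sqrt(2). A minimal sketch (not the authors' ASNT implementation, which additionally segments non-uniform targets by signal level):

```python
import numpy as np

def temporal_noise(frame_a, frame_b):
    """Estimate per-frame temporal noise from two frames of a static
    scene: the frame difference cancels the fixed spatial pattern, and
    its std overestimates the per-frame noise by sqrt(2)."""
    diff = frame_a.astype(np.float64) - frame_b.astype(np.float64)
    return diff.std() / np.sqrt(2)

# Synthetic check: a fixed pattern plus Gaussian noise of std 2.0
rng = np.random.default_rng(0)
pattern = rng.uniform(100, 200, size=(1000, 1000))
a = pattern + rng.normal(0, 2.0, size=pattern.shape)
b = pattern + rng.normal(0, 2.0, size=pattern.shape)
print(temporal_noise(a, b))  # close to 2.0
```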

  19. Inexpensive video cameras used by parents to record social communication in epidemiological investigations in early childhood-A feasibility study.

    PubMed

    Wilson, Philip; Puckering, Christine; McConnachie, Alex; Marwick, Helen; Reissland, Nadja; Gillberg, Christopher

    2011-02-01

    We tested the feasibility of parents recording social interactions with their infants using inexpensive camcorders, as a potential method of effective, convenient, and economical large-scale data gathering on social communication. Participants were asked to record two short video clips during either play or a mealtime and to return the data. Sixty-five video clips (32 pairs) were returned by 33 families, comprising 8.5% of families contacted, 44.6% of respondents, and 51.6% of those sent a camcorder; the general visual and sound quality of the data was assessed. Audio and video quality were adequate for analysis in 85% of clips, and several social behaviours, including social engagement and contingent responsiveness, could be assessed in 97% of clips. We examined two quantifiable social behaviours in both adults and infants: gaze direction and duration, and vocalization occurrence and duration. It proved difficult for most observers to obtain a simultaneous clear view of both the parent's and the infant's face. Video clips obtained by parents are informative and usable for analysis. Further work is required to establish the acceptability of this technique in longitudinal studies of child development and to maximize the return of usable data. PMID:21036401

  20. The Effect of Smartphone Video Camera as a Tool to Create Digital Stories for English Learning Purposes

    ERIC Educational Resources Information Center

    Gromik, Nicolas A.

    2015-01-01

    The integration of smartphones in the language learning environment is gaining research interest. However, using a smartphone to learn to speak spontaneously has received little attention. The emergence of smartphone technology and its video recording feature are recognised as suitable learning tools. This paper reports on a case study conducted…

  1. Use of an unmanned aerial vehicle-mounted video camera to assess feeding behavior of Raramuri Criollo cows

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We determined the feasibility of using unmanned aerial vehicle (UAV) video monitoring to predict intake of discrete food items of rangeland-raised Raramuri Criollo non-nursing beef cows. Thirty-five cows were released into a 405-m2 rectangular dry lot, either in pairs (pilot tests) or individually (...

  2. What Does the Camera Communicate? An Inquiry into the Politics and Possibilities of Video Research on Learning

    ERIC Educational Resources Information Center

    Vossoughi, Shirin; Escudé, Meg

    2016-01-01

    This piece explores the politics and possibilities of video research on learning in educational settings. The authors (a research-practice team) argue that changing the stance of inquiry from "surveillance" to "relationship" is an ongoing and contingent practice that involves pedagogical, political, and ethical choices on the…

  3. A geometric comparison of video camera-captured raster data to vector-parented raster data generated by the X-Y digitizing table

    NASA Technical Reports Server (NTRS)

    Swalm, C.; Pelletier, R.; Rickman, D.; Gilmore, K.

    1989-01-01

    The relative accuracy of a georeferenced raster data set captured by the Megavision 1024XM system using the Videk Megaplus CCD cameras is compared to a georeferenced raster data set generated from vector lines manually digitized through the ELAS software package on a Summagraphics X-Y digitizer table. The study also investigates the amount of time necessary to fully complete the rasterization of the two data sets, evaluating individual areas such as time necessary to generate raw data, time necessary to edit raw data, time necessary to georeference raw data, and accuracy of georeferencing against a norm. Preliminary results exhibit a high level of agreement between areas of the vector-parented data and areas of the captured file data where sufficient control points were chosen. Maps of 1:20,000 scale were digitized into raster files of 5 meter resolution per pixel, and the overall RMS error was estimated at less than eight meters. Such approaches offer time- and labor-saving advantages as well as increasing the efficiency of project scheduling and enabling the digitization of new types of data.
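The RMS error figure quoted above is computed from the displacements between digitized control points and their reference coordinates, e.g.:

```python
import numpy as np

def rms_error(measured, reference):
    """RMS of point-to-point distances between measured and reference
    control-point coordinates (N x 2 arrays, in map units)."""
    d2 = ((np.asarray(measured) - np.asarray(reference)) ** 2).sum(axis=1)
    return float(np.sqrt(d2.mean()))

# Two points, each offset by 3 m east and 4 m north -> RMS = 5.0 m
print(rms_error([[3, 4], [103, 104]], [[0, 0], [100, 100]]))
```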

  4. Video tracking method for three-dimensional measurement of a free-swimming fish

    NASA Astrophysics Data System (ADS)

    Wu, Guanhao; Zeng, Lijiang

    2007-12-01

    A video system for tracking a free-swimming fish two-dimensionally is introduced in this paper. The tracking is accomplished by simultaneously taking images from the ventral view and the lateral view of the fish with two CCD cameras mounted on two computer-controlled and mutually orthogonal translation stages. By processing the images recorded during tracking, three-dimensional kinematic parameters of the tail and pectoral fin of the fish in forward, backward and turning swimming modes are obtained.
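The two orthogonal views described above can be fused into 3D coordinates in a simple way: the ventral camera supplies (x, y) and the lateral camera supplies (x, z) for the same body point. A minimal sketch, where the axis naming is an assumption about the camera geometry rather than a detail given in the abstract:

```python
def merge_views(ventral_xy, lateral_xz):
    """Combine a ventral-view (x, y) and lateral-view (x, z) detection
    of the same point into (x, y, z), averaging the shared x coordinate
    to reduce per-camera measurement error."""
    (xv, y), (xl, z) = ventral_xy, lateral_xz
    return ((xv + xl) / 2.0, y, z)

print(merge_views((10.0, 4.0), (12.0, 7.0)))  # -> (11.0, 4.0, 7.0)
```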

  5. Overview of a hybrid underwater camera system

    NASA Astrophysics Data System (ADS)

    Church, Philip; Hou, Weilin; Fournier, Georges; Dalgleish, Fraser; Butler, Derek; Pari, Sergio; Jamieson, Michael; Pike, David

    2014-05-01

    The paper provides an overview of a Hybrid Underwater Camera (HUC) system combining sonar with a range-gated laser camera system. The sonar is the BlueView P900-45, operating at 900 kHz with a field of view of 45 degrees and a ranging capability of 60 m. The range-gated laser camera system is based on the third-generation LUCIE (Laser Underwater Camera Image Enhancer) sensor originally developed by Defence Research and Development Canada. LUCIE uses an eye-safe laser generating 1 ns pulses at a wavelength of 532 nm and at a rate of 25 kHz. An intensified CCD camera operates with a gating mechanism synchronized with the laser pulse. The gate opens to let the camera capture photons from a given range of interest and can be set from a minimum delay of 5 ns with increments of 200 ps. The output of the sensor is a 30 Hz video signal. Automatic ranging is achieved using a sonar altimeter. The BlueView sonar and LUCIE sensors are integrated with an underwater computer that controls the sensor parameters and displays the real-time data from the sonar and the laser camera. As an initial step toward data integration, graphics overlays representing the laser camera's field of view, along with the gate position and width, are drawn on the sonar display. The HUC system can be manually handled by a diver and can also be controlled from a surface vessel through an umbilical cord. Recent test data obtained from the HUC system operated in a controlled underwater environment are presented along with measured performance characteristics.
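The gating principle described above sets the camera's gate delay from the round-trip travel time of the laser pulse to the range of interest. A minimal sketch, assuming light propagates at c/n in water with n ≈ 1.33 (a textbook value, not a parameter quoted in the abstract):

```python
C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33             # approximate refractive index of seawater

def gate_delay_ns(range_m, n=N_WATER):
    """Round-trip delay for a range-gated camera: t = 2 * R * n / c,
    returned in nanoseconds."""
    return 2.0 * range_m * n / C_VACUUM * 1e9

print(round(gate_delay_ns(10.0), 1))  # delay for a 10 m gate -> 88.7 ns
```

At LUCIE's 200 ps delay increments, this delay can be positioned with roughly 2 cm range resolution.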

  6. The Video Book.

    ERIC Educational Resources Information Center

    Clendenin, Bruce

    This book provides a comprehensive step-by-step learning guide to video production. It begins with camera equipment, both still and video. It then describes how to reassemble the video and build a final product out of "video blocks," and discusses multiple-source configurations, which are required for professional level productions of live shows.…

  7. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

    This paper describes image evaluation techniques used to standardize camera system characterizations. The author's group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity-focused diode (PFD) image intensifiers, electrostatic image tubes, or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.

  8. CCD BVIc observations of Cepheids

    NASA Astrophysics Data System (ADS)

    Berdnikov, L. N.; Kniazev, A. Yu.; Sefako, R.; Kravtsov, V. V.; Zhujko, S. V.

    2014-02-01

    In 2008-2013, we obtained 11333 CCD BVIc frames for 57 Cepheids from the General Catalogue of Variable Stars. We performed our observations with the 76-cm telescope of the South African Astronomical Observatory (SAAO, South Africa) and the 40-cm telescope of the Cerro Armazones Astronomical Observatory of the Universidad Católica del Norte (OCA, Chile) using the SBIG ST-10XME CCD camera. The tables of observations, the plots of light curves, and the current light elements are presented. Comparison of our light curves with those constructed from photoelectric observations shows that the differences between their mean magnitudes exceed 0.05 mag in 20% of the cases. This suggests the necessity of performing CCD observations for all Cepheids.

  9. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

    This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components, including microchannel plate or proximity-focused diode image intensifiers, electro-static image tubes, and electron-bombarded CCDs, affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  10. Concerning the Video Drift Method to Measure Double Stars

    NASA Astrophysics Data System (ADS)

    Nugent, Richard L.; Iverson, Ernest W.

    2015-05-01

    Classical methods to measure position angles and separations of double stars rely on just a few measurements, either from visual observations or photographic means. Visual and photographic CCD observations are subject to errors from the following sources: misalignments of the eyepiece/camera/Barlow lens/micrometer/focal reducers, systematic errors from uncorrected optical distortions, aberrations of the telescope system, camera tilt, and magnitude and color effects. Conventional video methods rely on calibration doubles and graphically calculating the east-west direction, plus careful choice of select video frames stacked for measurement. Atmospheric motion, on the order of 0.5-1.5, is one of the larger sources of error in any exposure/measurement method. Ideally, if a data set from a short video can be used to derive position angle and separation, with each data set self-calibrating independently of any calibration doubles or star catalogues, this would provide measurements of high systematic accuracy. These aims are achieved by the video drift method first proposed by the authors in 2011. This self-calibrating video method automatically analyzes thousands of measurements from a short video clip.
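
    The drift calibration described above exploits the fact that, with the telescope drive off, a star trails east-west at the sidereal rate of 15.041·cos(δ) arcsec per second of time. A minimal sketch of how that yields a self-calibrated plate scale and then a pair's position angle and separation (hypothetical function names, not the authors' actual software):

```python
import math

SIDEREAL_RATE = 15.0411  # arcsec of RA per second of time at the celestial equator

def plate_scale(drift_pixels, drift_seconds, dec_deg):
    """Arcsec per pixel from a star's measured drift with the drive off."""
    arcsec = SIDEREAL_RATE * math.cos(math.radians(dec_deg)) * drift_seconds
    return arcsec / drift_pixels

def pa_and_separation(x1, y1, x2, y2, scale, drift_angle_rad):
    """Position angle (deg, E of N) and separation (arcsec) of a star pair.

    drift_angle_rad is the measured drift direction in the pixel frame,
    i.e. the east-west axis, which self-calibrates the camera rotation."""
    dx, dy = x2 - x1, y2 - y1
    # Rotate pixel offsets into the equatorial (east, north) frame.
    east = dx * math.cos(drift_angle_rad) + dy * math.sin(drift_angle_rad)
    north = -dx * math.sin(drift_angle_rad) + dy * math.cos(drift_angle_rad)
    pa = math.degrees(math.atan2(east, north)) % 360.0
    return pa, scale * math.hypot(east, north)
```

    Averaging these quantities over the thousands of frame measurements in a clip is what beats down the random (largely atmospheric) error.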

  11. The future scientific CCD

    NASA Technical Reports Server (NTRS)

    Janesick, J. R.; Elliott, T.; Collins, S.; Marsh, H.; Blouke, M. M.

    1984-01-01

    Since the first introduction of charge-coupled devices (CCDs) in 1970, CCDs have been considered for applications related to memories, logic circuits, and the detection of visible radiation. It is pointed out, however, that the mass market orientation of CCD development has left largely untapped the enormous potential of these devices for advanced scientific instrumentation. The present paper has, therefore, the objective to introduce the CCD characteristics to the scientific community, taking into account prospects for further improvement. Attention is given to evaluation criteria, a summary of current CCDs, CCD performance characteristics, absolute calibration tools, quantum efficiency, aspects of charge collection, charge transfer efficiency, read noise, and predictions regarding the characteristics of the next generation of silicon scientific CCD imagers.
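
    Among the "absolute calibration tools" commonly applied to scientific CCDs is the photon-transfer method, which recovers gain and read noise from two flat-field and two bias frames. A hedged sketch of the standard technique (not code from the paper):

```python
import numpy as np

def photon_transfer(flat1, flat2, bias1, bias2):
    """Gain (e-/ADU) and read noise (e-) from two flats and two biases.

    Differencing matched frame pairs cancels fixed-pattern noise, leaving
    only temporal noise; the gain is signal / shot-noise variance."""
    signal = 0.5 * (flat1.mean() + flat2.mean() - bias1.mean() - bias2.mean())
    var_flat = np.var(np.asarray(flat1, float) - np.asarray(flat2, float)) / 2.0
    var_bias = np.var(np.asarray(bias1, float) - np.asarray(bias2, float)) / 2.0
    gain = signal / (var_flat - var_bias)   # e- per ADU
    return gain, gain * np.sqrt(var_bias)   # (gain, read noise in electrons)
```

    Repeating the measurement over a range of exposure levels yields the full photon-transfer curve, from which full-well capacity and nonlinearity can also be read off.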

  12. On the development of new SPMN diurnal video systems for daylight fireball monitoring

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.

    2008-09-01

    Daylight fireball video monitoring. High-sensitivity video devices are commonly used for the study of the activity of meteor streams during the night. These provide useful data for the determination, for instance, of radiant, orbital and photometric parameters ([1] to [7]). With this aim, during 2006 three automated video stations supported by Universidad de Huelva were set up in Andalusia within the framework of the SPanish Meteor Network (SPMN). These are endowed with 8-9 high-sensitivity wide-field video cameras that achieve a meteor limiting magnitude of about +3. These stations have increased the coverage performed by the low-scan all-sky CCD systems operated by the SPMN and, besides, achieve a time accuracy of about 0.01 s for determining the appearance of meteor and fireball events. Despite these nocturnal monitoring efforts, we realised the need to set up stations for daylight fireball detection. This effort was also motivated by the appearance of two recent meteorite-dropping events: Villalbeto de la Peña [8,9] and Puerto Lápice [10]. Although the Villalbeto de la Peña event was casually videotaped and photographed, no direct pictures or videos were obtained for the Puerto Lápice event. Consequently, in order to perform a continuous recording of daylight fireball events, we set up new automated systems based on CCD video cameras. However, the development of these video stations implies several issues with respect to nocturnal systems that must be properly solved in order to achieve optimal operation. The first of these video stations, also supported by the University of Huelva, was set up in Sevilla (Andalusia) during May 2007. Of course, fireball association is unequivocal only in those cases when two or more stations record the fireball, so that the geocentric radiant can be accurately determined. With this aim, a second diurnal video station is being set up in Andalusia in the facilities of Centro Internacional de Estudios y

  13. CCD Imaging of KIC 8462852

    NASA Astrophysics Data System (ADS)

    Lahey, Adam

    2016-06-01

    A particularly interesting star, KIC 8462852, recently became famous for its enigmatic dips in brightness. The interpretation broadcast by many popular media outlets was that the dips were caused by a megastructure built around the star by an intelligent civilization. The best scientific hypothesis relies on a natural phenomenon: the break-up of a comet orbiting the star. To further address this problem, we have measured the star for four months using BGSU’s 0.5-m telescope and digital CCD camera, and we present the star’s brightness as a function of time. Using three very clear nights, we refined the brightness of four comparison stars which can be used by the local astronomical community to monitor the star’s brightness. These newly refined magnitudes should reduce the uncertainties in our brightness measurements; this error analysis is essential in determining the significance of any brightness deviations. An observed dip in brightness would confirm the comet hypothesis by establishing a cyclical pattern, or may serve as a basis for a new understanding of variable stars. An additional element of the project involves creating CCD calibration images and a well-documented procedure for future use.

  14. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a random-access, real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in "High-Frame-Rate CCD Camera Having Subwindow Capability" (NPO-30564), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field-programmable gate array (FPGA) controller card.
    These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable.
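
    The frame-rate benefit of per-ROI readout can be illustrated with a simple timing model: frame time scales with the number of ROI pixels digitized, plus per-line and per-frame overheads. A sketch under that assumed timing model (not the RARE camera's actual one):

```python
def roi_frame_rate(roi_w, roi_h, pixel_clock_hz,
                   line_overhead_s=0.0, frame_overhead_s=0.0):
    """Rough attainable frame rate for reading out a single sub-window (ROI).

    Assumes only the ROI pixels are digitized and that rows outside the
    ROI are skipped at negligible cost (hypothetical timing model)."""
    frame_time = (roi_w * roi_h / pixel_clock_hz
                  + roi_h * line_overhead_s + frame_overhead_s)
    return 1.0 / frame_time
```

    When pixel time dominates, halving both ROI dimensions roughly quadruples the attainable rate, which is why small tracking windows can run far faster than full-frame readout.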

  15. The Video Guide. Second Edition.

    ERIC Educational Resources Information Center

    Bensinger, Charles

    Intended for both novice and experienced users, this guide is designed to inform and entertain the reader in unravelling the jargon surrounding video equipment and in following carefully delineated procedures for its use. Chapters include "Exploring the Video Universe," "A Grand Tour of Video Technology," "The Video System," "The Video Camera," "The…

  16. Application of a Two Camera Video Imaging System to Three-Dimensional Vortex Tracking in the 80- by 120-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.; Bennett, Mark S.

    1993-01-01

    A description is presented of two enhancements for a two-camera, video imaging system that increase the accuracy and efficiency of the system when applied to the determination of three-dimensional locations of points along a continuous line. These enhancements increase the utility of the system when extracting quantitative data from surface and off-body flow visualizations. The first enhancement utilizes epipolar geometry to resolve the stereo "correspondence" problem. This is the problem of determining, unambiguously, corresponding points in the stereo images of objects that do not have visible reference points. The second enhancement is a method to automatically identify and trace the core of a vortex in a digital image. This is accomplished by means of an adaptive template matching algorithm. The system was used to determine the trajectory of a vortex generated by the Leading-Edge eXtension (LEX) of a full-scale F/A-18 aircraft tested in the NASA Ames 80- by 120-Foot Wind Tunnel. The system accuracy for resolving the vortex trajectories is estimated to be +/-2 inches over a distance of 60 feet. Stereo images of some of the vortex trajectories are presented. The system was also used to determine the point where the LEX vortex "bursts". The vortex burst point locations are compared with those measured in small-scale tests and in flight and found to be in good agreement.
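
    The epipolar constraint used by the first enhancement states that the match in image 2 of a point x in image 1 must lie on the line F·x, where F is the fundamental matrix of the calibrated camera pair; candidate points can therefore be ranked by their distance to that line. A minimal sketch (F assumed already known from calibration):

```python
import numpy as np

def epipolar_distance(F, x1, x2):
    """Distance (pixels) of point x2 in image 2 from the epipolar line of x1.

    F: 3x3 fundamental matrix; x1, x2: (x, y) pixel coordinates.
    A small distance flags x2 as the likely correspondence of x1."""
    a, b, c = F @ np.array([x1[0], x1[1], 1.0])  # line: a*x + b*y + c = 0
    return abs(a * x2[0] + b * x2[1] + c) / np.hypot(a, b)
```

    For points on a continuous line with no distinguishing features, the epipolar line intersects the imaged curve in image 2, resolving the ambiguity.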

  17. High-speed multicolour photometry with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Zhilyaev, B. E.; Reshetnyk, V. M.

    2012-11-01

    We present the results of testing the commercial digital camera Nikon D90, with a CMOS sensor, for high-speed photometry with a small telescope, the Celestron 11'', at the Peak Terskol Observatory. The CMOS sensor allows photometry to be performed in three filters simultaneously, which gives a great advantage over monochrome CCD detectors. The Bayer BGR colour system of CMOS sensors is close to the Johnson BVR system. The results of testing show that one can carry out photometric measurements with CMOS cameras for stars with V-magnitudes up to ≃14 mag with a precision of 0.01 mag. Stars with V-magnitudes up to ~10 can be shot at 24 frames per second in the video mode.
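
    The quoted 0.01-mag precision can be checked against the standard CCD signal-to-noise equation, in which a fractional flux error of 1/SNR maps to 1.0857/SNR magnitudes. A sketch with hypothetical parameter names:

```python
import math

def mag_precision(star_e, sky_e_per_pix, npix, read_noise_e):
    """Photometric error in magnitudes from the standard CCD SNR equation.

    star_e: detected photoelectrons from the star; sky_e_per_pix: sky
    electrons per pixel; npix: pixels in the aperture; read_noise_e:
    read noise in electrons per pixel."""
    noise = math.sqrt(star_e + npix * (sky_e_per_pix + read_noise_e ** 2))
    snr = star_e / noise
    return 1.0857 / snr  # 2.5 / ln(10) converts fractional error to magnitudes
```

    In the photon-limited case, about 10^4 detected photoelectrons are needed for 0.01-mag precision, before scintillation and flat-fielding errors are considered.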

  18. High-resolution CCD imagers using area-array CCD's for sensing spectral components of an optical line image

    NASA Technical Reports Server (NTRS)

    Elabd, Hammam (Inventor); Kosonocky, Walter F. (Inventor)

    1987-01-01

    CCD imagers with a novel replicated-line-imager architecture are abutted to form an extended line sensor. The sensor is preceded by optics having a slit aperture and having an optical beam splitter or astigmatic lens for projecting multiple line images through an optical color-discriminating stripe filter to the CCD imagers. A very high resolution camera suitable for use in a satellite, for example, is thus provided. The replicated-line architecture of the imager comprises an area-array CCD, successive rows of which are illuminated by replications of the same line segment, as transmitted by respective color filter stripes. The charge packets formed by accumulation of photoresponsive charge in the area-array CCD are read out row by row. Each successive row of charge packets is then converted from parallel to serial format in a CCD line register and its amplitude sensed to generate a line of output signal.

  19. Video-Level Monitor

    NASA Technical Reports Server (NTRS)

    Gregory, Ray W.

    1993-01-01

    Video-level monitor developed to provide full-scene monitoring of video and indicates level of brightest portion. Circuit designed nonspecific and can be inserted in any closed-circuit camera system utilizing RS170 or RS330 synchronization and standard CCTV video levels. System made of readily available, off-the-shelf components. Several units are in service.

  20. The use of video for air pollution source monitoring

    SciTech Connect

    Ferreira, F.; Camara, A.

    1999-07-01

    The evaluation of air pollution impacts from single industrial emission sources is a complex environmental engineering problem. Recent developments in the multimedia technologies used by personal computers have improved the digitizing and processing of digital video sequences. This paper proposes a methodology where statistical analysis of both meteorological and air quality data, combined with digital video images, is used for monitoring air pollution sources. One of the objectives of this paper is to present the use of image processing algorithms in air pollution source monitoring. CCD amateur video cameras capture images that are further processed by computer. The use of video as a remote sensing system was implemented with the goal of determining particular parameters, either meteorological or related to air quality monitoring and modeling of point sources. These parameters include the remote calculation of wind direction, wind speed, the stack gas exit velocity, and the stack's effective emission height. The characteristics and behavior of a visible pollutant plume are also studied. Different sequences of relatively simple image processing operations are applied to the images gathered by the different cameras to segment the plume. The algorithms are selected depending on the atmospheric and lighting conditions. The developed system was applied to a 1,000 MW fuel power plant located at Setubal, Portugal. The methodology presented shows that digital video can be an inexpensive way to obtain useful air-pollution-related data for monitoring and modeling purposes.
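
    The plume-segmentation step can be illustrated with the simplest of the "relatively simple image processing operations" mentioned: background differencing followed by thresholding. An illustrative sketch only, not the paper's condition-dependent pipeline:

```python
import numpy as np

def segment_plume(frame, background, threshold=12):
    """Segment a visible plume by background differencing and thresholding.

    frame, background: 2-D grayscale arrays (e.g. uint8). Returns a boolean
    mask of plume pixels; the morphological clean-up and the
    condition-dependent choice of operations are omitted here."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold
```

    From the mask, plume rise and drift between consecutive frames can then be tracked to estimate quantities such as wind direction and effective emission height.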

  1. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.
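
    The dynamic-range figures above are often quoted in dB, i.e. 20·log10 of the ratio of the largest to smallest resolvable signal. As a one-line helper (hypothetical name):

```python
import math

def dynamic_range_db(max_signal, min_signal):
    """Intra-scene dynamic range expressed in dB (20 * log10 of the ratio)."""
    return 20.0 * math.log10(max_signal / min_signal)
```

    The >10,000:1 range of multiple-exposure charge-integration sensors corresponds to 80 dB, while the near million-to-one range of log-converting pixels corresponds to about 120 dB.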

  2. CCD technology applied to laser cladding

    NASA Astrophysics Data System (ADS)

    Meriaudeau, Fabrice; Renier, Eric; Truchetet, Frederic

    1996-03-01

    Power lasers are more and more used in the aerospace and automobile industries; their widespread use in different processes such as welding, drilling or coating, in order to perform surface treatments of materials, requires a better understanding. In order to control the quality of the process, many techniques have been developed, but most of them are based on a post-mortem analysis of the samples and/or require an important financial investment. Welding, coating and other material treatments involving material transformations are often controlled with a metallurgical analysis. We propose here a new method, a new approach to the phenomena: we control the industrial process during the application. For this, we use information provided by two CCD cameras. One supplies information related to the intensity and geometry of the melted surface, the second to the shape of the powder distribution within the laser beam. We use data provided by post-mortem metallurgical analysis and correlate this information with the parameters measured by both CCDs, creating a data bank which represents the relation between the measured parameters and the quality of the coating. The combined information provided by the two CCD cameras allows us to optimize the industrial process. We are currently working on the real-time aspect of the application and expect an implementation of the system.

  3. Upgrades to NDSF Vehicle Camera Systems and Development of a Prototype System for Migrating and Archiving Video Data in the National Deep Submergence Facility Archives at WHOI

    NASA Astrophysics Data System (ADS)

    Fornari, D.; Howland, J.; Lerner, S.; Gegg, S.; Walden, B.; Bowen, A.; Lamont, M.; Kelley, D.

    2003-12-01

    In recent years, considerable effort has been made to improve the visual recording capabilities of Alvin and ROV Jason. This has culminated in the routine use of digital cameras, both internal and external on these vehicles, which has greatly expanded the scientific recording capabilities of the NDSF. The UNOLS National Deep Submergence Facility (NDSF) archives maintained at Woods Hole Oceanographic Institution (WHOI) are the repository for the diverse suite of photographic still images (both 35mm and recently digital), video imagery, vehicle data and navigation, and near-bottom side-looking sonar data obtained by the facility vehicles. These data comprise a unique set of information from a wide range of seafloor environments over the more than 25 years of NDSF operations in support of science. Included in the holdings are Alvin data plus data from the tethered vehicles: ROV Jason, Argo II, and the DSL-120 side scan sonar. This information conservatively represents an outlay in facilities and science costs well in excess of $100 million. Several archive related improvement issues have become evident over the past few years. The most critical are: 1. migration and better access to the 35mm Alvin and Jason still images through digitization and proper cataloging with relevant meta-data, 2. assessing Alvin data logger data, migrating data on older media no longer in common use, and properly labeling and evaluating vehicle attitude and navigation data, 3. migrating older Alvin and Jason video data, especially data recorded on Hi-8 tape that is very susceptible to degradation on each replay, to newer digital format media such as DVD, 4. 
improving the capabilities of the NDSF archives to better serve the increasingly complex needs of the oceanographic community, including researchers involved in focused programs like Ridge2000 and MARGINS, where viable distributed databases in various disciplinary topics will form an important component of the data management structure

  4. Cone penetrometer deployed in situ video microscope for characterizing sub-surface soil properties

    SciTech Connect

    Lieberman, S.H.; Knowles, D.S.; Kertesz, J.

    1997-12-31

    In this paper we report on the development and field testing of an in situ video microscope that has been integrated with a cone penetrometer probe in order to provide a real-time method for characterizing subsurface soil properties. The video microscope system consists of a miniature CCD color camera system coupled with appropriate magnification and focusing optics to provide a field of view with a coverage of approximately 20 mm. The camera/optic system is mounted in a cone penetrometer probe so that the camera views the soil that is in contact with a sapphire window mounted on the side of the probe. The soil outside the window is illuminated by diffuse light provided through the window by an optical fiber illumination system connected to a white light source at the surface. The video signal from the camera is returned to the surface where it can be displayed in real time on a video monitor, recorded on a video cassette recorder (VCR), and/or captured digitally with a frame grabber installed in a microcomputer system. In its highest resolution configuration, the in situ camera system has demonstrated a capability to resolve particle sizes as small as 10 μm. By using other lens systems to increase the magnification factor, smaller particles could be resolved; however, the field of view would be reduced. Initial field tests have demonstrated the ability of the camera system to provide real-time qualitative characterization of soil particle sizes. In situ video images also reveal information on the porosity of the soil matrix and the presence of water in the saturated zone. Current efforts are focused on the development of automated image processing techniques as a means of extracting quantitative information on soil particle size distributions. Data will be presented that compare results derived from digital images with conventional sieve/hydrometer analyses.
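
    The magnification/field-of-view trade-off noted above follows from a Nyquist-style rule of thumb: a particle must span roughly two pixels to be resolved, so the smallest resolvable size is about 2 × (field of view / pixel count). A sketch with assumed numbers (the actual sensor format is not given above):

```python
def resolvable_particle_um(fov_mm, sensor_pixels, pixels_per_particle=2):
    """Smallest resolvable particle (micrometres) for a given field of view.

    Rule-of-thumb sketch: a particle must span pixels_per_particle pixels.
    Increasing magnification shrinks fov_mm and the resolvable size together."""
    return fov_mm * 1000.0 / sensor_pixels * pixels_per_particle
```

    For example, under these assumptions a 5 mm field of view imaged across 1000 pixels resolves particles down to about 10 µm; doubling the magnification halves both numbers.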

  5. Video borehole depth measuring system

    SciTech Connect

    Utasi, J.G.

    1986-09-02

    A method is described of determining penetration of a drill string into the earth utilizing an element of a drilling rig, comprising: providing a target on the element of the drill rig; positioning a video camera at a remote location relative to the drill rig; placing the video camera within a waterproof housing at the remote location; directing the video camera at the target; and tracking the movement of the target with the drill string into the earth.

  6. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1983-08-02

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid. 4 figs.

  7. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1981-06-10

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid.

  8. Video flowmeter

    DOEpatents

    Lord, David E.; Carter, Gary W.; Petrini, Richard R.

    1983-01-01

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid (10) containing entrained particles (12) is formed and positioned by a rod optic lens assembly (31) on the raster area of a low-light level television camera (20). The particles (12) are illuminated by light transmitted through a bundle of glass fibers (32) surrounding the rod optic lens assembly (31). Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen (40). The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid (10).

  9. Megapixel imaging camera for expanded H{sup {minus}} beam measurements

    SciTech Connect

    Simmons, J.E.; Lillberg, J.W.; McKee, R.J.; Slice, R.W.; Torrez, J.H.; McCurnin, T.W.; Sanchez, P.G.

    1994-02-01

    A charge coupled device (CCD) imaging camera system has been developed as part of the Ground Test Accelerator project at the Los Alamos National Laboratory to measure the properties of a large diameter, neutral particle beam. The camera is designed to operate in the accelerator vacuum system for extended periods of time. It would normally be cooled to reduce dark current. The CCD contains 1024 × 1024 pixels with a pixel size of 19 × 19 μm², with four-phase parallel clocking and two-phase serial clocking. The serial clock rate is 2.5 × 10⁵ pixels per second. Clock sequence and timing are controlled by an external logic-word generator. The DC bias voltages are likewise located externally. The camera contains circuitry to generate the analog clocks for the CCD and also contains the output video signal amplifier. Reset switching noise is removed by an external signal processor that employs delay elements to provide noise suppression by the method of double-correlated sampling. The video signal is digitized to 12 bits in an analog to digital converter (ADC) module controlled by a central processor module. Both modules are located in a VME-type computer crate that communicates via Ethernet with a separate workstation where overall control is exercised and image processing occurs. Under cooled conditions the camera shows good linearity with a dynamic range of 2000 and with dark noise fluctuations of about ±1/2 ADC count. Full well capacity is about 5 × 10⁵ electron charges.
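
    Double-correlated sampling, used here to remove reset switching noise, subtracts each pixel's reset-level sample from its subsequent video-level sample, cancelling the reset noise common to both samples. A minimal sketch of the arithmetic (the actual implementation above is analog, using delay elements):

```python
import numpy as np

def correlated_double_sample(reset_levels, video_levels):
    """Double-correlated sampling: subtract each pixel's reset-level sample
    from its video-level sample, cancelling the reset switching noise
    common to both samples."""
    return np.asarray(video_levels, dtype=float) - np.asarray(reset_levels, dtype=float)
```

    Because the reset offset is measured anew for every pixel, slow drifts and pixel-to-pixel reset variations are removed along with the switching noise.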

  10. Dynamic MTF improvement scheme and its validation for CCD operating in TDI mode for Earth imaging applications

    NASA Astrophysics Data System (ADS)

    Dubey, Neeraj; Banerjee, Arup

    2016-05-01

    The paper presents a scheme for improving image contrast in remote sensing images and highlights the novelty in the hardware and software design of the test system developed for measuring the image contrast function. Modulation transfer function (MTF) is the most critical quality element of high-resolution imaging payloads for Earth observation built around a TDI-CCD (Time Delayed Integration Charge Coupled Device) imager. From the mathematical model, a motion-smear MTF of 65% (35% degradation) is observed. An operating method for the TDI-CCD was then developed with which a motion-smear MTF of 96% is achieved during the imaging operation. As a major part of the validation, a test system for measuring the dynamic MTF of TDI sensors was indigenously designed and developed, consisting of an optical scanning system, TDI-CCD camera drive and video-processing electronics, a thermal control system, and a telecentric uniform illumination system. The experimental results confirm that image quality improvement can be achieved by this method. The method is now implemented in the flight-model hardware of the remote sensing payload.
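
    The 65% and 96% figures are consistent with the standard linear-motion smear MTF, sinc(f·d), evaluated at the Nyquist frequency f = 1/(2 pixels): a one-pixel smear gives ≈0.64, while a quarter-pixel smear gives ≈0.97. A sketch of that assumed model (not necessarily the paper's exact one):

```python
import math

def smear_mtf_at_nyquist(smear_pixels):
    """Linear-motion smear MTF at the Nyquist frequency.

    MTF = sin(pi*f*d) / (pi*f*d) with f = 1/(2 px) and smear length d
    expressed in pixels."""
    x = math.pi * smear_pixels / 2.0
    return 1.0 if x == 0.0 else math.sin(x) / x
```

    This shows why synchronizing the TDI clocking to the ground velocity, which shrinks the residual smear to a fraction of a pixel, recovers most of the lost contrast.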

  11. Computer-aided analysis of CCD linear image sensors

    NASA Technical Reports Server (NTRS)

    Prince, S. S.

    1976-01-01

    Special test equipment and techniques to collect and process image information from charge coupled devices (CCDs) by digital computer were reviewed. The video channel was traced from the CCD to the direct memory access bus of the Interdata Computer. Software was developed to evaluate and characterize a CCD for (1) dark signal versus temperature relationship, (2) calculation of temporal noise magnitude and noise shape for each pixel, (3) spatial noise into the video chain due to dark signal, (4) response versus illumination relationship (gamma), (5) response versus wavelength of illumination (spectral), (6) optimization of forcing functions, and (7) evaluation of an image viewed by a CCD. The basic software differences and specific examples of each program operating on real data are presented.
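
    Items (2) and (3) above, per-pixel temporal noise and spatial dark-signal noise, both fall out of simple statistics over a stack of repeated dark frames. A sketch with a hypothetical function name:

```python
import numpy as np

def per_pixel_temporal_noise(stack):
    """Per-pixel temporal noise and mean from a stack of repeated frames.

    stack: array of shape (n_frames, rows, cols). The standard deviation
    along the frame axis gives the per-pixel temporal noise map; the
    spread of the mean map across pixels reflects spatial (dark-signal)
    nonuniformity."""
    stack = np.asarray(stack, dtype=float)
    return stack.std(axis=0, ddof=1), stack.mean(axis=0)
```

    Repeating the measurement at several temperatures then yields the dark-signal-versus-temperature relationship of item (1).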

  12. Extreme ultraviolet response of a Tektronix 1024 x 1024 CCD

    NASA Astrophysics Data System (ADS)

    Moses, Daniel J.; Hochedez, Jean-Francois E.; Howard, Russell A.; Au, Benjamin D.; Wang, Dennis; Blouke, Morley

    1992-08-01

    The goal of the detector development program for the Solar and Heliospheric Observatory (SOHO) EUV Imaging Telescope (EIT) is an Extreme UltraViolet (EUV) CCD (Charge Coupled Device) camera. The Naval Research Lab (NRL) SOHO CCD Group has developed a design for the EIT camera and is screening CCDs for flight application. Tektronix Inc. has fabricated 1024x1024 CCDs for the EIT program. As a part of the CCD screening effort, the quantum efficiency (QE) of a prototype CCD has been measured in the NRL EUV laboratory over the wavelength range of 256 to 735 Angstroms. A simplified model has been applied to these QE measurements to illustrate the relevant physical processes that determine the performance of the detector.

  13. The parts to be modified and developed before a scientific CCD is attached to the LLMC.

    NASA Astrophysics Data System (ADS)

    Li, Chennfei; Wang, Xiaobin; Wei, Mao

    Given the current state of the LLMC, the characteristics of the video CCD in use, and the requirement that the instrument determine the absolute position of a celestial body, the parts that must be modified and developed before a scientific CCD can be attached to the LLMC are put forward.

  14. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital single-lens reflex cameras (DSLRs) have appeared. As a consequence we can buy cameras of ever higher pixel counts, and mass production has greatly reduced prices. The CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household, and many of these are DSLRs. They can be used very well for astronomical imaging, as is nicely demonstrated by the quantity and quality of the spectacular astrophotos appearing in various publications. These examples also show how much post-processing software contributes to raising the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I introduce how the main parameters of double stars (position angle and separation) can be measured, based on the methods, software and equipment I use. Others can easily adapt these to their own circumstances.

  15. Measurements of 42 Wide CPM Pairs with a CCD

    NASA Astrophysics Data System (ADS)

    Harshaw, Richard

    2015-11-01

    This paper addresses the use of a Skyris 618C color CCD camera as a means of obtaining data for analysis in the measurement of wide common proper motion stars. The equipment setup is described and data collection procedure outlined. Results of the measures of 42 CPM stars are presented, showing the Skyris is a reliable device for the measurement of double stars.

  16. CCD image sensor induced error in PIV applications

    NASA Astrophysics Data System (ADS)

    Legrand, M.; Nogueira, J.; Vargas, A. A.; Ventas, R.; Rodríguez-Hidalgo, M. C.

    2014-06-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (~0.1 pixels), the order of magnitude that other typical PIV errors such as peak-locking may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a model for the magnitude of the CCD readout bias error. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, which can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described.
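
    The abstract does not give the functional form of the two-constant calibration model, so the sketch below only illustrates the error mechanism it describes: a hypothetical intensity-dependent position bias which, when the two exposures differ in illumination, fails to cancel and produces a net velocity bias.

    ```python
    def readout_position_bias(intensity, c1, c2):
        """Hypothetical two-constant model: particle-image position bias (px)
        as a function of image intensity; c1, c2 stand in for the
        camera-specific calibration constants the paper fits."""
        return c1 + c2 * intensity

    def piv_velocity_bias(i1, i2, c1, c2, dt, px_m):
        """Net velocity bias (m/s) when the two PIV exposures have
        intensities i1 and i2 and hence different readout position biases."""
        return (readout_position_bias(i2, c1, c2)
                - readout_position_bias(i1, c1, c2)) * px_m / dt

    # With equal illumination in both frames the bias cancels exactly.
    zero = piv_velocity_bias(100.0, 100.0, 0.01, 0.002, 1e-3, 7.4e-6)
    ```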

  17. CCD high-speed videography system with new concepts and techniques

    NASA Astrophysics Data System (ADS)

    Zheng, Zengrong; Zhao, Wenyi; Wu, Zhiqiang

    1997-05-01

    A novel CCD high-speed videography system with brand-new concepts and techniques was recently developed at Zhejiang University. The system sends a series of short flash pulses toward the moving object; all of the flash parameters, such as the number, durations, intervals, intensities and colors of the flashes, can be controlled by the computer as needed. The sequence of object images frozen by the flash pulses, carrying the motion information, is recorded by a CCD video camera, and the resulting images are sent to a computer to be recognized and processed with special hardware and software. The obtained parameters can be displayed, output as remote control signals, or written to CD. The highest videography frequency is 30,000 images per second, and the shortest image-freezing time is several microseconds. The system has been applied in a wide range of fields: energy, chemistry, medicine, biological engineering, aerodynamics, explosion, multi-phase flow, mechanics, vibration, athletic training, weapon development and national defense engineering. It can also be used on production lines for online, real-time monitoring and control.

  18. Evaluation of stereoscopic video cameras synchronized with the movement of an operator's head on the teleoperation of the actual backhoe shovel

    NASA Astrophysics Data System (ADS)

    Minamoto, Masahiko; Matsunaga, Katsuya

    1999-05-01

    Operator performance with a remote-controlled backhoe shovel is described for three different viewing conditions: direct view, fixed stereoscopic cameras connected to a helmet-mounted display (HMD), and a rotating stereo camera slaved to the head orientation of a freely moving stereo HMD. Results showed that the head-slaved system provided the best performance.

  19. Video-based beam position monitoring at CHESS

    NASA Astrophysics Data System (ADS)

    Revesz, Peter; Pauling, Alan; Krawczyk, Thomas; Kelly, Kevin J.

    2012-10-01

    CHESS has pioneered the development of X-ray Video Beam Position Monitors (VBPMs). Unlike traditional photoelectron beam position monitors, which rely on photoelectrons generated by the fringe edges of the X-ray beam, VBPMs collect information from the whole cross-section of the X-ray beam and can also give real-time shape/size information. We have developed three types of VBPMs: (1) VBPMs based on helium luminescence from the intense white X-ray beam; in this case the CCD camera views the luminescence from the side. (2) VBPMs based on the luminescence of a thin (~50 micron) CVD diamond sheet as the white beam passes through it; the CCD camera is placed outside the beam line vacuum and views the diamond fluorescence through a viewport. (3) Scatter-based VBPMs, in which the white X-ray beam passes through a thin graphite filter or Be window and the scattered X-rays form an image of the beam's footprint on an X-ray-sensitive fluorescent screen via a slit placed outside the beam line vacuum. For all VBPMs we use relatively inexpensive 1.3-megapixel CCD cameras connected via USB to a Windows host for image acquisition and analysis. The VBPM host computers are networked and provide live images of the beam, along with streams of data on the beam position, profile and intensity, to CHESS's signal logging system and to the CHESS operator. Operational use of VBPMs has shown a great advantage over traditional BPMs by providing direct visual input for the CHESS operator. The VBPM precision in most cases is on the order of ~0.1 micron. On the down side, the data acquisition period (50-1000 ms) is inferior to that of photoelectron-based BPMs. In the future, with the use of more expensive fast cameras, we will be able to create VBPMs working at the few-hundred-Hz scale.
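
    A common way to extract position and size from a beam image of the kind the VBPM hosts stream to the operator is an intensity-weighted centroid with RMS widths. A minimal sketch (not CHESS's actual analysis code) on a synthetic Gaussian beam:

    ```python
    import numpy as np

    def beam_stats(img):
        """Centroid (cx, cy) and RMS widths (sx, sy) of a beam image,
        assuming the background has already been subtracted."""
        img = np.asarray(img, dtype=float)
        total = img.sum()
        y, x = np.indices(img.shape)
        cx = (img * x).sum() / total
        cy = (img * y).sum() / total
        sx = np.sqrt((img * (x - cx) ** 2).sum() / total)
        sy = np.sqrt((img * (y - cy) ** 2).sum() / total)
        return cx, cy, sx, sy

    # Synthetic Gaussian beam centered at (40, 20) with sigma = 3 px.
    yy, xx = np.mgrid[0:64, 0:64]
    beam = np.exp(-((xx - 40.0) ** 2 + (yy - 20.0) ** 2) / (2 * 3.0 ** 2))
    cx, cy, sx, sy = beam_stats(beam)
    ```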

  20. Readout electronics for the Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Castilla, Javier; Ballester, Otger; Cardiel, Laia; Chappa, Steve; de Vicente, Juan; Holm, Scott; Huffman, David; Kozlovsky, Mark; Martinez, Gustavo; Olsen, Jamieson; Shaw, Theresa; Stuermer, Walter

    2010-07-01

    The goal of the Dark Energy Survey (DES) is to measure the dark energy equation-of-state parameter with four complementary techniques: galaxy cluster counts, weak lensing, the angular power spectrum and type Ia supernovae. DES will survey a 5000-square-degree area of the sky in five filter bands using a new 3 deg2 mosaic camera (DECam) mounted at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam is a ~520 megapixel optical CCD camera that consists of 62 2k x 4k science sensors plus four 2k x 2k sensors for guiding. The CCDs, developed at the Lawrence Berkeley National Laboratory (LBNL) and packaged and tested at Fermilab, have been selected to obtain images efficiently at long wavelengths. A front-end electronics system has been developed specifically to perform the CCD readout. The system is based on Monsoon, an open-source image acquisition system designed by the National Optical Astronomy Observatory (NOAO). The electronics consists mainly of three types of modules: Control, Acquisition and Clock boards. The system provides a total of 132 video channels, 396 bias levels and around 1000 clock channels in order to read out the full mosaic at 250 kpixel/s with 10 e- noise performance. System configuration and data acquisition are done by means of six 0.8 Gbps optical links. Production of the whole system is currently underway. The contribution will focus on the testing, calibration and general performance of the full system in a realistic environment.

  1. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact, network-addressable, scientific-grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Because of the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic over a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and a power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electrons read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and its performance characterization is reported.

  2. Tests of commercial colour CMOS cameras for astronomical applications

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Reshetnyk, V. M.; Zhilyaev, B. E.

    2013-12-01

    We present some results of testing commercial colour CMOS cameras for astronomical applications. Colour CMOS sensors allow photometry to be performed in three filters simultaneously, which gives a great advantage over monochrome CCD detectors. The Bayer BGR colour system realized in colour CMOS sensors is close to the astronomical Johnson BVR system. The basic camera characteristics, read noise (e^{-}/pix), thermal noise (e^{-}/pix/sec) and electronic gain (e^{-}/ADU), are presented for the commercial digital camera Canon 5D Mark III, together with the same characteristics for the scientific high-performance cooled CCD camera system ALTA E47. Comparison of the test results for the Canon 5D Mark III and the CCD ALTA E47 shows that present-day commercial colour CMOS cameras can seriously compete with scientific CCD cameras in deep astronomical imaging.
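
    The electronic gain (e-/ADU) quoted among these characteristics is typically measured with the photon-transfer method: for a pair of flat fields, gain ≈ (mean1 + mean2) / var(flat1 − flat2), since shot-noise variance in ADU scales as signal/gain. A self-contained simulation of that estimate (illustrative numbers, not the cameras tested in the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true_gain = 2.5                              # e- per ADU (assumed)
    mean_electrons = 10000                       # flat-field level in e-

    # Two flat fields in ADU: Poisson shot noise divided by the gain.
    flat1 = rng.poisson(mean_electrons, size=(512, 512)) / true_gain
    flat2 = rng.poisson(mean_electrons, size=(512, 512)) / true_gain

    # Photon-transfer estimate; differencing the flats cancels the
    # fixed-pattern (pixel-to-pixel sensitivity) component.
    est_gain = (flat1.mean() + flat2.mean()) / (flat1 - flat2).var()
    ```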

  3. A CCD search for geosynchronous debris

    NASA Technical Reports Server (NTRS)

    Gehrels, Tom; Vilas, Faith

    1986-01-01

    Using the Spacewatch Camera, a search was conducted for objects in geosynchronous earth orbit. The system is equipped with a CCD camera cooled with dry ice; the image scale is 1.344 arcsec/pixel. The telescope drive was turned off so that during integrations the stars trailed while geostationary objects appeared as round images. The technique should detect geostationary objects to a limiting apparent visual magnitude of 19. A sky area of 8.8 square degrees was searched for geostationary objects, while the area searched for passing geosynchronous debris was 16.4 square degrees. Ten objects were found, of which seven are probably geostationary satellites with apparent visual magnitudes brighter than 13.1. Three objects with magnitudes equal to or fainter than 13.7 showed motion in the north-south direction. The absence of fainter stationary objects suggests that a gap in debris size exists between satellites and particles having diameters in the millimeter range.

  4. Handbook of CCD Astronomy

    NASA Astrophysics Data System (ADS)

    Howell, Steve B.

    2000-04-01

    This handbook constitutes a concise and accessible reference on all practical aspects of using Charge-Coupled Devices (CCDs). Starting with the electronic workings of these modern marvels, Steven Howell discusses their basic characteristics and then gives methods and examples for determining their values. While the focus is on using CCDs in professional observational astronomy, advanced amateur astronomers, and researchers in physics, chemistry, medical imaging, and remote sensing will also benefit from the material. Tables of useful and hard-to-find data, and key practical equations round off the book's treatment. For exercises and more information, log on to www.psi.edu/~howell/ccd.html.

  5. Camera Edge Response

    NASA Astrophysics Data System (ADS)

    Zisk, Stanley H.; Wittels, Norman

    1988-02-01

    Edge location is an important machine vision task. Machine vision systems perform mathematical operations on rectangular arrays of numbers that are intended to faithfully represent the spatial distribution of scene luminance. The numbers are produced by periodic sampling and quantization of the camera's video output. This sequence can cause artifacts to appear in the data with a noise spectrum that is high in power at high spatial frequencies. This is a problem because most edge detection algorithms are preferentially sensitive to the high-frequency content in an image. Solid state cameras can introduce errors because of the spatial periodicity of their sensor elements. This can result in problems when image edges are aligned with camera pixel boundaries: (a) some cameras introduce transients into the video signal while switching between sensor elements; (b) most cameras use analog low-pass filters to minimize sampling artifacts and these introduce video phase delays that shift the locations of edges. The problems compound when the vision system samples asynchronously with the camera's pixel rate. Moire patterns (analogous to beat frequencies) can result. In this paper, we examine and model quantization effects in a machine vision system with particular emphasis on edge detection performance. We also compare our models with experimental measurements.

  6. Fluorescence endoscopic video system

    NASA Astrophysics Data System (ADS)

    Papayan, G. V.; Kang, Uk

    2006-10-01

    This paper describes a fluorescence endoscopic video system intended for the diagnosis of diseases of the internal organs. The system operates on the basis of two-channel recording of the video fluxes from a fluorescence channel and a reflected-light channel by means of a high-sensitivity monochrome television camera and a color camera, respectively. Examples are given of the application of the device in gastroenterology.

  7. The Orthogonal Transfer CCD

    NASA Astrophysics Data System (ADS)

    Tonry, J.; Burke, Barry E.; Schechter, Paul L.

    1997-10-01

    We have designed and built a new type of CCD that we call an orthogonal transfer CCD (OTCCD), which permits parallel clocking horizontally as well as vertically. The device has been used successfully to remove image motion caused by atmospheric turbulence at rates up to 100 Hz, and promises to be a better, cheaper way to carry out image motion correction for imaging than by using fast tip/tilt mirrors. We report on the device characteristics, and find that the large number of transfers needed to track image motion does not significantly degrade the image either because of charge transfer inefficiency or because of charge traps. For example, after 100 sec of tracking at 100 Hz approximately 3% of the charge would diffuse into a skirt around the point spread function. Four nights of data at the Michigan-Dartmouth-MIT (MDM) 2.4-m telescope also indicate that the atmosphere is surprisingly benign, in terms of both the speed and coherence angle of image motion. Image motion compensation improved image sharpness by about 0.5'' in quadrature with no degradation over a field of at least 3 arcminutes. (SECTION: Astronomical Instrumentation)

  8. Video Golf

    NASA Technical Reports Server (NTRS)

    1995-01-01

    George Nauck of ENCORE!!! invented and markets the Advanced Range Performance (ARPM) Video Golf System for measuring the result of a golf swing. After Nauck requested their assistance, Marshall Space Flight Center scientists suggested video and image processing/computing technology, and provided leads on commercial companies that dealt with the pertinent technologies. Nauck contracted with Applied Research Inc. to develop a prototype. The system employs an elevated camera, which sits behind the tee and follows the flight of the ball down range, catching the point of impact and subsequent roll. Instant replay of the video on a PC monitor at the tee allows measurement of the carry and roll. The unit measures distance and deviation from the target line, as well as distance from the target when one is selected. The information serves as an immediate basis for making adjustments or as a record of skill level progress for golfers.

  9. CCD technique for longitude/latitude astronomy

    NASA Astrophysics Data System (ADS)

    Damljanović, G.; Gerstbach, G.; de Biasi, M. S.; Pejović, N.

    2003-10-01

    We report on CCD (Charge Coupled Device) experiments with instruments of astrometry and geodesy for longitude and latitude determination. At the Technical University of Vienna (TU Vienna), a mobile zenith camera "G1" was developed, based on the CCD MX916 (Starlight Xpress) and F=20 cm photo optics. With the Hipparcos/Tycho Catalogue, the first results show accuracy up to 0."5 for latitude/longitude. The PC-guided observations can be completed within 10 minutes. The camera G1 (near 4 kg) is used for astrogeodesy (geoid, Earth's crust, etc.). At the Belgrade Astronomical Observatory (AOB), the accuracy of (mean values of) latitude/longitude determinations can reach a few 0."01 using zenith stars, the Tycho-2 Catalogue and an SBIG (Santa Barbara Instrument Group) ST-8 with the zenith-telescope BLZ (D=11 cm, F=128.7 cm). The same equipment with the PIP instrument (D=20 cm and F=457.7 cm, Punta Indio PZT, near La Plata) yields slightly better accuracy than the BLZ. Both instruments, BLZ and PIP, were on the list of the Bureau International de l'Heure (BIH). The mentioned instruments offer good possibilities for semi- or fully automatic observations.

  10. Colorized linear CCD data acquisition system with automatic exposure control

    NASA Astrophysics Data System (ADS)

    Li, Xiaofan; Sui, Xiubao

    2014-11-01

    Colorized linear cameras deliver superb color fidelity at the fastest line rates in industrial inspection. Their RGB trilinear sensors eliminate image artifacts by placing a separate row of pixels for each color on a single sensor, and the advanced design minimizes the distance between rows to reduce synchronization artifacts. In this paper, a high-speed colorized linear CCD data acquisition system was designed around the linear CCD sensor μPD3728. The FPGA-based hardware and software design is introduced and the functional modules are described: the system is composed of a CCD driver module, a data buffering module, a data processing module and a computer interface module, with image data transferred to the computer over a Camera Link interface. Automatic exposure control of the linear CCD is realized with a new method: the integration time is controlled by the program and automatically adjusted for different illumination intensities under FPGA control, with fast response to brightness changes. The acquisition system also offers programmable gains and offsets for each color, and image quality is improved after calibration in the FPGA. The design is highly extensible and has practical value in many application situations.
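
    The abstract does not detail the new auto-exposure method, but a proportional controller of the kind it describes, scaling the integration time so the mean pixel level approaches a target and clamping to the sensor's valid range, can be sketched as follows (all numeric values hypothetical):

    ```python
    def adjust_exposure(t_int_us, mean_level, target=2000.0, tol=0.1,
                        t_min=10.0, t_max=50000.0):
        """One step of proportional exposure control.

        t_int_us   -- current integration time (microseconds)
        mean_level -- mean pixel level of the last line/frame (ADU)
        target     -- desired mean level (ADU)
        tol        -- dead band as a fraction of target, to avoid hunting
        """
        if abs(mean_level - target) <= tol * target:
            return t_int_us                      # within dead band: no change
        t_new = t_int_us * target / max(mean_level, 1.0)
        return min(max(t_new, t_min), t_max)     # clamp to valid range
    ```

    A hardware implementation would run the same proportional update in FPGA logic once per line period.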

  11. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera being designed jointly by the Institute of Radioelectronics and the Institute of Optoelectronics (both large units of the Warsaw University of Technology within the Faculty of Electronics and Information Technology) combines various contemporary hardware and software technologies. In hardware, it integrates a high-speed stripe projector and a stripe camera with a standard high-definition video camera. In software, it is supported by sophisticated calibration techniques that enable advanced applications such as a real-time free-viewpoint 3D viewer for moving objects or a 3D modeller for still objects.

  12. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

  13. Cameras for digital microscopy.

    PubMed

    Spring, Kenneth R

    2013-01-01

    This chapter reviews the fundamental characteristics of charge-coupled devices (CCDs) and related detectors, outlines the relevant parameters for their use in microscopy, and considers promising recent developments in detector technology. Electronic imaging with a CCD involves three stages: interaction of a photon with the photosensitive surface, storage of the liberated charge, and readout or measurement of the stored charge. The most demanding applications in fluorescence microscopy may require as much as four orders of magnitude greater sensitivity. The image in the present-day light microscope is usually acquired with a CCD camera. The CCD is composed of a large matrix of photosensitive elements (often referred to as "pixels," shorthand for picture elements) that simultaneously capture an image over the entire detector surface. The light-intensity information for each pixel is stored as electronic charge and is converted to an analog voltage by a readout amplifier. This analog voltage is subsequently converted to a numerical value by a digitizer situated on the CCD chip, or very close to it. In complementary metal oxide semiconductor (CMOS) sensors, by contrast, several (three to six) amplifiers are required for each pixel, and to date, uniform images with a homogeneous background have been a problem because of the inherent difficulty of balancing the gain in all of the amplifiers. CMOS sensors also exhibit relatively high noise associated with the requisite high-speed switching. Both of these deficiencies are being addressed, and sensor performance is nearing that required for scientific imaging. PMID:23931507
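
    The three stages described above (photon interaction, charge storage, readout and digitization) can be captured in a toy numerical model. The parameter values below are illustrative assumptions, not those of any particular sensor:

    ```python
    import numpy as np

    def ccd_response(photons, qe=0.6, full_well=30000, gain=2.0,
                     read_noise=5.0, bits=12, rng=None):
        """Toy model of the CCD imaging chain.

        1. Photon interaction: quantum efficiency plus Poisson shot noise.
        2. Charge storage: clipping at the full-well capacity.
        3. Readout: gain conversion, Gaussian read noise, digitization.
        All parameter values are illustrative assumptions.
        """
        rng = rng or np.random.default_rng()
        electrons = rng.poisson(qe * np.asarray(photons, dtype=float))
        electrons = np.minimum(electrons, full_well)      # well saturation
        adu = electrons / gain + rng.normal(0.0, read_noise / gain,
                                            electrons.shape)
        return np.clip(np.round(adu), 0, 2 ** bits - 1)   # ADC output

    # 20,000 pixels each receiving ~1000 photons -> mean ~300 ADU
    out = ccd_response(np.full(20000, 1000.0), rng=np.random.default_rng(1))
    ```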

  14. Based on line scan CCD print image detection system

    NASA Astrophysics Data System (ADS)

    Zhang, Lifeng; Xie, Kai; Li, Tong

    2015-12-01

    In this paper, a new machine-vision-based method is proposed to address the shortcomings of traditional manual inspection of printed-matter quality. A linear-array CCD camera is used for image acquisition, with a stepper motor driving the sampling; by improving the driving circuit, image acquisition at different sizes or precisions is achieved. For image processing, a standard image registration algorithm is applied; because of the characteristics of CCD image acquisition, a rigid-body transformation is usually used in the registration, thereby achieving defect detection in the printed image.
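
    A rigid-body registration of the kind used here must recover at least the translation between the reference and inspected images. A minimal phase-correlation sketch for the translational part (the paper's actual algorithm is not reproduced in the abstract):

    ```python
    import numpy as np

    def translation_offset(ref, img):
        """Estimate the integer (dy, dx) shift that maps `img` back onto
        `ref`, by locating the peak of the FFT-based phase correlation."""
        f = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
        corr = np.fft.ifft2(f / (np.abs(f) + 1e-12)).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Peaks past the midpoint correspond to negative shifts.
        if dy > ref.shape[0] // 2:
            dy -= ref.shape[0]
        if dx > ref.shape[1] // 2:
            dx -= ref.shape[1]
        return int(dy), int(dx)

    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    shifted = np.roll(ref, (4, -7), axis=(0, 1))
    dy, dx = translation_offset(ref, shifted)  # the shift that undoes the roll
    ```

    A full rigid-body registration would also estimate rotation (e.g. via log-polar resampling) before comparing the registered print against the reference.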

  15. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  16. New video pupillometer

    NASA Astrophysics Data System (ADS)

    McLaren, Jay W.; Fjerstad, Wayne H.; Ness, Anders B.; Graham, Matthew D.; Brubaker, Richard F.

    1995-03-01

    An instrument is developed to measure pupil diameter from both eyes in the dark. Each eye is monitored with a small IR video camera and pupil diameters are calculated from the video signal at a rate of 60 Hz. A processing circuit, designed around a video digitizer, a digital logic circuit, and a microcomputer, extracts pupil diameter from each video frame in real time. This circuit also highlights the detected outline of the pupil on a monitored video image of each eye. Diameters are exported to a host computer that displays, graphs, analyzes, and stores them as pupillograms. The host computer controls pupil measurements and can turn on a yellow light emitting diode mounted just above each video camera to excite the pupillary light reflex. We present examples of pupillograms to illustrate how this instrument is used to measure the pupillary light reflex and pupil motility in the dark.
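
    Extracting a diameter from each frame can be as simple as thresholding the dark pupil against the brighter iris and converting the pixel area to an equivalent circular diameter. A minimal software sketch (the instrument's actual outline detection is a real-time hardware circuit and more sophisticated; the threshold value is a hypothetical choice):

    ```python
    import numpy as np

    def pupil_diameter(frame, threshold=40):
        """Equivalent pupil diameter in pixels: count pixels darker than
        `threshold` (the pupil) and invert the circle-area formula."""
        area = np.count_nonzero(np.asarray(frame) < threshold)
        return 2.0 * np.sqrt(area / np.pi)

    # Synthetic IR frame: bright iris (200) with a dark 20 px-radius pupil.
    yy, xx = np.mgrid[0:240, 0:320]
    frame = np.where((xx - 160) ** 2 + (yy - 120) ** 2 <= 20 ** 2, 10, 200)
    d = pupil_diameter(frame)
    ```

    Calibration against a target of known size would convert pixels to millimetres.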

  17. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  18. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  19. STIS-01 CCD Functional

    NASA Astrophysics Data System (ADS)

    Valenti, Jeff

    2001-07-01

    This activity measures the baseline performance and commandability of the CCD subsystem. Only primary amplifier D is used. Bias, Dark, and Flat Field exposures are taken in order to measure read noise, dark current, CTE, and gain. Numerous bias frames are taken to permit construction of "superbias" frames in which the effects of read noise have been rendered negligible. Dark exposures are made outside the SAA. Full frame and binned observations are made, with binning factors of 1x1 and 2x2. Finally, tungsten lamp exposures are taken through narrow slits to confirm the slit positions in the current database. All exposures are internals. This is a reincarnation of SM3A proposal 8502 with some unnecessary tests removed from the program.

  20. Measurement of marine picoplankton cell size by using a cooled, charge-coupled device camera with image-analyzed fluorescence microscopy

    SciTech Connect

    Viles, C.L.; Sieracki, M.E.

    1992-02-01

    Accurate measurement of the biomass and size distribution of picoplankton cells (0.2 to 2.0 µm) is paramount in characterizing their contribution to the oceanic food web and global biogeochemical cycling. Image-analyzed fluorescence microscopy, usually based on video camera technology, allows detailed measurements of individual cells to be taken. The application of an imaging system employing a cooled, slow-scan charge-coupled device (CCD) camera to automated counting and sizing of individual picoplankton cells from natural marine samples is described. A slow-scan CCD-based camera was compared to a video camera and was superior for detecting and sizing very small, dim particles such as fluorochrome-stained bacteria. Several edge detection methods for accurately measuring picoplankton cells were evaluated. Standard fluorescent microspheres and a Sargasso Sea surface water picoplankton population were used in the evaluation. Global thresholding was inappropriate for these samples. Methods used previously in image analysis of nanoplankton cells (2 to 20 µm) also did not work well with the smaller picoplankton cells. A method combining an edge detector and an adaptive edge strength operator worked best for rapidly generating accurate cell sizes. A complete sample analysis of more than 1,000 cells averages about 50 min and yields size, shape, and fluorescence data for each cell. With this system, the entire size range of picoplankton can be counted and measured.
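
    Once the edge-detection step has produced a pixel area for each cell, conversion to size and biovolume is straightforward. A minimal sketch, assuming a hypothetical pixel scale and treating cells as spheres:

    ```python
    import numpy as np

    def cell_size(area_px, px_um=0.1):
        """Equivalent spherical diameter (um) and biovolume (um^3) from a
        cell's measured pixel area; px_um is an assumed pixel scale."""
        area_um2 = area_px * px_um ** 2
        d = 2.0 * np.sqrt(area_um2 / np.pi)   # equivalent circular diameter
        v = (np.pi / 6.0) * d ** 3            # volume of a sphere of that d
        return d, v

    # A cell covering ~314 pixels at 0.1 um/px is ~2 um across.
    d, v = cell_size(314.159)
    ```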

  1. Video monitoring system for car seat

    NASA Technical Reports Server (NTRS)

    Elrod, Susan Vinz (Inventor); Dabney, Richard W. (Inventor)

    2004-01-01

    A video monitoring system for use with a child car seat has video camera(s) mounted in the car seat. The video images are wirelessly transmitted to a remote receiver/display encased in a portable housing that can be removably mounted in the vehicle in which the car seat is installed.

  2. System for control of cooled CCD and image data processing for plasma spectroscopy

    SciTech Connect

    Mimura, M.; Kakeda, T.; Inoko, A.

    1995-12-31

A spectroscopic measurement system with spatial resolution is important for plasma study. This is especially true for measurements of plasmas without axial symmetry, such as the LHD plasma. Several years ago, we developed an imaging spectroscopy system using a CCD camera and an image-memory board in a personal computer. It proved very powerful for studying plasma-gas interaction phenomena. In that system, however, an ordinary CCD was used, so the dark-current noise of the CCD prevented the measurement of faint spectral lines. Recently, cooled CCD systems have become available for high-sensitivity measurements, but such systems are still very expensive. The cooled CCD itself, as a component, can be purchased cheaply, because amateur astronomers have begun to use it to photograph celestial bodies. We therefore developed an imaging spectroscopy system using such an inexpensive cooled CCD for plasma experiments.

  3. Study on CCD measurement of temperature field in laser molten pool

    NASA Astrophysics Data System (ADS)

    Lei, Jian-bo; Yang, Xi-chen; Wang, Yun-shan; Li, Hui-shan

    2005-01-01

The quality of laser remanufacturing depends on the temperature field distribution in the laser molten pool. In this paper, a two-dimensional temperature field model and a CCD measurement of the temperature field were developed. According to the radiant transfer function, the brightness of the light signal of the temperature field grabbed by the CCD was transformed into a spectral radiant signal. A new system model for CCD measurement of the temperature field was proposed. It included an optical system, a CCD camera, an image board, an orientating laser, special image software, and a computer. The thermal image signal received by the CCD is converted to a digital signal by the image board. After the digital image signal is processed by the computer, the temperature field distribution can be obtained from the thermal image display. It was proved that CCD measurement of the molten pool temperature field is feasible. Automatic control of laser remanufacturing processing could be achieved by feedback control of the thermal radiant signal.
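The paper's radiant transfer function is not reproduced in the abstract. As an assumed stand-in, a minimal sketch of the underlying radiometry: Planck's law at a single effective wavelength maps temperature to spectral radiance, and its inversion recovers temperature from the radiance inferred from CCD brightness (the 650 nm effective wavelength is illustrative):

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light in vacuum, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K * temp_k)
    return a / math.expm1(b)

def temp_from_radiance(wavelength_m, radiance):
    """Invert Planck's law: temperature from spectral radiance."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K)
    return b / math.log1p(a / radiance)

lam = 650e-9                          # assumed effective CCD wavelength
L1800 = planck_radiance(lam, 1800.0)  # radiance of an 1800 K pool region
print(round(temp_from_radiance(lam, L1800)))   # round-trips to 1800
```

A real melt pool deviates from a blackbody, so a calibrated emissivity or the paper's measured transfer function would replace the ideal Planck curve in practice.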

  4. Guerrilla Video: A New Protocol for Producing Classroom Video

    ERIC Educational Resources Information Center

    Fadde, Peter; Rich, Peter

    2010-01-01

    Contemporary changes in pedagogy point to the need for a higher level of video production value in most classroom video, replacing the default video protocol of an unattended camera in the back of the classroom. The rich and complex environment of today's classroom can be captured more fully using the higher level, but still easily manageable,…

  5. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.

  6. Representing videos in tangible products

    NASA Astrophysics Data System (ADS)

    Fageth, Reiner; Weiting, Ralf

    2014-03-01

Videos can be taken with nearly every camera: digital point-and-shoot cameras and DSLRs, as well as smartphones and, increasingly, so-called action cameras mounted on sports devices. The implementation of videos by generating QR codes and relevant pictures out of the video stream via a software implementation was the subject of last year's paper. This year we present first data about what content is displayed and how users represent their videos in printed products, e.g. CEWE PHOTOBOOKS and greeting cards. We report the share of the different video formats used, the number of images extracted from the video in order to represent it, the positions in the book, and different design strategies compared to regular books.

  7. World's fastest and most sensitive astronomical camera

    NASA Astrophysics Data System (ADS)

    2009-06-01

    The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240 pixel images with the world's fastest high precision faint light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). ESO PR Photo 22a/09 The CCD220 detector ESO PR Photo 22b/09 The OCam camera ESO PR Video 22a/09 OCam images "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is needed as an essential component for the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets, but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second. The new generation instruments require these

  8. Underwater camera with depth measurement

    NASA Astrophysics Data System (ADS)

    Wang, Wei-Chih; Lin, Keng-Ren; Tsui, Chi L.; Schipf, David; Leang, Jonathan

    2016-04-01

The objective of this study is to develop an RGB-D (video + depth) camera that provides three-dimensional image data for use in the haptic feedback of a robotic underwater ordnance recovery system. Two camera systems were developed and studied. The first depth camera relies on structured light (as used by the Microsoft Kinect), where the displacement of an object is determined by variations in the geometry of a projected pattern. The other camera system is based on a Time-of-Flight (ToF) depth camera. The results for the structured-light camera system show that it requires a stronger light source with a similar operating wavelength and bandwidth to achieve a desirable working distance in water. This approach might not be robust enough for our proposed underwater RGB-D camera system, as it would require a complete redesign of the light-source component. The ToF camera system, instead, allows arbitrary placement of the light source and camera. The intensity output of the broadband LED light source in the ToF camera system can be increased by arranging the LEDs in an array, and the LEDs can be modulated comfortably with any waveform and frequency required by the ToF camera. In this paper, both cameras were evaluated and experiments were conducted to demonstrate the versatility of the ToF camera.
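The abstract does not give the ToF camera's modulation parameters. As a sketch of the underlying principle only, a continuous-wave ToF camera recovers distance from the phase shift of the modulated LED light; the 20 MHz modulation frequency below is a hypothetical value, and the speed of light in water (index about 1.33) replaces the vacuum value:

```python
import math

C_WATER = 2.25e8   # approximate speed of light in water, m/s (n ~ 1.33)

def cw_tof_depth(phase_rad, mod_freq_hz, c=C_WATER):
    """Continuous-wave ToF: distance from the measured phase shift.

    The light travels out and back, hence the factor of 2 in the path
    (the 4*pi combines that 2 with the 2*pi per modulation period).
    """
    return c * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz, c=C_WATER):
    """Maximum distance before the phase wraps past 2*pi."""
    return c / (2.0 * mod_freq_hz)

f = 20e6  # assumed 20 MHz modulation frequency
print(round(unambiguous_range(f), 3))       # ~5.6 m in water
print(round(cw_tof_depth(math.pi, f), 3))   # half the unambiguous range
```

The trade-off this exposes: raising the modulation frequency improves depth resolution but shrinks the unambiguous range, which matters for choosing a working distance underwater.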

  9. CCD imager with photodetector bias introduced via the CCD register

    NASA Technical Reports Server (NTRS)

    Kosonocky, Walter F. (Inventor)

    1986-01-01

    An infrared charge-coupled-device (IR-CCD) imager uses an array of Schottky-barrier diodes (SBD's) as photosensing elements and uses a charge-coupled-device (CCD) for arranging charge samples supplied in parallel from the array of SBD's into a succession of serially supplied output signal samples. Its sensitivity to infrared (IR) is improved by placing bias charges on the Schottky barrier diodes. Bias charges are transported to the Schottky barrier diodes by a CCD also used for charge sample read-out.

  10. Video Mosaicking for Inspection of Gas Pipelines

    NASA Technical Reports Server (NTRS)

    Magruder, Darby; Chien, Chiun-Hong

    2005-01-01

    A vision system that includes a specially designed video camera and an image-data-processing computer is under development as a prototype of robotic systems for visual inspection of the interior surfaces of pipes and especially of gas pipelines. The system is capable of providing both forward views and mosaicked radial views that can be displayed in real time or after inspection. To avoid the complexities associated with moving parts and to provide simultaneous forward and radial views, the video camera is equipped with a wide-angle (>165 ) fish-eye lens aimed along the axis of a pipe to be inspected. Nine white-light-emitting diodes (LEDs) placed just outside the field of view of the lens (see Figure 1) provide ample diffuse illumination for a high-contrast image of the interior pipe wall. The video camera contains a 2/3-in. (1.7-cm) charge-coupled-device (CCD) photodetector array and functions according to the National Television Standards Committee (NTSC) standard. The video output of the camera is sent to an off-the-shelf video capture board (frame grabber) by use of a peripheral component interconnect (PCI) interface in the computer, which is of the 400-MHz, Pentium II (or equivalent) class. Prior video-mosaicking techniques are applicable to narrow-field-of-view (low-distortion) images of evenly illuminated, relatively flat surfaces viewed along approximately perpendicular lines by cameras that do not rotate and that move approximately parallel to the viewed surfaces. One such technique for real-time creation of mosaic images of the ocean floor involves the use of visual correspondences based on area correlation, during both the acquisition of separate images of adjacent areas and the consolidation (equivalently, integration) of the separate images into a mosaic image, in order to insure that there are no gaps in the mosaic image. 
The data-processing technique used for mosaicking in the present system also involves area correlation, but with several notable

  11. Scanning CCD Detector for X-ray Powder Diffraction

    NASA Astrophysics Data System (ADS)

    Madden, T.; Baldwin, J.; Von Dreele, R.; Suchomel, M.; Toby, B. H.

    2014-03-01

We discuss the design, fabrication and use of a custom CCD detector for x-ray powder diffraction measurements. The detector is mounted on a diffractometer arm, where line-by-line readout of the CCD is coupled to continuous motion of the arm. As the arm moves, the data from the CCD detector are accumulated and can be viewed as if it were a "film strip" with partial powder diffraction rings. Because of the unique design of the camera, both high-resolution and rapid measurements can be performed. Powder diffraction patterns are collected in a few minutes or less, with many of the advantages of large area position-sensitive detectors (for example amorphous silicon flat panels), such as high sensitivity, direct evidence of grainy samples and freedom from low-angle asymmetry, but with resolution better than linear position-sensitive detectors and nearly as good as the ultimate in resolution, analyser-crystal detection [2,3].

  12. Single-spin CCD.

    PubMed

    Baart, T A; Shafiei, M; Fujita, T; Reichl, C; Wegscheider, W; Vandersypen, L M K

    2016-04-01

    Spin-based electronics or spintronics relies on the ability to store, transport and manipulate electron spin polarization with great precision. In its ultimate limit, information is stored in the spin state of a single electron, at which point quantum information processing also becomes a possibility. Here, we demonstrate the manipulation, transport and readout of individual electron spins in a linear array of three semiconductor quantum dots. First, we demonstrate single-shot readout of three spins with fidelities of 97% on average, using an approach analogous to the operation of a charge-coupled device (CCD). Next, we perform site-selective control of the three spins, thereby writing the content of each pixel of this 'single-spin charge-coupled device'. Finally, we show that shuttling an electron back and forth in the array hundreds of times, covering a cumulative distance of 80 μm, has negligible influence on its spin projection. Extrapolating these results to the case of much larger arrays points at a diverse range of potential applications, from quantum information to imaging and sensing. PMID:26727201

  13. Single-spin CCD

    NASA Astrophysics Data System (ADS)

    Baart, T. A.; Shafiei, M.; Fujita, T.; Reichl, C.; Wegscheider, W.; Vandersypen, L. M. K.

    2016-04-01

    Spin-based electronics or spintronics relies on the ability to store, transport and manipulate electron spin polarization with great precision. In its ultimate limit, information is stored in the spin state of a single electron, at which point quantum information processing also becomes a possibility. Here, we demonstrate the manipulation, transport and readout of individual electron spins in a linear array of three semiconductor quantum dots. First, we demonstrate single-shot readout of three spins with fidelities of 97% on average, using an approach analogous to the operation of a charge-coupled device (CCD). Next, we perform site-selective control of the three spins, thereby writing the content of each pixel of this ‘single-spin charge-coupled device’. Finally, we show that shuttling an electron back and forth in the array hundreds of times, covering a cumulative distance of 80 μm, has negligible influence on its spin projection. Extrapolating these results to the case of much larger arrays points at a diverse range of potential applications, from quantum information to imaging and sensing.

  14. CCD correlated quadruple sampling processor

    NASA Technical Reports Server (NTRS)

    Gaalema, S. D. (Inventor)

    1981-01-01

    A correlated quadruple sampling processor for improved signal-to-noise ratio in the output of a charge-coupled device (CCD) is comprised of: switching means for momentarily clamping a CCD signal line at a first reference level A before a CCD data pulse and then obtaining a first data sample B with respect to the reference A during a CCD data pulse, and storing the positive sample B-A; switching means for momentarily clamping the CCD signal line a second time at the level C during the presence of the CCD data pulse and then obtaining a second data sample D with respect to the reference level C after the CCD data pulse, and storing the negative sample D-C; and means for obtaining the difference between the stored samples +(B-A) and -(D-C), thus increasing the net signal amplitude by a factor of about 2 while the noise would be increased by only a factor of square root of 2 since there will be no correlation in the noise between the double samples +(B-A) and -(D-C) effectively added.
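The noise arithmetic in this abstract can be checked with a small Monte Carlo sketch (an illustrative noise model, not the patent's circuit; the variable names mirror the four samples A-D described above, with the data pulse present on B and C):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
signal = 10.0                     # CCD data-pulse amplitude (arbitrary units)
sigma = 1.0                       # rms noise per sample

# Four samples per pixel: clamp A before the pulse, sample B during it,
# clamp C during the pulse, sample D after it.
A, B, C, D = (rng.normal(0.0, sigma, n) for _ in range(4))
single = (B + signal) - A                       # correlated double sampling
quad = ((B + signal) - A) + ((C + signal) - D)  # correlated quadruple sampling

# The signal doubles while the four uncorrelated noise terms add in
# quadrature (factor sqrt(2) over double sampling), so the SNR improves
# by about sqrt(2), as the abstract states.
snr_single = single.mean() / single.std()
snr_quad = quad.mean() / quad.std()
print(round(snr_quad / snr_single, 2))
```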

  15. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  16. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
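As a quick consistency check of the quoted numbers, the small-angle plate-scale relation (scale in radians per pixel equals pixel size over focal length; 206265 arcsec per radian is the standard conversion) recovers the effective focal length of the Blanco prime focus with its corrector:

```python
PIXEL_SIZE_M = 15e-6        # DECam pixel pitch, from the text
PLATE_SCALE_ARCSEC = 0.263  # arcsec per pixel, from the text
ARCSEC_PER_RAD = 206265.0

# plate_scale [rad/px] = pixel_size / focal_length, so:
focal_length_m = ARCSEC_PER_RAD * PIXEL_SIZE_M / PLATE_SCALE_ARCSEC
print(round(focal_length_m, 2))   # ~11.76 m effective focal length
```

An 11.76 m effective focal length on a 4 m aperture corresponds to roughly f/2.9 at prime focus, consistent with a wide-field prime-focus corrector design.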

  17. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel-1. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  18. Use of a high-resolution profiling sonar and a towed video camera to map a Zostera marina bed, Solent, UK

    NASA Astrophysics Data System (ADS)

    Lefebvre, A.; Thompson, C. E. L.; Collins, K. J.; Amos, C. L.

    2009-04-01

    Seagrasses are flowering plants that develop into extensive underwater meadows and play a key role in the coastal ecosystem. In the last few years, several techniques have been developed to map and monitor seagrass beds in order to protect them. Here, we present the results of a survey using a profiling sonar, the Sediment Imager Sonar (SIS) and a towed video sledge to study a Zostera marina bed in the Solent, southern UK. The survey aimed to test the instruments for seagrass detection and to describe the area for the first time. On the acoustic data, the bed produced the strongest backscatter along a beam. A high backscatter above the bottom indicated the presence of seagrass. The results of an algorithm developed to detect seagrass from the sonar data were tested against video data. Four parameters were calculated from the SIS data: water depth, a Seagrass Index (average backscatter 10-15 cm above the bed), canopy height (height above the bed where the backscatter crosses a threshold limit) and patchiness (percentage of beams in a sweep where the backscatter 10-15 cm above the bed is greater than a threshold limit). From the video, Zostera density was estimated together with macroalgae abundance and bottom type. Patchiness calculated from the SIS data was strongly correlated to seagrass density evaluated from the video, indicating that this parameter could be used for seagrass detection. The survey area has been classified based upon seagrass density, macroalgae abundance and bottom type. Only a small area was occupied by a dense canopy whereas most of the survey area was characterised by patchy seagrass. Results indicated that Zostera marina developed only on sandy bottoms and was not found in regions of gravel. Furthermore, it was limited to a depth shallower than 1.5 m below the level of Lowest Astronomical Tide and present in small patches across the intertidal zone. The average canopy height was 15 cm and the highest density was 150 shoots m -2.
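The Seagrass Index and patchiness definitions quoted above can be sketched directly from per-beam backscatter profiles; this is a hypothetical reconstruction of those two parameters (the bin layout, threshold value, and toy sweep below are assumptions, not the authors' data):

```python
import numpy as np

def sweep_parameters(backscatter, heights_cm, threshold):
    """Canopy metrics for one sonar sweep.

    backscatter: (n_beams, n_bins) backscatter per beam,
    heights_cm:  height above the detected bed for each bin.
    Returns (seagrass_index, patchiness_percent) per the abstract's
    definitions: mean backscatter 10-15 cm above the bed, and the
    percentage of beams whose mean in that band exceeds a threshold.
    """
    band = (heights_cm >= 10) & (heights_cm <= 15)
    band_mean = backscatter[:, band].mean(axis=1)   # per-beam mean in band
    seagrass_index = band_mean.mean()
    patchiness = 100.0 * (band_mean > threshold).mean()
    return seagrass_index, patchiness

heights = np.arange(0, 30)             # 1 cm bins above the bed (assumed)
bs = np.ones((4, 30))                  # 4 beams, flat background return
bs[:2, 10:16] += 5.0                   # canopy echo on two of the beams
si, patch = sweep_parameters(bs, heights, threshold=3.0)
print(patch)   # 50.0: half of the beams flagged as vegetated
```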

  19. System of video observation for electron beam welding process

    NASA Astrophysics Data System (ADS)

    Laptenok, V. D.; Seregin, Y. N.; Bocharov, A. N.; Murygin, A. V.; Tynchenko, V. S.

    2016-04-01

Equipment for a video observation system for the electron beam welding process was developed. The design of the video observation system reduces negative effects on the video camera during electron beam welding and yields high-quality images of the process.

  20. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.
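The gain-setting rule described above (a single streak-tube photoelectron recorded at about five times the recording-system noise) can be written down directly; the coupling-efficiency figure below is a hypothetical placeholder for the lens-coupled CCD path, not a value from the paper:

```python
def optimum_intensifier_gain(recording_noise_e, snr_target=5.0,
                             coupling_efficiency=0.1):
    """Gain so one streak-tube photoelectron records at snr_target x noise.

    recording_noise_e:   rms noise of the recording system, in CCD electrons
    coupling_efficiency: assumed fraction of intensifier output photons that
                         become CCD electrons through the coupling lens
    """
    return snr_target * recording_noise_e / coupling_efficiency

# e.g. 20 e- recording noise with 10% lens coupling:
print(round(optimum_intensifier_gain(20.0)))   # gain of 1000
```

Setting the gain this way guarantees single-photoelectron detection while leaving the rest of the intensifier and recording-system dynamic range for signal, which is the trade-off the paper optimizes.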

  1. Video sensor with range measurement capability

    NASA Technical Reports Server (NTRS)

    Briscoe, Jeri M. (Inventor); Corder, Eric L. (Inventor); Howard, Richard T. (Inventor); Broderick, David J. (Inventor)

    2008-01-01

    A video sensor device is provided which incorporates a rangefinder function. The device includes a single video camera and a fixed laser spaced a predetermined distance from the camera for, when activated, producing a laser beam. A diffractive optic element divides the beam so that multiple light spots are produced on a target object. A processor calculates the range to the object based on the known spacing and angles determined from the light spots on the video images produced by the camera.
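The range calculation from known spacing and measured spot angles is plain triangulation. The patent's exact geometry is not given in the abstract, so this sketch assumes the simplest case: a fixed laser beam parallel to the camera axis, offset by a known baseline (the pixel focal length and offsets are illustrative):

```python
import math

def spot_angle(pixel_offset, focal_length_px):
    """Off-axis angle of a laser spot from its image-plane pixel offset."""
    return math.atan2(pixel_offset, focal_length_px)

def range_from_spot(baseline_m, spot_angle_rad):
    """Triangulated range: the laser is offset baseline_m from the camera
    axis, and the camera sees the projected spot at spot_angle_rad
    off-axis (parallel-beam geometry assumed)."""
    return baseline_m / math.tan(spot_angle_rad)

# Spot imaged 50 px off-centre with an 800 px focal length, 10 cm baseline:
theta = spot_angle(50, 800)
print(round(range_from_spot(0.10, theta), 3))   # 1.6 m
```

With multiple spots from the diffractive optic, each spot yields an independent range estimate, which is presumably why the device divides the beam rather than projecting a single point.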

  2. IR Hiding: A Method to Prevent Video Re-shooting by Exploiting Differences between Human Perceptions and Recording Device Characteristics

    NASA Astrophysics Data System (ADS)

    Yamada, Takayuki; Gohshi, Seiichi; Echizen, Isao

A method is described to prevent images and videos displayed on screens from being re-shot by digital cameras and camcorders. Conventional methods using digital watermarking for re-shooting prevention embed content IDs into images and videos, and they help to identify the place and time where the actual content was shot. However, these methods do not actually prevent digital content from being re-shot by camcorders. We developed countermeasures that stop re-shooting by exploiting the differences between the sensory characteristics of humans and devices. The countermeasures require no additional functions on user-side devices. Infrared light (IR) is used to corrupt the content recorded by CCD or CMOS devices, so that re-shot content will be unusable. To validate the method, we developed a prototype system and implemented it on a 100-inch cinema screen. Experimental evaluations showed that the method effectively prevents re-shooting.

  3. Enhanced performance CCD output amplifier

    DOEpatents

    Dunham, Mark E.; Morley, David W.

    1996-01-01

A low-noise FET amplifier is connected to amplify the output charge from a charge-coupled device (CCD). The FET has its gate connected to the CCD in common-source configuration, receiving the output charge signal from the CCD and outputting an intermediate signal at the drain of the FET. An intermediate amplifier is connected to the drain of the FET for receiving the intermediate signal and outputting a low-noise signal functionally related to the output charge signal from the CCD. The amplifier is preferably connected as a virtual ground to the FET drain. The inherent shunt capacitance of the FET is selected to be at least equal to the sum of the remaining capacitances.

  4. Scientific CCD characterisation at Universidad Complutense LICA Laboratory

    NASA Astrophysics Data System (ADS)

    Tulloch, S.; Gil de Paz, A.; Gallego, J.; Zamorano, J.; Tapia, Carlos

    2012-07-01

A CCD test-bench has been built at the Universidad Complutense's LICA laboratory. It is initially intended for commissioning of the MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) instrument but can be considered a general-purpose scientific CCD test-bench. The test-bench uses an incandescent broad-band light source in combination with a monochromator and two filter wheels to provide programmable narrow-band illumination across the visible band. Light from the monochromator can be directed to an integrating sphere for flat-field measurements or sent via a small aperture directly onto the CCD under test for high-accuracy diode-mode quantum efficiency measurements. Point spread function measurements can also be performed by interposing additional optics between the sphere and the CCD under test. The whole system is under LabView control via a clickable GUI. Automated measurement scans of quantum efficiency can be performed, requiring only that the user replace the CCD under test with a calibrated photodiode after each measurement run. A 20 cm diameter cryostat with a 10 cm window and a Brooks Polycold PCC closed-cycle cooler also forms part of the test-bench. This cryostat is large enough to accommodate almost all scientific CCD formats and has initially been used to house an E2V CCD230 in order to fully prove the test-bench functionality. This device is read out using an Astronomical Research Camera controller connected to the UKATC's UCAM data acquisition system.
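The diode-mode QE measurement described above reduces to comparing electrons counted by the CCD with photons inferred from the calibrated photodiode reading in the same beam. A minimal sketch of that comparison follows; all numbers (diode current, diode QE, CCD signal) are illustrative, not measurements from the LICA bench:

```python
Q_E = 1.602e-19   # electron charge, C

def photon_flux(diode_current_a, diode_qe):
    """Incident photons/s inferred from a calibrated photodiode reading:
    current -> electrons/s, divided by the diode's known QE."""
    return diode_current_a / Q_E / diode_qe

def ccd_quantum_efficiency(ccd_electrons, exposure_s,
                           diode_current_a, diode_qe):
    """Diode-mode QE: CCD electrons per incident photon, assuming the CCD
    and photodiode are swapped into the same narrow-band beam."""
    incident = photon_flux(diode_current_a, diode_qe) * exposure_s
    return ccd_electrons / incident

# Illustrative: 1 nA on a diode with QE 0.8 for a 1 s exposure,
# with the CCD collecting 4.4e9 electrons in the same beam:
qe = ccd_quantum_efficiency(4.4e9, 1.0, 1e-9, 0.8)
print(round(qe, 2))   # ~0.56
```

Repeating this at each monochromator setting yields the QE-versus-wavelength scan that the bench automates.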

  5. STS-134 Launch Composite Video Comparison

    NASA Video Gallery

    A side-by-side comparison video shows a one-camera view of the STS-134 launch (left) with the six-camera composited view (right). Imaging experts funded by the Space Shuttle Program and located at ...

  6. The NEAT Camera Project

    NASA Technical Reports Server (NTRS)

Newburn, Ray L., Jr.

    1995-01-01

    The NEAT (Near Earth Asteroid Tracking) camera system consists of a camera head with a 6.3 cm square 4096 x 4096 pixel CCD, fast electronics, and a Sun Sparc 20 data and control computer with dual CPUs, 256 Mbytes of memory, and 36 Gbytes of hard disk. The system was designed for optimum use with an Air Force GEODSS (Ground-based Electro-Optical Deep Space Surveillance) telescope. The GEODSS telescopes have 1 m f/2.15 objectives of the Ritchey-Chretian type, designed originally for satellite tracking. Installation of NEAT began July 25 at the Air Force Facility on Haleakala, a 3000 m peak on Maui in Hawaii.

  7. Timing of satellite observations for telescope with TV CCD camera

    NASA Astrophysics Data System (ADS)

    Dragomiretskoy, V. V.; Koshkin, N. I.; Korobeinikova, E. A.; Melikyants, C. M.; Ryabov, A. V.; Strahova, S. L.; Terpan, S. S.; Shakun, L. S.

    2013-12-01

The time reference system used for linking satellite position and brightness measurements to the universal time scale UTC is described. It is used at the Odessa astronomical observatory and provides a stable error not exceeding 0.1 ms in absolute value. The achieved timing accuracy allows us to study very short-term satellite brightness variations and the actual unevenness of the satellite's orbital motion.

  8. Inspection focus technology of space tridimensional mapping camera based on astigmatic method

    NASA Astrophysics Data System (ADS)

    Wang, Zhi; Zhang, Liping

    2010-10-01

    Under the conditions of the space environment and the vibration and shock of satellite launch, the CCD plane of a space tridimensional mapping camera will deviate from the focal plane (including deviation caused by a change in camera focal length), and image resolution will degrade because of defocusing. For a tridimensional mapping camera, variations in the principal point position and focal length affect the positioning accuracy of ground targets. The conventional solution is to calibrate the position of the CCD plane against the code of a photoelectric encoder under vacuum over the focusing range; when the camera defocuses in orbit, the magnitude and direction of the defocus are obtained from the photoelectric encoder, and a focusing mechanism driven by a step motor compensates the defocus of the CCD plane. If the camera focal length itself changes under launch vibration and shock, however, this focusing method becomes meaningless. Thus, a measuring and focusing method based on astigmatism was put forward: a quadrant detector is adopted to measure the astigmatism caused by deviation of the CCD plane, and by reference to the calibrated relation between the CCD plane position and the astigmatism, the deviation vector of the CCD plane can be obtained. This method accounts for all factors causing deviation of the CCD plane. Experimental results show that the focusing resolution of a mapping camera focusing mechanism based on the astigmatic method can reach 0.25 μm.
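    The quadrant-detector measurement described above is conventionally reduced to a normalized focus-error signal: with an astigmatic optic, defocus elongates the spot along one diagonal of the detector, so the difference of diagonal sums encodes the sign and (near focus) the magnitude of the deviation. A minimal sketch of the standard astigmatic formula (the quadrant labels and sample intensities are illustrative, not taken from the paper):

```python
def focus_error_signal(a: float, b: float, c: float, d: float) -> float:
    """Normalized astigmatic focus-error signal from quadrant intensities.

    Quadrants are labeled clockwise A, B, C, D. Defocus elongates the
    spot along one diagonal, so (A+C)-(B+D), normalized by the total
    flux, gives a signed, intensity-independent focus error.
    """
    total = a + b + c + d
    if total == 0:
        raise ValueError("no light on detector")
    return ((a + c) - (b + d)) / total

# In focus: a round spot illuminates all quadrants equally -> FES = 0
print(focus_error_signal(1.0, 1.0, 1.0, 1.0))   # 0.0
# Defocused: the spot elongates along the A-C diagonal -> FES > 0
print(focus_error_signal(1.5, 0.5, 1.5, 0.5))   # 0.5
```

    The calibrated relation between FES and CCD-plane position then maps this dimensionless signal to a physical defocus distance for the step-motor compensation.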

  9. High-accuracy x-ray imaging: screen, lens, and CCD

    NASA Astrophysics Data System (ADS)

    Zeman, Herbert D.; DiBianca, Frank A.; Thomason, Donald B.; Sebes, Jeno I.; Lovhoiden, Gunnar; Liao, D.-Z.; Kambeyanda, Dona

    1993-09-01

    A liquid nitrogen cooled CCD TV camera from Astromed, Ltd. has been used for quantitative X-ray medical imaging. The CCD camera is coupled to a Kodak Lanex Regular X-ray intensifying screen with a 5:1 macro lens for bone mineral densitometry of the femur of a rat for a study of the development of osteoporosis. As a feasibility study of the use of the CCD for mammography, a 2:1 macro lens has been used to couple the CCD to a clear CsI(Tl) crystal, 100 mm in diameter and 3 mm thick. The spatial resolution and quantum efficiency of the system are significantly improved by replacing the Lanex Regular screen with the CsI(Tl) crystal.

  10. Sensors for the Hubble Space Telescope wide field and planetary cameras (1 and 2)

    NASA Technical Reports Server (NTRS)

    Trauger, John T.

    1990-01-01

    The CCD technology of the second wide field planetary camera (WFPC-2) is examined with reference to the WFPC-1 experience. Strategies are presented for elimination of quantum efficiency (QE) hysteresis and implementation of maintenance-free QE stability, improved far-UV performance, on-orbit photometric calibrations, refinements in CCD electronics, and anticipated CCD particle radiation effects. Absorption depth vs. wavelength in silicon and a cross section of the CCD membrane are shown.

  11. Development of a CCD array as an imaging detector for advanced X-ray astrophysics facilities

    NASA Technical Reports Server (NTRS)

    Schwartz, D. A.

    1981-01-01

    The development of a charge coupled device (CCD) X-ray imager for a large aperture, high angular resolution X-ray telescope is discussed. Existing CCDs were surveyed and three candidate concepts were identified. An electronic camera control and computer interface, including software to drive a Fairchild 211 CCD, is described. In addition a vacuum mounting and cooling system is discussed. Performance data for the various components are given.

  12. CCD detector development projects by the Beamline Technical Support Group at the Advanced Photon Source

    NASA Astrophysics Data System (ADS)

    Lee, John H.; Fernandez, Patricia; Madden, Tim; Molitsky, Michael; Weizeorick, John

    2007-11-01

    This paper will describe two ongoing detector projects being developed by the Beamline Technical Support Group at the Advanced Photon Source (APS) at Argonne National Laboratory (ANL). The first project is the design and construction of two detectors: a single-CCD system and a two-by-two Mosaic CCD camera for Small-Angle X-ray Scattering (SAXS). Both of these systems utilize the Kodak KAF-4320E CCD coupled to fiber optic tapers, custom mechanical hardware, electronics, and software developed at ANL. The second project is a Fast-CCD (FCCD) detector being developed in a collaboration between ANL and Lawrence Berkeley National Laboratory (LBNL). This detector will use ANL-designed readout electronics and a custom LBNL-designed CCD, with 480×480 pixels and 96 outputs, giving very fast readout.
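    The "very fast readout" claim follows from splitting the frame across 96 parallel output ports: each port only has to digitize 480 × 480 / 96 = 2400 pixels per frame. A back-of-the-envelope sketch (the 10 Mpixel/s per-port rate is an assumed, illustrative figure, not a published specification):

```python
def frame_readout_time_s(n_rows: int, n_cols: int, n_outputs: int,
                         pixel_rate_hz: float) -> float:
    """Lower-bound frame readout time for a multi-output CCD.

    Pixels are assumed to be split evenly across parallel output ports;
    overheads such as row transfers and overscan are ignored.
    """
    pixels_per_output = (n_rows * n_cols) / n_outputs
    return pixels_per_output / pixel_rate_hz

# FCCD geometry from the abstract; per-port pixel rate is assumed.
t = frame_readout_time_s(480, 480, 96, 10e6)
print(f"{t * 1e6:.0f} us per frame -> ~{1 / t:.0f} frames/s")
```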

  14. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude, with the boost in transmitting power currently demonstrated at approximately 100 m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator.
A low-volume array of such penetrator cameras could be deployed from an

  15. Classical astrometry longitude and latitude determination by using CCD technique

    NASA Astrophysics Data System (ADS)

    Damljanović, G.; de Biasi, M. S.; Gerstbach, G.

    At the AOB, the instrument is the zenith-telescope (D=11 cm, F=128.7 cm, denoted BLZ in the list of the Bureau International de l'Heure - BIH), and at Punta Indio (near La Plata) it is the photographic zenith tube (D=20 cm, F=457.7 cm, denoted PIP in the BIH list). At the AOB there is a CCD camera ST-8 of Santa Barbara Instrument Group (SBIG) with 1530×1020 pixels, 9×9 micron pixel size and 13.8×9.2 mm array dimensions. We investigated the possibilities for longitude (λ) and latitude (φ) determinations using the ST-8 with BLZ and PIP, and our predicted level of accuracy is a few 0."01 from one CCD processing of zenith stars with the Tycho-2 Catalogue. Also, astro-geodesy has gained new practicality with CCDs (to reach a good accuracy of geoid determination via astro-geodetic λ and φ observations). At the TU Wien there is the CCD MX916 of Starlight Xpress (with 752×580 pixels, 11×12 microns, 8.7×6.5 mm active area). Our predicted level of accuracy for λ and φ measurements is a few 0."1 from one CCD MX916 processing of zenith stars, with small optics (20 cm focal length, since the instrument is mobile rather than fixed) and Tycho-2. A transportable zenith camera with a CCD is under development at the TU Wien for astro-geodetic applications.
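    The difference between the predicted 0."01 and 0."1 accuracy levels largely tracks the pixel footprint on the sky, which follows from the standard small-angle plate-scale formula (206265 arcseconds per radian) applied to the instrument parameters quoted above; the formula is standard astrometry, not taken from the record:

```python
def plate_scale_arcsec_per_pixel(pixel_size_um: float,
                                 focal_length_cm: float) -> float:
    """Plate scale via the small-angle formula: 206265 arcsec per radian."""
    return 206265.0 * (pixel_size_um * 1e-6) / (focal_length_cm * 1e-2)

# ST-8 (9 um pixels) on the BLZ zenith-telescope (F = 128.7 cm)
print(f"BLZ + ST-8:  {plate_scale_arcsec_per_pixel(9.0, 128.7):.2f} arcsec/pixel")
# MX916 (11 um pixels) on the mobile 20 cm focal-length optics
print(f"mobile+MX916: {plate_scale_arcsec_per_pixel(11.0, 20.0):.1f} arcsec/pixel")
```

    Roughly 1.4″/pixel versus 11″/pixel: centroiding to a small fraction of a pixel then plausibly yields the quoted few-0."01 and few-0."1 levels, respectively.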

  16. Data acquisition and control system for high-performance large-area CCD systems

    NASA Astrophysics Data System (ADS)

    Afanasieva, I. V.

    2015-04-01

    Astronomical CCD systems based on second-generation DINACON controllers were developed at the SAO RAS Advanced Design Laboratory more than seven years ago and since then have been in constant operation at the 6-meter and Zeiss-1000 telescopes. Such systems use monolithic large-area CCDs. We describe the software developed for the control of a family of large-area CCD systems equipped with a DINACON-II controller. The software suite serves for acquisition, primary reduction, visualization, and storage of video data, and also for the control, setup, and diagnostics of the CCD system.

  17. Video Toroid Cavity Imager

    SciTech Connect

    Gerald, Rex E. II; Sanchez, Jairo; Rathke, Jerome W.

    2004-08-10

    A video toroid cavity imager for in situ measurement of electrochemical properties of an electrolytic material sample includes a cylindrical toroid cavity resonator containing the sample and employs NMR and video imaging for providing high-resolution spectral and visual information of molecular characteristics of the sample on a real-time basis. A large magnetic field is applied to the sample under controlled temperature and pressure conditions to simultaneously provide NMR spectroscopy and video imaging capabilities for investigating electrochemical transformations of materials or the evolution of long-range molecular aggregation during cooling of hydrocarbon melts. The video toroid cavity imager includes a miniature commercial video camera with an adjustable lens, a modified compression coin cell imager with a flat circular principal detector element, and a sample mounted on a transparent circular glass disk, and provides NMR information as well as a video image of a sample, such as a polymer film, with micrometer resolution.

  18. High-speed video analysis system using multiple shuttered charge-coupled device imagers and digital storage

    NASA Astrophysics Data System (ADS)

    Racca, Roberto G.; Stephenson, Owen; Clements, Reginald M.

    1992-06-01

    A fully solid state high-speed video analysis system is presented. It is based on the use of several independent charge-coupled device (CCD) imagers, each shuttered by a liquid crystal light valve. The imagers are exposed in rapid succession and are then read out sequentially at standard video rate into digital memory, generating a time-resolved sequence with as many frames as there are imagers. This design allows the use of inexpensive, consumer-grade camera modules and electronics. A microprocessor-based controller, designed to accept up to ten imagers, handles all phases of the recording from exposure timing to image capture and storage to playback on a standard video monitor. A prototype with three CCD imagers and shutters has been built. It has allowed successful three-image video recordings of phenomena such as the action of an air rifle pellet shattering a piece of glass, using a high-intensity pulsed light emitting diode as the light source. For slower phenomena, recordings in continuous light are also possible by using the shutters themselves to control the exposure time. The system records full-screen black and white images with spatial resolution approaching that of standard television, at rates up to 5000 images per second.
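    The controller's exposure sequencing described above can be illustrated with a simple schedule generator: each shuttered imager opens one interframe period after the previous one, then all are read out sequentially at standard video rate. A hypothetical sketch, not the actual controller logic (the 100 μs exposure time is an assumed value):

```python
def exposure_schedule(n_imagers: int, frame_rate_hz: float,
                      exposure_s: float) -> list[tuple[float, float]]:
    """Return (shutter_open, shutter_close) times for each shuttered imager.

    The imagers fire in rapid succession, one interframe period apart,
    producing a time-resolved sequence with as many frames as imagers.
    """
    interframe = 1.0 / frame_rate_hz
    if exposure_s > interframe:
        raise ValueError("exposure longer than interframe period")
    return [(i * interframe, i * interframe + exposure_s)
            for i in range(n_imagers)]

# Three imagers at 5000 frames/s with 100 us exposures, as in the prototype
for open_t, close_t in exposure_schedule(3, 5000.0, 100e-6):
    print(f"open {open_t * 1e6:6.1f} us   close {close_t * 1e6:6.1f} us")
```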

  19. Pinhole Camera For Viewing Electron Beam Materials Processing

    NASA Astrophysics Data System (ADS)

    Rushford, M. C.; Kuzmenko, P. J.

    1986-10-01

    A very rugged, compact (4x4x10 inches), gas purged "PINHOLE CAMERA" has been developed for viewing electron beam materials processing (e.g. melting or vaporizing metal). The video image is computer processed, providing dimensional and temperature measurements of objects within the field of view, using an IBM PC. The "pinhole camera" concept is similar to a TRW optics system for viewing into a coal combustor through a 2 mm hole. Gas is purged through the hole to repel particulates from optical surfaces. In our system, light from the molten metal passes through the 2 mm hole "PINHOLE", reflects off an aluminum coated glass substrate, and passes through a window into a vacuum tight container holding the camera and optics at atmospheric pressure. The mirror filters out X rays, which pass through the Al layer and are absorbed in the glass mirror substrate. Since metallic coatings are usually reflective, the image quality is not severely degraded by small amounts of vapor that overcome the gas purge to reach the mirror. Coating thicknesses of up to 2 microns can be tolerated. The mirror is the only element needing occasional servicing. We used a telescope eyepiece as a convenient optical design, but with the traditional optical path reversed. The eyepiece images a scene through a small entrance aperture onto an image plane where a CCD camera is placed. Since the iris of the eyepiece is fixed and the scene intensity varies, it was necessary to employ a variable neutral density filter for brightness control. Devices used for this purpose include a PLZT light valve from Motorola, mechanically rotated linear polarizer sheets, and nematic liquid crystal light valves. These were placed after the mirror and entrance aperture but before the lens to operate as a voltage variable neutral density filter. The molten metal surface temperature being viewed varies from 4000 down to 1200 K. The resultant intensity change (at 488 nm with 10 nm bandwidth) is seven orders of magnitude. This
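    The quoted seven-orders-of-magnitude intensity change follows from Planck's law evaluated at 488 nm over the 1200-4000 K temperature range; a short sketch checking the figure:

```python
import math

def planck_radiance(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance from Planck's law (SI units)."""
    h = 6.62607015e-34  # Planck constant, J*s
    c = 2.99792458e8    # speed of light, m/s
    k = 1.380649e-23    # Boltzmann constant, J/K
    x = h * c / (wavelength_m * k * temp_k)
    return (2 * h * c**2 / wavelength_m**5) / math.expm1(x)

wavelength = 488e-9  # viewing wavelength from the abstract
ratio = planck_radiance(wavelength, 4000.0) / planck_radiance(wavelength, 1200.0)
print(f"intensity ratio, 4000 K vs 1200 K at 488 nm: {ratio:.2g}")
print(f"orders of magnitude: {math.log10(ratio):.1f}")
```

    The ratio comes out near 3e7, about 7.5 orders of magnitude, consistent with the abstract and with the need for a voltage-variable neutral density filter.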

  20. Image responses to x-ray radiation in ICCD camera

    NASA Astrophysics Data System (ADS)

    Ma, Jiming; Duan, Baojun; Song, Yan; Song, Guzhou; Han, Changcai; Zhou, Ming; Du, Jiye; Wang, Qunshu; Zhang, Jianqi

    2013-08-01

    When used in digital radiography, an ICCD camera will inevitably be irradiated by x-rays and the output image will degrade. In this research, we separated the ICCD camera into two opto-electronic parts, the CCD camera and the MCP image intensifier, and irradiated them respectively with a Co-60 gamma-ray source and a pulsed x-ray source. By changing the timing between the radiation and the shutter of the CCD camera, and the power-supply state of the MCP image intensifier, significant differences were observed in the output images. Further analysis revealed the influence of the CCD chip and readout circuit in the CCD camera, and of the photocathode, microchannel plate, and fluorescent screen in the MCP image intensifier, on the image quality of an irradiated ICCD camera. The study demonstrated that, compared with the other parts, the irradiation response of the readout circuit is very slight and in most cases negligible. The interaction of x-rays with the CCD chip usually appears as bright spots or a rough background in the output images, depending on the x-ray dose. As for the MCP image intensifier, the photocathode and microchannel plate are the two main stages that degrade the output images. When irradiated by x-rays, the microchannel plate tends to contribute a bright background to the output images, while the background caused by the photocathode is brighter and more fluctuating. The image responses of the fluorescent screen in the MCP image intensifier and of a coupling fiber bundle are also evaluated in this presentation.