Science.gov

Sample records for video ccd camera

  1. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  2. CCD Camera

    DOEpatents

    Roth, Roger R. (Minnetonka, MN)

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  3. A CCD-Based Video Camera For Medical X-Ray Imaging

    NASA Astrophysics Data System (ADS)

    Snoeren, Rudolph M.

    1989-05-01

    Using an anamorphic imaging technique, the solid-state image sensor can replace the vacuum pick-up tube in medical X-ray diagnostics, at least for the standard-quality fluoroscopic application. A video camera is described in which, by optical compression, the circular output window of an image intensifier is imaged as an ellipse on the 3:4 image rectangle of a CCD sensor; the original shape is restored electronically afterwards. Information transfer is maximized this way: the total entrance field of the image intensifier is available, the maximum number of pixels of a high-resolution CCD image sensor is used, the sensor illuminance is increased by a factor of 4/3, and aliasing effects are minimized. Imaging descriptors such as modulation transfer, noise generation and transfer are given in comparison with a Plumbicon-based camera, as well as shape transfer, luminance transfer and aliasing. A method is given to isolate the basic noise components of the sensor. A short description of the camera optics and electronics is given.
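
    The 4/3 illuminance figure can be checked with a back-of-envelope calculation: squeezing the circular intensifier output along one axis by 3/4 to fit the 3:4 sensor rectangle concentrates the same flux onto 3/4 of the area. This is a sketch of that arithmetic, not the paper's actual optics.

```python
# Back-of-envelope check of the 4/3 illuminance gain: the circular
# intensifier output is optically squeezed along one axis by 3/4 so the
# ellipse fills the 3:4 sensor; the same flux then falls on 3/4 of the
# area, raising illuminance by the reciprocal.

def anamorphic_gain(aspect_short=3, aspect_long=4):
    compression = aspect_short / aspect_long   # squeeze factor for one axis
    return 1.0 / compression                   # illuminance scales as 1/area

print(anamorphic_gain())  # 4/3 ≈ 1.333...
```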

  4. Programmable CCD camera equipped with user-configurable video rate digital video processing for use in industrial inspection

    NASA Astrophysics Data System (ADS)

    Roberts, James W.; Wynen, J.

    1996-10-01

    A new high-performance CCD camera family is presented. The camera incorporates a microcontroller/PLD combination to provide users with computer control of image acquisition, image processing and analysis. User control of image acquisition includes adjustable gain and offset, data rate and timing. Image processing and analysis algorithms are implemented within PLDs and regulated via the microcontroller. A variety of image processing algorithms are discussed, including gray-scale thresholding, RLE, edge detection and gauging. Parameters that govern the nature of the image processing algorithms can be computer controlled and randomly accessed. The camera transmits a combination of high-speed digital video for frame-grabber acquisition and low-speed serial analysis and status information to a PC serial port. Pixel rates of up to 20 Mbytes/sec/channel for up to 8 channels combine for data throughputs of up to 160 Mbytes/sec/camera. Camera configurations include single- and multi-tap, linescan, TDI and area-array formats. Up to 15 cameras can be easily integrated into a single network through the use of a programmable communication HUB unit. An even larger number of cameras can be linked together by combining several networks into a web. Applications for the programmable camera/HUB technology include web and parts inspection, template matching and gauging.
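
    Two of the in-camera algorithms named above, gray-scale thresholding followed by run-length encoding (RLE), can be sketched in software. This is an illustrative reimplementation, not the camera's PLD logic.

```python
# Software sketch of two of the camera's in-PLD algorithms:
# gray-scale thresholding of a video line, then run-length encoding.

def threshold(line, level):
    """Binarize one video line against a programmable threshold level."""
    return [1 if p >= level else 0 for p in line]

def rle(bits):
    """Encode a binary line as (value, run_length) pairs."""
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((prev, count))
            count = 1
    runs.append((bits[-1], count))
    return runs

line = [10, 200, 210, 12, 11, 220]           # one line of 8-bit pixels
print(rle(threshold(line, 128)))              # [(0, 1), (1, 2), (0, 2), (1, 1)]
```

    RLE of the thresholded line is what makes transmitting analysis results over the low-speed serial link practical: only run boundaries, not every pixel, need to be sent.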

  5. Upgrading a CCD camera for astronomical use 

    E-print Network

    Lamecker, James Frank

    1993-01-01

    Existing charge-coupled device (CCD) video cameras have been modified to be used for astronomical imaging on telescopes in order to improve imaging times over those of photography. An astronomical CCD camera at the Texas A&M Observatory would...

  6. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H. (Lafayette, CA)

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  7. Calibration Tests of Industrial and Scientific CCD Cameras

    NASA Technical Reports Server (NTRS)

    Shortis, M. R.; Burner, A. W.; Snow, W. L.; Goad, W. K.

    1991-01-01

    Small-format, medium-resolution CCD cameras are at present widely used for industrial metrology applications. Large-format, high-resolution CCD cameras are primarily in use for scientific applications, but in due course should increase both the range of applications and the object-space accuracy achievable by close-range measurement. Slow-scan, cooled scientific CCD cameras provide the additional benefit of more quantisation levels, which enables improved radiometric resolution. The calibration of all types of CCD cameras is necessary in order to characterize the geometry of the sensors and lenses. A number of different types of CCD cameras have been calibrated at the NASA Langley Research Center using self-calibration and a small test object. The results of these calibration tests are described, with particular emphasis on the differences between standard CCD video cameras and scientific slow-scan CCD cameras.
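
    The kind of model recovered by such a calibration typically includes the principal distance and low-order radial lens distortion. The sketch below shows the standard radial distortion form; the coefficients are illustrative values, not results from these tests.

```python
# Radial lens distortion model of the kind estimated by camera
# calibration: points are displaced radially about the principal point.
# k1 and k2 here are illustrative, not measured, coefficients.

def distort(x, y, k1=-2e-7, k2=0.0):
    """Apply radial distortion about the principal point at (0, 0)."""
    r2 = x * x + y * y
    factor = 1 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# A point 100 pixels right, 50 pixels up is pulled slightly inward
# (barrel distortion, since k1 < 0):
print(distort(100.0, 50.0))
```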

  8. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.
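
    The separation/position-angle pairs quoted above can be converted to rectangular offsets to test for straight-line (unrelated stars) versus curved (orbital) motion. The sketch uses the standard double-star convention, position angle measured from north through east.

```python
# Convert double-star measurements (separation rho in arcsec, position
# angle PA in degrees, north through east) to rectangular offsets:
# dRA (east positive) = rho*sin(PA), dDec (north positive) = rho*cos(PA).

import math

def pa_sep_to_xy(rho_arcsec, pa_deg):
    pa = math.radians(pa_deg)
    return rho_arcsec * math.sin(pa), rho_arcsec * math.cos(pa)

# The two epochs reported for Milburn's pair:
for epoch, rho, pa in [(1906, 4.8, 186.2), (1918, 6.0, 162.4)]:
    east, north = pa_sep_to_xy(rho, pa)
    print(epoch, round(east, 2), round(north, 2))
```

    With more epochs plotted this way, a straight line through the points would suggest an optical pair; curvature would hint at a physical orbit.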

  9. Omnifocus video camera

    NASA Astrophysics Data System (ADS)

    Iizuka, Keigo

    2011-04-01

    The omnifocus video camera takes videos in which objects at different distances are all in focus in a single video display. The omnifocus video camera consists of an array of color video cameras combined with a unique distance-mapping camera called the Divcam. The color video cameras are all aimed at the same scene, but each is focused at a different distance. The Divcam provides real-time distance information for every pixel in the scene. A pixel selection utility uses the distance information to select individual pixels from the multiple video outputs focused at different distances, in order to generate the final single video display that is everywhere in focus. This paper presents the principle of operation, design considerations, detailed construction, and overall performance of the omnifocus video camera. The major emphasis of the paper is the proof of concept, but the prototype has been developed enough to demonstrate the superiority of this video camera over a conventional video camera. The resolution of the prototype is high, capturing even fine details such as fingerprints in the image. Just as the movie camera was a significant advance over the still camera, the omnifocus video camera represents a significant advance over all-focus cameras for still images.
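
    The pixel-selection step can be illustrated with a toy version: given per-pixel distances (as the Divcam would supply) and frames focused at known distances, pick each pixel from the frame whose focus distance is nearest. This is an assumed simplification, not the paper's utility.

```python
# Toy pixel-selection: for each pixel, choose the frame whose focus
# distance is closest to that pixel's measured distance.

def select_pixels(frames, focus_dists, depth_map):
    """frames: list of 2-D pixel arrays, one per focus distance."""
    h, w = len(depth_map), len(depth_map[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            d = depth_map[y][x]
            best = min(range(len(focus_dists)),
                       key=lambda i: abs(focus_dists[i] - d))
            out[y][x] = frames[best][y][x]
    return out

near = [[1, 1], [1, 1]]            # frame focused at 1 m
far = [[9, 9], [9, 9]]             # frame focused at 5 m
depth = [[1.2, 4.8], [0.9, 5.1]]   # per-pixel distances in metres
print(select_pixels([near, far], [1.0, 5.0], depth))  # [[1, 9], [1, 9]]
```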

  10. Application of the CCD camera in medical imaging

    NASA Astrophysics Data System (ADS)

    Chu, Wei-Kom; Smith, Chuck; Bunting, Ralph; Knoll, Paul; Wobig, Randy; Thacker, Rod

    1999-04-01

    Medical fluoroscopy is a set of radiological procedures used in medical imaging for functional and dynamic studies of the digestive system. Major components in the imaging chain include an image intensifier, which converts x-ray information into an intensity pattern on its output screen, and a CCTV camera, which converts the output-screen intensity pattern into video information to be displayed on a TV monitor. Responding properly to such a wide dynamic range on a real-time basis, as a fluoroscopy procedure demands, is very challenging. Also, as in all other medical imaging studies, detail resolution is of great importance: without proper contrast, spatial resolution is compromised. The many inherent advantages of the CCD make it a suitable choice for dynamic studies. Recently, CCD cameras have been introduced as the camera of choice for medical fluoroscopy imaging systems. The objective of our project was to investigate a newly installed CCD fluoroscopy system in the areas of contrast resolution, detail, and radiation dose.

  11. Solid state television camera (CCD-buried channel)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The development of an all solid state television camera, which uses a buried channel charge coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array is utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control (i.e., ALC and AGC) techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.
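
    The ALC/AGC idea above can be illustrated with a minimal feedback loop that nudges the video gain until the mean signal level sits at a target, preventing overload on bright scenes. This is a generic sketch, not the documented camera circuit.

```python
# Minimal automatic gain control (AGC) loop: each iteration moves the
# gain a fraction of the way toward the value that puts the mean video
# level at the target.

def agc_step(mean_level, gain, target=0.5, rate=0.2):
    """One AGC iteration: adjust gain toward the target mean level."""
    error = target - mean_level * gain
    return gain + rate * error

gain = 1.0
for _ in range(50):
    gain = agc_step(mean_level=2.0, gain=gain)   # bright scene, level 2.0
print(round(gain, 3))  # converges to 0.25, so that 2.0 * gain = target 0.5
```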

  12. Solid state television camera (CCD-buried channel), revision 1

    NASA Technical Reports Server (NTRS)

    1977-01-01

    An all solid state television camera was designed which uses a buried channel charge coupled device (CCD) as the image sensor. A 380 x 488 element CCD array is utilized to ensure compatibility with 525-line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (1) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (2) techniques for the elimination or suppression of CCD blemish effects, and (3) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.

  13. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D. (Tracy, CA)

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high-energy-density plasmas, for a variety of military, industrial, and medical imaging applications.

  14. The SXI: CCD camera onboard the NeXT mission

    E-print Network

    Bautz, Marshall W.

    The Soft X-ray Imager (SXI) is the X-ray CCD camera on board the NeXT mission that is to be launched around 2013. We are going to employ the CCD chips developed at Hamamatsu Photonics, K.K. We have been developing two types ...

  15. Solid state, CCD-buried channel, television camera study and design

    NASA Technical Reports Server (NTRS)

    Hoagland, K. A.; Balopole, H.

    1976-01-01

    An investigation of an all solid state television camera design, which uses a buried channel charge-coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array was utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a design which addresses the program requirements for a deliverable solid state TV camera.

  16. Printed circuit board for a CCD camera head

    DOEpatents

    Conder, Alan D. (Tracy, CA)

    2002-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high-energy-density plasmas, for a variety of military, industrial, and medical imaging applications.

  17. Video camera use at nuclear power plants

    SciTech Connect

    Estabrook, M.L.; Langan, M.O.; Owen, D.E.

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations and to determine the equipment used and the benefits realized. Basic closed-circuit television (CCTV) camera systems are described and video camera operating principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use, mainly reduced radiation exposure and increased productivity, are discussed and quantified. 15 refs., 6 figs.

  18. High frame rate CCD cameras with fast optical shutters for military and medical imaging applications

    SciTech Connect

    King, N.S.P.; Albright, K.; Jaramillo, S.A.; McDonald, T.E.; Yates, G.J.; Turko, B.T.

    1994-09-01

    Los Alamos National Laboratory has designed and prototyped high-frame-rate intensified/shuttered charge-coupled-device (CCD) cameras capable of operating at kilohertz frame rates (non-interlaced mode) with optical shutters capable of acquiring nanosecond-to-microsecond exposures each frame. These cameras utilize an interline-transfer CCD, the Loral Fairchild CCD-222 with 244 × 380 pixels, operated at pixel rates approaching 100 MHz. Initial prototype designs demonstrated single-port serial readout rates exceeding 3.97 kHz with greater than 5 lp/mm spatial resolution at shutter speeds as short as 5 ns. Readout was achieved by using a truncated format of 128 × 128 pixels by partial masking of the CCD and then subclocking the array at approximately a 65 MHz pixel rate. Shuttering was accomplished with a proximity-focused microchannel plate (MCP) image intensifier (MCPII) that incorporated a high-strip-current MCP and a design modification for high-speed stripline gating geometry to provide both fast shuttering and high repetition rate capabilities. Later camera designs use a close-packed quadruple-head geometry fabricated using an array of four separate CCDs (pseudo 4-port device). This design provides four video outputs with optional parallel or time-phased sequential readout modes. The quad-head format was designed with flexibility for coupling to various image intensifier configurations, including individual intensifiers for each CCD imager, a single intensifier with fiber-optic or lens/prism-coupled fanout of the input image to be shared by the four CCD imagers, or a large-diameter phosphor screen of a gateable framing-type intensifier for time-sequential relaying of a complete new input image to each CCD imager. Camera designs and their potential use in ongoing military and medical time-resolved imaging applications are discussed.
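
    The quoted readout rate is consistent with the truncated format and the subclocked pixel rate, as a quick check shows (line and frame overheads, which the real camera also incurs, are ignored here):

```python
# Sanity check: 128 x 128 truncated format read out at ~65 MHz pixel
# rate gives a frame rate close to the quoted 3.97 kHz.

pixels_per_frame = 128 * 128          # 16,384 pixels after masking
pixel_rate_hz = 65e6                  # approximate subclocked pixel rate
frame_rate_hz = pixel_rate_hz / pixels_per_frame
print(round(frame_rate_hz))  # ~3967 Hz, matching "exceeding 3.97 kHz"
```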

  19. Color measurements using a colorimeter and a CCD camera

    SciTech Connect

    Spratlin, T.L.; Simpson, M.L.

    1992-02-01

    Two new techniques are introduced for measuring the color content of printed graphic images, with applications to web inspection such as the detection of color flaws and the measurement of color quality. The techniques involve the development of algorithms for combining the information obtained from commercially available CCD color cameras and colorimeters to produce a colorimeter system with pixel resolution. 9 refs.

  20. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many kinds of equipment and applications. If such a system is tested with a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed. The large-aperture test system for infrared and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position with changing environmental temperature; the image quality of the wide-field collimator and the test accuracy are thereby also improved. Its performance matches that of foreign counterparts at a much lower cost, giving it good market prospects.

  1. Development of an all-in-one gamma camera/CCD system for safeguard verification

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Il; An, Su Jung; Chung, Yong Hyun; Kwak, Sung-Woo

    2014-12-01

    For the purpose of monitoring and verifying efforts at safeguarding radioactive materials in various fields, a new all-in-one gamma camera/charge-coupled device (CCD) system was developed. This combined system consists of a gamma camera, which gathers energy and position information on gamma-ray sources, and a CCD camera, which identifies the specific location in a monitored area; 2-D image information and quantitative information regarding gamma-ray sources can therefore be obtained from the fused images. The gamma camera consists of a diverging collimator, a 22 × 22 array of CsI(Na) pixelated scintillation crystal with a pixel size of 2 × 2 × 6 mm³, and a Hamamatsu H8500 position-sensitive photomultiplier tube (PSPMT). The Basler scA640-70gc CCD camera, which delivers 70 frames per second at video graphics array (VGA) resolution, was employed. Performance testing was carried out using a Co-57 point source 30 cm from the detector. The measured spatial resolution and sensitivity were 4.77 mm full width at half maximum (FWHM) and 7.78 cps/MBq, respectively. The energy resolution was 18% at 122 keV. These results demonstrate that the combined system has considerable potential for radiation monitoring.
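
    Energy resolution is conventionally quoted as FWHM divided by the line energy, so the 18% figure at the 122 keV Co-57 line implies a photopeak width of roughly 22 keV:

```python
# Implied photopeak width from the quoted fractional energy resolution.

resolution = 0.18          # fractional FWHM at the Co-57 line
e_kev = 122.0              # Co-57 gamma-ray energy
fwhm_kev = resolution * e_kev
print(round(fwhm_kev, 1))  # ≈ 22.0 keV FWHM of the 122 keV photopeak
```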

  2. Design of high-speed low-noise pre-amplifier for CCD camera

    NASA Astrophysics Data System (ADS)

    Xue, Xucheng; Zhang, Shuyan; Li, Hongfa; Guo, Yongfei

    2010-10-01

    The pre-amplifier circuit is critical to the noise performance of a high-speed CCD camera. Its main functions are amplification and impedance transformation. A high-speed, low-noise pre-amplifier for a CCD camera is discussed and designed in this paper. The high-speed, low-noise operational amplifier OPA842 is adopted as the main part. The gain-setting resistors for the amplifier are designed optimally, and gain-setting resistors of different precision are swept using a Monte Carlo method. The CCD video signal, which has a high DC offset voltage, is AC-coupled to the amplifier. The output signal of the amplifier is source-terminated with a 50-ohm matching resistor so that the video signal can be transmitted through coaxial cable. When the circuit operates at high speed, the PCB has an important effect on circuit performance, and its parasitics can even make the amplifier unstable. A parasitic model of the PCB is therefore established, and PCB layout design issues are also presented. The design result shows that the pre-amplifier can be used in a camera whose pixel rate is up to 40 MHz, with an input-referred noise density of about 3 nV/√Hz.
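
    The quoted noise density can be turned into an RMS input noise estimate by integrating over the video bandwidth. A 40 MHz brick-wall bandwidth is assumed below for illustration; the paper does not state the noise bandwidth.

```python
# RMS input-referred noise from a white noise density over an assumed
# brick-wall bandwidth: v_rms = e_n * sqrt(B).

import math

noise_density = 3e-9        # V/sqrt(Hz), input-referred (from the paper)
bandwidth_hz = 40e6         # assumed video bandwidth, matching pixel rate
vrms = noise_density * math.sqrt(bandwidth_hz)
print(round(vrms * 1e6, 1))  # ≈ 19.0 µV RMS at the amplifier input
```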

  3. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
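
    The "slightly more than 136 years" repeat period is what a 32-bit seconds counter would give; this is an assumption consistent with the quoted figure, not a statement of the actual Geo-TimeCode format.

```python
# A 32-bit count of seconds rolls over after 2^32 seconds, which is
# slightly more than 136 years; by contrast, conventional video time
# codes repeat every 24 hours.

seconds = 2 ** 32
years = seconds / (365.25 * 24 * 3600)   # Julian years of 365.25 days
print(round(years, 1))  # ≈ 136.1 years before the code repeats
```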

  4. High frame rate CCD camera with fast optical shutter

    SciTech Connect

    Yates, G.J.; McDonald, T.E. Jr.; Turko, B.T.

    1998-09-01

    A high-frame-rate CCD camera coupled with a fast optical shutter has been designed for high-repetition-rate imaging applications. The design uses state-of-the-art microchannel plate image intensifier (MCPII) technology fostered and developed by Los Alamos National Laboratory to support nuclear, military, and medical research requiring high-speed imagery. Key design features include asynchronous resetting of the camera to acquire random transient images, patented real-time analog signal processing with 10-bit digitization at 40 to 75 MHz pixel rates, synchronized shutter exposures as short as 200 ps, and sustained continuous readout of 512 x 512 pixels per frame at 1 to 5 Hz rates via parallel multiport (16-port CCD) data transfer. Salient characterization and performance test data for the prototype camera are presented, and temporally and spatially resolved images obtained from range-gated LADAR field testing are included. An alternative system configuration, using several cameras sequenced to deliver discrete numbers of consecutive frames at effective burst rates up to 5 GHz (accomplished by time-phasing of consecutive MCPII shutter gates without overlap), is discussed. Potential applications, including dynamic radiography and optical correlation, will be presented.
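
    The 5 GHz effective burst rate follows directly from the 200 ps shutter gate width when the time-phased gates are placed back-to-back without overlap:

```python
# Effective burst framing rate from back-to-back shutter gates:
# one new frame begins every gate width.

gate_s = 200e-12                  # shortest MCPII shutter exposure
burst_rate_hz = 1.0 / gate_s
print(burst_rate_hz / 1e9)  # 5.0 GHz effective framing rate
```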

  5. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (inventors)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  6. Two-wavelength microscopic speckle interferometry using colour CCD camera

    NASA Astrophysics Data System (ADS)

    Upputuri, Paul K.; Pramanik, Manojit; Kothiyal, Mahendra P.; Nandigana, Krishna M.

    2015-03-01

    Single-wavelength microscopic speckle interferometry is widely used for deformation, shape and non-destructive testing (NDT) of engineering structures. However, the single-wavelength configuration fails to quantify large deformations because of fringe overcrowding, and it cannot provide the shape of the specimen under test. In this paper, we discuss two-wavelength microscopic speckle interferometry using a single-chip colour CCD camera for the characterization of microsamples. The use of a colour CCD allows simultaneous acquisition of speckle patterns at two different wavelengths, which makes the data acquisition as simple as in the single-wavelength case. For quantitative measurement, an error-compensating 8-step phase-shifting algorithm is used. The system allows quantification of large deformations and of the shape of a specimen with a rough surface. The design of the system, along with a few experimental results on small-scale rough specimens, is presented.
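
    The reason two wavelengths extend the measurement range is the synthetic wavelength Λ = λ₁λ₂/|λ₁ − λ₂|, which is much longer than either optical wavelength. The wavelengths below are illustrative red/green values, not the ones used in the paper.

```python
# Synthetic wavelength in two-wavelength interferometry: fringes of the
# beat between the two wavelengths repeat every Lambda, extending the
# unambiguous measurement range well beyond a single optical wavelength.

def synthetic_wavelength(l1_nm, l2_nm):
    return l1_nm * l2_nm / abs(l1_nm - l2_nm)

# Illustrative wavelengths (nm); e.g. a He-Ne red line and a green line:
print(round(synthetic_wavelength(633.0, 532.0) / 1000, 2))  # ≈ 3.33 µm
```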

  7. Wide dynamic range video camera

    NASA Technical Reports Server (NTRS)

    Craig, G. D. (inventor)

    1985-01-01

    A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

  8. Research of fiber position measurement by multi CCD cameras

    NASA Astrophysics Data System (ADS)

    Zhou, Zengxiang; Hu, Hongzhuan; Wang, Jianping; Zhai, Chao; Chu, Jiaru; Liu, Zhigang

    2014-07-01

    The parallel-controlled fiber positioner, an efficient observation system, has been used in LAMOST for four years and has been proposed for ngCFHT and the rebuilt Mayall telescope. The fiber positioner research group at USTC has designed a new-generation prototype based on close-packed modular robotic positioner mechanisms. The prototype comprises about 150 fiber positioning modules plugged into a 1-meter-diameter honeycombed focal plane; each module holds 37 fiber positioners of 12 mm diameter. Furthermore, the new system tightens the required accuracy from 40 um in LAMOST to 10 um in MSDESI, which poses a new challenge for measurement. A closed-loop control system is to be used in the new system: a CCD camera captures images of the fiber tip positions across the focal plane, calculates precise position information, and feeds it back to the control system. After the positioners have rotated through several loops, the accuracy of all positioners is confined to less than 10 um. We report our component development and the performance measurement program of the new measuring system using multiple CCD cameras. With stereo vision and image processing methods, we precisely measure the 3-dimensional position of the fiber tip carried by each fiber positioner. Finally, we present baseline parameters for fiber positioner measurement as a reference for next-generation survey telescope design.

  9. Method of video capture port design for IP camera

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Ruan, Shuangchen; Zhang, Min; Liu, Chengxiang

    2006-11-01

    The IP surveillance market is growing significantly and is receiving global attention. The IP camera is the heart of this new surveillance system, and as its key component, the video capture port contributes greatly to the camera's cost. CCD image sensors are employed in most IP cameras for their excellent performance and market maturity. In fact, new types of CMOS image sensors have become practical in recent years thanks to improvements in CMOS technology. This paper presents a design of an IP camera's video capture port using a CMOS image sensor in an embedded environment. Also included is a brief introduction to the hardware design, covering the interface and PCB layout. The paper also provides information on the setup of important registers, function usage, and debugging tips. The design was tested on an IP camera that has been on the market for three years. The results show that using a CMOS image sensor can achieve good image quality and save cost; it is therefore well suited to surveillance applications where image resolution is not the primary concern. The method can easily be extended to other IP camera designs with little change in either hardware or software.

  10. Development of high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    Presented in this paper is an outline of the R&D activities on high-speed video cameras carried out at Kinki University for more than ten years, currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searches of journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996, using the same sensor developed for the previous camera; its frame rate is 50 million fps for triple framing and 4,500 fps for triple-light-wave framing, including color image capture. The idea of a video camera of 1 million fps with an ISIS, an In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, the design of a prototype ISIS is under way and, hopefully, it will be fabricated in the near future. Epoch-making cameras in the history of high-speed video camera development by others are also briefly reviewed.

  11. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  12. Laboratory Calibration and Characterization of Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1989-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of non-perpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  13. Laboratory calibration and characterization of video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1990-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of nonperpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.
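The lens-distortion step described above can be illustrated with a minimal sketch of a radial (Brown-model) correction; the coefficient values and principal-point offsets below are hypothetical, not results from the paper.

```python
def undistort_point(x, y, xp=0.0, yp=0.0, k1=-2.5e-8, k2=0.0):
    """Correct a measured image point for radial lens distortion (Brown model).

    (xp, yp) is the principal point; k1, k2 are radial coefficients, here
    illustrative values only.
    """
    dx, dy = x - xp, y - yp
    r2 = dx * dx + dy * dy
    # Corrected radius: r' = r * (1 + k1*r^2 + k2*r^4)
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return xp + dx * factor, yp + dy * factor
```

A point at the principal point is unchanged, while points farther out are pulled inward for a negative k1, as in typical barrel distortion.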

  14. Video Analysis with a Web Camera

    ERIC Educational Resources Information Center

    Wyrembeck, Edward P.

    2009-01-01

    Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as…

  15. A CCD CAMERA-BASED HYPERSPECTRAL IMAGING SYSTEM FOR STATIONARY AND AIRBORNE APPLICATIONS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes a charge coupled device (CCD) camera-based hyperspectral imaging system designed for both stationary and airborne remote sensing applications. The system consists of a high performance digital CCD camera, an imaging spectrograph, an optional focal plane scanner, and a PC comput...

  16. Video imagers with low speed CCD and LC based on temporal compressed

    NASA Astrophysics Data System (ADS)

    Zhong, Xiaoming; Li, Huan; Zhao, Haibo; Liu, Yanli

    2015-08-01

    Traditional video imagers require a high-speed CCD. We present a new method for implementing a video imager with a low-speed CCD detector, based on temporal compression. Using a low-speed CCD detector and a transmissive liquid crystal (LC) modulator instead of a high-speed CCD to acquire the data cube, and applying a suitable data-processing method, we achieve high-precision reconstruction of the compressed video data. Theoretical analysis and experimental results show that the method not only ensures video imaging quality but also greatly reduces the frame rate required of the detector and the complexity of the video imaging system.

  17. Photogrammetric Applications of Immersive Video Cameras

    NASA Astrophysics Data System (ADS)

    Kwiatek, K.; Tokarczyk, R.

    2014-05-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This offset causes problems when stitching together individual frames of video from particular cameras; however, there are ways to overcome it, and applying immersive cameras in photogrammetry offers new potential. The paper presents two applications of immersive video in photogrammetry. First, the creation of a low-cost mobile mapping system based on a Ladybug®3 and a GPS device is discussed. The number of panoramas is much higher than needed for photogrammetric purposes, as the baseline between spherical panoramas is around 1 metre. More than 92,000 panoramas were recorded in the Polish region of Czarny Dunajec, and measurements from the panoramas enable the user to measure outdoor advertising structures and billboards. A new law is being created to limit the number of illegal advertising structures in the Polish landscape, and immersive video recorded in a short period of time is a candidate for economical and flexible off-site measurement. The second approach is the generation of 3D video-based reconstructions of heritage sites from immersive video (structure from immersive video). A mobile camera mounted on a tripod dolly was used to record an interior scene, and the immersive video, separated into thousands of still panoramas, was converted into 3D objects using Agisoft Photoscan Professional. The findings from these experiments demonstrate that immersive photogrammetry is a flexible and prompt method of 3D modelling and offers promising features for mobile mapping systems.

  18. Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Hardware

    NASA Astrophysics Data System (ADS)

    Kang, Y.-W.; Byun, Y. I.; Rhee, J. H.; Oh, S. H.; Kim, D. K.

    2007-12-01

    We designed and developed a multi-purpose CCD camera system for three kinds of CCDs: the KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472) made by Kodak. The system supports a fast USB port as well as a parallel port for data I/O and control signals. The packaging is based on two-stage circuit boards for size reduction and contains a built-in filter wheel. Basic hardware components include the clock pattern circuit, A/D conversion circuit, CCD data flow control circuit, and CCD temperature control unit. The CCD temperature can be controlled with an accuracy of approximately 0.4 °C over a maximum temperature range of 33 °C. The camera system has a readout noise of 6 e⁻ and a system gain of 5 e⁻/ADU. A total of 10 CCD camera systems were produced, and our tests show that all of them exhibit acceptable performance.
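As a sketch of what the quoted figures imply, the following uses the stated system gain (5 e⁻/ADU) and readout noise (6 e⁻) to convert a raw signal to electrons and estimate its signal-to-noise ratio; the shot-plus-read-noise model is a standard assumption, not something given in the abstract.

```python
import math

GAIN_E_PER_ADU = 5.0  # system gain quoted in the abstract
READ_NOISE_E = 6.0    # readout noise quoted in the abstract

def signal_to_noise(signal_adu):
    """Shot-noise-limited S/N for a signal measured in ADU."""
    electrons = signal_adu * GAIN_E_PER_ADU
    # Total noise: Poisson shot noise plus read noise, added in quadrature (e-).
    noise = math.sqrt(electrons + READ_NOISE_E ** 2)
    return electrons / noise
```

For a bright source the read-noise term becomes negligible and S/N approaches the square root of the collected electrons.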

  19. Analysis of focusing accuracy for multispectral CCD camera based on satellite

    NASA Astrophysics Data System (ADS)

    Lv, Shiliang; Liu, Jinguo

    2015-10-01

    As a key technology for improving the imaging quality of a remote-sensing multispectral CCD camera, the performance of a focusing system for such a camera is presented in detail in this paper. First, the required focusing precision of the optical system was calculated. A method of directly adjusting the multispectral CCD focal plane was proposed, suited to this camera's optical system. Second, we developed a focusing system with the advantages of low constructional complexity, easy hardware implementation, and high focusing sensitivity. Finally, an experimental test was constructed to evaluate the focusing precision of the system. The measured focusing precision is 3.62 μm (3σ) over a focusing range of ±2.5 mm. The experimental result shows that the proposed focusing system is reasonable, reliable, and stable, and meets the focusing precision requirements of the multispectral CCD camera.

  20. Abilities of Russian digital CCD cameras of serial manufacture for astronomical applications

    NASA Astrophysics Data System (ADS)

    Komarov, Vladimir V.; Komarov, Anton V.

    2007-05-01

    This paper presents the results of an investigation of recent Russian serially manufactured black-and-white high-sensitivity CCD cameras for optical telescope applications. Using the SDU-259 camera (OOO "Specteletehnika", Moscow) as an example, its suitability as a digitized TV guiding camera for large optical telescopes is demonstrated. At SAO RAS, the SDU-259C camera, equipped with a thermoelectric cooler, was constructed. The parameters of the SDU-259C CCD camera and the results of its tests with a 10-inch Meade LXD-55 telescope are given.

  1. 2M pixels FIT-CCD with Hyper HAD sensor and camera for HDTV

    NASA Astrophysics Data System (ADS)

    Ishikawa, Kikue; Wada, Kazushi; Nakamura, Satoshi; Abe, Hideshi

    1992-08-01

    A 2-million-pixel FIT-CCD (Frame Interline Transfer CCD) with the latest Hyper HAD (Hole Accumulation Diode) sensor has been developed for HDTV. This new CCD achieves higher performance than the conventional HD-tube sensor through new technologies such as the Hyper HAD sensor and aluminum wire interconnection. It has made possible a compact and light HD-CCD camera with a sensitivity improvement of one and a half F-stops and an S/N improvement of 8 dB over the current HD-tube camera. The main characteristics of the HD-CCD camera prototype are as follows: (1) sensitivity F8 at 2000 lx; (2) S/N ratio 52 dB at 30 MHz; (3) dynamic range 600%; (4) smear reduction -100 dB; (5) horizontal resolution 1000 TV lines; (6) camera head weight 6.5 kg (including camera adaptor). This new HD-CCD camera serves not only the studio system but also a stand-alone system for portable use.

  2. Applying CCD Cameras in Stereo Panorama Systems for 3d Environment Reconstruction

    NASA Astrophysics Data System (ADS)

    Ashamini, A. Sh.; Varshosaz, M.; Saadatseresht, M.

    2012-07-01

    Proper reconstruction of 3D environments is nowadays needed by many organizations and applications. In addition to conventional methods, the use of stereo panoramas is an appropriate technique owing to its simplicity, low cost, and the ability to view an environment the way it is in reality. This paper investigates the ability of stereo CCD cameras to support 3D reconstruction and presentation of the environment and geometric measurement within it. For this purpose, a rotating stereo panorama system was established using two CCDs with a base length of 350 mm and a DVR (digital video recorder) box. The stereo system was first calibrated using a 3D test field and then used to perform accurate measurements. The results of investigating the system in a real environment showed that although this kind of camera produces noisy images and does not have appropriate geometric stability, the cameras can be easily synchronized and well controlled, and reasonable accuracy (about 40 mm for objects at 12 metres from the camera) can be achieved.

  3. Measuring neutron fluences and gamma/x-ray fluxes with CCD cameras

    SciTech Connect

    Yates, G.J.; Smith, G.W.; Zagarino, P.; Thomas, M.C.

    1991-12-01

    The capability to measure bursts of neutron fluences and gamma/x-ray fluxes directly with charge coupled device (CCD) cameras, while being able to distinguish between the video signals produced by these two types of radiation even when they occur simultaneously, has been demonstrated. Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) Frame Transfer (FT) charge coupled devices (CCDs) from irradiation with pulsed neutrons (14 MeV) and Bremsstrahlung photons (4-12 MeV endpoint) are utilized to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate approximately 0.05 V/rad responsivity, with ≥1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or "peaks", binned by area and amplitude as functions of fluence in the 10⁵ to 10⁷ n/cm² range, indicate smearing over approximately 1 to 10% of the CCD array with charge per pixel ranging between noise and saturation levels.

  4. Measuring neutron fluences and gamma/x-ray fluxes with CCD cameras

    SciTech Connect

    Yates, G.J.; Smith, G.W. (Atomic Weapons Establishment); Zagarino, P.; Thomas, M.C. (Santa Barbara Operations)

    1991-01-01

    The capability to measure bursts of neutron fluences and gamma/x-ray fluxes directly with charge coupled device (CCD) cameras, while being able to distinguish between the video signals produced by these two types of radiation even when they occur simultaneously, has been demonstrated. Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) Frame Transfer (FT) charge coupled devices (CCDs) from irradiation with pulsed neutrons (14 MeV) and Bremsstrahlung photons (4-12 MeV endpoint) are utilized to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate approximately 0.05 V/rad responsivity, with ≥1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or "peaks", binned by area and amplitude as functions of fluence in the 10⁵ to 10⁷ n/cm² range, indicate smearing over approximately 1 to 10% of the CCD array with charge per pixel ranging between noise and saturation levels.

  5. Auto-measuring system of aero-camera lens focus using linear CCD

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-ye; Zhao, Yu-liang; Wang, Shu-juan

    2014-09-01

    The automatic and accurate measurement of the focal length of an aviation camera lens is of great significance and practical value. The traditional method depends on the human eye reading the scribed lines on the focal plane of a collimator through a reading microscope; it is inefficient and its results are easily influenced by human factors. Our method uses a linear-array solid-state image sensor instead of a reading microscope to convert the image size of a specific object into an electrical pulse width, and uses a computer to measure the focal length automatically. In the measurement process, the lens to be tested is placed in front of the objective of the collimator. A pair of scribed lines on the collimator's focal plane is imaged onto the focal plane of the lens under test. With the linear CCD and its drive circuit placed at this image plane, the CCD converts the one-dimensional light intensity distribution into a time series of electrical signals. One signal path is brought directly to a video monitor through an image acquisition card for optical path adjustment and focusing; the other path is processed by an electrical circuit to obtain the pulse width corresponding to the scribed lines. The computer processes the pulse width and outputs the focal length measurement. Practical measurements showed a relative error of about 0.10%, in good agreement with theory.
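The focal-length computation described above can be sketched as follows; the similar-triangles relation f_lens = f_collimator × (image size / reticle spacing) is standard collimator practice, and all numeric values in the example are assumptions for illustration only.

```python
def lens_focal_length(pulse_pixels, pixel_pitch_um, reticle_mm, f_collimator_mm):
    """Focal length of the lens under test from the linear-CCD pulse width.

    pulse_pixels counts the pixels spanned by the image of the scribed-line
    pair; reticle_mm is the spacing of that pair on the collimator focal plane.
    """
    image_mm = pulse_pixels * pixel_pitch_um / 1000.0  # reticle-image size on the CCD
    return f_collimator_mm * image_mm / reticle_mm

# Hypothetical example: 2000 pixels at 7 um pitch, a 20 mm reticle spacing,
# and a 1000 mm collimator give a 700 mm lens focal length.
f_lens = lens_focal_length(2000, 7.0, 20.0, 1000.0)
```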

  6. Video Chat with Multiple Cameras John MacCormick

    E-print Network

    MacCormick, John

    Video Chat with Multiple Cameras, John MacCormick, Dickinson College Technical Report, March 2012. Abstract: The dominant paradigm for video chat employs a single camera at each end of the conversation. This report provides the first rigorous investigation of multi-camera video chat, concentrating especially

  7. The image pretreatment based on the FPGA inside digital CCD camera

    NASA Astrophysics Data System (ADS)

    Tian, Rui; Liu, Yan-ying

    2009-07-01

    For a space project, a digital CCD camera that can image clearly in a 1 lux light environment was required. The CCD sensor ICX285AL, produced by Sony, is used in the camera, and the FPGA (Field Programmable Gate Array) chip XQR2V1000 serves as the timing generator and signal processor inside the camera. In the low-light environment, however, two kinds of random noise become apparent as the camera's variable gain increases: dark-current noise in the image background, and vertical-transfer noise. A real-time method for eliminating this noise with the FPGA inside the CCD camera is introduced, and the causes and characteristics of the random noise are analyzed. First, several ideas for eliminating dark-current noise were considered; they were simulated in VC++ to compare their speed and effect, and a Gaussian filter was chosen for its filtering performance. The vertical-transfer noise has the characteristic that the noise points have regular ordinates in the image's two-dimensional coordinates; its behavior is fixed, with the gray value of noise points 16-20 levels below that of the surrounding pixels. According to these characteristics, a local median filter is used to remove the vertical noise. Finally, these algorithms were ported into the FPGA chip inside the CCD camera. A large number of experiments proved that the pretreatment performs well in real time and improves the digital CCD camera's signal-to-noise ratio by 3-5 dB in the low-light environment.
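The vertical-noise step can be sketched in NumPy for clarity (rather than as FPGA logic): a local median over row neighbours replaces pixels in known bad columns, so a dark fixed-pattern column is overwritten by its brighter neighbours. The column list and test values are hypothetical.

```python
import numpy as np

def remove_vertical_noise(img, noisy_cols):
    """Replace pixels in known noisy columns with the median of the pixel
    and its left/right row neighbours (a 1x3 local median filter)."""
    out = img.astype(np.float64).copy()
    for c in noisy_cols:
        left = out[:, max(c - 1, 0)]
        right = out[:, min(c + 1, out.shape[1] - 1)]
        # Median of (left, centre, right): a centre pixel 16-20 levels dark
        # is discarded in favour of its neighbours.
        out[:, c] = np.median(np.stack([left, out[:, c], right]), axis=0)
    return out
```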

  8. Processing of multiport CCD video signals at very high frame rates

    SciTech Connect

    Turko, B.T.; Yates, G.J.; King, N.S.P.

    1995-12-31

    Rates exceeding 1,000 frames/s can be achieved with state-of-the-art multiport CCD video sensors. In order to provide sufficient spatial resolution, sensor configurations of 512 x 512 pixels are typical. The image area is divided into segments with individual video ports. Each port includes a photocharge-sensitive amplifier, typically comprising sample/hold and charge-reset circuits; some amplifiers are even provided with a correlated double sampling circuit for improving the signal/noise ratio. Frame rates are proportional to the number of ports, since the individual sensor segments run in parallel. Unfortunately, the amount of external circuitry required for signal processing increases accordingly. Sixteen-port sensors are a quite common configuration; cameras with a higher number of ports are prohibitively expensive. Therefore, in order to achieve very high frame readout rates with a moderate number of ports, the sensor's charge-transport clock frequencies must be increased to the limit. Horizontal charge-transfer frequencies exceeding 30 MHz have been achieved. The quality of the video signal deteriorates with frequency due to the bandwidth limitation of the photocharge-detecting amplifier, whose sample/hold and correlated double sampling circuits are useless at such rates. Methods and circuits for processing video signals under these conditions are described. The circuits include wide-bandwidth video buffer amplifiers/level translators/line drivers, fast peak stretchers, A/D converters of 10-bit resolution or more, and fiber-optic data links to remote mass digital data storage and processors. The circuits must also satisfy a number of practical conditions (size, power dissipation, cost) in order to make such cameras useful in applications where space is limited and multiple-head high-frame-rate cameras are required.
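The parallel-readout arithmetic behind the frame-rate claim can be sketched as follows; the 512 x 512 format, 16 ports, and 30 MHz transfer clock come from the abstract, while the simple pixels-per-port model is an assumption that ignores overheads such as blanking.

```python
def frame_rate(pixels_x=512, pixels_y=512, ports=16, pixel_clock_hz=30e6):
    """Frames per second when each port reads out its segment in parallel."""
    pixels_per_port = pixels_x * pixels_y / ports
    return pixel_clock_hz / pixels_per_port

rate = frame_rate()  # well above the 1,000 frames/s quoted in the abstract
```

Doubling the port count doubles the frame rate at a fixed pixel clock, which is the proportionality the abstract states.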

  9. Panoramic Video Capturing and Compressed Domain Virtual Camera Control

    E-print Network

    University of California at Santa Barbara

    Sun, Xinding; Foote, Jonathan. Panoramic video is captured by stitching the video pictures from multiple stationary cameras. A problem emerges when the panoramic video sequence is compressed and stored or delivered to a client, where

  10. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral- density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
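A minimal sketch of the calibration idea, with a synthetic response curve standing in for real calibration frames: tabulate the measured signal against known source brightness, then invert the monotonic curve by interpolation to recover brightness from later measurements. The toy exponential response below is an assumption for illustration, not the actual camera response.

```python
import numpy as np

# Known input brightness (arbitrary units) and the measured integrated signal;
# the toy nonlinear response saturates softly toward 255.
brightness = np.linspace(1.0, 100.0, 50)
signal = 255.0 * (1.0 - np.exp(-brightness / 40.0))

def brightness_from_signal(s):
    """Invert the monotonic response curve by interpolation."""
    return np.interp(s, signal, brightness)
```

In the paper's method, the (brightness, signal) table would come from video frames of the artificial variable star, processed exactly like the science images.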

  11. Automated CCD camera characterization. 1998 summer research program for high school juniors at the University of Rochester`s Laboratory for Laser Energetics: Student research reports

    SciTech Connect

    Silbermann, J.

    1999-03-01

    The OMEGA system uses CCD cameras for a broad range of applications. Over 100 video-rate CCD cameras are used for purposes such as targeting, aligning, and monitoring areas such as the target chamber, laser bay, and viewing gallery. There are approximately 14 scientific-grade CCD cameras on the system, used to obtain precise photometric results from the laser beam as well as target diagnostics. It is very important that these scientific-grade CCDs be properly characterized so that their results can be evaluated appropriately. Currently, characterization is a tedious process done by hand: the operator must operate the camera and light source simultaneously, and because more exposures mean more accurate information about the camera, characterization tests can become very lengthy affairs. Sometimes it takes an entire day to complete just a single plot. Characterization requires testing many aspects of the camera's operation, including the following: variance vs. mean signal level, which should be proportional due to Poisson statistics of the incident photon flux; linearity, the ability of the CCD to produce signals proportional to the light it receives; signal-to-noise ratio, the relative magnitude of the signal vs. the uncertainty in that signal; and dark current, the amount of noise due to thermal generation of electrons (cooling lowers this noise contribution significantly). These tests, as well as many others, must be conducted in order to properly understand a CCD camera. The goal of this project was to construct an apparatus that could characterize a camera automatically.
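The variance-vs-mean test named above is the classic photon-transfer measurement; a minimal simulated sketch is given below, in which Poisson frames stand in for real flat-field exposures and the assumed gain of 4 e⁻/ADU is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_GAIN = 4.0  # assumed e-/ADU, for the simulation only

means, variances = [], []
for electrons in (500, 1000, 2000, 4000, 8000):
    # Pairs of simulated flat fields; differencing two flats cancels fixed
    # pattern, and var(f1 - f2) / 2 estimates the per-frame variance.
    f1 = rng.poisson(electrons, 100_000) / TRUE_GAIN
    f2 = rng.poisson(electrons, 100_000) / TRUE_GAIN
    means.append((f1.mean() + f2.mean()) / 2.0)
    variances.append((f1 - f2).var() / 2.0)

# For Poisson statistics, variance (ADU^2) = mean (ADU) / gain,
# so the slope of variance vs. mean estimates 1/gain.
slope = np.polyfit(means, variances, 1)[0]
estimated_gain = 1.0 / slope
```

With real data the same fit recovers the camera gain, and departures from the straight line reveal nonlinearity near saturation.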

  12. Development of CCD Cameras for Soft X-ray Imaging at the National Ignition Facility

    SciTech Connect

    Teruya, A. T.; Palmer, N. E.; Schneider, M. B.; Bell, P. M.; Sims, G.; Toerne, K.; Rodenburg, K.; Croft, M.; Haugh, M. J.; Charest, M. R.; Romano, E. D.; Jacoby, K. D.

    2013-09-01

    The Static X-Ray Imager (SXI) is a National Ignition Facility (NIF) diagnostic that uses a CCD camera to record time-integrated X-ray images of target features such as the laser entrance hole of hohlraums. SXI has two dedicated positioners on the NIF target chamber for viewing the target from above and below, and the X-ray energies of interest are 870 eV for the “soft” channel and 3 – 5 keV for the “hard” channels. The original cameras utilize a large format back-illuminated 2048 x 2048 CCD sensor with 24 micron pixels. Since the original sensor is no longer available, an effort was recently undertaken to build replacement cameras with suitable new sensors. Three of the new cameras use a commercially available front-illuminated CCD of similar size to the original, which has adequate sensitivity for the hard X-ray channels but not for the soft. For sensitivity below 1 keV, Lawrence Livermore National Laboratory (LLNL) had additional CCDs back-thinned and converted to back-illumination for use in the other two new cameras. In this paper we describe the characteristics of the new cameras and present performance data (quantum efficiency, flat field, and dynamic range) for the front- and back-illuminated cameras, with comparisons to the original cameras.

  13. The In-flight Spectroscopic Performance of the Swift XRT CCD Camera During 2006-2007

    NASA Technical Reports Server (NTRS)

    Godet, O.; Beardmore, A.P.; Abbey, A.F.; Osborne, J.P.; Page, K.L.; Evans, P.; Starling, R.; Wells, A.A.; Angelini, L.; Burrows, D.N.; Kennea, J.; Campana, S.; Chincarini, G.; Citterio, O.; Cusumano, G.; LaParola, V.; Mangano, V.; Mineo, T.; Giommi, P.; Perri, M.; Capalbi, M.; Tamburelli, F.

    2007-01-01

    The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 135 eV FWHM at 5.9 keV as measured before launch. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Comparison of observed spectra with models folded through the instrument response produces negative residuals around and below the Oxygen edge. We discuss several possible causes for such residuals. Traps created by proton damage on the CCD increase the charge transfer inefficiency (CTI) over time. We describe the evolution of the CTI since the launch and its effect on the CCD spectral resolution and the gain.

  14. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

    The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral-density (ND) filters in front of the theodolite's telescope and then safely view and measure the laser's boresight through the telescope without great risk to one's eyes. This method, usable for a Class II visible-wavelength laser, is not even worth attempting for a Class IV laser and is not applicable to an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available: a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera includes power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. A further advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as visible light; hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.
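The "calculated amount of ND filters" step reduces to simple optical-density arithmetic, sketched below; the power and OD values in the example are hypothetical.

```python
def transmitted_power_mw(incident_mw, optical_densities):
    """Power behind a stack of ND filters: optical densities add, and
    transmission falls as 10 ** (-total OD)."""
    return incident_mw * 10.0 ** (-sum(optical_densities))

# Hypothetical example: a 1 mW beam behind OD 2.0 + OD 1.0 filters
# is attenuated by a factor of 1000, to 1 microwatt.
safe_mw = transmitted_power_mw(1.0, [2.0, 1.0])
```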

  15. Wilbur: A low-cost CCD camera system for MDM Observatory

    NASA Technical Reports Server (NTRS)

    Metzger, M. R.; Luppino, G. A.; Tonry, J. L.

    1992-01-01

    The recent availability of several 'off-the-shelf' components, particularly CCD control electronics from SDSU, has made it possible to put together a flexible CCD camera system at relatively low cost and effort. The authors describe Wilbur, a complete CCD camera system constructed for the Michigan-Dartmouth-MIT Observatory. The hardware consists of a Loral 2048² CCD controlled by the SDSU electronics, an existing dewar design modified for use at MDM, a Sun Sparcstation 2 with a commercial high-speed parallel controller, and a simple custom interface between the controller and the SDSU electronics. The camera is controlled from the Sparcstation by software that provides low-level I/O in real time, collection of additional information from the telescope, and a simple command interface for use by an observer. Readout of the 2048² array is complete in under two minutes at 5 e⁻ read noise, and readout time can be decreased at the cost of increased noise. The system can be easily expanded to handle multiple CCDs/readouts, and can control other dewars/CCDs using the same host software.

  16. Preliminary results from a single-photon imaging X-ray charge coupled device /CCD/ camera

    NASA Technical Reports Server (NTRS)

    Griffiths, R. E.; Polucci, G.; Mak, A.; Murray, S. S.; Schwartz, D. A.; Zombeck, M. V.

    1981-01-01

    A CCD camera is described which has been designed for single-photon X-ray imaging in the 1-10 keV energy range. Preliminary results are presented from the front-side illuminated Fairchild CCD 211, which has been shown to image well at 3 keV. The problem of charge-spreading above 4 keV is discussed by analogy with a similar problem at infrared wavelengths. The total system noise is discussed and compared with values obtained by other CCD users.

  17. Optical synthesizer for a large quadrant-array CCD camera: Center director's discretionary fund

    NASA Technical Reports Server (NTRS)

    Hagyard, Mona J.

    1992-01-01

    The objective of this program was to design and develop an optical device, an optical synthesizer, that focuses four contiguous quadrants of a solar image onto four spatially separated CCD arrays that are part of a unique CCD camera system. This camera and the optical synthesizer will be part of the new NASA-Marshall Experimental Vector Magnetograph, an instrument developed to measure the Sun's magnetic field as accurately as present technology allows. The tasks undertaken in the program are outlined and the final detailed optical design is presented.

  18. Auto-measurement system of aerial camera lens' resolution based on orthogonal linear CCD

    NASA Astrophysics Data System (ADS)

    Zhao, Yu-liang; Zhang, Yu-ye; Ding, Hong-yi

    2010-10-01

    The resolution of an aerial camera lens is one of the camera's most important performance indexes, and its measurement and calibration are important test items in camera maintenance. In the traditional method, a human observer views the resolution panel of a collimator through a reading microscope and performs some computation; this is inefficient, susceptible to human factors, and gives unstable results. An auto-measurement system for aerial camera lens resolution is introduced that uses orthogonal linear CCD sensors as the detector in place of the reading microscope. The system measures automatically and displays results in real time. To measure the smallest identifiable ring of the resolution panel, two orthogonal linear CCDs are laid in the imaging plane of the measured lens, so that four intersection points are formed on the CCDs. A coordinate system is defined by the origin of the linear CCDs, and a circle is determined by the four intersection points. To obtain the circle's radius, the image of the resolution panel is first converted into pulse widths of an electric signal, which is sent to the computer through an amplifying circuit, a threshold comparator, and a counter. The smallest circle is then extracted for measurement; the extraction uses the wavelet transform, which is localized in both time and frequency and supports multi-scale analysis. Finally, the resolution of the measured lens is obtained from the standard resolution formula. An analysis of the measurement precision indicates that precision improves when the linear CCDs replace the reading microscope, and that the residual system error is determined by the CCD pixel size; as CCD technology advances and pixels shrink, the system error will be reduced further. The auto-measuring system therefore has high practical value and wide application prospects.
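The step in which four intersection points on the orthogonal linear CCDs determine a circle can be sketched as a least-squares circle fit. This is a generic Kasa fit, not the authors' code, and the coordinates below are illustrative:

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit: solve x^2 + y^2 + D*x + E*y + F = 0
    for D, E, F; the centre is (-D/2, -E/2) and r^2 = D^2/4 + E^2/4 - F."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    b = -(pts[:, 0] ** 2 + pts[:, 1] ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    return (cx, cy), float(np.sqrt(cx ** 2 + cy ** 2 - F))

# Four illustrative intersection points on a circle of radius 3 about (1, 2)
centre, radius = fit_circle([(4, 2), (1, 5), (-2, 2), (1, -1)])
```

The radius of the smallest resolvable ring, together with the panel scale, then yields the resolution figure.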

  19. Research on detecting heterogeneous fibre from cotton based on linear CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Xian-bin; Cao, Bing; Zhang, Xin-peng; Shi, Wei

    2009-07-01

    Heterogeneous (foreign) fibres in cotton have a great impact on cotton textile production: they degrade product quality and thereby the economic returns and market competitiveness of the producer. Detecting and eliminating heterogeneous fibre is thus particularly important for improving cotton processing, raising the quality of cotton textiles, and reducing production cost, and the technology has favorable market value and development prospects. An optical detection system of this kind is in widespread use: a linear CCD camera scans the running cotton, the video signal is fed into a computer and processed according to grayscale differences, and when heterogeneous fibre is found the computer commands a gas nozzle to blow it out. In this paper we adopt a monochrome LED array as the new detection light source; its flicker, luminous-intensity stability, lumen depreciation, and useful life are all superior to those of a fluorescent lamp. We first analyse the reflection spectra of cotton and of various heterogeneous fibres, then select an appropriate emission band for the source, finally adopting a violet LED array. The complete hardware structure and software design are described in this paper.
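The grayscale-difference test described above can be sketched as follows; the background level and threshold are placeholder values, not those used in the paper:

```python
import numpy as np

def detect_foreign_fibre(line_scan, background_gray=200, threshold=40):
    """Flag pixels of one linear-CCD scan whose gray level deviates from
    the expected cotton background by more than `threshold`; both
    parameter values are placeholders, not the paper's settings."""
    scan = np.asarray(line_scan, dtype=np.int32)
    return np.flatnonzero(np.abs(scan - background_gray) > threshold)

# The returned pixel indices would be used to time and aim the gas nozzle.
```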

  20. Video-Based Point Cloud Generation Using Multiple Action Cameras

    NASA Astrophysics Data System (ADS)

    Teo, T.

    2015-05-01

    Due to the development of action cameras, the use of video technology for collecting geo-spatial data has become an important trend. The objective of this study is to compare the image mode and video mode of multiple action cameras for 3D point cloud generation. Frame images are acquired from discrete camera stations, while videos are taken along continuous trajectories. The proposed method includes five major parts: (1) camera calibration, (2) video conversion and alignment, (3) orientation modelling, (4) dense matching, and (5) evaluation. As action cameras usually have a large FOV in wide viewing mode, camera calibration plays an important role in removing the effect of lens distortion before image matching. Once the cameras had been calibrated, the authors used them to take video in an indoor environment. The videos were then converted into multiple frame images based on the frame rates. To overcome time-synchronization issues between videos from different viewpoints, an additional timer app was used to determine the time-shift factor between cameras for time alignment. A structure-from-motion (SfM) technique was utilized to obtain the image orientations, and the semi-global matching (SGM) algorithm was adopted to obtain dense 3D point clouds. The preliminary results indicated that the 3D points from 4K video are similar to those from 12 MP images, but the data-acquisition performance of 4K video is more efficient than that of 12 MP digital images.
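The time-alignment step, in which a known clock offset between two cameras maps each frame of one video onto the temporally nearest frame of the other, might look like this (the frame rates and shift value are assumptions, not the authors' parameters):

```python
def pair_frames(n_frames_a, fps_a, fps_b, shift_b):
    """For each frame of camera A, find the index of the temporally
    nearest frame of camera B, whose recording started shift_b seconds
    later (the shift being measured with the timer app, per the paper;
    all numeric values here are illustrative)."""
    pairs = []
    for i in range(n_frames_a):
        t = i / fps_a                              # frame time on A's clock
        j = max(0, round((t - shift_b) * fps_b))   # nearest valid B frame
        pairs.append((i, j))
    return pairs
```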

  1. A Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Software

    NASA Astrophysics Data System (ADS)

    Oh, S. H.; Kang, Y. W.; Byun, Y. I.

    2007-12-01

    We present software developed for the multi-purpose CCD camera. The software supports all three CCD types made by Kodak: the KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472). For efficient camera control, the software runs as two independent processes, a CCD control program and a temperature/shutter operation program, coordinated through Linux user signals. It is designed for fully automatic as well as manual operation under Linux. We plan to use this software for an all-sky survey system and also for night-sky monitoring and sky observation. The measured read-out times are about 15 s, 64 s, and 134 s for the KAF-0401E, KAF-1602E, and KAF-3200E respectively, limited by the data-transmission speed of the parallel port. Larger-format CCDs require higher transmission speeds, so we are considering porting the control software to USB for faster data transfer.

  2. Inexpensive range camera operating at video speed.

    PubMed

    Kramer, J; Seitz, P; Baltes, H

    1993-05-01

    An optoelectronic device has been developed and built that acquires and displays the range data of an object surface in space in video real time. The recovery of depth is performed with active triangulation. A galvanometer scanner system sweeps a sheet of light across the object at a video field rate of 50 Hz. High-speed signal processing is achieved through the use of a special optical sensor and hardware implementation of the simple electronic-processing steps. Fifty range maps are generated per second and converted into a European standard video signal where the depth is encoded in gray levels or color. The image resolution currently is 128 x 500 pixels with a depth accuracy of 1.5% of the depth range. The present setup uses a 500-mW diode laser for the generation of the light sheet. A 45-mm imaging lens covers a measurement volume of 93 mm x 61 mm x 63 mm at a medium distance of 250 mm from the camera, but this can easily be adapted to other dimensions. PMID:20820391
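The active-triangulation depth recovery can be illustrated with the simplified parallel-axis relation z = b·f/x. The 45 mm lens and 250 mm working distance come from the abstract, while the 100 mm baseline and 18 mm image offset are invented for the example:

```python
def triangulate_depth(baseline_mm, focal_mm, image_offset_mm):
    """Sheet-of-light triangulation, simplified parallel-axis geometry:
    a surface point imaged at lateral offset x in the sensor plane lies
    at depth z = b * f / x from the camera."""
    return baseline_mm * focal_mm / image_offset_mm

# 45 mm lens at a 250 mm working distance (from the abstract); the
# baseline and image offset below are hypothetical.
print(triangulate_depth(100.0, 45.0, 18.0))  # → 250.0
```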

  3. Observing the interplanetary dust particles by the wide-field CCD camera

    NASA Astrophysics Data System (ADS)

    Usui, Fumihiko; Ishiguro, Masateru

    2002-11-01

    Zodiacal light is sunlight scattered by interplanetary dust particles. We have performed zodiacal light observations outside the SUBARU dome using a wide-field CCD camera. In this paper we present recent results on interplanetary dust particles obtained from the SUBARU site, together with the development of WIZARD, a camera originally developed for zodiacal light observations.

  4. A Simple Approach of CCD Camera Calibration for Optical Diagnostics Instrumentation

    NASA Technical Reports Server (NTRS)

    Cha, Soyoung Stephen; Leslie, Fred W.; Ramachandran, Narayanan; Rose, M. Franklin (Technical Monitor)

    2001-01-01

    Solid-state array sensors are ubiquitous nowadays for obtaining full-field images in numerous scientific and engineering applications, including optical diagnostics and instrumentation. Linear response of these sensors is often required, as in interferometry, light scattering and attenuation measurements, and photometry. In most applications the linearity is taken for granted without thorough quantitative assessment or correction through calibration. High-priced, upper-grade CCD cameras may offer better linearity; however, they too require linearity checking and, if necessary, correction. Intermediate- or low-grade CCD cameras are more likely to need linearity calibration. Here we present two very simple approaches: one for quickly checking camera linearity without any additional setup, and one for precisely correcting nonlinear sensor responses. It is believed that after calibration, sensors of intermediate or low grade can function as effectively as their expensive counterparts.
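A quick linearity check of the kind described, though not the authors' exact procedure, can be done by fitting mean signal against exposure time and reporting the worst residual:

```python
import numpy as np

def nonlinearity(exposure_times, mean_signals):
    """Fit signal = a*t + b and return the worst residual as a fraction
    of full scale -- a quick sensor-linearity check from a series of
    flat frames taken at increasing exposure times."""
    a, b = np.polyfit(exposure_times, mean_signals, 1)
    resid = np.asarray(mean_signals) - (a * np.asarray(exposure_times) + b)
    return float(np.max(np.abs(resid)) / np.max(mean_signals))
```

A perfectly linear sensor yields a value near zero; a few percent would indicate that correction is worthwhile.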

  5. PIV camera response to high frequency signal: comparison of CCD and CMOS cameras using particle image simulation

    NASA Astrophysics Data System (ADS)

    Abdelsalam, D. G.; Stanislas, M.; Coudert, S.

    2014-08-01

    We present a quantitative comparison between FlowMaster3 CCD and Phantom V9.1 CMOS cameras’ response in the scope of application to particle image velocimetry (PIV). First, the subpixel response is characterized using a specifically designed set-up. The crosstalk between adjacent pixels for the two cameras is then estimated and compared. Then, the camera response is experimentally characterized using particle image simulation. Based on a three-point Gaussian peak fitting, the bias and RMS errors between locations of simulated and real images for the two cameras are accurately calculated using a homemade program. The results show that, although the pixel response is not perfect, the optical crosstalk between adjacent pixels stays relatively low and the accuracy of the position determination of an ideal PIV particle image is much better than expected.
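The three-point Gaussian peak fit used for subpixel localization is a standard PIV formula and can be sketched directly:

```python
import math

def gaussian_subpixel_peak(i_minus, i_peak, i_plus):
    """Three-point Gaussian fit: subpixel offset of a correlation peak,
    computed from the intensities of the peak pixel and its two
    neighbours (all three must be positive)."""
    lm, lc, lp = math.log(i_minus), math.log(i_peak), math.log(i_plus)
    return (lm - lp) / (2.0 * (lm - 2.0 * lc + lp))
```

The returned offset is added to the integer peak position; applying it separately along rows and columns gives the 2D subpixel location.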

  6. The University of Hawaii Institute for Astronomy CCD camera control system

    NASA Technical Reports Server (NTRS)

    Jim, K. T. C.; Yamada, H. T.; Luppino, G. A.; Hlivak, R. J.

    1992-01-01

    The University of Hawaii Institute for Astronomy CCD Camera Control System consists of a NeXT workstation, a graphical user interface, and a fiber optics communications interface which is connected to a San Diego State University CCD controller. The UH system employs the NeXT-resident Motorola DSP 56001 as a real time hardware controller. The DSP 56001 is interfaced to the Mach-based UNIX of the NeXT workstation by DMA and multithreading. Since the SDSU controller also uses the DSP 56001, the NeXT is used as a development platform for the embedded control software. The fiber optic interface links the two DSP 56001's through their Synchronous Serial Interfaces. The user interface is based on the NeXTStep windowing system. It is easy to use and features real-time display of image data and control over all camera functions. Both Loral and Tektronix 2048 x 2048 CCDs have been driven at full readout speeds, and the system is intended to be capable of simultaneous readout of four such CCDs. The total hardware package is compact enough to be quite portable and has been used on five different telescopes on Mauna Kea. The complete CCD control system can be assembled for a very low cost. The hardware and software of the control system has proven to be quite reliable, well adapted to the needs of astronomers, and extensible to increasingly complicated control requirements.

  7. Design and realization of an AEC&AGC system for the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Hao yun

    2015-08-01

    An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. Conventional AEC and AGC algorithms are not suitable for an aerial camera, which must take high-resolution photographs while moving at high speed. The system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. An automatic gamma correction is applied before the image is output, making the image easier for human viewers to inspect and analyze. The AEC and AGC system avoids underexposure, overexposure, and image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment, high adaptability, and high reliability in severe, complex environments.
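One illustrative AEC/AGC iteration consistent with the description, with all limits and targets chosen arbitrarily for the sketch, might be:

```python
import math

def aec_agc_step(mean_gray, shutter_us, gain_db,
                 target_gray=128, shutter_max_us=2000.0, gain_max_db=18.0):
    """One exposure-control iteration (limits and targets are made-up
    sketch values, not the paper's): scale the electronic shutter toward
    the brightness target, cap it to limit motion blur at high aircraft
    speed, then make up any remaining deficit with analog gain
    (20*log10 of the shortfall ratio)."""
    mean_gray = max(mean_gray, 1)            # guard against a black frame
    wanted_us = shutter_us * target_gray / mean_gray
    new_shutter = min(wanted_us, shutter_max_us)
    new_gain = min(gain_db + 20.0 * math.log10(wanted_us / new_shutter),
                   gain_max_db)
    return new_shutter, new_gain
```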

  8. Optics design of laser spotter camera for ex-CCD sensor

    NASA Astrophysics Data System (ADS)

    Nautiyal, R. P.; Mishra, V. K.; Sharma, P. K.

    2015-06-01

    Laser-based instruments such as laser range finders and laser designators have gained prominence in modern military applications. Because the human eye cannot see the laser beam directly, the laser is aimed at the target with the help of a boresighted graticule. Two types of detectors are available for viewing a laser spot, the InGaAs detector and the Ex-CCD detector, the latter being the cost-effective solution. In this paper the optics design for an Ex-CCD-based camera is discussed. The designed system is lightweight and compact and can see a 1064 nm pulsed laser spot up to a range of 5 km.

  9. Demonstrations of Optical Spectra with a Video Camera

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2012-01-01

    The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera…

  10. Initial laboratory evaluation of color video cameras: Phase 2

    SciTech Connect

    Terry, P.L.

    1993-07-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than to identify intruders. The monochrome cameras were selected over color cameras because they have greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Because color camera technology is rapidly changing and because color information is useful for identification purposes, Sandia National Laboratories has established an on-going program to evaluate the newest color solid-state cameras. Phase One of the Sandia program resulted in the SAND91-2579/1 report titled: Initial Laboratory Evaluation of Color Video Cameras. The report briefly discusses imager chips, color cameras, and monitors, describes the camera selection, details traditional test parameters and procedures, and gives the results reached by evaluating 12 cameras. Here, in Phase Two of the report, we tested 6 additional cameras using traditional methods. In addition, all 18 cameras were tested by newly developed methods. This Phase 2 report details those newly developed test parameters and procedures, and evaluates the results.

  11. Camera Control and Geo-Registration for Video Sensor Networks

    NASA Astrophysics Data System (ADS)

    Davis, James W.

    With the use of large video networks, there is a need to coordinate and interpret the video imagery for decision support systems with the goal of reducing the cognitive and perceptual overload of human operators. We present computer vision strategies that enable efficient control and management of cameras to effectively monitor wide-coverage areas, and examine the framework within an actual multi-camera outdoor urban video surveillance network. First, we construct a robust and precise camera control model for commercial pan-tilt-zoom (PTZ) video cameras. In addition to providing a complete functional control mapping for PTZ repositioning, the model can be used to generate wide-view spherical panoramic viewspaces for the cameras. Using the individual camera control models, we next individually map the spherical panoramic viewspace of each camera to a large aerial orthophotograph of the scene. The result provides a unified geo-referenced map representation to permit automatic (and manual) video control and exploitation of cameras in a coordinated manner. The combined framework provides new capabilities for video sensor networks that are of significance and benefit to the broad surveillance/security community.

  12. Deflection Measurements of a Thermally Simulated Nuclear Core Using a High-Resolution CCD-Camera

    NASA Technical Reports Server (NTRS)

    Stanojev, B. J.; Houts, M.

    2004-01-01

    Space fission systems under consideration for near-term missions all use compact, fast-spectrum reactor cores. Reactor dimensional change with increasing temperature, which affects neutron leakage, is the dominant source of reactivity feedback in these systems. Accurately measuring core dimensional changes during realistic non-nuclear testing is therefore necessary for predicting the nuclear-equivalent behavior of the system. This paper discusses one key technique being evaluated for measuring such changes: using a charge-coupled device (CCD) sensor to obtain deformation readings of an electrically heated, prototypic reactor core geometry. A technique is introduced by which a single high-spatial-resolution CCD camera measures core deformation in real time (RT). Initial system checkout results are presented, along with a discussion of how additional cameras could be used to obtain a three-dimensional deformation profile of the core during testing.

  13. Development of a portable 3CCD camera system for multispectral imaging of biological samples.

    PubMed

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  14. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  15. Spectroheliograms recorded using the new CCD camera in the OAUC, Coimbra, Portugal

    NASA Astrophysics Data System (ADS)

    Garcia, A.; Klvaňa, M.; Sobotka, M.; Bumba, V.

    2010-12-01

    Spectroheliograms at the OAUC (Coimbra, Portugal) have been photographed in the spectral line of Ca II continuously since 1926, and in H-alpha since 1990. Since 2007, all spectroheliograms have been recorded with the new CCD camera. Specifications of the camera, including the new optical scheme of the spectrograph, were presented in a previous paper (Klvana et al., 2006). Using data recorded in 2010, we demonstrate the good quality of spectroheliograms taken under standard observing conditions, the influence of clouds, and the effects introduced by filtering.

  16. Outer planet investigations using a CCD camera system. [Saturn disk photometry

    NASA Technical Reports Server (NTRS)

    Price, M. J.

    1980-01-01

    Problems related to analog noise, data transfer from the camera buffer to the storage computer, and loss of sensitivity of a two-dimensional charge-coupled-device imaging system are reported. To calibrate the CCD system, calibrated UBV pinhole scans of the Saturn disk were obtained with a photoelectric area-scanning photometer. Atmospheric point-spread functions were also obtained. The UBV observations and models of the Saturn atmosphere are analyzed.

  17. Modeling of the over-exposed pixel area of CCD cameras caused by laser dazzling

    NASA Astrophysics Data System (ADS)

    Benoist, Koen W.; Schleijpen, Ric H. M. A.

    2014-10-01

    A simple model has been developed and implemented in Matlab code to predict the over-exposed pixel area of cameras caused by laser dazzling. Inputs of the model are the laser irradiance on the front optics of the camera, the point spread function (PSF) of the optics, the integration time of the camera, and camera sensor specifications such as pixel size, quantum efficiency, and full-well capacity. Effects of the camera's read-out circuit are not incorporated. The model was evaluated against laser-dazzle experiments on CCD cameras using a 532 nm CW laser dazzler and shows good agreement. For relatively low laser irradiance the model predicts the over-exposed laser spot area quite accurately and reproduces the cube-root dependency of spot diameter on laser irradiance caused by the PSF, as demonstrated before for IR cameras. At higher laser power levels the laser-induced spot diameter grows more rapidly than predicted, which can probably be attributed to scatter effects in the camera. First attempts to model the scatter contribution, using a simple power-law scatter function f(θ), show good resemblance to the experiments. The model thus provides a tool for assessing the performance of observation sensor systems subjected to laser countermeasures.
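The reported cube-root dependency of spot diameter on irradiance can be written down directly; the reference diameter and irradiance below are placeholders, not values from the paper:

```python
def dazzle_spot_diameter(irradiance, d_ref=10.0, e_ref=1.0):
    """Cube-root scaling of the over-exposed spot diameter with laser
    irradiance, D = D_ref * (E / E_ref)**(1/3); the reference diameter
    and irradiance are placeholders, not measured values."""
    return d_ref * (irradiance / e_ref) ** (1.0 / 3.0)

# Eight times the reference irradiance only doubles the spot diameter.
```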

  18. High-resolution image digitizing through 12x3-bit RGB-filtered CCD camera

    NASA Astrophysics Data System (ADS)

    Cheng, Andrew Y. S.; Pau, Michael C. Y.

    1996-09-01

    A high-resolution computer-controlled CCD image-capturing system is developed using a 12-bit 1024×1024-pixel CCD camera and motorized RGB filters to capture images with color depth up to 36 bits. The filters separate the major color components and collect them individually, while the CCD camera maintains the spatial resolution and detector filling factor; the color separation is thus done optically rather than electronically. Operation is simple: objects such as color photographs, slides, and even X-ray transparencies are placed under the camera system, and the necessary parameters such as integration time, mixing level, and light intensity are adjusted automatically by an on-line expert system. This greatly reduces restrictions on what can be captured. This unique approach saves considerable time in adjusting image quality and gives much more flexibility in manipulating the captured object, even a 3D object, with minimal setup fixtures. In addition, cross-sectional dimensions of a 3D object can be analyzed by adding a fiber-optic ring light source, which is particularly useful for non-contact metrology of 3D structures. The digitized information can be stored in an easily transferable format, and users can also apply a special LUT mapping automatically or manually. Applications of the system include medical image archiving, print quality control, 3D machine vision, etc.

  19. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  20. DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER OF THE MLP - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  1. Video camera system for locating bullet holes in targets at a ballistics tunnel

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Rummler, D. R.; Goad, W. K.

    1990-01-01

    A system consisting of a single charge-coupled device (CCD) video camera, a computer-controlled video digitizer, and software to automate the measurement was developed to measure the location of bullet holes in targets at the International Shooters Development Fund (ISDF)/NASA Ballistics Tunnel. The camera/digitizer system is a crucial component of a highly instrumented indoor 50-meter rifle range being constructed to support development of wind-resistant, ultra-match ammunition. The system was designed to take data rapidly (10 s between shots) and automatically, with little operator intervention. The system description, measurement concept, and procedure are presented along with laboratory tests of repeatability and bias error. The long-term (1 hour) repeatability of the system was found to be 4 microns (one standard deviation) at the target, and the bias error was found to be less than 50 microns. An analysis of potential errors and a technique for calibration of the system are presented.
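The hole-location measurement can be sketched as frame differencing followed by a centroid; this is a generic approach rather than the authors' documented algorithm, and the threshold is illustrative:

```python
import numpy as np

def locate_new_hole(before, after, threshold=50):
    """Locate a fresh bullet hole as the centroid of pixels that changed
    between a pre-shot and a post-shot frame of the target (the change
    threshold is an illustrative value)."""
    diff = np.abs(after.astype(np.int32) - before.astype(np.int32))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None                          # no new hole detected
    return float(xs.mean()), float(ys.mean())  # (x, y) in pixels
```

A pixel-to-target-plane calibration would then convert the pixel centroid to physical coordinates.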

  2. Traceability of a CCD-Camera System for High-Temperature Measurements

    NASA Astrophysics Data System (ADS)

    Bünger, L.; Anhalt, K.; Taubert, R. D.; Krüger, U.; Schmidt, F.

    2015-08-01

    A CCD camera, specially equipped with narrow-band interference filters in the visible spectral range for temperature measurements above 1200 K, was characterized with respect to its temperature response traceable to ITS-90 and with respect to absolute spectral radiance responsivity. The calibration traceable to ITS-90 was performed at a high-temperature blackbody source using a radiation thermometer as a transfer standard. Use of Planck's law together with the absolute spectral radiance responsivity of the camera system allows determination of the thermodynamic temperature. For the determination of the absolute spectral radiance responsivity, a monochromator-based setup with a supercontinuum white-light laser source was developed. The CCD camera system was characterized with respect to dark-signal non-uniformity, photo-response non-uniformity, non-linearity, and the size-of-source effect. The influence of these parameters on calibration and measurement was evaluated and is included in the uncertainty budget. The results of the two different calibration schemes over the investigated temperature range from 1200 K to 1800 K are in good agreement within the expanded uncertainty. The uncertainty of the absolute spectral responsivity of the camera is 0.56%.
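Determining thermodynamic temperature from a measured spectral radiance via Planck's law, as described above, amounts to inverting the Planck formula at the filter's effective wavelength. The radiation constants below are CODATA values; the wavelength and temperature in the example are illustrative, not the paper's:

```python
import math

C1L = 1.191042972e-16  # first radiation constant for radiance, W m^2 / sr
C2 = 1.438776877e-2    # second radiation constant, m K

def planck_radiance(t_kelvin, wavelength_m):
    """Blackbody spectral radiance L(lambda, T) from Planck's law."""
    return C1L / (wavelength_m ** 5
                  * (math.exp(C2 / (wavelength_m * t_kelvin)) - 1.0))

def planck_temperature(radiance, wavelength_m):
    """Invert Planck's law: thermodynamic temperature from the measured
    spectral radiance at one narrow-band effective wavelength."""
    return C2 / (wavelength_m
                 * math.log(1.0 + C1L / (wavelength_m ** 5 * radiance)))
```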

  3. Video-rate recognition and localization for wearable cameras

    E-print Network

    Oxford, University of

    Video-rate recognition and localization for wearable cameras. R O Castle, D J Gawley, G Klein, and D W Murray. Video-rate localization of a wearable or hand-held camera provides the geometrical foundation for several capabilities of value to an autonomous wearable vision system. The one explored here is the ability to incorporate recognized objects.

  4. Design of an Event-Driven Random-Access-Windowing CCD-Based Camera

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P.; Lam, Raymond K.; Portillo, Angel A.; Ortiz, Gerardo G.

    2003-01-01

    Commercially available cameras are not designed for the combination of single-frame and high-speed streaming digital video with real-time control of the size and location of multiple regions of interest (ROIs). A new control paradigm is defined to eliminate the tight coupling between the camera logic and the host controller. This functionality is achieved by defining the indivisible pixel read-out operation on a per-ROI basis, with in-camera timekeeping capability. The methodology provides a Random-Access, Real-Time, Event-driven (RARE) camera for adaptive camera control and is well suited to target-tracking applications requiring autonomous control of multiple ROIs. It additionally provides reduced ROI read-out time and higher frame rates than the original architecture by avoiding external control intervention during the ROI read-out process.

  5. Thermal modeling of cooled instrument: from the WIRCam IR camera to CCD Peltier cooled compact packages

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Stadler, Eric; Downing, Mark; Hurrell, Steve; Wheeler, Patrick; Gach, Jean-Luc; Magnard, Yves; Balard, Philippe; Guillaume, Christian; Hubin, Norbert; Diaz, José Javier; Suske, Wolfgang; Jorden, Paul

    2006-06-01

    In the past decade, new thermal modelling tools have been offered to system designers. These modelling tools have rarely been used for cooled instruments in ground-based astronomy. Together with the dramatic increase in PC computing power, these tools are now mature enough to drive the design of complex cooled astronomical instruments. This is the case for WIRCam, the new wide-field infrared camera installed on the CFHT in Hawaii on the Mauna Kea summit. This camera uses four 2K×2K Rockwell Hawaii-2RG infrared detectors and includes 2 optical barrels and 2 filter wheels. This camera is mounted at the prime focus of the 3.6m CFHT telescope. The mass to be cooled is close to 100 kg. The camera uses a Gifford-McMahon closed-cycle cryo-cooler. The capabilities of the I-deas thermal module (TMG) are demonstrated for our particular application: predicted performances are presented and compared to real measurements after integration on the telescope in December 2004. In addition, we present thermal modelling of small Peltier-cooled CCD packages, including the thermal model of the CCD220 Peltier package (fabricated by e2v technologies) and cold head. ESO and the OPTICON European network have funded e2v technologies to develop a compact packaged Peltier-cooled 8-output back-illuminated L3Vision CCD. The device will achieve sub-electron read noise at frame rates up to 1.5 kHz. The development, fully dedicated to the latest generation of adaptive optics wavefront sensors, has many unique features. Among them, the ultra-compactness offered by a Peltier package integrated in a small cold head including the detector drive electronics is a way to achieve outstanding performance for adaptive optics systems. All these models were run on an ordinary PC laptop.

  6. Optical readout of a two phase liquid argon TPC using CCD camera and THGEMs

    NASA Astrophysics Data System (ADS)

    Mavrokoridis, K.; Ball, F.; Carroll, J.; Lazos, M.; McCormick, K. J.; Smith, N. A.; Touramanis, C.; Walker, J.

    2014-02-01

    This paper presents a preliminary study into the use of CCDs to image secondary scintillation light generated by THick Gas Electron Multipliers (THGEMs) in a two-phase LAr TPC. A Sony ICX285AL CCD chip was mounted above a double THGEM in the gas phase of a 40 litre two-phase LAr TPC, with the majority of the camera electronics positioned externally via a feedthrough. An Am-241 source was mounted on a rotatable motion feedthrough, allowing the alpha source to be positioned either inside or outside of the field cage. A novel high-voltage feedthrough featuring LAr insulation was developed for and incorporated into the TPC design. Furthermore, a range of webcams was tested for operation in cryogenic conditions as an internal detector monitoring tool. Of those tested, the Microsoft HD-3000 (model no. 1456) webcam was found to be superior in terms of noise and lowest operating temperature. In 1 ppm purity argon gas at ambient temperature and atmospheric pressure, the THGEM gain was ≈ 1000, and using a 1 ms exposure the CCD captured single alpha tracks. Successful operation of the CCD camera in two-phase cryogenic mode was also achieved. Using a 10 s exposure, a photograph of secondary scintillation light induced by the Am-241 source in LAr was captured for the first time.

  7. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  8. Autoguiding on the 20-inch Telescope The direct imaging camera on the telescope has a second, smaller, CCD that can be used to

    E-print Network

    Gustafsson, Torgny

    Autoguiding on the 20-inch Telescope The direct imaging camera on the telescope has a second, smaller, CCD that can be used to autoguide the telescope while exposing an image on the main CCD. Use the imager CCD and the guider CCD, so you can move the telescope to bring a good (i.e. as bright as possible

  9. Video Cameras in the Ondrejov Flare Spectrograph Results and Prospects

    NASA Astrophysics Data System (ADS)

    Kotrc, P.

    Since 1991, video cameras have been widely used both in the image and in the spectral data acquisition of the Ondrejov Multichannel Flare Spectrograph. In addition to classical photographic registration, this kind of detector brought new possibilities, especially for observations of dynamical solar phenomena, and placed new requirements on digitization, archiving, and data-processing techniques. The unique complex video system, consisting of four video cameras and auxiliary equipment, was largely developed, implemented, and used at the Ondrejov observatory. The main advantages and limitations of the system are briefly described from the points of view of its scientific philosophy, aims, and outputs. Some results obtained, experience gained, and future prospects are discussed.

  10. Design and realization of an image mosaic system on the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Wang, Peng; Zhu, Hai bin; Li, Yan; Zhang, Shao jun

    2015-08-01

    It has long been difficult in aerial photography to stitch multi-route images into a panoramic image in real time for a multi-route framing CCD camera, given the very large data volumes and high accuracy requirements. An automatic aerial image mosaic system based on a GPU development platform is described in this paper. Parallel computing of the SIFT feature extraction and matching algorithm modules is achieved using CUDA technology for motion-model parameter estimation, which makes it possible to stitch multiple CCD images in real time. Aerial tests proved that the mosaic system meets the user's requirements, with 99% accuracy and a 30- to 50-fold speed improvement over a conventional mosaic system.
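
The mosaic pipeline above estimates motion-model parameters from GPU-accelerated SIFT matches. As a greatly simplified, CPU-only stand-in, a robust translation-only motion model can be read off matched keypoints with a median (the correspondences below are hypothetical; the real system uses CUDA-parallel SIFT matching and a richer motion model):

```python
from statistics import median

def estimate_translation(matches):
    """Robust translation-only motion model from feature correspondences.

    matches: list of ((x1, y1), (x2, y2)) matched keypoint pairs.
    The median suppresses outlier matches (a crude stand-in for RANSAC).
    """
    dx = median(x2 - x1 for (x1, _), (x2, _) in matches)
    dy = median(y2 - y1 for (_, y1), (_, y2) in matches)
    return dx, dy

# Matched points shifted by (5, -3), plus one gross outlier match
matches = [((0, 0), (5, -3)), ((10, 4), (15, 1)),
           ((7, 8), (12, 5)), ((3, 3), (90, 70))]
shift = estimate_translation(matches)
```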

  11. 800 x 800 charge-coupled device /CCD/ camera for the Galileo Jupiter Orbiter mission

    NASA Technical Reports Server (NTRS)

    Clary, M. C.; Klaasen, K. P.; Snyder, L. M.; Wang, P. K.

    1979-01-01

    During January 1982 the NASA space transportation system will launch a Galileo spacecraft composed of an orbiting bus and an atmospheric entry probe to arrive at the planet Jupiter in July 1985. A prime element of the orbiter's scientific instrument payload will be a new generation slow-scan planetary imaging system based on a newly developed 800 x 800 charge-coupled device (CCD) image sensor. Following Jupiter orbit insertion, the single, narrow-angle, CCD camera, designated the Solid State Imaging (SSI) Subsystem, will operate for 20 months as the orbiter makes repeated encounters with Jupiter and its Galilean Satellites. During this period the SSI will acquire 40,000 images of Jupiter's atmosphere and the surfaces of the Galilean Satellites. This paper describes the SSI, its operational modes, and science objectives.

  12. An intensified/shuttered cooled CCD camera for dynamic proton radiography

    SciTech Connect

    Yates, G.J.; Albright, K.L.; Alrick, K.R.

    1998-12-31

    An intensified/shuttered cooled PC-based CCD camera system was designed and successfully fielded on proton radiography experiments at the Los Alamos National Laboratory LANSCE facility using 800-MeV protons. The four-camera detector system used front-illuminated full-frame CCD arrays (two 1,024 x 1,024 pixels and two 512 x 512 pixels) fiber-optically coupled to either 25-mm diameter planar diode or microchannel plate image intensifiers, which provided optical shuttering for time-resolved imaging of shock propagation in high explosives. The intensifiers also provided wavelength shifting and optical gain. Typical sequences of four images, corresponding to consecutive exposures of about 500 ns duration for 40-ns proton burst images (from a fast scintillating fiber array) separated by approximately 1 microsecond, were taken during the radiography experiments. Camera design goals and measured performance characteristics including resolution, dynamic range, responsivity, system detection quantum efficiency (DQE), and signal-to-noise ratio will be discussed.

  13. CQUEAN: New CCD Camera System For The Otto Struve Telescope At The McDonald Observatory

    NASA Astrophysics Data System (ADS)

    Pak, Soojong; Park, W.; Im, M.

    2012-01-01

    We describe the overall characteristics and performance of an optical CCD camera system, the Camera for QUasars in EArly uNiverse (CQUEAN), which has been in use at the 2.1m Otto Struve Telescope of the McDonald Observatory since August 2010. CQUEAN was developed for follow-up imaging observations of near-infrared bright sources such as high-redshift quasar candidates (z > 4.5), gamma-ray bursts, brown dwarfs, and young stellar objects. For efficient observation of red objects, CQUEAN has a science camera with a deep-depletion CCD chip. By employing an auto-guiding system and a focal reducer to enhance the field of view at the classical Cassegrain focus, we achieved stable guiding in 20-minute exposures, imaging quality with FWHM ~ 0.6 arcsec over the whole field (4.8 × 4.8 arcmin), and a limiting magnitude of z = 23.4 AB mag at 5 sigma with one hour of integration.

  14. Experimental research on femto-second laser damaging array CCD cameras

    NASA Astrophysics Data System (ADS)

    Shao, Junfeng; Guo, Jin; Wang, Ting-feng; Wang, Ming

    2013-05-01

    Charge-coupled devices (CCDs) are widely used in military and security applications, such as airborne and ship-based surveillance, satellite reconnaissance, and so on. Homeland security requires effective means to negate these advanced surveillance systems. Research shows that CCD-based EO systems can be significantly dazzled or even damaged by high-repetition-rate pulsed lasers. Here we report on femtosecond laser interaction with a CCD camera, which is likely to be of great importance in the future. Femtosecond lasers are relatively new, with unique characteristics such as extremely short pulse width (1 fs = 10^-15 s), extremely high peak power (1 TW = 10^12 W), and distinctive behavior when interacting with matter. Research on femtosecond laser interaction with materials (metals, dielectrics) clearly indicates that non-thermal effects dominate the process, in marked contrast to the interaction of long pulses with matter. First, damage threshold tests were performed with the femtosecond laser acting on the CCD camera. An 800 nm, 500 µJ, 100 fs laser pulse was used to irradiate an interline CCD solid-state image sensor. To focus the laser energy onto the tiny CCD active cells, an F/5.6 optical system was used. Sony CCDs were chosen as typical targets. The damage threshold was evaluated from multiple test data. Point damage, line damage, and full-array damage were observed as the irradiated pulse energy was continuously increased during the experiment. The point damage threshold was found to be 151.2 mJ/cm². The line damage threshold was found to be 508.2 mJ/cm². The full-array damage threshold was found to be 5.91 J/cm². Although the phenomenology is almost the same as that of nanosecond laser interaction with CCDs, these damage thresholds are substantially lower than the data obtained from nanosecond laser interaction with CCDs.
At the same time, the electrical characteristics after different degrees of damage were tested with a multimeter. The resistance values between clock signal lines were measured. Comparing the resistance values of the CCD before and after damage, it was found that the resistance decreased significantly between the vertical transfer clock signal lines. The same result was found between the vertical transfer clock signal line and the earth electrode (ground). Finally, the damage position and the damage mechanism were analyzed using the above results together with SEM morphological experiments. Point damage results from the laser destroying material and shows no macroscopic electrical influence. Line damage is quite different from point damage, showing a deeper material-corroding effect; more importantly, short circuits were found between vertical clock lines. Under SEM, full-array damage appears even more severe than line damage, while no electrical features obviously different from those of line damage were found. Further research into the mechanism of femtosecond-laser-induced CCD damage with more advanced tools is anticipated. This research is valuable for EO countermeasure and/or laser shielding applications.
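
The reported thresholds are fluences, i.e. pulse energy divided by spot area. A small sketch of classifying an exposure against the measured thresholds from the abstract (the 20 µm spot diameter is a hypothetical focusing assumption, not a value from the experiment):

```python
import math

# Measured damage thresholds from the abstract, converted to J/cm^2
POINT_THRESHOLD = 0.1512   # 151.2 mJ/cm^2
LINE_THRESHOLD = 0.5082    # 508.2 mJ/cm^2
FULL_THRESHOLD = 5.91      # 5.91 J/cm^2

def fluence_j_per_cm2(pulse_energy_j, spot_diameter_cm):
    """Average fluence: pulse energy over the focal spot area."""
    area = math.pi * (spot_diameter_cm / 2.0) ** 2
    return pulse_energy_j / area

def damage_class(fluence):
    """Map a fluence (J/cm^2) to the damage regime reported above."""
    if fluence >= FULL_THRESHOLD:
        return "full-array damage"
    if fluence >= LINE_THRESHOLD:
        return "line damage"
    if fluence >= POINT_THRESHOLD:
        return "point damage"
    return "no damage"

# 500 uJ pulse focused to a hypothetical 20 um (2e-3 cm) diameter spot
f = fluence_j_per_cm2(500e-6, 2e-3)
```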

  15. Flutter Shutter Video Camera for Compressive Sensing of Videos Jason Holloway Aswin C. Sankaranarayanan Ashok Veeraraghavan Salil Tambe

    E-print Network

    Sankaranarayanan, Aswin C.

    Flutter Shutter Video Camera for Compressive Sensing of Videos Jason Holloway Aswin C Abstract Video cameras are invariably bandwidth limited and this results in a trade-off between spatial the incoming video using spatio-temporal light modulators and capture the modulated video at a lower bandwidth

  16. Optical system design of multi-spectral and large format color CCD aerial photogrammetric camera

    NASA Astrophysics Data System (ADS)

    Qian, Yixian; Sun, Tianxiang; Gao, Xiaodong; Liang, Wei

    2007-12-01

    Multi-spectral imaging with high spatial resolution has always been a key problem in the optical design of aerial photogrammetric cameras. It is difficult to obtain an optical system with a high modulation transfer function (MTF) over a wide band. At the same time, to acquire high-quality images, chromatic distortion in the optical system must be kept below 0.5 pixels, which is challenging given the wide field and multiple spectral bands. In this paper, the MTF and bands of the system are analyzed. A Russar-type photogrammetric objective is chosen as the basic optical structure, and a novel optical system is presented to solve the problem. The new photogrammetric optical system, consisting of a panchromatic optical system and a chromatic optical system, is designed. The panchromatic system, which produces the panchromatic image, is made up of a 9K×9K large-format CCD and a high-accuracy photographic objective lens; its focal length is 69.83 mm, the field angle is 60°×60°, the CCD pixel size is 8.75 µm × 8.75 µm, and the spectral range is 0.43 µm to 0.74 µm. Its modulation transfer function is above 0.4 over the whole field at a spatial frequency of 60 lp/mm, and distortion is less than 0.007%. In the chromatic optical system, three 2K×2K CCD arrays are each paired with identical photographic objectives, and the high-resolution chromatic image is acquired by synthesizing the red, green, and blue image data delivered by the three CCD sensors. The chromatic objectives have a focal length of 24.83 mm and share the same spectral range of 0.39 µm to 0.74 µm; they differ only in the filter coatings on their protective glass. The pixel count is 2048 × 2048, and the MTF exceeds 0.4 over the full field at a spatial frequency of 30 lp/mm. The advantages of the digital aerial photogrammetric camera over the traditional film camera are described.
The two development trends for digital aerial photogrammetric cameras are considered to be higher spectral resolution and higher spatial resolution. The merits of this aerial photogrammetric camera are its multi-spectral capability, high resolution, low distortion, light weight, and wide field. It can be applied in aerial photography and remote sensing in place of the traditional film camera. Trials and analysis of the design results show that the system can meet large-scale aerial survey requirements.

  17. Upwelling radiance at 976 nm measured from space using the OPALS CCD camera on the ISS

    NASA Astrophysics Data System (ADS)

    Biswas, Abhijit; Kovalik, Joseph M.; Oaida, Bogdan V.; Abrahamson, Matthew; Wright, Malcolm W.

    2015-03-01

    The Optical Payload for Lasercomm Science (OPALS) Flight System on board the International Space Station uses a charge-coupled device (CCD) camera to detect a beacon laser from Earth. Relative measurements of the background contributed by upwelling radiance under diverse illumination conditions and varying surface terrain are presented. In some cases clouds in the field of view allowed a comparison of terrestrial and cloud-top upwelling radiance. In this paper we report these measurements and examine the extent of agreement with atmospheric model predictions.

  18. Telespectrophotometry of human skin diseases by means of a CCD camera

    NASA Astrophysics Data System (ADS)

    Marchesini, Renato; Ballerini, Mauro; Bartoli, Cesare; Pignoli, Emanuele; Sichirollo, Adele E.; Tomatis, Stefano; Zurrida, Stefano; Cascinelli, Natale

    1994-01-01

    Spectrophotometry of in vivo skin pigmented lesions by means of an integrating sphere coupled to a conventional spectrophotometer has recently been suggested as a useful tool to discriminate cutaneous melanoma from other pigmented cutaneous lesions. To improve reflectance spectral analysis of moles, a new spectrophotometric procedure based on the use of a CCD camera provided with interferential filters has been developed. Preliminary results suggest that the new method, allowing a spatially resolved analysis of the spectral components from 420 to 1040 nm, would improve, when properly implemented with imaging data handling, the quality of a computer assisted diagnosis of malignant lesions.

  19. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument currently being developed by NASA's Marshall Space Flight Center (MSFC), the National Astronomical Observatory of Japan (NAOJ), and other partners. The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6 nm) light emitted by the Sun's chromosphere. The polarized spectrum imaged by the CCD cameras will capture information about the local magnetic field, allowing for measurements of magnetic strength and structure. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, read noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of 2.0 ± 0.5 e-/DN, a read noise level less than 25 e-, a dark current level less than 10 e-/pixel/s, and a residual non-linearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.
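
Camera gain in e-/DN, as specified above, is commonly estimated with the photon-transfer method: for shot-noise-limited flat fields, gain = mean signal / shot-noise variance (both in DN). A sketch with simulated flats (the photon-transfer method is standard practice, not necessarily the exact CLASP test procedure):

```python
import math
import random
from statistics import fmean, pvariance

def gain_from_flat_pair(frame_a, frame_b):
    """Photon-transfer gain estimate (e-/DN) from two identical flat fields.

    Shot-noise variance is taken from the difference frame (var/2), which
    cancels fixed-pattern structure; gain = mean signal / shot variance.
    """
    mean_dn = (fmean(frame_a) + fmean(frame_b)) / 2.0
    diff = [a - b for a, b in zip(frame_a, frame_b)]
    var_dn = pvariance(diff) / 2.0
    return mean_dn / var_dn

# Simulate two flats: 10,000 e- mean signal, true gain 2.0 e-/DN.
# Poisson shot noise is approximated by a Gaussian of matching variance.
random.seed(42)
TRUE_GAIN, ELECTRONS = 2.0, 10000.0

def make_flat(n):
    return [random.gauss(ELECTRONS, math.sqrt(ELECTRONS)) / TRUE_GAIN
            for _ in range(n)]

g = gain_from_flat_pair(make_flat(20000), make_flat(20000))
```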

  20. Camera networks and microphone arrays for video conferencing

    NASA Astrophysics Data System (ADS)

    Trivedi, Mohan M.; Rao, Bhaskar D.; Ng, Kim C.

    1999-11-01

    Robust and adaptive operation of a video conferencing system can be realized by proper utilization of a network of video cameras and microphone arrays. Such multimodal sensory modules can provide valuable information about the geometrical and acoustical properties of an environment and can allow for real-time monitoring of dynamic activity in the environment. In this paper, we present an overview of research activities focused on the utilization of omnidirectional camera networks for geometrical and environmental modeling and microphone arrays for speaker localization.

  1. Ball lightning observation: an objective video-camera analysis report

    E-print Network

    Sello, Stefano; Paganini, Enrico

    2011-01-01

    In this paper we describe a video-camera recording of a (probable) ball lightning event and both the related image and signal analyses for its photometric and dynamical characterization. The results strongly support the BL nature of the recorded luminous ball object and allow the researchers to have an objective and unique video document of a possible BL event for further analyses. Some general evaluations of the obtained results considering the proposed ball lightning models conclude the paper.

  2. Ball lightning observation: an objective video-camera analysis report

    E-print Network

    Stefano Sello; Paolo Viviani; Enrico Paganini

    2011-02-18

    In this paper we describe a video-camera recording of a (probable) ball lightning event and both the related image and signal analyses for its photometric and dynamical characterization. The results strongly support the BL nature of the recorded luminous ball object and allow the researchers to have an objective and unique video document of a possible BL event for further analyses. Some general evaluations of the obtained results considering the proposed ball lightning models conclude the paper.

  3. Demo: DSLR Video A DSLR camera is capable of shooting high definition video with the added benefit of using

    E-print Network

    Stowell, Michael

    Demo: DSLR Video A DSLR camera is capable of shooting high definition video with the added benefit disadvantage is that DSLR cameras do not have a fast auto adjusting focus feature that is standard on video-way down ! Or "M" manual focus · Turn lens focus ring · Shooting Video o Live view ! Lever next to setting

  4. Performance of the low light level CCD camera for speckle imaging

    E-print Network

    Saha, S K

    2002-01-01

    A new generation CCD detector called low light level CCD (L3CCD) that performs like an intensified CCD without incorporating a micro channel plate (MCP) for light amplification was procured and tested. A series of short exposure images with millisecond integration time has been obtained. The L3CCD is cooled to about -80 °C by Peltier cooling.

  5. Performance of the low light level CCD camera for speckle imaging

    E-print Network

    S. K. Saha; V. Chinnappan

    2002-09-20

    A new generation CCD detector called low light level CCD (L3CCD) that performs like an intensified CCD without incorporating a micro channel plate (MCP) for light amplification was procured and tested. A series of short exposure images with millisecond integration time has been obtained. The L3CCD is cooled to about -80 °C by Peltier cooling.

  6. Benchmarking of Back Thinned 512x512 X-ray CCD Camera Measurements with DEF X-ray film

    NASA Astrophysics Data System (ADS)

    Shambo, N. A.; Workman, J.; Kyrala, G.; Hurry, T.; Gonzales, R.; Evans, S. C.

    1999-11-01

    Using the Trident Laser Facility at Los Alamos National Laboratory, 25-micron thick, 2 mm diameter titanium disks were shot with 527 nm (green) laser light to measure X-ray yield. 1.0 mil and 0.5 mil aluminum steps were used to test the linearity of the CCD camera, and DEF X-ray film was used to test the calibration of the CCD camera response at 4.75 keV. Both laser spot size and incident laser intensity were constrained to give consistency to the experimental data. This poster will discuss both the experimental design and the results.

  7. Stereo Imaging Velocimetry Technique Using Standard Off-the-Shelf CCD Cameras

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2004-01-01

    Stereo imaging velocimetry is a fluid physics technique for measuring three-dimensional (3D) velocities at a plurality of points. This technique provides full-field 3D analysis of any optically clear fluid or gas experiment seeded with tracer particles. Unlike current 3D particle imaging velocimetry systems that rely primarily on laser-based systems, stereo imaging velocimetry uses standard off-the-shelf charge-coupled device (CCD) cameras to provide accurate and reproducible 3D velocity profiles for experiments that require 3D analysis. Using two cameras aligned orthogonally, we present a closed mathematical solution resulting in an accurate 3D approximation of the observation volume. The stereo imaging velocimetry technique is divided into four phases: 3D camera calibration, particle overlap decomposition, particle tracking, and stereo matching. Each phase is explained in detail. In addition to being utilized for space shuttle experiments, stereo imaging velocimetry has been applied to the fields of fluid physics, bioscience, and colloidal microscopy.
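
The closed solution for two orthogonally aligned cameras can be illustrated with an idealized orthographic model: one camera reads (x, y), the orthogonal camera reads (z, y), and the shared y coordinate cross-checks the stereo match. A toy sketch (the real system uses calibrated perspective cameras and the full four-phase pipeline described above):

```python
def reconstruct_3d(view_front, view_side):
    """Combine two idealized orthographic views into one 3D point.

    view_front: (x, y) position seen by a camera looking along +z.
    view_side:  (z, y) position seen by a camera looking along +x.
    The y coordinate is observed by both cameras; averaging it smooths
    noise, and a large disagreement would flag a bad stereo match.
    """
    x, y_front = view_front
    z, y_side = view_side
    return (x, (y_front + y_side) / 2.0, z)

# A tracer particle seen at (12.0, 5.1) in front and (8.0, 4.9) in side view
point = reconstruct_3d((12.0, 5.1), (8.0, 4.9))
```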

  8. Charge-coupled device (CCD) television camera for NASA's Galileo mission to Jupiter

    NASA Technical Reports Server (NTRS)

    Klaasen, K. P.; Clary, M. C.; Janesick, J. R.

    1982-01-01

    The CCD detector under construction for use in the slow-scan television camera for the NASA Galileo Jupiter orbiter to be launched in 1985 is presented. The science objectives and the design constraints imposed by the earth telemetry link, platform residual motion, and the Jovian radiation environment are discussed. Camera optics are inherited from Voyager; filter wavelengths are chosen to enable discrimination of Galilean-satellite surface chemical composition. The CCD design, an 800 by 800-element 'virtual-phase' solid-state silicon image-sensor array with supporting electronics, is described with detailed discussion of the thermally generated dark current, quantum efficiency, signal-to-noise ratio, and resolution. Tests of the effect of ionizing radiation were performed and are analyzed statistically. An imaging mode using a 2-1/3-sec frame time and on-chip summation of the signal in 2 x 2 blocks of adjacent pixels is designed to limit the effects of the most extreme Jovian radiation. Smearing due to spacecraft/target relative velocity and platform instability will be corrected via an algorithm maximizing spatial resolution at a given signal-to-noise level. The camera is expected to produce 40,000 images of Jupiter and its satellites during the 20-month mission.
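
The on-chip 2 x 2 summation mode can be modeled in software as block binning: each output pixel collects the charge of a 2 x 2 block, trading resolution for signal per pixel. A minimal sketch (illustrative only, not the flight implementation):

```python
def bin_2x2(frame):
    """Software model of on-chip 2x2 charge summation.

    Summing 2x2 blocks quadruples the signal per output pixel while
    halving resolution in each axis, raising signal relative to
    radiation-induced noise events.  frame: list of equal-length rows
    with even dimensions.
    """
    rows, cols = len(frame), len(frame[0])
    return [[frame[r][c] + frame[r][c + 1] +
             frame[r + 1][c] + frame[r + 1][c + 1]
             for c in range(0, cols, 2)]
            for r in range(0, rows, 2)]

binned = bin_2x2([[1, 2, 3, 4],
                  [5, 6, 7, 8],
                  [1, 1, 1, 1],
                  [2, 2, 2, 2]])
```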

  9. Real-time video tracking using PTZ cameras Sangkyu Kanga

    E-print Network

    Abidi, Mongi A.

    as possible to the targets, to get accurate tracking. Real-time video tracking using PTZ cameras. Sangkyu Kang, Joonki Paik, Andreas Koschan, Besma ... Dongjak-Gu, Seoul, 157-756, Korea. ABSTRACT Automatic tracking is essential for a 24-hour intruder

  10. 67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST OF ASSISTANT LAUNCH CONDUCTOR PANEL SHOWN IN CA-133-1-A-66 - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  11. Lights, Camera, Action! Using Video Recordings to Evaluate Teachers

    ERIC Educational Resources Information Center

    Petrilli, Michael J.

    2011-01-01

    Teachers and their unions do not want test scores to count for everything; classroom observations are key, too. But planning a couple of visits from the principal is hardly sufficient. These visits may "change the teacher's behavior"; furthermore, principals may not be the best judges of effective teaching. So why not put video cameras in…

  12. Performance of front-end mixed-signal ASIC for onboard CCD cameras

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Inoue, Shota; Nagino, Ryo; Anabuki, Naohisa; Hayashida, Kiyoshi; Tsunemi, Hiroshi; Doty, John P.; Ikeda, Hirokazu

    2014-07-01

    We report on the development status of the readout ASIC for an onboard X-ray CCD camera. Quick low-noise readout is essential for pile-up-free imaging spectroscopy with future highly sensitive telescopes. The dedicated ASIC for ASTRO-H/SXI has sufficient noise performance only at the slow pixel rate of 68 kHz. We have therefore been developing an upgraded ASIC with fourth-order delta-sigma modulators. Raising the order of the modulator allows the CCD signals to be oversampled fewer times, so that the pixel rate can be increased. The digitized pulse height is a serial bit stream that is decoded with a decimation filter. The weighting coefficients of the filter are optimized by simulation to maximize the signal-to-noise ratio. We present performance figures such as the input equivalent noise (IEN), gain, and effective signal range. Digitized pulse height data were successfully obtained in the first functional test up to 625 kHz. The IEN is almost the same as that obtained with the chip for ASTRO-H/SXI. The residuals from the gain function are about 0.1%, better than those of the conventional ASIC by a factor of two. Assuming the CCD gain is the same as that for ASTRO-H, the effective range is 30 keV at maximum gain; by changing the gain, the ASIC can handle signal charges of up to 100 ke-. These results will be fed back into the optimization of the pulse-height decoding filter.
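
The ASIC digitizes each pixel as an oversampled delta-sigma bit stream that a decimation filter reduces to a pulse height. A toy first-order modulator shows the principle (the ASIC itself is fourth-order, and its decimation filter uses optimized weights rather than the plain average used here):

```python
def delta_sigma_encode(x, n_bits):
    """First-order delta-sigma modulator: input x in [0, 1] -> bit stream.

    The integrator accumulates the error between the input and the
    1-bit output, so the density of 1s in the stream converges to x.
    """
    integrator, bits = 0.0, []
    for _ in range(n_bits):
        bit = 1 if integrator >= 0.0 else 0
        integrator += x - bit
        bits.append(bit)
    return bits

def decimate(bits):
    """Simplest decimation filter: equal-weight average of the stream."""
    return sum(bits) / len(bits)

# Encode a DC level of 0.3 and recover it from the bit density
level = decimate(delta_sigma_encode(0.3, 10000))
```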

  13. Advantages of the CCD camera measurements for profile and wear of cutting tools

    NASA Astrophysics Data System (ADS)

    Varga, G.; Balajti, Z.; Dudás, I.

    2005-01-01

    In our paper we present an evaluation study whose conclusions point in two main directions for our fields of research: on the one hand, the measurement of fixed, stationary workpieces; on the other, the geometrical measurement of moving tools. The first case seems largely solved (in general situations), but according to the relevant literature the second is not completely worked out. The monitoring of tool wear and the determination of geometrical parameters (mainly for gear-generating tools) are not yet widespread, especially when optical parameters influence the evaluation procedure (e.g. examination of grinding-wheel profiles). We describe the development of a process for the practical application of measuring techniques based on image processing with CCD cameras, using wear criteria for different cutting tools (drilling tools, turning tools). We have produced a program for measuring profile and cutting-tool wear.

  14. A reflectance model for non-contact mapping of venous oxygen saturation using a CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Jun; Dunmire, Barbrina; Beach, Kirk W.; Leotta, Daniel F.

    2013-11-01

    A method of non-contact mapping of venous oxygen saturation (SvO2) is presented. A CCD camera is used to image skin tissue illuminated alternately by a red (660 nm) and an infrared (800 nm) LED light source. Low cuff pressures of 30-40 mmHg are applied to induce a venous blood volume change with negligible change in the arterial blood volume. A hybrid model combining the Beer-Lambert law and the light diffusion model is developed and used to convert the change in light intensity to the change in the skin tissue absorption coefficient. A simulation study incorporating the full light diffusion model is used to verify the hybrid model and to correct a calculation bias. SvO2 in the fingers, palm, and forearm is presented for five volunteers and compared with results in the published literature. Two-dimensional maps of venous oxygen saturation are given for the three anatomical regions.
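
Converting absorption changes at the two wavelengths into SvO2 amounts to solving a 2 x 2 Beer-Lambert system for the oxy- and deoxyhemoglobin concentration changes. A round-trip sketch with illustrative extinction coefficients (order-of-magnitude literature-like values, not the paper's calibration; scattering and path-length factors are omitted):

```python
# Illustrative molar extinction coefficients (cm^-1 / M) at the two LED
# wavelengths; NOT the paper's values.  800 nm is near-isosbestic.
EPS = {
    660: {"HbO2": 319.0, "Hb": 3227.0},
    800: {"HbO2": 778.0, "Hb": 778.0},
}

def svo2_from_absorption(d_mu_660, d_mu_800):
    """Solve the two-wavelength Beer-Lambert system for SvO2.

    d_mu_*: cuff-induced change in the absorption coefficient (cm^-1).
    Unknowns are the venous HbO2 and Hb concentration changes; SvO2 is
    their oxygenated fraction.
    """
    a, b = EPS[660]["HbO2"], EPS[660]["Hb"]
    c, d = EPS[800]["HbO2"], EPS[800]["Hb"]
    det = a * d - b * c
    c_hbo2 = (d_mu_660 * d - b * d_mu_800) / det
    c_hb = (a * d_mu_800 - c * d_mu_660) / det
    return c_hbo2 / (c_hbo2 + c_hb)

# Round trip: synthesize absorption changes for a 70% saturated sample
c_o2, c_hb = 7e-5, 3e-5
mu660 = EPS[660]["HbO2"] * c_o2 + EPS[660]["Hb"] * c_hb
mu800 = EPS[800]["HbO2"] * c_o2 + EPS[800]["Hb"] * c_hb
sat = svo2_from_absorption(mu660, mu800)
```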

  15. High resolution three-dimensional photoacoustic tomography with CCD-camera based ultrasound detection

    PubMed Central

    Nuster, Robert; Slezak, Paul; Paltauf, Guenther

    2014-01-01

    A photoacoustic tomograph based on optical ultrasound detection is demonstrated, which is capable of high resolution real-time projection imaging and fast three-dimensional (3D) imaging. Snapshots of the pressure field outside the imaged object are taken at defined delay times after photoacoustic excitation by use of a charge coupled device (CCD) camera in combination with an optical phase contrast method. From the obtained wave patterns photoacoustic projection images are reconstructed using a back propagation Fourier domain reconstruction algorithm. Applying the inverse Radon transform to a set of projections recorded over a half rotation of the sample provides 3D photoacoustic tomography images in less than one minute with a resolution below 100 µm. The sensitivity of the device was experimentally determined to be 5.1 kPa over a projection length of 1 mm. In vivo images of the vasculature of a mouse demonstrate the potential of the developed method for biomedical applications. PMID:25136491

  16. Method for searching the mapping relationship between space points and their image points in CCD camera

    NASA Astrophysics Data System (ADS)

    Sun, Yuchen; Ge, Baozhen; Lu, Qieni; Zou, Jin; Zhang, Yimo

    2005-01-01

    BP Neural Network Method and Linear Partition Method are proposed to search for the mapping relationship between space points and their image points in CCD cameras, which can be adopted to calibrate three-dimensional digitization systems based on optical methods. Both methods need only the coordinates of the calibration points and of their corresponding image points as parameters. The principle of the calibration techniques, including the formulas and the solution procedure, is deduced in detail. Calibration experiments indicate that applying the Linear Partition Method to coplanar points yields a mean relative measurement error of 0.44 percent, and applying the BP Neural Network Method to non-coplanar points yields a testing accuracy of 0.5-0.6 pixels.
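    As a rough illustration of the calibration idea (not the paper's exact formulation), a single region of a linear point-to-point mapping can be fitted by least squares from calibration points and their image points:

    ```python
    import numpy as np

    def fit_linear_map(world_pts, image_pts):
        """Least-squares affine map from 3-D calibration points to their 2-D
        image points; a minimal stand-in for one partition of a linear
        calibration (assumed formulation, not the authors' method)."""
        X = np.hstack([world_pts, np.ones((len(world_pts), 1))])  # (N, 4)
        A, *_ = np.linalg.lstsq(X, image_pts, rcond=None)          # (4, 2)
        return A

    def apply_map(A, pts):
        """Project 3-D points through the fitted affine map."""
        X = np.hstack([pts, np.ones((len(pts), 1))])
        return X @ A
    ```

    A partition-based scheme would fit one such map per sub-region of the measurement volume; the neural network variant replaces the affine map with a learned nonlinear one.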

  17. ULTRACAM: an ultra-fast, triple-beam CCD camera for high-speed astrophysics

    E-print Network

    V. S. Dhillon; T. R. Marsh; M. J. Stevenson; D. C. Atkinson; P. Kerry; P. T. Peacocke; A. J. A. Vick; S. M. Beard; D. J. Ives; D. W. Lunney; S. A. McLay; C. J. Tierney; J. Kelly; S. P. Littlefair; R. Nicholson; R. Pashley; E. T. Harlaftis; K. O'Brien

    2007-04-19

    ULTRACAM is a portable, high-speed imaging photometer designed to study faint astronomical objects at high temporal resolutions. ULTRACAM employs two dichroic beamsplitters and three frame-transfer CCD cameras to provide three-colour optical imaging at frame rates of up to 500 Hz. The instrument has been mounted on both the 4.2-m William Herschel Telescope on La Palma and the 8.2-m Very Large Telescope in Chile, and has been used to study white dwarfs, brown dwarfs, pulsars, black-hole/neutron-star X-ray binaries, gamma-ray bursts, cataclysmic variables, eclipsing binary stars, extrasolar planets, flare stars, ultra-compact binaries, active galactic nuclei, asteroseismology and occultations by Solar System objects (Titan, Pluto and Kuiper Belt objects). In this paper we describe the scientific motivation behind ULTRACAM, present an outline of its design and report on its measured performance.

  18. 0.25mm-thick CCD packaging for the Dark Energy Survey Camera array

    SciTech Connect

    Derylo, Greg; Diehl, H.Thomas; Estrada, Juan; /Fermilab

    2006-06-01

    The Dark Energy Survey Camera focal plane array will consist of 62 2k x 4k CCDs with a pixel size of 15 microns and a silicon thickness of 250 microns for use at wavelengths between 400 and 1000 nm. Bare CCD die will be received from the Lawrence Berkeley National Laboratory (LBNL). At the Fermi National Accelerator Laboratory, the bare die will be packaged into a custom back-side-illuminated module design. Cold probe data from LBNL will be used to select the CCDs to be packaged. The module design utilizes an aluminum nitride readout board and spacer and an Invar foot. A module flatness of 3 microns over small (1 cm²) areas and less than 10 microns over neighboring areas on a CCD is required for uniform images over the focal plane. A confocal chromatic inspection system is being developed to precisely measure flatness over a grid up to 300 x 300 mm. This system will be utilized to inspect not only room-temperature modules, but also cold individual modules and partial arrays through flat dewar windows.

  19. Applications of visible CCD cameras on the Alcator C-Mod C. J. Boswell, J. L. Terry, B. Lipschultz, J. Stillerman

    E-print Network

    Boswell, Christopher

    Remote-head visible charge-coupled device (CCD) cameras are being used on Alcator C-Mod, where heat fluxes to the divertor are as high as 600 MW m-2. The use of CCD cameras has several advantages over ... views aligned with the camera views and whose outputs are coupled to a visible spectrometer are also used.

  20. Video Chat with Multiple Cameras John MacCormick, Dickinson College

    E-print Network

    MacCormick, John

    The dominant paradigm for video chat employs a single camera at each end of the conversation, but some conversations can benefit from multiple cameras. This paper presents an investigation of multi-camera video chat, concentrating especially on the ability of users to switch between

  1. Double Star Measurements at the Southern Sky with a 50 cm Reflector and a Fast CCD Camera in 2014

    NASA Astrophysics Data System (ADS)

    Anton, Rainer

    2015-04-01

    A Ritchey-Chrétien reflector with 50 cm aperture was used in Namibia for recordings of double stars with a fast CCD camera and a notebook computer. From superposition of "lucky images", measurements of 91 pairings in 79 double and multiple systems were obtained and compared with literature data. Occasional deviations are discussed. Some images of noteworthy systems are also presented.

  2. Double Star Measurements at the Southern Sky with 50 cm Reflectors and Fast CCD Cameras in 2012

    NASA Astrophysics Data System (ADS)

    Anton, Rainer

    2014-07-01

    A Cassegrain and a Ritchey-Chrétien reflector, both with 50 cm aperture, were used in Namibia for recordings of double stars with fast CCD cameras and a notebook computer. From superposition of "lucky images", measurements of 39 double and multiple systems were obtained and compared with literature data. Occasional deviations are discussed. Images of some remarkable systems are also presented.

  3. Masking a CCD camera allows multichord charge exchange spectroscopy measurements at high speed on the DIII-D tokamak

    SciTech Connect

    Meyer, O.; Burrell, K. H.; Chavez, J. A.; Kaplan, D. H.; Chrystal, C.; Pablant, N. A.; Solomon, W. M.

    2011-02-15

    Charge exchange spectroscopy is one of the standard plasma diagnostic techniques used in tokamak research to determine ion temperature, rotation speed, particle density, and radial electric field. Configuring a charge coupled device (CCD) camera to serve as a detector in such a system requires a trade-off between the competing desires to detect light from as many independent spatial views as possible while still obtaining the best possible time resolution. High time resolution is essential, for example, for studying transient phenomena such as edge localized modes. By installing a mask in front of a camera with a 1024 x 1024 pixel CCD chip, we are able to acquire spectra from eight separate views while still achieving a minimum time resolution of 0.2 ms. The mask separates the light from the eight spectra, preventing spatial and temporal cross talk. A key part of the design was devising a compact translation stage which attaches to the front of the camera and allows adjustment of the position of the mask openings relative to the CCD surface. The stage is thin enough to fit into the restricted space between the CCD camera and the spectrometer endplate.

  4. Unmanned Vehicle Guidance Using Video Camera/Vehicle Model

    NASA Technical Reports Server (NTRS)

    Sutherland, T.

    1999-01-01

    A video guidance sensor (VGS) system has flown on both STS-87 and STS-95 to validate a single camera/target concept for vehicle navigation. The main part of the image algorithm was the subtraction of two consecutive images using software. For a nominal size image of 256 x 256 pixels this subtraction can take a large portion of the time between successive frames in standard rate video leaving very little time for other computations. The purpose of this project was to integrate the software subtraction into hardware to speed up the subtraction process and allow for more complex algorithms to be performed, both in hardware and software.
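    The software frame-subtraction step described above can be sketched as follows; the threshold is a hypothetical parameter, and the flight algorithm itself is not reproduced here:

    ```python
    import numpy as np

    def frame_difference(prev, curr, threshold=10):
        """Subtract two consecutive 8-bit video frames and return a binary
        change mask (sketch of the VGS-style software subtraction)."""
        # Widen to signed ints so the subtraction cannot wrap around.
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        return (diff > threshold).astype(np.uint8)
    ```

    For a 256 x 256 frame this is a few hundred thousand operations per frame pair, which motivates the hardware offload the project describes: a pipelined subtractor can produce the mask as pixels stream in, freeing the processor for the rest of the algorithm.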

  5. Status of the CCD camera for the eROSITA space telescope

    NASA Astrophysics Data System (ADS)

    Meidinger, Norbert; Andritschke, Robert; Elbs, Johannes; Granato, Stefanie; Hälker, Olaf; Hartner, Gisela; Herrmann, Sven; Miessner, Danilo; Pietschner, Daniel; Predehl, Peter; Reiffers, Jonas; Rommerskirchen, Tanja; Schmaler, Gabriele; Strüder, Lothar; Tiedemann, Lars

    2011-09-01

    The approved German X-ray telescope eROSITA (extended ROentgen Survey with an Imaging Telescope Array) is the core instrument on the Russian Spektrum-Roentgen-Gamma (SRG) mission. After launch of the satellite to the Lagrangian point L2 in the near future, eROSITA will perform a survey of the entire X-ray sky. In the soft band (0.5 keV - 2 keV), it will be about 30 times more sensitive than ROSAT, while in the hard band (2 keV - 8 keV) it will provide the first complete imaging survey of the sky. The design-driving science is the detection of 100,000 clusters of galaxies up to redshift z ~ 1.3 in order to study the large scale structure in the Universe and test cosmological models including Dark Energy. Detection of single X-ray photons with information about their energy, arrival angle and time is accomplished by an array of seven identical and independent PNCCD cameras. Each camera is assigned to a dedicated mirror system of Wolter-I type. The key component of the camera is a 5 cm × 3 cm large, back-illuminated, 450 µm thick and fully depleted frame store PNCCD chip. It is a further development of the sensor type which has been in operation aboard the XMM-Newton satellite since 1999. Development and production of the CCDs for the eROSITA project were performed in the semiconductor laboratory of the Max-Planck-Institutes for Physics and Extraterrestrial Physics, the MPI Halbleiterlabor. By means of a unique so-called 'cold-chuck probe station', we have characterized the performance of each PNCCD sensor on chip level. Various tests were carried out for a detailed characterization of the CCD and its custom-made analog readout ASIC. This includes in particular the evaluation of the optimum detector operating conditions in terms of operating sequence, supply voltages and operating temperature in order to achieve optimum performance. In the course of the eROSITA camera development, an engineering model of the eROSITA flight detector was assembled and has been used for tests since 2010. Based on these results and on the extensive tests with lab-model detectors, the design of the front-end electronics has meanwhile been finalized for the flight cameras. Furthermore, the specifications for the other supply and control electronics were fixed on the basis of the experimental tests.

  6. In-flight Video Captured by External Tank Camera System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In this July 26, 2005 video, Earth slowly fades into the background as the STS-114 Space Shuttle Discovery climbs into space until the External Tank (ET) separates from the orbiter. An ET Camera System featuring a Sony XC-999 model camera provided never-before-seen footage of the launch and tank separation. The camera was installed in the ET LO2 Feedline Fairing. From this position, the camera had a 40% field of view with a 3.5 mm lens. The field of view showed some of the Bipod area, a portion of the LH2 tank and Intertank flange area, and some of the bottom of the shuttle orbiter. Contained in an electronic box, the battery pack and transmitter were mounted on top of the Solid Rocket Booster (SRB) crossbeam inside the ET. The battery pack included 20 Nickel-Metal Hydride batteries (similar to cordless phone battery packs) totaling 28 volts DC and could supply about 70 minutes of video. Located 95 degrees apart on the exterior of the Intertank, opposite the orbiter side, were two blade S-Band antennas about 2 1/2 inches long that transmitted a 10 watt signal to the ground stations. The camera turned on approximately 10 minutes prior to launch and operated for 15 minutes following liftoff. The complete camera system weighs about 32 pounds. Marshall Space Flight Center (MSFC), Johnson Space Center (JSC), Goddard Space Flight Center (GSFC), and Kennedy Space Center (KSC) participated in the design, development, and testing of the ET camera system.

  7. Pixel-to-pixel correspondence alignment method of a 2CCD camera by using absolute phase map

    NASA Astrophysics Data System (ADS)

    Huang, Shujun; Liu, Yue; Bai, Xuefei; Wang, Zhangying; Zhang, Zonghua

    2015-06-01

    An alignment method of a 2CCD camera to build pixel-to-pixel correspondence between the infrared (IR) CCD sensor and the visible CCD sensor by using the absolute phase data is presented. Vertical and horizontal sinusoidal fringe patterns are generated by software and displayed on a liquid crystal display screen. The displayed fringe patterns are captured simultaneously by the IR sensor and the visible sensor of the 2CCD camera. The absolute phase values of each pixel at IR and visible channels are calculated from the captured fringe pattern images by using Fourier transform and the optimum three-fringe number selection method. The accurate pixel corresponding relationship between the two sensors can be determined along the vertical and the horizontal directions by comparing the obtained absolute phase data in IR and visible channels. Experimental results show the high accuracy, effectiveness, and validity of the proposed 2CCD alignment method. By using the continuous absolute phase information, this method can determine the pixel-to-pixel correspondence with high resolution.
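    The correspondence search can be sketched as a brute-force nearest match over the (vertical, horizontal) absolute phase pairs of the two sensors. This is an illustrative reading of the method, not the authors' implementation:

    ```python
    import numpy as np

    def match_by_phase(phase_ir_v, phase_ir_h, phase_vis_v, phase_vis_h):
        """For each IR pixel, find the visible-sensor pixel whose (vertical,
        horizontal) absolute phase pair is closest. Returns an (H, W, 2)
        array of (row, col) indices into the visible sensor."""
        H, W = phase_ir_v.shape
        vis = np.stack([phase_vis_v.ravel(), phase_vis_h.ravel()], axis=1)
        out = np.empty((H, W, 2), dtype=np.int64)
        for i in range(H):
            for j in range(W):
                d = ((vis[:, 0] - phase_ir_v[i, j]) ** 2
                     + (vis[:, 1] - phase_ir_h[i, j]) ** 2)
                k = int(np.argmin(d))
                out[i, j] = divmod(k, phase_vis_v.shape[1])
        return out
    ```

    Because the absolute phase varies continuously across the fringe patterns, interpolating between the two nearest phase samples (rather than taking the argmin, as here) is what yields the sub-pixel correspondence the abstract refers to.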

  8. Robust camera calibration for sport videos using court models

    NASA Astrophysics Data System (ADS)

    Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang

    2003-12-01

    We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.
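    The Hough-transform step that extracts line candidates from the classified court-line pixels can be sketched as a generic (rho, theta) accumulator; this is a textbook version, not the paper's tuned implementation:

    ```python
    import numpy as np

    def hough_lines(mask, n_theta=180):
        """Vote each court-line pixel of a binary mask into a (rho, theta)
        accumulator; peaks in the accumulator correspond to line candidates.
        Returns the accumulator, the theta axis, and the rho offset."""
        h, w = mask.shape
        diag = int(np.ceil(np.hypot(h, w)))            # max |rho|
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        ys, xs = np.nonzero(mask)
        acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)
        for t, th in enumerate(thetas):
            # rho = x cos(theta) + y sin(theta), shifted to a non-negative index
            rho = np.round(xs * np.cos(th) + ys * np.sin(th)).astype(int) + diag
            np.add.at(acc, (rho, t), 1)                # unbuffered accumulation
        return acc, thetas, diag
    ```

    The subsequent combinatorial search then only has to match the few strongest accumulator peaks against the user-supplied court model, which keeps it tractable.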

  9. OP09O-OP404-9 Wide Field Camera 3 CCD Quantum Efficiency Hysteresis

    NASA Technical Reports Server (NTRS)

    Collins, Nick

    2009-01-01

    The HST/Wide Field Camera (WFC) 3 UV/visible channel CCD detectors have exhibited an unanticipated quantum efficiency hysteresis (QEH) behavior. At the nominal operating temperature of -83C, the QEH feature contrast was typically 0.1-0.2% or less. The behavior was replicated using flight spare detectors. A visible light flat-field (540nm) with a several times full-well signal level can pin the detectors at both optical (600nm) and near-UV (230nm) wavelengths, suppressing the QEH behavior. We are characterizing the timescale for the detectors to become unpinned and developing a protocol for flashing the WFC3 CCDs with the instrument's internal calibration system in flight. The HST/Wide Field Camera 3 UV/visible channel CCD detectors have exhibited an unanticipated quantum efficiency hysteresis (QEH) behavior. The first observed manifestation of QEH was the presence in a small percentage of flat-field images of a bowtie-shaped contrast that spanned the width of each chip. At the nominal operating temperature of -83C, the contrast observed for this feature was typically 0.1-0.2% or less, though at warmer temperatures contrasts up to 5% (at -50C) have been observed. The bowtie morphology was replicated using flight spare detectors in tests at the GSFC Detector Characterization Laboratory by power cycling the detector while cold. Continued investigation revealed that a clearly-related global QE suppression at the approximately 5% level can be produced by cooling the detector in the dark; subsequent flat-field exposures at a constant illumination show asymptotically increasing response. This QE "pinning" can be achieved with a single high signal flat-field or a series of lower signal flats; a visible light (500-580nm) flat-field with a signal level of several hundred thousand electrons per pixel is sufficient for QE pinning at both optical (600nm) and near-UV (230nm) wavelengths. 
We are characterizing the timescale for the detectors to become unpinned and developing a protocol for flashing the WFC3 CCDs with the instrument's internal calibration system in flight. A preliminary estimate of the decay timescale for one detector is that a drop of 0.1-0.2% occurs over a ten day period, indicating that relatively infrequent cal lamp exposures can mitigate the behavior to extremely low levels.

  10. Measuring the Flatness of Focal Plane for Very Large Mosaic CCD Camera

    SciTech Connect

    Hao, Jiangang; Estrada, Juan; Cease, Herman; Diehl, H.Thomas; Flaugher, Brenna L.; Kubik, Donna; Kuk, Keivin; Kuropatkine, Nickolai; Lin, Huan; Montes, Jorge; Scarpine, Vic

    2010-06-08

    A large mosaic multi-CCD camera is the key instrument for a modern digital sky survey. DECam is an extremely red-sensitive 520 megapixel camera designed for the upcoming Dark Energy Survey (DES). It consists of sixty-two 4k x 2k and twelve 2k x 2k 250-micron-thick fully depleted CCDs, with a focal plane 44 cm in diameter and a field of view of 2.2 square degrees. It will be attached to the Blanco 4-meter telescope at CTIO. The DES will cover 5000 square degrees of the southern galactic cap in 5 color bands (g, r, i, z, Y) over 5 years starting in 2011. To achieve the science goal of constraining the Dark Energy evolution, stringent requirements are laid down for the design of DECam. Among them, the flatness of the focal plane needs to be controlled within a 60-micron envelope in order to achieve the specified PSF variation limit. It is very challenging to measure the flatness of the focal plane to such precision when it is placed in a high-vacuum dewar at 173 K. We developed two image-based techniques to measure the flatness of the focal plane. By imaging a regular grid of dots on the focal plane, the CCD offset along the optical axis is converted to a variation of the grid spacings at different positions on the focal plane. After extracting the patterns and comparing the change in spacings, we can measure the flatness to high precision. In method 1, the regular dots are held to sub-micron precision and cover the whole focal plane. In method 2, no high precision for the grid is required. Instead, we use a precise XY stage to move the pattern across the whole focal plane and compare the variations in the spacing when it is imaged by different CCDs. Simulations and real measurements show that the two methods work very well for our purpose and are in good agreement with direct optical measurements.
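    Under a simple pinhole-magnification assumption, the spacing-to-height conversion behind both methods looks like the function below. The working distance is a hypothetical parameter, not a DECam value, and the real optics would require the system's actual magnification model:

    ```python
    def height_offset_from_spacing(s_measured, s_reference, working_distance_mm):
        """Convert the fractional change in imaged grid spacing to a CCD
        height offset along the optical axis. A CCD sitting slightly high
        or low sees the projected grid slightly magnified or demagnified,
        so delta_z ~ L * (s - s0) / s0 in this simplified model."""
        return working_distance_mm * (s_measured - s_reference) / s_reference
    ```

    This is why the grid method is sensitive: a micron-level height error shows up as a fractional spacing change of order delta_z / L, which is measurable by averaging over many grid periods even though each single spacing is noisy.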

  11. The 12K×8K CCD mosaic camera for the Palomar Transient Factory

    NASA Astrophysics Data System (ADS)

    Rahmer, Gustavo; Smith, Roger; Velur, Viswa; Hale, David; Law, Nicholas; Bui, Khanh; Petrie, Hal; Dekany, Richard

    2008-07-01

    The Palomar Transient Factory is an automated wide-field survey facility dedicated to identifying a wide range of transient phenomena. Typically, a new 7.5 square degree field will be acquired every 90 seconds with 66% observing efficiency, in g' band when the sky is dark, or in R band when the moon is up. An imaging camera with a 12Kx8K mosaic of MIT/LL CCDs, acquired from CFHT, is being repackaged to fit in the prime focus mounting hub of the Palomar 48-inch Oschin Schmidt Telescope. We discuss how we have addressed the broad range of issues presented by this application: faster CCD readout to improve observing efficiency, a new cooling system to fit within the constrained space, a low impact shutter to maintain reliability at the fast observing cadence, a new filter exchange mechanism, and the field flattener needed to correct for focal plane curvature. The most critical issue was the tight focal plane alignment and co-planarity requirements created by the fast beam and coarse plate scale. We built an optical profilometer system to measure CCD heights and tilts with 1 µm RMS accuracy.

  12. Measurement of time varying temperature fields using visible imaging CCD cameras

    SciTech Connect

    Keanini, R.G.; Allgood, C.L.

    1996-12-31

    A method for measuring time-varying surface temperature distributions using high frame rate visible imaging CCD cameras is described. The technique is based on an ad hoc model relating measured radiance to local surface temperature. This approach is based on the fairly non-restrictive assumptions that atmospheric scattering and absorption, and secondary emission and reflection are negligible. In order to assess performance, both concurrent and non-concurrent calibration and measurement, performed under dynamic thermal conditions, are examined. It is found that measurement accuracy is comparable to the theoretical accuracy predicted for infrared-based systems. In addition, performance tests indicate that in the experimental system, real-time calibration can be achieved while real-time whole-field temperature measurements require relatively coarse spatial resolution. The principal advantages of the proposed method are its simplicity and low cost. In addition, since independent temperature measurements are used for calibration, emissivity remains unspecified, so that a potentially significant source of error is eliminated.
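    A calibration of this kind, relating measured radiance to independently measured surface temperature, could be sketched as a polynomial fit; the model form and degree are assumptions for illustration, not the authors' ad hoc model:

    ```python
    import numpy as np

    def calibrate_radiance_to_temp(intensity, temperature, deg=2):
        """Fit a polynomial mapping measured CCD intensity to surface
        temperature from paired calibration measurements. Because the
        temperatures are measured independently, emissivity never needs
        to be specified, as the abstract notes."""
        return np.polyfit(intensity, temperature, deg)

    def temperature_map(coeffs, image):
        """Apply the fitted calibration pixel-wise to a radiance image."""
        return np.polyval(coeffs, image)
    ```

    Re-fitting the coefficients on the fly against concurrent reference measurements corresponds to the real-time calibration mode the abstract evaluates.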

  13. Study of pixel damages in CCD cameras irradiated at the neutron tomography facility of IPEN-CNEN/SP

    NASA Astrophysics Data System (ADS)

    Pugliesi, R.; Andrade, M. L. G.; Dias, M. S.; Siqueira, P. T. D.; Pereira, M. A. S.

    2015-12-01

    A methodology to investigate damage to CCD sensors caused by the radiation beams of neutron tomography facilities is proposed. The methodology was developed at the facility installed at the nuclear research reactor of IPEN-CNEN/SP, and the damage was evaluated by counting white spots in images. The damage production rate at the main camera position was evaluated to be in the range of 0.008 to 0.040 damages per second. For this range, only 4 to 20 CCD pixels are damaged per tomography, assuring high quality images for hundreds of tomograms. Since the present methodology is capable of quantifying the damage production rate for each type of radiation, it can also be used in other facilities to improve the radiation shielding close to the CCD sensors.
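    A white-spot count of the kind used here can be sketched with a simple robust outlier threshold on a dark frame; the specific threshold rule is an assumption, not the facility's criterion:

    ```python
    import numpy as np

    def count_white_spots(dark_frame, sigma=5.0):
        """Count hot ('white') pixels in a dark frame as a proxy for
        radiation damage. Pixels more than `sigma` robust standard
        deviations above the median are flagged."""
        med = np.median(dark_frame)
        mad = np.median(np.abs(dark_frame - med)) or 1.0  # guard against MAD == 0
        # 1.4826 * MAD approximates one standard deviation for Gaussian noise.
        return int(np.sum(dark_frame > med + sigma * 1.4826 * mad))
    ```

    Counting on dark frames taken before and after a set of exposures separates permanent damage from transient events, which is the distinction a rate measurement like 0.008-0.040 damages per second requires.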

  14. A dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. I - Optical and detector systems

    NASA Technical Reports Server (NTRS)

    Meyer, S. S.; Ricker, G. R.

    1980-01-01

    The MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes), an instrument capable of simultaneously performing both direct imaging and spectrometry of faint objects, is examined. An optical layout is given of the instrument which uses two CCD's mounted on the same temperature regulated detector block. Two sources of noise on the signal are discussed: (1) the CCD readout noise, which results in a constant uncertainty in the number of electrons collected from each pixel; and (2) the photon counting noise. The sensitivity of the device is limited by the sky brightness, the overall quantum efficiency, the resolution, and the readout noise of the CCD. Therefore, total system efficiency is calculated at about 15%.

  15. Free-Viewpoint Video from Depth Cameras Alexander Bogomjakov Craig Gotsman Marcus Magnor

    E-print Network

    Lehmann, Daniel

    Depth cameras, which provide color and depth information per pixel at video rates ... Keywords: free-viewpoint, novel view, depth camera, graphics hardware. Free-viewpoint video is an emerging area of active research in computer graphics. The goal is to allow the viewer of a video dataset, whether

  16. A new paradigm for video cameras: optical sensors

    NASA Astrophysics Data System (ADS)

    Grottle, Kevin; Nathan, Anoo; Smith, Catherine

    2007-04-01

    This paper presents a new paradigm for the utilization of video surveillance cameras as optical sensors to augment and significantly improve the reliability and responsiveness of chemical monitoring systems. Incorporated into a hierarchical tiered sensing architecture, cameras serve as 'Tier 1' or 'trigger' sensors monitoring for visible indications after a release of warfare or industrial toxic chemical agents. No single sensor today detects the full range of these agents, but the result of exposure is harmful and yields visible 'duress' behaviors. Duress behaviors range from simple to complex types of observable signatures. By incorporating optical sensors in a tiered sensing architecture, the resulting alarm signals based on these behavioral signatures increase the range of detectable toxic chemical agent releases and allow timely confirmation of an agent release. Given the rapid onset of duress-type symptoms, an optical sensor can detect the presence of a release almost immediately. This provides cues for a monitoring system to send air samples to a higher-tiered chemical sensor, quickly launch protective mitigation steps, and notify an operator to inspect the area using the camera's video signal well before the chemical agent can disperse widely throughout a building.

  17. Refocusing images and videos with a conventional compact camera

    NASA Astrophysics Data System (ADS)

    Kang, Lai; Wu, Lingda; Wei, Yingmei; Song, Hanchen; Yang, Zheng

    2015-03-01

    Digital refocusing is an interesting and useful tool for generating dynamic depth-of-field (DOF) effects in many types of photography such as portraits and creative photography. Since most existing digital refocusing methods rely on a four-dimensional light field captured by specialized, precisely manufactured devices or on a sequence of images captured by a single camera, existing systems are either too expensive for wide practical use or incapable of handling dynamic scenes. We present a low-cost approach for refocusing high-resolution (up to 8 megapixel) images and videos based on a single shot using an easy-to-build camera-mirror stereo system. Our proposed method consists of four main steps, namely system calibration, image rectification, disparity estimation, and refocusing rendering. The effectiveness of our proposed method has been evaluated extensively using both static and dynamic scenes with various depth ranges. Promising experimental results demonstrate that our method is able to simulate various controllable realistic DOF effects. To the best of our knowledge, our method is the first that allows one to refocus high-resolution images and videos of dynamic scenes captured by a conventional compact camera.

  18. Miniature, vacuum compatible 1024 {times} 1024 CCD camera for x-ray, ultra-violet, or optical imaging

    SciTech Connect

    Conder, A.D.; Dunn, J.; Young, B.K.F.

    1994-05-01

    We have developed a very compact (60 × 60 × 75 mm³), vacuum compatible, large format (25 × 25 mm², 1024 × 1024 pixels) CCD camera for digital imaging of visible and ultraviolet radiation, soft to penetrating x-rays (≤20 keV), and charged particles. This camera provides a suitable replacement for film, with a linear response, a dynamic range and an intrinsic signal-to-noise response superior to those of current x-ray film, and real-time access to the data. The spatial resolution of the camera (<25 µm) is similar to typical digitization slit or step sizes used in processing film data. This new large format CCD camera has immediate applications as the recording device for streak cameras or gated microchannel-plate diagnostics, or when used directly as the detector for x-ray, xuv, or optical signals. This is especially important in studying high-energy plasmas produced in pulsed-power, ICF, and high-powered laser-plasma experiments, as well as in other medical and industrial applications.

  19. MOA-cam3: a wide-field mosaic CCD camera for a gravitational microlensing survey in New Zealand

    NASA Astrophysics Data System (ADS)

    Sako, T.; Sekiguchi, T.; Sasaki, M.; Okajima, K.; Abe, F.; Bond, I. A.; Hearnshaw, J. B.; Itow, Y.; Kamiya, K.; Kilmartin, P. M.; Masuda, K.; Matsubara, Y.; Muraki, Y.; Rattenbury, N. J.; Sullivan, D. J.; Sumi, T.; Tristram, P.; Yanagisawa, T.; Yock, P. C. M.

    2008-10-01

    We have developed a wide-field mosaic CCD camera, MOA-cam3, mounted at the prime focus of the Microlensing Observations in Astrophysics (MOA) 1.8-m telescope. The camera consists of ten E2V CCD44-82 chips, each having 2k×4k pixels, and covers a 2.2 deg2 field of view with a single exposure. The optical system is well optimized to realize uniform image quality over this wide field. The chips are constantly cooled by a cryocooler at -80 °C, at which temperature dark current noise is negligible for a typical 1-3 min exposure. The CCD output charge is converted to a 16-bit digital signal by the GenIII system (Astronomical Research Cameras Inc.) and read out within 25 s. Readout noise of 2-3 ADU (rms) is also negligible. We prepared a wide-band red filter for an effective microlensing survey and also Bessell V and I filters for standard astronomical studies. Microlensing studies have entered a new era, which requires more statistics and more rapid alerts to catch exotic light curves. Our new system is a powerful tool to meet both of these requirements.

  20. Developing a CCD camera with high spatial resolution for RIXS in the soft X-ray range

    NASA Astrophysics Data System (ADS)

    Soman, M. R.; Hall, D. J.; Tutt, J. H.; Murray, N. J.; Holland, A. D.; Schmitt, T.; Raabe, J.; Schmitt, B.

    2013-12-01

    The Super Advanced X-ray Emission Spectrometer (SAXES) at the Swiss Light Source contains a high resolution Charge-Coupled Device (CCD) camera used for Resonant Inelastic X-ray Scattering (RIXS). Using the current CCD-based camera system, the energy-dispersive spectrometer has an energy resolution (E/ΔE) of approximately 12,000 at 930 eV. A recent study predicted that, through an upgrade to the grating and camera system, the energy resolution could be improved by a factor of 2. To achieve this goal in the spectral domain, the spatial resolution of the CCD must be improved from the current 24 µm (FWHM) to better than 5 µm. The 400 eV to 1600 eV X-rays detected by this spectrometer interact primarily within the field-free region of the CCD, producing electron clouds which diffuse isotropically until they reach the depleted region and buried channel. This diffusion of the charge leads to events which are split across several pixels. By analysing the charge distribution across the pixels, various centroiding techniques can be used to pinpoint the spatial location of the X-ray interaction to the sub-pixel level, greatly improving the spatial resolution achieved. Using the PolLux soft X-ray microspectroscopy endstation at the Swiss Light Source, a beam of X-rays with energies from 200 eV to 1400 eV can be focused down to a spot size of approximately 20 nm. Scanning this spot across the 16 µm square pixels allows the sub-pixel response to be investigated. Previous work has demonstrated the potential improvement in spatial resolution achievable by centroiding events in a standard CCD. An Electron-Multiplying CCD (EM-CCD) has been used to improve the ratio of signal to effective readout noise, resulting in worst-case spatial resolution measurements of 4.5±0.2 µm and 3.9±0.1 µm at 530 eV and 680 eV respectively. A method is described that allows the contribution of the X-ray spot size to be deconvolved from these worst-case resolution measurements, estimating the spatial resolution to be approximately 3.5 µm and 3.0 µm at 530 eV and 680 eV, well below the 5 µm limit required to improve the spectral resolution by a factor of 2.
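
    The centroiding step described above can be illustrated with an intensity-weighted centre of mass over the pixels of a split event. This is a minimal sketch, not the SAXES pipeline; the 3×3 event window and the example charge values are assumptions, with only the 16 µm pixel pitch taken from the abstract.

```python
import numpy as np

def centroid_event(island, pixel_pitch_um=16.0):
    """Intensity-weighted centroid of a split X-ray event.

    island: 2-D array of pixel signals (electrons) around the event.
    pixel_pitch_um: pixel size (16 um square pixels, per the abstract).
    Returns (y, x) in micrometres relative to the island origin.
    """
    island = np.asarray(island, dtype=float)
    total = island.sum()
    ys, xs = np.indices(island.shape)
    y = (ys * island).sum() / total
    x = (xs * island).sum() / total
    # Convert from pixel index to micrometres (pixel centres at k + 0.5).
    return (y + 0.5) * pixel_pitch_um, (x + 0.5) * pixel_pitch_um

# A charge cloud split mostly between two horizontally adjacent pixels:
event = [[0, 0, 0],
         [0, 300, 100],
         [0, 0, 0]]
y_um, x_um = centroid_event(event)
```

    For an event split across two pixels as above, the centroid lands between the pixel centres, which is exactly what restores sub-pixel position information.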

  1. Non-mydriatic, wide field, fundus video camera

    NASA Astrophysics Data System (ADS)

    Hoeher, Bernhard; Voigtmann, Peter; Michelson, Georg; Schmauss, Bernhard

    2014-02-01

    We describe a method we call "stripe field imaging" that is capable of capturing wide field color fundus videos and images of the human eye at pupil sizes of 2 mm. This means that it can be used with a non-dilated pupil, even in bright ambient light. We built a mobile demonstrator to prove the method and successfully acquired color fundus videos of subjects. We designed the demonstrator as a low-cost device consisting of mass-market components, showing that no major additional technical outlay is needed to realize the improvements we propose. The technical core idea of our method is breaking the rotational symmetry of the optical design found in many conventional fundus cameras. By this measure we could extend the possible field of view (FOV) at a pupil size of 2 mm from a circular field 20° in diameter to a rectangular field 68° by 18° in size. We acquired a fundus video while the subject was slightly touching and releasing the lid. The resulting video showed changes at vessels in the region of the papilla and a change in the paleness of the papilla.

  2. Scientists Behind the Camera - Increasing Video Documentation in the Field

    NASA Astrophysics Data System (ADS)

    Thomson, S.; Wolfe, J.

    2013-12-01

    Over the last two years, Skypunch Creative has designed and implemented a number of pilot projects to increase the amount of video captured by scientists in the field. The major barrier to success that we tackled with the pilot projects was the conflict between the time, space, and storage constraints of scientists in the field and the demands of shooting high quality video. Our pilots involved providing scientists with equipment, varying levels of instruction on shooting in the field, and post-production resources (editing and motion graphics). In each project, the scientific team was provided with cameras (or additional equipment if they owned their own), tripods, and sometimes sound equipment, as well as an external hard drive to return the footage to us. Upon receiving the footage, we professionally filmed follow-up interviews and created animations and motion graphics to illustrate their points. We also helped with the distribution of the final product (http://climatescience.tv/2012/05/the-story-of-a-flying-hippo-the-hiaper-pole-to-pole-observation-project/ and http://climatescience.tv/2013/01/bogged-down-in-alaska/). The pilot projects were a success. Most of the scientists returned asking for additional gear and support for future field work. Moving out of the pilot phase, we have produced a 14-page guide for scientists shooting in the field based on lessons learned: it contains key tips and best-practice techniques for shooting high quality footage in the field. We have also expanded the project and are now testing the use of video cameras that can be synced with sensors so that the footage is useful both scientifically and artistically. Extract from A Scientist's Guide to Shooting Video in the Field

  3. Spatiotemporal Sampling and Interpolation for Dense Video Camera Arrays Bennett S. Wilburn

    E-print Network

    Stanford University

    Neel S. Joshi (Stanford University), Katherine Chou (Stanford University), Marc Levoy (Stanford University). … cameras [Wilburn et al.], while others have used dense video camera arrays for view interpolation [J…

  4. Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source

    SciTech Connect

    M. J. Haugh and M. B. Schneider

    2008-10-31

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor with a large format array (2k × 2k), 24 µm square pixels, and a thickness of 15 µm. A multi-anode Manson X-ray source, operating up to 10 kV and 10 W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE of about 10. The X-ray beam intensity was measured using an absolute photodiode with an accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat to within ±1%, maximum to minimum. The spectral efficiency was measured in 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy-dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.
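
    A flat field taken with such a source is typically turned into a per-pixel gain map by normalising to the mean; dividing science frames by the map removes the pixel sensitivity variation at that energy. A minimal sketch of that normalisation (the 2×2 frame values are illustrative, not measured data):

```python
import numpy as np

def flat_field_map(flat_frame, dark_frame):
    """Per-pixel gain map from a uniformly illuminated frame.

    Subtract the dark frame, then normalise so the map has unit mean.
    Dividing later images by this map removes pixel sensitivity
    variation at the energy the flat was taken.
    """
    signal = np.asarray(flat_frame, dtype=float) - np.asarray(dark_frame, dtype=float)
    return signal / signal.mean()

flat = np.array([[100.0, 102.0], [98.0, 100.0]])  # illustrative flat frame
dark = np.zeros((2, 2))                           # illustrative dark frame
gain = flat_field_map(flat, dark)
```

    Note that because the map is normalised to the frame mean, an energy-dependent pattern (as reported above) requires a separate flat per energy band.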

  5. Laboratory characterization of a CCD camera system for retrieval of bi-directional reflectance distribution function

    NASA Astrophysics Data System (ADS)

    Nandy, Prabal; Thome, Kurtis J.; Biggar, Stuart F.

    1999-12-01

    The Remote Sensing Group of the Optical Sciences Center at the University of Arizona has developed a four-band, multispectral, wide-angle imaging radiometer for retrieval of the bi-directional reflectance distribution function (BRDF) for vicarious calibration applications. The system consists of a fisheye lens with four interference filters centered at 470 nm, 575 nm, 660 nm, and 835 nm for spectral selection, and an astronomical grade 1024 × 1024-pixel silicon CCD array. Data taken by the system fill the array at nominally 0.2 degrees per pixel. This imaging radiometer system has been used in support of the calibration of Landsat-5 and SPOT satellite sensors. This paper presents the results of laboratory characterization of the system to determine linearity of the detector, point spread function (PSF), and polarization effects. The linearity study was done on the detector array without the lens, using a spherical integrating source with a 1.5-mm aperture. This aperture simulates a point source at distances larger than 60 cm. Data were collected as both a function of exposure time and distance from the source. The results of these measurements indicate that each detector of the array is linear to better than 0.5%. Assuming a quadratic response improves this fit to better than 0.1% over 88% of the upper end of the detector's dynamic range. The point spread function (PSF) of the lens system was measured using the sphere source and aperture with the full camera system operated at a distance of 700 mm from the source; thus the aperture subtends less than the field of view of one pixel. The PSF was measured for several field angles, and the signal level was found to fall to less than 1% of the peak signal within 1.5 degrees (10 pixels) for the on-axis case. The effect of this PSF on the retrieval of modeled BRDFs is shown to be less than 0.2% out to view angles of 70 degrees. The final test presented assesses the polarization effects of the lens system by illuminating the camera with the same spherical integrating source, using a 50-mm aperture and a linear polarizing filter. The degree of polarization of the system is shown to be negligible for on-axis imaging but to have up to a 20% effect at field angles of 70 degrees. The effect of the system polarization on the retrieval of modeled BRDFs is shown to be up to 3% for field angles of 70 degrees off nadir and a solar zenith angle of 70 degrees. Polarization response is therefore the greatest source of error in the system. A method to account for polarization effects in digital camera imagery is proposed.
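
    The linearity analysis above amounts to fitting detector counts against exposure and examining the worst residual relative to full scale. A sketch of that check, assuming a simple polynomial response model and synthetic data (the 3% curvature is illustrative):

```python
import numpy as np

def max_residual(signal, exposure, degree):
    """Largest fit residual as a fraction of the peak signal, for a
    polynomial detector-response model of the given degree
    (degree=1 linear, degree=2 quadratic, as in the study above)."""
    coeffs = np.polyfit(exposure, signal, degree)
    fit = np.polyval(coeffs, exposure)
    return float(np.abs(signal - fit).max() / signal.max())

t = np.linspace(0.01, 1.0, 20)        # exposure times (s), synthetic
counts = 1000.0 * t - 30.0 * t**2     # mildly nonlinear detector model
linear_err = max_residual(counts, t, degree=1)
quad_err = max_residual(counts, t, degree=2)
```

    With real data, the quadratic residual quantifies how much of the apparent nonlinearity a second-order response term explains, mirroring the 0.5% vs. 0.1% figures quoted above.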

  6. CCD Video Observation of Microgravity Crystallization of Lysozyme and Correlation with Accelerometer Data

    NASA Technical Reports Server (NTRS)

    Snell, E. H.; Boggon, T. J.; Helliwell, J. R.; Moskowitz, M. E.; Nadarajah, A.

    1997-01-01

    Lysozyme has been crystallized using the ESA Advanced Protein Crystallization Facility onboard the NASA Space Shuttle Orbiter during the IML-2 mission. CCD video monitoring was used to follow the crystallization process and evaluate the growth rate. During the mission some tetragonal crystals were observed moving over distances of up to 200 micrometers. This was correlated with microgravity disturbances caused by firings of vernier jets on the Orbiter. Growth-rate measurement of a stationary crystal (which had nucleated on the growth reactor wall) showed spurts and lulls correlated with an onboard activity; astronaut exercise. The stepped growth rates may be responsible for the residual mosaic block structure seen in crystal mosaicity and topography measurements.

  7. Video-Camera-Based Position-Measuring System

    NASA Technical Reports Server (NTRS)

    Lane, John; Immer, Christopher; Brink, Jeffrey; Youngquist, Robert

    2005-01-01

    A prototype optoelectronic system measures the three-dimensional relative coordinates of objects of interest, or of targets affixed to objects of interest, in a workspace. The system includes a charge-coupled-device video camera mounted in a known position and orientation in the workspace, a frame grabber, and a personal computer running image-data-processing software. Relative to conventional optical surveying equipment, this system can be built and operated at much lower cost; however, it is less accurate. It is also much easier to operate than conventional instrumentation systems. In addition, there is no need to establish a coordinate system through cooperative action by a team of surveyors. The system operates in real time at around 30 frames per second (limited mostly by the frame rate of the camera). It continuously tracks targets as long as they remain in the field of view of the camera. In this respect, it emulates more expensive, elaborate laser tracking equipment that costs on the order of 100 times as much. Unlike laser tracking equipment, this system does not pose a hazard of laser exposure. Images acquired by the camera are digitized and processed to extract all valid targets in the field of view. The three-dimensional coordinates (x, y, and z) of each target are computed from the pixel coordinates of the targets in the images, to an accuracy of the order of millimeters over distances of the order of meters. The system was originally intended specifically for real-time position measurement of payload transfers from payload canisters into the payload bay of the Space Shuttle Orbiters (see Figure 1). The system may be easily adapted to other applications that involve similar coordinate-measuring requirements. Examples of such applications include manufacturing, construction, preliminary approximate land surveying, and aerial surveying.
    For some applications with rectangular symmetry, it is feasible and desirable to attach a target composed of black and white squares to an object of interest (see Figure 2). For other situations, where circular symmetry is more desirable, circular targets can also be created. Such a target can readily be generated and modified by use of commercially available software and printed by use of a standard office printer. All three relative coordinates (x, y, and z) of each target can be determined by processing the video image of the target. Because of the unique design of corresponding image-processing filters and targets, the vision-based position-measurement system is extremely robust and tolerant of widely varying fields of view, lighting conditions, and background imagery.
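
    With a pinhole-camera model and a target of known physical size, the three relative coordinates follow from the target's pixel coordinates and apparent pixel size. This sketch shows the geometry only; the focal length in pixels and the 10 cm target width are assumed values, and a real system such as the one described would first correct lens distortion and calibrate the camera.

```python
def target_position(u_px, v_px, target_px_width,
                    target_width_m=0.10, focal_px=800.0):
    """Estimate (x, y, z) of a flat target from one camera image.

    u_px, v_px: target centre in pixels, relative to the image centre.
    target_px_width: apparent width of the target in pixels.
    Range follows from the known physical target width; lateral
    offsets follow from similar triangles.
    """
    z = focal_px * target_width_m / target_px_width  # range (m)
    x = u_px * z / focal_px                          # lateral offset (m)
    y = v_px * z / focal_px
    return x, y, z

# Target appearing 80 px wide, centred 160 px right and 80 px above centre:
x, y, z = target_position(u_px=160.0, v_px=-80.0, target_px_width=80.0)
```

    A single camera recovers depth only because the target size is known; the millimetre-over-metres accuracy quoted above then depends on how precisely the target edges are located in pixels.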

  8. Deep-Sea Video Cameras Without Pressure Housings

    NASA Technical Reports Server (NTRS)

    Cunningham, Thomas

    2004-01-01

    Underwater video cameras of a proposed type (and, optionally, their light sources) would not be housed in pressure vessels. Conventional underwater cameras and their light sources are housed in pods that keep the contents dry and maintain interior pressures of about 1 atmosphere (≈0.1 MPa). Pods strong enough to withstand the pressures at great ocean depths are bulky, heavy, and expensive. Elimination of the pods would make it possible to build camera/light-source units that would be significantly smaller, lighter, and less expensive. The depth ratings of the proposed camera/light-source units would be essentially unlimited because the strengths of their housings would no longer be an issue. A camera according to the proposal would contain an active-pixel image sensor and readout circuits, all in the form of a single silicon-based complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. As long as none of the circuitry and none of the electrical leads were exposed to seawater, which is electrically conductive, silicon integrated-circuit chips could withstand the hydrostatic pressure of even the deepest ocean. The pressure would change the semiconductor band gap by only a slight amount, not enough to degrade imaging performance significantly. Electrical contact with seawater would be prevented by potting the integrated-circuit chip in a transparent plastic case. The electrical leads for supplying power to the chip and extracting the video signal would also be potted, though not necessarily in the same transparent plastic. The hydrostatic pressure would tend to compress the plastic case and the chip equally on all sides; there would be no need for great strength because there would be no need to hold back high pressure on one side against low pressure on the other. A light source suitable for use with the camera could consist of light-emitting diodes (LEDs).
    Like integrated-circuit chips, LEDs can withstand very large hydrostatic pressures. If power-supply regulators or filter capacitors were needed, these could be attached in chip form directly onto the back of, and potted with, the imager chip. Because CMOS imagers dissipate little power, the potting would not result in overheating. To minimize the cost of the camera, a fixed lens could be fabricated as part of the plastic case. For improved optical performance at greater cost, an adjustable glass achromatic lens would be mounted in a reservoir filled with transparent oil and subject to the full hydrostatic pressure, and the reservoir would be mounted on the case to position the lens in front of the image sensor. The lens would be adjusted for focus by use of a motor inside the reservoir (oil-filled motors already exist).

  9. Online Submission ID: latebreaking 0040 Integrating Multiple Depth-Sensors into the Virtual Video Camera

    E-print Network

    Magnor, Marcus

    Integrating Multiple Depth-Sensors into the Virtual Video Camera. Figure 1: The integration of low-resolution depth sensors into the Virtual Video Camera … and image sensors (two left). The additional depth information can be used as a soft hint to guide the image …

  10. ATR/OTR-SY Tank Camera Purge System and in Tank Color Video Imaging System

    SciTech Connect

    Werry, S.M.

    1995-06-06

    This procedure documents the satisfactory operation of the 101-SY tank Camera Purge System (CPS) and the 101-SY in-tank Color Camera Video Imaging System (CCVIS). Included in the CPS is the nitrogen purging system safety interlock, which shuts down all the color video imaging system electronics within the 101-SY tank vapor space upon loss of nitrogen purge pressure.

  11. HDA dataset -DRAFT 1 A Multi-camera video data set for research on

    E-print Network

    Instituto de Sistemas e Robotica

    HDA dataset (DRAFT 1): A multi-camera video data set for research on high-definition surveillance. Abstract: We present a fully labelled image sequence data set for benchmarking video surveillance algorithms. The data set was acquired from 13 indoor cameras distributed over three floors of one building.

  12. A Note on Calibration of Video Cameras for Autonomous Vehicles with Optical Flow

    E-print Network

    Ziegler, Günter M.

    A Note on Calibration of Video Cameras for Autonomous Vehicles with Optical Flow. Ernesto Tapia, Raul Rojas. Technical Report B-13-02, February 2013. … rotation matrix. We also show how the flow lines on the ground, together with the vehicle's velocity, provide …

  13. A high resolution Small Field Of View (SFOV) gamma camera: a columnar scintillator coated CCD imager for medical applications

    NASA Astrophysics Data System (ADS)

    Lees, J. E.; Bassford, D. J.; Blake, O. E.; Blackshaw, P. E.; Perkins, A. C.

    2011-12-01

    We describe a high resolution, small field of view (SFOV), Charge Coupled Device (CCD) based camera for imaging small volumes of radionuclide uptake in tissues. The Mini Gamma Ray Camera (MGRC) is a collimated, scintillator-coated, low cost, high performance imager using low noise CCDs. The prototype MGRC has a 600 µm thick layer of columnar CsI(Tl) and operates in photon counting mode, using a thermoelectric cooler to achieve an operating temperature of −10 °C. Collimation was performed using a pinhole collimator. We have measured the spatial resolution, energy resolution, and efficiency using a number of radioisotope sources, including 140 keV gamma-rays from 99mTc in a specially designed phantom. We also describe our first imaging of a volunteer patient.

  14. Shooting a Smooth Video with a Shaky Camera Zoran Duric Azriel Rosenfeld

    E-print Network

    Duric, Zoran

    Shooting a Smooth Video with a Shaky Camera. Zoran Duric, Azriel Rosenfeld, Department of Computer … of these irregularities so that the sequence is smoothed, i.e., is approximately the same as the sequence that would have been obtained if the motion of the camera had been smooth. Keywords: camera motion analysis, image …

  15. Electro-optical testing of fully depleted CCD image sensors for the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Doherty, Peter E.; Antilogus, Pierre; Astier, Pierre; Chiang, James; Gilmore, D. Kirk; Guyonnet, Augustin; Huang, Dajun; Kelly, Heather; Kotov, Ivan; Kubanek, Petr; Nomerotski, Andrei; O'Connor, Paul; Rasmussen, Andrew; Riot, Vincent J.; Stubbs, Christopher W.; Takacs, Peter; Tyson, J. Anthony; Vetter, Kurt

    2014-07-01

    The LSST Camera science sensor array will incorporate 189 large format Charge Coupled Device (CCD) image sensors. Each CCD will include over 16 million pixels, divided into 16 equally sized segments, each read through a separate output amplifier. The science goals of the project require CCD sensors with state-of-the-art performance in many respects. The broad survey wavelength coverage requires fully depleted, 100 micrometer thick, high resistivity bulk silicon as the imager substrate. Image quality requirements place strict limits on the image degradation that may be caused by sensor effects: optical, electronic, and mechanical. In this paper we discuss the design of the prototype sensors, the hardware and software that have been used to perform electro-optic testing of the sensors, and a selection of the test results to date. We address the architectural features that lead to internal electrostatic fields; the effects on charge collection and transport that these cause, including charge diffusion and redistribution; the effects on the delivered PSF; and the potential impacts on delivered science data quality.

  16. On the Complexity of Digital Video Cameras in/as Research: Perspectives and Agencements

    ERIC Educational Resources Information Center

    Bangou, Francis

    2014-01-01

    The goal of this article is to consider the potential for digital video cameras to produce as part of a research agencement. Our reflection will be guided by the current literature on the use of video recordings in research, as well as by the rhizoanalysis of two vignettes. The first of these vignettes is associated with a short video clip shot by…

  17. Implementation of a parallel-beam optical-CT apparatus for three-dimensional radiation dosimetry using a high-resolution CCD camera

    NASA Astrophysics Data System (ADS)

    Huang, Wen-Tzeng; Chen, Chin-Hsing; Hung, Chao-Nan; Tuan, Chiu-Ching; Chang, Yuan-Jen

    2015-06-01

    In this study, a charge-coupled device (CCD) camera with 2-megapixel (1920×1080-pixel), 12-bit resolution was developed for optical computed tomography (optical CT). The signal-to-noise ratio (SNR) of our system was 30.12 dB, better than that of commercially available CCD cameras (25.31 dB). The 50% modulation transfer function (MTF50) of our 1920×1080-pixel camera gave a line width per picture height (LW/PH) of 745, which is 73% of the diffraction-limited resolution. Compared with a commercially available 1-megapixel CCD camera (1296×966-pixel) with LW/PH = 358 and 46.6% of the diffraction-limited resolution, our camera system provided higher spatial resolution and better image quality. The NIPAM gel dosimeter was used to evaluate the optical CT with the 2-megapixel CCD. A clinical five-field irradiation treatment plan was generated using the Eclipse planning system (Varian Corp., Palo Alto, CA, USA). The gel phantom was irradiated using a 6-MV Varian Clinac iX linear accelerator (Varian). The measured NIPAM gel dose distributions and the dose distributions calculated by the treatment planning software (TPS) were compared using the 3% dose-difference and 3 mm distance-to-agreement criteria. The gamma pass rate was as high as 98.2% when the 2-megapixel CCD camera was used in optical CT, but only 96.0% with a commercially available 1-megapixel CCD camera.
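
    The 3% / 3 mm criterion used above combines a dose-difference term and a distance-to-agreement term, and a point passes when the combined gamma metric is ≤ 1. A 1-D sketch of the metric (real gel-dosimetry analyses are 3-D, and the profile values here are illustrative):

```python
import numpy as np

def gamma_pass_rate(measured, computed, spacing_mm=1.0,
                    dose_tol=0.03, dist_tol_mm=3.0):
    """1-D gamma analysis sketch (3% dose-difference / 3 mm DTA).

    measured, computed: dose profiles on the same grid, as fractions
    of the prescription dose.  Returns the fraction of measured
    points with gamma <= 1.
    """
    measured = np.asarray(measured, dtype=float)
    computed = np.asarray(computed, dtype=float)
    n = len(measured)
    pos = np.arange(n) * spacing_mm
    passed = 0
    for i in range(n):
        dist = (pos - pos[i]) / dist_tol_mm          # normalised distance
        dose = (computed - measured[i]) / dose_tol   # normalised dose diff
        gamma = np.sqrt(dist**2 + dose**2).min()     # best agreement point
        passed += gamma <= 1.0
    return passed / n

rate_same = gamma_pass_rate([1.0, 0.9, 0.8], [1.0, 0.9, 0.8])  # agree
rate_off = gamma_pass_rate([1.0, 1.0, 1.0], [1.2, 1.2, 1.2])   # 20% off
```

    The search over neighbouring points is what lets a spatially shifted but otherwise correct dose profile still pass, which is the intent of the distance-to-agreement term.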

  18. [Research Award providing funds for a tracking video camera

    NASA Technical Reports Server (NTRS)

    Collett, Thomas

    2000-01-01

    The award provided funds for a tracking video camera. The camera has been installed and the system calibrated. It has enabled us to follow in real time the tracks of individual wood ants (Formica rufa) within a 3 m square arena as they navigate singly indoors, guided by visual cues. To date we have been using the system on two projects. The first is an analysis of the navigational strategies that ants use when guided by an extended landmark (a low wall) to a feeding site. After a brief training period, ants are able to keep a defined distance and angle from the wall, using their memory of the wall's height on the retina as a controlling parameter. By training with walls of one height and length and testing with walls of different heights and lengths, we can show that ants adjust their distance from the wall so as to keep the wall at the height that they learned during training. Thus, their distance from the base of a tall wall is greater than it is from the training wall, and the distance is shorter when the wall is low. The stopping point of the trajectory is defined precisely by the angle that the far end of the wall makes with the trajectory. Thus, ants walk further if the wall is extended in length and not so far if the wall is shortened. These experiments represent the first case in which the controlling parameters of an extended trajectory can be defined with some certainty. They raise many questions for future research that we are now pursuing.

  19. Variable high-resolution color CCD camera system with online capability for professional photo studio application

    NASA Astrophysics Data System (ADS)

    Breitfelder, Stefan; Reichel, Frank R.; Gaertner, Ernst; Hacker, Erich J.; Cappellaro, Markus; Rudolf, Peter; Voelk, Ute

    1998-04-01

    Digital cameras are of increasing significance for professional applications in photo studios where fashion, portrait, product, and catalog photographs or advertising photos of high quality have to be taken. The eyelike is a digital camera system which has been developed for such applications. It is capable of working online with high frame rates and images of full sensor size, and it provides a resolution that can be varied between 2048 by 2048 and 6144 by 6144 pixels at an RGB color depth of 12 bit per channel, with an exposure time that is also variable, from 1/60 s to 1 s. With an exposure time of 100 ms, digitization takes approximately 2 seconds for an image of 2048 by 2048 pixels (12 MByte), 8 seconds for an image of 4096 by 4096 pixels (48 MByte), and 40 seconds for an image of 6144 by 6144 pixels (108 MByte). The eyelike can be used in various configurations. Used as a camera body, it accepts most commercial lenses via existing lens adaptors. Alternatively, the eyelike can be used as a digital back on most commercial 4"×5" view cameras. This paper describes the eyelike camera concept with the essential system components. The article finishes with a description of the software, which is needed to bring the high quality of the camera to the user.
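
    The quoted file sizes are consistent with 3 bytes of stored data per pixel (e.g. 2048 × 2048 × 3 B = 12 MiB); how the 12-bit channels map to those bytes is not stated in the abstract. A quick check of the arithmetic:

```python
def image_size_mib(side_px, bytes_per_pixel=3):
    """Size in MiB of a square image at the given bytes per pixel;
    3 B/pixel reproduces the 12 / 48 / 108 MByte figures quoted."""
    return side_px * side_px * bytes_per_pixel / 2**20

sizes = [image_size_mib(s) for s in (2048, 4096, 6144)]
```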

  20. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and Image J software was developed. This method was used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens, from the third lumbar vertebra to the proximal 1/3 of the femur, were tested. The specimens, without any structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to the key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by MTS in a gradient from 0 N to 500 N, which simulated the double-feet standing stance. The anterior and lateral images of the specimen were obtained through two CCD cameras. Digital 8-bit images were processed using Image J, digital image processing software that can be freely downloaded from the National Institutes of Health. The procedure includes recognition of the digital marker, image inversion, sub-pixel reconstruction, image segmentation, and a center-of-mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and the micro-angular rotation of the sacroiliac joint in the lateral view were calculated from the marker movement. The results of the digital image measurement were as follows: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels, and the relative error was about 1.11‰. The average vertical displacement of S1 of the pelvis was 0.8356 ± 0.2830 mm under a vertical load of 500 N, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°.
    The load-displacement curves obtained from our optical measurement system matched the clinical results. Digital image measurement of specimen deformation based on CCD cameras and Image J software has good prospects for application in biomechanical research, with the advantages of a simple optical setup, non-contact measurement, high precision, and no special requirements on the test environment.
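
    The marker-tracking chain described above (invert so the black dot is bright, segment, then take the gray-value-weighted centre of mass) can be sketched as follows; the 5×5 synthetic frames and the threshold value are illustrative assumptions.

```python
import numpy as np

def marker_centroid(gray, threshold=128.0):
    """Sub-pixel marker centre from an 8-bit grayscale patch.

    Invert (black dot -> bright), zero out the background by a crude
    threshold segmentation, then return the intensity-weighted
    centre of mass as (row, col) in pixels.
    """
    g = 255.0 - np.asarray(gray, dtype=float)  # image invert
    g[g <= threshold] = 0.0                    # segmentation
    ys, xs = np.indices(g.shape)
    total = g.sum()
    return float((ys * g).sum() / total), float((xs * g).sum() / total)

before = np.full((5, 5), 255.0)
before[2, 2] = 0.0                       # black marker dot
after = np.full((5, 5), 255.0)
after[2, 2] = after[2, 3] = 0.0          # dot shifted half a pixel right
dy = marker_centroid(after)[0] - marker_centroid(before)[0]
dx = marker_centroid(after)[1] - marker_centroid(before)[1]
```

    The centroid shift is in pixels; at the stated 0.68 mm pixel size, a 0.5-pixel shift corresponds to 0.34 mm of marker displacement.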

  1. Range-Gated LADAR Coherent Imaging Using Parametric Up-Conversion of IR and NIR Light for Imaging with a Visible-Range Fast-Shuttered Intensified Digital CCD Camera

    SciTech Connect

    YATES,GEORGE J.; MCDONALD,THOMAS E. JR.; BLISS,DAVID E.; CAMERON,STEWART M.; ZUTAVERN,FRED J.

    2000-12-20

    Research is presented on infrared (IR) and near infrared (NIR) sensitive sensor technologies for use in a high speed shuttered/intensified digital video camera system for range-gated imaging at "eye-safe" wavelengths in the region of 1.5 microns. The study is based upon nonlinear crystals used for second harmonic generation (SHG) in optical parametric oscillators (OPOs) for conversion of NIR and IR laser light to visible range light for detection with generic S-20 photocathodes. The intensifiers are "stripline" geometry 18-mm diameter microchannel plate intensifiers (MCPIIs), designed by Los Alamos National Laboratory and manufactured by Philips Photonics. The MCPIIs are designed for fast optical shuttering with exposures in the 100-200 ps range, and are coupled to a fast readout CCD camera. Conversion efficiency and resolution for the wavelength conversion process are reported. Experimental set-ups for the wavelength shifting and the optical configurations for producing and transporting laser reflectance images are discussed.

  2. Development of the control circuits for the TDI-CCD stereo camera of the Chang'E-2 satellite based on FPGAs

    NASA Astrophysics Data System (ADS)

    Duan, Yong-Qiang; Gao, Wei; Qiao, Wei-Dong; Wen, De-Sheng; Zhao, Bao-Chang

    2013-09-01

    The TDI-CCD Stereo Camera is the optical sensor on the Chang'E-2 (CE-2) satellite created for the Chinese Lunar Exploration Program. The camera was designed to acquire three-dimensional stereoscopic images of the lunar surface based upon three-line-array photogrammetric theory. The primary objectives of the camera are (1) to obtain roughly 1-m pixel spatial resolution images of the prospective landing site from an elliptical orbit at an altitude of ~15 km, and (2) to obtain roughly 7-m pixel spatial resolution global images of the Moon from a circular orbit at an altitude of ~100 km. The focal plane of the camera comprises two TDI-CCDs. The control circuits of the camera are designed around two SRAM-type FPGAs, XQR2V3000-4CG717. In this paper, a variable frequency control and multi-tap data readout technology for the TDI-CCD is presented, which can change the data processing capabilities according to the orbit mode of the TDI-CCD stereo camera. In this way, the data rate of the camera is greatly reduced, from 100 Mbps to 25 Mbps in the high orbit mode, which benefits the reliability of the image transfer. Results from onboard flight validate that the proposed methodology is reasonable and reliable.

  3. Requisites for the remote-controlled wide-view CCD camera unit for natural orifice transluminal endoscopic surgery placed in the intraperitoneal cavity.

    PubMed

    Ohdaira, Takeshi; Yasuda, Yoshikazu; Hashizume, Makoto

    2010-04-01

    In natural orifice transluminal endoscopic surgery (NOTES) using a single endoscope, the visual field moves unstably and a wide blind space is formed. We used two wireless CCD cameras (270,000 and 380,000 pixels) placed on the abdominal wall of pigs, together with a conventional endoscope (410,000 pixels), to assess whether it was possible to observe the entire process of sigmoidectomy by NOTES. A titanium dioxide-coated lens was used as an antifogging apparatus. To control the CCD image frames, a magnetic body was affixed to the back of the CCD camera unit. To select a suitable video transmitter, three frequency bands were assessed: 0.07 GHz, 1.2 GHz, and 2.4 GHz. The cameras showed good performance for monitoring all procedures of the sigmoidectomy. The magnetic force most suitable for controlling the cameras was found to be 360 mT, and the best transmission frequency was 1.2 GHz. The battery could be used for up to 4 hours with intermittent use. The issue of lens fogging could be resolved by a water supply into the anal canal and more than 12 hours of ultraviolet irradiation. We verified that the CCD camera with the titanium dioxide-coated lens may be useful as a second eye in NOTES. PMID:20437343

  4. Station Cameras Capture New Videos of Hurricane Katia - Duration: 5 minutes, 36 seconds.

    NASA Video Gallery

    Aboard the International Space Station, external cameras captured new video of Hurricane Katia as it moved northwest across the western Atlantic north of Puerto Rico at 10:35 a.m. EDT on September ...

  5. Fused Six-Camera Video of STS-134 Launch - Duration: 79 seconds.

    NASA Video Gallery

    Imaging experts funded by the Space Shuttle Program and located at NASA's Ames Research Center prepared this video by merging nearly 20,000 photographs taken by a set of six cameras capturing 250 i...

  6. Performances of a solid streak camera based on conventional CCD with nanosecond time resolution

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Bai, Yonglin; Zhu, Bingli; Gou, Yongsheng; Xu, Peng; Bai, XiaoHong; Liu, Baiyu; Qin, Junjun

    2015-02-01

    Imaging systems with high temporal resolution are needed to study rapid physical phenomena ranging from shock waves, including extracorporeal shock waves used for surgery, to diagnostics of laser fusion and fuel injection in internal combustion engines. However, conventional streak cameras use a vacuum tube, making them fragile, cumbersome, and expensive. Here we report a CMOS streak camera project that reproduces this streak camera functionality with a single CMOS chip. By changing the charge-transfer mode of the CMOS image sensor, fast photoelectric diagnostics of a single point with a linear CMOS sensor and high-speed line scanning with an array CMOS sensor can be achieved, respectively. A fast photoelectric diagnostic system has been designed and fabricated to investigate the feasibility of this method. Finally, the dynamic operation of the sensors is presented. Measurements show a sample time of 500 ps and a time resolution better than 2 ns.

  7. Close infrared thermography using an intensified CCD camera: application in nondestructive high resolution evaluation of electrothermally actuated MEMS

    NASA Astrophysics Data System (ADS)

    Serio, B.; Hunsinger, J. J.; Conseil, F.; Derderian, P.; Collard, D.; Buchaillot, L.; Ravat, M. F.

    2005-06-01

    This communication describes an optical method for the thermal characterization of MEMS devices. The method is based on the use of an intensified CCD camera to record the thermal radiation emitted by the studied device in the spectral domain from 600 nm to about 850 nm. The camera consists of an intensifier coupled to a CCD sensor. The intensification allows very low signal levels to be amplified and detected. We used a standard optical microscope to image the device with sub-micron resolution. Since, in the near infrared, at very small scales and low temperatures (typically 250°C for thermal MEMS, Micro-Electro-Mechanical Systems), the thermal radiation is very weak, we used image integration in order to increase the signal-to-noise ratio. Knowing the emissivity of the imaged materials, the temperature is obtained using Planck's law. In order to evaluate the system performance we made micro-thermographies of a micro-relay thermal actuator. This device is a "U-shaped" Al/SiO2 bimorph cantilever micro-relay with a gold-to-gold electrical contact, designed for secured harsh-environment applications. The initial beam curvature resulting from residual stresses ensures a large gap between the contacts of the micro-relay. The current flow through the metallic layer heats the bimorph by the Joule effect, and the differential expansion provides the vertical displacement for contact. The experimental results are compared with FEM and analytical simulations, with good agreement between the two.
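The temperature retrieval described above, inverting Planck's law for a grey body of known emissivity, can be sketched as a single-wavelength illustration. The 800 nm band and the emissivity value below are assumptions for the example, not values from the paper.

```python
import math

H = 6.62607015e-34  # Planck constant, J s
C = 2.99792458e8    # speed of light, m/s
K = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K * temp_k)
    return a / (math.exp(b) - 1.0)

def temperature_from_radiance(wavelength_m, radiance, emissivity):
    """Invert Planck's law for a measured grey-body radiance:
    T = (hc / lambda k) / ln(1 + emissivity * a / L)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return (H * C / (wavelength_m * K)) / math.log(1.0 + emissivity * a / radiance)

# Round trip at 250 C (523.15 K), 800 nm, assumed emissivity 0.3:
lam = 800e-9
L = 0.3 * planck_radiance(lam, 523.15)
print(temperature_from_radiance(lam, L, 0.3))  # recovers ~523.15 K
```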

  8. Characterization of the luminance and shape of ash particles at Sakurajima volcano, Japan, using CCD camera images

    NASA Astrophysics Data System (ADS)

    Miwa, Takahiro; Shimano, Taketo; Nishimura, Takeshi

    2015-01-01

    We develop a new method for characterizing the properties of volcanic ash at the Sakurajima volcano, Japan, based on automatic processing of CCD camera images. Volcanic ash is studied in terms of both luminance and particle shape. A monochromatic CCD camera coupled with a stereomicroscope is used to acquire digital images through three filters that pass red, green, or blue light. For single ash particles, we measure the apparent luminance, corresponding to 256 tones for each color (red, green, and blue) for each pixel occupied by ash particles in the image, and the average and standard deviation of the luminance. The outline of each ash particle is captured from a digital image taken under transmitted light through a polarizing plate. We also define a new quasi-fractal dimension (D_qf) to quantify the complexity of the ash particle outlines. We examine two ash samples, each including about 1000 particles, which were erupted from the Showa crater of the Sakurajima volcano on February 9, 2009 and January 13, 2010. The apparent luminance of each ash particle shows a lognormal distribution. The average luminance of the ash particles erupted in 2009 is higher than that of those erupted in 2010, in good agreement with the results obtained from component analysis under a binocular microscope (i.e., the number fraction of dark juvenile particles is lower for the 2009 sample). The standard deviations of apparent luminance have two peaks in the histogram, and the quasi-fractal dimensions show different frequency distributions between the two samples. These features are not captured by conventional qualitative classification criteria or by the sphericity of the particle outlines. Our method can characterize and distinguish ash samples, even for ash particles whose properties change gradually, and is complementary to component analysis. This method also enables the relatively fast and systematic analysis of ash samples that is required for petrologic monitoring of ongoing activity, such as at the Sakurajima volcano.
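The paper defines its own quasi-fractal dimension D_qf; as a hedged illustration of the general idea of quantifying outline complexity, a standard box-counting estimate (not the paper's definition) might look like:

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the fractal dimension of an outline given as (x, y) pixel
    coordinates, via a least-squares fit of log N(s) against log(1/s),
    where N(s) is the number of boxes of side s touched by the outline."""
    logs, logn = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1.0 / s))
        logn.append(math.log(len(boxes)))
    n = len(sizes)
    mx, my = sum(logs) / n, sum(logn) / n
    return sum((a - mx) * (b - my) for a, b in zip(logs, logn)) / \
           sum((a - mx) ** 2 for a in logs)

# A smooth outline (a circle) should give a dimension near 1:
circle = [(512 + 400 * math.cos(2 * math.pi * t / 1000.0),
           512 + 400 * math.sin(2 * math.pi * t / 1000.0)) for t in range(1000)]
d = box_counting_dimension(circle, [4, 8, 16, 32, 64])
print(d)  # close to 1.0
```

A crenulated, complex particle outline yields a larger value, which is the intuition behind using a fractal-type measure to separate ash populations.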

  9. Flux Calibration of the ACS CCD Cameras III. Sensitivity Changes over Time

    NASA Astrophysics Data System (ADS)

    Bohlin, Ralph C.; Mack, Jennifer; Ubeda, Leonardo

    2011-06-01

    The flux calibration of HST instruments is normally specified after removal of artifacts such as a decline in charge transfer efficiency (CTE) for CCD detectors and optical throughput degradation. This ACS ISR deals with the HRC and WFC losses in sensitivity from polymerization of contaminants on the optical surfaces. Prior to the demise of the ACS CCD channels on 2007 Jan. 27, the losses are less than ~0.003 mag/year, except for the two short-wavelength HRC filters F220W and F250W. The measurements of the sensitivity loss rates using a set of observations of WD flux standards have a precision of ~0.0008 mag/year, while the sensitivity loss rates using repeated observations of the globular cluster 47 Tuc are probably consistent within their currently lower precision. Following the revival of ACS WFC during Servicing Mission 4 (SM4) in 2009 May, the gain of the new electronics was set so that the measured signal in electrons s^-1 matched the signal for the same 47 Tuc field as measured in 2002 with the F606W filter. However, a longer time baseline is required to reliably determine the post-SM4 loss rates.
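Converting a magnitude decline rate into a fractional throughput loss is a one-line calculation, shown here for scale:

```python
def mag_loss_to_flux_fraction(mag_per_year):
    """Fractional throughput loss per year implied by a magnitude decline,
    using flux_ratio = 10^(-0.4 * delta_mag)."""
    return 1.0 - 10.0 ** (-0.4 * mag_per_year)

# 0.003 mag/year corresponds to roughly a 0.28% flux loss per year:
print(mag_loss_to_flux_fraction(0.003))
```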

  10. Using a Video Camera to Measure the Radius of the Earth

    ERIC Educational Resources Information Center

    Carroll, Joshua; Hughes, Stephen

    2013-01-01

    A simple but accurate method for measuring the Earth's radius using a video camera is described. A video camera was used to capture a shadow rising up the wall of a tall building at sunset. A free program called ImageJ was used to measure the time it took the shadow to rise a known distance up the building. The time, distance and length of…
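A minimal sketch of the underlying geometry, under the simplest small-angle model and with illustrative numbers (not the paper's measured values): a point at height h sees the sun set a time t after the ground does, with cos(ωt) = R/(R+h), so R ≈ 2h/(ωt)².

```python
import math

OMEGA = 2.0 * math.pi / 86400.0  # Earth's rotation rate, rad/s

def earth_radius(shadow_rise_m, rise_time_s):
    """Small-angle estimate R ~ 2 h / (omega * t)^2, ignoring latitude
    and solar-declination corrections."""
    theta = OMEGA * rise_time_s
    return 2.0 * shadow_rise_m / theta ** 2

# If the shadow climbs 50 m in about 54.5 s (assumed illustrative numbers):
print(earth_radius(50.0, 54.5) / 1000.0)  # close to Earth's actual ~6371 km radius
```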

  11. Large Format, Dual Head,Triple Sensor, Self-Guiding CCD Cameras

    E-print Network

    Metchev, Stanimir

    Features include a dual-head self-guiding option, simplified power requirements, and an internal 2" filter wheel for overall system cost and weight reduction. These large-format cameras are designed to offer, in one system, large-area detectors, flexible self-guiding options, and a large-format filter wheel

  12. Automated image acquisition and processing using a new generation of 4K x 4K CCD cameras for cryo electron microscopic studies of macromolecular assemblies.

    PubMed

    Zhang, Peijun; Borgnia, Mario J; Mooney, Paul; Shi, Dan; Pan, Ming; O'Herron, Philip; Mao, Albert; Brogan, David; Milne, Jacqueline L S; Subramaniam, Sriram

    2003-08-01

    We have previously reported the development of AutoEM, a software package for semi-automated acquisition of data from a transmission electron microscope. In continuing efforts to improve the speed of structure determination of macromolecular assemblies by electron microscopy, we report here on the performance of a new generation of 4K CCD cameras for use in cryo electron microscopic applications. We demonstrate that at 120 kV, and at a nominal magnification of 67,000x, power spectra and signal-to-noise ratios for the new 4K CCD camera are comparable to values obtained for film images scanned using a Zeiss scanner to resolutions as high as approximately 1/6.5 Å^-1. The specimen area imaged for each exposure on the 4K CCD is about one-third of the area that can be recorded with a similar exposure on film. The CCD camera also serves the purpose of recording images at low magnification from the center of the hole to measure the thickness of vitrified ice in the hole. The performance of the camera is satisfactory under the low-dose conditions used in cryo electron microscopy, as demonstrated here by the determination of a three-dimensional map at 15 Å for the catalytic core of the 1.8 MDa Bacillus stearothermophilus icosahedral pyruvate dehydrogenase complex, and its comparison with the previously reported atomic model for this complex obtained by X-ray crystallography. PMID:12972350

  13. Measuring Night-Sky Brightness with a Wide-Field CCD Camera

    E-print Network

    D. M. Duriscoe; C. B. Luginbuhl; C. A. Moore

    2007-03-27

    We describe a system for rapidly measuring the brightness of the night sky using a mosaic of CCD images obtained with a low-cost automated system. The portable system produces millions of independent photometric measurements covering the entire sky, enabling the detailed characterization of natural sky conditions and light domes produced by cities. The measurements are calibrated using images of standard stars contained within the raw data, producing results closely tracking the Johnson V astronomical standard. The National Park Service has collected hundreds of data sets at numerous parks since 2001 and is using these data for the protection and monitoring of the night-sky visual resource. This system also allows comprehensive characterization of sky conditions at astronomical observatories. We explore photometric issues raised by the broadband measurement of the complex and variable night-sky spectrum, and potential indices of night-sky quality.
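The in-frame calibration described above amounts to fixing a photometric zero point from standard stars and applying it to the per-pixel sky level. A hedged sketch follows; the star flux, sky level, and the coarse pixel scale are assumed illustrative values, not numbers from the paper.

```python
import math

def zero_point(std_mag, std_counts, exptime_s):
    """Photometric zero point from a standard star of known V magnitude."""
    return std_mag + 2.5 * math.log10(std_counts / exptime_s)

def sky_brightness(zp, sky_counts_per_pixel, exptime_s, pixel_scale_arcsec):
    """Surface brightness in mag / arcsec^2 from the per-pixel sky level."""
    flux = sky_counts_per_pixel / exptime_s / pixel_scale_arcsec ** 2
    return zp - 2.5 * math.log10(flux)

# Assumed values: a V = 10.0 star giving 200,000 counts in 10 s,
# sky level 46,000 counts/pixel in 10 s, 96 arcsec/pixel plate scale.
zp = zero_point(10.0, 200000.0, 10.0)
print(zp, sky_brightness(zp, 46000.0, 10.0, 96.0))  # sky near ~21.5 mag/arcsec^2
```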

  14. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument which is currently being developed by NASA's Marshall Space Flight Center (MSFC) and the National Astronomical Observatory of Japan (NAOJ). The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6nm) light emitted by the Sun's Chromosphere to make measurements of the magnetic field in this region. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of no greater than 2 e(-)/DN, a noise level less than 25e(-), a dark current level which is less than 10e(-)/pixel/s, and a residual nonlinearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.
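Gain is commonly measured with the photon-transfer (mean-variance) method; the abstract does not state the exact procedure used for CLASP, so the following is a generic sketch of that technique, not the mission's test code.

```python
def photon_transfer_gain(mean_dn, var_dn):
    """Camera gain K in e-/DN from the photon-transfer relation
    var(DN) = mean(DN) / K for a shot-noise-limited, bias-subtracted
    flat field: mean/var = (S/K) / (S/K^2) = K."""
    return mean_dn / var_dn

# Simulated flat-field statistics for a camera with true gain 2 e-/DN:
# 10,000 e- of signal -> mean = 5000 DN, variance = 10,000 / K^2 = 2500 DN^2.
gain = photon_transfer_gain(5000.0, 2500.0)
print(gain)  # 2.0 e-/DN, right at the stated requirement limit
```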

  15. Accurate Camera Calibration for Off-line, Video-Based Augmented Reality Simon Gibson, Jon Cook, Toby Howard, Roger Hubbold

    E-print Network

    Manchester, University of

    Camera tracking is a fundamental requirement for video-based Augmented Reality applications, including off-line video-based Augmented Reality applications. We first describe an improved feature tracking algorithm

  16. Feasibility study of transmission of OTV camera control information in the video vertical blanking interval

    NASA Technical Reports Server (NTRS)

    White, Preston A., III

    1994-01-01

    The Operational Television system at Kennedy Space Center operates hundreds of video cameras, many remotely controllable, in support of the operations at the center. This study was undertaken to determine if commercial NABTS (North American Basic Teletext System) teletext transmission in the vertical blanking interval of the genlock signals distributed to the cameras could be used to send remote control commands to the cameras and the associated pan and tilt platforms. Wavelength division multiplexed fiberoptic links are being installed in the OTV system to obtain RS-250 short-haul quality. It was demonstrated that the NABTS transmission could be sent over the fiberoptic cable plant without excessive video quality degradation and that video cameras could be controlled using NABTS transmissions over multimode fiberoptic paths as long as 1.2 km.

  17. Million-frame-per-second CCD camera with 16 frames of storage

    NASA Astrophysics Data System (ADS)

    Howard, Nathan E.; Gardner, David W.; Snyder, Donald R.

    1997-12-01

    Ultrafast imaging is an important need for the development, control, and evaluation of modern air-deliverable weapons systems. Recent advances in optical imaging such as speckle interferometry can potentially improve DoD capability to deliver munitions and armaments to targets at long ranges, and under adverse seeing conditions. Moderate density arrays of at least 100 by 100 pixels and frame rates of at least 1 MHz are required. Ultrafast imaging is also required for flow field optical image analysis for hypersonic propulsion systems. Silicon Mountain Design (SMD) has built such an imager so that high quality images can be obtained for relatively low cost. The SMD-64k1M camera is capable of imaging 1,000,000 frames per second using a 256 by 256 array with the ability to store 16 frames with true 12 bits of dynamic range. This camera allows researchers to capture multiple high speed events using solid state technology housed in a 53 cubic inch package. A brief technical overview of the imager and results are presented in this paper.

  18. CCD and CMOS sensors

    NASA Astrophysics Data System (ADS)

    Waltham, Nick

    The charge-coupled device (CCD) has been developed primarily as a compact image sensor for consumer and industrial markets, but is now also the preeminent visible and ultraviolet wavelength image sensor in many fields of scientific research including space-science and both Earth and planetary remote sensing. Today's scientific or science-grade CCD will strive to maximise pixel count, focal plane coverage, photon detection efficiency over the broadest spectral range and signal dynamic range whilst maintaining the lowest possible readout noise. The relatively recent emergence of complementary metal oxide semiconductor (CMOS) image sensor technology is arguably the most important development in solid-state imaging since the invention of the CCD. CMOS technology enables the integration on a single silicon chip of a large array of photodiode pixels alongside all of the ancillary electronics needed to address the array and digitise the resulting analogue video signal. Compared to the CCD, CMOS promises a more compact, lower mass, lower power and potentially more radiation tolerant camera.

  19. Camera Angle Affects Dominance in Video-Mediated Communication

    E-print Network

    Olson, Judith S.

    In Video-Mediated Communication (VMC), these are distorted in various ways. Monitors and camera zooms make people look close or far; monitors and camera angles can be high or low, making people look tall or short; volume can make people seem taller or shorter than they are. A person looking up all the time (with the remote person apparently looking down

  20. Lights! Camera! Action! Handling Your First Video Assignment.

    ERIC Educational Resources Information Center

    Thomas, Marjorie Bekaert

    1989-01-01

    The author discusses points to consider when hiring and working with a video production company to develop a video for human resources purposes. Questions to ask the consultants are included, as is information on the role of the company liaison and on how to avoid expensive, time-wasting pitfalls. (CH)

  1. Lights, Cameras, Pencils! Using Descriptive Video to Enhance Writing

    ERIC Educational Resources Information Center

    Hoffner, Helen; Baker, Eileen; Quinn, Kathleen Benson

    2008-01-01

    Students of various ages and abilities can increase their comprehension and build vocabulary with the help of a new technology, Descriptive Video. Descriptive Video (also known as described programming) was developed to give individuals with visual impairments access to visual media such as television programs and films. Described programs,…

  2. The Importance of Camera Calibration and Distortion Correction to Obtain Measurements with Video Surveillance Systems

    NASA Astrophysics Data System (ADS)

    Cattaneo, C.; Mainetti, G.; Sala, R.

    2015-11-01

    Video surveillance systems are commonly used as important sources of quantitative information, and a large amount of metric information can be obtained from the acquired images. Yet different methodological issues must be considered in order to perform accurate measurements from images. The most important is camera calibration, i.e., the estimation of the parameters defining the camera model. One of the most widely used approaches is Zhang's method, which estimates the linear parameters of the camera model. The method is popular because it requires a simple setup and a fast calibration procedure, but it does not consider lens distortion, which must be taken into account with the short focal lengths commonly used in video surveillance systems. In order to perform accurate measurements, the linear camera model and Zhang's method are extended to take nonlinear parameters into account and compensate for the distortion contribution. In this paper we first describe the pinhole camera model, which treats cameras as central projection systems. After a brief introduction to the camera calibration process, and Zhang's method in particular, we describe the different types of lens distortion and the techniques used for distortion compensation. Finally, some numerical examples are shown to demonstrate the importance of distortion compensation for accurate measurements.
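A minimal sketch of radial distortion and its iterative compensation in normalized image coordinates. The two-term Brown-Conrady model and the coefficient values below are generic assumptions for illustration, not the paper's calibration.

```python
def distort(x, y, k1, k2):
    """Apply two-term Brown-Conrady radial distortion to normalized
    coordinates (origin at the principal point, unit focal length)."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

def undistort(xd, yd, k1, k2, iterations=20):
    """Compensate distortion by fixed-point iteration: repeatedly
    re-evaluate the radial factor at the current estimate and divide
    it out of the distorted coordinates."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / s, yd / s
    return x, y

# Round trip with assumed coefficients typical of a short focal length:
xd, yd = distort(0.3, -0.2, -0.25, 0.08)
xu, yu = undistort(xd, yd, -0.25, 0.08)
print(xu, yu)  # recovers (0.3, -0.2)
```

The iteration converges quickly because the radial factor varies slowly near the true point; this is the same scheme used in common undistortion implementations.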

  3. Robust Camera Calibration Tool for Video Surveillance Camera in Urban Environment

    E-print Network

    Southern California, University of

    Ram Nevatia, Institute for Robotics and Intelligent Systems, University of Southern California, Los Angeles. Surveillance systems such as smart rooms and security systems are prevalent nowadays. Camera calibration information (e.g. camera position, orientation, and focal length) is very useful for various surveillance systems because it can

  4. BOREAS RSS-3 Imagery and Snapshots from a Helicopter-Mounted Video Camera

    NASA Technical Reports Server (NTRS)

    Walthall, Charles L.; Loechel, Sara; Nickeson, Jaime (Editor); Hall, Forrest G. (Editor)

    2000-01-01

    The BOREAS RSS-3 team collected helicopter-based video coverage of forested sites acquired during BOREAS as well as single-frame "snapshots" processed to still images. Helicopter data used in this analysis were collected during all three 1994 IFCs (24-May to 16-Jun, 19-Jul to 10-Aug, and 30-Aug to 19-Sep), at numerous tower and auxiliary sites in both the NSA and the SSA. The VHS-camera observations correspond to other coincident helicopter measurements. The field of view of the camera is unknown. The video tapes are in both VHS and Beta format. The still images are stored in JPEG format.

  5. Street Sense and Video Cameras: Broadcasting Moves Off-Campus.

    ERIC Educational Resources Information Center

    Nimmer, David

    1992-01-01

    Advocates sending news reporting students off campus and into the community to expand their horizons, interests, courage, and possibilities. Describes the use of such experiential learning in a broadcast reporting class, where students left the cameras at home as they experienced the police or social services sector, and wrote an in-depth story…

  6. Camera/Video Phones in Schools: Law and Practice

    ERIC Educational Resources Information Center

    Parry, Gareth

    2005-01-01

    The emergence of mobile phones with built-in digital cameras is creating legal and ethical concerns for school systems throughout the world. Users of such phones can instantly email, print or post pictures to other MMS1 phones or websites. Local authorities and schools in Britain, Europe, USA, Canada, Australia and elsewhere have introduced…

  7. Laser Imaging Video Camera Sees Through Fire, Fog, Smoke

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a series of SBIR contracts with Langley Research Center, inventor Richard Billmers refined a prototype for a laser imaging camera capable of seeing through fire, fog, smoke, and other obscurants. Now, Canton, Ohio-based Laser Imaging through Obscurants (LITO) Technologies Inc. is demonstrating the technology as a perimeter security system at Glenn Research Center and planning its future use in aviation, shipping, emergency response, and other fields.

  8. Novel insights into green sea turtle behaviour using animal-borne video cameras

    E-print Network

    Dill, Lawrence M.

    Novel insights into green sea turtle behaviour using animal-borne video cameras Michael R. Heithaus occasionally made short darts outside the field of view while turtles swam in the water column, foraging on sea to be consumed in a single bite. However, one turtle scavenged a large jellyfish that was extracted from a sea

  9. Nyquist Sampling Theorem: Understanding the Illusion of a Spinning Wheel Captured with a Video Camera

    ERIC Educational Resources Information Center

    Levesque, Luc

    2014-01-01

    Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the…
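The spinning-wheel illusion follows directly from sampling: the rotation per frame is aliased, folding into the nearest half-revolution, so a wheel turning slightly slower than the frame rate appears to spin slowly backwards. A minimal sketch:

```python
def apparent_rev_per_frame(true_rev_per_s, frame_rate_hz):
    """Rotation per frame as perceived from video, aliased into
    [-0.5, 0.5) revolutions (the nearest-motion interpretation
    the eye makes of a spoked wheel)."""
    step = true_rev_per_s / frame_rate_hz  # true revolutions per frame
    return step - round(step)              # fold into [-0.5, 0.5)

# A wheel at 28 rev/s filmed at 30 frames/s appears to spin backwards:
print(apparent_rev_per_frame(28.0, 30.0) * 30.0)  # about -2 rev/s
```

Only when the true rate stays below half the frame rate (the Nyquist limit) does the perceived motion match the real one.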

  10. Design and Implementation of a Wireless Video Camera Network for Coastal Erosion Monitoring

    E-print Network

    Little, Thomas

    The short-term rate of coastal erosion and recession has been observed at island shoreline bluffs near waterways in Boston Harbor, Massachusetts, USA. This erosion has been hypothesized to be partially related

  11. Observation of hydrothermal flows with acoustic video camera

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Asada, A.; Tamaki, K.; Scientific Team Of Yk09-13 Leg 1

    2010-12-01

    Evaluating hydrothermal discharge and its diffusion process along the ocean ridge is necessary for understanding the balance of mass and flux in the ocean, the ecosystems around hydrothermal fields, and so on. However, it has been difficult to measure hydrothermal activity without the disturbance caused by the observation platform (submersible, ROV, AUV). We therefore sought an observational method that could capture hydrothermal discharge behavior as it is. DIDSON (Dual-Frequency IDentification SONar) is an acoustic lens-based sonar. It has sufficiently high resolution and a rapid refresh rate, so it can substitute for an optical system in turbid or dark water where optical systems fail. DIDSON operates at two frequencies, 1.8 MHz or 1.1 MHz, and forms 96 beams spaced 0.3° apart or 48 beams spaced 0.6° apart, respectively. It images out to 12 m at 1.8 MHz and 40 m at 1.1 MHz. The transmit and receive beams are formed with acoustic lenses with rectangular apertures, made of polymethylpentene plastic and FC-70 liquid. This physical beam forming allows DIDSON to consume only 30 W of power. DIDSON updates its image at between 20 and 1 frames/s depending on the operating frequency and the maximum range imaged, and communicates with its host over Ethernet. The Institute of Industrial Science, University of Tokyo (IIS) has recognized DIDSON's superior performance and sought new ways to utilize it. Observation systems that IIS has developed based on DIDSON include a waterside surveillance system, an automatic fish-length measurement system, an automatic fish-counting system, a diagnosis system for deterioration of underwater structures, and so on. The next challenge is to develop a DIDSON-based method for observing hydrothermal discharge from seafloor vents. We expected DIDSON to reveal the whole image of a hydrothermal plume as well as details inside the plume. In October 2009, we conducted a seafloor reconnaissance using the manned deep-sea submersible Shinkai 6500 on the Central Indian Ridge at 18-20°S, where hydrothermal plume signatures had previously been detected. DIDSON was mounted on top of Shinkai 6500 to obtain acoustic video images of hydrothermal plumes. Seven dives of Shinkai 6500 were conducted on this cruise, and acoustic video images of hydrothermal plumes were captured on three of them; these are among the very few acoustic video images of hydrothermal plumes obtained to date. Processing and analysis of the acoustic video image data are ongoing. We will report an overview of the acoustic video images of the hydrothermal plumes and discuss the potential of DIDSON as an observation tool for seafloor hydrothermal activity.

  12. Acceptance/operational test procedure 101-AW tank camera purge system and 101-AW video camera system

    SciTech Connect

    Castleberry, J.L.

    1994-09-19

    This procedure will document the satisfactory operation of the 101-AW Tank Camera Purge System (CPS) and the 101-AW Video Camera System. The safety interlock which shuts down all the electronics inside the 101-AW vapor space, during loss of purge pressure, will be in place and tested to ensure reliable performance. This procedure is separated into four sections. Section 6.1 is performed in the 306 building prior to delivery to the 200 East Tank Farms and involves leak checking all fittings on the 101-AW Purge Panel for leakage using a Snoop solution and resolving the leakage. Section 7.1 verifies that PR-1, the regulator which maintains a positive pressure within the volume (cameras and pneumatic lines), is properly set. In addition the green light (PRESSURIZED) (located on the Purge Control Panel) is verified to turn on above 10 in. w.g. and after the time delay (TDR) has timed out. Section 7.2 verifies that the purge cycle functions properly, the red light (PURGE ON) comes on, and that the correct flowrate is obtained to meet the requirements of the National Fire Protection Association. Section 7.3 verifies that the pan and tilt, camera, associated controls and components operate correctly. This section also verifies that the safety interlock system operates correctly during loss of purge pressure. During the loss of purge operation the illumination of the amber light (PURGE FAILED) will be verified.

  13. Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source (HTPD 08 paper)

    SciTech Connect

    Haugh, M; Schneider, M B

    2008-04-28

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 µm square pixels, and 15 µm thickness. A multi-anode Manson X-ray source, operating up to 10 kV and 10 W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 10. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within ±1% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy-dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.
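The flat-field correction such a calibration enables, dividing each shot image by the normalized response map measured with uniform illumination, can be sketched as (a generic illustration, not the SXI pipeline):

```python
def flat_field_correct(raw, flat):
    """Divide out pixel-to-pixel sensitivity variation measured with a
    uniform (flat) illumination; the flat is normalized to unit mean so
    the corrected image keeps the raw intensity scale."""
    mean = sum(sum(row) for row in flat) / (len(flat) * len(flat[0]))
    return [[r / (f / mean) for r, f in zip(rr, fr)]
            for rr, fr in zip(raw, flat)]

# A 1% sensitivity dip in one pixel is removed after correction:
raw = [[100.0, 99.0], [100.0, 100.0]]
flat = [[1000.0, 990.0], [1000.0, 1000.0]]
print(flat_field_correct(raw, flat))  # all pixels equal after correction
```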

  14. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (inventor); Venema, Steven C. (inventor)

    1993-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  15. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (inventor); Venema, Steven C. (inventor)

    1991-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to the coordinates of a selected or nonselected camera.

  16. Hardware-based smart camera for recovering high dynamic range video from multiple exposures

    NASA Astrophysics Data System (ADS)

    Lapray, Pierre-Jean; Heyrman, Barthélémy; Ginhac, Dominique

    2014-10-01

    In many applications such as video surveillance or defect detection, the perception of information related to a scene is limited in areas with strong contrasts. The high dynamic range (HDR) capture technique can deal with these limitations. The proposed method has the advantage of automatically selecting multiple exposure times to make outputs more visible than those from fixed exposures. A real-time hardware implementation of the HDR technique that shows more detail in both the dark and bright areas of a scene is an important line of research. For this purpose, we built a dedicated smart camera that performs both capture and HDR video processing from three exposures. What is new in our work is shown through the following points: HDR video capture through multiple exposure control, HDR memory management, HDR frame generation, and representation under a hardware context. Our camera achieves real-time HDR video output at 60 fps at 1.3 megapixels and demonstrates the efficiency of our technique through experimental results. Applications of this HDR smart camera include the movie industry, the mass-consumer market, the military, the automotive industry, and surveillance.
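
    The multi-exposure merge at the heart of such a pipeline can be illustrated in software. This is a minimal sketch, not the authors' hardware implementation: it assumes a linear sensor response and uses a common triangle weighting that discounts pixels near under- or over-exposure when combining per-exposure radiance estimates.

```python
import numpy as np

def merge_hdr(frames, exposure_times):
    """Merge differently exposed frames into one relative-radiance map.

    frames: list of float arrays scaled to [0, 1] (linear response assumed).
    exposure_times: matching exposure times in seconds.
    Each pixel's radiance estimates (value / exposure time) are averaged with
    a triangle weight that peaks at mid-gray, so nearly black or saturated
    pixels contribute little.
    """
    num = np.zeros_like(frames[0], dtype=np.float64)
    den = np.zeros_like(frames[0], dtype=np.float64)
    for z, t in zip(frames, exposure_times):
        w = 1.0 - np.abs(2.0 * z - 1.0)  # 0 at z=0 or z=1, 1 at z=0.5
        num += w * (z / t)
        den += w
    return num / np.maximum(den, 1e-9)
```

    For a scene of constant radiance, the saturated longest exposure is weighted out and the remaining exposures agree on the recovered value.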

  17. Development of a compact fast CCD camera and resonant soft x-ray scattering endstation for time-resolved pump-probe experiments.

    PubMed

    Doering, D; Chuang, Y-D; Andresen, N; Chow, K; Contarato, D; Cummings, C; Domning, E; Joseph, J; Pepper, J S; Smith, B; Zizka, G; Ford, C; Lee, W S; Weaver, M; Patthey, L; Weizeorick, J; Hussain, Z; Denes, P

    2011-07-01

    The designs of a compact, fast CCD (cFCCD) camera, together with a resonant soft x-ray scattering endstation, are presented. The cFCCD camera consists of a highly parallel, custom, thick, high-resistivity CCD, read out by a custom 16-channel application-specific integrated circuit to reach a maximum readout rate of 200 frames per second. The camera is mounted on a virtual-axis flip stage inside the RSXS chamber. When this flip stage is coupled to a differentially pumped rotary seal, the detector assembly can rotate about 100°/360° in the vertical/horizontal scattering planes. With a six-degrees-of-freedom cryogenic sample goniometer, this endstation has the capability to detect the superlattice reflections from the electronic orderings showing up in the lower hemisphere. The complete system has been tested at the Advanced Light Source, Lawrence Berkeley National Laboratory, and has been used in multiple experiments at the Linac Coherent Light Source, SLAC National Accelerator Laboratory. PMID:21806178

  18. Development of a compact fast CCD camera and resonant soft x-ray scattering endstation for time-resolved pump-probe experiments

    NASA Astrophysics Data System (ADS)

    Doering, D.; Chuang, Y.-D.; Andresen, N.; Chow, K.; Contarato, D.; Cummings, C.; Domning, E.; Joseph, J.; Pepper, J. S.; Smith, B.; Zizka, G.; Ford, C.; Lee, W. S.; Weaver, M.; Patthey, L.; Weizeorick, J.; Hussain, Z.; Denes, P.

    2011-07-01

    The designs of a compact, fast CCD (cFCCD) camera, together with a resonant soft x-ray scattering endstation, are presented. The cFCCD camera consists of a highly parallel, custom, thick, high-resistivity CCD, read out by a custom 16-channel application-specific integrated circuit to reach a maximum readout rate of 200 frames per second. The camera is mounted on a virtual-axis flip stage inside the RSXS chamber. When this flip stage is coupled to a differentially pumped rotary seal, the detector assembly can rotate about 100°/360° in the vertical/horizontal scattering planes. With a six-degrees-of-freedom cryogenic sample goniometer, this endstation has the capability to detect the superlattice reflections from the electronic orderings showing up in the lower hemisphere. The complete system has been tested at the Advanced Light Source, Lawrence Berkeley National Laboratory, and has been used in multiple experiments at the Linac Coherent Light Source, SLAC National Accelerator Laboratory.

  19. Video and acoustic camera techniques for studying fish under ice: a review and comparison

    SciTech Connect

    Mueller, Robert P.; Brown, Richard S.; Hop, Haakon H.; Moulton, Larry

    2006-09-05

    Researchers attempting to study the presence, abundance, size, and behavior of fish species in northern and arctic climates during winter face many challenges, including thick ice cover, snow cover, and, at times, extremely low temperatures. This paper describes and compares the use of video and acoustic cameras for determining fish presence and behavior in lakes, rivers, and streams with ice cover. Methods are provided for determining fish density and size, identifying species, and measuring swimming speed, and successful applications from previous surveys of fish under the ice are described. Practical considerations include drilling ice holes, selecting batteries and generators, deploying pan-and-tilt cameras, and using paired colored lasers to determine fish size and habitat associations. We also discuss the use of infrared and white light to enhance image-capturing capabilities, the deployment of digital recording systems and time-lapse techniques, and the use of imaging software. Data are presented from initial surveys with video and acoustic cameras in the Sagavanirktok River Delta, Alaska, during late winter 2004. These surveys represent the first known successful application of a dual-frequency identification sonar (DIDSON) acoustic camera under the ice that achieved fish detection and sizing at camera ranges up to 16 m. Feasibility tests of video and acoustic cameras for determining fish size and density at various turbidity levels are also presented. Comparisons are made of the different techniques in terms of suitability for achieving various fisheries research objectives. This information is intended to assist researchers in choosing the equipment that best meets their study needs.
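
    Of the techniques above, sizing fish with paired parallel lasers reduces to a proportional calculation: the known physical separation of the two laser dots sets the pixel-to-centimeter scale at the fish's range. A minimal sketch (the function name, and the assumption that the dots lie at the fish's range, are illustrative rather than from the paper):

```python
def fish_length_cm(laser_separation_cm, laser_separation_px, fish_length_px):
    """Estimate fish length from an image in which two parallel laser dots
    of known physical separation appear at the fish's range.
    The dot separation in pixels fixes the image scale at that distance."""
    scale_cm_per_px = laser_separation_cm / laser_separation_px
    return fish_length_px * scale_cm_per_px
```

    For example, lasers mounted 10 cm apart that span 50 pixels give a scale of 0.2 cm per pixel, so a fish spanning 200 pixels measures about 40 cm.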

  20. Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge-coupled device. The camera consists of an X-ray-sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.

  1. A Novel Method to Reduce Time Investment When Processing Videos from Camera Trap Studies

    PubMed Central

    Swinnen, Kristijn R. R.; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

    Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record the presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species but instead were empty or contained other species (together, non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch the recordings, in order to reduce workload. Discrimination between recordings of the target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we expected that recordings with the target species contain on average much more movement than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values, and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step to the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs.
PMID:24918777
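
    The discrimination idea described in the abstract (flagging recordings by frame-to-frame pixel variation) can be sketched as follows. This is a simplified illustration rather than either of the authors' two filter methods, and the thresholds are hypothetical values that would need tuning to the environmental conditions the paper discusses:

```python
import numpy as np

def motion_score(frames, pixel_thresh=0.05):
    """Average fraction of pixels that change appreciably between
    consecutive frames. frames: list of grayscale arrays in [0, 1]."""
    scores = []
    for prev, cur in zip(frames[:-1], frames[1:]):
        changed = np.abs(cur.astype(float) - prev.astype(float)) > pixel_thresh
        scores.append(changed.mean())
    return float(np.mean(scores))

def keep_recording(frames, score_thresh=0.02):
    """Keep a recording for manual review when its motion score suggests a
    large moving animal; discard likely non-target recordings otherwise."""
    return motion_score(frames) >= score_thresh
```

    A large animal such as a beaver changes many pixels per frame, so its recordings score well above those triggered by small animals or noise; the score threshold trades off lost target recordings against discarded non-target ones, mirroring the 5-20% loss budget reported in the paper.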

  2. A novel method to reduce time investment when processing videos from camera trap studies.

    PubMed

    Swinnen, Kristijn R R; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

    Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record the presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species but instead were empty or contained other species (together, non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch the recordings, in order to reduce workload. Discrimination between recordings of the target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we expected that recordings with the target species contain on average much more movement than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values, and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step to the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs.
PMID:24918777

  3. A digital underwater video camera system for aquatic research in regulated rivers

    USGS Publications Warehouse

    Martin, Benjamin M.; Irwin, Elise R.

    2010-01-01

    We designed a digital underwater video camera system to monitor nesting centrarchid behavior in the Tallapoosa River, Alabama, 20 km below a peaking hydropower dam with a highly variable flow regime. Major components of the system included a digital video recorder, multiple underwater cameras, and specially fabricated substrate stakes. The innovative design of the substrate stakes allowed us to effectively observe nesting redbreast sunfish Lepomis auritus in a highly regulated river. Substrate stakes, which were constructed for the specific substratum complex (i.e., sand, gravel, and cobble) identified at our study site, were able to withstand a discharge level of approximately 300 m3/s and allowed us to simultaneously record 10 active nests before and during water releases from the dam. We believe our technique will be valuable for other researchers that work in regulated rivers to quantify behavior of aquatic fauna in response to a discharge disturbance.

  4. A passive THz video camera based on lumped element kinetic inductance detectors

    E-print Network

    Rowe, Sam; Doyle, Simon; Dunscombe, Chris; Hargrave, Peter; Papageorgio, Andreas; Wood, Ken; Ade, Peter A R; Barry, Peter; Bideaud, Aurélien; Brien, Tom; Dodd, Chris; Grainger, William; House, Julian; Mauskopf, Philip; Moseley, Paul; Spencer, Locke; Sudiwala, Rashmi; Tucker, Carole; Walker, Ian

    2015-01-01

    We have developed a passive 350 GHz (850 µm) video camera to demonstrate lumped element kinetic inductance detectors (LEKIDs) -- designed originally for far-infrared astronomy -- as an option for general purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ~0.1 K, which is close to the background limit. The 152 element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low noise amplifier and room temperature frequency domain multiplexing electronics.

  5. Wilbur: A Low-Cost CCD System for MDM Observatory

    NASA Astrophysics Data System (ADS)

    Metzger, Mark R.; Tonry, John L.; Luppino, Gerard A.

    1993-01-01

    We describe "Wilbur", a CCD camera constructed for the Michigan-Dartmouth-MIT Observatory. The camera system hardware was constructed using existing designs for the dewar and control electronics and a commercially available control computer. The requirement for new hardware design was reduced to a simple interface, allowing us to keep the cost low and produce a working system on the telescope in under three months. New software written for operation of the camera consists of several individual components which provide data acquisition from the CCD, control of the telescope, and operation of auxiliary instruments. The hardware and software are modular, giving the flexibility to operate with other existing and future detectors at the observatory. The software also provides advanced CCD readout features such as shutterless video and drift scanning, and can be operated remotely from other computers over an IP-based network.

  6. Operation and maintenance manual for the high resolution stereoscopic video camera system (HRSVS) system 6230

    SciTech Connect

    Pardini, A.F., Westinghouse Hanford

    1996-07-16

    The High Resolution Stereoscopic Video Camera System (HRSVS), system 6230, is a stereoscopic camera system that will be used as an end effector on the LDUA to perform surveillance and inspection activities within Hanford waste tanks. It is attached to the LDUA by means of a Tool Interface Plate (TIP), which provides a feedthrough for all electrical and pneumatic utilities needed by the end effector to operate.

  7. Human Daily Activities Indexing in Videos from Wearable Cameras for Monitoring of Patients with Dementia Diseases

    E-print Network

    Karaman, Svebor; Mégret, Rémi; Dovgalecs, Vladislavs; Dartigues, Jean-François; Gaëstel, Yann

    2010-01-01

    Our research focuses on analysing human activities according to a known behaviorist scenario, in case of noisy and high dimensional collected data. The data come from the monitoring of patients with dementia diseases by wearable cameras. We define a structural model of video recordings based on a Hidden Markov Model. New spatio-temporal features, color features and localization features are proposed as observations. First results in recognition of activities are promising.
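
    The structural model above is based on a Hidden Markov Model over observed features. As a generic illustration of the machinery involved (not the paper's actual features, states, or training procedure), the forward algorithm below scores an observation sequence against a discrete-state HMM in log space:

```python
import numpy as np

def forward_loglik(log_pi, log_A, log_B):
    """Log-likelihood of an observation sequence under a discrete-state HMM
    via the forward algorithm.
    log_pi[s]: log initial probability of state s.
    log_A[s, s2]: log transition probability from state s to s2.
    log_B[t, s]: log emission probability of the observation at time t in s.
    """
    alpha = log_pi + log_B[0]
    for t in range(1, log_B.shape[0]):
        # alpha'[s2] = logsumexp_s(alpha[s] + log_A[s, s2]) + log_B[t, s2]
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[t]
    return float(np.logaddexp.reduce(alpha))
```

    In an activity-indexing setting, a segment of video features would be scored against one such model per activity and assigned to the highest-likelihood one.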

  8. A New Remote Sensing Filter Radiometer Employing a Fabry-Perot Etalon and a CCD Camera for Column Measurements of Methane in the Earth Atmosphere

    NASA Technical Reports Server (NTRS)

    Georgieva, E. M.; Huang, W.; Heaps, W. S.

    2012-01-01

    A portable remote sensing system for precision column measurements of methane has been developed, built and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera, and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions which features an uncooled InGaAs detector array working up to 1.7 micrometers. Custom software was developed in addition to the basic graphical user interface, X-Control, provided by the company to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes of the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.
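
    The wavelength selectivity of an air-gapped Fabry-Perot filter of the kind described follows the standard Airy transmission function. The sketch below illustrates the relationship; the gap, reflectance, and wavelength values used for testing are illustrative, not the instrument's design values:

```python
import numpy as np

def etalon_transmission(wavelength_m, gap_m, reflectance, n=1.0, theta=0.0):
    """Airy transmission of an ideal (lossless) Fabry-Perot etalon.

    wavelength_m: vacuum wavelength; gap_m: mirror spacing; reflectance:
    mirror power reflectance R; n: gap refractive index (1 for air);
    theta: internal angle of incidence in radians.
    """
    delta = 4.0 * np.pi * n * gap_m * np.cos(theta) / wavelength_m
    F = 4.0 * reflectance / (1.0 - reflectance) ** 2  # coefficient of finesse
    return 1.0 / (1.0 + F * np.sin(delta / 2.0) ** 2)
```

    On resonance (gap equal to an integer number of half-wavelengths) the transmission peaks at unity; midway between peaks it falls to 1/(1+F), which is how a narrow passband is carved out of the 1.636-1.646 micrometer region.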

  9. Design and evaluation of controls for drift, video gain, and color balance in spaceborne facsimile cameras

    NASA Technical Reports Server (NTRS)

    Katzberg, S. J.; Kelly, W. L., IV; Rowland, C. W.; Burcher, E. E.

    1973-01-01

    The facsimile camera is an optical-mechanical scanning device which has become an attractive candidate as an imaging system for planetary landers and rovers. This paper presents electronic techniques which permit the acquisition and reconstruction of high quality images with this device, even under varying lighting conditions. These techniques include a control for low frequency noise and drift, an automatic gain control, a pulse-duration light modulation scheme, and a relative spectral gain control. Taken together, these techniques allow the reconstruction of radiometrically accurate and properly balanced color images from facsimile camera video data. These techniques have been incorporated into a facsimile camera and reproduction system, and experimental results are presented for each technique and for the complete system.

  10. Structural analysis of color video camera installation on tank 241AW101 (2 Volumes)

    SciTech Connect

    Strehlow, J.P.

    1994-08-24

    A video camera is planned to be installed on the radioactive storage tank 241AW101 at the DOE's Hanford Site in Richland, Washington. The camera will occupy the 20 inch port of the Multiport Flange riser which is to be installed on riser 5B of the 241AW101 (3,5,10). The objective of the project reported herein was to perform a seismic analysis and evaluation of the structural components of the camera for a postulated Design Basis Earthquake (DBE) per the reference Structural Design Specification (SDS) document (6). The details of the supporting engineering calculations are documented in URS/Blume Calculation No. 66481-01-CA-03 (1).

  11. Low-complexity camera digital signal imaging for video document projection system

    NASA Astrophysics Data System (ADS)

    Hsia, Shih-Chang; Tsai, Po-Shien

    2011-04-01

    We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.
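
    Of the DSP functions listed, white balance is the simplest to illustrate compactly. The paper does not specify its exact algorithm, so the sketch below uses the gray-world method, a common low-complexity choice for this kind of pipeline:

```python
import numpy as np

def gray_world_balance(rgb):
    """Gray-world white balance: scale each channel so its mean matches the
    overall mean, on the assumption that the scene averages to neutral gray.
    rgb: float array of shape (H, W, 3) with values in [0, 1]."""
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / np.maximum(channel_means, 1e-9)
    return np.clip(rgb * gains, 0.0, 1.0)
```

    A uniformly color-cast image comes out with equal channel means, which is the gray-world assumption at work; the method is cheap enough (one pass plus a per-channel multiply) to fit a low-complexity hardware budget.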

  12. Evaluation of lens distortion errors using an underwater camera system for video-based motion analysis

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.

    1994-01-01

    Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. This data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero-gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
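
    The error computation described above (distance from the known grid coordinates to the digitized coordinates, expressed as a percentage of a reference field dimension) can be sketched as:

```python
import numpy as np

def grid_errors_percent(known_xy, measured_xy, field_size):
    """Per-point distortion error as a percentage of the field dimension.

    known_xy, measured_xy: (N, 2) arrays of grid-point coordinates.
    field_size: reference dimension (e.g. image width) in the same units.
    """
    distances = np.linalg.norm(measured_xy - known_xy, axis=1)
    return 100.0 * distances / field_size
```

    The maximum of the returned array corresponds to the worst-case figure (up to 8 percent in the study), which typically occurs at the outermost grid points of a wide-angle lens.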

  13. Video camera observation for assessing overland flow patterns during rainfall events

    NASA Astrophysics Data System (ADS)

    Silasari, Rasmiaditya; Oismüller, Markus; Blöschl, Günter

    2015-04-01

    Physically based hydrological models have been widely used in various studies to model overland flow propagation in cases such as flood inundation and dam break flow. The capability of such models to simulate the formation of overland flow by spatial and temporal discretization of the empirical equations makes it possible for hydrologists to trace overland flow generation both spatially and temporally across surface and subsurface domains. As the upscaling methods that transform hydrological process spatial patterns from the small observed scale to the larger catchment scale are still being progressively developed, physically based hydrological models have become a convenient tool to assess these patterns and their behavior, which are crucial in determining the upscaling process. Related studies in the past successfully used these models as well as field observation data for model verification. The observation data commonly used for this verification are overland flow discharge during natural rainfall events and camera observations during synthetic events (staged field experiments), while the use of camera observations during natural events is hardly discussed in publications. This study explores the potential of video camera observations of overland flow generation during natural rainfall events to support the verification of physically based hydrological models and the assessment of overland flow spatial patterns. The study is conducted within a 64 ha catchment located at Petzenkirchen, Lower Austria, known as HOAL (Hydrological Open Air Laboratory). The catchment land cover is dominated by arable land (87%), with small portions (13%) of forest, pasture, and paved surfaces. A 600 m stream runs in the southeast of the catchment, flowing southward, and is equipped with flumes and pressure transducers that measure water level at one-minute intervals from various inlets along the stream (i.e., drainages, surface runoff, springs), from which flow discharge is calculated.
    A video camera with 10x optical zoom is installed 7 m above the ground in the middle of the catchment, overlooking the west hillslope area of the stream. Images are taken every minute during daylight, while video recording is triggered by raindrop movements. The observed images and videos are analyzed together with the overland flow signals captured by the assigned pressure transducers and the rainfall intensities measured by four rain gauges across the catchment. The results show that the video camera observations enable us to assess the spatial and temporal development of overland flow generation during natural events, thus showing potential for use in model verification as well as in spatial pattern analysis.

  14. An explanation for camera perspective bias in voluntariness judgment for video-recorded confession: Suggestion of cognitive frame.

    PubMed

    Park, Kwangbai; Pyo, Jimin

    2012-06-01

    Three experiments were conducted to test the hypothesis that difference in voluntariness judgment for a custodial confession filmed in different camera focuses ("camera perspective bias") could occur because a particular camera focus conveys a suggestion of a particular cognitive frame. In Experiment 1, 146 juror eligible adults in Korea showed a camera perspective bias in voluntariness judgment with a simulated confession filmed with two cameras of different focuses, one on the suspect and the other on the detective. In Experiment 2, the same bias in voluntariness judgment emerged without cameras when the participants were cognitively framed, prior to listening to the audio track of the videos used in Experiment 1, by instructions to make either a voluntariness judgment for a confession or a coerciveness judgment for an interrogation. In Experiment 3, the camera perspective bias in voluntariness judgment disappeared when the participants viewing the video focused on the suspect were initially framed to make coerciveness judgment for the interrogation and the participants viewing the video focused on the detective were initially framed to make voluntariness judgment for the confession. The results in combination indicated that a particular camera focus may convey a suggestion of a particular cognitive frame in which a video-recorded confession/interrogation is initially represented. Some forensic and policy implications were discussed. PMID:22667808

  15. Flat Field Anomalies in an X-Ray CCD Camera Measured Using a Manson X-Ray Source

    SciTech Connect

    Michael Haugh

    2008-03-01

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. It determines how accurately NIF can point the laser beams and is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 µm square pixels, and 15 µm thick. A multi-anode Manson X-ray source, operating up to 10 kV and 2 mA, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 12. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within ±1.5% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. The efficiency pattern follows the properties of Si. The maximum quantum efficiency is 0.71. We observed an energy-dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation was >8% at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was less than the measurement uncertainty below 4 keV. We were also able to observe debris on the CCD chip. The debris showed maximum contrast at the lowest energy used, 930 eV, and disappeared by 4 keV. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.
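
    The flatness and sensitivity-variation figures quoted above are peak-to-peak measures over the illuminated CCD. A minimal sketch of such a computation on a flat-field exposure follows; it is illustrative only, not the calibration code actually used:

```python
import numpy as np

def flat_field_variation(flat_frame):
    """Peak-to-peak variation of a flat-field exposure, as a percentage of
    the mean signal. With a beam known to be flat to better than the quoted
    level, residual variation is attributed to pixel sensitivity."""
    mean = flat_frame.mean()
    return 100.0 * (flat_frame.max() - flat_frame.min()) / mean
```

    In practice the flat frame would first be dark-subtracted and, for the energy-dependent study above, one such map would be computed per X-ray energy band.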

  16. First results from newly developed automatic video system MAIA and comparison with older analogue cameras

    NASA Astrophysics Data System (ADS)

    Koten, P.; Páta, P.; Fliegel, K.; Vítek, S.

    2013-09-01

    A new automatic video system for meteor observations, MAIA, was developed in recent years [1]. The goal is to replace the older analogue cameras and provide a platform for continuous year-round observations from two different stations. Here we present first results obtained during the testing phase as well as the first double-station observations. A comparison with the older analogue cameras is provided too. MAIA (Meteor Automatic Imager and Analyzer) is based on the digital monochrome camera JAI CM-040 and the well-proven image intensifier XX1332 (Figure 1). The camera provides a spatial resolution of 776 x 582 pixels. The maximum frame rate is 61.15 frames per second. A fast Pentax SMS FA 1.4/50mm lens is used as the input element of the optical system. The resulting field of view is about 50° in diameter. The new system was first used in a semiautomatic regime for observation of the Draconid outburst on 8 October 2011. Both cameras recorded more than 160 meteors. Additional hardware and software were developed in 2012 to enable automatic observation and basic processing of the data. The system usually records video sequences for the whole night. During the daytime it scans the recordings for moving objects, saves them as short sequences, and clears the hard drives to allow further observations. Initial laboratory measurements [2] and simultaneous observations with the older system show significant improvement in the obtained data. Table 1 shows a comparison of the basic parameters of both systems. In this paper we will present a comparison of the double-station data obtained using both systems.

  17. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

    Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are applied is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
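
    Among the faults listed above, focus drift is commonly detected with a no-reference sharpness measure such as the variance of a Laplacian response over the image. The paper does not disclose its exact metric, so the following is a generic sketch of that family of checks:

```python
import numpy as np

def focus_metric(gray):
    """Variance of a 3x3 Laplacian response over a grayscale image.
    Sharp images have strong edge responses (high variance); a drop in the
    metric across periodic captures suggests focus drift."""
    # 3x3 Laplacian via shifted slices (interior pixels only)
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())
```

    A fleet-monitoring check would compare the metric against a per-site baseline rather than an absolute threshold, since scene content strongly affects the raw value.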

  18. Gain, Level, And Exposure Control For A Television Camera

    NASA Technical Reports Server (NTRS)

    Major, Geoffrey J.; Hetherington, Rolfe W.

    1992-01-01

    Automatic-level-control/automatic-gain-control (ALC/AGC) system for charge-coupled-device (CCD) color television camera prevents overloading in bright scenes, using technique for measuring brightness of scene from red, green, and blue output signals and processing these into adjustments of video amplifiers and iris on camera lens. System is faster, does not distort video brightness signals, and is built with smaller components.
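
    The control idea can be illustrated with a minimal sketch (not the NASA design): estimate scene brightness from the mean R, G, B signal levels using the standard Rec. 601 luma weights, then nudge the gain toward a target level. The target value and step size are invented for illustration.

```python
# Minimal ALC/AGC-style sketch; TARGET and GAIN_STEP are assumptions.

TARGET = 0.5      # desired normalized scene brightness
GAIN_STEP = 0.1   # proportional adjustment per update

def scene_brightness(r, g, b):
    # Weighted sum approximating perceived luminance (Rec. 601 weights).
    return 0.299 * r + 0.587 * g + 0.114 * b

def update_gain(gain, r, g, b):
    # Proportional control: raise gain when the scene is dark, lower it
    # when the scene is bright, never letting gain go negative.
    err = TARGET - scene_brightness(r, g, b)
    return max(0.0, gain + GAIN_STEP * err)
```

    A real system, as the brief notes, would split the correction between amplifier gain and the lens iris rather than folding it into a single gain term.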

  19. Traffic camera system development

    NASA Astrophysics Data System (ADS)

    Hori, Toshi

    1997-04-01

    The intelligent transportation system has generated a strong need for the development of intelligent camera systems to meet the requirements of sophisticated applications, such as electronic toll collection (ETC), traffic violation detection and automatic parking lot control. In order to achieve the highest levels of accuracy in detection, these cameras must have high-speed electronic shutters, high resolution, high frame rate, and communication capabilities. A progressive scan interline transfer CCD camera, with its high-speed electronic shutter and resolution capabilities, provides the basic functions to meet the requirements of a traffic camera system. Unlike most industrial video imaging applications, traffic cameras must deal with harsh environmental conditions and an extremely wide range of light levels. Optical character recognition is a critical function of a modern traffic camera system, with detection and accuracy heavily dependent on the camera function. In order to operate under demanding conditions, communication and functional optimization is implemented to control cameras from a roadside computer. The camera operates with a shutter speed faster than 1/2000 sec. to capture highway traffic both day and night. Consequently, camera gain, pedestal level, shutter speed and gamma functions are controlled by a look-up table containing various parameters based on environmental conditions, particularly lighting. Lighting conditions are studied carefully, to focus only on the critical license plate surface. A unique light sensor permits accurate reading under a variety of conditions, such as a sunny day, evening, twilight, storms, etc. These camera systems are being deployed successfully in major ETC projects throughout the world.
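
    The look-up-table control described above can be sketched as follows. The light-level breakpoints and parameter values are invented for illustration; a deployed system would tune them per site.

```python
# Illustrative light-level look-up table for camera settings.
# Thresholds (lux) and all parameter values are hypothetical.

import bisect

LUT_THRESHOLDS = [10, 1000, 50000]  # night | twilight | overcast | full sun
LUT_SETTINGS = [
    {"gain_db": 18, "pedestal": 40, "shutter": 1/2000,  "gamma": 0.45},
    {"gain_db": 9,  "pedestal": 25, "shutter": 1/4000,  "gamma": 0.45},
    {"gain_db": 0,  "pedestal": 10, "shutter": 1/8000,  "gamma": 1.0},
    {"gain_db": 0,  "pedestal": 10, "shutter": 1/16000, "gamma": 1.0},
]

def settings_for(lux):
    # Pick the row whose threshold bracket contains the sensor reading.
    return LUT_SETTINGS[bisect.bisect_right(LUT_THRESHOLDS, lux)]
```

    Note the shutter never drops below the 1/2000 s floor stated in the abstract, so motion blur on license plates stays bounded even at night.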

  20. Acute gastroenteritis and video camera surveillance: a cruise ship case report.

    PubMed

    Diskin, Arthur L; Caro, Gina M; Dahl, Eilif

    2014-01-01

    A 'faecal accident' was discovered in front of a passenger cabin of a cruise ship. After proper cleaning of the area the passenger was approached, but denied having any gastrointestinal symptoms. However, when confronted with surveillance camera evidence, she admitted having the accident and even bringing the towel stained with diarrhoea back to the pool towels bin. She was isolated until the next port where she was disembarked. Acute gastroenteritis (AGE) caused by Norovirus is very contagious and easily transmitted from person to person on cruise ships. The main purpose of isolation is to avoid public vomiting and faecal accidents. To quickly identify and isolate contagious passengers and crew and ensure their compliance are key elements in outbreak prevention and control, but this is difficult if ill persons deny symptoms. All passenger ships visiting US ports now have surveillance video cameras, which under certain circumstances can assist in finding potential index cases for AGE outbreaks. PMID:24677123

  1. A Semantic Autonomous Video Surveillance System for Dense Camera Networks in Smart Cities

    PubMed Central

    Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio

    2012-01-01

    This paper presents a proposal for an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed, and therefore making it suitable for use as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language that is easy for human operators to understand, is capable of raising enriched alarms with descriptions of what is happening in the image, and can automate reactions to them, such as alerting the appropriate emergency services through the Smart City safety network. PMID:23112607
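
    The alarm-raising idea can be sketched as a toy rule set over trajectory parameters. The real system reasons over ontologies; the flat rules, field names, and thresholds here are invented stand-ins that only show how trajectory parameters map to enriched, human-readable alarms.

```python
# Toy rule-based alarm classifier over per-object trajectory parameters.
# Zone names, speed/time thresholds, and messages are all hypothetical.

def classify(track):
    # track: dict with mean speed (m/s), zone label, loitering time (s).
    if track["zone"] == "restricted":
        return "intrusion: object inside a restricted zone"
    if track["speed"] > 15.0:
        return "speeding: object moving unusually fast"
    if track["loiter_s"] > 120:
        return "loitering: object stationary for a prolonged period"
    return "normal"
```

    Because only these compact parameters (not the video) need to leave the camera, many cameras can share one reasoning node, which is the scalability argument of the abstract.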

  2. A stroboscopic technique for using CCD cameras in flow visualization systems for continuous viewing and stop action photography

    NASA Technical Reports Server (NTRS)

    Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.

    1992-01-01

    A technique for synchronizing a pulse light source to charge coupled device cameras is presented. The technique permits the use of pulse light sources for continuous as well as stop action flow visualization. The technique has eliminated the need to provide separate lighting systems at facilities requiring continuous and stop action viewing or photography.

  3. Framework for an easy Jini extension demonstrated with a video camera example

    NASA Astrophysics Data System (ADS)

    Vaskivuo, Teemu O.

    2001-07-01

    Jini technology makes it possible to move parts of software from one device to another, providing flexible distribution: each part of software that could be useful to other applications can be made available to them. This feature is particularly valuable when there are many services and many users of them. Although Jini is easy to adopt when building new software, it should be made very simple to attach to existing solutions without large modifications, especially when different parts of the existing software already have clear functional roles, for example as a client and as a server. In this paper, a framework encapsulating Jini functionality is presented. Its purpose is to offer an easy add-on implementation for existing solutions, although it has also proven useful when building simple Jini applications from scratch. The framework trades flexibility for simplicity: everything the Jini distribution requires is started with only a few commands in the code, and the framework takes care of class transfer through a simple HTTP server without further user intervention. The framework is demonstrated with a video camera example, in which a moving picture from a video camera attached to a server can be viewed on any node of the local area network with minimal software requirements. A multipurpose remote-controller application acts as a universal client for all programs built using the framework.

  4. Calibration grooming and alignment for LDUA High Resolution Stereoscopic Video Camera System (HRSVS)

    SciTech Connect

    Pardini, A.F.

    1998-01-27

    The High Resolution Stereoscopic Video Camera System (HRSVS) was designed by the Savannah River Technology Center (SRTC) to provide routine and troubleshooting views of tank interiors during characterization and remediation phases of underground storage tank (UST) processing. The HRSVS is a dual color camera system designed to provide stereo viewing of the interior of the tanks, including the tank wall, in a Class 1, Division 1, flammable atmosphere. The HRSVS was designed with a modular philosophy for easy maintenance and configuration modifications. During operation of the system with the LDUA, control of the camera system will be performed by the LDUA supervisory data acquisition system (SDAS). Video and control status will be displayed on monitors within the LDUA control center. All control functions are accessible from the front panel of the control box located within the Operations Control Trailer (OCT). The LDUA will provide all positioning functions within the waste tank for the end effector. Various electronic measurement instruments will be used to perform calibration, grooming, and alignment (CG and A) activities; these may include a digital volt meter, oscilloscope, signal generator, and other electronic repair equipment. None of these instruments will need to be calibrated beyond what comes from the manufacturer. During CG and A, a temperature indicating device will be used to measure the temperature of the outside of the HRSVS from initial startup until the temperature has stabilized. This sensor will not need to be in calibration during CG and A but will have to have a current calibration sticker from the Standards Laboratory during any acceptance testing.

  5. Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras

    USGS Publications Warehouse

    Harris, A.J.L.; Thornber, C.R.

    1999-01-01

    GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) gradual variations in radiance reveal steady flow-field extension and tube development; (b) discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m² of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these correlate with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatic alerting of major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
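
    The spike criterion described above (samples more than two standard deviations above the mean of the radiance series) can be sketched in a few lines. This is a plain-Python illustration; an operational pipeline would use NumPy and handle data gaps and cloud-contaminated samples.

```python
# Flag indices whose radiance exceeds mean + n_sigma * std of the series.

def find_spikes(series, n_sigma=2.0):
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n  # population variance
    thresh = mean + n_sigma * var ** 0.5
    return [i for i, x in enumerate(series) if x > thresh]
```

    Applied to a mostly flat radiance record with one burst, the function returns the index of the burst, which would then be cross-checked against the video camera record as in the paper.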

  6. Visual fatigue modeling for stereoscopic video shot based on camera motion

    NASA Astrophysics Data System (ADS)

    Shi, Guozhong; Sang, Xinzhu; Yu, Xunbo; Liu, Yangdong; Liu, Jing

    2014-11-01

    As three-dimensional television (3-DTV) and 3-D movies become popular, visual discomfort limits further applications of 3-D display technology. Causes of visual discomfort in stereoscopic video include the conflict between accommodation and convergence, excessive binocular parallax, fast object motion, and so on. Here, a novel method for evaluating visual fatigue is demonstrated. Influence factors including spatial structure, motion scale and comfort zone are analyzed. According to the human visual system (HVS), viewers need only converge their eyes on specific objects when the camera and background are static; relative motion must be considered for other camera conditions, which determine different factor coefficients and weights. Compared with the traditional visual fatigue prediction model, a novel visual fatigue prediction model is presented: the visual fatigue degree is predicted using multiple linear regression combined with subjective evaluation. Consequently, each factor reflects the characteristics of the scene, and a total visual fatigue score is computed by the proposed algorithm. Compared with conventional algorithms which ignore the camera status, our approach exhibits reliable performance in terms of correlation with subjective test results.
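
    The regression step can be illustrated with a small sketch: fit a linear model from per-shot influence factors to subjective fatigue ratings. The factor names, sample values, and ratings below are all invented, and NumPy least squares stands in for whatever fitting procedure the authors used.

```python
# Illustrative multiple linear regression for a fatigue score.

import numpy as np

# Rows: [spatial_structure, motion_scale, comfort_zone] per test shot
# (invented feature values on a 0-1 scale).
X = np.array([
    [0.1, 0.2, 0.9],
    [0.8, 0.3, 0.4],
    [0.3, 0.9, 0.2],
    [0.6, 0.6, 0.6],
])
y = np.array([1.2, 2.5, 3.1, 3.0])  # subjective fatigue ratings (invented)

# Append an intercept column and solve the least-squares problem.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(features):
    """Predicted fatigue score for a new shot's factor vector."""
    return float(np.dot(np.append(features, 1.0), coef))
```

    The fitted coefficients play the role of the factor weights in the abstract: a large coefficient on motion scale, for instance, would indicate that camera motion dominates perceived fatigue for these shots.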

  7. Dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. II - Data handling and control systems

    NASA Technical Reports Server (NTRS)

    Dewey, D.; Ricker, G. R.

    1980-01-01

    The data collection system for the MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes) is described. The system relies on an RCA 1802 microprocessor-based controller, which serves to collect and format data, to present data to a scan converter, and to operate a device communication bus. A NOVA minicomputer is used to record and recall frame images and to perform refined image processing. The RCA 1802 also provides instrument mode control for the MASCOT. Commands are issued using STOIC, a FORTH-like language. Sufficient flexibility has been provided so that a variety of CCDs can be accommodated.

  8. Embedded FIR filter design for real-time refocusing using a standard plenoptic video camera

    NASA Astrophysics Data System (ADS)

    Hahne, Christopher; Aggoun, Amar

    2014-03-01

    A novel and low-cost embedded hardware architecture for real-time refocusing based on a standard plenoptic camera is presented in this study. The proposed design synthesizes refocusing slices directly from micro images, omitting the commonly used sub-aperture extraction step. To this end, intellectual property cores containing switch-controlled Finite Impulse Response (FIR) filters were developed and applied to the Field Programmable Gate Array (FPGA) XC6SLX45 from Xilinx. To make the hardware design economical, the FIR filters combine stored products with upsampling and interpolation techniques in order to achieve an ideal trade-off between image resolution, delay time, power consumption and the number of logic gates required. The video output is transmitted via High-Definition Multimedia Interface (HDMI) with a resolution of 720p at a frame rate of 60 fps, conforming to the HD ready standard. Examples of the synthesized refocusing slices are presented.
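
    The upsample-then-FIR-interpolate idea mentioned above can be sketched in software with a 2x zero-stuffing stage followed by a short triangular (linear-interpolation) kernel. The factor and kernel are illustrative, not the paper's coefficients.

```python
# 2x upsampling by zero-stuffing, then a direct-form FIR interpolator.

def upsample2(x):
    # Insert a zero after every sample.
    out = []
    for v in x:
        out.extend([v, 0.0])
    return out

def fir(x, taps):
    # Direct-form FIR: y[n] = sum_k taps[k] * x[n-k].
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, t in enumerate(taps):
            if n - k >= 0:
                acc += t * x[n - k]
        y.append(acc)
    return y

def interpolate2(x):
    # The triangular kernel fills the zero-stuffed gaps with midpoints.
    return fir(upsample2(x), [0.5, 1.0, 0.5])
```

    In hardware, each multiply-accumulate of the FIR maps to a stored product, which is why the filter structure dominates the logic-gate budget discussed in the abstract.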

  9. Plant iodine-131 uptake in relation to root concentration as measured in minirhizotron by video camera:

    SciTech Connect

    Moss, K.J.

    1990-09-01

    Glass viewing tubes (minirhizotrons) were placed in the soil beneath native perennial bunchgrass (Agropyron spicatum). The tubes provided access for observing and quantifying plant roots with a miniature video camera and soil moisture estimates by neutron hydroprobe. The radiotracer I-131 was delivered to the root zone at three depths with differing root concentrations. The plant was subsequently sampled and analyzed for I-131. Plant uptake was greater when I-131 was applied at soil depths with higher root concentrations. When I-131 was applied at soil depths with lower root concentrations, plant uptake was less. However, the relationship between root concentration and plant uptake was not a direct one. When I-131 was delivered to deeper soil depths with low root concentrations, the quantity of roots there appeared to be less effective in uptake than the same quantity of roots at shallow soil depths with high root concentration. 29 refs., 6 figs., 11 tabs.

  10. ESTABLISHING SPECIES-HABITAT ASSOCIATIONS FOR 4 ETELINE SNAPPERS USING A BAITED STEREO-VIDEO CAMERA SYSTEM

    E-print Network

    Qiu, Bo

    ESTABLISHING SPECIES-HABITAT ASSOCIATIONS FOR 4 ETELINE SNAPPERS USING A BAITED STEREO-VIDEO CAMERA ... on deepwater bottomfish makes it difficult to define their essential fish habitat (EFH), an integral concept ... to quantitatively define the habitat associations of four of these species (Pristipomoides filamentosus, P. ...

  11. Dual-modality imaging in vivo with an NIR and gamma emitter using an intensified CCD camera and a conventional gamma camera

    NASA Astrophysics Data System (ADS)

    Houston, Jessica P.; Ke, Shi; Wang, Wei; Li, Chun; Sevick-Muraca, Eva M.

    2005-04-01

    Fluorescence-enhanced optical imaging measurements and conventional gamma camera images of human M21 melanoma xenografts were acquired for a "dual-modality" molecular imaging study. The αvβ3 integrin cell-surface receptors were imaged using a cyclic pentapeptide probe, cyclo(lys-Arg-Gly-Asp-phe) [c(KRGDf)], which is known to target the membrane receptor. The probe, dual-labeled with the radiotracer indium-111 (111In) for gamma scintigraphy as well as with a near-infrared dye, IRDye800, was injected into six nude mice at a dose equivalent to 90 mCi of 111In and 5 nanomoles of near-infrared (NIR) dye. A 15 min gamma scan and an 800 ms NIR-sensitive ICCD optical photograph were collected 24 hours after injection of the dual-labeled probe. The image quality of the nuclear and optical data was compared, with the results showing similar target-to-background ratios (TBR) for the fluorescence and gamma emissions at the targeted tumor site. Furthermore, an analysis of SNR versus contrast showed greater sensitivity of optical over nuclear imaging for the subcutaneous tumor targets measured by surface regions of interest.

  12. Hand contour detection in wearable camera video using an adaptive histogram region of interest

    PubMed Central

    2013-01-01

    Background Monitoring hand function at home is needed to better evaluate the effectiveness of rehabilitation interventions. Our objective is to develop wearable computer vision systems for hand function monitoring. The specific aim of this study is to develop an algorithm that can identify hand contours in video from a wearable camera that records the user’s point of view, without the need for markers. Methods The two-step image processing approach for each frame consists of: (1) Detecting a hand in the image, and choosing one seed point that lies within the hand. This step is based on a priori models of skin colour. (2) Identifying the contour of the region containing the seed point. This is accomplished by adaptively determining, for each frame, the region within a colour histogram that corresponds to hand colours, and backprojecting the image using the reduced histogram. Results In four test videos relevant to activities of daily living, the hand detector classification accuracy was 88.3%. The contour detection results were compared to manually traced contours in 97 test frames, and the median F-score was 0.86. Conclusion This algorithm will form the basis for a wearable computer-vision system that can monitor and log the interactions of the hand with its environment. PMID:24354542
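
    Step (2) of the method above, histogram backprojection, can be sketched in plain Python over a toy "hue image": build a hue histogram from seed pixels, then score every pixel by the probability of its hue bin. A real implementation would use OpenCV's calcHist/calcBackProject over full HSV frames; bin count and hue range here are illustrative.

```python
# Toy histogram backprojection over single-channel hue values (0-179).

def hue_histogram(pixels, bins=8, max_hue=180):
    # Normalized histogram of the seed-region hues (the colour model).
    hist = [0] * bins
    for h in pixels:
        hist[h * bins // max_hue] += 1
    total = len(pixels)
    return [c / total for c in hist]

def backproject(image, hist, bins=8, max_hue=180):
    # Each pixel gets the model probability of its hue bin; hand-coloured
    # pixels score high, background pixels score low.
    return [[hist[h * bins // max_hue] for h in row] for row in image]
```

    Thresholding the backprojected map and tracing the largest connected region around the seed point then yields the hand contour, as in the paper's pipeline.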

  13. A versatile digital video engine for safeguards and security applications

    SciTech Connect

    Hale, W.R.; Johnson, C.S.; DeKeyser, P.

    1996-08-01

    The capture and storage of video images have been major engineering challenges for safeguards and security applications ever since the video camera provided a method to observe remote operations. The problems of designing reliable video cameras were solved in the early 1980s with the introduction of the CCD (charge-coupled device) camera. The first CCD cameras cost thousands of dollars but have now been replaced by cameras costing in the hundreds. The remaining problem of storing and viewing video images in attended and unattended video surveillance systems and remote monitoring systems is being solved by sophisticated digital compression systems. One such system is the PC-104 three-card set, which is literally a "video engine" that can power video storage systems. The use of digital images in surveillance systems makes it possible to develop remote monitoring systems, portable video surveillance units, image review stations, and authenticated camera modules. This paper discusses the video card set and how it can be used in many applications.

  14. CCD TV focal plane guider development and comparison to SIRTF applications

    NASA Technical Reports Server (NTRS)

    Rank, David M.

    1989-01-01

    It is expected that the SIRTF payload will use a CCD TV focal plane fine guidance sensor to provide acquisition of sources and tracking stability of the telescope. CCD TV cameras and guiders have been developed at Lick Observatory for several years, producing state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory to characterize present CCD autoguiding technology and relate it to SIRTF applications are presented. Two different designs of CCD camera were constructed, using virtual-phase and buried-channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.

  15. A cooled CCD camera-based protocol provides an effective solution for in vitro monitoring of luciferase.

    PubMed

    Afshari, Amirali; Uhde-Stone, Claudia; Lu, Biao

    2015-03-13

    The luciferase assay has become an increasingly important technique to monitor a wide range of biological processes. However, the mainstay protocols require a luminometer to acquire and process the data, limiting their application to specialized research labs. To overcome this limitation, we have developed an alternative protocol that utilizes a commonly available cooled charge-coupled device (CCCD) camera, instead of a luminometer, for data acquisition and processing. By measuring activities of different luciferases, we characterized their substrate specificity, assay linearity, signal-to-noise levels, and fold-changes via CCCD. Next, we defined the assay parameters that are critical for appropriate use of the CCCD with different luciferases. To demonstrate its usefulness in cultured mammalian cells, we conducted a case study examining NF-κB gene activation in response to inflammatory signals in human embryonic kidney cells (HEK293). We found that data collected by the CCCD camera were equivalent to those acquired by luminometer, thus validating the assay protocol. In comparison, the CCCD-based protocol is readily amenable to live-cell and high-throughput applications, offering fast simultaneous data acquisition and visual and quantitative data presentation. In conclusion, the CCCD-based protocol provides a useful alternative for monitoring luciferase reporters. The wide availability of CCCD cameras will enable more researchers to use luciferases to monitor and quantify biological processes. PMID:25677617

  16. A unified and efficient framework for court-net sports video analysis using 3D camera modeling

    NASA Astrophysics Data System (ADS)

    Han, Jungong; de With, Peter H. N.

    2007-01-01

    The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable to more than one sports type to come to a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes, which are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e. players). The algorithm can track up to four players simultaneously. The complete system contributes to summarization by various forms of information, of which the most important are the moving trajectory and real speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it for a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and event detection about 90%.

  17. The Automatically Triggered Video or Imaging Station (ATVIS): An Inexpensive Way to Catch Geomorphic Events on Camera

    NASA Astrophysics Data System (ADS)

    Wickert, A. D.

    2010-12-01

    To understand how single events can affect landscape change, we must catch the landscape in the act. Direct observations are rare and often dangerous. While video is a good alternative, commercially-available video systems for field installation cost ~$11,000, weigh ~100 pounds (45 kg), and shoot 640x480 pixel video at 4 frames per second. This is the same resolution as a cheap point-and-shoot camera, with a frame rate that is nearly an order of magnitude worse. To overcome these limitations of resolution, cost, and portability, I designed and built a new observation station. This system, called ATVIS (Automatically Triggered Video or Imaging Station), costs $450-$500 and weighs about 15 pounds. It can take roughly 3 hours of 1280x720 pixel video, 6.5 hours of 640x480 video, or 98,000 1600x1200 pixel photos (one photo every 7 seconds for 8 days). The design calls for a simple Canon point-and-shoot camera fitted with custom firmware that allows 5V pulses through its USB cable to trigger it to take a picture or to start or stop video recording. These pulses are provided by a programmable microcontroller that can take input from either sensors or a data logger. The design is easily adaptable to a variety of camera and sensor types, and can also be used for continuous time-lapse imagery. We currently have prototypes set up at a gully near West Bijou Creek on the Colorado high plains and at tributaries to Marble Canyon in northern Arizona. Hopefully, a relatively inexpensive and portable system such as this will allow geomorphologists to supplement sensor networks with photo or video monitoring and allow them to see, and better quantify, the fantastic array of processes that modify landscapes as they unfold. Camera station set up at Badger Canyon, Arizona. Inset: view into box. Clockwise from bottom right: camera, microcontroller (blue), DC converter (red), solar charge controller, 12V battery.
Materials and installation assistance courtesy of Ron Griffiths and the USGS Grand Canyon Monitoring and Research Center.

  18. The design and realization of a three-dimensional video system by means of a CCD array

    NASA Astrophysics Data System (ADS)

    Boizard, J. L.

    1985-12-01

    Design features, principles, and initial tests of a prototype three-dimensional robot vision system based on a laser source and a CCD detector array are described. The use of a laser as a coherent illumination source permits the determination of relief using a single emitter, since the location of the source is known, with low distortion. The CCD detector array furnishes an acceptable signal-to-noise ratio and, when wired to an appropriate signal processing system, furnishes real-time data on the return signals, i.e., the characteristic points of an object being scanned. Signal processing involves integration of 29 kB of data per 100 samples, with sampling occurring at a rate of 5 MHz (the CCDs) and yielding an image every 12 msec. Algorithms for filtering errors from the data stream are discussed.

  19. Optimal camera exposure for video surveillance systems by predictive control of shutter speed, aperture, and gain

    NASA Astrophysics Data System (ADS)

    Torres, Juan; Menéndez, José Manuel

    2015-02-01

    This paper establishes a real-time auto-exposure method to guarantee that surveillance cameras in uncontrolled light conditions take advantage of their whole dynamic range while providing neither under- nor overexposed images. State-of-the-art auto-exposure methods base their control on the brightness of the image measured in a limited region where the foreground objects are mostly located. Unlike these methods, the proposed algorithm establishes a set of indicators based on the image histogram that define its shape and position. Furthermore, the location of the objects to be inspected is usually unknown in surveillance applications, so the whole image is monitored in this approach. To control the camera settings, we defined a parameters function (Ef) that depends linearly on the shutter speed and the electronic gain, and is inversely proportional to the square of the lens aperture diameter. When the current acquired image is not overexposed, our algorithm computes the value of Ef that would move the histogram to the maximum value that does not overexpose the capture. When the current acquired image is overexposed, it computes the value of Ef that would move the histogram to a value that does not underexpose the capture and remains close to the overexposed region. If the image is both under- and overexposed, the whole dynamic range of the camera is already in use, and a default value of Ef that does not overexpose the capture is selected. This decision follows the idea that underexposed images are preferable to overexposed ones, because the noise produced in the lower regions of the histogram can be removed in a post-processing step, while the saturated pixels of the higher regions cannot be recovered. The proposed algorithm was tested on a video surveillance camera placed at an outdoor parking lot surrounded by buildings and trees which produce moving shadows on the ground.
    During the daytime of seven days, the algorithm ran alternately with a representative auto-exposure algorithm from the recent literature. Besides the sunrises and the nightfalls, multiple weather conditions produced light changes in the scene: sunny hours with sharp shadows and highlights; cloud cover that softened the shadows; and cloudy and rainy hours that dimmed the scene. Several indicators were used to measure the performance of the algorithms, providing objective quality measures of the time the algorithms take to recover from under- or overexposure, the brightness stability, and the deviation from optimal exposure. The results demonstrated that our algorithm reacts faster to all the light changes than the selected state-of-the-art algorithm. It is also capable of acquiring well-exposed images and keeping the brightness stable for longer. Summing up, we concluded that the proposed algorithm provides a fast and stable auto-exposure method that maintains an optimal exposure for video surveillance applications. Future work will involve the evaluation of this algorithm in robotics.
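
    The dependence of Ef on the three camera settings can be sketched as follows. The abstract only states that Ef is linear in shutter speed and gain and inversely proportional to the squared aperture diameter; the product form below is an assumption, as is the helper that inverts it for the control loop.

```python
# Sketch of the exposure parameters function Ef described above.
# The exact combination of terms is assumed (simple product); the paper
# defines its own functional form.

def exposure_function(shutter_s, gain, aperture_d_mm):
    # Linear in integration time and gain, inverse-square in aperture
    # diameter.
    return (shutter_s * gain) / (aperture_d_mm ** 2)

def shutter_for_target(ef_target, gain, aperture_d_mm):
    # Hypothetical helper: solve for the shutter time that achieves a
    # desired Ef at fixed gain and aperture.
    return ef_target * aperture_d_mm ** 2 / gain
```

    The control loop in the paper works the other way around: it chooses the Ef value that repositions the histogram, then distributes that Ef over shutter, gain, and aperture.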

  20. Peering Into Virtual Space--Camera Shot Selection in the Video Conference Class.

    ERIC Educational Resources Information Center

    Dolhon, James P.

    1998-01-01

    Focuses on three essential information stations integral to the electronic classroom: the instructor camera (information station #1), the student camera (information station #2), and the copy-stand camera (information station #3). For each, the basic issues, such as camera location, instructional function, learning mode, information quality, and…

  1. Nyquist sampling theorem: understanding the illusion of a spinning wheel captured with a video camera

    NASA Astrophysics Data System (ADS)

    Lévesque, Luc

    2014-11-01

    Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the sampling time is chosen judiciously, then it is possible to accurately determine the frequency of a signal varying periodically with time. This paper is of educational value as it presents the principles of sampling during data acquisition. The concept of the Nyquist sampling theorem is usually introduced very briefly in the literature, with few practical examples to convey its importance during data acquisition. Through a series of carefully chosen examples, we present data sampling from the elementary conceptual idea and lead the reader naturally to the Nyquist sampling theorem, so as to understand more clearly why an undersampled signal can be interpreted incorrectly during a data acquisition procedure.
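
    The folding of an undersampled frequency into the Nyquist band, the effect behind the backwards-spinning wheel illusion, can be computed directly. This small helper is an illustration, not code from the paper:

```python
def apparent_frequency(f_true, f_sample):
    """Frequency observed after sampling at f_sample: the true frequency
    aliases (folds) into the band [-f_sample/2, +f_sample/2]."""
    k = round(f_true / f_sample)
    return f_true - k * f_sample

# A wheel spinning at 29 rev/s filmed at 30 frames/s appears to turn
# backwards at 1 rev/s, while 10 rev/s (above the Nyquist limit of
# 15 rev/s it is not) is seen correctly.
```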

  2. Application of video-cameras for quality control and sampling optimisation of hydrological and erosion measurements in a catchment

    NASA Astrophysics Data System (ADS)

    Lora-Millán, Julio S.; Taguas, Encarnacion V.; Gomez, Jose A.; Perez, Rafael

    2014-05-01

    Long-term soil erosion studies imply substantial effort, particularly when continuous measurements must be maintained. High costs are associated with maintaining field equipment and with quality control of data collection. Energy supply and/or electronic failures, vandalism, and burglary are common causes of gaps in datasets, in many cases reducing their usefulness. In this work, a system of three video cameras, a recorder, and a transmission modem (3G technology) has been set up at a gauging station where rainfall, runoff flow, and sediment concentration are monitored. The gauging station is located at the outlet of an olive orchard catchment of 6.4 ha. Rainfall is measured with an automatic raingauge that records intensity at one-minute intervals. The discharge is measured by a critical-depth flume, where the water level is recorded by an ultrasonic sensor. When the water level rises to a predetermined level, the automatic sampler turns on and fills a bottle at different intervals according to a program that depends on the antecedent precipitation. A data logger controls the instruments' functions and records the data. The purpose of the video camera system is to improve the quality of the dataset by i) visual analysis of the flow conditions in the flume, and ii) optimisation of the sampling programs. The cameras are positioned to record the flow at the approach section and the throat of the flume. To check the values of the ultrasonic sensor, a third camera records the flow level against a measuring tape. This system is activated when the ultrasonic sensor detects a height threshold, equivalent to an electric intensity level; thus, the video cameras record the event only when there is enough flow. This simplifies post-processing and reduces the cost of downloading recordings. The preliminary comparative analysis will be presented, as well as the main improvements in the sampling program.

  3. High speed cooled CCD experiments

    SciTech Connect

    Pena, C.R.; Albright, K.L.; Yates, G.J.

    1998-12-31

    Experiments were conducted using cooled and intensified CCD cameras. Two different cameras were identically tested using different optical test stimulus variables. Camera gain and dynamic range were measured by varying microchannel plate (MCP) voltages and controlling light flux using neutral density (ND) filters to yield analog digitized units (ADU), which are digitized values of the CCD pixel's analog charge. A xenon strobe (5 µs FWHM, blue light, 430 nm) and a doubled Nd:YAG laser (10 ns FWHM, green light, 532 nm) were both used as pulsed illumination sources for the cameras. Images were captured on a desktop PC system using commercial software. Camera gain and integration time values were adjusted using camera software. Mean values of camera volts versus input flux were also obtained by performing line scans through regions of interest. Experiments and results will be discussed.

  4. A Numerical Analysis of a Frame Calibration Method for Video-based All-Sky Camera Systems

    NASA Astrophysics Data System (ADS)

    Bannister, Steven M.; Boucheron, Laura E.; Voelz, David G.

    2013-09-01

    The field of meteor monitoring has grown considerably over the past 20 years with the development of affordable, automated video camera systems. We describe a method for calibrating video all-sky cameras in terms of local zenith and azimuth angles. The method involves the observation of known training points (stars) and is based on an approach developed by Ceplecha & Borovička. We use a simplified equation set, incorporate a quadratic expression for modeling the lens response, and utilize a nonlinear solver to obtain the calibration parameters. Simulation results with synthetic star data are presented to examine the effect of a limited number of training points, training point location, and initial parameter values on the calibration. Assumed simulation parameters are consistent with expectations for cameras in the NMSU SkySentinel network. Our modified calibration approach is shown to be stable over a broad range of calibration parameters with typical azimuth and zenith residual errors of less than 1°. Example calibration results for three camera nodes in the SkySentinel network are presented.
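
    A much-simplified sketch of the fitting step: the paper solves a coupled nonlinear system for azimuth and zenith, but a purely radial quadratic lens model (an assumption made here for illustration) can already be fit with linear least squares on synthetic star data:

```python
import numpy as np

def fit_lens_response(r, z):
    """Least-squares fit of a quadratic lens response z = a*r + b*r**2,
    where z is the zenith angle (deg) of a training star and r its
    radial pixel distance from the image centre."""
    A = np.column_stack([r, r ** 2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs  # (a, b)

# Synthetic training points with a known response, as in the simulations
r = np.linspace(10, 400, 50)
z = 0.12 * r + 1e-5 * r ** 2
a, b = fit_lens_response(r, z)   # recovers a = 0.12, b = 1e-5
```

In practice the image centre and the azimuth offset are unknown too, which is what forces the nonlinear solver in the actual method.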

  5. The MMT all-sky camera

    NASA Astrophysics Data System (ADS)

    Pickering, T. E.

    2006-06-01

    The MMT all-sky camera is a low-cost, wide-angle camera system that takes images of the sky every 10 seconds, day and night. It is based on an Adirondack Video Astronomy StellaCam II video camera and utilizes an auto-iris fish-eye lens to allow safe operation under all lighting conditions, even direct sunlight. This combined with the anti-blooming characteristics of the StellaCam's detector allows useful images to be obtained during sunny days as well as brightly moonlit nights. Under dark skies the system can detect stars as faint as 6th magnitude as well as very thin cirrus and low surface brightness zodiacal features such as gegenschein. The total hardware cost of the system was less than $3500 including computer and framegrabber card, a fraction of the cost of comparable systems utilizing traditional CCD cameras.

  6. HDR ¹⁹²Ir source speed measurements using a high speed video camera

    SciTech Connect

    Fonseca, Gabriel P.; Rubo, Rodrigo A.; Sales, Camila P. de; Verhaegen, Frank

    2015-01-15

    Purpose: The dose delivered with a HDR ¹⁹²Ir afterloader can be separated into a dwell component, and a transit component resulting from the source movement. The transit component is directly dependent on the source speed profile and it is the goal of this study to measure accurate source speed profiles. Methods: A high speed video camera was used to record the movement of a ¹⁹²Ir source (Nucletron, an Elekta company, Stockholm, Sweden) for interdwell distances of 0.25–5 cm with dwell times of 0.1, 1, and 2 s. Transit dose distributions were calculated using a Monte Carlo code simulating the source movement. Results: The source stops at each dwell position oscillating around the desired position for a duration up to (0.026 ± 0.005) s. The source speed profile shows variations between 0 and 81 cm/s with an average speed of ≈33 cm/s for most of the interdwell distances. The source stops for up to (0.005 ± 0.001) s at nonprogrammed positions in between two programmed dwell positions. The dwell time correction applied by the manufacturer compensates for the transit dose between the dwell positions, leading to a maximum overdose of 41 mGy for the considered cases and assuming an air-kerma strength of 48 000 U. The transit dose component is not uniformly distributed, leading to over- and underdoses, which is within 1.4% for commonly prescribed doses (3–10 Gy). Conclusions: The source maintains its speed even for the short interdwell distances. Dose variations due to the transit dose component are much lower than the prescribed treatment doses for brachytherapy, although the transit dose component should be evaluated individually for clinical cases.
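
    The speed profile itself reduces to finite differences of tracked source positions over frame intervals. A hedged sketch; the positions and frame rate below are made-up values, not the study's data:

```python
def speed_profile(positions_cm, fps):
    """Finite-difference speed (cm/s) of a tracked source centroid
    between consecutive frames of a high speed video."""
    dt = 1.0 / fps
    return [(b - a) / dt for a, b in zip(positions_cm, positions_cm[1:])]

# Made-up positions (cm) of the source in four consecutive frames at 1000 fps;
# the last interval shows zero speed, i.e. the source dwelling.
speeds = speed_profile([0.0, 0.05, 0.10, 0.10], fps=1000)
```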

  7. Lights, Camera, Action: Advancing Learning, Research, and Program Evaluation through Video Production in Educational Leadership Preparation

    ERIC Educational Resources Information Center

    Friend, Jennifer; Militello, Matthew

    2015-01-01

    This article analyzes specific uses of digital video production in the field of educational leadership preparation, advancing a three-part framework that includes the use of video in (a) teaching and learning, (b) research methods, and (c) program evaluation and service to the profession. The first category within the framework examines videos

  8. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, Bojan T. (Moraga, CA); Yates, George J. (Santa Fe, NM)

    1992-01-01

    An electronic method for eliminating artifacts in a video camera (10) employing a charge coupled device (CCD) (12) as an image sensor. The method comprises the step of initializing the camera (10) prior to normal read out and includes a first dump cycle period (76) for transferring radiation generated charge into the horizontal register (28) while the decaying image on the phosphor (39) being imaged is being integrated in the photosites, and a second dump cycle period (78), occurring after the phosphor (39) image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers (32). Image charge is then transferred from the photosites (36) and (38) to the vertical registers (32) and read out in conventional fashion. The inventive method allows the video camera (10) to be used in environments having high ionizing radiation content, and to capture images of events of very short duration and occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers (28) and (32), and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites (36) and (37).

  9. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, B.T.; Yates, G.J.

    1992-06-09

    An electronic method for eliminating artifacts in a video camera employing a charge coupled device (CCD) as an image sensor is disclosed. The method comprises the step of initializing the camera prior to normal read out and includes a first dump cycle period for transferring radiation generated charge into the horizontal register while the decaying image on the phosphor being imaged is being integrated in the photosites, and a second dump cycle period, occurring after the phosphor image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers. Image charge is then transferred from the photosites to the vertical registers and read out in conventional fashion. The inventive method allows the video camera to be used in environments having high ionizing radiation content, and to capture images of events of very short duration and occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers, and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites. 3 figs.

  10. SU-C-18A-02: Image-Based Camera Tracking: Towards Registration of Endoscopic Video to CT

    SciTech Connect

    Ingram, S; Rao, A; Wendt, R; Castillo, R; Court, L; Yang, J; Beadle, B

    2014-06-01

    Purpose: Endoscopic examinations are routinely performed on head and neck and esophageal cancer patients. However, these images are underutilized for radiation therapy because there is currently no way to register them to a CT of the patient. The purpose of this work is to develop a method to track the motion of an endoscope within a structure using images from standard clinical equipment. This method will be incorporated into a broader endoscopy/CT registration framework. Methods: We developed a software algorithm to track the motion of an endoscope within an arbitrary structure. We computed frame-to-frame rotation and translation of the camera by tracking surface points across the video sequence and utilizing two-camera epipolar geometry. The resulting 3D camera path was used to recover the surrounding structure via triangulation methods. We tested this algorithm on a rigid cylindrical phantom with a pattern spray-painted on the inside. We did not constrain the motion of the endoscope while recording, and we did not constrain our measurements using the known structure of the phantom. Results: Our software algorithm can successfully track the general motion of the endoscope as it moves through the phantom. However, our preliminary data do not show a high degree of accuracy in the triangulation of 3D point locations. More rigorous data will be presented at the annual meeting. Conclusion: Image-based camera tracking is a promising method for endoscopy/CT image registration, and it requires only standard clinical equipment. It is one of two major components needed to achieve endoscopy/CT registration, the second of which is tying the camera path to absolute patient geometry. In addition to this second component, future work will focus on validating our camera tracking algorithm in the presence of clinical imaging features such as patient motion, erratic camera motion, and dynamic scene illumination.
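
    The structure-recovery step rests on standard two-view triangulation. A generic linear (DLT) sketch, not the authors' implementation; the camera matrices and point below are a constructed toy example:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover a 3D point from its
    projections x1, x2 in two cameras with 3x4 projection matrices."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Two unit-focal cameras, the second translated 1 unit along x,
# viewing a point at depth 5 on the first camera's optical axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X = triangulate(P1, P2, (0.0, 0.0), (-0.2, 0.0))   # recovers [0, 0, 5]
```

In the endoscopy setting the relative pose itself must first be estimated from tracked surface points via the essential matrix, which is the harder part of the pipeline.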

  11. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only the raw images are available. However, the equation provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; for room temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation and a test of this method.
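
    A simulated illustration of a dark-corrected photon-transfer gain measurement: the dark-frame mean is subtracted from the flat-field signal, and the dark-frame pair variance from the flat-field pair variance, before forming the gain estimate. The true gain, electron counts, and frame size are arbitrary simulation values, not figures from the article:

```python
import numpy as np

rng = np.random.default_rng(42)
K = 2.0                              # true gain (electrons per ADU), ground truth
n_photo, n_dark = 20000.0, 5000.0    # mean photo- and dark-electrons per pixel
shape = (512, 512)

def frame(mean_e):
    # Poisson shot noise in electrons, converted to ADU by the gain
    return rng.poisson(mean_e, shape) / K

f1, f2 = frame(n_photo + n_dark), frame(n_photo + n_dark)  # flat fields
d1, d2 = frame(n_dark), frame(n_dark)                      # dark frames

# Differencing frame pairs cancels fixed-pattern noise; subtracting the
# dark statistics removes the dark-current contribution from both terms.
signal = (f1.mean() + f2.mean()) / 2 - (d1.mean() + d2.mean()) / 2
variance = (f1 - f2).var() / 2 - (d1 - d2).var() / 2
gain_est = signal / variance   # close to the true K = 2.0 e-/ADU
```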

  12. In-situ measurements of alloy oxidation/corrosion/erosion using a video camera and proximity sensor with microcomputer control

    NASA Technical Reports Server (NTRS)

    Deadmore, D. L.

    1984-01-01

    Two noncontacting and nondestructive, remotely controlled methods of measuring the progress of oxidation/corrosion/erosion of metal alloys, exposed to flame test conditions, are described. The external diameter of a sample under test in a flame was measured by a video camera width measurement system. An eddy current proximity probe system, for measurements outside of the flame, was also developed and tested. The two techniques were applied to the measurement of the oxidation of 304 stainless steel at 910 C using a Mach 0.3 flame. The eddy current probe system yielded a recession rate of 0.41 mils diameter loss per hour and the video system gave 0.27 mils per hour.

  13. 241-AZ-101 Waste Tank Color Video Camera System Shop Acceptance Test Report

    SciTech Connect

    WERRY, S.M.

    2000-03-23

    This report includes shop acceptance test results. The test was performed prior to installation at tank AZ-101. Both the camera system and camera purge system were originally sought and procured as a part of initial waste retrieval project W-151.

  14. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  15. Activity profiles and hook-tool use of New Caledonian crows recorded by bird-borne video cameras.

    PubMed

    Troscianko, Jolyon; Rutz, Christian

    2015-12-01

    New Caledonian crows are renowned for their unusually sophisticated tool behaviour. Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds. To obtain first estimates of activity budgets, as well as close-up observations of tool-assisted foraging, we equipped 19 wild crows with self-developed miniature video cameras, yielding more than 10 h of analysable video footage for 10 subjects. While only four crows used tools during recording sessions, they did so extensively: across all 10 birds, we conservatively estimate that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour. Our video-loggers provided first footage of crows manufacturing, and using, one of their most complex tool types (hooked stick tools) under completely natural foraging conditions. We recorded manufacture from live branches of paperbark (Melaleuca sp.) and another tree species (thought to be Acacia spirorbis), and deployment of tools in a range of contexts, including on the forest floor. Taken together, our video recordings reveal an 'expanded' foraging niche for hooked stick tools, and highlight more generally how crows routinely switch between tool- and bill-assisted foraging. PMID:26701755

  16. Lights, Camera, Action! Learning about Management with Student-Produced Video Assignments

    ERIC Educational Resources Information Center

    Schultz, Patrick L.; Quinn, Andrew S.

    2014-01-01

    In this article, we present a proposal for fostering learning in the management classroom through the use of student-produced video assignments. We describe the potential for video technology to create active learning environments focused on problem solving, authentic and direct experiences, and interaction and collaboration to promote student…

  17. The Ortega Telescope Andor CCD

    NASA Astrophysics Data System (ADS)

    Tucker, M.; Batcheldor, D.

    2012-07-01

    We present a preliminary instrument report for an Andor iKon-L 936 charge-coupled device (CCD) being operated at Florida Tech's 0.8 m Ortega Telescope. This camera will replace the current Finger Lakes Instrumentation (FLI) Proline CCD. Details of the custom mount produced for this camera are presented, as is a quantitative and qualitative comparison of the new and old cameras. We find that the Andor camera has 50 times less noise than the FLI, has no significant dark current over 30 seconds, and has a smooth, regular flat field. The Andor camera will provide significantly better sensitivity for direct imaging programs and, once it can be satisfactorily tested on-sky, will become the standard imaging device on the Ortega Telescope.

  18. Videos

    Cancer.gov

    Video: Louis Staudt, M.D., Ph.D., Director of the National Cancer Institute's Center for Cancer Genomics, discusses the future of genomics research.

  19. Video imaging system and thermal mapping of the molten hearth in an electron beam melting furnace

    SciTech Connect

    Miszkiel, M.E.; Davis, R.A.; Van Den Avyle, J.A.

    1995-12-31

    This project was initiated to develop an enhanced video imaging system for the Liquid Metal Processing Laboratory Electron Beam (EB) Melting Furnace at Sandia and to use color video images to map the temperature distribution of the surface of the molten hearth. In a series of test melts, the color output of the video image was calibrated against temperatures measured by an optical pyrometer and CCD camera viewing port above the molten pool. To prevent potential metal vapor deposition onto line-of-sight optical surfaces above the pool, argon backfill was used along with a pinhole aperture to obtain the video image. The geometry of the optical port to the hearth set the limits for the focus lens and CCD camera's field of view. Initial melts were completed with the pyrometer and pinhole aperture port in a fixed position. Using commercially available vacuum components, a second flange assembly was constructed to provide flexibility in choosing pyrometer target sights on the hearth and to adjust the field of view for the focus lens/CCD combination. RGB video images processed from the melts verified that red-wavelength light captured with the video camera could be calibrated against the optical pyrometer target temperatures and used to generate temperature maps of the hearth surface. Two-color ratio thermal mapping using red and green video images, which has theoretical advantages, was less successful due to probable camera nonlinearities in the red and green image intensities.

  20. Lights, Camera: Learning! Findings from studies of video in formal and informal science education

    NASA Astrophysics Data System (ADS)

    Borland, J.

    2013-12-01

    As part of the panel, media researcher Jennifer Borland will highlight findings from a variety of studies of videos across the spectrum of formal to informal learning, including schools, museums, and in viewers' homes. In her presentation, Borland will assert that the viewing context matters a great deal, but there are some general take-aways that can be extrapolated to the use of educational video in a variety of settings. Borland has served as an evaluator on several video-related projects funded by NASA and the National Science Foundation, including: Data Visualization videos and Space Shows developed by the American Museum of Natural History, DragonflyTV, Earth: The Operators' Manual, The Music Instinct, and Time Team America.

  1. Reconstruction of the Pose of Uncalibrated Cameras via User-Generated Videos

    E-print Network

    Bennett, Stuart; Lasenby, Joan; Kokaram, Anil; Inguva, Sasi; Birkbeck, Neil

    2014-01-01

    might be indicative, experiments showed this not to be a discriminative metric. Instead we use a 'relative blurriness' measure, b_t, comparing blurriness between frames from one video sequence, taken from the video stabilization literature [16]... significant perspective distortion are not unnecessarily pruned. The second detail relates to the bin edges: in order not to bias the binning by using some particular bin-edge alignment, one should attempt the histogramming with all possible offsets...

  2. Lights, camera, action…critique? Submit videos to AGU communications workshop

    NASA Astrophysics Data System (ADS)

    Viñas, Maria-José

    2011-08-01

    What does it take to create a science video that engages the audience and draws thousands of views on YouTube? Those interested in finding out should submit their research-related videos to AGU's Fall Meeting science film analysis workshop, led by oceanographer turned documentary director Randy Olson. Olson, writer-director of two films (Flock of Dodos: The Evolution-Intelligent Design Circus and Sizzle: A Global Warming Comedy) and author of the book Don't Be Such a Scientist: Talking Substance in an Age of Style, will provide constructive criticism on 10 selected video submissions, followed by moderated discussion with the audience. To submit your science video (5 minutes or shorter), post it on YouTube and send the link to the workshop coordinator, Maria-José Viñas (mjvinas@agu.org), with the following subject line: Video submission for Olson workshop. AGU will be accepting submissions from researchers and media officers of scientific institutions until 6:00 P.M. eastern time on Friday, 4 November. Those whose videos are selected to be screened will be notified by Friday, 18 November. All are welcome to attend the workshop at the Fall Meeting.

  3. A new method to calculate the camera focusing area and player position on playfield in soccer video

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Huang, Qingming; Ye, Qixiang; Gao, Wen

    2005-07-01

    Sports video enrichment is attracting many researchers. People want to appreciate some highlight segments with cartoon. In order to automatically generate these cartoon videos, we have to estimate the players' and ball's 3D positions. In this paper, we propose an algorithm to cope with the former problem, i.e. to compute players' positions on the court. For images with sufficient corresponding points, the algorithm uses these points to calibrate the mapping between the image and the playfield plane (the homography). For images without enough corresponding points, we use global motion estimation (GME) and an already calibrated image to compute the images' homographies. Thus, the problem boils down to estimating global motion. To enhance the performance of global motion estimation, two strategies are exploited. The first is removing the moving objects based on adaptive GMM playfield detection, which eliminates the influence of non-still objects; the second is using LKT feature-point tracking to determine horizontal and vertical translation, which keeps the optimization process for GME from being trapped in a local minimum. Thus, if some images of a sequence can be calibrated directly from the intersection points of court lines, all images of the sequence can be calibrated through GME. When we know the homographies between image and playfield, we can compute the camera focusing area and players' positions in the real world. We have tested our algorithm on real video and the result is encouraging.
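
    Once a frame's homography is known, mapping a player's image position to court coordinates is a single projective transform. A toy sketch; the matrices and values are illustrative, not from the paper:

```python
import numpy as np

def image_to_field(H, px):
    """Map an image pixel to playfield coordinates with a 3x3 homography H
    (calibrated, e.g., from court-line intersection points)."""
    v = H @ np.array([px[0], px[1], 1.0])
    return v[:2] / v[2]   # de-homogenize

# For an uncalibrated frame, chain the calibrated homography with the
# estimated global motion G mapping that frame into the calibrated one:
#   H_frame = H_calibrated @ G
H = np.diag([0.1, 0.1, 1.0])              # toy calibration: 10 px per field unit
pos = image_to_field(H, (350.0, 220.0))   # field coordinates (35.0, 22.0)
```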

  4. Visual surveys can reveal rather different 'pictures' of fish densities: Comparison of trawl and video camera surveys in the Rockall Bank, NE Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    McIntyre, F. D.; Neat, F.; Collie, N.; Stewart, M.; Fernandes, P. G.

    2015-01-01

    Visual surveys allow non-invasive sampling of organisms in the marine environment, which is of particular importance in deep-sea habitats that are vulnerable to damage caused by destructive sampling devices such as bottom trawls. To enable visual surveying at depths greater than 200 m we used a deep towed video camera system to survey large areas around the Rockall Bank in the North East Atlantic. The area of seabed sampled was similar to that sampled by a bottom trawl, enabling samples from the towed video camera system to be compared with trawl sampling to quantitatively assess the numerical density of deep-water fish populations. The two survey methods provided different results for certain fish taxa and comparable results for others. Fish that exhibited a detectable avoidance behaviour to the towed video camera system, such as the Chimaeridae, resulted in mean density estimates that were significantly lower (121 fish/km2) than those determined by trawl sampling (839 fish/km2). On the other hand, skates and rays showed no reaction to the lights in the towed body of the camera system, and mean density estimates of these were an order of magnitude higher (64 fish/km2) than the trawl (5 fish/km2). This is probably because these fish can pass under the footrope of the trawl due to their flat body shape lying close to the seabed but are easily detected by the benign towed video camera system. For other species, such as Molva sp., estimates of mean density were comparable between the two survey methods (towed camera, 62 fish/km2; trawl, 73 fish/km2). The towed video camera system presented here can be used as an alternative benign method for providing indices of abundance for species such as ling in areas closed to trawling, or for those fish that are poorly monitored by trawl surveying in any area, such as the skates and rays.

  5. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE PAGESBeta

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; Yuan, Xin; Carin, Lawrence; Browning, Nigel D.

    2015-08-13

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
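
    The coded-aperture measurement model can be stated in a few lines: each sub-frame is modulated by its aperture code and the results are summed into one camera readout; the statistical CS inversion (not shown) then recovers the sub-frames from that single frame and the known codes. The sizes and random codes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
T, H, W = 8, 16, 16                      # 8 sub-frames coded into one readout
x = rng.random((T, H, W))                # fast dynamic scene (unknown in practice)
masks = rng.integers(0, 2, (T, H, W))    # per-sub-frame binary aperture codes

# Forward model: sub-frames are masked and integrated on the sensor
# during a single exposure, giving the one frame the camera reads out.
y = (masks * x).sum(axis=0)
```

Recovering the T sub-frames from y is an underdetermined inverse problem, which is why the sparsity priors of compressive sensing are needed.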

  7. Characterization of color texture: CIEL*a*b* calibration of CCD device

    NASA Astrophysics Data System (ADS)

    Laflaquiere, Philippe; Lafon, Dominique; Eterradossi, O.; Slangen, Pierre R.

    1998-09-01

    A lot of materials used in construction industry are materials showing strong color texture, which may give the product its commercial value. We plan to perform automated appearance sorting using a tri-CCD color video camera as a measurement tool. The aim of the present work is the refinement of a calibration process allowing this tool to deliver for each pixel the same information as a spectroradiometer (CIEL*a*b* coordinates). Analysis of the acquisition tool allows characterization of the card and camera behavior (linearity of RGB gains, spatially varying and fixed pattern noises). Color calibration is performed using measurements with a spectroradiometer.

  8. A LOW-COST VIDEO CAMERA FOR MACHINE VISION AND CONSUMER USE

    E-print Network

    Bradbeer, Robin Sarah

    Describes the development of a low-cost camera, initially for toy use, with a target manufactured cost below $20. The camera had to be easily mass-produced and small enough to fit into mobile toys, and the design had to allow easy manufacture of a single-chip version using design automation techniques.

  9. "Lights, Camera, Reflection": Using Peer Video to Promote Reflective Dialogue among Student Teachers

    ERIC Educational Resources Information Center

    Harford, Judith; MacRuairc, Gerry; McCartan, Dermot

    2010-01-01

    This paper examines the use of peer-videoing in the classroom as a means of promoting reflection among student teachers. Ten pre-service teachers participating in a teacher education programme in a university in the Republic of Ireland and ten pre-service teachers participating in a teacher education programme in a university in the North of…

  10. Quick Reference for Video Start a video call

    E-print Network

    Quick Reference for Video. Start a video call: 1. Pause on a contact's picture and click the camera button. Pause on the camera button and select an option: · Stop My Video: ends your video, but you can still see others' videos. · End Video: ends all the videos for you, but you'll still have audio. © 2012 Microsoft

  11. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Technical Reports Server (NTRS)

    Bolas, M. T.; Fisher, S. S.

    1990-01-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.

  12. A simple, inexpensive video camera setup for the study of avian nest activity

    USGS Publications Warehouse

    Sabine, J.B.; Meyers, J.M.; Schweitzer, S.H.

    2005-01-01

    Time-lapse video photography has become a valuable tool for collecting data on avian nest activity and depredation; however, commercially available systems are expensive (>USA $4000/unit). We designed an inexpensive system to identify causes of nest failure of American Oystercatchers (Haematopus palliatus) and assessed its utility at Cumberland Island National Seashore, Georgia. We successfully identified raccoon (Procyon lotor), bobcat (Lynx rufus), American Crow (Corvus brachyrhynchos), and ghost crab (Ocypode quadrata) predation on oystercatcher nests. Other detected causes of nest failure included tidal overwash, horse trampling, abandonment, and human destruction. System failure rates were comparable with commercially available units. Our system's efficacy and low cost (<$800) provided useful data for the management and conservation of the American Oystercatcher.

  13. Jellyfish Support High Energy Intake of Leatherback Sea Turtles (Dermochelys coriacea): Video Evidence from Animal-Borne Cameras

    PubMed Central

    Heaslip, Susan G.; Iverson, Sara J.; Bowen, W. Don; James, Michael C.

    2012-01-01

    The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08–3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83–100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ•d-1 but were as high as 167,797 kJ•d-1, corresponding to turtles consuming an average of 330 kg wet mass•d-1 (up to 840 kg•d-1) or approximately 261 (up to 664) jellyfish•d-1. Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass•d-1, equating to an average energy intake of 3–7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to southward migration. PMID:22438906

  14. Video photographic considerations for measuring the proximity of a probe aircraft with a smoke seeded trailing vortex

    NASA Technical Reports Server (NTRS)

    Childers, Brooks A.; Snow, Walter L.

    1990-01-01

    Considerations for acquiring and analyzing 30 Hz video frames from charge coupled device (CCD) cameras mounted in the wing tips of a Beech T-34 aircraft are described. Particular attention is given to the characterization and correction of optical distortions inherent in the data.

  15. Evaluation of a high dynamic range video camera with non-regular sensor

    NASA Astrophysics Data System (ADS)

    Schöberl, Michael; Keinert, Joachim; Ziegler, Matthias; Seiler, Jürgen; Niehaus, Marco; Schuller, Gerald; Kaup, André; Foessel, Siegfried

    2013-01-01

    Although there is steady progress in sensor technology, imaging with a high dynamic range (HDR) is still difficult for motion imaging with high image quality. This paper presents our new approach for video acquisition with high dynamic range. The principle is based on optical attenuation of some of the pixels of an existing image sensor. This well known method traditionally trades spatial resolution for an increase in dynamic range. In contrast to existing work, we use a non-regular pattern of optical ND filters for attenuation. This allows for an image reconstruction that is able to recover high resolution images. The reconstruction is based on the assumption that natural images can be represented nearly sparse in transform domains, which allows for recovery of scenes with high detail. The proposed combination of non-regular sampling and image reconstruction leads to a system with an increase in dynamic range without sacrificing spatial resolution. In this paper, a further evaluation is presented on the achievable image quality. In our prototype we found that crosstalk is present and significant. The discussion thus shows the limits of the proposed imaging system.

  16. Linear CCD attitude measurement system based on the identification of the auxiliary array CCD

    NASA Astrophysics Data System (ADS)

    Hu, Yinghui; Yuan, Feng; Li, Kai; Wang, Yan

    2015-10-01

    To address the problem of high-precision attitude measurement of flying targets over a large space and a large field of view, and after comparing existing measurement methods, we propose a multi-cooperative-target attitude measurement system in which two array CCDs assist in identification for three linear CCDs. This addresses the nonlinear system errors, the excessive number of calibration parameters, and the overly complicated constraints among camera positions of the existing nine-linear-CCD spectroscopic test system. Mathematical models of the binocular vision system and the three-linear-CCD test system are established. Three red LED point lights form a cooperative triangle; the coordinates of the three points are measured in advance by a coordinate measuring machine. Three blue LED point lights are added along the sides of the triangle as auxiliaries, so that the array CCDs can identify the three red LED points more easily, while each linear CCD camera is fitted with a red filter to block the blue LED points and reduce stray light. The array CCDs measure the spots, identifying the red LED points and calculating their spatial coordinates, while the linear CCDs measure the three red LED spots to solve the linear CCD test system, from which 27 solutions can be drawn. Using the array CCD coordinates to assist the linear CCDs achieves spot identification and solves the difficult problem of multi-target identification with linear CCDs. Exploiting the imaging characteristics of linear CCDs, a special cylindrical lens system for the linear CCDs was developed using a telecentric optical design, so that the position of the energy center of the spot changes little over the depth range in the direction perpendicular to the optical axis, ensuring high-precision image quality; the entire test system improves the speed and precision of spatial object attitude measurement.

  17. Micro-rheology Using Multi Speckle DWS with Video Camera. Application to Film Formation, Drying and Rheological Stability

    NASA Astrophysics Data System (ADS)

    Brunel, Laurent; Dihang, Hélène

    2008-07-01

    We present in this work two applications of microrheology: the monitoring of film formation and the rheological stability. Microrheology is based on the Diffusing Wave Spectroscopy (DWS) method [1] that relates the particle dynamics to the speckle field dynamics, and further to the visco-elastic moduli G′ and G″ with respect to frequency [2]. Our technology uses the Multi Speckle DWS (MS-DWS) set-up in backscattering with a video camera. For the film formation and drying application, we present a new algorithm called "Adaptive Speckle Imaging Interferometry" (ASII) that extracts a simple kinetics from the speckle field dynamics [3,4]. Different film-forming and drying systems have been investigated (water-based, solvent and solvent-free paints, inks, adhesives, varnishes, …) on various types of substrates and at different thicknesses (a few to hundreds of microns). For rheological stability we show that the robust measurement of speckle correlation using the inter-image distance [3] can bring useful information for industry on viscoelasticity variations over a wide range of frequencies without additional parameters.
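
    As an illustration of how an inter-image distance separates slow from fast speckle dynamics, here is a toy sketch; the exact metric in the cited reference [3] may be defined differently, so the normalization below is an assumption.

```python
import numpy as np

def inter_image_distance(i1, i2):
    """One plausible form of an 'inter image distance': mean absolute pixel
    difference between two speckle frames, normalised by the mean intensity
    (the definition in the cited work may differ)."""
    i1 = np.asarray(i1, float)
    i2 = np.asarray(i2, float)
    return np.abs(i1 - i2).mean() / (0.5 * (i1.mean() + i2.mean()))

rng = np.random.default_rng(2)
frozen = rng.random((64, 64))
# A static (gel-like) sample decorrelates slowly: successive frames nearly identical.
d_slow = inter_image_distance(frozen, frozen + 0.01 * rng.random((64, 64)))
# A liquid sample decorrelates fast: successive frames nearly independent.
d_fast = inter_image_distance(rng.random((64, 64)), rng.random((64, 64)))
assert d_slow < d_fast   # faster particle dynamics -> larger inter-image distance
```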

  18. Bird-Borne Video-Cameras Show That Seabird Movement Patterns Relate to Previously Unrevealed Proximate Environment, Not Prey

    PubMed Central

    Tremblay, Yann; Thiebault, Andréa; Mullers, Ralf; Pistorius, Pierre

    2014-01-01

    The study of ecological and behavioral processes has been revolutionized in the last two decades with the rapid development of biologging science. Recently, using image-capturing devices, some pilot studies demonstrated the potential of understanding marine vertebrate movement patterns in relation to their proximate, as opposed to remotely sensed, environmental contexts. Here, using miniaturized video cameras and GPS tracking recorders simultaneously, we show for the first time that information on the immediate visual surroundings of a foraging seabird, the Cape gannet, is fundamental in understanding the origins of its movement patterns. We found that movement patterns were related to specific stimuli which were mostly other predators such as gannets, dolphins or fishing boats. Contrary to a widely accepted idea, our data suggest that foraging seabirds are not directly looking for prey. Instead, they search for indicators of the presence of prey, the latter being targeted at the very last moment and at a very small scale. We demonstrate that movement patterns of foraging seabirds can be heavily driven by processes unobservable with conventional methodology. Except perhaps for large scale processes, local enhancement seems to be the only ruling mechanism; this has profound implications for ecosystem-based management of marine areas. PMID:24523892

  19. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
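
    The bivariate model described (upwelling energy regressed on Chl a and true color) can be sketched with synthetic data; the coefficients, units, and sample size below are invented, not the paper's.

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination for a fitted model."""
    ss_res = ((y - y_hat) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot

# Hypothetical ground-truth data: upwelling band energy modelled from
# chlorophyll a and true color (the two predictors named in the abstract).
rng = np.random.default_rng(3)
chl = rng.uniform(1, 40, 50)             # ug/L, illustrative
color = rng.uniform(5, 80, 50)           # Pt-Co units, illustrative
energy = 0.8 * chl - 0.3 * color + 12 + rng.normal(0, 1.0, 50)

X = np.column_stack([np.ones_like(chl), chl, color])  # intercept + 2 predictors
beta, *_ = np.linalg.lstsq(X, energy, rcond=None)     # bivariate linear fit
r2 = r_squared(energy, X @ beta)                      # explained variation
```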

  20. The in-flight spectroscopic performance of the Swift XRT CCD camera

    E-print Network

    Nishikawa, Ken-Ichi

    The Swift X-ray Telescope (XRT) focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 144 eV FWHM at 6.5 keV. We describe the CCD calibration program based on celestial and on-board calibration sources

  1. Improvement in the light sensitivity of the ultrahigh-speed high-sensitivity CCD with a microlens array

    NASA Astrophysics Data System (ADS)

    Hayashida, T.; Yonai, J.; Kitamura, K.; Arai, T.; Kurita, T.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Kitagawa, S.; Hatade, K.; Yamaguchi, T.; Takeuchi, H.; Iida, K.

    2008-02-01

    We are advancing the development of ultrahigh-speed, high-sensitivity CCDs for broadcast use that are capable of capturing smooth slow-motion videos in vivid colors even where lighting is limited, such as at professional baseball games played at night. We have already developed a 300,000 pixel, ultrahigh-speed CCD, and a single CCD color camera that has been used for sports broadcasts and science programs using this CCD. However, there are cases where even higher sensitivity is required, such as when using a telephoto lens during a baseball broadcast or a high-magnification microscope during science programs. This paper provides a summary of our experimental development aimed at further increasing the sensitivity of CCDs using the light-collecting effects of a microlens array.

  2. Magellan Instant Camera testbed

    E-print Network

    McEwen, Heather K. (Heather Kristine), 1982-

    2004-01-01

    The Magellan Instant Camera (MagIC) is an optical CCD camera that was built at MIT and is currently used at Las Campanas Observatory (LCO) in La Serena, Chile. It is designed to be both simple and efficient with minimal ...

  3. Testing fully depleted CCD

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated, with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE on the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles on the external edges. The origin of this pattern may be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs in exactly the same position, so our guess is that the pattern is due to electrical fields. Finally, and in just two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.

  4. Individual camera identification using correlation of fixed pattern noise in image sensors.

    PubMed

    Kurosawa, Kenji; Kuroki, Kenro; Akiba, Norimitsu

    2009-05-01

    This paper presents results of experiments related to individual video camera identification using a correlation coefficient of fixed pattern noise (FPN) in image sensors. Five color charge-coupled device (CCD) modules of the same brand were examined. Images were captured using a 12-bit monochrome video capture board and stored in a personal computer. For each module, 100 frames were captured. They were integrated to obtain FPN. The results show that a specific CCD module was distinguished among the five modules by analyzing the normalized correlation coefficient. The temporal change of the correlation coefficient during several days had only a negligible effect on identifying the modules. Furthermore, a positive relation was found between the correlation coefficient of the same modules and the number of frames that were used for image integration. Consequently, precise individual camera identification is enhanced by acquisition of as many frames as possible. PMID:19302379
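
    The identification procedure described (integrate many frames to estimate fixed pattern noise, then compare normalized correlation coefficients) can be sketched with synthetic sensors. The noise levels and frame counts below are invented for illustration; the paper's 12-bit capture pipeline is not modeled.

```python
import numpy as np

def fpn_estimate(frames):
    """Fixed pattern noise estimate: average many frames so temporal noise
    cancels, then remove the global mean."""
    m = np.mean(frames, axis=0)
    return m - m.mean()

def ncc(a, b):
    """Normalised correlation coefficient between two FPN maps."""
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

rng = np.random.default_rng(4)
fpn_cam1 = rng.normal(0, 2, (32, 32))      # each module has its own FPN
fpn_cam2 = rng.normal(0, 2, (32, 32))

def shoot(fpn, n=100):                     # n frames with temporal noise added
    return fpn + rng.normal(0, 8, (n, *fpn.shape))

ref1 = fpn_estimate(shoot(fpn_cam1))       # reference from camera 1
test1 = fpn_estimate(shoot(fpn_cam1))      # new footage, same camera
test2 = fpn_estimate(shoot(fpn_cam2))      # footage from a different camera
assert ncc(ref1, test1) > ncc(ref1, test2)  # same module correlates higher
```

    Integrating more frames reduces the residual temporal noise in each FPN estimate, which is consistent with the abstract's observation that identification improves with the number of frames acquired.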

  5. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  6. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses a ~0.5 m diameter focal plane of 62 2kx4k CCDs. The camera vessel including the optical window cell, focal plate, focal plate mounts, cooling system and thermal controls is described. As part of the development of the mechanical and cooling design, a full scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  7. Are traditional methods of determining nest predators and nest fates reliable? An experiment with Wood Thrushes (Hylocichla mustelina) using miniature video cameras

    USGS Publications Warehouse

    Williams, G.E.; Wood, P.B.

    2002-01-01

    We used miniature infrared video cameras to monitor Wood Thrush (Hylocichla mustelina) nests during 1998-2000. We documented nest predators and examined whether evidence at nests can be used to predict predator identities and nest fates. Fifty-six nests were monitored; 26 failed, with 3 abandoned and 23 depredated. We predicted predator class (avian, mammalian, snake) prior to review of video footage and were incorrect 57% of the time. Birds and mammals were underrepresented whereas snakes were over-represented in our predictions. We documented ≥9 nest-predator species, with the southern flying squirrel (Glaucomys volans) taking the most nests (n = 8). During 2000, we predicted fate (fledge or fail) of 27 nests; 23 were classified correctly. Traditional methods of monitoring nests appear to be effective for classifying success or failure of nests, but ineffective at classifying nest predators.

  8. 3D-Color Video Camera O. Rubinstein Y. Honen A. M. Bronstein M. M. Bronstein R. Kimmel

    E-print Network

    Kimmel, Ron

    Captures depth and color at video rates; reconstruction and display are performed at around 30 fps. Achieving high spatial resolution at video rates with a low-cost system remains challenging: due to design and manufacturing factors, spatial resolution has so far been limited to a few centimeters by other techniques.

  9. CCD Photometer Installed on the Telescope - 600 OF the Shamakhy Astrophysical Observatory II. The Technique of Observation and Data Processing of CCD Photometry

    NASA Astrophysics Data System (ADS)

    Abdullayev, B. I.; Gulmaliyev, N. I.; Majidova, S. O.; Mikayilov, Kh. M.; Rustamov, B. N.

    2009-12-01

    Basic technical characteristics of the CCD matrix U-47 made by Apogee Alta Instruments Inc. are provided. A short description and features of the various noises introduced by the optical system and CCD camera are presented. The technique of obtaining calibration frames (bias, dark, and flat field) and the main stages of processing CCD photometry results are described.
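
    The calibration frames listed (bias, dark, flat field) enter the standard CCD reduction; below is a minimal sketch with synthetic data (all counts, exposure times, and the vignetting profile are invented).

```python
import numpy as np

def calibrate(raw, bias, dark, flat, t_exp, t_dark):
    """Standard CCD reduction: subtract bias and exposure-scaled dark
    current, then divide by the normalised flat field."""
    dark_rate = (dark - bias) / t_dark           # dark current per second
    science = raw - bias - dark_rate * t_exp
    flat_corr = flat - bias
    flat_corr = flat_corr / flat_corr.mean()     # normalise flat to unity mean
    return science / flat_corr

# Synthetic example: a uniform 1000-count scene seen through 10% vignetting.
bias = np.full((16, 16), 100.0)
dark = bias + 5.0                                # 5 counts of dark in t_dark = 10 s
vignette = np.linspace(0.9, 1.0, 16)[None, :] * np.ones((16, 16))
flat = bias + 2000.0 * vignette                  # flat frame sees the same vignetting
raw = bias + 0.5 * 30.0 + 1000.0 * vignette      # science frame, t_exp = 30 s
out = calibrate(raw, bias, dark, flat, t_exp=30.0, t_dark=10.0)
assert np.allclose(out, 950.0)                   # uniform scene recovered (x mean vignette)
```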

  10. CCD Double Star Measures: Jack Jones Memorial Observatory Report #1

    NASA Astrophysics Data System (ADS)

    Jones, James

    2008-01-01

    This paper reports on 63 CCD measurements of 58 multiple star systems observed between 2003 and 2007. It also reports on delta mag(V) measurements of selected doubles. Measurements were made using a CCD camera and 8" or 11" SCT. A brief description of methods used is provided.

  11. CCD Double Star Measures: Jack Jones Observatory Report #2

    NASA Astrophysics Data System (ADS)

    Jones, James L.

    2009-10-01

    This paper submits 44 CCD measurements of 41 multiple star systems for inclusion in the WDS. Observations were made during the calendar year 2008. Measurements were made using a CCD camera and an 11" Schmidt-Cassegrain telescope. Brief discussions of pertinent observations are included.

  12. IR CCD staring imaging system

    NASA Astrophysics Data System (ADS)

    Zhou, Qibo

    1991-12-01

    An infrared staring imaging system in the 3 to 5 micrometers spectral band has been developed by SITP laboratories in China. The sensor utilized is a Pt-Si Schottky-barrier infrared CCD focal plane array. The digital video processing electronics is designed for 32 X 64, 64 X 64, and 128 X 128 pixel formats and contains the elimination of fixed pattern noise and the correction of response nonuniformity in real time and provides the high-quality IR image. The standard TV compatible and portable features are part of the design. In this paper, we describe the design and performance of this prototype system and present some experimental examples of IR imagery. The results demonstrate that the Pt-Si IR CCD imaging system has good performance, high reliability, low cost, and can be suitable for a variety of commercial applications.

  13. A video precipitation sensor for imaging and velocimetry of hydrometeors

    NASA Astrophysics Data System (ADS)

    Liu, X. C.; Gao, T. C.; Liu, L.

    2013-11-01

    A new method to determine the shape and fall velocity of hydrometeors by using a single CCD camera is proposed in this paper, and a prototype of a Video Precipitation Sensor (VPS) is developed. The instrument consists of an optical unit (collimated light source with multi-mode fiber cluster), an imaging unit (planar array CCD sensor), an acquisition and control unit, and a data processing unit; the cylindrical space between the optical unit and imaging unit is the sampling volume (300 mm × 40 mm × 30 mm). As the precipitation particles fall through the sampling volume, the CCD camera exposes twice in a single frame, by which a double-exposure image of the particles can be obtained. The size and shape can be obtained from the images of particles; the fall velocity can be calculated from the particle displacement in the double-exposure image and the interval time; the drop size distribution and velocity distribution, precipitation intensity, and accumulated precipitation amount can be calculated by time integration. The innovation of VPS is that the shape, size, and velocity of precipitation particles can be measured by only one planar array CCD sensor, which can address the disadvantages of linear scan CCD disdrometers and impact disdrometers. Field measurements of rainfall demonstrate the VPS's capability to measure micro-physical properties of single particles and integral parameters of precipitation.

  14. A video precipitation sensor for imaging and velocimetry of hydrometeors

    NASA Astrophysics Data System (ADS)

    Liu, X. C.; Gao, T. C.; Liu, L.

    2014-07-01

    A new method to determine the shape and fall velocity of hydrometeors by using a single CCD camera is proposed in this paper, and a prototype of a video precipitation sensor (VPS) is developed. The instrument consists of an optical unit (collimated light source with multi-mode fibre cluster), an imaging unit (planar array CCD sensor), an acquisition and control unit, and a data processing unit. The cylindrical space between the optical unit and imaging unit is the sampling volume (300 mm × 40 mm × 30 mm). As the precipitation particles fall through the sampling volume, the CCD camera exposes twice in a single frame, which allows a double-exposure image of the particles to be obtained. The size and shape can be obtained from the images of particles; the fall velocity can be calculated from the particle displacement in the double-exposure image and the interval time; the drop size distribution and velocity distribution, precipitation intensity, and accumulated precipitation amount can be calculated by time integration. The innovation of VPS is that the shape, size, and velocity of precipitation particles can be measured by only one planar array CCD sensor, which can address the disadvantages of a linear scan CCD disdrometer and an impact disdrometer. Field measurements of rainfall demonstrate the VPS's capability to measure micro-physical properties of single particles and integral parameters of precipitation.
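
    The velocity computation described (particle displacement in the double-exposure image divided by the exposure interval) can be sketched as follows; the pixel scale, interval time, and single-pixel "drops" are invented for illustration.

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid (row, col) of a particle image."""
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

def fall_velocity(first, second, dt, mm_per_pixel):
    """Velocity from the vertical displacement between the two exposures."""
    (r1, _), (r2, _) = centroid(first), centroid(second)
    return (r2 - r1) * mm_per_pixel / dt     # mm/s, downward positive

# Toy double exposure: a drop imaged at row 10, then at row 40, 2 ms later.
first = np.zeros((64, 64))
second = np.zeros((64, 64))
first[10, 30] = 1.0
second[40, 30] = 1.0
v = fall_velocity(first, second, dt=2e-3, mm_per_pixel=0.1)   # -> 1500 mm/s
```

    In the real instrument both exposures land in one frame, so the two particle images must first be segmented and paired before the displacement is measured.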

  15. An electronic pan/tilt/zoom camera system

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steve; Martin, H. L.

    1992-01-01

    A small camera system is described for remote viewing applications that employs fisheye optics and electronic processing for providing pan, tilt, zoom, and rotational movements. The fisheye lens is designed to give a complete hemispherical FOV with significant peripheral distortion that is corrected with high-speed electronic circuitry. Flexible control of the viewing requirements is provided by a programmable transformation processor so that pan/tilt/rotation/zoom functions can be accomplished without mechanical movements. Images are presented that were taken with a prototype system using a CCD camera, and 5 frames/sec can be acquired from a 180-deg FOV. The image-transformation device can provide multiple images with different magnifications and pan/tilt/rotation sequences at frame rates compatible with conventional video devices. The system is of interest for object tracking, surveillance, and viewing in constrained environments that would otherwise require the use of several cameras.
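
    A hypothetical sketch of such mechanics-free pan/tilt: for each pixel of a virtual perspective view, cast a ray, rotate it by the requested pan/tilt, and look it up in the fisheye image. An equidistant fisheye projection is assumed here; the abstract does not specify the actual lens model or the transformation processor's algorithm.

```python
import numpy as np

def dewarp(fisheye, out_shape, pan, tilt, fov, radius):
    """Electronic pan/tilt: sample a virtual perspective view out of an
    equidistant fisheye image (angles in radians, nearest-neighbour lookup)."""
    h, w = out_shape
    f = (w / 2) / np.tan(fov / 2)              # virtual pinhole focal length
    u, v = np.meshgrid(np.arange(w) - w / 2, np.arange(h) - h / 2)
    rays = np.stack([u, v, np.full_like(u, f, dtype=float)], axis=-1)
    rays = rays / np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the viewing direction: pan about the y axis, tilt about the x axis.
    cp, sp, ct, st = np.cos(pan), np.sin(pan), np.cos(tilt), np.sin(tilt)
    R = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]) @ \
        np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    rays = rays @ R.T

    theta = np.arccos(np.clip(rays[..., 2], -1, 1))   # angle off optical axis
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    r = radius * theta / (np.pi / 2)           # equidistant: r proportional to theta
    cy, cx = fisheye.shape[0] / 2, fisheye.shape[1] / 2
    ys = np.clip((cy + r * np.sin(phi)).astype(int), 0, fisheye.shape[0] - 1)
    xs = np.clip((cx + r * np.cos(phi)).astype(int), 0, fisheye.shape[1] - 1)
    return fisheye[ys, xs]

img = np.arange(256 * 256, dtype=float).reshape(256, 256)   # stand-in fisheye frame
view = dewarp(img, (64, 64), pan=0.3, tilt=-0.1, fov=np.deg2rad(60), radius=128)
```

    Because the mapping is pure pixel arithmetic, several views with different pan/tilt/zoom settings can be produced from one captured frame, which is the core of the paper's multi-view claim.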

  16. Identification of Prey Captures in Australian Fur Seals (Arctocephalus pusillus doriferus) Using Head-Mounted Accelerometers: Field Validation with Animal-Borne Video Cameras

    PubMed Central

    Volpov, Beth L.; Hoskins, Andrew J.; Battaile, Brian C.; Viviant, Morgane; Wheatley, Kathryn E.; Marshall, Greg; Abernathy, Kyler; Arnould, John P. Y.

    2015-01-01

    This study investigated prey captures in free-ranging adult female Australian fur seals (Arctocephalus pusillus doriferus) using head-mounted 3-axis accelerometers and animal-borne video cameras. Acceleration data were used to identify individual attempted prey captures (APC), and video data were used to independently verify APC and prey types. Results demonstrated that head-mounted accelerometers could detect individual APC but were unable to distinguish among prey types (fish, cephalopod, stingray) or between successful captures and unsuccessful capture attempts. Mean detection rate (true positive rate) on individual animals in the testing subset ranged from 67-100%, and mean detection on the testing subset averaged across 4 animals ranged from 82-97%. Mean false positive (FP) rate ranged from 15-67% individually in the testing subset, and 26-59% averaged across 4 animals. Surge and sway had significantly greater detection rates, but also greater FP rates, compared to heave. Video data also indicated that some head movements recorded by the accelerometers were unrelated to APC and that a peak in acceleration variance did not always equate to an individual prey item. The results of the present study indicate that head-mounted accelerometers provide a complementary tool for investigating foraging behaviour in pinnipeds, but that detection and FP correction factors need to be applied for reliable field application. PMID:26107647
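
    The detection and false-positive rates reported can be computed from boolean event vectors as sketched below; the event data and the exact FP-rate denominator (here, all detections) are assumptions for illustration.

```python
import numpy as np

def detection_metrics(detected, truth):
    """True-positive rate (video-verified captures found) and false-positive
    rate (detections with no verified capture, as a fraction of detections)."""
    detected = np.asarray(detected, bool)
    truth = np.asarray(truth, bool)
    tpr = (detected & truth).sum() / truth.sum()
    fpr = (detected & ~truth).sum() / detected.sum()
    return tpr, fpr

# Toy example: 10 candidate acceleration peaks, 6 captures verified on video.
truth    = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0], bool)
detected = np.array([1, 1, 1, 1, 1, 0, 1, 1, 0, 0], bool)
tpr, fpr = detection_metrics(detected, truth)   # 5/6 found, 2/7 false alarms
```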

  17. Lines identification in the emission spectrum and orbital elements of a sporadic video meteor

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Zamorano, J.; Ocaña, F.; Izquierdo, J.; Sanchez de Miguel, A.; Trigo-Rodriguez, J. M.; Toscano, F. M.

    2011-10-01

    Since 2006 the SPanish Meteor Network (SPMN) has employed high-sensitivity CCD video cameras to monitor meteor and fireball activity over the Iberian Peninsula and neighboring areas. These allow us to obtain the trajectory and orbit for multi-station events and, when combined with holographic diffraction gratings, also provide information about the chemical composition of the corresponding meteoroids. In this context, we analyze here the emission spectrum, trajectory and orbital parameters of a sporadic bolide imaged in 2010.

  18. Real-time integral imaging system with handheld light field camera

    NASA Astrophysics Data System (ADS)

    Jeong, Youngmo; Kim, Jonghyun; Yeom, Jiwoon; Lee, Byoungho

    2014-11-01

    Our objective is to construct real-time pickup and display in an integral imaging system with a handheld light field camera. A micro lens array and a high frame rate charge-coupled device (CCD) are used to implement the handheld light field camera, and a simple lens array and a liquid crystal (LC) display panel are used to reconstruct three-dimensional (3D) images in real time. The handheld light field camera is implemented by adding the micro lens array on the CCD sensor. The main lens, which is mounted on the CCD sensor, is used to capture the scene. To make the elemental image in real time, a pixel mapping algorithm is applied. With this algorithm, not only can the pseudoscopic problem be solved, but the user can also change the depth plane of the displayed 3D images in real time. For real-time, high-quality 3D video generation, a high resolution and high frame rate CCD and LC display panel are used in the proposed system. Experiment and simulation results are presented to verify the proposed system. As a result, 3D images are captured and reconstructed in real time through the integral imaging system.
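    One common form of the pixel mapping step (sketched here as an assumption, not the paper's specific algorithm) removes the pseudoscopic, depth-inverted artifact by rotating every elemental image by 180 degrees. The tiling parameters are illustrative:

```python
import numpy as np

def fix_pseudoscopic(elemental, ny, nx):
    """Rotate each of the ny x nx elemental images in an (H, W) array by
    180 degrees so the reconstructed scene is orthoscopic rather than
    depth-inverted."""
    h, w = elemental.shape
    eh, ew = h // ny, w // nx
    out = np.empty_like(elemental)
    for j in range(ny):
        for i in range(nx):
            tile = elemental[j*eh:(j+1)*eh, i*ew:(i+1)*ew]
            out[j*eh:(j+1)*eh, i*ew:(i+1)*ew] = tile[::-1, ::-1]
    return out
```

    The mapping is an involution: applying it twice restores the original elemental image set, which makes it cheap enough for per-frame, real-time use.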

  19. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and single video camera between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants age 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited. PMID:25230893
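    The ICC values quoted above are intra-class correlations; assuming the common two-way random, single-measure form, ICC(2,1) can be computed from an n-subjects-by-k-raters score matrix with the standard ANOVA decomposition (this is the textbook formula, not code from the study):

```python
import numpy as np

def icc_2_1(Y):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure, for an (n subjects x k raters) score matrix."""
    n, k = Y.shape
    grand = Y.mean()
    row_m = Y.mean(axis=1)                      # per-subject means
    col_m = Y.mean(axis=0)                      # per-rater means
    msr = k * ((row_m - grand) ** 2).sum() / (n - 1)   # subjects MS
    msc = n * ((col_m - grand) ** 2).sum() / (k - 1)   # raters MS
    mse = (((Y - row_m[:, None] - col_m[None, :] + grand) ** 2).sum()
           / ((n - 1) * (k - 1)))                      # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    Because ICC(2,1) measures absolute agreement, a constant offset between raters lowers the coefficient even when their rankings agree perfectly.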

  20. Study of design and control of remote manipulators. Part 4: Experiments in video camera positioning with regard to remote manipulation

    NASA Technical Reports Server (NTRS)

    Mackro, J.

    1973-01-01

    The results are presented of a study involving closed-circuit television as the means of providing the task-to-operator feedback necessary for efficient performance of the remote manipulation system. Experiments were performed to determine the remote video configuration that would result in the best overall system. Two categories of tests were conducted: those involving remote position (rate) control of the video system alone, and those in which closed-circuit TV was used along with manipulation of the objects themselves.

  1. AN ACTIVE CAMERA SYSTEM FOR ACQUIRING MULTI-VIEW VIDEO Robert T. Collins, Omead Amidi, and Takeo Kanade

    E-print Network

    effects in the movie The Matrix, where playing back frames from a single time step, across all cameras ... recognition, 3D reconstruction, entertainment and sports, it is often desirable to capture a set ... However, in surveillance or sports applications it is not possible to predict beforehand the precise

  2. A geometric comparison of video camera-captured raster data to vector-parented raster data generated by the X-Y digitizing table

    NASA Technical Reports Server (NTRS)

    Swalm, C.; Pelletier, R.; Rickman, D.; Gilmore, K.

    1989-01-01

    The relative accuracy of a georeferenced raster data set captured by the Megavision 1024XM system using the Videk Megaplus CCD cameras is compared to a georeferenced raster data set generated from vector lines manually digitized through the ELAS software package on a Summagraphics X-Y digitizer table. The study also investigates the amount of time necessary to fully complete the rasterization of the two data sets, evaluating individual areas such as time necessary to generate raw data, time necessary to edit raw data, time necessary to georeference raw data, and accuracy of georeferencing against a norm. Preliminary results exhibit a high level of agreement between areas of the vector-parented data and areas of the captured file data where sufficient control points were chosen. Maps of 1:20,000 scale were digitized into raster files of 5 meter resolution per pixel, and overall RMS error was estimated at less than eight meters. Such approaches offer time- and labor-saving advantages as well as increasing the efficiency of project scheduling and enabling the digitization of new types of data.
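    The sub-eight-meter figure is a planimetric RMS error against a set of control points; the computation itself is a one-liner (the coordinates below are made up for illustration):

```python
import numpy as np

def rms_error(points, reference):
    """Planimetric RMS error: root of the mean squared point-to-point
    distance between a captured data set and the reference norm."""
    d = np.asarray(points, float) - np.asarray(reference, float)
    return float(np.sqrt((d ** 2).sum(axis=1).mean()))
```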

  3. Computer-assisted skull identification system using video superimposition.

    PubMed

    Yoshino, M; Matsuda, H; Kubota, S; Imaizumi, K; Miyasaka, S; Seta, S

    1997-12-01

    This system consists of two main units, namely a video superimposition system and a computer-assisted skull identification system. The video superimposition system is comprised of the following five parts: a skull-positioning box having a monochrome CCD camera, a photo-stand having a color CCD camera, a video image mixing device, a TV monitor and a videotape recorder. The computer-assisted skull identification system is composed of a host computer including our original application software, a film recorder and a color printer. After the determination of the orientation and size of the skull to those of the facial photograph using the video superimposition system, the skull and facial photograph images are digitized and stored within the computer, and then both digitized images are superimposed on the monitor. For the assessment of anatomical consistency between the digitized skull and face, the distance between the landmarks and the thickness of soft tissue of the anthropometrical points are semi-automatically measured on the monitor. The wipe images facilitate the comparison of positional relationships between the digitized skull and face. The software includes the polynomial functions and Fourier harmonic analysis for evaluating the match of the outline such as the forehead and mandibular line in both the digitized images. PMID:9493339

  4. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

    This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components, including microchannel-plate or proximity-focused diode image intensifiers, electrostatic image tubes, and electron-bombarded CCDs, affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  5. Emission spectrum and orbital elements of a sporadic video meteor with a cometary origin

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Ortiz, J. L.; Castro-Tirado, A. J.; Cabrera, J.; Trigo-Rodriguez, J. M.; Toscano, F. M.

    2012-09-01

    The SPanish Meteor Network (SPMN) monitors meteor and fireball activity over the Iberian Peninsula and neighboring areas by using, among other systems, high-sensitivity CCD video cameras. In this way, we can obtain the atmospheric trajectory and orbit in the Solar System for multi-station events. Besides, holographic diffraction gratings attached to some of our cameras also provide information about the chemical composition of the corresponding meteoroids. In this context, we analyze here the emission spectrum, trajectory and orbital parameters of a sporadic bolide imaged in 2011.

  6. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in High-Frame-Rate CCD Camera Having Subwindow Capability (NPO-30564) NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor-transistor-logic (TTL)-level signals from a field programmable gate array (FPGA) controller card.
These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable assembly. The FPGA controller card is connected to the host computer via a standard peripheral component interface (PCI).
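    The frame-rate benefit of per-ROI readout can be sketched with a toy timing model: frame time scales with the total ROI pixel count rather than the full CCD area, plus a small setup cost per window. The pixel rate and per-ROI overhead below are invented numbers, not figures from the article:

```python
def frame_time_us(rois, pixel_rate_hz=10e6, overhead_us_per_roi=5.0):
    """Toy readout-time model for a random-access-windowing CCD: total
    ROI pixels over the pixel rate, plus a fixed per-ROI setup overhead.
    `rois` is a list of (width, height) tuples in pixels."""
    pixels = sum(w * h for (w, h) in rois)
    return pixels / pixel_rate_hz * 1e6 + overhead_us_per_roi * len(rois)
```

    Under this model a single 100 x 100 window reads out roughly two orders of magnitude faster than a full 1024 x 1024 frame, which is the mechanism behind the higher frame rates claimed for ROI tracking.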

  7. A CCD offset guider for the KAO

    NASA Technical Reports Server (NTRS)

    Colgan, Sean W. J.; Erickson, Edwin F.; Haynes, Fredric B.; Rank, David M.

    1995-01-01

    We describe a focal plane guider for the Kuiper Airborne Observatory which consists of a CCD camera interfaced to an AMIGA personal computer. The camera is made by Photometrics Ltd. and utilizes a Thomson 576 x 384 pixel CCD chip operated in Frame Transfer mode. Custom optics produce a scale of 2.4 arc-sec/pixel, yielding an approximately 12 arc-min diameter field of view. Chopped images of stars with HST Guide Star Catalog magnitudes fainter than 14 have been used for guiding at readout rates greater than or equal to 0.5 Hz. The software includes automatic map generation, subframing and zooming, and correction for field rotation when two stars are in the field of view.

  8. Lights, Camera…Citizen Science: Assessing the Effectiveness of Smartphone-Based Video Training in Invasive Plant Identification

    PubMed Central

    Starr, Jared; Schweik, Charles M.; Bush, Nathan; Fletcher, Lena; Finn, Jack; Fish, Jennifer; Bargeron, Charles T.

    2014-01-01

    The rapid growth and increasing popularity of smartphone technology is putting sophisticated data-collection tools in the hands of more and more citizens. This has exciting implications for the expanding field of citizen science. With smartphone-based applications (apps), it is now increasingly practical to remotely acquire high quality citizen-submitted data at a fraction of the cost of a traditional study. Yet, one impediment to citizen science projects is the question of how to train participants. The traditional “in-person” training model, while effective, can be cost prohibitive as the spatial scale of a project increases. To explore possible solutions, we analyze three training models: 1) in-person, 2) app-based video, and 3) app-based text/images in the context of invasive plant identification in Massachusetts. Encouragingly, we find that participants who received video training were as successful at invasive plant identification as those trained in-person, while those receiving just text/images were less successful. This finding has implications for a variety of citizen science projects that need alternative methods to effectively train participants when in-person training is impractical. PMID:25372597

  9. Dual-Sampler Processor Digitizes CCD Output

    NASA Technical Reports Server (NTRS)

    Salomon, P. M.

    1986-01-01

    Circuit for processing output of charge-coupled device (CCD) imager provides increased time for analog-to-digital conversion, thereby reducing bandwidth required for video processing. Instead of one sample-and-hold circuit of conventional processor, improved processor includes two sample-and-hold circuits alternated with each other. Dual-sampler processor operates with lower bandwidth and with timing requirements less stringent than those of single-sampler processor.

  10. Video Object Tracking and Analysis for Computer Assisted Surgery

    E-print Network

    Pallath, Nobert Thomas

    2012-01-01

    Pedicle screw insertion technique has revolutionized the surgical treatment of spinal fractures and spinal disorders. Although X-ray fluoroscopy based navigation is popular, there is risk of prolonged exposure to X-ray radiation. Systems that have lower radiation risk are generally quite expensive. The position and orientation of the drill is clinically very important in pedicle screw fixation. In this paper, the position and orientation of the marker on the drill are determined using pattern recognition based methods, using geometric features obtained from the input video sequence taken from a CCD camera. A search is then performed on the video frames after preprocessing, to obtain the exact position and orientation of the drill. Animated graphics showing the instantaneous position and orientation of the drill are then overlaid on the processed video for real-time drill control and navigation.
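    A brute-force normalized cross-correlation search is one standard way to locate a marker template in a video frame, offered here as a generic illustration of marker search, not the paper's geometric-feature method:

```python
import numpy as np

def match_template(frame, tmpl):
    """Return the (row, col) of the top-left corner where `tmpl` best
    matches `frame` under normalized cross-correlation (NCC)."""
    fh, fw = frame.shape
    th, tw = tmpl.shape
    t = tmpl - tmpl.mean()
    tn = np.sqrt((t ** 2).sum())
    best, pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            win = frame[r:r+th, c:c+tw]
            w = win - win.mean()
            denom = np.sqrt((w ** 2).sum()) * tn
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, pos = score, (r, c)
    return pos
```

    NCC is invariant to uniform brightness and contrast changes in the window, which matters under the variable lighting of an operating theatre; production code would use an FFT-based or pyramid search rather than this O(N^2 M^2) loop.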

  11. The future scientific CCD

    NASA Technical Reports Server (NTRS)

    Janesick, J. R.; Elliott, T.; Collins, S.; Marsh, H.; Blouke, M. M.

    1984-01-01

    Since the first introduction of charge-coupled devices (CCDs) in 1970, CCDs have been considered for applications related to memories, logic circuits, and the detection of visible radiation. It is pointed out, however, that the mass market orientation of CCD development has left largely untapped the enormous potential of these devices for advanced scientific instrumentation. The present paper has, therefore, the objective to introduce the CCD characteristics to the scientific community, taking into account prospects for further improvement. Attention is given to evaluation criteria, a summary of current CCDs, CCD performance characteristics, absolute calibration tools, quantum efficiency, aspects of charge collection, charge transfer efficiency, read noise, and predictions regarding the characteristics of the next generation of silicon scientific CCD imagers.

  12. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  13. Application of a Two Camera Video Imaging System to Three-Dimensional Vortex Tracking in the 80- by 120-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.; Bennett, Mark S.

    1993-01-01

    A description is presented of two enhancements for a two-camera, video imaging system that increase the accuracy and efficiency of the system when applied to the determination of three-dimensional locations of points along a continuous line. These enhancements increase the utility of the system when extracting quantitative data from surface and off-body flow visualizations. The first enhancement utilizes epipolar geometry to resolve the stereo "correspondence" problem. This is the problem of determining, unambiguously, corresponding points in the stereo images of objects that do not have visible reference points. The second enhancement is a method to automatically identify and trace the core of a vortex in a digital image. This is accomplished by means of an adaptive template matching algorithm. The system was used to determine the trajectory of a vortex generated by the Leading-Edge eXtension (LEX) of a full-scale F/A-18 aircraft tested in the NASA Ames 80- by 120-Foot Wind Tunnel. The system accuracy for resolving the vortex trajectories is estimated to be ±2 inches over a distance of 60 feet. Stereo images of some of the vortex trajectories are presented. The system was also used to determine the point where the LEX vortex "bursts". The vortex burst point locations are compared with those measured in small-scale tests and in flight and found to be in good agreement.
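    The epipolar enhancement works because a point in one camera constrains its match in the other camera to a single line, collapsing the 2D correspondence search to 1D. A minimal sketch, using an illustrative fundamental matrix for a rectified, x-translated stereo pair rather than the wind-tunnel calibration:

```python
import numpy as np

def epipolar_line(F, pt):
    """Given fundamental matrix F and pixel (x, y) in camera 1, return the
    epipolar line (a, b, c) in camera 2; matches satisfy a*x' + b*y' + c = 0."""
    x = np.array([pt[0], pt[1], 1.0])
    return F @ x

def on_line(line, pt, tol=1e-6):
    """True if pixel `pt` lies on `line` to within `tol` (normalized)."""
    a, b, c = line
    return abs(a * pt[0] + b * pt[1] + c) < tol * np.hypot(a, b)
```

    For a featureless smoke-marked vortex core with no visible reference points, intersecting the epipolar line with the traced curve in the second image is what removes the ambiguity.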

  14. Megapixel imaging camera for expanded H⁻ beam measurements

    SciTech Connect

    Simmons, J.E.; Lillberg, J.W.; McKee, R.J.; Slice, R.W.; Torrez, J.H.; McCurnin, T.W.; Sanchez, P.G.

    1994-02-01

    A charge coupled device (CCD) imaging camera system has been developed as part of the Ground Test Accelerator project at the Los Alamos National Laboratory to measure the properties of a large diameter, neutral particle beam. The camera is designed to operate in the accelerator vacuum system for extended periods of time. It would normally be cooled to reduce dark current. The CCD contains 1024 × 1024 pixels with pixel size of 19 × 19 μm² and with four phase parallel clocking and two phase serial clocking. The serial clock rate is 2.5 × 10⁵ pixels per second. Clock sequence and timing are controlled by an external logic-word generator. The DC bias voltages are likewise located externally. The camera contains circuitry to generate the analog clocks for the CCD and also contains the output video signal amplifier. Reset switching noise is removed by an external signal processor that employs delay elements to provide noise suppression by the method of double-correlated sampling. The video signal is digitized to 12 bits in an analog to digital converter (ADC) module controlled by a central processor module. Both modules are located in a VME-type computer crate that communicates via ethernet with a separate workstation where overall control is exercised and image processing occurs. Under cooled conditions the camera shows good linearity with dynamic range of 2000 and with dark noise fluctuations of about ±1/2 ADC count. Full well capacity is about 5 × 10⁵ electron charges.
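    Double-correlated sampling cancels the reset (kTC) switching noise by differencing the reset-level and video-level samples of each pixel, since that noise term is common to both. A toy numerical simulation with assumed noise magnitudes (not measurements from this camera):

```python
import numpy as np

def cds(reset_samples, video_samples):
    """Double-correlated sampling: subtract the reset-level sample from
    the video-level sample of each pixel, cancelling the common reset
    noise while the uncorrelated read noise adds in quadrature."""
    return np.asarray(video_samples, float) - np.asarray(reset_samples, float)
```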

  15. The use of video for air pollution source monitoring

    SciTech Connect

    Ferreira, F.; Camara, A.

    1999-07-01

    The evaluation of air pollution impacts from single industrial emission sources is a complex environmental engineering problem. Recent developments in multimedia technologies used by personal computers improved the digitizing and processing of digital video sequences. This paper proposes a methodology where statistical analysis of both meteorological and air quality data combined with digital video images is used for monitoring air pollution sources. One of the objectives of this paper is to present the use of image processing algorithms in air pollution source monitoring. CCD amateur video cameras capture images that are further processed by computer. The use of video as a remote sensing system was implemented with the goal of determining particular parameters, either meteorological or related to air quality monitoring and modeling of point sources. These parameters include the remote calculation of wind direction, wind speed, stack gas outlet velocity, and effective stack emission height. The characteristics and behavior of a visible pollutant's plume are also studied. Different sequences of relatively simple image processing operations are applied to the images gathered by the different cameras to segment the plume. The algorithms are selected depending on the atmospheric and lighting conditions. The developed system was applied to a 1,000 MW fuel power plant located at Setubal, Portugal. The methodology presented shows that digital video can be an inexpensive way to obtain useful air pollution related data for monitoring and modeling purposes.

  16. Evaluation test of CCD area sensor for star sensor

    NASA Astrophysics Data System (ADS)

    Sakurai, Yoshio; Kimura, Takeo

    1987-04-01

    The star sensor permits the determination of the posture of an artificial satellite or space vehicle. Coupled with the recent progress of semiconductor technologies, the desire to develop the charge coupled device (CCD) star sensor has been expressed. This sensor is expected to be small-sized and lightweight, to consume little power, to provide high accuracy and to have a long life expectancy. In proceeding with the R and D on the CCD star sensor, it is essential to fully grasp the photoelectric characteristics of a CCD area sensor, which serves as the eye of the CCD star sensor. To increase their sensitivity, those sensors which are employed in the CCD star sensor require a longer accumulation time than in a TV camera as generally employed, and they require cooling as well. Such usage, however, is rather special. The photoelectric characteristics of a sensor measured under such working conditions as referred to above, therefore, have been very rarely reported. National Aerospace Laboratory (NAL) has been conducting the research related to the trial manufacture and tests of CCD star sensors since 1982, including a joint research with Toshiba Corporation. To obtain the basic data for designing and trial manufacturing a CCD star sensor to be borne on an artificial satellite, an evaluation test relating to the photoelectronic characteristics of a CCD area sensor was carried out.

  17. Application of PLZT electro-optical shutter to diaphragm of visible and mid-infrared cameras

    NASA Astrophysics Data System (ADS)

    Fukuyama, Yoshiyuki; Nishioka, Shunji; Chonan, Takao; Sugii, Masakatsu; Shirahata, Hiromichi

    1997-04-01

    Pb0.91La0.09(Zr0.65,Ti0.35)0.9775O3 (PLZT 9/65/35), commonly used as an electro-optical shutter, exhibits large phase retardation at low applied voltage. This shutter has the following features: (1) high shutter speed, (2) wide optical transmittance, and (3) high optical density in the 'OFF' state. If the shutter is applied as the diaphragm of a video camera, it could protect the sensor from intense light. We have tested the basic characteristics of the PLZT electro-optical shutter and its imaging resolution. The ratio of optical transmittance between the 'ON' and 'OFF' states was 1.1 × 10³. The response time of the PLZT shutter from the 'ON' state to the 'OFF' state was 10 μs. The MTF reduction when placing the PLZT shutter in front of the visible video-camera lens was only 12 percent at a spatial frequency of 38 cycles/mm, which is the sensor resolution of the video camera. Moreover, we took visible images with the Si-CCD video camera: the He-Ne laser ghost image was observed in the 'ON' state, whereas it was totally shut out in the 'OFF' state. From these tests, the PLZT shutter has been found useful as the diaphragm of a visible video camera. The measured optical transmittance of a PLZT wafer with no antireflection coating was 78 percent over the range from 2 to 6 microns.

  18. Cone penetrometer deployed in situ video microscope for characterizing sub-surface soil properties

    SciTech Connect

    Lieberman, S.H.; Knowles, D.S.; Kertesz, J.

    1997-12-31

    In this paper we report on the development and field testing of an in situ video microscope that has been integrated with a cone penetrometer probe in order to provide a real-time method for characterizing subsurface soil properties. The video microscope system consists of a miniature CCD color camera system coupled with appropriate magnification and focusing optics to provide a field of view with a coverage of approximately 20 mm. The camera/optic system is mounted in a cone penetrometer probe so that the camera views the soil that is in contact with a sapphire window mounted on the side of the probe. The soil outside the window is illuminated by diffuse light provided through the window by an optical fiber illumination system connected to a white light source at the surface. The video signal from the camera is returned to the surface where it can be displayed in real-time on a video monitor, recorded on a video cassette recorder (VCR), and/or captured digitally with a frame grabber installed in a microcomputer system. In its highest resolution configuration, the in situ camera system has demonstrated a capability to resolve particle sizes as small as 10 μm. By using other lens systems to increase the magnification factor, smaller particles could be resolved; however, the field of view would be reduced. Initial field tests have demonstrated the ability of the camera system to provide real-time qualitative characterization of soil particle sizes. In situ video images also reveal information on porosity of the soil matrix and the presence of water in the saturated zone. Current efforts are focused on the development of automated image processing techniques as a means of extracting quantitative information on soil particle size distributions. Data will be presented comparing results derived from digital images with conventional sieve/hydrometer analyses.

  19. Video-Level Monitor

    NASA Technical Reports Server (NTRS)

    Gregory, Ray W.

    1993-01-01

    Video-level monitor developed to provide full-scene monitoring of video and indicates level of brightest portion. Circuit designed nonspecific and can be inserted in any closed-circuit camera system utilizing RS170 or RS330 synchronization and standard CCTV video levels. System made of readily available, off-the-shelf components. Several units are in service.

  20. Readout electronics for the Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Castilla, Javier; Ballester, Otger; Cardiel, Laia; Chappa, Steve; de Vicente, Juan; Holm, Scott; Huffman, David; Kozlovsky, Mark; Martinez, Gustavo; Olsen, Jamieson; Shaw, Theresa; Stuermer, Walter

    2010-07-01

    The goal of the Dark Energy Survey (DES) is to measure the dark energy equation of state parameter with four complementary techniques: galaxy cluster counts, weak lensing, angular power spectrum and type Ia supernovae. DES will survey a 5000 sq. degree area of the sky in five filter bands using a new 3 deg² mosaic camera (DECam) mounted at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam is a ~520 megapixel optical CCD camera that consists of 62 2k x 4k science sensors plus 4 2k x 2k sensors for guiding. The CCDs, developed at the Lawrence Berkeley National Laboratory (LBNL) and packaged and tested at Fermilab, have been selected to obtain images efficiently at long wavelengths. A front-end electronics system has been developed specifically to perform the CCD readout. The system is based on Monsoon, an open source image acquisition system designed by the National Optical Astronomy Observatory (NOAO). The electronics consists mainly of three types of modules: Control, Acquisition and Clock boards. The system provides a total of 132 video channels, 396 bias levels and around 1000 clock channels in order to read out the full mosaic at 250 kpixel/s speed with 10 e⁻ noise performance. System configuration and data acquisition are done by means of six 0.8 Gbps optical links. The production of the whole system is currently underway. The contribution will focus on the testing, calibration and general performance of the full system in a realistic environment.
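    The quoted 250 kpixel/s channel rate fixes the mosaic readout time. Assuming two video channels per CCD (an inference from the 132 channels against 62 + 4 = 66 CCDs, not stated in the abstract) and parallel readout of all CCDs:

```python
def readout_time_s(cols=2048, rows=4096, channels_per_ccd=2, rate_hz=250e3):
    """Per-CCD readout time: the 2k x 4k pixel array split across the
    CCD's video channels, each digitized at rate_hz. With every CCD read
    out in parallel, this is also the full-mosaic readout time."""
    return cols * rows / channels_per_ccd / rate_hz
```

    This works out to roughly 17 seconds per exposure, which shows why the per-channel rate, rather than the ~520 Mpixel total, sets the survey cadence.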

  1. Upgrades to NDSF Vehicle Camera Systems and Development of a Prototype System for Migrating and Archiving Video Data in the National Deep Submergence Facility Archives at WHOI

    NASA Astrophysics Data System (ADS)

    Fornari, D.; Howland, J.; Lerner, S.; Gegg, S.; Walden, B.; Bowen, A.; Lamont, M.; Kelley, D.

    2003-12-01

    In recent years, considerable effort has been made to improve the visual recording capabilities of Alvin and ROV Jason. This has culminated in the routine use of digital cameras, both internal and external on these vehicles, which has greatly expanded the scientific recording capabilities of the NDSF. The UNOLS National Deep Submergence Facility (NDSF) archives maintained at Woods Hole Oceanographic Institution (WHOI) are the repository for the diverse suite of photographic still images (both 35mm and recently digital), video imagery, vehicle data and navigation, and near-bottom side-looking sonar data obtained by the facility vehicles. These data comprise a unique set of information from a wide range of seafloor environments over the more than 25 years of NDSF operations in support of science. Included in the holdings are Alvin data plus data from the tethered vehicles- ROV Jason, Argo II, and the DSL-120 side scan sonar. This information conservatively represents an outlay in facilities and science costs well in excess of $100 million. Several archive related improvement issues have become evident over the past few years. The most critical are: 1. migration and better access to the 35mm Alvin and Jason still images through digitization and proper cataloging with relevant meta-data, 2. assessing Alvin data logger data, migrating data on older media no longer in common use, and properly labeling and evaluating vehicle attitude and navigation data, 3. migrating older Alvin and Jason video data, especially data recorded on Hi-8 tape that is very susceptible to degradation on each replay, to newer digital format media such as DVD, 4. 
improving the capabilities of the NDSF archives to better serve the increasingly complex needs of the oceanographic community, including researchers involved in focused programs like Ridge2000 and MARGINS, where viable distributed databases in various disciplinary topics will form an important component of the data management structure. We report on an archiving effort to transfer video footage currently on Hi-8 and VHS tape to digital media (DVD). While this is being done, frame-grab imagery at reasonable resolution (640x480) will be compiled at 30-second intervals, and the images will be integrated, as much as possible, with vehicle attitude/navigation data and provided to the user community in a web-browser format, as has already been done for the recent Jason and Alvin frame-grabbed imagery. The frame-grabbed images will be tagged with time, thereby permitting integration of vehicle attitude and navigation data once they are available. To prototype this system, we plan to utilize data from the East Pacific Rise and Juan de Fuca Ridge, which are field areas selected by the community as Ridge2000 Integrated Study Sites. There are over 500 Alvin dives in these two areas, and having frame-grabbed, synoptic views of the terrains covered during those dives will be invaluable for scientific and outreach use as part of Ridge2000. We plan to coordinate this activity with the Ridge2000 Data Management Office at LDEO.
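    The time-tagged 30-second frame-grab schedule described above can be sketched as below; a minimal illustration assuming NTSC-rate (29.97 fps) source tape and a hypothetical dive start time (the function name and parameters are illustrative, not taken from the archive software).

```python
from datetime import datetime, timedelta

def frame_grab_schedule(dive_start_utc, duration_s, interval_s=30, fps=29.97):
    """Return (frame_index, utc_timestamp) pairs for grabs at fixed intervals.

    Frame indices assume a constant frame rate (NTSC ~29.97 fps assumed here);
    the UTC tag is what later allows merging with vehicle attitude/nav data.
    """
    grabs = []
    t = 0.0
    while t <= duration_s:
        grabs.append((int(round(t * fps)),
                      dive_start_utc + timedelta(seconds=t)))
        t += interval_s
    return grabs

# hypothetical 2-minute clip starting at an arbitrary dive time
sched = frame_grab_schedule(datetime(1991, 4, 1, 14, 0, 0), duration_s=120)
# grabs at t = 0, 30, 60, 90, 120 s -> 5 frames
```

    Each grabbed frame keeps its UTC tag, so attitude/navigation records can be joined later by timestamp even if they arrive after the frame grabbing is done.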

  2. Competitive and Mature CCD Imaging Systems for Planetary Raman Spectrometers

    NASA Astrophysics Data System (ADS)

    Ingley, R.; Hutchinson, I.; Harris, L. V.; McHugh, M.; Edwards, H. G. M.; Waltham, N. R.; Brown, P.; Pool, P.

    2014-06-01

    Progress on the design of a CCD-based imaging system is presented. The camera system, provided by the UK, uses space-qualified and mature technology and is included in the ExoMars RLS instrument, due for launch in 2018.

  3. CCD high-speed videography system with new concepts and techniques

    NASA Astrophysics Data System (ADS)

    Zheng, Zengrong; Zhao, Wenyi; Wu, Zhiqiang

    1997-05-01

    A novel CCD high-speed videography system based on brand-new concepts and techniques was recently developed by Zhejiang University. The system sends a series of short flash pulses to the moving object. All of the parameters, such as flash number, duration, interval, intensity and color, can be controlled by the computer as needed. A series of moving-object images frozen by the flash pulses, carrying information about the moving object, is recorded by a CCD video camera, and the resulting images are sent to a computer to be frozen, recognized and processed with special hardware and software. The obtained parameters can be displayed, output as remote-control signals or written to CD. The highest videography frequency is 30,000 images per second. The shortest image-freezing time is several microseconds. The system has been applied in a wide range of fields including energy, chemistry, medicine, biological engineering, aerodynamics, explosion, multi-phase flow, mechanics, vibration, athletic training, weapon development and national defense engineering. It can also be used on production lines for online, real-time monitoring and control.
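    The computer-controlled flash parameters described above can be represented as a simple pulse schedule; a sketch under assumed units and with illustrative names, not the Zhejiang system's actual interface.

```python
def flash_sequence(n_flashes, interval_us, duration_us, intensity=1.0):
    """Build (start_us, duration_us, intensity) tuples for a strobe sequence.

    At the quoted 30,000 images/s the flash interval is ~33.3 microseconds;
    durations of a few microseconds match the quoted image-freezing time.
    """
    if duration_us >= interval_us:
        raise ValueError("flash duration must be shorter than the interval")
    return [(i * interval_us, duration_us, intensity) for i in range(n_flashes)]

# four flashes at the maximum quoted rate, 5-microsecond freeze time
seq = flash_sequence(n_flashes=4, interval_us=33.3, duration_us=5.0)
```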

  4. Video-based beam position monitoring at CHESS

    NASA Astrophysics Data System (ADS)

    Revesz, Peter; Pauling, Alan; Krawczyk, Thomas; Kelly, Kevin J.

    2012-10-01

    CHESS has pioneered the development of X-ray Video Beam Position Monitors (VBPMs). Unlike traditional photoelectron beam position monitors, which rely on photoelectrons generated by the fringe edges of the X-ray beam, VBPMs collect information from the whole cross-section of the X-ray beam and can also give real-time shape/size information. We have developed three types of VBPMs: (1) VBPMs based on helium luminescence from the intense white X-ray beam; in this case the CCD camera views the luminescence from the side. (2) VBPMs based on luminescence of a thin (~50 micron) CVD diamond sheet as the white beam passes through it; the CCD camera is placed outside the beam line vacuum and views the diamond fluorescence through a viewport. (3) Scatter-based VBPMs; in this case the white X-ray beam passes through a thin graphite filter or Be window, and the scattered X-rays create an image of the beam's footprint on an X-ray-sensitive fluorescent screen using a slit placed outside the beam line vacuum. For all VBPMs we use relatively inexpensive 1.3-megapixel CCD cameras connected via USB to a Windows host for image acquisition and analysis. The VBPM host computers are networked and provide live images of the beam and streams of data about the beam position, profile and intensity to CHESS's signal logging system and to the CHESS operator. Operational use of VBPMs has shown a great advantage over the traditional BPMs by providing direct visual input for the CHESS operator. The VBPM precision in most cases is on the order of ~0.1 micron. On the down side, the data-acquisition period (50-1000 ms) is inferior to that of the photoelectron-based BPMs. In the future, with the use of more expensive fast cameras, we will be able to create VBPMs working at the scale of a few hundred Hz.
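    The beam position a VBPM of this kind reports is essentially the intensity-weighted centroid of the camera image; a minimal sketch in plain Python (illustrative names, not the CHESS acquisition code).

```python
def beam_centroid(image):
    """Intensity-weighted center of mass (x, y) of a 2-D image given as rows."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    if total == 0:
        raise ValueError("image contains no signal")
    return sx / total, sy / total

# toy beam spot centered between pixels
img = [[0, 0, 0, 0],
       [0, 2, 2, 0],
       [0, 2, 2, 0],
       [0, 0, 0, 0]]
# beam_centroid(img) -> (1.5, 1.5)
```

    Sub-pixel precision of the kind quoted (~0.1 micron) comes from this weighting being averaged over many illuminated pixels.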

  5. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development, and testing of a charge injection device (CID) camera using a 244x248-element array are described. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low-light-level performance, high S/N ratio, antiblooming, low geometric distortion, sequential scanning, and AGC.

  6. Measurement of marine picoplankton cell size by using a cooled, charge-coupled device camera with image-analyzed fluorescence microscopy

    SciTech Connect

    Viles, C.L.; Sieracki, M.E.

    1992-02-01

    Accurate measurement of the biomass and size distribution of picoplankton cells (0.2 to 2.0 {mu}m) is paramount in characterizing their contribution to the oceanic food web and global biogeochemical cycling. Image-analyzed fluorescence microscopy, usually based on video camera technology, allows detailed measurements of individual cells to be taken. The application of an imaging system employing a cooled, slow-scan charge-coupled device (CCD) camera to automated counting and sizing of individual picoplankton cells from natural marine samples is described. A slow-scan CCD-based camera was compared to a video camera and was superior for detecting and sizing very small, dim particles such as fluorochrome-stained bacteria. Several edge detection methods for accurately measuring picoplankton cells were evaluated. Standard fluorescent microspheres and a Sargasso Sea surface water picoplankton population were used in the evaluation. Global thresholding was inappropriate for these samples. Methods used previously in image analysis of nanoplankton cells (2 to 20 {mu}m) also did not work well with the smaller picoplankton cells. A method combining an edge detector and an adaptive edge strength operator worked best for rapidly generating accurate cell sizes. A complete sample analysis of more than 1,000 cells averages about 50 min and yields size, shape, and fluorescence data for each cell. With this system, the entire size range of picoplankton can be counted and measured.
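    The "edge detector plus adaptive edge strength operator" idea can be illustrated with a Sobel gradient magnitude and a threshold tied to the image's own peak edge strength; a simplified stand-in for the authors' method, not their exact operator.

```python
def sobel_edge_strength(img):
    """Gradient magnitude via 3x3 Sobel kernels; img is a list of rows."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

def adaptive_threshold(strength, frac=0.5):
    """Keep pixels whose edge strength exceeds a fraction of the image's own
    peak edge strength -- a crude stand-in for an adaptive edge-strength cut."""
    peak = max(max(row) for row in strength)
    return [[v > 0 and v >= frac * peak for v in row] for row in strength]

img = [[0, 0, 10, 10, 10] for _ in range(5)]  # vertical step edge (a cell wall)
s = sobel_edge_strength(img)
mask = adaptive_threshold(s)
```

    Tying the threshold to each image's own peak, rather than a global constant, is what makes the cut adaptive to the dim, variable-brightness cells described above.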

  7. Electronic still camera

    NASA Astrophysics Data System (ADS)

    Holland, S. Douglas

    1992-09-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  8. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
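    The analog chain described in both records (preamplifier, correlated double sampler, sample-and-hold, then 8-bit conversion) can be sketched behaviorally; the voltages and full-scale values below are invented for illustration.

```python
def correlated_double_sample(reset_level_v, video_level_v):
    """CDS: subtract the reset (reference) sample from the video sample,
    cancelling the reset pedestal and low-frequency noise common to both."""
    return video_level_v - reset_level_v

def quantize_8bit(value_v, full_scale_v):
    """Ideal 8-bit ADC: map [0, full_scale_v) volts to codes 0..255, clamping."""
    code = int(value_v / full_scale_v * 256)
    return max(0, min(255, code))

# hypothetical pixel: 0.25 V reset pedestal, 1.0 V sampled video, 1.0 V full scale
signal_v = correlated_double_sample(0.25, 1.0)  # -> 0.75 V of true signal
code = quantize_8bit(signal_v, 1.0)             # -> code 192
```

    Taking the two samples close together in time is the point of CDS: any drift slower than the sample spacing appears in both samples and drops out of the difference.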

  9. Video flowmeter

    DOEpatents

    Lord, David E. (Livermore, CA); Carter, Gary W. (Livermore, CA); Petrini, Richard R. (Livermore, CA)

    1983-01-01

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid (10) containing entrained particles (12) is formed and positioned by a rod optic lens assembly (31) on the raster area of a low-light level television camera (20). The particles (12) are illuminated by light transmitted through a bundle of glass fibers (32) surrounding the rod optic lens assembly (31). Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen (40). The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid (10).

  10. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1981-06-10

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid.

  11. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1983-08-02

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid. 4 figs.

  12. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  13. World's fastest and most sensitive astronomical camera

    NASA Astrophysics Data System (ADS)

    2009-06-01

    The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240-pixel images with the world's fastest high-precision faint-light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). (ESO PR Photo 22a/09: the CCD220 detector. ESO PR Photo 22b/09: the OCam camera. ESO PR Video 22a/09: OCam images.) "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is an essential component of the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second. 
The new generation instruments require these corrections to be done at an even higher rate, more than one thousand times a second, and this is where OCam is essential. "The quality of the adaptive optics correction strongly depends on the speed of the camera and on its sensitivity," says Philippe Feautrier from the LAOG, France, who coordinated the whole project. "But these are a priori contradictory requirements, as in general the faster a camera is, the less sensitive it is." This is why cameras normally used for very high frame-rate movies require extremely powerful illumination, which is of course not an option for astronomical cameras. OCam and its CCD220 detector, developed by the British manufacturer e2v technologies, solve this dilemma, by being not only the fastest available, but also very sensitive, making a significant jump in performance for such cameras. Because of imperfect operation of any physical electronic devices, a CCD camera suffers from so-called readout noise. OCam has a readout noise ten times smaller than the detectors currently used on the VLT, making it much more sensitive and able to take pictures of the faintest of sources. "Thanks to this technology, all the new generation instruments of ESO's Very Large Telescope will be able to produce the best possible images, with an unequalled sharpness," declares Jean-Luc Gach, from the Laboratoire d'Astrophysique de Marseille, France, who led the team that built the camera. "Plans are now underway to develop the adaptive optics detectors required for ESO's planned 42-metre European Extremely Large Telescope, together with our research partners and the industry," says Hubin. Using sensitive detectors developed in the UK, with a control system developed in France, with German and Spanish participation, OCam is truly an outcome of a European collaboration that will be widely used and commercially produced. 
More information The three French laboratories involved are the Laboratoire d'Astrophysique de Marseille (LAM/INSU/CNRS, Université de Provence; Observatoire Astronomique de Marseille Prov

  14. An advanced CCD emulator with 32MB image memory

    NASA Astrophysics Data System (ADS)

    O'Connor, P.; Fried, J.; Kotov, I.

    2012-07-01

    As part of the LSST sensor development program we have developed an advanced CCD emulator for testing new multichannel readout electronics. The emulator, based on an Altera Stratix II FPGA for timing and control, produces 4 channels of simulated video waveforms in response to an appropriate sequence of horizontal and vertical clocks. It features 40MHz, 16-bit DACs for reset and video generation, 32MB of image memory for storage of arbitrary grayscale bitmaps, and provision to simulate reset and clock feedthrough ("glitches") on the video channels. Clock inputs are qualified for proper sequences and levels before video output is generated. Binning, region of interest, and reverse clock sequences are correctly recognized and appropriate video output will be produced. Clock transitions are timestamped and can be played back to a control PC. A simplified user interface is provided via a daughter card having an ARM Cortex-M3 microprocessor and a miniature color LCD display and joystick. The user can select video modes from stored bitmap images, or flat, gradient, bar, chirp, or checkerboard test patterns; set clock thresholds and video output levels; and set row/column formats for image outputs. Multiple emulators can be operated in parallel to simulate complex CCDs or CCD arrays.
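    Test patterns like the emulator's flat, gradient, and checkerboard modes are straightforward to generate; a sketch of three of them in Python, with a 12-bit-style value range chosen arbitrarily for illustration (the emulator itself uses 16-bit DACs).

```python
def flat(w, h, level=1000):
    """Uniform field at a fixed level."""
    return [[level] * w for _ in range(h)]

def gradient(w, h, lo=0, hi=4095):
    """Horizontal ramp from lo to hi across each row."""
    return [[lo + (hi - lo) * x // max(w - 1, 1) for x in range(w)]
            for _ in range(h)]

def checkerboard(w, h, period=2, lo=0, hi=4095):
    """Alternating blocks of size period x period."""
    return [[hi if ((x // period) + (y // period)) % 2 else lo
             for x in range(w)]
            for y in range(h)]
```

    Patterns like these make readout faults easy to spot: a gradient exposes gain and linearity errors, while a checkerboard exposes channel crosstalk and pixel-order mistakes.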

  15. Handbook of CCD Astronomy

    NASA Astrophysics Data System (ADS)

    Howell, Steve B.

    2000-04-01

    This handbook constitutes a concise and accessible reference on all practical aspects of using Charge-Coupled Devices (CCDs). Starting with the electronic workings of these modern marvels, Steven Howell discusses their basic characteristics and then gives methods and examples for determining their values. While the focus is on using CCDs in professional observational astronomy, advanced amateur astronomers, and researchers in physics, chemistry, medical imaging, and remote sensing will also benefit from the material. Tables of useful and hard-to-find data, and key practical equations round off the book's treatment. For exercises and more information, log on to www.psi.edu/~howell/ccd.html.

  16. CCD-based guiding sensor resolution enhancement

    NASA Astrophysics Data System (ADS)

    Sobotka, Milos; Prochazka, Ivan; Hamal, Karel; Blazej, Josef

    1998-04-01

    The paper presents achievements in the research and development of a compact optical CCD-based guiding system. The ultimate goal of the project is a modest-size, low-mass and rugged system to be applied for sub-arcsecond optical ground-space guiding and tracking. The system includes the optics, the CCD sensor with its readout, and an image-processing algorithm. The optics consist of a diffraction-limited objective, a four-element lens system with an effective input aperture of 90 millimeters. The objective focal length of 300 mm is extended by additional relay optics; the resulting effective focal length is 3000 millimeters, and the focal spot size is a 65-micrometer Airy disc diameter. The combination of the diffraction-limited objective design, focal extender and mechanical construction made it possible to keep the overall length below 600 millimeters and the total mass below 5 kilograms while maintaining high ruggedness at the one-arcsecond level. A Texas Instruments CCD chip of 192 x 164 pixels, 15 micrometers in size, is used as the sensor. Custom-designed readout and data-processing hardware has been developed. Parallel communication keeps the image download time at 0.6 second with 12-bit amplitude resolution. The data-acquisition and image-processing software package, running under MS Windows 95 or NT, provides all functions for camera control, data acquisition and image processing for precise target-position evaluation. The position is evaluated as the center of mass of a square neighborhood of the brightest CCD pixel. Indoor tests of the ultimate position resolution using different diffraction-limited image sizes are described. An image position resolution of +/- 0.03 pixel has been achieved, corresponding to 0.03 arcseconds of angular resolution for the entire guiding sensor.
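    The position algorithm described (center of mass of a square neighborhood around the brightest pixel) can be sketched directly; illustrative code, not the original MS Windows package.

```python
def guide_star_position(img, half=1):
    """Sub-pixel target position: intensity-weighted center of mass of the
    (2*half+1)-pixel square neighborhood around the brightest pixel."""
    h, w = len(img), len(img[0])
    by, bx = max(((y, x) for y in range(h) for x in range(w)),
                 key=lambda p: img[p[0]][p[1]])
    total = sx = sy = 0.0
    for y in range(max(0, by - half), min(h, by + half + 1)):
        for x in range(max(0, bx - half), min(w, bx + half + 1)):
            v = img[y][x]
            total += v
            sx += x * v
            sy += y * v
    return sx / total, sy / total

# toy star image: brightest pixel at (x=2, y=1), flux spread over neighbors
img = [[0, 0, 0, 0],
       [0, 1, 3, 0],
       [0, 1, 3, 0],
       [0, 0, 0, 0]]
```

    Because the weighted position can fall between pixel centers, resolution well below one pixel (the quoted +/-0.03 pixel) is possible even on a coarse 192 x 164 array.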

  17. Video Compressive Sensing Using Gaussian Mixture Models

    E-print Network

    Carin, Lawrence

    Video Compressive Sensing Using Gaussian Mixture Models. Jianbo Yang, Xin Yuan, Xuejun Liao [...] A Gaussian mixture model (GMM)-based algorithm is proposed for video reconstruction from temporally-compressed video measurements [...] reconstructed from simulated compressive video measurements, and from a real compressive video camera. We also [...]

  18. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
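    As a quick sanity check, the quoted pixel scales and the 1024-pixel detector width imply fields of view close to the quoted values (the small excess over 45 degrees for the Navcams is presumably because a single mrad/pixel figure ignores distortion across the field); a worked computation:

```python
import math

def fov_degrees(n_pixels, scale_mrad_per_px):
    """Full field of view implied by detector width and angular pixel scale."""
    return n_pixels * scale_mrad_per_px * 1e-3 * 180.0 / math.pi

navcam = fov_degrees(1024, 0.82)  # ~48 degrees, near the quoted 45-degree FOV
hazcam = fov_degrees(1024, 2.1)   # ~123 degrees, near the quoted 124-degree FOV
```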

  19. System for control of cooled CCD and image data processing for plasma spectroscopy

    SciTech Connect

    Mimura, M.; Kakeda, T.; Inoko, A.

    1995-12-31

    A spectroscopic measurement system with spatial resolution is important for plasma studies. This is especially true for measurements of a plasma without axial symmetry, like the LHD plasma. Several years ago, we developed an imaging spectroscopy system using a CCD camera and an image-memory board in a personal computer. It was very powerful for studying plasma-gas interaction phenomena. In that system, however, an ordinary CCD was used, so the dark-current noise of the CCD prevented measurement of faint spectral lines. Recently, cooled CCD systems have become available for high-sensitivity measurement, but such systems are still very expensive. The cooled CCD itself, as an element, can be purchased cheaply, because amateur astronomers have begun to use it to take pictures of heavenly bodies. So we developed an imaging spectroscopy system using such a cheap cooled CCD for plasma experiments.
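    The benefit of cooling comes from the steep temperature dependence of CCD dark current, which roughly halves for every 6-7 degrees C of cooling; that doubling temperature is a common rule of thumb and device-dependent, so the figure below is an assumption, not a measured value for this system.

```python
def dark_current_ratio(delta_t_c, doubling_temp_c=6.3):
    """Rule-of-thumb fraction of dark current remaining after cooling by
    delta_t_c degrees C; doubling_temp_c (~6-7 C) is device-dependent."""
    return 2.0 ** (-delta_t_c / doubling_temp_c)

# cooling by 40 C cuts dark current roughly 80-fold under this assumption
reduction = 1.0 / dark_current_ratio(40.0)
```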

  20. ProbeSight: Video Cameras on an Ultrasound Probe for Computer Vision of the Patient's Exterior (Galeotti, et al., http://www.ncigt.org/pages/IGT_Workshop_2011)

    E-print Network

    Stetten, George

    Medical ultrasound typically deals with the interior of the patient [...] features to the phantom. The tracing paper was saturated with ultrasound gel [...]

  1. The Dark Energy Camera

    E-print Network

    Flaugher, B; Honscheid, K; Abbott, T M C; Alvarez, O; Angstadt, R; Annis, J T; Antonik, M; Ballester, O; Beaufore, L; Bernstein, G M; Bernstein, R A; Bigelow, B; Bonati, M; Boprie, D; Brooks, D; Buckley-Geer, E J; Campa, J; Cardiel-Sas, L; Castander, F J; Castilla, J; Cease, H; Cela-Ruiz, J M; Chappa, S; Chi, E; Cooper, C; da Costa, L N; Dede, E; Derylo, G; DePoy, D L; de Vicente, J; Doel, P; Drlica-Wagner, A; Eiting, J; Elliott, A E; Emes, J; Estrada, J; Neto, A Fausti; Finley, D A; Flores, R; Frieman, J; Gerdes, D; Gladders, M D; Gregory, B; Gutierrez, G R; Hao, J; Holland, S E; Holm, S; Huffman, D; Jackson, C; James, D J; Jonas, M; Karcher, A; Karliner, I; Kent, S; Kessler, R; Kozlovsky, M; Kron, R G; Kubik, D; Kuehn, K; Kuhlmann, S; Kuk, K; Lahav, O; Lathrop, A; Lee, J; Levi, M E; Lewis, P; Li, T S; Mandrichenko, I; Marshall, J L; Martinez, G; Merritt, K W; Miquel, R; Munoz, F; Neilsen, E H; Nichol, R C; Nord, B; Ogando, R; Olsen, J; Palio, N; Patton, K; Peoples, J; Plazas, A A; Rauch, J; Reil, K; Rheault, J -P; Roe, N A; Rogers, H; Roodman, A; Sanchez, E; Scarpine, V; Schindler, R H; Schmidt, R; Schmitt, R; Schubnell, M; Schultz, K; Schurter, P; Scott, L; Serrano, S; Shaw, T M; Smith, R C; Soares-Santos, M; Stefanik, A; Stuermer, W; Suchyta, E; Sypniewski, A; Tarle, G; Thaler, J; Tighe, R; Tran, C; Tucker, D; Walker, A R; Wang, G; Watson, M; Weaverdyck, C; Wester, W; Woods, R; Yanny, B

    2015-01-01

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250 micron thick fully-depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2kx4k CCDs for imaging and 12 2kx2k CCDs for guiding and focus. The CCDs have 15 micron x 15 micron pixels with a plate scale of 0.263 arc sec per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construct...

  2. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; The DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263 arcsec per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
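    The quoted plate scale follows from the small-angle relation scale = 206265 × (pixel size) / (focal length): with 15 μm pixels, 0.263 arcsec per pixel implies an effective focal length near 11.8 m at the Blanco prime focus. The 11.76 m figure below is the implied value, not a number from the paper.

```python
def plate_scale_arcsec_per_px(pixel_um, focal_length_m):
    """Small-angle plate scale: 206265 arcsec/rad * pixel size / focal length."""
    return 206265.0 * pixel_um * 1e-6 / focal_length_m

# 15 um DECam pixels with an assumed ~11.76 m effective focal length
scale = plate_scale_arcsec_per_px(15.0, 11.76)  # ~0.263 arcsec per pixel
```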

  3. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.
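    The gain-setting rule described above (record a single photoelectron at about five times the recording-system noise) fixes the usable dynamic range in photoelectrons; a back-of-envelope sketch with invented example numbers, not LLNL's actual calibration values.

```python
def single_pe_exposure(system_noise_counts, ratio=5.0):
    """Recorded exposure of one photoelectron when intensifier gain is set so
    a single p.e. lands at `ratio` times the recording-system noise."""
    return ratio * system_noise_counts

def dynamic_range_pe(saturation_counts, system_noise_counts, ratio=5.0):
    """Dynamic range in photoelectrons, from one p.e. up to saturation."""
    return saturation_counts / single_pe_exposure(system_noise_counts, ratio)

# invented example: 10-count noise floor, recorder saturating at 65535 counts
dr = dynamic_range_pe(65535, 10.0)  # ~1311 photoelectrons
```

    Raising the gain beyond this point buys nothing: single photoelectrons are already detected, and each extra factor of gain just consumes recording-system headroom.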

  4. Video monitoring system for car seat

    NASA Technical Reports Server (NTRS)

    Elrod, Susan Vinz (Inventor); Dabney, Richard W. (Inventor)

    2004-01-01

    A video monitoring system for use with a child car seat has video camera(s) mounted in the car seat. The video images are wirelessly transmitted to a remote receiver/display encased in a portable housing that can be removably mounted in the vehicle in which the car seat is installed.

  5. LSST camera readout chip ASPIC: test tools

    NASA Astrophysics Data System (ADS)

    Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.

    2012-02-01

    The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The correlated double sampling (CDS) technique, generally used for the signal readout of CCDs, is adopted for this application and implemented with the so-called dual-slope-integrator method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD, that is 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the analog differential outputs of the ASPIC, and stores the data in memory. It contains eight 18-bit ADCs, 512 kwords of memory and a USB interface. An FPGA manages all signals from/to the components on the board and generates the timing sequence for the ASPIC. Its firmware is written in the Verilog and VHDL languages. Internal registers allow various test parameters of the ASPIC to be defined, and a LabVIEW GUI loads or updates these registers and checks for proper operation. Several series of tests, including linearity, noise and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board and CCD detectors are being integrated to perform a characterization of the whole readout chain.
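The dual-slope-integrator form of correlated double sampling can be sketched numerically: the reset level is integrated with one polarity and the signal level with the opposite polarity, so the correlated reset offset cancels. A minimal Python sketch with illustrative sample values (not ASPIC specifics):

```python
def dual_slope_cds(reset_samples, signal_samples):
    """Integrate the reset level with negative polarity and the signal level
    with positive polarity; their shared (correlated) reset offset cancels."""
    n = len(reset_samples)
    assert len(signal_samples) == n
    return (sum(signal_samples) - sum(reset_samples)) / n

reset = [100.3, 99.8, 100.1, 99.9]     # baseline after pixel reset (noisy)
signal = [160.2, 160.0, 159.9, 160.3]  # baseline plus ~60 units of charge
print(dual_slope_cds(reset, signal))   # recovers the ~60-unit charge step
```

Averaging over the integration window is also what gives the dual-slope integrator its noise-bandwidth-limiting behaviour compared with sampling the two levels once each.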

  6. Echelle spectroscopy with a charge-coupled device /CCD/

    NASA Technical Reports Server (NTRS)

    York, D. G.; Jenkins, E. B.; Zucchino, P.; Lowrance, J. L.; Long, D.; Songaila, A.

    1981-01-01

    The recent availability of large-format CCDs with high quantum efficiency makes it possible to achieve significant advances in high-dispersion astronomical spectroscopy. An echelle-CCD combination equals or surpasses the other techniques presently available, and offers the advantage of complete spectral coverage of several thousand Angstroms in a single exposure. Attention is given to experiments which were conducted with a CCD camera head and an echelle spectrograph on a 4-meter telescope. It was found possible to achieve a signal-to-noise ratio of 150/1 on a 13th-magnitude star at 6000 A in a two-hour exposure at 0.16 A/pixel, limited primarily by photon statistics. For fainter objects, readout noise is the limiting factor in precision. For 20 electron rms readout noise, an S/N of 15/1 at 18th magnitude is expected, all other things being equal.
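The shift from photon-limited to readout-noise-limited precision described above follows from the standard per-pixel estimate S/N = S/sqrt(S + R²) for S detected photoelectrons and read noise R, neglecting sky background and dark current. A sketch with illustrative signal levels (assumptions, not the paper's measured values):

```python
import math

def snr(signal_e, read_noise_e=0.0):
    """Per-pixel signal-to-noise ratio: shot noise plus read noise in quadrature."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

bright = 150.0 ** 2   # ~22500 e-: photon statistics dominate
faint = 433.0         # illustrative faint-object signal level
print(round(snr(bright, 20.0), 1))  # 148.7 -- 20 e- read noise barely matters
print(round(snr(faint, 20.0), 1))   # 15.0 -- read noise now limits precision
```

At the bright end the read-noise term R² is negligible next to S, while at the faint end it dominates the denominator, which is why the abstract quotes read noise as the limiting factor for faint objects.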

  7. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward (Hercules, CA)

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.

  8. Fractals in pixellated video feedback.

    PubMed

    Courtial, J; Leach, J; Padgett, M J

    Video feedback occurs whenever a video camera is directed at a screen displaying the image currently being recorded by the camera. It can be observed in everyday situations, for example at sporting events when a stadium's display screen comes into the camera's view. Here we consider how this simple physical process is affected by the fact that monitors are pixel-based, and show that it can result in stationary fractal patterns such as von Koch snowflakes and Sierpinski gaskets. PMID:11780051

  9. Scalable Multi-camera Tracking in a Metropolis Yogesh Raja and Shaogang Gong

    E-print Network

    Gong, Shaogang

    quantities of open-world CCTV video data sourced from a large distributed multi-camera network encompassing investigators tasked with the forensic analysis of video from multi-camera CCTV networks face many challenges

  10. Toying with obsolescence : Pixelvision filmmakers and the Fisher Price PXL 2000 camera

    E-print Network

    McCarty, Andrea Nina

    2005-01-01

    This thesis is a study of the Fisher Price PXL 2000 camera and the artists and amateurs who make films and videos with this technology. The Pixelvision camera records video onto an audiocassette; its image is low-resolution, ...

  11. Digital autoradiography using room temperature CCD and CMOS imaging technology

    NASA Astrophysics Data System (ADS)

    Cabello, Jorge; Bailey, Alexis; Kitchen, Ian; Prydderch, Mark; Clark, Andy; Turchetta, Renato; Wells, Kevin

    2007-08-01

    CCD (charge-coupled device) and CMOS imaging technologies can be applied to thin tissue autoradiography as potential imaging alternatives to using conventional film. In this work, we compare two particular devices: a CCD operating in slow scan mode and a CMOS-based active pixel sensor, operating at near video rates. Both imaging sensors have been operated at room temperature using direct irradiation with images produced from calibrated microscales and radiolabelled tissue samples. We also compare these digital image sensor technologies with the use of conventional film. We show comparative results obtained with ¹⁴C calibrated microscales and ³⁵S radiolabelled tissue sections. We also present the first results of ³H images produced under direct irradiation of a CCD sensor operating at room temperature. Compared to film, silicon-based imaging technologies exhibit enhanced sensitivity, dynamic range and linearity.

  12. Guerrilla Video: A New Protocol for Producing Classroom Video

    ERIC Educational Resources Information Center

    Fadde, Peter; Rich, Peter

    2010-01-01

    Contemporary changes in pedagogy point to the need for a higher level of video production value in most classroom video, replacing the default video protocol of an unattended camera in the back of the classroom. The rich and complex environment of today's classroom can be captured more fully using the higher level, but still easily manageable,…

  13. Video Mosaicking for Inspection of Gas Pipelines

    NASA Technical Reports Server (NTRS)

    Magruder, Darby; Chien, Chiun-Hong

    2005-01-01

    A vision system that includes a specially designed video camera and an image-data-processing computer is under development as a prototype of robotic systems for visual inspection of the interior surfaces of pipes and especially of gas pipelines. The system is capable of providing both forward views and mosaicked radial views that can be displayed in real time or after inspection. To avoid the complexities associated with moving parts and to provide simultaneous forward and radial views, the video camera is equipped with a wide-angle (>165°) fish-eye lens aimed along the axis of a pipe to be inspected. Nine white-light-emitting diodes (LEDs) placed just outside the field of view of the lens (see Figure 1) provide ample diffuse illumination for a high-contrast image of the interior pipe wall. The video camera contains a 2/3-in. (1.7-cm) charge-coupled-device (CCD) photodetector array and functions according to the National Television Standards Committee (NTSC) standard. The video output of the camera is sent to an off-the-shelf video capture board (frame grabber) by use of a peripheral component interconnect (PCI) interface in the computer, which is of the 400-MHz, Pentium II (or equivalent) class. Prior video-mosaicking techniques are applicable to narrow-field-of-view (low-distortion) images of evenly illuminated, relatively flat surfaces viewed along approximately perpendicular lines by cameras that do not rotate and that move approximately parallel to the viewed surfaces. One such technique for real-time creation of mosaic images of the ocean floor involves the use of visual correspondences based on area correlation, during both the acquisition of separate images of adjacent areas and the consolidation (equivalently, integration) of the separate images into a mosaic image, in order to ensure that there are no gaps in the mosaic image.
The data-processing technique used for mosaicking in the present system also involves area correlation, but with several notable differences: Because the wide-angle lens introduces considerable distortion, the image data must be processed to effectively unwarp the images (see Figure 2). The computer executes special software that includes an unwarping algorithm that takes explicit account of the cylindrical pipe geometry. To reduce the processing time needed for unwarping, parameters of the geometric mapping between the circular view of a fisheye lens and pipe wall are determined in advance from calibration images and compiled into an electronic lookup table. The software incorporates the assumption that the optical axis of the camera is parallel (rather than perpendicular) to the direction of motion of the camera. The software also compensates for the decrease in illumination with distance from the ring of LEDs.
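The lookup-table unwarping strategy described above (the fisheye-to-pipe-wall geometry computed once from calibration, then one table lookup per pixel per frame) can be sketched as follows. The equidistant polar sampling model and all dimensions are assumptions for illustration, not the system's actual calibration:

```python
import math

def build_unwarp_lut(out_w, out_h, cx, cy, r_min, r_max):
    """Precompute, once per calibration, the fisheye-image pixel sampled by
    each output pixel (column = azimuth around the pipe, row = radius)."""
    lut = []
    for row in range(out_h):
        r = r_min + (r_max - r_min) * row / (out_h - 1)
        for col in range(out_w):
            theta = 2.0 * math.pi * col / out_w
            lut.append((int(round(cx + r * math.cos(theta))),
                        int(round(cy + r * math.sin(theta)))))
    return lut

def unwarp(frame, lut, out_w, out_h):
    """Per frame: one table lookup per output pixel, no trigonometry."""
    return [[frame[y][x] for (x, y) in lut[row * out_w:(row + 1) * out_w]]
            for row in range(out_h)]

# Toy 64x64 "fisheye" frame centred at (32, 32)
frame = [[(x + y) % 7 for x in range(64)] for y in range(64)]
lut = build_unwarp_lut(16, 8, 32, 32, 5, 20)
unwrapped = unwarp(frame, lut, 16, 8)
```

Moving all trigonometry into the precomputed table is what makes per-frame unwarping cheap enough for real-time display, which is the motivation the abstract gives for the lookup table.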

  14. CCD Photometer Installed on the Telescope - 600 OF the Shamakhy Astrophysical Observatory: I. Adjustment of CCD Photometer with Optics - 600

    NASA Astrophysics Data System (ADS)

    Lyuty, V. M.; Abdullayev, B. I.; Alekberov, I. A.; Gulmaliyev, N. I.; Mikayilov, Kh. M.; Rustamov, B. N.

    2009-12-01

    A short description of the optical and electrical scheme of the CCD photometer with the U-47 camera, installed at the Cassegrain focus of the ZEISS-600 telescope of the ShAO NAS Azerbaijan, is provided. A focal reducer with a reduction factor of 1.7 is used. Equivalent focal distances of the telescope with the focal reducer are calculated. General calculations of the optimum distance from the focal plane and of the sizes of the photometer's optical filters are presented.

  15. Automated characterization of CCD detectors for DECam

    NASA Astrophysics Data System (ADS)

    Kubik, D.; Alvarez, R.; Abbott, T.; Annis, J.; Bonati, M.; Buckley-Geer, E.; Campa, J.; Cease, H.; Chappa, S.; DePoy, D.; Derylo, G.; Diehl, H. T.; Estrada, J.; Flaugher, B.; Hao, J.; Holland, S.; Huffman, D.; Karliner, I.; Kuhlmann, S.; Kuk, K.; Lin, H.; Montes, J.; Roe, N.; Scarpine, V.; Schmidt, R.; Schultz, K.; Shaw, T.; Simaitis, V.; Spinka, H.; Stuermer, W.; Tucker, D.; Walker, A.; Wester, W.

    2010-07-01

    The Dark Energy Survey Camera (DECam) will comprise a mosaic of 74 charge-coupled devices (CCDs). The Dark Energy Survey (DES) science goals set stringent technical requirements for the CCDs. The CCDs are provided by LBNL with valuable cold probe data at 233 K, providing an indication of which CCDs are more likely to pass. After comprehensive testing at 173 K, about half of these qualify as science grade. Testing this large number of CCDs to determine which best meet the DES requirements is a very time-consuming task. We have developed a multistage testing program to automatically collect and analyze CCD test data. The test results are reviewed to select those CCDs that best meet the technical specifications for charge transfer efficiency, linearity, full well capacity, quantum efficiency, noise, dark current, cross talk, diffusion, and cosmetics.

  16. Video sensor with range measurement capability

    NASA Technical Reports Server (NTRS)

    Briscoe, Jeri M. (Inventor); Corder, Eric L. (Inventor); Howard, Richard T. (Inventor); Broderick, David J. (Inventor)

    2008-01-01

    A video sensor device is provided which incorporates a rangefinder function. The device includes a single video camera and a fixed laser spaced a predetermined distance from the camera for, when activated, producing a laser beam. A diffractive optic element divides the beam so that multiple light spots are produced on a target object. A processor calculates the range to the object based on the known spacing and angles determined from the light spots on the video images produced by the camera.
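The range calculation from the laser spots is a similar-triangles problem: a laser offset a known baseline from the camera appears in the image at an offset inversely proportional to the distance. A minimal sketch under assumed camera parameters (the baseline, focal length, and pixel offset below are illustrative, not the patent's values):

```python
def range_from_spot(baseline_m, focal_px, spot_offset_px):
    """Similar triangles: a laser beam parallel to the optical axis at a known
    baseline projects to an image offset of baseline * f / Z pixels, so the
    range is Z = baseline * f / offset."""
    return baseline_m * focal_px / spot_offset_px

# Assumed: 12.5 cm baseline, 800 px focal length, spot seen 100 px off-axis
print(range_from_spot(0.125, 800.0, 100.0))  # 1.0 (metres)
```

With the diffractive optic producing multiple spots, each spot yields an independent range estimate along its own direction, which is how a single fixed laser can map more than one point on the target.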

  17. How to Upload Videos to YouTube To submit a video to the event, you should make a 30-to-60-second video. You can use your cell

    E-print Network

    Engel, Robert

    How to Upload Videos to YouTube To submit a video to the event, you should make a 30-to-60-second video. You can use your cell phone, a video camera, your webcam, or other sources for your video. You can use various video editing programs to make your video, including iMovie (on Macs), Windows Live

  18. The high resolution video capture system on the alcator C-Mod tokamak

    SciTech Connect

    Allen, A.J.; Terry, J.L.; Garnier, D.; Stillerman, J.A.; Wurden, G.A.

    1997-01-01

    A new system for routine digitization of video images is presently operating on the Alcator C-Mod tokamak. The PC-based system features high resolution video capture, storage, and retrieval. The captured images are stored temporarily on the PC, but are eventually written to CD. Video is captured from one of five filtered RS-170 CCD cameras at 30 frames per second (fps) with 640×480 pixel resolution. In addition, the system can digitize the output from a filtered Kodak Ektapro EM Digital Camera which captures images at 1000 fps with 239×192 resolution. Present views of this set of cameras include a wide angle and a tangential view of the plasma, two high resolution views of gas puff capillaries embedded in the plasma facing components, and a view of ablating, high speed Li pellets. The system is being used to study (1) the structure and location of visible emissions (including MARFEs) from the main plasma and divertor, (2) asymmetries in gas puff plumes due to flows in the scrape-off layer (SOL), and (3) the tilt and cigar-shaped spatial structure of the Li pellet ablation cloud. © 1997 American Institute of Physics.

  19. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.…

  20. CCDPACK -- CCD data reduction package

    NASA Astrophysics Data System (ADS)

    Draper, Peter W.; Taylor, Mark; Allan, Alasdair

    CCDPACK is a package of programs for reducing CCD-like data. They allow you to debias, remove dark current, flatfield, register, resample and normalize data from single- or multiple-CCD instruments. CCDPACK is designed to help you to reduce your data easily. The basic reduction stages can be set up using an X based GUI that controls an automated reduction system. The GUI is designed to allow you to start working without any detailed knowledge of the package (or indeed of CCD reduction). Registration is performed using graphical, script based or automated techniques that help reduce the amount of work to a minimum. This document is intended for all users of CCDPACK. It provides instruction in how to use CCDPACK, describes CCD reduction in general and contains complete descriptions of the individual programs.
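The basic reduction stages CCDPACK automates (debias, dark-current removal, flatfielding) amount to per-pixel arithmetic. A minimal Python sketch of that arithmetic, using plain nested lists; this illustrates the stages, not CCDPACK's own implementation:

```python
def reduce_frame(raw, bias, dark, flat, exposure_s, dark_exposure_s):
    """Debias, subtract scaled dark current, and flatfield one CCD frame.
    All frames are equal-sized lists of lists of floats; the flat is assumed
    to be normalized to a mean of 1.0."""
    h, w = len(raw), len(raw[0])
    scale = exposure_s / dark_exposure_s  # scale dark current to exposure time
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            debiased = raw[y][x] - bias[y][x]
            # The dark frame also contains bias, so remove it before scaling.
            dedarked = debiased - scale * (dark[y][x] - bias[y][x])
            row.append(dedarked / flat[y][x])
        out.append(row)
    return out
```

Registration, resampling, and normalization across multiple CCDs are geometric steps layered on top of this per-pixel pipeline.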

  1. CCD Astronomy Software User's Guide

    E-print Network

    CCDSoft CCD Astronomy Software User's Guide Version 5 Revision 1.11 Copyright © 1992–2006 SantaSky Astronomy Software, and AutomaDome are trademarks of Software Bisque. WindowsTM is a trademark of Microsoft

  2. Enhanced performance CCD output amplifier

    DOEpatents

    Dunham, Mark E. (Los Alamos, NM); Morley, David W. (Santa Fe, NM)

    1996-01-01

    A low-noise FET amplifier is connected to amplify output charge from a charge coupled device (CCD). The FET has its gate connected to the CCD in common source configuration for receiving the output charge signal from the CCD and outputting an intermediate signal at a drain of the FET. An intermediate amplifier is connected to the drain of the FET for receiving the intermediate signal and outputting a low-noise signal functionally related to the output charge signal from the CCD. The amplifier is preferably connected as a virtual ground to the FET drain. The inherent shunt capacitance of the FET is selected to be at least equal to the sum of the remaining capacitances.

  3. Digital Charge Coupled Device (CCD) Camera System Architecture

    NASA Astrophysics Data System (ADS)

    Babey, S. K.; Anger, C. D.; Green, B. D.

    1987-03-01

    We propose a modeling system for generic objects in order to recognize different objects from the same category with only one generic model. The representation consists of a prototype, represented by parts and their configuration. Parts are modeled by superquadric volumetric primitives which are combined via Boolean operations to form objects. Variations between objects within a category are described by allowable changes in structure and shape deformations of prototypical parts. Each prototypical part and relation has a set of associated features that can be recognized in the images. These features are used for selecting models from the model data base. The selected hypothetical models are then verified on the geometric level by deforming the prototype in allowable ways to match the data. We base our design of the modeling system upon the current psychological theories of categorization and of human visual perception.

  4. Performance comparison of streak camera recording systems

    SciTech Connect

    Derzon, M.; Barber, T.

    1995-07-01

    Streak camera based diagnostics are vital to the inertial confinement fusion program at Sandia National Laboratories. Performance characteristics of various readout systems coupled to an EGG-AVO streak camera were analyzed and compared to scaling estimates. The purpose of the work was to determine the limits of the streak camera performance and the optimal fielding conditions for the Amador Valley Operations (AVO) streak camera systems. The authors measured streak camera limitations in spatial resolution and sensitivity. Streak camera limits on spatial resolution are greater than 18 lp/mm at 4% contrast. However, it will be difficult to make use of any resolution greater than this because of high spatial frequency variation in the photocathode sensitivity. They have measured a signal to noise of 3,000 with 0.3 mW/cm² of 830 nm light at a 10 ns/mm sweep speed. They have compared lens coupling systems with and without micro-channel plate intensifiers and systems using film or charge coupled device (CCD) readout. There were no conditions where film was found to be an improvement over the CCD readout. Systems utilizing a CCD readout without an intensifier have comparable resolution, for these source sizes and at a nominal cost in signal to noise of 3, over those with an intensifier. Estimates of the signal-to-noise for different light coupling methods show how performance can be improved.

  5. Data acquisition and control system for high-performance large-area CCD systems

    NASA Astrophysics Data System (ADS)

    Afanasieva, I. V.

    2015-04-01

    Astronomical CCD systems based on second-generation DINACON controllers were developed at the SAO RAS Advanced Design Laboratory more than seven years ago and since then have been in constant operation at the 6-meter and Zeiss-1000 telescopes. Such systems use monolithic large-area CCDs. We describe the software developed for the control of a family of large-area CCD systems equipped with a DINACON-II controller. The software suite serves for acquisition, primary reduction, visualization, and storage of video data, and also for the control, setup, and diagnostics of the CCD system.

  6. Data Acquisition and Control System for High-Performance Large-Area CCD Systems

    E-print Network

    Afanasieva, I V

    2015-01-01

    Astronomical CCD systems based on second-generation DINACON controllers were developed at the SAO RAS Advanced Design Laboratory more than seven years ago and since then have been in constant operation at the 6-meter and Zeiss-1000 telescopes. Such systems use monolithic large-area CCDs. We describe the software developed for the control of a family of large-area CCD systems equipped with a DINACON-II controller. The software suite serves for acquisition, primary reduction, visualization, and storage of video data, and also for the control, setup, and diagnostics of the CCD system.

  7. Page 1 of 2 How to Upload Videos to YouTube

    E-print Network

    Engel, Robert

    Page 1 of 2 How to Upload Videos to YouTube To submit a video to the event, you should make a 30-to-60-second video. You can use your cell phone, a video camera, your webcam, or other sources for your video. You can use various video editing programs to make your video, including iMovie (on Macs

  8. CCD readout electronics for the Subaru Prime Focus Spectrograph

    NASA Astrophysics Data System (ADS)

    Hope, Stephen C.; Gunn, James E.; Loomis, Craig P.; Fitzgerald, Roger E.; Peacock, Grant O.

    2014-07-01

    The following paper details the design of the CCD readout electronics for the Subaru Telescope Prime Focus Spectrograph (PFS). PFS is designed to gather spectra from 2394 objects simultaneously, covering wavelengths that extend from 380 nm to 1260 nm. The spectrograph comprises four identical spectrograph modules, each collecting roughly 600 spectra. The spectrograph modules provide simultaneous wavelength coverage over the entire band through the use of three separate optical channels: blue, red, and near infrared (NIR). A camera in each channel images the multi-object spectra onto a 4k × 4k, 15 μm pixel detector format. The two visible cameras use a pair of Hamamatsu 2k × 4k CCDs with readout provided by custom electronics, while the NIR camera uses a single Teledyne HgCdTe 4k × 4k detector and Teledyne's ASIC Sidecar to read the device. The CCD readout system is a custom design comprising three electrical subsystems: the Back End Electronics (BEE), the Front End Electronics (FEE), and a Pre-amplifier. The BEE is an off-the-shelf PC104 computer, with an auxiliary Xilinx FPGA module. The computer serves as the main interface to the Subaru messaging hub and controls other peripheral devices associated with the camera, while the FPGA is used to generate the necessary clocks and transfer image data from the CCDs. The FEE board sets clock biases, substrate bias, and CDS offsets. It also monitors bias voltages, offset voltages, power rail voltage, substrate voltage and CCD temperature. The board translates LVDS clock signals to biased clocks and returns digitized analog data via LVDS. Monitoring and control messages are sent from the BEE to the FEE using a standard serial interface. The Pre-amplifier board resides behind the detectors and acts as an interface to the two Hamamatsu CCDs. The Pre-amplifier passes clocks and biases to the CCDs, and analog CCD data is buffered and amplified prior to being returned to the FEE.
In this paper we describe the detailed design of the PFS CCD readout electronics and discuss current status of the design, preliminary performance, and proposed enhancements.

  9. The Crimean CCD Telescope for the asteroid observations

    NASA Astrophysics Data System (ADS)

    Chernykh, N. S.; Rumyantsev, V. V.

    2002-09-01

    The old 64-cm Richter-Slefogt telescope (F=90 cm) of the Crimean Astrophysical Observatory was reconstructed and equipped with the SBIG ST-8 CCD camera received from the Planetary Society under Eugene Shoemaker's Near Earth Object Grant. First observations of minor planets and comets were made with it. The CCD matrix of the ST-8 camera in the focus of our telescope covers a field of 52'.7 x 35'.1. A 120-second exposure yields stars up to the limiting magnitude of 20.5 for S/N=3. According to preliminary estimations, the telescope in its present state enables us to cover, during the year, a sky area of not more than 600 sq. deg. with threefold overlaps. Automation of the telescope can increase the productivity up to 20000 sq. deg. per year. The software for object localization, image-parameter determination, star identification, astrometric reduction, and the identification and cataloguing of asteroids has been developed. The first results obtained with the Crimean 64-cm CCD telescope are discussed.
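The quoted field size and annual coverage can be cross-checked with a short calculation. The field dimensions, coverage, and overlap factor are taken from the abstract; the resulting exposure count is an inference, not a figure the abstract states:

```python
# Back-of-envelope check of the survey-capacity figures quoted above.
fov_sq_deg = (52.7 / 60.0) * (35.1 / 60.0)  # 52'.7 x 35'.1 field, in sq. deg.
overlap = 3                                  # threefold overlapping
area_per_year = 600.0                        # sq. deg. covered per year

exposures_per_year = overlap * area_per_year / fov_sq_deg
print(round(fov_sq_deg, 3))       # 0.514 sq. deg. per frame
print(round(exposures_per_year))  # ~3503 exposures needed per year
```

At this frame size, the jump from 600 to 20000 sq. deg. per year promised by automation corresponds to roughly a 33-fold increase in exposures taken, which is why pointing and readout automation dominates the productivity gain.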

  10. The Crimean CCD telescope for the asteroid observations

    NASA Astrophysics Data System (ADS)

    Chernykh, Nikolaj; Rumyantsev, Vasilij

    2002-11-01

    The old 64-cm Richter-Slefogt telescope (F=90 cm) of the Crimean Astrophysical Observatory was reconstructed and equipped with the ST-8 CCD camera supplied by the Planetary Society as the Eugene Shoemaker Near Earth Object Grant. The first observations of minor planets and comets were made with the telescope in 2000. The CCD matrix of the ST-8 camera in the focus of our telescope covers a field of 52'.7×35'.1. With a 120-second exposure we obtain images of stars up to the limiting magnitude of 20.5 mag at S/N=3. The first phase of automation of the telescope was completed in May of 2002. According to our estimations, the telescope will be able to cover a sky area of 20 square degrees with threefold overlapping during the night. The software for object localization, image-parameter determination, star identification, astrometric reduction, and the identification and cataloguing of asteroids has been developed. The first observation results obtained with the 64-cm CCD telescope are discussed.

  11. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

    Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions to applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have made possible highly reliable, low-cost, hand-held instruments. Initially these instruments displayed only relative temperature differences, which limited their usefulness in applications such as thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR Thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperature ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high resolution video imaging creates some unique challenges when using uncooled detectors. A temperature controlled, field-of-view limiting aperture (cold shield) is not typically included in the small volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the Germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.

  12. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to 'see' through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  13. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to see through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  14. On the development of new SPMN diurnal video systems for daylight fireball monitoring

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.

    2008-09-01

    Daylight fireball video monitoring. High-sensitivity video devices are commonly used for the study of the activity of meteor streams during the night. These provide useful data for the determination, for instance, of radiant, orbital and photometric parameters ([1] to [7]). With this aim, during 2006 three automated video stations supported by Universidad de Huelva were set up in Andalusia within the framework of the SPanish Meteor Network (SPMN). These are endowed with 8-9 high sensitivity wide-field video cameras that achieve a meteor limiting magnitude of about +3. These stations have increased the coverage performed by the low-scan all-sky CCD systems operated by the SPMN and, besides, achieve a time accuracy of about 0.01 s for determining the appearance of meteor and fireball events. Despite these nocturnal monitoring efforts, we realised the need to set up stations for daylight fireball detection. Such effort was also motivated by the appearance of the two recent meteorite-dropping events of Villalbeto de la Peña [8,9] and Puerto Lápice [10]. Although the Villalbeto de la Peña event was casually videotaped and photographed, no direct pictures or videos were obtained for the Puerto Lápice event. Consequently, in order to perform a continuous recording of daylight fireball events, we set up new automated systems based on CCD video cameras. However, the development of these video stations raises several issues, relative to nocturnal systems, that must be properly solved in order to achieve optimal operation. The first of these video stations, also supported by the University of Huelva, was set up in Sevilla (Andalusia) during May 2007. But, of course, fireball association is unequivocal only in those cases where two or more stations recorded the fireball, and when consequently the geocentric radiant is accurately determined.
With this aim, a second diurnal video station is being set up in Andalusia in the facilities of Centro Internacional de Estudios y Convenciones Ecológicas y Medioambientales (CIECEM, University of Huelva), in the environment of Doñana Natural Park (Huelva province). In this way both stations, which are separated by a distance of 75 km, will work as a double video station system in order to provide trajectory and orbit information of major bolides and, thus, increase the chance of meteorite recovery in the Iberian Peninsula. The new diurnal SPMN video stations are endowed with different models of Mintron cameras (Mintron Enterprise Co., Ltd.). These are high-sensitivity devices that employ a colour 1/2" Sony interline-transfer CCD image sensor. Aspherical lenses are attached to the video cameras in order to maximize image quality. However, the use of fast lenses is not a priority here: while most of our nocturnal cameras use f/0.8 or f/1.0 lenses in order to detect meteors as faint as magnitude +3, diurnal systems employ in most cases f/1.4 to f/2.0 lenses. Their focal lengths range from 3.8 to 12 mm to cover different atmospheric volumes. The cameras are arranged in such a way that the whole sky is monitored from every observing station. (Figure 1: a daylight event recorded from Sevilla on May 26, 2008 at 4h30m05.4 ±0.1s UT.) The way our diurnal video cameras work is similar to the operation of our nocturnal systems [1]. Thus, diurnal stations are automatically switched on and off at sunrise and sunset, respectively. The images, taken at 25 fps with a resolution of 720x576 pixels, are continuously sent to PC computers through a video capture device. The computers run software (UFOCapture, by SonotaCo, Japan) that automatically registers meteor trails and stores the corresponding video frames on hard disk.
In addition, before the signal from the cameras reaches the computers, a video time inserter that employs a GPS device (KIWI-OSD, by PFD Systems) inserts time information on every video frame. This allows us to measure time in a precise way (about 0.01 s) along the whole fireball path. EPSC Abstracts, Vol. 3, EPSC2008-A-00319, 2008 European Planetary Science Congress, Author(s) 2008.

  15. CCDPACK: CCD Data Reduction Package

    NASA Astrophysics Data System (ADS)

    Warren-Smith, Rodney F.; Draper, Peter W.; Taylor, Mark; Allan, Alasdair

    2014-03-01

CCDPACK contains programs to debias, remove dark current, flatfield, register, resample and normalize data from single- or multiple-CCD instruments. The basic reduction stages can be set up using an X-based GUI that controls an automated reduction system, so one can start working without any detailed knowledge of the package (or indeed of CCD reduction). Registration is performed using graphical, script-based or automated techniques that keep the amount of work to a minimum. CCDPACK uses the Starlink environment (ascl:1110.012).
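The debias/dark/flat stages that CCDPACK automates can be sketched in a few lines of numpy. This is an illustrative sketch of the underlying arithmetic only, not CCDPACK's actual interface; the function name, frame names, and exposure-time scaling convention are assumptions.

```python
import numpy as np

def reduce_frame(raw, bias, dark, flat, exptime, dark_exptime):
    """Basic CCD reduction: debias, scale and subtract dark, flatfield."""
    debiased = raw - bias                                # remove electronic offset
    dark_scaled = (dark - bias) * (exptime / dark_exptime)
    dark_sub = debiased - dark_scaled                    # remove thermal signal
    norm_flat = (flat - bias) / np.median(flat - bias)   # unit-median flat
    return dark_sub / norm_flat                          # fix pixel-to-pixel response

# Synthetic 4x4 calibration and science frames (illustrative values)
bias = np.full((4, 4), 100.0)
dark = bias + 10.0           # 10 counts of dark current in the dark exposure
flat = bias + 5000.0         # perfectly uniform flat field
raw  = bias + 10.0 + 200.0   # matched exposure: bias + dark + 200 counts of sky
sci = reduce_frame(raw, bias, dark, flat, 30.0, 30.0)
print(sci)  # every pixel reduces to the 200-count sky signal
```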

  16. Video Toroid Cavity Imager

    DOEpatents

    Gerald, Rex E. II; Sanchez, Jairo; Rathke, Jerome W.

    2004-08-10

A video toroid cavity imager for in situ measurement of electrochemical properties of an electrolytic material sample includes a cylindrical toroid cavity resonator containing the sample and employs NMR and video imaging for providing high-resolution spectral and visual information of molecular characteristics of the sample on a real-time basis. A large magnetic field is applied to the sample under controlled temperature and pressure conditions to simultaneously provide NMR spectroscopy and video imaging capabilities for investigating electrochemical transformations of materials or the evolution of long-range molecular aggregation during cooling of hydrocarbon melts. The video toroid cavity imager includes a miniature commercial video camera with an adjustable lens, a modified compression coin cell imager with a flat circular principal detector element, and a sample mounted on a transparent circular glass disk, and provides NMR information as well as a video image of a sample, such as a polymer film, with micrometer resolution.

  17. High Speed Digital Camera Technology Review

    NASA Technical Reports Server (NTRS)

    Clements, Sandra D.

    2009-01-01

    A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

  18. Making a room-sized camera obscura

    NASA Astrophysics Data System (ADS)

    Flynt, Halima; Ruiz, Michael J.

    2015-01-01

    We describe how to convert a room into a camera obscura as a project for introductory geometrical optics. The view for our camera obscura is a busy street scene set against a beautiful mountain skyline. We include a short video with project instructions, ray diagrams and delightful moving images of cars driving on the road outside.

  19. Cameras Monitor Spacecraft Integrity to Prevent Failures

    NASA Technical Reports Server (NTRS)

    2014-01-01

The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained from working with NASA into an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.

  20. Modular integrated video system

    SciTech Connect

    Gaertner, K.J.; Heaysman, B.; Holt, R.; Sonnier, C.

    1986-01-01

    The Modular Integrated Video System (MIVS) is intended to provide a simple, highly reliable closed circuit television (CCTV) system capable of replacing the IAEA Twin Minolta Film Camera Systems in those safeguards facilities where mains power is readily available, and situations where it is desired to have the CCTV camera separated from the CCTV recording console. This paper describes the MIVS and the Program Plan which is presently being followed for the development, testing, and implementation of the system.

  1. Multicolor CCD photometry of the open cluster NGC 752

    NASA Astrophysics Data System (ADS)

Bartašiūtė, Stanislava; Janusz, Robert; Boyle, Richard P.; Philip, A. G. Davis; Deveikis, Viktoras

    2010-01-01

We obtained CCD observations of the open cluster NGC 752 with the 1.8m Vatican Advanced Technology Telescope (Mt. Graham, Arizona) with a 4K CCD camera and eight intermediate-band filters of the Stromvil (Strömgren + Vilnius) system. Four 12′ × 12′ fields were observed, covering the central part of the cluster. The good-quality multicolor data made it possible to obtain precise estimates of distance moduli, metallicity and foreground reddening for individual stars down to the limiting magnitude, V = 17.5, enabling photometric identification of faint cluster members. The new observations provide an extension of the lower main sequence to three magnitudes beyond the previous (photographic) limit. A relatively small number of photometric members identified at fainter magnitudes seems to be indicative of actual dissolution of the cluster from the low-mass end.

  2. CCD Photometry of bright stars using objective wire mesh

    SciTech Connect

Kamiński, Krzysztof; Zgórz, Marika; Schwarzenberg-Czerny, Aleksander

    2014-06-01

    Obtaining accurate photometry of bright stars from the ground remains problematic due to the danger of overexposing the target and/or the lack of suitable nearby comparison stars. The century-old method of using objective wire mesh to produce multiple stellar images seems promising for the precise CCD photometry of such stars. Furthermore, our tests on ? Cep and its comparison star, differing by 5 mag, are very encouraging. Using a CCD camera and a 20 cm telescope with the objective covered by a plastic wire mesh, in poor weather conditions, we obtained differential photometry with a precision of 4.5 mmag per two minute exposure. Our technique is flexible and may be tuned to cover a range as big as 6-8 mag. We discuss the possibility of installing a wire mesh directly in the filter wheel.
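The 5 mag gap between target and comparison star quoted above corresponds to a flux ratio of 100, via the standard differential-photometry relation m1 − m2 = −2.5 log10(F1/F2). A minimal sketch (the function name and example fluxes are ours, not from the paper):

```python
import math

def diff_mag(flux_target, flux_comp):
    """Differential magnitude from two instrumental fluxes (counts)."""
    return -2.5 * math.log10(flux_target / flux_comp)

# A flux ratio of 1:100 corresponds to a 5 magnitude difference:
print(round(diff_mag(1.0, 100.0), 2))  # → 5.0
```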

  3. Patterned Video Sensors For Low Vision

    NASA Technical Reports Server (NTRS)

    Juday, Richard D.

    1996-01-01

Miniature video cameras containing photoreceptors arranged in prescribed non-Cartesian patterns are proposed to partly compensate for some visual defects. Cameras, accompanied by (and possibly integrated with) miniature head-mounted video display units, would restore some visual function in humans whose visual fields are reduced by defects like retinitis pigmentosa.

  4. RESEARCH Open Access Hand contour detection in wearable camera

    E-print Network

    Popovic, Milos R.

Hand contour detection in wearable camera video using an adaptive histogram. The broad goal is to develop wearable computer vision systems for hand function monitoring; the specific aim of this study is to develop an algorithm that can identify hand contours in video from a wearable camera that records the user…

  5. Stationary Camera Aims And Zooms Electronically

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steven D.

    1994-01-01

    Microprocessors select, correct, and orient portions of hemispherical field of view. Video camera pans, tilts, zooms, and provides rotations of images of objects of field of view, all without moving parts. Used for surveillance in areas where movement of camera conspicuous or constrained by obstructions. Also used for closeup tracking of multiple objects in field of view or to break image into sectors for simultaneous viewing, thereby replacing several cameras.

  6. Illumination box and camera system

    DOEpatents

    Haas, Jeffrey S. (San Ramon, CA); Kelly, Fredrick R. (Modesto, CA); Bushman, John F. (Oakley, CA); Wiefel, Michael H. (La Honda, CA); Jensen, Wayne A. (Livermore, CA); Klunder, Gregory L. (Oakland, CA)

    2002-01-01

    A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.

  7. Video image position determination

    DOEpatents

    Christensen, Wynn (Los Alamos, NM); Anderson, Forrest L. (Bernalillo, NM); Kortegaard, Birchard L. (Los Alamos, NM)

    1991-01-01

    An optical beam position controller in which a video camera captures an image of the beam in its video frames, and conveys those images to a processing board which calculates the centroid coordinates for the image. The image coordinates are used by motor controllers and stepper motors to position the beam in a predetermined alignment. In one embodiment, system noise, used in conjunction with Bernoulli trials, yields higher resolution centroid coordinates.
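The centroid computation performed by the processing board can be illustrated as an intensity-weighted mean over pixel coordinates. This is a hedged numpy sketch of that basic calculation only; the patent's actual algorithm, including the Bernoulli-trial noise refinement, is not reproduced here.

```python
import numpy as np

def beam_centroid(frame):
    """Intensity-weighted centroid (x, y) of a beam image, in pixel units."""
    frame = np.asarray(frame, dtype=float)
    total = frame.sum()                 # total intensity normalizes the weights
    ys, xs = np.indices(frame.shape)    # per-pixel row/column coordinates
    return (xs * frame).sum() / total, (ys * frame).sum() / total

# A single bright pixel at column 3, row 1 centroids to exactly that point:
img = np.zeros((4, 5))
img[1, 3] = 10.0
cx, cy = beam_centroid(img)
print(cx, cy)  # → 3.0 1.0
```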

  8. Improving Radar Snowfall Measurements Using a Video Disdrometer

    NASA Astrophysics Data System (ADS)

    Newman, A. J.; Kucera, P. A.

    2005-05-01

A video disdrometer has been recently developed at NASA/Wallops Flight Facility in an effort to improve surface precipitation measurements. The recent upgrade of the UND C-band weather radar to dual-polarimetric capabilities along with the development of the UND Glacial Ridge intensive atmospheric observation site has presented a valuable opportunity to attempt to improve radar estimates of snowfall. The video disdrometer, referred to as the Rain Imaging System (RIS), has been deployed at the Glacial Ridge site for most of the 2004-2005 winter season to measure size distributions, precipitation rate, and density estimates of snowfall. The RIS uses a CCD grayscale video camera with a zoom lens to observe hydrometeors in a sample volume located 2 meters from the end of the lens and approximately 1.5 meters away from an independent light source. The design of the RIS may eliminate sampling errors from wind flow around the instrument. The RIS has proven its ability to operate continuously in the adverse conditions often observed in the Northern Plains. The RIS is able to provide crystal habit information, variability of particle size distributions for the lifecycle of the storm, snowfall rates, and estimates of snow density. This information, in conjunction with hand measurements of density and crystal habit, will be used to build a database for comparisons with polarimetric data from the UND radar. This database will serve as the basis for improving snowfall estimates using polarimetric radar observations. Preliminary results from several case studies will be presented.

  9. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

A Fundus Camera for ophthalmology is a high definition device which needs to provide low-light illumination of the human retina, high resolution in the retina and a reflection-free image [1]. Those constraints make its optical design very sophisticated, but the most difficult to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic Fundus Camera. The illumination is performed by a LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with autofocus and zoom built in, added to a 175 mm focal length doublet corrected for infinity, making the system easily operated and very compact.

  10. Multimedia Content Creation using Societal-Scale Ubiquitous Camera Networks and Human-Centric

    E-print Network

…of user-generated documentary video using a distributed network of sensor-enabled video cameras. Keywords: Video Networks, Wearable Sensing, Sensor-Controlled Video Creation, Documentary, User… sensor network equipped with high-quality video capture and a suite of sensate, communicative…

  11. Wide field imaging of solar system objects with an 8192 x 8192 CCD mosaic

    NASA Technical Reports Server (NTRS)

    Hall, Donald N. B.

    1995-01-01

As part of this program, we successfully completed the construction of the world's largest CCD camera, an 8192 x 8192 CCD mosaic. The system employs 8 2K x 4K 3-edge-buttable CCDs arranged in a 2 x 4 chip mosaic. The focal plane has small gaps (less than 1 mm) between mosaic elements and measures over 120 mm x 120 mm. The initial set of frontside-illuminated CCDs were developed with Loral-Fairchild in a custom foundry run. The initial lots yielded of order 20 to 25 functional devices, of which we selected the best eight for inclusion in the camera. We have designed a custom 3-edge-buttable package that ensures the CCD dies are mounted flat to plus or minus 10 microns over the entire area of the mosaic. The mosaic camera system consists of eight separate readout signal chains controlled by two independent DSP microcontrollers. These are in turn interfaced to a Sun Sparc-10 workstation through two high speed fiber optic interfaces. The system saw first-light on the Canada-France-Hawaii Telescope on Mauna Kea in March 1995. First-light on the University of Hawaii 2.2-M Telescope on Mauna Kea was in July 1995. Both runs were quite successful. A sample of some of the early science from the first light run is reported in the publication, 'Observations of Weak Lensing in Clusters with an 8192 x 8192 CCD Mosaic Camera'.

  12. Ground-based observations of 951 Gaspra: CCD lightcurves and spectrophotometry with the Galileo filters

    NASA Technical Reports Server (NTRS)

    Mottola, Stefano; Dimartino, M.; Gonano-Beurer, M.; Hoffmann, H.; Neukum, G.

    1992-01-01

    This paper reports the observations of 951 Gaspra carried out at the European Southern Observatory (La Silla, Chile) during the 1991 apparition, using the DLR CCD Camera equipped with a spare set of the Galileo SSI filters. Time-resolved spectrophotometric measurements are presented. The occurrence of spectral variations with rotation suggests the presence of surface variegation.

  13. Expeditions during 2014 with AMOS cameras

    NASA Astrophysics Data System (ADS)

    Tóth, Juraj; Zigo, Pavol; Kornoš, Leonard; Világi, Jozef

    2014-02-01

Slovak Video Meteor Network (SVMN) is a project of the Comenius University in Bratislava for continuous monitoring of meteor activity over Slovakia and surrounding countries. The network is based on AMOS (All-sky Meteor Orbit System) cameras, whose astrometric precision was calibrated using several commonly observed fireballs within the European Fireball Network. We cooperate with other national video networks and amateur observers and submit all data to the EDMOND video meteor database. The extension of the AMOS cameras to the Canary Islands and Chile to cover the Southern hemisphere is planned. We present preliminary results from the expedition to the Canary Islands (April 2014) and from Canada (Camelopardalids, May 2014).

  14. Jig Aligns Shadow Mask On CCD

    NASA Technical Reports Server (NTRS)

    Matus, Carlos V.

    1989-01-01

Alignment viewed through microscope. Alignment jig positions shadow mask on charge-coupled device (CCD) so that metal film is deposited on it precisely. Allows CCD package to be inserted and removed without disturbing alignment of mask. Holds CCD package securely and isolates it electrostatically while providing electrical contact to each of its pins. When alignment jig is assembled with CCD, it is used to move mask under micrometer control.

  15. Feasibility of Radon projection acquisition for compressive imaging in MMW region based on new video rate 16×16 GDD FPA camera

    NASA Astrophysics Data System (ADS)

    Levanon, Assaf; Konstantinovsky, Michael; Kopeika, Natan S.; Yitzhaky, Yitzhak; Stern, A.; Turak, Svetlana; Abramovich, Amir

    2015-05-01

In this article we present preliminary results for the combination of two interesting fields of the last few years: 1) Compressed imaging (CI), a joint sensing and compression process that attempts to exploit the large redundancy in typical images in order to capture fewer samples than usual. 2) Millimeter Wave (MMW) imaging. MMW-based imaging systems are required for a large variety of applications in many growing fields such as medical treatments, homeland security, concealed weapon detection, and space technology. Moreover, the possibility of reliable imaging in low-visibility conditions such as heavy cloud, smoke, fog and sandstorms in the MMW region generates high interest from military groups. The lack of inexpensive room-temperature imaging sensors makes it difficult to provide a suitable MMW system for many of the above applications. A system based on Glow Discharge Detector (GDD) Focal Plane Arrays (FPA) can be very efficient for real-time imaging with significant results. The GDD is located in free space and can detect MMW radiation almost isotropically. In this article, we present a new approach to MMW image reconstruction by rotational scanning of the target. The collection process, based on Radon projections, allows implementation of compressive sensing principles in the MMW region. Feasibility of the concept was demonstrated with Radon line imaging results. MMW imaging results with our recent sensor are also presented for the first time. The multiplexing frame rate of the 16×16 GDD FPA permits real-time video imaging at 30 frames per second and comprehensive 3D MMW imaging. It uses commercial 3 mm diameter Ne indicator lamps (GDDs) as pixel detectors. The combination of these two fields should bring significant improvements to MMW imaging research and open various new possibilities for compressive sensing techniques.
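At axis-aligned angles, a Radon projection reduces to simple row or column sums of the image, which makes the idea behind projection-based acquisition easy to demonstrate. This is a toy numpy sketch of that special case only; general angles require rotation and interpolation, and this is not the authors' GDD acquisition code.

```python
import numpy as np

def radon_axis_projections(image):
    """Radon projections at 0° and 90°: line integrals along columns/rows."""
    image = np.asarray(image, dtype=float)
    p0 = image.sum(axis=0)    # projection at 0°: sum down each column
    p90 = image.sum(axis=1)   # projection at 90°: sum across each row
    return p0, p90

img = np.array([[1., 2.],
                [3., 4.]])
p0, p90 = radon_axis_projections(img)
print(p0, p90)  # → [4. 6.] [3. 7.]
```

Note that every projection sums to the same total image mass, a basic consistency property that reconstruction algorithms rely on.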

  16. Snowfall Retrievals Using a Video Disdrometer

    NASA Astrophysics Data System (ADS)

    Newman, A. J.; Kucera, P. A.

    2004-12-01

A video disdrometer has been recently developed at NASA/Wallops Flight Facility in an effort to improve surface precipitation measurements. One of the goals of the upcoming Global Precipitation Measurement (GPM) mission is to provide improved satellite-based measurements of snowfall in mid-latitudes. Also, with the planned dual-polarization upgrade of US National Weather Service weather radars, there is potential for significant improvements in radar-based estimates of snowfall. The video disdrometer, referred to as the Rain Imaging System (RIS), was deployed in Eastern North Dakota during the 2003-2004 winter season to measure size distributions, precipitation rate, and density estimates of snowfall. The RIS uses a CCD grayscale video camera with a zoom lens to observe hydrometeors in a sample volume located 2 meters from the end of the lens and approximately 1.5 meters away from an independent light source. The design of the RIS may eliminate sampling errors from wind flow around the instrument. The RIS operated almost continuously in the adverse conditions often observed in the Northern Plains. Preliminary analysis of an extended winter snowstorm has shown encouraging results. The RIS was able to provide crystal habit information, variability of particle size distributions for the lifecycle of the storm, snowfall rates, and estimates of snow density. Comparisons with coincident snow core samples and measurements from the nearby NWS Forecast Office indicate the RIS provides reasonable snowfall measurements. WSR-88D radar observations over the RIS were used to generate a snowfall-reflectivity relationship from the storm. These results along with several other cases will be shown during the presentation.

  17. Effects On Beam Alignment Due To Neutron-Irradiated CCD Images At The National Ignition Facility

    SciTech Connect

    Awwal, A; Manuel, A; Datte, P; Burkhart, S

    2011-02-28

The 192 laser beams in the National Ignition Facility (NIF) are automatically aligned to the target-chamber center using images obtained through charge-coupled device (CCD) cameras. Several of these cameras are in and around the target chamber during an experiment. Current experiments for the National Ignition Campaign are attempting to achieve nuclear fusion. Neutron yields from these high-energy fusion shots expose the alignment cameras to neutron radiation. The present work explores modeling and predicting laser alignment performance degradation due to neutron radiation effects, and demonstrates techniques to mitigate performance degradation. Camera performance models have been created based on the measured camera noise from the cumulative single-shot fluence at the camera location. We have found that the effect of the neutron-generated noise for all shots to date has been well within the alignment tolerance of half a pixel, and image processing techniques can be utilized to reduce the effect even further on the beam alignment.
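One common image-processing mitigation for isolated radiation-induced hot pixels is a small median filter, which discards outlier values while preserving smooth beam structure. A hedged numpy sketch of that idea (the abstract does not specify which technique NIF actually uses):

```python
import numpy as np

def median3x3(img):
    """3x3 median filter: suppresses isolated (e.g. radiation-induced) hot pixels."""
    padded = np.pad(img, 1, mode='edge')          # replicate borders so output matches input
    # Collect the 9 shifted views that form each pixel's 3x3 neighborhood
    stack = [padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
             for dy in range(3) for dx in range(3)]
    return np.median(np.stack(stack), axis=0)     # per-pixel median over the neighborhood

img = np.zeros((5, 5))
img[2, 2] = 255.0                 # one simulated neutron "hot pixel"
print(median3x3(img).max())       # → 0.0
```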

  18. Innovative camera system developed for Sprint vehicle

    SciTech Connect

    Not Available

    1985-04-01

    A new inspection system for the Sprint 101 ROV eliminates parallax errors because all three camera modules use a single lens for viewing. Parallax is the apparent displacement of an object when it is viewed from two points not in the same line of sight. The central camera is a Pentax 35-mm single lens reflex with a 28-mm lens. It comes with 250-shot film cassettes, an automatic film wind-on, and a data chamber display. An optical transfer assembly on the stills camera viewfinder transmits the image to one of the two video camera modules. The video picture transmitted to the surface is exactly the same as the stills photo. The surface operator can adjust the focus by viewing the video display.

  19. 78 FR 39619 - Closed Captioning of Internet Protocol-Delivered Video Programming: Implementation of the Twenty...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-02

    ...apparatus. For example, digital still cameras may be...enable consumers to use a memory card to view video programming...applying this analysis to digital cameras, we find that...video programming on digital cameras with no ability...video programming to a memory card on another...

  20. Camera for Quasars in Early Universe (CQUEAN)

    NASA Astrophysics Data System (ADS)

    Park, Won-Kee; Pak, Soojong; Im, Myungshin; Choi, Changsu; Jeon, Yiseul; Chang, Seunghyuk; Jeong, Hyeonju; Lim, Juhee; Kim, Eunbin

    2012-08-01

We describe the overall characteristics and the performance of an optical CCD camera system, Camera for Quasars in Early Universe (CQUEAN), which has been used at the 2.1 m Otto Struve Telescope of the McDonald Observatory since 2010 August. CQUEAN was developed for follow-up imaging observations of red sources such as high-redshift quasar candidates (z ≳ 5), gamma-ray bursts, brown dwarfs, and young stellar objects. For efficient observations of the red objects, CQUEAN has a science camera with a deep-depletion CCD chip, which boasts a higher quantum efficiency at 0.7–1.1 μm than conventional CCD chips. The camera was developed on a short timescale (~1 yr) and has been working reliably. By employing an autoguiding system and a focal reducer to enhance the field of view on the classical Cassegrain focus, we achieve stable guiding in 20 minute exposures, an imaging quality with FWHM ≲ 0.6″ over the whole field (4.8'×4.8'), and a limiting magnitude of z = 23.4 AB mag at 5-σ with 1 hr total integration time.

  1. A Stochastic Approach to Tracking Objects Across Multiple Cameras

    E-print Network

    Dick, Anthony

A Stochastic Approach to Tracking Objects Across Multiple Cameras. Anthony R. Dick and Michael J. Brooks. The paper addresses tracking objects across the fields of view of multiple video cameras, and builds upon existing methods for tracking moving objects. Springer-Verlag Berlin Heidelberg 2004.

  2. An attentive multi-camera system

    NASA Astrophysics Data System (ADS)

    Napoletano, Paolo; Tisato, Francesco

    2014-03-01

Intelligent multi-camera systems that integrate computer vision algorithms are not error free, and thus both false positive and false negative detections need to be revised by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying videos from each camera of the network. Nevertheless, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of a video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. We believe that this is the first work that tries to use a model of human visual attention for the dynamic selection of the camera view of a multi-camera system. The proposed method has been tested in a given scenario and has demonstrated its effectiveness with respect to the other methods and manually generated ground truth. The effectiveness has been evaluated in terms of the number of correct best-views generated by the method with respect to the camera views manually selected by a human operator.

  3. Bandpass filters with CCD resonators

    NASA Astrophysics Data System (ADS)

    Schreiber, R.; Betzl, H.; Bardl, A.; Feil, M.

    1981-04-01

Two bandpass filters with novel CCD resonators in double-polysilicon-gate MOS technology are discussed. The two filters are a Chebyshev bandpass with a relative 3-dB bandwidth of 3.1%, and a fully self-contained signal filter for a FDM channel modulator with a 3-dB bandwidth of 97 Hz at 131.85 kHz. The filters achieve high stopband attenuation and a dynamic range of more than 70 dB. They have an extremely stable center frequency and a bandwidth independently controlled by a capacitance ratio. The CCD resonators are therefore ideal modules for the monolithic implementation of narrow bandpass filters at higher frequencies.

  4. Astronomical CCD observing and reduction techniques

    NASA Technical Reports Server (NTRS)

    Howell, Steve B. (editor)

    1992-01-01

    CCD instrumentation and techniques in observational astronomy are surveyed. The general topics addressed include: history of large array scientific CCD imagers; noise sources and reduction processes; basic photometry techniques; introduction to differential time-series astronomical photometry using CCDs; 2D imagery; point source spectroscopy; extended object spectrophotometry; introduction to CCD astrometry; solar system applications for CCDs; CCD data; observing with infrared arrays; image processing, data analysis software, and computer systems for CCD data reduction and analysis. (No individual items are abstracted in this volume)

  5. Mobile Panoramic Video Applications for Learning

    ERIC Educational Resources Information Center

    Multisilta, Jari

    2014-01-01

    The use of videos on the internet has grown significantly in the last few years. For example, Khan Academy has a large collection of educational videos, especially on STEM subjects, available for free on the internet. Professional panoramic video cameras are expensive and usually not easy to carry because of the large size of the equipment.…

  6. Video-based Rendering Marcus Magnor

    E-print Network

    Suresh, Subra

Course 16: Video-based Rendering. Organizers: Marcus Magnor (MPI Informatik), Marc Pollefeys (University…). Contents include camera calibration, multi-camera networks, unsynchronized video, spacetime coherence, spatiotemporal view…viewpoint video, references, online VBR resources, and presenters' contact information.

  7. Low-light-level integrating video system

    NASA Technical Reports Server (NTRS)

    Duncan, B. J.; Fay, T. D.; Miller, E. R.; Wamsteker, W.; Brown, R. M.; Neely, P.

    1977-01-01

    System consists of television camera using 25 mm SEC vidicon, low dispersion spectrograph, and digital video image system used for buffer storage of video data during tube readout scanning. Six-bit ADC converts video to digital data which are stored on magnetic tape for future evaluation.

  8. Understanding Computer-Based Digital Video.

    ERIC Educational Resources Information Center

    Martindale, Trey

    2002-01-01

    Discussion of new educational media and technology focuses on producing and delivering computer-based digital video. Highlights include video standards, including international standards and aspect ratio; camera formats and features, including costs; shooting digital video; editing software; compression; and a list of informative Web sites. (LRW)

  9. Optical stereo video signal processor

    NASA Technical Reports Server (NTRS)

    Craig, G. D. (inventor)

    1985-01-01

    An optical video signal processor is described which produces a two-dimensional cross-correlation in real time of images received by a stereo camera system. The optical image of each camera is projected on respective liquid crystal light valves. The images on the liquid crystal valves modulate light produced by an extended light source. This modulated light output becomes the two-dimensional cross-correlation when focused onto a video detector and is a function of the range of a target with respect to the stereo camera. Alternate embodiments utilize the two-dimensional cross-correlation to determine target movement and target identification.

  10. Automated Meteor Fluxes with a Wide-Field Meteor Camera Network

    NASA Technical Reports Server (NTRS)

    Blaauw, R. C.; Campbell-Brown, M. D.; Cooke, W.; Weryk, R. J.; Gill, J.; Musci, R.

    2013-01-01

    Within NASA, the Meteoroid Environment Office (MEO) is charged to monitor the meteoroid environment in near-Earth space for the protection of satellites and spacecraft. The MEO has recently established a two-station system to calculate automated meteor fluxes in the millimeter-size range. The cameras each consist of a 17 mm focal length Schneider lens on a Watec 902H2 Ultimate CCD video camera, producing a 21.7 x 16.3 degree field of view. This configuration has a red-sensitive limiting meteor magnitude of about +5. The stations are located in the southeastern USA, 31.8 kilometers apart, and are aimed at a location 90 km above a point 50 km equidistant from each station, which optimizes the common volume. Both single-station and double-station fluxes are found, each having benefits: more meteors will be detected in a single camera than will be seen in both cameras, producing a better-determined flux, but double-station detections allow for non-ambiguous shower associations and permit speed/orbit determinations. Video from the cameras is fed into Linux computers running the ASGARD (All Sky and Guided Automatic Real-time Detection) software, created by Rob Weryk of the University of Western Ontario Meteor Physics Group. ASGARD performs the meteor detection/photometry, and invokes the MILIG and MORB codes to determine the trajectory, speed, and orbit of the meteor. A subroutine in ASGARD allows for approximate shower identification in single-station meteors. The ASGARD output is used in routines to calculate the flux in units of #/sq km/hour. The flux algorithm employed here differs from others currently in use in that it does not assume a single height for all meteors observed in the common camera volume. In the MEO system, the volume is broken up into a set of height intervals, with the collecting areas determined by the radiant of the active shower or sporadic source. The flux per height interval is summed to obtain the total meteor flux.
As ASGARD also computes the meteor mass from the photometry, a mass flux can also be calculated. Weather conditions in the southeastern United States are seldom ideal, which introduces the difficulty of a variable sky background. First, a weather algorithm indicates whether sky conditions are clear enough to calculate fluxes, at which point a limiting-magnitude algorithm is employed. The limiting-magnitude algorithm performs a fit of stellar magnitudes vs. camera intensities. The stellar limiting magnitude is derived from this and easily converted to a limiting meteor magnitude for the active shower or sporadic source.
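
    The height-interval summation described above can be sketched as follows; the interval counts and collecting areas are illustrative placeholders, not values from the MEO system:

```python
def total_flux(counts_per_interval, areas_km2, hours):
    """Sum per-height-interval meteor fluxes (#/sq km/hour).

    counts_per_interval: meteors detected in each height interval
    areas_km2: collecting area of each interval, computed for the
               radiant of the active shower or sporadic source
    hours: clear-sky observing time in hours
    """
    return sum(n / (a * hours)
               for n, a in zip(counts_per_interval, areas_km2))

# Example: three height intervals observed for 4 clear hours
flux = total_flux([12, 20, 8], [300.0, 450.0, 380.0], 4.0)
```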

  11. Video compressive sensing using Gaussian mixture models.

    PubMed

    Yang, Jianbo; Yuan, Xin; Liao, Xuejun; Llull, Patrick; Brady, David J; Sapiro, Guillermo; Carin, Lawrence

    2014-11-01

    A Gaussian mixture model (GMM)-based algorithm is proposed for video reconstruction from temporally compressed video measurements. The GMM is used to model spatio-temporal video patches, and the reconstruction can be efficiently computed based on analytic expressions. The GMM-based inversion method benefits from online adaptive learning and parallel computation. We demonstrate the efficacy of the proposed inversion method with videos reconstructed from simulated compressive video measurements, and from a real compressive video camera. We also use the GMM as a tool to investigate adaptive video compressive sensing, i.e., adaptive rate of temporal compression. PMID:25095253
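
    The analytic inversion rests on a standard property: under a GMM prior, the posterior of a patch given a linear compressive measurement is itself a mixture of Gaussians with closed-form component means and weights. A minimal sketch of that computation (not the authors' code; the variable names and the simple responsibility-weighted combination are assumptions):

```python
import numpy as np

def gmm_reconstruct(y, Phi, weights, mus, Sigmas, noise_var):
    """Reconstruct a vectorized patch x from y = Phi @ x + noise,
    with x drawn from a GMM prior. Each posterior component has an
    analytic (Wiener-style) mean; the estimate is the
    responsibility-weighted sum of those component means."""
    n = Phi.shape[0]
    R = noise_var * np.eye(n)
    log_liks, post_means = [], []
    for w, mu, Sigma in zip(weights, mus, Sigmas):
        S = Phi @ Sigma @ Phi.T + R                  # measurement covariance
        r = y - Phi @ mu                             # innovation
        sol = np.linalg.solve(S, r)
        # log of w * N(y; Phi mu, S), dropping the shared (2 pi)^(n/2) term
        log_liks.append(np.log(w) - 0.5 * (np.linalg.slogdet(S)[1] + r @ sol))
        post_means.append(mu + Sigma @ Phi.T @ sol)  # component posterior mean
    log_liks = np.array(log_liks)
    resp = np.exp(log_liks - log_liks.max())
    resp /= resp.sum()                               # component responsibilities
    return sum(p * m for p, m in zip(resp, post_means))
```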

  12. Caught on Video

    ERIC Educational Resources Information Center

    Sprankle, Bob

    2008-01-01

    When cheaper video cameras with built-in USB connectors were first introduced, the author relates that he pined for one so that he could introduce the technology into the classroom. He believed it would not only be a great tool for students to capture their own learning, but would also make his job of collecting authentic assessment more streamlined…

  13. Online camera-gyroscope autocalibration for cell phones.

    PubMed

    Jia, Chao; Evans, Brian L

    2014-12-01

    The gyroscope is playing a key role in helping estimate 3D camera rotation for various vision applications on cell phones, including video stabilization and feature tracking. Successful fusion of gyroscope and camera data requires that the camera, gyroscope, and their relative pose be calibrated. In addition, the timestamps of gyroscope readings and video frames are usually not well synchronized. Previous work performed camera-gyroscope calibration and synchronization offline, after the entire video sequence had been captured and with restrictions on the camera motion, which is unnecessarily restrictive for everyday users running apps that directly use the gyroscope. In this paper, we propose an online method that estimates all the necessary parameters while a user is capturing video. Our contributions are: 1) simultaneous online camera self-calibration and camera-gyroscope calibration based on an implicit extended Kalman filter and 2) generalization of the multiple-view coplanarity constraint on camera rotation in a rolling-shutter camera model for cell phones. The proposed method is able to estimate the needed calibration and synchronization parameters online with all kinds of camera motion and can be embedded in gyro-aided applications, such as video stabilization and feature tracking. Both Monte Carlo simulation and cell phone experiments show that the proposed online calibration and synchronization method converges fast to the ground truth values. PMID:25265608
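
    The paper's method is an implicit extended Kalman filter running online; as a much simpler point of comparison, the timestamp offset alone can be estimated offline by cross-correlating the gyroscope's angular-speed magnitude against per-frame rotation magnitudes. A sketch under the assumption that both signals have already been resampled onto the same uniform grid (this baseline is not the authors' algorithm):

```python
import numpy as np

def estimate_offset(gyro_speed, frame_rot, dt):
    """Estimate the gyro-to-video timestamp offset in seconds by
    cross-correlating the gyroscope angular-speed magnitude with
    per-frame rotation magnitudes sampled every dt seconds.
    A positive result means the gyro signal lags the video signal."""
    g = gyro_speed - np.mean(gyro_speed)
    f = frame_rot - np.mean(frame_rot)
    corr = np.correlate(g, f, mode="full")
    lag = np.argmax(corr) - (len(f) - 1)   # full-mode zero lag sits at len(f)-1
    return lag * dt
```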

  14. MultiCam permits use of two or more webcams simultaneously for video chat

    E-print Network

    MacCormick, John

    MultiCam permits the use of two or more webcams simultaneously for video chat in existing chat systems. Main tool for this investigation: MultiCam, a new video chat plugin. Is multiple-camera video chat useful and/or desirable? Answer: yes, for certain…

  15. Modeling Background from Compressed Video Weiqiang Wang, Datong Chen and Jie Yang

    E-print Network

    Chen, Datong

    Modeling Background from Compressed Video. Weiqiang Wang, Datong Chen and Jie Yang. The proposed approach builds background models directly from compressed video data, utilizing the information in video recorded from video cameras onto a computer hard disk using hardware compression devices. However, how…

  16. VUV Testing of Science Cameras at MSFC: QE Measurement of the CLASP Flight Cameras

    NASA Technical Reports Server (NTRS)

    Champey, Patrick R.; Kobayashi, Ken; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512x512 detector, dual-channel analog readout electronics and an internally mounted cold block. At the flight operating temperature of -20 C, the CLASP cameras achieved the low-noise performance requirements (less than or equal to 25 e- read noise and less than or equal to 10 e-/sec/pix dark current), in addition to maintaining a stable gain of approximately 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-alpha wavelength. A vacuum ultraviolet (VUV) monochromator and a NIST-calibrated photodiode were employed to measure the QE of each camera. Four flight-like cameras were tested in a high-vacuum chamber, which was configured to run several tests intended to verify the QE, gain, read noise, dark current and residual non-linearity of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV and EUV science cameras at MSFC.

  17. Holistic video detection

    NASA Astrophysics Data System (ADS)

    Gong, Shaogang

    2007-10-01

    There are large numbers of CCTV cameras collecting colossal amounts of video data about people and their behaviour. However, this overwhelming amount of data also causes information overflow if its content is not analysed in a wider context to provide selective focus and automated alert triggering. To date, truly semantics-based video analytic systems do not exist. There is an urgent need for the development of automated systems to monitor holistically the behaviours of people and vehicles and the whereabouts of objects of interest in public space. In this work, we highlight the challenges and recent progress towards building computer vision systems for holistic video detection in a distributed network of multiple cameras, based on object localisation, categorisation and tagging from different views in highly cluttered scenes.

  18. STS-134 Launch Composite Video Comparison - Duration: 56 seconds.

    NASA Video Gallery

    A side-by-side comparison video shows a one-camera view of the STS-134 launch (left) with the six-camera composited view (right). Imaging experts funded by the Space Shuttle Program and located at ...

  19. Distance-of-Flight Mass Spectrometry with IonCCD Detection and an Inductively Coupled Plasma Source

    NASA Astrophysics Data System (ADS)

    Dennis, Elise A.; Ray, Steven J.; Enke, Christie G.; Gundlach-Graham, Alexander W.; Barinaga, Charles J.; Koppenaal, David W.; Hieftje, Gary M.

    2015-11-01

    Distance-of-flight mass spectrometry (DOFMS) is demonstrated for the first time with a commercially available ion detector—the IonCCD camera. Because DOFMS is a velocity-based MS technique that provides spatially dispersive, simultaneous mass spectrometry, a position-sensitive ion detector is needed for mass-spectral collection. The IonCCD camera is a 5.1-cm-long, 1-D array that is capable of simultaneous, multichannel ion detection along a focal plane, which makes it an attractive option for DOFMS. In the current study, the IonCCD camera is evaluated for DOFMS with an inductively coupled plasma (ICP) ionization source over a relatively short field-free mass-separation distance of 25.3-30.4 cm. The combination of ICP-DOFMS and the IonCCD detector results in a mass-spectral resolving power (FWHM) of approximately 900 and isotope-ratio precision equivalent to or slightly better than current ICP-TOFMS systems. The measured isotope-ratio precision in percent relative standard deviation (%RSD) was ≤0.008 %RSD for nonconsecutive isotopes at 10-ppm concentration (near the ion-signal saturation point) and ≤0.02 %RSD for all isotopes at 1-ppm. Results of DOFMS with the IonCCD camera are also compared with those of two previously characterized detection setups.
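
    For reference, the %RSD figure of merit quoted above is simply the sample standard deviation of repeated ratio measurements expressed as a percentage of their mean; a short illustration (the measurement values are made up):

```python
import statistics

def percent_rsd(values):
    """Percent relative standard deviation: 100 * sample stdev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# e.g. three repeated isotope-ratio measurements (illustrative numbers)
precision = percent_rsd([1.00, 1.01, 0.99])
```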

  20. Small arms video sight for the "German Army Soldier-of-the-Future Program": lessons learned

    NASA Astrophysics Data System (ADS)

    Ledertheil, Bernd H.; Berlips, Carsten; Ohlmann, Marco

    2009-05-01

    The Small Arms Video Sight is part of the optronic sights developed for the German IdZ-ES program (German Army Soldier-of-the-Future Program - Enhanced System) [1]. The aim of the development was to use this highly integrated sight on three different weapons (G36 = the assault rifle of the German soldier, optionally with a 40mm underslung grenade launcher; MG 4 = a light machine gun; PZF3 = a 60mm / 110mm bazooka). The Video Sight will be used for observation; target detection, recognition and identification; direct and indirect aiming and shooting with ballistic calculation and aiming-mark correction; still and video picture capturing and wireless transmission; determination of a located target's position by distance and angle from the operator's own position; and so on. To perform all these tasks, the Video Sight is equipped with an uncooled IR camera and a CCD camera, a laser range finder, a digital compass, angle sensors, and an electronic display, all highly integrated and controlled by an operating system.

  1. An environmental change detection and analysis tool using terrestrial video

    E-print Network

    Velez, Javier J.

    2006-01-01

    We developed a prototype system to detect and flag changes between pairs of geo-tagged videos of the same scene with similar camera trajectories. The purpose of the system is to help human video analysts detect threats ...

  2. Multi-Camera Saliency.

    PubMed

    Luo, Yan; Jiang, Ming; Wong, Yongkang; Zhao, Qi

    2015-10-01

    A significant body of literature on saliency modeling predicts where humans look in a single image or video. Besides the scientific goal of understanding how information is fused from multiple visual sources to identify regions of interest in a holistic manner, there are tremendous engineering applications of multi-camera saliency due to the widespread deployment of cameras. This paper proposes a principled framework to smoothly integrate visual information from multiple views into a global scene map, and to employ a saliency algorithm incorporating high-level features to identify the most important regions by fusing visual information. The proposed method has the following key distinguishing features compared with its counterparts: (1) the proposed saliency detection is global (salient regions from one local view may not be important in a global context), (2) it does not require special camera deployment or overlapping fields of view, and (3) the key saliency algorithm is effective in highlighting interesting object regions even though not a single detector is used. Experiments on several data sets confirm the effectiveness of the proposed principled framework. PMID:26340257

  3. The Development of the Spanish Fireball Network Using a New All-Sky CCD System

    NASA Astrophysics Data System (ADS)

    Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.; Llorca, J.; Fabregat, J.; Martínez, V. J.; Reglero, V.; Jelínek, M.; Kubánek, P.; Mateo, T.; Postigo, A. De Ugarte

    2004-12-01

    We have developed an all-sky charge-coupled device (CCD) automatic system for detecting meteors and fireballs that will be operative at four stations in Spain during 2005. The cameras were developed following the BOOTES-1 prototype installed at the El Arenosillo Observatory in 2002, which is based on a CCD detector of 4096 × 4096 pixels with a fish-eye lens that provides an all-sky image with enough resolution to make accurate astrometric measurements. Since late 2004, a pair of cameras at two of the four stations has operated with 30 s alternating exposures, allowing 100% time coverage. The stellar limiting magnitude of the images is +10 at the zenith, and +8 below ~65° of zenithal angle. As a result, the images provide enough comparison stars to make astrometric measurements of faint meteors and fireballs with an accuracy of ~2 arcminutes. Using this prototype, four automatic all-sky CCD stations have been developed, two in Andalusia and two in the Valencian Community, to start full operation of the Spanish Fireball Network. In addition to all-sky coverage, we are developing a fireball spectroscopy program using medium-field lenses with additional CCD cameras. Here we present the first images obtained from the El Arenosillo and La Mayora stations in Andalusia during their first months of activity. The detection of the Jan 27, 2003 superbolide of -17 ± 1 absolute magnitude that overflew Algeria and Morocco is an example of the detection capability of our prototype.

  4. CCD-based astrometric measurements of photographic plates

    NASA Astrophysics Data System (ADS)

    Bustos Fierro, I. H.; Calderon, J. H.

    2005-01-01

    A methodology for the astrometric measurement of photographic plates making use of a scientific-grade CCD camera was developed and tested on a Carte du Ciel plate. In order to measure a complete CdC plate, a mosaic of 64 frames with partial overlap in both coordinates was taken. To evaluate the accuracy of the stellar centroids, a MAMA-based digitization of the same plate was employed as a reference. A noticeable radial distortion produced by the optical system of the camera was found and corrected. The reduction to celestial coordinates was performed by means of the block-adjustment technique using Tycho-2 as the reference catalog. Differences with Tycho-2 suggest that the errors of the CCD-based positions obtained from the CdC plate are between 0.20 and 0.25 arcseconds. These positions are intended to be employed in the determination of proper motions at the few-mas/yr level, thereby allowing densification of the proper-motion system in regions of interest of the sky down to photographic magnitude 14.5 (for CdC plates) using a relatively low-cost device available at our own Observatory.
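
    A radial-distortion correction of the kind mentioned above is typically applied to each measured centroid before the block adjustment; a first-order sketch (the distortion center and the coefficient k1 are hypothetical fit parameters, not values from the paper):

```python
def undistort(x, y, cx, cy, k1):
    """First-order radial distortion correction of a measured centroid
    (x, y) about distortion center (cx, cy): each radius is rescaled
    by (1 + k1 * r^2), the usual one-coefficient radial model."""
    dx, dy = x - cx, y - cy
    scale = 1.0 + k1 * (dx * dx + dy * dy)
    return cx + dx * scale, cy + dy * scale
```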

  5. Innovative Solution to Video Enhancement

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Through a licensing agreement, Intergraph Government Solutions adapted a technology originally developed at NASA's Marshall Space Flight Center for enhanced video imaging by developing its Video Analyst(TM) System. Marshall's scientists developed the Video Image Stabilization and Registration (VISAR) technology to help FBI agents analyze video footage of the deadly 1996 Olympic Summer Games bombing in Atlanta, Georgia. VISAR technology enhanced nighttime videotapes made with hand-held camcorders, revealing important details about the explosion. Intergraph's Video Analyst System is a simple, effective, and affordable tool for video enhancement and analysis. The benefits associated with the Video Analyst System include support of full-resolution digital video, frame-by-frame analysis, and the ability to store analog video in digital format. Up to 12 hours of digital video can be stored and maintained for reliable footage analysis. The system also includes state-of-the-art features such as stabilization, image enhancement, and convolution to help improve the visibility of subjects in the video without altering underlying footage. Adaptable to many uses, Intergraph's Video Analyst System meets the stringent demands of the law enforcement industry in the areas of surveillance, crime scene footage, sting operations, and dash-mounted video cameras.

  6. The Dark Energy Camera (DECam)

    E-print Network

    K. Honscheid; D. L. DePoy; for the DES Collaboration

    2008-10-20

    In this paper we describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). It consists of a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), a modern data acquisition and control system and the associated infrastructure for operation in the prime focus cage. The focal plane includes 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view and 12 smaller 2K x 2K CCDs for guiding, focus and alignment. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.

  7. Secure authenticated video equipment

    SciTech Connect

    Doren, N.E.

    1993-07-01

    In the verification technology arena, there is a pressing need for surveillance and monitoring equipment that produces authentic, verifiable records of observed activities. Such a record provides the inspecting party with confidence that observed activities occurred as recorded, without undetected tampering or spoofing having taken place. The secure authenticated video equipment (SAVE) system provides an authenticated series of video images of an observed activity. Being self-contained and portable, it can be installed as a stand-alone surveillance system or used in conjunction with existing monitoring equipment in a non-invasive manner. Security is provided by a tamper-proof camera enclosure containing a private, electronic authentication key. Video data is transferred over a communication link consisting of a coaxial cable, fiber-optic link or other similar media. A video review station, located remotely from the camera, receives, validates, displays and stores the incoming data. Video data is validated within the review station using a public key, a copy of which is held by authorized parties. This scheme allows the holder of the public key to verify the authenticity of the recorded video data but precludes undetectable modification of the data, which is generated with the tamper-protected private authentication key.
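
    SAVE authenticates frames with an asymmetric key held inside the tamper-proof enclosure. A stdlib-only sketch of the weaker but related integrity-chaining idea (an unkeyed hash chain, not SAVE's public-key scheme): each digest covers the frame plus the previous digest, so modifying any frame changes every later digest and tampering is detectable.

```python
import hashlib

def chain_frames(frames):
    """Build a tamper-evident hash chain over frame byte strings.
    Each link hashes the frame together with the previous digest;
    altering any frame invalidates all subsequent digests."""
    digest = b"\x00" * 32           # fixed initial link
    chain = []
    for frame in frames:
        digest = hashlib.sha256(digest + frame).digest()
        chain.append(digest)
    return chain

def verify_chain(frames, chain):
    """Recompute the chain from the frames and compare digests."""
    return chain_frames(frames) == chain
```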

  8. Development of the miniature video docking sensor

    NASA Astrophysics Data System (ADS)

    Rodgers, Lennon; Nolet, Simon; Miller, David W.

    2006-05-01

    To perform realistic demonstrations of autonomous docking maneuvers using micro-satellites, the MIT Space Systems Laboratory (SSL) developed a miniature universal docking port along with an optical sensing system for relative state estimation. The docking port has an androgynous design and is universal since any two identical ports can be connected together. After a rigid connection is made, it is capable of passing electrical loads between the connected micro-satellites. The optical sensor uses a set of infrared LED's, a miniature CCD-based video camera, and an Extended Kalman Filter to determine the six relative degrees of freedom of the docking satellite. The SPHERES testbed, also developed by the MIT SSL, was used to demonstrate the integrated docking port and sensor system. This study focuses on the development of the optical docking sensor, and presents test results collected to date during fully autonomous docking experiments performed at the MIT SSL 2-D laboratory. Tests were performed to verify the validity of the docking sensor by taking measurements at known distances. These results give an estimate of the sensor accuracy, and are compared with a theoretical model to understand the sources of error in the state measurements.

  9. Exposing Digital Forgeries in Interlaced and De-Interlaced Video

    E-print Network

    Bucci, David J.

    Exposing Digital Forgeries in Interlaced and De-Interlaced Video. Weihong Wang, Student Member, IEEE, and Hany Farid, Member, IEEE. Abstract: With the advent of high-quality digital video cameras and sophisticated video editing software, it is becoming increasingly easier to tamper with digital video. A growing…

  10. Video overlay of GPS precision timestamps Matt Montanaro

    E-print Network

    Richmond, Michael W.

    KIWIOSD: video overlay of GPS precision timestamps. Matt Montanaro; Dr. Michael Richmond, Rochester… Purpose of the KIWIOSD: the KIWIOSD device timestamps video frames to millisecond precision. A video feed from a source (video camera) is sent into the KIWI. A timing signal from a GPS receiver…

  11. Streaming Audio and Video: New Challenges and Opportunities for Museums.

    ERIC Educational Resources Information Center

    Spadaccini, Jim

    Streaming audio and video present new challenges and opportunities for museums. Streaming media is easier to author and deliver to Internet audiences than ever before; digital video editing is commonplace now that the tools--computers, digital video cameras, and hard drives--are so affordable; the cost of serving video files across the Internet…

  12. Ambient-Light-Canceling Camera Using Subtraction of Frames

    NASA Technical Reports Server (NTRS)

    Morookian, John Michael

    2004-01-01

    The ambient-light-canceling camera (ALCC) is a proposed near-infrared electronic camera that would utilize a combination of (1) synchronized illumination during alternate frame periods and (2) subtraction of readouts from consecutive frames to obtain images without a background component of ambient light. The ALCC is intended especially for use in tracking the motion of an eye by the pupil center corneal reflection (PCCR) method. Eye tracking by the PCCR method has shown potential for application in human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological deficiencies. In the PCCR method, an eye is illuminated by near-infrared light from a light-emitting diode (LED). Some of the infrared light is reflected from the surface of the cornea. Some of the infrared light enters the eye through the pupil and is reflected from the back of the eye out through the pupil, a phenomenon commonly observed as the red-eye effect in flash photography. An electronic camera is oriented to image the user's eye. The output of the camera is digitized and processed by algorithms that locate the two reflections. Then, from the locations of the centers of the two reflections, the direction of gaze is computed. As described thus far, the PCCR method is susceptible to errors caused by reflections of ambient light. Although a near-infrared band-pass optical filter can be used to discriminate against ambient light, some sources of ambient light have enough in-band power to compete with the LED signal. The mode of operation of the ALCC would complement or supplant spectral filtering by providing more nearly complete cancellation of the effect of ambient light. In the operation of the ALCC, a near-infrared LED would be pulsed on during one camera frame period and off during the next frame period.
Thus, the scene would be illuminated by both the LED (signal) light and the ambient (background) light during one frame period, and would be illuminated with only ambient (background) light during the next frame period. The camera output would be digitized and sent to a computer, wherein the pixel values of the background-only frame would be subtracted from the pixel values of the signal-plus-background frame to obtain signal-only pixel values (see figure). To prevent artifacts of motion from entering the images, it would be necessary to acquire image data at a rate greater than the standard video rate of 30 frames per second. For this purpose, the ALCC would exploit a novel control technique developed at NASA's Jet Propulsion Laboratory for advanced charge-coupled-device (CCD) cameras. This technique provides for readout from a subwindow [region of interest (ROI)] within the image frame. Because the desired reflections from the eye would typically occupy a small fraction of the area within the image frame, the ROI capability would make it possible to acquire and subtract pixel values at rates of several hundred frames per second, considerably greater than the standard video rate and sufficient to both (1) suppress motion artifacts and (2) track the motion of the eye between consecutive subtractive frame pairs.
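
    The consecutive-frame subtraction is a one-liner in array form; a minimal NumPy sketch (widening to int16 before subtracting avoids uint8 wraparound, and negative differences are clipped to zero):

```python
import numpy as np

def cancel_ambient(signal_plus_bg, bg_only):
    """Subtract the background-only frame from the LED+background frame
    to recover the LED-illuminated signal. Both inputs are uint8 arrays
    covering the same ROI on consecutive frame periods."""
    diff = signal_plus_bg.astype(np.int16) - bg_only.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```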

  13. Design of area array CCD image acquisition and display system based on FPGA

    NASA Astrophysics Data System (ADS)

    Li, Lei; Zhang, Ning; Li, Tianting; Pan, Yue; Dai, Yuming

    2014-09-01

    With the development of science and technology, the CCD (charge-coupled device) has been widely applied in various fields and plays an important role in modern sensing systems; therefore, researching a real-time image acquisition and display design based on a CCD device has great significance. This paper introduces an image data acquisition and display system for an area-array CCD based on an FPGA. Several key technical challenges of the system are analyzed and solutions put forward. The FPGA works as the core processing unit in the system and controls the overall timing sequence. The ICX285AL area-array CCD image sensor produced by SONY Corporation is used in the system. The FPGA drives the area-array CCD, then an analog front end (AFE) processes the CCD image signal, including amplification, filtering, noise elimination, and CDS (correlated double sampling). An AD9945 produced by ADI Corporation converts the analog signal to a digital signal. A Camera Link high-speed data-transmission circuit was developed, the PC-side software for image acquisition was completed, and real-time display of images was realized. Practical testing indicates that the system is stable and reliable in image acquisition and control, and its indicators meet the actual project requirements.

  14. Intelligent real-time CCD data processing system based on variable frame rate

    NASA Astrophysics Data System (ADS)

    Chen, Su-ting

    2009-07-01

    In order to meet the need for image shooting with a CCD in unmanned aerial vehicles, a real-time high-resolution CCD data processing system based on a variable frame rate is designed. The system consists of three modules: a CCD control module, a data processing module and a data display module. In the CCD control module, real-time flight parameters (e.g. flight height, velocity and longitude) are received from GPS through a UART (Universal Asynchronous Receiver Transmitter), and the variable frame rate is calculated from the corresponding flight parameters. Based on the calculated variable frame rate, the CCD external-synchronization control pulse signal is generated under the control of the FPGA, and the CCD data is then read out. In the data processing module, data segmentation is designed to extract the ROI (region of interest), whose resolution equals the valid data resolution of the HDTV standard conforming to SMPTE (1080i). On one hand, a ping-pong SRAM storage controller is designed in the FPGA to store ROI data in real time. On the other hand, according to the needs of intelligent observing, a changeable window position is designed, and a flexible area of interest is obtained. In the real-time display module, a special video encoder is used to accomplish data format conversion. Data after storage is packed into HDTV format by creating the corresponding format information in the FPGA. Through internal register configuration, a high-definition analog video signal is implemented. The entire system has been implemented in an FPGA and validated. It has been used in various real-time CCD data processing situations.
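
    The abstract does not give the formula relating flight parameters to frame rate; one plausible derivation holds the along-track overlap between consecutive frames fixed, so that the rate rises with ground speed and falls with altitude. A sketch under that assumption (the field of view and overlap fraction are illustrative, not values from the paper):

```python
import math

def frame_rate(height_m, speed_mps, fov_deg=30.0, overlap=0.6):
    """Frames per second needed to keep a fixed along-track overlap
    between consecutive frames, given flight height and ground speed.
    footprint: along-track ground coverage of one frame."""
    footprint = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0)
    advance = footprint * (1.0 - overlap)   # new ground covered per frame
    return speed_mps / advance
```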

  15. Multi-Camera Activity Correlation Analysis Chen Change Loy, Tao Xiang and Shaogang Gong

    E-print Network

    Gong, Shaogang

    …camera activity analysis methods [6, 12, …]. Figure 1: Three consecutive frames from a typical public-space CCTV video. […] both assumptions are largely invalid for activities captured by CCTV cameras in public spaces typ…

  16. MECHANICAL ADVANCING HANDLE THAT SIMPLIFIES MINIRHIZOTRON CAMERA REGISTRATION AND IMAGE COLLECTION

    EPA Science Inventory

    Minirhizotrons in conjunction with a minirhizotron video camera system are becoming widely used tools for investigating root production and survival in a variety of ecosystems. Image collection with a minirhizotron camera can be time consuming and tedious, particularly when hundre...

  17. Rugged Video System For Inspecting Animal Burrows

    NASA Technical Reports Server (NTRS)

    Triandafils, Dick; Maples, Art; Breininger, Dave

    1992-01-01

    Video system designed for examining interiors of burrows of gopher tortoises, 5 in. (13 cm) in diameter or greater, to depth of 18 ft. (about 5.5 m), includes video camera, video cassette recorder (VCR), television monitor, control unit, and power supply, all carried in backpack. Polyvinyl chloride (PVC) poles used to maneuver camera into (and out of) burrows, stiff enough to push camera into burrow, but flexible enough to bend around curves. Adult tortoises and other burrow inhabitants observable, young tortoises and such small animals as mice obscured by sand or debris.

  18. What Counts as Educational Video?: Working toward Best Practice Alignment between Video Production Approaches and Outcomes

    ERIC Educational Resources Information Center

    Winslett, Greg

    2014-01-01

    The twenty years since the first digital video camera was made commercially available have seen significant increases in the use of low-cost, amateur video productions for teaching and learning. In the same period, production and consumption of professionally produced video have also increased, as have the distribution platforms used to access it.…

  19. CCD research. [design, fabrication, and applications

    NASA Technical Reports Server (NTRS)

    Gassaway, J. D.

    1976-01-01

    The fundamental problems encountered in designing, fabricating, and applying CCD's are reviewed. Investigations are described and results and conclusions are given for the following: (1) the development of design analyses employing computer-aided techniques and their application to the design of a gapped structure; (2) the role of CCD's in applications to electronic functions, in particular signal processing; (3) extending the CCD to silicon films on sapphire (SOS); and (4) an all-aluminum transfer structure with low-noise input-output circuits. Related work on CCD imaging devices is summarized.

  20. Hazmat Cam Wireless Video System

    SciTech Connect

    Kevin L. Young

    2006-02-01

    This paper describes the Hazmat Cam Wireless Video System and its application to emergency response involving chemical, biological or radiological contamination. The Idaho National Laboratory designed the Hazmat Cam Wireless Video System to assist the National Guard Weapons of Mass Destruction - Civil Support Teams during their mission of emergency response to incidents involving weapons of mass destruction. The lightweight, handheld camera transmits encrypted, real-time video from inside a contaminated area, or hot-zone, to a command post located a safe distance away. The system includes a small wireless video camera, a true-diversity receiver, viewing console, and an optional extension link that allows the command post to be placed up to five miles from danger. It can be fully deployed by one person in a standalone configuration in less than 10 minutes. The complete system is battery powered. Each rechargeable camera battery powers the camera for 3 hours with the receiver and video monitor battery lasting 22 hours on a single charge. The camera transmits encrypted, low frequency analog video signals to a true-diversity receiver with three antennas. This unique combination of encryption and transmission technologies delivers encrypted, interference-free images to the command post under conditions where other wireless systems fail. The lightweight camera is completely waterproof for quick and easy decontamination after use. The Hazmat Cam Wireless Video System is currently being used by several National Guard Teams, the US Army, and by fire fighters. The system has been proven to greatly enhance situational awareness during the crucial, initial phase of a hazardous response allowing commanders to make better, faster, safer decisions.

  1. A low-cost, high-resolution, video-rate imaging optical radar

    SciTech Connect

    Sackos, J.T.; Nellums, R.O.; Lebien, S.M.; Diegert, C.F.; Grantham, J.W.; Monson, T.

    1998-04-01

    Sandia National Laboratories has developed a unique type of portable, low-cost range imaging optical radar (laser radar or LADAR). This innovative sensor comprises an active floodlight scene illuminator and an image-intensified CCD camera receiver. It is a solid-state device (no moving parts) that offers significant size, performance, reliability, and simplicity advantages over other types of 3-D imaging sensors. This unique flash LADAR is based on low-cost, commercially available hardware and is well suited for many government and commercial uses. This paper presents an update on Sandia's development of the Scannerless Range Imager technology and applications, and discusses the progress that has been made in evolving the sensor into a compact, low-cost, high-resolution, video-rate Laser Dynamic Range Imager.

  2. Methode nouvelle pour la mesure CCD du diametre solaire avec un astrolabe. A new method for CCD measurements of the solar diameter with an astrolabe

    NASA Astrophysics Data System (ADS)

    Sinceac, V.; Chollet, F.; Laclare, F.; Delmas, C.

    1998-03-01

    Observing the Solar disk is a challenge and, as with past visual observations, many results depend on the observers and/or instruments, owing to differences in the visual perception of the Sun's limb, instrumental errors and atmospheric disturbances. After a long series of visual observations at Calern Observatory, Francis Laclare felt the need for more impersonal and automatic measurements of the Solar diameter. After a series of analog CCD measurements (1989-1995), digital data acquisition and processing were tested by the Paris Observatory group (F. Chollet and V. Sinceac) during spring 1996 at Calern Observatory. Before starting a new continuous campaign of observations, intended to confirm possible variations of the diameter and solar flatness, the aim is to find the best definition of the solar edge. The test campaign was spent comparing different solutions on two different astrolabes at Calern Observatory: the ``classical'' one, outfitted with eleven zerodur ceramic prisms (S astrolabe), which has been used for twenty years in the Laclare series, and an instrument equipped with a varying-angle prism (V astrolabe) enabling many measurements (385 in 1996) for perfecting the know-how. This article focuses on the acquisition techniques and their feasibility. Two procedures were tried: the first used the direct and reflected images alternately (separated using a revolving shutter in front of the objective), while the second mathematically separates the two components inside the computer (an image being a two-dimensional array of numbers). According to the principle of the astrolabe, the measured quantity is the exact time at which the Sun's edge crosses the parallel of altitude (defined by the prism angle), i.e. the time of merging of the two images of the Sun in the focal plane of the telescope, where the CCD matrix stands.
The Solar edge for one frame is then defined as the set of inflection points of the luminosity function along each of the 256 useful lines (the matrix is 512 by 512 pixels). This means that a numerical derivative is computed on every other line of the CCD video camera, which has to stand as vertical as possible. Then, for every frame, a parabola is fitted through the 256 points using the least-squares method. The apex of this parabola marks the prospective characteristic point. The sets of such points, together with the corresponding acquisition times, are collected for both images, and the exact time of contact of the two images is obtained; this is also the time at which the solar edge crosses the almucantar. The semi-diameter results obtained during the 1996 campaign are derived from sixty measurements with the revolving mask and sixty-seven without it, performed on the Solar Astrolabe. They give a mean value of 959\\farcs39 +/- 0\\farcs03 with a scatter of 0\\farcs29. It is interesting to note that the error bar and the scatter do not depend on the definition of the Solar edge, whereas the mean value does. The method introduces a systematic error that slightly shrinks the diameter, but its value can be determined statistically and the correction is easily applied. Choosing the best definition of the Solar edge will be the subject of a following article. The main advantage of such a digital acquisition procedure should be stressed: it allows the full data to be stored for further reference and, possibly, better future processing.
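
    The per-line inflection-point detection and least-squares parabola fit described above can be sketched on synthetic data. The maximum of the discrete gradient stands in for the inflection point of the luminosity function, and the fit uses plain normal equations; the real processing runs on 256 CCD lines per frame:

```python
def edge_position(profile):
    """Limb point of one scan line: index of the steepest intensity
    change, a discrete stand-in for the inflection point of the
    luminosity function."""
    grads = [abs(profile[i + 1] - profile[i]) for i in range(len(profile) - 1)]
    return grads.index(max(grads))

def fit_parabola(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c via the 3x3 normal
    equations, solved by Gaussian elimination (no external libraries)."""
    n = len(xs)
    s = lambda p: sum(x ** p for x in xs)
    t = lambda p: sum(y * x ** p for x, y in zip(xs, ys))
    m = [[s(4), s(3), s(2), t(2)],
         [s(3), s(2), s(1), t(1)],
         [s(2), s(1), n,    t(0)]]
    for i in range(3):                      # forward elimination
        for j in range(i + 1, 3):
            f = m[j][i] / m[i][i]
            m[j] = [mj - f * mi for mj, mi in zip(m[j], m[i])]
    coef = [0.0, 0.0, 0.0]                  # back substitution
    for i in (2, 1, 0):
        coef[i] = (m[i][3] - sum(m[i][k] * coef[k]
                                 for k in range(i + 1, 3))) / m[i][i]
    return coef  # a, b, c

# Synthetic limb: per-line edge positions lying on a parabola, apex at x = 5.
xs = list(range(11))
ys = [-(x - 5) ** 2 + 30 for x in xs]
a, b, c = fit_parabola(xs, ys)
apex = -b / (2 * a)                         # the characteristic point
```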

  3. Improved video guidance sensor for automated docking

    NASA Astrophysics Data System (ADS)

    Howard, Richard T.; Book, Michael L.

    1995-06-01

    The Video Guidance Sensor (VGS) has been developed by NASA's Marshall Space Flight Center (MSFC) to provide the capability for a spacecraft to find and track a target vehicle and determine the relative positions and attitudes between the sensor and the target. The sensor uses laser diodes to illuminate the target, a CCD-based camera to sense the target, and a frame-grabber and processor to convert the video information into relative range, azimuth, elevation, roll, pitch, and yaw. The sensor was first built in 1988 and used in successful automated docking experiments using the air-bearing spacecraft simulator in MSFC's Flight Robotics Laboratory. Since then, many changes and improvements have been made, based on the results of testing. In addition to the use of this system for space vehicles, it has been adapted for commercial application. The current design is being built as a prototype to prepare for flight testing on the Space Shuttle. Some of the changes from the original system were designed to improve the noise rejection of the system. Other changes were made to improve the overall range of operation of the system, and still other changes improved the bandwidth of the system. The current VGS is designed to operate from 110 meters down to 0.5 meters and output the relative position and attitude data at 5 Hz. The system will be able to operate under any orbital lighting conditions from full solar illumination to complete darkness. The VGS is also designed to be used with more than one target and sensor to allow for redundant configurations. This new prototype should be completed and undergoing open- and closed-loop testing after March 1995.
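
    The conversion of a laser-illuminated target spot's position on the image plane into azimuth and elevation can be caricatured with a simple pinhole model. The focal length and spot coordinates below are hypothetical, and the actual VGS goes much further, solving for the full relative position and attitude (range, azimuth, elevation, roll, pitch, yaw):

```python
import math

def spot_to_az_el(x_mm, y_mm, focal_mm):
    """Pinhole-model sketch: a target spot imaged at (x, y) on the CCD
    corresponds to azimuth/elevation angles measured from the camera
    boresight (degrees)."""
    az = math.degrees(math.atan2(x_mm, focal_mm))
    el = math.degrees(math.atan2(y_mm, focal_mm))
    return az, el

# Spot 2 mm right of and 1 mm below boresight, hypothetical 16 mm lens:
az, el = spot_to_az_el(2.0, -1.0, 16.0)
```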

  4. Mobile Video in Everyday Social Interactions

    NASA Astrophysics Data System (ADS)

    Reponen, Erika; Lehikoinen, Jaakko; Impiö, Jussi

    Video recording has become a spontaneous everyday activity for many people, thanks to the video capabilities of modern mobile phones. The Internet connectivity of mobile phones enables fluent sharing of captured material, even in real time, which makes video an up-and-coming everyday interaction medium. In this article we discuss the effect of the video camera on the social environment in everyday life situations, based mainly on a study in which four groups of people used digital video cameras in their normal settings. We also reflect on another study of ours on real-time mobile video communication and discuss future views. The aim of our research is to understand the possibilities of the mobile video domain. Live and delayed sharing seem to have their own special characteristics: live video is used as a virtual window between places, whereas delayed video usage has more scope for good-quality content. While this novel way of interacting via mobile video enables new social patterns, it also raises new concerns about privacy and trust between the participants in all roles, largely due to the widely spreading possibilities of videos. Video in a social situation affects the cameramen (who record), the targets (who are recorded), passers-by (who are unintentionally in the situation), and the audience (who follow the videos or recording situations), but also the other way around: the participants affect the video through their varying and evolving personal and communicational motivations for recording.

  5. Traffic camera markup language (TCML)

    NASA Astrophysics Data System (ADS)

    Cai, Yang; Bunn, Andrew; Snyder, Kerry

    2012-01-01

    In this paper, we present a novel video markup language for articulating semantic traffic data from surveillance cameras and other sensors. The markup language includes three layers: sensor descriptions, traffic measurement, and application interface descriptions. A multi-resolution video codec algorithm enables quality-of-service-aware video streaming according to the data traffic. A set of object detection APIs has been developed using Convex Hull and Adaptive Proportion models and 3D modeling. It is found that our approach outperforms 3D modeling and Scale-Invariant Feature Transform (SIFT) algorithms in terms of robustness. Furthermore, our empirical data show that it is feasible to use TCML to facilitate real-time communication between an infrastructure and a vehicle for safer and more efficient traffic control.

  6. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetric purposes is not widespread, because until recently the images provided by the sensors, in either still or video capture mode, were not large enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of an action camera, a careful and reliable self-calibration must be applied prior to any photogrammetric procedure; this is a relatively difficult scenario because of the camera's short focal length and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images at up to 12 Mp and video at up to 8 Mp resolution. PMID:25237898
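
    For wide-angle action-camera lenses, the distortion that such a self-calibration must estimate is dominated by radial terms. A minimal sketch of the radial part of the Brown-Conrady model follows; the coefficient values are made up for illustration and are not GoPro Hero 3 calibration results:

```python
def distort(x, y, k1, k2):
    """Apply the radial part of the Brown-Conrady lens model to a
    normalized image point (x, y): each coordinate is scaled by
    (1 + k1*r^2 + k2*r^4), where r is the distance from the optical
    axis. Barrel distortion (typical of wide-angle lenses) has k1 < 0,
    pulling points toward the center."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Hypothetical barrel distortion applied to a corner point:
xd, yd = distort(0.5, 0.5, k1=-0.25, k2=0.02)
```

    Undistortion (what the calibration software produces) inverts this mapping, usually by iterating the model, since no closed-form inverse exists for two or more coefficients.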

  7. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetric purposes is not widespread, because until recently the images provided by the sensors, in either still or video capture mode, were not large enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of an action camera, a careful and reliable self-calibration must be applied prior to any photogrammetric procedure; this is a relatively difficult scenario because of the camera's short focal length and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images at up to 12 Mp and video at up to 8 Mp resolution. PMID:25237898

  8. State of the art in video system performance

    NASA Technical Reports Server (NTRS)

    Lewis, Michael J.

    1990-01-01

    The closed-circuit television (CCTV) system onboard the Space Shuttle comprises cameras, a video signal switching and routing unit (VSU), and the Space Shuttle video tape recorder. However, this system is inadequate for use with many experiments that require video imaging. In order to assess the state of the art in video technology and data storage systems, a survey was conducted of High Resolution, High Frame Rate Video Technology (HHVT) products. The performance of state-of-the-art solid-state cameras and image sensors, video recording systems, data transmission devices, and data storage systems is shown graphically against users' requirements.

  9. LSST Camera Electronics

    NASA Astrophysics Data System (ADS)

    Newcomer, F. Mitchell; Bailey, S.; Britton, C. L.; Felt, N.; Geary, J.; Hashimi, K.; Lebbolo, H.; Lebbolo, H.; Ning, Z.; O'Connor, P.; Oliver, J.; Radeka, V.; Sefri, R.; Tocut, V.; Van Berg, R.

    2009-01-01

    The 3.2 Gpixel LSST camera will be read out by means of 189 highly segmented 4K x 4K CCDs. A total of 3024 video channels will be processed by a modular, in-cryostat electronics package based on two custom multichannel analog ASICs now in development. Performance goals of 5 electrons noise, 0.01% electronic crosstalk, and 80 mW power dissipation per channel are targeted. The focal plane is organized as a set of 12K x 12K sub-mosaics ("rafts") with front-end electronics housed in an enclosure falling within the footprint of the CCDs making up the raft. The assembly of CCDs, baseplate, electronics boards, and cooling components constitutes a self-contained and testable 144 Mpix imager ("raft tower"), and 21 identical raft towers make up the LSST science focal plane. Electronic, mechanical, and thermal prototypes are now undergoing testing and results will be presented at the meeting.

  10. CCD temperature control CTIO 60 inches Chiron

    E-print Network

    Tokovinin, Andrei A.

    CCD temperature control, CTIO 60 inches, Chiron. CHI60HF4.1, La Serena, November 2009. (Front matter: contents; Figure 3: Long-term temperature stability.) Introduction: The goal…

  11. Robotic CCD microscope for enhanced crystal recognition

    DOEpatents

    Segelke, Brent W. (San Ramon, CA); Toppani, Dominique (Livermore, CA)

    2007-11-06

    A robotic CCD microscope and procedures to automate crystal recognition. The robotic CCD microscope and procedures enable more accurate crystal recognition, leading to fewer false negatives and fewer false positives, and enable the detection of smaller crystals than other methods available today.

  12. Mars exploration rover engineering cameras Allan Eisenman,1

    E-print Network

    …by solar imaging. The rovers have six additional cameras that will be used exclusively for engineering and hazard avoidance. Keywords: Mars, Mars Exploration Rover, CCD, surface navigation, hazard avoidance, solar imaging. They are independent of the landing vehicle and therefore communicate directly with Earth. Also, the rovers…

  13. An airborne four-camera imaging system for agricultural applications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

  14. "Desktop Video" Uses Synergy of Computer, Video Camera.

    ERIC Educational Resources Information Center

    Ferraro, Carl David

    1989-01-01

    Describes the uses of the Amiga computer, a "digital canvas" that creates computer-generated art and graphic displays. Examines how computer-graphics programs can be used in a course that is totally interactive with the computer. (MM)

  15. Combining Content-based Analysis and Crowdsourcing to Improve User Interaction with Zoomable Video

    E-print Network

    Grigoras, .Romulus

    …that defines a rectangular region in the high-resolution video from which the displayed video is cropped. […] video sensors have led to video cameras that are capable of capturing high-resolution videos. […] sequences of resolution as high as 7,680×4,320 (UHDTV) have been recorded and transmitted […]

  16. NFC - Narrow Field Camera

    NASA Astrophysics Data System (ADS)

    Koukal, J.; Srba, J.; Gorková, S.

    2015-01-01

    We have been introducing a low-cost CCTV video system for faint meteor monitoring, and here we describe the first results from 5 months of two-station operation. Our system, called NFC (Narrow Field Camera), with a meteor limiting magnitude around +6.5 mag, allows research on the trajectories of less massive meteoroids within individual parent meteor showers and the sporadic background. At present, 4 stations (2 pairs with coordinated fields of view) of the NFC system are operated in the frame of CEMeNt (Central European Meteor Network). The heart of each NFC station is a sensitive CCTV camera Watec 902 H2 and a fast cinematographic lens Meopta Meostigmat 1/50 - 52.5 mm (50 mm focal length and fixed aperture f/1.0). In this paper we present the first results based on 1595 individual meteors, 368 of which were recorded from two stations simultaneously. This data set allows the first empirical verification of the theoretical assumptions about the NFC system's capabilities (stellar and meteor magnitude limit, meteor apparent brightness distribution and accuracy of single-station measurements) and the first low-mass meteoroid trajectory calculations. Our experimental data clearly show the capability of the proposed system for low-mass meteor registration and show that calculations based on NFC data lead to a significant refinement of the orbital elements for low-mass meteoroids.

  17. Stereoscopic camera design

    NASA Astrophysics Data System (ADS)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films etc. However, the consumer would also like to see real-world stereoscopic images, pictures of family, holiday snaps etc. Such scenery would have wide ranges of depth to accommodate and would need also to cope with moving objects, such as cars, and in particular other people. Thus, the consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper will cover an analysis of existing stereoscopic camera designs and show that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper will go on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  18. Deshaking Endoscopic Video for Kymography David C. Schneider

    E-print Network

    Eisert, Peter

    Deshaking Endoscopic Video for Kymography. David C. Schneider, Anna Hilsmann, Peter Eisert: The vibrating folds are filmed with an endoscopic camera pointed into the larynx. The camera records at a high […] diagnosis is a time-slice image, i.e. an X-t cut through the X-Y-t image cube of the endoscopic video (fig…

  19. Through the Eye of the Camera: A Teacher's View of Video-Conferencing Mathematics Teacher, Gaston County Schools, Gastonia, NC USA jtpotter@gaston.k12.nc.us

    E-print Network

    Spagnolo, Filippo

    …video-conferencing has gained popularity in education in the U.S. Once the domain of universities, classes were video-taped lectures and correspondence tests sent through the mail. Over the last fifteen […] location and view and participate in a class located in another. Because college distance education…

  20. VizieR Online Data Catalog: CCD observations of saturnian satellites (Grosheva+, 2011)

    NASA Astrophysics Data System (ADS)

    Grosheva, E. A.; Izmailov, I. S.; Kiseleva, T. P.

    2011-10-01

    All observations were carried out with the 26-inch Zeiss refractor (D=650mm, F=10413mm, scale 19.80"/mm) at Pulkovo (observatory code 084). A CCD camera FLI Pro Line 09000 (3056x3056 pix) was used. The scale per pixel is 0.24" and the field observed with this camera is 12'x12'. The exposure time is 1.5 s. All presented positions are topocentric. Ephemerides for comparison are given by the web server "Natural Satellites Ephemeride Server MULTI-SAT" developed by N.V. Emelyanov (http://lnfm1.sai.msu.ru/neb/nss/nssephmr.htm). (3 data files).
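
    The quoted 0.24"/pixel scale is consistent with the 10413 mm focal length under the small-angle relation. The 12 µm pixel size assumed below is a typical value for this class of camera, not a figure stated in the record:

```python
def pixel_scale_arcsec(pixel_um, focal_mm):
    """Small-angle pixel scale: 206265 arcsec per radian times the
    pixel size (converted to mm) over the focal length."""
    return 206265.0 * (pixel_um * 1e-3) / focal_mm

# Assumed 12 um pixels on the 10413 mm refractor -> about 0.24"/pixel:
scale = pixel_scale_arcsec(12.0, 10413.0)
```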

  1. Coeficientes de extinción del CASLEO y características del CCD directo con el telescopio JS

    NASA Astrophysics Data System (ADS)

    Baume, G. L.; Campuzano-Castro, F.; Fernández-Lajús, E.; Gamen, R.; Haucke, M.; Marchesini, E. J.; Molina-Lera, J. A.; Rossignoli, N. L.; San Sebastián, I. L.; Tello Huanca, E. L.; Zanardi, M.

    We computed atmospheric extinction coefficients at the Complejo Astronómico El Leoncito (CASLEO) in the UBVRI bands. They were obtained from photometric observations carried out in December 2011 using the 2.15-m ``Jorge Sahade'' Telescope (JS), and we compared them with previously published values. We also computed several instrumental parameters of the ROPER CCD 1300B camera working in direct mode and present the results of a linearity test. FULL TEXT IN SPANISH

  2. Nonlinear feedback model attitude control using CCD in magnetic suspension system

    NASA Technical Reports Server (NTRS)

    Lin, CHIN-E.; Hou, Ann-San

    1994-01-01

    A model attitude control system for a CCD camera magnetic suspension system is studied in this paper. In recent work, a position and attitude sensing method was proposed; with it, the model position and attitude of a magnetic suspension system can be detected through digital outputs. Based on this achievement, a control system design using nonlinear feedback techniques for magnetically suspended model attitude control is proposed.

  3. Mechanical Design of the LSST Camera

    SciTech Connect

    Nordby, Martin; Bowden, Gordon; Foss, Mike; Guiffre, Gary; Ku, John; Schindler, Rafe; /SLAC

    2008-06-13

    The LSST camera is a tightly packaged, hermetically sealed system that is cantilevered into the main beam of the LSST telescope. It comprises three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce the impact on image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis of the camera body and cryostat are detailed.

  4. Video alignment for change detection.

    PubMed

    Diego, Ferran; Ponsa, Daniel; Serrat, Joan; López, Antonio M

    2011-07-01

    In this work, we address the problem of aligning two video sequences. Such alignment refers to synchronization, i.e., the establishment of temporal correspondence between frames of the first and second video, followed by spatial registration of all the temporally corresponding frames. Video synchronization and alignment have been attempted before, but most often in the relatively simple cases of fixed or rigidly attached cameras and simultaneous acquisition. In addition, restrictive assumptions have been applied, including linear time correspondence or knowledge of the complete trajectories of corresponding scene points; to some extent, these assumptions limit the practical applicability of any solutions developed. We intend to solve the more general problem of aligning video sequences recorded by independently moving cameras that follow similar trajectories, based only on the fusion of image intensity and GPS information. The novelty of our approach is to pose the synchronization as a MAP inference problem on a Bayesian network including the observations from these two sensor types, which have been proved complementary. Alignment results are presented in the context of videos recorded from vehicles driving along the same track at different times, for different road types. In addition, we explore two applications of the proposed video alignment method, both based on change detection between aligned videos. One is the detection of vehicles, which could be of use in ADAS. The other is online difference spotting in videos of surveillance rounds. PMID:21118773
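
    The synchronization step can be caricatured as nearest-position matching of GPS fixes along the shared track. The paper actually poses synchronization as MAP inference on a Bayesian network fusing image intensity and GPS, so the function below is only the naive baseline such an approach improves on:

```python
import bisect

def sync_frames(gps_a, gps_b):
    """Naive temporal correspondence: for each frame of sequence A
    (given as its along-track position in metres, sorted ascending),
    find the frame of sequence B recorded nearest to the same
    position on the track."""
    pairs = []
    for i, pos in enumerate(gps_a):
        j = bisect.bisect_left(gps_b, pos)
        # pick the closer of the two neighbouring candidates
        best = min((c for c in (j - 1, j) if 0 <= c < len(gps_b)),
                   key=lambda c: abs(gps_b[c] - pos))
        pairs.append((i, best))
    return pairs

# Two passes over the same road, sampled at different positions (metres):
a = [0.0, 10.0, 20.0, 30.0]
b = [1.0, 9.0, 21.0, 28.0, 33.0]
print(sync_frames(a, b))  # -> [(0, 0), (1, 1), (2, 2), (3, 3)]
```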

  5. Design for the correction system of the real time nonuniformity of large area-array CCD image

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Li, Chunmei; Lei, Ning

    2012-10-01

    With the rapid development of aviation cameras and remote sensing technology, linear-array and area-array CCDs (charge-coupled devices) have evolved toward large-area CCDs, which offer broad coverage and avoid the difficulty of stitching together small area-array CCDs, in addition to improving time resolution. However, due to the large number of pixels and channels of a large-area CCD, its photo-response non-uniformity (PRNU) is severe. In this paper, a real-time non-uniformity correction system is introduced for a large-area full-frame-transfer CCD. First, the correction algorithm is elaborated according to the CCD's working principle. Secondly, because of the large number of pixels and correction coefficients, ordinary on-chip memory cannot meet the requirement; the combination of external flash memory and DDR described in this paper provides both large-capacity storage and rapid real-time correction. The methods and measurement steps for obtaining the correction factors are also provided. Finally, an imaging test was made: the non-uniformity of the image is reduced from 2.96% before correction to 0.38%, an obvious improvement. The result shows that the real-time non-uniformity correction system can meet the demands of a large area-array CCD.
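
    The correction arithmetic underlying such a system can be sketched with the standard two-point (dark-frame and flat-field) non-uniformity calibration. The pixel values below are made up, and the real system stores the per-pixel coefficients in external flash/DDR rather than Python lists:

```python
def build_correction(dark, flat):
    """Two-point non-uniformity calibration: per-pixel offsets from a
    dark frame, and per-pixel gains that normalize each pixel's
    flat-field response to the mean response."""
    net = [f - d for f, d in zip(flat, dark)]
    mean = sum(net) / len(net)
    return dark, [mean / n for n in net]

def correct(raw, offsets, gains):
    """Apply corrected[i] = (raw[i] - offset[i]) * gain[i]."""
    return [(r - o) * g for r, o, g in zip(raw, offsets, gains)]

# Hypothetical 4-pixel sensor with unequal response:
dark = [10, 12, 11, 9]
flat = [110, 132, 91, 129]          # nominally uniform illumination
offsets, gains = build_correction(dark, flat)
scene = correct([60, 72, 51, 69], offsets, gains)
# Every pixel saw the same radiance, so the corrected values agree.
```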

  6. Video otoscopy in audiologic practice.

    PubMed

    Sullivan, R F

    1997-12-01

    Recent advances in endoscopic optics and miniature video camera technology have made video otoscopy (VO) accessible to audiologists in a practical way. Seven categories of VO applications are presented with clinical examples: (1) general examination of the earcanal and tympanic membrane, (2) physician communication/telemedicine, (3) hearing instrument selection and fitting applications, (4) patient education, (5) scope of practice reinforcement, (6) knowledge base/skill growth, and (7) cerumen management. PMID:9433690

  7. TV Video-Level Controller

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E.

    1986-01-01

    Constant output maintained, though luminance varies by 5 million to 1. Three means of normalizing video output utilized in video-level controller: iris adjustment, tube voltage adjustment, and automatic gain control. With aid of automatic light control and gain control, television camera accommodates maximum light level 5 million times greater than lowest light level, while outputting constant 3-V peak signal to processing circuitry.

  8. 15 CFR 740.19 - Consumer Communications Devices (CCD).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...2010-01-01 false Consumer Communications Devices (CCD). 740... § 740.19 Consumer Communications Devices (CCD). ...designated EAR99; (8) Network access controllers and communications channel controllers...

  9. SCP -- A Simple CCD Processing Package

    NASA Astrophysics Data System (ADS)

    Lewis, J. R.

    This note describes a small set of programs, written at RGO, which deal with basic CCD frame processing (e.g. bias subtraction, flat fielding, trimming etc.). The need to process large numbers of CCD frames from devices such as FOS or ISIS in order to extract spectra has prompted the writing of routines which will do the basic hack-work with a minimal amount of interaction from the user. Although they were written with spectral data in mind, there are no ``spectrum-specific'' features in the software which means they can be applied to any CCD data.

  10. CCD photometry to V = 21 in a Puppis field

    NASA Astrophysics Data System (ADS)

    Harris, W. E.; Reed, B. C.; Hesser, J. E.

    1986-06-01

A UBV calibrating sequence of 235 stars brighter than V = 21.25 in the 'Puppis Window' at (l,b) = (254 deg, 0 deg) is presented. The photoelectric data were obtained during the night of October 3, 1983, using a CCD camera on the CTIO 4-m telescope. The program consisted of 10-s and 120-s exposure pairs in B and V of a field centered on the Reed and FitzGerald (1983) star C7-53. After preprocessing the program frames (bias, flat-fielding, and fringe removal), the data were reduced with an operating version of DAOPHOT. The internal errors of the photometry, derived through DAOPHOT by adding scaled images scattered randomly throughout the frame and then remeasuring the frame, are summarized. The cutoff of V = 21.25 corresponds to a value of sigma(V,B) of about 0.1 mag.

  11. Vilnius Multicolor CCD Photometry of the Open Cluster NGC 752

    NASA Astrophysics Data System (ADS)

Bartašiūtė, S.; Janusz, R.; Boyle, R. P.; Philip, A. G. Davis

We have performed multicolor CCD observations of the central area of NGC 752 to search for faint, low-mass members of this open cluster. Four 12' × 12' fields were observed with the 1.8 m Vatican Advanced Technology Telescope (Mt. Graham, Arizona) using a 4K CCD camera and eight intermediate-band filters of the Strömvil system. In this paper we present a catalog of photometry for 405 stars down to the limiting magnitude V = 18.5, containing V magnitudes and color indices of the Vilnius system, together with photometric determinations of spectral types, absolute magnitudes M_V, interstellar reddening values E(Y-V), and metallicity parameters [Fe/H]. The good-quality multicolor data made it possible to identify the locus of the lower main sequence four magnitudes beyond the previous (photographic) limit. The relatively small number of photometric members identified at faint magnitudes appears indicative of ongoing dissolution of the cluster from the low-mass end.

  12. CCD Astrometry of Selected Compact Extragalactic Radio Sources

    NASA Astrophysics Data System (ADS)

    Fedorov, P.; Velichko, F.; Filonenko, V.; Myznikov, A.; Sergeev, V.

Optical positions were obtained for the optical counterparts of 50 northern compact extragalactic radio sources (CERS): 64 positions relative to the Catalog of Astrometric Standards (USNO-A2.0) and 9 positions relative to the Extragalactic Reference Link Catalog (de Vegt et al., 2001). The positions were determined at the Kharkov Astronomical Observatory using the ST-6 CCD camera on the 0.7-m AZT-8 telescope. More than 325 CCD images of 10.5' × 8' fields containing the optical counterparts of the selected CERS were obtained during 1997-2001. Positions of reference stars (from 6 to 12 stars for each CERS) were taken from the USNO-A2.0 catalogue, the Extragalactic Reference Link Catalog, and the Nikolayev AMC catalogue (Pinigin & Shulga, 1999). The mean internal formal errors of the optical positions of these CERS are 100 mas in right ascension and 70 mas in declination. A comparison with VLBI radio positions for these sources is presented; the mean differences between the radio and optical positions do not differ significantly from zero at the 0.05 significance level. The optical data we obtained are potentially useful for improving the current link of the Hipparcos reference frame to the ICRS. References: de Vegt, C., Hindsley, R., Zacharias, N., Winter, L. 2001, AJ, 2815. Pinigin, G.I., Shulga, A.V. 1999, Proc. Journées 1999 & IX. Lohrmann-Kolloquium, Dresden (Germany), 64.

  13. Fast frame scanning camera system for light-sheet microscopy.

    PubMed

    Wu, Di; Zhou, Xing; Yao, Baoli; Li, Runze; Yang, Yanlong; Peng, Tong; Lei, Ming; Dan, Dan; Ye, Tong

    2015-10-10

    In the interest of improving the temporal resolution for light-sheet microscopy, we designed a fast frame scanning camera system that incorporated a galvanometer scanning mirror into the imaging path of a home-built light-sheet microscope. This system transformed a temporal image sequence to a spatial one so that multiple images could be acquired during one exposure period. The improvement factor of the frame rate was dependent on the number of sub-images that could be tiled on the sensor without overlapping each other and was therefore a trade-off with the image size. As a demonstration, we achieved 960 frames/s (fps) on a CCD camera that was originally capable of recording images at only 30 fps (full frame). This allowed us to observe millisecond or sub-millisecond events with ordinary CCD cameras. PMID:26479797
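The frame-rate arithmetic in this tiling scheme is simple: the speed-up factor equals the number of non-overlapping sub-images that fit on the sensor. A small sketch, with a hypothetical sensor and tile geometry chosen so that 32 tiles turn a 30 fps camera into a 960 fps one, matching the figures quoted above:

```python
def effective_frame_rate(base_fps, sensor_h, sensor_w, tile_h, tile_w):
    """Frame-rate gain from tiling sub-images onto one sensor readout.

    Each full-sensor exposure holds (sensor_h//tile_h) * (sensor_w//tile_w)
    sub-images, each deposited at a different instant by the scanning mirror,
    so the effective rate is the base readout rate times the tile count.
    The trade-off: each frame shrinks to tile_h x tile_w pixels.
    """
    tiles = (sensor_h // tile_h) * (sensor_w // tile_w)
    return base_fps * tiles, tiles

# Hypothetical geometry: a 1024x1024 sensor tiled with 128x256 sub-images
fps, n = effective_frame_rate(30, 1024, 1024, 128, 256)
```

Here `n` is 32 and `fps` is 960, illustrating the reported 30 fps to 960 fps improvement at the cost of a 32-fold smaller image.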

  14. Smart Video Systems in Police Cars Amirali Jazayeri, Hongyuan Cai, Mihran Tuceryan, Jiang Yu Zheng

    E-print Network

    Tuceryan, Mihran

    Smart Video Systems in Police Cars Amirali Jazayeri, Hongyuan Cai, Mihran Tuceryan, Jiang Yu Zheng@cs.iupui.edu ABSTRACT The use of video cameras in police cars has been found to have significant value and the number for later use in legal settings, in-car video cameras can be used to analyze in real-time or near real

  15. Explosive Transient Camera (ETC) Program

    NASA Technical Reports Server (NTRS)

    Ricker, George

    1991-01-01

    Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

  16. Ultraviolet Viewing with a Television Camera.

    ERIC Educational Resources Information Center

    Eisner, Thomas; And Others

    1988-01-01

    Reports on a portable video color camera that is fully suited for seeing ultraviolet images and offers some expanded viewing possibilities. Discusses the basic technique, specialized viewing, and the instructional value of this system of viewing reflectance patterns of flowers and insects that are invisible to the unaided eye. (CW)

  17. Camera! Action! Collaborate with Digital Moviemaking

    ERIC Educational Resources Information Center

    Swan, Kathleen Owings; Hofer, Mark; Levstik, Linda S.

    2007-01-01

    Broadly defined, digital moviemaking integrates a variety of media (images, sound, text, video, narration) to communicate with an audience. There is near-ubiquitous access to the necessary software (MovieMaker and iMovie are bundled free with their respective operating systems) and hardware (computers with Internet access, digital cameras, etc.).…

  18. Scintillator to CCD coupling in x-ray microtomography

    NASA Astrophysics Data System (ADS)

    Davis, Graham R.; Elliott, James C.

    2006-08-01

Optical coupling between the X-ray scintillator and the digital camera (typically a CCD) is a major design consideration in X-ray microtomography. Previously, we used a pair of 50 mm f/1.2 lenses, which we determined to be approximately 60% efficient; that is, the signal-to-noise ratio is that which would occur if 60% of the X-ray photons absorbed by the scintillator were directly detected. For larger CCDs, lenses become excessively large, heavy, and expensive. For our 60 x 60 mm time-delay integration CCD camera we therefore used parallel fibre-optic coupling, giving greater efficiency. A problem with this is the scattering of light through the fibre cladding, which reduces image contrast by adding a very blurred image to the sharp image transmitted through the fibres. This problem is ideally suited to solution by deconvolution: since the high-frequency image components are preserved (the direct fibre image), deconvolution can be used to eliminate the low-frequency scatter image without the problems normally associated with de-blurring. The point spread function was assumed to be rotationally symmetrical and was determined from an edge image of a lead plate positioned close to the scintillator. In frequency space, the mid-frequency portion was extrapolated into the low-frequency portion using a parabolic fit; the difference between the extrapolated and measured low-frequency portions was deemed to be the scatter response. This was then added to the frequency response of a perfect delta function to obtain the frequency response used for deconvolution. The results showed excellent correction of the X-ray microtomographic images.
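The correction idea can be sketched with a simplified point-spread model: a delta function (the sharp fibre image) plus a broad Gaussian veil (cladding scatter), in place of the paper's measured, parabola-extrapolated response. Because the resulting OTF never approaches zero, the frequency-space division is stable and does not amplify noise the way general de-blurring does. The scatter fraction and width below are illustrative assumptions:

```python
import numpy as np

def remove_scatter(image, scatter_fraction, scatter_sigma):
    """Deconvolve a low-frequency scatter veil from a fibre-coupled image.

    The PSF is modelled as delta + scatter_fraction * Gaussian(scatter_sigma),
    so the OTF is 1 + scatter_fraction * gaussian_otf, and dividing by it in
    frequency space removes the veil while leaving sharp detail untouched.
    """
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    gauss_otf = np.exp(-2 * (np.pi * scatter_sigma) ** 2 * (fx**2 + fy**2))
    otf = 1.0 + scatter_fraction * gauss_otf
    return np.real(np.fft.ifft2(np.fft.fft2(image) / otf))

# Forward model: add 40% scattered light with a 10-pixel Gaussian veil
rng = np.random.default_rng(1)
sharp = rng.random((128, 128))
fy = np.fft.fftfreq(128)[:, None]
fx = np.fft.fftfreq(128)[None, :]
veil = np.exp(-2 * (np.pi * 10.0) ** 2 * (fx**2 + fy**2))
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * (1.0 + 0.4 * veil)))

recovered = remove_scatter(blurred, 0.4, 10.0)
```

With a matched model the inversion is exact; with the real, measured scatter response the same division applies, only with the tabulated OTF in place of the analytic one.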

  19. Towards Mobile HDR Video Tassio Castro

    E-print Network

    de Figueiredo, Luiz Henrique

    Towards Mobile HDR Video Tassio Castro tknop@impa.br Alexandre Chapiro achapiro@impa.br Marcelo is the iPhone 4 platform, which offers an HDR mode in the camera app since the release of the iOS 4 method for hand-held cameras, including those installed in some mobile phones. Being based on histograms

  20. Multicamera Video Summarization from Optimal Reconstruction

    E-print Network

    California at Santa Barbara, University of

    with stationary cameras, much of the recorded video is uninteresting, so time spent having a human review, if the travel time does significantly differ from what is expected, the summary should spend extra time data. However, long recordings over many deployed cameras can easily overwhelm a human operator

  1. Dashboard Videos

    ERIC Educational Resources Information Center

    Gleue, Alan D.; Depcik, Chris; Peltier, Ted

    2012-01-01

    Last school year, I had a web link emailed to me entitled "A Dashboard Physics Lesson." The link, created and posted by Dale Basier on his "Lab Out Loud" blog, illustrates video of a car's speedometer synchronized with video of the road. These two separate video streams are compiled into one video that students can watch and analyze. After seeing…

  2. A multi-frame, megahertz CCD imager

    SciTech Connect

    Mendez, Jacob A; Balzer, Stephen J; Watson, Scott A

    2008-01-01

A high-efficiency, high-speed imager has been fabricated that is capable of framing rates of 2 MHz. This device utilizes a 512 x 512 pixel charge-coupled device (CCD) with a 25 cm² active area, and incorporates an electronic shutter technology designed for back-illuminated CCDs, making this the largest and fastest back-illuminated CCD in the world. Characterizing an imager capable of this frame rate presents unique challenges: high-speed LED drivers and intense radioactive sources are needed to perform basic measurements. We investigate properties normally associated with single-frame CCDs, such as read noise, gain, full-well capacity, detective quantum efficiency (DQE), sensitivity, and linearity. In addition, we investigate several properties associated with the imager's multi-frame operation, such as transient frame response and frame-to-frame isolation, while contrasting our measurement techniques and results with more conventional devices.
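Two of the listed single-frame properties, gain and noise, are conventionally extracted from a photon transfer curve: for shot-noise-limited flats, variance is proportional to signal, and the slope gives the gain. A minimal sketch of the mean-variance method on flat-field pairs (simulated data, not measurements from this imager; differencing each pair cancels fixed-pattern noise):

```python
import numpy as np

def photon_transfer_gain(flat_pairs, bias_level):
    """Estimate CCD gain (e-/ADU) from the variance of flat-field pairs.

    For Poisson-limited flats, var(signal_ADU) = signal_ADU / gain, so the
    slope of variance vs. mean signal is 1/gain.  The pair difference has
    twice the per-frame variance, hence the division by 2.
    """
    means, variances = [], []
    for a, b in flat_pairs:
        means.append(0.5 * (a.mean() + b.mean()) - bias_level)
        variances.append(np.var(a - b) / 2.0)
    slope = np.polyfit(means, variances, 1)[0]
    return 1.0 / slope  # electrons per ADU

# Simulate flat pairs at several light levels for a gain of 2 e-/ADU
rng = np.random.default_rng(2)
true_gain, bias = 2.0, 100.0
pairs = []
for lam in (1000, 5000, 20000, 50000):
    a = rng.poisson(lam, 100_000) / true_gain + bias
    b = rng.poisson(lam, 100_000) / true_gain + bias
    pairs.append((a, b))

gain = photon_transfer_gain(pairs, bias)
```

The recovered gain converges on the simulated 2 e-/ADU; read noise falls out of the same curve as the variance intercept at zero signal.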

  3. A multi-frame, megahertz CCD imager

    SciTech Connect

    Mendez, Jacob; Balzer, Stephen; Watson, Scott; Reich, Robert

    2010-01-01

To record high-speed, explosively driven events, a high-efficiency, high-speed imager has been fabricated that is capable of framing rates of 2 MHz. This device utilizes a 512 x 512 pixel charge-coupled device (CCD) with a 25 cm² active area, and incorporates an electronic shutter technology designed for back-illuminated CCDs, making this the largest and fastest back-illuminated CCD in the world. Characterizing an imager capable of this frame rate presents unique challenges: high-speed LED drivers and intense radioactive sources are needed to perform the most basic measurements. We investigate properties normally associated with single-frame CCDs, such as read noise, full-well capacity, sensitivity, signal-to-noise ratio, linearity, and dynamic range. In addition, we investigate several properties associated with the imager's multi-frame operation, such as transient frame response and frame-to-frame isolation, while contrasting our measurement techniques and results with more conventional devices.

  4. Just the interesting bits A quick way to find incidents recorded by security cameras

    E-print Network

    Peleg, Shmuel

    -circuit television (CCTV) cameras. The combined output of all these systems is far greater than the capacity of the CCTV cameras watching a bank's cash machines now make that job easier by linking video footage company, makes looking through CCTV footage a breeze. In some cases, 24 hours of video from a security

  5. Video systems for alarm assessment

    SciTech Connect

    Greenwoll, D.A.; Matter, J.C. ); Ebel, P.E. )

    1991-09-01

    The purpose of this NUREG is to present technical information that should be useful to NRC licensees in designing closed-circuit television systems for video alarm assessment. There is a section on each of the major components in a video system: camera, lens, lighting, transmission, synchronization, switcher, monitor, and recorder. Each section includes information on component selection, procurement, installation, test, and maintenance. Considerations for system integration of the components are contained in each section. System emphasis is focused on perimeter intrusion detection and assessment systems. A glossary of video terms is included. 13 figs., 9 tabs.

  6. Fully depleted back-illuminated p-channel CCD development

    SciTech Connect

    Bebek, Chris J.; Bercovitz, John H.; Groom, Donald E.; Holland, Stephen E.; Kadel, Richard W.; Karcher, Armin; Kolbe, William F.; Oluseyi, Hakeem M.; Palaio, Nicholas P.; Prasad, Val; Turko, Bojan T.; Wang, Guobin

    2003-07-08

An overview of CCD development efforts at Lawrence Berkeley National Laboratory is presented. Operation of fully depleted, back-illuminated CCDs fabricated on high-resistivity silicon is described, along with results on the use of such CCDs at ground-based observatories. Radiation damage and point-spread function measurements are described, as well as discussion of CCD fabrication technologies.

  7. VIDEO INPAINTING OF OCCLUDING AND OCCLUDED OBJECTS Kedar A. Patwardhan

    E-print Network

VIDEO INPAINTING OF OCCLUDING AND OCCLUDED OBJECTS By Kedar A. Patwardhan, Guillermo Sapiro. URL: http://www.ima.umn.edu We present a basic technique to fill in missing parts of a video sequence taken from a static camera. Two

  8. Video Registration using Dynamic Textures Avinash Ravichandran and Rene Vidal

    E-print Network

    Vidal, René

    Video Registration using Dynamic Textures Avinash Ravichandran and Ren´e Vidal Center for Imaging-based algorithm for registering two video sequences of a rigid or nonrigid scene taken from two synchronous or asynchronous cameras. We model each video sequence as the output of a linear dynamical system, and transform

  9. POTENTIAL FOR REMOTE SENSING FROM AGRICULTURAL AIRCRAFT USING DIGITAL VIDEO

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An imaging system for remote sensing was developed for agricultural aircraft. The system uses a digital video camera, GPS, and a video mapping system (VMS) as the GPS interface to video. Remote control and monitoring was implemented to allow the pilot to image only field areas of interest, facilitat...

  10. Secure Video Surveillance System Acquisition Software

    Energy Science and Technology Software Center (ESTSC)

    2009-12-04

The SVSS Acquisition Software collects and displays video images from two cameras through a VPN, and stores the images on a collection controller. The software is configured to allow a user to enter a time window to display up to 2 1/2 hours of video review. The software collects images from the cameras at a rate of 1 image per second and automatically deletes images older than 3 hours. The software code operates in a Linux environment and can be run in a virtual machine on Windows XP. The Sandia software integrates the different COTS software together to build the video review system.

  11. Secure Video Surveillance System Acquisition Software

    SciTech Connect

    2009-12-04

The SVSS Acquisition Software collects and displays video images from two cameras through a VPN, and stores the images on a collection controller. The software is configured to allow a user to enter a time window to display up to 2 1/2 hours of video review. The software collects images from the cameras at a rate of 1 image per second and automatically deletes images older than 3 hours. The software code operates in a Linux environment and can be run in a virtual machine on Windows XP. The Sandia software integrates the different COTS software together to build the video review system.

  12. Video Tracking Using Acoustic Triangulation 

    E-print Network

    Ivanov, Alexander

    2012-05-03

    This study focuses on the detection and triangulation of sound sources. Specifically, we focus on the detection of sound in order to track a person’s position with a video camera. Acoustic tracking, an alternative to visual tracking, is relatively...

  13. Solid State Replacement of Rotating Mirror Cameras

    SciTech Connect

    Frank, A M; Bartolick, J M

    2006-08-25

Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution, and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron-tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all-solid-state architecture, dubbed the ''In-situ Storage Image Sensor'' or ''ISIS'' by Prof. Goji Etoh, has made its first appearance in the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, though there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  14. Solid state replacement of rotating mirror cameras

    NASA Astrophysics Data System (ADS)

    Frank, Alan M.; Bartolick, Joseph M.

    2007-01-01

Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution, and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron-tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all-solid-state architecture, dubbed the 'In-situ Storage Image Sensor' or 'ISIS' by Prof. Goji Etoh, has made its first appearance in the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, though there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  15. X-ray imaging using digital cameras

    NASA Astrophysics Data System (ADS)

    Winch, Nicola M.; Edgar, Andrew

    2012-03-01

    The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.

  16. Sensitive detection of active Shiga toxin using low cost CCD based optical detector.

    PubMed

    Rasooly, Reuven; Balsam, Josh; Hernlem, Bradley J; Rasooly, Avraham

    2015-06-15

To reduce the sources and incidence of food-borne illness there is a need to develop affordable, sensitive devices for detection of active toxins such as Shiga toxin type 2 (Stx2). The widely used methods for measuring Shiga toxin are currently immunoassays, which cannot distinguish the active form of the toxin, which poses a threat to life, from the inactive form, which can bind to antibodies but shows no toxicity. In this work, we determine toxin activity based on Shiga toxin inhibition of green fluorescent protein (GFP) expression, combined with low-cost charge-coupled device (CCD) fluorescence detection, which is more clinically relevant than immunoassay. For assay detection, a simple low-cost fluorescence detection system was constructed using a CCD camera and a light-emitting diode (LED) excitation source to measure GFP expression. The system was evaluated and compared to a commercial fluorometer using photomultiplier detection for detecting active Stx2 in the range 100 ng/mL-0.01 pg/mL. The results show a negative linear relationship between Stx2 concentration and the luminous intensity of GFP as imaged by the CCD camera (R² = 0.85) or the fluorometer (R² = 0.86). The low-cost (~$300) CCD camera is capable of detecting Shiga toxin activity at levels comparable to a far more expensive (~$30,000) fluorometer. These results demonstrate the utility and potential of low-cost detectors for toxin activity; this approach may increase the availability of foodborne bacterial toxin diagnostics in regions with limited resources and could readily be adapted to the detection of other food-borne toxins. PMID:25677808
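A standard curve of the kind described, intensity against log concentration with a reported R², can be fit in a few lines. The concentrations and counts below are illustrative placeholders, not the paper's data:

```python
import numpy as np

def fit_standard_curve(concentrations, intensities):
    """Fit fluorescence intensity against log10 toxin concentration.

    Returns slope, intercept, and R^2; a negative slope reflects GFP
    expression being suppressed as active toxin concentration rises.
    """
    x = np.log10(concentrations)
    y = np.asarray(intensities, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    r2 = 1.0 - residuals.var() / y.var()
    return slope, intercept, r2

# Illustrative values spanning several decades of concentration
conc = [1e-5, 1e-3, 1e-1, 10.0, 1000.0]   # ng/mL
inten = [980, 850, 690, 540, 400]         # arbitrary CCD counts
slope, intercept, r2 = fit_standard_curve(conc, inten)
```

An unknown sample's activity is then read off by inverting the line: concentration = 10 ** ((intensity - intercept) / slope).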

  17. Relative radiometric calibration method based on linear CCD imaging the same region of non-uniform scene

    NASA Astrophysics Data System (ADS)

    Li, Haichao; Man, Yi-yun

    2014-11-01

This paper provides a relative radiometric calibration method based on a linear CCD imaging the same region of a non-uniform scene, making full use of yaw-angle control to ensure that all the linear CCD detectors image the same scene. First, when a relative radiometric calibration task is to be performed, the initial drift angle is calculated; the yaw angle is then adjusted accordingly so that the on-orbit satellite enters the calibration imaging mode, in which the linear CCD array and the satellite motion are approximately aligned. Second, during calibration imaging the yaw angle is continuously adjusted to control the push-broom direction, so that the linear CCD camera sequentially sweeps the same region of the non-uniform scene; this yields remote-sensing imagery in which every CCD detector has observed the same region. Finally, once the same-region imagery is obtained, histogram matching is used to establish a high-precision nonlinear relative radiometric calibration model; this overcomes the nonlinear response problem caused by the camera's photon noise and dark-current noise. The method needs neither an on-orbit calibration device nor a uniform ground calibration site: a general Earth-observation scene meets the requirements. Unlike statistical methods, it does not need large amounts of on-orbit imaging data for statistical analysis, and each calibrated track satisfies the calibration imaging conditions, avoiding the reliability problems of a calibration source affected by unstable differences between tracks.
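The histogram-matching step can be sketched as a quantile-mapping lookup table between one detector and a reference detector that have imaged the same scene. The simulated nonlinear detector response below is an assumption for the demonstration, not the paper's model:

```python
import numpy as np

def histogram_match(detector_dn, reference_dn):
    """Build a LUT mapping one detector's DN distribution onto a reference's.

    Both detectors imaged the same scene (yaw-steered push-broom), so their
    cumulative histograms should agree; the quantile mapping that aligns
    them is the relative radiometric correction, and being nonparametric
    it absorbs nonlinear response differences.
    """
    src = np.sort(np.asarray(detector_dn, dtype=float))
    ref = np.sort(np.asarray(reference_dn, dtype=float))
    quantiles = np.linspace(0, 1, src.size)

    def apply_lut(dn):
        # Find each DN's quantile in the source, read the reference there
        q = np.interp(dn, src, quantiles)
        return np.interp(q, quantiles, ref)

    return apply_lut

# Shared scene radiance seen by both detectors
rng = np.random.default_rng(3)
scene = rng.gamma(3.0, 200.0, 50_000)
ref_dn = scene                            # reference detector response
det_dn = 0.8 * scene**0.95 + 12.0         # nonlinear, biased detector

lut = histogram_match(det_dn, ref_dn)
calibrated = lut(det_dn)
```

Because the simulated response is a monotonic distortion of the same samples, the quantile mapping recovers the reference response almost exactly; with real imagery the two detectors see the same region rather than the same samples, so the match holds statistically.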

  18. Using APART for wall visibility calculations in the calibration channel of wide field planetary camera II

    NASA Technical Reports Server (NTRS)

    Scholl, James W.; Scholl, Marija S.

    1993-01-01

    The cone visibility from the CCD detector array plane in the calibration channel of wide field planetary camera II (WFPC II) is analyzed, using APART, for three representative wavelengths as characterized by indices of refraction. The light pipe walls are visible from the corners of the equivalent CCD detector array when imaging with the smallest index of refraction, n = 1.375. Painting the inside of the light pipe walls will result in a decrease in their visibility.

  19. CCD vs. CMOS from: http://www.dalsa.com/markets/ccd_vs_cmos.asp

    E-print Network

    Giger, Christine

vs. oranges: they can both be good for you. DALSA offers both. CCD (charge-coupled device) and CMOS ... into electric charge and process it into electronic signals. In a CCD sensor, every pixel's charge ... charge-to-voltage conversion, and the sensor often also includes amplifiers, noise-correction

  20. Camera for QUasars in EArly uNiverse (CQUEAN)

    E-print Network

    Park, Won-Kee; Im, Myungshin; Choi, Changsu; Jeon, Yiseul; Chang, Seunghyuk; Jeong, Hyeonju; Lim, Juhee; Kim, Eunbin

    2012-01-01

We describe the overall characteristics and the performance of an optical CCD camera system, Camera for QUasars in EArly uNiverse (CQUEAN), which has been used at the 2.1 m Otto Struve Telescope of the McDonald Observatory since 2010 August. CQUEAN was developed for follow-up imaging observations of red sources such as high-redshift quasar candidates (z >= 5), Gamma-Ray Bursts, brown dwarfs, and young stellar objects. For efficient observations of the red objects, CQUEAN has a science camera with a deep-depletion CCD chip, which boasts a higher quantum efficiency at 0.7-1.1 μm than conventional CCD chips. The camera was developed on a short time scale (~ one year) and has been working reliably. By employing an auto-guiding system and a focal reducer to enhance the field of view on the classical Cassegrain focus, we achieve stable guiding in 20 minute exposures, an imaging quality with FWHM >= 0.6" over the whole field (4.8' × 4.8'), and a limiting magnitude of z = 23.4 AB mag at 5-sigma with one hour tota...