These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

CCD Camera  

DOEpatents

A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

Roth, Roger R. (Minnetonka, MN)

1983-01-01

2

ISIS with curved coupled CCD channels for a video camera of 1,000,000 pps  

Microsoft Academic Search

An improved design is presented for an ISIS (In-situ Storage Image Sensor), previously proposed by the authors for a high-frame-rate video camera of 1,000,000 pps. CCD channels of the sensor play dual roles, providing signal storage during the image-capturing phase and signal transfer during the read-out phase, which minimizes unutilized space on the light-receptive area.

Takeharu Etoh; Hideki Mutoh; Kohsei Takehara; Sachio Oki

1999-01-01

3

ISIS with curved coupled CCD channels for a video camera of 1,000,000 pps  

NASA Astrophysics Data System (ADS)

An improved design is presented for an ISIS (In-situ Storage Image Sensor), previously proposed by the authors for a high-frame-rate video camera of 1,000,000 pps. CCD channels of the sensor play dual roles, providing signal storage during the image-capturing phase and signal transfer during the read-out phase, which minimizes unutilized space on the light-receptive area. The transfer direction is only vertical, which simplifies the structure of the sensor and provides better quality in reproduced images. A built-in overwriting mechanism makes it easy to synchronize the end of the image-capturing phase with the occurrence of a target event. The design is improved by coupling two adjacent CCD channels and two photodiodes, which provides wider spaces in which to place the metal wires that increase the rate of charge drive.

Etoh, Takeharu G.; Mutoh, Hideki; Takehara, Kohsei; Oki, Sachio

1999-06-01

4

Advanced CCD camera developments  

SciTech Connect

Two charge-coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development continues in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

Condor, A. [Lawrence Livermore National Lab., CA (United States)]

1994-11-15

5

Programmable CCD camera equipped with user-configurable video rate digital video processing for use in industrial inspection  

Microsoft Academic Search

A new high-performance CCD camera family is presented. The camera incorporates a microcontroller/PLD combination to provide users with computer control of image acquisition, image processing, and analysis. User control of image acquisition includes adjustable gain and offset, data rate, and timing. Image processing and analysis algorithms are implemented within PLDs and regulated via the microcontroller. A variety of image processing

Jim W. Roberts; J. Wynen

1996-01-01

6

Radiometric CCD camera calibration and noise estimation  

Microsoft Academic Search

Changes in measured image irradiance have many physical causes and are the primary cue for several visual processes, such as edge detection and shape from shading. Using physical models for charge-coupled device (CCD) video cameras and material reflectance, we quantify the variation in digitized pixel values that is due to sensor noise and scene variation. This analysis forms the basis

Glenn E. Healey; Raghava Kondepudy

1994-01-01

7

Calibration Tests of Industrial and Scientific CCD Cameras  

NASA Technical Reports Server (NTRS)

Small-format, medium-resolution CCD cameras are at present widely used for industrial metrology applications. Large-format, high-resolution CCD cameras are primarily in use for scientific applications, but in due course should increase both the range of applications and the object-space accuracy achievable by close-range measurement. Slow-scan, cooled scientific CCD cameras provide the additional benefit of more quantisation levels, which enables improved radiometric resolution. The calibration of all types of CCD cameras is necessary in order to characterize the geometry of the sensors and lenses. A number of different types of CCD cameras have been calibrated at the NASA Langley Research Center using self-calibration and a small test object. The results of these calibration tests are described, with particular emphasis on the differences between standard CCD video cameras and scientific slow-scan CCD cameras.

Shortis, M. R.; Burner, A. W.; Snow, W. L.; Goad, W. K.

1991-01-01

8

A control system for LAMOST CCD cameras  

NASA Astrophysics Data System (ADS)

The 32 scientific CCD cameras within the 16 low-dispersion spectrographs of LAMOST are used to record object spectra. This paper introduces the CCD Master system, designed for camera management and control and based on the UCAM controller. The layers of the Master, UDP, and CCD-end daemons are described in detail. The commands, statuses, user interface, and spectra viewer are discussed.

Deng, Xiaochao; Wang, Jian; Dong, Jian; Luo, Yu; Liu, Guangcao; Yuan, Hailong; Jin, Ge

2010-07-01

9

Digital video camera workshop Sony VX2000  

E-print Network

Digital video camera workshop: Sony VX2000 and Sony DSR-PDX10. Borrowing eligibility: currently... completed quiz with a score of 100%. The Sony VX2000 and Panasonic AG-DVC7P are 3-CCD cameras; both use FireWire. Video camera operation, installing the battery (Sony VX2000): insert the battery with the arrow

10

An auto-focusing CCD camera mount  

NASA Astrophysics Data System (ADS)

The traditional methods of focusing a CCD camera are either time-consuming, difficult, or, more importantly, indecisive. This paper describes a device designed to give the observer confidence that the camera will always be properly focused, by sensing a selected star image and automatically adjusting the camera's focal position.

Arbour, R. W.

1994-08-01

11

Selection of video cameras for stroboscopic videolaryngoscopy.  

PubMed

Stroboscopic evaluation for the analysis of laryngeal function and disease has been reemphasized recently and its routine clinical use recommended. Many have found, however, that it is not always possible to obtain consistently satisfactory video images of stroboscopic laryngoscopy. The problem is related to the low intensity of the xenon light source during stroboscopy. The authors have tried many of the different video cameras available, along with the Brüel & Kjær Rhino-Larynx Stroboscope type 4914 and two types of endoscopes (flexible and rigid). The cameras included 1) a single-tube camera, 2) a single-chip metal-oxide-semiconductor (MOS) solid-state camera, 3) a single-chip charge-coupled-device (CCD) solid-state camera, 4) a three-tube camera, and 5) a three-chip CCD camera. Currently available video cameras and their adaptability for stroboscopic videolaryngoscopy are discussed. PMID:3674656

Yanagisawa, E; Godley, F; Muta, H

1987-01-01

12

Solid state television camera (CCD-buried channel)  

NASA Technical Reports Server (NTRS)

The development of an all solid state television camera, which uses a buried channel charge coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array is utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control (i.e., ALC and AGC) techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.

1976-01-01

13

Solid state television camera (CCD-buried channel), revision 1  

NASA Technical Reports Server (NTRS)

An all solid state television camera was designed which uses a buried channel charge coupled device (CCD) as the image sensor. A 380 x 488 element CCD array is utilized to ensure compatibility with 525-line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (1) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (2) techniques for the elimination or suppression of CCD blemish effects, and (3) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.

1977-01-01

14

Operating the CCD Camera 1995 Edition  

E-print Network

and even the focal position may vary. If you turn on the camera system first, then open up the roof... To exit from the ccd program (after you have finished taking pictures and transferring them to hard disk)... Taking a Picture = Operating the Electronics: to control the camera, type in commands at the asterisk

Veilleux, Sylvain

15

Experiment and research on the TDI CCD camera  

NASA Astrophysics Data System (ADS)

At present, CCDs are widely applied in many fields. Here we introduce a special linear-array CCD camera that uses the Time-Delay-Integration (TDI) technique to provide high sensitivity, high speed, high spatial resolution, and wide dynamic range under low-light conditions. TDI is a scanning method that provides greater sensitivity than other video scanning methods, and the interface it requires is quite different from conventional ones. In this paper, the principle and the features are presented. To apply the TDI method and verify its effect, we carried out an experiment on grabbing moving images using CT-E1 and CL-E2 2048 x 96 TDI line-scan CCD cameras produced by DALSA Inc. We designed a grabber and built a complete system ourselves. The system includes: camera, camera control, frame grabber, revolving stage, and PC. The results indicate that when the line-shift rate of the CCD camera is not synchronized with the rate of the moving object, the image we obtain is not clear. But when we use a phase-shift-compensation technique to make the line-shift rate of the camera match the rate of the image, we can get a very distinct image.

Wang, Haining; Wei, Zhonghui

1996-09-01
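The synchronization condition described in the abstract above (line-shift rate matched to the image motion) can be sketched numerically. The object speed, magnification, and 13 µm pixel pitch below are hypothetical illustration values, not figures from the paper:

```python
def tdi_line_rate(object_speed_m_s, magnification, pixel_pitch_m):
    """Line-shift rate (lines/s) that keeps TDI charge transfer
    synchronized with the moving image: one line period for each
    pixel of image motion at the focal plane."""
    image_speed = object_speed_m_s * magnification  # m/s at the sensor
    return image_speed / pixel_pitch_m

# Hypothetical numbers: 1 m/s object, 0.1x optics, 13 um pixels
rate = tdi_line_rate(1.0, 0.1, 13e-6)  # ~7692 lines/s
```

A mismatch between this rate and the actual image motion smears the image, which is the blur the authors correct with phase-shift compensation.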

16

Operating the CCD Camera 1995 Edition  

E-print Network

...get the telescope ready, find the field of your source, etc.; it will likely be well stabilized on the computer screen. 7. To exit from the ccd program (after you have finished taking pictures and transferring them)... the enter key. Taking a Picture = Operating the Electronics: to control the camera, type in commands

Harrington, J. Patrick

17

Security camera video authentication  

Microsoft Academic Search

The ability to authenticate images captured by a security camera, and localise any tampered areas, will increase the value of these images as evidence in a court of law. This paper outlines the challenges in security camera video authentication, and discusses the reasons why fingerprinting, a robust type of digital signature, provides a solution preferable to semi-fragile watermarking. A fingerprint

D. K. Roberts

2002-01-01

18

The QUEST Large Area CCD Camera  

E-print Network

We have designed, constructed and put into operation a very large area CCD camera that covers the field of view of the 1.2 m Samuel Oschin Schmidt Telescope at the Palomar Observatory. The camera consists of 112 CCDs arranged in a mosaic of four rows with 28 CCDs each. The CCDs are 600 x 2400 pixel Sarnoff thinned, back illuminated devices with 13 um x 13 um pixels. The camera covers an area of 4.6 deg x 3.6 deg on the sky with an active area of 9.6 square degrees. This camera has been installed at the prime focus of the telescope, commissioned, and scientific quality observations on the Palomar-QUEST Variability Sky Survey were started in September of 2003. The design considerations, construction features, and performance parameters of this camera are described in this paper.

Charlie Baltay; David Rabinowitz; Peter Andrews; Anne Bauer; Nancy Ellman; William Emmet; Rebecca Hudson; Thomas Hurteau; Jonathan Jerke; Rochelle Lauer; Julia Silge; Andrew Szymkowiak; Brice Adams; Mark Gebhard; James Musser; Michael Doyle; Harold Petrie; Roger Smith; Robert Thicksten; John Geary

2007-02-21
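As a consistency check on the figures quoted in the abstract above, the mosaic format and active sky area imply a total pixel count and an approximate pixel scale. This is plain arithmetic from the stated numbers, not data from the paper:

```python
ccds = 112
pixels_per_ccd = 600 * 2400
total_pixels = ccds * pixels_per_ccd          # 161,280,000 pixels
active_area_deg2 = 9.6                        # quoted active sky area
pixel_scale_arcsec = (active_area_deg2 / total_pixels) ** 0.5 * 3600
# roughly 0.88 arcsec per 13 um pixel, plausible for a Schmidt telescope
```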

19

Four-chip CCD camera for HDTV  

NASA Astrophysics Data System (ADS)

In an effort to realize a compact HDTV camera with high performance, we have developed a prototype equipped with four 2/3-inch CCDs. A smaller image format is preferable for downsizing TV cameras. However, this shrinks the unit pixel size and inevitably makes it more difficult to produce an image-pickup device with the required HDTV qualities, especially sensitivity and dynamic range. We have overcome this problem by using CCD imagers with high performance but a relatively small number of pixels, and by increasing the number of CCD chips in the camera to secure the necessary spatial sampling points for HDTV. In the newly developed color-separating system of the camera, two of the four CCDs are assigned to the green (G) light component and one each to red (R) and blue (B). We succeeded in improving the resolution by introducing spatial pixel-offset imaging. This new method has two major advantages: it prevents resolution degradation caused by chromatic aberration, and it improves the resolution of color signals over a wide range.

Sugawara, Masayuki; Mitani, Kohji; Saitoh, Toshinori; Fujita, Yoshihiro; Suetsugi, Keisuke

1994-05-01

20

Linear array CCD sensor for multispectral camera  

NASA Astrophysics Data System (ADS)

Design, operational, and performance features are described for a new 2048-element CCD array in a ceramic package for beam-sharing focal-plane arrangements on remote sensing satellites. The device, labeled the TH 7805, furnishes 13-micron-square pixels at 13-micron pitch over the 480-930 nm interval, two video outputs, and a single-phase, buried-channel CCD register. Each n-p photodiode is linked to a Si coating by a gate storing the photocharges. Crosstalk between elements is less than 1 percent and the rms noise level is 180 micro-V. The array output sensitivity is 1.37 micro-V/electron, linearity is within 1 percent, and the maximum data rate is 10 MHz. The entire sensor package draws under 150 mW from the spacecraft. The TH 7805 has withstood over 10 krad in tests without exhibiting faults.

Chabbal, J.; Boucharlat, G.; Capppechi, F.; Benoit-Gonin, R.

1985-10-01

21

Detecting organic materials with a CCD camera.  

PubMed

Absorption bands in the near-infrared are used to detect materials composed of organic molecules in scenes imaged with a conventional CCD camera. A simple model of reflectance spectra (between 850 and 980 nm) is proposed and tested on a wide range of materials. An existing vision system that was designed to detect materials with high water content is tested on organic materials. The system cannot detect materials (such as cellulose and starch) that consist of chains of sugars. It is able to robustly detect materials such as fats and aliphatic plastics (in their pure form), whose molecules are essentially long chains of CH2 and CH3 groups. The ability of the system to detect plastic objects is limited by inorganic additives in the plastics. PMID:25608077

McGunnigle, G; Kraft, M

2014-12-10

22

Lightweight Video-Camera Head  

NASA Technical Reports Server (NTRS)

Compact, lightweight video camera head constructed by remounting lens and charge-coupled-device image detector from small commercial video camera in separate assembly. Useful in robotics, artificial vision, and vision guidance systems. Designed to be mounted on visor of helmet to monitor motions of eyes in experiments on vestibulo-ocular reflexes.

Proctor, David R.

1988-01-01

23

Design of the KMTNet large format CCD camera  

NASA Astrophysics Data System (ADS)

We present the design for the 340 Mpixel KMTNet CCD camera, comprising four newly developed e2v CCD290-99 imaging sensors mounted to a common focal-plane assembly. The high-performance CCDs have a 9k x 9k format, 10-micron pixels, and multiple outputs for rapid readout. The camera Dewar is cooled using closed-cycle coolers, and vacuum is maintained with a cryosorption pump. The CCD controller electronics, the electronics cooling system, and the camera control software are also described.

Atwood, Bruce; O'Brien, Thomas P.; Colarosa, Christopher; Mason, Jerry; Johnson, Mark O.; Pappalardo, Dan; Derwent, Mark; Schaller, Skip; Lee, Chung-Uk; Kim, Seung-Lee; Park, Byeong-Gon; Cha, Sang-Mok; Jorden, Paul; Darby, Steve; Walker, Alex; Renshaw, Ryan

2012-09-01
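The quoted 340 Mpixel total is consistent with four 9k x 9k sensors. Taking "9k" to mean 9216 pixels per side is an assumption here; the abstract itself only says 9k x 9k:

```python
side = 9216                  # assumed pixels per side of one CCD290-99
total = 4 * side * side      # four sensors on the common focal plane
# 339,738,624 pixels, i.e. ~340 Mpixels as quoted
```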

24

CCD cameras for the polarimetric channels of HERSCHEL  

NASA Astrophysics Data System (ADS)

A new-concept CCD camera is currently under development at the XUVLab of the Department of Astronomy and Space Science of the University of Florence. This CCD camera is the proposed detector for the broadband visible-light polarimetric channels of the UVCI coronagraph of the HERSCHEL and Solar Orbiter space missions. The main features of this camera are a high level of versatility and a fast pixel rate that will satisfy the requirements of both space missions. Within this project, a versatile CCD controller has been produced with interesting and innovative features: it allows the selection of all parameters related to charge transfer and CCD readout, and therefore the use of virtually any CCD sensor. The software interface is based on LabVIEW 6i and will allow both local and remote control and display.

Gori, Luca; Pace, Emanuele; Gherardi, Alessandro; Sozzi, M.; Puri, S.

2003-02-01

25

Impact of CCD camera SNR on polarimetric accuracy.  

PubMed

A comprehensive charge-coupled device (CCD) camera noise model is employed to study the impact of CCD camera signal-to-noise ratio (SNR) on polarimetric accuracy. The study shows that the standard deviations of the measured degree of linear polarization (DoLP) and angle of linear polarization (AoLP) depend mainly on the camera SNR. As the camera SNR increases, both the measurement errors and the standard deviations caused by the CCD camera noise decrease. When the DoLP of the incident light is smaller than 0.1, the camera SNR should be at least 75 to achieve a measurement error of less than 0.01. When the input DoLP is larger than 0.5, an SNR of 15 is sufficient to achieve the same measurement accuracy. An experiment is carried out to verify the simulation results. PMID:25402986

Chen, Zhenyue; Wang, Xia; Pacheco, Shaun; Liang, Rongguang

2014-11-10
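The DoLP and AoLP quantities in the abstract above are conventionally computed from the first three Stokes parameters, which can be estimated from intensity measurements behind a linear polarizer at four angles. A minimal sketch of that standard computation follows; the paper's own noise model is not reproduced here:

```python
import math

def dolp_aolp(i0, i45, i90, i135):
    """Degree and angle of linear polarization from intensities
    measured behind a linear polarizer at 0, 45, 90, 135 degrees."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal minus vertical
    s2 = i45 - i135                      # diagonal components
    dolp = math.hypot(s1, s2) / s0
    aolp = 0.5 * math.atan2(s2, s1)      # radians
    return dolp, aolp

# Fully horizontally polarized light: DoLP = 1, AoLP = 0
dolp, aolp = dolp_aolp(1.0, 0.5, 0.0, 0.5)
```

Camera noise enters through the four intensity samples, which is why the measured DoLP and AoLP spreads track the camera SNR.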

26

Dazzling effect of repetitive short pulse laser on TDI CCD camera  

NASA Astrophysics Data System (ADS)

A dazzling experiment was performed on a 64-stage TDI CCD camera using a 20 Hz repetition-frequency picosecond pulse laser, during which we found a new dazzling effect: fringes appeared in the camera's video beside the saturation spot induced by the laser. We attribute the phenomenon to scattered light from the repetitively pulsed laser. The width and visibility of the fringes record information about the scattered light, such as its repetition frequency, pulse width, and intensity distribution. Under the assumption that the laser pulse width is less than one stage-integration time of the TDI CCD, expressions for the width of the fringes and the spacing between them are given in terms of the laser pulse repetition frequency, the row output frequency, and the number of integration stages of the TDI CCD camera.

Zhang, Zhen; Cheng, Xiang-ai; Wang, Rui; Jiang, Tian; Qiu, Dong-dong; Jiang, Zong-fu

2011-02-01

27

Solid state, CCD-buried channel, television camera study and design  

NASA Technical Reports Server (NTRS)

An investigation of an all solid state television camera design, which uses a buried channel charge-coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array was utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a design which addresses the program requirements for a deliverable solid state TV camera.

Hoagland, K. A.; Balopole, H.

1976-01-01

28

Upgrading a CCD camera for astronomical use  

E-print Network

the image frames. The software used for adding and subtracting frames was IMDISP by Ron Baalke of the Jet Propulsion Lab. Further image manipulation was done by PhotoStyler. IMDISP requires the Graphics Interchange Format (GIF), so TIFF to GIF conversions... The remaining data was accidentally destroyed while trying to convert the image files from an 89a GIF to an 87a GIF. IV. MEASUREMENTS WITH THE CCD IMAGE INTENSIFIER COMBINATION: This section describes the lab and field tests of the CCD and image...

Lamecker, James Frank

1993-01-01

29

Solid-State Video Camera for the Accelerator Environment  

SciTech Connect

Solid-State video cameras employing CMOS technology have been developed and tested for several years in the SLAC accelerator, notably the PEPII (BaBar) injection lines. They have proven much more robust than their CCD counterparts in radiation areas. Repair is simple, inexpensive, and generates very little radioactive waste.

Brown, R.L.V.; Roster, B.H.; Yee, C.K. [Stanford Linear Accelerator Center, Menlo Park, CA 94309 (United States)

2004-11-10

30

Solid-State Video Camera for the Accelerator Environment  

SciTech Connect

Solid-State video cameras employing CMOS technology have been developed and tested for several years in the SLAC accelerator; notably the PEPII (BaBar) injection lines. They have proven much more robust than their CCD counterparts in radiation areas. Repair is simple, inexpensive, and generates very little radioactive waste.

Brown, R

2004-05-27

31

1024 X 1024 pixel high-frame-rate digital CCD cameras  

NASA Astrophysics Data System (ADS)

Field-deployable, high-frame-rate visible CCD camera systems have been developed to support test and evaluation activities at the White Sands Missile Range. These visible cameras are designed around a Sarnoff 1024 x 1024 pixel, backside-illuminated CCD with a 32-port, split-frame-transfer architecture. The cameras exploit this architecture to provide selectable modes from a 30 Hz frame rate at 1024 x 1024 pixels to a 300 Hz frame rate at 1024 x 512 pixels (2:1 vertical binning). The cameras are configured with a 500 mm f/4 lens and a ferroelectric liquid-crystal electro-optic shutter to provide variable integration times from 0.5 to 32 msec. Video outputs provided are RS-170 analog video in a reduced 512 x 480 pixel format and a 12-bit full-resolution digital video data stream through a high-speed serial/parallel digital coaxial interface. At a frame rate of 300 frames per second, these cameras deliver video data at an average rate of 1.9 Gbits/sec and a burst rate of 2.8 Gbits/sec, with the capability of reaching an average 12-bit digital data rate of 3.8 Gbits/sec when higher-frame-rate imagers become available.

Hughes, Gary W.; Levine, Peter A.; McCaffrey, Nathaniel J.; Villani, Thomas S.; O'Mara, K.; Sjursen, W.; Pantuso, Francis P.; Ambrose, Joseph G.; King, B.

1997-05-01
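The quoted 1.9 Gbit/s average follows directly from the binned 300 Hz mode; this is simple arithmetic from the abstract's own numbers:

```python
bits_per_pixel = 12
avg_rate = 1024 * 512 * 300 * bits_per_pixel   # binned mode at 300 fps
# 1,887,436,800 bits/s, matching the quoted ~1.9 Gbit/s average
```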

32

Ultra high-speed video camera and its applications  

Microsoft Academic Search

We have developed an ultra-high-speed video camera, announced by Etoh at the 24th ICHSPP. This new camera can capture 100 continuous images at a frame rate of up to 1,000,000 frames per second (fps). It is based on a newly developed single-chip CCD image sensor called the In-situ Storage Image Sensor (ISIS). The spatial resolution is 312 x 260 pixels and this

Yasushi Kondo; Hiromasa Maruno; Hideki Tominaga; Hideki Soya; Takeharu G. Etoh

2003-01-01

33

ULTRACAM - an ultra-fast, triple-beam CCD camera  

E-print Network

ULTRACAM is an ultra-fast, triple-beam CCD camera which has been designed to study one of the few remaining unexplored regions of observational parameter space - high temporal resolution. The camera will see first light in Spring 2002, at a total cost of GBP 300 k, and will be used on 2-m, 4-m and 8-m class telescopes to study astrophysics on the fastest timescales.

Vik Dhillon; Tom Marsh; the ULTRACAM team

2001-10-01

34

Visual enhancement of laparoscopic nephrectomies using the 3-CCD camera  

NASA Astrophysics Data System (ADS)

Many surgical techniques are currently shifting from the more conventional, open approach towards minimally invasive laparoscopic procedures. Laparoscopy results in smaller incisions, potentially leading to less postoperative pain and more rapid recoveries. One key disadvantage of laparoscopic surgery is the loss of three-dimensional assessment of organs and tissue perfusion. Advances in laparoscopic technology include high-definition monitors for improved visualization and the upgrade of single charge-coupled-device (CCD) detectors to 3-CCD cameras, which provide a larger, more sensitive color palette that increases the perception of detail. In this discussion, we further advance existing laparoscopic technology to achieve greater enhancement of images obtained during radical and partial nephrectomies, in which the assessment of tissue perfusion is crucial but limited with current 3-CCD cameras. By separating the signals received by each CCD in the 3-CCD camera and introducing a straightforward algorithm, rapid differentiation of renal vessels and perfusion is accomplished and could be performed in real time. The newly acquired images are overlaid onto conventional images for reference and comparison. This affords the surgeon the ability to accurately detect changes in tissue oxygenation despite the inherent limitations of the visible-light image. Such additional capability should impact procedures in which visual assessment of organ vitality is critical.

Crane, Nicole J.; Kansal, Neil S.; Dhanani, Nadeem; Alemozaffar, Mehrdad; Kirk, Allan D.; Pinto, Peter A.; Elster, Eric A.; Huffman, Scott W.; Levin, Ira W.

2006-02-01
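The abstract above does not give the algorithm itself. One plausible sketch of "separating the signals received by each CCD" is a per-pixel normalization that emphasizes the red (hemoglobin-dominated) channel relative to total intensity; the function and values below are hypothetical illustrations, not the authors' method:

```python
import numpy as np

def perfusion_map(r, g, b, eps=1e-9):
    """Hypothetical enhancement: red fraction of total intensity,
    computed per pixel from the three separated CCD channels."""
    total = r.astype(float) + g + b + eps
    return r / total

# Toy 1x2 image: a red-dominated pixel next to a neutral gray pixel
r = np.array([[200.0, 100.0]])
g = np.array([[50.0, 100.0]])
b = np.array([[50.0, 100.0]])
m = perfusion_map(r, g, b)
```

Such a map could then be overlaid on the conventional image, as the abstract describes.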

35

Development of an all-in-one gamma camera/CCD system for safeguard verification  

NASA Astrophysics Data System (ADS)

For the purpose of monitoring and verifying efforts at safeguarding radioactive materials in various fields, a new all-in-one gamma camera/charge-coupled device (CCD) system was developed. This combined system consists of a gamma camera, which gathers energy and position information on gamma-ray sources, and a CCD camera, which identifies the specific location in a monitored area. Therefore, 2-D image information and quantitative information regarding gamma-ray sources can be obtained from fused images. The gamma camera consists of a diverging collimator, a 22 x 22 array of CsI(Na) pixelated scintillation crystals with a pixel size of 2 x 2 x 6 mm3, and a Hamamatsu H8500 position-sensitive photomultiplier tube (PSPMT). The Basler scA640-70gc CCD camera, which delivers 70 frames per second at video graphics array (VGA) resolution, was employed. Performance testing was performed using a Co-57 point source 30 cm from the detector. The measured spatial resolution and sensitivity were 4.77 mm full width at half maximum (FWHM) and 7.78 cps/MBq, respectively. The energy resolution was 18% at 122 keV. These results demonstrate that the combined system has considerable potential for radiation monitoring.

Kim, Hyun-Il; An, Su Jung; Chung, Yong Hyun; Kwak, Sung-Woo

2014-12-01

36

Design and application of TEC controller Using in CCD camera  

NASA Astrophysics Data System (ADS)

A thermoelectric cooler (TEC) is a solid-state heat pump based on the Peltier effect; it is small, light, and noiseless. The cooling capacity is proportional to the TEC working current when the temperature difference between the hot side and the cold side is stable, and the heating and cooling capacity can be controlled by changing the magnitude and direction of the current through the TEC. Thermoelectric cooling is therefore well suited to cooling CCD devices. E2V's scientific image sensor CCD47-20 integrates the TEC and CCD in one package, which simplifies the electrical design. The software and hardware of the TEC controller are designed around the CCD47-20, which is packaged with an integral solid-state Peltier cooler. In the hardware, an 80C51 MCU is used as the CPU, and an 8-bit ADC and an 8-bit DAC form the closed-loop control system. The control quantity is computed by sampling the temperature from a thermistor in the CCD. The TEC is driven by a MOSFET in a constant-current drive circuit. In the software, improved control precision and convergence speed are obtained by using a PID control algorithm and tuning the proportional, integral, and differential coefficients. The results show that if the heat dissipation on the hot side of the TEC is good enough to keep its temperature stable, then with a sampling period of 2 seconds the temperature control rate is 5 C/min, the temperature difference can reach -40 C, and the control precision can achieve 0.3 C. When the hot-side temperature is stable, the CCD can be held at its target temperature, and the thermal noise of the CCD is less than 1 e-/pix/s. The control system restricts the dark-current noise of the CCD and increases the SNR of the camera system.

Gan, Yu-quan; Ge, Wei; Qiao, Wei-dong; Lu, Di; Lv, Juan

2011-08-01
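The PID loop described in the abstract above can be sketched in a few lines. The gains, plant model, and time constants below are invented for illustration and are not the paper's values:

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a crude first-order thermal plant toward -40 C (toy model)
pid = PID(kp=2.0, ki=0.5, kd=0.0, dt=1.0)
temp, ambient, target = 20.0, 20.0, -40.0
for _ in range(300):
    drive = pid.step(target, temp)                  # TEC current command
    temp += 0.05 * drive + 0.01 * (ambient - temp)  # cooling vs heat leak
```

The integral term is what holds the setpoint against the steady ambient heat leak, which is why PID tuning matters for the control precision the abstract reports.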

37

An ultrahigh-speed video camera and its applications  

NASA Astrophysics Data System (ADS)

We have developed an ultra-high-speed video camera, announced by Etoh at the 24th ICHSPP. This new camera can capture 100 continuous images at a frame rate of up to 1,000,000 frames per second (fps). It is based on a newly developed single-chip CCD image sensor called the In-situ Storage Image Sensor (ISIS). The spatial resolution is 312 x 260 pixels, and this high resolution is maintained even at the maximum frame rate. This camera enables us to observe fast phenomena that could not be seen before. The principle of the system and some applications are introduced.

Kondo, Yasushi; Maruno, Hiromasa; Tominaga, Hideki; Soya, Hideki; Etoh, Takeharu G.

2003-07-01

38

Design of high-speed low-noise pre-amplifier for CCD camera  

NASA Astrophysics Data System (ADS)

The pre-amplifier circuit is critical to the noise performance of a high-speed CCD camera. Its main functions are amplification and impedance transformation. A high-speed, low-noise pre-amplifier for a CCD camera is discussed and designed in this paper. The high-speed, low-noise operational amplifier OPA842 is adopted as the main part. The gain-set resistors for the amplifier are designed optimally; gain-set resistors of different precision are swept using the Monte Carlo method. The CCD video signal, which has a high DC offset voltage, is AC-coupled to the amplifier. The output of the amplifier is source-terminated with a 50-ohm matching resistor so that the video signal can be transmitted through coaxial cable. When the circuit operates at high speed, the PCB has an important effect on circuit performance and can even make the amplifier unstable due to PCB parasitics, so a parasitic model of the PCB is established and PCB layout design issues are also presented. The design results show that the pre-amplifier can be used in cameras with pixel rates up to 40 MHz, with an input-referred noise density of about 3 nV/Hz^1/2.

Xue, Xucheng; Zhang, Shuyan; Li, Hongfa; Guo, Yongfei

2010-10-01

39

System Synchronizes Recordings from Separated Video Cameras  

NASA Technical Reports Server (NTRS)

A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
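The "slightly more than 136 years" repeat interval is consistent with a time code that counts elapsed seconds in an unsigned 32-bit field; that interpretation is an assumption for illustration, not a statement of the actual Geo-TimeCode format.

```python
# A 24-hour SMPTE-style time code repeats daily; a 32-bit count of
# seconds rolls over only after roughly 136 years (assumed
# interpretation, not the documented Geo-TimeCode layout).
SECONDS_PER_JULIAN_YEAR = 365.25 * 24 * 3600  # 31,557,600 s

rollover_seconds = 2**32
rollover_years = rollover_seconds / SECONDS_PER_JULIAN_YEAR
print(f"{rollover_years:.1f} years")  # slightly more than 136 years
```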

Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

2009-01-01

40

High frame rate CCD camera with fast optical shutter  

SciTech Connect

A high frame rate CCD camera coupled with a fast optical shutter has been designed for high repetition rate imaging applications. The design uses state-of-the-art microchannel plate image intensifier (MCPII) technology developed by Los Alamos National Laboratory to support nuclear, military, and medical research requiring high-speed imagery. Key design features include asynchronous resetting of the camera to acquire random transient images, patented real-time analog signal processing with 10-bit digitization at 40--75 MHz pixel rates, synchronized shutter exposures as short as 200 ps, and sustained continuous readout of 512 x 512 pixels per frame at 1--5 Hz rates via parallel multiport (16-port CCD) data transfer. Salient characterization/performance test data for the prototype camera are presented; temporally and spatially resolved images obtained from range-gated LADAR field testing are included; and an alternative system configuration using several cameras sequenced to deliver discrete numbers of consecutive frames at effective burst rates up to 5 GHz (accomplished by time-phasing of consecutive MCPII shutter gates without overlap) is discussed. Potential applications, including dynamic radiography and optical correlation, are also presented.

Yates, G.J.; McDonald, T.E. Jr. [Los Alamos National Lab., NM (United States); Turko, B.T. [Lawrence Berkeley National Lab., CA (United States)

1998-09-01

41

Television camera video level control system  

NASA Technical Reports Server (NTRS)

A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (inventors)

1985-01-01

42

Absolute calibration of a CCD camera with twin beams  

E-print Network

We report on the absolute calibration of a CCD camera by exploiting quantum correlations. This novel method exploits a number of spatially pairwise quantum-correlated modes produced by spontaneous parametric down-conversion. We develop a measurement model that takes into account all possible sources of loss and noise not related to the quantum efficiency, accounting for all uncertainty contributions, and we reach a relative uncertainty of 0.3% in the low-photon-flux regime. This represents a significant step forward for the characterization of (scientific) CCDs used in the mesoscopic light regime.

I. Ruo-Berchera; A. Meda; I. P. Degiovanni; G. Brida; M. L. Rastello; M. Genovese

2014-05-07

43

Wide dynamic range video camera  

NASA Technical Reports Server (NTRS)

A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

Craig, G. D. (inventor)

1985-01-01

44

CCD Camera Lens Interface for Real-Time Theodolite Alignment  

NASA Technical Reports Server (NTRS)

Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

Wake, Shane; Scott, V. Stanley, III

2012-01-01

45

Video compressive sensing for spatial multiplexing cameras  

E-print Network

CS-MUVI: video compressive sensing for spatial multiplexing cameras (slide presentation by Aswin Sankaranarayanan). Motion information can help in obtaining better tradeoffs [Reddy et al. 2011] relative to naïve reconstruction and state-of-the-art video compression; key points include motion estimates and motion blur.

46

Design of 300 frames per second 16-port CCD video processing circuit  

NASA Astrophysics Data System (ADS)

It is hard to achieve speeds of hundreds of frames per second in high-resolution charge coupled device (CCD) cameras, because the pixel charges must be read out one by one in serial mode, which takes a lot of time. Multiple-port CCD technology is an efficient new way to realize high-frame-rate, high-resolution solid-state imaging systems: the pixel charge is read out from a multiple-port CCD through several ports in parallel, which decreases the readout time. However, the video processing circuit of a multiple-port CCD is hard to design, and real-time high-speed image data acquisition is also a knotty problem. A 16-port high-frame-rate CCD video processing circuit based on a Complex Programmable Logic Device (CPLD) and the VSP5010 has been developed around a specialized back-illuminated, 512 x 512 pixel, 400 fps (frames per second) frame-transfer CCD sensor from Sarnoff. A CPLD is used to produce high-precision sample clocks and timing, and highly accurate sampling of the CCD video voltage is achieved with Correlated Double Sampling (CDS) technology. Eight VSP5010 chips with CDS function are adopted to sample and digitize the CCD analog signals into 12-bit digital image data; thus the 16 analog CCD outputs are digitized into 192-bit-wide, 6.67 MHz parallel digital image data. The CPLD and Time Division Multiplexing (TDM) technology are then used to encode the 192-bit-wide data into two 640 MHz serial data streams transmitted to a remote data acquisition module via two fibers. The acquisition module decodes the serial data into the original image data and stores it in a frame cache; the software then reads the data from the frame cache over USB 2.0 and stores it on a hard disk. The digital image data, with 12 bits per pixel, was collected and displayed with the system software. The results show that the 16-port, 300 fps CCD output signals can be digitized and transmitted with this video processing circuit, and remote data acquisition has been realized.
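The figures quoted in the abstract cross-check against each other with plain arithmetic: 512 x 512 pixels split across 16 ports at 6.67 MHz per port bounds the frame rate near 400 fps, and 16 ports of 12-bit data at 6.67 MHz totals about 1.28 Gb/s, i.e. roughly 640 Mb/s on each of the two fibers.

```python
PIXELS = 512 * 512       # pixels per frame
PORTS = 16               # parallel CCD output ports
PIXEL_RATE = 6.67e6      # per-port pixel rate, Hz
BITS_PER_PIXEL = 12      # ADC resolution
FIBERS = 2               # serial links to the acquisition module

pixels_per_port = PIXELS / PORTS                 # 16,384 pixels/port
readout_time = pixels_per_port / PIXEL_RATE      # ~2.46 ms per frame
max_frame_rate = 1 / readout_time                # ~407 fps upper bound

aggregate_bits = PORTS * BITS_PER_PIXEL * PIXEL_RATE  # ~1.28 Gb/s total
per_fiber = aggregate_bits / FIBERS                   # ~640 Mb/s per fiber

print(max_frame_rate, per_fiber)
```

The ~407 fps bound leaves headroom over the 300 fps operating rate for blanking and frame-transfer overhead.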

Yang, Shao-hua; Guo, Ming-an; Li, Bin-kang; Xia, Jing-tao; Wang, Qunshu

2011-08-01

47

High-speed gated, high-resolution digital intensified CCD camera  

Microsoft Academic Search

We have developed a high-speed gated, high-resolution digital intensified CCD (D-ICCD) camera. This camera consists of a highly sensitive, high-speed gated image intensifier (UV to infrared image intensifier or X-ray image intensifier) coupled to a digital CCD camera via a fiber optic plate (or relay lens). This camera has an IEEE 1394 interface for control and data transfer to a

Hidehiro Kume; Toshiyuki Kakihara; Haruhito Nakamura

2003-01-01

48

Development of high-speed video cameras  

NASA Astrophysics Data System (ADS)

Presented in this paper is an outline of the R and D activities on high-speed video cameras that have been carried out at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searching journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996; the sensor is the same one developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a 1,000,000 fps video camera with an ISIS, In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, the design of a prototype ISIS is in progress, and hopefully it will be fabricated in the near future. Epoch-making cameras in the history of high-speed video camera development by others are also briefly reviewed.

Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

2001-04-01

49

Advanced High-Definition Video Cameras  

NASA Technical Reports Server (NTRS)

A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 x 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.
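The claim about unusually high clock frequencies can be made concrete with plain arithmetic from the quoted figures: a 3,840 x 2,160 format scanned progressively at 30 fps implies roughly 249 million active pixels per second before blanking overhead.

```python
H, V, FPS = 3840, 2160, 30       # format and frame rate from the abstract

# Active pixel throughput, excluding horizontal/vertical blanking.
active_pixel_rate = H * V * FPS  # pixels per second
print(f"{active_pixel_rate / 1e6:.1f} Mpixel/s")  # 248.8 Mpixel/s
```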

Glenn, William

2007-01-01

50

A CCD CAMERA-BASED HYPERSPECTRAL IMAGING SYSTEM FOR STATIONARY AND AIRBORNE APPLICATIONS  

Technology Transfer Automated Retrieval System (TEKTRAN)

This paper describes a charge coupled device (CCD) camera-based hyperspectral imaging system designed for both stationary and airborne remote sensing applications. The system consists of a high performance digital CCD camera, an imaging spectrograph, an optional focal plane scanner, and a PC comput...

51

Use of a wide angle CCD line camera for BRDF measurements  

Microsoft Academic Search

In order to determine the Bi-directional Reflectance Distribution Function (BRDF) of natural surfaces, a CCD line camera is used. This allows measurements under natural conditions with high azimuth and zenith angular resolution in a short time. The CCD line spans a field of view of 80° as the zenith angle range. For covering the azimuth range, the camera is

A. Demircan; R. Schuster; M. Radke; M. Schönermark; H. P. Röser

2000-01-01

52

Video Analysis with a Web Camera  

ERIC Educational Resources Information Center

Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as

Wyrembeck, Edward P.

2009-01-01

53

Object tracking using multiple camera video streams  

NASA Astrophysics Data System (ADS)

Two synchronized cameras are utilized to obtain independent video streams that detect moving objects from two different viewing angles, with the video frames directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusions, which can leave an object in partial or full view in one camera while the same object is fully visible in another. Object registration is achieved by determining the location of common features of the moving object across simultaneous frames, and perspective differences are adjusted. Combining information from the images of multiple cameras increases the robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, both in each stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection depends on the speed of the object as well as variations in the direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that the scenes of at least two nearby cameras overlap. An object can then be tracked continuously over long distances or across multiple cameras, applicable, for example, in wireless sensor networks for surveillance or navigation.
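The per-frame detection step the abstract describes (finding anomalies caused by object movement between time-correlated frames) can be sketched minimally as frame differencing followed by a centroid estimate. The threshold and array shapes below are illustrative assumptions, not the paper's algorithm details.

```python
import numpy as np

def detect_motion_centroid(prev_frame, frame, threshold=25):
    """Flag pixels whose change between consecutive frames exceeds a
    threshold; return the (row, col) centroid of the moving region,
    or None when nothing moved."""
    diff = np.abs(frame.astype(np.int32) - prev_frame.astype(np.int32))
    moving = diff > threshold
    if not moving.any():
        return None
    rows, cols = np.nonzero(moving)
    return rows.mean(), cols.mean()

# With two synchronized streams, tracking reduces to running this per
# camera and pairing the centroids frame by frame (perspective
# adjustment and occlusion handling omitted in this sketch).
```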

Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

2010-05-01

54

Measuring neutron fluences and gamma/x ray fluxes with CCD cameras  

NASA Astrophysics Data System (ADS)

The capability to measure bursts of neutron fluences and gamma/x-ray fluxes directly with charge coupled device (CCD) cameras, while being able to distinguish between the video signals produced by these two types of radiation even when they occur simultaneously, has been demonstrated. Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) Frame Transfer (FT) charge coupled devices (CCDs) from irradiation with pulsed neutrons (14 MeV) and Bremsstrahlung photons (4-12 MeV endpoint) are utilized to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate ~0.05 V/rad responsivity, with >=1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or "peaks", binned by area and amplitude as functions of fluence in the 10^5 to 10^7 n/cm^2 range, indicate smearing over ~1 to 10 percent of the CCD array, with charge per pixel ranging between noise and saturation levels.

Yates, G. J.; Smith, G. W.; Zagarino, P.; Thomas, M. C.

55

Photogrammetric Applications of Immersive Video Cameras  

NASA Astrophysics Data System (ADS)

The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene with a 360° field of view, recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This causes problems when stitching together individual video frames from particular cameras; however, there are ways to overcome it, and applying immersive cameras in photogrammetry offers new potential. The paper presents two applications of immersive video in photogrammetry. First, the creation of a low-cost mobile mapping system based on a Ladybug3 and a GPS device is discussed. The number of panoramas is far higher than needed for photogrammetric purposes, as the baseline between spherical panoramas is around 1 metre. More than 92,000 panoramas were recorded in the Polish region of Czarny Dunajec, and measurements from the panoramas enable the user to measure outdoor advertising structures and billboards. A new law is being created to limit the number of illegal advertising structures in the Polish landscape, and immersive video recorded in a short period of time is a candidate for economical and flexible off-site measurements. The second approach is the generation of 3D video-based reconstructions of heritage sites from immersive video (structure from immersive video). A mobile camera mounted on a tripod dolly was used to record an interior scene, and the immersive video, separated into thousands of still panoramas, was converted into 3D objects using Agisoft Photoscan Professional. The findings from these experiments demonstrate that immersive photogrammetry is a flexible and prompt method of 3D modelling and provides promising features for mobile mapping systems.

Kwiatek, K.; Tokarczyk, R.

2014-05-01

56

Apogee Imaging Systems Alta F42 CCD Camera with Back-Illuminated e2v CCD42-40 - The Apogee Alta F42 CCD Camera has a back-illuminated, full-frame, 4-megapixel e2v CCD42-40  

E-print Network

The sensor offers high quantum efficiency; midband, broadband, and UV-enhanced versions of the CCD are available. The camera includes an international power supply and an ActiveX driver. The Apogee Alta® F Series represents the next step in the evolution of the Alta line of cameras, with faster readout speeds and Alta reliability.

Kleinfeld, David

57

Ultrahigh-definition experimental camera system with an 8M-pixel CCD  

NASA Astrophysics Data System (ADS)

An ultra-high definition experimental camera system has been designed with double the horizontal and vertical resolution of HDTV. An 8M-pixel CCD with a progressive 60 frame-per-second scan rate has been developed for the system. The 34 mm x 17.2 mm image area has 4046 (H) x 2048 (V) active imaging pixels on an 8.4-micrometer square pixel pitch. This CCD has a split-frame transfer structure and sixteen 37.125 MHz outputs, so that the vertical and horizontal transfer frequencies are almost the same as those of HDTV. The split-frame transfer structure halves the required VCCD clock speeds and thus improves charge transfer efficiency. The multiple-output structure with its 16 outputs enables high-data-rate imaging for ultra-high-resolution moving pictures. In the signal processing section, analog gain adjustment circuits correct for mismatches in the characteristics of the outputs, and correlated double-sampling technology is employed on each of the 16 CCD output signals. The output signals are digitized by 12-bit ADCs and sent to the digital signal processing (DSP) circuits. In the DSP circuits, the upper half of the captured image is vertically inverted; all of the output data is then merged into a 4K x 2K pixel image and reformatted into twenty-four 640 (H) x 480 (V) pixel sub-images for image processing. After contour compensation processing, the video signals are converted to analog and presented on two ultra-high-resolution video monitors.
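As a consistency check on the figures quoted above: 4046 x 2048 active pixels at 60 fps spread over 16 outputs corresponds to roughly a 31 MHz active pixel rate per output, comfortably under the 37.125 MHz output clock once blanking intervals are accounted for. (Plain arithmetic from the abstract; no hardware details assumed.)

```python
H, V = 4046, 2048        # active pixels per frame
FPS = 60                 # progressive frame rate
OUTPUTS = 16             # parallel CCD outputs
OUTPUT_CLOCK = 37.125e6  # per-output clock, Hz

active_rate = H * V * FPS / OUTPUTS    # active pixels/s per output
headroom = OUTPUT_CLOCK - active_rate  # clock budget left for blanking

print(f"{active_rate / 1e6:.1f} MHz active, {headroom / 1e6:.1f} MHz headroom")
```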

Mitani, Kohji; Sugawara, Masayuki; Shimamoto, Hiroshi; Smith, Charles R.; Farrier, Michael G.; Tang, Queintin; Okano, Fumio

2000-05-01

58

Automatic DEM Generation from CE-1's CCD Stereo Camera Images  

NASA Astrophysics Data System (ADS)

The goal of the CCD Stereo Camera is to acquire 3D images of the lunar surface between 70°S and 70°N. We describe the image acquisition process, the configuration of the imaging system, the camera sensor model, the camera trajectory model, and the EFP photogrammetric triangulation algorithm.

Liu, J. J.; Ren, X.; Mu, L. L.; Zhao, B. C.; Xiangli, B.; Yang, J. F.; Zou, Y. L.; Zhang, H. B.; Lu, C.; Liu, J. Z.; Zuo, W.; Su, Y.; Wen, W. B.; Bian, W.; Zou, X. D.; Li, C. L.

2009-03-01

59

An ultrahigh-speed color video camera operating at 1,000,000 fps with 288 frame memories  

Microsoft Academic Search

We developed an ultrahigh-speed color video camera that operates at 1,000,000 fps (frames per second) and has the capacity to store 288 frame memories. In 2005, we developed an ultrahigh-speed, high-sensitivity portable color camera with a 300,000-pixel single CCD (ISIS-V4: In-situ Storage Image Sensor, Version 4). Its ultrahigh-speed shooting capability of 1,000,000 fps was made possible by directly connecting CCD storages,

K. Kitamura; T. Arai; J. Yonai; T. Hayashida; T. Kurita; H. Maruyama; J. Namiki; T. Yanagi; T. Yoshida; H. van Kuijk; Jan T. Bosiers; A. Saita; S. Kanayama; K. Hatade; S. Kitagawa; T. Goji Etoh

2008-01-01

60

Large Format, Dual Head,Triple Sensor, Self-Guiding CCD Cameras  

E-print Network

Part of the camera is easily removed for changing filters; since the CCD is in a separate sealed chamber, removal does not disturb the sealed CCD chamber. A custom LRGBC filter set is optional. 12 VDC operation: when operating in the field from a 12 V battery, the camera tolerates input voltage variation.

Walter, Frederick M.

61

Digital video camera for application of particle image velocimetry in high-speed flows  

Microsoft Academic Search

A high-speed digital camera based on video technology for application of particle image velocimetry in wind tunnels is described. The camera contains two independently triggerable interline CCD sensors which are mounted on two faces of a cube beam splitter permitting the use of a single lens. Each of the sensors has a minimal exposure time of 0.8 microsecond(s) with a

Christian Willert; Boleslaw Stasicki; Markus Raffel; Juergen Kompenhans

1995-01-01

62

Development of high-speed video cameras  

Microsoft Academic Search

Presented in this paper is an outline of the R and D activities on high-speed video cameras that have been carried out at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed

Takeharu Etoh; Kohsei Takehara; Tomoo Okinaka; Yasuhide Takano; Arno Ruckelshausen; Dirk Poggemann

2001-01-01

63

Measuring neutron fluences and gamma/x-ray fluxes with CCD cameras  

SciTech Connect

The capability to measure bursts of neutron fluences and gamma/x-ray fluxes directly with charge coupled device (CCD) cameras, while being able to distinguish between the video signals produced by these two types of radiation even when they occur simultaneously, has been demonstrated. Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) Frame Transfer (FT) charge coupled devices (CCDs) from irradiation with pulsed neutrons (14 MeV) and Bremsstrahlung photons (4--12 MeV endpoint) are utilized to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate ~0.05 V/rad responsivity with >=1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or "peaks", binned by area and amplitude as functions of fluence in the 10^5 to 10^7 n/cm^2 range, indicate smearing over ~1 to 10% of the CCD array with charge per pixel ranging between noise and saturation levels.

Yates, G.J. [Los Alamos National Lab., NM (United States); Smith, G.W. [Ministry of Defense, Aldermaston (United Kingdom). Atomic Weapons Establishment; Zagarino, P.; Thomas, M.C. [EG and G Energy Measurements, Inc., Goleta, CA (United States). Santa Barbara Operations

1991-12-01

64

Measuring neutron fluences and gamma/x-ray fluxes with CCD cameras  

SciTech Connect

The capability to measure bursts of neutron fluences and gamma/x-ray fluxes directly with charge coupled device (CCD) cameras, while being able to distinguish between the video signals produced by these two types of radiation even when they occur simultaneously, has been demonstrated. Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) Frame Transfer (FT) charge coupled devices (CCDs) from irradiation with pulsed neutrons (14 MeV) and Bremsstrahlung photons (4--12 MeV endpoint) are utilized to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate ~0.05 V/rad responsivity with >=1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or "peaks", binned by area and amplitude as functions of fluence in the 10^5 to 10^7 n/cm^2 range, indicate smearing over ~1 to 10% of the CCD array with charge per pixel ranging between noise and saturation levels.

Yates, G.J. (Los Alamos National Lab., NM (United States)); Smith, G.W. (Ministry of Defense, Aldermaston (United Kingdom). Atomic Weapons Establishment); Zagarino, P.; Thomas, M.C. (EG and G Energy Measurements, Inc., Goleta, CA (United States). Santa Barbara Operations)

1991-01-01

65

The image pretreatment based on the FPGA inside digital CCD camera  

NASA Astrophysics Data System (ADS)

In a space project, a digital CCD camera that can image clearly in a 1 lux light environment was required. The CCD sensor ICX285AL produced by SONY Co., Ltd. is used in the camera, and the FPGA (Field Programmable Gate Array) chip XQR2V1000 serves as the timing generator and signal processor inside the camera. In a low-light environment, however, two kinds of random noise become apparent as the camera's variable gain is increased: dark current noise in the image background, and vertical transfer noise. A real-time method for eliminating this noise, based on the FPGA inside the CCD camera, is introduced, and the causes and characteristics of the random noise are analyzed. First, several candidate approaches for eliminating dark current noise were proposed and then simulated in VC++ to compare their speed and effect; a Gaussian filter was chosen for its filtering performance. The vertical transfer noise has the characteristic that the noise points occupy fixed ordinates in the image's two-dimensional coordinates, and its behavior is stable: the gray value of the noise points is 16-20 levels less than that of the surrounding pixels. According to these characteristics, a local median filter is used to clear up the vertical noise. Finally, these algorithms were implemented in the FPGA chip inside the CCD camera. A large number of experiments have shown that the pretreatment has good real-time performance, improving the digital CCD camera's signal-to-noise ratio by 3-5 dB in the low-light environment.
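The column repair described above can be sketched as follows: flag columns whose mean sits well below that of their neighbors, then replace their pixels with a horizontal 3-tap local median. The flagging threshold and neighbor window are illustrative assumptions, not the paper's FPGA implementation.

```python
import numpy as np

def repair_vertical_noise(img, margin=10):
    """Repair dark vertical-transfer-noise columns: a column whose mean
    is more than `margin` gray levels below its neighbors' mean is
    replaced pixel-wise by the median of (left, self, right).
    `margin` is an assumed threshold for illustration."""
    out = img.astype(np.int32).copy()
    for c in range(1, out.shape[1] - 1):
        neighbors_mean = (out[:, c - 1].mean() + out[:, c + 1].mean()) / 2
        if out[:, c].mean() < neighbors_mean - margin:  # flagged column
            out[:, c] = np.median(
                np.stack([out[:, c - 1], out[:, c], out[:, c + 1]]), axis=0)
    return out.astype(img.dtype)
```

Because the noisy value is the minimum of the three taps, the median returns the neighbor value, which is why a median (rather than a mean) fully suppresses the 16-20-level dip without blurring normal columns.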

Tian, Rui; Liu, Yan-ying

2009-07-01

66

Development of filter exchangeable 3CCD camera for multispectral imaging acquisition  

NASA Astrophysics Data System (ADS)

There are many methods of acquiring multispectral images, but a dynamically band-selective, area-scan multispectral camera has not yet been developed. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame, and an electric circuit for parallel image signal processing; in addition, firmware and application software have been developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and filter-exchangeable frame. Computer simulation is required to visualize the pathway of rays inside the prism when redesigning the image splitter; the dimensions of the splitter are determined by computer simulation with options of BK7 glass and non-dichroic coating. These properties were chosen to obtain full-wavelength rays on all film planes. The image splitter is verified with two narrow-waveband line lasers. The filter-exchangeable frame is designed to allow swapping bandpass filters without displacing the image sensors on the film plane. The developed 3CCD camera is evaluated in an application detecting scab and bruises on Fuji apples. The results show that the filter-exchangeable 3CCD camera can provide meaningful functionality for various multispectral applications that need to exchange bandpass filters.

Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

2012-05-01

67

Video Chat with Multiple Cameras John MacCormick  

E-print Network

Dickinson College Technical Report, March 2012. The dominant paradigm for video chat employs a single camera at each end of the conversation. Benchmark experiments employing up to four webcams simultaneously demonstrate that multi-camera video chat

MacCormick, John

68

Use of a wide angle CCD line camera for BRDF measurements  

NASA Astrophysics Data System (ADS)

In order to determine the Bi-directional Reflectance Distribution Function (BRDF) of natural surfaces, a CCD line camera is used. This allows measurements under natural conditions with high azimuth and zenith angular resolution in a short time. The CCD line spans a field of view of 80° as the zenith angle range. For covering the azimuth range, the camera is mounted on a rotating device, and an extendible boom provides an aerial platform. This setup allows the measurement of the (almost) complete reflectance distribution of the surface below the camera within the 30-s rotation period of the camera. The camera used for this setup is the Wide Angle Airborne Camera (WAAC), which was developed at DLR for airborne stereo imaging purposes. This paper presents the radiometric calibration of the system and shows initial results of our approach to measuring the BRDF with high angular resolution in a short period.

Demircan, A.; Schuster, R.; Radke, M.; Schönermark, M.; Röser, H. P.

2000-02-01

69

Radiation damage of the PCO Pixelfly VGA CCD camera of the BES system on KSTAR tokamak  

NASA Astrophysics Data System (ADS)

A PCO Pixelfly VGA CCD camera, part of the Beam Emission Spectroscopy (BES) diagnostic system of the Korea Superconducting Tokamak Advanced Research (KSTAR) device and used for spatial calibrations, suffered serious radiation damage: white pixel defects were generated in it. The main goal of this work was to identify the origin of the radiation damage and to propose solutions to avoid it. A Monte Carlo N-Particle eXtended (MCNPX) model was built using the Monte Carlo Modeling Interface Program (MCAM), and calculations were carried out to predict the neutron and gamma-ray fields at the camera position. In addition to the MCNPX calculations, pure gamma-ray irradiations of the CCD camera were carried out in the Training Reactor of BME. Before, during, and after the irradiations, numerous frames were taken with the camera with 5 s exposure times. Evaluation of these frames showed that at the applied high gamma-ray dose (1.7 Gy) and dose-rate levels (up to 2 Gy/h), the number of white pixels did not increase. We found that the origin of the white pixel generation was the neutron-induced thermal hopping of the electrons, which means that in the future only neutron shielding is necessary around the CCD camera. Another solution could be to replace the CCD camera with a more radiation-tolerant one, for example a suitable CMOS camera, or to apply both solutions simultaneously.
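The frame-evaluation step above (counting white pixels across repeated dark exposures) can be sketched as follows; this is an illustrative numpy sketch with a synthetic frame stack, and the 8-bit defect threshold is an assumption, not the authors' actual criterion:

```python
import numpy as np

def count_white_pixels(frames, threshold=200):
    """Count pixels that exceed `threshold` in every frame of a stack.

    A pixel bright in all dark frames is a candidate persistent defect
    rather than transient noise from a direct neutron/gamma hit.
    """
    stack = np.asarray(frames)
    defect_map = np.all(stack > threshold, axis=0)  # persistent across frames
    return int(defect_map.sum()), defect_map

# Synthetic 8-bit dark frames: two persistent hot pixels plus low noise
rng = np.random.default_rng(0)
frames = rng.integers(0, 20, size=(10, 64, 64))
frames[:, 5, 7] = 255
frames[:, 30, 31] = 250
n, dmap = count_white_pixels(frames)
print(n)  # 2
```

Comparing `n` between pre- and post-irradiation frame sets indicates whether a given exposure generated new defects.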

Náfrádi, Gábor; Kovácsik, Ákos; Pór, Gábor; Lampert, Máté; Un Nam, Yong; Zoletnik, Sándor

2015-01-01

70

Synchronizing A Stroboscope With A Video Camera  

NASA Technical Reports Server (NTRS)

Circuit synchronizes flash of light from stroboscope with frame and field periods of video camera. Sync stripper sends vertical-synchronization signal to delay generator, which generates trigger signal. Flashlamp power supply accepts delayed trigger signal and sends pulse of power to flash lamp. Designed for use in making short-exposure images that "freeze" flow in wind tunnel. Also used for making longer-exposure images obtained by use of continuous intense illumination.
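The trigger chain above (sync stripper, delay generator, flashlamp supply) amounts to firing the lamp a fixed delay after vertical sync. A minimal timing sketch, assuming NTSC-like field timing (approximately 16.683 ms per field, 262.5 lines); the line numbers and values are illustrative, not from the circuit described:

```python
def flash_delay_us(target_line, lines_per_field=262.5, field_period_us=16683.0):
    """Delay from vertical sync to flash trigger, in microseconds.

    Assumes a uniform line rate; a delay generator programmed with
    this value fires the lamp while the camera scans `target_line`.
    """
    line_time = field_period_us / lines_per_field
    return target_line * line_time

d = flash_delay_us(100)
print(round(d, 1))
```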

Rhodes, David B.; Franke, John M.; Jones, Stephen B.; Dismond, Harriet R.

1993-01-01

71

RS-170 to 700 frame-per-second CCD camera  

NASA Astrophysics Data System (ADS)

A versatile new camera, the Los Alamos National Laboratory (LANL) model GY6, is described. It operates at a wide variety of frame rates, from RS-170 to 700 frames per second. The camera operates as an NTSC compatible black and white camera when operating at RS- 170 rates. When used for variable high-frame rates, a simple substitution is made of the RS- 170 sync/clock generator circuit card with a high speed emitter-coupled logic (ECL) circuit card.

Albright, Kevin L.; King, Nicholas S. P.; Yates, George J.; McDonald, Thomas E.; Turko, Bojan T.

1993-10-01

72

An RS-170 to 700 frame per second CCD camera  

SciTech Connect

A versatile new camera, the Los Alamos National Laboratory (LANL) model GY6, is described. It operates at a wide variety of frame rates, from RS-170 to 700 frames per second. The camera operates as an NTSC compatible black and white camera when operating at RS-170 rates. When used for variable high-frame rates, a simple substitution is made of the RS-170 sync/clock generator circuit card with a high speed emitter-coupled logic (ECL) circuit card.

Albright, K.L.; King, N.S.P.; Yates, G.J.; McDonald, T.E. [Los Alamos National Lab., NM (United States); Turko, B.T. [Lawrence Berkeley Lab., CA (United States)

1993-08-01

73

Automated CCD camera characterization. 1998 summer research program for high school juniors at the University of Rochester's Laboratory for Laser Energetics: Student research reports  

SciTech Connect

The OMEGA system uses CCD cameras for a broad range of applications. Over 100 video-rate CCD cameras are used for purposes such as targeting, aligning, and monitoring areas such as the target chamber, laser bay, and viewing gallery. There are approximately 14 scientific-grade CCD cameras on the system, which are used to obtain precise photometric results from the laser beam as well as target diagnostics. It is very important that these scientific-grade CCDs are properly characterized so that the results received from them can be evaluated appropriately. Currently, characterization is a tedious process done by hand: the operator must operate the camera and light source simultaneously. Because more exposures mean more accurate information on the camera, the characterization tests can become very lengthy affairs; sometimes it takes an entire day to complete just a single plot. Characterization requires testing many aspects of the camera's operation, including the following: variance vs. mean signal level, which should be proportional due to Poisson statistics of the incident photon flux; linearity, the ability of the CCD to produce signals proportional to the light it receives; signal-to-noise ratio, the relative magnitude of the signal vs. the uncertainty in that signal; and dark current, the amount of noise due to thermal generation of electrons (cooling lowers this noise contribution significantly). These tests, as well as many others, must be conducted in order to properly understand a CCD camera. The goal of this project was to construct an apparatus that could characterize a camera automatically.
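The variance-vs-mean test above is the photon transfer method: for shot-noise-limited data, variance (ADU²) is proportional to mean signal (ADU) with slope 1/gain. A minimal sketch with synthetic, noise-free data; real data would first subtract dark frames and read noise:

```python
import numpy as np

def ptc_gain(mean_levels, variances):
    """Estimate CCD gain (e-/ADU) from a photon transfer curve.

    Shot-noise statistics give variance = mean / gain in ADU units,
    so the slope of a least-squares line fit of variance vs. mean
    signal is 1/gain.
    """
    slope, _ = np.polyfit(mean_levels, variances, 1)
    return 1.0 / slope

# Synthetic camera with gain 2.0 e-/ADU: variance = mean / 2
means = np.array([100.0, 400.0, 900.0, 1600.0])
variances = means / 2.0
g = ptc_gain(means, variances)
print(g)  # ≈ 2.0
```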

Silbermann, J. [Penfield High School, NY (United States)

1999-03-01

74

The research of the accurate measure of static transfer function for the TDI CCD camera  

NASA Astrophysics Data System (ADS)

In testing the static transfer function of a TDI CCD camera, environmental and operator-related factors cause the measured value to fluctuate continuously, reducing accuracy. To solve this problem, an accurate measurement technique for the static transfer function is proposed. First, before the measurement, the best test point of the camera's transfer function must be determined: the rectangular target surface of the collimator must be kept parallel to the camera's focal plane, and the target stripes must be kept perpendicular to the TDI CCD lines in the focal plane. The TDI CCD then captures images of the rectangular target; every 1000 lines of target imagery form one measurement sample of the static transfer function. Samples in which the target image is distorted or blurred by atmospheric turbulence are excluded, and 500 clear, stable target images are retained as the measurement sample set. The static transfer function of each sample is then calculated, and the average over the sample set is taken as the camera's static transfer function. Finally, an error analysis of the measurement is performed. Experimental results indicate that the static transfer function of the TDI CCD camera measured with this method is 0.2923, which is 0.02 higher than the value obtained with the previous measurement technique, improving the accuracy of the static transfer function measurement.
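The sample-set procedure above (reject disturbed samples, average the rest) can be sketched as follows; an illustrative numpy sketch in which the rejection criterion (distance from the median) and the values are assumptions, since the paper rejects samples by visual distortion:

```python
import numpy as np

def average_transfer_function(samples, keep=0.5):
    """Average per-sample transfer-function values after outlier rejection.

    Repeated measurements fluctuate (air turbulence, vibration), so the
    fraction `keep` of samples closest to the median is retained and
    averaged (the paper keeps 500 of its 1000-line samples).
    """
    s = np.asarray(samples, dtype=float)
    dev = np.abs(s - np.median(s))
    order = np.argsort(dev, kind="stable")
    kept = s[order[: max(1, int(len(s) * keep))]]
    return float(kept.mean())

vals = [0.30, 0.30, 0.10, 0.30, 0.50, 0.30, 0.32, 0.30]  # two gross outliers
avg = average_transfer_function(vals)
print(round(avg, 3))  # 0.3
```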

Li, Guo-Ning; Wang, Wen-Hua; Han, Shuang-Li; Jin, Long-Xu; Liu, Yan-Yan

2011-11-01

75

The research of the accurate measure of static transfer function for the TDI CCD camera  

NASA Astrophysics Data System (ADS)

In testing the static transfer function of a TDI CCD camera, environmental and operator-related factors cause the measured value to fluctuate continuously, reducing accuracy. To solve this problem, an accurate measurement technique for the static transfer function is proposed. First, before the measurement, the best test point of the camera's transfer function must be determined: the rectangular target surface of the collimator must be kept parallel to the camera's focal plane, and the target stripes must be kept perpendicular to the TDI CCD lines in the focal plane. The TDI CCD then captures images of the rectangular target; every 1000 lines of target imagery form one measurement sample of the static transfer function. Samples in which the target image is distorted or blurred by atmospheric turbulence are excluded, and 500 clear, stable target images are retained as the measurement sample set. The static transfer function of each sample is then calculated, and the average over the sample set is taken as the camera's static transfer function. Finally, an error analysis of the measurement is performed. Experimental results indicate that the static transfer function of the TDI CCD camera measured with this method is 0.2923, which is 0.02 higher than the value obtained with the previous measurement technique, improving the accuracy of the static transfer function measurement.

Li, Guo-ning; Wang, Wen-hua; Han, Shuang-li; Jin, Long-xu; Liu, Yan-Yan

2012-01-01

76

The research of the accurate measure of static transfer function for the TDI CCD camera  

NASA Astrophysics Data System (ADS)

In testing the static transfer function of a TDI CCD camera, environmental and operator-related factors cause the measured value to fluctuate continuously, reducing accuracy. To solve this problem, an accurate measurement technique for the static transfer function is proposed. First, before the measurement, the best test point of the camera's transfer function must be determined: the rectangular target surface of the collimator must be kept parallel to the camera's focal plane, and the target stripes must be kept perpendicular to the TDI CCD lines in the focal plane. The TDI CCD then captures images of the rectangular target; every 1000 lines of target imagery form one measurement sample of the static transfer function. Samples in which the target image is distorted or blurred by atmospheric turbulence are excluded, and 500 clear, stable target images are retained as the measurement sample set. The static transfer function of each sample is then calculated, and the average over the sample set is taken as the camera's static transfer function. Finally, an error analysis of the measurement is performed. Experimental results indicate that the static transfer function of the TDI CCD camera measured with this method is 0.2923, which is 0.02 higher than the value obtained with the previous measurement technique, improving the accuracy of the static transfer function measurement.

Guo-ning, Li; Long-xu, Jin; Jian-yue, Ren; Wen-hua, Wang; Shuang-li, Han

2011-02-01

77

Photometric Calibration of Consumer Video Cameras  

NASA Technical Reports Server (NTRS)

Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation.This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an endto- end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). 
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral- density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
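The calibration above amounts to tabulating measured output signal against known input brightness from the artificial variable star, then inverting that table for each analyzed frame. A minimal sketch with a hypothetical saturating response; the function names and the exponential response model are illustrative assumptions, not the authors' software:

```python
import numpy as np

def build_response_curve(input_brightness, measured_signal):
    """Tabulate a camera's response (measured signal vs. known brightness)."""
    order = np.argsort(measured_signal)
    return np.asarray(measured_signal)[order], np.asarray(input_brightness)[order]

def invert_response(signal, curve):
    """Recover source brightness from a measured signal by interpolating
    the calibration table, extending usefully beyond the linear range."""
    sig_tab, bright_tab = curve
    return np.interp(signal, sig_tab, bright_tab)

# Hypothetical nonlinear camera: signal = 255 * (1 - exp(-brightness/100))
b = np.linspace(0, 500, 51)
sig = 255.0 * (1.0 - np.exp(-b / 100.0))
curve = build_response_curve(b, sig)
r = float(invert_response(sig[25], curve))
print(r)  # ≈ 250.0
```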

Suggs, Robert; Swift, Wesley, Jr.

2007-01-01

78

The In-flight Spectroscopic Performance of the Swift XRT CCD Camera During 2006-2007  

NASA Technical Reports Server (NTRS)

The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 135 eV FWHM at 5.9 keV as measured before launch. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Comparison of observed spectra with models folded through the instrument response produces negative residuals around and below the Oxygen edge. We discuss several possible causes for such residuals. Traps created by proton damage on the CCD increase the charge transfer inefficiency (CTI) over time. We describe the evolution of the CTI since the launch and its effect on the CCD spectral resolution and the gain.
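The charge transfer inefficiency (CTI) growth described above attenuates measured charge with each pixel-to-pixel transfer. A first-order correction can be sketched as follows; this is illustrative only (the actual Swift XRT response model treats trap capture and release in detail), and the CTI value and transfer count are made-up numbers:

```python
def cti_correct(measured_adu, cti, n_transfers):
    """First-order charge-transfer-inefficiency correction.

    Each of the `n_transfers` shifts leaves behind a fraction `cti`
    of the charge, so the surviving fraction is (1 - cti)**n_transfers;
    dividing by it restores an estimate of the original signal.
    """
    return measured_adu / (1.0 - cti) ** n_transfers

orig = 1000.0
seen = orig * (1.0 - 1e-5) ** 600   # 600 row transfers at CTI = 1e-5
c = cti_correct(seen, 1e-5, 600)
print(round(c, 6))
```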

Godet, O.; Beardmore, A.P.; Abbey, A.F.; Osborne, J.P.; Page, K.L.; Evans, P.; Starling, R.; Wells, A.A.; Angelini, L.; Burrows, D.N.; Kennea, J.; Campana, S.; Chincarini, G.; Citterio, O.; Cusumano, G.; LaParola, V.; Mangano, V.; Mineo, T.; Giommi, P.; Perri, M.; Capalbi, M.; Tamburelli, F.

2007-01-01

79

The In-flight Spectroscopic Performance of the Swift XRT CCD Camera  

NASA Astrophysics Data System (ADS)

The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 145 eV FWHM at 5.9keV. We describe the status of the CCD X-ray spectral response matrices, which are made using a Monte-Carlo simulation technique based on physical models of the CCD response. We emphasize how the model has been refined following in-flight experiences with celestial and on-board calibration sources.

Beardmore, Andrew; Godet, O.; Abbey, A. F.; Osborne, J. P.; Page, K. L.; Wells, A. A.; Swift XRT Team

2006-09-01

80

Mobile phone camera-based video scanning of paper documents  

E-print Network

Mobile phone camera-based video scanning of paper documents. Muhammad Muzzamil Luqman, Petra Gomez [...], France ({muhammad_muzzamil.luqman,petra.gomez,jean-marc.ogier}@univ-lr.fr). Abstract: [...] research on a mobile phone camera-based document image mosaic reconstruction method for video scanning [...]

Paris-Sud XI, Université de

81

Optical synthesizer for a large quadrant-array CCD camera: Center director's discretionary fund  

NASA Technical Reports Server (NTRS)

The objective of this program was to design and develop an optical device, an optical synthesizer, that focuses four contiguous quadrants of a solar image on four spatially separated CCD arrays that are part of a unique CCD camera system. This camera and the optical synthesizer will be part of the new NASA-Marshall Experimental Vector Magnetograph, an instrument developed to measure the Sun's magnetic field as accurately as present technology allows. The tasks undertaken in the program are outlined and the final detailed optical design is presented.

Hagyard, Mona J.

1992-01-01

82

Research on detecting heterogeneous fibre from cotton based on linear CCD camera  

NASA Astrophysics Data System (ADS)

Heterogeneous fibres in cotton greatly affect cotton textile production: they degrade product quality and thereby the economic benefit and market competitiveness of the producer. Detecting and eliminating heterogeneous fibres is therefore particularly important for improving cotton processing, raising the quality of cotton textiles, and reducing production cost, and the technology has favorable market value and development prospects. Optical detection systems are widely applied to this task. In our system, a linear CCD camera scans the running cotton; the video signals are fed into a computer and processed according to grayscale differences, and if a heterogeneous fibre is present, the computer commands a gas nozzle to eject it. In this paper we adopt a monochrome LED array as the new detection light source; its freedom from lamp flicker, its luminous-intensity stability, its lumen depreciation, and its useful life are all superior to fluorescent light. We first analyze the reflection spectra of cotton and various heterogeneous fibres, then select an appropriate light-source frequency, finally adopting a violet LED array as the new detection light source. The overall hardware structure and software design are also introduced in this paper.
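The grayscale-difference step above can be sketched as a simple per-pixel threshold on one line-scan; an illustrative numpy sketch in which the background level, tolerance, and pixel values are assumptions, not the paper's calibrated parameters:

```python
import numpy as np

def detect_foreign_fibre(line_scan, background_level, tolerance=25):
    """Flag line-camera pixels whose grayscale departs from cotton.

    Cotton under the LED illumination images as a fairly uniform bright
    band; heterogeneous fibres show up darker (or brighter). Returns
    the pixel indices that would trigger the air-nozzle ejector.
    """
    scan = np.asarray(line_scan, dtype=int)
    mask = np.abs(scan - background_level) > tolerance
    return np.flatnonzero(mask)

scan = [200] * 10 + [120, 115] + [200] * 8   # dark fibre at pixels 10-11
idx = detect_foreign_fibre(scan, background_level=200)
print(idx.tolist())  # [10, 11]
```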

Zhang, Xian-bin; Cao, Bing; Zhang, Xin-peng; Shi, Wei

2009-07-01

83

A fine image motion compensation method for the panoramic TDI CCD camera in remote sensing applications  

NASA Astrophysics Data System (ADS)

Image motion caused by camera-housing rotation is inherent to a panoramic TDI CCD camera, and it cannot be eliminated completely by traditional motion-compensation schemes. After studying the operation of a typical panoramic TDI CCD imaging system, we present a fine, rolling-rate-independent motion compensation method. Employing a correction factor k, a TDI CCD line-transfer synchronization signal is generated by a high-resolution optical encoder each time the ground scene moves one pixel with respect to the detector. The effect of the motion compensation method is then evaluated using the Monte Carlo method. The simulation results indicate that the magnitude of the modulation transfer function at the Nyquist frequency increases more than three times when the subdivision steps are increased from 100 to 500, and the imaging experiments also show that image quality improvement is achieved.
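The encoder-driven synchronization above reduces to a divider: fire one line-transfer pulse every fixed number of encoder counts, scaled by the correction factor k. A minimal sketch; the function name and the counts/pixels-per-revolution figures are hypothetical, not taken from the paper:

```python
def line_sync_count(encoder_counts_per_rev, ground_pixels_per_rev, k=1.0):
    """Encoder counts between successive TDI line-transfer pulses.

    A pulse must fire each time the scene advances one pixel on the
    detector; with `k` as a correction factor, the divider ratio is
    k * counts_per_rev / pixels_per_rev. Finer encoder subdivision
    (more counts per revolution) reduces sync quantization jitter,
    which is why raising the subdivision steps improves the MTF.
    """
    return k * encoder_counts_per_rev / ground_pixels_per_rev

ratio = line_sync_count(encoder_counts_per_rev=500_000, ground_pixels_per_rev=10_000)
print(ratio)  # 50.0
```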

Wang, Dejiang; Li, Wenming; Yao, Yuan; Huang, Houtian; Wang, Yutang

2013-07-01

84

Color Measurement of Printed Textile using CCD Cameras Harro Stokman Theo Gevers  

E-print Network

Color Measurement of Printed Textile using CCD Cameras. Harro Stokman, Theo Gevers (Intelligent [...]). Abstract: Automated visual inspection of industrial textile printing has the potential to increase [...]. Colors of homogeneously colored textile patches are explained by the dichromatic reflection model. An extra clue [...]

Gevers, Theo

85

Pixel correspondence calibration method of a 2CCD camera based on absolute phase calculation  

NASA Astrophysics Data System (ADS)

This paper presents a novel calibration method to build up pixel correspondence between the IR CCD sensor and the visible CCD sensor of a 2CCD camera by using absolute phase calculation. Vertical and horizontal sinusoidal fringe patterns are projected onto a white plate surface through the visible and infrared (IR) channels of a DLP projector. The visible and IR fringe patterns are captured by the IR sensor and visible sensor respectively. Absolute phase of each pixel at IR and visible channels is calculated by using the optimum three-fringe number selection method. The precise pixel relationship between the two channels can be determined by the obtained absolute phase data. Experimental results show the effectiveness and validity of the proposed 2CCD calibration method. Due to using continuous phase information, this method can accurately give pixel-to-pixel correspondence.
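Once each sensor has an absolute phase map, corresponding pixels are simply those that share a phase value. A 1-D numpy sketch of that lookup, assuming monotonic synthetic phase profiles; the paper's actual method works in both axes and computes the phase via optimum three-fringe number selection, which is not reproduced here:

```python
import numpy as np

def phase_correspondence(phase_ir_row, phase_vis_row):
    """Match IR-sensor pixels to visible-sensor pixels by absolute phase.

    Both sensors view the same projected fringes, so a pixel pair
    imaging the same surface point shares one absolute phase value.
    For each IR pixel, interpolate the (monotonic) visible phase
    profile to a fractional visible-pixel coordinate.
    """
    vis_pixels = np.arange(len(phase_vis_row), dtype=float)
    return np.interp(phase_ir_row, phase_vis_row, vis_pixels)

# Synthetic monotonic phase maps: visible pixel p has phase 0.1*p, and
# the IR sensor sees the same scene shifted by 2.5 visible pixels.
vis_phase = 0.1 * np.arange(100)
ir_phase = 0.1 * (np.arange(97) + 2.5)
corr = phase_correspondence(ir_phase, vis_phase)
print(float(corr[0]))  # ≈ 2.5
```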

Zhang, Zonghua; Zheng, Guoquan; Huang, Shujun

2014-11-01

86

Inexpensive range camera operating at video speed.  

PubMed

An optoelectronic device has been developed and built that acquires and displays the range data of an object surface in space in video real time. The recovery of depth is performed with active triangulation. A galvanometer scanner system sweeps a sheet of light across the object at a video field rate of 50 Hz. High-speed signal processing is achieved through the use of a special optical sensor and hardware implementation of the simple electronic-processing steps. Fifty range maps are generated per second and converted into a European standard video signal where the depth is encoded in gray levels or color. The image resolution currently is 128 x 500 pixels with a depth accuracy of 1.5% of the depth range. The present setup uses a 500-mW diode laser for the generation of the light sheet. A 45-mm imaging lens covers a measurement volume of 93 mm x 61 mm x 63 mm at a medium distance of 250 mm from the camera, but this can easily be adapted to other dimensions. PMID:20820391
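The active-triangulation recovery above can be reduced, for an idealized rectified geometry, to the classic relation z = f·b/d. A minimal sketch; the parameter values are made up for illustration, and the cited camera's real geometry (galvanometer sweep, 45 mm lens, 250 mm stand-off) involves calibration terms omitted here:

```python
def triangulation_depth(baseline_mm, focal_mm, disparity_mm):
    """Depth from active triangulation (pinhole model, parallel setup).

    The laser sheet's image shifts on the sensor as surface depth
    changes; for a rectified geometry, z = f * b / d with baseline b,
    focal length f, and image-plane displacement d.
    """
    return focal_mm * baseline_mm / disparity_mm

z = triangulation_depth(baseline_mm=100.0, focal_mm=45.0, disparity_mm=18.0)
print(z)  # 250.0
```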

Kramer, J; Seitz, P; Baltes, H

1993-05-01

87

Security camera based on a single chip solution using a sharply outlined display algorithm and variable-clock video encoder  

Microsoft Academic Search

In this paper, we have proposed a security camera system that displays high-definition images by using a sharply outlined display algorithm (SODA), which generates less hardware complexity because of a modified video encoder. While the proposed system uses a charge coupled device (CCD) with a complementary filter that may cause some problems in representing vivid color, we have been able

Joohyun Kim; Jooyoung Ha; Shinki Jeong; Hoongee Yang; Bongsoon Kang

2006-01-01

88

Time-resolved spectra of dense plasma focus using spectrometer, streak camera, and CCD combination  

SciTech Connect

A time-resolving spectrographic instrument has been assembled with the primary components of a spectrometer, image-converting streak camera, and CCD recording camera, for the primary purpose of diagnosing highly dynamic plasmas. A collection lens defines the sampled region and couples light from the plasma into a step index, multimode fiber which leads to the spectrometer. The output spectrum is focused onto the photocathode of the streak camera, the output of which is proximity-coupled to the CCD. The spectrometer configuration is essentially Czerny-Turner, but off-the-shelf Nikon refraction lenses, rather than mirrors, are used for practicality and flexibility. Only recently assembled, the instrument requires significant refinement, but has now taken data on both bridge wire and dense plasma focus experiments.

Goldin, F. J. [Livermore Operations, National Security Technologies, LLC, Livermore, California 94550 (United States); Meehan, B. T.; Hagen, E. C. [North Las Vegas Facility, National Security Technologies, LLC, North Las Vegas, Nevada 89030 (United States); Wilkins, P. R. [Lawrence Livermore National Laboratories, Livermore, California 94550 (United States)

2010-10-15

89

Spectrally resolving electro-optical camera system with long linear CCD or staring array detectors  

SciTech Connect

The electro-optical camera system VOS 60 from Carl Zeiss/ZEO has a measured performance close to that of a photogrammetric film camera. The performance of the VOS 60 was further improved with 8,000-net-pixel CCDs. These are installed in various combinations of single or triple CCD sensor heads, which are also available with filter wheels. Full photogrammetric correction including elevation data can be derived from installations with simultaneous forward/backward stereo in combination with professional photogrammetric software packages. A large choice of professional camera lenses by Carl Zeiss for the 6×6 cm² format is available. Camera bodies, spectral separation systems, and electronics form a camera kit that can be adapted to the task in a very flexible way. Several stabilization systems can also be provided. The configuration can be expanded with a preprogrammable recording sequencer with GPS, which allows an automatic recording session to be executed. 6 refs., 1 fig.

Teuchert, W.D.; Mayr, W.; Zeiss, C. [Carl Zeiss, Business Unit Opto-electronic Systems, Oberkochen (Germany)] [and others]

1996-11-01

90

Study of atmospheric discharge characteristics using a standard video camera  

NASA Astrophysics Data System (ADS)

This study presents preliminary statistics on lightning characteristics such as flash multiplicity, number of ground contact points, formation of new and altered channels, and presence of continuing current in the strokes that form a flash. The analysis is based on images from a standard video camera (30 frames/s). The results obtained for some flashes will be compared to images from a high-speed CCD camera (1000 frames/s). The camera observing site is located in São José dos Campos (23° S, 46° W) at an altitude of 630 m. This observational site has a nearly 360° field of view at a height of 25 m, making it possible to observe distant thunderstorms occurring within a radius of 25 km from the site. The room, situated atop a metal structure, has water and power supplies, a telephone line, and a small crane on the roof. KEY WORDS: Video images, Lightning, Multiplicity, Stroke.

Ferraz, E. C.; Saba, M. M. F.

91

An ultrahigh-speed color video camera operating at 1,000,000 fps with 288 frame memories  

NASA Astrophysics Data System (ADS)

We developed an ultrahigh-speed color video camera that operates at 1,000,000 fps (frames per second) and has the capacity to store 288 frame memories. In 2005, we developed an ultrahigh-speed, high-sensitivity portable color camera with a 300,000-pixel single CCD (ISIS-V4: In-situ Storage Image Sensor, Version 4). Its ultrahigh-speed shooting capability of 1,000,000 fps was made possible by directly connecting CCD storages, which record video images, to the photodiodes of individual pixels. The number of consecutive frames was 144. However, longer capture times were demanded when the camera was used during imaging experiments and for some television programs. To increase ultrahigh-speed capture times, we used a beam splitter and two ultrahigh-speed 300,000-pixel CCDs. The beam splitter was placed behind the pickup lens, with one CCD located at each of its two outputs. A CCD driving unit was developed to drive the two CCDs separately, and the recording period of the two CCDs was switched sequentially. This increased the recording capacity to 288 images, a factor of two over that of the conventional ultrahigh-speed camera. A problem with this arrangement was that the incident light on each CCD was reduced by a factor of two by the beam splitter. To improve the light sensitivity, we developed a microlens array for use with the ultrahigh-speed CCDs. We simulated the operation of the microlens array in order to optimize its shape and then fabricated it using stamping technology. Using this microlens array increased the light sensitivity of the CCDs by an approximate factor of two. By using a beam splitter in conjunction with the microlens array, it was possible to make an ultrahigh-speed color video camera that has 288 frame memories without decreasing the camera's light sensitivity.

Kitamura, K.; Arai, T.; Yonai, J.; Hayashida, T.; Kurita, T.; Maruyama, H.; Namiki, J.; Yanagi, T.; Yoshida, T.; van Kuijk, H.; Bosiers, Jan T.; Saita, A.; Kanayama, S.; Hatade, K.; Kitagawa, S.; Etoh, T. Goji

2008-11-01

92

A Multi-Camera Framework for Interactive Video Games  

Microsoft Academic Search

We present a framework that allows for a straightforward development of multi-camera controlled interactive video games. Compared to traditional gaming input devices, cameras provide players with many degrees of freedom and a natural kind of interaction. The use of cameras can even obsolete the need for special clothing or other tracking devices. This partly accounted for the success of the

Tom Cuypers; Cedric Vanaken; Yannick Francken; Frank Van Reeth; Philippe Bekaert

2008-01-01

93

Automated Technology for Video Surveillance Vast numbers of surveillance cameras  

E-print Network

poor. For example, in London, security cameras captured footage of some of the July 2005 subway bombersAutomated Technology for Video Surveillance Vast numbers of surveillance cameras monitor public specific objects while a main camera continues to urvey the larger scene.s Rama Chellappa and Larry Davis

Hill, Wendell T.

94

Scintillator-CCD camera system light output response to dosimetry parameters for proton beam range measurement  

NASA Astrophysics Data System (ADS)

The purpose of this study is to investigate the luminescence light output response of a plastic scintillator irradiated by a 67.5 MeV proton beam under various dosimetry parameters. The relationship of the visible scintillator light to beam current (dose rate), aperture size, and the thickness of water in the water column was studied. The images captured by a CCD camera system were used to determine optimal dosimetry parameters for measuring the range of a clinical proton beam. The method was developed as a simple quality-assurance tool to measure the range of the proton beam and compare it to (a) measurements using two segmented ionization chambers with a water column between them, and (b) measurements with an ionization chamber (IC-18) in water. We used a block of plastic scintillator measuring 555 cm³ to record visible light generated by the 67.5 MeV proton beam. A high-definition digital video camera (Moticam 2300), connected to a PC via a USB 2.0 communication channel, was used to record images of the scintillation luminescence. The brightness of the visible light was measured while changing beam current and aperture size. The results were analyzed to obtain the range, which was compared with Bragg-peak measurements made with an ionization chamber. The luminescence light from the scintillator increased linearly with proton beam current, and the light output also increased linearly with aperture size. The relationship between the proton range in the scintillator and the thickness of the water column showed good linearity, with a precision of 0.33 mm (SD) in the range measurement. For the 67.5 MeV proton beam utilized, the optimal parameters for scintillator light output response were found to be 15 nA (16 Gy/min) and an aperture size of 15 mm, with an image integration time of 100 ms. The Bragg-peak depth-brightness distribution was compared with the depth-dose distribution from ionization chamber measurements, and good agreement was observed. 
The peak/plateau ratio observed for the scintillator was found to be 2.21, as compared to 3.01 for the ionization chamber measurements. The response of a scintillator block-CCD camera system in a 67.5 MeV proton beam was investigated. A linear response was seen between light output and beam current as well as aperture size. The relation between the thickness of water in the water column and the measured range also showed linearity. The results from the scintillator response were used to develop a simple approach to measuring the range and the Bragg peak of a proton beam by recording the visible light from a scintillator block, with an accuracy better than 0.33 mm. Optimal dosimetry parameters for our proton beam were evaluated. This method can be used to confirm the range of a proton beam during daily treatment and will be useful as a daily QA measurement for proton beam therapy.

Daftari, Inder K.; Castaneda, Carlos M.; Essert, Timothy; Phillips, Theodore L.; Mishra, Kavita K.

2012-09-01
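The range-measurement idea in the abstract above (locate the Bragg peak in the depth-brightness profile, then read off a distal falloff depth) can be sketched as follows. The 80% distal level and the synthetic curve are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def proton_range_from_light(depth_mm, brightness, level=0.8):
    """Estimate the Bragg-peak depth and a distal range from a
    depth-brightness profile. The range is defined here (an assumption)
    as the distal depth where the signal falls to `level` * peak."""
    i_peak = int(np.argmax(brightness))
    peak_depth = depth_mm[i_peak]
    b_thr = level * brightness[i_peak]
    # walk the distal edge: first sample beyond the peak below threshold
    below = np.nonzero(brightness[i_peak:] < b_thr)[0]
    if below.size == 0:
        return peak_depth, None
    j = i_peak + below[0]
    # linear interpolation between the bracketing samples
    d0, d1 = depth_mm[j - 1], depth_mm[j]
    b0, b1 = brightness[j - 1], brightness[j]
    range_mm = d0 + (b_thr - b0) * (d1 - d0) / (b1 - b0)
    return peak_depth, range_mm

# synthetic pseudo-Bragg curve peaked at 30 mm on a constant background
depth = np.linspace(0.0, 40.0, 401)
light = np.exp(-((depth - 30.0) ** 2) / 4.0) + 0.3
peak, rng = proton_range_from_light(depth, light)
```

With this toy curve the peak is recovered at 30 mm and the 80% distal edge lands just past 31 mm; the sub-sample interpolation is what allows sub-millimeter precision of the kind the paper reports.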

95

Development of a portable 3CCD camera system for multispectral imaging of biological samples.  

PubMed

Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

2014-01-01
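The per-frame data flow of a 3CCD system like the one above can be sketched as: stack the three synchronized single-waveband frames into one multispectral frame, then apply a band-based rule. The two-band ratio rule and all numeric values below are hypothetical, not the paper's actual wavebands or detection algorithm:

```python
import numpy as np

def assemble_frame(band1, band2, band3):
    """Stack three synchronized single-waveband CCD frames into one
    H x W x 3 multispectral frame (one frame of the 30 fps stream)."""
    frames = [np.asarray(b, dtype=float) for b in (band1, band2, band3)]
    if len({f.shape for f in frames}) != 1:
        raise ValueError("the three CCD frames must share the same geometry")
    return np.stack(frames, axis=-1)

def defect_mask(cube, num=2, den=0, thresh=1.5):
    """Flag pixels whose band ratio exceeds `thresh` -- a hypothetical
    two-band defect index, not the paper's detection rule."""
    ratio = cube[..., num] / np.maximum(cube[..., den], 1e-6)
    return ratio > thresh

b1 = np.full((4, 4), 100.0)
b2 = np.full((4, 4), 110.0)
b3 = np.full((4, 4), 120.0)
b3[1, 2] = 400.0                     # synthetic "defect" pixel
cube = assemble_frame(b1, b2, b3)
mask = defect_mask(cube)
```

The geometry check stands in for the hardware synchronization the paper implements in firmware; in software, a mismatched frame is the analogue of a dropped or buffered transfer.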

96

Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples  

PubMed Central

Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

2014-01-01

97

Initial laboratory evaluation of color video cameras: Phase 2  

SciTech Connect

Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than to identify intruders. The monochrome cameras were selected over color cameras because they have greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Because color camera technology is rapidly changing and because color information is useful for identification purposes, Sandia National Laboratories has established an on-going program to evaluate the newest color solid-state cameras. Phase One of the Sandia program resulted in the SAND91-2579/1 report titled: Initial Laboratory Evaluation of Color Video Cameras. The report briefly discusses imager chips, color cameras, and monitors, describes the camera selection, details traditional test parameters and procedures, and gives the results reached by evaluating 12 cameras. Here, in Phase Two of the report, we tested 6 additional cameras using traditional methods. In addition, all 18 cameras were tested by newly developed methods. This Phase 2 report details those newly developed test parameters and procedures, and evaluates the results.

Terry, P.L.

1993-07-01

98

Outer planet investigations using a CCD camera system. [Saturn disk photometry]  

NASA Technical Reports Server (NTRS)

Problems related to analog noise, data transfer from the camera buffer to the storage computer, and loss of sensitivity of a two dimensional charge coupled device imaging system are reported. To calibrate the CCD system, calibrated UBV pinhole scans of the Saturn disk were obtained with a photoelectric area scanning photometer. Atmospheric point spread functions were also obtained. The UBV observations and models of the Saturn atmosphere are analyzed.

Price, M. J.

1980-01-01

99

A USB 2.0 computer interface for the UCO/Lick CCD cameras  

NASA Astrophysics Data System (ADS)

The new UCO/Lick Observatory CCD camera uses a 200 MHz fiber optic cable to transmit image data and an RS232 serial line for low speed bidirectional command and control. Increasingly RS232 is a legacy interface supported on fewer computers. The fiber optic cable requires either a custom interface board that is plugged into the mainboard of the image acquisition computer to accept the fiber directly or an interface converter that translates the fiber data onto a widely used standard interface. We present here a simple USB 2.0 interface for the UCO/Lick camera. A single USB cable connects to the image acquisition computer and the camera's RS232 serial and fiber optic cables plug into the USB interface. Since most computers now support USB 2.0 the Lick interface makes it possible to use the camera on essentially any modern computer that has the supporting software. No hardware modifications or additions to the computer are needed. The necessary device driver software has been written for the Linux operating system which is now widely used at Lick Observatory. The complete data acquisition software for the Lick CCD camera is running on a variety of PC style computers as well as an HP laptop.

Wei, Mingzhi; Stover, Richard J.

2004-09-01

100

Cramer-Rao lower bound optimization of an EM-CCD-based scintillation gamma camera  

NASA Astrophysics Data System (ADS)

Scintillation gamma cameras based on low-noise electron multiplication (EM-)CCDs can reach high spatial resolutions. For further improvement of these gamma cameras, more insight is needed into how various parameters that characterize these devices influence their performance. Here, we use the Cramer-Rao lower bound (CRLB) to investigate the sensitivity of the energy and spatial resolution of an EM-CCD-based gamma camera to several parameters. The gamma camera setup consists of a 3 mm thick CsI(Tl) scintillator optically coupled by a fiber optic plate to the E2V CCD97 EM-CCD. For this setup, the position and energy of incoming gamma photons are determined with a maximum-likelihood detection algorithm. To serve as the basis for the CRLB calculations, accurate models for the depth-dependent scintillation light distribution are derived and combined with a previously validated statistical response model for the EM-CCD. The sensitivity of the lower bounds for energy and spatial resolution to the EM gain and the depth-of-interaction (DOI) are calculated and compared to experimentally obtained values. Furthermore, calculations of the influence of the number of detected optical photons and noise sources in the image area on the energy and spatial resolution are presented. Trends predicted by CRLB calculations agree with experiments, although experimental values for spatial and energy resolution are typically a factor of 1.5 above the calculated lower bounds. Calculations and experiments both show that an intermediate EM gain setting results in the best possible spatial or energy resolution and that the spatial resolution of the gamma camera degrades rapidly as a function of the DOI. Furthermore, calculations suggest that a large improvement in gamma camera performance is achieved by an increase in the number of detected photons or a reduction of noise in the image area. 
A large noise reduction, as is possible with a new generation of EM-CCD electronics, may improve the energy and spatial resolution by a factor of 1.5.

Korevaar, Marc A. N.; Goorden, Marlies C.; Beekman, Freek J.

2013-04-01
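The Cramer-Rao machinery the abstract describes can be illustrated with a much simpler 1-D model: independent Poisson pixel counts whose means follow a Gaussian light distribution. For Poisson data the Fisher information is I = sum_i (d mu_i / d x0)^2 / mu_i, and the CRLB on position is 1/sqrt(I). The photon count, width and geometry below are toy values, not the paper's CsI(Tl)/CCD97 response model:

```python
import numpy as np

def crlb_position(mu_of_x, x0, pixels, dx=1e-4):
    """Cramer-Rao lower bound (as a standard deviation) for estimating the
    event position x0 from independent Poisson pixel counts with means
    mu_of_x(x0, pixels). Uses a central-difference derivative."""
    mu = mu_of_x(x0, pixels)
    dmu = (mu_of_x(x0 + dx, pixels) - mu_of_x(x0 - dx, pixels)) / (2 * dx)
    fisher = np.sum(dmu ** 2 / np.maximum(mu, 1e-12))
    return 1.0 / np.sqrt(fisher)

def mu_gauss(x0, pixels, N=1000.0, sigma=2.0):
    """Toy Gaussian light distribution: N photons, width sigma (assumed),
    sampled on unit-width pixels."""
    return N / (np.sqrt(2 * np.pi) * sigma) * np.exp(
        -(pixels - x0) ** 2 / (2 * sigma ** 2))

pixels = np.arange(-20, 21, dtype=float)
bound = crlb_position(mu_gauss, 0.3, pixels)
```

For an ideal Gaussian spot the bound approaches sigma/sqrt(N) (about 0.063 pixel units here), which is why the paper's calculations show resolution improving with the number of detected optical photons.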

101

Cramer-Rao lower bound optimization of an EM-CCD-based scintillation gamma camera.  

PubMed

Scintillation gamma cameras based on low-noise electron multiplication (EM-)CCDs can reach high spatial resolutions. For further improvement of these gamma cameras, more insight is needed into how various parameters that characterize these devices influence their performance. Here, we use the Cramer-Rao lower bound (CRLB) to investigate the sensitivity of the energy and spatial resolution of an EM-CCD-based gamma camera to several parameters. The gamma camera setup consists of a 3mm thick CsI(Tl) scintillator optically coupled by a fiber optic plate to the E2V CCD97 EM-CCD. For this setup, the position and energy of incoming gamma photons are determined with a maximum-likelihood detection algorithm. To serve as the basis for the CRLB calculations, accurate models for the depth-dependent scintillation light distribution are derived and combined with a previously validated statistical response model for the EM-CCD. The sensitivity of the lower bounds for energy and spatial resolution to the EM gain and the depth-of-interaction (DOI) are calculated and compared to experimentally obtained values. Furthermore, calculations of the influence of the number of detected optical photons and noise sources in the image area on the energy and spatial resolution are presented. Trends predicted by CRLB calculations agree with experiments, although experimental values for spatial and energy resolution are typically a factor of 1.5 above the calculated lower bounds. Calculations and experiments both show that an intermediate EM gain setting results in the best possible spatial or energy resolution and that the spatial resolution of the gamma camera degrades rapidly as a function of the DOI. Furthermore, calculations suggest that a large improvement in gamma camera performance is achieved by an increase in the number of detected photons or a reduction of noise in the image area. 
A large noise reduction, as is possible with a new generation of EM-CCD electronics, may improve the energy and spatial resolution by a factor of 1.5. PMID:23552717

Korevaar, Marc A N; Goorden, Marlies C; Beekman, Freek J

2013-04-21

102

Modeling of the over-exposed pixel area of CCD cameras caused by laser dazzling  

NASA Astrophysics Data System (ADS)

A simple model has been developed and implemented in Matlab code, predicting the over-exposed pixel area of cameras caused by laser dazzling. The inputs of this model are the laser irradiance on the front optics of the camera, the Point Spread Function (PSF) of the optics used, the integration time of the camera, and camera sensor specifications such as pixel size, quantum efficiency and full-well capacity. Effects of the read-out circuit of the camera are not incorporated. The model was evaluated with laser dazzle experiments on CCD cameras using a 532 nm CW laser dazzler and shows good agreement. For relatively low laser irradiance the model predicts the over-exposed laser spot area quite accurately and shows the cube-root dependency of spot diameter on laser irradiance caused by the PSF, as demonstrated before for IR cameras. For higher laser power levels the laser-induced spot diameter increases more rapidly than predicted, which can probably be attributed to scatter effects in the camera. Some first attempts to model scatter contributions, using a simple scatter power function f(?), show good resemblance with experiments. With this model, a tool is available which can assess the performance of observation sensor systems while they are subjected to laser countermeasures.

Benoist, Koen W.; Schleijpen, Ric H. M. A.

2014-10-01
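The cube-root dependency reported above follows directly if the PSF wings fall off as r**-3: the saturation radius is where collected charge just reaches full well. A minimal sketch, in which the lumped constant `k` (optics transmission, QE, pixel area) and all numeric values are assumptions rather than measured camera parameters:

```python
def saturated_spot_diameter(irradiance, t_int, full_well, k=1.0):
    """Diameter (in pixels) of the over-exposed area for a PSF whose wings
    fall off as r**-3. `k` lumps optics/QE/pixel-area factors (assumed).

    Electrons collected at radius r:  n(r) = k * irradiance * t_int / r**3
    Saturation (n >= full_well) holds out to:
        r_sat = (k * irradiance * t_int / full_well) ** (1/3)
    """
    return 2.0 * (k * irradiance * t_int / full_well) ** (1.0 / 3.0)

# cube-root dependency: 8x the irradiance -> exactly 2x the spot diameter
d1 = saturated_spot_diameter(1.0, 1e-3, 2e4)
d8 = saturated_spot_diameter(8.0, 1e-3, 2e4)
```

The same expression also makes the paper's deviation at high power easy to interpret: any scatter term that decays more slowly than r**-3 will eventually dominate and grow the spot faster than the cube root.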

103

Design of an Event-Driven, Random-Access, Windowing CCD-Based Camera  

NASA Astrophysics Data System (ADS)

Commercially available cameras are not designed for a combination of single-frame and high-speed streaming digital video with real-time control of size and location of multiple regions-of-interest (ROIs). A message-passing paradigm is defined to achieve low-level camera control with high-level system operation. This functionality is achieved by asynchronously sending messages to the camera for event-driven operation, where an event is defined as image capture or pixel readout of a ROI, without knowledge of detailed in-camera timing. This methodology provides a random access, real-time, event-driven (RARE) camera for adaptive camera control and is well suited for target-tracking applications requiring autonomous control of multiple ROIs. This methodology additionally provides for reduced ROI readout time and higher frame rates as compared to a predecessor architecture [1] by avoiding external control intervention during the ROI readout process.

Monacos, S. P.; Lam, R. K.; Portillo, A. A.; Zhu, D. Q.; Ortiz, G. G.

2003-11-01
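The message-passing paradigm described above (the host asynchronously posts capture and ROI-readout events; the camera services them without exposing its internal timing) can be sketched in a few lines. All class and message names here are illustrative, not the paper's actual command set:

```python
import queue
from dataclasses import dataclass

@dataclass
class Capture:
    """Event: capture a frame into sensor memory."""
    exposure_ms: float

@dataclass
class RoiReadout:
    """Event: read out one region-of-interest."""
    x: int
    y: int
    w: int
    h: int

class RareCamera:
    """Minimal sketch of event-driven control: the host enqueues messages
    asynchronously; the camera loop services them in arrival order, with
    no host-side knowledge of in-camera timing."""
    def __init__(self):
        self.inbox = queue.Queue()
        self.log = []
    def send(self, msg):
        self.inbox.put(msg)          # asynchronous, non-blocking post
    def service(self):
        while not self.inbox.empty():
            msg = self.inbox.get()
            if isinstance(msg, Capture):
                self.log.append(("capture", msg.exposure_ms))
            elif isinstance(msg, RoiReadout):
                self.log.append(("roi", msg.x, msg.y, msg.w, msg.h))

cam = RareCamera()
cam.send(Capture(exposure_ms=2.0))
cam.send(RoiReadout(10, 20, 64, 64))
cam.service()
```

The decoupling is the point: the host never intervenes mid-readout, which is what allows the reduced ROI readout time and higher frame rates the abstract claims over the predecessor architecture.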

104

Rapid estimation of camera motion from compressed video with application to video annotation  

Microsoft Academic Search

As digital video becomes more pervasive, efficient ways of searching and annotating video according to content will be increasingly important. Such tasks arise, for example, in the management of digital video libraries for content-based retrieval and browsing. In this paper, we develop tools based on camera motion for analyzing and annotating a class of structured video using the low-level information

Yap-peng Tan; Drew D. Saur; Sanjeev R. Kulkarni; Peter J. Ramadge

2000-01-01

105

In-camera video-stream processing for bandwidth reduction in web inspection  

NASA Astrophysics Data System (ADS)

Automated machine vision systems are now widely used for industrial inspection tasks where video-stream data information is taken in by the camera and then sent out to the inspection system for further processing. In this paper we describe a prototype system for on-line programming of arbitrary real-time video data-stream bandwidth reduction algorithms; the output of the camera contains only information that has to be further processed by a host computer. The processing system is built into a DALSA CCD camera and uses a microcontroller interface to download bit-stream data to a Xilinx™ FPGA. The FPGA is directly connected to the video data-stream and outputs data to a low-bandwidth output bus. The camera communicates with a host computer via an RS-232 link to the microcontroller. Static memory is used both to provide a FIFO interface for buffering defect burst data and to allow off-line examination of defect detection data. In addition to providing arbitrary FPGA architectures, the internal program of the microcontroller can also be changed via the host computer and a ROM monitor. This paper describes a prototype system board, mounted inside a DALSA camera, and discusses some of the algorithms currently being implemented for web inspection applications.

Jullien, Graham A.; Li, QiuPing; Hajimowlana, S. Hossain; Morvay, J.; Conflitti, D.; Roberts, James W.; Doody, Brian C.

1996-02-01
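The bandwidth-reduction principle above (only pixels that deviate from the web's nominal level leave the camera) can be sketched in software. The nominal-level estimate and threshold below are assumptions; the real system implements this per-pixel logic in the FPGA:

```python
def reduce_stream(line_pixels, thresh):
    """Bandwidth-reduction sketch: from one video line, emit only
    (index, value) pairs for pixels deviating beyond `thresh` from the
    line's nominal level -- all other pixels are dropped in-camera."""
    nominal = sum(line_pixels) / len(line_pixels)
    return [(i, v) for i, v in enumerate(line_pixels)
            if abs(v - nominal) > thresh]

line = [100] * 100
line[42] = 180                      # a synthetic defect pixel
events = reduce_stream(line, 30)
```

For a mostly-uniform web this turns a full video line into a handful of defect events, which is exactly what makes a low-bandwidth output bus and a small FIFO for defect bursts sufficient.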

106

Video camera system for locating bullet holes in targets at a ballistics tunnel  

NASA Technical Reports Server (NTRS)

A system consisting of a single charge coupled device (CCD) video camera, computer controlled video digitizer, and software to automate the measurement was developed to measure the location of bullet holes in targets at the International Shooters Development Fund (ISDF)/NASA Ballistics Tunnel. The camera/digitizer system is a crucial component of a highly instrumented indoor 50 meter rifle range which is being constructed to support development of wind resistant, ultra match ammunition. The system was designed to take data rapidly (10 sec between shots) and automatically with little operator intervention. The system description, measurement concept, and procedure are presented along with laboratory tests of repeatability and bias error. The long term (1 hour) repeatability of the system was found to be 4 microns (one standard deviation) at the target and the bias error was found to be less than 50 microns. An analysis of potential errors and a technique for calibration of the system are presented.

Burner, A. W.; Rummler, D. R.; Goad, W. K.

1990-01-01
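A hole-location step of the kind this system automates is, at its core, a threshold-and-centroid computation on the digitized frame. A minimal sketch with an intensity-weighted centroid; the weighting rule and synthetic image are assumptions, not the paper's documented procedure:

```python
import numpy as np

def hole_centroid(img, thresh):
    """Sub-pixel centroid of pixels darker than `thresh`, weighted by how
    far below threshold they are (a simplified stand-in for the paper's
    measurement procedure). Returns (cx, cy) or None if no hole."""
    mask = img < thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    w = thresh - img[ys, xs]        # weight: depth below threshold
    total = w.sum()
    return (xs * w).sum() / total, (ys * w).sum() / total

img = np.full((50, 50), 200.0)      # bright target paper
img[18:23, 30:35] = 20.0            # synthetic 5x5-pixel bullet hole
cx, cy = hole_centroid(img, 100.0)
```

The weighted centroid is what gives sub-pixel repeatability: with a calibrated pixel scale, micron-level precision at the target (as reported) comes from averaging over the many pixels a hole covers.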

107

Are Video Cameras the Key to School Safety?  

ERIC Educational Resources Information Center

Describes one high school's use of video cameras as a preventive tool in stemming theft and violent episodes within schools. The top 10 design tips for preventing crime on campus are highlighted. (GR)

Maranzano, Chuck

1998-01-01

108

DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

109

DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER OF THE MLP - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

110

Design of an event-driven random-access-windowing CCD-based camera  

NASA Astrophysics Data System (ADS)

Commercially available cameras are not designed for the combination of single frame and high-speed streaming digital video with real-time control of size and location of multiple regions-of-interest (ROI). A new control paradigm is defined to achieve low-level camera control with high-level system operation. This functionality is achieved by defining the indivisible pixel read out operation on a per ROI basis with in-camera time keeping capability. This methodology provides a Random Access, Real-time, Event-driven (RARE) camera for adaptive camera control and is well suited for target tracking applications requiring autonomous control of multiple ROIs. This methodology additionally provides for reduced ROI read out time and higher frame rates compared to a predecessor architecture by avoiding external control intervention during the ROI read out process.

Monacos, Steve P.; Lam, Raymond K.; Portillo, Angel A.; Ortiz, Gerardo G.

2003-07-01

111

Design of an Event-Driven Random-Access-Windowing CCD-Based Camera  

NASA Technical Reports Server (NTRS)

Commercially available cameras are not designed for the combination of single frame and high-speed streaming digital video with real-time control of size and location of multiple regions-of-interest (ROI). A new control paradigm is defined to eliminate the tight coupling between the camera logic and the host controller. This functionality is achieved by defining the indivisible pixel read out operation on a per ROI basis with in-camera time keeping capability. This methodology provides a Random Access, Real-Time, Event-driven (RARE) camera for adaptive camera control and is well suited for target tracking applications requiring autonomous control of multiple ROIs. This methodology additionally provides for reduced ROI read out time and higher frame rates compared to the original architecture by avoiding external control intervention during the ROI read out process.

Monacos, Steve P.; Lam, Raymond K.; Portillo, Angel A.; Ortiz, Gerardo G.

2003-01-01

112

Thermal modeling of cooled instrument: from the WIRCam IR camera to CCD Peltier cooled compact packages  

NASA Astrophysics Data System (ADS)

In the past decade, new thermal modelling tools have been offered to system designers. These modelling tools have rarely been used for cooled instruments in ground-based astronomy. In addition to an overwhelming increase in PC computing capabilities, these tools are now mature enough to drive the design of complex cooled astronomical instruments. This is the case for WIRCam, the new wide-field infrared camera installed on the CFHT in Hawaii on the Mauna Kea summit. This camera uses four 2K×2K Rockwell Hawaii-2RG infrared detectors and includes 2 optical barrels and 2 filter wheels. This camera is mounted at the prime focus of the 3.6 m CFHT telescope. The mass to be cooled is close to 100 kg. The camera uses a Gifford-McMahon closed-cycle cryo-cooler. The capabilities of the I-DEAS thermal module (TMG) are demonstrated for our particular application: predicted performances are presented and compared to real measurements after integration on the telescope in December 2004. In addition, we present thermal modelling of small Peltier-cooled CCD packages, including the thermal model of the CCD220 Peltier package (fabricated by e2v technologies) and cold head. ESO and the OPTICON European network have funded e2v technologies to develop a compact packaged Peltier-cooled 8-output back-illuminated L3Vision CCD. The device will achieve sub-electron read noise at frame rates up to 1.5 kHz. The development, fully dedicated to the latest generation of adaptive optics wavefront sensors, has many unique features. Among them, the ultra-compactness offered by a Peltier package integrated in a small cold head including the detector drive electronics is a way to achieve outstanding performance for adaptive optics systems. All these models were carried out using a normal PC laptop.

Feautrier, Philippe; Stadler, Eric; Downing, Mark; Hurrell, Steve; Wheeler, Patrick; Gach, Jean-Luc; Magnard, Yves; Balard, Philippe; Guillaume, Christian; Hubin, Norbert; Díaz, José Javier; Suske, Wolfgang; Jorden, Paul

2006-06-01

113

OCam with CCD220, the Fastest and Most Sensitive Camera to Date for AO Wavefront Sensing  

NASA Astrophysics Data System (ADS)

For the first time, subelectron readout noise has been achieved with a camera dedicated to astronomical wavefront-sensing applications. The OCam system demonstrated this performance at a 1300 Hz frame rate and with a 240×240 pixel frame size. ESO and JRA2 OPTICON jointly funded e2v Technologies to develop a custom CCD for adaptive optics (AO) wavefront-sensing applications. The device, called CCD220, is a compact Peltier-cooled 240×240 pixel frame-transfer eight-output back-illuminated sensor using the EMCCD technology. This article demonstrates, for the first time, subelectron readout noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e⁻ pixel⁻¹ frame⁻¹. It reports on the quantitative performance characterization of OCam and the CCD220, including readout noise, dark current, multiplication gain, quantum efficiency, and charge transfer efficiency. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks, and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, has been designed to offer enhanced performance, a completely sealed camera package, and an additional Peltier stage to facilitate operation on a telescope or in environmentally challenging applications. New features of OCam2 are presented in this article. This instrumental development will strongly impact the performance of the most advanced AO systems to come.

Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Díaz García, José Javier

2011-03-01

114

Optical readout of a two phase liquid argon TPC using CCD camera and THGEMs  

NASA Astrophysics Data System (ADS)

This paper presents a preliminary study into the use of CCDs to image secondary scintillation light generated by THick Gas Electron Multipliers (THGEMs) in a two-phase LAr TPC. A Sony ICX285AL CCD chip was mounted above a double THGEM in the gas phase of a 40 litre two-phase LAr TPC, with the majority of the camera electronics positioned externally via a feedthrough. An Am-241 source was mounted on a rotatable motion feedthrough, allowing the positioning of the alpha source either inside or outside of the field cage. Developed for and incorporated into the TPC design was a novel high-voltage feedthrough featuring LAr insulation. Furthermore, a range of webcams was tested for operation in cryogenics as an internal detector monitoring tool. Of the range of webcams tested, the Microsoft HD-3000 (model no. 1456) webcam was found to be superior in terms of noise and lowest operating temperature. In 1 ppm pure argon gas at ambient temperature and atmospheric pressure, the THGEM gain was ≈ 1000 and, using a 1 ms exposure, the CCD captured single alpha tracks. Successful operation of the CCD camera in two-phase cryogenic mode was also achieved. Using a 10 s exposure, a photograph of secondary scintillation light induced by the Am-241 source in LAr was captured for the first time.

Mavrokoridis, K.; Ball, F.; Carroll, J.; Lazos, M.; McCormick, K. J.; Smith, N. A.; Touramanis, C.; Walker, J.

2014-02-01

115

Digital video camera for application of particle image velocimetry in high-speed flows  

NASA Astrophysics Data System (ADS)

A high-speed digital camera based on video technology for application of particle image velocimetry in wind tunnels is described. The camera contains two independently triggerable interline CCD sensors which are mounted on two faces of a cube beam splitter, permitting the use of a single lens. Each of the sensors has a minimal exposure time of 0.8 μs with a trigger response time of less than 1 μs. The asynchronous reset capability permits the camera to trigger directly off a pulsed laser with a repetition rate differing from the standard 25 Hz CCIR video frame rate. Captured images are digitized within and stored in RAM in the camera, which can be read through the parallel port of a computer. The camera is software configurable, with the settings being non-volatile. Technical aspects such as sensor alignment and calibration through software are described. Close-up PIV measurements on a free jet illustrate that, in the future, the camera can be successfully utilized at imaging high-speed flows over a small field of view covering several cm², such as the flow between turbine blades. Further, the electronic shutter permits its use in luminous environments such as illuminated laboratories, wind tunnels or flames.

Willert, Christian; Stasicki, Boleslaw; Raffel, Markus; Kompenhans, Juergen

1995-09-01
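The evaluation step that such a double-exposure PIV camera feeds is a cross-correlation of two interrogation windows, usually done via FFT. A minimal sketch of that standard step (integer-pixel accuracy only; real PIV codes add sub-pixel peak fitting, which is omitted here):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Integer-pixel displacement between two interrogation windows via
    FFT-based circular cross-correlation (the standard PIV evaluation
    step). Returns (dx, dy) such that win_b ~ win_a shifted by (dx, dy)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map FFT bin indices to signed shifts
    shifts = [p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape)]
    return shifts[1], shifts[0]    # (dx, dy) in (column, row) order

rng = np.random.default_rng(0)
a = rng.random((32, 32))                 # synthetic particle image
b = np.roll(a, (3, 5), axis=(0, 1))      # "flow": +5 px in x, +3 px in y
dx, dy = piv_displacement(a, b)
```

With the known inter-exposure delay (here set by the camera's sub-microsecond trigger response), the recovered pixel displacement converts directly to velocity through the imaging magnification.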

116

A digital video camera for application of particle image velocimetry in high-speed flows  

SciTech Connect

A high-speed digital camera based on video technology for application of particle image velocimetry in wind tunnels is described. The camera contains two independently triggerable interline CCD sensors which are mounted on two faces of a cube beam splitter, permitting the use of a single lens. Each of the sensors has a minimal exposure time of 0.8 μs with a trigger response time of less than 1 μs. The asynchronous reset capability permits the camera to trigger directly off a pulsed laser with a repetition rate differing from the standard 25 Hz CCIR video frame rate. Captured images are digitized within and stored in RAM in the camera, which can be read through the parallel port of a computer. The camera is software configurable, with the settings being non-volatile. Technical aspects such as sensor alignment and calibration through software are described. Close-up PIV measurements on a free jet illustrate that, in the future, the camera can be successfully utilized at imaging high-speed flows over a small field of view covering several cm², such as the flow between turbine blades. Further, the electronic shutter permits its use in luminous environments such as illuminated laboratories, wind tunnels or flames.

Willert, C.; Stasicki, B.; Raffel, M.; Kompenhans, J. [Deutsche Forschungsanstalt fuer Luft und Raumfahrt e.V., Goettingen (Germany). Inst. fuer Stroemungsmechanik

1995-12-31

117

Determining the Alignment of HiRes Optics Using a CCD Camera (Proceedings of ICRC 2001, p. 639, © Copernicus Gesellschaft 2001)  

E-print Network

The camera consisted of the CCD chip itself, a vacuum housing, a cooling fan, the shutter, the lens, the power supply for the thermoelectric cooling, and the data acquisition system. The CCD chip was thermoelectrically cooled to reduce the electronic noise of the pixels.

118

Flutter Shutter Video Camera for Compressive Sensing of Videos  

E-print Network

We propose the Flutter Shutter Video Camera (FSVC), in which the exposure of each frame is temporally modulated to enable compressive sensing of videos at high effective frame rates.

Holloway, Jason; Sankaranarayanan, Aswin C.; Veeraraghavan, Ashok; Tambe, Salil

119

Optimal Design of MPD based fiber optic strain sensors and comparison of power meter and CCD camera based architectures  

NASA Astrophysics Data System (ADS)

In this work, we consider the optimal design of Modal Power Distribution (MPD) based fiber optic sensors and compare power-meter and CCD camera based techniques for strain measurements. To the best of the authors' knowledge, the majority of power-meter based MPD sensors use a single photodetector, and there is only one known work where two photodetectors are used, with no optimization of the photodetector locations. The optimal measurement-location selection problem and the comparison of power-meter and CCD camera based sensor measurements are both addressed in this work. Based on our experimental data, an increase in sensitivity of more than 100% is observed in the newly designed optimal strain sensor. It is also shown that there is a fixed nonlinear relationship between CCD based and power-meter based fiber optic sensor measurements. This allows estimation of power-meter measurements utilizing CCD camera images, which in turn simplifies the optimal detector location selection problem.

Toker, Onur; Efendioglu, Hasan S.; Esen, Mehmet E.; Fidanboylu, Kemal

2011-04-01

120

Optimal design of MPD based fiber optic strain sensors and comparison of power-meter and CCD camera measurements  

NASA Astrophysics Data System (ADS)

In this paper, we consider the optimal sensor design problem and compare power-meter and CCD camera based techniques for strain measurements using Modal Power Distribution (MPD). MPD is a sensitive and low-cost fiber optic sensing technique which uses spatial intensity modulation in a two-dimensional setting. In a power-meter based fiber optic sensor, light intensity is measured at one or more points, and the selection of these points is usually critical in sensor design. Single-point based sensors are usually not very successful because of the time-varying output intensity of the laser source. To the best of the authors' knowledge, most power-meter based MPD sensors utilize a single photodetector, and there is only one known work where two photodetectors are used, with no optimization of the measurement locations. In this work, we both consider the optimal measurement-location selection problem and compare power-meter and CCD camera based sensor measurements. We also show that there is an almost one-to-one relationship between CCD based and power-meter based fiber optic sensor measurements, which allows estimation of power-meter based measurements using CCD camera images. In our experimental setup, power-meter measurements and CCD camera images were recorded for different strain values. Power-meter measurements were estimated from CCD camera images with an average error of less than 4% by utilizing image processing and nonlinear least-squares techniques.

Efendioglu, Hasan Seckin; Esen, Mehmet Enis; Toker, Onur; Fidanboylu, Kemal

2011-03-01

121

Experimental research on femto-second laser damaging array CCD cameras  

NASA Astrophysics Data System (ADS)

Charge-coupled devices (CCDs) are widely used in military and security applications, such as airborne and shipborne surveillance and satellite reconnaissance. Homeland security requires effective means to negate these advanced surveillance systems. Research shows that CCD-based electro-optical systems can be significantly dazzled or even damaged by high-repetition-rate pulsed lasers. Here, we report on femtosecond laser interaction with a CCD camera, which is likely to be of great importance in the future. Femtosecond lasers are a relatively new class of lasers with unique characteristics: extremely short pulse width (1 fs = 10^-15 s), extremely high peak power (1 TW = 10^12 W), and distinctive behavior when interacting with matter. Research on femtosecond laser interaction with materials (metals, dielectrics) clearly indicates that non-thermal effects dominate the process, in marked contrast to the interaction of long pulses with matter. First, damage threshold tests were performed with the femtosecond laser acting on the CCD camera. An 800 nm, 500 μJ, 100 fs laser pulse was used to irradiate an interline CCD solid-state image sensor in the experiment. In order to focus the laser energy onto the tiny CCD active cells, an F/5.6 optical system was used. Sony production CCDs were chosen as typical targets. The damage threshold was evaluated from multiple test data. Point damage, line damage, and full-array damage were observed as the irradiating pulse energy was continuously increased during the experiment. The point damage threshold was found to be 151.2 mJ/cm2, the line damage threshold 508.2 mJ/cm2, and the full-array damage threshold 5.91 J/cm2. Although the phenomena are almost the same as for nanosecond laser interaction with CCDs, these damage thresholds are substantially lower than the data obtained from nanosecond laser interaction with CCDs.
Next, the electrical characteristics after different degrees of damage were tested with a multimeter. The resistance values between the clock signal lines were measured. Comparing the resistance values of the CCD before and after damage, it was found that the resistances between the vertical transfer clock signal lines decreased significantly. The same result was found between the vertical transfer clock signal line and the earth electrode (ground). Finally, the damage position and the damage mechanism were analyzed using the above results together with SEM morphological experiments. Point damage results from the laser destroying material and shows no macroscopic electrical influence. Line damage is quite different from point damage, showing a deeper material-eroding effect; more importantly, short circuits were found between vertical clock lines. Under SEM, full-array damage is even more severe than line damage, although no obviously different electrical features from line damage were found. Further research into the mechanism of femtosecond-laser-induced CCD damage with more advanced tools is anticipated. This research is valuable in electro-optical countermeasure and laser shielding applications.

Shao, Junfeng; Guo, Jin; Wang, Ting-feng; Wang, Ming

2013-05-01

122

Source video camera identification for multiply compressed videos originating from YouTube  

Microsoft Academic Search

The Photo Response Non-Uniformity is a unique sensor noise pattern that is present in each image or video acquired with a digital camera. In this work a wavelet-based technique used to extract these patterns from digital images is applied to compressed low resolution videos originating mainly from webcams. After recording these videos with a variety of codec and resolution settings,

Wiger van Houten; Zeno Geradts

2009-01-01

123

Controlled Impact Demonstration (CID) tail camera video  

NASA Technical Reports Server (NTRS)

The Controlled Impact Demonstration (CID) was a joint research project by NASA and the FAA to test a survivable aircraft impact using a remotely piloted Boeing 720 aircraft. The tail camera movie is one shot running 27 seconds. It shows the impact from the perspective of a camera mounted high on the vertical stabilizer, looking forward over the fuselage and wings.

1984-01-01

124

HERSCHEL/SCORE, imaging the solar corona in visible and EUV light: CCD camera characterization.  

PubMed

The HERSCHEL (helium resonant scattering in the corona and heliosphere) experiment is a rocket mission that was successfully launched last September from White Sands Missile Range, New Mexico, USA. HERSCHEL was conceived to investigate the solar corona in the extreme UV (EUV) and in visible broadband polarized brightness, and provided, for the first time, a global map of helium in the solar environment. The HERSCHEL payload consisted of a telescope, the HERSCHEL EUV Imaging Telescope (HEIT), and two coronagraphs, HECOR (helium coronagraph) and SCORE (sounding coronagraph experiment). The SCORE instrument was designed and developed mainly by Italian research institutes; it is an imaging coronagraph designed to observe the solar corona from 1.4 to 4 solar radii. SCORE has two detectors for the EUV lines at 121.6 nm (HI) and 30.4 nm (HeII) and for the visible broadband polarized brightness. The SCORE UV detector is an intensified CCD with a microchannel plate coupled to a CCD through a fiber-optic bundle. The SCORE visible light detector is a frame-transfer CCD coupled to a polarimeter based on a liquid crystal variable retarder plate. The SCORE coronagraph is described together with the performance of the cameras for imaging the solar corona. PMID:20428852

Pancrazzi, M; Focardi, M; Landini, F; Romoli, M; Fineschi, S; Gherardi, A; Pace, E; Massone, G; Antonucci, E; Moses, D; Newmark, J; Wang, D; Rossi, G

2010-07-01

125

CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models  

NASA Astrophysics Data System (ADS)

Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebral vasculature and is the third leading cause of death in the United States. More than 80 percent of strokes are ischemic, caused by blockage of an artery in the brain by thrombosis or arterial embolism. Hence, the development of an imaging technique to monitor cerebral ischemia and the effect of anti-stroke therapy is much needed. Near-infrared (NIR) optical tomography has great potential as a non-invasive imaging tool (due to its low cost and portability) for imaging embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebrovascular hemodynamics and brain injury. Compared to a fiber-based diffuse optical tomographic system, a CCD-camera-based system is more suitable for preclinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions based on tissue-phantom experimental data. We obtain good reconstructed images using two recently developed algorithms: (1) the depth compensation algorithm (DCA) and (2) the globally convergent method (GCM). We demonstrate volumetric tomographic reconstructions from tissue phantoms; the approach has great potential for determining and monitoring the effect of anti-stroke therapies.

Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli

2011-02-01

126

Striping Noise Removal of Images Acquired by Cbers 2 CCD Camera Sensor  

NASA Astrophysics Data System (ADS)

The CCD Camera is a multi-spectral push-broom sensor carried by the CBERS 2 satellite. Images acquired by the CCD Camera exhibit vertical striping noise, caused by detector mismatch, inter-detector variability, improper calibration of the detectors, and low signal-to-noise ratio. These stripes are more pronounced in images of homogeneous surfaces processed at level 2, and their presence makes interpretation of the data and extraction of information from these images difficult. In this work, a spatial moment matching method is proposed to correct these images. In this method, statistical moments such as the mean and standard deviation of the columns in each band are used to match the statistics of the detector array to reference values. After this noise removal, some periodic diagonal stripes remain in the image which cannot be removed by the aforementioned method; to eliminate them, a frequency-domain Butterworth notch filter was applied. Finally, the results were evaluated using image statistical moments such as the mean and standard deviation. The study demonstrates the effectiveness of the method in noise removal.
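The column-wise moment matching step can be sketched as follows; using the global image mean and standard deviation as the reference values is an assumption for illustration (the paper leaves the choice of reference open):

```python
import numpy as np

def destripe_moment_match(img, ref_mean=None, ref_std=None):
    """Spatial moment matching: rescale each column so that its mean and
    standard deviation match reference values (here, the global moments)."""
    col_mean = img.mean(axis=0)
    col_std = img.std(axis=0)
    col_std[col_std == 0] = 1.0          # guard against flat columns
    if ref_mean is None:
        ref_mean = img.mean()
    if ref_std is None:
        ref_std = img.std()
    return (img - col_mean) / col_std * ref_std + ref_mean

# Simulate a push-broom image with per-detector gain/offset mismatch
rng = np.random.default_rng(1)
scene = rng.normal(100.0, 10.0, (200, 64))
gain = rng.uniform(0.8, 1.2, 64)        # one gain/offset per detector column
offset = rng.uniform(-5.0, 5.0, 64)
striped = scene * gain + offset

clean = destripe_moment_match(striped)
print(clean.mean(axis=0).std())          # column means now essentially equal
```

Note that moment matching equalizes column statistics wholesale, so it also suppresses genuine brightness differences between columns; this is why it works best on near-homogeneous scenes, as the abstract notes.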

Amraei, E.; Mobasheri, M. R.

2014-10-01

127

Remote control video cameras on a suborbital rocket  

SciTech Connect

Three video cameras aboard a sub-orbital rocket were controlled in real time from the ground during a fifteen-minute flight from White Sands Missile Range in New Mexico. Telemetry communications with the rocket allowed control of the cameras. The pan, tilt, zoom, focus, and iris of two of the camera lenses, the power and record functions of the three cameras, and the selection of the analog video signal sent to the ground were controlled by separate microprocessors. A microprocessor was used to record data from three miniature accelerometers, temperature sensors, and a differential pressure sensor. In addition to the selected video signal sent to the ground and recorded there, the video signals from the three cameras were also recorded on board the rocket. These recorders were mounted inside the pressurized segment of the rocket payload. The lenses, lens control mechanisms, and the three small television cameras were located in a portion of the rocket payload exposed to the vacuum of space, as were the accelerometers.

Wessling, Francis C. [Consortium for Materials Development in Space, University of Alabama in Huntsville, RI/M65, 301 Sparkman Drive Huntsville, Alabama 35899 (United States)

1997-01-10

128

Instantaneous video and picosecond laser pulses detection with C.C.D. solid state devices  

NASA Astrophysics Data System (ADS)

The installation and operation of high-power lasers requires knowledge of both the spatial distribution of energy in the beams and their location. Infrared films are less and less frequently used because they require long and tedious processing. They have been progressively replaced by television systems built around tubes (vidicons with silicon or PbSe targets). Those systems simplify laser alignment operations; nevertheless, their limited dynamic range is an important drawback. In order to replace them with a new generation of cameras, we have studied several CCD sensors, and we present the latest results obtained when illuminating them with CW laser light at λ = 1.06 μm and 0.35 μm. We have also tested the behavior of CCDs illuminated by a 50 ps laser pulse at λ = 1.06 μm. These measurements show that the THOMSON TH 7861 CDA FO sensor has the best features for our applications. The associated television camera is also described.

Cavailler, C.; Fleurot, N.; Mazataud, D.; Mens, A.

1985-02-01

129

Characterization of a commercial, front-illuminated interline transfer CCD camera for use as a guide camera on a balloon-borne telescope  

E-print Network

We report results obtained during the characterization of a commercial front-illuminated progressive scan interline transfer CCD camera. We demonstrate that the unmodified camera operates successfully in temperature and pressure conditions (-40 °C, 4 mbar) representative of a high altitude balloon mission. We further demonstrate that the centroid of a well-sampled star can be determined to better than 2% of a pixel, even though the CCD is equipped with a microlens array. This device has been selected for use in a closed-loop star-guiding and tip-tilt correction system in the BIT-STABLE balloon mission.

Clark, Paul; Chang, Herrick L; Galloway, Mathew; Israel, Holger; Jones, Laura L; Li, Lun; Mandic, Milan; Morris, Tim; Netterfield, Barth; Peacock, John; Sharples, Ray; Susca, Sara

2014-01-01

130

Ball lightning observation: an objective video-camera analysis report  

E-print Network

In this paper we describe a video-camera recording of a (probable) ball lightning event and both the related image and signal analyses for its photometric and dynamical characterization. The results strongly support the BL nature of the recorded luminous ball object and allow the researchers to have an objective and unique video document of a possible BL event for further analyses. Some general evaluations of the obtained results considering the proposed ball lightning models conclude the paper.

Sello, Stefano; Paganini, Enrico

2011-01-01

131

Ball lightning observation: an objective video-camera analysis report  

E-print Network

In this paper we describe a video-camera recording of a (probable) ball lightning event and both the related image and signal analyses for its photometric and dynamical characterization. The results strongly support the BL nature of the recorded luminous ball object and allow the researchers to have an objective and unique video document of a possible BL event for further analyses. Some general evaluations of the obtained results considering the proposed ball lightning models conclude the paper.

Stefano Sello; Paolo Viviani; Enrico Paganini

2011-02-18

132

Stereo Imaging Velocimetry Technique Using Standard Off-the-Shelf CCD Cameras  

NASA Technical Reports Server (NTRS)

Stereo imaging velocimetry is a fluid physics technique for measuring three-dimensional (3D) velocities at a plurality of points. This technique provides full-field 3D analysis of any optically clear fluid or gas experiment seeded with tracer particles. Unlike current 3D particle imaging velocimetry systems that rely primarily on laser-based systems, stereo imaging velocimetry uses standard off-the-shelf charge-coupled device (CCD) cameras to provide accurate and reproducible 3D velocity profiles for experiments that require 3D analysis. Using two cameras aligned orthogonally, we present a closed mathematical solution resulting in an accurate 3D approximation of the observation volume. The stereo imaging velocimetry technique is divided into four phases: 3D camera calibration, particle overlap decomposition, particle tracking, and stereo matching. Each phase is explained in detail. In addition to being utilized for space shuttle experiments, stereo imaging velocimetry has been applied to the fields of fluid physics, bioscience, and colloidal microscopy.
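The closed solution for two orthogonally aligned cameras can be illustrated with a deliberately idealized model (parallel projection, perfectly calibrated, distortion-free cameras); the paper's actual calibration, overlap decomposition, tracking, and stereo matching phases are far more involved:

```python
import numpy as np

def reconstruct_3d(view_a, view_b):
    """Idealized orthogonal-camera stereo matching.
    view_a: (y, z) coordinates from a camera looking along +x
    view_b: (x, z) coordinates from a camera looking along +y
    Returns (x, y, z); the z coordinate, seen by both cameras, is averaged."""
    y, z1 = view_a
    x, z2 = view_b
    return np.array([x, y, 0.5 * (z1 + z2)])

p = np.array([1.0, 2.0, 3.0])        # true tracer-particle position
view_a = (p[1], p[2] + 0.01)         # small measurement noise in z
view_b = (p[0], p[2] - 0.01)
print(reconstruct_3d(view_a, view_b))   # ≈ [1. 2. 3.]
```

Averaging the redundant z measurement is the simplest way the two views constrain each other; in a real system the shared coordinate also provides the consistency check used to match particles between views.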

McDowell, Mark; Gray, Elizabeth

2004-01-01

133

Performance of front-end mixed-signal ASIC for onboard CCD cameras  

NASA Astrophysics Data System (ADS)

We report on the development status of the readout ASIC for an onboard X-ray CCD camera. Quick, low-noise readout is essential for pile-up-free imaging spectroscopy with future highly sensitive telescopes. The dedicated ASIC for ASTRO-H/SXI has sufficient noise performance only at the slow pixel rate of 68 kHz. We have therefore been developing an upgraded ASIC with fourth-order ΔΣ modulators. Upgrading the order of the modulator allows the CCD signals to be oversampled fewer times, so that the pixel rate can be increased. The digitized pulse height is a serial bit stream that is decoded with a decimation filter. The weighting coefficients of the filter are optimized by simulation to maximize the signal-to-noise ratio. We present performance figures such as the input equivalent noise (IEN), gain, and effective signal range. Digitized pulse height data were successfully obtained in the first functional test at pixel rates up to 625 kHz. The IEN is almost the same as that obtained with the chip for ASTRO-H/SXI. The residual from the gain function is about 0.1%, better than that of the conventional ASIC by a factor of two. Assuming the CCD gain is the same as that for ASTRO-H, the effective range is 30 keV at maximum gain; by changing the gain, the ASIC can manage signal charges of up to 100 ke-. These results will be fed back into the optimization of the pulse-height decoding filter.

Nakajima, Hiroshi; Inoue, Shota; Nagino, Ryo; Anabuki, Naohisa; Hayashida, Kiyoshi; Tsunemi, Hiroshi; Doty, John P.; Ikeda, Hirokazu

2014-07-01

134

Improvement of relief algorithm to prevent inpatient's downfall accident with night-vision CCD camera  

NASA Astrophysics Data System (ADS)

"ROSAI" Hospital, Wakayama City, Japan, reported that falls from beds are among the most serious accidents involving inpatients at night, and many inpatients have suffered serious injuries in such falls. To prevent these accidents, the hospital tested several sensors in a sickroom to send warning signals of an impending fall to a nurse. However, the system produced too many false warnings about the inpatients' sleeping situation. To send a nurse useful information, precise automatic detection of an inpatient's sleeping situation is necessary. In this paper, we focus on a clustering algorithm which evaluates an inpatient's situation from multiple angles using several kinds of sensors, including a night-vision CCD camera. This paper presents a new relief algorithm that addresses the weakness of the previous approach in exceptional cases.

Matsuda, Noriyuki; Yamamoto, Takeshi; Miwa, Masafumi; Nukumi, Shinobu; Mori, Kumiko; Kuinose, Yuko; Maeda, Etuko; Miura, Hirokazu; Taki, Hirokazu; Hori, Satoshi; Abe, Norihiro

2005-12-01

135

High resolution three-dimensional photoacoutic tomography with CCD-camera based ultrasound detection  

PubMed Central

A photoacoustic tomograph based on optical ultrasound detection is demonstrated, which is capable of high resolution real-time projection imaging and fast three-dimensional (3D) imaging. Snapshots of the pressure field outside the imaged object are taken at defined delay times after photoacoustic excitation by use of a charge coupled device (CCD) camera in combination with an optical phase contrast method. From the obtained wave patterns photoacoustic projection images are reconstructed using a back propagation Fourier domain reconstruction algorithm. Applying the inverse Radon transform to a set of projections recorded over a half rotation of the sample provides 3D photoacoustic tomography images in less than one minute with a resolution below 100 μm. The sensitivity of the device was experimentally determined to be 5.1 kPa over a projection length of 1 mm. In vivo images of the vasculature of a mouse demonstrate the potential of the developed method for biomedical applications. PMID:25136491

Nuster, Robert; Slezak, Paul; Paltauf, Guenther

2014-01-01

136

67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST OF ASSISTANT LAUNCH CONDUCTOR PANEL SHOWN IN CA-133-1-A-66 - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

137

Masking a CCD camera allows multichord charge exchange spectroscopy measurements at high speed on the DIII-D tokamak  

SciTech Connect

Charge exchange spectroscopy is one of the standard plasma diagnostic techniques used in tokamak research to determine ion temperature, rotation speed, particle density, and radial electric field. Configuring a charge coupled device (CCD) camera to serve as a detector in such a system requires a trade-off between the competing desires to detect light from as many independent spatial views as possible while still obtaining the best possible time resolution. High time resolution is essential, for example, for studying transient phenomena such as edge localized modes. By installing a mask in front of a camera with a 1024 x 1024 pixel CCD chip, we are able to acquire spectra from eight separate views while still achieving a minimum time resolution of 0.2 ms. The mask separates the light from the eight spectra, preventing spatial and temporal cross talk. A key part of the design was devising a compact translation stage which attaches to the front of the camera and allows adjustment of the position of the mask openings relative to the CCD surface. The stage is thin enough to fit into the restricted space between the CCD camera and the spectrometer endplate.

Meyer, O. [Euratom-CEA Association, DSM-IRFM, Cadarache, 13108 St Paul lez Durance (France); Burrell, K. H.; Chavez, J. A.; Kaplan, D. H. [General Atomics, PO Box 85608, San Diego, California 92186-5608 (United States); Chrystal, C.; Pablant, N. A. [University of California at San Diego, La Jolla, California 92093 (United States); Solomon, W. M. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)

2011-02-15

138

Scientific CCD technology at JPL  

NASA Technical Reports Server (NTRS)

Charge-coupled devices (CCD's) were recognized for their potential as an imaging technology almost immediately following their conception in 1970. Twenty years later, they are firmly established as the technology of choice for visible imaging. While consumer applications of CCD's, especially the emerging home video camera market, dominated manufacturing activity, the scientific market for CCD imagers has become significant. Activity of the Jet Propulsion Laboratory and its industrial partners in the area of CCD imagers for space scientific instruments is described. Requirements for scientific imagers are significantly different from those needed for home video cameras, and are described. An imager for an instrument on the CRAF/Cassini mission is described in detail to highlight achieved levels of performance.

Janesick, J.; Collins, S. A.; Fossum, E. R.

1991-01-01

139

AN INTEGRATED AUDIO-VIDEO TRACKING SYSTEM WITH A PTZ VIDEO CAMERA AND A 3-D MICROPHONE ARRAY  

E-print Network

Localization of an audio source or a video object is critical in many audio-video applications such as video surveillance. ... an audio source in a 3-D space, which improves the accuracy of localization in different environments

Chang, Pao-Chi

140

A fast auto-focusing technique for the long focal lens TDI CCD camera in remote sensing applications  

NASA Astrophysics Data System (ADS)

The key issue in automatic focus adjustment for a long-focal-length TDI CCD camera in remote sensing applications is to reach the optimum focus position as fast as possible. Existing auto-focusing techniques consume too much time, as the mechanical focusing parts of the camera move in steps during the search procedure. In this paper, we demonstrate a fast auto-focusing technique which employs the internal optical elements and the TDI CCD itself to directly sense deviations in the back focal distance of the lens and restore the imaging system to the best available focus. It is particularly advantageous for determination of the focus because the relative motion between the TDI CCD and the focusing element can proceed without interruption. Moreover, theoretical formulas describing the effect of image motion on the focusing precision and the effective focusing range are also developed. Finally, an experimental setup was constructed to evaluate the performance of the proposed technique. The results show an auto-focusing precision of 5 μm over a defocus range of 500 μm, with the search procedure completed within 0.125 s, which leads to a remarkable improvement in the real-time imaging capability of high-resolution TDI CCD cameras in remote sensing applications.

Wang, Dejiang; Ding, Xu; Zhang, Tao; Kuang, Haipeng

2013-02-01

141

In-flight Video Captured by External Tank Camera System  

NASA Technical Reports Server (NTRS)

In this July 26, 2005 video, Earth slowly fades into the background as the STS-114 Space Shuttle Discovery climbs into space until the External Tank (ET) separates from the orbiter. An External Tank ET Camera System featuring a Sony XC-999 model camera provided never-before-seen footage of the launch and tank separation. The camera was installed in the ET LO2 Feedline Fairing. From this position, the camera had a 40° field of view with a 3.5 mm lens. The field of view showed some of the Bipod area, a portion of the LH2 tank and Intertank flange area, and some of the bottom of the shuttle orbiter. Contained in an electronic box, the battery pack and transmitter were mounted on top of the Solid Rocket Booster (SRB) crossbeam inside the ET. The battery pack included 20 Nickel-Metal Hydride batteries (similar to cordless phone battery packs) totaling 28 volts DC and could supply about 70 minutes of video. Located 95 degrees apart on the exterior of the Intertank opposite the orbiter side, there were 2 blade S-Band antennas about 2 1/2 inches long that transmitted a 10 watt signal to the ground stations. The camera turned on approximately 10 minutes prior to launch and operated for 15 minutes following liftoff. The complete camera system weighs about 32 pounds. Marshall Space Flight Center (MSFC), Johnson Space Center (JSC), Goddard Space Flight Center (GSFC), and Kennedy Space Center (KSC) participated in the design, development, and testing of the ET camera system.

2005-01-01

142

Development of high-speed video cameras for Dynamic PIV  

Microsoft Academic Search

The most promising next-generation Image Velocimetry (IV) technique is high-speed Dynamic PIV. It requires the development of innovative high-speed video camera sensors. We started by specifying the required performance of these new sensors for measurements in air and water flows. These criteria are founded on the most recent developments in PIV algorithms and incorporate results from a large questionnaire survey

G. T. Etoh; Y. Takano

2002-01-01

143

Maximum-likelihood scintillation detection for EM-CCD based gamma cameras  

NASA Astrophysics Data System (ADS)

Gamma cameras based on charge-coupled devices (CCDs) coupled to continuous scintillation crystals can combine a good detection efficiency with high spatial resolutions with the aid of advanced scintillation detection algorithms. A previously developed analytical multi-scale algorithm (MSA) models the depth-dependent light distribution but does not take statistics into account. Here we present and validate a novel statistical maximum-likelihood algorithm (MLA) that combines a realistic light distribution model with an experimentally validated statistical model. The MLA was tested for an electron multiplying CCD optically coupled to CsI(Tl) scintillators of different thicknesses. For 99mTc imaging, the spatial resolution (for perpendicular and oblique incidence), energy resolution and signal-to-background counts ratio (SBR) obtained with the MLA were compared with those of the MSA. Compared to the MSA, the MLA improves the energy resolution by more than a factor of 1.6 and the SBR is enhanced by more than a factor of 1.3. For oblique incidence (approximately 45°), the depth-of-interaction corrected spatial resolution is improved by a factor of at least 1.1, while for perpendicular incidence the MLA resolution does not consistently differ significantly from the MSA result for all tested scintillator thicknesses. For the thickest scintillator (3 mm, interaction probability 66% at 141 keV) a spatial resolution (perpendicular incidence) of 147 μm full width at half maximum (FWHM) was obtained with an energy resolution of 35.2% FWHM. These results of the MLA were achieved without prior calibration of scintillations as is needed for many statistical scintillation detection algorithms. We conclude that the MLA significantly improves the gamma camera performance compared to the MSA.
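A minimal sketch of maximum-likelihood scintillation detection: a Poisson log-likelihood is maximized over candidate event positions by grid search. The Gaussian light-spread model with known parameters is an illustrative assumption, standing in for the paper's depth-dependent, experimentally validated model:

```python
import numpy as np

def log_likelihood(counts, xc, yc, xs, ys, amp, sigma, bg):
    # Expected pixel counts under a Gaussian light-spread model plus a flat
    # background (an assumption for this sketch).
    mu = amp * np.exp(-((xs - xc)**2 + (ys - yc)**2) / (2 * sigma**2)) + bg
    # Poisson log-likelihood, dropping the count-factorial constant
    return np.sum(counts * np.log(mu) - mu)

rng = np.random.default_rng(2)
n = 16
xs, ys = np.meshgrid(np.arange(n), np.arange(n))
true_x, true_y, amp, sigma, bg = 6.3, 9.7, 50.0, 1.5, 0.5
mu = amp * np.exp(-((xs - true_x)**2 + (ys - true_y)**2) / (2 * sigma**2)) + bg
counts = rng.poisson(mu)                 # simulated CCD frame of one event

# Grid search over candidate scintillation positions
cand = np.linspace(0, n - 1, 151)
ll = np.array([[log_likelihood(counts, x, y, xs, ys, amp, sigma, bg)
                for x in cand] for y in cand])
iy, ix = np.unravel_index(np.argmax(ll), ll.shape)
print(cand[ix], cand[iy])                # close to (6.3, 9.7)
```

In practice the likelihood would also be maximized over depth and amplitude, and a gradient-based or multi-resolution search would replace the brute-force grid.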

Korevaar, Marc A. N.; Goorden, Marlies C.; Heemskerk, Jan W. T.; Beekman, Freek J.

2011-08-01

144

Field-programmable gate array-based hardware architecture for high-speed camera with KAI-0340 CCD image sensor  

NASA Astrophysics Data System (ADS)

We present a field-programmable gate array (FPGA)-based hardware architecture for a high-speed camera with fast auto-exposure control and colour filter array (CFA) demosaicing. The proposed hardware architecture includes the design of the charge-coupled device (CCD) drive circuits, image processing circuits, and power supply circuits. The CCD drive circuits translate the TTL (transistor-transistor logic) level timing sequences produced by the image processing circuits into the timing sequences under which the CCD image sensor outputs analog image signals. The image processing circuits convert the analog signals to digital signals for subsequent processing; TTL timing, auto-exposure control, CFA demosaicing, and gamma correction are accomplished in this module. The power supply circuits provide power for the whole system, which is very important for image quality: power supply noise affects image quality directly, and we reduce it effectively in hardware. In this system, the CCD is a KAI-0340, which can output 210 full-resolution frames per second, and our camera works well in this mode. Because traditional auto-exposure control algorithms are slow to reach a proper exposure level, a fast auto-exposure control method is necessary, and we present a new auto-exposure algorithm suited to high-speed cameras. Colour demosaicing is critical for digital cameras because it converts the Bayer sensor mosaic output to a full-colour image, which determines the output image quality of the camera. Complex algorithms can achieve high quality but cannot be implemented in hardware; we present a low-complexity demosaicing method that can be implemented in hardware while satisfying the quality requirements. Experimental results are given at the end of this paper.
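The motivation for fast auto-exposure can be illustrated with a one-step update; the linear sensor response, target level, and clamping bounds below are hypothetical, not the paper's algorithm:

```python
def fast_auto_exposure(exposure, mean_level, target=128.0,
                       exp_min=1e-5, exp_max=1e-1):
    """One-step auto-exposure update (a sketch): under a linear sensor
    response, scaling exposure by target/mean reaches the target level in a
    single frame instead of stepping toward it over many frames."""
    ratio = target / max(mean_level, 1e-6)   # guard against a black frame
    return min(max(exposure * ratio, exp_min), exp_max)

# Simulated linear sensor: mean pixel level proportional to exposure time
scene_gain = 40000.0
exp = 1e-3
mean = scene_gain * exp                      # 40 -> underexposed
exp = fast_auto_exposure(exp, mean)
print(scene_gain * exp)                      # ≈ 128 after one update
```

A stepping controller would need many frames to close the same gap, which is why a proportional jump of this kind matters at 210 frames per second; real implementations temper the jump to avoid oscillation under sensor nonlinearity.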

Wang, Hao; Yan, Su; Zhou, Zuofeng; Cao, Jianzhong; Yan, Aqi; Tang, Linao; Lei, Yangjie

2013-08-01

145

Stress measurement with fiber optical sensors using modal power distribution: A comparison of power-meter and CCD camera techniques  

NASA Astrophysics Data System (ADS)

In this paper, we investigate modal power distribution (MPD)-based fiber optical stress measurement, and compare power-meter and CCD camera-based techniques. We also consider the sensor location selection problem in the power-meter-based approach, which consists of placing sensor(s) in different locations. We first show that power-meter measurement data can be estimated from CCD camera images by using image processing techniques. As a second result, we also show that the lengthy process of placing sensors in different locations in search of best sensor placement can be totally avoided. We formulate this as a max-min problem, and propose a computer-based solution.
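The max-min formulation can be sketched as follows, with hypothetical sensitivity data standing in for the measured sensor responses: pick the detector location whose worst-case sensitivity over all stress states is largest.

```python
import numpy as np

# Hypothetical sensitivity data: rows are candidate detector locations,
# columns are stress states; entries are the response magnitude |dP/dstress|
# at that location for that state.
rng = np.random.default_rng(3)
sensitivity = rng.uniform(0.1, 1.0, (10, 5))

# Max-min selection: maximize, over locations, the minimum sensitivity
# across stress states.
worst_case = sensitivity.min(axis=1)
best = int(np.argmax(worst_case))
print(best, worst_case[best])
```

As the abstract notes, estimating these responses from CCD images lets the whole table be built without physically relocating a power-meter detector for every candidate position.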

Efendioglu, H. S.; Esen, M. E.; Toker, O.; Fidanboylu, K.

2010-04-01

146

Determination of Meteoroid Orbits and Spatial Fluxes by Using High-Resolution All-Sky CCD Cameras  

Microsoft Academic Search

Using high-resolution, low-scan-rate, all-sky CCD cameras, the SPanish Meteor Network (SPMN) is currently monitoring meteor and fireball activity on a year-round basis. Presented here is a sampling of the accurate trajectory, radiant, and orbital data obtained for meteors imaged simultaneously from two SPMN stations during the continuous 2006-2007 coverage of meteor and fireball monitoring. Typical astrometric uncertainty

Josep M. Trigo-Rodriguez; José M. Madiedo; Peter S. Gural; Alberto J. Castro-Tirado; Jordi Llorca; Juan Fabregat; Standa Vítek; Pep Pujols

2008-01-01

147

Soft x-ray transmission of optical blocking filters for x-ray CCD camera onboard Astro-E2  

Microsoft Academic Search

We measured the optical and soft X-ray transmission of the Optical Blocking Filters (OBFs) for the Charge Coupled Device (CCD) cameras that will be launched as focal-plane detectors of the X-ray telescopes onboard the Japanese 5th X-ray astronomical satellite, Astro-E2. The filters were made from polyimide coated with Al. The X-ray absorption fine structures (XAFSs) at the K edges of C, N,

Shunji Kitamoto; Takayoshi Kohmura; Norimasa Yamamoto; Haruko Takano; Harue Saito; Kazuharu Suga; Hiroyuki Sekiguchi; S. Chiba; I. Okamoto; Kiyoshi Hayashida; Haruyoshi Katayama; Toyonaka Enoguchi; Yuusuke Nakashima; T. Shiroshouji; Yuzuru Tawara; Akihiro Furuzawa; Takeshi Tanaka

2004-01-01

148

Detection of multimode spatial correlation in PDC and application to the absolute calibration of a CCD camera  

E-print Network

We propose and demonstrate experimentally a new method, based on spatial entanglement, for the absolute calibration of an analog detector. The idea consists in measuring the sub-shot-noise intensity correlation between the two branches of parametric down-conversion, which contain many pairwise-correlated spatial modes. We calibrate a scientific CCD camera, and a preliminary evaluation of the statistical uncertainty indicates the metrological interest of the method.
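The calibration principle can be illustrated with the textbook twin-beam relation: for ideal pairwise-correlated beams detected with efficiency eta, the noise reduction factor sigma = Var(N1 - N2) / (<N1> + <N2>) equals 1 - eta. A minimal numerical sketch under that idealization (the paper's actual estimator accounts for further experimental effects):

```python
import numpy as np

def efficiency_from_twin_beams(n1, n2):
    """Estimate detection efficiency eta from twin-beam photon counts.

    Uses the idealized relation sigma = Var(N1 - N2) / (<N1> + <N2>)
    = 1 - eta, so eta = 1 - sigma.  Sketch of the principle only.
    """
    n1 = np.asarray(n1, dtype=float)
    n2 = np.asarray(n2, dtype=float)
    sigma = np.var(n1 - n2) / (n1.mean() + n2.mean())
    return 1.0 - sigma
```

In the perfectly correlated limit (every photon of one beam matched by one in the other), Var(N1 - N2) vanishes and the estimate is eta = 1; losses decorrelate the beams and push sigma toward the shot-noise level of 1.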

Giorgio Brida; Ivo Pietro Degiovanni; Marco Genovese; Maria Luisa Rastello; Ivano Ruo-Berchera

2010-05-17

149

MOA-cam3: a wide-field mosaic CCD camera for a gravitational microlensing survey in New Zealand  

E-print Network

We have developed a wide-field mosaic CCD camera, MOA-cam3, mounted at the prime focus of the Microlensing Observations in Astrophysics (MOA) 1.8-m telescope. The camera consists of ten E2V CCD4482 chips, each having 2k×4k pixels, and covers a 2.2 deg^2 field of view with a single exposure. The optical system is well optimized to realize uniform image quality over this wide field. The chips are constantly cooled by a cryocooler at -80 °C, at which temperature dark-current noise is negligible for a typical 1-3 minute exposure. The CCD output charge is converted to a 16-bit digital signal by the GenIII system (Astronomical Research Cameras Inc.), and readout completes within 25 seconds. Readout noise of 2-3 ADU (rms) is also negligible. We prepared a wide-band red filter for an effective microlensing survey, as well as Bessell V and I filters for standard astronomical studies. Microlensing studies have entered a new era, which requires more statistics and more rapid alerts to catch exotic light curves. Our new system is a powerful tool for realizing both of these requirements.

T. Sako; T. Sekiguchi; M. Sasaki; K. Okajima; F. Abe; I. A. Bond; J. B. Hearnshaw; Y. Itow; K. Kamiya; P. M. Kilmartin; K. Masuda; Y. Matsubara; Y. Muraki; N. J. Rattenbury; D. J. Sullivan; T. Sumi; P. Tristram; T. Yanagisawa; P. C. M. Yock

2008-04-04

150

Video summarization based on camera motion and a subjective evaluation method  

E-print Network

This paper proposes a method of video summarization based on camera motion: it consists in selecting frames according to the succession of camera motions. A subjective evaluation method for assessing summaries more generally is also proposed, in which subjects were asked to watch a video and to create a summary manually.

Paris-Sud XI, Université de

151

Video Chat with Multiple Cameras John MacCormick, Dickinson College  

E-print Network

The dominant paradigm for video chat employs a single webcam at each end of the conversation. For many purposes, this is perfectly adequate. Experiments employing up to four webcams simultaneously demonstrate that multi-camera video chat is feasible.

MacCormick, John

152

Social Justice through Literacy: Integrating Digital Video Cameras in Reading Summaries and Responses  

ERIC Educational Resources Information Center

Drawing data from an action-oriented research project for integrating digital video cameras into the reading process in pre-college courses, this study proposes using digital video cameras in reading summaries and responses to promote critical thinking and to teach social justice concepts. The digital video research project is founded on

Liu, Rong; Unger, John A.; Scullion, Vicki A.

2014-01-01

153

Classification of volcanic ash particles from Sakurajima volcano using CCD camera image and cluster analysis  

NASA Astrophysics Data System (ADS)

Quantitative and rapid characterization of volcanic ash particles is needed to conduct petrologic monitoring of an ongoing eruption. We develop a new, simple system that uses CCD camera images to quantitatively characterize ash properties, and apply it to volcanic ash collected at Sakurajima. Our method characterizes volcanic ash particles by 1) apparent luminance through RGB filters and 2) a quasi-fractal dimension of the particle shape. Using a monochromatic CCD camera (Starshoot by Orion Co. LTD.) attached to a stereoscopic microscope, we capture digital images of ash particles set on a glass plate, under which white paper or a polarizing plate is placed. Images of 1390 × 1080 pixels are taken through three color filters (Red, Green, and Blue) under incident light and under light transmitted through the polarizing plate. The brightness of the light sources is held constant, and luminance is calibrated against white and black papers. About fifteen ash particles are set on the plate at a time, and their images are saved in bitmap format. We first extract the outlines of particles from the image taken under light transmitted through the polarizing plate. The luminance for each color is then represented by 256 tones at each pixel within a particle, and the average and its standard deviation are calculated for each ash particle. We also measure the quasi-fractal dimension (qfd) of the ash particles: we perform box counting, counting the numbers of boxes of 1 × 1 and of 128 × 128 pixels that catch the area of the ash particle, and estimate the qfd as the ratio of the former number to the latter. These parameters are calculated using the software R. We characterize volcanic ash from the Showa crater of Sakurajima collected on two days (Feb 09, 2009, and Jan 13, 2010), and apply cluster analyses.
Dendrograms are formed from the qfd and the following four parameters calculated from the luminance: Rf = R/(R+G+B), Gf = G/(R+G+B), Bf = B/(R+G+B), and total luminance = (R+G+B)/665. We classify the volcanic ash particles from the dendrograms into three groups based on Euclidean distance. The groups are named A, B, and C in order of increasing average total luminance. The classification shows that the numbers of particles belonging to Groups A, B, and C are 77, 25, and 6 in the Feb 09, 2009 sample, and 102, 19, and 6 in the Jan 13, 2010 sample, respectively. Examination under the stereoscopic microscope suggests that Groups A, B, and C mainly correspond to juvenile, altered, and free-crystal particles, respectively, so the classification by the present method demonstrates a difference in the contribution of juvenile material between the two days. To evaluate the reliability of our classification, we classify pseudo-samples in which errors of 10% are added to the measured parameters. We apply our method to one thousand pseudo-samples, and the result shows that the numbers of particles classified into the three groups vary by less than 20% of the total of 235 particles. Our system can classify 120 particles within 6 minutes, so we can easily increase the number of ash particles, which enables us to improve the reliability and resolution of the classification and to rapidly capture temporal changes in the properties of ash particles from active volcanoes.
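The box-counting step described above can be sketched as follows. This is one plausible reading of the qfd definition (count the 1 × 1 boxes, i.e. pixels, and the 128 × 128 boxes that catch the particle, then take their ratio), not the authors' R code:

```python
import numpy as np

def quasi_fractal_dimension(mask, big=128):
    """Quasi-fractal dimension by box counting, as one reading of the
    abstract: ratio of the number of 1x1 boxes (pixels) covering the
    particle to the number of big x big boxes that touch it.

    `mask` is a boolean image, True inside the particle.
    """
    small_count = int(mask.sum())                 # 1x1 boxes = pixel count
    h, w = mask.shape
    big_count = 0
    for y in range(0, h, big):                    # tile the image in big x big boxes
        for x in range(0, w, big):
            if mask[y:y + big, x:x + big].any():  # box catches the particle
                big_count += 1
    return small_count / big_count if big_count else 0.0
```

A compact particle fills its coarse boxes densely (high ratio), while a ragged, finely structured outline touches many coarse boxes relative to its pixel area (lower ratio), which is what makes the ratio a useful shape descriptor for clustering.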

Miwa, T.; Shimano, T.; Nishimura, T.

2012-12-01

154

Laboratory x-ray CCD camera electronics: a test bed for the Swift X-Ray Telescope  

NASA Astrophysics Data System (ADS)

The Penn State University Department of Astronomy and Astrophysics has been active in the design of X-ray CCD cameras for astronomy for over two decades, including sounding rocket systems, the CUBIC instrument on the SAC-B satellite and the ACIS camera on the Chandra satellite. Currently the group is designing and building an X-ray telescope (XRT), which will comprise part of the Swift Gamma-Ray Burst Explorer satellite. The Swift satellite, selected in October 1999 as one of two winners of NASA Explorer contracts, will -- within one minute -- detect, locate, and observe gamma-ray bursts simultaneously in the optical, ultraviolet, X-ray, and gamma- ray wavelengths using three co-aligned telescopes. The XRT electronics is required to read out the telescope's CCD sensor in a number of different ways depending on the observing mode selected. Immediately after the satellite re-orients to observe a newly detected burst, the XRT will enter an imaging mode to determine the exact position of the burst. The location will then be transmitted to the ground, and the XRT will autonomously enter other modes as the X-ray intensity of the burst waxes and wanes. This paper will discuss the electronics for a laboratory X-ray CCD camera, which serves as a test bed for development of the Swift XRT camera. It will also touch upon the preliminary design of the flight camera, which is closely related. A major challenge is achieving performance and reliability goals within the cost constraints of an Explorer mission.

Hill, Joanne E.; Zugger, Michael E.; Shoemaker, Jason; Witherite, Mark E.; Koch, T. Scott; Chou, Lester L.; Case, Traci; Burrows, David N.

2000-12-01

155

Computer-vision-based weed identification of images acquired by 3CCD camera  

NASA Astrophysics Data System (ADS)

Selective application of herbicide to weeds at an early stage of crop growth is an important aspect of site-specific management of field crops. To develop more adaptive on-line weed detection, many researchers have studied image processing techniques for the computation- and feature-extraction-intensive task of distinguishing weeds from crops and the soil background. This paper investigated the potential of using digital images acquired by the MegaPlus MS3100 3-CCD camera to segment the background soil from the plants and further to recognize weeds among the crops, using the Matlab script language. The image of the near-infrared waveband (center 800 nm; width 65 nm) was selected principally for segmenting soil, and cotton was distinguished from thistles based on their respective relative areas (pixel counts) in the whole image. The results show adequate recognition: the pixel proportions of soil, cotton leaves, and thistle leaves were 78.24% (-0.20% deviation), 16.66% (+2.71% SD), and 4.68% (-4.19% SD). However, problems remain in separating and locating single plants because of their clustering in the images. The information in the images acquired via the other two channels, i.e., the green and red bands, needs to be extracted to aid crop/weed discrimination. More optical specimens should be acquired for calibration and validation to establish a weed-detection model that can be effectively applied in the field.

Zhang, Yun; He, Yong; Fang, Hui

2006-09-01

156

Highly flexible and Internet-programmable CCD camera with a frequency-selectable read-out for imaging and spectroscopy applications  

Microsoft Academic Search

A new-concept CCD camera is currently being realized at the XUV Lab of the Department of Astronomy and Space Science of the University of Florence. The main features we aim to achieve are a high level of versatility and a fast pixel rate. Within this project, a versatile CCD sequencer has been realized with interesting and innovative features. Based

Luca Gori; Emanuele Pace; Leonardo Tommasi; D. Sarocchi; V. Bagnoli; M. Sozzi; S. Puri

2001-01-01

157

Non-mydriatic, wide field, fundus video camera  

NASA Astrophysics Data System (ADS)

We describe a method we call "stripe field imaging" that is capable of capturing wide-field color fundus videos and images of the human eye at pupil sizes of 2 mm. This means that it can be used with a non-dilated pupil, even in bright ambient light. We realized a mobile demonstrator to prove the method, and we successfully acquired color fundus videos of subjects. We designed the demonstrator as a low-cost device consisting of mass-market components, to show that no major additional technical outlay is needed to realize the improvements we propose. The technical core idea of our method is breaking the rotational symmetry of the optical design found in many conventional fundus cameras. By this measure we could extend the possible field of view (FOV) at a pupil size of 2 mm from a circular field 20° in diameter to a square field 68° by 18° in size. We acquired a fundus video while the subject was slightly touching and releasing the lid. The resulting video showed changes in the vessels in the region of the papilla and a change in the paleness of the papilla.

Hoeher, Bernhard; Voigtmann, Peter; Michelson, Georg; Schmauss, Bernhard

2014-02-01

158

Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source  

SciTech Connect

The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 µm square pixels, and 15 µm thick. A multi-anode Manson X-ray source, operating up to 10kV and 10W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 10. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within 1% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.

M. J. Haugh and M. B. Schneider

2008-10-31

159

Progress beyond ISIS: combined triple-ISIS camera, video trigger, and terraced image sensor  

NASA Astrophysics Data System (ADS)

In 2001, we developed a video camera of 1,000,000 fps with an in-situ storage image sensor (ISIS). The performance is briefly explained at first. We are now developing innovative technologies to provide the ultra-high-speed video camera with higher level of performance and more useful functions, including the combined triple-ISIS camera, the built-in video trigger system, and the terraced image sensor. Their concepts are explained together with the expected performance.

Etoh, Takeharu G.

2003-07-01

160

Progress beyond ISIS: combined triple-ISIS camera, video trigger, and terraced image sensor  

Microsoft Academic Search

In 2001, we developed a video camera of 1,000,000 fps with an in-situ storage image sensor (ISIS). The performance is briefly explained at first. We are now developing innovative technologies to provide the ultra-high-speed video camera with higher level of performance and more useful functions, including the combined triple-ISIS camera, the built-in video trigger system, and the terraced image sensor.

Takeharu G. Etoh

2003-01-01

161

Characterization of OCam and CCD220: the fastest and most sensitive camera to date for AO wavefront sensing  

NASA Astrophysics Data System (ADS)

For the first time, sub-electron read noise has been achieved with a camera suitable for astronomical wavefront-sensing (WFS) applications. The OCam system has demonstrated this performance at a 1300 Hz frame rate with a 240×240-pixel frame size. ESO and JRA2 OPTICON2 have jointly funded e2v technologies to develop a custom CCD for Adaptive Optics (AO) wavefront sensing applications. The device, called CCD220, is a compact Peltier-cooled 240×240-pixel frame-transfer 8-output back-illuminated sensor using EMCCD technology. This paper demonstrates sub-electron read noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e-/pixel/frame. It reports on the comprehensive, quantitative performance characterization of OCam and the CCD220: readout noise, dark current, multiplication gain, quantum efficiency, charge transfer efficiency, and so on. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks, and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, was designed to offer enhanced performance, a completely sealed camera package, and an additional Peltier stage to facilitate operation on a telescope or in environmentally rugged applications. OCam2 offers two types of built-in data link to the Real Time Computer: the CameraLink industry-standard interface and various fiber link options such as the sFPDP interface. OCam2 also includes a modified mechanical design to ease the integration of microlens arrays, for use of this camera in all types of wavefront-sensing AO systems. The front cover of OCam2 can be customized to include a microlens exchange mechanism.

Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz-Garcia, José Javier

2010-07-01

162

TDICCD video data sampling technique in the space remote sensing camera  

NASA Astrophysics Data System (ADS)

The paper analyzes the mechanism by which reset noise is generated when reading out the CCD video signal. It also presents a sampling technique for the CCD output video signal, Correlated Double Sampling (CDS), based on mutual cancellation of correlated noise and on correlation theory. The paper introduces the operating principle of the CDS technique and its filtering effect on the output noise of the CCD, which includes the reset noise of the CCD, the cross-talk noise coupled between the horizontal clock drive and the power-supply ground wire, the white noise of the output amplifier, and 1/f noise. The paper gives a CDS circuit that is applied in practice. Finally, it verifies that the output S/N of the CCD signal can reach 50 dB.
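Numerically, the CDS principle described above reduces to subtracting each pixel's sampled reset (reference) level from its sampled video (signal) level; the reset noise is common to both samples and cancels in the difference. A minimal sketch of that arithmetic (the paper implements it as an analog circuit):

```python
def correlated_double_sample(reset_samples, signal_samples):
    """Correlated double sampling sketch: per-pixel difference between
    the sampled signal level and the sampled reset level.  Any noise
    common to both samples (e.g. kTC reset noise) cancels exactly.
    """
    return [sig - ref for ref, sig in zip(reset_samples, signal_samples)]
```

For example, if a pixel's reset level is r and its video level is r + s for true signal s, the difference recovers s regardless of the random value r took on that readout; only noise uncorrelated between the two samples survives.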

Huang, Qiaolin

2009-07-01

163

Imaging tissues with a polarized light video camera  

NASA Astrophysics Data System (ADS)

A method for imaging the superficial epidermal and papillary dermal layers of the skin is needed when assessing many skin lesions. We have developed an imaging modality using a video camera whose mechanism of contrast is the reflectance of polarized light from superficial skin. By selecting only polarized light to create the image, one rejects the large amount of diffusely reflected light from the deeper dermis. The specular reflectance (or glare) from the skin surface is also avoided in the setup. The resulting polarization picture maximally accents the details of the superficial layer of the skin and removes the effects of melanin pigmentation from the image. For example, freckles simply disappear, and nevi lose their dark pigmentation to reveal the details of abnormal cellular growth. An initial clinical study demonstrated that the polarization camera could identify the margins of a sclerosing basal cell carcinoma that the doctor's unaided eye underestimated: the camera identified an 11-mm-diameter lesion while the unaided eye identified a 6-mm-diameter lesion.

Jacques, Steven L.; Lee, Kenneth

1999-09-01

164

The Camera Is Not a Methodology: Towards a Framework for Understanding Young Children's Use of Video Cameras  

ERIC Educational Resources Information Center

Participatory research methods argue that young children should be enabled to contribute their perspectives on research seeking to understand their worldviews. Visual research methods, including the use of still and video cameras with young children have been viewed as particularly suited to this aim because cameras have been considered easy and

Bird, Jo; Colliver, Yeshe; Edwards, Susan

2014-01-01

165

Cryogenic design of the high speed CCD60 camera for wavefront sensing  

NASA Astrophysics Data System (ADS)

CCD60, developed by e2v technologies, is a 128×128-pixel frame-transfer back-illuminated sensor using EMCCD technology. This kind of detector has attractive characteristics, such as a high frame rate, low noise, and high quantum efficiency, so it is suitable for Adaptive Optics Wave Front Sensor (AO WFS) applications. However, the performance of this detector depends strongly on its temperature: to achieve high multiplication gain and low dark-current noise, the CCD60 should be cooled below -45 °C. For this reason, we designed a cooling system for the CCD60 detector based on a thermoelectric cooler. Details of the design, the thermal analysis, and the cooling experiment are presented in this paper, and the multiplication gain after cooling was tested as well. The cooling experiment shows that the thermoelectric cooler can cool the CCD to below -60 °C under air-cooled operation at an air temperature of 20 °C, and the gain test shows that the multiplication gain of the CCD60 can exceed 500 at -60 °C.

He, Kai; Ma, Wenli; Wang, Mingfu; Zhou, Xiangdong

2014-11-01

166

Video-Camera-Based Position-Measuring System  

NASA Technical Reports Server (NTRS)

A prototype optoelectronic system measures the three-dimensional relative coordinates of objects of interest or of targets affixed to objects of interest in a workspace. The system includes a charge-coupled-device video camera mounted in a known position and orientation in the workspace, a frame grabber, and a personal computer running image-data-processing software. Relative to conventional optical surveying equipment, this system can be built and operated at much lower cost; however, it is less accurate. It is also much easier to operate than are conventional instrumentation systems. In addition, there is no need to establish a coordinate system through cooperative action by a team of surveyors. The system operates in real time at around 30 frames per second (limited mostly by the frame rate of the camera). It continuously tracks targets as long as they remain in the field of the camera. In this respect, it emulates more expensive, elaborate laser tracking equipment that costs on the order of 100 times as much. Unlike laser tracking equipment, this system does not pose a hazard of laser exposure. Images acquired by the camera are digitized and processed to extract all valid targets in the field of view. The three-dimensional coordinates (x, y, and z) of each target are computed from the pixel coordinates of the targets in the images to an accuracy of the order of millimeters over distances of the order of meters. The system was originally intended specifically for real-time position measurement of payload transfers from payload canisters into the payload bay of the Space Shuttle Orbiters (see Figure 1). The system may be easily adapted to other applications that involve similar coordinate-measuring requirements. Examples of such applications include manufacturing, construction, preliminary approximate land surveying, and aerial surveying.
For some applications with rectangular symmetry, it is feasible and desirable to attach a target composed of black and white squares to an object of interest (see Figure 2). For other situations, where circular symmetry is more desirable, circular targets also can be created. Such a target can readily be generated and modified by use of commercially available software and printed by use of a standard office printer. All three relative coordinates (x, y, and z) of each target can be determined by processing the video image of the target. Because of the unique design of corresponding image-processing filters and targets, the vision-based position- measurement system is extremely robust and tolerant of widely varying fields of view, lighting conditions, and varying background imagery.
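The article does not detail the pixel-to-coordinate computation. As a minimal sketch of the underlying pinhole-camera back-projection (generic intrinsics and names, not the system's actual solver), a pixel can be mapped to camera coordinates once the target's distance along the optical axis is known, e.g. from the apparent size of a target of known physical size:

```python
def pixel_to_xyz(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) to camera coordinates (x, y, z) with a
    pinhole model.

    z          : target distance along the optical axis (same units as
                 the returned x, y), assumed known, e.g. estimated from
                 the target's apparent size in pixels
    fx, fy     : focal lengths in pixel units
    cx, cy     : principal point (image centre) in pixels
    Hypothetical parameter names; illustration of the geometry only.
    """
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return x, y, z
```

With fx = fy = 1000 px and the principal point at (320, 240), the pixel (640, 360) at a depth of 2000 mm maps to (640 mm, 240 mm, 2000 mm), which illustrates the millimetre-over-metre scale of accuracy quoted above.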

Lane, John; Immer, Christopher; Brink, Jeffrey; Youngquist, Robert

2005-01-01

167

Deep-Sea Video Cameras Without Pressure Housings  

NASA Technical Reports Server (NTRS)

Underwater video cameras of a proposed type (and, optionally, their light sources) would not be housed in pressure vessels. Conventional underwater cameras and their light sources are housed in pods that keep the contents dry and maintain interior pressures of about 1 atmosphere (about 0.1 MPa). Pods strong enough to withstand the pressures at great ocean depths are bulky, heavy, and expensive. Elimination of the pods would make it possible to build camera/light-source units that would be significantly smaller, lighter, and less expensive. The depth ratings of the proposed camera/light-source units would be essentially unlimited because the strengths of their housings would no longer be an issue. A camera according to the proposal would contain an active-pixel image sensor and readout circuits, all in the form of a single silicon-based complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. As long as none of the circuitry and none of the electrical leads were exposed to seawater, which is electrically conductive, silicon integrated-circuit chips could withstand the hydrostatic pressure of even the deepest ocean. The pressure would change the semiconductor band gap by only a slight amount, not enough to degrade imaging performance significantly. Electrical contact with seawater would be prevented by potting the integrated-circuit chip in a transparent plastic case. The electrical leads for supplying power to the chip and extracting the video signal would also be potted, though not necessarily in the same transparent plastic. The hydrostatic pressure would tend to compress the plastic case and the chip equally on all sides; there would be no need for great strength because there would be no need to hold back high pressure on one side against low pressure on the other side. A light source suitable for use with the camera could consist of light-emitting diodes (LEDs). Like integrated-circuit chips, LEDs can withstand very large hydrostatic pressures.
If power-supply regulators or filter capacitors were needed, these could be attached in chip form directly onto the back of, and potted with, the imager chip. Because CMOS imagers dissipate little power, the potting would not result in overheating. To minimize the cost of the camera, a fixed lens could be fabricated as part of the plastic case. For improved optical performance at greater cost, an adjustable glass achromatic lens would be mounted in a reservoir that would be filled with transparent oil and subject to the full hydrostatic pressure, and the reservoir would be mounted on the case to position the lens in front of the image sensor. The lens would be adjusted for focus by use of a motor inside the reservoir (oil-filled motors already exist).

Cunningham, Thomas

2004-01-01

168

HDA dataset - DRAFT 1: A multi-camera video data set for research on high-definition surveillance

E-print Network

We present a fully labelled image sequence data set for benchmarking video surveillance algorithms. The data set was acquired from 13 indoor cameras distributed over three floors of one building

Instituto de Sistemas e Robotica

169

ATR/OTR-SY Tank Camera Purge System and in Tank Color Video Imaging System  

SciTech Connect

This procedure will document the satisfactory operation of the 101-SY tank Camera Purge System (CPS) and the 101-SY in-tank Color Camera Video Imaging System (CCVIS). Included in the CPS is the nitrogen purging system safety interlock, which shuts down all color video imaging system electronics within the 101-SY tank vapor space upon loss of nitrogen purge pressure.

Werry, S.M.

1995-06-06

170

A high resolution Small Field Of View (SFOV) gamma camera: a columnar scintillator coated CCD imager for medical applications  

NASA Astrophysics Data System (ADS)

We describe a high-resolution, small field of view (SFOV), Charge Coupled Device (CCD) based camera for imaging small volumes of radionuclide uptake in tissues. The Mini Gamma Ray Camera (MGRC) is a collimated, scintillator-coated, low-cost, high-performance imager using low-noise CCDs. The prototype MGRC has a 600 µm thick layer of columnar CsI(Tl) and operates in photon-counting mode, using a thermoelectric cooler to achieve an operating temperature of -10 °C. Collimation was performed using a pinhole collimator. We have measured the spatial resolution, energy resolution, and efficiency using a number of radioisotope sources, including 140 keV gamma rays from 99mTc in a specially designed phantom. We also describe our first imaging of a volunteer patient.

Lees, J. E.; Bassford, D. J.; Blake, O. E.; Blackshaw, P. E.; Perkins, A. C.

2011-12-01

171

Electro-optical testing of fully depleted CCD image sensors for the Large Synoptic Survey Telescope camera  

NASA Astrophysics Data System (ADS)

The LSST Camera science sensor array will incorporate 189 large format Charge Coupled Device (CCD) image sensors. Each CCD will include over 16 million pixels and will be divided into 16 equally sized segments and each segment will be read through a separate output amplifier. The science goals of the project require CCD sensors with state of the art performance in many aspects. The broad survey wavelength coverage requires fully depleted, 100 micrometer thick, high resistivity, bulk silicon as the imager substrate. Image quality requirements place strict limits on the image degradation that may be caused by sensor effects: optical, electronic, and mechanical. In this paper we discuss the design of the prototype sensors, the hardware and software that has been used to perform electro-optic testing of the sensors, and a selection of the results of the testing to date. The architectural features that lead to internal electrostatic fields, the various effects on charge collection and transport that are caused by them, including charge diffusion and redistribution, effects on delivered PSF, and potential impacts on delivered science data quality are addressed.

Doherty, Peter E.; Antilogus, Pierre; Astier, Pierre; Chiang, James; Gilmore, D. Kirk; Guyonnet, Augustin; Huang, Dajun; Kelly, Heather; Kotov, Ivan; Kubanek, Petr; Nomerotski, Andrei; O'Connor, Paul; Rasmussen, Andrew; Riot, Vincent J.; Stubbs, Christopher W.; Takacs, Peter; Tyson, J. Anthony; Vetter, Kurt

2014-07-01

172

Structural Dynamics Analysis and Research for FEA Modeling Method of a Light High Resolution CCD Camera  

NASA Astrophysics Data System (ADS)

The camera features high resolution and a wide swath. In order to ensure that its high optical precision survives the rigorous dynamic loads of launch, the structure must have high rigidity; therefore, a careful study of the dynamic features of the camera structure should be performed. A precise CAD model of the camera was built in Pro/E, and an interference examination was performed on it to refine the structural design. The structural dynamic analysis of the camera was accomplished, for the first time in China, by applying the structural analysis codes PATRAN and NASTRAN. The main research items include: 1) comparative modal analyses of the critical structure of the camera using 4-node and 10-node tetrahedral elements, respectively, so as to confirm the most reasonable general model; 2) modal analyses of the camera for several cases, from which the inherent frequencies and mode shapes are obtained, further proving the rationality of the structural design; 3) static analysis of the camera under self-gravity and overloads, giving the relevant deformation and stress distributions; 4) response calculations for sinusoidal vibration, giving the corresponding response curves and the maximum acceleration responses with their frequencies. The software technique proves accurate and efficient. Based on sensitivity, the dynamic design and engineering optimization of the critical structure of the camera are discussed, providing fundamental technology for the design of forthcoming space optical instruments.

Sun, Jiwen; Wei, Ling; Fu, Danying

2002-01-01

173

Frequency Identification of Vibration Signals Using Video Camera Image Data  

PubMed Central

This study showed that an image data acquisition system connecting a high-speed camera or webcam to a notebook or personal computer (PC) can precisely capture the most dominant modes of a vibration signal, but may introduce non-physical modes induced by insufficient frame rates. Using a simple model, the frequencies of these modes are properly predicted and excluded. Two experimental designs, which involve using an LED light source and a vibration exciter, are proposed to demonstrate the performance. First, the original gray-level resolution of a video camera (for instance, 0 to 255 levels) was enhanced by summing the gray-level data of all pixels in a small region around the point of interest. The image signal was further enhanced by attaching a white paper sheet marked with a black line to the surface of the vibrating system in operation, to increase the gray-level resolution. Experimental results showed that the Prosilica CV640C CMOS high-speed camera has a critical frequency of 60 Hz for inducing false modes, whereas that of the webcam is 7.8 Hz. Several factors were proven to partially suppress the non-physical modes, but they cannot eliminate them completely. Two examples, the prominent vibration modes of which are less than the associated critical frequencies, are examined to demonstrate the performance of the proposed systems. In general, the experimental data show that these non-contact image data acquisition systems are potential tools for collecting the low-frequency vibration signal of a system. PMID:23202026

Jeng, Yih-Nen; Wu, Chia-Hung

2012-01-01
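The region-summing and frequency-identification steps described in the record above can be sketched as follows. This is a minimal illustration with synthetic frames, not the authors' code; the ROI helper and the demo parameters (8×8 region, 20 Hz vibration at 200 fps) are assumptions:

```python
import numpy as np

def roi_brightness_signal(frames, roi):
    """Sum gray levels over a small region of interest in each frame.

    Summing many pixels extends the effective intensity resolution beyond
    the camera's native 8-bit (0-255) quantisation, as the paper describes.
    frames: array of shape (n_frames, height, width); roi: (r0, r1, c0, c1).
    """
    r0, r1, c0, c1 = roi
    return frames[:, r0:r1, c0:c1].sum(axis=(1, 2)).astype(float)

def dominant_frequency(signal, fps):
    """Return the dominant nonzero frequency of the ROI signal via FFT."""
    signal = signal - signal.mean()          # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Synthetic demo: a 20 Hz vibration modulating frame brightness at 200 fps.
fps, n = 200, 400
t = np.arange(n) / fps
frames = 128 + 50 * np.sin(2 * np.pi * 20 * t)[:, None, None] * np.ones((n, 8, 8))
print(dominant_frequency(roi_brightness_signal(frames, (0, 8, 0, 8)), fps))
```

Frequencies above the Nyquist limit (fps/2) alias into this spectrum, which is one way the "non-physical modes" from insufficient frame rates arise.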

174

Measuring small color differences in the nearly neutral region by 3CCD camera  

Microsoft Academic Search

A method to evaluate the discrimination capability of a camera to measure small color differences in the nearly neutral region is proposed. We focus on the camera's performance in the nearly neutral region of the color space because, first, it represents a challenge for the instrument (these colors entail a similar stimulation of the RGB channels) and, second, these colors concentrate

Edison Valencia; Maria S. Millan Garcia-Verela

2004-01-01

175

Dynamic imaging with a triggered and intensified CCD camera system in a high-intensity neutron beam  

NASA Astrophysics Data System (ADS)

When time-dependent processes within metallic structures are to be inspected and visualized, neutrons are well suited owing to their high penetration through Al, Ag, Ti or even steel. It then becomes possible to inspect the propagation, distribution and evaporation of organic liquids such as lubricants, fuel or water. The principal set-up of a suitable real-time system was implemented and tested at the radiography facility NEUTRA of PSI. The highest beam intensity there is 2×10^7 n cm^-2 s^-1, which enables observation of sequences in a reasonable time and quality. The heart of the detection system is the MCP-intensified CCD camera PI-Max with a Peltier-cooled chip (1300×1340 pixels). The intensifier was used for both gating and image enhancement, whereas the information was accumulated over many single frames on the chip before readout. Although a 16-bit dynamic range is advertised by the camera manufacturer, the effective range must be less due to the inherent noise level from the intensifier. The obtained results should be seen as a starting point for addressing the different requirements of car producers with respect to fuel injection, lubricant distribution, mechanical stability and operation control. Similar inspections will be possible for all devices with a repetitive operating principle. Here, we report on two measurements dealing with the lubricant distribution in a running motorcycle motor turning at 1200 rpm. We monitored the periodic stationary movements of the piston, valves and camshaft with a micro-channel plate intensified CCD camera system (PI-Max 1300RB, Princeton Instruments) triggered at exactly chosen time points.

Vontobel, P.; Frei, G.; Brunner, J.; Gildemeister, A. E.; Engelhardt, M.

2005-04-01

176

Construction of a Junction Box for Use with an Inexpensive, Commercially Available Underwater Video Camera Suitable for Aquatic Research  

Microsoft Academic Search

Underwater video camera apparatus is an important fisheries research tool. Such cameras, developed and marketed for recreational anglers, provide an opportunity for researchers to easily obtain cost-effective and waterproof video apparatus for fisheries research. We detail a series of modifications to an inexpensive, commercially available underwater video camera (about US$125) that provide flexibility for deploying the equipment in the laboratory

Steven J. Cooke; Christopher M. Bunt

2004-01-01

177

Investigation of a CCD camera for measurements of optical atmospheric turbulence  

NASA Astrophysics Data System (ADS)

Atmospheric turbulence introduces random phase distortions in optical imaging systems. The development of new laser and imaging systems requires information on the spatial and temporal distribution of this atmospheric turbulence. Measurements of the image spread and the jitter induced by the atmosphere on an optical system provide two techniques to quantify these phenomena. This thesis evaluates a Spectra Sources Lynxx PC Plus charge coupled device (CCD) array as an atmospheric turbulence sensor. Data acquisition and processing programs were written to measure the image spread of a point source and centroid jitter of a point source imaged through the atmosphere. Since atmospheric jitter measurements require high image frame rates, on the order of 200 images per second, a large portion of this thesis involved measurements of the times for the CCD detector, interface board, and IBM compatible computer to perform their tasks. Recommendations for higher performance are presented.

Rall, William J.

1992-03-01

178

Night Vision Camera  

NASA Technical Reports Server (NTRS)

PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

1996-01-01

179

Research Award providing funds for a tracking video camera  

NASA Technical Reports Server (NTRS)

The award provided funds for a tracking video camera. The camera has been installed and the system calibrated. It has enabled us to follow in real time the tracks of individual wood ants (Formica rufa) within a 3 m square arena as they navigate singly indoors guided by visual cues. To date we have been using the system on two projects. The first is an analysis of the navigational strategies that ants use when guided by an extended landmark (a low wall) to a feeding site. After a brief training period, ants are able to keep a defined distance and angle from the wall, using their memory of the wall's height on the retina as a controlling parameter. By training with walls of one height and length and testing with walls of different heights and lengths, we can show that ants adjust their distance from the wall so as to keep the wall at the height that they learned during training. Thus, their distance from the base of a tall wall is further than it is from the training wall, and the distance is shorter when the wall is low. The stopping point of the trajectory is defined precisely by the angle that the far end of the wall makes with the trajectory. Thus, ants walk further if the wall is extended in length and not so far if the wall is shortened. These experiments represent the first case in which the controlling parameters of an extended trajectory can be defined with some certainty. It raises many questions for future research that we are now pursuing.

Collett, Thomas

2000-01-01

180

Simultaneous monitoring of a collapsing landslide with video cameras  

NASA Astrophysics Data System (ADS)

Effective countermeasures and risk management to reduce landslide hazards require a full understanding of the processes of collapsing landslides. While the processes are generally estimated from the features of debris deposits after collapse, simultaneous monitoring during collapse provides more insights into the processes. Such monitoring, however, is usually very difficult, because it is rarely possible to predict when a collapse will occur. This study introduces a rare case in which a collapsing landslide (150 m in width and 135 m in height) was filmed with three video cameras in Higashi-Yokoyama, Gifu Prefecture, Japan. The cameras were set up in the front and on the right and left sides of the slide in May 2006, one month after a series of small slope failures in the toe and the formation of cracks on the head indicated that a collapse was imminent. The filmed images showed that the landslide collapse started from rock falls and slope failures occurring mainly around the margin, that is, the head, sides and toe. These rock falls and slope failures, which were individually counted on the screen, increased with time. Analyzing the images, five of the failures were estimated to have each produced more than 1000 m3 of debris, and the landslide collapsed with several surface failures accompanied by a toppling movement. The manner of the collapse suggested that the slip surface initially remained on the upper slope, and then extended down the slope as the excessive internal stress shifted downwards. Image analysis, together with field measurements using a ground-based laser scanner after the collapse, indicated that the landslide produced a total of 50 000 m3 of debris. As described above, simultaneous monitoring provides valuable information about landslide processes. Further development of monitoring techniques will help clarify landslide processes qualitatively as well as quantitatively.

Fujisawa, K.; Ohara, J.

2008-01-01

181

Development of the control circuits for the TDI-CCD stereo camera of the Chang'E-2 satellite based on FPGAs  

NASA Astrophysics Data System (ADS)

The TDI-CCD Stereo Camera is the optical sensor on the Chang'E-2 (CE-2) satellite created for the Chinese Lunar Exploration Program. The camera was designed to acquire three-dimensional stereoscopic images of the lunar surface based upon three-line array photogrammetric theory. The primary objectives of the camera are (1) to obtain about 1-m pixel spatial resolution images of the candidate landing site from an elliptical orbit at an altitude of ~15 km, and (2) to obtain about 7-m pixel spatial resolution global images of the Moon from a circular orbit at an altitude of ~100 km. The focal plane of the camera comprises two TDI-CCDs. The control circuits of the camera are based on two SRAM-type FPGAs, XQR2V3000-4CG717. In this paper, a variable-frequency control and multi-tap data readout technology for the TDI-CCD is presented, which adapts the data processing capabilities to the different orbit modes of the TDI-CCD stereo camera. In this way, the data rate of the camera is greatly reduced from 100 Mbps to 25 Mbps in the high-orbit mode, which helps raise the reliability of the image transfer. Onboard flight results validate that the proposed methodology is reasonable and reliable.

Duan, Yong-Qiang; Gao, Wei; Qiao, Wei-Dong; Wen, De-Sheng; Zhao, Bao-Chang

2013-09-01

182

Influence of diquat on growth and death of HepG2 cells using quartz crystal and micro CCD camera.  

PubMed

Diquat is a widely used agent that produces toxicity in humans and is implicated as an environmental toxicant. HepG2 cells were cultured on an indium tin oxide (ITO) electrode of a quartz crystal modified with a collagen film. In this paper, we investigated the physical properties and the morphological changes of the HepG2 cells cultured on the ITO electrode of the quartz crystal sensor with a micro CCD camera. The resonance responses of the quartz crystal and the morphological changes were directly monitored. After seeding the cells and injecting diquat into the chamber, the resonance frequency and the resonance resistance were obtained together with real-time morphologies. From the resonance characteristics and the series of morphologies, we could observe the diquat-induced weakening and death of the cells. PMID:21780434

Kang, Hyen-Wook; Lee, Dong-Yun; Muramatsu, Hiroshi; Lee, Burm-Jong; Kwon, Young-Soo

2011-05-01

183

Development of Measurement Device of Working Radius of Crane Based on Single CCD Camera and Laser Range Finder  

NASA Astrophysics Data System (ADS)

In this paper, we develop an observation device to measure the working radius of a crane truck. The device has a single CCD camera, a laser range finder and two AC servo motors. First, in order to measure the working radius, we consider an algorithm for crane hook recognition. We attach a cross mark to the crane hook and recognize the mark rather than the hook itself. Further, for the observation device, we construct a PI control system with an extended Kalman filter to track the moving cross mark. Through experiments, we show the usefulness of our device, including the new mark-tracking control system.

Nara, Shunsuke; Takahashi, Satoru

184

Characterization of the luminance and shape of ash particles at Sakurajima volcano, Japan, using CCD camera images  

NASA Astrophysics Data System (ADS)

We develop a new method for characterizing the properties of volcanic ash at the Sakurajima volcano, Japan, based on automatic processing of CCD camera images. Volcanic ash is studied in terms of both luminance and particle shape. A monochromatic CCD camera coupled with a stereomicroscope is used to acquire digital images through three filters that pass red, green, or blue light. On single ash particles, we measure the apparent luminance, corresponding to 256 tones for each color (red, green, and blue) for each pixel occupied by ash particles in the image, and the average and standard deviation of the luminance. The outline of each ash particle is captured from a digital image taken under transmitted light through a polarizing plate. Also, we define a new quasi-fractal dimension (D_qf) to quantify the complexity of the ash particle outlines. We examine two ash samples, each including about 1000 particles, which were erupted from the Showa crater of the Sakurajima volcano, Japan, on February 09, 2009 and January 13, 2010. The apparent luminance of each ash particle shows a lognormal distribution. The average luminance of the ash particles erupted in 2009 is higher than that of those erupted in 2010, which is in good agreement with the results obtained from component analysis under a binocular microscope (i.e., the number fraction of dark juvenile particles is lower for the 2009 sample). The standard deviations of apparent luminance have two peaks in the histogram, and the quasi-fractal dimensions show different frequency distributions between the two samples. These features are not recognized in the results of conventional qualitative classification criteria or the sphericity of the particle outlines. Our method can characterize and distinguish ash samples, even for ash particles that have gradual property changes, and is complementary to component analysis.
This method also enables the relatively fast and systematic analysis of ash samples that is required for petrologic monitoring of ongoing activity, such as at the Sakurajima volcano.

Miwa, Takahiro; Shimano, Taketo; Nishimura, Takeshi

2015-01-01
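The per-particle luminance statistics (average and standard deviation over the pixels of each particle) described above can be sketched as below. The labeled-mask input is an assumption standing in for the segmentation step, which this sketch does not implement:

```python
import numpy as np

def particle_luminance_stats(gray, labels):
    """Per-particle mean and standard deviation of 8-bit luminance.

    gray: 2-D image with 256 tones; labels: same-shape integer mask where
    0 is background and k > 0 marks the pixels of particle k (assumed to
    come from a prior segmentation step).
    """
    stats = {}
    for k in np.unique(labels):
        if k == 0:
            continue                     # skip background
        vals = gray[labels == k].astype(float)
        stats[int(k)] = (vals.mean(), vals.std())
    return stats

# Demo: two toy "particles" of different brightness in a 4x4 image.
gray = np.array([[200, 200, 10, 10],
                 [200, 200, 10, 10],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
labels = np.array([[1, 1, 2, 2],
                   [1, 1, 2, 2],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
print(particle_luminance_stats(gray, labels))
```

Repeating this per color channel would yield the red/green/blue tone statistics the abstract refers to.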

185

A compact very high resolution camera (VHRC) for earth and planetary exploration using a large array (7k × 8k) CCD  

Microsoft Academic Search

A concept is presented of a compact and very lightweight camera system for planetary exploration and terrestrial remote sensing with a (panchromatic) ground resolution of about 0.2 to 1.5 m per pixel from orbits of 100 km (Moon) to 800 km (Mars, Earth). The core of the camera system is a new 7k × 8k Philips CCD (12 μm pixels)

H.-G. Grothues; F. Lehmann; H. Michaelis; G. Neukum; R. Pischel; E. Ress; T. Behnke; M. Tschentscher

1999-01-01

186

Design of a Commutator Automatic Slotting Machine Based on CCD Camera  

Microsoft Academic Search

A commutator automatic slotting machine is a special equipment in the rotor commutator slotting of the DC motors. The detection and location of the commutator mica slot is its key technology. In this paper, a design method is put forward to implement real-time tracking and automatic indexing of the commutator mica slot by using advanced video image processing technology. Under

Gong-hu Guan; Wen-zheng Zhai

2012-01-01

187

Fast CCD camera for x-ray photon correlation spectroscopy and time-resolved x-ray scattering and imaging  

SciTech Connect

A new, fast x-ray detector system is presented for high-throughput, high-sensitivity, time-resolved x-ray scattering and imaging experiments, most especially x-ray photon correlation spectroscopy (XPCS). After a review of the architectures of different CCD chips and a critical examination of their suitability for use in a fast x-ray detector, the new detector hardware is described. In brief, its principal component is an inexpensive, commercial camera - the SMD1M60 - originally designed for optical applications and modified for use as a direct-illumination x-ray detector. The remainder of the system consists of two Coreco Imaging PC-DIG frame grabber boards, located inside a Dell PowerEdge 6400 server. Each frame grabber sits on its own PCI bus and handles data from 2 of the CCD's 4 taps. The SMD1M60 is based on a fast, frame-transfer, 4-tap CCD chip, read out at 12-bit resolution at frame rates of up to 62 Hz for full-frame readout and up to 500 Hz for one-sixteenth-frame readout. Experiments to characterize the camera's suitability for XPCS and small-angle x-ray scattering (SAXS) are presented. These experiments show that single-photon events are readily identified and localized to within a pixel index or so. This is a sufficiently fine spatial resolution to maintain the speckle contrast at an acceptable value for XPCS measurements. The detective quantum efficiency of the SMD1M60 is 49% for directly detected 6.3 keV x rays. The effects of data acquisition strategies that permit near-real-time data compression are also determined and discussed. Overall, the SMD1M60 detector system represents a major improvement in the technology for time-resolved x-ray experiments that require an area detector with time resolutions in the few-milliseconds-to-few-seconds range, and it should have wide applications extending beyond XPCS.

Falus, P.; Borthwick, M.A.; Mochrie, S.G.J. [Department of Physics, Yale University, New Haven, Connecticut 06520 and Department of Physics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States); Department of Physics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States); Departments of Physics and Applied Physics, Yale University, New Haven, Connecticut 06520 (United States)

2004-11-01

188

Station Cameras Capture New Videos of Hurricane Katia - Duration: 5:36.  

NASA Video Gallery

Aboard the International Space Station, external cameras captured new video of Hurricane Katia as it moved northwest across the western Atlantic north of Puerto Rico at 10:35 a.m. EDT on September ...

189

Fused Six-Camera Video of STS-134 Launch - Duration: 1:19.  

NASA Video Gallery

Imaging experts funded by the Space Shuttle Program and located at NASA's Ames Research Center prepared this video by merging nearly 20,000 photographs taken by a set of six cameras capturing 250 i...

190

Measuring Night-Sky Brightness with a Wide-Field CCD Camera  

E-print Network

We describe a system for rapidly measuring the brightness of the night sky using a mosaic of CCD images obtained with a low-cost automated system. The portable system produces millions of independent photometric measurements covering the entire sky, enabling the detailed characterization of natural sky conditions and light domes produced by cities. The measurements are calibrated using images of standard stars contained within the raw data, producing results closely tracking the Johnson V astronomical standard. The National Park Service has collected hundreds of data sets at numerous parks since 2001 and is using these data for the protection and monitoring of the night-sky visual resource. This system also allows comprehensive characterization of sky conditions at astronomical observatories. We explore photometric issues raised by the broadband measurement of the complex and variable night-sky spectrum, and potential indices of night-sky quality.

D. M. Duriscoe; C. B. Luginbuhl; C. A. Moore

2007-03-27

191

Measuring neutron fluences and gamma/x ray fluxes with CCD cameras  

NASA Astrophysics Data System (ADS)

Volume and area measurements of transient radiation-induced pixel charge in English Electric Valve (EEV) Frame Transfer (FT) charge coupled devices (CCDs) from irradiation with pulsed neutrons (14 MeV) and Bremsstrahlung photons (16-MeV endpoint) are utilized to calibrate the devices as radiometric imaging sensors capable of distinguishing between the two types of ionizing radiation. Measurements indicate approximately 0.5 V/rad responsivity, with greater than or equal to 1 rad required for saturation from photon irradiation. Neutron-generated localized charge centers or 'peaks', binned by area and amplitude as functions of fluence in the 10^5 to 10^7 n/cm^2 range, indicate smearing over approximately 1 to 10 percent of the CCD array, with charge per pixel ranging between the noise and saturation levels.

Yates, G. J.; Smith, G. W.; Zagarino, P.; Thomas, M. C.

192

Bayesian Reconstruction of 3D Human Motion from Single-Camera Video  

Microsoft Academic Search

The three-dimensional motion of humans is underdetermined when the observation is limited to a single camera, due to the inherent 3D ambiguity of 2D video. We present a system that reconstructs the 3D motion of human subjects from single-camera video, relying on prior knowledge about human motion, learned from training data, to resolve those ambiguities. After initialization in 2D, the

Nicholas R. Howe; Michael E. Leventon; William T. Freeman

1999-01-01

193

Using a Video Camera to Measure the Radius of the Earth  

ERIC Educational Resources Information Center

A simple but accurate method for measuring the Earth's radius using a video camera is described. A video camera was used to capture a shadow rising up the wall of a tall building at sunset. A free program called ImageJ was used to measure the time it took the shadow to rise a known distance up the building. The time, distance and length of

Carroll, Joshua; Hughes, Stephen

2013-01-01

194

Improvement of the homogeneous nucleation rate measurements in a static diffusion chamber with use of a CCD camera  

NASA Astrophysics Data System (ADS)

A photographic approach introduced recently to study nucleation kinetics in a static diffusion chamber has been further improved by using a sensitive CCD camera instead of a photographic one. The method is based on illuminating the chamber from the side with a flattened vertical laser beam. Trajectories of droplets, formed by nucleation and growing inside this beam, are recorded by a camera positioned perpendicular to the beam. Evaluation of the starting points of the droplet trajectories yields the vertical distribution of the nucleation rate, which can then be related to calculated local values of temperature and supersaturation. This approach is independent of any nucleation theory. The main drawback of the photographic method, the tedious production and evaluation of photographs, is replaced here by an automated procedure. Digitized pictures are downloaded to a PC and evaluated by standard methods of image analysis. The starting points of individual droplets are sought using normalized correlation with a template. The position of the liquid films is found by maximizing the sum of intensities along a line. The approach seems to keep the advantages of the photographic one, but is much more effective.

Ždímal, Vladimír; Smolík, Jiří; Hopke, Philip K.; Matas, Jiří

2000-08-01
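A brute-force sketch of the normalized correlation with a template used above to locate the starting points of droplet trajectories. The demo image and blob template are illustrative; a production version would use an optimized routine (e.g. FFT-based correlation):

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Slide a template over an image and return the NCC score map.

    Scores lie in [-1, 1]; a value near 1 marks a patch that matches the
    template up to brightness offset and scale.
    """
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            out[r, c] = (p * t).sum() / denom if denom > 0 else 0.0
    return out

# Demo: plant a bright 3x3 blob in a flat image and recover its position.
img = np.zeros((20, 20))
blob = np.array([[0, 1, 0], [1, 2, 1], [0, 1, 0]], dtype=float)
img[7:10, 12:15] = blob
score = normalized_cross_correlation(img, blob)
print(np.unravel_index(np.argmax(score), score.shape))  # top-left of best match
```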

195

Camera-based Video Synchronization for a Federation of Mobile Projectors  

E-print Network

Portable devices such as mobile phones, personal digital assistants, and digital cameras ... Such mobile devices are projected to be the primary device to be used by younger people

Majumder, Aditi

196

Low cost referenced luminescent imaging of oxygen and pH with a 2-CCD colour near infrared camera.  

PubMed

A low cost imaging set-up for optical chemical sensors based on NIR-emitting dyes is presented. It is based on a commercially available 2-CCD colour near infrared camera, LEDs and tailor-made optical sensing materials for oxygen and pH. The set-up extends common ratiometric RGB imaging based on the red, green and blue channels of colour cameras by an additional NIR channel. The hardware and software of the camera were adapted to perform ratiometric imaging. A series of new planar sensing foils were introduced to image oxygen, pH and both parameters simultaneously. The used NIR-emitting indicators are based on benzoporphyrins and aza-BODIPYs for oxygen and pH, respectively. Moreover, a wide dynamic range oxygen sensor is presented. It allows accurate imaging of oxygen from trace levels up to ambient air concentrations. The imaging set-up in combination with the normal range ratiometric oxygen sensor showed a resolution of 4-5 hPa at low oxygen concentrations (<50 hPa) and 10-15 hPa at ambient air oxygen concentrations; the trace range oxygen sensor (<20 hPa) revealed a resolution of about 0.5-1.8 hPa. The working range of the pH-sensor was in the physiological region from pH 6.0 up to pH 8.0 and showed an apparent pKa-value of 7.3 with a resolution of about 0.1 pH units. The performance of the dual parameter oxygen/pH sensor was comparable to the single analyte pH and normal range oxygen sensors. PMID:25096329

Ehgartner, Josef; Wiltsche, Helmar; Borisov, Sergey M; Mayr, Torsten

2014-10-01
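The abstract above does not give the sensors' calibration functions; as an illustration only, a first-order Stern-Volmer quenching model is a common way to turn a referenced luminescence ratio into oxygen partial pressure. All parameter values in this sketch are hypothetical:

```python
import numpy as np

def oxygen_from_ratio(ratio_image, ratio_zero, k_sv):
    """Convert a ratiometric luminescence image to oxygen partial pressure.

    Assumes simple Stern-Volmer quenching, R0/R = 1 + K_SV * pO2, a common
    first-order model for luminescent oxygen indicators (not necessarily
    the paper's calibration function). ratio_zero is the ratio at zero
    oxygen; k_sv is the quenching constant in 1/hPa.
    """
    return (ratio_zero / ratio_image - 1.0) / k_sv

# Demo with a hypothetical calibration: K_SV = 0.02 /hPa, R0 = 4.0.
ratio = np.array([[4.0, 2.0],
                  [1.0, 0.5]])
print(oxygen_from_ratio(ratio, ratio_zero=4.0, k_sv=0.02))
```

Dividing the indicator channel by a reference channel before this step is what cancels inhomogeneous illumination and dye loading in ratiometric imaging.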

197

Camera Calibration for Video See-Through Head-Mounted Display Mike Bajura  

E-print Network

Camera Calibration for Video See-Through Head-Mounted Display Mike Bajura July 7, 1993 Abstract-through head-mounted display applications. 1.0 Introduction In a video see-through head-mounted display system head-mounted display. The combined imagery is shown on the head-mounted display allowing the user

North Carolina at Chapel Hill, University of

198

Algorithms for the Automatic Identification of MARFEs and UFOs in JET Database of Visible Camera Videos  

Microsoft Academic Search

MARFE instabilities and UFOs leave clear signatures in JET fast visible camera videos. Given the potential harmful consequences of these events, particularly as triggers of disruptions, it would be important to have the means of detecting them automatically. In this paper, the results of various algorithms to identify automatically the MARFEs and UFOs in JET visible videos are reported. The

A. Murari; M. Camplani; B. Cannas; D. Mazon; F. Delaunay; P. Usai; J. F. Delmond

2010-01-01

199

Video camera system using liquid-crystal polarizing filter to reduce reflected light  

Microsoft Academic Search

We have developed a video camera system using an electrically controllable liquid-crystal polarizing filter for television program production, which can quickly reduce undesirable reflected light from a window pane or watery surface by automatically judging the polarization state of incident light from changes in video signal intensity. More than 80% of linearly polarized incident light is removable by this

H. Fujikake; K. Takizawa; T. Aida; T. Negishi; M. Kobayashi

1998-01-01

200

Experimental comparison of the high-speed imaging performance of an EM-CCD and sCMOS camera in a dynamic live-cell imaging test case.  

PubMed

The study of living cells may require advanced imaging techniques to track weak and rapidly changing signals. Fundamental to this need is the recent advancement in camera technology. Two camera types, specifically sCMOS and EM-CCD, promise both high signal-to-noise and high speed (>100 fps), leaving researchers with a critical decision when determining the best technology for their application. In this article, we compare two cameras using a live-cell imaging test case in which small changes in cellular fluorescence must be rapidly detected with high spatial resolution. The EM-CCD maintained an advantage of being able to acquire discernible images with a lower number of photons due to its EM-enhancement. However, if high-resolution images at speeds approaching or exceeding 1000 fps are desired, the flexibility of the full-frame imaging capabilities of sCMOS is superior. PMID:24404178

Beier, Hope T; Ibey, Bennett L

2014-01-01

201

Utilization of an Electron Multiplying CCD camera for applications in quantum information processing  

NASA Astrophysics Data System (ADS)

Electron Multiplying Charge-Coupled Device (EMCCD) cameras utilize an on-chip amplification process which boosts low-light signals above the readout noise floor. Although traditionally used for biological imaging, they have recently attracted interest for single-photon counting and entangled-state characterization in quantum information processing applications. In addition, they exhibit some photon-number-resolving capacity, which is attractive from the point of view of several applications in optical continuous-variable computing, such as building a cubic phase gate. We characterize the Andor Luca-R EMCCD camera as an affordable tool for applications in optical quantum information. We present measurements of single-photon detection efficiency, dark count probability and photon-number-resolving capacity, and place quantitative bounds on the noise performance and detection efficiency of the EMCCD detector array. We find that the readout noise floor is a Gaussian distribution centered at 500 counts/pixel/frame at a high EM gain setting. We also characterize the trade-off between quantum efficiency and detector dark-count probability.

Patel, Monika; Chen, Jian; Habif, Jonathan

2013-03-01

202

Feasibility study of transmission of OTV camera control information in the video vertical blanking interval  

NASA Technical Reports Server (NTRS)

The Operational Television system at Kennedy Space Center operates hundreds of video cameras, many remotely controllable, in support of the operations at the center. This study was undertaken to determine if commercial NABTS (North American Basic Teletext System) teletext transmission in the vertical blanking interval of the genlock signals distributed to the cameras could be used to send remote control commands to the cameras and the associated pan and tilt platforms. Wavelength division multiplexed fiberoptic links are being installed in the OTV system to obtain RS-250 short-haul quality. It was demonstrated that the NABTS transmission could be sent over the fiberoptic cable plant without excessive video quality degradation and that video cameras could be controlled using NABTS transmissions over multimode fiberoptic paths as long as 1.2 km.

White, Preston A., III

1994-01-01

203

Transient noise characterization and filtration in CCD cameras exposed to stray radiation from a medical linear accelerator  

PubMed Central

Charge coupled devices (CCDs) are being increasingly used in radiation therapy for dosimetric purposes. However, CCDs are sensitive to stray radiation, which induces transient noise. Radiation-induced noise strongly alters the image and therefore limits its quantitative analysis. The purpose of this work is to characterize the radiation-induced noise and to develop filtration algorithms to restore image quality. Two models of CCD were used for measurements close to a medical linac. The structure of the transient noise was first characterized. Then, four methods of noise filtration were compared: median filtering of a time series of identical images, uniform median filtering of single images, an adaptive filter with switching mechanism, and a modified version of the adaptive switch filter. The intensity distribution of noisy pixels was similar in both cameras. However, the spatial distribution of the noise was different: The average noise cluster size was 1.2 ± 0.6 and 3.2 ± 2.7 pixels for the U2000 and the Luca, respectively. The median of a time series of images resulted in the best filtration and minimal image distortion. For applications where a time series is impractical, the adaptive switch filter must be used to reduce image distortion. Our modified version of the switch filter can be used in order to handle nonisolated groups of noisy pixels. PMID:18975680
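The time-series median that the abstract found best can be sketched on synthetic data. The spike rate, amplitudes and image are assumed values; the point is that a transient spike must hit the same pixel in a majority of frames to survive the temporal median:

```python
import numpy as np

rng = np.random.default_rng(1)

# Clean 32x32 test image with a smooth gradient.
clean = np.linspace(0, 100, 32 * 32).reshape(32, 32)

# A time series of identical frames, each hit by sparse radiation-induced
# transient noise (salt-like spikes on ~0.5% of the pixels per frame).
frames = []
for _ in range(9):
    f = clean.copy()
    hits = rng.random(clean.shape) < 0.005
    f[hits] += rng.uniform(500, 4000, hits.sum())
    frames.append(f)

# Median over the time axis: a spike would have to hit the same pixel in
# at least 5 of 9 frames to survive, which is vanishingly unlikely.
filtered = np.median(np.stack(frames), axis=0)
print(np.abs(filtered - clean).max())
```

This also shows why the method distorts the image minimally: wherever a pixel is clean in most frames, the median returns an unmodified pixel value rather than a spatially smoothed one.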

Archambault, Louis; Briere, Tina Marie; Beddar, Sam

2008-01-01

204

Temperature monitoring of Nd:YAG laser cladding (CW and PP) by advanced pyrometry and CCD-camera-based diagnostic tool  

NASA Astrophysics Data System (ADS)

A set of original pyrometers and a special diagnostic CCD camera were applied for monitoring of Nd:YAG laser cladding (Pulsed-Periodic and Continuous Wave) with coaxial powder injection and on-line measurement of cladded layer temperature. The experiments were carried out in the course of developing wear-resistant coatings using various powder blends (WC-Co, CuSn, Mo, Stellite grade 12, etc.) and varying different process parameters: laser power, cladding velocity, powder feeding rate, etc. The surface temperature distribution along the cladding seam and the overall temperature mapping were registered. The CCD-camera-based diagnostic tool was applied for: (1) monitoring of the flux of hot particles and its instability; (2) measurement of particle-in-flight size and velocity; (3) monitoring of particle collision with the clad in the interaction zone.

Doubenskaia, M.; Bertrand, Ph.; Smurov, Igor Y.

2004-04-01

205

Engineer reconnaissance with a video camera: feasibility study  

E-print Network

resolution (the Achilles heel of digital versus emulsion-based photographic images) is paramount to any mensuration attempts from video imagery. While the essential elements of information required by Army engineers for the hasty and deliberate...

Bergner, Kirk Michael

1990-01-01

206

Panoramic Video Capturing and Compressed Domain Virtual Camera Control  

E-print Network

in a classroom/seminar or teleconference. The speaker may move around, stop, turn his body, or perform some applications such as classroom lectures and video conferencing. The proposed method is based on the Fly

California at Santa Barbara, University of

207

Single video camera method for using scene metrics to measure constrained 3D displacements  

NASA Astrophysics Data System (ADS)

There are numerous ways to use video cameras to measure 3D dynamic spatial displacements. When the scene geometry is unknown and the motion is unconstrained, two calibrated cameras are required. The data from both scenes are combined to perform the measurements using well known stereoscopic techniques. There are occasions where the measurement system can be simplified considerably while still providing a calibrated spatial measurement of a complex dynamic scene. For instance, if the sizes of objects in the scene are known a priori, these data may be used to provide scene specific spatial metrics to compute calibration coefficients. With this information, it is not necessary to calibrate the camera before use, nor is it necessary to precisely know the geometry between the camera and the scene. Field-ofview coverage and sufficient spatial and temporal resolution are the main camera requirements. Further simplification may be made if the 3D displacements of interest are small or constrained enough to allow for an accurate 2D projection of the spatial variables of interest. With proper camera orientation and scene marking, the apparent pixel movements can be expressed as a linear combination of the underlying spatial variables of interest. In many cases, a single camera may be used to perform complex 3D dynamic scene measurements. This paper will explain and illustrate a technique for using a single uncalibrated video camera to measure the 3D displacement of the end of a constrained rigid body subject to a perturbation.
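The scene-metric idea reduces to a simple scale calibration in the planar-motion case. A minimal sketch with hypothetical numbers (a reference object of known physical length spanning a known number of pixels, motion assumed roughly parallel to the image plane):

```python
# Known object size in the scene provides the spatial metric: the scale
# factor converts apparent pixel motion into physical displacement,
# assuming the motion stays in a plane roughly parallel to the sensor.
ref_length_mm = 250.0      # known size of a reference object (assumed)
ref_length_px = 500.0      # its measured extent in the image (assumed)
mm_per_px = ref_length_mm / ref_length_px

# A tracked feature moved 12.4 px between frames -> physical displacement.
dx_px = 12.4
dx_mm = dx_px * mm_per_px
print(dx_mm)  # 6.2
```

No camera pre-calibration or camera-to-scene geometry is needed for this measurement, exactly as the abstract argues; only field of view and resolution constrain the setup.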

Gauthier, L. R.; Jansen, M. E.; Meyer, J. R.

2014-09-01

208

BOREAS RSS-3 Imagery and Snapshots from a Helicopter-Mounted Video Camera  

NASA Technical Reports Server (NTRS)

The BOREAS RSS-3 team collected helicopter-based video coverage of forested sites acquired during BOREAS as well as single-frame "snapshots" processed to still images. Helicopter data used in this analysis were collected during all three 1994 IFCs (24-May to 16-Jun, 19-Jul to 10-Aug, and 30-Aug to 19-Sep), at numerous tower and auxiliary sites in both the NSA and the SSA. The VHS-camera observations correspond to other coincident helicopter measurements. The field of view of the camera is unknown. The video tapes are in both VHS and Beta format. The still images are stored in JPEG format.

Walthall, Charles L.; Loechel, Sara; Nickeson, Jaime (Editor); Hall, Forrest G. (Editor)

2000-01-01

209

Simulation of a Video Surveillance Network Using Remote Intelligent Security Cameras  

Microsoft Academic Search

The high continuous bit-rates carried by digital fiber-based video surveillance networks have prompted demands for intelligent sensor devices to reduce bandwidth requirements. These devices detect and report only significant events, thus optimizing the use of recording and transmission hardware. The Remote Intelligent Security Camera (R.I.S.C.) concept devolves local autonomy to geographically distant cameras, enabling them to switch between tasks in

J. R. Renno; M. J. Tunnicliffe; Graeme A. Jones; David J. Parish

2001-01-01

210

A small CCD zenith camera (ZC-G1) - developed for rapid geoid monitoring in difficult projects  

NASA Astrophysics Data System (ADS)

Modern Geodesy by terrestrial or space methods is accurate to millimetres or even better. This requires very exact system definitions, together with Astronomy & Physics - and a geoid of cm level. To reach this precision, astrogeodetic vertical deflections are more effective than gravimetry or other methods - as shown by the 1st author 1996 at many projects in different European countries and landscapes. While classical Astrogeodesy is rather complicated (time consuming, heavy instruments and observer's experience) new electro-optical methods are semi-automatic and fill our "geoid gap" between satellite resolution (150 km) and local requirements (2-10 km): With CCD we can speed up and achieve high accuracy almost without observer's experience. In Vienna we construct a mobile zenith camera guided by notebook and GPS: made of Dur-Al, f=20 cm with a Starlite MX-sensor (752×580 pixels, 11 µm). Accuracy 1" within 10 min, mounted at a usual survey tripod. Weight only 4 kg for a special vertical axis, controlled by springs (490) and 2 levels (2002) or sensor (2003). Applications 2003: Improving parts of Austrian geoid (4 cm → 2 cm); automatic astro-points in alpine surveys (vertical deflection effects 3-15 cm per km). Transform of GPS heights to 1 cm. Tunneling study: heighting up to 0.1 mm without external control; combining astro-topographic and geological data. Plans 2004: Astro control of polygons and networks - to raise accuracy and economy by ~40% (Sun azimuths of 3"; additional effort only 10-20%). Planned with servo theodolites and open co-operation groups.

Gerstbach, G.; Pichler, H.

2003-10-01

211

Video geographic information system using mobile mapping in mobilephone camera  

NASA Astrophysics Data System (ADS)

The aim of this paper is to develop core technologies such as automatic shape extraction from images (video), spatio-temporal data processing, and efficient modeling, making it inexpensive and fast to build and process huge 3D geographic data. The upgrade and maintenance of the technologies are also easy due to the component-based system architecture. Therefore, we designed and implemented the Video mobile GIS using a real-time database system, which consisted of a real-time GIS engine, a middleware, and a mobile client.

Kang, Jinsuk; Lee, Jae-Joon

2013-12-01

212

Passive millimeter-wave video camera for aviation applications  

NASA Astrophysics Data System (ADS)

Passive Millimeter Wave (PMMW) imaging technology offers significant safety benefits to world aviation. Made possible by recent technological breakthroughs, PMMW imaging sensors provide visual-like images of objects under low visibility conditions (e.g., fog, clouds, snow, sandstorms, and smoke) which blind visual and infrared sensors. TRW has developed an advanced, demonstrator version of a PMMW imaging camera that, when front-mounted on an aircraft, gives images of the forward scene at a rate and quality sufficient to enhance aircrew vision and situational awareness under low visibility conditions. Potential aviation uses for a PMMW camera are numerous and include: (1) Enhanced vision for autonomous take- off, landing, and surface operations in Category III weather on Category I and non-precision runways; (2) Enhanced situational awareness during initial and final approach, including Controlled Flight Into Terrain (CFIT) mitigation; (3) Ground traffic control in low visibility; (4) Enhanced airport security. TRW leads a consortium which began flight tests with the demonstration PMMW camera in September 1997. Flight testing will continue in 1998. We discuss the characteristics of PMMW images, the current state of the technology, the integration of the camera with other flight avionics to form an enhanced vision system, and other aviation applications.

Fornaca, Steven W.; Shoucri, Merit; Yujiri, Larry

1998-07-01

213

Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source (HTPD 08 paper)  

SciTech Connect

The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 µm square pixels, and 15 µm thick. A multi-anode Manson X-ray source, operating up to 10 kV and 10 W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 10. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within ±1% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.
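The pixel sensitivity maps measured here are what a standard flat-field correction removes. A minimal sketch with synthetic data (the divide-by-normalized-flat scheme is a generic technique, not the SXI calibration pipeline itself):

```python
import numpy as np

def flat_correct(raw, flat, dark):
    """Divide out per-pixel relative sensitivity estimated from a flat frame."""
    gain = (flat - dark) / np.mean(flat - dark)   # relative sensitivity map
    return (raw - dark) / gain

rng = np.random.default_rng(2)
sens = 1.0 + 0.01 * rng.standard_normal((4, 4))   # ~±1% pixel sensitivity
dark = np.full((4, 4), 100.0)                     # dark/bias frame
flat = dark + 1000.0 * sens                       # uniform-illumination exposure
raw = dark + 500.0 * sens                         # uniform scene, same sensitivity

corrected = flat_correct(raw, flat, dark)         # becomes uniform
```

Because the sensitivity variation is energy dependent (strongest at 8470 eV per the abstract), a single flat frame would only correct data taken near the energy at which the flat was acquired; a per-band flat would be needed across the measured range.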

Haugh, M; Schneider, M B

2008-04-28

214

Onboard video cameras and instruments to measure the flight behavior of birds  

Microsoft Academic Search

Summary We have recently developed several novel techniques to measure flight kinematic parameters on free-flying birds of prey using onboard wireless video cameras and inertial measurement systems (1). Work to date has involved captive trained raptors including a Steppe Eagle (Aquila nipalensis), Peregrine falcon (Falco peregrinus) and Gyrfalcon (Falco rusticolus). We aim to describe mathematically the dynamics of the relationship

J. A. Gillies; M. Bacic; A. L. R. Thomas; G. K. Taylor

2008-01-01

215

Video-based Animal Behavior Analysis From Multiple Cameras Xinwei Xue and Thomas C. Henderson  

E-print Network

Subject behavior study has become a very important research area, in which the behavior of various animals or humans is studied for many different purposes. In the context of an animal, the behaviors may include

Henderson, Thomas C.

216

Design and Implementation of a Wireless Video Camera Network for Coastal Erosion Monitoring  

E-print Network

The short-term rate of coastal erosion and recession has been observed at island shoreline bluffs near waterways among Boston Harbor, Massachusetts, USA. This erosion has been hypothesized to be partially related

Little, Thomas

217

Digital Video Cameras for Brainstorming and Outlining: The Process and Potential  

ERIC Educational Resources Information Center

This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice

Unger, John A.; Scullion, Vicki A.

2013-01-01

218

Nyquist Sampling Theorem: Understanding the Illusion of a Spinning Wheel Captured with a Video Camera  

ERIC Educational Resources Information Center

Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the
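The spinning-wheel illusion follows directly from the sampling theorem: the camera samples the wheel's phase at the frame rate, so the apparent rotation frequency is the true frequency folded into the Nyquist band. A short sketch (frequencies in revolutions per second are illustrative):

```python
def apparent_frequency(f_true, f_sample):
    """Aliased frequency seen in sampled video, folded into [-fs/2, fs/2).
    A negative result means the wheel appears to spin backwards."""
    return ((f_true + f_sample / 2) % f_sample) - f_sample / 2

# A wheel spinning at 28 rev/s filmed at 30 fps appears to turn
# backwards at 2 rev/s; at exactly 30 rev/s it appears stationary.
print(apparent_frequency(28.0, 30.0))  # -2.0
print(apparent_frequency(30.0, 30.0))  # 0.0
```

Only when the true frequency stays below half the frame rate (the Nyquist limit) does the apparent motion match the real motion, which is the sampling-time criterion the article advocates.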

Levesque, Luc

2014-01-01

219

Highly flexible and Internet-programmable CCD camera with a frequency-selectable read-out for imaging and spectroscopy applications  

NASA Astrophysics Data System (ADS)

A new concept CCD camera is currently being realized at the XUV Lab of the Department of Astronomy and Space Science of the University of Florence. The main features we aim to achieve are a high level of versatility and a fast pixel rate. Within this project, a versatile CCD sequencer has been realized with interesting and innovative features. Based on a microcontroller and complex programmable logic devices (CPLD), it allows the selection of all the parameters related to charge transfer and CCD readout (number, duration and overlapping of serial and parallel transfer clocks, number of output nodes, pixel transfer rate) and therefore it allows the use of virtually any CCD sensor. Compared to a common DSP-based sequencer, it is immune to jitter noise and it can also reach pixel rates greater than 40 MHz. The software interface is LabVIEW 6i based and it will allow both local or remote control and display. Furthermore, it will be possible to remotely debug the system and to upgrade the LabVIEW interface itself and also the microcontroller resident program and the CPLD implemented schemes.

Gori, Luca; Pace, Emanuele; Tommasi, Leonardo; Sarocchi, D.; Bagnoli, V.; Sozzi, M.; Puri, S.

2001-12-01

220

Composite video and graphics display for multiple camera viewing system in robotics and teleoperation  

NASA Technical Reports Server (NTRS)

A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinate for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

Diner, Daniel B. (inventor); Venema, Steven C. (inventor)

1991-01-01

221

Composite video and graphics display for camera viewing systems in robotics and teleoperation  

NASA Technical Reports Server (NTRS)

A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinate for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

Diner, Daniel B. (inventor); Venema, Steven C. (inventor)

1993-01-01

222

Meteor velocity distribution from CILBO double station video camera data  

NASA Astrophysics Data System (ADS)

This paper is based on data from the double-station meteor camera setup on the Canary Islands - CILBO. The data has been collected from July 2011 until August 2014. The CILBO meteor data of one year (1 June 2013 - 31 May 2014) were used to analyze the velocity distribution of sporadic meteors and to compare the distribution to a reference distribution for near-Earth space. The velocity distribution for 1 AU outside the influence of Earth derived from the Harvard Radio Meteor Project (HRMP) was used as a reference. This HRMP distribution was converted to an altitude of 100 km by considering the gravitational attraction of Earth. The new, theoretical velocity distribution for a fixed meteoroid mass ranges from 11 - 71 km/s and peaks at 12.5 km/s. This represents the predicted velocity distribution. The velocity distribution of the meteors detected simultaneously by both cameras of the CILBO system was examined. The meteors are sorted by their stream association and especially the velocity distribution of the sporadics is studied closely. The derived sporadic velocity distribution has a maximum at 64 km/s. This drastic difference from the theoretical curve confirms that fast meteors are usually greatly over-represented in optical and radar measurements of meteors. The majority of the fast sporadics are apparently caused by the Apex contribution in the early morning hours. This paper presents first results of the ongoing analysis of the meteor velocity distribution.

Drolshagen, Esther; Ott, Theresa; Koschny, Detlef; Drolshagen, Gerhard; Poppe, Bjoern

2014-02-01

223

Development of a compact fast CCD camera and resonant soft x-ray scattering endstation for time-resolved pump-probe experiments.  

PubMed

The designs of a compact, fast CCD (cFCCD) camera, together with a resonant soft x-ray scattering endstation, are presented. The cFCCD camera consists of a highly parallel, custom, thick, high-resistivity CCD, read out by a custom 16-channel application-specific integrated circuit to reach the maximum readout rate of 200 frames per second. The camera is mounted on a virtual-axis flip stage inside the RSXS chamber. When this flip stage is coupled to a differentially pumped rotary seal, the detector assembly can rotate about 100°/360° in the vertical/horizontal scattering planes. With a six-degrees-of-freedom cryogenic sample goniometer, this endstation has the capability to detect the superlattice reflections from the electronic orderings showing up in the lower hemisphere. The complete system has been tested at the Advanced Light Source, Lawrence Berkeley National Laboratory, and has been used in multiple experiments at the Linac Coherent Light Source, SLAC National Accelerator Laboratory. PMID:21806178

Doering, D; Chuang, Y-D; Andresen, N; Chow, K; Contarato, D; Cummings, C; Domning, E; Joseph, J; Pepper, J S; Smith, B; Zizka, G; Ford, C; Lee, W S; Weaver, M; Patthey, L; Weizeorick, J; Hussain, Z; Denes, P

2011-07-01

224

A refrigerated web camera for photogrammetric video measurement inside biomass boilers and combustion analysis.  

PubMed

This paper describes a prototype instrumentation system for photogrammetric measuring of bed and ash layers, as well as for flying particle detection and pursuit using a single device (CCD) web camera. The system was designed to obtain images of the combustion process in the interior of a domestic boiler. It includes a cooling system, needed because of the high temperatures in the combustion chamber of the boiler. The cooling system was designed using CFD simulations to ensure effectiveness. This method allows more complete and real-time monitoring of the combustion process taking place inside a boiler. The information gained from this system may facilitate the optimisation of boiler processes. PMID:22319349

Porteiro, Jacobo; Riveiro, Belén; Granada, Enrique; Armesto, Julia; Eguía, Pablo; Collazo, Joaquín

2011-01-01

225

Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance  

NASA Astrophysics Data System (ADS)

This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. Three main concerns of the algorithm are (1) the imagery of the human object's face for biometric purposes, (2) the optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions such as the camera-subject distance, pan/tilt angles of capture, face visibility and others. Such an objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
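A capture-quality score in the spirit of the objective function described above can be sketched as a weighted sum of per-condition terms. The functional forms, weights and ranges below are illustrative assumptions, not the paper's actual objective:

```python
import math

def capture_score(distance_m, pan_deg, tilt_deg, face_visible,
                  d_opt=5.0, w=(0.4, 0.3, 0.3)):
    """Hypothetical score for one (camera, subject) assignment:
    distance term peaks at an assumed optimal range d_opt, the angle term
    decays with deviation from a frontal view, and face visibility is binary."""
    s_dist = math.exp(-((distance_m - d_opt) / d_opt) ** 2)
    s_angle = max(0.0, 1.0 - (abs(pan_deg) + abs(tilt_deg)) / 90.0)
    s_face = 1.0 if face_visible else 0.0
    return w[0] * s_dist + w[1] * s_angle + w[2] * s_face

# A frontal, well-placed subject outranks a distant oblique one.
print(capture_score(5.0, 0.0, 0.0, True) >
      capture_score(15.0, 60.0, 20.0, False))  # True
```

Ranking candidate assignments by such a score per scheduling cycle is one way to balance capture count against capture quality across subjects, with hand-off time handled as a separate constraint.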

Liu, Yu-Che; Huang, Chung-Lin

2013-03-01

226

Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy  

NASA Technical Reports Server (NTRS)

Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge coupled device. The camera consists of a X-ray sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.

1984-01-01

227

Hardware-based smart camera for recovering high dynamic range video from multiple exposures  

NASA Astrophysics Data System (ADS)

In many applications such as video surveillance or defect detection, the perception of information related to a scene is limited in areas with strong contrasts. The high dynamic range (HDR) capture technique can deal with these limitations. The proposed method has the advantage of automatically selecting multiple exposure times to make outputs more visible than fixed exposure ones. A real-time hardware implementation of the HDR technique that shows more details both in dark and bright areas of a scene is an important line of research. For this purpose, we built a dedicated smart camera that performs both capturing and HDR video processing from three exposures. What is new in our work is shown through the following points: HDR video capture through multiple exposure control, HDR memory management, HDR frame generation, and representation under a hardware context. Our camera achieves a real-time HDR video output at 60 fps at 1.3 megapixels and demonstrates the efficiency of our technique through an experimental result. Applications of this HDR smart camera include the movie industry, the mass-consumer market, military, automotive industry, and surveillance.
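The core HDR-from-multiple-exposures step can be sketched in software. This is a generic radiance-merge assuming a linear sensor response with triangle weighting, not the camera's hardware pipeline, and the exposure times and pixel values are made up:

```python
import numpy as np

def merge_hdr(frames, exposures, max_val=255.0):
    """Merge exposures of a static scene into a radiance estimate.
    Each pixel contributes value/exposure_time, weighted to favour
    mid-range (well-exposed) readings over dark or saturated ones."""
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros_like(frames[0], dtype=float)
    for img, t in zip(frames, exposures):
        img = img.astype(float)
        w = 1.0 - 2.0 * np.abs(img / max_val - 0.5)   # triangle weight
        num += w * img / t
        den += w
    return num / np.maximum(den, 1e-9)

# Same scene at 1, 2 and 4 ms: a pixel of true radiance 50 units/ms reads
# 50, 100 and 200; the merge recovers ~50 regardless of exposure.
frames = [np.full((2, 2), v) for v in (50.0, 100.0, 200.0)]
hdr = merge_hdr(frames, (1.0, 2.0, 4.0))
```

In the hardware camera described here the same per-pixel arithmetic runs in the FPGA on streaming rows, which is what makes 60 fps at 1.3 megapixels feasible from three exposures.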

Lapray, Pierre-Jean; Heyrman, Barthlmy; Ginhac, Dominique

2014-10-01

228

A Large-panel Two-CCD Camera Coordinate System with an Alternate-Eight-Matrix Look-Up-Table Method  

NASA Astrophysics Data System (ADS)

This study proposed a novel positioning model, composed of a two-camera calibration system and an Alternate-Eight-Matrix (AEM) Look-Up-Table (LUT). Two video cameras were fixed on two sides of a large-size screen to solve the problem of field of view. The first to fourth LUTs were used to compute the corresponding positions of specified regions on the screen captured by the camera on the right side. In these four LUTs, the coordinate mapping data of the target were stored in two matrixes, while the gray level threshold values of different positions were stored in other matrixes. Similarly, the fifth to eighth LUTs were used to compute the corresponding positions of the specified regions on the screen captured by the camera on the left side. Experimental results showed that the proposed model can solve the problems of dead zones and non-uniform light fields, while achieving rapid and precise positioning results.

Lin, Chern-Sheng; Lu, An-Tsung; Hsu, Yuen-Chang; Tien, Chuen-Lin; Chen, Der-Chin; Chang, Nin-Chun

2012-03-01

229

Video and acoustic camera techniques for studying fish under ice: a review and comparison  

SciTech Connect

Researchers attempting to study the presence, abundance, size, and behavior of fish species in northern and arctic climates during winter face many challenges, including the presence of thick ice cover, snow cover, and, sometimes, extremely low temperatures. This paper describes and compares the use of video and acoustic cameras for determining fish presence and behavior in lakes, rivers, and streams with ice cover. Methods are provided for determining fish density and size, identifying species, and measuring swimming speed, and successful applications of previous surveys of fish under the ice are described. These include drilling ice holes, selecting batteries and generators, deploying pan and tilt cameras, and using paired colored lasers to determine fish size and habitat associations. We also discuss use of infrared and white light to enhance image-capturing capabilities, deployment of digital recording systems and time-lapse techniques, and the use of imaging software. Data are presented from initial surveys with video and acoustic cameras in the Sagavanirktok River Delta, Alaska, during late winter 2004. These surveys represent the first known successful application of a dual-frequency identification sonar (DIDSON) acoustic camera under the ice that achieved fish detection and sizing at camera ranges up to 16 m. Feasibility tests of video and acoustic cameras for determining fish size and density at various turbidity levels are also presented. Comparisons are made of the different techniques in terms of suitability for achieving various fisheries research objectives. This information is intended to assist researchers in choosing the equipment that best meets their study needs.

Mueller, Robert P.; Brown, Richard S.; Hop, Haakon H.; Moulton, Larry

2006-09-05

230

Large area x-ray sensitive video camera: overall feasibility  

NASA Astrophysics Data System (ADS)

A large area x-ray sensitive vidicon is an alternative to the x-ray image intensifier and television camera combination. The proposed x-ray vidicon utilizes an amorphous selenium photoconductive layer which has a higher intrinsic resolution in comparison to the input phosphor of an XRII. This higher resolution could benefit diagnostic cardiac angiography as well as interventional cardiac procedures which now frequently utilize XRII/TV zoom modes to achieve higher resolution. Signal, noise, resolution and lag of an x-ray vidicon have been analyzed theoretically and indicate a medically practical device is possible. The use of a large potential to bias the a-Se photoconductor presents a problem with respect to instability of the a-Se surface potential and excessive dark current. The incorporation of a suppressor mesh into the vidicon has been shown to provide stable vidicon operation, while experiments involving a-Se blocking contacts have led to the development of an a-Se layer with low dark current.

Luhta, Randy P.; Rowlands, John A.

1997-05-01

231

A New Remote Sensing Filter Radiometer Employing a Fabry-Perot Etalon and a CCD Camera for Column Measurements of Methane in the Earth Atmosphere  

NASA Technical Reports Server (NTRS)

A portable remote sensing system for precision column measurements of methane has been developed, built and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions, which incorporates an uncooled InGaAs detector array sensitive up to 1.7 micrometers. Custom software was developed in addition to the basic graphical user interface X-Control provided by the company to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes of the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.

Georgieva, E. M.; Huang, W.; Heaps, W. S.

2012-01-01

232

A Novel Method to Reduce Time Investment When Processing Videos from Camera Trap Studies  

PubMed Central

Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species, but were instead empty or contained other species (together, non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch the recordings, in order to reduce workload. Discrimination between recordings of target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we supposed that recordings with the target species contain on average much more movement than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values and that environmental conditions and filter methods influence the number of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step in the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs. PMID:24918777
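The frame-to-frame pixel-variation criterion described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names, the grayscale frame representation, and the threshold value are all assumptions.

```python
# Hypothetical sketch of a pixel-variation filter for camera-trap recordings:
# keep a recording only if the mean frame-to-frame pixel change exceeds a
# threshold. Frames are modeled as flat lists of grayscale values.

def frame_variation(frames):
    """Mean absolute pixel change between consecutive frames."""
    diffs = []
    for prev, cur in zip(frames, frames[1:]):
        total = sum(abs(a - b) for a, b in zip(prev, cur))
        diffs.append(total / len(prev))
    return sum(diffs) / len(diffs)

def keep_recording(frames, threshold=5.0):
    """Flag recordings whose motion score suggests the target species."""
    return frame_variation(frames) >= threshold

# A large animal moving through the scene changes many pixels per frame,
# while an empty recording changes almost none.
busy = [[0] * 8, [50] * 8, [0] * 8]   # large changes: likely target species
quiet = [[0] * 8, [1] * 8, [0] * 8]   # tiny changes: likely non-target
```

Tuning the threshold trades lost target recordings against discarded non-target ones, mirroring the 5-20% loss versus 53-76% savings reported above.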

Swinnen, Kristijn R. R.; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

2014-01-01

233

A digital underwater video camera system for aquatic research in regulated rivers  

USGS Publications Warehouse

We designed a digital underwater video camera system to monitor nesting centrarchid behavior in the Tallapoosa River, Alabama, 20 km below a peaking hydropower dam with a highly variable flow regime. Major components of the system included a digital video recorder, multiple underwater cameras, and specially fabricated substrate stakes. The innovative design of the substrate stakes allowed us to effectively observe nesting redbreast sunfish Lepomis auritus in a highly regulated river. Substrate stakes, which were constructed for the specific substratum complex (i.e., sand, gravel, and cobble) identified at our study site, were able to withstand a discharge level of approximately 300 m3/s and allowed us to simultaneously record 10 active nests before and during water releases from the dam. We believe our technique will be valuable for other researchers that work in regulated rivers to quantify behavior of aquatic fauna in response to a discharge disturbance.

Martin, Benjamin M.; Irwin, Elise R.

2010-01-01

234

Human Daily Activities Indexing in Videos from Wearable Cameras for Monitoring of Patients with Dementia Diseases  

E-print Network

Our research focuses on analysing human activities according to a known behaviorist scenario, in the case of noisy and high-dimensional collected data. The data come from the monitoring of patients with dementia diseases by wearable cameras. We define a structural model of video recordings based on a Hidden Markov Model. New spatio-temporal features, color features and localization features are proposed as observations. First results in recognition of activities are promising.

Karaman, Svebor; Mégret, Rémi; Dovgalecs, Vladislavs; Dartigues, Jean-François; Gastel, Yann

2010-01-01

235

Crack propagation imaging by the ISIS camera and a video trigger system  

Microsoft Academic Search

An ultra-high speed camera of 1 Mfps was applied to visualize crack propagation. The change of the stress field around the propagating crack tip was captured as a change of the fringe pattern by means of the photo-elastic imaging technique. A newly developed video trigger system is employed to detect the occurrence of the crack propagation as a trigger in the experiment. The

Tomoo Okinaka; Pavel Karimov; Takeharu Etoh; Kenji Oguni

2007-01-01

236

Photon-number distributions of twin beams generated in spontaneous parametric down-conversion and measured by an intensified CCD camera  

E-print Network

The measurement of photon-number statistics of fields composed of photon pairs, generated in spontaneous parametric down-conversion and detected by an intensified CCD camera, is described. Finite quantum detection efficiencies, electronic noise, finite numbers of detector pixels, transverse intensity spatial profiles of the detected beams as well as losses of single photons from a pair are taken into account in a developed general theory of photon-number detection. The measured data provided by an iCCD camera with single-photon detection sensitivity are analyzed using the developed theory. Joint signal-idler photon-number distributions are recovered using the reconstruction method based on the principle of maximum likelihood. The range of applicability of the method is discussed. The reconstructed joint signal-idler photon-number distribution is compared with that obtained by a method that uses superposition of signal and noise and minimizes photoelectron entropy. Statistics of the reconstructed fields are identified to be multi-mode Gaussian. Elements of the measured as well as the reconstructed joint signal-idler photon-number distributions violate classical inequalities. Sub-shot-noise correlations in the difference of the signal and idler photon numbers as well as partial suppression of odd elements in the distribution of the sum of signal and idler photon numbers are observed.
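The sub-shot-noise criterion mentioned at the end of this abstract is commonly quantified by the noise-reduction factor R = Var(n_s - n_i) / (⟨n_s⟩ + ⟨n_i⟩), with R < 1 indicating nonclassical correlation. The sketch below shows that standard criterion on toy count data; it is not the authors' maximum-likelihood reconstruction, and all names are illustrative.

```python
# Standard sub-shot-noise test for twin beams (a sketch, not the paper's
# reconstruction method): R < 1 means the signal-idler photon-number
# difference fluctuates less than independent Poissonian beams would allow.
import statistics

def noise_reduction_factor(signal_counts, idler_counts):
    """R = Var(n_s - n_i) / (<n_s> + <n_i>); R = 1 for independent
    Poissonian light, R < 1 for sub-shot-noise correlated twin beams."""
    diffs = [s - i for s, i in zip(signal_counts, idler_counts)]
    shot_noise = statistics.fmean(signal_counts) + statistics.fmean(idler_counts)
    return statistics.pvariance(diffs) / shot_noise

# Perfectly correlated pairs give R = 0; scrambled counts give R >= 1.
correlated = noise_reduction_factor([3, 5, 4, 6], [3, 5, 4, 6])
scrambled = noise_reduction_factor([2, 6, 3, 5], [5, 3, 6, 2])
```

In a real iCCD measurement the raw R is degraded by the finite detection efficiencies and noise that the paper's theory corrects for.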

Jan Peřina Jr.; Ondřej Haderka; Martin Hamar; Václav Michálek

2012-02-07

237

Structural analysis of color video camera installation on tank 241AW101 (2 Volumes)  

SciTech Connect

A video camera is planned to be installed on the radioactive storage tank 241AW101 at the DOE's Hanford Site in Richland, Washington. The camera will occupy the 20 inch port of the Multiport Flange riser which is to be installed on riser 5B of the 241AW101 (3,5,10). The objective of the project reported herein was to perform a seismic analysis and evaluation of the structural components of the camera for a postulated Design Basis Earthquake (DBE) per the reference Structural Design Specification (SDS) document (6). The detail of supporting engineering calculations is documented in URS/Blume Calculation No. 66481-01-CA-03 (1).

Strehlow, J.P.

1994-08-24

238

Real-Time Color Correction Method for a Low-Cost Still/Video Camera  

NASA Astrophysics Data System (ADS)

This paper describes a color correction method for low-cost still/video camera images. Instead of using complex and non-linear equations, the concept of a three-dimensional reduced-resolution look-up table is used for the real-time color gamut expansion of low-cost cameras. The proposed method analyzes the color gamut of low-cost cameras and constructs 3-dimensional rule tables during the off-line stage. Real-time color correction is then conducted using that rule table. The experimental result shows that output images have more vivid and natural colors compared with the originals. The proposed method can be easily implemented with small software and/or hardware resources.
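The off-line/real-time split described here can be sketched with a coarse RGB look-up table. This is a minimal illustration of the idea, not the paper's rule-table construction: the 4-level grid, the nearest-node lookup, and the saturation-boost correction are all assumptions.

```python
# Sketch of a reduced-resolution 3-D LUT: precompute an expensive correction
# at a few grid nodes off-line, then answer per-pixel queries with a cheap
# table lookup at run time. Grid size and correction are illustrative.

N = 4  # coarse grid: 4 levels per channel instead of 256

def build_lut(correct):
    """Off-line stage: apply the correction function at each grid node."""
    step = 255 / (N - 1)
    return {(i, j, k): correct((i * step, j * step, k * step))
            for i in range(N) for j in range(N) for k in range(N)}

def apply_lut(lut, rgb):
    """Real-time stage: map a pixel to its nearest grid node's output."""
    step = 255 / (N - 1)
    key = tuple(min(N - 1, round(c / step)) for c in rgb)
    return lut[key]

# A simple saturation boost stands in for the paper's gamut-expansion rules.
lut = build_lut(lambda rgb: tuple(min(255.0, c * 1.2) for c in rgb))
```

A production table would interpolate between nodes rather than snapping to the nearest one, but the cost structure (one table read per pixel) is the same.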

Han, Dongil; Lee, Hak-Sung; Im, Chan; Yoo, Seong Joon

239

Evaluation of lens distortion errors using an underwater camera system for video-based motion analysis  

NASA Technical Reports Server (NTRS)

Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. This data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero-gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
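The error metric defined in this abstract (distance from known to digitized grid coordinates) can be sketched as follows. Function and variable names, and the percentage normalization by field size, are illustrative assumptions.

```python
# Illustrative sketch of the lens-distortion error metric: for each grid
# point, the distance between its known position and its digitized position,
# reported as a worst-case percentage of the field dimension.
import math

def distortion_error_percent(known, measured, field_size):
    """Worst-case point error as a percentage of the field size."""
    errors = [math.dist(k, m) for k, m in zip(known, measured)]
    return max(errors) / field_size * 100.0

# A point near the lens edge displaced by 8 units in a 100-unit field
# reproduces the ~8 percent worst case quoted in the abstract.
grid_known = [(0.0, 0.0), (100.0, 0.0)]
grid_measured = [(0.0, 0.0), (92.0, 0.0)]
```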

Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.

1994-01-01

240

Flat Field Anomalies in an X-Ray CCD Camera Measured Using a Manson X-Ray Source  

SciTech Connect

The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. It determines how accurately NIF can point the laser beams and is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large format array (2k x 2k), 24 µm square pixels, and is 15 µm thick. A multi-anode Manson X-ray source, operating up to 10 kV and 2 mA, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE ≈ 12. The X-ray beam intensity was measured using an absolute photodiode that has accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat, within 1.5% maximum to minimum. The spectral efficiency was measured at 10 energy bands ranging from 930 eV to 8470 eV. The efficiency pattern follows the properties of Si. The maximum quantum efficiency is 0.71. We observed an energy dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation was >8% at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was less than the measurement uncertainty below 4 keV. We were also able to observe debris on the CCD chip. The debris showed maximum contrast at the lowest energy used, 930 eV, and disappeared by 4 keV. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible CCD imager.

Michael Haugh

2008-03-01

241

Miniaturization of a hepatitis C virus RNA polymerase assay using a -102 degrees C cooled CCD camera-based imaging system.  

PubMed

Innovations in detection technologies have allowed us to develop a novel assay in 1536-well plate format and assess the advantages of screen miniaturization compared with conventional high-throughput compound screening in 96- or 384-well plates. An HCV RNA polymerase assay has been miniaturized in 1536-well plates by using a new detection technology known as the LEADseeker homogeneous imaging system. It uses a -102 degrees C cooled charge-coupled device (CCD) camera and newly designed scintillation proximity microparticles. The miniaturized assay used europium-doped streptavidin-coated yttrium oxide (YO(x)) or polystyrene (PS) microspheres to capture biotin-labeled [(3)H]RNA product transcripts. Beads in proximity to the radioisotope convert the emitted beta(-) particles into photons having wavelengths in the red region of the visible spectrum, optimal for detection by the CCD camera. Because the camera collects light from all wells of the plate simultaneously, 1536-well plates are imaged as rapidly as 384-well plates, on the order of 10 min per plate. The assay has a signal-to-background ratio of approximately 20-fold, satisfactory for high-throughput robotics screening. The enzyme kinetics and potency of a known inhibitor were similar to those obtained from the conventional assay using scintillation proximity assay (SPA) beads and a scintillation plate counter. Furthermore, the newly developed microbeads (emitting at 610 to 620 nm) are less prone to quenching effects caused by yellow-colored compounds than conventional SPA beads or scintillation fluid (emitting in the 400 to 480 nm region). Thus, the LEADseeker imaging system is a useful new tool for miniaturization of assays for high-throughput screening. PMID:11237322

Zheng, W; Carroll, S S; Inglese, J; Graves, R; Howells, L; Strulovici, B

2001-03-01

242

Developments of engineering model of the X-ray CCD camera of the MAXI experiment onboard the International Space Station  

Microsoft Academic Search

MAXI, Monitor of All-sky X-ray Image, is an X-ray observatory on the Japanese Experimental Module (JEM) Exposed Facility (EF) on the International Space Station (ISS). MAXI is a slit scanning camera which consists of two kinds of X-ray detectors: one is a one-dimensional position-sensitive proportional counter with a total area of ~5000 cm2, the Gas Slit Camera (GSC), and the other

Emi Miyata; Chikara Natsukari; Tomoyuki Kamazuka; Daisuke Akutsu; Hirohiko Kouno; Hiroshi Tsunemi; Masaru Matsuoka; Hiroshi Tomida; Shiro Ueno; Kenji Hamaguchi; Isao Tanaka

2002-01-01

243

First results from newly developed automatic video system MAIA and comparison with older analogue cameras  

NASA Astrophysics Data System (ADS)

A new automatic video system for meteor observations, MAIA, was developed in recent years [1]. The goal is to replace the older analogue cameras and provide a platform for continuous year-round observations from two different stations. Here we present the first results obtained during the testing phase as well as the first double-station observations. A comparison with the older analogue cameras is provided too. MAIA (Meteor Automatic Imager and Analyzer) is based on the digital monochrome camera JAI CM-040 and the well-proven image intensifier XX1332 (Figure 1). The camera provides a spatial resolution of 776 x 582 pixels. The maximum frame rate is 61.15 frames per second. A fast Pentax SMC FA 1.4/50 mm lens is used as the input element of the optical system. The resulting field of view is about 50° in diameter. For the first time the new system was used in a semiautomatic regime for the observation of the Draconid outburst on 8th October, 2011. Both cameras recorded more than 160 meteors. Additional hardware and software were developed in 2012 to enable automatic observation and basic processing of the data. The system usually records video sequences for the whole night. During the daytime it searches the records for moving objects, saves them as short sequences, and clears the hard drives to allow additional observations. Initial laboratory measurements [2] and simultaneous observations with the older system show significant improvement of the obtained data. Table 1 shows a comparison of the basic parameters of both systems. In this paper we will present a comparison of the double-station data obtained using both systems.

Koten, P.; Páta, P.; Fliegel, K.; Vítek, S.

2013-09-01

244

CCD Photometry of Gliese 372  

NASA Astrophysics Data System (ADS)

We present optical photometry of the M-dwarf binary Gliese 372. We have searched for the predicted eclipses of this binary (Harlow 1996) using a 16-inch telescope and a CCD camera built from "The CCD Camera Cookbook." We set limits on the presence of eclipses, and on the photometric variability outside of eclipse.

Ramseyer, T. F.; Davis, C.; Lasley, C.; Leonard, C.; Portoni, A.

1997-05-01

245

Real Time Speed Estimation of Moving Vehicles from Side View Images from an Uncalibrated Video Camera  

PubMed Central

In order to estimate the speed of a moving vehicle with side view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found using frame images. This procedure involves two main steps. In the first step, a sufficient number of points from the vehicle is selected, and these points must be accurately tracked on at least two successive video frames. In the second step, by using the displacement vectors of the tracked points and the elapsed time, the velocity vectors of those points are computed. Computed velocity vectors are defined in the video image coordinate system and displacement vectors are measured in pixel units. The magnitudes of the computed vectors in image space must then be transformed to object space to find their absolute values. This transformation requires image-to-object-space information, which is achieved by means of the calibration and orientation parameters of the video frame images. This paper presents proposed solutions for the problems of using side view camera images mentioned here. PMID:22399909
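The two-step procedure in this abstract (track points across frames, then scale pixel displacements into object space) can be sketched as follows. A single metres-per-pixel factor stands in for the full calibration and orientation transform the paper derives; all names are illustrative assumptions.

```python
# Sketch of side-view speed estimation: per-point pixel displacements between
# two frames, scaled to object space and divided by the elapsed time. The
# constant metres-per-pixel scale is a simplifying assumption.

def point_speeds(pts_frame1, pts_frame2, dt, metres_per_pixel):
    """Speed (m/s) of each tracked point from its displacement in pixels."""
    speeds = []
    for (x1, y1), (x2, y2) in zip(pts_frame1, pts_frame2):
        pixels = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        speeds.append(pixels * metres_per_pixel / dt)
    return speeds

def vehicle_speed(speeds):
    """Average the per-point speeds into a single vehicle estimate."""
    return sum(speeds) / len(speeds)
```

For example, two points displaced 20 pixels between frames 0.04 s apart, at 0.02 m per pixel, give 10 m/s. An uncalibrated camera makes the scale factor vary across the image, which is exactly the problem the calibration and orientation parameters address.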

Doğan, Sedat; Temiz, Mahir Serhan; Külür, Sıtkı

2010-01-01

246

Real time speed estimation of moving vehicles from side view images from an uncalibrated video camera.  

PubMed

In order to estimate the speed of a moving vehicle with side view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found using frame images. This procedure involves two main steps. In the first step, a sufficient number of points from the vehicle is selected, and these points must be accurately tracked on at least two successive video frames. In the second step, by using the displacement vectors of the tracked points and the elapsed time, the velocity vectors of those points are computed. Computed velocity vectors are defined in the video image coordinate system and displacement vectors are measured in pixel units. The magnitudes of the computed vectors in image space must then be transformed to object space to find their absolute values. This transformation requires image-to-object-space information, which is achieved by means of the calibration and orientation parameters of the video frame images. This paper presents proposed solutions for the problems of using side view camera images mentioned here. PMID:22399909

Doğan, Sedat; Temiz, Mahir Serhan; Külür, Sıtkı

2010-01-01

247

Modelling the spectral response of the Swift-XRT CCD camera: experience learnt from in-flight calibration  

NASA Astrophysics Data System (ADS)

Context: Since its launch in November 2004, Swift has revolutionised our understanding of gamma-ray bursts. The X-ray telescope (XRT), one of the three instruments on board Swift, has played a key role in providing essential positions, timing, and spectroscopy of more than 300 GRB afterglows to date. Although Swift was designed to observe GRB afterglows with power-law spectra, Swift is spending an increasing fraction of its time observing more traditional X-ray sources, which have more complex spectra. Aims: The aim of this paper is a detailed description of the CCD response model used to compute the XRT RMFs (redistribution matrix files), the changes implemented to it based on measurements of celestial and on-board calibration sources, and current caveats in the RMFs for the spectral analysis of XRT data. Methods: The RMFs are computed via Monte-Carlo simulations based on a physical model describing the interaction of photons within the silicon bulk of the CCD detector. Results: We show that the XRT spectral response calibration was complicated by various energy offsets in photon counting (PC) and windowed timing (WT) modes related to the way the CCD is operated in orbit (variation in temperature during observations, contamination by optical light from the sunlit Earth and increase in charge transfer inefficiency). We describe how these effects can be corrected for in the ground processing software. We show that the low-energy response, the redistribution in spectra of absorbed sources, and the modelling of the line profile have been significantly improved since launch by introducing empirical corrections in our code when it was not possible to use a physical description. We note that the increase in CTI became noticeable in June 2006 (i.e. 14 months after launch), but the evidence of a more serious degradation in spectroscopic performance (line broadening and change in the low-energy response) due to large charge traps (i.e. 
faults in the Si crystal) became more significant after March 2007. We describe efforts to handle such changes in the spectral response. Finally, we show that the commanded increase in the substrate voltage from 0 to 6 V on 2007 August 30 reduced the dark current, enabling the collection of useful science data at higher CCD temperature (up to -50 °C). We also briefly describe the plan to recalibrate the XRT response files at this new voltage. Conclusions: We show that the XRT spectral response is described well by the public response files for line and continuum spectra in the 0.3-10 keV band in both PC and WT modes.

Godet, O.; Beardmore, A. P.; Abbey, A. F.; Osborne, J. P.; Cusumano, G.; Pagani, C.; Capalbi, M.; Perri, M.; Page, K. L.; Burrows, D. N.; Campana, S.; Hill, J. E.; Kennea, J. A.; Moretti, A.

2009-02-01

248

Modelling the spectral response of the Swift-XRT CCD camera: Experience learnt from in-flight calibration  

E-print Network

(Abbreviated) We show that the XRT spectral response calibration was complicated by various energy offsets in photon counting (PC) and windowed timing (WT) modes related to the way the CCD is operated in orbit (variation in temperature during observations, contamination by optical light from the sunlit Earth and increase in charge transfer inefficiency). We describe how these effects can be corrected for in the ground processing software. We show that the low-energy response, the redistribution in spectra of absorbed sources, and the modelling of the line profile have been significantly improved since launch by introducing empirical corrections in our code when it was not possible to use a physical description. We note that the increase in CTI became noticeable in June 2006 (i.e. 14 months after launch), but the evidence of a more serious degradation in spectroscopic performance (line broadening and change in the low-energy response) due to large charge traps (i.e. faults in the Si crystal) became more significant after March 2007. We describe efforts to handle such changes in the spectral response. Finally, we show that the commanded increase in the substrate voltage from 0 to 6 V on 2007 August 30 reduced the dark current, enabling the collection of useful science data at higher CCD temperature (up to -50 °C). We also briefly describe the plan to recalibrate the XRT response files at this new voltage.

O. Godet; A. P. Beardmore; A. F. Abbey; J. P. Osborne; G. Cusumano; C. Pagani; M. Capalbi; M. Perri; K. L. Page; D. N. Burrows; S. Campana; J. E. Hill; J. A. Kennea; A. Moretti

2008-11-26

249

Acute gastroenteritis and video camera surveillance: a cruise ship case report.  

PubMed

A 'faecal accident' was discovered in front of a passenger cabin of a cruise ship. After proper cleaning of the area the passenger was approached, but denied having any gastrointestinal symptoms. However, when confronted with surveillance camera evidence, she admitted having the accident and even bringing the towel stained with diarrhoea back to the pool towels bin. She was isolated until the next port where she was disembarked. Acute gastroenteritis (AGE) caused by Norovirus is very contagious and easily transmitted from person to person on cruise ships. The main purpose of isolation is to avoid public vomiting and faecal accidents. To quickly identify and isolate contagious passengers and crew and ensure their compliance are key elements in outbreak prevention and control, but this is difficult if ill persons deny symptoms. All passenger ships visiting US ports now have surveillance video cameras, which under certain circumstances can assist in finding potential index cases for AGE outbreaks. PMID:24677123

Diskin, Arthur L; Caro, Gina M; Dahl, Eilif

2014-01-01

250

A Simple Method Based on the Application of a CCD Camera as a Sensor to Detect Low Concentrations of Barium Sulfate in Suspension  

PubMed Central

The development of a simple, rapid and low-cost method based on video image analysis and aimed at the detection of low concentrations of precipitated barium sulfate is described. The proposed system is basically composed of a webcam with a CCD sensor and a conventional dichroic lamp. For this purpose, software for processing and analyzing the digital images based on the RGB (Red, Green and Blue) color system was developed. The proposed method has shown very good repeatability and linearity and also presented higher sensitivity than the standard turbidimetric method. The developed method is presented as a simple alternative for future applications in the study of precipitation of inorganic salts and also for detecting the crystallization of organic compounds. PMID:22346607
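The RGB-based measurement described here reduces, in essence, to averaging channel intensities over a region of interest and inverting a calibration curve. The sketch below assumes a linear calibration with made-up coefficients; the paper's actual calibration data and processing pipeline are not reproduced.

```python
# Sketch of RGB video-image turbidity sensing: suspended BaSO4 scatters lamp
# light toward the CCD, raising the mean RGB intensity of the frame. The
# linear calibration (slope, intercept) is an illustrative assumption.

def mean_rgb_intensity(pixels):
    """Average of the R, G and B channels over a region of interest."""
    total = sum(r + g + b for r, g, b in pixels)
    return total / (3 * len(pixels))

def concentration(intensity, slope, intercept):
    """Invert a linear calibration: intensity = slope * conc + intercept."""
    return (intensity - intercept) / slope
```

In practice one would calibrate slope and intercept against standards of known concentration, which is how linearity and sensitivity claims like those above are established.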

de Sena, Rodrigo Caciano; Soares, Matheus; Pereira, Maria Luiza Oliveira; da Silva, Rogério Cruz Domingues; do Rosário, Francisca Ferreira; da Silva, João Francisco Cajaiba

2011-01-01

251

A semantic autonomous video surveillance system for dense camera networks in Smart Cities.  

PubMed

This paper presents a proposal of an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed on the system, and therefore making it suitable for its usage as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, and is performed using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language easy to understand for human operators, capable of raising enriched alarms with descriptions of what is happening on the image, and to automate reactions to them such as alerting the appropriate emergency services using the Smart City safety network. PMID:23112607

Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio

2012-01-01

252

A Semantic Autonomous Video Surveillance System for Dense Camera Networks in Smart Cities  

PubMed Central

This paper presents a proposal of an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed on the system, and therefore making it suitable for its usage as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, and is performed using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language easy to understand for human operators, capable of raising enriched alarms with descriptions of what is happening on the image, and to automate reactions to them such as alerting the appropriate emergency services using the Smart City safety network. PMID:23112607

Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio

2012-01-01

253

System design description for the LDUA high resolution stereoscopic video camera system (HRSVS)  

SciTech Connect

The High Resolution Stereoscopic Video Camera System (HRSVS), system 6230, was designed to be used as an end effector on the LDUA to perform surveillance and inspection activities within a waste tank. It is attached to the LDUA by means of a Tool Interface Plate (TIP) which provides a feed through for all electrical and pneumatic utilities needed by the end effector to operate. Designed to perform up-close weld and corrosion inspection roles in UST operations, the HRSVS will support and supplement the Light Duty Utility Arm (LDUA) and provide the crucial inspection tasks needed to ascertain waste tank condition.

Pardini, A.F.

1998-01-27

254

MOEMS-based time-of-flight camera for 3D video capturing  

NASA Astrophysics Data System (ADS)

We suggest a Time-of-Flight (TOF) video camera capturing real-time depth images (a.k.a. depth maps), which are generated from fast-modulated IR images utilizing a novel MOEMS modulator having a switching speed of 20 MHz. In general, 3 or 4 independent IR (e.g. 850 nm) images are required to generate a single frame of the depth image. Captured video images of a moving object frequently show motion drag between sequentially captured IR images, which results in the so-called `motion blur' problem even when the frame rate of the depth image is fast (e.g. 30 to 60 Hz). We propose a novel `single shot' TOF 3D camera architecture generating a single depth image out of synchronized captured IR images. The imaging system consists of a 2x2 imaging lens array, MOEMS optical shutters (modulators) placed on each lens aperture, and a standard CMOS image sensor. The IR light reflected from the object is modulated by the optical shutters on the apertures of the 2x2 lens array and the transmitted images are captured on the image sensor, resulting in 2x2 sub-IR images. As a result, the depth image is generated from those four simultaneously captured independent sub-IR images, hence the motion blur problem is canceled. The resulting performance is very useful in applications of the 3D camera to human-machine interaction devices such as user interfaces of TVs, monitors, or hand-held devices, and motion capture of the human body. In addition, we show that the presented 3D camera can be modified to capture color together with depth simultaneously at the `single shot' frame rate.
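The abstract's "3 or 4 independent IR images per depth frame" refers to phase-stepped samples of the modulated illumination. A common four-phase recovery is sketched below as background; this generic formulation is an assumption and not necessarily the exact reconstruction used by the authors' MOEMS system.

```python
# Generic four-phase TOF depth recovery (a textbook sketch, not the paper's
# algorithm): samples of the modulated IR return at 0, 90, 180 and 270 degree
# phase steps yield the round-trip phase, hence the distance.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(a0, a90, a180, a270, f_mod=20e6):
    """Depth (m) from four phase-stepped samples at modulation rate f_mod.
    Unambiguous range is C / (2 * f_mod), about 7.5 m at 20 MHz."""
    phase = math.atan2(a270 - a90, a0 - a180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)
```

The paper's contribution is capturing all four samples simultaneously through the 2x2 lens array, so the formula sees one instant in time and motion blur between samples is avoided.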

You, Jang-Woo; Park, Yong-Hwa; Cho, Yong-Chul; Park, Chang-Young; Yoon, Heesun; Lee, Sang-Hun; Lee, Seung-Wan

2013-03-01

255

A stroboscopic technique for using CCD cameras in flow visualization systems for continuous viewing and stop action photography  

NASA Technical Reports Server (NTRS)

A technique for synchronizing a pulse light source to charge coupled device cameras is presented. The technique permits the use of pulse light sources for continuous as well as stop action flow visualization. The technique has eliminated the need to provide separate lighting systems at facilities requiring continuous and stop action viewing or photography.

Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.

1992-01-01

256

Aug 7, 2008 Researchers in the US unveil a silicon-based CCD camera that mimics the  

E-print Network

Researchers in the US have unveiled an electronic eye camera that mimics the shape of the human eye, in which not only the lenses but also the geometrical layouts of the detector arrays are curved. The team would like to explore integrating such curved imaging systems into biomedical devices that can be implanted into the human body.

Rogers, John A.

257

Calibration grooming and alignment for LDUA High Resolution Stereoscopic Video Camera System (HRSVS)  

SciTech Connect

The High Resolution Stereoscopic Video Camera System (HRSVS) was designed by the Savannah River Technology Center (SRTC) to provide routine and troubleshooting views of tank interiors during characterization and remediation phases of underground storage tank (UST) processing. The HRSVS is a dual color camera system designed to provide stereo viewing of the interior of the tanks including the tank wall in a Class 1, Division 1, flammable atmosphere. The HRSVS was designed with a modular philosophy for easy maintenance and configuration modifications. During operation of the system with the LDUA, the control of the camera system will be performed by the LDUA supervisory data acquisition system (SDAS). Video and control status will be displayed on monitors within the LDUA control center. All control functions are accessible from the front panel of the control box located within the Operations Control Trailer (OCT). The LDUA will provide all positioning functions within the waste tank for the end effector. Various electronic measurement instruments will be used to perform CG and A activities. The instruments may include a digital volt meter, oscilloscope, signal generator, and other electronic repair equipment. None of these instruments will need to be calibrated beyond what comes from the manufacturer. During CG and A a temperature indicating device will be used to measure the temperature of the outside of the HRSVS from initial startup until the temperature has stabilized. This device will not need to be in calibration during CG and A but will have to have a current calibration sticker from the Standards Laboratory during any acceptance testing.

Pardini, A.F.

1998-01-27

258

Visual fatigue modeling for stereoscopic video shot based on camera motion  

NASA Astrophysics Data System (ADS)

As three-dimensional television (3-DTV) and 3-D movies become popular, visual discomfort limits further applications of 3D display technology. The causes of visual discomfort from stereoscopic video include conflicts between accommodation and convergence, excessive binocular parallax, fast motion of objects, and so on. Here, a novel method for evaluating visual fatigue is demonstrated. Influence factors including spatial structure, motion scale and comfortable zone are analyzed. According to the human visual system (HVS), people only need to converge their eyes to specific objects when the camera and background are static. Relative motion should be considered for other camera conditions, with different conditions determining different factor coefficients and weights. Improving on the traditional visual fatigue prediction model, a novel visual fatigue prediction model is presented. The visual fatigue degree is predicted using a multiple linear regression method combined with subjective evaluation. Consequently, each factor reflects the characteristics of the scene, and the total visual fatigue score can be computed according to the proposed algorithm. Compared with conventional algorithms, which ignore the status of the camera, our approach exhibits reliable performance in terms of correlation with subjective test results.
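The prediction step this abstract describes, multiple linear regression from per-shot factor scores to subjective fatigue ratings, can be sketched as below. The three factors, the weights, and the ratings are synthetic stand-ins invented for illustration, not the paper's data:

```python
import numpy as np

# Regress per-shot factor scores onto subjective fatigue ratings.
# Factors (e.g. spatial structure, motion scale, comfort-zone violation)
# and ratings here are simulated; true_w is only used to generate data.
rng = np.random.default_rng(2)
factors = rng.uniform(0.0, 1.0, (50, 3))     # one row of factor scores per shot
true_w = np.array([0.5, 1.2, 2.0])           # hidden weights for the simulation
ratings = factors @ true_w + 0.3 + rng.normal(0.0, 0.05, 50)

X = np.column_stack([factors, np.ones(50)])  # append an intercept column
weights, *_ = np.linalg.lstsq(X, ratings, rcond=None)
fatigue_scores = X @ weights                 # predicted fatigue per shot
```

With enough rated shots, the recovered weights approach the ones used to generate the ratings, and the fitted model scores new shots from their factor values alone.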

Shi, Guozhong; Sang, Xinzhu; Yu, Xunbo; Liu, Yangdong; Liu, Jing

2014-11-01

259

Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras  

USGS Publications Warehouse

GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatically alerting major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
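The spike criterion in this abstract (radiance elevated more than two standard deviations above the mean of the time series) can be sketched in a few lines; the synthetic series, burst indices, and magnitudes below are invented for illustration, not GOES data:

```python
import numpy as np

def detect_spikes(radiance, n_sigma=2.0):
    """Indices of samples elevated more than n_sigma standard deviations
    above the mean of the time series."""
    return np.flatnonzero(radiance > radiance.mean() + n_sigma * radiance.std())

# Synthetic 15-min-cadence radiance series: quiet background plus two bursts.
rng = np.random.default_rng(0)
series = rng.normal(10.0, 1.0, 500)
series[120] += 8.0    # e.g. a short lava-fountaining burst
series[300] += 10.0   # e.g. a lava-lake overflow
print(detect_spikes(series))  # includes indices 120 and 300
```

In practice such automatic flags would then be confirmed against video and ground-observer reports, as the abstract describes.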

Harris, A.J.L.; Thornber, C.R.

1999-01-01

260

Dual charge-coupled device /CCD/, astronomical spectrometer and direct imaging camera. II - Data handling and control systems  

NASA Astrophysics Data System (ADS)

The data collection system for the MASCOT (MIT Astronomical Spectrometer/Camera for Optical Telescopes) is described. The system relies on an RCA 1802 microprocessor-based controller, which serves to collect and format data, to present data to a scan converter, and to operate a device communication bus. A NOVA minicomputer is used to record and recall frame images and to perform refined image processing. The RCA 1802 also provides instrument mode control for the MASCOT. Commands are issued using STOIC, a FORTH-like language. Sufficient flexibility has been provided so that a variety of CCDs can be accommodated.

Dewey, D.; Ricker, G. R.

261

Identifying predators and fates of grassland passerine nests using miniature video cameras  

USGS Publications Warehouse

Nest fates, causes of nest failure, and identities of nest predators are difficult to determine for grassland passerines. We developed a miniature video-camera system for use in grasslands and deployed it at 69 nests of 10 passerine species in North Dakota during 1996-97. Abandonment rates were higher at camera-monitored nests. Cameras recorded for more than 1 day or night (22-116 hr) at 6 nests, 5 of which were depredated by ground squirrels or mice. For nests without cameras, estimated predation rates were lower for ground nests than aboveground nests (P = 0.055), but did not differ between open and covered nests (P = 0.74). Open and covered nests differed, however, when predation risk (estimated by initial-predation rate) was examined separately for day and night using camera-monitored nests; the frequency of initial predations that occurred during the day was higher for open nests than covered nests (P = 0.015). Thus, vulnerability of some nest types may depend on the relative importance of nocturnal and diurnal predators. Predation risk increased with nestling age from 0 to 8 days (P = 0.07). Up to 15% of fates assigned to camera-monitored nests were wrong when based solely on evidence that would have been available from periodic nest visits. There was no evidence of disturbance at nearly half the depredated nests, including all 5 depredated by large mammals. Overlap in types of sign left by different predator species, and variability of sign within species, suggests that evidence at nests is unreliable for identifying predators of grassland passerines.

Pietz, P.J.; Granfors, D.A.

2000-01-01

262

Crack propagation imaging by the ISIS camera and a video trigger system  

NASA Astrophysics Data System (ADS)

An ultra-high-speed camera operating at 1 Mfps was applied to visualize crack propagation. The change of the stress field around the propagating crack tip was captured as a change of the fringe pattern by means of the photo-elastic imaging technique. A newly developed video trigger system was employed to detect the occurrence of crack propagation and trigger the experiment. The trigger reliably perceived the initiation of crack propagation, and its response time was fast enough even for image capture at 1 Mfps. As a result, it is revealed that elastic waves propagating in the continuous body have a significant effect on the velocity and kinking behavior of the propagating crack.

Okinaka, Tomoo; Karimov, Pavel; Etoh, Takeharu; Oguni, Kenji

2007-01-01

263

Embedded FIR filter design for real-time refocusing using a standard plenoptic video camera  

NASA Astrophysics Data System (ADS)

A novel and low-cost embedded hardware architecture for real-time refocusing based on a standard plenoptic camera is presented in this study. The proposed layout design synthesizes refocusing slices directly from micro images by omitting the process for the commonly used sub-aperture extraction. Therefore, intellectual property cores, containing switch controlled Finite Impulse Response (FIR) filters, are developed and applied to the Field Programmable Gate Array (FPGA) XC6SLX45 from Xilinx. Enabling the hardware design to work economically, the FIR filters are composed of stored product as well as upsampling and interpolation techniques in order to achieve an ideal relation between image resolution, delay time, power consumption and the demand of logic gates. The video output is transmitted via High-Definition Multimedia Interface (HDMI) with a resolution of 720p at a frame rate of 60 fps conforming to the HD ready standard. Examples of the synthesized refocusing slices are presented.

Hahne, Christopher; Aggoun, Amar

2014-03-01

264

CCD TV focal plane guider development and comparison to SIRTF applications  

NASA Technical Reports Server (NTRS)

It is expected that the SIRTF payload will use a CCD TV focal plane fine guidance sensor to provide acquisition of sources and tracking stability of the telescope. CCD TV cameras and guiders have been developed at Lick Observatory for several years, producing state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory to characterize present CCD autoguiding technology and relate it to SIRTF applications are presented. Two different designs of CCD cameras were constructed using virtual-phase and buried-channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.

Rank, David M.

1989-01-01

265

Hand contour detection in wearable camera video using an adaptive histogram region of interest  

PubMed Central

Background: Monitoring hand function at home is needed to better evaluate the effectiveness of rehabilitation interventions. Our objective is to develop wearable computer vision systems for hand function monitoring. The specific aim of this study is to develop an algorithm that can identify hand contours in video from a wearable camera that records the user's point of view, without the need for markers. Methods: The two-step image processing approach for each frame consists of: (1) detecting a hand in the image and choosing one seed point that lies within the hand; this step is based on a priori models of skin colour; (2) identifying the contour of the region containing the seed point. This is accomplished by adaptively determining, for each frame, the region within a colour histogram that corresponds to hand colours, and backprojecting the image using the reduced histogram. Results: In four test videos relevant to activities of daily living, the hand detector classification accuracy was 88.3%. The contour detection results were compared to manually traced contours in 97 test frames, and the median F-score was 0.86. Conclusion: This algorithm will form the basis for a wearable computer-vision system that can monitor and log the interactions of the hand with its environment. PMID:24354542
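Step (2) above, building a colour histogram around the seed point and backprojecting it, can be sketched with plain NumPy on a single-channel hue image. The patch size, bin count, and toy frame are assumptions for illustration; a production system would more likely run OpenCV's calcBackProject on real camera frames:

```python
import numpy as np

def backproject(image_hue, seed, patch=15, bins=32):
    """Simplified histogram backprojection: build a hue histogram from a
    patch around the seed point, then score every pixel by how common its
    hue is in that histogram. Thresholding the result yields a hand mask."""
    r, c = seed
    h = patch // 2
    region = image_hue[max(0, r - h):r + h + 1, max(0, c - h):c + h + 1]
    hist, _ = np.histogram(region, bins=bins, range=(0.0, 1.0), density=True)
    idx = np.clip((image_hue * bins).astype(int), 0, bins - 1)
    return hist[idx]  # per-pixel likelihood map

# Toy frame: a "hand" with hue ~0.05 on a background with hue ~0.6.
frame = np.full((40, 40), 0.6)
frame[10:30, 10:30] = 0.05
likelihood = backproject(frame, seed=(20, 20))
mask = likelihood > 0  # True exactly on the hand region in this toy example
```

The adaptive part in the paper corresponds to recomputing the reduced histogram for every frame, so the skin model tracks lighting changes.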

2013-01-01

266

Evolution of Ultra-High-Speed CCD Imagers  

NASA Astrophysics Data System (ADS)

This paper reviews the high-speed video cameras developed by the authors. A video camera operating at 4,500 frames per second (fps) was developed in 1991. The partial and parallel readout scheme, combined with fully digital memory with an overwriting function, enabled the world's fastest imaging at the time. The basic configuration of the camera later became a de facto standard for high-speed video cameras. A video camera mounting an innovative image sensor achieved 1,000,000 fps in 2001. In-situ storage with more than 100 CCD memory elements is installed in each pixel of the sensor, which is capable of recording image signals in all pixels in parallel. Therefore, the sensor was named ISIS, the in-situ storage image sensor. This ultimate parallel recording operation promises the theoretical maximum frame rate. A sequence of more than one hundred consecutive images reproduces a smoothly moving image at 10 fps for more than 10 seconds. Currently, an image sensor combining ultra-high sensitivity with the ultra-high frame rate, named PC-ISIS, the photon-counting ISIS, is being developed for microscopic biological observation. Other technologies developed to support ultra-high-speed imaging are also presented.

Etoh, T. Goji; Vo Le, Cuong; Hashishin, Yuichi; Otsuka, Nao; Takehara, Kohsei; Ohtake, Hiroshi; Hayashida, Tetsuya; Maruyama, Hirotaka

267

Comparing on site human and video counts at Igarapava fish ladder, south eastern Brazil  

Microsoft Academic Search

On-site human observations and video images were collected and compared at the window of the Igarapava Dam fish ladder (IDFL), rio Grande, southeastern Brazil, between March 1st and June 30th, 2004. We conducted four experiments with two humans (Observer 1 and Observer 2) observing fish passage in the IDFL window while a Sony 3CCD video camera (Observer 3)

Mark D. Bowen; Simone Marques; Luiz G. M. Silva; Volney Vono; Hugo P. Godinho

2006-01-01

268

Camera Animation  

NSDL National Science Digital Library

A general discussion of the use of cameras in computer animation. This section covers principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough, and includes HTML pages, images and one video.

269

Observation of cloud-to-ground lightning channels with high-speed video camera  

E-print Network

Between May and October 2013 (a period of sustained thunderstorm activity in France), several cloud-to-ground lightning flashes were observed in the Paris area with a high-speed video camera (14,000 frames per second). The localization and the polarity of the recorded cloud-to-ground flashes were obtained from the French lightning detection network Météorage, which is equipped with the same low-frequency sensors used by the US NLDN. In this paper we focus on 7 events (3 positive cloud-to-ground lightning flashes and 4 negative cloud-to-ground lightning flashes). The propagation velocity of the leaders and its temporal evolution have been estimated; the evolution of branching of the negative leaders has been observed during the propagation of the channel which gets connected to ground and initiates the first return stroke. One aim of this preliminary study is to emphasize the differences between the characteristics of the positive and negative leaders.

Buguet, M; Blanchet, P; Pdeboy, S; Barnoud, P; Laroche, P

2014-01-01

270

Nyquist sampling theorem: understanding the illusion of a spinning wheel captured with a video camera  

NASA Astrophysics Data System (ADS)

Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the sampling time is chosen judiciously, then it is possible to accurately determine the frequency of a signal varying periodically with time. This paper is of educational value as it presents the principles of sampling during data acquisition. The concept of the Nyquist sampling theorem is usually introduced very briefly in the literature, with few practical examples to convey its importance during data acquisition. Through a series of carefully chosen examples, we attempt to present data sampling from the elementary conceptual idea and try to lead the reader naturally to the Nyquist sampling theorem, so we may more clearly understand why a signal can be interpreted incorrectly during a data acquisition procedure in the case of undersampling.
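The spinning-wheel illusion in this abstract follows directly from the sampling theorem: frequencies above half the frame rate fold back into the Nyquist band. A minimal sketch (the 22 Hz wheel and 24 fps camera are invented numbers for illustration):

```python
def apparent_frequency(f_signal, f_sample):
    """Aliased frequency observed when a periodic signal of frequency
    f_signal is sampled at f_sample: the spectrum folds into the band
    [-f_sample/2, +f_sample/2]. A negative result means apparent backward
    rotation (the classic wagon-wheel effect)."""
    f = f_signal % f_sample      # fold into one sampling period
    if f > f_sample / 2:
        f -= f_sample            # reflect into the Nyquist band
    return f

# A wheel spinning at 22 Hz filmed at 24 fps appears to rotate
# backwards at 2 Hz; at exactly 24 Hz it appears frozen.
print(apparent_frequency(22.0, 24.0))  # -> -2.0
print(apparent_frequency(24.0, 24.0))  # -> 0.0
```

Only when the signal frequency stays below half the sampling rate is the measured frequency the true one, which is the point of the theorem.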

Lévesque, Luc

2014-11-01

271

Application of video-cameras for quality control and sampling optimisation of hydrological and erosion measurements in a catchment  

NASA Astrophysics Data System (ADS)

Long-term soil erosion studies imply substantial effort, particularly when there is a need to maintain continuous measurements. There are high costs associated with maintaining field equipment and with quality control of data collection. Energy supply and/or electronic failures, vandalism and burglary are common causes of gaps in datasets, reducing their reach in many cases. In this work, a system of three video-cameras, a recorder and a transmission modem (3G technology) has been set up in a gauging station where rainfall, runoff flow and sediment concentration are monitored. The gauging station is located at the outlet of an olive orchard catchment of 6.4 ha. Rainfall is measured with one automatic raingauge that records intensity at one-minute intervals. The discharge is measured by a flume of critical flow depth, where the water level is recorded by an ultrasonic sensor. When the water level rises to a predetermined level, the automatic sampler turns on and fills a bottle at different intervals according to a program depending on the antecedent precipitation. A data logger controls the instruments' functions and records the data. The purpose of the video-camera system is to improve the quality of the dataset by i) visual analysis of the measurement conditions of flow into the flume; ii) optimisation of the sampling programs. The cameras are positioned to record the flow at the approach and the gorge of the flume. In order to contrast the values of the ultrasonic sensor, a third camera records the flow level against a measuring tape. The system is activated when the ultrasonic sensor detects a height threshold, equivalent to an electric intensity level. Thus, the video-cameras record an event only when there is enough flow. This simplifies post-processing and reduces the cost of downloading recordings. A preliminary contrast analysis will be presented, as well as the main improvements in the sampling program.

Lora-Millán, Julio S.; Taguas, Encarnacion V.; Gomez, Jose A.; Perez, Rafael

2014-05-01

272

High-speed video capture by a single flutter shutter camera using three-dimensional hyperbolic wavelets  

NASA Astrophysics Data System (ADS)

Considering what is easily achievable with modern sensors, this paper further exploits the possibility of recovering high-speed video (HSV) with a single flutter shutter camera. Taking into account the different degrees of smoothness along the spatial and temporal dimensions of HSV, this paper proposes a three-dimensional hyperbolic wavelet basis based on the Kronecker product to jointly model the spatial and temporal redundancy of HSV. Besides, we incorporate the total variation of temporal correlations in HSV as prior knowledge to further enhance reconstruction quality. We recover the underlying HSV frames from the observed low-speed coded video by solving a convex minimization problem. Experimental results on simulated and real-world videos both demonstrate the validity of the proposed method.

Huang, Kuihua; Zhang, Jun; Hou, Jinxin

2014-09-01

273

Method for eliminating artifacts in CCD imagers  

DOEpatents

An electronic method for eliminating artifacts in a video camera (10) employing a charge coupled device (CCD) (12) as an image sensor. The method comprises the step of initializing the camera (10) prior to normal read out and includes a first dump cycle period (76) for transferring radiation generated charge into the horizontal register (28) while the decaying image on the phosphor (39) being imaged is being integrated in the photosites, and a second dump cycle period (78), occurring after the phosphor (39) image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers (32). Image charge is then transferred from the photosites (36) and (38) to the vertical registers (32) and read out in conventional fashion. The inventive method allows the video camera (10) to be used in environments having high ionizing radiation content, and to capture images of events of very short duration and occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers (28) and (32), and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites (36) and (37).

Turko, Bojan T. (Moraga, CA); Yates, George J. (Santa Fe, NM)

1992-01-01

274

Method for eliminating artifacts in CCD imagers  

DOEpatents

An electronic method for eliminating artifacts in a video camera employing a charge coupled device (CCD) as an image sensor is disclosed. The method comprises the step of initializing the camera prior to normal read out and includes a first dump cycle period for transferring radiation generated charge into the horizontal register while the decaying image on the phosphor being imaged is being integrated in the photosites, and a second dump cycle period, occurring after the phosphor image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers. Image charge is then transferred from the photosites to the vertical registers and read out in conventional fashion. The inventive method allows the video camera to be used in environments having high ionizing radiation content, and to capture images of events of very short duration and occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers, and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites. 3 figs.

Turko, B.T.; Yates, G.J.

1992-06-09

275

High-frame-rate infrared and visible cameras for test range instrumentation  

NASA Astrophysics Data System (ADS)

Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.

Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.

1995-09-01

276

Jellyfish Support High Energy Intake of Leatherback Sea Turtles (Dermochelys coriacea): Video Evidence from Animal-Borne Cameras  

E-print Network

The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long-distance migrations (1000s of km) from nesting beaches to high-latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08-3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83-100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ d-1 but were as high as 167,797 kJ d-1, corresponding to turtles consuming

Susan G. Heaslip; Sara J. Iverson; W. Don Bowen; Michael C. James

277

Determining Camera Gain in Room Temperature Cameras  

SciTech Connect

James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; however, the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation and a test of this method.
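One plausible form of the dark-current adjustment described here is to remove both the mean and the variance contributed by dark signal from the mean-variance (photon transfer) gain estimate. This is a sketch under that assumption, not the article's exact equation; the simulated gain, exposure level, and frame size are invented:

```python
import numpy as np

def camera_gain(flat1, flat2, dark1, dark2):
    """Mean-variance gain estimate in e-/DN with dark signal removed from
    both the mean and the variance. Differencing two like frames cancels
    fixed-pattern noise; the variance of a difference is twice the
    per-frame shot-noise variance."""
    f1, f2 = flat1.astype(float), flat2.astype(float)
    d1, d2 = dark1.astype(float), dark2.astype(float)
    mean_signal = 0.5 * (f1.mean() + f2.mean()) - 0.5 * (d1.mean() + d2.mean())
    var_signal = 0.5 * (np.var(f1 - f2) - np.var(d1 - d2))
    return mean_signal / var_signal

# Simulated warm sensor: true gain 2.0 e-/DN, 10000 e- of light on top of
# 500 e- of dark signal per frame.
rng = np.random.default_rng(1)
gain = 2.0
def frame(electrons):
    return rng.poisson(electrons, (256, 256)) / gain  # Poisson shot noise -> DN
f1, f2 = frame(10500.0), frame(10500.0)
d1, d2 = frame(500.0), frame(500.0)
print(camera_gain(f1, f2, d1, d2))  # close to 2.0
```

Without the variance term, the dark shot noise would bias the estimate low for a warm camera, which is the failure mode the abstract points at.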

Joshua Cogliati

2010-12-01

278

Improved design of an ISIS for a video camera of 1,000,000 pps  

NASA Astrophysics Data System (ADS)

The ISIS, In-situ Storage Image Sensor, may achieve a frame rate higher than 1,000,000 pps. Technical targets in development of the ISIS are listed. A layout of the ISIS is presented, which covers the major targets by employing slanted CCD storage and amplified CMOS readout. The layout has two different sets of orthogonal axis systems: one is mechanical and the other functional. Photodiodes, CCD registers and all the gates are designed parallel to the mechanical axis system. The squares on which pixels are placed form the functional axis system. The axis systems are inclined to each other. To reproduce a moving image, at least fifty consecutive images are necessary for ten-second replay at 5 pps. The inclined design inlays the straight CCD storage registers for more than fifty images in the photo-receptive area of the sensor. The amplified CMOS readout circuits built in all the pixels eliminate line defects in reproduced images, which are inherent to CCD image sensors. FPN (Fixed Pattern Noise) introduced by the individual amplification is easily suppressed by digital post image processing, which is commonly employed in scientific and engineering applications. The yield rate is significantly improved by the elimination of the line defects.

Etoh, Takeharu G.; Mutoh, Hideki; Takehara, Kohsei; Okinaka, Tomoo

1999-05-01

279

In-situ measurements of alloy oxidation/corrosion/erosion using a video camera and proximity sensor with microcomputer control  

NASA Technical Reports Server (NTRS)

Two noncontacting and nondestructive, remotely controlled methods of measuring the progress of oxidation/corrosion/erosion of metal alloys, exposed to flame test conditions, are described. The external diameter of a sample under test in a flame was measured by a video camera width measurement system. An eddy current proximity probe system, for measurements outside of the flame, was also developed and tested. The two techniques were applied to the measurement of the oxidation of 304 stainless steel at 910 C using a Mach 0.3 flame. The eddy current probe system yielded a recession rate of 0.41 mils diameter loss per hour and the video system gave 0.27.

Deadmore, D. L.

1984-01-01

280

241-AZ-101 Waste Tank Color Video Camera System Shop Acceptance Test Report  

SciTech Connect

This report includes shop acceptance test results. The test was performed prior to installation at tank AZ-101. Both the camera system and camera purge system were originally sought and procured as a part of initial waste retrieval project W-151.

WERRY, S.M.

2000-03-23

281

Camera-on-a-Chip  

NASA Technical Reports Server (NTRS)

Jet Propulsion Laboratory's research on a second-generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide-Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

1999-01-01

282

Unsupervised soccer video abstraction based on pitch, dominant color and camera motion analysis  

Microsoft Academic Search

We present a soccer video abstraction method based on the analysis of the audio and video streams. This method could be applied to other sports as rugby or american football. The main contribution of this paper is the design of an unsupervised summarization method, and more specifically, the introduction of an efficient detector of excited speech segments. An excited commentary

F. Coldefy; Patrick Bouthemy

2004-01-01

283

Lights, Camera, Action! Learning about Management with Student-Produced Video Assignments  

ERIC Educational Resources Information Center

In this article, we present a proposal for fostering learning in the management classroom through the use of student-produced video assignments. We describe the potential for video technology to create active learning environments focused on problem solving, authentic and direct experiences, and interaction and collaboration to promote student

Schultz, Patrick L.; Quinn, Andrew S.

2014-01-01

284

Improved design of an ISIS for a video camera of 1,000,000 pps  

Microsoft Academic Search

The ISIS, In-situ Storage Image Sensor, may achieve the frame rate higher than 1,000,000 pps. Technical targets in development of the ISIS are listed up. A layout of the ISIS is presented, which covers the major targets, by employing slanted CCD storage and amplified CMOS readout. The layout has two different sets of orthogonal axis systems: one is mechanical and

Takeharu Etoh; Hideki Mutoh; Kohsei Takehara; Tomoo Okinaka

1999-01-01

285

Technologies to develop a video camera with a frame rate higher than 100 Mfps  

Microsoft Academic Search

A feasibility study is presented for an image sensor capable of image capturing at 100 Mega-frames per second (Mfps). The basic structure of the sensor is the backside-illuminated ISIS, the in-situ storage image sensor, with slanted linear CCD memories, which has already achieved 1 Mfps with very high sensitivity. There are many potential technical barriers to further increase the frame

Cuong Vo Le; H. D. Nguyen; V. T. S. Dao; K. Takehara; T. G. Etoh; T. Akino; K. Nishi; K. Kitamura; T. Arai; H. Maruyama

2008-01-01

286

High resolution RGB color line scan camera  

NASA Astrophysics Data System (ADS)

A color line scan camera family is described which is available with either 6000, 8000 or 10000 pixels per color channel, utilizes off-the-shelf lenses, interfaces with currently available frame grabbers, includes on-board pixel-by-pixel offset correction, and is configurable and controllable via an RS232 serial port for computer-controlled or stand-alone operation. This line scan camera is based on an available 8000-element monochrome line scan camera designed by AOA for OEM use. The new color version includes improvements such as better packaging and additional user features which make the camera easier to use. The heart of the camera is a tri-linear CCD sensor with on-chip color balancing for maximum accuracy and pinned photodiodes for low-lag response. Each color channel is digitized to 12 bits and all three channels are multiplexed together so that the resulting camera output video is either a 12- or 8-bit data stream at a rate of up to 24 Megapixels/sec. Conversion from 12 to 8 bit, or user-defined gamma, is accomplished by on-board user-defined video look-up tables. The camera has two user-selectable operating modes: a low-speed, high-sensitivity mode or a high-speed, reduced-sensitivity mode. The intended uses of the camera include industrial inspection, digital archiving, document scanning, and graphic arts applications.

Lynch, Theodore E.; Huettig, Fred

1998-04-01

287

Adaptive compressive sensing camera  

NASA Astrophysics Data System (ADS)

We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simple concept that each pixel is a charge bucket whose charge comes from the Einstein photoelectric conversion effect. Applying a manufacturing design principle, we allow each working component to be altered by at most one minimal step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The savings in data storage are immense, and the order of magnitude of the saving is inversely proportional to target angular speed. We designed two new CCD camera components. Owing to mature CMOS (Complementary Metal-Oxide-Semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as dual Photon Detector (PD) analog circuitry for change detection that predicts skipping or going forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [?] is implemented at the bucket-pixel level, biasing the charge transport voltage toward neighboring buckets or not; if not, the charge goes to ground drainage. Since a snapshot image is not a video, we cannot apply the usual MPEG video compression and Huffman entropy codec, nor a powerful WaveNet wrapper, at the sensor level. We shall compare (i) pre-processing by FFT, thresholding of significant Fourier mode components, and inverse FFT to check PSNR; and (ii) post-processing image recovery, done selectively by the CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii), in selecting new frames with the SAH circuitry we need to determine the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data a la [?]M,N: M(t) = K(t) Log N(t).
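The closing relation M(t) = K(t) Log N(t) is the standard compressive-sensing measurement count: the number of random measurements scales with the scene sparsity times the log of the pixel count. A minimal sketch, with a natural logarithm and unit constant factor assumed since the abstract omits both:

```python
import math

def required_measurements(k, n):
    """Compressive-sensing rule of thumb: about M = K * log(N) random
    measurements recover a K-sparse scene of N pixels (constant
    factors and log base omitted in the abstract; natural log used)."""
    return math.ceil(k * math.log(n))

# A 1-megapixel frame with ~100 significant coefficients:
m = required_measurements(100, 1_000_000)
# m == 1382, far fewer than the 1,000,000 raw pixel reads
```

This is where the storage saving quoted in the abstract comes from: M grows only logarithmically in the pixel count.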

Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

2013-05-01

288

The Terrascope Dataset: A Scripted Multi-Camera Indoor Video Surveillance Dataset with Ground-truth  

E-print Network

In addition to the video data, a face and gait database for all twelve individuals observed by the network studies, assistive technologies for the elderly, and intelligent environments that efficiently

Kale, Amit

289

Visual surveys can reveal rather different 'pictures' of fish densities: Comparison of trawl and video camera surveys in the Rockall Bank, NE Atlantic Ocean  

NASA Astrophysics Data System (ADS)

Visual surveys allow non-invasive sampling of organisms in the marine environment which is of particular importance in deep-sea habitats that are vulnerable to damage caused by destructive sampling devices such as bottom trawls. To enable visual surveying at depths greater than 200 m we used a deep towed video camera system, to survey large areas around the Rockall Bank in the North East Atlantic. The area of seabed sampled was similar to that sampled by a bottom trawl, enabling samples from the towed video camera system to be compared with trawl sampling to quantitatively assess the numerical density of deep-water fish populations. The two survey methods provided different results for certain fish taxa and comparable results for others. Fish that exhibited a detectable avoidance behaviour to the towed video camera system, such as the Chimaeridae, resulted in mean density estimates that were significantly lower (121 fish/km2) than those determined by trawl sampling (839 fish/km2). On the other hand, skates and rays showed no reaction to the lights in the towed body of the camera system, and mean density estimates of these were an order of magnitude higher (64 fish/km2) than the trawl (5 fish/km2). This is probably because these fish can pass under the footrope of the trawl due to their flat body shape lying close to the seabed but are easily detected by the benign towed video camera system. For other species, such as Molva sp, estimates of mean density were comparable between the two survey methods (towed camera, 62 fish/km2; trawl, 73 fish/km2). The towed video camera system presented here can be used as an alternative benign method for providing indices of abundance for species such as ling in areas closed to trawling, or for those fish that are poorly monitored by trawl surveying in any area, such as the skates and rays.
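The density comparison above reduces to a simple swept-area calculation; a sketch with hypothetical transect numbers, not values from the survey:

```python
def density_per_km2(n_fish, transect_length_km, swath_width_m):
    """Numerical density from a visual or trawl transect: fish counted
    divided by the seabed area swept (length x width)."""
    area_km2 = transect_length_km * (swath_width_m / 1000.0)
    return n_fish / area_km2

# Hypothetical camera tow: 10 fish over a 2 km transect with a 2 m
# field-of-view swath covers 0.004 km^2:
d = density_per_km2(10, 2.0, 2.0)
# d == 2500.0 fish/km^2
```

Method-specific biases (avoidance of camera lights, escape under the trawl footrope) enter through n_fish, which is why the two gears disagree for some taxa.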

McIntyre, F. D.; Neat, F.; Collie, N.; Stewart, M.; Fernandes, P. G.

2015-01-01

290

Development of observation method for hydrothermal flows with acoustic video camera  

NASA Astrophysics Data System (ADS)

DIDSON (Dual-Frequency IDentification SONar) is an acoustic lens-based sonar. It has sufficiently high resolution and a sufficiently rapid refresh rate that it can substitute for optical systems in turbid or dark water where optical systems fail. The Institute of Industrial Science, University of Tokyo (IIS) recognized DIDSON's superior performance and has worked to develop a new DIDSON-based observation method for hydrothermal discharge from seafloor vents. We expected DIDSON to reveal the whole image of a hydrothermal plume as well as details inside the plume. In October 2009, we conducted seafloor reconnaissance using the manned deep-sea submersible Shinkai 6500 on the Central Indian Ridge at 18-20 deg. S, where hydrothermal plume signatures had previously been perceived. DIDSON was mounted on top of Shinkai 6500 in order to capture acoustic video images of hydrothermal plumes. Acoustic video images of the plumes were captured in three of seven dives; these are among the few acoustic video images of hydrothermal plumes in existence. We could identify shadings inside the acoustic video images of the plumes. Silhouettes of the plumes varied from second to second, and the shadings inside them varied their shapes, too. These variations corresponded to internal structures and flows of the plumes. We are analyzing the acoustic video images in order to deduce information on the internal structures and flows of the plumes. In parallel, we are preparing a tank experiment to obtain acoustic video images of water flow under controlled flow rates. The purpose of the experiment is to understand the relation between flow rate and acoustic video image quantitatively. Results from this experiment will support the aforementioned image analysis of the hydrothermal plume data from the Central Indian Ridge. We will report an overview of the image analysis and the tank experiments, and discuss the possibility of DIDSON as an observation tool for seafloor hydrothermal activity.

Mochizuki, M.; Asada, A.; Kinoshita, M.; Tamura, H.; Tamaki, K.

2011-12-01

291

CCD and IR array controllers  

NASA Astrophysics Data System (ADS)

A family of controllers has been developed that is powerful and flexible enough to operate a wide range of CCD and IR focal plane arrays in a variety of ground-based applications. These include fast readout of small CCD and IR arrays for adaptive optics applications, slow readout of large CCD and IR mosaics, and single CCD and IR array operation in low background/low noise regimes as well as high background/high speed regimes. The CCD and IR controllers have a common digital core based on user-programmable digital signal processors that are used to generate the array clocking and signal processing signals customized for each application. A fiber optic link passes image data and commands between the controller and VME or PCI interface boards resident in a host computer. CCD signal processing is done with a dual slope integrator operating at speeds of up to one Megapixel per second per channel. Signal processing of IR arrays is done either with a dual channel video processor or with a four channel video processor that has built-in image memory and a coadder with 32-bit precision for operating high background arrays. Recent developments underway include the implementation of a fast fiber optic data link operating at a speed of 12.5 Megapixels per second for fast image transfer from the controller to the host computer, and supporting image acquisition software and device drivers for the PCI interface board under the Sun Solaris, Linux and Windows 2000 operating systems.
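The dual slope integrator mentioned above implements correlated double sampling: the reset level is integrated with one polarity and the video level with the other, so their difference cancels reset noise and common offsets. A minimal numeric sketch; the sample values are illustrative:

```python
def dual_slope_cds(reset_samples, signal_samples):
    """Correlated double sampling via dual-slope integration:
    average (integrate) the reset baseline, then the signal level
    with opposite polarity; the difference is the pixel value,
    with the common baseline and reset noise cancelled."""
    reset = sum(reset_samples) / len(reset_samples)
    signal = sum(signal_samples) / len(signal_samples)
    return signal - reset

# A pixel sitting 40 units above a noisy baseline near 100:
level = dual_slope_cds([100.0, 101.0, 99.0], [140.0, 141.0, 139.0])
# level == 40.0 regardless of the absolute baseline
```

In hardware the two integrations happen in analog within one pixel time, which is what bounds the per-channel Megapixel-per-second rate.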

Leach, Robert W.; Low, Frank J.

2000-08-01

292

Internet Telepresence by Real-Time View-Dependent Image Generation with Omnidirectional Video Camera  

NASA Astrophysics Data System (ADS)

This paper describes a new networked telepresence system which realizes virtual tours into a visualized dynamic real world without significant time delay. Our system is realized by the following three steps: (1) video-rate omnidirectional image acquisition, (2) transport of the omnidirectional video stream over the internet, and (3) real-time view-dependent perspective image generation from the omnidirectional video stream. Our system is applicable to real-time telepresence in situations where the real world to be seen is far from the observation site, because the time delay from a change in the user's viewing direction to the change in the displayed image is small and does not depend on the actual distance between the two sites. Moreover, multiple users can look around from a single viewpoint in a visualized dynamic real world in different directions at the same time. In experiments, we have shown that the proposed system is useful for internet telepresence.

Morita, Shinji; Yamazawa, Kazumasa; Yokoya, Naokazu

2003-01-01

293

CCD use at Lick Observatory  

NASA Technical Reports Server (NTRS)

The CCD detector and data handling system in regular use at Lick Observatory are described. A grism system has been installed on the automated Cassegrain spectrograph at the Shane 3-meter telescope. A very compact CCD cooling system has been developed using a commercial gas expansion refrigerator; the dewar is a cylinder 15 cm in diameter and 6 cm high. The data acquisition computer, an LSI 11/23 with 256 kbyte RAM, 160 Mbyte Winchester disk, and color video display, provides for FITS format magnetic tape storage as well as preliminary analysis of images and spectra. The 3-meter telescope spectrographic system uses a 500 x 500 pixel thinned CCD. Design information and operational experience for these detector systems are presented along with some results to illustrate the quality of the data being obtained and the current limitations of the CCD detectors and data system.

Lauer, T. R.; Miller, J. S.; Osborne, C. S.; Robinson, L. B.; Stover, R. J.

1984-01-01

294

Arbitrary view generation for three-dimensional scenes from uncalibrated video cameras  

Microsoft Academic Search

This paper focuses on the representation and arbitrary view generation of three dimensional (3-D) scenes. In contrast to existing methods that construct a full 3-D model or those that exploit geometric invariants, our representation consists of dense depth maps at several preselected viewpoints from an image sequence. Furthermore, instead of using multiple calibrated stationary cameras or range data, we derive

Nelson L. Chang; Avideh Zakhor

1995-01-01

295

Meteoroid flux determination using image intensified video camera data from the CILBO double station  

NASA Astrophysics Data System (ADS)

The double-station meteor camera setup on the Canary Islands, called CILBO, has been active since July 2011. This paper is based on the meteor data of one year (1.6.2013 - 31.5.2014). As a first step, the statistical distribution of all observed meteors from both cameras was analyzed. Parameters under investigation include the number of meteors observed by either one or both cameras as a function of month, magnitude and direction. In a second step, the absolute magnitude was calculated. It was found that ICC9 (La Palma) detects about 15% more meteors than ICC7 (Tenerife). A difference in the camera settings can be ruled out as a reason, but the different pointing directions are taken into consideration: ICC7 looks to the north-west and ICC9 looks to the south-east. One suggestion is that ICC9 sees more of the meteors originating from the Apex contribution in the early morning hours. An equation by Verniani (1973) has been used to convert brightness and velocity to the mass of the incident particle. This paper presents first results of the meteor flux analysis and compares the CILBO flux to well-known reference models (Grün et al., 1985; Halliday et al., 1996). It was found that the measured CILBO data yield a flux which fits the reference model of Grün et al. quite well.
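A flux comparison of the kind described reduces to counts per collecting area per observing time; a sketch with hypothetical numbers, not CILBO values:

```python
def meteor_flux(n_meteors, area_km2, hours):
    """Meteor flux to a limiting mass: meteors counted divided by the
    atmospheric collecting area of the double-station overlap volume
    and the effective observing time."""
    return n_meteors / (area_km2 * hours)

# Hypothetical: 1200 double-station meteors, 1.0e4 km^2 collecting
# area, 300 clear-sky hours:
f = meteor_flux(1200, 1.0e4, 300.0)
# f == 4e-4 meteors km^-2 h^-1
```

Binning the counts by the particle mass obtained from the Verniani relation turns this single number into the cumulative flux curve that is compared against the Grün et al. model.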

Ott, Theresa; Drolshagen, Esther; Koschny, Detlef; Drolshagen, Gerhard; Poppe, Bjoern

2014-02-01

296

The Advanced Camera for the Hubble Space Telescope  

Microsoft Academic Search

The Advanced Camera for the Hubble Space Telescope has three cameras. The first, the Wide Field Camera, will be a high- throughput, wide field, 4096 X 4096 pixel CCD optical and I-band camera that is half-critically sampled at 500 nm. The second, the High Resolution Camera (HRC), is a 1024 X 1024 pixel CCD camera that is critically sampled at

G. D. Illingworth; Paul D. Feldman; David A. Golimowski; Zlatan Tsvetanov; Christopher J. Burrows; James H. Crocker; Pierre Y. Bely; George F. Hartig; Randy A. Kimble; Michael P. Lesser; Richard L. White; Tom Broadhurst; William B. Sparks; Robert A. Woodruff; Pamela Sullivan; Carolyn A. Krebs; Douglas B. Leviton; William Burmester; Sherri Fike; Rich Johnson; Robert B. Slusher; Paul Volmer

1997-01-01

297

Video-based realtime IMU-camera calibration for robot navigation  

NASA Astrophysics Data System (ADS)

This paper introduces a new method for fast calibration of inertial measurement units (IMUs) rigidly coupled to cameras. That is, the relative rotation and translation between the IMU and the camera are estimated, allowing for the transfer of IMU data into the camera's coordinate frame. Moreover, the IMU's nuisance parameters (biases and scales) and the horizontal alignment of the initial camera frame are determined. Since an iterated Kalman filter is used for estimation, information on the estimation's precision is also available. Such calibrations are crucial for IMU-aided visual robot navigation, i.e. SLAM, since wrong calibrations cause biases and drifts in the estimated position and orientation. As the estimation is performed in real time, the calibration can be done using a freehand movement and the estimated parameters can be validated just in time. This provides the opportunity to optimize the trajectory online, increasing the quality and minimizing the time effort of calibration. Except for a marker pattern used for visual tracking, no additional hardware is required. As will be shown, the system is capable of estimating the calibration within a short period of time: depending on the requested precision, trajectories of 30 seconds to a few minutes are sufficient. This allows for calibrating the system at startup. By this, deviations in the calibration due to transport and storage can be compensated. The estimation quality and consistency are evaluated in dependence on the traveled trajectories and the amount of IMU-camera displacement and rotation misalignment. It is analyzed how different types of visual markers, i.e. 2- and 3-dimensional patterns, affect the estimation. Moreover, the method is applied to mono and stereo vision systems, providing information on its applicability to robot systems. The algorithm is implemented using a modular software framework, such that it can be adapted to altered conditions easily.

Petersen, Arne; Koch, Reinhard

2012-06-01

298

Camera View-based American Football Video Analysis Yi Ding and Guoliang Fan  

E-print Network

and Computer Engineering, Oklahoma State University, Stillwater, OK, 74078. Supported by the National Science Foundation (NSF) under Grant IIS-0347613 (CAREER). There are mainly two methodologies. The OSU football field example. In broadcast sport video, there are a variety of underlying rules

Fan, Guoliang

299

In size preserving video tracking, the camera's focal length (zoom) is adjusted automatically to compensate for  

E-print Network

The existing method of choice for real-time target scale estimation applies structure from motion (SFM). A foreground/background separation algorithm, the affine shape method, is used; the resulting segmentation automatically adapts to the target, preserving their sizes. 1. Introduction: Video tracking systems with automatic zoom control have attracted

Abidi, Mongi A.

300

A simple, inexpensive video camera setup for the study of avian nest activity  

Microsoft Academic Search

Time-lapse video photography has become a valuable tool for collecting data on avian nest activity and depredation; however, commercially available systems are expensive (>USA $4000/unit). We designed an inexpensive system to identify causes of nest failure of American Oystercatchers (Haematopus palliatus) and assessed its utility at Cumberland Island National Seashore, Georgia. We successfully identified raccoon (Procyon lotor), bobcat (Lynx rufus),

John B. Sabine; J. Michael Meyers; Sara H. Schweitzer

301

Lights! Camera! Action! Producing Library Instruction Video Tutorials Using Camtasia Studio  

ERIC Educational Resources Information Center

From Web guides to online tutorials, academic librarians are increasingly experimenting with many different technologies in order to meet the needs of today's growing distance education populations. In this article, the author discusses one librarian's experience using Camtasia Studio to create subject specific video tutorials. Benefits, as well

Charnigo, Laurie

2009-01-01

302

Adaptive multifoveation for low-complexity video compression with a stationary camera perspective  

NASA Astrophysics Data System (ADS)

In the human visual system, the spatial resolution of a scene under view decreases uniformly at points of increasing distance from the point of gaze, also called the foveation point. This phenomenon is referred to as foveation and has been exploited in foveated imaging to allocate bits in image and video coding according to the spatially varying perceived resolution. Several digital image processing techniques have been proposed in the past to realize foveated images and video. In most cases a single foveation point is assumed in a scene. Recently there has been significant interest in dynamic as well as multi-point foveation; however, the complexity involved in identifying foveation points is significantly high in the proposed approaches. In this paper, an adaptive multi-point foveation technique for video data based on the concept of regions of interest (ROIs) is proposed and its performance is investigated. The points of interest are assumed to be the centroids of moving objects and are determined dynamically by the proposed foveation algorithm. A fast algorithm for implementing region-based multi-foveation processing is proposed. The proposed adaptive multi-foveation integrates fully with existing video codec standards in both the spatial and DCT domains.
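A multi-point foveation weight map of the kind described (full resolution at each ROI centroid, falling off with eccentricity) can be sketched as follows; the 1/(1 + d/d0) falloff is an illustrative model, not the paper's exact function:

```python
import math

def foveation_map(width, height, foveation_points, half_res_dist=50.0):
    """Relative resolution map for multi-point foveation: each pixel
    takes the highest weight over all foveation points, with weight
    decaying as 1/(1 + d/d0) away from each point."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            w = max(1.0 / (1.0 + math.hypot(x - fx, y - fy) / half_res_dist)
                    for fx, fy in foveation_points)
            row.append(w)
        grid.append(row)
    return grid

# Two moving-object centroids serve as foveation points:
m = foveation_map(100, 100, [(20, 20), (80, 80)])
# m[20][20] == 1.0 at a foveation point; weights decay away from both
```

Such a map can then drive spatially varying bit allocation or pre-filtering in either the pixel or DCT domain.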

Sankaran, Sriram; Ansari, Rashid; Khokhar, Ashfaq A.

2005-03-01

303

A CONTENT BASED VIDEO TRAFFIC MODEL USING CAMERA Paul Bocheck and Shih-Fu Chang  

E-print Network

The different video styles (videophone, movie, news, sport, etc.) can have distinct statistics of scene length. The model differs from previous works in that it is not based only on matching of various statistics of the original operations. The results obtained show that the CBV model can closely match the various statistics of MPEG-2

Chang, Shih-Fu

304

CCD based beam loss monitor for ion accelerators  

NASA Astrophysics Data System (ADS)

Beam loss monitoring is an important aspect of proper accelerator functioning. There is a variety of existing solutions, but each has its own disadvantages, e.g. unsuitable dynamic range or time resolution, high cost, or short lifetime. Therefore, new options are sought. This paper shows a method of applying a charge-coupled device (CCD) video camera as a beam loss monitor (BLM) for ion beam accelerators. The system was tested with a 500 MeV/u N7+ ion beam interacting with an aluminum target. The algorithms for camera signal processing with LabVIEW-based code and for beam loss measurement are explained. Limits of applicability of this monitor system are discussed.
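A camera-as-BLM processing chain of this kind (dark subtraction, thresholding, integration) can be sketched as below; this mirrors the described signal processing in spirit only, with made-up frame data:

```python
def beam_loss_signal(frame, dark_frame, threshold=10):
    """Per-frame beam-loss estimate from a CCD camera viewing the
    loss point: subtract a dark reference frame, then sum pixel
    values above a noise threshold."""
    total = 0
    for row, dark_row in zip(frame, dark_frame):
        for px, dk in zip(row, dark_row):
            v = px - dk
            if v > threshold:
                total += v
    return total

# Toy 2x2 frame with one bright loss spot and one moderate pixel:
s = beam_loss_signal([[12, 50], [200, 11]], [[10, 10], [10, 10]])
# s == 230: the 40- and 190-count pixels pass the threshold,
# the 1- and 2-count residuals do not
```

The dynamic range of such a monitor is bounded below by the noise threshold and above by pixel saturation, which is one of the applicability limits the paper discusses.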

Belousov, A.; Mustafin, E.; Ensinger, W.

2014-04-01

305

Point Counts Underestimate the Importance of Arctic Foxes as Avian Nest Predators: Evidence from Remote Video Cameras in Arctic Alaskan Oil Fields  

Microsoft Academic Search

We used video cameras to identify nest predators at active shorebird and passerine nests and conducted point count surveys separately to determine species richness and detection frequency of potential nest predators in the Prudhoe Bay region of Alaska. From the surveys, we identified 16 potential nest predators, with glaucous gulls (Larus hyperboreus) and parasitic jaegers (Stercorarius parasiticus) making up more

JOSEPH R. LIEBEZEIT; STEVE ZACK

2008-01-01

306

Distribution of bioluminescence and plankton in a deep Norwegian fjord measured using an ISIT camera and the Digital Underwater Video Profiler  

Microsoft Academic Search

Bioluminescence and plankton profiles were obtained using a downward-looking ISIT low-light camera and the Underwater Video Profiler system in Sognefjord, Norway. The profiling systems were lowered by CTD wire and recorded continuously from the surface to a depth of 1000 m. The former system delivered the vertical distribution of mechanically stimulated bioluminescent signals while the second provided the vertical distribution

David M. Bailey; Marc Picheral; Alan J. Jamieson; Olav Rune Godø; Philip M. Bagley; Gabriel Gorsky

2007-01-01

307

Abstract--Video cameras are a relatively low-cost, rich source of information that can be used for "well-being"  

E-print Network

levels from a hierarchy of fuzzy inference using linguistic summarizations of activity acquired; it extends to additional common elderly activities, and contextual awareness is added for reasoning based on location.

He, Zhihai "Henry"

308

A Motionless Camera  

NASA Technical Reports Server (NTRS)

Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

1994-01-01

309

Soft X-ray transmission of optical blocking filters for the X-ray CCD cameras onboard Astro-E 2  

Microsoft Academic Search

We measured the soft X-ray transmission of Optical Blocking Filters for Charge Coupled Device cameras, which will be launched as focal plane detectors of the X-ray telescopes onboard the fifth Japanese X-ray astronomical satellite, Astro-E 2. The filters were made from polyimide coated with Al. The X-ray absorption fine structures at the K edges of C, N, O and Al were measured.

Shunji Kitamoto; Takayoshi Kohmura; Norimasa Yamamoto; Harue Saito; Haruko Takano; Kazuharu Suga; Eiji Ozawa; Kazuma Suzuki; Risa Kato; Yusuke Tachibana; Yusuke Tsuji; Ken Koganei; Kiyoshi Hayashida; Haruyoshi Katayama; Hideyuki Enoguchi; Yusuke Nakashima; Takayuki Shiroshouji

2003-01-01

310

Compensating for camera translation in video eye-movement recordings by tracking a representative landmark selected automatically by a genetic algorithm.  

PubMed

It is common in oculomotor and vestibular research to use video or still cameras to acquire data on eye movements. Unfortunately, such data are often contaminated by unwanted motion of the face relative to the camera, especially during experiments in dynamic motion environments. We develop a method for estimating the motion of a camera relative to a highly deformable surface, specifically the movement of a camera relative to the face and eyes. A small rectangular region of interest (ROI) on the face is automatically selected and tracked throughout a set of video frames as a measure of vertical camera translation. The specific goal is to present a process, based on a genetic algorithm, that selects a suitable ROI for tracking: one whose translation within the camera image accurately matches the actual relative motion of the camera. We find that co-correlation, a statistic describing the time series of a large group of ROIs, predicts the accuracy of the ROIs, and can be used to select the best ROI from a group. After the genetic algorithm finds the best ROIs from a group, it uses recombination to form a new generation of ROIs that inherit properties of the ROIs from the previous generation. We show that the algorithm can select an ROI that will estimate camera translation and determine the direction that the eye is looking with an average accuracy of 0.75 degrees, even with camera translations of 2.5 mm at a viewing distance of 120 mm, which would cause an error of 11 degrees without correction. PMID:18835407
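The select-and-recombine loop over candidate ROIs can be sketched as a single generation of a genetic algorithm; here the fitness function stands in for the co-correlation statistic, and the operators are toy versions, not the paper's:

```python
import random

def evolve_rois(candidates, fitness, n_keep=4, n_children=8, jitter=3):
    """One generation of a minimal genetic algorithm over rectangular
    ROIs (x, y, w, h): rank candidates by fitness, keep the best as
    parents, and breed children by recombining coordinates from two
    parents with small random jitter."""
    parents = sorted(candidates, key=fitness, reverse=True)[:n_keep]
    children = [
        tuple(random.choice(pair) + random.randint(-jitter, jitter)
              for pair in zip(a, b))
        for a, b in (random.sample(parents, 2) for _ in range(n_children))
    ]
    return parents + children

# Toy fitness preferring ROIs whose x coordinate is near column 5:
random.seed(0)
rois = [(i, i, 10, 10) for i in range(10)]
best_gen = evolve_rois(rois, lambda r: -abs(r[0] - 5))
# best_gen[0] is (5, 5, 10, 10); 8 jittered recombinations follow
```

Iterating this step concentrates the population around ROIs whose motion best predicts camera translation.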

Karmali, Faisal; Shelhamer, Mark

2009-01-30

311

Technologies to develop a video camera with a frame rate higher than 100 Mfps  

NASA Astrophysics Data System (ADS)

A feasibility study is presented for an image sensor capable of image capturing at 100 Mega-frames per second (Mfps). The basic structure of the sensor is the backside-illuminated ISIS, the in-situ storage image sensor, with slanted linear CCD memories, which has already achieved 1 Mfps with very high sensitivity. There are many potential technical barriers to further increasing the frame rate up to 100 Mfps, such as the traveling time of electrons within a pixel, Resistive-Capacitive (RC) delay in driving voltage transfer, heat generation, heavy electromagnetic noise, etc. For each of the barriers, a countermeasure is newly proposed and its technical and practical feasibility is examined, mainly by simulations. The new technical proposals include a special wafer with n and p double epitaxial layers with smoothly changing doping profiles, a design method with curves, thunderbolt bus lines, and digital noiseless image capturing by the ISIS with solely sinusoidal driving voltages. It is confirmed that the integration of these technologies is very promising for realizing a practical image sensor with the ultra-high frame rate.

Vo Le, Cuong; Nguyen, H. D.; Dao, V. T. S.; Takehara, K.; Etoh, T. G.; Akino, T.; Nishi, K.; Kitamura, K.; Arai, T.; Maruyama, H.

2008-11-01

312

Crosswind sensing from optical-turbulence-induced fluctuations measured by a video camera.  

PubMed

We present a novel method for remote sensing of crosswind using a passive imaging device, such as a video recorder. The method is based on spatial and temporal correlations of the intensity fluctuations of a naturally illuminated scene induced by atmospheric turbulence. Adaptable spatial filtering, taking into account variations of the dominant spatial scales of the turbulence (due to changes in meteorological conditions, such as turbulence strength, or imaging device performance, such as frame rate or spatial resolution), is incorporated into this method. Experimental comparison with independent wind measurement using anemometers shows good agreement. PMID:20885458
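The correlation idea can be illustrated with a minimal sketch: take intensity time series at two image points separated across the wind, find the frame lag that maximizes their cross-correlation, and convert lag to speed. All signals and numbers below are synthetic:

```python
import math

def crosswind_speed(series_a, series_b, separation_m, frame_rate_hz,
                    max_lag=20):
    """Crosswind estimate from turbulence-induced intensity
    fluctuations: the lag (in frames) maximizing the normalized
    cross-correlation of the two series gives the transit time of
    the pattern between the points."""
    def corr(lag):
        n = len(series_a) - lag
        a, b = series_a[:n], series_b[lag:lag + n]
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = math.sqrt(sum((x - ma) ** 2 for x in a))
        db = math.sqrt(sum((y - mb) ** 2 for y in b))
        return num / (da * db) if da and db else 0.0
    best_lag = max(range(1, max_lag + 1), key=corr)
    return separation_m * frame_rate_hz / best_lag

# A fluctuation pattern arriving at point B five frames after point A,
# with the points 0.5 m apart at 25 fps, implies 2.5 m/s:
a = [math.sin(0.3 * t) for t in range(200)]
b = [0.0] * 5 + a[:-5]
speed = crosswind_speed(a, b, 0.5, 25.0)
# speed == 2.5
```

The adaptable spatial filtering described in the abstract would precede this step, isolating the dominant turbulence scales before correlating.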

Porat, Omer; Shapira, Joseph

2010-10-01

313

Apogee Imaging Systems Camera Installation Guide  

E-print Network

camera 4) Begin using your Apogee camera! 1.1 Install MaxIm DL/CCD Your Apogee camera may have included a copy of MaxIm DL/CCD image capture and processing software, developed by Diffraction Limited. If you will be using the MaxIm DL/CCD software, we recommend that you install it prior to setting up your Apogee camera

Kleinfeld, David

314

A simple, inexpensive video camera setup for the study of avian nest activity  

USGS Publications Warehouse

Time-lapse video photography has become a valuable tool for collecting data on avian nest activity and depredation; however, commercially available systems are expensive (>USA $4000/unit). We designed an inexpensive system to identify causes of nest failure of American Oystercatchers (Haematopus palliatus) and assessed its utility at Cumberland Island National Seashore, Georgia. We successfully identified raccoon (Procyon lotor), bobcat (Lynx rufus), American Crow (Corvus brachyrhynchos), and ghost crab (Ocypode quadrata) predation on oystercatcher nests. Other detected causes of nest failure included tidal overwash, horse trampling, abandonment, and human destruction. System failure rates were comparable with commercially available units. Our system's efficacy and low cost (<$800) provided useful data for the management and conservation of the American Oystercatcher.

Sabine, J.B.; Meyers, J.M.; Schweitzer, S.H.

2005-01-01

315

Jellyfish support high energy intake of leatherback sea turtles (Dermochelys coriacea): video evidence from animal-borne cameras.  

PubMed

The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08-3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83-100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ d(-1) but were as high as 167,797 kJ d(-1), corresponding to turtles consuming an average of 330 kg wet mass d(-1) (up to 840 kg d(-1)) or approximately 261 (up to 664) jellyfish d(-1). Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass d(-1), equating to an average energy intake of 3-7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to southward migration. PMID:22438906
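The energy-intake arithmetic in the abstract is counts times mass times energy density; a sketch reproducing the scale of the reported average, using a placeholder energy density that is not a value from the paper:

```python
def daily_energy_intake(jellyfish_per_day, mean_mass_kg, energy_kj_per_kg):
    """Daily energy intake from prey counts: jellyfish consumed per
    day times mean wet mass times energy density of the tissue."""
    return jellyfish_per_day * mean_mass_kg * energy_kj_per_kg

# The paper's averages imply ~1.26 kg per jellyfish (330 kg / 261);
# 200 kJ/kg is a placeholder energy density, not from the study:
kj = daily_energy_intake(261, 330 / 261, 200.0)
# kj is ~66,000 kJ/d, the scale of the reported 66,018 kJ d(-1) average
```

Dividing such a total by an estimated metabolic rate gives the 3-7x requirement multiple quoted in the abstract.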

Heaslip, Susan G; Iverson, Sara J; Bowen, W Don; James, Michael C

2012-01-01

316

Jellyfish Support High Energy Intake of Leatherback Sea Turtles (Dermochelys coriacea): Video Evidence from Animal-Borne Cameras  

PubMed Central

The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08-3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83-100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ d(-1) but were as high as 167,797 kJ d(-1) corresponding to turtles consuming an average of 330 kg wet mass d(-1) (up to 840 kg d(-1)) or approximately 261 (up to 664) jellyfish d(-1). Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass d(-1) equating to an average energy intake of 3-7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to southward migration. PMID:22438906

Heaslip, Susan G.; Iverson, Sara J.; Bowen, W. Don; James, Michael C.

2012-01-01

317

Micro-rheology Using Multi Speckle DWS with Video Camera. Application to Film Formation, Drying and Rheological Stability  

NASA Astrophysics Data System (ADS)

We present in this work two applications of microrheology: the monitoring of film formation and of rheological stability. Microrheology is based on the Diffusing Wave Spectroscopy (DWS) method [1] that relates the particle dynamics to the speckle field dynamics, and further to the visco-elastic moduli G' and G'' with respect to frequency [2]. Our technology uses the Multi Speckle DWS (MS-DWS) set-up in backscattering with a video camera. For the film formation and drying application, we present a new algorithm called "Adaptive Speckle Imaging Interferometry" (ASII) that extracts a simple kinetics from the speckle field dynamics [3,4]. Different film-forming and drying processes have been investigated (water-based, solvent and solvent-free paints, inks, adhesives, varnishes, etc.) on various types of substrates and at different thicknesses (a few to hundreds of microns). For rheological stability we show that the robust measurement of speckle correlation using the inter-image distance [3] can bring useful information for industry on viscoelasticity variations over a wide range of frequencies without additional parameters.

Brunel, Laurent; Dihang, Hélène

2008-07-01

318

Bird-borne video-cameras show that seabird movement patterns relate to previously unrevealed proximate environment, not prey.  

PubMed

The study of ecological and behavioral processes has been revolutionized in the last two decades with the rapid development of biologging science. Recently, using image-capturing devices, some pilot studies demonstrated the potential of understanding marine vertebrate movement patterns in relation to their proximate, as opposed to remotely sensed, environmental contexts. Here, using miniaturized video cameras and GPS tracking recorders simultaneously, we show for the first time that information on the immediate visual surroundings of a foraging seabird, the Cape gannet, is fundamental in understanding the origins of its movement patterns. We found that movement patterns were related to specific stimuli, which were mostly other predators such as gannets, dolphins or fishing boats. Contrary to a widely accepted idea, our data suggest that foraging seabirds are not directly looking for prey. Instead, they search for indicators of the presence of prey, the latter being targeted at the very last moment and at a very small scale. We demonstrate that movement patterns of foraging seabirds can be heavily driven by processes unobservable with conventional methodology. Except perhaps for large-scale processes, local enhancement seems to be the only ruling mechanism; this has profound implications for ecosystem-based management of marine areas. PMID:24523892

Tremblay, Yann; Thiébault, Andréa; Mullers, Ralf; Pistorius, Pierre

2014-01-01

319

Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries  

SciTech Connect

After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.

Dierberg, F.E. [DB Environmental Labs., Inc., Rockledge, FL (United States); Zaitzeff, J. [National Oceanographic and Atmospheric Administration, Washington, DC (United States)

1997-08-01
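The bivariate model described above (a 625-710 nm band ratio plus true color predicting Chl a) can be sketched with ordinary least squares. The data and coefficients below are synthetic placeholders under assumed ranges, not measurements from the three estuaries.

```python
import numpy as np

# Sketch of a bivariate linear regression: band ratio + true color -> Chl a.
# Synthetic data with assumed coefficients, for illustration only.
rng = np.random.default_rng(0)
n = 40
band_ratio = rng.uniform(0.8, 1.6, n)   # e.g. a 625-710 nm band ratio
true_color = rng.uniform(10, 80, n)     # assumed color units
chl_a = 5.0 + 12.0 * band_ratio - 0.1 * true_color + rng.normal(0, 0.5, n)

# Design matrix with intercept; ordinary least-squares fit
X = np.column_stack([np.ones(n), band_ratio, true_color])
coef, *_ = np.linalg.lstsq(X, chl_a, rcond=None)

# Coefficient of determination R^2
pred = X @ coef
r2 = 1 - np.sum((chl_a - pred) ** 2) / np.sum((chl_a - chl_a.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

With both predictors included, the fit recovers most of the variance, mirroring how adding true color improved the R² values reported above.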

320

Locometer: on-line inspection of locomotive wheel-to-rail movements using high-precision CCD metrology  

NASA Astrophysics Data System (ADS)

A CCD camera based optical metrology system has been developed for the accurate measurement of a railway locomotive's wheel movements with respect to the rails. The system is based on the light-sectioning method implemented with four laser diodes projecting light sheets onto the wheel and rail. A high-resolution CCD camera views the four profiles simultaneously using an appropriately folded and combined beam-path. To minimize the effects of ambient light, a special narrow-band dielectric filter was designed, manufactured, and fitted in front of the camera lens. The desired measurement accuracy requires pixel-synchronous acquisition of the CCD video data. This is realized with a custom-built universal CCD data acquisition system with which profile tracking, data compression, and storage at 12.5 Hz (half frame-rate) is made possible. A prototype system was built and tested on railway tracks at up to 140 km/h. In laboratory experiments the system surpassed the required measurement accuracies about fivefold, attaining an accuracy of 0.02 mm in relative position and better than 0.1 mrad in relative angle.

Seitz, Peter; Gale, Michael T.; Meier, Heinrich; Raynor, Jeffrey M.; Wolff, P.; Hecht, M.

1990-08-01

321

Liquid crystal polarization camera  

Microsoft Academic Search

Presents a fully automated system which unites CCD camera technology with liquid crystal technology to create a polarization camera capable of sensing the polarization of reflected light from objects at pixel resolution. As polarization affords a more general physical description of light than does intensity, it can therefore provide a richer set of descriptive physical constraints for the understanding of

L. B. Wolff; T. A. Mancini

1992-01-01

322

Liquid crystal polarization camera  

Microsoft Academic Search

We present a fully automated system which unites CCD camera technology with liquid crystal technology to create a polarization camera capable of sensing the partial linear polarization of reflected light from objects at pixel resolution. As polarization sensing not only measures intensity but also additional physical parameters of light, it can therefore provide a richer set of descriptive physical constraints

Lawrence B. Wolff; Todd A. Mancini; Philippe Pouliquen; Andreas G. Andreou

1997-01-01

323

In Proc. 1999 IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Kyongju, Korea, pp. 1489-1494, October 1999. Robust Estimation of Human Body Kinematics from Video  

E-print Network

Robust Estimation of Human Body Kinematics from Video. Ales Ude, Kawato Dynamic Brain Project. The system requires only a standard CCD camera and no special markers on the body. We present experimental ... traditional approach that decomposes the problem of human motion capture into a body tracking stage

Ude, Ales

324

Design of video interface conversion system based on FPGA  

NASA Astrophysics Data System (ADS)

This paper presents an FPGA-based video interface conversion system that enables inter-conversion between digital and analog video. A Cyclone IV series EP4CE22F17C chip from Altera Corporation is used as the main video processing chip, and a single-chip microcontroller is used as the information interaction control unit between the FPGA and PC. The system is able to encode/decode messages from the PC. Technologies including video decoding/encoding circuits, the bus communication protocol, data stream de-interleaving and de-interlacing, color space conversion, and the Camera Link timing generator module of the FPGA are introduced. The system converts the Composite Video Broadcast Signal (CVBS) from the CCD camera into Low Voltage Differential Signaling (LVDS), which is then collected by the video processing unit through its Camera Link interface. The processed video signals are then fed to the system output board and displayed on the monitor. The current experiment shows that the system achieves high-quality video conversion with minimum board size.

Zhao, Heng; Wang, Xiang-jun

2014-11-01

325

Vacuum Camera Cooler  

NASA Technical Reports Server (NTRS)

Acquiring cheap, moving video was impossible in a vacuum environment due to camera overheating, which is brought on by the lack of cooling media in a vacuum. A water-jacketed camera cooler enclosure, machined and assembled from copper plate and tube, has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications, the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of the video camera at operating temperature. This development allowed video recording of an in-progress test within a vacuum environment.

Laugen, Geoffrey A.

2011-01-01

326

Camera Operator and Videographer  

ERIC Educational Resources Information Center

Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or

Moore, Pam

2007-01-01

327

Evolution of Ultra-High-Speed CCD Imagers  

Microsoft Academic Search

This paper reviews the high-speed video cameras developed by the authors. A video camera operating at 4,500 frames per second (fps) was developed in 1991. The partial and parallel readout scheme combined with fully digital memory with overwriting function enabled the world fastest imaging at the time. The basic configuration of the camera later became a de facto standard of

T. Goji Etoh; Cuong Vo Le; Yuichi Hashishin; Nao Otsuka; Kohsei Takehara; Hiroshi Ohtake; Tetsuya Hayashida; Hirotaka Maruyama

2008-01-01

328

Are traditional methods of determining nest predators and nest fates reliable? An experiment with Wood Thrushes (Hylocichla mustelina) using miniature video cameras  

USGS Publications Warehouse

We used miniature infrared video cameras to monitor Wood Thrush (Hylocichla mustelina) nests during 1998-2000. We documented nest predators and examined whether evidence at nests can be used to predict predator identities and nest fates. Fifty-six nests were monitored; 26 failed, with 3 abandoned and 23 depredated. We predicted predator class (avian, mammalian, snake) prior to review of video footage and were incorrect 57% of the time. Birds and mammals were underrepresented whereas snakes were over-represented in our predictions. We documented ≥9 nest-predator species, with the southern flying squirrel (Glaucomys volans) taking the most nests (n = 8). During 2000, we predicted fate (fledge or fail) of 27 nests; 23 were classified correctly. Traditional methods of monitoring nests appear to be effective for classifying success or failure of nests, but ineffective at classifying nest predators.

Williams, G.E.; Wood, P.B.

2002-01-01

329

Observational Astronomy Gain of a CCD  

E-print Network

be inside the left-hand end of the cardboard tube on the table. The electronics cart will be set up nearby ... camera, which is at the other end of the cardboard tube. With the flip-mirror in place, look ... this corresponds to the CCD chip. Rotate the cardboard tube until the image of the step wedge fits

Veilleux, Sylvain

330

A video precipitation sensor for imaging and velocimetry of hydrometeors  

NASA Astrophysics Data System (ADS)

A new method to determine the shape and fall velocity of hydrometeors by using a single CCD camera is proposed in this paper, and a prototype of a Video Precipitation Sensor (VPS) is developed. The instrument consists of an optical unit (collimated light source with multi-mode fiber cluster), an imaging unit (planar array CCD sensor), an acquisition and control unit, and a data processing unit. The cylindrical space between the optical unit and imaging unit is the sampling volume (300 mm × 40 mm × 30 mm). As the precipitation particles fall through the sampling volume, the CCD camera exposes twice in a single frame, by which a double-exposure image of the particles can be obtained. The size and shape can be obtained from the images of particles; the fall velocity can be calculated from the particle displacement in the double-exposure image and the interval time; the drop size distribution and velocity distribution, precipitation intensity, and accumulated precipitation amount can be calculated by time integration. The innovation of the VPS is that the shape, size, and velocity of precipitation particles can be measured by only one planar array CCD sensor, which can address the disadvantages of linear-scan CCD disdrometers and impact disdrometers. Field measurements of rainfall demonstrate the VPS's capability to measure micro-physical properties of single particles and integral parameters of precipitation.

Liu, X. C.; Gao, T. C.; Liu, L.

2013-11-01
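The fall-velocity computation implied by the double-exposure scheme above reduces to displacement over interval time. A minimal sketch, in which the pixel scale and exposure interval are illustrative assumptions rather than VPS specifications:

```python
# Fall velocity from a double-exposure image: the particle's displacement
# between the two exposures, converted to metres, divided by the interval.
# Pixel scale and interval are assumed values for illustration.

PIXEL_SCALE_MM = 0.2   # assumed image scale, mm per pixel
INTERVAL_S = 0.002     # assumed time between the two exposures, s

def fall_velocity_m_s(displacement_px: float) -> float:
    """Fall velocity (m/s) from vertical displacement measured in pixels."""
    return displacement_px * PIXEL_SCALE_MM / 1000.0 / INTERVAL_S

# A 40-pixel displacement corresponds to 8 mm in 2 ms, i.e. about 4 m/s
print(fall_velocity_m_s(40.0))
```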

331

A video precipitation sensor for imaging and velocimetry of hydrometeors  

NASA Astrophysics Data System (ADS)

A new method to determine the shape and fall velocity of hydrometeors by using a single CCD camera is proposed in this paper, and a prototype of a video precipitation sensor (VPS) is developed. The instrument consists of an optical unit (collimated light source with multi-mode fibre cluster), an imaging unit (planar array CCD sensor), an acquisition and control unit, and a data processing unit. The cylindrical space between the optical unit and imaging unit is the sampling volume (300 mm × 40 mm × 30 mm). As the precipitation particles fall through the sampling volume, the CCD camera exposes twice in a single frame, which allows a double-exposure image of the particles to be obtained. The size and shape can be obtained from the images of particles; the fall velocity can be calculated from the particle displacement in the double-exposure image and the interval time; the drop size distribution and velocity distribution, precipitation intensity, and accumulated precipitation amount can be calculated by time integration. The innovation of the VPS is that the shape, size, and velocity of precipitation particles can be measured by only one planar array CCD sensor, which can address the disadvantages of a linear-scan CCD disdrometer and an impact disdrometer. Field measurements of rainfall demonstrate the VPS's capability to measure micro-physical properties of single particles and integral parameters of precipitation.

Liu, X. C.; Gao, T. C.; Liu, L.

2014-07-01

332

Two-dimensional restoration of motion-degraded intensified CCD imagery.  

PubMed

A Wiener filter-based deconvolution algorithm is developed to restore vibration-degraded video imagery from an intensified CCD camera. The method is based on the use of azimuth and elevation angular optical line-of-sight data recorded from external sensors to estimate a two-dimensional vibration-blur impulse response on a per frame basis. Flight conditions are reproduced in the laboratory by use of prerecorded in-flight vibration data. The performance of the algorithm varies from frame to frame, following the time-varying characteristics of the vibration-blur impulse response. However, real-time display of the restored video minimizes these effects because of eye integration, and near-full restoration of the original uncorrupted imagery is observed for both high-light- and low-light-level conditions with minimal amplification of noise. PMID:18319749

Barnard, K J; White, C E; Absi, A E

1999-04-01
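The frequency-domain Wiener filter at the heart of the method above can be illustrated in one dimension (the paper works in 2-D, with a per-frame blur impulse response estimated from line-of-sight sensor data; the box-blur kernel and noise-to-signal ratio here are illustrative assumptions):

```python
import numpy as np

def wiener_deconvolve(blurred, kernel, nsr=0.01):
    """Wiener restoration: F_hat = conj(H) / (|H|^2 + NSR) * B in the
    frequency domain, where H is the blur transfer function."""
    n = len(blurred)
    H = np.fft.fft(kernel, n)
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft(G * np.fft.fft(blurred)))

# Blur a step edge with a 5-sample box kernel (circular convolution, a
# crude stand-in for motion blur), then restore it.
signal = np.concatenate([np.zeros(32), np.ones(32)])
kernel = np.ones(5) / 5.0
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel, 64)))
restored = wiener_deconvolve(blurred, kernel)

# The restoration error should be well below the blur error.
print(np.max(np.abs(restored - signal)), np.max(np.abs(blurred - signal)))
```

The NSR term regularizes frequencies where the blur response is weak, which is what keeps noise amplification minimal, as noted in the abstract.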

333

Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic  

NASA Astrophysics Data System (ADS)

Visualization of explosion phenomena is very important and essential to evaluate the performance of explosive effects. The phenomena, however, generate blast waves and fragments from cases. We must protect our visualizing equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in severe blast environment, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to the images taken by the camera with the lens directly coupled to the camera head. It could be confirmed that this system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Kondo, Yasushi

2008-11-01

334

L3CCD results in pure photon counting mode  

E-print Network

Theoretically, L3CCDs are perfect photon counting devices promising high quantum efficiency (~90%) and sub-electron readout noise (σ < 0.1 e-). We discuss how a back-thinned 512x512 frame-transfer L3CCD (CCD97) camera operating in pure photon counting mode would behave based on experimental data. The chip is operated at high electron-multiplication gain, high analog gain and high frame rate. Its performance is compared with a modern photon counting camera (GaAs photocathode, QE ~28%) to see if L3CCD technology, in its current state, could supersede photocathode-based devices.

Olivier Daigle; Jean-Luc Gach; Christian Guillaume; Claude Carignan; Philippe Balard; Olivier Boissin

2004-07-15
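Pure photon counting with an L3CCD amounts to thresholding each pixel so that any value above a cut counts as exactly one photon, trading dynamic range for immunity to multiplication noise. A simulation sketch, with gain, read noise, flux, and threshold values chosen as illustrative assumptions rather than CCD97 figures:

```python
import numpy as np

# Toy photon-counting simulation for an electron-multiplying CCD.
# All numeric parameters below are assumptions for illustration.
rng = np.random.default_rng(1)
EM_GAIN = 1000.0       # mean electron-multiplication gain (assumed)
READ_NOISE_E = 50.0    # output amplifier read noise, electrons (assumed)
THRESHOLD_E = 5 * READ_NOISE_E   # 5-sigma counting threshold

# Faint scene: mean 0.1 photon/pixel (Poisson). The EM register output is
# approximated as exponential per detected photon; read noise is Gaussian.
photons = rng.poisson(0.1, 100_000)
signal = np.where(photons > 0,
                  rng.exponential(EM_GAIN, photons.shape) * photons, 0.0)
frame = signal + rng.normal(0.0, READ_NOISE_E, photons.shape)

# Binary decision: anything above threshold counts as one photon.
detected = frame > THRESHOLD_E
print(f"detected fraction: {detected.mean():.3f}")
```

With a 5-sigma cut, false positives from read noise are negligible, while a fraction of real events falls under the threshold, which is one reason measured counting efficiency trails the nominal QE.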

335

Overview of a hybrid underwater camera system  

NASA Astrophysics Data System (ADS)

The paper provides an overview of a Hybrid Underwater Camera (HUC) system combining sonar with a range-gated laser camera system. The sonar is the BlueView P900-45, operating at 900 kHz with a field of view of 45 degrees and ranging capability of 60 m. The range-gated laser camera system is based on the third generation LUCIE (Laser Underwater Camera Image Enhancer) sensor originally developed by Defence Research and Development Canada. LUCIE uses an eye-safe laser generating 1 ns pulses at a wavelength of 532 nm and at the rate of 25 kHz. An intensified CCD camera operates with a gating mechanism synchronized with the laser pulse. The gate opens to let the camera capture photons from a given range of interest and can be set from a minimum delay of 5 ns with increments of 200 ps. The output of the sensor is a 30 Hz video signal. Automatic ranging is achieved using a sonar altimeter. The BlueView sonar and LUCIE sensors are integrated with an underwater computer that controls the sensor parameters and displays the real-time data for the sonar and the laser camera. As an initial step for data integration, graphics overlays representing the laser camera field-of-view along with the gate position and width are overlaid on the sonar display. The HUC system can be manually handled by a diver and can also be controlled from a surface vessel through an umbilical cord. Recent test data obtained from the HUC system operated in a controlled underwater environment will be presented along with measured performance characteristics.

Church, Philip; Hou, Weilin; Fournier, Georges; Dalgleish, Fraser; Butler, Derek; Pari, Sergio; Jamieson, Michael; Pike, David

2014-05-01
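The gating scheme above implies a simple timing rule: the gate must open after the round-trip travel time of the laser pulse to the range of interest. A sketch of that calculation, where the water refractive index is an assumption and only the 5 ns minimum delay and 200 ps increment come from the text:

```python
# Gate-delay calculation for a range-gated underwater camera: round-trip
# time of light through water to the range of interest. The refractive
# index is an assumed value; 5 ns / 200 ps come from the sensor description.

C_M_S = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33          # assumed refractive index of sea water

def gate_delay_ns(range_m: float) -> float:
    """Round-trip travel time (ns) to a target at range_m through water."""
    return 2.0 * range_m * N_WATER / C_M_S * 1e9

def delay_in_steps(range_m: float, min_ns=5.0, step_ns=0.2) -> float:
    """Quantize the delay to the sensor's minimum delay and 200 ps steps."""
    d = max(gate_delay_ns(range_m), min_ns)
    return min_ns + round((d - min_ns) / step_ns) * step_ns

# e.g. a target at 10 m needs roughly 89 ns of gate delay
print(gate_delay_ns(10.0), delay_in_steps(10.0))
```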

336

Advanced imaging system for high-precision, high-resolution CCD imaging  

Microsoft Academic Search

The Advanced Imaging System is a slow scan, high precision CCD camera system designed specifically for low noise image acquisition and precise, highly flexible CCD testing and characterization. In addition, the system is designed to allow CCD mosaics to be supported with separate, programmable clock voltages and output amplifier operating points for each device. A high speed digital signal processor

Peter E. Doherty; Gary R. Sims

1991-01-01

337

CCD digital radiography system  

NASA Astrophysics Data System (ADS)

Amorphous silicon flat-panel detectors are the mainstream technology used in digital radiography (DR) systems. In recent years, scintillation screens coupled to CCDs have become more popular in hospitals for DR. Compared with traditional amorphous silicon DR, CCD-based DR has better spatial resolution and suffers little radiation damage. It is inexpensive and can be operated easily. In this paper, a CCD-based DR system is developed. We describe the construction of the system, the system performance, and experimental results.

Wang, Yi; Kang, Xi; Li, Yuanjing; Cheng, Jianping; Hou, Yafei; Han, Haiwei

2009-07-01

338

Three-dimensional data capture system for stereo-video images  

NASA Astrophysics Data System (ADS)

This paper describes one of the industrial applications of our digital photogrammetric system VirtuoZo, namely a prototype system to collect 3D data from stereo-video pair sequences along a railroad track for clearance measurements. With the rapid development of digital imaging devices such as charge-coupled-device (CCD) and digital video cameras, stereo image pairs can be captured in a much easier and faster way than with traditional means. Digital photogrammetry can thus now be used in many new applications. However, with the geometry of CCD (or digital video) cameras differing from that of the classic analog metric camera, new relative orientation and epipolar image resampling algorithms have to be developed for these non-metric cameras. An example of such a new application is given in this paper: a series of sequential stereo image pairs was captured by two digital cameras along a railway track from a moving rail platform; relative orientation was then done fully automatically by matching corresponding points in the two stereo scenes using a hierarchical relaxation image matching algorithm. Then, epipolar images are resampled from the original images by means of a relative linear transform, and finally a 3D data collection algorithm provides a user-friendly interface for the human operator for data capture on an SGI workstation under StereoView.

Wu, Xiaoliang; Kubik, Kurt; Maeder, Anthony J.

1995-09-01

339

1 Introduction Surround video  

E-print Network

Surround video: a multihead camera approach. Frank Nielsen, Sony Computer Science ... Given a set of unit cameras designed to be almost aligned at a common nodal point, we first present a versatile process for stitching seamlessly synchronized streams of videos into a single

Nielsen, Frank

340

StartleCam: A Cybernetic Wearable Camera  

Microsoft Academic Search

StartleCam is a wearable video camera, computer, and sensing system, which enables the camera to be controlled via both conscious and preconscious events involving the wearer. Traditionally, a wearer consciously hits record on the video camera, or runs a computer script to trigger the camera according to some pre-specified frequency. The system described here offers an additional option: images

Jennifer Healey; Rosalind W. Picard

1998-01-01

341

A geometric comparison of video camera-captured raster data to vector-parented raster data generated by the X-Y digitizing table  

NASA Technical Reports Server (NTRS)

The relative accuracy of a georeferenced raster data set captured by the Megavision 1024XM system using the Videk Megaplus CCD cameras is compared to a georeferenced raster data set generated from vector lines manually digitized through the ELAS software package on a Summagraphics X-Y digitizer table. The study also investigates the amount of time necessary to fully complete the rasterization of the two data sets, evaluating individual areas such as time necessary to generate raw data, time necessary to edit raw data, time necessary to georeference raw data, and accuracy of georeferencing against a norm. Preliminary results exhibit a high level of agreement between areas of the vector-parented data and areas of the captured file data where sufficient control points were chosen. Maps of 1:20,000 scale were digitized into raster files of 5 meter resolution per pixel, and overall RMS error was estimated at less than eight meters. Such approaches offer time- and labor-saving advantages as well as increasing the efficiency of project scheduling and enabling the digitization of new types of data.

Swalm, C.; Pelletier, R.; Rickman, D.; Gilmore, K.

1989-01-01

342

Liquid-crystal polarization camera  

Microsoft Academic Search

We present a fully automated system which unites CCD camera technology with liquid crystal technology to create a polarization camera capable of sensing the polarization of reflected light from objects at pixel resolution. As polarization affords a more general physical description of light than does intensity, it can therefore provide a richer set of descriptive physical constraints for the understanding

Lawrence B. Wolff; Todd A. Mancini

1992-01-01

343

Event-Driven Random-Access-Windowing CCD Imaging System  

NASA Technical Reports Server (NTRS)

A charge-coupled-device (CCD) based high-speed imaging system, called a realtime, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in High-Frame-Rate CCD Camera Having Subwindow Capability (NPO-30564) NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field programmable gate array (FPGA) controller card.
These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable assembly. The FPGA controller card is connected to the host computer via a standard peripheral component interface (PCI).

Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

2004-01-01

344

Video Surveillance Unit  

SciTech Connect

The Video Surveillance Unit (VSU) has been designed to provide a flexible, easy to operate video surveillance and recording capability for permanent rack-mounted installations. The system consists of a single rack-mountable chassis and a camera enclosure. The chassis contains two 8 mm video recorders, a color monitor, a system controller board, a video authentication verifier module (VAVM) and a universal power supply. A separate camera housing contains a solid state camera and a video authentication processor module (VAPM). Through changes in the firmware of the system, the recorders can be commanded to record at the same time, on alternating time cycles, or sequentially. Each recorder is capable of storing up to 26,000 scenes, each consisting of 6 to 8 video frames. The firmware can be changed to provide fewer recordings with more frames per scene. The modular video authentication system provides verification of the integrity of the video transmission line between the camera and the recording chassis. 5 figs.

Martinez, R.L.; Johnson, C.S.

1990-01-01

345

The Video Book.  

ERIC Educational Resources Information Center

This book provides a comprehensive step-by-step learning guide to video production. It begins with camera equipment, both still and video. It then describes how to reassemble the video and build a final product out of "video blocks," and discusses multiple-source configurations, which are required for professional level productions of live shows.

Clendenin, Bruce

346

Catadioptric Omnidirectional Camera  

Microsoft Academic Search

Conventional video cameras have limited fields of view that make them restrictive in a variety of vision applications. There are several ways to enhance the field of view of an imaging system. However, the entire imaging system must have a single effective viewpoint to enable the generation of pure perspective images from a sensed image. A new camera with a

Shree K. Nayar

1997-01-01

347

Advisory Surveillance Cameras Page 1 of 2  

E-print Network

Advisory - Surveillance Cameras, May 2008, Page 1 of 2: ADVISORY -- USE OF CAMERAS/VIDEO SURVEILLANCE. Excerpted questions: How will the tape be produced and how will it be secured? Who will have access to the tape? At what will the camera ... to ensure the cameras' presence doesn't create a false sense of security?

Liebling, Michael

348

High-speed multicolour photometry with CMOS cameras  

NASA Astrophysics Data System (ADS)

We present the results of testing the commercial digital camera Nikon D90, with a CMOS sensor, for high-speed photometry with a small Celestron 11'' telescope at the Peak Terskol Observatory. The CMOS sensor allows photometry to be performed in three filters simultaneously, which gives a great advantage compared with monochrome CCD detectors. The Bayer BGR colour system of CMOS sensors is close to the Johnson BVR system. The results of testing show that one can carry out photometric measurements with CMOS cameras for stars down to V-magnitude ~14^{m} with a precision of 0.01^{m}. Stars brighter than V-magnitude 10 can be shot at 24 frames per second in the video mode.

Pokhvala, S. M.; Zhilyaev, B. E.; Reshetnyk, V. M.

2012-11-01
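The 0.01^m precision quoted above rests on the standard conversion between measured flux and stellar magnitude (Pogson's relation). A minimal sketch, with illustrative numbers that are not from the record:

```python
import math

def differential_magnitude(flux_target, flux_ref, mag_ref):
    """Magnitude of a target star from its flux ratio to a
    reference star of known magnitude (Pogson's relation)."""
    return mag_ref - 2.5 * math.log10(flux_target / flux_ref)

# A target collecting one tenth the flux of a V = 10.0 reference
# star comes out 2.5 magnitudes fainter.
print(differential_magnitude(100.0, 1000.0, 10.0))  # -> 12.5
```

With a Bayer-filter sensor this computation can be run on the B, G and R planes of the same exposure, which is the simultaneity advantage the abstract describes.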

349

Video Object Tracking and Analysis for Computer Assisted Surgery  

E-print Network

Pedicle screw insertion technique has revolutionized the surgical treatment of spinal fractures and spinal disorders. Although X-ray fluoroscopy based navigation is popular, there is risk of prolonged exposure to X-ray radiation. Systems that have lower radiation risk are generally quite expensive. The position and orientation of the drill is clinically very important in pedicle screw fixation. In this paper, the position and orientation of the marker on the drill is determined using pattern recognition based methods and geometric features obtained from the input video sequence taken from a CCD camera. A search is then performed on the preprocessed video frames to obtain the exact position and orientation of the drill. Animated graphics showing the instantaneous position and orientation of the drill are then overlaid on the processed video for real-time drill control and navigation.

Pallath, Nobert Thomas

2012-01-01

350

From Video Matching to Video Grounding Georgios Evangelidis  

E-print Network

From Video Matching to Video Grounding. Georgios Evangelidis, INRIA Rhône-Alpes, 38330 Montbonnot. Abstract: This paper addresses the background estimation problem for videos captured by moving cameras, referred to as video grounding. It essentially aims at reconstructing a video, as if it would be without ...

Boyer, Edmond

351

Smart Cameras as Embedded Systems  

Microsoft Academic Search

Recent technological advances are enabling a new generation of smart cameras that represent a quantum leap in sophistication. While today's digital cameras capture images, smart cameras capture high-level descriptions of the scene and analyze what they see. These devices could support a wide variety of applications including human and animal detection, surveillance, motion analysis, and facial identification. Video processing has

Wayne Wolf; Burak Ozer; Lv Tiehan

2002-01-01

352

The Video Guide. Second Edition.  

ERIC Educational Resources Information Center

Intended for both novice and experienced users, this guide is designed to inform and entertain the reader in unravelling the jargon surrounding video equipment and in following carefully delineated procedures for its use. Chapters include "Exploring the Video Universe," "A Grand Tour of Video Technology," "The Video System," "The Video Camera," "The ...

Bensinger, Charles

353

Vision Sensors and Cameras  

NASA Astrophysics Data System (ADS)

Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

Hoefflinger, Bernd

354

Application of a Two Camera Video Imaging System to Three-Dimensional Vortex Tracking in the 80- by 120-Foot Wind Tunnel  

NASA Technical Reports Server (NTRS)

A description is presented of two enhancements for a two-camera, video imaging system that increase the accuracy and efficiency of the system when applied to the determination of three-dimensional locations of points along a continuous line. These enhancements increase the utility of the system when extracting quantitative data from surface and off-body flow visualizations. The first enhancement utilizes epipolar geometry to resolve the stereo "correspondence" problem. This is the problem of determining, unambiguously, corresponding points in the stereo images of objects that do not have visible reference points. The second enhancement is a method to automatically identify and trace the core of a vortex in a digital image. This is accomplished by means of an adaptive template matching algorithm. The system was used to determine the trajectory of a vortex generated by the Leading-Edge eXtension (LEX) of a full-scale F/A-18 aircraft tested in the NASA Ames 80- by 120-Foot Wind Tunnel. The system accuracy for resolving the vortex trajectories is estimated to be +/-2 inches over a distance of 60 feet. Stereo images of some of the vortex trajectories are presented. The system was also used to determine the point where the LEX vortex "bursts". The vortex burst point locations are compared with those measured in small-scale tests and in flight and found to be in good agreement.

Meyn, Larry A.; Bennett, Mark S.

1993-01-01
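Once the epipolar correspondence problem above is resolved, recovering a 3D location from two matched image points is routine triangulation. As a minimal sketch, the idealized rectified-stereo case (a simplifying assumption; the actual system uses a general calibrated two-camera geometry, and the numbers below are illustrative):

```python
def stereo_depth(focal_px, baseline, x_left, x_right):
    """Depth of a matched point from a rectified stereo pair:
    Z = f * B / d, where d is the horizontal disparity in pixels,
    f the focal length in pixels, B the camera baseline."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline / disparity

# Cameras 2 ft apart with a 1000 px focal length: a feature seen
# at x = 520 px (left) and x = 500 px (right) lies 100 ft away.
print(stereo_depth(1000.0, 2.0, 520.0, 500.0))  # -> 100.0
```

The sensitivity of Z to a one-pixel disparity error grows with distance, which is why a quoted accuracy (here +/-2 inches over 60 feet) depends on both calibration quality and correspondence accuracy.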

355

Upgrade of ESO's FIERA CCD Controller and PULPO Subsystem  

NASA Astrophysics Data System (ADS)

An overview of FIERA is presented with emphasis on its recent upgrade to PCI. The PCI board hosts two DSPs, one for real-time control of the camera and another for on-the-fly processing of the incoming video data. In addition, the board is able to make DMA transfers, to synchronize with other similar boards, to be synchronized by a TIM bus and to control PULPO via RS232. The design is based on the IOP480 chip from PLX, for which we have developed a device driver for both Solaris and Linux. One computer is able to host more than one board and can therefore control an array of FIERA detector electronics. PULPO is a multifunctional subsystem widely used at ESO for the housekeeping of CCD cryostat heads and for shutter control. The upgrade of PULPO is based on an embedded PC running Linux. The upgraded PULPO is able to handle 29 temperature sensors, control 8 heaters and one shutter, read out one vacuum sensor and log any combination of parameters.

Reyes-Moreno, J.; Geimer, C.; Balestra, A.; Haddad, N.

356

Real-World Interaction with Camera-Phones Michael Rohs  

E-print Network

With the integration of CCD cameras, mobile phones have evolved into networked personal image capture devices. We present a visual code system that turns camera-phones into mobile sensors for 2-dimensional visual codes.

357

The use of video for air pollution source monitoring  

SciTech Connect

The evaluation of air pollution impacts from single industrial emission sources is a complex environmental engineering problem. Recent developments in multimedia technologies used by personal computers have improved the digitizing and processing of digital video sequences. This paper proposes a methodology where statistical analysis of both meteorological and air quality data combined with digital video images is used for monitoring air pollution sources. One of the objectives of this paper is to present the use of image processing algorithms in air pollution source monitoring. Amateur CCD video cameras capture images that are further processed by computer. The use of video as a remote sensing system was implemented with the goal of determining particular parameters, either meteorological or related to air quality monitoring and modeling of point sources. These parameters include the remote calculation of wind direction, wind speed, stack gas outlet velocity, and the stack's effective emission height. The characteristics and behavior of a visible pollutant plume are also studied. Different sequences of relatively simple image processing operations are applied to the images gathered by the different cameras to segment the plume. The algorithms are selected depending on the atmospheric and lighting conditions. The developed system was applied to a 1,000 MW fuel power plant located at Setubal, Portugal. The methodology presented shows that digital video can be an inexpensive way to get useful air pollution related data for monitoring and modeling purposes.

Ferreira, F.; Camara, A.

1999-07-01
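The "relatively simple image processing operations" used to segment the plume are not specified in the record; one common building block for such sequences is background subtraction followed by thresholding. A minimal sketch under that assumption, on grayscale frames represented as nested lists:

```python
def segment_plume(frame, background, threshold=30):
    """Crude plume mask: mark pixels whose grayscale value departs
    from a clear-sky background frame by more than a threshold.
    (Illustrative stand-in for the paper's unspecified operations.)"""
    return [
        [abs(p - b) > threshold for p, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

background = [[100, 100, 100], [100, 100, 100]]
frame      = [[100, 160, 150], [100, 100, 145]]
print(segment_plume(frame, background))
# plume pixels come out True, clear-sky pixels False
```

In practice the threshold would be tuned per lighting condition, matching the abstract's note that the algorithms are selected depending on atmospheric and lighting conditions.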

358

Cone penetrometer deployed in situ video microscope for characterizing sub-surface soil properties  

SciTech Connect

In this paper we report on the development and field testing of an in situ video microscope that has been integrated with a cone penetrometer probe in order to provide a real-time method for characterizing subsurface soil properties. The video microscope system consists of a miniature CCD color camera coupled with appropriate magnification and focusing optics to provide a field of view with a coverage of approximately 20 mm. The camera/optics system is mounted in a cone penetrometer probe so that the camera views the soil that is in contact with a sapphire window mounted on the side of the probe. The soil outside the window is illuminated by diffuse light provided through the window by an optical fiber illumination system connected to a white light source at the surface. The video signal from the camera is returned to the surface where it can be displayed in real time on a video monitor, recorded on a video cassette recorder (VCR), and/or captured digitally with a frame grabber installed in a microcomputer system. In its highest resolution configuration, the in situ camera system has demonstrated a capability to resolve particle sizes as small as 10 μm. By using other lens systems to increase the magnification factor, smaller particles could be resolved; however, the field of view would be reduced. Initial field tests have demonstrated the ability of the camera system to provide real-time qualitative characterization of soil particle sizes. In situ video images also reveal information on the porosity of the soil matrix and the presence of water in the saturated zone. Current efforts are focused on the development of automated image processing techniques as a means of extracting quantitative information on soil particle size distributions. Data will be presented comparing particle size distributions derived from digital images with conventional sieve/hydrometer analyses.

Lieberman, S.H.; Knowles, D.S. [Naval Command, San Diego, CA (United States)]; Kertesz, J. [San Diego State Univ. Foundation, CA (United States)]; and others

1997-12-31
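The trade-off the abstract describes between magnification and field of view comes down to the object-plane size of one pixel. A minimal sketch; the horizontal pixel count is an assumption for illustration, not a figure from the record:

```python
def pixel_scale_um(field_of_view_mm, sensor_pixels):
    """Object-plane size of one pixel in microns, given the field
    of view imaged across the sensor. The pixel count passed in
    is hypothetical; the record does not state it."""
    return field_of_view_mm * 1000.0 / sensor_pixels

# A 20 mm field of view spread over an assumed 768 horizontal
# pixels gives ~26 microns per pixel, so resolving 10 um
# particles requires the higher-magnification, narrower-field
# configuration mentioned in the abstract.
print(round(pixel_scale_um(20.0, 768), 1))  # -> 26.0
```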

359

A satellite-use aimed CCD Panchromatic and Multispectral Remote Sensing System  

Microsoft Academic Search

The development of the CCD Panchromatic and Multispectral Remote Sensing System (CPMRS) was a part of the preliminary research work for the program of the Operational Earth Resource Satellite of China (COERS). This paper presents the performance, the basic configuration and design of CPMRS, which includes a CCD panchromatic and multispectral camera, a 36 Mb/s data transmission subsystem as well as ...

Shiping Chen; Kexiang Lin; Mingyuan Wang; Zongwei Zhu

1990-01-01

360

The DSLR Camera  

NASA Astrophysics Data System (ADS)

Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex cameras (DSLRs) have appeared. As a consequence we can buy cameras of higher and higher pixel number, and mass production has resulted in a great reduction of prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household; most of these cameras are DSLR ones. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these for their own circumstances.

Berk, Ernő; Argyle, R. W.

361

Design of Rail Surface Crack-detecting System Based on Linear CCD Sensor  

Microsoft Academic Search

The rail surface crack-detecting system was designed to reduce railway accidents due to rail cracks. The system adopts the linear charge coupled device (CCD) TCD1208AP as image sensor, uses the high-speed flash A/D converter AD7821 to collect CCD output video signals, and uses a CPLD to implement the CCD timing generator, A/D converter timing generator, data storage and other control logic. A DSP then executes the ...

Qiao Jian-hua; Li Lin-sheng; Zhang Jing-gang

2008-01-01

362

Video-Augmented Environments  

E-print Network

Video-Augmented Environments. A dissertation submitted for the degree of Doctor of Philosophy. ... Video cameras are ideally suited to many real-world monitoring applications; for this reason ... deployment in the home and office economically viable. The use of video as an input device also allows ...

Haddadi, Hamed

363

The Digital Interactive Video  

E-print Network

The Digital Interactive Video Exploration and Reflection (Diver) system lets users create virtual pathways through existing video content using a virtual camera and an annotation window for commentary, repurposing, and discussion. With the inexorable growth of low-cost consumer video electronics ...

Paris-Sud XI, Université de

364

Study of CCD eyepiece on T-4 theodolite  

Microsoft Academic Search

This document describes the effort of the University of Maryland to develop a Charge Coupled Device (CCD) Camera System, with the necessary support hardware and analysis software, to act as an impersonal electronic eyepiece on the T-4 theodolite for astronomical longitude and latitude determinations. This report will describe the concept, the implementation, and the current status of this project. Analysis

D. G. Currie

1982-01-01

365

Characterization of the Series 1000 Camera System  

SciTech Connect

The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

Kimbrough, J; Moody, J; Bell, P; Landen, O

2004-04-07
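The 70 dB dynamic range quoted above is, by the usual convention, the ratio of full-well capacity to read noise expressed logarithmically. A minimal sketch; the full-well value below is an assumption chosen to be consistent with the quoted 70 dB and 14 e- read noise, not a figure from the record:

```python
import math

def dynamic_range_db(full_well_e, read_noise_e):
    """CCD dynamic range in dB: 20*log10 of the ratio of
    full-well capacity to read noise, both in electrons."""
    return 20.0 * math.log10(full_well_e / read_noise_e)

# With the quoted 14 e- read noise, an assumed ~44,000 e- full
# well reproduces roughly the 70 dB figure reported.
print(round(dynamic_range_db(44000, 14)))  # -> 70
```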

366

Characterization of the series 1000 camera system  

SciTech Connect

The National Ignition Facility requires a compact network addressable scientific grade charge coupled device (CCD) camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, four analog outputs, and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

Kimbrough, J.R.; Moody, J.D.; Bell, P.M.; Landen, O.L. [Lawrence Livermore National Laboratory, Livermore, California 94551-0808 (United States)

2004-10-01

367

Snapshot video: everyday photographers taking short video-clips  

Microsoft Academic Search

Camera phones and consumer digital cameras number hundreds of millions worldwide and most of them have the ability to take video in addition to photographs. Public discussions, marketing, and academic research often emphasize the new and innovative ways in which people use their ubiquitous digital cameras, especially camera phones, in combination with the Internet. In this paper we present our

Asko Lehmuskallio; Risto Sarvas

2008-01-01

368

The Dark Energy Camera (DECam)  

Microsoft Academic Search

We describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), ...

D. L. DePoy; T. Abbott; J. Annis; M. Antonik; M. Barcel; R. Bernstein; B. Bigelow; D. Brooks; E. Buckley-Geer; J. Campa; L. Cardiel; F. Castander; J. Castilla; H. Cease; S. Chappa; E. Dede; G. Derylo; H. T. Diehl; P. Doel; J. DeVicente; J. Estrada; D. Finley; B. Flaugher; E. Gaztanaga; D. Gerdes; M. Gladders; V. Guarino; G. Gutierrez; M. Haney; S. Holland; K. Honscheid; D. Huffman; I. Karliner; D. Kau; S. Kent; M. Kozlovsky; D. Kubik; K. Kuehn; S. Kuhlmann; K. Kuk; F. Leger; H. Lin; G. Martinez; M. Martinez; W. Merritt; J. Mohr; P. Moore; T. Moore; B. Nord; R. Ogando; J. Olsen; B. Onal; J. Peoples; T. Qian; N. Roe; E. Sanchez; V. Scarpine; R. Schmidt; R. Schmitt; M. Schubnell; K. Schultz; M. Selen; T. Shaw; V. Simaitis; J. Slaughter; C. Smith; H. Spinka; A. Stefanik; W. Stuermer; R. Talaga; G. Tarle; J. Thaler; D. Tucker; A. Walker; S. Worswick; A. Zhao

2008-01-01

369

CCD high-speed videography system with new concepts and techniques  

NASA Astrophysics Data System (ADS)

A novel CCD high-speed videography system with brand-new concepts and techniques was recently developed by Zhejiang University. The system sends a series of short flash pulses to the moving object. All of the parameters, such as flash number, flash durations, flash intervals, flash intensities and flash colors, can be controlled as needed by the computer. A series of moving-object images frozen by the flash pulses, carrying information about the moving object, is recorded by a CCD video camera, and the resulting images are sent to a computer to be frozen, recognized and processed with special hardware and software. Obtained parameters can be displayed, output as remote control signals or written to CD. The highest videography frequency is 30,000 images per second. The shortest image freezing time is several microseconds. The system has been applied to wide fields of energy, chemistry, medicine, biological engineering, aerodynamics, explosion, multi-phase flow, mechanics, vibration, athletic training, weapon development and national defense engineering. It can also be used on production lines to carry out online, real-time monitoring and control.

Zheng, Zengrong; Zhao, Wenyi; Wu, Zhiqiang

1997-05-01
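The flash-pulse timing implied by the quoted maximum rate is simple arithmetic worth making explicit:

```python
def flash_interval_us(images_per_second):
    """Interval between successive flash pulses, in microseconds,
    for a given videography rate."""
    return 1e6 / images_per_second

# At the system's highest rate of 30,000 images per second the
# flashes must be spaced about 33 microseconds apart, consistent
# with the several-microsecond image freezing times quoted.
print(round(flash_interval_us(30000), 1))  # -> 33.3
```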

370

MoViMash: Online Mobile Video Mashup Mukesh Saini  

E-print Network

Director, Video Mashup. 1. INTRODUCTION: Worldwide shipments of camera phones were estimated to reach 1 ... of mobile video cameras, it is becoming easier for users to capture videos of live performances ... mobility, each video camera is able to capture only from a range of restricted viewing angles and distances.

Ooi, Wei Tsang

371

Distributing digital video to multiple computers  

PubMed Central

Video is an effective teaching tool, and live video microscopy is especially helpful in teaching dissection techniques and the anatomy of small neural structures. Digital video equipment is more affordable now and allows easy conversion from older analog video devices. I here describe a simple technique for bringing digital video from one camera to all of the computers in a single room. This technique allows students to view and record the video from a single camera on a microscope. PMID:23493464

Murray, James A.

2004-01-01

372

A CCD search for geosynchronous debris  

NASA Technical Reports Server (NTRS)

Using the Spacewatch Camera, a search was conducted for objects in geosynchronous earth orbit. The system is equipped with a CCD camera cooled with dry ice; the image scale is 1.344 arcsec/pixel. The telescope drive was turned off so that during integrations the stars trailed while geostationary objects appeared as round images. The technique should detect geostationary objects to a limiting apparent visual magnitude of 19. A sky area of 8.8 square degrees was searched for geostationary objects, while the area searched for geosynchronous debris passing through was 16.4 square degrees. Ten objects were found, of which seven are probably geostationary satellites having apparent visual magnitudes brighter than 13.1. Three objects having magnitudes equal to or fainter than 13.7 showed motion in the north-south direction. The absence of fainter stationary objects suggests that a gap in debris size exists between satellites and particles having diameters in the millimeter range.

Gehrels, Tom; Vilas, Faith

1986-01-01

373

Classical astrometry longitude and latitude determination by using CCD technique  

NASA Astrophysics Data System (ADS)

At the AOB, it is the zenith-telescope (D=11 cm, F=128.7 cm, denoted by BLZ in the list of the Bureau International de l'Heure - BIH), and at Punta Indio (near La Plata) it is the photographic zenith tube (D=20 cm, F=457.7 cm, denoted by PIP in the list of BIH). At the AOB there is a CCD camera ST-8 of Santa Barbara Instrument Group (SBIG) with 1530x1020 pixels, 9x9 micron pixel size and 13.8x9.2 mm array dimensions. We did some investigations about the possibilities for longitude (λ) and latitude (φ) determinations by using the ST-8 with BLZ and PIP, and our predicted level of accuracy is a few 0."01 from one CCD processing of zenith stars with the Tycho-2 Catalogue. Also, astro-geodesy has got new practicability with CCDs (to reach a good accuracy of geoid determination via astro-geodetic λ and φ observations). At the TU Wien there is the CCD MX916 of Starlight Xpress (with 752x580 pixels, 11x12 micron pixels, 8.7x6.5 mm active area). Our predicted level of accuracy for λ and φ measurements is a few 0."1 from one CCD MX916 processing of zenith stars, with small optics (20 cm focal length, for a mobile rather than fixed instrument) and Tycho-2. A transportable zenith camera with CCD is under development at the TU Wien for astro-geodesy subjects.

Damljanović, G.; de Biasi, M. S.; Gerstbach, G.

374

CCD technique for longitude/latitude astronomy  

NASA Astrophysics Data System (ADS)

We report on CCD (Charge Coupled Device) experiments with instruments of astrometry and geodesy for longitude and latitude determinations. At the Techn. University Vienna (TU Vienna), a mobile zenith camera "G1" was developed, based on the CCD MX916 (Starlight Xpress) and F=20 cm photo optics. With the Hipparcos/Tycho Catalogue, the first results show accuracy up to 0."5 for latitude/longitude. The PC-guided observations can be completed within 10 minutes. The camera G1 (near 4 kg) is used for astrogeodesy (geoid, Earth's crust, etc.). At the Belgrade Astronomical Observatory (AOB), the accuracy of the (mean value of) latitude/longitude determinations can be a few 0."01 using zenith stars, the Tycho-2 Catalogue and an ST-8 of SBIG (Santa Barbara Instrument Group) with the zenith-telescope BLZ (D=11 cm, F=128.7 cm). The same equipment with the PIP instrument (D=20 cm and F=457.7 cm, Punta Indio PZT, near La Plata) yields slightly better accuracy than BLZ's. Both instruments, BLZ and PIP, were in the list of the Bureau International de l'Heure - BIH. The mentioned instruments offer good possibilities for semi- or fully-automatic observations.

Damljanović, G.; Gerstbach, G.; de Biasi, M. S.; Pejović, N.

2003-10-01

375

Solid State Television Camera (CID)  

NASA Technical Reports Server (NTRS)

The design, development and testing of a charge injection device (CID) camera using a 244x248 element array are described. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

Steele, D. W.; Green, W. T.

1976-01-01

376

Video flowmeter  

DOEpatents

A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid.

Lord, D.E.; Carter, G.W.; Petrini, R.R.

1981-06-10

377

Video flowmeter  

DOEpatents

A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid. 4 figs.

Lord, D.E.; Carter, G.W.; Petrini, R.R.

1983-08-02

378

Video flowmeter  

DOEpatents

A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid (10) containing entrained particles (12) is formed and positioned by a rod optic lens assembly (31) on the raster area of a low-light level television camera (20). The particles (12) are illuminated by light transmitted through a bundle of glass fibers (32) surrounding the rod optic lens assembly (31). Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen (40). The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid (10).

Lord, David E. (Livermore, CA); Carter, Gary W. (Livermore, CA); Petrini, Richard R. (Livermore, CA)

1983-01-01

379

Development and use of an L3CCD high-cadence imaging system for Optical Astronomy  

NASA Astrophysics Data System (ADS)

A high cadence imaging system, based on a Low Light Level CCD (L3CCD) camera, has been developed for photometric and polarimetric applications. The camera system is an iXon DV-887 from Andor Technology, which uses a CCD97 L3CCD detector from E2V Technologies. This is a back illuminated device, giving it an extended blue response, and it has an active area of 512x512 pixels. The camera system allows frame rates ranging from 30 fps (full frame) to 425 fps (windowed & binned frame). We outline the system design, concentrating on the calibration and control of the L3CCD camera. The L3CCD detector can be either triggered directly by a GPS timeserver/frequency generator or internally triggered. A central PC remotely controls the camera computer system and timeserver. The data are saved as standard `FITS' files. The large data loads associated with high frame rates lead to issues with gathering and storing the data effectively. To overcome such problems, a specific data management approach is used, and a Python/PYRAF data reduction pipeline was written for the Linux environment. This uses calibration data collected either on-site or from lab based measurements, and enables a fast and reliable method for reducing images. To date, the system has been used twice on the 1.5 m Cassini Telescope in Loiano (Italy); we present the reduction methods and observations made.

Sheehan, Brendan J.; Butler, Raymond F.

2008-02-01
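The record does not detail the pipeline's steps, but CCD reduction pipelines of this kind conventionally include bias subtraction and flat-field division. A minimal illustrative sketch of that standard step (not the actual Python/PYRAF pipeline), on frames represented as nested lists:

```python
def reduce_frame(raw, bias, flat):
    """Standard CCD reduction step: subtract the bias frame,
    then divide by the flat field normalized to its mean."""
    flat_mean = sum(sum(row) for row in flat) / (len(flat) * len(flat[0]))
    return [
        [(r - b) / (f / flat_mean) for r, b, f in zip(rr, br, fr)]
        for rr, br, fr in zip(raw, bias, flat)
    ]

raw  = [[110.0, 210.0], [110.0, 210.0]]
bias = [[10.0, 10.0], [10.0, 10.0]]
flat = [[1.0, 1.0], [1.0, 1.0]]
print(reduce_frame(raw, bias, flat))  # -> [[100.0, 200.0], [100.0, 200.0]]
```

At hundreds of frames per second, applying even this simple step to every frame is what drives the data management concerns the abstract mentions.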

380

BLAST Autonomous Daytime Star Cameras  

E-print Network

We have developed two redundant daytime star cameras to provide the fine pointing solution for the balloon-borne submillimeter telescope BLAST. The cameras are capable of providing a reconstructed pointing solution with an absolute accuracy ... Each camera combines a 1 megapixel CCD with a 200 mm f/2 lens to image a 2 degree x 2.5 degree field of the sky. The instruments are autonomous. An internal computer controls the temperature, adjusts the focus, and determines a real-time pointing solution at 1 Hz. The mechanical details and flight performance of these instruments are presented.

Marie Rex; Edward Chapin; Mark J. Devlin; Joshua Gundersen; Jeff Klein; Enzo Pascale; Donald Wiebe

2006-05-01

381

Three-dimensional motion of an object determined by an image sequence of a video theodolite  

NASA Astrophysics Data System (ADS)

To get the position and orientation of a moving object, single images of the object, supplied with fixed markings, are taken by a CCD camera. The camera is included in a video theodolite to obtain a nearly unlimited field of view and to obtain the exterior orientation each time. The trajectory of the object can be determined from an image sequence. This paper describes the mathematical formulation for the determination of the orientation and position of the object. The problem is based on an inverse formulation of resection in space, as the orientation and position of the camera are known. Furthermore, an image processing algorithm is described to extract and match the control points within the image. The results of a test measurement of a moving object on rails are shown together with the accuracies and measuring frequencies achieved.

Heck, U.

1994-03-01

382

Electronic Still Camera  

NASA Technical Reports Server (NTRS)

A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
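The eight-bit conversion step at the end of that analog chain can be illustrated with a toy quantizer. This is a sketch of the general principle (a linear 0-255 mapping with rail clipping), not the patent's converter design, and the 0-1 V input range is an assumption:

```python
def adc8(voltage, v_min=0.0, v_max=1.0):
    """Quantize an analog voltage into an 8-bit code (0-255), clipping at the rails."""
    v = min(max(voltage, v_min), v_max)          # clip to the converter's input range
    return int(round((v - v_min) / (v_max - v_min) * 255))

print(adc8(0.0), adc8(0.5), adc8(1.0), adc8(1.2))  # 0 128 255 255
```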

Holland, S. Douglas (inventor)

1992-01-01

383

Cameras for digital microscopy.  

PubMed

This chapter reviews the fundamental characteristics of charge-coupled devices (CCDs) and related detectors, outlines the relevant parameters for their use in microscopy, and considers promising recent developments in detector technology. Electronic imaging with a CCD involves three stages: interaction of a photon with the photosensitive surface, storage of the liberated charge, and readout or measurement of the stored charge. The most demanding applications in fluorescence microscopy may require as much as four orders of magnitude greater sensitivity. The image in the present-day light microscope is usually acquired with a CCD camera. The CCD is composed of a large matrix of photosensitive elements (often referred to as "pixels," shorthand for picture elements) which simultaneously capture an image over the entire detector surface. The light-intensity information for each pixel is stored as electronic charge and is converted to an analog voltage by a readout amplifier. This analog voltage is subsequently converted to a numerical value by a digitizer situated on the CCD chip, or very close to it. In complementary metal oxide semiconductor (CMOS) sensors, several (three to six) amplifiers are required for each pixel, and to date, uniform images with a homogeneous background have been a problem because of the inherent difficulties of balancing the gain in all of the amplifiers. CMOS sensors also exhibit relatively high noise associated with the requisite high-speed switching. Both of these deficiencies are being addressed, and sensor performance is nearing that required for scientific imaging. PMID:23931507
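The three stages named above (photon interaction, charge storage, readout) can be mimicked in a toy NumPy model. All numeric parameters here — quantum efficiency, full-well depth, read noise, gain — are illustrative assumptions, not values from the chapter:

```python
import numpy as np

rng = np.random.default_rng(42)

def expose_and_read(photons, qe=0.6, full_well=30000, read_noise=5.0, gain=2.0):
    """Toy model of the three CCD stages: photon interaction, charge storage, readout."""
    electrons = rng.poisson(photons * qe)        # stage 1: photoelectrons (shot noise)
    stored = np.minimum(electrons, full_well)    # stage 2: charge storage, well limit
    analog = stored + rng.normal(0.0, read_noise, stored.shape)  # stage 3: noisy readout
    return analog / gain                         # digitizer output in ADU

frame = expose_and_read(np.full((64, 64), 1000.0))
print(round(frame.mean()))  # ~300 ADU: 1000 photons * 0.6 QE / gain of 2
```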

Spring, Kenneth R

2013-01-01

384

Back-illuminated CCD imagers for high-information-content digital photography  

Microsoft Academic Search

The advantages of digital photography are well documented, and digital photography is seeing increased use in demanding photography applications. Of the many implementations of digital cameras, the three-CCD camera provides the optimal resolution, temporal sampling, and color reproduction, which when examined together form the information content of the sensor - a physical measure of the detector's imaging performance. So that

George M. Williams; Harry Marsh; Michael Hinds

1998-01-01

385

Camera Obscura  

NSDL National Science Digital Library

Before photography was invented there was the camera obscura, useful for studying the sun, as an aid to artists, and for general entertainment. What is a camera obscura and how does it work? Camera = Latin for room; Obscura = Latin for dark. But what is a Camera Obscura? The Magic Mirror of Life. Read the first three paragraphs of this article. Under the portion Early Observations and Use in Astronomy you will find the answers to the ...

Engelman, Mr.

2008-10-28

386

System for control of cooled CCD and image data processing for plasma spectroscopy  

SciTech Connect

A spectroscopic measurement system with spatial resolution is important for plasma study. This is especially true for measurement of a plasma without axial symmetry like the LHD-plasma. Several years ago, we developed an imaging spectroscopy system using a CCD camera and an image-memory board of a personal computer. It was very powerful for studying plasma-gas interaction phenomena. In that system, however, an ordinary CCD was used, so the dark-current noise of the CCD prevented measurement of faint spectral lines. Recently, cooled CCD systems have become available for high-sensitivity measurement, but such systems are still very expensive. The cooled CCD itself, as an element, can be purchased cheaply, because amateur astronomers have begun to use it to take pictures of heavenly bodies. So we developed an imaging spectroscopy system using such a cheap cooled CCD for plasma experiments.

Mimura, M.; Kakeda, T.; Inoko, A. [Osaka City Univ. (Japan)] [and others]

1995-12-31

387

USB Security Camera Software for Linux  

Microsoft Academic Search

USB security cameras have been developed in the security field; however, current video surveillance systems are too expensive for widespread use. This paper proposes a new method in which the software is developed on a Linux system, with a USB camera for video capture. Network communication is realized using the TCP/IP protocol. The system embeds a web server so users can access resources by browser

J. Weerachai; P. Siam; K. Narawith

2011-01-01

388

World's fastest and most sensitive astronomical camera  

NASA Astrophysics Data System (ADS)

The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240 pixel images with the world's fastest high precision faint light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). (ESO PR Photo 22a/09: the CCD220 detector. ESO PR Photo 22b/09: the OCam camera. ESO PR Video 22a/09: OCam images.) "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is needed as an essential component for the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets, but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second.
The new generation instruments require these corrections to be done at an even higher rate, more than one thousand times a second, and this is where OCam is essential. "The quality of the adaptive optics correction strongly depends on the speed of the camera and on its sensitivity," says Philippe Feautrier from the LAOG, France, who coordinated the whole project. "But these are a priori contradictory requirements, as in general the faster a camera is, the less sensitive it is." This is why cameras normally used for very high frame-rate movies require extremely powerful illumination, which is of course not an option for astronomical cameras. OCam and its CCD220 detector, developed by the British manufacturer e2v technologies, solve this dilemma, by being not only the fastest available, but also very sensitive, making a significant jump in performance for such cameras. Because of imperfect operation of any physical electronic devices, a CCD camera suffers from so-called readout noise. OCam has a readout noise ten times smaller than the detectors currently used on the VLT, making it much more sensitive and able to take pictures of the faintest of sources. "Thanks to this technology, all the new generation instruments of ESO's Very Large Telescope will be able to produce the best possible images, with an unequalled sharpness," declares Jean-Luc Gach, from the Laboratoire d'Astrophysique de Marseille, France, who led the team that built the camera. "Plans are now underway to develop the adaptive optics detectors required for ESO's planned 42-metre European Extremely Large Telescope, together with our research partners and the industry," says Hubin. Using sensitive detectors developed in the UK, with a control system developed in France, with German and Spanish participation, OCam is truly an outcome of a European collaboration that will be widely used and commercially produced. 
More information The three French laboratories involved are the Laboratoire d'Astrophysique de Marseille (LAM/INSU/CNRS, Université de Provence; Observatoire Astronomique de Marseille Prov

2009-06-01

389

First Carlsberg Meridian Telescope (CMT) CCD Catalogue.  

NASA Astrophysics Data System (ADS)

The Carlsberg Meridian Telescope (CMT) is a telescope owned by Copenhagen University Observatory (CUO). It was installed in the Spanish observatory of El Roque de los Muchachos on the island of La Palma (Canary Islands) in 1984. It is operated jointly by the CUO, the Institute of Astronomy, Cambridge (IoA) and the Real Instituto y Observatorio de la Armada of Spain (ROA) in the framework of an international agreement. From 1984 to 1998 the instrument was provided with a moving slit micrometer, and with its observations a series of 11 catalogues was published, `Carlsberg Meridian Catalogue La Palma (CMC No 1-11)'. Since 1997, the telescope has been controlled remotely via the Internet. The three institutions share this remote control in periods of approximately three months. In 1998, the CMT was upgraded by installing as sensor a commercial SpectraSource CCD camera, as a test of the possibility of performing meridian transit observations in drift-scan mode. Once this was shown possible, a second CCD camera, built in the CUO workshop with better performance, was installed in 1999. The SpectraSource camera was loaned to ROA by CUO and is now installed in the San Fernando Automatic Meridian Circle in San Juan (CMASF). In 1999, observations were started of a sky survey from -3deg to +30deg in declination. In July 2002, a first release of the survey was published, with the positions of the observed stars in the band between -3deg and +3deg in declination. This oral communication will present this first release of the survey.

Belizón, F.; Muiños, J. L.; Vallejo, M.; Evans, D. W.; Irwin, M.; Helmer, L.

2003-11-01

390

Mars Science Laboratory Engineering Cameras  

NASA Technical Reports Server (NTRS)

NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
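As a quick plausibility check on the quoted optics figures, dividing each field of view by the 1024-pixel detector width gives a mean pixel scale close to the stated values; the small discrepancies are expected because real lenses are not distortion-free, an effect this naive formula ignores:

```python
import math

def pixel_scale_mrad(fov_deg, n_pixels):
    """Mean pixel scale in mrad/pixel across a square FOV, ignoring lens distortion."""
    return math.radians(fov_deg) * 1000.0 / n_pixels

navcam = pixel_scale_mrad(45, 1024)    # ~0.77, vs. the quoted 0.82 mrad/pixel
hazcam = pixel_scale_mrad(124, 1024)   # ~2.11, vs. the quoted 2.1 mrad/pixel
print(round(navcam, 2), round(hazcam, 2))
```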

Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

2012-01-01

391

Chip design of linear CCD drive pulse generator and control interface  

NASA Astrophysics Data System (ADS)

CCD noises and their causes are analyzed. Methods to control these noises, such as Correlated Double Sampling (CDS), filtering, cooling, clamping, and calibration, are proposed. To improve the CCD sensor's performance, an IC called the Analog Front End (AFE), integrating CDS, clamping, a Programmable Gain Amplifier (PGA), offset correction, and an ADC, and thereby able to perform both CDS and analog-to-digital conversion, is employed to process the output signal of the CCD. Based on these noise control approaches, the idea of a chip design for a linear CCD drive pulse generator and control interface is introduced. The chip plays the role of (1) drive pulse generator, for both the CCD and the AFE, and (2) interface, helping to analyze and transfer control commands and status information between the MCU controller and the drive pulse generator, or between the global control unit in the chip and the CCD/AFE. There are six function blocks in the chip: clock generator for the CCD and AFE, MCU interface, AFE serial interface, output interface, CCD antiblooming parameter register, and global control logic unit. These functions are implemented in a CPLD chip, Xilinx XC2C256-6-VQ100, with 20 MHz pixel frequency and 16-bit resolution. This chip with the AFE can largely eliminate CCD noise and improve the SNR of the CCD camera. Finally, the design result is presented.
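The CDS principle the AFE implements — sample the reset level, sample the video level, and subtract — can be sketched numerically. This is a toy statistical model with assumed noise figures, not the chip's circuit:

```python
import numpy as np

rng = np.random.default_rng(1)

def cds_read(signal, n_reads=10000, reset_noise=30.0, amp_noise=2.0):
    """Correlated double sampling: the reset (kTC) offset is common to both
    samples of a read, so subtracting them cancels it entirely."""
    reset = rng.normal(0.0, reset_noise, n_reads)               # random reset level per read
    ref = reset + rng.normal(0.0, amp_noise, n_reads)           # sample 1: reference level
    sig = reset + signal + rng.normal(0.0, amp_noise, n_reads)  # sample 2: video level
    return sig - ref

reads = cds_read(500.0)
print(round(reads.mean()), round(reads.std(), 1))  # ~500, ~2.8: reset noise is gone
```

The residual spread is only the amplifier noise (times sqrt(2), since two samples are taken), far below the 30-unit reset noise that would otherwise dominate.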

Cai, Rongtai; Sun, Honghai; Wang, Yanjie

2006-02-01

392

ProbeSight: Video Cameras on an Ultrasound Probe for Computer Vision of the Patient's Exterior (Galeotti, et al., http://www.ncigt.org/pages/IGT_Workshop_2011)

E-print Network

Medical ultrasound typically deals with the interior of the patient, with the exterior left to that original medical imaging modality, direct human vision

Stetten, George

393

Fully depleted back illuminated CCD  

DOEpatents

A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.
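The bias "at least sufficient to substantially fully deplete the substrate" can be estimated with the standard one-sided junction formula. The doping density and thickness below are illustrative values for high-resistivity silicon, not figures from the patent:

```python
# One-sided junction estimate: V_dep = q * N * d^2 / (2 * eps_Si)
Q = 1.602e-19              # elementary charge [C]
EPS_SI = 11.7 * 8.854e-12  # permittivity of silicon [F/m]

def full_depletion_voltage(doping_per_m3, thickness_m):
    """Voltage needed to deplete a uniformly doped substrate of given thickness."""
    return Q * doping_per_m3 * thickness_m**2 / (2.0 * EPS_SI)

v = full_depletion_voltage(1e18, 300e-6)  # ~1e12 cm^-3 substrate, 300 um thick
print(round(v, 1))  # ~70 V of backside bias to fully deplete
```

Lower doping (higher resistivity) shrinks this voltage quadratically in thickness, which is why a high-resistivity substrate is what makes a thick, fully depleted device practical.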

Holland, Stephen Edward (Hercules, CA)

2001-01-01

394

Automated video tracking of contact lens motion  

NASA Astrophysics Data System (ADS)

Successful extended contact lens wear requires lens motion that provides adequate tear mixing to remove ocular debris. Proper lens motion of rigid contact lenses is also important for proper fitting. Moreover, a factor in final lens comfort and optical quality for contact lens fitting is lens centration. Calculation of the post lens volume of rigid contact lenses at different corneal surface locations can be used to produce a volume map. Such maps often reveal channels of minimum volume in which lenses may be expected to move, or local minima, where lenses may be expected to settle. To evaluate the utility of our volume map technology and evaluate other models of contact lens performance we have developed an automated video-based lens tracking system that provides detailed information about lens translation and rotation. The system uses standard video capture technology with a CCD camera attached to an ophthalmic slit lamp biomicroscope. The subject wears a specially marked contact lens for tracking purposes. Several seconds of video data are collected in real-time as the patient blinks naturally. The data are processed off-line, with the experimenter providing initial location estimates of the pupil and lens marks. The technique provides a fast and accurate method of quantifying lens motion. With better contact lens motion information we will gain a better understanding of the relationships between corneal shapes, lens design parameters, tear mixing, and patient comfort.

Carney, Thom; Dastmalchi, Shahram

2000-05-01

395

Video Visualization Gareth Daniel Min Chen  

E-print Network

Vast amounts of video data are generated by the entertainment industry, security and traffic cameras, and video conferencing systems. In some countries, such as the United Kingdom, it is estimated that on average a citizen is caught on security and traffic cameras 300 times a day. A concern in the security industry is the ratio of surveillance cameras to security personnel. Imagine that security

Grant, P. W.

396

CCD imager with photodetector bias introduced via the CCD register  

NASA Technical Reports Server (NTRS)

An infrared charge-coupled-device (IR-CCD) imager uses an array of Schottky-barrier diodes (SBD's) as photosensing elements and uses a charge-coupled-device (CCD) for arranging charge samples supplied in parallel from the array of SBD's into a succession of serially supplied output signal samples. Its sensitivity to infrared (IR) is improved by placing bias charges on the Schottky barrier diodes. Bias charges are transported to the Schottky barrier diodes by a CCD also used for charge sample read-out.

Kosonocky, Walter F. (Inventor)

1986-01-01

397

Video monitoring system for car seat  

NASA Technical Reports Server (NTRS)

A video monitoring system for use with a child car seat has video camera(s) mounted in the car seat. The video images are wirelessly transmitted to a remote receiver/display encased in a portable housing that can be removably mounted in the vehicle in which the car seat is installed.

Elrod, Susan Vinz (Inventor); Dabney, Richard W. (Inventor)

2004-01-01

398

Noise Estimation in Video Surveillance Systems  

Microsoft Academic Search

Noise estimation plays an important role in the evaluation of video quality. In video surveillance systems, noise is mainly introduced by the camera and the quantization process. This paper proposes a method to estimate the noise in video communication systems. Firstly, the variance of noise introduced by the camera can be estimated by using inter-frame and intra-frame differential operation. Then
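The inter-frame differential idea can be sketched as follows: for a static scene, the scene content cancels in the difference of two frames, leaving noise at twice the single-frame variance. This is a toy model with synthetic frames and Gaussian noise, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(7)

def estimate_noise_sigma(frame_a, frame_b):
    """For a static scene the scene cancels in the frame difference, so the
    difference variance is 2*sigma^2 and sigma = std(diff) / sqrt(2)."""
    diff = frame_a.astype(float) - frame_b.astype(float)
    return diff.std() / np.sqrt(2.0)

scene = rng.uniform(0.0, 200.0, (256, 256))   # arbitrary static scene content
sigma_true = 4.0
f1 = scene + rng.normal(0.0, sigma_true, scene.shape)
f2 = scene + rng.normal(0.0, sigma_true, scene.shape)

est = estimate_noise_sigma(f1, f2)
print(round(est, 2))  # ~4.0, recovering the injected noise level
```

Real surveillance footage also has motion, which is why practical methods combine inter-frame differences with intra-frame (spatial) ones.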

Jin-chao Li; Hui-ming Tang; Chao Lu

2009-01-01

399

Video Golf  

NASA Technical Reports Server (NTRS)

George Nauck of ENCORE!!! invented and markets the Advanced Range Performance (ARPM) Video Golf System for measuring the result of a golf swing. After Nauck requested their assistance, Marshall Space Flight Center scientists suggested video and image processing/computing technology, and provided leads on commercial companies that dealt with the pertinent technologies. Nauck contracted with Applied Research Inc. to develop a prototype. The system employs an elevated camera, which sits behind the tee and follows the flight of the ball down range, catching the point of impact and subsequent roll. Instant replay of the video on a PC monitor at the tee allows measurement of the carry and roll. The unit measures distance and deviation from the target line, as well as distance from the target when one is selected. The information serves as an immediate basis for making adjustments or as a record of skill level progress for golfers.

1995-01-01

400

Bit-rate-controlled DCT compression algorithm for digital still camera  

NASA Astrophysics Data System (ADS)

A digital still camera (DS camera) is composed of a CCD (charge-coupled device), an A/D converter, a signal processing block, and an IC memory card as the medium for image data storage (Fig. 1). The CCD output signal is digitized and processed in the DS camera to be stored in the IC memory card.

Watanabe, Mikio; Ito, Kenji; Saito, Osamu; Moronaga, Kenji; Ochi, Shigeharu

1990-06-01

401

Creativity and Video Production.  

ERIC Educational Resources Information Center

Suggests ways to increase creativity in individuals and in team members who are involved with developing video productions. Topics discussed include freedom of expression; setting goals and objectives; visualization; laser video discs; instant cameras; the function of evaluation; and the impact of technological developments on audience

Yeamans, George T.

1990-01-01

402

A HARDWARE PLATFORM FOR AN AUTOMATIC VIDEO TRACKING SYSTEM USING MULTIPLE PTZ CAMERAS

E-print Network

Video tracking can be used in many areas, especially in security-related areas such as airports, and improves on fixed-position still cameras. In this report, we propose a hardware platform for a video tracking

Abidi, Mongi A.

403

Guerrilla Video: A New Protocol for Producing Classroom Video  

ERIC Educational Resources Information Center

Contemporary changes in pedagogy point to the need for a higher level of video production value in most classroom video, replacing the default video protocol of an unattended camera in the back of the classroom. The rich and complex environment of today's classroom can be captured more fully using the higher level, but still easily manageable,

Fadde, Peter; Rich, Peter

2010-01-01

404

Video Mosaicking for Inspection of Gas Pipelines  

NASA Technical Reports Server (NTRS)

A vision system that includes a specially designed video camera and an image-data-processing computer is under development as a prototype of robotic systems for visual inspection of the interior surfaces of pipes and especially of gas pipelines. The system is capable of providing both forward views and mosaicked radial views that can be displayed in real time or after inspection. To avoid the complexities associated with moving parts and to provide simultaneous forward and radial views, the video camera is equipped with a wide-angle (>165 degree) fish-eye lens aimed along the axis of a pipe to be inspected. Nine white-light-emitting diodes (LEDs) placed just outside the field of view of the lens (see Figure 1) provide ample diffuse illumination for a high-contrast image of the interior pipe wall. The video camera contains a 2/3-in. (1.7-cm) charge-coupled-device (CCD) photodetector array and functions according to the National Television Standards Committee (NTSC) standard. The video output of the camera is sent to an off-the-shelf video capture board (frame grabber) by use of a peripheral component interconnect (PCI) interface in the computer, which is of the 400-MHz, Pentium II (or equivalent) class. Prior video-mosaicking techniques are applicable to narrow-field-of-view (low-distortion) images of evenly illuminated, relatively flat surfaces viewed along approximately perpendicular lines by cameras that do not rotate and that move approximately parallel to the viewed surfaces. One such technique for real-time creation of mosaic images of the ocean floor involves the use of visual correspondences based on area correlation, during both the acquisition of separate images of adjacent areas and the consolidation (equivalently, integration) of the separate images into a mosaic image, in order to ensure that there are no gaps in the mosaic image.
The data-processing technique used for mosaicking in the present system also involves area correlation, but with several notable differences: Because the wide-angle lens introduces considerable distortion, the image data must be processed to effectively unwarp the images (see Figure 2). The computer executes special software that includes an unwarping algorithm that takes explicit account of the cylindrical pipe geometry. To reduce the processing time needed for unwarping, parameters of the geometric mapping between the circular view of a fisheye lens and pipe wall are determined in advance from calibration images and compiled into an electronic lookup table. The software incorporates the assumption that the optical axis of the camera is parallel (rather than perpendicular) to the direction of motion of the camera. The software also compensates for the decrease in illumination with distance from the ring of LEDs.
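The precomputed lookup table between the unwarped pipe-wall grid and fisheye image coordinates might be built along these lines. This sketch assumes an equidistant fisheye model (image radius r = f * alpha) and illustrative parameters; the system's actual calibration-derived mapping is not reproduced here:

```python
import math

def build_lut(pipe_radius, focal_px, cx, cy, n_theta=360, z_values=(5.0, 10.0, 20.0)):
    """Map an unwarped (azimuth, axial-distance) grid on the pipe wall to fisheye
    pixel coordinates, for a camera looking down the pipe axis."""
    lut = {}
    for zi, z in enumerate(z_values):              # axial distance ahead of the lens
        alpha = math.atan2(pipe_radius, z)         # angle of the wall point off the axis
        r = focal_px * alpha                       # image radius, equidistant projection
        for ti in range(n_theta):
            theta = 2.0 * math.pi * ti / n_theta   # azimuth around the pipe wall
            lut[(ti, zi)] = (cx + r * math.cos(theta), cy + r * math.sin(theta))
    return lut

lut = build_lut(pipe_radius=10.0, focal_px=200.0, cx=320.0, cy=240.0)
print(lut[(0, 0)])  # nearest wall ring maps farthest from the image center
```

Unwarping then reduces to one table lookup plus pixel interpolation per output pixel, which is what makes the real-time mosaicking feasible.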

Magruder, Darby; Chien, Chiun-Hong

2005-01-01

405

Toying with obsolescence : Pixelvision filmmakers and the Fisher Price PXL 2000 camera  

E-print Network

This thesis is a study of the Fisher Price PXL 2000 camera and the artists and amateurs who make films and videos with this technology. The Pixelvision camera records video onto an audiocassette; its image is low-resolution, ...

McCarty, Andrea Nina

2005-01-01

406

NSTX Tangential Divertor Camera  

SciTech Connect

Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has been recently installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R ~ 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40,500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. Edge fluid and turbulent codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

2004-07-16

407

The Dark Energy Camera  

NASA Astrophysics Data System (ADS)

The DES Collaboration has completed construction of the Dark Energy Camera (DECam), a 3 square degree, 570 megapixel CCD camera which is now mounted at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory. DECam comprises 74 fully depleted CCDs, each 250 microns thick: 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The instrument includes a filter set of u, g, r, i, z, and Y; a hexapod for focus and lateral alignment; and thermal management of the cage temperature. DECam will be used to perform the Dark Energy Survey with 30% of the telescope time over a 5 year period. During the remainder of the time, and after the survey, DECam will be available as a community instrument. An overview of the DECam design, construction and initial on-sky performance information will be presented.

Flaugher, Brenna; DES Collaboration

2013-01-01

408

Recursive Video Matting and Denoising  

Microsoft Academic Search

In this paper, we propose a video matting method with simultaneous noise reduction based on the Unscented Kalman filter (UKF). This recursive approach extracts the alpha mattes and denoised foregrounds from noisy videos, in a unified framework. No assumptions are made about the type of motion of the camera or of the foreground object in the video. Moreover, user-specified trimaps

Sahana M. Prabhu; A. N. Rajagopalan

2010-01-01

409

CCD Astronomy Software User's Guide  

E-print Network

CCDSoft CCD Astronomy Software User's Guide, Version 5, Revision 1.11. Copyright © 1992-2006. CCDSoft, TheSky Astronomy Software, and AutomaDome are trademarks of Software Bisque. Windows™ is a trademark of Microsoft

410

Enhanced performance CCD output amplifier  

DOEpatents

A low-noise FET amplifier is connected to amplify the output charge from a charge coupled device (CCD). The FET has its gate connected to the CCD in common source configuration for receiving the output charge signal from the CCD and outputting an intermediate signal at a drain of the FET. An intermediate amplifier is connected to the drain of the FET for receiving the intermediate signal and outputting a low-noise signal functionally related to the output charge signal from the CCD. The amplifier is preferably connected as a virtual ground to the FET drain. The inherent shunt capacitance of the FET is selected to be at least equal to the sum of the remaining capacitances.

Dunham, Mark E. (Los Alamos, NM); Morley, David W. (Santa Fe, NM)

1996-01-01

411

The Development of the Spanish Fireball Network Using a New All-Sky CCD System  

Microsoft Academic Search

We have developed an all-sky charge coupled device (CCD) automatic system for detecting meteors and fireballs that will be operative in four stations in Spain during 2005. The cameras were developed following the BOOTES-1 prototype installed at the El Arenosillo Observatory in 2002, which is based on a CCD detector of 4096 x 4096 pixels with a fish-eye lens that

J. M. Trigo-Rodríguez; A. J. Castro-Tirado; J. Llorca; J. Fabregat; V. J. Martínez; V. Reglero; M. Jelínek; P. Kubánek; T. Mateo; A. de Ugarte Postigo

2004-01-01

412

Development of a CCD array as an imaging detector for advanced X-ray astrophysics facilities  

NASA Technical Reports Server (NTRS)

The development of a charge coupled device (CCD) X-ray imager for a large aperture, high angular resolution X-ray telescope is discussed. Existing CCDs were surveyed and three candidate concepts were identified. An electronic camera control and computer interface, including software to drive a Fairchild 211 CCD, is described. In addition a vacuum mounting and cooling system is discussed. Performance data for the various components are given.

Schwartz, D. A.

1981-01-01

413

On the Development of a Digital Video Motion Detection Test Set  

SciTech Connect

This paper describes the current effort to develop a standardized data set, or suite of digital video sequences, that can be used for test and evaluation of digital video motion detectors (VMDs) for exterior applications. We have drawn from an extensive video database of typical application scenarios to assemble a comprehensive data set. These data, some existing for many years on analog videotape, have been converted to a reproducible digital format and edited to generate test sequences several minutes long for many scenarios. Sequences include non-alarm video, intrusions and nuisance alarm sources, taken with a variety of imaging sensors including monochrome CCD cameras and infrared (thermal) imaging cameras, under a variety of daytime and nighttime conditions. The paper presents an analysis of the variables and estimates the complexity of a thorough data set. Some of this video test data has been digitized for CD-ROM storage and playback. We are considering developing a DVD disk for possible use in screening and testing VMDs prior to government testing and deployment. In addition, this digital video data may be used by VMD developers for further refinement or customization of their product to meet specific requirements. These application scenarios may also be used to define the testing parameters for future procurement qualification. A personal computer may be used to play back either the CD-ROM or the DVD video data. A consumer electronics-style DVD player may be used to replay the DVD disk. This paper also discusses various aspects of digital video storage including formats, resolution, CD-ROM and DVD storage capacity, editing and playback.
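The storage discussion is easy to motivate with arithmetic: uncompressed monochrome video at the stated resolution overwhelms a CD-ROM in about a minute, which is why compressed digital formats matter. An illustrative estimate (nominal media capacities assumed, not figures from the record):

```python
# Back-of-envelope storage estimate for uncompressed 8-bit monochrome video
width, height, fps, bytes_per_pixel = 640, 480, 30, 1
bytes_per_second = width * height * fps * bytes_per_pixel   # 9,216,000 B/s

cd_capacity = 650 * 10**6       # ~650 MB CD-ROM (nominal)
dvd_capacity = 4.7 * 10**9      # ~4.7 GB single-layer DVD (nominal)

cd_seconds = cd_capacity / bytes_per_second    # roughly a minute of raw video
dvd_seconds = dvd_capacity / bytes_per_second  # under ten minutes of raw video
```

Even a DVD holds under ten minutes raw, so multi-minute test sequences for many scenarios depend on compression.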

Pritchard, Daniel A.; Vigil, Jose T.

1999-06-07

414

The research on statistical properties of TDI-CCD imaging noise  

NASA Astrophysics Data System (ADS)

TDI-CCD can improve the sensitivity of a space camera without any degradation of spatial resolution, and is widely used in aerospace imaging devices. The article describes the basic working principle and application characteristics of TDI-CCD devices, analyses the composition of TDI-CCD imaging noise, and proposes a new method to analyze TDI-CCD imaging noise with a statistical probability distribution. In order to estimate the distribution of gray values affected by noise, we introduce the concepts of skewness and kurtosis. We design an experiment using a constant-illumination light source, take images with the TDI-CCD working at different stages (16, 32, 48, 64 and 96), and analyse the characteristics of the image noise with the proposed method. Experimental results show that the gray values approximately follow a normal distribution in large-sample cases.
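Skewness and kurtosis measure the asymmetry and tail weight of the gray-value distribution; both are near zero for the normal distribution the authors report. A minimal implementation using population moments (a sketch, not the paper's exact estimator):

```python
import numpy as np

def skewness_kurtosis(x):
    """Sample skewness and excess kurtosis of an array of gray values."""
    x = np.asarray(x, float).ravel()
    z = (x - x.mean()) / x.std()   # standardized values
    skew = np.mean(z**3)           # 0 for a symmetric distribution
    kurt = np.mean(z**4) - 3.0     # 0 for a normal distribution
    return skew, kurt
```

Applied to a flat-field image patch, values of both statistics near zero support the normality conclusion drawn in the experiment.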

Gu, Ying-ying; Shen, Xiang-heng; He, Geng-xian

2011-08-01

415

Video sensor with range measurement capability  

NASA Technical Reports Server (NTRS)

A video sensor device is provided which incorporates a rangefinder function. The device includes a single video camera and a fixed laser spaced a predetermined distance from the camera for, when activated, producing a laser beam. A diffractive optic element divides the beam so that multiple light spots are produced on a target object. A processor calculates the range to the object based on the known spacing and angles determined from the light spots on the video images produced by the camera.
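If the laser fires parallel to the optical axis at a known baseline from the camera, the range follows from the spot's angular offset in the image. A sketch of that triangulation (this specific parallel-axis geometry is my assumption for illustration; the patent uses multiple spots from a diffractive element and known angles):

```python
import math

def range_from_spot(baseline_m, pixel_offset, focal_px):
    """Range to a target from a laser spot in the camera image.

    Assumes the laser is mounted baseline_m from the camera and fires
    parallel to the optical axis, so the spot's angular offset alpha
    satisfies tan(alpha) = baseline / range.
    """
    alpha = math.atan2(pixel_offset, focal_px)  # spot angle from axis
    return baseline_m / math.tan(alpha)
```

With a 0.10 m baseline and an 800-pixel focal length, a spot 16 pixels off-axis corresponds to a 5 m range.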

Briscoe, Jeri M. (Inventor); Corder, Eric L. (Inventor); Howard, Richard T. (Inventor); Broderick, David J. (Inventor)

2008-01-01

416

Automatic fire detection system using CCD camera and Bayesian network  

Microsoft Academic Search

This paper proposes a new vision-based fire detection method for real-life application. Most previous vision-based methods using color information and temporal variations of pixels produce frequent false alarms due to the use of many heuristic features. Plus, there is usually a computation delay for accurate fire detection. Thus, to overcome these problems, candidate fire regions are first detected using a

Kwang-Ho Cheong; Byoung-Chul Ko; Jae-Yeal Nam

2008-01-01

417

Automatic fire detection system using CCD camera and Bayesian network  

NASA Astrophysics Data System (ADS)

This paper proposes a new vision-based fire detection method for real-life application. Most previous vision-based methods using color information and temporal variations of pixels produce frequent false alarms due to the use of many heuristic features. Plus, there is usually a computation delay for accurate fire detection. Thus, to overcome these problems, candidate fire regions are first detected using a background model and color model of fire. Probabilistic models of fire are then generated based on the fact that fire pixel values in consecutive frames change constantly and these models are applied to a Bayesian Network. This paper uses a three-level Bayesian Network that contains intermediate nodes, and uses four probability density functions for evidence at each node. The probability density functions for each node are modeled using the skewness of the color red and three high frequency components obtained from a wavelet transform. The proposed system was successfully applied to various fire-detection tasks in real-world environments and effectively distinguished fire from fire-colored moving objects.
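At each node of such a network, evidence likelihoods are combined with a prior via Bayes' rule. A toy sketch with conditionally independent Gaussian evidence (all parameters here are invented for illustration; the paper's actual features are the skewness of the red channel and three wavelet components):

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fire_posterior(features, fire_params, nonfire_params, prior_fire=0.5):
    """P(fire | features) with conditionally independent Gaussian evidence,
    as in a naive-Bayes layer of a Bayesian network."""
    like_fire = like_non = 1.0
    for x, (mf, sf), (mn, sn) in zip(features, fire_params, nonfire_params):
        like_fire *= normal_pdf(x, mf, sf)
        like_non *= normal_pdf(x, mn, sn)
    num = like_fire * prior_fire
    return num / (num + like_non * (1 - prior_fire))
```

Feature vectors near the fire-class means drive the posterior toward 1, while fire-colored but static objects, whose temporal features match the non-fire model, are suppressed.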

Cheong, Kwang-Ho; Ko, Byoung-Chul; Nam, Jae-Yeal

2008-02-01

418

Deployable Wireless Camera Penetrators  

NASA Technical Reports Server (NTRS)

A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking at a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm^3. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. 
A low-volume array of such penetrator cameras could be deployed from an aerobot or a spacecraft onto a comet or asteroid. A system of 20 of these penetrators could be designed and built in a 1- to 2-kg mass envelope. Possible future modifications of the camera penetrators, such as the addition of a chemical spray device, would allow the study of simple chemical reactions by spraying reagents at the landing site and observing the color changes. Zoom lenses also could be added for future use.

Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

2008-01-01

419

The LSST Camera System  

NASA Astrophysics Data System (ADS)

The LSST camera provides a 3.2 Gigapixel focal plane array, tiled by 189 4Kx4K CCD science sensors with 10 μm pixels. This pixel count is a direct consequence of sampling the 9.6 deg^2 field-of-view (0.64m diameter) with 0.2 arcsec pixels (Nyquist sampling in the best expected seeing of 0.4 arcsec). The sensors are deep depleted, back-illuminated devices with a highly segmented architecture that enables the entire array to be read in 2 seconds. The detectors are grouped into 3x3 rafts, each containing its own dedicated front-end and back-end electronics boards. The rafts are mounted on a silicon carbide grid inside a vacuum cryostat with an intricate thermal control system; the entrance window of the cryostat is the third of the three refractive lenses in the camera. The other two lenses are mounted in an optics structure at the front of the camera body, which also contains a mechanical shutter, and a carousel assembly that holds five large optical filters (ugrizy). A sixth optical filter will also be fabricated and can replace any of the others via procedures accomplished during daylight hours. This poster will illustrate the current mechanical design of the camera, FEA and thermal analysis of the cryostat, and an overview of the data acquisition system and the performance characteristics of the filters.
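The quoted pixel count can be checked directly against the sensor tiling and the plate-scale argument. A quick back-of-envelope consistency check, with the numbers taken from the record above:

```python
# 189 sensors of 4096 x 4096 pixels versus sampling 9.6 deg^2 at 0.2"/pixel
sensors = 189
pix_per_sensor = 4096 * 4096
total_pixels = sensors * pix_per_sensor          # ~3.17e9, the "3.2 Gigapixel"

fov_deg2 = 9.6
pixel_arcsec = 0.2
pixels_needed = fov_deg2 * (3600 / pixel_arcsec) ** 2   # ~3.11e9
```

The two figures agree to about 2%, the margin going to inter-sensor gaps and edge pixels.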

Gilmore, D. Kirk; Kahn, S.; Fouts, K.; LSST Camera Team

2009-01-01

420

Observational Astronomy Gain of a CCD  

E-print Network

(both the shift register and the analog amplifier are usually part of the CCD chip itself carefully, you will see a rectangle in the eyepiece field of view, which corresponds to the CCD chip the cloth shroud to shield the gap between the tube and the CCD lens. When you are sure that the CCD has

Harrington, J. Patrick

421

The high resolution video capture system on the alcator C-Mod tokamak  

SciTech Connect

A new system for routine digitization of video images is presently operating on the Alcator C-Mod tokamak. The PC-based system features high resolution video capture, storage, and retrieval. The captured images are stored temporarily on the PC, but are eventually written to CD. Video is captured from one of five filtered RS-170 CCD cameras at 30 frames per second (fps) with 640x480 pixel resolution. In addition, the system can digitize the output from a filtered Kodak Ektapro EM Digital Camera which captures images at 1000 fps with 239x192 resolution. Present views of this set of cameras include a wide angle and a tangential view of the plasma, two high resolution views of gas puff capillaries embedded in the plasma facing components, and a view of ablating, high speed Li pellets. The system is being used to study (1) the structure and location of visible emissions (including MARFEs) from the main plasma and divertor, (2) asymmetries in gas puff plumes due to flows in the scrape-off layer (SOL), and (3) the tilt and cigar-shaped spatial structure of the Li pellet ablation cloud. © 1997 American Institute of Physics.

Allen, A.J.; Terry, J.L.; Garnier, D.; Stillerman, J.A. [Plasma Fusion Center, MIT, Cambridge, Massachusetts 02139-4307 (United States)]; Wurden, G.A. [Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States)]

1997-01-01

422

Meteor camera network in Hungary: some considerations about its hardware  

NASA Astrophysics Data System (ADS)

Since 2009, an efficient meteor camera network has been developed in Hungary. Its main characteristics are that it is amateur-owned and -operated, based on commercial security cameras, and part of the IMO Video Network.

Igaz, Antal; Berko, Erno

2013-01-01

423

Automatic camera selection for activity monitoring in a multi-camera system for tennis  

Microsoft Academic Search

In professional tennis training matches, the coach needs to be able to view play from the most appropriate angle in order to monitor players' activities. In this paper, we describe and evaluate a system for automatic camera selection from a network of synchronised cameras within a tennis sporting arena. This work combines synchronised video streams from multiple cameras into a

Philip Kelly; C. O. Conaire; Chanyul Kim; Noel E. O'Connor

2009-01-01

424

Mobile Phones Digital Cameras  

E-print Network

Suslick, Kenneth S.

425

Head-free, remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras.  

PubMed

We have developed a pupil-corneal reflection method-based gaze detection system, which allows large head movements and achieves easy gaze calibration. This system contains two optical systems consisting of components such as a camera and a near-infrared light source attached to the camera. The light source has two concentric LED rings with different wavelengths. The inner and outer rings generate bright and dark pupil images, respectively. The pupils are detected from a difference image created by subtracting the bright and dark pupil images. The light source also generates the corneal reflection. The 3-D coordinates of the pupils are determined by the stereo matching method using two optical systems. The vector from the corneal reflection center to the pupil center in the camera image is determined as r. The angle between the line of sight and the line passing through the pupil center and the camera (light source) is denoted as θ. The relationship θ = k|r| is assumed, where k is a constant. The theory implies that head movement of the user is allowed, and it facilitates the gaze calibration procedure. In the automatic calibration method, k is automatically determined while the user looks around on the PC screen without fixating on any specific calibration target. In the one-point calibration method, the user is asked to fixate on one calibration target at the PC screen in order to correct the difference between the optical and visual axes. In the two-point calibration method, in order to correct the nonlinear relationship between θ and |r|, the user is asked to fixate on two targets. The experimental results show that the three proposed calibration methods improve the precision of gaze detection step by step. In addition, the average gaze error in the visual angle is less than 1° for the seven head positions of the user. PMID:23751948
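The one-point calibration reduces to solving the paper's linear model θ = k|r| for k at a single known fixation. A minimal sketch under that model (function names and the single-target procedure's interface are mine):

```python
import math

def calibrate_k(r_vec, gaze_angle_deg):
    """One-point calibration: the user fixates a target whose angle from
    the camera axis is known, and k = theta / |r| is solved directly."""
    r = math.hypot(*r_vec)   # |r|: corneal-reflection-to-pupil distance
    return math.radians(gaze_angle_deg) / r

def gaze_angle(r_vec, k):
    """theta = k * |r| (radians), per the linear model."""
    return k * math.hypot(*r_vec)
```

Once k is fixed, any measured vector r maps to a gaze angle; the two-point variant would additionally fit a nonlinear correction between θ and |r|.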

Ebisawa, Yoshinobu; Fukumoto, Kiyotaka

2013-10-01

426

The Sloan Digital Sky Survey Photometric Camera  

Microsoft Academic Search

We have constructed a large-format mosaic CCD camera for the Sloan Digital Sky Survey. The camera consists of two arrays, a photometric array that uses 30 2048 x 2048 SITe/Tektronix CCDs (24 μm pixels) with an effective imaging area of 720 cm^2 and an astrometric array that uses 24 400 x 2048 CCDs with the same pixel size, which will

J. E. Gunn; M. Carr; C. Rockosi; M. Sekiguchi; K. Berry; B. Elms; E. de Haas; Z. Ivezic; G. Knapp; R. Lupton; G. Pauls; R. Simcoe; R. Hirsch; D. Sanford; S. Wang; D. York; F. Harris; J. Annis; L. Bartozek; W. Boroski; J. Bakken; M. Haldeman; S. Kent; S. Holm; D. Holmgren; D. Petravick; A. Prosapio; R. Rechenmacher; M. Doi; M. Fukugita; K. Shimasaku; N. Okada; C. Hull; W. Siegmund; E. Mannery; M. Blouke; D. Heidtman; D. Schneider; R. Lucinio; J. Brinkman

1998-01-01

427

CCD readout electronics for the Subaru Prime Focus Spectrograph  

NASA Astrophysics Data System (ADS)

The following paper details the design for the CCD readout electronics for the Subaru Telescope Prime Focus Spectrograph (PFS). PFS is designed to gather spectra from 2394 objects simultaneously, covering wavelengths that extend from 380 nm to 1260 nm. The spectrograph is comprised of four identical spectrograph modules, each collecting roughly 600 spectra. The spectrograph modules provide simultaneous wavelength coverage over the entire band through the use of three separate optical channels: blue, red, and near infrared (NIR). A camera in each channel images the multi-object spectra onto a 4k x 4k, 15 μm pixel detector format. The two visible cameras use a pair of Hamamatsu 2k x 4k CCDs with readout provided by custom electronics, while the NIR camera uses a single Teledyne HgCdTe 4k x 4k detector and Teledyne's ASIC Sidecar to read the device. The CCD readout system is a custom design comprised of three electrical subsystems - the Back End Electronics (BEE), the Front End Electronics (FEE), and a Pre-amplifier. The BEE is an off-the-shelf PC104 computer, with an auxiliary Xilinx FPGA module. The computer serves as the main interface to the Subaru messaging hub and controls other peripheral devices associated with the camera, while the FPGA is used to generate the necessary clocks and transfer image data from the CCDs. The FEE board sets clock biases, substrate bias, and CDS offsets. It also monitors bias voltages, offset voltages, power rail voltage, substrate voltage and CCD temperature. The board translates LVDS clock signals to biased clocks and returns digitized analog data via LVDS. Monitoring and control messages are sent from the BEE to the FEE using a standard serial interface. The Pre-amplifier board resides behind the detectors and acts as an interface to the two Hamamatsu CCDs. The Pre-amplifier passes clocks and biases to the CCDs, and analog CCD data is buffered and amplified prior to being returned to the FEE. 
In this paper we describe the detailed design of the PFS CCD readout electronics and discuss current status of the design, preliminary performance, and proposed enhancements.
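The CDS offsets set by the FEE refer to correlated double sampling, in which each pixel's reset level is subtracted from its signal level so that reset (kTC) noise, common to both samples, cancels. A small simulation illustrating the principle (the noise magnitudes below are invented for illustration, not taken from the PFS design):

```python
import numpy as np

# Simulate correlated double sampling (CDS) for 10,000 pixel reads.
rng = np.random.default_rng(1)
true_signal = 500.0                               # signal level, arbitrary units
reset_noise = rng.normal(0.0, 20.0, 10000)        # kTC noise, common to both samples
read_noise = rng.normal(0.0, 3.0, (2, 10000))     # uncorrelated per sample
reset_sample = 1000.0 + reset_noise + read_noise[0]
signal_sample = 1000.0 + reset_noise + true_signal + read_noise[1]
cds = signal_sample - reset_sample                # reset noise cancels in the difference
# Residual scatter is only ~sqrt(2) x read noise, not the large kTC noise.
```

The recovered mean is the true signal and the residual scatter reflects only the uncorrelated read noise, which is why CDS is standard in low-noise CCD readout chains.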

Hope, Stephen C.; Gunn, James E.; Loomis, Craig P.; Fitzgerald, Roger E.; Peacock, Grant O.

2014-07-01

428

CCD x-ray detectors: on-board data processing  

NASA Astrophysics Data System (ADS)

We present the results of a comparison of data processing algorithms to be used with space-borne x-ray CCD cameras such as those aboard ASCA, CUBIC and AXAF. The goal is to optimize efficiency and accuracy based upon the capabilities and limitations of the on-board processors. We examine the two main components of processing: determination of the bias (or zero) level, and event recognition. An algorithm to generate a pixel-by-pixel bias by on-board processing is developed and tested. The on-board bias frame is compared to a bias created from a standard laboratory pixel-by-pixel averaging of dark frames. We show that an accurate pixel-by-pixel bias frame can be created with an on-board algorithm in as few as 15 frames. We show that a bias frame created from that algorithm performs as well as mean frames created in the laboratory. On-board algorithms that handle bias determination and event selection simultaneously are also developed. We show that several types of these algorithms successfully process the CCD data, although the algorithm should be chosen according to the specific capabilities of the processors. The procedures were evaluated by examining event quality and single/split event ratios, and more importantly by the determination of spectral energy resolution (e.g., the FWHM of ^55Fe). The algorithms were compared and evaluated for laboratory data from several different cameras and types of CCD devices.
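The laboratory reference bias described above is a per-pixel average of dark frames; an on-board processor can accumulate the same average incrementally, keeping only one running frame in memory. A minimal sketch (the incremental-mean formulation is my illustration, not necessarily the paper's algorithm):

```python
import numpy as np

def running_bias(frames):
    """Pixel-by-pixel bias estimate as the mean of N dark frames,
    accumulated one frame at a time as an on-board processor would."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for n, frame in enumerate(frames, start=1):
        acc += (frame - acc) / n   # incremental mean: O(1) memory per pixel
    return acc
```

After 15 frames the incremental estimate equals the batch pixel-by-pixel mean, consistent with the paper's finding that ~15 frames suffice.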

Cawley, Laura J.; Nousek, John A.; Burrows, David N.; Garmire, Gordon P.; Janches, Diego; Lumb, David H.

1995-09-01

429

The Crimean CCD Telescope for the asteroid observations  

NASA Astrophysics Data System (ADS)

The old 64-cm Richter-Slefogt telescope (F=90 cm) of the Crimean Astrophysical Observatory was reconstructed and equipped with the SBIG ST-8 CCD camera received from the Planetary Society for Eugene Shoemaker's Near Earth Object Grant. First observations of minor planets and comets were made with it. The CCD matrix of the ST-8 camera in the focus of our telescope covers a field of 52'.7 x 35'.1. The 120-second exposure yields stars up to the limiting magnitude of 20.5 for S/N=3. According to preliminary estimations, the telescope in its present state enables us to cover, during the year, a sky area of not more than 600 sq. deg. with threefold overlaps. Automation of the telescope can increase the productivity up to 20000 sq. deg. per year. The software for object localization, image parameter determination, star identification, astrometric reduction, and identification and cataloguing of asteroids has been developed. The first results obtained with the Crimean CCD 64-cm telescope are discussed.

Chernykh, N. S.; Rumyantsev, V. V.

2002-09-01

430

The Crimean CCD telescope for the asteroid observations  

NASA Astrophysics Data System (ADS)

The old 64-cm Richter-Slefogt telescope (F=90 cm) of the Crimean Astrophysical Observatory was reconstructed and equipped with the ST-8 CCD camera supplied by the Planetary Society as the Eugene Shoemaker Near Earth Object Grant. The first observations of minor planets and comets were made with the telescope in 2000. The CCD matrix of the ST-8 camera in the focus of our telescope covers a field of 52'.7 x 35'.1. With a 120-second exposure we obtain images of stars up to the limiting magnitude of 20.5 mag at S/N=3. The first phase of automation of the telescope was completed in May of 2002. According to our estimations, the telescope will be able to cover a sky area of 20 square deg with threefold overlapping during the night. The software for object localization, image parameter determination, star identification, astrometric reduction, and identification and cataloguing of asteroids has been developed. The first observation results obtained with the 64-cm CCD telescope are discussed.
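The ST-8 field size and the nightly coverage estimate can be tied together with simple arithmetic (the fields-per-night figure is my illustration, not stated in the record):

```python
# Sky coverage arithmetic for the 64-cm telescope's ST-8 field
field_deg2 = (52.7 / 60) * (35.1 / 60)   # 52'.7 x 35'.1 in square degrees
exposures_per_field = 3                  # "threefold overlapping"
nightly_deg2 = 20                        # the record's nightly coverage estimate

# Pointings needed per night, counting every repeat exposure
fields_per_night = nightly_deg2 / field_deg2 * exposures_per_field
```

The field is about half a square degree, so 20 deg^2 per night with threefold overlap implies on the order of 115-120 pointings, a plausible cadence for 120-second exposures plus readout.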

Chernykh, Nikolaj; Rumyantsev, Vasilij

2002-11-01

431

Status of the dark energy survey camera (DECam) project  

Microsoft Academic Search

The Dark Energy Survey Collaboration is building the Dark Energy Camera (DECam), a 3 square degree, 520 Megapixel CCD camera which will be mounted on the Blanco 4-meter telescope at CTIO. DECam will be used to perform the 5000 sq. deg. Dark Energy Survey with 30% of the telescope time over a 5 year period. During the remainder of the

Brenna L. Flaugher; Timothy M. C. Abbott; Jim Annis; Michelle L. Antonik; Jim Bailey; Otger Ballester; Joseph P. Bernstein; Rebecca Bernstein; Marco Bonati; Gale Bremer; Jorge Briones; David Brooks; Elizabeth J. Buckley-Geer; Julia Campa; Laia Cardiel-Sas; Francisco Castander; Javier Castilla; Herman Cease; Steve Chappa; Edward C. Chi; Luis da Costa; Darren L. Depoy; Gregory Derylo; Juan de Vicente; H. Thomas Diehl; Peter Doel; Juan Estrada; Jacob Eiting; Anne Elliott; David Finley; Josh Frieman; Enrique Gaztanaga; David Gerdes; Mike Gladders; V. Guarino; G. Gutierrez; Jim Grudzinski; Bill Hanlon; Jiangang Hao; Steve Holland; Klaus Honscheid; Dave Huffman; Cheryl Jackson; Inga Karliner; Daekwang Kau; Steve Kent; Kurt Krempetz; John Krider; Mark Kozlovsky; Donna Kubik; Kyler W. Kuehn; Stephen E. Kuhlmann; Kevin Kuk; Ofer Lahav; Peter Lewis; Huan Lin; Wolfgang Lorenzon; Stuart Marshall; Gustavo Martínez; Timothy McKay; Wyatt Merritt; Mark Meyer; Ramon Miquel; Jim Morgan; Peter Moore; Todd Moore; Brian Nord; R. Ogando; Jamieson Olsen; John Peoples; Andreas Plazas; Natalie Roe; Aaron Roodman; B. Rossetto; E. Sanchez; Vic Scarpine; Terry Schalk; Rafe Schindler; Ricardo Schmidt; Richard Schmitt; Mike Schubnell; Kenneth Schultz; M. Selen; S. Serrano; Terri Shaw; Vaidis Simaitis; Jean Slaughter; R. Christopher Smith; Hal Spinka; Andy Stefanik; Walter Stuermer; Adam Sypniewski; Rick Talaga; Greg Tarle; Jon Thaler; Doug Tucker; Alistair R. Walker; Curtis Weaverdyck; William Wester; Robert J. Woods; Sue Worswick; Allen Zhao

2010-01-01

432

High-strain-rate fracture behavior of steel: the new application of a high-speed video camera to the fracture initiation experiments of steel  

NASA Astrophysics Data System (ADS)

High-speed event capturing was conducted to determine the fracture initiation load of a hot-rolled steel under rapid loading conditions. The loading tests were carried out on compact specimens which were a single edge-notched and fatigue cracked plate loaded in tension. The impact velocities in the tests were 0.1 - 5.0 m/s. The influences of the impact velocity on the fracture initiation load were confirmed. The new application of a high-speed camera to the fracture initiation experiments has been confirmed.

Suzuki, Goro; Ichinose, Kensuke; Gomi, Kenji; Kaneda, Teruo

1997-12-01

433

A programmable CCD driver circuit for multiphase CCD operation  

NASA Technical Reports Server (NTRS)

A programmable CCD (charge-coupled device) driver circuit was designed to drive CCDs in multiphased modes. The purpose of the drive electronics is to operate developmental CCD imaging arrays for NASA's tiltable moderate resolution imaging spectrometer (MODIS-T). Several objectives for the driver were considered during its design: (1) the circuit drives CCD electrode voltages between 0 V and +30 V to produce reasonable potential wells, (2) the driving sequence is started with one input signal, (3) the circuit allows programming of frame sequences required by arrays of any size, and (4) it produces interfacing signals for the CCD and the DTF (detector test facility). Simulation of the driver verified its function with the master clock running at up to 10 MHz. This suggests a maximum rate of 400,000 pixels/s. Timing and packaging parameters were verified. The design uses 54 TTL (transistor-transistor logic) chips. Two versions of hardware were fabricated: wirewrap and printed circuit board. Both were verified functionally with a logic analyzer.
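The quoted figures imply a fixed number of master-clock cycles spent per pixel, which is worth making explicit (the 25-cycle interpretation is my inference from the two numbers, not stated in the record):

```python
# Ratio implied by the driver's quoted clock rate and pixel rate
master_clock_hz = 10_000_000   # 10 MHz master clock from the simulation
pixel_rate = 400_000           # quoted maximum pixels per second
clocks_per_pixel = master_clock_hz / pixel_rate   # cycles per pixel
```

A budget of 25 cycles per pixel is consistent with a multiphase driver that must sequence several electrode transitions for each pixel transfer.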

Ewin, Audrey J.; Reed, Kenneth V.

1989-01-01

434

CCD Charge Shuffling Improvements for ICE  

NASA Astrophysics Data System (ADS)

NOAO has been using IRAF at its telescopes since Unix workstations were first placed in the domes. At the Kitt Peak National Observatory, this has included data acquisition using the ICE (IRAF Control Environment) package that was developed in coordination with Skip Schaller at Steward Observatory. ICE continues to be used both inside KPNO and Steward and at other observatories. Improvements to ICE are described that support a dual exposure mode implemented via charge shuffling techniques. Charge shuffling involves repeatedly shifting the charge back-and-forth from side-to-side of a CCD while nodding the telescope alternately from an object to a blank sky position. The CCD is optically masked such that the sky pixels are kept dark while the object pixels are exposed and vice versa. The nodding and shuffling and opening and closing of the camera shutter occurs on a short enough time scale that the sky brightness variations are frozen. The output of this process is a dual exposure of contemporaneous object and sky spectra accumulated through the exact same optical path. This mode is beneficial, for instance, for multi-slitlet observations such that the width of each slitlet can be minimized to allow many more slits per exposure. New parameters added to ICE include the number of nods and the number of pixels to shift for each exposure. A variety of different nodding patterns are supported, such as a simple ABAB object/sky pattern and a bracketed pattern that begins and ends with a half-length sky subexposure. The on-object and on-sky exposure times may be specified separately.
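The simple ABAB pattern and the bracketed pattern described above can be sketched as a sequence generator; the tuple representation and relative subexposure lengths here are my illustration, not ICE's actual parameter format:

```python
def nod_pattern(n_nods, bracketed=False):
    """Object/sky nod sequence for charge shuffling.

    Simple mode alternates object/sky (ABAB...); bracketed mode begins
    and ends with a half-length sky subexposure, so total sky time still
    equals total object time while bracketing each object subexposure.
    Each entry is (position, relative exposure length).
    """
    if not bracketed:
        return [("object", 1.0), ("sky", 1.0)] * n_nods
    seq = [("sky", 0.5)]
    for i in range(n_nods):
        seq.append(("object", 1.0))
        if i < n_nods - 1:
            seq.append(("sky", 1.0))
    seq.append(("sky", 0.5))
    return seq
```

Both patterns keep object and sky exposure balanced, which is what makes the contemporaneous sky subtraction clean; separate on-object and on-sky times, as ICE allows, would simply scale the two entries differently.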

Seaman, R. L.

435

CCD Base Line Subtraction Algorithms  

SciTech Connect

High statistics astronomical surveys require photometric accuracy on a few percent level. The accuracy of sensor calibration procedures should match this goal. The first step in calibration procedures is the base line subtraction. The accuracy and robustness of different base line subtraction techniques used for Charge Coupled Device (CCD) sensors are discussed.
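One common base line subtraction technique, used here purely as an illustration (the record compares several), estimates a per-row base line from overscan pixels and subtracts it:

```python
import numpy as np

def subtract_baseline(image, overscan_cols):
    """Subtract a per-row base line estimated from the overscan region.

    The median over overscan columns is robust to occasional outliers
    (e.g. cosmic-ray hits) that would bias a plain mean upward.
    """
    baseline = np.median(image[:, overscan_cols], axis=1, keepdims=True)
    return image - baseline
```

Accuracy at the few-percent photometric level hinges on exactly such choices: estimator robustness, and whether the base line is modeled per row, per column, or per frame.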

Kotov, I.V.; O'Connor, P.; Kotov, A.; Frank, J.; Perevoztchikov, V.; Takacs, P.

2010-06-28

436

CCD photometry of distant comets  

Microsoft Academic Search

While it is apparent that many comets are active beyond the canonical distance of 3 AU, few surveys of cometary activity have been performed in this region previously. Such a survey enables a more accurate determination of the proportion of comets that exhibit little or no outgassing at these distances. Results are presented of CCD observations of comets in

S. C. Lowry; A. Fitzsimmons; I. M. Cartwright; I. P. Williams

2003-01-01

437

Multicolor CCD photometry of the open cluster NGC 752  

NASA Astrophysics Data System (ADS)

We obtained CCD observations of the open cluster NGC 752 with the 1.8m Vatican Advanced Technology Telescope (Mt. Graham, Arizona) with a 4K CCD camera and eight intermediate-band filters of the Strömvil (Strömgren + Vilnius) system. Four 12' x 12' fields were observed, covering the central part of the cluster. The good-quality multicolor data made it possible to obtain precise estimates of distance moduli, metallicity and foreground reddening for individual stars down to the limiting magnitude, V = 17.5, enabling photometric identification of faint cluster members. The new observations provide an extension of the lower main sequence to three magnitudes beyond the previous (photographic) limit. A relatively small number of photometric members identified at fainter magnitudes seems to be indicative of actual dissolution of the cluster from the low-mass end.

Bartašiūtė, Stanislava; Janusz, Robert; Boyle, Richard P.; Philip, A. G. Davis; Deveikis, Viktoras

2010-01-01

438

On the development of new SPMN diurnal video systems for daylight fireball monitoring  

NASA Astrophysics Data System (ADS)

Daylight fireball video monitoring
High-sensitivity video devices are commonly used for the study of the activity of meteor streams during the night. These provide useful data for the determination, for instance, of radiant, orbital and photometric parameters ([1] to [7]). With this aim, during 2006 three automated video stations supported by Universidad de Huelva were set up in Andalusia within the framework of the SPanish Meteor Network (SPMN). These are endowed with 8-9 high sensitivity wide-field video cameras that achieve a meteor limiting magnitude of about +3. These stations have increased the coverage performed by the low-scan all-sky CCD systems operated by the SPMN and, besides, achieve a time accuracy of about 0.01s for determining the appearance of meteor and fireball events. Despite these nocturnal monitoring efforts, we realised the need to set up stations for daylight fireball detection. Such effort was also motivated by the appearance of the two recent meteorite-dropping events of Villalbeto de la Peña [8,9] and Puerto Lápice [10]. Although the Villalbeto de la Peña event was casually videotaped and photographed, no direct pictures or videos were obtained for the Puerto Lápice event. Consequently, in order to perform a continuous recording of daylight fireball events, we set up new automated systems based on CCD video cameras. However, the development of these video stations implies several issues with respect to nocturnal systems that must be properly solved in order to get an optimal operation. The first of these video stations, also supported by University of Huelva, was set up in Sevilla (Andalusia) during May 2007. But, of course, fireball association is unequivocal only in those cases when two or more stations recorded the fireball, and when consequently the geocentric radiant is accurately determined. 
With this aim, a second diurnal video station is being set up in Andalusia at the facilities of the Centro Internacional de Estudios y Convenciones Ecológicas y Medioambientales (CIECEM, University of Huelva), in the environment of Doñana Natural Park (Huelva province). In this way, both stations, which are separated by a distance of 75 km, will work as a double video station system in order to provide trajectory and orbit information for major bolides and, thus, increase the chance of meteorite recovery in the Iberian Peninsula. The new diurnal SPMN video stations are endowed with different models of Mintron cameras (Mintron Enterprise Co., Ltd.). These are high-sensitivity devices that employ a colour 1/2" Sony interline-transfer CCD image sensor. Aspherical lenses are attached to the video cameras in order to maximize image quality. However, the use of fast lenses is not a priority here: while most of our nocturnal cameras use f0.8 or f1.0 lenses in order to detect meteors as faint as magnitude +3, the diurnal systems employ in most cases f1.4 to f2.0 lenses. Their focal lengths range from 3.8 to 12 mm to cover different atmospheric volumes. The cameras are arranged in such a way that the whole sky is monitored from every observing station.
Figure 1. A daylight event recorded from Sevilla on May 26, 2008, at 4h30m05.4 ±0.1s UT.
The way our diurnal video cameras work is similar to the operation of our nocturnal systems [1]. Thus, the diurnal stations are automatically switched on and off at sunrise and sunset, respectively. The images, taken at 25 fps with a resolution of 720x576 pixels, are continuously sent to PC computers through a video capture device. The computers run software (UFOCapture, by SonotaCo, Japan) that automatically registers meteor trails and stores the corresponding video frames on hard disk.
Besides, before the signal from the cameras reaches the computers, a video time inserter that employs a GPS device (KIWI-OSD, by PFD Systems) inserts time information on every video frame. This allows us to measure time precisely (to about 0.01 s) along the whole fireball path.
EPSC Abstracts, Vol. 3, EPSC2008-A-00319, 2008 European Planetary Science Congress, Author(s) 2008
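The capture-and-detect loop described in this abstract (25 fps video frames, with motion-triggered recording handled by UFOCapture) can be sketched with simple frame differencing. The function, thresholds, and synthetic frames below are illustrative assumptions for the general technique, not the actual UFOCapture algorithm:

```python
import numpy as np

def detect_transient(prev_frame, frame, diff_thresh=40, min_pixels=12):
    """Flag a frame if enough pixels brightened sharply since the previous
    frame -- the frame-differencing idea behind motion-triggered capture
    software (threshold values here are illustrative)."""
    diff = frame.astype(np.int16) - prev_frame.astype(np.int16)
    changed = np.count_nonzero(diff > diff_thresh)
    return changed >= min_pixels

# Synthetic 8-bit frames at the PAL resolution quoted in the text (720x576).
rng = np.random.default_rng(0)
sky = rng.integers(10, 30, size=(576, 720), dtype=np.uint8)

quiet = sky.copy()                  # no event: frame unchanged
event = sky.copy()
event[100:104, 200:260] = 255       # bright streak across the frame

print(detect_transient(sky, quiet))  # False
print(detect_transient(sky, event))  # True
```

In a real station this test would run on each incoming frame pair, and a positive detection would trigger saving the surrounding frames (with their GPS-stamped times) to disk.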

Madiedo, J. M.; Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.

2008-09-01

439

The PS1 Gigapixel Camera  

NASA Astrophysics Data System (ADS)

The world's largest and most advanced digital camera has been installed on the Pan-STARRS-1 (PS1) telescope on Haleakala, Maui. Built at the University of Hawaii at Manoa's Institute for Astronomy (IfA) in Honolulu, the gigapixel camera will capture images that will be used to scan the skies for killer asteroids and to create the most comprehensive catalog of stars and galaxies ever produced. The CCD sensors at the heart of the camera were developed in collaboration with Lincoln Laboratory of the Massachusetts Institute of Technology. The image area, which is about 40 cm across, contains 60 identical silicon chips, each of which contains 64 independent imaging circuits. Each of these imaging circuits contains approximately 600 x 600 pixels, for a total of about 1.4 gigapixels in the focal plane. The CCDs themselves employ an innovative technology called "orthogonal transfer." Splitting the image area into about 4,000 separate regions in this way has three advantages: data can be recorded more quickly, saturation of the image by a very bright star is confined to a small region, and any defects in the chips affect only a small part of the image area. The CCD camera is controlled by an ultrafast 480-channel control system developed at the IfA. The individual CCD cells are grouped in 8 x 8 arrays on a single silicon chip called an orthogonal transfer array (OTA), which measures about 5 cm square. There are a total of 60 OTAs in the focal plane of each telescope.
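The pixel bookkeeping in the abstract above can be reproduced in a few lines of arithmetic, using only the numbers quoted in the text (60 OTA chips, 8x8 cells per chip, ~600x600 pixels per cell):

```python
# Focal-plane bookkeeping for PS1, with figures taken from the text.
otas = 60                     # orthogonal transfer arrays in the focal plane
cells_per_ota = 8 * 8         # 64 independent imaging circuits per chip
pixels_per_cell = 600 * 600   # approximate cell size in pixels

regions = otas * cells_per_ota
total_pixels = regions * pixels_per_cell

print(regions)                # 3840 -> "about 4,000 separate regions"
print(total_pixels / 1e9)     # 1.3824 -> "about 1.4 gigapixels"
```

This confirms that the quoted "about 4,000 regions" and "about 1.4 gigapixels" are consistent with the per-chip numbers.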

Tonry, John L.; Isani, S.; Onaka, P.

2007-12-01

440

ADDING PRIVACY CONSTRAINTS TO VIDEO-BASED APPLICATIONS  

E-print Network

ADDING PRIVACY CONSTRAINTS TO VIDEO-BASED APPLICATIONS. A. Cavallaro, Multimedia and Vision Lab, Queen Mary, University of London (cavallaro@elec.qmul.ac.uk). Remote accessibility to video cameras, face recognition software, and searchable image and video databases enable applications ranging from video-based behaviour modelling and interactive games to video surveillance.

Cavallaro, Andrea