Measurement of the timing behaviour of off-the-shelf cameras
NASA Astrophysics Data System (ADS)
Schatz, Volker
2017-04-01
This paper presents a measurement method suitable for investigating the timing properties of cameras. A single light source illuminates the camera detector starting with a varying defined delay after the camera trigger. Pixels from the recorded camera frames are summed up and normalised, and the resulting function is indicative of the overlap between illumination and exposure. This allows one to infer the trigger delay and the exposure time with sub-microsecond accuracy. The method is therefore of interest when off-the-shelf cameras are used in reactive systems or synchronised with other cameras. It can supplement radiometric and geometric calibration methods for cameras in scientific use. A closer look at the measurement results reveals deviations from the ideal camera behaviour of constant sensitivity limited to the exposure interval. One of the industrial cameras investigated retains a small sensitivity long after the end of the nominal exposure interval. All three investigated cameras show non-linear variations of sensitivity of O(10^-3) to O(10^-2) during exposure. Due to its sign, the latter effect cannot be described by a sensitivity function depending on the time after triggering, but represents non-linear pixel characteristics.
The research of adaptive-exposure on spot-detecting camera in ATP system
NASA Astrophysics Data System (ADS)
Qian, Feng; Jia, Jian-jun; Zhang, Liang; Wang, Jian-Yu
2013-08-01
A high-precision acquisition, tracking, and pointing (ATP) system is one of the key technologies of laser communication. The spot-detecting camera detects the direction of the beacon in the laser communication link, providing the ATP system with the position of the communication terminal. The positioning accuracy of the camera directly determines the capability of the laser communication system, so the spot-detecting camera in a satellite-to-earth laser communication ATP system requires high-precision target detection: positioning accuracy should be better than ±1 μrad. Spot-detecting cameras usually adopt a centroid algorithm to obtain the position of the light spot on the detector. When the intensity of the beacon is moderate, the centroid calculation is precise. But the beacon intensity changes greatly during communication due to distance, atmospheric scintillation, weather, etc. The detector output signal is insufficient when the camera underexposes the beacon at low light intensity; conversely, the output signal saturates when the camera overexposes the beacon at high light intensity. The accuracy of the centroid calculation worsens if the spot-detecting camera underexposes or overexposes, and the positioning accuracy of the camera is then reduced markedly. To improve accuracy, space-based cameras should regulate exposure time in real time according to the light intensity. An adaptive-exposure algorithm for a spot-detecting camera based on a complementary metal-oxide-semiconductor (CMOS) detector is analyzed. Based on the analytic results, a CMOS camera for a space-based laser communication system is described that uses the adaptive-exposure algorithm to adapt its exposure time. Test results from an imaging experiment system verify the design.
Experimental results show that this design prevents the loss of positioning accuracy caused by changes in light intensity, so the camera maintains stable, high positioning accuracy during communication.
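The centroid algorithm mentioned in this abstract is an intensity-weighted mean of pixel coordinates. The sketch below is a minimal illustration under assumptions of our own (a small grayscale frame, a simple proportional exposure rule); it is not the flight implementation:

```python
def spot_centroid(frame):
    """Intensity-weighted centroid of a light spot.
    frame: 2D list of pixel intensities. Returns (x, y)."""
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    if total == 0:
        raise ValueError("no signal in frame")
    return sx / total, sy / total

def adjust_exposure(t, peak, full_scale, target=0.6):
    """Toy adaptive-exposure rule (our assumption, not the paper's):
    scale exposure time so the peak lands near target * full_scale."""
    return t * target * full_scale / max(peak, 1)

# A 3x3 spot centred on pixel (1, 1):
frame = [[0, 10, 0],
         [10, 40, 10],
         [0, 10, 0]]
print(spot_centroid(frame))  # -> (1.0, 1.0)
```

Under- or over-exposure biases the centroid because clipped or noise-dominated pixels distort the weights, which is why the exposure rule feeds back on the measured peak.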
An efficient multiple exposure image fusion in JPEG domain
NASA Astrophysics Data System (ADS)
Hebbalaguppe, Ramya; Kakarala, Ramakrishna
2012-01-01
In this paper, we describe a method to fuse multiple images taken with varying exposure times in the JPEG domain. The proposed algorithm finds application in HDR image acquisition and image stabilization for hand-held devices such as mobile phones, music players with cameras, and digital cameras. Image acquisition in low light typically results in blurry and noisy images for hand-held cameras. Altering camera settings such as ISO sensitivity, exposure time, and aperture for low-light capture results in noise amplification, motion blur, and reduced depth of field, respectively. The purpose of fusing multiple exposures is to combine the sharp details of the shorter-exposure images with the high signal-to-noise ratio (SNR) of the longer-exposure images. The algorithm requires only a single pass over all images, making it efficient. It comprises sigmoidal boosting of the shorter-exposed images, image fusion, artifact removal, and saturation detection. The algorithm needs no more memory than a single JPEG macroblock, making it feasible to implement as part of a digital camera's hardware image-processing engine. The artifact-removal step reuses JPEG's built-in frequency analysis and hence benefits from the considerable optimization and design experience available for JPEG.
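Two of the listed steps, sigmoidal boosting and saturation-aware fusion, can be sketched per normalised pixel value as below. The gain and midpoint parameters are illustrative assumptions, not values from the paper:

```python
import math

def sigmoid_boost(y, gain=8.0, mid=0.5):
    """Sigmoidal boosting of a normalised value from a short exposure,
    lifting mid-tones before fusion (parameters are assumed)."""
    return 1.0 / (1.0 + math.exp(-gain * (y - mid)))

def fuse(short_px, long_px, long_saturated):
    """Prefer the long exposure (better SNR) unless it is saturated,
    in which case fall back to the boosted short exposure."""
    if long_saturated:
        return sigmoid_boost(short_px)
    return long_px
```

In the actual method this logic operates on JPEG coefficients rather than individual spatial pixels; the sketch only conveys the role each step plays.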
ERIC Educational Resources Information Center
Lancor, Rachael; Lancor, Brian
2014-01-01
In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…
High dynamic range adaptive real-time smart camera: an overview of the HDR-ARTiST project
NASA Astrophysics Data System (ADS)
Lapray, Pierre-Jean; Heyrman, Barthélémy; Ginhac, Dominique
2015-04-01
Standard cameras capture only a fraction of the information that is visible to the human visual system. This is especially true for natural scenes that include areas of low and high illumination due to transitions between sunlit and shaded areas. When capturing such a scene, many cameras are unable to store the full Dynamic Range (DR), resulting in low-quality video where details are concealed in shadows or washed out by sunlight. The imaging technique that can overcome this problem is called HDR (High Dynamic Range) imaging. This paper describes a complete smart camera built around a standard off-the-shelf LDR (Low Dynamic Range) sensor and a Virtex-6 FPGA board. This smart camera, called HDR-ARtiSt (High Dynamic Range Adaptive Real-time Smart camera), is able to produce a real-time HDR live color video stream by recording and combining multiple acquisitions of the same scene while varying the exposure time. This technique appears to be one of the most appropriate and cheapest solutions for enhancing the dynamic range of real-life environments. HDR-ARtiSt embeds real-time multiple capture, HDR processing, data display, and transfer of an HDR color video at full sensor resolution (1280 x 1024 pixels) at 60 frames per second. The main contributions of this work are: (1) a Multiple Exposure Control (MEC) dedicated to smart image capture, alternating three exposure times that are dynamically evaluated from frame to frame, (2) a Multi-streaming Memory Management Unit (MMMU) dedicated to the memory read/write operations of the three parallel video streams corresponding to the different exposure times, (3) HDR creation by combining the video streams using a dedicated hardware version of Debevec's technique, and (4) Global Tone Mapping (GTM) of the HDR scene for display on a standard LCD monitor.
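The Debevec-style combination in contribution (3) weights each exposure's radiance estimate by how well-exposed the pixel is. A minimal software sketch, assuming a linear sensor response and normalised pixel values (a simplification of the hardware pipeline described above):

```python
def hat_weight(z, z_min=0.0, z_max=1.0):
    """Triangular ("hat") weight favouring mid-range pixel values,
    in the spirit of Debevec and Malik's HDR recovery."""
    mid = 0.5 * (z_min + z_max)
    return (z - z_min) if z <= mid else (z_max - z)

def hdr_pixel(values, exposures):
    """Weighted average of per-exposure radiance estimates (value / time)
    for one pixel, assuming a linear sensor response."""
    num = den = 0.0
    for z, t in zip(values, exposures):
        w = hat_weight(z)
        num += w * (z / t)
        den += w
    return num / den if den else 0.0
```

For a static pixel the three estimates agree (e.g. values 0.2, 0.4, 0.8 at exposures 1, 2, 4 all imply radiance 0.2); the weighting matters when short exposures are noisy and long ones are clipped.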
NASA Astrophysics Data System (ADS)
Gaddam, Vamsidhar Reddy; Griwodz, Carsten; Halvorsen, Pål
2014-02-01
One of the most common ways of capturing wide field-of-view scenes is by recording panoramic videos. Using an array of cameras with limited overlap in the corresponding images, one can generate good panorama images. Using the panorama, several immersive display options can be explored. There is a two-fold synchronization problem associated with such a system. One is temporal synchronization, a challenge easily handled by using a common triggering solution to control the shutters of the cameras. The other is automatic exposure synchronization, which has no straightforward solution, especially in a wide-area scenario with uncontrolled light conditions such as an open, outdoor football stadium. In this paper, we present the challenges and approaches for creating a completely automatic real-time panoramic capture system, with a particular focus on the camera settings. One of the main challenges in building such a system is that no single area of the pitch is visible to all the cameras for metering the light in order to find appropriate camera parameters. One approach we tested is to use the green color of the field grass; it provided acceptable results only in limited light conditions. A second approach exploits the overlapping areas between adjacent cameras, creating pairs of perfectly matched video streams; however, some disparity still existed between different pairs. We finally developed an approach that exploits the time between two temporal frames to communicate the exposures among the cameras, achieving a perfectly synchronized array. An analysis of the system and some experimental results are presented in this paper. In summary, a pilot-camera approach running in auto-exposure mode and then distributing the used exposure values to the other cameras seems to give the best visual results.
Wearable camera-derived microenvironments in relation to personal exposure to PM2.5.
Salmon, Maëlle; Milà, Carles; Bhogadi, Santhi; Addanki, Srivalli; Madhira, Pavitra; Muddepaka, Niharika; Mora, Amaravathi; Sanchez, Margaux; Kinra, Sanjay; Sreekanth, V; Doherty, Aiden; Marshall, Julian D; Tonne, Cathryn
2018-05-17
Data regarding which microenvironments drive exposure to air pollution in low- and middle-income countries are scarce. Our objective was to identify sources of time-resolved personal PM2.5 exposure in peri-urban India using wearable camera-derived microenvironmental information. We conducted a panel study with up to 6 repeated non-consecutive 24 h measurements on 45 participants (186 participant-days). Camera images were manually annotated to derive visual concepts indicative of microenvironments and activities. Men had slightly higher daily mean PM2.5 exposure (43 μg/m3) compared to women (39 μg/m3). Cameras helped identify that men also had higher exposures when near a biomass cooking unit (mean (sd) μg/m3: 119 (383) for men vs 83 (196) for women) and when present in the kitchen (133 (311) for men vs 48 (94) for women). Visual concepts associated in regression analysis with higher 5-minute PM2.5 for both sexes included: smoking (+93% (95% confidence interval: 63%, 129%) in men, +29% (95% CI: 2%, 63%) in women), biomass cooking unit (+57% (95% CI: 28%, 93%) in men, +69% (95% CI: 48%, 93%) in women), visible flame or smoke (+90% (95% CI: 48%, 144%) in men, +39% (95% CI: 6%, 83%) in women), and presence in the kitchen (+49% (95% CI: 27%, 75%) in men, +14% (95% CI: 7%, 20%) in women). Our results indicate that wearable cameras can provide objective, high-time-resolution microenvironmental data useful for identifying peak exposures and providing insights not evident from standard self-reported time-activity data. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
SFDT-1 Camera Pointing and Sun-Exposure Analysis and Flight Performance
NASA Technical Reports Server (NTRS)
White, Joseph; Dutta, Soumyo; Striepe, Scott
2015-01-01
The Supersonic Flight Dynamics Test (SFDT) vehicle was developed to advance and test technologies of NASA's Low Density Supersonic Decelerator (LDSD) Technology Demonstration Mission. The first flight test (SFDT-1) occurred on June 28, 2014. To maximize the usefulness of the camera data, analysis was performed to optimize parachute visibility in the camera field of view during deployment and inflation, and to determine the probability of sun-exposure issues with the cameras given the vehicle heading and launch time. This paper documents the analysis, its results, and a comparison with flight video from SFDT-1.
Registration of Large Motion Blurred Images
2016-05-09
in handling the dynamics of the capturing system, for example, a drone. CMOS sensors, used in recent times, when employed in these cameras produce two types of blur in the captured image when there is camera motion during exposure. However, contemporary CMOS sensors employ an electronic rolling shutter (RS
High-speed line-scan camera with digital time delay integration
NASA Astrophysics Data System (ADS)
Bodenstorfer, Ernst; Fürtler, Johannes; Brodersen, Jörg; Mayer, Konrad J.; Eckel, Christian; Gravogl, Klaus; Nachtnebel, Herbert
2007-02-01
In high-speed image acquisition and processing systems, the speed of operation is often limited by the amount of available light, due to short exposure times. Therefore, high-speed applications often use line-scan cameras based on charge-coupled device (CCD) sensors with time delay integration (TDI). Synchronous shifting and accumulation of photoelectric charges on the CCD chip, according to the objects' movement, results in a longer effective exposure time without introducing additional motion blur. This paper presents a high-speed color line-scan camera based on a commercial complementary metal oxide semiconductor (CMOS) area image sensor with a Bayer filter matrix and a field programmable gate array (FPGA). The camera implements a digital equivalent of the TDI effect exploited in CCD cameras. The proposed design benefits from the high frame rates of CMOS sensors and from the possibility of arbitrarily addressing the rows of the sensor's pixel array. For digital TDI, only a small number of rows is read out from the area sensor; these are then shifted and accumulated according to the movement of the inspected objects. This paper gives a detailed description of the digital TDI algorithm implemented on the FPGA. Relevant aspects of the practical application are discussed, and key features of the camera are listed.
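The shift-and-accumulate at the heart of digital TDI can be sketched in a few lines. This toy version assumes the object moves exactly one sensor row per frame; the FPGA implementation instead derives the shift from the transport speed of the inspected objects:

```python
def digital_tdi(frames, n_stages):
    """Digital TDI sketch: accumulate n_stages rows taken from
    successive frames, shifted by one row per frame so that each
    accumulated line tracks the same strip of the moving object.
    frames: list of frames, each a list of rows (lists of pixels).
    Returns one accumulated line per valid starting frame."""
    lines = []
    for i in range(len(frames) - n_stages + 1):
        acc = [0] * len(frames[0][0])
        for s in range(n_stages):
            # Row s of frame i+s sees the strip that row 0 of frame i saw.
            acc = [a + p for a, p in zip(acc, frames[i + s][s])]
        lines.append(acc)
    return lines
```

Each output line integrates n_stages exposures of the same object strip, so the effective exposure grows by that factor without extra motion blur, mirroring the CCD-TDI effect described above.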
Electrostatic camera system functional design study
NASA Technical Reports Server (NTRS)
Botticelli, R. A.; Cook, F. J.; Moore, R. F.
1972-01-01
A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.
An automatic lightning detection and photographic system
NASA Technical Reports Server (NTRS)
Wojtasinski, R. J.; Holley, L. D.; Gray, J. L.; Hoover, R. B.
1973-01-01
Conventional 35-mm camera is activated by an electronic signal every time lightning strikes in general vicinity. Electronic circuit detects lightning by means of antenna which picks up atmospheric radio disturbances. Camera is equipped with fish-eye lens, automatic shutter advance, and small 24-hour clock to indicate time when exposures are made.
Studies on the formation, temporal evolution and forensic applications of camera "fingerprints".
Kuppuswamy, R
2006-06-02
A series of experiments was conducted by exposing negative film in brand-new cameras of different makes and models. The exposures were repeated at regular time intervals spread over a period of 2 years. The processed film negatives were studied under a stereomicroscope (10-40x) in transmitted illumination for the presence of characterizing features on their four frame edges. These features were then related to those present on the masking frame of the cameras by examining the latter under reflected-light stereomicroscopy (10-40x). The purpose of the study was to determine the origin and permanence of the frame-edge marks, and also the processes by which the marks may alter with time. The investigations arrived at the following conclusions: (i) the edge marks originated principally from imperfections imparted to the film mask during manufacturing, and occasionally from dirt, dust, and fiber accumulated on the film mask over an extended time period. (ii) The edge profiles of the cameras remained fixed over a considerable period of time, so as to be a valuable identification medium. (iii) The marks were found to vary in nature even among cameras manufactured at a similar time. (iv) The f-number and object distance have a great effect on the recording of the frame-edge marks during exposure of the film. The above findings serve as a useful addition to the technique of camera edge-mark comparisons.
Optimization of camera exposure durations for multi-exposure speckle imaging of the microcirculation
Kazmi, S. M. Shams; Balial, Satyajit; Dunn, Andrew K.
2014-01-01
Improved Laser Speckle Contrast Imaging (LSCI) blood flow analyses that incorporate inverse models of the underlying laser-tissue interaction have been used to develop more quantitative implementations of speckle flowmetry such as Multi-Exposure Speckle Imaging (MESI). In this paper, we determine the optimal camera exposure durations required for obtaining flow information with accuracy comparable to the prevailing MESI implementation utilized in recent in vivo rodent studies. A looping leave-one-out (LOO) algorithm was used to identify exposure subsets, which were analyzed for accuracy against flows obtained from analysis with the original full exposure set over 9 animals comprising n = 314 regional flow measurements. From the 15 original exposures, the LOO process found 6 exposures that provide comparable accuracy, defined as being no more than 10% deviant, with respect to the original flow measurements. The optimal subset of exposures provides a basis set of camera durations for speckle flowmetry studies of the microcirculation and confers a two-fold faster acquisition rate and a 28% reduction in processing time without sacrificing accuracy. Additionally, the optimization process can be used to identify further reductions in the exposure subsets, tailoring imaging to less expansive flow distributions to enable even faster imaging. PMID:25071956
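The looping leave-one-out search can be sketched as a greedy pruning loop. Here `error_fn` is a stand-in for the paper's comparison of subset-derived flows against the full 15-exposure analysis, and the 10% tolerance mirrors the deviation bound above; the function name and greedy tie-breaking are our assumptions:

```python
def prune_exposures(exposures, error_fn, tol=0.10):
    """Greedy leave-one-out pruning: repeatedly drop the exposure whose
    removal keeps the error (vs. full-set results) lowest, stopping
    once any further removal would exceed tol.
    error_fn(subset) -> fractional deviation; exposures assumed distinct."""
    subset = list(exposures)
    while len(subset) > 1:
        candidates = [(error_fn([e for e in subset if e != d]), d)
                      for d in subset]
        err, drop = min(candidates)
        if err > tol:
            break
        subset.remove(drop)
    return subset
```

With a toy error that grows by 2% per removed exposure, the loop prunes down until only one exposure remains, since every step stays under the 10% bound.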
Programmable 10 MHz optical fiducial system for hydrodiagnostic cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huen, T.
1987-07-01
A solid state light control system was designed and fabricated for use with hydrodiagnostic streak cameras of the electro-optic type. With its use, the film containing the streak images will have on it two time scales simultaneously exposed with the signal. This allows timing and cross-timing; the latter is achieved with exposure-modulation marking onto the time tick marks. The purpose of using two time scales is discussed. The design is based on a microcomputer, resulting in a compact and easy-to-use instrument. The light source is a small red light-emitting diode. Time marking can be programmed in steps of 0.1 microseconds, with a range of 255 steps. The time accuracy is based on a precision 100 MHz quartz crystal, giving a divided-down 10 MHz system frequency. The light is guided by two small 100 micron diameter optical fibers, which facilitates light coupling onto the input slit of an electro-optic streak camera. Three distinct groups of exposure modulation of the time tick marks can be independently set anywhere on the streak duration. This system has been used successfully in Fabry-Perot laser velocimeters for over four years in our Laboratory. The microcomputer control section is also being used to provide optical fiducials to mechanical rotor cameras.
Visible camera imaging of plasmas in Proto-MPEX
NASA Astrophysics Data System (ADS)
Mosby, R.; Skeen, C.; Biewer, T. M.; Renfro, R.; Ray, H.; Shaw, G. C.
2015-11-01
The prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device being developed at Oak Ridge National Laboratory (ORNL). This machine will study plasma-material interaction (PMI) physics relevant to future fusion reactors. Measurements of plasma light emission will be made on Proto-MPEX using fast, visible framing cameras. The cameras utilize a global shutter, which allows a full-frame image of the plasma to be captured and compared at multiple times during the plasma discharge. Typical exposure times are ~10-100 microseconds. The cameras are capable of capturing images at up to 18,000 frames per second (fps). However, the frame rate is strongly dependent on the size of the "region of interest" that is sampled. The maximum ROI corresponds to the full detector area of ~1000x1000 pixels. The cameras have an internal gain, which controls the sensitivity of the 10-bit detector. The detector includes a Bayer filter for "true-color" imaging of the plasma emission. This presentation will examine the optimized camera settings for use on Proto-MPEX. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725.
High Dynamic Range Imaging Using Multiple Exposures
NASA Astrophysics Data System (ADS)
Hou, Xinglin; Luo, Haibo; Zhou, Peipei; Zhou, Wei
2017-06-01
It is challenging to capture a high-dynamic-range (HDR) scene using a low-dynamic-range (LDR) camera. This paper presents an approach for improving the dynamic range of cameras by using multiple exposure images of the same scene taken under different exposure times. First, the camera response function (CRF) is recovered by solving a high-order polynomial in which only the ratios of the exposures are used. Then, the HDR radiance image is reconstructed by weighted summation of the individual radiance maps. After that, a novel local tone mapping (TM) operator is proposed for the display of the HDR radiance image. By solving the high-order polynomial, the CRF can be recovered quickly and easily. Taking local image features and the characteristics of histogram statistics into consideration, the proposed TM operator preserves local details efficiently. Experimental results demonstrate the effectiveness of our method; by comparison, it outperforms other methods in terms of imaging quality.
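Ratio-based CRF recovery can be illustrated with a small least-squares fit. This toy version (our construction, much simpler than the paper's high-order solver) fits a polynomial inverse response g with g(0)=0 and g(1)=1 from pixel pairs whose exposures differ by a known ratio, using the constraint g(z2) = ratio * g(z1):

```python
import numpy as np

def recover_inverse_crf(z1, z2, ratio, order=3):
    """Fit a polynomial inverse CRF g(z) = sum_k c_k z^k (k >= 1)
    from same-pixel observations z1, z2 at exposure ratio t2/t1,
    with the normalisation g(1) = 1. Illustrative sketch only."""
    z1, z2 = np.asarray(z1, float), np.asarray(z2, float)
    powers = np.arange(1, order + 1)
    # Each pair contributes the homogeneous constraint g(z2) - ratio*g(z1) = 0.
    A = z2[:, None] ** powers - ratio * z1[:, None] ** powers
    A = np.vstack([A, np.ones(order)])  # g(1) = 1 normalisation row
    b = np.zeros(len(z1) + 1)
    b[-1] = 1.0
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

# Synthetic data from a gamma-like response: observed z = E**0.5, so g(z) = z**2.
E = np.linspace(0.05, 0.45, 9)
z1 = E ** 0.5
z2 = (2.0 * E) ** 0.5  # second exposure twice as long
c = recover_inverse_crf(z1, z2, ratio=2.0, order=3)  # c ~ [0, 1, 0]
```

Because the constraints are homogeneous, the normalisation row is what pins the overall scale, the same role played by fixing one radiance value in classical CRF recovery.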
Enhanced Early View of Ceres from Dawn
2014-12-05
As the Dawn spacecraft flies through space toward the dwarf planet Ceres, the unexplored world appears to its camera as a bright light in the distance, full of possibility for scientific discovery. This view was acquired as part of a final calibration of the science camera before Dawn's arrival at Ceres. To accomplish this, the camera needed to take pictures of a target that appears just a few pixels across. On Dec. 1, 2014, Ceres was about nine pixels in diameter, nearly perfect for this calibration. The images provide data on very subtle optical properties of the camera that scientists will use when they analyze and interpret the details of some of the pictures returned from orbit. Ceres is the bright spot in the center of the image. Because the dwarf planet is much brighter than the stars in the background, the camera team selected a long exposure time to make the stars visible. The long exposure made Ceres appear overexposed, and exaggerated its size; this was corrected by superimposing a shorter exposure of the dwarf planet in the center of the image. A cropped, magnified view of Ceres appears in the inset image at lower left. The image was taken on Dec. 1, 2014 with the Dawn spacecraft's framing camera, using a clear spectral filter. Dawn was about 740,000 miles (1.2 million kilometers) from Ceres at the time. Ceres is 590 miles (950 kilometers) across and was discovered in 1801. http://photojournal.jpl.nasa.gov/catalog/PIA19050
NASA Astrophysics Data System (ADS)
Breitfelder, Stefan; Reichel, Frank R.; Gaertner, Ernst; Hacker, Erich J.; Cappellaro, Markus; Rudolf, Peter; Voelk, Ute
1998-04-01
Digital cameras are of increasing significance for professional applications in photo studios where fashion, portrait, product and catalog photographs or advertising photos of high quality have to be taken. The eyelike is a digital camera system which has been developed for such applications. It is capable of working online with high frame rates and images of full sensor size, and it provides a resolution that can be varied between 2048 by 2048 and 6144 by 6144 pixels at an RGB color depth of 12 bits per channel, with a variable exposure time of 1/60 s to 1 s. With an exposure time of 100 ms, digitization takes approx. 2 seconds for an image of 2048 by 2048 pixels (12 MByte), 8 seconds for an image of 4096 by 4096 pixels (48 MByte) and 40 seconds for an image of 6144 by 6144 pixels (108 MByte). The eyelike can be used in various configurations. Used as a camera body, it accepts most commercial lenses via existing lens adaptors. On the other hand, the eyelike can be used as a back on most commercial 4 x 5 inch view cameras. This paper describes the eyelike camera concept with the essential system components. The article finishes with a description of the software, which is needed to bring the high quality of the camera to the user.
Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei
2016-01-01
High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can rapidly increase the temporal resolution several, or even hundreds of, times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution by using a 25 fps camera. PMID:26959023
Fluorescent image tracking velocimeter
Shaffer, Franklin D.
1994-01-01
A multiple-exposure fluorescent image tracking velocimeter (FITV) detects and measures the motion (trajectory, direction and velocity) of small particles close to light scattering surfaces. The small particles may follow the motion of a carrier medium such as a liquid, gas or multi-phase mixture, allowing the motion of the carrier medium to be observed, measured and recorded. The main components of the FITV include: (1) fluorescent particles; (2) a pulsed fluorescent excitation laser source; (3) an imaging camera; and (4) an image analyzer. FITV uses fluorescing particles excited by visible laser light to enhance particle image detectability near light scattering surfaces. The excitation laser light is filtered out before reaching the imaging camera allowing the fluoresced wavelengths emitted by the particles to be detected and recorded by the camera. FITV employs multiple exposures of a single camera image by pulsing the excitation laser light for producing a series of images of each particle along its trajectory. The time-lapsed image may be used to determine trajectory and velocity and the exposures may be coded to derive directional information.
Flexible nuclear medicine camera and method of using
Dilmanian, F.A.; Packer, S.; Slatkin, D.N.
1996-12-10
A nuclear medicine camera and method of use photographically record radioactive decay particles emitted from a source, for example a small, previously undetectable breast cancer, inside a patient. The camera includes a flexible frame containing a window, a photographic film, and a scintillation screen, with or without a gamma-ray collimator. The frame flexes for following the contour of the examination site on the patient, with the window being disposed in substantially abutting contact with the skin of the patient for reducing the distance between the film and the radiation source inside the patient. The frame is removably affixed to the patient at the examination site for allowing the patient mobility to wear the frame for a predetermined exposure time period. The exposure time may be several days for obtaining early qualitative detection of small malignant neoplasms. 11 figs.
NASA Astrophysics Data System (ADS)
Wojciechowski, Adam M.; Karadas, Mürsel; Huck, Alexander; Osterkamp, Christian; Jankuhn, Steffen; Meijer, Jan; Jelezko, Fedor; Andersen, Ulrik L.
2018-03-01
Sensitive, real-time optical magnetometry with nitrogen-vacancy centers in diamond relies on accurate imaging of small (≪10^-2) fractional fluorescence changes across the diamond sample. We discuss the limitations on magnetic field sensitivity resulting from the limited number of photoelectrons that a camera can record in a given time. Several types of camera sensors are analyzed, and the smallest measurable magnetic field change is estimated for each type. We show that most common sensors are of limited use in such applications, while certain highly specific cameras allow achieving nanotesla-level sensitivity within 1 s of combined exposure. Finally, we demonstrate results obtained with a lock-in camera that paves the way for real-time, wide-field magnetometry at the nanotesla level and with micrometer resolution.
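The photoelectron-budget argument can be made concrete with a one-line shot-noise estimate: the smallest resolvable fractional fluorescence change scales as one over the square root of the total photoelectrons collected. The numbers in the example are illustrative assumptions, not values from the paper:

```python
import math

def shot_noise_fractional_limit(full_well_e, n_pixels, fps, t_total=1.0):
    """Smallest fractional fluorescence change resolvable in a combined
    exposure of t_total seconds, assuming every frame fills each pixel
    to full well and the noise is purely photon shot noise."""
    n_e = full_well_e * n_pixels * fps * t_total  # total photoelectrons
    return 1.0 / math.sqrt(n_e)

# Assumed example: 30 ke- full well, 1 Mpixel sensor, 100 fps, 1 s budget:
delta_min = shot_noise_fractional_limit(30e3, 1e6, 100.0)
```

This is why sensors with large full-well capacity and high frame rates dominate in such applications: both raise the photoelectron count that sets the noise floor.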
CMOS Camera Array With Onboard Memory
NASA Technical Reports Server (NTRS)
Gat, Nahum
2009-01-01
A compact CMOS (complementary metal oxide semiconductor) camera system has been developed with high resolution (1.3 Megapixels), a USB (universal serial bus) 2.0 interface, and an onboard memory. Exposure times, and other operating parameters, are sent from a control PC via the USB port. Data from the camera can be received via the USB port and the interface allows for simple control and data capture through a laptop computer.
FPGA Based Adaptive Rate and Manifold Pattern Projection for Structured Light 3D Camera System †
Lee, Sukhan
2018-01-01
The quality of the captured point cloud and the scanning speed of a structured-light 3D camera system depend upon its ability to handle object surfaces with large reflectance variation, traded off against the required number of projected patterns. In this paper, we propose and implement a flexible embedded framework that can trigger the camera once or multiple times, capturing one or several projections within a single camera exposure setting. This allows the 3D camera system to synchronize the camera and projector even for mismatched frame rates, so that the system can project different types of patterns for different scan-speed applications. The system thus captures a high-quality 3D point cloud even for surfaces with large reflectance variation while achieving a high scan speed. The proposed framework is implemented on a Field Programmable Gate Array (FPGA), where the camera trigger is generated adaptively such that the position and number of triggers are determined automatically according to the camera exposure settings. In other words, the projection frequency adapts to different scanning applications without altering the architecture. In addition, the proposed framework is unique in that it requires no external memory for storage because pattern pixels are generated in real time, which minimizes the complexity and size of the application-specific integrated circuit (ASIC) design and implementation. PMID:29642506
The Multi-site All-Sky CAmeRA (MASCARA). Finding transiting exoplanets around bright (mV < 8) stars
NASA Astrophysics Data System (ADS)
Talens, G. J. J.; Spronck, J. F. P.; Lesage, A.-L.; Otten, G. P. P. L.; Stuik, R.; Pollacco, D.; Snellen, I. A. G.
2017-05-01
This paper describes the design, operations, and performance of the Multi-site All-Sky CAmeRA (MASCARA). Its primary goal is to find new exoplanets transiting bright stars, 4 < mV < 8, by monitoring the full sky. MASCARA consists of one northern station on La Palma, Canary Islands (fully operational since February 2015), one southern station at La Silla Observatory, Chile (operational from early 2017), and a data centre at Leiden Observatory in the Netherlands. Both MASCARA stations are equipped with five interline CCD cameras using wide field lenses (24 mm focal length) with fixed pointings, which together provide coverage down to airmass 3 of the local sky. The interline CCD cameras allow for back-to-back exposures, taken at fixed sidereal times with exposure times of 6.4 sidereal seconds. The exposures are short enough that the motion of stars across the CCD does not exceed one pixel during an integration. Astrometry and photometry are performed on-site, after which the resulting light curves are transferred to Leiden for further analysis. The final MASCARA archive will contain light curves for 70 000 stars down to mV = 8.4, with a precision of 1.5% per 5 minutes at mV = 8.
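The 6.4 sidereal-second exposures quoted above are slightly shorter than 6.4 SI seconds; a quick conversion using the standard length of the mean sidereal day (a textbook constant, not a figure from the paper) shows the difference is about 0.3%:

```python
SIDEREAL_DAY_S = 86164.0905  # mean sidereal day in SI seconds
SOLAR_DAY_S = 86400.0

def sidereal_to_si(t_sidereal):
    """Convert an exposure expressed in sidereal seconds to SI seconds."""
    return t_sidereal * SIDEREAL_DAY_S / SOLAR_DAY_S

# MASCARA's exposure of 6.4 sidereal seconds is about 6.3825 SI seconds.
t_si = sidereal_to_si(6.4)
```

Timing exposures in sidereal rather than SI seconds keeps each star on the same pixels from night to night for a fixed-pointing camera.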
Understanding the exposure-time effect on speckle contrast measurements for laser displays
NASA Astrophysics Data System (ADS)
Suzuki, Koji; Kubota, Shigeo
2018-02-01
To evaluate the influence of exposure time on speckle noise in laser displays, a speckle contrast measurement method observable at the response time of the human eye was developed, using a high-sensitivity camera with a signal-multiplying function. The nonlinearity of the camera's light sensitivity was calibrated to measure speckle contrast accurately, and the lower measurement limit of speckle contrast was improved by applying a spatial-frequency low-pass filter to the captured images. Three commercially available laser displays were measured over a wide range of exposure times, from tens of milliseconds to several seconds, without adjusting the brightness of the displays. The speckle contrast of a raster-scanned mobile projector without any speckle-reduction device was nearly constant over the various exposure times. By contrast, for full-frame projection laser displays equipped with a temporally averaging speckle-reduction device, speckle contrasts close to the lower measurement limit increased slightly at shorter exposure times because of noise. As a result, the exposure-time effect on speckle contrast could not be observed in our measurements, although it is more reasonable to expect that the speckle contrast of laser displays equipped with a temporally averaging speckle-reduction device depends on the exposure time. This discrepancy may be attributed to underestimation of the temporal averaging factor. We expect this method to be useful for evaluating various laser displays and for clarifying the relationship between speckle noise and exposure time in further verification of speckle reduction.
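Speckle contrast is conventionally defined as C = σ/μ, the ratio of the standard deviation to the mean of the image intensity. A minimal numpy sketch is given below; block averaging stands in for the paper's spatial-frequency low-pass filter, and the block size is an assumption.

```python
import numpy as np

def speckle_contrast(img, block=4):
    """Speckle contrast C = sigma/mu of an intensity image after a crude
    spatial low-pass (block averaging stands in for the low-pass filter
    used to suppress the measurement-limit noise)."""
    h, w = (s - s % block for s in img.shape)       # crop to multiples
    lp = img[:h, :w].reshape(h // block, block,
                             w // block, block).mean(axis=(1, 3))
    return lp.std() / lp.mean()
```

A perfectly uniform image gives C = 0, while fully developed speckle approaches C = 1; the filter choice directly sets the lower limit the abstract discusses.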
Impact of intense x-ray pulses on a NaI(Tl)-based gamma camera
NASA Astrophysics Data System (ADS)
Koppert, W. J. C.; van der Velden, S.; Steenbergen, J. H. L.; de Jong, H. W. A. M.
2018-03-01
In SPECT/CT systems x-ray and γ-ray imaging is performed sequentially. Simultaneous acquisition may have advantages, for instance in interventional settings. However, this may expose a gamma camera to relatively high x-ray doses and deteriorate its functioning. We studied the NaI(Tl) response to x-ray pulses with a photodiode, a PMT and a gamma camera, respectively. First, we exposed a NaI(Tl)-photodiode assembly to x-ray pulses to investigate potential crystal afterglow. Next, we exposed a NaI(Tl)-PMT assembly to 10 ms LED pulses (mimicking x-ray pulses) and measured the response to flashing LED probe-pulses (mimicking γ-pulses). We then exposed the assembly to x-ray pulses, with detector entrance doses of up to 9 nGy/pulse, and analysed the response for γ-pulse variations. Finally, we studied the response of a Siemens Diacam gamma camera to γ-rays while exposed to x-ray pulses. X-ray exposure of the crystal, read out with a photodiode, revealed a 15% afterglow fraction after 3 ms. The NaI(Tl)-PMT assembly showed disturbances up to 10 ms after 10 ms LED exposure. After x-ray exposure, however, responses showed elevated baselines with a 60 ms decay time. Both for x-ray and LED exposure and after baseline subtraction, probe-pulse analysis revealed disturbed pulse-height measurements shortly after exposure. X-ray exposure of the Diacam corroborated the elementary experiments. Up to 50 ms after an x-ray pulse, no events are registered, followed by apparent energy elevations up to 100 ms after exposure. Limiting the dose to 0.02 nGy/pulse prevents detrimental effects. Conventional gamma cameras exhibit substantial dead time and mis-registration of photon energies up to 100 ms after intense x-ray pulses. This is due to PMT limitations and to afterglow in the crystal. Using PMTs with modified circuitry, we show that deteriorative afterglow effects can be reduced without noticeable effects on the PMT performance, up to x-ray pulse doses of 1 nGy.
Electronic method for autofluorography of macromolecules on two-D matrices
Davidson, Jackson B.; Case, Arthur L.
1983-01-01
A method for detecting, localizing, and quantifying macromolecules contained in a two-dimensional matrix is provided which employs a television-based position sensitive detection system. A molecule-containing matrix may be produced by conventional means to produce spots of light at the molecule locations which are detected by the television system. The matrix, such as a gel matrix, is exposed to an electronic camera system including an image-intensifier and secondary electron conduction camera capable of light integrating times of many minutes. A light image stored in the form of a charge image on the camera tube target is scanned by conventional television techniques, digitized, and stored in a digital memory. Intensity of any point on the image may be determined from the number at the memory address of the point. The entire image may be displayed on a television monitor for inspection and photographing or individual spots may be analyzed through selected readout of the memory locations. Compared to conventional film exposure methods, the exposure time may be reduced 100-1000 times.
Motion-Blur-Free High-Speed Video Shooting Using a Resonant Mirror
Inoue, Michiaki; Gu, Qingyi; Takaki, Takeshi; Ishii, Idaku; Tajima, Kenji
2017-01-01
This study proposes a novel concept of actuator-driven frame-by-frame intermittent tracking for motion-blur-free video shooting of fast-moving objects. The camera frame and shutter timings are controlled in synchronization with a free-vibration-type actuator vibrating with a large amplitude at hundreds of hertz, so that motion blur is significantly reduced in free-viewpoint, high-frame-rate video shooting of fast-moving objects by drawing out the maximum performance of the actuator. We develop a prototype of a motion-blur-free video shooting system by implementing our frame-by-frame intermittent tracking algorithm on a high-speed video camera system with a resonant mirror vibrating at 750 Hz. It can capture 1024 × 1024 images of fast-moving objects at 750 fps with an exposure time of 0.33 ms without motion blur. Several experimental results for fast-moving objects verify that our proposed method can reduce image degradation from motion blur without decreasing the camera exposure time. PMID:29109385
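The synchronization idea above can be illustrated with a little math: if the resonant mirror's angle follows θ(t) = A·sin(2πft), its angular rate is θ'(t) = 2πfA·cos(2πft), and the exposure should be centred at the phase where this rate matches the target's apparent angular rate, so the mirror cancels the image motion. The sketch below assumes this sinusoidal model; the function name and interface are illustrative, not from the paper.

```python
import math

def trigger_phase(target_rate, amp, freq):
    """Time within the mirror cycle (seconds after the zero crossing)
    where the mirror rate 2*pi*freq*amp*cos(2*pi*freq*t) equals the
    target's apparent angular rate, i.e. where tracking cancels blur."""
    peak_rate = 2 * math.pi * freq * amp
    if abs(target_rate) > peak_rate:
        raise ValueError("target moves faster than the mirror can track")
    return math.acos(target_rate / peak_rate) / (2 * math.pi * freq)
```

For a stationary target the answer is a quarter period (the mirror's turning point, where its rate is zero); for a 750 Hz mirror that is about 0.33 ms into the cycle, which is consistent in scale with the 0.33 ms exposures quoted above.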
Flexible nuclear medicine camera and method of using
Dilmanian, F. Avraham; Packer, Samuel; Slatkin, Daniel N.
1996-12-10
A nuclear medicine camera 10 and method of use photographically record radioactive decay particles emitted from a source, for example a small, previously undetectable breast cancer, inside a patient. The camera 10 includes a flexible frame 20 containing a window 22, a photographic film 24, and a scintillation screen 26, with or without a gamma-ray collimator 34. The frame 20 flexes for following the contour of the examination site on the patient, with the window 22 being disposed in substantially abutting contact with the skin of the patient for reducing the distance between the film 24 and the radiation source inside the patient. The frame 20 is removably affixed to the patient at the examination site for allowing the patient mobility to wear the frame 20 for a predetermined exposure time period. The exposure time may be several days for obtaining early qualitative detection of small malignant neoplasms.
Reciprocity testing of Kodak film type SO-289 multispectral infrared aerial film
NASA Technical Reports Server (NTRS)
Lockwood, H. E.
1975-01-01
Kodak multispectral infrared aerial film type SO-289 was tested for reciprocity characteristics because of the variance between the I-B sensitometer exposure times (8 seconds and 4 seconds) and the camera exposure time (1/500 second) used on the ASTP stratospheric aerosol measurement project. Test exposures were made on the flight emulsion using a Mead star system sensitometer, the films were processed to ASTP control standards, and the resulting densities read and reciprocity data calculated. It was found that less exposure was required to produce a typical density (1.3) at 1/500 second exposure time than at an 8 second exposure time. This exposure factor was 2.8.
Recreational use assessment of water-based activities, using time-lapse construction cameras.
Sunger, Neha; Teske, Sondra S; Nappier, Sharon; Haas, Charles N
2012-01-01
Recreational exposure to surface waters during periods of increased pathogen concentration may lead to a significantly higher risk of illness. However, estimates of elementary exposure factors necessary to evaluate health risk (i.e., usage distributions and exposure durations) are not available for many non-swimming water-related activities. No prior studies have assessed non-swimming water exposure with respect to factors leading to impaired water quality from increased pathogen concentration, such as weather condition (rain events produce increased runoff and sewer overflows) and type of day (heavy recreational periods). We measured usage patterns and evaluated the effect of weather and type of day at eight water sites located within Philadelphia, by using a novel "time lapse photography" technology during three peak recreational seasons (May-September) 2008-2010. Camera observations validated with simultaneous in-person surveys exhibited a strong correlation (R² = 0.81 to 0.96) between the two survey techniques, indicating that the application of remote photography in collecting human exposure data was appropriate. Recreational activity usage varied more on a temporal basis than due to inclement weather. Only 14% (6 out of 44) of the site-specific activity combinations showed a dry-weather preference, whereas 41.5% (17 out of 41) of the combinations indicated greater usage on weekends compared with weekdays. In general, the log-normal distribution described the playing and wading duration distributions, while the gamma distribution was the best fit for fishing durations. Remote photography provided unbiased, real-time human exposure data and was less personnel intensive compared with traditional survey methods. However, there are potential limitations associated with remote surveillance data related to its limited view.
This is the first study to report that time lapse cameras can be successfully applied to assess water-based human recreational patterns and can provide precise exposure statistics for non-swimming recreational exposures.
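The duration distributions mentioned above (log-normal for playing and wading, gamma for fishing) can be fitted from observed durations with simple method-of-moments estimators. The numpy-only sketch below is illustrative; the study's actual fitting procedure is not described in the abstract.

```python
import numpy as np

def fit_lognormal(durations):
    """Method-of-moments fit to a log-normal: returns (mu, sigma) of the
    log-durations, the distribution's natural parameters."""
    logs = np.log(durations)
    return logs.mean(), logs.std()

def fit_gamma(durations):
    """Method-of-moments fit to a gamma distribution: shape k and scale
    theta solved from mean = k*theta and variance = k*theta**2."""
    m, v = np.mean(durations), np.var(durations)
    theta = v / m
    return m / theta, theta
```

Maximum-likelihood fitting (e.g. via scipy.stats) would usually be preferred for reporting, but moment matching is enough to illustrate how the two candidate distributions are parameterized from the same data.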
NASA Astrophysics Data System (ADS)
Zoletnik, S.; Biedermann, C.; Cseh, G.; Kocsis, G.; König, R.; Szabolics, T.; Szepesi, T.; Wendelstein 7-X Team
2018-01-01
A special video camera has been developed for the 10-camera overview video system of the Wendelstein 7-X (W7-X) stellarator, considering multiple application needs and the limitations resulting from this complex long-pulse superconducting stellarator experiment. The event detection intelligent camera (EDICAM) uses a special 1.3 Mpixel CMOS sensor with non-destructive read capability, which enables fast monitoring of smaller Regions of Interest (ROIs) even during long exposures. The camera can perform simple data evaluation algorithms (minimum/maximum, mean comparison to levels) on the ROI data, which can dynamically change the readout process and generate output signals. Multiple EDICAM cameras were operated in the first campaign of W7-X and their capabilities were explored in the real environment. Data prove that the camera can be used for taking long-exposure (10-100 ms) overview images of the plasma while sub-ms monitoring and even multi-camera correlated edge plasma turbulence measurements of smaller areas can be done in parallel. The latter revealed that filamentary turbulence structures extend between neighboring modules of the stellarator. Considerations emerging for future upgrades of this system and similar setups on future long-pulse fusion experiments such as ITER are discussed.
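The ROI evaluation described above (minimum/maximum, mean comparison to levels) can be sketched in a few lines. This is a software illustration of the idea only; in EDICAM the evaluation runs in hardware during non-destructive reads, and all names here are invented.

```python
import numpy as np

def roi_event(frame, roi, level, stat="mean"):
    """Evaluate a Region of Interest and flag an event when a simple
    statistic exceeds a level, which could then alter the readout
    process or raise an output signal."""
    y0, y1, x0, x1 = roi
    patch = frame[y0:y1, x0:x1]
    value = {"min": patch.min, "max": patch.max, "mean": patch.mean}[stat]()
    return value > level
```

In a real event-driven readout, a True result would typically switch the sensor to fast ROI-only reads around the detected feature while the long overview exposure continues.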
Multiplexed time-lapse photomicrography of cultured cells.
Heye, R R; Kiebler, E W; Arnzen, R J; Tolmach, L J
1982-01-01
A system of cinemicrography has been developed in which a single microscope and 16 mm camera are multiplexed to produce a time-lapse photographic record of many fields simultaneously. The field coordinates and focus are selected via a control console and entered into the memory of a dedicated microcomputer; they are then automatically recalled in sequence, thus permitting the photographing of additional fields in the interval between exposures of any given field. Sequential exposures of each field are isolated in separate sections of the film by means of a specially designed random-access camera that is also controlled by the microcomputer. The need to unscramble frames is thereby avoided, and the developed film can be directly analysed.
von der Heide, Anna Maria; Fallavollita, Pascal; Wang, Lejing; Sandner, Philipp; Navab, Nassir; Weidert, Simon; Euler, Ekkehard
2018-04-01
In orthopaedic trauma surgery, image-guided procedures are mostly based on fluoroscopy, and the reduction of radiation exposure is an important goal. The purpose of this work was to investigate the impact of a camera-augmented mobile C-arm (CamC) on radiation exposure and the surgical workflow during a first clinical trial. Applying a workflow-oriented approach, 10 general workflow steps were defined to compare the CamC to traditional C-arms. The surgeries included were arbitrarily identified and assigned to the study. The evaluation criteria were radiation exposure and operation time for each workflow step and for the entire surgery. The evaluation protocol was designed and conducted in a single-centre study. Radiation exposure was markedly reduced, by 18 X-ray shots (46%), using the CamC while keeping similar surgery times. The intuitiveness of the system, its easy integration into the surgical workflow, and its great potential to reduce radiation have been demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
Electronic method for autofluorography of macromolecules on two-D matrices. [Patent application
Davidson, J.B.; Case, A.L.
1981-12-30
A method for detecting, localizing, and quantifying macromolecules contained in a two-dimensional matrix is provided which employs a television-based position sensitive detection system. A molecule-containing matrix may be produced by conventional means to produce spots of light at the molecule locations which are detected by the television system. The matrix, such as a gel matrix, is exposed to an electronic camera system including an image-intensifier and secondary electron conduction camera capable of light integrating times of many minutes. A light image stored in the form of a charge image on the camera tube target is scanned by conventional television techniques, digitized, and stored in a digital memory. Intensity of any point on the image may be determined from the number at the memory address of the point. The entire image may be displayed on a television monitor for inspection and photographing or individual spots may be analyzed through selected readout of the memory locations. Compared to conventional film exposure methods, the exposure time may be reduced 100 to 1000 times.
Audi, Ahmad; Pierrot-Deseilligny, Marc; Meynard, Christophe
2017-01-01
Images acquired with a long exposure time using a camera embedded on UAVs (Unmanned Aerial Vehicles) exhibit motion blur due to the erratic movements of the UAV. The aim of the present work is to be able to acquire several images with a short exposure time and use an image processing algorithm to produce a stacked image with an equivalent long exposure time. Our method is based on the feature point image registration technique. The algorithm is implemented on the light-weight IGN (Institut national de l’information géographique) camera, which has an IMU (Inertial Measurement Unit) sensor and an SoC (System on Chip)/FPGA (Field-Programmable Gate Array). To obtain the correct parameters for the resampling of the images, the proposed method accurately estimates the geometrical transformation between the first and the N-th images. Feature points are detected in the first image using the FAST (Features from Accelerated Segment Test) detector, then homologous points on other images are obtained by template matching using an initial position benefiting greatly from the presence of the IMU sensor. The SoC/FPGA in the camera is used to speed up some parts of the algorithm in order to achieve real-time performance as our ultimate objective is to exclusively write the resulting image to save bandwidth on the storage device. The paper includes a detailed description of the implemented algorithm, resource usage summary, resulting processing time, resulting images and block diagrams of the described architecture. The resulting stacked image obtained for real surveys does not seem visually impaired. An interesting by-product of this algorithm is the 3D rotation estimated by a photogrammetric method between poses, which can be used to recalibrate in real time the gyrometers of the IMU. Timing results demonstrate that the image resampling part of this algorithm is the most demanding processing task and should also be accelerated in the FPGA in future work. PMID:28718788
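The registration-and-stacking idea can be illustrated with a numpy-only sketch. Phase correlation stands in here for the paper's FAST detection and IMU-aided template matching, and only integer translations are handled, whereas the real algorithm estimates a full geometrical transformation; all function names are invented for illustration.

```python
import numpy as np

def shift_estimate(ref, img):
    """Integer translation from `ref` to `img` by phase correlation:
    the normalized cross-power spectrum peaks at the shift that maps
    `img` back onto `ref`."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    return (dy - h if dy > h // 2 else dy), (dx - w if dx > w // 2 else dx)

def stack(frames):
    """Align every frame to the first one and average them, emulating a
    long exposure built from several short ones."""
    ref = frames[0]
    acc = ref.astype(float).copy()
    for f in frames[1:]:
        dy, dx = shift_estimate(ref, f)
        acc += np.roll(f, (dy, dx), axis=(0, 1))
    return acc / len(frames)
```

Averaging N registered short exposures keeps the total collected signal while the blur of each individual frame stays small, which is exactly the trade the abstract describes.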
Audi, Ahmad; Pierrot-Deseilligny, Marc; Meynard, Christophe; Thom, Christian
2017-07-18
Images acquired with a long exposure time using a camera embedded on UAVs (Unmanned Aerial Vehicles) exhibit motion blur due to the erratic movements of the UAV. The aim of the present work is to be able to acquire several images with a short exposure time and use an image processing algorithm to produce a stacked image with an equivalent long exposure time. Our method is based on the feature point image registration technique. The algorithm is implemented on the light-weight IGN (Institut national de l'information géographique) camera, which has an IMU (Inertial Measurement Unit) sensor and an SoC (System on Chip)/FPGA (Field-Programmable Gate Array). To obtain the correct parameters for the resampling of the images, the proposed method accurately estimates the geometrical transformation between the first and the N -th images. Feature points are detected in the first image using the FAST (Features from Accelerated Segment Test) detector, then homologous points on other images are obtained by template matching using an initial position benefiting greatly from the presence of the IMU sensor. The SoC/FPGA in the camera is used to speed up some parts of the algorithm in order to achieve real-time performance as our ultimate objective is to exclusively write the resulting image to save bandwidth on the storage device. The paper includes a detailed description of the implemented algorithm, resource usage summary, resulting processing time, resulting images and block diagrams of the described architecture. The resulting stacked image obtained for real surveys does not seem visually impaired. An interesting by-product of this algorithm is the 3D rotation estimated by a photogrammetric method between poses, which can be used to recalibrate in real time the gyrometers of the IMU. Timing results demonstrate that the image resampling part of this algorithm is the most demanding processing task and should also be accelerated in the FPGA in future work.
Image deblurring in smartphone devices using built-in inertial measurement sensors
NASA Astrophysics Data System (ADS)
Šindelář, Ondřej; Šroubek, Filip
2013-01-01
Long-exposure handheld photography is degraded with blur, which is difficult to remove without prior information about the camera motion. In this work, we utilize inertial sensors (accelerometers and gyroscopes) in modern smartphones to detect exact motion trajectory of the smartphone camera during exposure and remove blur from the resulting photography based on the recorded motion data. The whole system is implemented on the Android platform and embedded in the smartphone device, resulting in a close-to-real-time deblurring algorithm. The performance of the proposed system is demonstrated in real-life scenarios.
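The core step, turning recorded gyroscope samples into a blur kernel (point-spread function) that can then be deconvolved from the photograph, can be sketched as follows. A small-angle projection model is assumed, and the interface is illustrative; the abstract does not give the Android implementation's details.

```python
import numpy as np

def blur_kernel_from_gyro(rates, dt, focal_px, ksize=15):
    """Rasterise the camera's angular trajectory during the exposure
    into a blur kernel. `rates` holds gyro angular rates (rad/s) about
    two axes, one row per sample taken every `dt` seconds; `focal_px`
    converts small angles to pixel offsets on the sensor."""
    angles = np.cumsum(rates, axis=0) * dt   # integrate rate -> angle
    offsets = angles * focal_px              # project to pixel offsets
    kernel = np.zeros((ksize, ksize))
    c = ksize // 2
    for dy, dx in offsets:
        iy, ix = int(round(c + dy)), int(round(c + dx))
        if 0 <= iy < ksize and 0 <= ix < ksize:
            kernel[iy, ix] += 1.0            # time spent at this offset
    return kernel / kernel.sum()
```

With the kernel known, the blur becomes non-blind and a standard deconvolution (e.g. Wiener filtering) can be applied, which is why sensor-aided deblurring can run close to real time on the device.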
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goddu, S; Sun, B; Grantham, K
2016-06-15
Purpose: Proton therapy (PT) delivery is complex and extremely dynamic. Therefore, quality assurance testing is vital, but highly time-consuming. We have developed a High-Speed Scintillation-Camera-System (HS-SCS) for simultaneously measuring multiple beam characteristics. Methods: A high-speed camera was placed in a light-tight housing and a dual-layer neutron shield. The HS-SCS is synchronized with a synchrocyclotron to capture individual proton-beam-pulses (PBPs) at ~504 frames/sec. The PBPs from the synchrocyclotron trigger the HS-SCS to open its shutter for a programmed exposure time. Light emissions within a 30×30×5 cm³ plastic scintillator (BC-408) were captured by a CCD camera as individual images revealing dose deposition in a 2D plane, with a resolution of 0.7 mm for range and SOBP measurements and 1.67 mm for profiles. The CCD response as well as the signal-to-noise ratio (SNR) was characterized for varying exposure times and gains at different light intensities using a TV-Optoliner system. Software tools were developed to analyze ~5000 images to extract different beam parameters. Quenching correction factors were established by comparing scintillation Bragg peaks with water-scanned ionization-chamber measurements. Quenching-corrected Bragg peaks were integrated to ascertain the proton-beam range (PBR), the width of the spread-out Bragg peak (MOD) and distal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goggin, L; Kilby, W; Noll, M
2015-06-15
Purpose: A technique using a scintillator-mirror-camera system to measure MLC leakage was developed to provide an efficient alternative to film dosimetry while maintaining high spatial resolution. This work describes the technique together with measurement uncertainties. Methods: Leakage measurements were made for the InCise™ MLC using the Logos XRV-2020A device. For each measurement approximately 170 leakage and background images were acquired using optimized camera settings. The average background was subtracted from each leakage frame before filtering the integrated leakage image to replace anomalous pixels. Pixel-value-to-dose conversion was performed using a calibration image. Mean leakage was calculated within an ROI corresponding to the primary beam, and maximum leakage was determined by binning the image into overlapping 1 mm × 1 mm ROIs. 48 measurements were performed using 3 cameras and multiple MLC-linac combinations in varying beam orientations, with each compared to film dosimetry. Optical and environmental influences were also investigated. Results: Measurement time with the XRV-2020A was 8 minutes vs. 50 minutes using radiochromic film, and results were available immediately. Camera radiation exposure degraded measurement accuracy. With a relatively undamaged camera, mean leakage agreed with film measurement to ≤0.02% in 92% of cases and ≤0.03% in 100% (for maximum leakage the values were 88% and 96%), relative to the reference open-field dose. The estimated camera lifetime over which this agreement is maintained is at least 150 measurements, and it can be monitored using reference field exposures. A dependency on camera temperature was identified, and a reduction in sensitivity with distance from the image centre due to optical distortion was characterized. Conclusion: With periodic monitoring of the degree of camera radiation damage, the XRV-2020A system can be used to measure MLC leakage.
This represents a significant time saving when compared to the traditional film-based approach without any substantial reduction in accuracy.
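The processing chain described above (background subtraction, integration, pixel-to-dose calibration, windowed maximum search) can be sketched in numpy. The window size and calibration factor below are placeholders, not the XRV-2020A's actual parameters.

```python
import numpy as np

def mlc_leakage(leak_frames, bg_frames, cal, roi, win=3):
    """Mean and maximum leakage from a scintillator-camera measurement:
    subtract the average background frame from the integrated leakage
    image, convert pixel values to dose with a calibration factor, then
    scan a small window (`win` pixels, standing in for the 1 mm x 1 mm
    ROI) for the maximum."""
    img = (np.sum(leak_frames, axis=0)
           - len(leak_frames) * np.mean(bg_frames, axis=0)) * cal
    y0, y1, x0, x1 = roi
    region = img[y0:y1, x0:x1]
    h, w = region.shape
    window_means = [region[i:i + win, j:j + win].mean()
                    for i in range(h - win + 1) for j in range(w - win + 1)]
    return region.mean(), max(window_means)
```

Reporting the maximum over overlapping windows rather than over single pixels suppresses the influence of isolated anomalous pixels, consistent with the pixel-filtering step described above.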
Stargazing at 'Husband Hill Observatory' on Mars
NASA Technical Reports Server (NTRS)
2005-01-01
NASA's Mars Exploration Rover Spirit continues to take advantage of extra solar energy by occasionally turning its cameras upward for night sky observations. Most recently, Spirit made a series of observations of bright star fields from the summit of 'Husband Hill' in Gusev Crater on Mars. Scientists use the images to assess the cameras' sensitivity and to search for evidence of nighttime clouds or haze. The image on the left is a computer simulation of the stars in the constellation Orion. The next three images are actual views of Orion captured with Spirit's panoramic camera during exposures of 10, 30, and 60 seconds. Because Spirit is in the southern hemisphere of Mars, Orion appears upside down compared to how it would appear to viewers in the Northern Hemisphere of Earth. 'Star trails' in the longer exposures are a result of the planet's rotation. The faintest stars visible in the 60-second exposure are about as bright as the faintest stars visible with the naked eye from Earth (about magnitude 6 in astronomical terms). The Orion Nebula, famous as a nursery of newly forming stars, is also visible in these images. Bright streaks in some parts of the images aren't stars or meteors or unidentified flying objects, but are caused by solar and galactic cosmic rays striking the camera's detector. Spirit acquired these images with the panoramic camera on Martian day, or sol, 632 (Oct. 13, 2005) at around 45 minutes past midnight local time, using the camera's broadband filter (wavelengths of 739 nanometers plus or minus 338 nanometers).
Implementation of a Real-Time Stacking Algorithm in a Photogrammetric Digital Camera for UAVs
NASA Astrophysics Data System (ADS)
Audi, A.; Pierrot-Deseilligny, M.; Meynard, C.; Thom, C.
2017-08-01
In recent years, unmanned aerial vehicles (UAVs) have become an interesting tool in aerial photography and photogrammetry activities. In this context, some applications (like cloudy-sky surveys, narrow-spectral imagery and night-vision imagery) need a long exposure time, where one of the main problems is the motion blur caused by the erratic camera movements during image acquisition. This paper describes an automatic real-time stacking algorithm which produces a final composite image of high photogrammetric quality with an equivalent long exposure time, using several images acquired with short exposure times. Our method is inspired by the feature-based image registration technique. The algorithm is implemented on the light-weight IGN camera, which has an IMU sensor and a SoC/FPGA. To obtain the correct parameters for the resampling of images, the presented method accurately estimates the geometrical relation between the first and the Nth image, taking into account the internal parameters and the distortion of the camera. Features are detected in the first image by the FAST detector, then homologous points on other images are obtained by template matching aided by the IMU sensors. The SoC/FPGA in the camera is used to speed up time-consuming parts of the algorithm, such as feature detection and image resampling, in order to achieve real-time performance, as we want to write only the resulting final image to save bandwidth on the storage device. The paper includes a detailed description of the implemented algorithm, a resource usage summary, the resulting processing time, resulting images, as well as block diagrams of the described architecture. The resulting stacked image obtained on real surveys does not appear visually impaired. Timing results demonstrate that our algorithm can be used in real time, since its processing time is less than the writing time of an image to the storage device.
An interesting by-product of this algorithm is the 3D rotation estimated by a photogrammetric method between poses, which can be used to recalibrate in real-time the gyrometers of the IMU.
Astrophotography Basics: Meteors, Comets, Eclipses, Aurorae, Star Trails. Revised.
ERIC Educational Resources Information Center
Eastman Kodak Co., Rochester, NY.
This pamphlet gives an introduction to the principles of astronomical picture-taking. Chapters included are: (1) "Getting Started" (describing stationary cameras, sky charts and mapping, guided cameras, telescopes, brightness of astronomical subjects, estimating exposure, film selection, camera filters, film processing, and exposure for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hachtel, A. J.; Gillette, M. C.; Clements, E. R.
A novel home-built system for imaging cold atom samples is presented using a readily available astronomy camera which has the requisite sensitivity but no timing-control. We integrate the camera with LabVIEW achieving fast, low-jitter imaging with a convenient user-defined interface. We show that our system takes precisely timed millisecond exposures and offers significant improvements in terms of system jitter and readout time over previously reported home-built systems. Our system rivals current commercial “black box” systems in performance and user-friendliness.
VizieR Online Data Catalog: Photometry of YSOs in eight bright-rimmed clouds (Sharma+, 2016)
NASA Astrophysics Data System (ADS)
Sharma, S.; Pandey, A. K.; Borissova, J.; Ojha, D. K.; Ivanov, V. D.; Ogura, K.; Kobayashi, N.; Kurtev, R.; Gopinathan, M.; Yadav, R. K.
2016-08-01
Near-infrared (J, H, K') data for eight selected Bright-Rimmed Clouds (BRCs) along with two nearby field regions (see Table 1) were collected with the Infrared Side Port Imager (ISPI) camera (FOV ~10.5×10.5 arcmin²; scale 0.3 arcsec/pixel) on the 4 m Blanco telescope at Cerro Tololo Inter-American Observatory (CTIO), Chile, during the nights of 2010 March 03-04. The seeing was ~1 arcsec. The individual exposure times were 60 s per frame for all filters. The total exposure time for the target fields was 540 s in each of the J, H, and K' bands. We also used the infrared archival data taken with the Infrared Array Camera (IRAC) of the space-based Spitzer telescope in the 3.6, 4.5, 5.8, and 8.0 μm bands. We obtained Basic Calibrated Data (BCD) from the Spitzer data archive for all BRCs (except SFO 76, which has no Spitzer data). The exposure time of each BCD was 10.4 s (4 data files).
R&D 100, 2016: Ultrafast X-ray Imager
Porter, John; Claus, Liam; Sanchez, Marcos; Robertson, Gideon; Riley, Nathan; Rochau, Greg
2018-06-13
The Ultrafast X-ray Imager is a solid-state camera capable of capturing a sequence of images with user-selectable exposure times as short as 2 billionths of a second. Using 3D semiconductor integration techniques to form a hybrid chip, this camera was developed to enable scientists to study the heating and compression of fusion targets in the quest to harness the energy process that powers the stars.
An HDR imaging method with DTDI technology for push-broom cameras
NASA Astrophysics Data System (ADS)
Sun, Wu; Han, Chengshan; Xue, Xucheng; Lv, Hengyi; Shi, Junxia; Hu, Changhong; Li, Xiangzhi; Fu, Yao; Jiang, Xiaonan; Huang, Liang; Han, Hongyin
2018-03-01
Conventionally, high-dynamic-range (HDR) imaging is based on taking two or more pictures of the same scene with different exposures. However, due to the high-speed relative motion between the camera and the scene, this technique is hard to apply to push-broom remote sensing cameras. To achieve HDR imaging in push-broom remote sensing applications, the present paper proposes an innovative method which can generate HDR images without redundant image sensors or optical components. Specifically, it adopts an area-array CMOS (complementary metal oxide semiconductor) sensor with digital-domain time-delay-integration (DTDI) technology, instead of more than one row of image sensors, to capture more than one picture with different exposures. A new HDR image is then obtained by fusing the two original images with a simple algorithm. In the reported experiment, the dynamic range (DR) of the image increases by 26.02 dB. The proposed method is shown to be effective and has potential in other imaging applications where there is relative motion between the cameras and scenes.
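The fusion step can be sketched as follows. This is an illustrative stand-in, not the paper's actual algorithm: the function names and the saturation threshold are assumptions, using the common rule of replacing saturated long-exposure pixels with the scaled short exposure.

```python
import numpy as np

def fuse_two_exposures(short_img, long_img, ratio, sat_level=4095):
    """Fuse a short- and a long-exposure frame into one HDR frame.

    ratio: long/short exposure-time ratio; sat_level: sensor saturation
    level in counts. Saturated long-exposure pixels are replaced by the
    scaled short-exposure values, extending the dynamic range.
    """
    short_img = np.asarray(short_img, dtype=np.float64)
    long_img = np.asarray(long_img, dtype=np.float64)
    return np.where(long_img >= sat_level, short_img * ratio, long_img)

def dr_gain_db(ratio):
    """Dynamic-range gain of the fused image over a single exposure, in dB."""
    return 20.0 * np.log10(ratio)
```

Note that an exposure ratio of about 20 would give a gain of 20·log10(20) ≈ 26.02 dB, matching the figure reported in the abstract; whether that is the ratio actually used is an assumption here.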
Comet Wild 2 Up Close and Personal
NASA Technical Reports Server (NTRS)
2004-01-01
On January 2, 2004 NASA's Stardust spacecraft made a close flyby of comet Wild 2 (pronounced 'Vilt-2'). Among the equipment the spacecraft carried on board was a navigation camera. This is the 34th of the 72 images taken by Stardust's navigation camera during the close encounter. The exposure time was 10 milliseconds. The two frames are actually from a single exposure. The frame on the left depicts the comet as the human eye would see it. The frame on the right depicts the same image but 'stretched' so that the faint jets emanating from Wild 2 can be plainly seen. Comet Wild 2 is about five kilometers (3.1 miles) in diameter.
Multi-Angle Snowflake Camera Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuefer, Martin; Bailey, J.
2016-07-01
The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36°. Each camera's field of view is aligned to a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' fields of view within a 10° angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
Absolute colorimetric characterization of a DSLR camera
NASA Astrophysics Data System (ADS)
Guarnera, Giuseppe Claudio; Bianco, Simone; Schettini, Raimondo
2014-03-01
A simple but effective technique for absolute colorimetric camera characterization is proposed. It offers a large dynamic range while requiring just a single off-the-shelf target and a commonly available controllable light source for the characterization. The characterization task is broken down into two modules, respectively devoted to absolute luminance estimation and to colorimetric characterization matrix estimation. The characterized camera can be effectively used as a tele-colorimeter, giving an absolute estimation of the XYZ data in cd/m². The user is only required to vary the f-number of the camera lens or the exposure time t to better exploit the sensor dynamic range. The estimated absolute tristimulus values closely match the values measured by a professional spectroradiometer.
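The two-module characterization can be sketched as below. The matrix values and calibration constant are purely illustrative assumptions (a real characterization would fit them to the target measurements), but the exposure scaling follows the standard photographic model: recorded signal ∝ t/N², so absolute values scale by N²/t.

```python
import numpy as np

# Hypothetical 3x3 colorimetric characterization matrix (device-specific;
# these values are illustrative only, not a real camera's matrix).
M = np.array([[0.41, 0.36, 0.18],
              [0.21, 0.72, 0.07],
              [0.02, 0.12, 0.95]])

def camera_to_xyz(rgb, exposure_time, f_number, k=1.0):
    """Map linear camera RGB to absolute XYZ (cd/m^2).

    k is the absolute-luminance calibration constant determined during
    characterization. Doubling the exposure time halves the absolute
    estimate; doubling the f-number quadruples it.
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    xyz_rel = rgb @ M.T
    return k * (f_number ** 2 / exposure_time) * xyz_rel
```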
VizieR Online Data Catalog: HST and Magellan observations of Haumea system (Hastings+, 2016)
NASA Astrophysics Data System (ADS)
Hastings, D. M.; Ragozzine, D.; Fabrycky, D. C.; Burkhart, L. D.; Fuentes, C.; Margot, J.-L.; Brown, M. E.; Holman, M.
2017-01-01
The Hubble Space Telescope (HST) observations of the Haumea system comprised five HST orbits' worth of 100s exposures of the Wide Field Planetary Camera 2 from 2009 February 4 (Program 11971) and 10 HST orbits' worth of 44s exposures of the Wide Field Camera 3 from 2010 June 28 (Program 12243). This system was also observed on the night of UT 2009 June 2 with the Magellan Baade telescope at Las Campanas Observatory in Chile. We used the Raymond and Beverly Sackler Magellan Instant Camera (MagIC). Observations were taken from the beginning of the night until it was unobservable, for a total of ~5hr. We centered the system on one of the four quadrants defined by the instrument's four amplifiers. The seeing was constant during the observations and consistently close to 0.5'', smaller than Hi'iaka's separation of 1.4''. The SITe CCD detector has a pixel scale of 0.069''/pixel. We set the exposure times at 120s to avoid saturation and optimize readout time. The filter selected was Johnson-Cousins R. Standard calibrations were taken at the beginning and end of the night. The telescope guiding system ensured that the pointing was constant to within an FWHM over the course of the observations. Table1 presents the relative normalized photometry inferred from our observations. (1 data file).
Time-gated real-time pump-probe imaging spectroscopy
NASA Astrophysics Data System (ADS)
Ferrari, Raffaele; D'Andrea, Cosimo; Bassi, Andrea; Valentini, Gianluca; Cubeddu, Rinaldo
2007-07-01
An experimental technique which allows one to perform pump-probe transient absorption spectroscopy in real time is an important tool for studying irreversible processes. This is particularly interesting in the case of biological samples, which easily deteriorate upon exposure to light pulses, with the formation of permanent photoproducts and structural changes. In particular, pump-probe spectroscopy can provide fundamental information for the design of optical chromophores. In this work a real-time pump-probe imaging spectroscopy system has been realized, and we have explored the possibility of further reducing the number of laser pulses by using a time-gated camera. We believe that the use of a time-gated camera is an important step towards the final goal of pump-probe single-shot spectroscopy.
Film annotation system for a space experiment
NASA Technical Reports Server (NTRS)
Browne, W. R.; Johnson, S. S.
1989-01-01
This microprocessor system was designed to control and annotate a Nikon 35 mm camera for the purpose of obtaining photographs and data at predefined time intervals. The single STD BUS interface card was designed in such a way as to allow it to be used either in a stand-alone application with minimum features or installed in an STD BUS computer allowing for maximum features. This control system also allows the exposure of twenty-eight alphanumeric characters across the bottom of each photograph. The data contain such information as camera identification, frame count, user-defined text, and time to 0.01 second.
Hypervelocity impact studies using a rotating mirror framing laser shadowgraph camera
NASA Technical Reports Server (NTRS)
Parker, Vance C.; Crews, Jeanne Lee
1988-01-01
The need to study the effects of the impact of micrometeorites and orbital debris on various space-based systems has brought together the technologies of several companies and individuals in order to provide a successful instrumentation package. A light gas gun was employed to accelerate small projectiles to speeds in excess of 7 km/sec. Their impact on various targets is being studied with the help of a specially designed continuous-access rotating-mirror framing camera. The camera provides 80 frames of data at up to 1×10⁶ frames/sec with exposure times of 20 nsec.
NASA Astrophysics Data System (ADS)
Torres, Juan; Menéndez, José Manuel
2015-02-01
This paper establishes a real-time auto-exposure method to guarantee that surveillance cameras in uncontrolled light conditions take advantage of their whole dynamic range while providing neither under- nor overexposed images. State-of-the-art auto-exposure methods base their control on the brightness of the image measured in a limited region where the foreground objects are mostly located. Unlike these methods, the proposed algorithm establishes a set of indicators based on the image histogram that define its shape and position. Furthermore, the location of the objects to be inspected is likely unknown in surveillance applications, so the whole image is monitored in this approach. To control the camera settings, we defined a parameters function (Ef) that depends linearly on the shutter speed and the electronic gain, and is inversely proportional to the square of the lens aperture diameter. When the current acquired image is not overexposed, our algorithm computes the value of Ef that would move the histogram to the maximum value that does not overexpose the capture. When the current acquired image is overexposed, it computes the value of Ef that would move the histogram to a value that does not underexpose the capture and remains close to the overexposed region. If the image is both under- and overexposed, the whole dynamic range of the camera is already being used, and a default value of Ef that does not overexpose the capture is selected. This decision follows the idea that underexposed images are preferable to overexposed ones, because the noise produced in the lower regions of the histogram can be removed in a post-processing step, while the saturated pixels of the higher regions cannot be recovered. The proposed algorithm was tested in a video surveillance camera placed at an outdoor parking lot surrounded by buildings and trees which produce moving shadows on the ground.
During the daytime of seven days, the algorithm ran alternately with a representative auto-exposure algorithm from the recent literature. Besides sunrises and nightfalls, multiple weather conditions produced light changes in the scene: sunny hours with sharp shadows and highlights; cloud cover that softened the shadows; and cloudy and rainy hours that dimmed the scene. Several indicators were used to measure the performance of the algorithms, providing objective quality in terms of the time the algorithms take to recover from under- or overexposure, the brightness stability, and the deviation from optimal exposure. The results demonstrated that our algorithm reacts faster to all light changes than the selected state-of-the-art algorithm. It is also capable of acquiring well-exposed images and keeping the brightness stable for longer. Summing up, we concluded that the proposed algorithm provides a fast and stable auto-exposure method that maintains optimal exposure for video surveillance applications. Future work will involve evaluating this algorithm in robotics.
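A minimal sketch of the parameters function and the decision rule described in the abstract. The step factor and default value are assumptions; the paper computes the target Ef from histogram indicators rather than a fixed multiplicative step.

```python
def exposure_parameters_Ef(shutter_speed, gain, aperture_diameter):
    """Parameters function Ef as described: linear in shutter speed and
    electronic gain, inversely proportional to the squared lens aperture
    diameter."""
    return (shutter_speed * gain) / aperture_diameter ** 2

def next_Ef(current_Ef, overexposed, underexposed, step=1.25, default_Ef=1.0):
    """Decision rule sketch: prefer underexposure over overexposure, since
    dark-region noise can be reduced in post-processing while saturated
    pixels cannot be recovered."""
    if overexposed and underexposed:
        return default_Ef          # whole dynamic range in use: fall back
    if overexposed:
        return current_Ef / step   # lower exposure toward non-saturation
    if underexposed:
        return current_Ef * step   # raise exposure toward the bright end
    return current_Ef
```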
Image Alignment for Multiple Camera High Dynamic Range Microscopy.
Eastwood, Brian S; Childs, Elisabeth C
2012-01-09
This paper investigates the problem of image alignment for multiple camera high dynamic range (HDR) imaging. HDR imaging combines information from images taken with different exposure settings. Combining information from multiple cameras requires an alignment process that is robust to the intensity differences in the images. HDR applications that use a limited number of component images require an alignment technique that is robust to large exposure differences. We evaluate the suitability for HDR alignment of three exposure-robust techniques. We conclude that image alignment based on matching feature descriptors extracted from radiant power images from calibrated cameras yields the most accurate and robust solution. We demonstrate the use of this alignment technique in a high dynamic range video microscope that enables live specimen imaging with a greater level of detail than can be captured with a single camera.
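The preferred alignment strategy, matching feature descriptors on radiant power images from calibrated cameras, rests on a simple normalization. A minimal sketch, assuming linearized (radiometrically calibrated) input images and known exposure times:

```python
import numpy as np

def to_radiant_power(image, exposure_time, black_level=0.0):
    """Convert a linear, calibrated image to a relative radiant-power image.

    Dividing the black-level-corrected signal by the exposure time makes
    frames taken with different exposure settings directly comparable, so
    ordinary feature descriptors can be matched across them for alignment.
    """
    return (np.asarray(image, dtype=np.float64) - black_level) / exposure_time
```

After this step, the same scene point yields (ideally) the same radiant-power value in both cameras regardless of their exposure settings, which is what makes descriptor matching robust to large exposure differences.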
VizieR Online Data Catalog: NGC 300 giant dust clouds (Riener+, 2018)
NASA Astrophysics Data System (ADS)
Riener, M.; Faesi, C. M.; Forbrich, J.; Lada, C. J.
2017-11-01
We obtained photometric observations of NGC 300 with the Herschel Space Observatory (PI: Jan Forbrich) with two of its instruments, the Spectral and Photometric Imaging Receiver (SPIRE) and the Photodetector Array Camera and Spectrometer (PACS). The SPIRE observations took place on 2012 May 11 with a total exposure time of 4558 s. The two PACS observations were carried out on 2012 June 25 with total exposure times of 3245 and 3803 s. (2 data files).
Measurement of the meteoroid flux at Mars
NASA Astrophysics Data System (ADS)
Domokos, A.; Bell, J. F.; Brown, P.; Lemmon, M. T.; Suggs, R.; Vaubaillon, J.; Cooke, W.
2007-11-01
In the fall of 2005, a dedicated meteor observing campaign was carried out by the Panoramic Camera (Pancam) onboard the Mars Exploration Rover (MER) Spirit to determine the viability of using MER cameras as meteor detectors and to obtain the first experimental estimate of the meteoroid flux at Mars. Our observing targets included both the sporadic meteoroid background and two predicted martian meteor showers: one associated with 1P/Halley and a potential stream associated with 2001/R1 LONEOS. A total of 353 images covering 2.7 h of net exposure time were analyzed with no conclusive meteor detections. From these data, an upper limit to the background meteoroid flux at Mars is estimated to be <4.4×10 meteoroids km⁻² h⁻¹ for meteoroids with mass larger than 4 g. For comparison, the estimated flux to this mass limit at the Earth is 10 meteoroids km⁻² h⁻¹ [Grün, E., Zook, H.A., Fechtig, H., Giese, R.H., 1985. Icarus 62, 244-272]. This result is qualitatively consistent, within error bounds, with theoretical models predicting martian fluxes of ˜50% that at Earth for meteoroids of mass 10-10 g [Adolfsson, L.G., Gustafson, B.A.S., Murray, C.D., 1996. Icarus 119, 144-152]. The MER cameras, even using the most sensitive mode of operation, should expect to see on average only one coincident meteor in of order 40-150 h of total exposure time based on these same theoretical martian flux estimates. To more meaningfully constrain these flux models, a longer total integrated exposure time or a more sensitive camera is needed. Our analysis also suggests that the event reported as the first martian meteor [Selsis, F., Lemmon, M.T., Vaubaillon, J., Bell, J.F., 2005. Nature 435, 581] is more likely a grazing cosmic ray impact, which we show to be a major source of confusion with potential meteors in all Pancam images.
Flexcam Image Capture Viewing and Spot Tracking
NASA Technical Reports Server (NTRS)
Rao, Shanti
2008-01-01
Flexcam software was designed to allow continuous monitoring of the mechanical deformation of the telescope structure at Palomar Observatory. Flexcam allows the user to watch the motion of a star with a low-cost astronomical camera, to measure the motion of the star on the image plane, and to feed these data back into the telescope's control system. This automatic interaction between the camera and a user interface facilitates integration and testing. Flexcam is a CCD image capture and analysis tool for the ST-402 camera from Santa Barbara Instruments Group (SBIG). The program automatically takes a dark exposure and then continuously displays corrected images. The image size, bit depth, magnification, exposure time, resolution, and filter are always displayed on the title bar. Flexcam locates the brightest pixel and then computes the centroid position of the pixels falling in a box around that pixel. This tool continuously writes the centroid position to a network file that can be used by other instruments.
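Flexcam's spot-tracking step (find the brightest pixel, then compute an intensity-weighted centroid of a box around it) can be sketched as follows; the box half-width is an assumed parameter, not Flexcam's actual setting:

```python
import numpy as np

def spot_centroid(frame, box=5):
    """Locate the brightest pixel, then return the intensity-weighted
    centroid (row, col) of a (2*box+1)-pixel square around it, clipped
    to the frame edges."""
    frame = np.asarray(frame, dtype=np.float64)
    r0, c0 = np.unravel_index(np.argmax(frame), frame.shape)
    r_lo, r_hi = max(r0 - box, 0), min(r0 + box + 1, frame.shape[0])
    c_lo, c_hi = max(c0 - box, 0), min(c0 + box + 1, frame.shape[1])
    win = frame[r_lo:r_hi, c_lo:c_hi]
    rows, cols = np.mgrid[r_lo:r_hi, c_lo:c_hi]
    total = win.sum()
    return (rows * win).sum() / total, (cols * win).sum() / total
```

The centroid gives sub-pixel precision, which is what makes feeding the star position back into the telescope control loop useful.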
NASA Technical Reports Server (NTRS)
1980-01-01
Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free-flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent, and exposure times from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns, and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded, is 944 pounds, and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.
Agostini, Denis; Marie, Pierre-Yves; Ben-Haim, Simona; Rouzet, François; Songy, Bernard; Giordano, Alessandro; Gimelli, Alessia; Hyafil, Fabien; Sciagrà, Roberto; Bucerius, Jan; Verberne, Hein J; Slart, Riemer H J A; Lindner, Oliver; Übleis, Christopher; Hacker, Marcus
2016-12-01
The trade-off between resolution and count sensitivity dominates the performance of standard gamma cameras and dictates the need for relatively high doses of the radiopharmaceuticals used in order to limit image acquisition duration. The introduction of cadmium-zinc-telluride (CZT)-based cameras may overcome some of the limitations of conventional gamma cameras. CZT cameras used for the evaluation of myocardial perfusion have been shown to have a higher count sensitivity than conventional single photon emission computed tomography (SPECT) techniques. CZT image quality is further improved by the development of a dedicated three-dimensional iterative reconstruction algorithm, based on maximum likelihood expectation maximization (MLEM), which corrects for the loss in spatial resolution due to the line response function of the collimator. All these innovations significantly reduce imaging time and result in lower radiation exposure for the patient compared with standard SPECT. To guide current and possible future users of the CZT technique for myocardial perfusion imaging, the Cardiovascular Committee of the European Association of Nuclear Medicine, starting from the experience of its members, has decided to examine the current literature on procedures and clinical data for CZT cameras. The committee hereby aims 1) to identify the main acquisition protocols, 2) to evaluate the diagnostic and prognostic value of CZT-derived myocardial perfusion, and 3) to determine the impact of CZT on radiation exposure.
Digital micromirror device camera with per-pixel coded exposure for high dynamic range imaging.
Feng, Wei; Zhang, Fumin; Wang, Weijing; Xing, Wei; Qu, Xinghua
2017-05-01
In this paper, we overcome the limited dynamic range of the conventional digital camera and propose a method of realizing high dynamic range imaging (HDRI) with a novel programmable imaging system called a digital micromirror device (DMD) camera. The unique feature of the proposed method is that the spatial and temporal information of incident light in our DMD camera can be flexibly modulated, which keeps the camera pixels at reasonable exposure levels through DMD pixel-level modulation. More importantly, it allows different light-intensity control algorithms to be used in our programmable imaging system to achieve HDRI. We implement the optical system prototype, analyze the theory of per-pixel coded exposure for HDRI, and put forward an adaptive light-intensity control algorithm to effectively modulate the different light intensities and recover high dynamic range images. Via experiments, we demonstrate the effectiveness of our method and implement HDRI on different objects.
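A hedged sketch of per-pixel coded exposure: radiance is recovered by dividing the capture by the per-pixel exposure map, and a simple adaptive rule re-modulates near-saturated or dark pixels. The thresholds and step factor are assumptions, an illustrative stand-in for the paper's adaptive light-intensity control algorithm.

```python
import numpy as np

def recover_hdr(captured, exposure_map):
    """Estimate radiance from a per-pixel coded exposure capture.

    captured: sensor image acquired while the DMD modulated each pixel's
    effective exposure; exposure_map: the per-pixel exposure applied.
    """
    return np.asarray(captured, dtype=np.float64) / np.asarray(exposure_map, dtype=np.float64)

def update_exposure_map(captured, exposure_map, low=0.1, high=0.9, step=2.0):
    """Adaptive control sketch: halve the exposure of near-saturated pixels
    and boost that of dark pixels for the next capture."""
    e = np.asarray(exposure_map, dtype=np.float64).copy()
    captured = np.asarray(captured, dtype=np.float64)
    e[captured >= high] /= step
    e[captured <= low] *= step
    return e
```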
NASA Astrophysics Data System (ADS)
Jantzen, Connie; Slagle, Rick
1997-05-01
The distinction between exposure time and sample rate is often the first point raised in any discussion of high speed imaging. Many high speed events require exposure times considerably shorter than those that can be achieved solely by the sample rate of the camera, where exposure time equals 1/sample rate. Gating, a method of achieving short exposure times in digital cameras, is often difficult to achieve for exposure time requirements shorter than 100 microseconds. This paper discusses the advantages and limitations of using the short duration light pulse of a near-infrared laser with high speed digital imaging systems. By closely matching the output wavelength of the pulsed laser to the peak near-infrared response of current sensors, high speed image capture can be accomplished at very low (visible) light levels of illumination. By virtue of the short duration light pulse, adjustable to as short as two microseconds, image capture of very high speed events can be achieved at relatively low sample rates of less than 100 pictures per second, without image blur. For our initial investigations, we chose a ballistic subject. The results of early experimentation revealed the limitations of applying traditional ballistic imaging methods when using a pulsed infrared light source with a digital imaging system. These early disappointing results clarified the need to further identify the unique system characteristics of the digital imager and pulsed infrared combination. It was also necessary to investigate how the infrared reflectance and transmittance of common materials affect the imaging process. This experimental work yielded a surprising, successful methodology which will prove useful in imaging ballistic and weapons tests, as well as forensics, flow visualizations, spray pattern analyses, and nocturnal animal behavioral studies.
Face detection assisted auto exposure: supporting evidence from a psychophysical study
NASA Astrophysics Data System (ADS)
Jin, Elaine W.; Lin, Sheng; Dharumalingam, Dhandapani
2010-01-01
Face detection has been implemented in many digital still cameras and camera phones with the promise of enhancing existing camera functions (e.g. auto exposure) and adding new features to cameras (e.g. blink detection). In this study we examined the use of face detection algorithms in assisting auto exposure (AE). The set of 706 images used in this study was captured using Canon Digital Single Lens Reflex cameras and subsequently processed with an image processing pipeline. A psychophysical study was performed to obtain the optimal exposure along with the upper and lower bounds of exposure for all 706 images. Three methods of marking faces were utilized: manual marking, face detection algorithm A (FD-A), and face detection algorithm B (FD-B). The manual marking method found 751 faces in 426 images, which served as the ground truth for face regions of interest. The remaining images either contain no faces or faces too small to be considered detectable. The two face detection algorithms differ in resource requirements and in performance. FD-A uses less memory and fewer gate counts than FD-B, but FD-B detects more faces and has fewer false positives. A face detection assisted auto exposure algorithm was developed and tested against the evaluation results from the psychophysical study. The AE test results showed noticeable improvement when faces were detected and used in auto exposure. However, the presence of false positives would negatively impact the added benefit.
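One common way face detection assists auto exposure is to up-weight detected face regions in the metering statistic. The sketch below is illustrative only (the weight value and box format are assumptions, not the algorithm evaluated in the study):

```python
import numpy as np

def face_weighted_mean(image, face_boxes, face_weight=4.0):
    """Mean luminance with detected face regions up-weighted.

    image: 2-D array of linear luminance values in [0, 1];
    face_boxes: list of (row0, row1, col0, col1) half-open boxes.
    The result can drive the AE loop toward correctly exposed faces.
    """
    image = np.asarray(image, dtype=np.float64)
    weights = np.ones_like(image)
    for (r0, r1, c0, c1) in face_boxes:
        weights[r0:r1, c0:c1] = face_weight
    return (image * weights).sum() / weights.sum()
```

A false positive simply up-weights a non-face region, which illustrates the abstract's caveat that false positives reduce the added benefit.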
Image registration for multi-exposed HDRI and motion deblurring
NASA Astrophysics Data System (ADS)
Lee, Seok; Wey, Ho-Cheon; Lee, Seong-Deok
2009-02-01
In multi-exposure image fusion, alignment is an essential prerequisite to prevent ghost artifacts after blending. Compared to the usual matching problem, registration is more difficult when each image is captured under different photographing conditions. In HDR imaging, we use long- and short-exposure images, which differ in brightness and contain over- or under-saturated regions. In the motion deblurring problem, we use a blurred and noisy image pair, and the amount of motion blur varies from one image to another due to the different exposure times. The main difficulty is that the luminance levels of the two images are not in a linear relationship: we cannot perfectly equalize or normalize the brightness of each image, which leads to unstable and inaccurate alignment results. To solve this problem, we applied a probabilistic measure, mutual information, to represent the similarity between images after alignment. In this paper, we described the characteristics of multi-exposed input images from the registration point of view and also analyzed the magnitude of camera hand shake. By exploiting the luminance independence of mutual information, we propose a fast and practically useful image registration technique for multiple capturing. Our algorithm can be applied to extreme HDR scenes and motion-blurred scenes with over 90% success rate, and its simplicity enables it to be embedded in digital cameras and mobile camera phones. The effectiveness of our registration algorithm is examined by various experiments on real HDR and motion deblurring cases using a hand-held camera.
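The similarity measure at the core of the method, mutual information estimated from a joint histogram, can be sketched as below. This is a generic estimator, not the paper's exact implementation; the bin count is an assumed parameter.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two images from their joint histogram.

    MI depends only on the statistical dependence of the two intensity
    distributions, not on any linear relation between them, which is why
    it suits alignment of differently exposed (or blurred) frames.
    """
    hist, _, _ = np.histogram2d(np.ravel(img_a), np.ravel(img_b), bins=bins)
    pxy = hist / hist.sum()                     # joint distribution
    px = pxy.sum(axis=1, keepdims=True)         # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)         # marginal of img_b
    nz = pxy > 0                                # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

In a registration loop, candidate shifts are scored by this measure and the shift maximizing MI is selected.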
NASA Astrophysics Data System (ADS)
Kolkoori, S.; Wrobel, N.; Osterloh, K.; Zscherpel, U.; Ewert, U.
2013-09-01
Radiological inspections are, in general, the nondestructive testing (NDT) methods of choice for detecting bulk explosives in large objects. In contrast to personal luggage, cargo or building components constitute a complexity that may significantly hinder the detection of a threat by conventional X-ray transmission radiography. In this article, a novel X-ray backscatter technique is presented for detecting suspicious objects in a densely packed large object with only single-sided access. It consists of an X-ray backscatter camera with a special twisted-slit collimator for imaging backscattering objects. The new X-ray backscatter camera images objects not only by their densities but also by including the influences of surrounding objects. This unique feature of the X-ray backscatter camera provides new insights for identifying the internal features of the inspected object. Experimental mock-ups were designed imitating containers with threats among complex packing as they may be encountered in reality. We investigated the dependence of the quality of the X-ray backscatter image on (a) the exposure time, (b) multiple exposures, (c) the distance between object and slit camera, and (d) the width of the slit. Finally, the significant advantages of the presented X-ray backscatter camera in the context of aviation and port security are discussed.
Chambers, T; Pearson, A L; Kawachi, I; Rzotkiewicz, Z; Stanley, J; Smith, M; Barr, M; Ni Mhurchu, C; Signal, L
2017-11-01
Defining the boundary of children's 'neighborhoods' has important implications for understanding the contextual influences on child health. Additionally, insight into activities that occur outside people's neighborhoods may indicate exposures that place-based studies cannot detect. This study aimed to 1) extend current neighborhood research, using data from wearable cameras and GPS devices that were worn over several days in an urban setting; 2) define the boundary of children's neighborhoods by using leisure time activity space data; and 3) determine the destinations visited by children in their leisure time, outside their neighborhoods. One hundred and fourteen children (mean age 12y) from Wellington, New Zealand wore wearable cameras and GPS recorders. Residential Euclidean buffers at incremental distances were paired with GPS data (thereby identifying time spent in different places) to explore alternative definitions of neighborhood boundaries. Children's neighborhood boundary was at 500 m. A newly developed software application was used to identify 'destinations' visited outside the neighborhood by specifying space-time parameters. Image data from wearable cameras were used to determine the type of destination. Children spent over half of their leisure time within 500 m of their homes. Children left their neighborhood predominantly to visit school (for leisure purposes), other residential locations (e.g. to visit friends) and food retail outlets (e.g. convenience stores, fast food outlets). Children spent more time at food retail outlets than at structured sport and in outdoor recreation locations combined. Person-centered neighborhood definitions may serve to better represent children's everyday experiences and neighborhood exposures than previous methods based on place-based measures. As schools and other residential locations (friends and family) are important destinations outside the neighborhood, such destinations should be taken into account. 
The combination of image data and activity space GPS data provides a more robust approach to understanding children's neighborhoods and activity spaces.
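The buffer analysis pairing GPS fixes with incremental residential buffers reduces, for a single radius, to counting the share of fixes inside the buffer. A minimal sketch, assuming projected coordinates in metres and a constant GPS sampling interval (so the share of fixes approximates the share of time):

```python
import math

def fraction_within_buffer(home, gps_points, radius_m=500.0):
    """Share of GPS fixes falling inside a Euclidean buffer around home.

    home, gps_points: (x, y) tuples in a local projected coordinate
    system, in metres. With fixes sampled at a constant interval, the
    returned fraction approximates the share of time spent in the
    'neighborhood' defined by radius_m.
    """
    inside = sum(
        1 for (x, y) in gps_points
        if math.hypot(x - home[0], y - home[1]) <= radius_m
    )
    return inside / len(gps_points)
```

Evaluating this at incremental radii (as in the study) shows where the fraction levels off, which motivates the 500 m neighborhood boundary reported.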
1990-02-14
Range: 4 billion miles from Earth, at 32 degrees to the ecliptic. P-36057C This color image of the Sun, Earth, and Venus is one of the first, and perhaps only, images showing our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time, with narrow-angle cameras centered on each planet. The wide-angle image was taken with the camera's darkest filter, a methane absorption band, and the shortest possible exposure, one two-hundredth of a second, to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system, yet it is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of this brightness is a bright, burned-out image with multiple reflections from the optics of the camera. The rays around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide-angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaicked into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce the color image. Violet, green, and blue filters were used, with exposure times of 0.72, 0.48, and 0.72 seconds for Earth, and 0.36, 0.24, and 0.36 seconds for Venus. The images also show long linear streaks resulting from scattering of sunlight off parts of the camera and its shade.
Cano-García, Angel E.; Lazaro, José Luis; Infante, Arturo; Fernández, Pedro; Pompa-Chacón, Yamilet; Espinoza, Felipe
2012-01-01
In this study, a camera to infrared diode (IRED) distance estimation problem was analyzed. The main objective was to define an alternative way to measure depth, using only the information extracted from the pixel grey levels of the IRED image to estimate the distance between the camera and the IRED. In this paper, the standard deviation of the pixel grey level in the region of interest containing the IRED image is proposed as an empirical parameter to define a model for estimating camera to emitter distance. This model includes the camera exposure time, the IRED radiant intensity and the distance between the camera and the IRED. An expression for the standard deviation model related to these magnitudes was also derived and calibrated using different images taken under different conditions. From this analysis, we determined the optimum parameters to ensure the best accuracy provided by this alternative. Once the model calibration had been carried out, a differential method to estimate the distance between the camera and the IRED was defined and applied, considering that the camera was aligned with the IRED. The results indicate that this method represents a useful alternative for determining the depth information. PMID:22778608
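The differential method itself is not spelled out in the abstract. The sketch below illustrates the general idea under an assumed (not from the paper) inverse-square model for the grey-level parameter: taking the ratio of two measurements separated by a known displacement along the optical axis cancels the unknown model constant, leaving the distance.

```python
import math

# Toy differential distance estimate. The model s = k / d**2 is an
# illustrative assumption, not the calibrated model from the paper.
# Two measurements with a known camera displacement `delta` along the
# optical axis give s_near / s_far = ((d + delta) / d) ** 2, from which
# the unknown constant k cancels and d follows.

def differential_distance(s_near, s_far, delta):
    """Estimate distance d from two parameter readings and a known shift."""
    ratio = math.sqrt(s_near / s_far)   # = (d + delta) / d
    return delta / (ratio - 1.0)

# Synthetic check with k = 1: s(2 m) = 1/4, s(3 m) = 1/9, delta = 1 m.
d = differential_distance(1 / 4, 1 / 9, 1.0)
```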
NASA Astrophysics Data System (ADS)
Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.
2008-12-01
Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science community for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features and acquiring model input. Thanks to advances in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlets and collected data at one-hour intervals, continuously for more than one year at some but not all sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono- and stereo-photogrammetry can provide the theoretical and practical fundamentals for data processing, along with digital image processing techniques. Time-lapse images over these periods in west Greenland show various phenomena. Problems include rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, foxes chewing instrument cables, and ravens pecking the plastic window. Other problems include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that non-metric digital cameras contain large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images needs to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability.
We experiment with mono and stereo photogrammetric techniques with the aid of automatic correlation matching to efficiently handle the enormous data volumes.
Electronic Flash In Data Acquisition
NASA Astrophysics Data System (ADS)
Miller, C. E.
1982-02-01
Photographic acquisition of data often may be simplified, or the data quality improved, by employing electronic flash sources with traditional equipment or techniques. The relatively short flash duration, compared to movie camera shutters or to the long integration time of video cameras, provides improved spatial resolution through blur reduction, which becomes particularly important as image movement becomes a significant fraction of the film format dimension. Greater accuracy typically is achieved in velocity and acceleration determinations by using a stroboscopic light source rather than a movie camera frame-rate control as a time standard. Electrical efficiency often is an important advantage of electronic flash sources, since almost any light level necessary for exposure may be produced, yet the source typically is "off" most of the time. Various synchronization techniques greatly expand the precise control of exposure. Biomechanical and sports equipment studies may involve velocities up to 200 feet per second, and often will have associated very rapid actions of interest. The need for brief exposures increases as one zooms in on the action. In golf, for example, the swing may be examined using 100 microsecond (μs) flashes at rates of 60 or 120 flashes per second (FPS). Accurate determination of the linear and rotational velocity of the ball requires 10 μs flashes at 500-1,000 FPS, while sub-μs flashes at 20,000-50,000 FPS are required to resolve the interaction of the ball and the club head. Some seldom-used techniques involving streak photography are described, with enhanced results obtained by combining strobe with the usual continuous light source. The combination of strobe and a fast electro-mechanical shutter is considered for μs photography under daylight conditions.
The Use of Reflexive Photography in the Study of the Freshman Year Experience.
ERIC Educational Resources Information Center
Harrington, Charles; Lindy, Ingrid
This study used reflexive photography to examine the perceptions of college freshmen at the University of Southern Indiana. A random sample of 10 first-time, full-time, degree-seeking freshmen completed an initial interview and background questionnaire and were given a 27-exposure disposable camera to take pictures that would illustrate their…
HDR video synthesis for vision systems in dynamic scenes
NASA Astrophysics Data System (ADS)
Shopovska, Ivana; Jovanov, Ljubomir; Goossens, Bart; Philips, Wilfried
2016-09-01
High dynamic range (HDR) image generation from a number of differently exposed low dynamic range (LDR) images has been extensively explored in the past few decades, and as a result of these efforts a large number of HDR synthesis methods have been proposed. Since HDR images are synthesized by combining well-exposed regions of the input images, one of the main challenges is dealing with camera or object motion. In this paper we propose a method for the synthesis of HDR video from a single camera using multiple, differently exposed video frames, with circularly alternating exposure times. One of the potential applications of the system is in driver assistance systems and autonomous vehicles, involving significant camera and object movement, non-uniform and temporally varying illumination, and the requirement of real-time performance. To achieve these goals simultaneously, we propose an HDR synthesis approach based on weighted averaging of aligned radiance maps. The computational complexity of high-quality optical flow methods for motion compensation is still prohibitively high for real-time applications. Instead, we rely on more efficient global projective transformations to compensate for camera movement, while moving objects are detected by thresholding the differences between the transformed and brightness-adapted images in the set. To attain temporal consistency of the camera motion in consecutive HDR frames, the parameters of the perspective transformation are stabilized over time by means of computationally efficient temporal filtering. We evaluated our results on several reference HDR videos, on synthetic scenes, and using 14-bit raw images taken with a standard camera.
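The weighted averaging of radiance maps can be sketched for a single pixel. The triangular well-exposedness weight and the normalised intensities below are illustrative assumptions, not the paper's exact weighting:

```python
# Minimal single-pixel sketch of exposure fusion by weighted averaging of
# radiance estimates. The triangle weight is a common generic choice, used
# here for illustration only.

def weight(v, v_min=0.05, v_max=0.95):
    """Well-exposedness weight: peaks at mid-range, zero near the extremes."""
    if v <= v_min or v >= v_max:
        return 0.0
    mid = 0.5 * (v_min + v_max)
    return 1.0 - abs(v - mid) / (mid - v_min)

def fuse_radiance(pixels, exposures):
    """Fuse one pixel observed at several exposure times into a radiance value.

    pixels    -- normalised intensities in [0, 1], one per frame
    exposures -- exposure times in seconds (circularly alternating in the paper)
    """
    num = den = 0.0
    for v, t in zip(pixels, exposures):
        w = weight(v)
        num += w * (v / t)   # radiance estimate contributed by this frame
        den += w
    return num / den if den > 0 else 0.0

# A bright pixel saturates the long exposure, so the short one dominates.
radiance = fuse_radiance([0.4, 0.99], [0.01, 0.04])
```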
Solar System Portrait - 60 Frame Mosaic
1996-09-13
The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever portrait of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. 
Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. http://photojournal.jpl.nasa.gov/catalog/PIA00451
Efficient space-time sampling with pixel-wise coded exposure for high-speed imaging.
Liu, Dengyu; Gu, Jinwei; Hitomi, Yasunobu; Gupta, Mohit; Mitsunaga, Tomoo; Nayar, Shree K
2014-02-01
Cameras face a fundamental trade-off between spatial and temporal resolution. Digital still cameras can capture images with high spatial resolution, but most high-speed video cameras have relatively low spatial resolution. It is hard to overcome this trade-off without incurring a significant increase in hardware costs. In this paper, we propose techniques for sampling, representing, and reconstructing the space-time volume to overcome this trade-off. Our approach has two important distinctions compared to previous works: 1) We achieve sparse representation of videos by learning an overcomplete dictionary on video patches, and 2) we adhere to practical hardware constraints on sampling schemes imposed by architectures of current image sensors, which means that our sampling function can be implemented on CMOS image sensors with modified control units in the future. We evaluate components of our approach, sampling function and sparse representation, by comparing them to several existing approaches. We also implement a prototype imaging system with pixel-wise coded exposure control using a liquid crystal on silicon device. System characteristics such as field of view and modulation transfer function are evaluated for our imaging system. Both simulations and experiments on a wide range of scenes show that our method can effectively reconstruct a video from a single coded image while maintaining high spatial resolution.
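The hardware constraint the authors adhere to, where a pixel's exposure must be realizable by sensor control logic, can be illustrated with a toy sampling function. The contiguous "bump" of open sub-frames per pixel and the random placement are simplifying assumptions for illustration, not the paper's optimized sampling function:

```python
import random

# Toy pixel-wise coded exposure: each pixel integrates one contiguous run
# ("bump") of sub-frames within the space-time volume, and the whole volume
# collapses into a single coded image.

def coded_exposure_mask(height, width, n_subframes, bump_len, seed=0):
    """Per-pixel shutter mask: pixel (y, x) is open for bump_len consecutive
    sub-frames starting at a random offset."""
    rng = random.Random(seed)
    mask = [[[0] * n_subframes for _ in range(width)] for _ in range(height)]
    for y in range(height):
        for x in range(width):
            start = rng.randrange(n_subframes - bump_len + 1)
            for t in range(start, start + bump_len):
                mask[y][x][t] = 1
    return mask

def capture(volume, mask):
    """Collapse a space-time volume into one coded image via the mask."""
    h, w, n = len(volume), len(volume[0]), len(volume[0][0])
    return [[sum(volume[y][x][t] * mask[y][x][t] for t in range(n))
             for x in range(w)] for y in range(h)]
```

Reconstruction of the full video from the coded image (the dictionary-learning half of the method) is omitted here.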
Optical design and development of a snapshot light-field laryngoscope
NASA Astrophysics Data System (ADS)
Zhu, Shuaishuai; Jin, Peng; Liang, Rongguang; Gao, Liang
2018-02-01
The convergence of recent advances in optical fabrication and digital processing yields a new generation of imaging technology: light-field (LF) cameras, which bridge the realms of applied mathematics, optics, and high-performance computing. Herein, for the first time, we introduce the paradigm of LF imaging into laryngoscopy. The resultant probe can image the three-dimensional shape of the vocal folds within a single camera exposure. Furthermore, to improve the spatial resolution, we developed an image fusion algorithm, providing a simple solution to a long-standing problem in LF imaging.
NASA Technical Reports Server (NTRS)
Bozyan, Elizabeth P.; Hemenway, Paul D.; Argue, A. Noel
1990-01-01
Observations of a set of 89 extragalactic objects (EGOs) will be made with the Hubble Space Telescope Fine Guidance Sensors and Planetary Camera in order to link the HIPPARCOS Instrumental System to an extragalactic coordinate system. Most of the sources chosen for observation contain compact radio sources and stellarlike nuclei; 65 percent are optical variables beyond a 0.2 mag limit. To ensure proper exposure times, accurate mean magnitudes are necessary. In many cases, the average magnitudes listed in the literature were not adequate. The literature was searched for all relevant photometric information for the EGOs, and photometric parameters were derived, including mean magnitude, maximum range, and timescale of variability. This paper presents the results of that search and the parameters derived. The results will allow exposure times to be estimated such that an observed magnitude different from the tabular magnitude by 0.5 mag in either direction will not degrade the astrometric centering ability on a Planetary Camera CCD frame.
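The dependence of exposure time on an erroneous catalogue magnitude follows the standard Pogson relation. The sketch below (generic astronomy arithmetic, not code from the paper) shows why a 0.5 mag error changes the required exposure by roughly 58%:

```python
# Pogson relation: a source delta_mag fainter delivers 10**(-0.4 * delta_mag)
# of the flux, so reaching the same detector counts needs the inverse factor
# in exposure time.

def flux_ratio(delta_mag):
    """Flux of a source delta_mag fainter, relative to the reference."""
    return 10.0 ** (-0.4 * delta_mag)

def scaled_exposure(t_ref, delta_mag):
    """Exposure time needed if the target is delta_mag fainter than assumed."""
    return t_ref / flux_ratio(delta_mag)

# A 0.5 mag underestimate of brightness means ~1.58x the exposure time.
factor = scaled_exposure(1.0, 0.5)
```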
VizieR Online Data Catalog: PHAT X. UV-IR photometry of M31 stars (Williams+, 2014)
NASA Astrophysics Data System (ADS)
Williams, B. F.; Lang, D.; Dalcanton, J. J.; Dolphin, A. E.; Weisz, D. R.; Bell, E. F.; Bianchi, L.; Byler, N.; Gilbert, K. M.; Girardi, L.; Gordon, K.; Gregersen, D.; Johnson, L. C.; Kalirai, J.; Lauer, T. R.; Monachesi, A.; Rosenfield, P.; Seth, A.; Skillman, E.
2015-01-01
The data for the Panchromatic Hubble Andromeda Treasury (PHAT) survey were obtained from 2010 July 12 to 2013 October 12 using the Advanced Camera for Surveys (ACS) Wide Field Channel (WFC), the Wide Field Camera 3 (WFC3) IR (infrared) channel, and the WFC3 UVIS (ultraviolet-optical) channel. The observing strategy is described in detail in Dalcanton et al. (2012ApJS..200...18D). A list of the target names, observing dates, coordinates, orientations, instruments, exposure times, and filters is given in Table 1. Using the ACS and WFC3 cameras aboard HST, we have photometered 414 contiguous WFC3/IR footprints covering 0.5deg2 of the M31 star-forming disk. (4 data files).
SOLAR - ASTRONOMY (APOLLO-SATURN [AS]-16)
1972-05-09
S72-36972 (21 April 1972) --- A color enhancement of a far-ultraviolet photo of Earth taken by astronaut John W. Young, commander, with the ultraviolet camera on April 21, 1972. The original black and white photo was printed on Agfacontour film three times, each exposure recording only one light level. The three light levels were then colored blue (dimmest), green (next brightest), and red (brightest). The three auroral belts, the sunlit atmosphere and the background stars (one very close to Earth, on left) can be studied quantitatively for brightness. The UV camera was designed and built at the Naval Research Laboratory, Washington, D.C. EDITOR'S NOTE: The photographic number of the original black & white UV camera photograph from which this enhancement was made is AS16-123-19657.
High-resolution ophthalmic imaging system
Olivier, Scot S.; Carrano, Carmen J.
2007-12-04
A system for providing an improved resolution retina image comprising an imaging camera for capturing a retina image and a computer system operatively connected to the imaging camera, the computer producing short exposures of the retina image and providing speckle processing of the short exposures to provide the improved resolution retina image. The system comprises the steps of capturing a retina image, producing short exposures of the retina image, and speckle processing the short exposures of the retina image to provide the improved resolution retina image.
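The patent text above describes speckle processing of short exposures only at a high level. A minimal stand-in, assuming (not from the patent) a simple shift-and-add registration on 1-D frames, illustrates how averaging aligned short exposures suppresses motion blur:

```python
# Simplest speckle-style processing: shift-and-add. Each short exposure is
# registered on its brightest sample before averaging, so frame-to-frame
# motion does not smear the result. A 1-D toy, not the patented pipeline.

def shift_and_add(frames):
    """Align 1-D frames on their peak sample and average them."""
    ref = frames[0].index(max(frames[0]))
    n = len(frames[0])
    acc = [0.0] * n
    for f in frames:
        shift = ref - f.index(max(f))
        for i in range(n):
            j = i - shift
            if 0 <= j < n:
                acc[i] += f[j]
    return [v / len(frames) for v in acc]

# Two exposures of the same feature, displaced by one sample, average
# to a single sharp peak instead of a smeared double peak.
result = shift_and_add([[0, 3, 0, 0], [0, 0, 3, 0]])
```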
NASA Astrophysics Data System (ADS)
Taggart, D. P.; Gribble, R. J.; Bailey, A. D., III; Sugimoto, S.
Recently, a prototype soft x ray pinhole camera was fielded on FRX-C/LSM at Los Alamos and TRX at Spectra Technology. The soft x ray FRC images obtained using this camera stand out in high contrast to their surroundings. It was particularly useful for studying the FRC during and shortly after formation when, at certain operating conditions, flute-like structures at the edge and internal structures of the FRC were observed which other diagnostics could not resolve. Building on this early experience, a new soft x ray pinhole camera was installed on FRX-C/LSM, which permits more rapid data acquisition and briefer exposures. It will be used to continue studying FRC formation and to look for internal structure later in time which could be a signature of instability. The initial operation of this camera is summarized.
Inexpensive Neutron Imaging Cameras Using CCDs for Astronomy
NASA Astrophysics Data System (ADS)
Hewat, A. W.
We have developed inexpensive neutron imaging cameras using CCDs originally designed for amateur astronomical observation. The low-light, high-resolution requirements of such CCDs are similar to those for neutron imaging, except that noise as well as cost is reduced by using slower read-out electronics. For example, we use the same 2048x2048 pixel Kodak KAI-4022 CCD as used in the high-performance PCO-2000 CCD camera, but our electronics requires ∼5 sec for full-frame read-out, ten times slower than the PCO-2000. Since neutron exposures also require several seconds, this is not seen as a serious disadvantage for many applications. If higher frame rates are needed, the CCD unit on our camera can be easily swapped for a faster readout detector with similar chip size and resolution, such as the PCO-2000 or the sCMOS PCO.edge 4.2.
Multispectral imaging of the ocular fundus using light emitting diode illumination
NASA Astrophysics Data System (ADS)
Everdell, N. L.; Styles, I. B.; Calcagni, A.; Gibson, J.; Hebden, J.; Claridge, E.
2010-09-01
We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.
NASA Astrophysics Data System (ADS)
Verdaasdonk, Rudolf M.; Wedzinga, Rosaline; van Montfrans, Bibi; Stok, Mirte; Klaessens, John; van der Veen, Albert
2016-03-01
The significant increase of skin cancer occurring in the western world is attributed to longer sun exposure during leisure time. For prevention, people should become aware of the risks of UV light exposure; skin damage and the protective effect of sunscreen can be shown with a UV camera. A UV awareness imaging system optimized for 365 nm (UV-A) was developed using consumer components, making it interactive, safe and mobile. A Sony NEX5t camera was adapted to the full spectral range. In addition, UV-transparent lenses and filters were selected based on measured spectral characteristics (Schott S8612 and Hoya U-340 filters) to obtain the highest contrast for, e.g., melanin spots and wrinkles on the skin. For uniform UV illumination, two facial tanner units were adapted with UV 365 nm black light fluorescent tubes. The safety of the UV illumination was determined relative to the sun and with absolute irradiance measurements at the working distance. A maximum exposure time of over 15 minutes was calculated according to the international safety standards. The UV camera was successfully demonstrated during the Dutch National Skin Cancer day and was well received by dermatologists and the participating public. In particular, the 'black paint' effect of putting sunscreen on the face was dramatic and contributed to awareness of regions of the face that are likely to be missed when applying sunscreen. The UV imaging system shows promise for diagnostics and clinical studies in dermatology and potentially in other areas (dentistry and ophthalmology).
Multi-exposure high dynamic range image synthesis with camera shake correction
NASA Astrophysics Data System (ADS)
Li, Xudong; Chen, Yongfu; Jiang, Hongzhi; Zhao, Huijie
2017-10-01
Machine vision plays an important part in industrial online inspection. Owing to non-uniform illuminance conditions and variable working distances, the captured image tends to be over-exposed or under-exposed. As a result, when processing the image, such as in crack inspection, the algorithm complexity and computing time increase. Multi-exposure high dynamic range (HDR) image synthesis is used to improve the quality of the captured image, whose dynamic range is limited. Inevitably, camera shake will result in a ghost effect, which blurs the synthesized image to some extent. However, existing exposure fusion algorithms assume that the input images are either perfectly aligned or captured in the same scene. These assumptions limit their application. At present, widely used registration based on the Scale Invariant Feature Transform (SIFT) is usually time-consuming. In order to rapidly obtain a high-quality HDR image without the ghost effect, we come up with an efficient low dynamic range (LDR) image capturing approach and propose a registration method based on ORB (Oriented FAST and Rotated BRIEF) features and histogram equalization, which can eliminate the illumination differences between the LDR images. The fusion is performed after alignment. The experimental results demonstrate that the proposed method is robust to illumination changes and local geometric distortion. Compared with other exposure fusion methods, our method is more efficient and can produce HDR images without the ghost effect by registering and fusing four multi-exposure images.
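The histogram equalization step used to remove illumination differences between the differently exposed LDR frames can be sketched in minimal grayscale form (a generic textbook implementation, not the authors' code):

```python
def equalize_histogram(image, levels=256):
    """Histogram-equalise a grayscale image (list of rows of ints in [0, levels)).

    Equalising each differently exposed frame makes their brightness
    distributions comparable before feature matching, as in the paper's
    registration pipeline.
    """
    flat = [v for row in image for v in row]
    hist = [0] * levels
    for v in flat:
        hist[v] += 1
    # Cumulative distribution, rescaled to span the full intensity range.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(flat)
    lut = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) if n > cdf_min else 0
           for c in cdf]
    return [[lut[v] for v in row] for row in image]

# A dim, low-contrast patch gets stretched to the full range.
out = equalize_histogram([[10, 10], [10, 200]])
```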
NASA Astrophysics Data System (ADS)
Clifton, K. S.; Owens, J. K.
1983-04-01
Efforts continue regarding the analysis of particulate contamination recorded by the Camera/Photometers on STS-2. These systems were constructed by Epsilon Laboratories, Inc. and consisted of two 16-mm photographic cameras, using Kodak Double X film, Type 7222, to make stereoscopic observations of contaminant particles and background. Each was housed within a pressurized canister and operated automatically throughout the mission, making simultaneous exposures on a continuous basis every 150 sec. The cameras were equipped with 18-mm f/0.9 lenses and subtended overlapping 20° fields of view. An integrating photometer was used to inhibit the exposure sequences during periods of excessive illumination and to terminate the exposures at preset light levels. During the exposures, a camera shutter operated in a chopping mode in order to isolate the movement of particles for velocity determinations. Calculations based on the preflight film calibration indicate that particles as small as 25 μm can be detected under ideal observing conditions. Current emphasis is placed on the digitization of the photographic data frames and the determination of particle distances, sizes, and velocities. It has been concluded that background brightness measurements cannot be established with any reliability on the STS-2 mission, due to the preponderance of Earth-directed attitudes and the incidence of light reflected from nearby surfaces.
Photographing the Night Sky (Without a Telescope).
ERIC Educational Resources Information Center
Scott, Roger L.
1983-01-01
Describes the use of a 35-millimeter camera with color slide film to produce photographs of constellations, star trails, bright comets, aurorae, and meteor showers. Discusses film speed, lenses, f-stop settings, exposure times, and other items related to astrophotographic technique; provides ideas for use of slides in the classroom. (JM)
ERIC Educational Resources Information Center
Giles, Rebecca McMahon
2006-01-01
Exposure to cell phones, DVD players, video games, computers, digital cameras, and iPods has made today's young people more technologically advanced than those of any previous generation. As a result, parents are now concerned that their children are spending too much time in front of the computer. In this article, the author focuses her…
Flat-panel detector, CCD cameras, and electron-beam-tube-based video for use in portal imaging
NASA Astrophysics Data System (ADS)
Roehrig, Hans; Tang, Chuankun; Cheng, Chee-Way; Dallas, William J.
1998-07-01
This paper provides a comparison of some imaging parameters of four portal imaging systems at 6 MV: a flat panel detector, two CCD cameras, and an electron-beam-tube-based video camera. Measurements were made of signal and noise, and consequently of signal-to-noise per pixel, as a function of the exposure. All systems have a linear response with respect to exposure and, with the exception of the electron-beam-tube-based video camera, the noise is proportional to the square root of the exposure, indicating photon-noise limitation. The flat-panel detector has a signal-to-noise ratio higher than that observed with either CCD camera or with the electron-beam-tube-based video camera. This is expected because most portal imaging systems using optical coupling with a lens exhibit severe quantum sinks. The measurements of signal and noise were complemented by images of a Las Vegas-type aluminum contrast-detail phantom located at the isocenter. These images were generated at an exposure of 1 MU. The flat-panel detector permits detection of aluminum holes of 1.2 mm diameter and 1.6 mm depth, indicating the best signal-to-noise ratio. The CCD cameras rank second and third in signal-to-noise ratio, permitting detection of aluminum holes of 1.2 mm diameter and 2.2 mm depth (CCD_1) and of 1.2 mm diameter and 3.2 mm depth (CCD_2) respectively, while the electron-beam-tube-based video camera permits detection of only a hole of 1.2 mm diameter and 4.6 mm depth. Rank order filtering was applied to the raw images from the CCD-based systems in order to remove the direct hits. These are camera responses to scattered x-ray photons which interact directly with the CCD of the camera and generate 'salt and pepper' type noise, which interferes severely with attempts to determine accurate estimates of the image noise. The paper also presents data on the metal phosphor's photon gain (the number of light photons per interacting x-ray photon).
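The photon-noise limitation reported above (noise proportional to the square root of the exposure, hence SNR growing as the square root as well) is easy to reproduce with a toy Poisson simulation; the photon counts below are synthetic stand-ins, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(mean_photons, n_pixels=200_000):
    """Simulate a uniform exposure; return per-pixel signal, noise, SNR."""
    counts = rng.poisson(mean_photons, n_pixels)
    return counts.mean(), counts.std(), counts.mean() / counts.std()

s1, noise1, snr1 = measure(100)   # baseline exposure
s4, noise4, snr4 = measure(400)   # four times the exposure
# For a photon-limited detector both noise and SNR roughly double
# when the exposure is quadrupled.
```

A non-photon-limited system, such as the electron-beam-tube camera in the study, would depart from this square-root scaling.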
2006-01-01
The paper compares the efficiency of infrasound and ultrasound phonophoresis. Efficiency was evaluated from the amount of radiotracer within the eye after infrasound or ultrasound exposure of the eyeball; the exposure was performed after a radiotracer-impregnated application had first been placed on the bulbar conjunctiva of an animal. Radioactivity was recorded in vivo on a Siemens gamma camera. The time course of the radioactivities measured 10, 30, and 60 minutes after termination of exposure strongly suggests a stable increase in the eye exposed to infrasound. By contrast, 10 minutes after ultrasound exposure the radiotracer concentration in the eye was lower than after infrasound exposure, and it subsequently decreased progressively. Thus infrasound, like ultrasound, has significant phoretic activity, but it creates more favorable conditions for prolonged drug retention in the eye.
High-speed plasma imaging: A lightning bolt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wurden, G.A.; Whiteson, D.O.
Using a gated intensified digital Kodak Ektapro camera system, the authors captured a lightning bolt at 1,000 frames per second, with 100-μs exposure time on each consecutive frame. As a thunderstorm approached while darkness descended (7:50 pm) on July 21, 1994, they photographed lightning bolts with an f/22 105-mm lens and 100% gain on the intensified camera. This 15-frame sequence shows a cloud-to-ground stroke at a distance of about 1.5 km, with a series of stepped leaders propagating downwards, followed by the upward-propagating main return stroke.
High speed imaging - An important industrial tool
NASA Technical Reports Server (NTRS)
Moore, Alton; Pinelli, Thomas E.
1986-01-01
High-speed photography, which is a rapid sequence of photographs that allow an event to be analyzed through the stoppage of motion or the production of slow-motion effects, is examined. In high-speed photography 16, 35, and 70 mm film and framing rates between 64-12,000 frames per second are utilized to measure such factors as angles, velocities, failure points, and deflections. The use of dual timing lamps in high-speed photography and the difficulties encountered with exposure and programming the camera and event are discussed. The application of video cameras to the recording of high-speed events is described.
Single exposure three-dimensional imaging of dusty plasma clusters.
Hartmann, Peter; Donkó, István; Donkó, Zoltán
2013-02-01
We have worked out the details of a single camera, single exposure method to perform three-dimensional imaging of a finite particle cluster. The procedure is based on the plenoptic imaging principle and utilizes a commercial Lytro light field still camera. We demonstrate the capabilities of our technique on a single layer particle cluster in a dusty plasma, where the camera is aligned and inclined at a small angle to the particle layer. The reconstruction of the third coordinate (depth) is found to be accurate and even shadowing particles can be identified.
High-speed multi-exposure laser speckle contrast imaging with a single-photon counting camera
Dragojević, Tanja; Bronzi, Danilo; Varma, Hari M.; Valdes, Claudia P.; Castellvi, Clara; Villa, Federica; Tosi, Alberto; Justicia, Carles; Zappa, Franco; Durduran, Turgut
2015-01-01
Laser speckle contrast imaging (LSCI) has emerged as a valuable tool for cerebral blood flow (CBF) imaging. We present a multi-exposure laser speckle imaging (MESI) method which uses a high-frame rate acquisition with a negligible inter-frame dead time to mimic multiple exposures in a single-shot acquisition series. Our approach takes advantage of the noise-free readout and high-sensitivity of a complementary metal-oxide-semiconductor (CMOS) single-photon avalanche diode (SPAD) array to provide real-time speckle contrast measurement with high temporal resolution and accuracy. To demonstrate its feasibility, we provide comparisons between in vivo measurements with both the standard and the new approach performed on a mouse brain, in identical conditions. PMID:26309751
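The core MESI idea above, mimicking longer exposures by summing consecutive frames from a dead-time-free acquisition, can be sketched as follows. The frames are generic synthetic data (a gamma distribution chosen for convenience), not a physical speckle model, so only the 1/sqrt(n) contrast scaling expected for uncorrelated frames is demonstrated:

```python
import numpy as np

rng = np.random.default_rng(1)

def speckle_contrast(image):
    """Speckle contrast K = sigma / mean over the image."""
    return image.std() / image.mean()

# 16 statistically independent "frames" standing in for a high-frame-rate
# SPAD series (synthetic: gamma-distributed intensities, mean 10, std 2).
frames = rng.gamma(shape=25.0, scale=0.4, size=(16, 100_000))

def synthetic_exposure(n):
    """Sum n consecutive frames to mimic an n-times-longer exposure."""
    return frames[:n].sum(axis=0)

K1 = speckle_contrast(synthetic_exposure(1))
K4 = speckle_contrast(synthetic_exposure(4))
# For uncorrelated frames K falls as 1/sqrt(n), so K4 is about K1 / 2.
```

In real LSCI the decay of K with exposure time is fitted to a speckle model to extract flow; the negligible inter-frame dead time is what lets a single acquisition series serve every exposure.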
Environmental performance evaluation of an advanced-design solid-state television camera
NASA Technical Reports Server (NTRS)
1979-01-01
The development of an advanced-design black-and-white solid-state television camera which can survive exposure to space environmental conditions was undertaken. A 380 x 488 element buried-channel CCD is utilized as the image sensor to ensure compatibility with 525-line transmission and display equipment. Specific camera design approaches selected for study and analysis included: (1) component and circuit sensitivity to temperature; (2) circuit board thermal and mechanical design; and (3) CCD temperature control. Preferred approaches were determined and integrated into the final design for two deliverable solid-state TV cameras. One of these cameras was subjected to environmental tests to determine stress limits for exposure to vibration, shock, acceleration, and temperature-vacuum conditions. These tests indicate performance at the design goal limits can be achieved for most of the specified conditions.
Snowy Owl predation on Lapland Longspur nestlings recorded on film
Custer, T.W.
1973-01-01
During the summer of 1971 I investigated the breeding biology of the Lapland Longspur, Calcarius lapponicus, near Barrow, Alaska. To obtain data on incubation and feeding patterns of nesting longspurs, time-lapse cameras (Minolta Autopak-8 D6 super-8 movie cameras equipped with an Intervalometer-P time-lapse device) were positioned at several nests throughout the nesting season with an exposure interval of either 8 or 30 seconds. At 07:00 on 14 July a Snowy Owl, Nyctea scandiaca, took the three largest of four young at a nest being monitored at 30-second intervals (Figure 1). The nestlings were 4, 6, 7, and 8 days posthatching and weighed approximately 12, 15, 20, and 20 g respectively on 13 July.
Hubble Space Telescope: Faint object camera instrument handbook. Version 2.0
NASA Technical Reports Server (NTRS)
Paresce, Francesco (Editor)
1990-01-01
The Faint Object Camera (FOC) is a long focal ratio, photon counting device designed to take high resolution two dimensional images of areas of the sky up to 44 by 44 arcseconds squared in size, with pixel dimensions as small as 0.0007 by 0.0007 arcseconds squared in the 1150 to 6500 A wavelength range. The basic aim of the handbook is to make relevant information about the FOC available to a wide range of astronomers, many of whom may wish to apply for HST observing time. The FOC, as presently configured, is briefly described, and some basic performance parameters are summarized. Also included are detailed performance parameters and instructions on how to derive approximate FOC exposure times for the proposed targets.
STS-56 ESC Earth observation of New York City at night
NASA Technical Reports Server (NTRS)
1993-01-01
STS-56 electronic still camera (ESC) Earth observation image shows New York City at night as recorded on the 64th orbit of Discovery, Orbiter Vehicle (OV) 103. The image was recorded with an image intensifier on the Hand-held, Earth-oriented, Real-time, Cooperative, User-friendly, Location-targeting and Environmental System (HERCULES). HERCULES is a device that makes it simple for shuttle crewmembers to take pictures of Earth as they merely point a modified 35mm camera and shoot any interesting feature, whose latitude and longitude are automatically determined in real-time. Center coordinates on this image are 40.665 degrees north latitude and 74.048 degrees west longitude. (1/60 second exposure). Digital file name is ESC04034.IMG.
Wide field NEO survey 1.0-m telescope with 10 2k×4k mosaic CCD camera
NASA Astrophysics Data System (ADS)
Isobe, Syuzo; Asami, Atsuo; Asher, David J.; Hashimoto, Toshiyasu; Nakano, Shi-ichi; Nishiyama, Kota; Ohshima, Yoshiaki; Terazono, Junya; Umehara, Hiroaki; Yoshikawa, Makoto
2002-12-01
We developed a new 1.0 m telescope with a 3-degree flat focal plane to which a mosaic CCD camera with ten 2k×4k chips is fixed. The system was set up in February 2002 and is now undergoing final fine adjustments. Since the telescope has a focal length of 3 m, a field of 7.5 square degrees is covered in one image. Under good seeing conditions (1.5 arcseconds) at the site in Bisei town, Okayama prefecture, Japan, we can expect to detect stars down to 20th magnitude with an exposure time of 60 seconds. Allowing for the 46-second read-out time of the CCD camera, one image is taken every two minutes, and about 2,100 square degrees of sky can be covered in one clear night. This system is very effective for survey work, especially for near-Earth asteroid detection.
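The cadence and coverage figures quoted above are mutually consistent, as a quick arithmetic check shows (the two-minute cadence presumably rounds up the 106 s of exposure plus readout, leaving margin for slewing; both figures are used below):

```python
# Figures taken from the abstract.
field_per_image_deg2 = 7.5
exposure_s = 60.0
readout_s = 46.0
cadence_s = exposure_s + readout_s        # 106 s, rounded to 2 min in text

nightly_coverage_deg2 = 2100.0
images_per_night = nightly_coverage_deg2 / field_per_image_deg2   # 280 images
hours_busy = images_per_night * cadence_s / 3600.0                # ~8.2 h
hours_at_2min = images_per_night * 120.0 / 3600.0                 # ~9.3 h
# Either figure fits within a single clear night of observing.
```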
Development of a portable multispectral thermal infrared camera
NASA Technical Reports Server (NTRS)
Osterwisch, Frederick G.
1991-01-01
The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument, designated AA465, has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0 - 13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5-micron-wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display to be used by the operator to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single-exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument were performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program.
The instrument was delivered to NASA in March 1991. The primary commercial and research use for this instrument is as a field geologist's exploration tool. Other applications have been suggested but not investigated in depth. These include process-control measurements in commercial materials processing and quality-control functions that require information on surface heterogeneity.
Multiple-frame IR photo-recorder KIT-3M
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, E; Wilkins, P; Nebeker, N
2006-05-15
This paper reports the experimental results of a high-speed multi-frame infrared camera which has been developed in Sarov at VNIIEF. Earlier [1] we discussed the possibility of creating a multi-frame infrared radiation photo-recorder with a framing frequency of about 1 MHz. The basis of the photo-recorder is a semiconductor ionization camera [2, 3], which converts IR radiation in the 1-10 micrometer spectral range into a visible image. Several sequential thermal images are registered by using the IR converter in conjunction with a multi-frame electron-optical camera. In the present report we discuss the performance characteristics of a prototype commercial 9-frame high-speed IR photo-recorder. The image converter records infrared images of thermal fields corresponding to temperatures ranging from 300 °C to 2000 °C with an exposure time of 1-20 μs at a frame frequency up to 500 kHz. The IR photo-recorder camera is useful for recording the time evolution of thermal fields in fast processes such as gas dynamics, ballistics, pulsed welding, thermal processing, the automotive industry, aircraft construction, pulsed-power electric experiments, and the measurement of spatial mode characteristics of IR-laser radiation.
Very High-Speed Digital Video Capability for In-Flight Use
NASA Technical Reports Server (NTRS)
Corda, Stephen; Tseng, Ting; Reaves, Matthew; Mauldin, Kendall; Whiteman, Donald
2006-01-01
A digital video camera system has been qualified for use in flight on the NASA supersonic F-15B Research Testbed aircraft. This system is capable of very-high-speed color digital imaging at flight speeds up to Mach 2. The components of this system have been ruggedized and shock-mounted in the aircraft to survive the severe pressure, temperature, and vibration of the flight environment. The system includes two synchronized camera subsystems installed in fuselage-mounted camera pods (see Figure 1). Each camera subsystem comprises a camera controller/recorder unit and a camera head. The two camera subsystems are synchronized by use of an M-Hub(TradeMark) synchronization unit. Each camera subsystem is capable of recording at a rate up to 10,000 pictures per second (pps). A state-of-the-art complementary metal oxide/semiconductor (CMOS) sensor in the camera head has a maximum resolution of 1,280 × 1,024 pixels at 1,000 pps. Exposure times of the electronic shutter of the camera range from 1/200,000 of a second to full open. The recorded images are captured in a dynamic random-access memory (DRAM) and can be downloaded directly to a personal computer or saved on a compact flash memory card. In addition to the high-rate recording of images, the system can display images in real time at 30 pps. Inter Range Instrumentation Group (IRIG) time code can be inserted into the individual camera controllers or into the M-Hub unit. The video data could also be used to obtain quantitative, three-dimensional trajectory information. The first use of this system was in support of the Space Shuttle Return to Flight effort. Data were needed to help in understanding how thermally insulating foam is shed from a space shuttle external fuel tank during launch. The cameras captured images of simulated external tank debris ejected from a fixture mounted under the centerline of the F-15B aircraft.
Digital video was obtained at subsonic and supersonic flight conditions, including speeds up to Mach 2 and altitudes up to 50,000 ft (15.24 km). The digital video was used to determine the structural survivability of the debris in a real flight environment and quantify the aerodynamic trajectories of the debris.
Adaptive DOF for plenoptic cameras
NASA Astrophysics Data System (ADS)
Oberdörster, Alexander; Lensch, Hendrik P. A.
2013-03-01
Plenoptic cameras promise to provide arbitrary re-focusing through a scene after the capture. In practice, however, the refocusing range is limited by the depth of field (DOF) of the plenoptic camera. For the focused plenoptic camera, this range is given by the range of object distances for which the microimages are in focus. We propose a technique of recording light fields with an adaptive depth of focus. Between multiple exposures (or multiple recordings of the light field), the distance between the microlens array (MLA) and the image sensor is adjusted. The depth and quality of focus is chosen by changing the number of exposures and the spacing of the MLA movements. In contrast to traditional cameras, extending the DOF does not necessarily lead to an all-in-focus image. Instead, the refocus range is extended. There is full creative control over the focus depth; images with shallow or selective focus can be generated.
HIGH SPEED KERR CELL FRAMING CAMERA
Goss, W.C.; Gilley, L.F.
1964-01-01
The present invention relates to a high speed camera utilizing a Kerr cell shutter and a novel optical delay system having no moving parts. The camera can selectively photograph at least 6 frames within 9 × 10^-8 seconds during any such time interval of an occurring event. The invention utilizes particularly an optical system which views and transmits 6 images of an event to a multi-channeled optical delay relay system. The delay relay system has optical paths of successively increased length in whole multiples of the first channel optical path length, into which optical paths the 6 images are transmitted. The successively delayed images are accepted from the exit of the delay relay system by an optical image focusing means, which in turn directs the images into a Kerr cell shutter disposed to intercept the image paths. A camera is disposed to simultaneously view and record the 6 images during a single exposure of the Kerr cell shutter. (AEC)
Light field geometry of a Standard Plenoptic Camera.
Hahne, Christopher; Aggoun, Amar; Haxha, Shyqyri; Velisavljevic, Vladan; Fernández, Juan Carlos Jácome
2014-11-03
The Standard Plenoptic Camera (SPC) is an innovation in photography, allowing for acquiring two-dimensional images focused at different depths, from a single exposure. Contrary to conventional cameras, the SPC consists of a micro lens array and a main lens projecting virtual lenses into object space. For the first time, the present research provides an approach to estimate the distance and depth of refocused images extracted from captures obtained by an SPC. Furthermore, estimates for the position and baseline of virtual lenses which correspond to an equivalent camera array are derived. On the basis of paraxial approximation, a ray tracing model employing linear equations has been developed and implemented using Matlab. The optics simulation tool Zemax is utilized for validation purposes. By designing a realistic SPC, experiments demonstrate that a predicted image refocusing distance at 3.5 m deviates by less than 11% from the simulation in Zemax, whereas baseline estimations indicate no significant difference. Applying the proposed methodology will enable an alternative to the traditional depth map acquisition by disparity analysis.
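The "ray tracing model employing linear equations" referred to above is, in paraxial optics, a product of 2×2 ray-transfer (ABCD) matrices. The fragment below is a generic illustration of that formalism, not a reconstruction of the authors' Matlab model:

```python
import numpy as np

def propagate(d):
    """Free-space translation over distance d (paraxial ray-transfer matrix)."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """Thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

f, s = 50.0, 100.0                  # object placed at 2f in front of the lens
s_img = 1.0 / (1.0 / f - 1.0 / s)   # thin-lens equation gives 2f behind it
system = propagate(s_img) @ thin_lens(f) @ propagate(s)
# At an image plane the B element (system[0, 1]) vanishes: every ray from an
# object point lands on the same image point regardless of its launch angle,
# and the A element is the transverse magnification (-1 for the 2f-2f case).
```

A plenoptic model chains further matrices for the main lens, the gap to the MLA, and each microlens; refocusing distances then follow from where the composite B element goes to zero.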
Digital dental photography. Part 6: camera settings.
Ahmad, I
2009-07-25
Once the appropriate camera and equipment have been purchased, the next considerations involve setting up and calibrating the equipment. This article provides details regarding depth of field, exposure, colour spaces and white balance calibration, concluding with a synopsis of camera settings for a standard dental set-up.
A math model for high velocity sensoring with a focal plane shuttered camera.
NASA Technical Reports Server (NTRS)
Morgan, P.
1971-01-01
A new mathematical model is presented which describes the image produced by a focal plane shutter-equipped camera. The model is based upon the well-known collinearity condition equations and incorporates both the translational and rotational motion of the camera during the exposure interval. The first differentials of the model with respect to exposure interval, delta t, yield the general matrix expressions for image velocities which may be simplified to known cases. The exposure interval, delta t, may be replaced under certain circumstances with a function incorporating blind velocity and image position if desired. The model is tested using simulated Lunar Orbiter data and found to be computationally stable as well as providing excellent results, provided that some external information is available on the velocity parameters.
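For reference, the "well-known collinearity condition equations" that the model builds on have the standard form below, where f is the focal length, (X_c, Y_c, Z_c) the camera station, and m_ij the elements of the rotation matrix; the paper's contribution is to let these exterior-orientation parameters vary over the exposure interval Δt, so the image velocities follow by differentiation:

```latex
% Standard collinearity equations; in the focal-plane-shutter model the
% exterior orientation (X_c, Y_c, Z_c and R = (m_{ij})) varies with time.
x = -f\,\frac{m_{11}(X - X_c) + m_{12}(Y - Y_c) + m_{13}(Z - Z_c)}
             {m_{31}(X - X_c) + m_{32}(Y - Y_c) + m_{33}(Z - Z_c)},\qquad
y = -f\,\frac{m_{21}(X - X_c) + m_{22}(Y - Y_c) + m_{23}(Z - Z_c)}
             {m_{31}(X - X_c) + m_{32}(Y - Y_c) + m_{33}(Z - Z_c)}
```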
The CTIO Acquisition CCD-TV camera design
NASA Astrophysics Data System (ADS)
Schmidt, Ricardo E.
1990-07-01
A CCD-based Acquisition TV Camera has been developed at CTIO to replace the existing ISIT units. In a 60-second exposure, the new camera shows a sixfold improvement in sensitivity over an ISIT used with a Leaky Memory. Integration times can be varied over a 0.5 to 64 second range. The CCD, contained in an evacuated enclosure, is operated at -45 °C. Only the image section, an area of 8.5 mm x 6.4 mm, gets exposed to light. Pixel size is 22 microns and either no binning or 2 x 2 binning can be selected. The typical readout rates used vary between 3.5 and 9 microseconds/pixel. Images are stored in a PC/XT/AT, which generates RS-170 video. The contrast in the RS-170 frames is automatically enhanced by the software.
ERIC Educational Resources Information Center
Petzold, Paul
The amateur movie camera differs from a still camera on several important points. The author explores these differences and discusses the various ways they may be used to advantage. He describes in detail the workings of basic equipment--cameras, exposure meters, lenses, films, and lights--and demonstrates the proper use of each. Techniques such…
Light-pollution measurement with the Wide-field all-sky image analyzing monitoring system
NASA Astrophysics Data System (ADS)
Vítek, S.
2017-07-01
The purpose of this experiment was to measure light pollution in Prague, the capital of the Czech Republic. The measuring instrument is a calibrated consumer-level digital single-lens reflex camera with an IR cut filter; the paper therefore reports results of measuring and monitoring light pollution in the wavelength range of 390 - 700 nm, which most affects visual-range astronomy. Combining frames of different exposure times taken with a digital camera coupled to a fish-eye lens allows the creation of high dynamic range images containing meaningful values, so such a system can provide absolute values of the sky brightness.
Radiometric calibration of wide-field camera system with an application in astronomy
NASA Astrophysics Data System (ADS)
Vítek, Stanislav; Nasyrova, Maria; Stehlíková, Veronika
2017-09-01
Camera response function (CRF) is widely used for the description of the relationship between scene radiance and image brightness. The most common application of a CRF is High Dynamic Range (HDR) reconstruction of the radiance maps of imaged scenes from a set of frames with different exposures. The main goal of this work is to provide an overview of CRF estimation algorithms and compare their outputs with results obtained under laboratory conditions. These algorithms, typically designed for multimedia content, unfortunately perform poorly on astronomical image data, mostly owing to its nature (blur, noise, and long exposures). Therefore, we propose an optimization of selected methods for use in an astronomical imaging application. Results are experimentally verified on a wide-field camera system using a Digital Single Lens Reflex (DSLR) camera.
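The HDR reconstruction that an estimated CRF feeds into can be sketched with the standard Debevec-Malik merging formula. The snippet assumes a linear sensor, i.e. g(z) = ln z, and uniform weights purely for illustration; a real CRF estimate and a proper weighting function would replace both:

```python
import numpy as np

def merge_radiance(pixels, times, g=np.log, w=None):
    """HDR radiance estimate for one pixel seen at several exposure times.

    ln E = sum_j w_j * (g(z_j) - ln t_j) / sum_j w_j   (Debevec-Malik form),
    with g the inverse camera response; here g(z) = ln z (linear sensor).
    """
    z = np.asarray(pixels, dtype=float)
    t = np.asarray(times, dtype=float)
    if w is None:
        w = np.ones_like(z)    # uniform weights, for the sketch only
    ln_E = np.sum(w * (g(z) - np.log(t))) / np.sum(w)
    return np.exp(ln_E)

# A linear pixel observed at three exposures: z = E * t with true E = 2.0.
E = merge_radiance(pixels=[0.5, 1.0, 2.0], times=[0.25, 0.5, 1.0])
```

With astronomical data the long exposures, blur, and noise the abstract mentions mainly corrupt the estimation of g, which is why the merge itself is the easy half of the problem.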
Portal imaging with flat-panel detector and CCD camera
NASA Astrophysics Data System (ADS)
Roehrig, Hans; Tang, Chuankun; Cheng, Chee-Wai; Dallas, William J.
1997-07-01
This paper provides a comparison of imaging parameters of two portal imaging systems at 6 MV: a flat panel detector and a CCD-camera-based portal imaging system. Measurements were made of the signal and noise, and consequently of signal-to-noise per pixel, as a function of the exposure. Both systems have a linear response with respect to exposure, and the noise is proportional to the square root of the exposure, indicating photon-noise limitation. The flat-panel detector has a signal-to-noise ratio higher than that observed with the CCD-camera-based portal imaging system. This is expected because most portal imaging systems using optical coupling with a lens exhibit severe quantum sinks. The paper also presents data on the screen's photon gain (the number of light photons per interacting x-ray photon), as well as on the magnitude of the Swank noise (which describes fluctuations in the screen's photon gain). Images of a Las Vegas-type aluminum contrast-detail phantom, located at the isocenter, were generated at an exposure of 1 MU. The CCD-camera-based system permits detection of aluminum holes of 0.01194 cm diameter and 0.228 mm depth, while the flat-panel detector permits detection of aluminum holes of 0.01194 cm diameter and 0.1626 mm depth, indicating a better signal-to-noise ratio. Rank order filtering was applied to the raw images from the CCD-based system in order to remove the direct hits. These are camera responses to scattered x-ray photons which interact directly with the CCD of the camera and generate 'salt and pepper' type noise, which interferes severely with attempts to determine accurate estimates of the image noise.
Development of biostereometric experiments. [stereometric camera system
NASA Technical Reports Server (NTRS)
Herron, R. E.
1978-01-01
The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.
On a novel low cost high accuracy experimental setup for tomographic particle image velocimetry
NASA Astrophysics Data System (ADS)
Discetti, Stefano; Ianiro, Andrea; Astarita, Tommaso; Cardone, Gennaro
2013-07-01
This work deals with the critical aspects related to cost reduction of a Tomo-PIV setup and to the bias errors introduced into the velocity measurements by the coherent motion of the ghost particles. The proposed solution consists of using two independent imaging systems composed of three (or more) low-speed single-frame cameras, which can be up to ten times cheaper than double-shutter cameras with the same image quality. Each imaging system is used to reconstruct a particle distribution in the same measurement region, relative to the first and the second exposure, respectively. The reconstructed volumes are then interrogated by cross-correlation in order to obtain the measured velocity field, as in the standard tomographic PIV implementation. Moreover, unlike in standard tomographic PIV, the ghost particle distributions of the two exposures are uncorrelated, since their spatial distribution depends on the camera orientation. For this reason, the proposed solution promises more accurate results, without the bias effect of the coherent motion of the ghost particles. Guidelines for the implementation and application of the present method are proposed. Performance is assessed with a parametric study on synthetic experiments. The proposed low-cost system produces much lower modulation than an equivalent three-camera system. Furthermore, the potential accuracy improvement when using the Motion Tracking Enhanced MART (Novara et al 2010 Meas. Sci. Technol. 21 035401) is much higher than in the standard implementation of tomographic PIV.
NASA Astrophysics Data System (ADS)
Barla, Lindi; Verdaasdonk, Rudolf M.; Rustemeyer, Thomas; Klaessens, John; van der Veen, Albert
2016-02-01
Allergy testing is usually performed by exposing the skin to small quantities of potential allergens on the inner forearm and scratching the protective epidermis to increase exposure. After 15 minutes the dermatologist performs a visual check for swelling and erythema, which is subjective and difficult for, e.g., dark skin types. A small smartphone-based thermal camera (FLIR One) was used to obtain quantitative images in a feasibility study of 17 patients. Directly after allergen exposure on the forearm, thermal images were captured at 30-second intervals and processed into a time-lapse movie over 15 minutes. Considering the 'subjective' reading of the dermatologist as the gold standard, in 11/17 patients (65%) the evaluation of the dermatologist was confirmed by the thermal camera, including 5 of 6 patients without allergic response. In 7 patients thermal imaging showed additional spots. Of the 342 sites tested, the dermatologist detected 47 allergies, of which 28 (60%) were confirmed by thermal imaging, while thermal imaging showed 12 additional spots. The method can be improved with user-dedicated acquisition software and better registration between normal and thermal images. The lymphatic reaction seems to shift from the original puncture site. The interpretation of the thermal images is still subjective, since collecting quantitative data is difficult due to patient motion during the 15 minutes. Although not yet conclusive, thermal imaging appears promising for improving the sensitivity and selectivity of allergy testing using a smartphone-based camera.
X-rays only when you want them: optimized pump–probe experiments using pseudo-single-bunch operation
Hertlein, M. P.; Scholl, A.; Cordones, A. A.; Lee, J. H.; Engelhorn, K.; Glover, T. E.; Barbrel, B.; Sun, C.; Steier, C.; Portmann, G.; Robin, D. S.
2015-01-01
Laser pump–X-ray probe experiments require control over the X-ray pulse pattern and timing. Here, the first use of pseudo-single-bunch mode at the Advanced Light Source in picosecond time-resolved X-ray absorption experiments on solutions and solids is reported. In this mode the X-ray repetition rate is fully adjustable from single shot to 500 kHz, allowing it to be matched to typical laser excitation pulse rates. Suppressing undesired X-ray pulses considerably reduces detector noise and improves signal to noise in time-resolved experiments. In addition, dose-induced sample damage is considerably reduced, easing experimental setup and allowing the investigation of less robust samples. Single-shot X-ray exposures of a streak camera detector using a conventional non-gated charge-coupled device (CCD) camera are also demonstrated. PMID:25931090
Camera characterization for all-sky polarization measurements during the 2017 solar eclipse
NASA Astrophysics Data System (ADS)
Hashimoto, Taiga; Dahl, Laura M.; Laurie, Seth A.; Shaw, Joseph A.
2017-08-01
A solar eclipse provides a rare opportunity to observe skylight polarization during conditions that are fundamentally different than what we see every day. On 21 August 2017 we will measure the skylight polarization during a total solar eclipse in Rexburg, Idaho, USA. Previous research has shown that during totality the sky polarization pattern is altered significantly to become nominally symmetric about the zenith. However, there are still questions remaining about the details of how surface reflectance near the eclipse observation site and optical properties of aerosols in the atmosphere influence the totality sky polarization pattern. We will study how skylight polarization in a solar eclipse changes through each phase and how surface and atmospheric features affect the measured polarization signatures. To accomplish this, fully characterizing the cameras and fisheye lenses is critical. This paper reports measurements that include finding the camera sensitivity and its relationship to the required short exposure times, measuring the camera's spectral response function, mapping the angles of each camera pixel with the fisheye lens, and taking test measurements during daytime and twilight conditions. The daytime polarimetric images were compared to images from an existing all-sky polarization imager and a polarimetric radiative transfer model.
An optimal algorithm for reconstructing images from binary measurements
NASA Astrophysics Data System (ADS)
Yang, Feng; Lu, Yue M.; Sbaiz, Luciano; Vetterli, Martin
2010-01-01
We have studied a camera with a very large number of binary pixels, referred to as the gigavision camera [1] or the gigapixel digital film camera [2, 3]. Potential advantages of this new camera design include improved dynamic range, thanks to its logarithmic sensor response curve, and reduced exposure time in low-light conditions, due to its highly sensitive photon detection mechanism. We use a maximum likelihood estimator (MLE) to reconstruct a high-quality conventional image from the binary sensor measurements of the gigavision camera. We prove that when the threshold T is "1", the negative log-likelihood function is convex, so the optimal solution can be found by convex optimization. Based on filter bank techniques, fast algorithms are given for computing the gradient of the negative log-likelihood function and its products with the Hessian matrix. We show that with a minor change, our algorithm also works for estimating conventional images from multiple binary images. Numerical experiments with synthetic 1-D signals and images verify the effectiveness and quality of the proposed algorithm. Experimental results also show that estimation performance can be improved by increasing the oversampling factor or the number of binary images.
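For the simplest case treated in this abstract (threshold T = 1), the estimator even admits a closed form. The sketch below illustrates that structure only; it is not the paper's filter-bank implementation, and the Poisson model is the standard one assumed for photon counting.

```python
import math

def mle_intensity(ones, n_pixels):
    """MLE of the light intensity lambda (expected photon count over a group
    of n_pixels binary pixels) with threshold T = 1.

    Each binary pixel fires iff it absorbs at least one photon, so under a
    Poisson model P(pixel = 1) = 1 - exp(-lambda / n).  Maximising the
    likelihood (whose negative log is convex for T = 1, as the paper proves)
    gives lambda_hat = -n * ln(1 - k / n) for k fired pixels out of n.
    """
    if ones >= n_pixels:
        raise ValueError("all pixels fired; the MLE is unbounded")
    return -n_pixels * math.log(1.0 - ones / n_pixels)
```

For example, 600 fired pixels out of 1000 give an estimate of about 916 expected photons. For general thresholds no closed form exists, which is where the paper's convex-optimization machinery comes in.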
Synchronizing A Stroboscope With A Video Camera
NASA Technical Reports Server (NTRS)
Rhodes, David B.; Franke, John M.; Jones, Stephen B.; Dismond, Harriet R.
1993-01-01
Circuit synchronizes flash of light from stroboscope with frame and field periods of video camera. Sync stripper sends vertical-synchronization signal to delay generator, which generates trigger signal. Flashlamp power supply accepts delayed trigger signal and sends pulse of power to flash lamp. Designed for use in making short-exposure images that "freeze" flow in wind tunnel. Also used for making longer-exposure images obtained by use of continuous intense illumination.
Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System
NASA Astrophysics Data System (ADS)
Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki
In this paper, we present an automatic vision-based traffic sign recognition system, which can detect and classify traffic signs at long distance under different lighting conditions. To this end, the traffic sign recognition is developed in an originally proposed dual-focal active camera system. In this system, a telephoto camera is equipped as an assistant of a wide-angle camera. The telephoto camera can capture a high-resolution image of an object of interest in the field of view of the wide-angle camera, providing enough information for recognition when the resolution of the traffic sign in the wide-angle image is too low. In the proposed system, traffic sign detection and classification are processed separately on the images from the wide-angle camera and the telephoto camera. In addition, in order to detect traffic signs against complex backgrounds in different lighting conditions, we propose a color transformation that is invariant to lighting changes. This color transformation highlights the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide-angle camera. After detection, the system actively captures a high-resolution image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on the information from the wide-angle camera. In classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-resolution image from the telephoto camera. Finally, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs appearing at low resolution in different lighting conditions.
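The abstract does not specify the exact lighting-invariant color transformation used. A common choice with the same intent, cancelling a per-pixel multiplicative illumination factor, is normalized chromaticity, sketched here purely as a stand-in:

```python
import numpy as np

def chromaticity(rgb):
    """Map an H x W x 3 float RGB image to normalized chromaticity (r, g).

    Dividing each channel by the per-pixel brightness R+G+B cancels any
    multiplicative illumination change, a generic lighting-invariant
    transform.  NOTE: illustrative stand-in, not the paper's transform.
    """
    s = rgb.sum(axis=2, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on black pixels
    return rgb[..., :2] / s  # the blue ratio is redundant (r + g + b = 1)
```

Scaling the whole image by any positive factor leaves the output unchanged, which is the invariance property the detector relies on.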
NASA Astrophysics Data System (ADS)
Georgiou, Giota; Verdaasdonk, Rudolf M.; van der Veen, Albert; Klaessens, John H.
2017-02-01
In the development of new near-infrared (NIR) fluorescence dyes for image-guided surgery, there is a need for NIR-sensitive camera systems that can easily be adjusted to specific wavelength ranges, in contrast to present clinical systems that are optimized only for ICG. To test alternative camera systems, a setup was developed to mimic the fluorescence light in a tissue phantom and measure sensitivity and resolution. Selected narrow-band NIR LEDs were used to illuminate a 6 mm diameter circular diffuse plate, creating a uniform, intensity-controllable light spot (μW-mW) as a target/source for NIR cameras. Layers of (artificial) tissue with controlled thickness could be placed on the spot to mimic a fluorescent 'cancer' embedded in tissue. This setup was used to compare a range of NIR-sensitive consumer cameras for potential use in image-guided surgery. The image of the spot obtained with each camera was captured and analyzed using ImageJ software. Enhanced-CCD night vision cameras were the most sensitive, capable of showing intensities < 1 μW through 5 mm of tissue. However, they offered no control over the automatic gain and hence the noise level. NIR-sensitive DSLR cameras proved relatively less sensitive but could be fully manually controlled in gain (ISO 25600) and exposure time, and are therefore preferred for a clinical setting in combination with Wi-Fi remote control. The NIR fluorescence testing setup proved useful for camera testing and can be used for development and quality control of new NIR fluorescence guided surgery equipment.
Ultrafast Imaging using Spectral Resonance Modulation
NASA Astrophysics Data System (ADS)
Huang, Eric; Ma, Qian; Liu, Zhaowei
2016-04-01
CCD cameras are ubiquitous in research labs, industry, and hospitals for a huge variety of applications, but there are many dynamic processes in nature that unfold too quickly to be captured. Although tradeoffs can be made between exposure time, sensitivity, and area of interest, ultimately the speed limit of a CCD camera is constrained by the electronic readout rate of the sensors. One potential way to improve the imaging speed is with compressive sensing (CS), a technique that allows for a reduction in the number of measurements needed to record an image. However, most CS imaging methods require spatial light modulators (SLMs), which are subject to mechanical speed limitations. Here, we demonstrate an etalon array based SLM without any moving elements that is unconstrained by either mechanical or electronic speed limitations. This novel spectral resonance modulator (SRM) shows great potential in an ultrafast compressive single pixel camera.
SHOK—The First Russian Wide-Field Optical Camera in Space
NASA Astrophysics Data System (ADS)
Lipunov, V. M.; Gorbovskoy, E. S.; Kornilov, V. G.; Panasyuk, M. I.; Amelushkin, A. M.; Petrov, V. L.; Yashin, I. V.; Svertilov, S. I.; Vedenkin, N. N.
2018-02-01
Two fast, fixed, very wide-field SHOK cameras are installed onboard the Lomonosov spacecraft. The main goal of this experiment is the observation of GRB optical emission before, during, and after the gamma-ray emission. The field of view of each camera is placed within the gamma-ray burst detection area of the other instruments onboard the Lomonosov spacecraft. SHOK provides measurements of optical emission with a magnitude limit of ˜9-10m on a single frame with an exposure of 0.2 seconds. The device is designed for continuous sky monitoring at optical wavelengths in a very wide field of view (1000 square degrees per camera), and for detection and localization of fast time-varying (transient) optical sources on the celestial sphere, including provisional and synchronous time recording of optical emission from the gamma-ray burst error boxes detected by the BDRG device and initiated by a control signal (alert trigger) from the BDRG. The Lomonosov spacecraft carries two identical devices, SHOK1 and SHOK2. The core of each SHOK device is a fast 11-megapixel CCD. Each SHOK device is a monoblock consisting of an optical-emission observation node, an electronics node, elements of the mechanical construction, and the body.
Megapixel mythology and photospace: estimating photospace for camera phones from large image sets
NASA Astrophysics Data System (ADS)
Hultgren, Bror O.; Hertel, Dirk W.
2008-01-01
It is a myth that more pixels alone result in better images. The marketing of camera phones in particular has focused on their pixel counts. However, their performance varies considerably according to the conditions of image capture. Camera phones are often used in low-light situations where the lack of a flash and limited exposure time will produce underexposed, noisy and blurred images. Camera utilization can be quantitatively described by photospace distributions, a statistical description of the frequency of pictures taken at varying light levels and camera-subject distances. If the photospace distribution is known, the user-experienced distribution of quality can be determined either directly by measurement of subjective quality, or by photospace-weighting of objective attributes. Populating a photospace distribution requires examining large numbers of images taken under typical camera phone usage conditions. ImagePhi was developed as a user-friendly software tool to interactively estimate the primary photospace variables, subject illumination and subject distance, from individual images. Additionally, subjective evaluations of image quality and failure modes for low-quality images can be entered into ImagePhi. ImagePhi has been applied to sets of images taken by typical users with a selection of popular camera phones varying in resolution. The estimated photospace distribution of camera phone usage has been correlated with the distributions of failure modes. The subjective and objective data show that photospace conditions have a much bigger impact on the image quality of a camera phone than the pixel count of its imager. The 'megapixel myth' is thus seen to be less a myth than an ill-framed conditional assertion, whose conditions are to a large extent specified by the camera's operational state in photospace.
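A photospace distribution of the kind described can be populated as a simple normalized 2D histogram over the two primary variables. The bin edges below are illustrative choices, not values from the paper:

```python
import numpy as np

def photospace(lux, distance_m, lux_edges, dist_edges):
    """Estimate a photospace distribution: the relative frequency of pictures
    taken in each (subject illumination, subject distance) cell.

    lux, distance_m: per-image estimates, as a tool like the ImagePhi
    described above would produce (names here are illustrative).
    """
    counts, _, _ = np.histogram2d(lux, distance_m,
                                  bins=[lux_edges, dist_edges])
    return counts / counts.sum()  # normalise to a probability mass
```

Weighting per-cell objective quality scores by this mass then yields the user-experienced quality distribution the abstract refers to.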
Comparison of laser Doppler and laser speckle contrast imaging using a concurrent processing system
NASA Astrophysics Data System (ADS)
Sun, Shen; Hayes-Gill, Barrie R.; He, Diwei; Zhu, Yiqun; Huynh, Nam T.; Morgan, Stephen P.
2016-08-01
Full-field laser Doppler imaging (LDI) and single-exposure laser speckle contrast imaging (LSCI) are directly compared using a novel instrument which can concurrently image blood flow with both LDI and LSCI signal processing. Incorporating a commercial CMOS camera chip and a field-programmable gate array (FPGA), the flow images of LDI and the contrast maps of LSCI are simultaneously processed from the same detected optical signals. The comparison was carried out by imaging a rotating diffuser. LDI has a linear response to velocity. In contrast, LSCI is exposure-time dependent and does not provide a linear response in the presence of static speckle. It is also demonstrated that LDI and LSCI can be related through a power law which depends on the exposure time of LSCI.
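The LSCI side of the comparison reduces to computing the speckle contrast K = σ/⟨I⟩ over local windows of a single-exposure frame. A dependency-free sketch using non-overlapping tiles (a real pipeline would typically use a sliding window):

```python
import numpy as np

def speckle_contrast(frame, win=7):
    """Speckle contrast K = std/mean on non-overlapping win x win tiles of a
    single-exposure frame.  Low K indicates motion blurring of the speckle
    (high flow); K near 1 indicates fully developed static speckle."""
    h, w = frame.shape
    f = frame[:h - h % win, :w - w % win].astype(float)  # crop to tile grid
    tiles = f.reshape(h // win, win, w // win, win).swapaxes(1, 2)
    mean = tiles.mean(axis=(2, 3))
    std = tiles.std(axis=(2, 3))
    return std / np.maximum(mean, 1e-12)  # guard against empty tiles
```

A uniform (fully blurred) region gives K = 0, while exponentially distributed intensities, the idealized static-speckle statistics, give K close to 1, bracketing the range the instrument maps to flow.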
NASA Astrophysics Data System (ADS)
Richards, Lisa M.; Kazmi, S. M. S.; Olin, Katherine E.; Waldron, James S.; Fox, Douglas J.; Dunn, Andrew K.
2017-03-01
Monitoring cerebral blood flow (CBF) during neurosurgery is essential for detecting ischemia in a timely manner for a wide range of procedures. Multiple clinical studies have demonstrated that laser speckle contrast imaging (LSCI) has high potential to be a valuable, label-free CBF monitoring technique during neurosurgery. LSCI is an optical imaging method that provides blood flow maps with high spatiotemporal resolution requiring only a coherent light source, a lens system, and a camera. However, the quantitative accuracy and sensitivity of LSCI is limited and highly dependent on the exposure time. An extension to LSCI called multi-exposure speckle imaging (MESI) overcomes these limitations, and was evaluated intraoperatively in patients undergoing brain tumor resection. This clinical study (n = 7) recorded multiple exposure times from the same cortical tissue area, and demonstrates that shorter exposure times (≤1 ms) provide the highest dynamic range and sensitivity for sampling flow rates in human neurovasculature. This study also combined exposure times using the MESI model, demonstrating high correlation with proper image calibration and acquisition. The physiological accuracy of speckle-estimated flow was validated using conservation of flow analysis on vascular bifurcations. Flow estimates were highly conserved in MESI and 1 ms exposure LSCI, with percent errors at 6.4% ± 5.3% and 7.2% ± 7.2%, respectively, while 5 ms exposure LSCI had higher errors at 21% ± 10% (n = 14 bifurcations). Results from this study demonstrate the importance of exposure time selection for LSCI, and that intraoperative MESI can be performed with high quantitative accuracy.
Qualification of Engineering Camera for Long-Duration Deep Space Missions
NASA Technical Reports Server (NTRS)
Ramesham, Rajeshuni; Maki, Justin N.; Pourangi, Ali M.; Lee, Steven W.
2012-01-01
Qualification and verification of advanced electronic packaging and interconnect technologies, and of various other hardware elements, for the Mars Exploration Rover (MER) Spirit and Opportunity and Mars Science Laboratory (MSL) flight projects has been performed to enhance mission assurance. The qualification of hardware (an engineering camera) under extreme cold temperatures has been performed with reference to various Mars-related project requirements. Flight-like packages, sensors, and subassemblies were selected for the study to survive three times the total number of expected diurnal temperature cycles resulting from all environmental and operational exposures occurring over the life of the flight hardware, including all relevant manufacturing, ground operations, and mission phases. Qualification was performed by subjecting the flight-like hardware to the environmental temperature extremes and assessing any structural failures or degradation in electrical performance due to either overstress or thermal-cycle fatigue. Engineering camera packaging designs, charge-coupled devices (CCDs), and temperature sensors were successfully qualified for MER and MSL per JPL design principles. Package failures were observed during the qualification process, and package redesigns were then made to enhance reliability and subsequent mission assurance. These results show the technology is promising for MSL, and especially for long-term missions to extreme temperature conditions. The engineering camera has been completely qualified for the MSL project, with proven ability to survive on Mars for 2010 sols, or 670 sols times three. Finally, the camera remained functional even after 2010 thermal cycles.
Advanced illumination control algorithm for medical endoscopy applications
NASA Astrophysics Data System (ADS)
Sousa, Ricardo M.; Wäny, Martin; Santos, Pedro; Morgado-Dias, F.
2015-05-01
The CMOS image sensor manufacturer AWAIBA provides the world's smallest digital camera modules to the world market for minimally invasive surgery and single-use endoscopic equipment. Based on the world's smallest digital camera head and its evaluation board, the aim of this paper is to demonstrate a fast-response dynamic control algorithm for the illumination LED source coupled to the camera head, acting through the LED drivers embedded on the evaluation board. Cost-efficient and small-size endoscopic camera modules nowadays embed minimal-size image sensors capable of adjusting not only gain and exposure time but also LED illumination with adjustable illumination power. The LED illumination power has to be dynamically adjusted while navigating the endoscope over illumination conditions changing by several orders of magnitude within fractions of a second, to guarantee a smooth viewing experience. The algorithm is centered on pixel analysis of selected ROIs, enabling it to dynamically adjust the illumination intensity based on the measured pixel saturation level. The control core was developed in VHDL and tested in a laboratory environment over changing light conditions. The obtained results show that it is capable of achieving correction speeds under 1 s while maintaining a static error below 3% relative to the total number of pixels in the image. The result of this work will allow the integration of millimeter-sized high-brightness LED sources on minimal-form-factor cameras, enabling their use in endoscopic surgical robotics or micro-invasive surgery.
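The controller itself runs as VHDL on the evaluation board; as a purely behavioural sketch, a proportional loop that drives the fraction of saturated ROI pixels toward a small target captures the idea. All gains, scene values, and the saturation target below are made up for illustration:

```python
import numpy as np

def led_control(scene_gain, power=0.5, target=0.02, k_p=0.5, steps=50):
    """Saturation-driven illumination control loop (behavioural sketch only).

    Each step: render the ROI under the current LED power, measure the
    fraction of saturated pixels, and nudge the power proportionally toward
    the target saturation fraction.
    """
    rng = np.random.default_rng(1)
    reflectance = rng.random((64, 64))  # stand-in ROI scene
    sat_frac = 0.0
    for _ in range(steps):
        roi = np.clip(reflectance * scene_gain * power * 255.0, 0, 255)
        sat_frac = float(np.mean(roi >= 255))
        power *= 1.0 - k_p * (sat_frac - target)  # proportional correction
    return power, sat_frac
```

Starting from a strongly overexposed scene, the loop settles near the target saturation level, mirroring the "static error below 3% of pixels" behaviour reported for the FPGA implementation.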
Automatic exposure control for space sequential camera
NASA Technical Reports Server (NTRS)
Mcatee, G. E., Jr.; Stoap, L. J.; Solheim, C. D.; Sharpsteen, J. T.
1975-01-01
The final report of the automatic exposure control study for space sequential cameras, prepared for the NASA Johnson Space Center, is presented. The material is shown in the same sequence in which the work was performed. The purpose of the automatic exposure control is to automatically control the lens iris as well as the camera shutter so that the subject is properly exposed on the film. A study of design approaches is presented. Analysis of the range of light levels covered indicates that the practical range would be from approximately 20 to 6,000 foot-lamberts, or about nine f-stops. Observation of film available from space flights shows that optimum scene illumination is apparently not present in vehicle-interior photography or in vehicle-to-vehicle situations. The evaluation test procedure for a breadboard, and the results, which provided information for the design of a brassboard, are given.
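The f-stop figure follows from the base-2 logarithm of the luminance ratio, since each stop halves or doubles the light reaching the film:

```python
import math

lo, hi = 20.0, 6000.0       # scene luminance range, foot-lamberts
stops = math.log2(hi / lo)  # each f-stop is one doubling/halving of exposure
```

log2(6000/20) = log2(300) ≈ 8.2 doublings, which the report rounds to roughly nine f-stops.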
Confocal retinal imaging using a digital light projector with a near infrared VCSEL source
NASA Astrophysics Data System (ADS)
Muller, Matthew S.; Elsner, Ann E.
2018-02-01
A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1" LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging.
High-accuracy 3D measurement system based on multi-view and structured light
NASA Astrophysics Data System (ADS)
Li, Mingyue; Weng, Dongdong; Li, Yufeng; Zhang, Longbin; Zhou, Haiyun
2013-12-01
3D surface reconstruction is one of the most important topics in Spatial Augmented Reality (SAR). Structured light is a simple and rapid method for reconstructing objects. In order to improve the precision of 3D reconstruction, we present a high-accuracy multi-view 3D measurement system based on Gray code and phase shift. We use a camera and a light projector that casts structured light patterns on the objects. In this system, we use only one camera, photographing the left and right sides of the object in turn. In addition, we use VisualSFM to recover the relationships between the perspectives, so camera calibration can be omitted and the positions in which to place the camera are no longer limited. We also set an appropriate exposure time to make the scenes covered by Gray-code patterns more recognizable. All of the points above make the reconstruction more precise. We conducted experiments on different kinds of objects, and a large number of experimental results verify the feasibility and high accuracy of the system.
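Gray-code structured light relies on the binary-reflected Gray code, in which neighbouring stripe indices differ by a single bit, so a decoding error at a stripe boundary costs at most one quantisation step. The standard encode/decode pair is:

```python
def to_gray(n):
    """Binary-reflected Gray code of n (consecutive codes differ in 1 bit)."""
    return n ^ (n >> 1)

def from_gray(g):
    """Invert the Gray code by cascading XORs with right-shifted copies."""
    mask = g >> 1
    while mask:
        g ^= mask
        mask >>= 1
    return g
```

In a pattern sequence, bit k of each pixel's code is read from the k-th projected pattern; `from_gray` then recovers the stripe index used for triangulation.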
To catch a comet: Technical overview of CAN DO G-324
NASA Technical Reports Server (NTRS)
Obrien, T. J. (Editor)
1986-01-01
The primary objective of the C. E. Williams Middle School Get Away Special CAN DO is the photographing of Comet Halley. The project will involve middle school students, grades 6 through 8, in the study and interpretation of astronomical photographs and techniques. G-324 is contained in a 5 cubic foot GAS Canister with an opening door and pyrex window for photography. It will be pressurized with one atmosphere of dry nitrogen. Three 35mm still cameras with 250 exposure film backs and different focal length lenses will be fired by a combination of automatic timer and an active comet detector. A lightweight 35mm movie camera will shoot single exposures at about 1/2 minute intervals to give an overlapping skymap of the mission. The fifth camera is a solid state television camera specially constructed for detection of the comet by microprocessor.
Chambers, T; Pearson, A L; Stanley, J; Smith, M; Barr, M; Ni Mhurchu, C; Signal, L
2017-07-01
Exposure to alcohol marketing within alcohol retailers has been associated with higher rates of childhood drinking, brand recognition, and marketing recall. This study aimed to objectively measure children's everyday exposure to alcohol marketing within supermarkets. Children aged 11-13 (n = 167) each wore a wearable camera and GPS device for four consecutive days. Micro-spatial analyses were used to examine exposures within supermarkets. In alcohol-retailing supermarkets (n = 30), children encountered alcohol marketing on 85% of their visits (n = 78). Alcohol marketing was frequently placed near everyday goods (bread and milk) or the entrance/exit. Alcohol sales in supermarkets should be banned in order to protect children from alcohol marketing. Copyright © 2017 Elsevier Ltd. All rights reserved.
High-speed imaging using CMOS image sensor with quasi pixel-wise exposure
NASA Astrophysics Data System (ADS)
Sonoda, T.; Nagahara, H.; Endo, K.; Sugiyama, Y.; Taniguchi, R.
2017-02-01
Several recent studies in compressive video sensing have realized scene capture beyond the fundamental trade-off limit between spatial resolution and temporal resolution using random space-time sampling. However, most of these studies showed results for higher-frame-rate video that were produced by simulation experiments or by an optically simulated random-sampling camera, because there are currently no commercially available image sensors with random exposure or sampling capabilities. We fabricated a prototype complementary metal oxide semiconductor (CMOS) image sensor with quasi pixel-wise exposure timing that can realize nonuniform space-time sampling. The prototype sensor can reset exposures independently by column and fix the exposure duration by row within each 8x8 pixel block. This CMOS sensor is not fully controllable at the pixel level and has line-dependent controls, but it offers flexibility compared with regular CMOS or charge-coupled device sensors with global or rolling shutters. We propose a method that uses this flexibility to realize pseudo-random sampling for high-speed video acquisition, and we reconstruct the high-speed video sequence from the pseudo-randomly sampled images using an over-complete dictionary.
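One reading of the line-dependent controls (exposure start chosen per column, exposure duration per row, within each 8x8 block) can be sketched as a space-time mask generator. The time-slot count and this interpretation are assumptions for illustration, not the sensor's specification:

```python
import numpy as np

def block_exposure_mask(t_slots=16, block=8, seed=0):
    """Binary space-time sampling mask for one 8x8 block, under the assumed
    constraint that exposure *start* is set per column (column-wise reset)
    and exposure *length* per row (row-wise fixed amount).

    Returns an array of shape (t_slots, block, block): mask[t, r, c] = 1
    while pixel (r, c) is integrating during time slot t.
    """
    rng = np.random.default_rng(seed)
    start = rng.integers(0, t_slots // 2, size=block)   # one start per column
    length = rng.integers(1, t_slots // 2, size=block)  # one length per row
    t = np.arange(t_slots)[:, None, None]
    s = start[None, None, :]   # broadcast over columns
    d = length[None, :, None]  # broadcast over rows
    return ((t >= s) & (t < s + d)).astype(np.uint8)
```

Because starts and lengths vary across lines, the block samples many distinct space-time windows per readout, which is the nonuniform sampling the dictionary-based reconstruction exploits.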
Software for Acquiring Image Data for PIV
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Cheung, H. M.; Kressler, Brian
2003-01-01
PIV Acquisition (PIVACQ) is a computer program for acquisition of data for particle-image velocimetry (PIV). In the PIV system for which PIVACQ was developed, small particles entrained in a flow are illuminated with a sheet of light from a pulsed laser. The illuminated region is monitored by a charge-coupled-device camera that operates in conjunction with a data-acquisition system that includes a frame grabber and a counter/timer board, both installed in a single computer. The camera operates in "frame-straddle" mode, where a pair of images can be obtained closely spaced in time (on the order of microseconds). The frame grabber acquires image data from the camera and stores the data in the computer memory. The counter/timer board triggers the camera and synchronizes the pulsing of the laser with acquisition of data from the camera. PIVACQ coordinates all of these functions and provides a graphical user interface, through which the user can control the PIV data-acquisition system. PIVACQ enables the user to acquire a sequence of single-exposure images, display the images, process the images, and then save the images to the computer hard drive. PIVACQ works in conjunction with the PIVPROC program which processes the images of particles into the velocity field in the illuminated plane.
High dynamic spectroscopy using a digital micromirror device and periodic shadowing.
Kristensson, Elias; Ehn, Andreas; Berrocal, Edouard
2017-01-09
We present an optical solution called DMD-PS to boost the dynamic range of 2D imaging spectroscopic measurements up to 22 bits by incorporating a digital micromirror device (DMD) prior to detection in combination with the periodic shadowing (PS) approach. In contrast to high dynamic range (HDR), where the dynamic range is increased by recording several images at different exposure times, the current approach has the potential of improving the dynamic range from a single exposure and without saturation of the CCD sensor. In the procedure, the spectrum is imaged onto the DMD that selectively reduces the reflection from the intense spectral lines, allowing the signal from the weaker lines to be increased by a factor of 28 via longer exposure times, higher camera gains or increased laser power. This manipulation of the spectrum can either be based on a priori knowledge of the spectrum or by first performing a calibration measurement to sense the intensity distribution. The resulting benefits in detection sensitivity come, however, at the cost of strong generation of interfering stray light. To solve this issue the Periodic Shadowing technique, which is based on spatial light modulation, is also employed. In this proof-of-concept article we describe the full methodology of DMD-PS and demonstrate - using the calibration-based concept - an improvement in dynamic range by a factor of ~100 over conventional imaging spectroscopy. The dynamic range of the presented approach will directly benefit from future technological development of DMDs and camera sensors.
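The core DMD-PS manipulation, attenuating only the spectral lines that would saturate and dividing the known attenuation back out after a boosted exposure, can be sketched numerically. The 12-bit full well and x28 boost mirror the abstract's figures; everything else is illustrative:

```python
import numpy as np

def dmd_ps_measure(spectrum, full_well=4095.0, boost=28.0):
    """Sketch of the DMD-PS dynamic-range idea for a 1-D spectrum.

    The DMD reflection mask is set < 1 exactly where a boosted line would
    exceed the sensor's full well; after recording, the known mask and boost
    are divided back out to recover the true spectrum without saturation.
    """
    mask = np.minimum(1.0, full_well / (spectrum * boost + 1e-12))
    measured = spectrum * boost * mask         # what the camera records
    assert measured.max() <= full_well + 1e-6  # nothing saturates
    return measured / (boost * mask)           # undo the known attenuation
```

Weak lines keep the full x28 boost in signal while strong lines are pinned just below full well, which is where the order-of-magnitude dynamic-range gain comes from (the real system must additionally suppress the stray light that this abstract addresses with Periodic Shadowing).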
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taneja, S; Fru, L Che; Desai, V
Purpose: It is now commonplace to handle treatments of hyperthyroidism using iodine-131 as an outpatient procedure due to lower costs and less stringent federal regulations. The Nuclear Regulatory Commission has currently updated release guidelines for these procedures, but there is still a large uncertainty in the dose to the public. Current guidelines to minimize dose to the public require patients to remain isolated after treatment. The purpose of this study was to use a low-cost common device, such as a cell phone, to estimate exposure emitted from a patient to the general public. Methods: Measurements were performed using an Apple iPhone 3GS and a Cs-137 irradiator. The charge-coupled device (CCD) camera on the phone was irradiated to exposure rates ranging from 0.1 mR/hr to 100 mR/hr and 30-sec videos were taken during irradiation with the camera lens covered by electrical tape. Interactions were detected as white pixels on a black background in each video. Both single threshold (ST) and colony counting (CC) methods were performed using MATLAB®. Calibration curves were determined by comparing the total pixel intensity output from each method to the known exposure rate. Results: The calibration curve showed a linear relationship above 5 mR/hr for both analysis techniques. The number of events counted per unit exposure rate within the linear region was 19.5 ± 0.7 events/mR and 8.9 ± 0.4 events/mR for the ST and CC methods respectively. Conclusion: Two algorithms were developed and show a linear relationship between photons detected by a CCD camera and low exposure rates, in the range of 5 mR/hr to 100 mR/hr. Future work aims to refine this model by investigating the dose-rate and energy dependencies of the camera response. This algorithm allows for quantitative monitoring of exposure from patients treated with iodine-131 using a simple device outside of the hospital.
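The single-threshold (ST) analysis amounts to counting above-threshold pixels per tape-covered frame and fitting the counts against known exposure rates; the fitted slope plays the role of the paper's events/mR figure. A synthetic sketch (threshold value, frame size, and event counts are made-up):

```python
import numpy as np

def count_events(frame, threshold=50):
    """Single-threshold (ST) event count: every pixel brighter than the
    threshold in a lens-covered frame is treated as one radiation hit."""
    return int(np.count_nonzero(frame > threshold))

def calibrate(frames, rates_mR_per_hr):
    """Linear fit of events-per-frame against known exposure rate; the slope
    is the events/mR calibration factor (all numbers here are synthetic)."""
    counts = [count_events(f) for f in frames]
    slope, intercept = np.polyfit(rates_mR_per_hr, counts, 1)
    return slope, intercept
```

The colony-counting (CC) variant would instead group connected bright pixels into single events before fitting, which is why it reports fewer events per mR.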
A customizable commercial miniaturized 320×256 indium gallium arsenide shortwave infrared camera
NASA Astrophysics Data System (ADS)
Huang, Shih-Che; O'Grady, Matthew; Groppe, Joseph V.; Ettenberg, Martin H.; Brubaker, Robert M.
2004-10-01
The design and performance of a commercial short-wave-infrared (SWIR) InGaAs microcamera engine is presented. The 0.9-to-1.7 micron SWIR imaging system consists of a room-temperature-TEC-stabilized, 320x256 (25 μm pitch) InGaAs focal plane array (FPA) and a high-performance, highly customizable image-processing set of electronics. The detectivity, D*, of the system is greater than 10¹³ cm·√Hz/W at 1.55 μm, and this sensitivity may be adjusted in real-time over 100 dB. It features snapshot-mode integration with a minimum exposure time of 130 μs. The digital video processor provides real time pixel-to-pixel, 2-point dark-current subtraction and non-uniformity compensation along with defective-pixel substitution. Other features include automatic gain control (AGC), gamma correction, 7 preset configurations, adjustable exposure time, external triggering, and windowing. The windowing feature is highly flexible; the region of interest (ROI) may be placed anywhere on the imager and can be varied at will. Windowing allows for high-speed readout enabling such applications as target acquisition and tracking; for example, a 32x32 ROI window may be read out at over 3500 frames per second (fps). Output video is provided as EIA170-compatible analog, or as 12-bit CameraLink-compatible digital. All the above features are accomplished in a small volume < 28 cm³, weight < 70 g, and with low power consumption < 1.3 W at room temperature using this new microcamera engine. Video processing is based on a field-programmable gate array (FPGA) platform with a soft-embedded processor that allows for ease of integration/addition of customer-specific algorithms, processes, or design requirements. The camera was developed with the high-performance, space-restricted, power-conscious application in mind, such as robotic or UAV deployment.
NASA Technical Reports Server (NTRS)
Deker, H.
1971-01-01
The West German tracking stations are equipped with ballistic cameras. Plate measurement and plate reduction must therefore follow photogrammetric methods. Approximately 100 star positions and 200 satellite positions are measured on each plate. The mathematical model for spatial rotation of the bundle of rays is extended by including terms for distortion and internal orientation of the camera as well as by providing terms for refraction which are computed for the measured coordinates of the star positions on the plate. From the measuring accuracy of the plate coordinates it follows that the timing accuracy for the exposures has to be about one millisecond, in order to obtain a homogeneous system.
Confocal Retinal Imaging Using a Digital Light Projector with a Near Infrared VCSEL Source
Muller, Matthew S.; Elsner, Ann E.
2018-01-01
A custom near infrared VCSEL source has been implemented in a confocal non-mydriatic retinal camera, the Digital Light Ophthalmoscope (DLO). The use of near infrared light improves patient comfort, avoids pupil constriction, penetrates the deeper retina, and does not mask visual stimuli. The DLO performs confocal imaging by synchronizing a sequence of lines displayed with a digital micromirror device to the rolling shutter exposure of a 2D CMOS camera. Real-time software adjustments enable multiply scattered light imaging, which rapidly and cost-effectively emphasizes drusen and other scattering disruptions in the deeper retina. A separate 5.1″ LCD display provides customizable visible stimuli for vision experiments with simultaneous near infrared imaging. PMID:29899586
Optical readout of a two phase liquid argon TPC using CCD camera and THGEMs
NASA Astrophysics Data System (ADS)
Mavrokoridis, K.; Ball, F.; Carroll, J.; Lazos, M.; McCormick, K. J.; Smith, N. A.; Touramanis, C.; Walker, J.
2014-02-01
This paper presents a preliminary study into the use of CCDs to image secondary scintillation light generated by THick Gas Electron Multipliers (THGEMs) in a two phase LAr TPC. A Sony ICX285AL CCD chip was mounted above a double THGEM in the gas phase of a 40 litre two-phase LAr TPC, with the majority of the camera electronics positioned externally via a feedthrough. An Am-241 source was mounted on a rotatable motion feedthrough, allowing the alpha source to be positioned either inside or outside of the field cage. A novel high-voltage feedthrough featuring LAr insulation was developed for and incorporated into the TPC design. Furthermore, a range of webcams was tested for cryogenic operation as an internal detector-monitoring tool. Of the webcams tested, the Microsoft HD-3000 (model no. 1456) was found to be superior in terms of noise and lowest operating temperature. In 1 ppm-pure argon gas at ambient temperature and atmospheric pressure, the THGEM gain was ≈ 1000, and using a 1 ms exposure the CCD captured single alpha tracks. Successful operation of the CCD camera in two-phase cryogenic mode was also achieved. Using a 10 s exposure, a photograph of secondary scintillation light induced by the Am-241 source in LAr was captured for the first time.
Unattended real-time re-establishment of visibility in high dynamic range video and stills
NASA Astrophysics Data System (ADS)
Abidi, B.
2014-05-01
We describe a portable unattended persistent surveillance system that corrects for harsh illumination conditions, where bright sunlight creates mixed contrast effects, i.e., heavy shadows and washouts. These effects result in high dynamic range scenes, where illuminance can vary from a few lux to a six-figure value. When using regular monitors and cameras, such a wide span of illuminations can only be visualized if the actual range of values is compressed, leading to the creation of saturated and/or dark noisy areas and a loss of information in those areas. Images containing extreme mixed contrast cannot be fully enhanced from a single exposure, simply because not all information is present in the original data; active intervention in the acquisition process is required. A software package, capable of integrating multiple types of COTS and custom cameras, ranging from Unmanned Aerial Systems (UAS) data links to digital single-lens reflex (DSLR) cameras, is described. Hardware and software are integrated via a novel smart data acquisition algorithm, which communicates to the camera the parameters that would maximize information content in the final processed scene. A fusion mechanism is then applied to the smartly acquired data, resulting in an enhanced scene where information in both dark and bright areas is revealed. Multi-threading and parallel processing are exploited to produce automatic real-time full-motion corrected video. A novel enhancement algorithm was also devised to process data from legacy and non-controllable cameras. The software accepts and processes pre-recorded sequences and stills; enhances visible, night vision, and infrared data; and successfully applies to night-time and dark scenes. Various user options are available, integrating custom functionalities of the application into intuitive and easy-to-use graphical interfaces. 
The ensuing increase in visibility in surveillance video and intelligence imagery will expand the performance and timely decision making of the human analyst, as well as that of unmanned systems performing automatic data exploitation, such as target detection and identification.
Video-rate imaging of microcirculation with single-exposure oblique back-illumination microscopy
NASA Astrophysics Data System (ADS)
Ford, Tim N.; Mertz, Jerome
2013-06-01
Oblique back-illumination microscopy (OBM) is a new technique for simultaneous, independent measurements of phase gradients and absorption in thick scattering tissues based on widefield imaging. To date, OBM has been used with sequential camera exposures, which reduces temporal resolution, and can produce motion artifacts in dynamic samples. Here, a variation of OBM that allows single-exposure operation with wavelength multiplexing and image splitting with a Wollaston prism is introduced. Asymmetric anamorphic distortion induced by the prism is characterized and corrected in real time using a graphics-processing unit. To demonstrate the capacity of single-exposure OBM to perform artifact-free imaging of blood flow, video-rate movies of microcirculation in ovo in the chorioallantoic membrane of the developing chick are presented. Imaging is performed with a high-resolution rigid Hopkins lens suitable for endoscopy.
Time-resolved imaging of the plasma development in a triggered vacuum switch
NASA Astrophysics Data System (ADS)
Park, Wung-Hoa; Kim, Moo-Sang; Son, Yoon-Kyoo; Frank, Klaus; Lee, Byung-Joon; Ackerman, Thilo; Iberler, Marcus
2017-12-01
Triggered vacuum switches (TVS) are used in pulsed power technology as closing switches for high voltages and high charge transfer. A non-sealed-off prototype was designed with a side-on quartz window to investigate the evolution of the trigger discharge into the main discharge. Image acquisition was done with a fast PI-MAX2 CCD camera from Princeton Instruments, which has a minimum exposure time of 2 ns. The electrode configuration of the prototype is a conventional six-rod gap type; the capacitor bank has C = 16.63 μF, which at a 20 kV charging voltage corresponds to a total stored charge of 0.3 C or a total energy of 3.3 kJ. The peak current is 88 kA. Owing to the vastly different light intensities during the trigger and main discharges, the complete discharge is split into three phases: a trigger breakdown phase, an intermediate phase, and a main discharge phase. The CCD camera images of the first phase show instabilities of the trigger breakdown; in phase 2, three different discharge modes are observed. After the first current maximum the discharge behavior is reproducible.
Automatic Exposure Iris Control (AEIC) for data acquisition camera
NASA Technical Reports Server (NTRS)
Mcatee, G. E., Jr.; Stoap, L. J.; Solheim, C. D.; Sharpsteen, J. T.
1975-01-01
A lens design capable of operating over a total range of f/1.4 to f/11.0 with through-the-lens light sensing is presented, along with a system which compensates for ASA film speeds as well as shutter openings. The space shuttle camera system package is designed so that it can be assembled on the existing 16 mm DAC with a minimum of alteration to the camera.
Image Intensifier Modules For Use With Commercially Available Solid State Cameras
NASA Astrophysics Data System (ADS)
Murphy, Howard; Tyler, Al; Lake, Donald W.
1989-04-01
A modular approach to design has contributed greatly to the success of the family of machine vision video equipment produced by EG&G Reticon during the past several years. Internal modularity allows high-performance area (matrix) and line scan cameras to be assembled from two or three electronic subassemblies with very low labor costs, and permits camera control and interface circuitry to be realized by assemblages of various modules suiting the needs of specific applications. Product modularity benefits equipment users in several ways. Modular matrix and line scan cameras are available in identical enclosures (Fig. 1), which allows enclosure components to be purchased in volume for economies of scale and allows field replacement or exchange of cameras within a customer-designed system to be easily accomplished. The cameras are optically aligned (boresighted) at final test; modularity permits optical adjustments to be made with the same precise test equipment for all camera varieties. The modular cameras contain two, or sometimes three, hybrid microelectronic packages (Fig. 2). These rugged and reliable "submodules" perform all of the electronic operations internal to the camera except for the job of image acquisition performed by the monolithic image sensor. Heat produced by electrical power dissipation in the electronic modules is conducted through low-resistance paths to the camera case by metal plates, which results in a thermally efficient and environmentally tolerant camera with low manufacturing costs. A modular approach has also been followed in the design of the camera control, video processor, and computer interface accessory called the Formatter (Fig. 3). This unit can be attached directly onto either a line scan or matrix modular camera to form a self-contained unit, or connected via a cable to retain the advantages inherent in a small, lightweight, and rugged image-sensing component. 
Available modules permit the bus-structured Formatter to be configured as required by a specific camera application. Modular line and matrix scan cameras incorporating sensors with fiber optic faceplates (Fig. 4) are also available. These units retain the advantages of interchangeability, simple construction, ruggedness, and optical precision offered by the more common lens input units. Fiber optic faceplate cameras are used for a wide variety of applications. A common usage involves mating of the Reticon-supplied camera to a customer-supplied intensifier tube for low light level and/or short exposure time situations.
Fast camera imaging of dust in the DIII-D tokamak
NASA Astrophysics Data System (ADS)
Yu, J. H.; Rudakov, D. L.; Pigarov, A. Yu.; Smirnov, R. D.; Brooks, N. H.; Muller, S. H.; West, W. P.
2009-06-01
Naturally occurring and injected dust particles are observed in the DIII-D tokamak in the outer midplane scrape-off-layer (SOL) using a visible fast-framing camera, and the size of dust particles is estimated using the observed particle lifetime and theoretical ablation rate of a carbon sphere. Using this method, the lower limit of detected dust radius is ~3 μm and particles with inferred radius as large as ~1 mm are observed. Dust particle 2D velocities range from approximately 10 to 300 m/s, with velocities inversely correlated with dust size. Pre-characterized 2-4 μm diameter diamond dust particles are introduced at the lower divertor in an ELMing H-mode discharge using the divertor materials evaluation system (DiMES), and these particles are found to be at the lower size limit of detection using the camera with resolution of ~0.2 cm² per pixel and exposure time of 330 μs.
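The 2D velocity extraction described above can be sketched as follows. This is a hypothetical illustration, not the authors' pipeline: it assumes particle centroids have already been tracked frame to frame, with positions converted to cm using the camera's pixel scale and a known frame interval.

```python
import math

def track_speeds(positions_cm, frame_dt_s):
    """Mean 2D speed (m/s) from per-frame (x, y) centroid positions in cm.

    positions_cm: list of (x, y) tuples, one per consecutive frame.
    frame_dt_s:   time between frames in seconds.
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(positions_cm, positions_cm[1:]):
        d_cm = math.hypot(x1 - x0, y1 - y0)     # displacement between frames
        speeds.append(d_cm / 100.0 / frame_dt_s)  # cm -> m, then per second
    return sum(speeds) / len(speeds)
```

For example, a particle moving 1 cm per 1 ms frame corresponds to 10 m/s, within the 10-300 m/s range reported.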
First Results from the Wide Angle Camera of the ROSETTA Mission .
NASA Astrophysics Data System (ADS)
Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; De Cecco, M.; Parzianello, G.; Zaccariotto, M.; Da Deppo, V.; Naletto, G.
This paper gives a brief description of the Wide Angle Camera (WAC), built by the Center of Studies and Activities for Space (CISAS) of the University of Padova for the ESA ROSETTA Mission, of the data we have obtained about the new mission targets, and of the first results achieved after the launch in March 2004. The WAC is part of the OSIRIS imaging system, built under the PI-ship of Dr. U. Keller (Max-Planck-Institute for Solar System Studies), which also comprises a Narrow Angle Camera (NAC) built by the Laboratoire d'Astrophysique Spatiale (LAS) of Marseille. CISAS also had the responsibility to build the shutter and the front door mechanism for the NAC. The images show the excellent optical quality of the WAC, exceeding the specifications in terms of encircled energy (80% in one pixel over a FoV of 12×12 square degrees), limiting magnitude (fainter than 13th magnitude in a 30 s exposure through a wideband red filter), and amount of distortion.
All Sky Cloud Coverage Monitoring for SONG-China Project
NASA Astrophysics Data System (ADS)
Tian, J. F.; Deng, L. C.; Yan, Z. Z.; Wang, K.; Wu, Y.
2016-05-01
In order to monitor the cloud distribution at Qinghai station, a site selected for the SONG (Stellar Observations Network Group)-China node, the design of the prototype all-sky camera (ASC) used at Xinglong station was adopted. Both hardware and software improvements have been made to make it more precise and deliver quantitative measurements. An ARM (Advanced Reduced Instruction Set Computer Machine) MCU (Microcontroller Unit) instead of a PC is used to control the upgraded version of the ASC, and a much higher reliability has been realized in the current scheme. Independent of the positions of the Sun and Moon, the weather conditions are constantly changing; it is therefore difficult to choose proper exposure parameters using only the temporal information of the major light sources. Realistic exposure parameters for the ASC can instead be defined using a real-time sky brightness monitor that is also installed at the same site. The night sky brightness value is a very sensitive function of the cloud coverage, and can be accurately measured by the sky quality monitor. We study the correlation between the exposure parameter and the night sky brightness value, and give the mathematical relation. Images from the all-sky camera are inserted directly into a database. All-sky images are archived in FITS format, which can be used for further analysis.
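The paper's fitted relation between exposure parameter and sky brightness is not reproduced here. As an illustrative stand-in, one might scale exposure time with sky brightness in magnitudes, since sky flux falls by a factor of 10^0.4 per magnitude; the reference brightness and reference exposure below are assumed values, not the paper's.

```python
def exposure_s(sky_mag, ref_mag=18.0, ref_exposure_s=10.0):
    """Hypothetical exposure scaling from a reference sky brightness.

    sky_mag: measured night sky brightness (mag/arcsec^2); larger = darker.
    Exposure scales as 10**(0.4 * dm) to keep the sky signal roughly constant.
    """
    return ref_exposure_s * 10.0 ** (0.4 * (sky_mag - ref_mag))
```

Under this assumption, a sky 2.5 mag darker than the reference calls for a 10× longer exposure.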
A motion deblurring method with long/short exposure image pairs
NASA Astrophysics Data System (ADS)
Cui, Guangmang; Hua, Weiping; Zhao, Jufeng; Gong, Xiaoli; Zhu, Liyao
2018-01-01
In this paper, a motion deblurring method using long/short exposure image pairs is presented. The image pairs are captured of the same scene under different exposure times and serve as the input of the deblurring method, so that more information can be used to obtain a deblurred result of high image quality. First, luminance equalization is applied to the short-exposure image, and the blur kernel is estimated from the image pair under the maximum a posteriori (MAP) framework using a conjugate gradient algorithm. Then an L0-image-smoothing-based denoising method is applied to the luminance-equalized image, and the final deblurred result is obtained by a gain-controlled residual image deconvolution process with the edge map as the gain map. Furthermore, a real experimental optical system was built to capture the image pairs in order to demonstrate the effectiveness of the proposed deblurring framework; the long/short image pairs were obtained under different exposure times and camera gain settings. Experimental results show that the proposed method provides a superior deblurring result in both subjective and objective assessment compared with other deblurring approaches.
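The first step of the pipeline, luminance equalization of the short-exposure image, can be sketched as a simple global gain match to the long-exposure frame. This is a minimal stand-in for the paper's method, assuming 8-bit images and a purely multiplicative brightness difference between the exposures.

```python
import numpy as np

def equalize_luminance(short_img, long_img):
    """Scale the short-exposure image so its mean matches the long exposure."""
    gain = long_img.mean() / max(short_img.mean(), 1e-9)  # avoid divide-by-zero
    return np.clip(short_img.astype(float) * gain, 0.0, 255.0)
```

In practice a per-channel or histogram-based match would be more robust; the global gain suffices to illustrate the step.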
Performance of a 512 x 512 Gated CMOS Imager with a 250 ps Exposure Time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teruya, A T; Moody, J D; Hsing, W W
2012-10-01
We describe the performance of a 512×512 gated CMOS read-out integrated circuit (ROIC) with a 250 ps exposure time. A low-skew, H-tree trigger distribution system is used to locally generate individual pixel gates in each 8×8 neighborhood of the ROIC. The temporal width of the gate is voltage controlled and user selectable via a precision potentiometer. The gating implementation was first validated in optical tests of a 64×64 pixel prototype ROIC developed as a proof-of-concept during the early phases of the development program. The layout of the H-tree addresses each quadrant of the ROIC independently and admits operation of the ROIC in two modes. If “common mode” triggering is used, the camera provides a single 512×512 image. If independent triggers are used, the camera can provide up to four 256×256 images with a frame separation set by the trigger intervals. The ROIC design includes small (sub-pixel) optical photodiode structures to allow test and characterization of the ROIC using optical sources prior to bump bonding. Reported test results were obtained using short-pulse, second-harmonic Ti:Sapphire laser systems operating at λ ≈ 400 nm with sub-ps pulse widths.
Visualization of corona discharge induced by UV (248 nm) pulses of a KrF excimer laser
NASA Astrophysics Data System (ADS)
Mizeraczyk, Jerzy; Ohkubo, Toshikazu; Kanazawa, Seiji; Nomoto, Yukiharu; Kawasaki, Toshiyuki; Kocik, Marek
2000-11-01
A KrF excimer laser (248 nm) was used to induce DC corona discharge streamers in air between the electrodes of a needle-to-plane geometry. The UV laser beam pulses were transformed into a laser sheet (1.5 mm thick and 20 mm wide) that was positioned along the axis directed from the needle electrode to the plane electrode. The laser pulses were time-synchronized with the exposure of an ICCD camera that recorded images of the corona streamers induced by the laser sheet. The laser pulse energy flux (75 MW/cm²) crossing the gap was high enough to induce corona streamers with a reliability of 100%, even at relatively low operating voltages (e.g., 15 kV) at which self-sustained streamers could not occur. Due to the full synchronization of the corona streamer onset induced by the laser pulse and the exposure of the ICCD camera, 2-D visualization of the corona streamer evolution with a time resolution of 10 ns was possible. The recorded images made it possible to determine such features of the corona discharge streamer as its velocity (2.5 × 10⁵ m/s) and the diameters of the leader channel (200 μm) and the leader streamers (100 μm).
Method used to test the imaging consistency of binocular camera's left-right optical system
NASA Astrophysics Data System (ADS)
Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui
2016-09-01
For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing overall imaging consistency. Conventional optical system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table, and a CMOS camera has been established. First, the left and the right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained from a multiple-threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left and right images is established, and the imaging consistency is evaluated through the standard deviation σ of the imaging grayscale difference D (x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging consistency testing of binocular cameras. When the 3σ distribution of the imaging gray difference D (x, y) between the left and right optical systems of the binocular camera does not exceed 5%, the design requirements are considered to have been achieved. This method can be used effectively and paves the way for imaging consistency testing of binocular cameras.
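The evaluation criterion can be sketched directly: compute the per-pixel grayscale difference D(x, y) between the left and right images, take its standard deviation σ, and test the 3σ figure against the 5% threshold. Normalising the difference to the sensor's full scale (assumed 8-bit here) is our assumption about how the percentage is defined.

```python
import numpy as np

def imaging_consistency(left, right, full_scale=255.0):
    """Return sigma of the normalised gray difference D(x, y) and a pass flag.

    Passes when the 3-sigma spread of the difference stays within 5% of
    full scale; a uniform offset does not affect sigma, only variation does.
    """
    d = (left.astype(float) - right.astype(float)) / full_scale
    sigma = float(d.std())
    return sigma, 3.0 * sigma <= 0.05
```

Note that σ measures the spatial variation of the difference, so a constant brightness offset between the channels would need a separate check.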
NASA Astrophysics Data System (ADS)
Sun, Q. M.; Melnikov, A.; Mandelis, A.
2015-06-01
Carrierographic (spectrally gated photoluminescence) imaging of a crystalline silicon wafer using an InGaAs camera and two spread super-bandgap illumination laser beams is introduced in both low-frequency lock-in and high-frequency heterodyne modes. Lock-in carrierographic images of the wafer up to 400 Hz modulation frequency are presented. To overcome the frame rate and exposure time limitations of the camera, a heterodyne method is employed for high-frequency carrierographic imaging which results in high-resolution near-subsurface information. The feasibility of the method is guaranteed by the typical superlinearity behavior of photoluminescence, which allows one to construct a slow enough beat frequency component from nonlinear mixing of two high frequencies. Intensity-scan measurements were carried out with a conventional single-element InGaAs detector photocarrier radiometry system, and the nonlinearity exponent of the wafer was found to be around 1.7. Heterodyne images of the wafer up to 4 kHz have been obtained and qualitatively analyzed. With the help of the complementary lock-in and heterodyne modes, camera-based carrierographic imaging in a wide frequency range has been realized for fundamental research and industrial applications toward in-line nondestructive testing of semiconductor materials and devices.
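The heterodyne principle above can be illustrated numerically: a superlinear response I^1.7 to the sum of two modulations at nearby frequencies produces a spectral component at the beat frequency |f2 - f1|, which a strictly linear response does not. The frequencies, modulation depths, and sampling rate below are arbitrary demonstration values, not the experiment's parameters.

```python
import numpy as np

fs, n = 100_000.0, 100_000          # 1 s of samples at 100 kHz
t = np.arange(n) / fs
f1, f2 = 4000.0, 4010.0             # two nearby modulation frequencies

def beat_amplitude(exponent):
    """Spectral magnitude at the beat frequency for a power-law response."""
    intensity = (2.0 + 0.5 * np.sin(2 * np.pi * f1 * t)
                     + 0.5 * np.sin(2 * np.pi * f2 * t))
    signal = intensity ** exponent   # photoluminescence ~ intensity^exponent
    spec = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    return spec[np.argmin(np.abs(freqs - (f2 - f1)))]

# exponent 1.7 (superlinear) yields a strong 10 Hz beat; exponent 1.0 does not
```

The slow beat component is what lets a frame-rate-limited camera image a response driven at kilohertz modulation frequencies.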
High dynamic range image acquisition based on multiplex cameras
NASA Astrophysics Data System (ADS)
Zeng, Hairui; Sun, Huayan; Zhang, Tinghua
2018-03-01
High dynamic range imaging is an important technology in photoelectric information acquisition: it provides a higher dynamic range and more image detail, and better reflects the real environment's light and color information. Currently, high dynamic range image synthesis based on differently exposed image sequences cannot adapt to dynamic scenes: it fails to handle moving targets, resulting in ghosting artifacts. Therefore, a new high dynamic range image acquisition method based on a multiplex camera system is proposed. First, differently exposed image sequences are captured with the camera array; the deviation between images is obtained using a derivative optical flow method based on color gradients, and the images are aligned. Then, a high dynamic range image fusion weighting function is established by combining the inverse camera response function with the deviation between images, and is applied to generate a high dynamic range image. Experiments show that the proposed method can effectively obtain high dynamic range images of dynamic scenes and achieves good results.
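A minimal sketch of the fusion step, not the paper's exact weighting function: each aligned exposure is mapped through the inverse camera response function and divided by its exposure time to give a radiance estimate, and the estimates are blended with a hat weighting that favours well-exposed pixels. The linear CRF default and 8-bit normalisation are assumptions.

```python
import numpy as np

def fuse_hdr(images, exposure_times, crf_inv=lambda z: z):
    """Fuse aligned exposures into a radiance map (assumed linear CRF).

    images:         list of equally sized 8-bit frames of the same scene.
    exposure_times: exposure time of each frame, in consistent units.
    """
    num = np.zeros(images[0].shape, dtype=float)
    den = np.zeros_like(num)
    for img, t in zip(images, exposure_times):
        z = img.astype(float) / 255.0
        w = 1.0 - np.abs(2.0 * z - 1.0) + 1e-6   # hat: peak at mid-gray
        num += w * crf_inv(z) / t                 # per-exposure radiance
        den += w
    return num / den
```

The paper additionally folds the inter-image deviation into the weights to suppress ghosting; here the frames are assumed perfectly aligned.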
NASA Astrophysics Data System (ADS)
Pagnutti, Mary; Ryan, Robert E.; Cazenavette, George; Gold, Maxwell; Harlan, Ryan; Leggett, Edward; Pagnutti, James
2017-01-01
A comprehensive radiometric characterization of raw-data format imagery acquired with the Raspberry Pi 3 and V2.1 camera module is presented. The Raspberry Pi is a high-performance single-board computer designed to educate and solve real-world problems. This small computer supports a camera module that uses a Sony IMX219 8 megapixel CMOS sensor. This paper shows that scientific and engineering-grade imagery can be produced with the Raspberry Pi 3 and its V2.1 camera module. Raw imagery is shown to be linear with exposure and gain (ISO), which is essential for scientific and engineering applications. Dark frame, noise, and exposure stability assessments along with flat fielding results, spectral response measurements, and absolute radiometric calibration results are described. This low-cost imaging sensor, when calibrated to produce scientific quality data, can be used in computer vision, biophotonics, remote sensing, astronomy, high dynamic range imaging, and security applications, to name a few.
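The linearity claim above can be verified with a straight-line fit of mean raw DN against exposure time: a coefficient of determination near 1, with an intercept near the dark level, indicates the linear response required for scientific use. This is a generic check, not the authors' exact procedure.

```python
import numpy as np

def linearity_check(exposure_s, mean_dn):
    """Fit mean DN = a*t + b and return slope, intercept, and R^2."""
    t = np.asarray(exposure_s, dtype=float)
    dn = np.asarray(mean_dn, dtype=float)
    a, b = np.polyfit(t, dn, 1)
    pred = a * t + b
    r2 = 1.0 - ((dn - pred) ** 2).sum() / ((dn - dn.mean()) ** 2).sum()
    return a, b, r2
```

The same fit, repeated per ISO setting, would also expose any gain-dependent nonlinearity.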
Evaluation of S190A radiometric exposure test data
NASA Technical Reports Server (NTRS)
Lockwood, H. E.; Goodding, R. A.
1974-01-01
The S190A preflight radiometric exposure test data generated as part of preflight and system test of KM-002 Sequence 29 on flight camera S/N 002 was analyzed. The analysis was to determine camera system transmission using available data which included: (1) films exposed to a calibrated light source subject; (2) filter transmission data; (3) calibrated light source data; (4) density vs. log10 exposure curves for the films; and (5) spectral sensitometric data for the films. The procedure used is outlined, and includes the data and a transmission matrix as a function of field position for nine measured points on each station-film-filter-aperture-shutter speed combination.
TOPDAQ Acquisition Utility Beta version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreno, Mario; Barret, Keith
2010-01-07
The TOPDAQ Acquisition Utility uses 5 digital cameras mounted on a vertical pole, kept vertical using sensors and actuators, to take photographs of an RP-2 or RP-3 module, one camera for each of the four rows and one in the center for driving, when the module is at 0 degrees, i.e., facing the eastern horizon. These photographs, and other data collected at the same time the pictures are taken, are analyzed by the TOPAAP Analysis Utility. The TOPCAT system implemented by the TOPDAQ Acquisition Utility and TOPAAP Analysis Utility programs optimizes the alignment of each RP in a module on a parabolic trough solar collector array (SCA) to maximize the amount of solar energy intercepted by the solar receiver. The camera fixture and related hardware are mounted on a pickup truck and driven between rows in a parabolic trough solar power plant. An ultrasonic distance meter is used to maintain the correct distance between the cameras and the RP module. Along with the two leveling actuators, a third actuator is used to maintain a proper relative vertical position between the cameras and the RP module. The TOPDAQ Acquisition Utility facilitates file management by keeping track of which RP module's data is being taken, and also controls the exposure levels for each camera to maintain a high contrast ratio in the photographs even as the available daylight changes throughout the day. The TOPCAT hardware and software support the current industry-standard RP-2 and RP-3 module geometries.
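The per-camera exposure control described above can be sketched as a simple proportional controller that nudges exposure toward a target mean image level as daylight changes. The target level and controller gain below are assumed values, not TOPDAQ's actual parameters.

```python
def next_exposure(exposure_s, mean_level, target=0.5, gain=0.8):
    """One auto-exposure step: scale exposure toward the target mean level.

    mean_level: mean image brightness normalised to [0, 1].
    """
    if mean_level <= 0.0:
        return exposure_s * 2.0      # blind step up from a black frame
    return exposure_s * (1.0 + gain * (target / mean_level - 1.0))

# converge on a simulated linear scene where mean_level = k * exposure
k, exp = 5.0, 0.01
for _ in range(20):
    exp = next_exposure(exp, min(k * exp, 1.0))
```

With a linear scene response the loop settles where k × exposure equals the target, here exposure = 0.1.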
Modulated CMOS camera for fluorescence lifetime microscopy.
Chen, Hongtao; Holst, Gerhard; Gratton, Enrico
2015-12-01
Widefield frequency-domain fluorescence lifetime imaging microscopy (FD-FLIM) is a fast and accurate method to measure the fluorescence lifetime of entire images. However, the complexity and high cost involved in constructing such a system limit the extensive use of this technique. PCO AG recently released the first luminescence lifetime imaging camera based on a high-frequency modulated CMOS image sensor, QMFLIM2. Here we tested the camera and provide operational procedures to calibrate it and to improve its accuracy using corrections necessary for image analysis. With its flexible input/output options, we are able to use a modulated laser diode or a 20 MHz pulsed white supercontinuum laser as the light source. The output of the camera consists of a stack of modulated images that can be analyzed by the SimFCS software using the phasor approach. The nonuniform system response across the image sensor must be calibrated at the pixel level. This pixel calibration is crucial and is needed for every camera setting, e.g., modulation frequency and exposure time. A significant dependency of the modulation signal on intensity was also observed, and hence an additional calibration is needed for each pixel depending on its intensity level. These corrections are important not only for the fundamental frequency but also for the higher harmonics when using the pulsed supercontinuum laser. With these post-acquisition corrections, the PCO CMOS-FLIM camera can be used for various biomedical applications requiring large-frame, high-speed acquisition. © 2015 Wiley Periodicals, Inc.
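The phasor analysis applied to the modulated image stack can be sketched generically: for K images taken at evenly spaced modulation phases, the first-harmonic cosine and sine projections give per-pixel (g, s) coordinates, whose modulus is the modulation depth and whose angle is the phase shift from which lifetime is derived. This is the textbook formulation, not SimFCS internals.

```python
import numpy as np

def phasor(stack):
    """Per-pixel phasor coordinates from a (K, H, W) phase-stepped stack.

    Returns (g, s); modulation = hypot(g, s), phase = arctan2(-s, g).
    """
    k = stack.shape[0]
    ph = 2.0 * np.pi * np.arange(k) / k          # sampling phases
    dc = stack.mean(axis=0)                      # per-pixel DC level
    g = 2.0 * (stack * np.cos(ph)[:, None, None]).mean(axis=0) / dc
    s = 2.0 * (stack * np.sin(ph)[:, None, None]).mean(axis=0) / dc
    return g, s
```

The per-pixel calibrations described in the abstract would be applied to g and s (or to the raw stack) before lifetimes are read off the phasor plot.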
Solar System Portrait - View of the Sun, Earth and Venus
1996-09-13
This color image of the sun, Earth and Venus was taken by the Voyager 1 spacecraft Feb. 14, 1990, when it was approximately 32 degrees above the plane of the ecliptic and at a slant-range distance of approximately 4 billion miles. It is the first -- and may be the only -- time that we will ever see our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time with two narrow-angle pictures centered on each planet. The wide-angle was taken with the camera's darkest filter (a methane absorption band), and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system but is still eight million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of the brightness is a bright burned out image with multiple reflections from the optics in the camera. The "rays" around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaiced into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce a color image. The violet, green and blue filters were used; exposure times were, for the Earth image, 0.72, 0.48 and 0.72 seconds, and for the Venus frame, 0.36, 0.24 and 0.36, respectively. 
Although the planetary pictures were taken with the narrow-angle camera (1500 mm focal length) and were not pointed directly at the sun, they show the effects of the glare from the nearby sun, in the form of long linear streaks resulting from the scattering of sunlight off parts of the camera and its sun shade. From Voyager's great distance both Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. Detailed analysis also suggests that Voyager detected the moon as well, but it is too faint to be seen without special processing. Venus was only 0.11 pixel in diameter. The faint colored structure in both planetary frames results from sunlight scattered in the optics. http://photojournal.jpl.nasa.gov/catalog/PIA00450
Solar System Portrait - View of the Sun, Earth and Venus
NASA Technical Reports Server (NTRS)
1990-01-01
This color image of the sun, Earth and Venus was taken by the Voyager 1 spacecraft Feb. 14, 1990, when it was approximately 32 degrees above the plane of the ecliptic and at a slant-range distance of approximately 4 billion miles. It is the first -- and may be the only -- time that we will ever see our solar system from such a vantage point. The image is a portion of a wide-angle image containing the sun and the region of space where the Earth and Venus were at the time, with two narrow-angle pictures centered on each planet. The wide-angle image was taken with the camera's darkest filter (a methane absorption band), and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large in the sky as seen from Voyager's perspective at the edge of the solar system but is still eight million times brighter than the brightest star in Earth's sky, Sirius. The image of the sun you see is far larger than the actual dimension of the solar disk. The result of this brightness is a bright, burned-out image with multiple reflections from the optics in the camera. The 'rays' around the sun are a diffraction pattern of the calibration lamp which is mounted in front of the wide angle lens. The two narrow-angle frames containing the images of the Earth and Venus have been digitally mosaicked into the wide-angle image at the appropriate scale. These images were taken through three color filters and recombined to produce a color image. The violet, green and blue filters were used; exposure times were, for the Earth image, 0.72, 0.48 and 0.72 seconds, and for the Venus frame, 0.36, 0.24 and 0.36, respectively. 
Although the planetary pictures were taken with the narrow-angle camera (1500 mm focal length) and were not pointed directly at the sun, they show the effects of the glare from the nearby sun, in the form of long linear streaks resulting from the scattering of sunlight off parts of the camera and its sun shade. From Voyager's great distance both Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. Detailed analysis also suggests that Voyager detected the moon as well, but it is too faint to be seen without special processing. Venus was only 0.11 pixel in diameter. The faint colored structure in both planetary frames results from sunlight scattered in the optics.
Children's everyday exposure to food marketing: an objective analysis using wearable cameras.
Signal, L N; Stanley, J; Smith, M; Barr, M B; Chambers, T J; Zhou, J; Duane, A; Gurrin, C; Smeaton, A F; McKerchar, C; Pearson, A L; Hoek, J; Jenkin, G L S; Ni Mhurchu, C
2017-10-08
Over the past three decades the global prevalence of childhood overweight and obesity has increased by 47%. Marketing of energy-dense nutrient-poor foods and beverages contributes to this worldwide increase. Previous research on food marketing to children largely uses self-report, reporting by parents, or third-party observation of children's environments, with the focus mostly on single settings and/or media. This paper reports on innovative research, Kids'Cam, in which children wore cameras to examine the frequency and nature of everyday exposure to food marketing across multiple media and settings. Kids'Cam was a cross-sectional study of 168 children (mean age 12.6 years, SD = 0.5) in Wellington, New Zealand. Each child wore a wearable camera on four consecutive days, capturing images automatically every seven seconds. Images were manually coded as either recommended (core) or not recommended (non-core) to be marketed to children by setting, marketing medium, and product category. Images in convenience stores and supermarkets were excluded as marketing examples were considered too numerous to count. Rates were calculated using Poisson regression. On average, children were exposed to non-core food marketing 27.3 times a day (95% CI 24.8, 30.1) across all settings. This was more than twice their average exposure to core food marketing (12.3 per day, 95% CI 8.7, 17.4). Most non-core exposures occurred at home (33%), in public spaces (30%) and at school (19%). Food packaging was the predominant marketing medium (74% and 64% for core and non-core foods) followed by signs (21% and 28% for core and non-core). Sugary drinks, fast food, confectionery and snack foods were the most commonly encountered non-core foods marketed. Children in this study were frequently exposed, across multiple settings, to marketing of non-core foods not recommended to be marketed to children. 
The study provides further evidence of the need for urgent action to reduce children's exposure to marketing of unhealthy foods, and suggests the settings and media in which to act. Such action is necessary if the Commission on Ending Childhood Obesity's vision is to be achieved.
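The Poisson rate estimation used above can be illustrated with an intercept-only sketch; this is not the authors' code, and the counts below are made up. For a single overall rate, the Poisson MLE and a Wald interval on the log scale have closed forms:

```python
import math

def poisson_rate_ci(counts, exposure_days, z=1.96):
    """Intercept-only Poisson model: MLE rate = total events / total exposure,
    with a Wald 95% confidence interval computed on the log-rate scale."""
    total = sum(counts)
    days = sum(exposure_days)
    rate = total / days
    se_log = 1 / math.sqrt(total)      # SE of log(rate) for a Poisson total
    lo = rate * math.exp(-z * se_log)
    hi = rate * math.exp(z * se_log)
    return rate, lo, hi
```

A full analysis would instead fit a Poisson regression with per-child covariates and an exposure offset, which reduces to this calculation in the covariate-free case.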
Establishing imaging sensor specifications for digital still cameras
NASA Astrophysics Data System (ADS)
Kriss, Michael A.
2007-02-01
Digital Still Cameras, DSCs, have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor, or the newer Foveon buried-photodiode sensor. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics and the sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.
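Turning an areal full-well specification (electrons per square centimeter) into a per-pixel number, and from there into dynamic range, is a short calculation; a sketch with hypothetical values, not taken from the paper's software:

```python
import math

def full_well_electrons(pixel_pitch_um, capacity_e_per_cm2):
    """Per-pixel full-well capacity from the intrinsic areal capacity
    (electrons per square centimeter) and the pixel pitch in micrometers."""
    area_cm2 = (pixel_pitch_um * 1e-4) ** 2   # 1 um = 1e-4 cm
    return capacity_e_per_cm2 * area_cm2

def dynamic_range_db(full_well_e, read_noise_e):
    """Dynamic range in dB: ratio of full-well signal to the read-noise floor."""
    return 20 * math.log10(full_well_e / read_noise_e)
```

This makes the mega-pixel trade-off explicit: shrinking the pixel pitch to raise the pixel count reduces the per-pixel full well quadratically, and with it the dynamic range.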
NASA Astrophysics Data System (ADS)
Kerr, Andrew D.
Determining optimal imaging settings and best practices related to the capture of aerial imagery using consumer-grade digital single lens reflex (DSLR) cameras should enable remote sensing scientists to generate consistent, high-quality, and low-cost image data sets. Radiometric optimization, image fidelity, and image capture consistency and repeatability were evaluated in the context of detailed image-based change detection. The impetus for this research is, in part, a dearth of relevant, contemporary literature on the utilization of consumer-grade DSLR cameras for remote sensing, and the best practices associated with their use. The main radiometric control settings on a DSLR camera, EV (Exposure Value), WB (White Balance), light metering, ISO, and aperture (f-stop), are variables that were altered and controlled over the course of several image capture missions. These variables were compared for their effects on dynamic range, intra-frame brightness variation, visual acuity, temporal consistency, and the detectability of simulated cracks placed in the images. This testing was conducted from a terrestrial, rather than an airborne, collection platform, due to the large number of images per collection and the desire to minimize inter-image misregistration. The results point to a range of slightly underexposed exposure values as preferable for change detection and noise minimization. The makeup of the scene, the sensor, and the aerial platform influence the selection of the aperture and shutter speed, which, along with other variables, allow estimation of the apparent image motion (AIM) blur in the resulting images. The importance of the image edges in the application will in part dictate the lowest usable f-stop, and allow the user to select a more optimal shutter speed and ISO. 
The single most important camera capture variable is exposure bias (EV), with a full dynamic range, a wide distribution of DN values, and high visual contrast and acuity occurring around -0.7 to -0.3 EV exposure bias. The ideal value for sensor gain was found to be ISO 100, with ISO 200 less desirable. This study offers researchers a better understanding of the effects of camera capture settings on RSI pairs and their influence on image-based change detection.
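The apparent-image-motion estimate referred to above reduces, in the simplest forward-motion case, to the distance travelled over the ground during the shutter interval divided by the ground sample distance. A sketch (variable names are ours, not the paper's):

```python
def motion_blur_pixels(ground_speed_m_s, shutter_s, gsd_m):
    """Apparent image motion during the exposure, in pixels: ground
    distance covered while the shutter is open, divided by the GSD."""
    return ground_speed_m_s * shutter_s / gsd_m
```

For example, a platform moving at 20 m/s with a 1/1000 s shutter and a 2 cm GSD smears each point across about one pixel, which sets a practical ceiling on the shutter speed before edges degrade.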
A digital ISO expansion technique for digital cameras
NASA Astrophysics Data System (ADS)
Yoo, Youngjin; Lee, Kangeui; Choe, Wonhee; Park, SungChan; Lee, Seong-Deok; Kim, Chang-Yong
2010-01-01
Market demand for digital cameras with higher sensitivity under low-light conditions is increasing remarkably, and the digital camera market has become a tough race to provide higher ISO capability. In this paper, we explore an approach for increasing the maximum ISO capability of digital cameras without changing any structure of the image sensor or CFA. Our method is applied directly to the raw Bayer-pattern CFA image to avoid the non-linearity and noise amplification that are usually aggravated after the ISP (Image Signal Processor) of digital cameras. The proposed method fuses multiple short-exposure images which are noisy, but less blurred. Our approach is designed to avoid the ghost artifact caused by hand-shake and object motion. In order to achieve the desired ISO image quality, both the low-frequency chromatic noise and the fine-grain noise that usually appear in high-ISO images are removed, and we then modify the different layers created by a two-scale non-linear decomposition of the image. Once our approach is performed on an input Bayer-pattern CFA image, the resultant Bayer image is further processed by the ISP to obtain a fully processed RGB image. The performance of our proposed approach is evaluated by comparing SNR (Signal to Noise Ratio), MTF50 (Modulation Transfer Function), color error ΔE*ab and visual quality with reference images whose exposure times are properly extended to a variety of target sensitivities.
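The two-scale decomposition step can be sketched as a blur-based base/detail split; a plain NumPy box blur stands in here for whatever non-linear, edge-preserving filter the authors actually used, so treat this as an illustration of the layer structure only:

```python
import numpy as np

def two_scale(img, radius=2):
    """Split an image into a low-frequency base layer (box blur) and a
    high-frequency detail layer. Noise treatment can then differ per
    layer: chromatic noise in the base, fine-grain noise in the detail."""
    k = 2 * radius + 1
    pad = np.pad(img, radius, mode='edge')
    base = np.zeros(img.shape, dtype=float)
    for dy in range(k):                       # accumulate the k x k window
        for dx in range(k):
            base += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    base /= k * k
    detail = img - base
    return base, detail
```

Because the split is additive, the image is recovered exactly as base + detail, so any per-layer denoising leaves untouched frequencies intact.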
DC drive system for cine/pulse cameras
NASA Technical Reports Server (NTRS)
Gerlach, R. H.; Sharpsteen, J. T.; Solheim, C. D.; Stoap, L. J.
1977-01-01
Camera-drive functions are separated mechanically into two groups which are driven by two separate dc brushless motors. The first motor, a 90-deg stepper, drives the rotating shutter; the second, electronically commutated, motor drives the claw and film transport. The shutter is made of one piece but has two openings, for slow and fast exposures.
Harry E. Brown
1962-01-01
The canopy camera is a device of new design that takes wide-angle, overhead photographs of vegetation canopies, cloud cover, topographic horizons, and similar subjects. Since the entire hemisphere is photographed in a single exposure, the resulting photograph is circular, with the horizon forming the perimeter and the zenith the center. Photographs of this type provide...
Opto-mechanical system design of test system for near-infrared and visible target
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Zhu, Guodong; Wang, Yuchao
2014-12-01
Guidance precision is one of the key indexes of guided-weapon shooting. The factors affecting guidance precision include information processing precision, control system accuracy, laser irradiation accuracy and so on; laser irradiation precision is an important factor. Aimed at the demand for precision testing of laser irradiators, this paper develops a laser precision test system. The system consists of a modified Cassegrain system, a wide-range CCD camera, a tracking turntable and an industrial PC, and images visible-light and near-infrared targets simultaneously with a near-IR camera. Analysis of the design results shows that, when exposing a target at 1000 meters, the system measurement precision is 43 mm, fully meeting the needs of laser precision testing.
Precision of FLEET Velocimetry Using High-speed CMOS Camera Systems
NASA Technical Reports Server (NTRS)
Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.
2015-01-01
Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems, which had low signal-to-noise ratios. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 μs, precisions of 0.5 m/s in air and 0.2 m/s in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision High Speed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
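The two operations named above, row-wise digital binning and precision as the standard deviation of repeated single-shot velocities, are simple to state in code. A sketch (function names are ours):

```python
import numpy as np

def bin_rows(frame, factor=8):
    """Sum groups of `factor` adjacent rows, the digital analogue of
    on-sensor binning done in post-processing, to boost signal-to-noise."""
    h = (frame.shape[0] // factor) * factor   # drop any remainder rows
    return frame[:h].reshape(-1, factor, frame.shape[1]).sum(axis=1)

def precision(velocities):
    """Measurement precision: sample standard deviation of a set of
    repeated single-shot velocity measurements."""
    return np.std(velocities, ddof=1)
```

Summing 8 rows grows the signal 8-fold while uncorrelated read noise grows only by sqrt(8), which is why binning helped most on the un-intensified, low-SNR systems.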
Hubble Space Telescope, Faint Object Camera
NASA Technical Reports Server (NTRS)
1981-01-01
This drawing illustrates Hubble Space Telescope's (HST's), Faint Object Camera (FOC). The FOC reflects light down one of two optical pathways. The light enters a detector after passing through filters or through devices that can block out light from bright objects. Light from bright objects is blocked out to enable the FOC to see background images. The detector intensifies the image, then records it much like a television camera. For faint objects, images can be built up over long exposure times. The total image is translated into digital data, transmitted to Earth, and then reconstructed. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from a low-Earth orbit. By placing the telescope in space, astronomers are able to collect data that is free of the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors.
A New Approach for Alpha Radiography by Triple THGEM using Monte Carlo Simulation and Measurement
NASA Astrophysics Data System (ADS)
Khezripour, S.; Negarestani, A.; Rezaie, M. R.
2018-05-01
In this research, alpha imaging in Self-Quenching Streamer (SQS) mode is investigated using a triple Thick Gas Electron Multiplier (THGEM) detector by the Monte Carlo method and experimental data. First, a semi-empirical equation is derived to represent the relation between the SQS voltage and the alpha energy in every hole of the triple THGEM. The accuracy of this equation is tested and confirmed by a high degree of consistency. Secondly, images of objects irradiated by an Am-241 alpha source (5.49 MeV) are recorded by a CMOS camera using the triple THGEM detector in SQS mode. The resolution of the images is a function of the exposure time. For an alpha source with 150 kBq activity, the optimal exposure interval is about 30 sec; for exposure times shorter or longer than 30 sec, the images are incomplete or ambiguous, respectively. The overall objective of this work is to facilitate alpha radiography in nuclear imaging through a triple THGEM without any amplifier or complicated electrical equipment.
The Citizen CATE Experiment: Techniques to Determine Totality Coverage and Clouded Data Removal.
NASA Astrophysics Data System (ADS)
McKay, Myles A.; Ursache, Andrei; Penn, Matthew; Citizen CATE Experiment 2017 Team
2018-01-01
On August 21, 2017, the Citizen Continental-America Telescopic Eclipse (CATE) Experiment observed the total solar eclipse using a network of 68 identical telescope and camera systems along the path of totality. Over 90% of all sites collected totality data on the day of the eclipse. Since the volunteers had to remove the solar filter manually, there is uncertainty in matching the acquired data to the time of totality. Some sites also experienced cloudy weather which obscured the eclipse in some of the exposures, but small breaks in the clouds during the observation still yielded clear totality data. Before we can process and analyze the eclipse data, we must carefully determine which frames cover the time of totality for each site and remove exposures with clouds blocking the FOV. In this poster, we discuss the techniques used to determine the extent of totality at each location using the logged GPS data, and the removal of totality exposures with clouds.
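At its core, selecting the frames within totality from the logged timestamps is an interval test against the second and third contact times for each site. A simplified sketch (in practice the contact times would come from the site's GPS position and an eclipse ephemeris, and cloud rejection adds a per-frame quality test):

```python
def totality_frames(frame_times, c2, c3):
    """Indices of exposures whose logged timestamps fall within second
    and third contact, i.e. inside the interval of totality."""
    return [i for i, t in enumerate(frame_times) if c2 <= t <= c3]
```

Frames passing this test would then be screened for cloud obscuration before entering the combined coronal data set.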
Soft X-ray and XUV imaging with a charge-coupled device /CCD/-based detector
NASA Technical Reports Server (NTRS)
Loter, N. G.; Burstein, P.; Krieger, A.; Ross, D.; Harrison, D.; Michels, D. J.
1981-01-01
A soft X-ray/XUV imaging camera which uses a thinned, back-illuminated, all-buried channel RCA CCD for radiation sensing has been built and tested. The camera is a slow-scan device which makes possible frame integration if necessary. The detection characteristics of the device have been tested over the 15-1500 eV range. The response was linear with exposure up to 0.2-0.4 erg/sq cm; saturation occurred at greater exposures. Attention is given to attempts to resolve single photons with energies of 1.5 keV.
Using focused plenoptic cameras for rich image capture.
Georgiev, T; Lumsdaine, A; Chunev, G
2011-01-01
This approach uses a focused plenoptic camera to capture the plenoptic function's rich "non 3D" structure. It employs two techniques. The first simultaneously captures multiple exposures (or other aspects) based on a microlens array having an interleaved set of different filters. The second places multiple filters at the main lens aperture.
Gamma Ray Burst Optical Counterpart Search Experiment (GROCSE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, H.S.; Ables, E.; Bionta, R.M.
GROCSE (Gamma-Ray Optical Counterpart Search Experiment) is a system of automated telescopes that search for simultaneous optical activity associated with gamma-ray bursts in response to real-time burst notifications provided by the BATSE/BACODINE network. The first-generation system, GROCSE 1, is sensitive down to Mv ~ 8.5 and requires an average of 12 seconds to obtain the first images of the gamma-ray burst error box defined by the BACODINE trigger. The collaboration is now constructing a second-generation system which has a 4-second slewing time and can reach Mv ~ 14 with a 5-second exposure. GROCSE 2 consists of 4 cameras on a single mount. Each camera views the night sky through a commercial Canon lens (f/1.8, focal length 200 mm) and utilizes a 2K x 2K Loral CCD. Lightweight, low-noise custom readout electronics were designed and fabricated for these CCDs. The total field of view of the 4 cameras is 17.6 x 17.6°. GROCSE 2 will be operational by the end of 1995. In this paper, the authors present an overview of the GROCSE system and the results of measurements with a GROCSE 2 prototype unit.
NASA Astrophysics Data System (ADS)
Skaloud, J.; Rehak, M.; Lichti, D.
2014-03-01
This study highlights the benefit of precise aerial position control in the context of mapping using frame-based imagery taken by small UAVs. We execute several flights with a custom Micro Aerial Vehicle (MAV) octocopter over a small calibration field equipped with 90 signalized targets and 25 ground control points. The octocopter carries a consumer-grade RGB camera, modified to ensure precise GPS time stamping of each exposure, as well as a multi-frequency/constellation GNSS receiver. The GNSS antenna and camera are rigidly mounted together on a one-axis gimbal that allows control of the obliquity of the captured imagery. The presented experiments focus on including absolute and relative aerial control. We confirm practically that both approaches are very effective: absolute control allows omission of ground control points, while relative control requires only a minimum number of control points. Indeed, the latter method represents an attractive alternative in the context of MAVs for two reasons. First, the procedure is somewhat simplified (e.g. the lever-arm between the camera perspective center and the antenna phase center does not need to be determined) and, second, its principle allows employing a single-frequency antenna and carrier-phase GNSS receiver. This reduces the cost of the system as well as the payload, which in turn increases the flying time.
NASA Astrophysics Data System (ADS)
Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.
2014-07-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-α and to detect the Hanle effect in the line core. Due to the nature of Lyman-α polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 ± 0.5 and ≤ 1.0% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera: dark current, read noise, camera gain and residual non-linearity.
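The rotating-waveplate modulation described above can be demodulated per pixel by projecting the exposure sequence onto the 4θ harmonics. A sketch assuming the ideal single-beam modulation curve; the real CLASP pipeline involves the dual-beam combination and the calibrations discussed, and the names here are ours:

```python
import numpy as np

def demodulate(intensities, angles):
    """Recover fractional polarization q = Q/I and u = U/I from exposures
    synchronized to a rotating waveplate, assuming the ideal modulation
    I(theta) = (I0/2) * (1 + q*cos(4*theta) + u*sin(4*theta)) and angles
    sampled uniformly over full rotations."""
    i = np.asarray(intensities, dtype=float)
    th = np.asarray(angles, dtype=float)
    q = 2 * np.sum(i * np.cos(4 * th)) / np.sum(i)
    u = 2 * np.sum(i * np.sin(4 * th)) / np.sum(i)
    return q, u
```

Reaching 0.1% sensitivity in q and u is then a photon-statistics problem, which is why the frame-transfer duty-cycle maximization matters.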
Robotic Arm Camera on Mars with Lights On
NASA Technical Reports Server (NTRS)
2008-01-01
This image is a composite view of NASA's Phoenix Mars Lander's Robotic Arm Camera (RAC) with its lights on, as seen by the lander's Surface Stereo Imager (SSI). This image combines images taken on the afternoon of Phoenix's 116th Martian day, or sol (September 22, 2008). The RAC is about 8 centimeters (3 inches) tall. The SSI took images of the RAC to test both the light-emitting diodes (LEDs) and cover function. Individual images were taken in three SSI filters that correspond to the red, green, and blue LEDs one at a time. When combined, it appears that all three sets of LEDs are on at the same time. This composite image is not true color. The streaks of color extending from the LEDs are an artifact from saturated exposure. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
Phenology cameras observing boreal ecosystems of Finland
NASA Astrophysics Data System (ADS)
Peltoniemi, Mikko; Böttcher, Kristin; Aurela, Mika; Kolari, Pasi; Tanis, Cemal Melih; Linkosalmi, Maiju; Loehr, John; Metsämäki, Sari; Nadir Arslan, Ali
2016-04-01
Cameras have become useful tools for monitoring the seasonality of ecosystems. Low-cost cameras facilitate validation of other measurements and allow extracting key ecological features and moments from image time series. We installed a network of phenology cameras at selected ecosystem research sites in Finland. Cameras were installed above, at, and/or below the level of the canopies. The current network hosts cameras taking time-lapse images in coniferous and deciduous forests as well as at open wetlands, thus offering possibilities to monitor various phenological and time-associated events and elements. In this poster, we present our camera network and give examples of the use of image series for research. We will show results on the stability of camera-derived color signals and, based on these, discuss the applicability of cameras for monitoring time-dependent phenomena. We will also present results from comparisons between camera-derived color signal time series and daily satellite-derived time series (NDVI, NDWI, and fractional snow cover) from the Moderate Resolution Imaging Spectroradiometer (MODIS) at selected spruce and pine forests and in a wetland. We will discuss the applicability of cameras in supporting phenological observations derived from satellites, considering the possibility for cameras to monitor both above- and below-canopy phenology and snow.
Photon collider: a four-channel autoguider solution
NASA Astrophysics Data System (ADS)
Hygelund, John C.; Haynes, Rachel; Burleson, Ben; Fulton, Benjamin J.
2010-07-01
The "Photon Collider" uses a compact array of four off-axis autoguider cameras positioned with independent filtering and focus. The photon collider is two-way symmetric and robustly mounted, with the off-axis light crossing the science field, which allows the compact single-frame construction to have extremely small relative deflections between the guide and science CCDs. The photon collider provides four independent guiding signals with a total of 15 square arcminutes of sky coverage. These signals allow for simultaneous altitude, azimuth, field-rotation and focus guiding. Guide cameras read out without exposure overhead, increasing the tracking cadence. The independent focus allows the photon collider to maintain in-focus guide stars when the main science camera is taking defocused exposures, as well as to track telescope focus changes. Independent filters allow autoguiding in the science camera's wavelength bandpass. The four cameras are controlled through a custom web-services interface from a single Linux-based industrial PC, and the autoguider mechanism and telemetry are built around a uCLinux-based Analog Devices Blackfin embedded microprocessor. Off-axis light is corrected with a custom meniscus correcting lens. Guide CCDs are cooled with ethylene glycol, with an advanced leak-detection system. The photon collider was built for use on Las Cumbres Observatory's 2-meter Faulkes telescopes and is currently used for guiding on the alt-az mount.
VizieR Online Data Catalog: LY And photometric followup (Lu+, 2017)
NASA Astrophysics Data System (ADS)
Lu, H.-P.; Zhang, L.-Y.; Han, X. L.; Pi, Q.-F.; Wang, D.-M.
2017-04-01
We obtained our first photometric data set in R and I bands for LY And on November 24, 2014 using the 1-m RCC reflecting telescope at Yunnan Observatory, which was equipped with an Andor DW436 2048x2048 CCD camera with a field of view of 7.3'x7.3'. The exposure times were 300s for both R and I bands. We obtained our second photometric data set in B, V, R and I bands using the SARA 914-mm telescope at Kitt Peak National Observatory on October 23, 2015. This telescope was equipped with a 2048x2048 pixels CCD and each pixel after 2x2 binning is about 0.86". The exposure times were 120s in B band and 60 s in V, R and I bands, respectively. (3 data files).
Photobleaching of red fluorescence in oral biofilms.
Hope, C K; de Josselin de Jong, E; Field, M R T; Valappil, S P; Higham, S M
2011-04-01
Many species of oral bacteria can be induced to fluoresce due to the presence of endogenous porphyrins, a phenomenon that can be utilized to visualize and quantify dental plaque in the laboratory or clinical setting. However, an inevitable consequence of fluorescence is photobleaching, and the effects of this on longitudinal, quantitative analysis of dental plaque have yet to be ascertained. Filter membrane biofilms were grown from salivary inocula or single species (Prevotella nigrescens and Prevotella intermedia). The mature biofilms were then examined in a custom-made lighting rig comprising 405 nm light-emitting diodes capable of delivering 220 W/m(2) at the sample, an appropriate filter and a digital camera; a set-up analogous to quantitative light-induced fluorescence digital. Longitudinal sets of images were captured and processed to assess the degradation in red fluorescence over time. Photobleaching was observed in all instances. The highest rates of photobleaching were observed immediately after initiation of illumination, specifically during the first minute. Relative rates of photobleaching during the first minute of exposure were 19.17, 13.72 and 3.43 arbitrary units/min for P. nigrescens biofilms, microcosm biofilm and P. intermedia biofilms, respectively. Photobleaching could be problematic when making quantitative measurements of porphyrin fluorescence in situ. Reducing both light levels and exposure time, in combination with increased camera sensitivity, should be the default approach when undertaking analyses by quantitative light-induced fluorescence digital. © 2010 John Wiley & Sons A/S.
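The first-minute photobleaching rates quoted above amount to the fluorescence lost over the first minute of illumination divided by the elapsed time. A sketch with made-up numbers (a full analysis might instead fit an exponential decay to the whole series):

```python
def initial_bleach_rate(times_min, fluorescence):
    """Initial photobleaching rate (arbitrary units per minute): loss of
    red fluorescence over roughly the first minute of illumination."""
    f0 = fluorescence[0]
    # first sample at or after one minute of exposure
    i = next(k for k, t in enumerate(times_min) if t >= 1.0)
    return (f0 - fluorescence[i]) / (times_min[i] - times_min[0])
```

Because the rate is steepest at the start, quantitative comparisons between longitudinal QLF images should either minimize illumination before the first capture or correct for this early loss.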
COBRA ATD multispectral camera response model
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response from five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves which were generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data fitting techniques were applied to these measured response curves to obtain nonlinear expressions which estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA camera response model proved to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. This practical camera model was subsequently incorporated into the COBRA sensor performance evaluation and computational tools for research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
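The abstract describes fitting nonlinear, analytically invertible expressions to measured response curves. A minimal sketch of that idea, assuming a hypothetical power-law response (the COBRA model's actual functional form is not given here) and synthetic data in place of the CSS laboratory curves:

```python
import numpy as np
from scipy.optimize import curve_fit

def response(x, a, gamma, dn_dark):
    """Hypothetical power-law response: DN = a * (E*t*G)^gamma + dark offset,
    where x is the product irradiance * exposure * intensifier gain."""
    return a * np.power(x, gamma) + dn_dark

# Synthetic "measured" response curve standing in for the laboratory data
x = np.linspace(0.1, 10.0, 50)          # E*t*G (arbitrary units)
dn = response(x, 80.0, 0.7, 12.0)       # digitized output

popt, _ = curve_fit(response, x, dn, p0=(50.0, 1.0, 0.0))

def invert(dn_val, a, gamma, dn_dark):
    """Analytic inversion: recover E*t*G from a digitized output value."""
    return ((dn_val - dn_dark) / a) ** (1.0 / gamma)

# Round-trip check: forward model then inversion recovers the input
etg = invert(response(3.0, *popt), *popt)
print(f"recovered E*t*G = {etg:.6f}")
```

The closed-form inverse is what makes such a fitted model convenient inside a sensor-performance toolbox: predicted scene irradiance can be recovered directly from camera output.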
NASA Astrophysics Data System (ADS)
Lagrosas, N.; Gacal, G. F. B.; Kuze, H.
2017-12-01
Detection of nighttime cloud from Himawari-8 is implemented using the difference of digital numbers from bands 13 (10.4 µm) and 7 (3.9 µm). A digital-number difference of -1.39x10^4 can be used as a threshold to separate clouds from clear-sky conditions. For ground-based observations over Chiba, a digital camera (Canon PowerShot A2300) is used to take images of the sky every 5 minutes at an exposure time of 5 s at the Center for Environmental Remote Sensing, Chiba University. From these images, cloud cover values are obtained using a threshold algorithm (Gacal et al., 2016). Ten-minute nighttime cloud cover values from the two datasets are compared and analyzed from 29 May to 05 June 2017 (20:00-03:00 JST). When compared with lidar data, the camera can detect thick high-level clouds up to 10 km. The results show that during clear-sky conditions (02-03 June), both camera and satellite cloud cover values show 0% cloud cover. During cloudy conditions (05-06 June), the camera shows almost 100% cloud cover while satellite cloud cover values range from 60 to 100%. These low values can be attributed to the presence of low-level thin clouds (about 2 km above the ground) as observed from the National Institute for Environmental Studies lidar located inside Chiba University. This difference in cloud cover values shows that the camera can produce accurate cloud cover values for low-level clouds that are sometimes not detected by satellites. The opposite occurs when high-level clouds are present (01-02 June). Derived satellite cloud cover shows almost 100% during the whole night while the ground-based camera shows cloud cover values that range from 10 to 100% during the same time interval. The fluctuating values can be attributed to the presence of thin clouds located at around 6 km from the ground and the presence of low-level clouds (about 1 km).
Since the camera relies on reflected city lights, it is possible that high-level thin clouds are not observed by the camera but are observed by the satellite. This situation can also involve layers of clouds that are not observed by either instrument. The results of this study show that each instrument can be used to correct the other to provide better cloud cover values. These corrections are dependent on the height and thickness of the clouds. No correction is necessary when the sky is clear.
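The satellite side of the comparison above reduces to thresholding a band difference. A minimal sketch of that classification, where the sign convention of the difference and the toy digital numbers are assumptions for illustration, not values from the study:

```python
import numpy as np

THRESHOLD = -1.39e4   # band 13 - band 7 digital-number threshold (from the study)

def cloud_mask(band13, band7):
    """Flag pixels as cloudy where the band 13 - band 7 digital-number
    difference falls below the threshold (sign convention assumed here)."""
    return (band13 - band7) < THRESHOLD

def cloud_cover_percent(band13, band7):
    """Fraction of cloudy pixels in the scene, as a percentage."""
    return 100.0 * cloud_mask(band13, band7).mean()

# Toy 3x3 scene: two pixels have a strongly negative band difference
b13 = np.array([[1e3, 2e3, 3e3], [1e3, 2e3, 3e3], [1e3, 2e3, 3e3]])
b7  = np.array([[2e4, 2e3, 3e3], [2e4, 2e3, 3e3], [1e3, 2e3, 3e3]])
print(f"cloud cover: {cloud_cover_percent(b13, b7):.1f}%")
```

The ten-minute satellite cloud cover values compared against the camera would be this percentage evaluated over the field of view for each time step.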
Seasonal Effect on Ocular Sun Exposure and Conjunctival UV Autofluorescence.
Haworth, Kristina M; Chandler, Heather L
2017-02-01
To evaluate feasibility and repeatability of measures for ocular sun exposure and conjunctival ultraviolet autofluorescence (UVAF), and to test for relationships between the outcomes. Fifty volunteers were seen for two visits 14 ± 2 days apart. Ocular sun exposure was estimated over a 2-week time period using questionnaires that quantified time outdoors and ocular protection habits. Conjunctival UVAF was imaged using a Nikon D7000 camera system equipped with appropriate flash and filter system; image analysis was done using ImageJ software. Repeatability estimates were made using Bland-Altman plots with mean differences and 95% limits of agreement calculated. Non-normally distributed data was transformed by either log10 or square root methods. Linear regression was conducted to evaluate relationships between measures. Mean (±SD) values for ocular sun exposure and conjunctival UVAF were 8.86 (±11.97) hours and 9.15 (±9.47) mm2, respectively. Repeatability was found to be acceptable for both ocular sun exposure and conjunctival UVAF. Univariate linear regression showed outdoor occupation to be a predictor of higher ocular sun exposure; outdoor occupation and winter season of collection both predicted higher total UVAF. Furthermore, increased portion of day spent outdoors while working was associated with increased total conjunctival UVAF. We demonstrate feasibility and repeatability of estimating ocular sun exposure using a previously unreported method and for conjunctival UVAF in a group of subjects residing in Ohio. Seasonal temperature variation may have influenced time outdoors and ultimately calculation of ocular sun exposure. As winter season of collection and outdoor occupation both predicted higher total UVAF, our data suggests that ocular sun exposure is associated with conjunctival UVAF and, possibly, that UVAF remains for at least several months after sun exposure.
Seasonal Effect on Ocular Sun Exposure and Conjunctival UV Autofluorescence
Haworth, Kristina M.; Chandler, Heather L.
2016-01-01
Purpose To evaluate feasibility and repeatability of measures for ocular sun exposure and conjunctival ultraviolet autofluorescence (UVAF), and to test for relationships between the outcomes. Methods Fifty volunteers were seen for 2 visits 14±2 days apart. Ocular sun exposure was estimated over a two-week time period using questionnaires that quantified time outdoors and ocular protection habits. Conjunctival UVAF was imaged using a Nikon D7000 camera system equipped with appropriate flash and filter system; image analysis was done using ImageJ software. Repeatability estimates were made using Bland-Altman plots with mean differences and 95% limits of agreement calculated. Non-normally distributed data was transformed by either log10 or square root methods. Linear regression was conducted to evaluate relationships between measures. Results Mean (±SD) values for ocular sun exposure and conjunctival UVAF were 8.86 (±11.97) hours and 9.15 (±9.47) mm2, respectively. Repeatability was found to be acceptable for both ocular sun exposure and conjunctival UVAF. Univariate linear regression showed outdoor occupation to be a predictor of higher ocular sun exposure; outdoor occupation and winter season of collection both predicted higher total UVAF. Furthermore, increased portion of day spent outdoors while working was associated with increased total conjunctival UVAF. Conclusions We demonstrate feasibility and repeatability of estimating ocular sun exposure using a previously unreported method and for conjunctival UVAF in a group of subjects residing in Ohio. Seasonal temperature variation may have influenced time outdoors and ultimately calculation of ocular sun exposure. As winter season of collection and outdoor occupation both predicted higher total UVAF, our data suggests that ocular sun exposure is associated with conjunctival UVAF and possibly, that UVAF remains for at least several months following sun exposure. PMID:27820717
2003-03-07
File name: DSC_0749.JPG; File size: 1.1 MB (1,174,690 bytes); Date taken: 2003/03/07 13:51:29; Image size: 2000 x 1312; Resolution: 300 x 300 dpi; Number of bits: 8 bit/channel; Protection attribute: Off; Hide attribute: Off; Camera ID: N/A; Camera: NIKON D1H; Quality mode: FINE; Metering mode: Matrix; Exposure mode: Shutter priority; Speed light: No; Focal length: 20 mm; Shutter speed: 1/500 second; Aperture: F11.0; Exposure compensation: 0 EV; White balance: Auto; Lens: 20 mm F2.8; Flash sync mode: N/A; Exposure difference: 0.0 EV; Flexible program: No; Sensitivity: ISO 200; Sharpening: Normal; Image type: Color; Color mode: Mode II (Adobe RGB); Hue adjustment: 3; Saturation control: N/A; Tone compensation: Normal; Latitude (GPS): N/A; Longitude (GPS): N/A; Altitude (GPS): N/A
2002-02-19
File name: DSC_0028.JPG; File size: 2.8 MB (2,950,833 bytes); Date taken: 2002/02/19 09:49:01; Image size: 3008 x 2000; Resolution: 300 x 300 dpi; Number of bits: 8 bit/channel; Protection attribute: Off; Hide attribute: Off; Camera ID: N/A; Camera: NIKON D100; Quality mode: N/A; Metering mode: Matrix; Exposure mode: Shutter priority; Speed light: Yes; Focal length: 24 mm; Shutter speed: 1/60 second; Aperture: F3.5; Exposure compensation: 0 EV; White balance: N/A; Lens: N/A; Flash sync mode: N/A; Exposure difference: N/A; Flexible program: N/A; Sensitivity: N/A; Sharpening: N/A; Image type: Color; Color mode: N/A; Hue adjustment: N/A; Saturation control: N/A; Tone compensation: N/A; Latitude (GPS): N/A; Longitude (GPS): N/A; Altitude (GPS): N/A
2002-02-24
File name: DSC_0047.JPG; File size: 2.8 MB (2,931,574 bytes); Date taken: 2002/02/24 10:06:57; Image size: 3008 x 2000; Resolution: 300 x 300 dpi; Number of bits: 8 bit/channel; Protection attribute: Off; Hide attribute: Off; Camera ID: N/A; Camera: NIKON D100; Quality mode: N/A; Metering mode: Matrix; Exposure mode: Shutter priority; Speed light: Yes; Focal length: 24 mm; Shutter speed: 1/180 second; Aperture: F20.0; Exposure compensation: +0.3 EV; White balance: N/A; Lens: N/A; Flash sync mode: N/A; Exposure difference: N/A; Flexible program: N/A; Sensitivity: N/A; Sharpening: N/A; Image type: Color; Color mode: N/A; Hue adjustment: N/A; Saturation control: N/A; Tone compensation: N/A; Latitude (GPS): N/A; Longitude (GPS): N/A; Altitude (GPS): N/A
Particle image velocimetry based on wavelength division multiplexing
NASA Astrophysics Data System (ADS)
Tang, Chunxiao; Li, Enbang; Li, Hongqiang
2018-01-01
This paper introduces a technical approach of wavelength division multiplexing (WDM) based particle image velocimetry (PIV). It is designed to measure transient flows with different scales of velocity by capturing multiple particle images in one exposure. These images are separated by different wavelengths, and thus the pulse separation time is not limited by the frame rate of the camera. A triple-pulsed PIV system has been created in order to prove the feasibility of WDM-PIV. This is demonstrated in a sieve plate extraction column model by simultaneously measuring the fast flow in the downcomer and the slow vortices inside the plates. A simple displacement/velocity field combination method has also been developed. The main constraint on WDM-PIV is the limited choice of wavelengths offered by available light sources and cameras. The use of the WDM technique represents a feasible way to realize multiple-pulsed PIV.
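Once the wavelength channels are separated, displacements are recovered by cross-correlating successive exposures. A minimal FFT cross-correlation sketch on synthetic particle images, using a single whole-frame window (a real PIV code would use small interrogation windows and sub-pixel peak fitting):

```python
import numpy as np

def displacement(frame_a, frame_b):
    """Estimate the integer-pixel shift of frame_a relative to frame_b by
    locating the peak of their FFT-based cross-correlation."""
    fa = np.fft.fft2(frame_a - frame_a.mean())
    fb = np.fft.fft2(frame_b - frame_b.mean())
    corr = np.real(np.fft.ifft2(fa * np.conj(fb)))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the window into negative displacements
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Toy particle field and a copy shifted by (3, 5) pixels, standing in for two
# colour channels (two pulse instants) extracted from a single exposure
rng = np.random.default_rng(1)
red = rng.random((64, 64))
green = np.roll(red, (3, 5), axis=(0, 1))
print(displacement(green, red))  # recovers the (row, column) shift
```

With three colours, the same correlation is applied to the red-green and green-blue pairs, giving two velocity estimates at different pulse separations within one camera frame.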
Photogrammetry and altimetry. Part A: Apollo 16 laser altimeter
NASA Technical Reports Server (NTRS)
Wollenhaupt, W. R.; Sjogren, W. L.
1972-01-01
The laser altimeter measures precise altitudes of the command and service module above the lunar surface and can function either with the metric (mapping) camera or independently. In the camera mode, the laser altimeter ranges at each exposure time, which varies between 20 and 28 sec (i.e., 30 to 43 km on the lunar surface). In the independent mode, the laser altimeter ranges every 20 sec. These altitude data and the spacecraft attitudes that are derived from simultaneous stellar photography are used to constrain the photogrammetric reduction of the lunar surface photographs when cartographic products are generated. In addition, the altimeter measurements alone provide broad-scale topographic relief around the entire circumference of the moon. These data are useful in investigating the selenodetic figure of the moon and may provide information regarding gravitational anomalies on the lunar far side.
QuadCam - A Quadruple Polarimetric Camera for Space Situational Awareness
NASA Astrophysics Data System (ADS)
Skuljan, J.
A specialised quadruple polarimetric camera for space situational awareness, QuadCam, has been built at the Defence Technology Agency (DTA), New Zealand, as part of a collaboration with the Defence Science and Technology Laboratory (Dstl), United Kingdom. The design was based on a similar system originally developed at Dstl, with some significant modifications for improved performance. The system is made up of four identical CCD cameras looking in the same direction, but each in a different plane of polarisation at 0, 45, 90 and 135 degrees with respect to the reference plane. A standard set of Stokes parameters can be derived from the four images in order to describe the state of polarisation of an object captured in the field of view. The modified design of the DTA QuadCam makes use of four small Raspberry Pi computers, so that each camera is controlled by its own computer in order to speed up the readout process and ensure that the four individual frames are taken simultaneously (to within 100-200 microseconds). In addition, new firmware was requested from the camera manufacturer so that an output signal is generated to indicate the state of the camera shutter. A specialised GPS unit (also developed at DTA) is then used to monitor the shutter signals from the four cameras and record the actual time of exposure to an accuracy of about 100 microseconds. This makes the system well suited for the observation of fast-moving objects in low Earth orbit (LEO). The QuadCam is currently mounted on a Paramount MEII robotic telescope mount at the newly built DTA space situational awareness observatory located on Whangaparaoa Peninsula near Auckland, New Zealand. The system will be used for tracking satellites in low Earth orbit as well as in the geostationary belt. The performance of the camera has been evaluated and a series of test images has been collected in order to derive the polarimetric signatures for selected satellites.
Plenoptic camera image simulation for reconstruction algorithm verification
NASA Astrophysics Data System (ADS)
Schwiegerling, Jim
2014-09-01
Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. The first has the camera image focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The second plenoptic form has the lenslet array relaying the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to yield a color and exposure level for that pixel. To speed processing, three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.
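A minimal sketch of the backward-raytracing idea described above, assuming a unit-magnification thin lens and a single hypothetical scene plane (the authors' package additionally models the lenslet array and multiple depth planes):

```python
import numpy as np

def scene_color(x, y, z):
    """Hypothetical scene: a bright 10x10 square on the plane z = 100, dark elsewhere."""
    if z == 100.0 and abs(x) < 5.0 and abs(y) < 5.0:
        return 1.0
    return 0.0

def render_pixel(px, py, planes, pupil_radius=2.0, focal_z=100.0, n_rays=256, seed=0):
    """Trace rays backwards from a sensor pixel: sample the entrance pupil,
    send each ray toward the pixel's conjugate point on the focal plane, and
    average the scene colors at the first plane that returns light."""
    rng = np.random.default_rng(seed)
    cx, cy = -px, -py   # pixel conjugate under a unit-magnification thin lens (assumption)
    total = 0.0
    for _ in range(n_rays):
        # uniform random point on the entrance pupil disc
        r = pupil_radius * np.sqrt(rng.random())
        th = 2.0 * np.pi * rng.random()
        ax, ay = r * np.cos(th), r * np.sin(th)
        for z in planes:
            t = z / focal_z
            x = ax + (cx - ax) * t   # ray position where it crosses depth z
            y = ay + (cy - ay) * t
            c = scene_color(x, y, z)
            if c > 0.0:              # nearest emitting plane wins
                total += c
                break
    return total / n_rays

print(render_pixel(0.0, 0.0, planes=[100.0]))   # pixel conjugate lands on the square
print(render_pixel(20.0, 0.0, planes=[100.0]))  # conjugate misses the square
```

In-focus points receive the full ray cone; planes away from `focal_z` blur, since the cone's cross-section there spans a finite scene area.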
The suitability of lightfield camera depth maps for coordinate measurement applications
NASA Astrophysics Data System (ADS)
Rangappa, Shreedhar; Tailor, Mitul; Petzing, Jon; Kinnell, Peter; Jackson, Michael
2015-12-01
Plenoptic cameras can capture 3D information in one exposure without the need for structured illumination, allowing grey scale depth maps of the captured image to be created. The Lytro, a consumer grade plenoptic camera, provides a cost effective method of measuring depth of multiple objects under controlled lighting conditions. In this research, camera control variables, environmental sensitivity, image distortion characteristics, and the effective working range of two Lytro first generation cameras were evaluated. In addition, a calibration process has been created, for the Lytro cameras, to deliver three dimensional output depth maps represented in SI units (metre). The novel results show depth accuracy and repeatability of +10.0 mm to -20.0 mm, and 0.5 mm respectively. For the lateral X and Y coordinates, the accuracy was +1.56 μm to -2.59 μm and the repeatability was 0.25 μm.
Design and realization of an AEC&AGC system for the CCD aerial camera
NASA Astrophysics Data System (ADS)
Liu, Hai ying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Hao yun
2015-08-01
An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. Standard AEC and AGC algorithms are not suitable for the aerial camera, since the camera always takes high-resolution photographs while moving at high speed. The AEC and AGC system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. An automatic gamma correction is applied before the image is output, so that the image is better suited for viewing and analysis by human eyes. The AEC and AGC system avoids underexposure, overexposure, and image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment speed, high adaptability, and high reliability in severe, complex environments.
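The control loop described above can be sketched as follows; the target brightness, shutter cap, and gain limit are illustrative values, not parameters from the paper. The shutter is adjusted first (capped to bound motion blur at aircraft speed), any remaining exposure error is absorbed by analog gain, and gamma correction is applied before output:

```python
import numpy as np

TARGET = 0.45          # desired mean frame brightness, normalized 0..1 (illustrative)
SHUTTER_MAX = 1 / 250  # cap exposure to limit motion blur at aircraft speed (illustrative)
GAIN_MAX = 8.0         # maximum analog gain (illustrative)

def aec_agc_step(mean_brightness, shutter, gain):
    """One control iteration: scale shutter toward the brightness target,
    then push any error the shutter cannot absorb into the gain."""
    error = TARGET / max(mean_brightness, 1e-6)
    shutter_new = min(shutter * error, SHUTTER_MAX)
    residual = error * shutter / shutter_new   # leftover correction factor
    gain_new = float(np.clip(gain * residual, 1.0, GAIN_MAX))
    return shutter_new, gain_new

def gamma_correct(img, gamma=2.2):
    """Display gamma applied to the normalized image before output."""
    return np.power(np.clip(img, 0.0, 1.0), 1.0 / gamma)

# Underexposed frame: mean brightness 0.1 at 1/1000 s shutter, unity gain.
# The shutter saturates at its cap and the gain picks up the remainder.
shutter, gain = aec_agc_step(0.1, 1 / 1000, 1.0)
print(f"shutter = {shutter:.4f} s, gain = {gain:.3f}x")
corrected = gamma_correct(np.full((2, 2), 0.25))
```

Prioritizing shutter over gain keeps noise low in bright scenes, while the shutter cap prevents the loop from trading sharpness for brightness during fast flight.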
Ejaz, Sohail; Woong, Lim Chae
2006-02-01
Embryonic movements (EM) are considered to be the first sign of life and cigarette smoking during pregnancy has been linked to affect EM. Exposure to sidestream smoke, produced from the emissions of a smoldering cigarette, may result in poor pregnancy outcome and increased risk of serious perinatal morbidity and mortality. In this study, the chicken embryo bioassay was used to systematically assess the effects of short-term exposure to sidestream whole smoke solutions (SSWSS) on EM, recorded in real time by a video camera for 60 min and each EM was counted for every 3-min interval. Application of different types of SSWSS to the embryos caused significant changes in all types of EM from 15 to 18 min of recording time. Extensive reduction (P<0.001) and some time complete stoppage of swing-like movements and whole-body movements were observed in almost all treated embryos. Our data clearly link between exposure of SSWSS and substantial decrease in EM. It is unclear whether nicotine and/or other ingredients present in sidestream smoke are responsible for these alterations in EM. This article provides an outline of the relevance of SSWSS on EM for evolutionary developmental biology and this assay can be used to investigate the complex mixtures with regard to their effects on EM.
Smartphone based Tomographic PIV using colored shadows
NASA Astrophysics Data System (ADS)
Aguirre-Pablo, Andres A.; Alarfaj, Meshal K.; Li, Er Qiang; Thoroddsen, Sigurdur T.
2016-11-01
We use low-cost smartphones and Tomo-PIV to reconstruct the 3D-3C velocity field of a vortex ring. The experiment is carried out in an octagonal tank of water with a vortex ring generator consisting of a flexible membrane enclosed by a cylindrical chamber. This chamber is pre-seeded with black polyethylene microparticles. The membrane is driven by an adjustable impulsive air-pressure to produce the vortex ring. Four synchronized smartphone cameras, of 40 Mpx each, are used to capture the location of particles from different viewing angles. We use red, green and blue LEDs as backlighting sources to capture particle locations at different times. The exposure time on the smartphone cameras is set to 2 seconds, while each LED color is exposed for about 80 μs, with time steps that can go below 300 μs. The timing of these light pulses is controlled with a digital delay generator. The backlight is blocked by the instantaneous location of the particles in motion, leaving a shadow of the corresponding color for each time step. The image is then preprocessed to separate the three color fields, before using the MART reconstruction and cross-correlation of the time steps to obtain the 3D-3C velocity field. This proof-of-concept experiment represents a possible low-cost Tomo-PIV setup.
NASA Technical Reports Server (NTRS)
Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtin, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike
2014-01-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E allows for a relatively high (30%) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with ≤10 e-/pixel/second dark current, ≤25 e- read noise, a gain of 2.0 and ≤0.1% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.
NASA Technical Reports Server (NTRS)
Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.
2014-01-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with 10 e-/pixel/second dark current, 25 e- read noise, a gain of 2.0 +/- 0.5 and 1.0 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.
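The characterization quantities listed (camera gain, read noise) are commonly measured with the photon transfer method, which compares the mean and variance of flat-field and bias frame pairs. A generic sketch of that technique on synthetic frames, not the CLASP team's actual procedure:

```python
import numpy as np

def gain_and_read_noise(flat1, flat2, bias1, bias2):
    """Photon transfer: gain K (e-/DN) = signal / shot-noise variance, with the
    read-noise contribution removed via bias-frame differences; read noise in e- rms."""
    diff_flat = flat1.astype(float) - flat2.astype(float)
    diff_bias = bias1.astype(float) - bias2.astype(float)
    signal = 0.5 * (flat1.mean() + flat2.mean()) - 0.5 * (bias1.mean() + bias2.mean())
    shot_var = 0.5 * diff_flat.var() - 0.5 * diff_bias.var()  # differencing cancels fixed pattern
    gain = signal / shot_var                                  # e- per DN
    read_noise = gain * np.sqrt(0.5 * diff_bias.var())        # e- rms
    return gain, read_noise

# Synthetic 512x512 frames: true gain 2.0 e-/DN, read noise 25 e- (12.5 DN),
# ~20,000 photoelectrons per pixel in the flats
rng = np.random.default_rng(2)
true_gain, rn_dn, electrons = 2.0, 12.5, 20000.0

def make_flat():
    return rng.poisson(electrons, (512, 512)) / true_gain + rng.normal(0, rn_dn, (512, 512))

def make_bias():
    return rng.normal(0, rn_dn, (512, 512))

g, rn = gain_and_read_noise(make_flat(), make_flat(), make_bias(), make_bias())
print(f"gain = {g:.2f} e-/DN, read noise = {rn:.1f} e-")
```

In a real characterization, the flats would span a range of illumination levels to build the full photon transfer curve and check linearity; dark current is measured separately from long dark exposures.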
Sedimentary Rocks of Aram Chaos
NASA Technical Reports Server (NTRS)
2004-01-01
4 February 2004 Aram Chaos is a large meteor impact crater that was nearly filled with sediment. Over time, this sediment was hardened to form sedimentary rock. Today, much of the eastern half of the crater has exposures of light-toned sedimentary rock, such as the outcrops shown in this Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image. The picture is located near 2.0°N, 20.3°W, and covers an area 3 km (1.9 mi) wide. Sunlight illuminates the scene from the left.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frazin, Richard A., E-mail: rfrazin@umich.edu
2013-04-10
Heretofore, the literature on exoplanet detection with coronagraphic telescope systems has paid little attention to the information content of short exposures and to methods of utilizing the measurements of adaptive optics wavefront sensors. This paper provides a framework for incorporating the wavefront sensor measurements in observing modes in which the science camera takes millisecond exposures. In this formulation, the wavefront sensor measurements provide a means to jointly estimate the static speckle and the planetary signal. The ability to estimate planetary intensities in as little as a few seconds has the potential to greatly improve the efficiency of exoplanet search surveys. For simplicity, the mathematical development assumes a simple optical system with an idealized Lyot coronagraph. Unlike currently used methods, in which increasing the observation time beyond a certain threshold is useless, this method produces estimates whose error covariances decrease more quickly than inversely proportional to the observation time. This is due to the fact that the estimates of the quasi-static aberrations are informed by a new random (but approximately known) wavefront every millisecond. The method can be extended to include angular (due to diurnal field rotation) and spectral diversity. Numerical experiments are performed with wavefront data from the AEOS Adaptive Optics System sensing at 850 nm. These experiments assume a science camera wavelength λ of 1.1 μm, that the measured wavefronts are exact, and a Gaussian approximation of shot noise. The effects of detector read-out noise and other issues are left to future investigations. A number of static aberrations are introduced, including one with a spatial frequency exactly corresponding to the planet location, which was at a distance of ≈3λ/D from the star.
Using only 4 s of simulated observation time, a planetary intensity of ≈1 photon ms^-1, and a stellar intensity of ≈10^5 photons ms^-1 (contrast ratio 10^5), the short-exposure estimation method recovers the amplitudes of the static aberrations with 1% accuracy and the planet brightness with 20% accuracy.
Detection strategies for the first supernovae with JWST
NASA Astrophysics Data System (ADS)
Hartwig, Tilman; Bromm, Volker; Loeb, Abraham
2018-06-01
Pair-instability supernovae (PISNe) are very luminous explosions of massive, low metallicity stars. They can potentially be observed out to high redshifts due to their high explosion energies, thus providing a probe of the Universe prior to reionization. The near-infrared camera, NIRCam, on board the James Webb Space Telescope is ideally suited for detecting their redshifted ultraviolet emission. We calculate the photometric signature of high-redshift PISNe and derive the optimal detection strategy for identifying their prompt emission and possible afterglow. We differentiate between PISNe and other sources that could have a similar photometric signature, such as active galactic nuclei or high-redshift galaxies. We demonstrate that the optimal strategy, which maximizes the visibility time of the PISN lightcurve per invested exposure time, consists of the two wide-band filters F200W and F356W with an exposure time of 600 s. For such exposures, we expect one PISN at z ≲ 7.5 per at least 50,000 different fields of view, which can be accomplished with parallel observations and an extensive archival search. The PISN afterglow, caused by nebular emission and reverberation, is very faint and requires unfeasibly long exposure times to be uniquely identified. However, this afterglow would be visible for several hundred years, about two orders of magnitude longer than the prompt emission, rendering PISNe promising targets for future, even more powerful telescopes.
Pearson, Amber L.; Bottomley, Ross; Chambers, Tim; Thornton, Lukar; Stanley, James; Smith, Moira; Barr, Michelle; Signal, Louise
2017-01-01
Blue spaces (water bodies) may promote positive mental and physical health through opportunities for relaxation, recreation, and social connections. However, we know little about the nature and extent of everyday exposure to blue spaces, particularly in settings outside the home or among children, nor whether exposure varies by individual or household characteristics. Wearable cameras offer a novel, reliable method for blue space exposure measurement. In this study, we used images from cameras worn over two days by 166 children in Wellington, New Zealand, and conducted content and blue space quantification analysis on each image (n = 749,389). Blue space was identified in 24,721 images (3.6%), with a total of 23 blue recreation events. Visual exposure and participation in blue recreation did not differ by ethnicity, weight status, household deprivation, or residential proximity to the coastline. Significant differences in both visual exposure to blue space and participation in blue recreation were observed, whereby children from the most deprived schools had significantly higher rates of blue space exposure than children from low deprivation schools. Schools may be important settings to promote equitable blue space exposures. Childhood exposures to blue space may not follow the expected income inequality trends observed among adults. PMID:28587134
Pearson, Amber L; Bottomley, Ross; Chambers, Tim; Thornton, Lukar; Stanley, James; Smith, Moira; Barr, Michelle; Signal, Louise
2017-05-26
Blue spaces (water bodies) may promote positive mental and physical health through opportunities for relaxation, recreation, and social connections. However, we know little about the nature and extent of everyday exposure to blue spaces, particularly in settings outside the home or among children, nor whether exposure varies by individual or household characteristics. Wearable cameras offer a novel, reliable method for blue space exposure measurement. In this study, we used images from cameras worn over two days by 166 children in Wellington, New Zealand, and conducted content and blue space quantification analysis on each image (n = 749,389). Blue space was identified in 24,721 images (3.6%), with a total of 23 blue recreation events. Visual exposure and participation in blue recreation did not differ by ethnicity, weight status, household deprivation, or residential proximity to the coastline. Significant differences in both visual exposure to blue space and participation in blue recreation were observed, whereby children from the most deprived schools had significantly higher rates of blue space exposure than children from low deprivation schools. Schools may be important settings to promote equitable blue space exposures. Childhood exposures to blue space may not follow the expected income inequality trends observed among adults.
Patel, Akash R; Ganley, Jamie; Zhu, Xiaowei; Rome, Jonathan J; Shah, Maully; Glatz, Andrew C
2014-10-01
Radiation exposure during pediatric catheterization is significant. We sought to describe radiation exposure and the effectiveness of radiation safety protocols in reducing exposure during catheter ablations with electrophysiology studies in children and patients with congenital heart disease. We additionally sought to identify at-risk patients. We retrospectively reviewed all interventional electrophysiology procedures performed from April 2009 to September 2011 (6 months preceding intervention, 12 months following implementation of the initial radiation safety protocol, and 8 months following implementation of the modified protocol). The protocols consisted of low pulse rate fluoroscopy settings, operator notification of skin entrance dose every 1,000 mGy, adjusting camera angles by >5° at every 1,000 mGy, and appropriate collimation. The cohort consisted of 291 patients (70 pre-intervention, 137 after initial protocol implementation, 84 after modified protocol implementation) at a median age of 14.9 years, with congenital heart disease present in 11%. Diagnoses included atrioventricular nodal reentrant tachycardia (25%), atrioventricular reentrant tachycardia (61%), atrial tachycardias (12%), and ventricular tachycardia (2%). There were no differences between groups based on patient, arrhythmia, and procedural characteristics. Following implementation of the protocols, there were significant reductions in all measures of radiation exposure: fluoroscopy time (17.8%), dose area product (80.2%), skin entry dose (81.0%), and effective dose (76.9%); p = 0.0001. Independent predictors of increased radiation exposure included larger patient weight, longer fluoroscopy time, and lack of a radiation safety protocol. Implementation of a radiation safety protocol for pediatric and congenital catheter ablations can drastically reduce radiation exposure to patients without affecting procedural success.
Changes in ventricular function during emotional stress and cold exposure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiess, M.C.; Moore, R.A.; Dimsdale, J.
1984-01-01
Patients with cardiac disease frequently develop symptoms with emotional stress or cold exposure. To investigate the effects of these stresses in normal subjects, an ambulatory ventricular function monitor (VEST) (previously reported to measure EFs which correlate well with gamma camera measurements) was employed to record sequential 2 minute time activity curves from the left ventricles of 6 healthy men (ages 19-24) during a control period and during a 30 minute stress interview with a psychiatrist. Four of the subjects were also monitored in a cold room (1°C) for 20 min. In addition to the left ventricular time-activity curve, heart rate (HR) and BP (cuff) were recorded. All subjects had increases in HR, BP and EF during the stress interview. Cold, however, produced decreases in HR and EF and an increase in BP. The results (mean ± SD) are tabulated. End-systolic and end-diastolic counts and hence volume decreased during the interview and increased during cold exposure. The results suggest that (1) ambulatory changes in ventricular function can be measured with the VEST, and (2) significant changes in cardiovascular physiology are seen in normal subjects during a stress interview and exposure to cold.
NASA Astrophysics Data System (ADS)
Johnson, Payton; Ladd, Edwin
2018-01-01
We present time- and spatially-resolved observations of the inner solar corona in the 5303 Å line of Fe XIV, taken during the 21 August 2017 solar eclipse from a field observing site in Crossville, TN. These observations are used to characterize the intensity variations in this coronal emission line, and to compare with oscillation predictions from models for heating the corona by magnetic wave dissipation. The observations were taken with two Explore Scientific ED 102CF 102 mm aperture triplet apochromatic refractors. One system used a DayStar custom-built 5 Å FWHM filter centered on the Fe XIV coronal spectral line and an Atik Titan camera for image collection. The setup produced images with a pixel size of 2.15 arcseconds (~1.5 Mm at the distance to the Sun), and a field of view of 1420 x 1060 arcseconds, covering approximately 20% of the entire solar limb centered near the emerging sunspot complex AR 2672. We obtained images with an exposure time of 0.22 seconds and a frame rate of 2.36 Hz, for a total of 361 images during totality. An identical, co-aligned telescope/camera system observed the same portion of the solar corona, but with a 100 Å FWHM Baader Planetarium solar continuum filter centered on a wavelength of 5400 Å. Images with an exposure time of 0.01 seconds were obtained with a frame rate of 4.05 Hz. These simultaneous observations are used as a control to monitor brightness variations not related to coronal line oscillations.
VizieR Online Data Catalog: Proper motions and photometry of stars in NGC 3201 (Sariya+, 2017)
NASA Astrophysics Data System (ADS)
Sariya, D. P.; Jiang, I.-G.; Yadav, R. K. S.
2017-07-01
To determine the PMs of the stars in this work, we used archive images (http://archive.eso.org/eso/esoarchivemain.html) from observations made with the 2.2m ESO/MPI telescope at La Silla, Chile. This telescope contains a mosaic camera called the Wide-Field Imager (WFI), consisting of a 4×2 array (i.e., 8 CCD chips). Since each CCD has an array of 2048×4096 pixels, WFI ultimately produces images with a 34×33 arcmin² field of view. The observational run of the first epoch contains two images in B, V and I bands, each with 240s exposure time, observed on 1999 December 05. In the second epoch, we have 35 images with 40s exposure time each in the V filter, observed during the period 2014 April 02-05. Thus the epoch gap between the data is ~14.3 years. (2 data files).
NASA Astrophysics Data System (ADS)
Luo, Lin-Bo; An, Sang-Woo; Wang, Chang-Shuai; Li, Ying-Chun; Chong, Jong-Wha
2012-09-01
Digital cameras usually decrease exposure time to capture motion-blur-free images. However, this operation will generate an under-exposed image with a low-budget complementary metal-oxide semiconductor image sensor (CIS). Conventional color correction algorithms can efficiently correct under-exposed images; however, they generally do not run in real time and need at least one frame memory if implemented in hardware. The authors propose a real-time look-up table-based color correction method that corrects under-exposed images in hardware without using frame memory. The method utilizes histogram matching of two preview images, exposed for a long and a short time respectively, to construct an improved look-up table (ILUT) and then corrects the captured under-exposed image in real time. Because the ILUT is calculated in real time before processing the captured image, this method does not require frame memory to buffer image data, and therefore can greatly reduce the cost of the CIS. The method supports not only single image capture, but also bracketing to capture three images at a time. The proposed method was implemented in a hardware description language and verified on a field-programmable gate array with a 5-megapixel CIS. Simulations show that the system performs in real time at low cost and corrects the color of under-exposed images well.
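The histogram-matching step behind the ILUT can be sketched in software. This is an illustrative offline version using numpy; the paper's implementation runs in hardware without frame memory, and the preview images here are synthetic stand-ins.

```python
import numpy as np

def build_ilut(short_prev, long_prev, levels=256):
    """Build a look-up table mapping short-exposure intensities onto the
    tonal distribution of the long-exposure preview (histogram matching).
    A software sketch of the idea only; the paper's ILUT is computed in
    hardware, in real time, without buffering a frame."""
    cdf_s = np.cumsum(np.bincount(short_prev.ravel(), minlength=levels))
    cdf_l = np.cumsum(np.bincount(long_prev.ravel(), minlength=levels))
    cdf_s = cdf_s / cdf_s[-1]
    cdf_l = cdf_l / cdf_l[-1]
    # for each short-exposure level, pick the long-exposure level whose
    # cumulative probability first reaches the same point
    return np.searchsorted(cdf_l, cdf_s).clip(0, levels - 1).astype(np.uint8)

rng = np.random.default_rng(0)
dark = rng.integers(0, 64, (32, 32))    # under-exposed preview (synthetic)
bright = (dark * 3).clip(0, 255)        # longer-exposure preview, same scene
lut = build_ilut(dark, bright)
corrected = lut[dark]                   # per-pixel correction is one lookup
```

Because the correction collapses to a single table lookup per pixel, the captured frame can be corrected as it streams off the sensor, which is what removes the need for frame memory.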
Time-Based Measurement of Personal Mite Allergen Bioaerosol Exposure over 24 Hour Periods
Tovey, Euan R.; Liu-Brennan, Damien; Garden, Frances L.; Oliver, Brian G.; Perzanowski, Matthew S.; Marks, Guy B.
2016-01-01
Allergic diseases such as asthma and rhinitis are common in many countries. Globally the most common allergen associated with symptoms is produced by house dust mites. Although the bed has often been cited as the main site of exposure to mite allergens, surprisingly this has not yet been directly established by measurement due to a lack of suitable methods. Here we report on the development of novel methods to determine the pattern of personal exposure to mite allergen bioaerosols over 24-hour periods and applied this in a small field study using 10 normal adults. Air was sampled using a miniature time-based air-sampler of in-house design located close to the breathing zone of the participants, co-located with a miniature time-lapse camera. Airborne particles, drawn into the sampler at 2 L/min via a narrow slot, were impacted onto the peripheral surface of a disk mounted on the hour-hand of either a 12 or 24 hour clock motor. The impaction surface was either an electret cloth, or an adhesive film; both novel for these purposes. Following a review of the time-lapse images, disks were post-hoc cut into subsamples corresponding to eight predetermined categories of indoor or outdoor location, extracted and analysed for mite allergen Der p 1 by an amplified ELISA. Allergen was detected in 57.2% of the total of 353 subsamples collected during 20 days of sampling. Exposure patterns varied over time. Higher concentrations of airborne mite allergen were typically measured in samples collected from domestic locations in the day and evening. Indoor domestic Der p 1 exposures accounted for 59.5% of total exposure, whereas total in-bed-asleep exposure, which varied 80-fold between individuals, accounted overall for 9.85% of total exposure, suggesting beds are not often the main site of exposure.
This study establishes the feasibility of novel methods for determining the time-geography of personal exposure to many bioaerosols and identifies new areas for future technical development and clinical applications. PMID:27192200
VizieR Online Data Catalog: Times of transits and occultations of WASP-12b (Patra+, 2017)
NASA Astrophysics Data System (ADS)
Patra, K. C.; Winn, J. N.; Holman, M. J.; Yu, L.; Deming, D.; Dai, F.
2017-08-01
Between 2016 October and 2017 February, we observed seven transits of WASP-12 with the 1.2m telescope at the Fred Lawrence Whipple Observatory on Mt. Hopkins, Arizona. Images were obtained with the KeplerCam detector through a Sloan r'-band filter. The typical exposure time was 15s, chosen to give a signal-to-noise ratio of about 200 for WASP-12. The field of view of this camera is 23.1' on a side. We used 2×2 binning, giving a pixel scale of 0.68''. We measured two new occultation times based on hitherto unpublished Spitzer observations in 2013 December (program 90186, P.I. Todorov). Two different transits were observed, one at 3.6μm and one at 4.5μm. The data take the form of a time series of 32×32-pixel subarray images, with an exposure time of 2.0s per image. The data were acquired over a wide range of orbital phases, but for our purpose, we analyzed only the ~14000 images within 4 hr of each occultation. (1 data file).
Hasinoff, Samuel W; Kutulakos, Kiriakos N
2011-11-01
In this paper, we consider the problem of imaging a scene with a given depth of field at a given exposure level in the shortest amount of time possible. We show that by 1) collecting a sequence of photos and 2) controlling the aperture, focus, and exposure time of each photo individually, we can span the given depth of field in less total time than it takes to expose a single narrower-aperture photo. Using this as a starting point, we obtain two key results. First, for lenses with continuously variable apertures, we derive a closed-form solution for the globally optimal capture sequence, i.e., that collects light from the specified depth of field in the most efficient way possible. Second, for lenses with discrete apertures, we derive an integer programming problem whose solution is the optimal sequence. Our results are applicable to off-the-shelf cameras and typical photography conditions, and advocate the use of dense, wide-aperture photo sequences as a light-efficient alternative to single-shot, narrow-aperture photography.
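The paper's central claim can be illustrated with a back-of-envelope model: if exposure time at a fixed exposure level scales with the square of the f-number, while depth of field grows roughly linearly with it, then splitting the depth of field across k wider-aperture shots cuts the total capture time by about a factor of k. The scaling laws below are textbook thin-lens approximations used for illustration, not the paper's closed-form solution or integer program.

```python
def total_capture_time(k, n_single, unit_time=1.0):
    """Idealized model: exposure time t = unit_time * N**2 at a fixed
    exposure level, and depth of field roughly proportional to N.
    Covering the DOF of one shot at f-number `n_single` with k shots,
    each at f-number n_single / k, then takes k * (n_single/k)**2 units.
    Illustrative approximation only."""
    n_each = n_single / k
    return k * unit_time * n_each ** 2

single = total_capture_time(1, 16.0)   # one narrow-aperture shot: 256 units
split = total_capture_time(4, 16.0)    # four wider-aperture shots: 64 units
```

Under this toy model the four-shot sequence gathers the same depth of field at the same exposure level in a quarter of the time, which is the light-efficiency argument the paper makes rigorous.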
Research relative to high resolution camera on the advanced X-ray astrophysics facility
NASA Technical Reports Server (NTRS)
1986-01-01
The HRC (High Resolution Camera) is a photon counting instrument to be flown on the Advanced X-Ray Astrophysics Facility (AXAF). It is a large field of view, high angular resolution detector for the x-ray telescope. The HRC consists of a CsI coated microchannel plate (MCP) acting as a soft x-ray photocathode, followed by a second MCP for high electronic gain. The MCPs are read out by a crossed grid of resistively coupled wires to provide high spatial resolution along with timing and pulse height data. The instrument will be used in two modes: as a direct imaging detector with a limiting sensitivity of 10^-15 erg cm^-2 s^-1 in a 10^5 second exposure, and as a readout for an objective transmission grating providing spectral resolution of several hundreds to thousands.
VizieR Online Data Catalog: AQ Boo VRI differential light curves (Wang+, 2016)
NASA Astrophysics Data System (ADS)
Wang, S.; Zhang, L.; Pi, Q.; Han, X. L.; Zhang, X.; Lu, H.; Wang, D.; Li, T.
2016-11-01
On March 22 and April 19 in 2014, we observed AQ Boo with the 60cm telescope at Xinglong Station of the National Astronomical Observatories of China (NAOC). The CCD camera on this telescope has a resolution of 1024 x 1024 pixels and its corresponding field of view is 17'x17' (Yang, 2013NewA...25..109Y). The other three days of data were obtained using the 1-m telescope at Yunnan Observatory of Chinese Academy of Sciences, on January 20, 21 and February 28 in 2015. The CCD camera on this telescope has a resolution of 2048 x 2048 pixels and its corresponding field of view is 7.3'x7.3'. Bessel VRI filters were used. The exposure times are 100-170s, 50-100s and 50-80s in the V, R, I bands, respectively. (1 data file).
NASA Technical Reports Server (NTRS)
2008-01-01
This image, and many like it, are one way NASA's Phoenix Mars Lander is measuring trace amounts of water vapor in the atmosphere over far-northern Mars. Phoenix's Surface Stereo Imager (SSI) uses solar filters, or filters designed to image the sun, to make these images. The camera is aimed at the sky for long exposures. SSI took this image as a test on June 9, 2008, which was the Phoenix mission's 15th Martian day, or sol, since landing, at 5:20 p.m. local solar time. The camera was pointed about 38 degrees above the horizon. The white dots in the sky are detector dark current that will be removed during image processing and analysis. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space.
How Many Pixels Does It Take to Make a Good 4"×6" Print? Pixel Count Wars Revisited
NASA Astrophysics Data System (ADS)
Kriss, Michael A.
Digital still cameras emerged following the introduction of the Sony Mavica analog prototype camera in 1981. These early cameras produced poor image quality and did not challenge film cameras for overall quality. By 1995 digital still cameras in expensive SLR formats had 6 mega-pixels and produced high quality images (with significant image processing). In 2005 significant improvement in image quality was apparent, and lower prices for digital still cameras (DSCs) started a rapid decline in film usage and film camera sales. By 2010 film usage was mostly limited to professionals and the motion picture industry. The rise of DSCs was marked by a “pixel war” in which the driving feature of the cameras was the pixel count; even moderate-cost (~$120) DSCs would have 14 mega-pixels. The improvement of CMOS technology pushed this trend of lower prices and higher pixel counts. Only the single lens reflex cameras had large sensors and large pixels. The drive for smaller pixels hurt the quality aspects of the final image (sharpness, noise, speed, and exposure latitude). Only today are camera manufacturers starting to reverse their course and produce DSCs with larger sensors and pixels. This paper will explore why larger pixels and sensors are key to the future of DSCs.
Evaluation of Real-Time Hand Motion Tracking Using a Range Camera and the Mean-Shift Algorithm
NASA Astrophysics Data System (ADS)
Lahamy, H.; Lichti, D.
2011-09-01
Several sensors have been tested for improving the interaction between humans and machines, including traditional web cameras, special gloves, haptic devices, cameras providing stereo pairs of images, and range cameras. Meanwhile, several methods are described in the literature for tracking hand motion: the Kalman filter, the mean-shift algorithm and the condensation algorithm. In this research, the combination of a range camera and the simple version of the mean-shift algorithm has been evaluated for its capability for hand motion tracking. The evaluation was assessed in terms of position accuracy of the tracking trajectory in the x, y and z directions in the camera space and the time difference between image acquisition and image display. Three parameters have been analyzed regarding their influence on the tracking process: the speed of the hand movement, the distance between the camera and the hand, and the integration time of the camera. Prior to the evaluation, the required warm-up time of the camera was measured. This study has demonstrated the suitability of the range camera used in combination with the mean-shift algorithm for real-time hand motion tracking. However, for very high-speed hand movement in the transverse plane with respect to the camera, the tracking accuracy is low and requires improvement.
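The core mean-shift step, moving a window to the local centroid of a weight image until it stops shifting, can be sketched as follows. This is a generic implementation of the algorithm, not the authors' range-camera pipeline; the window size and the synthetic test image are arbitrary choices for illustration.

```python
import numpy as np

def mean_shift_window(weight, center, half=8, iters=20, tol=0.5):
    """Iteratively move a square window to the centroid of `weight`
    (e.g. a per-pixel likelihood that a pixel belongs to the hand)
    until the shift falls below `tol` pixels."""
    cy, cx = center
    h, w = weight.shape
    for _ in range(iters):
        y0, y1 = max(0, int(cy) - half), min(h, int(cy) + half + 1)
        x0, x1 = max(0, int(cx) - half), min(w, int(cx) + half + 1)
        patch = weight[y0:y1, x0:x1]
        total = patch.sum()
        if total == 0:
            break                       # no support under the window
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = (ys * patch).sum() / total  # weighted centroid of the window
        nx = (xs * patch).sum() / total
        if abs(ny - cy) < tol and abs(nx - cx) < tol:
            break
        cy, cx = ny, nx
    return cy, cx

# a blob of weight centered at (40, 25); start the window off-target
w = np.zeros((64, 64))
w[36:45, 21:30] = 1.0
cy, cx = mean_shift_window(w, (30, 30))
```

In the paper's setting, the weight image would come from the range camera (e.g. pixels segmented by depth), and each new frame is seeded with the previous frame's converged window position.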
System Synchronizes Recordings from Separated Video Cameras
NASA Technical Reports Server (NTRS)
Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.
2009-01-01
A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
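The quoted repeat interval is consistent with a 32-bit counter of seconds; that interpretation of the Geo-TimeCode format is an assumption on our part, not something the source states, but the arithmetic is easy to check.

```python
# Assumption (not stated in the source): the time code is a 32-bit
# count of seconds, which rolls over after 2**32 seconds.
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # Julian year, 31,557,600 s
repeat_years = 2 ** 32 / SECONDS_PER_YEAR  # about 136.1 years
```

That lines up with the text's "slightly more than 136 years", versus the roughly 24-hour rollover of conventional SMPTE-style time codes.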
A fast double shutter for CCD-based metrology
NASA Astrophysics Data System (ADS)
Geisler, R.
2017-02-01
Image based metrology such as Particle Image Velocimetry (PIV) depends on the comparison of two images of an object taken in fast succession. Cameras for these applications provide the so-called `double shutter' mode: one frame is captured with a short exposure time and in direct succession a second frame with a long exposure time can be recorded. The difference in the exposure times is typically no problem, since illumination is provided by a pulsed light source such as a laser and the measurements are performed in a darkened environment to prevent ambient light from accumulating in the long second exposure. However, measurements of self-luminous processes (e.g. plasma, combustion ...) as well as experiments in ambient light are difficult to perform and require special equipment (external shutters, high-speed image sensors, multi-sensor systems ...). Unfortunately, all these methods incorporate different drawbacks such as reduced resolution, degraded image quality, decreased light sensitivity or increased susceptibility to decalibration. In the solution presented here, off-the-shelf CCD sensors are used with a special timing to combine neighbouring pixels in a binning-like way. As a result, two frames of short exposure time can be captured in fast succession. They are stored in the on-chip vertical register in a line-interleaved pattern, read out in the common way and separated again by software. The two resultant frames are completely congruent; they exhibit no insensitive lines or line shifts and thus enable sub-pixel accurate measurements. A third frame can be captured at the full resolution, analogous to the double shutter technique. Image based measurement techniques such as PIV can benefit from this mode when applied in bright environments. The third frame is useful e.g. for acceleration measurements or for particle tracking applications.
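The software separation of the line-interleaved readout back into two congruent frames is a simple de-interleaving step. A minimal sketch, assuming even lines belong to the first exposure and odd lines to the second; the actual interleave pattern depends on the sensor timing described in the paper.

```python
import numpy as np

def split_interleaved(raw):
    """Separate a line-interleaved CCD readout into its two exposures.
    Assumes even rows carry frame A and odd rows carry frame B; the
    real assignment follows the sensor's vertical-register timing."""
    return raw[0::2, :], raw[1::2, :]

raw = np.arange(8 * 4).reshape(8, 4)    # stand-in for an interleaved readout
frame_a, frame_b = split_interleaved(raw)
```

Each output frame has half the row count of the raw readout, matching the binning-like pixel combination the timing scheme performs on-chip.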
Minamisawa, T; Hirokaga, K
1995-11-01
The open-field activity of first-generation (F1) hybrid male C57BL/6 × C3H mice irradiated with gamma rays on day 14 of gestation was studied at the following ages: 6-7 months (young), 12-13 months (adult) and 19-20 months (old). Doses were 0.5 Gy or 1.0 Gy. Open-field activity was recorded with a camera. The camera output signal was recorded every second through an A/D converter to a personal computer. The field was divided into 25 8-cm² units. All recordings were continuous for 60 min. The walking speed of the 1.0-Gy group recorded at 19-20 months was higher than that for the comparably aged control group. The time which the irradiated group, recorded at 19-20 months, spent in the corner fields was high in comparison with the control group at the same age. Conversely, the time spent by the irradiated group in the middle fields when recorded at 19-20 months was shorter than in the comparably aged control group. No effect of radiation was shown for any of the behaviors observed and recorded at 6-7 and 12-13 months. The results demonstrate that such exposure to gamma rays on day 14 of gestation results in behavioral changes which occur at 19-20 months but not at 6-7 or 12-13 months.
Tonne, Cathryn; Salmon, Maëlle; Sanchez, Margaux; Sreekanth, V; Bhogadi, Santhi; Sambandam, Sankar; Balakrishnan, Kalpana; Kinra, Sanjay; Marshall, Julian D
2017-08-01
While there is convincing evidence that fine particulate matter causes cardiovascular mortality and morbidity, little of the evidence is based on populations outside of high income countries, leaving large uncertainties at high exposures. India is an attractive setting for investigating the cardiovascular risk of particles across a wide concentration range, including concentrations for which there is the largest uncertainty in the exposure-response relationship. CHAI is a European Research Council funded project that investigates the relationship between particulate air pollution from outdoor and household sources with markers of atherosclerosis, an important cardiovascular pathology. The project aims to (1) characterize the exposure of a cohort of adults to particulate air pollution from household and outdoor sources, (2) integrate information from GPS, wearable cameras, and continuous measurements of personal exposure to particles to understand where and through which activities people are most exposed, and (3) quantify the association between particles and markers of atherosclerosis. CHAI has the potential to make important methodological contributions to modeling air pollution exposure integrating outdoor and household sources, as well as in the application of wearable camera data in environmental exposure assessment.
NASA Technical Reports Server (NTRS)
1999-01-01
Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.
640 x 480 MWIR and LWIR camera system developments
NASA Astrophysics Data System (ADS)
Tower, John R.; Villani, Thomas S.; Esposito, Benjamin J.; Gilmartin, Harvey R.; Levine, Peter A.; Coyle, Peter J.; Davis, Timothy J.; Shallcross, Frank V.; Sauer, Donald J.; Meyerhofer, Dietrich
1993-01-01
The performance of a 640 x 480 PtSi, 3-5 micron (MWIR), Stirling cooled camera system with a minimum resolvable temperature of 0.03 is considered. A preliminary specification of a full-TV resolution PtSi radiometer was developed using the measured performance characteristics of the Stirling cooled camera. The radiometer is capable of imaging rapid thermal transients from 25 to 250 C with better than 1 percent temperature resolution. This performance is achieved using the electronic exposure control capability of the MOS focal plane array (FPA). A liquid nitrogen cooled camera with an eight-position filter wheel has been developed using the 640 x 480 PtSi FPA. Low thermal mass packaging for the FPA was developed for Joule-Thomson applications.
Precision of FLEET Velocimetry Using High-Speed CMOS Camera Systems
NASA Technical Reports Server (NTRS)
Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.
2015-01-01
Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored such as digital binning (similar in concept to on-sensor binning, but done in post-processing), row-wise digital binning of the signal in adjacent pixels and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 microseconds, precisions of 0.5 meters per second in air and 0.2 meters per second in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision HighSpeed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio primarily because it had the largest pixels.
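The row-wise digital binning described above, summing groups of adjacent rows in post-processing rather than on-sensor, is straightforward with numpy. The factor of 8 matches the tagged-region thickness mentioned in the text; the noise model in the comment is the standard independent-noise argument, not a measurement from the paper.

```python
import numpy as np

def bin_rows(img, factor=8):
    """Sum each group of `factor` adjacent rows (digital binning done in
    post-processing). For independent per-pixel noise, summing N rows
    raises the signal N-fold but the noise only ~sqrt(N)-fold."""
    h = (img.shape[0] // factor) * factor           # drop leftover rows
    return img[:h].reshape(-1, factor, img.shape[1]).sum(axis=1)

rng = np.random.default_rng(1)
frame = rng.normal(100.0, 5.0, (64, 32))            # synthetic noisy frame
binned = bin_rows(frame)                            # shape (8, 32)
```

This is why binning helped the un-intensified cameras most: their single-shot signal-to-noise ratio was lowest, so the sqrt(N) noise averaging bought the largest relative gain.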
Three-dimensional device characterization by high-speed cinematography
NASA Astrophysics Data System (ADS)
Maier, Claus; Hofer, Eberhard P.
2001-10-01
Testing of micro-electro-mechanical systems (MEMS) for optimization purposes or reliability checks can be supported by device visualization whenever optical access is available. The difficulty in such an investigation is the short time duration of dynamical phenomena in micro devices. This paper presents a test setup to visualize movements within MEMS in real-time and in two perpendicular directions. A three-dimensional view is achieved by combining a commercial high-speed camera system, which can take up to 8 images of the same process with a minimum interframe time of 10 ns for the first direction, with a second visualization system consisting of a highly sensitive CCD camera working with multiple exposure LED illumination in the perpendicular direction. Well synchronized, this provides 3-D information, which is treated by digital image processing to correct image distortions and to perform the detection of object contours. Symmetric and asymmetric binary collisions of micro drops are chosen as test experiments, featuring coalescence and surface rupture. Another application shown here is the investigation of sprays produced by an atomizer. The second direction of view is a prerequisite for this measurement to select an intended plane of focus.
1971-08-01
S71-58222 (31 July-2 Aug. 1971) --- During the lunar eclipse that occurred in the course of the Apollo 15 lunar landing mission, astronaut Alfred M. Worden, command module pilot, used a 35mm Nikon camera to obtain a series of 15 photographs while the moon was entering and exiting Earth's umbra. Although it might seem that there should be no light on the moon when it is in Earth's shadow, sunlight is scattered into this region by Earth's atmosphere. This task was an attempt to measure by photographic photometry the amount of scattered light reaching the moon. The four views from upper left to lower right were selected to show the moon as it entered Earth's umbra. The first is a four-second exposure which was taken at the moment when the moon had just entered umbra; the second is a 15-second exposure taken two minutes after entry; the third, a 30-second exposure three minutes after entry; and the fourth is a 60-second exposure four minutes after entry. In all cases the light reaching the moon was so bright on the very high speed film (Eastman Kodak type 2485 emulsion) that the halation obscures the lunar image, which should be about one-third as big as the circle of light. The background star field is clearly evident, and this is very important for these studies. The spacecraft was in full sunlight when these photographs were taken, and it was pointed almost directly away from the sun so that the windows and a close-in portion of the camera's line-of-sight were in shadow. The environment around the vehicle at this time appears to be very "clean" with no light scattering particles noticeable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shao, Michael; Nemati, Bijan; Zhai, Chengxing
We present an approach that significantly increases the sensitivity for finding and tracking small and fast near-Earth asteroids (NEAs). This approach relies on a combined use of a new generation of high-speed cameras which allow short, high frame-rate exposures of moving objects, effectively 'freezing' their motion, and a computationally enhanced implementation of the 'shift-and-add' data processing technique that helps to improve the signal-to-noise ratio (SNR) for detection of NEAs. The SNR of a single short exposure of a dim NEA is insufficient to detect it in one frame, but by computationally searching for an appropriate velocity vector, shifting successive frames relative to each other and then co-adding the shifted frames in post-processing, we synthetically create a long-exposure image as if the telescope were tracking the object. This approach, which we call 'synthetic tracking,' enhances the familiar shift-and-add technique with the ability to do a wide blind search, detect, and track dim and fast-moving NEAs in near real time. We also discuss how synthetic tracking improves the astrometry of fast-moving NEAs. We apply this technique to observations of two known asteroids conducted on the Palomar 200 inch telescope and demonstrate improved SNR and a 10-fold improvement of astrometric precision over the traditional long-exposure approach. In the past 5 yr, about 150 NEAs with absolute magnitudes H = 28 (∼10 m in size) or fainter have been discovered. With an upgraded version of our camera and a field of view of (28 arcmin)² on the Palomar 200 inch telescope, synthetic tracking could allow detecting up to 180 such objects per night, including very small NEAs with sizes down to 7 m.
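The shift-and-add at the heart of synthetic tracking can be sketched as follows (integer-pixel shifts and a toy blind search over candidate velocities; function names and the grid of candidates are our illustration, not the authors' pipeline):

```python
import numpy as np

def synthetic_track(frames, v):
    """Shift frame t by -v*t pixels (v = (vy, vx) in pixels/frame) and
    co-add, synthesizing a long exposure as if the telescope had tracked
    the object. Integer-pixel shifts only, for simplicity."""
    stack = np.zeros_like(frames[0], dtype=float)
    for t, f in enumerate(frames):
        dy, dx = int(round(v[0] * t)), int(round(v[1] * t))
        stack += np.roll(np.roll(f, -dy, axis=0), -dx, axis=1)
    return stack

def blind_search(frames, candidates):
    """Wide blind search: keep the candidate velocity that maximizes the
    peak of the co-added stack (the object adds coherently only at the
    correct velocity)."""
    return max(candidates, key=lambda v: synthetic_track(frames, v).max())
```

With the correct velocity the object's counts add coherently across N frames while uncorrelated noise grows only as the square root of N, which is the SNR gain the abstract describes.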
MS Grunsfeld changes film using film bag
1997-01-16
S81-E-05468 (16 Jan. 1997) --- To protect it from exposure to light, astronaut John M. Grunsfeld, mission specialist, uses a black bag to change out a film magazine on a 70mm handheld camera during mid-week activity aboard the Space Shuttle Atlantis. The photograph was recorded with an Electronic Still Camera (ESC) and later was downlinked to flight controllers in Houston, Texas.
Optical observations of Swift J1822.3-1606 with the 10.4m Gran Telescopio Canarias
NASA Astrophysics Data System (ADS)
Rea, N.; Mignani, R. P.; Israel, G. L.; Esposito, P.
2011-07-01
We observed the field of the new Soft Gamma-ray Repeater (SGR), Swift J1822.3-1606 (Cummings et al., ATel #3488), with the 10.4m Gran Telescopio Canarias (GranTeCan). Images were taken with the OSIRIS camera, a two-chip CCD with a nominal 7.8' x 7.8' field of view and a pixel size of 0.125". Observations were taken in the z-Sloan band on 2011 July 21st (unfortunately in bright lunar time, with a large sky background and seeing ranging from 1-2.5") with exposure times of 54-108 s.
Visualization of hump formation in high-speed gas metal arc welding
NASA Astrophysics Data System (ADS)
Wu, C. S.; Zhong, L. M.; Gao, J. Q.
2009-11-01
The hump bead is a typical weld defect observed in high-speed welding. Its occurrence limits the improvement of welding productivity. Visualization of hump formation during high-speed gas metal arc welding (GMAW) helps in better understanding the humping phenomena, so that effective measures can be taken to suppress or reduce the tendency of hump formation and achieve higher-productivity welding. In this study, an experimental system was developed to implement vision-based observation of the weld pool behavior during high-speed GMAW. Considering the weld pool characteristics in high-speed welding, the CCD camera was fitted with a narrow band-pass filter and a neutral-density filter, a suitable exposure time was selected, and the camera viewed the weld pool from the side. The events that took place at the rear portion of the weld pools were imaged during welding processes with and without hump bead formation, respectively. It was found that the variation with time of the weld pool surface height and the solid-liquid interface at the pool trailing edge provides useful information for judging whether the humping phenomenon occurs.
NASA Astrophysics Data System (ADS)
Gacal, G. F. B.; Lagrosas, N.
2016-12-01
Nowadays, cameras are commonly used by students. In this study, we use this instrument to look at moon signals and relate these signals to Gaussian functions. To implement this as a classroom activity, students need computers, computer software to visualize signals, and moon images. A normalized Gaussian function is often used to represent probability density functions of the normal distribution. It is described by its mean m and standard deviation s. A smaller standard deviation implies less spread from the mean. For the 2-dimensional Gaussian function, the mean can be described by coordinates (x0, y0), while the standard deviations can be described by sx and sy. In modelling moon signals obtained from sky-cameras, the position of the mean (x0, y0) is found by locating the coordinates of the maximum signal of the moon. The two standard deviations are the mean-square weighted deviations based on the sums of the pixel values of all rows/columns. If visualized in three dimensions, the 2D Gaussian function appears as a 3D bell surface (Fig. 1a). This shape is similar to the pixel value distribution of moon signals as captured by a sky-camera. An example of this is illustrated in Fig. 1b, taken around 22:20 (local time) on January 31, 2015. The local time is 8 hours ahead of coordinated universal time (UTC). This image was produced by a commercial camera (Canon PowerShot A2300) with 1 s exposure time, f-stop of f/2.8, and 5 mm focal length. One has to choose a camera with high sensitivity when operating at nighttime to effectively detect these signals. Fig. 1b is obtained by converting the red-green-blue (RGB) photo to grayscale values. The grayscale values are then converted to a double data type matrix. The last conversion is performed so that the Gaussian model and the pixel distribution of the raw signals share the same scale. Subtraction of the Gaussian model from the raw data produces a moonless image, as shown in Fig. 1c.
This moonless image can be used for quantifying cloud cover as captured by ordinary cameras (Gacal et al., 2016). Cloud cover can be defined as the ratio of the number of pixels whose values exceed 0.07 to the total number of pixels. In this particular image, the cloud cover value is 0.67.
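The moon-removal and cloud-cover steps described above can be sketched in Python (a minimal illustration assuming a background-free image; the function names are ours):

```python
import numpy as np

def gaussian2d(shape, x0, y0, sx, sy, amp):
    """2-D Gaussian 'bell surface' model of the moon signal."""
    y, x = np.indices(shape)
    return amp * np.exp(-((x - x0)**2 / (2 * sx**2) + (y - y0)**2 / (2 * sy**2)))

def remove_moon(img):
    """Locate the peak (x0, y0), estimate sx and sy as mean-square weighted
    deviations of the column/row sums, then subtract the Gaussian model to
    obtain a 'moonless' image."""
    y0, x0 = np.unravel_index(np.argmax(img), img.shape)
    col, row = img.sum(axis=0), img.sum(axis=1)
    xs, ys = np.arange(img.shape[1]), np.arange(img.shape[0])
    sx = np.sqrt(np.sum(col * (xs - x0)**2) / col.sum())
    sy = np.sqrt(np.sum(row * (ys - y0)**2) / row.sum())
    model = gaussian2d(img.shape, x0, y0, sx, sy, img[y0, x0])
    return img - model

def cloud_cover(img, thresh=0.07):
    """Fraction of pixels brighter than the threshold (Gacal et al., 2016)."""
    return np.count_nonzero(img > thresh) / img.size

# Synthetic "moon": a pure 2-D Gaussian; subtraction should leave ~nothing.
img = gaussian2d((64, 64), 32, 32, 3.0, 3.0, 1.0)
residual = remove_moon(img)
```

On a real frame the residual, not the raw image, is thresholded for cloud cover, so the moon itself is not counted as cloud.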
Long, Tom; Johnson, Ted; Ollison, Will
2002-05-01
Researchers have developed a variety of computer-based models to estimate population exposure to air pollution. These models typically estimate exposures by simulating the movement of specific population groups through defined microenvironments. Exposures in the motor vehicle microenvironment are significantly affected by air exchange rate, which in turn is affected by vehicle speed, window position, vent status, and air conditioning use. A pilot study was conducted in Houston, Texas, during September 2000 for a specific set of weather, vehicle speed, and road type conditions to determine whether useful information on the position of windows, sunroofs, and convertible tops could be obtained through the use of video cameras. Monitoring was conducted at three sites (two arterial roads and one interstate highway) on the perimeter of Harris County located in or near areas not subject to mandated Inspection and Maintenance programs. Each site permitted an elevated view of vehicles as they proceeded through a turn, thereby exposing all windows to the stationary video camera. Five videotaping sessions were conducted over a two-day period in which the Heat Index (HI), a function of temperature and humidity, varied from 80 to 101 degrees F and vehicle speed varied from 30 to 74 mph. The resulting videotapes were processed to create a master database listing vehicle-specific data for site location, date, time, vehicle type (e.g., minivan), color, window configuration (e.g., four windows and sunroof), number of windows in each of three position categories (fully open, partially open, and closed), HI, and speed. Of the 758 vehicles included in the database, 140 (18.5 percent) were labeled as "open," indicating a window, sunroof, or convertible top was fully or partially open.
The results of a series of stepwise linear regression analyses indicated that the probability of a vehicle in the master database being "open" was weakly affected by time of day, vehicle type, vehicle color, vehicle speed, and HI. In particular, open windows occurred more frequently when vehicle speed was less than 50 mph during periods when HI exceeded 99.9 degrees F and the vehicle was a minivan or passenger van. Overall, the pilot study demonstrated that data on factors affecting vehicle window position could be acquired through a relatively simple experimental protocol using a single video camera. Limitations of the study requiring further research include the inability to determine the status of the vehicle air conditioning system; lack of a wide range of weather, vehicle speed, and road type conditions; and the need to exclude some vehicles from statistical analyses due to ambiguous window positions.
VizieR Online Data Catalog: Observed light curve of (3200) Phaethon (Ansdell+, 2014)
NASA Astrophysics Data System (ADS)
Ansdell, M.; Meech, K. J.; Hainaut, O.; Buie, M. W.; Kaluna, H.; Bauer, J.; Dundon, L.
2017-04-01
We obtained time series photometry over 15 nights from 1994 to 2013. All but three nights used the Tektronix 2048x2048 pixel CCD camera on the University of Hawaii 2.2 m telescope on Mauna Kea. Two nights used the PRISM 2048x2048 pixel CCD camera on the Perkins 72 inch telescope at the Lowell Observatory in Flagstaff, Arizona, while one night used the Optic 2048x4096 CCD camera also on the University of Hawaii 2.2 m telescope. All observations used the standard Kron-Cousins R filter with the telescope guiding on (3200) Phaethon at non-sidereal rates. Raw images were processed with standard IRAF routines for bias subtraction, flat-fielding, and cosmic ray removal (Tody, 1986SPIE..627..733T). We constructed reference flat fields by median combining dithered images of either twilight or the object field (in both cases, flattening reduced gradients to <1% across the CCD). We performed photometry using the IRAF phot routine with circular apertures typically 5'' in radius, although aperture sizes changed depending on the night and/or exposure as they were chosen to consistently include 99.5% of the object's light. (1 data file).
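The circular-aperture photometry described above can be sketched in a few lines of Python (a minimal stand-in for IRAF's `phot`; the function names and toy image are illustrative, and a real reduction would also include sky subtraction and centroiding):

```python
import numpy as np

def aperture_sum(img, x0, y0, r):
    """Sum pixel values inside a circular aperture of radius r (pixels)
    centred on column x0, row y0."""
    y, x = np.indices(img.shape)
    return img[(x - x0)**2 + (y - y0)**2 <= r**2].sum()

def curve_of_growth(img, x0, y0, radii):
    """Flux versus aperture radius; used to pick a radius that consistently
    encloses a fixed fraction (e.g. 99.5%) of the object's light."""
    return [aperture_sum(img, x0, y0, r) for r in radii]
```

Growing the aperture until the curve of growth flattens is one way to choose radii that capture 99.5% of the light night after night, as the observers describe.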
Passive radiation detection using optically active CMOS sensors
NASA Astrophysics Data System (ADS)
Dosiek, Luke; Schalk, Patrick D.
2013-05-01
Recently, there have been a number of small-scale and hobbyist successes in employing commodity CMOS-based camera sensors for radiation detection. For example, several smartphone applications initially developed for use in areas near the Fukushima nuclear disaster are capable of detecting radiation using a cell phone camera, provided opaque tape is placed over the lens. In all current useful implementations, it is required that the sensor not be exposed to visible light. We seek to build a system that does not have this restriction. While building such a system would require sophisticated signal processing, it would nevertheless provide great benefits. In addition to fulfilling their primary function of image capture, cameras would also be able to detect unknown radiation sources even when the danger is considered to be low or non-existent. By experimentally profiling the image artifacts generated by gamma-ray and beta-particle impacts, algorithms are developed to identify the unique features of radiation exposure, while discarding optical interaction and thermal noise effects. Preliminary results focus on achieving this goal in a laboratory setting, without regard to integration time or computational complexity. However, future work will seek to address these additional issues.
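One simple way to separate particle impacts from optical content, in the spirit of the artifact profiling described above, is temporal differencing: scene photons persist across frames, while a radiation hit spikes in a single frame. The sketch below (the threshold and function name are our assumptions, not the authors' algorithm) flags such single-frame transients:

```python
import numpy as np

def radiation_hits(prev, cur, nxt, thresh=50.0):
    """Flag pixels that are bright in `cur` but absent in both neighbouring
    frames. Persistent optical scene content is rejected; an isolated
    single-frame transient is the signature of a particle impact."""
    return (cur - np.maximum(prev, nxt)) > thresh

# Toy sequence: flat 10-count scene with one simulated impact in the
# middle frame only.
prev = np.full((8, 8), 10.0)
cur = prev.copy()
cur[3, 4] = 200.0          # simulated gamma-ray impact
nxt = prev.copy()
hits = radiation_hits(prev, cur, nxt)
```

A production algorithm would also have to reject hot pixels (which persist) and model shot noise in bright regions, which is where the "sophisticated signal processing" the abstract mentions comes in.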
High-angular-resolution NIR astronomy with large arrays (SHARP I and SHARP II)
NASA Astrophysics Data System (ADS)
Hofmann, Reiner; Brandl, Bernhard; Eckart, Andreas; Eisenhauer, Frank; Tacconi-Garman, Lowell E.
1995-06-01
SHARP I and SHARP II are near-infrared cameras for high-angular-resolution imaging. Both cameras are built around a 256 x 256 pixel NICMOS 3 HgCdTe array from Rockwell which is sensitive in the 1-2.5 micrometer range. With a 0.05"/pixel scale, they can produce diffraction-limited K-band images at 4-m-class telescopes. For a 256 x 256 array, this pixel scale results in a field of view of 12.8" x 12.8", which is well suited for the observation of galactic and extragalactic near-infrared sources. Photometric and low-resolution spectroscopic capabilities are added by photometric band filters (J, H, K), narrow-band filters (λ/Δλ ≈ 100) for selected spectral lines, and a CVF (λ/Δλ ≈ 70). A cold shutter permits short exposure times down to about 10 ms. The data acquisition electronics permanently accepts the maximum frame rate of 8 Hz, which is defined by the detector time constants (data rate 1 Mbyte/s). SHARP I was especially designed for speckle observations at ESO's 3.5 m New Technology Telescope and has been in operation since 1991. SHARP II has been used at ESO's 3.6 m telescope together with the adaptive optics system COME-ON+ since 1993. A new version of SHARP II is presently under test, which incorporates exchangeable camera optics for observations with scales of 0.035, 0.05, and 0.1"/pixel. The first scale extends diffraction-limited observations down to the J-band, while the last one provides a larger field of view. To demonstrate the power of the cameras, images of the galactic center obtained with SHARP I, and images of the R136 region in 30 Doradus observed with SHARP II, are presented.
NASA Astrophysics Data System (ADS)
Brauchle, Joerg; Berger, Ralf; Hein, Daniel; Bucher, Tilman
2017-04-01
The DLR Institute of Optical Sensor Systems has developed the MACS-Himalaya, a custom-built Modular Aerial Camera System specifically designed for the extreme geometric (steep slopes) and radiometric (high contrast) conditions of high mountain areas. It has an overall field of view of 116° across-track, consisting of a nadir and two oblique-looking RGB camera heads and a fourth nadir-looking near-infrared camera. This design provides the capability to fly along narrow valleys and simultaneously cover ground and steep valley-flank topography with similar ground resolution. To compensate for extreme contrasts between fresh snow and dark shadows at high altitudes, a High Dynamic Range (HDR) mode was implemented, which typically takes a sequence of 3 images with graded integration times, each covering 12 bit radiometric depth, resulting in a total dynamic range of 15-16 bit. This enables dense image matching and interpretation for sunlit snow and glaciers as well as for dark shaded rock faces in the same scene. Small and lightweight industrial-grade camera heads are used and operated at a rate of 3.3 frames per second with 3-step HDR, which is sufficient to achieve a longitudinal overlap of approximately 90% at 1,000 m above ground at a velocity of 180 km/h. Direct georeferencing and multitemporal monitoring without the need for ground control points is possible due to the use of a high-end GPS/INS system, a stable calibrated inner geometry of the camera heads and a fully photogrammetric workflow at DLR. In 2014 a survey was performed on the Nepalese side of the Himalayas. The remote sensing system was carried in a wingpod by a Stemme S10 motor glider. Amongst other targets, the Seti Valley, Kali-Gandaki Valley and the Mt. Everest/Khumbu Region were imaged at altitudes up to 9,200 m.
Products such as dense point clouds, DSMs and true orthomosaics with a ground pixel resolution of up to 15 cm were produced in regions and outcrops normally inaccessible to aerial imagery. These data are used in the fields of natural hazards, geomorphology and glaciology (see Thompson et al., CR4.3). In the presentation the camera system is introduced and examples and applications from the Nepal campaign are given.
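The graded-integration HDR scheme can be illustrated by a simple merge that scales each 12-bit frame by its integration time and averages the unsaturated pixels (a sketch under assumed linear sensor response; `merge_hdr` and the saturation threshold are our assumptions, not the DLR pipeline):

```python
import numpy as np

def merge_hdr(frames, exposures, sat=4000):
    """Combine a graded-exposure sequence into one radiance map: divide each
    frame by its integration time (counts per unit time) and average only
    the pixels that are below the saturation threshold in that frame."""
    frames = np.asarray(frames, dtype=float)
    exposures = np.asarray(exposures, dtype=float)
    valid = frames < sat                            # ignore clipped pixels
    scaled = frames / exposures[:, None, None]      # normalize by exposure
    weight = valid.astype(float)
    return (scaled * weight).sum(axis=0) / np.maximum(weight.sum(axis=0), 1)
```

Bright snow is then recovered from the shortest exposure and shadowed rock from the longest, which is how three 12-bit frames yield 15-16 bits of usable dynamic range.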
New Day for Longest-Working Mars Rover
2018-02-16
NASA's Mars Exploration Rover Opportunity recorded the dawn of the rover's 4,999th Martian day, or sol, with its Panoramic Camera (Pancam) on Feb. 15, 2018, yielding this processed, approximately true-color scene. The view looks across Endeavour Crater, which is about 14 miles (22 kilometers) in diameter, from the inner slope of the crater's western rim. Opportunity has driven 28.02 miles (45.1 kilometers) since it landed in the Meridiani Planum region of Mars in January 2004, for what was planned as a 90-sol mission. A sol lasts about 40 minutes longer than an Earth day. This view combines three separate Pancam exposures taken through filters centered on wavelengths of 601 nanometers (red), 535 nanometers (green) and 482 nanometers (blue). It was processed at Texas A&M University to correct for some of the oversaturation and glare, though it still includes some artifacts from pointing a camera with a dusty lens at the Sun. The processing includes radiometric correction, interpolation to fill in gaps in the data caused by saturation due to the Sun's brightness, and warping the red and blue images to undo the effects of time passing between each of the exposures through different filters. https://photojournal.jpl.nasa.gov/catalog/PIA22221
Direct measurement of lateral transport in membranes by using time-resolved spatial photometry.
Kapitza, H G; McGregor, G; Jacobson, K A
1985-01-01
Spatially resolving light detectors allow, with proper calibration, quantitative analysis of the variations in two-dimensional intensity distributions over time. An ultrasensitive microfluorometer was assembled by using as a detector a microchannel plate-intensified video camera. The camera was interfaced with a software-based digital video analysis system to digitize, average, and process images and to directly control the timing of the experiments to minimize exposure of the specimen to light. The detector system has been characterized to allow its use as a photometer. A major application has been to perform fluorescence recovery after photobleaching measurements by using the camera in place of a photomultiplier tube (video-FRAP) with the goal of detecting possible anisotropic diffusion or convective flow. Analysis of the data on macromolecular diffusion in homogenous aqueous glycol solutions yielded diffusion constants in agreement with previous measurements. Results on lipid probe diffusion in dimyristoylphosphatidylcholine multibilayers indicated that at temperatures above the gel-to-liquid crystalline phase transition diffusion is isotropic, and analysis of video-FRAP data yielded diffusion coefficients consistent with those measured previously by using spot photobleaching. However, lipid probes in these multibilayers held just below the main phase transition temperature exhibited markedly anisotropic diffusive fluxes when the bleaching beam was positioned proximate to domain boundaries in the P beta' phase. Lipid probes and lectin receptor complexes diffused isotropically in fibroblast surface membranes with little evidence for diffusion channeled parallel to stress fibers. A second application was to trace the time evolution of cell surface reactions such as patching. The feasibility of following, on the optical scale, the growth of individual receptor clusters induced by the ligand wheat germ agglutinin was demonstrated. PMID:3858869
Photometric Lunar Surface Reconstruction
NASA Technical Reports Server (NTRS)
Nefian, Ara V.; Alexandrov, Oleg; Morattlo, Zachary; Kim, Taemin; Beyer, Ross A.
2013-01-01
Accurate photometric reconstruction of the Lunar surface is important in the context of upcoming NASA robotic missions to the Moon and in giving a more accurate understanding of the Lunar soil composition. This paper describes a novel approach for joint estimation of Lunar albedo, camera exposure time, and photometric parameters that utilizes an accurate Lunar-Lambertian reflectance model and previously derived Lunar topography of the area visualized during the Apollo missions. The method introduced here is used in creating the largest Lunar albedo map (16% of the Lunar surface) at the resolution of 10 meters/pixel.
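The joint estimation described above can be illustrated with a small alternating-least-squares sketch for the multiplicative model I_k = T_k · A · R_k (per-image exposure time T_k, per-pixel albedo A, and reflectance R_k, which in the paper comes from the Lunar-Lambertian model and known topography). The function and the gauge-fixing choice T_0 = 1 are our illustration, not the paper's implementation:

```python
import numpy as np

def estimate_albedo_exposure(images, refl, iters=20):
    """Alternating least squares for I_k = T_k * A * R_k. The model has a
    global scale ambiguity (A -> c*A, T -> T/c), which we fix by forcing
    T[0] = 1 after each iteration."""
    K = len(images)
    T = np.ones(K)
    A = np.mean([img / r for img, r in zip(images, refl)], axis=0)  # rough init
    for _ in range(iters):
        # Exposure update: 1-D least squares per image, albedo held fixed.
        for k in range(K):
            pred = A * refl[k]
            T[k] = (images[k] * pred).sum() / (pred * pred).sum()
        # Albedo update: per-pixel least squares, exposures held fixed.
        num = sum(T[k] * refl[k] * images[k] for k in range(K))
        den = sum((T[k] * refl[k]) ** 2 for k in range(K))
        A = num / den
        A *= T[0]
        T /= T[0]          # gauge fixing: T[0] = 1
    return A, T
```

On noise-free synthetic data the alternation recovers the albedo map and exposure times exactly up to the fixed gauge.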
Home cage locomotor changes in non-human primates after prolonged welding-fume exposure.
Kim, Choong Yong; Sung, Jae Hyuck; Chung, Yong Hyun; Park, Jung Duck; Han, Jeong Hee; Lee, Jong Seong; Heo, Jeong Doo; Yu, Il Je
2013-12-01
To define the relationship between the brain concentration of manganese and neurological signs, such as locomotion, after prolonged welding-fume exposure, cynomolgus monkeys were acclimated for 1 month and then divided into three exposure groups: unexposed, low concentration (31 mg/m³ total suspended particulate (TSP), 0.9 mg/m³ of Mn), and high concentration (62 mg/m³ TSP, 1.95 mg/m³ of Mn). The monkeys were exposed to manual metal-arc stainless steel (MMA-SS) welding fumes for 2 h per day over 8 months in an inhalation chamber system equipped with an automatic fume generator. The home cage locomotor activity and patterns were determined using a camera system over 2-4 consecutive days. After 25 and 32 weeks of exposure, the home cage locomotor activity of the high-concentration primates was found to be 5-6 times higher than that of the unexposed primates, and this increased locomotor activity was maintained for 7 weeks after ceasing the welding-fume exposure, eventually subsiding to three times higher after 13 weeks of recovery. Therefore, the present results, along with our previous observations of a high magnetic resonance imaging (MRI) T1 signal in the globus pallidus and increased blood Mn concentration, indicate that prolonged welding-fume exposure can cause neurobehavioral changes in cynomolgus monkeys.
ERIC Educational Resources Information Center
Fortunato, John A.
2001-01-01
Identifies and analyzes the exposure and portrayal framing methods that are utilized by the National Basketball Association (NBA). Notes that key informant interviews provide insight into the exposure framing method and reveal two portrayal instruments: cameras and announcers; and three framing strategies: depicting the NBA as a team game,…
Video surveillance with speckle imaging
Carrano, Carmen J [Livermore, CA; Brase, James M [Pleasanton, CA
2007-07-17
A surveillance system looks through the atmosphere along a horizontal or slant path. Turbulence along the path causes blurring. The blurring is corrected by speckle processing short exposure images recorded with a camera. The exposures are short enough to effectively freeze the atmospheric turbulence. Speckle processing is used to recover a better quality image of the scene.
Jarc, Anthony M; Curet, Myriam J
2017-03-01
Effective visualization of the operative field is vital to surgical safety and education. However, additional metrics for visualization are needed to complement other common measures of surgeon proficiency, such as time or errors. Unlike other surgical modalities, robot-assisted minimally invasive surgery (RAMIS) enables data-driven feedback to trainees through measurement of camera adjustments. The purpose of this study was to validate and quantify the importance of novel camera metrics during RAMIS. New (n = 18), intermediate (n = 8), and experienced (n = 13) surgeons completed 25 virtual reality simulation exercises on the da Vinci Surgical System. Three camera metrics were computed for all exercises and compared to conventional efficiency measures. Both camera metrics and efficiency metrics showed construct validity (p < 0.05) across most exercises (camera movement frequency 23/25, camera movement duration 22/25, camera movement interval 19/25, overall score 24/25, completion time 25/25). Camera metrics differentiated new from experienced surgeons across all tasks, as did efficiency metrics. Finally, camera metrics correlated significantly (p < 0.05) with completion time (camera movement frequency 21/25, camera movement duration 21/25, camera movement interval 20/25) and overall score (camera movement frequency 20/25, camera movement duration 19/25, camera movement interval 20/25) for most exercises. We demonstrate construct validity of novel camera metrics and correlation between camera metrics and efficiency metrics across many simulation exercises. We believe camera metrics could be used to improve RAMIS proficiency-based curricula.
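For illustration, plausible definitions of the three camera metrics might look as follows; the abstract names the metrics but does not give formulas, so these definitions (and the event representation) are our assumptions:

```python
def camera_metrics(events, task_time):
    """Illustrative camera metrics from a list of camera-movement events.
    `events` is a list of (start, end) times in seconds; `task_time` is the
    exercise duration in seconds.
    - frequency: camera movements per minute of task time
    - duration:  mean length of a camera movement (s)
    - interval:  mean gap between consecutive movements (s)"""
    n = len(events)
    frequency = n / (task_time / 60.0)
    duration = sum(end - start for start, end in events) / n
    gaps = [s2 - e1 for (_, e1), (s2, _) in zip(events, events[1:])]
    interval = sum(gaps) / len(gaps) if gaps else float("inf")
    return frequency, duration, interval

# Three movements in a 60 s exercise.
f, d, i = camera_metrics([(0, 2), (10, 12), (20, 23)], task_time=60)
```

Metrics of this kind can be logged automatically by the surgical system, which is what makes data-driven feedback on camera control feasible in RAMIS.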
Earth-orbiting extreme ultraviolet spectroscopic mission: SPRINT-A/EXCEED
NASA Astrophysics Data System (ADS)
Yoshikawa, I.; Tsuchiya, F.; Yamazaki, A.; Yoshioka, K.; Uemizu, K.; Murakami, G.; Kimura, T.; Kagitani, M.; Terada, N.; Kasaba, Y.; Sakanoi, T.; Ishii, H.; Uji, K.
2012-09-01
The EXCEED (Extreme Ultraviolet Spectroscope for Exospheric Dynamics) mission is an Earth-orbiting extreme ultraviolet (EUV) spectroscopic mission and the first in the SPRINT series being developed by ISAS/JAXA. It will be launched in the summer of 2013. EUV spectroscopy is suitable for observing tenuous gases and plasmas around planets in the solar system (e.g., Mercury, Venus, Mars, Jupiter, and Saturn). An advantage of remote sensing observation is that it takes a direct picture of the plasma dynamics and explicitly distinguishes between spatial and temporal variability. One of the primary observation targets is the inner magnetosphere of Jupiter, whose plasma dynamics is dominated by planetary rotation. Previous observations have shown a hot electron population of a few percent in the inner magnetosphere, with a temperature 100 times higher than that of the background thermal electrons. Though the hot electrons have a significant impact on the energy balance in the inner magnetosphere, their generation process has not yet been elucidated. In the EUV range, a number of emission lines originate from plasmas distributed in Jupiter's inner magnetosphere. The EXCEED spectrograph is designed to have a wavelength range of 55-145 nm with a minimum spectral resolution of 0.4 nm, enabling the electron temperature and ion composition in the inner magnetosphere to be determined. Another primary objective is to investigate an unresolved problem concerning the escape of the atmosphere to space. Although there have been some in-situ observations by orbiters, our knowledge is still limited. The EXCEED mission plans to make imaging observations of plasmas around Venus and Mars to determine the amounts of escaping atmosphere. The instrument's field of view (FOV) is wide enough to image the interaction region between the solar wind and planetary plasmas down to the tail region at one time.
This will provide us with information about outward-flowing plasmas, e.g., their composition, rate, and dependence on solar activity. EXCEED has two mission instruments: the EUV spectrograph and a target guide camera that is sensitive to visible light. The EUV spectrograph is designed to have a wavelength range of 55-145 nm with a spectral resolution of 0.4-1.0 nm. The spectrograph slits have a FOV of 400 x 140 arcseconds (maximum). The optics of the instrument consists of a primary mirror with a diameter of 20 cm, a laminar-type grating, and a 5-stage micro-channel plate assembly with a resistive anode encoder. To achieve high efficiencies, the surfaces of the primary mirror and the grating are coated with CVD-SiC. Because of the large primary mirror and high efficiencies, good temporal resolution and complete spatial coverage for Io plasma torus observations are expected. Based on a feasibility study using the spectral diagnosis method, it is shown that EXCEED can determine the Io plasma torus parameters, such as the electron density, temperatures, and hot electron fraction, using an exposure time of 50 minutes. The target guide camera will be used to capture the target and guide the observation area of interest to the slit. Emissions from outside the slit's FOV will be reflected by the front of the slit and guided to the target guide camera. The guide camera's FOV is 240" x 240". The camera will take an image every 3 seconds, and the image is sent to a mission data processor (MDP), which calculates the centroid of the image. During an observation, the bus system controls the attitude to keep the centroid position of the target in the guide camera with an accuracy of ±5 arcseconds. With the help of the target guide camera, we will take spectral images with a long exposure time of 50 minutes and good spatial resolution of 20 arcseconds.
NASA Astrophysics Data System (ADS)
Migiyama, Go; Sugimura, Atsuhiko; Osa, Atsushi; Miike, Hidetoshi
Digital cameras are advancing rapidly, yet a captured image still differs from what the naked eye perceives when viewing the same scene. Photographs of scenes with a wide dynamic range suffer from blown-out highlights and crushed blacks, problems that hardly arise in human vision; this is a major cause of the difference between the shot image and the perceived scene. Blown-out highlights and crushed blacks are caused by the difference in dynamic range between the image sensor installed in a digital camera, such as a CCD or CMOS sensor, and the human visual system: the dynamic range of the shot image is narrower than that of the perceived scene. In order to solve this problem, we propose an automatic method that decides an effective exposure range from the superposition of edges. We integrate multi-step exposure images using this method. In addition, we suppress pseudo-edges through a process that blends exposure values. The result is a pseudo wide-dynamic-range image obtained automatically.
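The integration step can be illustrated with a minimal sketch. This is an illustrative stand-in, not the authors' edge-superposition method: each exposure is simply weighted per pixel by local gradient magnitude, so detail-bearing regions dominate the blend.

```python
import numpy as np

def fuse_exposures(images):
    """Blend a stack of multi-step exposure images (floats in [0, 1]),
    weighting each pixel by local edge strength so that well-exposed
    regions (where detail survives) dominate the fused result.
    Simplified sketch; the paper's method decides an effective exposure
    range from the superposition of edges and suppresses pseudo-edges."""
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    # Edge strength per image: gradient magnitude, a proxy for usable detail.
    gy, gx = np.gradient(stack, axis=(1, 2))
    weights = np.sqrt(gx**2 + gy**2) + 1e-8   # avoid all-zero weights
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)
```

In flat regions where no exposure carries edge detail, the weights degenerate to a uniform average of the stack.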
Capturing exposures: using automated cameras to document environmental determinants of obesity.
Barr, Michelle; Signal, Louise; Jenkin, Gabrielle; Smith, Moira
2015-03-01
Children's exposure to food marketing across multiple everyday settings, a key environmental influence on health, has not yet been objectively documented. Wearable automated cameras (ACs) may have the potential to provide an objective account of this exposure. The purpose of this study is to assess the feasibility of using ACs to document children's exposure to food marketing in multiple settings. A convenience sample of six participants (aged 12) wore a SenseCam device for two full days. Participants then attended a focus group to ascertain their experiences of using the device. The collected data were analysed to determine participants' daily and setting-specific exposure to 'healthy' and 'unhealthy' food marketing (in minutes). The focus group transcript was analysed using thematic analysis to identify common themes. Participants collected usable data that could be analysed to determine their daily exposure (in minutes) to 'unhealthy' food marketing across a number of everyday settings. Results from the focus group discussion indicated that participants were comfortable wearing the device after an initial adjustment period. ACs may be an effective tool for documenting children's exposure to food marketing in multiple settings. ACs provide a new method for documenting environmental determinants of obesity and likely other environmental impacts on health. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Event-Driven Random-Access-Windowing CCD Imaging System
NASA Technical Reports Server (NTRS)
Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William
2004-01-01
A charge-coupled-device (CCD) based high-speed imaging system, called a real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in High-Frame-Rate CCD Camera Having Subwindow Capability (NPO-30564) NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field programmable gate array (FPGA) controller card.
These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable assembly. The FPGA controller card is connected to the host computer via a standard peripheral component interface (PCI).
Hyper Suprime-Cam: System design and verification of image quality
NASA Astrophysics Data System (ADS)
Miyazaki, Satoshi; Komiyama, Yutaka; Kawanomoto, Satoshi; Doi, Yoshiyuki; Furusawa, Hisanori; Hamana, Takashi; Hayashi, Yusuke; Ikeda, Hiroyuki; Kamata, Yukiko; Karoji, Hiroshi; Koike, Michitaro; Kurakami, Tomio; Miyama, Shoken; Morokuma, Tomoki; Nakata, Fumiaki; Namikawa, Kazuhito; Nakaya, Hidehiko; Nariai, Kyoji; Obuchi, Yoshiyuki; Oishi, Yukie; Okada, Norio; Okura, Yuki; Tait, Philip; Takata, Tadafumi; Tanaka, Yoko; Tanaka, Masayuki; Terai, Tsuyoshi; Tomono, Daigo; Uraguchi, Fumihiro; Usuda, Tomonori; Utsumi, Yousuke; Yamada, Yoshihiko; Yamanoi, Hitomi; Aihara, Hiroaki; Fujimori, Hiroki; Mineo, Sogo; Miyatake, Hironao; Oguri, Masamune; Uchida, Tomohisa; Tanaka, Manobu M.; Yasuda, Naoki; Takada, Masahiro; Murayama, Hitoshi; Nishizawa, Atsushi J.; Sugiyama, Naoshi; Chiba, Masashi; Futamase, Toshifumi; Wang, Shiang-Yu; Chen, Hsin-Yo; Ho, Paul T. P.; Liaw, Eric J. Y.; Chiu, Chi-Fang; Ho, Cheng-Lin; Lai, Tsang-Chih; Lee, Yao-Cheng; Jeng, Dun-Zen; Iwamura, Satoru; Armstrong, Robert; Bickerton, Steve; Bosch, James; Gunn, James E.; Lupton, Robert H.; Loomis, Craig; Price, Paul; Smith, Steward; Strauss, Michael A.; Turner, Edwin L.; Suzuki, Hisanori; Miyazaki, Yasuhito; Muramatsu, Masaharu; Yamamoto, Koei; Endo, Makoto; Ezaki, Yutaka; Ito, Noboru; Kawaguchi, Noboru; Sofuku, Satoshi; Taniike, Tomoaki; Akutsu, Kotaro; Dojo, Naoto; Kasumi, Kazuyuki; Matsuda, Toru; Imoto, Kohei; Miwa, Yoshinori; Suzuki, Masayuki; Takeshi, Kunio; Yokota, Hideo
2018-01-01
The Hyper Suprime-Cam (HSC) is an 870-megapixel prime-focus optical imaging camera for the 8.2 m Subaru telescope. The wide-field corrector delivers sharp images of 0.2" (FWHM) in the HSC-i band over the entire 1.5° diameter field of view. The collimation of the camera with respect to the optical axis of the primary mirror is done with hexapod actuators, the mechanical accuracy of which is a few microns. Analysis of the remaining wavefront error in off-focus stellar images reveals that the collimation of the optical components meets design specifications. While there is a flexure of mechanical components, it also is within the design specification. As a result, the camera achieves seeing-limited imaging on Maunakea most of the time; the median seeing over several years of observing is 0.67" (FWHM) in the i band. The sensors are p-channel, fully depleted CCDs of 200 μm thickness (2048 × 4176 15 μm square pixels), and we employ 116 of them to pave the 50 cm diameter focal plane. The minimum interval between exposures is 34 s, including the time to read out the arrays, transfer data to the control computer, and save them to the hard drive. HSC on Subaru uniquely features a combination of a large aperture, a wide field of view, sharp images and high sensitivity, especially at longer wavelengths, which makes the HSC one of the most powerful observing facilities in the world.
KWFC: four square degrees camera for the Kiso Schmidt Telescope
NASA Astrophysics Data System (ADS)
Sako, Shigeyuki; Aoki, Tsutomu; Doi, Mamoru; Ienaka, Nobuyuki; Kobayashi, Naoto; Matsunaga, Noriyuki; Mito, Hiroyuki; Miyata, Takashi; Morokuma, Tomoki; Nakada, Yoshikazu; Soyano, Takao; Tarusawa, Ken'ichi; Miyazaki, Satoshi; Nakata, Fumiaki; Okada, Norio; Sarugaku, Yuki; Richmond, Michael W.
2012-09-01
The Kiso Wide Field Camera (KWFC) is a facility instrument for the 105-cm Schmidt telescope operated by the Kiso Observatory of the University of Tokyo. This camera has been designed for wide-field observations that take advantage of the large focal-plane area of the Schmidt telescope. Eight CCD chips with a total of 8k x 8k pixels cover a field of view of 2.2 degrees x 2.2 degrees on the sky. The dewar window works as a field-flattener lens, minimizing image distortion across the field of view. Two shutter plates moving in parallel achieve uniform exposures on all the CCD pixels. The KWFC is equipped with a filter exchanger composed of an industrial robotic arm, a filter magazine capable of storing 12 filters, and a filter holder at the focal plane. Both the arm and the magazine are installed inside the tube framework of the telescope without vignetting the beam. Wide-field survey programs searching for supernovae and late-type variable stars began in April 2012. The survey observations are performed with a management software system for the facility instruments, including the telescope and the KWFC. This system automatically carries out observations based on target lists registered in advance and makes appropriate decisions on the implementation of observations by referring to weather conditions and the status of the instruments. Image data obtained in the surveys are processed with pipeline software in real time to search for candidate time-variable sources.
NASA Astrophysics Data System (ADS)
Jackson, Christopher Robert
"Lucky-region" fusion (LRF) is a synthetic imaging technique that has proven successful in enhancing the quality of images distorted by atmospheric turbulence. The LRF algorithm selects sharp regions of an image obtained from a series of short exposure frames, and fuses the sharp regions into a final, improved image. In previous research, the LRF algorithm had been implemented on a PC using the C programming language. However, the PC did not have sufficient sequential processing power to handle real-time extraction, processing and reduction required when the LRF algorithm was applied to real-time video from fast, high-resolution image sensors. This thesis describes two hardware implementations of the LRF algorithm to achieve real-time image processing. The first was created with a VIRTEX-7 field programmable gate array (FPGA). The other developed using the graphics processing unit (GPU) of a NVIDIA GeForce GTX 690 video card. The novelty in the FPGA approach is the creation of a "black box" LRF video processing system with a general camera link input, a user controller interface, and a camera link video output. We also describe a custom hardware simulation environment we have built to test the FPGA LRF implementation. The advantage of the GPU approach is significantly improved development time, integration of image stabilization into the system, and comparable atmospheric turbulence mitigation.
A multipurpose camera system for monitoring Kīlauea Volcano, Hawai'i
Patrick, Matthew R.; Orr, Tim R.; Lee, Lopaka; Moniz, Cyril J.
2015-01-01
We describe a low-cost, compact multipurpose camera system designed for field deployment at active volcanoes that can be used either as a webcam (transmitting images back to an observatory in real-time) or as a time-lapse camera system (storing images onto the camera system for periodic retrieval during field visits). The system also has the capability to acquire high-definition video. The camera system uses a Raspberry Pi single-board computer and a 5-megapixel low-light (near-infrared sensitive) camera, as well as a small Global Positioning System (GPS) module to ensure accurate time-stamping of images. Custom Python scripts control the webcam and GPS unit and handle data management. The inexpensive nature of the system allows it to be installed at hazardous sites where it might be lost. Another major advantage of this camera system is that it provides accurate internal timing (independent of network connection) and, because a full Linux operating system and the Python programming language are available on the camera system itself, it has the versatility to be configured for the specific needs of the user. We describe example deployments of the camera at Kīlauea Volcano, Hawai‘i, to monitor ongoing summit lava lake activity.
NASA Astrophysics Data System (ADS)
Ratzloff, Jeff; Law, Nicholas M.; Fors, Octavi; Wulfken, Philip J.
2015-01-01
We designed, tested, prototyped and built a compact 27-camera robotic telescope that images 10,000 square degrees in 2-minute exposures. We exploit mass-produced interline CCD cameras with Rokinon consumer lenses to economically build a telescope that covers this large part of the sky simultaneously with good enough pixel sampling to avoid the confusion limit over most of the sky. We developed the initial concept into a 3-D mechanical design with the aid of computer modeling programs. Significant design components include the camera assembly-mounting modules, the hemispherical support structure, and the instrument base structure. We simulated flexure and material stress in each of the three main components, which helped us optimize the rigidity and materials selection while reducing weight. The camera mounts are CNC aluminum and the support shell is reinforced fiberglass. Other significant project components include optimizing camera locations, camera alignment, thermal analysis, environmental sealing, wind protection, and ease of access to internal components. The Evryscope will be assembled at UNC Chapel Hill and deployed to CTIO in 2015.
2D Measurements of the Balmer Series in Proto-MPEX using a Fast Visible Camera Setup
NASA Astrophysics Data System (ADS)
Lindquist, Elizabeth G.; Biewer, Theodore M.; Ray, Holly B.
2017-10-01
The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device with densities up to 10^20 m^-3 and temperatures up to 20 eV. Broadband spectral measurements show that the visible emission spectra are due solely to the Balmer lines of deuterium. Monochromatic and RGB color Sanstreak SC1 Edgertronic fast visible cameras capture high-speed video of plasmas in Proto-MPEX. The color camera is equipped with a 450 nm long-pass filter and an internal Bayer filter to view the Dα line at 656 nm on the red channel and the Dβ line at 486 nm on the blue channel. The monochromatic camera has a 434 nm narrow bandpass filter to view the Dγ intensity. In the setup, a 50/50 beam splitter is used so that both cameras image the same region of the plasma discharge. Camera images were aligned to each other by viewing a grid, ensuring 1-pixel registration between the two cameras. A uniform-intensity calibrated white light source was used to perform a pixel-to-pixel relative and an absolute intensity calibration for both cameras. Python scripts combine the dual-camera data, rendering the Dα, Dβ, and Dγ intensity ratios. Observations from Proto-MPEX discharges will be presented. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725.
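The ratio-rendering step can be sketched as follows, assuming co-registered, intensity-calibrated frames as described above; the masking floor is an illustrative choice, not a value from the paper.

```python
import numpy as np

def line_ratio(img_num, img_den, floor=1.0):
    """Pixel-wise Balmer line intensity ratio from two co-registered,
    intensity-calibrated images (e.g. D-alpha over D-beta). Pixels whose
    denominator falls below `floor` counts are masked as NaN so that
    detector noise does not produce spurious ratios."""
    num = np.asarray(img_num, dtype=float)
    den = np.asarray(img_den, dtype=float)
    ratio = np.full_like(num, np.nan)
    ok = den >= floor
    ratio[ok] = num[ok] / den[ok]
    return ratio
```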
Raimondi, V; Agati, G; Cecchi, G; Gomoiu, I; Lognoli, D; Palombi, L
2009-12-07
An optical epifluorescence microscope coupled to a CCD camera, a standard webcam and a microspectrofluorimeter is used to record in vivo real-time changes in the autofluorescence of spores and hyphae of Aspergillus niger, a fungus containing melanin, while exposed to UV irradiation. The results point out major changes in both the signal intensity and the spectral shape of the autofluorescence signal after only a few minutes of exposure, and can contribute to the interpretation of data obtained with other fluorescence techniques, including those, such as GFP labelling, in which endogenous fluorophores constitute a major disturbance.
Software manual for operating particle displacement tracking data acquisition and reduction system
NASA Technical Reports Server (NTRS)
Wernet, Mark P.
1991-01-01
The software manual is presented. The necessary steps required to record, analyze, and reduce Particle Image Velocimetry (PIV) data using the Particle Displacement Tracking (PDT) technique are described. The new PDT system is an all-electronic technique employing a CCD video camera and a large-memory-buffer frame-grabber board to record low-velocity (less than or equal to 20 cm/s) flows. Using a simple encoding scheme, a time sequence of single-exposure images is time-coded into a single image and then processed to track particle displacements and determine 2-D velocity vectors. All of the PDT data acquisition, analysis, and data reduction software is written to run on an 80386 PC.
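The velocity-determination step reduces to displacements divided by the known inter-exposure interval. A minimal sketch, assuming the correspondence of a particle between coded exposures is already resolved (the decoding itself is the system's job):

```python
import numpy as np

def track_velocities(positions, dt):
    """Given centroid positions (x, y) of one particle at consecutive
    coded exposure times, return the 2-D velocity vectors between
    successive exposures as displacement / dt."""
    p = np.asarray(positions, dtype=float)
    return np.diff(p, axis=0) / dt
```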
View of 'Cape Verde' from 'Cape St. Mary' in Mid-Afternoon
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic. The exposures were taken during mid-afternoon lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
View of 'Cape Verde' from 'Cape St. Mary' in Late Morning
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic. The exposures were taken during late-morning lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.
NASA Technical Reports Server (NTRS)
1970-01-01
Results are presented of engineering tests of the Surveyor III television camera, which resided on the moon for 2 and 1/2 years before being brought back to earth by the Apollo XII astronauts. Electric circuits, electrical, mechanical, and optical components and subsystems, the vidicon tube, and a variety of internal materials and surface coatings were examined to determine the effects of lunar exposure. Anomalies and failures uncovered were analyzed. For the most part, the camera parts withstood the extreme environment exceedingly well except where degradation of obsolete parts or suspect components had been anticipated. No significant evidence of cold welding was observed, and the anomalies were largely attributable to causes other than lunar exposure. Very little evidence of micrometeoroid impact was noted. Discoloration of material surfaces, one of the major effects noted, was found to be due to lunar dust contamination and radiation damage. The extensive test data contained in this report are supplemented by results of tests of other Surveyor parts retrieved by the Apollo XII astronauts, which are contained in a companion report.
Coincidence ion imaging with a fast frame camera
NASA Astrophysics Data System (ADS)
Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen
2014-12-01
A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single-anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the PMT processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of the PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched pair of co-fragments (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide.
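The real-time centroiding at the heart of the position measurement can be sketched for a single spot; the published system handles multiple spots per frame and matches their intensities to PMT peak heights, which this minimal version omits.

```python
import numpy as np

def spot_centroid(frame, threshold):
    """Intensity-weighted centroid of the pixels above `threshold` in a
    camera frame, returned as (x, y) in pixel coordinates, or None if
    no pixel exceeds the threshold."""
    f = np.asarray(frame, dtype=float)
    mask = f > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    w = f[mask]
    return (xs @ w / w.sum(), ys @ w / w.sum())
```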
NASA Astrophysics Data System (ADS)
Gauthier, L. R.; Jansen, M. E.; Meyer, J. R.
2014-09-01
Camera motion is a potential problem when a video camera is used to perform dynamic displacement measurements. If the scene camera moves at the wrong time, the apparent motion of the object under study can easily be confused with the real motion of the object. In some cases, it is practically impossible to prevent camera motion, as for instance, when a camera is used outdoors in windy conditions. A method to address this challenge is described that provides an objective means to measure the displacement of an object of interest in the scene, even when the camera itself is moving in an unpredictable fashion at the same time. The main idea is to synchronously measure the motion of the camera and to use those data ex post facto to subtract out the apparent motion in the scene that is caused by the camera motion. The motion of the scene camera is measured by using a reference camera that is rigidly attached to the scene camera and oriented towards a stationary reference object. For instance, this reference object may be on the ground, which is known to be stationary. It is necessary to calibrate the reference camera by simultaneously measuring the scene images and the reference images at times when it is known that the scene object is stationary and the camera is moving. These data are used to map camera movement data to apparent scene movement data in pixel space and subsequently used to remove the camera movement from the scene measurements.
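The calibration-and-subtraction idea can be sketched as a linear least-squares map from reference-camera displacement to apparent scene displacement. The linear model (2x2 gain matrix plus offset) is an assumption for illustration; the abstract does not state the exact form of the mapping.

```python
import numpy as np

def fit_camera_map(ref_disp, scene_disp):
    """Least-squares linear map (2x2 gains plus offset) from
    reference-camera pixel displacement to apparent scene displacement,
    fitted on calibration data where the scene object is known to be
    stationary so all apparent scene motion is camera-induced."""
    R = np.hstack([np.asarray(ref_disp, float), np.ones((len(ref_disp), 1))])
    coeffs, *_ = np.linalg.lstsq(R, np.asarray(scene_disp, float), rcond=None)
    return coeffs  # shape (3, 2): x-gain row, y-gain row, offset row

def correct_scene(scene_disp, ref_disp, coeffs):
    """Subtract the predicted camera-induced apparent motion from the
    scene measurements, leaving the object's real displacement."""
    R = np.hstack([np.asarray(ref_disp, float), np.ones((len(ref_disp), 1))])
    return np.asarray(scene_disp, float) - R @ coeffs
```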
Environmental estrogen(s) induced swimming behavioural alterations in adult zebrafish (Danio rerio).
Goundadkar, Basavaraj B; Katti, Pancharatna
2017-09-01
The present study investigates the effects of long-term (75 days) exposure to environmental estrogens (EE) on the swimming behaviour of zebrafish (Danio rerio). Adult zebrafish were exposed semi-statically to media containing commonly detected estrogenic water contaminants (EE2, DES and BPA) at a concentration (5 ng/L) much lower than environmentally recorded levels. Time spent swimming, surface preference, and the patterns and path of swimming were recorded (6 min) for each fish using two video cameras on days 15, 30, 60 and 75. Video clips were analysed using a software program. Results indicate that chronic exposure to EE leads to increased body weight and size in females, reduced (P<0.05) swimming time, delayed latency, increased (P<0.05) immobility, erratic movements and freezing episodes. We conclude that estrogenic contamination of natural aquatic systems induces alterations in locomotor behaviour and associated physiological disturbances in inhabitant fish fauna. Copyright © 2017 Elsevier B.V. All rights reserved.
Evaluation of a high framerate multi-exposure laser speckle contrast imaging setup
NASA Astrophysics Data System (ADS)
Hultman, Martin; Fredriksson, Ingemar; Strömberg, Tomas; Larsson, Marcus
2018-02-01
We present a first evaluation of a new multi-exposure laser speckle contrast imaging (MELSCI) system for assessing spatial variations in microcirculatory perfusion. The MELSCI system is based on a 1000 frames per second 1-megapixel camera connected to a field-programmable gate array (FPGA) capable of producing MELSCI data in real time. The imaging system is evaluated against a single-point laser Doppler flowmetry (LDF) system during occlusion-release provocations of the arm in five subjects. Perfusion is calculated from MELSCI data using current state-of-the-art inverse models. The analysis displayed a good agreement between measured and modeled data, with an average error below 6%. This strongly indicates that the applied model is capable of accurately describing the MELSCI data and that the acquired data are of high quality. Readings from the occlusion-release provocation showed that the MELSCI perfusion was significantly correlated (R=0.83) with the single-point LDF perfusion, clearly outperforming perfusion estimations based on a single exposure time. We conclude that the MELSCI system provides blood flow images of enhanced quality, taking us one step closer to a system that can accurately monitor dynamic changes in skin perfusion over a large area in real time.
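Speckle contrast itself is the windowed ratio K = σ/⟨I⟩, evaluated in MELSCI for every synthesized exposure time. A plain sliding-window sketch (illustrative only, not the FPGA pipeline; the window size is an arbitrary choice):

```python
import numpy as np

def speckle_contrast(img, win=7):
    """Local speckle contrast K = sigma / mean computed in a sliding
    window over a raw speckle image. Lower K indicates more blurring of
    the speckle pattern during the exposure, i.e. higher flow."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    out = np.empty((h - win + 1, w - win + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = img[y:y+win, x:x+win]
            out[y, x] = patch.std() / patch.mean()
    return out
```

A production system would vectorize this with summed-area tables or convolutions rather than Python loops.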
NASA Astrophysics Data System (ADS)
Peltoniemi, Mikko; Aurela, Mika; Böttcher, Kristin; Kolari, Pasi; Loehr, John; Karhu, Jouni; Linkosalmi, Maiju; Melih Tanis, Cemal; Tuovinen, Juha-Pekka; Nadir Arslan, Ali
2018-01-01
In recent years, monitoring of the status of ecosystems using low-cost web (IP) or time lapse cameras has received wide interest. With broad spatial coverage and high temporal resolution, networked cameras can provide information about snow cover and vegetation status, serve as ground truths to Earth observations and be useful for gap-filling of cloudy areas in Earth observation time series. Networked cameras can also play an important role in supplementing laborious phenological field surveys and citizen science projects, which also suffer from observer-dependent observation bias. We established a network of digital surveillance cameras for automated monitoring of phenological activity of vegetation and snow cover in the boreal ecosystems of Finland. Cameras were mounted at 14 sites, each site having 1-3 cameras. Here, we document the network, basic camera information and access to images in the permanent data repository (http://www.zenodo.org/communities/phenology_camera/). Individual DOI-referenced image time series consist of half-hourly images collected between 2014 and 2016 (https://doi.org/10.5281/zenodo.1066862). Additionally, we present an example of a colour index time series derived from images from two contrasting sites.
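A typical colour index derived from such phenology-camera images is the green chromatic coordinate, GCC = G/(R+G+B). The sketch below assumes this index; the text mentions a colour index time series without naming it.

```python
import numpy as np

def green_chromatic_coordinate(rgb, roi=None):
    """Green chromatic coordinate GCC = G / (R + G + B), computed from
    the mean channel values of an (H, W, 3) image. `roi` is an optional
    boolean mask selecting the vegetation region of interest; by default
    the whole frame is used."""
    rgb = np.asarray(rgb, dtype=float)
    if roi is None:
        roi = np.ones(rgb.shape[:2], dtype=bool)
    r, g, b = (rgb[..., i][roi].mean() for i in range(3))
    return g / (r + g + b)
```

Applied to each half-hourly image, this yields the kind of greenness time series used to track vegetation phenology.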
Helmet-Cam: tool for assessing miners’ respirable dust exposure
Cecala, A.B.; Reed, W.R.; Joy, G.J.; Westmoreland, S.C.; O’Brien, A.D.
2015-01-01
Video technology coupled with datalogging exposure monitors has been used to evaluate worker exposure to different types of contaminants. However, previous applications of this technology used a stationary video camera to record the worker's activity while the worker wore some type of contaminant monitor. These techniques are not applicable to mobile workers in the mining industry because of their need to move around the operation while performing their duties. The Helmet-Cam is a recently developed exposure assessment tool that integrates a person-wearable video recorder with a datalogging dust monitor. These are worn by the miner in a backpack, safety belt or safety vest to identify areas or job tasks with elevated exposure. After a miner performs his or her job while wearing the unit, the video and dust exposure data files are downloaded to a computer and merged through a NIOSH-developed computer software program called Enhanced Video Analysis of Dust Exposure (EVADE). By providing synchronized playback of the merged video footage and dust exposure data, the EVADE software allows for the assessment and identification of key work areas and processes, as well as work tasks, that significantly impact a worker's personal respirable dust exposure. The Helmet-Cam technology has been tested at a number of metal/nonmetal mining operations and has proven to be a valuable assessment tool. Mining companies wishing to use this technique can purchase a commercially available video camera and an instantaneous dust monitor to obtain the necessary data, and the NIOSH-developed EVADE software will be available for download at no cost on the NIOSH website. PMID:26380529
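Synchronized playback depends on aligning each video frame with the dust reading in effect at that moment. A minimal sketch of such time alignment (EVADE's actual merge logic is not described here; shared-clock timestamps in seconds are assumed):

```python
import bisect

def align_dust_to_frames(frame_times, dust_times, dust_values):
    """For each video frame timestamp, pick the most recent dust-monitor
    reading at or before it (None for frames preceding the first
    reading). `dust_times` must be sorted ascending."""
    out = []
    for t in frame_times:
        i = bisect.bisect_right(dust_times, t) - 1
        out.append(dust_values[i] if i >= 0 else None)
    return out
```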
Characterization of a smartphone camera's response to ultraviolet A radiation.
Igoe, Damien; Parisi, Alfio; Carter, Brad
2013-01-01
As part of a wider study into the use of smartphones as solar ultraviolet radiation monitors, this article characterizes the ultraviolet A (UVA; 320-400 nm) response of a consumer complementary metal oxide semiconductor (CMOS)-based smartphone image sensor in a controlled laboratory environment. The CMOS image sensor in the camera possesses inherent sensitivity to UVA, and despite the attenuation due to the lens and the neutral density and wavelength-specific bandpass filters, the measured UVA irradiances relative to the incident irradiances range from 0.0065% at 380 nm to 0.0051% at 340 nm. In addition, the sensor demonstrates a predictable response to low-intensity discrete UVA stimuli that can be modelled using the ratio of recorded digital values to the incident UVA irradiance for a given automatic exposure time, resulting in measurement errors that are typically less than 5%. Our results support the idea that smartphones can be used for scientific monitoring of UVA radiation. © 2012 Wiley Periodicals, Inc. Photochemistry and Photobiology © 2012 The American Society of Photobiology.
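The response model amounts to a single gain relating the recorded digital value to irradiance times exposure time. A least-squares sketch under that assumed proportionality (a simplification of the paper's ratio-based model):

```python
import numpy as np

def fit_uva_response(irradiance, exposure, digital):
    """Fit a single gain k in the model digital = k * irradiance * exposure
    by least squares, from calibration measurements at known incident
    UVA irradiances and automatic exposure times."""
    x = np.asarray(irradiance, float) * np.asarray(exposure, float)
    y = np.asarray(digital, float)
    return float(x @ y / (x @ x))

def predict_digital(k, irradiance, exposure):
    """Predicted sensor digital value under the fitted linear model."""
    return k * irradiance * exposure
```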
VizieR Online Data Catalog: HST FGS-1r parallaxes for 8 metal-poor stars (Chaboyer+, 2017)
NASA Astrophysics Data System (ADS)
Chaboyer, B.; McArthur, B. E.; O'Malley, E.; Benedict, G. F.; Feiden, G. A.; Harrison, T. E.; McWilliam, A.; Nelan, E. P.; Patterson, R. J.; Sarajedini, A.
2017-08-01
Each program star was observed with the HST Advanced Camera for Surveys-Wide Field Camera (ACS/WFC) in the F606W and F814W filters. The CTE-corrected ACS/WFC images for the program stars were retrieved from MAST. The instrumental magnitudes were corrected for exposure time, matched to form colors, and calibrated to the VEGAMag and ground-based VI systems using the Sirianni+ (2005PASP..117.1049S) photometric transformations. Ground-based photometry for all of our program stars was obtained using the New Mexico State University (NMSU) 1 m telescope, the MDM 1.3 m telescope, and the SMARTS 0.9 m telescope. See appendix A1 for further details. We used HST FGS-1r, a two-axis interferometer, to make the astrometric observations. Eighty-nine orbits of HST astrometric observations were made between 2008 December and 2013 June. Every orbit contained several observations of the target and surrounding reference stars. (4 data files).
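The exposure-time correction of instrumental magnitudes follows the standard relation m = -2.5 log10(counts/exptime) + ZP. A sketch with an arbitrary placeholder zeropoint; the actual VEGAMag calibration uses the cited Sirianni+ (2005) transformations.

```python
import math

def instrumental_magnitude(counts, exptime, zeropoint=25.0):
    """Exposure-time-corrected instrumental magnitude,
    m = -2.5 * log10(counts / exptime) + ZP. The zeropoint here is a
    placeholder, not a calibrated ACS/WFC value."""
    return -2.5 * math.log10(counts / exptime) + zeropoint
```

Doubling the exposure time at fixed source flux doubles the counts and leaves the corrected magnitude unchanged, which is the point of the correction.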
An accurate registration technique for distorted images
NASA Technical Reports Server (NTRS)
Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis
1990-01-01
Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.
Sensors for 3D Imaging: Metric Evaluation and Calibration of a CCD/CMOS Time-of-Flight Camera.
Chiabrando, Filiberto; Chiabrando, Roberto; Piatti, Dario; Rinaudo, Fulvio
2009-01-01
3D imaging with Time-of-Flight (ToF) cameras is a promising recent technique which allows 3D point clouds to be acquired at video frame rates. However, the distance measurements of these devices are often affected by systematic errors which decrease the quality of the acquired data. In order to evaluate these errors, experimental tests on a CCD/CMOS ToF camera sensor, the SwissRanger (SR)-4000 camera, were performed and are reported in this paper. Two main aspects are treated. The first is the calibration of the distance measurements of the SR-4000 camera, which covers the evaluation of the camera warm-up period, the assessment of the distance measurement error, and a study of the influence of camera orientation with respect to the observed object on the distance measurements. The second is the photogrammetric calibration of the amplitude images delivered by the camera, using a purpose-built multi-resolution field made of high-contrast targets.
High-frame-rate infrared and visible cameras for test range instrumentation
NASA Astrophysics Data System (ADS)
Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.
1995-09-01
Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.
Age estimation of bloodstains using smartphones and digital image analysis.
Thanakiatkrai, Phuvadol; Yaodam, Alisa; Kitpipit, Thitika
2013-12-10
Recent studies on bloodstains have focused on determining the time since deposition of bloodstains, which can provide useful temporal information to forensic investigations. This study is the first to use smartphone cameras in combination with a truly low-cost illumination system as a tool to estimate the age of bloodstains. Bloodstains were deposited on various substrates and photographed with a smartphone camera. Three smartphones (Samsung Galaxy S Plus, Apple iPhone 4, and Apple iPad 2) were compared. The environmental effects - temperature, humidity, light exposure, and anticoagulant - on the bloodstain age estimation process were explored. The color values from the digital images were extracted and correlated with time since deposition. Magenta had the highest correlation (R(2)=0.966) and was used in subsequent experiments. The Samsung Galaxy S Plus was the most suitable smartphone, as its magenta value decreased exponentially with increasing time and had the highest repeatability (low variation within and between pictures). The quantifiable color change observed is consistent with the well-established hemoglobin denaturation process. Using a statistical classification technique called Random Forests™, we could predict bloodstain age accurately up to 42 days with an error rate of 12%. Additionally, the ages of forty blind stains were all correctly predicted, and 83% of mock casework samples were correctly classified. No within- and between-person variations were observed (p>0.05), while smartphone camera, temperature, humidity, and substrate color influenced the age determination process in different ways. Our technique provides a cheap, rapid, easy-to-use, and truly portable alternative to more complicated analysis using specialized equipment, e.g. spectroscopy and HPLC. No training is necessary with our method, and we envision a smartphone application that could take user inputs of environmental factors and provide an accurate estimate of bloodstain age.
Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
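The exponential magenta decay reported in the bloodstain-age abstract above suggests a simple age-inversion sketch. The decay constant and channel values below are hypothetical, and the study itself used a Random Forests classifier rather than this two-point exponential fit.

```python
import math

# Sketch: if the magenta channel decays as M(t) = M0 * exp(-k t), the age of
# a stain can be recovered by inverting the model. M0 and k here come from
# two made-up (time, magenta) samples, not from the study's fitted data.

def fit_exponential(t1, m1, t2, m2):
    """Fit M(t) = M0 * exp(-k t) through two (time, magenta) samples."""
    k = math.log(m1 / m2) / (t2 - t1)
    M0 = m1 * math.exp(k * t1)
    return M0, k

def predict_age(M0, k, magenta):
    """Invert the decay model to estimate time since deposition."""
    return math.log(M0 / magenta) / k

M0, k = fit_exponential(0.0, 200.0, 10.0, 120.0)   # ages in days
age = predict_age(M0, k, 150.0)                    # ≈5.6 days for this toy fit
```

In practice a classifier over many environmental covariates, as the study describes, is more robust than inverting a single-channel curve.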
Evaluation of High Dynamic Range Photography as a Luminance Mapping Technique
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inanici, Mehlika; Galvin, Jim
2004-12-30
The potential, limitations, and applicability of the High Dynamic Range (HDR) photography technique are evaluated as a luminance mapping tool. Multiple exposure photographs of static scenes are taken with a Nikon 5400 digital camera to capture the wide luminance variation within the scenes. The camera response function is computationally derived using the Photosphere software, and is used to fuse the multiple photographs into HDR images. The vignetting effect and point spread function of the camera and lens system are determined. Laboratory and field studies have shown that the pixel values in the HDR photographs can correspond to the physical quantity of luminance with reasonable precision and repeatability.
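The multi-exposure fusion step can be sketched as a weighted average of per-exposure radiance estimates. For brevity this sketch assumes a linear camera response; Photosphere, as noted above, derives the actual response function, and the weighting scheme here is a common hat-shaped choice, not necessarily the one the authors used.

```python
# Sketch of HDR fusion: each pixel's radiance is estimated as a weighted
# average of (pixel_value / exposure_time) across the exposure bracket,
# down-weighting values near the sensor's limits where clipping dominates.

def hat_weight(z, z_min=0, z_max=255):
    """Triangle weight favouring mid-range pixel values."""
    mid = 0.5 * (z_min + z_max)
    return (z - z_min) if z <= mid else (z_max - z)

def fuse_pixel(samples):
    """samples: list of (pixel_value, exposure_time_seconds) for one pixel."""
    num = sum(hat_weight(z) * z / t for z, t in samples)
    den = sum(hat_weight(z) for z, _ in samples)
    return num / den

# the same scene point captured at three exposures (hypothetical values);
# a perfectly linear sensor yields the same z/t ratio at every exposure
radiance = fuse_pixel([(40, 0.01), (80, 0.02), (160, 0.04)])
```

Relating these relative radiances to absolute luminance then requires one photometric calibration measurement, which is what the laboratory studies above validate.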
Winter sky brightness and cloud cover at Dome A, Antarctica
NASA Astrophysics Data System (ADS)
Moore, Anna M.; Yang, Yi; Fu, Jianning; Ashley, Michael C. B.; Cui, Xiangqun; Feng, Long Long; Gong, Xuefei; Hu, Zhongwen; Lawrence, Jon S.; Luong-Van, Daniel M.; Riddle, Reed; Shang, Zhaohui; Sims, Geoff; Storey, John W. V.; Tothill, Nicholas F. H.; Travouillon, Tony; Wang, Lifan; Yang, Huigen; Yang, Ji; Zhou, Xu; Zhu, Zhenxi
2013-01-01
At the summit of the Antarctic plateau, Dome A offers an intriguing location for future large-scale optical astronomical observatories. The Gattini Dome A project was created to measure the optical sky brightness and large-area cloud cover of the winter-time sky above this high-altitude Antarctic site. The wide-field camera and multi-filter system was installed on the PLATO instrument module as part of the Chinese-led traverse to Dome A in January 2008. This automated wide-field camera consists of an Apogee U4000 interline CCD coupled to a Nikon fisheye lens enclosed in a heated container with a glass window. The system contains a filter mechanism providing a suite of standard astronomical photometric filters (Bessell B, V, R) and a long-pass red filter for the detection and monitoring of airglow emission. The system operated continuously throughout the 2009 and 2011 winter seasons and part-way through the 2010 season, recording long-exposure images sequentially for each filter. We have in hand one complete winter-time dataset (2009) returned via a manned traverse. We present here the first measurements of sky brightness in the photometric V band, the cloud cover statistics measured so far, and an estimate of the extinction.
Development of the FPI+ as facility science instrument for SOFIA cycle four observations
NASA Astrophysics Data System (ADS)
Pfüller, Enrico; Wiedemann, Manuel; Wolf, Jürgen; Krabbe, Alfred
2016-08-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a heavily modified Boeing 747SP aircraft accommodating a 2.5 m infrared telescope. This airborne observation platform takes astronomers to flight altitudes of up to 13.7 km (45,000 ft) and therefore allows an unobstructed view of the infrared universe at wavelengths between 0.3 μm and 1600 μm. SOFIA is currently completing its fourth cycle of observations and utilizes eight different imaging and spectroscopic science instruments. New instruments for SOFIA's cycle 4 observations are the High-resolution Airborne Wideband Camera-plus (HAWC+) and the Focal Plane Imager (FPI+). The latter is an integral part of the telescope assembly and is used on every SOFIA flight to ensure precise tracking of the desired targets. The FPI+ is used as a visual-light photometer in its role as facility science instrument. Since the upgrade of the FPI camera and electronics in 2013, it uses a thermo-electrically cooled science-grade EM-CCD sensor inside a commercial off-the-shelf Andor camera. The back-illuminated sensor has a peak quantum efficiency of 95% and the dark current is as low as 0.01 e-/pix/sec. With this new hardware the telescope has successfully tracked on 16th-magnitude stars, and thus the sky coverage, i.e. the fraction of the sky that has suitable tracking stars, has increased to 99%. Before its use as an integrated tracking imager, the same type of camera was used as a standalone diagnostic tool to analyze the telescope pointing stability at frequencies up to 200 Hz (imaging at 400 fps). These measurements help to improve the telescope pointing control algorithms and therefore reduce the image jitter in the focal plane. Science instruments benefit from this improvement with smaller image sizes for longer exposure times. The FPI+ has also been used to support astronomical observations such as stellar occultations by the dwarf planet Pluto and a number of exoplanet transits.
The observation of occultation events in particular benefits from the high camera sensitivity, fast readout capability and low read noise, making it possible to achieve high time resolution in the photometric light curves. This paper gives an overview of the development from the standalone diagnostic camera to the upgraded guiding/tracking camera fully integrated into the telescope, which still offers the diagnostic capabilities, and finally to its use as a facility science instrument on SOFIA.
Leveraging traffic and surveillance video cameras for urban traffic.
DOT National Transportation Integrated Search
2014-12-01
The objective of this project was to investigate the use of existing video resources, such as traffic : cameras, police cameras, red light cameras, and security cameras for the long-term, real-time : collection of traffic statistics. An additional ob...
Multifocal Fluorescence Microscope for Fast Optical Recordings of Neuronal Action Potentials
Shtrahman, Matthew; Aharoni, Daniel B.; Hardy, Nicholas F.; Buonomano, Dean V.; Arisaka, Katsushi; Otis, Thomas S.
2015-01-01
In recent years, optical sensors for tracking neural activity have been developed and offer great utility. However, developing microscopy techniques that have the several-kHz bandwidth necessary to reliably capture optically reported action potentials (APs) at multiple locations in parallel remains a significant challenge. Here we describe a microscope, novel to our knowledge, optimized to measure spatially distributed optical signals with submillisecond and near diffraction-limited resolution. Our design uses a spatial light modulator to generate patterned illumination to simultaneously excite multiple user-defined targets. A galvanometer-driven mirror in the emission path streaks the fluorescence emanating from each excitation point during the camera exposure, using unused camera pixels to capture time-varying fluorescence at rates that are ∼1000 times faster than the camera's native frame rate. We demonstrate that this approach is capable of recording Ca2+ transients resulting from APs in neurons labeled with the Ca2+ sensor Oregon Green Bapta-1 (OGB-1), and can localize the timing of these events with millisecond resolution. Furthermore, optically reported APs can be detected with the voltage sensitive dye DiO-DPA in multiple locations within a neuron with a signal/noise ratio up to ∼40, resolving delays in arrival time along dendrites. Thus, the microscope provides a powerful tool for photometric measurements of dynamics requiring submillisecond sampling at multiple locations. PMID:25650920
Pulsed Holographic Nondestructive Testing On Aircraft
NASA Astrophysics Data System (ADS)
Fagot, Hubert; Smigielski, Paul; Albe, Felix; Arnaud, Jean-Louis
1983-06-01
A holographic camera composed of two ruby lasers was built at ISL. It provides double-exposure holograms with an adjustable time interval ranging from a few ns to infinity. Various aircraft structures were first tested at ISL under laboratory conditions: honeycomb panels, wings, etc. Industrial tests on a military aircraft undergoing maintenance checks were performed in a hangar of the SNIAS at Saint-Nazaire: wings, the trap-door of the rear landing gear, the air-brake, etc. Electromechanical shocks were used to make the structure vibrate and to allow fast triggering of the lasers. This avoids disturbance due to ambient noise and vibrations.
Direct measurements of protein-stabilized gold nanoparticle interactions.
Eichmann, Shannon L; Bevan, Michael A
2010-09-21
We report integrated video and total internal reflection microscopy measurements of protein stabilized 110 nm Au nanoparticles confined in 280 nm gaps in physiological media. Measured potential energy profiles display quantitative agreement with Brownian dynamic simulations that include hydrodynamic interactions and camera exposure time and noise effects. Our results demonstrate agreement between measured nonspecific van der Waals and adsorbed protein interactions with theoretical potentials. Confined, lateral nanoparticle diffusivity measurements also display excellent agreement with predictions. These findings provide a basis to interrogate specific biomacromolecular interactions in similar experimental configurations and to design future improved measurement methods.
Towards real time speckle controlled retinal photocoagulation
NASA Astrophysics Data System (ADS)
Bliedtner, Katharina; Seifert, Eric; Stockmann, Leoni; Effe, Lisa; Brinkmann, Ralf
2016-03-01
Photocoagulation is a laser treatment widely used for the therapy of several retinal diseases. Intra- and inter-individual variations of the ocular transmission, light scattering and the retinal absorption make it impossible to achieve a uniform effective exposure, and hence a uniform damage, throughout the therapy. Real-time monitoring and control of the induced damage is therefore highly desirable. Here, an approach to realizing real-time optical feedback using dynamic speckle analysis is presented. A 532 nm continuous-wave Nd:YAG laser is used for coagulation. During coagulation, speckle dynamics are monitored under coherent object illumination from a 633 nm HeNe laser and analyzed by a CMOS camera with a frame rate of up to 1 kHz. A control system must determine, within a fraction of the exposure time, whether the desired damage has been achieved, so that it can shut down the laser. Here we use a fast and simple adaptation of the generalized difference algorithm to analyze the speckle movements. This algorithm runs on an FPGA and calculates a feedback value that is correlated with the thermally and coagulation-induced tissue motion and thus the achieved damage. For different spot sizes (50-200 μm) and different exposure times (50-500 ms), the algorithm is able to discriminate between different categories of retinal pigment epithelial damage ex vivo in enucleated porcine eyes. Furthermore, in vivo experiments in rabbits show the ability of the system to detect tissue changes in living tissue during coagulation.
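A generalized-difference speckle measure of the kind referenced above can be sketched as summed pairwise absolute frame differences: the value grows with speckle motion and so tracks tissue change. The toy 1-D "frames" below are illustrative; the paper's FPGA adaptation is necessarily faster and more elaborate.

```python
# Sketch of a generalized-difference measure over a short speckle frame stack.
# Static speckle gives zero; a boiling (moving) pattern gives a positive value
# that can serve as a scalar feedback signal for shutting down the laser.

def generalized_difference(frames):
    """Summed absolute pairwise intensity differences across a frame stack."""
    gd = 0.0
    for i in range(len(frames)):
        for j in range(i + 1, len(frames)):
            gd += sum(abs(a - b) for a, b in zip(frames[i], frames[j]))
    return gd

static = [[1, 2, 3], [1, 2, 3], [1, 2, 3]]   # no speckle motion
moving = [[1, 2, 3], [3, 1, 2], [2, 3, 1]]   # shifting speckle pattern
feedback = generalized_difference(moving) - generalized_difference(static)
```

A real-time variant would update the sum incrementally against the previous frame only, which is the kind of simplification the abstract's "fast and simple adaptation" implies.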
Coincidence ion imaging with a fast frame camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei
2014-12-15
A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single-anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the PMT processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of the PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide.
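The multi-hit correlation step described above can be sketched as rank-matching camera spot intensities against time-of-flight peak heights: the brightest spot is paired with the tallest peak, and so on. The spot and peak values below are invented, and the real pipeline also performs real-time centroiding before this step.

```python
# Sketch of correlating ion spots on a camera frame with TOF peaks from the
# PMT/digitizer, assigning each (x, y) position an arrival time by intensity
# rank. Values are hypothetical illustration data.

def correlate_hits(spots, peaks):
    """spots: [(x, y, intensity)]; peaks: [(tof_seconds, height)].
    Returns [(x, y, tof)] by pairing intensity rank with height rank."""
    by_intensity = sorted(spots, key=lambda s: -s[2])
    by_height = sorted(peaks, key=lambda p: -p[1])
    return [(x, y, tof) for (x, y, _), (tof, _) in zip(by_intensity, by_height)]

hits = correlate_hits(
    spots=[(10, 12, 300), (40, 8, 900)],
    peaks=[(2.5e-6, 0.8), (1.1e-6, 0.3)],
)
```

Rank matching works when spot brightness and peak height share a common origin (deposited charge); ambiguous near-equal intensities would need an additional tie-breaking criterion.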
Tanaka, Hirokazu; Chikamori, Taishiro; Hida, Satoshi; Uchida, Kenji; Igarashi, Yuko; Yokoyama, Tsuyoshi; Takahashi, Masaki; Shiba, Chie; Yoshimura, Mana; Tokuuye, Koichi; Yamashina, Akira
2013-01-01
Cadmium-zinc-telluride (CZT) solid-state detectors have been recently introduced into the field of myocardial perfusion imaging. The aim of this study was to prospectively compare the diagnostic performance of the CZT high-speed gamma camera (Discovery NM 530c) with that of the standard 3-head gamma camera in the same group of patients. The study group consisted of 150 consecutive patients who underwent a 1-day stress-rest (99m)Tc-sestamibi or tetrofosmin imaging protocol. Image acquisition was performed first on a standard gamma camera with a 15-min scan time each for stress and for rest. All scans were immediately repeated on a CZT camera with a 5-min scan time for stress and a 3-min scan time for rest, using list mode. The correlations between the CZT camera and the standard camera for perfusion and function analyses were strong within narrow Bland-Altman limits of agreement. Using list mode analysis, image quality for stress was rated as good or excellent in 97% of the 3-min scans, and in 100% of the ≥4-min scans. For CZT scans at rest, similarly, image quality was rated as good or excellent in 94% of the 1-min scans, and in 100% of the ≥2-min scans. The novel CZT camera provides excellent image quality, which is equivalent to standard myocardial single-photon emission computed tomography, despite a short scan time of less than half of the standard time.
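The Bland-Altman agreement analysis mentioned above computes a bias (mean of paired differences) and limits of agreement (bias ± 1.96 standard deviations). The paired perfusion scores below are hypothetical, not the study's data.

```python
import statistics

# Sketch of a Bland-Altman comparison between two cameras' paired scores:
# narrow limits of agreement around a near-zero bias indicate strong
# agreement, as reported for the CZT vs. standard camera above.

def bland_altman(a, b):
    """Return (bias, lower limit, upper limit) for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

czt      = [12, 8, 15, 6, 10]    # hypothetical summed perfusion scores
standard = [11, 9, 14, 6, 11]
bias, lo, hi = bland_altman(czt, standard)
```

Plotting the differences against the pairwise means, which this sketch omits, is also standard and reveals whether disagreement scales with score magnitude.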
Thermal Design of the Instrument for the Transiting Exoplanet Survey Satellite
NASA Technical Reports Server (NTRS)
Allen, Gregory D.
2016-01-01
The TESS observatory is a two-year NASA Explorer mission which will use a set of four cameras to discover exoplanets. It will be placed in a high-Earth orbit with a period of 13.7 days and will be unaffected by temperature disturbances caused by environmental heating from the Earth. The cameras use their stray-light baffles to passively cool the cameras and in turn the CCDs in order to maintain operational temperatures. The design has been carefully analyzed to maximize temperature stability. The analysis shows that the design keeps the cameras and their components within their temperature ranges, which will help make it a successful mission. It will also meet its survival requirement of sustaining exposure to a five-hour eclipse. Official validation and verification planning is underway and will be performed as the system is built up. It is slated for launch in 2017.
Orr, Tim R.; Hoblitt, Richard P.
2008-01-01
Volcanoes can be difficult to study up close. Because it may be days, weeks, or even years between important events, direct observation is often impractical. In addition, volcanoes are often inaccessible due to their remote location and (or) harsh environmental conditions. An eruption adds another level of complexity to what already may be a difficult and dangerous situation. For these reasons, scientists at the U.S. Geological Survey (USGS) Hawaiian Volcano Observatory (HVO) have, for years, built camera systems to act as surrogate eyes. With the recent advances in digital-camera technology, these eyes are rapidly improving. One type of photographic monitoring involves the use of near-real-time network-enabled cameras installed at permanent sites (Hoblitt and others, in press). Time-lapse camera systems, on the other hand, provide an inexpensive, easily transportable monitoring option that offers more versatility in site location. While time-lapse systems lack near-real-time capability, they provide higher image resolution and can be rapidly deployed in areas where the sophisticated telemetry required by the networked camera systems is not practical. This report describes the latest generation (as of 2008) of time-lapse camera system used by HVO for photograph acquisition in remote and hazardous sites on Kilauea Volcano.
Development of two-framing camera with large format and ultrahigh speed
NASA Astrophysics Data System (ADS)
Jiang, Xiaoguo; Wang, Yuan; Wang, Yi
2012-10-01
High-speed imaging facilities are important and necessary for time-resolved measurement systems with multi-framing capability. A framing camera which satisfies the demands of both high speed and large format needs to be specially developed for ultrahigh-speed research. A two-framing camera system with high sensitivity and time resolution has been developed and used for the diagnosis of electron beam parameters of the Dragon-I linear induction accelerator (LIA). The camera system, which adopts the principle of light beam splitting in the image space behind a lens of long focal length, mainly consists of a lens-coupled gated image intensifier, a CCD camera and a high-speed shutter trigger device based on a programmable integrated circuit. The fastest gating time is about 3 ns, and the interval between the two frames can be adjusted discretely in steps of 0.5 ns. Both the gating time and the interval time can be tuned independently up to a maximum of about 1 s. Two images, each 1024×1024 in size, can be captured simultaneously by the camera. Besides, this camera system possesses good linearity, uniform spatial response and an equivalent background illumination as low as 5 electrons/pix/sec, which fully meets the measurement requirements of the Dragon-I LIA.
NASA Technical Reports Server (NTRS)
Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike
2014-01-01
The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with a gain of 2.0 +/- 0.5, less than or equal to 25 e- readout noise, less than or equal to 10 e-/second/pixel dark current, and less than 0.1 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera: system gain, dark current, read noise, and residual non-linearity.
Astrometric Calibration and Performance of the Dark Energy Camera
Bernstein, G. M.; Armstrong, R.; Plazas, A. A.; ...
2017-05-30
We characterize the ability of the Dark Energy Camera (DECam) to perform relative astrometry across its 500 Mpix, 3 deg^2 science field of view, and across 4 years of operation. This is done using internal comparisons of ~4x10^7 measurements of high-S/N stellar images obtained in repeat visits to fields of moderate stellar density, with the telescope dithered to move the sources around the array. An empirical astrometric model includes terms for: optical distortions; stray electric fields in the CCD detectors; chromatic terms in the instrumental and atmospheric optics; shifts in CCD relative positions of up to ≈10 μm when the DECam temperature cycles; and low-order distortions to each exposure from changes in atmospheric refraction and telescope alignment. Errors in this astrometric model are dominated by stochastic variations with typical amplitudes of 10-30 mas (in a 30 s exposure) and 5-10 arcmin coherence length, plausibly attributed to Kolmogorov-spectrum atmospheric turbulence. The size of these atmospheric distortions is not closely related to the seeing. Given an astrometric reference catalog at density ≈0.7 arcmin^-2, e.g. from Gaia, the typical atmospheric distortions can be interpolated to ≈7 mas RMS accuracy (for 30 s exposures) with 1 arcmin coherence length for residual errors. Remaining detectable error contributors are 2-4 mas RMS from unmodelled stray electric fields in the devices, and another 2-4 mas RMS from focal plane shifts between camera thermal cycles. Thus the astrometric solution for a single DECam exposure is accurate to 3-6 mas (≈0.02 pixels, or ≈300 nm) on the focal plane, plus the stochastic atmospheric distortion.
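Interpolating the stochastic atmospheric distortion field from a reference catalog, as described above, can be sketched with inverse-distance weighting. The paper does not specify this particular scheme, and the positions (arcmin) and residuals (mas) below are invented for illustration.

```python
# Sketch: estimate the astrometric residual at a target position from
# residuals measured at nearby reference stars (e.g. Gaia matches), using
# inverse-distance-squared weights. A real solution might instead use
# Gaussian-process or turbulence-spectrum-aware interpolation.

def interpolate_residual(refs, x, y, power=2):
    """refs: [(x_arcmin, y_arcmin, residual_mas)]; estimate residual at (x, y)."""
    num = den = 0.0
    for rx, ry, dx in refs:
        d2 = (rx - x) ** 2 + (ry - y) ** 2
        if d2 == 0.0:
            return dx                      # exactly on a reference star
        w = 1.0 / d2 ** (power / 2)
        num += w * dx
        den += w
    return num / den

refs = [(0.0, 0.0, 10.0), (2.0, 0.0, 20.0)]   # hypothetical reference residuals
est = interpolate_residual(refs, 1.0, 0.0)    # midway between the two stars
```

Because the distortion field has an arcmin-scale coherence length, only references within roughly that radius carry useful weight; distant stars should ideally be excluded rather than merely down-weighted.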
In-Home Exposure Therapy for Veterans with PTSD
2017-10-01
telehealth (HBT; Veterans stay at home and meet with the therapist using the computer and video cameras), and (3) PE delivered in home, in person (IHIP; the therapist comes to the Veterans’ homes for treatment). We will be checking to see...when providing treatment in homes and through home based video technology. BODY: Our focus in the past year (30 Sept 2016 – 10 Oct 2017) has been to...
Coincidence electron/ion imaging with a fast frame camera
NASA Astrophysics Data System (ADS)
Li, Wen; Lee, Suk Kyoung; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander; Fan, Lin
2015-05-01
A new time- and position-sensitive particle detection system based on a fast frame CMOS camera is developed for coincidence electron/ion imaging. The system is composed of three major components: a conventional microchannel plate (MCP)/phosphor screen electron/ion imager, a fast frame CMOS camera and a high-speed digitizer. The system collects the positional information of ions/electrons from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the MCPs processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of electron/ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide. We further show that a time resolution of 30 ps can be achieved when measuring the electron TOF spectrum, which enables the new system to achieve good energy resolution along the TOF axis.
Detecting method of subjects' 3D positions and experimental advanced camera control system
NASA Astrophysics Data System (ADS)
Kato, Daiichiro; Abe, Kazuo; Ishikawa, Akio; Yamada, Mitsuho; Suzuki, Takahito; Kuwashima, Shigesumi
1997-04-01
Steady progress is being made in the development of an intelligent robot camera capable of automatically shooting pictures with a powerful sense of reality or tracking objects whose shooting requires advanced techniques. Currently, only experienced broadcasting cameramen can provide these pictures. To develop an intelligent robot camera with these abilities, we need to clearly understand how a broadcasting cameraman assesses his shooting situation and how his camera is moved during shooting. We use a real-time analyzer to study a cameraman's work and his gaze movements at studios and during sports broadcasts. This time, we have developed a method of detecting subjects' 3D positions and an experimental camera control system to help us further understand the movements required for an intelligent robot camera. The features are as follows: (1) Two sensor cameras shoot a moving subject and detect colors, producing its 3D coordinates. (2) The system is capable of driving a camera based on camera movement data obtained by a real-time analyzer. 'Moving shoot' is the name we have given to the object position detection technology on which this system is based. We used it in a soccer game, producing computer graphics showing how players moved. These results will also be reported.
Kidd, David G; McCartt, Anne T
2016-02-01
This study characterized the use of various fields of view during low-speed parking maneuvers by drivers with a rearview camera, a sensor system, a camera and sensor system combined, or neither technology. Participants performed four different low-speed parking maneuvers five times. Glances to different fields of view the second time through the four maneuvers were coded along with the glance locations at the onset of the audible warning from the sensor system and immediately after the warning for participants in the sensor and camera-plus-sensor conditions. Overall, the results suggest that information from cameras and/or sensor systems is used in place of mirrors and shoulder glances. Participants with a camera, sensor system, or both technologies looked over their shoulders significantly less than participants without technology. Participants with cameras (camera and camera-plus-sensor conditions) used their mirrors significantly less compared with participants without cameras (no-technology and sensor conditions). Participants in the camera-plus-sensor condition looked at the center console/camera display for a smaller percentage of the time during the low-speed maneuvers than participants in the camera condition and glanced more frequently to the center console/camera display immediately after the warning from the sensor system compared with the frequency of glances to this location at warning onset. Although this increase was not statistically significant, the pattern suggests that participants in the camera-plus-sensor condition may have used the warning as a cue to look at the camera display. The observed differences in glance behavior between study groups were illustrated by relating it to the visibility of a 12-15-month-old child-size object. 
These findings provide evidence that drivers adapt their glance behavior during low-speed parking maneuvers following extended use of rearview cameras and parking sensors, and suggest that other technologies which augment the driving task may do the same. Copyright © 2015 Elsevier Ltd. All rights reserved.
Høye, Gudrun; Fridman, Andrei
2013-05-06
Current high-resolution push-broom hyperspectral cameras introduce keystone errors to the captured data. Efforts to correct these errors in hardware severely limit the optical design, in particular with respect to light throughput and spatial resolution, while at the same time the residual keystone often remains large. The mixel camera solves this problem by combining a hardware component--an array of light mixing chambers--with a mathematical method that restores the hyperspectral data to its keystone-free form, based on the data that was recorded onto the sensor with large keystone. Virtual Camera software, developed specifically for this purpose, was used to compare the performance of the mixel camera to traditional cameras that correct keystone in hardware. The mixel camera can collect at least four times more light than most current high-resolution hyperspectral cameras, and simulations have shown that the mixel camera will be photon-noise limited--even in bright light--with a significantly improved signal-to-noise ratio compared to traditional cameras. A prototype has been built and is being tested.
Dynamical Modeling of NGC 6397: Simulated HST Imaging
NASA Astrophysics Data System (ADS)
Dull, J. D.; Cohn, H. N.; Lugger, P. M.; Slavin, S. D.; Murphy, B. W.
1994-12-01
The proximity of NGC 6397 (2.2 kpc) provides an ideal opportunity to test current dynamical models for globular clusters with the HST Wide-Field/Planetary Camera (WFPC2). We have used a Monte Carlo algorithm to generate ensembles of simulated Planetary Camera (PC) U-band images of NGC 6397 from evolving, multi-mass Fokker-Planck models. These images, which are based on the post-repair HST-PC point-spread function, are used to develop and test analysis methods for recovering structural information from actual HST imaging. We have considered a range of exposure times up to 2.4 × 10^4 s, based on our proposed HST Cycle 5 observations. Our Fokker-Planck models include energy input from dynamically-formed binaries. We have adopted a 20-group mass spectrum extending from 0.16 to 1.4 M_sun. We use theoretical luminosity functions for red giants and main sequence stars. Horizontal branch stars, blue stragglers, white dwarfs, and cataclysmic variables are also included. Simulated images are generated for cluster models at both maximal core collapse and at a post-collapse bounce. We are carrying out stellar photometry on these images using "DAOPHOT-assisted aperture photometry" software that we have developed. We are testing several techniques for analyzing the resulting star counts to determine the underlying cluster structure, including parametric model fits and nonparametric density estimation methods. Our simulated images also allow us to investigate the accuracy and completeness of methods for carrying out stellar photometry in HST Planetary Camera images of dense cluster cores.
Positions of minor planets and Comet Panther (1980 u) obtained at the Chorzow Observatory
NASA Astrophysics Data System (ADS)
Wlodarczyk, I.
Photographic observations of 17 asteroids and Comet Panther were made between 1977 and 1982 with a 200/1000 mm photographic camera coupled to a 300/4500 mm refractor. The Turner method with the complete second-order polynomial was used to reduce the 16 x 16 cm ORWO ZU-2 plates that were obtained. The tabulated information for each asteroid and the comet includes the number of the observation, the time of the observation in Universal Time, the topocentric position of the object referred to the mean epoch 1950.0, the dispersion in right ascension and declination, the duration of the exposure in minutes, and the symbol of the observer. Ten observers participated in the program.
A real-time camera calibration system based on OpenCV
NASA Astrophysics Data System (ADS)
Zhang, Hui; Wang, Hua; Guo, Huinan; Ren, Long; Zhou, Zuofeng
2015-07-01
Camera calibration is one of the essential steps in computer vision research. This paper describes a real-time camera calibration system based on OpenCV, developed and implemented in the VS2008 environment. Experimental results show that the system achieves simple and fast camera calibration with higher precision than MATLAB, requires no manual intervention, and can be widely used in various computer vision systems.
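The quantity such a calibration routine minimises is the reprojection error of a pinhole model. A minimal sketch with no distortion terms (the helper names are ours; OpenCV's `cv2.calibrateCamera` solves for fx, fy, cx, cy and distortion jointly and reports this same RMS figure):

```python
import math

def project(point3d, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3-D point to pixel coordinates."""
    X, Y, Z = point3d
    return (fx * X / Z + cx, fy * Y / Z + cy)

def rms_reprojection_error(points3d, pixels, fx, fy, cx, cy):
    """RMS distance between observed pixel positions and reprojected
    model points -- the figure of merit a calibration minimises."""
    sq = 0.0
    for p3, (u, v) in zip(points3d, pixels):
        pu, pv = project(p3, fx, fy, cx, cy)
        sq += (pu - u) ** 2 + (pv - v) ** 2
    return math.sqrt(sq / len(pixels))
```

A sub-pixel RMS over a checkerboard's corner points is the usual sign of a good calibration.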
Finite-time tracking control for multiple non-holonomic mobile robots based on visual servoing
NASA Astrophysics Data System (ADS)
Ou, Meiying; Li, Shihua; Wang, Chaoli
2013-12-01
This paper investigates the finite-time tracking control problem of multiple non-holonomic mobile robots via visual servoing. It is assumed that the pinhole camera is fixed to the ceiling and that the camera parameters are unknown. The desired reference trajectory is represented by a virtual leader whose states are available to only a subset of the followers, and the followers interact only with their neighbours. First, the camera-objective visual kinematic model is introduced by utilising the pinhole camera model for each mobile robot. Second, a unified tracking error system between the camera-objective visual servoing model and the desired reference trajectory is introduced. Third, based on the neighbour rule and by using the finite-time control method, continuous distributed cooperative finite-time tracking control laws are designed for each mobile robot with unknown camera parameters, where the communication topology among the multiple mobile robots is assumed to be a directed graph. Rigorous proof shows that the group of mobile robots converges to the desired reference trajectory in finite time. A simulation example illustrates the effectiveness of our method.
Darmanis, Spyridon; Toms, Andrew; Durman, Robert; Moore, Donna; Eyres, Keith
2007-07-01
To reduce the operating time in computer-assisted navigated total knee replacement (TKR) by improving communication between the infrared camera and the trackers placed on the patient. The innovation involves placing a routinely used laser pointer on top of the camera, so that the infrared cameras focus precisely on the trackers located on the knee to be operated on. A prospective randomized study was performed involving 40 patients divided into two groups, A and B. Both groups underwent navigated TKR, but for group B patients a laser pointer was used to improve the targeting capabilities of the cameras. Without the laser pointer, the camera had to be moved a mean of 9.2 times in order to identify the trackers. With the introduction of the laser pointer, this was reduced to 0.9 times. Accordingly, the additional mean time required without the laser pointer was 11.6 minutes. Time delays are a major problem in computer-assisted surgery, and our technical suggestion can contribute towards reducing the delays associated with this particular application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Arthur; van Beuzekom, Martin; Bouwens, Bram
2017-11-07
Here, we demonstrate a coincidence velocity map imaging apparatus equipped with a novel time-stamping fast optical camera, Tpx3Cam, whose high sensitivity and nanosecond timing resolution allow for simultaneous position and time-of-flight detection. This single detector design is simple, flexible, and capable of highly differential measurements. We show detailed characterization of the camera and its application in strong field ionization experiments.
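Coincidence detection with a time-stamping detector reduces to pairing hits whose arrival times fall within a narrow window. A minimal sketch of that idea (the event lists, the 5 ns window, and the brute-force pairing are illustrative, not the authors' pipeline):

```python
def coincidences(times_a, times_b, window_ns=5.0):
    """Pair events from two hit lists whose time-of-arrival difference
    lies within the coincidence window (all times in nanoseconds)."""
    pairs = []
    for ta in times_a:
        for tb in times_b:
            if abs(ta - tb) <= window_ns:
                pairs.append((ta, tb))
    return pairs
```

In practice the hit lists would come sorted from the camera, and a merge-style scan would replace the quadratic loop.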
Architecture and applications of a high resolution gated SPAD image sensor
Burri, Samuel; Maruyama, Yuki; Michalet, Xavier; Regazzoni, Francesco; Bruschini, Claudio; Charbon, Edoardo
2014-01-01
We present the architecture and three applications of the largest resolution image sensor based on single-photon avalanche diodes (SPADs) published to date. The sensor, fabricated in a high-voltage CMOS process, has a resolution of 512 × 128 pixels and a pitch of 24 μm. The fill-factor of 5% can be increased to 30% with the use of microlenses. For precise control of the exposure and for time-resolved imaging, we use fast global gating signals to define exposure windows as small as 4 ns. The uniformity of the gate edges location is ∼140 ps (FWHM) over the whole array, while in-pixel digital counting enables frame rates as high as 156 kfps. Currently, our camera is used as a highly sensitive sensor with high temporal resolution, for applications ranging from fluorescence lifetime measurements to fluorescence correlation spectroscopy and generation of true random numbers. PMID:25090572
Correlating measured transient temperature rises with damage rate processes in cultured cells
NASA Astrophysics Data System (ADS)
Denton, Michael L.; Tijerina, Amanda J.; Gonzalez, Cherry C.; Gamboa, B. Giovana; Noojin, Gary D.; Ahmed, Elharith M.; Rickman, John M.; Dyer, Phillip H.; Rockwell, Benjamin A.
2017-02-01
Thermal damage rate processes in biological tissues are usually characterized by a kinetics approach. This stems from experimental data that show how the transformation of a specified biological property of cells or a biomolecule (plating efficiency for viability, change in birefringence, tensile strength, etc.) depends on both time and temperature. Here, two disparate approaches were used to study thermal damage rate processes in cultured retinal pigment epithelial cells. Laser exposure (photothermal) parameters included 2-μm laser exposure of non-pigmented cells and 532-nm exposures of cells possessing a variety of melanosome particle densities. Photothermal experiments used a mid-IR camera to record temperature histories with spatial resolution of about 8 μm, while fluorescence microscopy of the cell monolayers identified threshold damage at the boundary between live and dead cells. Photothermal exposure durations ranged from 0.05-20 s, and the effects of varying ambient temperature were investigated. Temperature during heat transfer using a water-jacketed cuvette was recorded with a fast microthermistor, while damage and viability of the suspended cells were determined as percentages. Exposure durations for the heat transfer experiments ranged from 50-60 s. Empirically determined kinetic parameters for the two heating methods were compared with each other, and with values found in the literature.
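The kinetics approach referred to above is conventionally written as an Arrhenius damage integral, Ω = ∫ A·exp(−Ea/RT(t)) dt, with Ω = 1 marking threshold damage. A minimal sketch over a sampled temperature history (the rate constants A and Ea below are placeholders, not the paper's fitted values):

```python
import math

R = 8.314  # J/(mol K), universal gas constant

def arrhenius_damage(temps_K, dt_s, A, Ea):
    """Accumulate the Arrhenius damage integral over a temperature
    history sampled every dt_s seconds (e.g. from a mid-IR camera)."""
    return sum(A * math.exp(-Ea / (R * T)) * dt_s for T in temps_K)
```

Fitting A and Ea so that Ω reaches 1 exactly at the observed live/dead boundary is what "empirically determined kinetic parameters" amounts to.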
NASA Technical Reports Server (NTRS)
Ponseggi, B. G. (Editor); Johnson, H. C. (Editor)
1985-01-01
Papers are presented on the picosecond electronic framing camera, photogrammetric techniques using high-speed cineradiography, picosecond semiconductor lasers for characterizing high-speed image shutters, the measurement of dynamic strain by high-speed moire photography, the fast framing camera with independent frame adjustments, design considerations for a data recording system, and nanosecond optical shutters. Consideration is given to boundary-layer transition detectors, holographic imaging, laser holographic interferometry in wind tunnels, heterodyne holographic interferometry, a multispectral video imaging and analysis system, a gated intensified camera, a charge-injection-device profile camera, a gated silicon-intensified-target streak tube and nanosecond-gated photoemissive shutter tubes. Topics discussed include high time-space resolved photography of lasers, time-resolved X-ray spectrographic instrumentation for laser studies, a time-resolving X-ray spectrometer, a femtosecond streak camera, streak tubes and cameras, and a short pulse X-ray diagnostic development facility.
SpectraCAM SPM: a camera system with high dynamic range for scientific and medical applications
NASA Astrophysics Data System (ADS)
Bhaskaran, S.; Baiko, D.; Lungu, G.; Pilon, M.; VanGorden, S.
2005-08-01
A scientific camera system having high dynamic range designed and manufactured by Thermo Electron for scientific and medical applications is presented. The newly developed CID820 image sensor with preamplifier-per-pixel technology is employed in this camera system. The 4 Mega-pixel imaging sensor has a raw dynamic range of 82dB. Each high-transparency pixel is based on a preamplifier-per-pixel architecture and contains two photogates for non-destructive readout of the photon-generated charge (NDRO). Readout is achieved via parallel row processing with on-chip correlated double sampling (CDS). The imager is capable of true random pixel access with a maximum operating speed of 4MHz. The camera controller consists of a custom camera signal processor (CSP) with an integrated 16-bit A/D converter and a PowerPC-based CPU running a Linux embedded operating system. The imager is cooled to -40 °C via a three-stage cooler to minimize dark current. The camera housing is sealed and is designed to maintain the CID820 imager in the evacuated chamber for at least 5 years. Thermo Electron has also developed custom software and firmware to drive the SpectraCAM SPM camera. Included in this firmware package is the new Extreme DR™ algorithm that is designed to extend the effective dynamic range of the camera by several orders of magnitude up to 32-bit dynamic range. The RACID Exposure graphical user interface image analysis software runs on a standard PC that is connected to the camera via Gigabit Ethernet.
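Non-destructive readout lets a pixel be sampled repeatedly within one exposure, so flux can still be estimated from early, unsaturated samples even when the pixel later saturates. The slope-fitting sketch below illustrates that general idea; it is our illustration, not Thermo Electron's proprietary Extreme DR algorithm:

```python
def slope_flux(reads, dt, full_well):
    """Least-squares slope (signal per second) of non-destructive reads
    taken every dt seconds. Saturated samples are discarded, so bright
    pixels still yield a flux estimate from their early reads."""
    pts = [(i * dt, s) for i, s in enumerate(reads) if s < full_well]
    n = len(pts)
    mean_t = sum(t for t, _ in pts) / n
    mean_s = sum(s for _, s in pts) / n
    num = sum((t - mean_t) * (s - mean_s) for t, s in pts)
    den = sum((t - mean_t) ** 2 for t, _ in pts)
    return num / den
```

Because the usable slope range is limited only by the shortest inter-read interval, the effective dynamic range extends well beyond the single-read full well.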
Next-generation digital camera integration and software development issues
NASA Astrophysics Data System (ADS)
Venkataraman, Shyam; Peters, Ken; Hecht, Richard
1998-04-01
This paper investigates the complexities associated with the development of next generation digital cameras due to requirements in connectivity and interoperability. Each successive generation of digital camera improves drastically in cost, performance, resolution, image quality and interoperability features. This is being accomplished by advancements in a number of areas: research, silicon, standards, etc. As the capabilities of these cameras increase, so do the requirements for both hardware and software. Today, there are two single chip camera solutions in the market including the Motorola MPC 823 and LSI DCAM-101. Real-time constraints for a digital camera may be defined by the maximum time allowable between capture of images. Constraints in the design of an embedded digital camera include processor architecture, memory, processing speed and the real-time operating systems. This paper will present the LSI DCAM-101, a single-chip digital camera solution. It will present an overview of the architecture and the challenges in hardware and software for supporting streaming video in such a complex device. Issues presented include the development of the data flow software architecture, testing and integration on this complex silicon device. The strategy for optimizing performance on the architecture will also be presented.
Laser line scan underwater imaging by complementary metal-oxide-semiconductor camera
NASA Astrophysics Data System (ADS)
He, Zhiyi; Luo, Meixing; Song, Xiyu; Wang, Dundong; He, Ning
2017-12-01
This work employs the complementary metal-oxide-semiconductor (CMOS) camera to acquire images in a scanning manner for laser line scan (LLS) underwater imaging to alleviate backscatter impact of seawater. Two operating features of the CMOS camera, namely the region of interest (ROI) and rolling shutter, can be utilized to perform image scan without the difficulty of translating the receiver above the target as the traditional LLS imaging systems have. By the dynamically reconfigurable ROI of an industrial CMOS camera, we evenly divided the image into five subareas along the pixel rows and then scanned them by changing the ROI region automatically under the synchronous illumination by the fan beams of the lasers. Another scanning method was explored by the rolling shutter operation of the CMOS camera. The fan-beam lasers were turned on/off to illuminate the narrow zones on the target in a good correspondence to the exposure lines during the rolling procedure of the camera's electronic shutter. The frame synchronization between the image scan and the laser beam sweep may be achieved by either the strobe lighting output pulse or the external triggering pulse of the industrial camera. Comparison between the scanning and nonscanning images shows that contrast of the underwater image can be improved by our LLS imaging techniques, with higher stability and feasibility than the mechanically controlled scanning method.
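Dividing the sensor's pixel rows into sequentially exposed ROI bands, as described above, can be sketched as follows. The band-splitting helper is ours; actually reconfiguring the ROI would go through the camera vendor's SDK (typically GenICam `OffsetY`/`Height` features):

```python
def roi_bands(n_rows, n_bands):
    """Split the sensor's pixel rows into contiguous ROI bands
    (start, stop) that are exposed one after another, mimicking
    a line-scan acquisition on an area sensor."""
    base, extra = divmod(n_rows, n_bands)
    bands, start = [], 0
    for i in range(n_bands):
        height = base + (1 if i < extra else 0)  # spread any remainder
        bands.append((start, start + height))
        start += height
    return bands
```

Each band's exposure would be synchronised with the fan-beam illumination of the matching stripe on the target.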
Integrated evaluation of visually induced motion sickness in terms of autonomic nervous regulation.
Kiryu, Tohru; Tada, Gen; Toyama, Hiroshi; Iijima, Atsuhiko
2008-01-01
To evaluate visually-induced motion sickness, we integrated subjective and objective responses in terms of autonomic nervous regulation. Twenty-seven subjects viewed a 2-min-long first-person-view video section five times (total 10 min) continuously. Measured biosignals, the RR interval, respiration, and blood pressure, were used to estimate the indices related to autonomic nervous activity (ANA). Then we determined the trigger points and some sensation sections based on the time-varying behavior of ANA-related indices. We found that there was a suitable combination of biosignals to present the symptoms of visually-induced motion sickness. Based on the suitable combination, integrating trigger points and subjective scores allowed us to represent the time-distribution of subjective responses during visual exposure, and helps us to understand what types of camera motions will cause visually-induced motion sickness.
Sugden, Nicole A; Mohamed-Ali, Marwan I; Moulson, Margaret C
2014-02-01
Exposure to faces is known to shape and change the face processing system; however, no study has yet documented infants' natural daily first-hand exposure to faces. One- and three-month-old infants' visual experience was recorded through head-mounted cameras. The video recordings were coded for faces to determine: (1) How often are infants exposed to faces? (2) To what type of faces are they exposed? and (3) Do frequently encountered face types reflect infants' typical pattern of perceptual narrowing? As hypothesized, infants spent a large proportion of their time (25%) exposed to faces; these faces were primarily female (70%), own-race (96%), and adult-age (81%). Infants were exposed to more individual exemplars of female, own-race, and adult-age faces than to male, other-race, and child- or older-adult-age faces. Each exposure to own-race faces was longer than to other-race faces. There were no differences in exposure duration related to the gender or age of the face. Previous research has found that the face types frequently experienced by our participants are preferred over and more successfully recognized than other face types. The patterns of face exposure revealed in the current study coincide with the known trajectory of perceptual narrowing seen later in infancy. © 2013 The Authors. Developmental Psychobiology Published by Wiley Periodicals, Inc.
Science Goals for an All-sky Viewing Observatory in X-rays
NASA Astrophysics Data System (ADS)
Remillard, R. A.; Levine, A. M.; Morgan, E. H.; Bradt, H. V.
2003-03-01
We describe a concept for a NASA SMEX Mission that will provide a comprehensive investigation of cosmic explosions. These range from the short flashes at cosmological distances in Gamma-ray bursts, to the moments of relativistic mass ejections in Galactic microquasars, to the panorama of outbursts used to identify the stellar-scale black holes in our Galaxy. With an equatorial launch, an array of 31 cameras can cover 97% of the sky with an average exposure efficiency of 65%. Coded mask cameras with Xe detectors (1.5-12 keV) are chosen for their ability to distinguish thermal and non-thermal processes, while providing high throughput and msec time resolution to capture the detailed evolution of bright events. This mission, with 1' position accuracy, would provide a long-term solution to the critical needs for monitoring services for Chandra and GLAST, with possible overlap into the time frame for Constellation-X. The sky coverage would create additional science opportunities beyond the X-ray missions: "eyes" for LIGO and partnerships for time-variability with LOFAR and dedicated programs at optical observatories. Compared to the RXTE ASM, AVOX offers improvements by a factor of 40 in instantaneous sky coverage and a factor of 10 in sensitivity to faint X-ray sources (i.e. to 0.8 mCrab at 3 sigma in 1 day).
Pluto in Hi-Def
NASA Technical Reports Server (NTRS)
2008-01-01
This image demonstrates the first detection of Pluto using the high-resolution mode on the New Horizons Long-Range Reconnaissance Imager (LORRI). The mode provides a clear separation between Pluto and numerous nearby background stars. When the image was taken on October 6, 2007, Pluto was located in the constellation Serpens, in a region of the sky dense with background stars. Typically, LORRI's exposure time in hi-res mode is limited to approximately 0.1 seconds, but by using a special pointing mode that allowed an increase in the exposure time to 0.967 seconds, scientists were able to spot Pluto, which is approximately 15,000 times fainter than human eyes can detect. New Horizons was still too far from Pluto (3.6 billion kilometers, or 2.2 billion miles) for LORRI to resolve any details on Pluto's surface; that won't happen until summer 2014, approximately one year before closest approach. For now the entire Pluto system remains a bright dot to the spacecraft's telescopic camera, though LORRI is expected to start resolving Charon from Pluto, seeing them as separate objects, in summer 2010.
Development of an imaging system for single droplet characterization using a droplet generator.
Minov, S Vulgarakis; Cointault, F; Vangeyte, J; Pieters, J G; Hijazi, B; Nuyttens, D
2012-01-01
The spray droplets generated by agricultural nozzles play an important role in the application accuracy and efficiency of plant protection products. The limitations of the non-imaging techniques and the recent improvements in digital image acquisition and processing increased the interest in using high speed imaging techniques in pesticide spray characterisation. The goal of this study was to develop an imaging technique to evaluate the characteristics of a single spray droplet using a piezoelectric single droplet generator and a high speed imaging technique. Tests were done with different camera settings, lenses, diffusers and light sources. The experiments have shown the necessity for having a good image acquisition and processing system. Image analysis results contributed to selecting the optimal set-up for measuring droplet size and velocity, which consisted of a high speed camera with a 6 μs exposure time, a microscope lens at a working distance of 43 cm resulting in a field of view of 1.0 cm x 0.8 cm, and a Xenon light source without diffuser used as a backlight. For measuring macro-spray characteristics such as the droplet trajectory, the spray angle and the spray shape, a Macro Video Zoom lens at a working distance of 14.3 cm with a larger field of view of 7.5 cm x 9.5 cm in combination with a halogen spotlight with a diffuser and the high speed camera can be used.
Overview of the Multi-Spectral Imager on the NEAR spacecraft
NASA Astrophysics Data System (ADS)
Hawkins, S. E., III
1996-07-01
The Multi-Spectral Imager on the Near Earth Asteroid Rendezvous (NEAR) spacecraft is a 1 Hz frame rate CCD camera sensitive in the visible and near infrared bands (~400-1100 nm). MSI is the primary instrument on the spacecraft to determine morphology and composition of the surface of asteroid 433 Eros. In addition, the camera will be used to assist in navigation to the asteroid. The instrument uses refractive optics and has an eight position spectral filter wheel to select different wavelength bands. The MSI optical focal length of 168 mm gives a 2.9° × 2.25° field of view. The CCD is passively cooled and the 537×244 pixel array output is digitized to 12 bits. Electronic shuttering increases the effective dynamic range of the instrument by more than a factor of 100. A one-time deployable cover protects the instrument during ground testing operations and launch. A reduced aperture viewport permits full field of view imaging while the cover is in place. A Data Processing Unit (DPU) provides the digital interface between the spacecraft and the Camera Head and uses an RTX2010 processor. The DPU provides an eight frame image buffer, lossy and lossless data compression routines, and automatic exposure control. An overview of the instrument is presented and design parameters and trade-offs are discussed.
Mode shape analysis using a commercially available peak store video frame buffer
NASA Technical Reports Server (NTRS)
Snow, Walter L.; Childers, Brooks A.
1994-01-01
Time exposure photography, sometimes coupled with strobe illumination, is an accepted method for motion analysis that bypasses frame by frame analysis and resynthesis of data. Garden variety video cameras can now exploit this technique using a unique frame buffer that is a non-integrating memory that compares incoming data with that already stored. The device continuously outputs an analog video signal of the stored contents which can then be redigitized and analyzed using conventional equipment. Historically, photographic time exposures have been used to record the displacement envelope of harmonically oscillating structures to show mode shape. Mode shape analysis is crucial, for example, in aeroelastic testing of wind tunnel models. Aerodynamic, inertial, and elastic forces can couple together leading to catastrophic failure of a poorly designed aircraft. This paper will explore the usefulness of the peak store device as a videometric tool and in particular discuss methods for analyzing a targeted vibrating plate using the 'peak store' in conjunction with calibration methods familiar to the close-range videometry community. Results for the first three normal modes will be presented.
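The peak-store buffer described above is, in essence, a pixelwise running maximum over incoming frames. A minimal sketch (frames represented as flat lists of pixel values; a real device does this in analog/digital hardware at video rate):

```python
def peak_store(frames):
    """Non-integrating peak store: each stored pixel keeps the maximum
    value ever seen at that location across the incoming video frames."""
    stored = list(frames[0])
    for frame in frames[1:]:
        stored = [max(s, p) for s, p in zip(stored, frame)]
    return stored
```

Applied to a vibrating target, the stored image traces out the displacement envelope, which is exactly what makes mode shapes visible without frame-by-frame analysis.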
VizieR Online Data Catalog: NGC 1893 optical and NIR photometry (Prisinzano+, 2011)
NASA Astrophysics Data System (ADS)
Prisinzano, L.; Sanz-Forcada, J.; Micela, G.; Caramazza, M.; Guarcello, M. G.; Sciortino, S.; Testi, L.
2010-10-01
We present new optical and NIR photometric data in the VRIJHK and H-α bands for the cluster NGC 1893. The optical photometry was obtained by using images acquired in service mode using two different telescopes: the Device Optimized for the LOw RESolution (DOLORES) mounted on the Telescopio Nazionale Galileo (TNG), used in service mode during three nights in 2007, and the Calar Alto Faint Object Spectrograph (CAFOS), mounted on the 2.2m telescope in the Calar Alto German-Spanish Observatory (Spain), during three nights in 2007 and 2008. NIR observations were acquired in service mode at the TNG, using the large field Near Infrared Camera Spectrometer (NICS) with the Js (1.25 μm), H (1.63 μm) and K' (2.12 μm) filters during eight nights in 2007 and 2008. We observed a field around NGC 1893 with a raster of 4x4 pointings; at each pointing we obtained a series of NINT dithered exposures. Each exposure is a repetition of a DIT (Detector Integration Time) times NDIT (number of DIT), to avoid saturation of the background. (4 data files).
NASA Technical Reports Server (NTRS)
Revercomb, Henry E.; Sromovsky, Lawrence A.; Fry, Patrick M.; Best, Fred A.; LaPorte, Daniel D.
2001-01-01
The combination of massively parallel spatial sampling and accurate spectral radiometry offered by imaging FTS makes it extremely attractive for earth and planetary remote sensing. We constructed a breadboard instrument to help assess the potential for planetary applications of small imaging FTS instruments in the 1 - 5 micrometer range. The results also support definition of the NASA Geostationary Imaging FTS (GIFTS) instrument that will make key meteorological and climate observations from geostationary earth orbit. The Planetary Imaging FTS (PIFTS) breadboard is based on a custom miniaturized Bomem interferometer that uses corner cube reflectors, a wishbone pivoting voice-coil delay scan mechanism, and a laser diode metrology system. The interferometer optical output is measured by a commercial infrared camera procured from Santa Barbara Focalplane. It uses an InSb 128×128 detector array that covers the entire FOV of the instrument when coupled with a 25 mm focal length commercial camera lens. With appropriate lenses and cold filters the instrument can be used from the visible to 5 micrometers. The delay scan is continuous, but slow, covering the maximum range of +/- 0.4 cm in 37.56 s at a rate of 500 image frames per second. Image exposures are timed to be centered around predicted zero crossings. The design allows for prediction algorithms that account for the most recent fringe rate so that timing jitter produced by scan speed variations can be minimized. Response to a fixed source is linear with exposure time nearly to the point of saturation. Linearity with respect to input variations was demonstrated to within 0.16% using a 3-point blackbody calibration. Imaging of external complex scenes was carried out at low and high spectral resolution. These require full complex calibration to remove background contributions that vary dramatically over the instrument FOV. Testing is continuing to demonstrate the precise radiometric accuracy and noise characteristics.
Russo, Paolo; Mettivier, Giovanni
2011-04-01
The goal of this study is to evaluate a new method based on a coded aperture mask combined with a digital x-ray imaging detector for measurements of the focal spot sizes of diagnostic x-ray tubes. Common techniques for focal spot size measurements employ a pinhole camera, a slit camera, or a star resolution pattern. The coded aperture mask is a radiation collimator consisting of a large number of apertures disposed on a predetermined grid in an array, through which the radiation source is imaged onto a digital x-ray detector. The method of the coded mask camera allows one to obtain a one-shot accurate and direct measurement of the two dimensions of the focal spot (like that for a pinhole camera) but at a low tube loading (like that for a slit camera). A large number of small apertures in the coded mask operate as a "multipinhole" with greater efficiency than a single pinhole, but keeping the resolution of a single pinhole. X-ray images result from the multiplexed output on the detector image plane of such a multiple aperture array, and the image of the source is digitally reconstructed with a deconvolution algorithm. Images of the focal spot of a laboratory x-ray tube (W anode: 35-80 kVp; focal spot size of 0.04 mm) were acquired at different geometrical magnifications with two different types of digital detector (a photon counting hybrid silicon pixel detector with 0.055 mm pitch and a flat panel CMOS digital detector with 0.05 mm pitch) using a high resolution coded mask (type no-two-holes-touching modified uniformly redundant array) with 480 apertures of 0.07 mm, designed for imaging at energies below 35 keV. Measurements with a slit camera were performed for comparison. A test with a pinhole camera and with the coded mask on a computed radiography mammography unit with 0.3 mm focal spot was also carried out.
The full width at half maximum focal spot sizes were obtained from the line profiles of the decoded images, showing a focal spot of 0.120 mm x 0.105 mm at 35 kVp and M = 6.1, with a detector entrance exposure as low as 1.82 mR (0.125 mA s tube load). The slit camera indicated a focal spot of 0.112 mm x 0.104 mm at 35 kVp and M = 3.15, with an exposure at the detector of 72 mR. Focal spot measurements with the coded mask could be performed up to 80 kVp. Tolerance to angular misalignment with the reference beam up to 7 degrees in in-plane rotations and 1 degree in out-of-plane rotations was observed. The axial distance of the focal spot from the coded mask could also be determined. It is possible to determine the beam intensity via measurement of the intensity of the decoded image of the focal spot and via a calibration procedure. Coded aperture masks coupled to a digital area detector produce precise determinations of the focal spot of an x-ray tube with reduced tube loading and measurement time, coupled to a large tolerance in the alignment of the mask.
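The correlation-style decoding described above can be sketched numerically. The snippet below is an illustrative stand-in, not the authors' reconstruction code: it uses a pseudo-random binary mask rather than a no-two-holes-touching MURA, and recovers a point-like focal spot from the multiplexed shadowgram by FFT-based circular correlation with a balanced copy of the mask.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64

# Pseudo-random binary aperture (stand-in for the no-two-holes-touching MURA).
mask = (rng.random((N, N)) < 0.5).astype(float)

# Point-like focal spot at a known position.
source = np.zeros((N, N))
source[20, 37] = 1.0

# Detector image: circular convolution of the source with the mask pattern
# (the multiplexed output of the many apertures).
detector = np.real(np.fft.ifft2(np.fft.fft2(source) * np.fft.fft2(mask)))

# Decoding: circular correlation with a balanced (zero-mean) copy of the mask.
G = mask - mask.mean()
decoded = np.real(np.fft.ifft2(np.fft.fft2(detector) * np.conj(np.fft.fft2(G))))

peak = np.unravel_index(np.argmax(decoded), decoded.shape)
print(peak)  # the reconstruction peaks at the source position
```

A true MURA mask makes the mask/decoding-array correlation an exact delta function; the random mask used here only approximates that, which is enough to show the principle.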
GEMINI-TITAN (GT)-10 - MISC. - INFLIGHT (MILKY WAY) - OUTER SPACE
1966-08-01
S66-45314 (19 July 1966) --- Ultraviolet spectra of stars in the region of the Southern Cross. These objective-grating spectra were obtained by astronauts John W. Young and Michael Collins during the Gemini-10 stand-up EVA on July 19, 1966, with a 70mm Maurer camera and its f/3.3 lens. The spectra extend from 2,200 angstroms to about 4,000 angstroms. The spacecraft was docked to the horizon-stabilized Agena-10, giving an apparent field of rotation resulting from the four-degree-per-minute orbital motion during the 20-second exposure time. Photo credit: NASA
GEMINI-TITAN (GT)-10 - MISC. - INFLIGHT (MILKY WAY) - OUTER SPACE
1966-08-01
S66-45328 (19 July 1966) --- Ultraviolet spectra of stars in the Carina-Vela region of the southern Milky Way. These objective-grating spectra were obtained by astronauts John W. Young and Michael Collins during the Gemini-10 stand-up EVA on July 19, 1966, with a 70mm Maurer camera and its f/3.3 lens. The spectra extend from 2,200 angstroms to about 4,000 angstroms. The spacecraft was docked to the horizon-stabilized Agena-10, giving an apparent field of rotation resulting from the four-degree-per-minute orbital motion during the 20-second exposure time. Photo credit: NASA
NASA Technical Reports Server (NTRS)
Wattson, R. B.; Harvey, P.; Swift, R.
1975-01-01
An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 collinear tunable acousto-optic filter, a 61 inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 arcsec spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time, and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data and final equipment performance limits are given.
Image synchronization for 3D application using the NanEye sensor
NASA Astrophysics Data System (ADS)
Sousa, Ricardo M.; Wäny, Martin; Santos, Pedro; Dias, Morgado
2015-03-01
Based on Awaiba's NanEye CMOS image sensor family and an FPGA platform with a USB3 interface, the aim of this paper is to demonstrate a novel technique to perfectly synchronize up to 8 individual self-timed cameras. Minimal form factor self-timed camera modules of 1 mm x 1 mm or smaller do not generally allow external synchronization. However, for stereo vision or 3D reconstruction with multiple cameras, as well as for applications requiring pulsed illumination, multiple cameras must be synchronized. In this work, the challenge of synchronizing multiple self-timed cameras over only a 4-wire interface has been solved by adaptively regulating the power supply of each camera to synchronize their frame rate and frame phase. To that effect, a control core was created to constantly monitor the operating frequency of each camera by measuring the line period in each frame against a well-defined sampling signal. The frequency is adjusted by varying the voltage level applied to the sensor based on the error between the measured line period and the desired line period. To ensure phase synchronization between frames of multiple cameras, a Master-Slave interface was implemented. A single camera is defined as the Master entity, with its operating frequency controlled directly through a PC-based interface. The remaining cameras are set up in Slave mode and are interfaced directly with the Master camera control module. This enables them to monitor the Master's line and frame period and adjust their own to achieve phase and frequency synchronization. The result of this work will allow the realization of 3D stereo vision equipment smaller than 3 mm in diameter for medical endoscopy, such as endoscopic surgical robotics or minimally invasive surgery.
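The voltage-based frequency regulation loop can be illustrated with a toy model. Everything numeric below (the linear voltage-to-line-period response, the gain, the target period) is invented for illustration; the real control core runs in FPGA logic against a sampled line period.

```python
# Toy model of the regulation loop: the sensor's line period shortens as its
# supply voltage rises (hypothetical linear response), and a simple
# proportional controller drives the measured period toward the Master's.

def line_period_ns(voltage):
    """Hypothetical sensor response: line period in ns at a given supply voltage."""
    return 2000.0 - 300.0 * (voltage - 1.8)

target_ns = 1850.0       # line period of the Master camera (assumed)
voltage = 1.8            # nominal supply voltage
kp = 0.0005              # proportional gain, volts per ns of error

for _ in range(200):
    error = line_period_ns(voltage) - target_ns
    voltage += kp * error            # running too slow -> raise the voltage

print(round(line_period_ns(voltage), 3))  # → 1850.0
```

With this gain the period error shrinks by a constant factor each step, so the Slave's line period converges geometrically onto the Master's.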
Extracting spatial information from large aperture exposures of diffuse sources
NASA Technical Reports Server (NTRS)
Clarke, J. T.; Moos, H. W.
1981-01-01
The spatial properties of large aperture exposures of diffuse emission can be used both to investigate spatial variations in the emission and to filter out camera noise in exposures of weak emission sources. Spatial imaging can be accomplished both parallel and perpendicular to dispersion with a resolution of 5-6 arc sec, and a narrow median filter running perpendicular to dispersion across a diffuse image selectively filters out point source features, such as reseaux marks and fast particle hits. Spatial information derived from observations of solar system objects is presented.
Space Flight Experiments to Measure Polymer Erosion and Contamination on Spacecraft
NASA Technical Reports Server (NTRS)
Lillis, Maura C.; Youngstrom, Erica E.; Marx, Laura M.; Hammerstrom, Anne M.; Finefrock, Katherine D.; Youngstrom, Christiane A.; Kaminski, Carolyn; Fine, Elizabeth S.; Hunt, Patricia K.; deGroh, Kim K.
2002-01-01
Atomic oxygen erosion and silicone contamination are serious issues that could damage or destroy spacecraft components after orbiting for an extended period of time, such as on a space station or satellite. An experiment, the Polymer Erosion And Contamination Experiment (PEACE), will be conducted to study the effects of atomic oxygen (AO) erosion and silicone contamination, and it will provide information and contribute to a solution for these problems. PEACE will fly 43 different polymer materials that will be analyzed for AO erosion effects through two techniques: mass loss measurement and recession depth measurement. Pinhole cameras will provide information about the arrival direction of AO, and silicone contamination pinhole cameras will identify the source of silicone contamination on a spacecraft. All experimental hardware will be passively exposed to AO for up to two weeks in the actual space environment when it flies in the bay of a space shuttle. A second set of the PEACE Polymers is being exposed to the space environment for erosion yield determination as part of a second experiment, the Materials International Space Station Experiment (MISSE). MISSE is a collaboration between several federal agencies and aerospace companies. During a space walk on August 16, 2001, MISSE was attached to the outside of the International Space Station (ISS) during an extravehicular activity (EVA), where it began its exposure to AO for approximately 1.5 years. The PEACE polymers, therefore, will be analyzed after both short-term and long-term AO exposures for a more complete study of AO effects.
NASA Technical Reports Server (NTRS)
1999-01-01
This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.
Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.
Temporal Coding of Volumetric Imagery
NASA Astrophysics Data System (ADS)
Llull, Patrick Ryan
'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively.
The CACTI camera's ability to embed video volumes into images leads to exploration of other information within that video; namely, focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions. Geometrical optics-related tradeoffs, such as the classic challenges of wide-field-of-view and high-resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke. Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position. We demonstrate this technique's capability to generate arbitrary point spread functions.
Cloud Height Estimation with a Single Digital Camera and Artificial Neural Networks
NASA Astrophysics Data System (ADS)
Carretas, Filipe; Janeiro, Fernando M.
2014-05-01
Clouds influence the local weather, the global climate and are an important parameter in the weather prediction models. Clouds are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Therefore it is important to develop low cost and robust systems that can be easily deployed in the field, enabling large scale acquisition of cloud parameters. Recently, the authors developed a low-cost system for the measurement of cloud base height using stereo-vision and digital photography. However, due to the stereo nature of the system, some challenges were presented. In particular, the relative camera orientation requires calibration and the two cameras need to be synchronized so that the photos from both cameras are acquired simultaneously. In this work we present a new system that estimates the cloud height between 1000 and 5000 meters. This prototype is composed of one digital camera controlled by a Raspberry Pi and is installed at Centro de Geofísica de Évora (CGE) in Évora, Portugal. The camera is periodically triggered to acquire images of the overhead sky and the photos are downloaded to the Raspberry Pi which forwards them to a central computer that processes the images and estimates the cloud height in real time. To estimate the cloud height using just one image requires a computer model that is able to learn from previous experiences and execute pattern recognition. The model proposed in this work is an Artificial Neural Network (ANN) that was previously trained with cloud features at different heights. The chosen Artificial Neural Network is a three-layer network, with six parameters in the input layer, 12 neurons in the hidden intermediate layer, and an output layer with only one output. The six input parameters are the average intensity values and the intensity standard deviation of each RGB channel.
The output parameter in the output layer is the cloud height estimated by the ANN. The training procedure was performed, using the back-propagation method, on a set of 260 different clouds with heights in the range [1000, 5000] m. The training of the ANN resulted in a correlation coefficient of 0.74. This trained ANN can therefore be used to estimate the cloud height. The previously described system can also measure the wind speed and direction at cloud height by measuring the displacement, in pixels, of a cloud feature between consecutively acquired photos. Also, the geographical north direction can be estimated using this setup through sequential night images with long exposure times. A further advantage of this single-camera system is that no camera calibration or synchronization is needed. This significantly reduces the cost and complexity of field deployment of cloud height measurement systems based on digital photography.
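A network of the stated 6-12-1 shape can be sketched in a few lines. The training data below are synthetic stand-ins (random "RGB statistics" and a made-up height target scaled to [0, 1]); only the topology and the back-propagation update follow the description.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: 6 features per cloud (mean and std of R, G, B),
# target "height" scaled to [0, 1] for the 1000-5000 m range.
X = rng.random((260, 6))
w_true = rng.random(6)
y = (X @ w_true / 6.0).reshape(-1, 1)

# 6-12-1 network: sigmoid hidden layer, linear output.
W1 = rng.normal(0.0, 0.5, (6, 12)); b1 = np.zeros(12)
W2 = rng.normal(0.0, 0.5, (12, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

mse0 = float(((sig(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())

lr = 0.1
for _ in range(2000):
    h = sig(X @ W1 + b1)                 # hidden activations
    err = h @ W2 + b2 - y                # output error (linear output unit)
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * h * (1.0 - h)    # back-propagate through the sigmoid
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(((sig(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(mse < mse0)
```

On real data the inputs would be the six per-channel image statistics and the target the measured cloud base height.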
Attitude identification for SCOLE using two infrared cameras
NASA Technical Reports Server (NTRS)
Shenhar, Joram
1991-01-01
An algorithm is presented that incorporates real time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs), mounted on the SCOLE platform, permitting arbitrary camera placement and an on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six degrees of freedom attitude tracking, regardless of the attitude time history. A description is provided of the algorithm in the camera identification mode as well as the mode of target tracking. Experimental data from a reduced size SCOLE-like lab model, reflecting the performance of the camera identification and the tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
Real-time vehicle matching for multi-camera tunnel surveillance
NASA Astrophysics Data System (ADS)
Jelača, Vedran; Niño Castañeda, Jorge Oswaldo; Frías-Velázquez, Andrés; Pižurica, Aleksandra; Philips, Wilfried
2011-03-01
Tracking multiple vehicles with multiple cameras is a challenging problem of great importance in tunnel surveillance. One of the main challenges is accurate vehicle matching across cameras with non-overlapping fields of view. Since systems dedicated to this task can contain hundreds of cameras which observe dozens of vehicles each, computational efficiency is essential for real-time performance. In this paper, we propose a low-complexity, yet highly accurate method for vehicle matching using vehicle signatures composed of Radon-transform-like projection profiles of the vehicle image. The proposed signatures can be calculated by a simple scan-line algorithm, by the camera software itself, and transmitted to the central server or to the other cameras in a smart camera environment. The amount of data is drastically reduced compared to the whole image, which relaxes the data link capacity requirements. Experiments on real vehicle images, extracted from video sequences recorded in a tunnel by two distant security cameras, validate our approach.
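A minimal sketch of such projection-profile signatures, with random arrays standing in for the vehicle images; the normalization and matching score below are illustrative choices, not the exact measures from the paper.

```python
import numpy as np

def signature(img):
    """Radon-like signature: normalized horizontal and vertical projection profiles."""
    h = img.sum(axis=1).astype(float)   # row sums (projection onto the vertical axis)
    v = img.sum(axis=0).astype(float)   # column sums
    return np.concatenate([(h - h.mean()) / (h.std() + 1e-9),
                           (v - v.mean()) / (v.std() + 1e-9)])

def match_score(sig_a, sig_b):
    """Normalized correlation in [-1, 1]; higher means more similar vehicles."""
    return float(sig_a @ sig_b / len(sig_a))

rng = np.random.default_rng(2)
car = rng.random((40, 60))                                   # "vehicle" in camera 1
same = np.clip(car + rng.normal(0, 0.05, car.shape), 0, 1)   # same vehicle, camera 2
other = rng.random((40, 60))                                 # a different vehicle

s = signature(car)
print(match_score(s, signature(same)) > match_score(s, signature(other)))
```

The signature of a 40x60 image is only 100 numbers, which is the data-reduction point the abstract makes.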
Applications of digital image acquisition in anthropometry
NASA Technical Reports Server (NTRS)
Woolford, B.; Lewis, J. L.
1981-01-01
A description is given of a video kinesimeter, a device for the automatic real-time collection of kinematic and dynamic data. Based on the detection of a single bright spot by three TV cameras, the system provides automatic real-time recording of three-dimensional position and force data. It comprises three cameras, two incandescent lights, a voltage comparator circuit, a central control unit, and a mass storage device. The control unit determines the signal threshold for each camera before testing, sequences the lights, synchronizes and analyzes the scan voltages from the three cameras, digitizes force from a dynamometer, and codes the data for transmission to a floppy disk for recording. Two of the three cameras face each other along the 'X' axis; the third camera, which faces the center of the line between the first two, defines the 'Y' axis. An image from the 'Y' camera and either 'X' camera is necessary for determining the three-dimensional coordinates of the point.
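The two-axis geometry can be illustrated with an idealized pinhole model. The camera distances, unit focal length, and the fixed-point solve below are all assumptions for the sketch, not the calibrated parameters of the actual kinesimeter.

```python
# Idealized pinhole geometry (all values assumed): the 'X' camera sits on the
# x axis at distance dx looking at the origin and images (Y, Z); the 'Y'
# camera sits on the y axis at distance dy and images (X, Z).  With unit
# focal length, u = Y/(dx - X) for the X camera and u = X/(dy - Y) for the
# Y camera, so the pair of readings pins down the 3D point.

def locate(dx, dy, x_img, y_img, iters=20):
    """Solve the coupled projection equations by fixed-point iteration."""
    X = Y = 0.0
    for _ in range(iters):
        Y = x_img[0] * (dx - X)    # X camera: u = Y / (dx - X)
        X = y_img[0] * (dy - Y)    # Y camera: u = X / (dy - Y)
    Z = x_img[1] * (dx - X)        # either camera's v reading gives the height
    return X, Y, Z

# Forward-project a known point, then recover it.
P = (0.3, -0.2, 0.5)
dx, dy = 5.0, 5.0
x_img = (P[1] / (dx - P[0]), P[2] / (dx - P[0]))
y_img = (P[0] / (dy - P[1]), P[2] / (dy - P[1]))
print([round(c, 6) for c in locate(dx, dy, x_img, y_img)])  # → [0.3, -0.2, 0.5]
```

The iteration converges rapidly because the coupling between the two projection equations is weak when the target stays well inside the camera baseline.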
Solid-state framing camera with multiple time frames
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, K. L.; Stewart, R. E.; Steele, P. T.
2013-10-07
A high speed solid-state framing camera has been developed which can operate over a wide range of photon energies. This camera measures the two-dimensional spatial profile of the flux incident on a cadmium selenide semiconductor at multiple times. This multi-frame camera has been tested at 3.1 eV and 4.5 keV. The framing camera currently records two frames with a temporal separation between the frames of 5 ps but this separation can be varied between hundreds of femtoseconds up to nanoseconds and the number of frames can be increased by angularly multiplexing the probe beam onto the cadmium selenide semiconductor.
Initial Demonstration of 9-MHz Framing Camera Rates on the FAST UV Drive Laser Pulse Trains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lumpkin, A. H.; Edstrom Jr., D.; Ruan, J.
2016-10-09
We report the configuration of a Hamamatsu C5680 streak camera as a framing camera to record transverse spatial information of green-component laser micropulses at 3- and 9-MHz rates for the first time. The latter is near the time scale of the ~7.5-MHz revolution frequency of the Integrable Optics Test Accelerator (IOTA) ring and its expected synchrotron radiation source temporal structure. The 2-D images are recorded with a Gig-E readout CCD camera. We also report a first proof of principle with an OTR source using the linac streak camera in a semi-framing mode.
Timing Calibration in PET Using a Time Alignment Probe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moses, William W.; Thompson, Christopher J.
2006-05-05
We evaluate the Scanwell Time Alignment Probe for performing the timing calibration for the LBNL Prostate-Specific PET Camera. We calibrate the time delay correction factors for each detector module in the camera using two methods--using the Time Alignment Probe (which measures the time difference between the probe and each detector module) and using the conventional method (which measures the timing difference between all module-module combinations in the camera). These correction factors, which are quantized in 2 ns steps, are compared on a module-by-module basis. The values are in excellent agreement--of the 80 correction factors, 62 agree exactly, 17 differ by 1 step, and 1 differs by 2 steps. We also measure on-time and off-time counting rates when the two sets of calibration factors are loaded into the camera and find that they agree within statistical error. We conclude that the performance using the Time Alignment Probe and conventional methods is equivalent.
Research on a solid state-streak camera based on an electro-optic crystal
NASA Astrophysics Data System (ADS)
Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang
2006-06-01
With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, a streak camera is widely utilized in measuring ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and making measurements of laser-induced discharge. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity vs. position (or wavelength) information on the ultrafast process. The current streak camera is based on a sweep electric pulse and an image converting tube with a wavelength-sensitive photocathode ranging from the x-ray to near infrared region. This kind of streak camera is comparatively costly and complex. This paper describes the design and performance of a new-style streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera accomplishes time resolution by direct photon beam deflection using the electro-optic effect, and can replace the current streak camera from the visible to the near infrared region. After computer-aided simulation, we designed a crystal streak camera which has the potential of time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits, a crystal with a larger electro-optic coefficient, for example LN (γ33 = 33.6×10⁻¹² m/V), and an optimized optical system may lead to a time resolution better than 1 ns.
VizieR Online Data Catalog: BV(RI)c light curves of FF Vul (Samec+, 2016)
NASA Astrophysics Data System (ADS)
Samec, R. G.; Nyaude, R.; Caton, D.; van Hamme, W.
2017-02-01
The present BVRcIc light curves were taken by DC, the Dark Sky Observatory 0.81m reflector at Phillips Gap, North Carolina. These were taken on 2015 September 12, 13, 14 and 15, and October 15, with a thermoelectrically cooled (-40°C) 2*2K Apogee Alta camera. Additional observations were obtained remotely with the SARA north 0.91m reflector at KPNO on 2015 September 20 and October 11, with the ARC 2*2K camera cooled to -110°C. Individual observations were taken at both sites with standard Johnson-Cousins filters, and included 444 field images in B, 451 in V, 443 in Rc, and 445 in Ic. The standard error was ~7mmag in each of B, V, Rc and Ic. Nightly images were calibrated with 25 bias frames, five flat frames in each filter, and ten 300s dark frames. The exposure times were 40-50s in B, 25-30s in V, 15-25s in Rc and Ic. Our observations are listed in Table1. (1 data file).
Comparison of Brownian-dynamics-based estimates of polymer tension with direct force measurements.
Arsenault, Mark E; Purohit, Prashant K; Goldman, Yale E; Shuman, Henry; Bau, Haim H
2010-11-01
With the aid of Brownian dynamics models, it is possible to estimate polymer tension by monitoring polymers' transverse thermal fluctuations. To assess the precision of the approach, Brownian-dynamics-based tension estimates were compared with the force applied to rhodamine-phalloidin-labeled actin filaments bound to polymer beads and suspended between two optical traps. The transverse thermal fluctuations of each filament were monitored with a CCD camera, and the images were analyzed to obtain the filament's transverse displacement variance as a function of position along the filament, the filament's tension, and the camera's exposure time. A linear Brownian dynamics model was used to estimate the filament's tension. The estimated force was compared and agreed within 30% (when the tension was <0.1 pN) and 70% (when the tension was <1 pN) with the applied trap force. In addition, the paper presents concise asymptotic expressions for the mechanical compliance of a system consisting of a filament attached tangentially to bead handles (dumbbell system). The techniques described here can be used for noncontact estimates of polymers' and fibers' tension.
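In the simplest taut-string limit (bending stiffness, bead compliance, and finite-exposure blur all neglected), equipartition relates the transverse variance at position x along a pinned filament of length L to the tension F, which makes the inversion a one-liner. This is a textbook simplification used for illustration, not the linear Brownian dynamics model of the paper.

```python
# Taut-string sketch: for a pinned filament under tension F, equipartition
# over the string's normal modes gives
#     var[y](x) = kT * x * (L - x) / (F * L),
# so the tension follows directly from the variance measured at any x.

kT = 4.11e-21            # J, room temperature
L = 10e-6                # filament length, m (illustrative)

def variance_profile(x, F):
    return kT * x * (L - x) / (F * L)

def tension_from_variance(x, var):
    return kT * x * (L - x) / (var * L)

F_true = 0.5e-12         # 0.5 pN, in the range probed by the experiment
x = 0.3 * L
var = variance_profile(x, F_true)        # "measured" variance, ~1.7e-14 m^2
print(round(tension_from_variance(x, var) / 1e-12, 9))  # → 0.5 (pN)
```

At 0.5 pN this predicts transverse rms fluctuations of order 100 nm for a 10 µm filament, comfortably resolvable by a CCD camera, which is why the method works at low tensions.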
Calibration of HST wide field camera for quantitative analysis of faint galaxy images
NASA Technical Reports Server (NTRS)
Ratnatunga, Kavan U.; Griffiths, Richard E.; Casertano, Stefano; Neuschaefer, Lyman W.; Wyckoff, Eric W.
1994-01-01
We present the methods adopted to optimize the calibration of images obtained with the Hubble Space Telescope (HST) Wide Field Camera (WFC) (1991-1993). Our main goal is to improve quantitative measurement of faint images, with special emphasis on the faint (I approximately 20-24 mag) stars and galaxies observed as a part of the Medium-Deep Survey. Several modifications to the standard calibration procedures have been introduced, including improved bias and dark images, and a new supersky flatfield obtained by combining a large number of relatively object-free Medium-Deep Survey exposures of random fields. The supersky flat has a pixel-to-pixel rms error of about 2.0% in F555W and of 2.4% in F785LP; large-scale variations are smaller than 1% rms. Overall, our modifications improve the quality of faint images with respect to the standard calibration by about a factor of five in photometric accuracy and about 0.3 mag in sensitivity, corresponding to about a factor of two in observing time. The relevant calibration images have been made available to the scientific community.
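The supersky-flat idea, median-combining many normalized, mostly object-free exposures so that sources drop out and the pixel-to-pixel response remains, can be sketched as follows (synthetic frames, invented noise levels).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-ins: a 2% pixel-to-pixel response pattern and 100 exposures
# of random fields, each with its own sky level, read noise, and one bright
# "object" that the median combine should reject.
flat_true = 1.0 + 0.02 * rng.normal(size=(64, 64))

frames = []
for _ in range(100):
    sky = rng.uniform(80.0, 120.0)
    img = sky * flat_true + rng.normal(0.0, 1.0, flat_true.shape)
    r, c = rng.integers(0, 60, size=2)
    img[r:r+4, c:c+4] += rng.uniform(200.0, 500.0)   # the "object"
    frames.append(img / np.median(img))              # normalize out the sky level

supersky = np.median(frames, axis=0)
supersky /= np.median(supersky)

residual = np.abs(supersky - flat_true / np.median(flat_true)).max()
print(residual < 0.02)
```

Because each pixel is contaminated by an object in only a small fraction of the frames, the per-pixel median recovers the response pattern to well under the percent level, the same regime as the 2.0-2.4% rms flats quoted above.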
First photometric properties of Dome C, Antarctica
NASA Astrophysics Data System (ADS)
Chadid, M.; Vernin, J.; Jeanneaux, F.; Mekarnia, D.; Trinquet, H.
2008-07-01
Here we present the first photometric extinction measurements in the visible range performed at Dome C in Antarctica, using the PAIX photometer (Photometer AntarctIca eXtinction). It is made with "off the shelf" components: an Audine camera at the focus of the Blazhko telescope, a Meade M16 diaphragmed down to 15 cm. For an exposure time of 60 s without filter, a 10th V-magnitude star is measured with a precision of 1/100 mag. A first statistical analysis over 16 nights in August 2007 yields an extinction of 0.5 magnitude per air mass, possibly due to high-altitude cirrus. This rather simple experiment shows that continuous observations can be performed at Dome C, allowing high frequency resolution in pulsation and asteroseismology studies. Light curves of the RR Lyrae star S Ara were established; they show the typical shape of an RR Lyrae light curve. A more sophisticated photometer, PAIX II, was installed at Dome C during the polar summer of 2008, with an ST10 XME camera, automatic guiding, autofocusing, and Johnson/Bessell UBVRI filter wheels.
Study of the detail content of Apollo orbital photography
NASA Technical Reports Server (NTRS)
Kinzly, R. E.
1972-01-01
The results achieved during a study of the Detail Content of Apollo Orbital Photography are reported. The effect of residual motion smear and image reproduction processes upon the detail content of lunar surface imagery obtained from the orbiting command module is assessed. Data and conclusions obtained from the Apollo 8, 12, 14 and 15 missions are included. For the Apollo 8, 12 and 14 missions, the bracket-mounted Hasselblad camera had no mechanism internal to the camera for motion compensation. If the motion of the command module were left totally uncompensated, these photographs would exhibit a ground smear varying from 12 to 27 meters depending upon the focal length of the lens and the exposure time. During the photographic sequences motion compensation was attempted by firing the attitude control system of the spacecraft at a rate to compensate for the motion relative to the lunar surface. The residual smear occurring in selected frames of imagery was assessed using edge analysis methods to obtain an achieved modulation transfer function (MTF), which was compared to a baseline MTF.
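The edge-analysis route to an achieved MTF can be sketched generically: differentiate the edge spread function into a line spread function, then take its Fourier magnitude. The Gaussian blur widths below are arbitrary stand-ins; a smeared exposure shows up as a depressed MTF relative to the baseline.

```python
import numpy as np
from math import erf

# Synthetic edge profiles with two assumed Gaussian blur widths; the "smeared"
# case stands in for uncompensated spacecraft motion.

x = np.arange(-64, 64)

def blurred_edge(sigma):
    return np.array([(1.0 + erf(t / (sigma * np.sqrt(2.0)))) / 2.0 for t in x])

def mtf_from_edge(edge):
    lsf = np.diff(edge)          # edge spread -> line spread function
    lsf = lsf / lsf.sum()
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]          # normalize to unity at zero frequency

mtf_baseline = mtf_from_edge(blurred_edge(1.0))
mtf_smeared = mtf_from_edge(blurred_edge(3.0))
print(bool(np.all(mtf_smeared[1:20] <= mtf_baseline[1:20])))  # → True
```

Comparing the achieved curve against the baseline at each spatial frequency is exactly the comparison the study describes.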
VizieR Online Data Catalog: NIR polarimetric study in the LMC N159/N160 field (Kim+, 2017)
NASA Astrophysics Data System (ADS)
Kim, J.; Jeong, W.-S.; Pyo, J.; Pak, S.; Park, W.-K.; Kwon, J.; Tamura, M.
2018-04-01
Simultaneous JHKs polarimetric observations of the N159/N160 fields were performed on 2007 February 3 and 5. We used the near-infrared camera SIRIUS (Nagayama et al. 2003SPIE.4841..459N) and the polarimeter SIRPOL (Kandori et al. 2006SPIE.6269E..51K) of the Infrared Survey Facility (IRSF) 1.4 m telescope at the South African Astronomical Observatory in Sutherland, South Africa. The camera has a field of view of 7.7'x7.7' and a pixel scale of 0.45"/pixel. One set of observations for a target field consisted of 20 s exposures at 10 dithered positions for four wave-plate angles (0°, 45°, 22.5°, and 67.5°) in the J, H, and Ks bands, and the whole sequence was repeated 10 and 9 times for the N159 and N160 fields centered at (α, δ)2000=(5h39m37.1s, -69°43'45.1") and (5h40m05.6s, -69°36'25.8"), respectively. (2 data files).
NASA Technical Reports Server (NTRS)
Ramesham, Rajeshuni; Maki, Justin N.; Cucullu, Gordon C.
2008-01-01
Package Qualification and Verification (PQV) of advanced electronic packaging and interconnect technologies, and of various other types of qualification hardware for the Mars Exploration Rover/Mars Science Laboratory flight projects, has been performed to enhance mission assurance. The qualification of hardware (Engineering Camera and Platinum Resistance Thermometer, PRT) under extreme cold temperatures has been performed with reference to various project requirements. Flight-like packages, sensors, and subassemblies were selected for the study to survive three times (3x) the total number of expected temperature cycles resulting from all environmental and operational exposures occurring over the life of the flight hardware, including all relevant manufacturing, ground operations and mission phases. Qualification has been performed by subjecting the above flight-like qualification hardware to the environmental temperature extremes and assessing any structural failures or degradation in electrical performance due to either overstress or thermal cycle fatigue. Qualification test results for this flight-like hardware are described in this paper.
Implicit multiplane 3D camera calibration matrices for stereo image processing
NASA Astrophysics Data System (ADS)
McKee, James W.; Burgett, Sherrie J.
1997-12-01
By implicit camera calibration, we mean the process of calibrating cameras without explicitly computing their physical parameters. We introduce a new implicit model based on a generalized mapping between an image plane and multiple, parallel calibration planes (usually between four and seven planes). This paper presents a method of computing a relationship between a point on a three-dimensional (3D) object and its corresponding two-dimensional (2D) coordinate in a camera image. This relationship is expanded to form a mapping of points in 3D space to points in image (camera) space, and vice versa, that requires only matrix multiplication operations. This paper presents the rationale behind the selection of the forms of four matrices and the algorithms to calculate the parameters for the matrices. Two of the matrices are used to map 3D points in object space to 2D points on the CCD camera image plane. The other two matrices are used to map 2D points on the image plane to points on user-defined planes in 3D object space. The mappings include compensation for lens distortion and measurement errors. The number of parameters used can be increased, in a straightforward fashion, to calculate and use as many parameters as needed to obtain a user-desired accuracy. Previous methods of camera calibration use a fixed number of parameters, which can limit the obtainable accuracy, and most require the solution of nonlinear equations. The procedure presented can be used to calibrate a single camera to make 2D measurements or to calibrate stereo cameras to make 3D measurements. Positional accuracy of better than 3 parts in 10,000 has been achieved. The algorithms in this paper were developed and implemented in MATLAB (registered trademark of The MathWorks, Inc.). We have developed a system to analyze the path of optical fiber during high-speed payout (unwinding) of optical fiber off a bobbin.
This requires recording and analyzing high-speed (5 microsecond exposure time), synchronous, stereo images of the optical fiber during payout. A 3D equation for the fiber at an instant in time is calculated from the corresponding pair of stereo images as follows. In each image, about 20 points along the 2D projection of the fiber are located. Each of these 'fiber points' in one image is mapped to its projection line in 3D space. Each projection line is mapped into another line in the second image. The intersection of each mapped projection line with a curve fitted to the fiber points of the second image (the fiber projection in the second image) is calculated. Each intersection point is mapped back to 3D space. A 3D fiber coordinate is formed from the intersection, in 3D space, of a mapped intersection point with its corresponding projection line. The 3D equation for the fiber is computed from this ordered list of 3D coordinates. This process requires a method of accurately mapping 2D (image space) to 3D (object space) and vice versa.
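The point-to-image mapping by pure matrix multiplication can be illustrated with the classical direct linear transform (DLT), a sketch of the same idea rather than the paper's four-matrix multiplane formulation; the function names below are illustrative.

```python
import numpy as np

def fit_projection_matrix(pts3d, pts2d):
    """Least-squares fit of a 3x4 matrix P such that [u, v, 1]^T ~ P @ [x, y, z, 1]^T.

    This is the classical DLT, given as a minimal stand-in for the paper's
    implicit multiplane calibration matrices.
    """
    rows = []
    for (x, y, z), (u, v) in zip(pts3d, pts2d):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z, -v])
    # Null-space solution: right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)

def project(P, pt3d):
    """Map a 3D object point to 2D image coordinates by matrix multiplication."""
    uvw = P @ np.append(pt3d, 1.0)
    return uvw[:2] / uvw[2]
```

With at least six non-degenerate correspondences the fit is exact for noise-free data; the paper's approach extends this by adding parameters (and lens-distortion compensation) until a desired accuracy is reached.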
Video-Camera-Based Position-Measuring System
NASA Technical Reports Server (NTRS)
Lane, John; Immer, Christopher; Brink, Jeffrey; Youngquist, Robert
2005-01-01
A prototype optoelectronic system measures the three-dimensional relative coordinates of objects of interest or of targets affixed to objects of interest in a workspace. The system includes a charge-coupled-device video camera mounted in a known position and orientation in the workspace, a frame grabber, and a personal computer running image-data-processing software. Relative to conventional optical surveying equipment, this system can be built and operated at much lower cost; however, it is less accurate. It is also much easier to operate than are conventional instrumentation systems. In addition, there is no need to establish a coordinate system through cooperative action by a team of surveyors. The system operates in real time at around 30 frames per second (limited mostly by the frame rate of the camera). It continuously tracks targets as long as they remain in the field of view of the camera. In this respect, it emulates more expensive, elaborate laser tracking equipment that costs on the order of 100 times as much. Unlike laser tracking equipment, this system does not pose a hazard of laser exposure. Images acquired by the camera are digitized and processed to extract all valid targets in the field of view. The three-dimensional coordinates (x, y, and z) of each target are computed from the pixel coordinates of the targets in the images to an accuracy of the order of millimeters over distances of the order of meters. The system was originally intended specifically for real-time position measurement of payload transfers from payload canisters into the payload bay of the Space Shuttle Orbiters (see Figure 1). The system may be easily adapted to other applications that involve similar coordinate-measuring requirements. Examples of such applications include manufacturing, construction, preliminary approximate land surveying, and aerial surveying.
For some applications with rectangular symmetry, it is feasible and desirable to attach a target composed of black and white squares to an object of interest (see Figure 2). For other situations, where circular symmetry is more desirable, circular targets also can be created. Such a target can readily be generated and modified by use of commercially available software and printed by use of a standard office printer. All three relative coordinates (x, y, and z) of each target can be determined by processing the video image of the target. Because of the unique design of corresponding image-processing filters and targets, the vision-based position-measurement system is extremely robust and tolerant of widely varying fields of view, lighting conditions, and varying background imagery.
Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera
NASA Astrophysics Data System (ADS)
Dziri, Aziz; Duranton, Marc; Chapuis, Roland
2016-07-01
Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.
NASA Astrophysics Data System (ADS)
Nunez, Jorge; Llacer, Jorge
1993-10-01
This paper describes a general Bayesian iterative algorithm with entropy prior for image reconstruction. It solves the cases of both pure Poisson data and Poisson data with Gaussian readout noise. The algorithm maintains positivity of the solution; it includes case-specific prior information (default map) and flatfield corrections; it removes background and can be accelerated to be faster than the Richardson-Lucy algorithm. In order to determine the hyperparameter that balances the entropy and likelihood terms in the Bayesian approach, we have used a likelihood cross-validation technique. Cross-validation is more robust than other methods because it is less demanding in terms of the knowledge of exact data characteristics and of the point-spread function. We have used the algorithm to reconstruct successfully images obtained in different space- and ground-based imaging situations. It has been possible to recover most of the original intended capabilities of the Hubble Space Telescope (HST) wide field and planetary camera (WFPC) and faint object camera (FOC) from images obtained in their present state. Semi-real simulations for the future wide field planetary camera 2 show that even after the repair of the spherical aberration problem, image reconstruction can play a key role in improving the resolution of the cameras, well beyond the design of the Hubble instruments. We also show that ground-based images can be reconstructed successfully with the algorithm. A technique which consists of dividing the CCD observations into two frames, with one-half the exposure time each, emerges as a recommended procedure for the utilization of the described algorithms. We have compared our technique with two commonly used reconstruction algorithms: the Richardson-Lucy and the Cambridge maximum entropy algorithms.
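For reference, the baseline Richardson-Lucy iteration that the accelerated Bayesian algorithm is compared against can be sketched as follows, shown in 1-D for brevity (the 2-D image case replaces np.convolve with a 2-D convolution); this is the standard iteration, not the paper's entropy-prior variant.

```python
import numpy as np

def richardson_lucy(data, psf, n_iter=200):
    """Plain Richardson-Lucy deconvolution for Poisson data (1-D sketch)."""
    psf = np.asarray(psf, dtype=float)
    psf_mirror = psf[::-1]  # correlation kernel for the back-projection step
    est = np.full_like(np.asarray(data, dtype=float), np.mean(data))
    for _ in range(n_iter):
        blurred = np.convolve(est, psf, mode="same")
        ratio = data / np.maximum(blurred, 1e-12)  # guard against divide-by-zero
        est = est * np.convolve(ratio, psf_mirror, mode="same")
    return est
```

The iteration preserves positivity automatically, one of the properties the Bayesian algorithm above retains while adding the entropy prior and acceleration.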
[Determination of radioactivity by smartphones].
Hartmann, H; Freudenberg, R; Andreeff, M; Kotzerke, J
2013-01-01
The interest in the detection of radioactive materials has increased strongly after the accident at the Fukushima nuclear power plant and has led to a bottleneck in suitable measuring instruments. Smartphones equipped with a commercially available software tool could be used for dose-rate measurements following a calibration specific to the camera module. We examined whether such measurements provide reliable data for typical activities and radionuclides in nuclear medicine. For the nuclides 99mTc (10 - 1000 MBq), 131I (3.7 - 1800 MBq, therapy capsule) and 68Ga (50 - 600 MBq), radioactivity with defined geometry was measured at different distances. The smartphones Milestone Droid 1 (Motorola) and HTC Desire (HTC Corporation) were compared with the standard instruments AD6 (automess) and DoseGUARD (AEA Technology). Measurements with the smartphones and the other devices show good agreement: a linear signal increase with rising activity and dose rate. The long-term measurement (131I, 729 MBq, 0.5 m, 60 min) demonstrates a considerably higher variation (by 20%) of the measured smartphone values compared with the AD6. For low dose rates (< 1 µGy/h), the sensitivity decreases, so that measurements of, e.g., the natural radiation exposure do not yield valid results. The calibration of the camera responsivity of the smartphone strongly influences the results because of the small detector area of the camera semiconductor. With commercial software, the camera module of a smartphone can be used for the measurement of radioactivity. Dose rates resulting from typical nuclear medicine procedures can be measured reliably (e.g., the dismissal dose after radioiodine therapy). The signal shows a high correlation with the values measured by conventional dose measurement devices.
Re-visiting the Amplifier Gains of the HST/ACS Wide Field Channel CCDs
NASA Astrophysics Data System (ADS)
Desjardins, Tyler D.; Grogin, Norman A.; ACS Team
2018-06-01
For the first time since HST Servicing Mission 4 (SM4) in May 2009, we present an analysis of the amplifier gains of the Advanced Camera for Surveys (ACS) Wide Field Channel (WFC) CCDs. Using a series of in-flight flat-field exposures taken in November 2017 with a tungsten calibration lamp, we utilize the photon transfer method to estimate the gains of the WFC1 and WFC2 CCD amplifiers. We find evidence that the gains of the four readout amplifiers have changed by a small, but statistically significant, 1–2% since SM4. We further present a study of historical ACS/WFC observations of the globular cluster NGC 104 (47 Tuc) in an attempt to estimate the time dependence of the gains.
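The photon transfer method named above can be sketched in a few lines: for shot-noise-limited flat fields, the gain K (e-/DN) is the ratio of signal mean to shot-noise variance, and differencing two flats at the same light level cancels fixed-pattern noise. This is a generic sketch of the method, not the ACS team's pipeline.

```python
import numpy as np

def photon_transfer_gain(flat1, flat2, bias=0.0):
    """Estimate gain K (e-/DN) from a pair of bias-subtracted flat fields.

    With N electrons ~ Poisson and DN = N/K: mean(DN) = Nbar/K and
    var(DN) = Nbar/K^2, so K = mean/var.  The variance of the difference
    frame is twice the single-frame shot variance, hence the factor of 2.
    """
    s1 = np.asarray(flat1, dtype=float) - bias
    s2 = np.asarray(flat2, dtype=float) - bias
    mean_signal = 0.5 * (s1.mean() + s2.mean())  # DN
    diff_var = np.var(s1 - s2)                   # 2 * shot variance, DN^2
    return 2.0 * mean_signal / diff_var          # e- per DN
```

In practice a full photon transfer curve is built from pairs at many illumination levels and the gain taken from the slope of the shot-noise-limited regime.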
Yu, Cilong; Chen, Peibing; Zhong, Xiaopin; Pan, Xizhou; Deng, Yuanlong
2018-05-07
Machine vision systems have been widely used in industrial production lines because of their automation and contactless inspection mode. In polymeric polarizers, extremely slight transparent aesthetic defects are difficult to detect and characterize with conventional illumination. To inspect such defects rapidly and accurately, a saturated imaging technique was proposed, which innovatively exploits the characteristics of saturated light in imaging by adjusting the light intensity, exposure time, and camera gain. An optical model of the defect was established to explain the theory through simulation. Based on the optimum experimental conditions, active two-step scanning was conducted to demonstrate the feasibility of this detection scheme, and the proposed method was found to be efficient for real-time and in situ inspection of defects in polymer films and products.
Low-cost digital dynamic visualization system
NASA Astrophysics Data System (ADS)
Asundi, Anand K.; Sajan, M. R.
1995-05-01
High-speed photographic systems like the image rotation camera, the Cranz-Schardin camera, and the drum camera are typically used for recording and visualizing dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore, they are all based on photographic film recording, requiring time-consuming and tedious wet processing of the films. Currently, digital cameras are replacing conventional cameras to a certain extent for static experiments. Recently, there has been much interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration (TDI) mode for digitally recording dynamic scenes. Applications to solid as well as fluid impact problems are presented.
Hardware accelerator design for tracking in smart camera
NASA Astrophysics Data System (ADS)
Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Vohra, Anil
2011-10-01
Smart cameras are important components in video analysis. For video analysis, a smart camera needs to detect interesting moving objects, track such objects from frame to frame, and analyze object tracks in real time. Therefore, real-time tracking is prominent in smart cameras. A software implementation of a tracking algorithm on a general-purpose processor (like a PowerPC) achieves a low frame rate, far from real-time requirements. This paper presents a SIMD-approach-based hardware accelerator designed for real-time tracking of objects in a scene. The system was designed and simulated using VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA. The resulting frame rate is 30 frames per second for 250x200-resolution grayscale video.
An electron energy loss spectrometer based streak camera for time resolved TEM measurements.
Ali, Hasan; Eriksson, Johan; Li, Hu; Jafri, S Hassan M; Kumar, M S Sharath; Ögren, Jim; Ziemann, Volker; Leifer, Klaus
2017-05-01
We propose an experimental setup based on a streak camera approach inside an energy filter to measure time-resolved properties of materials in the transmission electron microscope (TEM). To implement the streak camera, a beam sweeper was built inside an energy filter. After exciting the TEM sample, the beam is swept across the CCD camera of the filter. We describe the different parts of the setup using the example of a magnetic measurement. This setup is capable of acquiring time-resolved diffraction patterns, electron energy loss spectra (EELS), and images with total streaking times in the range between 100 ns and 10 μs. Copyright © 2016 Elsevier B.V. All rights reserved.
High-performance dual-speed CCD camera system for scientific imaging
NASA Astrophysics Data System (ADS)
Simpson, Raymond W.
1996-03-01
Traditionally, scientific camera systems were partitioned into a 'camera head' containing the CCD and its support circuitry and a camera controller, which provided analog-to-digital conversion, timing, control, computer interfacing, and power. A new, unitized high-performance scientific CCD camera with dual-speed readout at 1x10^6 or 5x10^6 pixels per second, 12-bit digital gray scale, high-performance thermoelectric cooling, and built-in composite video output is described. This camera provides all digital, analog, and cooling functions in a single compact unit. The new system incorporates the A/D converter, timing, control, and computer interfacing in the camera, with the power supply remaining a separate remote unit. A 100 Mbyte/second serial link transfers data over copper or fiber media to a variety of host computers, including Sun, SGI, SCSI, PCI, EISA, and Apple Macintosh. Having all the digital and analog functions in the camera made it possible to modify this system for the Woods Hole Oceanographic Institution for use on a remotely controlled submersible vehicle. The oceanographic version achieves 16-bit dynamic range at 1.5x10^5 pixels/second, can be operated at depths of 3 kilometers, and transfers data to the surface via a real-time fiber-optic link.
Characterizing volcanic activity: Application of freely-available webcams
NASA Astrophysics Data System (ADS)
Dehn, J.; Harrild, M.; Webley, P. W.
2017-12-01
In recent years, freely-available web-based cameras, or webcams, have become more readily available, allowing an increased level of monitoring at active volcanoes across the globe. While these cameras have been used extensively as qualitative tools, they provide a unique dataset for quantitative analyses of the changing behavior of the particular volcano within the camera's field of view. We focus on the multitude of these freely-available webcams and present a new algorithm to detect changes in volcanic activity using nighttime webcam data. Our approach uses a quick, efficient, and fully automated algorithm to identify changes in webcam data in near real-time, including techniques such as edge detection, Gaussian mixture models, and temporal/spatial statistical tests, which are applied to each target image. Often the image metadata (exposure, gain settings, aperture, focal length, etc.) are unknown, so we developed our algorithm to identify the quantity of volcanically incandescent pixels, as well as the number of specific algorithm tests needed to detect thermal activity, instead of directly correlating webcam brightness to eruption temperatures. We compared our algorithm results to a manual analysis of webcam data for several volcanoes and determined a false detection rate of less than 3% for the automated approach. In our presentation, we describe the different tests integrated into our algorithm, lessons learned, and how we applied our method to several volcanoes across the North Pacific during its development and implementation. We will finish with a discussion of the global applicability of our approach and how to build a 24/7, 365-days-a-year tool that can be used as an additional data source for real-time analysis of volcanic activity.
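A toy stand-in for the detection step (NOT the authors' algorithm, which combines edge detection, Gaussian mixture models, and temporal/spatial statistics) illustrates why relative thresholds are used when exposure and gain metadata are unknown: pixels are flagged against the nighttime background statistics rather than against an absolute brightness-to-temperature mapping.

```python
import numpy as np

def count_incandescent_pixels(frame, background, k=5.0):
    """Count pixels brighter than the nighttime background mean by k sigma.

    A crude, illustrative detector: with unknown camera exposure/gain, the
    background frame itself supplies the reference statistics.
    """
    threshold = background.mean() + k * background.std()
    return int(np.count_nonzero(frame > threshold))
```

A real pipeline would add the temporal persistence and spatial-clustering tests described above to push the false detection rate down to the few-percent level reported.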
Edge Turbulence Imaging in Alcator C-Mod
NASA Astrophysics Data System (ADS)
Zweben, Stewart J.
2001-10-01
This talk will describe measurements and modeling of the 2-D structure of edge turbulence in Alcator C-Mod. The radial vs. poloidal structure was measured using Gas Puff Imaging (GPI) (R. Maqueda et al, RSI 72, 931 (2001); J. Terry et al, J. Nucl. Materials 290-293, 757 (2001)), in which the visible light emitted by an edge neutral gas puff (generally D or He) is viewed along the local magnetic field by a fast-gated video camera. Strong fluctuations are observed in the gas cloud light emission when the camera is gated at ~2 microsec exposure time per frame. The structure of these fluctuations is highly turbulent, with a typical radial and poloidal scale of ≈1 cm, and often with local maxima in the scrape-off layer (i.e. ``blobs"). Video clips and analyses of these images will be presented along with their variation in different plasma regimes. The local time dependence of edge turbulence is measured using high-speed photodiodes viewing the gas puff emission, a scanning Langmuir probe, and also with a Princeton Scientific Instruments ultra-fast framing camera, which can make 2-D images of the gas puff at up to 200,000 frames/sec. Probe measurements show that the strong turbulence region moves to the separatrix as the density limit is approached, which may be connected to the density limit (B. LaBombard et al., Phys. Plasmas 8, 2107 (2001)). Comparisons of this C-Mod turbulence data will be made with results of simulations from the Drift-Ballooning Mode (DBM) (B.N. Rogers et al, Phys. Rev. Lett. 20, 4396 (1998)) and Non-local Edge Turbulence (NLET) codes.
NASA Astrophysics Data System (ADS)
Liu, Yu-Che; Huang, Chung-Lin
2013-03-01
This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. The three main concerns of the algorithm are (1) imagery of the human object's face for biometric purposes, (2) optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions, such as the camera-subject distance, pan-tilt angles at capture, face visibility, and others. This objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
View of 'Cape Verde' from 'Cape St. Mary' in Mid-Afternoon (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately false-color mosaic. The exposures were taken during mid-afternoon lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters. The false color enhances subtle color differences among materials in the rocks and soils of the scene.
View of 'Cape Verde' from 'Cape St. Mary' in Late Morning (False Color)
NASA Technical Reports Server (NTRS)
2006-01-01
As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into a false-color mosaic. The exposures were taken during late-morning lighting conditions. The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters. The false color enhances subtle color differences among materials in the rocks and soils of the scene.
Particle and heat flux estimates in Proto-MPEX in Helicon Mode with IR imaging
NASA Astrophysics Data System (ADS)
Showers, M. A.; Biewer, T. M.; Caughman, J. B. O.; Donovan, D. C.; Goulding, R. H.; Rapp, J.
2016-10-01
The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) at Oak Ridge National Laboratory (ORNL) is a linear plasma device developing the plasma source concept for the Material Plasma Exposure eXperiment (MPEX), which will address plasma material interaction (PMI) science for future fusion reactors. To better understand how and where energy is being lost from the Proto-MPEX plasma during ``helicon mode'' operations, particle and heat fluxes are quantified at multiple locations along the machine length. Relevant diagnostics include infrared (IR) cameras, four double Langmuir probes (LPs), and in-vessel thermocouples (TCs). The IR cameras provide temperature measurements of Proto-MPEX's plasma-facing dump and target plates, located on either end of the machine. The change in surface temperature is measured over the duration of the plasma shot to determine the heat flux hitting the plates. The IR cameras additionally provide 2-D thermal load distribution images of these plates, highlighting Proto-MPEX plasma behaviors, such as hot spots. The LPs and TCs provide additional plasma measurements required to determine particle and heat fluxes. Quantifying axial variations in fluxes will help identify machine operating parameters that will improve Proto-MPEX's performance, increasing its PMI research capabilities. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725.
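The shot-averaged heat flux inferred from the plate temperature rise can be sketched with a lumped-capacitance estimate; this is a simplification of what IR-thermography analyses actually do (often a full 1-D conduction solution), and the material values in the test are illustrative assumptions, not Proto-MPEX numbers.

```python
def surface_heat_flux(delta_T, shot_duration, thickness, rho, c_p):
    """Average heat flux (W/m^2) that raises a thin plate by delta_T (K)
    over shot_duration (s), treating the plate as a lumped thermal mass:

        q = rho * c_p * thickness * (dT / dt)

    Valid only when the plate is thin enough to equilibrate through its
    thickness within the shot; otherwise a 1-D conduction model is needed.
    """
    return rho * c_p * thickness * delta_T / shot_duration
```

For example, a 20 K rise of a 1 mm stainless-steel-like plate (rho ~ 7900 kg/m^3, c_p ~ 500 J/kg/K, assumed values) over a 0.5 s shot implies an average flux of roughly 160 kW/m^2.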
A hierarchical model for estimating density in camera-trap studies
Royle, J. Andrew; Nichols, James D.; Karanth, K.Ullas; Gopalaswamy, Arjun M.
2009-01-01
Estimating animal density using capture–recapture data from arrays of detection devices such as camera traps has been problematic due to the movement of individuals and heterogeneity in capture probability among them induced by differential exposure to trapping. We develop a spatial capture–recapture model for estimating density from camera-trapping data which contains explicit models for the spatial point process governing the distribution of individuals and their exposure to and detection by traps. We adopt a Bayesian approach to analysis of the hierarchical model using the technique of data augmentation. The model is applied to photographic capture–recapture data on tigers Panthera tigris in Nagarahole reserve, India. Using this model, we estimate the density of tigers to be 14.3 animals per 100 km² during 2004. Synthesis and applications: Our modelling framework largely overcomes several weaknesses in conventional approaches to the estimation of animal density from trap arrays. It effectively deals with key problems such as individual heterogeneity in capture probabilities, movement of traps, presence of potential ‘holes’ in the array and ad hoc estimation of sample area. The formulation, thus, greatly enhances flexibility in the conduct of field surveys as well as in the analysis of data, from studies that may involve physical, photographic or DNA-based ‘captures’ of individual animals.
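The "exposure to traps" component of such spatial capture–recapture models is commonly a half-normal encounter function, sketched below with illustrative parameter values; the paper's exact formulation and its Bayesian data augmentation are not reproduced here.

```python
import numpy as np

def halfnormal_detection(activity_center, trap_locs, p0=0.5, sigma=1.0):
    """Detection probability at each trap for an individual with the given
    activity centre, under the half-normal encounter model often used in
    spatial capture-recapture:

        p(d) = p0 * exp(-d^2 / (2 * sigma^2))

    p0 is the detection probability at zero distance; sigma sets how quickly
    exposure to a trap falls off with distance (same units as coordinates).
    """
    d2 = np.sum((np.asarray(trap_locs, dtype=float)
                 - np.asarray(activity_center, dtype=float)) ** 2, axis=1)
    return p0 * np.exp(-d2 / (2.0 * sigma ** 2))
```

Modelling detection this way is what lets the framework absorb individual heterogeneity: animals centred far from the array are simply less exposed, rather than being treated as a nuisance source of capture-probability variation.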
Pirie, Chris G; Pizzirani, Stefano
2011-12-01
To describe a digital single-lens reflex (dSLR) camera adaptor for posterior segment photography. A total of 30 normal canine and feline animals were imaged using a dSLR adaptor which mounts between a dSLR camera body and lens. Posterior segment viewing and imaging were performed with the aid of an indirect lens ranging from 28-90D. Coaxial illumination for viewing was provided by a single white light-emitting diode (LED) within the adaptor, while illumination during exposure was provided by the pop-up flash or an accessory flash. Corneal and/or lens reflections were reduced using a pair of linear polarizers with their azimuths perpendicular to one another. Quality high-resolution, reflection-free digital images of the retina were obtained. Subjective image evaluation demonstrated the same amount of detail as a conventional fundus camera. A wide range of magnifications [1.2-4X] and/or fields of view [31-95 degrees, horizontal] were obtained by altering the indirect lens utilized. The described adaptor may provide an alternative to existing fundus camera systems. Quality images were obtained, and the adaptor proved to be versatile, portable, and low in cost.
SPADAS: a high-speed 3D single-photon camera for advanced driver assistance systems
NASA Astrophysics Data System (ADS)
Bronzi, D.; Zou, Y.; Bellisai, S.; Villa, F.; Tisa, S.; Tosi, A.; Zappa, F.
2015-02-01
Advanced Driver Assistance Systems (ADAS) are the most advanced technologies to fight road accidents. Within ADAS, an important role is played by radar- and lidar-based sensors, which are mostly employed for collision avoidance and adaptive cruise control. Nonetheless, they have a narrow field-of-view and a limited ability to detect and differentiate objects. Standard camera-based technologies (e.g. stereovision) could balance these weaknesses, but they are currently not able to fulfill all automotive requirements (distance range, accuracy, acquisition speed, and frame rate). To this purpose, we developed an automotive-oriented CMOS single-photon camera for optical 3D ranging based on indirect time-of-flight (iTOF) measurements. Imagers based on single-photon avalanche diode (SPAD) arrays offer higher sensitivity with respect to CCD/CMOS rangefinders, inherently better time resolution, higher accuracy, and better linearity. Moreover, iTOF requires neither high-bandwidth electronics nor short-pulsed lasers, hence allowing the development of cost-effective systems. The CMOS SPAD sensor is based on 64 × 32 pixels, each able to process both 2D intensity data and 3D depth-ranging information, with background suppression. Pixel-level memories allow fully parallel imaging and prevent motion artefacts (skew, wobble, motion blur) and partial exposure effects, which otherwise would hinder the detection of fast-moving objects. The camera is housed in an aluminum case supporting a 12 mm F/1.4 C-mount imaging lens, with a 40°×20° field-of-view. The whole system is very rugged and compact, a perfect solution for a vehicle's cockpit, with dimensions of 80 mm × 45 mm × 70 mm and less than 1 W consumption. To provide the required optical power (1.5 W, eye safe) and to allow fast (up to 25 MHz) modulation of the active illumination, we developed a modular laser source, based on five laser driver cards, with three 808 nm lasers each.
We present the full characterization of the 3D automotive system, operated both at night and during daytime, indoors and outdoors, in real traffic scenarios. The achieved long range (up to 45 m), high dynamic range (118 dB), high speed (over 200 fps) of 3D depth measurement, and high precision (better than 90 cm at 45 m) highlight the excellent performance of this CMOS SPAD camera for automotive applications.
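The iTOF principle underlying this ranging can be stated in two lines: distance follows from the phase shift of the modulated illumination, and the modulation frequency sets the unambiguous range. A sketch (the 25 MHz default matches the maximum modulation frequency quoted for the laser source; lower frequencies extend the unambiguous range):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(phase_shift_rad, f_mod_hz=25e6):
    """Indirect time-of-flight distance: d = c * dphi / (4 * pi * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz):
    """Maximum distance before the phase wraps: c / (2 * f_mod)."""
    return C / (2.0 * f_mod_hz)
```

At 25 MHz the unambiguous range is only about 6 m, which is why reaching the 45 m range reported above in practice involves lower modulation frequencies or phase-unwrapping schemes.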
NASA Astrophysics Data System (ADS)
Sabin, Lisa D.; Kozawa, Kathleen; Behrentz, Eduardo; Winer, Arthur M.; Fitz, Dennis R.; Pankratz, David V.; Colome, Steven D.; Fruin, Scott A.
Variables affecting children's exposure during school bus commutes were investigated using real-time measurements of black carbon (BC), particle-bound polycyclic aromatic hydrocarbons (PB-PAH) and nitrogen dioxide (NO2) inside 3 conventional diesel school buses, a particle trap-outfitted (TO) diesel school bus and a compressed natural gas (CNG) school bus, while traveling along an urban Los Angeles Unified School District bus route. A video camera was mounted at the front of each bus to record roadway conditions ahead of the bus during each commute. The videotapes from 12 commutes, in conjunction with pollutant concentration time series, were used to determine the influence of variables such as vehicles being followed, bus type and roadway type on pollutant concentrations inside the bus. For all buses tested, the highest concentrations of BC, PB-PAH and NO2 were observed when following a diesel school bus, especially if that bus was emitting visible exhaust. This result was important because other diesel school buses were responsible for the majority of the diesel vehicle encounters, primarily due to caravanning with each other when leaving a school at the same time. Compared with following a gasoline vehicle or no target, following a smoky diesel school bus yielded BC and PB-PAH concentrations inside the cabin 8 and 11 times higher, respectively, with windows open, and ~1.8 times higher for both pollutants with windows closed. When other diesel vehicles were not present, pollutant concentrations were highest inside the conventional diesel buses and lowest inside the CNG bus, while the TO diesel bus exhibited intermediate concentrations. Differences in pollutant concentrations between buses were most pronounced with the bus windows closed, and were attributed to a combination of higher concentrations in the exhaust and higher exhaust gas intrusion rates for the conventional diesel buses.
Conventional diesel school buses can have a double exposure impact on commuting children: first, exposure to the exhaust from other nearby diesel school buses and, second, exposure to the bus's own exhaust through "self-pollution".
Electronic cameras for low-light microscopy.
Rasnik, Ivan; French, Todd; Jacobson, Ken; Berland, Keith
2013-01-01
This chapter introduces electronic cameras, discusses the parameters used to evaluate their performance, and describes some of the key features of different camera formats. The chapter also presents a basic understanding of how electronic cameras function and how their properties can be exploited to optimize image quality under low-light conditions. Although many types of cameras are available for microscopy, the most reliable is the charge-coupled device (CCD) camera, which remains preferred for high-performance systems. If time resolution and frame rate are of no concern, slow-scan CCDs certainly offer the best available performance, both in terms of signal-to-noise ratio and spatial resolution. Slow-scan cameras are thus the first choice for experiments on fixed specimens, such as measurements using immunofluorescence and fluorescence in situ hybridization. However, if video-rate imaging is required, slow-scan CCD cameras need not be considered. A very basic video CCD may suffice if samples are heavily labeled or are not perturbed by high-intensity illumination. When video-rate imaging is required for very dim specimens, the electron-multiplying CCD camera is probably the most appropriate at this technological stage. Intensified CCDs provide a unique tool for applications in which high-speed gating is required. Variable-integration-time video cameras are an attractive option when one needs to acquire images both at video rate and with longer integration times for dimmer samples. This flexibility can facilitate many diverse applications with highly varied light levels. Copyright © 2007 Elsevier Inc. All rights reserved.
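The camera trade-offs above can be made concrete with the standard CCD noise model, in which signal shot noise, dark-current shot noise and read noise add in quadrature. A minimal sketch; the numeric values below are illustrative assumptions, not figures from the chapter:

```python
import math

def camera_snr(signal_e, dark_e_per_s, exposure_s, read_noise_e):
    """Standard CCD noise model: signal shot noise, dark-current shot
    noise and read noise add in quadrature (all in electrons)."""
    noise = math.sqrt(signal_e + dark_e_per_s * exposure_s + read_noise_e ** 2)
    return signal_e / noise

# Illustrative numbers: a cooled slow-scan CCD (low read noise, long
# exposure) vs. a video-rate CCD (higher read noise) at equal signal.
slow = camera_snr(signal_e=1000, dark_e_per_s=0.01, exposure_s=10, read_noise_e=5)
video = camera_snr(signal_e=1000, dark_e_per_s=0.01, exposure_s=0.033, read_noise_e=30)
```

At equal collected signal, the read-noise advantage of the slow-scan device dominates, which is why it wins whenever frame rate is not a constraint.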
NASA Astrophysics Data System (ADS)
Holland, S. Douglas
1992-09-01
A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
Characterization of the LBNL PEM Camera
NASA Astrophysics Data System (ADS)
Wang, G.-C.; Huber, J. S.; Moses, W. W.; Qi, J.; Choong, W.-S.
2006-06-01
We present the tomographic images and performance measurements of the LBNL positron emission mammography (PEM) camera, a specially designed positron emission tomography (PET) camera that utilizes PET detector modules with depth-of-interaction measurement capability to achieve both high sensitivity and high resolution for breast cancer detection. The camera currently consists of 24 detector modules positioned as four detector banks to cover a rectangular patient port that is 8.2 × 6 cm² with a 5 cm axial extent. Each LBNL PEM detector module consists of 64 3 × 3 × 30 mm³ LSO crystals coupled to a single photomultiplier tube (PMT) and an 8 × 8 silicon photodiode array (PD). The PMT provides accurate timing, the PD identifies the crystal of interaction, the sum of the PD and PMT signals (PD+PMT) provides the total energy, and the PD/(PD+PMT) ratio determines the depth of interaction. The performance of the camera has been evaluated by imaging various phantoms. The full-width-at-half-maximum (FWHM) spatial resolution changes slightly from 1.9 mm to 2.1 mm when measured at the center and corner of the field of view, respectively, using a 6 ns coincidence timing window and a 300-750 keV energy window. With the same setup, the peak sensitivity of the camera is 1.83 kcps/μCi.
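The signal-combination scheme above lends itself to a short sketch. Only the PD+PMT energy sum and the PD/(PD+PMT) depth ratio come from the abstract; the linear depth calibration and the example amplitudes are hypothetical:

```python
def decode_event(pd_signal, pmt_signal, crystal_length_mm=30.0):
    """Decode a PEM detector event from the photodiode (PD) and
    photomultiplier (PMT) signal amplitudes: PD+PMT gives the total
    energy, and the PD/(PD+PMT) ratio maps to interaction depth.
    The linear depth calibration here is an assumed illustration."""
    total_energy = pd_signal + pmt_signal     # total energy from PD + PMT
    ratio = pd_signal / total_energy          # depth-of-interaction ratio
    depth_mm = ratio * crystal_length_mm      # assumed linear calibration
    return total_energy, depth_mm

energy, depth = decode_event(pd_signal=120.0, pmt_signal=380.0)
```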
Real-time depth camera tracking with geometrically stable weight algorithm
NASA Astrophysics Data System (ADS)
Fu, Xingyin; Zhu, Feng; Qi, Feng; Wang, Mingming
2017-03-01
We present an approach for real-time camera tracking with a depth stream. Existing methods are prone to drift in scenes without sufficient geometric information. First, we propose a new weighting method for the iterative closest point algorithm commonly used in real-time dense mapping and tracking systems. By detecting uncertainty in the pose and increasing the weight of points that constrain unstable transformations, our system achieves accurate and robust trajectory estimation. Our pipeline can be fully parallelized on the GPU and incorporated seamlessly into current real-time depth camera tracking systems. Second, we compare state-of-the-art weighting algorithms and propose a weight degradation algorithm according to the measurement characteristics of a consumer depth camera. Third, we use NVIDIA Kepler shuffle instructions during warp and block reduction to improve the efficiency of our system. Results on the public TUM RGB-D benchmark demonstrate that our camera tracking system achieves state-of-the-art results in both accuracy and efficiency.
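A minimal sketch of the underlying idea, not the paper's exact algorithm: build the 6-DoF point-to-plane constraint matrix, find the least-constrained motion direction, and up-weight the correspondences that constrain it. The boost factor and all names are illustrative:

```python
import numpy as np

def stability_weights(points, normals, boost=5.0):
    """Up-weight correspondences that constrain the least-determined
    motion direction. Each row of J is a point-to-plane Jacobian
    [p x n, n]; the eigenvector of J^T J with the smallest eigenvalue
    is the motion the scene geometry constrains worst."""
    J = np.hstack([np.cross(points, normals), normals])   # (N, 6)
    H = J.T @ J                                           # 6x6 information matrix
    evals, evecs = np.linalg.eigh(H)                      # ascending eigenvalues
    weak_dir = evecs[:, 0]                                # least-constrained direction
    proj = np.abs(J @ weak_dir)                           # how much each point helps
    return 1.0 + boost * proj / (proj.max() + 1e-12)

rng = np.random.default_rng(0)
pts = rng.normal(size=(100, 3))
nrm = rng.normal(size=(100, 3))
nrm /= np.linalg.norm(nrm, axis=1, keepdims=True)
w = stability_weights(pts, nrm)
```

In a dense tracker these weights would multiply the per-point residuals inside each ICP iteration, stiffening exactly the directions that would otherwise drift.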
MonoSLAM: real-time single camera SLAM.
Davison, Andrew J; Reid, Ian D; Molton, Nicholas D; Stasse, Olivier
2007-06-01
We present a real-time algorithm which can recover the 3D trajectory of a monocular camera, moving rapidly through a previously unknown scene. Our system, which we dub MonoSLAM, is the first successful application of the SLAM methodology from mobile robotics to the "pure vision" domain of a single uncontrolled camera, achieving real time but drift-free performance inaccessible to Structure from Motion approaches. The core of the approach is the online creation of a sparse but persistent map of natural landmarks within a probabilistic framework. Our key novel contributions include an active approach to mapping and measurement, the use of a general motion model for smooth camera movement, and solutions for monocular feature initialization and feature orientation estimation. Together, these add up to an extremely efficient and robust algorithm which runs at 30 Hz with standard PC and camera hardware. This work extends the range of robotic systems in which SLAM can be usefully applied, but also opens up new areas. We present applications of MonoSLAM to real-time 3D localization and mapping for a high-performance full-size humanoid robot and live augmented reality with a hand-held camera.
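The "general motion model for smooth camera movement" is, in essence, a constant-velocity prediction. A sketch of that prediction step, assuming a [w, x, y, z] quaternion state layout; the process-noise injection of a full EKF is omitted, and the details are illustrative rather than MonoSLAM's exact equations:

```python
import numpy as np

def quat_multiply(a, b):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def predict_camera_state(r, q, v, w, dt):
    """Constant-velocity prediction: position r, orientation q, linear
    velocity v and angular velocity w are assumed constant over dt,
    apart from unmodelled accelerations (the EKF process noise)."""
    r_new = r + v * dt
    angle = np.linalg.norm(w) * dt
    if angle > 1e-12:
        axis = w / np.linalg.norm(w)
        dq = np.concatenate([[np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis])
    else:
        dq = np.array([1.0, 0.0, 0.0, 0.0])
    return r_new, quat_multiply(q, dq), v, w

r, q, v, w = (np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0]),
              np.array([0.2, 0.0, 0.0]), np.zeros(3))
r1, q1, v1, w1 = predict_camera_state(r, q, v, w, dt=1 / 30)
```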
VizieR Online Data Catalog: Periods of 4-10 Myr old T Tauri members of Orion OB1 (Karim+, 2016)
NASA Astrophysics Data System (ADS)
Karim, M. T.; Stassun, K. G.; Briceno, C.; Vivas, A. K.; Raetz, S.; Mateu, C.; Downes, J. J.; Calvet, N.; Hernandez, J.; Neuhauser, R.; Mugrauer, M.; Takahashi, H.; Tachihara, K.; Chini, R.; Cruz-Dias, G. A.; Aarnio, A.; James, D. J.; Hackstein, M.
2017-02-01
The CIDA Variability Survey of Orion (CVSO) was carried out at the Llano del Hato National Astronomical Observatory in Venezuela, with the QUEST CCD mosaic camera (8000*8000 pixels) on the 1m (clear aperture) Schmidt telescope, with a plate scale of 1.02''/pixel and a field of view of 5.4 deg2. This V-, RC-, and IC-band multi-epoch survey, covering ~180deg2 of the Orion OB1 association, spans a time baseline of 12yr, from 1998 December to 2011 February. The 25 Ori cluster was observed by the 0.6/0.9m Schmidt-type telescope at Jena Observatory (Germany), the twin 15cm (5.9in) telescopes at Observatorio Cerro Armazones (OCA, Chile), and the 1.5m reflector at the Gunma Astronomical Observatory in Japan, over four observing campaigns during the years 2010-2013. The Jena Schmidt-type telescope was equipped with the optical Schmidt Telescope Camera (STK), with an e2v 42-10 2048*2048 detector, yielding a plate scale of 1.55''/pixel and a field of view of 53'*53', thus encompassing most of the cluster. The Jena 50s exposures, all taken through the R filter, were centered on 25 Ori. A total of 8506 individual exposures were obtained in 108 nights. The Gunma 1.5m reflector observations were carried out by obtaining 60s integrations in R with the Gunma Low-resolution Spectrograph and Imager (GLOWS), which has an e2v CCD55-30 1250*1152 pixel detector with a 0.6''/pixel scale, covering a field of view of 12.5'*11.5'. Observations were obtained during four nights in 2010. The Observatorio Cerro Armazones observations were done in the R band using the RoBoTT (Robotic Bochum TWin Telescope), which consists of twin Takahashi 150mm aperture apochromatic astrographs, each equipped with an Apogee U16M camera with a KAF-16803 4096*4096 pixel CCD, providing a 2.7°*2.7° field of view with a 2.37''/pixel scale. The 60s exposures were centered on 25 Ori, spanning an area much larger than the cluster. OCA data were obtained during all YETI seasons. 
During the nights of 2006 January 8-15, we used the 0.9m telescope with the 8000*8000 pixel MOSAIC imager at the Kitt Peak National Observatory (KPNO), Arizona, USA, to obtain IC-band time-series observations of several regions in the Orion OB1 association, including the 25 Ori cluster in the OB1a subassociation, and fields in the OB1b subassociation, under NOAO program 2005B-0529. (1 data file).
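The quoted fields of view follow directly from detector size and plate scale; a quick cross-check using numbers from the record above:

```python
def field_of_view_arcmin(n_pixels, plate_scale_arcsec):
    """Field of view along one detector axis: pixel count times plate
    scale, converted from arcseconds to arcminutes."""
    return n_pixels * plate_scale_arcsec / 60.0

# Cross-checks against the instruments quoted above:
stk = field_of_view_arcmin(2048, 1.55)   # Jena STK: ~53'
glows = field_of_view_arcmin(1250, 0.6)  # GLOWS long axis: 12.5'
```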
Timing generator of scientific grade CCD camera and its implementation based on FPGA technology
NASA Astrophysics Data System (ADS)
Si, Guoliang; Li, Yunfei; Guo, Yongfei
2010-10-01
The functions of the timing generator of a scientific-grade CCD camera are briefly presented: it generates the various pulse sequences for the TDI-CCD, the video processor and the imaging data output, acting as the synchronous coordinator of the CCD imaging unit. The IL-E2 TDI-CCD sensor produced by DALSA Co. Ltd. is used in the scientific-grade CCD camera. The driving schedules of the IL-E2 TDI-CCD sensor have been examined in detail, and the timing generator has been designed accordingly. An FPGA is chosen as the hardware design platform, and the timing generator is described in VHDL. The design has been successfully verified by functional simulation with EDA software and fitted into an XC2VP20-FF1152 (an FPGA product made by XILINX). The experiments indicate that the new method improves the level of integration of the system, achieving high reliability, stability and low power consumption for the scientific-grade CCD camera while sharply shortening the design and experiment period.
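As a software illustration of what such a timing generator produces (the real design is VHDL on an FPGA), each clock line can be modelled as a periodic pulse train derived from a master clock. All parameters here are illustrative, not the IL-E2's actual schedule:

```python
def clock_pattern(period, high_ticks, phase, n_ticks):
    """Generate one clock line as a list of 0/1 levels, one entry per
    master-clock tick: high for high_ticks out of every period ticks,
    delayed by phase. A software stand-in for one output of a VHDL
    timing generator."""
    return [1 if (t - phase) % period < high_ticks else 0 for t in range(n_ticks)]

# Two non-overlapping CCD transfer phases derived from one master clock:
phi1 = clock_pattern(period=8, high_ticks=3, phase=0, n_ticks=16)
phi2 = clock_pattern(period=8, high_ticks=3, phase=4, n_ticks=16)
```

Non-overlap of the two phases (no tick where both are high) is exactly the kind of constraint the FPGA design must guarantee.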
Dynamic photoelasticity by TDI imaging
NASA Astrophysics Data System (ADS)
Asundi, Anand K.; Sajan, M. R.
2001-06-01
High-speed photographic systems like the image rotation camera, the Cranz-Schardin camera and the drum camera are typically used for recording and visualizing dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore, they are all based on photographic film, requiring time-consuming and tedious wet processing. Digital cameras are replacing conventional cameras, to a certain extent, in static experiments. Recently, there has been much interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration (TDI) mode for digitally recording dynamic photoelastic stress patterns. Applications to strobe and streak photoelastic pattern recording, and system limitations, are explained in the paper.
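The TDI principle can be sketched in a few lines: charge shifts down the detector rows in step with the moving scene, so each output line is the sum of many exposures of the same scene line. A toy simulation under ideal synchronisation (not the authors' recording setup):

```python
import numpy as np

def tdi_capture(scene, n_stages):
    """Toy TDI simulation: the scene scrolls past the detector one line
    per clock; at each clock the charge shifts down one row (the bottom
    row is read out) and every stage integrates the scene line currently
    in front of it. Under perfect synchronisation each output line is
    n_stages summed exposures of one scene line."""
    scene = np.asarray(scene, dtype=float)
    acc = np.zeros(n_stages)
    out = []
    for t in range(len(scene) + n_stages):
        out.append(acc[-1])       # read out the bottom row
        acc = np.roll(acc, 1)     # shift charge down one stage
        acc[0] = 0.0              # fresh, empty top row
        for s in range(n_stages):
            idx = t - s           # scene line facing stage s at clock t
            if 0 <= idx < len(scene):
                acc[s] += scene[idx]
    return np.array(out[n_stages:])

out = tdi_capture([1.0, 2.0, 5.0], n_stages=4)
```

The signal grows linearly with the number of stages while the scene stays unsmeared, which is what makes TDI attractive for dim, fast-moving fringe patterns.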
Enhancing swimming pool safety by the use of range-imaging cameras
NASA Astrophysics Data System (ADS)
Geerardyn, D.; Boulanger, S.; Kuijk, M.
2015-05-01
Drowning causes the death of 372,000 people each year worldwide, according to the November 2014 report of the World Health Organization.1 Currently, most swimming pools rely only on lifeguards to detect drowning people. In some modern swimming pools, camera-based detection systems are nowadays being integrated. However, these systems have to be mounted underwater, mostly as a replacement of the underwater lighting. In contrast, we are interested in range-imaging cameras mounted on the ceiling of the swimming pool, which make it possible to distinguish swimmers at the surface from drowning people underwater, while keeping a large field of view and minimizing occlusions. However, we have to take into account that the water surface of a swimming pool is not flat but mostly rippled, and that the water is transparent for visible light but less transparent for infrared or ultraviolet light. We investigated the use of different types of 3D cameras to detect objects underwater at different depths and with different amplitudes of surface perturbations. Specifically, we performed measurements with a commercial Time-of-Flight camera, a commercial structured-light depth camera, and our own Time-of-Flight system. Our own system uses pulsed Time-of-Flight and emits light at 785 nm. The measured distances between the camera and the object are influenced by the perturbations on the water surface. Due to the timing of our Time-of-Flight camera, our system is theoretically able to minimize the influence of reflections from a partially reflecting surface. Combining a post-acquisition filter that compensates for the perturbations with a light source of shorter wavelength to enlarge the depth range can improve on the current commercial cameras. As a result, we conclude that low-cost range imagers can increase swimming pool safety, by inserting a post-processing filter and using another light source.
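The ranging principle of a pulsed Time-of-Flight system is simple: distance is half the round-trip travel time times the speed of light, reduced by the refractive index when the pulse travels through water. A minimal sketch; the water index of ~1.33 is a textbook value, not a measurement from the paper:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s, refractive_index=1.0):
    """Pulsed time-of-flight range: the pulse travels out and back, so
    distance is half the round-trip path, at speed c/n in the medium."""
    return round_trip_s * (C / refractive_index) / 2.0

d_air = tof_distance(20e-9)           # 20 ns round trip in air: ~3 m
d_water = tof_distance(20e-9, 1.33)   # same delay under water: ~2.25 m
```

The ~25% shortening in water is one of the corrections an above-surface range camera must apply when ranging a submerged target.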
VizieR Online Data Catalog: MOST photometry of Proxima (Kipping+, 2017)
NASA Astrophysics Data System (ADS)
Kipping, D. M.; Cameron, C.; Hartman, J. D.; Davenport, J. R. A.; Matthews, J. M.; Sasselov, D.; Rowe, J.; Siverd, R. J.; Chen, J.; Sandford, E.; Bakos, G. A.; Jordan, A.; Bayliss, D.; Henning, T.; Mancini, L.; Penev, K.; Csubry, Z.; Bhatti, W.; da Silva Bento, J.; Guenther, D. B.; Kuschnig, R.; Moffat, A. F. J.; Rucinski, S. M.; Weiss, W. W.
2017-06-01
The Microvariability and Oscillations of STars (MOST) telescope is a 53kg satellite in low Earth orbit with a 15cm aperture visible-band camera (350-750nm). MOST observed Proxima Centauri in 2014 May (beginning on HJD(2000) 2456793.18) for about 12.5 days. MOST again observed Proxima Centauri in 2015 May (starting on HJD(2000) 2457148.54), this time for a total of 31 days. Independent of the MOST observations, Proxima Cen was also monitored by the HATSouth ground-based telescope network. The network consists of six wide-field photometric instruments located at three observatories in the Southern Hemisphere (Las Campanas Observatory [LCO] in Chile, the High Energy Stereoscopic System [HESS] site in Namibia, and Siding Spring Observatory [SSO] in Australia), with two instruments per site. Each instrument consists of four 18cm diameter astrographs and associated 4K*4K backside-illuminated CCD cameras and Sloan r-band filters, placed on a common robotic mount. The four astrographs and cameras together cover an 8.2°*8.2° mosaic field of view at a pixel scale of 3.7''/pixel. Observations of a field containing Proxima Cen were collected as part of the general HATSouth transit survey, with a total of 11071 (this number does not count observations that were rejected as not useful for high-precision photometry, or those that produced large-amplitude outliers in the Proxima Cen light curve) composite 3*80s exposures gathered between 2012 June 14 and 2014 September 20. These include 3430 observations made with the HS-2 unit at LCO, 4630 observations made with the HS-4 unit at the HESS site, and 3011 observations made with the HS-6 unit at the SSO site. Due to weather and other factors, the cadence was nonuniform. The median time difference between consecutive observations in the full time series is 368s. (2 data files).
Imaging of turbulent structures and tomographic reconstruction of TORPEX plasma emissivity
NASA Astrophysics Data System (ADS)
Iraji, D.; Furno, I.; Fasoli, A.; Theiler, C.
2010-12-01
In TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], a simple magnetized plasma device, low frequency electrostatic fluctuations associated with interchange waves are routinely measured by means of extensive sets of Langmuir probes. To complement the electrostatic probe measurements of plasma turbulence and to study plasma structures smaller than the spatial resolution of the probe array, a nonperturbative direct imaging system has been developed on TORPEX, including a fast framing Photron-APX-RS camera and an image intensifier unit. From the line-integrated camera images, we compute the poloidal emissivity profile of the plasma by applying a tomographic reconstruction technique, using a pixel method and solving an overdetermined set of equations by singular value decomposition. This allows comparing the statistical, spectral, and spatial properties of the visible light radiation with those of the electrostatic fluctuations. The shape and position of the time-averaged reconstructed plasma emissivity are observed to be similar to those of the ion saturation current profile. In the core plasma, excluding the electron cyclotron and upper hybrid resonant layers, the mean value of the plasma emissivity is observed to vary as (Te)α(ne)β, in which α = 0.25-0.7 and β = 0.8-1.4, in agreement with a collisional radiative model. The tomographic reconstruction is applied to fast camera movies acquired at 50 kframes/s with 2 μs exposure time to obtain the temporal evolution of the emissivity fluctuations. Conditional average sampling is also applied to visualize and measure the sizes of structures associated with the interchange mode. The ω-time and two-dimensional k-space Fourier analyses of the reconstructed emissivity fluctuations show the same interchange mode that is detected in the ω and k spectra of the ion saturation current fluctuations measured by the probes. 
Small scale turbulent plasma structures can be detected and tracked in the reconstructed emissivity movies with the spatial resolution down to 2 cm, well beyond the spatial resolution of the probe array.
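The pixel-method inversion described above reduces to a linear least-squares problem. A minimal numpy sketch, with a hypothetical geometry matrix and an illustrative singular-value cutoff standing in for the regularisation choices of the real analysis:

```python
import numpy as np

def reconstruct_emissivity(G, m, rcond=1e-3):
    """Pixel-method tomographic inversion: G[i, j] is the length of
    viewing chord i inside pixel j, and m holds the line-integrated
    camera signals. The overdetermined system G e = m is solved by SVD,
    truncating singular values below rcond * s_max."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    s_inv = np.where(s > rcond * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ m))

# Consistency check on a tiny hypothetical geometry: 12 chords, 4 pixels.
rng = np.random.default_rng(1)
G = rng.uniform(0.0, 1.0, size=(12, 4))
e_true = np.array([0.5, 1.0, 2.0, 0.0])
e_rec = reconstruct_emissivity(G, G @ e_true)
```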
Towards next generation 3D cameras
NASA Astrophysics Data System (ADS)
Gupta, Mohit
2017-03-01
We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real-world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that addresses these long-standing problems. This includes designing `all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed (<100 microns resolution) scans in extremely demanding scenarios with low-cost components. Several of these cameras are making a practical impact in industrial automation, being adopted in robotic inspection and assembly systems.
Microchannel plate streak camera
Wang, Ching L.
1989-01-01
An improved streak camera in which a microchannel plate electron multiplier is used in place of, or in combination with, the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.
High-dynamic-range imaging for cloud segmentation
NASA Astrophysics Data System (ADS)
Dev, Soumyabrata; Savoy, Florian M.; Lee, Yee Hui; Winkler, Stefan
2018-04-01
Sky-cloud images obtained from ground-based sky cameras are usually captured using a fisheye lens with a wide field of view. However, the sky exhibits a large dynamic range in terms of luminance, more than a conventional camera can capture. It is thus difficult to capture the details of an entire scene with a regular camera in a single shot. In most cases, the circumsolar region is overexposed, and the regions near the horizon are underexposed. This renders cloud segmentation for such images difficult. In this paper, we propose HDRCloudSeg - an effective method for cloud segmentation using high-dynamic-range (HDR) imaging based on multi-exposure fusion. We describe the HDR image generation process and release a new database to the community for benchmarking. Our proposed approach is the first using HDR radiance maps for cloud segmentation and achieves very good results.
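This is not the paper's HDRCloudSeg pipeline, but a minimal sketch of the multi-exposure idea behind an HDR radiance map, assuming a linear camera response: divide each image by its exposure time and average with weights that discount the over-exposed circumsolar region and the under-exposed horizon:

```python
import numpy as np

def fuse_exposures(images, exposure_times):
    """Estimate a relative radiance map from multiple exposures of the
    same scene, assuming a linear camera response. Pixels are in [0, 1];
    a hat-shaped weight trusts mid-range values and discounts saturated
    and under-exposed pixels."""
    num = np.zeros_like(np.asarray(images[0], dtype=float))
    den = np.zeros_like(num)
    for im, t in zip(images, exposure_times):
        im = np.asarray(im, dtype=float)
        w = 1.0 - np.abs(2.0 * im - 1.0)   # 1 at mid-gray, 0 at 0 and 1
        num += w * im / t
        den += w
    return num / np.maximum(den, 1e-12)

short = np.array([[0.1, 0.5]])   # short exposure (0.01 s)
long_ = np.array([[0.4, 1.0]])   # long exposure (0.04 s); right pixel saturates
radiance = fuse_exposures([short, long_], [0.01, 0.04])
```

The saturated pixel receives zero weight in the long exposure, so its radiance comes entirely from the short one; this is what recovers detail near the sun while keeping the horizon usable.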
Performance of the Tachyon Time-of-Flight PET Camera
NASA Astrophysics Data System (ADS)
Peng, Q.; Choong, W.-S.; Vu, C.; Huber, J. S.; Janecek, M.; Wilson, D.; Huesman, R. H.; Qi, Jinyi; Zhou, Jian; Moses, W. W.
2015-02-01
We have constructed and characterized a time-of-flight Positron Emission Tomography (TOF PET) camera called the Tachyon. The Tachyon is a single-ring Lutetium Oxyorthosilicate (LSO) based camera designed to obtain significantly better timing resolution than the 550 ps found in present commercial TOF cameras, in order to quantify the benefit of improved TOF resolution for clinically relevant tasks. The Tachyon's detector module is optimized for timing by coupling the 6.15 × 25 mm² side of 6.15 × 6.15 × 25 mm³ LSO scintillator crystals onto a 1-inch diameter Hamamatsu R-9800 PMT with a super-bialkali photocathode. We characterized the camera according to the NEMA NU 2-2012 standard, measuring the energy resolution, timing resolution, spatial resolution, noise equivalent count rates and sensitivity. The Tachyon achieved a coincidence timing resolution of 314 ps +/- 20 ps FWHM over all crystal-crystal combinations. Experiments were performed with the NEMA body phantom to assess the imaging performance improvement over non-TOF PET. The results show that at a matched contrast, incorporating 314 ps TOF reduces the standard deviation of the contrast by a factor of about 2.3.
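The value of improved timing resolution can be read off directly from the localisation it implies along each line of response, dx = c·dt/2:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_position_fwhm_cm(timing_fwhm_ps):
    """Localisation FWHM along a PET line of response implied by the
    coincidence timing resolution: dx = c * dt / 2, converted to cm."""
    return C * timing_fwhm_ps * 1e-12 / 2.0 * 100.0

tachyon = tof_position_fwhm_cm(314)     # the Tachyon's 314 ps: ~4.7 cm
commercial = tof_position_fwhm_cm(550)  # typical commercial 550 ps: ~8.2 cm
```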
Circuit design of an EMCCD camera
NASA Astrophysics Data System (ADS)
Li, Binhua; Song, Qian; Jin, Jianhui; He, Chun
2012-07-01
EMCCDs have been used in astronomical observations in many ways. Recently we developed a camera using a TX285 EMCCD. The CCD chip is cooled to -100°C in an LN2 dewar. The camera controller consists of a driving board, a control board and a temperature control board. Power supplies and driving clocks for the CCD are provided by the driving board, while the timing generator is located in the control board. The timing generator and an embedded Nios II CPU are implemented in an FPGA. The ADC and the data transfer circuit are also on the control board and controlled by the FPGA. Data transfer between the image workstation and the camera is done through a Camera Link frame grabber. The image acquisition software is built using VC++ and Sapera LT. This paper describes the camera structure, the main components, and the circuit design for the video signal processing channel, clock drivers, FPGA and Camera Link interfaces, and the temperature metering and control system. Some test results are presented.
The development of large-aperture test system of infrared camera and visible CCD camera
NASA Astrophysics Data System (ADS)
Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying
2015-10-01
Infrared camera and CCD camera dual-band imaging systems are widely used in many kinds of equipment and applications. If such a system is tested with the traditional separate infrared camera test system and visible CCD test system, two rounds of installation and alignment are needed. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber and computer, which reduces the cost and the time for installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position with changing environmental temperature, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of foreign counterparts at a much lower cost, so it should have good market prospects.
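The effect of multiple-frame averaging is the usual 1/√N suppression of random noise. A minimal sketch with synthetic frames; the noise level and frame count are illustrative:

```python
import numpy as np

def average_frames(frames):
    """Multiple-frame average: the fixed target pattern is preserved
    while uncorrelated random noise falls as 1/sqrt(N)."""
    return np.mean(np.asarray(frames, dtype=float), axis=0)

rng = np.random.default_rng(0)
truth = np.full((8, 8), 100.0)   # hypothetical clean target image
frames = [truth + rng.normal(0.0, 5.0, truth.shape) for _ in range(64)]
avg = average_frames(frames)
# residual noise drops from sigma ~5 in one frame toward ~5/8 after 64 frames
```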
NASA Astrophysics Data System (ADS)
Cuillandre, J.-C.; Magnier, E.; Sabin, D.; Mahoney, B.
2016-05-01
Mauna Kea is known for its pristine seeing conditions, but sky transparency can be an issue for science operations since at least 25% of the observable (i.e. open-dome) nights are not photometric, an effect mostly due to high-altitude cirrus. Since 2001, the original single-channel SkyProbe, mounted in parallel on the Canada-France-Hawaii Telescope (CFHT), has gathered one V-band exposure every minute during each observing night using a small CCD camera offering a very wide field of view (35 sq. deg.) encompassing the region pointed at by the telescope for science operations, with exposures long enough (40 seconds) to capture at least 100 stars of Hipparcos' Tycho catalog at high galactic latitudes (and up to 600 stars at low galactic latitudes). The true atmospheric absorption is measured to within 2%, a key advantage over all-sky direct thermal-infrared imaging detection of clouds. The absolute measurement of the true atmospheric absorption by clouds and particulates affecting the data being gathered by the telescope's main science instrument has proven crucial for decision making in the CFHT queued service observing (QSO) mode, which today represents all of the telescope time. Also, science exposures taken in non-photometric conditions are automatically registered for a new observation at a later date at 1/10th of the original exposure time in photometric conditions to ensure a proper final absolute photometric calibration. Photometric standards are observed only when conditions are reported as perfectly stable by SkyProbe. The more recent dual-color system (simultaneous B and V bands) will offer a better characterization of the sky properties above Mauna Kea and should enable better detection of the thinnest cirrus (absorption down to 0.01 mag, or 1%).
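The per-exposure absorption estimate that such a system performs can be sketched as a comparison of instrumental magnitudes against catalog magnitudes. This is a hedged, simplified illustration: the zero point, the star list, and the omission of airmass and color terms are assumptions for the sketch, not CFHT's actual pipeline:

```python
import math

def atmospheric_absorption(measured_fluxes, catalog_mags, zero_point):
    """Estimate atmospheric absorption (in mag) from field stars.

    For each star, the instrumental magnitude derived from its measured
    flux is compared with its catalog magnitude; the median offset
    beyond the photometric zero point is attributed to cloud and
    particulate absorption. The median keeps single bad stars from
    biasing the estimate.
    """
    offsets = []
    for flux, m_cat in zip(measured_fluxes, catalog_mags):
        m_inst = zero_point - 2.5 * math.log10(flux)
        offsets.append(m_inst - m_cat)
    offsets.sort()
    return offsets[len(offsets) // 2]

# Demo: five catalog stars dimmed by 0.30 mag of cirrus.
zp = 25.0  # hypothetical photometric zero point
cat = [9.0, 9.5, 10.0, 10.5, 11.0]
clear_flux = [10 ** ((zp - m) / 2.5) for m in cat]
cloudy_flux = [f * 10 ** (-0.30 / 2.5) for f in clear_flux]
```

With ~100 Tycho stars per frame, the median offset tracks the true absorption far more robustly than any single star would.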
Astronaut George Nelson working on Comet Halley Active monitoring program
1986-01-14
61C-05-026 (14 Jan. 1986) --- Astronaut George D. Nelson smiles for a fellow crewman's 35mm camera exposure while participating in the Comet Halley active monitoring program (CHAMP). Camera equipment and a protective shroud used to eliminate all cabin light interference surround the mission specialist. This is the first of three 1986 missions scheduled to monitor the rare visit by the comet. The principal investigators for CHAMP are S. Alan Stern of the Laboratory for Atmospheric and Space Physics at the University of Colorado, and Dr. Stephen Mende of Lockheed Palo Alto Research Laboratory.
Development of Digital SLR Camera: PENTAX K-7
NASA Astrophysics Data System (ADS)
Kawauchi, Hiraku
The DSLR "PENTAX K-7" comes in an easy-to-carry, minimal yet functional small form factor, a long-inherited identity of the PENTAX brand. Despite its compact body, this camera offers up-to-date, enhanced fundamental features such as a high-quality viewfinder, an improved shutter mechanism, extended continuous shooting capabilities, reliable exposure control, and fine-tuned AF systems, as well as a string of the newest technologies such as movie recording capability and an automatic leveling function. The main focus of this article is to reveal the ideas behind the product concept and its distinguishing features.
NASA Astrophysics Data System (ADS)
Risteiu, M.; Lorincz, A.; Dobra, R.; Dasic, P.; Andras, I.; Roventa, M.
2017-06-01
The proposed paper shows experimental results of research into metallic-structure inspection using a high-definition camera driven by a controller with high processing capability. A dedicated ARM Cortex-M4 initializes the ARM Cortex-M0 system for image acquisition. Programming options then let us select and tune the pattern types to be detected (abnormal situations such as metal cracks or discontinuities), enable overexposure highlighting, adjust camera brightness/exposure and minimum brightness, and set the pattern-teach threshold. The proposed system has been tested under normal lighting conditions at a typical site.
An Investigation into the Spectral Imaging of Hall Thruster Plumes
2015-07-01
imaging experiment. It employs a Kodak KAF-3200E 3-megapixel CCD (2184 × 1472 with 6.8 µm pixels; 14.9 × 10.0 mm active area) in an SBIG ST… camera body. The camera was designed for astronomical imaging and thus long exposure
Image quality evaluation of color displays using a Foveon color camera
NASA Astrophysics Data System (ADS)
Roehrig, Hans; Dallas, William J.; Fan, Jiahua; Krupinski, Elizabeth A.; Redford, Gary R.; Yoneda, Takahiro
2007-03-01
This paper presents preliminary data on the use of a color camera for the Quality Control (QC) and Quality Analysis (QA) of a color LCD in comparison with a monochrome LCD. The color camera is a CMOS camera with a pixel size of 9 µm and a pixel matrix of 2268 × 1512 × 3. The camera uses a sensor that has co-located pixels for all three primary colors. The imaging geometry used for most measurements was 12 × 12 camera pixels per display pixel, although an imaging geometry of 17.6 might provide more accurate results. The color camera is used as an imaging colorimeter, where each camera pixel is calibrated to serve as a colorimeter. This capability permits the camera to determine the chromaticity of the color LCD at different sections of the display. After color calibration with a CS-200 colorimeter, the color coordinates of the display's primaries determined from the camera's luminance response are very close to those found with the CS-200; only the color coordinates of the display's white point were in error. The Modulation Transfer Function (MTF) and the noise in terms of the Noise Power Spectrum (NPS) of both LCDs were evaluated. The horizontal MTFs of both displays have a larger negative slope than the vertical MTFs, indicating that the horizontal MTFs are poorer. However, the modulations at the Nyquist frequency seem lower for the color LCD than for the monochrome LCD. These results contradict simulations regarding MTFs in the vertical direction. The spatial noise of the color display in both directions is larger than that of the monochrome display. Attempts were also made to separate the total noise into spatial and temporal components by subtracting images taken at exactly the same exposure. Temporal noise appears to be significantly lower than spatial noise.
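The spatial/temporal noise separation mentioned above (subtracting images taken at the same exposure) can be sketched as follows. This is a minimal illustration assuming two frames and Gaussian noise, not the authors' exact analysis:

```python
import numpy as np

def split_noise(frame_a, frame_b):
    """Separate temporal from spatial (fixed-pattern) noise.

    Two frames of the same exposure share the fixed spatial pattern, so
    their difference cancels it: std(a - b) / sqrt(2) estimates the
    temporal noise. The mean frame suppresses temporal noise, so its
    standard deviation approximates the spatial noise when the temporal
    component is comparatively small.
    """
    a = frame_a.astype(np.float64)
    b = frame_b.astype(np.float64)
    temporal = np.std(a - b) / np.sqrt(2.0)
    spatial = np.std(0.5 * (a + b))
    return temporal, spatial

# Demo: fixed pattern (std ~8) plus per-frame temporal noise (std ~2).
rng = np.random.default_rng(1)
pattern = 100.0 + rng.normal(0.0, 8.0, (128, 128))
f1 = pattern + rng.normal(0.0, 2.0, pattern.shape)
f2 = pattern + rng.normal(0.0, 2.0, pattern.shape)
t, s = split_noise(f1, f2)
```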
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenfield, J.R.; La Riviere, P.J.; Sandhu, J.S.
Purpose: To characterize the dynamic response of a novel acousto-optic (AO) liquid crystal detector for high-resolution transmission ultrasound breast imaging. Transient and steady-state lesion contrast were investigated to identify optimal transducer settings for our prototype imaging system consistent with the FDA limits of 1 W/cm² and 50 J/cm² on the incident acoustic intensity and the transmitted acoustic energy flux density. Methods: We have developed a full-field transmission ultrasound breast imaging system that uses monochromatic plane-wave illumination to acquire projection images of the compressed breast. The acoustic intensity transmitted through the breast is converted into a visual image by a proprietary liquid crystal detector operating on the basis of the AO effect. The dynamic response of the AO detector in the absence of an imaged breast was recorded by a CCD camera as a function of the acoustic field intensity and the detector exposure time. Additionally, a stereotactic needle biopsy breast phantom was used to investigate the change in opaque lesion contrast with increasing exposure time for a range of incident acoustic field intensities. Results: Using transducer voltages between 0.3 V and 0.8 V and exposure times of 3 minutes, a unique one-to-one mapping of incident acoustic intensity to steady-state optical brightness in the AO detector was observed. A transfer curve mapping acoustic intensity to steady-state optical brightness shows a high-contrast region analogous to the linear portion of the Hurter-Driffield curves of radiography. Using transducer voltages between 1 V and 1.75 V and exposure times of 90 s, the lesion contrast study demonstrated increasing lesion contrast with increasing breast exposure time and acoustic field intensity. Lesion-to-background contrast on the order of 0.80 was observed.
Conclusion: Maximal lesion contrast in our prototype system can be obtained using the highest acoustic field intensity and the longest breast exposure time allowable under FDA standards. Department of Defense (DOD) Breast Cancer Research Program IDEA Award W81XWH-11-1-0332; National Institutes of Health (NIH) Grant T32 EB002103-21 from the National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Mohamed‐Ali, Marwan I.; Moulson, Margaret C.
2013-01-01
ABSTRACT Exposure to faces is known to shape and change the face processing system; however, no study has yet documented infants' natural daily first‐hand exposure to faces. One‐ and three‐month‐old infants' visual experience was recorded through head‐mounted cameras. The video recordings were coded for faces to determine: (1) How often are infants exposed to faces? (2) To what type of faces are they exposed? and (3) Do frequently encountered face types reflect infants' typical pattern of perceptual narrowing? As hypothesized, infants spent a large proportion of their time (25%) exposed to faces; these faces were primarily female (70%), own‐race (96%), and adult‐age (81%). Infants were exposed to more individual exemplars of female, own‐race, and adult‐age faces than to male, other‐race, and child‐ or older‐adult‐age faces. Each exposure to own‐race faces was longer than to other‐race faces. There were no differences in exposure duration related to the gender or age of the face. Previous research has found that the face types frequently experienced by our participants are preferred over and more successfully recognized than other face types. The patterns of face exposure revealed in the current study coincide with the known trajectory of perceptual narrowing seen later in infancy. © 2013 The Authors. Developmental Psychobiology Published by Wiley Periodicals, Inc. Dev Psychobiol 56: 249–261, 2014. PMID:24285109
A time-resolved image sensor for tubeless streak cameras
NASA Astrophysics Data System (ADS)
Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji
2014-03-01
This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tube-less streak cameras. Although the conventional streak camera has high time resolution, it requires high voltage and a bulky system due to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator in combination with in-pixel logic allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel was designed and implemented using 0.11 µm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 µm.
NASA Astrophysics Data System (ADS)
Tanada, Jun
1992-08-01
Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision") the camera specifications differed from those of the present-day system, and cameras using all kinds of components, with different arrangements and different appearances, were developed into products, with much time spent on experimentation, design, fabrication, adjustment, and inspection. Recently, however, the know-how built up thus far in components, printed circuit boards, and wiring methods has been incorporated into camera fabrication, making it possible to build HDTV cameras by methods similar to those used for present-system cameras. In addition, more efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanical parts, and software for both HDTV cameras and present-system cameras.
Real-time tracking and fast retrieval of persons in multiple surveillance cameras of a shopping mall
NASA Astrophysics Data System (ADS)
Bouma, Henri; Baan, Jan; Landsmeer, Sander; Kruszynski, Chris; van Antwerpen, Gert; Dijk, Judith
2013-05-01
The capability to track individuals in CCTV cameras is important for surveillance applications at large areas such as train stations, airports and shopping centers. However, it is laborious to track and trace people over multiple cameras. In this paper, we present a system for real-time tracking and fast interactive retrieval of persons in video streams from multiple static surveillance cameras. The system is demonstrated in a shopping mall, where the cameras are positioned without overlapping fields of view and have different lighting conditions. The results show that the system allows an operator to find the origin or destination of a person more efficiently: misses are reduced by 37%, a significant improvement.
Tyurin readies the NASDA exposure experiment cases for their EVA
2001-10-14
ISS003-E-6623 (14 October 2001) --- Cosmonaut Mikhail Tyurin, Expedition Three flight engineer representing Rosaviakosmos, works with hardware for the Micro-Particles Capturer (MPAC) and Space Environment Exposure Device (SEED) experiment and fixture mechanism in the Zvezda Service Module on the International Space Station (ISS). MPAC and SEED were developed by Japan's National Space Development Agency (NASDA), and Russia developed the fixture mechanism. This image was taken with a digital still camera.
On a Mathematical Theory of Coded Exposure
2014-08-01
formulae that give the MSE and SNR of the final crisp image… Assumes the Shannon-Whittaker framework that (i) requires band-limited (with a fre… represents the ideal crisp image, i.e., the image that one would observe if there were no noise whatsoever and no motion, with a perfect optical system… discrete. In addition, the image obtained by a coded-exposure camera must undergo a deconvolution to recover the final crisp image. Note that the
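The deconvolution step that a coded-exposure (flutter-shutter) image must undergo can be sketched in one dimension. The binary code and signal below are hypothetical, and real, noisy data would need a regularized inverse rather than plain Fourier division; the point is that a broadband code keeps the blur kernel invertible:

```python
import numpy as np

def motion_blur_kernel(code, length):
    """Build a normalized 1-D motion-blur kernel from a binary shutter code."""
    k = np.zeros(length)
    k[:len(code)] = np.asarray(code, dtype=float)
    return k / k.sum()

def deconvolve(blurred, code):
    """Invert coded-exposure blur by division in the Fourier domain.

    A well-chosen binary code keeps the kernel's spectrum away from
    zero, so the inverse filter stays stable (unlike a box blur, whose
    spectrum has exact nulls). Noise-free illustration only.
    """
    n = len(blurred)
    kernel_spectrum = np.fft.fft(motion_blur_kernel(code, n))
    return np.real(np.fft.ifft(np.fft.fft(blurred) / kernel_spectrum))

# Hypothetical 1-D scene and an example broadband shutter code.
signal = np.zeros(64)
signal[20:30] = 1.0
code = [1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1]
blurred = np.real(np.fft.ifft(np.fft.fft(signal) *
                              np.fft.fft(motion_blur_kernel(code, 64))))
restored = deconvolve(blurred, code)
```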
Easily Accessible Camera Mount
NASA Technical Reports Server (NTRS)
Chalson, H. E.
1986-01-01
Modified mount enables fast alignment of movie cameras in explosion-proof housings. Screw on side is readily reached through side door of housing. Mount includes right-angle drive mechanism containing two miter gears that turn threaded shaft. Shaft drives movable dovetail clamping jaw that engages fixed dovetail plate on camera. Mechanism aligns camera in housing and secures it. Reduces installation time by 80 percent.
Reconditioning of Cassini Narrow-Angle Camera
2002-07-23
These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera optics, followed by the successful removal of the haze by warming treatments.
The Development of the Spanish Fireball Network Using a New All-Sky CCD System
NASA Astrophysics Data System (ADS)
Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.; Llorca, J.; Fabregat, J.; Martínez, V. J.; Reglero, V.; Jelínek, M.; Kubánek, P.; Mateo, T.; Postigo, A. De Ugarte
2004-12-01
We have developed an all-sky charge-coupled-device (CCD) automatic system for detecting meteors and fireballs that will be operative at four stations in Spain during 2005. The cameras were developed following the BOOTES-1 prototype installed at the El Arenosillo Observatory in 2002, which is based on a 4096 × 4096 pixel CCD detector with a fish-eye lens that provides an all-sky image with enough resolution to make accurate astrometric measurements. Since late 2004, a pair of cameras at two of the four stations has operated with alternating 30 s exposures, allowing 100% time coverage. The stellar limiting magnitude of the images is +10 at the zenith and +8 below ~65° of zenithal angle. As a result, the images provide enough comparison stars to make astrometric measurements of faint meteors and fireballs with an accuracy of ~2 arcminutes. Using this prototype, four automatic all-sky CCD stations have been developed, two in Andalusia and two in the Valencian Community, to start full operation of the Spanish Fireball Network. In addition to all-sky coverage, we are developing a fireball spectroscopy program using medium-field lenses with additional CCD cameras. Here we present the first images obtained from the El Arenosillo and La Mayora stations in Andalusia during their first months of activity. The detection of the Jan 27, 2003 superbolide of -17 ± 1 absolute magnitude that overflew Algeria and Morocco is an example of the detection capability of our prototype.
Study on the measurement system of the target polarization characteristics and test
NASA Astrophysics Data System (ADS)
Fu, Qiang; Zhu, Yong; Zhang, Su; Duan, Jin; Yang, Di; Zhan, Juntong; Wang, Xiaoman; Jiang, Hui-Lin
2015-10-01
Polarization imaging detection adds polarization information to intensity imaging and finds wide application in military, civil and other fields, which makes research on the polarization characteristics of targets particularly important. This paper introduces research on a polarization reflection model that describes the polarized distribution of scattered light energy over the reflection hemisphere, and proposes a test-system solution for target polarization characteristics consisting of an irradiation light source, a measuring turntable and a camera. The illumination is a direct light source, either a laser or a xenon lamp, which can be exchanged according to the test requirements. A hemispherical structure is used for the measurement: the material sample is placed near the base, and azimuth and pitch rotation mechanisms allow manual adjustment of the azimuth and elevation angles of observation. The measuring camera performs the polarization test by rotating a polarizer to different angles under motor control, ensuring measurement accuracy and imaging resolution. A test platform was set up with existing laboratory equipment, using a 532 nm laser and a linear-polarizer camera, together with transmitting and receiving optical systems. For different materials such as wood, metal and plastic, and for different observation azimuth and zenith angles, the polarization scattering properties of the targets were measured under different illumination conditions, implementing pBRDF measurement over the hemisphere.
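A rotating-polarizer measurement of this kind is commonly reduced with the standard four-angle Stokes formulas. The sketch below (scalar intensities, ideal polarizer, hypothetical values) illustrates that standard reduction and is not taken from the paper:

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from four polarizer-angle intensities.

    Standard four-angle reduction for a rotating-polarizer measurement:
    S0 = (I0 + I45 + I90 + I135) / 2,  S1 = I0 - I90,  S2 = I45 - I135.
    Works per pixel; shown here with scalar intensities.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

def degree_of_linear_polarization(i0, i45, i90, i135):
    s0, s1, s2 = linear_stokes(i0, i45, i90, i135)
    return np.sqrt(s1 ** 2 + s2 ** 2) / s0

# Demo: partially polarized light with a known DoLP of 0.4.
dolp_true, s0_true, phase = 0.4, 10.0, 0.0
angles = np.deg2rad([0.0, 45.0, 90.0, 135.0])
meas = [0.5 * s0_true * (1.0 + dolp_true * np.cos(2.0 * (a - phase)))
        for a in angles]
```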
Analysis of Dragon's Breath and Scattered Light Detector Anomalies on WFC3/UVIS
NASA Astrophysics Data System (ADS)
Fowler, Julia; Markwardt, Larissa; Bourque, Matthew; Anderson, Jay
2017-02-01
We summarize the examination of the light anomalies known as Dragon's Breath and Scattered Light for the UVIS channel of Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST). We present three methods for WFC3 users to help avoid these effects during observation planning. We analyzed all of the full-frame wide and long-pass filters with exposure times ≥ 300 seconds, comprising ∼13% of WFC3/UVIS on-orbit data (∼20% of all full-frame data, and ∼35% of all full-frame ≥ 300 second exposures). We find that stars producing Dragon's Breath peak at specific orientations to the detector and specific V-band magnitudes. The bulk of these stars fall along the vertical and horizontal edges, within ∼490 pixels of the image frame. The corners of the detector show significantly fewer instances of Dragon's Breath and Scattered Light, though a few occurrences remain. Furthermore, matching stars outside the field of the image to V-band magnitude data from the Hubble Guide Star Catalog II (GSC-II) shows that stars causing the anomaly consistently peak around a V-band magnitude of 11.9 or 14.6, whereas the general trend of objects lying outside the field instead peaks around a magnitude of 16.5 within our exposure time and filter selection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sels, Seppe, E-mail: Seppe.Sels@uantwerpen.be; Ribbens, Bart; Mertens, Luc
Scanning laser Doppler vibrometers (LDV) are used to measure full-field vibration shapes of products and structures. In most commercially available scanning laser Doppler vibrometer systems the user manually draws a grid of measurement locations on a 2D camera image of the product. Determining the correct physical measurement locations can be a time-consuming and difficult task. In this paper we present a new methodology for product testing and quality control that integrates 3D imaging techniques with vibration measurements. This procedure allows prototypes to be tested in a shorter period because the physical measurement locations are located automatically. The proposed methodology uses a 3D time-of-flight camera to measure the location and orientation of the test object. The 3D image of the time-of-flight camera is then matched with the 3D CAD model of the object, in which measurement locations are pre-defined. A time-of-flight camera operates strictly in the near-infrared spectrum and uses a band filter to improve the signal-to-noise ratio of its measurement. As a result of this filter, the laser spot of most laser vibrometers is invisible in the time-of-flight image. Therefore a 2D RGB camera is used to find the laser spot of the vibrometer, and the spot is matched to the 3D image obtained by the time-of-flight camera. Next, an automatic calibration procedure is used to aim the laser at the (pre)defined locations. Another benefit of this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. Secondly, the orientation of the CAD model is known with respect to the laser beam; this information can be used to find the direction of the measured vibration relative to the surface of the object.
With this direction, the vibration measurements can be compared more precisely with numerical experiments.
Connor S. Adams; Wade A. Ryberg; Toby J. Hibbitts; Brian L. Pierce; Josh B. Pierce; D. Craig Rudolph
2017-01-01
Recent advancements in camera trap technology have allowed researchers to explore methodologies that are minimally invasive, and both time and cost efficient (Long et al. 2008; O'Connell et al. 2010; Gregory et al. 2014; Meek et al. 2014; Swinnen et al. 2014; Newey et al. 2015). The use of cameras for understanding the distribution and ecology of mammals is advanced;...
Using a Video Camera to Measure the Radius of the Earth
ERIC Educational Resources Information Center
Carroll, Joshua; Hughes, Stephen
2013-01-01
A simple but accurate method for measuring the Earth's radius using a video camera is described. A video camera was used to capture a shadow rising up the wall of a tall building at sunset. A free program called ImageJ was used to measure the time it took the shadow to rise a known distance up the building. The time, distance and length of…
Gate simulation of Compton Ar-Xe gamma-camera for radionuclide imaging in nuclear medicine
NASA Astrophysics Data System (ADS)
Dubov, L. Yu; Belyaev, V. N.; Berdnikova, A. K.; Bolozdynia, A. I.; Akmalova, Yu A.; Shtotsky, Yu V.
2017-01-01
Computer simulations of a cylindrical Compton Ar-Xe gamma camera are described in the current report. The detection efficiency of a cylindrical Ar-Xe Compton camera with an internal diameter of 40 cm is estimated as 1-3%, which is 10-100 times higher than that of a collimated Anger camera. It is shown that the cylindrical Compton camera can image a Tc-99m radiotracer distribution with a uniform spatial resolution of 20 mm through the whole field of view.
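Event reconstruction in any Compton camera rests on the Compton scattering formula: the energy deposited in the scatterer fixes the half-angle of a cone on which the source must lie. A minimal sketch for a Tc-99m photon (140.5 keV), independent of the GATE simulation itself:

```python
import math

M_E_C2_KEV = 511.0  # electron rest energy in keV

def compton_cone_angle(e0_kev, e_deposited_kev):
    """Cone half-angle (degrees) of a Compton-camera event.

    Compton kinematics give
        cos(theta) = 1 - m_e c^2 * (1/E' - 1/E0),   E' = E0 - E_dep,
    where E0 is the incident photon energy and E_dep the energy left in
    the scatterer. Each detected scatter/absorption pair constrains the
    source direction to a cone with this half-angle.
    """
    e_scattered = e0_kev - e_deposited_kev
    cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e_scattered - 1.0 / e0_kev)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("kinematically forbidden energy deposit")
    return math.degrees(math.acos(cos_theta))

# Example: a Tc-99m photon (140.5 keV) depositing 30 keV in the scatterer.
angle = compton_cone_angle(140.5, 30.0)
```

Intersecting many such cones yields the radiotracer distribution, which is why no mechanical collimator (and its efficiency penalty) is needed.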
Improving the color fidelity of cameras for advanced television systems
NASA Astrophysics Data System (ADS)
Kollarits, Richard V.; Gibbon, David C.
1992-08-01
In this paper we compare the accuracy of the color information obtained from television cameras using three and five wavelength bands. The comparison is based on real digital camera data. The cameras are treated as colorimeters whose characteristics are not linked to those of the display. The color matrices for both cameras were obtained by identical optimization procedures that minimized the color error. The color error for the five-band camera is 2.5 times smaller than that obtained from the three-band camera. Visual comparison of color matches on a characterized color monitor indicates that the five-band camera is capable of color measurements that produce no significant visual error on the display. Because the outputs from the five-band camera are reduced to the normal three channels conventionally used for display, there need be no increase in signal-handling complexity outside the camera. Likewise it is possible to construct a five-band camera using only three sensors, as in conventional cameras. The principal drawback of the five-band camera is the reduction in effective camera sensitivity by about 3/4 of an f-stop.
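The optimization producing such color matrices can be sketched as a linear least-squares fit. The synthetic patches and the 3-band case below are purely illustrative of the procedure; the paper's actual error metric and training data are not specified in this abstract:

```python
import numpy as np

def fit_color_matrix(camera_responses, target_values):
    """Fit a linear matrix mapping camera band responses to colorimetric values.

    Solves M = argmin || camera @ M - target ||^2 in the least-squares
    sense. camera_responses is N x B (B = number of bands, e.g. 3 or 5),
    target_values is N x 3 (e.g. CIE XYZ); more bands give the optimizer
    more freedom to reduce the residual color error.
    """
    m, _, _, _ = np.linalg.lstsq(camera_responses, target_values, rcond=None)
    return m  # shape (B, 3)

# Demo with synthetic patches: a known 3x3 mixing matrix is recovered.
rng = np.random.default_rng(2)
true_m = np.array([[0.9, 0.1, 0.0],
                   [0.2, 0.7, 0.1],
                   [0.0, 0.2, 0.8]])
cam = rng.uniform(0.0, 1.0, (24, 3))
xyz = cam @ true_m
m_fit = fit_color_matrix(cam, xyz)
```

With five bands the matrix becomes 5 × 3 and the larger column space is what shrinks the residual color error, exactly the effect the paper quantifies.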
Stray light calibration of the Dawn Framing Camera
NASA Astrophysics Data System (ADS)
Kovacs, Gabor; Sierks, Holger; Nathues, Andreas; Richards, Michael; Gutierrez-Marques, Pablo
2013-10-01
Sensitive imaging systems with high dynamic range onboard spacecraft are susceptible to ghost and stray-light effects. During the design phase, the Dawn Framing Camera was laid out and optimized to minimize those unwanted, parasitic effects. However, the low-distortion requirement on the optical design and the use of a front-lit focal plane array introduced an additional stray-light component. This paper presents the ground-based and in-flight procedures characterizing the stray-light artifacts. The in-flight test used the Sun as the stray-light source at different angles of incidence: the spacecraft was commanded to point at predefined solar elongation positions, and long-exposure images were recorded. The PSNIT function was calculated from the known illumination and the ground-based calibration information. In the ground-based calibration, several extended and point sources were used with long exposure times in dedicated imaging setups. The tests revealed that the major contribution to the stray light comes from ghost reflections between the focal plane array and the band-pass interference filters. Various laboratory experiments and computer modeling simulations were carried out to quantify this effect, including analysis of the diffractive reflection pattern generated by the imaging sensor. Accurate characterization of the detector reflection pattern is the key to successfully predicting the intensity distribution of the ghost image. Based on the results and the properties of the optical system, a novel correction method is applied in the image processing pipeline. The effect of this correction procedure is demonstrated with the first images of asteroid Vesta.
NASA Astrophysics Data System (ADS)
Dong, Shidu; Yang, Xiaofan; He, Bo; Liu, Guojin
2006-11-01
Radiance coming from the interior of an uncooled infrared camera has a significant effect on the measured temperature of the object. This paper presents a three-phase compensation scheme for coping with this effect. The first phase acquires the calibration data and forms the calibration function by least-squares fitting. Likewise, the second phase obtains the compensation data and builds the compensation function by fitting. With the aid of these functions, the third phase determines the temperature of the object in question at any given ambient temperature. Acquiring the compensation data of a camera is known to be very time-consuming. To obtain the compensation data at a reasonable time cost, we propose a transplantable scheme. The idea is to calculate the ratio between the central pixel's responsivity of the child camera to the radiance from the interior and that of the mother camera, and then to determine the compensation data of the child camera using this ratio and the compensation data of the mother camera. Experimental results show that both the child camera and the mother camera can measure the temperature of the object with an error of no more than 2°C.
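The fitting and ratio-based transplant can be sketched as follows, under the simplifying assumption that scaling the fitted compensation function by the responsivity ratio is adequate (the paper's exact functional form and coefficients are not given in the abstract; the values below are hypothetical):

```python
import numpy as np

def fit_compensation(ambient_temps, offsets, degree=2):
    """Least-squares polynomial fit of measured offset vs ambient temperature."""
    return np.polyfit(ambient_temps, offsets, degree)

def transplant_compensation(mother_coeffs, ratio):
    """Scale the mother camera's compensation by the responsivity ratio.

    ratio = (child central-pixel responsivity to interior radiance) /
            (mother central-pixel responsivity to interior radiance).
    Scaling the fitted compensation function by this ratio stands in
    for remeasuring the full compensation data on the child camera.
    """
    return np.asarray(mother_coeffs) * ratio

# Demo: mother-camera compensation measured over ambient temperatures.
t_amb = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
mother_offset = 0.002 * t_amb ** 2 + 0.05 * t_amb + 1.0
coeffs = fit_compensation(t_amb, mother_offset)
child_coeffs = transplant_compensation(coeffs, ratio=1.1)
```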
de Lasarte, Marta; Pujol, Jaume; Arjona, Montserrat; Vilaseca, Meritxell
2007-01-10
We present an optimized linear algorithm for the spatial nonuniformity correction of a CCD color camera's imaging system and the experimental methodology developed for its implementation. We assess the influence of the algorithm's variables on the quality of the correction, that is, the dark image, the base correction image, and the reference level, and the range of application of the correction using a uniform radiance field provided by an integrator cube. The best spatial nonuniformity correction is achieved by having a nonzero dark image, by using an image with a mean digital level placed in the linear response range of the camera as the base correction image and taking the mean digital level of the image as the reference digital level. The response of the CCD color camera's imaging system to the uniform radiance field shows a high level of spatial uniformity after the optimized algorithm has been applied, which also allows us to achieve a high-quality spatial nonuniformity correction of captured images under different exposure conditions.
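The described correction, built from a dark image, a base correction image, and a reference level, matches the classic flat-field form. A hedged sketch of that form, with synthetic per-pixel gain standing in for real nonuniformity (the authors' optimized algorithm may differ in detail):

```python
import numpy as np

def correct_nonuniformity(raw, dark, base, reference_level=None):
    """Flat-field style spatial nonuniformity correction.

    corrected = (raw - dark) / (base - dark) * reference, where `base`
    is an image of a uniform radiance field with its mean digital level
    in the camera's linear range and `dark` is the dark image. When
    reference_level is None, the mean of (base - dark) is used, echoing
    the choice of the image's own mean level as the reference level.
    """
    raw = raw.astype(np.float64)
    gain = base.astype(np.float64) - dark
    if reference_level is None:
        reference_level = gain.mean()
    return (raw - dark) / gain * reference_level

# Demo: a uniform field seen through per-pixel gain becomes flat again.
rng = np.random.default_rng(3)
dark = np.full((32, 32), 10.0)
gain = rng.uniform(0.8, 1.2, (32, 32))
base = dark + 100.0 * gain   # uniform field at digital level ~100
raw = dark + 60.0 * gain     # same field at a lower exposure
flat = correct_nonuniformity(raw, dark, base)
```

Because the per-pixel gain cancels in the ratio, the same correction images serve captures taken under different exposure conditions, as the abstract reports.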
The diagnosing of plasmas using spectroscopy and imaging on Proto-MPEX
NASA Astrophysics Data System (ADS)
Baldwin, K. A.; Biewer, T. M.; Crouse Powers, J.; Hardin, R.; Johnson, S.; McCleese, A.; Shaw, G. C.; Showers, M.; Skeen, C.
2015-11-01
The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device being developed at Oak Ridge National Laboratory (ORNL). The machine is planned to study plasma-material interaction (PMI) physics relevant to future fusion reactors. We tested and learned to use spectroscopy and imaging tools consisting of a spectrometer, a high-speed camera, an infrared camera, and thermocouples. The spectrometer measures the color of the light from the plasma and its intensity. We used the high-speed camera to see how the magnetic field acts on the plasma and how the gas is heated into the fourth state of matter. The thermocouples measure the temperature of the objects they are placed against, in this case the end plates of the machine. We also used the infrared camera to see the heat pattern of the plasma on the end plates. Data from these instruments will be shown. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725, and the Oak Ridge Associated Universities ARC program.
Refocusing distance of a standard plenoptic camera.
Hahne, Christopher; Aggoun, Amar; Velisavljevic, Vladan; Fiebig, Susanne; Pesch, Matthias
2016-09-19
Recent developments in computational photography enabled variation of the optical focus of a plenoptic camera after image exposure, also known as refocusing. Existing ray models in the field simplify the camera's complexity for the purpose of image and depth map enhancement, but fail to satisfyingly predict the distance to which a photograph is refocused. By treating a pair of light rays as a system of linear functions, it is shown in this paper that its solution yields an intersection indicating the distance to a refocused object plane. Experimental work is conducted with different lenses and focus settings while comparing distance estimates with a stack of refocused photographs for which a blur metric has been devised. Quantitative assessments over a 24 m distance range suggest that predictions deviate by less than 0.35% in comparison to an optical design software. The proposed refocusing estimator assists in predicting object distances, for instance in the prototyping stage of plenoptic cameras, and will be an essential feature in applications demanding high precision in synthetic focus or where depth map recovery is done by analyzing a stack of refocused photographs.
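Treating a pair of rays as linear functions and solving for their intersection can be sketched as follows; the offsets and slopes are hypothetical, and the parametrisation is illustrative rather than the paper's exact ray model:

```python
def refocus_distance(a1, m1, a2, m2):
    """Distance at which two light rays intersect.

    Each ray is modelled as a linear function y(z) = a + m*z (offset a
    at the lens plane, slope m). Setting the two equal and solving the
    resulting 2x2 linear system gives the z of the refocused object
    plane, plus the common height y at the intersection.
    """
    if m1 == m2:
        raise ValueError("parallel rays never intersect")
    z = (a2 - a1) / (m1 - m2)
    y = a1 + m1 * z
    return z, y

# Two hypothetical rays leaving the lens at different heights and slopes;
# they converge on the optical axis at z = 500 (arbitrary units).
z, y = refocus_distance(a1=1.0, m1=-0.002, a2=-1.0, m2=0.002)
```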
VizieR Online Data Catalog: Redshifts of galaxies in Abell 1351 field (Barrena+, 2014)
NASA Astrophysics Data System (ADS)
Barrena, R.; Girardi, M.; Boschin, W.; de Grandi, S.; Rossetti, M.
2015-03-01
Multi-object spectroscopic (MOS) observations of A1351 were carried out at the TNG on 2010 March 10. We used DOLORES/MOS with the LR-B Grism 1, yielding a dispersion of 187 Å/mm. We used the 2048x2048 pixel E2V CCD, with a pixel size of 13.5 μm. In total, we observed four MOS masks including 143 slits. For each mask, the exposure time was 3x1800 s. We had already observed the A1351 field with the Wide Field Camera (WFC), mounted at the prime focus of the 2.5 m INT telescope. We took exposures of 9x600 s and 9x300 s in B and R Harris filters in photometric conditions and 1.2-arcsec seeing. However, we used SDSS-DR7 data because a greater number of photometric bands are available, which allows an accurate colour analysis. INT and SDSS-DR7 photometric data are very similar. The completeness magnitude is r'=20.8. (1 data file).
Adjustable long duration high-intensity point light source
NASA Astrophysics Data System (ADS)
Krehl, P.; Hagelweide, J. B.
1981-06-01
A new long-duration high-intensity point light source with adjustable light duration and a small light spot locally stable in time has been developed. The principle involved is a stationary high-temperature plasma flow inside a partly constrained capillary of a coaxial spark gap which is viewed end-on through a terminating Plexiglas window. The point light spark gap is operated via a resistor by an artificial transmission line. Using two exchangeable inductance sets in the line, two ranges of photoduration, 10-130 μs and 100-600 μs, can be covered. For a light spot size of 1.5 mm diameter the corresponding peak light output amounts to 5×10^6 and 1.6×10^6 candelas, respectively. Within these ranges the duration is controlled by an ignitron crowbar to extinguish the plasma. The adjustable photoduration is very useful for the application of continuous-writing rotating-mirror cameras, thus preventing multiple exposures. The essentially uniform exposure within the visible spectral range makes the new light source suitable for color cinematography.
1989-08-19
Range: 8.6 million kilometers (5.3 million miles). Voyager took this 61-second exposure of Neptune through the clear filter of the narrow-angle camera. The Voyager cameras were programmed to make a systematic search for faint ring arcs and new satellites. The bright upper corner of the image is due to a residual image from a previous long exposure of the planet. The portion of the arc visible here is approximately 35 degrees in longitudinal extent, making it approximately 38,000 kilometers (24,000 miles) in length, and is broken up into three segments separated from each other by approximately 5 degrees. The trailing edge is at the upper right and ends abruptly, while the leading edge seems to fade into the background more gradually. This arc orbits very close to one of the newly discovered Neptune satellites, 1989N4. Close-up studies of this ring arc will be carried out in the coming days, giving higher spatial resolution at different lighting angles. (JPL Ref: P-34617)
Nuclear Cardiology: Are We Using the Right Protocols and Tracers the Right Way?
Dondi, Maurizio; Pascual, Thomas; Paez, Diana; Einstein, Andrew J
2017-12-01
The field of nuclear cardiology has changed considerably over recent years, with greater attention paid to safety and radiation protection issues. The wider usage of technetium-99m (Tc-99m)-labeled radiopharmaceuticals for single-photon emission computed tomography (SPECT) imaging using gamma cameras has contributed to better quality studies and lower radiation exposure to patients. Increased availability of tracers and scanners for positron emission tomography (PET) will help further improve the quality of studies and quantify myocardial blood flow and myocardial flow reserve, thus enhancing the contribution of non-invasive imaging to the management of coronary artery disease. The introduction of new instrumentation such as solid state cameras and new software will help reduce further radiation exposure to patients undergoing nuclear cardiology studies. Results from recent studies, focused on assessing the relationship between best practices and radiation risk, provide useful insights on simple measures to improve the safety of nuclear cardiology studies without compromising the quality of results.
Large-mirror testing facility at the National Optical Astronomy Observatories.
NASA Astrophysics Data System (ADS)
Barr, L. D.; Coudé du Foresto, V.; Fox, J.; Poczulp, G. A.; Richardson, J.; Roddier, C.; Roddier, F.
1991-09-01
A method for testing the surfaces of large mirrors has been developed to be used even when conditions of vibration and thermal turbulence in the light path cannot be eliminated. The full aperture of the mirror under test is examined by means of a scatterplate interferometer that has the property of being a quasi-common-path method, although any means for obtaining interference fringes will do. The method uses a remotely operated CCD camera system to record the fringe pattern from the workpiece. The typical test is done with a camera exposure of about a millisecond to "freeze" the fringe pattern on the detector. Averaging up to 10 separate exposures effectively eliminates the turbulence effects. The method described provides the optician with complete numerical information and visual plots for the surface under test and the diffracted image the method will produce, all within a few minutes, to an accuracy of 0.01 μm measured peak-to-valley.
Astronaut Ronald Evans photographed during transearth coast EVA
NASA Technical Reports Server (NTRS)
1972-01-01
Astronaut Ronald E. Evans is photographed performing extravehicular activity (EVA) during the Apollo 17 spacecraft's transearth coast. During his EVA, Command Module pilot Evans retrieved film cassettes from the Lunar Sounder, Mapping Camera, and Panoramic Camera. The cylindrical object at Evans' left side is the mapping camera cassette. The total time for the transearth EVA was one hour, seven minutes, 19 seconds, starting at a ground elapsed time of 257:25 (2:28 p.m.) and ending at a ground elapsed time of 258:42 (3:35 p.m.) on Sunday, December 17, 1972.
Li, Jin; Liu, Zilong
2017-07-24
Remote sensing cameras in the visible/near-infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e. image quality here, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself, including its optical system, image sensor, and electronics, limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which depends only on the camera itself and is stable and insensitive to changes in ground targets, atmosphere, and environment on orbit or on the ground, is extracted using a pixel optical focal-plane (PFP). The PFP produces multiple spatial frequency targets, which are used to calculate the IMTF at different frequencies. The resulting IMTF, in combination with a constrained least-squares filter, compensates for the camera-induced degradation, i.e. it removes the imaging effects imposed by the camera itself. This method is experimentally confirmed. Experiments on an on-orbit panchromatic camera indicate that the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56 compared to the case where the IMTF is not used. This opens a door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
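The constrained least-squares compensation mentioned in the abstract can be illustrated with the textbook frequency-domain CLS filter. This is a generic 1-D sketch, not the authors' implementation: it regularises the inverse of a measured MTF with a Laplacian smoothness term, all names being illustrative.

```python
import numpy as np

def cls_restore(blurred, mtf, gamma=0.01):
    """Constrained least-squares restoration in the frequency domain.

    blurred : degraded 1-D signal (e.g. a camera scan line)
    mtf     : measured system MTF sampled at the FFT frequencies
              (assumed real and non-negative here)
    gamma   : regularisation weight on the Laplacian smoothness term
    """
    n = len(blurred)
    G = np.fft.fft(blurred)
    H = np.asarray(mtf, dtype=float)
    # Frequency response of the discrete Laplacian kernel [1, -2, 1]
    lap = np.zeros(n); lap[0] = -2.0; lap[1] = 1.0; lap[-1] = 1.0
    P = np.fft.fft(lap)
    # CLS estimate: conj(H) G / (|H|^2 + gamma |P|^2)
    F = np.conj(H) * G / (np.abs(H) ** 2 + gamma * np.abs(P) ** 2)
    return np.real(np.fft.ifft(F))
```

With gamma -> 0 this tends to the plain inverse filter; the Laplacian term keeps the high-frequency amplification bounded where the MTF is small.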
Kelly, Paul; Thomas, Emma; Doherty, Aiden; Harms, Teresa; Burke, Órlaith; Gershuny, Jonathan; Foster, Charlie
2015-01-01
Self-report time use diaries collect a continuous sequenced record of daily activities but the validity of the data they produce is uncertain. This study tests the feasibility of using wearable cameras to generate, through image prompted interview, reconstructed 'near-objective' data to assess their validity. 16 volunteers completed the Harmonised European Time Use Survey (HETUS) diary and used an Autographer wearable camera (recording images at approximately 15 second intervals) for the waking hours of the same 24-hour period. Participants then completed an interview in which visual images were used as prompts to reconstruct a record of activities for comparison with the diary record. 14 participants complied with the full collection protocol. We compared time use and number of discrete activities from the diary and camera records (using 10 classifications of activity). In terms of aggregate totals of daily time use we found no significant difference between the diary and camera data. In terms of number of discrete activities, participants reported a mean of 19.2 activities per day in the diaries, while image prompted interviews revealed 41.1 activities per day. The visualisations of the individual activity sequences reveal some potentially important differences between the two record types, which will be explored at the next project stage. This study demonstrates the feasibility of using wearable cameras to reconstruct time use through image prompted interview in order to test the concurrent validity of 24-hour activity time-use budgets. In future we need a suitably powered study to assess the validity and reliability of 24-hour time use diaries. PMID:26633807
Apollo 12 stereo view of lunar surface upon which astronaut had stepped
1969-11-20
AS12-57-8448 (19-20 Nov. 1969) --- An Apollo 12 stereo view showing a three-inch square of the lunar surface upon which an astronaut had stepped. Taken during extravehicular activity of astronauts Charles Conrad Jr. and Alan L. Bean, the exposure of the boot imprint was made with an Apollo 35mm stereo close-up camera. The camera was developed to get the highest possible resolution of a small area. The three-inch square is photographed with a flash illumination and at a fixed distance. The camera is mounted on a walking stick, and the astronauts use it by holding it up against the object to be photographed and pulling the trigger. While astronauts Conrad and Bean descended in their Apollo 12 Lunar Module to explore the lunar surface, astronaut Richard F. Gordon Jr. remained with the Command and Service Modules in lunar orbit.
The Galileo Solid-State Imaging experiment
Belton, M.J.S.; Klaasen, K.P.; Clary, M.C.; Anderson, J.L.; Anger, C.D.; Carr, M.H.; Chapman, C.R.; Davies, M.E.; Greeley, R.; Anderson, D.; Bolef, L.K.; Townsend, T.E.; Greenberg, R.; Head, J. W.; Neukum, G.; Pilcher, C.B.; Veverka, J.; Gierasch, P.J.; Fanale, F.P.; Ingersoll, A.P.; Masursky, H.; Morrison, D.; Pollack, James B.
1992-01-01
The Solid State Imaging (SSI) experiment on the Galileo Orbiter spacecraft utilizes a high-resolution (1500 mm focal length) television camera with an 800 × 800 pixel virtual-phase, charge-coupled detector. It is designed to return images of Jupiter and its satellites characterized by a combination of sensitivity levels, spatial resolution, geometric fidelity, and spectral range unmatched by imaging data obtained previously. The spectral range extends from approximately 375 to 1100 nm, and only in the near-ultraviolet region (≲ 350 nm) is the spectral coverage reduced from previous missions. The camera is approximately 100 times more sensitive than those used in the Voyager mission and, because of the nature of the satellite encounters, will produce images with approximately 100 times the ground resolution (i.e., ∼50 m lp^-1) on the Galilean satellites. We describe aspects of the detector including its sensitivity to energetic particle radiation and how the requirements for a large full-well capacity and long-term stability in operating voltages led to the choice of the virtual-phase chip. The F/8.5 camera system can reach point sources of V(mag) ≈ 11 with S/N ≈ 10 and extended sources with surface brightness as low as 20 kR in its highest gain state and longest exposure mode. We describe the performance of the system as determined by ground calibration and the improvements that have been made to the telescope (the same basic catadioptric design that was used in Mariner 10 and the Voyager high-resolution cameras) to reduce the scattered light reaching the detector. The images are linearly digitized 8 bits deep and, after flat-fielding, are cosmetically clean. Information 'preserving' and 'non-preserving' on-board data compression capabilities are outlined. A special "summation" mode, designed for use deep in the Jovian radiation belts, near Io, is also described. The detector is 'preflashed' before each exposure to ensure photometric linearity.
The dynamic range is spread over 3 gain states and an exposure range from 4.17 ms to 51.2 s. A low level of radial, third-order geometric distortion, entirely due to the optical design, has been measured in the raw images. The distortion is of the pincushion type and amounts to about 1.2 pixels in the corners of the images. It is expected to be very stable. We discuss the measurement objectives of the SSI experiment in the Jupiter system and emphasize their relationships to those of other experiments in the Galileo project. We outline objectives for Jupiter atmospheric science, noting the relationship of SSI data to that to be returned by experiments on the atmospheric entry Probe. We also outline SSI objectives for satellite surfaces, ring structure, and 'darkside' (e.g., aurorae, lightning, etc.) experiments. Proposed cruise measurement objectives that relate to encounters at Venus, the Moon, the Earth, Gaspra, and, possibly, Ida are also briefly outlined. The article concludes with a description of a 'fully distributed' data analysis system (HIIPS) that SSI team members intend to use at their home institutions. We also list the nature of systematic data products that will become available to the scientific community. Finally, we append a short 'historical' note outlining the responsibilities and roles of the institutions and individuals that have been involved in the 14-year development of the SSI experiment so far. © 1992 Kluwer Academic Publishers.
Applying Bayesian hierarchical models to examine motorcycle crashes at signalized intersections.
Haque, Md Mazharul; Chin, Hoong Chor; Huang, Helai
2010-01-01
Motorcycles are overrepresented in road traffic crashes and particularly vulnerable at signalized intersections. The objective of this study is to identify causal factors affecting motorcycle crashes at both four-legged and T signalized intersections. Treating the data in time-series cross-section panels, this study explores different hierarchical Poisson models and finds that the model allowing an autoregressive lag-1 dependence specification in the error term is the most suitable. Results show that the number of lanes at four-legged signalized intersections significantly increases motorcycle crashes, largely because of the higher exposure resulting from higher motorcycle accumulation at the stop line. Furthermore, the presence of a wide median and an uncontrolled left-turn lane at major roadways of four-legged intersections exacerbates this potential hazard. For T signalized intersections, the presence of an exclusive right-turn lane at both major and minor roadways and an uncontrolled left-turn lane at major roadways increase motorcycle crashes. Motorcycle crashes increase on high-speed roadways because riders are more vulnerable and less likely to react in time during conflicts. The presence of red light cameras reduces motorcycle crashes significantly at both four-legged and T intersections. With the red light camera, motorcycles are less exposed to conflicts because they are observed to be more disciplined in queuing at the stop line and less likely to jump-start at the beginning of green.
Accurate estimation of camera shot noise in real time
NASA Astrophysics Data System (ADS)
Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.
2017-10-01
Nowadays digital cameras are essential parts of various technological processes and daily tasks. They are widely used in optics and photonics, astronomy, biology and various other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and video cameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: temporal noise comprises the random component, while spatial noise comprises the pattern component. Temporal noise can be further divided into signal-dependent shot noise and signal-independent dark temporal noise. The most widely used approaches for measuring camera noise characteristics are standards such as EMVA Standard 1288, which allows precise shot and dark temporal noise measurement but is difficult to implement and time-consuming. Earlier we proposed a method for measurement of the temporal noise of photo- and video cameras based on the automatic segmentation of nonuniform targets (ASNT); only two frames are sufficient for noise measurement with the modified method. In this paper, we registered frames and estimated the shot and dark temporal noise of cameras in real time using the modified ASNT method. Estimation was performed for the following cameras: the consumer photo camera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PL-B781F (CMOS, 6.6 MP, 10-bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. The time for registering and processing the frames used for temporal noise estimation was measured: using a standard computer, frames were registered and processed in a fraction of a second to several seconds. The accuracy of the obtained temporal noise values was also estimated.
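The variance-versus-mean analysis behind such Poisson-based noise estimates can be sketched from a single pair of frames of a static, nonuniform scene. This is a generic illustration under assumed statistics, not the ASNT implementation itself; all names are hypothetical:

```python
import numpy as np

def temporal_noise_from_frame_pair(frame_a, frame_b, n_bins=32):
    """Estimate temporal noise from two frames of the same static scene.

    The per-pixel difference of the two frames cancels the fixed signal
    and the pattern noise, leaving twice the temporal noise variance.
    Binning pixels by mean signal and fitting variance vs. mean separates
    the signal-dependent shot noise (slope) from the signal-independent
    dark noise (intercept).
    """
    a = frame_a.astype(float)
    b = frame_b.astype(float)
    mean = 0.5 * (a + b)
    diff = a - b                       # temporal noise only, variance 2*sigma^2
    bins = np.linspace(mean.min(), mean.max(), n_bins + 1)
    idx = np.digitize(mean.ravel(), bins)
    means, variances = [], []
    for i in range(1, n_bins + 1):
        sel = idx == i
        if sel.sum() > 100:            # need enough pixels per bin
            means.append(mean.ravel()[sel].mean())
            variances.append(0.5 * diff.ravel()[sel].var())
    slope, intercept = np.polyfit(means, variances, 1)
    return slope, intercept            # shot-noise slope, dark-noise variance
```

For an ideal sensor the slope equals the conversion gain in digital numbers per electron, which is why the fitted curves follow a Poisson law until saturation compresses the variance.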
Assessment of the UV camera sulfur dioxide retrieval for point source plumes
Dalton, M.P.; Watson, I.M.; Nadeau, P.A.; Werner, C.; Morrow, W.; Shannon, J.M.
2009-01-01
Digital cameras, sensitive to specific regions of the ultra-violet (UV) spectrum, have been employed for quantifying sulfur dioxide (SO2) emissions in recent years. The instruments make use of the selective absorption of UV light by SO2 molecules to determine pathlength concentration. Many monitoring advantages are gained by using this technique, but the accuracy and limitations have not been thoroughly investigated. The effects of some user-controlled parameters, including image exposure duration, the diameter of the lens aperture, the frequency of calibration cell imaging, and the use of single or paired bandpass filters, have not yet been addressed. In order to clarify methodological consequences and quantify accuracy, laboratory and field experiments were conducted. Images were collected of calibration cells under varying observational conditions, and our conclusions provide guidance for enhanced image collection. Results indicate that the calibration cell response is reliably linear below 1500 ppm·m, but that the response is significantly affected by changing light conditions. Exposure durations that produce maximum image digital numbers above 32 500 counts can reduce noise in plume images. Sulfur dioxide retrieval results from a coal-fired power plant plume were compared to direct sampling measurements, and the results indicate that the accuracy of the UV camera retrieval method is within the range of current spectrometric methods. © 2009 Elsevier B.V.
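The retrieval principle, selective Beer-Lambert absorption quantified against calibration cells, can be sketched as a simplified two-filter model. Variable names and band centres are illustrative assumptions; real retrievals add further corrections (light dilution, filter transmission curves, etc.):

```python
import numpy as np

def so2_column(plume_on, plume_off, sky_on, sky_off, cal_slope):
    """Two-filter UV camera SO2 retrieval (Beer-Lambert sketch).

    plume_on/off : pixel intensities through the SO2-absorbing (~310 nm)
                   and non-absorbing (~330 nm) bandpass filters
    sky_on/off   : clear-sky intensities in the same bands
    cal_slope    : ppm·m per unit apparent absorbance, taken from the
                   linear response of imaged calibration cells
    """
    # Apparent absorbance; the off-band ratio corrects for broadband
    # aerosol extinction common to both filters.
    tau = -np.log((plume_on / sky_on) / (plume_off / sky_off))
    return cal_slope * tau
```

The linear cal_slope is exactly the regime the abstract describes as reliable below 1500 ppm·m; beyond that, the calibration cell response rolls off and a single slope no longer applies.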
A method for measuring aircraft height and velocity using dual television cameras
NASA Technical Reports Server (NTRS)
Young, W. R.
1977-01-01
A unique electronic optical technique, consisting of two closed circuit television cameras and timing electronics, was devised to measure an aircraft's horizontal velocity and height above ground without the need for airborne cooperative devices. The system is intended to be used where the aircraft has a predictable flight path and a height of less than 660 meters (2,000 feet) at or near the end of an air terminal runway, but is suitable for greater aircraft altitudes whenever the aircraft remains visible. Two television cameras, pointed at zenith, are placed in line with the expected path of travel of the aircraft. Velocity is determined by measuring the time it takes the aircraft to travel the measured distance between cameras. Height is determined by correlating this speed with the time required to cross the field of view of either camera. Preliminary tests with a breadboard version of the system and a small model aircraft indicate the technique is feasible.
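The geometry described above reduces to two small formulas. A sketch under the assumptions of level flight and a symmetric along-track field of view (function and parameter names are illustrative, not from the report):

```python
import math

def aircraft_velocity_and_height(cam_separation_m, t_between_s,
                                 t_cross_fov_s, fov_deg):
    """Velocity and height from two zenith-pointed cameras.

    cam_separation_m : measured ground distance between the two cameras
    t_between_s      : time for the aircraft to travel between cameras
    t_cross_fov_s    : time to cross the field of view of one camera
    fov_deg          : along-track field of view of that camera
    """
    v = cam_separation_m / t_between_s            # ground speed
    # The ground distance subtended by the field of view at height h is
    # 2*h*tan(fov/2); the aircraft covers it in t_cross_fov_s at speed v.
    swath = v * t_cross_fov_s
    h = swath / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    return v, h
```

For example, cameras 100 m apart, 2 s between cameras and 4 s to cross a 53.13-degree field of view give v = 50 m/s and h close to 200 m.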
Monitoring Kilauea Volcano Using Non-Telemetered Time-Lapse Camera Systems
NASA Astrophysics Data System (ADS)
Orr, T. R.; Hoblitt, R. P.
2006-12-01
Systematic visual observations are an essential component of monitoring volcanic activity. At the Hawaiian Volcano Observatory, the development and deployment of a new generation of high-resolution, non-telemetered, time-lapse camera systems provides periodic visual observations in inaccessible and hazardous environments. The camera systems combine a hand-held digital camera, programmable shutter-release, and other off-the-shelf components in a package that is inexpensive, easy to deploy, and ideal for situations in which the probability of equipment loss due to volcanic activity or theft is substantial. The camera systems have proven invaluable in correlating eruptive activity with deformation and seismic data streams. For example, in late 2005 and much of 2006, Pu`u `O`o, the active vent on Kilauea Volcano's East Rift Zone, experienced 10- to 20-hour cycles of inflation and deflation that correlated with increases in seismic energy release. A time-lapse camera looking into a skylight above the main lava tube about 1 km south of the vent showed an increase in lava level, an indicator of increased lava flux, during periods of deflation, and a decrease in lava level during periods of inflation. A second time-lapse camera, with a broad view of the upper part of the active flow field, allowed us to correlate the same cyclic tilt and seismicity with lava breakouts from the tube. The breakouts were accompanied by rapid uplift and subsidence of shatter rings over the tube. The shatter rings, concentric rings of broken rock, rose and subsided by as much as 6 m in less than an hour during periods of varying flux. Time-lapse imagery also permits improved assessment of volcanic hazards, and is invaluable in illustrating the hazards to the public. In collaboration with Hawaii Volcanoes National Park, camera systems have been used to monitor the growth of lava deltas at the entry point of lava into the ocean to determine the potential for catastrophic collapse.
Li, Rui; Elson, Daniel S; Dunsby, Chris; Eckersley, Robert; Tang, Meng-Xing
2011-04-11
Ultrasound-modulated optical tomography (UOT) combines optical contrast with ultrasound spatial resolution and has great potential for soft tissue functional imaging. One current problem with this technique is the weak optical modulation signal, primarily due to strong optical scattering in diffuse media and minimal acoustically induced modulation. The acoustic radiation force (ARF) can create large particle displacements in tissue and has been shown to be able to improve optical modulation signals. However, shear wave propagation induced by the ARF can be a significant source of nonlocal optical modulation which may reduce UOT spatial resolution and contrast. In this paper, the time evolution of shear waves was examined on tissue mimicking-phantoms exposed to 5 MHz ultrasound and 532 nm optical radiation and measured with a CCD camera. It has been demonstrated that by generating an ARF with an acoustic burst and adjusting both the timing and the exposure time of the CCD measurement, optical contrast and spatial resolution can be improved by ~110% and ~40% respectively when using the ARF rather than 5 MHz ultrasound alone. Furthermore, it has been demonstrated that this technique simultaneously detects both optical and mechanical contrast in the medium and the optical and mechanical contrast can be distinguished by adjusting the CCD exposure time. © 2011 Optical Society of America
Biasetti, Jacopo; Sampath, Kaushik; Cortez, Angel; Azhir, Alaleh; Gilad, Assaf A; Kickler, Thomas S; Obser, Tobias; Ruggeri, Zaverio M; Katz, Joseph
2017-01-01
Tracking cells and proteins' phenotypic changes in deep suspensions is critical for the direct imaging of blood-related phenomena in in vitro replica of cardiovascular systems and blood-handling devices. This paper introduces fluorescence imaging techniques for space and time resolved detection of platelet activation, von Willebrand factor (VWF) conformational changes, and VWF-platelet interaction in deep suspensions. Labeled VWF, platelets, and VWF-platelet strands are suspended in deep cuvettes, illuminated, and imaged with a high-sensitivity EM-CCD camera, allowing detection using an exposure time of 1 ms. In-house postprocessing algorithms identify and track the moving signals. Recombinant VWF-eGFP (rVWF-eGFP) and VWF labeled with an FITC-conjugated polyclonal antibody are employed. Anti-P-Selectin FITC-conjugated antibodies and the calcium-sensitive probe Indo-1 are used to detect activated platelets. A positive correlation between the mean number of platelets detected per image and the percentage of activated platelets determined through flow cytometry is obtained, validating the technique. An increase in the number of rVWF-eGFP signals upon exposure to shear stress demonstrates the technique's ability to detect breakup of self-aggregates. VWF globular and unfolded conformations and self-aggregation are also observed. The ability to track the size and shape of VWF-platelet strands in space and time provides means to detect pro- and antithrombotic processes.
PRIMAS: a real-time 3D motion-analysis system
NASA Astrophysics Data System (ADS)
Sabel, Jan C.; van Veenendaal, Hans L. J.; Furnee, E. Hans
1994-03-01
The paper describes a CCD TV-camera-based system for real-time multicamera 2D detection of retro-reflective targets and software for accurate and fast 3D reconstruction. Applications of this system can be found in the fields of sports, biomechanics, rehabilitation research, and various other areas of science and industry. The new feature of real-time 3D opens an even broader perspective of application areas; animations in virtual reality are an interesting example. After presenting an overview of the hardware and the camera calibration method, the paper focuses on the real-time algorithms used for matching of the images and subsequent 3D reconstruction of marker positions. When using a calibrated setup of two cameras, it is now possible to track at least ten markers at 100 Hz. Limitations in the performance are determined by the visibility of the markers, which could be improved by adding a third camera.
Mach-zehnder based optical marker/comb generator for streak camera calibration
Miller, Edward Kirk
2015-03-03
This disclosure is directed to a method and apparatus for generating marker and comb indicia in an optical environment using a Mach-Zehnder (M-Z) modulator. High-speed recording devices are configured to record image or other data defining a high-speed event. To calibrate and establish a time reference, marker or comb indicia, which serve as timing pulses (markers) or as a constant-frequency train of optical pulses (comb), are imaged on a streak camera for accurate time-based calibration and time reference. The system includes a camera, an optic signal generator which provides an optic signal to an M-Z modulator, and biasing and modulation signal generators configured to provide input to the M-Z modulator. An optical reference signal is provided to the M-Z modulator. The M-Z modulator modulates the reference signal to a higher-frequency optical signal which is output through a fiber-coupled link to the streak camera.
Research and implementation of simulation for TDICCD remote sensing in vibration of optical axis
NASA Astrophysics Data System (ADS)
Liu, Zhi-hong; Kang, Xiao-jun; Lin, Zhe; Song, Li
2013-12-01
During the exposure time, the charge transfer speed in the push-broom direction and the line-by-line scanning speed of the sensor must match each other strictly for a space-borne TDICCD push-broom camera. However, since attitude disturbance of the satellite and vibration of the camera are inevitable, it is impossible to eliminate the speed mismatch, which makes the signals of different targets overlay each other and results in a decline of image resolution. The effects of velocity mismatch can be visually observed and analyzed by simulating the degradation of image quality caused by vibration of the optical axis, which is significant for the evaluation of image quality and the design of image restoration algorithms. The first problem to be solved is how to model the imaging process in the time and space domains during the imaging time. As the vibration information for simulation is usually given by a continuous curve while the pixels of the original image matrix and the sensor matrix are discrete, the two cannot always match each other well. The simulation result is also influenced by discrete sampling over the integration time. An appropriate discrete modeling and simulation method is therefore quite significant for improving simulation accuracy and efficiency. This paper analyses discretization schemes in the time and space domains and presents a method, based on the principle of the TDICCD sensor, to simulate the image quality of the optical system under vibration of the line of sight. The gray value of the pixels in the sensor matrix is obtained by weighted arithmetic, which solves the problem of pixel mismatch. Results compared with a hardware test experiment indicate that this simulation system performs well in accuracy and reliability.
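The weighted-arithmetic resampling idea can be illustrated with a simple line-of-sight smear simulation. This sketch is not the authors' code; it merely shows how a fractional sub-pixel offset at each time sample can be distributed over the two neighbouring pixel rows by linear weighting:

```python
import numpy as np

def smear_simulation(img, x_of_t, n_steps):
    """Simulate image degradation from line-of-sight vibration.

    img     : ideal scene (2-D array)
    x_of_t  : function returning the along-track offset in pixels at
              normalised time t in [0, 1)
    n_steps : number of discrete samples of the integration time

    Each time sample shifts the scene by a (generally fractional)
    offset; the shift is split between the two neighbouring rows with
    linear weights, handling the sensor/scene pixel mismatch.
    """
    acc = np.zeros_like(img, dtype=float)
    for k in range(n_steps):
        dx = x_of_t(k / n_steps)
        lo = int(np.floor(dx))
        w = dx - lo                    # linear interpolation weight
        acc += (1 - w) * np.roll(img, lo, axis=0) \
             + w * np.roll(img, lo + 1, axis=0)
    return acc / n_steps
```

With a zero offset curve the output reproduces the input exactly, which is a convenient sanity check on the discretization.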
Visual control of robots using range images.
Pomares, Jorge; Gil, Pablo; Torres, Fernando
2010-01-01
In recent years, 3D-vision systems based on the time-of-flight (ToF) principle have gained importance as a way to obtain 3D information from the workspace. In this paper, an analysis of the use of 3D ToF cameras to guide a robot arm is performed. To do so, an adaptive method for simultaneous visual servo control and camera calibration is presented. Using this method, a robot arm is guided by range information obtained from a ToF camera. Furthermore, the self-calibration method obtains the adequate integration time to be used by the range camera in order to precisely determine the depth information.
A user-friendly technical set-up for infrared photography of forensic findings.
Rost, Thomas; Kalberer, Nicole; Scheurer, Eva
2017-09-01
Infrared photography is interesting for use in forensic science and forensic medicine since it reveals findings that are normally almost invisible to the human eye. Originally, infrared photography was made possible by an infrared light transmission filter screwed in front of the camera objective lens. However, this set-up is associated with many drawbacks, such as the loss of the autofocus function, the need for an external infrared source, and long exposure times which make the use of a tripod necessary. These limitations have until now prevented the routine application of infrared photography in forensics. In this study the use of a professional modification inside the digital camera body was evaluated regarding camera handling and image quality. This permanent modification consisted of the replacement of the built-in infrared blocking filter by an infrared transmission filter of 700 nm and 830 nm, respectively. The application of this camera set-up for the photo-documentation of forensically relevant post-mortem findings was investigated in examples of trace evidence such as gunshot residues on the skin, in external findings, e.g. hematomas, as well as in an exemplary internal finding, i.e., Wischnewski spots in a putrefied stomach. The application of scattered light created by indirect flashlight yielded a more uniform illumination of the object, and the use of the 700 nm filter resulted in better pictures than the 830 nm filter. Compared to pictures taken under visible light, infrared photographs generally yielded better contrast. This allowed for discerning more details and revealed findings which were not visible otherwise, such as imprints on a fabric and tattoos in mummified skin. The permanent modification of a digital camera by building in a 700 nm infrared transmission filter resulted in a user-friendly and efficient set-up suitable for use in the daily forensic routine.
Main advantages were a clear picture in the viewfinder, an autofocus usable over the whole range of infrared light, and the possibility of using short shutter speeds, which allows infrared pictures to be taken free-hand. The proposed set-up with a modification of the camera allows a user-friendly application of infrared photography in post-mortem settings. Copyright © 2017 Elsevier B.V. All rights reserved.
Ocular dynamics and visual tracking performance after Q-switched laser exposure
NASA Astrophysics Data System (ADS)
Zwick, Harry; Stuck, Bruce E.; Lund, David J.; Nawim, Maqsood
2001-05-01
In previous investigations of q-switched laser retinal exposure in awake, task-oriented non-human primates (NHPs), the threshold for retinal damage occurred well below the threshold for permanent visual function loss. Visual function measures used in these studies involved visual acuity and contrast sensitivity. In the present study, we examine the same relationship for q-switched laser exposure using a visual performance task whose task dependency involves more parafoveal than foveal retina. NHPs were trained on a visual pursuit motor tracking performance task that required maintaining a small HeNe laser spot (0.3 degrees) centered in a slowly moving (0.5 deg/sec) annulus. When NHPs reliably produced visual target tracking efficiencies > 80%, single q-switched laser exposures (7 nsec) were made coaxially with the line of sight of the moving target. An infrared camera imaged the pupil during exposure to obtain the pupillary response to the laser flash. Retinal images were obtained with a scanning laser ophthalmoscope 3 days post exposure under ketamine and Nembutal anesthesia. Q-switched visible laser exposures at twice the damage threshold produced small (about 50 µm) retinal lesions temporal to the fovea; deficits in NHP visual pursuit tracking were transient, demonstrating full recovery to baseline within a single tracking session. Post-exposure analysis of the pupillary response demonstrated that the exposure flash entered the pupil, followed by a 90 msec refractory period and then a 12% pupillary contraction within 1.5 sec of the onset of laser exposure. At 6 times the morphological threshold damage level for 532 nm q-switched exposure, longer-term losses in NHP pursuit tracking performance were observed. In summary, q-switched laser exposure appears to have a higher threshold for permanent visual performance loss than for threshold retinal injury.
Mechanisms of neural plasticity within the retina and at higher visual brain centers may mediate this recovery.
Heterogeneous Vision Data Fusion for Independently Moving Cameras
2010-03-01
target detection, tracking, and identification over a large terrain. The goal of the project is to investigate and evaluate the existing image…fusion algorithms, develop new real-time algorithms for Category-II image fusion, and apply these algorithms in moving target detection and tracking. The…moving target detection and classification.
ERIC Educational Resources Information Center
Smallman, Kirk
The fundamentals of motion picture photography are introduced with a physiological explanation for the illusion of motion in a film. Film stock formats and emulsions, camera features, and lights are listed and described. Various techniques of exposure control are illustrated in terms of their effects. Photographing action with a stationary or a…
ERIC Educational Resources Information Center
Bernstein, Alan; Alan, Michael
2009-01-01
A good high school cultural arts program puts students' creative talents on display, allows them to demonstrate innovative thinking, and gives them direct and indirect exposure to careers in the cultural arts. A truly outstanding program goes even further to emulate real-world authentic artistic experiences and to create multidisciplinary artistic…
A Reconfigurable Real-Time Compressive-Sampling Camera for Biological Applications
Fu, Bo; Pitter, Mark C.; Russell, Noah A.
2011-01-01
Many applications in biology, such as long-term functional imaging of neural and cardiac systems, require continuous high-speed imaging. This is typically not possible, however, using commercially available systems. The frame rate and the recording time of high-speed cameras are limited by the digitization rate and the capacity of on-camera memory. Further restrictions are often imposed by the limited bandwidth of the data link to the host computer. Even if the system bandwidth is not a limiting factor, continuous high-speed acquisition results in very large volumes of data that are difficult to handle, particularly when real-time analysis is required. In response to this issue, many cameras allow a predetermined rectangular region of interest (ROI) to be sampled; however, this approach lacks flexibility and is blind to the image region outside the ROI. We have addressed this problem by building a camera system using a randomly addressable CMOS sensor. The camera has a low bandwidth, but is able to capture continuous high-speed images of an arbitrarily defined ROI, using most of the available bandwidth, while simultaneously acquiring low-speed, full-frame images using the remaining bandwidth. In addition, the camera is able to use the full-frame information to recalculate the positions of targets and update the high-speed ROIs without interrupting acquisition. In this way the camera is capable of imaging moving targets at high speed while simultaneously imaging the whole frame at a lower speed. We have used this camera system to monitor the heartbeat and blood cell flow of a water flea (Daphnia) at frame rates in excess of 1500 fps. PMID:22028852
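The ROI-update step described above, recomputing a target's position from the low-speed full frame so the high-speed ROI can follow it, can be sketched as follows. This is a minimal stand-in, not the camera's on-board logic; the thresholding rule and the helper name are our own choices.

```python
import numpy as np

def recenter_roi(full_frame, roi_size):
    """Re-estimate the ROI centre from a low-speed full frame via the
    intensity-weighted centroid of pixels well above background.
    Hypothetical helper illustrating the idea, not the actual firmware."""
    h, w = full_frame.shape
    mask = full_frame > full_frame.mean() + 2 * full_frame.std()
    if not mask.any():
        return (h // 2, w // 2)                  # nothing found: frame centre
    ys, xs = np.nonzero(mask)
    weights = full_frame[ys, xs]
    cy = int(round(np.average(ys, weights=weights)))
    cx = int(round(np.average(xs, weights=weights)))
    half = roi_size // 2                         # clamp the ROI inside the sensor
    return (min(max(cy, half), h - half), min(max(cx, half), w - half))

# Simulated full frame with a bright target centred at row 40, column 20.
frame = np.zeros((64, 64))
frame[38:43, 18:23] = 100.0
cy, cx = recenter_roi(frame, roi_size=16)
```

On each full-frame readout, the returned centre would be used to re-address the high-speed ROI without interrupting acquisition.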
Electronic camera-management system for 35-mm and 70-mm film cameras
NASA Astrophysics Data System (ADS)
Nielsen, Allan
1993-01-01
Military and commercial test facilities have been tasked with the need for increasingly sophisticated data collection and data reduction. A state-of-the-art electronic control system for high-speed 35 mm and 70 mm film cameras designed to meet these tasks is described. Data collection in today's test range environment is difficult at best. The need for a completely integrated image and data collection system is mandated by the increasingly complex test environment. Instrumentation film cameras have been used on test ranges to capture images for decades. Their high frame rates coupled with exceptionally high resolution make them an essential part of any test system. In addition to documenting test events, today's camera system is required to perform many additional tasks. Data reduction to establish TSPI (time-space-position information) may be performed after a mission and is subject to all of the variables present in documenting the mission. A typical scenario would consist of multiple cameras located on tracking mounts capturing the event along with azimuth and elevation position data. Corrected data can then be reduced using each camera's time and position deltas and calculating the TSPI of the object using triangulation. An electronic camera control system designed to meet these requirements has been developed by Photo-Sonics, Inc. The feedback received from test technicians at range facilities throughout the world led Photo-Sonics to design the features of this control system. These prominent new features include: a comprehensive safety management system, full local or remote operation, frame rate accuracy of less than 0.005 percent, and phase-locking capability to IRIG-B. In fact, IRIG-B phase-lock operation of multiple cameras can reduce the time-distance delta of a test object traveling at Mach 1 to less than one inch during data reduction.
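The triangulation step described above, combining azimuth/elevation bearings from two tracking mounts into a 3-D position, can be sketched as the closest point between two rays. This is a generic least-squares construction under assumed geometry conventions (x east, y north, z up; azimuth from north), not Photo-Sonics' actual reduction pipeline.

```python
import numpy as np

def bearing(az_deg, el_deg):
    """Unit line-of-sight vector from azimuth/elevation in degrees
    (x east, y north, z up; azimuth measured clockwise from north)."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.cos(el) * np.sin(az),
                     np.cos(el) * np.cos(az),
                     np.sin(el)])

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + t*d1 and
    p2 + s*d2 -- a least-squares position estimate for noisy bearings."""
    r = p2 - p1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    e, f = d1 @ r, d2 @ r
    denom = a * c - b * b          # zero only for parallel rays
    t = (e * c - b * f) / denom
    s = (b * e - a * f) / denom
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

# Two hypothetical mounts 1 km apart observing a target at (400, 300, 500) m.
p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([1000.0, 0.0, 0.0])
target = np.array([400.0, 300.0, 500.0])
d1 = (target - p1) / np.linalg.norm(target - p1)
d2 = (target - p2) / np.linalg.norm(target - p2)
est = triangulate(p1, d1, p2, d2)
```

With exact bearings the two rays intersect and the midpoint recovers the target; with measurement noise the same formula returns the best-fit point between the skew rays.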
VizieR Online Data Catalog: Multiwavelength photometry of Sh 2-138 YSOs (Baug+, 2015)
NASA Astrophysics Data System (ADS)
Baug, T.; Ojha, D. K.; Dewangan, L. K.; Ninan, J. P.; Bhatt, B. C.; Ghosh, S. K.; Mallick, K. K.
2016-07-01
Optical BVRI imaging observations of the Sh2-138 region were carried out on 2005 September 8 using the Himalaya Faint Object Spectrograph and Camera (HFOSC) mounted on the 2 m Himalayan Chandra Telescope (HCT). In order to identify strong Hα emission sources in the Sh2-138 region, slitless Hα spectra were obtained using the HFOSC on 2007 November 16. Optical spectroscopic observations of the central brightest source were performed using the HFOSC on 2014 November 18. The newly installed TIFR Near Infrared Spectrometer and Imager Camera (TIRSPEC) on the HCT was used for NIR observations on 2014 November 18 under photometric conditions with an average seeing of 1.4 arcsec. We obtained NIR spectra of the central brightest source on 2014 May 29, using the TIRSPEC, in NIR Y (1.02-1.20um), J (1.21-1.48um), H (1.49-1.78um), and K (2.04-2.35um) bands. We conducted optical narrow-band imaging observations of the region in Hα filter (λ~6563Å, Δλ~100Å) with exposure times of 600s, 250s, and 50s on 2005 September 8 using the HFOSC. (1 data file).
NASA Technical Reports Server (NTRS)
Blades, J. C.; Barlow, M. J.; Albrecht, R.; Barbieri, C.; Boksenberg, A.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.; Kamperman, T. M.
1992-01-01
Using the Faint Object Camera on board the Hubble Space Telescope, we have obtained images of four planetary nebulae (PNe) in the Magellanic Clouds, namely N2 and N5 in the SMC and N66 and N201 in the LMC. Each nebula was imaged through two narrow-band filters isolating forbidden O III 5007 and H-beta, for a nominal exposure time of 1000 s in each filter. In forbidden O III, SMC N5 shows a circular ring structure, with a peak-to-peak diameter of 0.26 arcsec and a FWHM of 0.35 arcsec, while SMC N2 shows an elliptical ring structure with a peak-to-peak diameter of 0.26 x 0.21 arcsec. The expansion ages corresponding to the observed structures in SMC N2 and N5 are of the order of 3000 yr. LMC N201 is very compact, with a FWHM of 0.2 arcsec in H-beta. The Type I PN LMC N66 is a multipolar nebula, with the brightest part having an extent of about 2 arcsec and with fainter structures extending over 4 arcsec.
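The quoted expansion ages follow from dividing the physical ring radius by an expansion velocity. A back-of-envelope check, using an assumed SMC distance of ~60 kpc and a typical PN expansion velocity of ~20 km/s (illustrative values, not taken from the abstract), lands at the same order of magnitude:

```python
# Back-of-envelope expansion age: t = (angular radius x distance) / v_exp.
# The SMC distance and expansion velocity below are assumed typical values.
PC_M = 3.0857e16            # metres per parsec
ARCSEC_RAD = 4.8481e-6      # radians per arcsecond
YEAR_S = 3.156e7            # seconds per year

distance_m = 60e3 * PC_M                 # assumed distance to the SMC (~60 kpc)
ang_radius_arcsec = 0.26 / 2             # half the peak-to-peak diameter
radius_m = ang_radius_arcsec * ARCSEC_RAD * distance_m
v_exp_ms = 20e3                          # assumed expansion velocity (~20 km/s)
age_yr = radius_m / v_exp_ms / YEAR_S
```

The result is roughly 2000 yr, consistent with the "order of 3000 yr" quoted above given the uncertainty in the assumed velocity.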
Mars Rover Mastcam View of Possible Mud Cracks
2017-01-17
This view of a Martian rock slab called "Old Soaker," which has a network of cracks that may have originated in drying mud, comes from the Mast Camera (Mastcam) on NASA's Curiosity Mars rover. The location is within an exposure of Murray formation mudstone on lower Mount Sharp inside Gale Crater. Mud cracks would be evidence of a time more than 3 billion years ago when dry intervals interrupted wetter periods that supported lakes in the area. Curiosity has found evidence of ancient lakes in older, lower-lying rock layers and also in younger mudstone that is above Old Soaker. Several images from Mastcam's left-eye camera are combined into this mosaic view. They were taken on Dec. 20, 2016, during the 1,555th Martian day, or sol, of Curiosity's work on Mars. The Old Soaker slab is about 4 feet (1.2 meters) long. Figure 1 includes a scale bar of 30 centimeters (12 inches). The scene is presented with a color adjustment that approximates white balancing, to resemble how the rocks and sand would appear under daytime lighting conditions on Earth. http://photojournal.jpl.nasa.gov/catalog/PIA21262
Natural canopy bridges effectively mitigate tropical forest fragmentation for arboreal mammals.
Gregory, Tremaine; Carrasco-Rueda, Farah; Alonso, Alfonso; Kolowski, Joseph; Deichmann, Jessica L
2017-06-20
Linear infrastructure development and resulting habitat fragmentation are expanding in Neotropical forests, and arboreal mammals may be disproportionately impacted by these linear habitat clearings. Maintaining canopy connectivity through preservation of connecting branches (i.e. natural canopy bridges) may help mitigate that impact. Using camera traps, we evaluated crossing rates of a pipeline right-of-way in a control area with no bridges and in a test area where 13 bridges were left by the pipeline construction company. Monitoring all canopy crossing points for a year (7,102 canopy camera nights), we confirmed bridge use by 25 mammal species from 12 families. With bridge use beginning immediately after exposure and increasing over time, use rates were over two orders of magnitude higher than on the ground. We also found a positive relationship between a bridge's use rate and the number of species that used it, suggesting well-used bridges benefit multiple species. Data suggest bridge use may be related to a combination of bridge branch connectivity, multiple connections, connectivity to adjacent forest, and foliage cover. Given the high use rate and minimal cost, we recommend all linear infrastructure projects in forests with arboreal mammal populations include canopy bridges.
Park, Sung Pyo; Siringo, Frank S.; Pensec, Noelle; Hong, In Hwan; Sparrow, Janet; Barile, Gaetano; Tsang, Stephen H.; Chang, Stanley
2015-01-01
BACKGROUND AND OBJECTIVE To compare fundus autofluorescence (FAF) imaging via fundus camera (FC) and confocal scanning laser ophthalmoscope (cSLO). PATIENTS AND METHODS FAF images were obtained with a digital FC (530 to 580 nm excitation) and a cSLO (488 nm excitation). Two authors evaluated correlation of autofluorescence pattern, atrophic lesion size, and image quality between the two devices. RESULTS In 120 eyes, the autofluorescence pattern correlated in 86% of lesions. By lesion subtype, correlation rates were 100% in hemorrhage, 97% in geographic atrophy, 82% in flecks, 75% in drusen, 70% in exudates, 67% in pigment epithelial detachment, 50% in fibrous scars, and 33% in macular hole. The mean lesion size in geographic atrophy was 4.57 ± 2.3 mm2 via cSLO and 3.81 ± 1.94 mm2 via FC (P < .0001). Image quality favored cSLO in 71 eyes. CONCLUSION FAF images were highly correlated between the FC and cSLO. Differences between the two devices were nevertheless apparent: multiple image capture and confocal optics yielded higher image contrast with the cSLO, although acquisition and exposure times were longer. PMID:24221461
NASA Astrophysics Data System (ADS)
Xie, Yijing; Thom, Maria; Ebner, Michael; Wykes, Victoria; Desjardins, Adrien; Miserocchi, Anna; Ourselin, Sebastien; McEvoy, Andrew W.; Vercauteren, Tom
2017-11-01
In high-grade glioma surgery, tumor resection is often guided by intraoperative fluorescence imaging. 5-aminolevulinic acid-induced protoporphyrin IX (PpIX) provides fluorescent contrast between normal brain tissue and glioma tissue, thus achieving improved tumor delineation and prolonged patient survival compared with conventional white-light-guided resection. However, commercially available fluorescence imaging systems rely solely on visual assessment of fluorescence patterns by the surgeon, which makes the resection more subjective than necessary. We developed a wide-field spectrally resolved fluorescence imaging system utilizing a Generation II scientific CMOS camera and an improved computational model for the precise reconstruction of the PpIX concentration map. In our model, the tissue's optical properties and illumination geometry, which distort the fluorescent emission spectra, are considered. We demonstrate that the CMOS-based system can detect low PpIX concentration at short camera exposure times, while providing high-pixel resolution wide-field images. We show that total variation regularization improves the contrast-to-noise ratio of the reconstructed quantitative concentration map by approximately twofold. Quantitative comparison between the estimated PpIX concentration and tumor histopathology was also investigated to further evaluate the system.
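The total variation regularization credited above with roughly doubling the contrast-to-noise ratio can be illustrated with a generic TV-denoising sketch. This is plain gradient descent on a smoothed TV functional, not the paper's actual reconstruction model (which also folds in tissue optical properties and illumination geometry); the parameter values are illustrative.

```python
import numpy as np

def _grad_tv(x, eps):
    """Gradient of the smoothed total variation sum(sqrt(dx^2 + dy^2 + eps))."""
    dx = np.diff(x, axis=1, append=x[:, -1:])   # forward diff, zero at right edge
    dy = np.diff(x, axis=0, append=x[-1:, :])   # forward diff, zero at bottom edge
    mag = np.sqrt(dx**2 + dy**2 + eps)
    px, py = dx / mag, dy / mag
    g = -px - py                                 # adjoint of the forward difference
    g[:, 1:] += px[:, :-1]
    g[1:, :] += py[:-1, :]
    return g

def tv_denoise(y, lam=0.15, n_iter=300, step=0.1, eps=1e-2):
    """Minimise 0.5*||x - y||^2 + lam*TV_eps(x) by plain gradient descent."""
    x = y.copy()
    for _ in range(n_iter):
        x -= step * ((x - y) + lam * _grad_tv(x, eps))
    return x

# Piecewise-constant "concentration map" (a bright square) plus noise.
rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_denoise(noisy)
```

TV regularization suits quantitative maps like this because it suppresses pixel-scale noise while preserving sharp boundaries, unlike quadratic smoothing which blurs edges.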
The DUV Stability of Superlattice-Doped CMOS Detector Arrays
NASA Technical Reports Server (NTRS)
Hoenk, M. E.; Carver, A. G.; Jones, T.; Dickie, M.; Cheng, P.; Greer, H. F.; Nikzad, S.; Sgro, J.; Tsur, S.
2013-01-01
JPL and Alacron have recently developed a high-performance DUV camera with a superlattice-doped CMOS imaging detector. Superlattice-doped detectors achieve nearly 100% internal quantum efficiency in the deep and far ultraviolet, and a single-layer Al2O3 antireflection coating enables 64% external quantum efficiency at 263 nm. In lifetime tests performed at Applied Materials using 263 nm pulsed solid-state and 193 nm pulsed excimer lasers, the quantum efficiency and dark current of the JPL/Alacron camera remained stable to better than 1% precision during long-term exposure to several billion laser pulses, with no measurable degradation, no blooming and no image memory at 1000 fps.
Multi-camera synchronization core implemented on USB3 based FPGA platform
NASA Astrophysics Data System (ADS)
Sousa, Ricardo M.; Wäny, Martin; Santos, Pedro; Dias, Morgado
2015-03-01
Centered on Awaiba's NanEye CMOS image sensor family and an FPGA platform with USB3 interface, the aim of this paper is to demonstrate a new technique to synchronize up to 8 individual self-timed cameras with minimal error. Small form factor self-timed camera modules of 1 mm x 1 mm or smaller do not normally allow external synchronization. However, for stereo vision or 3D reconstruction with multiple cameras, as well as for applications requiring pulsed illumination, it is necessary to synchronize multiple cameras. In this work, the challenge of synchronizing multiple self-timed cameras over only a 4-wire interface has been solved by adaptively regulating the power supply of each camera. To that effect, a control core was created to constantly monitor the operating frequency of each camera by measuring the line period in each frame based on a well-defined sampling signal. The frequency is adjusted by varying the voltage level applied to the sensor based on the error between the measured line period and the desired line period. To ensure phase synchronization between frames, a Master-Slave interface was implemented. A single camera is defined as the Master, with its operating frequency controlled directly through a PC-based interface. The remaining cameras are set up in Slave mode and are interfaced directly with the Master camera control module. This enables them to monitor the Master's line and frame periods and adjust their own to achieve phase and frequency synchronization. The result of this work will allow the implementation of stereo vision equipment smaller than 3 mm in diameter in a medical endoscopic context, such as endoscopic surgical robotics or micro-invasive surgery.
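The voltage-based frequency regulation described above can be modelled as a simple closed loop: measure the line period, compare it with the Master's, and nudge the supply voltage accordingly. The linear voltage-to-line-period camera model and the controller gain below are invented for illustration, not Awaiba's actual sensor characteristics.

```python
def line_period_ns(vdd):
    """Hypothetical self-timed sensor model: a higher supply voltage
    shortens the line period (2000 ns nominal at 1.8 V)."""
    return 2000.0 - 400.0 * (vdd - 1.8)

def regulate(target_ns, vdd=1.8, gain=1e-4, steps=200):
    """Proportional control: nudge Vdd until the measured line period
    matches the Master camera's line period."""
    for _ in range(steps):
        error = line_period_ns(vdd) - target_ns   # ns; positive = too slow
        vdd += gain * error                       # raise Vdd to speed the rows up
    return vdd, line_period_ns(vdd)

# Slave converges to a Master line period of 1950 ns.
vdd, period = regulate(target_ns=1950.0)
```

In the real system this loop runs per camera in the FPGA control core; a separate Master-Slave handshake then aligns frame phase once the frequencies match.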
Infrared Imaging Camera Final Report CRADA No. TC02061.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, E. V.; Nebeker, S.
This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100 nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera and the test results revealed that the camera exceeded presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two year project. The project was not started on time due to changes in the IPP project funding conditions; the project funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes within the Russian government export regulations. These changes were directed by Export Control regulations on the export of high technology items that can be used to develop military weapons. The IR camera was on the list that export controls required. The ISTC and Russian government, after negotiations, allowed the delivery of the camera to LLNL. There were no significant technical or business changes to the original project.
Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras
NASA Technical Reports Server (NTRS)
Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.
2011-01-01
The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive-only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 µm) or long-wave infrared (LWIR) radiation (8-12 µm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24 hour water and 12 hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.
A detail-preserved and luminance-consistent multi-exposure image fusion algorithm
NASA Astrophysics Data System (ADS)
Wang, Guanquan; Zhou, Yue
2018-04-01
When irradiance across a scene varies greatly, we can hardly get an image of the scene without over- or underexposed areas, because of the limited dynamic range of cameras. Multi-exposure image fusion (MEF) is an effective method to deal with this problem by fusing multi-exposure images of a static scene. A novel MEF method is described in this paper. In the proposed algorithm, coarser-scale luminance consistency is preserved by contribution adjustment using the luminance information between blocks; a detail-preserving smoothing filter stitches blocks together smoothly without losing details. Experimental results show that the proposed method performs well in preserving luminance consistency and details.
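The core idea behind MEF, weighting each exposure by how well-exposed each pixel is and blending, can be sketched per pixel. This is not the block-based method of the paper; real MEF methods, including the one above, add block/pyramid or edge-aware smoothing to avoid seams, and the "well-exposedness" Gaussian is one common weighting choice.

```python
import numpy as np

def fuse_exposures(stack, sigma=0.2):
    """Naive per-pixel MEF: weight each exposure by closeness to
    mid-grey (0.5), normalise the weights across exposures, and blend."""
    stack = np.asarray(stack, dtype=float)          # shape (n, H, W), values in [0, 1]
    w = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    w /= w.sum(axis=0, keepdims=True)               # weights sum to 1 per pixel
    return (w * stack).sum(axis=0)

# Two simulated exposures of one scene: one crushes the shadows,
# the other blows out the highlights.
under = np.array([[0.05, 0.45], [0.02, 0.50]])
over  = np.array([[0.50, 0.98], [0.45, 1.00]])
fused = fuse_exposures([under, over])
```

Each fused pixel is pulled toward whichever exposure recorded it nearest mid-grey, so both shadow and highlight detail survive in the single output image.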
NASA Technical Reports Server (NTRS)
Mulrooney, M.; Hickson, P.; Stansbery, Eugene G.
2010-01-01
MCAT (Meter-Class Autonomous Telescope) is a 1.3 m f/4 Ritchey-Chrétien on a double horseshoe equatorial mount that will be deployed in early 2011 to the western Pacific island of Legan in the Kwajalein Atoll to perform orbital debris observations. MCAT will be capable of tracking earth-orbiting objects at all inclinations and at altitudes from 200 km to geosynchronous. MCAT's primary objective is the detection of new orbital debris in both low-inclination low-earth orbits (LEO) and at geosynchronous earth orbit (GEO). MCAT was thus designed with a fast focal ratio and a large unvignetted image circle able to accommodate a detector sized to yield a large field of view. The selected primary detector is a closed-cycle-cooled 4K x 4K, 15 µm pixel CCD camera that yields a 0.9 degree diagonal field. For orbital debris detection in widely spaced angular-rate regimes, the camera must offer low read-noise performance over a wide range of framing rates. MCAT's 4-port camera operates from 100 kHz to 1.5 MHz per port, at 2 e- and 10 e- read noise respectively. This enables low-noise multi-second exposures for GEO observations as well as rapid (several frames per second) exposures for LEO. GEO observations will be performed using a counter-sidereal time delay integration (TDI) technique which NASA has used successfully in the past. For MCAT, the GEO survey, detection, and follow-up prediction algorithms will be automated. These algorithms will be detailed herein. For LEO observations two methods will be employed. The first, Orbit Survey Mode (OSM), will scan specific orbital inclination and altitude regimes, detect new orbital debris objects against trailed background stars, and adjust the telescope track to follow the detected object. The second, Stare and Chase Mode (SCM), will perform a stare, then detect and track objects that enter the field of view and satisfy specific rate and brightness criteria.
As with GEO, the LEO operational modes will be fully automated and will be described herein. The automation of photometric and astrometric processing (thus streamlining data collection for environmental modeling) will also be discussed.
Object tracking using multiple camera video streams
NASA Astrophysics Data System (ADS)
Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford
2010-05-01
Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusions, which can leave an object only partially visible or hidden in one camera while the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases the robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, in each stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection depends on the speed of the object as well as on variations in the direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked over long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.
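The per-camera detection step, finding "anomalies caused by the objects' movement across frames in time", can be sketched as frame differencing between consecutive synchronized frames. This is a minimal stand-in for the detection stage only; registering the resulting detections between the two camera streams and adjusting for perspective would follow.

```python
import numpy as np

def detect_motion(prev, curr, thresh=0.1):
    """Frame differencing: flag pixels that changed between consecutive
    frames and return the centroid of the changed region, or None if
    nothing moved. A toy per-camera detector, not the paper's method."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    mask = diff > thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (ys.mean(), xs.mean())

# An object moves from column 5 to column 6 between two frames.
a = np.zeros((16, 16)); a[8, 5] = 1.0
b = np.zeros((16, 16)); b[8, 6] = 1.0
c = detect_motion(a, b)
```

Running this on both synchronized streams gives one detection per camera per frame pair; matching those detections across cameras is what lets one stream cover occlusions in the other.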
NASA Astrophysics Data System (ADS)
Martínez-González, A.; Moreno-Hernández, D.; Monzón-Hernández, D.; León-Rodríguez, M.
2017-06-01
In the schlieren method, the deflection of light by the presence of an inhomogeneous medium is proportional to the gradient of its refractive index. Such deflection, in a schlieren system, is represented by light intensity variations on the observation plane. Then, for a digital camera, the intensity level registered by each pixel depends mainly on the variation of the medium refractive index and the status of the digital camera settings. Therefore, in this study, we regulate the intensity value of each pixel by controlling the camera settings such as exposure time, gamma and gain values in order to calibrate the image obtained to the actual temperature values of a particular medium. In our approach, we use a color digital camera. The images obtained with a color digital camera can be separated on three different color-channels. Each channel corresponds to red, green, and blue color, moreover, each one has its own sensitivity. The differences in sensitivity allow us to obtain a range of temperature values for each color channel. Thus, high, medium and low sensitivity correspond to green, blue, and red color channel respectively. Therefore, by adding up the temperature contribution of each color channel we obtain a wide range of temperature values. Hence, the basic idea in our approach to measure temperature, using a schlieren system, is to relate the intensity level of each pixel in a schlieren image to the corresponding knife-edge position measured at the exit focal plane of the system. Our approach was applied to the measurement of instantaneous temperature fields of the air convection caused by a heated rectangular metal plate and a candle flame. We found that for the metal plate temperature measurements only the green and blue color-channels were required to sense the entire phenomena. On the other hand, for the candle case, the three color-channels were needed to obtain a complete measurement of temperature. 
In our study, the candle temperature was taken as the reference, and the maximum temperature value obtained for the green, blue and red color-channels was ∼275.6, ∼412.9, and ∼501.3 °C, respectively.
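The per-channel calibration idea above, mapping each color channel's pixel intensity to temperature through its own calibration curve so that the three sensitivity ranges together cover a wide temperature span, can be sketched as a lookup with interpolation. The calibration points below are invented for illustration; real curves would come from the knife-edge calibration described in the abstract.

```python
import numpy as np

# Hypothetical per-channel calibration: intensity (0-255) -> temperature (deg C).
# Green is the most sensitive channel (low temperatures), red the least (high).
CAL = {
    "green": ([30, 120, 220], [25, 90, 180]),
    "blue":  ([30, 120, 220], [80, 250, 420]),
    "red":   ([30, 120, 220], [200, 380, 510]),
}

def temperature(channel, intensity):
    """Interpolate the channel's calibration curve (assumed monotonic)."""
    xs, ts = CAL[channel]
    return float(np.interp(intensity, xs, ts))

t_low = temperature("green", 120)   # mid-intensity reading on the green channel
t_high = temperature("red", 220)    # saturating reading on the red channel
```

Each channel is trusted only over the intensity range where it responds; combining the three channel estimates then yields the full temperature field, as in the candle measurement above.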
Progress In Ground Based Mercury's Imaging
NASA Astrophysics Data System (ADS)
Ksanfomality, L.
The reduction of exposure time considerably improves the resolution of images of astronomical objects, which usually equals 1.0"-1.5" for telescopes of moderate diameter. The poor resolution is determined specifically by atmospheric instability. A considerable reduction of the exposure became possible only with the advent of effective CCD receivers. Certainly, the reduction of exposure does not eliminate distortions, though it can eliminate the image blurring. Nevertheless, it is possible to select the images with small distortions from a number of images. This study using the short-exposure method for observations of the planet Mercury started in 1998 (Ksanfomality, 1998). A similar study of Mercury using ground-based technique was fulfilled by J. Warell (Astronomical Observatory Uppsala, Sweden) and S. S. Limaye (University of Wisconsin-Madison, USA). J. Warell started his work as early as 1995 (Warell and Limaye, 2001). Later, J. Baumgardner, M. Mendello and J. K. Wilson (2000) succeeded in obtaining an image of a portion of Mercury's surface not covered by the Mariner-10 imaging. They used a CCD camera operating continuously at 30 frames/sec and chose the best frames for a compilation. Due to the planet's low orbit, the possible duration of ground-based observations of Mercury is extremely limited. Nevertheless the new results are really promising and can be a means for obtaining new information on the planet; it may be of importance for new missions to Mercury that are now in progress both by NASA (the Messenger project) and by the European Space Agency (the BepiColombo project).
Within the framework of the Mercury investigations carried out at the Institute for Space Research, Russian Academy of Sciences, on December 1-3, 1999, and November 1-10, 2001, observations of the planet Mercury were carried out at the Abastumani Astrophysical Observatory (Republic of Georgia) by the short-exposure method using a charge-coupled device (CCD) camera. A great advantage of the observations at this Observatory was its considerable height above sea level (about 1700 m), which is important for observations at large zenith distances. The AZT-11 telescope (D = 1.25 m, Cassegrain focus F = 16 m) was used as a long-focus instrument. Due to a random fortunate coincidence, the phase and position of the planet happened to be virtually the same as in 1974, when Mercury was observed from the Mariner-10 spacecraft. Exposures as short as 3 ms and up to 70 ms were used. The atmosphere on November 3rd became unusually clear for about 45 min, and about 300 electronic images of different quality were gathered. Sophisticated processing of observationally independent data rows resulted in many compositional electronic images with a number of small features, never seen before and repeating time and again, which could be successfully compared with the Mariner-10 mosaic. A set of the images is presented.
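The frame-selection step behind this short-exposure ("lucky imaging") technique, scoring each frame for sharpness and keeping only the best for stacking, can be sketched as follows. The gradient-energy metric and the kept fraction are our own illustrative choices, not the authors' processing pipeline.

```python
import numpy as np

def sharpness(frame):
    """Gradient-energy focus metric; atmospheric blurring lowers it."""
    gy, gx = np.gradient(frame.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def lucky_stack(frames, keep_fraction=0.1):
    """Keep only the sharpest fraction of the short-exposure frames
    and average them into one composite image."""
    scored = sorted(frames, key=sharpness, reverse=True)
    n = max(1, int(len(scored) * keep_fraction))
    return np.mean(scored[:n], axis=0)

# One sharp frame (fine checkerboard detail) among nine featureless ones.
sharp = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)
frames = [sharp] + [np.full((8, 8), 0.5) for _ in range(9)]
best = lucky_stack(frames)
```

Because short exposures freeze the atmosphere, a small fraction of frames is nearly diffraction-limited; selecting and stacking them recovers detail that a single long exposure would smear out.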
NASA Astrophysics Data System (ADS)
Mundhenk, Terrell N.; Dhavale, Nitin; Marmol, Salvador; Calleja, Elizabeth; Navalpakkam, Vidhya; Bellman, Kirstie; Landauer, Chris; Arbib, Michael A.; Itti, Laurent
2003-10-01
In view of the growing complexity of computational tasks and their design, we propose that certain interactive systems may be better designed by utilizing computational strategies based on the study of the human brain. Compared with current engineering paradigms, brain theory offers the promise of improved self-organization and adaptation to the current environment, freeing the programmer from having to address those issues in a procedural manner when designing and implementing large-scale complex systems. To advance this hypothesis, we discuss a multi-agent surveillance system where 12 agent CPUs, each with its own camera, compete and cooperate to monitor a large room. To cope with the overload of image data streaming from 12 cameras, we take inspiration from the primate's visual system, which allows the animal to perform a real-time selection of the few most conspicuous locations in visual input. This is accomplished by having each camera agent utilize the bottom-up, saliency-based visual attention algorithm of Itti and Koch (Vision Research 2000;40(10-12):1489-1506) to scan the scene for objects of interest. Real-time operation is achieved using a distributed version that runs on a 16-CPU Beowulf cluster composed of the agent computers. The algorithm guides cameras to track and monitor salient objects based on maps of color, orientation, intensity, and motion. To spread camera view points or create cooperation in monitoring highly salient targets, camera agents bias each other by increasing or decreasing the weight of different feature vectors in other cameras, using mechanisms similar to excitation and suppression that have been documented in electrophysiology, psychophysics and imaging studies of low-level visual processing. In addition, if cameras need to compete for computing resources, allocation of computational time is weighed based upon the history of each camera. 
A camera agent that has a history of seeing more salient targets is more likely to obtain computational resources. The system demonstrates the viability of biologically inspired systems in real-time tracking. In future work we plan on implementing additional biological mechanisms for cooperative management of both the sensor and processing resources in this system, including top-down biasing for target specificity as well as novelty and the activity of the tracked object in relation to sensitive features of the environment.
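The feature-map combination and inter-agent biasing described above can be sketched in miniature. This is a toy illustration under assumed inputs: the map contents, feature names and bias factor are invented, and the real Itti-Koch model involves multi-scale center-surround and normalization stages not shown here.

```python
def combine_saliency(feature_maps, weights):
    """Weighted sum of per-feature conspicuity maps (equal-sized 2D
    lists) into a single saliency map, the final stage of the
    bottom-up attention pipeline."""
    names = list(feature_maps)
    h = len(feature_maps[names[0]])
    w = len(feature_maps[names[0]][0])
    return [[sum(weights[n] * feature_maps[n][y][x] for n in names)
             for x in range(w)] for y in range(h)]

def bias(weights, feature, factor):
    """Another camera agent excites (factor > 1) or suppresses
    (factor < 1) one feature channel, mimicking the inter-camera
    biasing used to spread viewpoints."""
    out = dict(weights)
    out[feature] *= factor
    return out

# Two toy 1x2 conspicuity maps: a color-salient and a motion-salient pixel.
maps = {"color":  [[1.0, 0.0]],
        "motion": [[0.0, 1.0]]}
w0 = {"color": 1.0, "motion": 1.0}
s_before = combine_saliency(maps, w0)
s_after = combine_saliency(maps, bias(w0, "motion", 0.5))  # motion suppressed
```

After suppression the motion-driven location loses half its saliency, so a second camera biased this way would attend elsewhere.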
NASA Astrophysics Data System (ADS)
Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott
2003-09-01
A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product enabling easy, drop-in camera designs for quick time-to-market product introductions.
A 3D photographic capsule endoscope system with full field of view
NASA Astrophysics Data System (ADS)
Ou-Yang, Mang; Jeng, Wei-De; Lai, Chien-Cheng; Kung, Yi-Chinn; Tao, Kuan-Heng
2013-09-01
Current capsule endoscopes use one camera to capture surface images in the intestine. Such a system can observe an abnormal point but cannot obtain complete information about it. Using two cameras can generate 3D images, but the visual plane changes as the capsule endoscope rotates, so the two cameras cannot capture the image information completely. To solve this problem, this research presents a new kind of capsule endoscope for capturing 3D images: a 3D photographic capsule endoscope system. The system uses three cameras to capture images in real time. Its advantage is an increase in viewing range of up to 2.99 times with respect to the two-camera system. Combined with a 3D monitor, the system provides exact information about symptom points, helping doctors diagnose disease.
Assessment of virtual reality robotic simulation performance by urology resident trainees.
Ruparel, Raaj K; Taylor, Abby S; Patel, Janil; Patel, Vipul R; Heckman, Michael G; Rawal, Bhupendra; Leveillee, Raymond J; Thiel, David D
2014-01-01
To examine resident performance on the Mimic dV-Trainer (MdVT; Mimic Technologies, Inc., Seattle, WA) for correlation with resident trainee level (postgraduate year [PGY]), console experience (CE), and simulator exposure in their training program to assess for internal bias with the simulator. Residents from programs of the Southeastern Section of the American Urologic Association participated. Each resident was scored on 4 simulator tasks (peg board, camera targeting, energy dissection [ED], and needle targeting) with 3 different outcomes (final score, economy of motion score, and time to complete exercise) measured for each task. These scores were evaluated for association with PGY, CE, and simulator exposure. Robotic skills training laboratory. A total of 27 residents from 14 programs of the Southeastern Section of the American Urologic Association participated. Time to complete the ED exercise was significantly shorter for residents who had logged live robotic console time compared with those who had not (p = 0.003). There were no other associations with live robotic console time that approached significance (all p ≥ 0.21). The only measure that was significantly associated with PGY was time to complete the ED exercise (p = 0.009). No associations with previous utilization of a robotic simulator in the resident's home training program were statistically significant. The ED exercise on the MdVT is most associated with CE and PGY compared with other exercises. Exposure of trainees to the MdVT in training programs does not appear to alter performance scores compared with trainees who do not have the simulator. © 2013 Published by Association of Program Directors in Surgery on behalf of Association of Program Directors in Surgery.
How to handle 6GBytes a night and not get swamped
NASA Technical Reports Server (NTRS)
Allsman, R.; Alcock, C.; Axelrod, T.; Bennett, D.; Cook, K.; Park, H.-S.; Griest, K.; Marshall, S.; Perlmutter, S.; Stubbs, C.
1992-01-01
The Macho Project has undertaken a 5 year effort to search for dark matter in the halo of the Galaxy by scanning the Magellanic Clouds for micro-lensing events. Each evening's raw image data will be reduced in real-time into the observed stars' photometric measurements. The actual search for micro-lensing events will be a post-processing operation. The theoretical prediction of the rate of such events necessitates the collection of a large number of repeated exposures. The project-designed camera subsystem delivers 64 Mbytes per exposure, with exposures typically occurring every 500 seconds. An ideal evening's observing will provide 6 Gbytes of raw image data and 40 Mbytes of reduced photometric measurements. Recognizing the difficulty of digging out from a snowballing cascade of raw data, the project requires the real-time reduction of each evening's data. The software team's implementation strategy centered on this non-negotiable mandate. Accepting the reality that 2 full-time people needed to implement the core real-time control and data management system within 6 months, off-the-shelf vendor components were explored to provide quick solutions to the classic needs for file management, data management, and process control. Where vendor solutions were lacking, state-of-the-art models were used for hand-tailored subsystems. In particular, petri nets manage process control, memory-mapped bulletin boards provide interprocess communication between the multi-tasked processes, and C++ class libraries provide memory-mapped, disk-resident databases. The differences between the implementation strategy and the final implementation reality are presented. The necessity of validating vendor product claims is explored. Both the successful and hindsight decisions enabling the collection and processing of the nightly data barrage are reviewed.
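The memory-mapped bulletin-board idea can be illustrated in miniature. This is a single-process toy using an anonymous mapping; the record layout (a sequence counter plus one value) is an assumption, and the project's actual implementation was C++ class libraries over file-backed mappings shared between processes, not Python.

```python
import mmap
import struct

RECORD = "qd"                       # int64 sequence counter + float64 value
BOARD_BYTES = struct.calcsize(RECORD)

def post(board, seq, value):
    """Writer side: publish (seq, value) to the shared board.
    Readers poll the sequence counter to detect fresh postings."""
    board.seek(0)
    board.write(struct.pack(RECORD, seq, value))

def read(board):
    """Reader side: fetch the latest posting from the board."""
    board.seek(0)
    return struct.unpack(RECORD, board.read(BOARD_BYTES))

# An anonymous mapping stands in for a file mapped by several processes.
board = mmap.mmap(-1, BOARD_BYTES)
post(board, 1, 42.0)
seq, value = read(board)
```

With a file-backed mapping instead of the anonymous one, independent reducer processes would see each posting without any copying, which is the appeal of the bulletin-board design for a real-time pipeline.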
Weber, Daniel N.; Hoffmann, Raymond G.; Hoke, Elizabeth S.; Tanguay, Robert L.
2014-01-01
Developmental bisphenol A (BPA) exposure is associated with adverse behavioral effects, although underlying modes of action remain unclear. Because BPA is a suspected xenoestrogen, the objective was to identify sex-based changes in the social behavior of adult zebrafish developmentally exposed to BPA (0.0, 0.1 or 1 μM) or one of two control compounds (0.1 μM 17β-estradiol [E2], and 0.1 μM GSK4716, a synthetic estrogen-related receptor γ ligand). A test chamber was divided lengthwise so each arena held one fish unable to detect the presence of the other fish. A mirror was inserted at one end of each arena; baseline activity levels were determined without the mirror. Arenas were divided into 3 computer-generated zones representing different distances from the mirror image. Circadian rhythm patterns were evaluated at 1–3 (= AM) and 5–8 (= PM) hr postprandial. Adult zebrafish were placed into arenas and monitored by digital camera for 5 min. Total distance traveled, % time spent at the mirror image, and number of attacks on the mirror image were quantified. E2, GSK4716, and all BPA treatments dampened male activity and altered male circadian activity patterns; there was no marked effect on female activity. BPA induced non-monotonic effects (the response curve changes direction within the range of concentrations examined) on male % time at the mirror only in AM. All treatments produced increased % time at the mirror during PM. Male attacks on the mirror were reduced by BPA exposure only during AM. There were sex-specific effects of developmental BPA on social interactions, and time-of-day of observation affected results. PMID:25424546
Recent technology and usage of plastic lenses in image taking objectives
NASA Astrophysics Data System (ADS)
Yamaguchi, Susumu; Sato, Hiroshi; Mori, Nobuyoshi; Kiriki, Toshihiko
2005-09-01
Recently, plastic lenses produced by injection molding have become widely used in image-taking objectives for digital cameras, camcorders, and mobile phone cameras, because of their suitability for volume production and the ease of obtaining the advantages of aspherical surfaces. For digital camera and camcorder objectives, it is desirable that there be no image point variation with temperature change in spite of employing several plastic lenses. At the same time, due to the shrinking pixel size of solid-state image sensors, there is now a requirement to assemble lenses with high accuracy. In order to satisfy these requirements, we have developed a 16x compact zoom objective for camcorders and 3x-class folded zoom objectives for digital cameras, incorporating a cemented plastic doublet consisting of a positive lens and a negative lens. Over the last few years, production volumes of camera-equipped mobile phones have increased substantially. Therefore, for mobile phone cameras, the consideration of productivity is more important than ever. For this application, we have developed a 1.3-megapixel compact camera module with macro function, utilizing the advantage of a plastic lens whose outer flange part can be given a mechanically functional shape. Its objective consists of three plastic lenses, and all critical dimensions related to optical performance are determined by high-precision optical elements. Therefore this camera module is manufactured without optical adjustment on an automatic assembly line, and achieves both high productivity and high performance. Reported here are the constructions and the technical topics of the image-taking objectives described above.
Astronaut Ronald Evans photographed during transearth coast EVA
1972-12-17
AS17-152-23393 (17 Dec. 1972) --- Astronaut Ronald E. Evans is photographed performing extravehicular activity during the Apollo 17 spacecraft's trans-Earth coast. During his EVA, command module pilot Evans retrieved film cassettes from the Lunar Sounder, Mapping Camera, and Panoramic Camera. The cylindrical object at Evans' left side is the Mapping Camera cassette. The total time for the trans-Earth EVA was one hour seven minutes 18 seconds, starting at ground elapsed time of 257:25 (2:28 p.m.) and ending at ground elapsed time of 258:42 (3:35 p.m.) on Sunday, Dec. 17, 1972.
Thermal Cameras in School Laboratory Activities
ERIC Educational Resources Information Center
Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.
2015-01-01
Thermal cameras offer real-time visual access to otherwise invisible thermal phenomena, which are conceptually demanding for learners during traditional teaching. We present three studies of students' conduction of laboratory activities that employ thermal cameras to teach challenging thermal concepts in grades 4, 7 and 10-12. Visualization of…
Evaluation of camera-based systems to reduce transit bus side collisions : phase II.
DOT National Transportation Integrated Search
2012-12-01
The sideview camera system has been shown to eliminate blind zones by providing a view to the driver in real time. In : order to provide the best integration of these systems, an integrated camera-mirror system (hybrid system) was : developed and tes...
2004-06-17
This image shows the comet Wild 2, which NASA's Stardust spacecraft flew by on Jan. 2, 2004. This image is the closest short exposure of the comet, taken at an 11.4-degree phase angle, the angle between the camera, comet and the Sun. http://photojournal.jpl.nasa.gov/catalog/PIA06285
Optical Transient Monitor (OTM) for BOOTES Project
NASA Astrophysics Data System (ADS)
Páta, P.; Bernas, M.; Castro-Tirado, A. J.; Hudec, R.
2003-04-01
The Optical Transient Monitor (OTM) is software for controlling the three wide- and ultra-wide-field cameras of the BOOTES (Burst Observer and Optical Transient Exploring System) station. The OTM is PC-based and is a powerful tool for taking images from two SBIG CCD cameras at the same time or from one camera only. The control program for the BOOTES cameras is Windows 98 or MS-DOS based; a version for Windows 2000 is now in preparation. There are five main supported modes of operation. The OTM program can control the cameras and evaluate image data without human interaction.
An image compression algorithm for a high-resolution digital still camera
NASA Technical Reports Server (NTRS)
Nerheim, Rosalee
1989-01-01
The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach the quality of 35mm film. The camera, which will have the same general shape and handling as a 35mm camera, will be able to send images to Earth in near real-time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, the image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.
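The lossless/lossy trade-off being compared can be demonstrated with a toy round-trip. This is not the ESC study's algorithm: zlib stands in for a generic lossless coder, and coarse quantization stands in for a lossy front end; the pseudo-random "image" data is an assumption.

```python
import zlib

def lcg(n, seed=1):
    """Deterministic pseudo-random 8-bit values, standing in for
    noisy (hence poorly compressible) image data."""
    vals, x = [], seed
    for _ in range(n):
        x = (1103515245 * x + 12345) % 2**31
        vals.append((x >> 16) & 0xFF)
    return vals

def lossless_roundtrip(pixels):
    """Exact round-trip: every pixel is recovered bit-for-bit."""
    packed = zlib.compress(bytes(pixels), 9)
    return list(zlib.decompress(packed)), len(packed)

def lossy_roundtrip(pixels, step=16):
    """Toy lossy scheme: coarsely quantize, then pack losslessly.
    Reconstruction error is bounded by step/2 per pixel."""
    packed = zlib.compress(bytes(p // step for p in pixels), 9)
    restored = [q * step + step // 2 for q in zlib.decompress(packed)]
    return restored, len(packed)

pixels = lcg(4096)
exact, n_lossless = lossless_roundtrip(pixels)
approx, n_lossy = lossy_roundtrip(pixels)
```

On noise-like data the lossless coder saves almost nothing, while quantizing to 16 levels roughly halves the entropy per pixel at the cost of a bounded reconstruction error, which is the basic bargain the abstract's comparison weighs.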
Evaluating video digitizer errors
NASA Astrophysics Data System (ADS)
Peterson, C.
2016-01-01
Analog output video cameras remain popular for recording meteor data. Although these cameras uniformly employ electronic detectors with fixed pixel arrays, the digitization process requires resampling the horizontal lines as they are output in order to reconstruct the pixel data, usually resulting in a new data array of different horizontal dimensions than the native sensor. Pixel timing is not provided by the camera, and must be reconstructed based on line sync information embedded in the analog video signal. Using a technique based on hot pixels, I present evidence that jitter, sync detection, and other timing errors introduce both position and intensity errors which are not present in cameras which internally digitize their sensors and output the digital data directly.
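The hot-pixel technique can be sketched as follows. This is a minimal illustration of the idea, not the author's measurement code: the threshold, the toy digitized lines, and the jitter magnitudes are assumptions.

```python
def hot_pixel_column(line, threshold=200):
    """Column index of the first above-threshold sample in one
    digitized video line, or None if absent. A hot pixel is fixed
    on the sensor, so its true position never moves."""
    for x, v in enumerate(line):
        if v >= threshold:
            return x
    return None

def jitter_stats(lines):
    """Spread of the hot pixel's apparent column across frames.
    Any nonzero spread must come from line-sync reconstruction and
    resampling errors in the digitizer, not from the camera."""
    cols = [hot_pixel_column(line) for line in lines]
    return min(cols), max(cols), max(cols) - min(cols)

# Toy digitized lines: the hot pixel nominally lands at column 5
# but is displaced by +/-1 sample in some frames by sync jitter.
frames = [[0] * 5 + [255] + [0] * 4,
          [0] * 6 + [255] + [0] * 3,
          [0] * 4 + [255] + [0] * 5]
lo, hi, spread = jitter_stats(frames)
```

Because the sensor-side position is known to be constant, the measured spread directly quantifies the digitizer's timing error, which is the core of the argument in the abstract.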
Variable-Interval Sequenced-Action Camera (VINSAC). Dissemination Document No. 1.
ERIC Educational Resources Information Center
Ward, Ted
The 16 millimeter (mm) Variable-Interval Sequenced-Action Camera (VINSAC) is designed for inexpensive photographic recording of effective teacher instruction and use of instructional materials for teacher education and research purposes. The camera photographs single frames at preselected time intervals (.5 second to 20 seconds) which are…
Overt vs. covert speed cameras in combination with delayed vs. immediate feedback to the offender.
Marciano, Hadas; Setter, Pe'erly; Norman, Joel
2015-06-01
Speeding is a major problem in road safety because it increases both the probability of accidents and the severity of injuries if an accident occurs. Speed cameras are one of the most common speed enforcement tools. Most of the speed cameras around the world are overt, but there is evidence that this can cause a "kangaroo effect" in driving patterns. One suggested alternative to prevent this kangaroo effect is the use of covert cameras. Another issue relevant to the effect of enforcement countermeasures on speeding is the timing of the fine. There is general agreement on the importance of the immediacy of the punishment; however, in the context of speed limit enforcement, implementing such immediate punishment is difficult. An immediate feedback that mediates the delay between the speed violation and getting a ticket is one possible solution. This study examines combinations of concealment and the timing of the fine in operating speed cameras in order to evaluate the most effective one in terms of enforcing speed limits. Using a driving simulator, the driving performance of the following four experimental groups was tested: (1) overt cameras with delayed feedback, (2) overt cameras with immediate feedback, (3) covert cameras with delayed feedback, and (4) covert cameras with immediate feedback. Each of the 58 participants drove in the same scenario on three different days. The results showed that both median speed and speed variance were higher with overt than with covert cameras. Moreover, implementing a covert camera system along with immediate feedback was more conducive to drivers maintaining steady speeds at the permitted levels from the very beginning. Finally, both 'overt cameras' groups exhibited a kangaroo effect throughout the entire experiment. 
It can be concluded that an implementation strategy consisting of covert speed cameras combined with immediate feedback to the offender is potentially an optimal way to motivate drivers to maintain speeds at the speed limit. Copyright © 2015 Elsevier Ltd. All rights reserved.
A novel simultaneous streak and framing camera without principle errors
NASA Astrophysics Data System (ADS)
Jingzhen, L.; Fengshan, S.; Ningwen, L.; Xiangdong, G.; Bin, H.; Qingyang, W.; Hongyi, C.; Yi, C.; Xiaowei, L.
2018-02-01
A novel simultaneous streak and framing camera with continuous access has been developed; the complete information it records is important for the exact interpretation and precise evaluation of many detonation events and shockwave phenomena. The camera, with a maximum imaging frequency of 2 × 10^6 fps and a maximum scanning velocity of 16.3 mm/μs, has fine imaging properties: an eigen resolution of over 40 lp/mm in the temporal direction and over 60 lp/mm in the spatial direction with zero framing-frequency principle error for framing records, and a maximum time resolving power of 8 ns with a scanning velocity nonuniformity of 0.136%-0.277% for streak records. Test data have verified the performance of the camera quantitatively. The camera, which simultaneously obtains frames and streaks that are parallax-free and share an identical time base, is characterized by a plane optical system at oblique incidence (as distinct from a space system), an innovative camera obscura without principle errors, and a high-velocity motor-driven beryllium-like rotating mirror made of high-strength aluminum alloy with a cellular lateral structure. Experiments demonstrate that the camera is very useful and reliable for taking high-quality pictures of detonation events.
Mexico, Arizona, Gulf of California as seen from Apollo 6 unmanned spacecraft
1968-04-04
AS06-02-1436 (4 April 1968) --- View of the mouth of the Colorado River and the Gulf of California in northwestern Mexico as photographed from the unmanned Apollo 6 (Spacecraft 020/Saturn 502) space mission. Altitude of the spacecraft at the time picture was taken was 120 nautical miles. NORTH IS TOWARD LEFT SIDE OF PICTURE. At bottom edge of photograph is Baja California. In the upper left corner is the Mexican state of Sonora showing the Sonoran Desert and the Pinacate Mountains. This photograph was made three hours and seven minutes after liftoff using Eastman Kodak SO-121 high resolution aerial Ektachrome film (exposure setting was f/5.6 at 1/500 second) in a J.A. Maurer model 2200 camera.
VizieR Online Data Catalog: Radial velocities of galaxies in A523 field (Girardi+, 2016)
NASA Astrophysics Data System (ADS)
Girardi, M.; Boschin, W.; Gastaldello, F.; Giovannini, G.; Govoni, F.; Murgia, M.; Barrena, R.; Ettori, S.; Trasatti, M.; Vacca, V.
2016-09-01
Multi-object spectroscopic observations of A523 were carried out at the TNG in 2012 December and 2014 January. We used the instrument DOLORES in MOS mode with the LR-B Grism. In summary, we observed six MOS masks for a total of 210 slits. The total exposure time was 3600s for three masks, 5400s for two masks and 7200s for the last one. Our photometric observations were carried out with the Wide Field Camera (WFC), mounted at the prime focus of the 2.5-m INT telescope. We observed A523 in g, r and i Sloan-Gunn filters in photometric conditions and a seeing of ~1.4arcsec. (1 data file).
NASA Astrophysics Data System (ADS)
Xu, S. J.; Zhang, Y. H.; Yu, Z.; Yao, J.; Zhang, Z. T.
2013-03-01
The streamer regime of pin-to-plane dielectric barrier discharge in air was studied by means of fast photography, electrical measurements and photoelectric measurements. Fast photographs of the positive streamer were obtained with a CCD camera fitted with a micro lens; the exposure time was one microsecond. The images illustrate that the streamer is non-axisymmetric because of random factors such as surface charge position, space charge distribution, gas flow and so on. In fact, the streamer propagates along a bent discharge channel, and the degree of bending increases as the electric field strengthens. By surveying a large number of images, the diameter of the streamer, the height of the surface charge effect and the extent of the surface charge were estimated and used to describe the shape of the streamer.
3D fluorescence anisotropy imaging using selective plane illumination microscopy.
Hedde, Per Niklas; Ranjit, Suman; Gratton, Enrico
2015-08-24
Fluorescence anisotropy imaging is a popular method to visualize changes in organization and conformation of biomolecules within cells and tissues. In such an experiment, depolarization effects resulting from differences in orientation, proximity and rotational mobility of fluorescently labeled molecules are probed with high spatial resolution. Fluorescence anisotropy is typically imaged using laser scanning and epifluorescence-based approaches. Unfortunately, those techniques are limited in either axial resolution, image acquisition speed, or by photobleaching. In the last decade, however, selective plane illumination microscopy has emerged as the preferred choice for three-dimensional time lapse imaging combining axial sectioning capability with fast, camera-based image acquisition, and minimal light exposure. We demonstrate how selective plane illumination microscopy can be utilized for three-dimensional fluorescence anisotropy imaging of live cells. We further examined the formation of focal adhesions by three-dimensional time lapse anisotropy imaging of CHO-K1 cells expressing an EGFP-paxillin fusion protein.
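The per-pixel quantity imaged in such an experiment is the standard steady-state fluorescence anisotropy, r = (I_par - G·I_perp)/(I_par + 2G·I_perp), where G corrects for polarization bias in the detection path. The sketch below is a generic illustration of that textbook formula, not the authors' SPIM analysis code; the toy intensity values are assumptions.

```python
def anisotropy(i_par, i_perp, g=1.0):
    """Steady-state fluorescence anisotropy
    r = (I_par - G*I_perp) / (I_par + 2*G*I_perp).
    G is the instrument correction factor for the detection path."""
    return (i_par - g * i_perp) / (i_par + 2.0 * g * i_perp)

def anisotropy_image(img_par, img_perp, g=1.0):
    """Per-pixel anisotropy map from the two polarization-resolved
    images (equal-sized 2D lists of intensities)."""
    return [[anisotropy(p, q, g) for p, q in zip(row_p, row_q)]
            for row_p, row_q in zip(img_par, img_perp)]

# Fully polarized emission (I_perp = 0) gives r = 1;
# fully depolarized emission (I_par = I_perp) gives r = 0.
r_map = anisotropy_image([[100.0, 100.0]], [[0.0, 100.0]])
```

Depolarization from rotational mobility or energy transfer between labeled molecules lowers r toward zero, which is why the map reports on molecular organization.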
Aspirin decreases platelet uptake on Dacron vascular grafts in baboons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackey, W.C.; Connolly, R.J.; Callow, A.D.
The influence of a single dose of aspirin (5.4-7.4 mg/kg) on platelet uptake on 4-mm Dacron interposition grafts was studied in a baboon model using gamma camera scanning for 111-Indium labeled platelets. In vitro assessment of platelet function after aspirin administration revealed that in the baboon, as in the human, aspirin abolished arachidonic acid-induced platelet aggregation, prolonged the lag time between exposure to collagen and aggregation, and decreased plasma thromboxane B2 levels. Aspirin also prolonged the template bleeding time. Scans for 111-Indium labeled platelets revealed that pretreatment with a single dose of aspirin decreased platelet uptake on 4-mm Dacron carotid interposition grafts. This decrease in platelet uptake was associated with a significant improvement in 2-hour graft patency and with a trend toward improved 2-week patency.
A computational approach to real-time image processing for serial time-encoded amplified microscopy
NASA Astrophysics Data System (ADS)
Oikawa, Minoru; Hiyama, Daisuke; Hirayama, Ryuji; Hasegawa, Satoki; Endo, Yutaka; Sugie, Takahisa; Tsumura, Norimichi; Kuroshima, Mai; Maki, Masanori; Okada, Genki; Lei, Cheng; Ozeki, Yasuyuki; Goda, Keisuke; Shimobaba, Tomoyoshi
2016-03-01
High-speed imaging is an indispensable technique, particularly for identifying or analyzing fast-moving objects. The serial time-encoded amplified microscopy (STEAM) technique was proposed to enable us to capture images with a frame rate 1,000 times faster than conventional methods such as CCD (charge-coupled device) cameras. The application of this high-speed STEAM imaging technique to a real-time system, such as flow cytometry for a cell-sorting system, requires successively processing a large number of captured images with high throughput in real time. We are now developing a high-speed flow cytometer system including a STEAM camera. In this paper, we describe our approach to processing these large amounts of image data in real time. We use an analog-to-digital converter with up to 7.0 Gsamples/s and 8-bit resolution for capturing the output voltage signal that encodes grayscale images from the STEAM camera. The direct data output from the STEAM camera therefore generates 7.0 Gbyte/s continuously. We employed a field-programmable gate array (FPGA) device as a digital signal pre-processor for image reconstruction and for finding objects in a microfluidic channel at high data rates in real time. We also utilized graphics processing unit (GPU) devices for accelerating the identification of the reconstructed images. We built our prototype system, which includes a STEAM camera, an FPGA device and a GPU device, and evaluated its performance in real-time identification of small particles (beads), serving as virtual biological cells, flowing through a microfluidic channel.
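The throughput figure in the abstract follows directly from the ADC parameters, and sizing any buffer in the FPGA/GPU pipeline is the same arithmetic. A minimal sketch (the one-millisecond buffering window is an assumption for illustration):

```python
def adc_rate_bytes_per_s(samples_per_s, bits_per_sample):
    """Raw ADC output rate in bytes per second."""
    return samples_per_s * bits_per_sample // 8

def buffer_bytes(rate_bytes_per_s, window_s):
    """Memory needed to hold `window_s` seconds of the stream."""
    return int(rate_bytes_per_s * window_s)

# 7.0 Gsamples/s at 8-bit resolution -> 7.0 Gbyte/s, as stated.
rate = adc_rate_bytes_per_s(7_000_000_000, 8)
per_ms = buffer_bytes(rate, 1e-3)   # stream volume per millisecond
```

At this rate every millisecond of latency costs 7 Mbyte of buffering, which is why the pre-processing must run in dedicated FPGA hardware rather than on a general-purpose host.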
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.
2015-10-01
Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we present the two main components of automatic calibration. The first is the intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length and camera height; this is important for the conversion from pixels to meters and vice versa. The second is the inter-camera topology inference, which leads to an estimate of the distance between cameras; this is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
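Once tilt angle, focal length and camera height are known, the pixel-to-meter conversion for points on the ground plane is simple trigonometry. The sketch below is a generic flat-ground pinhole model, not the paper's estimator; the camera height, tilt, focal length and image row values are assumptions.

```python
import math

def ground_distance(y, cam_height, tilt_rad, focal_px, cy):
    """Distance along a flat ground plane to the point imaged at
    image row y, for a pinhole camera at height `cam_height`
    tilted `tilt_rad` below the horizontal. `focal_px` is the
    focal length in pixels and `cy` the principal-point row."""
    angle_below_horizon = tilt_rad + math.atan((y - cy) / focal_px)
    return cam_height / math.tan(angle_below_horizon)

# A camera 4 m up, tilted 30 degrees down: the principal ray
# (y == cy) meets the ground at 4 / tan(30 deg) ~ 6.93 m.
d_center = ground_distance(240, cam_height=4.0,
                           tilt_rad=math.radians(30),
                           focal_px=800.0, cy=240)
d_lower = ground_distance(400, cam_height=4.0,
                          tilt_rad=math.radians(30),
                          focal_px=800.0, cy=240)
```

Rows lower in the image map to nearer ground points, so a pedestrian's foot position in pixels converts directly to meters from the camera, which is the quantity the intra-camera calibration makes available.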
NASA Astrophysics Data System (ADS)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.; Kalantar, Daniel H.
2015-02-01
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high energy density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for the use in a production environment.
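The thin-plate spline machinery underlying such a warp correction can be sketched generically. This is not the NIF production code: it is a standard scalar TPS fit/evaluate pair (in practice one spline per coordinate maps distorted comb positions to their true positions), and the toy fiducial points and values are assumptions.

```python
import numpy as np

def _u(r):
    """TPS radial basis U(r) = r^2 log r, with U(0) = 0."""
    out = np.zeros_like(r)
    mask = r > 0
    out[mask] = r[mask] ** 2 * np.log(r[mask])
    return out

def tps_fit(points, values):
    """Fit a 2D thin-plate spline f with f(points[i]) = values[i]
    by solving the standard (n+3)x(n+3) linear system."""
    pts = np.asarray(points, float)
    v = np.asarray(values, float)
    n = len(pts)
    K = _u(np.linalg.norm(pts[:, None] - pts[None, :], axis=-1))
    P = np.hstack([np.ones((n, 1)), pts])        # affine part: 1, x, y
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    sol = np.linalg.solve(A, np.concatenate([v, np.zeros(3)]))
    return sol[:n], sol[n:], pts                 # weights, affine, centers

def tps_eval(model, x):
    """Evaluate the fitted spline at one 2D point x."""
    w, a, pts = model
    u = _u(np.linalg.norm(pts - np.asarray(x, float), axis=-1))
    return a[0] + a[1] * x[0] + a[2] * x[1] + w @ u

# Toy "comb" fiducials whose values happen to be affine in (x, y);
# a TPS reproduces affine data exactly (all bending weights ~ 0).
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.2)]
vals = [1.0 + 2.0 * x + 3.0 * y for x, y in pts]
model = tps_fit(pts, vals)
```

Because the TPS minimizes bending energy, it interpolates the comb fiducials exactly while varying smoothly between them, which is what makes it suitable for modeling the streak camera's slowly drifting nonlinear distortions.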
A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.
Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi
2016-08-30
This paper proposes a novel infrared camera array guidance system with the capability to track and provide the real-time position and speed of a fixed-wing unmanned air vehicle (UAV) during the landing process. The system mainly includes three novel parts: (1) an infrared camera array and near-infrared laser lamp based cooperative long-range optical imaging module; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with fixed-wing flight demonstrate that our infrared camera array system has the unique ability to guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for UAV automatic accurate landing in Global Position System (GPS)-denied environments.
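The geometric core of a ground-based camera-array tracker is triangulation of the laser marker from bearings measured at calibrated stations. The planar sketch below illustrates the principle only; the paper's 3D multi-camera formulation, and the baseline and target values used here, are assumptions.

```python
import math

def triangulate(base, alpha, beta):
    """Planar triangulation: station A at the origin and station B
    at (base, 0) each measure the bearing to the marker (alpha at A
    from AB toward the target, beta at B from BA); intersecting the
    two rays gives the marker position (x, y)."""
    ta, tb = math.tan(alpha), math.tan(beta)
    x = base * tb / (ta + tb)
    return x, x * ta

# Marker at (600, 400) m seen from two stations 1000 m apart.
x, y = triangulate(1000.0, math.atan2(400, 600), math.atan2(400, 400))
```

Repeating this intersection for each frame yields the marker track, and differencing successive positions gives the UAV's speed, the two quantities the guidance loop consumes.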
Into the blue: AO science with MagAO in the visible
NASA Astrophysics Data System (ADS)
Close, Laird M.; Males, Jared R.; Follette, Katherine B.; Hinz, Phil; Morzinski, Katie; Wu, Ya-Lin; Kopon, Derek; Riccardi, Armando; Esposito, Simone; Puglisi, Alfio; Pinna, Enrico; Xompero, Marco; Briguglio, Runa; Quiros-Pacheco, Fernando
2014-08-01
We review astronomical results in the visible (λ < 1 μm) with adaptive optics. Other than a brief period in the early 1990s, there has been little astronomical science done in the visible with AO until recently. The most productive visible AO system to date is our 6.5 m Magellan telescope AO system (MagAO). MagAO is an advanced adaptive secondary system at the Magellan 6.5 m telescope in Chile. This secondary has 585 actuators with < 1 ms response times (0.7 ms typically). We use a pyramid wavefront sensor. The relatively small actuator pitch (~23 cm/subaperture) allows moderate Strehls to be obtained in the visible (0.63-1.05 microns). We use a CCD AO science camera called "VisAO". On-sky long exposures (60 s) achieve < 30 mas resolutions and 30% Strehls at 0.62 microns (r') with the VisAO camera in 0.5" seeing on bright R < 8 mag stars. These relatively high visible-wavelength Strehls are made possible by our powerful combination of a next-generation ASM and a pyramid WFS with 378 controlled modes and a 1000 Hz loop frequency. We review the key steps to good performance in the visible, as well as the exciting new visible AO science opportunities and refereed publications in both broad-band (r, i, z, Y) and Hα imaging of exoplanets, protoplanetary disks, young stars, and emission-line jets. These examples highlight the power of visible AO to probe circumstellar regions and spatial resolutions that would otherwise require much larger diameter telescopes with classical infrared AO cameras.
Solar System Portrait - Views of 6 Planets
1996-09-13
These six narrow-angle color images were made from the first-ever portrait of the solar system, taken by NASA's Voyager 1 when it was more than 4 billion miles from Earth and about 32 degrees above the ecliptic. The spacecraft acquired a total of 60 frames for a mosaic of the solar system showing six of the planets. Mercury is too close to the sun to be seen. Mars was not detectable by the Voyager cameras due to scattered sunlight in the optics, and Pluto was not included in the mosaic because of its small size and distance from the sun. These blown-up images, left to right and top to bottom, are Venus, Earth, Jupiter, Saturn, Uranus and Neptune. The background features in the images are artifacts resulting from the magnification. The images were taken through three color filters -- violet, blue and green -- and recombined to produce the color images. Jupiter and Saturn were resolved by the camera, but Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposure times. Earth appears to be in a band of light because it coincidentally lies right in the center of the scattered light rays resulting from taking the image so close to the sun. Earth was a crescent only 0.12 pixels in size, and Venus was 0.11 pixels in diameter. The planetary images were taken with the narrow-angle camera (1500 mm focal length). http://photojournal.jpl.nasa.gov/catalog/PIA00453
Image acquisition system for traffic monitoring applications
NASA Astrophysics Data System (ADS)
Auty, Glen; Corke, Peter I.; Dunn, Paul; Jensen, Murray; Macintyre, Ian B.; Mills, Dennis C.; Nguyen, Hao; Simons, Ben
1995-03-01
An imaging system for monitoring traffic on multilane highways is discussed. The system, named Safe-T-Cam, is capable of operating 24 hours per day in all but extreme weather conditions and can capture still images of vehicles traveling at up to 160 km/h. Systems operating at different remote locations are networked to allow transmission of images and data to a control center. A remote site facility comprises a vehicle detection and classification module (VCDM), an image acquisition module (IAM) and a license plate recognition module (LPRM). The remote site is connected to the central site by an ISDN communications network. The remote site system is discussed in this paper. The VCDM consists of a video camera, a specialized exposure control unit to maintain consistent image characteristics, and a 'real-time' image processing system that processes 50 images per second. The VCDM can detect and classify vehicles (e.g. distinguish cars from trucks). The vehicle class is used to determine what data should be recorded. The VCDM uses a vehicle tracking technique to allow optimum triggering of the high-resolution camera of the IAM. The IAM camera combines the features necessary to operate consistently in the harsh environment encountered when imaging a vehicle 'head-on' in both day and night conditions. The image clarity obtained is ideally suited for automatic location and recognition of the vehicle license plate. This paper discusses the camera geometry, sensor characteristics and the image processing methods which permit consistent vehicle segmentation from a cluttered background, allowing object-oriented pattern recognition to be used for vehicle classification. The capture of high-resolution images and the image characteristics required for the LPRM's automatic reading of vehicle license plates are also discussed.
The results of the field tests presented demonstrate that the vision-based Safe-T-Cam system, currently installed on open highways, is capable of automatic vehicle classification and recording of vehicle number plates with a success rate of around 90 percent over a 24-hour period.
Baker, Stokes S.; Vidican, Cleo B.; Cameron, David S.; Greib, Haittam G.; Jarocki, Christine C.; Setaputri, Andres W.; Spicuzza, Christopher H.; Burr, Aaron A.; Waqas, Meriam A.; Tolbert, Danzell A.
2012-01-01
Background and aims Studies have shown that levels of green fluorescent protein (GFP) leaf surface fluorescence are directly proportional to GFP soluble protein concentration in transgenic plants. However, instruments that measure GFP surface fluorescence are expensive. The goal of this investigation was to develop techniques with consumer digital cameras to analyse GFP surface fluorescence in transgenic plants. Methodology Inexpensive filter cubes containing machine vision dichroic filters and illuminated with blue light-emitting diodes (LED) were designed to attach to digital single-lens reflex (SLR) camera macro lenses. The apparatus was tested on purified enhanced GFP, and on wild-type and GFP-expressing arabidopsis grown autotrophically and heterotrophically. Principal findings Spectrum analysis showed that the apparatus illuminates specimens with wavelengths between ∼450 and ∼500 nm, and detects fluorescence between ∼510 and ∼595 nm. Epifluorescent photographs taken with SLR digital cameras were able to detect red-shifted GFP fluorescence in Arabidopsis thaliana leaves and cotyledons of pot-grown plants, as well as roots, hypocotyls and cotyledons of etiolated and light-grown plants grown heterotrophically. Green fluorescent protein fluorescence was detected primarily in the green channel of the raw image files. Studies with purified GFP produced linear responses to both protein surface density and exposure time (H0: β (slope) = 0 mean counts per pixel (ng s mm⁻²)⁻¹, r² > 0.994, n = 31, P < 1.75 × 10⁻²⁹). Conclusions Epifluorescent digital photographs taken with complementary metal-oxide-semiconductor and charge-coupled device SLR cameras can be used to analyse red-shifted GFP surface fluorescence using visible blue light. This detection device can be constructed with inexpensive commercially available materials, thus increasing the accessibility of whole-organism GFP expression analysis to research laboratories and teaching institutions with small budgets. 
PMID:22479674
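The linear response reported above (mean counts proportional to surface density times exposure time) amounts to an ordinary least-squares fit with near-unity r². A minimal sketch with synthetic calibration data; the slope, noise level and dose range are assumptions for illustration, not the paper's measurements:

```python
import numpy as np

# Synthetic calibration: predictor is GFP surface density x exposure time
# (ng s mm^-2, hypothetical); response is mean green-channel counts per pixel.
rng = np.random.default_rng(1)
dose = np.linspace(1, 30, 31)
counts = 4.2 * dose + rng.normal(0.0, 0.3, dose.size)  # assumed slope of 4.2

# Ordinary least-squares fit and coefficient of determination.
slope, intercept = np.polyfit(dose, counts, 1)
pred = slope * dose + intercept
r2 = 1.0 - np.sum((counts - pred) ** 2) / np.sum((counts - counts.mean()) ** 2)
```

With small measurement noise the fit recovers the assumed slope and yields r² close to 1, mirroring the kind of linearity test quoted in the abstract.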
NASA Astrophysics Data System (ADS)
Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.
2015-08-01
Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow-band filters. Due to its small size and light weight, it is suitable for mounting on an Unmanned Aerial System (UAS) to acquire imagery of high spectral, spatial and temporal resolution for various remote sensing applications. However, because each band's wavelength range is only 10 nm, the images have low resolution and signal-to-noise ratio, which makes them unsuitable for image matching and digital surface model (DSM) generation. Moreover, since the spectral correlation among the 12 bands of MiniMCA images is low, it is difficult to perform tie-point matching and aerial triangulation across all bands at the same time. In this study, we therefore propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher-spatial-resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS, the two sensors can be flown together or individually. In this study, we adopt a fixed-wing UAS carrying a Canon EOS 5D Mark2 DSLR camera and a MiniMCA-12 multispectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from the MiniMCA-12 whose spectral range overlaps that of the DSLR camera. However, since the lenses of the MiniMCA-12 have different perspective centers and viewing angles, the original 12 channels exhibit a significant band-misregistration effect. The first issue encountered is therefore to reduce this misregistration. Because all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm and the images overlap by almost 98%; we thus propose a modified projective transformation (MPT) method, together with two systematic error correction procedures, to register all 12 bands of imagery in the same image space. 
As a result, the 12 bands of images acquired at the same exposure share the same interior orientation parameters (IOPs) and exterior orientation parameters (EOPs) after band-to-band registration (BBR). In the aerial triangulation stage, the master band of the MiniMCA-12 is treated as a reference channel to link with the DSLR RGB images: all reference images from the master band and all RGB images are triangulated at the same time in the same ground control point (GCP) coordinate system. Because the spatial resolution of the RGB images is higher than that of the MiniMCA-12, GCPs can be marked on the RGB images alone, even when they cannot be recognized on the MiniMCA images. Furthermore, a one-meter gridded digital surface model (DSM) is created from the RGB images and applied to the MiniMCA imagery for ortho-rectification. Quantitative error analyses show that the proposed BBR scheme achieves an average misregistration residual length of 0.33 pixels, and that the co-registration errors among the 12 MiniMCA ortho-images, and between MiniMCA and Canon RGB ortho-images, are all less than 0.6 pixels. The experimental results demonstrate that the proposed method is robust, reliable and accurate for future remote sensing applications.
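The registration step above is built on a projective transformation between each slave band and the master band. A minimal NumPy sketch of estimating and applying such a homography from tie points via the standard DLT least-squares system; the tie-point coordinates and distortion matrix are illustrative assumptions, and the paper's modified projective transformation adds systematic-error corrections not shown here:

```python
import numpy as np

def fit_projective(src, dst):
    """Estimate a 3x3 projective matrix H with dst ~ H @ src (DLT):
    each correspondence contributes two rows to a homogeneous system
    whose least-squares solution is the smallest right singular vector."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)

def apply_projective(H, pts):
    """Apply H to Nx2 points with homogeneous normalization."""
    p = np.c_[pts, np.ones(len(pts))] @ H.T
    return p[:, :2] / p[:, 2:3]

# Hypothetical tie points between a slave band and the master band.
src = np.array([[0, 0], [640, 0], [0, 480], [640, 480], [320, 240]], float)
true_H = np.array([[1.01, 0.002, 3.0],
                   [-0.001, 0.99, -2.0],
                   [1e-6, 0.0, 1.0]])      # assumed band-to-band distortion
dst = apply_projective(true_H, src)

H = fit_projective(src, dst)
resid = np.abs(apply_projective(H, src) - dst).max()
```

With noise-free correspondences the recovered transform reproduces the tie points to numerical precision; real tie points would leave small residuals of the kind the paper quantifies in pixels.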
Kinect2 - respiratory movement detection study.
Rihana, Sandy; Younes, Elie; Visvikis, Dimitris; Fayad, Hadi
2016-08-01
Radiotherapy is one of the main cancer treatments. It consists of irradiating tumor cells to destroy them while sparing healthy tissue. The treatment is planned on Computed Tomography (CT) images and delivered in fractions over several days. One of the main challenges is repositioning the patient identically every day so that the tumor volume is irradiated while healthy tissues are spared. Many patient positioning techniques are available, but they are either invasive or inaccurate: positioning is typically performed using tattooed markers on the patient's skin aligned with a laser system calibrated in the treatment room, or using X-ray imaging. Current systems such as Vision RT use two time-of-flight cameras. Time-of-flight cameras have the advantage of a very fast acquisition rate, which allows real-time monitoring of patient movement and repositioning. The purpose of this work is to test the Microsoft Kinect2 camera for potential use in patient positioning and respiration triggering. This type of time-of-flight camera is non-invasive and inexpensive, which facilitates its transfer to clinical practice.
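A respiratory trace of the kind needed for respiration triggering can be derived from a time-of-flight depth stream by averaging depth over a chest region of interest in each frame. A sketch with simulated Kinect2-like frames; the frame size, frame rate, nominal distance and breathing signal are all assumptions for illustration:

```python
import numpy as np

def respiratory_trace(frames, roi):
    """Mean depth over a chest ROI, one sample per depth frame."""
    r0, r1, c0, c1 = roi
    return np.array([f[r0:r1, c0:c1].mean() for f in frames])

# Simulated stream: 30 fps for 10 s, chest at ~1500 mm, 5 mm breathing
# excursion at 0.2 Hz (12 breaths per minute); 64x64 downsampled frames.
fps, dur = 30, 10
t = np.arange(fps * dur) / fps
breathing = 5.0 * np.sin(2 * np.pi * 0.2 * t)
frames = [np.full((64, 64), 1500.0 + b) for b in breathing]

trace = respiratory_trace(frames, (16, 48, 16, 48))
spec = np.abs(np.fft.rfft(trace - trace.mean()))
peak_hz = np.fft.rfftfreq(trace.size, 1 / fps)[np.argmax(spec)]  # breathing rate
```

On real depth frames the ROI mean would additionally be filtered against sensor noise before its dominant frequency is taken as the respiration rate.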
Passive stand-off terahertz imaging with 1 hertz frame rate
NASA Astrophysics Data System (ADS)
May, T.; Zieger, G.; Anders, S.; Zakosarenko, V.; Starkloff, M.; Meyer, H.-G.; Thorwirth, G.; Kreysa, E.
2008-04-01
Terahertz (THz) cameras are expected to be a powerful tool for future security applications. If such a technology is to be useful for typical security scenarios (e.g. airport check-in), it has to meet some minimum standards. A THz camera should record images at video rate from a safe (stand-off) distance. Although active cameras are conceivable, a passive system has the benefit of concealed operation. Additionally, from an ethical perspective, the lack of exposure to a radiation source is a considerable advantage for public acceptance. Taking all these requirements into account, only cooled detectors are able to achieve the needed sensitivity. A big leap forward in detector performance and scalability was driven by the astrophysics community: superconducting bolometers and mid-sized arrays of them have been developed and are in routine use. Although devices with many pixels are foreseeable, at present a device with an additional scanning optic is the most direct way to an imaging system with a useful resolution. We demonstrate the capabilities of a concept for a passive terahertz video camera based on superconducting technology. The current prototype utilizes a small Cassegrain telescope with a gyrating secondary mirror to record 2-kilopixel THz images at a 1-second frame rate.
Optical correlator method and apparatus for particle image velocimetry processing
NASA Technical Reports Server (NTRS)
Farrell, Patrick V. (Inventor)
1991-01-01
Young's fringes are produced from a double-exposure image of particles in a flowing fluid by passing laser light through the film and projecting the light onto a screen. A video camera receives the image from the screen and controls a spatial light modulator. The spatial light modulator has a two-dimensional array of cells whose transmissivity is controlled in relation to the brightness of the corresponding pixel of the video camera image of the screen. A collimated beam of laser light is passed through the spatial light modulator to produce a diffraction pattern, which is focused onto another video camera; the output of that camera is digitized and provided to a microcomputer. The diffraction pattern formed when the laser light is passed through the spatial light modulator and focused to a point corresponds to the two-dimensional Fourier transform of the Young's fringe pattern projected onto the screen. The data obtained fro This invention was made with U.S. Government support awarded by the Department of the Army (DOD) and NASA grant number(s): DOD #DAAL03-86-K0174 and NASA #NAG3-718. The U.S. Government has certain rights in this invention.
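Numerically, the optical correlator's focused diffraction pattern corresponds to the 2-D Fourier transform of the fringes, so an FFT of a synthetic fringe pattern shows sideband peaks whose offset from the center gives the fringe frequency (and hence the particle displacement). A sketch in which the fringe frequencies are illustrative assumptions:

```python
import numpy as np

# Synthetic Young's fringes with an assumed fringe frequency of (10, 4)
# cycles per frame along (x, y).
N = 256
y, x = np.mgrid[0:N, 0:N]
fringes = 1.0 + np.cos(2 * np.pi * (10 * x + 4 * y) / N)

# 2-D FFT: after shifting, the two sideband peaks sit at +/- the fringe
# frequency about the center; zero the DC term so argmax finds a sideband.
spec = np.abs(np.fft.fftshift(np.fft.fft2(fringes)))
spec[N // 2, N // 2] = 0.0
py, px = np.unravel_index(np.argmax(spec), spec.shape)
offset = (abs(px - N // 2), abs(py - N // 2))   # recovered fringe frequency
```

In the apparatus this transform is performed optically in a single pass; the digital version above only illustrates the relationship between fringe spacing and peak position.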
Camera for Quasars in the Early Universe (CQUEAN)
NASA Astrophysics Data System (ADS)
Kim, Eunbin; Park, W.; Lim, J.; Jeong, H.; Kim, J.; Oh, H.; Pak, S.; Im, M.; Kuehne, J.
2010-05-01
The early universe at z ≳ 6 is where the first stars, galaxies, and quasars formed, starting the re-ionization of the universe. The discovery and study of quasars in the early universe allow us to witness the beginning of the history of astronomical objects. In order to perform a medium-deep, medium-wide imaging survey of quasars, we are developing an optical CCD camera, CQUEAN (Camera for QUasars in EArly uNiverse), which uses a 1024 × 1024 pixel deep-depletion CCD. It has enhanced quantum efficiency compared with conventional CCDs in the wavelength band around 1 μm, so it will be an efficient tool for observing quasars at z > 7. It will be attached to the 2.1 m telescope at McDonald Observatory, USA. A focal reducer is designed to secure a larger field of view at the Cassegrain focus of the 2.1 m telescope. For long, stable exposures, an auto-guiding system will be implemented using another CCD camera viewing an off-axis field. All these instruments will be controlled by software written in Python on a Linux platform. CQUEAN is expected to see first light during the summer of 2010.
Single chip camera device having double sampling operation
NASA Technical Reports Server (NTRS)
Fossum, Eric R. (Inventor); Nixon, Robert (Inventor)
2002-01-01
A single-chip camera device is formed on a single substrate, including an image acquisition portion, a control portion, and a timing circuit formed on the substrate. The timing circuit also controls the photoreceptors in a double-sampling mode, in which a reset level is first read and then, after an integration time, a charge level is read.
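The double-sampling readout described above is the basis of correlated double sampling: subtracting each pixel's reset-level read from its post-integration read cancels the per-pixel reset offset, leaving only the accumulated photo-signal. A toy numerical sketch in which all values are illustrative assumptions:

```python
import numpy as np

# Per-pixel reset (kTC) offsets and a uniform 40-count photo-signal;
# both are made-up numbers for illustration.
rng = np.random.default_rng(2)
reset_offset = rng.normal(100.0, 5.0, (4, 4))

read1 = reset_offset                 # first read: reset level
read2 = reset_offset + 40.0          # second read: after integration
cds = read2 - read1                  # double-sampled signal, offset cancelled
```

However noisy the reset offsets are from pixel to pixel, the difference image is uniform, which is why the timing circuit reads each pixel twice.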