Sample records for optical monitoring camera

  1. Optical Transient Monitor (OTM) for BOOTES Project

    NASA Astrophysics Data System (ADS)

    Páta, P.; Bernas, M.; Castro-Tirado, A. J.; Hudec, R.

    2003-04-01

    The Optical Transient Monitor (OTM) is software for controlling the three wide- and ultra-wide-field cameras of the BOOTES (Burst Observer and Optical Transient Exploring System) station. The OTM is PC-based and is a powerful tool for taking images from two SBIG CCD cameras simultaneously or from a single camera. The control program for the BOOTES cameras runs under Windows 98 or MS-DOS; a version for Windows 2000 is in preparation. There are five main supported operating modes. The OTM program can control the cameras and evaluate image data without human interaction.

  2. Using Arago's spot to monitor optical axis shift in a Petzval refractor.

    PubMed

    Bruns, Donald G

    2017-03-10

    Measuring the change in the optical alignment of a camera attached to a telescope is necessary to perform astrometric measurements. Camera movement when the telescope is refocused changes the plate constants, invalidating the calibration. Monitoring the shift in the optical axis requires a stable internal reference source. This is easily implemented in a Petzval refractor by adding an illuminated pinhole and a small obscuration that creates a spot of Arago on the camera. Measurements of the optical axis shift for a commercial telescope are given as an example.

  3. SITHON: A Wireless Network of in Situ Optical Cameras Applied to the Early Detection-Notification-Monitoring of Forest Fires

    PubMed Central

    Tsiourlis, Georgios; Andreadakis, Stamatis; Konstantinidis, Pavlos

    2009-01-01

    The SITHON system, a fully wireless optical imaging system integrating a network of in-situ optical cameras linked to a multi-layer GIS database operated by Control Operating Centres, has been developed in response to the need for early detection, notification and monitoring of forest fires. This article presents in detail the architecture and components of SITHON, and demonstrates the first encouraging results of an experimental test with small controlled fires over the Sithonia Peninsula in Northern Greece. The system has already been scheduled for installation in several fire-prone areas of Greece. PMID:22408536

  4. Weather and atmosphere observation with the ATOM all-sky camera

    NASA Astrophysics Data System (ADS)

    Jankowsky, Felix; Wagner, Stefan

    2015-03-01

    The Automatic Telescope for Optical Monitoring (ATOM) for H.E.S.S. is a 75 cm optical telescope which operates fully automatically. As no observer is present during observations, an auxiliary all-sky camera serves as the weather monitoring system. This device takes an image of the whole sky every three minutes. The gathered data then undergo live analysis: astrometric comparison with a theoretical night-sky model interprets the absence of stars as cloud coverage. The sky monitor also serves as a tool for meteorological analysis of the observation site of the upcoming Cherenkov Telescope Array. This overview covers the design and benefits of the all-sky camera and gives an introduction to current efforts to integrate the device into the atmosphere analysis programme of H.E.S.S.
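    The cloud-inference step described in this record can be illustrated with a minimal sketch: compare star positions expected from a catalog with those actually detected in a frame, and read the fraction of missing stars as cloud coverage. The function name, matching tolerance, and coordinates below are invented for the example and are not ATOM's implementation.

```python
def cloud_coverage(expected, detected, tol=2.0):
    """Fraction of expected stars with no detection within `tol` pixels."""
    missing = 0
    for ex, ey in expected:
        if not any((ex - dx) ** 2 + (ey - dy) ** 2 <= tol ** 2
                   for dx, dy in detected):
            missing += 1
    return missing / len(expected) if expected else 0.0

# Four catalog stars; two are detected, two are hidden by cloud.
expected = [(10.0, 10.0), (50.0, 50.0), (90.0, 10.0), (30.0, 70.0)]
detected = [(10.5, 9.8), (89.6, 10.3)]
print(cloud_coverage(expected, detected))  # 0.5
```

    A real pipeline would first solve the frame astrometrically and extract sources; the comparison step itself reduces to this kind of nearest-neighbour match.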

  5. Digital optical correlator x-ray telescope alignment monitoring system

    NASA Astrophysics Data System (ADS)

    Lis, Tomasz; Gaskin, Jessica; Jasper, John; Gregory, Don A.

    2018-01-01

    The High-Energy Replicated Optics to Explore the Sun (HEROES) program is a balloon-borne x-ray telescope mission to observe hard x-rays (˜20 to 70 keV) from the sun and multiple astrophysical targets. The payload consists of eight mirror modules with a total of 114 optics that are mounted on a 6-m-long optical bench. Each mirror module is complemented by a high-pressure xenon gas scintillation proportional counter. Attached to the payload is a camera that acquires star fields and then matches the acquired field to star maps to determine the pointing of the optical bench. Slight misalignments between the star camera, the optical bench, and the telescope elements attached to the optical bench may occur during flight due to mechanical shifts, thermal gradients, and gravitational effects. These misalignments can result in diminished imaging and reduced photon collection efficiency. To monitor these misalignments during flight, a supplementary Bench Alignment Monitoring System (BAMS) was added to the payload. BAMS hardware comprises two cameras mounted directly to the optical bench and rings of light-emitting diodes (LEDs) mounted onto the telescope components. The LEDs in these rings are mounted in a predefined, asymmetric pattern, and their positions are tracked using an optical/digital correlator. The BAMS analysis software is a digital adaption of an optical joint transform correlator. The aim is to enhance the observational proficiency of HEROES while providing insight into the magnitude of mechanically and thermally induced misalignments during flight. Results from a preflight test of the system are reported.
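    The digital joint transform correlator mentioned in this record is, at its core, a correlation peak search. A hedged one-dimensional sketch of FFT-based correlation used to track pattern displacement; the "LED" pattern and shift are invented for the example:

```python
import numpy as np

def find_shift(reference, frame):
    """Circular shift of `reference` that best matches `frame`,
    found as the peak of the FFT-based cross-correlation."""
    corr = np.fft.ifft(np.fft.fft(frame) * np.conj(np.fft.fft(reference)))
    return int(np.argmax(np.abs(corr)))

ref = np.zeros(64)
ref[[5, 9, 20]] = 1.0          # asymmetric pattern, like the BAMS LED rings
frame = np.roll(ref, 7)        # pattern displaced by 7 samples
print(find_shift(ref, frame))  # 7
```

    An optical joint transform correlator performs the equivalent operation with Fourier optics; a digital adaptation replaces the lenses with FFTs.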

  6. Search for GRB related prompt optical emission and other fast varying objects with ``Pi of the Sky'' detector

    NASA Astrophysics Data System (ADS)

    Ćwiok, M.; Dominik, W.; Małek, K.; Mankiewicz, L.; Mrowca-Ciułacz, J.; Nawrocki, K.; Piotrowski, L. W.; Sitek, P.; Sokołowski, M.; Wrochna, G.; Żarnecki, A. F.

    2007-06-01

    The “Pi of the Sky” experiment is designed to search for prompt optical emission from GRB sources. 32 CCD cameras covering 2 steradians will monitor the sky continuously. The data will be analysed on-line in a search for optical flashes. The two-camera prototype, operating at Las Campanas (Chile) since 2004, has recognised several outbursts of flaring stars and has set limits for a few GRBs.

  7. Coherent infrared imaging camera (CIRIC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

  8. [Computer optical topography: a study of the repeatability of the results of human body model examination].

    PubMed

    Sarnadskiĭ, V N

    2007-01-01

    The problem of repeatability of the results of examination of a plastic human body model is considered. The model was examined in 7 positions using an optical topograph for kyphosis diagnosis. The examination was performed under television camera monitoring. It was shown that variation of the model position in the camera view affected the repeatability of the results of topographic examination, especially if the model-to-camera distance was changed. A study of the repeatability of the results of optical topographic examination can help to increase the reliability of the topographic method, which is widely used for medical screening of children and adolescents.

  9. Camera-Based Microswitch Technology to Monitor Mouth, Eyebrow, and Eyelid Responses of Children with Profound Multiple Disabilities

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N.; O'Reilly, Mark F.; Lang, Russell; Didden, Robert

    2011-01-01

    A camera-based microswitch technology was recently used to successfully monitor small eyelid and mouth responses of two adults with profound multiple disabilities (Lancioni et al., Res Dev Disab 31:1509-1514, 2010a). This technology, in contrast with the traditional optic microswitches used for those responses, did not require support frames on…

  10. Docking alignment system

    NASA Technical Reports Server (NTRS)

    Monford, Leo G. (Inventor)

    1990-01-01

    Improved techniques are provided for alignment of two objects. The present invention is particularly suited for three-dimensional translation and three-dimensional rotational alignment of objects in outer space. A camera 18 is fixedly mounted to one object, such as a remote manipulator arm 10 of the spacecraft, while the planar reflective surface 30 is fixed to the other object, such as a grapple fixture 20. A monitor 50 displays in real-time images from the camera, such that the monitor displays both the reflected image of the camera and visible markings on the planar reflective surface when the objects are in proper alignment. The monitor may thus be viewed by the operator and the arm 10 manipulated so that the reflective surface is perpendicular to the optical axis of the camera, the roll of the reflective surface is at a selected angle with respect to the camera, and the camera is spaced a pre-selected distance from the reflective surface.

  11. Improved docking alignment system

    NASA Technical Reports Server (NTRS)

    Monford, Leo G. (Inventor)

    1988-01-01

    Improved techniques are provided for the alignment of two objects. The present invention is particularly suited for 3-D translation and 3-D rotational alignment of objects in outer space. A camera is affixed to one object, such as a remote manipulator arm of the spacecraft, while the planar reflective surface is affixed to the other object, such as a grapple fixture. A monitor displays in real-time images from the camera such that the monitor displays both the reflected image of the camera and visible markings on the planar reflective surface when the objects are in proper alignment. The monitor may thus be viewed by the operator and the arm manipulated so that the reflective surface is perpendicular to the optical axis of the camera, the roll of the reflective surface is at a selected angle with respect to the camera, and the camera is spaced a pre-selected distance from the reflective surface.

  12. Exploiting Auto-Collimation for Real-Time Onboard Monitoring of Space Optical Camera Geometric Parameters

    NASA Astrophysics Data System (ADS)

    Liu, W.; Wang, H.; Liu, D.; Miu, Y.

    2018-05-01

    Precise geometric parameters are essential to ensure the positioning accuracy of space optical cameras. However, state-of-the-art on-orbit calibration methods inevitably suffer from long update cycles and poor timeliness. To this end, in this paper we exploit the optical auto-collimation principle and propose a real-time onboard calibration scheme for monitoring key geometric parameters. Specifically, auto-collimation devices are first designed by installing collimated light sources, area-array CCDs, and prisms inside the satellite payload system. Through these devices, changes in the geometric parameters are converted into changes in the spot-image positions, and the variation of the geometric parameters can be derived by extracting and processing the spot images. An experimental platform is then set up to verify the feasibility and analyze the precision of the proposed scheme. The experimental results demonstrate that it is feasible to apply the optical auto-collimation principle for real-time onboard monitoring.
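    The spot-image processing at the heart of such a scheme can be sketched as an intensity-weighted centroid. The tiny image below is synthetic; in an auto-collimation setup a centroid displacement of d pixels with pixel pitch p and collimator focal length f corresponds to a mirror tilt of roughly d·p/(2f).

```python
def centroid(img):
    """Intensity-weighted centroid (x, y) of a 2-D spot image."""
    total = sx = sy = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

img = [[0, 0, 0, 0],
       [0, 1, 2, 0],
       [0, 1, 2, 0],
       [0, 0, 0, 0]]
cx, cy = centroid(img)
print(cx, cy)  # ~1.667 1.5
```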

  13. Optical transient monitor

    NASA Astrophysics Data System (ADS)

    Bernas, Martin; Páta, Petr; Hudec, René; Soldán, Jan; Rezek, Tomáš; Castro-Tirado, Alberto J.

    1998-05-01

    Although there are several optical GRB follow-up systems in operation and/or in development, some of them with a very short response time, they will never be able to provide true simultaneous (no delay) and pre-burst optical data for GRBs. We report on the development and tests of a monitoring experiment expected to be put into test operation in 1998. The system should detect Optical Transients down to mag 6-7 (few seconds duration assumed) over a wide field of view. The system is based on the double CCD wide-field cameras ST8. For the real time evaluation of the signal from both cameras, two TMS 320C40 processors are used. Using two channels differing in spectral sensitivity and processing of temporal sequence of images allows us to eliminate man-made objects and defects of the CCD electronics. The system is controlled by a standard PC computer.
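    The two-channel, two-frame rejection logic described in this record can be sketched as: accept a candidate transient only if it appears in both spectral channels of the current frame and in neither channel of the previous frame (single-channel hits suggest CCD defects; persistent hits are not new sources). The tiny binary masks below are illustrative, not the TMS 320C40 implementation.

```python
def transients(prev_a, prev_b, cur_a, cur_b):
    """Indices lit in both current channels but in neither previous one."""
    hits = []
    for i, (pa, pb, ca, cb) in enumerate(zip(prev_a, prev_b, cur_a, cur_b)):
        if ca and cb and not pa and not pb:
            hits.append(i)
    return hits

prev_a = [0, 0, 1, 0]
prev_b = [0, 0, 1, 0]  # persistent source at index 2
cur_a = [0, 1, 1, 0]
cur_b = [0, 1, 1, 0]   # new source appears at index 1
print(transients(prev_a, prev_b, cur_a, cur_b))  # [1]
```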

  14. Possibilities in optical monitoring of laser welding process

    NASA Astrophysics Data System (ADS)

    Horník, Petr; Mrňa, Libor; Pavelka, Jan

    2016-11-01

    Laser welding is a modern, widely used but still not very common method of welding. With increasing demands on weld quality, it is usual to apply automated machine welding with on-line monitoring of the welding process. The resulting weld quality is largely affected by the behavior of the keyhole. However, its direct observation during the welding process is practically impossible, and it is necessary to use indirect methods. At ISI we have developed optical methods of monitoring the process. The most advanced is an analysis of the radiation of the laser-induced plasma plume forming in the keyhole, where changes in the frequency of the plasma bursts are monitored and evaluated using Fourier and autocorrelation analysis. Another solution, robust and suitable for industry, is based on observation of the keyhole inlet opening through a coaxial camera mounted in the welding head and subsequent image processing by computer-vision methods. A high-speed camera is used to understand the dynamics of the plasma plume. Through optical spectroscopy of the plume, we can study the excitation of elements in the material. It is also beneficial to monitor the flow of the shielding gas using the schlieren method.
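    The Fourier part of the plume-monitoring approach reduces to estimating the dominant burst frequency of an intensity signal. A sketch with a synthetic 200 Hz signal; the sampling rate and frequency are invented values, not welding data:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Dominant frequency (Hz) of `signal` sampled at `fs`, ignoring DC."""
    spectrum = np.abs(np.fft.rfft(signal))
    spectrum[0] = 0.0  # drop the DC component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
plume = 1.0 + 0.5 * np.sin(2 * np.pi * 200.0 * t)  # synthetic burst signal
print(dominant_frequency(plume, fs))  # 200.0
```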

  15. Proposal of optical farming: development of several optical sensing instruments for agricultural use

    NASA Astrophysics Data System (ADS)

    Saito, Y.; Kobayashi, K.

    2013-05-01

    We propose the use of "Optical Farming," the application of all types of optical technologies in agriculture and agriculture-related industries. This paper focuses on the optical sensing instruments named "Agriserver," "Agrigadget," and "LIFS Monitor" developed in our laboratory, which we consider major building blocks of Optical Farming. Agriserver is a sensor network system that uses the Internet to collect information on agricultural products growing in fields. Agrigadget comprises several optical devices, such as a smartphone-based spectroscopic device and a hand framing camera. LIFS Monitor is an advanced monitoring instrument that makes it possible to obtain physiological information from living plants. All three are strongly associated with information and communication technology. Their performance in the field and in data usage in agricultural industries is reported.

  16. SHOK—The First Russian Wide-Field Optical Camera in Space

    NASA Astrophysics Data System (ADS)

    Lipunov, V. M.; Gorbovskoy, E. S.; Kornilov, V. G.; Panasyuk, M. I.; Amelushkin, A. M.; Petrov, V. L.; Yashin, I. V.; Svertilov, S. I.; Vedenkin, N. N.

    2018-02-01

    Two fast, fixed, very wide-field SHOK cameras are installed onboard the Lomonosov spacecraft. The main goal of this experiment is the observation of GRB optical emission before, simultaneously with, and after the gamma-ray emission. The field of view of each camera covers the gamma-ray burst detection area of the other instruments onboard the Lomonosov spacecraft. SHOK provides measurements of optical emission with a limiting magnitude of ˜9-10m on a single frame with an exposure of 0.2 seconds. The device is designed for continuous sky monitoring at optical wavelengths over a very wide field of view (1000 square degrees per camera), and for the detection and localization of fast time-varying (transient) optical sources on the celestial sphere, including pre-burst and synchronous recording of optical emission from the gamma-ray burst error boxes detected by the BDRG device, triggered by a control signal (alert trigger) from the BDRG. The Lomonosov spacecraft carries two identical devices, SHOK1 and SHOK2. The core of each SHOK device is a fast 11-megapixel CCD. Each SHOK device is a monoblock consisting of an optical-emission observation node, an electronics node, mechanical structural elements, and the body.

  17. Feasibility of an endotracheal tube-mounted camera for percutaneous dilatational tracheostomy.

    PubMed

    Grensemann, J; Eichler, L; Hopf, S; Jarczak, D; Simon, M; Kluge, S

    2017-07-01

    Percutaneous dilatational tracheostomy (PDT) in critically ill patients is often led by optical guidance with a bronchoscope. This is not without its disadvantages. Therefore, we aimed to study the feasibility of a recently introduced endotracheal tube-mounted camera (VivaSight™-SL, ET View, Misgav, Israel) in the guidance of PDT. We studied 10 critically ill patients who received PDT with a VivaSight-SL tube that was inserted prior to tracheostomy for optical guidance. Visualization of the tracheal structures (i.e., identification and monitoring of the thyroid, cricoid, and tracheal cartilage and the posterior wall) and the quality of ventilation (before puncture and during the tracheostomy) were rated on four-point Likert scales. Respiratory variables were recorded, and blood gases were sampled before the interventions, before the puncture and before the insertion of the tracheal cannula. Visualization of the tracheal landmarks was rated as 'very good' or 'good' in all but one case. Monitoring during the puncture and dilatation was also rated as 'very good' or 'good' in all but one. In the cases that were rated 'difficult', the visualization and monitoring of the posterior wall of the trachea were the main concerns. No changes in the respiratory variables or blood gases occurred between the puncture and the insertion of the tracheal cannula. Percutaneous dilatational tracheostomy with optical guidance from a tube-mounted camera is feasible. Further studies comparing the camera tube with bronchoscopy as the standard approach should be performed. © 2017 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  18. Applying UV cameras for SO2 detection to distant or optically thick volcanic plumes

    USGS Publications Warehouse

    Kern, Christoph; Werner, Cynthia; Elias, Tamar; Sutton, A. Jeff; Lübcke, Peter

    2013-01-01

    Ultraviolet (UV) camera systems represent an exciting new technology for measuring two dimensional sulfur dioxide (SO2) distributions in volcanic plumes. The high frame rate of the cameras allows the retrieval of SO2 emission rates at time scales of 1 Hz or higher, thus allowing the investigation of high-frequency signals and making integrated and comparative studies with other high-data-rate volcano monitoring techniques possible. One drawback of the technique, however, is the limited spectral information recorded by the imaging systems. Here, a framework for simulating the sensitivity of UV cameras to various SO2 distributions is introduced. Both the wavelength-dependent transmittance of the optical imaging system and the radiative transfer in the atmosphere are modeled. The framework is then applied to study the behavior of different optical setups and used to simulate the response of these instruments to volcanic plumes containing varying SO2 and aerosol abundances located at various distances from the sensor. Results show that UV radiative transfer in and around distant and/or optically thick plumes typically leads to a lower sensitivity to SO2 than expected when assuming a standard Beer–Lambert absorption model. Furthermore, camera response is often non-linear in SO2 and dependent on distance to the plume and plume aerosol optical thickness and single scatter albedo. The model results are compared with camera measurements made at Kilauea Volcano (Hawaii) and a method for integrating moderate resolution differential optical absorption spectroscopy data with UV imagery to retrieve improved SO2 column densities is discussed.
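    The standard Beer–Lambert retrieval that this record says breaks down for distant or optically thick plumes can be written in a few lines: an apparent column density from in-plume and plume-free intensities. The calibration constant below is an arbitrary placeholder, not a value from the Kilauea measurements.

```python
import math

def apparent_column_density(i_plume, i_clear, k=1.0e-3):
    """SO2 column (arbitrary units) from a Beer-Lambert absorbance;
    `k` is a placeholder calibration constant."""
    absorbance = -math.log(i_plume / i_clear)
    return absorbance / k

# 20% dimming inside the plume relative to clear sky:
print(round(apparent_column_density(80.0, 100.0), 1))  # 223.1
```

    The paper's point is that for distant or aerosol-rich plumes the true camera response is non-linear in SO2, so a single constant `k` systematically biases the retrieved column.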

  19. Don't get burned: thermal monitoring of vessel sealing using a miniature infrared camera

    NASA Astrophysics Data System (ADS)

    Lin, Shan; Fichera, Loris; Fulton, Mitchell J.; Webster, Robert J.

    2017-03-01

    Miniature infrared cameras have recently come to market in a form factor that facilitates packaging in endoscopic or other minimally invasive surgical instruments. If absolute temperature measurements can be made with these cameras, they may be useful for non-contact monitoring of electrocautery-based vessel sealing, or other thermal surgical processes like thermal ablation of tumors. As a first step in evaluating the feasibility of optical medical thermometry with these new cameras, in this paper we explore how well thermal measurements can be made with them. These cameras measure the raw flux of incoming IR radiation, and we perform a calibration procedure to map their readings to absolute temperature values in the range between 40 and 150 °C. Furthermore, we propose and validate a method to estimate the spatial extent of heat spread created by a cautery tool based on the thermal images.
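    The calibration step described here, mapping raw flux readings to absolute temperature over 40 to 150 °C, is in its simplest form a least-squares fit. The counts and reference temperatures below are synthetic illustrations, not data from the paper.

```python
def linear_fit(xs, ys):
    """Least-squares slope and intercept for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

raw = [1000.0, 1500.0, 2000.0, 2500.0]   # synthetic sensor counts
temp = [40.0, 76.7, 113.3, 150.0]        # reference temperatures, deg C
a, b = linear_fit(raw, temp)
print(round(a * 1750.0 + b, 1))          # predicted temperature at 1750 counts
```

    A real radiometric calibration would more likely use a Planck-curve model; a linear fit is only a first approximation over a narrow range.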

  20. Coincident Above- and Below-ground Autonomous Monitoring to Quantify Co-variability in Permafrost, Soil and Vegetation Properties in Arctic Tundra: Supporting Data

    DOE Data Explorer

    Baptiste Dafflon; Rusen Oktem; John Peterson; Craig Ulrich; Anh Phuong Tran; Vladimir Romanovsky; Susan Hubbard

    2017-05-10

    The dataset contains measurements obtained through electrical resistivity tomography (ERT) to monitor soil properties, pole-mounted optical cameras to monitor vegetation dynamics, point probes to measure soil temperature, and periodic manual measurements of thaw layer thickness, snow thickness and soil dielectric permittivity.

  1. A simple optical tweezers for trapping polystyrene particles

    NASA Astrophysics Data System (ADS)

    Shiddiq, Minarni; Nasir, Zulfa; Yogasari, Dwiyana

    2013-09-01

    Optical tweezers are an optical trap. For decades, they have been an optical tool that can trap and manipulate particles ranging from the very small, like DNA, to the large, like bacteria. The trapping force comes from the radiation pressure of laser light focused onto a group of particles. Optical tweezers have been used in many research areas such as atomic physics, medical physics, biophysics, and chemistry. Here, a simple optical tweezers setup has been constructed using a modified Leybold laboratory optical microscope. The ocular lens of the microscope was removed for laser-light and digital-camera access. Light from a Coherent diode laser with wavelength λ = 830 nm and power 50 mW is sent through an oil-immersion objective lens with magnification 100× and NA 1.25 to a cell made from microscope slides containing polystyrene particles. Polystyrene particles of size 3 μm and 10 μm are used. A CMOS Thorlabs camera type DCC1545M with USB interface and a 35 mm Thorlabs camera lens are connected to a desktop and used to monitor the trapping and measure the stiffness of the trap. The camera is accompanied by software that enables the user to capture and save images. The images are analyzed using ImageJ and Scion macros. The polystyrene particles have been trapped successfully. The stiffness of the trap depends on the size of the particles and the power of the laser: it increases linearly with power and decreases as the particle size increases.
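    The trap stiffness mentioned at the end of this record is commonly estimated from the equipartition theorem, k = kB·T / ⟨x²⟩, using the variance of the tracked bead position. The position samples below are synthetic, not measurements from this setup.

```python
KB = 1.380649e-23  # Boltzmann constant, J/K

def trap_stiffness(positions_m, temperature_k=293.0):
    """Equipartition estimate k = kB*T / var(x); positions in metres."""
    n = len(positions_m)
    mean = sum(positions_m) / n
    var = sum((x - mean) ** 2 for x in positions_m) / n
    return KB * temperature_k / var

samples = [10e-9, -10e-9, 20e-9, -20e-9]  # synthetic bead positions (m)
print(trap_stiffness(samples))            # ~1.6e-05 N/m
```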

  2. Monitoring the fabrication of tapered optical fibres

    NASA Astrophysics Data System (ADS)

    Mullaney, K.; Correia, R.; Staines, S. E.; James, S. W.; Tatam, R. P.

    2017-04-01

    A variety of optical methods to enhance the process of making optical fibre tapers are explored. A thermal camera was used both to refine the alignment of the optical components and to optimize the laser power profile during the tapering process. The fibre transmission was measured to verify that the tapers had the requisite optical characteristics, while the strain experienced by the fibre during tapering was assessed using an optical fibre Bragg grating. Using these techniques, adiabatic tapers were fabricated with a 2% insertion loss.

  3. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras.

    PubMed

    Quinn, Mark Kenneth; Spinosa, Emanuele; Roberts, David A

    2017-07-25

    Measurements of pressure-sensitive paint (PSP) have been performed using new or non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show the relevant imaging characteristics and the applicability of such imaging technology for PSP. Camera performance is benchmarked against standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on-board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access.
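    For context on how PSP images become pressures: intensity is conventionally converted with a Stern–Volmer style calibration, I_ref/I = A + B·(P/P_ref). The coefficients below are illustrative placeholders, not values from the static-chamber tests in this paper.

```python
def pressure_ratio(i_ref, i, a=0.2, b=0.8):
    """Invert the Stern-Volmer relation I_ref/I = a + b*(P/P_ref)."""
    return ((i_ref / i) - a) / b

# At the reference condition the intensities match, so P/P_ref = 1:
print(pressure_ratio(100.0, 100.0))  # 1.0
```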

  4. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras

    PubMed Central

    Spinosa, Emanuele; Roberts, David A.

    2017-01-01

    Measurements of pressure-sensitive paint (PSP) have been performed using new or non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show the relevant imaging characteristics and the applicability of such imaging technology for PSP. Camera performance is benchmarked against standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on-board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access. PMID:28757553

  5. Adapting smartphones for low-cost optical medical imaging

    NASA Astrophysics Data System (ADS)

    Pratavieira, Sebastião.; Vollet-Filho, José D.; Carbinatto, Fernanda M.; Blanco, Kate; Inada, Natalia M.; Bagnato, Vanderlei S.; Kurachi, Cristina

    2015-06-01

    Optical images have been used in several medical situations to improve the diagnosis of lesions or to monitor treatments. However, most systems employ expensive scientific (CCD or CMOS) cameras and need computers to display and save the images, usually resulting in a high final cost. Additionally, operating this sort of apparatus usually becomes more complex, requiring more specialized technical knowledge from the operator. Currently, the number of people using smartphone-like devices with built-in high-quality cameras is increasing, which may allow using such devices as efficient, lower-cost, portable imaging systems for medical applications. Thus, we aim to develop methods of adapting those devices to optical medical imaging techniques, such as fluorescence imaging. In particular, smartphone covers were adapted to connect a smartphone-like device to widefield fluorescence imaging systems. These systems were used to detect lesions in different tissues, such as the cervix and mouth/throat mucosa, and to monitor ALA-induced protoporphyrin-IX formation for photodynamic treatment of Cervical Intraepithelial Neoplasia. This approach may contribute significantly to low-cost, portable, and simple clinical optical image collection.

  6. Fiber-Optic Surface Temperature Sensor Based on Modal Interference.

    PubMed

    Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc

    2016-07-28

    Spatially-integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation by means of cost-effective image processing of few-mode interference patterns. New developments in the field of sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on image processing of the interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system consisting of an infrared camera and a precision temperature probe. The results of the optical technique are in line with the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: lower heat-capacity-related measurement errors, easier spatial deployment, replacement of multiple-angle infrared camera shooting, and continuous monitoring of surfaces that are not visually accessible.

  7. A novel camera localization system for extending three-dimensional digital image correlation

    NASA Astrophysics Data System (ADS)

    Sabato, Alessandro; Reddy, Narasimha; Khan, Sameer; Niezrecki, Christopher

    2018-03-01

    The monitoring of civil, mechanical, and aerospace structures is important, especially as these systems approach or surpass their design life. Often, Structural Health Monitoring (SHM) relies on sensing techniques for condition assessment. Advancements in camera technology and optical sensors have made three-dimensional (3D) Digital Image Correlation (DIC) a valid technique for extracting structural deformations and geometry profiles. Prior to making stereophotogrammetry measurements, a calibration has to be performed to obtain the vision system's extrinsic and intrinsic parameters; that is, the position of the cameras relative to each other (separation distance, camera angle, etc.) must be determined. Typically, cameras are placed on a rigid bar to prevent any relative motion between them. This constraint limits the utility of the 3D-DIC technique, especially as it is applied to monitoring large structures from various fields of view. In this preliminary study, the design of a multi-sensor system is proposed to extend 3D-DIC's capability and allow for easier calibration and measurement. The suggested system relies on a MEMS-based Inertial Measurement Unit (IMU) and a 77 GHz radar sensor for measuring the orientation and relative distance of the stereo cameras. The feasibility of the proposed combined IMU-radar system is evaluated through laboratory tests, demonstrating its ability to determine the cameras' positions in space for accurate 3D-DIC calibration and measurements.
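    A hedged sketch of the geometry this record proposes: combine an IMU-measured relative yaw with a radar-measured separation to recover the stereo extrinsics. The 2-D simplification and all numbers are invented for illustration.

```python
import math

def relative_pose(yaw_a_deg, yaw_b_deg, distance_m):
    """Relative yaw (deg) and baseline vector (m) in the world frame,
    assuming camera B lies along the direction given by camera A's yaw."""
    rel_yaw = yaw_b_deg - yaw_a_deg
    baseline = (distance_m * math.cos(math.radians(yaw_a_deg)),
                distance_m * math.sin(math.radians(yaw_a_deg)))
    return rel_yaw, baseline

rel, base = relative_pose(0.0, 15.0, 2.0)
print(rel, base)  # 15.0 (2.0, 0.0)
```

    The full 3-D problem uses the complete IMU orientation (roll, pitch, yaw) per camera, but the decomposition into a relative rotation plus a measured-length baseline is the same.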

  8. Nekton Interaction Monitoring System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-03-15

    The software provides a real-time processing system for sonar to detect and track animals, and to extract water column biomass statistics in order to facilitate continuous monitoring of an underwater environment. The Nekton Interaction Monitoring System (NIMS) extracts and archives tracking and backscatter statistics data from a real-time stream of data from a sonar device. NIMS also sends real-time tracking messages over the network that can be used by other systems to generate other metrics or to trigger instruments such as an optical video camera. A web-based user interface provides remote monitoring and control. NIMS currently supports three popular sonar devices: M3 multi-beam sonar (Kongsberg), EK60 split-beam echo-sounder (Simrad) and BlueView acoustic camera (Teledyne).

  9. Time-resolved optical measurements of the post-detonation combustion of aluminized explosives

    NASA Astrophysics Data System (ADS)

    Carney, Joel R.; Miller, J. Scott; Gump, Jared C.; Pangilinan, G. I.

    2006-06-01

    The dynamic observation and characterization of light emission following the detonation and subsequent combustion of an aluminized explosive is described. The temporal, spatial, and spectral specificity of the light emission are achieved using a combination of optical diagnostics. Aluminum and aluminum monoxide emission peaks are monitored as a function of time and space using streak camera based spectroscopy in a number of light collection configurations. Peak areas of selected aluminum containing species are tracked as a function of time to ascertain the relative kinetics (growth and decay of emitting species) during the energetic event. At the chosen streak camera sensitivity, aluminum emission is observed for 10μs following the detonation of a confined 20g charge of PBXN-113, while aluminum monoxide emission persists longer than 20μs. A broadband optical emission gauge, shock velocity gauge, and fast digital framing camera are used as supplemental optical diagnostics. In-line, collimated detection is determined to be the optimum light collection geometry because it is independent of distance between the optics and the explosive charge. The chosen optical configuration also promotes a constant cylindrical collection volume that should facilitate future modeling efforts.

  10. Beach Observations using Quadcopter Imagery

    NASA Astrophysics Data System (ADS)

    Yang, Yi-Chung; Wang, Hsing-Yu; Fang, Hui-Ming; Hsiao, Sung-Shan; Tsai, Cheng-Han

    2017-04-01

    Beaches are the places where the interaction of the land and sea takes place, and they are under the influence of many environmental factors, including meteorological and oceanic ones. Understanding the evolution of beaches may require constant monitoring. One way to monitor beach changes is to use optical cameras. With careful placement of ground control points, land-based optical cameras, which are inexpensive compared to other remote sensing apparatuses, can be used to survey a relatively large area in a short time. For example, we have used terrestrial optical cameras incorporated with ground control points to monitor beaches. The images from the cameras were calibrated by applying the direct linear transformation, projective transformation, and Sobel edge detector to locate the shoreline. The terrestrial optical cameras can record beach images continuously, and the shorelines can be satisfactorily identified. However, the terrestrial cameras have some limitations. First, the camera system must be set at a sufficiently high level so that the camera can cover the whole area of interest; such a location may not be available. The second limitation is that objects in the image have different resolutions, depending on their distance from the cameras. To overcome these limitations, the present study tested a quadcopter equipped with a down-looking camera to record video and still images of a beach. The quadcopter can be controlled to hover at one location. However, the hovering of the quadcopter can be affected by the wind, since it is not positively anchored to a structure. Although the quadcopter has a gimbal mechanism to damp out small shakings of the copter, it cannot completely counter movements due to the wind. In our preliminary tests, we flew the quadcopter up to 500 m high to record 10-minute videos. We then took a 10-minute average of the video data. 
The averaged image of the coast was blurred because of the duration of the video and the small movements caused by the quadcopter returning to its original position against the wind. To solve this problem, the Speeded Up Robust Features (SURF) feature detection method was applied to the video frames, and the resulting image was much sharper than the original. Next, we extracted the maximum and minimum RGB values of each pixel, respectively, from the 10-minute videos. The beach breaker zone showed up in the maximum-RGB image as white areas. Moreover, we were also able to remove the breakers from the images and see the breaker-zone bottom features using the minimum RGB values of the images. From this test, we also identified the location of the coastline. It was found that the correlation coefficient between the coastline identified from the copter images and that from the ground survey was as high as 0.98. By repeating this copter flight at different times, we could measure the evolution of the coastline.
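    The per-pixel maximum/minimum compositing described above is simple to sketch. The following illustrative version (function name and frame layout are assumptions, not from the paper) folds a stack of registered video frames into the two composites:

    ```python
    import numpy as np

    def minmax_composites(frames):
        """Per-pixel max and min RGB composites over a stack of video frames.

        frames: iterable of HxWx3 uint8 arrays (already co-registered).
        The max composite highlights persistent white foam (the breaker
        zone); the min composite suppresses transient foam, revealing
        the breaker-zone bottom features.
        """
        it = iter(frames)
        first = np.asarray(next(it))
        vmax = first.copy()
        vmin = first.copy()
        for f in it:
            np.maximum(vmax, f, out=vmax)   # brightest value ever seen per pixel
            np.minimum(vmin, f, out=vmin)   # darkest value ever seen per pixel
        return vmax, vmin
    ```

    Thresholding the max composite for near-white pixels would then delineate the breaker zone.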

  11. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.

    PubMed

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-11-17

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
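    A minimal sketch of the flat-field style vignetting correction and between-date normalization discussed above, assuming an image of a uniformly lit target is available for the same optics. The function names and the simple multiplicative gain model are illustrative assumptions, not the authors' exact procedure:

    ```python
    import numpy as np

    def correct_vignetting(image, flat):
        """Flat-field vignetting correction.

        image : raw band image (HxW float array)
        flat  : image of a uniformly lit target taken with the same
                lens/aperture; its falloff toward the corners encodes
                the vignetting pattern.
        """
        gain = flat.mean() / flat          # gain > 1 toward the darker corners
        return image * gain

    def normalize_between_dates(image, invariant_mean, target_mean):
        """Scale a scene so a pseudo-invariant target keeps a constant mean
        digital number across acquisition dates."""
        return image * (target_mean / invariant_mean)
    ```

    Band ratios such as NDVI computed from the corrected bands are then largely insensitive to residual scene-illumination differences, consistent with the study's third finding.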

  12. VizieR Online Data Catalog: 14 unusual IR transients with Spitzer (SPRITEs) (Kasliwal+, 2017)

    NASA Astrophysics Data System (ADS)

    Kasliwal, M. M.; Bally, J.; Masci, F.; Cody, A. M.; Bond, H. E.; Jencson, J. E.; Tinyanont, S.; Cao, Yi; Contreras, C.; Dykhoff, D. A.; Amodeo, S.; Armus, L.; Boyer, M.; Cantiello, M.; Carlon, R. L.; Cass, A. C.; Cook, D.; Corgan, D. T.; Faella, J.; Fox, O. D.; Green, W.; Gehrz, R. D.; Helou, G.; Hsiao, E.; Johansson, J.; Khan, R. M.; Lau, R. M.; Langer, N.; Levesque, E.; Milne, P.; Mohamed, S.; Morrell, N.; Monson, A.; Moore, A.; Ofek, E. O.; O'Sullivan, D.; Parthasarathy, M.; Perez, A.; Perley, D. A.; Phillips, M.; Prince, T. A.; Shenoy, D.; Smith, N.; Surace, J.; van Dyk, S. D.; Whitelock, P. A.; Williams, R.

    2017-11-01

    The SPitzer InfraRed Intensive Transients Survey (SPIRITS) uses the IRAC instrument (FoV 5'x5') on board the warm Spitzer telescope to search for IR transients at 3.6um ([3.6]) and 4.5um ([4.5]). SPIRITS is a five-year survey running from 2014 to 2018 (Kasliwal+ 2013sptz.prop10136K, 2016sptz.prop13053K). We are undertaking concomitant ground-based surveys to monitor the SPIRITS galaxy sample in the near-IR and the optical at roughly a monthly cadence. At the University of Minnesota's Mt. Lemmon Observing Facility (MLOF), we use the three-channel Two Micron All Sky Survey cameras mounted on the 1.52m IR telescope. At Las Campanas, we undertake near-IR monitoring with RetroCam on the du Pont 100-inch telescope and optical monitoring using the CCD on the Swope 40-inch telescope. At Palomar, we use the Samuel Oschin 48-inch (primarily gr-band) and Palomar 60-inch telescopes (gri-bands) for optical monitoring. Using the LCOGT network, we obtain additional optical monitoring in gri-bands. In addition, follow-up of discovered transients was undertaken by a myriad of facilities including Keck, Magellan, Palomar 200-inch, SALT, and RATIR. Following non-detections from the ground, we were able to set even deeper magnitude limits for two transients based on a small HST Director's Discretionary program (GO/DD-13935, PI H. Bond). We imaged SPIRITS 14aje (in M101) and SPIRITS 14axa (in M81) with the Wide Field Camera 3 (WFC3) in 2014 September. (5 data files).

  13. Maritime microwave radar and electro-optical data fusion for homeland security

    NASA Astrophysics Data System (ADS)

    Seastrand, Mark J.

    2004-09-01

    US Customs is responsible for monitoring all incoming air and maritime traffic, including that of the island of Puerto Rico as a US territory. Puerto Rico offers potentially obscure points of entry to drug smugglers. This environment creates the conditions for an illegal drug trade based relatively near the continental US. The US Customs Caribbean Air and Marine Operations Center (CAMOC), located in Puntas Salinas, has the charter to monitor maritime and Air Traffic Control (ATC) radars. The CAMOC monitors ATC radars and advises the Air and Marine Branch of US Customs of suspicious air activity. In turn, the US Coast Guard and/or US Customs will launch air and sea assets as necessary. The addition of a coastal radar and camera system provides US Customs with a maritime monitoring capability for the northwestern end of Puerto Rico (Figure 1). Command and control of the radar and camera are executed at the CAMOC, located 75 miles away. The Maritime Microwave Surveillance Radar performs search, primary target acquisition, and target tracking, while the Midwave Infrared (MWIR) camera performs target identification. This wide-area surveillance, using a combination of radar and MWIR camera, offers the CAMOC a cost- and manpower-effective approach to monitor, track, and identify maritime targets.

  14. High spatial resolution infrared camera as ISS external experiment

    NASA Astrophysics Data System (ADS)

    Eckehard, Lorenz; Frerker, Hap; Fitch, Robert Alan

    A high spatial resolution infrared camera as an ISS external experiment for monitoring global climate changes uses ISS internal and external resources (e.g., data storage). The optical experiment will consist of an infrared camera for monitoring global climate changes from the ISS. This technology was evaluated by the German small-satellite mission BIRD and further developed in different ESA projects. Compared to BIRD, the presented instrument uses proven advanced sensor technologies (ISS external) and ISS on-board processing and storage capabilities (internal). The instrument will be equipped with a serial interface for TM/TC and several relay commands for the power supply. For data processing and storage, a mass memory is required. Access to actual attitude data is highly desired to produce geo-referenced maps, if possible by on-board processing.

  15. Simultaneous Water Vapor and Dry Air Optical Path Length Measurements and Compensation with the Large Binocular Telescope Interferometer

    NASA Technical Reports Server (NTRS)

    Defrere, D.; Hinz, P.; Downey, E.; Boehm, M.; Danchi, W. C.; Durney, O.; Ertel, S.; Hill, J. M.; Hoffmann, W. F.; Mennesson, B.; hide

    2016-01-01

    The Large Binocular Telescope Interferometer uses a near-infrared camera to measure the optical path length variations between the two AO-corrected apertures and provide high-angular resolution observations for all its science channels (1.5-13 microns). There is however a wavelength dependent component to the atmospheric turbulence, which can introduce optical path length errors when observing at a wavelength different from that of the fringe sensing camera. Water vapor in particular is highly dispersive and its effect must be taken into account for high-precision infrared interferometric observations as described previously for VLTI/MIDI or the Keck Interferometer Nuller. In this paper, we describe the new sensing approach that has been developed at the LBT to measure and monitor the optical path length fluctuations due to dry air and water vapor separately. After reviewing the current performance of the system for dry air seeing compensation, we present simultaneous H-, K-, and N-band observations that illustrate the feasibility of our feed-forward approach to stabilize the path length fluctuations seen by the LBTI nuller.

  16. SU-E-T-774: Use of a Scintillator-Mirror-Camera System for the Measurement of MLC Leakage Radiation with the CyberKnife M6 System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goggin, L; Kilby, W; Noll, M

    2015-06-15

    Purpose: A technique using a scintillator-mirror-camera system to measure MLC leakage was developed to provide an efficient alternative to film dosimetry while maintaining high spatial resolution. This work describes the technique together with measurement uncertainties. Methods: Leakage measurements were made for the InCise™ MLC using the Logos XRV-2020A device. For each measurement, approximately 170 leakage and background images were acquired using optimized camera settings. The average background was subtracted from each leakage frame before filtering the integrated leakage image to replace anomalous pixels. Pixel value to dose conversion was performed using a calibration image. Mean leakage was calculated within an ROI corresponding to the primary beam, and maximum leakage was determined by binning the image into overlapping 1mm x 1mm ROIs. 48 measurements were performed using 3 cameras and multiple MLC-linac combinations in varying beam orientations, with each compared to film dosimetry. Optical and environmental influences were also investigated. Results: Measurement time with the XRV-2020A was 8 minutes vs. 50 minutes using radiochromic film, and results were available immediately. Camera radiation exposure degraded measurement accuracy. With a relatively undamaged camera, mean leakage agreed with film measurement to ≤0.02% in 92% of cases and ≤0.03% in 100% (for maximum leakage the values were 88% and 96%) relative to the reference open-field dose. The estimated camera lifetime over which this agreement is maintained is at least 150 measurements, and it can be monitored using reference field exposures. A dependency on camera temperature was identified, and a reduction in sensitivity with distance from the image center due to optical distortion was characterized. Conclusion: With periodic monitoring of the degree of camera radiation damage, the XRV-2020A system can be used to measure MLC leakage. 
This represents a significant time saving when compared to the traditional film-based approach without any substantial reduction in accuracy.
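    The maximum-leakage step above — scanning all overlapping 1 mm x 1 mm ROIs of the background-subtracted, dose-calibrated image for the largest mean — can be sketched efficiently with an integral image. The function name and the window size in pixels are assumptions for illustration, not details from the abstract:

    ```python
    import numpy as np

    def max_roi_mean(dose, win):
        """Maximum mean value over all overlapping win x win pixel ROIs.

        dose : 2-D array, background-subtracted and calibrated to dose
        win  : ROI side length in pixels (e.g., however many pixels span 1 mm)
        """
        # Integral image with a zero border, so any ROI sum is 4 lookups.
        s = np.zeros((dose.shape[0] + 1, dose.shape[1] + 1))
        s[1:, 1:] = np.cumsum(np.cumsum(dose, axis=0), axis=1)
        sums = (s[win:, win:] - s[:-win, win:]
                - s[win:, :-win] + s[:-win, :-win])
        return sums.max() / (win * win)
    ```

    The integral-image form keeps the scan linear in image size regardless of the ROI dimensions.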

  17. Opto-mechanical design of the G-CLEF flexure control camera system

    NASA Astrophysics Data System (ADS)

    Oh, Jae Sok; Park, Chan; Kim, Jihun; Kim, Kang-Min; Chun, Moo-Young; Yu, Young Sam; Lee, Sungho; Nah, Jakyoung; Park, Sung-Joon; Szentgyorgyi, Andrew; McMuldroch, Stuart; Norton, Timothy; Podgorski, William; Evans, Ian; Mueller, Mark; Uomoto, Alan; Crane, Jeffrey; Hare, Tyson

    2016-08-01

    The GMT-Consortium Large Earth Finder (G-CLEF) is the first-light instrument of the Giant Magellan Telescope (GMT). The G-CLEF is a fiber-fed, optical-band echelle spectrograph capable of extremely precise radial velocity measurements. KASI (Korea Astronomy and Space Science Institute) is responsible for the Flexure Control Camera (FCC) included in the G-CLEF Front End Assembly (GCFEA). The FCC is a kind of guide camera, which monitors the field images focused on a fiber mirror to control the flexure and focus errors within the GCFEA. The FCC consists of five optical components: a collimator including triple lenses for producing a pupil; neutral density filters allowing a much brighter star to be used as a target or a guide; a tent prism acting as a focus analyzer for measuring the focus offset at the fiber mirror; a reimaging camera with three pairs of lenses for focusing the beam on a CCD focal plane; and a CCD detector for capturing the image on the fiber mirror. In this article, we present the optical and mechanical FCC designs, which have been modified after the PDR in April 2015.

  18. Use of an UROV to develop 3-D optical models of submarine environments

    NASA Astrophysics Data System (ADS)

    Null, W. D.; Landry, B. J.

    2017-12-01

    The ability to rapidly obtain high-fidelity bathymetry is crucial for a broad range of engineering, scientific, and defense applications ranging from bridge scour, bedform morphodynamics, and coral reef health to unexploded ordnance detection and monitoring. The present work introduces the use of an Underwater Remotely Operated Vehicle (UROV) to develop 3-D optical models of submarine environments. The UROV used a Raspberry Pi camera mounted on a small servo, which allowed for pitch control. Prior to video data collection, in situ camera calibration was conducted with the system. Multiple image frames were extracted from the underwater video for 3D reconstruction using Structure from Motion (SFM). This system provides a simple and cost-effective solution for obtaining detailed bathymetry in optically clear submarine environments.

  19. Stereo-Optic High Definition Imaging: A New Technology to Understand Bird and Bat Avoidance of Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Evan; Goodale, Wing; Burns, Steve

    There is a critical need to develop monitoring tools to track aerofauna (birds and bats) in three dimensions around wind turbines. New monitoring systems will reduce permitting uncertainty by increasing the understanding of how birds and bats are interacting with wind turbines, which will improve the accuracy of impact predictions. Biodiversity Research Institute (BRI), The University of Maine Orono School of Computing and Information Science (UMaine SCIS), HiDef Aerial Surveying Limited (HiDef), and SunEdison, Inc. (formerly First Wind) responded to this need by using stereo-optic cameras with near-infrared (nIR) technology to investigate new methods for documenting aerofauna behavior around wind turbines. The stereo-optic camera system used two synchronized high-definition video cameras with fisheye lenses and processing software that detected moving objects, which could be identified in post-processing. The stereo-optic imaging system offered the ability to extract 3-D position information from pairs of images captured from different viewpoints. Fisheye lenses allowed for a greater field of view, but required more complex image rectification to contend with fisheye distortion. The ability to obtain 3-D positions provided crucial data on the trajectory (speed and direction) of a target, which, when the technology is fully developed, will provide data on how animals are responding to and interacting with wind turbines. This project was focused on testing the performance of the camera system, improving video review processing time, advancing the 3-D tracking technology, and moving the system from Technology Readiness Level 4 to 5. To achieve these objectives, we determined the size and distance at which aerofauna (particularly eagles) could be detected and identified, created efficient data management systems, improved the video post-processing viewer, and attempted refinement of 3-D modeling with respect to fisheye lenses. 
The 29-megapixel camera system successfully captured 16,173 five-minute video segments in the field. During nighttime field trials using nIR, we found that bat-sized objects could not be detected more than 60 m from the camera system. This led to a decision to focus research efforts exclusively on daytime monitoring and to redirect resources towards improving the video post-processing viewer. We redesigned the bird event post-processing viewer, which substantially decreased the review time necessary to detect and identify flying objects. During daytime field trials, we determined that eagles could be detected up to 500 m away using the fisheye wide-angle lenses, and eagle-sized targets could be identified to species within 350 m of the camera system. We used distance sampling survey methods to describe the probability of detecting and identifying eagles and other aerofauna as a function of distance from the system. The previously developed 3-D algorithm for object isolation and tracking was tested, but the image rectification (flattening) required to obtain accurate distance measurements with fisheye lenses was determined to be insufficient for distant eagles. We used MATLAB and OpenCV to improve fisheye lens rectification towards the center of the image, but accurate measurements towards the image corners could not be achieved. We believe that changing the fisheye lens to a rectilinear lens would greatly improve position estimation, but doing so would result in a decrease in viewing angle and depth of field. Finally, we generated simplified shape profiles of birds to look for similarities between unknown animals and known species. With further development, this method could provide a mechanism for filtering large numbers of shapes to reduce data storage and processing. These advancements further refined the camera system and brought this new technology closer to market. 
Once commercialized, the stereo-optic camera system technology could be used to: a) research how different species interact with wind turbines in order to refine collision risk models and inform mitigation solutions; and b) monitor aerofauna interactions with terrestrial and offshore wind farms, replacing costly human observers and allowing for long-term monitoring in the offshore environment. The camera system will provide developers and regulators with data on the risk that wind turbines present to aerofauna, which will reduce uncertainty in the environmental permitting process.

  20. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces. PMID:27873930

  1. Accuracy evaluation of the optical surface monitoring system on EDGE linear accelerator in a phantom study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mancosu, Pietro; Fogliata, Antonella, E-mail: Antonella.Fogliata@humanitas.it; Stravato, Antonella

    2016-07-01

    Frameless stereotactic radiosurgery (SRS) requires dedicated systems to monitor the patient position during the treatment to avoid target underdosage due to involuntary shifts. The optical surface monitoring system (OSMS) is evaluated here in a phantom-based study. The new EDGE linear accelerator from Varian (Varian, Palo Alto, CA) integrates, for cranial lesions, the common cone beam computed tomography (CBCT) and kV-MV portal imaging with the OSMS, a device able to detect real-time patient face movements along all 6 couch axes (vertical, longitudinal, lateral, rotation along the vertical axis, pitch, and roll). We have evaluated the OSMS imaging capability in checking the phantom's position and monitoring its motion. With this aim, a home-made cranial phantom was developed to evaluate the OSMS accuracy in 4 different experiments: (1) comparison with CBCT in isocenter location, (2) capability to recognize predefined shifts up to 2° or 3 cm, (3) evaluation at different couch angles, and (4) ability to properly reconstruct the surface when the linac gantry visually blocks one of the cameras. The OSMS system was shown, with a phantom, to be accurate for positioning with respect to the CBCT imaging system, with differences of 0.6 ± 0.3 mm for linear vector displacement and a maximum rotational inaccuracy of 0.3°. OSMS presented an accuracy of 0.3 mm for displacements up to 1 cm and 1°, and 0.5 mm for larger displacements. Different couch angles (45° and 90°) induced a mean vector uncertainty < 0.4 mm. Coverage of 1 camera produced an uncertainty < 0.5 mm. Translations and rotations of a phantom can be accurately detected with the optical surface detector system.
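    The "linear vector displacement" quoted above (e.g., 0.6 ± 0.3 mm) is simply the Euclidean norm of the difference between the translational offsets reported by the two systems. A trivial sketch (the function name is illustrative, not from the paper):

    ```python
    import math

    def vector_displacement(shift_a, shift_b):
        """3-D displacement (mm) between two reported couch positions,
        each given as (vertical, longitudinal, lateral) translations."""
        return math.dist(shift_a, shift_b)
    ```

    Each OSMS-vs-CBCT comparison in the study reduces to one such scalar, which is then averaged over repeated setups.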

  2. An Optical Biosensing Strategy Based on Selective Light Absorption and Wavelength Filtering from Chromogenic Reaction

    PubMed Central

    Chun, Hyeong Jin; Han, Yong Duk; Park, Yoo Min; Kim, Ka Ram; Lee, Seok Jae

    2018-01-01

    To overcome the time and space constraints in disease diagnosis via the biosensing approach, we developed a new signal-transducing strategy that can be applied to colorimetric optical biosensors. Our study is focused on implementation of a signal transduction technology that can directly translate the color intensity signals—that require complicated optical equipment for the analysis—into signals that can be easily counted with the naked eye. Based on the selective light absorption and wavelength-filtering principles, our new optical signaling transducer was built from a common computer monitor and a smartphone. In this signal transducer, the liquid crystal display (LCD) panel of the computer monitor served as a light source and a signal guide generator. In addition, the smartphone was used as an optical receiver and signal display. As a biorecognition layer, a transparent and soft material-based biosensing channel was employed generating blue output via a target-specific bienzymatic chromogenic reaction. Using graphics editor software, we displayed the optical signal guide patterns containing multiple polygons (a triangle, circle, pentagon, heptagon, and 3/4 circle, each associated with a specified color ratio) on the LCD monitor panel. During observation of signal guide patterns displayed on the LCD monitor panel using a smartphone camera via the target analyte-loaded biosensing channel as a color-filtering layer, the number of observed polygons changed according to the concentration of the target analyte via the spectral correlation between absorbance changes in a solution of the biosensing channel and color emission properties of each type of polygon. By simple counting of the changes in the number of polygons registered by the smartphone camera, we could efficiently measure the concentration of a target analyte in a sample without complicated and expensive optical instruments. 
In a demonstration test on glucose as a model analyte, we could easily measure the concentration of glucose in the range from 0 to 10 mM. PMID:29509682

  3. An Optical Biosensing Strategy Based on Selective Light Absorption and Wavelength Filtering from Chromogenic Reaction.

    PubMed

    Chun, Hyeong Jin; Han, Yong Duk; Park, Yoo Min; Kim, Ka Ram; Lee, Seok Jae; Yoon, Hyun C

    2018-03-06

    To overcome the time and space constraints in disease diagnosis via the biosensing approach, we developed a new signal-transducing strategy that can be applied to colorimetric optical biosensors. Our study is focused on implementation of a signal transduction technology that can directly translate the color intensity signals-that require complicated optical equipment for the analysis-into signals that can be easily counted with the naked eye. Based on the selective light absorption and wavelength-filtering principles, our new optical signaling transducer was built from a common computer monitor and a smartphone. In this signal transducer, the liquid crystal display (LCD) panel of the computer monitor served as a light source and a signal guide generator. In addition, the smartphone was used as an optical receiver and signal display. As a biorecognition layer, a transparent and soft material-based biosensing channel was employed generating blue output via a target-specific bienzymatic chromogenic reaction. Using graphics editor software, we displayed the optical signal guide patterns containing multiple polygons (a triangle, circle, pentagon, heptagon, and 3/4 circle, each associated with a specified color ratio) on the LCD monitor panel. During observation of signal guide patterns displayed on the LCD monitor panel using a smartphone camera via the target analyte-loaded biosensing channel as a color-filtering layer, the number of observed polygons changed according to the concentration of the target analyte via the spectral correlation between absorbance changes in a solution of the biosensing channel and color emission properties of each type of polygon. By simple counting of the changes in the number of polygons registered by the smartphone camera, we could efficiently measure the concentration of a target analyte in a sample without complicated and expensive optical instruments. 
In a demonstration test on glucose as a model analyte, we could easily measure the concentration of glucose in the range from 0 to 10 mM.

  4. Single-Camera-Based Method for Step Length Symmetry Measurement in Unconstrained Elderly Home Monitoring.

    PubMed

    Cai, Xi; Han, Guang; Song, Xin; Wang, Jinkuan

    2017-11-01

Single-camera-based gait monitoring is unobtrusive, inexpensive, and easy to use for monitoring the daily gait of seniors in their homes. However, most studies require subjects to walk perpendicularly to the camera's optical axis or along specified routes, which limits application in elderly home monitoring. To build unconstrained monitoring environments, we propose a method to measure the step length symmetry ratio (a useful gait parameter representing gait symmetry, with no significant relationship to age) from unconstrained straight walking using a single camera, without strict restrictions on walking directions or routes. According to projective geometry theory, we first develop a calculation formula for the step length ratio for the case of unconstrained straight-line walking. Then, to adapt to general cases, we propose to modify noncollinear footprints, and accordingly provide a general procedure for step length ratio extraction from unconstrained straight walking. Our method achieves a mean absolute percentage error (MAPE) of 1.9547% for 15 subjects' normal and abnormal side-view gaits, and also obtains satisfactory MAPEs for non-side-view gaits (2.4026% for 45°-view gaits and 3.9721% for 30°-view gaits). The performance is much better than that of a well-established monocular gait measurement system suitable only for side-view gaits, with a MAPE of 3.5538%. Independently of walking direction, our method can accurately estimate step length ratios from unconstrained straight walking. This demonstrates that our method is applicable for elders' daily gait monitoring, providing valuable information for elderly health care such as abnormal gait recognition and fall risk assessment.
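The accuracy figures in this record are mean absolute percentage errors. As a minimal sketch (not the authors' code; the ratio values below are illustrative, not study data), MAPE between estimated and reference step length symmetry ratios could be computed as:

```python
def mape(estimated, reference):
    """Mean absolute percentage error, in percent, over paired values."""
    if len(estimated) != len(reference) or not reference:
        raise ValueError("non-empty paired sequences required")
    errors = [abs((e - r) / r) for e, r in zip(estimated, reference)]
    return 100.0 * sum(errors) / len(errors)

# Illustrative step length symmetry ratios (estimated vs. reference);
# a ratio of 1.0 means perfectly symmetric left/right step lengths.
est = [0.98, 1.03, 0.95, 1.01]
ref = [1.00, 1.00, 1.00, 1.00]
print(round(mape(est, ref), 4))  # 2.75
```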

  5. Laser differential image-motion monitor for characterization of turbulence during free-space optical communication tests.

    PubMed

    Brown, David M; Juarez, Juan C; Brown, Andrea M

    2013-12-01

A laser differential image-motion monitor (DIMM) system was designed and constructed as part of a turbulence characterization suite during the DARPA free-space optical experimental network experiment (FOENEX) program. The developed link measurement system measures the atmospheric coherence length (r0), atmospheric scintillation, and power in the bucket for the 1550 nm band. DIMM measurements are made with two separate apertures coupled to a single InGaAs camera. The angle of arrival (AoA) for the wavefront at each aperture can be calculated based on focal spot movements imaged by the camera. By utilizing a single camera for the simultaneous measurement of the focal spots, the correlation of the variance in the AoA allows a straightforward computation of r0 as in traditional DIMM systems. Standard measurements of scintillation and power in the bucket are made with the same apertures by redirecting a percentage of the incoming signals to InGaAs detectors integrated with logarithmic amplifiers for high sensitivity and high dynamic range. By leveraging two small apertures, the instrument achieves a compact, lightweight configuration for mounting to actively tracking laser communication terminals for characterizing link performance.
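The abstract does not spell out how r0 follows from the differential AoA variance. A sketch using the textbook DIMM relation of Sarazin and Roddier; the aperture diameter, baseline, and variance below are illustrative assumptions, not the FOENEX instrument's parameters:

```python
import math

def r0_from_dimm(var_long, wavelength, aperture_d, baseline):
    """Fried parameter r0 (m) from the longitudinal differential AoA
    variance (rad^2), via the standard DIMM relation:
        var_l = 2 * lam^2 * r0**(-5/3) * (0.179*D**(-1/3) - 0.0968*d**(-1/3))
    where D is the sub-aperture diameter and d the baseline, in meters."""
    k = 2.0 * wavelength**2 * (0.179 * aperture_d**(-1.0 / 3.0)
                               - 0.0968 * baseline**(-1.0 / 3.0))
    return (k / var_long) ** (3.0 / 5.0)

# Illustrative numbers: 1550 nm light, 5 cm apertures on a 20 cm baseline.
r0 = r0_from_dimm(var_long=1e-11, wavelength=1550e-9,
                  aperture_d=0.05, baseline=0.20)
print(f"r0 = {r0 * 100:.1f} cm")
```

A larger measured variance yields a smaller r0, i.e. stronger turbulence.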

  6. Traffic monitoring with distributed smart cameras

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin; Ulm, Michael; Schwingshackl, Gert

    2012-01-01

The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has great potential. Today the automated analysis of traffic situations is still in its infancy: the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully captured and interpreted by a vision system. In this work we present steps towards a visual monitoring system which is designed to detect potentially dangerous traffic situations around a pedestrian crossing at a street intersection. The camera system is specifically designed to detect incidents in which the interaction of pedestrians and vehicles might develop into safety-critical encounters. The proposed system has been field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in a weatherproof housing. Two cameras run vehicle detection and tracking software; one camera runs a pedestrian detection and tracking module based on the HOG detection principle. All 3 cameras use sparse optical flow computation in a low-resolution video stream in order to estimate the motion path and speed of objects. Geometric calibration of the cameras allows us to estimate the real-world coordinates of detected objects and to link the cameras together into one common reference system. This work describes the foundation for all the different object detection modalities (pedestrians, vehicles), and explains the system setup, its design, and the evaluation results which we have achieved so far.
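The geometric calibration step, mapping detected objects to real-world coordinates, is commonly realized with a ground-plane homography. A minimal sketch of that mapping; the calibration matrix below is a made-up toy value, not the Vienna installation's calibration:

```python
import numpy as np

def pixel_to_world(H, u, v):
    """Map an image pixel (u, v) to ground-plane coordinates (x, y)
    via a 3x3 homography H, using homogeneous coordinates."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Toy homography: pure scaling of 100 pixels per meter (illustrative only;
# a real calibration would also encode rotation and perspective).
H = np.array([[0.01, 0.00, 0.0],
              [0.00, 0.01, 0.0],
              [0.00, 0.00, 1.0]])
x, y = pixel_to_world(H, 250, 400)
print(x, y)
```

Applying the same mapping to an object's position in successive frames, divided by the frame interval, yields a real-world speed estimate.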

  7. Optical flare at RA 15:16:21.2 DEC -20:08:16

    NASA Astrophysics Data System (ADS)

    Nesci, Roberto; Falasca, Vincenzo; Fantaulli, Stefano

    2015-06-01

On June 4, 2015, while monitoring the occultation of the star HD 132885 by asteroid (322) Phaeo in a session open to the public at the Foligno Observatory (IAU K56), we detected an optical flare with our Mintron intensified camera, mounted in parallel to the main telescope as an electronic finder with a 135 mm f/2.5 objective.

  8. Repurposing video recordings for structure motion estimations

    NASA Astrophysics Data System (ADS)

    Khaloo, Ali; Lattanzi, David

    2016-04-01

Video monitoring of public spaces is becoming increasingly ubiquitous, particularly near essential structures and facilities. During any hazard event that dynamically excites a structure, such as an earthquake or hurricane, proximal video cameras may inadvertently capture the motion time-history of the structure during the event. If this dynamic time-history could be extracted from the repurposed video recording, it would become a valuable forensic analysis tool for engineers performing post-disaster structural evaluations. The difficulty is that almost all potential video cameras are not installed to monitor structure motions, leading to camera perspective distortions and other associated challenges. This paper presents a method for extracting structure motions from videos using a combination of computer vision techniques. Images from a video recording are first reprojected into synthetic images that eliminate perspective distortion, using as-built knowledge of a structure for calibration. The motion of the camera itself during an event is also considered. Optical flow, a technique for tracking per-pixel motion, is then applied to these synthetic images to estimate the building motion. The developed method was validated using the experimental records of the NEESHub earthquake database. The results indicate that the technique is capable of estimating structural motions, particularly the frequency content of the response. Further work will evaluate variants and alternatives to the optical flow algorithm, as well as study the impact of video encoding artifacts on motion estimates.
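Since the validation emphasizes the frequency content of the response, here is a sketch of recovering the dominant frequency from a tracked displacement time-history. The signal is synthetic, not NEESHub data, and the frame rate is an assumed typical video rate:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the largest peak in the amplitude spectrum,
    after removing the DC component."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic 2 Hz structural sway, sampled at 30 fps for 10 s.
fs = 30.0
t = np.arange(0, 10, 1.0 / fs)
displacement = 3.0 * np.sin(2 * np.pi * 2.0 * t)  # pixels
print(dominant_frequency(displacement, fs))  # 2.0
```

In practice the displacement series would come from optical flow tracking of a point on the reprojected images rather than a synthetic sine.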

  9. Rapid-cadence optical monitoring for short-period variability of ɛ Aurigae

    NASA Astrophysics Data System (ADS)

    Billings, Gary

    2013-07-01

    ɛ Aurigae was observed with CCD cameras and 35 mm SLR camera lenses, at rapid cadence (>1/minute), for long runs (up to 11 hours), on multiple occasions during 2009 - 2011, to monitor for variability of the system at scales of minutes to hours. The lens and camera were changed during the period to improve results, finalizing on a 135 mm focal length Canon f/2 lens (at f/2.8), an ND8 neutral density filter, a Johnson V filter, and an SBIG ST-8XME camera (Kodak KAF-1603ME microlensed chip). Differential photometry was attempted, but because of the large separation between the variable and comparison star (η Aur), noise caused by transient extinction variations was not consistently eliminated. The lowest-noise time series for searching for short-period variability proved to be the extinction-corrected instrumental magnitude of ɛ Aur obtained on "photometric nights", with η Aur used to determine and monitor the extinction coefficient for the night. No flares or short-period variations of ɛ Aur were detected by visual inspection of the light curves from observing runs with noise levels as low as 0.008 magnitudes rms.
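The extinction-corrected instrumental magnitude described here follows the standard relation m0 = m_inst − kX, with airmass X ≈ sec(z). A sketch with illustrative values; the extinction coefficient and magnitudes below are not from these observations:

```python
import math

def airmass(zenith_angle_deg):
    """Plane-parallel airmass approximation X = sec(z);
    adequate away from the horizon."""
    return 1.0 / math.cos(math.radians(zenith_angle_deg))

def extinction_corrected(m_inst, k, zenith_angle_deg):
    """Remove atmospheric extinction: m0 = m_inst - k * X, where k is
    the extinction coefficient (mag/airmass) determined each night
    from a reference star, as done with eta Aur in the record above."""
    return m_inst - k * airmass(zenith_angle_deg)

# Illustrative: instrumental V = 3.20 at zenith angle 60 deg, k = 0.25.
print(round(extinction_corrected(3.20, 0.25, 60.0), 3))  # 2.7
```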

  10. Optimizing Orbital Debris Monitoring with Optical Telescopes

    DTIC Science & Technology

    2010-09-01

poses an increasing risk to manned space missions and operational satellites; however, the majority of debris large enough to cause catastrophic...cameras hosted on GEO-based satellites for monitoring GEO. Performance analysis indicates significant potential contributions of these systems as a...concerns over the long-term viability of the space environment and the resulting economic impacts. The 2007 China anti-satellite test and the 2009

  11. Monitoring lightning from space with TARANIS

    NASA Astrophysics Data System (ADS)

    Farges, T.; Blanc, E.; Pinçon, J.

    2010-12-01

Recent space experiments, e.g. OTD and LIS, have shown the great interest of lightning monitoring from space and the efficiency of optical measurements. Future instruments are now being defined for the next generation of geostationary meteorology satellites. Calibration of these instruments requires ground-truth events provided by lightning location networks, such as NLDN in the US and EUCLID or LINET in Europe, using electromagnetic observations at a regional scale. One of the most challenging objectives is the continuous monitoring of lightning activity over the tropical zone (Africa, America, and Indonesia). However, one difficulty is the lack of regional-scale lightning location networks in these areas to validate the data quality. TARANIS (Tool for the Analysis of Radiations from lightNings and Sprites) is a CNES microsatellite project. It is dedicated to the study of impulsive transfers of energy between the Earth's atmosphere and the space environment, from nadir observations of Transient Luminous Events (TLEs), Terrestrial Gamma-ray Flashes (TGFs) and other possible associated emissions. Its orbit will be sun-synchronous at 10:30 local time, at an altitude of 700 km, with a nominal lifetime of 2 years. Its payload is composed of several electromagnetic instruments covering different wavelengths: X- and gamma-ray detectors, optical cameras and photometers, and electromagnetic wave sensors from DC to 30 MHz, complemented by high-energy electron detectors. The optical instrument includes 2 cameras and 4 photometers. All sensors are equipped with filters for sprite and lightning differentiation. The camera filters are designed for sprite and lightning observations at 762 nm and 777 nm, respectively. However, unlike the OTD or LIS instruments, the filter bandwidth and the exposure time (10 nm and 91 ms, respectively) prevent optical lightning observations during daytime. The camera field of view is a 500 km square at ground level with a spatial sampling of about 1 km. One of the photometers will precisely measure the lightning radiance over a wide spectral range from 600 to 900 nm with a sampling frequency of 20 kHz. We suggest using the Event mode and, mainly, the Survey mode of the MCP instrument to monitor lightning activity and compare it to geostationary satellite lightning mapper data. In Event mode, data are recorded at their highest resolution. In the camera Survey mode, every image is archived using a specific compression algorithm. The photometer Survey mode consists of decimating the data by a factor of 10 and reducing the data dynamic range; however, it remains well adapted to providing a good continuous characterization of lightning activity. The use of other instruments, for example the 0+ whistler detector, will complete the lightning characterization.

  12. Flow cytometer jet monitor system

    DOEpatents

    Van den Engh, Ger

    1997-01-01

    A direct jet monitor illuminates the jet of a flow cytometer in a monitor wavelength band which is substantially separate from the substance wavelength band. When a laser is used to cause fluorescence of the substance, it may be appropriate to use an infrared source to illuminate the jet and thus optically monitor the conditions within the jet through a CCD camera or the like. This optical monitoring may be provided to some type of controller or feedback system which automatically changes either the horizontal location of the jet, the point at which droplet separation occurs, or some other condition within the jet in order to maintain optimum conditions. The direct jet monitor may be operated simultaneously with the substance property sensing and analysis system so that continuous monitoring may be achieved without interfering with the substance data gathering and may be configured so as to allow the front of the analysis or free fall area to be unobstructed during processing.

  13. Displacement and deformation measurement for large structures by camera network

    NASA Astrophysics Data System (ADS)

    Shang, Yang; Yu, Qifeng; Yang, Zhen; Xu, Zhiqiang; Zhang, Xiaohu

    2014-03-01

    A displacement and deformation measurement method for large structures by a series-parallel connection camera network is presented. By taking the dynamic monitoring of a large-scale crane in lifting operation as an example, a series-parallel connection camera network is designed, and the displacement and deformation measurement method by using this series-parallel connection camera network is studied. The movement range of the crane body is small, and that of the crane arm is large. The displacement of the crane body, the displacement of the crane arm relative to the body and the deformation of the arm are measured. Compared with a pure series or parallel connection camera network, the designed series-parallel connection camera network can be used to measure not only the movement and displacement of a large structure but also the relative movement and deformation of some interesting parts of the large structure by a relatively simple optical measurement system.

  14. Camera traps can be heard and seen by animals.

    PubMed

    Meek, Paul D; Ballard, Guy-Anthony; Fleming, Peter J S; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.

  15. CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli

    2011-02-01

Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebrovasculature and is the third leading cause of death in the United States. More than 80 percent of strokes are ischemic, due to blockage of an artery in the brain by thrombosis or arterial embolism. Hence, development of an imaging technique to image or monitor cerebral ischemia and the effect of anti-stroke therapy is clearly needed. Near-infrared (NIR) optical tomography has great potential as a non-invasive imaging tool (due to its low cost and portability) for imaging embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebrovascular hemodynamics and brain injury. As compared to a fiber-based diffuse optical tomographic system, a CCD-camera-based system is more suitable for pre-clinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions based on tissue-phantom experimental data. We obtain good reconstructed images with two recently developed algorithms: (1) a depth compensation algorithm (DCA) and (2) a globally convergent method (GCM). We will demonstrate volumetric tomographic reconstructions from tissue phantoms; the technique has great potential for determining and monitoring the effect of anti-stroke therapies.

  16. Fundamentals of in Situ Digital Camera Methodology for Water Quality Monitoring of Coast and Ocean

    PubMed Central

    Goddijn-Murphy, Lonneke; Dailloux, Damien; White, Martin; Bowers, Dave

    2009-01-01

Conventional digital cameras, the Nikon Coolpix885® and the SeaLife ECOshot®, were used as in situ optical instruments for water quality monitoring. Measured response spectra showed that these digital cameras are basically three-band radiometers. The response values in the red, green and blue bands, quantified by RGB values of digital images of the water surface, were comparable to measurements of irradiance levels at red, green and cyan/blue wavelengths of water-leaving light. Different systems were deployed to capture upwelling light from below the surface, while eliminating direct surface reflection. Relationships between RGB ratios of water surface images and water quality parameters were found to be consistent with previous measurements using more traditional narrow-band radiometers. This paper focuses on the method that was used to acquire digital images, derive RGB values and relate measurements to water quality parameters. Field measurements were obtained in Galway Bay, Ireland, and in the Southern Rockall Trough in the North Atlantic, where both yellow substance and chlorophyll concentrations were successfully assessed using the digital camera method. PMID:22346729
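As a sketch of the first processing step described, deriving mean RGB band values and band ratios from an image array; the synthetic pixel values are illustrative, not field data, and the specific ratios chosen are assumptions rather than the paper's regression variables:

```python
import numpy as np

def band_ratios(image):
    """Mean R, G, B of a water-surface image (H x W x 3 array) and the
    band ratios commonly related to water constituents."""
    r, g, b = image.reshape(-1, 3).mean(axis=0)
    return {"R/G": r / g, "B/G": b / g}

# Tiny synthetic "image" of greenish water, channel values in 0-255.
img = np.full((4, 4, 3), (60.0, 120.0, 90.0))
ratios = band_ratios(img)
print(ratios["R/G"], ratios["B/G"])  # 0.5 0.75
```

In the field method, such ratios would then be regressed against measured yellow substance or chlorophyll concentrations.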

  17. 47 CFR 51.323 - Standards for physical collocation and virtual collocation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... accessible by both the incumbent LEC and the collocating telecommunications carrier, at which the fiber optic... technically feasible, the incumbent LEC shall provide the connection using copper, dark fiber, lit fiber, or... that the incumbent LEC may adopt include: (1) Installing security cameras or other monitoring systems...

  18. 47 CFR 51.323 - Standards for physical collocation and virtual collocation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... accessible by both the incumbent LEC and the collocating telecommunications carrier, at which the fiber optic... technically feasible, the incumbent LEC shall provide the connection using copper, dark fiber, lit fiber, or... that the incumbent LEC may adopt include: (1) Installing security cameras or other monitoring systems...

  19. Beam line shielding calculations for an Electron Accelerator Mo-99 production facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mocko, Michal

    2016-05-03

The purpose of this study is to evaluate the photon and neutron fields in and around the latest beam line design for the Mo-99 production facility. The radiation dose to the beam line components (quadrupoles, dipoles, beam stops, and the linear accelerator) is calculated in the present report. The beam line design assumes placement of two cameras, infrared (IR) and optical transition radiation (OTR), for continuous monitoring of the beam spot on target during irradiation. The cameras will be placed off the beam axis, offset in the vertical direction. We explored typical shielding arrangements for the cameras and report the resulting neutron and photon dose fields.

  20. Instrumentation development for space debris optical observation system in Indonesia: Preliminary results

    NASA Astrophysics Data System (ADS)

    Dani, Tiar; Rachman, Abdul; Priyatikanto, Rhorom; Religia, Bahar

    2015-09-01

The increasing amount of space junk in orbit has raised the chance of debris falling in the Indonesian region. So far, three pieces of rocket-body debris have been found, in Bengkulu, Gorontalo, and Lampung. LAPAN has successfully developed software for monitoring space debris that passes over Indonesia at altitudes below 200 km. To support the software-based system, a hardware-based system built on optical instruments has been under development since early 2014. It consists of two subsystems: a telescopic system and a wide-field system. The telescopic system uses CCD cameras and a reflecting telescope with relatively high sensitivity. The wide-field system uses DSLR cameras, binoculars, and a combination of a CCD with a DSLR lens. Methods and preliminary results for these systems are presented.

  1. Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring.

    PubMed

    Allison, Robert S; Johnston, Joshua M; Craig, Gregory; Jennings, Sion

    2016-08-18

For decades, detection and monitoring of forest and other wildland fires have relied heavily on aircraft (and satellites). Technical advances and improved affordability of both sensors and sensor platforms promise to revolutionize the way aircraft detect, monitor and help suppress wildfires. Sensor systems like hyperspectral cameras, image intensifiers and thermal cameras that have previously been limited in use due to cost or technology considerations are now becoming widely available and affordable. Similarly, new airborne sensor platforms, particularly small, unmanned aircraft or drones, are enabling new applications for airborne fire sensing. In this review we outline the state of the art in direct, semi-automated and automated fire detection from both manned and unmanned aerial platforms. We discuss the operational constraints and opportunities provided by these sensor systems, including a discussion of the objective evaluation of these systems in a realistic context.

  2. Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring

    PubMed Central

    Allison, Robert S.; Johnston, Joshua M.; Craig, Gregory; Jennings, Sion

    2016-01-01

For decades, detection and monitoring of forest and other wildland fires have relied heavily on aircraft (and satellites). Technical advances and improved affordability of both sensors and sensor platforms promise to revolutionize the way aircraft detect, monitor and help suppress wildfires. Sensor systems like hyperspectral cameras, image intensifiers and thermal cameras that have previously been limited in use due to cost or technology considerations are now becoming widely available and affordable. Similarly, new airborne sensor platforms, particularly small, unmanned aircraft or drones, are enabling new applications for airborne fire sensing. In this review we outline the state of the art in direct, semi-automated and automated fire detection from both manned and unmanned aerial platforms. We discuss the operational constraints and opportunities provided by these sensor systems, including a discussion of the objective evaluation of these systems in a realistic context. PMID:27548174

  3. Camera Traps Can Be Heard and Seen by Animals

    PubMed Central

    Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals’ hearing and produce illumination that can be seen by many species. PMID:25354356

  4. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.…

  5. Updating the Synchrotron Radiation Monitor at TLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuo, C. H.; Hsu, S. Y.; Wang, C. J.

    2007-01-19

The synchrotron radiation monitor provides useful information to support routine operation and physics experiments using the beam. Precisely knowing the profile of the beam helps to improve machine performance. The synchrotron radiation monitor at the Taiwan Light Source (TLS) was recently upgraded. The optics and modeling were improved to increase measurement accuracy for small beam sizes. A high-performance IEEE-1394 digital CCD camera was used to improve the quality of images and extend the dynamic range of measurement. The image analysis is also improved. This report summarizes the status and results.

  6. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in the position of components, and a theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter that attaches a CCD camera with a lens to the eyepiece of a Leica Wild T3000 theodolite, enabling viewing on a connected monitor; it can thus be utilized with multiple theodolites simultaneously. This technology removes a substantial source of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means of measuring such error.

  7. A novel optical investigation technique for railroad track inspection and assessment

    NASA Astrophysics Data System (ADS)

    Sabato, Alessandro; Beale, Christopher H.; Niezrecki, Christopher

    2017-04-01

Track failures due to crosstie degradation or loss of ballast support may result in problems ranging from simple service interruptions to derailments. Structural Health Monitoring (SHM) of railway track is important for safety reasons and to reduce downtime and maintenance costs. For this reason, novel and cost-effective track inspection technologies for assessing track health are needed, as current methods are insufficient. Advancements achieved in recent years in camera technology, optical sensors, and image-processing algorithms have made machine vision, Structure from Motion (SfM), and three-dimensional (3D) Digital Image Correlation (DIC) systems extremely appealing techniques for extracting structural deformations and geometry profiles. Therefore, optically based, non-contact measurement techniques may be used for assessing surface defects, rail and tie deflection profiles, and ballast condition. In this study, the design of two camera-based measurement systems is proposed for crosstie-ballast condition assessment and track examination purposes. The first consists of four pairs of cameras installed on the underside of a rail car to detect the induced deformation and displacement along the whole length of the track's crosstie using 3D DIC measurement techniques. The second consists of another set of cameras using SfM techniques to obtain a 3D rendering of the infrastructure from a series of two-dimensional (2D) images to evaluate the state of the track qualitatively. The feasibility of the proposed optical systems is evaluated through extensive laboratory tests, demonstrating their ability to measure parameters of interest (e.g. crosstie full-field displacement, vertical deflection, shape, etc.) for assessment and SHM of railroad track.

  8. Endoscopic techniques in aesthetic plastic surgery.

    PubMed

    McCain, L A; Jones, G

    1995-01-01

    There has been an explosive interest in endoscopic techniques by plastic surgeons over the past two years. Procedures such as facial rejuvenation, breast augmentation and abdominoplasty are being performed with endoscopic assistance. Endoscopic operations require a complex setup with components such as video camera, light sources, cables and hard instruments. The Hopkins Rod Lens system consists of optical fibers for illumination, an objective lens, an image retrieval system, a series of rods and lenses, and an eyepiece for image collection. Good illumination of the body cavity is essential for endoscopic procedures. Placement of the video camera on the eyepiece of the endoscope gives a clear, brightly illuminated large image on the monitor. The video monitor provides the surgical team with the endoscopic image. It is important to become familiar with the equipment before actually doing cases. Several options exist for staff education. In the operating room the endoscopic cart needs to be positioned to allow a clear unrestricted view of the video monitor by the surgeon and the operating team. Fogging of the endoscope may be prevented during induction by using FREDD (a fog reduction/elimination device) or a warm bath. The camera needs to be white balanced. During the procedure, the nurse monitors the level of dissection and assesses for clogging of the suction.

  9. Lensless imaging for wide field of view

    NASA Astrophysics Data System (ADS)

    Nagahara, Hajime; Yagi, Yasushi

    2015-02-01

It is desirable to engineer a small camera with a wide field of view (FOV) because of current developments in the field of wearable cameras and computing products, such as action cameras and Google Glass. However, typical approaches to achieving a wide FOV, such as attaching a fisheye lens or convex mirrors, require a trade-off between optics size and FOV. We propose camera optics that achieve a wide FOV while remaining small and lightweight. The proposed optics are a completely lensless, catoptric design containing four mirrors: two for wide viewing and two for focusing the image onto the camera sensor. Because the design uses only mirrors, it is simple, easily miniaturized, and not susceptible to chromatic aberration. We have implemented prototype optics of our lensless concept, attached them to commercial charge-coupled device/complementary metal-oxide-semiconductor cameras, and conducted experiments to evaluate the feasibility of our proposed optics.

  10. Refining the Workflow of UV Camera Measurements: Data Collection from Low Emission Rate Volcanoes under Variable Conditions

    NASA Astrophysics Data System (ADS)

    Brewer, I. D.; Werner, C. A.; Nadeau, P. A.

    2010-12-01

UV camera systems are gaining popularity worldwide for quantifying SO2 column abundances and emission rates from volcanoes, which serve as primary measures of volcanic hazard and aid in eruption forecasting. To date, many investigations have focused on fairly active and routinely monitored volcanoes under optimal conditions. Some recent studies have begun to recommend protocols and procedures for data collection, but additional questions still need to be addressed. In this study we attempt to answer these questions, and also present results from volcanoes that are rarely monitored. Conditions at these volcanoes are typically sub-optimal for UV camera measurements. Discussion of such data is essential in the assessment of the wider applicability of UV camera measurements for SO2 monitoring purposes. The data discussed herein consist of plume images from volcanoes with relatively low emission rates, collected under varying weather conditions and from various distances (2-12 km). These include Karangetang Volcano (Indonesia), Mount St. Helens (Washington, USA), and Augustine and Redoubt Volcanoes (Alaska, USA). High emission rate data were also collected at Kilauea Volcano (Hawaii, USA), and blue sky test images with no plume were collected at Mammoth Mountain (California, USA). All data were collected between 2008 and 2010 using both single-filter (307 nm) and dual-filter (307 nm/326 nm) systems and were accompanied by FLYSPEC measurements. With the dual-filter systems, both a filter wheel setup and a synchronous-imaging dual-camera setup were employed. 
Data collection and processing questions included (1) what is the detection limit of the camera, (2) how large is the variability in raw camera output, (3) how do camera optics affect the measurements and how can this be corrected, (4) how much variability is observed in calibration under various conditions, (5) what is the optimal workflow for image collection and processing, and (6) what is the range of camera operating conditions? Besides emission rates from these infrequently monitored volcanoes, the results of this study include a recommended workflow and procedure for image collection and calibration, and a MATLAB-based algorithm for batch processing, thereby enabling accurate emission rates at 1 Hz when a synchronous-imaging dual-camera setup is used.
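    A minimal sketch of the dual-filter optical-density step underlying such measurements (names are illustrative, not the authors' MATLAB code; vignetting, dark-current, and dilution corrections are omitted): the apparent SO2 optical density is the on-band optical density minus the off-band one, which removes broadband aerosol extinction.

    ```python
    import numpy as np

    def so2_optical_density(on_band, off_band, bg_on, bg_off):
        """Apparent SO2 optical density from dual-filter UV camera images.
        on_band/off_band: plume images near 307 nm (SO2 absorbs) and
        326 nm (little SO2 absorption); bg_*: matching clear-sky
        background images. The off-band term removes broadband
        (aerosol) extinction."""
        tau_on = -np.log(on_band / bg_on)
        tau_off = -np.log(off_band / bg_off)
        return tau_on - tau_off
    ```

    The resulting optical-density image is converted to SO2 column densities via cell or DOAS calibration, then integrated along a plume cross-section and multiplied by the plume speed to obtain an emission rate.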

  11. Simultaneous water vapor and dry air optical path length measurements and compensation with the large binocular telescope interferometer

    NASA Astrophysics Data System (ADS)

    Defrère, D.; Hinz, P.; Downey, E.; Böhm, M.; Danchi, W. C.; Durney, O.; Ertel, S.; Hill, J. M.; Hoffmann, W. F.; Mennesson, B.; Millan-Gabet, R.; Montoya, M.; Pott, J.-U.; Skemer, A.; Spalding, E.; Stone, J.; Vaz, A.

    2016-08-01

The Large Binocular Telescope Interferometer uses a near-infrared camera to measure the optical path length variations between its two AO-corrected apertures and provide high-angular-resolution observations for all its science channels (1.5-13 microns). There is, however, a wavelength-dependent component to the atmospheric turbulence, which can introduce optical path length errors when observing at a wavelength different from that of the fringe-sensing camera. Water vapor in particular is highly dispersive, and its effect must be taken into account for high-precision infrared interferometric observations, as described previously for VLTI/MIDI and the Keck Interferometer Nuller. In this paper, we describe the new sensing approach that has been developed at the LBT to measure and monitor the optical path length fluctuations due to dry air and water vapor separately. After reviewing the current performance of the system for dry air seeing compensation, we present simultaneous H-, K-, and N-band observations that illustrate the feasibility of our feedforward approach to stabilize the path length fluctuations seen by the LBTI nuller.

  12. Optical imaging characterizing brain response to thermal insult in injured rodent

    NASA Astrophysics Data System (ADS)

    Abookasis, David; Shaul, Oren; Meitav, Omri; Pinhasi, Gadi A.

    2018-02-01

We used a spatially modulated optical imaging system to assess the effect of temperature elevation on intact brain tissue in a mouse heat-stress model. Heat stress, or heatstroke, is a medical emergency defined by abnormally elevated body temperature that causes biochemical, physiological, and hematological changes. During experiments, brain temperature was measured concurrently with a thermal camera while core body temperature was monitored with a rectal thermocouple probe. Changes in a battery of macroscopic brain physiological parameters, such as hemoglobin oxygen saturation level and cerebral water content, as well as intrinsic tissue optical properties, were monitored during temperature elevation. These concurrent changes reflect the pathophysiology of the brain during heat stress and demonstrate successful monitoring of thermoregulation mechanisms. In addition, the variation of tissue refractive index was calculated, showing a monotonic decrease with increasing wavelength. We found increased temperature to greatly affect both the scattering properties and the refractive index, which represent cellular and subcellular swelling indicative of neuronal damage. The overall trends detected in brain tissue parameters were consistent with previous observations using conventional medical devices and optical modalities.

  13. Early warning of footpad dermatitis and hockburn in broiler chicken flocks using optical flow, bodyweight and water consumption.

    PubMed

    Dawkins, M S; Roberts, S J; Cain, R J; Nickson, T; Donnelly, C A

    2017-05-20

    Footpad dermatitis and hockburn are serious welfare and economic issues for the production of broiler (meat) chickens. The authors here describe the use of an inexpensive camera system that monitors the movements of broiler flocks throughout their lives and suggest that it is possible to predict, even in young birds, the cross-sectional prevalence at slaughter of footpad dermatitis and hockburn before external signs are visible. The skew and kurtosis calculated from the authors' camera-based optical flow system had considerably more power to predict these outcomes in the 50 flocks reported here than water consumption, bodyweight or mortality and therefore have the potential to inform improved flock management through giving farmers early warning of welfare issues. Further trials are underway to establish the generality of the results. British Veterinary Association.
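    The skew and kurtosis used here are ordinary higher moments of the distribution of optical-flow rates across the flock images. A minimal sketch (the function name and the assumption of a flat array of precomputed per-region flow magnitudes are illustrative):

    ```python
    import numpy as np

    def flow_statistics(flow_magnitudes):
        """Skew and excess kurtosis of a distribution of optical-flow
        magnitudes, the flock-movement statistics used for early warning.
        Excess kurtosis is 0 for a Gaussian distribution."""
        x = np.asarray(flow_magnitudes, dtype=float).ravel()
        z = (x - x.mean()) / x.std()
        skew = np.mean(z ** 3)
        kurt = np.mean(z ** 4) - 3.0
        return skew, kurt
    ```

    Tracked over the life of a flock, drifts in these statistics can flag abnormal movement patterns before external signs of footpad dermatitis or hockburn are visible.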

  14. High Energy Replicated Optics to Explore the Sun: Hard X-Ray Balloon-Borne Telescope

    NASA Technical Reports Server (NTRS)

Gaskin, Jessica; Apple, Jeff; StevensonChavis, Katherine; Dietz, Kurt; Holt, Marlon; Koehler, Heather; Lis, Tomasz; O'Connor, Brian; RodriquezOtero, Miguel; Pryor, Jonathan

    2013-01-01

Set to fly in the Fall of 2013 from Ft. Sumner, NM, the High Energy Replicated Optics to Explore the Sun (HEROES) mission is a collaborative effort between the NASA Marshall Space Flight Center and the Goddard Space Flight Center to upgrade an existing payload, the High Energy Replicated Optics (HERO) balloon-borne telescope, to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES science payload consists of 8 mirror modules, housing a total of 109 grazing-incidence optics. These modules are mounted on a carbon-fiber and aluminum optical bench 6 m from a matching array of high-pressure xenon gas scintillation proportional counters, which serve as the focal-plane detectors. The HERO gondola utilizes a differential GPS system (backed by a magnetometer) for coarse pointing in azimuth, and a shaft angle encoder plus inclinometer provides the coarse elevation. The HEROES payload will incorporate a new solar aspect system to supplement the existing star camera, for fine pointing during both the day and night. A mechanical shutter will be added to the star camera to protect it during solar observations. HEROES will also implement two novel alignment monitoring systems that will measure the alignment between the optical bench and the star camera and between the optics and detectors for improved pointing and post-flight data reconstruction. The overall payload will also be discussed. This mission is funded by the NASA HOPE (Hands On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.

  15. High Energy Replicated Optics to Explore the Sun: Hard X-ray balloon-borne telescope

    NASA Astrophysics Data System (ADS)

    Gaskin, J.; Apple, J.; Chavis, K. S.; Dietz, K.; Holt, M.; Koehler, H.; Lis, T.; O'Connor, B.; Otero, M. R.; Pryor, J.; Ramsey, B.; Rinehart-Dawson, M.; Smith, L.; Sobey, A.; Wilson-Hodge, C.; Christe, S.; Cramer, A.; Edgerton, M.; Rodriguez, M.; Shih, A.; Gregory, D.; Jasper, J.; Bohon, S.

Set to fly in the Fall of 2013 from Ft. Sumner, NM, the High Energy Replicated Optics to Explore the Sun (HEROES) mission is a collaborative effort between the NASA Marshall Space Flight Center and the Goddard Space Flight Center to upgrade an existing payload, the High Energy Replicated Optics (HERO) balloon-borne telescope, to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES science payload consists of 8 mirror modules, housing a total of 109 grazing-incidence optics. These modules are mounted on a carbon-fiber and aluminum optical bench 6 m from a matching array of high-pressure xenon gas scintillation proportional counters, which serve as the focal-plane detectors. The HERO gondola utilizes a differential GPS system (backed by a magnetometer) for coarse pointing in azimuth, and a shaft angle encoder plus inclinometer provides the coarse elevation. The HEROES payload will incorporate a new solar aspect system to supplement the existing star camera, for fine pointing during both the day and night. A mechanical shutter will be added to the star camera to protect it during solar observations. HEROES will also implement two novel alignment monitoring systems that will measure the alignment between the optical bench and the star camera and between the optics and detectors for improved pointing and post-flight data reconstruction. The overall payload will also be discussed. This mission is funded by the NASA HOPE (Hands On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.

  16. LSST Camera Optics Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riot, V J; Olivier, S; Bauman, B

    2012-05-24

The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  17. Multi-channel measurement for hetero-core optical fiber sensor by using CMOS camera

    NASA Astrophysics Data System (ADS)

    Koyama, Yuya; Nishiyama, Michiko; Watanabe, Kazuhiro

    2015-07-01

Fiber optic smart structures have been developed over several decades alongside advances in fiber optic sensor technology. Optical intensity-based sensors, which use LDs or LEDs, are well suited to monitoring systems that must be simple and cost effective. In this paper, a novel fiber optic smart structure with human-like perception is demonstrated using an intensity-based hetero-core optical fiber sensor system with a CMOS detector. The optical intensity from each hetero-core optical fiber bend sensor appears as a luminance spot in the optical power distribution, and a number of such spots are read out simultaneously by capturing a single image of the luminance pattern. To recognize the state of the fiber optic smart structure, template matching with the Sum of Absolute Differences (SAD) is employed. A fiber optic smart glove having five optical fiber nerves has been employed to monitor hand postures, and three kinds of hand postures have been recognized by means of the template matching process. Body posture monitoring has also been developed by placing wearable hetero-core optical fiber bend sensors on the body segments. To make the CMOS system human brain-like, the luminescent spots in the captured image were arranged into a pattern corresponding to the positions of the body segments. As a result, it was successfully demonstrated that the proposed fiber optic smart structure can recognize eight kinds of body postures. The developed system will give a capability of human brain-like processing to existing fiber optic smart structures.
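    The SAD classification step can be sketched as follows (a minimal illustration; the function name and the flat spot-intensity vectors are assumptions, not the authors' code):

    ```python
    import numpy as np

    def best_match_sad(pattern, templates):
        """Classify an observed luminance-spot pattern by choosing the
        stored template with the minimum Sum of Absolute Differences."""
        sads = [np.abs(np.asarray(pattern, float) - np.asarray(t, float)).sum()
                for t in templates]
        return int(np.argmin(sads))
    ```

    With one stored template per hand or body posture, the index of the minimum-SAD template identifies the recognized posture.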

  18. Adjustment of multi-CCD-chip-color-camera heads

    NASA Astrophysics Data System (ADS)

    Guyenot, Volker; Tittelbach, Guenther; Palme, Martin

    1999-09-01

The principle of beam-splitter multi-chip cameras consists of splitting an image into multiple images of different spectral ranges and distributing these onto separate black-and-white CCD sensors. The resulting electrical signals from the chips are recombined to produce a high-quality color picture on the monitor. Because this principle guarantees higher resolution and sensitivity than conventional single-chip camera heads, the greater effort is acceptable. Furthermore, multi-chip cameras obtain the complete spectral information for each individual object point, while single-chip systems must rely on interpolation. In a joint project, Fraunhofer IOF and STRACON GmbH (and, in future, COBRA electronic GmbH) are developing methods for designing the optics and dichroic mirror system of such prism color beam-splitter devices. Additionally, techniques and equipment for the alignment and assembly of color beam-splitter multi-CCD devices, based on gluing with UV-curable adhesives, have been developed.

  19. Dark Energy Camera for Blanco

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Binder, Gary A. (Caltech / SLAC)

    2010-08-25

In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.

  20. STREAK CAMERA MEASUREMENTS OF THE APS PC GUN DRIVE LASER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dooling, J. C.; Lumpkin, A. H.

We report recent pulse-duration measurements of the APS PC gun drive laser at both the second-harmonic and fourth-harmonic wavelengths. The drive laser is a Nd:glass-based chirped pulse amplifier (CPA) operating at an IR wavelength of 1053 nm, twice frequency-doubled to obtain UV output for the gun. A Hamamatsu C5680 streak camera and an M5675 synchroscan unit are used for these measurements; the synchroscan unit is tuned to 119 MHz, the 24th subharmonic of the linac S-band operating frequency. Calibration is accomplished both electronically and optically. Electronic calibration utilizes a programmable delay line in the 119 MHz rf path. The optical delay uses an etalon with known spacing between reflecting surfaces, coated for the visible (SH) wavelength. The IR pulse duration is monitored with an autocorrelator. Fitting the projected profiles of the streak camera images with Gaussians, the UV rms pulse durations are found to vary from 2.1 ps to 3.5 ps as the IR varies from 2.2 ps to 5.2 ps.
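    For a Gaussian profile, the rms pulse duration quoted above equals the Gaussian sigma, so it can be estimated either by a Gaussian fit or, equivalently, from the intensity-weighted second moment of the projected profile. A minimal moment-based sketch (the time calibration ps_per_pixel is an assumed input, and only a crude baseline removal is shown):

    ```python
    import numpy as np

    def rms_duration(profile, ps_per_pixel):
        """RMS pulse duration from the projection of a streak-camera
        image onto the time axis; for a Gaussian profile the
        intensity-weighted rms equals the Gaussian sigma."""
        t = np.arange(len(profile)) * ps_per_pixel
        w = np.asarray(profile, float)
        w = w - w.min()                      # crude baseline removal
        mean = np.average(t, weights=w)
        return np.sqrt(np.average((t - mean) ** 2, weights=w))
    ```

    In practice a fitted Gaussian is more robust against background and wings than raw moments, which is presumably why the authors fit the profiles.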

  1. UAV-based NDVI calculation over grassland: An alternative approach

    NASA Astrophysics Data System (ADS)

    Mejia-Aguilar, Abraham; Tomelleri, Enrico; Asam, Sarah; Zebisch, Marc

    2016-04-01

The Normalised Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring and assessing vegetation in remote sensing. The index relies on the reflectance difference between near-infrared (NIR) and red light and is thus able to track variations in structural, phenological, and biophysical parameters for seasonal and long-term monitoring. Conventionally, NDVI is inferred from space-borne spectroradiometers such as MODIS, with moderate ground resolutions of 250 m at best. In recent years, a new generation of miniaturized radiometers and integrated hyperspectral sensors with high resolution has become available. Such small and light instruments are particularly well suited for mounting on unmanned aerial vehicles (UAVs) used for monitoring services, reaching ground sampling resolutions on the order of centimetres. Nevertheless, such miniaturized radiometers and hyperspectral sensors are still very expensive and entail high upfront capital costs. Therefore, we propose an alternative, considerably cheaper method to calculate NDVI using a camera constellation consisting of two conventional consumer-grade cameras: (i) a modified Ricoh GR camera that acquires the NIR spectrum, with the internal infrared filter removed and a mounted optical filter additionally obstructing all wavelengths below 700 nm; (ii) a Ricoh GR in RGB configuration using two optical filters to block wavelengths below 600 nm as well as NIR and ultraviolet (UV) light. To assess the merit of the proposed method, we carry out two comparisons. First, reflectance maps generated by the consumer-grade camera constellation are compared to reflectance maps produced with a hyperspectral camera (Rikola). All imaging data and reflectance maps are processed using the PIX4D software. 
In the second test, the NDVI at specific points of interest (POIs) generated by the consumer-grade camera constellation is compared to NDVI values obtained from ground spectral measurements using a portable spectroradiometer (Spectravista SVC HR-1024i). All data were collected on a dry alpine mountain grassland site in the Matsch valley, Italy, during the vegetation period of 2015. Data acquisition for the first comparison followed a pre-programmed flight plan in which the hyperspectral camera and the alternative dual-camera constellation were mounted separately on an octocopter UAV during two consecutive flight campaigns. Ground spectral measurements were collected on the same site and on the same dates (three in total) as the flight campaigns. The proposed technique achieves promising results and thus constitutes a cheap and simple way of collecting spatially explicit information on vegetated areas, even in challenging terrain.
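    Once the NIR and red reflectance maps from the two cameras are co-registered, the NDVI itself is a simple per-pixel band ratio. A minimal sketch (array names are illustrative; the small epsilon guarding division by zero is an added assumption):

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Per-pixel NDVI = (NIR - red) / (NIR + red) from co-registered
        NIR and red reflectance maps, such as those produced by the
        two-camera (NIR-modified + RGB) constellation."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red + 1e-12)
    ```

    Values near 1 indicate dense green vegetation; bare soil and water fall near or below zero.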

  2. Dynamic light scattering microscopy

    NASA Astrophysics Data System (ADS)

    Dzakpasu, Rhonda

An optical microscope technique, dynamic light scattering microscopy (DLSM), that images the decay rates of dynamically scattered light fluctuations is introduced. Using physical optics, we show theoretically that, within the optical resolution of the microscope, relative motions between scattering centers are sufficient to produce significant phase variations, resulting in interference intensity fluctuations in the image plane. The time scale for these intensity fluctuations is predicted. The spatial coherence distance, defining the average distance between constructive and destructive interference in the image plane, is calculated and compared with the pixel size. We experimentally tested DLSM on polystyrene latex nanospheres and living macrophage cells. In order to record these rapid fluctuations on a slow progressive-scan CCD camera, we used a thin laser line of illumination on the sample, such that only a single column of pixels in the CCD camera is illuminated. This allowed the rate of the column-by-column readout transfer process to serve as the acquisition rate of the camera, increasing the data acquisition rate by at least an order of magnitude compared with conventional CCD camera rates defined in frames/s. Analysis of the observed fluctuations provides information about the rates of motion of the scattering centers. These rates, acquired from each position on the sample, are used to create a spatial map of the fluctuation decay rates. Our experiments show that with this technique we are able to achieve a good signal-to-noise ratio and can monitor fast intensity fluctuations, on the order of milliseconds. DLSM appears to provide dynamic information about fast motions within cells at a sub-optical-resolution scale and provides a new kind of spatial contrast.
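    The per-pixel quantity DLSM maps is a fluctuation decay rate. As a hedged sketch (assuming a single-exponential autocorrelation; the function name and log-slope fit are illustrative, not the authors' analysis code), the rate can be estimated from the log-slope of the intensity autocorrelation over the first few lags:

    ```python
    import numpy as np

    def fluctuation_decay_rate(intensity, dt, nlags=5):
        """Estimate the decay rate of an intensity-fluctuation time trace
        from the log-slope of its autocorrelation over the first nlags
        lags, assuming g(tau) ~ exp(-rate * tau)."""
        x = np.asarray(intensity, float)
        x = x - x.mean()
        n = len(x)
        # autocorrelation at lags 0..nlags-1 (direct, cheap for few lags)
        ac = np.array([np.dot(x[:n - k], x[k:]) / (n - k) for k in range(nlags)])
        ac = ac / ac[0]                      # normalize so g(0) = 1
        taus = np.arange(nlags) * dt
        slope = np.polyfit(taus, np.log(ac), 1)[0]
        return -slope                        # decay rate, in 1/time units
    ```

    Applying this to the trace from each illuminated pixel column yields the spatial map of decay rates described above.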

  3. Simultaneous Monitoring of Ballistocardiogram and Photoplethysmogram Using Camera

    PubMed Central

    Shao, Dangdang; Tsow, Francis; Liu, Chenbin; Yang, Yuting; Tao, Nongjian

    2017-01-01

We present a noncontact method to measure Ballistocardiogram (BCG) and Photoplethysmogram (PPG) simultaneously using a single camera. The method tracks the motion of facial features to determine the displacement BCG, and extracts the corresponding velocity and acceleration BCGs by taking the first and second temporal derivatives of the displacement BCG, respectively. The measured BCG waveforms are consistent with those reported in the literature and also with those recorded with an accelerometer-based reference method. The method also tracks PPG based on the reflected light from the same facial region, which makes it possible to track both BCG and PPG with the same optics. We verify the robustness and reproducibility of the noncontact method in a small pilot study with 23 subjects. The presented method is the first demonstration of simultaneous BCG and PPG monitoring without the subject wearing any extra equipment or marker. PMID:27362754
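    The derivative chain described above (displacement BCG to velocity and acceleration BCGs) is straightforward to sketch; the frame rate argument and function name are assumptions for illustration:

    ```python
    import numpy as np

    def bcg_from_displacement(displacement, fps):
        """Velocity and acceleration BCG waveforms as the first and
        second temporal derivatives of a camera-tracked displacement
        BCG, sampled at fps frames per second."""
        dt = 1.0 / fps
        velocity = np.gradient(displacement, dt)      # first derivative
        acceleration = np.gradient(velocity, dt)      # second derivative
        return velocity, acceleration
    ```

    In practice the displacement trace would be low-pass filtered before differentiation, since differentiation amplifies high-frequency tracking noise.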

  4. Research on a solid-state streak camera based on an electro-optic crystal

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang

    2006-06-01

With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, streak cameras are widely utilized in measuring ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and making measurements of laser-induced discharge. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity vs. position (or wavelength) information on the ultrafast process. Current streak cameras are based on a sweep electric pulse and an image-converting tube with a wavelength-sensitive photocathode covering the x-ray to near-infrared region. This kind of streak camera is comparatively costly and complex. This paper describes the design and performance of a new type of streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera achieves time resolution by direct photon-beam deflection using the electro-optic effect, and can replace the current streak camera from the visible to the near-infrared region. After computer-aided simulation, we designed a crystal streak camera with a potential time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits, a crystal with a larger electro-optic coefficient, for example LN (γ33 = 33.6 × 10⁻¹² m/V), and an optimized optic system may lead to a time resolution better than 1 ns.

  5. Real-time biodetection using a smartphone-based dual-color surface plasmon resonance sensor

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Yuan, Huizhen; Liu, Yun; Wang, Jiabin; Jing, Zhenguo; Peng, Wei

    2018-04-01

We propose a compact and cost-effective red-green dual-color fiber optic surface plasmon resonance (SPR) sensor based on a smartphone. The inherent color selectivity of phone cameras is utilized for real-time monitoring of the red and green color channels simultaneously, which can reduce the chance of false detection and improve the sensitivity. Because there are no external prisms, complex optical lenses, or diffraction gratings, a simple optical configuration is realized. The sensor has a linear response in a refractive index range of 1.326 to 1.351 (R2 = 0.991) with a resolution of 2.3 × 10⁻⁴ RIU. We apply it to immunoglobulin G (IgG) concentration measurement. Experimental results demonstrate that a linear SPR response was achieved for IgG concentrations varying from 0.02 to 0.30 mg/ml with good repeatability. The sensor may find promising applications in the fields of public health and environmental monitoring owing to its simple optics design and applicability to real-time, label-free biodetection.

  6. Fiber optic interferometry for industrial process monitoring and control applications

    NASA Astrophysics Data System (ADS)

    Marcus, Michael A.

    2002-02-01

Over the past few years we have been developing applications for a high-resolution (sub-micron accuracy), fiber-optic-coupled, dual Michelson interferometer-based instrument. It is being utilized in a variety of applications, including monitoring liquid layer thickness uniformity on coating hoppers, film base thickness uniformity measurement, digital camera focus assessment, optical cell path length assessment, and imager and wafer surface profile mapping. The instrument includes both coherent and non-coherent light sources, custom application-dependent optical probes and sample interfaces, a Michelson interferometer, custom electronics, a Pentium-based PC with data acquisition cards, and LabWindows/CVI- or LabVIEW-based application-specific software. This paper describes the development evolution of this instrument platform and its applications, highlighting robust instrument design, hardware, software, and user interface development. The talk concludes with a discussion of a new high-speed instrument configuration, which can be utilized for high-speed surface profiling and as an on-line web thickness gauge.

  7. piscope - A Python based software package for the analysis of volcanic SO2 emissions using UV SO2 cameras

    NASA Astrophysics Data System (ADS)

    Gliss, Jonas; Stebel, Kerstin; Kylling, Arve; Solvejg Dinger, Anna; Sihler, Holger; Sudbø, Aasmund

    2017-04-01

UV SO2 cameras have become a common method for monitoring SO2 emission rates from volcanoes. Scattered solar UV radiation is measured in two wavelength windows, typically around 310 nm and 330 nm (strong and weak SO2 absorption, respectively), using interference filters. The data analysis comprises the retrieval of plume background intensities (to calculate plume optical densities), the camera calibration (to convert optical densities into SO2 column densities), and the retrieval of gas velocities within the plume as well as of plume distances. SO2 emission rates are then typically retrieved along a projected plume cross-section, for instance a straight line perpendicular to the plume propagation direction. Today, several alternatives exist for most of the required analysis steps, owing to ongoing developments and improvements in the measurement technique. We present piscope, a cross-platform, open-source software toolbox for the analysis of UV SO2 camera data. The code is written in the Python programming language and emerged from the idea of a common analysis platform incorporating a selection of the most prevalent methods found in the literature. piscope includes several routines for plume background retrieval and routines for cell- and DOAS-based camera calibration, including two individual methods to identify the DOAS field of view (shape and position) within the camera images. Gas velocities can be retrieved either based on an optical flow analysis or using signal cross-correlation. A correction for signal dilution (due to atmospheric scattering) can be performed based on topographic features in the images; this requires retrievals of the distances to the topographic features used for the correction. These distances can be retrieved automatically on a per-pixel basis using intersections of individual pixel viewing directions with the local topography. The main features of piscope are presented based on a dataset recorded at Mt. Etna, Italy, in September 2015.
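    The cross-correlation velocity retrieval mentioned above can be sketched as follows (a simplified illustration, not piscope's API: two SO2 time series are extracted at image lines a known distance apart along the plume axis, and the lag maximizing their cross-correlation gives the travel time):

    ```python
    import numpy as np

    def plume_velocity(sig_a, sig_b, line_distance_m, fps):
        """Plume velocity from signal cross-correlation. sig_a and sig_b
        are SO2 (or intensity) time series at an upstream and a
        downstream image line separated by line_distance_m; the lag
        maximizing their cross-correlation is the plume travel time."""
        a = np.asarray(sig_a, float)
        b = np.asarray(sig_b, float)
        a = a - a.mean()
        b = b - b.mean()
        corr = np.correlate(b, a, mode="full")
        lag = np.argmax(corr) - (len(a) - 1)   # frames by which b trails a
        if lag <= 0:
            return np.nan                      # no forward propagation found
        return line_distance_m / (lag / fps)
    ```

    The optical-flow alternative instead estimates a full velocity field from consecutive image pairs, at the cost of sensitivity to contrast and illumination.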

  8. Monitoring the spatial and temporal evolution of slope instability with Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Manconi, Andrea; Glueer, Franziska; Loew, Simon

    2017-04-01

    The identification and monitoring of ground deformation is important for an appropriate analysis and interpretation of unstable slopes. Displacements are usually monitored with in-situ techniques (e.g., extensometers, inclinometers, geodetic leveling, tachymeters and D-GPS) and/or active remote sensing methods (e.g., LiDAR and radar interferometry). In particular situations, however, the choice of the appropriate monitoring system is constrained by site-specific conditions. Slope areas can be very remote and/or affected by rapid surface changes, and thus hardly accessible, and often unsafe, for field installations. In many cases the use of remote sensing approaches may also be hindered by unsuitable acquisition geometries, poor spatial resolution and revisit times, and/or high costs. The increasing availability of digital imagery acquired from terrestrial photo and video cameras nowadays provides an additional source of data. Such imagery can be exploited not only to visually identify changes of the scene occurring over time, but also to quantify the evolution of surface displacements. Image processing analyses such as Digital Image Correlation (also known as pixel-offset or feature-tracking) have been demonstrated to provide a suitable alternative for detecting and monitoring surface deformation at high spatial and temporal resolutions. However, a number of intrinsic limitations have to be considered when dealing with optical imagery acquisition and processing, including the effects of light conditions, shadowing, and/or meteorological variables. Here we propose an algorithm to automatically select and process images acquired from time-lapse cameras. We aim at maximizing the results obtainable from large datasets of digital images acquired under different light and meteorological conditions, and at retrieving accurate information on the evolution of surface deformation.
We show a successful application of our approach in the Swiss Alps, more specifically in the Great Aletsch area, where slope instability was recently reactivated by progressive glacier retreat. At this location, time-lapse cameras have been installed over the last two years, ranging from low-cost, low-resolution webcams to more expensive high-resolution reflex cameras. Our results confirm that time-lapse cameras provide quantitative and accurate measurements of the evolution of surface deformation over space and time, especially in situations where other monitoring instruments fail.
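    The pixel-offset principle behind Digital Image Correlation can be illustrated with a minimal phase-correlation sketch. This is a generic stand-in written for this summary, not the authors' algorithm; it recovers integer-pixel shifts between two frames (real DIC codes refine this to sub-pixel precision and apply it patch-wise to build a displacement field).

```python
import numpy as np

def phase_correlation_shift(ref, cur):
    """Integer-pixel (dy, dx) displacement of `cur` relative to `ref`,
    taken from the peak of the normalised cross-power spectrum."""
    R = np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))
    R /= np.abs(R) + 1e-12                      # keep phase information only
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2:                  # unwrap negative shifts
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# Synthetic example: a random texture shifted by (+3, -4) pixels.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
cur = np.roll(ref, (3, -4), axis=(0, 1))
shift = phase_correlation_shift(ref, cur)
```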

  9. Monitoring of degradation of porous silicon photonic crystals using digital photography

    PubMed Central

    2014-01-01

    We report the monitoring of porous silicon (pSi) degradation in aqueous solutions using a consumer-grade digital camera. To facilitate optical monitoring, the pSi samples were prepared as one-dimensional photonic crystals (rugate filters) by electrochemical etching of highly doped p-type Si wafers using a periodic etch waveform. Two pSi formulations, representing chemistries relevant for self-reporting drug delivery applications, were tested: freshly etched pSi (fpSi) and fpSi coated with the biodegradable polymer chitosan (pSi-ch). Accelerated degradation of the samples in an ethanol-containing pH 10 aqueous basic buffer was monitored in situ by digital imaging with a consumer-grade digital camera with simultaneous optical reflectance spectrophotometric point measurements. As the nanostructured porous silicon matrix dissolved, a hypsochromic shift in the wavelength of the rugate reflectance peak resulted in visible color changes from red to green. While the H coordinate in the hue, saturation, and value (HSV) color space calculated using the as-acquired photographs was a good monitor of degradation at short times (t < 100 min), it was not a useful monitor of sample degradation at longer times since it was influenced by reflections of the broad spectral output of the lamp as well as from the narrow rugate reflectance band. A monotonic relationship was observed between the wavelength of the rugate reflectance peak and an H parameter value calculated from the average red-green-blue (RGB) values of each image by first independently normalizing each channel (R, G, and B) using their maximum and minimum value over the time course of the degradation process. Spectrophotometric measurements and digital image analysis using this H parameter gave consistent relative stabilities of the samples as fpSi > pSi-ch. PMID:25242902
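    The H parameter described above can be sketched roughly as follows: the per-image mean R, G and B values are each normalised independently over the full time course before conversion to hue. This is a simplified reconstruction from the description, not the authors' code.

```python
import numpy as np
import colorsys

def normalized_hue_series(mean_rgb):
    """Hue (0..1) per image after normalising each channel independently
    by its min/max over the whole degradation time course.
    `mean_rgb` is a (T, 3) array of per-image average R, G, B values."""
    rgb = np.asarray(mean_rgb, dtype=float)
    lo, hi = rgb.min(axis=0), rgb.max(axis=0)
    norm = (rgb - lo) / (hi - lo + 1e-12)       # per-channel min/max scaling
    return np.array([colorsys.rgb_to_hsv(*p)[0] for p in norm])

# Synthetic red-to-green fade, mimicking the rugate peak shift described above.
h = normalized_hue_series([[200, 10, 10], [150, 60, 10],
                           [100, 110, 10], [50, 160, 10]])
```

    The resulting hue rises monotonically from 0 (red) toward 1/3 (green) as the sample "degrades", which is the monotonic relationship the abstract reports.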

  10. Transport infrastructure monitoring: A ground based optical displacement monitoring system, field tests on a bridge, the Musmeci's bridge in Potenza, Italy.

    NASA Astrophysics Data System (ADS)

    Hagene, J. K.

    2012-04-01

    A ground-based optical displacement monitoring system, "NIODIM", is being developed by Norsk Elektro Optikk in the framework of the activities of the European project "Integrated System for Transport Infrastructure surveillance and Monitoring by Electromagnetic Sensing" (ISTIMES), funded in the 7th Framework Programme (FP7/2007-2013). The optical displacement monitoring system has now participated in two real-life field campaigns, one in Switzerland and one in Italy. The latter, the tests in Potenza, Italy, is presented in the following. The NIODIM system has undergone some development during the last year to adapt it for use in a somewhat higher frequency domain by changing the camera sensor, making it more useful for monitoring structures with oscillation frequencies of tens of Hz. The original system was intended to a large extent to monitor landslides, quick-clay and rock slides and similar phenomena, which typically have a relatively slow time response. The system has been significantly sped up from the original 12 Hz: the current tests have been performed at a frame rate of 64 Hz, i.e. the camera and data processing unit have been running at 64 Hz. In connection with the tests in Italy, the data processing has been upgraded to include sub-pixel resolution, i.e. the measurement results are no longer limited by pixel borders or single pixels. The main part of the NIODIM system is a camera capable of operating at a sufficiently high frame rate. This camera is typically mounted on firm ground and images and monitors a reference point, typically a light emitting diode (LED), mounted on the object liable to move. A processing unit acquires the images from the camera, finds the position of the LED in each image, compares it to threshold values and, if required, raises a warning or an alarm.
The NIODIM system can either be a standalone system or an integrated part of the overall ISTIMES system, the ISTIMES system being a decision support system. Field trials as part of the ISTIMES project took place in Potenza, Italy, for a week in July 2011. The test target was Musmeci's bridge, a bridge with a design in which aesthetic values have been just as important as traditional civil engineering aspects. Several technologies and techniques were tested at the same part of the bridge to allow for data correlation between different sensors. The camera and processing parts of the optical displacement monitoring system were mounted on a concrete wall at one end of the bridge, while the LED reference points were mounted on the bridge approximately 40 metres away. The tests at Musmeci's bridge were successful and verified some of the findings from the tests in Switzerland. However, we learned a lesson with regard to the temporary mounting of the reference points using glossy stainless steel parts. For a short period early in the morning, when illuminated by the sun, these stainless steel parts were just as bright as the LED reference point, leading to potential noise in the measurements. Because the raw data were available, this could be fixed later by post-processing the stored data. One of the findings was a relatively large time-of-day variation that appears to be periodic, with a cycle time of about 24 hours, at least under similar weather conditions. These displacements are on the order of 10 mm and are probably due to thermal effects. Several shorter displacements have also been registered, with amplitudes of a couple of mm and durations of around 10 seconds. These shorter displacement peaks appear to be caused by heavy vehicles passing over the bridge. The introduction of sub-pixel processing looks very promising and appears to give a significant improvement in the actual resolution of the system.
Even though the field measurements were completed successfully, we have noted larger slowly varying displacements than originally expected. Combined with shorter-lasting peaks, these could lead to measurements above pre-set thresholds and, in turn, to a raised alarm. Such an alarm would most likely be regarded as a false alarm caused by the superposition of the long-time-constant thermal displacement and a short-time-constant peak, possibly due to a vehicle. These results have made us rethink our system for handling warnings and alarms: there must be different thresholds for slow events, for quick events, and for combinations thereof. Taking these lessons learned into consideration, our optical displacement monitoring system has the potential to be a reliable and robust system that solves the problem it was intended to solve. Acknowledgement: The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663.
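    The sub-pixel localisation of a bright reference point such as the LED can be done, for example, with an intensity-weighted centroid over the pixels above a threshold. This is a generic technique shown for illustration; the actual NIODIM sub-pixel processing is not described in the abstract.

```python
import numpy as np

def led_centroid(img, threshold=0.5):
    """Sub-pixel (y, x) position of a bright spot: centre of mass of the
    pixels at or above `threshold` times the peak intensity."""
    img = np.asarray(img, dtype=float)
    w = np.where(img >= threshold * img.max(), img, 0.0)
    ys, xs = np.indices(img.shape)
    total = w.sum()
    return (ys * w).sum() / total, (xs * w).sum() / total

# Synthetic LED spot centred between pixels at (12.3, 20.7).
yy, xx = np.indices((32, 40))
spot = np.exp(-((yy - 12.3) ** 2 + (xx - 20.7) ** 2) / 8.0)
cy, cx = led_centroid(spot, threshold=0.05)
```

    The recovered position lands well within a tenth of a pixel of the true spot centre, which is the kind of improvement over single-pixel resolution the abstract reports.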

  11. A continuous hyperspatial monitoring system of evapotranspiration and gross primary productivity from Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Wang, Sheng; Bandini, Filippo; Jakobsen, Jakob; Zarco-Tejada, Pablo J.; Köppl, Christian Josef; Haugård Olesen, Daniel; Ibrom, Andreas; Bauer-Gottwein, Peter; Garcia, Monica

    2017-04-01

    Unmanned Aerial Systems (UAS) can collect optical and thermal hyperspatial (<1 m) imagery at low cost and with flexible revisit times, even under cloudy conditions. The reflectance and radiometric temperature signatures of the land surface, closely linked with vegetation structure and functioning, are already used in models to predict Evapotranspiration (ET) and Gross Primary Productivity (GPP) from satellites. However, challenges remain for operational monitoring with UAS compared to satellites: the payload capacity of most commercial UAS is less than 2 kg, miniaturized sensors have low signal-to-noise ratios, and their small field of view requires mosaicking hundreds of images with accurate orthorectification. In addition, wind gusts and lower platform stability require appropriate geometric and radiometric corrections. Finally, modeling fluxes on days without images is still an issue for both satellite and UAS applications. This study focuses on designing an operational UAS-based monitoring system, including payload design and sensor calibration, based on the routine collection of optical and thermal images over a Danish willow field to jointly monitor ET and GPP dynamics continuously at daily time steps. The payload (<2 kg) consists of a multispectral camera (Tetracam Mini-MCA6), a thermal infrared camera (FLIR Tau 2), a digital camera (Sony RX-100) used to retrieve accurate digital elevation models (DEMs) for multispectral and thermal image orthorectification, and either a standard single-frequency GNSS receiver (UBlox) or a real-time kinematic dual-frequency system (Novatel Inc. flexpack6+OEM628). Geometric calibration of the digital and multispectral cameras was conducted to recover the intrinsic camera parameters. After geometric calibration, accurate DEMs with vertical errors of about 10 cm could be retrieved.
Radiometric calibration of the multispectral camera was conducted with an integrating sphere (Labsphere CSTM-USS-2000C); the laboratory calibration showed that the camera-measured radiance had a bias within ±4.8%. The thermal camera was calibrated using a black body at varying target and ambient temperatures, yielding a laboratory accuracy (RMSE) of 0.95 K. A joint model of ET and GPP was applied using two parsimonious, physiologically based models: a modified version of the Priestley-Taylor Jet Propulsion Laboratory model (Fisher et al., 2008; Garcia et al., 2013) and a Light Use Efficiency approach (Potter et al., 1993). Both models estimate ET and GPP under optimum potential conditions, down-regulated by the same biophysical constraints, which depend on remote sensing and atmospheric data to reflect multiple stresses. Vegetation indices were calculated from the multispectral data to assess vegetation conditions, while the thermal infrared imagery was used to compute a thermal inertia index to infer soil moisture constraints. To interpolate radiometric temperature between flights, a prognostic Surface Energy Balance model (Margulis et al., 2001) based on the force-restore method was applied in a data assimilation scheme to obtain continuous ET and GPP fluxes. With this operational system, regular flight campaigns with a hexacopter (DJI S900) were conducted at a Danish willow flux site (Risø) over the 2016 growing season. The observed energy, water and carbon fluxes from the Risø eddy covariance flux tower were used to validate the model simulation. This UAS monitoring system is suitable for agricultural management and land-atmosphere interaction studies.
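    As background to the first of the two models, the Priestley-Taylor potential latent heat flux that PT-JPL-type models down-regulate can be written λE = α Δ/(Δ + γ) (Rn - G). A minimal sketch using standard textbook constants follows; it is illustrative only, not the authors' implementation.

```python
import math

def priestley_taylor_le(rn, g, t_air_c, alpha=1.26, gamma=0.066):
    """Potential latent heat flux (W m-2) from net radiation `rn`, soil heat
    flux `g` (both W m-2) and air temperature in deg C. Uses the standard
    slope of the saturation vapour pressure curve (kPa/K) and a typical
    psychrometric constant gamma (kPa/K)."""
    es = 0.6108 * math.exp(17.27 * t_air_c / (t_air_c + 237.3))  # kPa
    delta = 4098.0 * es / (t_air_c + 237.3) ** 2                 # kPa/K
    return alpha * delta / (delta + gamma) * (rn - g)

# Example: Rn = 400 W m-2, G = 50 W m-2, air at 20 deg C.
le = priestley_taylor_le(rn=400.0, g=50.0, t_air_c=20.0)
```

    The model's role for remote sensing data is then to scale this potential flux down with vegetation and soil-moisture constraints derived from the imagery.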

  12. Surveying the earth from 20,000 miles

    USGS Publications Warehouse

    Colvocoresses, A.P.

    1970-01-01

    Current space programs aimed at monitoring the earth's resources concentrate on the lower orbital altitudes of 100 to 500 nautical miles. An earth-synchronous (geo-stationary) orbit is 19,400 n. mi. above the earth. A powerful telephoto camera at such a location can monitor and record many time-variant phenomena far more effectively than instruments at lower altitudes. The geo-stationary system's characteristics and problem areas related to optics and telemetry are outlined and detailed, and on-going programs are discussed as they relate to the geo-stationary system.

  13. Small diameter, deep bore optical inspection system

    DOEpatents

    Lord, David E.; Petrini, Richard R.; Carter, Gary W.

    1981-01-01

    An improved rod optic system for inspecting small diameter, deep bores. The system consists of a rod optic system utilizing a curved mirror at the end of the rod lens such that the optical path through the system is bent 90° to minimize optical distortion in examining the sides of a curved bore. The system is particularly useful in the examination of small bores for corrosion, and is capable of examining drill holes 1/16 inch in diameter and up to 4 inches deep, for example. The positioning of the curved mirror allows simultaneous viewing, from shallow and right-angle points of observation, of the same artifact (such as corrosion) in the bore hole. The improved rod optic system may be used for direct eye sighting, or in combination with a still camera or a low-light television monitor, particularly low-light color television.

  14. High-speed polarized light microscopy for in situ, dynamic measurement of birefringence properties

    NASA Astrophysics Data System (ADS)

    Wu, Xianyu; Pankow, Mark; Shadow Huang, Hsiao-Ying; Peters, Kara

    2018-01-01

    A high-speed, quantitative polarized light microscopy (QPLM) instrument has been developed to monitor the spatial realignment of the optical slow axis during controlled medium- to high-strain-rate experiments at acquisition rates up to 10 kHz. The high-speed QPLM instrument is implemented within a modified drop tower and demonstrated using polycarbonate specimens. By utilizing a rotating quarter-wave plate and a high-speed camera, the minimum acquisition time to generate an alignment map of a birefringent specimen is 6.1 ms. A sequential analysis method allows the instrument to generate QPLM data at the high-speed camera's imaging frequency of 10 kHz. The obtained QPLM data are processed using a vector correlation technique to detect anomalous optical axis realignment and retardation changes throughout the loading event. The detected anomalous optical axis realignment is shown to be associated with crack initiation, propagation, and specimen failure in a dynamically loaded polycarbonate specimen. The work provides a foundation for detecting damage in biological tissues through local collagen fiber realignment and fracture during dynamic loading.
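    A strongly simplified form of the vector-correlation idea, comparing two optical-axis orientation maps, might look like the following. Orientations are axial, i.e. defined modulo π, so the angles are doubled before comparison; this is an illustrative stand-in, not the instrument's actual analysis.

```python
import numpy as np

def axis_correlation(theta_ref, theta_cur):
    """Similarity of two slow-axis orientation maps: 1.0 for identical
    alignment, -1.0 for everywhere-orthogonal realignment. Angles are in
    radians; doubling them handles the axial (period-pi) symmetry."""
    d = 2.0 * (np.asarray(theta_cur) - np.asarray(theta_ref))
    return float(np.mean(np.cos(d)))

theta = np.linspace(0.0, np.pi, 50)                   # a reference orientation map
same = axis_correlation(theta, theta)                 # identical fields
flipped = axis_correlation(theta, theta + np.pi / 2)  # orthogonal realignment
```

    A sudden drop of this correlation between consecutive frames would flag the kind of anomalous realignment the abstract associates with crack initiation.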

  15. On-ground and in-orbit characterisation plan for the PLATO CCD normal cameras

    NASA Astrophysics Data System (ADS)

    Gow, J. P. D.; Walton, D.; Smith, A.; Hailey, M.; Curry, P.; Kennedy, T.

    2017-11-01

    PLAnetary Transits and Oscillations of stars (PLATO) is the third European Space Agency (ESA) medium-class mission in ESA's Cosmic Vision programme, due for launch in 2026. PLATO will carry out high-precision, uninterrupted photometric monitoring in the visible band of large samples of bright solar-type stars. The primary mission goal is to detect and characterise terrestrial exoplanets and their systems, with emphasis on planets orbiting in the habitable zone; this will be achieved using light curves to detect planetary transits. PLATO uses a novel multi-instrument concept consisting of 26 small wide-field cameras. Each camera is made up of a telescope optical unit and four Teledyne e2v CCD270s mounted on a focal plane array and connected to a set of Front End Electronics (FEE) which provide CCD control and readout. There are 2 fast cameras with a high read-out cadence (2.5 s) for magnitude ~4-8 stars, being developed by the German Aerospace Centre, and 24 normal (N) cameras with a cadence of 25 s to monitor stars with a magnitude greater than 8. The N-FEEs are being developed at University College London's Mullard Space Science Laboratory (MSSL) and will be characterised along with the associated CCDs. The CCDs and N-FEEs will undergo rigorous on-ground characterisation, and the performance of the CCDs will continue to be monitored in-orbit. This paper discusses the initial development of the experimental arrangement, test procedures and current status of the N-FEE. The parameters explored will include gain, quantum efficiency, pixel response non-uniformity, dark current and Charge Transfer Inefficiency (CTI). The current in-orbit characterisation plan is also discussed; it will enable the performance of the CCDs and their associated N-FEEs to be monitored during the mission, including measurements of CTI that give an indication of the impact of radiation damage in the CCDs.
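    As a rough illustration of why such photometric precision is required, the fractional flux drop during a transit is simply (Rp/Rs)²; for an Earth-size planet in front of a Sun-like star it is below 10⁻⁴. These are back-of-envelope numbers, not figures from the paper.

```python
def transit_depth(r_planet_km, r_star_km):
    """Fractional flux drop during a planetary transit: (Rp / Rs)**2."""
    return (r_planet_km / r_star_km) ** 2

# Earth (R = 6371 km) transiting the Sun (R = 695700 km): about 8.4e-5.
depth = transit_depth(6371.0, 695700.0)
```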

  16. Infrared Camera Characterization of Bi-Propellant Reaction Control Engines during Auxiliary Propulsion Systems Tests at NASA's White Sands Test Facility in Las Cruces, New Mexico

    NASA Technical Reports Server (NTRS)

    Holleman, Elizabeth; Sharp, David; Sheller, Richard; Styron, Jason

    2007-01-01

    This paper describes the application of a FLIR Systems A40M infrared (IR) digital camera for thermal monitoring of a Liquid Oxygen (LOX) and Ethanol bi-propellant Reaction Control Engine (RCE) during Auxiliary Propulsion System (APS) testing at the National Aeronautics & Space Administration's (NASA) White Sands Test Facility (WSTF) near Las Cruces, New Mexico. Typically, NASA has relied mostly on thermocouples (TCs) for this type of thermal monitoring, due to the variability of the constraints involved in accurately mapping rapidly changing temperatures from ambient to glowing-hot chamber material. Obtaining accurate real-time temperatures in the IR spectrum is made even more elusive by the changing emissivity of the chamber material as it begins to glow. The parameters evaluated prior to APS testing included: (1) remote operation of the A40M camera using fiber-optic FireWire signal sender and receiver units; (2) operation of the camera inside a Pelco explosion-proof enclosure with a germanium window; (3) remote analog signal display for real-time monitoring; (4) remote digital data acquisition of the A40M's sensor information using FLIR's ThermaCAM Researcher Pro 2.8 software; and (5) overall reliability of the system. An initial characterization report was prepared after the A40M characterization tests at Marshall Space Flight Center (MSFC) to document controlled heat source comparisons to calibrated TCs. Summary IR digital data recorded from WSTF's APS testing is included within this document, along with findings, lessons learned, and recommendations for further usage as a monitoring tool for the development of rocket engines.

  17. A novel spatter detection algorithm based on typical cellular neural network operations for laser beam welding processes

    NASA Astrophysics Data System (ADS)

    Nicolosi, L.; Abt, F.; Blug, A.; Heider, A.; Tetzlaff, R.; Höfler, H.

    2012-01-01

    Real-time monitoring of laser beam welding (LBW) has increasingly gained importance in several manufacturing processes, ranging from automobile production to precision mechanics. In this work, a novel algorithm for the real-time detection of spatters was implemented in a camera based on cellular neural networks. The camera can be connected to the optics of commercially available laser machines, enabling real-time monitoring of LBW processes at rates up to 15 kHz. Such high monitoring rates allow the integration of other image evaluation tasks, such as detection of the full penetration hole, for real-time control of process parameters.
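    The detection task can be caricatured as flagging bright pixels outside the melt-pool region. This is a grossly simplified stand-in for illustration only; the paper's actual algorithm uses cellular-neural-network operations on the camera itself, which are not reproduced here.

```python
import numpy as np

def spatter_pixels(frame, melt_pool_box, threshold=200):
    """Count bright pixels outside the melt-pool bounding box
    (y0, y1, x0, x1); these are candidate spatter events."""
    y0, y1, x0, x1 = melt_pool_box
    mask = frame >= threshold
    mask[y0:y1, x0:x1] = False      # ignore the (always bright) melt pool
    return int(mask.sum())

# Synthetic frame: bright melt pool in the centre plus two spatter pixels.
frame = np.zeros((64, 64))
frame[28:36, 28:36] = 255.0
frame[5, 5] = 255.0
frame[50, 60] = 255.0
n_spatter = spatter_pixels(frame, (24, 40, 24, 40))
```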

  18. Small Orbital Stereo Tracking Camera Technology Development

    NASA Technical Reports Server (NTRS)

    Bryan, Tom; Macleod, Todd; Gagliano, Larry

    2015-01-01

    On-Orbit Small Debris Tracking and Characterization is a technical gap in current National Space Situational Awareness necessary to safeguard orbital assets and crew, and it poses a major risk of MOD damage to the ISS and Exploration vehicles. In 2015 this technology was added to NASA's Office of the Chief Technologist roadmap. For missions flying in, assembled in, or staging from LEO, the physical threat to vehicle and crew must be quantified in order to properly design the level of MOD impact shielding and appropriate mission design restrictions, and the debris flux and size population need to be verified against ground RADAR tracking. Using the ISS for in-situ orbital debris tracking development provides attitude, power, data and orbital access without a dedicated spacecraft or restricted operations as a secondary payload on-board a host vehicle. The sensor is applicable to in-situ measurement of orbital debris flux and population in other orbits or on other vehicles, it could enhance safety on and around the ISS, and some of its technologies are extensible to monitoring of extraterrestrial debris as well. To help accomplish this, new technologies must be developed quickly. The Small Orbital Stereo Tracking Camera is one such up-and-coming technology. It consists of flying a pair of intensified megapixel telephoto cameras to evaluate Orbital Debris (OD) monitoring in the proximity of the International Space Station. It will demonstrate on-orbit (in-situ) optical tracking of various sized objects against ground RADAR tracking and small OD models. The cameras are based on the flight-proven Advanced Video Guidance Sensor pixel-to-spot algorithms (Orbital Express) and on military targeting cameras. Using twin cameras provides stereo images for ranging and mission redundancy. When pointed into the orbital velocity vector (RAM), objects approaching or near the stereo camera set can be differentiated from the stars moving upward in the background.

  19. Small Orbital Stereo Tracking Camera Technology Development

    NASA Technical Reports Server (NTRS)

    Bryan, Tom; MacLeod, Todd; Gagliano, Larry

    2016-01-01

    On-Orbit Small Debris Tracking and Characterization is a technical gap in current National Space Situational Awareness necessary to safeguard orbital assets and crew, and it poses a major risk of MOD damage to the ISS and Exploration vehicles. In 2015 this technology was added to NASA's Office of the Chief Technologist roadmap. For missions flying in, assembled in, or staging from LEO, the physical threat to vehicle and crew must be quantified in order to properly design the level of MOD impact shielding and appropriate mission design restrictions, and the debris flux and size population need to be verified against ground RADAR tracking. Using the ISS for in-situ orbital debris tracking development provides attitude, power, data and orbital access without a dedicated spacecraft or restricted operations as a secondary payload on-board a host vehicle. The sensor is applicable to in-situ measurement of orbital debris flux and population in other orbits or on other vehicles, it could enhance safety on and around the ISS, and some of its technologies are extensible to monitoring of extraterrestrial debris as well. To help accomplish this, new technologies must be developed quickly. The Small Orbital Stereo Tracking Camera is one such up-and-coming technology. It consists of flying a pair of intensified megapixel telephoto cameras to evaluate Orbital Debris (OD) monitoring in the proximity of the International Space Station. It will demonstrate on-orbit (in-situ) optical tracking of various sized objects against ground RADAR tracking and small OD models. The cameras are based on the flight-proven Advanced Video Guidance Sensor pixel-to-spot algorithms (Orbital Express) and on military targeting cameras. Using twin cameras provides stereo images for ranging and mission redundancy. When pointed into the orbital velocity vector (RAM), objects approaching or near the stereo camera set can be differentiated from the stars moving upward in the background.
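    In the ideal rectified case, the ranging a stereo camera pair provides reduces to the classic disparity relation z = f B / d. The sketch below is illustrative only; the parameter values are invented, not the flight cameras' specifications.

```python
def stereo_range(focal_px, baseline_m, disparity_px):
    """Range (m) of an object from its disparity between two rectified
    cameras: z = focal length [px] * baseline [m] / disparity [px]."""
    return focal_px * baseline_m / disparity_px

# A 2000 px focal length, 0.5 m baseline and 4 px disparity give 250 m.
z = stereo_range(2000.0, 0.5, 4.0)
```

    The relation also shows the design trade-off: a longer baseline or longer focal length extends the useful ranging distance for small debris.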

  20. A Submersible Holographic Camera for the Undisturbed Characterization of Optically Relevant Particles in Water (HOLOCAM)

    DTIC Science & Technology

    2013-09-30

    environmental factors that impact toxic algal blooms in the Great Lakes, including their initiation, development, and senescence. The project is...integrated with existing harmful algal bloom monitoring and observational activities through the NOAA Great Lakes Environmental Research Laboratory...holograms showing the orientation of Ditylum chains within a phytoplankton thin layer in East Sound, WA, 2013. IMPACT /APPLICATIONS The HOLOCAM

  1. Multiplexed fluorescence detector system for capillary electrophoresis

    DOEpatents

    Yeung, E.S.; Taylor, J.A.

    1996-03-12

    A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis. 14 figs.

  2. Multiplexed fluorescence detector system for capillary electrophoresis

    DOEpatents

    Yeung, E.S.; Taylor, J.A.

    1994-06-28

    A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis. 14 figures.

  3. Multiplexed fluorescence detector system for capillary electrophoresis

    DOEpatents

    Yeung, Edward S.; Taylor, John A.

    1996-03-12

    A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis.

  4. Multiplexed fluorescence detector system for capillary electrophoresis

    DOEpatents

    Yeung, Edward S.; Taylor, John A.

    1994-06-28

    A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis.
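    The readout geometry these records describe, many capillaries imaged side by side onto one CCD, can be sketched as a per-lane column sum over the sensor image. This is a hypothetical illustration of the multiplexing idea, not the patented design.

```python
import numpy as np

def capillary_signals(ccd_frame, lane_centers, half_width=2):
    """Fluorescence signal per capillary: sum the CCD columns in a small
    window around each capillary's image position (hypothetical helper)."""
    f = np.asarray(ccd_frame, dtype=float)
    return np.array([f[:, c - half_width:c + half_width + 1].sum()
                     for c in lane_centers])

# Synthetic frame: two capillaries imaged at columns 10 and 30.
frame = np.zeros((8, 50))
frame[:, 10] = 5.0
frame[:, 30] = 2.0
signals = capillary_signals(frame, [10, 30])
```

    Repeating this per video frame yields one electropherogram trace per capillary from a single detector.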

  5. Exploring the imaging properties of thin lenses for cryogenic infrared cameras

    NASA Astrophysics Data System (ADS)

    Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura

    2016-05-01

    Designing a cryogenic camera is a good strategy for miniaturizing and simplifying an infrared camera using a cooled detector. Indeed, integrating the optics inside the cold shield makes it simple to athermalize the design, guarantees a cold pupil, and relaxes the constraint of a long back focal length for small-focal-length systems. In this way, cameras made of a single lens or two lenses are viable systems with good optical performance and stable image correction. However, it places a relatively significant additional optical mass inside the dewar and thus increases the cool-down time of the camera. ONERA is currently exploring a minimalist strategy consisting of giving an imaging function to the thin optical plates that are found in conventional dewars. In this way, we could make a cryogenic camera with the same cool-down time as a traditional dewar without an imaging function. Two examples will be presented. The first is a camera using a dual-band infrared detector, made of a lens outside the dewar and a lens inside the cold shield, the latter carrying the main optical power of the system; here we were able to design a cold plano-convex lens with a thickness of less than 1 mm. The second example is an evolution of a former cryogenic camera called SOIE, in which we replaced the cold meniscus with a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performance of the two cameras will be compared.
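    For a thin plano-convex element like the cold lens mentioned above, the focal length follows the thin-lens relation f = R / (n - 1). This is a textbook formula given for illustration; the radius and index below are hypothetical example values, not ONERA's design.

```python
def plano_convex_focal(radius_mm, n):
    """Thin-lens focal length (mm) of a plano-convex lens with curved-surface
    radius `radius_mm` and refractive index `n`: f = R / (n - 1). High-index
    IR materials give short focal lengths from gently curved surfaces."""
    return radius_mm / (n - 1.0)

# Example: a silicon-like index of 3.42 in the mid-IR with R = 24.2 mm.
f = plano_convex_focal(24.2, 3.42)
```

    The high refractive indices of IR materials are what make such thin, weakly curved plates viable as imaging elements.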

  6. An imaging system for PLIF/Mie measurements for a combusting flow

    NASA Technical Reports Server (NTRS)

    Wey, C. C.; Ghorashi, B.; Marek, C. J.; Wey, C.

    1990-01-01

    The equipment required to establish an imaging system can be divided into four parts: (1) the light source and beam-shaping optics; (2) camera and recording; (3) image acquisition and processing; and (4) computer and output systems. A pulsed, Nd:YAG-pumped, frequency-doubled dye laser, which can freeze motion in the flowfield, is used as the illumination source. A set of lenses forms the laser beam into a sheet. The induced fluorescence is collected by a UV-enhanced lens and passes through a UV-enhanced microchannel plate intensifier which is optically coupled to a gated solid-state CCD camera. The output of the camera is simultaneously displayed on a monitor and recorded on either a laser videodisc set or a Super VHS VCR. The videodisc set is controlled by a minicomputer via a connection to the RS-232C interface terminals. The imaging system is connected to the host computer by a bus repeater and can be multiplexed between four video input sources. Sample images from a planar shear layer experiment are presented to show the processing capability of the imaging system with the host computer.

  7. Cooperative multisensor system for real-time face detection and tracking in uncontrolled conditions

    NASA Astrophysics Data System (ADS)

    Marchesotti, Luca; Piva, Stefano; Turolla, Andrea; Minetti, Deborah; Regazzoni, Carlo S.

    2005-03-01

    The presented work describes an innovative architecture for multi-sensor distributed video surveillance applications. The aim of the system is to track moving objects in outdoor environments with a cooperative strategy exploiting two video cameras. The system also exhibits the capacity of focusing its attention on the faces of detected pedestrians, collecting snapshot frames of face images by segmenting and tracking them over time at different resolutions. The system is designed to employ two video cameras in a cooperative client/server structure: the first camera monitors the entire area of interest and detects the moving objects using change detection techniques. The detected objects are tracked over time and their position is indicated on a map representing the monitored area. The objects' coordinates are sent to the server sensor in order to point its zooming optics towards the moving object. The second camera tracks the objects at high resolution. Like the client camera, this sensor is calibrated, and the position of the object detected on the image plane reference system is translated into its coordinates referred to the same area map. In the map's common reference system, data fusion techniques are applied to achieve a more precise and robust estimation of the objects' tracks and to perform face detection and tracking. The work's novelty and strength reside in the cooperative multi-sensor approach, in the high resolution long distance tracking and in the automatic collection of biometric data, such as a person's face clip, for recognition purposes.

  8. A traffic situation analysis system

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin

    2011-01-01

    The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has great potential. For example, embedded vision systems built into vehicles can be used as early warning systems, or stationary camera systems can modify the switching frequency of signals at intersections. Today the automated analysis of traffic situations is still in its infancy - the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully understood by a vision system. We present steps towards such a traffic monitoring system, which is designed to detect potentially dangerous traffic situations, especially incidents in which the interaction of pedestrians and vehicles might develop into safety critical encounters. The proposed system is field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in an outdoor capable housing. Two cameras run vehicle detection software including license plate detection and recognition; one camera runs a complex pedestrian detection and tracking module based on the HOG detection principle. As a supplement, all 3 cameras use additional optical flow computation in a low-resolution video stream in order to estimate the motion path and speed of objects. This work describes the foundation for all 3 different object detection modalities (pedestrians, vehicles, license plates), and explains the system setup and its design.

  9. Optical registration of spaceborne low light remote sensing camera

    NASA Astrophysics Data System (ADS)

    Li, Chong-yang; Hao, Yan-hui; Xu, Peng-mei; Wang, Dong-jie; Ma, Li-na; Zhao, Ying-long

    2018-02-01

    To meet the high-precision requirements of optical registration for a spaceborne low-light remote sensing camera, dual-channel optical registration of the CCD and EMCCD is achieved with a high-magnification optical registration system. A system-integration optical registration scheme, together with a registration-accuracy scheme, is proposed in this paper for a spaceborne low-light remote sensing camera with short focal depth and wide field of view, along with an analysis of the parallel misalignment of the CCD and of the registration accuracy. Actual registration results show that the imaging is clear and that the MTF and registration accuracy meet the requirements, providing an important guarantee for obtaining high-quality image data in orbit.

  10. Single-sensor system for spatially resolved, continuous, and multiparametric optical mapping of cardiac tissue

    PubMed Central

    Lee, Peter; Bollensdorff, Christian; Quinn, T. Alexander; Wuskell, Joseph P.; Loew, Leslie M.; Kohl, Peter

    2011-01-01

    Background Simultaneous optical mapping of multiple electrophysiologically relevant parameters in living myocardium is desirable for integrative exploration of mechanisms underlying heart rhythm generation under normal and pathophysiologic conditions. Current multiparametric methods are technically challenging, usually involving multiple sensors and moving parts, which contributes to high logistic and economic thresholds that prevent easy application of the technique. Objective The purpose of this study was to develop a simple, affordable, and effective method for spatially resolved, continuous, simultaneous, and multiparametric optical mapping of the heart, using a single camera. Methods We present a new method to simultaneously monitor multiple parameters using inexpensive off-the-shelf electronic components and no moving parts. The system comprises a single camera, commercially available optical filters, and light-emitting diodes (LEDs), integrated via microcontroller-based electronics for frame-accurate illumination of the tissue. For proof of principle, we illustrate measurement of four parameters, suitable for ratiometric mapping of membrane potential (di-4-ANBDQPQ) and intracellular free calcium (fura-2), in an isolated Langendorff-perfused rat heart during sinus rhythm and ectopy, induced by local electrical or mechanical stimulation. Results The pilot application demonstrates suitability of this imaging approach for heart rhythm research in the isolated heart. In addition, locally induced excitation, whether stimulated electrically or mechanically, gives rise to similar ventricular propagation patterns. Conclusion Combining an affordable camera with suitable optical filters and microprocessor-controlled LEDs, single-sensor multiparametric optical mapping can be practically implemented in a simple yet powerful configuration and applied to heart rhythm research. 
The moderate system complexity and component cost is destined to lower the threshold to broader application of functional imaging and to ease implementation of more complex optical mapping approaches, such as multiparametric panoramic imaging. A proof-of-principle application confirmed that although electrically and mechanically induced excitation occur by different mechanisms, their electrophysiologic consequences downstream from the point of activation are not dissimilar. PMID:21459161
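    The frame-accurate, LED-multiplexed acquisition described above implies a simple demultiplexing step on the recorded single-camera video. A minimal sketch of that step, assuming a fixed cycle in which one LED fires per frame (the function name, four-channel cycle, and toy data are illustrative, not taken from the paper):

    ```python
    import numpy as np

    def demultiplex(frames, n_channels=4):
        """Split an interleaved single-camera recording into per-parameter
        movies, assuming the LEDs cycle in a fixed order, one per frame."""
        return [frames[c::n_channels] for c in range(n_channels)]

    # Toy stack: 40 frames of 8x8 pixels, with the active channel id
    # encoded in the pixel values so the split is easy to verify.
    frames = np.stack([np.full((8, 8), i % 4) for i in range(40)])
    channels = demultiplex(frames)
    # Each channel now holds every 4th frame, i.e. one parameter's movie.
    ```

    With four channels, each parameter is sampled at a quarter of the camera frame rate, which is why frame-accurate triggering of the LEDs matters.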

  11. Feasibility evaluation and study of adapting the attitude reference system to the Orbiter camera payload system's large format camera

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A design concept that will implement a mapping capability for the Orbital Camera Payload System (OCPS) when ground control points are not available is discussed. Through the use of stellar imagery collected by a pair of cameras whose optical axes are structurally related to the large format camera's optical axis, such pointing information is made available.

  12. Retrieving Atmospheric Dust Loading on Mars Using Engineering Cameras and MSL's Mars Hand Lens Imager (MAHLI)

    NASA Astrophysics Data System (ADS)

    Wolfe, C. A.; Lemmon, M. T.

    2015-12-01

    Dust in the Martian atmosphere influences energy deposition, dynamics, and the viability of solar-powered exploration vehicles. The Viking, Pathfinder, Spirit, Opportunity, Phoenix, and Curiosity landers and rovers each included the ability to image the Sun with a science camera equipped with a neutral density filter. Direct images of the Sun not only provide the ability to measure extinction by dust and ice in the atmosphere, but also provide a variety of constraints on the Martian dust and water cycles. These observations have been used to characterize dust storms, to provide ground-truth sites for orbiter-based global measurements of dust loading, and to help monitor solar panel performance. In the cost-constrained environment of Mars exploration, future missions may omit such cameras, as the solar-powered InSight mission has. We seek to provide a robust capability of determining atmospheric opacity from sky images taken with cameras that were not designed for solar imaging, such as the engineering cameras onboard Opportunity and the Mars Hand Lens Imager (MAHLI) on Curiosity. Our investigation focuses primarily on the accuracy of a method that determines optical depth values using scattering models that implement the ratio of sky radiance measurements at different elevation angles, but at the same scattering angle. Operational use requires the ability to retrieve optical depth on a timescale useful to mission planning, and with an accuracy and precision sufficient to support both mission planning and validating orbital measurements. We will present a simulation-based assessment of imaging strategies and their error budgets, as well as a validation based on the comparison of direct extinction measurements from archival Navcam, Hazcam, and MAHLI camera data.
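    The retrieval step described above amounts to inverting a modeled radiance-ratio-versus-optical-depth curve. A minimal sketch under stated assumptions: the monotonic "model" curve below is a stand-in for a real radiative-transfer calculation at two elevation angles, and the function name is illustrative, not from the paper.

    ```python
    import numpy as np

    def retrieve_tau(ratio_obs, tau_grid, model_ratio):
        """Invert a modeled radiance-ratio curve to retrieve optical depth.
        `model_ratio` would come from a scattering model evaluated at the
        two elevation angles; here it is just a monotonic placeholder."""
        return np.interp(ratio_obs, model_ratio, tau_grid)

    # Stand-in model: the two-angle radiance ratio grows monotonically
    # with dust optical depth (the real dependence comes from the model).
    tau_grid = np.linspace(0.1, 3.0, 30)
    model_ratio = 1.0 + 0.8 * (1.0 - np.exp(-tau_grid))

    # An observed ratio lying exactly on the curve recovers its tau.
    tau = retrieve_tau(model_ratio[12], tau_grid, model_ratio)
    ```

    In practice the curve is not closed-form, so a lookup table of model runs with interpolation, as sketched here, is a common way to make the retrieval fast enough for operational planning.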

  13. Small diameter, deep bore optical inspection system

    DOEpatents

    Lord, D.E.; Petrini, R.R.; Carter, G.W.

    An improved rod optic system for inspecting small-diameter, deep bores is described. The system utilizes a curved mirror at the end of the rod lens such that the optical path through the system is bent 90° to minimize optical distortion in examining the sides of a curved bore. The system is particularly useful in the examination of small bores for corrosion, and is capable of examining drill holes of 1/16-inch diameter and up to 4 inches deep, for example. The positioning of the curved mirror allows simultaneous viewing of the same artifact (such as corrosion) in the bore hole from both shallow and right-angle points of observation. The improved rod optic system may be used for direct eye sighting, or in combination with a still camera or a low-light television monitor, particularly low-light color television.

  14. Optimization of a miniature short-wavelength infrared objective optics of a short-wavelength infrared to visible upconversion layer attached to a mobile-devices visible camera

    NASA Astrophysics Data System (ADS)

    Kadosh, Itai; Sarusi, Gabby

    2017-10-01

    The use of dual cameras in parallax to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept where the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night vision capability while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. In order to maintain commonality of the two cameras, we propose to attach to one of the cameras a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and for the additional upconversion layer, whose thickness is <1 μm. Such a layer should be attached in close proximity to the mobile device's visible-range camera sensor (the CMOS sensor). This paper presents such a SWIR objective optical design and optimization that mechanically fits the visible objective design but uses different lenses, in order to maintain commonality, as a proof of concept. Such a SWIR objective design is very challenging, since it requires mimicking the original visible mobile camera lenses' sizes and mechanical housing so that we can adhere to the visible optical and mechanical design. We present an in-depth feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore-optics design.

  15. Extraction of natural weight shift and foot rolling in gait based on hetero-core optical fiber load sensor

    NASA Astrophysics Data System (ADS)

    Otsuka, Yudai; Koyama, Yuya; Nishiyama, Michiko; Watanabe, Kazuhiro

    2016-03-01

    Gait in daily activity affects human health because it may cause physical problems such as an asymmetric pelvis, flat feet and bowlegs. Monitoring the natural weight shift and foot rolling on the plantar surface has been employed by researchers to analyze gait characteristics. Conventional gait monitoring systems have been developed using cameras, acceleration sensors, gyro sensors and electrical load sensors. These have some problems, such as limited measurement places, temperature dependence and electric leakage. On the other hand, a hetero-core optical fiber sensor has many advantages, such as high sensitivity to macro-bending, a lightweight sensor element, independence from temperature fluctuations, and no electric contacts. This paper describes the extraction of natural weight shift and foot rolling for gait evaluation by using a sensitive shoe, in the insole of which hetero-core optical load sensors are embedded for detecting plantar pressure. The plantar pressure of three subjects who wore the sensitive shoe and walked on a treadmill was monitored. As a result, weight shift and foot rolling for the three subjects were extracted using the proposed sensitive shoe in terms of centroid movement and positions. Additionally, the extracted data were compared to those of an electric load sensor to ensure consistency. From these results, it was successfully demonstrated that the hetero-core optical fiber load sensor performed unconstrained gait monitoring as well as the electric load sensor.
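    The centroid movement used above to characterize weight shift is just a load-weighted average of the sensor positions. A minimal sketch, assuming a hypothetical four-sensor insole layout (the coordinates, loads, and function name are illustrative, not the paper's configuration):

    ```python
    import numpy as np

    def centre_of_pressure(positions, loads):
        """Load-weighted centroid of plantar pressure from discrete
        load sensors (hetero-core fiber or otherwise) in an insole."""
        loads = np.asarray(loads, dtype=float)
        positions = np.asarray(positions, dtype=float)
        return (positions * loads[:, None]).sum(axis=0) / loads.sum()

    # Hypothetical sensor layout (x, y in cm): heel, midfoot, two forefoot.
    pos = [(2.0, 1.0), (2.0, 8.0), (1.0, 14.0), (3.0, 14.0)]
    # Heel-strike sample: most of the load sits on the heel sensor,
    # so the centroid lies toward the heel (small y).
    cop = centre_of_pressure(pos, [40.0, 10.0, 5.0, 5.0])
    ```

    Tracking this centroid frame by frame over a step yields the heel-to-toe trajectory from which weight shift and foot rolling are read off.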

  16. Mach-Zehnder based optical marker/comb generator for streak camera calibration

    DOEpatents

    Miller, Edward Kirk

    2015-03-03

    This disclosure is directed to a method and apparatus for generating marker and comb indicia in an optical environment using a Mach-Zehnder (M-Z) modulator. High speed recording devices are configured to record image or other data defining a high speed event. To calibrate and establish time reference, the markers or combs are indicia which serve as timing pulses (markers) or a constant-frequency train of optical pulses (comb) to be imaged on a streak camera for accurate time based calibration and time reference. The system includes a camera, an optic signal generator which provides an optic signal to an M-Z modulator and biasing and modulation signal generators configured to provide input to the M-Z modulator. An optical reference signal is provided to the M-Z modulator. The M-Z modulator modulates the reference signal to a higher frequency optical signal which is output through a fiber coupled link to the streak camera.

  17. Optic Disc and Optic Cup Segmentation Methodologies for Glaucoma Image Detection: A Survey

    PubMed Central

    Almazroa, Ahmed; Burman, Ritambhar; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan

    2015-01-01

    Glaucoma is the second leading cause of loss of vision in the world. Examining the optic nerve head (cup-to-disc ratio) is very important for diagnosing glaucoma and for patient monitoring after diagnosis. Images of the optic disc and optic cup are acquired by fundus cameras as well as by optical coherence tomography. Optic disc and optic cup segmentation techniques are used to isolate the relevant parts of the retinal image and to calculate the cup-to-disc ratio. The main objective of this paper is to review segmentation methodologies and techniques for the disc and cup boundaries, which are utilized to calculate the disc and cup geometrical parameters automatically and accurately, helping glaucoma professionals gain a broader view and more detail of the optic nerve head structure from retinal fundus images. We provide a brief description of each technique, highlighting its classification and performance metrics. Current and future research directions are summarized and discussed. PMID:26688751
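    Once disc and cup masks are segmented, the cup-to-disc ratio itself is a simple geometric computation. A minimal sketch using the vertical-extent convention (one of several conventions in use; the function name and toy masks are illustrative):

    ```python
    import numpy as np

    def cup_to_disc_ratio(disc_mask, cup_mask):
        """Vertical cup-to-disc ratio from binary segmentation masks:
        the ratio of the vertical extents of cup and disc."""
        disc_rows = np.flatnonzero(disc_mask.any(axis=1))
        cup_rows = np.flatnonzero(cup_mask.any(axis=1))
        disc_height = disc_rows.max() - disc_rows.min() + 1
        cup_height = cup_rows.max() - cup_rows.min() + 1
        return cup_height / disc_height

    # Toy masks: a 40-row disc region containing a 16-row cup region.
    disc = np.zeros((100, 100), dtype=bool)
    cup = np.zeros_like(disc)
    disc[30:70, 30:70] = True
    cup[42:58, 42:58] = True
    cdr = cup_to_disc_ratio(disc, cup)   # 16 / 40 = 0.4
    ```

    The clinical value of the surveyed segmentation methods lies in producing masks reliable enough for this ratio to be trusted, since elevated values are a glaucoma indicator.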

  18. Oversampling in virtual visual sensors as a means to recover higher modes of vibration

    NASA Astrophysics Data System (ADS)

    Shariati, Ali; Schumacher, Thomas

    2015-03-01

    Vibration-based structural health monitoring (SHM) techniques require modal information from the monitored structure in order to estimate the location and severity of damage. Natural frequencies also provide useful information to calibrate finite element models. There are several types of physical sensors that can measure the response over a range of frequencies. For most of those sensors, however, accessibility, limitation of measurement points, wiring, and high system cost represent major challenges. Recent optical sensing approaches offer advantages such as easy access to visible areas, distributed sensing capabilities, and comparatively inexpensive data recording, while having no wiring issues. In this research we propose a novel methodology to measure natural frequencies of structures using digital video cameras based on virtual visual sensors (VVS). In our initial study, where we worked with commercially available, inexpensive digital video cameras, we found that for multiple-degree-of-freedom systems it is difficult to detect all of the natural frequencies simultaneously due to low quantization resolution. In this study we show how oversampling, enabled by the use of high-end high-frame-rate video cameras, makes it possible to recover all three natural frequencies of a three-story lab-scale structure.
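    The core of a virtual visual sensor is a per-pixel intensity time series whose spectrum carries the structure's modes. A minimal sketch, assuming a synthetic three-mode trace sampled at a high frame rate (the frequencies, frame rate, and function name are illustrative, not the paper's values):

    ```python
    import numpy as np

    def natural_frequencies(signal, fs, n_peaks=3):
        """Estimate dominant vibration frequencies from a virtual visual
        sensor's intensity time series via the FFT magnitude spectrum."""
        signal = signal - signal.mean()          # remove the DC component
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        idx = np.argsort(spectrum)[-n_peaks:]    # strongest spectral lines
        return np.sort(freqs[idx])

    # Simulated pixel-intensity trace of a 3-DOF structure (2, 7 and 18 Hz)
    fs = 240.0                                   # high-frame-rate camera (fps)
    t = np.arange(0, 10, 1 / fs)
    trace = (np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)
             + 0.2 * np.sin(2 * np.pi * 18 * t))
    modes = natural_frequencies(trace, fs)
    ```

    The oversampling argument shows up here directly: a higher frame rate spreads quantization noise over a wider band, so weak higher-mode peaks like the 18 Hz line stand out above the noise floor.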

  19. Optical synthesizer for a large quadrant-array CCD camera: Center director's discretionary fund

    NASA Technical Reports Server (NTRS)

    Hagyard, Mona J.

    1992-01-01

    The objective of this program was to design and develop an optical device, an optical synthesizer, that focuses four contiguous quadrants of a solar image on four spatially separated CCD arrays that are part of a unique CCD camera system. This camera and the optical synthesizer will be part of the new NASA-Marshall Experimental Vector Magnetograph, an instrument developed to measure the Sun's magnetic field as accurately as present technology allows. The tasks undertaken in the program are outlined and the final detailed optical design is presented.

  20. Retinal arteriolar remodeling evaluated with adaptive optics camera: Relationship with blood pressure levels.

    PubMed

    Gallo, A; Mattina, A; Rosenbaum, D; Koch, E; Paques, M; Girerd, X

    2016-06-01

    To identify a retinal arteriolar wall-to-lumen ratio or lumen diameter cut-off that would discriminate hypertensive from normotensive subjects using an adaptive optics camera. One thousand five hundred subjects were consecutively recruited, and the adaptive optics camera rtx1™ (Imagine-Eyes, Orsay, France) was used to measure the wall thickness and internal diameter of retinal arterioles and to calculate the wall-to-lumen ratio (WLR) and wall cross-sectional area. Sitting office blood pressure was measured once, just before the retinal measurements, and office hypertension was defined as systolic blood pressure >=140 mmHg and diastolic blood pressure >=90 mmHg. ROC curves were constructed to determine cut-off values of the retinal parameters for diagnosing office hypertension. In another population of 276 subjects, office blood pressure, retinal arteriole evaluation and home blood pressure monitoring were obtained. The applicability of the retinal WLR and diameter cut-off values was compared in patients with controlled, masked, white-coat and sustained hypertension. In the 1500 patients, a WLR > 0.31 discriminated office-hypertensive subjects with 0.57 sensitivity and 0.71 specificity. A lumen diameter < 78.2 μm discriminated office hypertension with 0.73 sensitivity and 0.52 specificity. In the other 276 patients, the WLR was higher in sustained hypertension than in normotensive patients (0.330±0.06 vs 0.292±0.05; P<0.001) and the diameter was narrower in masked hypertensive than in normotensive subjects (73.0±11.2 vs 78.5±11.6 μm; P<0.005). A WLR higher than 0.31 is in favour of office arterial hypertension; a diameter under 78 μm may indicate masked hypertension. Retinal arteriole analysis with an adaptive optics camera may help in the diagnosis of arterial hypertension, in particular in the case of masked hypertension. Copyright © 2016. Published by Elsevier SAS.
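    Applying a threshold such as WLR > 0.31 and reporting its sensitivity and specificity is a standard contingency-table computation. A minimal sketch with made-up values (the WLR numbers below are illustrative, not the study's data):

    ```python
    import numpy as np

    def cutoff_performance(values, is_hypertensive, threshold):
        """Sensitivity and specificity of a 'value > threshold' rule,
        as used for the WLR > 0.31 cut-off described above."""
        pred = values > threshold
        tp = np.sum(pred & is_hypertensive)    # true positives
        fn = np.sum(~pred & is_hypertensive)   # false negatives
        tn = np.sum(~pred & ~is_hypertensive)  # true negatives
        fp = np.sum(pred & ~is_hypertensive)   # false positives
        return tp / (tp + fn), tn / (tn + fp)

    # Illustrative WLR values and hypertension labels for six subjects
    wlr = np.array([0.28, 0.30, 0.33, 0.35, 0.29, 0.32])
    htn = np.array([False, False, True, True, False, True])
    sens, spec = cutoff_performance(wlr, htn, 0.31)
    ```

    Sweeping the threshold over a grid and plotting sensitivity against (1 - specificity) produces the ROC curve from which such cut-offs are chosen.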

  1. Optical transition radiation used in the diagnostic of low energy and low current electron beams in particle accelerators.

    PubMed

    Silva, T F; Bonini, A L; Lima, R R; Maidana, N L; Malafronte, A A; Pascholati, P R; Vanin, V R; Martins, M N

    2012-09-01

    Optical transition radiation (OTR) plays an important role in beam diagnostics for high energy particle accelerators. Its intensity, linear in beam current, is a great advantage as compared to fluorescent screens, which are subject to saturation. Moreover, the measurement of the angular distribution of the emitted radiation enables the determination of many beam parameters in a single observation point. However, few works deal with the application of OTR to monitor low energy beams. In this work we describe the design of an OTR-based beam monitor used to measure the transverse beam charge distribution of the 1.9-MeV electron beam of the linac injector of the IFUSP microtron, using a standard machine vision camera. The average beam current in pulsed operation mode is of the order of tens of nanoamperes. The low energy and low beam current make OTR observation difficult. To improve sensitivity, the beam incidence angle on the target was chosen to maximize the photon flux in the camera field-of-view. Measurements that assess OTR observation (linearity with beam current, polarization, and spectrum shape) are presented, as well as a typical 1.9-MeV electron beam charge distribution obtained from OTR. Some aspects of emittance measurement using this device are also discussed.

  2. Recording the synchrotron radiation by a picosecond streak camera for bunch diagnostics in cyclic accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vereshchagin, A K; Vorob'ev, N S; Gornostaev, P B

    2016-02-28

    A PS-1/S1 picosecond streak camera with a linear sweep is used to measure temporal characteristics of synchrotron radiation pulses on a damping ring (DR) at the Budker Institute of Nuclear Physics (BINP) of the Siberian Branch of the Russian Academy of Sciences (Novosibirsk). The data obtained allow a conclusion as to the formation processes of electron bunches and their 'quality' in the DR after injection from the linear accelerator. The expediency of employing the streak camera as a part of an optical diagnostic accelerator complex for adjusting the injection from a linear accelerator is shown. Discussed is the issue of designing a new-generation dissector with a time resolution up to a few picoseconds, which would allow implementation of a continuous bunch monitoring in the DR during mutual work with the electron-positron colliders at the BINP.

  3. Method used to test the imaging consistency of binocular camera's left-right optical system

    NASA Astrophysics Data System (ADS)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor that influences the overall imaging consistency. Conventional testing procedures for optical systems lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method used to measure the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and right optical systems capture images in normal exposure time under the same conditions. Second, a contour image is obtained based on the multiple threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray level constraint based on the corresponding coordinates of the left-right images is established, and the imaging consistency can be evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for carrying out imaging consistency testing for binocular cameras. When the 3σ distribution of the imaging gray difference D(x, y) between the left and right optical systems of the binocular camera does not exceed 5%, it is believed that the design requirements have been achieved. This method can be used effectively and paves the way for imaging consistency testing of binocular cameras.
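    The consistency metric above reduces to computing the per-pixel grayscale difference D(x, y) of registered left and right frames and checking its 3σ spread against the 5% criterion. A minimal sketch, assuming already-registered 8-bit frames and a simple normalization by full scale (the function name and toy data are illustrative, not the paper's exact pipeline):

    ```python
    import numpy as np

    def imaging_consistency(left, right):
        """Standard deviation of the normalized grayscale difference
        D(x, y) between left and right frames, plus the 3-sigma <= 5 %
        pass/fail criterion described above."""
        d = (left.astype(float) - right.astype(float)) / 255.0
        sigma = d.std()
        return sigma, 3 * sigma <= 0.05

    # Two simulated 8-bit frames of the same flat-field scene,
    # with a 1 % gain mismatch between the two optical channels.
    rng = np.random.default_rng(0)
    scene = rng.integers(80, 160, size=(64, 64))
    left = scene.astype(np.uint8)
    right = np.clip(scene * 1.01, 0, 255).astype(np.uint8)
    sigma, ok = imaging_consistency(left, right)
    ```

    An integrating sphere, as in the paper's setup, provides the uniform illumination that makes a flat-field comparison like this meaningful.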

  4. Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA

    NASA Astrophysics Data System (ADS)

    Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki

    2017-11-01

    SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid- and far-infrared astronomy with a cryogenically cooled 3-m class telescope, envisioned for launch in the early 2020s. The Mid-infrared Camera and Spectrometer (MCS) is a focal plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38 μm. MCS consists of two relay optical modules and the following four scientific optical modules: WFC (Wide Field Camera; 5'x 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2'.5 long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000) and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here, we present the optical design and expected optical performance of MCS. Most parts of the MCS optics adopt an off-axis reflective system to cover the wide wavelength range of 5-38 μm without chromatic aberration and to minimize problems due to changes in the shapes and refractive indices of materials from room temperature to cryogenic temperature. In order to achieve the demanding specification requirements of wide field of view, small F-number and large spectral resolving power in a compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]), a design method using free-form surfaces for compact reflective optics such as head mounted displays. As a result, we have successfully designed compact reflective optics for MCS with as-built performance at diffraction-limited image resolution.

  5. The AOTF-Based NO2 Camera

    NASA Astrophysics Data System (ADS)

    Dekemper, E.; Fussen, D.; Vanhellemont, F.; Vanhamel, J.; Pieroux, D.; Berkenbosch, S.

    2017-12-01

    In an urban environment, nitrogen dioxide is emitted by a multitude of static and moving point sources (cars, industry, power plants, heating systems, ...). Air quality models generally rely on a limited number of monitoring stations, which neither capture the whole pattern nor allow for full validation. So far, there has been a lack of instruments capable of measuring NO2 fields with the necessary spatio-temporal resolution above major point sources (power plants), or more extended ones (cities). We have developed a new type of passive remote sensing instrument aiming at the measurement of 2-D distributions of NO2 slant column densities (SCDs) with high spatial (meters) and temporal (minutes) resolution. The measurement principle has some similarities with the popular filter-based SO2 camera (used in volcanic and industrial sulfur emissions monitoring), as it relies on spectral images taken at wavelengths where the molecule's absorption cross section differs. But contrary to the SO2 camera, the spectral selection is performed by an acousto-optical tunable filter (AOTF) capable of resolving the target molecule's spectral features. A first prototype was successfully tested with the plume of a coal-fired power plant in Romania, revealing the dynamics of the formation of NO2 in the early plume. A lighter version of the NO2 camera is now being tested on other targets, such as oil refineries and urban air masses.

  6. Cavity-Enhanced Raman Spectroscopy of Natural Gas with Optical Feedback cw-Diode Lasers.

    PubMed

    Hippler, Michael

    2015-08-04

    We report on improvements made to our previously introduced technique of cavity-enhanced Raman spectroscopy (CERS) with optical feedback cw-diode lasers in the gas phase, including a new mode-matching procedure which keeps the laser in resonance with the optical cavity without inducing long-term frequency shifts of the laser, and a new CCD camera with improved noise performance. With 10 mW of 636.2 nm diode laser excitation and 30 s integration time, cavity enhancement achieves noise-equivalent detection limits below 1 mbar at 1 bar total pressure, depending on Raman cross sections. Detection limits can be easily improved using higher-power diodes. We further demonstrate a relevant analytical application of CERS, the multicomponent analysis of natural gas samples. Several spectroscopic features have been identified and characterized. CERS with low power diode lasers is suitable for online monitoring of natural gas mixtures with sensitivity and spectroscopic selectivity, including monitoring H2, H2S, N2, CO2, and alkanes.

  7. Process control of laser conduction welding by thermal imaging measurement with a color camera.

    PubMed

    Bardin, Fabrice; Morgan, Stephen; Williams, Stewart; McBride, Roy; Moore, Andrew J; Jones, Julian D C; Hand, Duncan P

    2005-11-10

    Conduction welding offers an alternative to keyhole welding. Compared with keyhole welding, it is an intrinsically stable process because vaporization phenomena are minimal. However, as with keyhole welding, an on-line process-monitoring system is advantageous for quality assurance to maintain the required penetration depth, which in conduction welding is more sensitive to changes in heat sinking. The maximum penetration is obtained when the surface temperature is just below the boiling point, and so we normally wish to maintain the temperature at this level. We describe a two-color optical system that we have developed for real-time temperature profile measurement of the conduction weld pool. The key feature of the system is the use of a complementary metal-oxide semiconductor standard color camera leading to a simplified low-cost optical setup. We present and discuss the real-time temperature measurement and control performance of the system when a defocused beam from a high power Nd:YAG laser is used on 5 mm thick stainless steel workpieces.

  8. Optical fiducial timing system for X-ray streak cameras with aluminum coated optical fiber ends

    DOEpatents

    Nilson, David G.; Campbell, E. Michael; MacGowan, Brian J.; Medecki, Hector

    1988-01-01

    An optical fiducial timing system is provided for use with interdependent groups of X-ray streak cameras (18). The aluminum coated (80) ends of optical fibers (78) are positioned with the photocathodes (20, 60, 70) of the X-ray streak cameras (18). The other ends of the optical fibers (78) are placed together in a bundled array (90). A fiducial optical signal (96), comprised of 2ω or 1ω laser light, after introduction to the bundled array (90), travels to the aluminum coated (82) optical fiber ends and ejects quantities of electrons (84) that are recorded on the data recording media (52) of the X-ray streak cameras (18). Since both 2ω and 1ω laser light can travel long distances in optical fiber with only slight attenuation, the initial areal power density of the fiducial optical signal (96) is well below the damage threshold of the fused silica or other material that comprises the optical fibers (78, 90). Thus the fiducial timing system can be repeatedly used over long durations of time.

  9. ESA's X-ray space observatory XMM takes first pictures

    NASA Astrophysics Data System (ADS)

    2000-02-01

    Under the aegis of Prof. Roger Bonnet, ESA Director of Science, the mission's Principal Investigators will be presenting these spectacular first images at a press conference to be held on 9 February at the ESA Vilspa facility at Villafranca/Madrid in Spain, where the XMM Science Operations Centre is located. The event will also be the occasion for several major announcements concerning the XMM mission. In particular Professor Bonnet will launch the third XMM competition "Stargazing" - previously announced in September 1999. This will address European youngsters, 16 to 18 years old, who will be offered the unique opportunity of winning observing time using the X-ray telescope. Commissioning phase starts After a successful launch from Kourou on Ariane 504 on 10 December 1999, XMM was brought to its final operational orbit in the following week. The telescope doors on the X-ray Mirror Modules and on the Optical Monitor telescope were opened on 17/18 December. The Radiation Monitor was activated on 19 December and the spacecraft was put into a quiet mode over the Christmas and New Year period. The mission's scientific data is being received, processed and dispatched to astronomers by the XMM Science Operations Centre in Villafranca. Operations with the spacecraft restarted there on 4 January when, as part of the commissioning phase, all the science payloads were switched on one after the other for initial verifications. By the week of 17 January functional tests had begun on the Optical Monitor, the EPIC pn, the two EPIC MOS and the two RGS instruments. The internal doors of the EPIC cameras were opened whilst keeping the camera filter wheels closed. Astounding first images After a series of engineering exposures, all three EPIC cameras were used in turn, between 19-24 January, to take several views of two different extragalactic regions of the Universe. 
These views, featuring a variety of extended and X-ray point sources, were chosen to demonstrate the full functioning of the observatory. The Optical Monitor also simultaneously viewed the same regions. One RGS spectrometer obtained its first spectra on 25 January; the other will be commissioned at the start of February. This initial series of short and long duration exposures has delighted the Project management team and the scientists even more. First analyses confirm that the spacecraft is extremely stable, the XMM telescopes are focusing perfectly, and the EPIC cameras, Optical Monitor and RGS spectrometers are working exactly as expected. The Science Operations Centre infrastructure, processing and archiving the science data telemetry from the spacecraft, is also performing well. Initial inspection of the first commissioning images immediately showed some unique X-ray views of several celestial objects, to be presented on 9 February. The occasion will give Principal Investigators and Project management the opportunity to comment on the pictures and the excellent start of the XMM mission. The Calibration and Performance Verification phase for XMM's science instruments is to begin on 3 March, with routine science operations starting in June. Press is invited to attend the press conference, which will be held at the Villafranca/ Madrid- Vilspa facility (ESA's Satellite Tracking Station) Apartado 50727, E-2 080 MADRID, Spain. The press event will be broadcast to the other ESA establishments: ESA Headquarters, Paris; ESA/ ESTEC (Space Expo), Noordwijk, the Netherlands; ESA/ESOC, Darmstadt, Germany and ESA/ESRIN, Frascati, Italy. Media representatives wishing to attend the event are kindly requested to fill out the attached reply form and fax it back to the establishment of their choice.

  10. Variability of the symbiotic X-ray binary GX 1+4. Enhanced activity near periastron passage

    NASA Astrophysics Data System (ADS)

    Iłkiewicz, Krystian; Mikołajewska, Joanna; Monard, Berto

    2017-05-01

    Context. GX 1+4 belongs to a rare class of X-ray binaries with red giant donors, the symbiotic X-ray binaries. It has a history of complicated variability on multiple timescales in the optical and in X-rays. The nature of this variability remains poorly understood. Aims: We aim to study the variability of GX 1+4 on long timescales in the X-ray and optical bands. Methods: We took X-ray observations from the INTEGRAL Soft Gamma-Ray Imager and the RXTE All Sky Monitor. Optical observations were made with the INTEGRAL Optical Monitoring Camera. Results: The variability of GX 1+4 in both optical light and hard X-ray emission (>17 keV) is dominated by 50-70 d quasi-periodic changes. The amplitude of this variability is highest during the periastron passage, while during the potential neutron star eclipse the system is always at minimum. This confirms the 1161 d orbital period that had been proposed for the system based on the radial velocity curve. Neither the quasi-periodic variability nor the orbital period is detected in the soft X-ray emission (1.3-12.2 keV), where the binary shows no apparent periodicity.

  11. Scalar wave-optical reconstruction of plenoptic camera images.

    PubMed

    Junker, André; Stenau, Tim; Brenner, Karl-Heinz

    2014-09-01

    We investigate the reconstruction of plenoptic camera images in a scalar wave-optical framework. Previous publications on this topic numerically simulate light propagation on the basis of ray tracing. However, given the continuing miniaturization of hardware components, it can be expected that this technique is not generally valid in combination with low-aperture optical systems. Therefore, we study the differences between ray- and wave-optical object reconstructions of true plenoptic camera images. For this purpose we present a wave-optical reconstruction algorithm that can be run on a regular computer. Our findings show that a wave-optical treatment is capable of increasing the detail resolution of reconstructed objects.

  12. Supernova and optical transient observations using the three wide-field telescope array of the KMTNet

    NASA Astrophysics Data System (ADS)

    Moon, Dae-Sik; Kim, Sang Chul; Lee, Jae-Joon; Pak, Mina; Park, Hong Soo; He, Matthias Y.; Antoniadis, John; Ni, Yuan Qi; Lee, Chung-Uk; Kim, Seung-Lee; Park, Byeong-Gon; Kim, Dong-Jin; Cha, Sang-Mok; Lee, Yongseok; Gonzalez, Santiago

    2016-08-01

    The Korea Microlensing Telescope Network (KMTNet) is a network of three new 1.6-m, wide-field telescopes spread over three different sites in Chile, South Africa and Australia. Each telescope is equipped with a four square degree wide-field CCD camera, making the KMTNet an ideal facility for discovering and monitoring early supernovae and other rapidly evolving optical transients by providing 24-hour continuous sky coverage. We describe our inaugural program of observing supernovae and optical transients using about 20% of the KMTNet time in 2015-2019. Our early results include detection of infant supernovae, novae and peculiar transients as well as numerous variable stars and low surface brightness objects such as dwarf galaxies.

  13. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal, the color processing, and rendering. We consider the image acquisition system to be linear, shift-invariant, and axial, with light propagation orthogonal to the system. We use a spectral image processing algorithm to simulate the radiometric properties of a digital camera. The algorithm takes into account the transmittances of the light source, lenses, and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the point spread functions of the optical components: a Cooke triplet, the aperture, the light fall-off, and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion, and JPG compression. We reconstruct the noisy, blurred image by blending differently exposed images to reduce photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then come the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
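    Two of the color-processing blocks named above, white balancing and gamma correction, reduce to a few lines each. A minimal sketch; the gain and gamma values are illustrative, not taken from the paper:

```python
import numpy as np

def white_balance(img, gains):
    """Apply per-channel multiplicative gains (R, G, B) to a linear image."""
    return img * np.asarray(gains, dtype=float)

def gamma_correct(img, gamma=2.2):
    """Power-law encoding of linear [0, 1] data for display."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)

linear = np.array([[[0.2, 0.4, 0.1]]])                 # one RGB pixel
out = gamma_correct(white_balance(linear, (2.0, 1.0, 1.5)))
```

    In a full pipeline these would sit after demosaicing and before the XYZ-to-RGB conversion.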

  14. Optical photometric monitoring of gamma-ray loud blazars. II. Observations from November 1995 to June 1996

    NASA Astrophysics Data System (ADS)

    Raiteri, C. M.; Ghisellini, G.; Villata, M.; de Francesco, G.; Lanteri, L.; Chiaberge, M.; Peila, A.; Antico, G.

    1998-02-01

    New data from the optical monitoring of gamma-ray loud blazars at the Torino Astronomical Observatory are presented. Observations have been taken in the Johnson B, V, and Cousins R bands with the 1.05 m REOSC telescope equipped with a 1242x1152 pixel CCD camera. Many of the 22 monitored sources presented here show noticeable magnitude variations. Periods corresponding to pointings of the Energetic Gamma Ray Experiment Telescope (EGRET) on the Compton Gamma Ray Observatory (CGRO) satellite are indicated on the light curves. The comparison of our data with those taken by CGRO in the gamma-ray band will contribute to a better understanding of the mechanism of gamma-ray emission. We finally show intranight light curves of 3C 66A and OJ 287, in which microvariability was detected. Tables 2--21 are only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/Abstract.html

  15. A Low-Cost Optical Remote Sensing Application for Glacier Deformation Monitoring in an Alpine Environment

    PubMed Central

    Giordan, Daniele; Allasia, Paolo; Dematteis, Niccolò; Dell’Anese, Federico; Vagliasindi, Marco; Motta, Elena

    2016-01-01

    In this work, we present the results of a low-cost optical monitoring station designed for monitoring the kinematics of glaciers in an Alpine environment. We developed a complete hardware/software data acquisition and processing chain that automatically acquires, stores and co-registers images. The system was installed in September 2013 to monitor the evolution of the Planpincieux glacier, within the open-air laboratory of the Grandes Jorasses, Mont Blanc massif (NW Italy), and collected data with an hourly frequency. The acquisition equipment consists of a high-resolution DSLR camera operating in the visible band. The data are processed with a Pixel Offset algorithm based on normalized cross-correlation, to estimate the deformation of the observed glacier. We propose a method for the pixel-to-metric conversion and present the results of the projection on the mean slope of the glacier. The method performances are compared with measurements obtained by GB-SAR, and exhibit good agreement. The system provides good support for the analysis of the glacier evolution and allows the creation of daily displacement maps. PMID:27775652

  16. A Low-Cost Optical Remote Sensing Application for Glacier Deformation Monitoring in an Alpine Environment.

    PubMed

    Giordan, Daniele; Allasia, Paolo; Dematteis, Niccolò; Dell'Anese, Federico; Vagliasindi, Marco; Motta, Elena

    2016-10-21

    In this work, we present the results of a low-cost optical monitoring station designed for monitoring the kinematics of glaciers in an Alpine environment. We developed a complete hardware/software data acquisition and processing chain that automatically acquires, stores and co-registers images. The system was installed in September 2013 to monitor the evolution of the Planpincieux glacier, within the open-air laboratory of the Grandes Jorasses, Mont Blanc massif (NW Italy), and collected data with an hourly frequency. The acquisition equipment consists of a high-resolution DSLR camera operating in the visible band. The data are processed with a Pixel Offset algorithm based on normalized cross-correlation, to estimate the deformation of the observed glacier. We propose a method for the pixel-to-metric conversion and present the results of the projection on the mean slope of the glacier. The method performances are compared with measurements obtained by GB-SAR, and exhibit good agreement. The system provides good support for the analysis of the glacier evolution and allows the creation of daily displacement maps.
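    The Pixel Offset step described in the two records above rests on normalized cross-correlation: a patch from a reference image is scanned over a search window in a later image, and the shift that maximizes the correlation is taken as the displacement in pixels (then converted to metric units). A brute-force integer-pixel sketch, not the authors' implementation:

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def pixel_offset(ref, cur, y, x, size, search):
    """Integer (dy, dx) moving ref[y:y+size, x:x+size] onto cur, found by
    scanning +/- search pixels and maximizing the NCC score."""
    tpl = ref[y:y+size, x:x+size]
    best, best_dyx = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = cur[y+dy:y+dy+size, x+dx:x+dx+size]
            score = ncc(tpl, cand)
            if score > best:
                best, best_dyx = score, (dy, dx)
    return best_dyx

# synthetic test: shift a random texture down 2 px and right 1 px
rng = np.random.default_rng(0)
ref = rng.random((40, 40))
cur = np.roll(np.roll(ref, 2, axis=0), 1, axis=1)
dy, dx = pixel_offset(ref, cur, 10, 10, 8, 4)
```

    Production systems typically refine this to sub-pixel precision by interpolating the correlation peak.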

  17. Application of spatially modulated near-infrared structured light to study changes in optical properties of mouse brain tissue during heatstress.

    PubMed

    Shaul, Oren; Fanrazi-Kahana, Michal; Meitav, Omri; Pinhasi, Gad A; Abookasis, David

    2017-11-10

    Heat stress (HS) is a medical emergency defined by abnormally elevated body temperature that causes biochemical, physiological, and hematological changes. The goal of the present research was to detect variations in optical properties (absorption, reduced scattering, and refractive index coefficients) of mouse brain tissue during HS by using near-infrared (NIR) spatial light modulation. NIR spatial patterns with different spatial phases were used to differentiate the effects of tissue scattering from those of absorption. Decoupling optical scattering from absorption enabled the quantification of a tissue's chemical constituents (related to light absorption) and structural properties (related to light scattering). Technically, structured light patterns at low and high spatial frequencies of six wavelengths ranging between 690 and 970 nm were projected onto the mouse scalp surface while diffuse reflected light was recorded by a CCD camera positioned perpendicular to the mouse scalp. Concurrently to pattern projection, brain temperature was measured with a thermal camera positioned slightly off angle from the mouse head while core body temperature was monitored by thermocouple probe. Data analysis demonstrated variations from baseline measurements in a battery of intrinsic brain properties following HS.
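    The abstract does not spell out the demodulation step, but structured-light measurements with sinusoidal patterns projected at three spatial phases 120° apart are conventionally demodulated per pixel as M_AC = (√2/3)·√((I₁−I₂)² + (I₂−I³)² + (I₃−I₁)²), with M_DC the plain average; M_AC at low and high spatial frequency is what lets absorption and scattering be decoupled. A sketch with synthetic values (not data from the study):

```python
import math

def demodulate(i1, i2, i3):
    """AC and DC amplitude at a pixel from three projections of a sinusoidal
    pattern phased 120 degrees apart (standard three-phase demodulation)."""
    ac = (math.sqrt(2.0) / 3.0) * math.sqrt(
        (i1 - i2)**2 + (i2 - i3)**2 + (i3 - i1)**2)
    dc = (i1 + i2 + i3) / 3.0
    return ac, dc

# synthetic pixel: DC level 0.5, modulation amplitude 0.2, arbitrary phase
dc0, a0, phi = 0.5, 0.2, 0.7
samples = [dc0 + a0 * math.cos(phi + k * 2.0 * math.pi / 3.0) for k in range(3)]
ac, dc = demodulate(*samples)
```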

  18. Surface Plasmon Resonance Biosensor Based on Smart Phone Platforms

    NASA Astrophysics Data System (ADS)

    Liu, Yun; Liu, Qiang; Chen, Shimeng; Cheng, Fang; Wang, Hanqi; Peng, Wei

    2015-08-01

    We demonstrate a fiber optic surface plasmon resonance (SPR) biosensor based on smart phone platforms. The light-weight optical components and sensing element are connected by optical fibers on a phone case. This SPR adaptor can be conveniently installed or removed from smart phones. The measurement, control and reference channels are illuminated by the light entering the lead-in fibers from the phone’s LED flash, while the light from the end faces of the lead-out fibers is detected by the phone’s camera. The SPR-sensing element is fabricated by a light-guiding silica capillary that is stripped off its cladding and coated with 50-nm gold film. Utilizing a smart application to extract the light intensity information from the camera images, the light intensities of each channel are recorded every 0.5 s with refractive index (RI) changes. The performance of the smart phone-based SPR platform for accurate and repeatable measurements was evaluated by detecting different concentrations of antibody binding to a functionalized sensing element, and the experiment results were validated through contrast experiments with a commercial SPR instrument. This cost-effective and portable SPR biosensor based on smart phones has many applications, such as medicine, health and environmental monitoring.

  19. Surface Plasmon Resonance Biosensor Based on Smart Phone Platforms.

    PubMed

    Liu, Yun; Liu, Qiang; Chen, Shimeng; Cheng, Fang; Wang, Hanqi; Peng, Wei

    2015-08-10

    We demonstrate a fiber optic surface plasmon resonance (SPR) biosensor based on smart phone platforms. The light-weight optical components and sensing element are connected by optical fibers on a phone case. This SPR adaptor can be conveniently installed or removed from smart phones. The measurement, control and reference channels are illuminated by the light entering the lead-in fibers from the phone's LED flash, while the light from the end faces of the lead-out fibers is detected by the phone's camera. The SPR-sensing element is fabricated by a light-guiding silica capillary that is stripped off its cladding and coated with 50-nm gold film. Utilizing a smart application to extract the light intensity information from the camera images, the light intensities of each channel are recorded every 0.5 s with refractive index (RI) changes. The performance of the smart phone-based SPR platform for accurate and repeatable measurements was evaluated by detecting different concentrations of antibody binding to a functionalized sensing element, and the experiment results were validated through contrast experiments with a commercial SPR instrument. This cost-effective and portable SPR biosensor based on smart phones has many applications, such as medicine, health and environmental monitoring.

  20. Multi-color pyrometry imaging system and method of operating the same

    DOEpatents

    Estevadeordal, Jordi; Nirmalan, Nirm Velumylum; Tralshawala, Nilesh; Bailey, Jeremy Clyde

    2017-03-21

    A multi-color pyrometry imaging system for a high-temperature asset includes at least one viewing port in optical communication with at least one high-temperature component of the high-temperature asset. The system also includes at least one camera device in optical communication with the at least one viewing port. The at least one camera device includes a camera enclosure and at least one camera aperture defined in the camera enclosure. The at least one camera aperture is in optical communication with the at least one viewing port. The at least one camera device also includes a multi-color filtering mechanism coupled to the enclosure. The multi-color filtering mechanism is configured to sequentially transmit photons within a first predetermined wavelength band and transmit photons within a second predetermined wavelength band that is different than the first predetermined wavelength band.

  1. Polarization sensitive camera for the in vitro diagnostic and monitoring of dental erosion

    NASA Astrophysics Data System (ADS)

    Bossen, Anke; Rakhmatullina, Ekaterina; Lussi, Adrian; Meier, Christoph

    Due to the frequent consumption of acidic foods and beverages, the prevalence of dental erosion is increasing worldwide. In the initial erosion stage, the hard dental tissue is softened by acidic demineralization. As erosion progresses, gradual tissue wear occurs, resulting in thinning of the enamel. Complete loss of the enamel tissue can be observed in severe clinical cases. It is therefore essential to provide a diagnostic tool for accurate detection and monitoring of dental erosion already at early stages. In this manuscript, we present the development of a polarization-sensitive imaging camera for the visualization and quantification of dental erosion. The system consists of two CMOS cameras mounted on two sides of a polarizing beamsplitter. A horizontally linearly polarized light source is positioned orthogonal to the camera to ensure illumination incidence and detection angles of 45°. The specular reflection from the enamel surface is collected with an objective lens mounted on the beamsplitter and divided into horizontal (H) and vertical (V) components on the associated cameras. Images of non-eroded and eroded enamel surfaces at different erosion degrees were recorded and assessed with diagnostic software. The software was designed to generate and display two types of images: the distribution of the reflection intensity (V) and of the polarization ratio (H-V)/(H+V) throughout the analyzed tissue area. The measurement and visualization of these two optical parameters, i.e., the specular reflection intensity and the polarization ratio, allowed detection and quantification of enamel erosion at early stages in vitro.
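    The two displayed quantities reduce to per-pixel arithmetic on the co-registered H and V frames. A sketch; the small eps term is my addition to guard against division by zero in dark pixels, and the pixel values are illustrative:

```python
import numpy as np

def polarization_maps(h, v, eps=1e-9):
    """Reflection-intensity image (V) and polarization-ratio image
    (H - V) / (H + V) from co-registered H and V camera frames."""
    h = np.asarray(h, dtype=float)
    v = np.asarray(v, dtype=float)
    return v, (h - v) / (h + v + eps)

# one illustrative pixel: strong specular H component, weak V component
intensity, ratio = polarization_maps(np.array([[8.0]]), np.array([[2.0]]))
```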

  2. Using Engineering Cameras on Mars Landers and Rovers to Retrieve Atmospheric Dust Loading

    NASA Astrophysics Data System (ADS)

    Wolfe, C. A.; Lemmon, M. T.

    2014-12-01

    Dust in the Martian atmosphere influences energy deposition, dynamics, and the viability of solar powered exploration vehicles. The Viking, Pathfinder, Spirit, Opportunity, Phoenix, and Curiosity landers and rovers each included the ability to image the Sun with a science camera that included a neutral density filter. Direct images of the Sun provide the ability to measure extinction by dust and ice in the atmosphere. These observations have been used to characterize dust storms, to provide ground truth sites for orbiter-based global measurements of dust loading, and to help monitor solar panel performance. In the cost-constrained environment of Mars exploration, future missions may omit such cameras, as the solar-powered InSight mission has. We seek to provide a robust capability of determining atmospheric opacity from sky images taken with cameras that have not been designed for solar imaging, such as lander and rover engineering cameras. Operational use requires the ability to retrieve optical depth on a timescale useful to mission planning, and with an accuracy and precision sufficient to support both mission planning and validating orbital measurements. We will present a simulation-based assessment of imaging strategies and their error budgets, as well as a validation based on archival engineering camera data.
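    For the direct solar images mentioned above, extinction follows the Beer-Lambert law, so optical depth can be sketched from the measured flux, a known top-of-atmosphere flux, and a plane-parallel airmass (the engineering-camera retrieval the abstract proposes is more involved, since it works from sky brightness rather than direct solar imaging). All values below are illustrative:

```python
import math

def optical_depth(flux, flux_top, zenith_deg):
    """Column optical depth tau from direct solar flux via Beer-Lambert:
    flux = flux_top * exp(-tau * m), plane-parallel airmass m = sec(zenith)."""
    m = 1.0 / math.cos(math.radians(zenith_deg))
    return -math.log(flux / flux_top) / m

# round trip: tau = 0.8 observed at 60 deg zenith angle (airmass 2)
tau = optical_depth(1000.0 * math.exp(-1.6), 1000.0, 60.0)
```

    At large zenith angles a curved-atmosphere airmass formula would replace the secant approximation.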

  3. RESTORATION OF ATMOSPHERICALLY DEGRADED IMAGES. VOLUME 3.

    DTIC Science & Technology

    AERIAL CAMERAS, LASERS, ILLUMINATION, TRACKING CAMERAS, DIFFRACTION, PHOTOGRAPHIC GRAIN, DENSITY, DENSITOMETERS, MATHEMATICAL ANALYSIS, OPTICAL SCANNING, SYSTEMS ENGINEERING, TURBULENCE, OPTICAL PROPERTIES, SATELLITE TRACKING SYSTEMS.

  4. A Submersible Holographic Camera for the Undisturbed Characterization of Optically Relevant Particles in Water (HOLOCAM)

    DTIC Science & Technology

    2012-09-01

    how to improve both reconstruction and analytical software during testing of the submersible system. IMPACT AND APPLICATIONS Quality of Life...project (see related projects below). It could also be used for sediment load monitoring and assessment. The HOLOCAM could provide critical data to any...Science Education and Communication Currently the link between the suspended particle field and the bulk scattering properties of natural waters is

  5. HIGH SPEED KERR CELL FRAMING CAMERA

    DOEpatents

    Goss, W.C.; Gilley, L.F.

    1964-01-01

    The present invention relates to a high-speed camera utilizing a Kerr cell shutter and a novel optical delay system having no moving parts. The camera can selectively photograph at least 6 frames within 9 × 10⁻⁸ seconds during any such time interval of an occurring event. The invention particularly utilizes an optical system which views and transmits 6 images of an event to a multi-channeled optical delay relay system. The delay relay system has optical paths of successively increased length, in whole multiples of the first channel's optical path length, into which the 6 images are transmitted. The successively delayed images are accepted from the exit of the delay relay system by an optical image focusing means, which in turn directs the images into a Kerr cell shutter disposed to intercept the image paths. A camera is disposed to simultaneously view and record the 6 images during a single exposure of the Kerr cell shutter. (AEC)

  6. Optical monitor for observing turbulent flow

    DOEpatents

    Albrecht, Georg F.; Moore, Thomas R.

    1992-01-01

    The present invention provides an apparatus and method for non-invasively monitoring turbulent fluid flows, including anisotropic flows. The present invention uses an optical technique to filter out the rays travelling in a straight line while transmitting rays with turbulence-induced fluctuations in time. The output is two-dimensional, and can provide data regarding the spectral intensity distribution or a view of the turbulence in real time. The optical monitor of the present invention comprises a laser that produces a coherent output beam that is directed through a fluid flow, which phase-modulates the beam. The beam is applied to a temporal filter that filters out the rays in the beam that are straight, while substantially transmitting the fluctuating, turbulence-induced rays. The temporal filter includes a lens and a photorefractive crystal such as BaTiO₃ that is positioned in the converging section of the beam near the focal plane. An imaging system is used to observe the filtered beam. The imaging system may take a photograph, or it may include a real-time camera that is connected to a computer. The present invention may be used for many purposes, including research and design in aeronautics, hydrodynamics, and combustion.

  7. Cassini Camera Contamination Anomaly: Experiences and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Haemmerle, Vance R.; Gerhard, James H.

    2006-01-01

    We discuss the contamination 'Haze' anomaly for the Cassini Narrow Angle Camera (NAC), one of two optical telescopes that comprise the Imaging Science Subsystem (ISS). Cassini is a Saturn Orbiter with a 4-year nominal mission. The incident occurred in 2001, five months after Jupiter encounter during the Cruise phase and ironically at the resumption of planned maintenance decontamination cycles. The degraded optical performance was first identified by the Instrument Operations Team with the first ISS Saturn imaging six weeks later. A distinct haze of varying size from image to image marred the images of Saturn. A photometric star calibration of the Pleiades, 4 days after the incident, showed stars with halos. Analysis showed that while the halo's intensity was only 1 - 2% of the intensity of the central peak of a star, the halo contained 30 - 70% of its integrated flux. This condition would impact science return. In a review of our experiences, we examine the contamination control plan, discuss the analysis of the limited data available and describe the one-year campaign to remove the haze from the camera. After several long conservative heating activities and interim analysis of their results, the contamination problem as measured by the camera's point spread function was essentially back to pre-anomaly size and at a point where there would be more risk to continue. We stress the importance of the flexibility of operations and instrument design, the need to do early in-flight instrument calibration and continual monitoring of instrument performance.

  8. Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.

    NASA Astrophysics Data System (ADS)

    Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.

    The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervision of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.

  9. Normalization of laser-induced breakdown spectroscopy spectra using a plastic optical fiber light collector and acoustic sensor device.

    PubMed

    Anabitarte, Francisco; Rodríguez-Cobo, Luis; López-Higuera, José-Miguel; Cobo, Adolfo

    2012-12-01

    To estimate the acoustic plasma energy in laser-induced breakdown spectroscopy (LIBS) experiments, a light collecting and acoustic sensing device based on a coil of plastic optical fiber (POF) is proposed. The speckle perturbation induced by the plasma acoustic energy was monitored using a CCD camera placed at the end of a coil of multimode POF and processed with an intraimage contrast ratio method. The results were successfully verified with the acoustic energy measured by a reference microphone. The proposed device is useful for normalizing LIBS spectra, enabling a better estimation of the sample's chemical composition.
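    A common way to quantify speckle perturbation in a frame, related to the intra-image contrast ratio the abstract mentions, is the spatial speckle contrast C = σ/⟨I⟩; fully developed speckle has C ≈ 1, and acoustic perturbation of the fiber changes C from frame to frame. A sketch on synthetic speckle statistics:

```python
import numpy as np

def speckle_contrast(frame):
    """Spatial speckle contrast C = sigma / mean of an intensity image."""
    frame = np.asarray(frame, dtype=float)
    return float(frame.std() / frame.mean())

# fully developed speckle has negative-exponential intensity statistics,
# for which C tends to 1 over many independent speckles
rng = np.random.default_rng(1)
c = speckle_contrast(rng.exponential(1.0, size=(500, 500)))
```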

  10. Hubble Space Telescope Discovery of a Probable Caustic-Crossing Event in the MACS1149 Galaxy Cluster Field

    NASA Astrophysics Data System (ADS)

    Kelly, Patrick L.; Rodney, Steven; Diego, Jose Maria; Zitrin, Adi; Broadhurst, Tom; Selsing, Jonatan; Balestra, Italo; Benito, Alberto Molino; Bradac, Marusa; Bradley, Larry; Brammer, Gabriel; Cenko, Brad; Christensen, Lise; Coe, Dan; Filippenko, Alexei V.; Foley, Ryan; Frye, Brenda; Graham, Melissa; Graur, Or; Grillo, Claudio; Hjorth, Jens; Howell, Andy; Jauzac, Mathilde; Jha, Saurabh; Kaiser, Nick; Kawamata, Ryota; Kneib, Jean-Paul; Lotz, Jennifer; Matheson, Thomas; McCully, Curtis; Merten, Julian; Nonino, Mario; Oguri, Masamune; Richard, Johan; Riess, Adam; Rosati, Piero; Schmidt, Kasper Borello; Sharon, Keren; Smith, Nathan; Strolger, Lou; Treu, Tommaso; Wang, Xin; Weiner, Ben; Williams, Liliya; Zheng, Weikang

    2016-05-01

    While monitoring the MACS1149 (z = 0.54) galaxy cluster as part of the RefsdalRedux program (PID 14199; PI Kelly) with the Hubble Space Telescope (HST) WFC3 IR camera, we have detected a rising transient that appears to be coincident [...]. Target-of-opportunity optical follow-up imaging in several ACS and WFC3 bands with the FrontierSN program (PID 14208; PI Rodney) has revealed that its rest-frame ultraviolet-through-optical spectrum may be reasonably well fit with that of a B star at z = 1.49 exhibiting a strong Balmer break.

  11. Imaging microscopic structures in pathological retinas using a flood-illumination adaptive optics retinal camera

    NASA Astrophysics Data System (ADS)

    Viard, Clément; Nakashima, Kiyoko; Lamory, Barbara; Pâques, Michel; Levecq, Xavier; Château, Nicolas

    2011-03-01

    This research is aimed at characterizing in vivo differences between healthy and pathological retinal tissues at the microscopic scale using a compact adaptive optics (AO) retinal camera. Tests were performed on 120 healthy eyes and 180 eyes suffering from 19 different pathological conditions, including age-related maculopathy (ARM), glaucoma, and rare diseases such as inherited retinal dystrophies. Each patient was first examined using SD-OCT and infrared SLO. Retinal areas of 4° × 4° were imaged using an AO flood-illumination retinal camera based on a large-stroke deformable mirror. Contrast was finally enhanced by registering and averaging the raw images using classical algorithms. Cellular-resolution images could be obtained in most cases. In ARM, AO images revealed granular contents in drusen, which were invisible in SLO or OCT images, and allowed the observation of the cone mosaic between drusen. In glaucoma cases, the visual field was correlated to changes in cone visibility. In inherited retinal dystrophies, AO helped to evaluate cone loss across the retina. Other microstructures, slightly larger in size than cones, were also visible in several retinas. AO provided potentially useful diagnostic and prognostic information in various diseases. In addition to cones, other microscopic structures revealed by AO images may also be of interest in monitoring retinal diseases.

  12. Optical designs for the Mars '03 rover cameras

    NASA Astrophysics Data System (ADS)

    Smith, Gregory H.; Hagerott, Edward C.; Scherr, Lawrence M.; Herkenhoff, Kenneth E.; Bell, James F.

    2001-12-01

    In 2003, NASA is planning to send two robotic rover vehicles to explore the surface of Mars. The spacecraft will land on airbags in different, carefully chosen locations. The search for evidence indicating conditions favorable for past or present life will be a high priority. Each rover will carry a total of ten cameras of five different types: a stereo pair of color panoramic cameras, a stereo pair of wide-field navigation cameras, one close-up camera on a movable arm, two stereo pairs of fisheye cameras for hazard avoidance, and one Sun sensor camera. This paper discusses the lenses for these cameras, including the specifications, design approaches, expected optical performances, prescriptions, and tolerances.

  13. Speckle-correlation monitoring of the internal micro-vascular flow

    NASA Astrophysics Data System (ADS)

    Zimnyakov, D. A.; Khmara, M. B.; Vilensky, M. A.; Kozlov, V. V.; Gorfinkel, I. V.; Zdrajevsky, R. A.

    2009-10-01

    The results of an experimental study of the feasibility of monitoring microvascular blood flow in the superficial tissues of various organs with an endoscope-based full-field speckle correlometer are presented. Blood microcirculation monitoring was carried out in the course of laparotomy of the abdominal cavity of laboratory animals (rats). Laser light was delivered to the area of interest, and scattered radiation was transferred from the probed zone to the detector (a CMOS camera), via the fiber-optic bundles of the endoscopic system. Microscopic hemodynamics were analyzed for the small intestine, liver, spleen, kidney, and pancreas under different conditions (normal state, provoked peritonitis and ischemia, and administration of vasodilating agents such as papaverine and lidocaine). The prospects and problems of internal monitoring of microvascular flow under laboratory and clinical conditions are discussed.
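Full-field speckle correlometry of this kind typically reduces a stack of camera frames to a per-pixel temporal speckle contrast, where lower contrast indicates faster scatterer motion (stronger flow). A minimal sketch of that reduction, with an assumed flat per-frame pixel layout and an illustrative function name, not the authors' code:

```python
def temporal_contrast(frames):
    """Per-pixel temporal speckle contrast K = sigma/mean over a frame stack.
    frames: list of frames, each a flat list of pixel intensities.
    Lower K at a pixel indicates faster scatterer motion (stronger flow)."""
    n = len(frames)
    out = []
    for p in range(len(frames[0])):
        vals = [f[p] for f in frames]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        out.append((var ** 0.5) / mean if mean else 0.0)
    return out
```

A perfectly static pixel gives K = 0, while a strongly fluctuating one approaches K ~ 1; flow maps are then built by thresholding or color-coding K across the field of view.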

  14. An all-silicone zoom lens in an optical imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Cun-Hua

    2013-09-01

    An all-silicone zoom lens is fabricated. An adjustable metal ring is fastened around the side edge of the lens, and a nylon rope linked to a motor is tied around a notch in the ring. As the motor operates, the rope tightens or slackens, changing the focal length of the lens. A calculation method is developed to obtain the focal length and the zoom ratio. Testing is then carried out, and the measured values agree well with the calculated ones. Finally, the imaging performance of the all-silicone lens is demonstrated. The all-silicone lens has potential uses in cellphone cameras, notebook cameras, miniature monitor lenses, etc.

  15. BDPU, Favier places new test chamber into experiment module in LMS-1 Spacelab

    NASA Image and Video Library

    1996-07-09

    STS078-301-021 (20 June - 7 July 1996) --- Payload specialist Jean-Jacques Favier, representing the French Space Agency (CNES), holds up a test container to a Spacelab camera. The test involves the Bubble Drop Particle Unit (BDPU), which Favier is showing to ground controllers at the Marshall Space Flight Center (MSFC) in order to check the condition of the unit prior to heating in the BDPU facility. The test container holds experimental fluid and allows experiment observation through optical windows. BDPU contains three internal cameras that are used to continuously downlink BDPU activity so that behavior of the bubbles can be monitored. Astronaut Richard M. Linnehan, mission specialist, conducts biomedical testing in the background.

  16. Coordinating High-Resolution Traffic Cameras : Developing Intelligent, Collaborating Cameras for Transportation Security and Communications

    DOT National Transportation Integrated Search

    2015-08-01

    Cameras are used prolifically to monitor transportation incidents, infrastructure, and congestion. Traditional camera systems often require human monitoring and only offer low-resolution video. Researchers for the Exploratory Advanced Research (EAR) ...

  17. Image Intensifier Modules For Use With Commercially Available Solid State Cameras

    NASA Astrophysics Data System (ADS)

    Murphy, Howard; Tyler, Al; Lake, Donald W.

    1989-04-01

    A modular approach to design has contributed greatly to the success of the family of machine vision video equipment produced by EG&G Reticon during the past several years. Internal modularity allows high-performance area (matrix) and line scan cameras to be assembled from two or three electronic subassemblies with very low labor costs, and permits camera control and interface circuitry to be realized by assemblages of various modules suiting the needs of specific applications. Product modularity benefits equipment users in several ways. Modular matrix and line scan cameras are available in identical enclosures (Fig. 1), which allows enclosure components to be purchased in volume for economies of scale and allows field replacement or exchange of cameras within a customer-designed system to be easily accomplished. The cameras are optically aligned (boresighted) at final test; modularity permits optical adjustments to be made with the same precise test equipment for all camera varieties. The modular cameras contain two, or sometimes three, hybrid microelectronic packages (Fig. 2). These rugged and reliable "submodules" perform all of the electronic operations internal to the camera except for the job of image acquisition performed by the monolithic image sensor. Heat produced by electrical power dissipation in the electronic modules is conducted through low-resistance paths to the camera case by metal plates, which results in a thermally efficient and environmentally tolerant camera with low manufacturing costs. A modular approach has also been followed in the design of the camera control, video processor, and computer interface accessory called the Formatter (Fig. 3). This unit can be attached directly onto either a line scan or matrix modular camera to form a self-contained unit, or connected via a cable to retain the advantages inherent in a small, lightweight, and rugged image sensing component.
Available modules permit the bus-structured Formatter to be configured as required by a specific camera application. Modular line and matrix scan cameras incorporating sensors with fiber optic faceplates (Fig. 4) are also available. These units retain the advantages of interchangeability, simple construction, ruggedness, and optical precision offered by the more common lens-input units. Fiber optic faceplate cameras are used for a wide variety of applications. A common usage involves mating of the Reticon-supplied camera to a customer-supplied intensifier tube for low-light-level and/or short-exposure-time situations.

  18. High signal-to-noise-ratio electro-optical terahertz imaging system based on an optical demodulating detector array.

    PubMed

    Spickermann, Gunnar; Friederich, Fabian; Roskos, Hartmut G; Bolívar, Peter Haring

    2009-11-01

    We present a 64x48 pixel 2D electro-optical terahertz (THz) imaging system using a photonic mixing device time-of-flight camera as an optical demodulating detector array. The combination of electro-optic detection with a time-of-flight camera increases sensitivity drastically, enabling the use of a nonamplified laser source for high-resolution real-time THz electro-optic imaging.

  19. Phenology cameras observing boreal ecosystems of Finland

    NASA Astrophysics Data System (ADS)

    Peltoniemi, Mikko; Böttcher, Kristin; Aurela, Mika; Kolari, Pasi; Tanis, Cemal Melih; Linkosalmi, Maiju; Loehr, John; Metsämäki, Sari; Nadir Arslan, Ali

    2016-04-01

    Cameras have become useful tools for monitoring the seasonality of ecosystems. Low-cost cameras facilitate validation of other measurements and allow extracting key ecological features and moments from image time series. We installed a network of phenology cameras at selected ecosystem research sites in Finland. Cameras were installed above, at, and/or below the canopies. The current network hosts cameras taking time-lapse images in coniferous and deciduous forests as well as at open wetlands, thus offering possibilities to monitor various phenological and other time-associated events. In this poster, we present our camera network and give examples of the use of image series for research. We show results on the stability of camera-derived color signals and, based on that, discuss the applicability of cameras for monitoring time-dependent phenomena. We also present results from comparisons between camera-derived color signal time series and daily satellite-derived time series (NDVI, NDWI, and fractional snow cover) from the Moderate Resolution Imaging Spectroradiometer (MODIS) at selected spruce and pine forests and in a wetland. We discuss the applicability of cameras in supporting phenological observations derived from satellites, considering the possibility for cameras to monitor both above- and below-canopy phenology and snow.
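Camera-derived color signals of the kind compared against MODIS indices are commonly reduced to a green chromatic coordinate (GCC) per image region of interest. A minimal sketch of that reduction, assuming RGB pixel tuples; the function names are illustrative, not this network's processing chain:

```python
def gcc(r, g, b):
    """Green chromatic coordinate, a standard phenocam greenness signal."""
    total = r + g + b
    return g / total if total else 0.0

def roi_gcc(pixels):
    """GCC of a region of interest: average the channels first, then normalize.
    pixels: iterable of (R, G, B) tuples from the ROI."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return gcc(r, g, b)
```

Computing the chromatic coordinate rather than raw green values largely cancels overall illumination changes, which is one reason such signals can remain stable across varying weather.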

  20. Application of PLZT electro-optical shutter to diaphragm of visible and mid-infrared cameras

    NASA Astrophysics Data System (ADS)

    Fukuyama, Yoshiyuki; Nishioka, Shunji; Chonan, Takao; Sugii, Masakatsu; Shirahata, Hiromichi

    1997-04-01

    PLZT, (Pb0.91La0.09)(Zr0.65Ti0.35)0.9775O3 (9/65/35), commonly used as an electro-optical shutter, exhibits large phase retardation at low applied voltage. This shutter features (1) high shutter speed, (2) wide optical transmittance, and (3) high optical density in the 'OFF' state. If the shutter is applied as the diaphragm of a video camera, it can protect the sensor from intense light. We have tested the basic characteristics of the PLZT electro-optical shutter and its imaging resolution. The ratio of optical transmittance between the 'ON' and 'OFF' states was 1.1 x 10^3. The response time of the PLZT shutter from the 'ON' state to the 'OFF' state was 10 microseconds. The MTF reduction when placing the PLZT shutter in front of the visible video camera lens was only 12 percent at a spatial frequency of 38 cycles/mm, which corresponds to the sensor resolution of the video camera. Moreover, we acquired visible images with the Si-CCD video camera: a He-Ne laser ghost image was observed in the 'ON' state, whereas the ghost image was totally blocked in the 'OFF' state. From these tests, it has been found that the PLZT shutter is useful as the diaphragm of a visible video camera. The measured optical transmittance of a PLZT wafer with no antireflection coating was 78 percent over the range from 2 to 6 microns.

  1. Demonstrations of Optical Spectra with a Video Camera

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2012-01-01

    The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera…

  2. Calibration of the Auger Fluorescence Telescopes

    NASA Astrophysics Data System (ADS)

    Klages, H.; Pierre Auger Observatory Collaboration

    Thirty fluorescence telescopes in four stations will overlook the detector array of the southern hemisphere experiment of the Pierre Auger project. The main aim of these telescopes is the tracking of EHE air showers, measurement of the longitudinal shower development (Xmax), and determination of the absolute energy of EHE events. A telescope camera contains 440 PMTs, each covering a 1.5 x 1.5 degree pixel of the sky. The response of every pixel is converted into the number of charged particles at the observed part of the shower. This reconstruction includes the shower/observer geometry and the details of atmospheric photon production and transport. The remaining experimental task is to convert the ADC counts of the camera pixel electronics into the light flux entering the Schmidt aperture. Three types of calibration and control are necessary: a) Monitoring of time-dependent variations has to be performed frequently for all parts of the optics and for all pixels. Common illumination of all pixels of a camera allows the detection of individual deviations. Properties of windows, filters and mirrors have to be measured separately. b) Differences in pixel-to-pixel efficiency are mainly due to PMT gain and to differences in effective area (camera shadow, mirror size limits). Homogeneous and isotropic illumination will enable cross calibration. c) An absolute calibration has to be performed periodically using trusted light monitors. The calibration methods used for the Pierre Auger FD telescopes in Argentina are discussed.

  3. Optical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Seppala, L; Gilmore, K

    2008-07-16

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-m primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultraviolet to near-infrared wavelengths. The LSST camera optics consist of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m, and six interchangeable, broad-band filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.

  4. Experimental setup for camera-based measurements of electrically and optically stimulated luminescence of silicon solar cells and wafers.

    PubMed

    Hinken, David; Schinke, Carsten; Herlufsen, Sandra; Schmidt, Arne; Bothe, Karsten; Brendel, Rolf

    2011-03-01

    We report in detail on the luminescence imaging setup developed over the last years in our laboratory. In this setup, the luminescence emission of silicon solar cells or silicon wafers is analyzed quantitatively. Charge carriers are excited electrically (electroluminescence) using a power supply for carrier injection, or optically (photoluminescence) using a laser as the illumination source. The luminescence emission arising from the radiative recombination of the stimulated charge carriers is measured spatially resolved using a camera. We give details of the various components, including the cameras, optical filters for electro- and photoluminescence, the semiconductor laser, and the four-quadrant power supply. We compare a silicon charge-coupled device (CCD) camera, a back-illuminated silicon CCD camera with electron-multiplying gain, and a complementary metal oxide semiconductor indium gallium arsenide camera. For the detection of the luminescence emission of silicon, we analyze the dominant noise sources along with the signal-to-noise ratio of all three cameras at different operating conditions.
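The dominant noise sources in such a camera comparison are typically photon shot noise, dark-current shot noise, and read noise, combined in quadrature. A generic per-pixel SNR model in electron units, a sketch for illustration rather than the authors' exact analysis:

```python
import math

def camera_snr(signal_e, dark_e, read_noise_e):
    """Per-pixel SNR for an integrating sensor, all quantities in electrons:
    shot noise sqrt(S), dark-current shot noise sqrt(D), read noise R,
    combined in quadrature."""
    noise = math.sqrt(signal_e + dark_e + read_noise_e ** 2)
    return signal_e / noise
```

This simple model already explains the qualitative camera ranking: at the faint signal levels of silicon luminescence, read noise dominates the denominator, which is where back-illuminated and electron-multiplying architectures pay off.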

  5. Digital holographic interferometry applied to the investigation of ignition process.

    PubMed

    Pérez-Huerta, J S; Saucedo-Anaya, Tonatiuh; Moreno, I; Ariza-Flores, D; Saucedo-Orozco, B

    2017-06-12

    We use the digital holographic interferometry (DHI) technique to display the early ignition process of a butane-air mixture flame. Because the event occurs in a short time (a few milliseconds), a fast CCD camera is used to study it. As more detail is required for monitoring the temporal evolution of the process, less light coming from the combustion is captured by the CCD camera, resulting in a deficient, underexposed image. Therefore, direct observation of the combustion process with the CCD is limited (down to 1000 frames per second). To overcome this drawback, we propose the use of DHI along with a high-power laser to supply enough light to increase the capture speed, thus improving visualization of the phenomenon in its initial moments. An experimental optical setup based on DHI is used to obtain a large sequence of phase maps that allows us to observe two transitory stages in the ignition process: a first explosion that emits little visible light, and a second stage induced by variations in temperature as the flame emerges. While the latter stage can be directly monitored by the CCD camera, the first stage is hardly detected by direct observation, and DHI clearly reveals it. Furthermore, our method can be easily adapted for visualizing other types of fast processes.

  6. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light-staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
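Camera dynamic range figures such as the 51.3 dB and 82.06 dB quoted above follow the usual optical convention of 20*log10 of the ratio between the brightest and dimmest resolvable light levels. A quick sketch (the function name is an assumption for illustration):

```python
import math

def dynamic_range_db(max_signal, min_signal):
    """Optical dynamic range in decibels: 20*log10 of the light-level ratio."""
    return 20 * math.log10(max_signal / min_signal)
```

By this convention, the demonstrated 82.06 dB corresponds to a brightest-to-dimmest ratio of roughly 12,700:1, versus roughly 370:1 for the 51.3 dB CMOS sensor alone.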

  7. New generation of meteorology cameras

    NASA Astrophysics Data System (ADS)

    Janout, Petr; Blažek, Martin; Páta, Petr

    2017-12-01

    A new generation of the WILLIAM (WIde-field aLL-sky Image Analyzing Monitoring system) camera includes new features such as monitoring of rain and storm clouds during daytime observation. Development of this new generation of weather monitoring cameras responds to the demand for monitoring sudden weather changes. The new WILLIAM cameras are able to process acquired image data immediately, issue warnings of sudden torrential rain, and send them to a user's cell phone and email. Actual weather conditions are determined from image data, and the results of image processing are complemented by data from temperature, humidity, and atmospheric pressure sensors. In this paper, we present the architecture and image data processing algorithms of this monitoring camera, and a spatially variant model of the imaging system's aberrations based on Zernike polynomials.

  8. Cloud Computing with Context Cameras

    NASA Astrophysics Data System (ADS)

    Pickles, A. J.; Rosing, W. E.

    2016-05-01

    We summarize methods and plans to monitor and calibrate photometric observations with our autonomous, robotic network of 2m, 1m and 40cm telescopes. These are sited globally to optimize our ability to observe time-variable sources. Wide-field "context" cameras are aligned with our network telescopes and cycle every ~2 minutes through BVr'i'z' filters, spanning our optical range. We measure instantaneous zero-point offsets and transparency (throughput) against calibrators in the 5-12m range from the all-sky Tycho2 catalog, and periodically against primary standards. Similar measurements are made for all our science images, with typical fields of view of ~0.5 degrees. These are matched against Landolt, Stetson and Sloan standards, and against calibrators in the 10-17m range from the all-sky APASS catalog. Such measurements provide good instantaneous flux calibration, often to better than 5%, even in cloudy conditions. Zero-point and transparency measurements can be used to characterize, monitor and inter-compare sites and equipment. When accurate calibrations of Target against Standard fields are required, monitoring measurements can be used to select truly photometric periods when accurate calibrations can be automatically scheduled and performed.
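An instantaneous zero-point offset of this kind is conventionally a robust average of the differences between catalog magnitudes and instrumental magnitudes of matched calibrators. A minimal sketch under that convention; the function names and the plain median are illustrative, not this network's pipeline:

```python
import math

def instrumental_mag(flux_counts):
    """Instrumental magnitude from measured flux (counts, arbitrary scale)."""
    return -2.5 * math.log10(flux_counts)

def zero_point(catalog_mags, fluxes):
    """Median zero-point offset from matched calibrator stars:
    ZP = median(m_catalog - m_instrumental)."""
    diffs = sorted(m - instrumental_mag(f) for m, f in zip(catalog_mags, fluxes))
    n, mid = len(diffs), len(diffs) // 2
    return diffs[mid] if n % 2 else 0.5 * (diffs[mid - 1] + diffs[mid])
```

Once the zero point is known, any science-image source calibrates as m = instrumental_mag(flux) + ZP; tracking ZP over time directly measures transparency changes, which is how cloudy-but-usable periods can be identified.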

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boivin, Jonathan, E-mail: jonathan.boivin.1@ulaval.ca; Beaulieu, Luc; Beddar, Sam

    Purpose: The authors’ objective was to systematically assess the performance of seven photodetectors used in plastic scintillation dosimetry. The authors also propose some guidelines for selecting an appropriate detector for a specific application. Methods: The plastic scintillation detector (PSD) consisted of a 1-mm diameter, 10-mm long plastic scintillating fiber (BCF-60), which was optically coupled to a clear 10-m long optical fiber of the same diameter. A light-tight plastic sheath covered both fibers and the scintillator end was sealed. The clear fiber end was connected to one of the following photodetectors: two polychromatic cameras (one with an optical lens and one with a fiber optic taper replacing the lens), a monochromatic camera with an optical lens, a PIN photodiode, an avalanche photodiode (APD), or a photomultiplier tube (PMT). A commercially available W1 PSD was also included in the study, but it relied on its own fiber and scintillator. Each PSD was exposed to both low-energy beams (120, 180, and 220 kVp) from an orthovoltage unit and high-energy beams (6 and 23 MV) from a linear accelerator. Various dose rates were tested to identify the operating range and accuracy of each photodetector. Results: For all photodetectors, the relative uncertainty was less than 5% for dose rates higher than 3 mGy/s. The cameras allowed multiple probes to be used simultaneously, but they are less sensitive to low-light signals. The PIN, APD, and PMT had higher sensitivity, making them more suitable for low dose rate and out-of-field dose monitoring. The relative uncertainty of the PMT was less than 1% at the lowest dose rate achieved (0.10 mGy/s), suggesting that it was optimal for use in live dosimetry. Conclusions: For dose rates higher than 3 mGy/s, the PIN diode is the most effective photodetector in terms of performance/cost ratio. For lower dose rates, such as those seen in interventional radiology or high-gradient radiotherapy, PMTs are the optimal choice.

  10. Optical sensing in laser machining

    NASA Astrophysics Data System (ADS)

    Smurov, Igor; Doubenskaia, Maria

    2009-05-01

    Optical monitoring of temperature evolution and temperature distribution in laser machining provides important information for optimizing and controlling the technological process under study. A multi-wavelength pyrometer is used to measure brightness temperature under the pulsed action of an Nd:YAG laser on stainless steel substrates. Specially developed notch filters (10^-6 transparency at the 1.06 μm wavelength) are applied to avoid the influence of laser radiation on the temperature measurements. The true temperature is then restored using the method of multi-colour pyrometry. Temperature monitoring of thin-walled gilded Kovar boxes is applied to detect deviation of the welding seam from its optimum position. The pyrometers are also used to control CO2-laser welding of steel and Ti plates: misalignment of the welded plates, variation of the welding geometry, internal defects, deviation of the laser beam trajectory from the junction, etc. Temperature profiles along and across the welding axis are measured by the 2D pyrometer. When using multi-component powder blends in laser cladding, for example a metal matrix composite with ceramic reinforcement, one needs to control the temperature of the melt to avoid thermal decomposition of certain compounds (such as WC) and to ensure melting of the base metal (such as Co). An infrared camera (FLIR Phoenix RDAS) provides detailed information on the distribution of brightness temperature in the laser cladding zone. A CCD-camera-based diagnostic system is used to measure particle-in-flight velocity and size distributions.
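Multi-colour pyrometry recovers a true temperature from the ratio of radiances at two (or more) wavelengths, because for a graybody the unknown emissivity cancels in the ratio. A minimal two-colour sketch under the Wien approximation; the function names and the graybody assumption are illustrative, and the paper's multi-wavelength method is more general:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_intensity(lam, temp):
    """Wien-approximation spectral radiance (arbitrary leading constant)."""
    return lam ** -5 * math.exp(-C2 / (lam * temp))

def ratio_temperature(i1, i2, lam1, lam2):
    """True temperature from the two-colour radiance ratio i1/i2 at
    wavelengths lam1, lam2 (meters); emissivity cancels for a graybody."""
    return C2 * (1 / lam1 - 1 / lam2) / (
        5 * math.log(lam2 / lam1) - math.log(i1 / i2))
```

The round trip is exact by construction: a radiance pair generated at some temperature is inverted back to that temperature, which is a convenient sanity check before applying the formula to measured channel ratios.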

  11. Atomic force-multi-optical imaging integrated microscope for monitoring molecular dynamics in live cells.

    PubMed

    Trache, Andreea; Meininger, Gerald A

    2005-01-01

    A novel hybrid imaging system is constructed by integrating atomic force microscopy (AFM) with a combination of optical imaging techniques that offer high spatial resolution. The main application of this instrument (the NanoFluor microscope) is the study of mechanotransduction with an emphasis on extracellular matrix-integrin-cytoskeletal interactions and their role in the cellular responses to changes in external chemical and mechanical factors. The AFM allows the quantitative assessment of cytoskeletal changes, binding probability, adhesion forces, and micromechanical properties of the cells, while the optical imaging applications allow thin sectioning of the cell body at the coverslip-cell interface, permitting the study of focal adhesions using total internal reflection fluorescence (TIRF) and internal reflection microscopy (IRM). Combined AFM-optical imaging experiments show that mechanical stimulation at the apical surface of cells induces a force-generating cytoskeletal response, resulting in focal contact reorganization on the basal surface that can be monitored in real time. The NanoFluor system is also equipped with a novel mechanically aligned dual camera acquisition system for Förster resonance energy transfer (FRET). The integrated NanoFluor microscope system is described, including its characteristics, applications, and limitations.

  12. Stereo electro-optical tracker study for the measurement of model deformations at the National Transonic Facility

    NASA Astrophysics Data System (ADS)

    Hertel, R. J.; Hoilman, K. A.

    1982-01-01

    The effects of model vibration, camera and window nonlinearities, and aerodynamic disturbances in the optical path on the measurement of target position are examined. Window distortion, temperature and pressure changes, laminar and turbulent boundary layers, shock waves, target intensity, and target vibration are also studied. A general computer program was developed to trace optical rays through these disturbances. The use of a charge injection device camera as an alternative to the image dissector camera was also examined.

  13. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

    Context. The ESA cometary mission Rosetta was launched in 2004. In the following years, and until the spacecraft entered hibernation in June 2011, the two cameras of the OSIRIS imaging system (the Narrow Angle and Wide Angle Cameras, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now needed to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of aging effects on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation is discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study possible dust contamination of the optics.

  14. Non-contact continuous-wave diffuse optical tomographic system to capture vascular dynamics in the foot

    NASA Astrophysics Data System (ADS)

    Hoi, Jennifer W.; Kim, Hyun K.; Khalil, Michael A.; Fong, Christopher J.; Marone, Alessandro; Shrikhande, Gautam; Hielscher, Andreas H.

    2015-03-01

    Dynamic optical tomographic imaging has shown promise in diagnosing and monitoring peripheral arterial disease (PAD), which affects 8 to 12 million people in the United States. PAD is the narrowing of the arteries that supply blood to the lower extremities. Prolonged reduced blood flow to the foot leads to ulcers and gangrene, which makes placement of optical fibers for contact-based optical tomography systems difficult and cumbersome. Since many diabetic PAD patients have foot wounds, a non-contact interface is highly desirable. We present a novel non-contact dynamic continuous-wave optical tomographic imaging system that images the vasculature in the foot for evaluating PAD. The system images at up to 1 Hz by delivering two wavelengths of light to the top of the foot at up to 20 source positions through collimated source fibers. Transmitted light is collected with an electron-multiplying charge-coupled device (EMCCD) camera. We demonstrate that the system can resolve absorbers at various locations in a phantom study, and we show the system's first clinical 3D images of total hemoglobin changes in the foot during venous occlusion at the thigh. Our initial results indicate that this system is effective in capturing the vascular dynamics within the foot and can be used to diagnose and monitor treatment of PAD in diabetic patients.
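Two-wavelength measurements of this kind yield hemoglobin changes through the modified Beer-Lambert law: the absorption change at each wavelength is a weighted sum of oxy- and deoxy-hemoglobin concentration changes, giving a 2x2 linear system per voxel. A minimal sketch of that inversion; the function name and the extinction-coefficient inputs are assumptions for illustration, not this system's reconstruction code:

```python
def hemoglobin_changes(dmua1, dmua2, e_hbo1, e_hb1, e_hbo2, e_hb2):
    """Solve the 2x2 spectral system for hemoglobin concentration changes:
        dmua(lam1) = e_hbo1 * dHbO + e_hb1 * dHb
        dmua(lam2) = e_hbo2 * dHbO + e_hb2 * dHb
    via Cramer's rule. Total hemoglobin change is dHbO + dHb."""
    det = e_hbo1 * e_hb2 - e_hb1 * e_hbo2
    d_hbo = (dmua1 * e_hb2 - e_hb1 * dmua2) / det
    d_hb = (e_hbo1 * dmua2 - dmua1 * e_hbo2) / det
    return d_hbo, d_hb
```

The wavelength pair is chosen so the extinction-coefficient matrix is well conditioned (one wavelength on each side of the hemoglobin isosbestic point near 800 nm), otherwise small measurement noise is strongly amplified by the inversion.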

  15. Space telescope optical telescope assembly/scientific instruments. Phase B: -Preliminary design and program definition study; Volume 2A: Planetary camera report

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Development of the F/48, F/96 Planetary Camera for the Large Space Telescope is discussed. Instrument characteristics, optical design, and CCD camera submodule thermal design are considered along with structural subsystem and thermal control subsystem. Weight, electrical subsystem, and support equipment requirements are also included.

  16. Employing unmanned aerial vehicle to monitor the health condition of wind turbines

    NASA Astrophysics Data System (ADS)

    Huang, Yishuo; Chiang, Chih-Hung; Hsu, Keng-Tsang; Cheng, Chia-Chi

    2018-04-01

    Unmanned aerial vehicles (UAVs) can gather spatial information on huge structures, such as wind turbines, that can be difficult to obtain with traditional approaches. In this paper, the UAV used in the experiments is equipped with a high-resolution camera and a thermal infrared camera. The high-resolution camera provides a series of images with resolutions up to 10 megapixels. These images can be used to form a 3D model using digital photogrammetry. By comparing 3D scenes of the same wind turbine at different times, possible displacement of the supporting tower, caused by ground movement or foundation deterioration, may be determined. The recorded thermal images are analyzed by applying image segmentation methods to the surface temperature distribution, separating sub-regions by differences in surface temperature. The high-resolution optical image and the segmented thermal image are then fused so that surface anomalies of the wind turbines are more easily identified.

  17. Handheld hyperspectral imager for standoff detection of chemical and biological aerosols

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Jensen, James O.; McAnally, Gerard

    2004-08-01

    Pacific Advanced Technology has developed a small hand-held imaging spectrometer, Sherlock, for gas leak and aerosol detection and imaging. The system is based on a patented technique, Image Multi-spectral Sensing (IMSS), that uses diffractive optics and image processing algorithms to extract spectral information about objects in the camera's scene. This camera technology has been tested at Dugway Proving Ground and the Dstl Porton Down facility, looking at chemical and biological agent simulants. In addition to chemical and biological detection, the camera has been used for environmental monitoring of greenhouse gases and is currently undergoing extensive laboratory and field testing by the Gas Technology Institute, British Petroleum, and Shell Oil for gas leak detection and repair applications. In this paper we present some of the results from the data collection at the TRE test at Dugway Proving Ground during the summer of 2002 and laboratory testing at the Dstl facility at Porton Down in the UK in the fall of 2002.

  18. Optically Remote Noncontact Heart Rates Sensing Technique

    NASA Astrophysics Data System (ADS)

    Thongkongoum, W.; Boonduang, S.; Limsuwan, P.

    2017-09-01

    Heart rate monitoring via an optically remote, noncontact technique is reported in this research. A green laser (5 mW, 532±10 nm) was projected onto the left carotid artery. The laser light reflected onto a screen carried the deviation of the interference patterns, which were recorded by a digital camera. The recorded videos were analysed frame by frame with two standard digital image processing (DIP) techniques: block matching (BM) and optical flow (OF). The region of interest (ROI) pixels within the interference patterns were analysed for periodic changes caused by the heart's pumping action. The results of both the BM and OF techniques were compared with a reference medical heart rate monitor, a contact device based on the pulse transit technique. The heart rate obtained from the BM technique was 74.67 bpm (beats per minute) and from the OF technique 75.95 bpm. Compared with the reference value of 75.43±1 bpm, the errors were 1.01% and 0.69%, respectively.
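
    Both the BM and OF pipelines ultimately reduce to extracting a dominant periodicity from a displacement trace. A minimal stand-in for that final step, using an FFT peak within the physiological band (the trace and frame rate below are synthetic, not the study's data):

```python
import numpy as np

def heart_rate_bpm(trace, fps):
    """Dominant frequency of a displacement trace, restricted to a
    physiological band (42-180 bpm), converted to beats per minute.
    A simplified stand-in for the BM/OF analyses described above."""
    x = np.asarray(trace, float) - np.mean(trace)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic 20 s trace at 30 fps with a 1.25 Hz (75 bpm) pulsation:
fps = 30.0
t = np.arange(0, 20, 1.0 / fps)
trace = np.sin(2 * np.pi * 1.25 * t) \
    + 0.1 * np.random.default_rng(0).normal(size=t.size)
bpm = heart_rate_bpm(trace, fps)
error_pct = 100.0 * abs(bpm - 75.43) / 75.43   # vs. the reference monitor value
```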

  19. TH-AB-202-11: Spatial and Rotational Quality Assurance of 6DOF Patient Tracking Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belcher, AH; Liu, X; Grelewicz, Z

    2016-06-15

    Purpose: External tracking systems used for patient positioning and motion monitoring during radiotherapy are now capable of detecting both translations and rotations (6DOF). In this work, we develop a novel technique to evaluate the 6DOF performance of external motion tracking systems. We apply this methodology to an infrared (IR) marker tracking system and two 3D optical surface mapping systems in a common tumor 6DOF workspace. Methods: An in-house designed and built 6DOF parallel-kinematics robotic motion phantom was used to follow input trajectories with sub-millimeter and sub-degree accuracy. The 6DOF positions of the robotic system were then tracked and recorded independently by three optical camera systems. A calibration methodology which associates the motion phantom and camera coordinate frames was first employed, followed by a comprehensive 6DOF trajectory evaluation, which spanned a full range of positions and orientations in a 20×20×16 mm and 5×5×5 degree workspace. The intended input motions were compared to the calibrated 6DOF measured points. Results: The technique found the accuracy of the IR marker tracking system to have maximal root mean square error (RMSE) values of 0.25 mm translationally and 0.09 degrees rotationally, in any one axis, comparing intended 6DOF positions to positions measured by the IR camera. The 6DOF RMSE discrepancy for the first 3D optical surface tracking unit yielded maximal values of 0.60 mm and 0.11 degrees over the same 6DOF volume. An earlier-generation 3D optical surface tracker was observed to have worse tracking capabilities than both the IR camera unit and the newer 3D surface tracking system, with maximal RMSE of 0.74 mm and 0.28 degrees within the same 6DOF evaluation space. Conclusion: The proposed technique was effective at evaluating the performance of 6DOF patient tracking systems. All systems examined exhibited tracking capabilities at the sub-millimeter and sub-degree level within a 6DOF workspace.
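
    The per-axis RMSE comparison between commanded and measured 6DOF poses can be sketched as follows; the trajectory and tracking-noise levels are hypothetical, not the study's data:

```python
import numpy as np

def per_axis_rmse(commanded, measured):
    """Root-mean-square error per 6DOF axis (x, y, z in mm; rx, ry, rz
    in degrees) between commanded phantom poses and camera-reported poses."""
    err = np.asarray(measured, float) - np.asarray(commanded, float)
    return np.sqrt(np.mean(err ** 2, axis=0))

# Hypothetical trajectory spanning a 20x20x16 mm / 5x5x5 degree workspace,
# with made-up noise (0.2 mm translational, 0.05 deg rotational):
rng = np.random.default_rng(1)
commanded = rng.uniform([-10, -10, -8, -2.5, -2.5, -2.5],
                        [10, 10, 8, 2.5, 2.5, 2.5], size=(200, 6))
measured = commanded + rng.normal(scale=[0.2, 0.2, 0.2, 0.05, 0.05, 0.05],
                                  size=(200, 6))
rmse = per_axis_rmse(commanded, measured)
```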

  20. Pi of the Sky observation of GRB160625B

    NASA Astrophysics Data System (ADS)

    Opiela, Rafał; Batsch, Tadeusz; Castro-Tirado, Alberto Javier; Czyrkowski, Henryk; Ćwiek, Arkadiusz; Ćwiok, Mikołaj; Dąbrowski, Ryszard; Jelinek, Martin; Kasprowicz, Grzegorz; Majcher, Ariel; Małek, Katarzyna; Mankiewicz, Lech; Nawrocki, Krzysztof; Obara, Łukasz; Piotrowski, Lech; Siudek, Małgorzata; Sokołowski, Marcin; Wawrzaszek, Roman; Wrochna, Grzegorz; Zaremba, Marcin; Żarnecki, Aleksander Filip

    2017-08-01

    Pi of the Sky is a system of wide-field-of-view robotic telescopes, which search for short-timescale astrophysical phenomena, especially prompt optical GRB emission. The system was designed for autonomous operation, monitoring a large fraction of the sky to a depth of 12m-13m and with a time resolution of the order of 10 seconds. Custom-designed CCD cameras are equipped with Canon lenses (f = 85 mm, f/d = 1.2) and each covers 20° × 20° of the sky. The final system, with 16 cameras on 4 equatorial mounts, was completed in 2014 at the INTA El Arenosillo Test Centre in Spain. GRB160625B was an extremely bright GRB with three distinct emission episodes. Cameras of the Pi of the Sky observatory in Spain were not observing the position of GRB160625B prior to the first emission episode. Observations started only after receiving the Fermi/GBM trigger, about 140 seconds prior to the second emission. As the position estimate taken from the Fermi alert and used to point the telescope was not very accurate, the actual position of the burst happened to be in the overlap region of two cameras, resulting in two independent sets of measurements. Light curves from both cameras were reconstructed using the Luiza framework. No object brighter than 12.4m (3σ limit) was observed prior to the second GRB emission. An optical flash was identified on an image starting 5.9 s before the time of the Fermi/LAT trigger, brightening to about 8m on the next image and then becoming gradually dimmer, fading below our sensitivity after about 400 s. Emission features as measured in different spectral bands indicate that the three emission episodes of GRB160625B were dominated by distinct physical processes. Simultaneous observations at gamma-ray and optical wavelengths support the hypothesis that this was the first observed transition from thermal to non-thermal radiation in a single GRB. The main results of the combined analysis are presented.

  1. Defining ray sets for the analysis of lenslet-based optical systems including plenoptic cameras and Shack-Hartmann wavefront sensors

    NASA Astrophysics Data System (ADS)

    Moore, Lori

    Plenoptic cameras and Shack-Hartmann wavefront sensors are lenslet-based optical systems that do not form a conventional image. The addition of a lens array into these systems allows for the aberrations generated by the combination of the object and the optical components located prior to the lens array to be measured or corrected with post-processing. This dissertation provides a ray selection method to determine the rays that pass through each lenslet in a lenslet-based system. This first-order, ray trace method is developed for any lenslet-based system with a well-defined fore optic, where in this dissertation the fore optic is all of the optical components located prior to the lens array. For example, in a plenoptic camera the fore optic is a standard camera lens. Because a lens array at any location after the exit pupil of the fore optic is considered in this analysis, it is applicable to both plenoptic cameras and Shack-Hartmann wavefront sensors. Only a generic, unaberrated fore optic is considered, but this dissertation establishes a framework for considering the effect of an aberrated fore optic in lenslet-based systems. The rays from the fore optic that pass through a lenslet placed at any location after the fore optic are determined. This collection of rays is reduced to three rays that describe the entire lenslet ray set. The lenslet ray set is determined at the object, image, and pupil planes of the fore optic. The consideration of the apertures that define the lenslet ray set for an on-axis lenslet leads to three classes of lenslet-based systems. Vignetting of the lenslet rays is considered for off-axis lenslets. Finally, the lenslet ray set is normalized into terms similar to the field and aperture vector used to describe the aberrated wavefront of the fore optic. 
The analysis in this dissertation is complementary to other first-order models that have been developed for a specific plenoptic camera layout or Shack-Hartmann wavefront sensor application. This general analysis determines the locations where the rays of each lenslet pass through the fore optic, establishing a framework for considering the effect of an aberrated fore optic in a future analysis.

  2. Close-range photogrammetry with video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1985-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  3. Close-Range Photogrammetry with Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1983-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  4. Monitoring chicken flock behaviour provides early warning of infection by human pathogen Campylobacter

    PubMed Central

    Colles, Frances M.; Cain, Russell J.; Nickson, Thomas; Smith, Adrian L.; Roberts, Stephen J.; Maiden, Martin C. J.; Lunn, Daniel; Dawkins, Marian Stamp

    2016-01-01

    Campylobacter is the commonest bacterial cause of gastrointestinal infection in humans, and chicken meat is the major source of infection throughout the world. Strict and expensive on-farm biosecurity measures have been largely unsuccessful in controlling infection and are hampered by the time needed to analyse faecal samples, with the result that Campylobacter status is often known only after a flock has been processed. Our data demonstrate an alternative approach that monitors the behaviour of live chickens with cameras and analyses the ‘optical flow’ patterns made by flock movements. Campylobacter-free chicken flocks have higher mean and lower kurtosis of optical flow than those testing positive for Campylobacter by microbiological methods. We show that by monitoring behaviour in this way, flocks likely to become positive can be identified within the first 7–10 days of life, much earlier than conventional on-farm microbiological methods. This early warning has the potential to lead to a more targeted approach to Campylobacter control and also provides new insights into possible sources of infection that could transform the control of this globally important food-borne pathogen. PMID:26740618
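
    The two flock-level summary statistics used here, the mean and kurtosis of optical flow, can be computed from pooled flow magnitudes as follows (the sample data are synthetic, not the study's video data):

```python
import numpy as np

def flow_statistics(flow_magnitudes):
    """Mean and (Pearson) kurtosis of optical-flow magnitudes pooled over
    a flock video -- the two summary statistics related above to
    Campylobacter status. Normally distributed data give kurtosis ~ 3."""
    m = np.asarray(flow_magnitudes, float).ravel()
    z = (m - m.mean()) / m.std()
    return m.mean(), np.mean(z ** 4)

# Synthetic stand-in for a flock's pooled flow magnitudes:
rng = np.random.default_rng(0)
mean_flow, kurt_flow = flow_statistics(rng.normal(2.0, 0.5, size=200_000))
```

    Campylobacter-free flocks showed a higher mean and lower kurtosis than flocks that later tested positive.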

  5. Fourier-domain optical coherence tomography and adaptive optics reveal nerve fiber layer loss and photoreceptor changes in a patient with optic nerve drusen.

    PubMed

    Choi, Stacey S; Zawadzki, Robert J; Greiner, Mark A; Werner, John S; Keltner, John L

    2008-06-01

    New technology allows more precise definition of structural alterations of all retinal layers, although it has not been used previously in cases of optic disc drusen. Using Stratus and Fourier-domain (FD) optical coherence tomography (OCT) and adaptive optics (AO) through a flood-illuminated fundus camera, we studied the retinas of a patient with long-standing optic disc drusen and acute visual loss at high altitude attributed to ischemic optic neuropathy. Stratus OCT and FD-OCT confirmed severe thinning of the retinal nerve fiber layer (RNFL). FD-OCT revealed disturbances in the photoreceptor layer heretofore not described in optic disc drusen patients. AO confirmed the FD-OCT findings in the photoreceptor layer and also showed reduced cone density at retinal locations associated with reduced visual sensitivity. Based on this study, changes occur not only in the RNFL but also in the photoreceptor layer in optic nerve drusen complicated by ischemic optic neuropathy. This is the first reported application of FD-OCT and AO to this condition. Such new imaging technology may in the future allow monitoring of disease progression more precisely and accurately.

  6. Nonmetallic materials contamination studies. [space telescope

    NASA Technical Reports Server (NTRS)

    Muscari, J. A.; Beverlin, G.

    1980-01-01

    In order to impose adequate contamination control requirements in the selection of Wide Field Planetary Camera (WFPC) materials and to develop a data base of potential optical degradation of the WFPC charge-coupled device window, the outgassing properties of WFPC materials and the collected volatile condensable material (CVCM) effects on MgF2 transmittance were measured. Changes in the transmittance were monitored in the wavelength region from 115 nm to 300 nm for selected CVCM thicknesses up to 150 nm. The outgassing properties of reemitted CVCM were also studied.

  7. Optical and Image Transmission through Desert Atmospheres

    DTIC Science & Technology

    1994-01-25

    control the camera and to record and analyze the digitized image. ... Recording of the target video scenes was initially severely limited by wind-induced ... vibrations of the digital camera platform. In order to overcome this limitation, a portable wind-resistant cage, measuring approximately 3x3x5 m, was ... monitoring and recording equipment was also housed inside the wind cage. In order to minimize its weight, the wind cage was constructed with a 2"x4" wooden frame

  8. Optical performance analysis of plenoptic camera systems

    NASA Astrophysics Data System (ADS)

    Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas

    2014-09-01

    Adding an array of microlenses in front of the sensor transforms the capabilities of a conventional camera to capture both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g. artificial re-focusing of photographs. Without the need of active illumination it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in a decreased depth resolution. Besides, the gain of angular information implies a degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated in order to verify the predicted optical characteristics.
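
    The spatial/angular trade-off noted above can be made concrete for a conventional ("plenoptic 1.0") layout, where each lenslet contributes one spatial sample and the pixels behind it sample angle. The sensor and lenslet sizes are illustrative, not the prototype's parameters:

```python
def plenoptic_samples(sensor_px, lenslet_pitch_px):
    """Per-axis sample counts for a plenoptic 1.0 layout: one spatial
    sample per lenslet, `lenslet_pitch_px` angular samples behind each.
    Increasing the pitch trades spatial for angular resolution."""
    spatial = sensor_px // lenslet_pitch_px   # number of lenslets across
    angular = lenslet_pitch_px                # pixels behind each lenslet
    return spatial, angular

# Illustrative numbers: a 4000-pixel-wide sensor with 10-pixel lenslets.
spatial, angular = plenoptic_samples(4000, 10)
```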

  9. Analysis of the effect on optical equipment caused by solar position in target flight measure

    NASA Astrophysics Data System (ADS)

    Zhu, Shun-hua; Hu, Hai-bin

    2012-11-01

    Optical equipment is widely used to measure flight parameters in target flight performance tests, but the equipment is sensitive to the sun's rays. To prevent direct sunlight from shining into the camera lens of the optical equipment while measuring target flight parameters, the angle between the observation direction and the line connecting the camera lens and the sun should be kept large. This article introduces the method for calculating the solar azimuth and altitude at the optical equipment for any time and place on Earth, the model of the equipment's observation direction, and the model for calculating the angle between the observation direction and the equipment-to-sun line. A simulation of the effect of solar position on the optical equipment at different times, dates, months, and target flight directions is also given.
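
    The core of such a solar-position calculation is the standard spherical-astronomy relation between altitude, site latitude, solar declination, and hour angle. A minimal sketch (not the paper's full model, which also computes azimuth and the equipment-to-sun angle):

```python
import math

def solar_altitude(lat_deg, decl_deg, hour_angle_deg):
    """Solar altitude (degrees) from site latitude, solar declination and
    hour angle: sin(alt) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(H)."""
    lat, dec, h = (math.radians(v) for v in (lat_deg, decl_deg, hour_angle_deg))
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(dec)
        + math.cos(lat) * math.cos(dec) * math.cos(h)))

# At solar noon (H = 0) the altitude reduces to 90 - |lat - dec|:
alt = solar_altitude(40.0, 23.44, 0.0)   # summer solstice at 40 N
```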

  10. Potential for application of an acoustic camera in particle tracking velocimetry.

    PubMed

    Wu, Fu-Chun; Shao, Yun-Chuan; Wang, Chi-Kuei; Liou, Jim

    2008-11-01

    We explored the potential and limitations for applying an acoustic camera as the imaging instrument of particle tracking velocimetry. The strength of the acoustic camera is its usability in low-visibility environments where conventional optical cameras are ineffective, while its applicability is limited by lower temporal and spatial resolutions. We conducted a series of experiments in which acoustic and optical cameras were used to simultaneously image the rotational motion of tracer particles, allowing for a comparison of the acoustic- and optical-based velocities. The results reveal that the greater fluctuations associated with the acoustic-based velocities are primarily attributed to the lower temporal resolution. The positive and negative biases induced by the lower spatial resolution are balanced, with the positive ones greater in magnitude but the negative ones greater in quantity. These biases reduce with the increase in the mean particle velocity and approach minimum as the mean velocity exceeds the threshold value that can be sensed by the acoustic camera.

  11. New-generation security network with synergistic IP sensors

    NASA Astrophysics Data System (ADS)

    Peshko, Igor

    2007-09-01

    Global Dynamic Monitoring and Security Network (GDMSN) for real-time monitoring of (1) environmental and atmospheric conditions: chemical, biological, radiological and nuclear hazards, climate/man-induced catastrophe areas and terrorism threats; (2) water, soil, food chain quantifiers, and public health care; (3) large government/public/ industrial/ military areas is proposed. Each GDMSN branch contains stationary or mobile terminals (ground, sea, air, or space manned/unmanned vehicles) equipped with portable sensors. The sensory data are transferred via telephone, Internet, TV, security camera and other wire/wireless or optical communication lines. Each sensor is a self-registering, self-reporting, plug-and-play, portable unit that uses unified electrical and/or optical connectors and operates with IP communication protocol. The variant of the system based just on optical technologies cannot be disabled by artificial high-power radio- or gamma-pulses or sunbursts. Each sensor, being supplied with a battery and monitoring means, can be used as a separate portable unit. Military personnel, police officers, firefighters, miners, rescue teams, and nuclear power plant personnel may individually use these sensors. Terminals may be supplied with sensors essential for that specific location. A miniature "universal" optical gas sensor for specific applications in life support and monitoring systems was designed and tested. The sensor is based on the physics of absorption and/or luminescence spectroscopy. It can operate at high pressures and elevated temperatures, such as in professional and military diving equipment, submarines, underground shelters, mines, command stations, aircraft, space shuttles, etc. To enable this capability, the multiple light emitters, detectors and data processing electronics are located within a specially protected chamber.

  12. Navigating surgical fluorescence cameras using near-infrared optical tracking.

    PubMed

    van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs

    2018-05-01

    Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera in reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also induced an interference during NIR fluorescence imaging. In an attempt to support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought after. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one OTS simulating light-emitting diode setup. Following the outcome of these measurements, OTS settings were optimized. Measurements indicated the OTS interference was caused by: (1) spectral overlap between the OTS light and camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of the OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  13. Solid state electro-optic color filter and iris

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A pair of solid state electro-optic filters (SSEF) in a binocular holder were designed and fabricated for evaluation of field sequential stereo TV applications. The electronic circuitry for use with the stereo goggles was designed and fabricated, requiring only an external video input. A polarizing screen suitable for attachment to various size TV monitors for use in conjunction with the stereo goggles was designed and fabricated. An improved engineering model 2 filter was fabricated using the bonded holder technique developed previously and integrated to a GCTA color TV camera. An engineering model color filter was fabricated and assembled using PLZT control elements. In addition, a ruggedized holder assembly was designed, fabricated and tested. This assembly provides electrical contacts, high voltage protection, and support for the fragile PLZT disk, and also permits mounting and optical alignment of the associated polarizers.

  14. Wavefront Sensing With Switched Lenses for Defocus Diversity

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.

    2007-01-01

    In an alternative hardware design for an apparatus used in image-based wavefront sensing, defocus diversity is introduced by means of fixed lenses that are mounted in a filter wheel (see figure) so that they can be alternately switched into a position in front of the focal plane of an electronic camera recording the image formed by the optical system under test. [The terms image-based, wavefront sensing, and defocus diversity are defined in the first of the three immediately preceding articles, Broadband Phase Retrieval for Image-Based Wavefront Sensing (GSC-14899-1).] Each lens in the filter wheel is designed so that the optical effect of placing it at the assigned position is equivalent to the optical effect of translating the camera a specified defocus distance along the optical axis. Heretofore, defocus diversity has been obtained by translating the imaging camera along the optical axis to various defocus positions. Because data must be taken at multiple, accurately measured defocus positions, it is necessary to mount the camera on a precise translation stage that must be calibrated for each defocus position and/or to use an optical encoder for measurement and feedback control of the defocus positions. Additional latency is introduced into the wavefront sensing process as the camera is translated to the various defocus positions. Moreover, if the optical system under test has a large focal length, the required defocus values are large, making it necessary to use a correspondingly bulky translation stage. By eliminating the need for translation of the camera, the alternative design simplifies and accelerates the wavefront-sensing process. This design is cost-effective in that the filterwheel/lens mechanism can be built from commercial catalog components. After initial calibration of the defocus value of each lens, a selected defocus value is introduced by simply rotating the filter wheel to place the corresponding lens in front of the camera. 
The rotation of the wheel can be automated by use of a motor drive, and further calibration is not necessary. Because a camera-translation stage is no longer needed, the size of the overall apparatus can be correspondingly reduced.

  15. Light field analysis and its applications in adaptive optics and surveillance systems

    NASA Astrophysics Data System (ADS)

    Eslami, Mohammed Ali

    An image can only be as good as the optics of a camera or any other imaging system allows it to be. An imaging system is merely a transformation that takes a 3D world coordinate to a 2D image plane, through either a linear or a non-linear transfer function. Depending on the application at hand, some models of imaging systems are easier to use than others. The most well-known models are the 1) pinhole model, 2) thin-lens model, and 3) thick-lens model. Using light-field analysis, the connection between these different models is described, and a novel figure of merit is presented for choosing one optical model over another for a given application. After analyzing these optical systems, their applications in plenoptic cameras for adaptive optics are introduced. A new technique is described that uses a plenoptic camera to extract information about a localized, distorted planar wavefront. CODE V simulations conducted in this thesis show that its performance is comparable to that of a Shack-Hartmann sensor and that it can potentially increase the dynamic range of angles that can be extracted, assuming a paraxial imaging system. As a final application, a novel dual-PTZ surveillance system to track a target through space is presented. 22X optical zoom lenses on high-resolution pan/tilt platforms recalibrate a master-slave relationship based on encoder readouts rather than complicated image processing algorithms for real-time target tracking. When the target moves out of a region of interest in the master camera, the master camera is moved to force the target back into the region of interest. Once the master camera has moved, a precalibrated lookup table is interpolated to compute the relationship between the master and slave cameras. 
The homography that relates the pixels of the master camera to the pan/tilt settings of the slave camera then continues to follow the planar trajectories of targets as they move through space with high accuracy.
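
    The master-to-slave mapping can be sketched as a planar homography applied in homogeneous coordinates; the matrix below is a made-up example, not calibration data from the system:

```python
import numpy as np

def master_pixel_to_slave_pt(H, px, py):
    """Map a master-camera pixel to slave pan/tilt (degrees) through a
    3x3 homography in homogeneous coordinates."""
    v = H @ np.array([px, py, 1.0])
    return v[0] / v[2], v[1] / v[2]

# Hypothetical calibration: image centre (320, 240) maps to (0, 0) deg.
H = np.array([[0.02, 0.0, -6.4],
              [0.0, 0.02, -4.8],
              [0.0, 0.0, 1.0]])
pan, tilt = master_pixel_to_slave_pt(H, 320, 240)
```

    In practice the lookup table interpolation would supply a fresh H each time the master camera moves.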

  16. Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena

    NASA Astrophysics Data System (ADS)

    Pei Wong, Choun; Subramaniam, R.

    2018-05-01

    The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  17. Seeing in a Different Light--Using an Infrared Camera to Teach Heat Transfer and Optical Phenomena

    ERIC Educational Resources Information Center

    Wong, Choun Pei; Subramaniam, R.

    2018-01-01

    The infrared camera is a useful tool in physics education to 'see' in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  18. Comparison of Brownian-dynamics-based estimates of polymer tension with direct force measurements.

    PubMed

    Arsenault, Mark E; Purohit, Prashant K; Goldman, Yale E; Shuman, Henry; Bau, Haim H

    2010-11-01

    With the aid of Brownian dynamics models, it is possible to estimate polymer tension by monitoring polymers' transverse thermal fluctuations. To assess the precision of the approach, Brownian-dynamics-based tension estimates were compared with the force applied to rhodamine-phalloidin-labeled actin filaments bound to polymer beads and suspended between two optical traps. The transverse thermal fluctuations of each filament were monitored with a CCD camera, and the images were analyzed to obtain the filament's transverse displacement variance as a function of position along the filament, the filament's tension, and the camera's exposure time. A linear Brownian dynamics model was used to estimate the filament's tension. The estimated force agreed with the applied trap force within 30% (when the tension was <0.1 pN) and 70% (when the tension was <1 pN). In addition, the paper presents concise asymptotic expressions for the mechanical compliance of a system consisting of a filament attached tangentially to bead handles (a dumbbell system). The techniques described here can be used for noncontact estimates of the tension of polymers and fibers.
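
    For an ideal stretched string, equipartition gives the midpoint transverse variance as ⟨y²⟩ = kB·T·L/(4F), so tension can be estimated directly from the measured variance. A sketch with hypothetical numbers (the paper's model additionally accounts for bending stiffness and camera exposure time):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def tension_from_variance(var_mid_m2, length_m, temp_k=295.0):
    """Equipartition estimate of filament tension F (newtons) from the
    transverse displacement variance at the midpoint of an ideal
    stretched string: <y^2> = kB*T*L / (4*F)."""
    return KB * temp_k * length_m / (4.0 * var_mid_m2)

# Hypothetical numbers: a 10 um filament with 300 nm RMS midpoint motion.
force_n = tension_from_variance(var_mid_m2=(300e-9) ** 2, length_m=10e-6)
force_pn = force_n * 1e12   # ~0.1 pN, the low-tension regime discussed above
```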

  19. Wireless multipoint communication for optical sensors in the industrial environment using the new Bluetooth standard

    NASA Astrophysics Data System (ADS)

    Hussmann, Stephan; Lau, Wing Y.; Chu, Terry; Grothof, Markus

    2003-07-01

    Traditionally, the measuring or monitoring systems of manufacturing industries use sensors, computers and screens for quality control (Q.C.). The acquired information is fed back to the control room by wires, which - for obvious reasons - are not suitable in many environments. This paper describes a method to solve this problem by employing the new Bluetooth technology to set up a completely new system in which a total wireless solution is made feasible. This new Q.C. system allows several line scan cameras to be connected at once to a graphical user interface (GUI) that can monitor the production process. There are many Bluetooth devices available on the market, such as cell phones, headsets, printers, PDAs, etc.; however, the application detailed here is a novel implementation in the industrial Q.C. area. This paper gives more details about the Bluetooth standard and why it is used (network topologies, host controller interface, data rates, etc.), the Bluetooth implementation in the microcontroller of the line scan camera, and the GUI and its features.

  20. Color reproduction software for a digital still camera

    NASA Astrophysics Data System (ADS)

    Lee, Bong S.; Park, Du-Sik; Nam, Byung D.

    1998-04-01

    We have developed color reproduction software for a digital still camera. The image taken by the camera was colorimetrically reproduced on the monitor after characterizing the camera and the monitor and performing color matching between the two devices. The reproduction was performed at three levels: level processing, gamma correction, and color transformation. The image contrast was increased by the level processing, which adjusts the levels of the dark and bright portions of the image. The relationship between the level-processed digital values and the measured luminance values of test gray samples was calculated, and the gamma of the camera was obtained. A method for obtaining the unknown monitor gamma was also proposed. As a result, the level-processed values were adjusted by a look-up table created from the camera and monitor gamma corrections. For the camera's color transformation, a 3 by 3 or 3 by 4 matrix was used, calculated by regression between the gamma-corrected values and the measured tristimulus values of each test color sample. The various reproduced images, generated according to four illuminants for the camera and three color temperatures for the monitor, were displayed in a dialogue box implemented in our software. A user can easily choose the best reproduced image by comparing them with each other.
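    The two characterization steps described above (gamma estimation from gray samples, then a regression for the color matrix) can be sketched as least-squares fits. This is a hedged illustration on synthetic data, not the authors' software; both function names and the test values are assumptions.

    ```python
    import numpy as np

    def fit_gamma(levels, luminance):
        """Estimate gamma from gray-step data assuming Y = k * v**gamma,
        with digital levels v normalized to (0, 1]; log-log regression."""
        gamma, _ = np.polyfit(np.log(levels), np.log(luminance), 1)
        return gamma

    def fit_color_matrix(rgb_lin, xyz):
        """Least-squares 3x4 transform M with xyz ~= M @ [r, g, b, 1],
        mirroring the regression between gamma-corrected camera values
        and measured tristimulus values described in the abstract."""
        A = np.hstack([rgb_lin, np.ones((len(rgb_lin), 1))])
        M, *_ = np.linalg.lstsq(A, xyz, rcond=None)
        return M.T   # shape (3, 4)

    # Synthetic check: a known gamma and matrix are recovered exactly
    levels = np.linspace(0.1, 1.0, 10)
    gamma_est = fit_gamma(levels, 80.0 * levels**2.2)

    M_true = np.array([[0.41, 0.36, 0.18, 0.01],
                       [0.21, 0.72, 0.07, 0.00],
                       [0.02, 0.12, 0.95, 0.02]])
    rgb = np.array([[i, j, k] for i in (0.0, 0.5, 1.0)
                              for j in (0.0, 0.5, 1.0)
                              for k in (0.0, 0.5, 1.0)])
    xyz = np.hstack([rgb, np.ones((len(rgb), 1))]) @ M_true.T
    M_est = fit_color_matrix(rgb, xyz)
    ```

    The 3x4 form includes an offset column; dropping it reduces to the plain 3x3 matrix the abstract also mentions.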

  1. Condenser for illuminating a ringfield camera with synchrotron emission light

    DOEpatents

    Sweatt, W.C.

    1996-04-30

    The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into a ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of a camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, which images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser is comprised of a plurality of beams with four mirrors corresponding to a single beam plus two common mirrors. 9 figs.

  2. Condenser for illuminating a ringfield camera with synchrotron emission light

    DOEpatents

    Sweatt, William C.

    1996-01-01

    The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into a ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of a camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, which images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser is comprised of a plurality of beams with four mirrors corresponding to a single beam plus two common mirrors.

  3. Volcano dome dynamics at Mount St. Helens: Deformation and intermittent subsidence monitored by seismicity and camera imagery pixel offsets

    USGS Publications Warehouse

    Salzer, Jacqueline T.; Thelen, Weston A.; James, Mike R.; Walter, Thomas R.; Moran, Seth C.; Denlinger, Roger P.

    2016-01-01

    The surface deformation field measured at volcanic domes provides insights into the effects of magmatic processes, gravity- and gas-driven processes, and the development and distribution of internal dome structures. Here we study short-term dome deformation associated with earthquakes at Mount St. Helens, recorded by a permanent optical camera and seismic monitoring network. We use Digital Image Correlation (DIC) to compute the displacement field between successive images and compare the results to the occurrence and characteristics of seismic events during a 6 week period of dome growth in 2006. The results reveal that dome growth at Mount St. Helens was repeatedly interrupted by short-term meter-scale downward displacements at the dome surface, which were associated in time with low-frequency, large-magnitude seismic events followed by a tremor-like signal. The tremor was only recorded by the seismic stations closest to the dome. We find a correlation between the magnitudes of the camera-derived displacements and the spectral amplitudes of the associated tremor. We use the DIC results from two cameras and a high-resolution topographic model to derive full 3-D displacement maps, which reveal internal dome structures and the effect of the seismic activity on daily surface velocities. We postulate that the tremor is recording the gravity-driven response of the upper dome due to mechanical collapse or depressurization and fault-controlled slumping. Our results highlight the different scales and structural expressions during growth and disintegration of lava domes and the relationships between seismic and deformation signals.
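    The whole-pixel core of the DIC step used here can be sketched as an FFT-based cross-correlation between successive images; production DIC adds subset-wise matching and sub-pixel interpolation on top of this. A minimal numpy sketch with a hypothetical helper name:

    ```python
    import numpy as np

    def pixel_offset(ref, cur):
        """Integer-pixel displacement (dy, dx) of `cur` relative to `ref`,
        found as the peak of the circular cross-correlation surface
        computed via FFT. This is only the whole-pixel core of DIC."""
        xcorr = np.fft.ifft2(np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))).real
        peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
        # Unwrap circular shifts into signed displacements
        return tuple(int(p) - n if p > n // 2 else int(p)
                     for p, n in zip(peak, xcorr.shape))

    # Synthetic check: a known circular shift of a random texture is recovered
    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    cur = np.roll(ref, (3, -5), axis=(0, 1))   # shift 3 px down, 5 px left
    ```

    In practice the correlation is run on small subsets of each image pair so that a spatially varying displacement field, like the dome-surface motion above, can be mapped.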

  4. Blinded evaluation of the effects of high definition and magnification on perceived image quality in laryngeal imaging.

    PubMed

    Otto, Kristen J; Hapner, Edie R; Baker, Michael; Johns, Michael M

    2006-02-01

    Advances in commercial video technology have improved office-based laryngeal imaging. This study investigates the perceived image quality of a true high-definition (HD) video camera and the effect of magnification on laryngeal videostroboscopy. We performed a prospective, dual-armed, single-blinded analysis of a standard laryngeal videostroboscopic examination comparing 3 separate add-on camera systems: a 1-chip charge-coupled device (CCD) camera, a 3-chip CCD camera, and a true 720p (progressive scan) HD camera. Displayed images were controlled for magnification and image size (20-inch [50-cm] display, red-green-blue, and S-video cable for 1-chip and 3-chip cameras; digital visual interface cable and HD monitor for HD camera). Ten blinded observers were then asked to rate the following 5 items on a 0-to-100 visual analog scale: resolution, color, ability to see vocal fold vibration, sense of depth perception, and clarity of blood vessels. Eight unblinded observers were then asked to rate the difference in perceived resolution and clarity of laryngeal examination images when displayed on a 10-inch (25-cm) monitor versus a 42-inch (105-cm) monitor. A visual analog scale was used. These monitors were controlled for actual resolution capacity. For each item evaluated, randomized block design analysis demonstrated that the 3-chip camera scored significantly better than the 1-chip camera (p < .05). For the categories of color and blood vessel discrimination, the 3-chip camera scored significantly better than the HD camera (p < .05). For magnification alone, observers rated the 42-inch monitor statistically better than the 10-inch monitor. The expense of new medical technology must be judged against its added value. This study suggests that HD laryngeal imaging may not add significant value over currently available video systems, in perceived image quality, when a small monitor is used. 
Although differences in clarity between standard and HD cameras may not be readily apparent on small displays, a large display size coupled with HD technology may impart improved diagnosis of subtle vocal fold lesions and vibratory anomalies.

  5. Combustion pinhole-camera system

    DOEpatents

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  6. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  7. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  8. Liquid lens: advances in adaptive optics

    NASA Astrophysics Data System (ADS)

    Casey, Shawn Patrick

    2010-12-01

    'Liquid lens' technologies promise significant advancements in machine vision and optical communications systems. Adaptations for machine vision, human vision correction, and optical communications exemplify the versatile nature of this technology. Utilization of liquid lens elements allows the cost-effective implementation of optical velocity measurement. The project consists of a custom image processor, camera, and interface. The images are passed into customized pattern recognition and optical character recognition algorithms. A single camera would be used for both speed detection and object recognition.

  9. Wide field/planetary camera optics study. [for the large space telescope

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Design feasibility of the baseline optical design concept was established for the wide field/planetary camera (WF/PC) and will be used with the space telescope (ST) to obtain high angular resolution astronomical information over a wide field. The design concept employs internal optics to relay the ST image to a CCD detector system. Optical design performance predictions, sensitivity and tolerance analyses, manufacturability of the optical components, and acceptance testing of the two mirror Cassegrain relays are discussed.

  10. Optical methods for the optimization of system SWaP-C using aspheric components and advanced optical polymers

    NASA Astrophysics Data System (ADS)

    Zelazny, Amy; Benson, Robert; Deegan, John; Walsh, Ken; Schmidt, W. David; Howe, Russell

    2013-06-01

    We describe the benefits to camera system SWaP-C associated with the use of aspheric molded glasses and optical polymers in the design and manufacture of optical components and elements. Both camera objectives and display eyepieces, typical for night vision man-portable EO/IR systems, are explored. We discuss optical trade-offs, system performance, and cost reductions associated with this approach in both visible and non-visible wavebands, specifically NIR and LWIR. Example optical models are presented, studied, and traded using this approach.

  11. Design of a portable optical emission tomography system for microwave induced compact plasma for visible to near-infrared emission lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rathore, Kavita; Munshi, Prabhat; Bhattacharjee, Sudeep

    A new non-invasive diagnostic system has been developed for Microwave Induced Plasma (MIP) to reconstruct tomographic images of a 2-D emission profile. Compact MIP systems have wide application in industry as well as in research, for example as thrusters for space propulsion, sources of high-current ion beams, and for the creation of negative ions for the heating of fusion plasmas. The emission profile depends on two crucial parameters, the electron temperature and density, over the entire spatial extent of the plasma system. Emission tomography provides a basic understanding of plasmas and is very useful for monitoring the internal structure of plasma phenomena without disturbing the actual processes. This paper presents the development of a compact, modular, and versatile Optical Emission Tomography (OET) tool for a cylindrical, magnetically confined MIP system. It has eight slit-hole cameras, each containing a complementary metal-oxide-semiconductor linear image sensor for light detection. Optical noise is reduced by using an aspheric lens and interference band-pass filters in each camera. The entire cylindrical plasma can be scanned with an automated sliding-ring mechanism arranged in a fan-beam data collection geometry. The camera design also allows different filters to be incorporated to select particular wavelengths of light from the plasma. This OET system includes band-pass filters for the argon emission lines at 750 nm, 772 nm, and 811 nm and the hydrogen emission lines H{sub α} (656 nm) and H{sub β} (486 nm). A convolution back projection algorithm is used to obtain the tomographic images of the plasma emission lines. The paper mainly focuses on (a) the design of the OET system in detail and (b) a study of the emission profile for the 750 nm argon emission line to validate the system design.
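    As a simplified stand-in for the fan-beam convolution back projection used by this system, a parallel-beam filtered back projection in numpy shows the two steps involved: ramp filtering of each projection, then smearing the filtered projections back over the image grid. The geometry, names, and test phantom are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def ramp_filter(sino):
        """Apply the ramp (Ram-Lak) filter to each projection row via FFT."""
        freqs = np.fft.fftfreq(sino.shape[1])
        return np.fft.ifft(np.fft.fft(sino, axis=1) * np.abs(freqs), axis=1).real

    def back_project(sino, thetas):
        """Parallel-beam back projection of filtered projections onto an
        n x n grid (nearest-neighbour sampling; a sketch, not fan-beam CBP)."""
        n = sino.shape[1]
        grid = np.arange(n) - n / 2
        X, Y = np.meshgrid(grid, grid)
        img = np.zeros((n, n))
        for row, th in zip(sino, thetas):
            s = X * np.cos(th) + Y * np.sin(th)          # detector coordinate
            idx = np.clip(np.round(s + n / 2).astype(int), 0, n - 1)
            img += row[idx]
        return img * np.pi / len(thetas)

    # Analytic sinogram of a centered unit-density disk of radius 10:
    # each projection is 2*sqrt(r^2 - s^2), identical for every angle.
    n, r = 64, 10.0
    s = np.arange(n) - n / 2
    proj = 2 * np.sqrt(np.clip(r**2 - s**2, 0, None))
    thetas = np.linspace(0, np.pi, 90, endpoint=False)
    sino = np.tile(proj, (len(thetas), 1))
    recon = back_project(ramp_filter(sino), thetas)
    ```

    The reconstructed disk interior comes out near its true density of 1, while points outside the disk largely cancel; fan-beam CBP adds a coordinate rebinning and distance weighting on top of this scheme.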

  12. Modeling of digital information optical encryption system with spatially incoherent illumination

    NASA Astrophysics Data System (ADS)

    Bondareva, Alyona P.; Cheremkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.

    2015-10-01

    State-of-the-art micromirror DMD spatial light modulators (SLMs) offer unprecedented frame rates of up to 30,000 frames per second. This, in conjunction with a high-speed digital camera, should allow a high-speed optical encryption system to be built. Results of modeling of a digital information optical encryption system with spatially incoherent illumination are presented. The input information is displayed with the first SLM and the encryption element with the second SLM. Factors taken into account are the resolution of the SLMs and camera, hologram reconstruction noise, camera noise, and signal sampling. Results of the numerical simulation demonstrate high speed (several gigabytes per second), a low bit error rate, and high cryptographic strength.

  13. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef

    There is an increasing interest in developing cameras for surveillance systems to monitor nuclear facilities or nuclear waste storage. In particular, for today's and the next generation of nuclear facilities, the increased safety requirements following the Fukushima Daiichi disaster have to be considered. For some applications, radiation tolerance needs to reach doses in the MGy(SiO{sub 2}) range, whereas the most tolerant commercial or prototype products based on solid-state image sensors withstand doses of up to a few kGy. The objective of this work is to present the radiation-hardening strategy developed by our research groups to enhance the tolerance to ionizing radiation of the various subparts of these imaging systems by working simultaneously at the component and system design levels. Developing a radiation-hardened camera implies combining several radiation-hardening strategies. In our case, we decided not to use the simplest one, the shielding approach. This approach is efficient but limits camera miniaturization and is not compatible with future integration in remote-handling or robotic systems. The hardening-by-component strategy therefore appears mandatory to avoid the failure of one of the camera subparts at doses lower than the MGy. Concerning the image sensor itself, the technology used is a CMOS Image Sensor (CIS) designed by the ISAE team, with custom pixel designs used to mitigate the total ionizing dose (TID) effects that occur well below the MGy range in classical image sensors (e.g. Charge Coupled Devices (CCD), Charge Injection Devices (CID) and classical Active Pixel Sensors (APS)), such as the complete loss of functionality, the dark current increase and the gain drop. We'll present at the conference a comparative study of the radiation responses of these radiation-hardened pixels with respect to conventional ones, demonstrating the efficiency of the choices made. 
The targeted strategy to develop the complete radiation-hard camera electronics will be exposed. Another important element of the camera is the optical system that transports the image from the scene to the image sensor. This arrangement of glass-based lenses is affected by radiation through two mechanisms: radiation-induced absorption and radiation-induced refractive index changes. The first limits the signal-to-noise ratio of the image, whereas the second directly affects the resolution of the camera. We'll present at the conference a coupled simulation/experiment study of these effects for various commercial glasses, together with a vulnerability study of typical optical systems exposed to MGy doses. The last very important part of the camera is the illumination system, which can be based on various technologies of emitting devices such as LEDs, SLEDs or lasers. The most promising solutions for high radiation doses will be presented at the conference. In addition to this hardening-by-component approach, the global radiation tolerance of the camera can be drastically improved by working at the system level, combining innovative approaches, e.g. for the optical and illumination systems. We'll present at the conference the developed approach allowing the camera lifetime to be extended up to the MGy dose range. (authors)

  14. Observations of the Perseids 2012 using SPOSH cameras

    NASA Astrophysics Data System (ADS)

    Margonis, A.; Flohrer, J.; Christou, A.; Elgner, S.; Oberst, J.

    2012-09-01

    The Perseids are one of the most prominent annual meteor showers, occurring every summer when the stream of dust particles originating from Halley-type comet 109P/Swift-Tuttle intersects the orbital path of the Earth. The dense core of this stream passes Earth's orbit on the 12th of August, producing the maximum number of meteors. The Technical University of Berlin (TUB) and the German Aerospace Center (DLR) organize observing campaigns every summer to monitor Perseid activity. The observations are carried out using the Smart Panoramic Optical Sensor Head (SPOSH) camera system. The SPOSH camera has been developed by DLR and Jena-Optronik GmbH under an ESA/ESTEC contract and is designed to image faint, short-lived phenomena on dark planetary hemispheres. The camera features a highly sensitive back-illuminated 1024x1024 CCD chip and a high dynamic range of 14 bits. The custom-made fish-eye lens offers a 120°x120° field of view (168° over the diagonal). Figure 1: A meteor captured by the SPOSH cameras simultaneously during the 2011 observing campaign in Greece. The horizon, including the surrounding mountains, can be seen in the image corners as a result of the large FOV of the camera. The observations will be made on the Greek Peloponnese peninsula, monitoring the post-peak activity of the Perseids during a one-week period around the August new moon (14th to 21st). Two SPOSH cameras will be deployed at two remote sites at high altitude for the triangulation of meteor trajectories captured at both stations simultaneously. The observations during this time interval will give us the possibility to study the poorly observed post-maximum branch of the Perseid stream and to compare the results with datasets from previous campaigns which covered different periods of this long-lived meteor shower. The acquired data will be processed using dedicated software for meteor data reduction developed at TUB and DLR. 
Assuming a successful campaign, statistics, trajectories and photometric properties of the processed double-station meteors will be presented at the conference. Furthermore, a first-order statistical analysis of the meteors processed during the 2011 and the new 2012 campaigns will be presented.

  15. The AOTF-based NO2 camera

    NASA Astrophysics Data System (ADS)

    Dekemper, Emmanuel; Vanhamel, Jurgen; Van Opstal, Bert; Fussen, Didier

    2016-12-01

    The abundance of NO2 in the boundary layer relates to air quality and pollution source monitoring. Observing the spatiotemporal distribution of NO2 above well-delimited (flue gas stacks, volcanoes, ships) or more extended sources (cities) allows for applications such as monitoring emission fluxes or studying the plume's dynamic chemistry and transport. So far, most attempts to map the NO2 field from the ground have been made with visible-light scanning grating spectrometers. While benefiting from a high retrieval accuracy, they only achieve a relatively low spatiotemporal resolution that hampers the detection of dynamic features. We present a new type of passive remote sensing instrument aiming at the measurement of the 2-D distributions of NO2 slant column densities (SCDs) with a high spatiotemporal resolution. The measurement principle has strong similarities with the popular filter-based SO2 camera, as it relies on spectral images taken at wavelengths where the molecule's absorption cross section differs. Contrary to the SO2 camera, the spectral selection is performed by an acousto-optical tunable filter (AOTF) capable of resolving the target molecule's spectral features. The NO2 camera's capabilities are demonstrated by imaging the NO2 abundance in the plume of a coal-fired power plant. During this experiment, the 2-D distribution of the NO2 SCD was retrieved with a temporal resolution of 3 min and a spatial sampling of 50 cm (over a 250 × 250 m2 area). The detection limit was close to 5 × 1016 molecules cm-2, with a maximum detected SCD of 4 × 1017 molecules cm-2. Illustrating the added value of the NO2 camera measurements, the data reveal the dynamics of the NO to NO2 conversion in the early plume with an unprecedented resolution: from its release in the air, and for 100 m upwards, the observed NO2 plume concentration increased at a rate of 0.75-1.25 g s-1. 
In joint campaigns with SO2 cameras, the NO2 camera could also help in removing the bias introduced by the NO2 interference with the SO2 spectrum.
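    The two-band retrieval principle that this instrument shares with filter-based SO2 cameras can be sketched per pixel: an apparent absorbance in each band, then the differential absorbance divided by the differential cross section. The cross-section values and names below are illustrative placeholders, not the AOTF passband values used by the instrument.

    ```python
    import numpy as np

    # Illustrative differential cross sections (cm^2 per molecule) for an
    # absorbing "on" band and a weakly absorbing "off" band - assumptions.
    SIGMA_ON, SIGMA_OFF = 5.0e-19, 1.0e-19

    def scd_map(plume_on, clear_on, plume_off, clear_off,
                sigma_on=SIGMA_ON, sigma_off=SIGMA_OFF):
        """Per-pixel slant column density (molecules/cm^2) from two spectral
        images: apparent absorbance tau = -ln(I_plume / I_clear) in each band,
        then the differential absorbance over the differential cross section."""
        tau_on = -np.log(plume_on / clear_on)
        tau_off = -np.log(plume_off / clear_off)
        return (tau_on - tau_off) / (sigma_on - sigma_off)

    # Synthetic check: a uniform plume of 4e17 molecules/cm^2 is recovered
    scd_true = 4.0e17
    clear = np.full((8, 8), 1000.0)              # clear-sky background counts
    plume_on = clear * np.exp(-SIGMA_ON * scd_true)
    plume_off = clear * np.exp(-SIGMA_OFF * scd_true)
    scd = scd_map(plume_on, clear, plume_off, clear)
    ```

    The off band cancels broadband effects such as aerosol scattering; the AOTF's narrow, tunable passbands play the role that fixed interference filters play in the SO2 camera.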

  16. Closeups of IECM grappled by RMS and positioned above payload bay (PLB)

    NASA Image and Video Library

    1982-07-04

    STS004-23-119 (27 June-4 July 1982) --- This is a close-up view of the Marshall Space Flight Center-developed Induced Environment Contamination Monitor (IECM), a multi-instrument box designed to check for contaminants in and around the space shuttle orbiter cargo bay which might adversely affect delicate experiments carried aboard. The astronaut crew of Thomas K. Mattingly II and Henry W. Hartsfield Jr. maneuvered the Canadian-built robot arm (called the remote manipulator system) very near their overhead flight deck windows and captured this scene with a 35mm camera. HOLD PICTURE HORIZONTALLY WITH FRAME NUMBER AT TOP CENTER. Cameras for the 11 instruments are pictured as black circles at the bottom of the frame. The access door to the arm and safe plug is located about halfway up the left edge of the box. A cascade injector device is immediately to the right of the plug. The rectangular opening at right center of the monitor is the optical effects module. Mass spectrometer is at upper left. Air sampler bottles are at upper left. The colorful rectangle near upper left of the monitor is the passive array. Not easily seen, but also a part of the instrument, are the cryogenic quartz crystal micro balance and the temperature controlled quartz micro balance. Photo credit: NASA

  17. Low, slow, small target recognition based on spatial vision network

    NASA Astrophysics Data System (ADS)

    Cheng, Zhao; Guo, Pei; Qi, Xin

    2018-03-01

    Traditional photoelectric monitoring uses a large number of identical cameras. To ensure full coverage of the monitoring area, this method deploys many cameras, which leads to overlapping coverage and higher costs, resulting in waste. To reduce the monitoring cost and to address the difficulty of finding, identifying and tracking a low-altitude, slow, small target, this paper presents a spatial vision network for low-slow-small target recognition. Based on the camera imaging principle and a monitoring model, the spatial vision network is modeled and optimized. Simulation results demonstrate that the proposed method performs well.

  18. Process monitoring of additive manufacturing by using optical tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zenzinger, Guenter; Bamberg, Joachim; Ladewig, Alexander

    2015-03-31

    Parts fabricated by means of additive manufacturing are usually of complex shape, and owing to fabrication by selective laser melting (SLM), potential defects and inaccuracies are often very small in lateral size. Adequate quality inspection of such parts is therefore rather challenging: non-destructive techniques (NDT) are difficult to realize, yet considerable effort is necessary to ensure the quality of SLM parts, especially those used for aerospace components. Thus, MTU Aero Engines is currently focusing on the development of an online process control system which monitors and documents the complete welding process during the SLM fabrication procedure. A high-resolution camera system is used to obtain images, from which tomographic data for a 3-D analysis of SLM parts are processed. From the analysis, structural irregularities and structural disorder resulting from any erroneous melting process become visible and can be located anywhere within the 3-D structure. Results of our optical tomography (OT) method obtained on real defects are presented.

  19. Effect of camera angulation on adaptation of CAD/CAM restorations.

    PubMed

    Parsell, D E; Anderson, B C; Livingston, H M; Rudd, J I; Tankersley, J D

    2000-01-01

    A significant concern with computer-assisted design/computer-assisted manufacturing (CAD/CAM)-produced prostheses is the accuracy of adaptation of the restoration to the preparation. The objective of this study was to determine the effect of operator-controlled camera misalignment on restoration adaptation. A CEREC 2 CAD/CAM unit (Sirona Dental Systems, Bensheim, Germany) was used to capture the optical impressions and machine the restorations. A Class I preparation was used as the standard preparation for optical impressions. Camera angles along the mesiodistal and buccolingual alignment were varied from the ideal orientation. Occlusal marginal gaps and sample height, width, and length were measured and compared to preparation dimensions. For clinical correlation, clinicians were asked to take optical impressions of mesio-occlusal preparations (Class II) on all four second molar sites, using a patient simulator. On the adjacent first molar occlusal surfaces, a preparation was machined such that camera angulation could be calculated from information taken from the optical impression. Degree of tilt and plane of tilt were compared to the optimum camera positions for those preparations. One-way analysis of variance and Dunnett C post hoc testing (alpha = 0.01) revealed little significant degradation in fit with camera angulation. Only the apical length fit was significantly degraded by excessive angulation. The CEREC 2 CAD/CAM system was found to be relatively insensitive to operator-induced errors attributable to camera misalignments of less than 5 degrees in either the buccolingual or the mesiodistal plane. The average camera tilt error generated by clinicians for all sites was 1.98 +/- 1.17 degrees.

  20. Optical analysis of a compound quasi-microscope for planetary landers

    NASA Technical Reports Server (NTRS)

    Wall, S. D.; Burcher, E. E.; Huck, F. O.

    1974-01-01

    A quasi-microscope concept, consisting of a facsimile camera augmented with an auxiliary lens as a magnifier, was introduced and analyzed. The performance achievable with this concept was primarily limited by a trade-off between resolution and object field; this approach leads to a limiting resolution of 20 microns when used with the Viking lander camera (which has an angular resolution of 0.04 deg). An optical system is analyzed which includes a field lens between the camera and the auxiliary lens to overcome this limitation. It is found that this system, referred to as a compound quasi-microscope, can provide improved resolution (to about 2 microns) and a larger object field. However, this improvement comes at the expense of increased complexity, special camera design requirements, and tighter tolerances on the distances between optical components.

  1. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device which needs to provide low-light illumination of the human retina, high resolution at the retina, and a reflection-free image. Those constraints make its optical design very sophisticated, but the most difficult to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is performed by an LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with built-in autofocus and zoom, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.

  2. Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA

    NASA Astrophysics Data System (ADS)

    Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.

    We present the final optical design of PANIC (PAnoramic Near Infrared Camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well-defined internal pupil, which allows the thermal background to be reduced by a cryogenic pupil stop. A mosaic of four Teledyne Hawaii-2RG 2k × 2k detectors will give a field of view of 31.9 arcmin × 31.9 arcmin.
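    As a back-of-the-envelope check on the quoted numbers (ours, not from the paper): the plate-scale relation scale = 206265·p/f ties the 18 μm pixel and 0.45 arcsec/pixel scale to an effective focal length near 8.25 m, and a gap-free 4k × 4k mosaic would span about 30.7 arcmin, slightly below the quoted 31.9 arcmin because the real mosaic includes inter-detector gaps:

```python
# Plate scale: scale["/px] = 206265 * pixel_pitch / focal_length
pixel_pitch = 18e-6               # m
plate_scale = 0.45                # arcsec per pixel (from the abstract)
focal_length = 206265 * pixel_pitch / plate_scale   # -> about 8.25 m

# Field of view of a gap-free 4k x 4k mosaic of four 2k x 2k detectors
n_pix = 2 * 2048
fov_arcmin = n_pix * plate_scale / 60.0             # -> about 30.7 arcmin
```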

  3. Afocal viewport optics for underwater imaging

    NASA Astrophysics Data System (ADS)

    Slater, Dan

    2014-09-01

    A conventional camera can be adapted for underwater use by enclosing it in a sealed waterproof pressure housing with a viewport. The viewport, as the optical interface between water and air, must accommodate both the camera and the optical characteristics of water while also providing a high-pressure water seal. Limited hydrospace visibility drives a need for wide-angle viewports. Practical optical interfaces between seawater and air vary from simple flat plate windows to complex water contact lenses. This paper first provides a brief overview of the physical and optical properties of the ocean environment along with suitable optical materials. This is followed by a discussion of the characteristics of various afocal underwater viewport types, including flat windows, domes and the Ivanoff corrector lens, a derivative of a Galilean wide-angle camera adapter. Several new and interesting optical designs derived from the Ivanoff corrector lens are presented, including a pair of very compact afocal viewport lenses that are compatible with both in-water and in-air environments, and an afocal underwater hyper-hemispherical fisheye lens.

  4. Adaptive Optics For Imaging Bright Objects Next To Dim Ones

    NASA Technical Reports Server (NTRS)

    Shao, Michael; Yu, Jeffrey W.; Malbet, Fabien

    1996-01-01

    Adaptive optics used in imaging optical systems, according to proposal, to enhance high-dynamic-range images (images of bright objects next to dim objects). Designed to alter wavefronts to correct for effects of scattering of light from small bumps on imaging optics. Original intended application of concept in advanced camera installed on Hubble Space Telescope for imaging of such phenomena as large planets near stars other than Sun. Also applicable to other high-quality telescopes and cameras.

  5. Design of the high resolution optical instrument for the Pleiades HR Earth observation satellites

    NASA Astrophysics Data System (ADS)

    Lamard, Jean-Luc; Gaudin-Delrieu, Catherine; Valentini, David; Renard, Christophe; Tournier, Thierry; Laherrere, Jean-Marc

    2017-11-01

    As part of its contribution to Earth observation from space, ALCATEL SPACE designed, built and tested the high resolution cameras for the European intelligence satellites HELIOS I and II. Through these programmes, ALCATEL SPACE enjoys an international reputation, and its capability and experience in high resolution instrumentation are recognised by most customers. Following the SPOT programme, it was decided to go ahead with the PLEIADES HR programme. PLEIADES HR is the optical high resolution component of a larger optical and radar multi-sensor system, ORFEO, which is developed in cooperation between France and Italy for dual civilian and defence use. ALCATEL SPACE has been entrusted by CNES with the development of the high resolution camera of the Earth observation satellites PLEIADES HR. The first optical satellite of the PLEIADES HR constellation will be launched in mid-2008; the second will follow in 2009. To minimize development costs, a mini-satellite approach has been selected, leading to a compact concept for the camera design. The paper describes the design and performance budgets of this novel high resolution, large field of view optical instrument with emphasis on its technological features. This new generation of camera represents a breakthrough in comparison with the previous SPOT cameras owing to a significant step in on-ground resolution, which approaches the capabilities of aerial photography. Recent advances in detector technology, optical fabrication and electronics make it possible for the PLEIADES HR camera to achieve its image quality performance goals while staying within weight and size restrictions normally considered suitable only for much lower performance systems. This camera design delivers superior performance using an innovative low-power, low-mass, scalable architecture, which provides a versatile approach for a variety of imaging requirements and allows a wide range of accommodation possibilities with a mini-satellite class platform.

  6. Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.

    PubMed

    Song, Kai-Tai; Tai, Jen-Chao

    2006-10-01

    Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Interesting experimental results are presented to validate the robustness and accuracy of the proposed method.
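    The vanishing-point geometry behind this kind of calibration can be sketched as follows. This is a generic textbook formulation (principal point at the image origin, square pixels), not the paper's exact derivation, and the function names are hypothetical:

```python
import math

def focal_from_vanishing_points(vp1, vp2):
    """Focal length (pixels) from the vanishing points of two orthogonal
    ground directions, with the principal point at the image origin:
    f^2 = -(u1*u2 + v1*v2)."""
    s = -(vp1[0] * vp2[0] + vp1[1] * vp2[1])
    if s <= 0:
        raise ValueError("vanishing points inconsistent with orthogonal directions")
    return math.sqrt(s)

def tilt_pan_from_road_vp(vp_road, f):
    """Tilt and pan (radians) from the vanishing point of the parallel
    lane markings, for a camera pitched down and panned relative to the
    road direction."""
    tilt = math.atan2(-vp_road[1], f)
    pan = math.atan2(vp_road[0] * math.cos(tilt), f)
    return tilt, pan
```

    The known lane width then fixes the remaining scale factor, converting pixel displacements on the road plane into metres for speed estimation.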

  7. Handheld hyperspectral imager for standoff detection of chemical and biological aerosols

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Jensen, James O.; McAnally, Gerard

    2004-02-01

    Pacific Advanced Technology has developed a small handheld imaging spectrometer, Sherlock, for gas leak and aerosol detection and imaging. The system is based on a patented technique that uses diffractive optics and image processing algorithms to extract spectral information about objects in the scene of the camera (IMSS, Image Multi-Spectral Sensing). This camera has been tested at Dugway Proving Ground and the Dstl Porton Down facility, looking at chemical and biological agent simulants. The camera has been used to investigate surfaces contaminated with chemical agent simulants. In addition to chemical and biological detection, the camera has been used for environmental monitoring of greenhouse gases and is currently undergoing extensive laboratory and field testing by the Gas Technology Institute, British Petroleum and Shell Oil for gas leak detection and repair applications. The camera contains an embedded PowerPC and a real-time image processor for performing image processing algorithms to assist in the detection and identification of gas-phase species in real time. In this paper we present an overview of the technology and show how it has performed for different applications, such as gas leak detection, surface contamination, remote sensing and surveillance. In addition, a sampling of the results from the field testing at Dugway in July 2002 and at Dstl Porton Down in September 2002 will be given.

  8. Remote sensing of atmospheric optical depth using a smartphone sun photometer.

    PubMed

    Cao, Tingting; Thompson, Jonathan E

    2014-01-01

    In recent years, smartphones have been explored for making a variety of mobile measurements. Smartphones feature many advanced sensors such as cameras, GPS capability, and accelerometers within a handheld device that is portable, inexpensive, and consistently located with an end user. In this work, a smartphone was used as a sun photometer for the remote sensing of atmospheric optical depth. The top-of-the-atmosphere (TOA) irradiance was estimated through the construction of Langley plots on days when the sky was cloudless and clear. Changes in optical depth were monitored on a different day when clouds intermittently blocked the sun. The device demonstrated a measurement precision of 1.2% relative standard deviation for replicate photograph measurements (38 trials, 134 data points). However, when the accuracy of the method was assessed using optical filters of known transmittance, a more substantial uncertainty was apparent in the data. Roughly 95% of replicate smartphone-measured transmittances are expected to lie within ±11.6% of the true transmittance value. This uncertainty in transmission corresponds to an optical depth of approximately ±0.12-0.13, suggesting the smartphone sun photometer would be useful only in polluted areas that experience significant optical depths. The device can be used as a tool in the classroom to demonstrate how aerosols and gases affect atmospheric transmission. If improvements in measurement precision can be achieved, future work may allow monitoring networks to be developed in which citizen scientists submit acquired data from a variety of locations.
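    The Langley method mentioned above reduces to a straight-line fit, ln S = ln S_TOA − τ·m, where m is the airmass: the intercept estimates the top-of-the-atmosphere signal and the negative of the slope estimates the optical depth. A minimal sketch (the helper name is ours, not the paper's):

```python
def langley_fit(airmass, ln_signal):
    """Least-squares Langley fit of ln(S) = ln(S_TOA) - tau * m.
    Returns (ln_TOA, tau)."""
    n = len(airmass)
    mx = sum(airmass) / n
    my = sum(ln_signal) / n
    sxx = sum((x - mx) ** 2 for x in airmass)
    sxy = sum((x - mx) * (y - my) for x, y in zip(airmass, ln_signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, -slope
```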

  9. Transmitter diversity verification on ARTEMIS geostationary satellite

    NASA Astrophysics Data System (ADS)

    Mata Calvo, Ramon; Becker, Peter; Giggenbach, Dirk; Moll, Florian; Schwarzer, Malte; Hinz, Martin; Sodnik, Zoran

    2014-03-01

    Optical feeder links will become the extension of terrestrial fiber communications into space, increasing data throughput in satellite communications by overcoming the spectrum limitations of classical RF links. The geostationary telecommunication satellite Alphasat and the satellites forming the EDRS system will become the next generation of high-speed data-relay services. The ESA satellite ARTEMIS, a precursor for geostationary orbit (GEO) optical terminals, is still a privileged experiment platform for characterizing the turbulent channel and investigating the challenges of free-space optical communication to GEO. In this framework, two measurement campaigns were conducted with the aim of verifying the benefits of transmitter diversity in the uplink. To evaluate this mitigation technique, intensity measurements were carried out at both ends of the link. The scintillation parameter is calculated and compared to theory and, additionally, the Fried parameter is estimated by using a focus camera to monitor the turbulence strength.

  10. Preliminary calibration results of the wide angle camera of the imaging instrument OSIRIS for the Rosetta mission

    NASA Astrophysics Data System (ADS)

    Da Deppo, V.; Naletto, G.; Nicolosi, P.; Zambolin, P.; De Cecco, M.; Debei, S.; Parzianello, G.; Ramous, P.; Zaccariotto, M.; Fornasier, S.; Verani, S.; Thomas, N.; Barthol, P.; Hviid, S. F.; Sebastian, I.; Meller, R.; Sierks, H.; Keller, H. U.; Barbieri, C.; Angrilli, F.; Lamy, P.; Rodrigo, R.; Rickman, H.; Wenzel, K. P.

    2017-11-01

    Rosetta is one of the cornerstone missions of the European Space Agency, intended to rendezvous with comet 67P/Churyumov-Gerasimenko in 2014. The imaging instrument on board the satellite is OSIRIS (Optical, Spectroscopic and Infrared Remote Imaging System), a cooperation among several European institutes, which consists of two cameras: a Narrow (NAC) and a Wide Angle Camera (WAC). The WAC optical design is innovative: it adopts an all-reflecting, unvignetted and unobstructed two-mirror configuration which covers a 12° × 12° field of view with an F/5.6 aperture and gives a nominal contrast ratio of about 10^-4. The flight model of this camera has been successfully integrated and tested in our laboratories, and has finally been integrated on the satellite, which is now waiting to be launched in February 2004. In this paper we describe the optical characteristics of the camera and summarize the results obtained so far with the preliminary calibration data. The analysis of the optical performance of this model shows good agreement between theoretical performance and experimental results.

  11. 3D interferometric shape measurement technique using coherent fiber bundles

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Kuschmierz, Robert; Czarske, Jürgen

    2017-06-01

    In-situ 3-D shape measurements with submicron shape uncertainty of fast-rotating objects in a cutting lathe are desired; they can be achieved by simultaneous distance and velocity measurements. Conventional tactile methods, such as coordinate measurement machines, only support ex-situ measurements. Optical measurement techniques such as triangulation and conoscopic holography offer only the distance, so the absolute diameter cannot be retrieved directly. In comparison, laser Doppler distance sensors (P-LDD sensors) enable simultaneous, in-situ distance and velocity measurements for monitoring the cutting process in a lathe. In order to achieve shape measurement uncertainties below 1 μm, a P-LDD sensor with dual-camera-based scattered light detection has been investigated. Coherent fiber bundles (CFB) are employed to forward the scattered light towards the cameras. This will enable a compact and passive sensor head in the future. Compared with a photodetector-based sensor, the dual-camera-based sensor allows the measurement uncertainty to be decreased by about one order of magnitude. As a result, the total uncertainty of absolute 3-D shape measurements can be reduced to about 100 nm.

  12. Efficient color correction method for smartphone camera-based health monitoring application.

    PubMed

    Duc Dang; Chae Ho Cho; Daeik Kim; Oh Seok Kwon; Jo Woon Chong

    2017-07-01

    Smartphone health monitoring applications have recently been highlighted due to the rapid development of the hardware and software performance of smartphones. However, the color characteristics of images captured by different smartphone models are dissimilar to each other, and this difference may give non-identical health monitoring results when smartphone health monitoring applications monitor physiological information using their embedded cameras. In this paper, we investigate the differences in color properties of the captured images from different smartphone models and apply a color correction method to adjust the dissimilar color values obtained from different smartphone cameras. Experimental results show that the color-corrected images using the correction method provide much smaller color intensity errors compared to the images without correction. These results can be applied to enhance the consistency of smartphone camera-based health monitoring applications by reducing color intensity errors among the images obtained from different smartphones.
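    The abstract does not specify the correction method; one common approach to this kind of inter-camera color adjustment is a 3×3 matrix fitted by least squares between RGB values of the same color patches seen by both cameras. A sketch under that assumption (function names ours):

```python
import numpy as np

def fit_color_correction(src, ref):
    """Least-squares 3x3 matrix M such that ref ~= src @ M.T, where src
    and ref are (N, 3) arrays of RGB values of the same patches captured
    by the source and reference cameras."""
    X, *_ = np.linalg.lstsq(src, ref, rcond=None)
    return X.T

def apply_correction(img_rgb, M):
    """Map (N, 3) RGB values from the source camera into the reference
    camera's color space."""
    return img_rgb @ M.T
```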

  13. Spatial calibration of an optical see-through head mounted display

    PubMed Central

    Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew

    2010-01-01

    We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125

  14. A small field of view camera for hybrid gamma and optical imaging

    NASA Astrophysics Data System (ADS)

    Lees, J. E.; Bugby, S. L.; Bhatia, B. S.; Jambi, L. K.; Alqahtani, M. S.; McKnight, W. R.; Ng, A. H.; Perkins, A. C.

    2014-12-01

    The development of compact low profile gamma-ray detectors has allowed the production of small field of view, hand held imaging devices for use at the patient bedside and in operating theatres. The combination of an optical and a gamma camera, in a co-aligned configuration, offers high spatial resolution multi-modal imaging giving a superimposed scintigraphic and optical image. This innovative introduction of hybrid imaging offers new possibilities for assisting surgeons in localising the site of uptake in procedures such as sentinel node detection. Recent improvements to the camera system along with results of phantom and clinical imaging are reported.

  15. Study of a stereo electro-optical tracker system for the measurement of model deformations at the national transonic facility

    NASA Technical Reports Server (NTRS)

    Hertel, R. J.

    1979-01-01

    An electro-optical method to measure the aeroelastic deformations of wind tunnel models is examined. The multitarget tracking performance of one of the two electronic cameras comprising the stereo pair is modeled and measured. The properties of the targets at the model, the camera optics, target illumination, number of targets, acquisition time, target velocities, and tracker performance are considered. The electronic camera system is shown to be capable of locating, measuring, and following the positions of 5 to 50 targets attached to the model at measuring rates up to 5000 targets per second.

  16. Signature analysis of acoustic emission from graphite/epoxy composites

    NASA Technical Reports Server (NTRS)

    Russell, S. S.; Henneke, E. G., II

    1977-01-01

    Acoustic emissions were monitored during crack extension across and parallel to the fibers in single-ply and multi-ply laminates of graphite-epoxy composites. Spectrum analysis was performed on the transient signals to ascertain whether the fracture mode can be characterized by a particular spectral pattern. The specimens were loaded to failure quasistatically in a tensile machine. Visual observations were made via either an optical microscope or a television camera. The results indicate that several characteristics in the time and frequency domains correspond to different types of failure.

  17. Understanding the changes of cone reflectance in adaptive optics flood illumination retinal images over three years

    PubMed Central

    Mariotti, Letizia; Devaney, Nicholas; Lombardo, Giuseppe; Lombardo, Marco

    2016-01-01

    Although there is increasing interest in the investigation of cone reflectance variability, little is understood about its characteristics over long time scales. Cone detection and its automation is now becoming a fundamental step in the assessment and monitoring of the health of the retina and in the understanding of photoreceptor physiology. In this work we provide insight into the cone reflectance variability over time scales ranging from minutes to three years on the same eye, and for large areas of the retina (≥ 2.0 × 2.0 degrees) at two different retinal eccentricities, using a commercial adaptive optics (AO) flood illumination retinal camera. We observed that the difference in reflectance observed in the cones increases with the time separation between the data acquisitions, and this may have a negative impact on algorithms attempting to track cones over time. In addition, we determined that displacements of the light source within 0.35 mm of the pupil center, which is the farthest location from the pupil center used by operators of the AO camera to acquire high-quality images of the cone mosaic in clinical studies, do not significantly affect the cone detection and density estimation. PMID:27446708

  18. Understanding the changes of cone reflectance in adaptive optics flood illumination retinal images over three years.

    PubMed

    Mariotti, Letizia; Devaney, Nicholas; Lombardo, Giuseppe; Lombardo, Marco

    2016-07-01

    Although there is increasing interest in the investigation of cone reflectance variability, little is understood about its characteristics over long time scales. Cone detection and its automation is now becoming a fundamental step in the assessment and monitoring of the health of the retina and in the understanding of photoreceptor physiology. In this work we provide insight into the cone reflectance variability over time scales ranging from minutes to three years on the same eye, and for large areas of the retina (≥ 2.0 × 2.0 degrees) at two different retinal eccentricities, using a commercial adaptive optics (AO) flood illumination retinal camera. We observed that the difference in reflectance observed in the cones increases with the time separation between the data acquisitions, and this may have a negative impact on algorithms attempting to track cones over time. In addition, we determined that displacements of the light source within 0.35 mm of the pupil center, which is the farthest location from the pupil center used by operators of the AO camera to acquire high-quality images of the cone mosaic in clinical studies, do not significantly affect the cone detection and density estimation.

  19. Winter sky brightness and cloud cover at Dome A, Antarctica

    NASA Astrophysics Data System (ADS)

    Moore, Anna M.; Yang, Yi; Fu, Jianning; Ashley, Michael C. B.; Cui, Xiangqun; Feng, Long Long; Gong, Xuefei; Hu, Zhongwen; Lawrence, Jon S.; Luong-Van, Daniel M.; Riddle, Reed; Shang, Zhaohui; Sims, Geoff; Storey, John W. V.; Tothill, Nicholas F. H.; Travouillon, Tony; Wang, Lifan; Yang, Huigen; Yang, Ji; Zhou, Xu; Zhu, Zhenxi

    2013-01-01

    At the summit of the Antarctic plateau, Dome A offers an intriguing location for future large-scale optical astronomical observatories. The Gattini Dome A project was created to measure the optical sky brightness and large-area cloud cover of the winter-time sky above this high-altitude Antarctic site. The wide field camera and multi-filter system was installed on the PLATO instrument module as part of the Chinese-led traverse to Dome A in January 2008. This automated wide field camera consists of an Apogee U4000 interline CCD coupled to a Nikon fisheye lens enclosed in a heated container with a glass window. The system contains a filter mechanism providing a suite of standard astronomical photometric filters (Bessell B, V, R) and a long-pass red filter for the detection and monitoring of airglow emission. The system operated continuously throughout the 2009 and 2011 winter seasons and part-way through the 2010 season, recording long exposure images sequentially for each filter. We have in hand one complete winter-time dataset (2009) returned via a manned traverse. We present here the first measurements of sky brightness in the photometric V band, cloud cover statistics measured so far, and an estimate of the extinction.

  20. Improving accuracy of Plenoptic PIV using two light field cameras

    NASA Astrophysics Data System (ADS)

    Thurow, Brian; Fahringer, Timothy

    2017-11-01

    Plenoptic particle image velocimetry (PIV) has recently emerged as a viable technique for acquiring three-dimensional, three-component velocity field data using a single plenoptic, or light field, camera. The simplified experimental arrangement is advantageous in situations where optical access is limited and/or it is not possible to set up the four or more cameras typically required in a tomographic PIV experiment. A significant disadvantage of a single camera plenoptic PIV experiment, however, is that the accuracy of the velocity measurement along the optical axis of the camera is significantly worse than in the two lateral directions. In this work, we explore the accuracy of plenoptic PIV when two plenoptic cameras are arranged in a stereo imaging configuration. It is found that the addition of a second camera improves the accuracy in all three directions and nearly eliminates any differences between them. This improvement is illustrated using both synthetic and real experiments conducted on a vortex ring using both one and two plenoptic cameras.

  1. Coincidence velocity map imaging using Tpx3Cam, a time stamping optical camera with 1.5 ns timing resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Arthur; van Beuzekom, Martin; Bouwens, Bram

    Here, we demonstrate a coincidence velocity map imaging apparatus equipped with a novel time-stamping fast optical camera, Tpx3Cam, whose high sensitivity and nanosecond timing resolution allow for simultaneous position and time-of-flight detection. This single detector design is simple, flexible, and capable of highly differential measurements. We show detailed characterization of the camera and its application in strong field ionization experiments.

  2. Coincidence velocity map imaging using Tpx3Cam, a time stamping optical camera with 1.5 ns timing resolution

    DOE PAGES

    Zhao, Arthur; van Beuzekom, Martin; Bouwens, Bram; ...

    2017-11-07

    Here, we demonstrate a coincidence velocity map imaging apparatus equipped with a novel time-stamping fast optical camera, Tpx3Cam, whose high sensitivity and nanosecond timing resolution allow for simultaneous position and time-of-flight detection. This single detector design is simple, flexible, and capable of highly differential measurements. We show detailed characterization of the camera and its application in strong field ionization experiments.

  3. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    PubMed Central

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-01-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454

  4. Optical pin apparatus for measuring the arrival time and velocity of shock waves and particles

    DOEpatents

    Benjamin, R.F.

    1983-10-18

    An apparatus for the detection of the arrival and for the determination of the velocity of disturbances such as shock-wave fronts and/or projectiles. Optical pins using fluid-filled microballoons as the light source and an optical fiber as a link to a photodetector have been used to investigate shock-waves and projectiles. A microballoon filled with a noble gas is affixed to one end of a fiber-optic cable, and the other end of the cable is attached to a high-speed streak camera. As the shock-front or projectile compresses the microballoon, the gas inside is heated and compressed producing a bright flash of light. The flash of light is transmitted via the optic cable to the streak camera where it is recorded. One image-converter streak camera is capable of recording information from more than 100 microballoon-cable combinations simultaneously.
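    The velocity determination described in this patent is, at its core, time-of-flight between pins at a known separation; with more than two pins, pairwise estimates can be averaged. A trivial sketch (function name ours):

```python
def shock_velocity(separation_m, t1_s, t2_s):
    """Average shock-front or projectile velocity between two optical
    pins a known distance apart, from their flash arrival times on the
    streak record."""
    if t2_s <= t1_s:
        raise ValueError("second pin must flash after the first")
    return separation_m / (t2_s - t1_s)
```

    For example, two pins 10 mm apart whose flashes arrive 2 μs apart imply an average velocity of 5 km/s.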

  5. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    NASA Astrophysics Data System (ADS)

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-11-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  6. Optical pin apparatus for measuring the arrival time and velocity of shock waves and particles

    DOEpatents

    Benjamin, Robert F.

    1987-01-01

    An apparatus for the detection of the arrival and for the determination of the velocity of disturbances such as shock-wave fronts and/or projectiles. Optical pins using fluid-filled microballoons as the light source and an optical fiber as a link to a photodetector have been used to investigate shock-waves and projectiles. A microballoon filled with a noble gas is affixed to one end of a fiber-optic cable, and the other end of the cable is attached to a high-speed streak camera. As the shock-front or projectile compresses the microballoon, the gas inside is heated and compressed producing a bright flash of light. The flash of light is transmitted via the optic cable to the streak camera where it is recorded. One image-converter streak camera is capable of recording information from more than 100 microballoon-cable combinations simultaneously.

  7. Optical pin apparatus for measuring the arrival time and velocity of shock waves and particles

    DOEpatents

    Benjamin, R.F.

    1987-03-10

    An apparatus is disclosed for the detection of the arrival and for the determination of the velocity of disturbances such as shock-wave fronts and/or projectiles. Optical pins using fluid-filled microballoons as the light source and an optical fiber as a link to a photodetector have been used to investigate shock-waves and projectiles. A microballoon filled with a noble gas is affixed to one end of a fiber-optic cable, and the other end of the cable is attached to a high-speed streak camera. As the shock-front or projectile compresses the microballoon, the gas inside is heated and compressed producing a bright flash of light. The flash of light is transmitted via the optic cable to the streak camera where it is recorded. One image-converter streak camera is capable of recording information from more than 100 microballoon-cable combinations simultaneously. 3 figs.

  8. Image Mosaicking Approach for a Double-Camera System in the GaoFen2 Optical Remote Sensing Satellite Based on the Big Virtual Camera.

    PubMed

    Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng

    2017-06-20

    The linear array push broom imaging mode is widely used for high resolution optical satellites (HROS). Using double-cameras attached by a high-rigidity support along with push broom imaging is one method to enlarge the field of view while ensuring high resolution. High accuracy image mosaicking is the key factor of the geometrical quality of complete stitched satellite imagery. This paper proposes a high accuracy image mosaicking approach based on the big virtual camera (BVC) in the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; then, each single image strip obtained by each TDI-CCD detector can be re-projected to the virtual detector of the big virtual camera coordinate system using forward-projection and backward-projection to obtain the corresponding single virtual image. After an on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper subtly uses the concept of the big virtual camera to obtain a stitched image and the corresponding high accuracy rational function model (RFM) for concurrent post processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining the geometric accuracy.
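    The forward/backward re-projection at the heart of this mosaicking can be illustrated with a plain pinhole model: project a real-camera pixel down to the ground surface, then back up through the virtual camera. This is a simplified sketch only (GF2 uses a rigorous push-broom sensor model, and the function names are ours):

```python
import numpy as np

def forward_project(pix, K, R, t, ground_z=0.0):
    """Intersect the line of sight of pixel `pix` with the plane
    z = ground_z. K: intrinsics; R, t: world-to-camera rotation and
    translation."""
    d = R.T @ np.linalg.inv(K) @ np.array([pix[0], pix[1], 1.0])
    o = -R.T @ t                      # camera centre in world coordinates
    s = (ground_z - o[2]) / d[2]
    return o + s * d

def backward_project(X, K, R, t):
    """Project world point X into the (virtual) camera's image plane."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]
```

    Stitching then amounts to resampling each real strip onto the shared virtual detector plane defined by the big virtual camera's model.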

  9. Wearable Contact Lens Biosensors for Continuous Glucose Monitoring Using Smartphones.

    PubMed

    Elsherif, Mohamed; Hassan, Mohammed Umair; Yetisen, Ali K; Butt, Haider

    2018-05-17

Low-cost, robust, and reusable continuous glucose monitoring systems that can provide quantitative measurements in point-of-care settings are an unmet medical need. Optical glucose sensors require complex and time-consuming fabrication processes, and their readouts are not practical for quantitative analyses. Here, a wearable contact lens optical sensor was created for the continuous quantification of glucose under physiological conditions, simplifying the fabrication process and facilitating smartphone readouts. A photonic microstructure with a periodicity of 1.6 μm was printed on a glucose-selective hydrogel film functionalized with phenylboronic acid. Upon binding with glucose, the microstructure volume swelled, which modulated the periodicity constant. The resulting change in the Bragg diffraction modulated the space between the zero- and first-order spots. A correlation was established between the periodicity constant and glucose concentration within 0-50 mM. The sensitivity of the sensor was 12 nm mM⁻¹, and the saturation response time was less than 30 min. The sensor was integrated with commercial contact lenses and utilized for continuous glucose monitoring using smartphone camera readouts. The reflected power of the first-order diffraction was measured via a smartphone application and correlated to the glucose concentrations. A short response time of 3 s and a saturation time of 4 min were achieved in the continuous monitoring mode. Glucose-sensitive photonic microstructures may have applications in point-of-care continuous monitoring devices and diagnostics in home settings.
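The readout rests on the grating equation sin θ_m = mλ/Λ: as glucose swells the hydrogel, the period Λ grows and the first-order spot moves toward the zero order. A minimal sketch of that relation, using the periodicity (1.6 μm) and sensitivity (12 nm mM⁻¹) reported above; the 532 nm probe wavelength and the screen distance are illustrative assumptions, not values from the paper.

```python
import math

LAMBDA = 532e-9        # probe wavelength (m); assumed green illumination
PERIOD_0 = 1.6e-6      # printed grating period at 0 mM glucose (from the record)
SENSITIVITY = 12e-9    # period change per mM glucose, 12 nm/mM (from the record)

def first_order_angle(glucose_mM):
    """First-order diffraction angle from sin(theta) = lambda / period.
    Swelling increases the period, so the angle shrinks with glucose."""
    period = PERIOD_0 + SENSITIVITY * glucose_mM
    return math.asin(LAMBDA / period)

def spot_separation(glucose_mM, screen_dist=0.05):
    """Zero- to first-order spot distance on a screen screen_dist metres away."""
    return screen_dist * math.tan(first_order_angle(glucose_mM))
```

A smartphone app can invert this monotonic mapping: measure the spot separation, then look up the glucose concentration.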

  10. Imaging monitoring techniques applications in the transient gratings detection

    NASA Astrophysics Data System (ADS)

    Zhao, Qing-ming

    2009-07-01

Experimental studies of degenerate four-wave mixing (DFWM) in iodine vapor at atmospheric pressure, at 0 °C and 25 °C, are reported. The laser-induced grating (LIG) studies are carried out by generating a thermal grating with a pulsed, narrow-bandwidth dye laser. A new image-processing system for detecting forward DFWM spectroscopy in iodine vapor is reported. The system comprises a CCD camera, an image-processing card and the related software. With this detection system, phase matching can easily be achieved in the optical arrangement by crossing the two pumps and the probe as diagonals linking opposite corners of a rectangular box, which also provides a way to position the photomultiplier tube (PMT). It is likewise practical to assess the effect of pointing stability on the optical path by monitoring how the facula changes with laser-beam pointing and environmental disturbances. Finally, the effect of the photostability of the dye laser on the signal-to-noise ratio in forward-geometry DFWM has been investigated in iodine vapor. This system makes feasible the potential application of FG-DFWM as a diagnostic tool in combustion research and environmental monitoring.
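The "diagonals of a rectangular box" arrangement mentioned above is the standard forward (BOXCARS-style) geometry: when the two pumps and the probe enter along three diagonals of a box, the DFWM signal k_s = k_1 + k_2 − k_3 exits along the fourth diagonal, so the PMT position is fixed by geometry. A small numerical check, with illustrative (assumed) box dimensions:

```python
import numpy as np

# Forward BOXCARS geometry: beams propagate from the origin toward the four
# corners of an a-by-b rectangle on a plane at distance L. Pumps take two
# opposite corners, the probe a third; the signal emerges at the fourth.
a, b, L = 0.01, 0.02, 1.0     # illustrative half-widths and length (m)
k1 = np.array([ a,  b, L])    # pump 1
k2 = np.array([-a, -b, L])    # pump 2 (diagonally opposite corner)
k3 = np.array([ a, -b, L])    # probe
k_sig = k1 + k2 - k3          # DFWM phase-matching condition
```

Because all four corner vectors have equal length, the degenerate (equal-|k|) phase-matching condition closes exactly in this geometry.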

  11. Comparison of standing volume estimates using optical dendrometers

    Treesearch

    Neil A. Clark; Stanley J. Zarnoch; Alexander Clark; Gregory A. Reams

    2001-01-01

    This study compared height and diameter measurements and volume estimates on 20 hardwood and 20 softwood stems using traditional optical dendrometers, an experimental camera instrument, and mechanical calipers. Multiple comparison tests showed significant differences among the means for lower stem diameters when the camera was used. There were no significant...

  13. Performance Evaluations and Quality Validation System for Optical Gas Imaging Cameras That Visualize Fugitive Hydrocarbon Gas Emissions

    EPA Science Inventory

    Optical gas imaging (OGI) cameras have the unique ability to exploit the electromagnetic properties of fugitive chemical vapors to make invisible gases visible. This ability is extremely useful for industrial facilities trying to mitigate product losses from escaping gas and fac...

  14. Development of a mirror-based endoscope for divertor spectroscopy on JET with the new ITER-like wall (invited).

    PubMed

    Huber, A; Brezinsek, S; Mertens, Ph; Schweer, B; Sergienko, G; Terra, A; Arnoux, G; Balshaw, N; Clever, M; Edlingdon, T; Egner, S; Farthing, J; Hartl, M; Horton, L; Kampf, D; Klammer, J; Lambertz, H T; Matthews, G F; Morlock, C; Murari, A; Reindl, M; Riccardo, V; Samm, U; Sanders, S; Stamp, M; Williams, J; Zastrow, K D; Zauner, C

    2012-10-01

    A new endoscope with optimised divertor view has been developed in order to survey and monitor the emission of specific impurities such as tungsten and the remaining carbon as well as beryllium in the tungsten divertor of JET after the implementation of the ITER-like wall in 2011. The endoscope is a prototype for testing an ITER relevant design concept based on reflective optics only. It may be subject to high neutron fluxes as expected in ITER. The operating wavelength range, from 390 nm to 2500 nm, allows the measurements of the emission of all expected impurities (W I, Be II, C I, C II, C III) with high optical transmittance (≥ 30% in the designed wavelength range) as well as high spatial resolution that is ≤ 2 mm at the object plane and ≤ 3 mm for the full depth of field (± 0.7 m). The new optical design includes options for in situ calibration of the endoscope transmittance during the experimental campaign, which allows the continuous tracing of possible transmittance degradation with time due to impurity deposition and erosion by fast neutral particles. In parallel to the new optical design, a new type of possibly ITER relevant shutter system based on pneumatic techniques has been developed and integrated into the endoscope head. The endoscope is equipped with four digital CCD cameras, each combined with two filter wheels for narrow band interference and neutral density filters. Additionally, two protection cameras in the λ > 0.95 μm range have been integrated in the optical design for the real time wall protection during the plasma operation of JET.

  15. X-ray and optical stereo-based 3D sensor fusion system for image-guided neurosurgery.

    PubMed

    Kim, Duk Nyeon; Chae, You Seong; Kim, Min Young

    2016-04-01

In neurosurgery, an image-guided operation is performed to confirm that the surgical instruments reach the exact lesion position. Among the multiple imaging modalities, an X-ray fluoroscope mounted on a C- or O-arm is widely used for monitoring the position of surgical instruments and the target position in the patient. However, frequent fluoroscopy can result in relatively high radiation doses, particularly for complex interventional procedures. The proposed system can reduce radiation exposure and provide accurate three-dimensional (3D) position information for the surgical instruments and the target. X-ray and optical stereo vision systems have been proposed for the C- or O-arm. The two subsystems share the same optical axis and are calibrated simultaneously. This allows easy augmentation of the camera image and the X-ray image, and the 3D measurements of both systems can be defined in a common coordinate space. The proposed dual stereoscopic imaging system is designed and implemented for mounting on an O-arm. The calibration error of the 3D coordinates of the optical stereo and X-ray stereo is within 0.1 mm in terms of the mean and the standard deviation. Further, image augmentation of the camera image and the X-ray image using an artificial skull phantom is achieved. As the developed dual stereoscopic imaging system provides 3D coordinates of the point of interest in both optical images and fluoroscopic images, it can be used by surgeons to confirm the position of surgical instruments in 3D space with minimum radiation exposure and to verify whether the instruments have reached the surgical target observed in fluoroscopic images.

  16. Development of a mirror-based endoscope for divertor spectroscopy on JET with the new ITER-like wall (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huber, A.; Brezinsek, S.; Mertens, Ph.

    2012-10-15

A new endoscope with optimised divertor view has been developed in order to survey and monitor the emission of specific impurities such as tungsten and the remaining carbon as well as beryllium in the tungsten divertor of JET after the implementation of the ITER-like wall in 2011. The endoscope is a prototype for testing an ITER relevant design concept based on reflective optics only. It may be subject to high neutron fluxes as expected in ITER. The operating wavelength range, from 390 nm to 2500 nm, allows the measurements of the emission of all expected impurities (W I, Be II, C I, C II, C III) with high optical transmittance (≥30% in the designed wavelength range) as well as high spatial resolution that is ≤2 mm at the object plane and ≤3 mm for the full depth of field (±0.7 m). The new optical design includes options for in situ calibration of the endoscope transmittance during the experimental campaign, which allows the continuous tracing of possible transmittance degradation with time due to impurity deposition and erosion by fast neutral particles. In parallel to the new optical design, a new type of possibly ITER relevant shutter system based on pneumatic techniques has been developed and integrated into the endoscope head. The endoscope is equipped with four digital CCD cameras, each combined with two filter wheels for narrow band interference and neutral density filters. Additionally, two protection cameras in the λ > 0.95 μm range have been integrated in the optical design for the real time wall protection during the plasma operation of JET.

  17. Optical design of portable nonmydriatic fundus camera

    NASA Astrophysics Data System (ADS)

    Chen, Weilin; Chang, Jun; Lv, Fengxian; He, Yifan; Liu, Xin; Wang, Dajiang

    2016-03-01

The fundus camera, a simple and widely used piece of medical equipment, plays a major role in the screening and diagnosis of retinal disease. Early fundus cameras dilated the pupil with a mydriatic to increase the amount of incoming light, which causes patients vertigo and blurred vision; the nonmydriatic design is therefore the trend in fundus cameras. A desktop fundus camera is not easy to carry and is only suitable for use in the hospital, whereas a portable nonmydriatic retinal camera is convenient for patient self-examination or for medical staff visiting a patient at home. This paper presents a portable nonmydriatic fundus camera with a field of view (FOV) of 40°. Two kinds of light source are used: 590 nm for imaging, while 808 nm light is used for observing the fundus at high resolving power. Ring lights and a hollow mirror are employed to suppress stray light from the cornea center. The focus of the camera is adjusted by repositioning the CCD along the optical axis. The diopter range is between -20 m⁻¹ and +20 m⁻¹.

  18. Pi of the Sky full system and the new telescope

    NASA Astrophysics Data System (ADS)

    Mankiewicz, L.; Batsch, T.; Castro-Tirado, A.; Czyrkowski, H.; Cwiek, A.; Cwiok, M.; Dabrowski, R.; Jelínek, M.; Kasprowicz, G.; Majcher, A.; Majczyna, A.; Malek, K.; Nawrocki, K.; Obara, L.; Opiela, R.; Piotrowski, L. W.; Siudek, M.; Sokolowski, M.; Wawrzaszek, R.; Wrochna, G.; Zaremba, M.; Żarnecki, A. F.

    2014-12-01

The Pi of the Sky is a system of wide-field-of-view robotic telescopes, which search for short-timescale astrophysical phenomena, especially for prompt optical GRB emission. The system was designed for autonomous operation, monitoring a large fraction of the sky to a depth of 12-13 mag and with a time resolution of the order of 1-10 seconds. The system design and observation strategy were successfully tested with a prototype detector operational at Las Campanas Observatory, Chile, from 2004 to 2009, which was moved to the San Pedro de Atacama Observatory in March 2011. In October 2010 the first unit of the final Pi of the Sky detector system, with 4 CCD cameras, was successfully installed at the INTA El Arenosillo Test Centre in Spain. In July 2013 three more units (12 CCD cameras) were commissioned and installed, together with the first one, on a new platform in INTA, extending sky coverage to about 6000 square degrees.

  19. Optical image acquisition system for colony analysis

    NASA Astrophysics Data System (ADS)

    Wang, Weixing; Jin, Wenbiao

    2006-02-01

Counting of colonies and plaques has a large number of applications, including food, dairy, beverages, hygiene, environmental monitoring, water, toxicology, sterility testing, AMES testing, pharmaceuticals, paints, sterile fluids and fungal contamination. Recently, many researchers and developers have worked on systems of this kind. On investigation, some existing systems show problems typical of a new technology product, one of the main ones being image acquisition. In order to acquire colony images of good quality, an illumination box was constructed as follows: the box includes front lighting and back lighting, which can be selected by users based on the properties of the colony dishes. With the illumination box, lighting is uniform and the colony dish can be placed in the same position every time, which makes image processing easy. A digital camera at the top of the box is connected to a PC with a USB cable, and all camera functions are controlled by the computer.

  20. The Sydney University PAPA camera

    NASA Astrophysics Data System (ADS)

    Lawson, Peter R.

    1994-04-01

    The Precision Analog Photon Address (PAPA) camera is a photon-counting array detector that uses optical encoding to locate photon events on the output of a microchannel plate image intensifier. The Sydney University camera is a 256x256 pixel detector which can operate at speeds greater than 1 million photons per second and produce individual photon coordinates with a deadtime of only 300 ns. It uses a new Gray coded mask-plate which permits a simplified optical alignment and successfully guards against vignetting artifacts.
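The Gray-coded mask is what makes the photon address robust: neighbouring positions differ in a single bit, so an event straddling a strip boundary is mislocated by at most one pixel. A minimal sketch of the encoding and its decode (generic binary-reflected Gray code, not the camera's actual electronics):

```python
def binary_to_gray(b: int) -> int:
    """Binary-reflected Gray code: adjacent values differ in exactly one bit,
    which is why a mask-plate boundary error costs at most one pixel."""
    return b ^ (b >> 1)

def gray_to_binary(g: int) -> int:
    """Decode a Gray-coded photon address back to a plain binary coordinate
    by folding the shifted value into itself."""
    b = g
    g >>= 1
    while g:
        b ^= g
        g >>= 1
    return b
```

In the camera each mask-plate strip contributes one bit of the address, so the per-photon coordinate is just this decode applied to the detector outputs.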

  1. ARNICA, the Arcetri near-infrared camera: Astronomical performance assessment.

    NASA Astrophysics Data System (ADS)

    Hunt, L. K.; Lisi, F.; Testi, L.; Baffa, C.; Borelli, S.; Maiolino, R.; Moriondo, G.; Stanga, R. M.

    1996-01-01

    The Arcetri near-infrared camera ARNICA was built as a users' instrument for the Infrared Telescope at Gornergrat (TIRGO), and is based on a 256x256 NICMOS 3 detector. In this paper, we discuss ARNICA's optical and astronomical performance at the TIRGO and at the William Herschel Telescope on La Palma. Optical performance is evaluated in terms of plate scale, distortion, point spread function, and ghosting. Astronomical performance is characterized by camera efficiency, sensitivity, and spatial uniformity of the photometry.

  2. 640x480 PtSi Stirling-cooled camera system

    NASA Astrophysics Data System (ADS)

    Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; Coyle, Peter J.; Feder, Howard L.; Gilmartin, Harvey R.; Levine, Peter A.; Sauer, Donald J.; Shallcross, Frank V.; Demers, P. L.; Smalser, P. J.; Tower, John R.

    1992-09-01

A Stirling-cooled 3-5 micron camera system has been developed. The camera employs a monolithic 640 x 480 PtSi-MOS focal plane array. The camera system achieves an NEDT = 0.10 K at a 30 Hz frame rate with f/1.5 optics (300 K background). At a spatial frequency of 0.02 cycles/mrad, the vertical and horizontal minimum resolvable temperatures are in the range of MRT = 0.03 K (f/1.5 optics, 300 K background). The MOS focal plane array achieves a resolution of 480 TV lines per picture height independent of background level and position within the frame.

  3. Scaling-up camera traps: monitoring the planet's biodiversity with networks of remote sensors

    USGS Publications Warehouse

    Steenweg, Robin; Hebblewhite, Mark; Kays, Roland; Ahumada, Jorge A.; Fisher, Jason T.; Burton, Cole; Townsend, Susan E.; Carbone, Chris; Rowcliffe, J. Marcus; Whittington, Jesse; Brodie, Jedediah; Royle, Andy; Switalski, Adam; Clevenger, Anthony P.; Heim, Nicole; Rich, Lindsey N.

    2017-01-01

    Countries committed to implementing the Convention on Biological Diversity's 2011–2020 strategic plan need effective tools to monitor global trends in biodiversity. Remote cameras are a rapidly growing technology that has great potential to transform global monitoring for terrestrial biodiversity and can be an important contributor to the call for measuring Essential Biodiversity Variables. Recent advances in camera technology and methods enable researchers to estimate changes in abundance and distribution for entire communities of animals and to identify global drivers of biodiversity trends. We suggest that interconnected networks of remote cameras will soon monitor biodiversity at a global scale, help answer pressing ecological questions, and guide conservation policy. This global network will require greater collaboration among remote-camera studies and citizen scientists, including standardized metadata, shared protocols, and security measures to protect records about sensitive species. With modest investment in infrastructure, and continued innovation, synthesis, and collaboration, we envision a global network of remote cameras that not only provides real-time biodiversity data but also serves to connect people with nature.

  4. The outlook of innovative optical-electronic technologies implementation in transportation

    NASA Astrophysics Data System (ADS)

    Shilina, Elena V.; Ryabichenko, Roman B.

    2005-06-01

Information and telecommunication technologies (ITT) are already a tool of economic development, and their role will grow. The first task is to provide the information security of ITT that is necessary for their spread in the "information" society. The state policy of the leading world countries (USA, France, Japan, Great Britain and China) is focused on investing huge funds in the development of innovative technologies. Within the next 4-6 years the main fiber-optic transfer lines will reach data transfer speeds of 40 Gbit/s with 60-200 multiplexed channels, providing an effective data transfer speed of 2.4-8 Tbit/s. Photonic-crystal fibers will be a promising basis for new-generation fiber-optic transfer lines. The market for information imaging devices and digital photo cameras will grow by a factor of 3-5. Powerful CO2 and Nd:YAG lasers will be actively used in transport machinery construction for producing aluminum structures for light rolling stock. Light-emitting diodes (LEDs) will be the basis of energy-saving and safety light sources for vehicles and indoor lighting; in the USA, for example, the resulting reduction in lighting costs will be 200 billion dollars. An analysis of the implementation of optical-electronic and photonic technologies (OPT) in ground and aerospace systems shows that they significantly increase traffic safety and crew and passenger comfort through smart vehicle construction, non-contact dynamic monitoring of both transport facilities (for example, wheel flanges) and the condition of the rail track (road surface), and the equipping of vehicles with night-vision equipment.
The scientific-technical programs of JSC "RZD" propose applying OPT in new-generation systems: monitoring of axle-box units on coaches and freight cars in motion, track-condition analysis, detection of mechanical stress and permanent-way irregularities, monitoring of the geometric parameters of the aerial contact wire, car trucks, rails and wheel-pair rolling surfaces, automatic detection of light signals from the locomotive, video monitoring, and fiber-optic gyroscopes.

  5. Optical Design of the Camera for Transiting Exoplanet Survey Satellite (TESS)

    NASA Technical Reports Server (NTRS)

    Chrisp, Michael; Clark, Kristin; Primeau, Brian; Dalpiaz, Michael; Lennon, Joseph

    2015-01-01

    The optical design of the wide field of view refractive camera, 34 degrees diagonal field, for the TESS payload is described. This fast f/1.4 cryogenic camera, operating at -75 C, has no vignetting for maximum light gathering within the size and weight constraints. Four of these cameras capture full frames of star images for photometric searches of planet crossings. The optical design evolution, from the initial Petzval design, took advantage of Forbes aspheres to develop a hybrid design form. This maximized the correction from the two aspherics resulting in a reduction of average spot size by sixty percent in the final design. An external long wavelength pass filter was replaced by an internal filter coating on a lens to save weight, and has been fabricated to meet the specifications. The stray light requirements were met by an extended lens hood baffle design, giving the necessary off-axis attenuation.

  6. Plenoptic Imager for Automated Surface Navigation

    NASA Technical Reports Server (NTRS)

Zollar, Byron; Milder, Andrew; Mayo, Michael

    2010-01-01

An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved the feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprising a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mounts a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.
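Range calculation in a plenoptic camera ultimately reduces to measuring the disparity of a feature between sub-aperture views and triangulating, much like a stereo pair with a very short baseline. The sketch below illustrates that principle only; the SSD search, the focal length in pixels, and the baseline value are illustrative assumptions, not Nanohmics' actual algorithm:

```python
import numpy as np

def disparity_1d(ref, tgt, max_shift=5):
    """Integer disparity between two 1-D sub-aperture slices found by an
    exhaustive sum-of-squared-differences search over candidate shifts."""
    best, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        err = np.sum((ref - np.roll(tgt, s)) ** 2)
        if err < best_err:
            best, best_err = s, err
    return best

def range_from_disparity(disp_px, focal_px, baseline_m):
    """Stereo-style range estimate Z = f * B / d between two sub-aperture views."""
    if disp_px == 0:
        return np.inf   # zero disparity: object effectively at infinity
    return focal_px * baseline_m / abs(disp_px)
```

A full plenoptic pipeline repeats this per image patch across many sub-aperture pairs and fuses the estimates into a range map.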

  7. Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography

    USGS Publications Warehouse

    Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.

    1972-01-01

    Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.

  8. Reconditioning of Cassini Narrow-Angle Camera

    NASA Image and Video Library

    2002-07-23

    These five images of single stars, taken at different times with the narrow-angle camera on NASA Cassini spacecraft, show the effects of haze collecting on the camera optics, then successful removal of the haze by warming treatments.

  9. Evaluation of modified portable digital camera for screening of diabetic retinopathy.

    PubMed

    Chalam, Kakarla V; Brar, Vikram S; Keshavamurthy, Ravi

    2009-01-01

    To describe a portable wide-field noncontact digital camera for posterior segment photography. The digital camera has a compound lens consisting of two optical elements (a 90-dpt and a 20-dpt lens) attached to a 7.2-megapixel camera. White-light-emitting diodes are used to illuminate the fundus and reduce source reflection. The camera settings are set to candlelight mode, the optic zoom standardized to x2.4 and the focus is manually set to 3.0 m. The new technique provides quality wide-angle digital images of the retina (60 degrees ) in patients with dilated pupils, at a fraction of the cost of established digital fundus photography. The modified digital camera is a useful alternative technique to acquire fundus images and provides a tool for screening posterior segment conditions, including diabetic retinopathy in a variety of clinical settings.

  10. Time-resolved X-ray excited optical luminescence using an optical streak camera

    NASA Astrophysics Data System (ADS)

    Ward, M. J.; Regier, T. Z.; Vogt, J. M.; Gordon, R. A.; Han, W.-Q.; Sham, T. K.

    2013-03-01

    We report the development of a time-resolved XEOL (TR-XEOL) system that employs an optical streak camera. We have conducted TR-XEOL experiments at the Canadian Light Source (CLS) operating in single bunch mode with a 570 ns dark gap and 35 ps electron bunch pulse, and at the Advanced Photon Source (APS) operating in top-up mode with a 153 ns dark gap and 33.5 ps electron bunch pulse. To illustrate the power of this technique we measured the TR-XEOL of solid-solution nanopowders of gallium nitride - zinc oxide, and for the first time have been able to resolve near-band-gap (NBG) optical luminescence emission from these materials. Herein we will discuss the development of the streak camera TR-XEOL technique and its application to the study of these novel materials.

  11. Exact optics - III. Schwarzschild's spectrograph camera revised

    NASA Astrophysics Data System (ADS)

    Willstrop, R. V.

    2004-03-01

    Karl Schwarzschild identified a system of two mirrors, each defined by conic sections, free of third-order spherical aberration, coma and astigmatism, and with a flat focal surface. He considered it impractical, because the field was too restricted. This system was rediscovered as a quadratic approximation to one of Lynden-Bell's `exact optics' designs which have wider fields. Thus the `exact optics' version has a moderate but useful field, with excellent definition, suitable for a spectrograph camera. The mirrors are strongly aspheric in both the Schwarzschild design and the exact optics version.

  12. Dual beam optical interferometer

    NASA Technical Reports Server (NTRS)

    Gutierrez, Roman C. (Inventor)

    2003-01-01

    A dual beam interferometer device is disclosed that enables moving an optics module in a direction, which changes the path lengths of two beams of light. The two beams reflect off a surface of an object and generate different speckle patterns detected by an element, such as a camera. The camera detects a characteristic of the surface.

  13. SPLASSH: Open source software for camera-based high-speed, multispectral in-vivo optical image acquisition

    PubMed Central

    Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.

    2010-01-01

    Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475
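The strobed-LED scheme described above means each camera frame is lit by exactly one wavelength, so a multispectral stack is recovered by de-interleaving the frame sequence. A minimal sketch of that scheduling and demultiplexing step (the function names are hypothetical, not SPLASSH's API):

```python
import numpy as np

def led_schedule(n_frames, n_leds):
    """LED index fired on each camera trigger: the microcontroller simply
    advances through the LEDs once per frame, wrapping around."""
    return [i % n_leds for i in range(n_frames)]

def demux_frames(frames, n_leds):
    """De-interleave a strobed multispectral stack: frame i was illuminated
    by LED i % n_leds, so channel k is every n_leds-th frame starting at k."""
    return [frames[k::n_leds] for k in range(n_leds)]
```

Each demultiplexed sub-stack is then a single-wavelength movie at 1/n_leds of the camera frame rate, ready for per-channel analysis such as hemodynamic imaging.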

  14. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

Infrared (IR) camera modules for the LWIR (8-12 μm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are becoming more and more of a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), for broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle, etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high-resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, the suitability for fully automated measurements in mass production.
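MTF measurement of an assembled module is commonly derived from an edge target: the edge spread function (ESF) is differentiated to a line spread function (LSF), whose normalized Fourier magnitude is the MTF. A minimal one-dimensional sketch of that chain (real slanted-edge implementations add sub-pixel oversampling and windowing, which are omitted here):

```python
import numpy as np

def mtf_from_esf(esf):
    """MTF from a 1-D edge spread function: differentiate to get the line
    spread function, then take the normalized magnitude of its FFT so that
    MTF(0) = 1."""
    lsf = np.diff(esf)                 # ESF derivative = LSF
    spectrum = np.abs(np.fft.rfft(lsf))
    return spectrum / spectrum[0]      # normalize by the DC term
```

For a perfect step edge the LSF is an impulse and the MTF is flat at 1; any optical blur widens the LSF and rolls the MTF off toward high spatial frequencies.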

  15. Final Technical Report: Development of Post-Installation Monitoring Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polagye, Brian

    2014-03-31

    The development of approaches to harness marine and hydrokinetic energy at large-scale is predicated on the compatibility of these generation technologies with the marine environment. At present, aspects of this compatibility are uncertain. Demonstration projects provide an opportunity to address these uncertainties in a way that moves the entire industry forward. However, the monitoring capabilities to realize these advances are often under-developed in comparison to the marine and hydrokinetic energy technologies being studied. Public Utility District No. 1 of Snohomish County has proposed to deploy two 6-meter diameter tidal turbines manufactured by OpenHydro in northern Admiralty Inlet, Puget Sound, Washington.more » The goal of this deployment is to provide information about the environmental, technical, and economic performance of such turbines that can advance the development of larger-scale tidal energy projects, both in the United States and internationally. The objective of this particular project was to develop environmental monitoring plans in collaboration with resource agencies, while simultaneously advancing the capabilities of monitoring technologies to the point that they could be realistically implemented as part of these plans. In this, the District was joined by researchers at the Northwest National Marine Renewable Energy Center at the University of Washington, Sea Mammal Research Unit, LLC, H.T. Harvey & Associates, and Pacific Northwest National Laboratory. Over a two year period, the project team successfully developed four environmental monitoring and mitigation plans that were adopted as a condition of the operating license for the demonstration project that issued by the Federal Energy Regulatory Commission in March 2014. 
These plans address near-turbine interactions with marine animals, the sound produced by the turbines, marine mammal behavioral changes associated with the turbines, and changes to benthic habitat associated with colonization of the subsea base support structure. In support of these plans, the project team developed and field tested a strobe-illuminated stereo-optical camera system suitable for studying near-turbine interactions with marine animals. The camera system underwent short-term field testing at the proposed turbine deployment site and a multi-month endurance test in shallower water to evaluate the effectiveness of biofouling mitigation measures for the optical ports on the camera and strobe pressure housings. These tests demonstrated that the camera system is likely to meet the objectives of the near-turbine monitoring plan and to operate, without maintenance, for periods of at least three months. The project team also advanced monitoring capabilities related to passive acoustic monitoring of marine mammals and monitoring of tidal currents. These capabilities will be integrated into a recoverable monitoring package that has a single interface point with the OpenHydro turbines, connects to shore power and data via a wet-mate connector, and can be recovered to the surface for maintenance and reconfiguration independently of the turbine. A logical next step would be to integrate these instruments within the package, such that one instrument can trigger the operation of another.

  16. Multimodal optical setup based on spectrometer and cameras combination for biological tissue characterization with spatially modulated illumination

    NASA Astrophysics Data System (ADS)

    Baruch, Daniel; Abookasis, David

    2017-04-01

    The application of optical techniques as tools for biomedical research has generated substantial interest for the ability of such methodologies to simultaneously measure biochemical and morphological parameters of tissue. Ongoing optimization of optical techniques may introduce such tools as alternatives or complements to conventional methodologies. The common approach shared by current optical techniques lies in the independent acquisition of a tissue's optical properties (i.e., absorption and reduced scattering coefficients) from reflected or transmitted light. These optical parameters, in turn, provide detailed information regarding both the concentrations of clinically relevant chromophores and macroscopic structural variations in tissue. We couple a noncontact optical setup with a simple analysis algorithm to obtain the absorption and scattering coefficients of biological samples under test. Technically, a portable picoprojector projects serial sinusoidal patterns at low and high spatial frequencies, while a single spectrometer and two separate CCD cameras, each fitted with a different bandpass filter at a nonisosbestic or isosbestic wavelength, simultaneously acquire the reflected diffuse light. The two acquisition channels complement each other, yielding the optical properties of tissue at both high spectral and high spatial resolution. Experiments were performed on tissue-mimicking phantoms as well as the hands of healthy human volunteers to quantify their optical properties as a proof of concept for the present technique. In a separate experiment, we derived the optical properties of the hand skin from the measured diffuse reflectance, based on a recently developed camera model. Additionally, oxygen saturation levels of tissue measured by the system were found to agree well with reference values.
Taken together, the present results demonstrate the potential of this integrated setup for diagnostic and research applications.
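Sinusoidal-pattern acquisition of the kind described above is commonly reduced to AC and DC reflectance maps by three-phase demodulation, the standard first step in spatial frequency domain imaging. A hedged sketch of that step (the generic textbook formula, not the authors' code; the function name is illustrative):

```python
import numpy as np

def demodulate_sfdi(i1, i2, i3):
    """Three-phase demodulation for spatially modulated illumination.
    i1, i2, i3: images of the same sinusoidal pattern phase-shifted
    by 0, 2*pi/3 and 4*pi/3. Returns (AC amplitude, DC amplitude),
    which feed the subsequent optical-property lookup."""
    i1, i2, i3 = (np.asarray(a, dtype=float) for a in (i1, i2, i3))
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)
    dc = (i1 + i2 + i3) / 3.0
    return ac, dc

# Example: synthetic patterns with dc = 1.0, ac = 0.3, arbitrary phase.
phase = 0.7
frames = [1.0 + 0.3 * np.cos(phase + k * 2 * np.pi / 3) for k in range(3)]
ac, dc = demodulate_sfdi(*frames)
```

The pairwise-difference form makes the result independent of the unknown local phase, which is why exactly three shifted projections suffice.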

  17. Dynamic measurements and simulations of airborne picolitre-droplet coalescence in holographic optical tweezers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bzdek, Bryan R.; Reid, Jonathan P., E-mail: j.p.reid@bristol.ac.uk; Collard, Liam

    We report studies of the coalescence of pairs of picolitre aerosol droplets manipulated with holographic optical tweezers, probing the shape relaxation dynamics following coalescence by simultaneously monitoring the intensity of elastic backscattered light (EBL) from the trapping laser beam (time resolution on the order of 100 ns) while recording high frame rate camera images (time resolution <10 μs). The goals of this work are to: resolve the dynamics of droplet coalescence in holographic optical traps; assign the origin of key features in the time-dependent EBL intensity; and validate the use of the EBL alone to precisely determine droplet surface tension and viscosity. For low viscosity droplets, two sequential processes are evident: binary coalescence first results from the overlap of the optical traps on the time scale of microseconds followed by the recapture of the composite droplet in an optical trap on the time scale of milliseconds. As droplet viscosity increases, the relaxation in droplet shape eventually occurs on the same time scale as recapture, resulting in a convoluted evolution of the EBL intensity that inhibits quantitative determination of the relaxation time scale. Droplet coalescence was simulated using a computational framework to validate both experimental approaches. The results indicate that time-dependent monitoring of droplet shape from the EBL intensity allows for robust determination of properties such as surface tension and viscosity. Finally, the potential of high frame rate imaging to examine the coalescence of dissimilar viscosity droplets is discussed.
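For low-viscosity droplets of this kind, surface tension and viscosity are conventionally obtained from the shape-oscillation frequency and damping time via Rayleigh's and Lamb's classical results for a freely relaxing drop. A sketch of that last step, assuming the l = 2 mode frequency and exponential decay time have already been fitted from the EBL trace (generic textbook relations, not the authors' analysis code):

```python
import numpy as np

def surface_tension(freq_hz, radius_m, density, l=2):
    """Surface tension from the l-th shape-mode oscillation frequency
    of a droplet (Rayleigh's result for an inviscid drop):
        omega_l^2 = l (l-1) (l+2) sigma / (rho r^3)."""
    omega = 2 * np.pi * freq_hz
    return omega ** 2 * density * radius_m ** 3 / (l * (l - 1) * (l + 2))

def viscosity(decay_time_s, radius_m, density, l=2):
    """Viscosity from the damping time of the same mode
    (Lamb's result): tau = rho r^2 / ((l-1)(2l+1) mu)."""
    return density * radius_m ** 2 / ((l - 1) * (2 * l + 1) * decay_time_s)

# Example: a 5 um radius water-like droplet (rho = 1000 kg/m^3).
# Round trip: synthesize the l = 2 frequency for sigma = 0.072 N/m,
# and the decay time for mu = 1e-3 Pa*s, then recover both.
f_l2 = np.sqrt(2 * 1 * 4 * 0.072 / (1000 * (5e-6) ** 3)) / (2 * np.pi)
sigma = surface_tension(f_l2, 5e-6, 1000)
tau = 1000 * (5e-6) ** 2 / (1 * 5 * 1e-3)
mu = viscosity(tau, 5e-6, 1000)
```

The megahertz-scale mode frequency this predicts for picolitre droplets is what motivates the ~100 ns EBL time resolution quoted in the record.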

  18. Clinical applications of commercially available video recording and monitoring systems: inexpensive, high-quality video recording and monitoring systems for endoscopy and microsurgery.

    PubMed

    Tsunoda, Koichi; Tsunoda, Atsunobu; Ishimoto, ShinnIchi; Kimura, Satoko

    2006-01-01

    Dedicated charge-coupled device (CCD) camera systems for endoscopes and electronic fiberscopes are in widespread use. However, both are usually stationary in an office or examination room, and a wheeled cart is needed for mobility. The total costs of the CCD camera system and the electronic fiberscopy system are at least US $10,000 and US $30,000, respectively. Recently, the performance of audio and visual instruments has improved dramatically, with a concomitant reduction in their cost. Commercially available CCD video cameras with small monitors have become common; they provide excellent image quality and are much smaller and less expensive than previous models. The authors have developed adaptors for the popular mini-digital video (mini-DV) camera. The camera also provides video and acoustic output signals; therefore, the endoscopic images can be viewed on a large monitor simultaneously. The new system (a mini-DV video camera and an adaptor) costs only US $1,000. The system is therefore both cost-effective and useful in the outpatient clinic or casualty setting, or on house calls for the purpose of patient education. In the future, the authors plan to introduce the clinical application of a high-vision camera and an infrared camera as medical instruments for clinical and research situations.

  19. Design, demonstration and testing of low F-number LWIR panoramic imaging relay optics

    NASA Astrophysics Data System (ADS)

    Furxhi, Orges; Frascati, Joe; Driggers, Ronald

    2018-04-01

    Panoramic imaging is inherently wide field of view, and high-sensitivity uncooled Long Wave Infrared (LWIR) imaging requires low F-number optics. These two requirements result in short back working distance designs that, in addition to being costly, are challenging to integrate with commercially available uncooled LWIR cameras and cores. Common challenges include relocation of the shutter flag, custom calibration of the camera dynamic range and NUC tables, focusing, and athermalization. Solutions to these challenges add to the system cost and make panoramic uncooled LWIR cameras commercially unattractive. In this paper, we present the design of Panoramic Imaging Relay Optics (PIRO) and show imagery and test results from one of the first prototypes. PIRO designs use several reflective surfaces (generally two) to relay a panoramic scene onto a real, donut-shaped image. The PIRO donut is imaged onto the focal plane of the camera using a commercial off-the-shelf (COTS) low F-number lens. This approach results in low component cost and straightforward integration with pre-calibrated commercially available cameras and lenses.
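A donut-shaped panoramic image of this kind must be unwrapped into a rectangular panorama before display. A minimal nearest-neighbor unwarp sketch, assuming a circular donut centered in the frame (the geometry, radii, and function name are illustrative, not the actual PIRO mapping):

```python
import numpy as np

def unwrap_donut(img, center, r_in, r_out, out_w=720, out_h=None):
    """Unwrap a donut-shaped panoramic image into a rectangular panorama
    by nearest-neighbor sampling along rays from the image center.
    Rows map to radius (elevation), columns to azimuth."""
    if out_h is None:
        out_h = int(r_out - r_in)
    cx, cy = center
    theta = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    radius = np.linspace(r_out, r_in, out_h)      # top row = outer edge
    rr, tt = np.meshgrid(radius, theta, indexing="ij")
    x = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
    y = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
    return img[y, x]

# Example: a synthetic image whose pixel value equals its distance
# from the center, so each unwrapped row should be nearly constant.
yy, xx = np.mgrid[0:201, 0:201]
radial = np.hypot(xx - 100, yy - 100)
pano = unwrap_donut(radial, (100, 100), 20, 80, out_w=360, out_h=60)
```

A production unwarp would use bilinear interpolation and a calibrated center/radius, but the polar resampling idea is the same.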

  20. Motionless active depth from defocus system using smart optics for camera autofocus applications

    NASA Astrophysics Data System (ADS)

    Amin, M. Junaid; Riza, Nabeel A.

    2016-04-01

    This paper describes a motionless active Depth from Defocus (DFD) system design suited for long working range camera autofocus applications. The design consists of an active illumination module that projects a scene illuminating coherent conditioned optical radiation pattern which maintains its sharpness over multiple axial distances allowing an increased DFD working distance range. The imager module of the system responsible for the actual DFD operation deploys an electronically controlled variable focus lens (ECVFL) as a smart optic to enable a motionless imager design capable of effective DFD operation. An experimental demonstration is conducted in the laboratory which compares the effectiveness of the coherent conditioned radiation module versus a conventional incoherent active light source, and demonstrates the applicability of the presented motionless DFD imager design. The fast response and no-moving-parts features of the DFD imager design are especially suited for camera scenarios where mechanical motion of lenses to achieve autofocus action is challenging, for example, in the tiny camera housings in smartphones and tablets. Applications for the proposed system include autofocus in modern day digital cameras.

  1. Accurate estimation of camera shot noise in the real-time

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.

    2017-10-01

    Nowadays digital cameras are essential parts of various technological processes and daily tasks. They are widely used in optics and photonics, astronomy, biology and other fields of science and technology such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and video cameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: the random component appears as temporal noise, while the pattern component appears as spatial noise. Temporal noise can be further divided into signal-dependent shot noise and signal-independent dark temporal noise. For measurement of camera noise characteristics, standardized methods (for example, EMVA Standard 1288) are the most widely used. They allow precise shot and dark temporal noise measurement but are difficult to implement and time-consuming. Earlier we proposed a method for measurement of temporal noise of photo- and video cameras based on the automatic segmentation of nonuniform targets (ASNT). Only two frames are sufficient for noise measurement with the modified method. In this paper, we registered frames and estimated shot and dark temporal noise of cameras in real time using the modified ASNT method. Estimation was performed for the following cameras: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PL-B781F (CMOS, 6.6 MP, 10-bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. The time to register and process the frames used for temporal noise estimation was measured: using a standard computer, frames were registered and processed in a fraction of a second to several seconds. The accuracy of the obtained temporal noise values was also estimated.
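The core idea behind two-frame temporal-noise estimation, that differencing two registered frames cancels the fixed pattern and leaves twice the temporal variance, whose growth with signal reveals the Poisson shot-noise slope, can be illustrated with simulated data. A hedged sketch (idealized patches in electron units with unit gain, not the authors' ASNT implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two registered frames of a nonuniform target: signal varies across
# patches; noise = dark (Gaussian) + shot (Poisson) components.
levels = np.linspace(50, 2000, 40)   # patch signal levels, electrons
dark_sigma = 5.0                     # dark temporal noise, electrons
patch = 4000                         # pixels per uniform patch

means, variances = [], []
for s in levels:
    f1 = rng.poisson(s, patch) + rng.normal(0, dark_sigma, patch)
    f2 = rng.poisson(s, patch) + rng.normal(0, dark_sigma, patch)
    means.append((f1.mean() + f2.mean()) / 2)
    # Differencing cancels any fixed pattern; the temporal variance
    # is half the variance of the difference frame.
    variances.append(np.var(f1 - f2) / 2)

# Poisson shot noise: variance grows linearly with signal.
# In electron units the slope is ~1; the intercept is the dark variance.
slope, intercept = np.polyfit(means, variances, 1)
```

The measured camera curves in the record follow the same linear Poisson trend until pixels approach saturation, where the variance rolls off.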

  2. Single-camera stereo-digital image correlation with a four-mirror adapter: optimized design and validation

    NASA Astrophysics Data System (ADS)

    Yu, Liping; Pan, Bing

    2016-12-01

    A low-cost, easy-to-implement but practical single-camera stereo-digital image correlation (DIC) system using a four-mirror adapter is established for accurate shape and three-dimensional (3D) deformation measurements. The mirror-assisted pseudo-stereo imaging system converts a single camera into two virtual cameras, which view a specimen from different angles and record the surface images of the test object onto the two halves of the camera sensor. To enable deformation measurement in non-laboratory conditions or extremely high-temperature environments, an active imaging optical design, combining an actively illuminated monochromatic source with a coupled band-pass optical filter, is compactly integrated into the pseudo-stereo DIC system. The optical design, basic principles and implementation procedures of the established system for 3D profile and deformation measurements are described in detail. The effectiveness and accuracy of the established system are verified by measuring the profile of a regular cylinder surface and the displacements of a translated planar plate. As an application example, the established system is used to determine the tensile strains and Poisson's ratio of a composite solid propellant specimen during a stress relaxation test. Since the established single-camera stereo-DIC system needs only a single camera and presents strong robustness against variations in ambient light or the thermal radiation of a hot object, it demonstrates great potential for determining transient deformation in non-laboratory or high-temperature environments with the aid of a single high-speed camera.

  3. Image quality enhancement method for on-orbit remote sensing cameras using invariable modulation transfer function.

    PubMed

    Li, Jin; Liu, Zilong

    2017-07-24

    Remote sensing cameras in the visible/near-infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e., image quality, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself, including its optical system, image sensor, and electronics, limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which is stable and invariant to changes in ground targets, atmosphere, and environment on orbit or on the ground, depending only on the camera itself, is extracted using a pixel optical focal plane (PFP). The PFP produces multiple spatial frequency targets, which are used to calculate the IMTF at different frequencies. The resulting IMTF, in combination with a constrained least-squares filter, compensates for the IMTF, which amounts to removing the imaging degradation caused by the camera itself. This method is experimentally confirmed. Experiments on an on-orbit panchromatic camera indicate that the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56 compared to the case when the IMTF is not used. This opens the door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
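The constrained least-squares compensation step can be sketched in one dimension: the measured transfer function (here the IMTF) plays the role of the blur transfer function H, and a Laplacian smoothness constraint regularizes frequencies where H is small. A generic CLS filter sketch under a circular-convolution model, not the authors' on-orbit pipeline:

```python
import numpy as np

def cls_deconvolve(g, h, gamma=0.01):
    """Constrained least-squares (CLS) restoration, 1-D sketch.
    g: degraded signal; h: blur kernel padded to len(g), centered at
    index 0 (circular model); gamma: weight on the discrete Laplacian
    smoothness constraint."""
    n = g.size
    G = np.fft.fft(g)
    H = np.fft.fft(h, n)
    lap = np.zeros(n)
    lap[[0, 1, -1]] = [2.0, -1.0, -1.0]   # circular discrete Laplacian
    P = np.fft.fft(lap)
    F = np.conj(H) * G / (np.abs(H) ** 2 + gamma * np.abs(P) ** 2)
    return np.real(np.fft.ifft(F))

# Example: blur a smooth signal with a known kernel, then restore it.
n = 64
f = np.sin(2 * np.pi * np.arange(n) / n)
h = np.zeros(n)
h[[0, 1, -1]] = [0.5, 0.25, 0.25]         # symmetric 3-tap blur
g = np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(h)))
restored = cls_deconvolve(g, h, gamma=1e-4)
```

With gamma = 0 this reduces to inverse filtering, which amplifies noise wherever H is near zero; the Laplacian term suppresses exactly those frequencies.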

  4. Movable Cameras And Monitors For Viewing Telemanipulator

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.; Venema, Steven C.

    1993-01-01

    Three methods proposed to assist operator viewing telemanipulator on video monitor in control station when video image generated by movable video camera in remote workspace of telemanipulator. Monitors rotated or shifted and/or images in them transformed to adjust coordinate systems of scenes visible to operator according to motions of cameras and/or operator's preferences. Reduces operator's workload and probability of error by obviating need for mental transformations of coordinates during operation. Methods applied in outer space, undersea, in nuclear industry, in surgery, in entertainment, and in manufacturing.

  5. Comparative analysis of three different methods for monitoring the use of green bridges by wildlife.

    PubMed

    Gužvica, Goran; Bošnjak, Ivana; Bielen, Ana; Babić, Danijel; Radanović-Gužvica, Biserka; Šver, Lidija

    2014-01-01

    Green bridges are used to decrease the highly negative impact of roads/highways on wildlife populations, and their effectiveness is evaluated by various monitoring methods. Based on 3-year monitoring of four Croatian green bridges, we compared the effectiveness of three indirect monitoring methods: track-pads, camera traps and an active infrared (IR) trail monitoring system. The ability of the methods to detect different species and to give a good estimate of the number of animal crossings was analyzed. The accuracy of species detection by the track-pad method was influenced by the granulometric composition of the track-pad material, with the best results obtained with a higher percentage of silt and clay. We compared the species composition determined by the track-pad and camera trap methods and found that monitoring by tracks underestimated the ratio of small canids, while camera traps underestimated the ratio of roe deer. Regarding the total number of recorded events, active IR detectors recorded from 11 to 19 times more events than camera traps, and approx. 80% of them were not caused by animal crossings. The camera trap method underestimated the real number of total events. Therefore, an algorithm for filtration of the IR dataset was developed to approximate the real number of crossings. The presented results are valuable for future monitoring of wildlife crossings in Croatia and elsewhere, since the advantages and disadvantages of the used monitoring methods are shown. In conclusion, different methods should be chosen or combined depending on the aims of the particular monitoring study.

  6. Combined hostile fire and optics detection

    NASA Astrophysics Data System (ADS)

    Brännlund, Carl; Tidström, Jonas; Henriksson, Markus; Sjöqvist, Lars

    2013-10-01

    Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (Short Wave Infrared) camera-based system with the capability to detect and locate snipers both before and after a shot over a large field-of-view. The high-frame-rate SWIR camera allows resolution of the temporal profile of muzzle flashes, the infrared signature associated with the ejection of the bullet from the rifle. The capability to detect and discriminate sniper muzzle flashes with this system was verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with a slit-shaped beam profile is scanned over the camera field-of-view to detect retro-reflections from optical sights. The optics detection system has been tested at various distances up to 1.15 km, showing the feasibility of detecting rifle scopes in full daylight. The high-speed camera gives the possibility to discriminate false alarms by analyzing the temporal data. The intensity variation caused by atmospheric turbulence enables discrimination of small sights from larger reflectors due to aperture averaging, even though the targets cover only a single pixel. It is shown that optics detection can be integrated with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability provided by continuous surveillance of a relatively large field-of-view looks promising. This type of multifunctional system may become an important tool to detect snipers before and after a shot.

  7. On-line, continuous monitoring in solar cell and fuel cell manufacturing using spectral reflectance imaging

    DOEpatents

    Sopori, Bhushan; Rupnowski, Przemyslaw; Ulsh, Michael

    2016-01-12

    A monitoring system 100 comprising a material transport system 104 providing for the transportation of a substantially planar material 102, 107 through the monitoring zone 103 of the monitoring system 100. The system 100 also includes a line camera 106 positioned to obtain multiple line images across a width of the material 102, 107 as it is transported through the monitoring zone 103. The system 100 further includes an illumination source 108 providing for the illumination of the material 102, 107 transported through the monitoring zone 103 such that light reflected in a direction normal to the substantially planar surface of the material 102, 107 is detected by the line camera 106. A data processing system 110 is also provided in digital communication with the line camera 106. The data processing system 110 is configured to receive data output from the line camera 106 and further configured to calculate and provide substantially contemporaneous information relating to a quality parameter of the material 102, 107. Also disclosed are methods of monitoring a quality parameter of a material.

  8. Novel computer-based endoscopic camera

    NASA Astrophysics Data System (ADS)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed, glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000TM digital video processor chip and Adaptive SensitivityTM patented scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions, without losing details `in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed in the camera, or to an external host media via network. The patient data included with every image describes essential information on the patient and procedure. The operator can assign custom data descriptors, and can search for the stored image/data by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor such that the complete field of view of the endoscope can be shown on the whole area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  9. The Visible Imaging System (VIS) for the Polar Spacecraft

    NASA Technical Reports Server (NTRS)

    Frank, L. A.; Sigwarth, J. B.; Craven, J. D.; Cravens, J. P.; Dolan, J. S.; Dvorsky, M. R.; Hardebeck, P. K.; Harvey, J. D.; Muller, D. W.

    1995-01-01

    The Visible Imaging System (VIS) is a set of three low-light-level cameras to be flown on the POLAR spacecraft of the Global Geospace Science (GGS) program, which is an element of the International Solar-Terrestrial Physics (ISTP) campaign. Two of these cameras share primary and some secondary optics and are designed to provide images of the nighttime auroral oval at visible wavelengths. A third camera is used to monitor the directions of the fields-of-view of these sensitive auroral cameras with respect to the sunlit Earth. The auroral emissions of interest include those from N2+ at 391.4 nm, O I at 557.7 and 630.0 nm, H I at 656.3 nm, and O II at 732.0 nm. The two auroral cameras have different spatial resolutions. These resolutions are about 10 and 20 km from a spacecraft altitude of 8 Earth radii. The time to acquire and telemeter a 256 x 256-pixel image is about 12 s. The primary scientific objectives of this imaging instrumentation, together with the in-situ observations from the ensemble of ISTP spacecraft, are (1) quantitative assessment of the dissipation of magnetospheric energy into the auroral ionosphere, (2) an instantaneous reference system for the in-situ measurements, (3) development of a substantial model for energy flow within the magnetosphere, (4) investigation of the topology of the magnetosphere, and (5) delineation of the responses of the magnetosphere to substorms and variable solar wind conditions.

  10. A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera

    NASA Astrophysics Data System (ADS)

    Kroedel, Matthias; Langton, J. Brian; Wahl, Bill

    2017-09-01

    This paper presents the ceramic design, fabrication and metrology results, and assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of the HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" supporting individual raft plates that mount sensor assemblies by way of a rigid kinematic support system, in order to meet extremely stringent requirements for focal plane planarity and stability.

  11. New Optics See More With Less

    NASA Technical Reports Server (NTRS)

    Nabors, Sammy

    2015-01-01

    NASA offers companies an optical system that provides a unique panoramic perspective with a single camera. NASA's Marshall Space Flight Center has developed a technology that combines a panoramic refracting optic (PRO) lens with a unique detection system to acquire a true 360-degree field of view. Although current imaging systems can acquire panoramic images, they must use up to five cameras to obtain the full field of view. MSFC's technology obtains its panoramic images from one vantage point.

  12. A randomized comparison of laparoscopic, flexible endoscopic, and wired and wireless magnetic cameras on ex vivo and in vivo NOTES surgical performance.

    PubMed

    Chang, Victoria C; Tang, Shou-Jiang; Swain, C Paul; Bergs, Richard; Paramo, Juan; Hogg, Deborah C; Fernandez, Raul; Cadeddu, Jeffrey A; Scott, Daniel J

    2013-08-01

    The influence of endoscopic video camera (VC) image quality on surgical performance has not been studied. Flexible endoscopes are used as substitutes for laparoscopes in natural orifice translumenal endoscopic surgery (NOTES), but their optics are originally designed for intralumenal use. Manipulable wired or wireless independent VCs might offer advantages for NOTES but are still under development. To measure the optical characteristics of 4 VC systems and to compare their impact on the performance of surgical suturing tasks. VC systems included a laparoscope (Storz 10 mm), a flexible endoscope (Olympus GIF 160), and 2 prototype deployable cameras (magnetic anchoring and guidance system [MAGS] Camera and PillCam). In a randomized fashion, the 4 systems were evaluated regarding standardized optical characteristics and surgical manipulations of previously validated ex vivo (fundamentals of laparoscopic surgery model) and in vivo (live porcine Nissen model) tasks; objective metrics (time and errors/precision) and combined surgeon (n = 2) performance were recorded. Subtle differences were detected for color tests, and field of view was variable (65°-115°). Suitable resolution was detected up to 10 cm for the laparoscope and MAGS camera but only at closer distances for the endoscope and PillCam. Compared with the laparoscope, surgical suturing performances were modestly lower for the MAGS camera and significantly lower for the endoscope (ex vivo) and PillCam (ex vivo and in vivo). This study documented distinct differences in VC systems that may be used for NOTES in terms of both optical characteristics and surgical performance. Additional work is warranted to optimize cameras for NOTES. Deployable systems may be especially well suited for this purpose.

  13. Development of the Optical Communications Telescope Laboratory: A Laser Communications Relay Demonstration Ground Station

    NASA Technical Reports Server (NTRS)

    Wilson, K. E.; Antsos, D.; Roberts, L. C. Jr.,; Piazzolla, S.; Clare, L. P.; Croonquist, A. P.

    2012-01-01

    The Laser Communications Relay Demonstration (LCRD) project will demonstrate high-bandwidth, bi-directional space-to-ground optical communications links between a geosynchronous satellite and two LCRD optical ground stations located in the southwestern United States. The project plans to operate for two years with a possible extension to five. Objectives of the demonstration include the development of operational strategies to prototype optical link and relay services for the next generation of tracking and data relay satellites. Key technologies to be demonstrated include adaptive optics to correct for clear-air turbulence-induced wavefront aberrations on the downlink, and advanced networking concepts for assured and automated data delivery. Expanded link availability will be demonstrated by supporting operations at small sun-Earth-probe angles. Planned optical modulation formats support future concepts of near-Earth satellite user services, with differential phase shift keying modulation at up to 1.244 Gb/s and pulse position modulation formats for deep-space links at data rates up to 311 Mb/s. Atmospheric monitoring instruments that will characterize the optical channel during the link include a sun photometer to measure atmospheric transmittance, a solar scintillometer, and a cloud camera to measure line-of-sight cloud cover. This paper describes the planned development of the JPL optical ground station.

  14. 3D papillary image capturing by the stereo fundus camera system for clinical diagnosis on retina and optic nerve

    NASA Astrophysics Data System (ADS)

    Motta, Danilo A.; Serillo, André; de Matos, Luciana; Yasuoka, Fatima M. M.; Bagnato, Vanderlei S.; Carvalho, Luis A. V.

    2014-03-01

    Glaucoma is the second leading cause of blindness in the world, and the number of cases tends to increase as the life expectancy of the population rises. Glaucoma is an eye condition that leads to damage of the optic nerve. This nerve carries visual information from the eye to the brain, so damage to it compromises the patient's visual quality. In the majority of cases the damage to the optic nerve is irreversible, and it occurs due to increased intraocular pressure. One of the main challenges for diagnosis is detecting the disease early, because no symptoms are present in the initial stage; when it is detected, it is often already at an advanced stage. Currently the evaluation of the optic disc is made with sophisticated fundus cameras, which are inaccessible to the majority of the Brazilian population. The purpose of this project is to develop a specific fundus camera without fluorescein angiography and red-free systems to accomplish 3D imaging of the optic disc region. The innovation is a new simplified design of a stereo-optical system that enables 3D image capture and, at the same time, quantitative measurements of the excavation and topography of the optic nerve, something traditional fundus cameras do not do. Dedicated hardware and software are developed for this ophthalmic instrument, in order to permit quick capture and printing of high-resolution 3D images and videos of the optic disc region (20° field-of-view) in mydriatic and nonmydriatic modes.

  15. Sensitivity, accuracy, and precision issues in opto-electronic holography based on fiber optics and high-spatial- and high-digital-resolution cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Yokum, Jeffrey S.; Pryputniewicz, Ryszard J.

    2002-06-01

    Sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography based on fiber optics and high-spatial- and high-digital-resolution cameras, are discussed in this paper. It is shown that sensitivity, accuracy, and precision depend on both the effective determination of optical phase and the effective characterization of the illumination-observation conditions. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gages, demonstrating the applicability of quantitative optical metrology techniques to satisfy constantly increasing needs for the study and development of emerging technologies.

  16. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    DOE PAGES

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; ...

    2016-11-28

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  17. Measuring the spatial resolution of an optical system in an undergraduate optics laboratory

    NASA Astrophysics Data System (ADS)

    Leung, Calvin; Donnelly, T. D.

    2017-06-01

    Two methods of quantifying the spatial resolution of a camera are described, performed, and compared, with the objective of designing an imaging-system experiment for students in an undergraduate optics laboratory. With the goal of characterizing the resolution of a typical digital single-lens reflex (DSLR) camera, we motivate, introduce, and show agreement between traditional test-target contrast measurements and the technique of using Fourier analysis to obtain the modulation transfer function (MTF). The advantages and drawbacks of each method are compared. Finally, we explore the rich optical physics at work in the camera system by calculating the MTF as a function of wavelength and f-number. For example, we find that the Canon 40D demonstrates better spatial resolution at short wavelengths, in accordance with scalar diffraction theory, but is not diffraction-limited, being significantly affected by spherical aberration. The experiment and data analysis routines described here can be built and written in an undergraduate optics lab setting.
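
    The Fourier route to the MTF described above can be sketched in a few lines: the MTF is the normalized magnitude of the Fourier transform of the line spread function (LSF). A minimal sketch using synthetic, hypothetical Gaussian data in place of a real measurement:

    ```python
    import numpy as np

    def mtf_from_lsf(lsf, pixel_pitch_mm):
        """MTF as the normalized magnitude of the FFT of the line spread function."""
        lsf = np.asarray(lsf, dtype=np.float64)
        lsf = lsf / lsf.sum()                                # normalize to unit area
        mtf = np.abs(np.fft.rfft(lsf))
        mtf = mtf / mtf[0]                                   # MTF(0) = 1 by definition
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)  # cycles/mm
        return freqs, mtf

    # Synthetic Gaussian LSF (sigma = 2 pixels) on an assumed 6-micron-pitch sensor
    x = np.arange(-32, 32)
    lsf = np.exp(-x**2 / (2 * 2.0**2))
    freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.006)
    ```

    For real data the LSF can be obtained by differentiating an edge spread function measured from a slanted-edge target, which is one standard route to the camera MTF.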

  18. MACS-Himalaya: A photogrammetric aerial oblique camera system designed for highly accurate 3D-reconstruction and monitoring in steep terrain and under extreme illumination conditions

    NASA Astrophysics Data System (ADS)

    Brauchle, Joerg; Berger, Ralf; Hein, Daniel; Bucher, Tilman

    2017-04-01

    The DLR Institute of Optical Sensor Systems has developed the MACS-Himalaya, a custom-built Modular Aerial Camera System specifically designed for the extreme geometric (steep slopes) and radiometric (high contrast) conditions of high mountain areas. It has an overall field of view of 116° across-track, consisting of a nadir and two oblique-looking RGB camera heads and a fourth, nadir-looking near-infrared camera. This design provides the capability to fly along narrow valleys and simultaneously cover ground and steep valley-flank topography with similar ground resolution. To compensate for extreme contrasts between fresh snow and dark shadows at high altitudes, a High Dynamic Range (HDR) mode was implemented, which typically takes a sequence of three images with graded integration times, each covering 12 bits of radiometric depth, resulting in a total dynamic range of 15-16 bits. This enables dense image matching and interpretation for sunlit snow and glaciers as well as for dark shaded rock faces in the same scene. Small and lightweight industrial-grade camera heads are used and operated at a rate of 3.3 frames per second with 3-step HDR, which is sufficient to achieve a longitudinal overlap of approximately 90% for each exposure step at 1,000 m above ground at a velocity of 180 km/h. Direct georeferencing and multitemporal monitoring without the need for ground control points is possible due to the use of a high-end GPS/INS system, a stable calibrated inner geometry of the camera heads and a fully photogrammetric workflow at DLR. In 2014 a survey was performed on the Nepalese side of the Himalayas. The remote sensing system was carried in a wingpod by a Stemme S10 motor glider. Amongst other targets, the Seti Valley, Kali-Gandaki Valley and the Mt. Everest/Khumbu Region were imaged at altitudes up to 9,200 m.
Products such as dense point clouds, DSMs and true orthomosaics with a ground pixel resolution of up to 15 cm were produced in regions and outcrops normally inaccessible to aerial imagery. These data are used in the fields of natural hazards, geomorphology and glaciology (see Thompson et al., CR4.3). In the presentation the camera system is introduced and examples and applications from the Nepal campaign are given.
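
    The graded-integration-time HDR scheme described above amounts to scaling each frame to a common exposure and averaging only unsaturated pixels. A minimal sketch (the merge rule, saturation threshold, and exposure ratios are generic assumptions, not DLR's exact pipeline):

    ```python
    import numpy as np

    def merge_hdr(frames, exposure_times, saturation=4095):
        """Merge a bracketed sequence (e.g. three 12-bit frames) into one radiance
        map by scaling each frame to a common exposure and averaging only
        well-exposed pixels."""
        frames = [np.asarray(f, dtype=np.float64) for f in frames]
        num = np.zeros_like(frames[0])
        den = np.zeros_like(frames[0])
        for img, t in zip(frames, exposure_times):
            valid = img < saturation           # discard clipped pixels
            num += np.where(valid, img / t, 0.0)
            den += valid
        return num / np.maximum(den, 1)        # radiance in counts per unit time

    # Hypothetical 3-step bracket with 16:4:1 integration-time ratios;
    # the first pixel saturates in the longest exposure.
    frames = [np.array([[4095, 400]]), np.array([[3000, 100]]), np.array([[800, 25]])]
    radiance = merge_hdr(frames, [16.0, 4.0, 1.0])
    ```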

  19. Video and thermal imaging system for monitoring interiors of high temperature reaction vessels

    DOEpatents

    Saveliev, Alexei V [Chicago, IL; Zelepouga, Serguei A [Hoffman Estates, IL; Rue, David M [Chicago, IL

    2012-01-10

    A system and method for real-time monitoring of the interior of a combustor or gasifier wherein light emitted by the interior surface of a refractory wall of the combustor or gasifier is collected using an imaging fiber optic bundle having a light receiving end and a light output end. Color information in the light is captured with primary color (RGB) filters or complementary color (GMCY) filters placed over individual pixels of color sensors disposed within a digital color camera in a Bayer mosaic layout, producing RGB signal outputs or GMCY signal outputs. The signal outputs are processed using intensity ratios of the primary color filters or the complementary color filters, producing video images and/or thermal images of the interior of the combustor or gasifier.
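
    Forming temperature from intensity ratios of color-filtered pixels is essentially two-color (ratio) pyrometry: for a greybody, the ratio of two color channels depends on temperature only, so the emissivity cancels. A minimal sketch under the Wien approximation (generic method, not the patent's exact processing; the effective filter wavelengths are assumptions):

    ```python
    import math

    # Wien approximation: I(lam, T) ~ eps * lam**-5 * exp(-C2 / (lam * T)),
    # with C2 the second radiation constant. For a greybody the R/G ratio is
    # independent of emissivity eps.
    C2 = 1.4388e-2  # second radiation constant, m*K

    def ratio_from_temperature(T, lam_r=620e-9, lam_g=540e-9):
        """Forward model: R/G channel intensity ratio of a greybody at T kelvin."""
        k = (lam_g / lam_r) ** 5
        return k * math.exp(C2 * (1 / lam_g - 1 / lam_r) / T)

    def temperature_from_ratio(ratio_rg, lam_r=620e-9, lam_g=540e-9):
        """Invert the Wien-approximation intensity ratio for temperature (K)."""
        k = (lam_g / lam_r) ** 5
        return C2 * (1 / lam_g - 1 / lam_r) / math.log(ratio_rg / k)

    # Round trip at a plausible refractory-wall temperature
    T = temperature_from_ratio(ratio_from_temperature(1600.0))
    ```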

  20. Design of a frequency domain instrument for simultaneous optical tomography and magnetic resonance imaging of small animals

    NASA Astrophysics Data System (ADS)

    Masciotti, James M.; Rahim, Shaheed; Grover, Jarrett; Hielscher, Andreas H.

    2007-02-01

    We present a design for a frequency-domain instrument that allows simultaneous acquisition of magnetic resonance and diffuse optical tomographic imaging data. This small animal imaging system combines the high anatomical resolution of magnetic resonance imaging (MRI) with the high temporal resolution and physiological information provided by diffuse optical tomography (DOT). The DOT hardware comprises laser diodes and an intensified CCD camera, which are modulated up to 1 GHz by radio frequency (RF) signal generators. An optical imaging head is designed to fit inside the 4 cm inner diameter of a 9.4 T MRI system. Graded index fibers are used to transfer light between the optical hardware and the imaging head within the RF coil. Fiducial markers are integrated into the imaging head to allow the determination of the positions of the source and detector fibers on the MR images and to permit co-registration of MR and optical tomographic images. Detector fibers are arranged compactly and focused through a camera lens onto the photocathode of the intensified CCD camera.

  1. A preliminary optical design for the JANUS camera of ESA's space mission JUICE

    NASA Astrophysics Data System (ADS)

    Greggio, D.; Magrin, D.; Ragazzoni, R.; Munari, M.; Cremonese, G.; Bergomi, M.; Dima, M.; Farinato, J.; Marafatto, L.; Viotto, V.; Debei, S.; Della Corte, V.; Palumbo, P.; Hoffmann, H.; Jaumann, R.; Michaelis, H.; Schmitz, N.; Schipani, P.; Lara, L.

    2014-08-01

    JANUS (Jovis, Amorum ac Natorum Undique Scrutator) will be the onboard camera of ESA's JUICE satellite, dedicated to the study of Jupiter and its moons, in particular Ganymede and Europa. This optical channel will provide surface maps with a plate scale of 15 microrad/pixel, with both narrow- and broad-band filters in the spectral range between 0.35 and 1.05 micrometers, over a 1.72° × 1.29° field of view. The current optical design is based on a three-mirror anastigmat (TMA), with an on-axis pupil and off-axis field of view. The optical stop is located at the secondary mirror, providing an effective collecting area of 7854 mm2 (100 mm entrance pupil diameter) and allowing simple internal baffling for first-order straylight rejection. The nominal optical performance is nearly diffraction-limited and assures a nominal MTF better than 63% over the whole field of view. We describe here the optical design of the camera adopted as baseline, together with the trade-offs that led us to this solution.
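
    The quoted first-order parameters are mutually consistent, which can be checked directly (the detector pixel format below is inferred for illustration only, not taken from the paper):

    ```python
    import math

    # A 100 mm entrance pupil gives the quoted effective collecting area.
    pupil_diameter_mm = 100.0
    collecting_area = math.pi * (pupil_diameter_mm / 2) ** 2   # ~7854 mm^2

    # At 15 microrad/pixel, a hypothetical 2000 x 1504 pixel detector spans
    # roughly the stated 1.72 x 1.29 degree field of view.
    plate_scale_rad = 15e-6
    fov_x_deg = math.degrees(2000 * plate_scale_rad)
    fov_y_deg = math.degrees(1504 * plate_scale_rad)
    ```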

  2. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
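
    A common way optoelectronic holography systems recover the optical phase on which their sensitivity depends is temporal phase shifting. A minimal four-step sketch (generic algorithm with synthetic fringes, not necessarily the authors' exact processing):

    ```python
    import numpy as np

    def four_step_phase(i1, i2, i3, i4):
        """Wrapped optical phase from four interferograms with pi/2 phase steps:
        phi = atan2(I4 - I2, I1 - I3)."""
        return np.arctan2(i4 - i2, i1 - i3)

    # Synthetic fringe intensities I_k = a + b*cos(phi + k*pi/2) at a known phase
    phi_true = 0.7
    bias, mod = 100.0, 50.0
    frames = [bias + mod * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
    phi = four_step_phase(*frames)   # recovers phi_true
    ```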

  3. Infrared Cloud Imager Development for Atmospheric Optical Communication Characterization, and Measurements at the JPL Table Mountain Facility

    NASA Astrophysics Data System (ADS)

    Nugent, P. W.; Shaw, J. A.; Piazzolla, S.

    2013-02-01

    The continuous demand for high data return in deep space and near-Earth satellite missions has led NASA and international institutions to consider alternative technologies for high-data-rate communications. One solution is the establishment of wide-bandwidth Earth-space optical communication links, which require (among other things) a nearly obstruction-free atmospheric path. Considering the atmospheric channel, the most common and most apparent impairments on Earth-space optical communication paths arise from clouds. Therefore, the characterization of the statistical behavior of cloud coverage for optical communication ground station candidate sites is of vital importance. In this article, we describe the development and deployment of a ground-based, long-wavelength infrared cloud imaging system able to monitor and characterize the cloud coverage. This system is based on a commercially available camera with a 62-deg diagonal field of view. A novel internal-shutter-based calibration technique allows radiometric calibration of the camera, which operates without a thermoelectric cooler. This cloud imaging system provides continuous day-night cloud detection with constant sensitivity. The cloud imaging system also includes data-processing algorithms that calculate and remove atmospheric emission to isolate cloud signatures, and enable classification of clouds according to their optical attenuation. Measurements of long-wavelength infrared cloud radiance are used to retrieve the optical attenuation (cloud optical depth due to absorption and scattering) in the wavelength range of interest from visible to near-infrared, where the cloud attenuation is quite constant. This article addresses the specifics of the operation, calibration, and data processing of the imaging system that was deployed at the NASA/JPL Table Mountain Facility (TMF) in California. Data are reported from July 2008 to July 2010. 
These data describe seasonal variability in cloud cover at the TMF site, with cloud amount (percentage of cloudy pixels) peaking at just over 51 percent during February, of which more than 60 percent had optical attenuation exceeding 12 dB at wavelengths in the range from the visible to the near-infrared. The lowest cloud amount was found during August, averaging 19.6 percent, and these clouds were mostly optically thin, with low attenuation.
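
    The reported cloud amount is the fraction of sky pixels classified as cloudy; given a per-pixel optical-attenuation map from the calibrated radiance, it reduces to a thresholded mean (the threshold value below is an illustrative assumption, not the instrument's actual classifier):

    ```python
    import numpy as np

    def cloud_amount(attenuation_db, threshold_db=0.5):
        """Fraction of sky pixels classified as cloudy, given a per-pixel
        optical-attenuation map (dB) retrieved from calibrated IR radiance."""
        cloudy = np.asarray(attenuation_db) > threshold_db
        return cloudy.mean()

    # Hypothetical 2x2 attenuation map (dB): two clear pixels, two cloudy ones
    atten = np.array([[0.1, 3.0], [15.0, 0.2]])
    frac = cloud_amount(atten)   # 0.5, i.e. 50% cloud cover
    ```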

  4. Super-resolution in a defocused plenoptic camera: a wave-optics-based approach.

    PubMed

    Sahin, Erdem; Katkovnik, Vladimir; Gotchev, Atanas

    2016-03-01

    Plenoptic cameras enable the capture of a light field with a single device. However, with traditional light field rendering procedures, they can provide only low-resolution two-dimensional images. Super-resolution is considered to overcome this drawback. In this study, we present a super-resolution method for the defocused plenoptic camera (Plenoptic 1.0), where the imaging system is modeled using wave optics principles and utilizing low-resolution depth information of the scene. We are particularly interested in super-resolution of in-focus and near in-focus scene regions, which constitute the most challenging cases. The simulation results show that the employed wave-optics model makes super-resolution possible for such regions as long as sufficiently accurate depth information is available.

  5. Virtual-stereo fringe reflection technique for specular free-form surface testing

    NASA Astrophysics Data System (ADS)

    Ma, Suodong; Li, Bo

    2016-11-01

    Due to their excellent ability to improve the performance of optical systems, free-form optics have attracted extensive interest in many fields, e.g. the optical design of astronomical telescopes, laser beam expanders, spectral imagers, etc. However, compared with traditional simple surfaces, testing such optics is usually more complex and difficult, which has long been a major barrier to their manufacture and application. Fortunately, owing to the rapid development of electronic devices and computer vision technology, the fringe reflection technique (FRT), with the advantages of a simple system structure, high measurement accuracy and large dynamic range, is becoming a powerful tool for specular free-form surface testing. In order to obtain absolute surface shape distributions of test objects, two or more cameras are often required in the conventional FRT, which makes the system structure more complex and the measurement cost much higher. Furthermore, high-precision synchronization between the cameras is also a troublesome issue. To overcome the aforementioned drawbacks, a virtual-stereo FRT for specular free-form surface testing is put forward in this paper. It is able to achieve absolute profiles with the help of only a single biprism and one camera, while avoiding the problems of stereo FRT based on binocular or multi-ocular cameras. Preliminary experimental results demonstrate the feasibility of the proposed technique.

  6. Programmable 10 MHz optical fiducial system for hydrodiagnostic cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huen, T.

    1987-07-01

    A solid state light control system was designed and fabricated for use with hydrodiagnostic streak cameras of the electro-optic type. With its use, the film containing the streak images will have on it two time scales simultaneously exposed with the signal. This allows timing and cross timing. The latter is achieved with exposure modulation marking onto the time tick marks. The purpose of using two time scales will be discussed. The design is based on a microcomputer, resulting in a compact and easy to use instrument. The light source is a small red light emitting diode. Time marking can be programmed in steps of 0.1 microseconds, with a range of 255 steps. The time accuracy is based on a precision 100 MHz quartz crystal, giving a divided down 10 MHz system frequency. The light is guided by two small 100 micron diameter optical fibers, which facilitates light coupling onto the input slit of an electro-optic streak camera. Three distinct groups of exposure modulation of the time tick marks can be independently set anywhere onto the streak duration. This system has been successfully used in Fabry-Perot laser velocimeters for over four years in our Laboratory. The microcomputer control section is also being used in providing optical fids to mechanical rotor cameras.

  7. Acoustic Emission Beamforming for Detection and Localization of Damage

    NASA Astrophysics Data System (ADS)

    Rivey, Joshua Callen

    The aerospace industry is a constantly evolving field with corporate manufacturers continually utilizing innovative processes and materials. These materials include advanced metallics and composite systems. The exploration and implementation of new materials and structures has prompted the development of numerous structural health monitoring and nondestructive evaluation techniques for quality assurance purposes and pre- and in-service damage detection. Exploitation of acoustic emission sensors coupled with a beamforming technique provides the potential for creating an effective non-contact and non-invasive monitoring capability for assessing structural integrity. This investigation used an acoustic emission detection device that employs helical arrays of MEMS-based microphones around a high-definition optical camera to provide real-time non-contact monitoring of inspection specimens during testing. The study assessed the feasibility of the sound camera for use in structural health monitoring of composite specimens during tensile testing for detecting onset of damage in addition to nondestructive evaluation of aluminum inspection plates for visualizing stress wave propagation in structures. During composite material monitoring, the sound camera was able to accurately identify the onset and location of damage resulting from large amplitude acoustic feedback mechanisms such as fiber breakage. Damage resulting from smaller acoustic feedback events such as matrix failure was detected but not localized to the degree of accuracy of larger feedback events. Findings suggest that beamforming technology can provide effective non-contact and non-invasive inspection of composite materials, characterizing the onset and the location of damage in an efficient manner. With regards to the nondestructive evaluation of metallic plates, this remote sensing system allows us to record wave propagation events in situ via a single-shot measurement. 
This is a significant improvement over the conventional wave-propagation tracking technique based on laser Doppler vibrometry, which requires synchronization of data acquired from numerous excitations and measurements. The proposed technique can be used to characterize and localize damage by detecting the scattering, attenuation, and reflections of stress waves resulting from damage and defects. These studies lend credence to the potential development of new SHM/NDE techniques based on acoustic emission beamforming for characterizing a wide spectrum of damage modes in next-generation materials and structures without the need for mounted contact sensors.

  8. Seasonal variations of leaf and canopy properties tracked by ground-based NDVI imagery in a temperate forest.

    PubMed

    Yang, Hualei; Yang, Xi; Heskel, Mary; Sun, Shucun; Tang, Jianwu

    2017-04-28

    Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the infrared camera based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). We found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
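
    The camera-NDVI used above follows the standard band-ratio definition, NDVI = (NIR - Red)/(NIR + Red), computed per pixel from the camera's NIR and visible channels. A minimal sketch (the reflectance values are hypothetical):

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index per pixel:
        NDVI = (NIR - Red) / (NIR + Red)."""
        nir = np.asarray(nir, dtype=np.float64)
        red = np.asarray(red, dtype=np.float64)
        return (nir - red) / (nir + red + 1e-12)  # guard against division by zero

    # Hypothetical reflectances: dense canopy is bright in NIR and dark in red,
    # bare soil is similar in both, so canopy NDVI is much higher.
    canopy = ndvi(0.50, 0.05)   # ~0.82
    soil = ndvi(0.30, 0.25)     # ~0.09
    ```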

  9. A Low-Cost and Portable Dual-Channel Fiber Optic Surface Plasmon Resonance System.

    PubMed

    Liu, Qiang; Liu, Yun; Chen, Shimeng; Wang, Fang; Peng, Wei

    2017-12-04

    A miniaturized, integrated dual-channel fiber optic surface plasmon resonance (SPR) system was proposed and demonstrated in this paper. We used a yellow light-emitting diode (LED, peak wavelength 595 nm) and a built-in web camera as the light source and detector, respectively. In addition to the detection channel, one of the sensors was used as a reference channel to compensate for nonspecific binding and physical absorption. We packaged the LED and SPR sensors together, and the assembly is flexible enough to be applied to mobile devices as a compact and portable system. Experimental results show that the normalized intensity shift and the refractive index (RI) of the sample have a good linear relationship in the RI range from 1.328 to 1.348. We used this sensor to monitor the reversible, specific interaction between the lectin concanavalin A (Con A) and the glycoprotein ribonuclease B (RNase B), demonstrating its capability for specific identification and for detecting the concentration of biochemical samples. This sensor system has potential applications in various fields, such as medical diagnosis, public health, food safety, and environmental monitoring.

  10. A USB 2.0 computer interface for the UCO/Lick CCD cameras

    NASA Astrophysics Data System (ADS)

    Wei, Mingzhi; Stover, Richard J.

    2004-09-01

    The new UCO/Lick Observatory CCD camera uses a 200 MHz fiber optic cable to transmit image data and an RS232 serial line for low speed bidirectional command and control. Increasingly RS232 is a legacy interface supported on fewer computers. The fiber optic cable requires either a custom interface board that is plugged into the mainboard of the image acquisition computer to accept the fiber directly or an interface converter that translates the fiber data onto a widely used standard interface. We present here a simple USB 2.0 interface for the UCO/Lick camera. A single USB cable connects to the image acquisition computer and the camera's RS232 serial and fiber optic cables plug into the USB interface. Since most computers now support USB 2.0 the Lick interface makes it possible to use the camera on essentially any modern computer that has the supporting software. No hardware modifications or additions to the computer are needed. The necessary device driver software has been written for the Linux operating system which is now widely used at Lick Observatory. The complete data acquisition software for the Lick CCD camera is running on a variety of PC style computers as well as an HP laptop.

  11. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

    The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised around curved CCD applications, in conjunction with large format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. 
Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than their flat-CCD counterparts, lightening the cameras and enabling faster scanning and 3D integration of objects moving within a planetary terrain environment. Preliminary experiments conducted at the Sarnoff Corporation indicate the feasibility of curved CCD imagers with acceptable electro-optic integrity. Currently, we are in the process of evaluating the electro-optic performance of a curved wafer-scale CCD imager. Detailed ray-trace modeling and experimental electro-optical performance data obtained from the curved imager will be presented at the conference.

  12. Versatile microsecond movie camera

    NASA Astrophysics Data System (ADS)

    Dreyfus, R. W.

    1980-03-01

    A laboratory-type movie camera is described which satisfies many requirements in the range 1 microsec to 1 sec. The camera consists of a He-Ne laser and compatible state-of-the-art components; the primary components are an acoustooptic modulator, an electromechanical beam deflector, and a video tape system. The present camera is distinct in its operation in that submicrosecond laser flashes freeze the image motion while still allowing the simplicity of electromechanical image deflection in the millisecond range. The gating and pulse delay circuits of an oscilloscope synchronize the modulator and scanner relative to the subject being photographed. The optical table construction and electronic control enhance the camera's versatility and adaptability. The instant replay video tape recording allows for easy synchronization and immediate viewing of the results. Economy is achieved by using off-the-shelf components, optical table construction, and short assembly time.

  13. High Energy Replicated Optics to Explore the Sun Balloon-Borne Telescope: Astrophysical Pointing

    NASA Technical Reports Server (NTRS)

    Gaskin, Jessica; Wilson-Hodge, Colleen; Ramsey, Brian; Apple, Jeff; Kurt, Dietz; Tennant, Allyn; Swartz, Douglas; Christe, Steven D.; Shih, Albert

    2014-01-01

    On September 21, 2013, the High Energy Replicated Optics to Explore the Sun, or HEROES, balloon-borne x-ray telescope launched from the Columbia Scientific Balloon Facility's site in Ft. Sumner, NM. The flight lasted for approximately 27 hours and the observational targets included the Sun and astrophysical sources GRS 1915+105 and the Crab Nebula. Over the past year, the HEROES team upgraded the existing High Energy Replicated Optics (HERO) balloon-borne telescope to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES Project is a multi-NASA Center effort with team members at both Marshall Space Flight Center (MSFC) and Goddard Space Flight Center (GSFC), and is led by Co-PIs (one at each Center). The HEROES payload consists of the hard X-ray telescope HERO, developed at MSFC, combined with several new systems. To allow the HEROES telescope to make observations of the Sun, a new solar aspect system was added to supplement the existing star camera for fine pointing during both the day and night. A mechanical shutter was added to the star camera to protect it during solar observations and two alignment monitoring systems were added for improved pointing and post-flight data reconstruction. This mission was funded by the NASA HOPE (Hands-On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.

  14. Sensing the gas metal arc welding process

    NASA Technical Reports Server (NTRS)

    Carlson, N. M.; Johnson, J. A.; Smartt, H. B.; Watkins, A. D.; Larsen, E. D.; Taylor, P. L.; Waddoups, M. A.

    1994-01-01

    Control of gas metal arc welding (GMAW) requires real-time sensing of the process. Three sensing techniques for GMAW are being developed at the Idaho National Engineering Laboratory (INEL). These are (1) noncontacting ultrasonic sensing using a laser/EMAT (electromagnetic acoustic transducer) to detect defects in the solidified weld on a pass-by-pass basis, (2) integrated optical sensing using a CCD camera and a laser stripe to obtain cooling rate and weld bead geometry information, and (3) monitoring fluctuations in digitized welding voltage data to detect the mode of metal droplet transfer and assure that the desired mass input is achieved.

  15. Sensing the gas metal arc welding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, N.M.; Johnson, J.A.; Smartt, H.B.

    1992-01-01

    Control of gas metal arc welding (GMAW) requires real-time sensing of the process. Three sensing techniques for GMAW are being developed at the Idaho National Engineering Laboratory (INEL). These are (1) noncontacting ultrasonic sensing using a laser/EMAT (electromagnetic acoustic transducer) to detect defects in the solidified weld on a pass-by-pass basis, (2) integrated optical sensing using a CCD camera and a laser stripe to obtain cooling rate and weld bead geometry information, and (3) monitoring fluctuations in digitized welding voltage data to detect the mode of metal droplet transfer and assure that the desired mass input is achieved.

  16. Sensing the gas metal arc welding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, N.M.; Johnson, J.A.; Smartt, H.B.

    1992-10-01

    Control of gas metal arc welding (GMAW) requires real-time sensing of the process. Three sensing techniques for GMAW are being developed at the Idaho National Engineering Laboratory (INEL). These are (1) noncontacting ultrasonic sensing using a laser/EMAT (electromagnetic acoustic transducer) to detect defects in the solidified weld on a pass-by-pass basis, (2) integrated optical sensing using a CCD camera and a laser stripe to obtain cooling rate and weld bead geometry information, and (3) monitoring fluctuations in digitized welding voltage data to detect the mode of metal droplet transfer and assure that the desired mass input is achieved.

  17. Evaluation of multispectral plenoptic camera

    NASA Astrophysics Data System (ADS)

    Meng, Lingfei; Sun, Ting; Kosoglow, Rich; Berkner, Kathrin

    2013-01-01

    Plenoptic cameras enable capture of a 4D lightfield, allowing digital refocusing and depth estimation from data captured with a compact portable camera. Whereas most work on plenoptic camera design has been based on a simplistic geometric-optics characterization of the optical path only, little work has been done on optimizing end-to-end system performance for a specific application. Such design optimization requires design tools that include careful parameterization of the main lens elements, as well as microlens array and sensor characteristics. In this paper we are interested in evaluating the performance of a multispectral plenoptic camera, i.e. a camera with spectral filters inserted into the aperture plane of the main lens. Such a camera enables single-snapshot spectral data acquisition [1-3]. We first describe in detail an end-to-end imaging system model for a spectrally coded plenoptic camera that we briefly introduced in [4]. Different performance metrics are defined to evaluate the spectral reconstruction quality. We then present a prototype developed from a modified DSLR camera containing a lenslet array on the sensor and a filter array in the main lens. Finally, we evaluate the spectral reconstruction performance of the spectral plenoptic camera based on both simulation and measurements obtained from the prototype.

  18. Space telescope phase B definition study. Volume 2A: Science instruments, f48/96 planetary camera

    NASA Technical Reports Server (NTRS)

    Grosso, R. P.; Mccarthy, D. J.

    1976-01-01

    The analysis and preliminary design of the f48/96 planetary camera for the space telescope are discussed. The camera design is for application to the axial module position of the optical telescope assembly.

  19. Recent Status of SIM Lite Astrometric Observatory Mission: Flight Engineering Risk Reduction Activities

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Dekens, Frank; Nemati, Bijan; An, Xin; Carson, Johnathan

    2010-01-01

    The SIM Lite Astrometric Observatory is a mission concept for a space-borne instrument to perform micro-arc-second narrow-angle astrometry to search 60 to 100 nearby stars for Earth-like planets, and to perform global astrometry for a broad astrophysics program. The instrument consists of two Michelson stellar interferometers and a telescope. The first interferometer chops between the target star and a set of reference stars. The second interferometer monitors the attitude of the instrument in the direction of the target star. The telescope monitors the attitude of the instrument in the other two directions. The main enabling technology development for the mission was completed during phases A & B. The project is currently implementing the developed technology onto flight-ready engineering models. These key engineering tasks will significantly reduce the implementation risks during the flight phases C & D of the mission. The main optical interferometer components, including the astrometric beam combiner, the fine steering optical mechanism, the path-length-control and modulation optical mechanisms, focal-plane camera electronics and cooling heat pipe, are currently under development. Main assemblies are built to meet flight requirements and will be subjected to flight qualification level environmental testing (random vibration and thermal cycling) and performance testing. This paper summarizes recent progress in engineering risk reduction activities.

  20. 10-kW-class YAG laser application for heavy components

    NASA Astrophysics Data System (ADS)

    Ishide, Takashi; Tsubota, S.; Nayama, Michisuke; Shimokusu, Yoshiaki; Nagashima, Tadashi; Okimura, K.

    2000-02-01

    The authors have put kW-class YAG lasers to practical use for repair welding of nuclear power plant steam generator heat exchanger tubes, all-position welding of piping, and other applications. This paper describes the following methods and systems developed for high-power YAG laser processing. First, we apply 6 kW to 10 kW YAG lasers to welding and cutting of heavy components. The beam guides are optical fibers whose core diameter is 0.6 mm to 0.8 mm, with a standard length of 200 m. Using these systems, we obtain single-pass penetration of 15 mm to 20 mm, and multi-pass welding for thicker plates. Data on cutting 100-mm-thick plate, for dismantling nuclear power plants, are also described. In these systems we carried out in-process monitoring using CCD camera image processing and a monitoring fiber placed coaxially with the YAG optical lens system; with the monitoring fiber, we measured the light intensity from the welding area. Further, we have developed a new hybrid welding method with a TIG electrode at the center of the lens for high power. The TIG-YAG hybrid welding aims at reducing welding groove allowances and achieving high-quality welds. Through these techniques we have applied a 7 kW class YAG laser to welding of nuclear power plant components.

  1. Feasibility study of a gamma camera for monitoring nuclear materials in the PRIDE facility

    NASA Astrophysics Data System (ADS)

    Jo, Woo Jin; Kim, Hyun-Il; An, Su Jung; Lee, Chae Young; Song, Han-Kyeol; Chung, Yong Hyun; Shin, Hee-Sung; Ahn, Seong-Kyu; Park, Se-Hwan

    2014-05-01

    The Korea Atomic Energy Research Institute (KAERI) has been developing pyroprocessing technology, in which actinides are recovered together with plutonium. There is no pure plutonium stream in the process, so it has the advantage of proliferation resistance. Tracking and monitoring of nuclear materials through the pyroprocess can significantly improve the transparency of the operation and safeguards. An inactive engineering-scale integrated pyroprocess facility, the PyRoprocess Integrated inactive DEmonstration (PRIDE) facility, was constructed to demonstrate engineering-scale processes and the integration of each unit process. The PRIDE facility may be a good test bed to investigate the feasibility of a nuclear material monitoring system. In this study, we designed a gamma camera system for nuclear material monitoring in the PRIDE facility by using a Monte Carlo simulation, and we validated the feasibility of this system. Two scenarios, according to the locations of the gamma camera, were simulated using GATE (GEANT4 Application for Tomographic Emission) version 6. A prototype gamma camera with a diverging-slat collimator was developed, and the simulated and experimental results agreed well with each other. These results indicate that a gamma camera to monitor nuclear material in the PRIDE facility can be developed.

  2. Cost effective system for monitoring of fish migration with a camera

    NASA Astrophysics Data System (ADS)

    Sečnik, Matej; Brilly, Mitja; Vidmar, Andrej

    2016-04-01

    Within the European LIFE project Ljubljanica connects (LIFE10 NAT/SI/000142), we have developed a cost-effective solution for monitoring fish migration through fish passes with an underwater camera. In the fish pass at Ambrožev trg and in the fish pass near the Fužine castle we installed a video camera called "Fishcam" to monitor the migration of fish through the fish passes and the success of their reconstruction. A live stream from the fishcams installed in the fish passes is available on our project website (http://ksh.fgg.uni-lj.si/ljubljanicaconnects/ang/12_camera). The fish monitoring system consists of two parts: a waterproof box for the computer and charger, and the camera itself. We used a highly sensitive Sony analogue camera; its advantage is very good sensitivity in low-light conditions, so it takes good-quality pictures even at night with minimal additional lighting. For night recording we use an additional IR reflector to illuminate passing fish. The camera is connected to an 8-inch tablet PC, chosen because it is small, cheap, relatively fast, and has low power consumption. On the computer we run software with advanced motion-detection capabilities, so we can also detect small fish. When a fish is detected by the software, its photograph is automatically saved to the local hard drive and, for backup, to Google Drive. The system for monitoring fish migration has turned out to work very well: from the start of monitoring in June 2015 to the end of the year, more than 100,000 photographs were produced. A first analysis of them has already been prepared, estimating the fish species passing the fish pass and their frequency.
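
    The motion-detection step that triggers saving a photograph can be sketched with simple frame differencing. The function, thresholds, and pure-Python frame representation below are illustrative assumptions, not the actual software running on the tablet PC:

```python
def detect_motion(prev_frame, frame, pixel_threshold=25, min_changed=50):
    """Frame-differencing motion detector (illustrative sketch).

    prev_frame and frame are 2-D lists of 8-bit grayscale values.
    Returns True when enough pixels changed between consecutive frames
    to suggest a passing fish, at which point the monitoring software
    would save the photograph.
    """
    changed = sum(
        1
        for row_a, row_b in zip(prev_frame, frame)
        for a, b in zip(row_a, row_b)
        if abs(a - b) > pixel_threshold
    )
    return changed >= min_changed
```

    Tuning `pixel_threshold` controls sensitivity to noise (e.g. drifting sediment), while `min_changed` sets the minimum object size, which is how small fish can still be detected without triggering on single-pixel noise.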

  3. SpUpNIC (Spectrograph Upgrade: Newly Improved Cassegrain) on the South African Astronomical Observatory's 74-inch telescope

    NASA Astrophysics Data System (ADS)

    Crause, Lisa A.; Carter, Dave; Daniels, Alroy; Evans, Geoff; Fourie, Piet; Gilbank, David; Hendricks, Malcolm; Koorts, Willie; Lategan, Deon; Loubser, Egan; Mouries, Sharon; O'Connor, James E.; O'Donoghue, Darragh E.; Potter, Stephen; Sass, Craig; Sickafoose, Amanda A.; Stoffels, John; Swanevelder, Pieter; Titus, Keegan; van Gend, Carel; Visser, Martin; Worters, Hannah L.

    2016-08-01

    SpUpNIC (Spectrograph Upgrade: Newly Improved Cassegrain) is the extensively upgraded Cassegrain Spectrograph on the South African Astronomical Observatory's 74-inch (1.9-m) telescope. The inverse-Cassegrain collimator mirrors and woefully inefficient Maksutov-Cassegrain camera optics have been replaced, along with the CCD and SDSU controller. All moving mechanisms are now governed by a programmable logic controller, allowing remote configuration of the instrument via an intuitive new graphical user interface. The new collimator produces a larger beam to match the optically faster Folded-Schmidt camera design, and nine surface-relief diffraction gratings offer various wavelength ranges and resolutions across the optical domain. The new camera optics (a fused silica Schmidt plate, a slotted fold flat and a spherically figured primary mirror, both Zerodur, and a fused silica field-flattener lens forming the cryostat window) reduce the camera's central obscuration to increase the instrument throughput. The physically larger and more sensitive CCD extends the available wavelength range; weak arc lines are now detectable down to 325 nm and the red end extends beyond one micron. A rear-of-slit viewing camera has streamlined the observing process by enabling accurate target placement on the slit and facilitating telescope focus optimisation. An interactive quick-look data reduction tool further enhances the user-friendliness of SpUpNIC.

  4. Focus adjustment method for CBERS 3 and 4 satellites Mux camera to be performed in air condition and its experimental verification for best performance in orbital vacuum condition

    NASA Astrophysics Data System (ADS)

    Scaduto, Lucimara C. N.; Malavolta, Alexandre T.; Modugno, Rodrigo G.; Vales, Luiz F.; Carvalho, Erica G.; Evangelista, Sérgio; Stefani, Mario A.; de Castro Neto, Jarbas C.

    2017-11-01

    The first Brazilian remote sensing multispectral camera (MUX) is currently under development at Opto Eletronica S.A. It consists of a four-spectral-band sensor covering the 450 nm to 890 nm wavelength range. This camera will provide images with a 20 m ground resolution at nadir. The MUX camera is part of the payload of the upcoming Sino-Brazilian satellites CBERS 3&4 (China-Brazil Earth Resource Satellite). The preliminary alignment between the optical system and the CCD sensor, located at the focal plane assembly, was obtained in air, in a clean-room environment. A collimator was used for the performance evaluation of the camera. The preliminary performance evaluation of the optical channel was registered by compensating the collimator focus position for changes in the test environment, since an air-to-vacuum transition defocuses this camera. It is therefore necessary to confirm that the alignment of the camera ensures that its best performance is reached in the orbital vacuum condition. For this reason, and as a further step in the development process, the MUX camera Qualification Model was tested and evaluated inside a thermo-vacuum chamber and submitted to an as-orbit vacuum environment. In this study, the influence of temperature fields was neglected. This paper reports on the performance evaluation and discusses the results for this camera when operating under the mentioned test conditions. The overall optical tests and results show that the "in air" adjustment method was suitable, as a critical activity, to guarantee that the equipment meets its design requirements.

  5. Thin and thick cloud top height retrieval algorithm with the Infrared Camera and LIDAR of the JEM-EUSO Space Mission

    NASA Astrophysics Data System (ADS)

    Sáez-Cano, G.; Morales de los Ríos, J. A.; del Peral, L.; Neronov, A.; Wada, S.; Rodríguez Frías, M. D.

    2015-03-01

    The origin of cosmic rays has remained a mystery for more than a century. JEM-EUSO is a pioneering space-based telescope that will be located at the International Space Station (ISS); its aim is to detect Ultra High Energy Cosmic Rays (UHECR) and Extremely High Energy Cosmic Rays (EHECR) by observing the atmosphere. Unlike ground-based telescopes, JEM-EUSO will observe from above, and therefore, for proper UHECR reconstruction under cloudy conditions, a key element of JEM-EUSO is an Atmospheric Monitoring System (AMS). This AMS consists of a space-qualified bi-spectral Infrared Camera, which will provide the cloud coverage and cloud top height in the JEM-EUSO Field of View (FoV), and a LIDAR, which will measure the atmospheric optical depth in the direction it has been shot. In this paper we explain the effects of clouds on the determination of the UHECR arrival direction. Moreover, since cloud top height retrieval is crucial for analyzing UHECR and EHECR events under cloudy conditions, the retrieval algorithm that fulfills the technical requirements of the Infrared Camera of JEM-EUSO to reconstruct the cloud top height is reported here.

  6. A surgical navigation system for non-contact diffuse optical tomography and intraoperative cone-beam CT

    NASA Astrophysics Data System (ADS)

    Daly, Michael J.; Muhanna, Nidal; Chan, Harley; Wilson, Brian C.; Irish, Jonathan C.; Jaffray, David A.

    2014-02-01

    A freehand, non-contact diffuse optical tomography (DOT) system has been developed for multimodal imaging with intraoperative cone-beam CT (CBCT) during minimally-invasive cancer surgery. The DOT system is configured for near-infrared fluorescence imaging with indocyanine green (ICG) using a collimated 780 nm laser diode and a near-infrared CCD camera (PCO Pixelfly USB). Depending on the intended surgical application, the camera is coupled to either a rigid 10 mm diameter endoscope (Karl Storz) or a 25 mm focal length lens (Edmund Optics). A prototype flat-panel CBCT C-Arm (Siemens Healthcare) acquires low-dose 3D images with sub-mm spatial resolution. A 3D mesh is extracted from CBCT for finite-element DOT implementation in NIRFAST (Dartmouth College), with the capability for soft/hard imaging priors (e.g., segmented lymph nodes). A stereoscopic optical camera (NDI Polaris) provides real-time 6D localization of reflective spheres mounted to the laser and camera. Camera calibration combined with tracking data is used to estimate intrinsic (focal length, principal point, non-linear distortion) and extrinsic (translation, rotation) lens parameters. Source/detector boundary data is computed from the tracked laser/camera positions using radiometry models. Target registration errors (TRE) between real and projected boundary points are ~1-2 mm for typical acquisition geometries. Pre-clinical studies using tissue phantoms are presented to characterize 3D imaging performance. This translational research system is under investigation for clinical applications in head-and-neck surgery including oral cavity tumour resection, lymph node mapping, and free-flap perforator assessment.

  7. Recent technology and usage of plastic lenses in image taking objectives

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Susumu; Sato, Hiroshi; Mori, Nobuyoshi; Kiriki, Toshihiko

    2005-09-01

    Recently, plastic lenses produced by injection molding have become widely used in image taking objectives for digital cameras, camcorders, and mobile phone cameras, because of their suitability for volume production and the ease of obtaining the advantages of aspherical surfaces. For digital camera and camcorder objectives, it is desirable that there be no image point variation with temperature change despite employing several plastic lenses. At the same time, due to the shrinking pixel size of solid-state image sensors, there is now a requirement to assemble lenses with high accuracy. To satisfy these requirements, we have developed a 16× compact zoom objective for camcorders and 3×-class folded zoom objectives for digital cameras, incorporating a cemented plastic doublet consisting of a positive lens and a negative lens. Over the last few years, production volumes of camera-equipped mobile phones have increased substantially; for mobile phone cameras, the consideration of productivity is therefore more important than ever. For this application, we have developed a 1.3-megapixel compact camera module with macro function, utilizing the advantage that a plastic lens can be given a mechanically functional shape on its outer flange. Its objective consists of three plastic lenses, and all critical dimensions related to optical performance are determined by highly precise optical elements. This camera module is therefore manufactured without optical adjustment on an automatic assembly line, and achieves both high productivity and high performance. Reported here are the constructions and technical topics of the image taking objectives described above.

  8. Noninvasive imaging of cationic lipid-mediated delivery of optical and PET reporter genes in living mice.

    PubMed

    Iyer, Meera; Berenji, Manijeh; Templeton, Nancy S; Gambhir, Sanjiv S

    2002-10-01

    Gene therapy involves the safe and effective delivery of one or more genes of interest to target cells in vivo. The advantages of using nonviral delivery systems include ease of preparation, low toxicity, and weak immunogenicity. Nonviral delivery methods, when combined with a noninvasive, clinically applicable imaging assay, will greatly aid in the optimization of gene therapy approaches for cancer. We demonstrate cationic lipid-mediated noninvasive monitoring of reporter gene expression of firefly (Photinus pyralis) luciferase (fl) and a mutant herpes simplex virus type I thymidine kinase (HSV1-sr39tk, tk) in living mice using a cooled charge coupled device (CCD) camera and positron emission tomography (PET), respectively. We observe a high level of fl and tk reporter gene expression predominantly in the lungs after a single injection of the extruded DOTAP:cholesterol DNA liposome complexes by way of the tail vein, seen to be time- and dose-dependent. We observe a good correlation between the in vivo bioluminescent signal and the ex vivo firefly luciferase enzyme (FL) activity in different organs. We further demonstrate the feasibility of noninvasively imaging both optical and PET reporter gene expression in the same animal using the CCD camera and microPET, respectively.

  9. Measuring the retina optical properties using a structured illumination imaging system

    NASA Astrophysics Data System (ADS)

    Basiri, A.; Nguyen, T. A.; Ibrahim, M.; Nguyen, Q. D.; Ramella-Roman, Jessica C.

    2011-03-01

    Patients with diabetic retinopathy (DR) may experience a reduction in retinal oxygen saturation (SO2). Close monitoring with a fundus ophthalmoscope can help in predicting the progression of the disease. In this paper we present a noninvasive instrument based on structured illumination aimed at measuring the retina's optical properties, including oxygen saturation. The instrument uses two wavelengths, one in the NIR and one in the visible, a fast acquisition camera, and a splitter system that allows contemporaneous collection of images at the two wavelengths. This scheme greatly reduces eye-movement artifacts. Structured illumination was achieved in two different ways. First, several binary illumination masks fabricated using laser micro-machining were used; a near-sinusoidal projection pattern is ultimately achieved at the image plane by appropriate positioning of the binary masks. Second, a sinusoidal pattern printed on a thin plastic sheet was positioned at the image plane of a fundus ophthalmoscope. The system was calibrated using optical phantoms of known optical properties as well as an eye phantom that included a 150 μm capillary vessel containing different concentrations of oxygenated and deoxygenated hemoglobin.
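
    Sinusoidal structured-illumination images are conventionally processed by three-phase demodulation; the sketch below shows that general technique (projections shifted by 0°, 120°, and 240°), under the assumption that the instrument uses this standard scheme, which the abstract does not confirm:

```python
import math

def demodulate_three_phase(i1, i2, i3):
    """Per-pixel demodulation for sinusoidal structured illumination.

    Given three intensities of the same pixel under a sinusoidal
    pattern shifted by 0, 120 and 240 degrees, recover the DC (planar)
    and AC (modulation) amplitudes, from which tissue optical
    properties such as absorption and scattering can be fitted.
    """
    dc = (i1 + i2 + i3) / 3.0
    ac = (math.sqrt(2.0) / 3.0) * math.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2
    )
    return dc, ac
```

    For a pixel seeing I_k = A + B·cos(φ + 2πk/3), the formula returns exactly (A, B) regardless of the unknown phase φ, which is what makes the three-phase scheme robust to pattern alignment.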

  10. Improved optical flow velocity analysis in SO2 camera images of volcanic plumes - implications for emission-rate retrievals investigated at Mt Etna, Italy and Guallatiri, Chile

    NASA Astrophysics Data System (ADS)

    Gliß, Jonas; Stebel, Kerstin; Kylling, Arve; Sudbø, Aasmund

    2018-02-01

    Accurate gas velocity measurements in emission plumes are highly desirable for various atmospheric remote sensing applications. The imaging technique of UV SO2 cameras is commonly used to monitor SO2 emissions from volcanoes and anthropogenic sources (e.g. power plants, ships). The camera systems capture the emission plumes at high spatial and temporal resolution. This allows the gas velocities in the plume to be retrieved directly from the images. The latter can be measured at a pixel level using optical flow (OF) algorithms. This is particularly advantageous under turbulent plume conditions. However, OF algorithms intrinsically rely on contrast in the images and often fail to detect motion in low-contrast image areas. We present a new method to identify ill-constrained OF motion vectors and replace them using the local average velocity vector. The latter is derived based on histograms of the retrieved OF motion fields. The new method is applied to two example data sets recorded at Mt Etna (Italy) and Guallatiri (Chile). We show that in many cases, the uncorrected OF yields significantly underestimated SO2 emission rates. We further show that our proposed correction can account for this and that it significantly improves the reliability of optical-flow-based gas velocity retrievals. In the case of Mt Etna, the SO2 emissions of the north-eastern crater are investigated. The corrected SO2 emission rates range between 4.8 and 10.7 kg s-1 (average of 7.1 ± 1.3 kg s-1) and are in good agreement with previously reported values. For the Guallatiri data, the emissions of the central crater and a fumarolic field are investigated. The retrieved SO2 emission rates are between 0.5 and 2.9 kg s-1 (average of 1.3 ± 0.5 kg s-1) and provide the first report of SO2 emissions from this remotely located and inaccessible volcano.
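
    The correction idea, replacing ill-constrained optical-flow vectors with an average derived from histograms of the retrieved motion field, can be sketched as follows. The magnitude threshold, the orientation binning, and the use of a single global (rather than local) average are simplifying assumptions for illustration, not the published algorithm's parameters:

```python
import math

def correct_flow(vectors, min_len=0.5, n_bins=36):
    """Replace ill-constrained optical-flow vectors by an average vector.

    Vectors shorter than ``min_len`` (typical of low-contrast image
    regions where optical flow fails) are treated as unreliable.  The
    dominant motion direction is found from an orientation histogram of
    the reliable vectors, and the mean vector of that histogram peak
    replaces each unreliable one.
    """
    reliable = [v for v in vectors if math.hypot(*v) >= min_len]
    if not reliable:
        return vectors
    # Orientation histogram over the reliable vectors.
    bins = [[] for _ in range(n_bins)]
    for vx, vy in reliable:
        b = int((math.atan2(vy, vx) + math.pi) / (2 * math.pi) * n_bins) % n_bins
        bins[b].append((vx, vy))
    peak = max(bins, key=len)
    avg = (sum(v[0] for v in peak) / len(peak),
           sum(v[1] for v in peak) / len(peak))
    return [v if math.hypot(*v) >= min_len else avg for v in vectors]
```

    Because near-zero vectors in low-contrast plume regions otherwise drag the integrated velocity down, this kind of replacement is what prevents the underestimated SO2 emission rates described above.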

  11. Advanced imaging research and development at DARPA

    NASA Astrophysics Data System (ADS)

    Dhar, Nibir K.; Dat, Ravi

    2012-06-01

    Advances in imaging technology have a huge impact on our daily lives. Innovations in optics, focal plane arrays (FPA), microelectronics and computation have revolutionized camera design. As a result, new approaches to camera design and low-cost manufacturing are now possible. These advances are clearly evident in the visible wavelength band due to pixel scaling and improvements in silicon material and CMOS technology. CMOS cameras are available in cell phones and many other consumer products. Advances in infrared imaging technology have been slower due to market volume and many technological barriers in detector materials, optics and fundamental limits imposed by the scaling laws of optics. There is, of course, much room for improvement in both visible and infrared imaging technology. This paper highlights various technology development projects at DARPA to advance imaging technology for both visible and infrared. Challenges and potential solutions are highlighted in areas related to wide field-of-view camera design, small-pitch pixels, broadband and multiband detectors, and focal plane arrays.

  12. Optical touch sensing: practical bounds for design and performance

    NASA Astrophysics Data System (ADS)

    Bläßle, Alexander; Janbek, Bebart; Liu, Lifeng; Nakamura, Kanna; Nolan, Kimberly; Paraschiv, Victor

    2013-02-01

    Touch-sensitive screens are used in many applications ranging in size from smartphones and tablets to display walls and collaborative surfaces. In this study, we consider optical touch sensing, a technology best suited to large-scale touch surfaces. Optical touch sensing uses cameras and light sources placed along the edge of the display. Within this framework, we first find the number of cameras sufficient for identifying a convex polygon touching the screen, using a continuous light source on the boundary of a circular domain. We then find the number of cameras necessary to distinguish between two circular objects in a circular or rectangular domain. Finally, we use Matlab to simulate the polygonal mesh formed by distributing cameras and light sources on a circular domain. From this we compute the number of polygons in the mesh and the maximum polygon area, which characterize the accuracy of the configuration. We close with a summary, conclusions, and pointers to possible future research directions.

  13. Clinical Validation of a Smartphone-Based Adapter for Optic Disc Imaging in Kenya.

    PubMed

    Bastawrous, Andrew; Giardini, Mario Ettore; Bolster, Nigel M; Peto, Tunde; Shah, Nisha; Livingstone, Iain A T; Weiss, Helen A; Hu, Sen; Rono, Hillary; Kuper, Hannah; Burton, Matthew

    2016-02-01

    Visualization and interpretation of the optic nerve and retina are essential parts of most physical examinations. To design and validate a smartphone-based retinal adapter enabling image capture and remote grading of the retina. This validation study compared the grading of optic nerves from smartphone images with those of a digital retinal camera. Both image sets were independently graded at Moorfields Eye Hospital Reading Centre. Nested within the 6-year follow-up (January 7, 2013, to March 12, 2014) of the Nakuru Eye Disease Cohort in Kenya, 1460 adults (2920 eyes) 55 years and older were recruited consecutively from the study. A subset of 100 optic disc images from both methods were further used to validate a grading app for the optic nerves. Data analysis was performed April 7 to April 12, 2015. Vertical cup-disc ratio for each test was compared in terms of agreement (Bland-Altman and weighted κ) and test-retest variability. A total of 2152 optic nerve images were available from both methods (also 371 from the reference camera but not the smartphone, 170 from the smartphone but not the reference camera, and 227 from neither the reference camera nor the smartphone). Bland-Altman analysis revealed a mean difference of 0.02 (95% CI, -0.21 to 0.17) and a weighted κ coefficient of 0.69 (excellent agreement). The grades of an experienced retinal photographer were compared with those of a lay photographer (no health care experience before the study), and no observable difference in image acquisition quality was found. Nonclinical photographers using the low-cost smartphone adapter were able to acquire optic nerve images at a standard that enabled independent remote grading of the images comparable to those acquired using a desktop retinal camera operated by an ophthalmic assistant. The potential for task shifting and the detection of avoidable causes of blindness in the most at-risk communities makes this an attractive public health intervention.
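
    The Bland-Altman agreement analysis used to compare the paired cup-disc ratios (a mean difference, or bias, with 95% limits of agreement at bias ± 1.96 SD of the differences) is straightforward to compute; the function below is a generic sketch of that standard analysis, not the study's code:

```python
import math

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements.

    Returns the mean difference (bias) between the two methods and the
    95% limits of agreement, i.e. bias +/- 1.96 times the sample
    standard deviation of the paired differences.
    """
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

    Applied to the smartphone and desktop-camera gradings, a bias near zero with narrow limits, as reported above, indicates the two imaging methods agree closely.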

  14. Towards designing an optical-flow based colonoscopy tracking algorithm: a comparative study

    NASA Astrophysics Data System (ADS)

    Liu, Jianfei; Subramanian, Kalpathi R.; Yoo, Terry S.

    2013-03-01

    Automatic co-alignment of optical and virtual colonoscopy images can supplement traditional endoscopic procedures, by providing more complete information of clinical value to the gastroenterologist. In this work, we present a comparative analysis of our optical flow based technique for colonoscopy tracking, in relation to current state of the art methods, in terms of tracking accuracy, system stability, and computational efficiency. Our optical-flow based colonoscopy tracking algorithm starts with computing multi-scale dense and sparse optical flow fields to measure image displacements. Camera motion parameters are then determined from optical flow fields by employing a Focus of Expansion (FOE) constrained egomotion estimation scheme. We analyze the design choices involved in the three major components of our algorithm: dense optical flow, sparse optical flow, and egomotion estimation. Brox's optical flow method,1 due to its high accuracy, was used to compare and evaluate our multi-scale dense optical flow scheme. SIFT6 and Harris-affine features7 were used to assess the accuracy of the multi-scale sparse optical flow, because of their wide use in tracking applications; the FOE-constrained egomotion estimation was compared with collinear,2 image deformation10 and image derivative4 based egomotion estimation methods, to understand the stability of our tracking system. Two virtual colonoscopy (VC) image sequences were used in the study, since the exact camera parameters(for each frame) were known; dense optical flow results indicated that Brox's method was superior to multi-scale dense optical flow in estimating camera rotational velocities, but the final tracking errors were comparable, viz., 6mm vs. 8mm after the VC camera traveled 110mm. Our approach was computationally more efficient, averaging 7.2 sec. vs. 38 sec. per frame. SIFT and Harris affine features resulted in tracking errors of up to 70mm, while our sparse optical flow error was 6mm. 
The comparison among egomotion estimation algorithms showed that our FOE-constrained egomotion estimation method achieved the optimal balance between tracking accuracy and robustness. The comparative study demonstrated that our optical-flow based colonoscopy tracking algorithm maintains good accuracy and stability for routine use in clinical practice.
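    The FOE-constrained step of such a tracker can be illustrated with a small sketch: under pure camera translation, every optical-flow vector points radially away from the focus of expansion, so the FOE can be recovered from a sparse flow field by linear least squares. This is a toy illustration with synthetic flow, not the authors' implementation:

    ```python
    import numpy as np

    def estimate_foe(points, flows):
        """Least-squares focus-of-expansion from a sparse flow field.

        Each flow vector (u, v) at (x, y) should point away from the FOE
        (xf, yf), so the 2D cross product (x-xf, y-yf) x (u, v) is ~0:
            u*(y - yf) - v*(x - xf) = 0  =>  -v*xf + u*yf = u*y - v*x
        """
        x, y = points[:, 0], points[:, 1]
        u, v = flows[:, 0], flows[:, 1]
        A = np.column_stack([-v, u])
        b = u * y - v * x
        foe, *_ = np.linalg.lstsq(A, b, rcond=None)
        return foe

    # Synthetic pure-translation flow expanding from (10, -5):
    rng = np.random.default_rng(0)
    pts = rng.uniform(-50, 50, size=(200, 2))
    flow = 0.1 * (pts - np.array([10.0, -5.0]))   # radial flow from FOE
    print(np.round(estimate_foe(pts, flow), 3))   # ~ [10. -5.]
    ```

    Real flow fields are noisy, which is why a robust, constrained estimator (as in the paper) is needed rather than plain least squares.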

  15. Holographic motion picture camera with Doppler shift compensation

    NASA Technical Reports Server (NTRS)

    Kurtz, R. L. (Inventor)

    1976-01-01

    A holographic motion picture camera for producing three-dimensional images using an elliptical optical system is reported. A motion compensator in one of the beam paths (the object or reference beam path) enables the camera to photograph faster moving objects.

  16. IR in Norway

    NASA Astrophysics Data System (ADS)

    Haakenaasen, Randi; Lovold, Stian

    2003-01-01

    Infrared technology in Norway started at the Norwegian Defense Research Establishment (FFI) in the 1960s, and has since then spread to universities, other research institutes and industry. FFI has a large, integrated IR activity that includes research and development in IR detectors, optics design, optical coatings, advanced dewar design, modelling/simulation of IR scenes, and image analysis. Part of the integrated activity is a laboratory for more basic research in materials science and semiconductor physics, in which thin films of CdHgTe are grown by molecular beam epitaxy and processed into IR detectors by various techniques. FFI also has extensive experience in research and development of tunable infrared lasers for various applications. Norwegian industrial activities include production of infrared homing anti-ship missiles, laser rangefinders, various infrared gas sensors, hyperspectral cameras, and fiberoptic sensor systems for structural health monitoring and offshore oil well diagnostics.

  17. Standard design for National Ignition Facility x-ray streak and framing cameras.

    PubMed

    Kimbrough, J R; Bell, P M; Bradley, D K; Holder, J P; Kalantar, D K; MacPhee, A G; Telford, S

    2010-10-01

    The x-ray streak camera and x-ray framing camera for the National Ignition Facility were redesigned to improve electromagnetic pulse hardening, protect high voltage circuits from pressure transients, and maximize the use of common parts and operational software. Both instruments use the same PC104 based controller, interface, power supply, charge coupled device camera, protective hermetically sealed housing, and mechanical interfaces. Communication is over fiber optics with identical facility hardware for both instruments. Each has three triggers that can be either fiber optic or coax. High voltage protection consists of a vacuum sensor to enable the high voltage and pulsed microchannel plate phosphor voltage. In the streak camera, the high voltage is removed after the sweep. Both rely on the hardened aluminum box and a custom power supply to reduce electromagnetic pulse/electromagnetic interference (EMP/EMI) getting into the electronics. In addition, the streak camera has an EMP/EMI shield enclosing the front of the streak tube.

  18. Computational photography with plenoptic camera and light field capture: tutorial.

    PubMed

    Lam, Edmund Y

    2015-11-01

    Photography is a cornerstone of imaging. Ever since cameras became consumer products more than a century ago, we have witnessed great technological progress in optics and recording mediums, with digital sensors replacing photographic films in most instances. The latest revolution is computational photography, which seeks to make image reconstruction computation an integral part of the image formation process; in this way, there can be new capabilities or better performance in the overall imaging system. A leading effort in this area is called the plenoptic camera, which aims at capturing the light field of an object; proper reconstruction algorithms can then adjust the focus after the image capture. In this tutorial paper, we first illustrate the concept of plenoptic function and light field from the perspective of geometric optics. This is followed by a discussion on early attempts and recent advances in the construction of the plenoptic camera. We will then describe the imaging model and computational algorithms that can reconstruct images at different focus points, using mathematical tools from ray optics and Fourier optics. Last, but not least, we will consider the trade-off in spatial resolution and highlight some research work to increase the spatial resolution of the resulting images.
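    The refocusing idea the tutorial describes can be sketched with the classic shift-and-add algorithm over sub-aperture images; the array shapes and the `alpha` parameterization below are illustrative assumptions, not a specific camera's processing chain:

    ```python
    import numpy as np

    def refocus(lightfield, alpha):
        """Synthetic refocusing by shift-and-add over sub-aperture images.

        lightfield: array (U, V, X, Y) of sub-aperture views; alpha sets
        the new focal plane, shifting each view by (1 - 1/alpha)*(u, v)
        pixels before averaging.
        """
        U, V, X, Y = lightfield.shape
        shift = 1.0 - 1.0 / alpha
        out = np.zeros((X, Y))
        for u in range(U):
            for v in range(V):
                du = int(round(shift * (u - U // 2)))
                dv = int(round(shift * (v - V // 2)))
                out += np.roll(lightfield[u, v], (du, dv), axis=(0, 1))
        return out / (U * V)
    ```

    With `alpha = 1` no shift is applied and the result is simply the aperture-averaged image; other values of `alpha` move the synthetic focal plane forward or backward.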

  19. Optical aberration correction for simple lenses via sparse representation

    NASA Astrophysics Data System (ADS)

    Cui, Jinlin; Huang, Wei

    2018-04-01

    Simple lenses with spherical surfaces are lightweight, inexpensive, highly flexible, and easily processed. However, they suffer from optical aberrations that limit high-quality photography. In this study, we propose a set of computational photography techniques based on sparse signal representation to remove optical aberrations, thereby allowing the recovery of images captured through a single-lens camera. The primary advantage of the proposed method is that many prior point spread functions calibrated at different depths can be used to restore images in a short time; this addresses a general problem of non-blind deconvolution methods, namely the excessive processing time caused by the number of point spread functions. The optical design software CODE V is applied to examine the reliability of the proposed method by simulation. The simulation results reveal that the suggested method outperforms traditional methods. Moreover, the performance of a single-lens camera is significantly enhanced both quantitatively and perceptually. In particular, the prior information obtained by CODE V can be used for processing real images from a single-lens camera, which provides an alternative approach to conveniently and accurately obtaining the point spread functions of single-lens cameras.
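    A minimal stand-in for the non-blind deconvolution step, assuming a calibrated PSF, is a frequency-domain Wiener filter (the paper's sparse-representation prior is not reproduced here; `k` is an assumed noise-to-signal regularizer):

    ```python
    import numpy as np

    def wiener_deconvolve(blurred, psf, k=0.01):
        """Non-blind deconvolution with a known PSF via a Wiener filter.

        F_hat = conj(H) / (|H|^2 + k) * G, computed in the Fourier domain,
        where H is the PSF transfer function and G the blurred spectrum.
        """
        H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
        G = np.fft.fft2(blurred)
        F = np.conj(H) / (np.abs(H) ** 2 + k) * G
        return np.real(np.fft.ifft2(F))
    ```

    In a depth-varying setting like the paper's, one such PSF would be selected (or interpolated) per depth before deconvolving.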

  20. Preliminary Design of a Lightning Optical Camera and ThundEr (LOCATE) Sensor

    NASA Technical Reports Server (NTRS)

    Phanord, Dieudonne D.; Koshak, William J.; Rybski, Paul M.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The preliminary design of an optical/acoustical instrument is described for making highly accurate real-time determinations of the location of cloud-to-ground (CG) lightning. The instrument, named the Lightning Optical Camera And ThundEr (LOCATE) sensor, will also image the clear and cloud-obscured lightning channel produced from CGs and cloud flashes, and will record the transient optical waveforms produced from these discharges. The LOCATE sensor will consist of a full (360 degrees) field-of-view optical camera for obtaining CG channel image and azimuth, a sensitive thunder microphone for obtaining CG range, and a fast photodiode system for time-resolving the lightning optical waveform. The optical waveform data will be used to discriminate CGs from cloud flashes. Together, the optical azimuth and thunder range are used to locate CGs, and it is anticipated that a network of LOCATE sensors would determine CG source location to well within 100 meters. All of this would be accomplished at a relatively low cost compared to present RF lightning location technologies, but of course the range detection is limited and will be quantified in the future. The LOCATE sensor technology would have practical applications for electric power utility companies, government (e.g. NASA Kennedy Space Center lightning safety and warning), golf resort lightning safety, telecommunications, and other industries.
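    The azimuth-plus-thunder ranging described above reduces to simple geometry: range is the speed of sound times the light-to-thunder delay. The helper below is hypothetical (the function name, coordinate convention, and temperature-corrected sound speed are illustrative assumptions, not the LOCATE design):

    ```python
    import math

    def locate_cg(sensor_xy, azimuth_deg, thunder_delay_s, temp_c=20.0):
        """Hypothetical helper: locate a cloud-to-ground strike from the
        optical azimuth and the thunder arrival delay.

        Range = speed of sound * delay; the sound speed is corrected for
        air temperature. Azimuth is measured clockwise from north, with
        x = east and y = north.
        """
        c_sound = 331.3 * math.sqrt(1.0 + temp_c / 273.15)  # m/s
        rng = c_sound * thunder_delay_s
        az = math.radians(azimuth_deg)
        return (sensor_xy[0] + rng * math.sin(az),
                sensor_xy[1] + rng * math.cos(az))
    ```

    A 2-second delay at 20 °C corresponds to roughly 686 m of range, which illustrates why acoustic ranging is inherently short-range.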

  1. Mechanical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordby, Martin; Bowden, Gordon; Foss, Mike

    2008-06-13

    The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It is comprised of three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.

  2. Background simulations of the wide-field coded-mask camera for X-/Gamma-ray of the French-Chinese mission SVOM

    NASA Astrophysics Data System (ADS)

    Godet, Olivier; Barret, Didier; Paul, Jacques; Sizun, Patrick; Mandrou, Pierre; Cordier, Bertrand

    SVOM (Space Variable Object Monitor) is a French-Chinese mission dedicated to the study of high-redshift GRBs, which is expected to be launched in 2012. The anti-Sun pointing strategy of SVOM, along with a strong and integrated ground segment consisting of two wide-field robotic telescopes covering the near-IR and optical, will optimise the ground-based GRB follow-ups by the largest telescopes and thus the measurements of spectroscopic redshifts. The central instrument of the science payload will be an innovative wide-field coded-mask camera for X-/Gamma-rays (4-250 keV) responsible for triggering and localising GRBs with an accuracy better than 10 arc-minutes. Such an instrument will be background-dominated, so it is essential to estimate the background level expected once in orbit during the early phase of the instrument design in order to ensure good science performance. We present our Monte-Carlo simulator enabling us to compute the background spectrum taking into account the mass model of the camera and the main components of the space environment encountered in orbit by the satellite. From that computation, we show that the current design of the CXG camera will be more sensitive to high-redshift GRBs than the Swift-BAT thanks to its low-energy threshold of 4 keV.

  3. High sensitive volumetric imaging of renal microcirculation in vivo using ultrahigh sensitive optical microangiography

    NASA Astrophysics Data System (ADS)

    Zhi, Zhongwei; Jung, Yeongri; Jia, Yali; An, Lin; Wang, Ruikang K.

    2011-03-01

    We present a non-invasive, label-free imaging technique called Ultrahigh Sensitive Optical Microangiography (UHS-OMAG) for high-sensitivity volumetric imaging of renal microcirculation. The UHS-OMAG imaging system is based on spectral domain optical coherence tomography (SD-OCT) and uses a CCD camera with a 47,000 A-line/s scan rate to achieve an imaging speed of 150 frames per second, so that acquiring a 3D image takes only ~7 seconds. The technique, capable of measuring slow blood flow down to 4 um/s, is sensitive enough to image capillary networks, such as peritubular capillaries and the glomerulus within the renal cortex. We show the superior performance of UHS-OMAG in providing depth-resolved volumetric images of the rich renal microcirculation. We monitored the dynamics of the renal microvasculature during renal ischemia and reperfusion. An obvious reduction of renal microvascular density due to renal ischemia was visualized and quantitatively analyzed. This technique can be helpful for the assessment of chronic kidney disease (CKD), which is related to abnormal microvasculature.

  4. Beats: Video Monitors and Cameras.

    ERIC Educational Resources Information Center

    Worth, Frazier

    1996-01-01

    Presents a method to teach the concept of beats as a generalized phenomenon rather than teaching it only in the context of sound. Involves using a video camera to film a computer terminal, 16-mm projector, or TV monitor. (JRH)
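    The underlying arithmetic is just the beat-frequency relation between two nearby rates, here applied to a camera frame rate and a display or projector refresh rate:

    ```python
    def beat_frequency(f1, f2):
        """Beat frequency between two periodic processes, e.g. a camera's
        frame rate and a monitor's refresh rate: f_beat = |f1 - f2|."""
        return abs(f1 - f2)

    # A 30 fps video camera filming a 24 fps film projector produces a
    # visible flicker (beat) at 6 Hz:
    print(beat_frequency(30.0, 24.0))  # 6.0
    ```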

  5. Optical design of the SuMIRe/PFS spectrograph

    NASA Astrophysics Data System (ADS)

    Pascal, Sandrine; Vives, Sébastien; Barkhouser, Robert; Gunn, James E.

    2014-07-01

    The SuMIRe Prime Focus Spectrograph (PFS), developed for the 8-m class SUBARU telescope, will consist of four identical spectrographs, each receiving 600 fibers from a 2394-fiber robotic positioner at the telescope prime focus. Each spectrograph includes three spectral channels to cover the wavelength range [0.38-1.26] um with a resolving power ranging between 2000 and 4000. A medium resolution mode is also implemented to reach a resolving power of 5000 at 0.8 um. Each spectrograph is made of four optical units: the entrance unit, which produces three corrected collimated beams, and three camera units (one per spectral channel: "blue", "red", and "NIR"). The beam is split by using two large dichroics, and in each arm the light is dispersed by large VPH gratings (about 280x280 mm). The proposed optical design was optimized to achieve the requested image quality while simplifying the manufacturing of the whole optical system. The camera design consists of an innovative Schmidt camera observing a large field-of-view (10 degrees) with a very fast beam (F/1.09). To achieve such performance, the classical spherical mirror is replaced by a catadioptric mirror (i.e., a meniscus lens with a reflective surface on the rear side of the glass, like a Mangin mirror). This article focuses on the optical architecture of the PFS spectrograph and the performance achieved. We first describe the global optical design of the spectrograph. Then, we focus on the Mangin-Schmidt camera design. The analysis of the optical performance and the results obtained are presented in the last section.

  6. Intraocular camera for retinal prostheses: Refractive and diffractive lens systems

    NASA Astrophysics Data System (ADS)

    Hauer, Michelle Christine

    The focus of this thesis is on the design and analysis of refractive, diffractive, and hybrid refractive/diffractive lens systems for a miniaturized camera that can be surgically implanted in the crystalline lens sac and is designed to work in conjunction with current and future generation retinal prostheses. The development of such an intraocular camera (IOC) would eliminate the need for an external head-mounted or eyeglass-mounted camera. Placing the camera inside the eye would allow subjects to use their natural eye movements for foveation (attention) instead of more cumbersome head tracking, would notably aid in personal navigation and mobility, and would also be significantly more psychologically appealing from the standpoint of personal appearances. The capability for accommodation with no moving parts or feedback control is incorporated by employing camera designs that exhibit nearly infinite depth of field. Such an ultracompact optical imaging system requires a unique combination of refractive and diffractive optical elements and relaxed system constraints derived from human psychophysics. This configuration necessitates an extremely compact, short focal-length lens system with an f-number close to unity. Initially, these constraints appear highly aggressive from an optical design perspective. However, after careful analysis of the unique imaging requirements of a camera intended to work in conjunction with the relatively low pixellation levels of a retinal microstimulator array, it becomes clear that such a design is not only feasible, but could possibly be implemented with a single lens system.
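    The "nearly infinite depth of field" with no moving parts can be made concrete with the standard hyperfocal-distance formula; the numbers in the example are illustrative, not taken from the thesis:

    ```python
    def hyperfocal_distance(f_mm, f_number, coc_mm):
        """Hyperfocal distance H = f^2/(N*c) + f.

        Focusing a fixed-focus camera at H yields acceptable sharpness
        from H/2 out to infinity, which is how a miniature short-focal-
        length lens achieves a very large depth of field.
        """
        return f_mm ** 2 / (f_number * coc_mm) + f_mm

    # Illustrative values: a 2 mm focal length at f/1 with a 0.02 mm
    # acceptable blur circle gives H = 202 mm, i.e. everything from
    # about 10 cm to infinity is acceptably sharp.
    print(hyperfocal_distance(2.0, 1.0, 0.02))  # 202.0
    ```

    Short focal lengths dominate the f^2 term, which is why ultracompact cameras tolerate a fixed focus at low pixellation levels.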

  7. Of Detection Limits and Effective Mitigation: The Use of Infrared Cameras for Methane Leak Detection

    NASA Astrophysics Data System (ADS)

    Ravikumar, A. P.; Wang, J.; McGuire, M.; Bell, C.; Brandt, A. R.

    2017-12-01

    Mitigating methane emissions, a short-lived and potent greenhouse gas, is critical to limiting global temperature rise to two degrees Celsius as outlined in the Paris Agreement. A major source of anthropogenic methane emissions in the United States is the oil and gas sector. To this effect, state and federal governments have recommended the use of optical gas imaging systems in periodic leak detection and repair (LDAR) surveys to detect fugitive emissions or leaks. The most commonly used optical gas imaging (OGI) systems are infrared cameras. In this work, we systematically evaluate the limits of infrared (IR) camera based OGI systems for use in methane leak detection programs. We analyze the effect of various parameters that influence the minimum detectable leak rates of infrared cameras. Blind leak detection tests were carried out at the Department of Energy's MONITOR natural gas test facility in Fort Collins, CO. Leak sources included natural gas wellheads, separators, and tanks. With an EPA-mandated 60 g/hr leak detection threshold for IR cameras, we test leak rates ranging from 4 g/hr to over 350 g/hr at imaging distances between 5 ft and 70 ft from the leak source. We perform these experiments over the course of a week, encompassing a wide range of wind and weather conditions. Using repeated measurements at a given leak rate and imaging distance, we generate detection probability curves as a function of leak size for various imaging distances and measurement conditions. In addition, we estimate the median detection threshold - the leak size at which the probability of detection is 50% - under various scenarios to reduce uncertainty in mitigation effectiveness. Preliminary analysis shows that the median detection threshold varies from 3 g/hr at an imaging distance of 5 ft to over 150 g/hr at 50 ft (ambient temperature: 80 F, winds < 4 m/s).
Results from this study can be directly used to improve OGI based LDAR protocols and reduce uncertainty in estimated mitigation effectiveness. Furthermore, detection limits determined in this study can be used as standards to compare new detection technologies.
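    Detection-probability curves of this kind are commonly summarized by fitting a logistic function of log leak rate and reading off the 50% point. The sketch below uses a toy gradient-ascent fit on detection fractions, not the study's actual analysis:

    ```python
    import numpy as np

    def fit_detection_curve(leak_rates, p_detect):
        """Fit P(detect | leak rate q) = logistic(a + b*log q) and return
        the median detection threshold, i.e. the q where P = 0.5.

        p_detect holds the observed detection fraction at each leak rate;
        the fit is a simple gradient ascent on the logistic likelihood.
        """
        x = np.log(np.asarray(leak_rates, float))
        y = np.asarray(p_detect, float)
        xc = x - x.mean()                  # center for stable gradient ascent
        a, b = 0.0, 0.0
        for _ in range(5000):
            p = 1.0 / (1.0 + np.exp(-(a + b * xc)))
            a += 0.1 * np.mean(y - p)
            b += 0.1 * np.mean((y - p) * xc)
        return np.exp(x.mean() - a / b)    # a + b*xc = 0  =>  P = 0.5
    ```

    Fitting one such curve per imaging distance reproduces the kind of distance-dependent median threshold (3 g/hr near, 150 g/hr far) reported above.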

  8. A near-Infrared SETI Experiment: Alignment and Astrometric precision

    NASA Astrophysics Data System (ADS)

    Duenas, Andres; Maire, Jerome; Wright, Shelley; Drake, Frank D.; Marcy, Geoffrey W.; Siemion, Andrew; Stone, Remington P. S.; Tallis, Melisa; Treffers, Richard R.; Werthimer, Dan

    2016-06-01

    Beginning in March 2015, a Near-InfraRed Optical SETI (NIROSETI) instrument, aiming to search for fast nanosecond laser pulses, has been commissioned on the Nickel 1-m telescope at Lick Observatory. The NIROSETI instrument makes use of an optical guide camera, a SONY ICX694 CCD from PointGrey, to align our selected sources onto two 200 µm near-infrared Avalanche Photo Diodes (APDs) with a field-of-view of 2.5"x2.5" each. These APD detectors operate at very fast bandwidths and are able to detect pulse widths extending down into the nanosecond range. Aligning sources onto these relatively small detectors requires characterizing the guide camera plate scale, the static optical distortion solution, and the relative orientation with respect to the APD detectors. We determined the guide camera plate scale to be 55.9 ± 2.7 milliarcseconds/pixel and the magnitude limit to be 18.15 mag (+1.07/-0.58) in V-band. We will present the full distortion solution of the guide camera, its orientation, and our alignment method between the camera and the two APDs, and will discuss target selection within the NIROSETI observational campaign, including coordination with Breakthrough Listen.
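    Converting between guide-camera pixels and on-sky angles with the measured plate scale is a one-line calculation; the helper below is an illustrative sketch, not NIROSETI software:

    ```python
    def pixel_offset_to_sky(dx_pix, dy_pix, plate_scale_mas=55.9):
        """Convert a pixel offset on the guide camera into an on-sky
        offset in arcseconds, given the plate scale in mas/pixel
        (55.9 mas/pixel is the value measured above)."""
        s = plate_scale_mas / 1000.0   # mas/pixel -> arcsec/pixel
        return dx_pix * s, dy_pix * s

    # At 55.9 mas/pixel, a 2.5" APD field of view spans about 45 pixels:
    print(2.5 / (55.9 / 1000.0))  # ≈ 44.7
    ```

    Centering a source within the 2.5"x2.5" APD field therefore requires pointing corrections at the few-pixel level on the guide camera.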

  9. Validation of Attitude and Heading Reference System and Microsoft Kinect for Continuous Measurement of Cervical Range of Motion Compared to the Optical Motion Capture System.

    PubMed

    Song, Young Seop; Yang, Kyung Yong; Youn, Kibum; Yoon, Chiyul; Yeom, Jiwoon; Hwang, Hyeoncheol; Lee, Jehee; Kim, Keewon

    2016-08-01

    To compare optical motion capture system (MoCap), attitude and heading reference system (AHRS) sensor, and Microsoft Kinect for the continuous measurement of cervical range of motion (ROM). Fifteen healthy adult subjects were asked to sit in front of the Kinect camera with optical markers and AHRS sensors attached to the body in a room equipped with optical motion capture camera. Subjects were instructed to independently perform axial rotation followed by flexion/extension and lateral bending. Each movement was repeated 5 times while being measured simultaneously with 3 devices. Using the MoCap system as the gold standard, the validity of AHRS and Kinect for measurement of cervical ROM was assessed by calculating correlation coefficient and Bland-Altman plot with 95% limits of agreement (LoA). MoCap and AHRS showed fair agreement (95% LoA<10°), while MoCap and Kinect showed less favorable agreement (95% LoA>10°) for measuring ROM in all directions. Intraclass correlation coefficient (ICC) values between MoCap and AHRS in -40° to 40° range were excellent for flexion/extension and lateral bending (ICC>0.9). ICC values were also fair for axial rotation (ICC>0.8). ICC values between MoCap and Kinect system in -40° to 40° range were fair for all motions. Our study showed the feasibility of using AHRS to measure cervical ROM during continuous motion with an acceptable range of error. The AHRS and Kinect systems can also be used for continuous monitoring of flexion/extension and lateral bending in the ordinary range.
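    The 95% limits of agreement used above follow the standard Bland-Altman recipe: the bias (mean difference) plus or minus 1.96 standard deviations of the paired differences. A minimal sketch:

    ```python
    import numpy as np

    def bland_altman_loa(reference, device):
        """Bland-Altman 95% limits of agreement between a reference
        method (e.g. optical MoCap) and a test device (e.g. an AHRS
        sensor): bias +/- 1.96 * SD of the paired differences."""
        d = np.asarray(device, float) - np.asarray(reference, float)
        bias = d.mean()
        half = 1.96 * d.std(ddof=1)
        return bias - half, bias + half
    ```

    A device whose LoA interval is narrower than 10° would meet the agreement criterion the study applies to AHRS.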

  10. Development of an optically-based tension-indicating implanted orthopedic screw with a luminescent spectral ruler

    NASA Astrophysics Data System (ADS)

    Ravikumar, Nakul; Rogalski, Melissa M.; Benza, Donny; Lake, Joshua; Urban, Matthew; Pelham, Hunter; Anker, Jeffrey N.; DesJardins, John D.

    2017-03-01

    An orthopaedic screw was designed with an optical tension-indicator to non-invasively quantify screw tension and monitor the load sharing between the bone and the implant. The screw both applies load to the bone, and measures this load by reporting the strain on the screw. The screw contains a colorimetric optical encoder that converts axial strain into colorimetric changes visible through the head of the screw, or luminescent spectral changes that are detected through tissue. Screws were tested under cyclic mechanical loading to mimic in-vivo conditions to verify the sensitivity, repeatability, and reproducibility of the sensor. In the absence of tissue, color was measured using a digital camera as a function of axial load on a stainless steel cannulated (hollow) orthopedic screw, modified by adding a passive colorimetric strain gauge through the central hole. The sensor was able to quantify clinically-relevant bone healing strains. The sensor exhibited good repeatability and reproducibility but also displayed hysteresis due to the internal mechanics of the screw. The strain indicator was also modified for measurement through tissue by replacing the reflective colorimetric sensor with a low-background X-ray excited optical luminescence signal. Luminescent spectra were acquired through 6 mm of chicken breast tissue. Overall, this research shows feasibility of a unique device that quantifies the strain on an orthopedic screw. Future research will involve reducing hysteresis by changing the mechanism of strain transduction in the screw, miniaturizing the luminescent strain gauge, monitoring bending as well as tension, using alternative luminescent spectral rulers based upon near infrared fluorescence or upconversion luminescence, and application to monitoring changes in pretension and load sharing during bone healing.

  11. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1991-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinate for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  12. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1993-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinate for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  13. KAPAO Prime: Design and Simulation

    NASA Astrophysics Data System (ADS)

    McGonigle, Lorcan; Choi, P. I.; Severson, S. A.; Spjut, E.

    2013-01-01

    KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration over UV-NIR wavelengths from Pomona College's telescope atop Table Mountain. We present here the final optical system, KAPAO Prime, designed in the Zemax optical design software, which uses custom off-axis paraboloid mirrors (OAPs) to manipulate light appropriately for a Shack-Hartmann wavefront sensor, deformable mirror, and science cameras. KAPAO Prime is characterized by diffraction-limited imaging over the full 81" field of view of our optical camera at f/33 as well as over the smaller field of view of our NIR camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of our optical camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity (632.8 nm). Modeling of the temperature deformation of the breadboard in SolidWorks revealed 70 micron contractions along the edges of the board for a decrease of 75°F; when applied to OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nanometers of wavefront error. Flexure modeling of the breadboard due to gravity is on-going. We hope to begin alignment and testing of KAPAO Prime in Q1 2013.
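    Independent wavefront-error contributions like the ones quoted (tolerancing, surface irregularity, thermal) are conventionally combined in quadrature (root-sum-square) in an error budget; whether the KAPAO team combines them this way is an assumption here:

    ```python
    import math

    def total_wavefront_error(contributions_nm):
        """RSS combination of statistically independent wavefront-error
        terms, the usual convention in optical error budgets."""
        return math.sqrt(sum(c * c for c in contributions_nm))

    # Tolerancing (4 nm), surface irregularity (40 nm), thermal (20 nm):
    print(round(total_wavefront_error([4.0, 40.0, 20.0]), 1))  # 44.9
    ```

    The RSS shows why the 40 nm surface-irregularity term dominates the budget: the 4 nm tolerancing term barely moves the total.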

  14. JEOS. The JANUS earth observation satellite

    NASA Astrophysics Data System (ADS)

    Molette, P.; Jouan, J.

    The JANUS multimission platform has been designed to minimize the cost of the satellite (by maximum reuse of equipment from other programmes) and of its associated launch by Ariane (by a piggy-back configuration optimized for Ariane 4). The paper describes the application of the JANUS platform to an Earth observation mission with the objective of providing a given country with permanent monitoring of its earth resources by exploitation of spaceborne imagery. According to this objective, and to minimize the overall system and operational cost, the JANUS Earth Observation Satellite (JEOS) will provide a limited coverage with real-time transmission of image data, thus avoiding the need for on-board storage and simplifying operations. The JEOS operates on a low-earth, near-polar sun-synchronous orbit. Launched in a piggy-back configuration on Ariane 4, with a SPOT or ERS spacecraft, it reaches its operational orbit after a drift orbit of a few weeks maximum. In its operational mode, the JEOS is 3-axis stabilised and earth pointed. After presentation of the platform, the paper describes the solid-state push-broom camera, which is composed of four optical lenses mounted on a highly stable optical bench. Each lens includes an optics system, reused from an on-going development, and two CCD linear arrays of detectors. The camera provides four registered channels in visible and near-IR bands. The whole optical bench is supported by a rotating mechanism which allows rotation of the optical axis in the across-track direction. The JEOS typical performance for a 700 km altitude is then summarized: spatial resolution 30 m, swath width 120 km, off-track capability 325 km,… The payload data handling and transmission electronics, derived from the French SPOT satellite, realizes the processing, formatting, and transmission to the ground; this allows reuse of the standard SPOT receiving stations.
The camera is only operated when the spacecraft is within the visibility of the ground station, and image data are directly transmitted to the ground station by the spacecraft X-band transmitter. Finally, the paper presents a set of typical Earth observation missions which can be realized with JEOS, for countries which wish to have their own observation system, possibly also as a complement to the SPOT and/or LANDSAT observation data.

  15. Performance prediction of optical image stabilizer using SVM for shaker-free production line

    NASA Astrophysics Data System (ADS)

    Kim, HyungKwan; Lee, JungHyun; Hyun, JinWook; Lim, Haekeun; Kim, GyuYeol; Moon, HyukSoo

    2016-04-01

    Recent smartphones adopt camera modules with an optical image stabilizer (OIS) to enhance imaging quality under handshaking conditions. However, compared to the non-OIS camera module, the cost of implementing the OIS module is still high. One reason is that the production line for the OIS camera module requires a highly precise shaker table in the final test process, which increases the unit cost of production. In this paper, we propose a framework for OIS quality prediction that is trained with a support vector machine on the following module-characterizing features: noise spectral density of the gyroscope, and optically measured linearity and cross-axis movement of the Hall sensor and actuator. The classifier was tested on an actual production line and achieved a recall rate of 88%.
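    As a sketch of the approach, a tiny linear SVM (hinge loss, sub-gradient descent in the Pegasos style) trained on module features, with recall as the figure of merit. The feature interpretation and the toy data are assumptions for illustration, not the paper's pipeline:

    ```python
    import numpy as np

    def train_linear_svm(X, y, lam=0.01, epochs=200):
        """Minimal linear SVM: hinge loss + L2 regularization, optimized
        by cyclic sub-gradient descent. y must be in {-1, +1}; features
        could be e.g. gyro noise density and Hall/actuator linearity."""
        X = np.asarray(X, float)
        y = np.asarray(y, float)
        w = np.zeros(X.shape[1])
        b = 0.0
        for t in range(1, epochs * len(y) + 1):
            i = (t - 1) % len(y)
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w + b)
            w *= (1 - eta * lam)           # regularization shrinkage
            if margin < 1:                 # hinge-loss sub-gradient step
                w += eta * y[i] * X[i]
                b += eta * y[i]
        return w, b

    def recall(w, b, X, y):
        """Recall on the positive ('good module') class."""
        pred = np.sign(np.asarray(X, float) @ w + b)
        pos = np.asarray(y) == 1
        return np.mean(pred[pos] == 1)
    ```

    In a shaker-free line, such a classifier would flag modules whose static measurements predict a failing OIS response, reserving any shaker test for borderline cases.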

  16. Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic

    NASA Astrophysics Data System (ADS)

    Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Yasushi, Kondo

    2008-11-01

    Visualization of explosion phenomena is very important and essential to evaluate the performance of explosive effects. The phenomena, however, generate blast waves and fragments from cases. We must protect our visualizing equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in severe blast environment, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to the images taken by the camera with the lens directly coupled to the camera head. It could be confirmed that this system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

  17. Fiber optic TV direct

    NASA Technical Reports Server (NTRS)

    Kassak, John E.

    1991-01-01

    The objective of the operational television (OTV) technology was to develop a multiple-camera system (up to 256 cameras) for NASA Kennedy installations in which camera video, synchronization, control, and status data are transmitted bidirectionally over a single fiber cable at distances in excess of five miles. It is shown that benefits such as improved video performance, immunity from electromagnetic interference and radio frequency interference, elimination of repeater stations, and greater system configuration flexibility can be realized by applying the proven fiber optic transmission concept. The control system will marry the lens, pan and tilt, and camera control functions into a modular Local Area Network (LAN) based control network. Such a system does not exist commercially at present, since the television broadcast industry's current practice is to divorce the positional controls from the camera control system. The application software developed for this system will have direct applicability to similar systems in industry using LAN-based control systems.

  18. Optical stereo video signal processor

    NASA Technical Reports Server (NTRS)

    Craig, G. D. (Inventor)

    1985-01-01

    An optical video signal processor is described which produces a two-dimensional cross-correlation in real time of images received by a stereo camera system. The optical image of each camera is projected onto a respective liquid crystal light valve. The images on the liquid crystal valves modulate light produced by an extended light source. This modulated light output becomes the two-dimensional cross-correlation when focused onto a video detector and is a function of the range of a target with respect to the stereo camera. Alternate embodiments utilize the two-dimensional cross-correlation to determine target movement and target identification.
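    A digital analogue may clarify why the correlation encodes range: the lag of the cross-correlation peak between the two stereo images is the disparity, and range follows from the pinhole relation Z = fB/d. The focal length and baseline below are assumed values for illustration, not from the patent:

```python
# Digital analogue of the optical correlator: find the disparity as the
# lag of the cross-correlation peak, then convert disparity to range.
import numpy as np

rng = np.random.default_rng(1)
left = rng.standard_normal(256)            # 1-D "left image" row
true_disparity = 7
right = np.roll(left, true_disparity)      # right image shifted by 7 px

corr = np.correlate(right, left, mode="full")
disparity = corr.argmax() - (len(left) - 1)   # lag of the peak

focal_px, baseline_m = 800.0, 0.12            # assumed camera geometry
range_m = focal_px * baseline_m / disparity   # Z = f*B/d
print(disparity, round(range_m, 2))
```

    The optical processor performs the same correlation in parallel over the full 2-D image using light valves, rather than pixel arithmetic.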

  19. Seasonal variations of leaf and canopy properties tracked by ground-based NDVI imagery in a temperate forest

    DOE PAGES

    Yang, Hualei; Yang, Xi; Heskel, Mary; ...

    2017-04-28

    Changes in plant phenology affect the carbon flux of terrestrial forest ecosystems due to the link between the growing season length and vegetation productivity. Digital camera imagery, which can be acquired frequently, has been used to monitor seasonal and annual changes in forest canopy phenology and track critical phenological events. However, quantitative assessment of the structural and biochemical controls of the phenological patterns in camera images has rarely been done. In this study, we used an NDVI (Normalized Difference Vegetation Index) camera to monitor daily variations of vegetation reflectance at visible and near-infrared (NIR) bands with high spatial and temporal resolutions, and found that the infrared camera based NDVI (camera-NDVI) agreed well with the leaf expansion process that was measured by independent manual observations at Harvard Forest, Massachusetts, USA. We also measured the seasonality of canopy structural (leaf area index, LAI) and biochemical properties (leaf chlorophyll and nitrogen content). Here we found significant linear relationships between camera-NDVI and leaf chlorophyll concentration, and between camera-NDVI and leaf nitrogen content, though weaker relationships between camera-NDVI and LAI. Therefore, we recommend ground-based camera-NDVI as a powerful tool for long-term, near surface observations to monitor canopy development and to estimate leaf chlorophyll, nitrogen status, and LAI.
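    The camera-NDVI quantity is the standard band ratio computed per pixel from the visible (red) and NIR frames. A minimal sketch, with toy reflectance values rather than Harvard Forest data:

```python
# Per-pixel NDVI from co-registered NIR and red frames:
# NDVI = (NIR - RED) / (NIR + RED).
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Elementwise NDVI; eps guards against division by zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 frames: vegetation reflects strongly in NIR, weakly in red,
# so healthy canopy pixels approach 1; the bare pixel stays near 0.
nir = np.array([[0.5, 0.6], [0.4, 0.1]])
red = np.array([[0.1, 0.1], [0.1, 0.1]])
print(ndvi(nir, red).round(2))
```

    Averaging this quantity over a canopy region of interest, one image per day, yields the seasonal camera-NDVI time series the study tracks.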

  1. An evaluation of video cameras for collecting observational data on sanctuary-housed chimpanzees (Pan troglodytes).

    PubMed

    Hansen, Bethany K; Fultz, Amy L; Hopper, Lydia M; Ross, Stephen R

    2018-05-01

    Video cameras are increasingly being used to monitor captive animals in zoo, laboratory, and agricultural settings. This technology may also be useful in sanctuaries with large and/or complex enclosures. However, the cost of camera equipment and a lack of formal evaluations regarding the use of cameras in sanctuary settings make it challenging for facilities to decide whether and how to implement this technology. To address this, we evaluated the feasibility of using a video camera system to monitor chimpanzees at Chimp Haven. We viewed a group of resident chimpanzees in a large forested enclosure and compared observations collected in person and with remote video cameras. We found that via camera, the observer viewed fewer chimpanzees in some outdoor locations (GLMM post hoc test: est. = 1.4503, SE = 0.1457, Z = 9.951, p < 0.001) and identified a lower proportion of chimpanzees (GLMM post hoc test: est. = -2.17914, SE = 0.08490, Z = -25.666, p < 0.001) compared to in-person observations. However, the observer could view the 2 ha enclosure 15 times faster by camera compared to in person. In addition to these results, we provide recommendations to animal facilities considering the installation of a video camera system. Despite some limitations of remote monitoring, we posit that there are substantial benefits of using camera systems in sanctuaries to facilitate animal care and observational research. © 2018 Wiley Periodicals, Inc.

  2. 4D cone beam CT phase sorting using high frequency optical surface measurement during image guided radiotherapy

    NASA Astrophysics Data System (ADS)

    Price, G. J.; Marchant, T. E.; Parkhurst, J. M.; Sharrock, P. J.; Whitfield, G. A.; Moore, C. J.

    2011-03-01

    In image guided radiotherapy (IGRT) two of the most promising recent developments are four dimensional cone beam CT (4D CBCT) and dynamic optical metrology of patient surfaces. 4D CBCT is now becoming commercially available and finds use in treatment planning and verification, and whilst optical monitoring is a young technology, its ability to measure during treatment delivery without dose consequences has led to its uptake in many institutes. In this paper, we demonstrate the use of dynamic patient surfaces, simultaneously captured during CBCT acquisition using an optical sensor, to phase sort projection images for 4D CBCT volume reconstruction. The dual modality approach we describe means that in addition to 4D volumetric data, the system provides correlated wide field measurements of the patient's skin surface with high spatial and temporal resolution. As well as the value of such complementary data in verification and motion analysis studies, it introduces flexibility into the acquisition of the signal required for phase sorting. The specific technique used may be varied according to individual patient circumstances and the imaging target. We give details of three different methods of obtaining a suitable signal from the optical surfaces: simply following the motion of triangulation spots used to calibrate the surfaces' absolute height; monitoring the surface height in a single, arbitrarily selected, camera pixel; and tracking, in three dimensions, the movement of a surface feature. In addition to describing the system and methodology, we present initial results from a case study oesophageal cancer patient.
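    The retrospective phase-sorting step can be sketched as follows: a periodic surface-height signal sampled at each projection is peak-detected, each projection is assigned a phase in [0, 1) between successive peaks, and projections are grouped into phase bins for reconstruction. This is a simplified stand-in with a synthetic breathing signal, not the authors' code:

```python
# Retrospective phase sorting of CBCT projections from a 1-D surface
# signal: detect breathing peaks, interpolate phase, bin projections.
import numpy as np

t = np.arange(0, 20, 0.1)                     # projection timestamps (s)
height = np.sin(2 * np.pi * t / 4.0)          # synthetic 4 s breathing cycle

# Crude peak detection: strict local maxima of the surface height.
peaks = np.where((height[1:-1] > height[:-2]) &
                 (height[1:-1] > height[2:]))[0] + 1

# Linear phase from 0 to 1 between consecutive peaks (samples before the
# first peak and after the last keep phase 0 in this sketch).
phase = np.zeros_like(t)
for a, b in zip(peaks[:-1], peaks[1:]):
    phase[a:b] = np.linspace(0.0, 1.0, b - a, endpoint=False)

n_bins = 10
bins = (phase * n_bins).astype(int)           # phase bin per projection
print(len(peaks), bins.min(), bins.max())
```

    Each bin's projections are then reconstructed separately, giving one CBCT volume per respiratory phase; the paper's contribution is obtaining the signal optically, from triangulation spots, a single camera pixel, or a tracked surface feature.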

  3. A randomized comparison of laparoscopic, magnetically anchored, and flexible endoscopic cameras in performance and workload between laparoscopic and single-incision surgery.

    PubMed

    Arain, Nabeel A; Cadeddu, Jeffrey A; Best, Sara L; Roshek, Thomas; Chang, Victoria; Hogg, Deborah C; Bergs, Richard; Fernandez, Raul; Webb, Erin M; Scott, Daniel J

    2012-04-01

    This study aimed to evaluate the surgeon performance and workload of a next-generation magnetically anchored camera compared with laparoscopic and flexible endoscopic imaging systems for laparoscopic and single-site laparoscopy (SSL) settings. The cameras included a 5-mm 30° laparoscope (LAP), a magnetically anchored (MAGS) camera, and a flexible endoscope (ENDO). The three camera systems were evaluated using standardized optical characteristic tests. Each system was used in random order for visualization during performance of a standardized suturing task by four surgeons. Each participant performed three to five consecutive repetitions as a surgeon and also served as a camera driver for other surgeons. Ex vivo testing was conducted in a laparoscopic multiport and SSL layout using a box trainer. In vivo testing was performed only in the multiport configuration and used a previously validated live porcine Nissen model. Optical testing showed superior resolution for MAGS at 5 and 10 cm compared with LAP or ENDO. The field of view ranged from 39 to 99°. The depth of focus was almost three times greater for MAGS (6-270 mm) than for LAP (2-88 mm) or ENDO (1-93 mm). Both ex vivo and in vivo multiport combined surgeon performance was significantly better for LAP than for ENDO, but no significant differences were detected for MAGS. For multiport testing, workload ratings were significantly less ex vivo for LAP and MAGS than for ENDO and less in vivo for LAP than for MAGS or ENDO. For ex vivo SSL, no significant performance differences were detected, but camera drivers rated the workload significantly less for MAGS than for LAP or ENDO. The data suggest that the improved imaging element of the next-generation MAGS camera has optical and performance characteristics that meet or exceed those of the LAP or ENDO systems and that the MAGS camera may be especially useful for SSL. Further refinements of the MAGS camera are encouraged.

  4. The GCT camera for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Lapington, J. S.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Bose, R.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Buckley, J.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jankowsky, D.; Jegouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Laporte, P.; Leach, S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Minaya, I. A.; Mohrmann, L.; Molyneux, P.; Moore, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayede, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Varner, G.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium

    2017-12-01

    The Gamma Cherenkov Telescope (GCT) is one of the designs proposed for the Small Sized Telescope (SST) section of the Cherenkov Telescope Array (CTA). The GCT uses dual-mirror optics, resulting in a compact telescope with good image quality and a large field of view with a smaller, more economical, camera than is achievable with conventional single mirror solutions. The photon counting GCT camera is designed to record the flashes of atmospheric Cherenkov light from gamma and cosmic ray initiated cascades, which last only a few tens of nanoseconds. The GCT optics require that the camera detectors follow a convex surface with a radius of curvature of 1 m and a diameter of 35 cm, which is approximated by tiling the focal plane with 32 modules. The first camera prototype is equipped with multi-anode photomultipliers, each comprising an 8×8 array of 6×6 mm2 pixels to provide the required angular scale, adding up to 2048 pixels in total. Detector signals are shaped, amplified and digitised by electronics based on custom ASICs that provide digitisation at 1 GSample/s. The camera is self-triggering, retaining images where the focal plane light distribution matches predefined spatial and temporal criteria. The electronics are housed in the liquid-cooled, sealed camera enclosure. LED flashers at the corners of the focal plane provide a calibration source via reflection from the secondary mirror. The first GCT camera prototype underwent preliminary laboratory tests last year. In November 2015, the camera was installed on a prototype GCT telescope (SST-GATE) in Paris and was used to successfully record the first Cherenkov light of any CTA prototype, and the first Cherenkov light seen with such a dual-mirror optical system. A second full-camera prototype based on Silicon Photomultipliers is under construction. Up to 35 GCTs are envisaged for CTA.

  5. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land or water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure a complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
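    The ray bending that invalidates the pinhole model follows Snell's law at each housing interface. A minimal sketch of the per-ray refraction such a FOV simulator must apply, with assumed refractive indices:

```python
# Snell's law at each interface of a flat-port housing
# (air inside -> glass port -> water outside).
import math

def refract(theta_in, n_in, n_out):
    """Return the refracted angle (rad), or None past the critical angle."""
    s = n_in * math.sin(theta_in) / n_out
    return math.asin(s) if abs(s) <= 1.0 else None

n_air, n_glass, n_water = 1.0, 1.52, 1.33     # illustrative indices
theta_air = math.radians(40)                  # ray angle inside the housing
theta_glass = refract(theta_air, n_air, n_glass)
theta_water = refract(theta_glass, n_glass, n_water)

# In water the ray travels at a shallower angle, so the effective
# field of view of each camera shrinks.
print(round(math.degrees(theta_water), 2))
```

    Tracing a grid of such rays per camera is what lets the simulator check whether a candidate housing geometry still covers the full hemisphere underwater.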

  6. Concave Surround Optics for Rapid Multi-View Imaging

    DTIC Science & Technology

    2006-11-01

    ...flexibility, large camera arrays are typically expensive and require significant effort to calibrate temporally, geometrically and chromatically ... hard to assemble and calibrate. In this paper we present an optical system capable of rapidly moving the viewpoint around a scene. Our system ... thus is amenable to capturing dynamic events, avoiding the need to construct and calibrate an array of cameras. We demonstrate the system with a high ...

  7. Research into a Single-aperture Light Field Camera System to Obtain Passive Ground-based 3D Imagery of LEO Objects

    NASA Astrophysics Data System (ADS)

    Bechis, K.; Pitruzzello, A.

    2014-09-01

    This presentation describes our ongoing research into using a ground-based light field camera to obtain passive, single-aperture 3D imagery of LEO objects. Light field cameras are an emerging and rapidly evolving technology for passive 3D imaging with a single optical sensor. The cameras use an array of lenslets placed in front of the camera focal plane, which provides angle of arrival information for light rays originating from across the target, allowing range to target and 3D image to be obtained from a single image using monocular optics. The technology, which has been commercially available for less than four years, has the potential to replace dual-sensor systems such as stereo cameras, dual radar-optical systems, and optical-LIDAR fused systems, thus reducing size, weight, cost, and complexity. We have developed a prototype system for passive ranging and 3D imaging using a commercial light field camera and custom light field image processing algorithms. Our light field camera system has been demonstrated for ground-target surveillance and threat detection applications, and this paper presents results of our research thus far into applying this technology to the 3D imaging of LEO objects. The prototype 3D imaging camera system developed by Northrop Grumman uses a Raytrix R5 C2GigE light field camera connected to a Windows computer with an nVidia graphics processing unit (GPU). The system has a frame rate of 30 Hz, and a software control interface allows for automated camera triggering and light field image acquisition to disk. Custom image processing software then performs the following steps: (1) image refocusing, (2) change detection, (3) range finding, and (4) 3D reconstruction. In Step (1), a series of 2D images are generated from each light field image; the 2D images can be refocused at up to 100 different depths. Currently, steps (1) through (3) are automated, while step (4) requires some user interaction. 
A key requirement for light field camera operation is that the target must be within the near-field (Fraunhofer distance) of the collecting optics. For example, in visible light the near-field of a 1-m telescope extends out to about 3,500 km, while the near-field of the AEOS telescope extends out over 46,000 km. For our initial proof of concept, we have integrated our light field camera with a 14-inch Meade LX600 advanced coma-free telescope, to image various surrogate ground targets at up to tens of kilometers range. Our experiments with the 14-inch telescope have assessed factors and requirements that are traceable and scalable to a larger-aperture system that would have the near-field distance needed to obtain 3D images of LEO objects. The next step would be to integrate a light field camera with a 1-m or larger telescope and evaluate its 3D imaging capability against LEO objects. 3D imaging of LEO space objects with light field camera technology can potentially provide a valuable new tool for space situational awareness, especially for those situations where laser or radar illumination of the target objects is not feasible.
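    The near-field limits quoted above follow from the Fraunhofer distance d = 2D²/λ for aperture diameter D and wavelength λ. A quick check, assuming λ ≈ 550 nm and the 3.6 m AEOS aperture (both assumptions for illustration):

```python
# Fraunhofer (near-field) distance d = 2*D**2 / wavelength,
# converted to kilometres.
def fraunhofer_km(aperture_m, wavelength_m=550e-9):
    return 2 * aperture_m ** 2 / wavelength_m / 1000.0

print(round(fraunhofer_km(1.0)))    # ~3,600 km for a 1 m telescope
print(round(fraunhofer_km(3.6)))    # ~47,000 km for a 3.6 m aperture
```

    These figures match the text's ~3,500 km and "over 46,000 km" to within the uncertainty in the assumed wavelength, and show why a 1 m or larger aperture is needed to place LEO objects inside the near field.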

  8. PRISM Spectrograph Optical Design

    NASA Technical Reports Server (NTRS)

    Chipman, Russell A.

    1995-01-01

    The objective of this contract is to explore optical design concepts for the PRISM spectrograph and produce a preliminary optical design. An exciting optical configuration has been developed which will allow both wavelength bands to be imaged onto the same detector array. At present the optical design is only partially complete because PRISM will require a fairly elaborate optical system to meet its specification for throughput (area*solid angle). The most complex part of the design, the spectrograph camera, is complete, providing proof of principle that a feasible design is attainable. This camera requires 3 aspheric mirrors to fit inside the 20x60 cm cross-section package. A complete design with reduced throughput (1/9th) has been prepared. The design documents the optical configuration concept. A suitable dispersing prism material, CdTe, has been identified for the prism spectrograph, after a comparison of many materials.

  9. Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge coupled device. The camera consists of a X-ray sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.

  10. Optical design and development of a snapshot light-field laryngoscope

    NASA Astrophysics Data System (ADS)

    Zhu, Shuaishuai; Jin, Peng; Liang, Rongguang; Gao, Liang

    2018-02-01

    The convergence of recent advances in optical fabrication and digital processing yields a new generation of imaging technology: light-field (LF) cameras, which bridge the realms of applied mathematics, optics, and high-performance computing. Here, for the first time, we introduce the paradigm of LF imaging into laryngoscopy. The resultant probe can image the three-dimensional shape of the vocal folds within a single camera exposure. Furthermore, to improve the spatial resolution, we developed an image fusion algorithm, providing a simple solution to a long-standing problem in LF imaging.

  11. Continuous monitoring of Hawaiian volcanoes with thermal cameras

    USGS Publications Warehouse

    Patrick, Matthew R.; Orr, Tim R.; Antolik, Loren; Lee, Robert Lopaka; Kamibayashi, Kevan P.

    2014-01-01

    Continuously operating thermal cameras are becoming more common around the world for volcano monitoring, and offer distinct advantages over conventional visual webcams for observing volcanic activity. Thermal cameras can sometimes “see” through volcanic fume that obscures views to visual webcams and the naked eye, and often provide a much clearer view of the extent of high temperature areas and activity levels. We describe a thermal camera network recently installed by the Hawaiian Volcano Observatory to monitor Kīlauea’s summit and east rift zone eruptions (at Halema‘uma‘u and Pu‘u ‘Ō‘ō craters, respectively) and to keep watch on Mauna Loa’s summit caldera. The cameras are long-wave, temperature-calibrated models protected in custom enclosures, and often positioned on crater rims close to active vents. Images are transmitted back to the observatory in real-time, and numerous Matlab scripts manage the data and provide automated analyses and alarms. The cameras have greatly improved HVO’s observations of surface eruptive activity, which includes highly dynamic lava lake activity at Halema‘uma‘u, major disruptions to Pu‘u ‘Ō‘ō crater and several fissure eruptions.

  12. A novel optical system design of light field camera

    NASA Astrophysics Data System (ADS)

    Wang, Ye; Li, Wenhua; Hao, Chenyang

    2016-01-01

    The optical system of a light field camera usually adopts a main lens - Micro Lens Array (MLA) - imaging sensor structure, in which the MLA is the most important part: it collects and records the amplitude and phase information of the light field. In this paper, a novel optical system structure is proposed. The novel optical system is based on the 4f optical structure, and a micro-aperture array (MAA) is used instead of the MLA to acquire the 4D light field information. We analyze the principle by which the novel optical system acquires the light field information. In addition, a simple MAA, a line grating optical system, is designed with ZEMAX software. The novel optical system is simulated with this line grating optical system, and multiple images are obtained in the image plane. The imaging quality of the novel optical system is analyzed.

  13. Evaluation of Trail-Cameras for Analyzing the Diet of Nesting Raptors Using the Northern Goshawk as a Model

    PubMed Central

    García-Salgado, Gonzalo; Rebollo, Salvador; Pérez-Camacho, Lorenzo; Martínez-Hesterkamp, Sara; Navarro, Alberto; Fernández-Pereira, José-Manuel

    2015-01-01

    Diet studies present numerous methodological challenges. We evaluated the usefulness of commercially available trail-cameras for analyzing the diet of Northern Goshawks (Accipiter gentilis) as a model for nesting raptors during the period 2007–2011. We compared diet estimates obtained by direct camera monitoring of 80 nests with four indirect analyses of prey remains collected from the nests and surroundings (pellets, bones, feather-and-hair remains, and feather-hair-and-bone remains combined). In addition, we evaluated the performance of the trail-cameras and whether camera monitoring affected Goshawk behavior. The sensitivity of each diet-analysis method depended on prey size and taxonomic group, with no method providing unbiased estimates for all prey sizes and types. The cameras registered the greatest number of prey items and were probably the least biased method for estimating diet composition. Nevertheless this direct method yielded the largest proportion of prey unidentified to species level, and it underestimated small prey. Our trail-camera system was able to operate without maintenance for longer periods than what has been reported in previous studies with other types of cameras. Initially Goshawks showed distrust toward the cameras but they usually became habituated to its presence within 1–2 days. The habituation period was shorter for breeding pairs that had previous experience with cameras. Using trail-cameras to monitor prey provisioning to nests is an effective tool for studying the diet of nesting raptors. However, the technique is limited by technical failures and difficulties in identifying certain prey types. Our study also shows that cameras can alter adult Goshawk behavior, an aspect that must be controlled to minimize potential negative impacts. PMID:25992956

  15. Pixel-wise deblurring imaging system based on active vision for structural health monitoring at a speed of 100 km/h

    NASA Astrophysics Data System (ADS)

    Hayakawa, Tomohiko; Moko, Yushi; Morishita, Kenta; Ishikawa, Masatoshi

    2018-04-01

    In this paper, we propose a pixel-wise deblurring imaging (PDI) system based on active vision that compensates for the blur caused by high-speed one-dimensional motion between a camera and a target. The optical axis is controlled by back-and-forth motion of a galvanometer mirror to compensate for the motion. The high-spatial-resolution images captured by our system during high-speed motion are useful for efficient and precise visual inspection, such as visually judging abnormal parts of a tunnel surface to prevent accidents; hence, we applied the PDI system to structural health monitoring. By mounting the system on a vehicle in a tunnel, we confirmed significant improvement in image quality for submillimeter black-and-white stripes and real tunnel-surface cracks at a speed of 100 km/h.
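    A back-of-envelope estimate of the mirror motion such a system must generate during each exposure. The camera-to-surface distance is an assumption for illustration, not a figure from the paper:

```python
# To freeze a surface at distance d passing at speed v, the optical axis
# must sweep at angular rate v/d during the exposure (small-angle
# approximation). A galvanometer mirror deflects the optical axis by
# twice its mechanical angle, so the mirror rate is half of that.
v = 100 / 3.6                      # 100 km/h in m/s
d = 5.0                            # camera-to-surface distance (assumed), m
omega_optical = v / d              # required optical angular rate, rad/s
omega_mirror = omega_optical / 2   # mechanical rate of the galvo mirror
print(round(omega_optical, 2), round(omega_mirror, 2))
```

    The back-and-forth motion then consists of this constant-rate sweep during each exposure followed by a fast return before the next frame.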

  16. Development of an Image Colorimeter for Noncontact Skin Color Measurement and Application to the Dermatological Treatment

    NASA Astrophysics Data System (ADS)

    Akimoto, Makio; Chen, Yu; Miyazaki, Michio; Yamashita, Toyonobu; Miyakawa, Michio; Hata, Mieko

    The skin is unique as an organ that is highly accessible to direct visual inspection with light. Visual inspection of cutaneous morphology is the mainstay of clinical dermatology, but relies heavily on subjective assessment by skilled dermatologists. We present an imaging colorimeter for non-contact skin color measurement and some experimental results obtained with the instrument. The system comprises a video camera, a light source, a real-time image processing board, a magneto-optical disk, and a personal computer that controls the entire system. The CIE L*a*b* uniform color space is used. The system is used for monitoring in several clinical diagnoses. The instrument is non-contact, easy to operate, and, unlike conventional colorimeters, has high precision. It is useful for clinical diagnosis and for monitoring and evaluating the effectiveness of treatment.

  17. Evaluation of state-of-the-art imaging systems for in vivo monitoring of retinal structure in mice: current capabilities and limitations

    NASA Astrophysics Data System (ADS)

    Zhang, Pengfei; Zam, Azhar; Pugh, Edward N.; Zawadzki, Robert J.

    2014-02-01

    Animal models of human diseases play an important role in studying and advancing our understanding of these conditions, allowing molecular-level studies of pathogenesis as well as testing of new therapies. Recently, several non-invasive imaging modalities, including the fundus camera, scanning laser ophthalmoscopy (SLO), and optical coherence tomography (OCT), have been successfully applied to monitor changes in the retinas of living animals in experiments in which a single animal is followed over a portion of its lifespan. Here we evaluate the capabilities and limitations of these three imaging modalities for visualization of specific structures in the mouse eye. Example images acquired from different types of mice are presented. Future directions of development for these instruments and potential advantages of multi-modal imaging systems are discussed as well.

  18. Cheap streak camera based on the LD-S-10 intensifier tube

    NASA Astrophysics Data System (ADS)

    Dashevsky, Boris E.; Krutik, Mikhail I.; Surovegin, Alexander L.

    1992-01-01

    Basic properties of a new streak camera and its test results are reported. To intensify images on its screen, we employed modular G1 tubes, the LD-A-1.0 and LD-A-0.33, enabling magnification of 1.0 and 0.33, respectively. If necessary, the LD-A-0.33 tube may be substituted by any other image intensifier of the LDA series, the choice to be determined by the size of the CCD matrix with fiber-optical windows. The reported camera employs a 12.5-mm-long CCD strip consisting of 1024 pixels, each 12 X 500 micrometers in size. Registered radiation was imaged on a 5 X 0.04 mm slit diaphragm tightly connected with the LD-S-10 fiber-optical input window. Electrons escaping the cathode are accelerated in a 5 kV electric field and focused onto a phosphor screen covering a fiber-optical plate as they travel between deflection plates. Sensitivity of the latter was 18 V/mm, which implies that the total deflecting voltage was 720 V per 40 mm of the screen surface, since reversed-polarity scan pulses +360 V and -360 V were applied across the deflection plates. The streak camera provides full scan times over the screen of 15, 30, 50, 100, 250, and 500 ns. Timing of the electrically or optically driven camera was done using a 10 ns step-controlled-delay (0 - 500 ns) circuit.

  19. Image-based dynamic deformation monitoring of civil engineering structures from long ranges

    NASA Astrophysics Data System (ADS)

    Ehrhart, Matthias; Lienhart, Werner

    2015-02-01

    In this paper, we report on the vibration and displacement monitoring of civil engineering structures using a state-of-the-art image assisted total station (IATS) and passive target markings. By utilizing the telescope camera of the total station, it is possible to capture video streams in real time at 10 fps with an angular resolution of approximately 2″/px. Due to the high angular resolution resulting from the 30x optical magnification of the telescope, large distances to the monitored object are possible. The laser distance measurement unit integrated in the total station allows the camera's focus position to be set precisely and relates the angular quantities gained from image processing to units of length. To accurately measure the vibrations and displacements of civil engineering structures, we use circular target markings rigidly attached to the object. The computation of the targets' centers is performed by a least squares adjustment of an ellipse according to the Gauß-Helmert model, from which the parameters of the ellipse and their standard deviations are derived. In laboratory experiments, we show that movements can be detected with an accuracy of better than 0.2 mm for single frames and distances up to 30 m. For static applications, where many video frames can be averaged, accuracies of better than 0.05 mm are possible. In a field test on a life-size footbridge, we compare the vibrations measured by the IATS to reference values derived from accelerometer measurements.
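The conversion from pixel shifts to metric displacements that the integrated distance measurement enables can be sketched with the small-angle relation d = D·θ. The 2″/px angular resolution and 30 m distance are taken from the abstract; the function itself is illustrative, not the authors' implementation.

```python
import math

ARCSEC_PER_PX = 2.0  # approximate angular resolution of the telescope camera

def pixel_shift_to_mm(shift_px, distance_m, arcsec_per_px=ARCSEC_PER_PX):
    """Convert a target-center shift in pixels to a metric displacement
    using the small-angle relation d = D * theta (theta in radians)."""
    theta = shift_px * arcsec_per_px * math.pi / (180 * 3600)  # arcsec -> rad
    return distance_m * theta * 1000.0  # metres -> millimetres
```

At the 30 m range quoted above, one pixel corresponds to roughly 0.29 mm, which makes the reported sub-pixel accuracies of 0.2 mm (single frame) and 0.05 mm (averaged) plausible.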

  20. An Innovative Procedure for Calibration of Strapdown Electro-Optical Sensors Onboard Unmanned Air Vehicles

    PubMed Central

    Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio; Rispoli, Attilio

    2010-01-01

    This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements under static conditions taken by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms). PMID:22315559

  1. Initial Observations of Micropulse Elongation of Electron Beams in a SCRF Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lumpkin, A. H.; Thurman-Keup, R.; Edstrom Jr., D.

    2016-10-09

    Commissioning of the SCRF accelerator at the Fermilab Accelerator Science and Technology (FAST) Facility has included the implementation of a versatile bunch-length monitor located after the 4-dipole chicane bunch compressor for electron beam energies of 20-50 MeV and integrated charges in excess of 10 nC. The team initially used a Hamamatsu C5680 synchroscan streak camera to assess the effects of space charge on the electron beam bunch lengths. An Al-coated Si screen was used to generate optical transition radiation (OTR) resulting from the beam’s interaction with the screen. The chicane bypass beamline allowed measurement of the bunch length without the compression stage at the downstream beamline location using OTR and the streak camera. We have observed electron beam bunch lengths from 5 to 16 ps (sigma) for micropulse charges of 60 pC to 800 pC, respectively. We also report a compressed sub-ps micropulse case.

  2. Phase and amplitude wave front sensing and reconstruction with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Nelson, William; Davis, Christopher C.

    2014-10-01

    A plenoptic camera is a camera that can retrieve the direction and intensity distribution of the light rays it collects, allowing multiple reconstruction functions such as refocusing at different depths and 3D microscopy. Its principle is to add a micro-lens array to a traditional high-resolution camera to form a semi-camera array that preserves redundant intensity distributions of the light field and facilitates back-tracing of rays through geometric knowledge of its optical components. Though designed to process incoherent images, we found that the plenoptic camera shows high potential in coherent illumination cases, such as sensing both the amplitude and phase information of a distorted laser beam. Based on our earlier introduction of a prototype modified plenoptic camera, we have developed the complete algorithm to reconstruct the wavefront of the incident light field. In this paper the algorithm and experimental results are demonstrated, and an improved version of this modified plenoptic camera is discussed. As a result, our modified plenoptic camera can serve as an advanced wavefront sensor compared with traditional Shack-Hartmann sensors in handling complicated cases such as coherent illumination in strong turbulence, where interference and discontinuity of wavefronts are common. Especially in wave propagation through atmospheric turbulence, this camera should provide a much more precise description of the light field, which would guide adaptive optics systems in making intelligent analyses and corrections.
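The slope-based processing that a Shack-Hartmann sensor performs, and which the modified plenoptic camera generalizes, can be illustrated with a minimal sketch: each sub-aperture's centroid shift gives a local wavefront slope, and integrating slopes reconstructs the optical path difference. This is a 1D zonal toy model under hypothetical parameters, not the authors' back-tracing algorithm.

```python
import numpy as np

def slopes_from_centroids(dx, dy, f_mla):
    """Local wavefront slopes behind one micro-lens: the centroid shift
    (dx, dy) of its sub-image divided by the micro-lens focal length."""
    return dx / f_mla, dy / f_mla

def integrate_1d(slopes, pitch):
    """Zonal reconstruction along one row of sub-apertures: a cumulative
    sum of slope * pitch gives the optical path difference at each node."""
    return np.concatenate([[0.0], np.cumsum(slopes) * pitch])
```

A uniform tilt (constant slope across all sub-apertures) reconstructs, as expected, to a linear phase ramp.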

  3. Optimizing Optics For Remotely Controlled Underwater Vehicles

    NASA Astrophysics Data System (ADS)

    Billet, A. B.

    1984-09-01

    The past decade has shown a dramatic increase in the use of unmanned tethered vehicles in marine fields worldwide. These vehicles are used for inspection, debris removal and object retrieval. With advanced robotic technology, remotely operated vehicles (ROVs) are now able to perform a variety of jobs previously accomplished only by divers. The ROVs can be used at greater depths and for riskier jobs, and safety to the diver is increased, freeing him for safer, more cost-effective tasks requiring human capabilities. Furthermore, ROV operation becomes more cost-effective as work depth increases. At 1000 feet, a diver's 10 minutes of work can cost over $100,000 including support personnel, while ROV operating costs might be 1/20 of the diver cost per day, since the cost of ROV operation does not increase with depth as it does for divers. For ROV operation, the television lens must be as good as the human eye, with better light-gathering capability than the human eye. The RCV-150 system is an example of these advanced technology vehicles. To meet the requirements of maneuverability and unusual inspection tasks, a responsive, high-performance, compact vehicle was developed. The RCV-150 viewing subsystem consists of a television camera, lights, and topside monitors. The vehicle uses a low-light-level Newvicon television camera. The camera is equipped with a power-down iris that closes for burn protection when the power is off. The camera can pan ±50 degrees and tilt ±85 degrees on command from the surface. Four independently controlled 250 watt quartz halogen flood lamps illuminate the viewing area as required; in addition, two 250 watt spotlights are fitted. A console nine-inch CRT monitor provides real-time camera pictures for the operator. The RCV-150 vehicle component system consists of the vehicle structure, the vehicle electronics, and the hydraulic system which powers the thruster assemblies and the manipulator. For this vehicle, a lightweight, high-response hydraulic system was developed in a very small package.

  4. The sequence measurement system of the IR camera

    NASA Astrophysics Data System (ADS)

    Geng, Ai-hui; Han, Hong-xia; Zhang, Hai-bo

    2011-08-01

    IR cameras are currently in broad use in opto-electronic tracking, opto-electronic measurement, fire control, and opto-electronic countermeasures, but the output timing of most IR cameras applied in practice is complex, and the timing documentation supplied by the manufacturers is not detailed. To meet the need of downstream image transmission and image processing systems for a detailed timing specification, a sequence measurement system for IR cameras was designed, and a detailed timing measurement procedure for the applied IR camera was carried out. FPGA programming combined with SignalTap online observation is applied in the measurement system; the precise timing of the IR camera's output signal is obtained, and detailed documentation is supplied to the image transmission system, image processing system, and so on. The sequence measurement system consists of a CameraLink input interface, an LVDS input interface, an FPGA, and a CameraLink output interface, of which the FPGA is the key component. The system accepts video signals in both CameraLink and LVDS formats, and because image processing and image memory cards generally use CameraLink as their input interface, the output of the measurement system is also designed as a CameraLink interface. The system thus measures the IR camera's timing and simultaneously performs interface conversion for some cameras. Inside the FPGA, the sequence measurement program, pixel clock adjustment, SignalTap file configuration, and SignalTap online observation are integrated to realize precise measurement of the IR camera's timing. 
The sequence measurement program, written in Verilog and combined with the SignalTap online observation tool, counts the number of lines in one frame and the number of pixels in one line, and also measures the line offset and row offset of the image. For the complex timing of the IR camera's output signal, the sequence measurement system accurately measures the timing of the camera applied in the project, supplies a detailed timing document to downstream systems such as the image processing and image transmission systems, and gives the concrete parameters of fval, lval, pixclk, line offset, and row offset. Experiments show that the sequence measurement system obtains precise timing measurements and works stably, laying a foundation for the downstream systems.
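The counter-based measurement the FPGA performs can be mimicked in a few lines of software. This is a toy model for illustration only; the real implementation is Verilog logic clocked by the pixel clock, and the signal names follow the CameraLink convention (fval = frame valid, lval = line valid).

```python
def measure_timing(fval, lval):
    """Given per-pixel-clock samples (0/1) of the frame-valid and
    line-valid signals, count lines per frame and pixels per line,
    mimicking the FPGA's edge-detecting counters."""
    lines_per_frame = 0
    pixels_per_line = 0
    prev_l = 0
    run = 0
    for f, l in zip(fval, lval):
        if f and l and not prev_l:        # rising edge of lval inside fval
            lines_per_frame += 1
        if f and l:
            run += 1                       # count active pixels on this line
        elif run:
            pixels_per_line = max(pixels_per_line, run)
            run = 0
        prev_l = l
    if run:
        pixels_per_line = max(pixels_per_line, run)
    return lines_per_frame, pixels_per_line
```

Feeding it a frame-valid window containing two three-pixel line-valid bursts returns two lines of three pixels each; the line and row offsets would be measured analogously by counting clocks between fval and the first lval edge.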

  5. The Ringo2 Optical Polarisation Catalogue of 13 High-Energy Blazars

    NASA Astrophysics Data System (ADS)

    Barres de Almeida, Ulisses; Jermak, Helen; Mundell, Carole; Lindfors, Elina; Nilsson, Kari; Steele, Iain

    2015-08-01

    We present the findings of the Ringo2 three-year survey of 13 blazars (3 FSRQs and 10 BL Lacs), with regular coverage and a reasonably fast cadence of one to three observations per week. Ringo2 was installed on the Liverpool Robotic Telescope (LT) on the Canary Island of La Palma between 2009 and 2012 and monitored thirteen high-energy-emitting blazars in the northern sky. The objects selected, as well as the observational strategy, were tuned to maximise the synergies with high-energy X- to gamma-ray observations. This sample therefore stands out as a well-sampled, long-term view of high-energy AGN jets in polarised optical light. Over half of the sources exhibited an increase in optical flux during this period, and almost a quarter were observed in outburst. We compare the optical data to gamma-ray (Fermi/LAT) and X-ray data during these periods of outburst. In this talk we present the data obtained for all sources over the lifetime of Ringo2, together with additional optical data from the KVA telescope and the SkyCamZ wide-field camera (on the LT), and explore the relationship between the rate of change of polarisation angle (dEVPA/dMJD), flux, and polarisation degree, along with cross-correlation comparisons of optical and high-energy flux.
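Computing dEVPA/dMJD requires first resolving the 180° ambiguity inherent in polarisation position angles. A common convention in blazar monitoring, assumed here and not necessarily the authors' exact method, shifts each point by multiples of 180° so that consecutive differences stay within ±90°:

```python
def unwrap_evpa(angles_deg):
    """Resolve the n*180 deg ambiguity in the electric-vector position
    angle by shifting each point so consecutive differences stay within
    +/-90 deg (a common convention, assumed for illustration)."""
    out = [angles_deg[0]]
    for a in angles_deg[1:]:
        prev = out[-1]
        while a - prev > 90:
            a -= 180
        while a - prev < -90:
            a += 180
        out.append(a)
    return out

def devpa_dt(mjd, evpa_deg):
    """Rate of EVPA change between consecutive epochs (deg/day)."""
    ev = unwrap_evpa(evpa_deg)
    return [(e2 - e1) / (t2 - t1)
            for (t1, e1), (t2, e2) in zip(zip(mjd, ev), zip(mjd[1:], ev[1:]))]
```

For example, an apparent jump from 170° to 10° between two epochs is read as a smooth rotation to 190° rather than a 160° swing.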

  6. Correlation of ERTS-1 and aircraft optical data with water quality parameters of Charlotte Amalie Harbor, St. Thomas, Virgin Islands

    NASA Technical Reports Server (NTRS)

    Coulbourn, W. C.; Egan, W. G. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Attempts to correlate optical aircraft remote sensing of water quality with the optical data from the ERTS-1 satellite using calibrated imagery of Charlotte Amalie harbor, St. Thomas, Virgin Islands are reported. The harbor at Charlotte Amalie has a concentration of a number of factors affecting water quality: untreated sewage, land runoff, and sediment from navigation and dredging operations. Calibration procedures have been originated and applied to ERTS-1 and I2S camera imagery. The results indicate that the ERTS-1 and I2S imagery are correlated with optical in situ measurements of the harbor water. The aircraft green photographic and ERTS-1 MSS-4 bands have been found most suitable for monitoring the scattered light levels under the conditions of the investigation. The chemical parameters of the harbor water were found to be correlated to the optical properties for two stations investigated in detail. The biological properties of the harbor water (chlorophyll and carotenoids), correlate inversely with the optical data near the pollution sources compared to further away. Calibration procedures developed in this investigation were essential to the interpretation of the photographic and ERTS-1 photometric responses.

  7. Smartphone Fundus Photography.

    PubMed

    Nazari Khanamiri, Hossein; Nakatsuka, Austin; El-Annan, Jaafar

    2017-07-06

    Smartphone fundus photography is a simple technique to obtain ocular fundus pictures using a smartphone camera and a conventional handheld indirect ophthalmoscopy lens. This technique is indispensable when picture documentation of the optic nerve, retina, and retinal vessels is necessary but a fundus camera is not available. The main advantage of this technique is the widespread availability of smartphones, which allows documentation of macula and optic nerve changes in many settings where it was not previously possible. Following the well-defined steps detailed here, such as proper alignment of the phone camera, handheld lens, and the patient's pupil, is the key to obtaining a clear retina picture free of interfering light reflections and aberrations. In this paper, the optical principles of indirect ophthalmoscopy and fundus photography are reviewed first. Then, the step-by-step method to record a good-quality retinal image using a smartphone is explained.

  8. Geometric and Optic Characterization of a Hemispherical Dome Port for Underwater Photogrammetry

    PubMed Central

    Menna, Fabio; Nocerino, Erica; Fassi, Francesco; Remondino, Fabio

    2016-01-01

    The popularity of automatic photogrammetric techniques has promoted many experiments in underwater scenarios leading to quite impressive visual results, even by non-experts. Despite these achievements, a deep understanding of camera and lens behaviors as well as optical phenomena involved in underwater operations is fundamental to better plan field campaigns and anticipate the achievable results. The paper presents a geometric investigation of a consumer grade underwater camera housing, manufactured by NiMAR and equipped with a 7′′ dome port. After a review of flat and dome ports, the work analyzes, using simulations and real experiments, the main optical phenomena involved when operating a camera underwater. Specific aspects which deal with photogrammetric acquisitions are considered with some tests in laboratory and in a swimming pool. Results and considerations are shown and commented. PMID:26729133

  9. Designing the optimal semi-warm NIR spectrograph for SALT via detailed thermal analysis

    NASA Astrophysics Data System (ADS)

    Wolf, Marsha J.; Sheinis, Andrew I.; Mulligan, Mark P.; Wong, Jeffrey P.; Rogers, Allen

    2008-07-01

    The near infrared (NIR) upgrade to the Robert Stobie Spectrograph (RSS) on the Southern African Large Telescope (SALT), RSS/NIR, extends the spectral coverage of all modes of the optical spectrograph. The RSS/NIR is a low to medium resolution spectrograph with broadband, spectropolarimetric, and Fabry-Perot imaging capabilities. The optical and NIR arms can be used simultaneously to extend spectral coverage from 3200 Å to approximately 1.6 μm. Both arms utilize high efficiency volume phase holographic gratings via articulating gratings and cameras. The NIR camera incorporates a HAWAII-2RG detector with an Epps optical design consisting of 6 spherical elements and providing subpixel rms image sizes of 7.5 +/- 1.0 μm over all wavelengths and field angles. The NIR spectrograph is semi-warm, sharing a common slit plane and partial collimator with the optical arm. A pre-dewar, cooled to below ambient temperature, houses the final NIR collimator optic, the grating/Fabry-Perot etalon, the polarizing beam splitter, and the first three camera optics. The last three camera elements, blocking filters, and detector are housed in a cryogenically cooled dewar. The semi-warm design concept has long been proposed as an economical way to extend optical instruments into the NIR, however, success has been very limited. A major portion of our design effort entails a detailed thermal analysis using non-sequential ray tracing to interactively guide the mechanical design and determine a truly realizable long wavelength cutoff over which astronomical observations will be sky-limited. In this paper we describe our thermal analysis, design concepts for the staged cooling scheme, and results to be incorporated into the overall mechanical design and baffling.

  10. Optomechanical stability design of space optical mapping camera

    NASA Astrophysics Data System (ADS)

    Li, Fuqiang; Cai, Weijun; Zhang, Fengqin; Li, Na; Fan, Junjie

    2018-01-01

    According to the interior orientation element and imaging quality requirements that mapping applications place on a mapping camera, and combined with an off-axis three-mirror anastigmat (TMA) system, the high-stability optomechanical design of a space optical mapping camera is introduced in this paper. The configuration is a coaxial TMA system used in an off-axis situation. Firstly, the overall optical arrangement is described, and an overview of the optomechanical packaging is provided. Zerodur glass, carbon fiber composite and carbon-fiber-reinforced silicon carbide (C/SiC) are widely used in the optomechanical structure, because their low coefficients of thermal expansion (CTE) reduce the thermal sensitivity of the mirrors and focal plane. Flexible and unloading supports are used in the mirror and camera supporting structures. The use of epoxy structural adhesive for bonding the optics to the metal structure is also introduced. The primary mirror is mounted by means of a three-point ball-joint flexure system attached to the back of the mirror. Then, in order to predict flexural displacements due to gravity, static finite element analysis (FEA) is performed on the primary mirror. The optical performance, peak-to-valley (PV) and root-mean-square (RMS) wavefront errors, is measured before and after assembly. Dynamic finite element analysis (FEA) of the whole optical arrangement is also carried out to investigate the optomechanical performance. Finally, in order to evaluate the stability of the design, thermal vacuum and vibration tests are carried out, with the Modulation Transfer Function (MTF) and the elements of interior orientation used as evaluation indices. Before and after the thermal vacuum and vibration tests, the MTF, focal distance and position of the principal point of the optical system are measured, and the results are as expected.

  11. The opto-cryo-mechanical design of the short wavelength camera for the CCAT Observatory

    NASA Astrophysics Data System (ADS)

    Parshley, Stephen C.; Adams, Joseph; Nikola, Thomas; Stacey, Gordon J.

    2014-07-01

    The CCAT observatory is a 25-m class Gregorian telescope designed for submillimeter observations that will be deployed at Cerro Chajnantor (~5600 m) in the high Atacama Desert region of Chile. The Short Wavelength Camera (SWCam) for CCAT is an integral part of the observatory, enabling the study of star formation at high and low redshifts. SWCam will be a facility instrument, available at first light and operating in the telluric windows at wavelengths of 350, 450, and 850 μm. In order to trace the large curvature of the CCAT focal plane, and to suit the available instrument space, SWCam is divided into seven sub-cameras, each configured to a particular telluric window. A fully refractive optical design in each sub-camera will produce diffraction-limited images. The material of choice for the optical elements is silicon, due to its excellent transmission in the submillimeter and its high index of refraction, enabling thin lenses of a given power. The cryostat's vacuum windows double as the sub-cameras' field lenses and are ~30 cm in diameter. The other lenses are mounted at 4 K. The sub-cameras will share a single cryostat providing thermal intercepts at 80, 15, 4, 1 and 0.1 K, with cooling provided by pulse tube cryocoolers and a dilution refrigerator. The use of the intermediate temperature stage at 15 K minimizes the load at 4 K and reduces operating costs. We discuss our design requirements, specifications, key elements and expected performance of the optical, thermal and mechanical design for the short wavelength camera for CCAT.

  12. Robotic Vehicle Communications Interoperability

    DTIC Science & Technology

    1988-08-01

    [Table excerpt, flattened by extraction] Vehicle control functions listed: engine starter (cold start), fire suppression, fording control, fuel control, fuel tank selector, garage toggle, gear selector, hazard warning. Electro-optic sensor options listed: sensor switch, video, radar, IR thermal imaging system, image intensifier, laser ranger, video camera selector (forward, stereo, rear), sensor control.

  13. Single-Fiber Optical Link For Video And Control

    NASA Technical Reports Server (NTRS)

    Galloway, F. Houston

    1993-01-01

    Single optical fiber carries control signals to remote television cameras and video signals from cameras. Fiber replaces multiconductor copper cable, with consequent reduction in size. Repeaters not needed. System works with either multimode or single-mode fibers. Nonmetallic fiber provides immunity to electromagnetic interference at suboptical frequencies and is much less vulnerable to electronic eavesdropping and lightning strikes. Multigigahertz bandwidth more than adequate for high-resolution television signals.

  14. Preliminary optical design of PANIC, a wide-field infrared camera for CAHA

    NASA Astrophysics Data System (ADS)

    Cárdenas, M. C.; Rodríguez Gómez, J.; Lenzen, R.; Sánchez-Blanco, E.

    2008-07-01

    In this paper, we present the preliminary optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Calar Alto 2.2 m telescope. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. A mosaic of four Teledyne Hawaii-2RG 2k x 2k detectors gives a field of view of 31.9 arcmin x 31.9 arcmin. This cryogenic instrument has been optimized for the Y, J, H and K bands. Special care has been taken in the selection of the standard IR materials used for the optics in order to maximize the instrument throughput and to include the z band. The main challenges of this design are: producing a well-defined internal pupil that allows the thermal background to be reduced by a cryogenic pupil stop; correcting the off-axis aberrations due to the large field available; correcting chromatic aberration over the wide spectral coverage; and allowing the introduction of narrow-band filters (~1%) into the system while minimizing degradation of the filter passband, without a collimated stage in the camera. We show the optomechanical error budget and the compensation strategy that allow our as-built design to meet these performance requirements from an optical point of view. Finally, we demonstrate the flexibility of the design by showing the performance of PANIC at the CAHA 3.5 m telescope.
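The quoted plate scale fixes the effective focal length of the optical train through the small-angle relation f = s/θ, with s the pixel pitch and θ the angle subtended per pixel. A quick check using the numbers from the abstract (illustrative arithmetic only):

```python
import math

def focal_length_from_plate_scale(pixel_um, arcsec_per_px):
    """Effective focal length (metres) implied by a plate scale:
    f = s / theta, with s the pixel pitch and theta in radians."""
    theta = arcsec_per_px * math.pi / (180 * 3600)  # arcsec -> rad
    return pixel_um * 1e-6 / theta
```

For 0.45 arcsec per 18 μm pixel this gives an effective focal length of about 8.25 m.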

  15. Multiple-aperture optical design for micro-level cameras using 3D-printing method

    NASA Astrophysics Data System (ADS)

    Peng, Wei-Jei; Hsu, Wei-Yao; Cheng, Yuan-Chieh; Lin, Wen-Lung; Yu, Zong-Ru; Chou, Hsiao-Yu; Chen, Fong-Zhi; Fu, Chien-Chung; Wu, Chong-Syuan; Huang, Chao-Tsung

    2018-02-01

    The design of an ultra-miniaturized camera, 3D-printed directly onto a complementary metal-oxide semiconductor (CMOS) imaging sensor, is presented in this paper. The 3D-printed micro-optics are manufactured by femtosecond two-photon direct laser writing, whose figure error can achieve submicron accuracy, suitable for optical systems. Because a micro-level camera measures only several hundred micrometers, its resolution is greatly reduced and strongly limited by the Nyquist frequency of the pixel pitch. To improve the resolution, a single lens can be replaced by multiple-aperture lenses with dissimilar fields of view (FOV); stitching sub-images with different FOVs can then achieve high resolution within the central region of the image. The reason is that the angular resolution of a lens with a smaller FOV is higher than that of one with a larger FOV, so after stitching, the angular resolution of the central area can be several times that of the outer area. For the same image circle, the image quality of the central area of the multi-lens system is significantly superior to that of a single lens. Foveated imaging via stitched FOVs breaks the resolution limitation of ultra-miniaturized imaging systems, enabling applications such as biomedical endoscopy, optical sensing, and machine vision. In this study, an ultra-miniaturized camera with multi-aperture optics is designed and simulated for optimum optical performance.
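The resolution gain from stitching a narrower-FOV sub-image into the centre of a wider one is simple sampling arithmetic: per-pixel angular sampling is FOV divided by pixel count, so a lens with one-third the FOV over the same sensor samples the centre three times more finely. All numbers below are hypothetical, chosen only to illustrate the scaling.

```python
def angular_resolution_deg_per_px(fov_deg, n_px):
    """Per-pixel angular sampling of one sub-camera: its field of view
    divided by the pixel count across the sensor."""
    return fov_deg / n_px

# Same sensor, two sub-cameras: a wide lens for context and a narrow lens
# whose image is stitched into the centre of the wide one.
wide = angular_resolution_deg_per_px(60.0, 400)    # 0.15 deg/px
narrow = angular_resolution_deg_per_px(20.0, 400)  # 0.05 deg/px
gain = wide / narrow                               # 3x finer central sampling
```

The same argument extends to any number of nested FOVs, which is what produces the foveated profile described above.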

  16. Retinal axial focusing and multi-layer imaging with a liquid crystal adaptive optics camera

    NASA Astrophysics Data System (ADS)

    Liu, Rui-Xue; Zheng, Xian-Liang; Li, Da-Yu; Xia, Ming-Liang; Hu, Li-Fa; Cao, Zhao-Liang; Mu, Quan-Quan; Xuan, Li

    2014-09-01

    With the help of adaptive optics (AO) technology, cellular-level imaging of the living human retina can be achieved. To reduce subject discomfort and avoid potential drug-induced complications, we attempted to image the retina with a dilated pupil and frozen accommodation without drugs. An optimized liquid crystal adaptive optics camera was adopted for retinal imaging. A novel eye-staring system was used for stimulating accommodation and fixating the imaging area. The illumination sources and imaging camera moved in linkage for focusing on and imaging different layers. Four subjects with diverse degrees of myopia were imaged. Based on the optical properties of the human eye, the eye-staring system reduced the defocus to less than the typical ocular depth of focus. In this way, the illumination light can be projected onto a given retinal layer precisely. Since the defocus had been compensated by the eye-staring system, the adopted 512 × 512 liquid crystal spatial light modulator (LC-SLM) corrector provided the spatial fidelity needed to fully compensate high-order aberrations. The Strehl ratio for a subject with -8 diopter myopia was improved to 0.78, close to diffraction-limited imaging. By finely adjusting the axial displacement of the illumination sources and imaging camera, cone photoreceptors, blood vessels and the nerve fiber layer were clearly imaged.
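A Strehl ratio such as the 0.78 quoted above can be related to the residual RMS wavefront error through the Maréchal approximation, a standard rule of thumb in adaptive optics. The formula is general; the specific wavelength used in the example below is an assumption for illustration, not a value from the paper.

```python
import math

def strehl_marechal(rms_wavefront_error, wavelength):
    """Marechal approximation: Strehl ratio from the residual RMS
    wavefront error (same units as wavelength).
    S ~= exp(-(2*pi*sigma/lambda)^2)."""
    return math.exp(-(2 * math.pi * rms_wavefront_error / wavelength) ** 2)

def rms_from_strehl(strehl, wavelength):
    """Invert the approximation: residual RMS error implied by a
    measured Strehl ratio."""
    return wavelength * math.sqrt(-math.log(strehl)) / (2 * math.pi)
```

Assuming, say, a 0.785 μm imaging wavelength, a Strehl ratio of 0.78 corresponds to a residual RMS wavefront error of roughly λ/13.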

  17. Development of biostereometric experiments. [stereometric camera system

    NASA Technical Reports Server (NTRS)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  18. Television monitor field shifter and an opto-electronic method for obtaining a stereo image of optimal depth resolution and reduced depth distortion on a single screen

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor)

    1989-01-01

    A method and apparatus are developed for obtaining a stereo image with reduced depth distortion and optimum depth resolution. A tradeoff between static and dynamic depth distortion and depth resolution is provided. The cameras obtaining the images for a stereo view are converged at a point behind the object to be presented in the image, and the collection-surface-to-object distance, the camera separation distance, and the focal lengths of the cameras' zoom lenses are all increased. Doubling these distances cuts the static depth distortion in half while maintaining image size and depth resolution. Dynamic depth distortion is minimized by panning the stereo view-collecting camera system about a circle which passes through the convergence point and the cameras' first nodal points. Horizontal shifting of the television fields on a television monitor brings both the monitor and the stereo views within the viewer's limit of binocular fusion.
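The depth-resolution considerations above rest on stereo triangulation. A minimal parallel-axis sketch of the underlying relations (the patented converged geometry adds corrections omitted here; all numbers are illustrative):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Parallel-axis stereo triangulation: Z = f * B / d, with the focal
    length f in pixels, baseline B in metres and disparity d in pixels."""
    return focal_px * baseline_m / disparity_px

def depth_resolution(focal_px, baseline_m, depth_m, disparity_step_px=1.0):
    """Depth change corresponding to one disparity step at depth Z:
    dZ ~= Z^2 * dd / (f * B). Larger baselines and focal lengths give
    finer depth resolution, which is why the patent scales them up."""
    return depth_m ** 2 * disparity_step_px / (focal_px * baseline_m)
```

With f = 1000 px and B = 0.1 m, a 10-pixel disparity places the object at 10 m, where one disparity step spans about 1 m of depth; doubling both B and f quarters that step.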

  19. Development of an omnidirectional gamma-ray imaging Compton camera for low-radiation-level environmental monitoring

    NASA Astrophysics Data System (ADS)

    Watanabe, Takara; Enomoto, Ryoji; Muraishi, Hiroshi; Katagiri, Hideaki; Kagaya, Mika; Fukushi, Masahiro; Kano, Daisuke; Satoh, Wataru; Takeda, Tohoru; Tanaka, Manobu M.; Tanaka, Souichi; Uchida, Tomohisa; Wada, Kiyoto; Wakamatsu, Ryo

    2018-02-01

    We have developed an omnidirectional gamma-ray imaging Compton camera for environmental monitoring at low levels of radiation. The camera consists of only six 3.5 cm CsI(Tl) scintillator cubes, each of which is read out by a super-bialkali photomultiplier tube (PMT). Our camera enables the visualization of the position of gamma-ray sources in all directions (∼4π sr) over a wide energy range between 300 and 1400 keV. The angular resolution (σ) was found to be ∼11°, which was realized using an image-sharpening technique. A high detection efficiency of 18 cps/(µSv/h) for 511 keV (1.6 cps/MBq at 1 m) was achieved, indicating the capability of this camera to visualize hotspots in areas with low-level contamination, from the order of µSv/h down to natural background levels. Our proposed technique can easily be used as a low-radiation-level imaging monitor in radiation control areas, such as medical and accelerator facilities.
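    The two quoted efficiency figures can be cross-checked against each other: dividing the activity-based efficiency by the dose-rate-based one yields the dose-rate constant implied for the 511 keV source. A sketch of that arithmetic, with an inverse-square extrapolation added as an assumption (scatter and attenuation ignored):

```python
# The two efficiency figures quoted in the abstract, 18 cps/(uSv/h) and
# 1.6 cps/MBq at 1 m, jointly imply a dose-rate constant for the source:
eff_dose = 18.0      # cps per (uSv/h)
eff_activity = 1.6   # cps per MBq at a distance of 1 m
gamma = eff_activity / eff_dose  # uSv*m^2 / (MBq*h)
print(f"implied dose-rate constant: {gamma:.3f} uSv*m^2/(MBq*h)")  # 0.089

def count_rate_from_source(activity_mbq, distance_m):
    """Expected count rate from a point source by inverse-square scaling
    of the 1 m efficiency (scatter and attenuation ignored)."""
    return eff_activity * activity_mbq / distance_m ** 2
```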

  20. Electronographic cameras for space astronomy.

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Opal, C. B.

    1972-01-01

    Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We are also developing electronographic image tubes of the conventional end-window photocathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

  1. A compact high-speed pnCCD camera for optical and x-ray applications

    NASA Astrophysics Data System (ADS)

    Ihle, Sebastian; Ordavo, Ivan; Bechteler, Alois; Hartmann, Robert; Holl, Peter; Liebel, Andreas; Meidinger, Norbert; Soltau, Heike; Strüder, Lothar; Weber, Udo

    2012-07-01

    We developed a camera with a 264 × 264 pixel pnCCD with 48 μm pixel size (thickness 450 μm) for X-ray and optical applications. It has a high quantum efficiency and can be operated at frame rates up to 400 / 1000 Hz (noise ≈ 2.5 e⁻ ENC / ≈ 4.0 e⁻ ENC). High-speed astronomical observations can be performed at low light levels. Results of test measurements will be presented. The camera is well suited for ground-based preparation measurements for future X-ray missions. For single X-ray photons, the spatial position can be determined with significant sub-pixel resolution.
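    Sub-pixel photon positions are recoverable because the charge cloud of a single X-ray photon splits across neighbouring pixels. A generic centroiding sketch (the camera's own reconstruction algorithm may differ):

```python
import numpy as np

def subpixel_position(frame):
    """Estimate a single photon's impact position with sub-pixel accuracy
    by intensity-weighted centroiding of the charge split across pixels
    (a generic technique; the camera's own algorithm may differ)."""
    total = frame.sum()
    ys, xs = np.indices(frame.shape)
    return float((ys * frame).sum() / total), float((xs * frame).sum() / total)

# A photon landing between two pixels spreads its charge cloud over both:
event = np.zeros((5, 5))
event[2, 2] = 60.0  # central pixel collects most of the charge
event[2, 3] = 40.0  # right-hand neighbour collects the rest
y, x = subpixel_position(event)
print(y, x)  # 2.0 2.4 -> position recovered between pixel centres
```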

  2. Analysis of the color rendition of flexible endoscopes

    NASA Astrophysics Data System (ADS)

    Murphy, Edward M.; Hegarty, Francis J.; McMahon, Barry P.; Boyle, Gerard

    2003-03-01

    Endoscopes are imaging devices routinely used for the diagnosis of disease within the human digestive tract. Light is transmitted into the body cavity via incoherent fibreoptic bundles and is controlled by a light feedback system. Fibreoptic endoscopes use coherent fibreoptic bundles to provide the clinician with an image. It is also possible to couple fibreoptic endoscopes to a clip-on video camera. Video endoscopes consist of a small CCD camera, which is inserted into the gastrointestinal tract, and an associated image processor that converts the signal to analogue RGB video signals. Images from both types of endoscope are displayed on standard video monitors. Diagnosis depends upon being able to detect changes in the structure and colour of tissues and biological fluids, and therefore upon the ability of the endoscope to reproduce the colour of these tissues and fluids with fidelity. This study investigates the colour reproduction of flexible optical and video endoscopes. Fibreoptic and video endoscopes alter image colour characteristics in different ways. The colour rendition of fibreoptic endoscopes was assessed by coupling them to a video camera and applying video colorimetric techniques. These techniques were then used on video endoscopes to assess how the colour rendition of video endoscopes compared with that of optical endoscopes. In both cases results were obtained at fixed illumination settings. Video endoscopes were then assessed with varying levels of illumination. Initial results show that at constant luminance endoscopy systems introduce non-linear shifts in colour. Techniques for examining how this colour shift varies with illumination intensity were developed, and both methodology and results will be presented. We conclude that more rigorous quality assurance is required to reduce colour error, and we are developing calibration procedures applicable to medical endoscopes.
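    Colour shifts of the kind this study quantifies are typically summarized with a colour-difference metric such as CIE76 ΔE*ab. A minimal sketch with hypothetical CIELAB values (the study's actual colorimetric pipeline is more involved):

```python
def delta_e76(lab1, lab2):
    """CIE76 colour difference between two CIELAB triples, a simple
    metric for the kind of colour shift the study measures."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

reference = (55.0, 20.0, 10.0)     # hypothetical tissue patch, measured directly
through_scope = (52.0, 24.0, 6.0)  # same patch as imaged through an endoscope
print(f"dE*ab = {delta_e76(reference, through_scope):.2f}")  # -> 6.40
```

    A ΔE*ab above roughly 2 to 3 is generally considered noticeable to an observer, which is why non-linear colour shifts matter clinically.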

  3. Space telescope phase B definition study. Volume 2A: Science instruments, f24 field camera

    NASA Technical Reports Server (NTRS)

    Grosso, R. P.; Mccarthy, D. J.

    1976-01-01

    The analysis and design of the F/24 field camera for the space telescope are discussed. The camera was designed for application to the radial bay of the optical telescope assembly and has an on-axis field of view of 3 arc-minutes by 3 arc-minutes.

  4. Studying Upper-Limb Amputee Prosthesis Use to Inform Device Design

    DTIC Science & Technology

    2015-10-01

    the study. This equipment has included a modified GoPro head-mounted camera and a Vicon 13-camera optical motion capture system, which was not part...also completed for relevant members of the study team. 4. The head-mounted camera setup has been established (a modified GoPro Hero 3 with external

  5. Wavefront measurement of plastic lenses for mobile-phone applications

    NASA Astrophysics Data System (ADS)

    Huang, Li-Ting; Cheng, Yuan-Chieh; Wang, Chung-Yen; Wang, Pei-Jen

    2016-08-01

    In camera lenses for mobile-phone applications, all lens elements have been designed with aspheric surfaces because of the requirements on the minimal total track length of the lenses. Due to the diffraction-limited optics design and precision assembly procedures, element inspection and lens performance measurement have become cumbersome in the production of mobile-phone cameras. Recently, wavefront measurements based on Shack-Hartmann sensors have been successfully implemented on injection-molded plastic lenses with aspheric surfaces. However, the applications of wavefront measurement to small-sized plastic lenses have yet to be studied both theoretically and experimentally. In this paper, both an in-house-built and a commercial wavefront measurement system, configured on two optics structures, have been investigated by measuring wavefront aberrations of two lens elements from a mobile-phone camera. First, the wet-cell method has been employed to verify aberrations due to residual birefringence in an injection-molded lens. Then, two lens elements of a mobile-phone camera, with large positive and negative power respectively, have been measured, with aberrations expressed in Zernike polynomials to illustrate the effectiveness of wavefront measurement for troubleshooting defects in optical performance.

  6. Design of microcontroller based system for automation of streak camera.

    PubMed

    Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P

    2010-08-01

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.
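    As a rough model of the sweep unit, the output slope of an ideal integrator driven by a voltage step is Vstep/(RC), and the on-screen sweep speed follows from the deflection sensitivity. The actual circuit shapes the ramp with an LC network, so the component values below are purely illustrative:

```python
def ramp_slope_v_per_s(step_volts, r_ohms, c_farads):
    """Output slope of an ideal integrator fed a voltage step:
    dV/dt = Vstep / (R*C). The paper's unit shapes the ramp with an
    LC network, so this is a simplified stand-in."""
    return step_volts / (r_ohms * c_farads)

def sweep_speed_mm_per_ns(slope_v_per_s, deflection_mm_per_v):
    """Streak sweep speed on the screen for a given deflection sensitivity."""
    return slope_v_per_s * deflection_mm_per_v * 1e-9

# Illustrative values: 10 V step, 1 kOhm, 100 pF, 0.05 mm/V sensitivity.
slope = ramp_slope_v_per_s(10.0, 1e3, 100e-12)
print(f"{slope:.3g} V/s, {sweep_speed_mm_per_ns(slope, 0.05):.3f} mm/ns")
```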

  7. Design of microcontroller based system for automation of streak camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.

    2010-08-15

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  8. Design framework for a spectral mask for a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Berkner, Kathrin; Shroff, Sapna A.

    2012-01-01

    Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield. Such camera designs, which capture directional ray information, enable applications such as digital refocusing, rotation, or depth estimation. Only a few address capturing the spectral information of the scene. It has been demonstrated that by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the pupil plane of the main lens, sampling of the spectral dimension of the plenoptic function is performed. As a result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial against spectral information captured with a single sensor. Little work has been performed so far on analyzing the effects of diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper we demonstrate simulation of a spectrally-coded plenoptic camera optical system via wave-propagation analysis, evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for optimization of the spectral mask for a few sample applications.

  9. Completely optical orientation determination for an unstabilized aerial three-line camera

    NASA Astrophysics Data System (ADS)

    Wohlfeil, Jürgen

    2010-10-01

    Aerial line cameras allow the fast acquisition of high-resolution images at low cost. Unfortunately, measuring the camera's orientation at the necessary rate and precision requires considerable effort unless extensive camera stabilization is used. But stabilization, in turn, entails high cost, weight, and power consumption. This contribution shows that it is possible to derive the absolute exterior orientation of an unstabilized line camera entirely from its images and global position measurements. The presented approach is based on previous work on the determination of the relative orientation of subsequent lines using optical information from the remote sensing system. The relative orientation is used to pre-correct the line images, in which homologous points can then be determined reliably using the SURF operator. Together with the position measurements, these points are used to determine the absolute orientation from the relative orientations via bundle adjustment of a block of overlapping line images. The approach was tested on a flight with the DLR's RGB three-line camera MFC. To evaluate the precision of the resulting orientation, the measurements of a high-end navigation system and ground control points are used.

  10. A high-speed trapezoid image sensor design for continuous traffic monitoring at signalized intersection approaches.

    DOT National Transportation Integrated Search

    2014-10-01

    The goal of this project is to monitor traffic flow continuously with an innovative camera system composed of a custom : designed image sensor integrated circuit (IC) containing trapezoid pixel array and camera system that is capable of : intelligent...

  11. Cherenkov imaging for Total Skin Electron Therapy (TSET)

    NASA Astrophysics Data System (ADS)

    Xie, Yunhe; Petroccia, Heather; Maity, Amit; Miao, Tianshun; Zhu, Yihua; Bruza, Petr; Pogue, Brian W.; Andreozzi, Jacqueline M.; Plastaras, John P.; Dong, Lei; Zhu, Timothy C.

    2018-03-01

    Total Skin Electron Therapy (TSET) utilizes high-energy electrons to treat cancers on the entire body surface. The otherwise invisible radiation beam can be observed via the optical Cherenkov photons emitted from interaction between the high-energy electron beam and tissue. Using a specialized camera-system, the Cherenkov emission can thus be used to evaluate the dose uniformity on the surface of the patient in real-time. Each patient was also monitored during TSET via in-vivo detectors (IVD) in nine locations. Patients undergoing TSET in various conditions (whole body and half body) were imaged and analyzed, and the viability of the system to provide clinical feedback was established.

  12. VizieR Online Data Catalog: UV and optical photometric data for SN 2013by (Valenti+, 2015)

    NASA Astrophysics Data System (ADS)

    Valenti, S.; Sand, D.; Stritzinger, M.; Howell, D. A.; Arcavi, I.; McCully, C.; Childress, M. J.; Hsiao, E. Y.; Contreras, C.; Morrell, N.; Phillips, M. M.; Gromadzki, M.; Kirshner, R. P.; Marion, G. H.

    2017-11-01

    Photometric monitoring in BVgri of SN 2013by with the LCOGT 1 m telescope network began on 2013 April 24 (UT), and continued every two to three nights (52 epochs of data were collected) for more than 150 d, well after the light curve settled on to the 56Co decay tail. Additional imaging was obtained from Swift and the CSP. The CSP obtained 17 epochs of science images using the SITe3 CCD camera along with a set of ugriBV filters attached to the Swope 1 m telescope located at Las Campanas Observatory (LCO). (1 data file).

  13. Cherenkov imaging during volumetric modulated arc therapy for real-time radiation beam tracking and treatment response monitoring

    NASA Astrophysics Data System (ADS)

    Andreozzi, Jacqueline M.; Zhang, Rongxiao; Glaser, Adam K.; Gladstone, David J.; Jarvis, Lesley A.; Pogue, Brian W.

    2016-03-01

    External beam radiotherapy utilizes high energy radiation to target cancer with dynamic, patient-specific treatment plans. The otherwise invisible radiation beam can be observed via the optical Cherenkov photons emitted from interaction between the high energy beam and tissue. Using a specialized camera-system, the Cherenkov emission can thus be used to track the radiation beam on the surface of the patient in real-time, even for complex cases such as volumetric modulated arc therapy (VMAT). Two patients undergoing VMAT of the head and neck were imaged and analyzed, and the viability of the system to provide clinical feedback was established.

  14. ACS (Alma Common Software) operating a set of robotic telescopes

    NASA Astrophysics Data System (ADS)

    Westhues, C.; Ramolla, M.; Lemke, R.; Haas, M.; Drass, H.; Chini, R.

    2014-07-01

    We use the ALMA Common Software (ACS) to establish a unified middleware for robotic observations with the 40cm Optical, 80cm Infrared and 1.5m Hexapod telescopes located at OCA (Observatorio Cerro Armazones) and the ESO 1-m located at La Silla. ACS hides technical specifics, such as the mount type or camera model, from the observer. Furthermore, ACS provides a uniform interface to the different telescopes, allowing us to run the same planning program for each telescope. Observations are carried out for long-term monitoring campaigns to study the variability of stars and AGN. We present here the specific implementations for the different telescopes.

  15. Camera Traps on Wildlife Crossing Structures as a Tool in Gray Wolf (Canis lupus) Management - Five-Years Monitoring of Wolf Abundance Trends in Croatia

    PubMed Central

    Križan, Josip; Gužvica, Goran

    2016-01-01

    The conservation of the gray wolf (Canis lupus) and its coexistence with humans presents a challenge and requires continuous monitoring and management efforts. One of the non-invasive methods that produces high-quality wolf monitoring datasets is camera trapping. We present a novel monitoring approach where camera traps are positioned on wildlife crossing structures that channel the animals, thereby increasing trapping success and the cost-efficiency of the method. In this way we have followed abundance trends of five wolf packs whose home ranges are intersected by a motorway which spans the wolf distribution range in Croatia. During the five-year monitoring of six green bridges we have recorded 28 250 camera-events, 132 with wolves. Four viaducts were monitored for two years, recording 4914 camera-events, 185 with wolves. We have detected a negative abundance trend of the monitored Croatian wolf packs since 2011, especially severe in the northern part of the study area. Further, we have pinpointed the legal cull as the probable major negative influence on the wolf pack abundance trends (linear regression, r2 > 0.75, P < 0.05). Using the same approach we did not find evidence for a negative impact of wolves on the prey populations, both wild ungulates and livestock. We encourage strict protection of the wolf in Croatia until there is more data proving population stability. In conclusion, quantitative methods, such as the one presented here, should be used as much as possible when assessing wolf abundance trends. PMID:27327498

  16. Camera Traps on Wildlife Crossing Structures as a Tool in Gray Wolf (Canis lupus) Management - Five-Years Monitoring of Wolf Abundance Trends in Croatia.

    PubMed

    Šver, Lidija; Bielen, Ana; Križan, Josip; Gužvica, Goran

    2016-01-01

    The conservation of the gray wolf (Canis lupus) and its coexistence with humans presents a challenge and requires continuous monitoring and management efforts. One of the non-invasive methods that produces high-quality wolf monitoring datasets is camera trapping. We present a novel monitoring approach where camera traps are positioned on wildlife crossing structures that channel the animals, thereby increasing trapping success and the cost-efficiency of the method. In this way we have followed abundance trends of five wolf packs whose home ranges are intersected by a motorway which spans the wolf distribution range in Croatia. During the five-year monitoring of six green bridges we have recorded 28 250 camera-events, 132 with wolves. Four viaducts were monitored for two years, recording 4914 camera-events, 185 with wolves. We have detected a negative abundance trend of the monitored Croatian wolf packs since 2011, especially severe in the northern part of the study area. Further, we have pinpointed the legal cull as the probable major negative influence on the wolf pack abundance trends (linear regression, r2 > 0.75, P < 0.05). Using the same approach we did not find evidence for a negative impact of wolves on the prey populations, both wild ungulates and livestock. We encourage strict protection of the wolf in Croatia until there is more data proving population stability. In conclusion, quantitative methods, such as the one presented here, should be used as much as possible when assessing wolf abundance trends.
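    The abundance-trend statistic quoted above (linear regression, r2 > 0.75) can be reproduced on synthetic data with an ordinary least-squares fit; the counts below are hypothetical, not the study's:

```python
import numpy as np

def trend_r2(years, counts):
    """Least-squares linear trend and coefficient of determination,
    the statistic (r^2) the study uses to relate cull quotas to pack
    abundance trends."""
    slope, intercept = np.polyfit(years, counts, 1)
    fitted = slope * np.asarray(years) + intercept
    resid = np.asarray(counts) - fitted
    ss_res = float((resid ** 2).sum())
    ss_tot = float(((counts - np.mean(counts)) ** 2).sum())
    return slope, 1.0 - ss_res / ss_tot

years = np.array([2011, 2012, 2013, 2014, 2015], dtype=float)
counts = np.array([30.0, 27.0, 25.0, 22.0, 20.0])  # hypothetical wolf events
slope, r2 = trend_r2(years, counts)
print(f"slope = {slope:.1f} per year, r^2 = {r2:.3f}")  # declining trend
```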

  17. Modeling of skin cooling, blood flow, and optical properties in wounds created by electrical shock

    NASA Astrophysics Data System (ADS)

    Nguyen, Thu T. A.; Shupp, Jeffrey W.; Moffatt, Lauren T.; Jordan, Marion H.; Jeng, James C.; Ramella-Roman, Jessica C.

    2012-02-01

    High voltage electrical injuries may lead to irreversible tissue damage or even death. Research on tissue injury following high voltage shock is needed and may yield stage-appropriate therapy to reduce amputation rate. One of the mechanisms by which electricity damages tissue is through Joule heating, with subsequent protein denaturation. Previous studies have shown that blood flow had a significant effect on the cooling rate of heated subcutaneous tissue. To assess the thermal damage in tissue, this study focused on monitoring changes of temperature and optical properties of skin next to high voltage wounds. The burns were created between left fore limb and right hind limb extremities of adult male Sprague-Dawley rats by a 1000VDC delivery shock system. A thermal camera was utilized to record temperature variation during the exposure. The experimental results were then validated using a thermal-electric finite element model (FEM).

  18. Can camera traps monitor Komodo dragons a large ectothermic predator?

    PubMed

    Ariefiandy, Achmad; Purwandana, Deni; Seno, Aganto; Ciofi, Claudio; Jessop, Tim S

    2013-01-01

    Camera trapping has greatly enhanced population monitoring of often cryptic and low abundance apex carnivores. Effectiveness of passive infrared camera trapping, and ultimately population monitoring, relies on temperature-mediated differences between the animal and its ambient environment to ensure good camera detection. In ectothermic predators such as large varanid lizards, this criterion is presumed less certain. Here we evaluated the effectiveness of camera trapping to potentially monitor the population status of the Komodo dragon (Varanus komodoensis), an apex predator, using site occupancy approaches. We compared site-specific estimates of site occupancy and detection derived using camera traps and cage traps at 181 trapping locations established across six sites on four islands within Komodo National Park, Eastern Indonesia. Detection and site occupancy at each site were estimated using eight competing models that considered site-specific variation in occupancy (ψ) and varied detection probabilities (p) according to detection method, site and survey number using a single-season site occupancy modelling approach. The most parsimonious model [ψ (site), p (site × survey); ω = 0.74] suggested that site occupancy estimates differed among sites. Detection probability varied as an interaction between site and survey number. Our results indicate that overall camera traps produced similar estimates of detection and site occupancy to cage traps, irrespective of being paired, or unpaired, with cage traps. Whilst one site showed some evidence that detection was affected by trapping method, detection was too low to produce an accurate occupancy estimate. Overall, as camera trapping is logistically more feasible it may provide, with further validation, an alternative method for evaluating long-term site occupancy patterns in Komodo dragons, and potentially other large reptiles, aiding conservation of this species.

  19. Can Camera Traps Monitor Komodo Dragons a Large Ectothermic Predator?

    PubMed Central

    Ariefiandy, Achmad; Purwandana, Deni; Seno, Aganto; Ciofi, Claudio; Jessop, Tim S.

    2013-01-01

    Camera trapping has greatly enhanced population monitoring of often cryptic and low abundance apex carnivores. Effectiveness of passive infrared camera trapping, and ultimately population monitoring, relies on temperature-mediated differences between the animal and its ambient environment to ensure good camera detection. In ectothermic predators such as large varanid lizards, this criterion is presumed less certain. Here we evaluated the effectiveness of camera trapping to potentially monitor the population status of the Komodo dragon (Varanus komodoensis), an apex predator, using site occupancy approaches. We compared site-specific estimates of site occupancy and detection derived using camera traps and cage traps at 181 trapping locations established across six sites on four islands within Komodo National Park, Eastern Indonesia. Detection and site occupancy at each site were estimated using eight competing models that considered site-specific variation in occupancy (ψ) and varied detection probabilities (p) according to detection method, site and survey number using a single-season site occupancy modelling approach. The most parsimonious model [ψ (site), p (site*survey); ω = 0.74] suggested that site occupancy estimates differed among sites. Detection probability varied as an interaction between site and survey number. Our results indicate that overall camera traps produced similar estimates of detection and site occupancy to cage traps, irrespective of being paired, or unpaired, with cage traps. Whilst one site showed some evidence that detection was affected by trapping method, detection was too low to produce an accurate occupancy estimate. Overall, as camera trapping is logistically more feasible it may provide, with further validation, an alternative method for evaluating long-term site occupancy patterns in Komodo dragons, and potentially other large reptiles, aiding conservation of this species. PMID:23527027
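    The single-season occupancy model underlying the ψ and p estimates can be sketched with a toy likelihood and a grid-search MLE. This is a MacKenzie-style formulation with constant ψ and p, an assumption for illustration; the paper's models additionally let both vary by site, survey, and method:

```python
import math

def neg_log_likelihood(psi, p, histories):
    """Single-season site-occupancy likelihood: a history with d >= 1
    detections in k surveys contributes psi * p^d * (1-p)^(k-d); an
    all-zero history contributes psi * (1-p)^k + (1 - psi)."""
    ll = 0.0
    for h in histories:
        k, d = len(h), sum(h)
        if d > 0:
            ll += math.log(psi) + d * math.log(p) + (k - d) * math.log(1.0 - p)
        else:
            ll += math.log(psi * (1.0 - p) ** k + (1.0 - psi))
    return -ll

# Toy detection histories for six sites over three surveys (1 = detected):
histories = [(1, 0, 1), (0, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 0), (1, 0, 0)]
# Crude grid-search MLE over (psi, p); real analyses use dedicated software:
grid = [i / 100.0 for i in range(1, 100)]
psi_hat, p_hat = min(((a, b) for a in grid for b in grid),
                     key=lambda ab: neg_log_likelihood(ab[0], ab[1], histories))
print(f"psi_hat = {psi_hat:.2f}, p_hat = {p_hat:.2f}")
```

    Note that the estimated ψ exceeds the naive observed occupancy (4 of 6 sites), because the model attributes some all-zero histories to occupied-but-undetected sites.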

  20. Micro-optical system based 3D imaging for full HD depth image capturing

    NASA Astrophysics Data System (ADS)

    Park, Yong-Hwa; Cho, Yong-Chul; You, Jang-Woo; Park, Chang-Young; Yoon, Heesun; Lee, Sang-Hun; Kwon, Jong-Oh; Lee, Seung-Wan

    2012-03-01

    A 20 MHz-switching high-speed image shutter device for 3D image capturing and its application to a system prototype are presented. For 3D image capturing, the system utilizes the Time-of-Flight (TOF) principle by means of a 20 MHz high-speed micro-optical image modulator, a so-called 'optical shutter'. The high-speed image modulation is obtained using the electro-optic operation of a multi-layer stacked structure having diffractive mirrors and an optical resonance cavity which maximizes the magnitude of optical modulation. The optical shutter device is specially designed and fabricated with low resistance-capacitance cell structures having a small RC time constant. The optical shutter is positioned in front of a standard high-resolution CMOS image sensor and modulates the IR image reflected from the object to capture a depth image. The suggested novel optical shutter device enables capturing of a full HD depth image with depth accuracy on the mm scale, the largest depth-image resolution among the state of the art, which had been limited to VGA. The 3D camera prototype realizes a color/depth concurrent-sensing optical architecture to capture 14 Mp color and full HD depth images simultaneously. The resulting high-definition color/depth images and their capturing device have a crucial impact on the 3D ecosystem in the IT industry, especially as a 3D image sensing means in the fields of 3D cameras, gesture recognition, user interfaces, and 3D displays. This paper presents the MEMS-based optical shutter design, fabrication, characterization, 3D camera system prototype and image test results.
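    The TOF principle behind the prototype converts the measured phase shift of the 20 MHz amplitude modulation into depth via d = c·φ/(4πf); a minimal sketch:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(phase_shift_rad, mod_freq_hz=20e6):
    """Depth from the phase shift of amplitude-modulated light:
    d = c * phi / (4 * pi * f). At 20 MHz the unambiguous range is
    c / (2 f), about 7.5 m."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

print(f"{tof_depth(math.pi):.3f} m")           # half-cycle shift -> 3.747 m
print(f"max range {C / (2 * 20e6):.2f} m")
```

    A higher modulation frequency improves depth resolution but shortens the unambiguous range, which is one reason 20 MHz-class modulators are a common design point.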

  1. Network based sky Brightness Monitor

    NASA Astrophysics Data System (ADS)

    McKenna, Dan; Pulvermacher, R.; Davis, D. R.

    2009-01-01

    We have developed and are currently testing an autonomous 2-channel photometer designed to measure the night-sky brightness at visual wavelengths over a multi-year campaign. The photometer uses a robust silicon sensor filtered with Hoya CM500 glass. The sky brightness is measured every minute at two elevation angles, typically zenith and 20 degrees, to monitor brightness and transparency. The sky brightness monitor consists of two units, the remote photometer and a network interface. Currently these devices use 2.4 GHz transceivers with a free-space range of 100 meters. The remote unit is battery powered, with daytime recharging using a solar panel. The network interface transmits the received data via the standard POP email protocol. A second version is under development for radio-sensitive areas, using an optical fiber for data transmission. We will present a comparison with the National Park Service sky-monitoring camera. We will also discuss the calibration methods used for standardization and temperature compensation. This system is expected to be deployed in the next year and will be operated by the International Dark-Sky Association SKYMONITOR project.
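    Photometer count rates are conventionally converted to sky brightness in mag/arcsec² with the Pogson relation against a calibrated reference rate. A sketch in which the zero point is an assumed dark-site value, not a figure from the abstract:

```python
import math

def sky_brightness_mag(counts_per_s, zero_point_counts_per_s,
                       zero_point_mag=21.6):
    """Sky brightness in mag/arcsec^2 from a photometer count rate via
    the Pogson relation against a calibrated reference rate. The zero
    point (21.6 mag/arcsec^2) is an assumed dark-site value."""
    return zero_point_mag - 2.5 * math.log10(counts_per_s / zero_point_counts_per_s)

# A sky four times brighter than the reference reads ~1.5 mag "brighter":
print(round(sky_brightness_mag(400.0, 100.0), 2))  # -> 20.09
```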

  2. Use of a color CMOS camera as a colorimeter

    NASA Astrophysics Data System (ADS)

    Dallas, William J.; Roehrig, Hans; Redford, Gary R.

    2006-08-01

    In radiology diagnosis, film is being quickly replaced by computer monitors as the display medium for all imaging modalities. Increasingly, these monitors are color instead of monochrome. It is important to have instruments available to characterize the display devices in order to guarantee reproducible presentation of image material. We are developing an imaging colorimeter based on a commercially available color digital camera. The camera uses a sensor that has co-located pixels in all three primary colors.

  3. Interplanetary approach optical navigation with applications

    NASA Technical Reports Server (NTRS)

    Jerath, N.

    1978-01-01

    The use of optical data from onboard television cameras for the navigation of interplanetary spacecraft during the planet approach phase is investigated. Three optical data types were studied: the planet limb with auxiliary celestial references, the satellite-star method, and the planet-star two-camera method. Analysis and modelling issues related to the nature and information content of the optical methods were examined. Dynamic and measurement system modelling, data sequence design, measurement extraction, model estimation and orbit determination, as they relate to optical navigation, are discussed, and the various error sources are analyzed. The methodology developed was applied to the Mariner 9 and the Viking Mars missions. Navigation accuracies were evaluated at the control and knowledge points, with particular emphasis devoted to the combined use of radio and optical data. A parametric probability analysis technique was developed to evaluate navigation performance as a function of system reliabilities.

  4. Imaging of optically diffusive media by use of opto-elastography

    NASA Astrophysics Data System (ADS)

    Bossy, Emmanuel; Funke, Arik R.; Daoudi, Khalid; Tanter, Mickael; Fink, Mathias; Boccara, Claude

    2007-02-01

    We present a camera-based optical detection scheme designed to detect the transient motion created by the acoustic radiation force in elastic media. An optically diffusive tissue mimicking phantom was illuminated with coherent laser light, and a high speed camera (2 kHz frame rate) was used to acquire and cross-correlate consecutive speckle patterns. Time-resolved transient decorrelations of the optical speckle were measured as the results of localised motion induced in the medium by the radiation force and subsequent propagating shear waves. As opposed to classical acousto-optic techniques which are sensitive to vibrations induced by compressional waves at ultrasonic frequencies, the proposed technique is sensitive only to the low frequency transient motion induced in the medium by the radiation force. It therefore provides a way to assess both optical and shear mechanical properties.
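    The core measurement, cross-correlating consecutive speckle frames to detect transient decorrelation, can be sketched as a zero-lag normalized correlation; synthetic random frames stand in for camera data:

```python
import numpy as np

def frame_correlation(f1, f2):
    """Zero-lag normalized cross-correlation between two speckle frames;
    values falling below 1 flag transient motion in the medium."""
    a = f1 - f1.mean()
    b = f2 - f2.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

rng = np.random.default_rng(0)
frame = rng.random((64, 64))                       # stand-in speckle frame
moved = 0.5 * frame + 0.5 * rng.random((64, 64))   # partially decorrelated
corr_still = frame_correlation(frame, frame.copy())
corr_moved = frame_correlation(frame, moved)
print(round(corr_still, 6), round(corr_moved, 2))  # 1.0 vs a clear drop
```

    In the experiment, the time course of such correlation drops at the 2 kHz frame rate is what reveals the radiation-force push and the passing shear wave.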

  5. On the collaborative design and simulation of space camera: STOP (structural/thermal/optical) analysis

    NASA Astrophysics Data System (ADS)

    Duan, Pengfei; Lei, Wenping

    2017-11-01

    A number of disciplines (mechanics, structures, thermal, and optics) are needed to design and build a space camera. Separate design models are normally constructed with each discipline's CAD/CAE tools. Design and analysis are conducted largely in parallel, subject to the requirements levied on each discipline, and technical interaction between the different disciplines is limited and infrequent. As a result, a unified view of the space camera design across discipline boundaries is not directly possible with this approach, and generating one would require a large, manual, and error-prone process. A collaborative environment built on an abstract model and performance templates allows engineering data and CAD/CAE results to be shared across these discipline boundaries within a common interface, helping to achieve rapid multivariate design and to directly evaluate optical performance under environmental loads. A small interdisciplinary engineering team from the Beijing Institute of Space Mechanics and Electricity has recently conducted a structural/thermal/optical (STOP) analysis of a space camera with this collaborative environment. A STOP analysis evaluates the changes in image quality that arise from structural deformations as the thermal environment of the camera changes throughout its orbit. STOP analyses were conducted for four different test conditions applied during final thermal vacuum (TVAC) testing of the payload on the ground. The STOP simulation process begins with importing an integrated CAD model of the camera geometry into the collaborative environment, within which: 1. Independent thermal and structural meshes are generated. 2. The thermal mesh and the relevant engineering data for material properties and thermal boundary conditions are used to compute temperature distributions at nodal points of both the thermal and structural meshes with Thermal Desktop, a COTS thermal design and analysis code. 3. Thermally induced structural deformations of the camera are then evaluated in Nastran, an industry-standard code for structural design and analysis. 4. Thermal and structural results are next imported into SigFit, another COTS tool, which computes deformations and best-fit rigid-body displacements for the optical surfaces. 5. SigFit creates a modified optical prescription that is imported into CODE V for evaluation of the optical performance impacts. The integrated STOP analysis was validated using TVAC test data. For the four TVAC tests, the relative errors between simulated and measured temperatures at the measuring points were around 5%, and in some test conditions as low as 1%. For the image-quality metric MTF, the relative error between simulation and test was 8.3% in the worst condition; all others were below 5%. The validation shows that the collaborative design and simulation environment can carry out an integrated STOP analysis of a space camera efficiently. Moreover, the collaborative environment allows an interdisciplinary analysis that formerly might have taken several months to be completed in two or three weeks, which is well suited to project scheme demonstration in early stages.
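
    The validation criterion quoted above is a plain relative-error comparison between simulated and measured quantities. A minimal sketch, using hypothetical numbers for illustration (the actual TVAC values are not reproduced in the abstract):

```python
def relative_error(simulated, measured):
    """Relative error (%) between a simulated and a measured value."""
    return abs(simulated - measured) / abs(measured) * 100.0

# Hypothetical MTF values, chosen only to reproduce the worst-case 8.3% figure.
mtf_sim, mtf_test = 0.325, 0.30
err = relative_error(mtf_sim, mtf_test)
passes = err <= 10.0   # assumed acceptance threshold, not stated in the abstract
```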

  6. Qualification Tests of Micro-camera Modules for Space Applications

    NASA Astrophysics Data System (ADS)

    Kimura, Shinichi; Miyasaka, Akira

    Visual capability is very important for space-based activities, for which small, low-cost space cameras are desired. Although cameras for terrestrial applications are continually being improved, little progress has been made on cameras used in space, which must be extremely robust to withstand harsh environments. This study focuses on commercial off-the-shelf (COTS) CMOS digital cameras because they are very small and are based on an established mass-market technology. Radiation and ultrahigh-vacuum tests were conducted on a small COTS camera that weighs less than 100 mg (including optics). This paper presents the results of the qualification tests for COTS cameras and for a small, low-cost COTS-based space camera.

  7. Refinery evaluation of optical imaging to locate fugitive emissions.

    PubMed

    Robinson, Donald R; Luke-Boone, Ronke; Aggarwal, Vineet; Harris, Buzz; Anderson, Eric; Ranum, David; Kulp, Thomas J; Armstrong, Karla; Sommers, Ricky; McRae, Thomas G; Ritter, Karin; Siegell, Jeffrey H; Van Pelt, Doug; Smylie, Mike

    2007-07-01

    Fugitive emissions account for approximately 50% of total hydrocarbon emissions from process plants. Federal and state regulations aimed at controlling these emissions require refineries and petrochemical plants in the United States to implement a leak detection and repair (LDAR) program. The current regulatory work practice, U.S. Environmental Protection Agency Method 21, requires designated components to be monitored individually at regular intervals. The annual costs of these LDAR programs in a typical refinery can exceed US$1,000,000. Previous studies have shown that the majority of controllable fugitive emissions come from a very small fraction of components. The Smart LDAR program aims to find cost-effective methods to monitor and reduce emissions from these large leakers. Optical gas imaging has been identified as one technology that can help achieve this objective. This paper discusses a refinery evaluation of an instrument based on backscatter absorption gas imaging technology. This portable camera allows an operator to scan components quickly and to image gas leaks in real time. During the evaluation, the instrument identified the leaking components that were the source of 97% of the total mass emissions from the leaks detected. More than 27,000 components were monitored, in far less time than it would have taken using Method 21. In addition, the instrument was able to find leaks from components that are not required to be monitored under current LDAR regulations. The technology principles and the parameters that affect instrument performance are also discussed.

  8. Real-time local experimental monitoring of the bleaching process.

    PubMed

    Rakic, Mario; Klaric, Eva; Sever, Ivan; Rakic, Iva Srut; Pichler, Goran; Tarle, Zrinka

    2015-04-01

    The purpose of this article was to investigate a new setup for tooth bleaching and for monitoring the same process in real time, so as to prevent overbleaching and the related side effects of the bleaching procedure. Known bleaching procedures so far cannot simultaneously monitor and perform the bleaching process or provide any local control over bleaching. The experimental setup was developed at the Institute of Physics, Zagreb. The setup consists of a camera, a controller, and optical fibers. The bleaching was performed with 25% hydrogen peroxide activated by ultraviolet light-emitting diodes, and the light for monitoring was emitted by white light-emitting diodes. The collected light was analyzed using a red-green-blue (RGB) index. A K-type thermocouple was used for temperature measurements. Pastilles made from hydroxylapatite powder, as well as human teeth, served as experimental objects. The optimal bleaching time varied substantially among differently stained specimens. To reach the reference color (A1, Chromascop shade guide), measured as an RGB index, the bleaching time ranged from 8 to >20 min for pastilles and from 3.5 to >20 min for teeth. The reflected light intensity of each R, G, and B component at the end of the bleaching process (after 20 min) had increased by up to 56% of the baseline intensity. The presented experimental setup provides essential information about when to stop the bleaching process to achieve the desired optical results, so that the bleaching process can be fully responsive to the characteristics of each individual, leading to more satisfying results.
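
    The stop criterion described (bleach until the measured RGB index reaches the reference shade) can be sketched as below. The reference triple, tolerance, and frame values are illustrative assumptions, not the actual coordinates of the A1 shade or the authors' calibration.

```python
import numpy as np

def rgb_index(frame):
    """Mean reflected intensity per channel; frame is HxWx3, values 0-255."""
    return frame.reshape(-1, 3).mean(axis=0)

def bleaching_done(frame, reference, tol=5.0):
    """Stop criterion: every channel within tol of the reference shade."""
    return bool(np.all(np.abs(rgb_index(frame) - reference) <= tol))

reference = np.array([200.0, 190.0, 180.0])   # stand-in for the reference shade
dark = np.full((4, 4, 3), 120.0)                         # stained specimen
light = np.ones((4, 4, 3)) * np.array([198.0, 191.0, 182.0])  # near-reference
```

In a live setup this test would run on each acquired frame, halting the UV activation once it returns true.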

  9. Stereoscopic Configurations To Minimize Distortions

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.

    1991-01-01

    Proposed television system provides two stereoscopic displays. Two-camera, two-monitor system used in various camera configurations and with stereoscopic images on monitors magnified to various degrees. Designed to satisfy observer's need to perceive spatial relationships accurately throughout workspace or to perceive them at high resolution in small region of workspace. Potential applications include industrial, medical, and entertainment imaging and monitoring and control of telemanipulators, telerobots, and remotely piloted vehicles.

  10. Structural Dynamics Analysis and Research for FEA Modeling Method of a Light High Resolution CCD Camera

    NASA Astrophysics Data System (ADS)

    Sun, Jiwen; Wei, Ling; Fu, Danying

    2002-01-01

    The camera features high resolution and a wide swath. To ensure that its high optical precision survives the rigorous dynamic loads of launch, the camera must possess high structural rigidity; a careful study of the dynamic behaviour of the camera structure is therefore required. A precise CAD model of the camera was built in Pro/E, and an interference examination was performed on it to refine the structural design. For the first time in China, the structural dynamics analysis of such a camera was accomplished by applying the structural analysis codes PATRAN and NASTRAN. The main research items include: 1) comparative modal analyses of the critical structure of the camera using 4-node and 10-node tetrahedral elements respectively, so as to confirm the most reasonable general model; 2) modal analyses of the camera for several cases, from which the inherent frequencies and mode shapes are obtained and the rationality of the structural design of the camera is confirmed; 3) static analysis of the camera under self-gravity and overloads, yielding the relevant deformation and stress distributions; 4) response calculations for sine vibration of the camera, giving the corresponding response curves and the maximum acceleration responses with their corresponding frequencies. The FEA modelling and software technique proves accurate and efficient. Based on sensitivity considerations, the dynamic design and engineering optimization of the critical structure of the camera are discussed, providing fundamental technology for the design of forthcoming space optical instruments.

  11. a Spatio-Spectral Camera for High Resolution Hyperspectral Imaging

    NASA Astrophysics Data System (ADS)

    Livens, S.; Pauly, K.; Baeck, P.; Blommaert, J.; Nuyts, D.; Zender, J.; Delauré, B.

    2017-08-01

    Imaging with a conventional frame camera from a moving remotely piloted aircraft system (RPAS) is by design very inefficient. Less than 1 % of the flying time is used for collecting light. This unused potential can be utilized by an innovative imaging concept, the spatio-spectral camera. The core of the camera is a frame sensor with a large number of hyperspectral filters arranged on the sensor in stepwise lines. It combines the advantages of frame cameras with those of pushbroom cameras. By acquiring images in rapid succession, such a camera can collect detailed hyperspectral information, while retaining the high spatial resolution offered by the sensor. We have developed two versions of a spatio-spectral camera and used them in a variety of conditions. In this paper, we present a summary of three missions with the in-house developed COSI prototype camera (600-900 nm) in the domains of precision agriculture (fungus infection monitoring in experimental wheat plots), horticulture (crop status monitoring to evaluate irrigation management in strawberry fields) and geology (meteorite detection on a grassland field). Additionally, we describe the characteristics of the 2nd generation, commercially available ButterflEYE camera offering extended spectral range (475-925 nm), and we discuss future work.
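
    The core idea, re-sorting successive line-filtered frames into a hyperspectral cube, can be illustrated with a toy model. The one-sensor-row filter lines, the exactly-one-row platform advance per frame, and the spectrally flat scene are simplifying assumptions for illustration, not the COSI camera's actual geometry.

```python
import numpy as np

B, W, G = 4, 8, 6                     # bands, swath width, ground lines
scene = np.arange(G * W, dtype=float).reshape(G, W)   # synthetic ground truth

def capture(t):
    """Frame at time t: filter line b images ground line (t - b)."""
    frame = np.zeros((B, W))
    for b in range(B):
        g = t - b
        if 0 <= g < G:
            frame[b] = scene[g]       # ideal filter: no spectral change
    return frame

frames = [capture(t) for t in range(G + B - 1)]

# Re-sorting: cube[g, b] collects ground line g as seen through band b,
# which was recorded in frame g + b.
cube = np.zeros((G, B, W))
for g in range(G):
    for b in range(B):
        cube[g, b] = frames[g + b][b]
```

With ideal (spectrally flat) filters every band plane reproduces the scene, which is the sanity check that each ground line was indeed seen once through every filter line.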

  12. The TESS camera: modeling and measurements with deep depletion devices

    NASA Astrophysics Data System (ADS)

    Woods, Deborah F.; Vanderspek, Roland; MacDonald, Robert; Morgan, Edward; Villasenor, Joel; Thayer, Carolyn; Burke, Barry; Chesbrough, Christian; Chrisp, Michael; Clark, Kristin; Furesz, Gabor; Gonzales, Alexandria; Nguyen, Tam; Prigozhin, Gregory; Primeau, Brian; Ricker, George; Sauerwein, Timothy; Suntharalingam, Vyshnavi

    2016-07-01

    The Transiting Exoplanet Survey Satellite, a NASA Explorer-class mission in development, will discover planets around nearby stars, most notably Earth-like planets with potential for follow up characterization. The all-sky survey requires a suite of four wide field-of-view cameras with sensitivity across a broad spectrum. Deep depletion CCDs with a silicon layer of 100 μm thickness serve as the camera detectors, providing enhanced performance in the red wavelengths for sensitivity to cooler stars. The performance of the camera is critical for the mission objectives, with both the optical system and the CCD detectors contributing to the realized image quality. Expectations for image quality are studied using a combination of optical ray tracing in Zemax and simulations in Matlab to account for the interaction of the incoming photons with the 100 μm silicon layer. The simulations include a probabilistic model to determine the depth of travel in the silicon before the photons are converted to photo-electrons, and a Monte Carlo approach to charge diffusion. The charge diffusion model varies with the remaining depth for the photo-electron to traverse and the strength of the intermediate electric field. The simulations are compared with laboratory measurements acquired by an engineering unit camera with the TESS optical design and deep depletion CCDs. In this paper we describe the performance simulations and the corresponding measurements taken with the engineering unit camera, and discuss where the models agree well in predicted trends and where there are differences compared to observations.
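
    The probabilistic conversion-depth model and its coupling to charge diffusion can be caricatured as follows. The absorption lengths, the square-root diffusion law, and the constant k are assumed toy values standing in for the wavelength- and field-dependent quantities in the actual TESS simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
THICKNESS_UM = 100.0                  # deep-depletion silicon layer thickness

def conversion_depths(n, absorption_length_um):
    """Sample photon conversion depths from an exponential law, keeping
    only photons that convert within the silicon layer."""
    d = rng.exponential(absorption_length_um, size=n)
    return d[d < THICKNESS_UM]

def diffusion_sigma(depths_um, k=0.1):
    """Toy model: lateral charge spread grows with the remaining drift
    distance; k lumps the electric-field dependence (assumed value)."""
    return k * np.sqrt(THICKNESS_UM - depths_um)

blue = conversion_depths(20000, 1.0)    # short absorption length: converts near surface
red = conversion_depths(20000, 50.0)    # long absorption length: converts deep
sigma_blue = diffusion_sigma(blue).mean()
sigma_red = diffusion_sigma(red).mean()
# Red photons convert deeper on average, leaving less distance over which
# the photo-electrons can diffuse laterally.
```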

  13. Suppressing the image smear of the vibration modulation transfer function for remote-sensing optical cameras.

    PubMed

    Li, Jin; Liu, Zilong; Liu, Si

    2017-02-20

    In the on-board photographing process of satellite cameras, platform vibration can generate image motion, distortion, and smear, which seriously affect image quality and image positioning. In this paper, we create a mathematical model of the vibration modulation transfer function (VMTF) for a remote-sensing camera. The total MTF of the camera is reduced by the VMTF, which means the image quality is degraded. In order to avoid this degradation of the total MTF, we use an Mn-20Cu-5Ni-2Fe (M2052) manganese-copper alloy to fabricate a vibration-isolation mechanism (VIM). The VIM transforms platform vibration energy into irreversible thermal energy through its internal twin-crystal structure. Our experiment shows that the M2052 manganese-copper alloy suppresses image motion below 125 Hz, the vibration frequency range of satellite platforms. The camera optical system has a higher MTF after vibration suppression with the M2052 material than before.
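
    For sinusoidal image motion that is fast compared with the exposure time (many vibration cycles per frame), a standard textbook result gives MTF(f) = |J0(2*pi*f*a)|, with a the vibration amplitude on the focal plane. The sketch below evaluates this with a numerically integrated Bessel function; the amplitudes are illustrative, not the paper's measured M2052 values, and this closed form is a common model rather than the authors' specific VMTF.

```python
import numpy as np

def bessel_j0(x):
    """J0 via its integral form J0(x) = (1/pi) * int_0^pi cos(x sin t) dt,
    approximated by averaging over a fine uniform grid in t."""
    t = np.linspace(0.0, np.pi, 4001)
    x = np.atleast_1d(np.asarray(x, dtype=float))
    return np.cos(np.outer(x, np.sin(t))).mean(axis=1)

def sinusoidal_vibration_mtf(freq, amplitude):
    """MTF degradation for sinusoidal image motion averaged over many
    vibration cycles: MTF(f) = |J0(2*pi*f*a)|."""
    return np.abs(bessel_j0(2.0 * np.pi * np.asarray(freq, dtype=float) * amplitude))

f = np.linspace(0.0, 50.0, 101)                    # spatial frequency, cycles/mm
mtf_small = sinusoidal_vibration_mtf(f, 0.0005)    # 0.5 um focal-plane amplitude
mtf_large = sinusoidal_vibration_mtf(f, 0.005)     # 5 um amplitude degrades more
```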

  14. Smartphone Based Platform for Colorimetric Sensing of Dyes

    NASA Astrophysics Data System (ADS)

    Dutta, Sibasish; Nath, Pabitra

    We demonstrate the working of a smartphone-based optical sensor for measuring the absorption bands of coloured dyes. By integrating simple laboratory optical components with the camera unit of the smartphone, we have converted it into a visible spectrometer with a pixel resolution of 0.345 nm/pixel. Light from a broadband optical source is transmitted through a specific dye solution, and the transmitted signal is captured by the camera of the smartphone. The sensor is inexpensive, portable, and lightweight, making it well suited to a variety of on-field sensing applications.
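
    Converting camera counts to an absorption spectrum follows the Beer-Lambert relation. In the sketch below the 0.345 nm/pixel dispersion is taken from the abstract, while the zero-pixel wavelength and the count values are assumed purely for illustration.

```python
import numpy as np

DISPERSION_NM_PER_PX = 0.345          # dispersion quoted in the abstract
LAMBDA0_NM = 400.0                    # wavelength of pixel 0 (assumed here)

def pixel_to_wavelength(px):
    """Linear wavelength calibration of the camera's pixel axis."""
    return LAMBDA0_NM + DISPERSION_NM_PER_PX * np.asarray(px, dtype=float)

def absorbance(sample_counts, reference_counts):
    """Beer-Lambert: A = -log10(I / I0), computed per pixel."""
    s = np.asarray(sample_counts, dtype=float)
    r = np.asarray(reference_counts, dtype=float)
    return -np.log10(s / r)

px = np.arange(5)
wl = pixel_to_wavelength(px)
# Dye transmits fully at the ends and dips to 10% in the middle pixel:
A = absorbance([100.0, 50.0, 10.0, 50.0, 100.0],
               [100.0, 100.0, 100.0, 100.0, 100.0])
```

The absorption band appears as the peak of A; at 10% transmission the absorbance is exactly 1.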

  15. Imaging using a supercontinuum laser to assess tumors in patients with breast carcinoma

    NASA Astrophysics Data System (ADS)

    Sordillo, Laura A.; Sordillo, Peter P.; Alfano, R. R.

    2016-03-01

    The supercontinuum laser light source has many advantages over other light sources, including its broad spectral range. Transmission images of paired normal and malignant breast tissue samples from two patients were obtained using a Leukos supercontinuum (SC) laser light source with wavelengths in the second and third NIR optical windows and an IR-CCD InGaAs camera detector (Goodrich Sensors Inc. high-response camera SU320KTSW-1.7RT with spectral response between 900 nm and 1,700 nm). Optical attenuation measurements at the four NIR optical windows were obtained from the samples.

  16. Conference Proceedings of the America Institute for Aeronautics and Astronautics Missile Sciences Held in Monterey, California on 29 November - 1 December 1988. Volume 6. Navy Ballistic Missile Technology

    DTIC Science & Technology

    1988-11-01

    atmospheric point the sensor line of sight to a target. Both oxidizers.) The stability of the booster plume as optical systems look out through windows...vertical. The optical layout plume unless it is tracking the UV plume outside for the UV camera is as shown in Figure 1. A the atmosphere. Thus, other...and plume and handoff to the missile in the atmosphere camera was used on the rear platform for the with high resolution optics. visible observation

  17. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  18. Image-based spectroscopy for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Bachmakov, Eduard; Molina, Carolyn; Wynne, Rosalind

    2014-03-01

    An image-processing algorithm for use with a nano-featured spectrometer configuration for chemical agent detection is presented. The spectrometer chip, acquired from Nano-Optic Devices™, can reduce the size of the spectrometer to that of a coin. The nanospectrometer chip was aligned with a 635 nm laser source, objective lenses, and a CCD camera. Images from the nanospectrometer chip were collected and compared to reference spectra. Random background noise contributions were isolated and removed from the diffraction-pattern image analysis via a threshold filter. Results are provided for the image-based detection of the diffraction pattern produced by the nanospectrometer. The featured PCF spectrometer has the potential to measure optical absorption spectra in order to detect trace amounts of contaminants. MATLAB tools allow for intelligent, automatic detection of the relevant sub-patterns in the diffraction patterns and subsequent extraction of their parameters using region-detection algorithms such as the generalized Hough transform, which detects specific shapes within the image. This transform is a method for detecting curves by exploiting the duality between points on a curve and the parameters of that curve. By employing this image-processing technique, future sensor systems will benefit from new applications such as unsupervised environmental monitoring of air or water quality.
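
    A minimal stand-in for the threshold-filter and region-detection stages might look like the following in Python (the abstract's pipeline uses MATLAB and the generalized Hough transform; here a plain connected-component flood fill plays the region-detection role). The image, threshold, and spot geometry are synthetic.

```python
import numpy as np

def threshold_filter(img, thresh):
    """Suppress random background below a fixed threshold."""
    return np.where(img >= thresh, img, 0.0)

def find_spots(mask):
    """Connected bright regions (4-connectivity) via a small flood fill;
    returns one (row, col) centroid per region."""
    seen = np.zeros(mask.shape, dtype=bool)
    spots = []
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        stack, pixels = [(i, j)], []
        seen[i, j] = True
        while stack:
            y, x = stack.pop()
            pixels.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        ys, xs = zip(*pixels)
        spots.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return spots

img = np.full((20, 20), 5.0)          # uniform background noise level
img[4:7, 4:7] = 100.0                 # two synthetic diffraction orders
img[12:15, 13:16] = 100.0
spots = find_spots(threshold_filter(img, 50.0) > 0)
```

The recovered spot centroids (positions of the diffraction orders) are what a subsequent comparison against reference spectra would consume.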

  19. Optical monitoring of film pollution on sea surface

    NASA Astrophysics Data System (ADS)

    Pavlov, Andrey; Konstantinov, Oleg; Shmirko, Konstantin

    2017-11-01

    Organic films form a brightness contrast on the sea surface, which makes it possible to use cheap, simple, and miniature systems for video monitoring of the pollution of coastal marine areas by oil products during the bunkering of ships, emergencies at oil terminals, gas and oil pipelines, hydrocarbon production platforms on the shelf, etc.1-16 A panoramic video system with a polarization filter on the lens, located at an altitude of 90 m above sea level, can provide effective control of the water area within a radius of 7 kilometers,17-19 and modern photogrammetry technologies make it possible not only to register the fact of pollution and obtain a portrait of the offender, but also to estimate, with high spatial and temporal resolution, the dimensions of the film and to trace the dynamics of its movement and transformation in a geographic coordinate system. The optical method of monitoring sea-surface pollution is of particular relevance at present, given the development of unmanned aerial vehicles, which are already equipped with video cameras and require only a minor upgrade of their video systems to enhance the contrast of images of organic films.

  20. Combining Digital Image Correlation and Acoustic Emission for Monitoring of the Strain Distribution until Yielding During Compression of Bovine Cancellous Bone

    NASA Astrophysics Data System (ADS)

    Tsirigotis, Athanasios; Deligianni, Despoina D.

    2017-12-01

    In this work, the surface heterogeneity in mechanical compressive strain of cancellous bone was investigated with digital image correlation (DIC). Moreover, the onset and progression of failure were studied by acoustic emission (AE). Cubic cancellous bone specimens with a side of 15 mm were obtained from bovine femur and kept frozen at -20 °C until testing. Specimen strain was analyzed by measuring the change of distance between the platens (crosshead) and via an optical method, by following the strain evolution with a camera. Simultaneously, AE monitoring was performed. The experiments showed that the compressive Young’s modulus determined from crosshead strain is underestimated by 23% in comparison to the optically determined strain. Moreover, the surface strain fields obtained by DIC displayed steep strain gradients, which can be attributed to cancellous bone porosity and inhomogeneity. The cumulative number of events for the total AE activity recorded by the sensors showed that the activity started at a mean load level of 36% of the maximum load, indicating the initiation of micro-cracking phenomena. Further experiments, determining 3D strain with μCT in addition to surface strain, are necessary to clarify the issue of strain inhomogeneity in cancellous bone.

  1. Wind turbine rotor blade monitoring using digital image correlation: a comparison to aeroelastic simulations of a multi-megawatt wind turbine

    NASA Astrophysics Data System (ADS)

    Winstroth, J.; Schoen, L.; Ernst, B.; Seume, J. R.

    2014-06-01

    Optical full-field measurement methods such as Digital Image Correlation (DIC) provide a new opportunity for measuring deformations and vibrations with high spatial and temporal resolution. However, application to full-scale wind turbines is not trivial: elaborate preparation of the experiment is vital, and sophisticated post-processing of the DIC results is essential. In the present study, a rotor blade of a 3.2 MW wind turbine is equipped with a random black-and-white dot pattern at four different radial positions. Two cameras are located in front of the wind turbine, and the response of the rotor blade is monitored using DIC for different turbine operations. In addition, a Light Detection and Ranging (LiDAR) system is used to measure the wind conditions. Wind fields are created based on the LiDAR measurements and used to perform aeroelastic simulations of the wind turbine by means of advanced multibody codes. The results from the optical DIC system appear plausible when checked against common and expected results. In addition, the comparison of relative out-of-plane blade deflections shows good agreement between DIC results and aeroelastic simulations.

  2. Time-Series Monitoring of Open Star Clusters

    NASA Astrophysics Data System (ADS)

    Hojaev, A. S.; Semakov, D. G.

    2006-08-01

    Star clusters, especially compact ones (a few to ten arcminutes in diameter), are suitable targets for searching for light variability across an ensemble of stars by means of an ordinary Cassegrain telescope plus a CCD system. Special patrolling with short, time-fixed exposures and mmag accuracy can also be used to study stellar oscillations in a group of stars simultaneously; this can be carried out both separately from one site and within international campaigns. Detection and study of the optical variability of X-ray sources, including X-ray binaries with compact objects, may also result from long-term monitoring of such clusters. We present the program of open star cluster monitoring with the Zeiss 1 meter RCC telescope of Maidanak Observatory, which has recently been automated. In combination with the quite good seeing at this observatory (see, e.g., Sarazin, M. 1999, URL http://www.eso.org/gen-fac/pubs/astclim/), the automated telescope, equipped with the available large-format (2K×2K) CCD camera AP-10, will allow the collection of homogeneous time series for analysis. We started this program in 2001 and obtained a set of patrol observations with the Zeiss 0.6 meter telescope and the AP-10 camera in 2003. Seven compact open clusters in the Milky Way (NGC 7801, King 1, King 13, King 18, King 20, Berkeley 55, IC 4996) have been monitored for stellar variability, and some photometric results will be presented. A few interesting variables have been discovered in these clusters for the first time, and dozens more are suspected of variability. We have taken steps to join the Whole-Earth Telescope effort in its future campaigns.

  3. Development of an all-in-one gamma camera/CCD system for safeguard verification

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Il; An, Su Jung; Chung, Yong Hyun; Kwak, Sung-Woo

    2014-12-01

    For the purpose of monitoring and verifying efforts at safeguarding radioactive materials in various fields, a new all-in-one gamma camera/charge-coupled device (CCD) system was developed. This combined system consists of a gamma camera, which gathers energy and position information on gamma-ray sources, and a CCD camera, which identifies the specific location in a monitored area. Therefore, 2-D image information and quantitative information regarding gamma-ray sources can be obtained using fused images. The gamma camera consists of a diverging collimator, a 22 × 22 array of CsI(Na) pixelated scintillation crystals with a pixel size of 2 × 2 × 6 mm³, and a Hamamatsu H8500 position-sensitive photomultiplier tube (PSPMT). The Basler scA640-70gc CCD camera, which delivers 70 frames per second at video graphics array (VGA) resolution, was employed. Performance testing was performed using a Co-57 point source 30 cm from the detector. The measured spatial resolution and sensitivity were 4.77 mm full width at half maximum (FWHM) and 7.78 cps/MBq, respectively. The energy resolution was 18% at 122 keV. These results demonstrate that the combined system has considerable potential for radiation monitoring.
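
    The image-fusion step, overlaying the coarse gamma map on the CCD scene, can be sketched as a nearest-neighbour upsample plus alpha blend. The blending scheme, channel choice, and resolutions below are illustrative assumptions, not the system's actual fusion algorithm.

```python
import numpy as np

def fuse(ccd_rgb, gamma_counts, alpha=0.5):
    """Overlay a gamma-camera intensity map (e.g. 22x22) on a CCD scene
    image by nearest-neighbour upsampling and alpha blending into the
    red channel."""
    H, W, _ = ccd_rgb.shape
    yi = np.arange(H) * gamma_counts.shape[0] // H
    xi = np.arange(W) * gamma_counts.shape[1] // W
    up = gamma_counts[np.ix_(yi, xi)]
    up = up / up.max() if up.max() > 0 else up   # normalize to [0, 1]
    overlay = ccd_rgb.astype(float).copy()
    overlay[..., 0] = (1 - alpha) * overlay[..., 0] + alpha * 255.0 * up
    return overlay.clip(0, 255)

ccd = np.zeros((44, 44, 3))           # synthetic dark scene
gamma = np.zeros((22, 22))
gamma[10, 10] = 50.0                  # a single hot gamma-ray source
fused = fuse(ccd, gamma)
```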

  4. The Brazilian wide field imaging camera (WFI) for the China/Brazil earth resources satellite: CBERS 3 and 4

    NASA Astrophysics Data System (ADS)

    Scaduto, L. C. N.; Carvalho, E. G.; Modugno, R. G.; Cartolano, R.; Evangelista, S. H.; Segoria, D.; Santos, A. G.; Stefani, M. A.; Castro Neto, J. C.

    2017-11-01

    The purpose of this paper is to present the optical system developed for the Wide Field Imaging Camera (WFI) that will be integrated into the CBERS 3 and 4 satellites (China-Brazil Earth Resources Satellite). This camera will be used for remote sensing of the Earth and is designed to operate at an altitude of 778 km. The optical system is designed for four spectral bands covering the range of wavelengths from blue to near infrared, and its field of view is +/-28.63°, which covers 866 km with a ground resolution of 64 m at nadir. WFI has been developed by a consortium formed by Opto Electrônica S. A. and Equatorial Sistemas. In particular, we present the optical analysis based on the Modulation Transfer Function (MTF) obtained during the Engineering Model (EM) phase and the optical tests performed to evaluate the requirements. Measurements of the optical system MTF were performed using an interferometer at a wavelength of 632.8 nm, and global MTF tests (including the CCD and the signal-processing electronics) were performed using a collimator with a slit target. The results obtained show that the performance of the optical system meets the requirements of the project.

  5. Cryogenic optical systems for the rapid infrared imager/spectrometer (RIMAS)

    NASA Astrophysics Data System (ADS)

    Capone, John I.; Content, David A.; Kutyrev, Alexander S.; Robinson, Frederick D.; Lotkin, Gennadiy N.; Toy, Vicki L.; Veilleux, Sylvain; Moseley, Samuel H.; Gehrels, Neil A.; Vogel, Stuart N.

    2014-07-01

    The Rapid Infrared Imager/Spectrometer (RIMAS) is designed to perform follow-up observations of transient astronomical sources at near-infrared (NIR) wavelengths (0.9 - 2.4 microns). In particular, RIMAS will be used to perform photometric and spectroscopic observations of gamma-ray burst (GRB) afterglows to complement the Swift satellite's science goals. Upon completion, RIMAS will be installed on Lowell Observatory's 4.3 meter Discovery Channel Telescope (DCT) located in Happy Jack, Arizona. The instrument's optical design includes a collimator lens assembly, a dichroic to divide the wavelength coverage into two optical arms (0.9 - 1.4 microns and 1.4 - 2.4 microns, respectively), and a camera lens assembly for each optical arm. Because the wavelength coverage extends out to 2.4 microns, all optical elements are cooled to ~70 K. Filters and transmission gratings are located on wheels ahead of each camera, allowing the instrument to be quickly configured for photometry or spectroscopy. An athermal optomechanical design is being implemented to prevent the lenses from losing their room-temperature alignment as the system is cooled; the thermal expansion of the materials used in this design has been measured in the lab. Additionally, RIMAS has a guide camera consisting of four lenses to aid observers in passing light from target sources through the spectroscopic slits. Efforts to align these optics are ongoing.

  6. Optical Arc-Length Sensor For TIG Welding

    NASA Technical Reports Server (NTRS)

    Smith, Matthew A.

    1990-01-01

    A proposed subsystem of a tungsten/inert-gas (TIG) welding system measures the length of the welding arc optically. The arc is viewed by a video camera in one of three alternative optical configurations, and the arc length is measured directly rather than inferred from the voltage.

  7. A Feasibility Study on the Use of a Structured Light Depth-Camera for Three-Dimensional Body Measurements of Dairy Cows in Free-Stall Barns

    PubMed Central

    2018-01-01

    Frequent checks on livestock body growth can help reduce problems related to cow infertility and other welfare implications, and help recognize health anomalies. In the last ten years, optical methods have been proposed to extract information on various parameters while avoiding direct contact with the animals' bodies, which generally causes stress. This research aims to evaluate a new monitoring system suitable for frequently checking calf and cow growth through a three-dimensional analysis of portions of their bodies. The system is based on multiple acquisitions from a low-cost structured-light depth camera (Microsoft Kinect™ v1). The metrological performance of the instrument is proved through an uncertainty analysis and a proper calibration procedure. The paper reports the application of the depth camera to the extraction of different body parameters. An expanded uncertainty ranging between 3 and 15 mm is reported in the case of ten repeated measurements. Coefficients of determination R² > 0.84 and deviations lower than 6% from manual measurements were in general detected in the case of head size, hip distance, withers-to-tail length, chest girth, hip height, and withers height. Conversely, lower performance was observed in the case of animal depth (R² = 0.74) and back slope (R² = 0.12). PMID:29495290

  8. Establishment of Imaging Spectroscopy of Nuclear Gamma-Rays based on Geometrical Optics.

    PubMed

    Tanimori, Toru; Mizumura, Yoshitaka; Takada, Atsushi; Miyamoto, Shohei; Takemura, Taito; Kishimoto, Tetsuro; Komura, Shotaro; Kubo, Hidetoshi; Kurosawa, Shunsuke; Matsuoka, Yoshihiro; Miuchi, Kentaro; Mizumoto, Tetsuya; Nakamasu, Yuma; Nakamura, Kiseki; Parker, Joseph D; Sawano, Tatsuya; Sonoda, Shinya; Tomono, Dai; Yoshikawa, Kei

    2017-02-03

    Since the discovery of nuclear gamma-rays, their imaging has been limited to pseudo imaging techniques such as the Compton Camera (CC) and the coded mask. Pseudo imaging does not preserve physical information (intensity, or brightness in optics) along a ray, and is thus capable of no more than qualitative imaging of bright objects. To attain quantitative imaging, a camera that realizes geometrical optics is essential, which for nuclear MeV gamma-rays is only possible via complete reconstruction of the Compton process. Recently we have revealed that the "Electron Tracking Compton Camera" (ETCC) provides a well-defined Point Spread Function (PSF). The information of an incoming gamma-ray is kept along a ray with the PSF, which is equivalent to geometrical optics. Here we present an imaging-spectroscopic measurement with the ETCC. Our results highlight the intrinsic difficulty CCs have in performing accurate imaging, and show that the ETCC surmounts this problem. The imaging capability also helps the ETCC suppress the noise level dramatically, by ~3 orders of magnitude, without a shielding structure. Furthermore, full reconstruction of the Compton process with the ETCC provides spectra free of Compton edges. These results mark the first proper imaging of nuclear gamma-rays based on genuine geometrical optics.

  9. Diffraction-based optical sensor detection system for capture-restricted environments

    NASA Astrophysics Data System (ADS)

    Khandekar, Rahul M.; Nikulin, Vladimir V.

    2008-04-01

    The use of digital cameras and camcorders in prohibited areas presents a growing problem. Piracy in movie theaters results in huge revenue losses to the motion picture industry every year, but still-image and video capture may present an even bigger threat if performed in high-security locations. While several attempts are being made to address this issue, an effective solution is yet to be found. We propose to approach this problem using a very commonly observed optical phenomenon. Cameras and camcorders use CCD and CMOS sensors, which include a number of photosensitive elements/pixels arranged in a certain fashion: photosites in CCD sensors and semiconductor elements in CMOS sensors. They are known to reflect a small fraction of incident light, but can also act as a diffraction grating, producing an optical response that can be utilized to identify the presence of such a sensor. A laser-based detection system is proposed that accounts for the elements in the optical train of the camera, as well as the eye safety of people who could be exposed to the optical beam. This paper presents preliminary experimental data, as well as proof-of-concept simulation results.
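
    The detection principle rests on the grating equation: a periodic pixel grid of pitch d reflects a probe beam into discrete orders at angles satisfying d sin θ = mλ at normal incidence. A small sketch, with assumed pitch and wavelength values (the paper's actual parameters are not given in the abstract):

```python
import math

# The pixel grid of a CCD/CMOS sensor can act as a reflective diffraction
# grating: d * sin(theta_m) = m * lambda at normal incidence.
# Pixel pitch and laser wavelength below are assumed illustrative values.

def diffraction_angle_deg(pitch_m, wavelength_m, order):
    s = order * wavelength_m / pitch_m
    if abs(s) > 1.0:
        return None  # that order does not propagate
    return math.degrees(math.asin(s))

pitch = 6.0e-6        # 6 um pixel pitch (assumed)
wavelength = 532e-9   # green probe laser (assumed)

for m in (1, 2, 3):
    print(m, diffraction_angle_deg(pitch, wavelength, m))
```

    Detecting a return at these specific, widely spaced angles distinguishes a pixelated sensor from an ordinary specular reflector such as eyeglasses.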

  10. A Simple Spectrophotometer Using Common Materials and a Digital Camera

    ERIC Educational Resources Information Center

    Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal

    2011-01-01

    A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a…
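
    The calibration step mentioned above can be as simple as a two-point linear fit through known reference lines in the camera image; the pixel positions below are hypothetical:

```python
# Two-point wavelength calibration for a DIY camera spectrophotometer:
# map pixel column to wavelength with a line through two known reference
# emission lines. Pixel positions below are hypothetical examples.

def make_calibration(px1, wl1, px2, wl2):
    slope = (wl2 - wl1) / (px2 - px1)
    return lambda px: wl1 + slope * (px - px1)

# e.g. two mercury lines from a CFL reference lamp (assumed pixel columns):
pixel_to_nm = make_calibration(312, 546.1, 508, 611.6)

print(round(pixel_to_nm(400), 1))  # wavelength at pixel 400 -> 575.5 nm
```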

  11. Surveillance Cameras and Their Use as a Dissecting Microscope in the Teaching of Biological Sciences

    ERIC Educational Resources Information Center

    Vale, Marcus R.

    2016-01-01

    Surveillance cameras are prevalent in various public and private areas, and they can also be coupled to optical microscopes and telescopes with excellent results. They are relatively simple cameras without sophisticated technological features and are much less expensive and more accessible to many people. These features enable them to be used in…

  12. Creating and Using a Camera Obscura

    ERIC Educational Resources Information Center

    Quinnell, Justin

    2012-01-01

    The camera obscura (Latin for "darkened room") is the earliest optical device and goes back over 2500 years. The small pinhole or lens at the front of the room allows light to enter and this is then "projected" onto a screen inside the room. This differs from a camera, which projects its image onto light-sensitive material.…

  13. Video monitoring system for car seat

    NASA Technical Reports Server (NTRS)

    Elrod, Susan Vinz (Inventor); Dabney, Richard W. (Inventor)

    2004-01-01

    A video monitoring system for use with a child car seat has video camera(s) mounted in the car seat. The video images are wirelessly transmitted to a remote receiver/display encased in a portable housing that can be removably mounted in the vehicle in which the car seat is installed.

  14. Improved head-controlled TV system produces high-quality remote image

    NASA Technical Reports Server (NTRS)

    Goertz, R.; Lindberg, J.; Mingesz, D.; Potts, C.

    1967-01-01

    Manipulator operator uses an improved-resolution TV camera/monitor positioning system to view the remote handling and processing of reactive, flammable, explosive, or contaminated materials. The pan and tilt motions of the camera and monitor are slaved to follow the corresponding motions of the operator's head.

  15. Radiation-Triggered Surveillance for UF6 Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Michael M.

    2015-12-01

    This paper recommends the use of radiation detectors, singly or in sets, to trigger surveillance cameras. Ideally, the cameras will monitor cylinders transiting the process area as well as the process area itself. The general process area will be surveyed to record how many cylinders have been attached to and detached from the process between inspections. Rad-triggered cameras can dramatically reduce the quantity of recorded images, because the movement of personnel and equipment not involving UF6 cylinders will not generate a surveillance review file.
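
    The triggering logic amounts to keeping only frames coincident with an elevated count rate. A minimal sketch, with an assumed background rate and trigger margin (the paper does not specify these values):

```python
# Sketch of rad-triggered recording: keep only frames whose detector count
# rate exceeds background by a margin, so personnel/equipment movement
# without a UF6 cylinder generates no review file.
# Background rate and trigger margin are assumed values.

BACKGROUND_CPS = 40.0
THRESHOLD_CPS = BACKGROUND_CPS * 3.0  # assumed trigger margin

def frames_to_record(count_rates):
    """Return indices of frames coincident with an elevated count rate."""
    return [i for i, cps in enumerate(count_rates) if cps > THRESHOLD_CPS]

# A cylinder passes the detector around samples 3-5:
rates = [38, 41, 44, 390, 420, 180, 39, 37]
print(frames_to_record(rates))  # -> [3, 4, 5]
```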

  16. A scale-up field experiment for the monitoring of a burning process using chemical, audio, and video sensors.

    PubMed

    Stavrakakis, P; Agapiou, A; Mikedi, K; Karma, S; Statheropoulos, M; Pallis, G C; Pappa, A

    2014-01-01

    Fires are becoming more violent and frequent, resulting in major economic losses and long-lasting effects on communities and ecosystems; thus, efficient fire monitoring is becoming a necessity. A novel triple multi-sensor approach was developed for monitoring and studying the burning of dry forest fuel in a scheduled open-field experiment; chemical, optical, and acoustical sensors were combined to record the fire spread. The results of this integrated field campaign for real-time monitoring of the fire event are presented and discussed. Chemical analysis, despite its limitations, corresponded to the burning process with a minor time delay. Nevertheless, the evolution profiles of CO2, CO, NO, and O2 were detected and monitored. The chemical monitoring of smoke components enabled the observation of the different fire phases (flaming, smoldering) based on the emissions identified in each phase. The analysis of fire acoustical signals presented an accurate and timely response to the fire event. In the same context, the use of a thermographic camera for monitoring the biomass burning was also considerable (both profiles of the intensities of the average gray and red components greater than 230) and presented similarly promising potential to the audio results. Further work is needed towards integrating sensor signals for automation purposes, leading to potential applications in real situations.

  17. 3-D endoscopic imaging using plenoptic camera.

    PubMed

    Le, Hanh N D; Decker, Ryan; Opferman, Justin; Kim, Peter; Krieger, Axel; Kang, Jin U

    2016-06-01

    Three-dimensional endoscopic imaging using the plenoptic technique combined with an F-matching algorithm has been pursued in this study. Custom relay optics were designed to integrate a commercial straight surgical endoscope with a plenoptic camera.

  18. Near-infrared imaging of developmental defects in dental enamel.

    PubMed

    Hirasuna, Krista; Fried, Daniel; Darling, Cynthia L

    2008-01-01

    Polarization-sensitive optical coherence tomography (PS-OCT) and near-infrared (NIR) imaging are promising new technologies under development for monitoring early carious lesions. Fluorosis is a growing problem in the United States, and the more prevalent mild fluorosis can be visually mistaken for early enamel demineralization. Unfortunately, there is little quantitative information available regarding the differences in optical properties of sound enamel, enamel developmental defects, and caries. Thirty extracted human teeth with various degrees of suspected fluorosis were imaged using PS-OCT and NIR. An InGaAs camera and a NIR diode laser were used to measure the optical attenuation through transverse tooth sections (approximately 200 microm). A digital microradiography system was used to quantify the enamel defect severity by measurement of the relative mineral loss for comparison with optical scattering measurements. Developmental defects were clearly visible in the polarization-resolved OCT images, demonstrating that PS-OCT can be used to nondestructively measure the depth and possible severity of the defects. Enamel defects on whole teeth that could be imaged with high contrast with visible light were transparent in the NIR. This study suggests that PS-OCT and NIR methods may potentially be used as tools to assess the severity and extent of enamel defects.

  19. Simultaneous estimation of arterial and venous oxygen saturation using a camera

    NASA Astrophysics Data System (ADS)

    van Gastel, Mark; Liang, Hangbing; Stuijk, Sander; de Haan, Gerard

    2018-02-01

    Optical monitoring of arterial blood oxygenation, SpO2, using cameras has recently been shown feasible by measuring the relative amplitudes of remotely sensed PPG waveforms captured at different wavelengths. SvO2 measures the venous blood oxygenation, which together with SpO2 provides an indication of tissue oxygen consumption. In contrast to SpO2, it usually still requires a blood sample from a pulmonary artery catheter. In this work we present a method for the simultaneous estimation of SpO2 and SvO2 with a camera. Contrary to earlier work, our method does not require external cuffs, leading to better usability and improved comfort. Since the arterial blood varies synchronously with the heart rate, all frequencies outside the heart-rate band are typically filtered out for SpO2 measurements. For SvO2 estimation, we include intensity variations in the respiratory frequency range, since respiration modulates venous blood through intrathoracic pressure variations in the chest and abdomen. Consequently, under static conditions, the two dominant components in the PPG signals are respiration and pulse. By measuring the amplitude ratios of these components, it seems possible to monitor both SpO2 and SvO2 continuously. We asked healthy subjects to follow an auditory breathing pattern while recording the face and hand. Results show a difference between estimated SpO2 and SvO2 values in the range of 5-30 percent for both anatomical locations, which is normal for healthy people. This continuous, non-contact method shows promise to alert the clinician to a change in patient condition sooner than SpO2 alone.
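
    The band-separation idea can be illustrated on a synthetic PPG trace: one component at a respiratory rate and one at a cardiac rate, with each band's amplitude read from a single-bin discrete Fourier transform. Frequencies, amplitudes, and frame rate are illustrative, not the paper's data:

```python
import math

# Synthetic PPG with a respiratory (0.25 Hz) and a cardiac (1.2 Hz)
# component; read each band's amplitude from a single-bin DFT.
# All values are illustrative assumptions.

FS = 20.0          # frames per second (assumed camera rate)
N = 400            # 20 s analysis window
resp_f, pulse_f = 0.25, 1.2

signal = [0.8 * math.sin(2 * math.pi * resp_f * n / FS)
          + 0.3 * math.sin(2 * math.pi * pulse_f * n / FS)
          for n in range(N)]

def band_amplitude(x, freq_hz):
    """Single-bin DFT amplitude at freq_hz (assumes an integer bin)."""
    k = round(freq_hz * len(x) / FS)
    re = sum(v * math.cos(2 * math.pi * k * n / len(x)) for n, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * k * n / len(x)) for n, v in enumerate(x))
    return 2.0 * math.hypot(re, im) / len(x)

resp_amp = band_amplitude(signal, resp_f)    # venous/respiratory band
pulse_amp = band_amplitude(signal, pulse_f)  # arterial/cardiac band
print(round(resp_amp, 2), round(pulse_amp, 2))  # recovers 0.8 and 0.3
```

    In the paper's scheme these per-band amplitudes would be measured at multiple wavelengths and their ratios mapped to SpO2 and SvO2; the mapping itself is not shown here.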

  20. Development of a safe ground to space laser propagation system for the optical communications telescope laboratory

    NASA Technical Reports Server (NTRS)

    Wu, Janet P.

    2003-01-01

    Furthering pursuits in high-bandwidth communications to future NASA deep-space and near-Earth probes, the Jet Propulsion Laboratory (JPL) is building the Optical Communications Telescope Laboratory (OCTL) atop Table Mountain in Southern California. This R&D optical antenna will be used to develop optical communication strategies for future optical ground stations. Initial experiments to be conducted include propagating high-powered, Q-switched laser beams to retro-reflecting satellites. Yet laser beam propagation from the ground to space is under the cognizance of various government agencies, namely: the Occupational Safety and Health Administration (OSHA), responsible for protecting workforce personnel; the Federal Aviation Administration (FAA), responsible for protecting pilots and aircraft; and the Laser Clearinghouse of Space Command, responsible for protecting space assets. To ensure that laser beam propagation from the OCTL and future autonomously operated ground stations complies with the guidelines of these organizations, JPL is developing a multi-tiered safety system that will meet the coordination, monitoring, and reporting functions required by the agencies. At Tier 0, laser operators will meet OSHA safety standards for protection, and access to the high-power laser area will be restricted and interlocked. Tier 1, the area extending from the telescope dome out to a range of 3.4 km, will utilize long-wave infrared camera sensors to alert operators to at-risk aircraft in FAA-controlled airspace. Tier 2, extending from 3.4 km out to the aircraft service ceiling in FAA airspace, will detect at-risk aircraft by radar. Lastly, beam propagation into space, defined as Tier 3, will require coordination with the Laser Clearinghouse. A detailed description of the four tiers is presented along with the design of the integrated monitoring and beam transmission control system.

  1. The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation

    NASA Astrophysics Data System (ADS)

    Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team

    2002-12-01

    The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024x2048 Mitel frame-transfer CCD detector arrays, each having a 1024x1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel, or a rectangular FOV of 16° x 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with the adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from -55°C to +5°C. 
An onboard calibration target and fiducial marks provide the ability to validate the radiometric and geometric calibration on Mars. Pancam relies heavily on use of the JPL ICER wavelet compression algorithm to maximize data return within stringent mission downlink limits. The scientific goals of the Pancam investigation are to: (a) obtain monoscopic and stereoscopic image mosaics to assess the morphology, topography, and geologic context of each MER landing site; (b) obtain multispectral visible to short-wave near-IR images of selected regions to determine surface color and mineralogic properties; (c) obtain multispectral images over a range of viewing geometries to constrain surface photometric and physical properties; and (d) obtain images of the Martian sky, including direct images of the Sun, to determine dust and aerosol opacity and physical properties. In addition, Pancam also serves a variety of operational functions on the MER mission, including (e) serving as the primary Sun-finding camera for rover navigation; (f) resolving objects on the scale of the rover wheels to distances of ~100 m to help guide navigation decisions; (g) providing stereo coverage adequate for the generation of digital terrain models to help guide and refine rover traverse decisions; (h) providing high resolution images and other context information to guide the selection of the most interesting in situ sampling targets; and (i) supporting acquisition and release of exciting E/PO products.
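
    As a consistency check of the optics numbers quoted in the abstract: an assumed 12-micron pixel pitch (not stated above) behind the 42 mm focal length reproduces the quoted ~0.28 mrad/pixel IFOV, and 1024 active pixels then span roughly the stated 16-degree field:

```python
import math

# Back-of-envelope check of the Pancam geometry.
# The 12-micron pixel pitch is an assumption, not a value from the abstract.

focal_mm = 42.0
pixel_um = 12.0            # assumed CCD pixel pitch
active_px = 1024

ifov_mrad = pixel_um * 1e-3 / focal_mm * 1e3   # mrad per pixel
fov_deg = math.degrees(active_px * ifov_mrad * 1e-3)

print(round(ifov_mrad, 3), round(fov_deg, 1))  # ~0.286 mrad, ~16.8 deg
```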

  2. Demonstration of a real-time interferometer as a bunch-length monitor in a high-current electron beam accelerator.

    PubMed

    Thangaraj, J; Andonian, G; Thurman-Keup, R; Ruan, J; Johnson, A S; Lumpkin, A; Santucci, J; Maxwell, T; Murokh, A; Ruelas, M; Ovodenko, A

    2012-04-01

    A real-time interferometer (RTI) has been developed to monitor the bunch length of an electron beam in an accelerator. The RTI employs spatial autocorrelation, reflective optics, and a fast-response pyro-detector array to obtain a real-time autocorrelation trace of the coherent radiation from an electron beam, thus providing the possibility of online bunch-length diagnostics. A complete RTI system has been commissioned at the A0 photoinjector facility to measure sub-mm bunches at 13 MeV. Bunch length variation (FWHM) between 0.8 ps (~0.24 mm) and 1.5 ps (~0.45 mm) has been measured and compared with a Martin-Puplett interferometer and a streak camera. The comparisons show that RTI is a viable, complementary bunch length diagnostic for sub-mm electron bunches. © 2012 American Institute of Physics
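
    The ps-to-mm equivalences quoted in the abstract are just multiplication by the speed of light:

```python
# Convert bunch duration to bunch length: light travels ~0.3 mm per ps.

C_MM_PER_PS = 0.299792458  # mm traveled by light in 1 ps

def ps_to_mm(t_ps):
    return t_ps * C_MM_PER_PS

print(round(ps_to_mm(0.8), 2), round(ps_to_mm(1.5), 2))  # 0.24 0.45
```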

  3. Demonstration of a real-time interferometer as a bunch-length monitor in a high-current electron beam accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thangaraj, J.; Thurman-Keup, R.; Ruan, J.

    2012-03-01

    A real-time interferometer (RTI) has been developed to monitor the bunch length of an electron beam in an accelerator. The RTI employs spatial autocorrelation, reflective optics, and a fast-response pyro-detector array to obtain a real-time autocorrelation trace of the coherent radiation from an electron beam, thus providing the possibility of online bunch-length diagnostics. A complete RTI system has been commissioned at the A0 photoinjector facility to measure sub-mm bunches at 13 MeV. Bunch length variation (FWHM) between 0.8 ps (~0.24 mm) and 1.5 ps (~0.45 mm) has been measured and compared with a Martin-Puplett interferometer and a streak camera. The comparisons show that RTI is a viable, complementary bunch length diagnostic for sub-mm electron bunches.

  4. Demonstration of a real-time interferometer as a bunch-length monitor in a high-current electron beam accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thangaraj, J.; Thurman-Keup, R.; Ruan, J.

    2012-04-15

    A real-time interferometer (RTI) has been developed to monitor the bunch length of an electron beam in an accelerator. The RTI employs spatial autocorrelation, reflective optics, and a fast-response pyro-detector array to obtain a real-time autocorrelation trace of the coherent radiation from an electron beam, thus providing the possibility of online bunch-length diagnostics. A complete RTI system has been commissioned at the A0 photoinjector facility to measure sub-mm bunches at 13 MeV. Bunch length variation (FWHM) between 0.8 ps (~0.24 mm) and 1.5 ps (~0.45 mm) has been measured and compared with a Martin-Puplett interferometer and a streak camera. The comparisons show that RTI is a viable, complementary bunch length diagnostic for sub-mm electron bunches.

  5. Ultraviolet laser beam monitor using radiation responsive crystals

    DOEpatents

    McCann, Michael P.; Chen, Chung H.

    1988-01-01

    An apparatus and method for monitoring an ultraviolet laser beam includes disposing in the path of an ultraviolet laser beam a substantially transparent crystal that will produce a color pattern in response to ultraviolet radiation. The crystal is exposed to the ultraviolet laser beam, and a color pattern is produced within the crystal corresponding to the laser beam intensity distribution therein. The crystal is then exposed to visible light, and the color pattern is observed by means of the visible light to determine the characteristics of the laser beam that passed through the crystal. In this manner, a perpendicular cross-sectional intensity profile and a longitudinal intensity profile of the ultraviolet laser beam may be determined. The observation of the color pattern may be made with forward- or back-scattered light and may be made with the naked eye or with optical systems such as microscopes and television cameras.

  6. The NACA High-Speed Motion-Picture Camera Optical Compensation at 40,000 Photographs Per Second

    NASA Technical Reports Server (NTRS)

    Miller, Cearcy D

    1946-01-01

    The principle of operation of the NACA high-speed camera is completely explained. This camera, operating at the rate of 40,000 photographs per second, took the photographs presented in numerous NACA reports concerning combustion, preignition, and knock in the spark-ignition engine. Many design details are presented and discussed; details of an entirely conventional nature are omitted. The inherent aberrations of the camera are discussed and partly evaluated. The focal-plane-shutter effect of the camera is explained. Photographs of the camera are presented. Some high-speed motion pictures of familiar objects -- photoflash bulb, firecrackers, camera shutter -- are reproduced as an illustration of the quality of the photographs taken by the camera.

  7. Coordinated Global Measurements of TLEs from the Space Shuttle and Ground Stations during MEIDEX

    NASA Astrophysics Data System (ADS)

    Yair, Y.; Price, C.; Levin, Z.; Israelevitch, P.; Devir, A.; Ziv, B.; Jospeh, J.; Mekler, Y.

    2001-12-01

    The Mediterranean Israeli Dust Experiment (MEIDEX) is scheduled to fly on board Columbia in May 2002, in a 39° inclination orbit for 16 days, passing over the major thunderstorm regions on Earth. The primary science instrument is a Xybion IMC-201 image-intensified radiometric camera with 6 narrow-band filters (340 nm, 380 nm, 470 nm, 555 nm, 665 nm, 860 nm). A Sekai color video camera serves as a boresighted wide-FOV viewfinder. The cameras are mounted on a single-axis gimbal with a cross-track scan of ±22°, inside a pressurized canister sealed with a coated quartz window that is mounted in the shuttle cargo bay. Data will be recorded on 3 digital VCRs and downlinked to the ground. During the night side of the orbit there will be dedicated observations toward the Earth's limb above areas of active thunderstorms, in an effort to image TLEs from space. While earlier shuttle flights have succeeded in recording several ionospheric discharges using cargo-bay video cameras, MEIDEX offers a unique opportunity to conduct targeted observations with a calibrated, multispectral instrument. The Xybion camera has a rectangular FOV of 14.04° (H) x 10.76° (V), which covers a volume of 466 km (H) x 358 km (V) at the Earth's limb, 1900 km from the shuttle. The spatial resolution is 665 m (H) x 745 m (V) per pixel, making it possible to resolve some structural features of TLEs. Optical observations from space will be conducted with the 665 nm filter, which matches the observed wide peak centered at 670 nm that typifies red sprites, and also with the 380 and 470 nm filters to record blue jets. Observations will consist of continuous recording of the Earth's limb, from the direction of the dusk terminator towards the night side. Areas of high convective activity will be forecast using global aviation SIG maps and uplinked to the crew before the observation. 
The astronaut will direct the camera toward areas with lightning activity, observed visually through the windows and on monitors in the crew cabin. Simultaneously with the optical observations from space, dedicated ground measurements will be conducted on a global scale. Two field sites in the Negev Desert in Israel will be used to collect electromagnetic data in the ELF and VLF frequency range. Additional ground stations in Germany, Hungary, USA, Antarctica, Chile, South Africa, Australia, Taiwan and Japan will also record Schumann Resonance and VLF signals. The coordinated measurements from various locations on Earth and from space will enable us to triangulate the location, and determine the polarity and charge moment of the parent lightning of the optically observed TLEs. The success of the campaign will further clarify the global picture of TLE occurrence.

  8. MTF measurements on real time for performance analysis of electro-optical systems

    NASA Astrophysics Data System (ADS)

    Stuchi, Jose Augusto; Signoreto Barbarini, Elisa; Vieira, Flavio Pascoal; dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fatima Maria Mitsue; Castro Neto, Jarbas C.; Linhari Rodrigues, Evandro Luis

    2012-06-01

    The need for methods and tools that assist in determining the performance of optical systems is currently increasing. One of the most widely used methods for analyzing optical systems is measuring the Modulation Transfer Function (MTF). The MTF represents a direct and quantitative verification of image quality. This paper presents the implementation of software to calculate the MTF of electro-optical systems. The software was used to calculate the MTF of a digital fundus camera, a thermal imager, and an ophthalmologic surgery microscope. The MTF information aids the analysis of alignment and the measurement of optical quality, and also defines the limiting resolution of optical systems. The results obtained with the fundus camera and thermal imager were compared with theoretical values. For the microscope, the results were compared with the MTF measured for a Zeiss microscope, which is the quality standard for ophthalmological microscopes.
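
    A common way such software computes the MTF is the edge method: differentiate an edge spread function (ESF) to get the line spread function (LSF), then take the normalized magnitude of its Fourier transform. A minimal sketch on a synthetic Gaussian-blurred edge (not data from the systems tested in the paper):

```python
import math

# Minimal MTF illustration: synthetic ESF (ideal step blurred by a
# Gaussian, via erf) -> LSF (discrete derivative) -> normalized |DFT|.
# Blur width and array size are assumed illustrative values.

SIGMA = 1.5  # blur in pixels (assumed)
N = 64

esf = [0.5 * (1.0 + math.erf((i - N / 2) / (SIGMA * math.sqrt(2))))
       for i in range(N)]

# LSF = discrete derivative of the ESF.
lsf = [esf[i + 1] - esf[i] for i in range(N - 1)]

def mtf(lsf, k):
    """|DFT| of the LSF at bin k, normalized so MTF(0) = 1."""
    n = len(lsf)
    re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
    im = sum(v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
    dc = sum(lsf)
    return math.hypot(re, im) / dc

print(round(mtf(lsf, 0), 3), round(mtf(lsf, 4), 3))  # contrast falls with frequency
```

    Production implementations typically use a slanted edge and supersampling (as in ISO 12233) to get sub-pixel sampling of the ESF; that refinement is omitted here.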

  9. Procedure Enabling Simulation and In-Depth Analysis of Optical Effects in Camera-Based Time-of-Flight Sensors

    NASA Astrophysics Data System (ADS)

    Baumgart, M.; Druml, N.; Consani, M.

    2018-05-01

    This paper presents a simulation approach for Time-of-Flight cameras to estimate sensor performance and accuracy, as well as to help in understanding experimentally discovered effects. The main scope is the detailed simulation of the optical signals. We use a raytracing-based approach with the optical path length as the master parameter for depth calculations. The procedure is described in detail with references to our implementation in Zemax OpticStudio and Python. Our simulation approach supports multiple and extended light sources and allows accounting for all effects within the geometrical-optics model. In particular, multi-object reflection/scattering ray paths, translucent objects, and aberration effects (e.g. distortion caused by the ToF lens) are supported. The optical path length approach also enables the implementation of different ToF sensor types and transient imaging evaluations. The main features are demonstrated on a simple 3D test scene.
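
    The optical-path-length master parameter maps directly onto, for example, a continuous-wave ToF depth estimate: the modulation phase accumulated along the ray path gives the measured distance. A hedged sketch of that mapping; the 20 MHz modulation frequency is an assumed value, and the paper's actual sensor models are not reproduced here:

```python
import math

# Phase-based CW-ToF depth from a ray's total optical path length.
# Modulation frequency is an assumed illustrative value.

C = 299792458.0        # m/s
F_MOD = 20e6           # 20 MHz modulation (assumed)

def depth_from_path(path_length_m):
    """CW-ToF depth estimate for a ray of given total optical path."""
    phase = (2 * math.pi * F_MOD * path_length_m / C) % (2 * math.pi)
    return C * phase / (4 * math.pi * F_MOD)

# A direct return from a wall 3 m away travels 6 m of optical path:
print(round(depth_from_path(6.0), 3))  # 3.0
```

    Multipath rays (the multi-object reflection paths the paper simulates) carry longer optical paths, so their phase contributions bias this estimate; that is exactly the kind of effect the raytracing procedure is built to expose.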

  10. High-speed spectral domain polarization- sensitive optical coherence tomography using a single camera and an optical switch at 1.3 microm.

    PubMed

    Lee, Sang-Won; Jeong, Hyun-Woo; Kim, Beop-Min

    2010-01-01

    We propose high-speed spectral domain polarization-sensitive optical coherence tomography (SD-PS-OCT) using a single camera and a 1x2 optical switch in the 1.3-microm region. The PS low-coherence interferometer used in the system is constructed with free-space optics. The reflected horizontal and vertical polarization light rays are delivered via the optical switch to a single spectrometer in turn. Therefore, our system costs less to build than those that use dual spectrometers, and the timing and triggering processes are simpler from the viewpoints of both hardware and software. Our SD-PS-OCT has a sensitivity of 101.5 dB, an axial resolution of 8.2 microm, and an acquisition speed of 23,496 A-scans per second. We obtain intensity, phase retardation, and fast-axis orientation images of a rat tail tendon ex vivo.

  11. New Optical Sensing Materials for Application in Marine Research

    NASA Astrophysics Data System (ADS)

    Borisov, S.; Klimant, I.

    2012-04-01

    Optical chemosensors are versatile analytical tools which find application in numerous fields of science and technology. They have proved to be a promising alternative to electrochemical methods and are applied increasingly often in marine research. However, not all state-of-the-art optical chemosensors are suitable for these demanding applications, since they do not fully fulfil the requirements of high luminescence brightness and high chemical and photochemical stability, or their spectral properties are not adequate. Therefore, the development of new advanced sensing materials is still of utmost importance. Here we present a set of novel optical sensing materials recently developed at the Institute of Analytical Chemistry and Food Chemistry which are optimized for marine applications. In particular, we present new NIR indicators and sensors for oxygen and pH which feature high brightness and a low level of autofluorescence. The oxygen sensors rely on highly photostable metal complexes of benzoporphyrins and azabenzoporphyrins and enable several important applications, such as simultaneous monitoring of oxygen and chlorophyll or ultra-fast oxygen monitoring (eddy correlation). We have also developed ultra-sensitive oxygen optodes which enable monitoring in the nM range and are primarily designed for the investigation of oxygen minimum zones. The dynamic range of our new NIR pH indicators based on aza-BODIPY dyes is optimized for the marine environment. A highly sensitive NIR luminescent phosphor (chromium(III)-doped yttrium aluminium borate) can be used for non-invasive temperature measurements. Notably, the oxygen, pH, and temperature sensors are fully compatible with commercially available fiber-optic readers (Firesting from PyroScience). An optical CO2 sensor for marine applications employs novel diketopyrrolopyrrole indicators and enables ratiometric imaging using a CCD camera. 
Oxygen, pH and temperature sensors suitable for lifetime and ratiometric imaging of analytes distribution are also realized. To enable versatility of applications we also obtained a range of nano- and microparticles suitable for intra- and extracellular imaging of the above analytes. Bright ratiometric 2-photon-excitable probes were also developed. Magnetic microparticles are demonstrated to be very promising tools for imaging of oxygen, temperature and other parameters in biofilms, corals etc. since they combine the sensing function with the possibility of external manipulation.

  12. A multipurpose camera system for monitoring Kīlauea Volcano, Hawai'i

    USGS Publications Warehouse

    Patrick, Matthew R.; Orr, Tim R.; Lee, Lopaka; Moniz, Cyril J.

    2015-01-01

    We describe a low-cost, compact multipurpose camera system designed for field deployment at active volcanoes that can be used either as a webcam (transmitting images back to an observatory in real-time) or as a time-lapse camera system (storing images onto the camera system for periodic retrieval during field visits). The system also has the capability to acquire high-definition video. The camera system uses a Raspberry Pi single-board computer and a 5-megapixel low-light (near-infrared sensitive) camera, as well as a small Global Positioning System (GPS) module to ensure accurate time-stamping of images. Custom Python scripts control the webcam and GPS unit and handle data management. The inexpensive nature of the system allows it to be installed at hazardous sites where it might be lost. Another major advantage of this camera system is that it provides accurate internal timing (independent of network connection) and, because a full Linux operating system and the Python programming language are available on the camera system itself, it has the versatility to be configured for the specific needs of the user. We describe example deployments of the camera at Kīlauea Volcano, Hawai‘i, to monitor ongoing summit lava lake activity. 
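    The abstract notes that custom Python scripts handle capture timing, GPS-based time-stamping, and data management. As an illustration only (the function names and file-naming scheme below are assumptions, not the USGS system's actual code), a sketch of UTC-time-stamped file naming and a fixed-interval capture schedule might look like:

```python
from datetime import datetime, timedelta, timezone

def image_filename(station, t):
    """Build a sortable filename from a GPS-derived UTC timestamp.

    The `station` prefix and naming scheme are illustrative assumptions.
    """
    return "{}_{}.jpg".format(station, t.strftime("%Y%m%d_%H%M%S"))

def capture_schedule(start, interval_s, count):
    """Return the UTC capture times for a fixed-interval time-lapse."""
    return [start + timedelta(seconds=interval_s * i) for i in range(count)]
```

Accurate internal timing matters here because the filename itself, not the network, carries the acquisition time.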

  13. Quasi-microscope concept for planetary missions.

    PubMed

    Huck, F O; Arvidson, R E; Burcher, E E; Giat, O; Wall, S D

    1977-09-01

    Viking lander cameras have returned stereo and multispectral views of the Martian surface with a resolution that approaches 2 mm/lp in the near field. A two-orders-of-magnitude increase in resolution could be obtained for collected surface samples by augmenting these cameras with auxiliary optics that would neither impose special camera design requirements nor limit the cameras' field of view of the terrain. Quasi-microscope images would provide valuable data on the physical and chemical characteristics of planetary regoliths.

  14. FOREX-A Fiber Optics Diagnostic System For Study Of Materials At High Temperatures And Pressures

    NASA Astrophysics Data System (ADS)

    Smith, D. E.; Roeske, F.

    1983-03-01

    We have successfully fielded a Fiber Optics Radiation EXperiment system (FOREX) designed for measuring material properties at high temperatures and pressures on an underground nuclear test. The system collects light from radiating materials and transmits it through several hundred meters of optical fibers to a recording station consisting of a streak camera with film readout. The use of fiber optics provides a faster time response than can presently be obtained with equalized coaxial cables over comparable distances. Fibers also have significant cost and physical size advantages over coax cables. The streak camera achieves a much higher information density than an equivalent oscilloscope system, and it also serves as the light detector. The result is a wide bandwidth high capacity system that can be fielded at a relatively low cost in manpower, space, and materials. For this experiment, the streak camera had a 120 ns time window with a 1.2 ns time resolution. Dynamic range for the system was about 1000. Beam current statistical limitations were approximately 8% for a 0.3 ns wide data point at one decade above the threshold recording intensity.

  15. Using hacked point and shoot cameras for time-lapse snow cover monitoring in an Alpine valley

    NASA Astrophysics Data System (ADS)

    Weijs, S. V.; Diebold, M.; Mutzner, R.; Golay, J. R.; Parlange, M. B.

    2012-04-01

    In Alpine environments, monitoring snow cover is essential to gain insight into the hydrological processes and water balance. Although measurement techniques based on LIDAR are available, their cost is often a restricting factor. In this research, an experiment was done using a distributed array of cheap consumer cameras to gain insight into the spatio-temporal evolution of the snowpack. Two experiments are planned. The first involves the measurement of aeolian snow transport around a hill, to validate a snow saltation model. The second monitors snowmelt during the melting season, which can then be combined with data from a wireless network of meteorological stations and discharge measurements at the outlet of the catchment. The poster describes the hardware and software setup, based on an external timer circuit and CHDK, the Canon Hack Development Kit. The latter is a flexible and actively developed software package, released under a GPL license. It was developed by hackers who reverse-engineered the firmware of the camera and added extra functionality such as raw image output, fuller control of the camera, external triggering, motion detection, and scripting. These features make it a great tool for the geosciences. Possible other applications include aerial stereo photography and monitoring vegetation response. We are interested in sharing experiences and brainstorming about new applications. Bring your camera!
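    One quantity such a camera array can deliver is the fractional snow-covered area per image. A minimal sketch (the brightness-thresholding approach and the function name are illustrative assumptions, not the authors' processing chain, which would also need orthorectification and illumination correction):

```python
import numpy as np

def snow_fraction(gray, threshold=0.7):
    """Fraction of pixels brighter than `threshold`, used as a crude
    proxy for the snow-covered fraction of a grayscale image in [0, 1]."""
    gray = np.asarray(gray, dtype=float)
    return float((gray > threshold).mean())
```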

  16. A versatile photogrammetric camera automatic calibration suite for multispectral fusion and optical helmet tracking

    NASA Astrophysics Data System (ADS)

    de Villiers, Jason; Jermy, Robert; Nicolls, Fred

    2014-06-01

    This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length, and camera six-degree-of-freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb-line method, allows for many radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low-order models, despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted-to-undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion), allowing deterministic rates far exceeding real time. The focal length is determined so as to minimise the error in absolute photogrammetric positional measurement for both multi-camera and monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/LWIR camera array, and a simple laboratory optical helmet tracker.
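    Brown's model referenced above maps coordinates through radial terms in even powers of the radius plus two tangential terms. A minimal sketch of the forward (undistorted-to-distorted) model about the principal point (the signature and coefficient counts are illustrative, not the paper's implementation):

```python
def distort(x, y, k, p, cx=0.0, cy=0.0):
    """Apply Brown's radial (k = [k1, k2, ...]) and tangential
    (p = [p1, p2]) distortion to coordinates about the principal
    point (cx, cy); returns the distorted (x, y)."""
    xd, yd = x - cx, y - cy
    r2 = xd * xd + yd * yd
    # radial factor: 1 + k1*r^2 + k2*r^4 + ...
    radial = 1.0 + sum(ki * r2 ** (i + 1) for i, ki in enumerate(k))
    # tangential (decentering) terms
    tx = 2 * p[0] * xd * yd + p[1] * (r2 + 2 * xd * xd)
    ty = p[0] * (r2 + 2 * yd * yd) + 2 * p[1] * xd * yd
    return cx + xd * radial + tx, cy + yd * radial + ty
```

The inverse transformation has no closed form and is typically fitted or iterated, which is why the paper produces separate coefficient sets for each direction.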

  17. Geometric calibration of lens and filter distortions for multispectral filter-wheel cameras.

    PubMed

    Brauers, Johannes; Aach, Til

    2011-02-01

    High-fidelity color image acquisition with a multispectral camera utilizes optical filters to separate the visible electromagnetic spectrum into several passbands. This is often realized with a computer-controlled filter wheel, where each position is equipped with an optical bandpass filter. For each filter wheel position, a grayscale image is acquired and the passbands are finally combined to a multispectral image. However, the different optical properties and non-coplanar alignment of the filters cause image aberrations since the optical path is slightly different for each filter wheel position. As in a normal camera system, the lens causes additional wavelength-dependent image distortions called chromatic aberrations. When transforming the multispectral image with these aberrations into an RGB image, color fringes appear, and the image exhibits a pincushion or barrel distortion. In this paper, we address both the distortions caused by the lens and by the filters. Based on a physical model of the bandpass filters, we show that the aberrations caused by the filters can be modeled by displaced image planes. The lens distortions are modeled by an extended pinhole camera model, which results in a remaining mean calibration error of only 0.07 pixels. Using an absolute calibration target, we then geometrically calibrate each passband and compensate for both lens and filter distortions simultaneously. We show that both types of aberrations can be compensated and present detailed results on the remaining calibration errors.

  18. Near infra-red astronomy with adaptive optics and laser guide stars at the Keck Observatory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Max, C.E.; Gavel, D.T.; Olivier, S.S.

    1995-08-03

    A laser guide star adaptive optics system is being built for the W. M. Keck Observatory's 10-meter Keck II telescope. Two new near infra-red instruments will be used with this system: a high-resolution camera (NIRC 2) and an echelle spectrometer (NIRSPEC). The authors describe the expected capabilities of these instruments for high-resolution astronomy, using adaptive optics with either a natural star or a sodium-layer laser guide star as a reference. They compare the expected performance of these planned Keck adaptive optics instruments with that predicted for the NICMOS near infra-red camera, which is scheduled to be installed on the Hubble Space Telescope in 1997.

  19. Aerial Photography

    NASA Technical Reports Server (NTRS)

    1985-01-01

    John Hill, a pilot and commercial aerial photographer, needed an information base. He consulted NERAC and requested a search of the latest developments in camera optics. NERAC provided information; Hill contacted the manufacturers of camera equipment and reduced his photographic costs significantly.

  20. 3-D endoscopic imaging using plenoptic camera

    PubMed Central

    Le, Hanh N. D.; Decker, Ryan; Opferman, Justin; Kim, Peter; Krieger, Axel

    2017-01-01

    Three-dimensional endoscopic imaging using the plenoptic technique combined with an F-matching algorithm has been pursued in this study. Custom relay optics were designed to integrate a commercial surgical straight endoscope with a plenoptic camera. PMID:29276806

  1. VizieR Online Data Catalog: Antennae galaxies (NGC 4038/4039) revisited (Whitmore+, 2010)

    NASA Astrophysics Data System (ADS)

    Whitmore, B. C.; Chandar, R.; Schweizer, F.; Rothberg, B.; Leitherer, C.; Rieke, M.; Rieke, G.; Blair, W. P.; Mengel, S.; Alonso-Herrero, A.

    2012-06-01

    Observations of the main bodies of NGC 4038/39 were made with the Hubble Space Telescope (HST), using the ACS, as part of Program GO-10188. Multi-band photometry was obtained in the following optical broadband filters: F435W (~B), F550M (~V), and F814W (~I). Archival F336W photometry of the Antennae (Program GO-5962) was used to supplement our optical ACS/WFC observations. Infrared observations were made using the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) camera on HST as part of Program GO-10188. Observations were made using the NIC2 camera with the F160W, F187N, and F237M filters, and the NIC3 camera with the F110W, F160W, F164W, F187N, and F222M filters. (10 data files).

  2. Video monitoring in the Gadria debris flow catchment: preliminary results of large scale particle image velocimetry (LSPIV)

    NASA Astrophysics Data System (ADS)

    Theule, Joshua; Crema, Stefano; Comiti, Francesco; Cavalli, Marco; Marchi, Lorenzo

    2015-04-01

    Large-scale particle image velocimetry (LSPIV) is a technique mostly used in rivers to measure two-dimensional velocities from high-resolution images at high frame rates. This technique still needs to be thoroughly explored in the field of debris-flow studies. The Gadria debris-flow monitoring catchment in Val Venosta (Italian Alps) has been equipped with four MOBOTIX M12 video cameras. Two cameras are in a sediment trap located close to the alluvial fan apex, one looking upstream and the other looking down, more perpendicular to the flow. The third camera is in the next reach upstream from the sediment trap, at a closer proximity to the flow. These three cameras are connected to a field shelter equipped with a power supply and a server collecting all the monitoring data. The fourth camera is located in an active gully and is activated by a rain gauge after one minute of rainfall. Before LSPIV can be used, the highly distorted images need to be corrected and accurate reference points need to be established. We decided to use IMGRAFT (an open-source image georectification toolbox), which corrects distorted images using reference points and the camera location, and then rectifies the batch of images onto a DEM grid (or the DEM grid onto the image coordinates). With the orthorectified images, we used the freeware Fudaa-LSPIV (developed by EDF, IRSTEA, and the DeltaCAD Company) to generate the LSPIV calculations of the flow events. Calculated velocities can easily be checked manually because the images are already orthorectified. During the monitoring program (running since 2011) we recorded three debris-flow events at the sediment trap (each with very different surge dynamics). The camera in the gully was in operation in 2014 and recorded granular flows and rockfalls, for which particle tracking may be more appropriate for velocity measurements. The four cameras allow us to explore the limitations of camera distance, angle, frame rate, and image quality.
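    At the core of any LSPIV calculation is locating the cross-correlation peak between successive image patches to estimate displacement. A minimal FFT-based sketch of that step (an illustrative textbook version, not Fudaa-LSPIV's actual implementation):

```python
import numpy as np

def patch_displacement(a, b):
    """Integer-pixel displacement of patch `b` relative to `a`, from the
    peak of their FFT-based circular cross-correlation (the PIV core)."""
    A = np.fft.fft2(a - a.mean())
    B = np.fft.fft2(b - b.mean())
    corr = np.real(np.fft.ifft2(np.conj(A) * B))
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = a.shape
    # map wrap-around peak indices to signed shifts
    dy = iy - ny if iy > ny // 2 else iy
    dx = ix - nx if ix > nx // 2 else ix
    return int(dy), int(dx)
```

Dividing the recovered pixel displacement by the frame interval and the ground sampling distance of the orthorectified grid yields surface velocity.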

  3. A method of camera calibration in the measurement process with reference mark for approaching observation space target

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Zeng, Luan

    2017-11-01

    Binocular stereoscopic vision can be used for space-based close-range observation of space targets. In order to solve the problem that a traditional binocular vision system cannot work normally after a disturbance, an online calibration method for a binocular stereo measuring camera with a self-contained reference is proposed. The method uses an auxiliary optical imaging device to insert the image of a standard reference object into the edge of the main optical path, imaged on the same focal plane as the target, which is equivalent to a standard reference inside the binocular imaging optical system. When the position of the system and the imaging device parameters are disturbed, the image of the standard reference changes accordingly in the imaging plane, while the position of the standard reference object itself does not change. The camera's external parameters can then be re-calibrated from the visual relationship to the standard reference object. The experimental results show that the maximum mean square error for the same object can be reduced from the original 72.88 mm to 1.65 mm when the right camera is deflected by 0.4° and the left camera is rotated by 0.2° in elevation. This method realizes online calibration of a binocular stereoscopic vision measurement system and can effectively improve the anti-jamming ability of the system.

  4. ARNICA, the Arcetri Near-Infrared Camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.

    1996-04-01

    ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December, 1992 to December 1993), and an observing run at the William Herschel Telescope, Canary Islands (December, 1993). System performance is defined in terms of efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources.

  5. General Model of Photon-Pair Detection with an Image Sensor

    NASA Astrophysics Data System (ADS)

    Defienne, Hugo; Reichert, Matthew; Fleischer, Jason W.

    2018-05-01

    We develop an analytic model that relates intensity correlation measurements performed by an image sensor to the properties of photon pairs illuminating it. Experiments using an effective single-photon counting camera, a linear electron-multiplying charge-coupled device camera, and a standard CCD camera confirm the model. The results open the field of quantum optical sensing using conventional detectors.
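    Intensity-correlation measurements of this kind reduce, in the simplest case, to estimating the pixel-pairwise covariance of intensities over a stack of frames; correlated photon pairs appear as positive off-diagonal covariance. A minimal sketch (illustrative only, not the authors' analytic model):

```python
import numpy as np

def pair_covariance(frames):
    """Pixel-pairwise intensity covariance over a stack of frames.

    frames: array of shape (n_frames, n_pixels). Pixel pairs illuminated
    by correlated photons show positive off-diagonal covariance; pixels
    with independent illumination have covariance near zero."""
    return np.cov(np.asarray(frames, dtype=float), rowvar=False)
```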

  6. ATTICA family of thermal cameras in submarine applications

    NASA Astrophysics Data System (ADS)

    Kuerbitz, Gunther; Fritze, Joerg; Hoefft, Jens-Rainer; Ruf, Berthold

    2001-10-01

    Optronics Mast Systems (US: Photonics Mast Systems) are electro-optical devices which enable a submarine crew to observe the scenery above water while submerged. Unlike classical submarine periscopes they are non-hull-penetrating and therefore have no direct viewing capability. Typically they have electro-optical cameras both for the visual and for an IR spectral band, with panoramic view and a stabilized line of sight. They can optionally be equipped with laser range-finders, antennas, etc. The brand name ATTICA (Advanced Two-dimensional Thermal Imager with CMOS-Array) denotes a family of thermal cameras using focal-plane-array (FPA) detectors which can be tailored to a variety of requirements. The modular design of the ATTICA components allows the use of various detectors (InSb, CMT 3...5 μm, CMT 7...11 μm) for specific applications. By means of a microscanner, ATTICA cameras achieve full standard TV resolution using detectors with only 288 × 384 (US: 240 × 320) detector elements. A typical requirement for optronics-mast systems is a quick-look-around capability. For FPA cameras this implies the need for a 'descan' module, which can be incorporated in the ATTICA cameras without complications.

  7. Camera System MTF: combining optic with detector

    NASA Astrophysics Data System (ADS)

    Andersen, Torben B.; Granger, Zachary A.

    2017-08-01

    MTF is one of the most common metrics used to quantify the resolving power of an optical component. Extensive literature is dedicated to describing methods to calculate the Modulation Transfer Function (MTF) for stand-alone optical components such as a camera lens or telescope, and some literature addresses approaches to determine the MTF of an optic combined with a detector. The formulations pertaining to a combined electro-optical system MTF are mostly based on theory and on the assumption that the detector MTF is described only by the pixel pitch, which does not account for wavelength dependencies. When working with real hardware, detectors are often characterized by testing MTF at discrete wavelengths. This paper presents a method to simplify the calculation of a polychromatic system MTF when it is permissible to consider the detector MTF to be independent of wavelength.
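    Under the assumptions the abstract criticizes as idealized, the cascaded system MTF is simply the product of the optics MTF and the detector MTF, with a square pixel of a given pitch contributing a sinc term. A minimal sketch (the pitch-only sinc detector model is the textbook idealization, not the paper's measured-MTF method):

```python
import numpy as np

def detector_mtf(f, pitch):
    """Ideal square-pixel detector MTF at spatial frequency f (cy/mm)
    for a pixel pitch in mm: |sinc(pitch * f)|, where
    np.sinc(x) = sin(pi*x)/(pi*x)."""
    return np.abs(np.sinc(pitch * f))

def system_mtf(optic_mtf, f, pitch):
    """Cascaded camera-system MTF: product of optics and detector MTF."""
    return optic_mtf * detector_mtf(f, pitch)
```

At the detector Nyquist frequency f = 1/(2*pitch) the ideal pixel term falls to 2/pi, about 0.64, regardless of the optics.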

  8. Miniaturized unified imaging system using bio-inspired fluidic lens

    NASA Astrophysics Data System (ADS)

    Tsai, Frank S.; Cho, Sung Hwan; Qiao, Wen; Kim, Nam-Hyong; Lo, Yu-Hwa

    2008-08-01

    Miniaturized imaging systems have become ubiquitous as they are found in an ever-increasing number of devices, such as cellular phones, personal digital assistants, and web cameras. Until now, the design and fabrication methodology of such systems have not been significantly different from conventional cameras. The only established method to achieve focusing is by varying the lens distance. On the other hand, the variable-shape crystalline lens found in animal eyes offers inspiration for a more natural way of achieving an optical system with high functionality. Learning from the working concepts of the optics in the animal kingdom, we developed bio-inspired fluidic lenses for a miniature universal imager with auto-focusing, macro, and super-macro capabilities. Because of the enormous dynamic range of fluidic lenses, the miniature camera can even function as a microscope. To compensate for the image quality difference between the central vision and peripheral vision and the shape difference between a solid-state image sensor and a curved retina, we adopted a hybrid design consisting of fluidic lenses for tunability and fixed lenses for aberration and color dispersion correction. A design of the world's smallest surgical camera with 3X optical zoom capabilities is also demonstrated using the approach of hybrid lenses.

  9. Long-term optical and X-ray variability of the Be/X-ray binary H 1145-619: Discovery of an ongoing retrograde density wave

    NASA Astrophysics Data System (ADS)

    Alfonso-Garzón, J.; Fabregat, J.; Reig, P.; Kajava, J. J. E.; Sánchez-Fernández, C.; Townsend, L. J.; Mas-Hesse, J. M.; Crawford, S. M.; Kretschmar, P.; Coe, M. J.

    2017-11-01

    Context. Multiwavelength monitoring of Be/X-ray binaries is crucial to understand the mechanisms producing their outbursts. H 1145-619 is one of these systems, and it has recently displayed X-ray activity. Aims: We investigate the correlation between the optical emission and X-ray activity to predict the occurrence of new X-ray outbursts from the inferred state of the circumstellar disc. Methods: We have performed a multiwavelength study of H 1145-619 from 1973 to 2017 and present here a global analysis of its variability over the last 40 yr. We used optical spectra from the SAAO, SMARTS, and SALT telescopes and optical photometry from the Optical Monitoring Camera (OMC) onboard INTEGRAL and from the All Sky Automated Survey (ASAS). We also used X-ray observations from INTEGRAL/JEM-X and IBIS to generate the light curves and combined them with Swift/XRT to extract the X-ray spectra. In addition, we compiled archival observations and measurements from the literature to complement these data. Results: Comparing the evolution of the optical continuum emission with the Hα line variability, we identified three different patterns of optical variability: first, global increases and decreases of the optical brightness, observed from 1982 to 1994 and from 2009 to 2017, which can be explained by the dissipation and replenishment of the circumstellar disc; second, superorbital variations with a period of P_superorb ≈ 590 days, observed in 2002-2009, which seem to be related to the circumstellar disc; and third, optical outbursts, observed in 1998-1999 and 2002-2005, which we interpret as mass ejections from the Be star. We discovered the presence of a retrograde one-armed density wave, which appeared in 2016 and is still present in the circumstellar disc. Conclusions: We carried out the most complete long-term optical study of the Be/X-ray binary H 1145-619 in correlation with its X-ray activity.
For the first time, we found the presence of a retrograde density perturbation in the circumstellar disc of a Be/X-ray binary.

  10. Drone swarm with free-space optical communication to detect and make deep decisions about physical problems for area surveillance

    NASA Astrophysics Data System (ADS)

    Mazher, Wamidh Jalil; Ibrahim, Hadeel T.; Ucan, Osman N.; Bayat, Oguz

    2018-03-01

    This paper aims to design a drone-swarm network employing free-space optical (FSO) communication for the detection of, and deep decision making about, topological problems (e.g., an oil pipeline leak), where deep decision making requires the highest image resolution. Drones have been widely used for monitoring and detecting problems in industrial applications, with the drone sending images from the onboard camera video stream using radio frequency (RF) signals. To obtain higher-resolution images, higher bandwidth (BW) is required. The current study proposes the use of an FSO communication system to provide the higher BW needed for higher image resolution. Moreover, the number of drones required to survey a large physical area exceeds the capabilities of RF technologies. Our configuration of the drones is a V-shaped swarm with one leading drone called the mother drone (DM). The optical decode-and-forward (DF) technique is used to send the optical payloads of all drones in the V-shaped swarm to a single ground station through the DM. Furthermore, the transmitted optical power (Pt) required for each drone is found based on the threshold outage probability of FSO link failure among the onboard optical-DF drones. The bit error rate of the optical payload is calculated based on optical-DF onboard processing. Finally, the number of drones required for different image resolutions is optimized based on the size of the considered topological area.
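    The outage probability of an FSO link under scintillation is commonly evaluated with a log-normal irradiance model in weak turbulence. A minimal sketch (the log-normal model and the median normalization are standard textbook assumptions, not necessarily the exact model used in this paper):

```python
import math

def outage_probability(i_threshold, i_median=1.0, sigma_ln=0.3):
    """P(received irradiance < i_threshold) for log-normal scintillation.

    ln(I) ~ Normal(mu, sigma_ln^2), with mu chosen so that the *median*
    irradiance equals i_median (an illustrative normalization)."""
    mu = math.log(i_median)
    z = (math.log(i_threshold) - mu) / sigma_ln
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Choosing Pt so that this probability stays below a target threshold is the kind of per-link constraint the paper's power allocation would satisfy.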

  11. Material of LAPAN's thermal IR camera equipped with two microbolometers in one aperture

    NASA Astrophysics Data System (ADS)

    Bustanul, A.; Irwan, P.; Andi M., T.

    2017-11-01

    Besides the wavelength used, there is another factor that must be considered when designing an optical system: the choice of material appropriate for the chosen spectral bands. Because of the limited range of available materials and their high cost, choosing and specifying materials for infrared (IR) wavelengths is more difficult and complex than for the visible spectrum. We faced the same problem while designing our thermal IR camera equipped with two microbolometers sharing one aperture. Two spectral bands, 3-4 μm (MWIR) and 8-12 μm (LWIR), were selected for our thermal IR camera spectrum to address its missions, i.e., peat-land fires, volcanic activity, and sea surface temperature (SST). Based on those bands, we chose the appropriate material for the optics of LAPAN's IR camera. This paper describes the material of LAPAN's IR camera equipped with two microbolometers in one aperture. First, we studied the properties of optical materials across IR technology, including its bandwidths. The analysis then considered several aspects: transmission, index of refraction, and thermal properties covering the index gradient and the coefficient of thermal expansion (CTE). Moreover, we utilized commercial software, Thermal Desktop/SINDA-FLUINT, to strengthen the process. Constraints such as the space environment, low cost, and performance (mainly durability and transmission) were also taken into account throughout the trade-off work. The results of all these analyses, both in graphs and in measurements, indicate that the lens of LAPAN's IR camera with a shared aperture should be based on germanium/zinc selenide materials.

  12. 3D imaging and wavefront sensing with a plenoptic objective

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ramos, J. M.; Lüke, J. P.; López, R.; Marichal-Hernández, J. G.; Montilla, I.; Trujillo-Sevilla, J.; Femenía, B.; Puga, M.; López, M.; Fernández-Valdivia, J. J.; Rosa, F.; Dominguez-Conde, C.; Sanluis, J. C.; Rodríguez-Ramos, L. F.

    2011-06-01

    Plenoptic cameras have been developed over the last years as a passive method for 3D scanning. Several superresolution algorithms have been proposed in order to counter the resolution decrease associated with lightfield acquisition through a microlens array. A number of multiview stereo algorithms have also been applied in order to extract depth information from plenoptic frames. Real-time systems have been implemented using specialized hardware such as Graphical Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). In this paper, we present our own implementations related to the aforementioned aspects, but also two new developments: a portable plenoptic objective that transforms any conventional 2D camera into a 3D CAFADIS plenoptic camera, and the novel use of a plenoptic camera as a wavefront phase sensor for adaptive optics (AO). The terrestrial atmosphere degrades telescope images due to the refractive index changes associated with turbulence. These changes require high-speed processing, which justifies the use of GPUs and FPGAs. Artificial sodium laser guide stars (Na-LGS, at 90 km altitude) must be used to obtain the reference wavefront phase and the Optical Transfer Function of the system, but they are affected by defocus because of their finite distance to the telescope. Using the telescope as a plenoptic camera allows us to correct the defocus and to recover the wavefront phase tomographically. These advances significantly increase the versatility of the plenoptic camera and provide a new link between the wave optics and computer vision fields, as many authors claim.

  13. Minimizing camera-eye optical aberrations during the 3D reconstruction of retinal structures

    NASA Astrophysics Data System (ADS)

    Aldana-Iuit, Javier; Martinez-Perez, M. Elena; Espinosa-Romero, Arturo; Diaz-Uribe, Rufino

    2010-05-01

    3D reconstruction of blood vessels is a powerful visualization tool for physicians, since it allows them to refer to a qualitative representation of their subject of study. In this paper we propose a 3D reconstruction method for retinal vessels from fundus images. The reconstruction method proposed herein uses images of the same retinal structure in epipolar geometry. Images are preprocessed by the RISA system to segment blood vessels and obtain feature points for correspondence. The correspondence problem is solved using correlation. LMedS analysis and the Graph Transformation Matching algorithm are used for outlier suppression. Camera projection matrices are computed with the normalized eight-point algorithm. Finally, we retrieve the 3D positions of the retinal tree points by linear triangulation. In order to increase the power of the visualization, 3D tree skeletons are represented by surfaces via generalized cylinders whose radii correspond to morphological measurements obtained by RISA. In this paper the complete calibration process, including the fundus camera and the optical properties of the eye (the so-called camera-eye system), is proposed. On the one hand, the internal parameters of the fundus camera are obtained by classical algorithms using a reference pattern. On the other hand, we minimize the undesirable effects of the aberrations induced by the eyeball's optical system, assuming that a contact enlarging lens corrects astigmatism, that spherical and coma aberrations are reduced by changing the aperture size, and that eye refractive errors are suppressed by adjusting camera focus during image acquisition. Evaluations of two self-calibration proposals and results of 3D blood-vessel surface reconstruction are presented.
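    Linear triangulation from two camera projection matrices, as used above, solves a small homogeneous system by SVD (the direct linear transform). A minimal sketch (an illustrative textbook implementation, not the authors' code):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.

    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image
    coordinates of the same point. Builds A X = 0 from the cross
    product of x and P X, solves by SVD, and dehomogenizes."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]          # right singular vector of smallest singular value
    return X[:3] / X[3]  # dehomogenize to a 3D point
```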

  14. Trade-off between TMA and RC configurations for JANUS camera

    NASA Astrophysics Data System (ADS)

    Greggio, D.; Magrin, D.; Munari, M.; Paolinetti, R.; Turella, A.; Zusi, M.; Cremonese, G.; Debei, S.; Della Corte, V.; Friso, E.; Hoffmann, H.; Jaumann, R.; Michaelis, H.; Mugnuolo, R.; Olivieri, A.; Palumbo, P.; Ragazzoni, R.; Schmitz, N.

    2016-07-01

    JANUS (Jovis Amorum Ac Natorum Undique Scrutator) is a high-resolution visible camera designed for the ESA space mission JUICE (Jupiter Icy moons Explorer). The main scientific goal of JANUS is to observe the surface of the Jupiter satellites Ganymede and Europa in order to characterize their physical and geological properties. During the design phases, we have proposed two possible optical configurations: a Three Mirror Anastigmat (TMA) and a Ritchey-Chrétien (RC) both matching the performance requirements. Here we describe the two optical solutions and compare their performance both in terms of achieved optical quality, sensitivity to misalignment and stray light performances.

  15. STS-61 art concept of astronauts during HST servicing

    NASA Image and Video Library

    1993-11-12

    S93-48826 (November 1993) --- This artist's rendition of the 1993 Hubble Space Telescope (HST) servicing mission shows astronauts installing the new Wide Field/Planetary Camera (WF/PC 2). The instrument replaces the original camera and contains corrective optics that compensate for the telescope's flawed primary mirror. During the 11-plus-day mission, astronauts are also scheduled to install the Corrective Optics Space Telescope Axial Replacement (COSTAR) -- an optics package that focuses and routes light to the other three instruments aboard the observatory -- a new set of solar array panels, and other hardware and components. The artwork was done for JPL by Paul Hudson.

  16. Comparison of low-cost handheld retinal camera and traditional table top retinal camera in the detection of retinal features indicating a risk of cardiovascular disease

    NASA Astrophysics Data System (ADS)

    Joshi, V.; Wigdahl, J.; Nemeth, S.; Zamora, G.; Ebrahim, E.; Soliz, P.

    2018-02-01

    Retinal abnormalities associated with hypertensive retinopathy are useful in assessing the risk of cardiovascular disease, heart failure, and stroke. Assessing these risks as part of primary care can lead to a decrease in the incidence of cardiovascular disease-related deaths. Primary care is a resource-limited setting where low-cost retinal cameras may bring needed help without compromising care. We compared a low-cost handheld retinal camera to a traditional tabletop retinal camera on their optical characteristics and performance in detecting hypertensive retinopathy. A retrospective dataset of N=40 subjects (28 with hypertensive retinopathy, 12 controls) was used from a clinical study conducted at a primary care clinic in Texas. Non-mydriatic retinal fundus images were acquired using a Pictor Plus handheld camera (Volk Optical Inc.) and a Canon CR1-Mark II tabletop camera (Canon USA) during the same encounter. The images from each camera were graded by a licensed optometrist according to the universally accepted Keith-Wagener-Barker Hypertensive Retinopathy Classification System, three weeks apart to minimize memory bias. The sensitivity of the handheld camera in detecting any level of hypertensive retinopathy was 86% relative to the Canon. Insufficient photographer skill accounted for 70% of the false-negative cases. The other 30% were due to the handheld camera's insufficient spatial resolution to resolve vascular changes such as minor A/V nicking and copper wiring, but these were associated with non-referable disease. Physician evaluation of the performance of the handheld camera indicates it is sufficient to provide high-risk patients with adequate follow-up and management.

  17. Camera-Only Kinematics for Small Lunar Rovers

    NASA Astrophysics Data System (ADS)

    Fang, E.; Suresh, S.; Whittaker, W.

    2016-11-01

    Knowledge of the kinematic state of rovers is critical. Existing methods add sensors and wiring to moving parts, which can fail and adds mass and volume. This research presents a method to optically determine kinematic state using a single camera.

  18. STS-31 Space Shuttle mission report

    NASA Technical Reports Server (NTRS)

    Camp, David W.; Germany, D. M.; Nicholson, Leonard S.

    1990-01-01

    The STS-31 Space Shuttle Program Mission Report contains a summary of the vehicle subsystem activities on this thirty-fifth flight of the Space Shuttle and the tenth flight of the Orbiter Vehicle Discovery (OV-103). In addition to the Discovery vehicle, the flight vehicle consisted of an External Tank (ET) (designated as ET-34/LWT-27), three Space Shuttle main engines (SSMEs) (serial numbers 2011, 2031, and 2107), and two Solid Rocket Boosters (SRBs) (designated as BI-037). The primary objective of the mission was to place the Hubble Space Telescope (HST) into a 330 nmi. circular orbit having an inclination of 28.45 degrees. The secondary objectives were to perform all operations necessary to support the requirements of the Protein Crystal Growth (PCG), Investigations into Polymer Membrane Processing (IPMP), Radiation Monitoring Equipment (RME), Ascent Particle Monitor (APM), IMAX Cargo Bay Camera (ICBC), Air Force Maui Optical Site Calibration Test (AMOS), IMAX Crew Compartment Camera, and Ion Arc payloads. In addition, 12 development test objectives (DTOs) and 10 detailed supplementary objectives (DSOs) were assigned to the flight. The sequence of events for this mission is shown. The significant problems that occurred in the Space Shuttle Orbiter subsystems during the mission are summarized, and the official problem tracking list is presented. In addition, each of the Space Shuttle Orbiter problems is cited in the subsystem discussion.

  19. Computer-generated hologram calculation for real scenes using a commercial portable plenoptic camera

    NASA Astrophysics Data System (ADS)

    Endo, Yutaka; Wakunami, Koki; Shimobaba, Tomoyoshi; Kakue, Takashi; Arai, Daisuke; Ichihashi, Yasuyuki; Yamamoto, Kenji; Ito, Tomoyoshi

    2015-12-01

    This paper shows the process used to calculate a computer-generated hologram (CGH) for real scenes under natural light using a commercial portable plenoptic camera. In the CGH calculation, a light field captured with the commercial plenoptic camera is converted into a complex amplitude distribution. Then the converted complex amplitude is propagated to a CGH plane. We tested both numerical and optical reconstructions of the CGH and showed that the CGH calculation from captured data with the commercial plenoptic camera was successful.
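    The calculation described, converting the captured light field into a complex amplitude and propagating it to the CGH plane, hinges on a numerical free-space propagation step. As a minimal sketch of one common choice for that step, the angular spectrum method (an assumption for illustration; the abstract does not state which propagation kernel the authors used), in NumPy:

    ```python
    import numpy as np

    def angular_spectrum_propagate(u0, wavelength, pitch, z):
        """Propagate a complex amplitude u0 (square array, sample pitch
        in meters) a distance z via the angular spectrum method."""
        n = u0.shape[0]
        fx = np.fft.fftfreq(n, d=pitch)  # spatial frequencies, cycles/m
        FX, FY = np.meshgrid(fx, fx)
        # Longitudinal wavenumber; evanescent components are clamped to 0.
        arg = 1.0 / wavelength**2 - FX**2 - FY**2
        kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
        H = np.exp(1j * kz * z)  # free-space transfer function
        return np.fft.ifft2(np.fft.fft2(u0) * H)
    ```

    Since |H| = 1 for all propagating components, the transform conserves energy, and propagating by z and then by -z returns the original field, which makes the step easy to sanity-check before writing the result out as a hologram fringe pattern.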

  20. Design of a MATLAB(registered trademark) Image Comparison and Analysis Tool for Augmentation of the Results of the Ann Arbor Distortion Test

    DTIC Science & Technology

    2016-06-25

    The equipment used in this procedure includes: Ann Arbor distortion tester with 50-line grating reticule, IQeye 720 digital video camera with 12... ...and import them into MATLAB. In order to digitally capture images of the distortion in an optical sample, an IQeye 720 video camera with a 12... ...video camera and Ann Arbor distortion tester. Figure 8. Computer interface for capturing images seen by IQeye 720 camera. Once an image was...
