The first satellite laser echoes recorded on the streak camera
NASA Technical Reports Server (NTRS)
Hamal, Karel; Prochazka, Ivan; Kirchner, Georg; Koidl, F.
1993-01-01
The application of a streak camera with a circular sweep to satellite laser ranging is described. A Modular Streak Camera system employing the circular-sweep option was integrated into a conventional satellite laser ranging system, and experimental satellite tracking and ranging was performed. The first satellite laser echo streak camera records are presented.
Light-Directed Ranging System Implementing Single Camera System for Telerobotics Applications
NASA Technical Reports Server (NTRS)
Wells, Dennis L. (Inventor); Li, Larry C. (Inventor); Cox, Brian J. (Inventor)
1997-01-01
A laser-directed ranging system is useful in various fields, such as telerobotics applications and other applications involving physically handicapped individuals. The ranging system includes a single video camera and a directional light source such as a laser mounted on a camera platform, and a remotely positioned operator. In one embodiment, the position of the camera platform is controlled by three servo motors to orient the roll axis, pitch axis and yaw axis of the video camera, based upon an operator input such as head motion. The laser is offset vertically and horizontally from the camera, and the laser/camera platform is directed by the user to point the laser and the camera toward a target device. The image produced by the video camera is processed to eliminate all background images except for the spot created by the laser. This processing is performed by creating a digital image of the target prior to illumination by the laser, and then eliminating common pixels from the subsequent digital image which includes the laser spot. A reference point is defined at a point in the video frame, which may be located outside of the image area of the camera. The disparity between the digital image of the laser spot and the reference point is calculated for use in a ranging analysis to determine range to the target.
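The frame-differencing step described above (capture a frame before laser illumination, then one with the laser on, and eliminate the common pixels) can be sketched in a few lines. The threshold value and array sizes here are illustrative assumptions, not values from the patent.

```python
import numpy as np

def isolate_laser_spot(before, after, threshold=30):
    """Subtract the pre-illumination frame from the laser-on frame.

    Pixels common to both frames cancel, ideally leaving only the
    laser spot.  `before` and `after` are 8-bit grayscale frames of
    identical shape; `threshold` rejects sensor noise (an assumed
    value, tuned per camera in practice).
    """
    diff = after.astype(np.int16) - before.astype(np.int16)
    return np.where(diff > threshold, diff, 0).astype(np.uint8)

# Toy example: a 5x5 scene with a bright "laser spot" at row 2, col 3.
before = np.full((5, 5), 40, dtype=np.uint8)
after = before.copy()
after[2, 3] = 220
mask = isolate_laser_spot(before, after)
print(np.count_nonzero(mask))  # 1 -- only the laser spot survives
```

In a real system the two frames must be captured close together in time, as the abstract notes, or scene motion leaves residual pixels that the threshold alone cannot remove.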
Ranging Apparatus and Method Implementing Stereo Vision System
NASA Technical Reports Server (NTRS)
Li, Larry C. (Inventor); Cox, Brian J. (Inventor)
1997-01-01
A laser-directed ranging system for use in telerobotics applications and other applications involving physically handicapped individuals. The ranging system includes a left and right video camera mounted on a camera platform, and a remotely positioned operator. The position of the camera platform is controlled by three servo motors to orient the roll axis, pitch axis and yaw axis of the video cameras, based upon an operator input such as head motion. A laser is provided between the left and right video camera and is directed by the user to point to a target device. The images produced by the left and right video cameras are processed to eliminate all background images except for the spot created by the laser. This processing is performed by creating a digital image of the target prior to illumination by the laser, and then eliminating common pixels from the subsequent digital image which includes the laser spot. The horizontal disparity between the two processed images is calculated for use in a stereometric ranging analysis from which range is determined.
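With both images reduced to the laser spot alone, the stereometric ranging step reduces to the standard pinhole relation Z = fB/d. The focal length and baseline below are illustrative assumptions, not values from the patent.

```python
def stereo_range(focal_px, baseline_m, disparity_px):
    """Range to the laser spot from its horizontal disparity.

    focal_px     -- focal length in pixels
    baseline_m   -- separation of the left and right cameras in metres
    disparity_px -- horizontal offset of the spot between the two
                    processed images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_px * baseline_m / disparity_px

# Example: f = 800 px, cameras 0.12 m apart, spot disparity 16 px.
z = stereo_range(800.0, 0.12, 16.0)
print(z)  # 6.0 (metres)
```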
The research of adaptive-exposure on spot-detecting camera in ATP system
NASA Astrophysics Data System (ADS)
Qian, Feng; Jia, Jian-jun; Zhang, Liang; Wang, Jian-Yu
2013-08-01
A high-precision acquisition, tracking, and pointing (ATP) system is one of the key techniques of laser communication. The spot-detecting camera detects the direction of the beacon in the laser communication link, providing the position of the communication terminal to the ATP system. The positioning accuracy of the camera directly determines the capability of the laser communication system, so the spot-detecting camera in a satellite-to-earth laser communication ATP system needs high target-detection precision: the positioning accuracy should be better than ±1 µrad. Spot-detecting cameras usually adopt a centroid algorithm to obtain the position of the light spot on the detector. When the beacon intensity is moderate, the centroid calculation is precise, but the intensity changes greatly during communication because of distance, atmospheric scintillation, weather, etc. The detector output is insufficient when the camera underexposes the beacon at low light intensity, and saturated when it overexposes at high light intensity. In either case the centroid calculation degrades and the positioning accuracy of the camera is reduced markedly. To maintain accuracy, space-based cameras should regulate exposure time in real time according to light intensity. An adaptive-exposure algorithm for a spot-detecting camera based on a complementary metal-oxide-semiconductor (CMOS) detector is analyzed. Based on the analytic results, a CMOS camera for a space-based laser communication system is described which uses the adaptive-exposure algorithm to adjust exposure time. Test results from an imaging experiment system verify the design.
Experimental results prove that this design restrains the degradation of positioning accuracy caused by changes in light intensity, so the camera maintains stable, high positioning accuracy during communication.
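The abstract does not give the adaptive-exposure rule itself. The following is a minimal sketch of one plausible rule, assuming a 10-bit detector and a target band for the spot's peak pixel value; both the band and the gain limits are invented for illustration.

```python
def adapt_exposure(exposure_us, peak, full_scale=1023, low=0.3, high=0.8):
    """One step of a simple adaptive-exposure rule (illustrative).

    Keeps the spot's peak pixel value inside [low, high] of full scale:
    underexposed frames lengthen the exposure, saturated frames shorten
    it, in proportion to the peak's distance from the band centre.
    """
    frac = peak / full_scale
    if frac < low:          # underexposed: centroid noise grows
        return exposure_us * min(2.0, (low + high) / (2 * max(frac, 1e-3)))
    if frac > high:         # overexposed: clipping biases the centroid
        return exposure_us * (low + high) / (2 * frac)
    return exposure_us      # inside the band: leave unchanged

print(adapt_exposure(100.0, peak=95) > 100.0)    # dim beacon -> longer
print(adapt_exposure(100.0, peak=1000) < 100.0)  # bright beacon -> shorter
```

The gain cap (2.0 per step) keeps the loop from overshooting when the beacon intensity fluctuates frame to frame, as it does under atmospheric scintillation.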
A Structured Light Sensor System for Tree Inventory
NASA Technical Reports Server (NTRS)
Chien, Chiun-Hong; Zemek, Michael C.
2000-01-01
Tree inventory refers to the measurement and estimation of marketable wood volume in a parcel of land or forest for purposes such as investment or loan applications. Existing techniques rely on trained surveyors conducting measurements manually with simple optical or mechanical devices, and hence are time-consuming, subjective, and error-prone. Advances in computer vision make it possible to conduct automatic measurements that are more efficient, objective, and reliable. This paper describes 3D measurement of tree diameters using a uniquely designed ensemble of two line laser emitters rigidly mounted on a video camera. The proposed laser camera system relies on the fixed distance between two parallel laser planes and the projections of the laser lines to calculate tree diameters. Performance of the system is further enhanced by fusing information induced from the structured lighting with that contained in the video images. A comparison is made between the laser camera sensor system and a stereo vision system previously developed for measuring tree diameters.
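The abstract does not spell out the geometry. One plausible reading is that the known separation of the two parallel laser planes fixes the metric scale (mm per pixel) at the trunk, so the trunk's apparent width converts directly to a diameter. A sketch under that assumption; the 200 mm plane separation is invented for illustration.

```python
def tree_diameter(width_px, line_gap_px, plane_gap_mm=200.0):
    """Diameter from the apparent trunk width (assumed geometry).

    The two parallel laser planes are a known `plane_gap_mm` apart.
    Their projected lines appear `line_gap_px` apart on the trunk,
    which fixes the metric scale at the trunk's range; the apparent
    width in pixels then converts directly to millimetres.
    """
    scale = plane_gap_mm / line_gap_px   # mm per pixel at the trunk
    return width_px * scale

# Lines 60 px apart on the image, trunk 90 px wide -> 300 mm diameter.
print(tree_diameter(width_px=90, line_gap_px=60))  # 300.0
```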
NASA Technical Reports Server (NTRS)
Ponseggi, B. G. (Editor); Johnson, H. C. (Editor)
1985-01-01
Papers are presented on the picosecond electronic framing camera, photogrammetric techniques using high-speed cineradiography, picosecond semiconductor lasers for characterizing high-speed image shutters, the measurement of dynamic strain by high-speed moire photography, the fast framing camera with independent frame adjustments, design considerations for a data recording system, and nanosecond optical shutters. Consideration is given to boundary-layer transition detectors, holographic imaging, laser holographic interferometry in wind tunnels, heterodyne holographic interferometry, a multispectral video imaging and analysis system, a gated intensified camera, a charge-injection-device profile camera, a gated silicon-intensified-target streak tube and nanosecond-gated photoemissive shutter tubes. Topics discussed include high time-space resolved photography of lasers, time-resolved X-ray spectrographic instrumentation for laser studies, a time-resolving X-ray spectrometer, a femtosecond streak camera, streak tubes and cameras, and a short pulse X-ray diagnostic development facility.
NASA Astrophysics Data System (ADS)
Zhao, Guihua; Chen, Hong; Li, Xingquan; Zou, Xiaoliang
The paper presents the concepts of lever arm and boresight angle, the design requirements of calibration sites, and an integrated method for calibrating the boresight angles of a digital camera and a laser scanner. Taking test data collected by Applanix's LandMark system as an example, the camera is calibrated by stacking three consecutive stereo images and by an OTF calibration method using ground control points. The boresight angles of the laser scanner are calibrated by both manual and automatic methods using ground control points. Integrated calibration between the digital camera and the laser scanner improves the systematic precision of the two sensors. Analysis of the measurements between ground control points and their corresponding image points in sequential images shows that object positions agree to within about 15 cm in relative error and 20 cm in absolute error. Comparing ground control points with their corresponding laser point clouds, the error is less than 20 cm. These experimental results show that the mobile mapping system is an efficient and reliable system for generating high-accuracy, high-density road spatial data rapidly.
Sensor fusion of cameras and a laser for city-scale 3D reconstruction.
Bok, Yunsu; Choi, Dong-Geol; Kweon, In So
2014-11-04
This paper presents a sensor fusion system of cameras and a 2D laser sensor for large-scale 3D reconstruction. The proposed system is designed to capture data on a fast-moving ground vehicle. The system consists of six cameras and one 2D laser sensor, and they are synchronized by a hardware trigger. Reconstruction of 3D structures is done by estimating frame-by-frame motion and accumulating vertical laser scans, as in previous works. However, our approach does not assume near-2D motion, but estimates free motion (including absolute scale) in 3D space using both laser data and image features. In order to avoid the degeneration associated with typical three-point algorithms, we present a new algorithm that selects 3D points from two frames captured by multiple cameras. The problem of error accumulation is solved by loop closing, not by GPS. The experimental results show that the estimated path is successfully overlaid on the satellite images, such that the reconstruction result is very accurate.
Protective laser beam viewing device
Neil, George R.; Jordan, Kevin Carl
2012-12-18
A protective laser beam viewing system or device including a camera selectively sensitive to laser light wavelengths and a viewing screen receiving images from the laser sensitive camera. According to a preferred embodiment of the invention, the camera is worn on the head of the user or incorporated into a goggle-type viewing display so that it is always aimed at the area of viewing interest to the user and the viewing screen is incorporated into a video display worn as goggles over the eyes of the user.
NASA Technical Reports Server (NTRS)
Tedder, Sarah; Hicks, Yolanda
2012-01-01
Planar laser-induced fluorescence (PLIF) is used by the Combustion Branch at the NASA Glenn Research Center (NASA Glenn) to assess the characteristics of the flowfield produced by aircraft fuel injectors. To improve and expand the capabilities of the PLIF system, new equipment was installed. The new capabilities of the modified PLIF system are assessed by collecting OH PLIF in a methane/air flame produced by a flat-flame burner. Specifically, the modifications characterized are the addition of an injection seeder to a Nd:YAG laser pumping an optical parametric oscillator (OPO) and the use of a new camera with an interline CCD. OH fluorescence results using the injection-seeded OPO laser are compared to results using a Nd:YAG-pumped dye laser with ultraviolet extender (UVX). The best settings of the new camera for maximum detection of the PLIF signal are reported for the controller gain and microchannel plate (MCP) bracket pulsing. Results are also reported from tests of the Dual Image Feature (DIF) mode of the new camera, which allows image pairs to be acquired in rapid succession, so that a PLIF image and a background signal can be acquired almost simultaneously. Saturation effects in the new camera were also investigated and are reported.
Design of an infrared camera based aircraft detection system for laser guide star installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman, H.; Macintosh, B.
1996-03-05
There have been incidents in which the irradiance from laser guide stars has temporarily blinded pilots or passengers of aircraft. An aircraft detection system based on passive near-infrared cameras (instead of active radar) is described in this report.
NASA Technical Reports Server (NTRS)
Kuebert, E. J.
1977-01-01
A Laser Altimeter and Mapping Camera System was included in the Apollo Lunar Orbital Experiment Missions. The backup system, never used in the Apollo Program, is available for use in the Lidar Test Experiments on STS Orbital Flight Tests 2 and 4. Studies were performed to assess the problems associated with installation and operation of the Mapping Camera System in the STS. They covered the photographic capabilities of the Mapping Camera System, its mechanical and electrical interfaces with the STS, documentation, operation and survivability in the expected environments, ground support equipment, and test and field support.
NASA Astrophysics Data System (ADS)
Kersten, T. P.; Stallmann, D.; Tschirschwitz, F.
2016-06-01
For mapping of building interiors, various 2D and 3D indoor surveying systems are available today. These systems essentially differ from each other in price and accuracy as well as in the effort required for fieldwork and post-processing. The Laboratory for Photogrammetry & Laser Scanning of HafenCity University (HCU) Hamburg has developed, as part of an industrial project, a low-cost indoor mapping system which enables systematic inventory mapping of interior facilities with low staffing requirements and reduced, measurable expenditure of time and effort. The modelling and evaluation of the recorded data take place later in the office. The indoor mapping system of HCU Hamburg consists of the following components: laser range finder, panorama head (pan-tilt unit), single-board computer (Raspberry Pi) with digital camera, and battery power supply. The camera is pre-calibrated in a photogrammetric test field under laboratory conditions; remaining systematic image errors are corrected simultaneously during generation of the panorama image. For cost reasons the camera and laser range finder are not coaxially arranged on the panorama head. Therefore, the eccentricity and alignment of the laser range finder with respect to the camera must be determined in a system calibration. To verify the system accuracy and the system calibration, the laser points were determined from measurements with total stations. The differences from the reference were 4-5 mm for individual coordinates.
Overview of a Hybrid Underwater Camera System
2014-07-01
meters), in increments of 200 ps. The camera is also equipped with a 6:1 motorized zoom lens. A precision miniature attitude and heading reference system (AHRS) ... [figure: LUCIE sub-systems, comprising control and power distribution, AHRS, pulsed laser, gated camera, and sonar transducer]
Multi-channel automotive night vision system
NASA Astrophysics Data System (ADS)
Lu, Gang; Wang, Li-jun; Zhang, Yi
2013-09-01
A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image processing and display unit; the cameras are placed at the front, left, right, and rear of the automobile. The system uses a near-infrared laser light source with a collimated beam. The source contains a thermoelectric cooler (TEC), can be synchronized with camera focusing, and has automatic light-intensity adjustment, which together ensure image quality. The composition of the system is described in detail; on this basis, beam collimation, the driving and temperature control of the near-infrared laser diode (LD) source, and the four-channel image processing and display are discussed. The system can be used for driver assistance, blind-spot information (BLIS), parking assist, and alarm systems, both day and night.
Automatic camera to laser calibration for high accuracy mobile mapping systems using INS
NASA Astrophysics Data System (ADS)
Goeman, Werner; Douterloigne, Koen; Gautama, Sidharta
2013-09-01
A mobile mapping system (MMS) is a mobile multi-sensor platform developed by the geoinformation community to support the acquisition of huge amounts of geodata in the form of georeferenced high resolution images and dense laser clouds. Since data fusion and data integration techniques are increasingly able to combine the complementary strengths of different sensor types, the external calibration of a camera to a laser scanner is a common pre-requisite on today's mobile platforms. The methods of calibration, nevertheless, are often relatively poorly documented, are almost always time-consuming, demand expert knowledge and often require a carefully constructed calibration environment. A new methodology is studied and explored to provide a high quality external calibration for a pinhole camera to a laser scanner which is automatic, easy to perform, robust and foolproof. The method presented here, uses a portable, standard ranging pole which needs to be positioned on a known ground control point. For calibration, a well studied absolute orientation problem needs to be solved. In many cases, the camera and laser sensor are calibrated in relation to the INS system. Therefore, the transformation from camera to laser contains the cumulated error of each sensor in relation to the INS. Here, the calibration of the camera is performed in relation to the laser frame using the time synchronization between the sensors for data association. In this study, the use of the inertial relative movement will be explored to collect more useful calibration data. This results in a better intersensor calibration allowing better coloring of the clouds and a more accurate depth mask for images, especially on the edges of objects in the scene.
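The "well studied absolute orientation problem" the abstract refers to has a classical closed-form solution (often attributed to Horn, with an SVD variant by Arun et al.). A minimal sketch of that standard solution, not the authors' implementation: given corresponding 3D points expressed in the laser frame and the camera frame, it recovers the rigid transform between them.

```python
import numpy as np

def absolute_orientation(p_laser, p_cam):
    """Closed-form absolute orientation via SVD (Kabsch/Horn style).

    Finds the rigid transform (R, t) minimising ||R p_laser + t - p_cam||
    over corresponding 3D points (one point per row).  Data association
    (the paper uses time synchronisation between sensors) is assumed done.
    """
    mu_l, mu_c = p_laser.mean(axis=0), p_cam.mean(axis=0)
    H = (p_laser - mu_l).T @ (p_cam - mu_c)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_c - R @ mu_l
    return R, t

# Check on synthetic data: rotate 90 degrees about z, then shift.
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
t_true = np.array([0.5, -0.2, 1.0])
pts = np.random.default_rng(0).normal(size=(10, 3))
R, t = absolute_orientation(pts, pts @ R_true.T + t_true)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```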
A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.
Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi
2016-08-30
This paper proposes a novel infrared camera array guidance system with the capability to track and provide the real-time position and speed of a fixed-wing unmanned aerial vehicle (UAV) during landing. The system mainly includes three novel parts: (1) an infrared camera array and near-infrared laser lamp based cooperative long-range optical imaging module; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with fixed-wing flights demonstrate that our infrared camera array system has the unique ability to guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for automatic, accurate UAV landing in Global Positioning System (GPS)-denied environments.
Opto-mechanical system design of test system for near-infrared and visible target
NASA Astrophysics Data System (ADS)
Wang, Chunyan; Zhu, Guodong; Wang, Yuchao
2014-12-01
Guidance precision is a key index of guided weapon shooting. The factors affecting guidance precision include information processing precision, control system accuracy, and laser irradiation accuracy; laser irradiation precision is an important one among them. Aiming at the demand for precision testing of laser irradiators, this paper develops a laser precision test system. The system consists of a modified Cassegrain telescope, a wide-range CCD camera, a tracking turntable, and an industrial PC, and it images visible-light and near-infrared targets simultaneously with a near-IR camera. Analysis of the design results shows that for a target at 1000 meters the system measurement precision is 43 mm, fully meeting the needs of laser precision testing.
Compact streak camera for the shock study of solids by using the high-pressure gas gun
NASA Astrophysics Data System (ADS)
Nagayama, Kunihito; Mori, Yasuhito
1993-01-01
For the precise observation of high-speed impact phenomena, a compact high-speed streak camera recording system has been developed. The system consists of a high-pressure gas gun, a streak camera, and a long-pulse dye laser. The gas gun installed in our laboratory has a muzzle 40 mm in diameter and a launch tube 2 m long. Projectile velocity is measured by the laser beam cut method. The gun is capable of accelerating a 27 g projectile up to 500 m/s if helium is used as the driver gas. The system has been designed on the principle that precise optical measurement methods developed in other areas of research can be applied to gun studies. The streak camera is 300 mm in diameter, with a rectangular rotating mirror driven by an air turbine spindle. The attainable streak velocity is 3 mm/µs. The camera is deliberately compact for portability and economy; the streak velocity is therefore slower than that of faster cameras, but low-sensitivity, high-resolution film can be used as the recording medium. We have also constructed a pulsed dye laser of 25-30 µs duration, which can serve as the light source for observation. The advantages of using the laser are manifold: good directivity, nearly single-frequency output, and so on. The feasibility of the system has been demonstrated by several experiments.
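A quick sanity check on the recording: the temporal resolution of a streak record is roughly the slit-image width on the film divided by the writing speed. The 3 mm/µs writing speed is from the text; the 30 µm resolution element is an illustrative assumption for high-resolution film.

```python
# Temporal resolution of the streak record (back-of-envelope).
writing_speed_mm_per_us = 3.0    # from the text
slit_image_width_mm = 0.030      # assumed: ~30 um element on fine film
dt_us = slit_image_width_mm / writing_speed_mm_per_us
print(dt_us * 1000)  # 10.0 -- i.e. about 10 ns resolution
```

This is why a slower, compact camera paired with high-resolution film remains useful: the film's fine grain partly compensates for the modest writing speed.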
Laser-Directed Ranging System Implementing Single Camera System for Telerobotics Applications
NASA Technical Reports Server (NTRS)
Wells, Dennis L. (Inventor); Li, Larry C. (Inventor); Cox, Brian J. (Inventor)
1995-01-01
The invention relates generally to systems for determining the range of an object from a reference point and, in one embodiment, to laser-directed ranging systems useful in telerobotics applications. Digital processing techniques are employed which minimize the complexity and cost of the hardware and software for processing range calculations, thereby enhancing the commercial attractiveness of the system for use in relatively low-cost robotic systems. The system includes a video camera for generating images of the target, image digitizing circuitry, and an associated frame grabber circuit. The circuit first captures one video image of the target, and then captures a second video image of the target as it is partly illuminated by the light beam, suitably generated by a laser. The two video images, taken sufficiently close together in time to minimize camera and scene motion, are converted to digital images and then compared. Common pixels are eliminated, leaving only a digital image of the laser-illuminated spot on the target. The centroid of the laser-illuminated spot is then obtained and compared with a reference point, predetermined by design or calibration, which represents the coordinate at the focal plane of the laser illumination at infinite range. Preferably, the laser and camera are mounted on a servo-driven platform which can be oriented to direct the camera and the laser toward the target. In one embodiment the platform is positioned in response to movement of the operator's head; position and orientation sensors are used to monitor head movement. The disparity between the digital image of the laser spot and the reference point is calculated for determining range to the target.
Commercial applications for the system relate to active range-determination systems, such as those used with robotic systems in which it is necessary to determine the range to a workpiece or object to be grasped or acted upon by a robot arm end-effector in response to commands generated by an operator. In one embodiment, the system provides a real-time image of the target for the operator as the robot approaches the object. The system is also adapted for use in virtual reality systems in which a remote object or workpiece is to be acted upon by a remote robot arm or other mechanism controlled by an operator.
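The centroid-and-disparity computation described in this entry can be sketched as follows. The focal length, baseline, and reference-point coordinate are illustrative assumptions, and the final step uses the simple pinhole triangulation relation rather than the patent's calibration procedure.

```python
import numpy as np

def spot_centroid(img):
    """Intensity-weighted centroid (x, y) of the isolated laser spot."""
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

# The reference point is where the spot would fall at infinite range;
# with the laser offset from the camera by baseline b, the disparity d
# from that reference gives range by triangulation.  f, b and ref_x
# below are invented for illustration.
img = np.zeros((8, 8))
img[3, 5] = 4.0
img[3, 6] = 4.0          # spot straddles two pixels
cx, cy = spot_centroid(img)
print(cx, cy)            # 5.5 3.0 -- sub-pixel centroid

f_px, baseline_m, ref_x = 900.0, 0.10, 50.0
disparity = ref_x - cx   # 44.5 px from the infinite-range point
print(round(f_px * baseline_m / disparity, 3))  # 2.022 (metres)
```

The sub-pixel centroid is what makes single-spot ranging workable with modest resolution: the spot's position is known far more precisely than one pixel.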
The guidance methodology of a new automatic guided laser theodolite system
NASA Astrophysics Data System (ADS)
Zhang, Zili; Zhu, Jigui; Zhou, Hu; Ye, Shenghua
2008-12-01
Spatial coordinate measurement systems such as theodolites, laser trackers, and total stations are widely used in manufacturing and certification processes. Traditional theodolite operation is manual and time-consuming, which does not meet the needs of online industrial measurement, while laser trackers and total stations need reflective targets and therefore cannot achieve noncontact, automatic measurement. A new automatic guided laser theodolite system is presented to achieve automatic, noncontact measurement with high precision and efficiency. It comprises two sub-systems: the basic measurement system and the control and guidance system. The former is formed by two motorized laser theodolites that accomplish the fundamental measurement tasks, while the latter consists of a camera and vision unit mounted on a mechanical displacement unit that provides azimuth information for the measured points. The mechanical displacement unit can rotate horizontally and vertically to direct the camera to the desired orientation, so the camera can scan every measured point in the measuring field; the azimuth of each point is then calculated and the motorized laser theodolites move accordingly to aim at it. In this paper the whole system composition and measuring principle are analyzed, with emphasis on the guidance methodology by which the laser points from the theodolites are moved towards the measured points. The guidance process is implemented via the coordinate transformation between the basic measurement system and the control and guidance system. From the field-of-view angle of the vision unit and the world coordinates of the control and guidance system obtained through this transformation, the azimuth of the measurement area at which the camera points can be determined.
The momentary horizontal and vertical changes of the mechanical displacement unit are also measured and used to provide real-time azimuth information for the pointed measurement area, by which the motorized theodolites move accordingly. This methodology realizes the predetermined location of the laser points within the camera-pointed scope, accelerating the measuring process and replacing manual operation with approximate guidance. Simulation results show that the proposed automatic guidance method is effective and feasible, providing good tracking performance for the predetermined location of the laser points.
RESTORATION OF ATMOSPHERICALLY DEGRADED IMAGES. VOLUME 3.
Descriptors: aerial cameras, lasers, illumination, tracking cameras, diffraction, photographic grain, density, densitometers, mathematical analysis, optical scanning, systems engineering, turbulence, optical properties, satellite tracking systems.
Noise and sensitivity of x-ray framing cameras at Nike (abstract)
NASA Astrophysics Data System (ADS)
Pawley, C. J.; Deniz, A. V.; Lehecka, T.
1999-01-01
X-ray framing cameras are the most widely used tool for radiographing density distributions in laser and Z-pinch driven experiments. The x-ray framing cameras that were developed specifically for experiments on the Nike laser system are described. One of these cameras has been coupled to a CCD camera and was tested for resolution and image noise using both electrons and x rays. The largest source of noise in the images was found to be due to low quantum detection efficiency of x-ray photons.
High-speed imaging system for observation of discharge phenomena
NASA Astrophysics Data System (ADS)
Tanabe, R.; Kusano, H.; Ito, Y.
2008-11-01
A thin metal electrode tip instantly changes its shape into a sphere or a needlelike shape in a single electrical discharge of high current. These changes occur within several hundred microseconds. To observe these high-speed phenomena in a single discharge, an imaging system using a high-speed video camera and a high repetition rate pulse laser was constructed. A nanosecond laser, the wavelength of which was 532 nm, was used as the illuminating source of a newly developed high-speed video camera, HPV-1. The time resolution of our system was determined by the laser pulse width and was about 80 nanoseconds. The system can take one hundred pictures at 16- or 64-microsecond intervals in a single discharge event. A band-pass filter at 532 nm was placed in front of the camera to block the emission of the discharge arc at other wavelengths. Therefore, clear images of the electrode were recorded even during the discharge. If the laser was not used, only images of plasma during discharge and thermal radiation from the electrode after discharge were observed. These results demonstrate that the combination of a high repetition rate and a short pulse laser with a high speed video camera provides a unique and powerful method for high speed imaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garfield, B.R.; Rendell, J.T.
1991-01-01
The present conference discusses the application of schlieren photography in industry, laser fiber-optic high-speed photography, holographic visualization of hypervelocity explosions, sub-100-picosecond X-ray grating cameras, flash soft X-radiography, a novel approach to synchroballistic photography, a programmable image converter framing camera, high-speed readout CCDs, an ultrafast optomechanical camera, a femtosecond streak tube, a modular streak camera for laser ranging, and human-movement analysis with real-time imaging. Also discussed are high-speed photography of high-resolution moire patterns, a 2D electron-bombarded CCD readout for picosecond electrooptical data, laser-generated plasma X-ray diagnostics, 3D shape restoration with virtual grating phase detection, Cu vapor lasers for high-speed photography, a two-frequency picosecond laser with electrooptical feedback, the conversion of schlieren systems to high-speed interferometers, laser-induced cavitation bubbles, stereo holographic cinematography, a gatable photonic detector, and laser generation of Stoneley waves at liquid-solid boundaries.
Stratified charge rotary engine - Internal flow studies at the MSU engine research laboratory
NASA Technical Reports Server (NTRS)
Hamady, F.; Kosterman, J.; Chouinard, E.; Somerton, C.; Schock, H.; Chun, K.; Hicks, Y.
1989-01-01
High-speed visualization and laser Doppler velocimetry (LDV) systems consisting of a 40-watt copper vapor laser, mirrors, cylindrical lenses, a high speed camera, a synchronization timing system, and a particle generator were developed for the study of the fuel spray-air mixing flow characteristics within the combustion chamber of a motored rotary engine. The laser beam is focused down to a sheet approximately 1 mm thick, passing through the combustion chamber and illuminates smoke particles entrained in the intake air. The light scattered off the particles is recorded by a high speed rotating prism camera. Movies are made showing the air flow within the combustion chamber. The results of a movie showing the development of a high-speed (100 Hz) high-pressure (68.94 MPa, 10,000 psi) fuel jet are also discussed. The visualization system is synchronized so that a pulse generated by the camera triggers the laser's thyratron.
Active solution of homography for pavement crack recovery with four laser lines.
Xu, Guan; Chen, Fang; Wu, Guangwei; Li, Xiaotao
2018-05-08
An active solution method for the homography, derived from four laser lines, is proposed to recover pavement cracks captured by the camera as real-dimension cracks in the pavement plane. The measurement system, consisting of a camera and four laser projectors, captures the laser projection points on a 2D reference at different positions. The projection points are reconstructed in the camera coordinate system, and the laser lines are then initialized and optimized from them. The plane-indicated Plücker matrices of the optimized laser lines are employed to model the projection points of the laser lines on the pavement. The image-to-pavement homography is actively determined from the solutions for the perpendicular feet of the projection points. Pavement cracks are recovered by this active homography solution in the experiments, and the recovery accuracy is verified against a 2D reference of known dimensions. The test case with a measurement distance of 700 mm and a relative angle of 8° achieves the smallest recovery error, 0.78 mm, which indicates the method's potential for vision-based pavement inspection.
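As a sketch of the recovery step described above: the image-to-pavement mapping is a 3x3 homography, and once it is solved, crack pixels map to metric pavement coordinates from which crack dimensions follow. The matrix values and crack endpoints below are hypothetical illustrations, not data from the paper.

```python
import numpy as np

def map_to_pavement(H, pts_img):
    """Map image pixels to pavement-plane coordinates via homography H."""
    pts = np.hstack([pts_img, np.ones((len(pts_img), 1))])  # homogeneous coords
    mapped = (H @ pts.T).T
    return mapped[:, :2] / mapped[:, 2:3]  # dehomogenize

# Hypothetical homography (in practice solved from the laser-line perpendicular feet)
H = np.array([[0.5, 0.0, 10.0],
              [0.0, 0.5,  5.0],
              [0.0, 0.0,  1.0]])
crack_px = np.array([[100.0, 200.0], [180.0, 260.0]])  # crack endpoints in pixels
crack_mm = map_to_pavement(H, crack_px)                # endpoints in pavement mm
length_mm = np.linalg.norm(crack_mm[1] - crack_mm[0])  # metric crack length
```

With this affine-like example matrix, the two pixel endpoints map to (60, 105) and (100, 135) mm, giving a 50 mm crack length; a real pavement homography would also encode perspective terms in the bottom row.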
HERCULES/MSI: a multispectral imager with geolocation for STS-70
NASA Astrophysics Data System (ADS)
Simi, Christopher G.; Kindsfather, Randy; Pickard, Henry; Howard, William, III; Norton, Mark C.; Dixon, Roberta
1995-11-01
A multispectral intensified CCD imager combined with a ring-laser-gyroscope-based inertial measurement unit was flown on the Space Shuttle Discovery from July 13-22, 1995 (Space Transportation System Flight No. 70, STS-70). The camera includes a six-position filter wheel, a third-generation image intensifier, and a CCD camera. The camera is integrated with a laser gyroscope system that determines the ground position of the imagery to an accuracy of better than three nautical miles. The camera has two modes of operation: a panchromatic mode for high-magnification imaging [ground sample distance (GSD) of 4 m], or a multispectral mode offering six user-selectable spectral ranges at reduced magnification (12 m GSD). This paper discusses the system hardware and the technical trade-offs involved in camera optimization, and presents imagery observed during the shuttle mission.
Development of on-line laser power monitoring system
NASA Astrophysics Data System (ADS)
Ding, Chien-Fang; Lee, Meng-Shiou; Li, Kuan-Ming
2016-03-01
Since its invention, the laser has been applied in many fields, such as material processing, communication, measurement, biomedical engineering, and defense. Laser power is an important parameter in laser material processing, e.g., laser cutting and laser drilling. However, because laser power is easily affected by ambient temperature, it must be monitored to ensure effective material processing. In addition, the response time of current laser power meters is too long for them to measure laser power accurately over short intervals. To monitor laser power, this study uses a CMOS (complementary metal-oxide-semiconductor) camera to develop an on-line laser power monitoring system. The CMOS camera captures images of the incident laser beam after it is split and attenuated by a beam splitter and a neutral density filter. By comparing the average brightness of the beam spots with measurement results from a laser power meter, the laser power can be estimated. Under continuous measuring mode, the average measuring error is about 3%, and the response time is at least 3.6 seconds shorter than that of thermopile power meters; under trigger measuring mode, which synchronizes the CMOS camera with intermittent laser output, the average measuring error is less than 3%, and the shortest response time is 20 milliseconds.
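The brightness-to-power estimation described above amounts to a calibration curve between the mean brightness of the attenuated beam-spot image and readings from a reference power meter. A minimal linear sketch with illustrative numbers (not the paper's calibration data):

```python
import numpy as np

# Calibration pairs: mean spot brightness (counts) vs. reference power meter (W).
# Values are illustrative only.
brightness = np.array([40.0, 80.0, 120.0, 160.0])
power_w    = np.array([ 5.0, 10.0,  15.0,  20.0])

slope, offset = np.polyfit(brightness, power_w, 1)  # linear model P = a*B + b

def estimate_power(frame):
    """Estimate laser power from a camera frame of the attenuated beam spot."""
    return slope * frame.mean() + offset

spot = np.full((8, 8), 100.0)  # synthetic frame with mean brightness 100 counts
p = estimate_power(spot)       # 12.5 W under this linear calibration
```

A real system would also need to verify sensor linearity and keep exposure, gain, and attenuation fixed between calibration and measurement, since any of these changes the brightness-to-power relationship.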
Dynamic characteristics of far-field radiation of current modulated phase-locked diode laser arrays
NASA Technical Reports Server (NTRS)
Elliott, R. A.; Hartnett, K.
1987-01-01
A versatile and powerful streak camera/frame grabber system for studying the evolution of the near- and far-field radiation patterns of diode lasers was assembled and tested. Software needed to analyze and display the data acquired with the streak camera/frame grabber system was written, and the complete package was used to record and perform preliminary analyses of the behavior of two types of laser: a ten-emitter gain-guided array and a flared-waveguide Y-coupled array. Examples of the information which can be gathered with this system are presented.
Laser Imaging Video Camera Sees Through Fire, Fog, Smoke
NASA Technical Reports Server (NTRS)
2015-01-01
Under a series of SBIR contracts with Langley Research Center, inventor Richard Billmers refined a prototype for a laser imaging camera capable of seeing through fire, fog, smoke, and other obscurants. Now, Canton, Ohio-based Laser Imaging through Obscurants (LITO) Technologies Inc. is demonstrating the technology as a perimeter security system at Glenn Research Center and planning its future use in aviation, shipping, emergency response, and other fields.
Sellers and Fossum on the end of the OBSS during EVA1 on STS-121 / Expedition 13 joint operations
2006-07-08
STS121-323-011 (8 July 2006) --- Astronauts Piers J. Sellers and Michael E. Fossum, STS-121 mission specialists, work in tandem on Space Shuttle Discovery's Remote Manipulator System/Orbiter Boom Sensor System (RMS/OBSS) during the mission's first scheduled session of extravehicular activity (EVA). Also visible on the OBSS are the Laser Dynamic Range Imager (LDRI), Intensified Television Camera (ITVC) and Laser Camera System (LCS).
In vivo burn diagnosis by camera-phone diffuse reflectance laser speckle detection.
Ragol, S; Remer, I; Shoham, Y; Hazan, S; Willenz, U; Sinelnikov, I; Dronov, V; Rosenberg, L; Bilenca, A
2016-01-01
Burn diagnosis using laser speckle light typically employs widefield illumination of the burn region to produce two-dimensional speckle patterns from light backscattered from the entire irradiated tissue volume. Analysis of speckle contrast in these time-integrated patterns can then provide information on burn severity. Here, by contrast, we use point illumination to generate diffuse reflectance laser speckle patterns of the burn. By examining spatiotemporal fluctuations in these time-integrated patterns along the radial direction from the incident point beam, we show the ability to distinguish partial-thickness burns in a porcine model in vivo within the first 24 hours post-burn. Furthermore, our findings suggest that time-integrated diffuse reflectance laser speckle can be useful for monitoring burn healing over time post-burn. Unlike conventional diffuse reflectance laser speckle detection systems that utilize scientific or industrial-grade cameras, our system is designed with a camera-phone, demonstrating the potential for burn diagnosis with a simple imager.
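The speckle-contrast analysis referred to above is commonly computed as the ratio of the local standard deviation to the local mean of the time-integrated intensity pattern. A minimal sketch on synthetic data follows; the window size and image statistics are generic assumptions, not the authors' parameters.

```python
import numpy as np

def speckle_contrast(img, win=7):
    """Local speckle contrast K = std/mean over win x win windows (valid region only)."""
    h, w = img.shape
    K = np.zeros((h - win + 1, w - win + 1))
    for i in range(K.shape[0]):
        for j in range(K.shape[1]):
            patch = img[i:i + win, j:j + win]
            K[i, j] = patch.std() / patch.mean()
    return K

# Fully developed static speckle is often modeled with an exponential intensity
# distribution, for which K approaches 1; motion blur from perfused tissue lowers K.
rng = np.random.default_rng(0)
img = rng.exponential(scale=100.0, size=(64, 64))
K = speckle_contrast(img)
```

In practice the map K is what gets compared across burn regions: lower contrast indicates more scatterer motion (blood flow) during the exposure, higher contrast indicates more static tissue.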
Robotic Vehicle Communications Interoperability
1988-08-01
[Flattened tables from the report: vehicle control functions (engine starter/cold start, fire suppression, fording control, fuel control, fuel tank selector, garage toggle, gear selector, hazard warning) and electro-optic sensor options (sensor switch, video, radar, IR thermal imaging system, image intensifier, laser ranger; video camera selector: forward, stereo, rear; sensor control).]
Streak camera receiver definition study
NASA Technical Reports Server (NTRS)
Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.
1990-01-01
Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.
Investigation into the Use of the Concept Laser QM System as an In-Situ Research and Evaluation Tool
NASA Technical Reports Server (NTRS)
Bagg, Stacey
2014-01-01
The NASA Marshall Space Flight Center (MSFC) is using a Concept Laser Fusing (Cusing) M2 powder-bed additive manufacturing system for the build of spaceflight prototypes and hardware. NASA MSFC is collecting and analyzing data from the M2 QM Meltpool and QM Coating systems for builds. These data are intended to aid in understanding the powder-bed additive manufacturing process and in developing a thermal model of the process. The QM systems are marketed by Concept Laser GmbH as in-situ quality management modules. The QM Meltpool system uses both a high-speed near-IR camera and a photodiode to monitor the melt pool generated by the laser. The software determines the size of the melt pool from the camera images; the camera also measures the integrated intensity of the IR radiation, and the photodiode gives an intensity value based on the brightness of the melt pool. The QM Coating system uses a high-resolution optical camera to image the surface after each layer has been formed. The objective of this investigation was to determine the adequacy of the QM Meltpool system as a research instrument for in-situ measurement of melt pool size and temperature, and its applicability to NASA's objectives of (1) developing a process thermal model and (2) quantifying feedback measurements with the intent of meeting quality requirements or specifications. Note that Concept Laser markets the system only as capable of giving an indication of changes between builds, not as an in-situ research and evaluation tool. A secondary objective of the investigation was to determine the adequacy of the QM Coating system as an in-situ layer-wise geometry and layer-quality evaluation tool.
Optics design of laser spotter camera for ex-CCD sensor
NASA Astrophysics Data System (ADS)
Nautiyal, R. P.; Mishra, V. K.; Sharma, P. K.
2015-06-01
The development of laser-based instruments such as laser range finders and laser designators has gained prominence in modern military applications. Aiming the laser at the target is done with the help of a boresighted graticule, as the human eye cannot see the laser beam directly. Two types of detectors are available to view the laser spot, InGaAs detectors and Ex-CCD detectors, the latter being the cost-effective solution. In this paper, the optics design for an Ex-CCD-based camera is discussed. The designed system is lightweight and compact and can see a 1064 nm pulsed laser spot up to a range of 5 km.
Laser line scan underwater imaging by complementary metal-oxide-semiconductor camera
NASA Astrophysics Data System (ADS)
He, Zhiyi; Luo, Meixing; Song, Xiyu; Wang, Dundong; He, Ning
2017-12-01
This work employs a complementary metal-oxide-semiconductor (CMOS) camera to acquire images in a scanning manner for laser line scan (LLS) underwater imaging, alleviating the backscatter impact of seawater. Two operating features of the CMOS camera, namely the region of interest (ROI) and the rolling shutter, can be utilized to perform the image scan without the difficulty of translating the receiver above the target, as traditional LLS imaging systems must. Using the dynamically reconfigurable ROI of an industrial CMOS camera, we evenly divided the image into five subareas along the pixel rows and then scanned them by changing the ROI region automatically under synchronous illumination by the fan beams of the lasers. Another scanning method was explored using the rolling-shutter operation of the CMOS camera: the fan-beam lasers were turned on and off to illuminate narrow zones on the target in good correspondence with the exposure lines during the rolling of the camera's electronic shutter. Frame synchronization between the image scan and the laser beam sweep may be achieved by either the strobe-lighting output pulse or the external triggering pulse of the industrial camera. Comparison between the scanning and nonscanning images shows that the contrast of the underwater image can be improved by our LLS imaging techniques, with higher stability and feasibility than the mechanically controlled scanning method.
Augmented reality in laser laboratories
NASA Astrophysics Data System (ADS)
Quercioli, Franco
2018-05-01
Laser safety glasses block visibility of the laser light. This is a big nuisance when a clear view of the beam path is required. A headset made up of a smartphone and a viewer can overcome this problem. The user looks at the image of the real world on the cellphone display, captured by its rear camera. An unimpeded and safe sight of the laser beam is then achieved. If the infrared blocking filter of the smartphone camera is removed, the spectral sensitivity of the CMOS image sensor extends in the near infrared region up to 1100 nm. This substantial improvement widens the usability of the device to many laser systems for industrial and medical applications, which are located in this spectral region. The paper describes this modification of a phone camera to extend its sensitivity beyond the visible and make a true augmented reality laser viewer.
Laser-Camera Vision Sensing for Spacecraft Mobile Robot Navigation
NASA Technical Reports Server (NTRS)
Maluf, David A.; Khalil, Ahmad S.; Dorais, Gregory A.; Gawdiak, Yuri
2002-01-01
The advent of spacecraft mobile robots (free-flying sensor platforms and communications devices intended to accompany astronauts or operate remotely on space missions, both inside and outside a spacecraft) has demanded the development of a simple and effective navigation schema. One such system under exploration involves the use of a laser-camera arrangement to predict the relative position of the mobile robot. By projecting laser beams from the robot, a 3D reference frame can be introduced. Thus, as the robot shifts in position, the position reference frame produced by the laser images is correspondingly altered. Using the normalization and camera registration techniques presented in this paper, the relative translation and rotation of the robot in 3D are determined from these reference frame transformations.
The Sensor Irony: How Reliance on Sensor Technology is Limiting Our View of the Battlefield
2010-05-10
[Excerpts: ...an infrared (thermal) camera, as well as a laser illuminator/range finder. Similar to the MQ-1, the MQ-9 Reaper is primarily a strike asset for emerging targets... Wescam 14TS. Both systems have an electro-optical (daylight) TV camera, an infrared (thermal) camera, as well as a laser illuminator/range finder.]
Time-resolved x-ray spectra from laser-generated high-density plasmas
NASA Astrophysics Data System (ADS)
Andiel, U.; Eidmann, Klaus; Witte, Klaus-Juergen
2001-04-01
We focused frequency-doubled ultrashort laser pulses on solid C, F, Na, and Al targets, and the K-shell emission was systematically investigated by time-resolved spectroscopy using a sub-ps streak camera. A large number of laser shots can be accumulated by triggering the camera with an Auston switch system at very high temporal precision. The system provides an outstanding time resolution of 1.7 ps while accumulating thousands of laser shots. The duration of the He-alpha K-shell resonance lines was observed to be in the range of 2-4 ps and decreases with atomic number. The experimental results are well reproduced by hydrocode simulations post-processed with an atomic kinetics code.
Pulsed spatial phase-shifting digital shearography based on a micropolarizer camera
NASA Astrophysics Data System (ADS)
Aranchuk, Vyacheslav; Lal, Amit K.; Hess, Cecil F.; Trolinger, James Davis; Scott, Eddie
2018-02-01
We developed a pulsed digital shearography system that utilizes the spatial phase-shifting technique. The system employs a commercial micropolarizer camera and a double pulse laser, which allows for instantaneous phase measurements. The system can measure dynamic deformation of objects as large as 1 m at a 2-m distance during the time between two laser pulses that range from 30 μs to 30 ms. The ability of the system to measure dynamic deformation was demonstrated by obtaining phase wrapped and unwrapped shearograms of a vibrating object.
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
3D Modeling of Interior Building Environments and Objects from Noisy Sensor Suites
2015-05-14
building environments. The interior environment of a building is scanned by a custom hardware system, which provides raw laser and camera sensor readings used to develop these models.
Method and apparatus for coherent imaging of infrared energy
Hutchinson, D.P.
1998-05-12
A coherent camera system performs ranging, spectroscopy, and thermal imaging. Local oscillator radiation is combined with target scene radiation to enable heterodyne detection by the coherent camera's two-dimensional photodetector array. Versatility enables deployment of the system in either a passive mode (where no laser energy is actively transmitted toward the target scene) or an active mode (where a transmitting laser is used to actively illuminate the target scene). The two-dimensional photodetector array eliminates the need to mechanically scan the detector. Each element of the photodetector array produces an intermediate frequency signal that is amplified, filtered, and rectified by the coherent camera's integrated circuitry. By spectroscopic examination of the frequency components of each pixel of the detector array, a high-resolution, three-dimensional or holographic image of the target scene is produced for applications such as air pollution studies, atmospheric disturbance monitoring, and military weapons targeting. 8 figs.
Laser-induced damage threshold of camera sensors and micro-optoelectromechanical systems
NASA Astrophysics Data System (ADS)
Schwarz, Bastian; Ritt, Gunnar; Koerber, Michael; Eberle, Bernd
2017-03-01
The continuous development of laser systems toward more compact and efficient devices constitutes an increasing threat to electro-optical imaging sensors, such as complementary metal-oxide-semiconductor (CMOS) sensors and charge-coupled devices. These types of electronic sensors are used in day-to-day life but also in military and civil security applications. In camera systems dedicated to specific tasks, micro-optoelectromechanical systems, such as a digital micromirror device (DMD), are part of the optical setup. In such systems, the DMD can be located at an intermediate focal plane of the optics, where it is also susceptible to laser damage. The goal of our work is to enhance the knowledge of damage effects on such devices exposed to laser light. The experimental setup for the investigation of laser-induced damage is described in detail. Both pulsed and continuous-wave (CW) lasers are used as sources. The laser-induced damage threshold is determined by the single-shot method, increasing the pulse energy from pulse to pulse or, in the case of CW lasers, increasing the laser power. Furthermore, we investigate the morphology of laser-induced damage patterns and the dependence of the number of destroyed device elements on the laser pulse energy or laser power. In addition to the destruction of single pixels, we observe aftereffects such as persistent dead columns or rows of pixels in the sensor image.
Engine flow visualization using a copper vapor laser
NASA Technical Reports Server (NTRS)
Regan, Carolyn A.; Chun, Kue S.; Schock, Harold J., Jr.
1987-01-01
A flow visualization system has been developed to determine the air flow within the combustion chamber of a motored, axisymmetric engine. The engine has been equipped with a transparent quartz cylinder, allowing complete optical access to the chamber. A 40-watt copper vapor laser is used as the light source. Its beam is focused down to a sheet approximately 1 mm thick. The light plane is passed through the combustion chamber and illuminates oil particles entrained in the intake air. The light scattered off the particles is recorded by a high-speed rotating-prism movie camera. A movie is then made showing the air flow within the combustion chamber for an entire four-stroke engine cycle. The system is synchronized so that a pulse generated by the camera triggers the laser's thyratron. The camera is run at 5,000 frames per second; the trigger drives one laser pulse per frame. This paper describes the optics used in the flow visualization system and the synchronization circuit, and presents results obtained from the movie. This is believed to be the first published study showing a planar observation of airflow in a four-stroke piston-cylinder assembly. These flow visualization results have been used to interpret flow velocity measurements previously obtained with a laser Doppler velocimetry system.
Design of noise barrier inspection system for high-speed railway
NASA Astrophysics Data System (ADS)
Liu, Bingqian; Shao, Shuangyun; Feng, Qibo; Ma, Le; Cholryong, Kim
2016-10-01
Damage to noise barriers greatly reduces the transportation safety of high-speed railways. In this paper, an online noise-barrier inspection system based on laser vision is proposed for high-speed railway safety. The inspection system, consisting mainly of a fast camera and a line laser, is installed in the first carriage of the high-speed CIT (Composited Inspection Train). A laser line is projected onto the surface of the noise barriers, and images of the line are captured by the camera while the train runs at high speed. The distance between the inspection system and the noise barrier can then be obtained from the laser triangulation principle. The results of field tests show that the proposed system meets the requirements of high speed and high accuracy for measuring the contour distortion of the noise barriers.
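The laser-triangulation range computation such a line-laser system relies on can be sketched as follows. The parallel-offset geometry, focal length, and baseline below are assumptions for illustration, not the paper's parameters: the laser sheet is offset from the camera's optical axis by a baseline and projected parallel to it, so a surface at range z images the line at pixel offset x = f*b/z.

```python
# Assumed geometry: laser sheet parallel to the optical axis, offset by a baseline.
F_PX = 2000.0      # focal length in pixels (illustrative)
BASELINE_M = 0.1   # laser-to-camera baseline in meters (illustrative)

def range_from_pixel(x_px):
    """Range to the illuminated surface from the imaged line offset (pixels)."""
    return F_PX * BASELINE_M / x_px

z = range_from_pixel(250.0)  # 0.8 m for these assumed parameters
```

Real systems typically use an angled laser plane rather than a parallel one and calibrate the pixel-to-range mapping empirically, but the inverse relationship between pixel offset and range is the same.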
NASA Technical Reports Server (NTRS)
Sarrafzadeh-Khoee, Adel K. (Inventor)
2000-01-01
The invention provides a triple-beam, triple-sensor method for a laser speckle strain/deformation measurement system. The triple-beam/triple-camera configuration, combined with sequential timing of laser-beam shutters, can provide indications of surface strain and structural deformations. The strain and deformation quantities, comprising the four variables of surface strain, in-plane displacement, out-of-plane displacement, and tilt, are determined in closed-form solutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sels, Seppe, E-mail: Seppe.Sels@uantwerpen.be; Ribbens, Bart; Mertens, Luc
Scanning laser Doppler vibrometers (LDV) are used to measure full-field vibration shapes of products and structures. In most commercially available scanning laser Doppler vibrometer systems, the user manually draws a grid of measurement locations on a 2D camera image of the product. Determining the correct physical measurement locations can be a time-consuming and difficult task. In this paper we present a new methodology for product testing and quality control that integrates 3D imaging techniques with vibration measurements. This procedure allows prototypes to be tested in a shorter period because the physical measurement locations are located automatically. The proposed methodology uses a 3D time-of-flight camera to measure the location and orientation of the test object. The 3D image of the time-of-flight camera is then matched with the 3D CAD model of the object, in which measurement locations are pre-defined. A time-of-flight camera operates strictly in the near-infrared spectrum and uses a band filter to improve the signal-to-noise ratio of its measurements. As a result of this filter, the laser spot of most laser vibrometers is invisible in the time-of-flight image; therefore a 2D RGB camera is used to find the laser spot of the vibrometer. The laser spot is matched to the 3D image obtained by the time-of-flight camera, and an automatic calibration procedure is then used to aim the laser at the pre-defined locations. Another benefit of this methodology is that it incorporates automatic mapping between the CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. In addition, the orientation of the CAD model is known with respect to the laser beam, and this information can be used to find the direction of the measured vibration relative to the surface of the object. With this direction, the vibration measurements can be compared more precisely with numerical experiments.
The research on calibration methods of dual-CCD laser three-dimensional human face scanning system
NASA Astrophysics Data System (ADS)
Wang, Jinjiang; Chang, Tianyu; Ge, Baozhen; Tian, Qingguo; Yang, Fengting; Shi, Shendong
2013-09-01
In this paper, building on the performance advantages of the two-step method, we combine the stereo matching of binocular stereo vision with active laser scanning to calibrate the system. First, we select a reference camera coordinate system as the world coordinate system and unify the coordinates of the two CCD cameras. We then obtain the new perspective projection matrix (PPM) of each camera after epipolar rectification, from which the corresponding epipolar equations of the two cameras can be defined. Using the trigonometric parallax method, we can measure the spatial position of a point after distortion correction and achieve stereo-matching calibration between the two image points. Experiments verify that this method improves accuracy and guarantees system stability. The stereo-matching calibration has a simple, low-cost process and simplifies regular maintenance work; it can acquire 3D coordinates with only a planar checkerboard calibration, without designing a specific standard target or using an electronic theodolite. It was found during the experiments that two-step calibration error and lens distortion lead to stratification of the point cloud data. The proposed calibration method, which combines active line-laser scanning and binocular stereo vision, has the advantages of both and more flexible applicability. Theoretical analysis and experiments show that the method is reasonable.
People counting and re-identification using fusion of video camera and laser scanner
NASA Astrophysics Data System (ADS)
Ling, Bo; Olivera, Santiago; Wagley, Raj
2016-05-01
We present a system for people counting and re-identification that can be used by transit and homeland security agencies. Under an FTA SBIR program, we have developed a preliminary system for transit passenger counting and re-identification using a laser scanner and a video camera. The laser scanner is used to identify the locations of a passenger's head and shoulders in an image, a challenging task in crowded environments; it can also estimate passenger height without prior calibration. Various color models have been applied to form color signatures. Finally, using a statistical fusion and classification scheme, passengers are counted and re-identified.
Nikcevic, Irena; Piruska, Aigars; Wehmeyer, Kenneth R.; Seliskar, Carl J.; Limbach, Patrick A.; Heineman, William R.
2010-01-01
Parallel separations using capillary electrophoresis on a multilane microchip with multiplexed laser-induced fluorescence detection are demonstrated. The detection system was developed to simultaneously record data on all channels, using an expanded laser beam for excitation, a camera lens to capture emission, and a CCD camera for detection. The detection system enables monitoring of each channel continuously and distinguishing individual lanes without significant crosstalk between adjacent lanes. Multiple analytes can be analyzed on parallel lanes within a single microchip in a single run, leading to increased sample throughput. The pKa determination of small-molecule analytes is demonstrated with the multilane microchip. PMID:20737446
Active imaging system with Faraday filter
Snyder, James J.
1993-01-01
An active imaging system has a low to medium powered laser transmitter and receiver wherein the receiver includes a Faraday filter with an ultranarrow optical bandpass and a bare (nonintensified) CCD camera. The laser is locked in the vicinity of the passband of the Faraday filter. The system has high sensitivity to the laser illumination while eliminating solar background.
Pixel-based characterisation of CMOS high-speed camera systems
NASA Astrophysics Data System (ADS)
Weber, V.; Brübach, J.; Gordon, R. L.; Dreizler, A.
2011-05-01
Quantifying high-repetition rate laser diagnostic techniques for measuring scalars in turbulent combustion relies on a complete description of the relationship between detected photons and the signal produced by the detector. CMOS-chip based cameras are becoming an accepted tool for capturing high frame rate cinematographic sequences for laser-based techniques such as Particle Image Velocimetry (PIV) and Planar Laser Induced Fluorescence (PLIF) and can be used with thermographic phosphors to determine surface temperatures. At low repetition rates, imaging techniques have benefitted from significant developments in the quality of CCD-based camera systems, particularly with the uniformity of pixel response and minimal non-linearities in the photon-to-signal conversion. The state of the art in CMOS technology displays a significant number of technical aspects that must be accounted for before these detectors can be used for quantitative diagnostics. This paper addresses these issues.
NASA Astrophysics Data System (ADS)
Daly, Michael J.; Muhanna, Nidal; Chan, Harley; Wilson, Brian C.; Irish, Jonathan C.; Jaffray, David A.
2014-02-01
A freehand, non-contact diffuse optical tomography (DOT) system has been developed for multimodal imaging with intraoperative cone-beam CT (CBCT) during minimally-invasive cancer surgery. The DOT system is configured for near-infrared fluorescence imaging with indocyanine green (ICG) using a collimated 780 nm laser diode and a near-infrared CCD camera (PCO Pixelfly USB). Depending on the intended surgical application, the camera is coupled to either a rigid 10 mm diameter endoscope (Karl Storz) or a 25 mm focal length lens (Edmund Optics). A prototype flat-panel CBCT C-arm (Siemens Healthcare) acquires low-dose 3D images with sub-mm spatial resolution. A 3D mesh is extracted from CBCT for finite-element DOT implementation in NIRFAST (Dartmouth College), with the capability for soft/hard imaging priors (e.g., segmented lymph nodes). A stereoscopic optical camera (NDI Polaris) provides real-time 6D localization of reflective spheres mounted to the laser and camera. Camera calibration combined with tracking data is used to estimate intrinsic (focal length, principal point, non-linear distortion) and extrinsic (translation, rotation) lens parameters. Source/detector boundary data are computed from the tracked laser/camera positions using radiometry models. Target registration errors (TRE) between real and projected boundary points are ~1-2 mm for typical acquisition geometries. Pre-clinical studies using tissue phantoms are presented to characterize 3D imaging performance. This translational research system is under investigation for clinical applications in head-and-neck surgery including oral cavity tumour resection, lymph node mapping, and free-flap perforator assessment.
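Projecting tracked source/detector positions onto boundary points rests on the standard pinhole model with intrinsic and extrinsic parameters. A minimal sketch in Python, with lens distortion omitted; the function names are illustrative assumptions, not the NIRFAST or tracking API:

```python
import math

def project(point_w, R, t, fx, fy, cx, cy):
    """Map a 3D world point to pixel coordinates: extrinsics (R, t)
    take world -> camera frame, intrinsics (fx, fy, cx, cy) take
    camera frame -> pixels. No distortion term in this sketch."""
    # world -> camera frame: x_c = R * x_w + t
    xc = [sum(R[i][j] * point_w[j] for j in range(3)) + t[i] for i in range(3)]
    # perspective divide and intrinsic scaling
    return (fx * xc[0] / xc[2] + cx, fy * xc[1] / xc[2] + cy)

def target_registration_error(observed_px, projected_px):
    """2D distance between a measured boundary point and its projection."""
    return math.hypot(observed_px[0] - projected_px[0],
                      observed_px[1] - projected_px[1])
```

The reported ~1-2 mm TREs correspond to this kind of residual, evaluated between real boundary points and their projections under the calibrated model.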
Laser-induced damage threshold of camera sensors and micro-opto-electro-mechanical systems
NASA Astrophysics Data System (ADS)
Schwarz, Bastian; Ritt, Gunnar; Körber, Michael; Eberle, Bernd
2016-10-01
The continuous development of laser systems towards more compact and efficient devices constitutes an increasing threat to electro-optical imaging sensors such as complementary metal-oxide-semiconductor (CMOS) sensors and charge-coupled devices (CCD). These types of electronic sensors are used in day-to-day life but also in military and civil security applications. In camera systems dedicated to specific tasks, micro-opto-electro-mechanical systems (MOEMS) such as a digital micromirror device (DMD) are also part of the optical setup. In such systems, the DMD can be located at an intermediate focal plane of the optics, where it is likewise susceptible to laser damage. The goal of our work is to enhance the knowledge of damage effects on such devices exposed to laser light. The experimental setup for the investigation of laser-induced damage is described in detail. Both pulsed lasers and continuous-wave (CW) lasers are used as sources. The laser-induced damage threshold (LIDT) is determined by the single-shot method, increasing the pulse energy from pulse to pulse or, in the case of CW lasers, increasing the laser power. Furthermore, we investigate the morphology of laser-induced damage patterns and the dependence of the number of destroyed device elements on the laser pulse energy or laser power. In addition to the destruction of single pixels, we observe aftereffects such as persisting dead columns or rows of pixels in the sensor image.
Real-time laser cladding control with variable spot size
NASA Astrophysics Data System (ADS)
Arias, J. L.; Montealegre, M. A.; Vidal, F.; Rodríguez, J.; Mann, S.; Abels, P.; Motmans, F.
2014-03-01
Laser cladding has been used in different industries to improve surface properties or to reconstruct damaged pieces. In order to cover areas considerably larger than the diameter of the laser beam, successive partially overlapping tracks are deposited. Without control over the process variables, this leads to a temperature increase, which can degrade the mechanical properties of the laser-cladded material. Commonly, the process is monitored and controlled by a PC using cameras, but this control suffers from a lack of speed caused by the image processing step. The aim of this work is to design and develop an FPGA-based laser cladding control system. This system modifies the laser beam power according to the melt pool width, which is measured using a CMOS camera. All control and monitoring tasks are carried out by an FPGA, taking advantage of its abundant resources and speed of operation. The robustness of the image processing algorithm is assessed, as well as the control system performance. Laser power is decreased as substrate temperature increases, thus maintaining a constant clad width. This FPGA-based control system is integrated in an adaptive laser cladding system, which also includes an adaptive optical system that controls the laser focus distance on the fly. The whole system constitutes an efficient instrument for repairing parts with complex geometries and for selective surface coating, a significant step toward full industrial implementation of an automated laser cladding process.
Goyal, Anish; Myers, Travis; Wang, Christine A; Kelly, Michael; Tyrrell, Brian; Gokden, B; Sanchez, Antonio; Turner, George; Capasso, Federico
2014-06-16
We demonstrate active hyperspectral imaging using a quantum-cascade laser (QCL) array as the illumination source and a digital-pixel focal-plane-array (DFPA) camera as the receiver. The multi-wavelength QCL array used in this work comprises 15 individually addressable QCLs in which the beams from all lasers are spatially overlapped using wavelength beam combining (WBC). The DFPA camera was configured to integrate the laser light reflected from the sample and to perform on-chip subtraction of the passive thermal background. A 27-frame hyperspectral image was acquired of a liquid contaminant on a diffuse gold surface at a range of 5 meters. The measured spectral reflectance closely matches the calculated reflectance. Furthermore, the high-speed capabilities of the system were demonstrated by capturing differential reflectance images of sand and KClO3 particles that were moving at speeds of up to 10 m/s.
High-Speed Edge-Detecting Line Scan Smart Camera
NASA Technical Reports Server (NTRS)
Prokop, Norman F.
2012-01-01
A high-speed edge-detecting line scan smart camera was developed. The camera is designed to operate as a component of an inlet shock detection system developed at NASA Glenn Research Center. The inlet shock is detected by projecting a laser sheet through the airflow. The shock is the densest part of the airflow and refracts the laser sheet the most in its vicinity, leaving a dark spot or shadowgraph. These spots show up as a dip, or negative peak, within the pixel intensity profile of an image of the projected laser sheet. The smart camera acquires and processes the linear image containing the shock shadowgraph in real time and outputs the shock location. Previously, a high-speed camera and a personal computer performed the image capture and processing to determine the shock location. This innovation consists of a linear image sensor, an analog signal processing circuit, and a digital circuit that provides a numerical digital output of the shock, or negative edge, location. The smart camera is capable of capturing and processing linear images at over 1,000 frames per second. The edges are identified as numeric pixel values within the linear array of pixels, and the edge location information can be sent out from the circuit in a variety of ways, such as by using a microcontroller and an onboard or external digital interface supporting serial data such as RS-232/485, USB, Ethernet, or CAN bus; parallel digital data; or an analog signal. The smart camera system can be integrated into a small package with a relatively small number of parts, reducing size and increasing reliability over the previous imaging system.
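The dip-detection step lends itself to a very small algorithm: find samples well below the laser-sheet baseline and return the deepest one. A hedged Python sketch; the median baseline and the threshold value are assumptions for illustration, not the camera's actual analog/digital circuit logic:

```python
def shock_location(profile, threshold=0.5):
    """Locate the shock shadowgraph as the deepest dip (negative peak)
    in a 1-D line-scan intensity profile. Returns the pixel index, or
    None if no sample drops below threshold * baseline."""
    baseline = sorted(profile)[len(profile) // 2]  # median as laser-sheet level
    cutoff = threshold * baseline
    candidates = [i for i, v in enumerate(profile) if v < cutoff]
    if not candidates:
        return None
    # deepest dip among the below-cutoff samples
    return min(candidates, key=lambda i: profile[i])
```

At 1,000+ frames per second, this per-line computation is trivial, which is what makes a single smart-camera package able to replace the earlier camera-plus-PC arrangement.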
Marshall, Garrett J; Thompson, Scott M; Shamsaei, Nima
2016-06-01
An OPTOMEC Laser Engineered Net Shaping (LENS™) 750 system was retrofitted with a melt pool pyrometer and in-chamber infrared (IR) camera for nondestructive thermal inspection of the blown-powder, direct laser deposition (DLD) process. Data indicative of temperature and heat transfer within the melt pool and heat affected zone atop a thin-walled structure of Ti-6Al-4V during its additive manufacture are provided. Melt pool temperature data were collected via the dual-wavelength pyrometer while the dynamic, bulk part temperature distribution was collected using the IR camera. Such data are provided in Comma Separated Values (CSV) file format, containing a 752×480 matrix and a 320×240 matrix of temperatures corresponding to individual pixels of the pyrometer and IR camera, respectively. The IR camera and pyrometer temperature data are provided in blackbody-calibrated, raw forms. Provided thermal data can aid in generating and refining process-property-performance relationships between laser manufacturing and its fabricated materials.
Marshall, Garrett J.; Thompson, Scott M.; Shamsaei, Nima
2016-01-01
An OPTOMEC Laser Engineered Net Shaping (LENS™) 750 system was retrofitted with a melt pool pyrometer and in-chamber infrared (IR) camera for nondestructive thermal inspection of the blown-powder, direct laser deposition (DLD) process. Data indicative of temperature and heat transfer within the melt pool and heat affected zone atop a thin-walled structure of Ti–6Al–4V during its additive manufacture are provided. Melt pool temperature data were collected via the dual-wavelength pyrometer while the dynamic, bulk part temperature distribution was collected using the IR camera. Such data are provided in Comma Separated Values (CSV) file format, containing a 752×480 matrix and a 320×240 matrix of temperatures corresponding to individual pixels of the pyrometer and IR camera, respectively. The IR camera and pyrometer temperature data are provided in blackbody-calibrated, raw forms. Provided thermal data can aid in generating and refining process-property-performance relationships between laser manufacturing and its fabricated materials. PMID:27054180
Research on range-gated laser active imaging seeker
NASA Astrophysics Data System (ADS)
You, Mu; Wang, PengHui; Tan, DongJie
2013-09-01
Compared with other imaging methods such as millimeter-wave imaging, infrared imaging, and visible-light imaging, laser imaging provides both a 2-D array of reflected intensity data and a 2-D array of range data, which is the most important data for use in autonomous target acquisition. In terms of application, it can be widely used in military fields such as radar, guidance, and fuzing. In this paper, we present a laser active imaging seeker system based on range-gated laser transmitter and sensor technology. The seeker system consists of two main parts. The first is the laser imaging system, which uses a negative lens to diverge the light from a pulsed laser to flood-illuminate a target; return light is collected by a camera lens, and each laser pulse triggers the camera delay and shutter. The second is the stabilization gimbal, which is designed as a structure rotatable in both azimuth and elevation. The laser imaging system consists of a transmitter and a receiver. The transmitter is based on diode-pumped solid-state lasers that are passively Q-switched at 532 nm wavelength. A visible wavelength was chosen because the receiver uses a Gen III image intensifier tube with a spectral sensitivity limited to wavelengths below 900 nm. The receiver is an image intensifier tube whose microchannel plate is coupled to a high-sensitivity charge-coupled device camera. Images have been taken at ranges over one kilometer and can be taken at much longer ranges in better weather. The image frame frequency can be changed according to the requirements of guidance, with a modifiable range gate. The instantaneous field of view of the system was found to be 2×2 deg. Since completion of system integration, the seeker system has gone through a series of tests both in the lab and in the outdoor field.
Two different kinds of buildings were chosen as targets, located at ranges from 200 m up to 1000 m. To simulate the dynamic change of range between missile and target, the seeker system was placed on a truck driving along a road at a prescribed speed. The test results show qualified images and good performance of the seeker system.
NASA Technical Reports Server (NTRS)
Humphreys, William M., Jr.; Bartram, Scott M.
2001-01-01
A novel multiple-camera system for the recording of digital particle image velocimetry (DPIV) images acquired in a two-dimensional separating/reattaching flow is described. The measurements were performed in the NASA Langley Subsonic Basic Research Tunnel as part of an overall series of experiments involving the simultaneous acquisition of dynamic surface pressures and off-body velocities. The DPIV system utilized two frequency-doubled Nd:YAG lasers to generate two coplanar, orthogonally polarized light sheets directed upstream along the horizontal centerline of the test model. A recording system containing two pairs of matched high resolution, 8-bit cameras was used to separate and capture images of illuminated tracer particles embedded in the flow field. Background image subtraction was used to reduce undesirable flare light emanating from the surface of the model, and custom pixel alignment algorithms were employed to provide accurate registration among the various cameras. Spatial cross correlation analysis with median filter validation was used to determine the instantaneous velocity structure in the separating/reattaching flow region illuminated by the laser light sheets. In operation the DPIV system exhibited a good ability to resolve large-scale separated flow structures with acceptable accuracy over the extended field of view of the cameras. The recording system design provided enhanced performance versus traditional DPIV systems by allowing a variety of standard and non-standard cameras to be easily incorporated into the system.
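The background-subtraction and spatial cross-correlation steps described above can be illustrated in one dimension; real DPIV correlates 2-D interrogation windows, but the principle is the same. A Python sketch in which the function names and window handling are illustrative assumptions, not the authors' analysis code:

```python
def subtract_background(frame, background):
    """Remove static flare light by per-pixel background subtraction."""
    return [max(f - b, 0) for f, b in zip(frame, background)]

def displacement(window_a, window_b, max_shift=8):
    """Integer-pixel particle displacement between two interrogation
    windows: the shift that maximizes the direct cross-correlation."""
    best_shift, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum(a * window_b[i + s]
                    for i, a in enumerate(window_a)
                    if 0 <= i + s < len(window_b))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift
```

Dividing the displacement by the inter-pulse time gives the local velocity; median filtering of the resulting vector field, as in the paper, then rejects spurious correlation peaks.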
Towards Robust Self-Calibration for Handheld 3D Line Laser Scanning
NASA Astrophysics Data System (ADS)
Bleier, M.; Nüchter, A.
2017-11-01
This paper studies self-calibration of a structured light system, which reconstructs 3D information using video from a static consumer camera and a handheld cross-line laser projector. Intersections between the individual laser curves and geometric constraints on the relative position of the laser planes are exploited to achieve dense 3D reconstruction. This is possible without any prior knowledge of the movement of the projector. However, inaccurately extracted laser lines introduce noise in the detected intersection positions and therefore distort the reconstruction result. Furthermore, when scanning objects with specular reflections, such as glossy painted or metallic surfaces, the reflections are often extracted from the camera image as erroneous laser curves. In this paper we investigate how robust estimates of the parameters of the laser planes can be obtained despite noisy detections.
Impact of laser phase and amplitude noises on streak camera temporal resolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wlotzko, V., E-mail: wlotzko@optronis.com; Optronis GmbH, Ludwigstrasse 2, 77694 Kehl; Uhring, W.
2015-09-15
Streak cameras are now reaching sub-picosecond temporal resolution. In cumulative acquisition mode, this resolution does not entirely rely on the electronic or the vacuum tube performances but also on the light source characteristics. The light source, usually an actively mode-locked laser, is affected by phase and amplitude noises. In this paper, the theoretical effects of such noises on the synchronization of the streak system are studied in synchroscan and triggered modes. More precisely, the contribution of band-pass filters, delays, and time walk is ascertained. Methods to compute the resulting synchronization jitter are depicted. The results are verified by measurement with a streak camera combined with a Ti:Al2O3 solid state laser oscillator and also a fiber oscillator.
An imaging system for PLIF/Mie measurements for a combusting flow
NASA Technical Reports Server (NTRS)
Wey, C. C.; Ghorashi, B.; Marek, C. J.; Wey, C.
1990-01-01
The equipment required to establish an imaging system can be divided into four parts: (1) the light source and beam shaping optics; (2) camera and recording; (3) image acquisition and processing; and (4) computer and output systems. A pulsed, Nd:YAG-pumped, frequency-doubled dye laser which can freeze motion in the flowfield is used as the illumination source. A set of lenses is used to form the laser beam into a sheet. The induced fluorescence is collected by a UV-enhanced lens and passes through a UV-enhanced microchannel plate intensifier which is optically coupled to a gated solid state CCD camera. The output of the camera is simultaneously displayed on a monitor and recorded on either a laser videodisc set or a Super VHS VCR. The videodisc set is controlled by a minicomputer via a connection to the RS-232C interface terminals. The imaging system is connected to the host computer by a bus repeater and can be multiplexed between four video input sources. Sample images from a planar shear layer experiment are presented to show the processing capability of the imaging system with the host computer.
Versatile microsecond movie camera
NASA Astrophysics Data System (ADS)
Dreyfus, R. W.
1980-03-01
A laboratory-type movie camera is described which satisfies many requirements in the range 1 microsec to 1 sec. The camera consists of a He-Ne laser and compatible state-of-the-art components; the primary components are an acoustooptic modulator, an electromechanical beam deflector, and a video tape system. The present camera is distinct in its operation in that submicrosecond laser flashes freeze the image motion while still allowing the simplicity of electromechanical image deflection in the millisecond range. The gating and pulse delay circuits of an oscilloscope synchronize the modulator and scanner relative to the subject being photographed. The optical table construction and electronic control enhance the camera's versatility and adaptability. The instant replay video tape recording allows for easy synchronization and immediate viewing of the results. Economy is achieved by using off-the-shelf components, optical table construction, and short assembly time.
Application of a laser scanner to three dimensional visual sensing tasks
NASA Technical Reports Server (NTRS)
Ryan, Arthur M.
1992-01-01
The issues associated with using a laser scanner for visual sensing are described, along with the methods developed by the author to address them. A laser scanner is a device that controls the direction of a laser beam by deflecting it through a pair of orthogonal mirrors, the orientations of which are specified by a computer. If a calibrated laser scanner is combined with a calibrated camera, it is possible to perform three dimensional sensing by directing the laser at objects within the field of view of the camera. There are several issues associated with using a laser scanner for three dimensional visual sensing that must be addressed in order to use the laser scanner effectively. First, methods are needed to calibrate the laser scanner. Second, methods to estimate three dimensional points using a calibrated camera and laser scanner are required. Third, methods are required for locating the laser spot in a cluttered image. Fourth, mathematical models that predict the laser scanner's performance and provide structure for three dimensional data points are necessary. Several methods were developed to address each of these issues, and they were evaluated to determine how and when they should be applied. The theoretical development, implementation, and results of their use in a dual-arm, eighteen-degree-of-freedom robotic system for space assembly are described.
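Estimating a 3D point from a calibrated camera and laser scanner reduces to intersecting two rays that, with noise, are slightly skew; a common estimate is the midpoint of their shortest connecting segment. A hedged Python sketch of that geometric step, not the author's formulation:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between the laser ray (p1 + s*d1)
    and the camera line of sight (p2 + t*d2); for exactly intersecting
    rays this is their intersection point."""
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    pa = [p + s * v for p, v in zip(p1, d1)]   # closest point on laser ray
    pb = [p + t * v for p, v in zip(p2, d2)]   # closest point on camera ray
    return [(x + y) / 2 for x, y in zip(pa, pb)]
```

The gap |pa - pb| is also a useful diagnostic: a large gap signals a miscalibrated mirror angle or a wrongly located laser spot in the image.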
Joint Calibration of 3D Laser Scanner and Digital Camera Based on DLT Algorithm
NASA Astrophysics Data System (ADS)
Gao, X.; Li, M.; Xing, L.; Liu, Y.
2018-04-01
We designed a calibration target that can be scanned by a 3D laser scanner while being photographed by a digital camera, yielding a point cloud and photographs of the same target. A method to jointly calibrate the 3D laser scanner and digital camera based on the Direct Linear Transformation (DLT) algorithm is proposed. This method adds a camera distortion model to the traditional DLT algorithm; after repeated iteration, it solves for the interior and exterior orientation elements of the camera and accomplishes the joint calibration of the 3D laser scanner and digital camera. The results show that this method is reliable.
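The classical DLT relates world and image coordinates through 11 parameters L1..L11; each point correspondence yields two linear equations, solved by least squares (the paper additionally iterates with a distortion model, which is omitted here). A distortion-free Python sketch under those assumptions:

```python
def dlt_rows(X, Y, Z, u, v):
    """Two linear equations in the 11 DLT parameters L1..L11 from one
    correspondence between world point (X,Y,Z) and image point (u,v)."""
    return [
        ([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z], u),
        ([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z], v),
    ]

def solve_least_squares(rows):
    """Normal equations + Gaussian elimination (adequate for a sketch;
    production code would use SVD for numerical stability)."""
    n = len(rows[0][0])
    A = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for a, y in rows:               # accumulate A^T A and A^T y
        for i in range(n):
            b[i] += a[i] * y
            for j in range(n):
                A[i][j] += a[i] * a[j]
    for col in range(n):            # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n                   # back substitution
    for i in reversed(range(n)):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def project_dlt(L, X, Y, Z):
    """Forward DLT projection of a world point with parameters L1..L11."""
    d = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    u = (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / d
    v = (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / d
    return u, v
```

At least six non-coplanar control points are needed for a unique solution; the laser scanner's point cloud supplies the world coordinates of the target features that the camera observes.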
Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules
NASA Astrophysics Data System (ADS)
Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix
2009-02-01
Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone will be reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far field and the near field intensity distributions. Feedback from these cameras processed by a powerful and efficient image processing algorithm control a five axis precision motion system to optimize the fast axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the algorithm of the adjustment as well as experimental results. A critical discussion of the results will close the talk.
Combined hostile fire and optics detection
NASA Astrophysics Data System (ADS)
Brännlund, Carl; Tidström, Jonas; Henriksson, Markus; Sjöqvist, Lars
2013-10-01
Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (Short Wave Infrared) camera-based system with capability to detect and locate snipers both before and after shot over a large field-of-view. The high frame rate SWIR-camera allows resolution of the temporal profile of muzzle flashes which is the infrared signature associated with the ejection of the bullet from the rifle. The capability to detect and discriminate sniper muzzle flashes with this system has been verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with slit-shaped beam profile is scanned over the camera field-of-view to detect retro reflection from optical sights. The optics detection system has been tested at various distances up to 1.15 km showing the feasibility to detect rifle scopes in full daylight. The high speed camera gives the possibility to discriminate false alarms by analyzing the temporal data. The intensity variation, caused by atmospheric turbulence, enables discrimination of small sights from larger reflectors due to aperture averaging, although the targets only cover a single pixel. It is shown that optics detection can be integrated in combination with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability by continuous surveillance of a relatively large field-of-view looks promising. This type of multifunctional system may become an important tool to detect snipers before and after shot.
Utilization of a Terrestrial Laser Scanner for the Calibration of Mobile Mapping Systems
Hong, Seunghwan; Park, Ilsuk; Lee, Jisang; Lim, Kwangyong; Choi, Yoonjo; Sohn, Hong-Gyoo
2017-01-01
This paper proposes a practical calibration solution for estimating the boresight and lever-arm parameters of the sensors mounted on a Mobile Mapping System (MMS). On our MMS, devised for conducting the calibration experiment, three network video cameras, one mobile laser scanner, and one Global Navigation Satellite System (GNSS)/Inertial Navigation System (INS) were mounted. The geometric relationships between the three sensors were solved by the proposed calibration, considering the GNSS/INS as one unit sensor. Our solution uses the point cloud generated by a 3-dimensional (3D) terrestrial laser scanner rather than conventionally obtained 3D ground control features. With the terrestrial laser scanner, accurate and precise reference data could be produced, and the plane features corresponding to the sparse mobile laser scanning data could be determined with high precision. Furthermore, corresponding point features could be extracted from the dense terrestrial laser scanning data and the images captured by the video cameras. The boresight and lever-arm parameters were calculated based on the least squares approach, achieving precisions of 0.1 degrees and 10 mm, respectively. PMID:28264457
New Submount Requirement of Conductively Cooled Laser Diodes for Lidar Applications
NASA Technical Reports Server (NTRS)
Mo, S. Y.; Cutler, A. D.; Choi, S. H.; Lee, M. H.; Singh, U. N.
2000-01-01
New submount technology is essential for the development of conductively cooled high power diode lasers. The simulation and experimental results indicate that the thermal conductivity of the submount for a high power laser diode must be at least 600 W/m/K for stable operation. We have simulated several theoretical thermal models based on new submount designs and characterized high power diode lasers to determine temperature effects on their performance. The characterization system measures the beam power, output beam profile, temperature distribution, and spectroscopic properties of high power diode lasers. It is composed of four main parts: an infrared imaging camera, a CCD camera, a monochromator, and a power meter. Thermal characteristics of two commercial-grade CW 20-W diode laser bars with an open heat-sink design were determined with respect to the line shift of the emission spectra and beam power stability. The center wavelength of the laser emission tends to shift toward longer wavelengths as the driving current and heat sink temperature are increased. The increase of heat sink temperature also decreases the output power of the laser bar. These results provide guidelines for the design of new submounts for high power laser diodes.
Near infra-red astronomy with adaptive optics and laser guide stars at the Keck Observatory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Max, C.E.; Gavel, D.T.; Olivier, S.S.
1995-08-03
A laser guide star adaptive optics system is being built for the W. M. Keck Observatory's 10-meter Keck II telescope. Two new near infra-red instruments will be used with this system: a high-resolution camera (NIRC 2) and an echelle spectrometer (NIRSPEC). The authors describe the expected capabilities of these instruments for high-resolution astronomy, using adaptive optics with either a natural star or a sodium-layer laser guide star as a reference. They compare the expected performance of these planned Keck adaptive optics instruments with that predicted for the NICMOS near infra-red camera, which is scheduled to be installed on the Hubble Space Telescope in 1997.
QUANTITATIVE DETECTION OF ENVIRONMENTALLY IMPORTANT DYES USING DIODE LASER/FIBER-OPTIC RAMAN
A compact diode laser/fiber-optic Raman spectrometer is used for quantitative detection of environmentally important dyes. This system is based on diode laser excitation at 782 nm, fiber optic probe technology, an imaging spectrometer, and a state-of-the-art scientific CCD camera. ...
Visual servoing of a laser ablation based cochleostomy
NASA Astrophysics Data System (ADS)
Kahrs, Lüder A.; Raczkowsky, Jörg; Werner, Martin; Knapp, Felix B.; Mehrwald, Markus; Hering, Peter; Schipper, Jörg; Klenzner, Thomas; Wörn, Heinz
2008-03-01
The aim of this study is defined, visually based, camera-controlled bone removal by a navigated CO2 laser on the promontory of the inner ear. A precise and minimally traumatic opening procedure of the cochlea for the implantation of a cochlear implant electrode (a so-called cochleostomy) is intended. Harming the membrane linings of the inner ear can result in damage to remaining organ functions (e.g. complete deafness or vertigo). A precise tissue removal by a laser-based bone ablation system is investigated. Inside the borehole, the pulsed laser beam is guided automatically over the bone using a two-mirror galvanometric scanner. The ablation process is controlled by visual servoing. For the detection of the boundary layers of the inner ear, the ablation area is monitored by a color camera. The acquired pictures are analyzed by image processing, and the results of this analysis are used to control the process of laser ablation. This publication describes the complete system, including the image processing algorithms and the concept for the resulting distribution of single laser pulses. The system has been tested on human cochleae in ex-vivo studies. Further developments could lead to safe intraoperative openings of the cochlea by a robot-based surgical laser instrument.
Indoor space 3D visual reconstruction using mobile cart with laser scanner and cameras
NASA Astrophysics Data System (ADS)
Gashongore, Prince Dukundane; Kawasue, Kikuhito; Yoshida, Kumiko; Aoki, Ryota
2017-02-01
Indoor space 3D visual reconstruction has many applications and, once done accurately, it enables people to conduct different indoor activities in an efficient manner. For example, an effective and efficient emergency rescue response can be accomplished in a fire disaster situation by using 3D visual information of a destroyed building. Therefore, an accurate Indoor Space 3D visual reconstruction system which can be operated in any given environment without GPS has been developed using a Human-Operated mobile cart equipped with a laser scanner, CCD camera, omnidirectional camera and a computer. By using the system, accurate indoor 3D Visual Data is reconstructed automatically. The obtained 3D data can be used for rescue operations, guiding blind or partially sighted persons and so forth.
Modulated CMOS camera for fluorescence lifetime microscopy.
Chen, Hongtao; Holst, Gerhard; Gratton, Enrico
2015-12-01
Widefield frequency-domain fluorescence lifetime imaging microscopy (FD-FLIM) is a fast and accurate method to measure the fluorescence lifetime of entire images. However, the complexity and high cost involved in constructing such a system limit the extensive use of this technique. PCO AG recently released the first luminescence lifetime imaging camera based on a high-frequency modulated CMOS image sensor, QMFLIM2. Here we tested the camera and provide operational procedures to calibrate it and to improve the accuracy using corrections necessary for image analysis. With its flexible input/output options, we are able to use a modulated laser diode or a 20 MHz pulsed white supercontinuum laser as the light source. The output of the camera consists of a stack of modulated images that can be analyzed by the SimFCS software using the phasor approach. The nonuniform system response across the image sensor must be calibrated at the pixel level. This pixel calibration is crucial and is needed for every camera setting, e.g., modulation frequency and exposure time. A significant dependence of the modulation signal on intensity was also observed, and hence an additional calibration is needed for each pixel depending on its intensity level. These corrections are important not only for the fundamental frequency but also for the higher harmonics when using the pulsed supercontinuum laser. With these post-acquisition corrections, the PCO CMOS-FLIM camera can be used for various biomedical applications requiring large-frame, high-speed acquisition. © 2015 Wiley Periodicals, Inc.
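The phasor analysis used on the camera's modulated image stacks can be sketched in a few lines. The following is a minimal illustration (not the SimFCS implementation), assuming K equally spaced phase steps and a single-exponential decay, for which the lifetime follows from the phasor coordinates as τ = s/(gω):

```python
import numpy as np

def phasor_lifetime(stack, mod_freq_hz):
    """Phasor coordinates (g, s) per pixel from K phase-stepped images."""
    K = stack.shape[0]
    phases = 2 * np.pi * np.arange(K) / K
    dc = stack.sum(axis=0)                                # DC component
    g = 2 * np.tensordot(np.cos(phases), stack, axes=1) / dc
    s = 2 * np.tensordot(np.sin(phases), stack, axes=1) / dc
    omega = 2 * np.pi * mod_freq_hz
    tau = s / (g * omega)   # phase lifetime, exact for single exponentials
    return g, s, tau
```

For a single-exponential emitter the measured modulation is m = 1/√(1+(ωτ)²) and the phase is φ = arctan(ωτ), so (g, s) = (m cos φ, m sin φ) lies on the universal semicircle.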
A new method to acquire 3-D images of a dental cast
NASA Astrophysics Data System (ADS)
Li, Zhongke; Yi, Yaxing; Zhu, Zhen; Li, Hua; Qin, Yongyuan
2006-01-01
This paper introduces our newly developed method for acquiring three-dimensional images of a dental cast. A rotatable table, a laser knife, a mirror, a CCD camera, and a personal computer make up the three-dimensional data-acquisition system. A dental cast is placed on the table and the mirror is installed beside it; a linear laser is projected onto the cast; and the CCD camera is mounted above the cast so that it can image both the cast and its reflection in the mirror. While the table rotates, the camera records the shape of the laser streak projected on the cast and transmits the data to the computer. After the table has completed one full turn, the computer processes the data and calculates the three-dimensional coordinates of the cast's surface. In the data processing procedure, artificial neural networks are employed to calibrate the lens distortion and to map coordinates from the screen coordinate system to the world coordinate system. From the three-dimensional coordinates, the computer reconstructs a stereo image of the dental cast, which is essential for computer-aided diagnosis and treatment planning in orthodontics. In comparison with other systems in service, such as laser-beam three-dimensional scanning systems, this data-acquisition system is characterized by: (a) speed, taking only 1 minute to scan a dental cast; (b) compactness, with simple and compact machinery; and (c) no blind zone, since the mirror is introduced to reduce the blind zone.
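The core triangulation step in this kind of turntable scanner is intersecting a camera ray with the calibrated laser plane, then undoing the table rotation so all profiles share one frame. A minimal sketch follows; the camera matrix, plane parameters, and axis conventions are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def laser_point_3d(u, v, K, plane_n, plane_d):
    """Back-project pixel (u, v) onto the laser plane n.X + d = 0 (camera frame)."""
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))  # direction of viewing ray
    t = -plane_d / plane_n.dot(ray)                  # ray/plane intersection
    return t * ray

def table_to_world(p_cam, theta):
    """Undo the turntable rotation (angle theta about the assumed z axis)."""
    c, s = np.cos(theta), np.sin(theta)
    Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return Rz.T @ p_cam
```

A point imaged at the principal point with the laser plane at unit depth reconstructs, as expected, to (0, 0, 1) in the camera frame.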
Xu, Guan; Yuan, Jing; Li, Xiaotao; Su, Jian
2017-08-01
Vision measurement on the basis of structured light plays a significant role in optical inspection research. A 2D target fixed with a line laser projector is designed to realize the transformations among the world coordinate system, the camera coordinate system, and the image coordinate system. The laser projective point and five non-collinear points randomly selected from the target are adopted to construct a projection invariant. The closed-form solutions of the 3D laser points are obtained from the homogeneous linear equations generated from the projection invariants. An optimization function is created from the parameterized re-projection errors of the laser points and the target points in the image coordinate system. Furthermore, the nonlinear optimized solutions for the world coordinates of the projection points, the camera parameters, and the lens distortion coefficients are obtained by minimizing this function. The accuracy of the 3D reconstruction is evaluated by comparing the displacements of the reconstructed laser points with the actual displacements. The effects of image quantity, lens distortion, and noise are investigated in the experiments, which demonstrate that the reconstruction approach yields accurate measurements in the test system.
NASA Astrophysics Data System (ADS)
Boxx, I.; Carter, C. D.; Meier, W.
2014-08-01
Tomographic particle image velocimetry (tomographic-PIV) is a recently developed measurement technique used to acquire volumetric velocity field data in liquid and gaseous flows. The technique relies on line-of-sight reconstruction of the rays between a 3D particle distribution and a multi-camera imaging system. In a turbulent flame, however, index-of-refraction variations resulting from local heat-release may inhibit reconstruction and thereby render the technique infeasible. The objective of this study was to test the efficacy of tomographic-PIV in a turbulent flame. An additional goal was to determine the feasibility of acquiring usable tomographic-PIV measurements in a turbulent flame at multi-kHz acquisition rates with current-generation laser and camera technology. To this end, a setup consisting of four complementary metal oxide semiconductor cameras and a dual-cavity Nd:YAG laser was implemented to test the technique in a lifted turbulent jet flame. While the cameras were capable of kHz-rate image acquisition, the laser operated at a pulse repetition rate of only 10 Hz. However, use of this laser allowed exploration of the required pulse energy and thus power for a kHz-rate system. The imaged region was 29 × 28 × 2.7 mm in size. The tomographic reconstruction of the 3D particle distributions was accomplished using the multiplicative algebraic reconstruction technique. The results indicate that volumetric velocimetry via tomographic-PIV is feasible with pulse energies of 25 mJ, which is within the capability of current-generation kHz-rate diode-pumped solid-state lasers.
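The multiplicative algebraic reconstruction technique (MART) named above updates each voxel multiplicatively from the ratio of measured to re-projected pixel intensity, E_j ← E_j · (I_i / Σ_j w_ij E_j)^(μ w_ij). A minimal dense sketch follows; real tomographic-PIV codes use sparse weight matrices and careful initialization, so this is only an illustration of the update rule:

```python
import numpy as np

def mart(W, I, n_iter=20, mu=1.0):
    """MART reconstruction. W: (n_pixels, n_voxels) weights, I: pixel intensities."""
    E = np.ones(W.shape[1])          # uniform non-negative initial field
    for _ in range(n_iter):
        for i in range(len(I)):      # sweep over pixels (rays)
            proj = W[i].dot(E)       # re-projection of current field onto ray i
            if proj > 0:
                E *= (I[i] / proj) ** (mu * W[i])   # multiplicative correction
    return E
```

With an identity weight matrix (each pixel sees exactly one voxel) the field converges to the measured intensities in a single sweep, which makes a convenient sanity check.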
Extrinsic Calibration of a Laser Galvanometric Setup and a Range Camera.
Sels, Seppe; Bogaerts, Boris; Vanlanduit, Steve; Penne, Rudi
2018-05-08
Currently, galvanometric scanning systems (like the one used in a scanning laser Doppler vibrometer) rely on a planar calibration procedure between a two-dimensional (2D) camera and the laser galvanometric scanning system to automatically aim a laser beam at a particular point on an object. In the case of nonplanar or moving objects, this calibration is not sufficiently accurate anymore. In this work, a three-dimensional (3D) calibration procedure that uses a 3D range sensor is proposed. The 3D calibration is valid for all types of objects and retains its accuracy when objects are moved between subsequent measurement campaigns. The proposed 3D calibration uses a Non-Perspective-n-Point (NPnP) problem solution. The 3D range sensor is used to calculate the position of the object under test relative to the laser galvanometric system. With this extrinsic calibration, the laser galvanometric scanning system can automatically aim a laser beam to this object. In experiments, the mean accuracy of aiming the laser beam on an object is below 10 mm for 95% of the measurements. This achieved accuracy is mainly determined by the accuracy and resolution of the 3D range sensor. The new calibration method is significantly better than the original 2D calibration method, which in our setup achieves errors below 68 mm for 95% of the measurements.
Optical fiducial timing system for X-ray streak cameras with aluminum coated optical fiber ends
Nilson, David G.; Campbell, E. Michael; MacGowan, Brian J.; Medecki, Hector
1988-01-01
An optical fiducial timing system is provided for use with interdependent groups of X-ray streak cameras (18). The aluminum-coated (80) ends of optical fibers (78) are positioned with the photocathodes (20, 60, 70) of the X-ray streak cameras (18). The other ends of the optical fibers (78) are placed together in a bundled array (90). A fiducial optical signal (96), comprised of 2.omega. or 1.omega. laser light, after introduction to the bundled array (90), travels to the aluminum-coated (82) optical fiber ends and ejects quantities of electrons (84) that are recorded on the data recording media (52) of the X-ray streak cameras (18). Since both 2.omega. and 1.omega. laser light can travel long distances in optical fiber with only slight attenuation, the initial areal power density of the fiducial optical signal (96) is well below the damage threshold of the fused silica or other material that comprises the optical fibers (78, 90). Thus the fiducial timing system can be used repeatedly over long durations of time.
Visual based laser speckle pattern recognition method for structural health monitoring
NASA Astrophysics Data System (ADS)
Park, Kyeongtaek; Torbol, Marco
2017-04-01
This study performed the system identification of a target structure by analyzing the laser speckle pattern recorded by a camera. The laser speckle pattern is generated by the diffuse reflection of the laser beam on a rough surface of the target structure. The camera, equipped with a red filter, records the scattered speckle particles of the laser light in real time, and the raw speckle image pixel data are fed to the graphics processing unit (GPU) in the system. The algorithm for laser speckle contrast analysis (LASCA) computes the laser speckle contrast images and the laser speckle flow images. The k-means clustering algorithm is used to classify the pixels in each frame, and the clusters' centroids, which function as virtual sensors, track the displacement between frames in the time domain. The fast Fourier transform (FFT) and frequency domain decomposition (FDD) yield the modal properties of the structure: natural frequencies and damping ratios. This study takes advantage of the large-scale computational capability of the GPU: the algorithm is written in Compute Unified Device Architecture (CUDA C), which allows the processing of speckle images in real time.
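The core LASCA quantity is the local speckle contrast K = σ/⟨I⟩ evaluated over a small spatial window. A minimal CPU sketch over non-overlapping blocks is shown below; the GPU/CUDA implementation described above is far more elaborate, so this only illustrates the contrast definition:

```python
import numpy as np

def speckle_contrast(img, win=7):
    """Speckle contrast K = std/mean over non-overlapping win x win blocks."""
    h, w = img.shape
    h, w = h - h % win, w - w % win          # crop to a whole number of blocks
    blocks = img[:h, :w].astype(float).reshape(h // win, win, w // win, win)
    mean = blocks.mean(axis=(1, 3))
    std = blocks.std(axis=(1, 3))
    return std / np.maximum(mean, 1e-12)     # guard against empty regions
```

A perfectly uniform image gives K = 0 everywhere; fully developed speckle approaches K = 1, and flow-induced blurring drives K toward 0, which is what the flow images exploit.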
Cavitation effect of holmium laser pulse applied to ablation of hard tissue underwater.
Lü, Tao; Xiao, Qing; Xia, Danqing; Ruan, Kai; Li, Zhengjia
2010-01-01
To overcome the non-continuous recording of shadow and schlieren photography, the complete dynamics of cavitation bubble oscillation and ablation products induced by a single holmium laser pulse [2.12 µm, 300 µs (FWHM)] transmitted through fibers of different core diameters (200, 400, and 600 µm) is recorded by high-speed photography. Consecutive images from high-speed cameras represent the true and complete process of laser-water or laser-tissue interaction. Both laser pulse energy and fiber diameter determine the cavitation bubble size, which in turn determines the acoustic transient amplitudes. Based on the pictures taken by the high-speed camera and scans from an optical coherence microscopy (OCM) system, it is evident that the liquid layer at the distal end of the fiber plays an important role in laser-tissue interaction: it can increase ablation efficiency, decrease thermal side effects, and reduce cost.
Atmospheric aerosol profiling with a bistatic imaging lidar system.
Barnes, John E; Sharma, N C Parikh; Kaplan, Trevor B
2007-05-20
Atmospheric aerosols have been profiled using a simple, imaging, bistatic lidar system. A vertical laser beam is imaged onto a charge-coupled-device camera from the ground to the zenith with a wide-angle lens (CLidar). The altitudes are derived geometrically from the position of the camera and laser with submeter resolution near the ground. The system requires no overlap correction needed in monostatic lidar systems and needs a much smaller dynamic range. Nighttime measurements of both molecular and aerosol scattering were made at Mauna Loa Observatory. The CLidar aerosol total scatter compares very well with a nephelometer measuring at 10 m above the ground. The results build on earlier work that compared purely molecular scattered light to theory, and detail instrument improvements.
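The geometric altitude retrieval in such a bistatic (CLidar) arrangement is simple trigonometry: with a vertical beam at horizontal distance D from the camera, a pixel viewing at elevation θ sees the beam at altitude z = D·tan θ, at scattering angle 90° + θ off the upward beam direction. A sketch, with the baseline value purely illustrative:

```python
import numpy as np

def clidar_geometry(elev_rad, baseline_m):
    """Map camera elevation angle to altitude, scattering angle, and range."""
    z = baseline_m * np.tan(elev_rad)        # altitude of the scattering point
    scat = np.pi / 2 + elev_rad              # angle between beam and scattered ray
    rng = baseline_m / np.cos(elev_rad)      # camera-to-point distance
    return z, scat, rng
```

Because the altitude depends only on the camera/laser geometry, the near-ground resolution is set by pixel angular resolution times the baseline, which is how the submeter resolution quoted above arises.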
VSF Measurements and Inversion for RaDyO
2012-09-30
near-surface waters, including the surf zone. APPROACH MASCOT (Multi-Angle SCattering Optical Tool) has a 30 mW 658 nm laser diode source...in Santa Barbara Channel are provided in Fig. 1. Despite the widespread use of polarized laser sources across a diversity of Navy applications, this...operations that rely on divers, cameras, laser imaging systems, and active and passive remote sensing systems. These include mine countermeasures, harbor
Nikcevic, Irena; Piruska, Aigars; Wehmeyer, Kenneth R; Seliskar, Carl J; Limbach, Patrick A; Heineman, William R
2010-08-01
Parallel separations using CE on a multilane microchip with multiplexed LIF detection are demonstrated. The detection system was developed to simultaneously record data on all channels using an expanded laser beam for excitation, a camera lens to capture emission, and a CCD camera for detection. The detection system enables continuous monitoring of each channel and distinguishes individual lanes without significant crosstalk between adjacent lanes. Multiple analytes can be determined in parallel lanes within a single microchip in a single run, leading to increased sample throughput. The pKa determination of small-molecule analytes is demonstrated with the multilane microchip.
Brown, David M; Juarez, Juan C; Brown, Andrea M
2013-12-01
A laser differential image-motion monitor (DIMM) system was designed and constructed as part of a turbulence characterization suite during the DARPA free-space optical experimental network experiment (FOENEX) program. The developed link measurement system measures the atmospheric coherence length (r0), atmospheric scintillation, and power in the bucket for the 1550 nm band. DIMM measurements are made with two separate apertures coupled to a single InGaAs camera. The angle of arrival (AoA) of the wavefront at each aperture can be calculated from the focal-spot motions imaged by the camera. By utilizing a single camera for the simultaneous measurement of the focal spots, the correlation of the variance in the AoA allows a straightforward computation of r0, as in traditional DIMM systems. Standard measurements of scintillation and power in the bucket are made with the same apertures by redirecting a fraction of the incoming signals to InGaAs detectors integrated with logarithmic amplifiers for high sensitivity and high dynamic range. By leveraging two small apertures, the instrument forms a compact, lightweight configuration for mounting to actively tracking laser communication terminals for characterizing link performance.
A goggle navigation system for cancer resection surgery
NASA Astrophysics Data System (ADS)
Xu, Junbin; Shao, Pengfei; Yue, Ting; Zhang, Shiwu; Ding, Houzhu; Wang, Jinkun; Xu, Ronald
2014-02-01
We describe a portable fluorescence goggle navigation system for cancer margin assessment during oncologic surgeries. The system consists of a computer, a head mount display (HMD) device, a near infrared (NIR) CCD camera, a miniature CMOS camera, and a 780 nm laser diode excitation light source. The fluorescence and the background images of the surgical scene are acquired by the CCD camera and the CMOS camera respectively, co-registered, and displayed on the HMD device in real-time. The spatial resolution and the co-registration deviation of the goggle navigation system are evaluated quantitatively. The technical feasibility of the proposed goggle system is tested in an ex vivo tumor model. Our experiments demonstrate the feasibility of using a goggle navigation system for intraoperative margin detection and surgical guidance.
NASA Astrophysics Data System (ADS)
Kim, Min Young; Cho, Hyung Suck; Kim, Jae H.
2002-10-01
In recent years, intelligent autonomous mobile robots have drawn tremendous interest as service robots serving humans or as industrial robots replacing humans. To carry out such tasks, robots must be able to sense and recognize the 3D space in which they live or work. In this paper, we address 3D sensing for environment recognition by mobile robots. Structured lighting is utilized as the basis of the 3D visual sensor system because of its robustness to the nature of the navigation environment and the ease of extracting the feature information of interest. The proposed sensing system is a trinocular vision system composed of a flexible multi-stripe laser projector and two cameras. The 3D information is extracted by the optical triangulation method. By modeling the projector as another camera and using the epipolar constraints among all the cameras, the point-to-point correspondence between the line feature points in each image is established. In this work, the principle of the sensor is described in detail, and a series of experimental tests demonstrates the simplicity, efficiency, and accuracy of this sensor system for 3D environment sensing and recognition.
Vann, C.
1998-03-24
The Laser Pulse Sampler (LPS) measures temporal pulse shape without the problems of a streak camera. Unlike the streak camera, the laser pulse directly illuminates a camera in the LPS, i.e., no additional equipment or energy conversions are required. The LPS has several advantages over streak cameras. The dynamic range of the LPS is limited only by the range of its camera, which for a cooled camera can be as high as 16 bits, i.e., 65,536. The LPS costs less because there are fewer components, and those components can be mass produced. The LPS is easier to calibrate and maintain because there is only one energy conversion, i.e., photons to electrons, in the camera. 5 figs.
Software for Acquiring Image Data for PIV
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Cheung, H. M.; Kressler, Brian
2003-01-01
PIV Acquisition (PIVACQ) is a computer program for acquisition of data for particle-image velocimetry (PIV). In the PIV system for which PIVACQ was developed, small particles entrained in a flow are illuminated with a sheet of light from a pulsed laser. The illuminated region is monitored by a charge-coupled-device camera that operates in conjunction with a data-acquisition system that includes a frame grabber and a counter/timer board, both installed in a single computer. The camera operates in "frame-straddle" mode, in which a pair of images can be obtained closely spaced in time (on the order of microseconds). The frame grabber acquires image data from the camera and stores the data in the computer memory. The counter/timer board triggers the camera and synchronizes the pulsing of the laser with acquisition of data from the camera. PIVACQ coordinates all of these functions and provides a graphical user interface, through which the user can control the PIV data-acquisition system. PIVACQ enables the user to acquire a sequence of single-exposure images, display the images, process the images, and then save the images to the computer hard drive. PIVACQ works in conjunction with the PIVPROC program, which processes the images of particles into the velocity field in the illuminated plane.
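Downstream of acquisition, each frame-straddled image pair is typically reduced to a velocity field by cross-correlating interrogation windows; PIVACQ leaves that step to PIVPROC. The basic operation can be sketched as follows (an illustrative FFT cross-correlation, not NASA's code):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Integer-pixel particle shift between two interrogation windows."""
    a = win_a - win_a.mean()                 # remove DC bias before correlating
    b = win_b - win_b.mean()
    corr = np.real(np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    n = np.array(corr.shape)
    shift = (np.array(peak) + n // 2) % n - n // 2   # wrap to signed shift
    return shift                             # (dy, dx) in pixels
```

Dividing the shift by the laser pulse separation gives the in-plane velocity; production codes add sub-pixel peak fitting and outlier validation on top of this.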
The First Light of the Subaru Laser Guide Star Adaptive Optics System
NASA Astrophysics Data System (ADS)
Takami, H.; Hayano, Y.; Oya, S.; Hattori, M.; Watanabe, M.; Guyon, O.; Eldred, M.; Colley, S.; Saito, Y.; Itoh, M.; Dinkins, M.
Subaru Telescope has been operating a 36-element curvature-sensor AO system at the Cassegrain focus since 2000. We have developed a new AO system for the Nasmyth focus. The new system has a 188-element curvature wavefront sensor and a bimorph deformable mirror; it is the largest-format system of this sensor type. The deformable mirror also has 188 elements, with a 90 mm effective aperture and a 130 mm blank size. The real-time controller is a four-CPU real-time Linux computer, and the update rate is currently 1.5 kHz. The AO system also has a laser guide star system. The laser is a sum-frequency solid-state laser generating 589 nm light; we have achieved 4.7 W output power with excellent beam quality (M^2 = 1.1) and good stability. The laser is installed in a clean room on the Nasmyth platform, and the beam is transferred through a 35 m photonic crystal optical fiber to the 50 cm laser launching telescope mounted behind the Subaru secondary mirror. The field of view of the low-order wavefront sensor for the tilt guide star in LGS mode is 2.7 arcmin in diameter. The AO system had first light with a natural guide star in October 2006; the Strehl ratio was > 0.5 in the K band under 0.8 arcsec visible seeing. We also projected the laser beam on the sky during the same engineering run. Three instruments will be used with the AO system: the Infrared Camera and Spectrograph (IRCS), the high-dynamic-range IR camera (HiCIAO) for exosolar planet detection, and a visible 3D spectrograph.
Photogrammetry of Apollo 15 photography, part C
NASA Technical Reports Server (NTRS)
Wu, S. S. C.; Schafer, F. J.; Jordan, R.; Nakata, G. M.; Derick, J. L.
1972-01-01
In the Apollo 15 mission, a mapping camera system and a 61 cm optical bar, high resolution panoramic camera, as well as a laser altimeter were used. The panoramic camera is described, having several distortion sources, such as cylindrical shape of the negative film surface, the scanning action of the lens, the image motion compensator, and the spacecraft motion. Film products were processed on a specifically designed analytical plotter.
Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing
NASA Technical Reports Server (NTRS)
Crooke, Julie A.
2003-01-01
The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral-density (ND) filtering in front of the theodolite's telescope and then safely view and measure the laser's boresight through the telescope without great risk to one's eyes. This method, workable for a Class II visible-wavelength laser, should not even be attempted for a Class IV laser and is not applicable to an infrared (IR) laser: if one chooses insufficient attenuation or forgets to use the filters, looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available: a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed to operate the camera comprises power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. An additional advantage of a cheap black-and-white CCD camera is that it is sensitive to infrared as well as visible light; hence, the camera coupled to a theodolite can measure the pointing of an infrared as well as a visible laser.
Vision and spectroscopic sensing for joint tracing in narrow gap laser butt welding
NASA Astrophysics Data System (ADS)
Nilsen, Morgan; Sikström, Fredrik; Christiansson, Anna-Karin; Ancona, Antonio
2017-11-01
The automated laser beam butt welding process is sensitive to the positioning of the laser beam with respect to the joint, because a small offset may result in detrimental lack of sidewall fusion. This problem is even more pronounced in narrow-gap butt welding, where most commercial automatic joint tracing systems fail to detect the exact position and size of the gap. In this work, a dual vision and spectroscopic sensing approach is proposed to trace narrow-gap butt joints during laser welding. The system consists of a camera with suitable illumination and matched optical filters and a fast miniature spectrometer. An image processing algorithm for the camera recordings has been developed to estimate the laser spot position relative to the joint position. The spectral emissions from the laser-induced plasma plume have been acquired by the spectrometer, and from the measured intensities of selected lines of the spectrum, the electron temperature signal has been calculated and correlated with variations of process conditions. The individual performances of these two systems have been experimentally investigated and evaluated offline on data from several welding experiments, in which artificial abrupt as well as gradual deviations of the laser beam out of the joint were produced. The results indicate that a combination of the information provided by the vision and spectroscopic systems is beneficial for the development of a hybrid sensing system for joint tracing.
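Electron temperature from the intensities of two spectral lines is conventionally obtained with the two-line Boltzmann relation, T = (E2 − E1) / (k_B ln[(I1 λ1 A2 g2)/(I2 λ2 A1 g1)]), assuming local thermodynamic equilibrium. A sketch with this standard relation (the specific line data in the test are illustrative, not the lines used in the paper):

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def electron_temperature(I1, I2, line1, line2):
    """Two-line Boltzmann method; each line = (wavelength_nm, A, g, E_upper_eV)."""
    lam1, A1, g1, E1 = line1
    lam2, A2, g2, E2 = line2
    ratio = (I1 * lam1 * A2 * g2) / (I2 * lam2 * A1 * g1)
    return (E2 - E1) / (K_B_EV * np.log(ratio))
```

The method is most sensitive when the upper-level energies E1 and E2 are well separated, which is why line-pair selection matters for a robust process-monitoring signal.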
Method and apparatus for coherent imaging of infrared energy
Hutchinson, Donald P.
1998-01-01
A coherent camera system performs ranging, spectroscopy, and thermal imaging. Local oscillator radiation is combined with target scene radiation to enable heterodyne detection by the coherent camera's two-dimensional photodetector array. Versatility enables deployment of the system in either a passive mode (where no laser energy is actively transmitted toward the target scene) or an active mode (where a transmitting laser is used to actively illuminate the target scene). The two-dimensional photodetector array eliminates the need to mechanically scan the detector. Each element of the photodetector array produces an intermediate frequency signal that is amplified, filtered, and rectified by the coherent camera's integrated circuitry. By spectroscopic examination of the frequency components of each pixel of the detector array, a high-resolution, three-dimensional or holographic image of the target scene is produced for applications such as air pollution studies, atmospheric disturbance monitoring, and military weapons targeting.
Vacuum-Compatible Wideband White Light and Laser Combiner Source System
NASA Technical Reports Server (NTRS)
Azizi, Alineza; Ryan, Daniel J.; Tang, Hong; Demers, Richard T.; Kadogawa, Hiroshi; An, Xin; Sun, George Y.
2010-01-01
For the Space Interferometry Mission (SIM) Spectrum Calibration Development Unit (SCDU) testbed, wideband white light is used to simulate starlight. The white-light source mount requires extremely stable pointing accuracy (<3.2 microradians). To meet this and other needs, laser light from a single-mode fiber was combined, through a beam-splitter window with a special broadband coating, with light from a multimode fiber, and both were coupled into a photonic crystal fiber (PCF). In many optical systems, simulating a point star with a broadband spectrum at microradian stability for white-light interferometry is a challenge. In this case, the cameras use the white-light interference to balance two optical paths and to maintain close tracking. To coarsely align the optical paths, a laser light is sent into the system to allow tracking of fringes, because a narrowband laser has a large range of interference. The design requirements forced the innovators to use a new type of optical fiber and to take great care in aligning the input sources. The testbed required better than 1% throughput, i.e., enough output power at the weakest part of the spectrum to be detectable by the CCD camera (6 nW at the camera). The system needed to be vacuum-compatible and capable of combining a visible laser light at any time for calibration purposes. The red laser is a commercially produced 635 nm, 5 mW laser diode, and the white-light source is a commercially produced tungsten halogen lamp that gives a broad spectrum of about 525 to 800 nm full width at half maximum (FWHM), with about 1.4 mW of power at 630 nm. A custom-made beam-splitter window with a special broadband coating is used with the white-light input via a 50-µm multimode fiber. The large-mode-area PCF is an LMA-8 made by Crystal Fibre (core diameter of 8.5 µm, mode field diameter of 6 µm, and numerical aperture at 625 nm of 0.083).
Any science interferometer that needs a tracking laser fringe to assist in alignment can use this system.
NASA Astrophysics Data System (ADS)
Grasser, R.; Peyronneaudi, Benjamin; Yon, Kevin; Aubry, Marie
2015-10-01
CILAS, a subsidiary of Airbus Defense and Space, develops, manufactures, and sells laser-based optronics equipment for defense and homeland security applications. Part of its activity concerns active systems for threat detection, recognition, and identification. Active surveillance and active imaging systems are often required to achieve identification capability for long-range observation in adverse conditions. To ease the deployment of active imaging systems, which are often complex and expensive, CILAS proposes a new concept consisting of two devices working together. On one side, a patented versatile laser platform provides high-peak-power laser illumination for long-range observation. On the other side, a small camera add-on works as a fast optical switch to select only photons with a specific time of flight. Together, the versatile illumination platform and the fast optical switch form an independent unit, the so-called "flash module," giving virtually any passive observation system gated active-imaging capability in the NIR and SWIR.
The laser and optical system for the RIBF-PALIS experiment
NASA Astrophysics Data System (ADS)
Sonoda, T.; Iimura, H.; Reponen, M.; Wada, M.; Katayama, I.; Sonnenschein, V.; Takamatsu, T.; Tomita, H.; Kojima, T. M.
2018-01-01
This paper describes the laser and optical system for Parasitic radioactive isotope (RI) beam production by Laser Ion Source (PALIS) in the RIKEN fragment separator facility. This system requires an optical path length of 70 m for transporting the laser beam from the laser light source to the place of resonance ionization. To accomplish this, we designed and implemented a simple optical system consisting of several mirrors equipped with compact stepping-motor actuators, lenses, beam-spot screens, and network cameras. The system enables multi-step laser resonance ionization in the gas cell and gas jet via an overlap, a few millimeters in diameter, between the laser photons and the atomic beam. Despite the long transport distance, we achieved a transport efficiency of about 50% for the UV laser beam. We also confirmed that the position stability of the laser beam stays within the permissible range for dedicated resonance ionization experiments.
On-sky performance of the tip-tilt correction system for GLAS using an EMCCD camera
NASA Astrophysics Data System (ADS)
Skvarč, Jure; Tulloch, Simon
2008-07-01
Adaptive optics systems based on laser guide stars still need a natural guide star (NGS) to correct for the image motion caused by the atmosphere and by imperfect telescope tracking. The ability to properly compensate for this motion using a faint NGS is critical to achieve large sky coverage. For the laser guide system (GLAS) on the 4.2 m William Herschel Telescope we designed and tested in the laboratory and on-sky a tip-tilt correction system based on a PC running Linux and an EMCCD technology camera. The control software allows selection of different centroiding algorithms and loop control methods as well as the control parameters. Parameter analysis has been performed using tip-tilt only correction before the laser commissioning and the selected sets of parameters were then used during commissioning of the laser guide star system. We have established the SNR of the guide star as a function of magnitude, depending on the image sampling frequency and on the dichroic used in the optical system; achieving a measurable improvement using full AO correction with NGSes down to magnitude range R=16.5 to R=18. A minimum SNR of about 10 was established to be necessary for a useful correction. The system was used to produce 0.16 arcsecond images in H band using bright NGS and laser correction during GLAS commissioning runs.
Measurement system with high accuracy for laser beam quality.
Ke, Yi; Zeng, Ciling; Xie, Peiyuan; Jiang, Qingshan; Liang, Ke; Yang, Zhenyu; Zhao, Ming
2015-05-20
Presently, most laser beam quality measurement systems collimate the optical path manually, with low efficiency and low repeatability. To solve these problems, this paper proposes a new collimation method to improve the reliability and accuracy of the measurement results. The system accurately controls the position of the mirror to change the laser beam propagation direction, so that the beam is perpendicularly incident on the photosurface of the camera. The experimental results show that the proposed system has good repeatability and that the measurement deviation of the M2 factor is less than 0.6%.
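M2 measurements of the kind described above are usually built on second-moment beam widths in the ISO 11146 style (d = 4σ), measured from camera images at several propagation distances. A hedged sketch of the width computation only (the paper's actual processing is not specified here):

```python
import numpy as np

def second_moment_width(img):
    """ISO 11146-style second-moment beam widths (d = 4*sigma) along x and y,
    in pixel units, from a background-subtracted beam image."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    cx = (xs * img).sum() / total          # beam centroid, x
    cy = (ys * img).sum() / total          # beam centroid, y
    sx2 = (((xs - cx) ** 2) * img).sum() / total
    sy2 = (((ys - cy) ** 2) * img).sum() / total
    return 4.0 * np.sqrt(sx2), 4.0 * np.sqrt(sy2)
```

M2 itself would then come from a hyperbolic fit of these widths versus z, which is omitted here.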
Wang, Fei; Dong, Hang; Chen, Yanan; Zheng, Nanning
2016-12-09
Strong demand for accurate non-cooperative target measurement has arisen recently in assembly and capture tasks. Spherical objects are among the most common targets in these applications. However, the performance of traditional vision-based reconstruction methods is limited for practical use when handling poorly-textured targets. In this paper, we propose a novel multi-sensor fusion system for measuring and reconstructing textureless non-cooperative spherical targets. Our system consists of four simple lasers and a visual camera. This paper presents a complete framework for estimating the geometric parameters of textureless spherical targets: (1) an approach to calibrate the extrinsic parameters between a camera and simple lasers; and (2) a method to reconstruct the 3D positions of the laser spots on the target surface and achieve refined results via an optimization scheme. The experimental results show that our proposed calibration method obtains a fine calibration result, comparable to state-of-the-art LRF-based methods, and that our calibrated system can estimate the geometric parameters with high accuracy in real time.
Enhanced Monocular Visual Odometry Integrated with Laser Distance Meter for Astronaut Navigation
Wu, Kai; Di, Kaichang; Sun, Xun; Wan, Wenhui; Liu, Zhaoqin
2014-01-01
Visual odometry provides astronauts with accurate knowledge of their position and orientation. Wearable astronaut navigation systems should be simple and compact. Therefore, monocular vision methods are preferred over stereo vision systems, commonly used in mobile robots. However, the projective nature of monocular visual odometry causes a scale ambiguity problem. In this paper, we focus on the integration of a monocular camera with a laser distance meter to solve this problem. The most remarkable advantage of the system is its ability to recover a global trajectory for monocular image sequences by incorporating direct distance measurements. First, we propose a robust and easy-to-use extrinsic calibration method between camera and laser distance meter. Second, we present a navigation scheme that fuses distance measurements with monocular sequences to correct the scale drift. In particular, we explain in detail how to match the projection of the invisible laser pointer on other frames. Our proposed integration architecture is examined using a live dataset collected in a simulated lunar surface environment. The experimental results demonstrate the feasibility and effectiveness of the proposed method. PMID:24618780
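The scale-ambiguity correction described above amounts to anchoring the monocular reconstruction's arbitrary units to one absolute laser distance. A minimal sketch, with illustrative function names not taken from the paper:

```python
import numpy as np

def recover_scale(laser_distance_m, depth_up_to_scale):
    """Absolute scale factor from a single laser distance measurement.

    laser_distance_m:   metric distance to the laser spot (meters).
    depth_up_to_scale:  the same point's depth in the monocular
                        reconstruction's arbitrary units.
    """
    return laser_distance_m / depth_up_to_scale

def rescale_trajectory(positions, scale):
    """Apply the recovered scale to an up-to-scale camera trajectory."""
    return [scale * np.asarray(p, dtype=float) for p in positions]
```

In the paper's scheme such measurements are fused repeatedly along the sequence to correct scale drift; this sketch shows only the single-measurement anchoring.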
Imaging strategies for the study of gas turbine spark ignition
NASA Astrophysics Data System (ADS)
Gord, James R.; Tyler, Charles; Grinstead, Keith D., Jr.; Fiechtner, Gregory J.; Cochran, Michael J.; Frus, John R.
1999-10-01
Spark-ignition systems play a critical role in the performance of essentially all gas turbine engines. These devices are responsible for initiating the combustion process that sustains engine operation. Demanding applications such as cold start and high-altitude relight require continued enhancement of ignition systems. To characterize advanced ignition systems, we have developed a number of laser-based diagnostic techniques configured for ultrafast imaging of spark parameters including emission, density, temperature, and species concentration. These diagnostics have been designed to exploit an ultrafast-framing charge-coupled-device (CCD) camera and high-repetition-rate laser sources including mode-locked Ti:sapphire oscillators and regenerative amplifiers. Spontaneous-emission and laser-schlieren measurements have been accomplished with this instrumentation and the results applied to the study of a novel Unison Industries spark igniter that shows great promise for improved cold-start and high-altitude-relight capability compared to that of igniters currently in use throughout military and commercial fleets. Phase-locked and ultrafast real-time imaging strategies are explored, and details of the imaging instrumentation, particularly the CCD camera and laser sources, are discussed.
NASA Astrophysics Data System (ADS)
Köhler, M.; Boxx, I.; Geigle, K. P.; Meier, W.
2011-05-01
We describe a newly developed combustion diagnostic for the simultaneous planar imaging of soot structure and velocity fields in a highly sooting, lifted turbulent jet flame at 3000 frames per second, or two orders of magnitude faster than "conventional" laser imaging systems. This diagnostic uses short-pulse-duration (8 ns), frequency-doubled, diode-pumped solid-state (DPSS) lasers to excite laser-induced incandescence (LII) at 3 kHz, which is then imaged onto a high-framerate CMOS camera. A second (dual-cavity) DPSS laser and CMOS camera form the basis of a particle image velocimetry (PIV) system used to acquire two-component velocity fields in the flame. The LII response curve (measured in a laminar propane diffusion flame) is presented and the combined diagnostics are then applied in a heavily sooting lifted turbulent jet flame. The potential challenges and rewards of applying this combined imaging technique at high speeds are discussed.
A feasibility study of damage detection in beams using high-speed camera (Conference Presentation)
NASA Astrophysics Data System (ADS)
Wan, Chao; Yuan, Fuh-Gwo
2017-04-01
In this paper a method for damage detection in beam structures using a high-speed camera is presented. Traditional methods of damage detection in structures typically involve contact sensors (i.e., piezoelectric sensors or accelerometers) or non-contact sensors (i.e., laser vibrometers), which can make inspection of an entire structure costly and time-consuming. With the popularity of the digital camera and the development of computer vision technology, video cameras offer a viable measurement capability with higher spatial resolution, remote sensing and low cost. In this study, a damage detection method based on a high-speed camera was proposed. The setup comprises a high-speed camera and a line laser which can capture the out-of-plane displacement of a cantilever beam. The cantilever beam, with an artificial crack, was excited, and the vibration process was recorded by the camera. A methodology called motion magnification, which can amplify subtle motions in a video, is used for modal identification of the beam. A finite element model was used for validation of the proposed method. Suggestions for applications of this methodology and challenges in future work are discussed.
Coincidence ion imaging with a fast frame camera
NASA Astrophysics Data System (ADS)
Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen
2014-12-01
A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from the fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of the PMT processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of the PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide.
Laser scatter feature of surface defect on apples
NASA Astrophysics Data System (ADS)
Rao, Xiuqin; Ying, Yibin; Cen, YiKe; Huang, Haibo
2006-10-01
A machine vision system for real-time fruit quality inspection was developed. The system consists of a chamber, a laser projector, a TMS-7DSP CCD camera (PULNIX Inc.) and a computer. A Meteor-II/MC frame grabber (Matrox Graphics Inc.) was inserted into a slot of the computer to grab fruit images. The laser projector and the camera were mounted at the ceiling of the chamber. An apple was placed in the chamber, the spot of the laser projector was projected onto the surface of the fruit, and an image was grabbed. Two breeds of apples were tested. Each apple was imaged twice: once for the normal surface and once for the defect. The red component of the images was used to extract features of the defect and of the sound surface of the fruit. The average, standard deviation and entropy values of the red component of the laser scatter image were analyzed. The standard deviation of the red component was the most suitable feature for separating defect surfaces from sound surfaces for Shuijin Fuji apples, but for Bintang apples more work is needed to separate the different surfaces with laser scatter images.
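The three red-channel features named in the abstract (mean, standard deviation, entropy) can be sketched as follows; the histogram-based entropy definition is an assumption, since the abstract does not give one:

```python
import numpy as np

def red_channel_features(rgb_image, bins=256):
    """Mean, standard deviation and entropy of the red component
    of a laser-scatter image (8-bit RGB array, H x W x 3)."""
    red = np.asarray(rgb_image, dtype=float)[:, :, 0]
    hist, _ = np.histogram(red, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                           # drop empty bins before log
    entropy = -(p * np.log2(p)).sum()      # Shannon entropy in bits
    return red.mean(), red.std(), entropy
```

A classifier would then threshold the standard deviation (the feature the study found most discriminative for Shuijin Fuji apples).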
Spickermann, Gunnar; Friederich, Fabian; Roskos, Hartmut G; Bolívar, Peter Haring
2009-11-01
We present a 64×48 pixel 2D electro-optical terahertz (THz) imaging system using a photonic mixing device time-of-flight camera as an optical demodulating detector array. The combination of electro-optic detection with a time-of-flight camera increases sensitivity drastically, enabling the use of a non-amplified laser source for high-resolution real-time THz electro-optic imaging.
Pothole Detection System Using a Black-box Camera.
Jo, Youngtae; Ryu, Seungki
2015-11-19
Aging roads and poor road-maintenance systems result in a large number of potholes, whose numbers increase over time. Potholes jeopardize road safety and transportation efficiency. Moreover, they are often a contributing factor to car accidents. To address the problems associated with potholes, the locations and sizes of potholes must be determined quickly. Sophisticated road-maintenance strategies can be developed using a pothole database, which requires a specific pothole-detection system that can collect pothole information at low cost and over a wide area. However, pothole repair has long relied on manual detection efforts. Recent automatic detection systems, such as those based on vibrations or laser scanning, are insufficient to detect potholes correctly and inexpensively, owing to the unstable detection of vibration-based methods and the high costs of laser scanning-based methods. Thus, in this paper, we introduce a new pothole-detection system using a commercial black-box camera. The proposed system detects potholes over a wide area and at low cost. We have developed a novel pothole-detection algorithm specifically designed to work within the embedded computing environments of black-box cameras. Experimental results are presented for our proposed system, showing that potholes can be detected accurately in real time.
Soft x-ray streak camera for laser fusion applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stradling, G.L.
This thesis reviews the development and significance of the soft x-ray streak camera (SXRSC) in the context of inertial confinement fusion energy development. A brief introduction to laser fusion and laser fusion diagnostics is presented. The need for a soft x-ray streak camera as a laser fusion diagnostic is shown. Basic x-ray streak camera characteristics, design, and operation are reviewed. The SXRSC design criteria, the requirement for a subkilovolt x-ray transmitting window, and the resulting camera design are explained. Theory and design of reflector-filter pair combinations for three subkilovolt channels centered at 220 eV, 460 eV, and 620 eV are also presented. Calibration experiments are explained, and data showing a dynamic range of 1000 and a sweep speed of 134 psec/mm are presented. Sensitivity modifications to the soft x-ray streak camera for a high-power target shot are described. A preliminary investigation, using a stepped cathode, of the thickness dependence of the gold photocathode response is discussed. Data from a typical Argus laser gold-disk target experiment are shown.
Darmanis, Spyridon; Toms, Andrew; Durman, Robert; Moore, Donna; Eyres, Keith
2007-07-01
The aim was to reduce the operating time in computer-assisted navigated total knee replacement (TKR) by improving communication between the infrared camera and the trackers placed on the patient. The innovation involves placing a routinely used laser pointer on top of the camera, so that the infrared cameras focus precisely on the trackers located on the knee being operated on. A prospective randomized study was performed involving 40 patients divided into two groups, A and B. Both groups underwent navigated TKR, but for group B patients a laser pointer was used to improve the targeting capabilities of the cameras. Without the laser pointer, the camera had to be moved a mean of 9.2 times in order to identify the trackers. With the introduction of the laser pointer, this was reduced to 0.9 times. Accordingly, the additional mean time required without the laser pointer was 11.6 minutes. Time delays are a major problem in computer-assisted surgery, and our technical suggestion can contribute towards reducing the delays associated with this particular application.
Wide-field Fourier ptychographic microscopy using laser illumination source
Chung, Jaebum; Lu, Hangwen; Ou, Xiaoze; Zhou, Haojiang; Yang, Changhuei
2016-01-01
Fourier ptychographic (FP) microscopy is a coherent imaging method that can synthesize an image with a higher bandwidth using multiple low-bandwidth images captured at different spatial frequency regions. The method’s demand for multiple images drives the need for a brighter illumination scheme and a high-frame-rate camera for a faster acquisition. We report the use of a guided laser beam as an illumination source for an FP microscope. It uses a mirror array and a 2-dimensional scanning Galvo mirror system to provide a sample with plane-wave illuminations at diverse incidence angles. The use of a laser presents speckles in the image capturing process due to reflections between glass surfaces in the system. They appear as slowly varying background fluctuations in the final reconstructed image. We are able to mitigate these artifacts by including a phase image obtained by differential phase contrast (DPC) deconvolution in the FP algorithm. We use a 1-Watt laser configured to provide a collimated beam with 150 mW of power and beam diameter of 1 cm to allow for the total capturing time of 0.96 seconds for 96 raw FPM input images in our system, with the camera sensor’s frame rate being the bottleneck for speed. We demonstrate a factor of 4 resolution improvement using a 0.1 NA objective lens over the full camera field-of-view of 2.7 mm by 1.5 mm. PMID:27896016
A Virtual Blind Cane Using a Line Laser-Based Vision System and an Inertial Measurement Unit
Dang, Quoc Khanh; Chee, Youngjoon; Pham, Duy Duong; Suh, Young Soo
2016-01-01
A virtual blind cane system for indoor application, including a camera, a line laser and an inertial measurement unit (IMU), is proposed in this paper. Working as a blind cane, the proposed system helps a blind person find the type of obstacle and the distance to it. The distance from the user to the obstacle is estimated by extracting the laser coordinate points on the obstacle, as well as tracking the system pointing angle. The paper provides a simple method to classify the obstacle’s type by analyzing the laser intersection histogram. Real experimental results are presented to show the validity and accuracy of the proposed system. PMID:26771618
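The abstract above estimates obstacle distance from the laser spot coordinates and the system's pointing angle. A heavily simplified triangulation sketch under assumed geometry (laser axis parallel to the camera's optical axis, known baseline) — not the paper's actual model:

```python
import math

def laser_spot_range(pixel_offset, focal_px, baseline_m):
    """Range to the surface hit by the laser, by triangulation.

    Assumes the laser axis is parallel to the camera's optical axis and
    displaced by baseline_m; pixel_offset is the laser spot's image
    displacement from the principal point along the baseline direction.
    """
    if pixel_offset <= 0:
        raise ValueError("spot must be displaced from the principal point")
    return focal_px * baseline_m / pixel_offset

def ground_distance(range_m, tilt_rad):
    """Horizontal distance to the obstacle when the cane is tilted
    down by tilt_rad (an IMU would supply this angle)."""
    return range_m * math.cos(tilt_rad)
```

The paper additionally tracks the pointing angle with the IMU and classifies obstacle type from the laser intersection histogram, neither of which is sketched here.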
Nike Facility Diagnostics and Data Acquisition System
NASA Astrophysics Data System (ADS)
Chan, Yung; Aglitskiy, Yefim; Karasik, Max; Kehne, David; Obenschain, Steve; Oh, Jaechul; Serlin, Victor; Weaver, Jim
2013-10-01
The Nike laser-target facility is a 56-beam krypton fluoride system that can deliver 2 to 3 kJ of laser energy at 248 nm onto targets inside a two-meter-diameter vacuum chamber. Nike is used to study physics and technology issues related to laser direct-drive inertial confinement fusion (ICF), including hydrodynamic and laser-plasma instabilities, material behavior at extreme pressures, and optical and x-ray diagnostics for laser-heated targets. A suite of laser and target diagnostics is fielded on the Nike facility, including high-speed, high-resolution x-ray and visible imaging cameras, spectrometers and photo-detectors. A centrally controlled, distributed computerized data acquisition system provides robust data management and near real-time analysis feedback capability during target shots. Work supported by DOE/NNSA.
Crackscope : automatic pavement cracking inspection system.
DOT National Transportation Integrated Search
2008-08-01
The CrackScope system is an automated pavement crack rating system consisting of a : digital line scan camera, laser-line illuminator, and proprietary crack detection and classification : software. CrackScope is able to perform real-time pavement ins...
Upgrading the Arecibo Potassium Lidar Receiver for Meridional Wind Measurements
NASA Astrophysics Data System (ADS)
Piccone, A. N.; Lautenbach, J.
2017-12-01
Lidar can be used to measure a plethora of variables: temperature, density of metals, and wind. This REU project is focused on the setup of a semi-steerable telescope that will allow the measurement of meridional wind in the mesosphere (80-105 km) with Arecibo Observatory's potassium resonance lidar. This includes the basic design concept of a steering system that is able to turn the telescope to a maximum of 40°, alignment of the mirror with the telescope frame to find the correct focusing, and the triggering and programming of a CCD camera. The CCD camera's purpose is twofold: looking through the telescope and matching the stars in the field of view with a star map to accurately calibrate the steering system, and determining the laser beam properties and position. Using LabVIEW, the frames from the CCD camera can be analyzed to identify the most intense pixel in the image (and therefore the brightest point in the laser beam or stars) by plotting average pixel values per row and column and locating the peaks of these plots. The location of this pixel can then be plotted, determining the jitter in the laser and its position within the field of view of the telescope.
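The brightest-pixel search described above (locating the peaks of the per-row and per-column average pixel values) can be sketched directly; a minimal version of what the LabVIEW analysis computes:

```python
import numpy as np

def brightest_pixel(frame):
    """Locate the most intense pixel by peaking the per-column and
    per-row mean pixel values, returning (x, y) = (column, row)."""
    img = np.asarray(frame, dtype=float)
    col_means = img.mean(axis=0)   # average pixel value per column
    row_means = img.mean(axis=1)   # average pixel value per row
    return int(np.argmax(col_means)), int(np.argmax(row_means))
```

Logging this location over successive frames would give the beam jitter described in the abstract.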
Selective laser ablation of carious lesions using simultaneous scanned near-IR diode and CO2 lasers
NASA Astrophysics Data System (ADS)
Chan, Kenneth H.; Fried, Daniel
2017-02-01
Previous studies have established that carious lesions can be imaged with high contrast using near-IR wavelengths coincident with high water absorption, namely 1450-nm, without the interference of stains. It has been demonstrated that computer-controlled laser scanning systems utilizing IR lasers operating at high pulse repetition rates can be used for serial imaging and selective removal of caries lesions. In this study, a point-to-point scanning system was developed integrating a 1450-nm diode laser with the CO2 ablation laser. This approach is advantageous since it does not require an expensive near-IR camera. In this pilot study, we demonstrate the feasibility of a combined NIR and IR laser system for the selective removal of carious lesions.
Extrinsic Calibration of Camera and 2D Laser Sensors without Overlap.
Ahmad Yousef, Khalil M; Mohd, Bassam J; Al-Widyan, Khalid; Hayajneh, Thaier
2017-10-14
Extrinsic calibration of camera and 2D laser range finder (lidar) sensors is crucial in sensor data fusion applications, for example SLAM algorithms used in mobile robot platforms. The fundamental challenge of extrinsic calibration arises when the camera-lidar sensors do not overlap or share the same field of view. In this paper we propose a novel and flexible approach for the extrinsic calibration of a camera-lidar system without overlap, which can be used for robotic platform self-calibration. The approach is based on the robot-world hand-eye calibration (RWHE) problem, proven to have efficient and accurate solutions. First, the system was mapped to the RWHE calibration problem modeled as the linear relationship AX = ZB, where X and Z are unknown calibration matrices. Then, we computed the transformation matrix B, which was the main challenge in the above mapping. The computation is based on reasonable assumptions about the geometric structure of the calibration environment. The reliability and accuracy of the proposed approach is compared to a state-of-the-art method in extrinsic 2D lidar to camera calibration. Experimental results from real datasets indicate that the proposed approach provides better results, with L2-norm translational and rotational deviations of 314 mm and 0.12°, respectively.
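The RWHE relationship AX = ZB can be checked numerically once candidate calibration matrices X and Z are available; a minimal residual function (a verification sketch, not the authors' solver):

```python
import numpy as np

def axzb_residual(A_list, B_list, X, Z):
    """Total Frobenius-norm residual of the RWHE relationship A X = Z B
    over a set of measurement pairs (all 4x4 homogeneous transforms).
    A good calibration drives this toward zero."""
    return sum(np.linalg.norm(A @ X - Z @ B) for A, B in zip(A_list, B_list))
```

Actually solving for X and Z requires one of the established RWHE algorithms (e.g. quaternion- or Kronecker-product-based least squares), which this sketch deliberately omits.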
Remote sensing of atmospheric pressure and sea state using laser altimeters
NASA Technical Reports Server (NTRS)
Gardner, C. S.
1985-01-01
Short-pulse multicolor laser ranging systems are currently being developed for satellite ranging applications. These systems use Q-switched pulsed lasers and streak-tube cameras to provide timing accuracies approaching a few picoseconds. Satellite laser ranging systems have been used to evaluate many important geophysical phenomena such as fault motion, polar motion and solid earth tides, by measuring the orbital perturbations of retroreflector equipped satellites. Some existing operational systems provide range resolution approaching a few millimeters. There is currently considerable interest in adapting these highly accurate systems for use as airborne and satellite based altimeters. Potential applications include the measurement of sea state, ground topography and atmospheric pressure. This paper reviews recent progress in the development of multicolor laser altimeters for use in monitoring sea state and atmospheric pressure.
NASA Astrophysics Data System (ADS)
Bykov, A. A.; Kutuza, I. B.; Zinin, P. V.; Machikhin, A. S.; Troyan, I. A.; Bulatov, K. M.; Batshev, V. I.; Mantrova, Y. V.; Gaponov, M. I.; Prakapenka, V. B.; Sharma, S. K.
2018-01-01
Recently it has been shown that it is possible to measure the two-dimensional distribution of the surface temperature of microscopic specimens. The main component of the system is a tandem imaging acousto-optical tunable filter synchronized with a video camera. In this report, we demonstrate that combining the laser heating system with a tandem imaging acousto-optical tunable filter allows measurement of the temperature distribution under laser heating of platinum plates, as well as visualization of the infrared laser beam that is widely used for laser heating in diamond anvil cells.
Development of an Extra-vehicular (EVA) Infrared (IR) Camera Inspection System
NASA Technical Reports Server (NTRS)
Gazarik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Pandolf, John; Jenkins, Rusty; Yates, Rusty
2006-01-01
Designed to fulfill a critical inspection need for the Space Shuttle Program, the EVA IR Camera System can detect cracks and subsurface defects in the Reinforced Carbon-Carbon (RCC) sections of the Space Shuttle's Thermal Protection System (TPS). The EVA IR Camera performs this detection by taking advantage of the natural thermal gradients induced in the RCC by solar flux and thermal emission from the Earth. This instrument is a compact, low-mass, low-power solution (1.2 cm3, 1.5 kg, 5.0 W) for TPS inspection that exceeds existing requirements for feature detection. Taking advantage of ground-based IR thermography techniques, the EVA IR Camera System provides the Space Shuttle Program with a solution that can be accommodated by the existing inspection system. The EVA IR Camera System augments the visible and laser inspection systems and finds cracks and subsurface damage not measurable by the other sensors, thus filling a critical gap in the Space Shuttle's inspection needs. This paper discusses the on-orbit RCC inspection measurement concept and requirements, and then presents a detailed description of the EVA IR Camera System design.
NASA Astrophysics Data System (ADS)
Matilainen, Ville-Pekka; Piili, Heidi; Salminen, Antti; Nyrhilä, Olli
Laser additive manufacturing (LAM) is a fabrication technology that enables production of complex parts from metallic materials with mechanical properties comparable to conventionally manufactured parts. In the LAM process, parts are manufactured by melting metallic powder layer-by-layer with a laser beam. This manufacturing technology is nowadays called powder bed fusion (PBF) according to the ASTM F2792-12a standard. This strategy involves several different independent and dependent thermal cycles, all of which have an influence on the final properties of the manufactured part. The quality of PBF parts depends strongly on the characteristics of each single laser-melted track and each single layer. This study consequently concentrates on investigating the effects of process parameters such as laser power on single track and layer formation and on laser-material interaction phenomena occurring during the PBF process. Experimental tests were done with two different machines: a modified research machine based on an EOS EOSINT M-series system and an EOS EOSINT M280 system. The material used was EOS stainless steel 17-4 PH. Process monitoring was done with an actively illuminated high-speed camera system. After microscopy analysis, it was concluded that a keyhole can form during laser additive manufacturing of stainless steel. It was noted that heat input has an important effect on the likelihood of keyhole formation. The threshold intensity value for keyhole formation of 10⁶ W/cm² was exceeded in all manufactured single tracks. Laser interaction time was found to have an effect on penetration depth and keyhole formation, since the penetration depth increased with increased laser interaction time. It was also concluded that actively illuminated high-speed camera systems are suitable for monitoring of the manufacturing process and facilitate process control.
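The keyhole-threshold comparison above can be sketched by estimating the peak intensity of a Gaussian beam, I0 = 2P/(π w0²); the Gaussian profile and the spot-size parameterization are assumptions for illustration, not values taken from the study:

```python
import math

KEYHOLE_THRESHOLD = 1e6  # W/cm^2, the threshold intensity cited in the study

def peak_intensity_w_per_cm2(power_w, spot_diameter_um):
    """Peak intensity of a Gaussian beam, I0 = 2P / (pi * w0^2),
    with w0 the 1/e^2 beam radius, returned in W/cm^2."""
    w0_cm = (spot_diameter_um / 2.0) * 1e-4  # um -> cm
    return 2.0 * power_w / (math.pi * w0_cm ** 2)

def exceeds_keyhole_threshold(power_w, spot_diameter_um):
    """True if the estimated peak intensity exceeds the keyhole threshold."""
    return peak_intensity_w_per_cm2(power_w, spot_diameter_um) > KEYHOLE_THRESHOLD
```

For typical PBF parameters (hundreds of watts focused to a spot of order 100 μm) this estimate lands well above 10⁶ W/cm², consistent with the study's observation that all single tracks exceeded the threshold.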
SPADAS: a high-speed 3D single-photon camera for advanced driver assistance systems
NASA Astrophysics Data System (ADS)
Bronzi, D.; Zou, Y.; Bellisai, S.; Villa, F.; Tisa, S.; Tosi, A.; Zappa, F.
2015-02-01
Advanced Driver Assistance Systems (ADAS) are the most advanced technologies to fight road accidents. Within ADAS, an important role is played by radar- and lidar-based sensors, which are mostly employed for collision avoidance and adaptive cruise control. Nonetheless, they have a narrow field-of-view and a limited ability to detect and differentiate objects. Standard camera-based technologies (e.g. stereovision) could balance these weaknesses, but they are currently not able to fulfill all automotive requirements (distance range, accuracy, acquisition speed, and frame-rate). To this purpose, we developed an automotive-oriented CMOS single-photon camera for optical 3D ranging based on indirect time-of-flight (iTOF) measurements. Imagers based on single-photon avalanche diode (SPAD) arrays offer higher sensitivity with respect to CCD/CMOS rangefinders, inherently better time resolution, higher accuracy and better linearity. Moreover, iTOF requires neither high-bandwidth electronics nor short-pulsed lasers, hence allowing the development of cost-effective systems. The CMOS SPAD sensor is based on 64 × 32 pixels, each able to process both 2D intensity data and 3D depth-ranging information, with background suppression. Pixel-level memories allow fully parallel imaging and prevent motion artefacts (skew, wobble, motion blur) and partial exposure effects, which otherwise would hinder the detection of fast-moving objects. The camera is housed in an aluminum case supporting a 12 mm F/1.4 C-mount imaging lens, with a 40°×20° field-of-view. The whole system is very rugged and compact, a perfect solution for a vehicle's cockpit, with dimensions of 80 mm × 45 mm × 70 mm and less than 1 W consumption. To provide the required optical power (1.5 W, eye safe) and to allow fast (up to 25 MHz) modulation of the active illumination, we developed a modular laser source, based on five laser driver cards, with three 808 nm lasers each.
We present the full characterization of the 3D automotive system, operated both at night and during daytime, in indoor and outdoor real-traffic scenarios. The achieved long range (up to 45 m), high dynamic range (118 dB), high-speed (over 200 fps) 3D depth measurement, and high precision (better than 90 cm at 45 m) highlight the excellent performance of this CMOS SPAD camera for automotive applications.
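As an illustrative sketch of the iTOF principle described above: a continuous-wave iTOF pixel recovers depth from the phase delay of the modulated illumination, usually estimated from four phase-stepped correlation samples. The sampling convention and the 25 MHz figure below are assumptions for illustration (note that 25 MHz alone gives an unambiguous range of only c/2f ≈ 6 m, so a 45 m range implies lower modulation frequencies or phase unwrapping, which this sketch omits).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(a0, a1, a2, a3, f_mod):
    """Depth from four correlation samples taken at phase steps of
    0°, 90°, 180°, 270° (assumed convention: a_k ∝ cos(phi - k*pi/2)).
    The phase delay of the modulated light maps linearly to distance."""
    phase = math.atan2(a1 - a3, a0 - a2) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)

# Example: samples for a target at 2 m with 25 MHz modulation
f = 25e6
phi = 4 * math.pi * f * 2.0 / C
samples = [10 + 5 * math.cos(phi - k * math.pi / 2) for k in range(4)]
```

The offset (10) and amplitude (5) cancel in the arctangent, which is what gives iTOF its background suppression at the pixel level.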
NASA Astrophysics Data System (ADS)
Nara, Shunsuke; Takahashi, Satoru
In this paper, we develop an observation device to measure the working radius of a crane truck. The device has a single CCD camera, a laser range finder, and two AC servo motors. First, in order to measure the working radius, we need an algorithm for crane hook recognition. We attach a cross mark to the crane hook; that is, instead of the crane hook itself, we recognize the cross mark. Further, for the observation device, we construct a PI control system with an extended Kalman filter to track the moving cross mark. Through experiments, we show the usefulness of our device, including the new mark-tracking control system.
Precision of FLEET Velocimetry Using High-speed CMOS Camera Systems
NASA Technical Reports Server (NTRS)
Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.
2015-01-01
Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. We also compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems, which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 μs, precisions of 0.5 m/s in air and 0.2 m/s in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision High Speed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delays generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
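The row-wise digital binning mentioned above can be sketched as a simple post-processing sum over groups of adjacent rows (here k = 8, matching the paper's binning factor; the exact implementation used by the authors is not given, so this is a minimal illustration):

```python
import numpy as np

def bin_rows(frame, k=8):
    """Row-wise digital binning: sum each group of k adjacent rows in
    post-processing. Rows beyond the last full group are trimmed.
    Summing k pixels improves SNR for read-noise-limited signals."""
    h = (frame.shape[0] // k) * k
    return frame[:h].reshape(h // k, k, frame.shape[1]).sum(axis=1)
```

Unlike on-sensor binning, this keeps the original frame available, so the binning factor can be tuned after acquisition.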
Ultrafast Imaging of Electronic Motion in Atoms and Molecules
2016-01-12
…100 fs. The charge and duration of the electron pulses were measured with a home-made Faraday cup and a laser-triggered streak camera, respectively. Both are retractable and can measure the beam in-situ. The gun was shown to generate pulses…
NASA Astrophysics Data System (ADS)
Chen, C.; Zou, X.; Tian, M.; Li, J.; Wu, W.; Song, Y.; Dai, W.; Yang, B.
2017-11-01
In order to automate the 3D indoor mapping task, a low-cost multi-sensor robot laser scanning system is proposed in this paper. The system includes a panorama camera, a laser scanner, and an inertial measurement unit, among other sensors, which are calibrated and synchronized together to achieve simultaneous collection of 3D indoor data. Experiments were undertaken in a typical indoor scene, and the data generated by the proposed system were compared with ground-truth data collected by a TLS scanner, showing that 99.2% of points deviate by less than 0.25 m, which demonstrates the applicability and precision of the system in indoor mapping applications.
Nondestructive defect detection in laser optical coatings
NASA Astrophysics Data System (ADS)
Marrs, C. D.; Porteus, J. O.; Palmer, J. R.
1985-03-01
Defects responsible for laser damage in visible-wavelength mirrors are observed at nondamaging intensities using a new video microscope system. Studies suggest that a defect scattering phenomenon combined with lag characteristics of video cameras makes this possible. Properties of the video-imaged light are described for multilayer dielectric coatings and diamond-turned metals.
Luegmair, Georg; Mehta, Daryush D.; Kobler, James B.; Döllinger, Michael
2015-01-01
Vocal fold kinematics and its interaction with aerodynamic characteristics play a primary role in acoustic sound production of the human voice. Investigating the temporal details of these kinematics using high-speed videoendoscopic imaging techniques has proven challenging in part due to the limitations of quantifying complex vocal fold vibratory behavior using only two spatial dimensions. Thus, we propose an optical method of reconstructing the superior vocal fold surface in three spatial dimensions using a high-speed video camera and laser projection system. Using stereo-triangulation principles, we extend the camera-laser projector method and present an efficient image processing workflow to generate the three-dimensional vocal fold surfaces during phonation captured at 4000 frames per second. Initial results are provided for airflow-driven vibration of an ex vivo vocal fold model in which at least 75% of visible laser points contributed to the reconstructed surface. The method captures the vertical motion of the vocal folds at a high accuracy to allow for the computation of three-dimensional mucosal wave features such as vibratory amplitude, velocity, and asymmetry. PMID:26087485
Coincidence ion imaging with a fast frame camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei
2014-12-15
A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single-anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the PMT processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of the PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide.
A fuzzy automated object classification by infrared laser camera
NASA Astrophysics Data System (ADS)
Kanazawa, Seigo; Taniguchi, Kazuhiko; Asari, Kazunari; Kuramoto, Kei; Kobashi, Syoji; Hata, Yutaka
2011-06-01
Home security at night is very important, and a system that watches a person's movements is useful for security. This paper describes a system that classifies adults, children, and other objects from the distance distribution measured by an infrared laser camera. The camera radiates near-infrared waves, receives the reflected ones, and converts the time of flight into a distance distribution. Our method consists of four steps. First, we perform background subtraction and noise rejection on the distance distribution. Second, we apply fuzzy clustering to the distance distribution to form several clusters. Third, we extract features such as the height, thickness, aspect ratio, and area ratio of each cluster. Then, we make fuzzy if-then rules from knowledge of adults, children, and other objects so as to classify each cluster as adult, child, or other; here, we constructed a fuzzy membership function for each feature. Finally, we assign each cluster the class with the highest fuzzy degree among adult, child, and other. In our experiment, we set up the camera in a room and tested three cases. The method successfully classified them in real-time processing.
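The fuzzy classification step above can be sketched as follows. The triangular membership functions, their parameters, and the reduction to a single height feature are illustrative assumptions; the paper uses several features (height, thickness, aspect ratio, area ratio) and hand-crafted rules:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical height (metres) membership functions per class.
HEIGHT_MF = {
    "adult": lambda h: tri(h, 1.4, 1.7, 2.1),
    "child": lambda h: tri(h, 0.7, 1.1, 1.5),
    "other": lambda h: tri(h, 0.0, 0.4, 0.9),
}

def classify(height):
    """Assign the class with the highest fuzzy degree (a one-feature
    reduction of the paper's multi-feature if-then rule base)."""
    return max(HEIGHT_MF, key=lambda c: HEIGHT_MF[c](height))
```

A full implementation would take the minimum degree across all feature memberships for each rule before selecting the maximum over classes.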
Design and Development of a Low-Cost Aerial Mobile Mapping System for Multi-Purpose Applications
NASA Astrophysics Data System (ADS)
Acevedo Pardo, C.; Farjas Abadía, M.; Sternberg, H.
2015-08-01
The research project with the working title "Design and Development of a Low-Cost Modular Aerial Mobile Mapping System" took shape over the past year as the result of numerous discussions and considerations with colleagues from the HafenCity University Hamburg, Department of Geomatics. The aim of the project is to design a sensor platform that can be mounted preferentially on a UAV, but can also be integrated on any adaptable vehicle. The system should perform direct scanning of surfaces with a laser scanner, supported by sensors for determining the position and attitude of the platform. The modular design allows its extension with other sensors such as multispectral cameras, digital cameras, or multi-camera systems.
A novel design measuring method based on linearly polarized laser interference
NASA Astrophysics Data System (ADS)
Cao, Yanbo; Ai, Hua; Zhao, Nan
2013-09-01
The interferometric method is widely used in precision measurement, including testing the surface quality of large-aperture mirrors, and laser interference technology has developed rapidly as laser sources have become more mature and reliable. We adopted a laser diode as the source because its short coherence length suits a system whose optical path difference is only a few wavelengths, and its power is sufficient for measurement while remaining safe to the human eye. Using a 673 nm linearly polarized laser, we constructed a novel interferometric system we call a "Closed Loop", composed of polarizing optical components such as a polarizing prism and quartz wave plates. The source light is split into a measuring beam and a reference beam, both of which are reflected by the mirror under test. After the two beams are transformed into circular polarizations spinning in opposite directions, we apply polarized-light synchronous phase-shift interference to obtain the detection fringes; this transfers the phase shifting from the time domain to space, so we need not precisely control a shift of the optical path difference, which would introduce disturbances from air currents and vibration. We acquire interference fringes from four well-aligned CCD cameras, shifted in phase by 0, π/2, π, and 3π/2. After obtaining the images, we align the fringes pixel-to-pixel across the cameras and synthesize the rough morphology; after removing systematic error, we can calculate the surface accuracy of the mirror under test. This detection method can be applied to measuring optical-system aberrations, and could develop into a portable structural interferometer for use in a variety of measuring circumstances.
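With four fringe images at phase shifts of 0, π/2, π, and 3π/2, the wrapped phase map is conventionally recovered with the standard four-step algorithm. The paper does not spell out its phase-retrieval formula, so the following is the textbook version under that assumption:

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Standard four-step phase-shifting algorithm: for intensities
    I_k = A + B*cos(phi + k*pi/2), the wrapped phase is
    phi = arctan2(I3 - I1, I0 - I2). Background A and modulation B
    cancel, so the result is insensitive to uneven illumination."""
    return np.arctan2(i3 - i1, i0 - i2)
```

The returned phase is wrapped to (-π, π]; surface height follows after phase unwrapping and scaling by λ/4π for a double-pass reflection setup.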
NASA Technical Reports Server (NTRS)
Glaeser, P.; Haase, I.; Oberst, J.; Neumann, G. A.
2013-01-01
We have derived algorithms and techniques to precisely co-register laser altimeter profiles with gridded Digital Terrain Models (DTMs), typically derived from stereo images. The algorithm consists of an initial grid search followed by least-squares matching and yields the translation parameters at sub-pixel level needed to align the DTM and the laser profiles in 3D space. This software tool was primarily developed and tested for co-registration of laser profiles from the Lunar Orbiter Laser Altimeter (LOLA) with DTMs derived from the Lunar Reconnaissance Orbiter (LRO) Narrow Angle Camera (NAC) stereo images. Data sets can be co-registered with positional accuracy between 0.13 m and several meters depending on the pixel resolution and number of laser shots, where rough surfaces typically result in more accurate co-registrations. Residual heights of the data sets are as small as 0.18 m. The software can be used to identify instrument misalignment, orbit errors, pointing jitter, or problems associated with the reference frames being used. Also, assessments of DTM effective resolutions can be obtained. From the correct position between the two data sets, comparisons of surface morphology and roughness can be made at laser footprint- or DTM pixel-level. The precise co-registration allows us to carry out joint analysis of the data sets and ultimately to derive merged high-quality data products. Examples of matching other planetary data sets, such as LOLA with LRO Wide Angle Camera (WAC) DTMs, or Mars Orbiter Laser Altimeter (MOLA) with stereo models from the High Resolution Stereo Camera (HRSC), as well as Mercury Laser Altimeter (MLA) with Mercury Dual Imaging System (MDIS), are shown to demonstrate the broad science applications of the software tool.
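The initial grid-search stage described above can be sketched as a brute-force search over candidate horizontal translations, scoring each by the RMS height residual between the laser shots and the DTM. The unit grid spacing, nearest-post lookup, and integer shift range are simplifying assumptions; the actual tool refines the grid minimum by least-squares matching to sub-pixel level:

```python
import numpy as np

def coregister(dtm, xs, ys, zs, shifts):
    """Grid search for the (dx, dy) translation minimizing the RMS
    height residual between laser shots (xs, ys, zs) and a gridded
    DTM with unit posts (nearest-post lookup, edges clipped)."""
    best_shift, best_rms = None, np.inf
    for dx in shifts:
        for dy in shifts:
            ix = np.clip(np.round(xs + dx).astype(int), 0, dtm.shape[1] - 1)
            iy = np.clip(np.round(ys + dy).astype(int), 0, dtm.shape[0] - 1)
            rms = np.sqrt(np.mean((dtm[iy, ix] - zs) ** 2))
            if rms < best_rms:
                best_shift, best_rms = (dx, dy), rms
    return best_shift, best_rms
```

As the abstract notes, rough terrain helps: on a flat or linear surface many shifts give near-identical residuals, so the minimum is poorly constrained.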
NASA Astrophysics Data System (ADS)
Coleman, L. W.
1985-01-01
Progress in laser fusion research has increased the need for detail and precision in the diagnosis of experiments. This has spawned the development and use of sophisticated sub-nanosecond resolution diagnostic systems. These systems typically use ultrafast X-ray or optical streak cameras in combination with spatially imaging or spectrally dispersing elements. These instruments provide high resolution data essential for understanding the processes occurring in the interaction of high intensity laser light with targets. Several of these types of instruments and their capabilities will be discussed. The utilization of these kinds of diagnostics systems on the nearly completed 100 kJ Nova laser facility will be described.
The application of laser triangulation method on the blind guidance
NASA Astrophysics Data System (ADS)
Wu, Jih-Huah; Wang, Jinn-Der; Fang, Wei; Shan, Yi-Chia; Ma, Shih-Hsin; Kao, Hai-Ko; Jiang, Joe-Air; Lee, Yun-Parn
2011-08-01
A new apparatus for guiding the blind is proposed in this paper. The optical triangulation method is used to realize the system. The main components comprise a notebook computer, a camera, and two laser modules. One laser module emits a line beam on the vertical axis; the other emits a line beam on a tilted horizontal axis. The track of the line beam on the ground or on an object is captured by the camera, and the image is sent to the notebook computer for calculation. The system calculates the object width and the distance between the object and the blind user from the positions of the light lines in the image. In experiments, the distance between the test object and the user was measured with a standard deviation of less than 3% within the range of 60 to 150 cm, and the test object width with a standard deviation of less than 1% over the same range. To save power, the laser modules are switched on and off with a trigger pulse, and to reduce computational complexity, the two laser modules are switched on alternately. In addition, a band-pass filter is used to block all light except the specific laser wavelength, which increases the signal-to-noise ratio.
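In its simplest form, laser triangulation with a laser mounted parallel to the camera's optical axis reduces to a disparity relation. The baseline and focal length below are illustrative placeholders, not values from the paper, and the paper's tilted line geometry is more involved than this sketch:

```python
def triangulation_distance(pixel_offset, baseline=0.10, focal_px=800.0):
    """Distance to a laser spot for a laser offset `baseline` metres
    from, and parallel to, the camera axis: Z = f_px * b / d, where d
    is the spot's pixel displacement from its infinity position."""
    if pixel_offset <= 0:
        raise ValueError("offset must be positive (target at infinity?)")
    return focal_px * baseline / pixel_offset
```

The hyperbolic Z-versus-d relation explains why the quoted standard deviations are distance-dependent: a one-pixel error costs more range accuracy at 150 cm than at 60 cm.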
Kim, Young-Keun; Kim, Kyung-Soo
2014-10-01
Maritime transportation demands an accurate measurement system to track the motion of oscillating container boxes in real time. However, it is a challenge to design a sensor system that can provide both reliable and non-contact methods of 6-DOF motion measurements of a remote object for outdoor applications. In the paper, a sensor system based on two 2D laser scanners is proposed for detecting the relative 6-DOF motion of a crane load in real time. Even without implementing a camera, the proposed system can detect the motion of a remote object using four laser beam points. Because it is a laser-based sensor, the system is expected to be highly robust to sea weather conditions.
Method of laser beam coding for control systems
NASA Astrophysics Data System (ADS)
Pałys, Tomasz; Arciuch, Artur; Walczak, Andrzej; Murawski, Krzysztof
2017-08-01
The article presents a method of encoding a laser beam for control systems. The experiments were performed using a red laser source with a wavelength of λ = 650 nm and a power of P ≈ 3 mW. The aim of the study was to develop methods of modulation and demodulation of the laser beam. Results of research in which we determined the effect of selected camera parameters, such as image resolution and the number of frames per second, on the demodulation of the optical signal are also shown in the paper. The experiments showed that the adopted coding method provides sufficient information encoded in a single laser beam (36 codes, with a decoding effectiveness of 99.9%).
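The abstract does not describe the modulation scheme itself, so purely as an assumed illustration, a camera-based demodulator could threshold the per-frame brightness of the laser spot into bits and pack them into code words (6 bits gives 64 symbols, enough to cover the 36 codes reported):

```python
def demodulate(samples, threshold=0.5, bits_per_symbol=6):
    """Toy on-off-keying demodulator (an assumption, not the paper's
    scheme): threshold per-frame laser-spot brightness into bits, then
    pack complete groups of bits into integer code words, MSB first."""
    bits = [1 if s > threshold else 0 for s in samples]
    codes = []
    for i in range(0, len(bits) - bits_per_symbol + 1, bits_per_symbol):
        word = 0
        for b in bits[i:i + bits_per_symbol]:
            word = (word << 1) | b
        codes.append(word)
    return codes
```

The camera's frame rate caps the usable symbol rate, which is consistent with the paper's finding that frame rate and resolution drive demodulation performance.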
Coincidence electron/ion imaging with a fast frame camera
NASA Astrophysics Data System (ADS)
Li, Wen; Lee, Suk Kyoung; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander; Fan, Lin
2015-05-01
A new time- and position-sensitive particle detection system based on a fast frame CMOS camera is developed for coincidence electron/ion imaging. The system is composed of three major components: a conventional microchannel plate (MCP)/phosphor screen electron/ion imager, a fast frame CMOS camera, and a high-speed digitizer. The system collects the positional information of ions/electrons from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the MCPs processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of electron/ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide. We further show that a time resolution of 30 ps can be achieved when measuring the electron TOF spectrum, enabling the new system to achieve good energy resolution along the TOF axis.
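The centroiding step underlying both of these coincidence imaging systems can be sketched as follows: threshold the frame, group connected bright pixels into spots, and compute each spot's intensity-weighted centroid (and total intensity, which is what gets correlated with the TOF peak heights). The flood-fill grouping below is a simple stand-in for the papers' optimized real-time algorithms:

```python
import numpy as np

def centroid_spots(frame, threshold):
    """Return (y, x, total_intensity) for each 4-connected group of
    above-threshold pixels, with the intensity-weighted centroid."""
    mask = frame > threshold
    seen = np.zeros_like(mask)
    spots = []
    for y, x in zip(*np.nonzero(mask)):
        if seen[y, x]:
            continue
        stack, pix = [(y, x)], []
        seen[y, x] = True
        while stack:                      # iterative flood fill
            cy, cx = stack.pop()
            pix.append((cy, cx))
            for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                           (cy, cx + 1), (cy, cx - 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        w = np.array([frame[p] for p in pix], dtype=float)
        ys = np.array([p[0] for p in pix], dtype=float)
        xs = np.array([p[1] for p in pix], dtype=float)
        spots.append((float((ys * w).sum() / w.sum()),
                      float((xs * w).sum() / w.sum()),
                      float(w.sum())))
    return spots
```

Weighting by intensity pushes localization below one pixel, which is why camera-based centroiding can rival dedicated delay-line detectors for position resolution.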
Seam tracking with adaptive image capture for fine-tuning of a high power laser welding process
NASA Astrophysics Data System (ADS)
Lahdenoja, Olli; Säntti, Tero; Laiho, Mika; Paasio, Ari; Poikonen, Jonne K.
2015-02-01
This paper presents the development of methods for real-time fine-tuning of a high power laser welding process of thick steel by using a compact smart camera system. When performing welding in butt-joint configuration, the laser beam's location needs to be adjusted exactly according to the seam line in order to allow the injected energy to be absorbed uniformly into both steel sheets. In this paper, on-line extraction of seam parameters is targeted by taking advantage of a combination of dynamic image intensity compression, image segmentation with a focal-plane processor ASIC, and Hough transform on an associated FPGA. Additional filtering of Hough line candidates based on temporal windowing is further applied to reduce unrealistic frame-to-frame tracking variations. The proposed methods are implemented in Matlab by using image data captured with adaptive integration time. The simulations are performed in a hardware oriented way to allow real-time implementation of the algorithms on the smart camera system.
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F
2016-09-16
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder that estimates the motion of the MAV. This realization is expected to be more flexible across environments than laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV's navigation system. First, however, the relative pose between the two sensors is obtained with an improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.
Laser Technology in Interplanetary Exploration: The Past and the Future
NASA Technical Reports Server (NTRS)
Smith, David E.
2000-01-01
Laser technology has been used in planetary exploration for many years, but it has only been in the last decade that laser altimeters and ranging systems have been selected as flight instruments alongside cameras, spectrometers, magnetometers, etc. Today we have an active laser system operating at Mars and another destined for the asteroid Eros. A few years ago a laser ranging system on the Clementine mission changed much of our thinking about the moon, and in a few years laser altimeters will be on their way to Mercury and to Europa. Along with the increased capabilities and reliability of laser systems has come the realization that precision ranging to the surface of planetary bodies from orbiting spacecraft enables more scientific problems to be addressed, including many associated with planetary rotation, librations, and tides. In addition, new Earth-based laser ranging systems working with similar systems on other planetary bodies in an asynchronous transponder mode will be able to make interplanetary ranging measurements at the few-cm level and will advance our understanding of solar system dynamics and relativistic physics.
NASA Technical Reports Server (NTRS)
1979-01-01
Eastman Kodak Company, Rochester, New York is a broad-based firm which produces photographic apparatus and supplies, fibers, chemicals and vitamin concentrates. Much of the company's research and development effort is devoted to photographic science and imaging technology, including laser technology. Eastman Kodak is using a COSMIC computer program called LACOMA in the analysis of laser optical systems and camera design studies. The company reports that use of the program has provided development time savings and reduced computer service fees.
Automatic vision system for analysis of microscopic behavior of flow and transport in porous media
NASA Astrophysics Data System (ADS)
Rashidi, Mehdi; Dehmeshki, Jamshid; Dickenson, Eric; Daemi, M. Farhang
1997-10-01
This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid, laced with a fluorescent dye or fluorescent microspheres, flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination, a planar laser sheet passes through the column as a CCD camera records the illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For velocity measurements, while the fluid laced with fluorescent microspheres flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired automatically frame by frame and transferred to the computer for processing, using a frame grabber and purpose-written algorithms over an RS-232 interface. Since the grabbed images are of poor quality at this stage, preprocessing is applied to enhance the particles within the images. Finally, the enhanced particles are tracked to calculate velocity vectors in the plane of the beam. For concentration measurements, while the fluid laced with a fluorescent organic dye flows through the transparent medium, the CCD camera sweeps back and forth across the column and records concentration slices on the planes illuminated by the laser beam traveling simultaneously with the camera. These recorded images are then transferred to the computer and processed in a similar fashion to the velocity measurements. To make the vision system fully automatic, several detailed image processing techniques were developed to match images that have different intensity values but the same topological characteristics. This yields normalized interstitial chemical concentrations as a function of time within the porous column.
Sub-picosecond streak camera measurements at LLNL: From IR to x-rays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuba, J; Shepherd, R; Booth, R
An ultrafast, sub-picosecond resolution streak camera has recently been developed at LLNL. The camera is a versatile instrument with a wide operating wavelength range. A temporal resolution of up to 300 fs can be achieved, with routine operation at 500 fs. The streak camera has been operated over a wide wavelength range, from the IR to x-rays up to 2 keV. In this paper we briefly review the main design features that result in the unique properties of the streak camera and present several of its scientific applications: (1) streak camera characterization using a Michelson interferometer in the visible range, (2) a temporally resolved study of a transient x-ray laser at 14.7 nm, which enabled us to vary the x-ray laser pulse duration from ~2-6 ps by changing the pump laser parameters, and (3) an example of a time-resolved spectroscopy experiment with the streak camera.
Kozak, Igor; Luttrull, Jeffrey K.
2014-01-01
Medicinal lasers are a standard source of light to produce retinal tissue photocoagulation to treat retinovascular disease. The Diabetic Retinopathy Study and the Early Treatment Diabetic Retinopathy Study were large randomized clinical trials that showed a beneficial effect of retinal laser photocoagulation in diabetic retinopathy and dictated the standard of care for decades. However, current treatment protocols are undergoing modifications. Types of lasers used in the treatment of retinal diseases include argon, diode, dye, and multicolor lasers, micropulse lasers, and lasers for photodynamic therapy. Delivery systems include contact-lens slit-lamp laser delivery, indirect ophthalmoscope-based laser photocoagulation, and camera-based navigated retinal photocoagulation with retinal eye-tracking. Selective targeted photocoagulation could be a future alternative to panretinal photocoagulation. PMID:25892934
ICALEO '91 - Laser materials processing; Proceedings of the Meeting, San Jose, CA, Nov. 3-8, 1991
NASA Astrophysics Data System (ADS)
Metzbower, Edward A.; Beyer, Eckhard; Matsunawa, Akira
Consideration is given to new developments in LASERCAV technology, modeling of deep penetration laser welding, the theory of radiative transfer in the plasma of the keyhole in penetration laser welding, a synchronized laser-video camera system study of high power laser material interactions, laser process monitoring with dual wavelength optical sensors, new devices for on-line process diagnostics during laser machining, and the process development for a portable Nd:YAG laser materials processing system. Attention is also given to laser welding of alumina-reinforced 6061 aluminum alloy composite, the new trend of laser materials processing, optimization of the laser cutting process for thin section stainless steels, a new nozzle concept for cutting with high power lasers, rapid solidification effects during laser welding, laser surface modification of a low carbon steel with tungsten carbide and carbon, absorptivity of a polarized beam during laser hardening, and laser surface melting of 440 C tool steel. (No individual items are abstracted in this volume)
A vision-based system for fast and accurate laser scanning in robot-assisted phonomicrosurgery.
Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G
2015-02-01
Surgical quality in phonomicrosurgery can be improved by combining fast open-loop laser control (e.g., high-speed scanning capabilities) with a robust and accurate closed-loop visual servoing system. A new vision-based system for laser scanning control during robot-assisted phonomicrosurgery was developed and tested. Laser scanning was accomplished with a dual control strategy, which adds a vision-based trajectory correction phase to a fast open-loop laser controller. The system is designed to eliminate open-loop aiming errors caused by system calibration limitations and by the unpredictable topology of real targets. Evaluation of the new system was performed using CO2 laser cutting trials on artificial targets and ex-vivo tissue. The system produced accuracy values corresponding to pixel resolution even when smoke created by the laser-target interaction cluttered the camera view. In realistic test scenarios, trajectory-following RMS errors were reduced by almost 80% with respect to open-loop performance, reaching mean error values around 30 μm and maximum observed errors on the order of 60 μm. The new vision-based laser microsurgical control system was shown to be effective and promising, with significant potential to improve the safety and quality of laser microsurgeries.
3D Rainbow Particle Tracking Velocimetry
NASA Astrophysics Data System (ADS)
Aguirre-Pablo, Andres A.; Xiong, Jinhui; Idoughi, Ramzi; Aljedaani, Abdulrahman B.; Dun, Xiong; Fu, Qiang; Thoroddsen, Sigurdur T.; Heidrich, Wolfgang
2017-11-01
A single color camera is used to reconstruct a 3D-3C velocity flow field. The camera records the 2D (X,Y) position and the colored scattered-light intensity (Z) of white polyethylene tracer particles in a flow. The main advantage of using a color camera is the capability of combining different intensity levels in each color channel to obtain more depth levels. The illumination system consists of an LCD projector placed perpendicular to the camera. Gradients of colored intensity levels are projected onto the particles to encode the depth position (Z) of each particle, benefiting from the possibility of varying the color profiles and projection frequencies up to 60 Hz. Chromatic aberrations and distortions are estimated and corrected using a 3D laser-engraved calibration target. The camera-projector system characterization is presented considering the size and depth position of the particles. The use of these components dramatically reduces the cost and complexity of traditional 3D-PTV systems.
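The color-to-depth decoding can be illustrated with a minimal sketch. Assumptions: a simple per-channel ramp code and noise-free colors; the abstract does not specify the actual projected profiles, so this code is illustrative, not the authors' implementation.

```python
import numpy as np

def build_color_code(n_levels):
    """Hypothetical depth code: each depth level gets a distinct mix of
    per-channel intensity ramps (R rising, B falling, G triangular)."""
    z = np.linspace(0.0, 1.0, n_levels)
    r = z
    b = 1.0 - z
    g = 1.0 - np.abs(2.0 * z - 1.0)
    return np.stack([r, g, b], axis=1)  # shape (n_levels, 3)

def decode_depth(rgb, code):
    """Recover a particle's depth level from its scattered color by
    nearest-neighbour lookup in the projected code table."""
    d = np.linalg.norm(code - np.asarray(rgb, float), axis=1)
    return int(np.argmin(d))

code = build_color_code(64)
# a particle scattering the color projected at level 20 decodes back to 20
level = decode_depth(code[20], code)
```

In practice the decoded level would be matched against calibrated, aberration-corrected color profiles rather than an idealized code.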
Hydrometeor Size Distribution Measurements by Imaging the Attenuation of a Laser Spot
NASA Technical Reports Server (NTRS)
Lane, John
2013-01-01
The optical extinction of a laser due to scattering of particles is a well-known phenomenon. In a laboratory environment, this physical principle is known as the Beer-Lambert law, and is often used to measure the concentration of scattering particles in a fluid or gas. This method has been experimentally shown to be a usable means to measure the dust density from a rocket plume interaction with the lunar surface. Using the same principles and experimental arrangement, this technique can be applied to hydrometeor size distributions, and for launch-pad operations, specifically as a passive hail detection and measurement system. Calibration of a hail monitoring system is a difficult process. In the past, it has required comparison to another means of measuring hydrometeor size and density. Using a technique recently developed for estimating the density of surface dust dispersed during a rocket landing, measuring the extinction of a laser passing through hail (or dust in the rocket case) yields an estimate of the second moment of the particle cloud, and hydrometeor size distribution in the terrestrial meteorological case. With the exception of disdrometers, instruments that measure rain and hail fall make indirect measurements of the drop-size distribution. Instruments that scatter microwaves off of hydrometeors, such as the WSR-88D (Weather Surveillance Radar 88 Doppler), vertical wind profilers, and microwave disdrometers, measure the sixth moment of the drop size distribution (DSD). By projecting a laser onto a target, changes in brightness of the laser spot against the target background during rain and hail yield a measurement of the DSD's second moment by way of the Beer-Lambert law. In order to detect the laser attenuation within the 8-bit resolution of most camera image arrays, a minimum path length is required. 
For moderate to heavy rainfall, a laser path length of 100 m is sufficient to measure variations in optical extinction using a digital camera. For hail fall alone, the laser path may be shorter, because hailstones scatter more strongly than raindrops. A photodetector may replace the camera in automated installations. Laser-based rain and hail measurement systems are available, but they are based on measuring the interruption of a thin laser beam, thus counting individual hydrometeors; these systems are true disdrometers since they also measure size and velocity. The method reported here is simpler and requires far less processing, but it is not a disdrometer.
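As a rough illustration of the measurement principle (not the authors' implementation), the Beer-Lambert inversion and its link to the DSD second moment can be sketched, assuming an extinction efficiency Q_ext ≈ 2 for hydrometeors much larger than the optical wavelength:

```python
import math

def extinction_coefficient(I, I0, path_length_m):
    """Beer-Lambert law: I = I0 * exp(-beta * L)  ->  beta = ln(I0/I) / L."""
    return math.log(I0 / I) / path_length_m

def second_moment(beta, q_ext=2.0):
    """For particles much larger than the wavelength, Q_ext ~ 2 and
    beta = Q_ext * (pi/4) * M2, where M2 = integral N(D) D^2 dD is the
    second moment of the drop-size distribution."""
    return beta / (q_ext * math.pi / 4.0)

# example: 5% attenuation of the laser-spot brightness over a 100 m path
beta = extinction_coefficient(I=0.95, I0=1.0, path_length_m=100.0)
m2 = second_moment(beta)   # second moment in consistent units (m^2 per m^3)
```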
Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6
NASA Technical Reports Server (NTRS)
Lee, George
1993-01-01
A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.
Injector Mixing Efficiency Experiments
NASA Technical Reports Server (NTRS)
Moser, Marlow D.
1998-01-01
Various optical diagnostic techniques such as laser-induced fluorescence, Raman spectroscopy, laser Doppler velocimetry, and laser light scattering have been employed for a number of years to study the flowfield downstream of a single injector element in an optically accessible rocket chamber at Penn State. These techniques have been used with both liquid and gaseous oxygen at pressures up to 1000 psia, which is the limit of the facility. The purpose of the test programs at Penn State was to develop the techniques and to study the flow field from various injector designs. Extending these studies to higher pressure, and ultimately to multiple injectors, requires the capabilities of the Marshall Space Flight Center. These studies will extend the database available for the various injector designs to higher pressure and determine the interaction between multiple injectors. During this effort the Princeton Instruments ICCD camera was set up and checked out. The functionality of the system was thoroughly checked; the shutter compensation time was found to be not working, and the controller was returned to the manufacturer for warranty repair. The sensitivity was measured and found to be approximately 60 counts per photon at maximum gain, which agrees with the test data supplied by the manufacturer; the actual value depends on wavelength. The Princeton Instruments camera was installed in an explosion-proof tube for use with the rocket combustor. A 35 mm camera was also made ready for taking still photos inside the combustor. A fiber optic was used to transmit the laser light from an argon-ion laser to the rocket combustor for the light-scattering images. These images were obtained for a LOX-hydrogen swirl coax injector. Several still photos were also obtained with the 35 mm camera during these firings.
Innovative Camera and Image Processing System to Characterize Cryospheric Changes
NASA Astrophysics Data System (ADS)
Schenk, A.; Csatho, B. M.; Nagarajan, S.
2010-12-01
The polar regions play an important role in Earth's climatic and geodynamic systems. Digital photogrammetric mapping provides a means for monitoring the dramatic changes observed in the polar regions during the past decades. High-resolution, photogrammetrically processed digital aerial imagery provides complementary information to surface measurements obtained by laser altimetry systems. While laser points accurately sample the ice surface, stereo images allow for the mapping of features such as crevasses, flow bands, shear margins, moraines, leads, and different types of sea ice. Tracking features in repeat images produces a dense velocity vector field that can either serve as validation for interferometrically derived surface velocities or constitute a stand-alone product. A multi-modal photogrammetric platform consists of one or more high-resolution commercial color cameras, a GPS and inertial navigation system, and an optional laser scanner. Such a system, using a Canon EOS-1DS Mark II camera, was first flown on the IceBridge missions in Fall 2009 and Spring 2010, capturing hundreds of thousands of images at a frame rate of about one frame per second. While digital images and videos have been used for quite some time for visual inspection, precise 3D measurements with low-cost commercial cameras require special photogrammetric treatment that only became available recently. Calibrating the multi-camera imaging system and geo-referencing the images are absolute prerequisites for all subsequent applications. Commercial cameras are inherently non-metric, that is, their sensor model is only approximately known. Since these cameras are not as rugged as photogrammetric cameras, the interior orientation also changes due to temperature and pressure changes and aircraft vibration, resulting in large errors in 3D measurements. It is therefore necessary to calibrate the cameras frequently, at least whenever the system is newly installed.
Geo-referencing the images is performed by the Applanix navigation system. Our new method enables a 3D reconstruction of the ice sheet surface with high accuracy and unprecedented detail, as demonstrated by examples from the Antarctic Peninsula acquired by the IceBridge mission. Repeat digital imaging also provides data for determining surface elevation changes and velocities, which are critical parameters for ice sheet models. Although these methods work well, there are known problems with satellite images and traditional area-based matching, especially over rapidly changing outlet glaciers. To take full advantage of the high-resolution, repeat stereo imaging we have developed a new method. The processing starts with the generation of a DEM from geo-referenced stereo images of the first time epoch. The next step is concerned with extracting and matching interest points in object space. Since an interest point moves its spatial position between two time epochs, such points are only radiometrically conjugate, not geometrically. In fact, the geometric displacement of two identical points, together with the time difference, yields velocities. We computed the evolution of the velocity field and surface topography on the floating tongue of the Jakobshavn glacier from historical stereo aerial photographs to illustrate the approach.
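The displacement-to-velocity step described above amounts to a simple difference over the epoch interval. A minimal sketch, with hypothetical object-space coordinates (metres) and epoch spacing (years):

```python
import numpy as np

def surface_velocities(pts_t0, pts_t1, dt_years):
    """Interest points matched radiometrically between two epochs are
    geometrically displaced; displacement / time gives velocity vectors
    and speeds (m/yr)."""
    disp = np.asarray(pts_t1, float) - np.asarray(pts_t0, float)
    vec = disp / dt_years
    speed = np.linalg.norm(disp, axis=1) / dt_years
    return vec, speed

# two points on a glacier advancing 120 m and 50 m over half a year
p0 = [[0.0, 0.0], [1000.0, 500.0]]
p1 = [[120.0, 0.0], [1000.0, 550.0]]
vec, speed = surface_velocities(p0, p1, dt_years=0.5)
```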
Brute Force Matching Between Camera Shots and Synthetic Images from Point Clouds
NASA Astrophysics Data System (ADS)
Boerner, R.; Kröhnert, M.
2016-06-01
3D point clouds, acquired by state-of-the-art terrestrial laser scanning (TLS) techniques, provide spatial information with accuracies of up to several millimetres. Unfortunately, common TLS data carry no spectral information about the covered scene. However, matching TLS data with images is important for monoplotting purposes and point cloud colouration. Well-established methods solve this issue by matching close-range images with point cloud data, by mounting optical camera systems on top of laser scanners, or by using ground control points. The approach addressed in this paper aims at matching 2D image and 3D point cloud data from a freely moving camera within an environment covered by a large 3D point cloud, e.g. a 3D city model. The freedom of movement benefits augmented reality applications and real-time measurements. Therefore, a so-called real image, captured by a smartphone camera, has to be matched with a so-called synthetic image, which consists of 3D point cloud data reverse-projected to a synthetic projection centre whose exterior orientation parameters match those of the image, assuming an ideal distortion-free camera.
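A minimal sketch of generating such a synthetic image, assuming an ideal distortion-free pinhole camera with known exterior orientation (R, t) and focal length in pixels (parameter names are illustrative, not from the paper):

```python
import numpy as np

def synthetic_image(points_xyz, R, t, f, cx, cy, w, h):
    """Reverse-project a 3D point cloud into a synthetic pinhole camera.
    R, t: exterior orientation; f: focal length in pixels; (cx, cy):
    principal point; (w, h): image size. No lens distortion modelled."""
    img = np.zeros((h, w), dtype=np.uint8)
    cam = (np.asarray(points_xyz, float) - t) @ R.T   # world -> camera frame
    for X, Y, Z in cam:
        if Z <= 0:            # point behind the projection centre
            continue
        u = int(round(f * X / Z + cx))
        v = int(round(f * Y / Z + cy))
        if 0 <= u < w and 0 <= v < h:
            img[v, u] = 255   # mark the reverse-projected point
    return img

pts = [[0.0, 0.0, 10.0], [1.0, 0.0, 10.0]]
img = synthetic_image(pts, np.eye(3), np.zeros(3),
                      f=500.0, cx=320, cy=240, w=640, h=480)
```

A real implementation would also carry per-pixel intensity or depth from the cloud and handle occlusions (e.g. z-buffering); this sketch shows only the geometric projection.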
Handheld laser scanner automatic registration based on random coding
NASA Astrophysics Data System (ADS)
He, Lei; Yu, Chun-ping; Wang, Li
2011-06-01
Current research on laser scanners focuses mainly on static measurement; little use has been made of dynamic measurement, which is appropriate for more problems and situations. In particular, a traditional laser scanner must be kept stable in order to scan and to measure the coordinate transformation parameters between different stations. To make scanning measurement intelligent and rapid, this paper develops a new registration algorithm for a handheld laser scanner based on the positions of targets, which realizes dynamic measurement with a handheld laser scanner without additional complex work. The double camera on the laser scanner photographs artificial target points, designed with random coding, to obtain their three-dimensional coordinates. A set of matched points is then found among the control points to realize the orientation of the scanner by least-squares common-points transformation. After that, the double camera can directly measure the laser point cloud on the surface of the object and obtain the point cloud data in a unified coordinate system. There are three major contributions in this paper. First, a laser scanner based on binocular vision is designed with a double camera and one laser head, realizing real-time orientation of the laser scanner and improving efficiency. Second, a coding marker is introduced to solve the data matching, and a random coding method is proposed; compared with other coding methods, markers generated this way are simple to match and avoid shading the object. Finally, a recognition method for the coding marker based on distance recognition is proposed, which is more efficient. The method presented here can be used widely in measurements of objects from small to huge, such as vehicles and airplanes, strengthening intelligence and efficiency.
The results of experiments and theoretical analysis demonstrate that the proposed method realizes dynamic measurement with a handheld laser scanner, and that the method is reasonable and efficient.
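The least-squares common-points transformation mentioned above is commonly solved with an SVD-based (Kabsch-style) rigid fit; a sketch under that assumption (the paper's exact formulation may differ):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares common-points transformation: find rotation R and
    translation t minimizing ||R @ src_i + t - dst_i||^2 via SVD."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# control points and the same points seen from a rotated, shifted scanner
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
t_true = np.array([2.0, 3.0, 1.0])
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)   # recovers R_true, t_true
```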
NASA Astrophysics Data System (ADS)
Buschinelli, Pedro D. V.; Melo, João. Ricardo C.; Albertazzi, Armando; Santos, João. M. C.; Camerini, Claudio S.
2013-04-01
An axis-symmetrical optical laser triangulation system was developed by the authors to measure the inner geometry of long pipes used in the oil industry. It has a special optical configuration able to acquire shape information about the inner geometry of a pipe section from a single image frame. A collimated laser beam is pointed at the tip of a 45° conical mirror; the laser light is reflected in such a way that a radial light sheet is formed, intercepting the inner surface and forming a bright laser line on a section of the inspected pipe. A camera acquires the image of the laser line through a wide-angle lens. An odometer-based triggering system triggers the camera to acquire a set of equally spaced images at high speed while the device is moved along the pipe's axis. Image processing is done in real time (between image acquisitions) thanks to the use of parallel computing technology. The measured geometry is analyzed to identify corrosion damage, and the measured geometry and results are graphically presented using virtual reality techniques and devices such as 3D glasses and head-mounted displays. The paper describes the measurement principles, calibration strategies, and laboratory evaluation of the developed device, as well as a practical example of a corroded pipe used in an industrial gas production plant.
Laser designator protection filter for see-spot thermal imaging systems
NASA Astrophysics Data System (ADS)
Donval, Ariela; Fisher, Tali; Lipman, Ofir; Oron, Moshe
2012-06-01
In some cases the FLIR has an open window in the 1.06-micrometer wavelength range; this capability, called 'see spot', allows seeing a laser designator spot using the FLIR. A problem arises when the returned laser energy is too high for the camera sensitivity and can therefore damage the sensor. We propose a non-linear, solid-state dynamic filter that protects against damage passively: it blocks transmission only if the power exceeds a certain threshold, as opposed to spectral filters that block a certain wavelength permanently. In this paper we introduce the Wideband Laser Protection Filter (WPF) solution for thermal imaging systems possessing the ability to see the laser spot.
High-speed mid-infrared hyperspectral imaging using quantum cascade lasers
NASA Astrophysics Data System (ADS)
Kelley, David B.; Goyal, Anish K.; Zhu, Ninghui; Wood, Derek A.; Myers, Travis R.; Kotidis, Petros; Murphy, Cara; Georgan, Chelsea; Raz, Gil; Maulini, Richard; Müller, Antoine
2017-05-01
We report on a standoff chemical detection system using widely tunable external-cavity quantum cascade lasers (EC-QCLs) to illuminate target surfaces in the mid-infrared (λ = 7.4-10.5 μm). Hyperspectral images (hypercubes) are acquired by synchronously operating the EC-QCLs with a LN2-cooled HgCdTe camera. The use of rapidly tunable lasers and a high-frame-rate camera enables the capture of hypercubes with 128 x 128 pixels and >100 wavelengths in <0.1 s. Furthermore, raster scanning of the laser illumination allowed imaging of a 100-cm2 area at 5-m standoff. Raw hypercubes are post-processed to generate a hypercube that represents the surface reflectance relative to that of a diffuse reflectance standard. Results will be shown for liquids (e.g., silicone oil) and solid particles (e.g., caffeine, acetaminophen) on a variety of surfaces (e.g., aluminum, plastic, glass). Signature spectra are obtained for particulate loadings of RDX on glass of <1 μg/cm2.
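The reflectance normalization step can be sketched as a per-pixel, per-wavelength ratio against the diffuse standard. This is a generic formulation with an assumed dark-frame correction; the authors' exact processing chain is not detailed in the abstract:

```python
import numpy as np

def reflectance(raw, reference, dark):
    """Normalize a raw hypercube by a diffuse reflectance standard:
    R = (raw - dark) / (reference - dark), per pixel and per wavelength.
    The denominator is clipped to avoid division by zero."""
    num = np.asarray(raw, float) - dark
    den = np.clip(np.asarray(reference, float) - dark, 1e-9, None)
    return num / den

# toy (y, x, wavelength) hypercube: target returns half the standard's signal
raw = np.full((4, 4, 3), 80.0)
ref = np.full((4, 4, 3), 160.0)
dark = np.zeros((4, 4, 3))
R = reflectance(raw, ref, dark)   # 0.5 at every pixel and wavelength
```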
A global station coordinate solution based upon camera and laser data - GSFC 1973
NASA Technical Reports Server (NTRS)
Marsh, J. G.; Douglas, B. C.; Klosko, S. M.
1973-01-01
Results for the geocentric coordinates of 72 globally distributed satellite tracking stations, consisting of 58 cameras and 14 lasers, are presented. The observational data for this solution consist of over 65,000 optical observations and more than 350 laser passes recorded during the National Geodetic Satellite Program, the 1968 Centre National d'Etudes Spatiales/Smithsonian Astrophysical Observatory (SAO) Program, and the International Satellite Geodesy Experiment Program. Dynamic methods were used. The data were analyzed with the GSFC GEM and SAO 1969 Standard Earth gravity models. The recent value of GM = 398600.8 km^3/s^2 derived at the Jet Propulsion Laboratory (JPL) gave the best results for this combined laser/optical solution. Comparisons are made with the deep space solution of JPL (the LS-25 solution), including results obtained at GSFC from Mariner-9 Unified S-Band tracking. Datum transformation parameters relating North America, Europe, South America, and Australia are given, enabling the positions of some 200 other tracking stations to be placed in the geocentric system.
New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.
Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G
2012-01-01
This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia, and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: the camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.
Out of lab calibration of a rotating 2D scanner for 3D mapping
NASA Astrophysics Data System (ADS)
Koch, Rainer; Böttcher, Lena; Jahrsdörfer, Maximilian; Maier, Johannes; Trommer, Malte; May, Stefan; Nüchter, Andreas
2017-06-01
Mapping is an essential task in mobile robotics. To fulfil advanced navigation and manipulation tasks, a 3D representation of the environment is required. Applying stereo cameras or time-of-flight (TOF) cameras is one way to achieve this, but they suffer from drawbacks which make it difficult to map properly; therefore, costly 3D laser scanners are applied. An inexpensive alternative for building a 3D representation is to use a 2D laser scanner and rotate the scan plane around an additional axis. A 3D point cloud acquired with such a custom device consists of multiple 2D line scans, so the scanner pose of each line scan needs to be determined, along with the parameters resulting from a calibration, to generate a 3D point cloud. Using external sensor systems is a common method to determine these calibration parameters, but this is costly and difficult when the robot needs to be calibrated outside the lab. Thus, this work presents a calibration method for a rotating 2D laser scanner. It uses a hardware setup to identify the required calibration parameters; the setup is light, small, and easy to transport, so an out-of-lab calibration is possible. Additionally, a theoretical model was created to test the algorithm and analyse the impact of the scanner accuracy. The hardware components of the 3D scanner system are a HOKUYO UTM-30LX-EW 2D laser scanner, a Dynamixel servo motor, and a control unit. The calibration system consists of a hemisphere with a circular plate mounted in its interior. The algorithm is provided with a dataset of a single rotation from the laser scanner. To achieve a proper calibration result, the scanner needs to be located in the middle of the hemisphere. By means of geometric formulas, the algorithm determines the individual deviations of the placed laser scanner. In order to minimize errors, the algorithm solves the formulas in an iterative process.
First, the calibration algorithm was tested with an ideal hemisphere model created in Matlab. Second, the laser scanner was mounted in different ways, with the scanner position and the rotation axis modified; each deviation was compared with the algorithm's results. Several measurement settings were tested repeatedly with the 3D scanner system and the calibration system. The results show that the length accuracy of the laser scanner is most critical: it influences the required size of the hemisphere and the calibration accuracy.
Fluorescence endoscopy using fiber speckle illumination
NASA Astrophysics Data System (ADS)
Nakano, Shuhei; Katagiri, Takashi; Matsuura, Yuji
2018-02-01
An endoscopic fluorescence imaging system based on fiber speckle illumination is proposed. In this system, a multimode fiber for transmission of excitation laser light and collection of fluorescence is inserted into a conventional flexible endoscope. Since the excitation laser light has a random speckle structure, a fluorescence signal corresponding to the irradiation pattern can be detected if the sample contains fluorophores. The irradiation pattern can be captured by the endoscope camera when the excitation wavelength is within the sensitivity range of the camera. By performing multiple measurements while changing the irradiation pattern, a fluorescence image is reconstructed by solving a norm minimization problem. The principle of our method was experimentally demonstrated: a 2048-pixel image of quantum dots coated on frosted glass was successfully reconstructed from 32 measurements. We also confirmed that our method can be applied to biological tissues.
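The reconstruction step can be sketched as solving a linear system whose rows are the speckle illumination patterns. Here a minimum-L2-norm solution stands in for the paper's norm minimization (which may well use an L1/sparsity prior), with toy dimensions:

```python
import numpy as np

def reconstruct(patterns, signals):
    """Each measurement is the inner product of a speckle illumination
    pattern with the fluorophore distribution: y_i = <A_i, x>. With fewer
    measurements than pixels, take the minimum-norm solution of A x = y."""
    A = np.asarray(patterns, float).reshape(len(patterns), -1)
    x, *_ = np.linalg.lstsq(A, np.asarray(signals, float), rcond=None)
    return x

rng = np.random.default_rng(0)
n_pix, n_meas = 64, 48
x_true = np.zeros(n_pix)
x_true[[5, 20, 40]] = 1.0               # sparse fluorophore distribution
A = rng.random((n_meas, n_pix))          # random speckle patterns
y = A @ x_true                           # detected fluorescence signals
x_hat = reconstruct(A, y)                # consistent with all measurements
```

Because the system is underdetermined, exact recovery of a sparse image generally needs the sparsity-promoting solver; the L2 solution here merely reproduces the measurements.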
Skupsch, C; Chaves, H; Brücker, C
2011-08-01
The Cranz-Schardin camera utilizes a Q-switched Nd:YAG laser and four single-CCD cameras. The laser provides light pulses with an energy in the range of 25 mJ and a duration of about 5 ns. The laser light is converted to incoherent light by Rhodamine-B fluorescence dye in a cuvette; the beam's coherence is intentionally broken in order to avoid speckle. Four light fibers collect the fluorescence light and are used for illumination, and different fiber lengths introduce a delay of illumination between consecutive images. The chosen interframe time is 25 ns, corresponding to 40 × 10^6 frames per second. As an example, the camera is applied to observe the bow shock in front of a water jet propagating in air at supersonic speed; the initial phase of the formation of the jet structure is recorded.
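The fiber-length delays follow directly from the group velocity of light in the fiber; a short check, assuming a group index of about 1.46 (a typical value for silica fiber, not stated in the abstract):

```python
def extra_fiber_length(delay_s, n_group=1.46):
    """Light travels at c/n in the fiber, so delaying one illumination
    channel by dt requires dt * c / n of additional fiber."""
    c = 299_792_458.0  # speed of light in vacuum, m/s
    return delay_s * c / n_group

# consecutive frames delayed by 25 ns (40 million frames per second)
dl = extra_fiber_length(25e-9)   # roughly 5 m of extra fiber per frame
```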
Soft X-ray streak camera for laser fusion applications
NASA Astrophysics Data System (ADS)
Stradling, G. L.
1981-04-01
The development and significance of the soft x-ray streak camera (SXRSC) in the context of inertial confinement fusion energy development is reviewed as well as laser fusion and laser fusion diagnostics. The SXRSC design criteria, the requirement for a subkilovolt x-ray transmitting window, and the resulting camera design are explained. Theory and design of reflector-filter pair combinations for three subkilovolt channels centered at 220 eV, 460 eV, and 620 eV are also presented. Calibration experiments are explained and data showing a dynamic range of 1000 and a sweep speed of 134 psec/mm are presented. Sensitivity modifications to the soft x-ray streak camera for a high-power target shot are described. A preliminary investigation, using a stepped cathode, of the thickness dependence of the gold photocathode response is discussed. Data from a typical Argus laser gold-disk target experiment are shown.
Remote sensing technologies are a class of instrument and sensor systems that include laser imageries, imaging spectrometers, and visible to thermal infrared cameras. These systems have been successfully used for gas phase chemical compound identification in a variety of field e...
Precision of FLEET Velocimetry Using High-Speed CMOS Camera Systems
NASA Technical Reports Server (NTRS)
Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.
2015-01-01
Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; binning provided the greatest improvement for the un-intensified camera systems, which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 microseconds, precisions of 0.5 meters per second in air and 0.2 meters per second in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision HighSpeed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
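Row-wise digital binning and sub-pixel line location can be sketched as follows (a toy frame with a synthetic tagged line; the actual FLEET processing pipeline is more involved than this):

```python
import numpy as np

def bin_rows(frame, factor):
    """Row-wise digital binning: sum groups of `factor` adjacent rows in
    post-processing to raise signal-to-noise before locating the tagged
    line. Any leftover rows that do not fill a bin are dropped."""
    h, w = frame.shape
    return frame[: h - h % factor].reshape(-1, factor, w).sum(axis=1)

def centroid_column(row):
    """Sub-pixel centroid of the signal along one binned row; shot-to-shot
    displacement of this centroid over the inter-frame delay gives velocity."""
    cols = np.arange(row.size)
    return float((cols * row).sum() / row.sum())

frame = np.zeros((16, 32))
frame[:, 10] = 1.0          # synthetic vertical tagged line,
frame[:, 11] = 3.0          # centroid between columns 10 and 11
binned = bin_rows(frame, 8)  # 16 rows -> 2 binned rows
x = centroid_column(binned[0])
```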
Design of optical axis jitter control system for multi beam lasers based on FPGA
NASA Astrophysics Data System (ADS)
Ou, Long; Li, Guohui; Xie, Chuanlin; Zhou, Zhiqiang
2018-02-01
A design of an optical-axis closed-loop control system for multi-beam laser coherent combining, based on an FPGA, is introduced. The system uses piezoelectric-ceramic fast steering mirrors (FSMs) as actuators, a high-speed CMOS camera detecting the far-field spot of the multiple laser beams for optical sensing, and an FPGA-based control system for real-time optical-axis jitter suppression. The algorithms for optical-axis centroid detection and anti-windup (anti-integral-saturation) PID control were realized in the FPGA. The logic circuit structure was optimized through resource reuse and pipelining, reducing both logic resources and delay time, and the closed-loop bandwidth increased to 100 Hz. Laser jitter below 40 Hz was reduced by 40 dB. The system is low-cost and operates stably.
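The anti-windup PID idea can be illustrated with a scalar clamping sketch in software (illustrative only; the paper realizes this in FPGA fabric, and gain values here are arbitrary):

```python
def pid_step(err, state, kp, ki, kd, dt, out_min, out_max):
    """One anti-windup PID update: when the command would exceed the
    actuator (FSM) range, the output is clamped and the integrator
    contribution from this step is backed off so it does not wind up."""
    integ, prev_err = state
    integ += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integ + kd * deriv
    if u > out_max:
        integ -= err * dt      # undo this step's integration
        u = out_max
    elif u < out_min:
        integ -= err * dt
        u = out_min
    return u, (integ, err)

state = (0.0, 0.0)
u, state = pid_step(err=0.5, state=state, kp=2.0, ki=1.0, kd=0.0,
                    dt=0.001, out_min=-1.0, out_max=1.0)
# the raw command (1.0005) exceeds the limit, so u is clamped to 1.0
# and the integrator stays at 0.0 instead of accumulating
```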
Research on airborne infrared leakage detection of natural gas pipeline
NASA Astrophysics Data System (ADS)
Tan, Dongjie; Xu, Bin; Xu, Xu; Wang, Hongchao; Yu, Dongliang; Tian, Shengjie
2011-12-01
An airborne laser remote sensing technology is proposed to detect natural gas pipeline leakage from a helicopter carrying a detector that can sense traces of methane on the ground with high spatial resolution. The principle of the airborne laser remote sensing system is based on tunable diode laser absorption spectroscopy (TDLAS). The system consists of an optical unit containing the laser, camera, and helicopter mount, an electronic unit with a DGPS antenna, a notebook computer, and a pilot monitor, all mounted on a helicopter. The principle and architecture of the airborne laser remote sensing system are presented. Field test experiments were carried out on the West-East Natural Gas Pipeline of China, and the results show that the airborne detection method is suitable for detecting pipeline gas leaks over plains, deserts, and hills, but unfit for areas with large altitude variation.
Video-Camera-Based Position-Measuring System
NASA Technical Reports Server (NTRS)
Lane, John; Immer, Christopher; Brink, Jeffrey; Youngquist, Robert
2005-01-01
A prototype optoelectronic system measures the three-dimensional relative coordinates of objects of interest, or of targets affixed to objects of interest, in a workspace. The system includes a charge-coupled-device video camera mounted in a known position and orientation in the workspace, a frame grabber, and a personal computer running image-data-processing software. Relative to conventional optical surveying equipment, this system can be built and operated at much lower cost; however, it is less accurate. It is also much easier to operate than conventional instrumentation systems. In addition, there is no need to establish a coordinate system through cooperative action by a team of surveyors. The system operates in real time at around 30 frames per second (limited mostly by the frame rate of the camera). It continuously tracks targets as long as they remain in the field of the camera. In this respect, it emulates more expensive, elaborate laser tracking equipment that costs on the order of 100 times as much. Unlike laser tracking equipment, this system does not pose a hazard of laser exposure. Images acquired by the camera are digitized and processed to extract all valid targets in the field of view. The three-dimensional coordinates (x, y, and z) of each target are computed from the pixel coordinates of the targets in the images to an accuracy on the order of millimeters over distances on the order of meters. The system was originally intended specifically for real-time position measurement of payload transfers from payload canisters into the payload bay of the Space Shuttle Orbiters (see Figure 1). The system may be easily adapted to other applications that involve similar coordinate-measuring requirements. Examples of such applications include manufacturing, construction, preliminary approximate land surveying, and aerial surveying.
For some applications with rectangular symmetry, it is feasible and desirable to attach a target composed of black and white squares to an object of interest (see Figure 2). For other situations, where circular symmetry is more desirable, circular targets can also be created. Such a target can readily be generated and modified by use of commercially available software and printed by use of a standard office printer. All three relative coordinates (x, y, and z) of each target can be determined by processing the video image of the target. Because of the unique design of corresponding image-processing filters and targets, the vision-based position-measurement system is extremely robust and tolerant of widely varying fields of view, lighting conditions, and varying background imagery.
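The pixel-to-coordinate computation can be sketched with a pinhole-camera model. The intrinsics (principal point, focal length in pixels) and printed-target size below are hypothetical illustration values, not parameters of the NTRS system:

```python
def target_xyz(u, v, pixel_extent, cx=640.0, cy=512.0,
               f_px=1500.0, target_size_m=0.10):
    # Range from the known physical size of the printed target:
    z = f_px * target_size_m / pixel_extent
    # Lateral offsets from the pixel displacement off the principal point:
    x = (u - cx) * z / f_px
    y = (v - cy) * z / f_px
    return x, y, z
```

With these assumed intrinsics, a 0.10 m target spanning 150 pixels sits 1 m from the camera, and a 300-pixel horizontal offset maps to 0.2 m of lateral displacement.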
Non-contact finger vein acquisition system using NIR laser
NASA Astrophysics Data System (ADS)
Kim, Jiman; Kong, Hyoun-Joong; Park, Sangyun; Noh, SeungWoo; Lee, Seung-Rae; Kim, Taejeong; Kim, Hee Chan
2009-02-01
Authentication using finger vein patterns has substantial advantages over other biometrics. Because human vein patterns are hidden inside the skin and tissue, vein structure is hard to forge. However, conventional systems using an NIR LED array have two drawbacks. First, direct contact with the LED array raises sanitary problems. Second, because of the discreteness of the LEDs, illumination is non-uniform. We propose a non-contact finger vein acquisition system using an NIR laser and a laser line generator lens. The laser line generator lens spreads the focused laser light into an evenly distributed line, which is aimed along the finger longitudinally. An NIR camera was used for image acquisition. 200 index finger vein images from 20 candidates were collected, and the same finger vein pattern extraction algorithm was used to evaluate the two sets of images. Images acquired with the proposed non-contact system show none of the non-uniform illumination seen with the conventional system, and matching results are comparable. The developed non-contact finger vein acquisition system can prevent potential cross-contamination of skin diseases and produces uniformly illuminated images, while showing almost equivalent performance to the conventional system.
Optical Extinction Measurements of Dust Density in the GMRO Regolith Test Bin
NASA Technical Reports Server (NTRS)
Lane, J.; Mantovani, J.; Mueller, R.; Nugent, M.; Nick, A.; Schuler, J.; Townsend, I.
2016-01-01
A regolith simulant test bin was constructed and completed in the Granular Mechanics and Regolith Operations (GMRO) Lab in 2013. This Planetary Regolith Test Bed (PRTB) is a 64 sq m x 1 m deep test bin housed in a climate-controlled facility, containing 120 MT of lunar-regolith simulant, called Black Point-1 or BP-1, from Black Point, AZ. One of the current uses of the test bin is to study the effects of difficult lighting and dust conditions on telerobotic perception systems, to better assess and refine regolith operations for asteroid, Mars and polar lunar missions. Low illumination and low angle of incidence lighting pose significant problems to computer vision and human perception. Levitated dust on asteroids interferes with imaging and degrades depth perception. Dust storms on Mars pose a significant problem. Due to these factors, the likely performance of telerobotics is poorly understood for future missions. Current space telerobotic systems are only operated in bright lighting and dust-free conditions. This technology development testing will identify: (1) the impact of degraded lighting and environmental dust on computer vision and operator perception, (2) potential methods and procedures for mitigating these impacts, (3) requirements for telerobotic perception systems for asteroid capture, Mars dust storms and lunar regolith ISRU missions. In order to solve some of the telerobotic perception system problems, a plume erosion sensor (PES) was developed in the Lunar Regolith Simulant Bin (LRSB), containing 2 MT of JSC-1a lunar simulant. The PES is simply a laser and digital camera with a white target. Two modes of operation have been investigated: (1) single laser spot - the brightness of the spot depends on the optical extinction due to dust and is thus an indirect measure of particle number density, and (2) side-scatter - the camera images the laser from the side, showing beam entrance into the dust cloud and the boundary between dust and void.
Both methods must assume a mean particle size in order to extract a number density. The optical extinction measurement yields the product of the 2nd moment of the particle size distribution and the extinction efficiency Qe. For particle sizes in the range of interest (greater than 1 micrometer), Qe is approximately equal to 2. Scaling up of the PES single laser and camera system is underway in the PRTB, where an array of lasers penetrates a controlled dust cloud, illuminating multiple targets. Using high-speed HD GoPro video cameras, the evolution of the dust cloud and particle number density can be studied in detail.
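The single-spot mode reduces to a Beer-Lambert inversion: the measured intensity ratio gives the number density once a particle radius and Qe ≈ 2 are assumed. The values below are illustrative, not measured GMRO parameters:

```python
import math

def number_density(intensity_ratio, path_m, radius_m, qe=2.0):
    # Beer-Lambert: I/I0 = exp(-n * sigma * L), with extinction
    # cross-section sigma = Qe * pi * r^2 per particle.
    sigma = qe * math.pi * radius_m ** 2
    return -math.log(intensity_ratio) / (sigma * path_m)
```

For example, an attenuation of I/I0 = e^-1 over a 1 m path with assumed 5 um particle radius corresponds to one extinction length, i.e. n * sigma * L = 1.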
Process control of laser conduction welding by thermal imaging measurement with a color camera.
Bardin, Fabrice; Morgan, Stephen; Williams, Stewart; McBride, Roy; Moore, Andrew J; Jones, Julian D C; Hand, Duncan P
2005-11-10
Conduction welding offers an alternative to keyhole welding. Compared with keyhole welding, it is an intrinsically stable process because vaporization phenomena are minimal. However, as with keyhole welding, an on-line process-monitoring system is advantageous for quality assurance to maintain the required penetration depth, which in conduction welding is more sensitive to changes in heat sinking. The maximum penetration is obtained when the surface temperature is just below the boiling point, and so we normally wish to maintain the temperature at this level. We describe a two-color optical system that we have developed for real-time temperature profile measurement of the conduction weld pool. The key feature of the system is the use of a complementary metal-oxide semiconductor standard color camera leading to a simplified low-cost optical setup. We present and discuss the real-time temperature measurement and control performance of the system when a defocused beam from a high power Nd:YAG laser is used on 5 mm thick stainless steel workpieces.
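Two-color temperature measurement of this kind is commonly formulated as ratio pyrometry in the Wien limit of Planck's law; the sketch below shows that standard formulation, not the authors' actual color-camera calibration:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_ratio(l1, l2, temp_k):
    # Blackbody signal ratio I(l1)/I(l2) in the Wien approximation.
    return (l2 / l1) ** 5 * math.exp(-(C2 / temp_k) * (1.0 / l1 - 1.0 / l2))

def two_color_temperature(ratio, l1, l2):
    # Invert the Wien ratio for temperature (two-color pyrometry).
    return C2 * (1.0 / l1 - 1.0 / l2) / (5.0 * math.log(l2 / l1) - math.log(ratio))
```

The inversion is exact for the model: a ratio computed at any temperature round-trips back to that temperature, so in practice accuracy is set by the camera's spectral calibration rather than the algebra.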
Producing a Linear Laser System for 3D Modeling of Small Objects
NASA Astrophysics Data System (ADS)
Amini, A. Sh.; Mozaffar, M. H.
2012-07-01
Today, three-dimensional modeling of objects is considered in many applications, such as documentation of ancient heritage, quality control, reverse engineering and animation. In this regard, there are a variety of methods for producing three-dimensional models. In this paper, a 3D modeling system is developed based on the photogrammetry method, using image processing and laser line extraction from images. In this method, a laser line is projected onto the body of the object; by acquiring video images and extracting the laser line from the frames, the three-dimensional coordinates of the object can be obtained. First, the hardware, including the camera and laser systems, was designed and implemented. Afterwards, the system was calibrated. Finally, the software of the system was implemented for three-dimensional data extraction. The system was investigated for modeling a number of objects. The results showed that the system can provide benefits such as low cost, appropriate speed and acceptable accuracy in 3D modeling of objects.
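Laser-line triangulation of this kind typically recovers depth from the line's pixel displacement in the image. A minimal sketch with an assumed baseline and focal length (not the authors' calibrated values):

```python
def line_depth(px_offset, baseline_m=0.12, f_px=1400.0):
    # Triangulation: depth is inversely proportional to the laser
    # line's pixel shift from its reference (infinite-range) position.
    return baseline_m * f_px / px_offset
```

With these assumed values, a 168-pixel shift corresponds to a depth of 1 m; halving the shift doubles the depth, which is why depth resolution degrades with range in such systems.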
A simple apparatus for quick qualitative analysis of CR39 nuclear track detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gautier, D. C.; Kline, J. L.; Flippo, K. A.
2008-10-15
Quantifying the ion pits in Columbia Resin 39 (CR39) nuclear track detector from Thomson parabolas is a time-consuming and tedious process using conventional microscope-based techniques. A simple inventive apparatus for fast screening and qualitative analysis of CR39 detectors has been developed, enabling efficient selection of data for a more detailed analysis. The system consists simply of a green He-Ne laser and a high-resolution digital single-lens reflex camera. The laser illuminates the edge of the CR39 at grazing incidence and couples into the plastic, acting as a light pipe. Subsequently, the laser illuminates all ion tracks on the surface. A high-resolution digital camera is used to photograph the scattered light from the ion tracks, enabling one to quickly determine charge states and energies measured by the Thomson parabola.
Blind guidance system based on laser triangulation
NASA Astrophysics Data System (ADS)
Wu, Jih-Huah; Wang, Jinner-Der; Fang, Wei; Lee, Yun-Parn; Shan, Yi-Chia; Kao, Hai-Ko; Ma, Shih-Hsin; Jiang, Joe-Air
2012-05-01
We propose a new guidance system for the blind, based on an optical triangulation method. The main components of the proposed system comprise a notebook computer, a camera, and two laser modules. The track image of the light beam on the ground or on the object is captured by the camera, and the image is then sent to the notebook computer for further processing and analysis. Using a developed signal-processing algorithm, our system can determine the object width and the distance between the object and the blind person through the calculation of the light-line positions in the image. A series of feasibility tests of the developed blind guidance system were conducted. The experimental results show that the distance between the test object and the blind person can be measured with a standard deviation of less than 8.5% within the range of 40 to 130 cm, while the test object width can be measured with a standard deviation of less than 4.5% within the same range. The designed system shows considerable application potential for blind guidance.
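Once the distance to the object has been triangulated, the object width follows from the pixel extent of the light line by similar triangles. A sketch with a hypothetical focal length, not the system's calibration:

```python
def object_width(px_left, px_right, distance_m, f_px=1200.0):
    # Similar triangles: metric width = pixel extent * depth / focal length.
    return (px_right - px_left) * distance_m / f_px
```

For example, with the assumed 1200-pixel focal length, a line segment spanning 600 pixels on an object 1 m away implies a width of 0.5 m.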
Closed Loop Control and Turbulent Flows
2005-10-01
2.9 A schematic diagram of the PIV setup. The PIV controller synchronizes the firing of the lasers and camera. [...] is 16 ms; consequently, the frequency scaling factor, f*, is 62 Hz. A twin Nd:YAG 532-nm laser was used to illuminate the flow field, and 8-bit gray-scale images were captured using a 1360 by 1024-pixel resolution camera.
A measurement system applicable for landslide experiments in the field.
Guo, Wen-Zhao; Xu, Xiang-Zhou; Wang, Wen-Long; Yang, Ji-Shan; Liu, Ya-Kun; Xu, Fei-Long
2016-04-01
Observation of gravity erosion in the field under strong sunshine and wind poses a challenge. Here, a novel topography meter together with a movable tent addresses the challenge. With the topography meter, a 3D geometric shape of the target surface can be digitally reconstructed. Before the commencement of a test, the laser generator position and the camera sightline should be adjusted with a sight calibrator. Typically, the topography meter can measure the gravity erosion on slopes with a gradient of 30°-70°. Two methods can be used to obtain a relatively clear video despite the extreme steepness of the slopes. One method is to rotate the laser source away from the slope to ensure that the camera sightline remains perpendicular to the laser plane. Another way is to move the camera farther away from the slope, in which case the measured volume of the slope needs to be corrected; this method will reduce distortion of the image. In addition, installation of tent poles with concrete columns helps to surmount the altitude difference on steep slopes. Results observed by the topography meter in real landslide experiments are rational and reliable.
Small format digital photogrammetry for applications in the earth sciences
NASA Astrophysics Data System (ADS)
Rieke-Zapp, Dirk
2010-05-01
Photogrammetry is often considered one of the most precise and versatile surveying techniques. The same camera and analysis software can be used for measurements from sub-millimetre to kilometre scale. Such a measurement device is well suited for application by earth scientists working in the field. In this case a small toolset and a straightforward setup best fit the needs of the operator. While a digital camera is typically already part of the field equipment of an earth scientist, the main focus of the field work is often not surveying. A lack of photogrammetric training at the same time requires an easy-to-learn, straightforward surveying technique. A photogrammetric method was developed, aimed primarily at earth scientists, for taking accurate measurements in the field while minimizing the extra bulk and weight of the required equipment. The work included several challenges: (A) definition of an upright coordinate system without heavy and bulky tools like a total station or GNSS sensor; (B) optimization of image acquisition and geometric stability of the image block; (C) identification of a small camera suitable for precise measurements in the field; (D) optimization of the workflow from image acquisition to preparation of images for stereo measurements; (E) introduction of students and non-photogrammetrists to the workflow. Wooden spheres were used as target points in the field. They were more rugged than, and available in more sizes than, the ping-pong balls used in a previous setup. Distances between three spheres were introduced as scale information in a photogrammetric adjustment. The distances were measured with a laser distance meter accurate to 1 mm (1 sigma). The vertical angle between the spheres was measured with the same laser distance meter. The precision of this measurement was 0.3° (1 sigma), which is sufficient, i.e. better than inclination measurements with a geological compass.
The upright coordinate system is important for measuring the dip angle of geologic features in outcrop. The planimetric coordinate system would be arbitrary, but may easily be oriented to compass north by introducing a direction measurement from a compass. Wooden spheres and a Leica Disto D3 laser distance meter added less than 0.150 kg to the field equipment, considering that a suitable digital camera was already part of it. Identification of a small digital camera suitable for precise measurements was a major part of this work. A group of cameras were calibrated several times over different periods of time on a testfield. Further evaluation involved an accuracy assessment in the field, comparing distances between signalized points calculated from a photogrammetric setup with coordinates derived from a total station survey. The smallest camera in the test required calibration on the job, as the interior orientation changed significantly between testfield calibration and use in the field. We attribute this to the fact that the lens was retracted when the camera was switched off. Fairly stable camera geometry in a compact-size camera with a lens-retracting system was accomplished for the Sigma DP1 and DP2 cameras. While the pixel count of these cameras was less than for the Ricoh, the pixel pitch of the Sigma cameras was much larger. Hence, the same mechanical movement would have less per-pixel effect for the Sigma cameras than for the Ricoh camera. A large pixel pitch may therefore compensate for some camera instability, explaining why cameras with large sensors and larger pixel pitch typically yield better accuracy in object space. Both Sigma cameras weigh approximately 0.250 kg and may even be suitable for use with ultralight aerial vehicles (UAVs), which have payload restrictions of 0.200 to 0.300 kg.
A set of other cameras that were available were also tested on a calibration field and on location, showing once again that it is difficult to infer geometric stability from camera specifications. Image acquisition with geometrically stable cameras was fairly straightforward, covering the area of interest with stereo pairs for analysis. We limited our tests to setups with three to five images to minimize the amount of post-processing. The laser dot of the laser distance meter was not visible to the naked eye at distances beyond 5-7 m, which also limited the maximum stereo area that may be covered with this technique. Extrapolating the setup to fairly large areas showed no significant decrease in the accuracy accomplished in object space. Working with a Sigma SD14 SLR camera on a 6 m x 18 m x 20 m volume, the maximum length measurement error ranged between 20 and 30 mm, depending on image setup and analysis. For smaller outcrops, even the compact cameras yielded maximum length measurement errors in the mm range, which was considered sufficient for measurements in the earth sciences. In many cases the resolution per pixel, rather than accuracy, was the limiting factor of image analysis. A field manual was developed to guide novice users and students in this technique. The technique does not trade precision for ease of use; successful users of the presented method can therefore easily grow into more advanced photogrammetric methods for high-precision applications. Originally, camera calibration was not part of the methodology for novice operators. The recent introduction of Camera Calibrator, a low-cost, well-automated software package for camera calibration, allows beginners to calibrate their camera within a couple of minutes. The complete set of calibration parameters can be applied in ERDAS LPS software, easing the workflow. Image orientation was performed in LPS 9.2 software, which was also used for further image analysis.
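The laser-measured sphere-to-sphere distances enter the adjustment as scale constraints: they fix the scale of an otherwise arbitrary-unit model. A minimal sketch of applying one such distance (the workflow below is assumed for illustration, not taken from the paper):

```python
def scale_model(points, model_dist, measured_dist_m):
    # One laser-measured sphere-to-sphere distance fixes the scale
    # of an otherwise arbitrary-unit photogrammetric model: every
    # coordinate is multiplied by measured/model distance.
    s = measured_dist_m / model_dist
    return [(s * x, s * y, s * z) for x, y, z in points]
```

In a full bundle adjustment the three distances would instead be weighted observations, but the effect on the reconstructed coordinates is the same uniform scaling.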
Laser cutting of irregular shape object based on stereo vision laser galvanometric scanning system
NASA Astrophysics Data System (ADS)
Qi, Li; Zhang, Yixin; Wang, Shun; Tang, Zhiqiang; Yang, Huan; Zhang, Xuping
2015-05-01
Irregular shape objects with different 3-dimensional (3D) appearances are difficult to shape into a customized uniform pattern with current laser machining approaches. A laser galvanometric scanning system (LGS) could be a potential candidate, since it can easily achieve path-adjustable laser shaping. However, without knowing the actual 3D topography of the object, the processing result may still suffer from 3D shape distortion. It is desirable to have a versatile auxiliary tool that is capable of generating a 3D-adjusted laser processing path by measuring the 3D geometry of those irregular shape objects. This paper proposes the stereo vision laser galvanometric scanning system (SLGS), which takes advantage of both the stereo vision solution and the conventional LGS system. The 3D geometry of the object obtained by the stereo cameras is used to guide the scanning galvanometers for 3D-shape-adjusted laser processing. In order to achieve precise visual-servoed laser fabrication, these two independent components are integrated through a system calibration method using a plastic thin-film target. The flexibility of SLGS has been experimentally demonstrated by cutting duck feathers for badminton shuttle manufacture.
Initial Demonstration of 9-MHz Framing Camera Rates on the FAST UV Drive Laser Pulse Trains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lumpkin, A. H.; Edstrom Jr., D.; Ruan, J.
2016-10-09
We report the configuration of a Hamamatsu C5680 streak camera as a framing camera to record transverse spatial information of green-component laser micropulses at 3- and 9-MHz rates for the first time. The latter is near the time scale of the ~7.5-MHz revolution frequency of the Integrable Optics Test Accelerator (IOTA) ring and its expected synchrotron radiation source temporal structure. The 2-D images are recorded with a Gig-E readout CCD camera. We also report a first proof of principle with an OTR source using the linac streak camera in a semi-framing mode.
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F.
2016-01-01
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in absence of GNSS signals, this article presents a hybrid sensor. The hybrid sensor is a deep integration of a monocular camera and a 2D laser rangefinder so that the motion of the MAV is estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated in the MAV’s navigation system. However, first, the knowledge about the pose between both sensors is obtained by proposing an improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results. PMID:27649203
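The 3D-to-2D correspondences behind P3P rest on perspective projection of laser-rangefinder points into the camera image. A minimal sketch of that projection (rotation restricted to one axis for brevity, intrinsics hypothetical; a full solver estimates all six pose parameters):

```python
import math

def project(point_3d, yaw, t, f_px=800.0, cx=320.0, cy=240.0):
    # Rotate about the camera z-axis (one angle only, for brevity),
    # translate into the camera frame, then apply the pinhole model.
    x, y, z = point_3d
    c, s = math.cos(yaw), math.sin(yaw)
    xr, yr = c * x - s * y, s * x + c * y
    xc, yc, zc = xr + t[0], yr + t[1], z + t[2]
    return cx + f_px * xc / zc, cy + f_px * yc / zc
```

P3P then inverts this mapping: given three known 3D points and their measured pixels, it solves for the rotation and translation that reproduce the observations.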
ATTICA family of thermal cameras in submarine applications
NASA Astrophysics Data System (ADS)
Kuerbitz, Gunther; Fritze, Joerg; Hoefft, Jens-Rainer; Ruf, Berthold
2001-10-01
Optronics Mast Systems (US: Photonics Mast Systems) are electro-optical devices which enable a submarine crew to observe the scenery above water during a dive. Unlike classical submarine periscopes they are non-hull-penetrating and therefore have no direct viewing capability. Typically they have electro-optical cameras both for the visual and for an IR spectral band, with panoramic view and a stabilized line of sight. They can optionally be equipped with laser range-finders, antennas, etc. The brand name ATTICA (Advanced Two-dimensional Thermal Imager with CMOS-Array) characterizes a family of thermal cameras using focal-plane-array (FPA) detectors which can be tailored to a variety of requirements. The modular design of the ATTICA components allows the use of various detectors (InSb, CMT 3...5 μm, CMT 7...11 μm) for specific applications. By means of a microscanner, ATTICA cameras achieve full standard TV resolution using detectors with only 288 x 384 (US: 240 x 320) detector elements. A typical requirement for Optronics Mast Systems is a Quick-Look-Around capability. For FPA cameras this implies the need for a 'descan' module, which can be incorporated in the ATTICA cameras without complications.
Hult, Johan; Richter, Mattias; Nygren, Jenny; Aldén, Marcus; Hultqvist, Anders; Christensen, Magnus; Johansson, Bengt
2002-08-20
High-repetition-rate laser-induced fluorescence measurements of fuel and OH concentrations in internal combustion engines are demonstrated. Series of as many as eight fluorescence images, with a temporal resolution ranging from 10 μs to 1 ms, are acquired within one engine cycle. A multiple-laser system in combination with a multiple-CCD camera is used for cycle-resolved imaging in spark-ignition, direct-injection stratified-charge, and homogeneous-charge compression-ignition engines. The recorded data reveal unique information on cycle-to-cycle variations in fuel transport and combustion. Moreover, the imaging system in combination with a scanning mirror is used to perform instantaneous three-dimensional fuel-concentration measurements.
Structure Formation in Complex Plasma
2011-08-24
[Figure: structure formation observed in a glass Dewar bottle (upper figures) or in the vapor of liquid helium (lower figures); the labeled apparatus comprises a ring electrode, acrylic particles, a green laser, an RF plasma, a CCD camera, a prism mirror, and a glass tube in a liquid-He/liquid-N2 glass Dewar.]
Curiosity ChemCam Removes Dust
2013-04-08
This pair of images, taken a few minutes apart, shows how laser firing by NASA's Mars rover Curiosity removes dust from the surface of a rock. The images were taken by the remote micro-imager camera in the laser-firing Chemistry and Camera (ChemCam) instrument.
Personal medical information system using laser card
NASA Astrophysics Data System (ADS)
Cho, Seong H.; Kim, Keun Ho; Choi, Hyung-Sik; Park, Hyun Wook
1996-04-01
The well-known hospital information system (HIS) and the picture archiving and communication system (PACS) are typical applications of multimedia in the medical area. This paper proposes a personal medical information save-and-carry system using a laser card. The laser card is very useful, especially in emergency situations, because the medical information on the card can be read at any time and anywhere a laser card reader/writer is available. The contents of the laser card include the clinical history of a patient, such as clinical charts, exam results, diagnostic reports, images, and so on. The purpose of this system is not primary diagnosis but emergency reference to the clinical history of the patient. The personal medical information system consists of a personal computer integrated with a laser card reader/writer, a color frame grabber, a color CCD camera and, optionally, a high-resolution image scanner. A window-based graphical user interface was designed for easy use. The laser card has relatively sufficient capacity to store the personal medical information, fast access speed for storing and loading the data, and a portable size as compact as a credit card. Database items of the laser card provide doctors with medical data such as laser card information, patient information, clinical information, and diagnostic result information.
High-speed measurements of steel-plate deformations during laser surface processing.
Jezersek, Matija; Gruden, Valter; Mozina, Janez
2004-10-04
In this paper we present a novel approach to monitoring the deformations of a steel plate's surface during various types of laser processing, e.g., engraving, marking, cutting, bending, and welding. The measuring system is based on a laser triangulation principle, where the laser projector generates multiple lines simultaneously. This enables us to measure the shape of the surface with a high sampling rate (80 Hz with our camera) and high accuracy (±7 μm). The measurements of steel-plate deformations for plates of different thickness and with different illumination patterns are presented graphically and in an animation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, K.K.
A Mather-type dense plasma focus (MDPF) system was designed, built, and tested specifically to study its luminescent characteristics and to assess its potential as a new light source for high-energy, short-wavelength lasers. The luminescence study of the MDPF showed that the conversion efficiency from electrical input to optical output energy is at least 50%, up to the time the plasma compression is complete. Using the system for the first time as an optical pump, laser activity was successfully obtained from a variety of liquid organic dyes. Diagnostic capabilities included an optical multichannel analyzer system complete with computer control, a nitrogen-pumped tunable dye-laser system, a high-speed streak/framing camera, a digital laser energy meter, voltage and current probes, and a computer-based data-acquisition system.
Design of microcontroller based system for automation of streak camera.
Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P
2010-08-01
A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomenon. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LABVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.
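The dependence of the streak speed on the ramp slope can be illustrated as follows; the deflection sensitivity and slope values are hypothetical, not specifications of the S-20 camera:

```python
def sweep_time_per_mm(ramp_slope_v_per_ns, deflection_mm_per_v):
    # Spot velocity on the screen = deflection sensitivity x ramp slope;
    # the time mapped onto 1 mm of screen is its reciprocal, so a
    # steeper ramp (smaller L, C in the integrator) gives a faster sweep
    # and finer time resolution.
    return 1.0 / (deflection_mm_per_v * ramp_slope_v_per_ns)
```

With an assumed 0.02 mm/V plate sensitivity, a 10 V/ns ramp maps 5 ns of signal onto each millimetre of the output image.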
Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid
2016-06-13
Proposed and experimentally demonstrated is the CAOS-CMOS camera design, which combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light-staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high-dynamic-range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
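The quoted dynamic-range figures follow the usual optical 20·log10 convention relating the brightest and dimmest resolvable signals:

```python
import math

def dynamic_range_db(i_max, i_min):
    # Optical dynamic range in dB, 20*log10 convention: each factor
    # of 10 in intensity ratio adds 20 dB.
    return 20.0 * math.log10(i_max / i_min)
```

Under this convention, 82.06 dB corresponds to an intensity ratio of about 10^4.1, versus roughly 370:1 for the CMOS sensor's rated 51.3 dB.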
NASA Astrophysics Data System (ADS)
West, Patricia; Baker, Lionel R.
1989-03-01
This paper is a review of the applications of laser scanning in inspection. The reasons for the choice of a laser in flying-spot scanning and the optical properties of a laser beam which are of value in a scanning instrument will be given. The many methods of scanning laser beams in both one and two dimensions will be described. The use of one-dimensional laser scanners for automatic surface inspection of transmissive and reflective products will be covered in detail, with particular emphasis on light collection techniques. On-line inspection applications which will be mentioned include: photographic film web, metal strip products, paper web, glass sheet, car body paint surfaces and internal cylinder bores. Two-dimensional laser scanning is employed in applications where increased resolution, increased depth of focus, and better contrast are required compared with conventional vidicon TV or solid-state array cameras. Examples such as special microscope laser scanning systems and a TV-compatible system for use in restricted areas of a nuclear reactor will be described. The technical and economic benefits and limitations of laser scanning video systems will be compared with conventional TV and CCD array devices.
Portable Airborne Laser System Measures Forest-Canopy Height
NASA Technical Reports Server (NTRS)
Nelson, Ross
2005-01-01
(PALS) is a combination of laser ranging, video imaging, positioning, and data-processing subsystems designed for measuring the heights of forest canopies along linear transects from tens to thousands of kilometers long. Unlike prior laser ranging systems designed to serve the same purpose, the PALS is not restricted to use aboard a single aircraft of a specific type: the PALS fits into two large suitcases that can be carried to any convenient location, and the PALS can be installed in almost any local aircraft for hire, thereby making it possible to sample remote forests at relatively low cost. The initial cost and the cost of repairing the PALS are also lower because the PALS hardware consists mostly of commercial off-the-shelf (COTS) units that can easily be replaced in the field. The COTS units include a laser ranging transceiver, a charge-coupled-device camera that images the laser-illuminated targets, a differential Global Positioning System (dGPS) receiver capable of operation within the Wide Area Augmentation System, a video titler, a video cassette recorder (VCR), and a laptop computer equipped with two serial ports. The VCR and computer are powered by batteries; the other units are powered at 12 VDC from the 28-VDC aircraft power system via a low-pass filter and a voltage converter. The dGPS receiver feeds location and time data, at an update rate of 0.5 Hz, to the video titler and the computer. The laser ranging transceiver, operating at a sampling rate of 2 kHz, feeds its serial range and amplitude data stream to the computer. The analog video signal from the CCD camera is fed into the video titler wherein the signal is annotated with position and time information. The titler then forwards the annotated signal to the VCR for recording on 8-mm tapes. 
The dGPS and laser range and amplitude serial data streams are processed by software that displays the laser trace and the dGPS information as they are fed into the computer, subsamples the laser range and amplitude data, interleaves the subsampled data with the dGPS information, and records the resulting interleaved data stream.
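The interleaving step described above amounts to a time-ordered merge of two time-stamped streams (2 kHz laser records, 0.5 Hz dGPS records). A minimal sketch, assuming simple `(timestamp, payload)` records; the record shapes and function name are illustrative, not taken from the PALS software:

```python
def interleave(laser, gps):
    # Merge two time-sorted streams of (timestamp, payload) tuples into one
    # list ordered by timestamp, tagging each record with its source.
    merged = []
    i = j = 0
    while i < len(laser) and j < len(gps):
        if laser[i][0] <= gps[j][0]:
            merged.append(('laser', *laser[i]))
            i += 1
        else:
            merged.append(('gps', *gps[j]))
            j += 1
    merged.extend(('laser', *rec) for rec in laser[i:])
    merged.extend(('gps', *rec) for rec in gps[j:])
    return merged
```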
NASA Astrophysics Data System (ADS)
Gelderblom, Erik C.; Vos, Hendrik J.; Mastik, Frits; Faez, Telli; Luan, Ying; Kokhuis, Tom J. A.; van der Steen, Antonius F. W.; Lohse, Detlef; de Jong, Nico; Versluis, Michel
2012-10-01
The Brandaris 128 ultra-high-speed imaging facility has been updated over the last 10 years through modifications made to the camera's hardware and software. At its introduction the camera was able to record 6 sequences of 128 images (500 × 292 pixels) at a maximum frame rate of 25 Mfps. The segmented mode of the camera was revised to allow for subdivision of the 128 image sensors into arbitrary segments (1-128) with an inter-segment time of 17 μs. Furthermore, a region of interest can be selected to increase the number of recordings within a single run of the camera from 6 up to 125. By extending the imaging system with a laser-induced fluorescence setup, time-resolved ultra-high-speed fluorescence imaging of microscopic objects has been enabled. Minor updates to the system are also reported here.
The SALSA Project - High-End Aerial 3d Camera
NASA Astrophysics Data System (ADS)
Rüther-Kindel, W.; Brauchle, J.
2013-08-01
The ATISS measurement drone, developed at the University of Applied Sciences Wildau, is an electrically powered motor glider with a maximum take-off weight of 25 kg, including a payload capacity of 10 kg. Two 2.5 kW engines enable ultra-short take-off procedures, and the motor glider design results in a 1 h endurance. The concept of ATISS is based on the idea of strictly separating aircraft and payload functions, which makes ATISS a very flexible research platform for miscellaneous payloads. ATISS is equipped with an autopilot for autonomous flight patterns but remains under permanent pilot control from the ground. On the basis of ATISS, the project SALSA was undertaken. The aim was to integrate a system for digital terrain modelling. Instead of a laser scanner, a new design concept was chosen, based on two synchronized high-resolution digital cameras, one in a fixed nadir orientation and the other in an oblique orientation. Thus, images of every object on the ground are taken from different view angles. This new measurement camera system, MACS-TumbleCam, was developed at the German Aerospace Center DLR Berlin-Adlershof especially for the ATISS payload concept. A special advantage in comparison to laser scanning is that a surface including texture is generated instead of a cloud of points, and a high-end inertial orientation system can be omitted. The first test flights show a ground resolution of 2 cm and a height resolution of 3 cm, which underline the extraordinary capabilities of ATISS and the MACS measurement camera system.
Correction for spatial averaging in laser speckle contrast analysis
Thompson, Oliver; Andrews, Michael; Hirst, Evan
2011-01-01
Practical laser speckle contrast analysis systems face a problem of spatial averaging of speckles, due to the pixel size in the cameras used. Existing practice is to use a system factor in speckle contrast analysis to account for spatial averaging. The linearity of the system factor correction has not previously been confirmed. The problem of spatial averaging is illustrated using computer simulation of time-integrated dynamic speckle, and the linearity of the correction confirmed using both computer simulation and experimental results. The valid linear correction allows various useful compromises in the system design. PMID:21483623
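Speckle contrast is conventionally defined as the ratio of the standard deviation to the mean intensity over a local pixel window, and the system-factor correction discussed above divides the measured contrast by a constant. A minimal sketch; the linear correction form matches the abstract's claim, but the function names are illustrative:

```python
from statistics import mean, pstdev

def speckle_contrast(window):
    # Local speckle contrast K = sigma / <I> over a pixel window.
    return pstdev(window) / mean(window)

def corrected_contrast(k_measured, system_factor):
    # Linear system-factor correction for spatial averaging:
    # K_true ~ K_measured / beta, with beta determined by calibration.
    return k_measured / system_factor
```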
Study of Cryogenic Complex Plasma
2007-04-26
enabled us to detect the formation of the Coulomb crystals as shown in Fig. 2. (Fig. 2 labels the apparatus: glass Dewar with liquid He and liquid N2, glass tube, ring electrode, acrylic particles, helium gas, green laser, RF plasma, prism mirror, and CCD camera.)
Optical sensing in laser machining
NASA Astrophysics Data System (ADS)
Smurov, Igor; Doubenskaia, Maria
2009-05-01
Optical monitoring of temperature evolution and temperature distribution in laser machining provides important information to optimise and control the technological process under study. The multi-wavelength pyrometer is used to measure brightness temperature under the pulsed action of an Nd:YAG laser on stainless steel substrates. Specially developed "notch" filters (10^-6 transparency at the 1.06 μm wavelength) are applied to avoid the influence of laser radiation on temperature measurements. The true temperature is restored based on the method of multi-colour pyrometry. Temperature monitoring of thin-walled gilded kovar boxes is applied to detect deviation of the welding seam from its optimum position. The pyrometers are used to control CO2-laser welding of steel and Ti plates: misalignment of the welded plates, variation of the welding geometry, internal defects, deviation of the laser beam trajectory from the junction, etc. The temperature profiles along and across the welding axis are measured by the 2D pyrometer. When using multi-component powder blends in laser cladding, for example a metal matrix composite with ceramic reinforcement, one needs to control the temperature of the melt to avoid thermal decomposition of certain compounds (such as WC) and to assure melting of the base metal (such as Co). The infra-red camera FLIR Phoenix RDAS provides detailed information on the distribution of brightness temperature in the laser cladding zone. A CCD-camera-based diagnostic system is used to measure in-flight particle velocity and size distribution.
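The restoration of true temperature by multi-colour pyrometry can be illustrated in its simplest two-colour form: under the Wien approximation, and assuming equal emissivity at both wavelengths, the ratio of intensities at two wavelengths fixes the temperature. This is a textbook formula, not the instrument's exact algorithm:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def two_color_temperature(i1, i2, lam1, lam2):
    # Ratio pyrometry under the Wien approximation with equal emissivity:
    # R = i1/i2 = (lam2/lam1)^5 * exp(-(C2/T) * (1/lam1 - 1/lam2)),
    # solved for T. Wavelengths in metres, result in kelvin.
    ratio = i1 / i2
    return C2 * (1.0 / lam1 - 1.0 / lam2) / math.log((lam2 / lam1) ** 5 / ratio)
```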
Low-cost laser speckle contrast imaging of blood flow using a webcam.
Richards, Lisa M.; Kazmi, S. M. Shams; Davis, Janel L.; Olin, Katherine E.; Dunn, Andrew K.
2013-01-01
Laser speckle contrast imaging has become a widely used tool for dynamic imaging of blood flow, both in animal models and in the clinic. Typically, laser speckle contrast imaging is performed using scientific-grade instrumentation. However, due to recent advances in camera technology, these expensive components may not be necessary to produce accurate images. In this paper, we demonstrate that a consumer-grade webcam can be used to visualize changes in flow, both in a microfluidic flow phantom and in vivo in a mouse model. A two-camera setup was used to simultaneously image with a high performance monochrome CCD camera and the webcam for direct comparison. The webcam was also tested with inexpensive aspheric lenses and a laser pointer for a complete low-cost, compact setup ($90, 5.6 cm length, 25 g). The CCD and webcam showed excellent agreement with the two-camera setup, and the inexpensive setup was used to image dynamic blood flow changes before and after a targeted cerebral occlusion. PMID:24156082
Fluorescent image tracking velocimeter
Shaffer, Franklin D.
1994-01-01
A multiple-exposure fluorescent image tracking velocimeter (FITV) detects and measures the motion (trajectory, direction and velocity) of small particles close to light scattering surfaces. The small particles may follow the motion of a carrier medium such as a liquid, gas or multi-phase mixture, allowing the motion of the carrier medium to be observed, measured and recorded. The main components of the FITV include: (1) fluorescent particles; (2) a pulsed fluorescent excitation laser source; (3) an imaging camera; and (4) an image analyzer. FITV uses fluorescing particles excited by visible laser light to enhance particle image detectability near light scattering surfaces. The excitation laser light is filtered out before reaching the imaging camera allowing the fluoresced wavelengths emitted by the particles to be detected and recorded by the camera. FITV employs multiple exposures of a single camera image by pulsing the excitation laser light for producing a series of images of each particle along its trajectory. The time-lapsed image may be used to determine trajectory and velocity and the exposures may be coded to derive directional information.
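The velocity estimate in the final sentence reduces to the path length between successive particle images divided by the elapsed pulse time. A minimal sketch, with illustrative names and units (positions in any length unit, interval in seconds):

```python
import math

def particle_velocity(positions, pulse_interval):
    # Mean particle speed from successive (x, y) image positions recorded
    # at a fixed laser pulse interval: total path length / total time.
    total_path = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        total_path += math.hypot(x1 - x0, y1 - y0)
    return total_path / (pulse_interval * (len(positions) - 1))
```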
NASA Astrophysics Data System (ADS)
Lynam, Jeff R.
2001-09-01
A more highly integrated, electro-optical sensor suite using Laser Illuminated Viewing and Ranging (LIVAR) techniques is being developed under the Army Advanced Concept Technology-II (ACT-II) program for enhanced manportable target surveillance and identification. The ManPortable LIVAR system currently in development employs a wide array of sensor technologies that provide the foot-bound soldier and UGV significant advantages and capabilities in lightweight, fieldable target location, ranging and imaging systems. The unit incorporates a wide field-of-view, 5° × 3°, uncooled LWIR passive sensor for primary target location. Laser range finding and active illumination are done with a triggered, flash-lamp-pumped, eyesafe micro-laser operating in the 1.5 micron region, used in conjunction with a range-gated, electron-bombarded CCD digital camera to image the target objective in a narrower, 0.3°, field of view. Target range is acquired using the integrated LRF, and a target position is calculated using data from other onboard devices providing GPS coordinates, tilt, bank and corrected magnetic azimuth. Range-gate timing and coordinated receiver optics focus control allow target imaging operations to be optimized. The onboard control electronics provide power-efficient system operation for extended field use from the internal, rechargeable battery packs. Image data storage, transmission, and processing capabilities are also being incorporated to provide the best all-around support for the electronic battlefield in this type of system. The paper will describe flash laser illumination technology, EBCCD camera technology with the flash laser detection system, and image resolution improvement through frame averaging.
NASA Astrophysics Data System (ADS)
Roberts, Randy S.; Bliss, Erlan S.; Rushford, Michael C.; Halpin, John M.; Awwal, Abdul A. S.; Leach, Richard R.
2014-09-01
The Advanced Radiographic Capability (ARC) at the National Ignition Facility (NIF) is a laser system designed to produce a sequence of short pulses used to backlight imploding fuel capsules. Laser pulses from a short-pulse oscillator are dispersed in wavelength into long, low-power pulses, injected into the NIF main laser for amplification, and then compressed into high-power pulses before being directed into the NIF target chamber. In the target chamber, the laser pulses hit targets which produce the x-rays used to backlight imploding fuel capsules. Compression of the ARC laser pulses is accomplished with a set of precision-surveyed optical gratings mounted inside vacuum vessels. The tilt of each grating is monitored by a measurement system consisting of a laser diode, camera and crosshair, all mounted in a pedestal outside the vacuum vessel, and a mirror mounted on the back of a grating inside the vacuum vessel. The crosshair is mounted in front of the camera, and a diffraction pattern is formed when it is illuminated with the laser diode beam reflected from the mirror. This diffraction pattern contains information related to relative movements between the grating and the pedestal. Image analysis algorithms have been developed to determine these relative movements. In the paper we elaborate on features in the diffraction pattern and describe the image analysis algorithms used to monitor grating tilt changes. Experimental results are provided which indicate the high degree of sensitivity achieved by the tilt sensor and image analysis algorithms.
Registration of Vehicle-Borne Point Clouds and Panoramic Images Based on Sensor Constellations.
Yao, Lianbi; Wu, Hangbin; Li, Yayun; Meng, Bin; Qian, Jinfei; Liu, Chun; Fan, Hongchao
2017-04-11
A mobile mapping system (MMS) is usually utilized to collect environmental data on and around urban roads. Laser scanners and panoramic cameras are the main sensors of an MMS. This paper presents a new method for the registration of the point clouds and panoramic images based on sensor constellation. After the sensor constellation was analyzed, a feature point, the intersection of the connecting line between the global positioning system (GPS) antenna and the panoramic camera with a horizontal plane, was utilized to separate the point clouds into blocks. The blocks for the central and sideward laser scanners were extracted with the segmentation feature points. Then, the point clouds located in the blocks were separated from the original point clouds. Each point in the blocks was used to find the accurate corresponding pixel in the relative panoramic images via a collinear function, and the position and orientation relationship amongst different sensors. A search strategy is proposed for the correspondence of laser scanners and lenses of panoramic cameras to reduce calculation complexity and improve efficiency. Four cases of different urban road types were selected to verify the efficiency and accuracy of the proposed method. Results indicate that most of the point clouds (with an average of 99.7%) were successfully registered with the panoramic images with great efficiency. Geometric evaluation results indicate that horizontal accuracy was approximately 0.10-0.20 m, and vertical accuracy was approximately 0.01-0.02 m for all cases. Finally, the main factors that affect registration accuracy, including time synchronization amongst different sensors, system positioning and vehicle speed, are discussed.
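The per-point correspondence step maps each laser point into the panoramic image; for an idealized equirectangular panorama, the collinear mapping reduces to two arctangents. A sketch under that idealized spherical model only; a real MMS additionally applies the calibrated lever arms and boresight rotations between the scanner, GPS antenna and camera:

```python
import math

def point_to_pano_pixel(x, y, z, width, height):
    # Project a 3D point (already expressed in the panoramic camera frame)
    # to an equirectangular pixel: longitude -> column, latitude -> row.
    lon = math.atan2(y, x)                  # [-pi, pi]
    lat = math.atan2(z, math.hypot(x, y))   # [-pi/2, pi/2]
    u = (lon + math.pi) / (2.0 * math.pi) * width
    v = (math.pi / 2.0 - lat) / math.pi * height
    return u, v
```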
Split ring resonator based THz-driven electron streak camera featuring femtosecond resolution
Fabiańska, Justyna; Kassier, Günther; Feurer, Thomas
2014-01-01
Through combined three-dimensional electromagnetic and particle tracking simulations we demonstrate a THz driven electron streak camera featuring a temporal resolution on the order of a femtosecond. The ultrafast streaking field is generated in a resonant THz sub-wavelength antenna which is illuminated by an intense single-cycle THz pulse. Since electron bunches and THz pulses are generated with parts of the same laser system, synchronization between the two is inherently guaranteed. PMID:25010060
Zhang, Zhuang; Zhao, Rujin; Liu, Enhai; Yan, Kun; Ma, Yuebo
2018-06-15
This article presents a new sensor fusion method for visual simultaneous localization and mapping (SLAM) through integration of a monocular camera and a 1D laser range finder. Such a fusion method provides scale estimation and drift correction; it is not limited by physical volume (e.g., a stereo camera is constrained by its baseline) and it overcomes the limited depth range problem associated with SLAM for RGBD cameras. We first present the analytical feasibility of estimating the absolute scale through the fusion of 1D distance information and image information. Next, the analytical derivation of the laser-vision fusion is described in detail, based on local dense reconstruction of the image sequences. We also correct the scale drift of the monocular SLAM using the laser distance information, which is independent of the drift error. Finally, application of this approach to both indoor and outdoor scenes is verified using the Technical University of Munich RGBD dataset and self-collected data. We compare the scale estimation and drift correction of the proposed method with SLAM for a monocular camera and an RGBD camera.
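The absolute-scale estimate described above can be posed, in its simplest form, as a one-parameter least-squares fit between metric laser distances and the corresponding up-to-scale SLAM depths. This is a sketch of that idea, not the authors' exact formulation:

```python
def estimate_scale(laser_distances, slam_depths):
    # Least-squares scale s minimizing sum_i (d_i - s * z_i)^2, which gives
    # s = sum(d_i * z_i) / sum(z_i^2).
    # laser_distances: metric 1D range-finder readings.
    # slam_depths: corresponding up-to-scale depths from monocular SLAM.
    num = sum(d * z for d, z in zip(laser_distances, slam_depths))
    den = sum(z * z for z in slam_depths)
    return num / den
```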
Evaluation of Particle Image Velocimetry Measurement Using Multi-wavelength Illumination
NASA Astrophysics Data System (ADS)
Lai, HC; Chew, TF; Razak, NA
2018-05-01
In past decades, particle image velocimetry (PIV) has been widely used to measure fluid flow, and much research has been done to improve the PIV technique. In particular, high-power light-emitting diodes (HPLEDs) have been investigated as replacements for the traditional laser illumination system in PIV. As an extension of this work, two HPLEDs of different wavelengths are introduced as the PIV illumination system. The objective of this research is to use dual-colour LEDs in place of a laser so that a single frame can be captured by a normal camera instead of a high-speed camera. The dual-colour HPLED PIV supports a single-frame, double-pulse mode from which the particle velocity vectors can be plotted after correlation. An illumination system was designed, fabricated, and evaluated by measuring water flow in a small tank. The results indicate that HPLEDs promise advantages in terms of cost, safety and performance, and have high potential to be developed into an alternative for PIV illumination in the near future.
Synchronization of video recording and laser pulses including background light suppression
NASA Technical Reports Server (NTRS)
Kalshoven, Jr., James E. (Inventor); Tierney, Jr., Michael (Inventor); Dabney, Philip W. (Inventor)
2004-01-01
An apparatus for and a method of triggering a pulsed light source, in particular a laser light source, for predictable capture of the source by video equipment. A frame synchronization signal is derived from the video signal of a camera to trigger the laser and position the resulting laser light pulse in the appropriate field of the video frame and during the opening of the electronic shutter, if such a shutter is included in the camera. Positioning of the laser pulse in the proper video field allows, after recording, for viewing of the laser light image on a video monitor using the pause mode of a standard cassette-type VCR. This invention also allows for fine positioning of the laser pulse to fall within the electronic shutter opening. For cameras with externally controllable electronic shutters, the invention provides for background light suppression by increasing shutter speed during the frame in which the laser light image is captured. This results in the laser light appearing in one frame in which the background scene is suppressed while the laser light is unaffected; in all other frames, the shutter speed is slower, allowing for normal recording of the background scene. This invention also allows for arbitrary (manual or external) triggering of the laser with full video synchronization and background light suppression.
Spacecraft hazard avoidance utilizing structured light
NASA Technical Reports Server (NTRS)
Liebe, Carl Christian; Padgett, Curtis; Chapsky, Jacob; Wilson, Daniel; Brown, Kenneth; Jerebets, Sergei; Goldberg, Hannah; Schroeder, Jeffrey
2006-01-01
At JPL, a <5 kg free-flying micro-inspector spacecraft is being designed for host-vehicle inspection. The spacecraft includes a hazard avoidance sensor to navigate relative to the vehicle being inspected. Structured light was selected for hazard avoidance because of its low mass and cost. Structured light is a method of remotely sensing the 3-dimensional structure of nearby objects utilizing a laser, a grating, and a single regular APS camera. The laser beam is split into 400 different beams by a grating to form a regularly spaced grid of laser beams that are projected into the field of view of an APS camera. The laser source and the APS camera are separated, forming the base of a triangle. The distances to all beam intersections with the host are calculated by triangulation.
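The triangulation in the final sentence follows from the laser-camera-spot triangle: the baseline is one side, the laser beam angle and the camera viewing angle are the two adjacent angles, and the law of sines gives the range. A sketch with an assumed angle convention (angles measured from the baseline, in radians):

```python
import math

def triangulation_range(baseline, beam_angle, camera_angle):
    # Camera-to-spot distance via the law of sines on the
    # laser-camera-spot triangle. The angle at the spot is
    # pi minus the two baseline angles.
    spot_angle = math.pi - beam_angle - camera_angle
    return baseline * math.sin(beam_angle) / math.sin(spot_angle)
```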
Structured-Light Based 3d Laser Scanning of Semi-Submerged Structures
NASA Astrophysics Data System (ADS)
van der Lucht, J.; Bleier, M.; Leutert, F.; Schilling, K.; Nüchter, A.
2018-05-01
In this work we look at 3D acquisition of semi-submerged structures with a triangulation-based underwater laser scanning system. The motivation is that we want to simultaneously capture data above and below water to create a consistent model without any gaps. The employed structured light scanner consists of a machine vision camera and a green line laser. In order to reconstruct precise surface models of the object, it is necessary to model and correct for the refraction of the laser line and camera rays at the water-air boundary. We derive a geometric model for the refraction at the air-water interface and propose a method for correcting the scans. Furthermore, we show how the water surface is directly estimated from sensor data. The approach is verified using scans captured with an industrial manipulator to achieve reproducible scanner trajectories with different incident angles. We show that the proposed method is effective for refractive correction and that it can be applied directly to the raw sensor data without requiring any external markers or targets.
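At the heart of the refractive correction is Snell's law at the air-water boundary. A minimal sketch; the refractive indices are nominal values (air ≈ 1.0, water ≈ 1.33), and a full scanner model would trace each camera ray and laser-plane intersection through the estimated water surface:

```python
import math

def refract_angle(theta_incident, n1=1.0, n2=1.33):
    # Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    # Angles are measured from the surface normal, in radians.
    s = n1 * math.sin(theta_incident) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection: no transmitted ray")
    return math.asin(s)
```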
X-ray Measurements of Laser Irradiated Foam Filled Liners
NASA Astrophysics Data System (ADS)
Patankar, Siddharth; Mariscal, Derek; Goyon, Clement; Baker, Kevin; MacLaren, Stephan; Hammer, Jim; Baumann, Ted; Amendt, Peter; Menapace, Joseph; Berger, Bob; Afeyan, Bedros; Tabak, Max; Dixit, Sham; Kim, Sung Ho; Moody, John; Jones, Ogden
2016-10-01
Low-density foam liners are being investigated as sources of efficient x-rays. Understanding the laser-foam interaction is key to modeling and optimizing foam composition and density for x-ray production with reduced backscatter. We report on the experimental results of laser-irradiated foam liners filled with SiO2 and Ta2O5 foams at densities between 2 and 30 mg/cc. The foam liners consist of polyimide tubes filled with low-density foams and sealed with a gold foil at one end. The open end of the tube is driven with 250 J of 527 nm laser light in a 2 ns two-step pulse using the Jupiter Laser Facility at LLNL. A full-aperture backscatter system is used to diagnose the coupled energy and losses. A streaked x-ray camera and filtered x-ray pinhole cameras are used to measure laser penetration into the low-density foam for different mass densities. A HOPG crystal spectrometer is used to estimate a thermal electron temperature. Comparisons with beam propagation and x-ray emission simulations are presented. This work was performed under the auspices of the U.S. Department of Energy by the Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, with funding support from the Laboratory Directed Research and Development Program under project 15.
Hypervelocity impact studies using a rotating mirror framing laser shadowgraph camera
NASA Technical Reports Server (NTRS)
Parker, Vance C.; Crews, Jeanne Lee
1988-01-01
The need to study the effects of the impact of micrometeorites and orbital debris on various space-based systems has brought together the technologies of several companies and individuals in order to provide a successful instrumentation package. A light gas gun was employed to accelerate small projectiles to speeds in excess of 7 km/sec. Their impact on various targets is being studied with the help of a specially designed continuous-access rotating-mirror framing camera. The camera provides 80 frames of data at up to 1 × 10^6 frames/sec with exposure times of 20 nsec.
Grünbein, Marie Luise; Shoeman, Robert L; Doak, R Bruce
2018-03-19
To conduct X-ray Free-Electron Laser (XFEL) measurements at megahertz (MHz) repetition rates, sample solution must be delivered in a micron-sized liquid free-jet moving at up to 100 m/s. This exceeds by over a factor of two the jet speeds measurable with current high-speed camera techniques. Accordingly we have developed and describe herein an alternative jet velocimetry based on dual-pulse nanosecond laser illumination. Three separate implementations are described, including a small laser-diode system that is inexpensive and highly portable. We have also developed and describe analysis techniques to automatically and rapidly extract jet speed from dual-pulse images.
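Dual-pulse velocimetry recovers jet speed as the feature displacement between the two exposures divided by the known pulse separation. A minimal sketch of that arithmetic, with assumed units (displacement in pixels, pixel pitch in micrometres, separation in nanoseconds); the function name is illustrative:

```python
def jet_speed(displacement_px, pixel_size_um, pulse_separation_ns):
    # Speed in m/s: (displacement in metres) / (pulse separation in seconds).
    displacement_m = displacement_px * pixel_size_um * 1e-6
    dt_s = pulse_separation_ns * 1e-9
    return displacement_m / dt_s
```

For example, a 10-pixel shift at 1 μm/pixel over a 100 ns pulse separation corresponds to 100 m/s, the upper end of the jet speeds quoted above.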
The threshold of vapor channel formation in water induced by pulsed CO2 laser
NASA Astrophysics Data System (ADS)
Guo, Wenqing; Zhang, Xianzeng; Zhan, Zhenlin; Xie, Shusen
2012-12-01
Water plays an important role in laser ablation. There are two main interpretations of the laser-water interaction: the hydrokinetic effect and the vapor phenomenon. The two explanations are each reasonable in some respects, but neither explains the mechanism of laser-water interaction completely. In this study, the dynamic process of vapor channel formation induced by a pulsed CO2 laser in a static water layer was monitored by a high-speed camera. The wavelength of the pulsed CO2 laser is 10.64 μm, and the pulse repetition rate is 60 Hz. The laser power ranged from 1 to 7 W in steps of 0.5 W. The frame rate of the high-speed camera used in the experiment was 80025 fps. Based on the high-speed camera pictures, the dynamic process of vapor channel formation was examined, and the threshold of vapor channel formation, the pulsation period, and the volume, maximum depth and corresponding width of the vapor channel were determined. The results showed that the threshold of vapor channel formation was about 2.5 W. Moreover, the pulsation period and the maximum depth and corresponding width of the vapor channel increased with increasing laser power.
Human cadaver retina model for retinal heating during corneal surgery with a femtosecond laser
NASA Astrophysics Data System (ADS)
Sun, Hui; Fan, Zhongwei; Yun, Jin; Zhao, Tianzhuo; Yan, Ying; Kurtz, Ron M.; Juhasz, Tibor
2014-02-01
Femtosecond lasers are widely used in everyday clinical procedures to perform minimally invasive corneal refractive surgery. The IntraLase femtosecond laser (AMO Corp., Santa Ana, CA) is a common example of such a laser. In the present study a numerical simulation was developed to quantify the temperature rise in the retina during femtosecond intracorneal surgery. Also, ex-vivo retinal heating due to laser irradiation was measured with an infrared thermal camera (Fluke Corp., Everett, WA) as a validation of the simulation. A computer simulation was developed using Comsol Multiphysics to calculate the temperature rise in the cadaver retina during femtosecond laser corneal surgery. The simulation showed a temperature rise of less than 0.3 degrees for realistic pulse energies at the various repetition rates. Human cadaver retinas were irradiated with a 150 kHz IntraLase femtosecond laser and the temperature rise was measured with an infrared thermal camera. The thermal camera measurements are in agreement with the simulation. During routine femtosecond laser corneal surgery with normal clinical parameters, the temperature rise is well beneath the threshold for retinal damage. The simulation predictions are in agreement with the thermal measurements, providing a level of experimental validation.
New Airborne Sensors and Platforms for Solving Specific Tasks in Remote Sensing
NASA Astrophysics Data System (ADS)
Kemper, G.
2012-07-01
A huge number of small and medium-sized sensors have entered the market. Today's mid-format sensors reach 80 MPix and make it possible to run projects of medium size, comparable with the first big-format digital cameras about 6 years ago. New high-quality lenses and new developments in integration have prepared the market for photogrammetric work. Companies such as Phase One or Hasselblad and producers or integrators such as Trimble, Optec, and others have utilized these cameras for professional image production. In combination with small camera stabilizers they can also be used in small aircraft, making the equipment compact and easily transportable, e.g. for rapid assessment purposes. The combination of different camera sensors enables multi- or hyper-spectral installations, useful e.g. for agricultural or environmental projects. Arrays of oblique-viewing cameras are on the market as well; in many cases these are small and medium-format sensors combined as rotating or shifting devices or simply as a fixed setup. Beside proper camera installation and integration, the software that controls the hardware and guides the pilot has to solve many more tasks than a normal FMS did in the past. Small and relatively cheap laser scanners (e.g. Riegl) are on the market, and their proper combination with MS cameras and integrated planning and navigation is a challenge that has been solved by different software packages. Turnkey solutions are available, e.g. for monitoring power line corridors, where taking images is just a part of the job. Integration of thermal camera systems with laser scanner and video capturing must be combined with specific information about the objects stored in a database and linked when approaching the navigation point.
In-situ quality monitoring during laser brazing
NASA Astrophysics Data System (ADS)
Ungers, Michael; Fecker, Daniel; Frank, Sascha; Donst, Dmitri; Märgner, Volker; Abels, Peter; Kaierle, Stefan
Laser brazing of zinc-coated steel is a widely established manufacturing process in the automotive sector, where high quality requirements must be fulfilled. The strength, impermeability and surface appearance of the joint are particularly important for judging its quality. The development of an on-line quality control system is highly desired by the industry. This paper presents recent work on the development of such a system, which consists of two cameras operating in different spectral ranges. For the evaluation of the system, seam imperfections are created artificially during experiments. Finally, image processing algorithms for monitoring process parameters based on the captured images are presented.
NASA Astrophysics Data System (ADS)
Song, Zhen; Moore, Kevin L.; Chen, YangQuan; Bahl, Vikas
2003-09-01
As an outgrowth of a series of projects focused on the mobility of unmanned ground vehicles (UGV), an omni-directional (ODV), multi-robot, autonomous mobile parking security system has been developed. The system has two types of robots: the low-profile Omni-Directional Inspection System (ODIS), which can be used for under-vehicle inspections, and the mid-sized T4 robot, which serves as a "marsupial mothership" for the ODIS vehicles and performs coarse-resolution inspection. A key task for the T4 robot is license plate recognition (LPR). For a successful LPR task without compromising the recognition rate, the robot must be able to identify the bumper locations of vehicles in the parking area and then precisely position the LPR camera relative to the bumper. This paper describes a 2D-laser-scanner-based approach to bumper identification and laser servoing for the T4 robot. The system uses a gimbal-mounted scanning laser. As the T4 robot travels down a row of parking stalls, data is collected from the laser every 100 ms. For each parking stall in the range of the laser during the scan, the data is matched to a "bumper box" corresponding to where a car bumper is expected, resulting in a point cloud of data corresponding to a vehicle bumper for each stall. Next, recursive line-fitting algorithms are used to determine a line for the data in each stall's "bumper box." The fitting technique uses Hough-based transforms, which are robust against segmentation problems and fast enough for real-time line fitting. Once a bumper line is fitted with an acceptable confidence, the bumper location is passed to the T4 motion controller, which moves to position the LPR camera properly relative to the bumper. The paper includes examples and results that show the effectiveness of the technique, including its ability to work in real time.
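The Hough-based fitting step can be illustrated as a vote over (theta, rho) line parameters; the sketch below is a generic NumPy version, not the authors' implementation, and the point format, bin size and angular resolution are illustrative assumptions:

```python
import numpy as np

def hough_line_fit(points, theta_steps=180, rho_res=0.02):
    """Fit the dominant line through a 2D point cloud with a Hough vote.

    Each point votes for the (theta, rho) parameters of all lines passing
    through it: rho = x*cos(theta) + y*sin(theta).  The accumulator cell
    with the most votes gives the line; unlike least squares, the vote is
    robust to outliers and segmentation gaps.
    """
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, theta_steps, endpoint=False)
    # rho value of every (point, theta) pair, shape (n_points, theta_steps)
    rho = pts[:, 0:1] * np.cos(thetas) + pts[:, 1:2] * np.sin(thetas)
    rho_max = np.abs(rho).max() + rho_res
    n_bins = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((theta_steps, n_bins), dtype=int)
    bins = ((rho + rho_max) / rho_res).astype(int)
    for j in range(theta_steps):
        np.add.at(acc[j], bins[:, j], 1)   # accumulate votes per theta
    j, b = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[j], b * rho_res - rho_max   # (theta, rho) of best line
```

For bumper data the winning (theta, rho) pair would describe the bumper line in the scanner frame, from which a standoff pose for the LPR camera can be computed.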
Tracking the course of the manufacturing process in selective laser melting
NASA Astrophysics Data System (ADS)
Thombansen, U.; Gatej, A.; Pereira, M.
2014-02-01
An innovative optical train for a selective laser melting (SLM) based manufacturing system has been designed with the objective of tracking the course of the SLM process. In it, the thermal emission from the melt pool and the geometric properties of the interaction zone are addressed by applying a pyrometer and a camera system, respectively. The optical system is designed such that the processing laser beam, the thermal emission and the camera image are coupled coaxially and propagate on the same optical axis. As standard f-theta lenses for high-power applications inevitably lead to aberrations and divergent optical axes for increasing deflection angles in combination with multiple wavelengths, a pre-focus system is used to implement a focusing unit which shapes the beam prior to passing the scanner. The sensor system synchronously records the current position of the laser beam, the current emission from the melt pool and an image of the interaction zone. Acquired thermal emission data is visualized after processing, which allows an instant evaluation of the course of the process at any position of each layer. As such, it provides a fully detailed history of the product. This basic work realizes a first step towards self-optimization of the manufacturing process by providing information about quality-relevant events during manufacture. The deviation of the actual course of the manufacturing process from the planned course can be used to adapt the manufacturing strategy from one layer to the next. In the current state, the system can be used to facilitate the setup of the manufacturing system, as it allows identification of false machine settings without having to analyze the work piece.
A flexible, on-line magnetic spectrometer for ultra-intense laser produced fast electron measurement
NASA Astrophysics Data System (ADS)
Ge, Xulei; Yuan, Xiaohui; Yang, Su; Deng, Yanqing; Wei, Wenqing; Fang, Yuan; Gao, Jian; Liu, Feng; Chen, Min; Zhao, Li; Ma, Yanyun; Sheng, Zhengming; Zhang, Jie
2018-04-01
We have developed an on-line magnetic spectrometer to measure energy distributions of fast electrons generated from ultra-intense laser-solid interactions. The spectrometer consists of a sheet of plastic scintillator, a bundle of non-scintillating plastic fibers, and an sCMOS camera recording system. The design advantages include on-line capturing ability, versatility of the detection arrangement, and resistance to the harsh in-chamber environment. The validity of the instrument was tested experimentally. This spectrometer can be applied to the characterization of fast electron sources for understanding fundamental laser-plasma interaction physics and to the optimization of high-repetition-rate laser-driven applications.
NASA Astrophysics Data System (ADS)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.; Kalantar, Daniel H.
2015-02-01
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high energy density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove the nonlinear effects. A detailed calibration procedure was developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for use in a production environment.
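The thin-plate-spline idea can be sketched with SciPy's RBFInterpolator, fitting a TPS mapping from observed comb-fiducial positions to their known true positions and then applying it to undo the warp. This stands in for the paper's production algorithm; the affine test distortion and fiducial grid below are purely illustrative:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def fit_tps_warp(observed, true):
    """Thin-plate-spline mapping from distorted fiducial positions (as seen
    on the streak camera) to their known true positions.  Applying the
    returned interpolator to raw image coordinates removes the warp."""
    return RBFInterpolator(observed, true, kernel="thin_plate_spline")

# Hypothetical comb grid distorted by an affine warp, for illustration only.
true = np.array([[x, y] for x in range(5) for y in range(5)], dtype=float)
observed = true * 1.1 + np.array([0.5, -0.2])
warp = fit_tps_warp(observed, true)
```

Because the TPS model includes an affine polynomial term, a purely affine distortion is recovered exactly; the radial basis part absorbs the smooth residual nonlinearities seen in real streak-camera data.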
3D Lasers Increase Efficiency, Safety of Moving Machines
NASA Technical Reports Server (NTRS)
2015-01-01
Canadian company Neptec Design Group Ltd. developed its Laser Camera System, used by space shuttles to render 3D maps of their hulls for assessing potential damage. Using NASA funding, the firm incorporated LiDAR technology and created the TriDAR 3D sensor. Its commercial arm, Neptec Technologies Corp., has sold the technology to Orbital Sciences, which uses it to guide its Cygnus spacecraft during rendezvous and docking operations at the International Space Station.
Photogrammetry and Laser Imagery Tests for Tank Waste Volume Estimates: Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, Jim G.
2013-03-27
Feasibility tests were conducted using photogrammetry and laser technologies to estimate the volume of waste in a tank. These technologies were compared with video Camera/CAD Modeling System (CCMS) estimates, the current method used for post-retrieval waste volume estimates. This report summarizes test results and presents recommendations for further development and deployment of technologies to provide more accurate and faster waste volume estimates in support of tank retrieval and closure.
Obstacle Detection and Avoidance of a Mobile Robotic Platform Using Active Depth Sensing
2014-06-01
At a price of nearly one tenth that of a laser range finder, the Xbox Kinect uses an infrared projector and camera to capture images of its environment in three dimensions.
Video sensor with range measurement capability
NASA Technical Reports Server (NTRS)
Howard, Richard T. (Inventor); Briscoe, Jeri M. (Inventor); Corder, Eric L. (Inventor); Broderick, David J. (Inventor)
2008-01-01
A video sensor device is provided which incorporates a rangefinder function. The device includes a single video camera and a fixed laser spaced a predetermined distance from the camera for, when activated, producing a laser beam. A diffractive optic element divides the beam so that multiple light spots are produced on a target object. A processor calculates the range to the object based on the known spacing and angles determined from the light spots on the video images produced by the camera.
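In the simplest single-spot case, the ranging geometry described in the patent reduces to classic triangulation from the known camera-laser spacing; the sketch below illustrates that generic geometry under simplifying assumptions (a laser beam parallel to the camera axis, focal length in pixels), not the patent's exact multi-spot optics:

```python
def range_from_spot(pixel_offset, focal_px, baseline_m):
    """Triangulate range from the image offset of a laser spot.

    A laser mounted a known baseline from the camera projects a spot on the
    target.  The spot's pixel offset from the camera axis gives the
    parallax angle, and range follows from similar triangles:

        tan(angle) = pixel_offset / focal_px
        range      = baseline / tan(angle) = baseline * focal_px / pixel_offset

    Assumes the laser beam is parallel to the optical axis (illustrative
    simplification of the patented geometry).
    """
    return baseline_m * focal_px / pixel_offset
```

With several spots from the diffractive element, one such estimate per spot can be averaged or used to recover local surface orientation.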
NASA Astrophysics Data System (ADS)
Gnyawali, Surya C.; Blum, Kevin; Pal, Durba; Ghatak, Subhadip; Khanna, Savita; Roy, Sashwati; Sen, Chandan K.
2017-01-01
Cutaneous microvasculopathy complicates wound healing. Functional assessment of gated individual dermal microvessels is therefore of outstanding interest. The functional performance of laser speckle contrast imaging (LSCI) systems is compromised by motion artefacts. To address this weakness, post-processing of stacked images is reported. We report the first post-processing of binary raw data from a high-resolution LSCI camera. Sharp images of low-flow microvessels were enabled by introducing inverse variance in conjunction with speckle contrast in Matlab-based program code. Extended moving-window averaging enhanced the signal-to-noise ratio. A functional quantitative study of blood flow kinetics was performed on single gated microvessels using a freehand tool. Based on detection of flow in low-flow microvessels, a new sharp contrast image was derived. Thus, this work presents the first distinct image with quantitative microperfusion data from the gated human foot microvasculature. This versatile platform is applicable to the study of a wide range of tissue systems, including the fine vascular network in the murine brain without craniotomy as well as that in the murine dorsal skin. Importantly, the algorithm reported herein is hardware agnostic and is capable of post-processing binary raw data from any camera source to improve the sensitivity of functional flow data above and beyond the standard limits of the optical system.
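The basic speckle-contrast computation such LSCI post-processing builds on can be sketched as a sliding-window sigma/mean map; this is a generic illustration (window size is arbitrary), not the authors' Matlab code:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, win=7):
    """Spatial laser speckle contrast K = sigma / mean over a sliding window.

    Fast-moving scatterers blur the speckle pattern, lowering K, so low K
    indicates high flow.  A stack of frames can additionally be averaged
    over time to raise SNR, in the spirit of the extended moving-window
    averaging described above.
    """
    raw = raw.astype(float)
    mean = uniform_filter(raw, win)
    mean_sq = uniform_filter(raw * raw, win)
    var = np.clip(mean_sq - mean * mean, 0.0, None)  # guard tiny negatives
    return np.sqrt(var) / np.maximum(mean, 1e-12)
```

A flow index is then commonly derived per pixel as 1/K^2 before any vessel-gated quantification.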
Fisheye Multi-Camera System Calibration for Surveying Narrow and Complex Architectures
NASA Astrophysics Data System (ADS)
Perfetti, L.; Polari, C.; Fassi, F.
2018-05-01
Narrow spaces and passages are not a rare encounter in cultural heritage; the shape and extension of those areas pose a serious challenge to any technique chosen to survey their 3D geometry, especially techniques that make use of stationary instrumentation like terrestrial laser scanning. The ratio between space extension and cross-section width of many corridors and staircases can easily lead to distortion/drift of the 3D reconstruction because of the propagation of uncertainty. This paper investigates the use of fisheye photogrammetry to produce the 3D reconstruction of such spaces and presents some tests to contain the degrees of freedom of the photogrammetric network, thereby also containing the drift of long data sets. The idea is to employ a multi-camera system composed of several fisheye cameras and to implement distance and relative orientation constraints, as well as pre-calibration of the internal parameters of each camera, within the bundle adjustment. As a starting point for this investigation, we used the NCTech iSTAR panoramic camera as a rigid multi-camera system. The case study of the Amedeo Spire of the Milan Cathedral, which encloses a spiral staircase, is the stage for all the tests. Comparisons have been made between the results obtained with the multi-camera configuration, the auto-stitched equirectangular images and a data set obtained with a monocular fisheye configuration using a full-frame DSLR. Results show improved accuracy, down to millimetres, using the rigidly constrained multi-camera.
Streak camera based SLR receiver for two color atmospheric measurements
NASA Technical Reports Server (NTRS)
Varghese, Thomas K.; Clarke, Christopher; Oldham, Thomas; Selden, Michael
1993-01-01
To realize accurate two-color differential measurements, an image digitizing system with variable spatial resolution was designed, built, and integrated into a photon-counting picosecond streak camera, yielding a temporal scan resolution better than 300 femtoseconds/pixel. The streak camera is configured to operate with three spatial channels; two of these support green (532 nm) and UV (355 nm), while the third accommodates reference pulses (764 nm) for real-time calibration. Critical parameters affecting differential timing accuracy, such as pulse width and shape, number of received photons, streak camera/imaging system nonlinearities, dynamic range, and noise characteristics, were investigated to optimize the system for accurate differential delay measurements. The streak camera output image consists of three image fields; each field is 1024 pixels along the time axis and 16 pixels across the spatial axis. Each of the image fields may be independently positioned across the spatial axis. Two of the image fields are used for the two wavelengths used in the experiment; the third window measures the temporal separation of a pair of diode laser pulses which verify the streak camera sweep speed for each data frame. The sum of the 16 pixel intensities across each of the 1024 temporal positions for the three data windows is used to extract the three waveforms. The waveform data is processed using an iterative three-point running average filter (10 to 30 iterations are used) to remove high-frequency structure. The pulse-pair separations are determined using half-maximum and centroid-type analyses. Rigorous experimental verification has demonstrated that this simplified process provides the best measurement accuracy. To calibrate the receiver system sweep, two laser pulses with precisely known temporal separation are scanned along the full length of the sweep axis. The experimental measurements are then modeled using polynomial regression to obtain a best fit to the data.
Data aggregation using the normal point approach has provided accurate data fitting and is found to be much more convenient than using the full-rate single-shot data. The systematic errors from this model have been found to be less than 3 ps for normal points.
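The smoothing and pulse-location steps described above can be sketched as follows; this is a generic illustration of an iterative three-point running average and a centroid estimate, not the original processing chain:

```python
import numpy as np

def smooth_waveform(w, iterations=20):
    """Iterative three-point running average, as used to strip
    high-frequency structure from the streak-camera waveforms
    (10 to 30 iterations per the text; 20 is a middle value)."""
    w = np.asarray(w, dtype=float)
    kernel = np.full(3, 1.0 / 3.0)
    for _ in range(iterations):
        w = np.convolve(w, kernel, mode="same")
    return w

def centroid(w):
    """Centroid (first-moment) position of a pulse, in pixels.
    A pulse-pair separation is the difference of two such centroids."""
    w = np.asarray(w, dtype=float)
    i = np.arange(w.size)
    return float((i * w).sum() / w.sum())
```

Because both the filter and the centroid are symmetric operations, smoothing does not shift a symmetric pulse, which is why the simple chain preserves differential timing accuracy.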
Stereoscopic Machine-Vision System Using Projected Circles
NASA Technical Reports Server (NTRS)
Mackey, Jeffrey R.
2010-01-01
A machine-vision system capable of detecting obstacles large enough to damage or trap a robotic vehicle is undergoing development. The system includes (1) a pattern generator that projects concentric circles of laser light forward onto the terrain, (2) a stereoscopic pair of cameras that are aimed forward to acquire images of the circles, (3) a frame grabber and digitizer for acquiring image data from the cameras, and (4) a single-board computer that processes the data. The system is being developed as a prototype of machine-vision systems to enable robotic vehicles (rovers) on remote planets to avoid craters, large rocks, and other terrain features that could capture or damage the vehicles. Potential terrestrial applications of systems like this one could include terrain mapping, collision avoidance, navigation of robotic vehicles, mining, and robotic rescue. This system is based partly on the same principles as those of a prior stereoscopic machine-vision system in which the cameras acquire images of a single stripe of laser light that is swept forward across the terrain. However, this system is designed to afford improvements over some of the undesirable features of the prior system, including the need for a pan-and-tilt mechanism to aim the laser to generate the swept stripe, ambiguities in interpretation of the single-stripe image, the time needed to sweep the stripe across the terrain and process the data from many images acquired during that time, and difficulty of calibration because of the narrowness of the stripe. In this system, the pattern generator does not contain any moving parts and need not be mounted on a pan-and-tilt mechanism: the pattern of concentric circles is projected steadily in the forward direction. The system calibrates itself by use of data acquired during projection of the concentric-circle pattern onto a known target representing flat ground.
The calibration-target image data are stored in the computer memory for use as a template in processing terrain images. During operation on terrain, the images acquired by the left and right cameras are analyzed. The analysis includes (1) computation of the horizontal and vertical dimensions and the aspect ratios of rectangles that bound the circle images and (2) comparison of these aspect ratios with those of the template. Coordinates of distortions of the circles are used to identify and locate objects. If the analysis leads to identification of an object of significant size, then stereoscopic-vision algorithms are used to estimate the distance to the object. The time taken in performing this analysis on a single pair of images acquired by the left and right cameras in this system is a fraction of the time taken in processing the many pairs of images acquired in a sweep of the laser stripe across the field of view in the prior system. The results of the analysis include data on sizes and shapes of, and distances and directions to, objects. Coordinates of objects are updated as the vehicle moves so that intelligent decisions regarding speed and direction can be made. The results of the analysis are utilized in a computational decision-making process that generates obstacle-avoidance data and feeds those data to the control system of the robotic vehicle.
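The bounding-rectangle comparison can be sketched as follows; the point-list input and the deviation tolerance are illustrative assumptions, not the system's actual parameters:

```python
import numpy as np

def bounding_aspect(points):
    """Aspect ratio (width / height) of the axis-aligned rectangle
    bounding an imaged circle, given its (x, y) edge points."""
    pts = np.asarray(points, dtype=float)
    w = pts[:, 0].max() - pts[:, 0].min()
    h = pts[:, 1].max() - pts[:, 1].min()
    return w / h

def deviates_from_template(points, template_aspect, tol=0.15):
    """Flag a circle whose bounding-box aspect ratio differs from the
    flat-ground template by more than tol: a cue that an obstacle is
    distorting the projected pattern (threshold is illustrative)."""
    return abs(bounding_aspect(points) - template_aspect) > tol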
Proceedings of Workshop on Laser Diagnostics in Fluid Mechanics and Combustion
NASA Astrophysics Data System (ADS)
1993-10-01
Proceedings of the Workshop on Laser Diagnostics in Fluid Mechanics and Combustion are presented. Topics included are: Accuracy of Laser Doppler Anemometry; Applications of Raman-Rayleigh-LIF Diagnostics in Combustion Research; Phase Doppler Anemometer Technique Concepts and Applications; CARS; Particle Image Velocimetry; Practical Consideration in the Use and Design of Laser Velocimetry Systems in Turbomachinery Applications; Phase Doppler Measurements of Gas-Particle Flow Through a Tube Bank; Degenerate Four Wave Mixing for Shock Tunnel Studies of Supersonic Combustion; Laser Induced Photodissociation and Fluorescence (LIPF) of Sodium Species Present in Coal Combustion; 3D Holographic Measurements Inside a Spark Ignition Engine; Laser Doppler Velocimeter Measurements in Compressible Flow; Bursting in a Tornado Vortex; Quantitative Imaging of OH and Temperature Using a Single Laser Source and Single Intensified Camera; and Laser Doppler Measurements Inside an Artificial Heart Valve.
NASA Astrophysics Data System (ADS)
Rasztovits, S.; Dorninger, P.
2013-07-01
Terrestrial Laser Scanning (TLS) is an established method to reconstruct the geometrical surface of given objects. Current systems allow for fast and efficient determination of 3D models with high accuracy and richness in detail. Alternatively, 3D reconstruction services use images to reconstruct the surface of an object. While the instrumental expenses for laser scanning systems are high, upcoming free software services as well as open source software packages enable the generation of 3D models using digital consumer cameras. In addition, processing TLS data still requires an experienced user, while recent web services operate completely automatically. An indisputable advantage of image-based 3D modeling is its implicit capability for model texturing. However, the achievable accuracy and resolution of the 3D models is lower than those of laser scanning data. Within this contribution, we investigate the results of automated web services for image-based 3D model generation with respect to a TLS reference model. For this, a copper sculpture was acquired using a laser scanner and using image series from different digital cameras. Two different web services, namely Arc3D and Autodesk 123D Catch, were used to process the image data. The geometric accuracy was compared for the entire model and for some highly structured details. The results are presented and interpreted based on difference models. Finally, an economic comparison of the generation of the models is given, considering interactive and processing time costs.
Development of a highly automated system for the remote evaluation of individual tree parameters
Richard Pollock
2000-01-01
A highly automated procedure for remotely estimating individual tree location, crown diameter, species class, and height has been developed. The procedure involves the use of a multimodal airborne sensing system that consists of a digital frame camera, a scanning laser rangefinder, and a position and orientation measurement system. Data from the multimodal sensing...
Data acquisition and analysis of range-finding systems for spacing construction
NASA Technical Reports Server (NTRS)
Shen, C. N.
1981-01-01
For future space missions, completely autonomous robotic machines will be required to free astronauts from routine chores such as equipment maintenance and the servicing of faulty systems, and to extend human capabilities in hazardous environments full of cosmic and other harmful radiation. In places with high radiation and uncontrollable ambient illumination, TV-camera-based vision systems cannot work effectively. However, a vision system utilizing directly measured range information from a time-of-flight laser rangefinder can operate successfully in these environments. Such a system is independent of proper illumination conditions, and the interfering effects of intense radiation of all kinds are eliminated by the tuned input of the laser instrument. By processing the range data according to certain decision, stochastic estimation and heuristic schemes, the laser-based vision system will recognize known objects and thus provide sufficient information to the robot's control system, which can develop strategies for various objectives.
Function-based design process for an intelligent ground vehicle vision system
NASA Astrophysics Data System (ADS)
Nagel, Robert L.; Perry, Kenneth L.; Stone, Robert B.; McAdams, Daniel A.
2010-10-01
An engineering design framework for an autonomous ground vehicle vision system is discussed. We present both the conceptual and physical design by following the design process, development and testing of an intelligent ground vehicle vision system constructed for the 2008 Intelligent Ground Vehicle Competition. During conceptual design, the requirements for the vision system are explored via functional and process analysis considering the flows into the vehicle and the transformations of those flows. The conceptual design phase concludes with a vision system design that is modular in both hardware and software and is based on a laser range finder and camera for visual perception. During physical design, prototypes are developed and tested independently, following the modular interfaces identified during conceptual design. Prototype models, once functional, are implemented into the final design. The final vision system design uses a ray-casting algorithm to process camera and laser range finder data and identify potential paths. The ray-casting algorithm is a single thread of the robot's multithreaded application. Other threads control motion, provide feedback, and process sensory data. Once integrated, both hardware and software testing are performed on the robot. We discuss the robot's performance and the lessons learned.
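The ray-casting idea, scoring free distance along candidate directions through fused camera and laser data, can be sketched on an occupancy grid; the grid representation, unit step and return convention are assumptions, not the team's implementation:

```python
def cast_ray(grid, x0, y0, dx, dy, max_steps=100):
    """Step a ray across an occupancy grid (True = obstacle) and return
    the free distance in cells.  Candidate headings with the longest free
    distance are potential paths; fused laser/camera detections would be
    rasterized into `grid` before casting."""
    x, y = float(x0), float(y0)
    for step in range(max_steps):
        xi, yi = int(round(x)), int(round(y))
        if not (0 <= xi < len(grid[0]) and 0 <= yi < len(grid)):
            return step          # ray left the mapped area
        if grid[yi][xi]:
            return step          # hit an obstacle
        x += dx
        y += dy
    return max_steps
```

Casting one such ray per candidate heading each control cycle keeps the path-selection thread cheap enough to run alongside the motion and feedback threads.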
Development of Mobile Mapping System for 3D Road Asset Inventory.
Sairam, Nivedita; Nagarajan, Sudhagar; Ornitz, Scott
2016-03-12
Asset Management is an important component of an infrastructure project. A significant cost is involved in maintaining and updating the asset information. Data collection is the most time-consuming task in the development of an asset management system. In order to reduce the time and cost involved in data collection, this paper proposes a low-cost Mobile Mapping System equipped with a laser scanner and cameras. First, the feasibility of low-cost sensors for 3D asset inventory is discussed by deriving appropriate sensor models. Then, through calibration procedures, the respective alignments of the laser scanner, cameras, Inertial Measurement Unit and GPS (Global Positioning System) antenna are determined. The efficiency of this Mobile Mapping System was evaluated by mounting it on a truck and a golf cart. By using the derived sensor models, geo-referenced images and 3D point clouds are derived. After validating the quality of the derived data, the paper provides a framework to extract road assets both automatically and manually using techniques implementing RANSAC plane fitting and edge extraction algorithms. Then the scope of such extraction techniques, along with a sample GIS (Geographic Information System) database structure for a unified 3D asset inventory, is discussed.
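A minimal sketch of the RANSAC plane fitting used for surface extraction from such point clouds (the iteration count and inlier tolerance are illustrative, not the paper's values):

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.05, rng=None):
    """Minimal RANSAC plane fit: repeatedly fit a plane to 3 random
    points and keep the model with the most inliers within tol metres.
    Returns ((unit normal, offset), inlier mask), with the plane given
    by  normal . x = offset."""
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, dtype=float)
    best, best_inliers = None, None
    for _ in range(n_iter):
        p0, p1, p2 = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                    # degenerate (collinear) sample
            continue
        n /= norm
        dist = np.abs((pts - p0) @ n)      # point-to-plane distances
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best, best_inliers = (n, p0 @ n), inliers
    return best, best_inliers
```

In a road-extraction pipeline, the dominant inlier set is taken as the pavement surface and the remaining points are passed on to edge and pole extraction.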
NASA Astrophysics Data System (ADS)
Fuchs, Eran; Tuell, Grady
2010-04-01
The CZMIL system is a new generation airborne bathymetric and topographic remote sensing platform composed of an active lidar, passive hyperspectral imager, high resolution frame camera, navigation system, and storage media running on a Linux-based Gigabit Ethernet network. The lidar is a hybrid scanned-flash system employing a 10 kHz green laser and novel circular scanner, with a large-aperture receiver (0.20 m) having multiple channels. A PMT-based segmented detector is used on one channel to support simultaneous topographic and bathymetric data collection, and multiple fields of view are measured to support bathymetric measurements. The measured laser returns are digitized at 1 GHz to produce the waveforms required for ranging measurements, and unique data compression and storage techniques are used to address the large data volume. Simulated results demonstrate CZMIL's capability to discriminate bottom and surface returns in very shallow water conditions without compromising performance in deep water. Simulated waveforms are compared with measured data from the SHOALS system and show promising expected results. The system's prototype is expected to be completed by the end of 2010, and ready for initial calibration tests in the spring of 2010.
Sub-pixel accuracy thickness calculation of poultry fillets from scattered laser profiles
NASA Astrophysics Data System (ADS)
Jing, Hansong; Chen, Xin; Tao, Yang; Zhu, Bin; Jin, Fenghua
2005-11-01
A laser range imaging system based on the triangulation method was designed and implemented for online high-resolution thickness calculation of poultry fillets. A laser pattern was projected onto the surface of the chicken fillet for calculation of the thickness of the meat. Because chicken fillets are relatively loosely structured material, laser light easily penetrates the meat, and scattering occurs both at and under the surface. When laser light is scattered under the surface, it is reflected back and further blurs the laser line sharpness. To accurately calculate the thickness of the object, the light transport has to be considered. In the system, the Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) was used to model the light transport and the light pattern reflected into the cameras. The BSSRDF gives the reflectance of a target as a function of illumination geometry and viewing geometry. Based on this function, an empirical method has been developed, and it has been proven that this method can be used to accurately calculate the thickness of the object from a scattered laser profile. The laser range system is designed as a sub-system that complements the X-ray bone inspection system for non-invasive detection of hazardous materials in boneless poultry meat with irregular thickness.
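Sub-pixel localization of a laser-line profile is commonly done by parabolic interpolation around the brightest pixel; the sketch below illustrates that generic step, not the paper's BSSRDF-based correction for sub-surface scattering:

```python
import numpy as np

def subpixel_peak(profile):
    """Sub-pixel laser-line centre by fitting a parabola through the
    brightest pixel and its two neighbours.  For a blurred (scattered)
    profile this refines the peak well below one pixel, before any
    scattering-model correction is applied."""
    p = np.asarray(profile, dtype=float)
    i = int(p.argmax())
    if i == 0 or i == p.size - 1:
        return float(i)                    # peak at the edge: no refinement
    a, b, c = p[i - 1], p[i], p[i + 1]
    denom = a - 2.0 * b + c
    return i + 0.5 * (a - c) / denom if denom != 0 else float(i)
```

The refined line position per column then feeds the triangulation that converts laser-line displacement into fillet thickness.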
Registration of Vehicle-Borne Point Clouds and Panoramic Images Based on Sensor Constellations
Yao, Lianbi; Wu, Hangbin; Li, Yayun; Meng, Bin; Qian, Jinfei; Liu, Chun; Fan, Hongchao
2017-01-01
A mobile mapping system (MMS) is usually utilized to collect environmental data on and around urban roads. Laser scanners and panoramic cameras are the main sensors of an MMS. This paper presents a new method for the registration of the point clouds and panoramic images based on sensor constellation. After the sensor constellation was analyzed, a feature point, the intersection of the connecting line between the global positioning system (GPS) antenna and the panoramic camera with a horizontal plane, was utilized to separate the point clouds into blocks. The blocks for the central and sideward laser scanners were extracted with the segmentation feature points. Then, the point clouds located in the blocks were separated from the original point clouds. Each point in the blocks was used to find the accurate corresponding pixel in the relative panoramic images via a collinear function, and the position and orientation relationship amongst different sensors. A search strategy is proposed for the correspondence of laser scanners and lenses of panoramic cameras to reduce calculation complexity and improve efficiency. Four cases of different urban road types were selected to verify the efficiency and accuracy of the proposed method. Results indicate that most of the point clouds (with an average of 99.7%) were successfully registered with the panoramic images with great efficiency. Geometric evaluation results indicate that horizontal accuracy was approximately 0.10–0.20 m, and vertical accuracy was approximately 0.01–0.02 m for all cases. Finally, the main factors that affect registration accuracy, including time synchronization amongst different sensors, system positioning and vehicle speed, are discussed. PMID:28398256
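The collinearity mapping from a 3D point to a panoramic pixel can be sketched for an ideal equirectangular panorama; the axis conventions and image layout here are assumptions for illustration, not the paper's exact sensor model, which additionally applies the calibrated lever arms and orientations between sensors:

```python
import math

def pano_pixel(x, y, z, width, height):
    """Map a 3D point, already transformed into the panoramic camera
    frame, to an equirectangular pixel via its azimuth and elevation.
    Assumes +x forward, +z up, u = 0 at azimuth -pi, v = 0 at zenith."""
    az = math.atan2(y, x)                             # -pi .. pi
    el = math.asin(z / math.sqrt(x * x + y * y + z * z))  # -pi/2 .. pi/2
    u = (az + math.pi) / (2.0 * math.pi) * width
    v = (math.pi / 2.0 - el) / math.pi * height
    return u, v
```

Each laser point in a block would be mapped this way to fetch its colour from the corresponding panorama.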
Laser focus compensating sensing and imaging device
Vann, Charles S.
1993-01-01
A laser focus compensating sensing and imaging device permits different-frequency laser beams emanating from the same source point to be brought to a single focal point. In particular, it allows the focusing of laser beams originating from the same laser device but having differing intensities, so that a low intensity beam will not convert to a higher frequency when passing through a conversion crystal associated with the laser generating device. The laser focus compensating sensing and imaging device uses a Cassegrain system to fold the lower frequency, low intensity beam back upon itself so that it will focus at the same focal point as a high intensity beam. An angular tilt compensating lens is mounted about the secondary mirror of the Cassegrain system to assist in alignment. In addition, cameras or CCDs are mounted with the primary mirror to sense the focused image. A convex lens is positioned co-axial with the Cassegrain system on the side of the primary mirror distal of the secondary for use in aligning a target with the laser beam. A first alternate embodiment includes a Cassegrain system using a series of shutters and an internally mounted dichroic mirror. A second alternate embodiment uses two laser focus compensating sensing and imaging devices for aligning a moving tool with a workpiece.
Laser focus compensating sensing and imaging device
Vann, C.S.
1993-08-31
A laser focus compensating sensing and imaging device permits different-frequency laser beams emanating from the same source point to be brought to a single focal point. In particular, it allows the focusing of laser beams originating from the same laser device but having differing intensities, so that a low intensity beam will not convert to a higher frequency when passing through a conversion crystal associated with the laser generating device. The laser focus compensating sensing and imaging device uses a Cassegrain system to fold the lower frequency, low intensity beam back upon itself so that it will focus at the same focal point as a high intensity beam. An angular tilt compensating lens is mounted about the secondary mirror of the Cassegrain system to assist in alignment. In addition, cameras or CCDs are mounted with the primary mirror to sense the focused image. A convex lens is positioned co-axial with the Cassegrain system on the side of the primary mirror distal of the secondary for use in aligning a target with the laser beam. A first alternate embodiment includes a Cassegrain system using a series of shutters and an internally mounted dichroic mirror. A second alternate embodiment uses two laser focus compensating sensing and imaging devices for aligning a moving tool with a workpiece.
High resolution Thomson scattering system for steady-state linear plasma sources
NASA Astrophysics Data System (ADS)
Lee, K. Y.; Lee, K. I.; Kim, J. H.; Lho, T.
2018-01-01
The high resolution Thomson scattering system, with 63 points along a 25 mm line, measures the radial electron temperature (Te) and density (ne) in an argon plasma. Using a DC arc source with a lanthanum hexaboride (LaB6) electrode, plasmas with electron temperatures of over 5 eV and densities of 1.5 × 10^19 m^-3 have been measured. The system uses a frequency-doubled (532 nm) Nd:YAG laser with 0.25 J/pulse at 20 Hz. The scattered light is collected and sent to a triple-grating spectrometer via optical fibers, where images are recorded by an intensified charge coupled device (ICCD) camera. Although excellent in stray-light reduction, this spectrometer has a relatively low optical transmission and samples a tiny scattering volume, which requires accumulating a multitude of images. In order to improve photon statistics, pixel binning in the ICCD camera as well as enlarging the intermediate slit-width inside the triple-grating spectrometer have been exploited. In addition, the ICCD camera captures images at 40 Hz while the laser fires at 20 Hz. This operation mode allows us to alternate between background and scattering shot images; by image subtraction, influences from the plasma background are effectively removed. Maximum likelihood estimation using a parameter sweep finds the best-fitting parameters Te and ne for the incoherent scattering spectrum.
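The alternating 40 Hz camera / 20 Hz laser scheme and the parameter sweep can be sketched as follows. The even/odd shot ordering and the Gaussian placeholder for the incoherent Thomson spectrum are assumptions for illustration, not the paper's actual model:

```python
import numpy as np

def subtract_background(frames):
    # Frames arrive at twice the laser rate, so (assumed ordering) even
    # indices are scattering shots and odd indices are background shots.
    frames = np.asarray(frames, dtype=float)
    return (frames[0::2] - frames[1::2]).sum(axis=0)

def fit_te(wavelengths, spectrum, te_grid, width_of):
    # Brute-force parameter sweep: pick the Te whose model spectrum best
    # matches the data in a least-squares sense (amplitude fitted analytically).
    # `width_of` maps Te to a Gaussian 1/e width -- a placeholder for the
    # true incoherent scattering spectrum.
    best_te, best_err = None, np.inf
    for te in te_grid:
        model = np.exp(-(wavelengths / width_of(te)) ** 2)
        amp = (spectrum @ model) / (model @ model)  # optimal amplitude
        err = np.sum((spectrum - amp * model) ** 2)
        if err < best_err:
            best_te, best_err = te, err
    return best_te

wl = np.linspace(-5.0, 5.0, 101)
spec = 3.0 * np.exp(-(wl / 2.0) ** 2)           # synthetic data, width 2
te_best = fit_te(wl, spec, range(1, 10), lambda t: t ** 0.5)  # width(4) = 2
```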
NASA Astrophysics Data System (ADS)
Zheng, Li; Yi, Ruan
2009-11-01
Power line inspection and maintenance already benefit from developments in mobile robotics. This paper presents mobile robots capable of crossing obstacles on overhead ground wires. A teleoperated robot realizes inspection and maintenance tasks on power transmission line equipment. The inspection robot is driven by 11 motors and has two arms, two wheels and two claws. The inspection robot is designed to realize the functions of observation, grasping, walking, rolling, turning, rising, and descending. This paper is oriented toward 100% reliable obstacle detection and identification, and sensor fusion to increase the autonomy level. An embedded computer based on the PC/104 bus is chosen as the core of the control system. A visible light camera and a thermal infrared camera are both installed in a programmable pan-and-tilt camera (PPTC) unit. High-quality visual feedback rapidly becomes crucial for human-in-the-loop control and effective teleoperation. The communication system between the robot and the ground station is based on mesh wireless networks in the 700 MHz band. An expert system programmed with Visual C++ is developed to implement the automatic control. Optoelectronic laser sensors and a laser range scanner were installed in the robot for obstacle-navigation control to grasp the overhead ground wires. A novel prototype with careful consideration of mobility was designed to inspect 500 kV power transmission lines. Results of experiments demonstrate that the robot can be applied to execute navigation and inspection tasks.
High resolution Thomson scattering system for steady-state linear plasma sources.
Lee, K Y; Lee, K I; Kim, J H; Lho, T
2018-01-01
The high resolution Thomson scattering system, with 63 points along a 25 mm line, measures the radial electron temperature (Te) and density (ne) in an argon plasma. Using a DC arc source with a lanthanum hexaboride (LaB6) electrode, plasmas with electron temperatures of over 5 eV and densities of 1.5 × 10^19 m^-3 have been measured. The system uses a frequency-doubled (532 nm) Nd:YAG laser with 0.25 J/pulse at 20 Hz. The scattered light is collected and sent to a triple-grating spectrometer via optical fibers, where images are recorded by an intensified charge coupled device (ICCD) camera. Although excellent in stray-light reduction, this spectrometer has a relatively low optical transmission and samples a tiny scattering volume, which requires accumulating a multitude of images. In order to improve photon statistics, pixel binning in the ICCD camera as well as enlarging the intermediate slit-width inside the triple-grating spectrometer have been exploited. In addition, the ICCD camera captures images at 40 Hz while the laser fires at 20 Hz. This operation mode allows us to alternate between background and scattering shot images; by image subtraction, influences from the plasma background are effectively removed. Maximum likelihood estimation using a parameter sweep finds the best-fitting parameters Te and ne for the incoherent scattering spectrum.
Kinect2 - respiratory movement detection study.
Rihana, Sandy; Younes, Elie; Visvikis, Dimitris; Fayad, Hadi
2016-08-01
Radiotherapy is one of the main cancer treatments. It consists of irradiating tumor cells to destroy them while sparing healthy tissue. The treatment is planned based on Computed Tomography (CT) and is delivered over fractions during several days. One of the main challenges is repositioning the patient identically every day so as to irradiate the tumor volume while sparing healthy tissues. Many patient positioning techniques are available, but they are often invasive or inaccurate: they rely on tattooed markers on the patient's skin aligned with a laser system calibrated in the treatment room, or on additional X-ray irradiation. Current systems such as Vision RT use two time-of-flight cameras. Time-of-flight cameras have the advantage of a very fast acquisition rate, which allows real-time monitoring of patient movement and patient repositioning. The purpose of this work is to test the Microsoft Kinect2 camera for potential use in patient positioning and respiration triggering. This type of time-of-flight camera is non-invasive and low-cost, which facilitates its transfer to clinical practice.
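As a sketch of how a depth-camera stream yields a respiration trace for triggering, one can average depth over a chest region of interest in each frame. This is illustrative only and does not use the Kinect SDK; the array shapes and ROI convention are assumptions:

```python
import numpy as np

def respiratory_signal(depth_frames, roi):
    # Mean depth inside a chest ROI per frame: as the chest rises and falls,
    # this scalar traces the breathing cycle. roi = (row0, row1, col0, col1).
    r0, r1, c0, c1 = roi
    return np.array([f[r0:r1, c0:c1].mean() for f in depth_frames])

# Two synthetic depth frames (mm): the chest moves 5 mm between them
frames = [np.full((4, 4), 1000.0), np.full((4, 4), 1005.0)]
sig = respiratory_signal(frames, (0, 2, 0, 2))
```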
Camera, handlens, and microscope optical system for imaging and coupled optical spectroscopy
NASA Technical Reports Server (NTRS)
Mungas, Greg S. (Inventor); Boynton, John (Inventor); Sepulveda, Cesar A. (Inventor); Nunes de Sepulveda, legal representative, Alicia (Inventor); Gursel, Yekta (Inventor)
2012-01-01
An optical system comprising two lens cells, each lens cell comprising multiple lens elements, to provide imaging over a very wide image distance and within a wide range of magnification by changing the distance between the two lens cells. An embodiment also provides scannable laser spectroscopic measurements within the field-of-view of the instrument.
Camera, handlens, and microscope optical system for imaging and coupled optical spectroscopy
NASA Technical Reports Server (NTRS)
Mungas, Greg S. (Inventor); Boynton, John (Inventor); Sepulveda, Cesar A. (Inventor); Nunes de Sepulveda, Alicia (Inventor); Gursel, Yekta (Inventor)
2011-01-01
An optical system comprising two lens cells, each lens cell comprising multiple lens elements, to provide imaging over a very wide image distance and within a wide range of magnification by changing the distance between the two lens cells. An embodiment also provides scannable laser spectroscopic measurements within the field-of-view of the instrument.
A modular approach to detection and identification of defects in rough lumber
Sang Mook Lee; A. Lynn Abbott; Daniel L. Schmoldt
2001-01-01
This paper describes a prototype scanning system that can automatically identify several important defects on rough hardwood lumber. The scanning system utilizes 3 laser sources and an embedded-processor camera to capture and analyze profile and gray-scale images. The modular approach combines the detection of wane (the curved sides of a board, possibly containing...
NASA Astrophysics Data System (ADS)
Behnken, Barry N.; Karunasiri, Gamani; Chamberlin, Danielle; Robrish, Peter; Faist, Jérôme
2008-02-01
Real-time imaging in the terahertz (THz) spectral range was achieved using a 3.6-THz quantum cascade laser (QCL) and an uncooled, 160×120 pixel microbolometer camera fitted with a picarin lens. The noise equivalent temperature difference of the camera in the 1-5 THz frequency range was calculated to be at least 3 K, confirming the need for external THz illumination when imaging in this frequency regime. After evaluating the effects of various operating parameters on laser performance, the QCL was found to perform optimally at 1.9 A in pulsed mode with a 300 kHz repetition rate and 10-20% duty cycle; average output power was approximately 1 mW. Under this scheme, a series of metallic objects were imaged while wrapped in various obscurants. Single-frame and extended video recordings demonstrate strong contrast between metallic materials and plastic, cloth, and paper, supporting the viability of this imaging technology in security screening applications. Thermal effects arising from Joule heating of the laser were found to be the dominant issue affecting output power and image quality; these effects were mitigated by limiting laser pulse widths to 670 ns and operating the system under closed-cycle refrigeration at a temperature of 10 K.
Digital readout for image converter cameras
NASA Astrophysics Data System (ADS)
Honour, Joseph
1991-04-01
There is an increasing need for fast and reliable analysis of recorded sequences from image converter cameras so that experimental information can be readily evaluated without recourse to more time-consuming photographic procedures. A digital readout system has been developed using a randomly triggerable high resolution CCD camera, the output of which is suitable for use with an IBM AT-compatible PC. Within half a second of receipt of a trigger pulse, the frame reformatter displays the image, and transfer to storage media can be readily achieved via the PC and dedicated software. Two software programmes offer different levels of image manipulation, including enhancement routines and parameter calculations with accuracy down to the pixel level. Hard copy prints can be acquired using a specially adapted Polaroid printer; outputs for laser and video printers extend the overall versatility of the system.
Cable and Line Inspection Mechanism
NASA Technical Reports Server (NTRS)
Ross, Terence J. (Inventor)
2003-01-01
An automated cable and line inspection mechanism visually scans the entire surface of a cable as the mechanism travels along the cable's length. The mechanism includes a drive system, a video camera, a mirror assembly for providing the camera with a 360 degree view of the cable, and a laser micrometer for measuring the cable's diameter. The drive system includes an electric motor and a plurality of drive wheels and tension wheels for engaging the cable or line to be inspected, and driving the mechanism along the cable. The mirror assembly includes mirrors that are positioned to project multiple images of the cable on the camera lens, each of which is of a different portion of the cable. A data transceiver and a video transmitter are preferably employed for transmission of video images, data and commands between the mechanism and a remote control station.
Cable and line inspection mechanism
NASA Technical Reports Server (NTRS)
Ross, Terence J. (Inventor)
2003-01-01
An automated cable and line inspection mechanism visually scans the entire surface of a cable as the mechanism travels along the cable's length. The mechanism includes a drive system, a video camera, a mirror assembly for providing the camera with a 360 degree view of the cable, and a laser micrometer for measuring the cable's diameter. The drive system includes an electric motor and a plurality of drive wheels and tension wheels for engaging the cable or line to be inspected, and driving the mechanism along the cable. The mirror assembly includes mirrors that are positioned to project multiple images of the cable on the camera lens, each of which is of a different portion of the cable. A data transceiver and a video transmitter are preferably employed for transmission of video images, data and commands between the mechanism and a remote control station.
Two-Color Laser Speckle Shift Strain Measurement System
NASA Technical Reports Server (NTRS)
Tuma, Margaret L.; Krasowski, Michael J.; Oberle, Lawrence G.; Greer, Lawrence C., III; Spina, Daniel; Barranger, John
1996-01-01
A two-color laser speckle shift strain measurement system based on the technique of Yamaguchi was designed. The dual-wavelength light output from an argon ion laser was coupled into two separate single-mode optical fibers (patchcords). The output of the patchcords is incident on the test specimen (here a structural fiber). Strain on the fiber, in one direction, is produced using an Instron 4502. Shifting interference patterns, or speckle patterns, will be detected at real-time rates using two CCD cameras, with image processing performed by a hardware correlator. Strain detected in fibers with diameters from 21 microns to 143 microns is expected to be resolved to 15 με. This system was designed to be compact and robust and does not require surface preparation of the structural fibers.
Intra-cavity upconversion to 631 nm of images illuminated by an eye-safe ASE source at 1550 nm.
Torregrosa, A J; Maestre, H; Capmany, J
2015-11-15
We report an image wavelength upconversion system. The system mixes an incoming image at around 1550 nm (eye-safe region), illuminated by an amplified spontaneous emission (ASE) fiber source, with a Gaussian beam at 1064 nm generated in a continuous-wave diode-pumped Nd3+:GdVO4 laser. Mixing takes place in a periodically poled lithium niobate (PPLN) crystal placed intra-cavity. The upconverted image obtained by sum-frequency mixing falls around the 631 nm red spectral region, well within the spectral response of standard silicon focal plane array two-dimensional sensors, commonly used in charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) video cameras, and of most image intensifiers. The use of ASE illumination benefits from a noticeable increase in the field of view (FOV) that can be upconverted with regard to using coherent laser illumination. The upconverted power allows us to capture real-time video in a standard nonintensified CCD camera.
Performance of laser guide star adaptive optics at Lick Observatory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S.S.; An, J.; Avicola, K.
1995-07-19
A sodium-layer laser guide star adaptive optics system has been developed at Lawrence Livermore National Laboratory (LLNL) for use on the 3-meter Shane telescope at Lick Observatory. The system is based on a 127-actuator continuous-surface deformable mirror, a Hartmann wavefront sensor equipped with a fast-framing low-noise CCD camera, and a pulsed solid-state-pumped dye laser tuned to the atomic sodium resonance line at 589 nm. The adaptive optics system has been tested on the Shane telescope using natural reference stars, yielding up to a factor of 12 increase in image peak intensity and a factor of 6.5 reduction in image full width at half maximum (FWHM). The results are consistent with theoretical expectations. The laser guide star system has been installed and operated on the Shane telescope, yielding a beam with 22 W average power at 589 nm. Based on experimental data, this laser should generate an 8th magnitude guide star at this site, and the integrated laser guide star adaptive optics system should produce images with Strehl ratios of 0.4 at 2.2 μm in median seeing and 0.7 at 2.2 μm in good seeing.
Experimental research on femto-second laser damaging array CCD cameras
NASA Astrophysics Data System (ADS)
Shao, Junfeng; Guo, Jin; Wang, Ting-feng; Wang, Ming
2013-05-01
Charge Coupled Devices (CCDs) are widely used in military and security applications, such as airborne and ship-based surveillance, satellite reconnaissance and so on. Homeland security requires effective means to negate these advanced overseeing systems. Research shows that CCD-based EO systems can be significantly dazzled or even damaged by high-repetition-rate pulsed lasers. Here, we report on femtosecond laser interaction with a CCD camera, which is probably of great importance in the future. Femtosecond lasers are a relatively new class of laser with unique characteristics, such as extremely short pulse width (1 fs = 10^-15 s), extremely high peak power (1 TW = 10^12 W), and distinctive behavior when interacting with matter. Research on femtosecond laser interaction with materials (metals, dielectrics) clearly indicates that non-thermal effects dominate the process, in marked contrast to long-pulse interaction with matter. First, damage threshold tests were performed with a femtosecond laser acting on the CCD camera. An 800 nm, 500 μJ, 100 fs laser pulse was used to irradiate an interline CCD solid-state image sensor in the experiment. In order to focus the laser energy onto the tiny CCD active cells, an optical system of F/5.6 was used. Sony CCDs were chosen as typical targets. The damage threshold was evaluated from multiple test data. Point damage, line damage and full-array damage were observed as the irradiated pulse energy was continuously increased during the experiment. The point damage threshold was found to be 151.2 mJ/cm2, the line damage threshold 508.2 mJ/cm2, and the full-array damage threshold 5.91 J/cm2. Although the phenomenology is almost the same as for nanosecond laser interaction with CCDs, these damage thresholds are substantially lower than the data obtained from nanosecond laser interaction with CCDs.
At the same time, the electrical characteristics after different degrees of damage were tested with a multimeter. The resistance values between clock signal lines were measured. Comparing the resistance values of the CCD before and after damage, it was found that the resistance decreased significantly between the vertical transfer clock signal lines. The same result was found between the vertical transfer clock signal lines and ground. Finally, the damage position and damage mechanism were analyzed using the above results and SEM morphological experiments. Point damage results from the laser destroying material and shows no macroscopic electrical influence. Line damage is quite different, showing deeper material erosion; more importantly, short circuits were found between vertical clock lines. Under SEM, full-array damage appears even more severe than line damage, although its electrical features do not differ from those of line damage. Further research into the mechanism of femtosecond-laser-induced CCD damage with more advanced tools is anticipated. This research is valuable in EO countermeasure and laser shielding applications.
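The thresholds quoted above are fluences, so a given pulse energy and focal spot size map directly onto a damage class. A small sketch using the thresholds reported in the abstract; the flat-top circular spot and the 100 μm example diameter are illustrative assumptions (the abstract does not give the spot size):

```python
import math

def fluence_j_per_cm2(pulse_energy_j, spot_diameter_cm):
    # Average fluence over a circular flat-top focal spot
    area_cm2 = math.pi * (spot_diameter_cm / 2.0) ** 2
    return pulse_energy_j / area_cm2

# Thresholds reported in the abstract, highest first (J/cm^2)
THRESHOLDS = [(5.91, "full-array damage"),
              (0.5082, "line damage"),
              (0.1512, "point damage")]

def damage_class(fluence):
    for threshold, label in THRESHOLDS:
        if fluence >= threshold:
            return label
    return "no damage"

# The experiment's 500 uJ pulse focused to an assumed 100 um diameter spot
f = fluence_j_per_cm2(500e-6, 0.01)  # ~6.37 J/cm^2, above the full-array threshold
```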
Hinken, David; Schinke, Carsten; Herlufsen, Sandra; Schmidt, Arne; Bothe, Karsten; Brendel, Rolf
2011-03-01
We report in detail on the luminescence imaging setup developed over recent years in our laboratory. In this setup, the luminescence emission of silicon solar cells or silicon wafers is analyzed quantitatively. Charge carriers are excited electrically (electroluminescence) using a power supply for carrier injection, or optically (photoluminescence) using a laser as illumination source. The luminescence emission arising from the radiative recombination of the stimulated charge carriers is measured spatially resolved using a camera. We give details of the various components, including cameras, optical filters for electro- and photoluminescence, the semiconductor laser and the four-quadrant power supply. We compare a silicon charge-coupled device (CCD) camera, a back-illuminated silicon CCD camera with electron-multiplier gain, and a complementary metal oxide semiconductor indium gallium arsenide camera. For the detection of the luminescence emission of silicon, we analyze the dominant noise sources along with the signal-to-noise ratio of all three cameras at different operating conditions.
Advances in detection of diffuse seafloor venting using structured light imaging.
NASA Astrophysics Data System (ADS)
Smart, C.; Roman, C.; Carey, S.
2016-12-01
Systematic, remote detection and high resolution mapping of low temperature diffuse hydrothermal venting is inefficient and not currently tractable using traditional remotely operated vehicle (ROV) mounted sensors. Preliminary results for hydrothermal vent detection using a structured light laser sensor were presented in 2011 and published in 2013 (Smart), with continual advancements occurring in the interim. As the structured light laser passes over active venting, the projected laser line effectively blurs due to the associated turbulence and density anomalies in the vent fluid. The degree of laser disturbance is captured by a camera collecting images of the laser line at 20 Hz. Advancements in the detection of the laser and fluid interaction have included extensive normalization of the collected laser data and the implementation of a support vector machine algorithm to develop a classification routine. The image data collected over a hydrothermal vent field are then labeled as seafloor, bacteria, or a location of venting. The results can then be correlated with stereo images, bathymetry and backscatter data. This sensor is a component of an ROV mounted imaging suite which also includes stereo cameras and a multibeam sonar system. Originally developed for bathymetric mapping, the structured light laser sensor and other imaging suite components are capable of creating visual and bathymetric maps with centimeter-level resolution. Surveys are completed in a standard mowing-the-lawn pattern, finishing a 30 m x 30 m survey with centimeter-level resolution in under an hour. Resulting co-registered data include multibeam and structured light laser bathymetry and backscatter, stereo images and vent detection. This system allows for efficient exploration of areas with diffuse and small point-source hydrothermal venting, increasing the effectiveness of scientific sampling and observation.
Recent vent detection results collected during the 2013-2015 E/V Nautilus seasons will be presented. Smart, C. J. and Roman, C. and Carey, S. N. (2013) Detection of diffuse seafloor venting using structured light imaging, Geochemistry, Geophysics, Geosystems, 14, 4743-4757
Welding pool measurement using thermal array sensor
NASA Astrophysics Data System (ADS)
Cho, Chia-Hung; Hsieh, Yi-Chen; Chen, Hsin-Yi
2015-08-01
Selective laser melting (SLM) is an additive manufacturing (AM) technology that uses a high-power laser beam to melt metal powder in a chamber of inert gas. The process starts by slicing the 3D CAD data, as the digital information source, into layers to create a 2D image of each layer. A melt pool is formed by laser irradiation of the metal powder, which then solidifies into a consolidated structure. In a selective laser melting process, variation of the melt pool affects the yield of a printed three-dimensional product. For three-dimensional parts, the boundary conditions of conductive heat transport have a very large influence on the melt pool dimensions. The melt pool is therefore an important factor in the final quality of the 3D object, and its temperature and geometry must be monitored in additive manufacturing. In this paper, we propose a temperature sensing system composed of an infrared photodiode, a high-speed camera, a band-pass filter, a dichroic beam splitter and a focus lens. Since the infrared photodiode and high-speed camera view the process through the 2D galvanometer scanner and f-theta lens, the temperature sensing system can observe the melt pool at any time, regardless of the movement of the laser spot. To obtain a wide temperature detection range, 500 °C to 2500 °C, the radiation from the melt pool is filtered into a plurality of radiation portions, and the intensity ratio distribution of the radiation portions is calculated using black-body radiation. Experimental results show that the system is suitable for measuring the melt pool temperature.
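Computing a temperature from the ratio of two filtered radiation portions is classic two-color pyrometry under the black-body (Wien) approximation. A sketch, with the two wavelengths chosen for illustration rather than taken from the paper:

```python
import math

H_C_OVER_KB = 0.0143877688  # second radiation constant hc/kB in m*K

def wien_ratio(t_kelvin, lam1, lam2):
    # Intensity ratio I(lam1)/I(lam2) of a black body in the Wien
    # approximation I ~ lam^-5 * exp(-hc / (lam * kB * T))
    return (lam2 / lam1) ** 5 * math.exp(-H_C_OVER_KB / t_kelvin * (1 / lam1 - 1 / lam2))

def temperature_from_ratio(ratio, lam1, lam2):
    # Invert the Wien ratio for temperature -- the essence of ratio pyrometry
    return H_C_OVER_KB * (1 / lam1 - 1 / lam2) / (5 * math.log(lam2 / lam1) - math.log(ratio))

# Round trip at 2000 K with illustrative 700 nm / 900 nm channels
r = wien_ratio(2000.0, 700e-9, 900e-9)
t = temperature_from_ratio(r, 700e-9, 900e-9)  # recovers 2000 K
```

Because the ratio cancels emissivity and geometric factors (to first order), this is well suited to a melt pool observed through a moving scanner.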
100-kHz shot-to-shot broadband data acquisition for high-repetition-rate pump-probe spectroscopy.
Kanal, Florian; Keiber, Sabine; Eck, Reiner; Brixner, Tobias
2014-07-14
Shot-to-shot broadband detection is common in ultrafast pump-probe spectroscopy. Taking advantage of the intensity correlation of subsequent laser pulses improves the signal-to-noise ratio. Finite data readout times of CCD chips in the employed spectrometer and the maximum available speed of mechanical pump-beam choppers typically limit this approach to lasers with repetition rates of a few kHz. For high-repetition-rate (≥100 kHz) systems, one typically averages over a larger number of laser shots, leading to inferior signal-to-noise ratios or longer measurement times. Here we demonstrate broadband shot-to-shot detection in transient absorption spectroscopy with a 100-kHz femtosecond laser system. This is made possible using a home-built high-speed chopper with external laser synchronization and a fast CCD line camera. Shot-to-shot detection can reduce the data acquisition time by two orders of magnitude compared to few-kHz lasers while keeping the same signal-to-noise ratio.
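With the chopper blocking the pump on every other shot, each consecutive shot pair yields one transient absorbance spectrum. A minimal sketch, where the even = pump-on / odd = pump-off ordering is an assumption:

```python
import numpy as np

def delta_a(spectra):
    # Shot-to-shot transient absorbance from a stream of probe spectra.
    # Pairing consecutive shots exploits their intensity correlation:
    # dA = -log10(I_pumped / I_unpumped), averaged over all pairs.
    spectra = np.asarray(spectra, dtype=float)
    pumped, unpumped = spectra[0::2], spectra[1::2]
    return -np.log10(pumped / unpumped).mean(axis=0)

# Two shot pairs on a one-pixel "spectrum": no change, then 50% transmission
da = delta_a([[100.0], [100.0], [50.0], [100.0]])
```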
Multiplexed fluorescence detector system for capillary electrophoresis
Yeung, E.S.; Taylor, J.A.
1996-03-12
A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis. 14 figs.
Multiplexed fluorescence detector system for capillary electrophoresis
Yeung, E.S.; Taylor, J.A.
1994-06-28
A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis. 14 figures.
Neil A. Clark
2001-01-01
A multisensor video system has been developed incorporating a CCD video camera, a 3-axis magnetometer, and a laser-rangefinding device, for the purpose of measuring individual tree stems. While preliminary results show promise, some changes are needed to improve the accuracy and efficiency of the system. Image matching is needed to improve the accuracy of length...
Multiplexed fluorescence detector system for capillary electrophoresis
Yeung, Edward S.; Taylor, John A.
1996-03-12
A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis.
Multiplexed fluorescence detector system for capillary electrophoresis
Yeung, Edward S.; Taylor, John A.
1994-06-28
A fluorescence detection system for capillary electrophoresis is provided wherein the detection system can simultaneously excite fluorescence and substantially simultaneously monitor separations in multiple capillaries. This multiplexing approach involves laser irradiation of a sample in a plurality of capillaries through optical fibers that are coupled individually with the capillaries. The array is imaged orthogonally through a microscope onto a charge-coupled device camera for signal analysis.
High-resolution hyperspectral ground mapping for robotic vision
NASA Astrophysics Data System (ADS)
Neuhaus, Frank; Fuchs, Christian; Paulus, Dietrich
2018-04-01
Recently released hyperspectral cameras use large, mosaiced filter patterns to capture different ranges of the light's spectrum in each of the camera's pixels. Spectral information is sparse, as it is not fully available in each location. We propose an online method that avoids explicit demosaicing of camera images by fusing raw, unprocessed, hyperspectral camera frames inside an ego-centric ground surface map. It is represented as a multilayer heightmap data structure, whose geometry is estimated by combining a visual odometry system with either dense 3D reconstruction or 3D laser data. We use a publicly available dataset to show that our approach is capable of constructing an accurate hyperspectral representation of the surface surrounding the vehicle. We show that in many cases our approach increases spatial resolution over a demosaicing approach, while providing the same amount of spectral information.
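Fusing raw mosaiced pixels into a per-band running mean in the map, instead of demosaicing each frame, can be sketched as follows. The flat-indexed map layout and names are assumptions, not the paper's multilayer heightmap structure:

```python
import numpy as np

def fuse_into_map(map_layers, counts, cells, bands, values):
    # Each raw camera pixel carries only one spectral band (its mosaic
    # filter). We scatter samples into (map cell, band) slots and keep an
    # incremental mean, so spectral coverage accumulates as the vehicle moves.
    for cell, band, val in zip(cells, bands, values):
        counts[cell, band] += 1
        map_layers[cell, band] += (val - map_layers[cell, band]) / counts[cell, band]
    return map_layers

# Two raw samples of band 1 falling in map cell 0 average to 6.0
layers = np.zeros((2, 2))
counts = np.zeros((2, 2))
out = fuse_into_map(layers, counts, [0, 0], [1, 1], [4.0, 8.0])
```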
A new compact, high sensitivity neutron imaging system
NASA Astrophysics Data System (ADS)
Caillaud, T.; Landoas, O.; Briat, M.; Rossé, B.; Thfoin, I.; Philippe, F.; Casner, A.; Bourgade, J. L.; Disdier, L.; Glebov, V. Yu.; Marshall, F. J.; Sangster, T. C.; Park, H. S.; Robey, H. F.; Amendt, P.
2012-10-01
We have developed a new small neutron imaging system (SNIS) diagnostic for the OMEGA laser facility. The SNIS uses a penumbral coded aperture and has been designed to record images from low-yield (10^9-10^10 neutrons) implosions such as those using deuterium as the fuel. This camera was tested at OMEGA in 2009 on a rugby hohlraum energetics experiment, where it recorded an image at a yield of 1.4 × 10^10. The resolution of this image was 54 μm and the camera was located only 4 meters from target chamber centre. We recently improved the instrument by adding a cooled CCD camera. The sensitivity of the new camera has been fully characterized using a linear accelerator and a 60Co γ-ray source. The calibration showed that the signal-to-noise ratio could be improved by using raw binning detection.
A simple optical tweezers for trapping polystyrene particles
NASA Astrophysics Data System (ADS)
Shiddiq, Minarni; Nasir, Zulfa; Yogasari, Dwiyana
2013-09-01
Optical tweezers are an optical trap. For decades, they have been an optical tool that can trap and manipulate particles ranging from the very small, like DNA, to the large, like bacteria. The trapping force comes from the radiation pressure of laser light focused onto a group of particles. Optical tweezers have been used in many research areas such as atomic physics, medical physics, biophysics, and chemistry. Here, a simple optical tweezers setup has been constructed using a modified Leybold laboratory optical microscope. The ocular lens of the microscope was removed to provide access for the laser light and a digital camera. Light from a Coherent diode laser with wavelength λ = 830 nm and power 50 mW is sent through an oil-immersion objective lens with magnification 100× and NA 1.25 to a cell made from microscope slides containing polystyrene particles. Polystyrene particles of sizes 3 μm and 10 μm are used. A CMOS Thorlabs camera, type DCC1545M with USB interface, and a Thorlabs 35 mm camera lens are connected to a desktop computer and used to monitor the trapping and measure the stiffness of the trap. The camera is accompanied by software that enables the user to capture and save images. The images are analyzed using ImageJ and a Scion macro. The polystyrene particles have been trapped successfully. The stiffness of the trap depends on the size of the particles and the power of the laser. The stiffness increases linearly with power and decreases as the particle size increases.
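The abstract does not state how the stiffness is extracted from the camera images; a common approach for video-tracked beads is the equipartition method, sketched below under that assumption:

```python
import numpy as np

# Equipartition estimate of optical-trap stiffness: k = k_B * T / <x^2>,
# with x the tracked particle's displacement from the trap centre.
# (A common method for camera-based trap calibration; the paper does not
# specify which method was used, so this is a sketch.)
KB = 1.380649e-23  # Boltzmann constant, J/K

def trap_stiffness(positions_m, temperature_k=293.0):
    """Estimate trap stiffness (N/m) from tracked positions in metres."""
    x = np.asarray(positions_m)
    var = np.var(x - x.mean())
    return KB * temperature_k / var

# Synthetic check: positions drawn from the Boltzmann distribution of a
# harmonic trap with known stiffness 1e-6 N/m at room temperature.
rng = np.random.default_rng(0)
k_true = 1e-6
sigma = np.sqrt(KB * 293.0 / k_true)
pos = rng.normal(0.0, sigma, 100_000)
k_est = trap_stiffness(pos)
```

Because the variance of a trapped bead shrinks as laser power rises, this estimator directly reproduces the linear stiffness-versus-power trend the abstract reports.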
Panoramic 3D Reconstruction by Fusing Color Intensity and Laser Range Data
NASA Astrophysics Data System (ADS)
Jiang, Wei; Lu, Jian
Technologies for capturing panoramic (360-degree) three-dimensional information in a real environment have many applications in fields such as virtual and complex reality, security, robot navigation, and so forth. In this study, we examine an acquisition device constructed of a regular CCD camera and a 2D laser range scanner, along with a technique for panoramic 3D reconstruction using a data fusion algorithm based on an energy minimization framework. The acquisition device can capture two types of data of a panoramic scene without occlusion between the two sensors: a dense spatio-temporal volume from the camera and distance information from the laser scanner. We resample the dense spatio-temporal volume to generate a dense multi-perspective panorama with spatial resolution equal to that of the original images acquired with the regular camera, and also estimate a dense panoramic depth map corresponding to the generated reference panorama by extracting trajectories from the dense spatio-temporal volume with a selecting camera. Moreover, to determine distance information robustly, we propose a data fusion algorithm embedded in an energy minimization framework that incorporates active depth measurements from the 2D laser range scanner and passive geometry reconstruction from the image sequence obtained with the CCD camera. Thereby, measurement precision and robustness can be improved beyond those available from conventional methods using either passive geometry reconstruction (stereo vision) or a laser range scanner alone. Experimental results using both synthetic and actual images show that our approach can produce high-quality panoramas and perform accurate 3D reconstruction in a panoramic environment.
Drug injection into fat tissue with a laser based microjet injector
NASA Astrophysics Data System (ADS)
Han, Tae-hee; Hah, Jung-moo; Yoh, Jack J.
2011-05-01
We have investigated a new micro drug jet injector using laser pulse energy. An infrared laser beam of high energy (~3 J/pulse) is focused inside a driving fluid in a small chamber. The pulse then induces various energy-releasing processes and generates fast microjets through a micronozzle. The elastic membrane of this system plays an important role in transferring mechanical pressure and protecting the drug from heat release. In this paper, we offer sequential images of microjet generation taken by a high-speed camera as evidence of multiple injections from a single pulse. Furthermore, we test the proposed system by penetrating soft animal tissues in order to evaluate its feasibility as an advanced transdermal drug delivery method.
Upwelling Radiance at 976 nm Measured from Space Using a CCD Camera
NASA Technical Reports Server (NTRS)
Biswas, Abhijit; Kovalik, Joseph M.; Oaida, Bogdan V.; Abrahamson, Matthew J.; Wright, Malcolm W.
2015-01-01
The Optical Payload for Lasercomm Science (OPALS) Flight System on-board the International Space Station uses a charge coupled device (CCD) camera for receiving a beacon laser from Earth. Relative measurements of the background contributed by upwelling radiance under diverse illumination conditions and varying terrain are presented. In some cases clouds in the field-of-view allowed a comparison of terrestrial and cloud-top upwelling radiance. In this paper we report these measurements and examine the extent of agreement with atmospheric model predictions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.
2015-01-12
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high-energy-density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for use in a production environment.
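As a rough illustration of the TPS machinery (the production NIF algorithm itself is not public), a minimal spline fit between comb fiducials on the distorted record and their positions on an ideal linear sweep might look like the following; all coordinates are invented for illustration:

```python
import numpy as np

# Minimal 2D thin-plate-spline warp: fit one spline per output axis so that
# distorted fiducial positions map onto their ideal sweep positions, then
# evaluate the fitted warp at arbitrary pixel coordinates.
def tps_fit(src, dst):
    """Fit a TPS map src -> dst; returns the (n+3, 2) coefficient matrix."""
    n = len(src)
    d2 = ((src[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    with np.errstate(divide='ignore', invalid='ignore'):
        K = np.where(d2 > 0.0, 0.5 * d2 * np.log(d2), 0.0)  # U(r) = r^2 log r
    P = np.hstack([np.ones((n, 1)), src])                    # affine part
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    return np.linalg.solve(A, b)

def tps_eval(src, w, pts):
    """Apply the fitted warp to pixel coordinates pts."""
    d2 = ((pts[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    with np.errstate(divide='ignore', invalid='ignore'):
        K = np.where(d2 > 0.0, 0.5 * d2 * np.log(d2), 0.0)
    P = np.hstack([np.ones((len(pts), 1)), pts])
    return K @ w[:len(src)] + P @ w[len(src):]

# Where comb pulses appear on the (distorted) streak record ...
distorted = np.array([[10.0, 12.0], [50.0, 55.0], [90.0, 96.0],
                      [10.0, 92.0], [50.0, 131.0], [90.0, 171.0],
                      [10.0, 168.0], [50.0, 211.0], [90.0, 249.0]])
# ... versus where an ideal linear sweep would put them.
ideal = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 90.0],
                  [10.0, 90.0], [50.0, 130.0], [90.0, 170.0],
                  [10.0, 170.0], [50.0, 210.0], [90.0, 250.0]])
w = tps_fit(distorted, ideal)
corrected = tps_eval(distorted, w, distorted)  # exact at the fiducials
```

Once fitted, the same `tps_eval` call de-warps every pixel of a data image, which is the sense in which the comb calibration transfers to the shock-timing data.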
Design of a Multi-Sensor Cooperation Travel Environment Perception System for Autonomous Vehicle
Chen, Long; Li, Qingquan; Li, Ming; Zhang, Liang; Mao, Qingzhou
2012-01-01
This paper describes the environment perception system designed for the intelligent vehicle SmartV-II, which won the 2010 Future Challenge. This system uses the cooperation of multiple lasers and cameras to realize several functions necessary for autonomous navigation: road curb detection, lane detection, and traffic sign recognition. Multiple single-scan lasers are integrated to detect the road curb based on a Z-variance method. Vision-based lane detection is realized by a two-scan method combined with an image model. A Haar-like-feature-based method is applied for traffic sign detection, and SURF matching is used for sign classification. The results of experiments validate the effectiveness of the proposed algorithms and the whole system.
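The abstract gives no detail on the Z-variance method; one plausible reading, sketched here, is to grid the laser points and flag cells whose height variance jumps, marking curb candidates. The cell size and threshold below are invented for illustration:

```python
import numpy as np

# Z-variance curb candidate detection (assumed reading of the method):
# flat road cells have near-zero height variance, cells spanning a curb
# step show a variance spike.
def curb_cells(points, cell=0.5, z_var_thresh=4e-4):
    """points: iterable of (x, y, z) laser hits; returns the set of
    (i, j) grid cells whose height variance exceeds the threshold."""
    cells = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        cells.setdefault(key, []).append(z)
    return {k for k, zs in cells.items()
            if len(zs) >= 3 and np.var(zs) > z_var_thresh}

# Flat road (z = 0) plus a 0.15 m curb step inside one cell.
road = [(x, 0.2, 0.0) for x in np.linspace(0.0, 2.0, 40)]
curb = [(2.1, 0.2, 0.0), (2.2, 0.2, 0.15), (2.3, 0.2, 0.15)]
hits = curb_cells(road + curb)
```

Only the cell containing the height step is flagged; purely flat cells pass the variance test.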
Performances Of The New Streak Camera TSN 506
NASA Astrophysics Data System (ADS)
Nodenot, P.; Imhoff, C.; Bouchu, M.; Cavailler, C.; Fleurot, N.; Launspach, J.
1985-02-01
The number of streak cameras used in research laboratories has increased continuously during the past years. The growth of this type of equipment is due to the development of various measurement techniques in the nanosecond and picosecond range. Among the many different applications, we would mention detonics chronometry measurement, measurement of the speed of matter by means of Doppler-laser interferometry, and laser and plasma diagnostics associated with laser-matter interaction. The old range of cameras has been remodelled, in order to standardize and rationalize the production of ultrafast cinematography instruments, to produce a single camera known as the TSN 506. The TSN 506 is composed of an electronic control unit built around the image converter tube; it can be fitted with a nanosecond sweep circuit covering the whole range from 1 ms to 200 ns, or with a picosecond circuit providing streak durations from 1 to 100 ns. We describe the main electronic and opto-electronic performance of the TSN 506 operating in these two temporal regimes.
Low Cost and Efficient 3d Indoor Mapping Using Multiple Consumer Rgb-D Cameras
NASA Astrophysics Data System (ADS)
Chen, C.; Yang, B. S.; Song, S.
2016-06-01
Driven by the miniaturization and lightweight design of positioning and remote sensing sensors, as well as the urgent need to fuse indoor and outdoor maps for next-generation navigation, 3D indoor mapping from mobile scanning is a hot research and application topic. The point clouds with auxiliary data such as colour and infrared images derived from a 3D indoor mobile mapping suite can be used in a variety of novel applications, including indoor scene visualization, automated floorplan generation, gaming, reverse engineering, navigation, simulation, etc. State-of-the-art 3D indoor mapping systems equipped with multiple laser scanners produce accurate point clouds of building interiors containing billions of points. However, these laser-scanner-based systems are mostly expensive and not portable. Low-cost consumer RGB-D cameras provide an alternative way to address the core challenge of indoor mapping, which is capturing the detailed underlying geometry of building interiors. Nevertheless, RGB-D cameras have a very limited field of view, resulting in low efficiency in the data collection stage and incomplete datasets missing major building structures (e.g. ceilings, walls). Attempting to collect a complete scene without data gaps using a single RGB-D camera is not technically sound because of the large amount of human labour and the number of position parameters that need to be solved. To find an efficient and low-cost way to solve 3D indoor mapping, in this paper we present an indoor mapping suite prototype built upon a novel calibration method that calibrates the internal and external parameters of multiple RGB-D cameras. Three Kinect sensors are mounted on a rig with different view directions to form a large field of view.
The calibration procedure is threefold: (1) the internal parameters of the colour and infrared cameras inside each Kinect are calibrated using a chessboard pattern; (2) the external parameters between the colour and infrared cameras inside each Kinect are calibrated using a chessboard pattern; (3) the external parameters between the Kinects are first calculated using a pre-set calibration field and further refined by an iterative closest point algorithm. Experiments are carried out to validate the proposed method on RGB-D datasets collected by the indoor mapping suite prototype. The effectiveness and accuracy of the proposed method are evaluated by comparing the point clouds derived from the prototype with ground-truth data collected by a commercial terrestrial laser scanner at ultra-high density. The overall analysis of the results shows that the proposed method achieves seamless integration of multiple point clouds from different RGB-D cameras collected at 30 frames per second.
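Step 3's coarse estimate from the calibration field amounts to fitting a rigid transform between matched 3D points seen by two cameras; a standard closed-form (SVD/Kabsch) solution, which the ICP stage would then refine, can be sketched as:

```python
import numpy as np

# Closed-form rigid alignment (Kabsch): find R, t minimizing
# sum || dst_i - (R @ src_i + t) ||^2 over matched 3D point pairs.
def rigid_transform(src, dst):
    """src, dst: (N, 3) matched points; returns rotation R and translation t."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)              # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

# Check on a known transform (rotation about z plus a translation).
rng = np.random.default_rng(1)
src = rng.normal(size=(20, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
dst = src @ R_true.T + t_true
R_est, t_est = rigid_transform(src, dst)
```

With noise-free correspondences the recovery is exact; with real calibration-field data it gives the initial pose that ICP then polishes.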
Simulation of laser beam reflection at the sea surface modeling and validation
NASA Astrophysics Data System (ADS)
Schwenger, Frédéric; Repasi, Endre
2013-06-01
A 3D simulation of the reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation is suitable for the pre-calculation of images for cameras operating in different spectral wavebands (visible, short-wave infrared) for a bistatic configuration of laser source and receiver under different atmospheric conditions. In the visible waveband, the calculated detected total power of reflected laser light from a 660 nm laser source is compared with data collected in a field trial. Our computer simulation comprises the 3D simulation of a maritime scene (open sea/clear sky) and the simulation of the laser beam reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves. To predict the view of a camera, the sea surface radiance must be calculated for the specific waveband. Additionally, the radiances of laser light specularly reflected at the wind-roughened sea surface are modeled considering an analytical statistical sea surface BRDF (bidirectional reflectance distribution function). Validation of simulation results is a prerequisite before applying the computer simulation to maritime laser applications. For validation purposes, data (images and meteorological data) were selected from field measurements, using a 660 nm cw laser diode to produce laser beam reflection at the water surface and recording images with a TV camera. The validation is done by numerical comparison of measured total laser power extracted from recorded images with the corresponding simulation results. The results of the comparison are presented for different incident (zenith/azimuth) angles of the laser beam.
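The statistical core of such a sea-surface BRDF is a wind-dependent Gaussian distribution of facet slopes; a Cox-Munk-style sketch (the variance law is the classic published fit, not necessarily the one used in the paper) is:

```python
import numpy as np

# Isotropic Gaussian slope statistics for a wind-roughened sea surface.
# The probability of a specular glint toward the receiver is governed by the
# density of facets with the required slope (sx, sy).
def slope_pdf(sx, sy, wind_speed):
    """Slope probability density; Cox-Munk-style mean-square slope law."""
    var = 0.003 + 0.00512 * wind_speed   # total mean-square slope vs. wind
    return np.exp(-(sx ** 2 + sy ** 2) / var) / (np.pi * var)

# Density of horizontal facets (mirror-like reflection) at 5 m/s wind.
p_flat = slope_pdf(0.0, 0.0, 5.0)
```

As wind speed grows, the slope variance grows, flattening the density: glints spread over a wider patch of surface but each direction receives less, which is the qualitative behaviour a glint-validation trial probes.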
Design of an ROV-based lidar for seafloor monitoring
NASA Astrophysics Data System (ADS)
Harsdorf, Stefan; Janssen, Manfred; Reuter, Rainer; Wachowicz, Bernhard
1997-05-01
In recent years, accidents of ships with chemical cargo have led to strong impacts on the marine ecosystem, and to risks for pollution control and clean-up teams. In order to enable a fast, safe, and efficient reaction, a new optical instrument has been designed for the inspection of objects on the seafloor by range-gated scattered-light images, as well as for the detection of substances by measuring the laser-induced emission on the seafloor and within the water column. This new lidar is operated as a payload of a remotely operated vehicle (ROV). A Nd:YAG laser is employed as the light source of the lidar. In the video mode, the submarine lidar system uses the 2nd harmonic laser pulse to illuminate the seafloor. Elastically scattered and reflected light is collected with a gateable intensified CCD camera. The beam divergence of the laser is the same as the camera field-of-view. Synchronization of laser emission and camera gate time makes it possible to suppress backscattered light from the water column and to record only the light backscattered by the object. This results in a contrast-enhanced video image which increases the visibility range in turbid water up to four times. Substances seeping out from a container are often invisible in video images because of their low contrast. Therefore, a fluorescence lidar mode is integrated into the submarine lidar. The 3rd harmonic Nd:YAG laser pulse is applied, and the emission response of the water body between the ROV and the seafloor, and of the seafloor itself, is recorded at variable wavelengths with maximum depth resolution. Target selection is realized by a 2D scanner, which allows targets within the range-gated image to be selected for a measurement of fluorescence. The analysis of the time- and spectrally-resolved signals permits the detection, exact location, and classification of fluorescent and/or absorbing substances.
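The gating principle is easy to quantify: the camera gate opens only after the round-trip travel time to the target, so the earlier-arriving water-column backscatter is rejected. A sketch, with the refractive index of seawater as an assumed value:

```python
# Gate timing for range-gated underwater imaging: open the intensified CCD
# only when light returning from the target range arrives. Water-column
# backscatter returns earlier and falls outside the gate.
C = 299_792_458.0  # speed of light in vacuum, m/s

def gate_delay_ns(target_range_m, n_water=1.34):
    """Round-trip delay (ns) to a target at the given range in water;
    n_water = 1.34 is an assumed refractive index for seawater."""
    return 2.0 * target_range_m * n_water / C * 1e9

delay = gate_delay_ns(5.0)  # e.g. seafloor 5 m below the ROV
```

For a seafloor 5 m away the gate opens roughly 45 ns after the laser pulse; scatter from the first metres of water arrives tens of nanoseconds earlier and is never integrated.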
Arabski, Michał; Wasik, Sławomir; Piskulak, Patrycja; Góźdź, Natalia; Slezak, Andrzej; Kaca, Wiesław
2011-01-01
The aim of this study was to analyse the release of antibiotics (ampicillin, streptomycin, ciprofloxacin or colistin) from agarose gel by spectrophotometry and laser interferometry. The interferometric system consisted of a Mach-Zehnder interferometer with a He-Ne laser, a TV-CCD camera, a computerised data acquisition system and a gel system. The gel system under study consists of two cuvettes. We filled the lower cuvette with an aqueous 1% agarose solution with the antibiotics at initial concentrations in the range of 0.12-2 mg/ml for spectrophotometric analysis or 0.05-0.5 mg/ml for laser interferometry, while the upper cuvette contained pure water. The diffusion was analysed from 120 to 2400 s with a time interval of Δt = 120 s by both methods. We observed that 0.25-1 mg/ml and 0.05 mg/ml are the minimal initial concentrations detectable by the spectrophotometric and laser interferometry methods, respectively. Additionally, we observed differences in the kinetics of antibiotic diffusion from the gel as measured by the two methods. In conclusion, the laser interferometric method is a useful tool for studies of antibiotic release from agarose gel, especially for substances that are not fully soluble in water, for example colistin.
Automated grading, upgrading, and cuttings prediction of surfaced dry hardwood lumber
Sang-Mook Lee; Phil Araman; A.Lynn Abbott; Matthew F. Winn
2010-01-01
This paper concerns the scanning, sawing, and grading of kiln-dried hardwood lumber. A prototype system is described that uses laser sources and a video camera to scan boards. The system automatically detects defects and wane, searches for optimal sawing solutions, and then estimates the grades of the boards that would result. The goal is to derive maximum commercial...
Hardwood lumber scanning tests to determine NHLA lumber grades
Philip A. Araman; Ssang-Mook Lee; A. Lynn Abbott; Matthew F. Winn
2011-01-01
This paper concerns the scanning and grading of kiln-dried hardwood lumber. A prototype system is described that uses laser sources and a video camera to scan boards. The system automatically detects defects and wane, grades the boards, and then searches for higher value boards within the original board. The goal is to derive maximum commercial value based on current...
Microscope self-calibration based on micro laser line imaging and soft computing algorithms
NASA Astrophysics Data System (ADS)
Apolinar Muñoz Rodríguez, J.
2018-06-01
A technique to perform microscope self-calibration via micro laser line and soft computing algorithms is presented. In this technique, the microscope vision parameters are computed by means of soft computing algorithms based on laser line projection. To implement the self-calibration, a microscope vision system is constructed by means of a CCD camera and a 38 μm laser line. From this arrangement, the microscope vision parameters are represented via Bezier approximation networks, which are accomplished through the laser line position. In this procedure, a genetic algorithm determines the microscope vision parameters by means of laser line imaging. Also, the approximation networks compute the three-dimensional vision by means of the laser line position. Additionally, the soft computing algorithms re-calibrate the vision parameters when the microscope vision system is modified during the vision task. The proposed self-calibration improves accuracy of the traditional microscope calibration, which is accomplished via external references to the microscope system. The capability of the self-calibration based on soft computing algorithms is determined by means of the calibration accuracy and the micro-scale measurement error. This contribution is corroborated by an evaluation based on the accuracy of the traditional microscope calibration.
Standoff detection of explosive molecules using nanosecond gated Raman spectroscopy
NASA Astrophysics Data System (ADS)
Chung, Jin Hyuk; Cho, Soo Gyeong
2013-06-01
Recently, improvised explosive devices (IEDs) have become a serious threat in many countries. One approach to alleviating this threat is standoff detection of the explosive molecules used in IEDs. Raman spectroscopy is a promising method among the many technologies under research to achieve this goal. It provides unique information about the target materials, through which the ingredients used in IEDs can be analyzed and identified. The main problem of standoff Raman spectroscopic detection is the large background noise hiding the weak Raman signals from the target samples. Typical background noise comes from ambient fluorescent lights indoors and from sunlight outdoors, whose intensities are usually much larger than that of the Raman scattering from the sample. Under proper conditions, using a pulsed laser and an ICCD camera with nanosecond pulse width and gating technology, we succeeded in separating and removing these background noises from the Raman signals. For this experiment, we built an optical system for standoff detection of explosive molecules. We use a 532 nm, 10 Hz, Q-switched Nd:YAG laser as the light source, and an ICCD camera triggered by the laser Q-switching time with a gate delay matched to the flight time of the Raman signal from the target materials. Our detection system is successfully applied to detect and identify more than 20 ingredients of IEDs, including TNT, RDX, and HMX, located 10 to 54 meters away from the system.
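The timing numbers behind this scheme are simple to sketch: the gate delay follows from the round-trip time of flight, and the ambient rejection from the tiny duty cycle of a nanosecond gate. The 5 ns gate width below is illustrative, not a value from the paper:

```python
# Gated standoff Raman: trigger the ICCD so its nanosecond gate opens when
# the Raman return arrives, rejecting CW background (sunlight, room lights)
# outside the gate.
C = 299_792_458.0  # speed of light, m/s

def gate_delay_ns(standoff_m):
    """Round-trip light travel time to the target, in ns."""
    return 2.0 * standoff_m / C * 1e9

def background_suppression(gate_ns, rep_rate_hz, exposure_s=1.0):
    """Fraction of CW background admitted versus an ungated exposure."""
    return gate_ns * 1e-9 * rep_rate_hz * exposure_s

d54 = gate_delay_ns(54.0)                 # farthest target in the paper
supp = background_suppression(5.0, 10.0)  # assumed 5 ns gate at 10 Hz
```

A 54 m target needs a gate delay of about 360 ns, and a 5 ns gate firing at 10 Hz admits only about 5e-8 of the continuous background, which is why weak Raman lines survive in daylight.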
Ultraviolet laser beam monitor using radiation responsive crystals
McCann, Michael P.; Chen, Chung H.
1988-01-01
An apparatus and method for monitoring an ultraviolet laser beam includes disposing in the path of an ultraviolet laser beam a substantially transparent crystal that will produce a color pattern in response to ultraviolet radiation. The crystal is exposed to the ultraviolet laser beam and a color pattern is produced within the crystal corresponding to the laser beam intensity distribution therein. The crystal is then exposed to visible light, and the color pattern is observed by means of the visible light to determine the characteristics of the laser beam that passed through the crystal. In this manner, a perpendicular cross-sectional intensity profile and a longitudinal intensity profile of the ultraviolet laser beam may be determined. The observation of the color pattern may be made with forward or back scattered light and may be made with the naked eye or with optical systems such as microscopes and television cameras.
Pulsed laser linescanner for a backscatter absorption gas imaging system
Kulp, Thomas J.; Reichardt, Thomas A.; Schmitt, Randal L.; Bambha, Ray P.
2004-02-10
An active (laser-illuminated) imaging system is described that is suitable for use in backscatter absorption gas imaging (BAGI). A BAGI imager operates by imaging a scene as it is illuminated with radiation that is absorbed by the gas to be detected. Gases become "visible" in the image when they attenuate the illumination, creating a shadow in the image. This disclosure describes a BAGI imager that operates in a linescanned manner using a high-repetition-rate pulsed laser as its illumination source. The format of this system allows differential imaging, in which the scene is illuminated with light at two or more wavelengths: one or more absorbed by the gas and one or more not absorbed. The system is designed to accomplish imaging in a manner that is insensitive to motion of the camera, so that it can be held in the hand of an operator or operated from a moving vehicle.
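The differential-imaging idea can be sketched as a per-pixel ratio of the on- and off-wavelength images, which cancels surface reflectance and leaves the gas optical depth. This is an idealized model for illustration, not the patent's implementation:

```python
import numpy as np

# Differential BAGI: the scene reflectance appears identically in the image
# taken at the absorbed ("on") wavelength and the non-absorbed ("off")
# wavelength, so their log-ratio isolates the gas absorption.
def optical_depth(on_img, off_img, eps=1e-12):
    """Per-pixel gas optical depth from on/off wavelength images."""
    on = np.asarray(on_img, dtype=float)
    off = np.asarray(off_img, dtype=float)
    return -np.log((on + eps) / (off + eps))  # eps guards empty pixels

off = np.full((2, 2), 100.0)                     # uniform background return
on = np.array([[100.0, 100.0], [100.0, 36.8]])   # one pixel shadowed by gas
tau = optical_depth(on, off)
```

Gas-free pixels come out at zero optical depth regardless of their reflectance; the shadowed pixel reports roughly one optical depth of absorption.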
NASA Technical Reports Server (NTRS)
Otaguro, W. S.; Kesler, L. O.; Land, K. C.; Rhoades, D. E.
1987-01-01
An intelligent tracker capable of robotic applications requiring guidance and control of platforms, robotic arms, and end effectors has been developed. This packaged system, capable of supervised autonomous robotic functions, is partitioned into a multiple-processor/parallel-processing configuration. The system currently interfaces to cameras but also has the capability to use three-dimensional inputs from scanning laser rangers. The inputs are fed into an image processing and tracking section where the camera inputs are conditioned for the multiple tracker algorithms. An executive section monitors the image processing and tracker outputs and performs all the control and decision processes. The present architecture of the system is presented with a discussion of its evolutionary growth for space applications. An autonomous rendezvous demonstration of this system was performed last year. More realistic demonstrations now being planned are discussed.
Performance of Laser Megajoule’s x-ray streak camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuber, C., E-mail: celine.zuber@cea.fr; Bazzoli, S.; Brunel, P.
2016-11-15
A prototype of a picosecond x-ray streak camera has been developed and tested by Commissariat à l'Énergie Atomique et aux Énergies Alternatives to provide plasma-diagnostic support for the Laser Megajoule. We report on the measured performance of this streak camera, which almost fulfills the requirements: 50-μm spatial resolution over a 15-mm field in the photocathode plane, 17-ps temporal resolution in a 2-ns timebase, a detection threshold lower than 625 nJ/cm² in the 0.05-15 keV spectral range, and a dynamic range greater than 100.
Smartphone snapshot mapping of skin chromophores under triple-wavelength laser illumination
NASA Astrophysics Data System (ADS)
Spigulis, Janis; Oshina, Ilze; Berzina, Anna; Bykov, Alexander
2017-09-01
Chromophore distribution maps are useful tools for skin malformation severity assessment and for monitoring of skin recovery after burns, surgeries, and other interventions. The chromophore maps can be obtained by processing several spectral images of skin, e.g., captured by hyperspectral or multispectral cameras over seconds or even minutes. To avoid motion artifacts and simplify the procedure, a single-snapshot technique for mapping melanin, oxyhemoglobin, and deoxyhemoglobin of in-vivo skin by a smartphone under simultaneous three-wavelength (448-532-659 nm) laser illumination is proposed and examined. Three monochromatic spectral images related to the illumination wavelengths were extracted from the smartphone camera RGB image data set with respect to crosstalk between the RGB detection bands. Spectral images were further processed according to Beer's law in a three-chromophore approximation. Photon absorption path lengths in skin at the exploited wavelengths were estimated by means of Monte Carlo simulations. The technique was validated clinically on three kinds of skin lesions: nevi, hemangiomas, and seborrheic keratosis. The design of the developed add-on laser illumination system, image-processing details, and the results of clinical measurements are presented and discussed.
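With extinction coefficients and path lengths in hand, the per-pixel three-chromophore Beer's-law retrieval reduces to a 3×3 linear solve. The extinction matrix below is made up for illustration; real values come from tabulated extinction spectra and the Monte Carlo path lengths the abstract mentions:

```python
import numpy as np

# Beer's law, three-chromophore approximation: absorbance at the three laser
# wavelengths is a linear mix of melanin, oxyhemoglobin (HbO2) and
# deoxyhemoglobin (Hb) contributions, so concentrations follow per pixel
# from a 3x3 solve. E values are illustrative only.
E = np.array([[1.2, 0.3, 0.4],   # 448 nm: melanin, HbO2, Hb (arbitrary units)
              [0.8, 0.9, 0.5],   # 532 nm
              [0.5, 0.2, 0.9]])  # 659 nm
c_true = np.array([0.6, 0.3, 0.1])
A = E @ c_true                   # simulated absorbances at the 3 wavelengths
c_est = np.linalg.solve(E, A)    # recovered chromophore concentrations
```

Repeating the solve for every pixel of the three extracted spectral images yields the melanin, HbO2, and Hb maps.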
NASA Astrophysics Data System (ADS)
Göhler, Benjamin; Lutzmann, Peter
2017-10-01
Primarily, a laser gated-viewing (GV) system provides range-gated 2D images without any range resolution within the range gate. By combining two GV images with slightly different gate positions, 3D information within a part of the range gate can be obtained. The depth resolution is higher (super-resolution) than the minimal gate shift step size in a tomographic sequence of the scene. For a state-of-the-art system with a typical frame rate of 20 Hz, the time difference between the two required GV images is 50 ms which may be too long in a dynamic scenario with moving objects. Therefore, we have applied this approach to the reset and signal level images of a new short-wave infrared (SWIR) GV camera whose read-out integrated circuit supports correlated double sampling (CDS) actually intended for the reduction of kTC noise (reset noise). These images are extracted from only one single laser pulse with a marginal time difference in between. The SWIR GV camera consists of 640 x 512 avalanche photodiodes based on mercury cadmium telluride with a pixel pitch of 15 μm. A Q-switched, flash lamp pumped solid-state laser with 1.57 μm wavelength (OPO), 52 mJ pulse energy after beam shaping, 7 ns pulse length and 20 Hz pulse repetition frequency is used for flash illumination. In this paper, the experimental set-up is described and the operating principle of CDS is explained. The method of deriving super-resolution depth information from a GV system by using CDS is introduced and optimized. Further, the range accuracy is estimated from measured image data.
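For an idealized rectangular gate, the position of a return inside the overlap of the two gate positions follows from the intensity ratio of the two images; a toy version of that ratio model (a deliberate simplification, not the paper's algorithm) is:

```python
# Depth super-resolution from two gated-viewing images with shifted gates:
# within the overlap region, the fraction of the return captured by the
# later gate encodes where inside the overlap the reflecting surface lies.
def depth_in_overlap(i1, i2, gate_shift_m):
    """Position within the overlap (metres) from the intensities of the
    earlier (i1) and later (i2) gated images; idealized rectangular gates."""
    return i2 / (i1 + i2) * gate_shift_m

z = depth_in_overlap(3.0, 1.0, 2.0)  # gates shifted by an assumed 2 m
```

Because the ratio varies continuously, the recovered depth is finer than the gate shift step itself, which is the super-resolution effect the paper exploits with the CDS reset/signal image pair.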
A Novel Laser and Video-Based Displacement Transducer to Monitor Bridge Deflections
Vicente, Miguel A.; Gonzalez, Dorys C.; Minguez, Jesus; Schumacher, Thomas
2018-01-01
The measurement of static vertical deflections on bridges continues to be a first-level technological challenge. These data are of great interest, especially for the case of long-term bridge monitoring; in fact, they are perhaps more valuable than any other measurable parameter. This is because material degradation processes and changes of the mechanical properties of the structure due to aging (for example creep and shrinkage in concrete bridges) have a direct impact on the exhibited static vertical deflections. This paper introduces and evaluates an approach to monitor displacements and rotations of structures using a novel laser and video-based displacement transducer (LVBDT). The proposed system combines the use of laser beams, LED lights, and a digital video camera, and was especially designed to capture static and slow-varying displacements. Contrary to other video-based approaches, the camera is located on the bridge, hence allowing to capture displacements at one location. Subsequently, the sensing approach and the procedure to estimate displacements and the rotations are described. Additionally, laboratory and in-service field testing carried out to validate the system are presented and discussed. The results demonstrate that the proposed sensing approach is robust, accurate, and reliable, and also inexpensive, which are essential for field implementation. PMID:29587380
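A typical first step in such video-based displacement sensing, sketched here as an assumption rather than the paper's documented pipeline, is locating the bright laser/LED spot by an intensity-weighted centroid and tracking its drift between frames:

```python
import numpy as np

# Sub-pixel spot localization: the intensity-weighted centroid of a bright
# laser/LED spot in a grayscale frame; displacement is the centroid drift
# between frames times the pixel-to-millimetre scale.
def spot_centroid(img):
    """Return the (row, col) intensity-weighted centroid of a frame."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

frame = np.zeros((5, 5))
frame[2, 3] = 4.0                    # spot peak
frame[2, 2] = frame[2, 4] = 2.0      # spot shoulders
r, c = spot_centroid(frame)
```

Weighting by intensity gives sub-pixel precision, which is what makes slow, small static deflections resolvable with an ordinary video camera.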
A Novel Laser and Video-Based Displacement Transducer to Monitor Bridge Deflections.
Vicente, Miguel A; Gonzalez, Dorys C; Minguez, Jesus; Schumacher, Thomas
2018-03-25
The measurement of static vertical deflections on bridges continues to be a first-level technological challenge. These data are of great interest, especially for the case of long-term bridge monitoring; in fact, they are perhaps more valuable than any other measurable parameter. This is because material degradation processes and changes of the mechanical properties of the structure due to aging (for example creep and shrinkage in concrete bridges) have a direct impact on the exhibited static vertical deflections. This paper introduces and evaluates an approach to monitor displacements and rotations of structures using a novel laser and video-based displacement transducer (LVBDT). The proposed system combines the use of laser beams, LED lights, and a digital video camera, and was especially designed to capture static and slow-varying displacements. Contrary to other video-based approaches, the camera is located on the bridge, hence allowing to capture displacements at one location. Subsequently, the sensing approach and the procedure to estimate displacements and the rotations are described. Additionally, laboratory and in-service field testing carried out to validate the system are presented and discussed. The results demonstrate that the proposed sensing approach is robust, accurate, and reliable, and also inexpensive, which are essential for field implementation.
Miniaturized GPS/MEMS IMU integrated board
NASA Technical Reports Server (NTRS)
Lin, Ching-Fang (Inventor)
2012-01-01
This invention documents the research and development of a miniaturized GPS/MEMS IMU integrated navigation system. The miniaturized GPS/MEMS IMU integrated navigation system is presented, and a Laser Dynamic Range Imager (LDRI) based alignment algorithm for space applications is discussed. Two navigation cameras are also included to measure range and range rate, which can be integrated into the GPS/MEMS IMU system to enhance the navigation solution.
Holostrain system: a powerful tool for experimental mechanics
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.
1992-09-01
A portable holographic interferometer that can be used to measure displacements and strains in all kinds of mechanical components and structures is described. The Holostrain system captures images on a TV camera that detects interference patterns produced by laser illumination. The video signals are digitized, and the digitized interferograms are processed by a fast processing system. The outputs of the system are the strains or stresses of the observed mechanical component or structure.
MEMS compatible illumination and imaging micro-optical systems
NASA Astrophysics Data System (ADS)
Bräuer, A.; Dannberg, P.; Duparré, J.; Höfer, B.; Schreiber, P.; Scholles, M.
2007-01-01
The development of new MOEMS demands cooperation between researchers in micromechanics, optoelectronics, and micro-optics at a very early stage. Additionally, micro-optical technologies compatible with structured silicon have to be developed. The micro-optical technologies used for two silicon-based microsystems are described in this paper. First, a very small scanning laser projector with a volume of less than 2 cm^3, which operates with directly modulated lasers collimated by microlenses, is shown. The laser radiation illuminates a 2D MEMS scanning mirror. The optical design is optimized for high resolution (VGA). Thermomechanical stability is realized by design and by using a structured ceramic motherboard. Second, an ultrathin CMOS camera with an insect-inspired imaging system has been realized. It is the first experimental realization of an artificial compound eye, using micro-optical design principles and technology. The overall thickness of the imaging system is only 320 μm, the diagonal field of view is 21°, and the f-number is 2.6. The monolithic device consists of a UV-replicated microlens array on a thin silica substrate with a pinhole array in a metal layer on the back side. The pitch of the pinholes differs from that of the lens array to provide an individual viewing angle for each channel. The imaging chip is directly glued to a CMOS sensor with adapted pitch. The whole camera is less than 1 mm thick. New packaging methods for these systems are under development.
Schlossberg, David J.; Bodner, Grant M.; Bongard, Michael W.; ...
2016-09-16
Here, a novel, cost-effective, multi-point Thomson scattering system has been designed, implemented, and operated on the Pegasus Toroidal Experiment. Leveraging advances in Nd:YAG lasers, high-efficiency volume phase holographic transmission gratings, and increased quantum-efficiency Generation 3 image-intensified charge-coupled device (ICCD) cameras, the system provides Thomson spectra at eight spatial locations for a single grating/camera pair. The on-board digitization of the ICCD camera enables easy modular expansion, evidenced by recent extension from 4 to 12 plasma/background spatial location pairs. Stray light is rejected using time-of-flight methods suited to gated ICCDs, and background light is blocked during detector readout by a fast shutter. This ~10^3 reduction in background light enables further expansion to up to 24 spatial locations. The implementation now provides single-shot Te(R) for ne > 5 × 10^18 m^-3.
NASA Astrophysics Data System (ADS)
Pillans, Luke; Harmer, Jack; Edwards, Tim; Richardson, Lee
2016-05-01
Geolocation is the process of calculating a target position based on bearing and range relative to the known location of the observer. A high-performance thermal imager with integrated geolocation functions is a powerful long-range targeting device. Firefly is a software-defined camera core incorporating a system-on-a-chip processor running the Android(TM) operating system. The processor has a range of industry-standard serial interfaces, which were used to interface to peripheral devices including a laser rangefinder and a digital magnetic compass. The core has a built-in Global Positioning System (GPS) receiver, which provides the third variable required for geolocation. The graphical capability of Firefly allowed flexibility in the design of the man-machine interface (MMI), so the finished system can give access to extensive functionality without appearing cumbersome or over-complicated to the user. This paper covers both the hardware and software design of the system, including how the camera core influenced the selection of peripheral hardware, and the MMI design process, which incorporated user feedback at various stages.
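The geolocation computation described above, combining observer position (GPS), compass bearing, and laser range to locate a target, can be sketched as a standard great-circle destination calculation. This is a generic illustration, not the Firefly implementation; the function and parameter names are ours, and target elevation is ignored.

```python
import math

def geolocate(lat_deg, lon_deg, bearing_deg, range_m, earth_radius_m=6371000.0):
    """Target position from observer position, compass bearing, and slant
    range, using the great-circle destination-point formulas on a sphere."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    ang = range_m / earth_radius_m  # angular distance along the surface
    lat2 = math.asin(math.sin(lat1) * math.cos(ang)
                     + math.cos(lat1) * math.sin(ang) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(ang) * math.cos(lat1),
                             math.cos(ang) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)
```

For example, an observer at the equator sighting due north at a range of about 111.2 km (one degree of latitude) would place the target near latitude 1° N.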
Atmospheric turbulence temperature on the laser wavefront properties
NASA Astrophysics Data System (ADS)
Contreras López, J. C.; Ballesteros Díaz, A.; Tíjaro Rojas, O. J.; Torres Moreno, Y.
2017-06-01
Temperature is a physical quantity whose increase causes stronger random fluctuations of the refractive index, which in turn produce greater distortion of the wavefront and thus a displacement of its centroid. To observe the effect produced on a propagating laser beam by a turbulent medium strongly influenced by temperature, we experimented with two variable, controllable-temperature systems designed as optical turbulence generators (OTG): a turbulator and a parallelepiped glass container. The experimental setup uses three CMOS cameras and four temperature sensors, spatially distributed to synchronously acquire information on the laser beam wavefront and the turbulence temperature, respectively. The acquired information was analyzed with the MATLAB® software tool, which computes the position, as a function of time, of the laser beam center of mass and its deviations produced by the different turbulent conditions generated inside the two manufactured systems. The results are reflected in the statistical analysis of the centroid shift.
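The center-of-mass (centroid) calculation at the heart of this analysis is a simple intensity-weighted average. The following is an illustrative reimplementation in Python, not the authors' MATLAB code:

```python
import numpy as np

def beam_centroid(frame):
    """Intensity-weighted center of mass of a laser-spot image.
    Returns (cx, cy) in pixel coordinates; turbulence shows up as
    frame-to-frame wander of this point."""
    frame = np.asarray(frame, dtype=float)
    total = frame.sum()
    ys, xs = np.indices(frame.shape)
    return (xs * frame).sum() / total, (ys * frame).sum() / total
```

Tracking this centroid over a video sequence and taking the statistics of its displacement is exactly the kind of analysis the abstract describes.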
Flexible mobile robot system for smart optical pipe inspection
NASA Astrophysics Data System (ADS)
Kampfer, Wolfram; Bartzke, Ralf; Ziehl, Wolfgang
1998-03-01
Pipe damage can be inspected and graded with TV technology available on the market: remotely controlled vehicles carry a TV camera through pipes. However, because the diagnosis depends on the experience and capability of the operator, diagnostic failures cannot be avoided. The classification of damage requires knowledge of its exact geometrical dimensions, such as the width and depth of cracks, fractures, and defective connections. Within the framework of a joint R&D project, a sensor-based pipe inspection system named RODIAS has been developed with two partners from industry and a research institute. It consists of a remotely controlled mobile robot that carries intelligent sensors for on-line sewer inspection, based on a 3D optical sensor and a laser distance sensor. The laser distance sensor is integrated into the optical system of the camera and can measure the distance between camera and object. The angle of view can be determined from the position of the pan-and-tilt unit. With coordinate transformations it is possible to calculate the spatial coordinates of every point in the video image, so the geometry of an object can be described exactly. The company Optimess has developed TriScan32, a special software package for pipe condition classification. The user can start complex measurements of profiles, pipe displacements, or crack widths simply by pressing a push-button. The measuring results are stored together with other data, such as verbal damage descriptions and digitized images, in a database.
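The coordinate transformation mentioned above, combining the laser-measured distance with the pan and tilt angles to obtain spatial coordinates, reduces in its simplest form to a spherical-to-Cartesian conversion. This is a minimal sketch; the actual RODIAS calibration would also account for the laser/camera offset and lens distortion.

```python
import math

def to_cartesian(distance_m, pan_deg, tilt_deg):
    """Convert a laser range plus pan/tilt angles into camera-centered
    Cartesian coordinates (x forward, y left, z up)."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    x = distance_m * math.cos(tilt) * math.cos(pan)
    y = distance_m * math.cos(tilt) * math.sin(pan)
    z = distance_m * math.sin(tilt)
    return x, y, z
```

Once every imaged point has such coordinates, geometric quantities like crack width follow from distances between pairs of points.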
Study on the measurement system of the target polarization characteristics and test
NASA Astrophysics Data System (ADS)
Fu, Qiang; Zhu, Yong; Zhang, Su; Duan, Jin; Yang, Di; Zhan, Juntong; Wang, Xiaoman; Jiang, Hui-Lin
2015-10-01
Polarization imaging adds polarization information to intensity imaging and is extensively applied in military, civil, and other fields, so research on the polarization characteristics of targets is particularly important. This paper introduces research on a polarization reflection model that describes the distribution of scattered light energy over the reflecting hemisphere, and proposes a test system for target polarization characteristics consisting of an irradiation light source, a measuring turntable, and a camera. The illumination is a direct light source, with both laser and xenon-lamp options that can be exchanged according to the needs of the test. A hemispherical structure is used for the measurement, with the material sample placed near its base and equipped with azimuth and pitch rotation mechanisms so that the azimuth and elevation observation angles can be adjusted manually. The measuring camera operates with a motor-controlled polarizer to perform the polarization tests, ensuring measurement accuracy and imaging resolution. A test platform was set up from existing laboratory equipment, with a 532 nm laser, a linear-polarizer camera, and transmitting and receiving optical systems. For different materials such as wood, metal, and plastic, and under different azimuth- and zenith-angle observation conditions, the polarization scattering properties of the targets were measured under different illumination conditions, implementing polarimetric BRDF (pBRDF) measurement over the hemisphere.
Development of an airborne laser bathymeter
NASA Technical Reports Server (NTRS)
Kim, H. H.; Cervenka, P. O.; Lankford, C. B.
1975-01-01
An airborne laser depth-sounding system was built and taken through a complete series of field tests. Two green laser sources were tried: a pulsed neon laser at 540 nm and a frequency-doubled Nd:YAG transmitter at 532 nm. To obtain a depth resolution of better than 20 cm, the pulses had a duration of 5 to 7 nanoseconds and could be fired at rates of up to 50 pulses per second. In the receiver, the signal was detected by a photomultiplier tube connected to a 28 cm diameter Cassegrainian telescope aimed vertically downward. Oscilloscope traces of the signal reflected from the sea surface and the ocean floor could either be recorded by a movie camera on 35 mm film or digitized into 500 discrete channels of information and stored on magnetic tape, from which depth information could be extracted. An aerial color movie camera recorded the geographic footprint while a boat crew of oceanographers measured depth and other relevant water parameters. About two hundred hours of flight time on the NASA C-54 airplane in the areas of Chincoteague, Virginia, the Chesapeake Bay, and Key West, Florida, have yielded information on the actual operating conditions of such a system and helped to optimize the design. One can predict the maximum depth attainable in a mission by measuring the effective attenuation coefficient in flight; this quantity is four times smaller than the usual narrow-beam attenuation coefficient. Several square miles of varied underwater landscape were also mapped.
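The depth extraction implied above reduces to timing the interval between the surface and bottom echoes and scaling by the speed of light in water. This is a schematic calculation only; the flight system must also account for beam geometry, pulse shape, and surface refraction.

```python
# Water depth from the two-way travel time between surface and bottom echoes.
C_VACUUM = 2.998e8   # speed of light in vacuum, m/s
N_WATER = 1.33       # approximate refractive index of seawater

def depth_from_echoes(dt_seconds):
    """Depth = (c/n) * dt / 2: light travels down and back at c/n."""
    return (C_VACUUM / N_WATER) * dt_seconds / 2.0
```

A 10 ns surface-to-bottom interval corresponds to roughly 1.1 m of water; the 20 cm depth resolution quoted above corresponds to resolving about 1.8 ns of two-way travel time.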
Qin, Jia; Shi, Lei; Dziennis, Suzan; Reif, Roberto; Wang, Ruikang K.
2014-01-01
In this paper, we describe a newly developed synchronized dual-wavelength laser speckle contrast imaging (SDW-LSCI) system, which contains two cameras that are synchronously triggered to acquire data. The system can acquire data at high spatiotemporal resolution (up to 500 Hz for ~1000 × 1000 pixels). A mouse model of stroke is used to demonstrate the capability of imaging the fast changes (within tens of milliseconds) in oxygenated and deoxygenated hemoglobin concentration, and the relative changes in blood flow in the mouse brain, through an intact cranium. This novel imaging technology will enable the study of fast hemodynamics and metabolic changes in vascular diseases. PMID:23027260
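Laser speckle contrast imaging estimates blood flow from the local blurring of the speckle pattern: the standard spatial statistic is the contrast K = σ/⟨I⟩ over a small sliding window, with lower K indicating more motion blur and hence higher flow. The following is a generic sketch of that statistic, not the SDW-LSCI processing pipeline:

```python
import numpy as np

def speckle_contrast(frame, w=7):
    """Spatial speckle contrast K = std/mean over w x w windows.
    Returns a (H-w+1, W-w+1) contrast map for one raw speckle frame."""
    frame = np.asarray(frame, dtype=float)
    rows, cols = frame.shape[0] - w + 1, frame.shape[1] - w + 1
    K = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            win = frame[i:i + w, j:j + w]
            K[i, j] = win.std() / win.mean()
    return K
```

In practice a relative flow index is then derived from K (often via 1/K²), and in a dual-wavelength system the two cameras additionally provide the spectral information needed for hemoglobin concentration changes.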
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gondal, M. A., E-mail: magondal@kfupm.edu.sa; Baig, Umair; Dastageer, M. A.
A detection system based on laser-induced breakdown spectroscopy (LIBS) was built using a 266 nm wavelength pulsed laser from the fourth harmonic of an Nd:YAG laser, a 500 mm spectrograph, and a gated ICCD camera with a built-in delay generator. The LIBS system was used to study the elemental composition of coffee available in the local market of Saudi Arabia. The LIBS spectrum of the coffee samples revealed the presence of magnesium, calcium, aluminum, copper, sodium, barium, bromine, cobalt, chromium, cerium, manganese, and molybdenum. An atomic transition line of sodium was used to study the parametric dependence of the LIBS signal. The dependence of the LIBS signal on the laser pulse energy was shown to be linear, and the dependence of the LIBS signal on the time delay between excitation and data acquisition showed a typical rise to a peak value followed by a decrease, with the optimum excitation-acquisition delay at 400 ns.
NASA Technical Reports Server (NTRS)
2002-01-01
Goddard Space Flight Center and Triangle Research & Development Corporation collaborated to create "Smart Eyes," a charge-coupled device camera that, for the first time, could read and measure bar codes without the use of lasers. The camera operated in conjunction with software and algorithms created by Goddard and Triangle R&D that could track bar code position and direction with speed and precision, as well as with software that could control robotic actions based on vision system input. This accomplishment was intended for robotic assembly of the International Space Station, helping NASA to increase production while using less manpower. After successfully completing the two-phase SBIR project with Goddard, Triangle R&D was awarded a separate contract from the U.S. Department of Transportation (DOT), which was interested in using the newly developed NASA camera technology to heighten automotive safety standards. In 1990, Triangle R&D and the DOT developed a mask made from a synthetic, plastic skin covering to measure facial lacerations resulting from automobile accidents. By pairing NASA's camera technology with Triangle R&D's and the DOT's newly developed mask, a system that could provide repeatable, computerized evaluations of laceration injury was born.
NASA Technical Reports Server (NTRS)
Browne, Edward P.; Nivaggioli, Thierry; Hatton, T. Alan
1994-01-01
A noninvasive fluorescence recovery after photobleaching (FRAP) technique is under development to measure interfacial transport in two-phase systems without disturbing the interface. The concentration profiles of a probe solute are measured on both sides of the interface using an argon-ion laser, and the system relaxation is then monitored by a microscope-mounted CCD camera.
Laser Ablation of Biological Tissue Using Pulsed CO{sub 2} Laser
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hashishin, Yuichi; Sano, Shu; Nakayama, Takeyoshi
2010-10-13
Laser scalpels are currently used as a form of laser treatment. However, their ablation mechanism has not been clarified, because laser excision of biological tissue occurs over a short time scale. Biological tissue ablation also generates sound (laser-induced sound). This study seeks to clarify the ablation mechanism by observing laser excision with a high-speed video camera and by monitoring the power reduction of a He-Ne laser beam. We simulated laser excision of biological tissue by irradiating gelatin (10 wt%) with radiation from a pulsed CO{sub 2} laser (wavelength: 10.6 {mu}m; pulse width: 80 ns). In addition, a microphone was used to measure the laser-induced sound. The first pulse caused ablation particles to be emitted in all directions; these particles were subsequently damped so that they formed a mushroom cloud. Furthermore, water was initially evaporated by the laser irradiation, and then tissue was ejected.
Development of Next Generation Lifetime PSP Imaging Systems
NASA Technical Reports Server (NTRS)
Watkins, A. Neal; Jordan, Jeffrey D.; Leighty, Bradley D.; Ingram, JoAnne L.; Oglesby, Donald M.
2002-01-01
This paper describes a lifetime PSP system that has recently been developed using pulsed light-emitting diode (LED) lamps and a new interline transfer CCD camera technology. This system alleviates noise sources associated with lifetime PSP systems that use either flash-lamp or laser excitation sources and intensified CCD cameras for detection. Calibration curves have been acquired for a variety of PSP formulations using this system, and a validation test was recently completed in the Subsonic Aerodynamic Research Laboratory (SARL) at Wright-Patterson Air Force Base (WPAFB). In this test, global surface pressure distributions were recovered using both a standard intensity-based method and the new lifetime system. Results from the lifetime system agree both qualitatively and quantitatively with those measured using the intensity-based method. Finally, an advanced lifetime imaging technique capable of measuring temperature and pressure simultaneously is introduced and initial results are presented.
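Whether intensity- or lifetime-based, PSP data reduction ultimately inverts a Stern-Volmer-type calibration relating emission to pressure. The sketch below shows the intensity-method form, I_ref/I = a + b·(P/P_ref); the coefficients a and b are illustrative placeholders for a real paint calibration, not values from this paper.

```python
def psp_pressure(i_ref, i, p_ref=101325.0, a=0.18, b=0.82):
    """Invert the Stern-Volmer relation I_ref/I = a + b * (P/P_ref) to
    recover surface pressure from wind-off (i_ref) and wind-on (i)
    PSP emission intensities. a and b come from a paint calibration
    and are illustrative here (they satisfy a + b = 1)."""
    return p_ref * (i_ref / i - a) / b
```

With a + b = 1, equal wind-off and wind-on intensities recover the reference pressure, as expected; the lifetime method replaces the intensity ratio with a ratio of gated exposures, removing the need for a wind-off image.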
High-Resolution Surface Reconstruction from Imagery for Close Range Cultural Heritage Applications
NASA Astrophysics Data System (ADS)
Wenzel, K.; Abdel-Wahab, M.; Cefalu, A.; Fritsch, D.
2012-07-01
The recording of high-resolution point clouds with sub-mm resolution is a demanding and cost-intensive task, especially with current equipment like handheld laser scanners. We present an image-based approach in which techniques of image matching and dense surface reconstruction are combined with a compact and affordable rig of off-the-shelf industrial cameras. Such cameras provide high spatial resolution with low radiometric noise, which enables a one-shot solution and thus efficient data acquisition while satisfying high accuracy requirements. However, the largest drawback of image-based solutions is often the acquisition of surfaces with low texture, where the image matching process might fail. Thus, an additional structured-light projector is employed, represented here by the pseudo-random pattern projector of the Microsoft Kinect. Its infrared laser projects speckles of different sizes. By using dense image matching techniques on the acquired images, a 3D point can be derived for almost every pixel. The use of multiple cameras enables the acquisition of a high-resolution point cloud with high accuracy for each shot. For the proposed system, up to 3.5 million 3D points with sub-mm accuracy can be derived per shot. The registration of multiple shots is performed by Structure and Motion reconstruction techniques, where feature points are used to derive the camera positions and rotations automatically without initial information.
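The sub-mm accuracy claim follows from stereo triangulation geometry: for a rectified pair, depth is Z = f·B/d, and the first-order depth uncertainty grows quadratically with depth. The numbers below are a textbook sketch under assumed focal length and baseline, not the authors' processing chain:

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Rectified stereo triangulation: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px, baseline_m, depth_m, disparity_err_px):
    """First-order depth uncertainty: dZ = Z^2 / (f * B) * dd,
    where dd is the matching error in pixels."""
    return depth_m ** 2 / (f_px * baseline_m) * disparity_err_px
```

For instance, with an assumed 1000 px focal length and 10 cm baseline, a 0.1 px matching error at 0.5 m range gives a depth error of 0.25 mm, illustrating why close-range multi-camera rigs with dense matching can reach sub-mm accuracy.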
Development and application of 3-D foot-shape measurement system under different loads
NASA Astrophysics Data System (ADS)
Liu, Guozhong; Wang, Boxiong; Shi, Hui; Luo, Xiuzhi
2008-03-01
A 3-D foot-shape measurement system under different loads, based on the laser-line-scanning principle, was designed, and a model of the measurement system was developed. 3-D foot-shape measurements without blind areas under different loads and automatic extraction of foot parameters are achieved with the system. A global calibration method for the CCD cameras, using a one-axis motion unit in the measurement system and specialized calibration kits, is presented. Errors caused by the nonlinearity of the CCD cameras and other devices, and by the installation of the one-axis motion platform, the laser plane, and the toughened glass plane, can be eliminated by using a nonlinear coordinate mapping function and the Powell optimization method in calibration. Foot measurements under different loads were conducted for 170 participants, and the statistical foot-parameter results for male and female participants under the no-load condition, together with the changes of foot parameters under half-body-weight, full-body-weight, and over-body-weight conditions compared with the no-load condition, are presented. 3-D foot-shape measurement under different loads makes custom shoe-making possible and shows great promise for shoe design, foot orthopedic treatment, shoe-size standardization, and the establishment of a foot database for consumers and athletes.
NASA Astrophysics Data System (ADS)
Meola, Joseph; Absi, Anthony; Islam, Mohammed N.; Peterson, Lauren M.; Ke, Kevin; Freeman, Michael J.; Ifaraguerri, Agustin I.
2014-06-01
Hyperspectral imaging systems are currently used for numerous activities related to spectral identification of materials. These passive imaging systems rely on naturally reflected/emitted radiation as the source of the signal. Thermal infrared systems measure radiation emitted from objects in the scene. As such, they can operate at both day and night. However, visible through shortwave infrared systems measure solar illumination reflected from objects. As a result, their use is limited to daytime applications. Omni Sciences has produced high powered broadband shortwave infrared super-continuum laser illuminators. A 64-watt breadboard system was recently packaged and tested at Wright-Patterson Air Force Base to gauge beam quality and to serve as a proof-of-concept for potential use as an illuminator for a hyperspectral receiver. The laser illuminator was placed in a tower and directed along a 1.4km slant path to various target materials with reflected radiation measured with both a broadband camera and a hyperspectral imaging system to gauge performance.
A high-resolution full-field range imaging system
NASA Astrophysics Data System (ADS)
Carnegie, D. A.; Cree, M. J.; Dorrington, A. A.
2005-08-01
There exist a number of applications where the range to all objects in a field of view needs to be obtained. Specific examples include obstacle avoidance for autonomous mobile robots, process automation in assembly factories, surface profiling for shape analysis, and surveying. Ranging systems can typically be characterized as either laser scanning systems, where a laser point is sequentially scanned over a scene, or full-field acquisition systems, where the range to every point in the image is obtained simultaneously. The former offer advantages in terms of range resolution, while the latter tend to be faster and involve no moving parts. We present a system for determining the range to any object within a camera's field of view at the speed of a full-field system and with a range resolution approaching that of point laser scanners. Initial results achieve centimeter range resolution for a 10 second acquisition time. Modifications to the existing system are discussed that should provide faster results with submillimeter resolution.
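Full-field rangers of this kind typically modulate the illumination and recover range from the phase of the returned modulation envelope. With four samples per modulation period, the demodulation can be sketched as follows; this is a generic "four-bucket" illustration under an assumed modulation frequency, not this system's exact electronics:

```python
import math

C = 299792458.0  # speed of light, m/s

def range_from_phase(samples, f_mod_hz):
    """Four-bucket phase demodulation: samples (A0, A1, A2, A3) are taken
    at 0, 90, 180, 270 degrees of the modulation period. The envelope
    phase gives range r = c * phi / (4 * pi * f_mod)."""
    a0, a1, a2, a3 = samples
    phase = math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)
    return C * phase / (4.0 * math.pi * f_mod_hz)
```

Note the unambiguous range is c/(2·f_mod), e.g. about 15 m at 10 MHz, which is one reason such systems trade modulation frequency against range resolution and working distance.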
The Geoscience Laser Altimeter System (GLAS) for the ICESAT Mission
NASA Technical Reports Server (NTRS)
Abshire, James B.; Sun, Xia-Li; Ketchum, Eleanor A.; Afzal, Robert S.; Millar, Pamela S.; Smith, David E. (Technical Monitor)
2000-01-01
The Laser In-space Technology Experiment, the Shuttle Laser Altimeter, and the Mars Observer Laser Altimeter have demonstrated accurate measurements of atmospheric backscatter and surface heights from space. The recent MOLA measurements of the Mars surface have 40 cm vertical resolution and have reduced the global uncertainty in Mars topography from a few km to about 5 m. The Geoscience Laser Altimeter System (GLAS) is a next-generation lidar for Earth orbit being developed as part of NASA's Icesat Mission. The GLAS design combines a 10 cm precision surface lidar with a sensitive dual-wavelength cloud and aerosol lidar. GLAS will precisely measure the heights of the Earth's polar ice sheets, establish a grid of accurate height profiles of the Earth's land topography, and profile the vertical backscatter of clouds and aerosols on a global scale. GLAS is being developed to fly on a small dedicated spacecraft in a polar orbit with a 590-630 km altitude at an inclination of 94 degrees. GLAS is scheduled to launch in the summer of 2001 and to operate continuously for a minimum of 3 years, with a goal of 5 years. The primary mission for GLAS is to measure the seasonal and annual changes in the heights of the Greenland and Antarctic ice sheets. GLAS will continuously measure the vertical distance from orbit to the Earth's surface with 1064 nm pulses from an Nd:YAG laser at a 40 Hz rate. Each 5 nsec wide laser pulse is used to produce a single range measurement, and the laser spots have 66 m diameter and about 170 m center-to-center spacing. When over land, GLAS will profile the heights of the topography and vegetation. The GLAS receiver uses a 1 m diameter telescope and a Si APD detector. The detector signal is sampled by an all-digital receiver which records each surface echo waveform with 1 nsec resolution and stored echo record lengths of 200, 400, or 600 samples. Analysis of the echo waveforms within the instrument permits discrimination between cloud and surface echoes.
Ground-based echo analysis permits precise ranging, determining the roughness or slopes of the surface as well as the vertical distributions of vegetation illuminated by the laser. Accurate knowledge of the laser beam's pointing angle is needed to prevent height biases over sloped surfaces. For surfaces with 2 deg. slopes, knowledge of the pointing angle of the beam's centroid to about 8 µrad is needed to achieve 10 cm height accuracy. GLAS uses a stellar reference system (SRS) to determine the pointing angle of each laser firing relative to inertial space. The SRS uses a high-precision star camera oriented toward local zenith and a gyroscope to determine the inertial orientation of the SRS optical bench. The far-field pattern of each laser pulse is measured relative to the star camera with a laser reference system (LRS). Optically measuring each laser far-field pattern relative to the orientation of the star camera and gyroscope permits the precise pointing angle of each laser pulse to be determined. GLAS will also determine the vertical distributions of clouds and aerosols by measuring the vertical profile of laser energy backscattered by the atmosphere at both 1064 and 532 nm. The 1064 nm measurements use the Si APD detector and profile the height and vertical structure of thicker clouds. The measurements at 532 nm use new, highly sensitive photon-counting detectors and measure the height distributions of very thin clouds and aerosol layers. With averaging, these can be used to determine the height of the planetary boundary layer. The instrument design and expected performance will be discussed.
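The pointing-knowledge requirement quoted above can be checked with the small-angle relation Δh ≈ H·Δθ·tan(slope): a pointing-knowledge error Δθ displaces the apparent footprint horizontally by H·Δθ, which over a sloped surface maps into a height bias. This is a back-of-envelope sketch; the actual GLAS error budget contains additional terms.

```python
import math

def height_bias(altitude_m, pointing_err_rad, slope_deg):
    """Height bias from pointing-knowledge error over a sloped surface:
    horizontal footprint error H * dtheta, projected through tan(slope)."""
    return altitude_m * pointing_err_rad * math.tan(math.radians(slope_deg))
```

At roughly 600 km altitude, a few µrad of pointing-knowledge error over a 2 degree slope already produces a height bias of order 10 cm, consistent with the few-µrad requirement stated above.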
Imaging using a supercontinuum laser to assess tumors in patients with breast carcinoma
NASA Astrophysics Data System (ADS)
Sordillo, Laura A.; Sordillo, Peter P.; Alfano, R. R.
2016-03-01
The supercontinuum laser light source has many advantages over other light sources, including its broad spectral range. Transmission images of paired normal and malignant breast tissue samples from two patients were obtained using a Leukos supercontinuum (SC) laser light source, with wavelengths in the second and third NIR optical windows, and an InGaAs IR-CCD camera detector (Goodrich Sensors Inc. high-response camera SU320KTSW-1.7RT, with spectral response between 900 nm and 1,700 nm). Optical attenuation measurements in the four NIR optical windows were obtained from the samples.
Plasma ignition for laser propulsion
NASA Technical Reports Server (NTRS)
Askew, R. F.
1982-01-01
For a specific optical system, a pulsed carbon dioxide laser having an energy output of up to 15 joules was used to initiate a plasma in air at one atmosphere pressure. The spatial and temporal development of the plasma was measured using a multiframe image-converter camera. In addition, the time-dependent velocity of the laser-supported plasma front, which moves opposite to the direction of the laser pulse, was measured in order to characterize the type of wavefront developed. Reliable and reproducible spark initiation was achieved. The lifetime of the highly dense plasma at the initial focal spot was determined to be less than 100 nanoseconds. The plasma front propagates toward the laser at a variable speed ranging from zero to 1.6 × 10^6 m/s, and travels a total distance of approximately five centimeters for the energy and laser pulse shape employed.
Programmable 10 MHz optical fiducial system for hydrodiagnostic cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huen, T.
1987-07-01
A solid-state light control system was designed and fabricated for use with hydrodiagnostic streak cameras of the electro-optic type. With its use, the film containing the streak images has two time scales exposed simultaneously with the signal, allowing timing and cross-timing; the latter is achieved with exposure-modulation marking on the time tick marks. The purpose of using two time scales is discussed. The design is based on a microcomputer, resulting in a compact and easy-to-use instrument. The light source is a small red light-emitting diode. Time marking can be programmed in steps of 0.1 microseconds, with a range of 255 steps. The time accuracy is based on a precision 100 MHz quartz crystal, giving a divided-down 10 MHz system frequency. The light is guided by two small 100 micron diameter optical fibers, which facilitates light coupling onto the input slit of an electro-optic streak camera. Three distinct groups of exposure modulation of the time tick marks can be independently set anywhere on the streak duration. This system has been successfully used in Fabry-Perot laser velocimeters for over four years in our laboratory. The microcomputer control section is also being used to provide optical fiducials to mechanical rotor cameras.
Laser-Based Optical System for Reactive Radical Concentration Measurements in Plasmas and Flames
2006-08-01
The work examines the role of different plasma components in supporting chain propagation, using corona plasma generators with high-voltage multiple-needle electrodes and measuring reactive species including H2O2 and HCN. Measurements were performed in gliding arc, dielectric barrier discharge, and pulsed corona plasma systems, as well as in flame and flow reactor systems. For discharges operating in air with iron electrodes (~260 V), the approximate arc thickness was estimated by visual quantification of high-speed camera arc images.
Iron-Nickel Meteorite Zapped by Mars Rover Laser
2016-11-02
The dark, golf-ball-size object in this composite, colorized view from the Chemistry and Camera (ChemCam) instrument on NASA's Curiosity Mars rover shows a grid of shiny dots where ChemCam had fired laser pulses used for determining the chemical elements in the target's composition. The analysis confirmed that this object, informally named "Egg Rock," is an iron-nickel meteorite. Iron-nickel meteorites are a common class of space rocks found on Earth, and previous examples have been found on Mars, but Egg Rock is the first on Mars to be examined with a laser-firing spectrometer. The laser pulses on Oct. 30, 2016, induced bursts of glowing gas at the target, and ChemCam's spectrometer read the wavelengths of light from those bursts to gain information about the target's composition. The laser pulses also burned through the dark outer surface, exposing bright interior material. This view combines two images taken later the same day by ChemCam's remote micro-imager (RMI) camera, with color added from an image taken by Curiosity's Mast Camera (Mastcam). A Mastcam image of Egg Rock is at PIA21134. http://photojournal.jpl.nasa.gov/catalog/PIA21133
NASA Astrophysics Data System (ADS)
Stock, Karl; Wurm, Holger; Hausladen, Florian
2016-02-01
Flashlamp-pumped Er:YAG lasers are used successfully in the clinic for precise ablation of both soft and hard tissue. For several years a novel diode-pumped Er:YAG laser system (Pantec Engineering AG) has been available, with mean laser power up to 40 W and pulse repetition rates up to 1 kHz. The aim of the study was to investigate the suitability of this laser system specifically for stapedotomy. First, an experimental setup was realized with a beam-focusing unit and a computer-controlled translation stage that moved the samples (slices of porcine bone) at a defined velocity during irradiation with various laser parameters. A microphone was positioned at a defined distance from the ablation point, and the acoustic signal of the ablation process was recorded. For comparison, measurements were also performed with a flashlamp-pumped Er:YAG laser system. After irradiation, the resulting ablation quality and efficacy were determined using light microscopy. Using a high-speed camera and the Toepler schlieren technique ("Töpler-Schlierentechnik"), the cavitation bubble in water after perforation of a bone slice was investigated. The results show efficient bone ablation using the diode-pumped Er:YAG laser system. A decrease of the sound level and of the cavitation bubble volume was also observed with decreasing pulse duration. Higher repetition rates lead to a slight increase of thermal side effects but have no influence on the ablation efficiency. In conclusion, these first experiments demonstrate the high potential of the diode-pumped Er:YAG laser system for use in middle ear surgery.
Laser Ground System for Communication Experiments with ARTEMIS
NASA Astrophysics Data System (ADS)
Kuzkov, Volodymyr; Volovyk, Dmytro; Kuzkov, Sergii; Sodnik, Zoran; Pukha, Sergii; Caramia, Vincenzo
2012-10-01
The ARTEMIS satellite, with the OPALE laser communication terminal on board, was launched on 12 July 2001. 1789 laser communication sessions were performed between ARTEMIS and SPOT-4 (PASTEL) from 01 April 2003 to 09 January 2008, with a total duration of 378 hours. Regular laser communication experiments between ESA's Optical Ground Station (OGS, altitude 2400 m above sea level) and ARTEMIS in various atmospheric conditions were also performed. The Japanese Space Agency (JAXA) launched the KIRARI (OICETS) satellite with a laser communication terminal called LUCE. Laser communication links between KIRARI and ARTEMIS were successfully realized, and international laser communication experiments from the KIRARI satellite were also successfully performed with optical ground stations located in the USA (JPL), Spain (ESA OGS), Germany (DLR), and Japan (NICT). The German Space Agency (DLR) performed laser communication links between two LEO satellites (TerraSAR-X and NFIRE), demonstrating data transfer rates of 5.6 Gbit/s, and performed laser communication experiments between the satellites and the ESA optical ground station. To reduce the influence of weather conditions on laser communication between satellites and ground stations, a network of optical stations situated in different atmospheric regions needs to be created. In 2002, the Main Astronomical Observatory (MAO) started the development of its own laser communication system to be placed in the Cassegrain focus of its 0.7 m AZT-2 telescope (Fe = 10.5 m), located in Kyiv 190 meters above sea level. The work was supported by the National Space Agency of Ukraine and by ESA. ARTEMIS has an orbital position of 21.4° E and an orbital inclination of more than 9.75°. We therefore developed a precise tracking system for the AZT-2 telescope (weighing more than 2 tons) using micro-step motors. Software was developed for computer control of the telescope to track the satellite's orbit, and a tracking accuracy of 0.6 arcsec was achieved.
A compact terminal for Laser Atmosphere and Communication Experiments with Satellite (LACES) has been produced. The LACES terminal includes a CMOS camera for the pointing subsystem, a CCD camera for the tracking subsystem, an avalanche photodiode receiver module with thermoelectric cooling, a laser transmitter module with thermoelectric temperature control, a tip/tilt atmospheric turbulence compensation subsystem with movable mirrors, a four-quadrant photodetector, a bit-error-rate tester module, and other optical and electronic components. The principal subsystems and optical elements are mounted on a platform (weight < 20 kg) located in the Cassegrain focus of the telescope. All systems were tested with ARTEMIS. The telemetry and dump-buffer information from OPALE received by the control center in Redu (Belgium) was analyzed. During the beacon scan, the acquisition phase of the laser link between the OPALE terminal of ARTEMIS and the LACES terminal started, and laser signals from AZT-2 were detected by the acquisition and tracking CCD sensors of OPALE. Some of the tests were performed in cloudy conditions. A description of our laser ground system and the experimental results will be presented in the report.
Smart lens: tunable liquid lens for laser tracking
NASA Astrophysics Data System (ADS)
Lin, Fan-Yi; Chu, Li-Yu; Juan, Yu-Shan; Pan, Sih-Ting; Fan, Shih-Kang
2007-05-01
A tracking system utilizing a tunable liquid lens is proposed and demonstrated. Adapting the concept of EWOD (electrowetting-on-dielectric), the curvature of a droplet on a dielectric film can be controlled by varying the applied voltage. When the droplet is utilized as an optical lens, the focal length of this adaptive liquid lens can be adjusted as desired, and the light passing through it can therefore be focused to different positions in space. In this paper, the tuning range of the curvature and focal length of the tunable liquid lens is investigated. Droplet transformation is observed and analyzed under a CCD camera. A tracking system combining the tunable liquid lens with a laser detection system is also proposed. With a feedback circuit that maximizes the returned signal by controlling the tunable lens, the laser beam can remain locked on a distant reflective target while the target is moving.
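The closed-loop idea above (a feedback circuit that adjusts the lens-control voltage so the returned laser signal is maximized) can be sketched as a simple hill-climbing loop. This is an illustrative sketch, not the authors' circuit; `detector`, the voltage limits, and the step size are assumed names and values.

```python
def track_focus(detector, v_init=30.0, v_step=0.5, v_min=0.0, v_max=120.0, iters=50):
    """Hill-climb the EWOD control voltage to maximize returned laser power.

    detector: callable mapping a control voltage (V) to detected return power.
    Returns (best_voltage, best_power).
    """
    v = v_init
    best = detector(v)
    direction = 1.0                      # current search direction (+/- volts)
    for _ in range(iters):
        v_trial = min(max(v + direction * v_step, v_min), v_max)
        power = detector(v_trial)
        if power > best:                 # improvement: accept and keep going
            v, best = v_trial, power
        else:                            # worse: reverse the search direction
            direction = -direction
    return v, best
```

In a real feedback circuit this loop would run continuously so the lens re-converges as the target moves; the sketch shows a single convergence pass.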
Laser speckle contrast imaging using light field microscope approach
NASA Astrophysics Data System (ADS)
Ma, Xiaohui; Wang, Anting; Ma, Fenghua; Wang, Zi; Ming, Hai
2018-01-01
In this paper, a laser speckle contrast imaging (LSCI) system using a light field (LF) microscope approach is proposed. To the best of our knowledge, this is the first time LSCI has been combined with LF imaging. To verify this idea, a prototype consisting of a modified LF microscope imaging system and an experimental device was built. A commercially available Lytro camera was modified for microscope imaging. Hollow glass tubes fixed at different depths in a glass dish were used to simulate the vessels in the brain and to test the performance of the system. Compared with conventional LSCI, three new functions can be realized with our system: refocusing, extending the depth of field (DOF), and gathering 3D information. Experiments show that the principle is feasible and the proposed system works well.
Progress with the Lick adaptive optics system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavel, D T; Olivier, S S; Bauman, B
2000-03-01
Progress and results of observations with the Lick Observatory Laser Guide Star Adaptive Optics System are presented. This system is optimized for diffraction-limited imaging in the near infrared, in the 1-2 micron wavelength bands. We describe our development efforts in a number of component areas, including a redesign of the optical bench layout, the commissioning of a new infrared science camera, and improvements to the software and user interface. There is also an ongoing effort to characterize the system performance with both natural and laser guide stars and to fold these data into a refined system model. Such a model can be used to help plan future observations, for example, predicting the point-spread function as a function of seeing and guide star magnitude.
Integrated Laser Characterization, Data Acquisition, and Command and Control Test System
NASA Technical Reports Server (NTRS)
Stysley, Paul; Coyle, Barry; Lyness, Eric
2012-01-01
Satellite-based laser technology has been developed for topographical measurements of the Earth and of other planets. Lasers for such missions must be highly efficient and stable over long periods through the temperature variations of orbit. In this innovation, LabVIEW is used on an Apple Macintosh to acquire and analyze images of the laser beam as it exits the laser cavity, to evaluate the laser's performance over time, and to monitor and control the environmental conditions under which the laser is tested. One computer attached to multiple cameras and instruments, running LabVIEW-based software, replaces a conglomeration of computers and software packages, saving hours in maintenance and data analysis and making very long-term tests possible. This all-in-one system was written primarily in LabVIEW for Mac OS X, which allows combining data from multiple RS-232, USB, and Ethernet instruments for comprehensive laser analysis and control. The system acquires data from CCDs (charge-coupled devices), power meters, thermistors, and oscilloscopes over a controllable period of time. These data are saved to an HTML file that can be accessed later from a variety of data analysis programs. Also, through the LabVIEW interface, engineers can easily control laser input parameters such as current, pulse width, chiller temperature, and repetition rate. All of these parameters can be adapted and cycled over a period of time.
High-throughput Raman chemical imaging for evaluating food safety and quality
NASA Astrophysics Data System (ADS)
Qin, Jianwei; Chao, Kuanglin; Kim, Moon S.
2014-05-01
A line-scan hyperspectral system was developed to enable Raman chemical imaging of large sample areas. A custom-designed 785 nm line laser based on a scanning mirror serves as the excitation source. A 45° dichroic beamsplitter reflects the laser light to form a 24 cm x 1 mm excitation line normally incident on the sample surface. Raman signals along the laser line are collected by a detection module consisting of a dispersive imaging spectrograph and a CCD camera. A hypercube is accumulated line by line as a motorized table moves the samples transversely through the laser line. The system covers a Raman shift range of -648.7 to 2889.0 cm-1 and a 23 cm wide area. An example application, authenticating milk powder, is presented to demonstrate the system performance. In four minutes, the system acquired a 512x110x1024 hypercube (56,320 spectra) from four 47-mm-diameter Petri dishes containing four powder samples. Chemical images were created for detecting two adulterants (melamine and dicyandiamide) that had been mixed into the milk powder.
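The line-by-line hypercube accumulation described above can be sketched as follows. This is a hedged illustration, not the system's actual acquisition code: `scan_line`, the array sizes, and the dtype are assumptions.

```python
import numpy as np

def acquire_hypercube(scan_line, n_lines, n_spatial, n_bands):
    """Stack line-scan frames into a (lines, spatial, bands) hypercube.

    scan_line: callable returning one (n_spatial, n_bands) CCD frame,
    i.e. the dispersed spectra for every point along the laser line.
    """
    cube = np.empty((n_lines, n_spatial, n_bands), dtype=np.float32)
    for y in range(n_lines):       # motorized table advances one line step
        cube[y] = scan_line(y)     # one spatial-by-spectral frame per step
    return cube
```

Chemical images (e.g. for melamine detection) would then be computed per band or per band ratio from slices of `cube`.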
Anazawa, Takashi; Uchiho, Yuichi; Yokoi, Takahide; Chalkidis, George; Yamazaki, Motohiro
2017-06-27
A five-color fluorescence-detection system for eight-channel plastic-microchip electrophoresis was developed. In the eight channels (with effective electrophoretic lengths of 10 cm), single-stranded DNA fragments were separated with single-base resolution up to 300 bases within 10 min, and seventeen-locus STR genotyping for forensic human identification was successfully demonstrated. In the system, a side-entry laser beam is passed through the eight channels (the eight A channels) and seven alternately arrayed sacrificial channels (the seven B channels), by a technique called "side-entry laser-beam zigzag irradiation." Laser-induced fluorescence from the eight A channels and Raman-scattered light from the seven B channels are then simultaneously, uniformly, and spectroscopically detected, in the direction perpendicular to the channel-array plane, through a transmission grating and a CCD camera. The system is therefore simple and highly sensitive. Because the microchip is fabricated by plastic injection molding, it is inexpensive and disposable and thus suitable for actual use in various fields.
Using Stars to Align a Steered Laser System for Cosmic Ray Simulation
NASA Astrophysics Data System (ADS)
Krantz, Harry; Wiencke, Lawrence
2016-03-01
Ultra-high-energy cosmic rays (UHECRs) are the highest-energy cosmic particles, with kinetic energies above 10^18 eV. UHECRs are detected from the air shower of secondary particles and the UV fluorescence that results from interaction with the atmosphere. A high-power UV laser beam can be used to simulate the optical signature of a UHECR air shower. The Global Light System (GLS) is a planned network of ground-based light sources, including lasers, to support the planned space-based Extreme Universe Space Observatory (EUSO). A portable prototype GLS laser station has been constructed at the Colorado School of Mines. Currently the laser system uses reference targets on the ground, but stars can be used to better align the beam by providing a complete hemisphere of targets. In this work, a CCD camera is used to capture images of known stars through the steering head optics. The images are analyzed to find the steering-head coordinates of the target star. The true coordinates of the star are calculated from the location and time of observation. A universal adjustment for the steering head is determined from the differences between the two sets of coordinates across multiple stars. This laser system prototype will also be used for preflight tests of the EUSO Super Pressure Balloon mission.
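The universal-adjustment step described above (averaging the differences between the steering-head coordinates of imaged stars and their true computed coordinates) might be sketched as below. This is a simplified illustration with assumed names: it ignores azimuth wrap-around near 0/360 degrees and any higher-order mount-model terms.

```python
def steering_offset(measured, true):
    """Mean (azimuth, elevation) correction from matched star pairs.

    measured: list of (az, el) steering-head coordinates, degrees.
    true:     list of (az, el) coordinates computed from catalog position,
              site location, and observation time, degrees.
    Returns the average (d_az, d_el) to add to steering-head pointings.
    """
    n = len(measured)
    d_az = sum(t[0] - m[0] for m, t in zip(measured, true)) / n
    d_el = sum(t[1] - m[1] for m, t in zip(measured, true)) / n
    return d_az, d_el
```

Averaging over many stars spread across the sky suppresses per-star centroiding noise in the single correction applied to the steering head.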
Sidelooking laser altimeter for a flight simulator
NASA Technical Reports Server (NTRS)
Webster, L. D. (Inventor)
1983-01-01
An improved laser altimeter for a flight simulator, which allows measurement of the height of the simulator probe above the terrain directly below the probe tip, is described. A laser beam is directed from the probe at an angle theta to the horizontal to produce a beam spot on the terrain. The angle theta that the laser beam makes with the horizontal is varied so as to bring the beam spot into coincidence with a plumb line coaxial with the longitudinal axis of the probe. A television altimeter camera observes the beam spot and has a raster line aligned with the plumb line. A spot detector circuit coupled to the output of the TV camera monitors the position of the beam spot relative to the plumb line.
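The geometry behind this altimeter reduces to simple triangulation: with the laser source a known horizontal distance from the plumb line and the beam tilted at angle theta to the horizontal, coincidence of the spot with the plumb line gives the height directly. A minimal sketch, with assumed names and an assumed baseline convention:

```python
import math

def probe_height(baseline_m, theta_deg):
    """Height (m) of the probe above the terrain when the beam spot
    coincides with the plumb line.

    baseline_m: assumed horizontal offset of the laser exit point from
                the plumb line, in meters.
    theta_deg:  beam elevation angle from the horizontal at coincidence.
    """
    return baseline_m * math.tan(math.radians(theta_deg))
```

Steeper angles at coincidence thus correspond to greater height for the same baseline, which is why servoing theta until the spot sits on the plumb line encodes the altitude.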
Brahme, Anders; Nyman, Peter; Skatt, Björn
2008-05-01
A four-dimensional (4D) laser camera (LC) has been developed for accurate patient imaging in diagnostic and therapeutic radiology. A complementary metal-oxide-semiconductor camera images the intersection of a scanned fan-shaped laser beam with the surface of the patient and allows real-time recording of movements in a three-dimensional (3D) or four-dimensional (4D, i.e., 3D + time) format. The LC system was first designed as an accurate patient setup tool for diagnostic and therapeutic applications but was found to be of much wider applicability as a general 4D photon "tag" for the surface of the patient in different clinical procedures. It is presently used as a 3D or 4D optical benchmark, or tag, for accurate delineation of the patient surface, as demonstrated for patient auto-setup and for breathing and heart motion detection. Furthermore, its future potential applications in gating, adaptive therapy, 3D or 4D image fusion between most imaging modalities, and image processing are discussed. It is shown that the LC system has a geometrical resolution of about 0.1 mm and that the rigid-body repositioning accuracy is about 0.5 mm below 20 mm displacements, 1 mm below 40 mm, and better than 2 mm at 70 mm. This indicates a slight need for repeated repositioning when the initial error is larger than about 50 mm. The positioning accuracy with standard patient setup procedures for prostate cancer at Karolinska was found to be about 5-6 mm when independently measured using the LC system. The system was found valuable for positron emission tomography-computed tomography (PET-CT) in vivo tumor and dose-delivery imaging, where it potentially may allow effective correction for breathing artifacts in 4D PET-CT and image fusion with lymph node atlases for accurate target volume definition in oncology.
With a LC system in all imaging and radiation therapy rooms, auto setup during repeated diagnostic and therapeutic procedures may save around 5 min per session, increase accuracy and allow efficient image fusion between all imaging modalities employed.
Velocity visualization in gaseous flows
NASA Technical Reports Server (NTRS)
Hanson, R. K.
1985-01-01
Techniques are established for visualizing velocity in gaseous flows. Two approaches are considered, both of which are capable of yielding velocity simultaneously at a large number of flowfield locations, thereby providing images of velocity. The first technique employs a laser to mark specific fluid elements and a camera to track their subsequent motion. Marking is done by laser-induced phosphorescence of biacetyl, added as a tracer species in a flow of N2, or by laser-induced formation of sulfur particulates in SF6-H2-N2 mixtures. The second technique is based on the Doppler effect, and uses an intensified photodiode array camera and a planar form of laser-induced fluorescence to detect 2-d velocities of I2 (in I2-N2 mixtures) via Doppler-shifted absorption of narrow-linewidth laser radiation at 514.5 nm.
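The Doppler relation underlying the second technique can be made concrete with a short sketch: a line-of-sight velocity v shifts the absorption frequency by v/lambda in the non-relativistic limit. The function names are assumptions for illustration; the 514.5 nm wavelength is taken from the text.

```python
C = 299_792_458.0    # speed of light, m/s
LAMBDA = 514.5e-9    # narrow-linewidth laser wavelength from the text, m

def shift_from_velocity(v_mps):
    """Doppler frequency shift (Hz) for a line-of-sight velocity (m/s)."""
    return v_mps / LAMBDA

def velocity_from_shift(delta_nu_hz):
    """Line-of-sight velocity (m/s) recovered from a measured shift (Hz)."""
    return delta_nu_hz * LAMBDA
```

A 100 m/s flow component along the beam corresponds to a shift of roughly 0.2 GHz at 514.5 nm, which is why a narrow-linewidth source and absorption-line detection are needed to resolve it.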
Simulation of a polarized laser beam reflected at the sea surface: modeling and validation
NASA Astrophysics Data System (ADS)
Schwenger, Frédéric
2015-05-01
A 3-D simulation of the polarization-dependent reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation considers polarized or unpolarized laser sources and calculates the polarization states upon reflection at the sea surface. It is suitable for the radiance calculation of the scene in different spectral wavebands (e.g. near-infrared, SWIR, etc.), not including the camera degradations. The simulation also considers a bistatic configuration of laser source and receiver as well as different atmospheric conditions. In the SWIR, the detected total power of reflected laser light is compared with data collected in a field trial. Our computer simulation combines the 3-D simulation of a maritime scene (open sea/clear sky) with the simulation of polarized or unpolarized laser light reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves. To predict the input of a camera equipped with a linear polarizer, the polarized sea surface radiance must be calculated for the specific waveband. The s- and p-polarization states are calculated for the emitted sea surface radiance and the specularly reflected sky radiance to determine the total polarized sea surface radiance of each component. The states of polarization and the radiance of laser light specularly reflected at the wind-roughened sea surface are calculated by considering the s- and p-components of the electric field of the laser light with respect to the specular plane of incidence. This is done by using the formalism of their coherence matrices according to E. Wolf [1]. Additionally, an analytical statistical sea surface BRDF (bidirectional reflectance distribution function) is considered for the reflection of laser light radiances. Validation of the simulation results is required to ensure model credibility and applicability to maritime laser applications.
For validation purposes, field measurement data (images and meteorological data) was analyzed. An infrared laser, with or without a mounted polarizer, produced laser beam reflection at the water surface and images were recorded by a camera equipped with a polarizer with horizontal or vertical alignment. The validation is done by numerical comparison of measured total laser power extracted from recorded images with the corresponding simulation results. The results of the comparison are presented for different incident (zenith/azimuth) angles of the laser beam and different alignment for the laser polarizers (vertical/horizontal/without) and the camera (vertical/horizontal).
Phase and amplitude wave front sensing and reconstruction with a modified plenoptic camera
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Ko, Jonathan; Nelson, William; Davis, Christopher C.
2014-10-01
A plenoptic camera is a camera that can retrieve the direction and intensity distribution of the light rays it collects, enabling multiple reconstruction functions such as refocusing at a different depth and 3D microscopy. Its principle is to add a micro-lens array to a traditional high-resolution camera to form a semi-camera array that preserves redundant intensity distributions of the light field and facilitates back-tracing of rays through geometric knowledge of its optical components. Though it was designed to process incoherent images, we found that the plenoptic camera shows high potential for coherent illumination cases, such as sensing both the amplitude and phase of a distorted laser beam. Based on our earlier introduction of a prototype modified plenoptic camera, we have developed the complete algorithm to reconstruct the wavefront of the incident light field. In this paper the algorithm and experimental results are demonstrated, and an improved version of this modified plenoptic camera is discussed. As a result, our modified plenoptic camera can serve as an advanced wavefront sensor compared with traditional Shack-Hartmann sensors in handling complicated cases such as coherent illumination in strong turbulence, where interference and discontinuity of wavefronts are common. Especially in wave propagation through atmospheric turbulence, this camera should provide a much more precise description of the light field, which would guide adaptive optics systems in making intelligent analyses and corrections.
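For comparison with the Shack-Hartmann sensors mentioned above, the conventional lenslet-based slope estimate (sub-image centroid displacement divided by lenslet focal length) can be sketched as follows. This illustrates the baseline Shack-Hartmann computation, not the authors' plenoptic reconstruction algorithm, and all names are assumptions.

```python
def local_slopes(centroids, references, focal_length):
    """Per-lenslet wavefront slopes from centroid shifts.

    centroids:  list of measured (x, y) spot centroids behind each lenslet.
    references: list of (x, y) centroids for an unaberrated reference beam.
    focal_length: lenslet focal length, same length units as the centroids.
    Returns a list of (slope_x, slope_y) in radians (small-angle).
    """
    return [((cx - rx) / focal_length, (cy - ry) / focal_length)
            for (cx, cy), (rx, ry) in zip(centroids, references)]
```

The sketch also shows the sensor's weakness noted in the abstract: a single slope pair per lenslet cannot represent interference or a discontinuous wavefront within one sub-aperture, which is where the plenoptic approach aims to do better.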
Murine fundus fluorescein angiography: An alternative approach using a handheld camera.
Ehrenberg, Moshe; Ehrenberg, Scott; Schwob, Ouri; Benny, Ofra
2016-07-01
In today's modern pharmacologic approach to treating sight-threatening retinal vascular disorders, there is an increasing demand for a compact, mobile, lightweight, and cost-effective fluorescein fundus camera to document the effects of antiangiogenic drugs on laser-induced choroidal neovascularization (CNV) in mice and other experimental animals. We have adapted the Kowa Genesis Df camera to perform fundus fluorescein angiography (FFA) in mice. The 1-kg, 28-cm-high camera has built-in barrier and exciter filters to allow digital FFA recording to a Compact Flash memory card. Furthermore, this handheld unit has a steady indirect lens holder that attaches firmly to the main unit and securely holds a 90-diopter lens in position, in order to facilitate appropriate focus and stability for photographing the delicate central murine fundus. This easily portable fundus fluorescein camera can effectively record exceptional central retinal vascular detail in murine laser-induced CNV, while readily allowing the investigator to adjust the camera's position according to the variable head and eye movements that can occur randomly while the mouse is optimally anesthetized. This movable image recording device, with efficiencies of space, time, cost, energy, and personnel, has enabled us to accurately document the alterations in the central choroidal and retinal vasculature following induction of CNV, implemented by argon-green laser photocoagulation and disruption of Bruch's membrane, in the experimental murine model of exudative macular degeneration.
Measurement system for 3-D foot coordinates and parameters
NASA Astrophysics Data System (ADS)
Liu, Guozhong; Li, Yunhui; Wang, Boxiong; Shi, Hui; Luo, Xiuzhi
2008-12-01
The 3-D foot-shape measurement system based on the laser-line-scanning principle and the model of the measurement system are presented. Errors caused by the nonlinearity of the CCD cameras and by installation can be eliminated by using a global calibration method for the CCD cameras, based on a nonlinear coordinate mapping function and an optimization method. A local foot coordinate system is defined with the Pternion and the Acropodion extracted from the boundaries of the foot projections. The characteristic points can thus be located, and the foot parameters extracted automatically, using the local foot coordinate system and the related sections. Foot measurements for about 200 participants were conducted, and the measurement results for male and female participants are presented. 3-D foot coordinate and parameter measurement makes custom shoe-making possible and shows great promise in shoe design, foot orthopaedic treatment, shoe-size standardization, and the establishment of a foot database for consumers.
Phoenix's Laser Beam in Action on Mars
NASA Technical Reports Server (NTRS)
2008-01-01
[figure removed for brevity, see original site] The Surface Stereo Imager camera aboard NASA's Phoenix Mars Lander acquired a series of images of the laser beam in the Martian night sky. Bright spots in the beam are reflections from ice crystals in the low-level ice fog. The brighter area at the top of the beam is due to enhanced scattering of the laser light in a cloud. The Canadian-built lidar instrument emits pulses of laser light and records what is scattered back. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
Research on a solid-state streak camera based on an electro-optic crystal
NASA Astrophysics Data System (ADS)
Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang
2006-06-01
With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, a streak camera is widely utilized in measuring ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and making measurements of laser-induced discharge. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity vs. position (or wavelength) information on the ultrafast process. The current streak camera is based on a sweep electric pulse and an image-converting tube with a wavelength-sensitive photocathode ranging from the x-ray to the near-infrared region. This kind of streak camera is comparatively costly and complex. This paper describes the design and performance of a new-style streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera achieves time resolution by direct photon-beam deflection using the electro-optic effect, which can replace the current streak camera from the visible to the near-infrared region. After computer-aided simulation, we designed a crystal streak camera with a potential time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits and a crystal with a larger electro-optic coefficient, for example LN (γ33 = 33.6×10^-12 m/V), together with an optimized optical system, may lead to a time resolution better than 1 ns.
Large format geiger-mode avalanche photodiode LADAR camera
NASA Astrophysics Data System (ADS)
Yuan, Ping; Sudharsanan, Rengarajan; Bai, Xiaogang; Labios, Eduardo; Morris, Bryan; Nicholson, John P.; Stuart, Gary M.; Danny, Harrison
2013-05-01
Recently Spectrolab has successfully demonstrated a compact 32x32 Laser Detection and Ranging (LADAR) camera with single-photon-level sensitivity and a small size, weight, and power (SWaP) budget for three-dimensional (3D) topographic imaging at 1064 nm on various platforms. With a 20-kHz frame rate and 500-ps timing uncertainty, this LADAR system provides coverage down to inch-level fidelity and allows for effective wide-area terrain mapping. At a 10 mph forward speed and 1000 feet above ground level (AGL), it covers 0.5 square mile per hour with a resolution of 25 in2/pixel after data averaging. In order to increase the forward speed to suit more platforms and to survey a large area more effectively, Spectrolab is developing a 32x128 Geiger-mode LADAR camera with a 43-kHz frame rate. With the increase in both frame rate and array size, the data collection rate is improved by a factor of 10. With a programmable bin size from 0.3 ps to 0.5 ns and a 14-bit timing dynamic range, LADAR developers will have more freedom in system integration for various applications. Most of the special features of the Spectrolab 32x32 LADAR camera, such as non-uniform bias correction, variable range-gate width, windowing for smaller arrays, and short-pixel protection, are implemented in this camera.
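The timing-bin-to-range conversion implicit in a Geiger-mode LADAR like this one is the standard time-of-flight relation r = c·t/2. Below is a minimal sketch with assumed names; the default bin size matches the 0.5 ns figure above, but the gate-start convention is an assumption, not the camera's documented behavior.

```python
C = 299_792_458.0  # speed of light, m/s

def bin_to_range(bin_index, bin_s=0.5e-9, gate_start_s=0.0):
    """Range (m) for a photon detected in a given timing bin.

    bin_index:    per-pixel bin number reported by the timing electronics.
    bin_s:        timing bin width in seconds (0.5 ns upper value from text).
    gate_start_s: assumed offset of the range gate opening after laser fire.
    """
    t = gate_start_s + bin_index * bin_s  # round-trip time of flight
    return C * t / 2.0                    # halved: light travels out and back
```

At the 0.5 ns bin width this gives about 7.5 cm of range per bin, consistent with the inch-level fidelity quoted after averaging; the finer 0.3 ps setting would correspond to sub-millimeter bins.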
Imaging hydrogen flames by two-photon, laser-induced fluorescence
NASA Technical Reports Server (NTRS)
Miles, R.; Lempert, W.; Kumar, V.; Diskin, G.
1991-01-01
A nonintrusive multicomponent imaging system is developed which can image hydrogen, hot oxygen, and air simultaneously. An ArF excimer laser is injection-locked to cover the Q1 two-photon transition in molecular hydrogen, which allows the observation of both hot oxygen and cold hydrogen. Rayleigh scattering from the water molecules occurs at the same frequency as the illuminating laser, allowing analysis of the air density. Images of ignited and nonignited hydrogen jets are recorded with a high-sensitivity gated video camera. The images permit the analysis of the turbulent hydrogen-core jet, the combustion zone, and the surrounding air, and two-dimensional spatial correlations can be made to study the turbulent structure and the couplings between different regions of the flow field. The method is of interest for the study of practical combustion systems which employ hydrogen-air diffusion flames.
Experimental investigation of the laser ablation process on wood surfaces
NASA Astrophysics Data System (ADS)
Panzner, M.; Wiedemann, G.; Henneberg, K.; Fischer, R.; Wittke, Th.; Dietsch, R.
1998-05-01
Processing of wood by conventional mechanical tools like saws or planes leaves behind a layer of squeezed wood only slightly adhering to the solid wood surface. Laser ablation of this layer could improve the durability of coatings and glued joints. For technical applications, thorough knowledge of the laser ablation process is necessary. Results of ablation experiments with excimer lasers, Nd:YAG lasers, and TEA-CO2 lasers on surfaces of different wood types and cut orientations are shown. The ablation process was observed by a high-speed camera system and optical spectroscopy. The influence of the experimental parameters is demonstrated by SEM images and by measurement of the ablation rate as a function of energy density. Thermal effects such as melting and carbonizing of cellulose were found for IR and also UV laser wavelengths. Damage to the wood surface after laser ablation was weaker for excimer lasers and TEA-CO2 lasers. This can be explained by the high absorption of wood in the ultraviolet and mid-infrared spectral ranges. As an additional result, this technique provides an easy way to prepare wood surfaces with excellently conserved cellular structure.
NASA Astrophysics Data System (ADS)
Doughty, Austin; Hasanjee, Aamr; Pettitt, Alex; Silk, Kegan; Liu, Hong; Chen, Wei R.; Zhou, Feifan
2016-03-01
Laser immunotherapy is a novel cancer treatment modality that has seen much success in treating many different types of cancer, both in animal studies and in clinical trials. The treatment consists of the synergistic interaction between photothermal laser irradiation and the local injection of an immunoadjuvant. As a result of the therapy, the host immune system launches a systemic antitumor response. The photothermal effect induced by the laser irradiation has multiple effects at different temperature elevations, all of which are required for an optimal response. Therefore, determining the temperature distribution in the target tumor during laser irradiation is crucial to facilitating the treatment of cancers. To investigate the temperature distribution in the target tumor, female Wistar Furth rats were injected with metastatic mammary tumor cells and, upon sufficient tumor growth, underwent laser irradiation while being monitored using thermocouples connected to locally inserted needle probes and infrared thermography. From the study, we determined that the maximum central tumor temperature was higher for tumors of smaller volume. Additionally, we determined that the temperature near the edge of the tumor, as measured with a thermocouple, had a strong correlation with the maximum temperature value in the infrared camera measurement.
Visualization of evolving laser-generated structures by frequency domain tomography
NASA Astrophysics Data System (ADS)
Chang, Yenyu; Li, Zhengyan; Wang, Xiaoming; Zgadzaj, Rafal; Downer, Michael
2011-10-01
We introduce frequency domain tomography (FDT) for single-shot visualization of time-evolving refractive index structures (e.g. laser wakefields, nonlinear index structures) moving at light speed. Previous researchers demonstrated single-shot frequency domain holography (FDH), in which a probe-reference pulse pair co-propagates with the laser-generated structure, to obtain snapshot-like images. However, in FDH, information about the structure's evolution is averaged. To visualize an evolving structure, we use several frequency domain streak cameras (FDSCs), in each of which a probe-reference pulse pair propagates at an angle to the propagation direction of the laser-generated structure. The combination of several FDSCs constitutes the FDT system. We will present experimental results for a 4-probe FDT system that has imaged the whole-beam self-focusing of a pump pulse propagating through glass in a single laser shot. Combining temporal and angle multiplexing methods, we successfully processed data from four probe pulses in one spectrometer in a single shot. The output of the data processing is a multi-frame movie of the self-focusing pulse. Our results point to the possibility of visualizing the evolving laser wakefield structures that underlie laser-plasma accelerators used for multi-GeV electron acceleration.
NASA Astrophysics Data System (ADS)
Fan, Shuzhen; Qi, Feng; Notake, Takashi; Nawata, Kouji; Matsukawa, Takeshi; Takida, Yuma; Minamide, Hiroaki
2014-03-01
Real-time terahertz (THz) wave imaging has wide applications in areas such as security, industry, biology, medicine, pharmacy, and the arts. In this letter, we report real-time room-temperature THz imaging by nonlinear optical frequency up-conversion in an organic 4-dimethylamino-N'-methyl-4'-stilbazolium tosylate crystal. The active projection-imaging system consisted of (1) THz wave generation, (2) THz-near-infrared hybrid optics, (3) THz wave up-conversion, and (4) an InGaAs camera working at 60 frames per second. The pumping laser system consisted of two optical parametric oscillators pumped by a nanosecond frequency-doubled Nd:YAG laser. THz-wave images of handmade samples at 19.3 THz were taken, and videos of a moving sample and of a moving ruler covered with a black polyethylene film were supplied online to demonstrate the real-time capability. Thanks to the high speed and high responsivity of this technology, real-time THz imaging with a higher signal-to-noise ratio than that of a commercially available THz micro-bolometer camera was shown to be feasible. By changing the phase-matching condition, i.e., by changing the wavelength of the pumping laser, we suggest that THz imaging with a narrow THz frequency band of interest in a wide range from approximately 2 to 30 THz is possible.
A compact electron spectrometer for an LWFA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lumpkin, A.; Crowell, R.; Li, Y.
2007-01-01
The use of a laser wakefield accelerator (LWFA) beam as a driver for a compact free-electron laser (FEL) has been proposed recently. A project is underway at Argonne National Laboratory (ANL) to operate an LWFA in the bubble regime and to use the quasi-monoenergetic electron beam as a driver for a 3-m-long undulator for generation of sub-ps UV radiation. The Terawatt Ultrafast High Field Facility (TUHFF) in the Chemistry Division provides the 20-TW peak power laser. A compact electron spectrometer whose initial fields of 0.45 T provide energy coverage of 30-200 MeV has been selected to characterize the electron beams. The system is based on the Ecole Polytechnique design used for their LWFA and incorporates the 5-cm-long permanent magnet dipole, the LANEX scintillator screen located at the dispersive plane, a Roper Scientific 16-bit MCP-intensified CCD camera, and a Bergoz ICT for complementary charge measurements. Test results on the magnets, the 16-bit camera, and the ICT will be described, and initial electron beam data will be presented as available. Other challenges will also be addressed.
Smartphone snapshot mapping of skin chromophores under triple-wavelength laser illumination.
Spigulis, Janis; Oshina, Ilze; Berzina, Anna; Bykov, Alexander
2017-09-01
Chromophore distribution maps are useful tools for skin malformation severity assessment and for monitoring of skin recovery after burns, surgeries, and other interventions. The chromophore maps can be obtained by processing several spectral images of skin, e.g., captured by hyperspectral or multispectral cameras over seconds or even minutes. To avoid motion artifacts and simplify the procedure, a single-snapshot technique for mapping melanin, oxyhemoglobin, and deoxyhemoglobin of in-vivo skin by a smartphone under simultaneous three-wavelength (448–532–659 nm) laser illumination is proposed and examined. Three monochromatic spectral images related to the illumination wavelengths were extracted from the smartphone camera RGB image data set with respect to crosstalk between the RGB detection bands. Spectral images were further processed according to Beer's law in a three-chromophore approximation. Photon absorption path lengths in skin at the exploited wavelengths were estimated by means of Monte Carlo simulations. The technique was validated clinically on three kinds of skin lesions: nevi, hemangiomas, and seborrheic keratosis. Design of the developed add-on laser illumination system, image-processing details, and the results of clinical measurements are presented and discussed.
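The per-pixel processing described above reduces to solving a small linear system of Beer's law at the three wavelengths. A minimal sketch follows; the absorption coefficients and path lengths below are illustrative placeholders, not the paper's measured or simulated values:

```python
import numpy as np

# Hypothetical absorption coefficients of melanin, oxyhemoglobin and
# deoxyhemoglobin at 448, 532 and 659 nm (illustrative numbers only).
E = np.array([
    [40.0, 10.0,  8.0],   # 448 nm
    [20.0, 25.0, 20.0],   # 532 nm
    [10.0,  2.0,  5.0],   # 659 nm
])
# Mean photon path lengths in skin at each wavelength [cm] (assumed here;
# in the paper these come from Monte Carlo simulations).
L = np.array([0.05, 0.08, 0.15])

def unmix(attenuation):
    """Solve Beer's law A(lam) = l(lam) * sum_i E[lam, i] * c_i per pixel.

    attenuation: array (3,) of ln(I0/I) at the three wavelengths.
    Returns the concentrations (3,) of the three chromophores.
    """
    M = E * L[:, None]   # effective 3x3 system matrix
    return np.linalg.solve(M, np.asarray(attenuation)[..., None])[..., 0]

# Round-trip check on a synthetic pixel:
c_true = np.array([0.5, 1.2, 0.3])
A = (E * L[:, None]) @ c_true
print(unmix(A))   # recovers c_true
```

With real coefficient spectra the same 3×3 solve would simply be repeated over every pixel of the three extracted monochromatic images.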
Deviation rectification for dynamic measurement of rail wear based on coordinate sets projection
NASA Astrophysics Data System (ADS)
Wang, Chao; Ma, Ziji; Li, Yanfu; Zeng, Jiuzhen; Jin, Tan; Liu, Hongli
2017-10-01
Dynamic measurement of rail wear using a laser imaging system suffers from random vibrations in the laser-based imaging sensor which cause distorted rail profiles. In this paper, a simple and effective method for rectifying profile deviation is presented to address this issue. There are two main steps: profile recognition and distortion calibration. According to the constant camera and projector parameters, efficient recognition of measured profiles is achieved by analyzing the geometric difference between normal profiles and distorted ones. For a distorted profile, by constructing coordinate sets projecting from it to the standard one on triple projecting primitives, including the rail head inner line, rail waist curve and rail jaw, iterative extrinsic camera parameter self-compensation is implemented. The distortion is calibrated by projecting the distorted profile onto the x-y plane of a measuring coordinate frame, which is parallel to the rail cross section, to eliminate the influence of random vibrations in the laser-based imaging sensor. As well as evaluating the implementation with comprehensive experiments, we also compare our method with other published works. The results exhibit the effectiveness and superiority of our method for the dynamic measurement of rail wear.
Optical Extinction Measurements of Laser Side-Scatter During Tropical Storm Colin
NASA Technical Reports Server (NTRS)
Lane, John E.; Kasparis, Takis; Metzger, Philip; Michaelides, Silas
2017-01-01
A side-scatter imaging (SSI) technique using a 447 nm, 500 mW laser and a Nikon D80 camera was tested at Kennedy Space Center, Florida, during the passing of a rain band associated with Tropical Storm Colin. The June 6, 2016, 22:00 GMT rain event was intense but short-lived owing to the strong west-to-east advection of the rain band. An effort to validate the optical extinction measurement was conducted by setting up a line of three tipping-bucket rain gauges along an 80 m east-west path below the laser beam. Differences between tipping bucket measurements were correlated to the extinction coefficient profile along the laser's path, as determined by the SSI measurement. In order to compare the tipping bucket data to the optical extinction data, a Marshall-Palmer DSD model was assumed. Since this was a daytime event, the laser beam was difficult to detect in the camera images, pointing out an important limitation of SSI measurements: the practical limit of DSD density that can be effectively detected and analyzed under daylight conditions using this laser and camera corresponds to a fairly moderate rainfall rate on the order of 20 mm/h (night measurements achieve much improved sensitivity). The SSI analysis model under test produced promising results, but in order to use the SSI method for routine meteorological studies, improvements to the math model will be required.
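Under the Marshall-Palmer assumption mentioned above, optical extinction follows from the drop-size distribution in closed form. A sketch using the classic N0 and Λ parameterization and the geometric-optics limit Q_ext ≈ 2 (the paper's actual retrieval model may differ):

```python
import numpy as np

def mp_extinction(rain_rate_mm_h):
    """Optical extinction coefficient [1/m] for a Marshall-Palmer DSD.

    N(D) = N0 * exp(-Lambda * D), with the classic parameters
    N0 = 8000 m^-3 mm^-1 and Lambda = 4.1 * R^-0.21 mm^-1, and the
    geometric-optics extinction efficiency Q_ext ~ 2 for visible light.
    Then beta = (pi/2) * integral N(D) D^2 dD = pi * N0 / Lambda^3.
    """
    N0 = 8000.0                                  # m^-3 mm^-1
    lam = 4.1 * rain_rate_mm_h ** -0.21          # mm^-1
    beta_mm2_per_m3 = np.pi * N0 / lam**3        # mm^2 per m^3
    return beta_mm2_per_m3 * 1e-6                # convert to 1/m

# At the ~20 mm/h daylight detection limit quoted above:
print(mp_extinction(20.0))   # roughly 2.4e-3 1/m, i.e. ~2.4 km^-1
```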
Geoscience Laser Ranging System design and performance predictions
NASA Technical Reports Server (NTRS)
Anderson, Kent L.
1991-01-01
The Geoscience Laser Ranging System (GLRS) is a high-precision distance-measuring instrument planned for deployment on the EOS-B platform. Its primary objectives are to perform ranging measurements to ground targets to monitor crustal deformation and tectonic plate motions, and nadir-looking altimetry to determine ice sheet thicknesses, surface topography, and vertical profiles of clouds and aerosols. The system uses a mode-locked, 3-color Nd:YAG laser source, a Microchannel Plate-PMT for absolute time-of-flight (TOF) measurement (at 532 nm), a streak camera for TOF 2-color dispersion measurement (532 nm and 355 nm), and a Si avalanche photodiode for altimeter waveform detection (1064 nm). The performance goals are to make ranging measurements to ground targets with about 1 cm accuracy, and altimetry height measurements over ice with 10 cm accuracy. This paper presents an overview of the design concept developed during a phase B study. System engineering issues and trade studies are discussed, with particular attention to error budgets and performance predictions.
Concepts for laser beam parameter monitoring during industrial mass production
NASA Astrophysics Data System (ADS)
Harrop, Nicholas J.; Maerten, Otto; Wolf, Stefan; Kramer, Reinhard
2017-02-01
In today's industrial mass production, lasers have become an established tool for a variety of processes. As with any other tool, mechanical or otherwise, the laser and its ancillary components are prone to wear and ageing. Monitoring these ageing processes at the full operating power of an industrial laser is challenging for a range of reasons; the damage threshold of the measurement device itself and cycle-time constraints in industrial processing are just two of these challenges. Power measurement, focus spot size, and full beam caustic measurements are being implemented in industrial laser systems. The scope of the measurement and the amount of data collected are limited by the above-mentioned cycle time, which in some cases can be only a few seconds. For successful integration of these measurement systems into automated production lines, the devices must be equipped with standardized communication interfaces, enabling a feedback loop from the measurement device to the laser processing systems. If necessary, these measurements can be performed before each cycle. Power is determined with either static or dynamic calorimetry, while camera and scanning systems are used for beam profile analysis. Power levels can be measured from 25 W up to 20 kW, with focus spot sizes between 10 μm and several millimeters. We will show, backed by relevant statistical data, that defects or contamination in the laser beam path can be detected with the applied measurement systems, enabling a quality control chain to prevent process defects.
Fan, Yingwei; Zhang, Boyu; Chang, Wei; Zhang, Xinran; Liao, Hongen
2018-03-01
Complete resection of diseased lesions reduces the recurrence of cancer, making it critical for surgical treatment. However, precisely resecting residual tumors during an operation is a challenge. A novel integrated spectral-domain optical-coherence-tomography (SD-OCT) and laser-ablation therapy system for soft-biological-tissue resection is proposed. This is a prototype optical integrated diagnosis and therapeutic system, i.e., an optical theranostics system. We develop an optical theranostics system that integrates SD-OCT, a laser-ablation unit, and an automatic scanning platform. The SD-OCT image of biological tissue provides an intuitive and clear view for intraoperative diagnosis and monitoring in real time. The effect of laser ablation is analyzed using a quantitative mathematical model. The automatic endoscopic scanning platform combines an endoscopic probe and an SD-OCT sample arm to provide optical theranostic scanning motion. An optical fiber and a charge-coupled device camera are integrated into the endoscopic probe, allowing detection and coupling of the OCT-aiming beam and laser spots. The integrated diagnostic and therapeutic system combines SD-OCT imaging and laser-ablation modules with an automatic scanning platform. OCT imaging, laser-ablation treatment, and the integration and control of diagnostic and therapeutic procedures were evaluated by performing phantom experiments. Furthermore, SD-OCT-guided laser ablation provided precise laser ablation and resection of malignant lesions in soft-biological-tissue-lesion surgery. The results demonstrated that the appropriate laser-radiation power and duration were 10 W and 10 s, respectively. In the laser-ablation evaluation experiment, the error was approximately 0.1 mm. Another validation experiment was performed to obtain OCT images of the pre- and post-ablated craters of ex vivo porcine brainstem. We propose an optical integrated diagnosis and therapeutic system.
The primary experimental results show the high efficiency and feasibility of our theranostics system, which is promising for realizing accurate resection of tumors in vivo and in situ in the future.
Virtual-stereo fringe reflection technique for specular free-form surface testing
NASA Astrophysics Data System (ADS)
Ma, Suodong; Li, Bo
2016-11-01
Due to their excellent ability to improve the performance of optical systems, free-form optics have attracted extensive interest in many fields, e.g. optical design of astronomical telescopes, laser beam expanders, spectral imagers, etc. However, compared with traditional simple optics, testing such optics is usually more complex and difficult, which has long been a major barrier to the manufacture and application of these optics. Fortunately, owing to the rapid development of electronic devices and computer vision technology, the fringe reflection technique (FRT), with the advantages of simple system structure, high measurement accuracy, and large dynamic range, is becoming a powerful tool for specular free-form surface testing. In order to obtain absolute surface shape distributions of test objects, two or more cameras are often required in the conventional FRT, which makes the system structure more complex and the measurement cost much higher. Furthermore, high-precision synchronization between the cameras is also a troublesome issue. To overcome the aforementioned drawbacks, a virtual-stereo FRT for specular free-form surface testing is put forward in this paper. It is able to achieve absolute profiles with the help of only a single biprism and a camera, while avoiding the problems of stereo FRT based on binocular or multi-ocular cameras. Preliminary experimental results demonstrate the feasibility of the proposed technique.
Laser-Induced-Fluorescence Photogrammetry and Videogrammetry
NASA Technical Reports Server (NTRS)
Danehy, Paul; Jones, Tom; Connell, John; Belvin, Keith; Watson, Kent
2004-01-01
An improved method of dot-projection photogrammetry and an extension of the method to encompass dot-projection videogrammetry overcome some deficiencies of dot-projection photogrammetry as previously practiced. The improved method makes it possible to perform dot-projection photogrammetry or videogrammetry on targets that have previously not been amenable to dot-projection photogrammetry because they do not scatter enough light. Such targets include ones that are transparent, specularly reflective, or dark. In standard dot-projection photogrammetry, multiple beams of white light are projected onto the surface of an object of interest (denoted the target) to form a known pattern of bright dots. The illuminated surface is imaged in one or more cameras oriented at a nonzero angle or angles with respect to a central axis of the illuminating beams. The locations of the dots in the image(s) contain stereoscopic information on the locations of the dots, and, hence, on the location, shape, and orientation of the illuminated surface of the target. The images are digitized and processed to extract this information. Hardware and software to implement standard dot-projection photogrammetry are commercially available. Success in dot-projection photogrammetry depends on achieving sufficient signal-to-noise ratios: that is, it depends on scattering of enough light by the target so that the dots as imaged in the camera(s) stand out clearly against the ambient-illumination component of the image of the target. In one technique used previously to increase the signal-to-noise ratio, the target is illuminated by intense, pulsed laser light and the light entering the camera(s) is band-pass filtered at the laser wavelength. 
Unfortunately, speckle caused by the coherence of the laser light engenders apparent movement in the projected dots, thereby giving rise to errors in the measurement of the centroids of the dots and corresponding errors in the computed shape and location of the surface of the target. The improved method is denoted laser-induced-fluorescence photogrammetry.
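The background-elimination step of standard dot-projection photogrammetry (differencing a projector-off frame against a projector-on frame, then locating dot centroids) can be sketched as follows; the threshold value and the single-dot intensity-weighted centroid are illustrative simplifications of a full multi-dot pipeline:

```python
import numpy as np

def dot_centroid(ambient, illuminated, thresh=30):
    """Isolate a projected dot by frame differencing; return its centroid.

    ambient: frame with the projector off; illuminated: the same scene
    with the dot pattern on. Subtracting the frames suppresses the
    ambient-illumination component so only the dot remains. A full
    implementation would label connected components to handle many dots;
    here a single dot's intensity-weighted centroid is computed.
    """
    diff = illuminated.astype(np.int32) - ambient.astype(np.int32)
    ys, xs = np.nonzero(diff > thresh)
    w = diff[ys, xs].astype(float)
    return np.array([np.sum(xs * w), np.sum(ys * w)]) / np.sum(w)

# Synthetic 64x64 frame with one Gaussian dot centered at (x=40, y=20):
yy, xx = np.mgrid[0:64, 0:64]
dot = 200.0 * np.exp(-((xx - 40)**2 + (yy - 20)**2) / 8.0)
ambient = np.full((64, 64), 50.0)
cx, cy = dot_centroid(ambient, ambient + dot)
print(cx, cy)   # close to (40, 20)
```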
Non-optically combined multispectral source for IR, visible, and laser testing
NASA Astrophysics Data System (ADS)
Laveigne, Joe; Rich, Brian; McHugh, Steve; Chua, Peter
2010-04-01
Electro-optical technology continues to advance, incorporating developments in infrared and laser technology into smaller, more tightly integrated systems that can see and discriminate military targets at ever-increasing distances. New systems incorporate laser illumination and ranging with gated sensors that allow unparalleled vision at a distance. These new capabilities augment existing all-weather performance in the mid-wave infrared (MWIR) and long-wave infrared (LWIR), as well as low-light-level visible and near infrared (VNIR), giving the user multiple means of looking at targets of interest. There is a need in the test industry to generate imagery in the relevant spectral bands, and to provide temporal stimulus for testing range-gated systems. Santa Barbara Infrared (SBIR) has developed a new means of combining a uniform infrared source with uniform laser and visible sources for electro-optics (EO) testing. The source has been designed to allow laboratory testing of surveillance systems incorporating an infrared imager and a range-gated camera, and for field testing of emerging multi-spectral/fused sensor systems. A description of the source will be presented along with performance data relating to EO testing, including output in pertinent spectral bands, stability and resolution.
Single shot laser speckle based 3D acquisition system for medical applications
NASA Astrophysics Data System (ADS)
Khan, Danish; Shirazi, Muhammad Ayaz; Kim, Min Young
2018-06-01
The state-of-the-art techniques used by medical practitioners to extract the three-dimensional (3D) geometry of different body parts, such as laser line profiling or structured-light scanning, require a series of images/frames. Movement of the patient during the scanning process often leads to inaccurate measurements due to sequential image acquisition. Single-shot structured-light techniques are robust to motion, but their prevalent challenges are low point density and algorithm complexity. In this research, a single-shot 3D measurement system is presented that extracts the 3D point cloud of human skin by projecting a laser speckle pattern, using a single pair of images captured by two synchronized cameras. In contrast to conventional laser speckle 3D measurement systems that establish stereo correspondence by digital correlation of projected speckle patterns, the proposed system employs the KLT tracking method to locate the corresponding points. The 3D point cloud contains no outliers and sufficient quality of 3D reconstruction is achieved. The 3D shape acquisition of human body parts validates the potential application of the proposed system in the medical industry.
Multi-camera digital image correlation method with distributed fields of view
NASA Astrophysics Data System (ADS)
Malowany, Krzysztof; Malesa, Marcin; Kowaluk, Tomasz; Kujawinska, Malgorzata
2017-11-01
A multi-camera digital image correlation (DIC) method and system for measurements of large engineering objects with distributed, non-overlapping areas of interest are described. The data obtained with individual 3D DIC systems are stitched by an algorithm which utilizes the positions of fiducial markers determined simultaneously by Stereo-DIC units and a laser tracker. The proposed calibration method enables reliable determination of transformations between local (3D DIC) and global coordinate systems. The applicability of the method was proven during in-situ measurements of a hall made of arch-shaped (18 m span) self-supporting metal plates. The proposed method is highly recommended for 3D measurements of shape and displacements of large and complex engineering objects performed from multiple directions, and it provides data of suitable accuracy for further advanced structural integrity analysis of such objects.
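The stitching step, determining the transformation from each local 3D DIC frame into the laser tracker's global frame from matched fiducial markers, is in essence a least-squares rigid registration. A sketch using the standard SVD-based (Kabsch) solution, which is offered as an assumption about the kind of algorithm involved, not as the authors' exact implementation:

```python
import numpy as np

def rigid_transform(local_pts, global_pts):
    """Least-squares rigid transform (R, t) mapping local -> global.

    local_pts, global_pts: (N, 3) matched fiducial-marker coordinates,
    e.g. as seen by a Stereo-DIC unit and by the laser tracker. Uses the
    SVD-based Kabsch/Procrustes solution.
    """
    cl, cg = local_pts.mean(0), global_pts.mean(0)
    H = (local_pts - cl).T @ (global_pts - cg)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard vs. reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cg - R @ cl
    return R, t

# Sanity check: recover a known rotation + translation from 6 markers.
rng = np.random.default_rng(0)
P = rng.normal(size=(6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true
R, t = rigid_transform(P, Q)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```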
Protection performance evaluation regarding imaging sensors hardened against laser dazzling
NASA Astrophysics Data System (ADS)
Ritt, Gunnar; Koerber, Michael; Forster, Daniel; Eberle, Bernd
2015-05-01
Electro-optical imaging sensors are widely distributed and used for many different purposes, including civil security and military operations. However, laser irradiation can easily disturb their operational capability. Thus, an adequate protection mechanism for electro-optical sensors against dazzling and damaging is highly desirable. Different protection technologies exist now, but none of them satisfies the operational requirements without any constraints. In order to evaluate the performance of various laser protection measures, we present two different approaches based on triangle orientation discrimination on the one hand and structural similarity on the other hand. For both approaches, image analysis algorithms are applied to images taken of a standard test scene with triangular test patterns which is superimposed by dazzling laser light of various irradiance levels. The evaluation methods are applied to three different sensors: a standard complementary metal oxide semiconductor camera, a high dynamic range camera with a nonlinear response curve, and a sensor hardened against laser dazzling.
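The structural-similarity approach can be illustrated with a minimal, single-window version of the SSIM index; the evaluation described above presumably uses the standard windowed form, so this global variant only sketches the idea:

```python
import numpy as np

def global_ssim(x, y, dynamic_range=255.0):
    """Single-window structural similarity between two images.

    A global-statistics simplification of SSIM (the full index is
    computed over sliding windows and averaged). Constants follow the
    standard choice c1 = (0.01 L)^2, c2 = (0.03 L)^2.
    """
    x = x.astype(float)
    y = y.astype(float)
    c1 = (0.01 * dynamic_range) ** 2
    c2 = (0.03 * dynamic_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return ((2*mx*my + c1) * (2*cxy + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

# Identical images score 1; a brightness-shifted "dazzled" copy scores lower.
img = np.tile(np.arange(64), (64, 1)).astype(float)
dazzled = np.clip(img + 150.0, 0, 255)
print(global_ssim(img, img))       # 1.0
print(global_ssim(img, dazzled))   # well below 1
```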
NASA Technical Reports Server (NTRS)
Mungas, Greg S.; Gursel, Yekta; Sepulveda, Cesar A.; Anderson, Mark; La Baw, Clayton; Johnson, Kenneth R.; Deans, Matthew; Beegle, Luther; Boynton, John
2008-01-01
Conducting high resolution field microscopy with coupled laser spectroscopy that can be used to selectively analyze the surface chemistry of individual pixels in a scene is an enabling capability for next generation robotic and manned spaceflight missions and for civil and military applications. In the laboratory, we use a range of imaging and surface preparation tools that provide us with in-focus images, context imaging for identifying features that we want to investigate at high magnification, and surface-optical coupling that allows us to apply optical spectroscopic analysis techniques for analyzing surface chemistry, particularly at high magnifications. The camera, hand lens, and microscope probe with scannable laser spectroscopy (CHAMP-SLS) is an imaging/spectroscopy instrument capable of imaging continuously from infinity down to high resolution microscopy (resolution of approx. 1 micron/pixel in the final camera format); the closer CHAMP-SLS is placed to a feature, the higher the resulting magnification. At hand-lens to microscopic magnifications, the imaged scene can be selectively interrogated with point spectroscopic techniques such as Raman spectroscopy, microscopic laser-induced breakdown spectroscopy (micro-LIBS), laser ablation mass spectrometry, fluorescence spectroscopy, and/or reflectance spectroscopy. This paper summarizes the optical design, development, and testing of the CHAMP-SLS optics.
Digest of NASA earth observation sensors
NASA Technical Reports Server (NTRS)
Drummond, R. R.
1972-01-01
A digest of technical characteristics of remote sensors and supporting technological experiments uniquely developed under NASA Applications Programs for Earth Observation Flight Missions is presented. Included are camera systems, sounders, interferometers, communications and experiments. In the text, these are grouped by types, such as television and photographic cameras, lasers and radars, radiometers, spectrometers, technology experiments, and transponder technology experiments. Coverage of the brief history of development extends from the first successful earth observation sensor aboard Explorer 7 in October, 1959, through the latest funded and flight-approved sensors under development as of October 1, 1972. A standard resume format is employed to normalize and mechanize the information presented.
Techniques for optically compressing light intensity ranges
Rushford, Michael C.
1989-01-01
A pin hole camera assembly for use in viewing an object having a relatively large light intensity range, for example a crucible containing molten uranium in an atomic vapor laser isotope separator (AVLIS) system is disclosed herein. The assembly includes means for optically compressing the light intensity range appearing at its input sufficient to make it receivable and decipherable by a standard video camera. A number of different means for compressing the intensity range are disclosed. These include the use of photogray glass, the use of a pair of interference filters, and the utilization of a new liquid crystal notch filter in combination with an interference filter.
Techniques for optically compressing light intensity ranges
Rushford, M.C.
1989-03-28
A pin hole camera assembly for use in viewing an object having a relatively large light intensity range, for example a crucible containing molten uranium in an atomic vapor laser isotope separator (AVLIS) system is disclosed herein. The assembly includes means for optically compressing the light intensity range appearing at its input sufficient to make it receivable and decipherable by a standard video camera. A number of different means for compressing the intensity range are disclosed. These include the use of photogray glass, the use of a pair of interference filters, and the utilization of a new liquid crystal notch filter in combination with an interference filter. 18 figs.
Small SWAP 3D imaging flash ladar for small tactical unmanned air systems
NASA Astrophysics Data System (ADS)
Bird, Alan; Anderson, Scott A.; Wojcik, Michael; Budge, Scott E.
2015-05-01
The Space Dynamics Laboratory (SDL), working with Naval Research Laboratory (NRL) and industry leaders Advanced Scientific Concepts (ASC) and Hood Technology Corporation, has developed a small SWAP (size, weight, and power) 3D imaging flash ladar (LAser Detection And Ranging) sensor system concept design for small tactical unmanned air systems (STUAS). The design utilizes an ASC 3D flash ladar camera and laser in a Hood Technology gyro-stabilized gimbal system. The design is an autonomous, intelligent, geo-aware sensor system that supplies real-time 3D terrain and target images. Flash ladar and visible camera data are processed at the sensor using a custom digitizer/frame grabber with compression. Mounted in the aft housing are power, controls, processing computers, and GPS/INS. The onboard processor controls pointing and handles image data, detection algorithms and queuing. The small SWAP 3D imaging flash ladar sensor system generates georeferenced terrain and target images with a low probability of false return and <10 cm range accuracy through foliage in real-time. The 3D imaging flash ladar is designed for a STUAS with a complete system SWAP estimate of <9 kg, <0.2 m3 and <350 W power. The system is modeled using LadarSIM, a MATLAB®- and Simulink®-based ladar system simulator designed and developed by the Center for Advanced Imaging Ladar (CAIL) at Utah State University. We will present the concept design and modeled performance predictions.
Measuring the circular motion of small objects using laser stroboscopic images.
Wang, Hairong; Fu, Y; Du, R
2008-01-01
Measuring the circular motion of a small object, including its displacement, speed, and acceleration, is a challenging task. This paper presents a new method for measuring repetitive and/or nonrepetitive, constant-speed and/or variable-speed circular motion using laser stroboscopic images. Under stroboscopic illumination, each image taken by an ordinary camera records multiple outlines of an object in motion; hence, processing the stroboscopic image can extract the motion information. We built an experimental apparatus consisting of a laser as the light source, a stereomicroscope to magnify the image, and a normal complementary metal oxide semiconductor camera to record the image. As the object moves, the stroboscopic illumination generates a speckle pattern on the object that can be recorded by the camera and analyzed by a computer. Experimental results indicate that the stroboscopic imaging is stable under various conditions. Moreover, the characteristics of the motion, including the displacement, the velocity, and the acceleration, can be calculated based on the width of the speckle marks, the illumination intensity, the duty cycle, and the sampling frequency. Compared with the popular high-speed camera method, the presented method may achieve the same measuring accuracy, but with much reduced cost and complexity.
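The speed recovery from a single speckle mark reduces to simple kinematics: during one illumination pulse of duration t_on = duty_cycle / strobe_frequency, a feature smears by v·t_on. A sketch under that simplified model (the paper's full analysis also involves the illumination intensity and sampling frequency):

```python
def speed_from_streak(streak_len_um, spot_len_um, strobe_freq_hz, duty_cycle):
    """Estimate speed from the width of one stroboscopic speckle mark.

    During a pulse of duration duty_cycle / strobe_freq_hz, a moving
    feature smears from its static length spot_len_um into a streak of
    length streak_len_um, so in this simplified kinematic model
        v = (streak_len - spot_len) / t_on.
    For circular motion, dividing by the path radius gives the angular
    speed.
    """
    t_on = duty_cycle / strobe_freq_hz            # pulse duration [s]
    return (streak_len_um - spot_len_um) / t_on   # speed [um/s]

# e.g. a 5 um speckle smeared to 25 um under 1 kHz, 10% duty illumination:
print(speed_from_streak(25.0, 5.0, 1000.0, 0.10))   # ~2e5 um/s, i.e. 0.2 m/s
```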
Optical correlator method and apparatus for particle image velocimetry processing
NASA Technical Reports Server (NTRS)
Farrell, Patrick V. (Inventor)
1991-01-01
Young's fringes are produced from a double exposure image of particles in a flowing fluid by passing laser light through the film and projecting the light onto a screen. A video camera receives the image from the screen and controls a spatial light modulator. The spatial light modulator has a two-dimensional array of cells whose transmissiveness is controlled in relation to the brightness of the corresponding pixel of the video camera image of the screen. A collimated beam of laser light is passed through the spatial light modulator to produce a diffraction pattern which is focused onto another video camera, with the output of the camera being digitized and provided to a microcomputer. The diffraction pattern formed when the laser light is passed through the spatial light modulator and focused to a point corresponds to the two-dimensional Fourier transform of the Young's fringe pattern projected onto the screen. This invention was made with U.S. Government support awarded by the Department of the Army (DOD) and NASA, grant number(s): DOD #DAAL03-86-K0174 and NASA #NAG3-718. The U.S. Government has certain rights in this invention.
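The optical correlation at the heart of the apparatus, a lens forming the 2D Fourier transform of the Young's fringes, can be emulated numerically: the FFT of the fringe pattern concentrates energy in a pair of peaks whose position encodes the particle displacement. A synthetic sketch (geometry and wavelength scale factors are omitted and the fringe frequency simply stands in for the displacement):

```python
import numpy as np

# Synthetic Young's fringe pattern: a particle displacement produces
# fringes whose spatial frequency is proportional to that displacement.
N = 128
fx, fy = 6, 3          # fringe frequency in cycles/frame ~ displacement
y, x = np.mgrid[0:N, 0:N]
fringes = 1.0 + np.cos(2 * np.pi * (fx * x + fy * y) / N)

# The 2D FFT (which the optical correlator realizes with a lens)
# concentrates energy at +-(fx, fy):
F = np.abs(np.fft.fft2(fringes))
F[0, 0] = 0.0                         # suppress the DC term
ky, kx = np.unravel_index(np.argmax(F), F.shape)
kx = kx if kx <= N // 2 else kx - N   # unwrap negative frequencies
ky = ky if ky <= N // 2 else ky - N
print(kx, ky)   # +-(6, 3): direction and magnitude of the displacement
```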
NASA Astrophysics Data System (ADS)
Ocylok, Sörn; Alexeev, Eugen; Mann, Stefan; Weisheit, Andreas; Wissenbach, Konrad; Kelbassa, Ingomar
One major demand on today's laser metal deposition (LMD) processes is to achieve a fail-safe build-up with respect to changing conditions such as heat accumulation. Especially for the repair of thin parts such as turbine blades, knowledge of the correlations between melt pool behavior and process parameters such as laser power, feed rate, and powder mass stream is indispensable. The paper will show the process layout with the camera-based coaxial monitoring system and the quantitative influence of the process parameters on the melt pool geometry. For this purpose, the diameter, length, and area of the melt pool are measured by a video analysis system at various parameters and compared with the track width in cross-sections and with the laser spot diameter. The influence of changing process conditions on the melt pool is also investigated. On the basis of these results, an enhanced process for the build-up of a multilayer single-track fillet geometry will be presented.
Compact fluorescence and white-light imaging system for intraoperative visualization of nerves
NASA Astrophysics Data System (ADS)
Gray, Dan; Kim, Evgenia; Cotero, Victoria; Staudinger, Paul; Yazdanfar, Siavash; Tan Hehir, Cristina
2012-02-01
Fluorescence image guided surgery (FIGS) allows intraoperative visualization of critical structures, with applications spanning neurology, cardiology and oncology. An unmet clinical need is the prevention of iatrogenic nerve damage, a major cause of post-surgical morbidity. Here we describe the advancement of FIGS imaging hardware, coupled with a custom nerve-labeling fluorophore (GE3082), to bring FIGS nerve imaging closer to clinical translation. The instrument comprises a 405 nm laser and a white-light LED source for excitation and illumination. A single 90 g color CCD camera is coupled to a 10 mm surgical laparoscope for image acquisition. Synchronization of the light source and camera allows simultaneous visualization of reflected white light and fluorescence using only a single camera. The imaging hardware and contrast agent were evaluated in rats during in situ surgical procedures.
A compact fluorescence and white light imaging system for intraoperative visualization of nerves
NASA Astrophysics Data System (ADS)
Gray, Dan; Kim, Evgenia; Cotero, Victoria; Staudinger, Paul; Yazdanfar, Siavash; Tan Hehir, Cristina
2012-03-01
Fluorescence image guided surgery (FIGS) allows intraoperative visualization of critical structures, with applications spanning neurology, cardiology and oncology. An unmet clinical need is the prevention of iatrogenic nerve damage, a major cause of post-surgical morbidity. Here we describe the advancement of FIGS imaging hardware, coupled with a custom nerve-labeling fluorophore (GE3082), to bring FIGS nerve imaging closer to clinical translation. The instrument comprises a 405 nm laser and a white-light LED source for excitation and illumination. A single 90 g color CCD camera is coupled to a 10 mm surgical laparoscope for image acquisition. Synchronization of the light source and camera allows simultaneous visualization of reflected white light and fluorescence using only a single camera. The imaging hardware and contrast agent were evaluated in rats during in situ surgical procedures.
GEOS observation systems intercomparison investigation results
NASA Technical Reports Server (NTRS)
Berbert, J. H.
1974-01-01
The results of an investigation designed to determine the relative accuracy and precision of the different types of geodetic observation systems used by NASA are presented. A collocation technique was used to minimize the effects of uncertainties in the relative station locations and in the earth's gravity field model by installing accurate reference tracking systems close to the systems to be compared, and by precisely determining their relative survey. The Goddard laser and camera systems were shipped to selected sites, where they tracked the GEOS satellite simultaneously with other systems for an intercomparison observation.
Performance of PHOTONIS' low light level CMOS imaging sensor for long range observation
NASA Astrophysics Data System (ADS)
Bourree, Loig E.
2014-05-01
Identification of potential threats in low-light conditions through imaging is commonly achieved in closed-circuit television (CCTV) and surveillance cameras by combining the extended near-infrared (NIR) response (800-1000 nm wavelengths) of the imaging sensor with NIR LED or laser illuminators. Consequently, camera systems typically used for long-range observation often require high-power lasers in order to deliver sufficient photons onto targets to acquire detailed images at night. While these systems may adequately identify targets at long range, the NIR illumination needed to achieve such functionality can easily be detected and therefore may not be suitable for covert applications. To reduce the dependency on supplemental illumination in low-light conditions, the frame rate of the imaging sensor may be reduced to increase the photon integration time and thus improve the signal-to-noise ratio of the image. However, this may hinder the camera's ability to image moving objects with high fidelity. To address these drawbacks, PHOTONIS has developed a CMOS imaging sensor (CIS) with a pixel architecture and geometry designed specifically for low-light-level imaging. By combining this CIS with field-programmable gate array (FPGA)-based image processing electronics, PHOTONIS has achieved low-read-noise imaging with enhanced signal-to-noise ratio at quarter-moon illumination, all at standard video frame rates. The performance of this CIS is discussed herein and compared to other commercially available CMOS and CCD sensors for long-range observation applications.
Laser- and Multi-Spectral Monitoring of Natural Objects from UAVs
NASA Astrophysics Data System (ADS)
Reiterer, Alexander; Frey, Simon; Koch, Barbara; Stemmler, Simon; Weinacker, Holger; Hoffmann, Annemarie; Weiler, Markus; Hergarten, Stefan
2016-04-01
The paper describes the research, development and evaluation of a lightweight sensor system for UAVs. The system is composed of three main components: (1) a laser scanning module, (2) a multi-spectral camera system, and (3) a processing/storage unit. All three components are newly developed. Besides measurement precision and frequency, the low weight has been one of the challenging tasks. The current system has a total weight of about 2.5 kg and is designed as a self-contained unit (incl. storage and battery units). The main features of the system are: laser-based multi-echo 3D measurement at a wavelength of 905 nm (fully eye-safe), a measurement range of up to 200 m, a measurement frequency of 40 kHz, a scanning frequency of 16 Hz, and a relative distance accuracy of 10 mm. The system is equipped with both GNSS and IMU. Alternatively, a multi-visual-odometry system has been integrated to estimate the trajectory of the UAV from image features (based on this system, a calculation of 3D coordinates without GNSS is possible). The integrated multi-spectral camera system is based on conventional CMOS image chips equipped with special sets of band-pass interference filters with a full width at half maximum (FWHM) of 50 nm. Good results for calculating the normalized difference vegetation index (NDVI) and the wide dynamic range vegetation index (WDRVI) have been achieved using the band-pass interference filter set with a FWHM of 50 nm and exposure times between 5,000 μs and 7,000 μs. The system is currently used for monitoring of natural objects and surfaces, such as forests, as well as for geo-risk analysis (landslides). By measuring 3D-geometric and multi-spectral information, reliable monitoring and interpretation of the data set is possible. The paper gives an overview of the development steps, the system, the evaluation and first results.
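The two vegetation indices named in the abstract above are simple per-pixel band ratios. A minimal numpy sketch (the weighting coefficient a = 0.2 for WDRVI is a common literature choice, not a value taken from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index from NIR and red band images."""
    nir = nir.astype(float); red = red.astype(float)
    return (nir - red) / (nir + red + eps)

def wdrvi(nir, red, a=0.2, eps=1e-9):
    """Wide dynamic range vegetation index; the coefficient a (often
    0.1-0.2) damps NIR saturation over dense canopy."""
    nir = nir.astype(float); red = red.astype(float)
    return (a * nir - red) / (a * nir + red + eps)

# two pixels: healthy vegetation (NIR >> red) and bare ground (NIR = red)
nir = np.array([[0.50, 0.40]])
red = np.array([[0.10, 0.40]])
ndvi_map = ndvi(nir, red)
wdrvi_map = wdrvi(nir, red)
```

The eps term only guards against division by zero on dark pixels and is negligible otherwise.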
Industrial applications of shearography for inspection of aircraft components
NASA Astrophysics Data System (ADS)
Krupka, Rene; Walz, Thomas; Ettemeyer, Andreas
2005-04-01
Shearography has been validated as a fast and reliable inspection technique for aerospace components. Following several years of evaluation, shearography has meanwhile entered industrial production inspection. The applications range from serial inspection in the production line to field inspection in assembly and to applications in the maintenance and repair area. In all applications, the main advantages of shearography, namely very fast, full-field inspection and high sensitivity even on very complex composite materials, have led to the decision for laser shearography as the inspection tool. In this paper, we present some highlights of industrial shearography inspection. One of the first industrial installations of laser shearography in Europe was a fully automatic inspection system for helicopter rotor blades. Complete rotor blades are inspected within 10 minutes for delaminations and debonding in the composite structure. In the case of more complex components, robotic manipulation of the shearography camera has proven to be the optimal solution. An industrial 6-axis robot gives the utmost flexibility to position the camera at any angle and distance. Automatic defect marking systems have also been introduced to indicate the exact position of the defect directly on the inspected component. Other applications are shearography inspection systems for abradable seals in jet engines and portable shearography inspection systems for maintenance and repair inspection in the field. In this paper, recent installations of automatic inspection systems in the aerospace industries are presented.
NASA Astrophysics Data System (ADS)
Qi, Li; Wang, Shun; Zhang, Yixin; Sun, Yingying; Zhang, Xuping
2015-11-01
The quality inspection process is usually carried out after first processing of the raw materials, such as cutting and milling, because the parts of the material to be used are unidentified until they have been trimmed. If the quality of the material is assessed before the laser process, the energy and effort wasted on defective materials can be saved. We propose a new production scheme that achieves quantitative quality inspection prior to primitive laser cutting by means of three-dimensional (3-D) vision measurement. First, the 3-D model of the object is reconstructed by the stereo cameras, from which the spatial cutting path is derived. Second, collaborating with another rear camera, the 3-D cutting path is reprojected to both the frontal and rear views of the object, thus generating the regions of interest (ROIs) for surface defect analysis. An accurate vision-guided laser process and reprojection-based ROI segmentation are enabled by a global-optimization-based trinocular calibration method. The prototype system was built and tested with the processing of raw duck feathers for high-quality badminton shuttle manufacture. Incorporating a two-dimensional wavelet-decomposition-based defect analysis algorithm, both the geometrical and appearance features of the raw feathers are quantified before they are cut into small patches, resulting in fully automatic feather cutting and sorting.
Apparatus and method for laser beam diagnosis
Salmon, Jr., Joseph T.
1991-01-01
An apparatus and method are disclosed for accurate, real-time monitoring of the wavefront curvature of a coherent laser beam. Knowing the curvature, it can be quickly determined whether the laser beam is collimated, focusing (converging), or de-focusing (diverging). The apparatus includes a lateral interferometer for forming an interference pattern of the laser beam to be diagnosed. The interference pattern is imaged to a spatial light modulator (SLM), whose output is a coherent laser beam having an image of the interference pattern impressed on it. The SLM output is focused to obtain the far-field diffraction pattern. A video camera, such as a CCD, monitors the far-field diffraction pattern and provides an electrical output indicative of the shape of the far-field pattern. Specifically, the far-field pattern comprises a central lobe and side lobes, whose relative positions are indicative of the radius of curvature of the beam. The video camera's electrical output may be provided to a computer which analyzes the data to determine the wavefront curvature of the laser beam.
Apparatus and method for laser beam diagnosis
Salmon, J.T. Jr.
1991-08-27
An apparatus and method are disclosed for accurate, real-time monitoring of the wavefront curvature of a coherent laser beam. Knowing the curvature, it can be quickly determined whether the laser beam is collimated, focusing (converging), or de-focusing (diverging). The apparatus includes a lateral interferometer for forming an interference pattern of the laser beam to be diagnosed. The interference pattern is imaged to a spatial light modulator (SLM), whose output is a coherent laser beam having an image of the interference pattern impressed on it. The SLM output is focused to obtain the far-field diffraction pattern. A video camera, such as a CCD, monitors the far-field diffraction pattern and provides an electrical output indicative of the shape of the far-field pattern. Specifically, the far-field pattern comprises a central lobe and side lobes, whose relative positions are indicative of the radius of curvature of the beam. The video camera's electrical output may be provided to a computer which analyzes the data to determine the wavefront curvature of the laser beam. 11 figures.
NASA Astrophysics Data System (ADS)
Thoeni, K.; Giacomini, A.; Murtagh, R.; Kniest, E.
2014-06-01
This work presents a comparative study between multi-view 3D reconstruction using various digital cameras and a terrestrial laser scanner (TLS). Five different digital cameras were used in order to estimate the limits related to the camera type and to establish the minimum camera requirements needed to obtain results comparable to those of the TLS. The cameras used for this study range from commercial grade to professional grade and include a GoPro Hero 1080 (5 Mp), iPhone 4S (8 Mp), Panasonic Lumix LX5 (9.5 Mp), Panasonic Lumix ZS20 (14.1 Mp) and Canon EOS 7D (18 Mp). The TLS used for this work was a FARO Focus 3D laser scanner with a range accuracy of ±2 mm. The study area is a small rock wall of about 6 m height and 20 m length. The wall is partly smooth with some evident geological features, such as non-persistent joints and sharp edges. Eight control points were placed on the wall and their coordinates were measured with a total station. These coordinates were then used to georeference all models. A similar number of images was acquired from distances of approximately 5 to 10 m, depending on the field of view of each camera. The commercial software package PhotoScan was used to process the images, georeference and scale the models, and generate the dense point clouds. Finally, the open-source package CloudCompare was used to assess the accuracy of the multi-view results. Each point cloud obtained from a specific camera was compared to the point cloud obtained with the TLS, the latter being taken as ground truth. The result is a coloured point cloud for each camera showing the deviation in relation to the TLS data. The main goal of this study is to quantify the quality of the multi-view 3D reconstruction results obtained with the various cameras as objectively as possible and to evaluate the applicability of the method to geotechnical problems.
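The deviation map described in the abstract above is, at its core, a nearest-neighbour cloud-to-cloud distance. A brute-force numpy sketch (CloudCompare's actual comparison tools are more sophisticated, and a k-d tree is needed for clouds of realistic size):

```python
import numpy as np

def cloud_deviation(test_cloud, ref_cloud):
    """Nearest-neighbour (cloud-to-cloud) distance from each point of a
    photogrammetric cloud to a reference (e.g. TLS) cloud, as used to
    colour deviation maps.  Brute force: O(N*M), fine for small clouds."""
    d = np.linalg.norm(test_cloud[:, None, :] - ref_cloud[None, :, :], axis=2)
    return d.min(axis=1)

# toy reference cloud and two test points offset by 0.1 m and 0.2 m
ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
test = np.array([[0.1, 0.0, 0.0], [1.0, 0.0, 0.2]])
dev = cloud_deviation(test, ref)
```

Summary statistics of `dev` (mean, standard deviation, percentiles) then quantify the accuracy of each camera's reconstruction against the TLS ground truth.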
Analysis of Photogrammetry Data from ISIM Mockup
NASA Technical Reports Server (NTRS)
Nowak, Maria; Hill, Mike
2007-01-01
During ground testing of the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST), the ISIM Optics group plans to use a photogrammetry measurement system for cryogenic calibration of specific target points on the ISIM composite structure, the Science Instrument optical benches and other GSE equipment. This testing will occur in the Space Environmental Systems (SES) chamber at Goddard Space Flight Center. Close-range photogrammetry is a three-dimensional metrology technique that uses triangulation to locate custom targets in three coordinates via a collection of digital photographs taken from various locations and orientations. These photos are connected using coded targets (special targets that are recognized by the software, allowing it to correlate the images and provide a three-dimensional map of the targets) and scaled via well-calibrated scale bars. Photogrammetry solves for the camera locations and the coordinates of the targets simultaneously through the bundling procedure contained in the V-STARS software, proprietary software owned by Geodetic Systems Inc. The primary objectives of the metrology performed on the ISIM mock-up were (1) to quantify the accuracy of the INCA3 photogrammetry camera on a representative full-scale version of the ISIM structure at ambient temperature by comparing the measurements obtained with this camera to measurements using the Leica laser tracker system and (2) to empirically determine the smallest increment of target position movement that can be resolved by the PG camera in the test setup, i.e., its precision or resolution. In addition, the geometrical details of the test setup defined during the mockup testing, such as target locations and camera positions, will contribute to the final design of the photogrammetry system to be used on the ISIM flight structure.
Ripeness detection simulation of oil palm fruit bunches using laser-based imaging system
NASA Astrophysics Data System (ADS)
Shiddiq, Minarni; Fitmawati, Anjasmara, Ridho; Sari, Nurmaya; Hefniati
2017-01-01
Ripeness is one of the important factors for quality sorting of harvested oil palm fresh fruit bunches (FFB). Traditional ripeness classification using FFB color and the number of loose fruits has some disadvantages, especially for tall oil palm trees. A laser-based imaging system is proposed to substitute for the traditional method. In this study, a ripeness detection simulation of oil palm FFBs was performed. The system is composed of two diode lasers with wavelengths of 532 nm and 680 nm and a CMOS camera, set on a rotating plate for easy adjustment of the laser beam hitting the FFB. The FFB samples were placed on an aluminum platform at 4 height variations: 1.5 m, 2 m, 2.5 m, and 3 m. The relations of the reflectance intensities, represented by the Red Green Blue (RGB) values of the FFB images, to the height variations and ripeness levels of FFBs with and without the laser beam were analyzed. The samples were from the Tenera variety with 4 ripeness levels, called F0, F1, F3, and F4. The results showed that the red component of the RGB values was dominant for FFBs without laser illumination and with the red laser. The average RGB values are higher for the F3 (ripe) and F4 (overripe) levels. Imaging with the green laser showed consistency. Since the imaging method using lasers was able to differentiate ripeness levels of oil palm fresh fruit bunches, it could be applied for future remote detection of oil palm FFB ripeness.
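The RGB-based classification step in the abstract above can be sketched as a mean-colour rule over the segmented bunch region. The threshold values below are invented purely for illustration; real class boundaries would have to be fitted to labelled Tenera FFB image data.

```python
import numpy as np

# Hypothetical mean-red thresholds for illustration only (F0 unripe
# ... F4 overripe); not taken from the paper.
RIPENESS_CLASSES = [(140.0, "F4"), (110.0, "F3"), (80.0, "F1"), (0.0, "F0")]

def mean_rgb(image, mask):
    """Average R, G, B over the segmented fruit-bunch pixels only."""
    return image[mask].mean(axis=0)

def classify(mean_red):
    """Return the first class whose threshold the mean red value meets."""
    for threshold, label in RIPENESS_CLASSES:
        if mean_red >= threshold:
            return label

# synthetic uniformly coloured "bunch" with a full-frame mask
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[...] = (120, 60, 40)
mask = np.ones((4, 4), dtype=bool)
r, g, b = mean_rgb(img, mask)
label = classify(r)
```

In practice the mask would come from segmenting the laser-illuminated bunch from the background, and separate rules would be needed per platform height to compensate for the distance-dependent intensity.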
NASA Astrophysics Data System (ADS)
Berruto, G.; Madan, I.; Murooka, Y.; Vanacore, G. M.; Pomarico, E.; Rajeswari, J.; Lamb, R.; Huang, P.; Kruchkov, A. J.; Togawa, Y.; LaGrange, T.; McGrouther, D.; Rønnow, H. M.; Carbone, F.
2018-03-01
We demonstrate that light-induced heat pulses of different duration and energy can write Skyrmions over a broad range of temperatures and magnetic fields in FeGe. Using a combination of camera-rate and pump-probe cryo-Lorentz transmission electron microscopy, we directly resolve the spatiotemporal evolution of the magnetization following optical excitation. The Skyrmion lattice was found to maintain its structural properties during the laser-induced demagnetization, and its recovery to the initial state happened in the sub-μs to μs range, depending on the cooling rate of the system.
Simultaneous one-dimensional fluorescence lifetime measurements of OH and CO in premixed flames
NASA Astrophysics Data System (ADS)
Jonsson, Malin; Ehn, Andreas; Christensen, Moah; Aldén, Marcus; Bood, Joakim
2014-04-01
A method for simultaneous measurement of the fluorescence lifetimes of two species along a line is described. The experimental setup is based on picosecond laser pulses from two tunable optical parametric generator/optical parametric amplifier systems together with a streak camera. With an appropriate optical time delay between the two laser pulses, whose wavelengths are tuned to excite two different species, laser-induced fluorescence can be detected with both temporal and spatial resolution by the streak camera. Hence, our method enables one-dimensional imaging of the fluorescence lifetimes of two species in the same streak camera recording. The concept is demonstrated for fluorescence lifetime measurements of CO and OH in a laminar methane/air flame on a Bunsen-type burner. Measurements were taken in flames at four different equivalence ratios, namely ϕ = 0.9, 1.0, 1.15, and 1.25. The measured one-dimensional lifetime profiles generally agree well with lifetimes calculated from quenching cross sections found in the literature and quencher concentrations predicted by the GRI 3.0 mechanism. For OH, there is a systematic deviation of approximately 30% between calculated and measured lifetimes; this is found to be mainly due to the adiabatic assumption regarding the flame and to uncertainty in the H2O quenching cross section. This emphasizes the strength of measuring the quenching rates rather than relying on models. The measurement concept might be useful for single-shot measurements of the fluorescence lifetimes of several species pairs of vital importance in combustion processes, hence allowing fluorescence signals to be corrected for quenching and ultimately yielding quantitative concentration profiles.
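The lifetime extraction underlying the abstract above reduces, in the idealized background-free single-exponential case, to fitting ln S(t) against t; a minimal numpy sketch with synthetic data (real streak-camera analysis must also deal with the instrument response function and noise):

```python
import numpy as np

def fit_lifetime(t, signal):
    """Estimate a single-exponential fluorescence lifetime tau from a
    background-free decay S(t) = S0 * exp(-t / tau) by a linear
    least-squares fit of ln S against t."""
    slope, _intercept = np.polyfit(t, np.log(signal), 1)
    return -1.0 / slope

# synthetic decay with a known lifetime of 1.8 ns, sampled over 10 ns
t = np.linspace(0.0, 10.0, 200)      # ns
s = 5.0 * np.exp(-t / 1.8)
tau = fit_lifetime(t, s)
```

The measured lifetime is what allows the quenching correction the abstract mentions: since the fluorescence quantum yield is proportional to the lifetime, dividing the signal by tau removes the quenching dependence.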
Common aperture multispectral spotter camera: Spectro XR
NASA Astrophysics Data System (ADS)
Petrushevsky, Vladimir; Freiman, Dov; Diamant, Idan; Giladi, Shira; Leibovich, Maor
2017-10-01
The Spectro XRTM is an advanced color/NIR/SWIR/MWIR 16'' payload recently developed by Elbit Systems / ELOP. The payload's primary sensor is a spotter camera with a common 7'' aperture. The sensor suite also includes an MWIR zoom, an EO zoom, a laser designator or rangefinder, a laser pointer / illuminator and a laser spot tracker. A rigid structure, vibration damping and 4-axis gimbals enable a high level of line-of-sight stabilization. The payload's feature list includes a multi-target video tracker, precise boresight, strap-on IMU, embedded moving map, geodetic calculations suite, and image fusion. The paper describes the main technical characteristics of the spotter camera. A visible-quality, all-metal front catadioptric telescope maintains optical performance over a wide range of environmental conditions. High-efficiency coatings separate the incoming light into EO, SWIR and MWIR band channels. Both the EO and SWIR bands have dual FOV and 3 spectral filters each. Several variants of focal plane array formats are supported. The common-aperture design facilitates superior DRI performance in EO and SWIR in comparison to conventionally configured payloads. Special spectral calibration and color correction extend the effective range of color imaging. An advanced CMOS FPA and the low F-number of the optics facilitate low-light performance. The SWIR band provides further atmospheric penetration, as well as see-spot capability at especially long ranges, due to asynchronous pulse detection. The MWIR band has good sharpness over the entire field of view and (with a full-HD FPA) delivers an amount of detail far exceeding that of VGA-equipped FLIRs. The Spectro XR offers a level of performance typically associated with larger and heavier payloads.
A compact, efficient, and lightweight laser head for CARLO®: integration, performance, and benefits
NASA Astrophysics Data System (ADS)
Deibel, Waldemar; Schneider, Adrian; Augello, Marcello; Bruno, Alfredo E.; Juergens, Philipp; Cattin, Philippe
2015-09-01
Ever since the first functional lasers were built about 50 years ago, researchers and doctors have dreamed of medical uses for such systems. Today's technology is finally advanced enough to realize these ambitions in a variety of medical fields. There are well-established laser-based systems in ophthalmology, dental applications, treatment of kidney stones, and many more. Using lasers presents more than just an alternative to conventional methods for osteotomies: it offers less tissue damage, faster healing times and comparable intervention duration, and consequently improves the postoperative treatment of patients. However, a few factors limit routine application. These technical drawbacks include missing depth control and safe guiding of the laser beam. This paper presents the engineering and integration of a miniaturized laser head for a computer-assisted and robot-guided laser osteotome (CARLO®), which can overcome the mentioned drawbacks. The CARLO® device ensures safe and precise guidance of the laser beam. Such guidance also enables new opportunities and methods, e.g. free geometrical functional cuts, which have the potential to revolutionize bone surgery. The laser head is optimized for beam shaping, target conditioning, working distance, compactness and the integration of all other parts needed, e.g. CCD cameras for monitoring and referencing, a visible laser for cut simulation, etc. The beam coming out of the laser system is conditioned in shape, energy properties and working distance by an optical arrangement to achieve the desired cutting performance. Parameters like optical losses, operating mode, optics materials and long-term stability are also taken into account.
Comparison of laser Doppler and laser speckle contrast imaging using a concurrent processing system
NASA Astrophysics Data System (ADS)
Sun, Shen; Hayes-Gill, Barrie R.; He, Diwei; Zhu, Yiqun; Huynh, Nam T.; Morgan, Stephen P.
2016-08-01
Full-field laser Doppler imaging (LDI) and single-exposure laser speckle contrast imaging (LSCI) are directly compared using a novel instrument which can concurrently image blood flow with both LDI and LSCI signal processing. Incorporating a commercial CMOS camera chip and a field-programmable gate array (FPGA), the flow images of LDI and the contrast maps of LSCI are simultaneously processed from the same detected optical signals. The comparison was carried out by imaging a rotating diffuser. LDI has a linear response to velocity. In contrast, LSCI is exposure-time dependent and does not provide a linear response in the presence of static speckle. It is also demonstrated that the relationship between LDI and LSCI can be described by a power law which depends on the exposure time of LSCI.
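The LSCI contrast map compared in the abstract above is computed as K = sigma/mean over a small sliding window of a single-exposure image. A numpy sketch with synthetic images (the window size and the two test images are illustrative, not the paper's data; faster flow blurs the speckle within the exposure, lowering the local variance and hence K):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(intensity, win=7):
    """Local speckle contrast K = sigma / mean over a sliding win x win
    window of a single-exposure speckle image (valid region only)."""
    w = sliding_window_view(intensity.astype(float), (win, win))
    mu = w.mean(axis=(-2, -1))
    return w.std(axis=(-2, -1)) / mu

# synthetic stand-ins: slow flow -> high speckle variance, fast flow -> low
rng = np.random.default_rng(2)
slow = 100.0 + rng.normal(0.0, 30.0, (64, 64))
fast = 100.0 + rng.normal(0.0, 3.0, (64, 64))
k_slow = speckle_contrast(slow).mean()
k_fast = speckle_contrast(fast).mean()
```

Converting K to a velocity estimate requires a speckle decorrelation model involving the exposure time, which is exactly where the exposure-time dependence and the power-law relationship to LDI discussed in the paper enter.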
Automatic 3D relief acquisition and georeferencing of road sides by low-cost on-motion SfM
NASA Astrophysics Data System (ADS)
Voumard, Jérémie; Bornemann, Perrick; Malet, Jean-Philippe; Derron, Marc-Henri; Jaboyedoff, Michel
2017-04-01
3D terrain relief acquisition is important for a large part of the geosciences. Several methods have been developed to digitize terrain, such as total stations, LiDAR, GNSS and photogrammetry. To digitize road (or rail track) sides over long sections, mobile spatial imaging systems or UAVs are commonly used. In this project, we compare a still fairly new method, the on-motion SfM technique, with traditional terrain digitizing techniques (terrestrial laser scanning, traditional SfM, UAS imaging solutions, GNSS surveying systems and total stations). The on-motion SfM technique generates 3D spatial data by photogrammetric processing of images taken from a moving vehicle. Our mobile system consists of six action cameras placed on a vehicle. Four fisheye cameras mounted on a mast on the vehicle roof are placed 3.2 meters above the ground. Three of them have a GNSS chip providing geotagged images. Two pictures were acquired every second by each camera. 4K-resolution fisheye videos were also used to extract 8.3 Mp non-geotagged pictures. All these pictures are then processed with the Agisoft PhotoScan Professional software. Results from the on-motion SfM technique are compared with results from classical SfM photogrammetry on a 500-meter-long alpine track. They were also compared with mobile laser scanning data on the same road section. First results seem to indicate that slope structures are well observable down to decimetric accuracy. For the georeferencing, the planimetric (XY) accuracy of a few meters is much better than the altimetric (Z) accuracy: there is a Z-coordinate shift of a few tens of meters between the GoPro cameras and the Garmin camera, which makes it necessary to give greater freedom to the altimetric coordinates in the processing software. The benefits of this low-cost on-motion SfM method are: 1) a simple setup to use in the field (easy to switch between vehicle types such as car, train, bike, etc.), 2) low cost and 3) automatic georeferencing of the 3D point clouds.
The main disadvantages are: 1) results are less accurate than those from a LiDAR system, 2) heavy image processing and 3) a short acquisition distance.
10-kW-class YAG laser application for heavy components
NASA Astrophysics Data System (ADS)
Ishide, Takashi; Tsubota, S.; Nayama, Michisuke; Shimokusu, Yoshiaki; Nagashima, Tadashi; Okimura, K.
2000-02-01
The authors have put kW-class YAG lasers to practical use for repair welding of nuclear power plant steam generator heat exchanger tubes, all-position welding of pipings, etc. This paper describes the following developed methods and systems for high-power YAG laser processing. First, we apply 6 kW to 10 kW YAG lasers to welding and cutting of heavy components. The beam guide systems we have used are optical fibers with a core diameter of 0.6 mm to 0.8 mm and a standard length of 200 m. Using these systems, we obtain single-pass penetration of 15 mm to 20 mm, and multi-pass welding for thicker plates. Cutting data for 100 mm thick plate, relevant to the dismantling of nuclear power plants, are also described. In these systems we carried out in-process monitoring by CCD camera image processing and by a monitoring fiber placed coaxially with the YAG optical lens system. For in-process monitoring with the monitoring fiber, we measured the light intensity from the welding area. Further, we have developed a new hybrid welding method with a TIG electrode at the center of the lens for high power. The hybrid TIG-YAG welding aims at reducing welding groove allowances and at welding of high quality. Through these techniques we have applied a 7 kW class YAG laser to welding of components of nuclear power plants.
Single-pixel camera with one graphene photodetector.
Li, Gongxin; Wang, Wenxue; Wang, Yuechao; Yang, Wenguang; Liu, Lianqing
2016-01-11
Consumer cameras in the megapixel range are ubiquitous, but their improvement is hindered by the poor performance and high cost of traditional photodetectors. Graphene, a two-dimensional micro-/nano-material, has recently exhibited exceptional properties over traditional materials as a sensing element in a photodetector. However, it is difficult to fabricate a large-scale array of graphene photodetectors to replace the traditional photodetector array. To take full advantage of the unique characteristics of the graphene photodetector, in this study we integrated a graphene photodetector into a single-pixel camera based on compressive sensing. To begin with, we introduce a method called laser scribing for fabricating the graphene; it produces graphene components in arbitrary patterns more quickly than traditional methods and without their photoresist contamination. Next, we propose a system for calibrating the optoelectrical properties of micro-/nano-photodetectors based on a digital micromirror device (DMD), which changes the light intensity by controlling the number of individual micromirrors positioned at +12°. The calibration sensitivity is driven by the sum of all micromirrors of the DMD and can be as high as 10^-5 A/W. Finally, the single-pixel camera integrated with one graphene photodetector was used to recover a static image to demonstrate the feasibility of the single-pixel imaging system with the graphene photodetector. A high-resolution image can be recovered with the camera at a sampling rate much lower than the Nyquist rate. The study is the first recorded demonstration of a macroscopic camera with a graphene photodetector. The camera has the potential for high-speed and high-resolution imaging at much lower cost than traditional megapixel cameras.
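The compressive-sensing measurement model behind a single-pixel camera is y = A x, where each row of A is one DMD mask pattern and each entry of y is one photodetector reading; a sparse scene can then be recovered from fewer measurements than pixels. A minimal sketch using orthogonal matching pursuit, a generic sparse-recovery algorithm (not necessarily the reconstruction method used in the paper), with a synthetic sparse "scene":

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = A @ x
    by greedily selecting the column most correlated with the residual."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
n, m, k = 64, 32, 3                  # scene pixels, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1.0, 2.0, k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random DMD mask patterns
y = A @ x_true                                 # single-pixel readings
x_hat = omp(A, y, k)
```

Here only m = n/2 measurements are taken, illustrating the sub-Nyquist sampling the abstract refers to; natural images are made sparse by a transform (e.g. wavelets) rather than being sparse pixel-wise.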
SLATE: scanning laser automatic threat extraction
NASA Astrophysics Data System (ADS)
Clark, David J.; Prickett, Shaun L.; Napier, Ashley A.; Mellor, Matthew P.
2016-10-01
SLATE is an Autonomous Sensor Module (ASM) designed to work with the SAPIENT system, providing accurate location tracking and classification of targets that pass through its field of view. The concept behind the SLATE ASM is to produce a sensor module that provides a view of the world complementary to the camera-based systems usually used for wide-area surveillance. Cameras provide a high-fidelity, human-understandable view of the world with which tracking and identification algorithms can be used. Unfortunately, positioning and tracking in a 3D environment is difficult to implement robustly, making location-based threat assessment challenging. SLATE uses a Scanning Laser Rangefinder (SLR) that provides precise (<1 cm) positions, sizes, shapes and velocities of targets within its field of view (FoV). In this paper we discuss the development of the SLATE ASM, including the techniques used to track and classify detections that move through the field of view of the sensor, providing accurate tracking information to the SAPIENT system. SLATE's ability to locate targets precisely allows subtle boundary-crossing judgements, e.g. on which side of a chain-link fence a target is. SLATE's ability to track targets in 3D throughout its FoV enables behavior classification, such as running versus walking, which can provide an indication of intent and help reduce false alarm rates.
Enhanced Video-Oculography System
NASA Technical Reports Server (NTRS)
Moore, Steven T.; MacDougall, Hamish G.
2009-01-01
A previously developed video-oculography system has been enhanced for use in measuring vestibulo-ocular reflexes of a human subject in a centrifuge, motor vehicle, or other setting. The system as previously developed included a lightweight digital video camera mounted on goggles. The left eye was illuminated by an infrared light-emitting diode via a dichroic mirror, and the camera captured images of the left eye in infrared light. To extract eye-movement data, the digitized video images were processed by software running in a laptop computer. Eye movements were calibrated by having the subject view a target pattern, fixed with respect to the subject's head, generated by a goggle-mounted laser with a diffraction grating. The system as enhanced includes a second camera for imaging the scene from the subject's perspective, and two inertial measurement units (IMUs) for measuring linear accelerations and rates of rotation for computing head movements. One IMU is mounted on the goggles, the other on the centrifuge or vehicle frame. All eye-movement and head-motion data are time-stamped. In addition, the subject's point of regard is superimposed on each scene image to enable analysis of patterns of gaze in real time.
Range camera on conveyor belts: estimating size distribution and systematic errors due to occlusion
NASA Astrophysics Data System (ADS)
Blomquist, Mats; Wernersson, Ake V.
1999-11-01
When range cameras are used for analyzing irregular material on a conveyor belt, there will be complications such as missing segments caused by occlusion. Also, a number of range discontinuities will be present. Within a framework based on stochastic geometry, conditions are found for the cases in which range discontinuities take place. The test objects in this paper are pellets for the steel industry. An illuminating laser plane gives range discontinuities at the edges of each individual object. These discontinuities are used to detect and measure the chord created by the intersection of the laser plane and the object. From the measured chords we derive the average diameter and its variance. An improved method is to use a pair of parallel illuminating light planes to extract two chords. The estimation error for this method is no larger than the natural shape fluctuations (the difference in diameter) of the pellets. The laser-camera optronics is sensitive enough both for material on a conveyor belt and for free-falling material leaving the conveyor.
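The chord-to-diameter step can be illustrated for the idealized case of spherical pellets sliced by the laser plane at a uniformly random offset from the pellet center, where the mean chord is E[c] = πD/4. The diameter and sample count below are made-up values for the sketch, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
true_d = 12.0                                # assumed pellet diameter, mm
r = true_d / 2

# the laser plane cuts each pellet at a uniformly random offset from its center
offsets = rng.uniform(0.0, r, 10_000)
chords = 2.0 * np.sqrt(r**2 - offsets**2)    # chord length seen by the range camera

# for uniform offsets, E[chord] = pi * D / 4, so invert:
d_est = 4.0 * chords.mean() / np.pi
```

The same inversion applied to measured chords gives the average diameter; the chord variance likewise carries the shape-fluctuation information the abstract mentions.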
First Imaging of Laser-Induced Spark on Mars
2014-07-16
NASA's Curiosity Mars rover used the Mars Hand Lens Imager (MAHLI) camera on its arm to capture the first images of sparks produced by the rover's laser being fired at a rock on Mars. The left image is from before the laser zapped this rock, called Nova.
Total Internal Reflection Microscopy (TIRM) as a nondestructive surface damage assessment tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, Z.M.; Cohen, S.J.; Taylor, J.R.
1994-10-01
An easy-to-use, nondestructive method for evaluating subsurface damage in polished substrates has been established at LLNL. Subsurface damage has been related to laser damage in coated optical components used in high-power, high-repetition-rate laser systems. Total Internal Reflection Microscopy (TIRM) has been shown to be a viable nondestructive technique for analyzing subsurface damage in optical components. A successful TIRM system has been established for evaluating subsurface damage on fused silica components. Laser light scattered from subsurface damage sites is collected through a Nomarski microscope. These images are then captured by a CCD camera for analysis on a computer. A variety of optics, including components with intentional subsurface damage due to grinding and polishing, have been analyzed and their TIRM images compared to an existing destructive etching method. Methods for quantitative measurement of subsurface damage are also discussed.
NASA Astrophysics Data System (ADS)
Li, Shichun; Chen, Genyu; Katayama, Seiji; Zhang, Yi
2014-06-01
Spatter and molten pool behavior, two important phenomena governing weld quality, were observed and studied using a high-speed camera and an X-ray transmission imaging system during laser welding under different welding parameters. The formation mechanism of spatter and the corresponding relationships between spatter and molten pool behavior were investigated. An increase in laser power caused more intense evaporation and led to more spatter. When the focal position of the laser beam was changed, different forms of spatter were generated, and the flow trends of the molten metal on the front keyhole wall and at the rear of the molten pool changed accordingly. The results revealed that the behavior of the molten pool, which could be affected by the absorbed energy distribution in the keyhole, was the key factor determining spatter formation during laser welding. A relatively sound weld seam could be obtained during laser welding with the focal position located inside the metal.
Experimental study of hot cracking at circular welding joints of 42CrMo steel
NASA Astrophysics Data System (ADS)
Zhang, Yan; Chen, Genyu; Chen, Binghua; Wang, Jinhai; Zhou, Cong
2017-12-01
Hot cracking at circular welding joints of quenched and tempered 42CrMo steel was studied. The flow of the molten pool and the solidification process of the weld were observed with a high-speed video camera. Information on variations in the weld temperature was collected using an infrared (IR) thermal imaging system. The metallurgical factors of hot cracking were analyzed via metallographic microscopy and scanning electron microscopy (SEM). The results show that the laser-leading laser-metal active gas (MAG) hybrid welding process has a smaller solid-liquid boundary movement rate (VSL) and a smaller solid-liquid boundary temperature gradient (GSL) than the arc-leading laser-MAG hybrid welding process and the laser welding process. Additionally, the metal in the molten pool has superior permeability while flowing toward the dendritic roots and can compensate for the inner-dendritic pressure balance. Therefore, the laser-leading laser-MAG hybrid welding process has the lowest hot cracking susceptibility.
The 1973 Smithsonian standard earth (3) [for the satellite geodesy program]
NASA Technical Reports Server (NTRS)
Gaposchkin, E. M. (Editor)
1973-01-01
The origins of the satellite geodesy program are described, starting with the International Geophysical Year, continuing through a number of international programs, and culminating with the National Geodetic Satellite Program. The philosophical basis for the Baker-Nunn camera and the laser ranging system, the evolution of international scientific cooperation, and the significance of the results are discussed.
Khalil, Ahmed Asaad I; Gondal, Mohammed A; Shemis, Mohamed; Khan, Irfan S
2015-03-10
A UV single-pulse (SP) laser-induced breakdown spectroscopy (LIBS) system was developed to detect carcinogenic metals in human kidney stones extracted during surgery. A neodymium-doped yttrium aluminium garnet (Nd:YAG) laser operating at a 266 nm wavelength and 20 Hz repetition rate, along with a spectrometer interfaced with an intensified CCD (ICCD), was used for spectral analysis of the kidney stones. The ICCD camera shutter was synchronized with the laser-trigger pulse, and the effect of laser energy and delay time on LIBS signal intensity was investigated. The experimental parameters were optimized to obtain a LIBS plasma in local thermodynamic equilibrium. Laser energy was varied from 25 to 50 mJ to enhance the LIBS signal intensity and attain the best signal-to-noise ratio. These parametric dependence studies were important for improving the limit of detection of the trace amounts of toxic elements present inside the stones. The carcinogenic metals detected in kidney stones were chromium, cadmium, lead, zinc, phosphate, and vanadium. The results achieved with the LIBS system were also compared with inductively coupled plasma-mass spectrometry analysis, and the concentrations detected with the two techniques were in very good agreement. The plasma parameters (electron temperature and density) for the SP-LIBS system were also studied, and their dependence on incident laser energy and delay time was investigated as well.
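The local-thermodynamic-equilibrium condition mentioned above is what lets the electron temperature be extracted from a Boltzmann plot of relative line intensities: ln(Iλ/(gA)) plotted against upper-level energy has slope -1/(kT). The sketch below uses synthetic, noise-free line data with invented energies, degeneracies, and transition probabilities, not values from the study.

```python
import numpy as np

K_B_EV = 8.617e-5                 # Boltzmann constant, eV/K
T_true = 11_000.0                 # assumed plasma temperature, K

# hypothetical emission lines: upper-level energy E (eV), degeneracy g,
# transition probability A (1/s), wavelength lam (nm)
E   = np.array([3.0, 3.7, 4.3, 5.1])
g   = np.array([5.0, 3.0, 7.0, 5.0])
A   = np.array([2.0e7, 1.5e7, 4.0e7, 1.0e7])
lam = np.array([500.0, 460.0, 430.0, 390.0])

# LTE line intensities: I ~ (g * A / lam) * exp(-E / (k * T))
I = (g * A / lam) * np.exp(-E / (K_B_EV * T_true))

# Boltzmann plot: ln(I * lam / (g * A)) vs E is linear with slope -1/(k*T)
yb = np.log(I * lam / (g * A))
slope, _ = np.polyfit(E, yb, 1)
T_est = -1.0 / (K_B_EV * slope)
```

With real spectra the same fit is done on measured peak areas, and the scatter about the line is one check of the LTE assumption.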
Nonlinear excitation fluorescence microscopy: source considerations for biological applications
NASA Astrophysics Data System (ADS)
Wokosin, David L.
2008-02-01
Ultra-short-pulse solid-state laser sources have improved contrast within fluorescence imaging and also opened new windows of investigation in biological imaging applications. Additionally, the pulsed illumination enables harmonic scattering microscopy which yields intrinsic structure, symmetry and contrast from viable embryos, cells and tissues. Numerous human diseases are being investigated by the combination of (more) intact dynamic tissue imaging of cellular function with gene-targeted specificity and electrophysiology context. The major limitation to more widespread use of multi-photon microscopy has been the complete system cost and added complexity above and beyond commercial camera and confocal systems. The current status of all-solid-state ultrafast lasers as excitation sources will be reviewed since these lasers offer tremendous potential for affordable, reliable, "turnkey" multiphoton imaging systems. This effort highlights the single-box laser systems currently commercially available, with defined suggestions for the ranges for individual laser parameters as derived from a biological and fluorophore limited perspective. The standard two-photon dose is defined by 800 nm, 10 mW, 200 fs, and 80 MHz - at the sample plane for tissue culture cells, i.e. after the full scanning microscope system. Selected application-derived excitation wavelengths are well represented by 700 nm, 780 nm, ~830 nm, ~960 nm, 1050 nm, and 1250 nm. Many of the one-box lasers have fixed or very limited excitation wavelengths available, so the lasers will be lumped near 780 nm, 800 nm, 900 nm, 1050 nm, and 1250 nm. The following laser parameter ranges are discussed: average power from 200 mW to 2 W, pulse duration from 70 fs to 700 fs, pulse repetition rate from 20 MHz to 200 MHz, with the laser output linearly polarized with an extinction ratio of at least 100:1.
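The quoted standard two-photon dose (10 mW average, 200 fs pulses at 80 MHz) implies a duty cycle, per-pulse energy, and peak power that can be checked directly:

```python
# standard two-photon dose from the text: 10 mW average, 200 fs, 80 MHz
avg_power = 10e-3        # W, average power at the sample plane
rep_rate  = 80e6         # Hz, pulse repetition rate
pulse_dur = 200e-15      # s, pulse duration

duty_cycle   = rep_rate * pulse_dur     # fraction of time light is present
pulse_energy = avg_power / rep_rate     # J per pulse -> 125 pJ
peak_power   = avg_power / duty_cycle   # W -> 625 W
```

The ~625 W peak from only 10 mW average is what makes two-photon excitation efficient at biologically tolerable average powers.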
Time-resolved soft-x-ray studies of energy transport in layered and planar laser-driven targets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stradling, G.L.
New low-energy x-ray diagnostic techniques are used to explore energy-transport processes in laser-heated plasmas. Streak cameras are used to provide 15-psec time-resolution measurements of subkeV x-ray emission. A very thin (50 μg/cm²) carbon substrate provides a low-energy x-ray transparent window to the transmission photocathode of this soft x-ray streak camera. Active differential vacuum pumping of the instrument is required. The use of high-sensitivity, low secondary-electron energy-spread CsI photocathodes in x-ray streak cameras is also described. Significant increases in sensitivity with only a small and intermittent decrease in dynamic range were observed. These coherent, complementary advances in subkeV, time-resolved x-ray diagnostic capability are applied to energy-transport investigations of 1.06-μm laser plasmas. Both solid disk targets of a variety of Z's and Be-on-Al layered-disk targets were irradiated with 700-psec laser pulses of selected intensity between 3 × 10¹⁴ W/cm² and 1 × 10¹⁵ W/cm².
Measurement of vibration using phase only correlation technique
NASA Astrophysics Data System (ADS)
Balachandar, S.; Vipin, K.
2017-08-01
A novel method for the measurement of vibration is proposed and demonstrated. The proposed setup is based on laser triangulation and consists of a line laser, the object under test, and a high-speed camera remotely controlled by software. The experiment involves launching a line-laser probe beam perpendicular to the axis of the vibrating object. The reflected probe beam is recorded by the high-speed camera. The dynamic position of the laser line in the camera plane is governed by the magnitude and frequency of the vibrating test object. Using the phase correlation technique, the maximum distance travelled by the probe beam in the CCD plane is measured in pixels using MATLAB. The actual displacement of the object in mm is obtained by calibration. From the displacement-versus-time data, other vibration-associated quantities such as acceleration, velocity, and frequency are evaluated. Preliminary results of the proposed method are reported for accelerations from 1 g to 3 g and frequencies from 6 Hz to 26 Hz. The results closely match theoretical values. The advantage of the proposed method is that it is non-destructive, and, using the phase correlation algorithm, subpixel displacement in the CCD plane can be measured with high accuracy.
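A minimal 1-D version of the phase-only correlation step can be sketched as follows, assuming a noise-free synthetic line profile and an integer pixel shift (the subpixel refinement of the correlation peak that the abstract relies on is omitted here):

```python
import numpy as np

rng = np.random.default_rng(2)
n, true_shift = 256, 17
profile = rng.standard_normal(n)            # synthetic line-laser intensity profile
shifted = np.roll(profile, true_shift)      # the same profile displaced on the sensor

# phase-only correlation: whiten the cross-power spectrum, then inverse FFT;
# ideally this yields a delta function located at the shift
F1, F2 = np.fft.fft(profile), np.fft.fft(shifted)
cross = np.conj(F1) * F2
poc = np.fft.ifft(cross / (np.abs(cross) + 1e-12)).real
est = int(np.argmax(poc))
est = est - n if est > n // 2 else est      # wrap to a signed shift
```

Because only the spectral phase is kept, the peak stays sharp even when the two frames differ in overall brightness, which is why the method localizes the laser line so precisely.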
Time-resolved lidar fluorosensor for sea pollution detection
NASA Technical Reports Server (NTRS)
Ferrario, A.; Pizzolati, P. L.; Zanzottera, E.
1986-01-01
Simultaneous temporal and spectral analysis of oil fluorescence is useful for the detection and characterization of oil spills on the sea surface. Nevertheless, the fluorosensor lidars realized to date have only a partial capability to perform this double analysis. The main difficulties are the high resolution required (of the order of 1 nanosecond) and the complexity of the detection system needed to record a two-dimensional matrix of data for each laser pulse. An airborne system was designed and constructed with the following major specifications: time range, 30 to 75 ns; time resolution, 1 ns; spectral range, 350 to 700 nm; and spectral resolution, 10 nm. The designed system, comprising a short-pulse ultraviolet laser source and a streak-camera-based detector, is described.
Prabhakar, Ramachandran
2012-01-01
Source to surface distance (SSD) plays a very important role in external beam radiotherapy treatment verification. In this study, a simple technique has been developed to verify the SSD automatically with lasers. The study also suggests a methodology for determining the respiratory signal with lasers. Two lasers, red and green, are mounted on the collimator head of a Clinac 2300 C/D linac along with a camera to determine the SSD. Software (SSDLas) was developed to estimate the SSD automatically from the images captured by a 12-megapixel camera. To determine the SSD to a patient surface, the external body contour of the central-axis transverse computed tomography (CT) cut is imported into the software. Another important aspect of radiotherapy is the generation of the respiratory signal. The changes in the lasers' separation as the patient breathes are converted into a respiratory signal. Multiple frames of laser images were acquired from the camera mounted on the collimator head, and each frame was analyzed with SSDLas to generate the respiratory signal. The SSD as observed with the optical distance indicator (ODI) on the machine and the SSD measured by SSDLas were found to agree within the tolerance limit. The methodology described for generating respiratory signals will be useful for the treatment of mobile tumors in sites such as the lung, liver, breast, and pancreas. The technique described for determining the SSD and generating respiratory signals using lasers is cost-effective and simple to implement. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Imaging monitoring techniques applications in the transient gratings detection
NASA Astrophysics Data System (ADS)
Zhao, Qing-ming
2009-07-01
Experimental studies of degenerate four-wave mixing (DFWM) in iodine vapor at atmospheric pressure and at 0 °C and 25 °C are reported. The laser-induced grating (LIG) studies are carried out by generating a thermal grating using a pulsed, narrow-bandwidth dye laser. A new image processing system for detecting forward DFWM spectroscopy in iodine vapor is reported. This system is composed of a CCD camera, an image processing card, and the related software. With the help of the detection system, phase matching can easily be achieved in the optical arrangement by crossing the two pumps and the probe as diagonals linking opposite corners of a rectangular box, and it provides a way to position the photomultiplier tube (PMT). It is also practical to assess the effect of pointing stability on the optical path by monitoring how the beam spot changes with laser beam pointing and environmental disturbances. Finally, the effects of the photostability of the dye laser on the signal-to-noise ratio in DFWM using forward geometries have been investigated in iodine vapor. This system makes feasible the potential application of FG-DFWM as a diagnostic tool in combustion research and environmental monitoring.
Novel atmospheric extinction measurement techniques for aerospace laser system applications
NASA Astrophysics Data System (ADS)
Sabatini, Roberto; Richardson, Mark
2013-01-01
Novel techniques for laser beam atmospheric extinction measurements, suitable for manned and unmanned aerospace vehicle applications, are presented in this paper. Extinction measurements are essential to support the engineering development and the operational employment of a variety of aerospace electro-optical sensor systems, allowing calculation of the range performance attainable with such systems in current and likely future applications. Such applications include ranging, weaponry, Earth remote sensing and possible planetary exploration missions performed by satellites and unmanned flight vehicles. Unlike traditional LIDAR methods, the proposed techniques are based on measurements of the laser energy (intensity and spatial distribution) incident on target surfaces of known geometric and reflective characteristics, by means of infrared detectors and/or infrared cameras calibrated for radiance. Various laser sources can be employed with wavelengths from the visible to the far infrared portions of the spectrum, allowing for data correlation and extended sensitivity. Errors affecting measurements performed using the proposed methods are discussed in the paper and algorithms are proposed that allow a direct determination of the atmospheric transmittance and spatial characteristics of the laser spot. These algorithms take into account a variety of linear and non-linear propagation effects. Finally, results are presented relative to some experimental activities performed to validate the proposed techniques. Particularly, data are presented relative to both ground and flight trials performed with laser systems operating in the near infrared (NIR) at λ = 1064 nm and λ = 1550 nm. This includes ground tests performed with 10 Hz and 20 kHz PRF NIR laser systems in a large variety of atmospheric conditions, and flight trials performed with a 10 Hz airborne NIR laser system installed on a TORNADO aircraft, flying up to altitudes of 22,000 ft.
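The core retrieval behind the target-based extinction measurements above can be sketched with the Beer-Lambert law for a one-way path: ratioing the spot energies measured on targets at two known ranges cancels the unknown source energy and target reflectance factor. All numbers below are illustrative assumptions, and the lumped reflectance/geometry factor is a deliberate simplification of the paper's error treatment.

```python
import numpy as np

gamma_true = 0.12       # atmospheric extinction coefficient, 1/km (assumed)
rho = 0.45              # lumped target reflectance/geometry factor (assumed known)
E_tx = 100.0            # transmitted pulse energy, mJ (assumed)

R = np.array([1.0, 2.0, 4.0, 6.0])              # target ranges, km
E_rx = E_tx * rho * np.exp(-gamma_true * R)     # energies measured at the targets

# Beer-Lambert: E_rx = E_tx * rho * exp(-gamma * R);
# the two-range ratio cancels the unknown E_tx and rho
gamma_est = np.log(E_rx[0] / E_rx[3]) / (R[3] - R[0])
```

The atmospheric transmittance over any range then follows as tau = exp(-gamma_est * R).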
Photogrammetry and altimetry. Part A: Apollo 16 laser altimeter
NASA Technical Reports Server (NTRS)
Wollenhaupt, W. R.; Sjogren, W. L.
1972-01-01
The laser altimeter measures precise altitudes of the command and service module above the lunar surface and can function either with the metric (mapping) camera or independently. In the camera mode, the laser altimeter ranges at each exposure time, which varies between 20 and 28 sec (i.e., 30 to 43 km on the lunar surface). In the independent mode, the laser altimeter ranges every 20 sec. These altitude data and the spacecraft attitudes that are derived from simultaneous stellar photography are used to constrain the photogrammetric reduction of the lunar surface photographs when cartographic products are generated. In addition, the altimeter measurements alone provide broad-scale topographic relief around the entire circumference of the moon. These data are useful in investigating the selenodetic figure of the moon and may provide information regarding gravitational anomalies on the lunar far side.
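Each altimeter measurement reduces to a pulse time-of-flight conversion, R = c·t/2. A minimal sketch (the 110 km orbital altitude is only an example value, not a measured Apollo figure):

```python
C = 299_792_458.0       # speed of light, m/s

def altitude_from_round_trip(t_s):
    """One-way altitude from the laser pulse round-trip time: R = c * t / 2."""
    return C * t_s / 2.0

# example only: a ~110 km orbital altitude corresponds to a ~0.73 ms round trip
t_round_trip = 2 * 110e3 / C
alt = altitude_from_round_trip(t_round_trip)
```

The 20-second ranging interval combined with the spacecraft's ground-track speed is what produces the 30-43 km surface spacing quoted above.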
NASA Astrophysics Data System (ADS)
Gondal, M. A.; Maganda, Y. W.; Dastageer, M. A.; Al Adel, F. F.; Naqvi, A. A.; Qahtan, T. F.
2014-04-01
The fourth harmonic of a pulsed Nd:YAG laser (wavelength 266 nm), in combination with a high-resolution spectrograph equipped with a gated ICCD camera, was employed to build a highly sensitive analytical system. This detection system is based on laser-induced breakdown spectroscopy and has been tested for the first time for the analysis of semi-fluid samples, to detect the fluoride content present in commercially available toothpaste samples. The experimental parameters were optimized to achieve an optically thin plasma in local thermodynamic equilibrium. This improved the limits of detection of fluoride present in the toothpaste samples. The strong atomic transition line of fluorine at 731.102 nm was used as the marker line to quantify the fluoride concentration levels. Our LIBS system was able to detect fluoride concentration levels in the range of 1300-1750 ppm with a detection limit of 156 ppm.
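A detection limit like the 156 ppm quoted above typically follows the 3σ criterion applied to a calibration curve of marker-line intensity against concentration. The sketch below uses invented intensities and background noise, so it reproduces the procedure rather than the study's number.

```python
import numpy as np

# hypothetical calibration data: fluoride concentration (ppm) vs the
# F I 731.1 nm line intensity (arbitrary units) -- not the study's data
conc      = np.array([1300.0, 1450.0, 1600.0, 1750.0])
intensity = np.array([520.0, 585.0, 640.0, 705.0])

slope, intercept = np.polyfit(conc, intensity, 1)
sigma_bg = 6.4          # background standard deviation (assumed measured)

lod = 3.0 * sigma_bg / slope    # 3-sigma limit of detection, ppm
```

A steeper calibration slope or quieter background directly lowers the detection limit, which is why the abstract stresses optimizing laser energy and gate delay.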
A Combined Laser-Communication and Imager for Microspacecraft (ACLAIM)
NASA Technical Reports Server (NTRS)
Hemmati, H.; Lesh, J.
1998-01-01
ACLAIM is a multi-function instrument consisting of a laser communication terminal and an imaging camera that share a common telescope. A single APS (Active Pixel Sensor) based focal-plane array is used to perform both the acquisition and tracking (for laser communication) and science imaging functions.
NASA Astrophysics Data System (ADS)
Khalil, A. A. I.
2015-12-01
A double-pulse laser ablation (DPLA) technique was developed to generate a gold (Au) ion source and produce high current under an applied electric potential in an ambient argon gas environment. Two Q-switched Nd:YAG lasers operating at 1064 and 266 nm wavelengths are combined in an unconventional orthogonal (crossed-beam) double-pulse configuration at a 45° angle and focused on a gold target, along with a spectrometer for spectral analysis of the gold plasma. The properties of the gold plasma produced under double-pulse laser excitation were studied. The velocity distribution function (VDF) of the emitted plasma was studied using a dedicated Faraday-cup ion probe (FCIP) under argon gas discharge. The experimental parameters were optimized to attain the best signal-to-noise (S/N) ratio. The results showed that the VDF and current signals depend on the applied discharge voltage, laser intensity, laser wavelength, and ambient argon gas pressure. A seven-fold increase in the current signal and ion velocity was observed with increasing applied discharge voltage under the double-pulse laser field. The plasma parameters (electron temperature and density) were also studied, and their dependence on the delay time between the excitation laser pulse and the opening of the camera shutter was investigated as well. This study could provide significant reference data for the optimization and design of DPLA systems used in laser-induced plasma deposition of thin films and in plasma-facing component diagnostics.
Airborne laser systems for atmospheric sounding in the near infrared
NASA Astrophysics Data System (ADS)
Sabatini, Roberto; Richardson, Mark A.; Jia, Huamin; Zammit-Mangion, David
2012-06-01
This paper presents new techniques for atmospheric sounding using Near Infrared (NIR) laser sources, direct detection electro-optics and passive infrared imaging systems. These techniques allow a direct determination of atmospheric extinction and, through the adoption of suitable inversion algorithms, the indirect measurement of some important natural and man-made atmospheric constituents, including Carbon Dioxide (CO2). The proposed techniques are suitable for remote sensing missions performed by using aircraft, satellites, Unmanned Aerial Vehicles (UAV), parachute/gliding vehicles, Roving Surface Vehicles (RSV), or Permanent Surface Installations (PSI). The various techniques proposed offer relative advantages in different scenarios. All are based on measurements of the laser energy/power incident on target surfaces of known geometric and reflective characteristics, by means of infrared detectors and/or infrared cameras calibrated for radiance. Experimental results are presented relative to ground and flight trials performed with laser systems operating in the near infrared (NIR) at λ = 1064 nm and λ = 1550 nm. This includes ground tests performed with 10 Hz and 20 kHz PRF NIR laser systems in a variety of atmospheric conditions, and flight trials performed with a 10 Hz airborne NIR laser system installed on a TORNADO aircraft, flying up to altitudes of 22,000 ft above ground level. Future activities are planned to validate the atmospheric retrieval algorithms developed for CO2 column density measurements, with emphasis on aircraft related emissions at airports and other high air-traffic density environments.
Automatic Docking System Sensor Design, Test, and Mission Performance
NASA Technical Reports Server (NTRS)
Jackson, John L.; Howard, Richard T.; Cole, Helen J.
1998-01-01
The Video Guidance Sensor is a key element of an automatic rendezvous and docking program administered by NASA that was flown on STS-87 in November of 1997. The system used laser illumination of a passive target in the field of view of an on-board camera and processed the video image to determine the relative position and attitude between the target and the sensor. Comparisons of mission results with theoretical models and laboratory measurements will be discussed.
Implementation of a Multi-Robot Coverage Algorithm on a Two-Dimensional, Grid-Based Environment
2017-06-01
two planar laser range finders with a 180-degree field of view, color camera, vision beacons, and wireless communicator. In their system, the robots... This Master's thesis implements a path planning coverage algorithm for a multi-robot system in a two-dimensional, grid-based environment. We assess the applicability of a topology...
3D shape measurement with thermal pattern projection
NASA Astrophysics Data System (ADS)
Brahm, Anika; Reetz, Edgar; Schindwolf, Simon; Correns, Martin; Kühmstedt, Peter; Notni, Gunther
2016-12-01
Structured light projection techniques are well-established optical methods for contactless and nondestructive three-dimensional (3D) measurements. Most systems operate in the visible wavelength range (VIS) due to commercially available projection and detection technology. For example, the 3D reconstruction can be done with a stereo-vision setup by finding corresponding pixels in both cameras followed by triangulation. Problems occur if the properties of the object materials disturb the measurements, which rely on diffuse light reflections. For example, some materials in the VIS range are too transparent, translucent, absorbent, or reflective and cannot be recorded properly. To overcome these challenges, we present an alternative thermal approach that operates in the infrared (IR) region of the electromagnetic spectrum. For this purpose, we used two cooled mid-wave infrared (MWIR) cameras (3-5 μm) to detect emitted heat patterns, which were introduced by a CO2 laser. We present a thermal 3D system based on a GOBO (GOes Before Optics) wheel projection unit and first 3D analyses for different system parameters and samples. We also show a second alternative approach based on an incoherent (heat) source, to overcome typical disadvantages of high-power laser-based systems, such as industrial health and safety considerations as well as high investment costs. Thus, materials like glass or fiber-reinforced composites can be measured contactlessly and without the need for additional surface painting.
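Once corresponding pixels have been found in the two thermal cameras, depth follows from standard rectified-stereo triangulation, z = f·b/d. The focal length, baseline, and disparity below are arbitrary example values, not parameters of the system described above.

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Rectified stereo pair: depth z = f * b / d (pinhole camera model)."""
    return focal_px * baseline_m / disparity_px

# example values (assumed): 1200 px focal length, 0.25 m baseline, 30 px disparity
z = stereo_depth(1200.0, 0.25, 30.0)   # -> 10.0 m
```

The projected heat pattern exists only to make the correspondence search reliable on otherwise featureless or VIS-transparent surfaces; the triangulation itself is unchanged.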
Industrial applications of shearography for inspections of aircraft components
NASA Astrophysics Data System (ADS)
Krupka, Rene; Waltz, T.; Ettemeyer, Andreas
2003-05-01
Shearography has been validated as a fast and reliable inspection technique for aerospace components. Following a several-year evaluation phase, shearography has now entered industrial production inspection. The applications range from serial inspection in the production line to field inspection in assembly and to applications in the maintenance and repair area. In all applications, the main advantages of shearography, namely very fast, full-field inspection and high sensitivity even on very complex composite materials, have led to the decision for laser shearography as the inspection tool. In this paper, we present examples of recent industrial shearography inspection systems in the field of aerospace. One of the first industrial installations of laser shearography in Europe was a fully automatic inspection system for helicopter rotor blades. Complete rotor blades are inspected within 10 minutes for delaminations and debondings in the composite structure. For more complex components, robotic manipulation of the shearography camera has proven to be the optimum solution. An industrial 6-axis robot gives the utmost flexibility to position the camera at any angle and distance. Automatic defect marking systems have also been introduced to indicate the exact position of a defect directly on the inspected component. Other applications cover the inspection of abradable seals in jet engines and portable shearography inspection systems for maintenance and repair inspection in the field.
Continuous-wave terahertz digital holography by use of a pyroelectric array camera.
Ding, Sheng-Hui; Li, Qi; Li, Yun-Da; Wang, Qi
2011-06-01
Terahertz (THz) digital holography is realized based on a 2.52 THz far-IR gas laser and a commercial 124 × 124 pyroelectric array camera. Off-axis THz holograms are obtained by recording interference patterns between light passing through the sample and the reference wave. A numerical reconstruction process is performed to obtain the field distribution at the object surface. Different targets were imaged to test the system's imaging capability. Compared with THz focal-plane images, the quality of the reconstructed images is substantially improved. The results show that the system's imaging resolution can reach at least 0.4 mm. The system also has potential for real-time imaging applications. This study confirms that digital holography is a promising technique for real-time, high-resolution THz imaging, which has extensive application prospects. © 2011 Optical Society of America
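The numerical reconstruction step can be sketched with the angular spectrum method, one common way to propagate a recorded hologram field back to the object plane. The wavelength matches the 2.52 THz source (~119 μm), while the pixel pitch, toy aperture, and propagation distance are assumed example values; the paper's actual reconstruction pipeline (including off-axis term filtering) is not reproduced here.

```python
import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a sampled complex field a distance z (angular spectrum method)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2          # squared z-spatial-frequency
    # propagating components get a pure phase; evanescent ones are suppressed
    H = np.where(arg > 0, np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0))), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

lam, pitch = 119e-6, 85e-6          # 2.52 THz wavelength; pitch is an assumed value
u0 = np.zeros((124, 124), complex)  # 124 x 124 matches the pyroelectric array size
u0[60:64, 60:64] = 1.0              # toy aperture standing in for the hologram field
u1 = angular_spectrum(u0, lam, pitch, 5e-3)
```

Because the transfer function is a pure phase for propagating components, the method conserves the field energy over the propagation distance.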
New continuous recording procedure of holographic information on transient phenomena
NASA Astrophysics Data System (ADS)
Nagayama, Kunihito; Nishihara, H. Keith; Murakami, Terutoshi
1992-09-01
A new method for continuous recording of holographic information, 'streak holography,' is proposed. This kind of record can be useful for velocity and acceleration measurement as well as for observing a moving object whose trajectory cannot be predicted in advance. A very high speed camera system has been designed and constructed for streak holography. A ring-shaped, 100-mm-diameter film has been cut out from high-resolution sheet film and mounted on a thin duralumin disk, which is driven to rotate directly by an air-turbine spindle. The attainable streak velocity is 0.3 mm/μs. The direct film drive mechanism makes it possible to use a relay lens system of extremely small f-number. The feasibility of the camera system has been demonstrated by observing several transient events, such as the forced oscillation of a wire and the free fall of small glass particles, using an argon-ion laser as a light source.
STS-53 Discovery, OV-103, DOD Hercules digital electronic imagery equipment
1992-04-22
STS-53 Discovery, Orbiter Vehicle (OV) 103, Department of Defense (DOD) mission Hand-held Earth-oriented Real-time Cooperative, User-friendly, Location, targeting, and Environmental System (HERCULES) spaceborne experiment equipment is documented in this table top view. HERCULES is a joint Navy-NASA-Army payload designed to provide real-time high-resolution digital electronic imagery and geolocation (latitude and longitude determination) of earth surface targets of interest. The HERCULES system consists of (from left to right): a specially modified GRiD Systems portable computer mounted atop the NASA-developed Playback-Downlink Unit (PDU) and the Naval Research Laboratory (NRL) developed HERCULES Attitude Processor (HAP); the NASA-developed Electronic Still Camera (ESC) Electronics Box (ESCEB), including removable imagery data storage disks and various connecting cables; and the ESC (a NASA-modified Nikon F-4 camera) mounted atop the NRL HERCULES Inertial Measurement Unit (HIMU), containing the three-axis ring-laser gyro.
Infrared imaging spectrometry by the use of bundled chalcogenide glass fibers and a PtSi CCD camera
NASA Astrophysics Data System (ADS)
Saito, Mitsunori; Kikuchi, Katsuhiro; Tanaka, Chinari; Sone, Hiroshi; Morimoto, Shozo; Yamashita, Toshiharu T.; Nishii, Junji
1999-10-01
A coherent fiber bundle for infrared image transmission was prepared by arranging 8400 chalcogenide (AsS) glass fibers. The fiber bundle, 1 m in length, is transmissive in the infrared spectral region of 1-6 micrometers. A remote spectroscopic imaging system was constructed with the fiber bundle and an infrared PtSi CCD camera. The system was used for real-time observation (frame time: 1/60 s) of gas distribution. Infrared light from a SiC heater was delivered to a gas cell through a chalcogenide fiber, and the transmitted light was observed through the fiber bundle. A band-pass filter was used for the selection of gas species. A He-Ne laser at a wavelength of 3.4 micrometers was also used for the observation of hydrocarbon gases. Gases bursting from a nozzle were observed successfully with the remote imaging system.
NASA Technical Reports Server (NTRS)
Krabach, Timothy
1998-01-01
Some of the many new and advanced exploration technologies which will enable space missions in the 21st century, and specifically the Manned Mars Mission, are explored in this presentation. Among these are the system-on-a-chip, the Computed-Tomography Imaging Spectrometer, the digital camera on a chip, and other Micro Electro Mechanical Systems (MEMS) technology for space. Some of these MEMS are the silicon micromachined microgyroscope, a subliming-solid micro-thruster, a micro-ion thruster, a silicon seismometer, a dewpoint microhygrometer, a micro laser Doppler anemometer, and tunable diode laser (TDL) sensors. This advanced technology insertion is critical for NASA to decrease mass, volume, power and mission costs, and to increase functionality, science potential and robustness.
Endoscopic laser range scanner for minimally invasive, image guided kidney surgery
NASA Astrophysics Data System (ADS)
Friets, Eric; Bieszczad, Jerry; Kynor, David; Norris, James; Davis, Brynmor; Allen, Lindsay; Chambers, Robert; Wolf, Jacob; Glisson, Courtenay; Herrell, S. Duke; Galloway, Robert L.
2013-03-01
Image guided surgery (IGS) has led to significant advances in surgical procedures and outcomes. Endoscopic IGS is hindered, however, by the lack of suitable intraoperative scanning technology for registration with preoperative tomographic image data. This paper describes implementation of an endoscopic laser range scanner (eLRS) system for accurate, intraoperative mapping of the kidney surface, registration of the measured kidney surface with preoperative tomographic images, and interactive image-based surgical guidance for subsurface lesion targeting. The eLRS comprises a standard stereo endoscope coupled to a steerable laser, which scans a laser fan beam across the kidney surface, and a high-speed color camera, which records the laser-illuminated pixel locations on the kidney. Through calibrated triangulation, a dense set of 3-D surface coordinates is determined. At maximum resolution, the eLRS acquires over 300,000 surface points in less than 15 seconds. Lower-resolution scans of 27,500 points are acquired in one second. Measurement accuracy of the eLRS, determined through scanning of reference planar and spherical phantoms, is estimated to be 0.38 +/- 0.27 mm at a range of 2 to 6 cm. Registration of the scanned kidney surface with preoperative image data is achieved using a modified iterative closest point algorithm. Surgical guidance is provided through graphical overlay of the boundaries of subsurface lesions, vasculature, ducts, and other renal structures labeled in the CT or MR images, onto the eLRS camera image. Depth to these subsurface targets is also displayed. Proof of clinical feasibility has been established in an explanted perfused porcine kidney experiment.
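The calibrated triangulation at the heart of a fan-beam scanner reduces, in idealized form, to intersecting each laser-illuminated camera ray with the known laser fan plane. A minimal sketch of that geometry (not the authors' implementation; the pinhole camera at the origin and the plane parameters are illustrative assumptions):

```python
import numpy as np

def triangulate_point(ray_dir, plane_normal, plane_d):
    """Intersect a camera ray through the origin with the laser fan plane
    n·x = d, recovering a 3-D surface point."""
    ray_dir = np.asarray(ray_dir, float)
    plane_normal = np.asarray(plane_normal, float)
    # Solve n·(t*ray_dir) = d for the ray parameter t.
    t = plane_d / np.dot(plane_normal, ray_dir)
    return t * ray_dir
```

Sweeping the fan plane across the surface and repeating this intersection for every laser-lit pixel yields the dense point set described above.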
Adaptive optics at the Subaru telescope: current capabilities and development
NASA Astrophysics Data System (ADS)
Guyon, Olivier; Hayano, Yutaka; Tamura, Motohide; Kudo, Tomoyuki; Oya, Shin; Minowa, Yosuke; Lai, Olivier; Jovanovic, Nemanja; Takato, Naruhisa; Kasdin, Jeremy; Groff, Tyler; Hayashi, Masahiko; Arimoto, Nobuo; Takami, Hideki; Bradley, Colin; Sugai, Hajime; Perrin, Guy; Tuthill, Peter; Mazin, Ben
2014-08-01
Current AO observations rely heavily on the AO188 instrument, a 188-element system that can operate in natural or laser guide star (LGS) mode, and delivers diffraction-limited images in the near-IR. In its LGS mode, laser light is transported from the solid-state laser to the launch telescope by a single-mode fiber. AO188 can feed several instruments: the infrared camera and spectrograph (IRCS), a high contrast imaging instrument (HiCIAO) or an optical integral field spectrograph (Kyoto-3DII). Adaptive optics development in support of exoplanet observations has been and continues to be very active. The Subaru Coronagraphic Extreme-AO (SCExAO) system, which combines extreme-AO correction with advanced coronagraphy, is in the commissioning phase, and will greatly increase Subaru Telescope's ability to image and study exoplanets. SCExAO currently feeds light to HiCIAO, and will soon be combined with the CHARIS integral field spectrograph and the fast-frame MKIDs exoplanet camera, which have both been specifically designed for high contrast imaging. SCExAO also feeds two visible-light single pupil interferometers: VAMPIRES and FIRST. In parallel to these direct imaging activities, a near-IR high precision spectrograph (IRD) is under development for observing exoplanets with the radial velocity technique. Wide-field adaptive optics techniques are also being pursued. The RAVEN multi-object adaptive optics instrument was installed on Subaru telescope in early 2014. Subaru Telescope is also planning wide-field imaging with ground-layer AO with the ULTIMATE-Subaru project.
NASA Astrophysics Data System (ADS)
Zhou, Renjie; Jin, Di; Yaqoob, Zahid; So, Peter T. C.
2017-02-01
Due to the large number of available mirrors, their patterning speed, low cost, and compactness, digital micromirror devices (DMDs) have been extensively used in biomedical imaging systems. Recently, DMDs have been brought to the quantitative phase microscopy (QPM) field to achieve synthetic-aperture imaging and tomographic imaging. Last year, our group demonstrated using a DMD for QPM, where the phase retrieval is based on a recently developed Fourier ptychography algorithm. In our previous system, the illumination angle was varied through coding the aperture plane of the illumination system, which makes inefficient use of the laser power. In our new DMD-based QPM system, we use Lee holograms, conjugated to the sample plane, to change the illumination angles with much higher power efficiency. Multiple-angle illumination can also be achieved with this method. With this versatile system, we can achieve FPM-based high-resolution phase imaging with 250 nm lateral resolution by the Rayleigh criterion. Due to the use of a powerful laser, the imaging speed is limited only by the camera acquisition speed. With a fast camera, we expect to achieve close to 100 fps phase imaging, a speed that has not been achieved in current FPM imaging systems. By adding a reference beam, we also expect to achieve synthetic-aperture imaging while directly measuring the phase of the sample fields. This would reduce the phase-retrieval processing time to allow for real-time imaging applications in the future.
Laser-Beam-Alignment Controller
NASA Technical Reports Server (NTRS)
Krasowski, M. J.; Dickens, D. E.
1995-01-01
In the laser-beam-alignment controller, images from a video camera are compared to reference patterns by a fuzzy-logic pattern comparator. The results are processed by a fuzzy-logic microcontroller, which sends control signals to a motor driver that adjusts the lens and pinhole in a spatial filter.
ChemCam Mast Unit Being Prepared for Laser Firing
2010-12-23
Researchers prepare for a test of the Chemistry and Camera (ChemCam) instrument that will fly on NASA's Mars Science Laboratory mission; researchers are preparing the instrument's mast unit for a laser firing test.
Six-degrees-of-freedom sensing based on pictures taken by single camera.
Zhongke, Li; Yong, Wang; Yongyuan, Qin; Peijun, Lu
2005-02-01
Two six-degrees-of-freedom sensing methods are presented. In the first method, three laser beams are employed to set up a Cartesian frame on a rigid body, and a screen is adopted to form diffuse spots. In the second method, two superimposed grid screens and two laser beams are used. A CCD camera is used to take photographs in both methods. Both approaches provide a simple and error-free method to record the positions and attitudes of a rigid body in motion continuously.
Sambot II: A self-assembly modular swarm robot
NASA Astrophysics Data System (ADS)
Zhang, Yuchao; Wei, Hongxing; Yang, Bo; Jiang, Cancan
2018-04-01
Sambot II, a new generation of the self-assembling modular swarm robot based on the original Sambot and adopting a laser and camera module for information collection, is introduced in this manuscript. The visual control algorithm of Sambot II is detailed, and its feasibility is verified by laser and camera experiments. At the end of the manuscript, autonomous docking experiments with two Sambot II robots are presented. The experimental results are shown and analyzed to verify the feasibility of the whole Sambot II scheme.
Composite x-ray pinholes for time-resolved microphotography of laser compressed targets.
Attwood, D T; Weinstein, B W; Wuerker, R F
1977-05-01
Composite x-ray pinholes having dichroic properties are presented. These pinholes permit both x-ray imaging and visible alignment with micron accuracy by presenting different apparent apertures in these widely disparate regions of the spectrum. Their use is mandatory in certain applications in which the x-ray detection consists of a limited number of resolvable elements whose use one wishes to maximize. Mating the pinhole camera with an x-ray streaking camera is described, along with experiments which spatially and temporally resolve the implosion of laser irradiated targets.
Computerized lateral-shear interferometer
NASA Astrophysics Data System (ADS)
Hasegan, Sorin A.; Jianu, Angela; Vlad, Valentin I.
1998-07-01
A lateral-shear interferometer, coupled with a computer for laser wavefront analysis, is described. A CCD camera is used to transfer the fringe images through a frame-grabber into a PC. 3D phase maps are obtained by fringe pattern processing using a new algorithm for direct spatial reconstruction of the optical phase. The program describes phase maps by Zernike polynomials yielding an analytical description of the wavefront aberration. A compact lateral-shear interferometer has been built using a laser diode as light source, a CCD camera and a rechargeable battery supply, which allows measurements in-situ, if necessary.
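Describing a reconstructed phase map by Zernike polynomials, as the interferometer program does, is a linear least-squares problem. A minimal sketch restricted to piston, tilt, and defocus terms (the program's actual polynomial set and normalization are not specified in the abstract, so these are assumptions):

```python
import numpy as np

def fit_zernike(phase):
    """Least-squares fit of low-order Zernike terms (piston, x-tilt, y-tilt,
    defocus) to a phase map sampled on a square grid over the unit disk."""
    n = phase.shape[0]
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    r2 = x**2 + y**2
    mask = r2 <= 1.0                       # keep only points inside the pupil
    basis = np.stack([np.ones_like(x), x, y, 2.0*r2 - 1.0], axis=-1)
    coeffs, *_ = np.linalg.lstsq(basis[mask], phase[mask], rcond=None)
    return coeffs
```

The returned coefficients give an analytical description of the wavefront aberration; higher-order Zernike terms would simply be added as further basis columns.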
NASA Astrophysics Data System (ADS)
Sabatini, Roberto; Richardson, Mark
2013-03-01
Novel techniques for laser beam atmospheric extinction measurements, suitable for several air and space platform applications, are presented in this paper. Extinction measurements are essential to support the engineering development and the operational employment of a variety of aerospace electro-optical sensor systems, allowing calculation of the range performance attainable with such systems in current and likely future applications. Such applications include ranging, weaponry, Earth remote sensing and possible planetary exploration missions performed by satellites and unmanned flight vehicles. Unlike traditional LIDAR methods, the proposed techniques are based on measurements of the laser energy (intensity and spatial distribution) incident on target surfaces of known geometric and reflective characteristics, by means of infrared detectors and/or infrared cameras calibrated for radiance. Various laser sources can be employed with wavelengths from the visible to the far infrared portions of the spectrum, allowing for data correlation and extended sensitivity. Errors affecting measurements performed using the proposed methods are discussed in the paper, and algorithms are proposed that allow a direct determination of the atmospheric transmittance and spatial characteristics of the laser spot. These algorithms take into account a variety of linear and non-linear propagation effects. Finally, results are presented from experimental activities performed to validate the proposed techniques. In particular, data are presented from both ground and flight trials performed with laser systems operating in the near infrared (NIR) at λ = 1064 nm and λ = 1550 nm. This includes ground tests performed with 10 Hz and 20 kHz PRF NIR laser systems in a large variety of atmospheric conditions, and flight trials performed with a 10 Hz airborne NIR laser system installed on a TORNADO aircraft, flying at altitudes up to 22,000 ft.
An Algorithm Enabling Blind Users to Find and Read Barcodes
Tekin, Ender; Coughlan, James M.
2010-01-01
Most camera-based systems for finding and reading barcodes are designed to be used by sighted users (e.g. the Red Laser iPhone app), and assume the user carefully centers the barcode in the image before the barcode is read. Blind individuals could benefit greatly from such systems to identify packaged goods (such as canned goods in a supermarket), but unfortunately in their current form these systems are completely inaccessible because of their reliance on visual feedback from the user. To remedy this problem, we propose a computer vision algorithm that processes several frames of video per second to detect barcodes from a distance of several inches; the algorithm issues directional information with audio feedback (e.g. “left,” “right”) and thereby guides a blind user holding a webcam or other portable camera to locate and home in on a barcode. Once the barcode is detected at sufficiently close range, a barcode reading algorithm previously developed by the authors scans and reads aloud the barcode and the corresponding product information. We demonstrate encouraging experimental results of our proposed system implemented on a desktop computer with a webcam held by a blindfolded user; ultimately the system will be ported to a camera phone for use by visually impaired users. PMID:20617114
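The directional audio feedback loop can be sketched as a mapping from the detected barcode centroid to a spoken cue. The thresholds and cue names below are illustrative assumptions, not the authors' algorithm:

```python
def guidance_cue(cx, cy, width, height, tol=0.15):
    """Map a detected barcode centroid (cx, cy) in a width x height frame
    to a directional cue for the user; `tol` is the centered dead zone."""
    dx = cx / width - 0.5    # horizontal offset from frame center, in [-0.5, 0.5]
    dy = cy / height - 0.5   # vertical offset from frame center
    if abs(dx) > tol:
        return "right" if dx > 0 else "left"
    if abs(dy) > tol:
        return "down" if dy > 0 else "up"
    return "hold"            # barcode roughly centered; begin decoding
```

Cues would be spoken via text-to-speech on each processed frame until the barcode is centered and close enough to read.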
A practical indoor context-aware surveillance system with multi-Kinect sensors
NASA Astrophysics Data System (ADS)
Jia, Lili; You, Ying; Li, Tiezhu; Zhang, Shun
2014-11-01
In this paper we develop a novel practical application which gives scalable services to end users when abnormal activities are happening. The architecture of the application consists of networked infrared cameras and a communication module. In this intelligent surveillance system we use Kinect sensors as the input cameras; the Kinect is an infrared laser camera whose raw infrared sensor stream is accessible to the user. We install several Kinect sensors in one room to track human skeletons. Each sensor returns body positions as 15 coordinates in its own coordinate system, and we use calibration algorithms to map all body position points into one unified coordinate system. From the body position points, we can infer the surveillance context. Furthermore, messages from the metadata index matrix are sent to a mobile phone through the communication module, so the user is instantly aware of an abnormal case in the room without having to check the website. In conclusion, theoretical analysis and experimental results in this paper show that the proposed system is reasonable and efficient, and that the application method introduced here can not only discourage criminals and assist police in apprehending suspects, but also enable end users to monitor indoor environments anywhere and anytime from their phones.
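Calibrating several Kinect coordinate systems into one unified frame amounts to estimating a rigid transform from corresponding skeleton points seen by two sensors. A hedged sketch using the Kabsch algorithm (the paper does not name its calibration algorithm, so this is one standard choice):

```python
import numpy as np

def rigid_transform(src, dst):
    """Kabsch algorithm: find rotation R and translation t minimizing
    sum ||R @ src_i + t - dst_i||^2 over corresponding point pairs."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Applying the recovered (R, t) to every skeleton point maps each sensor's measurements into the chosen reference Kinect's frame.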
Bore-sight calibration of the profile laser scanner using a large size exterior calibration field
NASA Astrophysics Data System (ADS)
Koska, Bronislav; Křemen, Tomáš; Štroner, Martin
2014-10-01
The bore-sight calibration procedure and results of a profile laser scanner using a large-size exterior calibration field are presented in the paper. The task is part of the Autonomous Mapping Airship (AMA) project, which aims to create a surveying system with specific properties suitable for effective surveying of medium-wide areas (units to tens of square kilometers per day). As is obvious from the project name, an airship is used as a carrier. This vehicle has some specific properties. The most important are high carrying capacity (15 kg), long flight time (3 hours), high operating safety and special flight characteristics such as stability of flight, in terms of vibrations, and the possibility to fly at low speed. The high carrying capacity enables the use of high-quality sensors such as the professional infrared (IR) camera FLIR SC645, a high-end visible-spectrum (VIS) digital camera and optics, the tactical-grade INS/GPS sensor iMAR iTraceRT-F200, and the profile laser scanner SICK LD-LRS1000. The calibration method is based on direct laboratory measurement of the coordinate offset (lever-arm) and in-flight determination of the rotation offsets (bore-sights). The bore-sight determination is based on minimizing the squares of individual point distances from measured planar surfaces.
Use of a Fluorometric Imaging Plate Reader in high-throughput screening
NASA Astrophysics Data System (ADS)
Groebe, Duncan R.; Gopalakrishnan, Sujatha; Hahn, Holly; Warrior, Usha; Traphagen, Linda; Burns, David J.
1999-04-01
High-throughput screening (HTS) efforts at Abbott Laboratories have been greatly facilitated by the use of a Fluorometric Imaging Plate Reader. The FLIPR consists of an incubated cabinet with integrated 96-channel pipettor and fluorometer. An argon laser is used to excite fluorophores in a 96-well microtiter plate, and the emitted fluorescence is imaged by a cooled CCD camera. The image data are downloaded from the camera and processed to average the signal from each well of the microtiter plate for each time point. The data are presented in real time on the computer screen, facilitating interpretation and troubleshooting. In addition to fluorescence, the camera can also detect luminescence from firefly luciferase.
1990-03-23
defined (personal communication between R. Pozos and Simon, 1985). In summary, there have been studies dealing with shivering which indicate that the...microcomputer (IBM PS/2, Model 30/286). The Firearms Training System combines features of several technologies, notably: interactive video-disc/computer ...technology and laser designator/camera/computer/target-hit generation, which provides for immediate visual performance feedback. The subject is
Thrust Measurements in Ballistic Pendulum Ablative Laser Propulsion Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brazolin, H.; Rodrigues, N. A. S.; Minucci, M. A. S.
This paper describes a setup for thrust measurement in ablative laser propulsion experiments, based on a simple ballistic pendulum associated with an imaging system, which is being assembled at IEAv. A light aluminium pendulum holding samples is placed inside a 100-liter vacuum chamber with two optical windows: the first (in ZnSe) for the laser beam and the second (in fused quartz) for pendulum visualization. A TEA-CO2 laser beam is focused onto the samples, producing ablation and transferring linear momentum to the pendulum as a whole. A CCD video camera captures the oscillatory movement of the pendulum, and its trajectory is obtained by image processing. By fitting the trajectory of the pendulum to a damped sinusoidal curve, it is possible to obtain the amplitude of the movement, which is directly related to the momentum transferred to the sample.
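The trajectory-fitting step can be sketched as a nonlinear least-squares fit of a damped sinusoid to the tracked pendulum position; the model form and starting guesses below are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy.optimize import curve_fit

def damped_sine(t, amp, gamma, omega, phi):
    """Damped sinusoid model for the pendulum angle/position trace."""
    return amp * np.exp(-gamma * t) * np.sin(omega * t + phi)

def fit_amplitude(t, x, p0):
    """Fit the pendulum trace and return the initial amplitude, which is
    directly related to the momentum transferred to the sample."""
    popt, _ = curve_fit(damped_sine, t, x, p0=p0)
    return abs(popt[0])
```

With the amplitude in hand, the transferred momentum follows from the pendulum's known mass and geometry.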
NASA Astrophysics Data System (ADS)
Chan, Kenneth H.; Tom, Henry; Darling, Cynthia L.; Fried, Daniel
2015-02-01
Previous studies have established that caries lesions can be imaged with high contrast, without interference from stains, at near-IR wavelengths greater than 1300 nm. It has been demonstrated that computer-controlled laser scanning systems utilizing IR lasers operating at high pulse repetition rates can be used for serial imaging and selective removal of caries lesions. In this study, we report our progress towards the development of algorithms for generating rasterized ablation maps from near-IR reflectance images for the removal of natural lesions from tooth occlusal surfaces. An InGaAs camera and a filtered tungsten-halogen lamp producing near-IR light in the range of 1500-1700 nm were used to collect cross-polarization reflectance images of tooth occlusal surfaces. A CO2 laser operating at a wavelength of 9.3 μm with a pulse duration of 10-15 μs was used for image-guided ablation.
Combined Infrared Stereo and Laser Ranging Cloud Measurements from Shuttle Mission STS-85
NASA Technical Reports Server (NTRS)
Lancaster, R. S.; Spinhirne, J. D.; Manizade, K. F.
2004-01-01
Multiangle remote sensing provides a wealth of information for earth and climate monitoring, such as the ability to measure the height of cloud tops through stereoscopic imaging. As technology advances so do the options for developing spacecraft instrumentation versatile enough to meet the demands associated with multiangle measurements. One such instrument is the infrared spectral imaging radiometer, which flew as part of mission STS-85 of the space shuttle in 1997 and was the first earth-observing radiometer to incorporate an uncooled microbolometer array detector as its image sensor. Specifically, a method for computing cloud-top height with a precision of +/- 620 m from the multispectral stereo measurements acquired during this flight has been developed, and the results are compared with coincident direct laser ranging measurements from the shuttle laser altimeter. Mission STS-85 was the first space flight to combine laser ranging and thermal IR camera systems for cloud remote sensing.
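In the idealized fore/aft viewing geometry, stereoscopic cloud-top height retrieval reduces to dividing the observed along-track parallax by the sum of the view-angle tangents. A toy sketch of that relation (the actual ISIR processing chain is more involved; the angles and units here are assumptions):

```python
import math

def cloud_top_height(parallax_m, fore_angle_deg, aft_angle_deg):
    """Height of a cloud above the surface from the along-track parallax
    (metres on the ground) between fore and aft views at given off-nadir angles."""
    return parallax_m / (math.tan(math.radians(fore_angle_deg)) +
                         math.tan(math.radians(aft_angle_deg)))
```

For example, a 2 km ground parallax between symmetric 45° fore and aft views corresponds to a cloud top at about 1 km.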
ARGOS wavefront sensing: from detection to correction
NASA Astrophysics Data System (ADS)
Orban de Xivry, Gilles; Bonaglia, M.; Borelli, J.; Busoni, L.; Connot, C.; Esposito, S.; Gaessler, W.; Kulas, M.; Mazzoni, T.; Puglisi, A.; Rabien, S.; Storm, J.; Ziegleder, J.
2014-08-01
Argos is the ground-layer adaptive optics system for the Large Binocular Telescope. In order to perform its wide-field correction, Argos uses three laser guide stars which sample the atmospheric turbulence. To perform the correction, Argos has at its disposal three different wavefront sensing measurements: its three laser guide stars, an NGS tip-tilt sensor, and a third wavefront sensor. We present the wavefront sensing architecture and its individual components, in particular: the finalized Argos pnCCD camera, which detects the three laser guide stars at 1 kHz with high quantum efficiency and 4 e- read noise; the Argos tip-tilt sensor, based on quad-cell avalanche photodiodes; and the Argos wavefront computer. Being in the middle of commissioning, we present the first wavefront sensing configurations and operations performed at LBT, and discuss further improvements in the measurement of the three laser guide star slopes as detected by the pnCCD.
[INVITED] Evaluation of process observation features for laser metal welding
NASA Astrophysics Data System (ADS)
Tenner, Felix; Klämpfl, Florian; Nagulin, Konstantin Yu.; Schmidt, Michael
2016-06-01
In the present study we show how fast the fluid dynamics change when the laser power is changed for different feed rates during laser metal welding. Using two high-speed cameras and a data acquisition system, we determine how fast the process must be imaged to measure the fluid dynamics with very high certainty. Our experiments show that not all process features measurable during laser welding represent the process behavior equally well. Despite the good visibility of the vapor plume, monitoring its movement is less suitable as an input signal for a closed-loop control. The features measured inside the keyhole show a good correlation with changes in process parameters. Due to its low noise, the area of the keyhole opening is well suited as an input signal for a closed-loop control of the process.
NASA Astrophysics Data System (ADS)
Bechis, K.; Pitruzzello, A.
2014-09-01
This presentation describes our ongoing research into using a ground-based light field camera to obtain passive, single-aperture 3D imagery of LEO objects. Light field cameras are an emerging and rapidly evolving technology for passive 3D imaging with a single optical sensor. The cameras use an array of lenslets placed in front of the camera focal plane, which provides angle of arrival information for light rays originating from across the target, allowing range to target and 3D image to be obtained from a single image using monocular optics. The technology, which has been commercially available for less than four years, has the potential to replace dual-sensor systems such as stereo cameras, dual radar-optical systems, and optical-LIDAR fused systems, thus reducing size, weight, cost, and complexity. We have developed a prototype system for passive ranging and 3D imaging using a commercial light field camera and custom light field image processing algorithms. Our light field camera system has been demonstrated for ground-target surveillance and threat detection applications, and this paper presents results of our research thus far into applying this technology to the 3D imaging of LEO objects. The prototype 3D imaging camera system developed by Northrop Grumman uses a Raytrix R5 C2GigE light field camera connected to a Windows computer with an nVidia graphics processing unit (GPU). The system has a frame rate of 30 Hz, and a software control interface allows for automated camera triggering and light field image acquisition to disk. Custom image processing software then performs the following steps: (1) image refocusing, (2) change detection, (3) range finding, and (4) 3D reconstruction. In Step (1), a series of 2D images are generated from each light field image; the 2D images can be refocused at up to 100 different depths. Currently, steps (1) through (3) are automated, while step (4) requires some user interaction. 
A key requirement for light field camera operation is that the target must be within the near field (i.e., closer than the Fraunhofer distance) of the collecting optics. For example, in visible light the near field of a 1-m telescope extends out to about 3,500 km, while the near field of the AEOS telescope extends out over 46,000 km. For our initial proof of concept, we have integrated our light field camera with a 14-inch Meade LX600 advanced coma-free telescope, to image various surrogate ground targets at up to tens of kilometers range. Our experiments with the 14-inch telescope have assessed factors and requirements that are traceable and scalable to a larger-aperture system that would have the near-field distance needed to obtain 3D images of LEO objects. The next step would be to integrate a light field camera with a 1-m or larger telescope and evaluate its 3D imaging capability against LEO objects. 3D imaging of LEO space objects with light field camera technology can potentially provide a valuable new tool for space situational awareness, especially for those situations where laser or radar illumination of the target objects is not feasible.
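The near-field figures quoted above follow from the standard Fraunhofer distance 2D²/λ. A one-line check approximately reproduces them (a 550 nm visible wavelength is assumed):

```python
def fraunhofer_distance(aperture_m, wavelength_m):
    """Near-field extent 2*D^2/lambda of an aperture: targets closer than
    this remain in the near field, as light field ranging requires."""
    return 2.0 * aperture_m**2 / wavelength_m

# 1-m telescope at 550 nm -> roughly 3,600 km;
# the 3.6-m AEOS aperture -> roughly 47,000 km.
```
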
CMO YAG laser damage test facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hue, J.; Dijon, J.; Lyan, P.
1996-12-31
The CMO YAG laser damage test facility, which is equipped with a 30 Hz laser, is presented in this paper. The main points are described below: (1) The characteristics of the laser beam and the in situ damage detection technique (a scattered-light measurement system) are perfectly suited to work up to the repetition frequency of the laser. They are monitored in real time, and work at three wavelengths: 1064 nm, 532 nm, 355 nm. (2) With this same shutter, it is possible to automatically stop the laser on the pulse which induces the first damage. These automatic capabilities enable the samples to be tested quickly. (3) A Nomarski microscope equipped with a 16-bit CCD camera enables the test sites to be photographed before and after the laser interaction; image processing enables the authors to extract the first damage. (4) Six pulse widths are available (between 3 ns and 13 ns). Therefore, with all these characterization tools, many kinds of laser tests may be considered. These different features are illustrated by experimental results (1-on-1 test or R-on-1 test).
Under-vehicle autonomous inspection through undercarriage signatures
NASA Astrophysics Data System (ADS)
Schoenherr, Edward; Smuda, Bill
2005-05-01
Increased threats to gate security have caused recent need for improved vehicle inspection methods at security checkpoints in various fields of defense and security. A fast, reliable system of under-vehicle inspection that detects possibly harmful or unwanted materials hidden on vehicle undercarriages, and notifies the user of the presence of these materials while allowing the user a safe standoff distance from the inspection site, is desirable. An autonomous under-vehicle inspection system would provide for this. The proposed system would function as follows: a low-clearance tele-operated robotic platform would be equipped with sonar/laser range-finding sensors as well as a video camera. As a vehicle to be inspected enters a checkpoint, the robot would autonomously navigate under the vehicle, using algorithms to detect tire locations for waypoints. During this navigation, data would be collected from the sonar/laser range-finding hardware. This range data would be used to compile an impression of the vehicle undercarriage. Once this impression is complete, the system would compare it to a database of pre-scanned undercarriage impressions. Based on vehicle makes and models, any variance between the undercarriage being inspected and the impression compared against in the database would be marked as potentially threatening. If such variances exist, the robot would navigate to these locations and place the video camera in such a manner that the location in question can be viewed from a standoff position through a TV monitor. At this time, manual control of the robot navigation and camera control can be taken to allow further, more detailed inspection of the area/materials in question. After-market vehicle modifications would provide some difficulty, yet with enough pre-screening of such modifications, the system should still prove accurate.
Also, impression scans taken in the field can be stored and tagged with a vehicle's license plate number, and future inspections of that vehicle can be compared to already screened and cleared impressions of the same vehicle in order to search for variance.
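The variance-marking step described above can be sketched as a comparison of range profiles against a stored template, with contiguous out-of-tolerance runs flagged as locations to inspect. The function name, tolerance, and sample values below are hypothetical illustrations, not taken from the system itself.

```python
# Hypothetical sketch: compare a field-scanned undercarriage range
# impression point-by-point against a database template for the same
# make/model, and flag contiguous runs of out-of-tolerance points.

def flag_variances(scan, template, tol=2.0, min_run=2):
    """Return (start, end) index ranges where the scanned impression
    deviates from the stored template by more than `tol` (cm)."""
    flagged = [abs(s - t) > tol for s, t in zip(scan, template)]
    regions, start = [], None
    for i, f in enumerate(flagged):
        if f and start is None:
            start = i
        elif not f and start is not None:
            if i - start >= min_run:
                regions.append((start, i - 1))
            start = None
    if start is not None and len(flagged) - start >= min_run:
        regions.append((start, len(flagged) - 1))
    return regions

# A box strapped under the chassis shortens the range reading over indices 4-6.
template = [30.0] * 10
scan     = [30.0, 30.1, 29.9, 30.0, 22.0, 21.5, 22.3, 30.0, 30.0, 30.1]
print(flag_variances(scan, template))  # [(4, 6)]
```

A real system would compare 2D impressions registered to the vehicle frame; the 1-D profile here only illustrates the thresholding-and-clustering idea.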
Marks of Laser Exam on Martian Soil
2012-08-30
The Chemistry and Camera (ChemCam) instrument on NASA's Mars rover Curiosity used its laser to examine side-by-side points in a target patch of soil, leaving the marks apparent in this before-and-after comparison.
Laser Hits on Martian Drill Tailings
2013-02-13
A day after NASA's Mars rover Curiosity drilled the first sample-collection hole into a rock on Mars, the rover's Chemistry and Camera (ChemCam) instrument shot laser pulses into the fresh rock powder that the drilling generated.
NASA Astrophysics Data System (ADS)
Özel, Tuğrul; Arısoy, Yiğit M.; Criales, Luis E.
Computational modelling of Laser Powder Bed Fusion (L-PBF) processes such as Selective Laser Melting (SLM) can reveal information that is hard to obtain or unobtainable by in-situ experimental measurements. A 3D thermal field that is not visible to the thermal camera can be obtained by solving the 3D heat transfer problem. Furthermore, microstructural modelling can be used to predict the quality and mechanical properties of the product. In this paper, a nonlinear 3D Finite Element Method based computational code is developed to simulate the SLM process with different process parameters such as laser power and scan velocity. The code is further improved by utilizing an in-situ thermal camera recording to predict spattering, which is in turn included as a stochastic heat loss. Then, thermal gradients extracted from the simulations are applied to predict growth directions in the resulting microstructure.
Marshall, F J; Radha, P B
2014-11-01
A method to simultaneously image both the absorption and the self-emission of an imploding inertial confinement fusion plasma has been demonstrated on the OMEGA Laser System. The technique involves the use of a high-Z backlighter, half of which is covered with a low-Z material, and a high-speed x-ray framing camera aligned to capture images backlit by this masked backlighter. Two strips of the four-strip framing camera record images backlit by the high-Z portion of the backlighter, while the other two strips record images aligned with the low-Z portion of the backlighter. The emission from the low-Z material is effectively eliminated by a high-Z filter positioned in front of the framing camera, limiting the detected backlighter emission to that of the principal emission line of the high-Z material. As a result, half of the images are of self-emission from the plasma and the other half are of self-emission plus the backlighter. The advantage of this technique is that the self-emission simultaneous with the backlighter absorption is independently measured from a nearby direction. The absorption occurs only in the high-Z backlit frames; the self-emission there is either spatially separated from the absorption, or it is suppressed by filtering, by using a backlighter much brighter than the self-emission, or by subtraction. The masked-backlighter technique has been used on the OMEGA Laser System to simultaneously measure the emission profiles and the absorption profiles of polar-driven implosions.
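The subtraction option mentioned above can be illustrated with a toy calculation: an emission-only frame is subtracted from a backlit frame and the difference is normalized by the unattenuated backlighter level. The array values and the assumption of a uniform backlighter intensity `i0` are illustrative only.

```python
# Illustrative sketch: recover a fractional-absorption map from a pair of
# framing-camera frames, one recording self-emission plus backlighter and
# one (from a nearby direction) recording self-emission only.

def absorption(backlit, emission, i0):
    """Fractional absorption per pixel: subtract the emission-only frame
    from the backlit frame, normalize by the backlighter level i0."""
    return [[round(1.0 - (b - e) / i0, 2) for b, e in zip(rb, re)]
            for rb, re in zip(backlit, emission)]

backlit  = [[80.0, 50.0], [30.0, 100.0]]   # self-emission + backlighter
emission = [[10.0, 10.0], [10.0,  10.0]]   # self-emission only
print(absorption(backlit, emission, i0=100.0))
# [[0.3, 0.6], [0.8, 0.1]] -- denser regions absorb more of the backlighter
```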
High performance gel imaging with a commercial single lens reflex camera
NASA Astrophysics Data System (ADS)
Slobodan, J.; Corbett, R.; Wye, N.; Schein, J. E.; Marra, M. A.; Coope, R. J. N.
2011-03-01
A high performance gel imaging system was constructed using a digital single lens reflex camera with epi-illumination to image 19 × 23 cm agarose gels with up to 10,000 DNA bands each. It was found to give equivalent performance to a laser scanner in this high throughput DNA fingerprinting application using the fluorophore SYBR Green®. The specificity and sensitivity of the imager and scanner were within 1% using the same band identification software. Low and high cost color filters were also compared and it was found that with care, good results could be obtained with inexpensive dyed acrylic filters in combination with more costly dielectric interference filters, but that very poor combinations were also possible. Methods for determining resolution, dynamic range, and optical efficiency for imagers are also proposed to facilitate comparison between systems.
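The specificity/sensitivity comparison reported above can be sketched as matching band calls from the imager against the laser scanner taken as reference, with a small position tolerance absorbing registration differences. The band positions, tolerance, and function name below are invented for illustration.

```python
# Hypothetical sketch of comparing two band-identification runs: treating
# the laser scanner's band calls as reference, sensitivity is the fraction
# of reference bands the imager also calls, and extra calls count as
# false positives.

def match_bands(reference, test, tol=2):
    """Match band positions (pixels) within `tol`; return (sensitivity,
    number of unmatched test bands)."""
    matched = sum(1 for r in reference if any(abs(r - t) <= tol for t in test))
    sensitivity = matched / len(reference)
    false_pos = sum(1 for t in test if all(abs(r - t) > tol for r in reference))
    return sensitivity, false_pos

ref  = [100, 150, 220, 310, 400]   # scanner band calls (reference)
test = [101, 149, 221, 309, 512]   # imager band calls
sens, fp = match_bands(ref, test)
print(sens, fp)  # 0.8 1
```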
Mehder, A O; Gondal, Mohammed A; Dastageer, Mohamed A; Habibullah, Yusuf B; Iqbal, Mohammed A; Oloore, Luqman E; Gondal, Bilal
2016-01-01
Laser induced breakdown spectroscopy (LIBS) was applied for the detection of carcinogenic elements such as bromine in four representative brands of loaf bread samples; the measured bromine concentrations were 352, 157, 451, and 311 ppm, using the Br I (827.2 nm) atomic transition line as the fingerprint atomic transition. Our LIBS system is equipped with a pulsed laser of wavelength 266 nm with energy 25 mJ pulse(-1), 8 ns pulse duration, 20 Hz repetition rate, and a gated ICCD camera. The LIBS system was calibrated with standards of known concentrations in the sample (bread) matrix, and the calibration plot is linear in the 20-500 ppm range. The capability of our system in terms of limit of detection and relative accuracy with respect to the standard inductively coupled plasma mass spectrometry (ICPMS) technique was evaluated; these values were 5.09 ppm and 0.01-0.05, respectively, which ensures the applicability of our system for Br trace-level detection, and the LIBS results are in excellent agreement with the ICPMS results.
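The calibration step described above can be sketched as a least-squares line through standards of known concentration, with unknowns read off the inverse fit and the detection limit taken as three times the blank noise over the slope. The intensity values and blank noise figure below are invented so the fit is exact; they are not the paper's data.

```python
# Illustrative sketch of a LIBS calibration curve: fit line intensity vs.
# known Br concentration, invert the fit for unknowns, and estimate the
# limit of detection as 3*sigma_blank/slope.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

conc   = [20, 100, 250, 500]        # ppm standards in the bread matrix
signal = [410, 2010, 5010, 10010]   # background-corrected line intensities
slope, intercept = fit_line(conc, signal)

def to_ppm(counts):
    """Invert the calibration line for an unknown sample."""
    return (counts - intercept) / slope

sigma_blank = 34.0                  # std. dev. of blank measurements (assumed)
lod = 3 * sigma_blank / slope       # limit of detection, ppm
print(round(to_ppm(7030), 1), round(lod, 2))  # 351.0 5.1
```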
Simultaneous three wavelength imaging with a scanning laser ophthalmoscope.
Reinholz, F; Ashman, R A; Eikelboom, R H
1999-11-01
Various imaging properties of scanning laser ophthalmoscopes (SLOs), such as contrast or depth discrimination, are superior to those of the traditional photographic fundus camera. However, most SLOs are monochromatic, whereas photographic systems produce colour images, which inherently contain information over a broad wavelength range. An SLO system has been modified to allow simultaneous three-channel imaging. Laser light sources in the visible and infrared spectrum were concurrently launched into the system. Using different wavelength triads, digital fundus images were acquired at high frame rates. Favourable wavelength combinations were established, and high-contrast, true (red, green, blue) or false (red, green, infrared) colour images of the retina were recorded. The monochromatic frames which form the colour image exhibit improved distinctness of different retinal structures such as the nerve fibre layer, the blood vessels, and the choroid. A multi-channel SLO combines the advantageous imaging properties of a tunable, monochrome SLO with the benefits and convenience of colour ophthalmoscopy. The option to modify parameters such as wavelength, intensity, gain, beam profile, and aperture size independently for every channel gives the system a high degree of versatility. Copyright 1999 Wiley-Liss, Inc.
True color blood flow imaging using a high-speed laser photography system
NASA Astrophysics Data System (ADS)
Liu, Chien-Sheng; Lin, Cheng-Hsien; Sun, Yung-Nien; Ho, Chung-Liang; Hsu, Chung-Chi
2012-10-01
Physiological changes in the retinal vasculature are commonly indicative of such disorders as diabetic retinopathy, glaucoma, and age-related macular degeneration. Thus, various methods have been developed for noninvasive clinical evaluation of ocular hemodynamics. However, to the best of our knowledge, current ophthalmic instruments do not provide a true color blood flow imaging capability. Accordingly, we propose a new method for the true color imaging of blood flow using a high-speed pulsed laser photography system. In the proposed approach, monochromatic images of the blood flow are acquired using a system of three cameras and three color lasers (red, green, and blue). A high-quality true color image of the blood flow is obtained by assembling the monochromatic images by means of image realignment and color calibration processes. The effectiveness of the proposed approach is demonstrated by imaging the flow of mouse blood within a microfluidic channel device. The experimental results confirm that the proposed system provides a high-quality true color blood flow imaging capability, and it therefore has potential for noninvasive clinical evaluation of ocular hemodynamics.
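The assembly step above (realigning three monochromatic frames and stacking them into one color image) can be sketched as follows; the per-channel pixel offsets and tiny images are hypothetical stand-ins for the calibrated alignment the paper describes.

```python
# Minimal sketch: realign each camera's monochromatic frame by its
# calibrated pixel offset, then stack red/green/blue planes into one
# color image.

def shift(img, dy, dx, fill=0):
    """Translate a 2D list-of-lists image by (dy, dx), padding with fill."""
    h, w = len(img), len(img[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y - dy, x - dx
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = img[sy][sx]
    return out

def assemble_rgb(red, green, blue, offsets):
    """offsets: per-channel (dy, dx) from the alignment calibration."""
    planes = [shift(p, *o) for p, o in zip((red, green, blue), offsets)]
    h, w = len(red), len(red[0])
    return [[(planes[0][y][x], planes[1][y][x], planes[2][y][x])
             for x in range(w)] for y in range(h)]

r = [[10, 0], [0, 0]]
g = [[0, 20], [0, 0]]          # green camera is offset by one pixel in x
b = [[10, 0], [0, 0]]
rgb = assemble_rgb(r, g, b, offsets=[(0, 0), (0, -1), (0, 0)])
print(rgb[0][0])  # the three realigned channels now coincide: (10, 20, 10)
```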
A contribution to laser range imaging technology
NASA Technical Reports Server (NTRS)
Defigueiredo, Rui J. P.; Denney, Bradley S.
1991-01-01
The goal of the project was to develop a methodology for fusion of Laser Range Imaging Device (LRID) and camera data. Our initial work in the project led to the conclusion that none of the LRIDs that were available were sufficiently adequate for this purpose. Thus we spent the time and effort on the development of a new LRID with several novel features that serve the desired fusion objectives. In what follows, we describe the device developed and built under contract. The Laser Range Imaging Device (LRID) is an instrument which scans a scene using a laser and returns range and reflection intensity data. Such a system would be extremely useful in scene analysis in industry and space applications. The LRID will eventually be implemented on board a mobile robot. The current system has several advantages over some commercially available systems. One improvement is the use of X-Y galvanometer scanning mirrors instead of the polygonal mirrors present in some systems. The advantage of the X-Y scanning mirrors is that the mirror system can be programmed to provide adjustable scanning regions. For each mirror there are two controls accessible by the computer: the first is the mirror position, and the second is a zoom factor which modifies the amplitude of the position signal. Another advantage of the LRID is the use of a visible low-power laser. Some of the commercial systems use a higher-intensity invisible laser, which raises safety concerns. By using a low-power visible laser, not only can one see the beam and avoid direct eye contact, but the lower intensity also reduces the risk of damage to the eye, and no protective eyewear is required.
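The adjustable-scan-region idea above (a per-mirror position setting plus a zoom factor that scales the sweep amplitude) can be sketched as generating drive angles of the form center + zoom * raster(t). The parameter names and values are illustrative, not the device's actual interface.

```python
# Sketch of an adjustable X-Y galvanometer raster: each mirror has a
# computer-settable center (position) and a zoom factor scaling the
# sweep amplitude, so the same sample grid can cover the full field or
# a small region of interest.

def raster_scan(nx, ny, cx=0.0, cy=0.0, zoom_x=1.0, zoom_y=1.0):
    """Return (x_angle, y_angle) drive values for an nx-by-ny raster,
    sweeping x within +/-zoom_x about cx and stepping y likewise."""
    pts = []
    for j in range(ny):
        y = cy + zoom_y * (2.0 * j / (ny - 1) - 1.0)
        for i in range(nx):
            x = cx + zoom_x * (2.0 * i / (nx - 1) - 1.0)
            pts.append((x, y))
    return pts

full = raster_scan(5, 5)                       # full field
roi  = raster_scan(5, 5, cx=0.2, cy=-0.1,      # zoomed-in region of
                   zoom_x=0.25, zoom_y=0.25)   # interest, same sample count
print(full[0], (round(roi[0][0], 2), round(roi[0][1], 2)))
# (-1.0, -1.0) (-0.05, -0.35)
```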
University of Pennsylvania MAGIC 2010 Final Report
2011-01-10
Simultaneous localization and mapping (SLAM) techniques are employed to build a local map of the environment surrounding the robot. Readings from the two complementary LIDAR sensors … The various components shown in Figure 2 comprise the following subsystems: a Sensor UGV (a mobile UGV with LIDAR and camera sensors, GPS, and IMU) and a Disrupter UGV with local navigation sensors (GPS, IMU, LIDAR, and cameras), together with laser control, localization, and task planner/strategy modules.
NASA Astrophysics Data System (ADS)
Masciotti, James M.; Rahim, Shaheed; Grover, Jarrett; Hielscher, Andreas H.
2007-02-01
We present a design for a frequency-domain instrument that allows simultaneous gathering of magnetic resonance and diffuse optical tomographic imaging data. This small animal imaging system combines the high anatomical resolution of magnetic resonance imaging (MRI) with the high temporal resolution and physiological information provided by diffuse optical tomography (DOT). The DOT hardware comprises laser diodes and an intensified CCD camera, which are modulated at up to 1 GHz by radio frequency (RF) signal generators. An optical imaging head is designed to fit inside the 4 cm inner diameter of a 9.4 T MRI system. Graded-index fibers are used to transfer light between the optical hardware and the imaging head within the RF coil. Fiducial markers are integrated into the imaging head to allow determination of the positions of the source and detector fibers on the MR images and to permit co-registration of MR and optical tomographic images. Detector fibers are arranged compactly and focused through a camera lens onto the photocathode of the intensified CCD camera.
Computer simulation and discussion of high-accuracy laser direction finding in real time
NASA Astrophysics Data System (ADS)
Chen, Wenyi; Chen, Yongzhi
1997-12-01
When a CCD is used as the sensor, there are at least five methods that can be used to realize high-accuracy laser direction finding: the image matching method, the radiation center method, the geometric center method, the center of rectangle envelope method, and the center of maximum run length method. The first three achieve the highest accuracy, but they are too complicated to realize in real time and their cost is very high. The other two also achieve high accuracy and are not difficult to realize in real time. Using a single-chip microcomputer and an ordinary CCD camera, a very simple system can obtain the position of a laser beam. The data rate is 50 measurements per second.
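The two real-time-friendly methods named above can be sketched on a thresholded spot image: the center of the bounding (envelope) rectangle of above-threshold pixels, and the center of the longest above-threshold run on any scan line. The image and threshold below are invented for illustration.

```python
# Sketch of the two simpler laser-spot location methods: bounding-
# rectangle center and center of the maximum run length, both on a
# thresholded CCD frame.

def envelope_center(img, thr):
    """Center of the rectangle enclosing all above-threshold pixels."""
    rows = [y for y, row in enumerate(img) if any(v > thr for v in row)]
    cols = [x for x in range(len(img[0])) if any(row[x] > thr for row in img)]
    return ((min(cols) + max(cols)) / 2.0, (min(rows) + max(rows)) / 2.0)

def max_run_center(img, thr):
    """Center of the longest horizontal above-threshold run."""
    best = (0, 0, 0.0)                    # (length, y, x_center)
    for y, row in enumerate(img):
        x = 0
        while x < len(row):
            if row[x] > thr:
                start = x
                while x < len(row) and row[x] > thr:
                    x += 1
                if x - start > best[0]:
                    best = (x - start, y, (start + x - 1) / 2.0)
            else:
                x += 1
    return (best[2], best[1])

spot = [[0, 0, 0, 0, 0],
        [0, 0, 9, 0, 0],
        [0, 8, 9, 8, 0],
        [0, 0, 9, 0, 0],
        [0, 0, 0, 0, 0]]
print(envelope_center(spot, 5), max_run_center(spot, 5))  # (2.0, 2.0) (2.0, 2)
```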
Beam alignment based on two-dimensional power spectral density of a near-field image.
Wang, Shenzhen; Yuan, Qiang; Zeng, Fa; Zhang, Xin; Zhao, Junpu; Li, Kehong; Zhang, Xiaolu; Xue, Qiao; Yang, Ying; Dai, Wanjun; Zhou, Wei; Wang, Yuanchen; Zheng, Kuixing; Su, Jingqin; Hu, Dongxia; Zhu, Qihua
2017-10-30
Beam alignment is crucial to high-power laser facilities and is used to adjust the laser beams quickly and accurately to meet stringent requirements of pointing and centering. In this paper, a novel alignment method is presented, which employs data processing of the two-dimensional power spectral density (2D-PSD) for a near-field image and resolves the beam pointing error relative to the spatial filter pinhole directly. Combining this with a near-field fiducial mark, the operation of beam alignment is achieved. It is experimentally demonstrated that this scheme realizes a far-field alignment precision of approximately 3% of the pinhole size. This scheme adopts only one near-field camera to construct the alignment system, which provides a simple, efficient, and low-cost way to align lasers.
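The quantity the method above builds on, the 2D power spectral density of a near-field image, is |DFT|² of the image. The sketch below uses a tiny direct DFT for clarity (a real system would use an FFT library) and does not attempt the paper's pointing-error extraction; the test image is synthetic.

```python
# Illustrative 2D power spectral density: squared magnitude of the 2D
# discrete Fourier transform, computed directly on a small image.
import cmath

def psd2d(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for u in range(h):
        for v in range(w):
            s = sum(img[y][x] *
                    cmath.exp(-2j * cmath.pi * (u * y / h + v * x / w))
                    for y in range(h) for x in range(w))
            out[u][v] = abs(s) ** 2 / (h * w)
    return out

# A pure horizontal intensity ripple of period 2 puts all off-DC power
# at spatial-frequency index (0, 2) on a 4-wide grid.
img = [[1.0 + 0.5 * ((-1) ** x) for x in range(4)] for _ in range(4)]
p = psd2d(img)
print(round(p[0][0], 3), round(p[0][2], 3))  # 16.0 4.0 (DC and ripple power)
```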
Vision-based weld pool boundary extraction and width measurement during keyhole fiber laser welding
NASA Astrophysics Data System (ADS)
Luo, Masiyang; Shin, Yung C.
2015-01-01
In keyhole fiber laser welding processes, the weld pool behavior is essential to determining welding quality. To better observe and control the welding process, accurate extraction of the weld pool boundary as well as its width is required. This work presents a weld pool edge detection technique based on an off-axis green illumination laser and a coaxial image capturing system that consists of a CMOS camera and optical filters. To accommodate differences in image quality, a complete edge detection algorithm is proposed based on searching for the local maximum greyness gradient together with linear interpolation. The extracted weld pool geometry and width are validated against actual weld width measurements and predictions by a numerical multi-phase model.
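The gradient-search idea above can be sketched on a single scan line: the sample with the maximum greyness gradient marks the boundary, refined to sub-pixel accuracy by interpolating around the peak. The paper pairs the search with linear interpolation; the sketch below uses a three-point parabolic refinement as a stand-in sub-pixel step, on a synthetic intensity profile.

```python
# Sketch of locating a weld-pool edge along one scan line: find the
# steepest greyness transition, then refine with a 3-point parabolic
# fit around the gradient peak.

def edge_subpixel(profile):
    """Sub-pixel location of the steepest transition in a 1-D profile."""
    grad = [profile[i + 1] - profile[i] for i in range(len(profile) - 1)]
    g = [abs(v) for v in grad]
    k = max(range(1, len(g) - 1), key=g.__getitem__)
    denom = g[k - 1] - 2 * g[k] + g[k + 1]
    frac = 0.0 if denom == 0 else 0.5 * (g[k - 1] - g[k + 1]) / denom
    return k + 0.5 + frac          # gradient sample k sits between pixels

profile = [200, 198, 180, 90, 30, 28, 27]   # bright pool -> dark plate
print(round(edge_subpixel(profile), 2))  # 2.71
```

Running the same search from both ends of a scan line across the pool gives two edges whose separation is the width measurement.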
Laser Communication Experiments with Artemis Satellite
NASA Astrophysics Data System (ADS)
Kuzkov, Sergii; Sodnik, Zoran; Kuzkov, Volodymyr
2013-10-01
In November 2001, the European Space Agency (ESA) established the world's first inter-satellite laser communication link between the geostationary ARTEMIS satellite and the low Earth orbiting (LEO) SPOT-4 Earth observation satellite, demonstrating data rates of 50 Mbps. In 2006, the Japanese Space Agency launched the KIRARI (OICETS) LEO satellite with a compatible laser communication terminal, and bidirectional laser communication links (50 Mbps and 2 Mbps) were successfully realized between KIRARI and ARTEMIS. ESA is now developing the European Data Relay Satellite (EDRS) system, which will use laser communication technology to transmit data from the Sentinel 1 and 2 satellites in LEO to two geostationary satellites (EDRS-A and EDRS-C) at data rates of 1.8 Gbps. As the data handling capabilities of state-of-the-art telecommunication satellites in GEO increase, so does the demand for feeder-link bandwidth to be transmitted from the ground. This is why there is increasing interest in developing high-bandwidth ground-to-space laser communication systems working through the atmosphere. In 2002, the Main Astronomical Observatory (MAO) started the development of its own laser communication system for its 0.7 m AZT-2 telescope, located in Kyiv, Ukraine. The work was supported by the National Space Agency of Ukraine and by ESA. MAO developed a highly accurate computerized tracking system for the AZT-2 telescope and a compact laser communication package called LACES (Laser Atmosphere and Communication Experiments with Satellites). The LACES instrument includes a camera for the pointing and tracking subsystems, a receiver module, a laser transmitter module, a tip/tilt atmospheric turbulence compensation subsystem, a bit error rate tester module, and other optical and electronic components. The principal subsystems are mounted on a platform located at the Cassegrain focus of the AZT-2 telescope.
All systems were tested with the laser communication payload on-board ARTEMIS, and the data analysis was supported by telemetry received from the ARTEMIS payload control centre in Redu (Belgium). Special attention was focused on the investigation of the impact of atmospheric turbulence on laser beam propagation, especially in cloudy conditions. A description of our telescope and ground-based laser system, as well as the experimental results, will be presented.
Plasma turbulence imaging using high-power laser Thomson scattering
NASA Astrophysics Data System (ADS)
Zweben, S. J.; Caird, J.; Davis, W.; Johnson, D. W.; Le Blanc, B. P.
2001-01-01
The two-dimensional (2D) structure of plasma density turbulence in a magnetically confined plasma can potentially be measured using a Thomson scattering system made from components of the Nova laser of Lawrence Livermore National Laboratory. For a plasma such as the National Spherical Torus Experiment at the Princeton Plasma Physics Laboratory, the laser would form an ≈10-cm-wide plane sheet beam passing vertically through the chamber across the magnetic field. The scattered light would be imaged by a charge coupled device camera viewing along the direction of the magnetic field. The laser energy required to make 2D images of density turbulence is in the range 1-3 kJ, which can potentially be obtained from a set of frequency-doubled Nd:glass amplifiers with diameters in the range of 208-315 mm. A laser pulse width of ⩽100 ns would be short enough to capture the highest frequency components of the expected density fluctuations.
Multivariate image analysis of laser-induced photothermal imaging used for detection of caries tooth
NASA Astrophysics Data System (ADS)
El-Sherif, Ashraf F.; Abdel Aziz, Wessam M.; El-Sharkawy, Yasser H.
2010-08-01
Time-resolved photothermal imaging has been investigated to characterize teeth for the purpose of discriminating between normal and carious areas of the hard tissue using a thermal camera. Ultrasonic thermoelastic waves were generated in hard tissue by the absorption of fiber-coupled Q-switched Nd:YAG laser pulses operating at 1064 nm, in conjunction with a laser-induced photothermal technique used to detect the thermal radiation waves for diagnosis of the human tooth. The concepts behind the use of photothermal techniques for off-line detection of carious teeth were presented by our group in earlier work. This paper illustrates the application of multivariate image analysis (MIA) techniques to detect the presence of caries. MIA is used to rapidly detect the presence and quantity of common caries features as the teeth are scanned by high-resolution color (RGB) thermal cameras. Multivariate principal component analysis is used to decompose the acquired three-channel tooth images into a two-dimensional principal component (PC) space. Masking score point clusters in the score space and highlighting the corresponding pixels in the image space of the two dominant PCs enables isolation of caries defect pixels based on contrast and color information. The technique provides a qualitative result that can be used for early-stage caries detection. The proposed technique can potentially be used on-line or in real time to prescreen for caries through vision-based systems such as a real-time thermal camera. Experimental results on a large number of extracted teeth, as well as a thermal image panorama of the teeth of a human volunteer, are investigated and presented.
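The MIA step above (treating each pixel's three channel values as a vector, projecting into the space of the two dominant principal components, and isolating outlying score clusters) can be sketched on synthetic data. The pixel values, cluster sizes, and score threshold below are invented; power iteration with deflation stands in for whatever PCA routine the authors used.

```python
# Illustrative sketch: PCA on (R, G, B) pixel vectors via power iteration,
# then scoring pixels along the first PC to separate an outlying
# (caries-like) cluster from the main (sound-enamel-like) cluster.
import random

def dominant_pc(cov, iters=200):
    """Leading eigenvector/eigenvalue of a 3x3 covariance, by power iteration."""
    v = [1.0, 0.0, 0.0]
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        n = sum(x * x for x in w) ** 0.5
        v = [x / n for x in w]
    lam = sum(v[i] * sum(cov[i][j] * v[j] for j in range(3)) for i in range(3))
    return v, lam

def pca2(pixels):
    """Mean and the two dominant PCs (second found after deflation)."""
    n = len(pixels)
    mean = [sum(p[i] for p in pixels) / n for i in range(3)]
    c = [[sum((p[i] - mean[i]) * (p[j] - mean[j]) for p in pixels) / n
          for j in range(3)] for i in range(3)]
    pcs = []
    for _ in range(2):
        v, lam = dominant_pc(c)
        pcs.append(v)
        c = [[c[i][j] - lam * v[i] * v[j] for j in range(3)] for i in range(3)]
    return mean, pcs

random.seed(1)
sound  = [(120 + random.gauss(0, 3), 100 + random.gauss(0, 3),
           90 + random.gauss(0, 3)) for _ in range(200)]
caries = [(200 + random.gauss(0, 3), 110 + random.gauss(0, 3),
           80 + random.gauss(0, 3)) for _ in range(20)]
mean, (pc1, pc2) = pca2(sound + caries)

def score(p):                      # coordinate along the first PC
    return sum((p[i] - mean[i]) * pc1[i] for i in range(3))

# The caries-like pixels separate cleanly from sound enamel along PC1.
print(all(abs(score(p)) > 30 for p in caries))  # True
```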
20 kHz toluene planar laser-induced fluorescence imaging of a jet in nearly sonic crossflow
NASA Astrophysics Data System (ADS)
Miller, V. A.; Troutman, V. A.; Mungal, M. G.; Hanson, R. K.
2014-10-01
This manuscript describes continuous, high-repetition-rate (20 kHz) toluene planar laser-induced fluorescence (PLIF) imaging in an expansion tube impulse flow facility. Cinematographic image sequences are acquired that visualize an underexpanded jet of hydrogen in Mach 0.9 crossflow, a practical flow configuration relevant to aerospace propulsion systems. The freestream gas is nitrogen seeded with toluene; toluene broadly absorbs and fluoresces in the ultraviolet, and the relatively high quantum yield of toluene produces large signals and high signal-to-noise ratios. Toluene is excited using a commercially available, frequency-quadrupled (266 nm), high-repetition-rate (20 kHz), pulsed (0.8-0.9 mJ per pulse), diode-pumped solid-state Nd:YAG laser, and fluorescence is imaged with a high-repetition-rate intensifier and CMOS camera. The resulting PLIF movie and image sequences are presented, visualizing the jet start-up process and the dynamics of the jet in crossflow; the freestream duration and a measure of freestream momentum flux steadiness are also inferred. This work demonstrates progress toward continuous PLIF imaging of practical flow systems in impulse facilities at kHz acquisition rates using practical, turn-key, high-speed laser and imaging systems.
Active retroreflector to measure the rotational orientation in conjunction with a laser tracker
NASA Astrophysics Data System (ADS)
Hofherr, O.; Wachten, C.; Müller, C.; Reinecke, H.
2012-10-01
High precision optical non-contact position measurement is a key technology in modern engineering. Laser trackers (LT) can determine accurately x-y-z coordinates of passive retroreflectors. Next-generation systems answer the additional need to measure an object's rotational orientation (pitch, yaw, roll). These devices are based on photogrammetry or on enhanced retroreflectors. However, photogrammetry relies on camera systems and time-consuming image processing. Enhanced retroreflectors analyze the LT's beam but are restricted in roll angle measurements. Here we present an integrated laser based method to evaluate all six degrees of freedom. An active retroreflector directly analyzes its orientation to the LT's beam path by outcoupling laser light on detectors. A proof of concept prototype has been designed with a specified measuring range of 360° for roll angle measurements and +/-15° for pitch and yaw angle respectively. The prototype's optical design is inspired by a cat's eye retroreflector. First results are promising and further improvements are under development. We anticipate our method to facilitate simple and cost-effective six degrees of freedom measurements. Furthermore, for industrial applications wide customizations are possible, e.g. adaptation of measuring range, optimization of accuracy, and further system miniaturization.
Recent advancements in system design for miniaturized MEMS-based laser projectors
NASA Astrophysics Data System (ADS)
Scholles, M.; Frommhagen, K.; Gerwig, Ch.; Knobbe, J.; Lakner, H.; Schlebusch, D.; Schwarzenberg, M.; Vogel, U.
2008-02-01
Laser projection systems that use the flying-spot principle and are based on a single MEMS micro scanning mirror are a very promising way to build ultra-compact projectors that may fit into mobile devices. First demonstrators showing the feasibility of this approach and the applicability of the micro scanning mirror developed by Fraunhofer IPMS for these systems have already been presented. However, a number of items still have to be resolved before miniaturized laser projectors are ready for the market. This contribution describes progress on several different items, each of major importance for laser projection systems. First of all, the overall performance of the system has been increased from VGA resolution to SVGA (800×600 pixels), with easy connection to a PC via a DVI interface or use of the projector as an embedded system with a direct camera interface. Secondly, the degree of integration of the electronics has been enhanced by the design of an application-specific analog front-end IC for the micro scanning mirror. It has been fabricated in a special high-voltage technology and not only generates driving signals for the scanning mirror with amplitudes of up to 200 V but also integrates position detection of the mirror by several methods. Thirdly, first results concerning speckle reduction have been achieved, which is necessary for the generation of high-quality images. Other aspects include laser modulation and solutions for projection onto tilted screens, which is possible because of the unlimited depth of focus.
NASA Astrophysics Data System (ADS)
Khalifa, Aly A.; Aly, Hussein A.; El-Sherif, Ashraf F.
2016-02-01
Near infrared (NIR) dynamic scene projection systems are used to perform hardware-in-the-loop (HWIL) testing of a unit under test operating in the NIR band. A common and complex requirement for a class of these units is a dynamic scene that is spatio-temporally variant. In this paper we apply and investigate active external modulation of NIR laser light over different ranges of temporal frequencies. We use digital micromirror devices (DMDs) integrated as the core of a NIR projection system to generate these dynamic scenes. We deploy the spatial pattern to the DMD controller to simultaneously yield the required amplitude, by pulse width modulation (PWM) of the mirror elements, as well as the spatio-temporal pattern. Desired modulation and coding of highly stable, high-power visible (red laser at 640 nm) and NIR (diode laser at 976 nm) sources using combinations of different DMD-based optical masks were achieved. These versatile spatial active coding strategies, for both low and high frequencies in the kHz range, for irradiance of different targets were generated by our system and recorded using VIS-NIR fast cameras. The temporally modulated laser pulse traces were measured using an array of fast-response photodetectors. Finally, using a high-resolution spectrometer, we evaluated the NIR dynamic scene projection system response in terms of preserving the wavelength and band spread of the NIR source after projection.
3D imaging and wavefront sensing with a plenoptic objective
NASA Astrophysics Data System (ADS)
Rodríguez-Ramos, J. M.; Lüke, J. P.; López, R.; Marichal-Hernández, J. G.; Montilla, I.; Trujillo-Sevilla, J.; Femenía, B.; Puga, M.; López, M.; Fernández-Valdivia, J. J.; Rosa, F.; Dominguez-Conde, C.; Sanluis, J. C.; Rodríguez-Ramos, L. F.
2011-06-01
Plenoptic cameras have been developed over the last years as a passive method for 3D scanning. Several superresolution algorithms have been proposed in order to mitigate the resolution decrease associated with lightfield acquisition through a microlens array. A number of multiview stereo algorithms have also been applied in order to extract depth information from plenoptic frames. Real-time systems have been implemented using specialized hardware such as Graphical Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). In this paper, we present our own implementations of the aforementioned aspects, but also two new developments: a portable plenoptic objective that transforms any conventional 2D camera into a 3D CAFADIS plenoptic camera, and the novel use of a plenoptic camera as a wavefront phase sensor for adaptive optics (AO). The terrestrial atmosphere degrades telescope images due to the refractive index changes associated with turbulence. These changes require high-speed processing that justifies the use of GPUs and FPGAs. Sodium artificial Laser Guide Stars (Na-LGS, at 90 km altitude) must be used to obtain the reference wavefront phase and the Optical Transfer Function of the system, but they are affected by defocus because of their finite distance to the telescope. Using the telescope as a plenoptic camera allows us to correct the defocus and to recover the wavefront phase tomographically. These advances significantly increase the versatility of the plenoptic camera and provide a new contribution relating the wave optics and computer vision fields, as many authors have advocated.
Control and automation of the Pegasus multi-point Thomson scattering system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bodner, G. M., E-mail: gbodner@wisc.edu; Bongard, M. W.; Fonck, R. J.
A new control system for the Pegasus Thomson scattering diagnostic has recently been deployed to automate the laser operation, data collection process, and interface with the system-wide Pegasus control code. Automation has been extended to areas outside of data collection, such as manipulation of beamline cameras and remotely controlled turning mirror actuators to enable intra-shot beam alignment. Additionally, the system has been upgraded with a set of fast (∼1 ms) mechanical shutters to mitigate contamination from background light. Modification and automation of the Thomson system have improved both data quality and diagnostic reliability.
Free-space laser communication system with rapid acquisition based on astronomical telescopes.
Wang, Jianmin; Lv, Junyi; Zhao, Guang; Wang, Gang
2015-08-10
The general structure of a free-space optical (FSO) communication system based on astronomical telescopes is proposed. The light path for astronomical observation and for communication can be easily switched. A separate camera is used as a star sensor to determine the pointing direction of the optical terminal's antenna. The new system exhibits rapid acquisition and is widely applicable across various astronomical telescope systems and wavelengths. We present a detailed analysis of the acquisition time, which can be decreased by one order of magnitude compared with traditional optical communication systems. Furthermore, we verify the software algorithms and tracking accuracy.
NASA Astrophysics Data System (ADS)
Lefcourt, Alan M.; Kistler, Ross; Gadsden, S. Andrew
2016-05-01
The goal of this project was to construct a cart and a mounting system that would allow a hyperspectral laser-induced fluorescence imaging system (HLIFIS) to be used to detect fecal material in produce fields. Fecally contaminated produce is a recognized food safety risk. Previous research demonstrated that the HLIFIS could detect fecal contamination in a laboratory setting. A cart was designed, built, and then tested to demonstrate that it was capable of moving at constant speeds or at precise intervals. A mounting system was designed and built to facilitate the critical alignment of the camera's imaging and the laser's illumination fields, and to allow the HLIFIS to be used in both field and laboratory settings without changing alignments. A hardened mount for the Powell lens that produces the appropriate illumination profile was also designed, built, and tested.
Three-dimensional digitizer for the footwear industry
NASA Astrophysics Data System (ADS)
Gonzalez, Francisco; Campoy, Pascual; Aracil, Rafael; Penafiel, Francisco; Sebastian, Jose M.
1993-12-01
This paper presents a system developed for digitizing 3D objects in the footwear industry (e.g. moulds, soles, heels) and introducing them into a CAD system for further manipulation and the production of rapid prototypes. The system acquires a sequence of images of a laser line projected onto the 3D object as it moves in front of the laser beam and the camera. The projected beam illuminates a 3D curve on the surface of the object, whose image is processed to obtain the 3D coordinates of every point on that curve according to a previous calibration of the system. The coordinates of the points on all the curves are analyzed and combined to build a 3D wire-frame model of the object, which is introduced into a CAD station for further design and connection to the rapid-prototyping machinery.
NASA Astrophysics Data System (ADS)
Micheletti, Natan; Chandler, Jim; Lane, Stuart
2013-04-01
Whilst high-resolution topographic and terrain data are essential in many geoscience applications, their acquisition has traditionally required either specific expertise (e.g. applications of photogrammetry) or expensive equipment (e.g. ground-based laser altimetric systems). Recent work in geomorphology (e.g. James and Robson, 2012; Carbonneau et al., 2012) has demonstrated the potential of Structure-from-Motion photogrammetry as a low-cost, low-expertise alternative for Digital Elevation Model (DEM) generation. These methods have geomorphological appeal because the more sophisticated image-matching approaches remove many of the geometrical constraints associated with image acquisition: traditionally, vertical and "normal" image pairs acquired with a metric camera. This increases both the number of potential applications and the efficacy of image acquisition in the field. It also allows for genuine 3D representation of the terrain surface (where the same (x,y) can have multiple z values) rather than 2.5D (where each (x,y) must have a unique z value). In this paper, we progress this technology further by testing what can be acquired using hand-held smartphone technology, where the acquired images can be uploaded in the field to Open Source technology freely available to the research community. This is achieved by evaluating the quality of DEMs generated with a fully automated, open-source Structure-from-Motion package and a smartphone (Apple iPhone 4) integrated camera (5 megapixels), using terrestrial laser scanning (TLS) data as a benchmark. To allow a more objective assessment, it is necessary to compare both device and package with traditional approaches. Accordingly, we compare the error in the smartphone DEMs with the errors associated with data derived using a 16.2 megapixel digital camera and processed using the more traditional, commercial, close-range and semi-automated software PhotoModeler.
Results demonstrate that centimeter-precision DEMs can be achieved at close range using a smartphone camera and a fully automated package, here illustrated for a river bank survey. Results improve to sub-centimeter precision with either higher-resolution images or by applying specific post-processing techniques to the smartphone DEMs. Extension to the survey of an entire Alpine alluvial fan system shows that the degradation of precision scales linearly with image scale, but that the quality maintains a good level of precision and is affected, as with laser scanning systems, by the difficulty of separating vegetation from sediment cover.
Laser guide star pointing camera for ESO LGS Facilities
NASA Astrophysics Data System (ADS)
Bonaccini Calia, D.; Centrone, M.; Pedichini, F.; Ricciardi, A.; Cerruto, A.; Ambrosino, F.
2014-08-01
Every observatory using LGS-AO routinely experiences the long time needed to bring and acquire the laser guide star in the wavefront sensor field of view. This is mostly due to the difficulty of creating LGS pointing models, because of the opto-mechanical flexures and hysteresis in the launch and receiver telescope structures. The launch telescopes normally sit on the mechanical structure of the larger receiver telescope. The LGS acquisition time is even longer in the case of multiple LGS systems. In this framework, optimizing the absolute pointing accuracy of LGS systems is relevant to boost the time efficiency of both science and technical observations. In this paper we show the rationale, the design, and the feasibility tests of an LGS Pointing Camera (LPC), conceived for the VLT Adaptive Optics Facility 4LGSF project. The LPC would assist in pointing the four LGS while the VLT performs its initial active-optics cycles on a natural star target after a preset. The LPC minimizes the accuracy required of LGS pointing-model calibrations while reaching sub-arcsecond LGS absolute pointing accuracy. This considerably reduces the LGS acquisition time and observation operation overheads. The LPC is a smart CCD camera, fed by a 150 mm diameter aperture Maksutov telescope mounted on the top ring of the VLT UT4, running Linux and acting as a server for the 4LGSF client. The smart camera can recognize the sky field within a few seconds using astrometric software, determining the absolute positions of the stars and the LGS. Upon request it returns the offsets to apply to the LGS to position them at the required sky coordinates. As a by-product, once calibrated, the LPC can calculate on request, for each LGS, its return flux, its FWHM, and the uplink beam scattering levels.
Yi, Shengzhen; Zhang, Zhe; Huang, Qiushi; Zhang, Zhong; Mu, Baozhong; Wang, Zhanshan; Fang, Zhiheng; Wang, Wei; Fu, Sizu
2016-10-01
Because grazing-incidence Kirkpatrick-Baez (KB) microscopes have better resolution and collection efficiency than pinhole cameras, they have been widely used for x-ray imaging diagnostics of laser inertial confinement fusion. The assembly and adjustment of a multichannel KB microscope must meet stringent requirements for image resolution and reproducible alignment. In the present study, an eight-channel KB microscope was developed for diagnostics by imaging self-emission x-rays with a framing camera at the Shenguang-II Update (SGII-Update) laser facility. A consistent object field of view is ensured in the eight channels using an assembly method based on conical reference cones, which also allow the intervals between the eight images to be tuned to couple with the microstrips of the x-ray framing camera. The eight-channel KB microscope was adjusted via real-time x-ray imaging experiments in the laboratory. This paper describes the details of the eight-channel KB microscope, its optical and multilayer design, the assembly and alignment methods, and results of imaging in the laboratory and at the SGII-Update.
STREAK CAMERA MEASUREMENTS OF THE APS PC GUN DRIVE LASER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dooling, J. C.; Lumpkin, A. H.
We report recent pulse-duration measurements of the APS PC Gun drive laser at both second harmonic and fourth harmonic wavelengths. The drive laser is a Nd:Glass-based chirped pulse amplifier (CPA) operating at an IR wavelength of 1053 nm, twice frequency-doubled to obtain UV output for the gun. A Hamamatsu C5680 streak camera and an M5675 synchroscan unit are used for these measurements; the synchroscan unit is tuned to 119 MHz, the 24th subharmonic of the linac s-band operating frequency. Calibration is accomplished both electronically and optically. Electronic calibration utilizes a programmable delay line in the 119 MHz rf path. The optical delay uses an etalon with known spacing between reflecting surfaces, coated for the visible SH wavelength. IR pulse duration is monitored with an autocorrelator. Fitting the streak camera image projected profiles with Gaussians, UV rms pulse durations are found to vary from 2.1 ps to 3.5 ps as the IR varies from 2.2 ps to 5.2 ps.
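The width-extraction step described above can be sketched as follows. This is an illustrative numpy version that uses intensity-weighted moments to obtain the rms width (the paper fits Gaussians, which gives an equivalent sigma for a clean Gaussian profile); the calibration factor and synthetic profile are assumptions, not measurement values.

```python
import numpy as np

def rms_duration_ps(profile, ps_per_pixel):
    """Background-subtracted rms width of a 1-D streak profile, in picoseconds."""
    p = profile.astype(float) - np.median(profile)  # crude background removal
    p = np.clip(p, 0.0, None)
    x = np.arange(p.size)
    mean = (x * p).sum() / p.sum()
    var = (((x - mean) ** 2) * p).sum() / p.sum()
    return np.sqrt(var) * ps_per_pixel

# Synthetic profile: a Gaussian pulse with sigma = 30 px on a 50-count baseline,
# read out at an assumed sweep calibration of 0.1 ps/pixel -> ~3.0 ps rms.
x = np.arange(512)
profile = 1000.0 * np.exp(-0.5 * ((x - 256.0) / 30.0) ** 2) + 50.0
print(round(rms_duration_ps(profile, 0.1), 2))
```

The streak-camera calibration (ps per pixel) would come from the etalon or programmable-delay procedure the abstract describes.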
NASA Technical Reports Server (NTRS)
Marsh, J. G.; Douglas, B. C.; Walls, D. M.
1974-01-01
Laser and camera data taken during the International Satellite Geodesy Experiment (ISAGEX) were used in dynamical solutions to obtain center-of-mass coordinates for the Astro-Soviet camera sites at Helwan, Egypt, and Oulan Bator, Mongolia, as well as the East European camera sites at Potsdam, German Democratic Republic, and Ondrejov, Czechoslovakia. The results are accurate to about 20m in each coordinate. The orbit of PEOLE (i=15) was also determined from ISAGEX data. Mean Kepler elements suitable for geodynamic investigations are presented.
Usachev uses a laser range finder during rendezvous ops
2001-03-10
STS102-E-5085 (10 March 2001) --- Cosmonaut Yury V. Usachev, STS-102 mission specialist, uses a laser ranging device on Discovery's aft flight deck during rendezvous operations. The photograph was recorded with a digital still camera.
Phoenix Laser Beam in Action on Mars
2008-09-30
The Surface Stereo Imager camera aboard NASA's Phoenix Mars Lander acquired a series of images of the laser beam in the Martian night sky. Bright spots in the beam are reflections from ice crystals in low-level ice fog.
Automatic concrete cracks detection and mapping of terrestrial laser scan data
NASA Astrophysics Data System (ADS)
Rabah, Mostafa; Elhattab, Ahmed; Fayad, Atef
2013-12-01
Terrestrial laser scanning has become one of the standard technologies for object acquisition in surveying engineering. The high spatial resolution of imaging and the excellent capability of measuring 3D space by laser scanning bear great potential when combined for both data acquisition and data compilation. Automatic crack detection from concrete surface images is very effective for nondestructive testing. The crack information can be used to decide the appropriate rehabilitation method to fix cracked structures and prevent catastrophic failure. In practice, cracks on concrete surfaces are traced manually for diagnosis; automatic crack detection is therefore highly desirable for efficient and objective crack assessment. The current paper presents a method for automatic detection and mapping of concrete cracks from data obtained during a laser scanning survey. The method proceeds in three steps: shading correction of the original image, crack detection, and crack mapping. The detected crack is defined in a pixel coordinate system. To remap the crack into the referenced coordinate system, a reverse-engineering approach is used, based on a hybrid concept of terrestrial laser-scanner point clouds and the corresponding camera image, i.e. a conversion from the pixel coordinate system to the terrestrial laser-scanner or global coordinate system. The results of the experiment show that the mean differences between the terrestrial laser scan and the total station are about 30.5, 16.4 and 14.3 mm in the x, y and z directions, respectively.
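The pixel-to-scanner conversion at the heart of the mapping step can be sketched with a simple pinhole back-projection. This assumes the camera intrinsics K, the camera pose (R, t) relative to the scanner frame, and a per-pixel depth interpolated from the co-registered point cloud are all known; the function name and numbers are illustrative, not from the paper.

```python
import numpy as np

def pixel_to_scanner(u, v, depth, K, R, t):
    """Back-project crack pixel (u, v) at a known depth (taken from the
    co-registered laser point cloud) into the scanner coordinate system:
    x_cam = depth * K^-1 [u, v, 1]^T, then x_scanner = R^T (x_cam - t)."""
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))
    x_cam = depth * ray
    return R.T @ (x_cam - t)

# Toy check: identity pose, 1000 px focal length, principal point (640, 480).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 480.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
p = pixel_to_scanner(640.0, 480.0, 5.0, K, R, t)  # principal point, 5 m away
```

A pixel at the principal point maps straight down the optical axis, so here p lies 5 m in front of the camera.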
Combined optical resolution photoacoustic and fluorescence micro-endoscopy
NASA Astrophysics Data System (ADS)
Shao, Peng; Shi, Wei; Hajireza, Parsin; Zemp, Roger J.
2012-02-01
We present a new micro-endoscopy system combining real-time C-scan optical-resolution photoacoustic micro-endoscopy (OR-PAME) and a high-resolution fluorescence micro-endoscopy system for simultaneously visualizing fluorescently labeled cellular components and optically absorbing microvasculature. With a diode-pumped 532 nm fiber laser, the OR-PAME sub-system is capable of imaging with a resolution of ~7 μm. The fluorescence sub-system consists of a diode laser with emission centered at 445 nm as the light source, an objective lens, and a CCD camera. Proflavine, an FDA-approved drug for human use, is used as the fluorescent contrast agent by topical application. The fluorescence system does not require any mechanical scanning. The scanning laser and the diode laser light source share the same light path within an optical fiber bundle containing 30,000 individual single-mode fibers. The absorption of Proflavine at 532 nm is low, which mitigates absorption bleaching of the contrast agent by the photoacoustic excitation source. We demonstrate imaging in live murine models. The system is able to provide cellular morphology at cellular resolution, co-registered with the structural and functional information given by OR-PAME. Therefore, the system has the potential to serve as a virtual biopsy technique, helping researchers and clinicians visualize angiogenesis and the effects of anti-cancer drugs on both cells and the microcirculation, as well as aiding the study of other diseases.
Park, Jae Byung; Lee, Seung Hun; Lee, Il Jae
2009-01-01
In this study, we propose a precise 3D lug pose detection sensor for automatic robot welding of a lug to a huge steel plate used in shipbuilding, where the lug is a handle for carrying the plate. The proposed sensor consists of a camera and four laser line diodes, and its design parameters are determined by analyzing its detectable range and resolution. For lug pose acquisition, four laser lines are projected on both the lug and the plate, and the projected lines are detected by the camera. For robust detection of the projected lines against illumination changes, the vertical-threshold, thinning, Hough transform and separated Hough transform algorithms are successively applied to the camera image. The lug pose acquisition is carried out in two stages: top view alignment and side view alignment. The top view alignment detects the coarse lug pose relatively far from the lug, and the side view alignment detects the fine lug pose close to the lug. After the top view alignment, the robot is controlled to move close to the side of the lug for the side view alignment. In this way, the precise 3D lug pose can be obtained. Finally, experiments with the sensor prototype are carried out to verify the feasibility and effectiveness of the proposed sensor. PMID:22400007
Design and build a compact Raman sensor for identification of chemical composition
NASA Astrophysics Data System (ADS)
Garcia, Christopher S.; Abedin, M. Nurul; Ismail, Syed; Sharma, Shiv K.; Misra, Anupam K.; Sandford, Stephen P.; Elsayed-Ali, Hani
2008-04-01
A compact remote Raman sensor system was developed at NASA Langley Research Center. This sensor is an improvement over the previously reported system, which consisted of a 532 nm pulsed laser, a 4-inch telescope, a spectrograph, and an intensified CCD camera. One of the attractive features of the previous system was its portability, thereby making it suitable for applications such as planetary surface explorations, homeland security and defense applications where a compact portable instrument is important. The new system was made more compact by replacing bulky components with smaller and lighter components. The new compact system uses a smaller spectrograph measuring 9 x 4 x 4 in. and a smaller intensified CCD camera measuring 5 in. long and 2 in. in diameter. The previous system was used to obtain the Raman spectra of several materials that are important to defense and security applications. Furthermore, the new compact Raman sensor system is used to obtain the Raman spectra of a diverse set of materials to demonstrate the sensor system's potential use in the identification of unknown materials.
Design and Build a Compact Raman Sensor for Identification of Chemical Composition
NASA Technical Reports Server (NTRS)
Garcia, Christopher S.; Abedin, M. Nurul; Ismail, Syed; Sharma, Shiv K.; Misra, Anupam K.; Sandford, Stephen P.; Elsayed-Ali, Hani
2008-01-01
A compact remote Raman sensor system was developed at NASA Langley Research Center. This sensor is an improvement over the previously reported system, which consisted of a 532 nm pulsed laser, a 4-inch telescope, a spectrograph, and an intensified charge-coupled devices (CCD) camera. One of the attractive features of the previous system was its portability, thereby making it suitable for applications such as planetary surface explorations, homeland security and defense applications where a compact portable instrument is important. The new system was made more compact by replacing bulky components with smaller and lighter components. The new compact system uses a smaller spectrograph measuring 9 x 4 x 4 in. and a smaller intensified CCD camera measuring 5 in. long and 2 in. in diameter. The previous system was used to obtain the Raman spectra of several materials that are important to defense and security applications. Furthermore, the new compact Raman sensor system is used to obtain the Raman spectra of a diverse set of materials to demonstrate the sensor system's potential use in the identification of unknown materials.
COMPARISON OF RETINAL PATHOLOGY VISUALIZATION IN MULTISPECTRAL SCANNING LASER IMAGING.
Meshi, Amit; Lin, Tiezhu; Dans, Kunny; Chen, Kevin C; Amador, Manuel; Hasenstab, Kyle; Muftuoglu, Ilkay Kilic; Nudleman, Eric; Chao, Daniel; Bartsch, Dirk-Uwe; Freeman, William R
2018-03-16
To compare retinal pathology visualization in multispectral scanning laser ophthalmoscope imaging between the Spectralis and Optos devices. This retrospective cross-sectional study included 42 eyes from 30 patients with age-related macular degeneration (19 eyes), diabetic retinopathy (10 eyes), and epiretinal membrane (13 eyes). All patients underwent retinal imaging with a color fundus camera (broad-spectrum white light), the Spectralis HRA-2 system (3-color monochromatic lasers), and the Optos P200 system (2-color monochromatic lasers). The Optos image was cropped to a size similar to that of the Spectralis image. Seven masked graders marked retinal pathologies in each image within a 5 × 5 grid that included the macula. The average area with detected retinal pathology in all eyes was larger in the Spectralis images than in the Optos images (32.4% larger, P < 0.0001), mainly because of better visualization of epiretinal membrane and retinal hemorrhage. The average detection rate of age-related macular degeneration and diabetic retinopathy pathologies was similar across the three modalities, whereas the epiretinal membrane detection rate was significantly higher in the Spectralis images. Spectralis tricolor multispectral scanning laser ophthalmoscope imaging had a higher rate of pathology detection than Optos bicolor multispectral scanning laser ophthalmoscope imaging, primarily because of better epiretinal membrane and retinal hemorrhage visualization.
Evaluation of a high framerate multi-exposure laser speckle contrast imaging setup
NASA Astrophysics Data System (ADS)
Hultman, Martin; Fredriksson, Ingemar; Strömberg, Tomas; Larsson, Marcus
2018-02-01
We present a first evaluation of a new multi-exposure laser speckle contrast imaging (MELSCI) system for assessing spatial variations in microcirculatory perfusion. The MELSCI system is based on a 1000 frames per second 1-megapixel camera connected to a field-programmable gate array (FPGA) capable of producing MELSCI data in real time. The imaging system is evaluated against a single-point laser Doppler flowmetry (LDF) system during occlusion-release provocations of the arm in five subjects. Perfusion is calculated from MELSCI data using current state-of-the-art inverse models. The analysis displayed good agreement between measured and modeled data, with an average error below 6%. This strongly indicates that the applied model is capable of accurately describing the MELSCI data and that the acquired data are of high quality. Comparing readings from the occlusion-release provocation showed that the MELSCI perfusion was significantly correlated (R = 0.83) with the single-point LDF perfusion, clearly outperforming perfusion estimations based on a single exposure time. We conclude that the MELSCI system provides blood flow images of enhanced quality, taking us one step closer to a system that can accurately monitor dynamic changes in skin perfusion over a large area in real time.
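The speckle-contrast computation underlying each exposure of such a system can be sketched as K = sigma/mu over small spatial windows. This is a generic numpy illustration, not the authors' FPGA implementation or their inverse flow model; window size and test images are assumptions.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(frame, win=7):
    """Spatial speckle contrast K = sigma/mu in win x win neighbourhoods.
    Moving scatterers blur the speckle (low K); static tissue keeps K high."""
    w = sliding_window_view(frame.astype(float), (win, win))
    mu = w.mean(axis=(-1, -2))
    sigma = w.std(axis=(-1, -2))
    return sigma / np.where(mu > 0, mu, 1.0)

# Fully developed static speckle (exponentially distributed intensity) has
# K near 1; a uniform, fully blurred image has K = 0.
rng = np.random.default_rng(0)
static = rng.exponential(1.0, (64, 64))
blurred = np.full((64, 64), 10.0)
print(float(speckle_contrast(static).mean()), float(speckle_contrast(blurred).max()))
```

In a multi-exposure system, K is typically computed for each synthetic exposure time (e.g. by summing consecutive high-rate frames) before fitting a perfusion model to the K-versus-exposure curve.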
NASA Astrophysics Data System (ADS)
Darwiesh, M.; El-Sherif, Ashraf F.; El-Ghandour, Hatem; Aly, Hussein A.; Mokhtar, A. M.
2011-03-01
Optical imaging systems are widely used in applications including tracking for portable scanners; input pointing devices for laptop computers, cell phones, and cameras; fingerprint-identification scanners; optical navigation for target tracking; and optical computer mice. We present experimental work measuring and analyzing the laser speckle pattern (LSP) produced by different optical sources (various color LEDs, a 3 mW diode laser, and a 10 mW He-Ne laser) on different operating surfaces (Gabor hologram diffusers), and how these affect the performance of optical imaging systems in terms of speckle size and signal-to-noise ratio (the signal being the speckle patches that carry information, and the noise being the remainder of the selected image). Theoretical and experimental colorimetry studies are presented for the optical sources used: color correction is applied to the color images captured by the imaging system to produce realistic color images, selecting a gray scale that retains most of the informative data in the image by calculating accurate red-green-blue (RGB) color components from the measured source spectra and the ITU-R 709 color matching functions for CRT phosphors (Sony Trinitron model). These studies establish the relations between the signal-to-noise ratios obtained with different diffusers for each light source. The source-surface coupling is discussed, and we conclude that the performance of the optical imaging system for a given source varies from worst to best depending on the operating surface.
The sensor-surface coupling has been studied and discussed for the case of the He-Ne laser. The speckle size ranges from 4.59 to 4.62 μm, slightly different or approximately the same for all produced diffusers (consistent with the fact that speckle size is independent of the illuminated surface). However, the calculated signal-to-noise ratio takes different values, ranging from 0.71 to 0.92 for the different diffusers. This means that the surface texture affects the performance of the optical sensor, because all images were captured under the same conditions: the same source (He-Ne laser), the same distances in the experimental set-up, and the same sensor (CCD camera).
2004-08-23
KENNEDY SPACE CENTER, FLA. - The Remote Manipulator System (RMS), also known as the Canadian robotic arm, for the orbiter Discovery has arrived at KSC’s Vehicle Assembly Building Lab. Seen on the left end is the shoulder pitch joint. The wrist and shoulder joints on the RMS allow the basic structure of the arm to maneuver much like a human arm. The RMS is used to deploy and retrieve payloads; to provide a mobile extension ladder or foot restraints for crew members during extravehicular activities; and to aid flight crew members in viewing surfaces of the orbiter or payloads through a television camera on the RMS. The arm also serves as the base for the new Orbiter Boom Sensor System (OBSS), one of the safety measures for Return to Flight, equipping the Shuttle with cameras and laser systems to inspect the Shuttle’s Thermal Protection System while in space. Discovery is scheduled for a launch planning window of March 2005 on mission STS-114.
Real-time image processing of TOF range images using a reconfigurable processor system
NASA Astrophysics Data System (ADS)
Hussmann, S.; Knoll, F.; Edeler, T.
2011-07-01
During the last years, Time-of-Flight (TOF) sensors have had a significant impact on research fields in machine vision. In comparison to stereo vision systems and laser range scanners, they combine the advantages of active sensors, providing accurate distance measurements, with those of camera-based systems, recording a 2D matrix at a high frame rate. Moreover, low-cost 3D imaging has the potential to open a wide field of additional applications and solutions in markets like consumer electronics, multimedia, digital photography, robotics and medical technologies. This paper focuses on the 4-phase-shift algorithm currently implemented in this type of sensor. The most time-critical operation of the phase-shift algorithm is the arctangent function. In this paper a novel hardware implementation of the arctangent function using a reconfigurable processor system is presented and benchmarked against the state-of-the-art CORDIC arctangent algorithm. Experimental results show that the proposed algorithm is well suited for real-time processing of the range images of TOF cameras.
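The 4-phase-shift reconstruction that makes the arctangent so performance-critical can be sketched as follows: the pixel phase is an arctangent of the four correlation samples, and distance follows from the modulation frequency. The sample convention (samples taken at 90° steps) and the numbers are illustrative, not the paper's hardware pipeline.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(a0, a1, a2, a3, f_mod):
    """Distance from the four 90-degree-spaced correlation samples of an
    amplitude-modulated TOF pixel: phase = atan2(a3 - a1, a0 - a2),
    distance = c * phase / (4 * pi * f_mod)."""
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)

# A target at 2.5 m with 20 MHz modulation produces phase = 4*pi*f*d/c;
# synthesize the four samples a_k = cos(phase + k*pi/2) and invert.
f = 20e6
phi = 4 * math.pi * f * 2.5 / C
samples = [math.cos(phi + k * math.pi / 2) for k in range(4)]
print(round(tof_distance(*samples, f), 3))  # → 2.5
```

Since every pixel needs one atan2 per frame, replacing it with a fast hardware approximation (as the paper does) directly bounds the achievable frame rate.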
NASA Astrophysics Data System (ADS)
Jantzen, Connie; Slagle, Rick
1997-05-01
The distinction between exposure time and sample rate is often the first point raised in any discussion of high speed imaging. Many high speed events require exposure times considerably shorter than those that can be achieved solely by the sample rate of the camera, where exposure time equals 1/sample rate. Gating, a method of achieving short exposure times in digital cameras, is often difficult to achieve for exposure-time requirements shorter than 100 microseconds. This paper discusses the advantages and limitations of using the short-duration light pulse of a near-infrared laser with high speed digital imaging systems. By closely matching the output wavelength of the pulsed laser to the peak near-infrared response of current sensors, high speed image capture can be accomplished at very low (visible) light levels of illumination. By virtue of the short-duration light pulse, adjustable to as short as two microseconds, image capture of very high speed events can be achieved at relatively low sample rates of less than 100 pictures per second, without image blur. For our initial investigations, we chose a ballistic subject. The results of early experimentation revealed the limitations of applying traditional ballistic imaging methods when using a pulsed infrared light source with a digital imaging system. These early disappointing results clarified the need to further identify the unique system characteristics of the digital imager and pulsed-infrared combination. It was also necessary to investigate how the infrared reflectance and transmittance of common materials affect the imaging process. This experimental work yielded a surprising, successful methodology which will prove useful in imaging ballistic and weapons tests, as well as in forensics, flow visualization, spray pattern analyses, and nocturnal animal behavioral studies.
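The exposure-time argument can be made concrete with a quick calculation of how far the subject moves while the image integrates; the projectile speed here is an illustrative assumption, not a figure from the paper.

```python
def motion_during_exposure(speed_m_s, exposure_s):
    """Distance a subject travels during the effective exposure: this sets
    the blur scale, regardless of the camera's sample rate."""
    return speed_m_s * exposure_s

v = 300.0  # assumed projectile speed, m/s
print(motion_during_exposure(v, 100e-6) * 1e3)  # 100 us gate: ~30 mm of blur
print(motion_during_exposure(v, 2e-6) * 1e3)    # 2 us laser pulse: ~0.6 mm
```

This is why a 2 microsecond laser pulse can freeze a ballistic subject even when the camera itself samples at under 100 pictures per second.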
Excimer-laser-induced shock wave and its dependence on atmospheric environment
NASA Astrophysics Data System (ADS)
Krueger, Ronald R.; Krasinski, Jerzy S.; Radzewicz, Czeslaw
1993-06-01
High speed shadow photography is performed on excimer-laser-ablated porcine corneas and rubber stoppers to capture the excimer-laser-induced shock waves at various time delays between 40 and 320 nanoseconds. The shock waves in air, nitrogen, and helium are recorded by tangentially illuminating the ablated surface with a tunable dye laser pumped by the XeCl excimer laser pulse. The excimer laser ablates the specimen and excites the dye laser, which is then passed through an optical delay line before illuminating the specimen. The shadow of the shock wave produced during ablation is then cast on a screen and photographed with a CCD video camera. The system is pulsed 30 times per second to allow a video recording of the shock wave at a fixed time delay. We conclude that high energy acoustic waves and gaseous particles are liberated during excimer laser corneal ablation and dissipate on a submicrosecond time scale. The velocity of their dissipation depends on the atmospheric environment and can be increased two-fold when the ablation is performed in a helium atmosphere. Therefore, local temperature increases due to the liberation of high energy gases may be reduced by using helium during corneal photoablation.
The Topographic Data Deluge - Collecting and Maintaining Data in a 21ST Century Mapping Agency
NASA Astrophysics Data System (ADS)
Holland, D. A.; Pook, C.; Capstick, D.; Hemmings, A.
2016-06-01
In the last few years, the number of sensors and data collection systems available to a mapping agency has grown considerably. In the field, in addition to total stations measuring position, angles and distances, the surveyor can choose from hand-held GPS devices, multi-lens imaging systems or laser scanners, which may be integrated with a laptop or tablet to capture topographic data directly in the field. These systems are joined by mobile mapping solutions, mounted on large or small vehicles, or sometimes even on a backpack carried by a surveyor walking around a site. Such systems allow the raw data to be collected rapidly in the field, while the interpretation of the data can be performed back in the office at a later date. In the air, large format digital cameras and airborne lidar sensors are being augmented with oblique camera systems, taking multiple views at each camera position and being used to create more realistic 3D city models. Lower down in the atmosphere, Unmanned Aerial Vehicles (or Remotely Piloted Aircraft Systems) have suddenly become ubiquitous. Hundreds of small companies have sprung up, providing images from UAVs using ever more capable consumer cameras. It is now easy to buy a 42 megapixel camera off the shelf at the local camera shop, and Canon recently announced that they are developing a 250 megapixel sensor for the consumer market. While these sensors may not yet rival the metric cameras used by today's photogrammetrists, the rapid developments in sensor technology could eventually lead to the commoditization of high-resolution camera systems. With data streaming in from so many sources, the main issue for a mapping agency is how to interpret, store and update the data in such a way as to enable the creation and maintenance of the end product. 
This might be a topographic map, ortho-image or digital surface model today, but soon it is just as likely to be a 3D point cloud, textured 3D mesh, 3D city model, or Building Information Model (BIM), with all the data interpretation and modelling that entails. In this paper, we describe research into these developing technologies and outline the findings for a National Mapping Agency (NMA). We also look at the challenges that these new data collection systems will bring to an NMA, and suggest ways that we may work to meet these challenges and deliver the products desired by our users.
Visualization of corona discharge induced by UV (248 nm) pulses of a KrF excimer laser
NASA Astrophysics Data System (ADS)
Mizeraczyk, Jerzy; Ohkubo, Toshikazu; Kanazawa, Seiji; Nomoto, Yukiharu; Kawasaki, Toshiyuki; Kocik, Marek
2000-11-01
A KrF excimer laser (248 nm) was used to induce DC corona discharge streamers in air between the electrodes of a needle-to-plane geometry. The UV laser beam pulses were transformed into a laser sheet (1.5 mm thick and 20 mm wide) that was positioned along the axis directed from the needle electrode to the plane electrode. The laser pulses were time-synchronized with the exposure of an ICCD camera that recorded images of the corona streamers induced by the laser sheet. The laser pulse energy flux (75 MW/cm^2) crossing the gap was high enough to induce corona streamers with a reliability of 100%, even at relatively low operating voltages (e.g., 15 kV) at which self-sustained streamers could not occur. Owing to the full synchronization between the corona streamer onset induced by the laser pulse and the exposure of the ICCD camera, 2-D visualization of the corona streamer evolution with a time resolution of 10 ns was possible. The recorded images made it possible to determine such features of the corona discharge streamer as its velocity (2.5 x 10^5 m/s) and the diameters of the leader channel (200 micrometers) and the leader streamers (100 micrometers).
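The velocity figure quoted above comes from tracking the streamer tip across successive, time-synchronized ICCD frames. A minimal sketch of that reduction, using illustrative tip positions (not the paper's data) chosen so the fitted slope reproduces the reported ~2.5 x 10^5 m/s:

```python
import numpy as np

# Hypothetical streamer tip positions (m) read off successive ICCD frames
# taken 10 ns apart; the values are illustrative, not the paper's data.
frame_interval_s = 10e-9
tip_positions_m = np.array([0.0e-3, 2.5e-3, 5.0e-3, 7.5e-3, 10.0e-3])
times_s = np.arange(len(tip_positions_m)) * frame_interval_s

# Linear fit of position vs. time; the slope is the streamer velocity.
velocity_m_per_s, _ = np.polyfit(times_s, tip_positions_m, 1)
print(f"streamer velocity ~ {velocity_m_per_s:.2e} m/s")  # ~ 2.5e5 m/s
```

The same fit applied per frame pair would also reveal any acceleration of the streamer as it crosses the gap.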
Neil A. Clark; Sang-Mook Lee
2004-01-01
This paper demonstrates how a digital video camera with a long lens can be used with pulse laser ranging in order to collect very large-scale tree crown measurements. The long focal length of the camera lens provides the magnification required for precise viewing of distant points with the trade-off of spatial coverage. Multiple video frames are mosaicked into a single...
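The magnification/coverage trade-off described here is just pinhole-camera scaling: the ground footprint of one pixel is the pixel pitch scaled by the ratio of the laser-measured range to the focal length. A sketch with assumed numbers (the pixel pitch, focal length, and range below are illustrative, not from the paper):

```python
# Assumed figures for a long lens paired with a pulse laser rangefinder.
pixel_pitch_m = 5e-6      # assumed 5 micrometer sensor pixel pitch
focal_length_m = 0.4      # assumed 400 mm lens
laser_range_m = 80.0      # slant range from the pulse laser

# Pinhole-camera scaling: on-target size of one pixel.
gsd_m = pixel_pitch_m * laser_range_m / focal_length_m
print(f"one pixel spans ~ {gsd_m * 1000:.1f} mm on the crown")  # 1.0 mm
```

Millimetre-scale pixels over a field of view only a metre or two wide is why multiple frames must then be mosaicked to cover a whole crown.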
NASA Astrophysics Data System (ADS)
Fernandez, J. C.; Shrestha, R. L.; Carter, W. E.; Slatton, C. K.; Singhania, A.
2006-12-01
The UF GEM Research Center is working towards developing a Mobile Terrestrial Laser Scanning System (M-TLSS). The core of the M-TLSS is a commercial 2-axis ground-based laser scanner, the Optech ILRIS-36D, which is capable of generating XYZ point clouds with laser intensity or RGB texture over ranges from 3 m to 1500 m. The laser operates at a wavelength of 1535 nm. The sample separation can be adjusted down to 0.00115°, and the scanning speed is 2,000 points per second. The scanner is integrated with a mobile telescoping, rotating and tilting platform, essentially a telescopic lift mounted on the back of a pickup truck, which provides up to 6 degrees of freedom for performing scanning operations. A scanner built-in 6 megapixel digital camera and a digital video camera give the M-TLSS moving and still imaging capability. The applications of the M-TLSS data sets are numerous in both science and engineering. This paper focuses on the application of the M-TLSS as a complement to ALSM in the study of beach morphology in the St. Augustine, Florida area. ALSM data cover a long stretch of beach with a moderate sample density of approximately 1 laser return per square meter, which enables the detection of submeter-scale changes in shoreline position and dune heights over periods of a few months. The M-TLSS, on the other hand, can provide high-density point clouds (centimeter-scale point spacing) of smaller areas known to be highly prone to erosion. From these point clouds, centimeter-level surface grids are created. These grids will be compared with the ALSM data and with a time series of M-TLSS data over the same area to provide high-resolution, short-term beach erosion monitoring. Surface morphological parameters that will be compared among the ALSM and M-TLSS data sets include shoreline position, and gradients and standard deviations of elevations on cross-shore transects.
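The erosion-monitoring step described above amounts to gridding each epoch's point cloud into a mean-elevation surface and differencing the grids. A minimal sketch on synthetic data (the 5 cm cell size, patch extent, and 2 cm sand loss are assumptions for illustration, not the paper's numbers):

```python
import numpy as np

def grid_elevations(points, cell=0.05, extent=1.0):
    """Bin (x, y, z) points into a regular grid of mean elevations.
    `cell` is the grid spacing in metres (centimetre-scale, as for M-TLSS)."""
    n = int(extent / cell)
    sums = np.zeros((n, n))
    counts = np.zeros((n, n))
    ix = np.clip((points[:, 0] / cell).astype(int), 0, n - 1)
    iy = np.clip((points[:, 1] / cell).astype(int), 0, n - 1)
    np.add.at(sums, (ix, iy), points[:, 2])   # unbuffered accumulation per cell
    np.add.at(counts, (ix, iy), 1)
    grid = np.full((n, n), np.nan)
    mask = counts > 0
    grid[mask] = sums[mask] / counts[mask]
    return grid

# Two synthetic scan epochs of a 1 m x 1 m patch; epoch 2 has lost 2 cm of sand.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1, size=(5000, 2))
epoch1 = np.column_stack([xy, 0.50 + 0.001 * rng.standard_normal(5000)])
epoch2 = np.column_stack([xy, 0.48 + 0.001 * rng.standard_normal(5000)])

change = grid_elevations(epoch2) - grid_elevations(epoch1)
print(f"mean elevation change ~ {np.nanmean(change) * 100:.1f} cm")  # ~ -2.0 cm
```

The same differencing against an ALSM-derived grid (after co-registration) would expose where the terrestrial and airborne surfaces disagree.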
View of Scientific Instrument Module to be flown on Apollo 15
1971-06-27
S71-2250X (June 1971) --- A close-up view of the Scientific Instrument Module (SIM) to be flown for the first time on the Apollo 15 lunar landing mission. Mounted in a previously vacant sector of the Apollo Service Module (SM), the SIM carries specialized cameras and instrumentation for gathering lunar orbit scientific data. SIM equipment includes a laser altimeter for accurate measurement of height above the lunar surface; a large-format panoramic camera for mapping, correlated with a metric camera and the laser altimeter for surface mapping; a gamma ray spectrometer on a 25-foot extendible boom; a mass spectrometer on a 21-foot extendible boom; X-ray and alpha particle spectrometers; and a subsatellite which will be injected into lunar orbit carrying particle detectors, a magnetometer, and the S-band transponder.
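The laser altimeter mentioned above works by round-trip timing: orbital height is half the echo delay times the speed of light. A sketch with an assumed delay (the 110 km orbit and the corresponding delay below are illustrative figures, not from the caption):

```python
# Round-trip laser altimetry: altitude = c * delay / 2.
C = 2.998e8                    # speed of light, m/s

def altitude_m(round_trip_s):
    """Height above the surface from the echo round-trip time."""
    return C * round_trip_s / 2.0

# An assumed ~110 km lunar orbit gives roughly a 0.73 ms echo delay.
print(f"{altitude_m(733.8e-6) / 1000:.0f} km")  # 110 km
```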
Radiation hardening of gated x-ray imagers for the National Ignition Facility (invited).
Bell, P M; Bradley, D K; Kilkenny, J D; Conder, A; Cerjan, C; Hagmann, C; Hey, D; Izumi, N; Moody, J; Teruya, A; Celeste, J; Kimbrough, J; Khater, H; Eckart, M J; Ayers, J
2010-10-01
The National Ignition Facility will soon be producing x-ray flux and neutron yields higher than any produced in laser-driven implosion experiments in the past. Even a non-igniting capsule will require x-ray imaging of near-burning plasmas at yields of 10^17 neutrons, requiring x-ray recording systems to work in more hostile conditions than we have encountered in past laser facilities. We present modeling, experimental data and design concepts for x-ray imaging with electronic recording systems in this environment. A novel instrument, ARIANE (active readout in a nuclear environment), is described, which uses the time-of-flight difference between the gated x-ray signal and the neutrons that induce a background signal to increase the yield at which gated cameras can be used.
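The time-of-flight argument can be made quantitative: x-rays from the implosion arrive at light speed, while the 14.1 MeV DT neutrons that drive the background arrive measurably later, leaving a window in which to read out the gated signal. A sketch with an assumed detector stand-off (the 0.5 m distance is illustrative, not from the paper):

```python
import math

C = 2.998e8                 # speed of light, m/s
NEUTRON_REST_MEV = 939.565  # neutron rest-mass energy, MeV
E_N_MEV = 14.1              # DT fusion neutron energy, MeV

# Non-relativistic estimate of neutron speed (adequate at 14.1 MeV):
# (1/2) m v^2 = E  =>  v = c * sqrt(2 E / (m c^2)).
v_n = C * math.sqrt(2 * E_N_MEV / NEUTRON_REST_MEV)

d = 0.5                     # assumed detector stand-off, metres
dt_ns = (d / v_n - d / C) * 1e9
print(f"neutron lags x-ray by ~ {dt_ns:.1f} ns at {d} m")  # ~ 8.0 ns
```

At larger stand-offs the lag grows linearly, which is why placing the recording electronics further from the target relaxes the gating requirement.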
A new position measurement system using a motion-capture camera for wind tunnel tests.
Park, Hyo Seon; Kim, Ji Young; Kim, Jin Gi; Choi, Se Woon; Kim, Yousok
2013-09-13
Considering the characteristics of wind tunnel tests, a position measurement system that minimizes the effects on the flow of the simulated wind must be established. In this study, a motion-capture camera was used to measure the displacement responses of structures in a wind tunnel test, and the applicability of the system was evaluated. The motion-capture system (MCS) outputs 3D coordinates computed from the two-dimensional image coordinates obtained from the cameras. Furthermore, this remote sensing system offers some flexibility in laboratory installation because of its ability to measure at relatively long distances from the target structures. In this study, we performed wind tunnel tests on a pylon specimen and compared the responses measured by the MCS with the displacements measured by a laser displacement sensor (LDS). The comparison revealed that the time-history displacement measurements from the MCS slightly exceeded those of the LDS. In addition, we confirmed the measurement reliability of the MCS by identifying the dynamic properties (natural frequency, damping ratio, and mode shape) of the test specimen using a system identification method (frequency domain decomposition, FDD). By comparing the mode shape obtained using this method with that obtained from the LDS, we also confirmed that the MCS can construct a more accurate (bending-deflection) mode shape from its 3D measurements.
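Frequency domain decomposition, as used above, takes the singular value decomposition of the cross-power spectral density matrix at every frequency line: the first singular value peaks at a natural frequency, and the corresponding singular vector approximates the mode shape. A minimal sketch on synthetic two-channel data (the 2 Hz mode and the 1.0 : 0.6 shape are stand-ins, not the pylon's measured properties):

```python
import numpy as np
from scipy import signal

# Two channels observing the same 2.0 Hz mode with amplitudes 1.0 and 0.6.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
modal = np.sin(2 * np.pi * 2.0 * t)
y = np.vstack([1.0 * modal, 0.6 * modal]) + 0.05 * rng.standard_normal((2, len(t)))

# Cross-power spectral density matrix G(f), shape (n_ch, n_ch, n_freq).
n_ch = y.shape[0]
f, _ = signal.csd(y[0], y[0], fs=fs, nperseg=1024)
G = np.empty((n_ch, n_ch, len(f)), dtype=complex)
for i in range(n_ch):
    for j in range(n_ch):
        _, G[i, j] = signal.csd(y[i], y[j], fs=fs, nperseg=1024)

# SVD at every frequency; the first singular value peaks at the natural
# frequency, and its left singular vector approximates the mode shape.
s1 = np.array([np.linalg.svd(G[:, :, k], compute_uv=False)[0] for k in range(len(f))])
k_peak = int(np.argmax(s1))
U, _, _ = np.linalg.svd(G[:, :, k_peak])
shape = np.abs(U[:, 0]) / np.abs(U[0, 0])   # normalise to channel 1
print(f"natural frequency ~ {f[k_peak]:.2f} Hz, mode shape ~ {shape.round(2)}")
```

With more channels along the pylon, the singular vector at the peak traces the bending-deflection mode shape directly, which is the comparison made against the LDS result.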