Automatic multi-camera calibration for deployable positioning systems
NASA Astrophysics Data System (ADS)
Axelsson, Maria; Karlsson, Mikael; Rudner, Staffan
2012-06-01
Surveillance with automated positioning and tracking of subjects and vehicles in 3D is desired in many defence and security applications. Camera systems with stereo or multiple cameras are often used for 3D positioning. In such systems, accurate camera calibration is needed to obtain a reliable 3D position estimate. There is also a need for automated camera calibration to facilitate fast deployment of semi-mobile multi-camera 3D positioning systems. In this paper we investigate a method for automatic calibration of the extrinsic camera parameters (relative camera position and orientation) of a multi-camera positioning system. It is based on estimation of the essential matrix between each camera pair using the 5-point method for intrinsically calibrated cameras. The method is compared to a manual calibration method using real HD video data from a field trial with a multi-camera positioning system. The method is also evaluated on simulated data from a stereo camera model. The results show that the reprojection error of the automated camera calibration method is close to or smaller than that of the manual calibration method, and that the automated method can replace manual calibration.
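The 5-point method mentioned above rests on the epipolar constraint for intrinsically calibrated cameras: the essential matrix E = [t]x R relates normalized image coordinates of a camera pair via x2ᵀ E x1 = 0. The following is a minimal numpy sketch of that constraint (not the authors' implementation); the relative pose and point cloud are hypothetical.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]x so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

# Hypothetical relative pose between a camera pair: small rotation about y,
# plus a translation. The essential matrix is E = [t]x R.
angle = 0.1
R = np.array([[np.cos(angle), 0, np.sin(angle)],
              [0, 1, 0],
              [-np.sin(angle), 0, np.cos(angle)]])
t = np.array([1.0, 0.2, 0.0])
E = skew(t) @ R

# Synthetic 3D points in front of camera 1, expressed in both camera frames
# (convention: X2 = R @ X1 + t), then reduced to normalized image coordinates.
rng = np.random.default_rng(0)
X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(20, 3))
x1 = X / X[:, 2:3]
X2 = (R @ X.T).T + t
x2 = X2 / X2[:, 2:3]

# For exact correspondences the epipolar residual x2^T E x1 vanishes;
# the 5-point method inverts this relation to recover E from 5 matches.
residuals = np.einsum('ij,jk,ik->i', x2, E, x1)
print(np.max(np.abs(residuals)))
```

In practice the residuals are nonzero due to noise, which is why the paper compares reprojection errors rather than raw epipolar residuals.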
NASA Astrophysics Data System (ADS)
Feng, Zhixin
2018-02-01
Projector calibration is crucial for a camera-projector three-dimensional (3-D) structured light measurement system, which has one camera and one projector. In this paper, a novel projector calibration method is proposed based on digital image correlation. In the method, the projector is viewed as an inverse camera, and a planar calibration board with feature points is used to calibrate the projector. During the calibration process, a random speckle pattern is projected onto the calibration board at different orientations to establish the correspondences between projector images and camera images. Thereby, a dataset for projector calibration is generated. Then the projector can be calibrated using a well-established camera calibration algorithm. The experimental results confirm that the proposed method is accurate and reliable for projector calibration.
A Novel Multi-Camera Calibration Method based on Flat Refractive Geometry
NASA Astrophysics Data System (ADS)
Huang, S.; Feng, M. C.; Zheng, T. X.; Li, F.; Wang, J. Q.; Xiao, L. F.
2018-03-01
Multi-camera calibration plays an important role in many fields. In this paper, we present a novel multi-camera calibration method based on flat refractive geometry. All cameras can acquire calibration images of a transparent glass calibration board (TGCB) at the same time. The use of a TGCB leads to a refractive phenomenon that introduces calibration error. The theory of flat refractive geometry is employed to eliminate this error; the new method thus resolves the refraction caused by the TGCB. Moreover, the bundle adjustment method is used to minimize the reprojection error and obtain optimized calibration results. Finally, the four-camera calibration results on real data show that the mean value and standard deviation of the reprojection error of our method are 4.3411e-05 and 0.4553 pixel, respectively. The experimental results show that the proposed method is accurate and reliable.
Calibration and accuracy analysis of a focused plenoptic camera
NASA Astrophysics Data System (ADS)
Zeller, N.; Quint, F.; Stilla, U.
2014-08-01
In this article we introduce new methods for the calibration of depth images from focused plenoptic cameras and validate the results. We start with a brief description of the concept of a focused plenoptic camera and how a depth map can be estimated from the recorded raw image. For this camera, an analytical expression for the depth accuracy is derived for the first time. In the main part of the paper, methods to calibrate a focused plenoptic camera are developed and evaluated. The optical imaging process is calibrated using a method already known from the calibration of traditional cameras. For the calibration of the depth map, two new model-based methods, which make use of the projection concept of the camera, are developed. These new methods are compared to a common curve-fitting approach based on a Taylor series approximation. Both model-based methods show significant advantages compared to the curve-fitting method: they need fewer reference points for calibration and, moreover, supply a function which is valid beyond the calibration range. In addition, the depth map accuracy of the plenoptic camera was experimentally investigated for different focal lengths of the main lens and compared to the analytical evaluation.
Application of single-image camera calibration for ultrasound augmented laparoscopic visualization
NASA Astrophysics Data System (ADS)
Liu, Xinyang; Su, He; Kang, Sukryool; Kane, Timothy D.; Shekhar, Raj
2015-03-01
Accurate calibration of laparoscopic cameras is essential for enabling many surgical visualization and navigation technologies such as the ultrasound-augmented visualization system that we have developed for laparoscopic surgery. In addition to accuracy and robustness, there is a practical need for a fast and easy camera calibration method that can be performed on demand in the operating room (OR). Conventional camera calibration methods are not suitable for OR use because they are lengthy and tedious: they require acquisition of multiple images of a target pattern in its entirety to produce satisfactory results. In this work, we evaluated the performance of a single-image camera calibration tool (rdCalib; Percieve3D, Coimbra, Portugal) featuring automatic detection of corner points in the image, whether partial or complete, of a custom target pattern. Intrinsic camera parameters of 5-mm and 10-mm standard Stryker® laparoscopes obtained using rdCalib and the well-accepted OpenCV camera calibration method were compared. Target registration error (TRE), as a measure of camera calibration accuracy for our optical tracking-based AR system, was also compared between the two calibration methods. Based on our experiments, the single-image camera calibration yields consistent and accurate results (mean TRE = 1.18 ± 0.35 mm for the 5-mm scope and mean TRE = 1.13 ± 0.32 mm for the 10-mm scope), which are comparable to the results obtained using the OpenCV method with 30 images. The new single-image camera calibration method shows promise for application to our augmented reality visualization system for laparoscopic surgery.
A novel dual-camera calibration method for 3D optical measurement
NASA Astrophysics Data System (ADS)
Gai, Shaoyan; Da, Feipeng; Dai, Xianqiang
2018-05-01
A novel dual-camera calibration method is presented. In classic methods, the camera parameters are usually calculated and optimized using the reprojection error. However, for a system designed for 3D optical measurement, this error does not directly reflect the quality of 3D reconstruction. In the presented method, a planar calibration plate is used. First, images of the calibration plate are captured from several orientations within the measurement range, and the initial parameters of the two cameras are obtained from these images. Then, the rotation and translation matrices that link the frames of the two cameras are calculated using the Centroid Distance Increment Matrix method, which reduces the degree of coupling between the parameters. Next, the 3D coordinates of the calibration points are reconstructed by the space intersection method. Finally, the reconstruction error is calculated and minimized to optimize the calibration parameters. This error directly indicates the quality of 3D reconstruction, so it is better suited for assessing dual-camera calibration. Experiments show that the proposed method is convenient and accurate: there is no strict requirement on the calibration plate position during calibration, and the accuracy is significantly improved.
Camera calibration method of binocular stereo vision based on OpenCV
NASA Astrophysics Data System (ADS)
Zhong, Wanzhen; Dong, Xiaona
2015-10-01
Camera calibration, an important part of binocular stereo vision research, is the essential foundation of 3D reconstruction of a spatial object. In this paper, a camera calibration method based on OpenCV (the open source computer vision library) is presented to improve the precision and efficiency of the calibration process. First, the camera model in OpenCV and an algorithm for camera calibration are presented, with particular attention to the influence of radial and decentering lens distortion. Then, a camera calibration procedure is designed to compute the camera parameters and calculate the calibration errors. A high-accuracy profile extraction algorithm and a checkerboard with 48 corners are also used in this part. Finally, the results of the calibration program are presented, demonstrating the high efficiency and accuracy of the proposed approach. The results meet the requirements of robot binocular stereo vision.
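The radial and decentering distortion mentioned in this abstract follow the Brown model that OpenCV-style calibration estimates. As a hedged numpy sketch (intrinsics, grid, and distortion coefficients are hypothetical, not from the paper), the following applies the model to normalized pinhole coordinates and measures the pixel displacement that the distortion terms alone introduce:

```python
import numpy as np

def project(X, fx, fy, cx, cy, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Pinhole projection with radial (k1, k2) and decentering (p1, p2) distortion."""
    x, y = X[:, 0] / X[:, 2], X[:, 1] / X[:, 2]
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.stack([fx * xd + cx, fy * yd + cy], axis=1)

# Hypothetical intrinsics and a synthetic planar grid at unit depth.
fx = fy = 800.0
cx, cy = 640.0, 360.0
g = np.linspace(-0.2, 0.2, 5)
X = np.array([[u, v, 1.0] for u in g for v in g])

ideal = project(X, fx, fy, cx, cy)  # undistorted pinhole projection
distorted = project(X, fx, fy, cx, cy, k1=-0.1, k2=0.01, p1=1e-3, p2=1e-3)

# RMS pixel displacement caused by the distortion terms alone; calibration
# recovers k1, k2, p1, p2 by minimizing exactly this kind of residual.
rms = np.sqrt(np.mean(np.sum((distorted - ideal) ** 2, axis=1)))
print(rms)
```

In OpenCV itself this model is fitted by `cv2.calibrateCamera` from checkerboard corner detections; the sketch only illustrates the forward distortion mapping.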
Automatic Camera Calibration Using Multiple Sets of Pairwise Correspondences.
Vasconcelos, Francisco; Barreto, Joao P; Boyer, Edmond
2018-04-01
We propose a new method to add an uncalibrated node into a network of calibrated cameras using only pairwise point correspondences. While previous methods perform this task using triple correspondences, these are often difficult to establish when there is limited overlap between different views. In such challenging cases we must rely on pairwise correspondences, and our solution becomes more advantageous. Our method includes an 11-point minimal solution for the intrinsic and extrinsic calibration of a camera from pairwise correspondences with two other calibrated cameras, and a new inlier selection framework that extends the traditional RANSAC family of algorithms to sampling across multiple datasets. Our method is validated on different application scenarios where a lack of triple correspondences might occur: addition of a new node to a camera network; calibration and motion estimation of a moving camera inside a camera network; and addition of views with limited overlap to a Structure-from-Motion model.
A calibration method based on virtual large planar target for cameras with large FOV
NASA Astrophysics Data System (ADS)
Yu, Lei; Han, Yangyang; Nie, Hong; Ou, Qiaofeng; Xiong, Bangshu
2018-02-01
In order to obtain high precision in camera calibration, a target should be large enough to cover the whole field of view (FOV). For cameras with a large FOV, using a small target seriously reduces the precision of calibration, yet a large target is difficult to manufacture, carry and employ. To solve this problem, a calibration method based on a virtual large planar target (VLPT), which is virtually constructed from multiple small targets (STs), is proposed for cameras with a large FOV. In the VLPT-based calibration method, first, the positions and directions of the STs are changed several times to obtain a number of calibration images. Secondly, the VLPT of each calibration image is created by finding the virtual points corresponding to the feature points of the STs. Finally, the intrinsic and extrinsic parameters of the camera are calculated using the VLPTs. Experimental results show that the proposed method not only achieves calibration precision similar to that of methods employing a large target, but also has good stability over the whole measurement area. Thus, the proposed method effectively tackles the difficulty of accurately calibrating cameras with a large FOV while remaining easy to operate.
Joint Calibration of 3D Laser Scanner and Digital Camera Based on DLT Algorithm
NASA Astrophysics Data System (ADS)
Gao, X.; Li, M.; Xing, L.; Liu, Y.
2018-04-01
We design a calibration target that can be scanned by a 3D laser scanner while being photographed by a digital camera, yielding a point cloud and photographs of the same target. A method to jointly calibrate the 3D laser scanner and digital camera based on the Direct Linear Transformation (DLT) algorithm is proposed. This method adds a digital camera distortion model to the traditional DLT algorithm; after repeated iteration, it solves the interior and exterior orientation elements of the camera as well as the joint calibration of the 3D laser scanner and digital camera. Experiments prove that this method is reliable.
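The core DLT step can be sketched in a few lines of numpy: estimate the 3x4 projection matrix from 3D-2D correspondences (here standing in for laser-scanner points and their pixel coordinates) by solving a homogeneous linear system via SVD. This is a generic DLT sketch without the paper's distortion refinement, and the camera parameters below are hypothetical:

```python
import numpy as np

def dlt(X, u):
    """Estimate P (3x4, up to scale) from n >= 6 points X (n,3) and pixels u (n,2)."""
    rows = []
    for (x, y, z), (px, py) in zip(X, u):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -px * x, -px * y, -px * z, -px])
        rows.append([0, 0, 0, 0, x, y, z, 1, -py * x, -py * y, -py * z, -py])
    # The null vector of the stacked system (smallest singular vector) is P.
    _, _, Vt = np.linalg.svd(np.array(rows))
    return Vt[-1].reshape(3, 4)

# Synthetic ground-truth camera: intrinsics K and pose [R | t].
K = np.array([[900.0, 0, 320.0], [0, 900.0, 240.0], [0, 0, 1.0]])
Rt = np.hstack([np.eye(3), np.array([[0.1], [0.05], [2.0]])])
P_true = K @ Rt

# Noise-free 3D points and their projections.
rng = np.random.default_rng(3)
X = rng.uniform([-1, -1, 3], [1, 1, 6], size=(12, 3))
Xh = np.hstack([X, np.ones((12, 1))])
uh = (P_true @ Xh.T).T
u = uh[:, :2] / uh[:, 2:3]

# DLT estimate reprojects the points essentially exactly in the noise-free case.
P_est = dlt(X, u)
uh2 = (P_est @ Xh.T).T
u2 = uh2[:, :2] / uh2[:, 2:3]
print(np.max(np.abs(u2 - u)))
```

The paper's contribution is to iterate this linear solve together with a lens distortion model; the sketch shows only the linear core.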
Camera calibration for multidirectional flame chemiluminescence tomography
NASA Astrophysics Data System (ADS)
Wang, Jia; Zhang, Weiguang; Zhang, Yuhong; Yu, Xun
2017-04-01
Flame chemiluminescence tomography (FCT), which combines computerized tomography theory with multidirectional chemiluminescence emission measurements, can realize instantaneous three-dimensional (3-D) diagnostics for flames with high spatial and temporal resolutions. One critical step of FCT is to record the projections by multiple cameras from different view angles. For high-accuracy reconstructions, the extrinsic parameters (positions and orientations) and intrinsic parameters (especially the image distances) of the cameras must be accurately calibrated first. Taking the focus effect of the camera into account, a modified camera calibration method is presented for FCT, and a 3-D calibration pattern is designed to solve for the parameters. The precision of the method was evaluated by reprojecting feature points to the cameras using the calibration results. The maximum root mean square error is 1.42 pixels for the feature points' positions and 0.0064 mm for the image distance. An FCT system with 12 cameras was calibrated by the proposed method, and the 3-D CH* intensity of a propane flame was measured. The results show that the FCT system provides reasonable reconstruction accuracy using the camera calibration results.
Automatic calibration method for plenoptic camera
NASA Astrophysics Data System (ADS)
Luan, Yinsen; He, Xing; Xu, Bing; Yang, Ping; Tang, Guomao
2016-04-01
An automatic calibration method is proposed for a microlens-based plenoptic camera. First, all microlens images in the white image are searched and recognized automatically based on digital morphology. Then, the center points of the microlens images are rearranged according to their relative positions. Consequently, the microlens images are located; that is, the plenoptic camera is calibrated without prior knowledge of the camera parameters. Furthermore, this method is appropriate for all types of microlens-based plenoptic cameras, including the multifocus plenoptic camera, plenoptic cameras with arbitrarily arranged microlenses, and plenoptic cameras with microlenses of different sizes. Finally, we verify our method on raw data from a Lytro camera. The experiments show that our method is more automated than previously published methods.
Extrinsic Calibration of Camera Networks Based on Pedestrians
Guan, Junzhi; Deboeverie, Francis; Slembrouck, Maarten; Van Haerenborgh, Dirk; Van Cauwelaert, Dimitri; Veelaert, Peter; Philips, Wilfried
2016-01-01
In this paper, we propose a novel extrinsic calibration method for camera networks based on analyzing tracks of pedestrians. First of all, we extract the center lines of walking persons by detecting their heads and feet in the camera images. We propose an easy and accurate method to estimate the 3D positions of the head and feet w.r.t. a local camera coordinate system from these center lines. We also propose a RANSAC-based orthogonal Procrustes approach to compute the relative extrinsic parameters connecting the coordinate systems of cameras in a pairwise fashion. Finally, we refine the extrinsic calibration matrices using a method that minimizes the reprojection error. While existing state-of-the-art calibration methods exploit epipolar geometry and use image positions directly, the proposed method first computes 3D positions per camera and then fuses the data. This results in simpler computations and a more flexible and accurate calibration method. Another advantage of our method is that it can also handle persons walking along straight lines, a situation that often occurs in real life but cannot be handled by most existing state-of-the-art calibration methods because all head and feet positions are then co-planar. PMID:27171080
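The orthogonal Procrustes step mentioned above has a compact closed form (the Kabsch algorithm): given the same 3D points expressed in two camera coordinate systems, the rigid transform between them follows from an SVD of the cross-covariance. The sketch below uses synthetic points and a hypothetical relative pose, without the paper's RANSAC wrapper:

```python
import numpy as np

def procrustes_pose(P, Q):
    """Return (R, t) minimizing sum ||R @ P_i + t - Q_i||^2 (Kabsch algorithm)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)          # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det +1), not a reflection.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Hypothetical ground-truth relative pose between two cameras.
a = 0.4
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0, 0, 1]])
t_true = np.array([2.0, -1.0, 0.1])

# Head/feet-style 3D positions in camera 1, and the same points in camera 2.
rng = np.random.default_rng(2)
P = rng.uniform(-1, 1, size=(30, 3))
Q = (R_true @ P.T).T + t_true

R_est, t_est = procrustes_pose(P, Q)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

In the paper this closed-form solve is embedded in a RANSAC loop to reject mis-tracked pedestrians before the final reprojection-error refinement.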
An intelligent space for mobile robot localization using a multi-camera system.
Rampinelli, Mariana; Covre, Vitor Buback; de Queiroz, Felippe Mendonça; Vassallo, Raquel Frizera; Bastos-Filho, Teodiano Freire; Mazo, Manuel
2014-08-15
This paper describes an intelligent space whose objective is to localize and control robots or robotic wheelchairs to help people. The intelligent space has 11 cameras distributed across two laboratories and a corridor. The cameras are fixed in the environment, and image capture is done synchronously. The system was programmed as a client/server with TCP/IP connections, and a communication protocol was defined. The client coordinates the activities inside the intelligent space, and the servers provide the information needed for that. Since the cameras are used for localization, they have to be properly calibrated. Therefore, a calibration method for a multi-camera network is also proposed in this paper. A robot is used to move a calibration pattern throughout the field of view of the cameras; the captured images and the robot odometry are then used for calibration. As a result, the proposed algorithm provides a solution for multi-camera calibration and robot localization at the same time. The intelligent space and the calibration method were evaluated under different scenarios using computer simulations and real experiments. The results demonstrate the proper functioning of the intelligent space and validate the multi-camera calibration method, which also improves robot localization.
A combined microphone and camera calibration technique with application to acoustic imaging.
Legg, Mathew; Bradley, Stuart
2013-10-01
We present a calibration technique for an acoustic imaging microphone array, combined with a digital camera. Computer vision and acoustic time of arrival data are used to obtain microphone coordinates in the camera reference frame. Our new method allows acoustic maps to be plotted onto the camera images without the need for additional camera alignment or calibration. Microphones and cameras may be placed in an ad-hoc arrangement and, after calibration, the coordinates of the microphones are known in the reference frame of a camera in the array. No prior knowledge of microphone positions, inter-microphone spacings, or air temperature is required. This technique is applied to a spherical microphone array and a mean difference of 3 mm was obtained between the coordinates obtained with this calibration technique and those measured using a precision mechanical method.
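The time-of-arrival idea behind this microphone/camera calibration can be illustrated with a standard multilateration sketch: an unknown microphone position is recovered from its distances (speed of sound times time of arrival) to known source positions by linearizing the range equations. The geometry below is hypothetical and much simpler than the paper's joint camera/microphone setup:

```python
import numpy as np

def locate(sources, dists):
    """Solve |m - S_i| = d_i for m by subtracting the first range equation,
    which linearizes the system: 2 (S_i - S_0) . m = (|S_i|^2 - |S_0|^2) - (d_i^2 - d_0^2)."""
    S0, d0 = sources[0], dists[0]
    A = 2.0 * (sources[1:] - S0)
    b = (np.sum(sources[1:] ** 2, axis=1) - S0 @ S0) - (dists[1:] ** 2 - d0 ** 2)
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m

# Hypothetical sound-source positions (m) and an unknown microphone position.
rng = np.random.default_rng(1)
sources = rng.uniform(-1, 1, size=(6, 3))
mic_true = np.array([0.3, -0.2, 0.5])
dists = np.linalg.norm(sources - mic_true, axis=1)  # ideal time-of-arrival ranges

mic_est = locate(sources, dists)
print(np.linalg.norm(mic_est - mic_true))
```

The paper additionally solves for air temperature (i.e., the speed of sound) and expresses the result in the camera reference frame; the sketch assumes known exact ranges.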
Camera calibration based on the back projection process
NASA Astrophysics Data System (ADS)
Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui
2015-12-01
Camera calibration plays a crucial role in 3D measurement tasks in machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of the 2D projection errors on the image plane, not the minimum of the 3D reconstruction errors. In this paper, we propose a universal method for camera calibration that uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters from a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimate of the camera parameters is refined by a non-linear function minimization process. The proposed method obtains a more accurate calibration result, which is also more physically meaningful. Simulated and practical data are given to demonstrate the accuracy of the proposed method.
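The back-projection residual described above can be sketched geometrically: the viewing ray through a normalized image point (x, y, 1) is intersected with the known calibration plane n · X = d in the camera frame, and the resulting 3D point is compared with the ideal one. The plane parameters here are hypothetical, and the sketch omits intrinsics and distortion:

```python
import numpy as np

def back_project(xy, n, d):
    """Intersect the viewing ray of a normalized image point with plane n . X = d."""
    v = np.array([xy[0], xy[1], 1.0])  # ray direction through the image point
    s = d / (n @ v)                     # ray parameter at the plane intersection
    return s * v

# Hypothetical calibration plane, slightly tilted, about 0.6 m from the camera.
n = np.array([0.1, 0.0, 1.0])
n = n / np.linalg.norm(n)
d = 0.6

# An ideal 3D point on the plane and its noise-free normalized projection.
X_true = np.array([0.03, -0.01, 0.0])
X_true[2] = (d - n[:2] @ X_true[:2]) / n[2]
x_img = X_true[:2] / X_true[2]

# With perfect parameters the back-projected point matches the ideal point;
# with noisy detections or wrong parameters, this 3D residual is the quantity
# a BPP-style refinement would minimize instead of 2D reprojection error.
X_rec = back_project(x_img, n, d)
print(np.linalg.norm(X_rec - X_true))
```

This makes concrete why BPP residuals are measured in scene units (e.g., millimeters) rather than pixels.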
NASA Astrophysics Data System (ADS)
Xia, Renbo; Hu, Maobang; Zhao, Jibin; Chen, Songlin; Chen, Yueling
2018-06-01
Multi-camera vision systems are often needed to achieve large-scale and high-precision measurement because they have larger fields of view (FOV) than a single camera. In many applications, however, the cameras have no or only narrow overlapping FOVs, which poses a major challenge to global calibration. This paper presents a global calibration method for multiple cameras without overlapping FOVs based on photogrammetry and a reconfigurable target. Firstly, two planar targets are fixed together to form a long target whose length matches the distance between the two cameras to be calibrated. The relative positions of the two planar targets can be obtained by photogrammetric methods and used as invariant constraints in global calibration. Then, the reprojection errors of the target feature points in the two cameras' coordinate systems are calculated simultaneously and optimized by the Levenberg–Marquardt algorithm to find the optimal transformation matrix between the two cameras. Finally, all camera coordinate systems are converted to the reference coordinate system to achieve global calibration. Experiments show that the proposed method has the advantages of high accuracy (the RMS error is 0.04 mm) and low cost and is especially suitable for on-site calibration.
Simultaneous Calibration: A Joint Optimization Approach for Multiple Kinect and External Cameras.
Liao, Yajie; Sun, Ying; Li, Gongfa; Kong, Jianyi; Jiang, Guozhang; Jiang, Du; Cai, Haibin; Ju, Zhaojie; Yu, Hui; Liu, Honghai
2017-06-24
Camera calibration is a crucial problem in many applications, such as 3D reconstruction, structure from motion, object tracking and face alignment. Numerous methods have been proposed to solve this problem with good performance over the last few decades. However, few methods target the joint calibration of multiple sensors (more than four devices), which is a practical issue in real-time systems. In this paper, we propose a novel method and a corresponding workflow framework to simultaneously calibrate the relative poses of a Kinect and three external cameras. By optimizing the final cost function and adding corresponding weights to the external cameras at different locations, an effective joint calibration of multiple devices is achieved. Furthermore, the method is tested on a practical platform, and experimental results show that the proposed joint calibration method achieves satisfactory performance in a real-time system and that its accuracy is higher than that of the manufacturer's calibration.
Research on camera on orbit radial calibration based on black body and infrared calibration stars
NASA Astrophysics Data System (ADS)
Wang, YuDu; Su, XiaoFeng; Zhang, WanYing; Chen, FanSheng
2018-05-01
Affected by the launch process and the space environment, the response of a space camera is attenuated, so on-orbit radiometric calibration is necessary. In this paper, we propose a calibration method based on accurate infrared standard stars to increase the precision of infrared radiation measurement. As stars can be considered point targets, we use them as the radiometric calibration source and establish a Taylor expansion method and an energy extrapolation model based on the WISE and 2MASS catalogs. We then update the calibration results obtained from the black body. Finally, the calibration mechanism is designed and its design is verified by an on-orbit test. The experimental results show that the irradiance extrapolation error is about 3% and the accuracy of the calibration methods is about 10%, indicating that the methods satisfy the requirements of on-orbit calibration.
Accuracy evaluation of optical distortion calibration by digital image correlation
NASA Astrophysics Data System (ADS)
Gao, Zeren; Zhang, Qingchuan; Su, Yong; Wu, Shangquan
2017-11-01
Due to its convenience of operation, camera calibration based on a planar template is widely used in image measurement, computer vision and other fields. How to select a suitable distortion model is a persistent problem, so there is a need for experimental evaluation of the accuracy of camera distortion calibrations. This paper presents an experimental method for evaluating camera distortion calibration accuracy, which is easy to implement, has high precision, and is suitable for a variety of commonly used lenses. First, we use the digital image correlation method to calculate the in-plane rigid-body displacement field of an image displayed on a liquid crystal display before and after translation, as captured with a camera. Next, we use a calibration board to calibrate the camera and obtain calibration parameters, which are used to correct the calculation points of the image before and after deformation. The displacement fields before and after correction are compared to analyze the distortion calibration results. Experiments were carried out to evaluate the performance of two commonly used industrial camera lenses under four commonly used distortion models.
Calibration of an Outdoor Distributed Camera Network with a 3D Point Cloud
Ortega, Agustín; Silva, Manuel; Teniente, Ernesto H.; Ferreira, Ricardo; Bernardino, Alexandre; Gaspar, José; Andrade-Cetto, Juan
2014-01-01
Outdoor camera networks are becoming ubiquitous in critical urban areas of the largest cities around the world. Although current applications of camera networks are mostly tailored to video surveillance, recent research projects are exploiting their use to aid robotic systems in people-assisting tasks. Such systems require precise calibration of the internal and external parameters of the distributed camera network. Despite the fact that camera calibration has been an extensively studied topic, the development of practical methods for user-assisted calibration that minimize user intervention time and maximize precision still poses significant challenges. These camera systems have non-overlapping fields of view, are subject to environmental stress, and are likely to require frequent recalibration. In this paper, we propose the use of a 3D map covering the area to support the calibration process and develop an automated method that allows quick and precise calibration of a large camera network. We present two case studies of the proposed calibration method: one is the calibration of the Barcelona Robot Lab camera network, which also includes direct mappings (homographies) between image coordinates and world points in the ground plane (walking areas) to support person and robot detection and localization algorithms. The second case consists of improving the GPS positioning of geo-tagged images taken with a mobile device in the Facultat de Matemàtiques i Estadística (FME) patio at the Universitat Politècnica de Catalunya (UPC). PMID:25076221
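The image-to-ground-plane homographies mentioned above can be estimated from a handful of point correspondences. A minimal sketch (not the authors' code; the DLT formulation and function names are ours):

```python
import numpy as np

def fit_homography(px, world):
    """Estimate the 3x3 homography H with world ~ H @ px (homogeneous),
    from >= 4 pixel/ground-point pairs, via the standard DLT system A h = 0."""
    A = []
    for (u, v), (x, y) in zip(px, world):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # h is the right singular vector of A with the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)

def px_to_ground(H, u, v):
    """Map a pixel to ground-plane coordinates (dehomogenize)."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w
```

In practice the correspondences would come from the 3D map (points known to lie on walking areas), and a normalized DLT plus robust estimation would be preferable with noisy data.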
NASA Astrophysics Data System (ADS)
Zhao, Guihua; Chen, Hong; Li, Xingquan; Zou, Xiaoliang
The paper presents the concepts of lever arm and boresight angle, the design requirements for calibration sites, and an integrated method for calibrating the boresight angles of a digital camera and a laser scanner. Taking test data collected with Applanix's LandMark system as an example, the camera calibration method is introduced, which piles up three consecutive stereo images and applies an OTF calibration method using ground control points. The laser boresight-angle calibration uses both manual and automatic methods with ground control points. An integrated calibration between the digital camera and the laser scanner is introduced to improve the systemic precision of the two sensors. By analyzing the measured differences between ground control points and their corresponding image points in sequential images, we conclude that object positions between camera and images agree to within about 15 cm in relative error and 20 cm in absolute error. By comparing ground control points with their corresponding laser point clouds, the error is less than 20 cm. These experimental results show that the mobile mapping system is an efficient and reliable system for rapidly generating high-accuracy, high-density road spatial data.
Calibration Method for IATS and Application in Multi-Target Monitoring Using Coded Targets
NASA Astrophysics Data System (ADS)
Zhou, Yueyin; Wagner, Andreas; Wunderlich, Thomas; Wasmeier, Peter
2017-06-01
The technique of Image Assisted Total Stations (IATS) has been studied for over ten years and comprises two major parts: a calibration procedure that establishes the relationship between the camera system and the theodolite system, and automatic target detection in the image by various methods of photogrammetry or computer vision. Several calibration methods have been developed, mostly using prototypes with an add-on camera rigidly mounted on the total station. However, these prototypes are not commercially available. This paper proposes a calibration method based on the Leica MS50, which has two built-in cameras, each with a resolution of 2560 × 1920 px: an overview camera and a telescope (on-axis) camera. Our work in this paper is based on the on-axis camera, which uses the 30-times magnification of the telescope. The calibration consists of estimating 7 parameters. We use coded targets, which are common tools in photogrammetry for orientation, to detect different targets in IATS images instead of prisms and traditional ATR functions. We test and verify the efficiency and stability of this monitoring method with multiple targets.
A novel calibration method of focused light field camera for 3-D reconstruction of flame temperature
NASA Astrophysics Data System (ADS)
Sun, Jun; Hossain, Md. Moinul; Xu, Chuan-Long; Zhang, Biao; Wang, Shi-Min
2017-05-01
This paper presents a novel geometric calibration method for a focused light field camera to trace the rays of flame radiance and to reconstruct the three-dimensional (3-D) temperature distribution of a flame. A calibration model is developed to calculate the corner points and their projections for the focused light field camera. The characteristic that the main-lens and microlens f-numbers must match is used as an additional constraint for the calibration. Geometric parameters of the focused light field camera are then estimated using the Levenberg-Marquardt algorithm. Totally focused images, in which all points are in focus, are utilized to validate the proposed calibration method. Calibration results are presented and discussed in detail. The maximum mean relative error of the calibration is found to be less than 0.13%, indicating that the proposed method is capable of calibrating the focused light field camera successfully. The parameters obtained by the calibration are then utilized to trace the rays of flame radiance. A least-squares QR-factorization algorithm with Planck's radiation law is used to reconstruct the 3-D temperature distribution of a flame. Experiments were carried out on an ethylene-air fired combustion test rig to reconstruct the temperature distribution of flames. The flame temperature obtained by the proposed method is then compared with that obtained using a high-precision thermocouple. The difference between the two measurements was found to be no greater than 6.7%. Experimental results demonstrate that the proposed calibration method and the applied measurement technique perform well in the reconstruction of the flame temperature.
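The Planck's-law relationship at the heart of such a temperature reconstruction can be sketched as follows (illustrative only; the authors' full least-squares QR reconstruction over traced rays is not reproduced here, and the single-wavelength inversion below is our simplification):

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light in vacuum, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(lam, T):
    """Spectral radiance B(lambda, T) in W sr^-1 m^-3 from Planck's law."""
    return (2 * H * C**2 / lam**5) / (np.exp(H * C / (lam * KB * T)) - 1)

def temperature_from_radiance(lam, B):
    """Invert Planck's law for temperature at a single wavelength:
    T = hc / (lambda k ln(1 + 2hc^2 / (lambda^5 B)))."""
    return H * C / (lam * KB * np.log(1 + 2 * H * C**2 / (lam**5 * B)))
```

A real flame reconstruction would solve a large linear system relating many such radiance samples to the unknown temperature field along each traced ray.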
Camera calibration: active versus passive targets
NASA Astrophysics Data System (ADS)
Schmalz, Christoph; Forster, Frank; Angelopoulou, Elli
2011-11-01
Traditionally, most camera calibrations rely on a planar target with well-known marks. However, the localization error of the marks in the image is a source of inaccuracy. We propose the use of high-resolution digital displays as active calibration targets to obtain more accurate calibration results for all types of cameras. The display shows a series of coded patterns to generate correspondences between world points and image points. This has several advantages. No special calibration hardware is necessary because suitable displays are practically ubiquitous. The method is fully automatic, and no identification of marks is necessary. For a coding scheme based on phase shifting, the localization accuracy is approximately independent of the camera's focus settings. Most importantly, higher accuracy can be achieved compared to passive targets, such as printed checkerboards. A rigorous evaluation is performed to substantiate this claim. Our active target method is compared to standard calibrations using a checkerboard target. We perform camera calibrations with different combinations of displays, cameras, and lenses, as well as with simulated images, and find markedly lower reprojection errors when using active targets. For example, in a stereo reconstruction task, the accuracy of a system calibrated with an active target is five times better.
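The phase-shift coding mentioned above is typically decoded with the standard N-step formula. A minimal sketch (our implementation, not the authors'; assumes N equally spaced shifts):

```python
import numpy as np

def decode_phase(frames):
    """Recover the wrapped phase from N equally shifted fringe images
    I_n = A + B*cos(phi + 2*pi*n/N), N >= 3 (standard N-step formula)."""
    frames = np.asarray(frames, float)
    n = frames.shape[0]
    delta = 2 * np.pi * np.arange(n) / n
    # Weighted sums over the N frames; for I_n as above,
    # sum I_n sin(d_n) = -B*N/2*sin(phi), sum I_n cos(d_n) = B*N/2*cos(phi)
    s = np.tensordot(np.sin(delta), frames, axes=1)
    c = np.tensordot(np.cos(delta), frames, axes=1)
    return np.arctan2(-s, c)   # wrapped phase in (-pi, pi]
```

The decoded phase at each pixel gives a sub-pixel correspondence to a display column (after unwrapping), which is why the localization accuracy is largely insensitive to defocus.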
Photogrammetric Modeling and Image-Based Rendering for Rapid Virtual Environment Creation
2004-12-01
... area and different methods have been proposed. Pertinent methods include: Camera Calibration, Structure from Motion, Stereo Correspondence, and Image-Based Rendering. ... 1.1.1 Camera Calibration. Determining the 3D structure of a model from multiple views becomes simpler if the intrinsic (or internal) ... can introduce significant nonlinearities into the image. We have found that camera calibration is a straightforward process which can simplify the ...
A Visual Servoing-Based Method for ProCam Systems Calibration
Berry, Francois; Aider, Omar Ait; Mosnier, Jeremie
2013-01-01
Projector-camera systems are currently used in a wide field of applications, such as 3D reconstruction and augmented reality, and can provide accurate measurements, depending on the configuration and calibration. Frequently, the calibration task is divided into two steps: camera calibration followed by projector calibration. The latter still poses certain problems that are not easy to solve, such as the difficulty in obtaining a set of 2D-3D points to compute the projection matrix between the projector and the world. Existing methods are either not sufficiently accurate or not flexible. We propose an easy and automatic method to calibrate such systems that consists of projecting a calibration pattern and superimposing it automatically on a known printed pattern. The projected pattern is provided by a virtual camera observing a virtual pattern in an OpenGL model. The projector displays what the virtual camera visualizes. Thus, the projected pattern can be controlled and superimposed on the printed one with the aid of visual servoing. Our experimental results compare favorably with those of other methods considering both usability and accuracy. PMID:24084121
Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.
Song, Kai-Tai; Tai, Jen-Chao
2006-10-01
Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range, which makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Experimental results are presented to validate the robustness and accuracy of the proposed method.
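The first step of such lane-marking-based calibration is finding the vanishing point of the road direction, i.e. the image intersection of the parallel markings. A minimal sketch using homogeneous coordinates (our formulation, not the authors'):

```python
import numpy as np

def line_through(p, q):
    """Homogeneous image line through two pixels (cross product of points)."""
    return np.cross([*p, 1.0], [*q, 1.0])

def vanishing_point(line_a, line_b):
    """Intersection of two imaged lane markings = their vanishing point
    (cross product of lines, then dehomogenize)."""
    v = np.cross(line_a, line_b)
    return v[:2] / v[2]
```

From the vanishing point (and the known lane width) the focal length, tilt and pan can then be derived; that derivation is camera-model specific and is omitted here.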
Photogrammetry Applied to Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Liu, Tian-Shu; Cattafesta, L. N., III; Radeztsky, R. H.; Burner, A. W.
2000-01-01
In image-based measurements, quantitative image data must be mapped to three-dimensional object space. Analytical photogrammetric methods, which may be used to accomplish this task, are discussed from the viewpoint of experimental fluid dynamicists. The Direct Linear Transformation (DLT) for camera calibration, used in pressure sensitive paint, is summarized. An optimization method for camera calibration is developed that can be used to determine the camera calibration parameters, including those describing lens distortion, from a single image. Combined with the DLT method, this method allows a rapid and comprehensive in-situ camera calibration and therefore is particularly useful for quantitative flow visualization and other measurements such as model attitude and deformation in production wind tunnels. The paper also includes a brief description of typical photogrammetric applications to temperature- and pressure-sensitive paint measurements and model deformation measurements in wind tunnels.
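The DLT mentioned above solves for the 3x4 projection matrix linearly from 2D-3D correspondences. A minimal sketch (our implementation; a production version would normalize coordinates and model lens distortion, as the paper's optimization method does):

```python
import numpy as np

def dlt(world, px):
    """Direct Linear Transformation: estimate the 3x4 projection matrix P
    (up to scale) from >= 6 non-coplanar world/image point pairs."""
    A = []
    for (X, Y, Z), (u, v) in zip(world, px):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Least-squares null vector of A via SVD
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 4)

def project(P, X):
    """Project a 3D point with P and dehomogenize to pixels."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]
```

This is why in-situ calibration from a single image of known targets is fast: one small SVD yields all eleven camera parameters at once.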
Depth estimation and camera calibration of a focused plenoptic camera for visual odometry
NASA Astrophysics Data System (ADS)
Zeller, Niclas; Quint, Franz; Stilla, Uwe
2016-08-01
This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo problem. To reduce the complexity, we divide it into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes, the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model-based methods, which define the relation between the virtual depth, estimated from the light-field image, and the metric object distance. These two methods are compared to a well-known curve fitting approach. Both model-based methods show significant advantages compared to the curve fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. 
These images can be synthesized totally focused and thus finding stereo correspondences is enhanced. In contrast to monocular visual odometry approaches, due to the calibration of the individual depth maps, the scale of the scene can be observed. Furthermore, due to the light-field information better tracking capabilities compared to the monocular case can be expected. As result, the depth information gained by the plenoptic camera based visual odometry algorithm proposed in this paper has superior accuracy and reliability compared to the depth estimated from a single light-field image.
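The "Kalman-like" update of per-pixel depth estimates described above amounts to inverse-variance fusion of Gaussian estimates. A minimal sketch (our formulation; the paper's update is over virtual depth hypotheses per micro-image):

```python
def fuse(mu_a, var_a, mu_b, var_b):
    """Fuse two independent Gaussian depth estimates (mean, variance) into
    one, weighting each by the other's variance (inverse-variance weighting).
    The fused variance is always smaller than either input variance."""
    w = var_b / (var_a + var_b)
    return w * mu_a + (1 - w) * mu_b, var_a * var_b / (var_a + var_b)
```

Repeatedly applying this as each additional micro-image observation arrives yields the probabilistic depth map with per-pixel variance.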
High-precision method of binocular camera calibration with a distortion model.
Li, Weimin; Shan, Siyu; Liu, Hui
2017-03-10
A high-precision camera calibration method for a binocular stereo vision system based on a multi-view template and alternating bundle adjustment is presented in this paper. The proposed method is carried out by taking several photos of a specially designed calibration template that has diverse encoded points in different orientations. The method utilizes an existing monocular camera calibration algorithm to obtain the initialization, which involves a camera model including radial and tangential lens distortion. We create a reference coordinate system based on the left camera coordinates to optimize the intrinsic parameters of the left camera through alternating bundle adjustment. Optimal intrinsic parameters of the right camera are then obtained in the same way, with a reference coordinate system based on the right camera coordinates. All acquired intrinsic parameters are then used to optimize the extrinsic parameters. Thus, the optimal lens distortion, intrinsic, and extrinsic parameters are obtained. Synthetic and real data were used to test the method. The simulation results demonstrate that the maximum mean absolute relative calibration errors are about 3.5e-6 and 1.2e-6 for the focal length and the principal point, respectively, under zero-mean Gaussian noise with 0.05 pixels standard deviation. The real result shows that the reprojection error of our model is about 0.045 pixels, with a relative standard deviation of 1.0e-6 over the intrinsic parameters. The proposed method is convenient, cost-efficient, highly precise, and simple to carry out.
Spatial calibration of an optical see-through head mounted display
Gilson, Stuart J.; Fitzgibbon, Andrew W.; Glennerster, Andrew
2010-01-01
We present here a method for calibrating an optical see-through Head Mounted Display (HMD) using techniques usually applied to camera calibration (photogrammetry). Using a camera placed inside the HMD to take pictures simultaneously of a tracked object and features in the HMD display, we could exploit established camera calibration techniques to recover both the intrinsic and extrinsic properties of the HMD (width, height, focal length, optic centre and principal ray of the display). Our method gives low re-projection errors and, unlike existing methods, involves no time-consuming and error-prone human measurements, nor any prior estimates about the HMD geometry. PMID:18599125
Stability analysis for a multi-camera photogrammetric system.
Habib, Ayman; Detchev, Ivan; Kwak, Eunju
2014-08-18
Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of this issue of interior orientation parameter variation over time; it explains the common ways of coping with the issue and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of changes in interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the system calibration stability, the proposed methods are simulation-based. Experiment results are shown, where a multi-camera photogrammetric system was calibrated three times, and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction.
Automatic alignment method for calibration of hydrometers
NASA Astrophysics Data System (ADS)
Lee, Y. J.; Chang, K. H.; Chon, J. C.; Oh, C. Y.
2004-04-01
This paper presents a new method to automatically align specific scale-marks for the calibration of hydrometers. A hydrometer calibration system adopting the new method consists of a vision system, a stepping motor, and software to control the system. The vision system is composed of a CCD camera and a frame grabber, and is used to acquire images. The stepping motor moves the camera, which is attached to the vessel containing a reference liquid, along the hydrometer. The operating program has two main functions: to process images from the camera to find the position of the horizontal plane and to control the stepping motor for the alignment of the horizontal plane with a particular scale-mark. Any system adopting this automatic alignment method is a convenient and precise means of calibrating a hydrometer. The performance of the proposed method is illustrated by comparing the calibration results using the automatic alignment method with those obtained using the manual method.
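The image-processing step above, locating the horizontal liquid plane in the camera image, can be sketched very simply (our illustration, not the authors' algorithm): the surface shows up as the row where the mean intensity changes most sharply.

```python
import numpy as np

def plane_row(image):
    """Locate the image row of the liquid surface as the row where the
    row-averaged intensity has the strongest vertical gradient."""
    profile = image.mean(axis=1)      # one mean intensity per row
    grad = np.abs(np.diff(profile))   # change between adjacent rows
    return int(np.argmax(grad))       # row index at the edge
```

The control loop would then step the motor until this detected row coincides with the target scale-mark's row.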
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.
2015-10-01
Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we present the two main components of automatic calibration. The first is the intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length and camera height; this is important for the conversion from pixels to meters and vice versa. The second is the inter-camera topology inference, which leads to an estimate of the distance between cameras; this is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
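Once tilt angle, focal length and camera height are known, the pixel-to-meter conversion on the ground plane follows from simple ray geometry. A hedged sketch (our formula, assuming zero roll and flat ground, not taken from the paper):

```python
import math

def ground_distance(v, v0, f_px, tilt_rad, cam_height):
    """Horizontal distance (m) to the ground point imaged at pixel row v,
    for a camera at height cam_height (m) tilted tilt_rad below horizontal.
    v0 is the principal-point row, f_px the focal length in pixels."""
    ray = tilt_rad + math.atan((v - v0) / f_px)  # ray angle below horizon
    return cam_height / math.tan(ray)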
The research on calibration methods of dual-CCD laser three-dimensional human face scanning system
NASA Astrophysics Data System (ADS)
Wang, Jinjiang; Chang, Tianyu; Ge, Baozhen; Tian, Qingguo; Yang, Fengting; Shi, Shendong
2013-09-01
In this paper, building on the performance advantages of the two-step method, we combine the stereo matching of binocular stereo vision with active laser scanning to calibrate the system. First, we select a reference camera coordinate system as the world coordinate system and unify the coordinates of the two CCD cameras. We then obtain the new perspective projection matrix (PPM) of each camera after epipolar rectification, from which the corresponding epipolar equations of the two cameras can be defined. Using the trigonometric parallax method, we measure space point positions after distortion correction and achieve stereo matching calibration between two image points. Experiments verify that this method improves accuracy while guaranteeing system stability. The stereo matching calibration has a simple, low-cost process and simplifies regular maintenance work. It can acquire 3D coordinates with only a planar checkerboard calibration, without the need to design a specific standard target or use an electronic theodolite. During the experiments it was found that two-step calibration error and lens distortion lead to stratification of the point cloud data. The proposed calibration method, which combines active line laser scanning with binocular stereo vision, has the advantages of both and more flexible applicability. Theoretical analysis and experiments show that the method is reasonable.
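The triangulation step implied by the trigonometric parallax method can be sketched with the standard linear (DLT) two-view triangulation (our implementation, written for general projection matrices rather than the authors' rectified PPMs):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views with
    3x4 projection matrices P1, P2 and pixel observations x1, x2."""
    # Each observation contributes two rows of the homogeneous system A X = 0
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # dehomogenize
```

With rectified cameras this reduces to the familiar depth-from-disparity relation, but the general form works for any calibrated pair.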
Improved calibration-based non-uniformity correction method for uncooled infrared camera
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao
2017-08-01
With the latest improvements of microbolometer focal plane arrays (FPA), uncooled infrared (IR) cameras are becoming the most widely used devices in thermography, especially in handheld devices. However, the influences of changing ambient conditions and the non-uniform response of the sensors make it more difficult to correct the non-uniformity of an uncooled infrared camera. In this paper, based on the infrared radiation characteristics of a TEC-less uncooled infrared camera, a novel model is proposed for calibration-based non-uniformity correction (NUC). In this model, we introduce the FPA temperature, together with the responses of the microbolometers under different ambient temperatures, to calculate the correction parameters. Based on the proposed model, the correction parameters are worked out from calibration measurements under controlled ambient conditions with a uniform blackbody. All correction parameters are determined after the calibration process and can then be used to correct the non-uniformity of the infrared camera in real time. This paper presents the details of the compensation procedure and the performance of the proposed calibration-based non-uniformity correction method. Our method was evaluated on realistic IR images obtained by a 384×288-pixel uncooled long-wave infrared (LWIR) camera operated under changing ambient conditions. The results show that our method can exclude the influence of changing ambient conditions and ensure that the infrared camera has stable performance.
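For context, the classic baseline that such models extend is the two-point NUC, which fits a per-pixel gain and offset from two uniform blackbody frames. A minimal sketch (the simple textbook scheme, not the paper's FPA-temperature-dependent model):

```python
import numpy as np

def two_point_nuc(frame, low, high, t_low, t_high):
    """Classic two-point non-uniformity correction: per-pixel gain and
    offset derived from two uniform blackbody frames recorded at
    temperatures t_low and t_high; maps raw counts to temperature units."""
    gain = (t_high - t_low) / (high - low)   # per-pixel gain
    offset = t_low - gain * low              # per-pixel offset
    return gain * frame + offset
```

The paper's contribution is precisely that this fixed gain/offset drifts with ambient and FPA temperature, which their extended model compensates.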
Geometric calibration of Colour and Stereo Surface Imaging System of ESA's Trace Gas Orbiter
NASA Astrophysics Data System (ADS)
Tulyakov, Stepan; Ivanov, Anton; Thomas, Nicolas; Roloff, Victoria; Pommerol, Antoine; Cremonese, Gabriele; Weigel, Thomas; Fleuret, Francois
2018-01-01
There are many geometric calibration methods for "standard" cameras. These methods, however, cannot be used for the calibration of telescopes with large focal lengths and complex off-axis optics, and specialized calibration methods for telescopes are scarce in the literature. We describe the calibration method that we developed for the Colour and Stereo Surface Imaging System (CaSSIS) telescope on board the ExoMars Trace Gas Orbiter (TGO). Although our method is described in the context of CaSSIS, with camera-specific experiments, it is general and can be applied to other telescopes. We further encourage re-use of the proposed method by making our calibration code and data available online.
Transient full-field vibration measurement using spectroscopical stereo photogrammetry.
Yue, Kaiduan; Li, Zhongke; Zhang, Ming; Chen, Shan
2010-12-20
In contrast to other vibration measurement methods, a novel spectroscopical photogrammetric approach is proposed. Two colored light filters and a CCD color camera are used to fulfill the function of two traditional cameras. A new calibration method is then presented. It focuses on the vibrating object rather than the camera and has the advantage of higher accuracy than traditional camera calibration. The test results show an accuracy of 0.02 mm.
NASA Astrophysics Data System (ADS)
Stavroulakis, Petros I.; Chen, Shuxiao; Sims-Waterhouse, Danny; Piano, Samanta; Southon, Nicholas; Bointon, Patrick; Leach, Richard
2017-06-01
In non-rigid fringe projection 3D measurement systems, where either the camera or projector setup can change significantly between measurements or the object needs to be tracked, self-calibration has to be carried out frequently to keep the measurements accurate. In fringe projection systems, it is common to use methods developed initially for photogrammetry to calibrate the camera(s) in the system in terms of extrinsic and intrinsic parameters. To calibrate the projector(s), an extra correspondence between a pre-calibrated camera and an image created by the projector is established. These recalibration steps are usually time consuming and involve measuring calibrated patterns on planes before measurement of the actual object can continue after a camera or projector has been moved; hence they do not facilitate fast 3D measurement of objects when frequent changes to the experimental setup are necessary. By employing and combining a priori information via inverse rendering, on-board sensors, deep learning and a graphics processing unit (GPU), we assess a fine camera pose estimation method which is based on optimising the rendering of a model of the scene and the object to match the view from the camera. We find that the success of this calibration pipeline can be greatly improved by using adequate a priori information from the aforementioned sources.
NASA Astrophysics Data System (ADS)
Hanel, A.; Stilla, U.
2017-05-01
Vehicle environment cameras observing traffic participants in the area around a car and interior cameras observing the car driver are important data sources for driver intention recognition algorithms. To combine information from both camera groups, a camera system calibration can be performed. Typically, there is no overlapping field-of-view between environment and interior cameras, and marked reference points are often unavailable in environments large enough to cover a car for the system calibration. In this contribution, a calibration method for a vehicle camera system with non-overlapping camera groups in an urban environment is described. A priori images of an urban calibration environment taken with an external camera are processed with the structure-from-motion method to obtain an environment point cloud. Images of the vehicle interior, also taken with an external camera, are processed to obtain an interior point cloud. Both point clouds are tied to each other with images from both image sets showing the same real-world objects. The point clouds are transformed into a self-defined vehicle coordinate system describing the vehicle movement. On demand, videos can be recorded with the vehicle cameras in a calibration drive. Poses of vehicle environment cameras and interior cameras are estimated separately using ground control points from the respective point cloud. All poses of a vehicle camera estimated for different video frames are optimized in a bundle adjustment. In an experiment, a point cloud is created from images of an underground car park, as well as a point cloud of the interior of a Volkswagen test car. Videos of two environment cameras and one interior camera are recorded. Results show that the vehicle camera poses are estimated successfully, especially when the car is not moving. Position standard deviations in the centimeter range can be achieved for all vehicle cameras. 
Relative distances between the vehicle cameras deviate between one and ten centimeters from tachymeter reference measurements.
Automatic Calibration of Stereo-Cameras Using Ordinary Chess-Board Patterns
NASA Astrophysics Data System (ADS)
Prokos, A.; Kalisperakis, I.; Petsa, E.; Karras, G.
2012-07-01
Automation of camera calibration is facilitated by recording coded 2D patterns. Our toolbox for automatic camera calibration using images of simple chess-board patterns is freely available on the Internet. But it is unsuitable for stereo-cameras, whose calibration implies recovering camera geometry and their true-to-scale relative orientation. In contrast to all reported methods requiring additional specific coding to establish an object space coordinate system, a toolbox for automatic stereo-camera calibration relying on ordinary chess-board patterns is presented here. First, the camera calibration algorithm is applied to all image pairs of the pattern to extract nodes of known spacing, order them in rows and columns, and estimate two independent camera parameter sets. The actual node correspondences on stereo-pairs remain unknown. Image pairs of a textured 3D scene are exploited for finding the fundamental matrix of the stereo-camera by applying RANSAC to point matches established with the SIFT algorithm. A node is then selected near the centre of the left image; its match on the right image is assumed to be the node closest to the corresponding epipolar line. This yields matches for all nodes (since these have already been ordered), which should also satisfy the 2D epipolar geometry. Measures for avoiding mismatching are taken. With automatically estimated initial orientation values, a bundle adjustment is performed constraining all pairs to a common (scaled) relative orientation. Ambiguities regarding the actual exterior orientations of the stereo-camera with respect to the pattern are irrelevant. Results from this automatic method show typical precisions no worse than 1/4 pixel for 640×480 web cameras.
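The node-matching step above, picking the chess-board node nearest the corresponding epipolar line, can be sketched in a few lines. The function names and the rectified-stereo fundamental matrix used in the test are illustrative assumptions, not taken from the toolbox itself:

```python
import math

def epipolar_distance(F, x_left, x_right):
    """Distance (pixels) from a right-image point to the epipolar line
    l = F x of a left-image point. F is a 3x3 fundamental matrix given
    as a list of lists; points are (x, y) pixel tuples."""
    xl = (x_left[0], x_left[1], 1.0)
    # Epipolar line in the right image: l = F x, with l = (a, b, c)
    a, b, c = (sum(F[i][j] * xl[j] for j in range(3)) for i in range(3))
    # Point-to-line distance |a u + b v + c| / sqrt(a^2 + b^2)
    return abs(a * x_right[0] + b * x_right[1] + c) / math.hypot(a, b)

def match_node(F, node_left, candidates_right):
    """Pick the right-image node closest to the epipolar line of node_left."""
    return min(candidates_right, key=lambda p: epipolar_distance(F, node_left, p))
```

Because the extracted nodes are already ordered in rows and columns, a single reliable match like this propagates to all remaining nodes, as the abstract describes.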
Fast calibration of electromagnetically tracked oblique-viewing rigid endoscopes.
Liu, Xinyang; Rice, Christina E; Shekhar, Raj
2017-10-01
The oblique-viewing (i.e., angled) rigid endoscope is a commonly used tool in conventional endoscopic surgeries. The relative rotation between its two moveable parts, the telescope and the camera head, creates a rotation offset between an actual object and its projection in the camera image. A calibration method tailored to compensate for this offset is needed. We developed a fast calibration method for oblique-viewing rigid endoscopes suitable for clinical use. In contrast to prior approaches based on optical tracking, we used electromagnetic (EM) tracking as the external tracking hardware to improve compactness and practicality. Two EM sensors were mounted on the telescope and the camera head, respectively, with considerations to minimize EM tracking errors. Single-image calibration was incorporated into the method, and a sterilizable plate, laser-marked with the calibration pattern, was also developed. Furthermore, we proposed a general algorithm to estimate the rotation center in the camera image. Formulas for updating the camera matrix in terms of clockwise and counterclockwise rotations were also developed. The proposed calibration method was validated using a conventional [Formula: see text], 5-mm laparoscope. Freehand calibrations were performed using the proposed method, and the calibration time averaged 2 min and 8 s. The calibration accuracy was evaluated in a simulated clinical setting with several surgical tools present in the magnetic field of EM tracking. The root-mean-square re-projection error averaged 4.9 pixels (range 2.4-8.5 pixels, with image resolution of [Formula: see text]) for rotation angles ranging from [Formula: see text] to [Formula: see text]. We developed a method for fast and accurate calibration of oblique-viewing rigid endoscopes. The method was also designed to be performed in the operating room and will therefore support clinical translation of many emerging endoscopic computer-assisted surgical systems.
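The rotation offset the authors compensate can be pictured as image points revolving about the estimated rotation center as the telescope turns relative to the camera head. A minimal sketch of that picture follows; the function name and sign convention are assumptions for illustration, and the paper's actual camera-matrix update formulas are more involved:

```python
import math

def rotate_about_center(point, center, angle_deg, clockwise=True):
    """Rotate an image point about the estimated rotation center.

    Models the telescope/camera-head relative rotation of an angled
    scope: the projection of a fixed object moves on a circle around
    the rotation center in the image. Image y typically grows downward,
    so the on-screen sense of "clockwise" depends on this convention.
    """
    theta = math.radians(angle_deg) * (-1.0 if clockwise else 1.0)
    dx, dy = point[0] - center[0], point[1] - center[1]
    return (center[0] + dx * math.cos(theta) - dy * math.sin(theta),
            center[1] + dx * math.sin(theta) + dy * math.cos(theta))
```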
Omnidirectional Underwater Camera Design and Calibration
Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David
2015-01-01
This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
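The explicit modeling of individual optical rays mentioned above hinges on refraction at each housing interface. A minimal vector-form Snell's law step, of the kind such a ray-tracing simulator might apply at a flat port, is sketched below; the function name and conventions are assumptions, not the authors' code:

```python
import math

def refract(d, n, n1, n2):
    """Refract a unit direction d at a flat interface with unit normal n.

    n points back toward the incoming side (so d . n < 0); n1 and n2 are
    the refractive indices before and after the interface. Returns the
    refracted unit direction, or None on total internal reflection.
    Vector form of Snell's law; caller must pass unit vectors.
    """
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    r = n1 / n2
    k = 1.0 - r * r * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection
    coef = r * cos_i - math.sqrt(k)
    return tuple(r * di + coef * ni for di, ni in zip(d, n))
```

Tracing each pixel's ray through glass and water this way, instead of assuming a single pinhole, is what lets the simulator predict the effective underwater FOV for a candidate housing geometry.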
Researches on hazard avoidance cameras calibration of Lunar Rover
NASA Astrophysics Data System (ADS)
Li, Chunyan; Wang, Li; Lu, Xin; Chen, Jihua; Fan, Shenghong
2017-11-01
China's Lunar Lander and Rover will be launched in 2013 to accomplish the mission goals of lunar soft landing and patrol exploration. The Lunar Rover has a forward-facing stereo camera pair (Hazcams) for hazard avoidance. Hazcam calibration is essential for stereo vision. The Hazcam optics are f-theta fish-eye lenses with a 120°×120° horizontal/vertical field of view (FOV) and a 170° diagonal FOV. They introduce significant distortion and the acquired images are quite warped, which makes conventional camera calibration algorithms no longer work well. A photogrammetric calibration method for the geometric model of this type of fish-eye optics is investigated in this paper. In the method, the Hazcam model is represented by collinearity equations with interior orientation and exterior orientation parameters [1] [2]. For high-precision applications, the accurate calibration model is formulated with the radial symmetric distortion and the decentering distortion, as well as parameters to model affinity and shear, based on the fisheye deformation model [3] [4]. The proposed method has been applied to the stereo camera calibration system for the Lunar Rover.
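The calibration difficulty stems from the f-theta (equidistant) projection, in which the image radius grows linearly with the ray angle rather than with its tangent. A sketch of the two ideal models for comparison (illustrative only; distortion terms and the paper's full collinearity formulation are omitted):

```python
import math

def ftheta_radius(theta, f):
    """f-theta (equidistant) fisheye model: image radius r = f * theta,
    where theta is the angle between the ray and the optical axis."""
    return f * theta

def pinhole_radius(theta, f):
    """Perspective (pinhole) model for comparison: r = f * tan(theta).
    Diverges as theta approaches 90 deg, so it cannot represent the
    170-deg diagonal FOV (theta up to 85 deg) of a Hazcam lens."""
    return f * math.tan(theta)
```

At the 85° half-angle of the diagonal FOV, the pinhole radius is several times the f-theta radius, which is why pinhole-based calibration breaks down for these lenses.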
An automated calibration method for non-see-through head mounted displays.
Gilson, Stuart J; Fitzgibbon, Andrew W; Glennerster, Andrew
2011-08-15
Accurate calibration of a head mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet existing calibration methods are time-consuming and depend on human judgements, making them error-prone, and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements. Copyright © 2011 Elsevier B.V. All rights reserved.
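The reported error metric, the distance between real image features and features reprojected through the recovered model, is conventionally summarized as an RMS value. A generic sketch (not the authors' code; function name assumed):

```python
import math

def rms_reprojection_error(observed, reprojected):
    """RMS distance (pixels) between observed image features and the
    same features reprojected through the recovered camera model.
    Both arguments are equal-length lists of (u, v) pixel tuples."""
    assert len(observed) == len(reprojected)
    sq = [(u - up) ** 2 + (v - vp) ** 2
          for (u, v), (up, vp) in zip(observed, reprojected)]
    return math.sqrt(sum(sq) / len(sq))
```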
Calibration of asynchronous smart phone cameras from moving objects
NASA Astrophysics Data System (ADS)
Hagen, Oksana; Istenič, Klemen; Bharti, Vibhav; Dhali, Maruf Ahmed; Barmaimon, Daniel; Houssineau, Jérémie; Clark, Daniel
2015-04-01
Calibrating multiple cameras is a fundamental prerequisite for many computer vision applications. Typically this involves using a pair of identical synchronized industrial or high-end consumer cameras. This paper considers an application on a pair of low-cost portable cameras with different parameters that are found in smart phones. The paper addresses the issues of acquisition, detection of moving objects, dynamic camera registration and tracking of an arbitrary number of targets. The acquisition of data is performed using two standard smart phone cameras and later processed using detections of moving objects in the scene. The registration of the cameras onto the same world reference frame is performed using a recently developed method for camera calibration using a disparity space parameterisation and the single-cluster PHD filter.
NASA Technical Reports Server (NTRS)
Gennery, D. B.
1998-01-01
A method is described for calibrating cameras including radial lens distortion, by using known points such as those measured from a calibration fixture. The distortion terms are relative to the optical axis, which is included in the model so that it does not have to be orthogonal to the image sensor plane.
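The radial lens distortion mentioned here is typically modeled as a polynomial in the squared distance from the point where the optical axis meets the image. A minimal two-term sketch for illustration (this is not Gennery's full model, which additionally allows the optical axis to be non-orthogonal to the sensor plane):

```python
def apply_radial_distortion(x, y, k1, k2=0.0):
    """Apply polynomial radial distortion to normalized image
    coordinates (x, y), measured relative to the optical-axis
    intersection point. Classic two-term model:
        x_d = x * (1 + k1*r^2 + k2*r^4),  r^2 = x^2 + y^2
    and likewise for y_d. Negative k1 gives barrel distortion."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

Calibration then amounts to estimating k1, k2 (and the axis location) so that known fixture points, pushed through this model, land on their measured image positions.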
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu Feipeng; Shi Hongjian; Bai Pengxiang
In fringe projection, the CCD camera and the projector are often placed at equal height. In this paper, we study the calibration of an unequal arrangement of the CCD camera and the projector. The principle of fringe projection with two-dimensional digital image correlation to acquire the profile of an object surface is described in detail. By formula derivation and experiment, a linear relationship between the out-of-plane calibration coefficient and the y coordinate is clearly found. To acquire the three-dimensional (3D) information of an object correctly, this paper presents an effective calibration method with linear least-squares fitting, which is very simple in principle and calibration. Experiments are implemented to validate the availability and reliability of the calibration method.
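The linear least-squares fit of the out-of-plane calibration coefficient against the image y coordinate has a simple closed form. A minimal sketch (function name assumed; not the authors' code):

```python
def fit_line(ys, cs):
    """Ordinary least-squares fit of c = a*y + b.

    Models the reported linear relationship between the out-of-plane
    calibration coefficient c and the image y coordinate when the
    camera and projector are at unequal heights. Returns (a, b).
    """
    n = len(ys)
    my = sum(ys) / n                      # mean of y
    mc = sum(cs) / n                      # mean of c
    syy = sum((y - my) ** 2 for y in ys)  # centered sum of squares
    syc = sum((y - my) * (c - mc) for y, c in zip(ys, cs))
    a = syc / syy
    return a, mc - a * my
```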
Calibration Methods for a 3D Triangulation Based Camera
NASA Astrophysics Data System (ADS)
Schulz, Ulrike; Böhnke, Kay
A sensor in a camera captures a gray-level image (1536 × 512 pixels) of light reflected by a reference body, which is illuminated by a linear laser line. This gray-level image can be used for a 3D calibration. The following paper describes how a calibration program calculates the calibration factors. The calibration factors serve to determine the size of an unknown reference body.
Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking
NASA Technical Reports Server (NTRS)
Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.
2005-01-01
This paper presents three technical elements that we have developed to improve the accuracy of visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues were raised. Mast camera frames are in general not parallel to the masthead base frame. Further, the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we managed to derive non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on lower-resolution Hazcam for Navcam-to-Hazcam handoff.
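Setting aside the two complications the paper solves exactly (camera frames not parallel to the masthead base frame, and an optical axis that misses the image center), the idealized pan/tilt pointing solution reduces to two arctangents. A simplified sketch for intuition only, with an assumed frame convention (x forward, y left, z up):

```python
import math

def pan_tilt_to_target(target):
    """Closed-form pan/tilt angles (radians) that aim an idealized
    camera boresight at a 3D target given in the masthead base frame.

    Simplified sketch: assumes the camera frame is aligned with the
    masthead frame and the optical axis passes through the image
    center, neither of which holds in general (as the paper notes).
    """
    x, y, z = target
    pan = math.atan2(y, x)                 # rotate about the vertical axis
    tilt = math.atan2(z, math.hypot(x, y))  # then elevate toward the target
    return pan, tilt
```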
Structured light system calibration method with optimal fringe angle.
Li, Beiwen; Zhang, Song
2014-11-20
For structured light system calibration, one popular approach is to treat the projector as an inverse camera. This is usually performed by projecting horizontal and vertical sequences of patterns to establish one-to-one mapping between camera points and projector points. However, for a well-designed system, either horizontal or vertical fringe images are not sensitive to depth variation and thus yield inaccurate mapping. As a result, the calibration accuracy is jeopardized if a conventional calibration method is used. To address this limitation, this paper proposes a novel calibration method based on optimal fringe angle determination. Experiments demonstrate that our calibration approach can increase the measurement accuracy up to 38% compared to the conventional calibration method with a calibration volume of 300(H) mm×250(W) mm×500(D) mm.
3D Surface Reconstruction and Automatic Camera Calibration
NASA Technical Reports Server (NTRS)
Jalobeanu, Andre
2004-01-01
This view-graph presentation illustrates a Bayesian approach to 3D surface reconstruction and camera calibration. Existing methods, surface analysis and modeling, preliminary surface reconstruction results, and potential applications are addressed.
Automatic camera to laser calibration for high accuracy mobile mapping systems using INS
NASA Astrophysics Data System (ADS)
Goeman, Werner; Douterloigne, Koen; Gautama, Sidharta
2013-09-01
A mobile mapping system (MMS) is a mobile multi-sensor platform developed by the geoinformation community to support the acquisition of huge amounts of geodata in the form of georeferenced high resolution images and dense laser clouds. Since data fusion and data integration techniques are increasingly able to combine the complementary strengths of different sensor types, the external calibration of a camera to a laser scanner is a common pre-requisite on today's mobile platforms. The methods of calibration, nevertheless, are often relatively poorly documented, are almost always time-consuming, demand expert knowledge and often require a carefully constructed calibration environment. A new methodology is studied and explored to provide a high quality external calibration for a pinhole camera to a laser scanner which is automatic, easy to perform, robust and foolproof. The method presented here, uses a portable, standard ranging pole which needs to be positioned on a known ground control point. For calibration, a well studied absolute orientation problem needs to be solved. In many cases, the camera and laser sensor are calibrated in relation to the INS system. Therefore, the transformation from camera to laser contains the cumulated error of each sensor in relation to the INS. Here, the calibration of the camera is performed in relation to the laser frame using the time synchronization between the sensors for data association. In this study, the use of the inertial relative movement will be explored to collect more useful calibration data. This results in a better intersensor calibration allowing better coloring of the clouds and a more accurate depth mask for images, especially on the edges of objects in the scene.
Timing Calibration in PET Using a Time Alignment Probe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moses, William W.; Thompson, Christopher J.
2006-05-05
We evaluate the Scanwell Time Alignment Probe for performing the timing calibration for the LBNL Prostate-Specific PET Camera. We calibrate the time delay correction factors for each detector module in the camera using two methods: using the Time Alignment Probe (which measures the time difference between the probe and each detector module) and using the conventional method (which measures the timing difference between all module-module combinations in the camera). These correction factors, which are quantized in 2 ns steps, are compared on a module-by-module basis. The values are in excellent agreement: of the 80 correction factors, 62 agree exactly, 17 differ by 1 step, and 1 differs by 2 steps. We also measure on-time and off-time counting rates when the two sets of calibration factors are loaded into the camera and find that they agree within statistical error. We conclude that the performance using the Time Alignment Probe and conventional methods is equivalent.
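The module-by-module comparison of the 2 ns-quantized correction factors can be sketched as follows; the function names are assumptions for illustration, not the authors' code:

```python
STEP_NS = 2.0  # correction factors are quantized in 2 ns steps

def quantize(delay_ns):
    """Round a measured time-delay correction to the nearest 2 ns step.
    (Python's round() resolves exact half-step ties to even.)"""
    return round(delay_ns / STEP_NS) * STEP_NS

def step_difference(probe_ns, conventional_ns):
    """Compare two calibrations of the same module, in whole 2 ns steps:
    0 means exact agreement, 1 means the factors differ by one step."""
    return int(abs(quantize(probe_ns) - quantize(conventional_ns)) / STEP_NS)
```

Tallying `step_difference` over all 80 modules yields the agreement histogram the abstract reports (62 exact, 17 off by one step, 1 off by two).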
Visual control of robots using range images.
Pomares, Jorge; Gil, Pablo; Torres, Fernando
2010-01-01
In recent years, 3D-vision systems based on the time-of-flight (ToF) principle have gained importance as a means of obtaining 3D information from the workspace. In this paper, an analysis of the use of 3D ToF cameras to guide a robot arm is performed. To do so, an adaptive method for simultaneous visual servo control and camera calibration is presented. Using this method a robot arm is guided by range information obtained from a ToF camera. Furthermore, the self-calibration method obtains the adequate integration time for the range camera in order to precisely determine the depth information.
Extrinsic Calibration of Camera and 2D Laser Sensors without Overlap
Ahmad Yousef, Khalil M; Mohd, Bassam J; Al-Widyan, Khalid; Hayajneh, Thaier
2017-01-01
Extrinsic calibration of a camera and a 2D laser range finder (lidar) is crucial in sensor data fusion applications; for example, SLAM algorithms used in mobile robot platforms. The fundamental challenge of extrinsic calibration arises when the camera-lidar sensors do not overlap or share the same field of view. In this paper we propose a novel and flexible approach for the extrinsic calibration of a camera-lidar system without overlap, which can be used for robotic platform self-calibration. The approach is based on the robot–world hand–eye calibration (RWHE) problem, proven to have efficient and accurate solutions. First, the system was mapped to the RWHE calibration problem modeled as the linear relationship AX=ZB, where X and Z are unknown calibration matrices. Then, we computed the transformation matrix B, which was the main challenge in the above mapping. The computation is based on reasonable assumptions about the geometric structure of the calibration environment. The reliability and accuracy of the proposed approach is compared to a state-of-the-art method in extrinsic 2D lidar to camera calibration. Experimental results from real datasets indicate that the proposed approach provides better results, with L2-norm translational and rotational deviations of 314 mm and 0.12° respectively. PMID:29036905
Kwon, Young-Hoo; Casebolt, Jeffrey B
2006-01-01
One of the most serious obstacles to accurate quantification of the underwater motion of a swimmer's body is image deformation caused by refraction. Refraction occurs at the water-air interface plane (glass) owing to the density difference. Camera calibration-reconstruction algorithms commonly used in aquatic research do not have the capability to correct this refraction-induced nonlinear image deformation and produce large reconstruction errors. The aim of this paper is to provide a thorough review of: the nature of the refraction-induced image deformation and its behaviour in underwater object-space plane reconstruction; the intrinsic shortcomings of the Direct Linear Transformation (DLT) method in underwater motion analysis; experimental conditions that interact with refraction; and alternative algorithms and strategies that can be used to improve the calibration-reconstruction accuracy. Although it is impossible to remove the refraction error completely in conventional camera calibration-reconstruction methods, it is possible to improve the accuracy to some extent by manipulating experimental conditions or calibration frame characteristics. Alternative algorithms, such as the localized DLT and the double-plane method, are also available for error reduction. The ultimate solution for the refraction problem is to develop underwater camera calibration and reconstruction algorithms that have the capability to correct refraction.
NASA Technical Reports Server (NTRS)
Wolf, M. B.
1981-01-01
The determination and removal of instrument signature from Viking Lander camera geometric data are described. All tests conducted as well as a listing of the final database (calibration constants) used to remove instrument signature from Viking Lander flight images are included. The theory of the geometric aberrations inherent in the Viking Lander camera is explored.
A New Calibration Method for Commercial RGB-D Sensors.
Darwish, Walid; Tang, Shenjun; Li, Wenbin; Chen, Wu
2017-05-24
Commercial RGB-D sensors such as Kinect and Structure Sensors have been widely used in the game industry, where geometric fidelity is not of utmost importance. For applications in which high-quality 3D is required, e.g., 3D building models of centimeter-level accuracy, accurate and reliable calibrations of these sensors are required. This paper presents a new model for calibrating the depth measurements of RGB-D sensors based on the structured light concept. Additionally, a new automatic method is proposed for the calibration of all RGB-D parameters, including internal calibration parameters for all cameras, the baseline between the infrared and RGB cameras, and the depth error model. When compared with traditional calibration methods, this new model shows a significant improvement in depth precision for both near and far ranges.
Calibration of RGBD camera and cone-beam CT for 3D intra-operative mixed reality visualization.
Lee, Sing Chun; Fuerst, Bernhard; Fotouhi, Javad; Fischer, Marius; Osgood, Greg; Navab, Nassir
2016-06-01
This work proposes a novel algorithm to register cone-beam computed tomography (CBCT) volumes and 3D optical (RGBD) camera views. The co-registered real-time RGBD camera and CBCT imaging enable a novel augmented reality solution for orthopedic surgeries, which allows arbitrary views using digitally reconstructed radiographs overlaid on the reconstructed patient's surface without the need to move the C-arm. An RGBD camera is rigidly mounted on the C-arm near the detector. We introduce a calibration method based on the simultaneous reconstruction of the surface and the CBCT scan of an object. The transformation between the two coordinate spaces is recovered using Fast Point Feature Histogram descriptors and the Iterative Closest Point algorithm. Several experiments are performed to assess the repeatability and the accuracy of this method. Target registration error is measured on multiple visual and radio-opaque landmarks to evaluate the accuracy of the registration. Mixed reality visualizations from arbitrary angles are also presented for simulated orthopedic surgeries. To the best of our knowledge, this is the first calibration method which uses only tomographic and RGBD reconstructions. This means that the method does not impose a particular shape of the phantom. We demonstrate a marker-less calibration of CBCT volumes and 3D depth cameras, achieving reasonable registration accuracy. This design requires a one-time factory calibration, is self-contained, and could be integrated into existing mobile C-arms to provide real-time augmented reality views from arbitrary angles.
Metric Calibration of a Focused Plenoptic Camera Based on a 3d Calibration Target
NASA Astrophysics Data System (ADS)
Zeller, N.; Noury, C. A.; Quint, F.; Teulière, C.; Stilla, U.; Dhome, M.
2016-06-01
In this paper we present a new calibration approach for focused plenoptic cameras. We derive a new mathematical projection model of a focused plenoptic camera which considers lateral as well as depth distortion. To this end, we derive a new depth distortion model directly from the theory of depth estimation in a focused plenoptic camera. In total the model consists of five intrinsic parameters, the parameters for radial and tangential distortion in the image plane, and two new depth distortion parameters. In the proposed calibration we perform a complete bundle adjustment based on a 3D calibration target. The residual of our optimization approach is three-dimensional, where the depth residual is defined by a scaled version of the inverse virtual depth difference and thus conforms well to the measured data. Our method is evaluated on different camera setups and shows good accuracy. For a better characterization of our approach we evaluate the accuracy of virtual image points projected back to 3D space.
Non-contact measurement of rotation angle with solo camera
NASA Astrophysics Data System (ADS)
Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun
2015-02-01
To measure a rotation angle around the axis of an object, a non-contact rotation angle measurement method based on a single camera is proposed. The intrinsic parameters of the camera were calibrated using a chessboard pattern based on planar calibration theory. The translation matrix and rotation matrix between the object coordinate system and the camera coordinate system were calculated from the relationship between the positions of corners on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed ±0.01°.
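Once the rotation matrix between object and camera is recovered, the net rotation angle follows from its trace via trace(R) = 1 + 2 cos θ. A minimal sketch (function name assumed; the axis of rotation is ignored here):

```python
import math

def rotation_angle_deg(R):
    """Net rotation angle (degrees) encoded in a 3x3 rotation matrix,
    using trace(R) = 1 + 2*cos(theta). Clamping guards against values
    marginally outside [-1, 1] from floating-point round-off."""
    trace = R[0][0] + R[1][1] + R[2][2]
    c = max(-1.0, min(1.0, (trace - 1.0) / 2.0))
    return math.degrees(math.acos(c))
```

Differencing this angle between two poses of the object gives the rotation the method measures against the angle dividing table.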
Calibration of stereo rigs based on the backward projection process
NASA Astrophysics Data System (ADS)
Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui; Zhao, Zixin
2016-08-01
High-accuracy 3D measurement based on binocular vision system is heavily dependent on the accurate calibration of two rigidly-fixed cameras. In most traditional calibration methods, stereo parameters are iteratively optimized through the forward imaging process (FIP). However, the results can only guarantee the minimal 2D pixel errors, but not the minimal 3D reconstruction errors. To address this problem, a simple method to calibrate a stereo rig based on the backward projection process (BPP) is proposed. The position of a spatial point can be determined separately from each camera by planar constraints provided by the planar pattern target. Then combined with pre-defined spatial points, intrinsic and extrinsic parameters of the stereo-rig can be optimized by minimizing the total 3D errors of both left and right cameras. An extensive performance study for the method in the presence of image noise and lens distortions is implemented. Experiments conducted on synthetic and real data demonstrate the accuracy and robustness of the proposed method.
Panorama parking assistant system with improved particle swarm optimization method
NASA Astrophysics Data System (ADS)
Cheng, Ruzhong; Zhao, Yong; Li, Zhichao; Jiang, Weigang; Wang, Xin'an; Xu, Yong
2013-10-01
A panorama parking assistant system (PPAS) for the automotive aftermarket together with a practical improved particle swarm optimization method (IPSO) are proposed in this paper. In the PPAS system, four fisheye cameras are installed in the vehicle with different views, and four channels of video frames captured by the cameras are processed as a 360-deg top-view image around the vehicle. Besides the embedded design of PPAS, the key problem for image distortion correction and mosaicking is the efficiency of parameter optimization in the process of camera calibration. In order to address this problem, an IPSO method is proposed. Compared with other parameter optimization methods, the proposed method allows a certain range of dynamic change for the intrinsic and extrinsic parameters, and can exploit only one reference image to complete all of the optimization; therefore, the efficiency of the whole camera calibration is increased. The PPAS is commercially available, and the IPSO method is a highly practical way to increase the efficiency of the installation and the calibration of PPAS in automobile 4S shops.
Performance evaluation and clinical applications of 3D plenoptic cameras
NASA Astrophysics Data System (ADS)
Decker, Ryan; Shademan, Azad; Opfermann, Justin; Leonard, Simon; Kim, Peter C. W.; Krieger, Axel
2015-06-01
The observation and 3D quantification of arbitrary scenes using optical imaging systems is challenging, but increasingly necessary in many fields. This paper provides a technical basis for the application of plenoptic cameras in medical and medical robotics applications, and rigorously evaluates camera integration and performance in the clinical setting. It discusses plenoptic camera calibration and setup, assesses plenoptic imaging in a clinically relevant context, and in the context of other quantitative imaging technologies. We report the methods used for camera calibration, precision and accuracy results in an ideal and simulated surgical setting. Afterwards, we report performance during a surgical task. Test results showed the average precision of the plenoptic camera to be 0.90 mm, increasing to 1.37 mm for tissue across the calibrated FOV. The ideal accuracy was 1.14 mm. The camera showed submillimeter error during a simulated surgical task.
Auto-calibration of GF-1 WFV images using flat terrain
NASA Astrophysics Data System (ADS)
Zhang, Guo; Xu, Kai; Huang, Wenchao
2017-12-01
Four wide field view (WFV) cameras with 16-m multispectral medium resolution and a combined swath of 800 km are onboard the Gaofen-1 (GF-1) satellite, which can increase the revisit frequency to less than 4 days and enable large-scale land monitoring. The detection and elimination of WFV camera distortions is key for subsequent applications. Due to the wide swath of WFV images, geometric calibration using either conventional methods based on the ground control field (GCF) or GCF-independent methods is problematic. This is predominantly because current GCFs in China fail to cover the whole WFV image, and most GCF-independent methods are designed for close-range photogrammetry or computer vision. This study proposes an auto-calibration method using flat terrain to detect nonlinear distortions in GF-1 WFV images. First, a classic geometric calibration model is built for the GF-1 WFV camera, and at least two overlapping images covering flat terrain are collected; then the residuals between the true elevation and that calculated by forward intersection are used to solve for the nonlinear distortion parameters of the WFV images. Experiments demonstrate that the orientation accuracy of the proposed method, evaluated with GCF check points (CPs), is within 0.6 pixel, and the residual errors manifest as random errors. Validation using Google Earth CPs further proves the effectiveness of the auto-calibration: the whole scene is free of visible distortion compared with images processed without calibration parameters. The orientation accuracies of the proposed method and the GCF method are compared; the maximum difference is approximately 0.3 pixel, and the factors behind this discrepancy are analyzed. Generally, this method can effectively compensate for distortions in the GF-1 WFV camera.
NASA Astrophysics Data System (ADS)
Li, J.; Wu, Z.; Wei, X.; Zhang, Y.; Feng, F.; Guo, F.
2018-04-01
Cross-calibration has the advantages of high precision, low resource requirements and simple implementation, and it has been widely used in recent years. The four wide-field-of-view (WFV) cameras onboard the Gaofen-1 satellite provide high spatial resolution and wide combined coverage (4 × 200 km) without onboard calibration. In this paper, the four-band radiometric cross-calibration coefficients of the WFV1 camera were obtained based on radiometric and geometric matching, taking the Landsat 8 OLI (Operational Land Imager) sensor as reference. The Scale Invariant Feature Transform (SIFT) feature detection method and a distance and included-angle weighting method were introduced to correct misregistration of WFV-OLI image pairs. A radiative transfer model was used to eliminate differences between the OLI sensor and the WFV1 camera through the spectral match factor (SMF). The near-infrared band of the WFV1 camera encompasses water vapor absorption bands, so a look-up table (LUT) of SMF as a function of water vapor amount was established to estimate water vapor effects. A ground synchronization experiment was designed to verify the reliability of the cross-calibration coefficients, which appear to perform better than the official coefficients published by the China Centre for Resources Satellite Data and Application (CCRSDA).
A Method to Solve Interior and Exterior Camera Calibration Parameters for Image Resection
NASA Technical Reports Server (NTRS)
Samtaney, Ravi
1999-01-01
An iterative method is presented to solve for the internal and external camera calibration parameters, given model target points and their images from one or more camera locations. The direct linear transform formulation was used to obtain an initial guess for the iterative method, and herein lies one of the strengths of the present method. In all test cases, the method converged to the correct solution. In general, an overdetermined system of nonlinear equations is solved in the least-squares sense. The iterative method presented is based on Newton-Raphson for solving systems of nonlinear algebraic equations. The Jacobian is derived analytically, and the pseudo-inverse of the Jacobian is obtained by singular value decomposition.
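The iteration described above can be sketched as a Newton-Raphson (Gauss-Newton) loop whose step uses the SVD-based pseudo-inverse of an analytic Jacobian. The toy resection below, a hypothetical one-dimensional camera with invented target points, is illustrative only and is not the NTRS implementation:

```python
import numpy as np

def gauss_newton(residual, jacobian, p0, iters=20):
    """Newton-Raphson for nonlinear least squares: each step applies the
    pseudo-inverse of the analytic Jacobian (computed via SVD) to the residual."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = residual(p)
        J = jacobian(p)
        p = p - np.linalg.pinv(J) @ r   # pinv uses singular value decomposition
    return p

# Toy resection: recover focal length f and depth offset tz from 1D projections
# u_i = f * x_i / (z_i + tz) of known target points (x_i, z_i).
x = np.array([0.1, -0.2, 0.3, 0.25, -0.15])
z = np.array([2.0, 2.5, 3.0, 2.2, 2.8])
f_true, tz_true = 800.0, 0.5
u = f_true * x / (z + tz_true)          # synthetic, noise-free observations

def residual(p):
    f, tz = p
    return f * x / (z + tz) - u

def jacobian(p):
    """Analytic Jacobian: d(residual)/d(f) and d(residual)/d(tz)."""
    f, tz = p
    return np.column_stack([x / (z + tz), -f * x / (z + tz) ** 2])

p = gauss_newton(residual, jacobian, [500.0, 0.0])   # deliberately rough initial guess
```

With noise-free data the overdetermined 5-equation, 2-unknown system has an exact solution, and the iteration converges to it from the rough starting point.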
A New Calibration Method for Commercial RGB-D Sensors
Darwish, Walid; Tang, Shenjun; Li, Wenbin; Chen, Wu
2017-01-01
Commercial RGB-D sensors such as Kinect and Structure Sensors have been widely used in the game industry, where geometric fidelity is not of utmost importance. For applications in which high quality 3D is required, i.e., 3D building models of centimeter-level accuracy, accurate and reliable calibrations of these sensors are required. This paper presents a new model for calibrating the depth measurements of RGB-D sensors based on the structured light concept. Additionally, a new automatic method is proposed for the calibration of all RGB-D parameters, including internal calibration parameters for all cameras, the baseline between the infrared and RGB cameras, and the depth error model. When compared with traditional calibration methods, this new model shows a significant improvement in depth precision for both near and far ranges. PMID:28538695
NASA Astrophysics Data System (ADS)
Zhang, Hua; Zeng, Luan
2017-11-01
Binocular stereoscopic vision can be used for close-range observation of space targets from space-based platforms. To solve the problem that a traditional binocular vision system cannot work normally after being disturbed, an online calibration method with self-reference is proposed for a binocular stereo measuring camera. The method uses an auxiliary optical imaging device to insert the image of a standard reference object at the edge of the main optical path, imaged on the same focal plane as the target; this is equivalent to embedding a standard reference in the binocular imaging optical system. When the pose of the system or the parameters of the imaging device are disturbed, the image of the standard reference changes accordingly in the imaging plane, while the physical position of the standard reference object does not change. The camera's external parameters can then be re-calibrated from the imaging relationship of the standard reference object. The experimental results show that the maximum mean square error for the same object can be reduced from 72.88 mm to 1.65 mm when the right camera is deflected by 0.4° and the left camera is rotated by 0.2° in pitch. This method realizes online calibration of a binocular stereoscopic vision measurement system and can effectively improve its anti-jamming ability.
Photometric Calibration of Consumer Video Cameras
NASA Technical Reports Server (NTRS)
Suggs, Robert; Swift, Wesley, Jr.
2007-01-01
Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral- density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
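The core product of such a calibration is a measured response curve (output signal versus known input brightness) that can later be inverted to recover brightness from a measured signal. A minimal sketch of building and inverting a nonlinear response curve is shown below; the exponential response model and all numbers are invented for illustration:

```python
import math

# Hypothetical saturating (nonlinear) camera response to scene brightness.
def camera_response(b):
    return 255.0 * (1.0 - math.exp(-b / 40.0))

# Calibration sweep: a variable "star" of known brightness, recording the output.
cal_brightness = [2.0 * i for i in range(1, 100)]
cal_signal = [camera_response(b) for b in cal_brightness]

def invert_response(signal):
    """Map a measured signal back to brightness by linear interpolation
    on the monotonic calibration curve."""
    for i in range(1, len(cal_signal)):
        if cal_signal[i] >= signal:
            s0, s1 = cal_signal[i - 1], cal_signal[i]
            b0, b1 = cal_brightness[i - 1], cal_brightness[i]
            t = (signal - s0) / (s1 - s0)
            return b0 + t * (b1 - b0)
    return cal_brightness[-1]

# Recover the brightness of an "unknown" source from its measured signal.
b_est = invert_response(camera_response(57.0))
```

The actual system calibrates the full chain (optics, sensor, recording, digitization) the same way, by sweeping a collimated source of known, varying brightness through the whole pipeline.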
The on-orbit calibration of geometric parameters of the Tian-Hui 1 (TH-1) satellite
NASA Astrophysics Data System (ADS)
Wang, Jianrong; Wang, Renxiang; Hu, Xin; Su, Zhongbo
2017-02-01
The on-orbit calibration of geometric parameters is a key step in improving the location accuracy of satellite images without using Ground Control Points (GCPs). Most methods of on-orbit calibration are based on self-calibration using additional parameters; different numbers of additional parameters may lead to different results. Triangulation bundle adjustment is another way to calibrate the geometric parameters of a camera, and it can describe the changes in each geometric parameter. When the triangulation bundle adjustment method is applied to calibrate geometric parameters, a prerequisite is that the strip model avoids the systematic deformation caused by the rate of attitude changes. For a stereo camera, the influence of the intersection angle should also be considered during calibration. The Equivalent Frame Photo (EFP) bundle adjustment based on the Line-Matrix CCD (LMCCD) image can eliminate the systematic distortion of the strip model and obtain high location accuracy without using GCPs. In this paper, triangulation bundle adjustment is used to calibrate the geometric parameters of the TH-1 satellite cameras based on LMCCD imagery. During the bundle adjustment, the three-line array cameras are reconstructed by adopting the principle of inverse triangulation. Finally, the geometric accuracy is validated before and after on-orbit calibration using 5 testing fields. After on-orbit calibration, the 3D geometric accuracy is improved from 170 m to 11.8 m. The results show that the location accuracy of TH-1 without using GCPs is significantly improved by the on-orbit calibration of the geometric parameters.
3D kinematic measurement of human movement using low cost fish-eye cameras
NASA Astrophysics Data System (ADS)
Islam, Atiqul; Asikuzzaman, Md.; Garratt, Matthew A.; Pickering, Mark R.
2017-02-01
3D motion capture is difficult when the capture is performed in an outdoor environment without controlled surroundings. In this paper, we propose a new approach that uses two ordinary cameras arranged in a special stereoscopic configuration and passive markers on a subject's body to reconstruct the motion of the subject. First, for each frame of the video, an adaptive thresholding algorithm is applied to extract the markers on the subject's body. Once the markers are extracted, an algorithm for matching corresponding markers in each frame is applied. Zhang's planar calibration method is used to calibrate the two cameras. Because the cameras use fisheye lenses, they cannot be accurately described by a pinhole camera model, which makes it difficult to estimate depth information. In this work, to restore the 3D coordinates we use a calibration method specific to fisheye lenses. The accuracy of the 3D coordinate reconstruction is evaluated by comparison with results from a commercially available Vicon motion capture system.
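The claim that a fisheye lens is poorly served by a pinhole model can be illustrated by comparing the equidistant fisheye projection (image radius r = f·θ) with the perspective projection (r = f·tan θ). The focal length below is an arbitrary assumption:

```python
import math

f = 400.0  # hypothetical focal length in pixels

def pinhole_radius(theta):
    """Pinhole (perspective) model: image radius grows with tan(theta)."""
    return f * math.tan(theta)

def fisheye_radius(theta):
    """Equidistant fisheye model: image radius grows linearly with theta."""
    return f * theta

# Near the optical axis the two models agree; at wide field angles they diverge
# sharply, which is why a pinhole model cannot describe a fisheye lens.
small = abs(pinhole_radius(0.05) - fisheye_radius(0.05))   # ~3 degrees off-axis
wide = abs(pinhole_radius(1.2) - fisheye_radius(1.2))      # ~69 degrees off-axis
```

Fisheye-specific calibration models parameterize the r(θ) curve directly (often as a polynomial in θ) instead of assuming the tan θ form.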
Calibration of a dual-PTZ camera system for stereo vision
NASA Astrophysics Data System (ADS)
Chang, Yau-Zen; Hou, Jung-Fu; Tsao, Yi Hsiang; Lee, Shih-Tseng
2010-08-01
In this paper, we propose a calibration process for the intrinsic and extrinsic parameters of dual-PTZ camera systems. The calibration is based on a complete definition of six coordinate systems fixed at the image planes, and the pan and tilt rotation axes of the cameras. Misalignments between estimated and ideal coordinates of image corners are formed into cost values to be solved by the Nelder-Mead simplex optimization method. Experimental results show that the system is able to obtain 3D coordinates of objects with a consistent accuracy of 1 mm when the distance between the dual-PTZ camera set and the objects is from 0.9 to 1.1 meters.
Camera Calibration with Radial Variance Component Estimation
NASA Astrophysics Data System (ADS)
Mélykuti, B.; Kruck, E. J.
2014-11-01
Camera calibration plays an increasingly important role today. Besides true digital aerial survey cameras, the photogrammetric market is dominated by a large number of non-metric digital cameras mounted on UAVs and other low-weight flying platforms. In-flight calibration of these systems can considerably enhance the geometric accuracy of survey photos. Photo measurements are expected to be more precise in the center of images than along the edges or in the corners. Using statistical methods, the accuracy of photo measurements was analyzed as a function of the distance of points from the image center. This test provides a curve of measurement precision as a function of photo radius. A large number of camera types were tested with well-distributed point measurements in image space. The tests lead to a general conclusion: a functional connection between accuracy and radial distance, and a method to check and enhance the geometric capability of cameras in light of these results.
Video auto stitching in multicamera surveillance system
NASA Astrophysics Data System (ADS)
He, Bin; Zhao, Gang; Liu, Qifang; Li, Yangyang
2012-01-01
This paper concerns the problem of video stitching automatically in a multi-camera surveillance system. Previous approaches have used multiple calibrated cameras for video mosaic in large scale monitoring application. In this work, we formulate video stitching as a multi-image registration and blending problem, and not all cameras are needed to be calibrated except a few selected master cameras. SURF is used to find matched pairs of image key points from different cameras, and then camera pose is estimated and refined. Homography matrix is employed to calculate overlapping pixels and finally implement boundary resample algorithm to blend images. The result of simulation demonstrates the efficiency of our method.
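The stitching pipeline described above rests on estimating a homography from matched key points. A minimal direct linear transform (DLT) sketch is given below; the correspondences are synthetic stand-ins for SURF matches, and the normalization and RANSAC steps a real pipeline would need are omitted:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate a 3x3 homography from >= 4 point correspondences via DLT
    (Hartley normalization omitted for brevity)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)     # null vector of A, reshaped
    return H / H[2, 2]

def apply_h(H, pt):
    """Map a 2D point through a homography (homogeneous divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Hypothetical matched keypoints between two overlapping camera views
# (in practice these would come from SURF matching).
H_true = np.array([[1.1, 0.02, 30.0], [-0.01, 0.95, 12.0], [1e-4, 2e-5, 1.0]])
src = [(10.0, 20.0), (200.0, 40.0), (60.0, 180.0), (220.0, 210.0), (120.0, 90.0)]
dst = [tuple(apply_h(H_true, p)) for p in src]

H = homography_dlt(src, dst)
err = max(np.linalg.norm(np.array(apply_h(H, p)) - np.array(q))
          for p, q in zip(src, dst))
```

Once H is known, overlapping pixels between views can be computed by warping one image into the other's frame before blending.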
Thermal-depth matching in dynamic scene based on affine projection and feature registration
NASA Astrophysics Data System (ADS)
Wang, Hongyu; Jia, Tong; Wu, Chengdong; Li, Yongqiang
2018-03-01
This paper studies the construction of a 3D temperature distribution reconstruction system based on depth and thermal infrared information. A traditional calibration method cannot be used directly, because neither the depth camera nor the thermal infrared camera is sensitive to a color calibration board. Therefore, a calibration board suited to both depth and thermal infrared cameras is designed to complete their calibration. Meanwhile, a local feature descriptor for thermal and depth images is proposed. A belief propagation matching algorithm is also investigated, based on spatial affine transformation matching and local feature matching. The 3D temperature distribution model is built by matching the 3D point cloud with the 2D thermal infrared information. Experimental results show that the method can accurately construct the 3D temperature distribution model and has strong robustness.
A curve fitting method for extrinsic camera calibration from a single image of a cylindrical object
NASA Astrophysics Data System (ADS)
Winkler, A. W.; Zagar, B. G.
2013-08-01
An important step in the process of optical steel coil quality assurance is to measure the proportions of width and radius of steel coils as well as the relative position and orientation of the camera. This work attempts to estimate these extrinsic parameters from single images by using the cylindrical coil itself as the calibration target. Therefore, an adaptive least-squares algorithm is applied to fit parametrized curves to the detected true coil outline in the acquired image. The employed model allows for strictly separating the intrinsic and the extrinsic parameters. Thus, the intrinsic camera parameters can be calibrated beforehand using available calibration software. Furthermore, a way to segment the true coil outline in the acquired images is motivated. The proposed optimization method yields highly accurate results and can be generalized even to measure other solids which cannot be characterized by the identification of simple geometric primitives.
NASA Astrophysics Data System (ADS)
Al-Durgham, K.; Lichti, D. D.; Detchev, I.; Kuntze, G.; Ronsky, J. L.
2018-05-01
A fundamental task in photogrammetry is the temporal stability analysis of the calibration parameters of a camera or imaging system. This is essential to validate the repeatability of the parameter estimation, to detect any behavioural changes in the camera/imaging system and to ensure precise photogrammetric products. Many stability analysis methods exist in the photogrammetric literature, each with different methodological bases, advantages and disadvantages. This paper presents a simple and rigorous stability analysis method that can be straightforwardly implemented for a single camera or an imaging system with multiple cameras. The basic collinearity model is used to capture differences between two calibration datasets and to establish the stability analysis methodology. Geometric simulation is used as a tool to derive image and object space scenarios. Experiments were performed on real calibration datasets from a dual fluoroscopy (DF; X-ray-based) imaging system. The calibration data consisted of hundreds of images and thousands of image observations from six temporal points over a two-day period for a precise evaluation of the DF system stability. The stability of the DF system was found to be within a range of 0.01 to 0.66 mm in terms of 3D coordinate root-mean-square error (RMSE) for the single-camera analysis, and 0.07 to 0.19 mm for the dual-camera analysis. To the best of the authors' knowledge, this work is the first to address the topic of DF stability analysis.
In-situ calibration of nonuniformity in infrared staring and modulated systems
NASA Astrophysics Data System (ADS)
Black, Wiley T.
Infrared cameras can directly measure the apparent temperature of objects, providing thermal imaging. However, the raw output from most infrared cameras suffers from a strong, often limiting noise source called nonuniformity. Manufacturing imperfections in infrared focal planes lead to high pixel-to-pixel sensitivity to electronic bias, focal plane temperature, and other effects. The resulting imagery can only provide useful thermal imaging after a nonuniformity calibration has been performed. Traditionally, these calibrations are performed by momentarily blocking the field of view with a flat temperature plate or blackbody cavity. However, because the pattern is a coupling of manufactured sensitivities with operational variations, periodic recalibration is required, sometimes on the order of tens of seconds. A class of computational methods called Scene-Based Nonuniformity Correction (SBNUC) has been researched for over 20 years, in which the nonuniformity calibration is estimated in digital processing by analysis of the video stream in the presence of camera motion. The most sophisticated SBNUC methods can completely and robustly eliminate the high-spatial-frequency component of nonuniformity with only an initial reference calibration or potentially no physical calibration. I will demonstrate a novel algorithm that advances these SBNUC techniques to support all spatial frequencies of nonuniformity correction. Long-wave infrared microgrid polarimeters are a class of camera that incorporate a microscale per-pixel wire-grid polarizer directly affixed to each pixel of the focal plane. These cameras have the capability of simultaneously measuring thermal imagery and polarization in a robust integrated package with no moving parts. I will describe the necessary adaptations of my SBNUC method to operate on this class of sensor, as well as demonstrate SBNUC performance in LWIR polarimetry video collected on the UA mall.
Maximum likelihood estimation in calibrating a stereo camera setup.
Muijtjens, A M; Roos, J M; Arts, T; Hasman, A
1999-02-01
Motion and deformation of the cardiac wall may be measured by following the positions of implanted radiopaque markers in three dimensions, using two x-ray cameras simultaneously. Typically, the position measurement system is calibrated by registering images of a calibration object containing 10-20 radiopaque markers at known positions. Unfortunately, an accidental change of the position of a camera after calibration requires complete recalibration. Alternatively, redundant information in the measured image positions of stereo pairs can be used for calibration, so that a separate calibration procedure can be avoided. In the current study, a model is developed that describes the geometry of the camera setup by five dimensionless parameters. Maximum likelihood (ML) estimates of these parameters were obtained in an error analysis. It is shown that the ML estimates can be found by application of a nonlinear least squares procedure. Compared to the standard unweighted least squares procedure, the ML method resulted in more accurate estimates without noticeable bias. The accuracy of the ML method was investigated in relation to the object aperture. The reconstruction problem appeared well conditioned as long as the object aperture is larger than 0.1 rad. The angle between the two viewing directions appeared to be the parameter most likely to cause major inaccuracies in the reconstruction of the 3-D positions of the markers. Hence, attempts to improve the robustness of the method should primarily focus on reduction of the error in this parameter.
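The relationship between maximum-likelihood estimation and (weighted) least squares can be illustrated on a toy linear model: for Gaussian noise with known per-point standard deviation, the ML estimate is the least-squares solution after scaling each row by 1/σ. All data below are synthetic and unrelated to the paper's camera model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model y = a*x + b with per-point noise level sigma_i (known).
a_true, b_true = 2.0, -1.0
x = np.linspace(0, 10, 200)
sigma = 0.05 + 0.5 * (x / 10.0)          # heteroscedastic measurement noise
y = a_true * x + b_true + rng.normal(0, sigma)

A = np.column_stack([x, np.ones_like(x)])

# Standard unweighted least squares.
p_ols, *_ = np.linalg.lstsq(A, y, rcond=None)

# ML estimate for Gaussian noise = weighted least squares, weights 1/sigma.
Aw = A / sigma[:, None]
yw = y / sigma
p_ml, *_ = np.linalg.lstsq(Aw, yw, rcond=None)
```

In the paper's setting the model is nonlinear in the five geometry parameters, so the weighted fit is solved iteratively, but the weighting principle is the same.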
Hand-eye calibration for rigid laparoscopes using an invariant point.
Thompson, Stephen; Stoyanov, Danail; Schneider, Crispin; Gurusamy, Kurinchi; Ourselin, Sébastien; Davidson, Brian; Hawkes, David; Clarkson, Matthew J
2016-06-01
Laparoscopic liver resection has significant advantages over open surgery due to less patient trauma and faster recovery times, yet it can be difficult due to the restricted field of view and lack of haptic feedback. Image guidance provides a potential solution but one current challenge is in accurate "hand-eye" calibration, which determines the position and orientation of the laparoscope camera relative to the tracking markers. In this paper, we propose a simple and clinically feasible calibration method based on a single invariant point. The method requires no additional hardware, can be constructed by theatre staff during surgical setup, requires minimal image processing and can be visualised in real time. Real-time visualisation allows the surgical team to assess the calibration accuracy before use in surgery. In addition, in the laboratory, we have developed a laparoscope with an electromagnetic tracking sensor attached to the camera end and an optical tracking marker attached to the distal end. This enables a comparison of tracking performance. We have evaluated our method in the laboratory and compared it to two widely used methods, "Tsai's method" and "direct" calibration. The new method is of comparable accuracy to existing methods, and we show RMS projected error due to calibration of 1.95 mm for optical tracking and 0.85 mm for EM tracking, versus 4.13 and 1.00 mm respectively, using existing methods. The new method has also been shown to be workable under sterile conditions in the operating room. We have proposed a new method of hand-eye calibration, based on a single invariant point. Initial experience has shown that the method provides visual feedback, satisfactory accuracy and can be performed during surgery. We also show that an EM sensor placed near the camera would provide significantly improved image overlay accuracy.
Patankar, S.; Gumbrell, E. T.; Robinson, T. S.; ...
2017-08-17
Here we report a new method using high-stability, laser-driven supercontinuum generation in a liquid cell to calibrate the absolute photon response of fast optical streak cameras as a function of wavelength when operating at the fastest sweep speeds. A stable, pulsed white-light source based on self-phase modulation in a salt solution was developed to provide the required brightness on picosecond timescales, enabling streak camera calibration in fully dynamic operation. The measured spectral brightness allowed for absolute photon response calibration over a broad spectral range (425-650 nm). Calibrations performed with two Axis Photonique streak cameras using the Photonis P820PSU streak tube demonstrated responses which qualitatively follow the photocathode response. Peak sensitivities were 1 photon/count above background. The absolute dynamic sensitivity is less than the static by up to an order of magnitude. We attribute this to the dynamic response of the phosphor being lower.
Volumetric calibration of a plenoptic camera.
Hall, Elise Munz; Fahringer, Timothy W; Guildenbecher, Daniel R; Thurow, Brian S
2018-02-01
The volumetric calibration of a plenoptic camera is explored to correct for inaccuracies due to real-world lens distortions and thin-lens assumptions in current processing methods. Two methods of volumetric calibration based on a polynomial mapping function that does not require knowledge of specific lens parameters are presented and compared to a calibration based on thin-lens assumptions. The first method, volumetric dewarping, is executed by creation of a volumetric representation of a scene using the thin-lens assumptions, which is then corrected in post-processing using a polynomial mapping function. The second method, direct light-field calibration, uses the polynomial mapping in creation of the initial volumetric representation to relate locations in object space directly to image sensor locations. The accuracy and feasibility of these methods is examined experimentally by capturing images of a known dot card at a variety of depths. Results suggest that use of a 3D polynomial mapping function provides a significant increase in reconstruction accuracy and that the achievable accuracy is similar using either polynomial-mapping-based method. Additionally, direct light-field calibration provides significant computational benefits by eliminating some intermediate processing steps found in other methods. Finally, the flexibility of this method is shown for a nonplanar calibration.
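A polynomial mapping function of the kind described can be sketched as an ordinary least-squares fit of a polynomial basis from distorted to true coordinates. The quadratic basis and the distortion model below are invented for illustration and are far simpler than real plenoptic optics:

```python
import numpy as np

def poly_features(pts):
    """Quadratic polynomial basis in (x, y, z) for the mapping function."""
    x, y, z = pts.T
    return np.column_stack([np.ones_like(x), x, y, z,
                            x * y, x * z, y * z, x * x, y * y, z * z])

rng = np.random.default_rng(2)

# "Dot card" calibration points at known true positions across the volume.
true_pts = rng.uniform(-1, 1, size=(200, 3))

# Hypothetical systematic error standing in for thin-lens reconstruction bias.
distorted = true_pts + 0.05 * true_pts ** 2 + 0.01

# Calibration: solve for coefficients mapping distorted -> true, per axis.
Phi = poly_features(distorted)
coef, *_ = np.linalg.lstsq(Phi, true_pts, rcond=None)

# Apply the fitted mapping to correct the volume.
corrected = poly_features(distorted) @ coef
err = np.abs(corrected - true_pts).max()
```

This corresponds to the "volumetric dewarping" variant (correct a reconstructed volume in post-processing); the direct light-field variant would instead fit the mapping between object space and sensor coordinates.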
Integrated calibration of multiview phase-measuring profilometry
NASA Astrophysics Data System (ADS)
Lee, Yeong Beum; Kim, Min H.
2017-11-01
Phase-measuring profilometry (PMP) measures per-pixel height information of a surface with high accuracy. Height information captured by a camera in PMP relies on its screen coordinates. Therefore, a PMP measurement from a view cannot be integrated directly to other measurements from different views due to the intrinsic difference of the screen coordinates. In order to integrate multiple PMP scans, an auxiliary calibration of each camera's intrinsic and extrinsic properties is required, in addition to principal PMP calibration. This is cumbersome and often requires physical constraints in the system setup, and multiview PMP is consequently rarely practiced. In this work, we present a novel multiview PMP method that yields three-dimensional global coordinates directly so that three-dimensional measurements can be integrated easily. Our PMP calibration parameterizes intrinsic and extrinsic properties of the configuration of both a camera and a projector simultaneously. It also does not require any geometric constraints on the setup. In addition, we propose a novel calibration target that can remain static without requiring any mechanical operation while conducting multiview calibrations, whereas existing calibration methods require manually changing the target's position and orientation. Our results validate the accuracy of measurements and demonstrate the advantages on our multiview PMP.
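PMP's per-pixel height measurement starts from a wrapped phase computed from phase-shifted fringe images. A standard four-step phase-shifting formula (generic PMP, not this paper's multiview method) is sketched below with simulated intensities:

```python
import math

def wrapped_phase(i0, i1, i2, i3):
    """Four-step phase-shifting: intensities captured with 90-degree phase shifts
    yield the wrapped phase independently of ambient level A and modulation B."""
    return math.atan2(i3 - i1, i0 - i2)

# Simulated fringe intensities at one pixel: I_k = A + B*cos(phi + k*pi/2),
# with invented ambient level A, modulation B, and true phase phi.
A, B, phi = 120.0, 70.0, 0.8
I = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
phi_est = wrapped_phase(*I)
```

After unwrapping, the phase maps to height through the calibration; the paper's contribution is making that calibration yield global 3D coordinates so scans from different views can be merged directly.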
Color calibration of an RGB camera mounted in front of a microscope with strong color distortion.
Charrière, Renée; Hébert, Mathieu; Trémeau, Alain; Destouches, Nathalie
2013-07-20
This paper aims to show that color calibration of an RGB camera can be achieved even when the optical system in front of the camera introduces strong color distortion. In the present case, the optical system is a microscope containing a halogen lamp, with nonuniform irradiance on the viewed surface. The calibration method proposed in this work is based on an existing method, but it is preceded by a three-step preprocessing of the RGB images to extract relevant color information from the strongly distorted images, taking into account especially the nonuniform irradiance map and the perturbing texture due to the surface topology of standard color calibration charts when observed at micrometric scale. The proposed color calibration process consists first of computing the average color of the color-chart patches viewed under the microscope; then computing white balance, gamma correction, and saturation enhancement; and finally applying a third-order polynomial regression color calibration transform. Despite the unusual conditions for color calibration, fairly good performance is achieved from a 48-patch Lambertian color chart, with an average CIE-94 color difference on the color-chart colors lower than 2.5 units.
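A third-order polynomial regression of the kind used in the final calibration step can be sketched as a linear least-squares fit over polynomial features of the measured RGB values. The chart values and the distortion model below are invented, and cross-channel product terms are omitted for brevity:

```python
import numpy as np

def poly3_features(rgb):
    """Third-order polynomial terms per channel (cross terms omitted here)."""
    r, g, b = rgb.T
    return np.column_stack([np.ones_like(r), r, g, b,
                            r**2, g**2, b**2, r**3, g**3, b**3])

rng = np.random.default_rng(3)
reference = rng.uniform(0, 1, size=(48, 3))   # hypothetical 48-patch chart values

# Hypothetical camera/illuminant distortion: channel mixing plus a gamma-like bend.
mix = np.array([[0.80, 0.15, 0.05],
                [0.10, 0.75, 0.15],
                [0.05, 0.20, 0.75]])
measured = (reference @ mix.T) ** 1.1

# Fit the regression mapping measured colors back to reference colors.
M, *_ = np.linalg.lstsq(poly3_features(measured), reference, rcond=None)
corrected = poly3_features(measured) @ M
err = np.abs(corrected - reference).max()
```

In the paper the regression is applied after white balance, gamma correction, and saturation enhancement, and the fit is evaluated with the CIE-94 color difference rather than raw RGB error.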
In-Situ Cameras for Radiometric Correction of Remotely Sensed Data
NASA Astrophysics Data System (ADS)
Kautz, Jess S.
The atmosphere distorts the spectrum of remotely sensed data, negatively affecting all forms of investigation of Earth's surface. To gather reliable data, it is vital that atmospheric corrections are accurate, yet the current state of the field does not account well for the benefits and costs of different correction algorithms, and ground spectral data are required to evaluate these algorithms better. This dissertation explores using cameras as radiometers as a means of gathering ground spectral data. I introduce techniques to implement a camera system for atmospheric correction using off-the-shelf parts. To aid the design of future camera systems for radiometric correction, methods are explored for estimating the system error prior to construction, calibration, and testing of the resulting camera system. Simulations are used to investigate the relationship between the reflectance accuracy of the camera system and the quality of atmospheric correction. In the design phase, read noise and filter choice are found to be the strongest sources of system error. I explain the calibration methods for the camera system, showing the problems of pixel-to-angle calibration and of adapting a web camera for scientific work. The camera system is tested in the field to estimate its ability to recover directional reflectance from BRF data. I estimate the error in the system due to the experimental set-up, then explore how the system error changes with different cameras, environmental set-ups, and inversions. These experiments show the importance of the dynamic range of the camera and of the input ranges used for the PROSAIL inversion. Evidence that the camera can perform within the specification set for ELM correction in this dissertation is evaluated.
The analysis is concluded by simulating an ELM correction of a scene using various numbers of calibration targets, and levels of system error, to find the number of cameras needed for a full-scale implementation.
Stereoscopic 3D reconstruction using motorized zoom lenses within an embedded system
NASA Astrophysics Data System (ADS)
Liu, Pengcheng; Willis, Andrew; Sui, Yunfeng
2009-02-01
This paper describes a novel embedded system capable of estimating 3D positions of surfaces viewed by a stereoscopic rig consisting of a pair of calibrated cameras. Novel theoretical and technical aspects of the system are tied to two aspects of the design that deviate from typical stereoscopic reconstruction systems: (1) incorporation of a 10x zoom lens (Rainbow H10x8.5) and (2) implementation on embedded hardware. The system components include a DSP running μClinux, an embedded version of the Linux operating system, and an FPGA. The DSP orchestrates data flow within the system and performs complex computational tasks, while the FPGA provides an interface to the system devices, which consist of a CMOS camera pair and a pair of servo motors that rotate (pan) each camera. Calibration of the camera pair is accomplished using a collection of stereo images viewing a common chessboard calibration pattern at a set of pre-defined zoom positions. Calibration settings for an arbitrary zoom setting are estimated by interpolation of the camera parameters. A low-computational-cost method for dense stereo matching is used to compute depth disparities for the stereo image pairs. Surface reconstruction is accomplished by classical triangulation of the matched points from the depth disparities. This article includes our methods and results for the following problems: (1) automatic computation of the focus and exposure settings for the lens and camera sensor, (2) calibration of the system for various zoom settings, and (3) stereo reconstruction results for several free-form objects.
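The classical triangulation step mentioned above can be illustrated with a minimal direct-linear-transform (DLT) sketch. This is the textbook formulation for two calibrated views, not the authors' embedded implementation, and the projection matrices in the usage are hypothetical.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """DLT triangulation of one 3D point from two 3x4 projection matrices.

    x1, x2 are pixel coordinates (u, v) of the matched point in each camera.
    Each observation contributes two rows of the homogeneous system A X = 0;
    the solution is the right singular vector for the smallest singular value.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize
```

For noise-free matches this recovers the 3D point exactly; with real disparities it gives the algebraic least-squares estimate.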
A Flexible and High Precision Calibration Method for Binocular Structured Light Scanning System
Yuan, Jianying; Wang, Qiong; Li, Bailin
2014-01-01
3D (three-dimensional) structured light scanning systems are widely used in reverse engineering, quality inspection, and other fields. Camera calibration is the key to scanning precision. Currently, a finely machined 2D (two-dimensional) or 3D calibration reference object is usually required for high calibration precision, which is difficult to operate and costly. In this paper, a novel calibration method is proposed that uses only a scale bar and some artificial coded targets placed randomly in the measuring volume. The principle of the proposed method is based on hierarchical self-calibration and bundle adjustment. We obtain initial intrinsic parameters from images. Initial extrinsic parameters in projective space are estimated with the factorization method and then upgraded to Euclidean space using the orthogonality of the rotation matrix and the rank-3 constraint on the absolute quadric. Finally, all camera parameters are refined through bundle adjustment. Real experiments show that the proposed method is robust and has the same precision level as results obtained with a delicate artificial reference object, while the hardware cost is very low compared with current calibration methods used in 3D structured light scanning systems. PMID:25202736
Multisensory visual servoing by a neural network.
Wei, G Q; Hirzinger, G
1999-01-01
Conventional computer vision methods for determining a robot's end-effector motion from sensory data need sensor calibration (e.g., camera calibration) and sensor-to-hand calibration (e.g., hand-eye calibration). This involves considerable computation and some practical difficulties, especially when different kinds of sensors are involved. In this correspondence, we present a neural network approach to the motion determination problem without any calibration. Two kinds of sensory data, namely camera images and laser range data, are used as the input to a multilayer feedforward network that learns the direct transformation from the sensory data to the required motions. This provides a practical sensor fusion method. Using a recursive motion strategy and a network correction term, we relax the requirement that the learned transformation be exact. Another important feature of our work is that the goal position can be changed without network retraining. Experimental results show the effectiveness of our method.
Full-Field Calibration of Color Camera Chromatic Aberration using Absolute Phase Maps.
Liu, Xiaohong; Huang, Shujun; Zhang, Zonghua; Gao, Feng; Jiang, Xiangqian
2017-05-06
The refractive index of a lens varies with the wavelength of light, so the same incident ray is refracted differently at different wavelengths. This characteristic of lenses causes images captured by a color camera to display chromatic aberration (CA), which seriously reduces image quality. Based on an analysis of the distribution of CA, a full-field calibration method based on absolute phase maps is proposed in this paper. Red, green, and blue closed sinusoidal fringe patterns are generated, consecutively displayed on an LCD (liquid crystal display), and captured by a color camera from the front viewpoint. The phase information of each color fringe is obtained using a four-step phase-shifting algorithm and an optimum fringe number selection method. CA causes the unwrapped phases of the three channels to differ. These pixel deviations can be computed by comparing the unwrapped phase data of the red, blue, and green channels in polar coordinates, and CA calibration is then accomplished in Cartesian coordinates. The systematic errors introduced by the LCD are analyzed and corrected. Simulated results show the validity of the proposed method, and experimental results demonstrate that the proposed full-field calibration method based on absolute phase maps is useful for practical software-based CA calibration.
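A four-step phase-shifting step of the kind referenced above can be sketched as follows, assuming fringe intensities of the form I_k = A + B·cos(φ − kπ/2). This is the standard textbook formulation, not necessarily the exact variant used in the paper.

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Wrapped phase from four fringe images shifted by 0, pi/2, pi, 3*pi/2.

    With I_k = A + B*cos(phi - k*pi/2):
        i1 - i3 = 2*B*sin(phi),  i0 - i2 = 2*B*cos(phi),
    so the background A and modulation B cancel out of the ratio.
    """
    return np.arctan2(i1 - i3, i0 - i2)
```

The recovered phase is wrapped to (−π, π]; unwrapping (e.g., by optimum fringe number selection, as in the paper) is a separate step.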
Indirect Correspondence-Based Robust Extrinsic Calibration of LiDAR and Camera
Sim, Sungdae; Sock, Juil; Kwak, Kiho
2016-01-01
LiDAR and cameras have been broadly utilized in computer vision and autonomous vehicle applications. However, in order to convert data between the sensors' local coordinate systems, we must estimate the rigid body transformation between them. In this paper, we propose a robust extrinsic calibration algorithm that can be implemented easily and has small calibration error. The extrinsic calibration parameters are estimated by minimizing the distance between corresponding features projected onto the image plane; the features are edge and centerline features on a v-shaped calibration target. The proposed algorithm improves calibration accuracy in two ways. First, we weight the distance between a point and a line feature according to the correspondence accuracy of the features. Second, we apply a penalizing function to exclude the influence of outliers in the calibration datasets. Additionally, based on our robust calibration approach for a single LiDAR-camera pair, we introduce a joint calibration that estimates the extrinsic parameters of multiple sensors at once by minimizing one objective function with loop-closing constraints. We conduct several experiments to evaluate the performance of our extrinsic calibration algorithm. The experimental results show that our calibration method performs better than the other approaches. PMID:27338416
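The weighted point-to-line residuals and an outlier-penalizing function could look like the sketch below. The Huber-style penalty is an assumption for illustration; the abstract does not specify the exact penalizing function used.

```python
import numpy as np

def point_line_distance(p, a, b):
    """Distance from 2D point p to the infinite line through points a and b."""
    d = b - a
    n = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal to the line
    return abs(n @ (p - a))

def robust_cost(residuals, weights, c=1.0):
    """Weighted sum of residuals under a Huber-style penalty (assumed form).

    Residuals below c are penalized quadratically; larger ones only linearly,
    which damps the influence of outlier correspondences.
    """
    r = np.abs(np.asarray(residuals, dtype=float))
    huber = np.where(r <= c, 0.5 * r**2, c * (r - 0.5 * c))
    return float(np.sum(np.asarray(weights, dtype=float) * huber))
```

An optimizer (e.g., a nonlinear least-squares solver) would minimize this cost over the extrinsic parameters that generate the projected features.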
Calibration method for a large-scale structured light measurement system.
Wang, Peng; Wang, Jianmei; Xu, Jing; Guan, Yong; Zhang, Guanglie; Chen, Ken
2017-05-10
The structured light method is an effective non-contact measurement approach. The calibration greatly affects the measurement precision of structured light systems. To construct a large-scale structured light system with high accuracy, a large-scale and precise calibration gauge is always required, which leads to an increased cost. To this end, in this paper, a calibration method with a planar mirror is proposed to reduce the calibration gauge size and cost. An out-of-focus camera calibration method is also proposed to overcome the defocusing problem caused by the shortened distance during the calibration procedure. The experimental results verify the accuracy of the proposed calibration method.
Volumetric calibration of a plenoptic camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Elise Munz; Fahringer, Timothy W.; Guildenbecher, Daniel Robert
2018-02-01
Here, the volumetric calibration of a plenoptic camera is explored to correct for inaccuracies due to real-world lens distortions and thin-lens assumptions in current processing methods. Two methods of volumetric calibration based on a polynomial mapping function that does not require knowledge of specific lens parameters are presented and compared to a calibration based on thin-lens assumptions. The first method, volumetric dewarping, is executed by creation of a volumetric representation of a scene using the thin-lens assumptions, which is then corrected in post-processing using a polynomial mapping function. The second method, direct light-field calibration, uses the polynomial mapping in creation of the initial volumetric representation to relate locations in object space directly to image sensor locations. The accuracy and feasibility of these methods is examined experimentally by capturing images of a known dot card at a variety of depths. Results suggest that use of a 3D polynomial mapping function provides a significant increase in reconstruction accuracy and that the achievable accuracy is similar using either polynomial-mapping-based method. Additionally, direct light-field calibration provides significant computational benefits by eliminating some intermediate processing steps found in other methods. Finally, the flexibility of this method is shown for a nonplanar calibration.
A method of camera calibration with adaptive thresholding
NASA Astrophysics Data System (ADS)
Gao, Lei; Yan, Shu-hua; Wang, Guo-chao; Zhou, Chun-lei
2009-07-01
In order to calculate camera parameters correctly, we must determine the accurate coordinates of certain points in the image plane. Corners are important features in 2D images: generally speaking, they are points of high curvature lying at the junction of image regions of different brightness, and corner detection is therefore already widely used in many fields. In this paper we use the pinhole camera model and the SUSAN corner detection algorithm to calibrate the camera. When applying the SUSAN corner detector, we propose an approach that selects the gray-difference threshold adaptively, which makes it possible to pick up the correct chessboard inner corners under all kinds of gray contrast. Experimental results show that the method is feasible.
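A toy SUSAN-style corner response with a contrast-derived threshold can be sketched as follows. The 3x3 neighborhood (instead of SUSAN's circular mask) and the standard-deviation heuristic for the adaptive threshold are simplifications assumed for illustration, not the paper's exact scheme.

```python
import numpy as np

def adaptive_threshold(img, k=0.5):
    """Heuristic: derive the gray-difference threshold from image contrast (assumption)."""
    return max(1e-3, k * float(np.std(img)))

def susan_response(img, t):
    """USAN similarity count over a 3x3 neighborhood (toy SUSAN variant).

    Each neighbor contributes exp(-((I_n - I_0)/t)^6), SUSAN's smooth similarity
    function. Low totals mark corner-like pixels: few neighbors share the
    nucleus brightness.
    """
    usan = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            usan += np.exp(-((shifted - img) / t) ** 6)
    return usan
```

Deriving the threshold from the image itself is what lets the same detector cope with chessboards of very different gray contrast.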
Depth profile measurement with lenslet images of the plenoptic camera
NASA Astrophysics Data System (ADS)
Yang, Peng; Wang, Zhaomin; Zhang, Wei; Zhao, Hongying; Qu, Weijuan; Zhao, Haimeng; Asundi, Anand; Yan, Lei
2018-03-01
An approach for carrying out depth profile measurement of an object with the plenoptic camera is proposed. A single plenoptic image consists of multiple lenslet images. First, these images are processed directly with a refocusing technique to obtain the depth map, without the need to align and decode the plenoptic image. Then, a linear depth calibration based on the optical structure of the plenoptic camera is applied for depth profile reconstruction. One significant improvement of the proposed method concerns the resolution of the depth map: unlike in traditional methods, the resolution is not limited by the number of microlenses inside the camera, and the depth map can be globally optimized. We validated the method with experiments on depth map reconstruction, depth calibration, and depth profile measurement, with the results indicating that the proposed approach is both efficient and accurate.
Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling
Tang, Shengjun; Zhu, Qing; Chen, Wu; Darwish, Walid; Wu, Bo; Hu, Han; Chen, Min
2016-01-01
RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks with respect to 3D dense mapping: limited measurement ranges (e.g., within 3 m) and depth-measurement errors that increase with distance from the sensor. In this paper, we present a novel approach that geometrically integrates the depth scene and the RGB scene to enlarge the measurement distance of RGB-D sensors and enrich the details of the model generated from depth images. First, precise calibration for RGB-D sensors is introduced: in addition to the calibration of the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the two cameras is also calibrated. Second, to ensure the accuracy of RGB image poses, a refined rejection method for false feature matches is introduced by combining the depth information and initial camera poses between frames of the RGB-D sensor. Then, a global optimization model is used to improve the accuracy of the camera poses, decreasing the inconsistencies between the depth frames in advance. To eliminate the geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity encountered during pose estimation with RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene. The benefit of the proposed joint optimization method is first evaluated on the publicly available benchmark datasets collected with Kinect. The proposed method is then examined on two sets of datasets collected in outdoor and indoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method. PMID:27690028
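A standard way to recover a similarity transformation (scale, rotation, translation) between two registered point sets, as needed when resolving the scale ambiguity of an RGB reconstruction against metric depth data, is Umeyama's method. The sketch below is that generic algorithm, not the authors' specific recovery procedure.

```python
import numpy as np

def umeyama(src, dst):
    """Similarity transform (s, R, t) minimizing ||dst - (s * R @ src + t)||^2.

    src, dst: (N, 3) arrays of corresponding points. Returns scale s,
    rotation R (proper, det = +1), and translation t.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    xs, xd = src - mu_s, dst - mu_d
    cov = xd.T @ xs / len(src)                 # cross-covariance of centered sets
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))         # reflection correction
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / (xs**2).sum(axis=1).mean()
    t = mu_d - s * R @ mu_s
    return s, R, t
```

For noise-free correspondences the recovery is exact; with noisy matches it gives the least-squares optimum, which is why such a closed-form step is typically wrapped in a robust (outlier-rejecting) loop.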
NASA Astrophysics Data System (ADS)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.; Kalantar, Daniel H.
2015-02-01
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high-energy-density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for use in a production environment.
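A minimal thin-plate-spline fit of scattered control points, of the kind underlying such warp corrections, can be sketched as follows. This shows one output dimension; a full image warp would fit one spline per coordinate. It is a generic TPS solve, not the production algorithm described in the paper.

```python
import numpy as np

def tps_kernel(r2):
    """TPS radial basis U(r) = r^2 * log(r^2), with U(0) = 0."""
    return np.where(r2 == 0, 0.0, r2 * np.log(np.maximum(r2, 1e-300)))

def fit_tps(src, dst):
    """Solve for TPS weights mapping 2D control points src (N x 2) to scalar values dst (N,)."""
    n = len(src)
    d2 = ((src[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    K = tps_kernel(d2)
    P = np.hstack([np.ones((n, 1)), src])      # affine part: 1, x, y
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.concatenate([dst, np.zeros(3)])     # side conditions: weights orthogonal to affine
    return np.linalg.solve(A, b)

def eval_tps(params, src, query):
    """Evaluate the fitted spline at query points (M x 2)."""
    n = len(src)
    w, a = params[:n], params[n:]
    d2 = ((query[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    return tps_kernel(d2) @ w + a[0] + query @ a[1:]
```

TPS interpolates the control points exactly while minimizing bending energy between them, which suits smooth but complex streak-camera distortions.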
Simplified stereo-optical ultrasound plane calibration
NASA Astrophysics Data System (ADS)
Hoßbach, Martin; Noll, Matthias; Wesarg, Stefan
2013-03-01
Image-guided therapy is a natural concept and commonly used in medicine. In anesthesia, a common task is the injection of an anesthetic close to a nerve under freehand ultrasound guidance. Several guidance systems exist that use electromagnetic tracking of the ultrasound probe as well as the needle, providing the physician with a precise projection of the needle into the ultrasound image. This, however, requires additional expensive devices. We suggest using optical tracking with miniature cameras attached to a 2D ultrasound probe to achieve higher acceptance among physicians. The purpose of this paper is to present an intuitive method to calibrate freehand ultrasound needle guidance systems employing a rigid stereo camera system. State-of-the-art methods are based on a complex series of error-prone coordinate system transformations, which makes them susceptible to error accumulation. By reducing the calibration to a single procedure we provide a calibration method that is equivalent, yet not prone to error accumulation. It requires a linear calibration object and is validated on three datasets utilizing different calibration objects: a 6 mm metal bar and a 1.25 mm biopsy needle. Compared to existing calibration methods for freehand ultrasound needle guidance systems, we achieve higher accuracy while additionally reducing the overall calibration complexity.
Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft
NASA Astrophysics Data System (ADS)
Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.
2015-02-01
Context. The ESA cometary mission Rosetta was launched in 2004. In the following years, and until the spacecraft entered hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle Camera and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now needed to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of aging effects on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources to be observed during the forthcoming phases of the mission has been determined, to ensure better coverage across the wavelength range of the cameras and to study possible dust contamination of the optics.
A new method to calibrate the absolute sensitivity of a soft X-ray streak camera
NASA Astrophysics Data System (ADS)
Yu, Jian; Liu, Shenye; Li, Jin; Yang, Zhiwen; Chen, Ming; Guo, Luting; Yao, Li; Xiao, Shali
2016-12-01
In this paper, we introduce a new method to calibrate the absolute sensitivity of a soft X-ray streak camera (SXRSC). The calibrations are done in the static mode by using a small laser-produced X-ray source. A calibrated X-ray CCD is used as a secondary standard detector to monitor the X-ray source intensity. In addition, two sets of holographic flat-field grating spectrometers are used as the spectral discrimination systems of the SXRSC and the X-ray CCD. The absolute sensitivity of the SXRSC is obtained by comparing the signal counts of the SXRSC to the output counts of the X-ray CCD. Results show that the calibrated spectrum covers the range from 200 eV to 1040 eV. The change of the absolute sensitivity in the vicinity of the carbon K-edge can also be clearly seen. The experimental values agree with the calculated values to within 29%. Compared with previous calibration methods, the proposed method has several advantages: a wide spectral range, high accuracy, and simple data processing. Our calibration results can be used to make quantitative X-ray flux measurements in laser fusion research.
Novel hyperspectral prediction method and apparatus
NASA Astrophysics Data System (ADS)
Kemeny, Gabor J.; Crothers, Natalie A.; Groth, Gard A.; Speck, Kathy A.; Marbach, Ralf
2009-05-01
Both the power and the challenge of hyperspectral technologies lie in the very large amount of data produced by spectral cameras. While off-line methodologies allow the collection of gigabytes of data, extended data analysis sessions are required to convert the data into useful information. In contrast, real-time monitoring, such as on-line process control, requires that compression of spectral data and analysis occur at a sustained full camera data rate. Efficient, high-speed practical methods for calibration and prediction are therefore sought to optimize the value of hyperspectral imaging. A novel method of matched filtering known as science-based multivariate calibration (SBC) was developed for hyperspectral calibration. Classical (MLR) and inverse (PLS, PCR) methods are combined by spectroscopically measuring the spectral "signal" and statistically estimating the spectral "noise." The accuracy of the inverse model is thus combined with the easy interpretability of the classical model. The SBC method is optimized for hyperspectral data in the Hyper-Cal™ software used for the present work. The prediction algorithms can then be downloaded into a dedicated FPGA-based High-Speed Prediction Engine™ module. Spectral pretreatments and calibration coefficients are stored on interchangeable SD memory cards, and predicted compositions are produced on a USB interface at real-time camera output rates. Applications include minerals, pharmaceuticals, food processing and remote sensing.
Quantitative Measurements of X-ray Intensity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugh, M. J.; Schneider, M.
This chapter describes the characterization of several X-ray sources and their use in calibrating different types of X-ray cameras at National Security Technologies, LLC (NSTec). The cameras are employed in experimental plasma studies at Lawrence Livermore National Laboratory (LLNL), including the National Ignition Facility (NIF). The sources provide X-rays in the energy range from several hundred eV to 110 keV. The key to this effort is measuring the X-ray beam intensity accurately and traceable to international standards. This is accomplished using photodiodes of several types that are calibrated using radioactive sources and a synchrotron source using methods and materials that are traceable to the U.S. National Institute of Standards and Technology (NIST). The accreditation procedures are described. The chapter begins with an introduction to the fundamental concepts of X-ray physics. The types of X-ray sources that are used for device calibration are described. The next section describes the photodiode types that are used for measuring X-ray intensity: power measuring photodiodes, energy dispersive photodiodes, and cameras comprising photodiodes as pixel elements. Following their description, the methods used to calibrate the primary detectors, the power measuring photodiodes and the energy dispersive photodiodes, as well as the method used to get traceability to international standards are described. The X-ray source beams can then be measured using the primary detectors. The final section then describes the use of the calibrated X-ray beams to calibrate X-ray cameras. Many of the references are web sites that provide databases, explanations of the data and how it was generated, and data calculations for specific cases. Several general reference books related to the major topics are included. Papers expanding some subjects are cited.
Wang, Fei; Dong, Hang; Chen, Yanan; Zheng, Nanning
2016-12-09
Strong demands for accurate non-cooperative target measurement have arisen recently in assembly and capture tasks. Spherical objects are among the most common targets in these applications, but the performance of traditional vision-based reconstruction methods is limited in practice when handling poorly textured targets. In this paper, we propose a novel multi-sensor fusion system for measuring and reconstructing textureless non-cooperative spherical targets. Our system consists of four simple lasers and a visual camera. This paper presents a complete framework for estimating the geometric parameters of textureless spherical targets: (1) an approach to calibrate the extrinsic parameters between a camera and simple lasers; and (2) a method to reconstruct the 3D positions of the laser spots on the target surface and refine the results via an optimization scheme. The experimental results show that our proposed calibration method obtains a fine calibration result, comparable to state-of-the-art LRF-based methods, and that our calibrated system can estimate the geometric parameters with high accuracy in real time. PMID:27941705
Sky camera geometric calibration using solar observations
Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan
2016-09-05
A camera model and associated automated calibration procedure for stationary daytime sky imaging cameras is presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production, where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. In conclusion, calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
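The equisolid-angle projection mentioned above maps a ray's zenith angle θ to an image radius r = 2f·sin(θ/2). A minimal sketch of the forward and inverse mapping, assuming an ideal lens with focal length f and no distortion:

```python
import math

def equisolid_radius(theta, f):
    """Image radius (same units as f) of a ray at zenith angle theta
    under the equisolid-angle fisheye projection r = 2 f sin(theta/2)."""
    return 2.0 * f * math.sin(theta / 2.0)

def equisolid_angle(r, f):
    """Inverse mapping: zenith angle recovered from image radius."""
    return 2.0 * math.asin(r / (2.0 * f))
```

In the paper's setting, the residual between the sun's predicted and detected image-plane position under such a model is what the calibration minimizes.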
Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras.
Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki
2016-06-24
Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system.
PMID:27347961
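The human-object-based automatic calibration above relies on foot-to-head direction information; one standard building block is intersecting two foot-to-head image lines to obtain the vertical vanishing point. A hedged sketch in homogeneous coordinates (the paper's full 3D-human-model pipeline is not reproduced):

```python
def cross(a, b):
    """Cross product of two homogeneous 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def vertical_vanishing_point(foot1, head1, foot2, head2):
    """Intersect two foot-to-head image lines (each line is the cross
    product of its two homogeneous points) to get the vertical
    vanishing point used for scene calibration."""
    l1 = cross((*foot1, 1.0), (*head1, 1.0))
    l2 = cross((*foot2, 1.0), (*head2, 1.0))
    vx, vy, vw = cross(l1, l2)
    return (vx / vw, vy / vw)
```

A degenerate case (parallel image lines, vw ≈ 0) would need a guard in practice.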
NASA Astrophysics Data System (ADS)
Wang, Liqiang; Liu, Zhen; Zhang, Zhonghua
2014-11-01
Stereo vision is key in visual measurement, robot vision, and autonomous navigation. Before a stereo vision system can be used, the intrinsic parameters of each camera and the external parameters of the system must be calibrated. In engineering practice, the intrinsic parameters remain unchanged after the cameras are calibrated, but the positional relationship between the cameras can change because of vibration, knocks and pressure in the vicinity of railways or motor workshops. Especially for large baselines, even minute changes in translation or rotation can affect the epipolar geometry and scene triangulation to such a degree that the visual system becomes unusable. A technology for both real-time examination and on-line recalibration of the external parameters of a stereo system therefore becomes particularly important. This paper presents an on-line method for checking and recalibrating the positional relationship between stereo cameras. In epipolar geometry, the external parameters of the cameras can be obtained by factorization of the fundamental matrix, which offers a way to calculate the external camera parameters without any special targets. If the intrinsic camera parameters are known, the external parameters of the system can be calculated from a number of randomly matched points. The process is: (i) estimating the fundamental matrix from the feature point correspondences; (ii) computing the essential matrix from the fundamental matrix; (iii) obtaining the external parameters by decomposition of the essential matrix. In the step of computing the fundamental matrix, traditional methods are sensitive to noise and cannot ensure estimation accuracy. We consider the feature distribution in the actual scene images and introduce a regional weighted normalization algorithm to improve the accuracy of the fundamental matrix estimation.
In contrast to traditional algorithms, experiments on simulated data show that the method improves the robustness and accuracy of the fundamental matrix estimation. Finally, an experiment computing the relationship of a pair of stereo cameras demonstrates the accurate performance of the algorithm.
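Step (ii) of the process above, computing the essential matrix from the fundamental matrix, is the standard relation E = K₂ᵀ F K₁ for known intrinsic matrices of the two cameras. A minimal pure-Python sketch:

```python
def matmul3(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose3(A):
    """3x3 matrix transpose."""
    return [[A[j][i] for j in range(3)] for i in range(3)]

def essential_from_fundamental(F, K1, K2):
    """E = K2^T * F * K1, where K1 and K2 are the intrinsic matrices
    of camera 1 and camera 2 respectively."""
    return matmul3(matmul3(transpose3(K2), F), K1)
```

Step (iii), recovering the rotation and translation, then proceeds by SVD-based decomposition of E, which is omitted here.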
Akkaynak, Derya; Treibitz, Tali; Xiao, Bei; Gürkan, Umut A.; Allen, Justine J.; Demirci, Utkan; Hanlon, Roger T.
2014-01-01
Commercial off-the-shelf digital cameras are inexpensive and easy-to-use instruments that can be used for quantitative scientific data acquisition if images are captured in raw format and processed so that they maintain a linear relationship with scene radiance. Here we describe the image-processing steps required for consistent data acquisition with color cameras. In addition, we present a method for scene-specific color calibration that increases the accuracy of color capture when a scene contains colors that are not well represented in the gamut of a standard color-calibration target. We demonstrate applications of the proposed methodology in the fields of biomedical engineering, artwork photography, perception science, marine biology, and underwater imaging. PMID:24562030
Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System
Lu, Yu; Wang, Keyi; Fan, Gongshu
2016-01-01
A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over a 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor during radiometric response calibration, to eliminate the influence of the focusing effect of uniform light from an integrating sphere. The linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images, so that panoramas reflect the objective luminance more faithfully. This compensates for the limitation of making stitched images realistic through smoothing alone. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens. The dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
Mach-Zehnder based optical marker/comb generator for streak camera calibration
Miller, Edward Kirk
2015-03-03
This disclosure is directed to a method and apparatus for generating marker and comb indicia in an optical environment using a Mach-Zehnder (M-Z) modulator. High-speed recording devices are configured to record image or other data defining a high-speed event. To calibrate and establish a time reference, markers or combs serve as timing pulses (markers) or a constant-frequency train of optical pulses (comb) imaged on a streak camera for accurate time-based calibration and time reference. The system includes a camera, an optic signal generator which provides an optic signal to an M-Z modulator, and biasing and modulation signal generators configured to provide input to the M-Z modulator. An optical reference signal is provided to the M-Z modulator, which modulates the reference signal to a higher-frequency optical signal that is output through a fiber-coupled link to the streak camera.
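Once the comb's pulse centroids are located on the streak record, a simple least-squares line maps pixel position to time. A sketch assuming a linear sweep and a known comb period (real streak sweeps are often nonlinear and need higher-order fits):

```python
def fit_time_axis(pixel_positions, comb_period_ns):
    """Least-squares line t = a*p + b through comb pulse centroids,
    assuming pulse k arrives at time k * comb_period_ns.
    Returns (a, b): nanoseconds per pixel and time offset."""
    times = [k * comb_period_ns for k in range(len(pixel_positions))]
    n = len(pixel_positions)
    sp = sum(pixel_positions)
    st = sum(times)
    spp = sum(p * p for p in pixel_positions)
    spt = sum(p * t for p, t in zip(pixel_positions, times))
    a = (n * spt - sp * st) / (n * spp - sp * sp)
    b = (st - a * sp) / n
    return a, b
```

The fitted (a, b) then converts any pixel coordinate on the sweep axis into an absolute time.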
Research on three-dimensional reconstruction method based on binocular vision
NASA Astrophysics Data System (ADS)
Li, Jinlin; Wang, Zhihui; Wang, Minjun
2018-03-01
As a hot and difficult issue in computer vision, binocular stereo vision is an important form of computer vision with broad application prospects in many fields, such as aerial mapping, vision navigation, motion analysis and industrial inspection. In this paper, research is done into binocular stereo camera calibration, image feature extraction and stereo matching. In the camera calibration module, the internal parameters of a single camera are obtained using Zhang Zhengyou's checkerboard method. For image feature extraction and stereo matching, the SURF operator (a local feature operator) and the SGBM algorithm (a global matching algorithm) are adopted respectively, and their performance is compared. After the feature points are matched, the correspondence between matching points and 3D object points can be established using the calibrated camera parameters, yielding the 3D information.
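After stereo matching, the depth of each matched point in a rectified pair follows from triangulation as Z = f·B/d (focal length in pixels, baseline, disparity). A minimal sketch of this final step:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a matched point in a rectified stereo pair:
    Z = f * B / d, with f in pixels, baseline B in metres and
    disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

The calibrated intrinsics supply f, the extrinsics supply B, and the SGBM output supplies d per pixel.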
Mendikute, Alberto; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai
2017-01-01
Photogrammetry methods are being used more and more as a 3D technique for large scale metrology applications in industry. Optical targets are placed on an object and images are taken around it, where measuring traceability is provided by precise off-process pre-calibrated digital cameras and scale bars. According to the 2D target image coordinates, target 3D coordinates and camera views are jointly computed. One of the applications of photogrammetry is the measurement of raw part surfaces prior to its machining. For this application, post-process bundle adjustment has usually been adopted for computing the 3D scene. With that approach, a high computation time is observed, leading in practice to time consuming and user dependent iterative review and re-processing procedures until an adequate set of images is taken, limiting its potential for fast, easy-to-use, and precise measurements. In this paper, a new efficient procedure is presented for solving the bundle adjustment problem in portable photogrammetry. In-process bundle computing capability is demonstrated on a consumer grade desktop PC, enabling quasi real time 2D image and 3D scene computing. Additionally, a method for the self-calibration of camera and lens distortion has been integrated into the in-process approach due to its potential for highest precision when using low cost non-specialized digital cameras. Measurement traceability is set only by scale bars available in the measuring scene, avoiding the uncertainty contribution of off-process camera calibration procedures or the use of special purpose calibration artifacts. 
The developed self-calibrated in-process photogrammetry has been evaluated both in a pilot case scenario and in industrial scenarios for raw part measurement, showing a total in-process computing time typically below 1 s per image up to a maximum of 2 s during the last stages of the computed industrial scenes, along with a relative precision of 1/10,000 (e.g., 0.1 mm error in 1 m) with an error RMS below 0.2 pixels at the image plane, on par with the performance reported for portable photogrammetry with precise off-process pre-calibrated cameras. PMID:28891946
Mendikute, Alberto; Yagüe-Fabra, José A; Zatarain, Mikel; Bertelsen, Álvaro; Leizea, Ibai
2017-09-09
The new camera calibration system at the US Geological Survey
Light, D.L.
1992-01-01
Modern computerized photogrammetric instruments are capable of utilizing both radial and decentering camera calibration parameters, which can increase plotting accuracy over that of older analog instrumentation technology from previous decades. Also, recent design improvements in aerial cameras have minimized distortions and increased the resolving power of camera systems, which should improve the performance of the overall photogrammetric process. In concert with these improvements, the Geological Survey has adopted the rigorous mathematical model for camera calibration developed by Duane Brown. An explanation of the Geological Survey's calibration facility and the additional calibration parameters now being provided in the USGS calibration certificate are reviewed. -Author
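The radial and decentering parameters mentioned above follow Brown's distortion model. A hedged sketch of applying the model to normalized image coordinates (coefficient ordering and sign conventions vary between texts; the OpenCV-style form is used here):

```python
def brown_distort(x, y, k=(0.0, 0.0, 0.0), p=(0.0, 0.0)):
    """Apply radial (k1, k2, k3) and decentering/tangential (p1, p2)
    distortion to normalized image coordinates. Convention caveat:
    the assignment of p1/p2 differs between references."""
    k1, k2, k3 = k
    p1, p2 = p
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd
```

Calibration estimates the k and p coefficients; with all coefficients zero the mapping is the identity.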
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patankar, S.; Gumbrell, E. T.; Robinson, T. S.
Here we report a new method using high-stability, laser-driven supercontinuum generation in a liquid cell to calibrate the absolute photon response of fast optical streak cameras as a function of wavelength when operating at the fastest sweep speeds. A stable, pulsed white light source based on self-phase modulation in a salt solution was developed to provide the required brightness on picosecond timescales, enabling streak camera calibration in fully dynamic operation. The measured spectral brightness allowed for absolute photon response calibration over a broad spectral range (425-650 nm). Calibrations performed with two Axis Photonique streak cameras using the Photonis P820PSU streak tube demonstrated responses which qualitatively follow the photocathode response. Peak sensitivities were 1 photon/count above background. The absolute dynamic sensitivity is up to an order of magnitude lower than the static sensitivity. We attribute this to the dynamic response of the phosphor being lower.
Low Cost and Efficient 3D Indoor Mapping Using Multiple Consumer RGB-D Cameras
NASA Astrophysics Data System (ADS)
Chen, C.; Yang, B. S.; Song, S.
2016-06-01
Driven by the miniaturization and lightweight design of positioning and remote sensing sensors, as well as the urgent need to fuse indoor and outdoor maps for next-generation navigation, 3D indoor mapping from mobile scanning is a hot research and application topic. The point clouds with auxiliary data such as colour and infrared images derived from a 3D indoor mobile mapping suite can be used in a variety of novel applications, including indoor scene visualization, automated floorplan generation, gaming, reverse engineering, navigation, simulation, etc. State-of-the-art 3D indoor mapping systems equipped with multiple laser scanners produce accurate point clouds of building interiors containing billions of points. However, these laser-scanner-based systems are mostly expensive and not portable. Low-cost consumer RGB-D cameras provide an alternative way to solve the core challenge of indoor mapping, which is capturing the detailed underlying geometry of building interiors. Nevertheless, RGB-D cameras have a very limited field of view, resulting in low efficiency in the data collection stage and incomplete datasets missing major building structures (e.g. ceilings, walls). Endeavouring to collect a complete scene without data blanks using a single RGB-D camera is not technically sound because of the large amount of human labour and the number of position parameters that need to be solved. To find an efficient and low-cost way to perform 3D indoor mapping, in this paper we present an indoor mapping suite prototype built upon a novel calibration method which calibrates the internal and external parameters of multiple RGB-D cameras. Three Kinect sensors are mounted on a rig with different view directions to form a large field of view.
The calibration procedure is threefold: (1) the internal parameters of the colour and infrared cameras inside each Kinect are calibrated using a chessboard pattern; (2) the external parameters between the colour and infrared camera inside each Kinect are calibrated using a chessboard pattern; (3) the external parameters between the Kinects are first calculated using a pre-set calibration field and further refined by an iterative closest point algorithm. Experiments are carried out to validate the proposed method on RGB-D datasets collected by the indoor mapping suite prototype. The effectiveness and accuracy of the proposed method are evaluated by comparing the point clouds derived from the prototype with ground truth data collected by a commercial terrestrial laser scanner at ultra-high density. The overall analysis of the results shows that the proposed method achieves seamless integration of multiple point clouds from different RGB-D cameras collected at 30 frames per second.
Cano-García, Angel E.; Lazaro, José Luis; Infante, Arturo; Fernández, Pedro; Pompa-Chacón, Yamilet; Espinoza, Felipe
2012-01-01
In this study, a camera to infrared diode (IRED) distance estimation problem was analyzed. The main objective was to define an alternative way to measure depth using only the information extracted from the pixel grey levels of the IRED image to estimate the distance between the camera and the IRED. In this paper, the standard deviation of the pixel grey level in the region of interest containing the IRED image is proposed as an empirical parameter to define a model for estimating camera to emitter distance. This model includes the camera exposure time, the IRED radiant intensity and the distance between the camera and the IRED. An expression for the standard deviation model related to these magnitudes was also derived and calibrated using different images taken under different conditions. From this analysis, we determined the optimum parameters to ensure the best accuracy provided by this alternative. Once the model calibration had been carried out, a differential method to estimate the distance between the camera and the IRED was defined and applied, assuming that the camera was aligned with the IRED. The results indicate that this method represents a useful alternative for determining depth information. PMID:22778608
Motion capture for human motion measuring by using single camera with triangle markers
NASA Astrophysics Data System (ADS)
Takahashi, Hidenori; Tanaka, Takayuki; Kaneko, Shun'ichi
2005-12-01
This study aims to realize motion capture for measuring 3D human motion using a single camera. Although motion capture using multiple cameras is widely used in the sports, medical, and engineering fields, an optical motion capture method with one camera has not been established. In this paper, the authors achieve 3D motion capture using one camera, named Mono-MoCap (MMC), on the basis of two calibration methods and triangle markers whose side lengths are known. The camera calibration produces 3D coordinate transformation parameters and a lens distortion parameter with the modified DLT method. The triangle markers enable calculation of the coordinate value in the depth direction of the camera coordinate system. Experiments of 3D position measurement using the MMC in a cubic measurement space of 2 m per side show that the average error in the measured centre of gravity of a triangle marker was less than 2 mm. Compared with conventional multi-camera motion capture, the MMC has sufficient accuracy for 3D measurement. By putting a triangle marker on each human joint, the MMC was able to capture walking, standing-up, and bending-and-stretching motions. In addition, a method using a triangle marker together with conventional spherical markers is proposed. Finally, a method to estimate the position of a marker by measuring its velocity is proposed in order to improve the accuracy of the MMC.
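The depth cue a triangle marker provides can be illustrated with the simplest pinhole relation: an edge of known physical length L that appears l pixels long at a fronto-parallel pose lies at depth Z = f·L/l. A toy sketch of this similar-triangles step (the paper's modified-DLT formulation handles the general oblique case):

```python
def depth_from_known_length(side_len_m, side_len_px, focal_px):
    """Pinhole similar-triangles depth estimate: a marker edge of known
    physical length L (metres) appearing l pixels long, viewed
    fronto-parallel with focal length f (pixels), lies at Z = f*L/l."""
    return focal_px * side_len_m / side_len_px
```

Because a triangle has three edges of known length, an oblique marker over-determines its pose, which is what makes single-camera 3D recovery possible.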
UAV Cameras: Overview and Geometric Calibration Benchmark
NASA Astrophysics Data System (ADS)
Cramer, M.; Przybilla, H.-J.; Zurhorst, A.
2017-08-01
Different UAV platforms and sensors are already used in mapping, many of them equipped with (sometimes modified) cameras known from the consumer market. Even though these systems normally fulfil their requested mapping accuracy, the question arises: which system performs best? This calls for a benchmark to check selected UAV-based camera systems in well-defined, reproducible environments. Such a benchmark is attempted in this work. Nine different cameras used on UAV platforms, representing typical camera classes, are considered. The focus here is on geometry, which is tightly linked to the geometric calibration of the system. In most applications the calibration is performed in-situ, i.e. calibration parameters are obtained as part of the project data itself. This is often motivated by the fact that consumer cameras do not keep constant interior geometry and thus cannot be seen as metric cameras. Still, some of the commercial systems are quite stable over time, as proven by repeated (terrestrial) calibration runs. Pre-calibrated systems may offer advantages, especially when the block geometry of the project does not allow for a stable and sufficient in-situ calibration; in such scenarios close-to-metric UAV cameras may have advantages. Empirical airborne test flights in a calibration field have shown how block geometry influences the estimated calibration parameters and how consistently the parameters from lab calibration can be reproduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.
2015-01-12
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high-energy-density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for use in a production environment.
A single camera photogrammetry system for multi-angle fast localization of EEG electrodes.
Qian, Shuo; Sheng, Yang
2011-11-01
Photogrammetry has become an effective method for the determination of electroencephalography (EEG) electrode positions in three dimensions (3D). Capturing multi-angle images of the electrodes on the head is a fundamental objective in the design of photogrammetry system for EEG localization. Methods in previous studies are all based on the use of either a rotating camera or multiple cameras, which are time-consuming or not cost-effective. This study aims to present a novel photogrammetry system that can realize simultaneous acquisition of multi-angle head images in a single camera position. Aligning two planar mirrors with the angle of 51.4°, seven views of the head with 25 electrodes are captured simultaneously by the digital camera placed in front of them. A complete set of algorithms for electrode recognition, matching, and 3D reconstruction is developed. It is found that the elapsed time of the whole localization procedure is about 3 min, and camera calibration computation takes about 1 min, after the measurement of calibration points. The positioning accuracy with the maximum error of 1.19 mm is acceptable. Experimental results demonstrate that the proposed system provides a fast and cost-effective method for the EEG positioning.
Fly-through viewpoint video system for multi-view soccer movie using viewpoint interpolation
NASA Astrophysics Data System (ADS)
Inamoto, Naho; Saito, Hideo
2003-06-01
This paper presents a novel method for virtual view generation that allows viewers to fly through a real soccer scene. A soccer match is captured by multiple cameras at a stadium, and images of arbitrary viewpoints are synthesized by view interpolation of two real camera images near the given viewpoint. In the proposed method, the cameras do not need to be strongly calibrated; the epipolar geometry between the cameras is sufficient for the view interpolation. Therefore, the method can easily be applied to a dynamic event even in a large space, because the effort for camera calibration is reduced. A soccer scene is classified into several regions, and virtual view images are generated based on the epipolar geometry in each region. Superimposition of the images completes virtual views for the whole soccer scene. An application for fly-through observation of a soccer match is introduced, along with the view-synthesis algorithm and experimental results.
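The view-interpolation idea can be sketched at its simplest: positions of matched points are blended linearly between the two real views (the paper additionally uses the epipolar geometry to establish the correspondences region by region):

```python
def interpolate_view(pts_a, pts_b, alpha):
    """Synthesize point positions for a virtual viewpoint between two
    real cameras by linear view interpolation:
    x_v = (1 - alpha) * x_a + alpha * x_b for each matched pair,
    with alpha in [0, 1] selecting the virtual camera position."""
    return [((1 - alpha) * xa + alpha * xb, (1 - alpha) * ya + alpha * yb)
            for (xa, ya), (xb, yb) in zip(pts_a, pts_b)]
```

Sweeping alpha from 0 to 1 produces the fly-through between the two real camera views.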
NASA Astrophysics Data System (ADS)
Shahbazi, M.; Sattari, M.; Homayouni, S.; Saadatseresht, M.
2012-07-01
Recent advances in positioning techniques have made it possible to develop Mobile Mapping Systems (MMS) for the detection and 3D localization of various objects from a moving platform. At the same time, automatic traffic sign recognition from an equipped mobile platform has recently been a challenging issue for both intelligent transportation and municipal database collection. However, several inevitable problems are inherent in all recognition methods that rely completely on passive chromatic or grayscale images. This paper presents the implementation and evaluation of an operational MMS. Distinct from others, the developed MMS comprises one range camera based on Photonic Mixer Device (PMD) technology and one standard 2D digital camera. The system benefits from certain algorithms to detect, recognize and localize traffic signs by fusing shape, color and object information from both range and intensity images. For the calibration stage, a self-calibration method based on integrated bundle adjustment via a joint setup with the digital camera is applied for PMD camera calibration. As a result, an improvement of 83% in the RMS of the range error and 72% in the RMS of the coordinate residuals for the PMD camera, over that achieved with basic calibration, is realized in independent accuracy assessments. Furthermore, conventional photogrammetric techniques based on controlled network adjustment are utilized for platform calibration. Likewise, the well-known Extended Kalman Filter (EKF) is applied to integrate the navigation sensors, namely GPS and INS. The overall acquisition system, along with the proposed techniques, leads to 90% true positive recognition and an average 3D positioning accuracy of 12 centimetres.
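The GPS/INS integration via Kalman filtering mentioned above can be illustrated with a one-dimensional toy filter: the INS supplies a motion increment for the predict step and GPS supplies a position fix for the update step (the actual EKF is multi-dimensional and nonlinear; this is only a sketch of the predict/update cycle):

```python
def kalman_step(x, P, u, z, q, r):
    """One predict/update cycle of a 1-D Kalman filter fusing an INS
    displacement increment u (process noise variance q) with a GPS
    position fix z (measurement noise variance r).
    Returns the updated state estimate and variance (x, P)."""
    # Predict: dead-reckon with the inertial increment.
    x_pred = x + u
    P_pred = P + q
    # Update: blend in the GPS fix with the Kalman gain.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new
```

Between GPS fixes, only the predict step runs, so the variance P grows until the next update shrinks it.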
Application of a laser scanner to three dimensional visual sensing tasks
NASA Technical Reports Server (NTRS)
Ryan, Arthur M.
1992-01-01
The issues associated with using a laser scanner for visual sensing are described, along with the methods the author developed to address them. A laser scanner is a device that controls the direction of a laser beam by deflecting it through a pair of orthogonal mirrors whose orientations are specified by a computer. If a calibrated laser scanner is combined with a calibrated camera, three-dimensional sensing can be performed by directing the laser at objects within the field of view of the camera. Several issues must be addressed in order to use a laser scanner effectively for three-dimensional visual sensing. First, methods are needed to calibrate the laser scanner. Second, methods are required to estimate three-dimensional points using a calibrated camera and laser scanner. Third, methods are required for locating the laser spot in a cluttered image. Fourth, mathematical models are necessary that predict the laser scanner's performance and provide structure for three-dimensional data points. Several methods were developed to address each of these issues and were evaluated to determine how and when they should be applied. The theoretical development, the implementation, and results from use in a dual-arm, eighteen-degree-of-freedom robotic system for space assembly are described.
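Estimating a 3-D point from a calibrated camera and laser scanner amounts to intersecting two rays that, with noise, never quite meet. A common sketch (not necessarily the author's formulation) takes the midpoint of the shortest segment between the rays; the geometry below is invented for illustration:

```python
# Sketch of ray-ray triangulation for camera + laser sensing: the 3-D
# point is the midpoint of the shortest segment between the camera ray
# and the laser ray, found by solving the 2x2 normal equations.

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))

def triangulate(p1, d1, p2, d2):
    """Rays: p1 + t*d1 (camera), p2 + s*d2 (laser)."""
    w = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    den = a * c - b * b                 # zero iff the rays are parallel
    t = (b * e - c * d) / den
    s = (a * e - b * d) / den
    q1 = [p1[i] + t * d1[i] for i in range(3)]
    q2 = [p2[i] + s * d2[i] for i in range(3)]
    return [(q1[i] + q2[i]) / 2 for i in range(3)]

# Two rays that intersect exactly at (1, 1, 5):
pt = triangulate([0, 0, 0], [1, 1, 5], [2, 0, 0], [-1, 1, 5])
print([round(v, 6) for v in pt])  # -> [1.0, 1.0, 5.0]
```

With noisy rays the midpoint gives a least-squares compromise, and the residual segment length is a useful quality measure for the estimated point.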
Watching elderly and disabled person's physical condition by remotely controlled monorail robot
NASA Astrophysics Data System (ADS)
Nagasaka, Yasunori; Matsumoto, Yoshinori; Fukaya, Yasutoshi; Takahashi, Tomoichi; Takeshita, Toru
2001-10-01
We are developing a nursing-care support system using robots and cameras. The cameras are mounted on a remotely controlled monorail robot that moves around a room and watches an elderly person. Elderly people at home or in nursing homes need attention at all times, which requires staff to watch them continuously; the purpose of our system is to assist those staff and improve this situation. A host computer steers the monorail robot to a position in front of the person, using images taken by cameras on the ceiling. A CCD camera mounted on the monorail robot takes pictures of the person's facial expression and movements, and the robot sends the images to the host computer, which checks them for anything unusual. We propose a simple calibration method for positioning the monorail robot so that it tracks the person's movements and keeps the face at the center of the camera view. We built a small experimental system and evaluated our camera calibration method and image processing algorithm.
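For a fixed ceiling camera over a flat floor, a simple calibration of the kind described often reduces to a planar homography from pixels to floor coordinates. A minimal sketch, with a made-up scaling homography standing in for a real calibration:

```python
# Sketch: a planar homography H maps a ceiling-camera pixel to floor
# coordinates, a common simple calibration for overhead tracking.
# H below is a made-up pure-scaling example, not the paper's result.

def apply_homography(H, u, v):
    """Map pixel (u, v) through 3x3 H in homogeneous coordinates."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w

# Assumed: 100 pixels per metre, floor origin at pixel (0, 0).
H = [[0.01, 0, 0], [0, 0.01, 0], [0, 0, 1]]
print(apply_homography(H, 250, 400))  # -> (2.5, 4.0) metres
```

The host computer can then convert the person's pixel position into a floor target for the monorail robot.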
Concerning the Video Drift Method to Measure Double Stars
NASA Astrophysics Data System (ADS)
Nugent, Richard L.; Iverson, Ernest W.
2015-05-01
Classical methods of measuring the position angles and separations of double stars rely on just a few measurements, obtained either visually or photographically. Visual and photographic CCD observations are subject to errors from several sources: misalignment of the eyepiece, camera, Barlow lens, micrometer or focal reducer; systematic errors from uncorrected optical distortions; aberrations of the telescope system; camera tilt; and magnitude and color effects. Conventional video methods rely on calibration doubles, graphical determination of the east-west direction, and careful selection of video frames to stack for measurement. Atmospheric motion, on the order of 0.5-1.5, is one of the larger sources of error in any exposure/measurement method. Ideally, if a single short video could yield position angle and separation, with each data set self-calibrating independently of any calibration doubles or star catalogues, the measurements would have high systematic accuracy. These aims are achieved by the video drift method first proposed by the authors in 2011. This self-calibrating video method automatically analyzes thousands of measurements from a short video clip.
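Once the drift has fixed the east-west direction and plate scale, each frame yields a position angle and separation from the pixel offset between the two stars. A minimal sketch of that final conversion, with invented offsets and plate scale:

```python
# Sketch: position angle (degrees from north through east) and
# separation (arcsec) of the secondary relative to the primary,
# given pixel offsets along east and north and an assumed plate
# scale. Values are illustrative, not from the paper.
import math

def pa_and_sep(dx_east, dy_north, arcsec_per_px):
    pa = math.degrees(math.atan2(dx_east, dy_north)) % 360.0
    sep = math.hypot(dx_east, dy_north) * arcsec_per_px
    return pa, sep

pa, sep = pa_and_sep(10.0, 10.0, 0.5)
print(round(pa, 1), round(sep, 2))  # -> 45.0 7.07
```

Averaging this over thousands of frames is what lets the drift method beat single-exposure measurements on systematic accuracy.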
A real-time camera calibration system based on OpenCV
NASA Astrophysics Data System (ADS)
Zhang, Hui; Wang, Hua; Guo, Huinan; Ren, Long; Zhou, Zuofeng
2015-07-01
Camera calibration is one of the essential steps in computer vision research. This paper describes a real-time camera calibration system based on OpenCV, developed and implemented in the VS2008 environment. Experimental results show that the system achieves simple and fast camera calibration with higher precision than MATLAB, requires no manual intervention, and can be widely used in computer vision systems.
Calibration Techniques for Accurate Measurements by Underwater Camera Systems
Shortis, Mark
2015-01-01
Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems. PMID:26690172
Calibration and verification of thermographic cameras for geometric measurements
NASA Astrophysics Data System (ADS)
Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.
2011-03-01
Infrared thermography is a technique with a growing degree of development and range of applications. Quality assessment of measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire both temperature and geometric information, yet calibration and verification procedures are usually applied only to the thermal data, using black bodies. The geometric information, however, is important in many fields, such as architecture, civil engineering and industry. This work presents a calibration procedure that enables photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras; these results allow the information to be incorporated into companies' quality control processes. A grid based on burning lamps is used for the geometric calibration of the thermographic cameras. The artefact designed for geometric verification consists of five Delrin spheres and seven cubes of different sizes, with metrological traceability obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible; reflectivity was chosen as the target property because both thermographic and visible cameras can detect it. Two thermographic cameras, from FLIR and NEC, and one visible camera from JAI are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps proves capable of performing the interior orientation of the thermal cameras. Verification results show repeatability better than 1 mm in all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between the thermographic cameras shows slightly better results for the NEC camera.
Infrared stereo calibration for unmanned ground vehicle navigation
NASA Astrophysics Data System (ADS)
Harguess, Josh; Strange, Shawn
2014-06-01
The problem of calibrating two color cameras as a stereo pair has been heavily researched, and many off-the-shelf software packages, such as the Robot Operating System and OpenCV, include calibration routines that work in most cases. However, the problem of calibrating two infrared (IR) cameras for sensor fusion and point cloud generation is relatively new, and many challenges exist. We present a comparison of color camera and IR camera stereo calibration using data from an unmanned ground vehicle. There are two main challenges in IR stereo calibration: the calibration board (material, design, etc.) and the accuracy of calibration pattern detection. We present our analysis of these challenges along with our IR stereo calibration methodology. Finally, we present our results both visually and analytically, with computed reprojection errors.
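The reprojection error reported above is the standard calibration metric: project the known 3-D points through the estimated camera model and measure how far they land from the detected pattern corners. A minimal sketch with an invented pinhole camera and points:

```python
# Sketch: RMS reprojection error for a pinhole camera (no distortion
# terms). Intrinsics and points are invented for illustration; real
# calibrations also model lens distortion and extrinsics.
import math

def project(fx, fy, cx, cy, X, Y, Z):
    """Pinhole projection of a camera-frame point (X, Y, Z)."""
    return fx * X / Z + cx, fy * Y / Z + cy

def rms_reprojection_error(intrinsics, points3d, points2d):
    fx, fy, cx, cy = intrinsics
    se = 0.0
    for (X, Y, Z), (u, v) in zip(points3d, points2d):
        up, vp = project(fx, fy, cx, cy, X, Y, Z)
        se += (up - u) ** 2 + (vp - v) ** 2
    return math.sqrt(se / len(points3d))

pts3d = [(0.1, 0.2, 2.0), (-0.3, 0.1, 2.5)]
pts2d = [project(800, 800, 320, 240, *p) for p in pts3d]  # perfect fit
print(rms_reprojection_error((800, 800, 320, 240), pts3d, pts2d))  # -> 0.0
```

For IR calibration the harder part is getting accurate `points2d` at all, since pattern corners are much noisier in thermal imagery; the error metric itself is unchanged.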
Calibration of imaging parameters for space-borne airglow photography using city light positions
NASA Astrophysics Data System (ADS)
Hozumi, Yuta; Saito, Akinori; Ejiri, Mitsumu K.
2016-09-01
A new method for calibrating imaging parameters of photographs taken from the International Space Station (ISS) is presented in this report. Airglow in the mesosphere and the F-region ionosphere was captured on the limb of the Earth with a digital single-lens reflex camera from the ISS by astronauts. To utilize the photographs as scientific data, imaging parameters, such as the angle of view, exact position, and orientation of the camera, should be determined because they are not measured at the time of imaging. A new calibration method using city light positions shown in the photographs was developed to determine these imaging parameters with high accuracy suitable for airglow study. Applying the pinhole camera model, the apparent city light positions on the photograph are matched with the actual city light locations on Earth, which are derived from the global nighttime stable light map data obtained by the Defense Meteorological Satellite Program satellite. The correct imaging parameters are determined in an iterative process by matching the apparent positions on the image with the actual city light locations. We applied this calibration method to photographs taken on August 26, 2014, and confirmed that the result is correct. The precision of the calibration was evaluated by comparing the results from six different photographs with the same imaging parameters. The precisions in determining the camera position and orientation are estimated to be ±2.2 km and ±0.08°, respectively. The 0.08° difference in the orientation yields a 2.9-km difference at a tangential point of 90 km in altitude. The airglow structures in the photographs were mapped to geographical points using the calibrated imaging parameters and compared with a simultaneous observation by the Visible and near-Infrared Spectral Imager of the Ionosphere, Mesosphere, Upper Atmosphere, and Plasmasphere mapping mission installed on the ISS. 
The comparison shows good agreement and supports the validity of the calibration. This calibration technique makes it possible to use nighttime photographs taken from low-Earth-orbit satellites as a reference for airglow and aurora structures.
NASA Astrophysics Data System (ADS)
García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier
2016-04-01
Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and on data from different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor to the calibration of a panoramic camera-LiDAR system, where both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using Monte Carlo and Latin hypercube sampling techniques. The sensitivity of each variable involved in the calibration was computed with the Sobol method, which is based on analysis of the variance breakdown, and with the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and in performing error propagation assessments.
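The simulated tests above contrast Monte Carlo with Latin hypercube sampling. The difference is easiest to see in one dimension: LHS splits the range into equal strata and draws exactly one sample per stratum, guaranteeing coverage that plain random sampling does not. A minimal 1-D sketch (the seed and stratum count are arbitrary):

```python
# Sketch of 1-D Latin hypercube sampling: [0, 1) is split into n
# equal strata and one uniform sample is drawn inside each, then
# the samples are shuffled.
import random

def latin_hypercube_1d(n, seed=42):
    rng = random.Random(seed)
    samples = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(samples)
    return samples

s = latin_hypercube_1d(10)
strata = sorted(int(x * 10) for x in s)
print(strata)  # every stratum hit exactly once
```

Multidimensional LHS pairs independently shuffled 1-D designs per variable, which is what makes it attractive for sampling many calibration parameters at once.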
Precise X-ray and video overlay for augmented reality fluoroscopy.
Chen, Xin; Wang, Lejing; Fallavollita, Pascal; Navab, Nassir
2013-01-01
The camera-augmented mobile C-arm (CamC) augments any mobile C-arm with a video camera and mirror construction and provides co-registration of X-ray and video images. An accurate overlay between these images is crucial to high-quality surgical outcomes. In this work, we propose a practical solution that improves the overlay accuracy for any C-arm orientation by (i) improving the existing CamC calibration, (ii) removing distortion effects, and (iii) accounting for the mechanical sagging of the C-arm gantry due to gravity. A planar phantom is constructed and placed at different distances from the image intensifier in order to obtain the optimal homography that co-registers X-ray and video with minimum error. To alleviate distortion, both an X-ray calibration based on an equidistant grid model and Zhang's camera calibration method are implemented for distortion correction. In addition, the virtual detector plane (VDP) method is adapted and integrated to reduce errors due to the mechanical sagging of the C-arm gantry. The overlay errors are 0.38±0.06 mm when not correcting for distortion, 0.27±0.06 mm when applying Zhang's camera calibration, and 0.27±0.05 mm when applying the X-ray calibration. Finally, when taking into account all angular and orbital rotations of the C-arm, as well as correcting for distortion, the overlay errors are 0.53±0.24 mm with VDP and 1.67±1.25 mm without it. The augmented reality fluoroscope achieves an accurate video and X-ray overlay when applying the optimal homography calculated from distortion correction using X-ray calibration together with the VDP.
Automatic and robust extrinsic camera calibration for high-accuracy mobile mapping
NASA Astrophysics Data System (ADS)
Goeman, Werner; Douterloigne, Koen; Bogaert, Peter; Pires, Rui; Gautama, Sidharta
2012-10-01
A mobile mapping system (MMS) is the geoinformation community's answer to the exponentially growing demand for various geospatial data captured by multiple sensors at increasingly higher accuracies. As mobile mapping technology is pushed toward applications on water, rail, and road, the need emerges for an external sensor calibration procedure that is portable, fast and easy to perform, so that sensors can be mounted and demounted according to application requirements without time-consuming calibration procedures. A new methodology is presented for high-quality external calibration of cameras that is automatic, robust and foolproof. The MMS uses an Applanix POSLV420, a tightly coupled GPS/INS positioning system, and Point Grey color video cameras synchronized with the GPS/INS system. The method uses a portable, standard ranging pole positioned on a known ground control point, reducing the calibration to a well-studied absolute orientation problem. A mutual-information-based image registration technique is studied for automatic alignment of the ranging pole. Finally, benchmarking tests under various lighting conditions prove the methodology's robustness, showing high absolute stereo measurement accuracies of a few centimeters.
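Mutual information, the similarity measure behind the automatic ranging-pole alignment above, scores how predictable one image's intensities are from the other's without assuming identical brightness. A histogram-based sketch on toy 1-D intensity sequences (a real implementation bins 2-D images the same way):

```python
# Sketch: histogram-based mutual information between two intensity
# sequences, in nats. Toy data; illustrative only.
import math
from collections import Counter

def mutual_information(a, b):
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum(c / n * math.log((c / n) / (pa[x] / n * pb[y] / n))
               for (x, y), c in pab.items())

img = [0, 0, 1, 1, 2, 2]
shifted = [2, 2, 0, 0, 1, 1]   # same structure, relabelled intensities
flat = [0, 0, 0, 0, 0, 0]      # carries no information about img
print(mutual_information(img, shifted) > mutual_information(img, flat))
```

The relabelled sequence scores maximally because MI cares only about statistical dependence, which is exactly why it tolerates the lighting changes mentioned in the benchmarking tests.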
Hand-eye calibration using a target registration error model.
Chen, Elvis C S; Morgan, Isabella; Jayarathne, Uditha; Ma, Burton; Peters, Terry M
2017-10-01
Surgical cameras are prevalent in modern operating theatres and are often used as a surrogate for direct vision. Visualisation techniques (e.g. image fusion) made possible by tracking the camera require accurate hand-eye calibration between the camera and the tracking system. The authors introduce the concept of 'guided hand-eye calibration', where calibration measurements are facilitated by a target registration error (TRE) model. They formulate hand-eye calibration as a registration problem between homologous point-line pairs. For each measurement, the position of a monochromatic ball-tip stylus (a point) and its projection onto the image (a line) is recorded, and the TRE of the resulting calibration is predicted using a TRE model. The TRE model is then used to guide the placement of the calibration tool, so that the subsequent measurement minimises the predicted TRE. Assessing TRE after each measurement produces accurate calibration using a minimal number of measurements. As a proof of principle, they evaluated guided calibration using a webcam and an endoscopic camera. Their endoscopic camera results suggest that millimetre TRE is achievable when at least 15 measurements are acquired with the tracker sensor ∼80 cm away on the laparoscope handle for a target ∼20 cm away from the camera.
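The homologous point-line formulation above scores a candidate calibration by how far each stylus-tip point lies from its back-projected image ray. A minimal sketch of that point-to-line distance in 3-D, with invented geometry:

```python
# Sketch: distance from a 3-D point (the tracked stylus tip) to a
# 3-D line (the ray back-projected from its image). Numbers are
# illustrative, not from the paper.
import math

def point_line_distance(p, origin, direction):
    """Distance from point p to the line origin + t*direction."""
    d = [p[i] - origin[i] for i in range(3)]
    n = math.sqrt(sum(x * x for x in direction))
    u = [x / n for x in direction]          # unit direction
    t = sum(d[i] * u[i] for i in range(3))  # projection onto the line
    closest = [origin[i] + t * u[i] for i in range(3)]
    return math.dist(p, closest)

# Stylus tip 3 mm off a ray along +z:
print(point_line_distance((3.0, 0.0, 10.0), (0.0, 0.0, 0.0),
                          (0.0, 0.0, 1.0)))  # -> 3.0
```

Summing such distances over all point-line pairs gives the residual that the TRE model predicts and the guided placement tries to minimise.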
Wu, Defeng; Chen, Tianfei; Li, Aiguo
2016-08-30
A robot-based three-dimensional (3D) measurement system is presented in which a structured light vision sensor is mounted on the arm of an industrial robot. Measurement accuracy is one of the most important aspects of any 3D measurement system, so a novel sensor calibration approach is proposed to improve the accuracy of the structured light vision sensor. The approach is based on a number of fixed concentric circles manufactured into a calibration target; the concentric circles are used to determine the true projected centres of the circles. A calibration point generation procedure is then carried out with the help of the calibrated robot. When enough calibration points are available, the radial alignment constraint (RAC) method is adopted to calibrate the camera model. A multilayer perceptron neural network (MLPNN) is then employed to identify the calibration residuals remaining after application of the RAC method, so the real camera model is represented by a hybrid of the pinhole model and the MLPNN. Using a standard ball to validate the effectiveness of the presented technique, the experimental results demonstrate that the proposed calibration approach achieves a highly accurate model of the structured light vision sensor.
Implicit multiplane 3D camera calibration matrices for stereo image processing
NASA Astrophysics Data System (ADS)
McKee, James W.; Burgett, Sherrie J.
1997-12-01
By implicit camera calibration, we mean the process of calibrating cameras without explicitly computing their physical parameters. We introduce a new implicit model based on a generalized mapping between an image plane and multiple parallel calibration planes (usually four to seven planes). This paper presents a method of computing the relationship between a point on a three-dimensional (3D) object and its corresponding two-dimensional (2D) coordinate in a camera image. This relationship is expanded into a mapping of points in 3D space to points in image (camera) space, and vice versa, that requires only matrix multiplication operations. The paper presents the rationale behind the selection of the forms of four matrices and the algorithms to calculate their parameters. Two of the matrices map 3D points in object space to 2D points on the CCD camera image plane; the other two map 2D points on the image plane to points on user-defined planes in 3D object space. The mappings include compensation for lens distortion and measurement errors. The number of parameters can be increased in a straightforward fashion to obtain any user-desired accuracy. Previous camera calibration methods use a fixed number of parameters, which can limit the obtainable accuracy, and most require the solution of nonlinear equations. The procedure presented can be used to calibrate a single camera for 2D measurements or stereo cameras for 3D measurements. Positional accuracy of better than 3 parts in 10,000 has been achieved. The algorithms in this paper were developed and implemented in MATLAB (a registered trademark of The MathWorks, Inc.). We have developed a system to analyze the path of optical fiber during high-speed payout (unwinding) of the fiber off a bobbin.
This requires recording and analyzing high-speed (5 microsecond exposure time), synchronous, stereo images of the optical fiber during payout. A 3D equation for the fiber at an instant in time is calculated from the corresponding pair of stereo images as follows. In each image, about 20 points along the 2D projection of the fiber are located. Each of these fiber points in one image is mapped to its projection line in 3D space, and each projection line is mapped into a line in the second image. The intersection of each mapped projection line with a curve fitted to the fiber points of the second image is calculated. Each intersection point is mapped back to 3D space, and a 3D fiber coordinate is formed from the intersection, in 3D space, of a mapped intersection point with its corresponding projection line. The 3D equation for the fiber is computed from this ordered list of 3D coordinates. This process requires a method of accurately mapping 2D (image space) to 3D (object space) and vice versa.
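The core of the multiplane idea is that a pixel maps to a point on each calibration plane, and the line through those mapped points is the pixel's 3-D projection line. A minimal sketch with made-up affine per-plane mappings standing in for the paper's higher-order matrices:

```python
# Sketch of the implicit multiplane idea: a pixel is mapped onto two
# parallel calibration planes (here via invented affine mappings),
# and the 3-D projection line passes through the two mapped points.

def map_to_plane(M, u, v):
    """Toy per-plane mapping: affine pixel -> plane coordinates."""
    return (M[0][0] * u + M[0][1] * v + M[0][2],
            M[1][0] * u + M[1][1] * v + M[1][2])

def projection_line(M1, z1, M2, z2, u, v):
    """3-D line through the pixel's images on planes z1 and z2,
    returned as (point, direction)."""
    x1, y1 = map_to_plane(M1, u, v)
    x2, y2 = map_to_plane(M2, u, v)
    return (x1, y1, z1), (x2 - x1, y2 - y1, z2 - z1)

M_near = [[0.01, 0, 0], [0, 0.01, 0]]   # assumed plane at z = 100
M_far = [[0.02, 0, 0], [0, 0.02, 0]]    # assumed plane at z = 200
p, d = projection_line(M_near, 100.0, M_far, 200.0, 50, 80)
print(p, d)
```

With a projection line per fiber point in each image, the stereo intersection procedure described above reduces to line-curve and line-line intersections.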
Robot calibration with a photogrammetric on-line system using reseau scanning cameras
NASA Astrophysics Data System (ADS)
Diewald, Bernd; Godding, Robert; Henrich, Andreas
1994-03-01
Testing and calibration of industrial robots is becoming more and more important for manufacturers and users of such systems. Demanding applications involving off-line programming techniques, or the use of robots as measuring machines, are impossible without a preceding robot calibration. At the LPA an efficient calibration technique has been developed: instead of modeling the kinematic behavior of the robot, the new method describes the pose deviations within a user-defined section of the robot's working space. High-precision determination of the 3D coordinates of defined path positions is necessary for calibration and can be done by digital photogrammetric systems. For the calibration of a robot at the LPA, a digital photogrammetric system with three Rollei Reseau Scanning Cameras was used; this system allows automatic measurement of a large number of robot poses with high accuracy.
On the absolute calibration of SO2 cameras
Lübcke, Peter; Bobrowski, Nicole; Illing, Sebastian; Kern, Christoph; Alvarez Nieves, Jose Manuel; Vogel, Leif; Zielcke, Johannes; Delgados Granados, Hugo; Platt, Ulrich
2013-01-01
This work investigates the uncertainty of results obtained through two commonly used but quite different calibration methods: DOAS and calibration cells. Measurements with three different instruments, an SO2 camera, an NFOV-DOAS system and an Imaging DOAS (I-DOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system, and the respective results are compared with measurements from the I-DOAS to verify the calibration curve over the spatial extent of the image. The results show that calibration cells, while working well in some cases, can lead to an overestimation of the SO2 column density (CD) by up to 60% compared with CDs from the DOAS measurements. Besides these calibration errors, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. The measurements presented in this work were taken at Popocatepetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 and 14.34 kg s⁻¹ were observed.
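The DOAS calibration discussed above amounts to fitting a curve that maps the camera's apparent absorbance to an SO2 column density; a cell-based calibration that misestimates this curve biases every pixel. A minimal sketch of the linear (through-origin) fit, with invented data:

```python
# Sketch: linear SO2-camera calibration -- fit CD ~ k * tau from
# co-located DOAS column densities (CDs) and camera apparent
# absorbances (tau). Data values are invented for illustration.

def fit_slope(taus, cds):
    """Least-squares slope through the origin."""
    return sum(t * c for t, c in zip(taus, cds)) / sum(t * t for t in taus)

taus = [0.01, 0.02, 0.04, 0.08]          # apparent absorbance
cds = [1.0e17, 2.0e17, 4.0e17, 8.0e17]   # DOAS CD, molec/cm^2
k = fit_slope(taus, cds)
print(k)  # slope converts any pixel's tau into an SO2 CD
```

A calibration-cell curve whose slope is, say, 60% higher than this DOAS-derived `k` would overestimate every retrieved CD by the same factor, which is the kind of discrepancy the paper quantifies.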
A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks.
Su, Po-Chang; Shen, Ju; Xu, Wanxin; Cheung, Sen-Ching S; Luo, Ying
2018-01-15
From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds.
Absolute calibration of a charge-coupled device camera with twin beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meda, A.; Ruo-Berchera, I., E-mail: i.ruoberchera@inrim.it; Degiovanni, I. P.
2014-09-08
We report on the absolute calibration of a charge-coupled device (CCD) camera by exploiting quantum correlation. The method exploits a number of spatially pairwise quantum-correlated modes produced by spontaneous parametric down-conversion. We develop a measurement model accounting for all the uncertainty contributions, and we reach a relative uncertainty of 0.3% in the low-photon-flux regime. This represents a significant step forward for the characterization of (scientific) CCDs used in the mesoscopic light regime.
Modulated CMOS camera for fluorescence lifetime microscopy.
Chen, Hongtao; Holst, Gerhard; Gratton, Enrico
2015-12-01
Widefield frequency-domain fluorescence lifetime imaging microscopy (FD-FLIM) is a fast and accurate method to measure the fluorescence lifetime of entire images. However, the complexity and high cost involved in constructing such a system limit the extensive use of this technique. PCO AG recently released the first luminescence lifetime imaging camera based on a high-frequency modulated CMOS image sensor, QMFLIM2. Here we tested the camera and provide operational procedures to calibrate it and to improve its accuracy using corrections necessary for image analysis. With its flexible input/output options, we are able to use a modulated laser diode or a 20 MHz pulsed white supercontinuum laser as the light source. The output of the camera consists of a stack of modulated images that can be analyzed by the SimFCS software using the phasor approach. The nonuniform system response across the image sensor must be calibrated at the pixel level. This pixel calibration is crucial and is needed for every camera setting, e.g. modulation frequency and exposure time. A significant dependency of the modulation signal on the intensity was also observed, so an additional calibration is needed for each pixel depending on its intensity level. These corrections are important not only for the fundamental frequency but also for the higher harmonics when using the pulsed supercontinuum laser. With these post-acquisition corrections, the PCO CMOS-FLIM camera can be used for various biomedical applications requiring large-frame, high-speed acquisition. © 2015 Wiley Periodicals, Inc.
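In the phasor approach mentioned above, each pixel's modulation data maps to a point (g, s) in phasor space; single-exponential lifetimes fall on the universal semicircle and can be read back from the phase. A minimal sketch of that forward and inverse mapping, with an assumed 4 ns lifetime and 20 MHz modulation frequency:

```python
# Sketch: phasor coordinates of a single-exponential decay and the
# phase-based lifetime readout. Values are illustrative.
import math

def phasor(tau, f):
    """Phasor (g, s) of a single-exponential lifetime tau at
    modulation frequency f."""
    w = 2 * math.pi * f
    g = 1 / (1 + (w * tau) ** 2)
    s = w * tau / (1 + (w * tau) ** 2)
    return g, s

def lifetime_from_phase(g, s, f):
    """Invert the phase angle: tau = s / (g * omega)."""
    return s / (g * 2 * math.pi * f)

g, s = phasor(4e-9, 20e6)                  # 4 ns dye at 20 MHz
print(round(lifetime_from_phase(g, s, 20e6) * 1e9, 3))  # -> 4.0
```

Points off the semicircle indicate multi-exponential mixtures, which is why the per-pixel calibration of the modulation response matters before phasor analysis.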
Low Frequency Flats for Imaging Cameras on the Hubble Space Telescope
NASA Astrophysics Data System (ADS)
Kossakowski, Diana; Avila, Roberto J.; Borncamp, David; Grogin, Norman A.
2017-01-01
We created a revamped Low Frequency Flat (L-Flat) algorithm for the Hubble Space Telescope (HST) and all of its imaging cameras. The current program that makes these calibration files does not compile on modern computer systems and it requires translation to Python. We took the opportunity to explore various methods that reduce the scatter of photometric observations using chi-squared optimizers along with Markov Chain Monte Carlo (MCMC). We created simulations to validate the algorithms and then worked with the UV photometry of the globular cluster NGC6681 to update the calibration files for the Advanced Camera for Surveys (ACS) and Solar Blind Channel (SBC). The new software was made for general usage and therefore can be applied to any of the current imaging cameras on HST.
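The underlying model is simple enough to sketch: each measurement is a star's true magnitude plus a position-dependent flat correction, and the chi-squared optimizer solves for both jointly (MCMC would instead sample the same model). A hypothetical toy setup with 8 detector zones; the real algorithm works on a finer spatial grid:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy L-flat solve: repeated observations of stars at different detector zones.
# m_obs[i] = m_star[s[i]] + dflat[z[i]] + noise; solve both sets jointly by
# linear least squares, with a gauge constraint that the flats average to zero.
n_stars, n_zones, n_obs = 40, 8, 400
true_mag = rng.uniform(18, 22, n_stars)
true_flat = rng.normal(0, 0.05, n_zones)
true_flat -= true_flat.mean()                  # gauge: flats average to zero
s = rng.integers(0, n_stars, n_obs)
z = rng.integers(0, n_zones, n_obs)
m_obs = true_mag[s] + true_flat[z] + rng.normal(0, 0.01, n_obs)

# Design matrix: one column per star magnitude, one per zone correction,
# plus one constraint row pinning the degenerate overall-offset direction.
A = np.zeros((n_obs + 1, n_stars + n_zones))
A[np.arange(n_obs), s] = 1.0
A[np.arange(n_obs), n_stars + z] = 1.0
A[n_obs, n_stars:] = 1.0                       # constraint row: sum(flats) = 0
b = np.concatenate([m_obs, [0.0]])
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
flat_est = sol[n_stars:]
```

The constraint row is needed because adding a constant to every flat while subtracting it from every star magnitude leaves the observations unchanged; the actual HST pipeline fixes this gauge with externally calibrated photometry.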
Lincoln Penny on Mars in Camera Calibration Target
2012-09-10
The penny in this image is part of a camera calibration target on NASA Mars rover Curiosity. The MAHLI camera on the rover took this image of the MAHLI calibration target during the 34th Martian day of Curiosity work on Mars, Sept. 9, 2012.
Data filtering with support vector machines in geometric camera calibration.
Ergun, B; Kavzoglu, T; Colkesen, I; Sahin, C
2010-02-01
The use of non-metric digital cameras in close-range photogrammetric applications and machine vision has become a popular research agenda. Being an essential component of photogrammetric evaluation, camera calibration is a crucial stage for non-metric cameras. Therefore, accurate camera calibration and orientation procedures have become prerequisites for the extraction of precise and reliable 3D metric information from images. The lack of accurate inner orientation parameters can lead to unreliable results in the photogrammetric process. A camera can be well defined by its principal distance, principal point offset and lens distortion parameters. Different camera models have been formulated and used in close-range photogrammetry, but generally sensor orientation and calibration are performed with a perspective geometrical model by means of the bundle adjustment. In this study, a support vector machine (SVM) with a radial basis function kernel is employed to model the distortions measured for an Olympus E10 camera system with an aspherical zoom lens, which are later used in the geometric calibration process. The intention is to introduce an alternative approach for the on-the-job photogrammetric calibration stage. Experimental results for the DSLR camera with three focal length settings (9, 18 and 36 mm) were estimated using bundle adjustment with additional parameters, and analyses were conducted based on object point discrepancies and standard errors. Results show the robustness of the SVM approach in correcting image coordinates by modelling the total distortions of an on-the-job calibration process using a limited number of images.
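The paper's SVR configuration is not spelled out; as a dependency-free sketch in the same spirit, RBF-kernel ridge regression (a close relative of RBF-kernel SVR) can learn a distortion field directly from point measurements. The synthetic radial distortion and all parameter values here are illustrative:

```python
import numpy as np

# Learn a mapping from normalised image coordinates to their distortion
# corrections using an RBF kernel, instead of a parametric lens model.
def rbf_kernel(a, b, gamma):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_distortion(xy, dxy, gamma=5.0, lam=1e-6):
    """xy: (N,2) normalised image points; dxy: (N,2) measured distortions."""
    k = rbf_kernel(xy, xy, gamma)
    alpha = np.linalg.solve(k + lam * np.eye(len(xy)), dxy)
    return lambda q: rbf_kernel(q, xy, gamma) @ alpha

# Synthetic radial distortion (k1 term only) as training data.
rng = np.random.default_rng(2)
xy = rng.uniform(-1, 1, (200, 2))
r2 = (xy ** 2).sum(1, keepdims=True)
dxy = 0.1 * r2 * xy                  # dx = k1*r^2*x, dy = k1*r^2*y
correct = fit_distortion(xy, dxy)
```

The appeal of the kernel approach, as in the paper, is that it models the *total* distortion (radial, tangential, and any residual systematics) without committing to a fixed parametric form.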
Systematic Calibration for a Backpacked Spherical Photogrammetry Imaging System
NASA Astrophysics Data System (ADS)
Rau, J. Y.; Su, B. W.; Hsiao, K. W.; Jhan, J. P.
2016-06-01
A spherical camera can observe the environment with almost 720 degrees' field of view in one shot, which is useful for augmented reality, environment documentation, or mobile mapping applications. This paper aims to develop a spherical photogrammetry imaging system for 3D measurement through a backpacked mobile mapping system (MMS). The equipment comprises a Ladybug-5 spherical camera, a tactical-grade positioning and orientation system (POS), i.e. SPAN-CPT, an odometer, etc. This research aims to directly apply the photogrammetric space intersection technique for 3D mapping from a spherical image stereo-pair. For this purpose, several systematic calibration procedures are required, including lens distortion calibration, relative orientation calibration, boresight calibration for direct georeferencing, and spherical image calibration. Lens distortion is severe in the Ladybug-5 camera's six original images. For spherical image mosaicking from these six original images, we propose to use their relative orientation and correct their lens distortion at the same time. However, the constructed spherical image still contains systematic error, which reduces the 3D measurement accuracy. For direct georeferencing, we then establish a ground control field for boresight/lever-arm calibration and apply the calibrated parameters to obtain the exterior orientation parameters (EOPs) of all spherical images. In the end, the 3D positioning accuracy after space intersection is evaluated, including EOPs obtained by the structure-from-motion method.
Using the auxiliary camera for system calibration of 3D measurement by digital speckle
NASA Astrophysics Data System (ADS)
Xue, Junpeng; Su, Xianyu; Zhang, Qican
2014-06-01
The study of 3D shape measurement by digital speckle temporal sequence correlation has drawn much attention for its advantages; however, the measurement is mainly of the depth z-coordinate, while the horizontal physical coordinates (x, y) are usually recorded only as image pixel coordinates. In this paper, a new approach for system calibration is proposed. With an auxiliary camera we form a temporary binocular vision system, which is used for the calibration of the horizontal coordinates (mm) while the temporal sequence reference speckle sets are calibrated. First, the binocular vision system is calibrated using the traditional method. Then, digital speckles are projected onto a reference plane, which is moved by equal distances in the depth direction, and temporal sequence speckle images are acquired with the camera as reference sets. When the reference plane is in the first and final positions, a crossed fringe pattern is projected onto the plane. The control points' pixel coordinates are extracted from the images by Fourier analysis, and their physical coordinates are calculated by the binocular vision system. The physical coordinates corresponding to each pixel of the images are then calculated by an interpolation algorithm. Finally, the x and y corresponding to an arbitrary depth value z are obtained by a geometric formula. Experiments prove that our method can quickly and flexibly measure the 3D shape of an object as a point cloud.
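The final geometric step reduces to a simple form: once the physical (x, y) of each pixel is known at the two end reference depths, the lateral coordinates at any measured depth follow by linear interpolation along the pinhole ray. A sketch (a simplified stand-in for the paper's geometric formula):

```python
import numpy as np

# Per-pixel lateral calibration from the two end reference planes.
def lateral_coords(x0, y0, x1, y1, z0, z1, z):
    """x0, y0: (H,W) physical coords at depth z0; x1, y1: at depth z1;
    z: (H,W) measured depth map. Returns physical (x, y) at depth z."""
    t = (z - z0) / (z1 - z0)
    return x0 + t * (x1 - x0), y0 + t * (y1 - y0)
```

Linear interpolation is exact here for an ideal pinhole camera, because each pixel's line of sight is a straight ray through the two calibrated planes.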
Analysis of calibration accuracy of cameras with different target sizes for large field of view
NASA Astrophysics Data System (ADS)
Zhang, Jin; Chai, Zhiwen; Long, Changyu; Deng, Huaxia; Ma, Mengchao; Zhong, Xiang; Yu, Huan
2018-03-01
Visual measurement plays an increasingly important role in the fields of aerospace, shipbuilding and machinery manufacturing. Camera calibration for a large field of view is a critical part of visual measurement. A large-scale target is difficult to produce and its precision cannot be guaranteed, while a small target can be produced with high precision but yields only locally optimal solutions. Therefore, the most suitable ratio of target size to camera field of view must be studied to ensure the calibration precision required for a wide field of view. In this paper, cameras are calibrated by a series of checkerboard and circular calibration targets of different dimensions, with target-to-field-of-view ratios of 9%, 18%, 27%, 36%, 45%, 54%, 63%, 72%, 81% and 90%. The target is placed at different positions in the camera field to obtain the camera parameters for each position. The distribution curves of the mean reprojection error of the reconstructed feature points at the different ratios are then analyzed. The experimental data demonstrate that as the ratio of target size to camera field of view increases, the calibration precision improves accordingly, and the mean reprojection error changes only slightly once the ratio exceeds 45%.
Liu, Xinyang; Plishker, William; Zaki, George; Kang, Sukryool; Kane, Timothy D.; Shekhar, Raj
2017-01-01
Purpose: Common camera calibration methods employed in current laparoscopic augmented reality systems require the acquisition of multiple images of an entire checkerboard pattern from various poses. This lengthy procedure prevents performing laparoscope calibration in the operating room (OR). The purpose of this work was to develop a fast calibration method for electromagnetically (EM) tracked laparoscopes, such that calibration can be performed in the OR on demand. Methods: We designed a mechanical tracking mount to uniquely and snugly position an EM sensor to an appropriate location on a conventional laparoscope. A tool named fCalib was developed to calibrate intrinsic camera parameters, distortion coefficients, and extrinsic parameters (transformation between the scope lens coordinate system and the EM sensor coordinate system) using a single image that shows an arbitrary portion of a special target pattern. For quick evaluation of calibration result in the OR, we integrated a tube phantom with fCalib and overlaid a virtual representation of the tube on the live video scene. Results: We compared spatial target registration error between the common OpenCV method and the fCalib method in a laboratory setting. In addition, we compared the calibration re-projection error between the EM tracking-based fCalib and the optical tracking-based fCalib in a clinical setting. Our results suggested that the proposed method is comparable to the OpenCV method. However, changing the environment, e.g., inserting or removing surgical tools, would affect re-projection accuracy for the EM tracking-based approach. Computational time of the fCalib method averaged 14.0 s (range 3.5 s – 22.7 s). Conclusions: We developed and validated a prototype for fast calibration and evaluation of EM tracked conventional (forward viewing) laparoscopes. The calibration method achieved acceptable accuracy and was relatively fast and easy to be performed in the OR on demand. PMID:27250853
Calibration Procedures on Oblique Camera Setups
NASA Astrophysics Data System (ADS)
Kemper, G.; Melykuti, B.; Yu, C.
2016-06-01
Besides the creation of virtual animated 3D city models and analysis for homeland security and city planning, the accurate determination of geometric features from oblique imagery is an important task today. Due to the huge number of single images, the reduction of control points forces the use of direct referencing devices. This requires precise camera calibration and additional adjustment procedures. This paper aims to show the workflow of the various calibration steps and presents examples of the calibration flight with the final 3D city model. In contrast to most other software, the oblique cameras are not used as co-registered sensors in relation to the nadir one; all camera images enter the AT process as single pre-oriented data. This enables a better post-calibration in order to detect variations in the single camera calibrations and other mechanical effects. The sensor shown (Oblique Imager) is based on 5 Phase One cameras, where the nadir one has 80 MPix and a 50 mm lens while the oblique ones capture 50 MPix images using 80 mm lenses. The cameras are mounted robustly inside a housing to protect them against physical and thermal deformations. The sensor head also hosts an IMU which is connected to a POS AV GNSS receiver. The sensor is stabilized by a gyro-mount which creates floating antenna-IMU lever arms. These had to be registered together with the raw GNSS-IMU data. The camera calibration procedure was performed based on a special calibration flight with 351 shots of all 5 cameras and registered GPS/IMU data. This specific mission was designed with two different altitudes and additional cross lines at each flying height. The five images from each exposure position have no overlaps, but in the block there are many overlaps, resulting in up to 200 measurements per point. On each photo there were on average 110 well-distributed measured points, which is a satisfying number for the camera calibration.
In a first step, with the help of the nadir camera and the GPS/IMU data, an initial orientation correction and radial correction were calculated. With this approach, the whole project was calculated and calibrated in one step. During the iteration process the radial and tangential parameters were switched on individually for the camera heads, and after that the camera constants and principal point positions were checked and finally calibrated. Besides that, the boresight calibration can be performed either on the basis of the nadir camera and its offsets, or independently for each camera without correlation to the others. This must be performed in a complete mission anyway to get stability between the single camera heads. Determining the lever arms from the nodal points to the IMU centre needs more caution than for a single camera, especially due to the strong tilt angle. With all these preparatory steps completed, one obtains a highly accurate sensor that enables fully automated data extraction with a rapid update of existing data. Frequent monitoring of urban dynamics is then possible in a fully 3D environment.
Calibration strategies for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Gaug, Markus; Berge, David; Daniel, Michael; Doro, Michele; Förster, Andreas; Hofmann, Werner; Maccarone, Maria C.; Parsons, Dan; de los Reyes Lopez, Raquel; van Eldik, Christopher
2014-08-01
The Central Calibration Facilities workpackage of the Cherenkov Telescope Array (CTA) observatory for very high energy gamma ray astronomy defines the overall calibration strategy of the array, develops dedicated hardware and software for the overall array calibration and coordinates the calibration efforts of the different telescopes. The latter include LED-based light pulsers, and various methods and instruments to achieve a calibration of the overall optical throughput. On the array level, methods for the inter-telescope calibration and the absolute calibration of the entire observatory are being developed. Additionally, the atmosphere above the telescopes, used as a calorimeter, will be monitored constantly with state-of-the-art instruments to obtain a full molecular and aerosol profile up to the stratosphere. The aim is to provide a maximal uncertainty of 10% on the reconstructed energy-scale, obtained through various independent methods. Different types of LIDAR in combination with all-sky-cameras will provide the observatory with an online, intelligent scheduling system, which, if the sky is partially covered by clouds, gives preference to sources observable under good atmospheric conditions. Wide-field optical telescopes and Raman Lidars will provide online information about the height-resolved atmospheric extinction, throughout the field-of-view of the cameras, allowing for the correction of the reconstructed energy of each gamma-ray event. The aim is to maximize the duty cycle of the observatory, in terms of usable data, while reducing the dead time introduced by calibration activities to an absolute minimum.
Liu, Xinyang; Plishker, William; Zaki, George; Kang, Sukryool; Kane, Timothy D; Shekhar, Raj
2016-06-01
Common camera calibration methods employed in current laparoscopic augmented reality systems require the acquisition of multiple images of an entire checkerboard pattern from various poses. This lengthy procedure prevents performing laparoscope calibration in the operating room (OR). The purpose of this work was to develop a fast calibration method for electromagnetically (EM) tracked laparoscopes, such that the calibration can be performed in the OR on demand. We designed a mechanical tracking mount to uniquely and snugly position an EM sensor to an appropriate location on a conventional laparoscope. A tool named fCalib was developed to calibrate intrinsic camera parameters, distortion coefficients, and extrinsic parameters (transformation between the scope lens coordinate system and the EM sensor coordinate system) using a single image that shows an arbitrary portion of a special target pattern. For quick evaluation of calibration results in the OR, we integrated a tube phantom with fCalib prototype and overlaid a virtual representation of the tube on the live video scene. We compared spatial target registration error between the common OpenCV method and the fCalib method in a laboratory setting. In addition, we compared the calibration re-projection error between the EM tracking-based fCalib and the optical tracking-based fCalib in a clinical setting. Our results suggest that the proposed method is comparable to the OpenCV method. However, changing the environment, e.g., inserting or removing surgical tools, might affect re-projection accuracy for the EM tracking-based approach. Computational time of the fCalib method averaged 14.0 s (range 3.5 s-22.7 s). We developed and validated a prototype for fast calibration and evaluation of EM tracked conventional (forward viewing) laparoscopes. The calibration method achieved acceptable accuracy and was relatively fast and easy to be performed in the OR on demand.
Automatic Camera Calibration for Cultural Heritage Applications Using Unstructured Planar Objects
NASA Astrophysics Data System (ADS)
Adam, K.; Kalisperakis, I.; Grammatikopoulos, L.; Karras, G.; Petsa, E.
2013-07-01
As a rule, image-based documentation of cultural heritage relies today on ordinary digital cameras and commercial software. As such projects often involve researchers not familiar with photogrammetry, the question of camera calibration is important. Freely available, open-source, user-friendly software for automatic camera calibration, often based on simple 2D chess-board patterns, is an answer to the demand for simplicity and automation. However, such tools cannot respond to all requirements met in cultural heritage conservation regarding possible imaging distances and focal lengths. Here we investigate the practical possibility of camera calibration from unknown planar objects, i.e. any planar surface with adequate texture; we have focused on the example of urban walls covered with graffiti. Images are connected pair-wise with inter-image homographies, which are estimated automatically through a RANSAC-based approach after extracting and matching interest points with the SIFT operator. All valid points are identified on all images on which they appear. Provided that the image set includes a "fronto-parallel" view, inter-image homographies with this image are regarded as emulations of image-to-world homographies and allow computing initial estimates for the interior and exterior orientation elements. Following this initialization step, the estimates are introduced into a final self-calibrating bundle adjustment. Measures are taken to discard unsuitable images and verify object planarity. Results from practical experimentation indicate that this method may produce satisfactory results. The authors intend to incorporate the described approach into their freely available user-friendly software tool, which relies on chess-boards, to assist non-experts in their projects with image-based approaches.
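The initialization step (interior orientation from plane-to-image homographies) can be sketched with the standard closed-form solution in the manner of Zhang's method; this is not necessarily the authors' exact formulation, and the camera values below are synthetic:

```python
import numpy as np

def v_ij(h, i, j):
    # Constraint vector on the symmetric matrix B ~ K^-T K^-1 from columns i, j.
    return np.array([h[0, i] * h[0, j],
                     h[0, i] * h[1, j] + h[1, i] * h[0, j],
                     h[1, i] * h[1, j],
                     h[2, i] * h[0, j] + h[0, i] * h[2, j],
                     h[2, i] * h[1, j] + h[1, i] * h[2, j],
                     h[2, i] * h[2, j]])

def intrinsics_from_homographies(hs):
    """Closed-form initial K from >= 3 plane-to-image homographies."""
    rows = []
    for h in hs:
        rows.append(v_ij(h, 0, 1))                  # h1^T B h2 = 0
        rows.append(v_ij(h, 0, 0) - v_ij(h, 1, 1))  # h1^T B h1 = h2^T B h2
    b = np.linalg.svd(np.asarray(rows))[2][-1]      # null-space direction
    if b[0] < 0:
        b = -b                                      # B must be positive definite
    b11, b12, b22, b13, b23, b33 = b
    cy = (b12 * b13 - b11 * b23) / (b11 * b22 - b12 ** 2)
    lam = b33 - (b13 ** 2 + cy * (b12 * b13 - b11 * b23)) / b11
    fx = np.sqrt(lam / b11)
    fy = np.sqrt(lam * b11 / (b11 * b22 - b12 ** 2))
    skew = -b12 * fx ** 2 * fy / lam
    cx = skew * cy / fy - b13 * fx ** 2 / lam
    return np.array([[fx, skew, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])

# Synthetic check: homographies H = K [r1 r2 t] built from a known camera.
def _rot_xy(ax, ay):
    cx_, sx = np.cos(ax), np.sin(ax)
    cy_, sy = np.cos(ay), np.sin(ay)
    rx = np.array([[1, 0, 0], [0, cx_, -sx], [0, sx, cx_]])
    ry = np.array([[cy_, 0, sy], [0, 1, 0], [-sy, 0, cy_]])
    return rx @ ry

K_true = np.array([[800.0, 0.0, 320.0], [0.0, 780.0, 240.0], [0.0, 0.0, 1.0]])
hs = []
for ax, ay in [(0.3, 0.1), (-0.2, 0.4), (0.1, -0.3), (0.4, 0.2)]:
    r = _rot_xy(ax, ay)
    hs.append(K_true @ np.column_stack([r[:, 0], r[:, 1], [0.1, -0.2, 4.0]]))
K_est = intrinsics_from_homographies(hs)
```

Note the tilted views in the synthetic check: purely fronto-parallel views give degenerate constraints, which is why the method above needs the image set to include varied orientations before the final bundle adjustment refines the estimates.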
Royo Sánchez, Ana Cristina; Aguilar Martín, Juan José; Santolaria Mazo, Jorge
2014-12-01
Motion capture systems are often used for checking and analyzing human motion in biomechanical applications. It is important, in this context, that the systems provide the best possible accuracy. Among existing capture systems, optical systems are those with the highest accuracy. In this paper, the development of a new calibration procedure for optical human motion capture systems is presented. The performance and effectiveness of that new calibration procedure are also checked by experimental validation. The new calibration procedure consists of two stages. In the first stage, initial estimators of intrinsic and extrinsic parameters are sought. The camera calibration method used in this stage is the one proposed by Tsai. These parameters are determined from the camera characteristics, the spatial position of the camera, and the center of the capture volume. In the second stage, a simultaneous nonlinear optimization of all parameters is performed to identify the optimal values, which minimize the objective function. The objective function, in this case, minimizes two errors. The first error is the distance error between two markers placed in a wand. The second error is the error of position and orientation of the retroreflective markers of a static calibration object. The real co-ordinates of the two objects are calibrated in a co-ordinate measuring machine (CMM). The OrthoBio system is used to validate the new calibration procedure. Results are 90% lower than those from the previous calibration software and broadly comparable with results from a similarly configured Vicon system.
Questel, E; Durbise, E; Bardy, A-L; Schmitt, A-M; Josse, G
2015-05-01
To assess an objective method for evaluating the effects of a retinaldehyde-based cream (RA-cream) on solar lentigines, 29 women randomly applied RA-cream to lentigines of one hand and a control cream to the other, once daily for 3 months. A specific method enabling a reliable visualisation of the lesions was proposed, using high-magnification colour-calibrated camera imaging. Assessment was performed using clinical evaluation by Physician Global Assessment score and image analysis. Luminance determination on the numeric images was performed either on the basis of five independent experts' consensus borders or probability map analysis via an algorithm automatically detecting the pigmented area. Both image analysis methods showed a similar lightening of ΔL* = 2 after a 3-month treatment by RA-cream, in agreement with single-blind clinical evaluation. High-magnification colour-calibrated camera imaging combined with probability map analysis is a fast and precise method to follow lentigo depigmentation. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Lee, Ping-Shin; Gan, Han Ming; Clements, Gopalasamy Reuben; Wilson, John-James
2016-11-01
Mammal diversity assessments based on DNA derived from invertebrates have been suggested as alternatives to assessments based on traditional methods; however, no study has field-tested both approaches simultaneously. In Peninsular Malaysia, we calibrated the performance of mammal DNA derived from blowflies (Diptera: Calliphoridae) against traditional methods used to detect species. We first compared five methods (cage trapping, mist netting, hair trapping, scat collection, and blowfly-derived DNA) in a forest reserve with no recent reports of megafauna. Blowfly-derived DNA and mist netting detected the joint highest number of species (n = 6). Only one species was detected by multiple methods. Compared to the other methods, blowfly-derived DNA detected both volant and non-volant species. In another forest reserve, rich in megafauna, we calibrated blowfly-derived DNA against camera traps. Blowfly-derived DNA detected more species (n = 11) than camera traps (n = 9), with only one species detected by both methods. The rarefaction curve indicated that blowfly-derived DNA would continue to detect more species with greater sampling effort. With further calibration, blowfly-derived DNA may join the list of traditional field methods. Areas for further investigation include blowfly feeding and dispersal biology, primer biases, and the assembly of a comprehensive and taxonomically-consistent DNA barcode reference library.
Hand–eye calibration using a target registration error model
Morgan, Isabella; Jayarathne, Uditha; Ma, Burton; Peters, Terry M.
2017-01-01
Surgical cameras are prevalent in modern operating theatres and are often used as a surrogate for direct vision. Visualisation techniques (e.g. image fusion) made possible by tracking the camera require accurate hand–eye calibration between the camera and the tracking system. The authors introduce the concept of ‘guided hand–eye calibration’, where calibration measurements are facilitated by a target registration error (TRE) model. They formulate hand–eye calibration as a registration problem between homologous point–line pairs. For each measurement, the position of a monochromatic ball-tip stylus (a point) and its projection onto the image (a line) is recorded, and the TRE of the resulting calibration is predicted using a TRE model. The TRE model is then used to guide the placement of the calibration tool, so that the subsequent measurement minimises the predicted TRE. Assessing TRE after each measurement produces accurate calibration using a minimal number of measurements. As a proof of principle, they evaluated guided calibration using a webcam and an endoscopic camera. Their endoscopic camera results suggest that millimetre TRE is achievable when at least 15 measurements are acquired with the tracker sensor ∼80 cm away on the laparoscope handle for a target ∼20 cm away from the camera. PMID:29184657
NASA Technical Reports Server (NTRS)
Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth
2016-01-01
Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, Exercise Physiology and Countermeasures (ExPC) project and National Space Biomedical Research Institute (NSBRI) funded researchers by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to utilize traditional motion capture systems due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed utilizing open source computer vision code with commercial off the shelf (COTS) video camera hardware. While the system's accuracy is lower than lab setups, it provides a means to produce quantitative comparison motion capture kinematic data. Additionally, data such as the required exercise volume for small spaces such as the Orion capsule can be determined.
METHODS: OpenCV is an open source computer vision library that provides the ability to perform multi-camera 3-dimensional reconstruction. Utilizing OpenCV, via the Python programming language, a set of tools has been developed to perform motion capture in confined spaces using commercial cameras. Four Sony video cameras were intrinsically calibrated prior to flight. Intrinsic calibration provides a set of camera-specific parameters to remove geometric distortion of the lens and sensor (specific to each individual camera). A set of high-contrast markers was placed on the exercising subject (safety also necessitated that they be soft in case they became detached during parabolic flight); small yarn balls were used. Extrinsic calibration, the determination of camera location and orientation parameters, is performed using fixed landmark markers shared by the camera scenes. Additionally, a wand calibration, sweeping a wand through the camera scenes simultaneously, was performed. Techniques have been developed to perform intrinsic calibration, extrinsic calibration, isolation of the markers in the scene, calculation of marker 2D centroids, and 3D reconstruction from multiple cameras. These methods have been tested in the laboratory in a side-by-side comparison with a traditional motion capture system and also on a parabolic flight.
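The multi-camera 3D reconstruction step described above is classically done by linear triangulation: each calibrated camera's 3x4 projection matrix and the marker's 2D centroid contribute two rows to a homogeneous system solved by SVD. A minimal numpy sketch (the toy camera matrices are synthetic, not from the flight hardware):

```python
import numpy as np

# Linear triangulation: for each camera with projection matrix P and observed
# centroid (u, v), the rows u*P[2]-P[0] and v*P[2]-P[1] vanish at the true
# homogeneous 3D point; stack all cameras and take the SVD null vector.
def triangulate(projections, points_2d):
    """projections: list of 3x4 camera matrices; points_2d: list of (u, v)."""
    rows = []
    for p, (u, v) in zip(projections, points_2d):
        rows.append(u * p[2] - p[0])
        rows.append(v * p[2] - p[1])
    x = np.linalg.svd(np.asarray(rows))[2][-1]
    return x[:3] / x[3]

# Two toy cameras observing the point (1, 2, 10).
p1 = np.hstack([np.eye(3), np.zeros((3, 1))])
p2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = triangulate([p1, p2], [(0.1, 0.2), (0.0, 0.2)])
```

With four cameras, as in the flight setup, the extra rows simply over-determine the same system, which improves robustness to centroid noise and occlusion.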
A panning DLT procedure for three-dimensional videography.
Yu, B; Koh, T J; Hay, J G
1993-06-01
The direct linear transformation (DLT) method [Abdel-Aziz and Karara, APS Symposium on Photogrammetry. American Society of Photogrammetry, Falls Church, VA (1971)] is widely used in biomechanics to obtain three-dimensional space coordinates from film and video records. This method has some major shortcomings when used to analyze events which take place over large areas. To overcome these shortcomings, a three-dimensional data collection method based on the DLT method, and making use of panning cameras, was developed. Several small single control volumes were combined to construct a large total control volume. For each single control volume, a regression equation (calibration equation) is developed to express each of the 11 DLT parameters as a function of camera orientation, so that the DLT parameters can then be estimated from arbitrary camera orientations. Once the DLT parameters are known for at least two cameras, and the associated two-dimensional film or video coordinates of the event are obtained, the desired three-dimensional space coordinates can be computed. In a laboratory test, five single control volumes (in a total control volume of 24.40 x 2.44 x 2.44 m3) were used to test the effect of the position of the single control volume on the accuracy of the computed three-dimensional space coordinates. Linear and quadratic calibration equations were used to test the effect of the order of the equation on the accuracy of the computed three-dimensional space coordinates. For four of the five single control volumes tested, the mean resultant errors associated with the use of the linear calibration equation were significantly larger than those associated with the use of the quadratic calibration equation. The position of the single control volume had no significant effect on the mean resultant errors in computed three-dimensional coordinates when the quadratic calibration equation was used.
Under the same data collection conditions, the mean resultant errors in the computed three dimensional coordinates associated with the panning and stationary DLT methods were 17 and 22 mm, respectively. The major advantages of the panning DLT method lie in the large image sizes obtained and in the ease with which the data can be collected. The method also has potential for use in a wide variety of contexts. The major shortcoming of the method is the large amount of digitizing necessary to calibrate the total control volume. Adaptations of the method to reduce the amount of digitizing required are being explored.
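The 11-parameter core of the DLT (before the panning extension) reduces to a linear least-squares solve from control points; a numpy sketch with synthetic control points (all parameter values illustrative):

```python
import numpy as np

# The 11 DLT parameters map world coordinates to image coordinates:
#   u = (L1 X + L2 Y + L3 Z + L4) / (L9 X + L10 Y + L11 Z + 1)
#   v = (L5 X + L6 Y + L7 Z + L8) / (L9 X + L10 Y + L11 Z + 1)
# With >= 6 non-coplanar control points they follow from a linear solve.
def dlt_calibrate(world, image):
    """world: (N,3) control-point coords; image: (N,2) digitised coords."""
    a, b = [], []
    for (x, y, z), (u, v) in zip(world, image):
        a.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z])
        a.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z])
        b += [u, v]
    sol, *_ = np.linalg.lstsq(np.asarray(a, float), np.asarray(b, float),
                              rcond=None)
    return sol  # L1..L11

# Synthetic check: project points with known parameters, then recover them.
rng = np.random.default_rng(3)
L_true = np.array([2.0, 0.1, 0.3, 50.0, 0.2, 2.1, 0.1, 60.0,
                   0.01, 0.02, 0.005])
world = rng.uniform(0, 10, (12, 3))
den = world @ L_true[8:] + 1.0
image = np.column_stack([(world @ L_true[:3] + L_true[3]) / den,
                         (world @ L_true[4:7] + L_true[7]) / den])
L_est = dlt_calibrate(world, image)
```

The panning method described above wraps this solve in a regression: each of the 11 parameters is refit as a polynomial function of camera pan angle, so the matrix can be evaluated for arbitrary camera orientations.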
Small format digital photogrammetry for applications in the earth sciences
NASA Astrophysics Data System (ADS)
Rieke-Zapp, Dirk
2010-05-01
Photogrammetry is often considered one of the most precise and versatile surveying techniques. The same camera and analysis software can be used for measurements from sub-millimetre to kilometre scale. Such a measurement device is well suited for application by earth scientists working in the field. In this case a small toolset and a straightforward setup best fit the needs of the operator. While a digital camera is typically already part of the field equipment of an earth scientist, the main focus of the field work is often not surveying. A lack of photogrammetric training at the same time requires an easy-to-learn, straightforward surveying technique. A photogrammetric method was developed, aimed primarily at earth scientists, for taking accurate measurements in the field while minimizing the extra bulk and weight of the required equipment. The work included several challenges. A) Definition of an upright coordinate system without heavy and bulky tools like a total station or GNSS sensor. B) Optimization of image acquisition and geometric stability of the image block. C) Identification of a small camera suitable for precise measurements in the field. D) Optimization of the workflow from image acquisition to preparation of images for stereo measurements. E) Introduction of students and non-photogrammetrists to the workflow. Wooden spheres were used as target points in the field. They were more rugged than the ping-pong balls used in a previous setup and available in different sizes. Distances between three spheres were introduced as scale information in a photogrammetric adjustment. The distances were measured with a laser distance meter accurate to 1 mm (1 sigma). The vertical angle between the spheres was measured with the same laser distance meter. The precision of the measurement was 0.3° (1 sigma), which is sufficient, i.e. better than inclination measurements with a geological compass.
The upright coordinate system is important for measuring the dip angle of geologic features in outcrop. The planimetric coordinate system would be arbitrary, but may easily be oriented to compass north by introducing a direction measurement from a compass. The wooden spheres and a Leica Disto D3 laser distance meter added less than 0.150 kg to the field equipment, considering that a suitable digital camera was already part of it. Identification of a small digital camera suitable for precise measurements was a major part of this work. A group of cameras were calibrated several times over different periods of time on a testfield. Further evaluation involved an accuracy assessment in the field, comparing distances between signalized points calculated from a photogrammetric setup with coordinates derived from a total station survey. The smallest camera in the test required calibration on the job, as the interior orientation changed significantly between testfield calibration and use in the field. We attribute this to the fact that the lens was retracted when the camera was switched off. Fairly stable camera geometry in a compact camera with a lens retracting system was accomplished with the Sigma DP1 and DP2 cameras. While the pixel count of these cameras was less than for the Ricoh, the pixel pitch of the Sigma cameras was much larger. Hence, the same mechanical movement would have less per-pixel effect for the Sigma cameras than for the Ricoh camera. A large pixel pitch may therefore compensate for some camera instability, explaining why cameras with large sensors and larger pixel pitch typically yield better accuracy in object space. Both Sigma cameras weigh approximately 0.250 kg and may even be suitable for use with ultralight aerial vehicles (UAVs), which have payload restrictions of 0.200 to 0.300 kg.
A set of other cameras that were available were also tested on a calibration field and on location, showing once again that it is difficult to infer geometric stability from camera specifications. Image acquisition with geometrically stable cameras was fairly straightforward, covering the area of interest with stereo pairs for analysis. We limited our tests to setups with three to five images to minimize the amount of post-processing. The laser dot of the laser distance meter was not visible to the naked eye for distances beyond 5-7 m, which also limited the maximum stereo area that may be covered with this technique. Extrapolating the setup to fairly large areas showed no significant decrease in the accuracy accomplished in object space. Working with a Sigma SD14 SLR camera on a 6 × 18 × 20 m volume, the maximum length measurement error ranged between 20 and 30 mm depending on image setup and analysis. For smaller outcrops even the compact cameras yielded maximum length measurement errors in the mm range, which was considered sufficient for measurements in the earth sciences. In many cases the resolution per pixel, rather than accuracy, was the limiting factor of image analysis. A field manual was developed to guide novice users and students through this technique. The technique does not trade precision for ease of use; successful users of the presented method therefore easily grow into more advanced photogrammetric methods for high-precision applications. Originally, camera calibration was not part of the methodology for novice operators. The recent introduction of Camera Calibrator, a low-cost, well-automated software package for camera calibration, allows beginners to calibrate their camera within a couple of minutes. The complete set of calibration parameters can be applied in ERDAS LPS software, easing the workflow. Image orientation was performed in LPS 9.2 software, which was also used for further image analysis.
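As a hedged illustration of the upright-coordinate-system workflow described above (not the authors' code), the dip angle of a geologic plane can be computed from three photogrammetrically measured points once the z-axis is vertical:

```python
import numpy as np

def dip_angle(p1, p2, p3):
    """Dip angle (degrees) of the plane through three 3D points,
    measured from the horizontal, assuming z points up."""
    n = np.cross(np.asarray(p2) - np.asarray(p1),
                 np.asarray(p3) - np.asarray(p1))
    n = n / np.linalg.norm(n)
    # The angle between the plane normal and the vertical equals the dip.
    return np.degrees(np.arccos(abs(n[2])))

# A plane tilted 30 degrees about the x-axis:
t = np.radians(30.0)
p1 = [0.0, 0.0, 0.0]
p2 = [1.0, 0.0, 0.0]
p3 = [0.0, np.cos(t), np.sin(t)]
print(round(dip_angle(p1, p2, p3), 1))  # 30.0
```

This is why an upright coordinate system matters: without a vertical z-axis, the recovered angle would be meaningless as a dip.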
Spectral colors capture and reproduction based on digital camera
NASA Astrophysics Data System (ADS)
Chen, Defen; Huang, Qingmei; Li, Wei; Lu, Yang
2018-01-01
The purpose of this work is to develop a method for the accurate reproduction of spectral colors captured by a digital camera. The spectral colors, being the purest colors of any hue, are difficult to reproduce without distortion on digital devices. In this paper, we attempt to achieve accurate hue reproduction of the spectral colors by focusing on two steps of color correction: the capture of the spectral colors and the color characterization of the digital camera. This determines the relationship among the spectral color wavelength, the RGB color space of the digital camera and the CIEXYZ color space. This study also provides a basis for further studies related to spectral color reproduction on digital devices. Methods such as wavelength calibration of the spectral colors and digital camera characterization were utilized. The spectrum was obtained through a grating spectroscopy system. A photo of a clear and reliable primary spectrum was taken by adjusting the relevant parameters of the digital camera, from which the RGB values of the color spectrum were extracted at 1040 equally divided locations. Two wavelength values were obtained for each location, one calculated using the grating equation and one measured by a spectrophotometer. A polynomial fitting method was used for camera characterization to achieve color correction. After wavelength calibration, the maximum error between the two sets of wavelengths is 4.38 nm. With the polynomial fitting method, the average color difference of the test samples is 3.76. This satisfies the application needs of the spectral colors in digital devices such as displays and transmission.
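The wavelength calibration step above maps spectrum-image locations to wavelengths. A minimal sketch of such a polynomial fit follows; the sample locations and wavelengths are made up for illustration (the paper's 1040 locations are not reproduced here):

```python
import numpy as np

# Hypothetical spectrum-image locations (pixels) and measured wavelengths (nm):
loc = np.array([100.0, 300.0, 500.0, 700.0, 900.0])
wl = np.array([420.0, 480.0, 540.0, 600.0, 660.0])

# Fit a low-order polynomial mapping location -> wavelength.
coeffs = np.polyfit(loc, wl, deg=2)
predict = np.polyval(coeffs, loc)
max_err = np.max(np.abs(predict - wl))
print(f"max fit error: {max_err:.2e} nm")
```

With real data the residual would play the role of the paper's 4.38 nm maximum wavelength error.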
NASA Astrophysics Data System (ADS)
Rieke-Zapp, D.; Tecklenburg, W.; Peipe, J.; Hastedt, H.; Haig, Claudia
Recent tests on the geometric stability of several digital cameras that were not designed for photogrammetric applications have shown that the accomplished accuracies in object space are either limited or that the accuracy potential is not exploited to the fullest extent. A total of 72 calibrations were calculated with four different software products for eleven digital camera models with different hardware setups, some with mechanical fixation of one or more parts. The calibration procedure was chosen in accordance with a German guideline for the evaluation of optical 3D measuring systems [VDI/VDE 2634 Part 1, 2002. Optical 3D Measuring Systems-Imaging Systems with Point-by-point Probing. Beuth Verlag, Berlin]. All images were taken with ringflashes, which was considered a standard method for close-range photogrammetry. In cases where the flash was mounted to the lens, the force exerted on the lens tube and the camera mount greatly reduced the accomplished accuracy. Mounting the ringflash to the camera instead resulted in a large improvement of accuracy in object space. For standard calibration, the best accuracies in object space were accomplished with a Canon EOS 5D and a 35 mm Canon lens whose focusing tube was fixed with epoxy (47 μm maximum absolute length measurement error in object space). The fixation of the Canon lens was fairly easy and inexpensive, resulting in a sevenfold increase in accuracy compared with the same lens type without modification. A similar accuracy was accomplished with a Nikon D3 when mounting the ringflash to the camera instead of the lens (52 μm maximum absolute length measurement error in object space). Parameterisation of geometric instabilities by introduction of an image-variant interior orientation in the calibration process improved the results for most cameras. In this case, a modified Alpa 12 WA yielded the best results (29 μm maximum absolute length measurement error in object space).
Extending the parameter model with FiBun software to model not only an image-variant interior orientation but also deformations in the sensor domain of the cameras showed significant improvements only for a small group of cameras. The Nikon D3 camera yielded the best overall accuracy (25 μm maximum absolute length measurement error in object space) with this calibration procedure, indicating at the same time the presence of image-invariant error in the sensor domain. Overall, the calibration results showed that digital cameras can be applied to accurate photogrammetric surveys and that only a little effort is needed to greatly improve the accuracy potential of digital cameras.
A calibration method for fringe reflection technique based on the analytical phase-slope description
NASA Astrophysics Data System (ADS)
Wu, Yuxiang; Yue, Huimin; Pan, Zhipeng; Liu, Yong
2018-05-01
The fringe reflection technique (FRT) has become one of the most popular methods for measuring the shape of specular surfaces in recent years. Existing FRT system calibration methods usually contain two parts: camera calibration and geometric calibration. In geometric calibration, calibrating the position of the liquid crystal display (LCD) screen is one of the most difficult steps among all the calibration procedures, and its accuracy is affected by factors such as imaging aberration, plane mirror flatness, and LCD screen pixel size accuracy. In this paper, based on the derivation of an analytical phase-slope description of the FRT, we present a novel calibration method that does not require calibrating the position of the LCD screen. Moreover, the system can be arbitrarily arranged, and the imaging system can be either telecentric or non-telecentric. In our experiment measuring a spherical mirror with a 5000 mm radius, the proposed calibration method achieves 2.5 times smaller measurement error than the geometric calibration method. In the wafer surface measurement experiment, the result with the proposed calibration method is closer to the interferometer result than that of the geometric calibration method.
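The analytical phase-slope derivation itself is not given in the abstract; as background, the wrapped phase fed into an FRT system is commonly obtained by standard four-step phase shifting, sketched below (a generic assumption, not the authors' method):

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four fringe images shifted by pi/2 each."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringes: I_k = A + B * cos(phi + k*pi/2), k = 0..3
phi = np.linspace(-np.pi + 0.1, np.pi - 0.1, 50)
A, B = 0.5, 0.4
imgs = [A + B * np.cos(phi + k * np.pi / 2) for k in range(4)]
recovered = four_step_phase(*imgs)
print(np.max(np.abs(recovered - phi)))  # ~0 (machine precision)
```

The slope of this phase across the screen is the quantity the paper's calibration works with.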
Extrinsic Calibration of a Laser Galvanometric Setup and a Range Camera.
Sels, Seppe; Bogaerts, Boris; Vanlanduit, Steve; Penne, Rudi
2018-05-08
Currently, galvanometric scanning systems (like the one used in a scanning laser Doppler vibrometer) rely on a planar calibration procedure between a two-dimensional (2D) camera and the laser galvanometric scanning system to automatically aim a laser beam at a particular point on an object. In the case of nonplanar or moving objects, this calibration is no longer sufficiently accurate. In this work, a three-dimensional (3D) calibration procedure that uses a 3D range sensor is proposed. The 3D calibration is valid for all types of objects and retains its accuracy when objects are moved between subsequent measurement campaigns. The proposed 3D calibration uses a Non-Perspective-n-Point (NPnP) problem solution. The 3D range sensor is used to calculate the position of the object under test relative to the laser galvanometric system. With this extrinsic calibration, the laser galvanometric scanning system can automatically aim a laser beam at this object. In experiments, the mean accuracy of aiming the laser beam at an object is below 10 mm for 95% of the measurements. This achieved accuracy is mainly determined by the accuracy and resolution of the 3D range sensor. The new calibration method is significantly better than the original 2D calibration method, which in our setup achieves errors below 68 mm for 95% of the measurements.
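The NPnP formulation is not detailed in the abstract. As a simplified stand-in, the core of such an extrinsic calibration is estimating a rigid transform between corresponding 3D points; here is a sketch using the standard Kabsch/SVD method (my assumption, not the authors' solver):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ≈ R @ src + t,
    for 3xN arrays of corresponding points (Kabsch/SVD method)."""
    cs = src.mean(axis=1, keepdims=True)
    cd = dst.mean(axis=1, keepdims=True)
    H = (src - cs) @ (dst - cd).T
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution:
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

# Synthetic check: recover a known pose from noiseless correspondences.
rng = np.random.default_rng(1)
a = np.radians(25.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([[0.1], [-0.4], [2.0]])
src = rng.random((3, 10))
R_est, t_est = rigid_transform(src, R_true @ src + t_true)
print(np.allclose(R_est, R_true))  # True
```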
Lambert, Mark; Bellamy, Fiona; Budgey, Richard; Callaby, Rebecca; Coats, Julia; Talling, Janet
2018-01-01
Indices of rodent activity are used as indicators of population change during field evaluation of rodenticides. We investigated the potential for using camera traps to determine activity indices for commensal rodents living in and around farm buildings, and sought to compare these indices against previously calibrated survey methods. We recorded 41 263 images of 23 species, including Norway rats (Rattus norvegicus Berk.) and house mice (Mus musculus L.). We found a positive correlation between activity indices from camera traps and activity indices from a method (footprint tracking) previously shown to have a linear relationship with population size for Norway rats. Filtering the camera trap data to simulate a 30-s delay between camera trigger events removed 59.9% of the data and did not adversely affect the correlation between activity indices from camera traps and footprint tracking. The relationship between activity indices from footprint tracking and Norway rat population size is known from a previous study; from this, we determined the relationship between activity indices from camera traps and population size for Norway rats living in and around farm buildings. Systematic use of camera traps was used to determine activity indices for Norway rats living in and around farm buildings; the activity indices were positively correlated with those derived from a method previously calibrated against known population size for this species in this context. © 2017 Crown copyright. Pest Management Science © 2017 Society of Chemical Industry.
A Novel Multi-Digital Camera System Based on Tilt-Shift Photography Technology
Sun, Tao; Fang, Jun-yong; Zhao, Dong; Liu, Xue; Tong, Qing-xi
2015-01-01
Multi-digital camera systems (MDCS) are constantly being improved to meet the increasing requirement for high-resolution spatial data. This study identifies the insufficiencies of traditional MDCSs and proposes a new category of MDCS based on tilt-shift photography to improve the ability of the MDCS to acquire high-accuracy spatial data. A prototype system, including two or four tilt-shift cameras (TSC; camera model: Nikon D90), was developed to validate the feasibility and correctness of the proposed MDCS. As with the cameras of traditional MDCSs, calibration is also essential for the TSCs of the new MDCS. The study constructs indoor control fields and proposes appropriate calibration methods for the TSC, including a digital distortion model (DDM) approach and a two-step calibration strategy. The characteristics of the TSC, for example its edge distortion, are analyzed in detail via a calibration experiment. Finally, the ability of the new MDCS to acquire high-accuracy spatial data is verified through flight experiments. The results illustrate that the geo-position accuracy of the prototype system achieves 0.3 m at a flight height of 800 m, with a spatial resolution of 0.15 m. In addition, a comparison between the traditional (MADC II) and proposed MDCS demonstrates that the latter (0.3 m) provides spatial data with higher accuracy than the former (only 0.6 m) under the same conditions. We also believe that using higher-accuracy TSCs in the new MDCS should further improve the accuracy of downstream photogrammetric products. PMID:25835187
User guide for the USGS aerial camera Report of Calibration.
Tayman, W.P.
1984-01-01
Calibration and testing of aerial mapping cameras includes the measurement of optical constants and a check of the proper functioning of a number of complicated mechanical and electrical parts. For this purpose the US Geological Survey performs an operational-type photographic calibration. This paper is not strictly a scientific paper but rather a 'user guide' to the USGS Report of Calibration of an aerial mapping camera for compliance with both Federal and State mapping specifications. -Author
NASA Astrophysics Data System (ADS)
Shao, Xinxing; Zhu, Feipeng; Su, Zhilong; Dai, Xiangjun; Chen, Zhenning; He, Xiaoyuan
2018-03-01
The strain errors in stereo-digital image correlation (DIC) due to camera calibration were investigated using precisely controlled numerical experiments and real experiments. Three-dimensional rigid body motion tests were conducted to examine the effects of camera calibration on the measured results. For a fully accurate calibration, rigid body motion causes negligible strain errors. However, for inaccurately calibrated camera parameters and a short working distance, rigid body motion will lead to more than 50 με of strain error, which significantly affects the measurement. In practical measurements it is impossible to obtain a fully accurate calibration; therefore, considerable attention should be paid to avoiding these types of errors, especially for high-accuracy strain measurements, and large rigid body motions should be avoided in both two-dimensional DIC and stereo-DIC.
Calibration of the Auger Fluorescence Telescopes
NASA Astrophysics Data System (ADS)
Klages, H.; Pierre Auger Observatory Collaboration
Thirty fluorescence telescopes in four stations will overlook the detector array of the southern hemisphere experiment of the Pierre Auger project. The main aim of these telescopes is the tracking of EHE air showers, measurement of the longitudinal shower development (Xmax) and determination of the absolute energy of EHE events. A telescope camera contains 440 PMTs, each covering a 1.5 x 1.5 degree pixel of the sky. The response of every pixel is converted into the number of charged particles at the observed part of the shower. This reconstruction includes the shower/observer geometry and the details of atmospheric photon production and transport. The remaining experimental task is to convert the ADC counts of the camera pixel electronics into the light flux entering the Schmidt aperture. Three types of calibration and control are necessary: a) monitoring of time-dependent variations has to be performed frequently for all parts of the optics and for all pixels; common illumination of all pixels of a camera allows the detection of individual deviations, while properties of windows, filters and mirrors have to be measured separately. b) Differences in pixel-to-pixel efficiency are mainly due to PMT gain and to differences in effective area (camera shadow, mirror size limits); homogeneous and isotropic illumination will enable cross-calibration. c) An absolute calibration has to be performed periodically using trusted light monitors. The calibration methods used for the Pierre Auger FD telescopes in Argentina are discussed.
PRIMAS: a real-time 3D motion-analysis system
NASA Astrophysics Data System (ADS)
Sabel, Jan C.; van Veenendaal, Hans L. J.; Furnee, E. Hans
1994-03-01
The paper describes a CCD TV-camera-based system for real-time multicamera 2D detection of retro-reflective targets and software for accurate and fast 3D reconstruction. Applications of this system can be found in the fields of sports, biomechanics, rehabilitation research, and various other areas of science and industry. The new feature of real-time 3D opens an even broader perspective of application areas; animations in virtual reality are an interesting example. After presenting an overview of the hardware and the camera calibration method, the paper focuses on the real-time algorithms used for matching of the images and subsequent 3D reconstruction of marker positions. When using a calibrated setup of two cameras, it is now possible to track at least ten markers at 100 Hz. Limitations in the performance are determined by the visibility of the markers, which could be improved by adding a third camera.
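The 3D reconstruction of marker positions from a calibrated two-camera setup, as described above, can be sketched with standard linear (DLT) triangulation; this is an illustrative assumption, not the PRIMAS implementation:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two camera views,
    given 3x4 projection matrices and 2D image observations."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector of A (homogeneous point)
    return X[:3] / X[3]

# Hypothetical calibrated stereo pair: same intrinsics K, 0.5 m baseline.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), [[-0.5], [0.0], [0.0]]])
X_true = np.array([0.2, -0.1, 5.0, 1.0])       # marker, homogeneous coords
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]          # projections in both views
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
print(np.allclose(triangulate(P1, P2, x1, x2), X_true[:3]))  # True
```

In a real-time system like the one described, this solve runs once per matched marker per frame.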
Indoor integrated navigation and synchronous data acquisition method for Android smartphone
NASA Astrophysics Data System (ADS)
Hu, Chunsheng; Wei, Wenjian; Qin, Shiqiao; Wang, Xingshu; Habib, Ayman; Wang, Ruisheng
2015-08-01
Smartphones are widely used at present. Most smartphones have cameras and various sensors, such as a gyroscope, accelerometer and magnetometer. Indoor navigation based on the smartphone is very important and valuable. According to the features of the smartphone and of indoor navigation, a new indoor integrated navigation method is proposed, which uses the MEMS (Micro-Electro-Mechanical Systems) IMU (Inertial Measurement Unit), camera and magnetometer of the smartphone. The proposed navigation method mainly involves data acquisition, camera calibration, image measurement, IMU calibration, initial alignment, strapdown integration, zero velocity update and integrated navigation. Synchronous acquisition of data from the sensors (gyroscope, accelerometer and magnetometer) and the camera is the basis of indoor navigation on the smartphone. A camera data acquisition method is introduced, which uses the camera class of Android to record the images and timestamps of the smartphone camera. Two sensor data acquisition methods are introduced and compared. The first method records sensor data and time with the SensorManager of Android. The second method implements the open, close, data receiving and saving functions in C, and calls the sensor functions from Java through the JNI interface. Data acquisition software was developed with the JDK (Java Development Kit), Android ADT (Android Development Tools) and NDK (Native Development Kit). The software can record camera data, sensor data and time simultaneously. Data acquisition experiments were carried out with the developed software and a Samsung Note 2 smartphone. The experimental results show that the first method of sensor data acquisition is convenient but sometimes loses sensor data, while the second method offers much better real-time performance and far less data loss. A checkerboard image was recorded, and the corner points of the checkerboard were detected with the Harris method.
The sensor data of the gyroscope, accelerometer and magnetometer were recorded for about 30 minutes, and the bias stability and noise characteristics of the sensors were analyzed. Besides indoor integrated navigation, the integrated navigation and synchronous data acquisition method can also be applied to outdoor navigation.
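Synchronous data acquisition ultimately requires pairing each camera frame with the sensor samples closest in time. A hedged sketch of such nearest-timestamp matching, with illustrative sample rates (not the paper's code):

```python
import numpy as np

def nearest_samples(frame_ts, sensor_ts):
    """Index of the sensor sample closest in time to each camera frame,
    assuming sensor_ts is sorted ascending."""
    idx = np.clip(np.searchsorted(sensor_ts, frame_ts), 1, len(sensor_ts) - 1)
    left = frame_ts - sensor_ts[idx - 1]   # gap to the sample just before
    right = sensor_ts[idx] - frame_ts      # gap to the sample just after
    return np.where(left <= right, idx - 1, idx)

sensor_ts = np.arange(0.0, 1.0, 0.01)       # e.g. 100 Hz IMU samples
frame_ts = np.array([0.0, 0.333, 0.999])    # e.g. camera frame timestamps
print(nearest_samples(frame_ts, sensor_ts))  # indices 0, 33, 99
```

Interpolating between the two neighbouring samples instead of picking the nearest one is a natural refinement.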
[Optimization of end-tool parameters based on robot hand-eye calibration].
Zhang, Lilong; Cao, Tong; Liu, Da
2017-04-01
A new one-time registration method was developed in this research for hand-eye calibration of a surgical robot, to simplify the operation process and reduce the preparation time. A new and practical method is also introduced to optimize the end-tool parameters of the surgical robot based on an analysis of the error sources in this registration method. In the one-time registration method, a marker on the end-tool of the robot is first recognized by a fixed binocular camera, and the orientation and position of the marker are then calculated based on the joint parameters of the robot. The relationship between the camera coordinate system and the robot base coordinate system can then be established to complete the hand-eye calibration. Because of manufacturing and assembly errors of the robot end-tool, an error equation was established with the transformation matrix between the robot end coordinate system and the robot end-tool coordinate system as the variable. Numerical optimization was employed to optimize the end-tool parameters of the robot. The experimental results showed that the one-time registration method could significantly improve the efficiency of robot hand-eye calibration compared with existing methods, and that the parameter optimization method could significantly improve its absolute positioning accuracy. The absolute positioning accuracy of the one-time registration method meets the requirements of clinical surgery.
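The one-time registration described above can be sketched as a transform composition: the marker pose is known both in the robot base frame (via joint parameters) and in the camera frame (via the binocular camera), which fixes the base-to-camera relationship. The poses below are illustrative, not from the paper:

```python
import numpy as np

def pose(R, t):
    """Assemble a 4x4 homogeneous transform from rotation and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# Marker pose in the robot base frame and in the camera frame (made up):
T_base_marker = pose(rot_z(0.3), [0.4, 0.1, 0.2])
T_cam_marker = pose(rot_z(-0.5), [0.0, 0.05, 0.8])

# One-time registration: camera pose in the base frame follows directly.
T_base_cam = T_base_marker @ np.linalg.inv(T_cam_marker)
print(np.allclose(T_base_cam @ T_cam_marker, T_base_marker))  # True
```

The paper's error equation then treats residual end-tool transform errors in this chain as the optimization variable.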
Verification of the ISO calibration method for field pyranometers under tropical sky conditions
NASA Astrophysics Data System (ADS)
Janjai, Serm; Tohsing, Korntip; Pattarapanitchai, Somjet; Detkhon, Pasakorn
2017-02-01
Field pyranometers need to be calibrated annually, and the International Organization for Standardization (ISO) has defined a standard method (ISO 9847) for calibrating these pyranometers. According to this standard method for outdoor calibration, the field pyranometers have to be compared with a reference pyranometer for a period of 2 to 14 days, depending on sky conditions. In this work, the ISO 9847 standard method was verified under tropical sky conditions. To verify the standard method, calibration of field pyranometers was conducted at a tropical site located in Nakhon Pathom (13.82° N, 100.04° E), Thailand under various sky conditions. The conditions of the sky were monitored using a sky camera. The calibration results for the different time periods used for the calibration under various sky conditions were analyzed. It was found that the calibration periods given by this standard method could be reduced without significant change in the final calibration result. In addition, recommendations and a discussion on the use of this standard method in the tropics are also presented.
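A hedged sketch of the idea underlying such a comparison calibration: the field pyranometer's sensitivity follows from simultaneous voltage readings against a reference instrument of known sensitivity. The voltages and sensitivity below are invented for illustration, and this simplifies the full ISO 9847 data-selection procedure:

```python
import numpy as np

# Simultaneous readings (mV) under the same irradiance, illustrative only:
v_ref = np.array([8.1, 9.0, 7.6, 8.8])     # reference pyranometer
v_field = np.array([7.9, 8.8, 7.4, 8.6])   # field pyranometer
s_ref = 8.0e-3                             # known sensitivity, mV per W/m^2

# Ratio-of-sums transfer of the reference sensitivity to the field unit:
s_field = s_ref * v_field.sum() / v_ref.sum()
print(f"field sensitivity: {s_field:.4e} mV per W/m^2")
```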
Mast Camera and Its Calibration Target on Curiosity Rover
2013-03-18
This set of images illustrates the twin cameras of the Mastcam instrument on NASA's Curiosity Mars rover (upper left), the Mastcam calibration target (lower center), and the locations of the cameras and target on the rover.
Self calibrating monocular camera measurement of traffic parameters.
DOT National Transportation Integrated Search
2009-12-01
This proposed project will extend the work of previous projects that have developed algorithms and software : to measure traffic speed under adverse conditions using un-calibrated cameras. The present implementation : uses the WSDOT CCTV cameras moun...
A method and results of color calibration for the Chang'e-3 terrain camera and panoramic camera
NASA Astrophysics Data System (ADS)
Ren, Xin; Li, Chun-Lai; Liu, Jian-Jun; Wang, Fen-Fei; Yang, Jian-Feng; Liu, En-Hai; Xue, Bin; Zhao, Ru-Jin
2014-12-01
The terrain camera (TCAM) and panoramic camera (PCAM) are two of the major scientific payloads installed on the lander and rover of the Chang'e 3 mission, respectively. Both use a CMOS sensor covered by a Bayer color filter array to capture color images of the Moon's surface. The RGB values of the original images are specific to these two kinds of cameras, so there is an obvious color difference compared with human visual perception. This paper follows standards published by the International Commission on Illumination to establish a color correction model, designs the ground calibration experiment and obtains the color correction coefficients. The image quality has been significantly improved and there is no obvious color difference in the corrected images. Ground experimental results show that: (1) compared with the uncorrected images, the average color difference of TCAM is 4.30, which has been reduced by 62.1%; (2) the average color differences of the left and right cameras in PCAM are 4.14 and 4.16, which have been reduced by 68.3% and 67.6% respectively.
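The correction model itself is not given in the abstract; a common simplification is a linear matrix mapping device RGB to CIE XYZ, fitted by least squares over calibration patches. The sketch below is an assumption for illustration, not the Chang'e-3 model:

```python
import numpy as np

# Synthetic ground truth: a plausible (invented) RGB -> XYZ matrix.
rng = np.random.default_rng(0)
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
rgb = rng.random((24, 3))       # 24 calibration colour patches
xyz = rgb @ M_true.T            # "measured" tristimulus values

# Least-squares fit of the correction matrix from patch data:
M_fit, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
print(np.allclose(M_fit.T, M_true))  # True (noiseless data)
```

With real, noisy patch measurements the residual of this fit would be reported as an average color difference, as in the abstract.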
Beyl, Tim; Nicolai, Philip; Comparetti, Mirko D; Raczkowsky, Jörg; De Momi, Elena; Wörn, Heinz
2016-07-01
Scene supervision is a major tool for making medical robots safer and more intuitive. This paper presents an approach to efficiently use 3D cameras within the surgical operating room to enable safe human-robot interaction and action perception. Additionally, the presented approach aims to make 3D camera-based scene supervision more reliable and accurate. A camera system composed of multiple Kinect and time-of-flight cameras was designed, implemented and calibrated. Calibration, object detection and people tracking methods were designed and evaluated. The camera system shows a good registration accuracy of 0.05 m. The tracking of humans is reliable and accurate and was evaluated in an experimental setup using operating clothing. The robot detection shows an error of around 0.04 m. The robustness and accuracy of the approach allow for integration into the modern operating room. The data output can be used directly for situation and workflow detection as well as collision avoidance.
Assessing the Accuracy of Ortho-image using Photogrammetric Unmanned Aerial System
NASA Astrophysics Data System (ADS)
Jeong, H. H.; Park, J. W.; Kim, J. S.; Choi, C. U.
2016-06-01
A smart camera can not only be operated in a network environment at any time and place, but also costs less than existing photogrammetric UAV payloads, since it provides high-resolution images and real-time 3D location and attitude data from a variety of built-in sensors. This study's proposed UAV photogrammetric method used a low-cost UAV and a smart camera. The elements of interior orientation were acquired through camera calibration. Image triangulation was conducted with and without consideration of the interior orientation (IO) parameters determined by camera calibration. A Digital Elevation Model (DEM) was constructed using the image data photographed over the target area and the results of the ground control point survey. This study also analyzes the applicability of the proposed method by comparing an ortho-image with the results of the ground control point survey. Considering these findings, it is suggested that the smartphone is very feasible as a payload for a UAV system. It is also expected that smartphones may be loaded onto existing UAVs, playing significant direct or indirect roles.
Measurement of large steel plates based on linear scan structured light scanning
NASA Astrophysics Data System (ADS)
Xiao, Zhitao; Li, Yaru; Lei, Geng; Xi, Jiangtao
2018-01-01
A measurement method based on linear structured light scanning is proposed to achieve accurate measurement of the complex internal shape of large steel plates. Firstly, using a calibration plate with circular marks, an improved line-scan calibration method is designed, and the internal and external parameters of the camera are determined through this calibration. Secondly, images of the steel plates are acquired by a line scan camera. The Canny edge detection method is then used to extract approximate contours of the steel plate images, and a Gaussian fitting algorithm is used to extract the sub-pixel edges of the steel plate contours. Thirdly, to address the inaccurate restoration of contour size, the horizontal and vertical error curves of the images are obtained by measuring the distances between adjacent points in a grid of known dimensions. Finally, these horizontal and vertical error curves are used to correct the contours of the steel plates and, combined with the internal and external calibration parameters, the size of these contours is calculated. The experimental results demonstrate that the proposed method achieves an error of 1 mm/m in a 1.2 m × 2.6 m field of view, which satisfies the demands of industrial measurement.
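The sub-pixel edge extraction step can be illustrated with a Gaussian (log-parabola) fit to three samples around an intensity peak; this is a generic sketch, not the authors' exact algorithm:

```python
import numpy as np

def subpixel_peak(y0, y1, y2):
    """Sub-pixel offset of a maximum from three positive samples at
    x = -1, 0, +1, assuming a locally Gaussian profile (the log of a
    Gaussian is an exact parabola)."""
    l0, l1, l2 = np.log(y0), np.log(y1), np.log(y2)
    return 0.5 * (l0 - l2) / (l0 - 2.0 * l1 + l2)

# Sample a Gaussian centred at x = 10.3 on integer pixel positions:
x = np.array([9.0, 10.0, 11.0])
y = np.exp(-((x - 10.3) ** 2) / (2.0 * 0.8 ** 2))
offset = subpixel_peak(*y)
print(round(10.0 + offset, 3))  # 10.3
```

For an edge (rather than a peak), the same fit is typically applied to the gradient magnitude along the scan line.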
Active learning in camera calibration through vision measurement application
NASA Astrophysics Data System (ADS)
Li, Xiaoqin; Guo, Jierong; Wang, Xianchun; Liu, Changqing; Cao, Binfang
2017-08-01
Since cameras are increasingly used in scientific applications as well as in applications requiring precise visual information, effective calibration of such cameras is becoming more important. There are many reasons why measurements of objects are not accurate. The largest is lens distortion. Another detrimental influence on evaluation accuracy is perspective distortion in the image, which occurs whenever the camera cannot be mounted perpendicular to the objects to be measured. Overall, it is very important for students to understand how to correct lens distortions, that is, camera calibration. Once the camera is calibrated and the images are rectified, undistorted measurements in world coordinates become possible. This paper presents how students can develop a sense of active learning for the mathematical camera model alongside the theoretical scientific basics. The authors present theoretical and practical lectures whose goal is to deepen the students' understanding of the mathematical models of area-scan cameras and to have them build practical vision measurement processes themselves.
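The lens distortion the abstract refers to is commonly modelled with radial polynomial terms (the Brown model). The sketch below applies and inverts that model for a normalized image point; the coefficient values are illustrative, not taken from the paper.

```python
def distort(x, y, k1, k2):
    """Apply the radial (Brown) distortion model to a normalized
    image point: x_d = x * (1 + k1*r^2 + k2*r^4), same for y."""
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * f, y * f

def undistort(xd, yd, k1, k2, iters=20):
    """Invert the model by fixed-point iteration, which converges for
    the small coefficients typical of machine-vision lenses."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        f = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / f, yd / f
    return x, y

# A point distorted by a mildly barrel-distorting lens, then recovered:
xd, yd = distort(0.3, 0.2, -0.1, 0.01)
x, y = undistort(xd, yd, -0.1, 0.01)
```

Rectifying every pixel this way is exactly the step that makes world-coordinate measurements possible once the intrinsics are known.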
Vanishing Point Extraction and Refinement for Robust Camera Calibration
Tsai, Fuan
2017-01-01
This paper describes a flexible camera calibration method using refined vanishing points without prior information. Vanishing points are estimated from human-made features such as parallel lines and repeated patterns. With vanishing points extracted from three mutually orthogonal directions, the interior and exterior orientation parameters can be calculated using collinearity condition equations. A vanishing point refinement process is proposed to reduce the uncertainty caused by vanishing point localization errors. The fine-tuning algorithm is based on the divergence of grouped feature points projected onto the reference plane, minimizing the standard deviation of each group of collinear points with O(1) computational complexity. This paper also presents an automated vanishing point estimation approach based on the cascaded Hough transform. The experimental results indicate that the vanishing point refinement process can significantly improve the camera calibration parameters, and the root mean square error (RMSE) of the constructed 3D model can be reduced by about 30%. PMID:29280966
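A vanishing point is simply where the images of parallel scene lines intersect, which is compactly computed with homogeneous coordinates: the line through two points, and the intersection of two lines, are both cross products. A minimal sketch (not the paper's refinement algorithm):

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def line_through(p, q):
    """Homogeneous line through two image points (x, y)."""
    return cross((p[0], p[1], 1.0), (q[0], q[1], 1.0))

def vanishing_point(seg1, seg2):
    """Intersection of two image lines (each given by two points),
    i.e. the vanishing point of that family of parallel scene lines."""
    l1 = line_through(*seg1)
    l2 = line_through(*seg2)
    x, y, w = cross(l1, l2)
    return (x / w, y / w)

# Two converging building edges meeting above the image:
vp = vanishing_point(((-50, 0), (-25, 50)), ((50, 0), (25, 50)))
print(vp)  # → (0.0, 100.0)
```

The refinement stage described in the paper then perturbs such raw estimates so that the reprojected feature groups stay maximally collinear.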
Harbour surveillance with cameras calibrated with AIS data
NASA Astrophysics Data System (ADS)
Palmieri, F. A. N.; Castaldo, F.; Marino, G.
The inexpensive availability of surveillance cameras, easily connected in network configurations, suggests deploying this additional sensor modality in port surveillance. Vessels appearing within the cameras' fields of view can be recognized and localized, providing fusion centers with information that can be added to data coming from radar, lidar, AIS, etc. Camera systems used as localizers, however, must be properly calibrated in changing scenarios where there is often limited choice of deployment position. Automatic Identification System (AIS) data, which include position, course and vessel identity and are freely available through inexpensive receivers for some of the vessels appearing within the field of view, provide the opportunity to achieve proper camera calibration for localizing vessels not equipped with AIS transponders. In this paper we assume a pinhole camera model and propose computing the perspective matrices from AIS positional data. Images obtained from the calibrated cameras are then matched, and pixel association is utilized to localize other vessels. We report preliminary experimental results of calibration and localization using two cameras deployed on the Gulf of Naples coastline. The two cameras overlook a section of the harbour and record short video sequences that are synchronized offline with the AIS positional information of easily identified passenger ships. Other small vessels, not equipped with AIS transponders, are localized using the camera matrices and pixel matching. Localization accuracy is evaluated experimentally as a function of target distance from the sensors.
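The pinhole model the authors assume maps an AIS-reported world position to a pixel through a perspective matrix P = K[R|t]. The forward projection is sketched below with illustrative intrinsics; collecting enough AIS world-pixel pairs lets the entries of P be solved (e.g. by direct linear transformation), which is the calibration step the paper performs.

```python
def project(K, R, t, X):
    """Project a world point X (e.g. an AIS-reported vessel position,
    in metres) to pixel coordinates through a pinhole camera K[R|t]."""
    # World -> camera frame
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # Perspective division and intrinsic mapping
    u = K[0][0] * Xc[0] / Xc[2] + K[0][2]
    v = K[1][1] * Xc[1] / Xc[2] + K[1][2]
    return u, v

# Illustrative 1280x720 camera looking along +Z from the origin:
K = [[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = [0.0, 0.0, 0.0]
print(project(K, R, t, [10.0, 5.0, 100.0]))  # → (740.0, 410.0)
```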
Generation of high-dynamic range image from digital photo
NASA Astrophysics Data System (ADS)
Wang, Ying; Potemin, Igor S.; Zhdanov, Dmitry D.; Wang, Xu-yang; Cheng, Han
2016-10-01
A number of modern applications, such as medical imaging, remote sensing satellite imaging and virtual prototyping, use High Dynamic Range Images (HDRI). Generally, to obtain an HDRI from an ordinary digital image, the camera must be calibrated. This article proposes a camera calibration method that uses the clear sky as a standard light source, taking the sky luminance from the CIE sky model for the corresponding geographical coordinates and time. The article considers the basic algorithms for recovering real luminance values from an ordinary digital image and the corresponding software implementation of these algorithms. Examples of HDRIs reconstructed from ordinary images illustrate the article.
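In the simplest form of such a calibration, the CIE sky model supplies the absolute luminance of one sky patch, which fixes the scale factor of an assumed power-law camera response; every other pixel can then be mapped to luminance. The gamma value and numbers below are illustrative assumptions, not the article's actual response model.

```python
def calibrate_scale(pixel_value, sky_luminance_cd_m2, gamma=2.2):
    """Solve L = k * (p/255)**gamma for the scale k, using one sky
    patch whose luminance is known from the CIE sky model."""
    return sky_luminance_cd_m2 / (pixel_value / 255.0) ** gamma

def pixel_to_luminance(pixel_value, k, gamma=2.2):
    """Map an 8-bit pixel value to absolute luminance (cd/m^2)."""
    return k * (pixel_value / 255.0) ** gamma

# Anchor: a zenith patch reading 200 is known to be 8000 cd/m^2.
k = calibrate_scale(200, 8000.0)
print(pixel_to_luminance(200, k))  # → 8000.0 (reproduces the anchor)
print(pixel_to_luminance(100, k) < 8000.0)  # darker pixel, lower luminance
```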
Heliostat kinematic system calibration using uncalibrated cameras
NASA Astrophysics Data System (ADS)
Burisch, Michael; Gomez, Luis; Olasolo, David; Villasante, Cristobal
2017-06-01
The efficiency of the solar field greatly depends on the ability of the heliostats to precisely reflect solar radiation onto a central receiver. To control the heliostats with such precision, accurate knowledge of the motion of each of them, modeled as a kinematic system, is required. Determining the parameters of this system for each heliostat through calibration is crucial for efficient operation of the solar field. For small heliostats, performing such a calibration quickly and automatically is imperative, as the solar field may contain tens or even hundreds of thousands of them. A calibration system that can rapidly recalibrate a whole solar field would also reduce costs: heliostats are generally designed to provide stability over a long period of time, and if this requirement can be relaxed, with any resulting error compensated by adapting the model parameters, heliostat costs can be reduced. The presented method describes such an automatic calibration system using uncalibrated cameras rigidly attached to each heliostat. The cameras observe targets spread throughout the solar field; based on this, the kinematic system of each heliostat can be estimated with high precision. A comparison of this approach with similar solutions shows the viability of the proposed solution.
Calibration Target for Curiosity Arm Camera
2012-09-10
This view of the calibration target for the MAHLI camera aboard NASA's Mars rover Curiosity combines two images taken by that camera on Sept. 9, 2012. Part of Curiosity's left-front and center wheels and a patch of Martian ground are also visible.
Design and simulation of a sensor for heliostat field closed loop control
NASA Astrophysics Data System (ADS)
Collins, Mike; Potter, Daniel; Burton, Alex
2017-06-01
Significant research has been completed in pursuit of capital cost reductions for heliostats [1],[2]. The camera array closed loop control concept has the potential to radically alter the way heliostats are controlled and installed by replacing high-quality open loop targeting systems with low-quality targeting devices that rely on measurement of image position to remove tracking errors during operation. Although the system could be used for any heliostat size, it particularly benefits small heliostats by reducing actuation costs, enabling large numbers of heliostats to be calibrated simultaneously, and enabling calibration of heliostats that produce low irradiance (similar to or less than ambient light) on Lambertian calibration targets, such as small heliostats far from the tower. A simulation method for the camera array has been designed and verified experimentally. The simulation tool demonstrates that closed loop calibration or control is possible using this device.
Low-cost precision rotary index calibration
NASA Astrophysics Data System (ADS)
Ng, T. W.; Lim, T. S.
2005-08-01
The traditional method for calibrating the angular indexing repeatability of rotary axes on machine tools and measuring equipment uses a precision polygon (usually 12-sided) together with an autocollimator or angular interferometer. Such a setup is typically expensive. Here, we propose a far more cost-effective approach that uses just a laser, a diffractive optical element, and a CCD camera. We show that high accuracy can be achieved for angular index calibration.
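A diffractive optical element provides angular references because its diffraction orders leave at precisely known angles: sin θ_m = mλ/d for order m, wavelength λ and grating period d. The sketch below computes those reference angles; the laser wavelength and period are illustrative values, not taken from the paper.

```python
import math

def order_angle_deg(m, wavelength_nm, period_um):
    """Diffraction angle (degrees) of order m for a grating of the
    given period: sin(theta_m) = m * lambda / d."""
    s = m * wavelength_nm * 1e-9 / (period_um * 1e-6)
    if abs(s) > 1.0:
        raise ValueError("order %d is evanescent" % m)
    return math.degrees(math.asin(s))

# A 632.8 nm HeNe laser on a 20-micron-period element:
for m in range(1, 4):
    print(m, order_angle_deg(m, 632.8, 20.0))
```

Comparing the CCD-measured spot positions against these fixed angles is what turns the laser/grating/camera trio into an angular index reference.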
Method and apparatus for calibrating a display using an array of cameras
NASA Technical Reports Server (NTRS)
Johnson, Michael J. (Inventor); Chen, Chung-Jen (Inventor); Chandrasekhar, Rajesh (Inventor)
2001-01-01
The present invention overcomes many of the disadvantages of the prior art by providing a display that can be calibrated and re-calibrated with a minimal amount of manual intervention. To accomplish this, the present invention provides one or more cameras to capture an image that is projected on a display screen. In one embodiment, the one or more cameras are placed on the same side of the screen as the projectors. In another embodiment, an array of cameras is provided on either or both sides of the screen for capturing a number of adjacent and/or overlapping capture images of the screen. In either of these embodiments, the resulting capture images are processed to identify any non-desirable characteristics including any visible artifacts such as seams, bands, rings, etc. Once the non-desirable characteristics are identified, an appropriate transformation function is determined. The transformation function is used to pre-warp the input video signal to the display such that the non-desirable characteristics are reduced or eliminated from the display. The transformation function preferably compensates for spatial non-uniformity, color non-uniformity, luminance non-uniformity, and/or other visible artifacts.
Instrumental Response Model and Detrending for the Dark Energy Camera
Bernstein, G. M.; Abbott, T. M. C.; Desai, S.; ...
2017-09-14
We describe the model for mapping from sky brightness to the digital output of the Dark Energy Camera (DECam) and the algorithms adopted by the Dark Energy Survey (DES) for inverting this model to obtain photometric measures of celestial objects from the raw camera output. This calibration aims for fluxes that are uniform across the camera field of view and across the full angular and temporal span of the DES observations, approaching the accuracy limits set by shot noise for the full dynamic range of DES observations. The DES pipeline incorporates several substantive advances over standard detrending techniques, including principal-components-based sky and fringe subtraction; correction of the "brighter-fatter" nonlinearity; use of internal consistency in on-sky observations to disentangle the influences of quantum efficiency, pixel-size variations, and scattered light in the dome flats; and pixel-by-pixel characterization of instrument spectral response, through combination of internal-consistency constraints with auxiliary calibration data. This article provides conceptual derivations of the detrending/calibration steps, and the procedures for obtaining the necessary calibration data. Other publications will describe the implementation of these concepts for the DES operational pipeline, the detailed methods, and the validation that the techniques can bring DECam photometry and astrometry within ≈2 mmag and ≈3 mas, respectively, of fundamental atmospheric and statistical limits. In conclusion, the DES techniques should be broadly applicable to wide-field imagers.
NASA Astrophysics Data System (ADS)
Liu, Chengwei; Sui, Xiubao; Gu, Guohua; Chen, Qian
2018-02-01
For the uncooled long-wave infrared (LWIR) camera, the infrared (IR) irradiation that the focal plane array (FPA) receives is a crucial factor affecting image quality. Ambient temperature fluctuation as well as system power consumption can change the FPA temperature and the radiation characteristics inside the IR camera, further degrading imaging performance. In this paper, we present a novel shutterless non-uniformity correction method to compensate for the non-uniformity caused by ambient temperature variation. Our method combines a calibration-based method with the properties of a scene-based method to obtain correction parameters under different ambient temperature conditions, so that the IR camera is less influenced by ambient temperature fluctuation or system power consumption. The calibration process is carried out in a temperature chamber with a slowly changing ambient temperature and a blackbody as a uniform radiation source. A sufficient number of uniform images are captured and the gain coefficients are calculated during this period. In practical application, the offset parameters are then calculated via the least squares method from the gain coefficients, the captured uniform images and the actual scene, giving a corrected output through the gain coefficients and offset parameters. The performance of the proposed method is evaluated on realistic IR images and compared with two existing methods. The images used in the experiments were obtained by a 384 × 288 pixel uncooled LWIR camera. Results show that the proposed method adaptively updates the correction parameters as the target scene changes and is more stable to temperature fluctuation than the other two methods.
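The calibration-based half of such schemes is classical two-point non-uniformity correction: two uniform blackbody frames give each pixel a gain and offset that map its response onto the frame means. A minimal per-pixel sketch (the paper's contribution is re-estimating the offsets in operation, which is not shown here):

```python
def gain_offset(low, high, mean_low, mean_high):
    """Per-pixel two-point NUC coefficients from two uniform blackbody
    frames: map this pixel's (low, high) readings onto the frame means."""
    g = (mean_high - mean_low) / (high - low)
    o = mean_low - g * low
    return g, o

def correct(raw, g, o):
    """Apply the linear correction to a raw pixel reading."""
    return g * raw + o

# This pixel reads 95/210 while the frame means are 100/200:
g, o = gain_offset(95.0, 210.0, 100.0, 200.0)
print(correct(95.0, g, o), correct(210.0, g, o))  # → 100.0 200.0
```

In the shutterless method above, the gains g stay fixed from the chamber calibration while the offsets o are continually re-solved by least squares against the live scene.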
Principal axis-based correspondence between multiple cameras for people tracking.
Hu, Weiming; Hu, Min; Zhou, Xue; Tan, Tieniu; Lou, Jianguang; Maybank, Steve
2006-04-01
Visual surveillance using multiple cameras has attracted increasing interest in recent years. Correspondence between multiple cameras is one of the most important and basic problems that such surveillance raises. In this paper, we propose a simple and robust method, based on the principal axes of people, to match people across multiple cameras. A correspondence likelihood reflecting the similarity of pairs of principal axes is constructed from the relationship between the "ground points" of people detected in each camera view and the intersections of principal axes detected in different camera views and transformed to the same view. Our method has the following desirable properties: 1) camera calibration is not needed; 2) accurate motion detection and segmentation are less critical, owing to the robustness of the principal-axis feature to noise; and 3) based on the fused data derived from the correspondence results, the positions of people in each camera view can be accurately located even when the people are partially occluded in all views. Experimental results on several real video sequences from outdoor environments demonstrate the effectiveness, efficiency, and robustness of our method.
A three dimensional point cloud registration method based on rotation matrix eigenvalue
NASA Astrophysics Data System (ADS)
Wang, Chao; Zhou, Xiang; Fei, Zixuan; Gao, Xiaofei; Jin, Rui
2017-09-01
In traditional optical three-dimensional measurement, an object usually needs to be measured from multiple angles because of occlusion, and point cloud registration methods are then used to obtain the complete three-dimensional shape of the object. Turntable-based point cloud registration essentially requires calculating the coordinate transformation matrix between the camera coordinate system and the turntable coordinate system. The traditional method computes this transformation by fitting the rotation center and rotation axis of the turntable, which is limited by the measurement field of view: the exact feature points available for fitting are distributed over an arc of less than about 120 degrees, resulting in low fitting accuracy. In this paper, we propose a better method based on the principle that the eigenvalues of the rotation matrix are invariant in the turntable coordinate system, together with the coordinate transformation matrix of corresponding points. First, we control the rotation angle of a calibration plate with the turntable and calibrate the transformation matrix of the corresponding points using the least squares method. We then use eigendecomposition to calculate the transformation matrix between the camera coordinate system and the turntable coordinate system. Compared with the traditional method, this approach has higher accuracy and better robustness and is not affected by the camera field of view. With this method, the coincidence error of the corresponding points on the calibration plate after registration is less than 0.1 mm.
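The eigen-structure the method exploits is that a rotation matrix always has eigenvalue 1, whose eigenvector is the rotation axis. For a generic rotation, axis and angle can be read off directly from the trace and the skew-symmetric part of R, as this generic sketch shows (it is not the paper's full calibration pipeline):

```python
import math

def axis_angle(R):
    """Recover the rotation axis and angle of a 3x3 rotation matrix.
    The axis is the eigenvector of R for eigenvalue 1; for angles in
    (0, pi) it can be read off the skew-symmetric part of R."""
    tr = R[0][0] + R[1][1] + R[2][2]
    theta = math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0)))
    s = 2.0 * math.sin(theta)
    axis = ((R[2][1] - R[1][2]) / s,
            (R[0][2] - R[2][0]) / s,
            (R[1][0] - R[0][1]) / s)
    return axis, theta

# A 90-degree turntable step about the z axis:
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]
print(axis_angle(R))  # → axis (0.0, 0.0, 1.0), angle pi/2
```

Estimating this invariant axis from the corresponding-point transformation is what frees the method from fitting the rotation center over a narrow visible arc.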
Solving the robot-world, hand-eye(s) calibration problem with iterative methods
USDA-ARS?s Scientific Manuscript database
Robot-world, hand-eye calibration is the problem of determining the transformation between the robot end effector and a camera, as well as the transformation between the robot base and the world coordinate system. This relationship has been modeled as AX = ZB, where X and Z are unknown homogeneous ...
Geometrical calibration television measuring systems with solid state photodetectors
NASA Astrophysics Data System (ADS)
Matiouchenko, V. G.; Strakhov, V. V.; Zhirkov, A. O.
2000-11-01
Various optical measuring methods for deriving information about the size and form of objects are now used in different branches: mechanical engineering, medicine, art, and criminalistics. Measurement by means of digital television systems is one of these methods. The development of this direction is promoted by the market availability of television cameras and frame grabbers of various types and costs. Many television measuring systems use expensive cameras, but the accuracy performance of low-cost cameras is also of interest to system developers. For this reason, the inexpensive board-level camera SK1004CP (1/3" format, costing up to $40) and an Aver2000 frame grabber were used in the experiments.
NASA Astrophysics Data System (ADS)
Zhang, Shaojun; Xu, Xiping
2015-10-01
Because it is suitable for automatic analysis and judgment of the carrier's ambient environment by image recognition algorithms, the 360-degree all-round-looking camera is usually applied in the opto-electronic radar of robots and smart cars. To ensure stable and consistent image processing results in mass production, the centers of the image planes of different cameras must coincide, which requires calibrating the position of the image plane's center. The traditional mechanical calibration method and the electronic adjustment mode of entering offsets manually both rely on human eyes, are inefficient, and exhibit a large error distribution. In this paper, an approach for auto-calibration of the image plane of this camera is presented. The image formed by the 360-degree all-round-looking camera is ring-shaped, consisting of two concentric circles: a smaller inner circle and a larger outer circle. The technique exploits exactly these characteristics: recognizing the two circles with the Hough transform and calculating the center position yields the accurate image center, that is, the deviation between the optical axis and the image sensor. The program then configures the image sensor chip over the I2C bus automatically, so the center of the image plane can be adjusted automatically and accurately. The technique has been applied in practice; it improves productivity and guarantees consistent product quality.
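Once edge points on either ring have been detected (by a Hough transform in the paper), the ring center is just the center of the circle through them. As a minimal geometric illustration, any three boundary points determine that center via the circumcenter formula; this sketch stands in for, and is much simpler than, the full Hough voting procedure:

```python
def circumcenter(p1, p2, p3):
    """Centre of the circle through three points — e.g. three edge
    points detected on the outer ring of the panoramic image."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return ux, uy

# Three points on a unit-radius ring centred at pixel (320, 240):
print(circumcenter((321, 240), (320, 241), (319, 240)))  # → (320.0, 240.0)
```

The offset between this recovered center and the sensor's nominal center is exactly the correction the camera writes back over I2C.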
Measurement of the timing behaviour of off-the-shelf cameras
NASA Astrophysics Data System (ADS)
Schatz, Volker
2017-04-01
This paper presents a measurement method suitable for investigating the timing properties of cameras. A single light source illuminates the camera detector starting with a varying defined delay after the camera trigger. Pixels from the recorded camera frames are summed up and normalised, and the resulting function is indicative of the overlap between illumination and exposure. This allows one to infer the trigger delay and the exposure time with sub-microsecond accuracy. The method is therefore of interest when off-the-shelf cameras are used in reactive systems or synchronised with other cameras. It can supplement radiometric and geometric calibration methods for cameras in scientific use. A closer look at the measurement results reveals deviations from the ideal camera behaviour of constant sensitivity limited to the exposure interval. One of the industrial cameras investigated retains a small sensitivity long after the end of the nominal exposure interval. All three investigated cameras show non-linear variations of sensitivity of O(10^-3) to O(10^-2) during exposure. Due to its sign, the latter effect cannot be described by a sensitivity function depending on the time after triggering, but represents non-linear pixel characteristics.
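For an ideal camera, the normalised response as a function of pulse delay is simply the overlap between the illumination pulse and the exposure window; the ramp edges of that curve locate the trigger delay and the exposure end. A minimal model of this (assuming, for illustration, zero trigger delay):

```python
def overlap(delay, t_ill, t_exp):
    """Normalised response of an ideal camera: the overlap between an
    illumination pulse [delay, delay + t_ill] (relative to the trigger)
    and the exposure window [0, t_exp], divided by the pulse length."""
    lo = max(delay, 0.0)
    hi = min(delay + t_ill, t_exp)
    return max(hi - lo, 0.0) / t_ill

# Sweeping the delay of a 10-unit pulse across a 100-unit exposure
# traces out the ramp-plateau-ramp curve described above:
curve = [overlap(d, 10.0, 100.0) for d in (-5.0, 0.0, 50.0, 95.0, 120.0)]
print(curve)  # → [0.5, 1.0, 1.0, 0.5, 0.0]
```

The deviations the paper reports are precisely departures of the measured curve from this idealised trapezoid.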
Geometric Calibration and Validation of Ultracam Aerial Sensors
NASA Astrophysics Data System (ADS)
Gruber, Michael; Schachinger, Bernhard; Muick, Marc; Neuner, Christian; Tschemmernegg, Helfried
2016-03-01
We present details of the calibration and validation procedure for UltraCam aerial camera systems. Results from laboratory calibration and from validation flights are presented for both the large-format nadir cameras and the oblique cameras. Thus, in this contribution we show results from the UltraCam Eagle and the UltraCam Falcon, both nadir mapping cameras, and the UltraCam Osprey, our oblique camera system. This sensor offers a mapping-grade nadir component together with four oblique camera heads. The geometric processing after the flight mission is covered by the UltraMap software product, so we present details of the workflow as well. The first part is the initial post-processing, which combines image information with camera parameters derived from the laboratory calibration. The second part, the traditional automated aerial triangulation (AAT), is the step from single images to blocks and enables an additional optimization process. We also present some special features of our software, designed to help the operator analyze large blocks of aerial images and judge the quality of the photogrammetric set-up.
Calibrating the orientation between a microlens array and a sensor based on projective geometry
NASA Astrophysics Data System (ADS)
Su, Lijuan; Yan, Qiangqiang; Cao, Jun; Yuan, Yan
2016-07-01
We demonstrate a method for calibrating the orientation of a microlens array (MLA) with respect to a sensor by building a plenoptic camera with a conventional prime lens. The calibration method includes a geometric model, a setup to adjust the distance (L) between the prime lens and the MLA, a calibration procedure for determining the subimage centers, and an optimization algorithm. The geometric model introduces nine unknown parameters describing the centers of the microlenses and their images, whereas the distance adjustment setup provides an initial guess for the distance L. Simulation results verify the effectiveness and accuracy of the proposed method. The experimental results demonstrate that the calibration can be performed with a commercial prime lens and that the proposed method can be used to quantitatively evaluate whether an MLA and a sensor are assembled properly in plenoptic systems.
Automated Camera Array Fine Calibration
NASA Technical Reports Server (NTRS)
Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang
2008-01-01
Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.
An Automatic Image-Based Modelling Method Applied to Forensic Infography
Zancajo-Blazquez, Sandra; Gonzalez-Aguilera, Diego; Gonzalez-Jorge, Higinio; Hernandez-Lopez, David
2015-01-01
This paper presents a new method based on 3D reconstruction from images that demonstrates the utility and integration of close-range photogrammetry and computer vision as an efficient alternative for modelling complex objects and scenarios in forensic infography. The results obtained confirm the validity of the method compared to other existing alternatives, as it guarantees the following: (i) flexibility, permitting work with any type of camera (calibrated and non-calibrated, smartphone or tablet) and image (visible, infrared, thermal, etc.); (ii) automation, allowing the reconstruction of three-dimensional scenarios without manual intervention; and (iii) high-quality results, sometimes providing higher resolution than modern laser scanning systems. As a result, each ocular inspection of a crime scene performed by the scientific police with any camera can be transformed into a scaled 3D model. PMID:25793628
In-flight photogrammetric camera calibration and validation via complementary lidar
NASA Astrophysics Data System (ADS)
Gneeniss, A. S.; Mills, J. P.; Miller, P. E.
2015-02-01
This research assumes lidar as a reference dataset against which in-flight camera system calibration and validation can be performed. The methodology utilises a robust least squares surface matching algorithm to align a dense network of photogrammetric points to the lidar reference surface, allowing the automatic extraction of so-called lidar control points (LCPs). Adjustment of the photogrammetric data is then repeated using the extracted LCPs in a self-calibrating bundle adjustment with additional parameters. This methodology was tested on two different photogrammetric datasets, from a Microsoft UltraCamX large-format camera and an Applanix DSS322 medium-format camera. Systematic sensitivity testing explored the influence of the number and weighting of LCPs. For both camera blocks it was found that the accuracy improves as the number of control points increases, regardless of point weighting. The calibration results were compared with those obtained using ground control points, with good agreement found between the two.
Digital dental photography. Part 6: camera settings.
Ahmad, I
2009-07-25
Once the appropriate camera and equipment have been purchased, the next considerations involve setting up and calibrating the equipment. This article provides details regarding depth of field, exposure, colour spaces and white balance calibration, concluding with a synopsis of camera settings for a standard dental set-up.
Li, Tian-Jiao; Li, Sai; Yuan, Yuan; Liu, Yu-Dong; Xu, Chuan-Long; Shuai, Yong; Tan, He-Ping
2017-04-03
Plenoptic cameras are used for capturing flames in studies of high-temperature phenomena. Simulations of plenoptic camera models can be used prior to the experiment to improve experimental efficiency and reduce cost. In this work, microlens arrays, based on the established light field camera model, are optimized into a hexagonal structure with three types of microlenses. With this improved plenoptic camera model, light field imaging of static objects and of a flame is simulated using the calibrated parameters of a Raytrix camera (R29). The optimized model improves image resolution, imaging screen utilization, and the depth-of-field shooting range.
A LiDAR data-based camera self-calibration method
NASA Astrophysics Data System (ADS)
Xu, Lijun; Feng, Jing; Li, Xiaolu; Chen, Jianjun
2018-07-01
To find the intrinsic parameters of a camera, a LiDAR data-based camera self-calibration method is presented here. The parameters are estimated using particle swarm optimization (PSO), which searches for the optimal solution of a multivariate cost function. The estimation of the camera intrinsic parameters has three main parts: extraction and fine matching of interest points in the images; establishment of a cost function based on the Kruppa equations; and PSO optimization using LiDAR data as the initialization input. To improve the precision of the matching pairs, a new method combining the maximal information coefficient (MIC) and the maximum asymmetry score (MAS) with the RANSAC algorithm was used to remove false matching pairs. The highly precise matching pairs were used to calculate the fundamental matrix, making the new cost function (deduced from the Kruppa equations in terms of the fundamental matrix) more accurate. The cost function involving the four intrinsic parameters was minimized by PSO to obtain the optimal solution. To prevent the optimization from being pushed to a local optimum, LiDAR data were used to determine the scope of the initialization, based on the solution to the P4P problem for the camera focal length. To verify the accuracy and robustness of the proposed method, simulations and experiments were carried out and compared with two typical methods. Simulation results indicate that the intrinsic parameters estimated by the proposed method have absolute errors of less than 1.0 pixel and relative errors smaller than 0.01%. Based on ground truth obtained from a meter ruler, the distance inversion accuracy in the experiments was better than 1.0 cm. The experimental and simulated results demonstrate that the proposed method is highly accurate and robust.
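The PSO step can be sketched generically: each particle tracks its personal best, the swarm shares a global best, and velocities are pulled towards both. The quadratic cost below is a stand-in for the Kruppa-equation cost (which the paper builds from the fundamental matrix), and the bounds play the role of the LiDAR-derived initialization scope; all parameter values are illustrative.

```python
import random

def pso(cost, dim, lo, hi, n=30, iters=200, seed=1):
    """Minimal particle swarm optimiser over the box [lo, hi]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                  # personal bests
    pcost = [cost(p) for p in pos]
    g = min(range(n), key=lambda i: pcost[i])    # global best
    gbest, gcost = pbest[g][:], pcost[g]
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, pulls
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost

# Stand-in for the Kruppa cost: a bowl with its minimum at (3, -2).
best, best_cost = pso(lambda p: (p[0] - 3.0) ** 2 + (p[1] + 2.0) ** 2,
                      dim=2, lo=-10.0, hi=10.0)
print(best, best_cost)
```

Narrowing [lo, hi] with prior knowledge, as the paper does with LiDAR-derived focal-length bounds, is what keeps the swarm out of distant local optima.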
Estimation of Antenna Pose in the Earth Frame Using Camera and IMU Data from Mobile Phones
Wang, Zhen; Jin, Bingwen; Geng, Weidong
2017-01-01
The poses of base station antennas play an important role in cellular network optimization. Existing methods of pose estimation are based on physical measurements performed either by tower climbers or using additional sensors attached to antennas. In this paper, we present a novel non-contact method of antenna pose measurement based on multi-view images of the antenna and inertial measurement unit (IMU) data captured by a mobile phone. Given a known 3D model of the antenna, we first estimate the antenna pose relative to the phone camera from the multi-view images and then employ the corresponding IMU data to transform the pose from the camera coordinate frame into the Earth coordinate frame. To enhance the resulting accuracy, we improve existing camera-IMU calibration models by introducing additional degrees of freedom between the IMU sensors and defining a new error metric based on both the downtilt and azimuth angles, instead of a unified rotational error metric, to refine the calibration. In comparison with existing camera-IMU calibration methods, our method achieves an improvement in azimuth accuracy of approximately 1.0 degree on average while maintaining the same level of downtilt accuracy. For the pose estimation in the camera coordinate frame, we propose an automatic method of initializing the optimization solver and generating bounding constraints on the resulting pose to achieve better accuracy. With this initialization, state-of-the-art visual pose estimation methods yield satisfactory results in more than 75% of cases when plugged into our pipeline, and our solution, which takes advantage of the constraints, achieves even lower estimation errors on the downtilt and azimuth angles, both on average (0.13 and 0.3 degrees lower, respectively) and in the worst case (0.15 and 7.3 degrees lower, respectively), according to an evaluation conducted on a dataset consisting of 65 groups of data. 
We show that both of our enhancements contribute to the performance improvement offered by the proposed estimation pipeline, which achieves downtilt and azimuth accuracies of 0.47 and 5.6 degrees, respectively, on average, and 1.38 and 12.0 degrees in the worst case, thereby satisfying the accuracy requirements for network optimization in the telecommunication industry. PMID:28397765
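The downtilt and azimuth angles used as the error metric above can be read off a boresight direction once the pose has been transformed into an Earth-fixed frame. A minimal sketch, assuming an East-North-Up convention with azimuth measured clockwise from North (the concrete frame conventions here are an assumption, not taken from the paper):

```python
import math

def downtilt_azimuth(direction_enu):
    """Downtilt/azimuth of a boresight vector in an East-North-Up frame.
    Downtilt: angle below the horizontal; azimuth: clockwise from North."""
    e, n, u = direction_enu
    norm = math.sqrt(e * e + n * n + u * u)
    downtilt = math.degrees(math.asin(-u / norm))
    azimuth = math.degrees(math.atan2(e, n)) % 360.0
    return downtilt, azimuth

# Boresight pointing North-East and slightly downward (invented vector).
tilt, az = downtilt_azimuth((1.0, 1.0, -0.2))
```

Splitting the error metric into these two angles, rather than a single rotational error, is what lets the calibration trade accuracy between the quantities that actually matter for antenna auditing.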
NASA Technical Reports Server (NTRS)
Axholt, Magnus; Skoglund, Martin; Peterson, Stephen D.; Cooper, Matthew D.; Schoen, Thomas B.; Gustafsson, Fredrik; Ynnerman, Anders; Ellis, Stephen R.
2010-01-01
Augmented Reality (AR) is a technique by which computer-generated signals synthesize impressions that are made to coexist with the surrounding real world as perceived by the user. Human smell, taste, touch and hearing can all be augmented, but most commonly AR refers to human vision being overlaid with information otherwise not readily available to the user. Correct calibration is important on an application level, ensuring that e.g. data labels are presented at correct locations, but also on a system level to enable display techniques such as stereoscopy to function properly [SOURCE]. Calibration methodology is thus an important research area, vital to AR. While great achievements have already been made, some properties of current calibration methods do not translate from their traditional use in automated camera calibration to their use with a human operator. This paper uses a Monte Carlo simulation of a standard direct linear transformation camera calibration to investigate how user-introduced head orientation noise affects the parameter estimation during a calibration procedure of an optical see-through head-mounted display.
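The simulation idea, propagating orientation noise through a projection and measuring the resulting pixel scatter, can be illustrated with a deliberately reduced model: a single pinhole camera, yaw-only head noise, and one image coordinate. The focal length, principal point, and noise levels below are arbitrary stand-ins, not parameters from the study.

```python
import math
import random

def project(point, yaw, f=800.0, cx=320.0):
    """Pinhole projection of a 3D point (x, y, z) after a yaw rotation (radians).
    Only the horizontal pixel coordinate is returned, for brevity."""
    x, y, z = point
    # Rotate about the vertical axis (a crude model of head yaw).
    xr = math.cos(yaw) * x + math.sin(yaw) * z
    zr = -math.sin(yaw) * x + math.cos(yaw) * z
    return f * xr / zr + cx

def monte_carlo_scatter(point, sigma_deg, n=2000, seed=1):
    """Std. dev. of the projected pixel position under head-orientation noise."""
    rng = random.Random(seed)
    samples = [project(point, math.radians(rng.gauss(0.0, sigma_deg)))
               for _ in range(n)]
    mean = sum(samples) / n
    return (sum((s - mean) ** 2 for s in samples) / n) ** 0.5

scatter_small = monte_carlo_scatter((0.1, 0.0, 2.0), sigma_deg=0.1)
scatter_large = monte_carlo_scatter((0.1, 0.0, 2.0), sigma_deg=1.0)
```

Repeating this over many noise draws, as the paper does for the full DLT, turns head-orientation jitter into a distribution over estimated calibration parameters rather than a single point estimate.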
Sensors for 3D Imaging: Metric Evaluation and Calibration of a CCD/CMOS Time-of-Flight Camera.
Chiabrando, Filiberto; Chiabrando, Roberto; Piatti, Dario; Rinaudo, Fulvio
2009-01-01
3D imaging with Time-of-Flight (ToF) cameras is a promising recent technique which allows 3D point clouds to be acquired at video frame rates. However, the distance measurements of these devices are often affected by systematic errors which decrease the quality of the acquired data. In order to evaluate these errors, experimental tests on a CCD/CMOS ToF camera sensor, the SwissRanger (SR)-4000 camera, were performed and are reported in this paper. Two main aspects are treated. The first is the calibration of the distance measurements of the SR-4000 camera, which covers evaluation of the camera warm-up time, evaluation of the distance measurement error, and a study of the influence of camera orientation with respect to the observed object on the distance measurements. The second is the photogrammetric calibration of the amplitude images delivered by the camera, using a purpose-built multi-resolution field of high-contrast targets.
Photogrammetric camera calibration
Tayman, W.P.; Ziemann, H.
1984-01-01
Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for further discussion. © 1984.
Fast and robust curve skeletonization for real-world elongated objects
USDA-ARS?s Scientific Manuscript database
These datasets were generated for calibrating robot-camera systems. In an extension, we also considered the problem of calibrating robots with more than one camera. These datasets are provided as a companion to the paper, "Solving the Robot-World Hand-Eye(s) Calibration Problem with Iterative Meth...
Hakala, Teemu; Markelin, Lauri; Honkavaara, Eija; Scott, Barry; Theocharous, Theo; Nevalainen, Olli; Näsi, Roope; Suomalainen, Juha; Viljanen, Niko; Greenwell, Claire; Fox, Nigel
2018-05-03
Drone-based remote sensing has evolved rapidly in recent years. Miniaturized hyperspectral imaging sensors are becoming more common, as they provide richer information about the object than traditional cameras. Reflectance is a physically defined object property and is therefore often the preferred output of remote sensing data capture for use in further processing. Absolute calibration of the sensor makes physical modelling of the imaging process possible and enables efficient procedures for reflectance correction. Our objective is to develop a method for direct reflectance measurements for drone-based remote sensing, based on an imaging spectrometer and an irradiance spectrometer. This approach is highly attractive for many practical applications because it does not require in situ reflectance panels for converting the sensor radiance to ground reflectance factors. We performed SI-traceable spectral and radiance calibration of a tuneable Fabry-Pérot interferometer (FPI) based hyperspectral camera at the National Physical Laboratory NPL (Teddington, UK). The camera represents novel technology, collecting 2D-format hyperspectral image cubes using a time-sequential spectral scanning principle. The radiance accuracy of different channels varied within ±4% when evaluated using independent test data, and the linearity of the camera response was on average 0.9994. The spectral response calibration showed side peaks on several channels that were due to the multiple orders of interference of the FPI. The drone-based direct reflectance measurement system showed promising results with imagery collected over Wytham Forest (Oxford, UK).
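The panel-free conversion at the heart of such a system is the reflectance-factor relation R = πL/E for a Lambertian target, combining the camera radiance L with the onboard irradiance spectrometer reading E. A minimal sketch with notional band values (not measurements from the campaign):

```python
import math

def reflectance_factor(radiance, irradiance):
    """Direct reflectance for a Lambertian surface: pi * L / E, combining the
    camera radiance with the onboard irradiance spectrometer reading."""
    return math.pi * radiance / irradiance

# Hypothetical band values: L = 30 W/(m^2 sr um), E = 300 W/(m^2 um).
r = reflectance_factor(30.0, 300.0)
```

Because both L and E are absolutely calibrated, the ratio is traceable without ground reference panels, which is exactly what makes the approach attractive for operational drone flights.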
Hakala, Teemu; Scott, Barry; Theocharous, Theo; Näsi, Roope; Suomalainen, Juha; Greenwell, Claire; Fox, Nigel
2018-01-01
Drone-based remote sensing has evolved rapidly in recent years. Miniaturized hyperspectral imaging sensors are becoming more common, as they provide richer information about the object than traditional cameras. Reflectance is a physically defined object property and is therefore often the preferred output of remote sensing data capture for use in further processing. Absolute calibration of the sensor makes physical modelling of the imaging process possible and enables efficient procedures for reflectance correction. Our objective is to develop a method for direct reflectance measurements for drone-based remote sensing, based on an imaging spectrometer and an irradiance spectrometer. This approach is highly attractive for many practical applications because it does not require in situ reflectance panels for converting the sensor radiance to ground reflectance factors. We performed SI-traceable spectral and radiance calibration of a tuneable Fabry-Pérot interferometer (FPI) based hyperspectral camera at the National Physical Laboratory NPL (Teddington, UK). The camera represents novel technology, collecting 2D-format hyperspectral image cubes using a time-sequential spectral scanning principle. The radiance accuracy of different channels varied within ±4% when evaluated using independent test data, and the linearity of the camera response was on average 0.9994. The spectral response calibration showed side peaks on several channels that were due to the multiple orders of interference of the FPI. The drone-based direct reflectance measurement system showed promising results with imagery collected over Wytham Forest (Oxford, UK). PMID:29751560
Calibration of EFOSC2 Broadband Linear Imaging Polarimetry
NASA Astrophysics Data System (ADS)
Wiersema, K.; Higgins, A. B.; Covino, S.; Starling, R. L. C.
2018-03-01
The European Southern Observatory Faint Object Spectrograph and Camera v2 (EFOSC2) is one of the workhorse instruments on ESO's New Technology Telescope and one of the most popular instruments at La Silla observatory. It is mounted at a Nasmyth focus, and therefore exhibits strong, wavelength- and pointing-direction-dependent instrumental polarisation. In this document, we describe our efforts to calibrate the broadband imaging polarimetry mode, and provide a calibration for the broadband B, V, and R filters to a level that satisfies most use cases (i.e. polarimetric calibration uncertainty 0.1%). We make our calibration codes public. This calibration effort can be used to enhance the yield of future polarimetric programmes with EFOSC2 by allowing good calibration with a greatly reduced number of standard star observations. Similarly, our calibration model can be combined with archival calibration observations to post-process data taken in past years, forming an EFOSC2 legacy archive with substantial scientific potential.
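A common way to model Nasmyth instrumental polarisation is as a vector in (q, u) space that rotates with a pointing-dependent angle and is subtracted from the observed normalized Stokes parameters. The sketch below uses that generic form with invented numbers; it is a simplified stand-in, not the calibration model fitted in this document.

```python
import math

def remove_instrumental_polarisation(q_obs, u_obs, p_inst, pointing_deg):
    """Subtract an instrumental polarisation vector that rotates with a
    pointing-dependent angle (a simplified generic Nasmyth model)."""
    theta = math.radians(pointing_deg)
    q_inst = p_inst * math.cos(2.0 * theta)
    u_inst = p_inst * math.sin(2.0 * theta)
    q, u = q_obs - q_inst, u_obs - u_inst
    p = math.hypot(q, u)                                   # polarisation degree
    angle = 0.5 * math.degrees(math.atan2(u, q)) % 180.0   # polarisation angle
    return p, angle

# An unpolarised star observed with 2% instrumental polarisation at 30 deg:
# after subtraction, the recovered degree of polarisation should be ~zero.
p, angle = remove_instrumental_polarisation(
    0.02 * math.cos(math.radians(60.0)), 0.02 * math.sin(math.radians(60.0)),
    p_inst=0.02, pointing_deg=30.0)
```

With such a model fitted once, each science observation needs far fewer standard-star measurements, which is the yield gain the document describes.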
Development and calibration of an accurate 6-degree-of-freedom measurement system with total station
NASA Astrophysics Data System (ADS)
Gao, Yang; Lin, Jiarui; Yang, Linghui; Zhu, Jigui
2016-12-01
To meet the demand of high-accuracy, long-range and portable use in large-scale metrology for pose measurement, this paper develops a 6-degree-of-freedom (6-DOF) measurement system based on total station by utilizing its advantages of long range and relative high accuracy. The cooperative target sensor, which is mainly composed of a pinhole prism, an industrial lens, a camera and a biaxial inclinometer, is designed to be portable in use. Subsequently, a precise mathematical model is proposed from the input variables observed by total station, imaging system and inclinometer to the output six pose variables. The model must be calibrated in two levels: the intrinsic parameters of imaging system, and the rotation matrix between coordinate systems of the camera and the inclinometer. Then corresponding approaches are presented. For the first level, we introduce a precise two-axis rotary table as a calibration reference. And for the second level, we propose a calibration method by varying the pose of a rigid body with the target sensor and a reference prism on it. Finally, through simulations and various experiments, the feasibilities of the measurement model and calibration methods are validated, and the measurement accuracy of the system is evaluated.
NASA Astrophysics Data System (ADS)
Hashimoto, Atsushi; Suehara, Ken-Ichiro; Kameoka, Takaharu
To measure quantitative surface color information of agricultural products together with ambient information during cultivation, a color calibration method for digital camera images and a Web-based remote monitoring system for color imaging were developed. Single-lens reflex and web digital cameras were used for image acquisition. Tomato images through the post-ripening process were taken by the digital camera both in the standard image acquisition system and under field conditions from morning to evening. Several kinds of images were acquired with a standard RGB color chart set up just behind the tomato fruit on a black matte, and a color calibration was carried out. The influence of sunlight could be experimentally eliminated, and the calibrated color information consistently agreed with the standard values acquired in the system through the post-ripening process. Furthermore, the surface color change of tomatoes on the tree in a greenhouse was remotely monitored during maturation using digital cameras equipped with the Field Server. The acquired digital color images were sent from the Farm Station to the BIFE Laboratory of Mie University via VPN. The time behavior of the tomato surface color change during the maturing process could be measured using a color parameter calculated from the acquired and calibrated color images, along with the ambient atmospheric record. This study is an important step in developing surface color analysis, both for simple and rapid evaluation of crop vigor in the field and for constructing an ambient, networked remote monitoring system for food security, precision agriculture, and agricultural research.
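Chart-based colour calibration of this kind reduces, in its simplest form, to fitting a gain and offset per channel that map the measured chart patches onto their reference values. A minimal least-squares sketch with hypothetical grey-patch readings (the numbers are illustrative only, not from the study):

```python
def fit_channel(measured, reference):
    """Closed-form least-squares gain/offset for one colour channel,
    fitted from colour-chart patch readings."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(measured, reference))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Hypothetical grey-patch readings under field illumination vs. chart values.
measured_r = [40.0, 90.0, 140.0, 190.0]
reference_r = [50.0, 100.0, 150.0, 200.0]
gain, offset = fit_channel(measured_r, reference_r)

# Apply the fit to a tomato-surface pixel value from the same image.
calibrated = gain * 115.0 + offset
```

Refitting gain and offset per image, using the chart placed beside the fruit, is what cancels the changing sunlight from morning to evening.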
IMU-based online kinematic calibration of robot manipulator.
Du, Guanglong; Zhang, Ping
2013-01-01
Robot calibration is a useful diagnostic method for improving positioning accuracy in robot production and maintenance. An online robot self-calibration method based on an inertial measurement unit (IMU) is presented in this paper. The method requires that the IMU be rigidly attached to the robot manipulator, which makes it possible to obtain the orientation of the manipulator from the orientation of the IMU in real time. This paper proposes an efficient approach which incorporates the Factored Quaternion Algorithm (FQA) and a Kalman Filter (KF) to estimate the orientation of the IMU. Then, an Extended Kalman Filter (EKF) is used to estimate kinematic parameter errors. Using this orientation estimation method results in improved reliability and accuracy in determining the orientation of the manipulator. Compared with existing vision-based self-calibration methods, the great advantage of this method is that it does not need complex steps such as camera calibration, image capture, and corner detection, which makes the robot calibration procedure more autonomous in a dynamic manufacturing environment. Experimental studies on a GOOGOL GRB3016 robot show that this method has better accuracy, convenience, and effectiveness than vision-based methods.
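The KF/EKF machinery can be illustrated with the scalar case: one constant parameter error observed through noisy orientation residuals. This is a deliberately reduced sketch with invented numbers, not the paper's EKF over the full kinematic parameter vector.

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman update: state estimate x with variance p,
    measurement z with noise variance r."""
    k = p / (p + r)           # Kalman gain
    x_new = x + k * (z - x)   # correct the estimate toward the measurement
    p_new = (1.0 - k) * p     # shrink the uncertainty
    return x_new, p_new

# Hypothetical noisy observations of a constant 0.5-deg parameter error.
x, p = 0.0, 1.0
for z in [0.62, 0.41, 0.55, 0.47, 0.52, 0.49, 0.51, 0.50]:
    x, p = kalman_update(x, p, z, r=0.01)
```

With a static state and no process noise, the filter converges toward the sample mean of the residuals while its variance shrinks, which is the behaviour the EKF exploits to pin down each kinematic error term online.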
NASA Astrophysics Data System (ADS)
Vincent, Mark B.; Chanover, Nancy J.; Beebe, Reta F.; Huber, Lyle
2005-10-01
The NASA Infrared Telescope Facility (IRTF) on Mauna Kea, Hawaii, set aside some time on about 500 nights from 1995 to 2002, when the NSFCAM facility infrared camera was mounted and Jupiter was visible, for a standardized set of observations of Jupiter in support of the Galileo mission. The program included observations of Jupiter, nearby reference stars, and dome flats in five filters: narrowband filters centered at 1.58, 2.28, and 3.53 μm, and broader L' and M' bands that probe the atmosphere from the stratosphere to below the main cloud layer. The reference stars were not cross-calibrated against standards. We performed follow-up observations to calibrate these stars and Jupiter in 2003 and 2004. We present a summary of the calibration of the Galileo support monitoring program data set. We present calibrated magnitudes of the six most frequently observed stars, calibrated reflectivities, and brightness temperatures of Jupiter from 1995 to 2004, and a simple method of normalizing the Jovian brightness to the 2004 results. Our study indicates that the NSFCAM's zero-point magnitudes were not stable from 1995 to early 1997, and that the best Jovian calibration possible with this data set is limited to about +/-10%. The raw images and calibration data have been deposited in the Planetary Data System.
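Calibrating such imagery against reference stars rests on the zero-point relation m = zp - 2.5 log10(counts / exptime). A minimal sketch with notional counts (none of these numbers come from the NSFCAM data set):

```python
import math

def instrumental_magnitude(counts, exptime):
    """Instrumental magnitude from background-subtracted counts."""
    return -2.5 * math.log10(counts / exptime)

def zero_point(catalog_mag, counts, exptime):
    """Photometric zero point from one reference-star observation."""
    return catalog_mag - instrumental_magnitude(counts, exptime)

# A reference star of known magnitude 7.0 yielding 1e5 counts in 10 s
# sets the zero point; a target with 2.5e4 counts is then calibrated.
zp = zero_point(7.0, 1.0e5, 10.0)
target_mag = zp + instrumental_magnitude(2.5e4, 10.0)
```

An unstable zero point, as found for NSFCAM between 1995 and early 1997, directly biases every target magnitude derived this way, which is why cross-calibrating the reference stars after the fact was necessary.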
Calibration procedures of the Tore-Supra infrared endoscopes
NASA Astrophysics Data System (ADS)
Desgranges, C.; Jouve, M.; Balorin, C.; Reichle, R.; Firdaouss, M.; Lipa, M.; Chantant, M.; Gardarein, J. L.; Saille, A.; Loarer, T.
2018-01-01
Five endoscopes equipped with infrared cameras working in the medium infrared range (3-5 μm) are installed on the controlled thermonuclear fusion research device Tore-Supra. These endoscopes monitor the surface temperature of the plasma facing components to prevent their overheating. Signals delivered by the infrared cameras through the endoscopes are analysed and used, on the one hand, in a real-time feedback control loop acting on the plasma heating systems to decrease plasma facing component surface temperatures when necessary, and on the other hand for physics studies such as determination of the incoming heat flux. To fulfil these two roles, a very accurate knowledge of the absolute surface temperatures is mandatory. Consequently, the infrared endoscopes must be calibrated through a very careful procedure. This means determining their transmission coefficients, which is a delicate operation. Methods to calibrate the infrared endoscopes during the shutdown period of the Tore-Supra machine are presented. As these do not allow possible changes in transmittance during operation to be determined, an in-situ method is also presented. It permits validation of the calibration performed in the laboratory as well as monitoring of its evolution during machine operation. This is made possible by the use of the endoscope shutter and a dedicated plasma scenario developed to heat it. Possible improvements of this method are briefly discussed.
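A simple first-order model of an endoscope's effect is that the measured radiance is the target radiance attenuated by a transmittance tau, plus a stray contribution from the optics themselves. Under that assumption (a sketch with invented values, not the Tore-Supra calibration procedure itself), recovering the surface radiance is a one-line inversion:

```python
def surface_radiance(measured, tau, background):
    """Recover the target radiance seen through optics of transmittance tau,
    assuming the optics contribute (1 - tau) times a stray background radiance."""
    return (measured - (1.0 - tau) * background) / tau

# Hypothetical values: measured 80 W/(m^2 sr), tau 0.7, stray background 20.
l_target = surface_radiance(80.0, 0.7, 20.0)
```

The division by tau is why an unnoticed drift in transmittance during operation translates directly into a surface-temperature error, motivating the in-situ shutter-based check.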
Radiometric calibration of SPOT 2 HRV - A comparison of three methods
NASA Technical Reports Server (NTRS)
Biggar, Stuart F.; Dinguirard, Magdeleine C.; Gellman, David I.; Henry, Patrice; Jackson, Ray D.; Moran, M. S.; Slater, Philip N.
1991-01-01
Three methods for determining an absolute radiometric calibration of a spacecraft optical sensor are compared: the well-known reflectance-based and radiance-based methods, and a new method based on measurements of the ratio of diffuse-to-global irradiance at the ground. The latter is described in detail, and the comparison of the three approaches is made with reference to the SPOT-2 HRV cameras for a field campaign from 1990-06-19 through 1990-06-24 at the White Sands Missile Range in New Mexico.
Automatic Calibration Method for Driver’s Head Orientation in Natural Driving Environment
Fu, Xianping; Guan, Xiao; Peli, Eli; Liu, Hongbo; Luo, Gang
2013-01-01
Gaze tracking is crucial for studying driver's attention, detecting fatigue, and improving driver assistance systems, but it is difficult in natural driving environments due to nonuniform and highly variable illumination and large head movements. Traditional calibrations that require subjects to follow calibrators are very cumbersome to implement in daily driving situations. A new automatic calibration method is presented in this paper, based on a single camera for determining head orientation and utilizing the side mirrors, the rear-view mirror, the instrument board, and different zones in the windshield as calibration points. Supported by a self-learning algorithm, the system tracks the head and categorizes the head pose into 12 gaze zones based on facial features. A particle filter is used to estimate the head pose and obtain an accurate gaze zone by updating the calibration parameters. Experimental results show that, after several hours of driving, the automatic calibration method can achieve the same accuracy as a manual calibration method without the driver's cooperation. The mean error of estimated eye gazes was less than 5° in day and night driving. PMID:24639620
Calibration of HST wide field camera for quantitative analysis of faint galaxy images
NASA Technical Reports Server (NTRS)
Ratnatunga, Kavan U.; Griffiths, Richard E.; Casertano, Stefano; Neuschaefer, Lyman W.; Wyckoff, Eric W.
1994-01-01
We present the methods adopted to optimize the calibration of images obtained with the Hubble Space Telescope (HST) Wide Field Camera (WFC) (1991-1993). Our main goal is to improve quantitative measurement of faint images, with special emphasis on the faint (I approximately 20-24 mag) stars and galaxies observed as a part of the Medium-Deep Survey. Several modifications to the standard calibration procedures have been introduced, including improved bias and dark images, and a new supersky flatfield obtained by combining a large number of relatively object-free Medium-Deep Survey exposures of random fields. The supersky flat has a pixel-to-pixel rms error of about 2.0% in F555W and of 2.4% in F785LP; large-scale variations are smaller than 1% rms. Overall, our modifications improve the quality of faint images with respect to the standard calibration by about a factor of five in photometric accuracy and about 0.3 mag in sensitivity, corresponding to about a factor of two in observing time. The relevant calibration images have been made available to the scientific community.
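The modified pipeline still follows the standard CCD reduction chain: subtract the bias, subtract the exposure-scaled dark, and divide by the (supersky) flat. A toy 2x2-pixel sketch with invented values:

```python
def calibrate_frame(raw, bias, dark, flat, exptime):
    """Standard CCD reduction: subtract bias and exposure-scaled dark,
    then divide by the normalized flat field."""
    return [[(r - b - exptime * d) / f
             for r, b, d, f in zip(rr, bb, dd, ff)]
            for rr, bb, dd, ff in zip(raw, bias, dark, flat)]

# Invented 2x2 frames; real WFC frames are 800x800 per CCD.
raw = [[110.0, 210.0], [160.0, 60.0]]
bias = [[10.0, 10.0], [10.0, 10.0]]
dark = [[0.5, 0.5], [0.5, 0.5]]       # counts per second
flat = [[1.0, 2.0], [1.0, 0.5]]       # normalized supersky flat
cal = calibrate_frame(raw, bias, dark, flat, exptime=100.0)
```

The paper's gain comes from better inputs to this chain, in particular a supersky flat built by combining many object-free survey exposures, so the division step no longer imprints flat-field error on faint sources.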
Single-pixel camera with one graphene photodetector.
Li, Gongxin; Wang, Wenxue; Wang, Yuechao; Yang, Wenguang; Liu, Lianqing
2016-01-11
Consumer cameras in the megapixel range are ubiquitous, but their improvement is hindered by the poor performance and high cost of traditional photodetectors. Graphene, a two-dimensional micro-/nano-material, has recently exhibited exceptional properties as a sensing element in a photodetector compared to traditional materials. However, it is difficult to fabricate a large-scale array of graphene photodetectors to replace a traditional photodetector array. To take full advantage of the unique characteristics of the graphene photodetector, in this study we integrated a graphene photodetector into a single-pixel camera based on compressive sensing. To begin with, we introduce a method called laser scribing for fabricating the graphene. It produces graphene components in arbitrary patterns more quickly than traditional methods and without photoresist contamination. Next, we propose a system for calibrating the optoelectrical properties of micro-/nano-photodetectors based on a digital micromirror device (DMD), which changes the light intensity by controlling the number of individual micromirrors positioned at +12°. The calibration sensitivity is driven by the sum of all micromirrors of the DMD and can be as high as 10^(-5) A/W. Finally, the single-pixel camera integrated with one graphene photodetector was used to recover a static image, demonstrating the feasibility of a single-pixel imaging system with a graphene photodetector. A high-resolution image can be recovered with the camera at a sampling rate much lower than the Nyquist rate. This study is, to our knowledge, the first recorded demonstration of a macroscopic camera with a graphene photodetector. The camera has the potential for high-speed and high-resolution imaging at much lower cost than traditional megapixel cameras.
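The acquisition side of a compressive-sensing single-pixel camera is easy to sketch: each DMD pattern selects a subset of mirrors, and the lone photodetector records one summed intensity. The reconstruction step (solving for the scene from far fewer readings than pixels) is omitted here, and the patterns and scene values are invented.

```python
import random

def single_pixel_measurements(scene, n_patterns, seed=0):
    """Simulate single-pixel acquisition: each random binary DMD pattern
    selects mirrors, and the photodetector records one summed intensity."""
    rng = random.Random(seed)
    patterns, readings = [], []
    for _ in range(n_patterns):
        pattern = [rng.randint(0, 1) for _ in scene]
        patterns.append(pattern)
        readings.append(sum(p * s for p, s in zip(pattern, scene)))
    return patterns, readings

scene = [0.0, 1.0, 0.5, 0.25]          # a tiny 4-pixel "image"
patterns, readings = single_pixel_measurements(scene, n_patterns=3)
```

Because the scene is sparse in some basis, compressive-sensing solvers can recover it from such readings at well below the Nyquist sampling rate, which is what lets one graphene detector stand in for a whole pixel array.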
Algebraic Approach for Recovering Topology in Distributed Camera Networks
2009-01-14
not valid for camera networks. Spatial sampling of the plenoptic function [2] from a network of cameras is rarely i.i.d. (independent and identically...coverage can be used to track and compare paths in a wireless camera network without any metric calibration information. In particular, these results can...edition edition, 2000. [14] A. Rahimi, B. Dunagan, and T. Darrell. Simultaneous calibration and tracking with a network of non-overlapping sensors. In
Improving Photometric Calibration of Meteor Video Camera Systems
NASA Technical Reports Server (NTRS)
Ehlert, Steven; Kingery, Aaron; Cooke, William
2016-01-01
Current optical observations of meteors are commonly limited by systematic uncertainties in photometric calibration at the level of approximately 0.5 mag or higher. Future improvements to meteor ablation models, luminous efficiency models, or emission spectra will hinge on new camera systems and techniques that significantly reduce calibration uncertainties and can reliably perform absolute photometric measurements of meteors. In this talk we discuss the algorithms and tests that NASA's Meteoroid Environment Office (MEO) has developed to better calibrate photometric measurements for the existing All-Sky and Wide-Field video camera networks as well as for a newly deployed four-camera system for measuring meteor colors in Johnson-Cousins BV RI filters. In particular we will emphasize how the MEO has been able to address two long-standing concerns with the traditional procedure, discussed in more detail below.
Cloud Computing with Context Cameras
NASA Astrophysics Data System (ADS)
Pickles, A. J.; Rosing, W. E.
2016-05-01
We summarize methods and plans to monitor and calibrate photometric observations with our autonomous, robotic network of 2m, 1m and 40cm telescopes. These are sited globally to optimize our ability to observe time-variable sources. Wide field "context" cameras are aligned with our network telescopes and cycle every ˜2 minutes through BVr'i'z' filters, spanning our optical range. We measure instantaneous zero-point offsets and transparency (throughput) against calibrators in the 5-12m range from the all-sky Tycho2 catalog, and periodically against primary standards. Similar measurements are made for all our science images, with typical fields of view of ˜0.5 degrees. These are matched against Landolt, Stetson and Sloan standards, and against calibrators in the 10-17m range from the all-sky APASS catalog. Such measurements provide pretty good instantaneous flux calibration, often to better than 5%, even in cloudy conditions. Zero-point and transparency measurements can be used to characterize, monitor and inter-compare sites and equipment. When accurate calibrations of Target against Standard fields are required, monitoring measurements can be used to select truly photometric periods when accurate calibrations can be automatically scheduled and performed.
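The monitoring quantities described, zero-point offsets against catalog calibrators and transparency (throughput) relative to a photometric night, can be computed as below. The magnitudes and the photometric-night zero point are invented values for illustration, not from the network's catalogs.

```python
def zero_point_offset(catalog_mags, instrumental_mags):
    """Median offset between catalog and instrumental magnitudes of the
    calibrators found in a context-camera or science image."""
    diffs = sorted(c - i for c, i in zip(catalog_mags, instrumental_mags))
    n = len(diffs)
    mid = n // 2
    return diffs[mid] if n % 2 else 0.5 * (diffs[mid - 1] + diffs[mid])

def transparency(zp_now, zp_photometric):
    """Throughput relative to a photometric-night zero point (<= 1 under cloud)."""
    return 10.0 ** (-0.4 * (zp_photometric - zp_now))

# Three invented calibrators: catalog magnitudes vs. instrumental magnitudes.
zp = zero_point_offset([9.0, 10.0, 11.0], [-14.01, -13.0, -12.02])
t = transparency(zp, zp_photometric=23.1)
```

A median over many calibrators keeps a single bad star from corrupting the zero point, and a time series of the transparency value is exactly what is used to flag truly photometric periods for scheduling accurate calibrations.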
Robust gaze-steering of an active vision system against errors in the estimated parameters
NASA Astrophysics Data System (ADS)
Han, Youngmo
2015-01-01
Gaze-steering is often used to broaden the viewing range of an active vision system. Gaze-steering procedures are usually based on estimated parameters such as image position, image velocity, depth and camera calibration parameters. However, there may be uncertainties in these estimated parameters because of measurement noise and estimation errors. In this case, robust gaze-steering cannot be guaranteed. To compensate for such problems, this paper proposes a gaze-steering method based on a linear matrix inequality (LMI). In this method, we first propose a proportional derivative (PD) control scheme on the unit sphere that does not use depth parameters. This proposed PD control scheme can avoid uncertainties in the estimated depth and camera calibration parameters, as well as inconveniences in their estimation process, including the use of auxiliary feature points and highly non-linear computation. Furthermore, the control gain of the proposed PD control scheme on the unit sphere is designed using LMI such that the designed control is robust in the presence of uncertainties in the other estimated parameters, such as image position and velocity. Simulation results demonstrate that the proposed method provides a better compensation for uncertainties in the estimated parameters than the contemporary linear method and steers the gaze of the camera more steadily over time than the contemporary non-linear method.
NASA Astrophysics Data System (ADS)
de Villiers, Jason; Jermy, Robert; Nicolls, Fred
2014-06-01
This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb line method, allows for many radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low-order models, despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted-to-undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion), allowing deterministic rates far exceeding real time. The focal length is determined so as to minimise the error in absolute photogrammetric positional measurement for both multi-camera systems and monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/LWIR camera array, and a simple laboratory optical helmet tracker.
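Brown's distortion model, of which the characterization above is a variant, maps an undistorted normalized image point to its distorted position via radial and tangential polynomial terms. The coefficients below are arbitrary illustrative values, not ones recovered by the described system, and only three radial plus two tangential terms are shown:

```python
def brown_distort(x, y, k=(0.1, 0.01, 0.001), p=(0.0005, -0.0003)):
    """Brown's model: map an undistorted normalized point (x, y) to its
    distorted position using radial (k1..k3) and tangential (p1, p2) terms."""
    r2 = x * x + y * y
    radial = 1.0 + k[0] * r2 + k[1] * r2 ** 2 + k[2] * r2 ** 3
    xd = x * radial + 2.0 * p[0] * x * y + p[1] * (r2 + 2.0 * x * x)
    yd = y * radial + p[0] * (r2 + 2.0 * y * y) + 2.0 * p[1] * x * y
    return xd, yd

center = brown_distort(0.0, 0.0)   # the principal point is unmoved
corner = brown_distort(0.5, 0.5)   # off-axis points shift outward here
```

The forward map above is cheap and closed-form; the paper's point about also fitting the inverse (undistortion) coefficients is that both directions can then run deterministically at frame rate without iterative root-finding.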
Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio; Rispoli, Attilio
2010-01-01
This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements under static conditions taken by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms). PMID:22315559
A holistic calibration method with iterative distortion compensation for stereo deflectometry
NASA Astrophysics Data System (ADS)
Xu, Yongjia; Gao, Feng; Zhang, Zonghua; Jiang, Xiangqian
2018-07-01
This paper presents a novel holistic calibration method for a stereo deflectometry system to improve the system measurement accuracy. The reconstruction result of stereo deflectometry is integrated with the calculated normal data of the measured surface. The calculation accuracy of the normal data is strongly influenced by the calibration accuracy of the geometrical relationship of the stereo deflectometry system. Conventional calibration approaches introduce form error into the system due to an inaccurate imaging model and distortion elimination. The proposed calibration method compensates system distortion with an iterative algorithm instead of the conventional mathematical distortion model. The initial values of the system parameters are calculated from the fringe patterns displayed on the system's LCD screen through the reflection of a markless flat mirror. An iterative algorithm is proposed to compensate system distortion and to optimize the camera imaging parameters and system geometrical relation parameters based on a cost function. Both simulation work and experimental results show that the proposed calibration method can significantly improve the calibration and measurement accuracy of a stereo deflectometry system. The PV (peak-to-valley) measurement error for a flat mirror can be reduced from 282 nm with the conventional calibration approach to 69.7 nm by applying the proposed method.
Camera Geo calibration Using an MCMC Approach (Author’s Manuscript)
2016-08-19
calibration problem that supports priors over camera parameters and constraints that relate image annotations, camera geometry, and a geographic... q(Θ_{i+1} | Θ_i) = ∏_{j=1}^{N_dim} (1/σ_j) φ((Θ_{i+1,j} − Θ_{i,j}) / σ_j), where σ_j denotes the sampling step size in the j-th dimension and φ(x) denotes the PDF of the... image calibration," Imaging for Crime Detection and Prevention, 2013. [2] Haipeng Zhang, Mohammed Korayem, David J. Crandall, and Gretchen LeBuhn
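The proposal density quoted in the snippet, a product of per-dimension Gaussians centered on the current state, is the standard random-walk proposal of a Metropolis sampler. A minimal sketch with an illustrative toy target density (the paper's actual posterior over camera parameters is not reproduced here):

```python
import numpy as np

def metropolis_hastings(log_target, theta0, sigma, n_steps, rng):
    """Random-walk Metropolis sampler with per-dimension step sizes sigma,
    i.e. the proposal q(theta'|theta) = prod_j N(theta'_j; theta_j, sigma_j^2)."""
    theta = np.asarray(theta0, dtype=float)
    samples = np.empty((n_steps, theta.size))
    log_p = log_target(theta)
    for i in range(n_steps):
        proposal = theta + sigma * rng.standard_normal(theta.size)
        log_p_new = log_target(proposal)
        # The proposal is symmetric, so the acceptance ratio reduces
        # to the ratio of target densities.
        if np.log(rng.random()) < log_p_new - log_p:
            theta, log_p = proposal, log_p_new
        samples[i] = theta
    return samples

rng = np.random.default_rng(0)
# Toy target: standard 2-D Gaussian (stands in for a camera-parameter posterior).
samples = metropolis_hastings(lambda t: -0.5 * t @ t, np.zeros(2),
                              np.array([0.5, 0.5]), 20000, rng)
```

Discarding an initial burn-in segment, the retained samples approximate the target distribution; in the geo-calibration setting the target would combine the camera-parameter priors with the annotation constraints.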
Radiometric calibration of an ultra-compact microbolometer thermal imaging module
NASA Astrophysics Data System (ADS)
Riesland, David W.; Nugent, Paul W.; Laurie, Seth; Shaw, Joseph A.
2017-05-01
As microbolometer focal plane array formats steadily decrease, new challenges arise in correcting for thermal drift in the calibration coefficients. As the thermal mass of the cameras decreases, the focal plane becomes more sensitive to external thermal inputs. This paper shows results from a temperature compensation algorithm for characterizing and radiometrically calibrating a FLIR Lepton camera.
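One common form of such a temperature compensation models the camera gain and offset as linear functions of focal-plane temperature. The sketch below assumes that form with made-up coefficients; the paper's actual model and values are not given in the abstract:

```python
import numpy as np

# Hypothetical compensation model: gain and offset of the microbolometer
# response are assumed to drift linearly with focal-plane temperature.
# In practice the coefficients come from calibration against blackbody targets.
G0, G1 = 30.0, -0.05      # counts per unit radiance, and drift per kelvin (assumed)
O0, O1 = 5000.0, 12.0     # offset counts and drift per kelvin (assumed)

def radiance_from_counts(dn, t_fpa, t_ref=300.0):
    """Invert DN = gain(T)*L + offset(T) to recover scene radiance L."""
    dT = t_fpa - t_ref
    gain = G0 + G1 * dT
    offset = O0 + O1 * dT
    return (dn - offset) / gain

# Forward-simulate a reading at a warm focal plane, then invert it.
L_true = 42.0
t_fpa = 310.0
dn = (G0 + G1 * 10.0) * L_true + (O0 + O1 * 10.0)
print(radiance_from_counts(dn, t_fpa))  # recovers 42.0
```

The point of the compensation is that the same DN maps to different radiances depending on the focal-plane temperature, so the inversion must track T_fpa continuously.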
Matching Images to Models: Camera Calibration for 3-D Surface Reconstruction
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Smelyanskiy, Vadim N.; Cheeseman, Peter C.; Norvig, Peter (Technical Monitor)
2001-01-01
In a previous paper we described a system which recursively recovers a super-resolved three dimensional surface model from a set of images of the surface. In that paper we assumed that the camera calibration for each image was known. In this paper we solve two problems. Firstly, if an estimate of the surface is already known, the problem is to calibrate a new image relative to the existing surface model. Secondly, if no surface estimate is available, the relative camera calibration between the images in the set must be estimated. This will allow an initial surface model to be estimated. Results of both types of estimation are given.
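Calibrating a new image against an existing surface estimate is a 3D-to-2D resection problem. One textbook estimator for it is the Direct Linear Transform, sketched here on synthetic data; this is a generic illustration, not necessarily the authors' method:

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Estimate a 3x4 projection matrix from n >= 6 known 3D points X (n,3)
    and their observed 2D image positions x (n,2) via the DLT."""
    n = X.shape[0]
    A = np.zeros((2 * n, 12))
    for i in range(n):
        Xh = np.append(X[i], 1.0)          # homogeneous 3D point
        u, v = x[i]
        A[2 * i, 4:8] = -Xh                # row enforcing v = p2.X / p3.X
        A[2 * i, 8:12] = v * Xh
        A[2 * i + 1, 0:4] = Xh             # row enforcing u = p1.X / p3.X
        A[2 * i + 1, 8:12] = -u * Xh
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)            # null vector, solution up to scale

# Synthetic check: project known surface points with a known camera, re-estimate.
rng = np.random.default_rng(1)
P_true = np.hstack([np.eye(3), np.array([[0.1], [0.2], [2.0]])])
X = rng.uniform(-1, 1, (10, 3)) + np.array([0, 0, 5])
xh = (P_true @ np.hstack([X, np.ones((10, 1))]).T).T
x = xh[:, :2] / xh[:, 2:3]
P = dlt_projection_matrix(X, x)
xh2 = (P @ np.hstack([X, np.ones((10, 1))]).T).T
reproj = xh2[:, :2] / xh2[:, 2:3]
print(np.max(np.abs(reproj - x)))          # close to machine precision on exact data
```

With a surface model already in hand, 3D points sampled from it and matched into the new image play the role of X and x; with no surface model, the relative-calibration case instead starts from two-view geometry between image pairs.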
Hubble Space Telescope: Wide field and planetary camera instrument handbook. Version 2.1
NASA Technical Reports Server (NTRS)
Griffiths, Richard (Editor)
1990-01-01
An overview is presented of the development and construction of the Wide Field and Planetary Camera (WF/PC). The WF/PC is a dual two-dimensional spectrophotometer with rudimentary polarimetric and transmission grating capabilities. The instrument operates from 1150 to 11000 A with a resolution of 0.1 arcsec per pixel or 0.043 arcsec per pixel. Data products and standard calibration methods are briefly summarized.
Assessment of the UV camera sulfur dioxide retrieval for point source plumes
Dalton, M.P.; Watson, I.M.; Nadeau, P.A.; Werner, C.; Morrow, W.; Shannon, J.M.
2009-01-01
Digital cameras, sensitive to specific regions of the ultra-violet (UV) spectrum, have been employed for quantifying sulfur dioxide (SO2) emissions in recent years. The instruments make use of the selective absorption of UV light by SO2 molecules to determine pathlength concentration. Many monitoring advantages are gained by using this technique, but its accuracy and limitations have not been thoroughly investigated. The effects of some user-controlled parameters, including image exposure duration, the diameter of the lens aperture, the frequency of calibration cell imaging, and the use of single or paired bandpass filters, have not yet been addressed. In order to clarify methodological consequences and quantify accuracy, laboratory and field experiments were conducted. Images were collected of calibration cells under varying observational conditions, and our conclusions provide guidance for enhanced image collection. Results indicate that the calibration cell response is reliably linear below 1500 ppm m, but that the response is significantly affected by changing light conditions. Exposure durations that produce maximum image digital numbers above 32 500 counts can reduce noise in plume images. Sulfur dioxide retrieval results from a coal-fired power plant plume were compared to direct sampling measurements, and the results indicate that the accuracy of the UV camera retrieval method is within the range of current spectrometric methods. © 2009 Elsevier B.V.
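The reported linearity of the calibration cell response below about 1500 ppm m implies a straight-line mapping from apparent absorbance to SO2 column density. A sketch with hypothetical cell readings (the numbers are illustrative, not the study's data):

```python
import numpy as np

# Hypothetical calibration-cell data: apparent absorbance -ln(I/I0) versus
# SO2 column density (ppm·m), restricted to the reported linear range.
column = np.array([0.0, 250.0, 500.0, 750.0, 1000.0, 1250.0])   # ppm·m
absorb = np.array([0.00, 0.11, 0.22, 0.34, 0.45, 0.56])

slope, intercept = np.polyfit(column, absorb, 1)

def so2_column(absorbance):
    """Map a plume pixel's absorbance back to SO2 column density (ppm·m)
    via the calibration-cell fit."""
    return (absorbance - intercept) / slope

print(so2_column(0.30))   # column density in ppm·m
```

Because the response also shifts with illumination, a real retrieval re-images the cells as light conditions change rather than fitting once.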
Action Sport Cameras as an Instrument to Perform a 3D Underwater Motion Analysis.
Bernardina, Gustavo R D; Cerveri, Pietro; Barros, Ricardo M L; Marins, João C B; Silvatti, Amanda P
2016-01-01
Action sport cameras (ASC) are currently adopted mainly for entertainment purposes, but their continuing technical improvements, together with decreasing costs, are opening them up for quantitative three-dimensional (3D) motion analysis in sport gesture study and athletic performance evaluation. Extending this technology to sport analysis, however, still requires a methodological step forward to make ASC a metric system, encompassing ad-hoc camera setup, image processing, feature tracking, calibration and 3D reconstruction. Unlike traditional laboratory analysis, such requirements become an issue when coping with both indoor and outdoor motion acquisitions of athletes. In swimming analysis, for example, the camera setup and the calibration protocol are particularly demanding since land and underwater cameras are mandatory. In particular, underwater camera calibration can be an issue affecting the reconstruction accuracy. In this paper, the aim is to evaluate the feasibility of ASC for 3D underwater analysis by focusing on camera setup and data acquisition protocols. Two GoPro Hero3+ Black cameras (frequency: 60 Hz; image resolutions: 1280×720/1920×1080 pixels) were placed underwater in a swimming pool, surveying a working volume of about 6 m3. A two-step custom calibration procedure, consisting of the acquisition of one static triad and one moving wand, carrying nine and one spherical passive markers, respectively, was implemented. After assessing the camera parameters, a rigid bar, carrying two markers at known distance, was acquired in several positions within the working volume. The average error on the reconstructed inter-marker distances was less than 2.5 mm (1280×720) and 1.5 mm (1920×1080). The results of this study demonstrate that the calibration of underwater ASC is feasible, enabling quantitative kinematic measurements with accuracy comparable to traditional motion capture systems. PMID:27513846
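The accuracy figure quoted above comes from comparing reconstructed marker-pair distances against the known length of the rigid bar. That evaluation step can be sketched as follows, on synthetic coordinates rather than the study's data:

```python
import numpy as np

def bar_length_errors(p1, p2, nominal):
    """Absolute error between reconstructed marker-pair distances and the
    known bar length, for arrays of 3D marker positions p1, p2 of shape (n, 3)."""
    d = np.linalg.norm(p1 - p2, axis=1)
    return np.abs(d - nominal)

# Synthetic reconstructions of a 500 mm bar with ~1 mm coordinate noise
# (values are illustrative, not the paper's data).
rng = np.random.default_rng(2)
a = rng.uniform(0, 2000, (100, 3))            # bar placed around the volume
b = a + np.array([500.0, 0.0, 0.0])
a_noisy = a + rng.normal(0, 1.0, a.shape)
b_noisy = b + rng.normal(0, 1.0, b.shape)
err = bar_length_errors(a_noisy, b_noisy, 500.0)
print(err.mean())                             # mean error on the order of 1 mm
```

Averaging this error over many bar positions spread through the working volume gives a single accuracy figure comparable to the millimetre-level values reported.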
NASA Technical Reports Server (NTRS)
Miller, Chris; Mulavara, Ajitkumar; Bloomberg, Jacob
2001-01-01
To confidently report any data collected from a video-based motion capture system, its functional characteristics must be determined, namely accuracy, repeatability and resolution. Many researchers have examined these characteristics with motion capture systems, but they used only two cameras, positioned 90 degrees to each other. Everaert used 4 cameras, but all were aligned along the major axes (two in x, one each in y and z). Richards compared the characteristics of different commercially available systems set up in practical configurations, but all cameras viewed a single calibration volume. The purpose of this study was to determine the accuracy, repeatability and resolution of a 6-camera Motion Analysis system in a split-volume configuration using a quasistatic methodology.
IMU-Based Online Kinematic Calibration of Robot Manipulator
2013-01-01
Robot calibration is a useful diagnostic method for improving the positioning accuracy in robot production and maintenance. An online robot self-calibration method based on an inertial measurement unit (IMU) is presented in this paper. The method requires that the IMU is rigidly attached to the robot manipulator, which makes it possible to obtain the orientation of the manipulator from the orientation of the IMU in real time. This paper proposes an efficient approach which incorporates the Factored Quaternion Algorithm (FQA) and a Kalman filter (KF) to estimate the orientation of the IMU. Then, an extended Kalman filter (EKF) is used to estimate the kinematic parameter errors. Using the proposed orientation estimation method results in improved reliability and accuracy in determining the orientation of the manipulator. Compared with existing vision-based self-calibration methods, the great advantage of this method is that it does not need complex steps such as camera calibration, image capture, and corner detection, which makes the robot calibration procedure more autonomous in a dynamic manufacturing environment. Experimental studies on a GOOGOL GRB3016 robot show that this method has better accuracy, convenience, and effectiveness than vision-based methods. PMID:24302854
Out of lab calibration of a rotating 2D scanner for 3D mapping
NASA Astrophysics Data System (ADS)
Koch, Rainer; Böttcher, Lena; Jahrsdörfer, Maximilian; Maier, Johannes; Trommer, Malte; May, Stefan; Nüchter, Andreas
2017-06-01
Mapping is an essential task in mobile robotics. To fulfil advanced navigation and manipulation tasks, a 3D representation of the environment is required. Applying stereo cameras or time-of-flight (TOF) cameras is one way to achieve this requirement. Unfortunately, they suffer from drawbacks which make it difficult to map properly. Therefore, costly 3D laser scanners are applied. An inexpensive way to build a 3D representation is to use a 2D laser scanner and rotate the scan plane around an additional axis. A 3D point cloud acquired with such a custom device consists of multiple 2D line scans. Therefore, the scanner pose of each line scan needs to be determined, as well as the parameters resulting from a calibration, to generate a 3D point cloud. Using external sensor systems is a common method to determine these calibration parameters, but it is costly and difficult when the robot needs to be calibrated outside the lab. Thus, this work presents a calibration method applied to a rotating 2D laser scanner. It uses a hardware setup to identify the required parameters for calibration. This hardware setup is light, small, and easy to transport; hence, an out-of-lab calibration is possible. Additionally, a theoretical model was created to test the algorithm and analyse the impact of the scanner accuracy. The hardware components of the 3D scanner system are a HOKUYO UTM-30LX-EW 2D laser scanner, a Dynamixel servo-motor, and a control unit. The calibration system consists of a hemisphere; inside the hemisphere, a circular plate is mounted. The algorithm needs to be provided with a dataset of a single rotation from the laser scanner. To achieve a proper calibration result, the scanner needs to be located in the middle of the hemisphere. By means of geometric formulas, the algorithm determines the individual deviations of the placed laser scanner. In order to minimize errors, the algorithm solves the formulas in an iterative process.
First, the calibration algorithm was tested with an ideal hemisphere model created in Matlab. Second, the laser scanner was mounted differently: the scanner position and the rotation axis were modified. In doing so, every deviation was compared with the algorithm's results. Several measurement settings were tested repeatedly with the 3D scanner system and the calibration system. The results show that the length accuracy of the laser scanner is most critical: it influences the required size of the hemisphere and the calibration accuracy.
Minimizing camera-eye optical aberrations during the 3D reconstruction of retinal structures
NASA Astrophysics Data System (ADS)
Aldana-Iuit, Javier; Martinez-Perez, M. Elena; Espinosa-Romero, Arturo; Diaz-Uribe, Rufino
2010-05-01
3D reconstruction of blood vessels is a powerful visualization tool for physicians, since it allows them to refer to a qualitative representation of their subject of study. In this paper we propose a 3D reconstruction method for retinal vessels from fundus images. The reconstruction method proposed herein uses images of the same retinal structure in epipolar geometry. Images are preprocessed by the RISA system for segmenting blood vessels and obtaining feature points for correspondences. The correspondence problem is solved using correlation. LMedS analysis and the Graph Transformation Matching algorithm are used for outlier suppression. Camera projection matrices are computed with the normalized eight-point algorithm. Finally, we retrieve the 3D positions of the retinal tree points by linear triangulation. In order to increase the power of visualization, 3D tree skeletons are represented by surfaces via generalized cylinders whose radii correspond to morphological measurements obtained by RISA. In this paper the complete calibration process, including the fundus camera and the optical properties of the eye, the so-called camera-eye system, is proposed. On one hand, the internal parameters of the fundus camera are obtained by classical algorithms using a reference pattern. On the other hand, we minimize the undesirable effects of the aberrations induced by the eyeball optical system, assuming that a contact enlarging lens corrects astigmatism, that spherical and coma aberrations are reduced by changing the aperture size, and that eye refractive errors are suppressed by adjusting the camera focus during image acquisition. Evaluation of two self-calibration proposals and results of 3D blood vessel surface reconstruction are presented.
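The normalized eight-point algorithm named above, from which the epipolar geometry and hence the projection matrices follow, can be sketched in a few lines. The synthetic stereo pair below is illustrative, not retinal data:

```python
import numpy as np

def normalize(pts):
    """Hartley normalization: translate to zero centroid, scale so the
    mean distance from the origin is sqrt(2)."""
    c = pts.mean(axis=0)
    s = np.sqrt(2) / np.mean(np.linalg.norm(pts - c, axis=1))
    T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
    ph = np.hstack([pts, np.ones((len(pts), 1))])
    return (T @ ph.T).T, T

def eight_point(x1, x2):
    """Fundamental matrix from n >= 8 correspondences x1, x2 of shape (n, 2)."""
    n1, T1 = normalize(x1)
    n2, T2 = normalize(x2)
    # Each correspondence contributes one row of the homogeneous system A f = 0.
    A = np.column_stack([n2[:, 0] * n1[:, 0], n2[:, 0] * n1[:, 1], n2[:, 0],
                         n2[:, 1] * n1[:, 0], n2[:, 1] * n1[:, 1], n2[:, 1],
                         n1[:, 0], n1[:, 1], np.ones(len(x1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)              # enforce the rank-2 constraint
    F = U @ np.diag([S[0], S[1], 0]) @ Vt
    return T2.T @ F @ T1                      # denormalize

# Synthetic stereo pair: 3D points seen by two cameras related by a translation.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (20, 3)) + np.array([0, 0, 6])
x1 = X[:, :2] / X[:, 2:3]
Xr = X - np.array([0.5, 0.1, 0.0])            # second camera displaced
x2 = Xr[:, :2] / Xr[:, 2:3]
F = eight_point(x1, x2)
F /= np.linalg.norm(F)
h1 = np.hstack([x1, np.ones((20, 1))])
h2 = np.hstack([x2, np.ones((20, 1))])
resid = np.abs(np.sum(h2 * (F @ h1.T).T, axis=1))
print(resid.max())                            # epipolar residuals near zero
```

Each residual is the epipolar constraint x2ᵀ F x1, which vanishes for correct correspondences; linear triangulation then recovers the 3D points from the two projection matrices derived from F.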
Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera
NASA Astrophysics Data System (ADS)
Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and the non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets, are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.
Jung, Jaehoon; Yoon, Inhye; Paik, Joonki
2016-01-01
This paper presents an object occlusion detection algorithm using object depth information that is estimated by automatic camera calibration. The object occlusion problem is a major factor to degrade the performance of object tracking and recognition. To detect an object occlusion, the proposed algorithm consists of three steps: (i) automatic camera calibration using both moving objects and a background structure; (ii) object depth estimation; and (iii) detection of occluded regions. The proposed algorithm estimates the depth of the object without extra sensors but with a generic red, green and blue (RGB) camera. As a result, the proposed algorithm can be applied to improve the performance of object tracking and object recognition algorithms for video surveillance systems. PMID:27347978
3D morphology reconstruction using linear array CCD binocular stereo vision imaging system
NASA Astrophysics Data System (ADS)
Pan, Yu; Wang, Jinjiang
2018-01-01
A binocular vision imaging system with a small field of view cannot reconstruct the 3-D shape of a dynamic object. We developed a linear array CCD binocular vision imaging system, which uses different calibration and reconstruction methods. Compared with a conventional binocular vision imaging system, the linear array CCD binocular vision imaging system has a wider field of view, can reconstruct the 3-D morphology of objects in continuous motion, and produces accurate results. This research mainly introduces the composition and principle of the linear array CCD binocular vision imaging system, including the calibration, capture, matching and reconstruction stages. The system consists of two linear array cameras placed in a special arrangement and a horizontal moving platform that carries the objects. The internal and external parameters of the cameras are obtained by calibration in advance. The cameras then capture images of the moving objects, and the results are matched and 3-D reconstructed. The linear array CCD binocular vision imaging system can accurately measure the 3-D appearance of moving objects; this work is of great significance for measuring the 3-D morphology of moving objects.
Calibration of the VENμS super-spectral camera
NASA Astrophysics Data System (ADS)
Topaz, Jeremy; Sprecher, Tuvia; Tinto, Francesc; Echeto, Pierre; Hagolle, Olivier
2017-11-01
A high-resolution super-spectral camera is being developed by Elbit Systems in Israel for the joint CNES-Israel Space Agency satellite, VENμS (Vegetation and Environment monitoring on a new Micro-Satellite). This camera will have 12 narrow spectral bands in the Visible/NIR region and will give images with 5.3 m resolution from an altitude of 720 km, with an orbit which allows a two-day revisit interval for a number of selected sites distributed over some two-thirds of the earth's surface. The swath width will be 27 km at this altitude. To ensure the high radiometric and geometric accuracy needed to fully exploit such multiple data sampling, careful attention is given in the design to maximize characteristics such as signal-to-noise ratio (SNR), spectral band accuracy, stray light rejection, inter-band pixel-to-pixel registration, etc. For the same reasons, accurate calibration of all the principal characteristics is essential, and this presents some major challenges. The methods planned to achieve the required level of calibration are presented following a brief description of the system design. A fuller description of the system design is given in [2], [3] and [4].
Two-Camera Acquisition and Tracking of a Flying Target
NASA Technical Reports Server (NTRS)
Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter
2008-01-01
A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
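The pixel-to-sky calibration of the stationary camera can be approximated, for illustration, by fitting an affine map from pixel coordinates to azimuth/elevation using reference detections. This is a coarse sketch that ignores lens distortion, which a real all-sky camera would have to model:

```python
import numpy as np

def fit_pixel_to_azel(pix, azel):
    """Least-squares affine map [az, el]^T = A @ [u, v, 1]^T from n >= 3
    reference detections with known sky directions. Adequate only for the
    rough aiming step; a wide-angle lens needs a distortion model too."""
    ph = np.hstack([pix, np.ones((len(pix), 1))])
    A, *_ = np.linalg.lstsq(ph, azel, rcond=None)
    return A.T

# Synthetic wide-field camera: a known affine map plus reference detections.
A_true = np.array([[0.05, 0.0, -10.0], [0.0, -0.05, 40.0]])  # deg per pixel (assumed)
rng = np.random.default_rng(4)
pix = rng.uniform(0, 1000, (12, 2))
azel = (A_true @ np.hstack([pix, np.ones((12, 1))]).T).T
A = fit_pixel_to_azel(pix, azel)
target_pix = np.array([512.0, 384.0])
print(A @ np.append(target_pix, 1.0))   # approximate [azimuth, elevation] for aiming
```

Once the target's approximate azimuth and elevation are known, the gimballed camera can be slewed there and the narrow-field tracking loop takes over.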
Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.
Easlon, Hsien Ming; Bloom, Arnold J
2014-07-01
Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for the camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
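The color-ratio classification plus red-calibration-area scaling can be sketched as follows. The thresholds and the synthetic image are illustrative assumptions, not Easy Leaf Area's exact values:

```python
import numpy as np

def leaf_area(img, red_area_cm2, g_thresh=0.4, r_thresh=0.5):
    """Estimate leaf area from an RGB image containing a red calibration
    patch of known size. Pixels are classified by color ratios, so no
    camera-distance or ruler measurement is needed."""
    img = img.astype(float)
    total = img.sum(axis=2) + 1e-9
    g_ratio = img[:, :, 1] / total          # green fraction per pixel
    r_ratio = img[:, :, 0] / total          # red fraction per pixel
    leaf_px = np.count_nonzero(g_ratio > g_thresh)
    cal_px = np.count_nonzero(r_ratio > r_thresh)
    return leaf_px / cal_px * red_area_cm2

# Synthetic 100x100 image: gray background, a green "leaf", a red patch.
img = np.full((100, 100, 3), 120, dtype=np.uint8)
img[10:50, 10:50] = (30, 200, 40)   # 1600 leaf pixels
img[70:90, 70:90] = (220, 20, 20)   # 400 calibration pixels; patch is 4 cm^2
print(leaf_area(img, 4.0))          # 1600/400 * 4 = 16.0 cm^2
```

Because both the leaf and the calibration patch scale identically with camera distance, their pixel-count ratio converts directly to physical area.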
Development and use of an L3CCD high-cadence imaging system for Optical Astronomy
NASA Astrophysics Data System (ADS)
Sheehan, Brendan J.; Butler, Raymond F.
2008-02-01
A high cadence imaging system, based on a Low Light Level CCD (L3CCD) camera, has been developed for photometric and polarimetric applications. The camera system is an iXon DV-887 from Andor Technology, which uses a CCD97 L3CCD detector from E2V technologies. This is a back-illuminated device, giving it an extended blue response, and it has an active area of 512×512 pixels. The camera system allows frame rates ranging from 30 fps (full frame) to 425 fps (windowed & binned frame). We outline the system design, concentrating on the calibration and control of the L3CCD camera. The L3CCD detector can be either triggered directly by a GPS timeserver/frequency generator or internally triggered. A central PC remotely controls the camera computer system and timeserver. The data is saved as standard `FITS' files. The large data loads associated with high frame rates lead to issues with gathering and storing the data effectively. To overcome such problems, a specific data management approach is used, and a Python/PYRAF data reduction pipeline was written for the Linux environment. This uses calibration data collected either on-site or from lab-based measurements, and enables a fast and reliable method for reducing images. To date, the system has been used twice on the 1.5 m Cassini Telescope in Loiano (Italy); we present the reduction methods and observations made.
An Investigation of a Photographic Technique of Measuring High Surface Temperatures
NASA Technical Reports Server (NTRS)
Siviter, James H., Jr.; Strass, H. Kurt
1960-01-01
A photographic method of temperature determination has been developed to measure elevated temperatures of surfaces. The technique presented herein minimizes calibration procedures and permits wide variation in emulsion developing techniques. The present work indicates that the lower limit of applicability is approximately 1,400 F when conventional cameras, emulsions, and moderate exposures are used. The upper limit is determined by the calibration technique and the accuracy required.
Concave Surround Optics for Rapid Multi-View Imaging
2006-11-01
thus is amenable to capturing dynamic events, avoiding the need to construct and calibrate an array of cameras. We demonstrate the system with a high... hard to assemble and calibrate. In this paper we present an optical system capable of rapidly moving the viewpoint around a scene. Our system... flexibility, large camera arrays are typically expensive and require significant effort to calibrate temporally, geometrically and chromatically
Teaching Camera Calibration by a Constructivist Methodology
ERIC Educational Resources Information Center
Samper, D.; Santolaria, J.; Pastor, J. J.; Aguilar, J. J.
2010-01-01
This article describes the Metrovisionlab simulation software and practical sessions designed to teach the most important machine vision camera calibration aspects in courses for senior undergraduate students. By following a constructivist methodology, having received introductory theoretical classes, students use the Metrovisionlab application to…
NASA Astrophysics Data System (ADS)
Sargent, Dusty; Chen, Chao-I.; Wang, Yuan-Fang
2010-02-01
The paper reports a fully-automated, cross-modality sensor data registration scheme between video and magnetic tracker data. This registration scheme is intended for use in computerized imaging systems to model the appearance, structure, and dimension of human anatomy in three dimensions (3D) from endoscopic videos, particularly colonoscopic videos, for cancer research and clinical practices. The proposed cross-modality calibration procedure operates this way: Before a colonoscopic procedure, the surgeon inserts a magnetic tracker into the working channel of the endoscope or otherwise fixes the tracker's position on the scope. The surgeon then maneuvers the scope-tracker assembly to view a checkerboard calibration pattern from a few different viewpoints for a few seconds. The calibration procedure is then completed, and the relative pose (translation and rotation) between the reference frames of the magnetic tracker and the scope is determined. During the colonoscopic procedure, the readings from the magnetic tracker are used to automatically deduce the pose (both position and orientation) of the scope's reference frame over time, without complicated image analysis. Knowing the scope movement over time then allows us to infer the 3D appearance and structure of the organs and tissues in the scene. While there are other well-established mechanisms for inferring the movement of the camera (scope) from images, they are often sensitive to mistakes in image analysis, error accumulation, and structure deformation. The proposed method using a magnetic tracker to establish the camera motion parameters thus provides a robust and efficient alternative for 3D model construction. Furthermore, the calibration procedure does not require special training or expensive calibration equipment (except for a camera calibration pattern, a checkerboard pattern that can be printed on any laser or inkjet printer).
Evangelista, Dennis J.; Ray, Dylan D.; Hedrick, Tyson L.
2016-01-01
Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts. PMID:27444791
Inflight Radiometric Calibration of New Horizons' Multispectral Visible Imaging Camera (MVIC)
NASA Technical Reports Server (NTRS)
Howett, C. J. A.; Parker, A. H.; Olkin, C. B.; Reuter, D. C.; Ennico, K.; Grundy, W. M.; Graps, A. L.; Harrison, K. P.; Throop, H. B.; Buie, M. W.;
2016-01-01
We discuss two semi-independent calibration techniques used to determine the inflight radiometric calibration for the New Horizons Multi-spectral Visible Imaging Camera (MVIC). The first calibration technique compares the measured number of counts (DN) observed from a number of well calibrated stars to those predicted using the component-level calibration. The ratio of these values provides a multiplicative factor that allows a conversion from the preflight calibration to the more accurate inflight one, for each detector. The second calibration technique is a channel-wise relative radiometric calibration for MVIC's blue, near-infrared and methane color channels using Hubble and New Horizons observations of Charon and scaling from the red channel stellar calibration. Both calibration techniques produce very similar results (better than 7% agreement), providing strong validation for the techniques used. Since the stellar calibration described here can be performed without a color target in the field of view and covers all of MVIC's detectors, this calibration was used to provide the radiometric keyword values delivered by the New Horizons project to the Planetary Data System (PDS). These keyword values allow each observation to be converted from counts to physical units; a description of how these keyword values were generated is included. Finally, mitigation techniques adopted for the gain drift observed in the near-infrared detector and one of the panchromatic framing cameras are also discussed.
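The first technique, taking the ratio of measured to predicted stellar counts as a multiplicative correction on the preflight calibration, can be sketched as follows (the star values are hypothetical):

```python
import numpy as np

# Hypothetical stellar observations: measured counts versus counts predicted
# from the component-level (preflight) calibration, for one detector.
measured = np.array([1520.0, 980.0, 2210.0, 640.0])
predicted = np.array([1450.0, 935.0, 2100.0, 610.0])

# Multiplicative factor carrying the preflight calibration to inflight.
factor = np.mean(measured / predicted)

def counts_to_physical(dn, preflight_coeff):
    """Convert DN to physical units using the inflight-corrected coefficient."""
    return dn * preflight_coeff / factor

print(factor)   # here ~1.05: inflight response a few percent above preflight
```

A per-detector factor like this is what a radiometric keyword value encodes, letting archive users convert counts to physical units without redoing the calibration.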
Sub-Camera Calibration of a Penta-Camera
NASA Astrophysics Data System (ADS)
Jacobsen, K.; Gerke, M.
2016-03-01
Penta cameras, consisting of a nadir and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the used IGI Penta DigiCAM has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. Dense matching by Pix4Dmapper provided 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the object points with a high number of images are concentrated in the block centres, while the inclined images outside the block centre are satisfactorily but not very strongly connected. This leads to very high values of the Student (T) test for the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras, with a size exceeding 5 μm, even though they were described as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used.
With the exception of the angular affinity, the systematic image errors of corresponding cameras in both blocks show the same trend, but, as usual for block adjustments with self-calibration, they still show significant differences. Based on the very high number of image points, the remaining image residuals can be safely determined by overlaying and averaging the image residuals according to their image coordinates. The size of the systematic image errors not covered by the used additional parameters is in the range of a root mean square of 0.1 pixels, corresponding to 0.6μm. They are not the same for both blocks, but show some similarities for corresponding cameras. In general, a bundle block adjustment with a satisfactory set of additional parameters, checked against remaining systematic errors, is required to exploit the whole geometric potential of the penta camera. Especially for object points on facades, often seen in only two images taken with a limited base length, the correct handling of systematic image errors is important. At least in the analyzed data sets, the self-calibration of sub-cameras by bundle block adjustment suffers from the correlation of the inner to the exterior orientation due to missing crossing flight directions. As usual, the systematic image errors differ from block to block, even without the influence of the correlation with the exterior orientation.
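The radial symmetric distortion discussed above can be sketched as a simple additional-parameter correction; the polynomial form, coefficients, and coordinates below are illustrative assumptions, not values from the IGI calibration:

```python
import numpy as np

def radial_correction(xy, k, principal_point=(0.0, 0.0)):
    """Apply a radial symmetric distortion correction dr = k1*r^3 + k2*r^5
    to image coordinates (a common 'additional parameter' form in
    self-calibrating bundle adjustment)."""
    xy = np.asarray(xy, dtype=float)
    pp = np.asarray(principal_point, dtype=float)
    d = xy - pp                        # coordinates relative to principal point
    r = np.linalg.norm(d, axis=-1, keepdims=True)
    dr = k[0] * r**3 + k[1] * r**5     # radial displacement in image units
    # avoid division by zero at the principal point
    scale = np.where(r > 0, dr / np.maximum(r, 1e-12), 0.0)
    return xy + d * scale

# Hypothetical point near an image corner (mm from the principal point);
# even a few micrometres of displacement here is significant for facades.
pts = np.array([[18.0, 12.0]])
corrected = radial_correction(pts, (1e-5, -1e-8))
```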
Geometric rectification of camera-captured document images.
Liang, Jian; DeMenthon, Daniel; Doermann, David
2008-04-01
Compared to typical scanners, handheld cameras offer convenient, flexible, portable, and non-contact image capture, which enables many new applications and breathes new life into existing ones. However, camera-captured documents may suffer from distortions caused by non-planar document shape and perspective projection, which lead to failure of current OCR technologies. We present a geometric rectification framework for restoring the frontal-flat view of a document from a single camera-captured image. Our approach estimates 3D document shape from texture flow information obtained directly from the image without requiring additional 3D/metric data or prior camera calibration. Our framework provides a unified solution for both planar and curved documents and can be applied in many, especially mobile, camera-based document analysis applications. Experiments show that our method produces results that are significantly more OCR compatible than the original images.
Optical fringe-reflection deflectometry with bundle adjustment
NASA Astrophysics Data System (ADS)
Xiao, Yong-Liang; Li, Sikun; Zhang, Qican; Zhong, Jianxin; Su, Xianyu; You, Zhisheng
2018-06-01
Liquid crystal display (LCD) screens are located outside of the camera's field of view in fringe-reflection deflectometry; the fringes displayed on the LCD screen therefore reach the fixed camera only through specular reflection. Thus, the pose calibration between the camera and the LCD screen is one of the main challenges in fringe-reflection deflectometry. A markerless planar mirror is used to reflect the LCD screen more than three times, and the fringes are mapped into the fixed camera. The geometrical calibration can be accomplished by estimating the pose between the camera and the virtual image of the fringes. Given the relation between their poses, the incidence and reflection rays can be unified in the camera frame, and a forward triangulation intersection can be performed in the camera frame to measure three-dimensional (3D) coordinates of the specular surface. In the final optimization, a constraint bundle adjustment simultaneously refines the camera intrinsic parameters, including distortion coefficients, the estimated geometrical pose between the LCD screen and the camera, and the 3D coordinates of the specular surface, with the help of the absolute-phase collinear constraint. Simulation and experimental results demonstrate that pose calibration with planar mirror reflection is simple and feasible, and that the constraint bundle adjustment enhances the 3D coordinate measurement accuracy in fringe-reflection deflectometry.
A Novel Approach to Camera Calibration Method for Smart Phones Under Road Environment
NASA Astrophysics Data System (ADS)
Lee, Bijun; Zhou, Jian; Ye, Maosheng; Guo, Yuan
2016-06-01
Monocular vision-based lane departure warning systems have been increasingly used in advanced driver assistance systems (ADAS). Based on lane marker detection and identification, we propose an automatic and efficient camera calibration method for smart phones. First, we detect lane marker features in perspective space and compute the edges of lane markers in image sequences. Second, because the widths of the lane markers and the road lane are fixed in a standard structured road environment, we can automatically build a transformation matrix between perspective space and 3D space and obtain a local map in the vehicle coordinate system. In order to verify the validity of this method, we installed a smart phone in the `Tuzhi' self-driving car of Wuhan University and recorded more than 100 km of image data on roads in Wuhan. According to the results, the calculated positions of lane markers are accurate enough for the self-driving car to run smoothly on the road.
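The perspective-to-ground transformation built from the fixed lane geometry can be sketched as a homography estimated with the direct linear transform; the pixel and ground coordinates below are hypothetical, not data from the `Tuzhi' trials:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst points
    (>= 4 correspondences) with the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)       # null vector of A, reshaped
    return H / H[2, 2]

# Hypothetical example: image points on two lane marker edges map to
# ground coordinates fixed by a known 3.5 m lane width.
image_pts = [(102, 480), (538, 480), (200, 300), (440, 300)]
ground_pts = [(0.0, 0.0), (3.5, 0.0), (0.0, 10.0), (3.5, 10.0)]
H = homography_dlt(image_pts, ground_pts)
p = H @ np.array([102.0, 480.0, 1.0])
p = p[:2] / p[2]                   # maps back to ground point (0, 0)
```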
Automatic colorimetric calibration of human wounds
2010-01-01
Background Digital photography is now considered an acceptable tool in many clinical domains, e.g. wound care. Although ever higher resolutions are available, reproducibility is still poor and visual comparison of images remains difficult. This is even more the case for measurements performed on such images (colour, area, etc.). This problem is often neglected, and images are freely compared and exchanged without further thought. Methods The first experiment checked whether camera settings or lighting conditions could negatively affect the quality of colorimetric calibration. Digital images plus a calibration chart were exposed to a variety of conditions. Precision and accuracy of colours after calibration were quantitatively assessed with a probability distribution for perceptual colour differences (dE_ab). The second experiment was designed to assess the impact of the automatic calibration procedure (i.e. chart detection) on real-world measurements. Forty different images of real wounds were acquired and a region of interest was selected in each image. Three rotated versions of each image were automatically calibrated and colour differences were calculated. Results First experiment: colour differences between the measurements and real spectrophotometric measurements reveal median dE_ab values of 6.40 for the proper patches of calibrated normal images and 17.75 for uncalibrated images, demonstrating an important improvement in accuracy after calibration. Reproducibility, visualized by the probability distribution of the dE_ab errors between two measurements of the image patches, has a median of 3.43 dE_ab for all calibrated images and 23.26 dE_ab for all uncalibrated images. Restricted to the proper patches of normal calibrated images, the median is only 2.58 dE_ab.
Wilcoxon rank-sum tests (significance level 0.05) between uncalibrated normal images and calibrated normal images with proper squares yielded p-values equal to 0, demonstrating a highly significant improvement in reproducibility. In the second experiment, the reproducibility of the chart detection during automatic calibration is presented using a probability distribution of dE_ab errors between two measurements of the same ROI. Conclusion The investigators propose an automatic colour calibration algorithm that ensures reproducible colour content of digital images. Evidence was provided that images taken with commercially available digital cameras can be calibrated independently of any camera settings and illumination features. PMID:20298541
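The dE_ab colour differences reported above are Euclidean distances in CIELAB (the CIE76 formula); a minimal sketch with hypothetical patch values:

```python
import numpy as np

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference dE_ab: Euclidean distance in CIELAB."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

# A dE_ab around 2.3 is often taken as a just-noticeable difference, so the
# reported drop from ~17.75 (uncalibrated) to ~2.58 (calibrated) brings the
# error close to the perceptual threshold.
measured = (52.0, 10.0, -6.0)   # hypothetical calibrated patch, L*a*b*
reference = (50.0, 12.0, -5.0)  # hypothetical spectrophotometer reading
de = delta_e_ab(measured, reference)  # = 3.0
```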
Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board
Park, Yoonsu; Yun, Seokmin; Won, Chee Sun; Cho, Kyungeun; Um, Kyhyun; Sim, Sungdae
2014-01-01
Calibration between color camera and 3D Light Detection And Ranging (LIDAR) equipment is an essential process for data fusion. The goal of this paper is to improve the calibration accuracy between a camera and a 3D LIDAR. In particular, we are interested in calibrating a low resolution 3D LIDAR with a relatively small number of vertical sensors. Our goal is achieved by employing a new methodology for the calibration board, which exploits 2D-3D correspondences. The 3D corresponding points are estimated from the scanned laser points on the polygonal planar board with adjacent sides. Since the lengths of adjacent sides are known, we can estimate the vertices of the board as a meeting point of two projected sides of the polygonal board. The estimated vertices from the range data and those detected from the color image serve as the corresponding points for the calibration. Experiments using a low-resolution LIDAR with 32 sensors show robust results. PMID:24643005
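The vertex estimation described above, a meeting point of two projected sides, can be sketched in 2D as the intersection of two total-least-squares line fits; the scattered points below are synthetic stand-ins for laser returns along the board sides:

```python
import numpy as np

def fit_line(points):
    """Total-least-squares 2D line fit: returns a point on the line and a
    unit direction (principal axis of the scattered points)."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - centroid)
    return centroid, Vt[0]

def intersect(p1, d1, p2, d2):
    """Intersection of the lines p1 + t*d1 and p2 + s*d2."""
    A = np.column_stack([d1, -d2])
    t, _ = np.linalg.solve(A, p2 - p1)
    return p1 + t * d1

# Synthetic noisy returns along two adjacent sides meeting near (3, 3)
side_a = [(0.0, 0.1), (1.0, 1.1), (2.0, 2.0), (3.0, 3.1)]
side_b = [(0.0, 6.0), (1.0, 5.1), (2.0, 4.0), (3.0, 3.0)]
vertex = intersect(*fit_line(side_a), *fit_line(side_b))
```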
A system for extracting 3-dimensional measurements from a stereo pair of TV cameras
NASA Technical Reports Server (NTRS)
Yakimovsky, Y.; Cunningham, R.
1976-01-01
Obtaining accurate three-dimensional (3-D) measurements from a stereo pair of TV cameras is a task requiring camera modeling, calibration, and the matching of the two images of a real 3-D point in the two TV pictures. A system was implemented that models and calibrates the cameras and pairs the two images of a real-world point in the two pictures, either manually or automatically. This system is operational and provides a three-dimensional measurement resolution of ± mm at distances of about 2 m.
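The core 3-D measurement step, intersecting the rays of a matched point pair, can be sketched as linear (DLT) triangulation; the projection matrices and point below form a hypothetical noiseless stereo pair, not the paper's camera model:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a point seen at pixel x1 in camera P1
    and x2 in camera P2 (P1, P2 are 3x4 projection matrices)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                     # homogeneous 3-D point (null vector)
    return X[:3] / X[3]

# Hypothetical rectified pair: identity left camera, right camera shifted
# by a 0.3 m baseline, unit focal length.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.3], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 2.0])
x1 = X_true[:2] / X_true[2]                    # left-image projection
x2 = (X_true + [-0.3, 0, 0])[:2] / X_true[2]   # right-image projection
X = triangulate(P1, P2, x1, x2)                # recovers (0.2, 0.1, 2.0)
```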
Calibration Procedures in Mid Format Camera Setups
NASA Astrophysics Data System (ADS)
Pivnicka, F.; Kemper, G.; Geissler, S.
2012-07-01
A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform, and the specific characteristics of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is a proper camera calibration. Using aerial images over a well-designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the camera beside the IMU, two lever arms have to be measured with mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre and the lever arm from the IMU centre to the camera projection centre. The measurement with a total station is not a difficult task, but the definition of the right centres and the need to use rotation matrices can cause serious accuracy problems. The benefit of small and medium-format cameras is that smaller aircraft can also be used; in that case, a gyro-based stabilized platform is recommended. As a consequence, the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the drawback is that the IMU-to-GPS-antenna lever arm is floating. An additional data stream, the movement values of the stabilizer, must therefore be processed to correct the floating lever arm distances. If the post-processing of the GPS/IMU data, taking the floating lever arms into account, delivers the expected result, the lever arm between the IMU and the camera can be applied.
However, there is a misalignment (boresight angle) that must be evaluated in a photogrammetric process using advanced tools, e.g. in Bingo. Once all these parameters have been determined, the system is suited for projects without ground control points or with only a few. But what effect does directly applying the achieved orientation values have on the photogrammetric process, compared with an aerial triangulation (AT) based on proper tie-point matching? The paper aims to show the steps to be done by potential users and provides a quality estimate of the importance and influence of the various calibration and adjustment steps.
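The rotation-matrix handling of lever arms mentioned above can be sketched as follows; the angle convention and lever-arm values are assumptions for illustration, not from the paper's setup:

```python
import numpy as np

def body_to_nav(roll, pitch, yaw):
    """Rotation matrix from the body frame to the navigation frame
    (Z-Y-X Euler convention, angles in radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

# Hypothetical lever arm from the IMU centre to the camera projection
# centre, measured in the body frame (metres); the attitude rotates it
# into the navigation frame before adding it to the GPS/IMU position.
lever_body = np.array([0.12, -0.05, 0.30])
R = body_to_nav(roll=0.0, pitch=0.0, yaw=np.pi / 2)
lever_nav = R @ lever_body   # a 90-degree yaw swaps/negates x and y
```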
Autocalibration of multiprojector CAVE-like immersive environments.
Sajadi, Behzad; Majumder, Aditi
2012-03-01
In this paper, we present the first method for the geometric autocalibration of multiple projectors on a set of CAVE-like immersive display surfaces including truncated domes and 4 or 5-wall CAVEs (three side walls, floor, and/or ceiling). All such surfaces can be categorized as swept surfaces and multiple projectors can be registered on them using a single uncalibrated camera without using any physical markers on the surface. Our method can also handle nonlinear distortion in the projectors, common in compact setups where a short throw lens is mounted on each projector. Further, when the whole swept surface is not visible from a single camera view, we can register the projectors using multiple pan and tilted views of the same camera. Thus, our method scales well with different size and resolution of the display. Since we recover the 3D shape of the display, we can achieve registration that is correct from any arbitrary viewpoint appropriate for head-tracked single-user virtual reality systems. We can also achieve wallpapered registration, more appropriate for multiuser collaborative explorations. Though much more immersive than common surfaces like planes and cylinders, general swept surfaces are used today only for niche display environments. Even the more popular 4 or 5-wall CAVE is treated as a piecewise planar surface for calibration purposes and hence projectors are not allowed to be overlapped across the corners. Our method opens up the possibility of using such swept surfaces to create more immersive VR systems without compromising the simplicity of having a completely automatic calibration technique. Such calibration allows completely arbitrary positioning of the projectors in a 5-wall CAVE, without respecting the corners.
Jordt, Anne; Zelenka, Claudius; von Deimling, Jens Schneider; Koch, Reinhard; Köser, Kevin
2015-12-05
Several acoustic and optical techniques have been used for characterizing natural and anthropogenic gas leaks (carbon dioxide, methane) from the ocean floor. Here, single-camera based methods for bubble stream observation have become an important tool, as they help estimating flux and bubble sizes under certain assumptions. However, they record only a projection of a bubble into the camera and therefore cannot capture the full 3D shape, which is particularly important for larger, non-spherical bubbles. The unknown distance of the bubble to the camera (making it appear larger or smaller than expected) as well as refraction at the camera interface introduce extra uncertainties. In this article, we introduce our wide baseline stereo-camera deep-sea sensor bubble box that overcomes these limitations, as it observes bubbles from two orthogonal directions using calibrated cameras. Besides the setup and the hardware of the system, we discuss appropriate calibration and the different automated processing steps deblurring, detection, tracking, and 3D fitting that are crucial to arrive at a 3D ellipsoidal shape and rise speed of each bubble. The obtained values for single bubbles can be aggregated into statistical bubble size distributions or fluxes for extrapolation based on diffusion and dissolution models and large scale acoustic surveys. We demonstrate and evaluate the wide baseline stereo measurement model using a controlled test setup with ground truth information.
ColorChecker at the beach: dangers of sunburn and glare
NASA Astrophysics Data System (ADS)
McCann, John
2014-01-01
In High-Dynamic-Range (HDR) imaging, optical veiling glare sets the limits of accurate scene information recorded by a camera. But, what happens at the beach? Here we have a Low-Dynamic-Range (LDR) scene with maximal glare. Can we calibrate a camera at the beach and not be burnt? We know that we need sunscreen and sunglasses, but what about our cameras? The effect of veiling glare is scene-dependent. When we compare RAW camera digits with spotmeter measurements we find significant differences. As well, these differences vary, depending on where we aim the camera. When we calibrate our camera at the beach we get data that is valid for only that part of that scene. Camera veiling glare is an issue in LDR scenes in uniform illumination with a shaded lens.
Error modeling and analysis of star cameras for a class of 1U spacecraft
NASA Astrophysics Data System (ADS)
Fowler, David M.
As spacecraft today become increasingly smaller, the demand for smaller components and sensors rises as well. The smartphone, a cutting-edge consumer technology, has an impressive collection of sensors and processing capabilities and may have the potential to fill this demand in the spacecraft market. If the technologies of a smartphone can be used in space, the cost of building miniature satellites would drop significantly and give a boost to the aerospace and scientific communities. Concentrating on the problem of spacecraft orientation, this study lays the groundwork for determining the capabilities of a smartphone camera acting as a star camera. Orientations determined from star images taken with a smartphone camera are compared to those of higher-quality cameras in order to determine the associated accuracies. The results of the study reveal the abilities of low-cost off-the-shelf imagers in space and give a starting point for future research in the field. The study began with a complete geometric calibration of each analyzed imager so that all comparisons start from the same base. After the cameras were calibrated, image processing techniques were introduced to correct for atmospheric, lens, and image sensor effects. Orientations for each test image are calculated by identifying the stars exposed in each image. Analyses of these orientations allow the overall errors of each camera to be defined and provide insight into the abilities of low-cost imagers.
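Orientation from identified stars is classically computed from two or more star direction vectors; a minimal sketch of the TRIAD method (one standard approach, not necessarily the one used in the study):

```python
import numpy as np

def triad(v1_b, v2_b, v1_i, v2_i):
    """TRIAD attitude determination: rotation from the inertial to the body
    frame, given two reference vectors (e.g. star directions) observed in
    the body frame (_b) and known in the inertial frame (_i)."""
    def frame(a, b):
        t1 = a / np.linalg.norm(a)
        t2 = np.cross(a, b)
        t2 = t2 / np.linalg.norm(t2)
        t3 = np.cross(t1, t2)
        return np.column_stack([t1, t2, t3])
    return frame(v1_b, v2_b) @ frame(v1_i, v2_i).T

# Consistency check: a pure 90-degree rotation about z is recovered
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
s1_i = np.array([1.0, 0.0, 0.0])    # hypothetical catalog star directions
s2_i = np.array([0.0, 1.0, 1.0])
R_est = triad(R_true @ s1_i, R_true @ s2_i, s1_i, s2_i)
```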
van Schaick, Willem; van Dooren, Bart T H; Mulder, Paul G H; Völker-Dieben, Hennie J M
2005-07-01
To report on the calibration of the Topcon SP-2000P specular microscope and the Endothelial Cell Analysis Module of the IMAGEnet 2000 software, and to establish the validity of the different endothelial cell density (ECD) assessment methods available in these instruments. Using an external microgrid, we calibrated the magnification of the SP-2000P and the IMAGEnet software. In both eyes of 36 volunteers, we validated four ECD assessment methods by comparing them to the gold standard, the manual ECD (manual counting of cells on a video print). These methods were: the estimated ECD, estimation of ECD with a reference grid on the camera screen; the SP-2000P ECD, pointing out whole contiguous cells on the camera screen; the uncorrected IMAGEnet ECD, using automatically drawn cell borders; and the corrected IMAGEnet ECD, with manual correction of incorrectly drawn cell borders in the automated analysis. The validity of each method was evaluated by calculating both the mean difference with the manual ECD and the limits of agreement as described by Bland and Altman. Preset factory values of magnification were incorrect, resulting in errors in ECD of up to 9%. All assessments except one of the estimated ECDs differed significantly from manual ECDs, with most differences being similar (≤ 6.5%), except for the uncorrected IMAGEnet ECD (30.2%). The corrected IMAGEnet ECD showed the narrowest limits of agreement (-4.9 to +19.3%). We advise checking the calibration of magnification in any specular microscope or endothelial analysis software, as it may be erroneous. The corrected IMAGEnet ECD is the most valid of the investigated methods in the Topcon SP-2000P/IMAGEnet 2000 combination.
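The mean difference and limits of agreement "as described by Bland and Altman" follow directly from the paired differences; a minimal sketch with hypothetical ECD readings:

```python
import numpy as np

def bland_altman(method, gold):
    """Mean difference (bias) and 95% limits of agreement between a
    candidate measurement method and a gold standard."""
    d = np.asarray(method, float) - np.asarray(gold, float)
    bias = d.mean()
    loa = 1.96 * d.std(ddof=1)      # 1.96 standard deviations of differences
    return bias, bias - loa, bias + loa

# Hypothetical paired ECD readings (cells/mm^2), not data from the study
manual = [2500, 2600, 2450, 2700, 2550]
candidate = [2520, 2640, 2460, 2750, 2580]
bias, lo, hi = bland_altman(candidate, manual)
```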
Autocalibration of a projector-camera system.
Okatani, Takayuki; Deguchi, Koichiro
2005-12-01
This paper presents a method for calibrating a projector-camera system that consists of multiple projectors (or multiple poses of a single projector), a camera, and a planar screen. We consider the problem of estimating the homography between the screen and the image plane of the camera or the screen-camera homography, in the case where there is no prior knowledge regarding the screen surface that enables the direct computation of the homography. It is assumed that the pose of each projector is unknown while its internal geometry is known. Subsequently, it is shown that the screen-camera homography can be determined from only the images projected by the projectors and then obtained by the camera, up to a transformation with four degrees of freedom. This transformation corresponds to arbitrariness in choosing a two-dimensional coordinate system on the screen surface and when this coordinate system is chosen in some manner, the screen-camera homography as well as the unknown poses of the projectors can be uniquely determined. A noniterative algorithm is presented, which computes the homography from three or more images. Several experimental results on synthetic as well as real images are shown to demonstrate the effectiveness of the method.
Plate refractive camera model and its applications
NASA Astrophysics Data System (ADS)
Huang, Longxiang; Zhao, Xu; Cai, Shen; Liu, Yuncai
2017-03-01
In real applications, a pinhole camera capturing objects through a planar parallel transparent plate is frequently employed. Due to the refractive effects of the plate, such an imaging system does not comply with the conventional pinhole camera model. Although the system is ubiquitous, it has not been thoroughly studied. This paper presents a simple virtual camera model, called a plate refractive camera model, which has a form similar to a pinhole camera model and can efficiently model refractions through a plate. The key idea is to employ a pixel-wise viewpoint concept to encode the refraction effects into a pixel-wise pinhole camera model. The proposed camera model enables an efficient forward projection computation and has several advantages in applications. First, the model helps to compute the caustic surface representing the changes of the camera viewpoints. Second, the model is well suited to analyzing and rectifying the image caustic distortion caused by the plate refraction effects. Third, the model can be used to calibrate the camera's intrinsic parameters without removing the plate. Finally, the model leads to plate refractive triangulation methods that solve the triangulation problem easily in multiple views. We verify our theory in both synthetic and real experiments.
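The refraction effect such a model must encode can be illustrated by the classical lateral shift of a ray through a parallel plate (Snell's law at both faces); the plate thickness and refractive index below are assumed values:

```python
import numpy as np

def plate_lateral_shift(theta_i, thickness, n_plate, n_outside=1.0):
    """Lateral displacement of a ray crossing a parallel plate; the ray
    exits parallel to its entry direction but displaced sideways."""
    theta_t = np.arcsin(n_outside * np.sin(theta_i) / n_plate)  # Snell's law
    return thickness * np.sin(theta_i - theta_t) / np.cos(theta_t)

# Hypothetical: a 10 mm glass plate (n = 1.5), ray incident at 30 degrees
shift = plate_lateral_shift(np.deg2rad(30.0), 10.0, 1.5)  # ~1.94 mm
```

The shift grows with the incidence angle, which is why the effective viewpoint varies per pixel and a single-pinhole model breaks down.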
The use of uncalibrated roadside CCTV cameras to estimate mean traffic speed
DOT National Transportation Integrated Search
2001-12-01
In this report, we present a novel approach for estimating traffic speed using a sequence of images from an un-calibrated camera. We assert that exact calibration is not necessary to estimate speed. Instead, to estimate speed, we use: (1) geometric r...
Calibration method of microgrid polarimeters with image interpolation.
Chen, Zhenyue; Wang, Xia; Liang, Rongguang
2015-02-10
Microgrid polarimeters have large advantages over conventional polarimeters because of their snapshot nature and because they have no moving parts. However, they also suffer from several error sources, such as fixed pattern noise (FPN), photo response nonuniformity (PRNU), pixel cross talk, and instantaneous field-of-view (IFOV) error. A characterization method is proposed to improve the measurement accuracy in the visible waveband. We first calibrate the camera under uniform illumination so that the response of the sensor is uniform over the entire field of view, free of IFOV error. Then a spline interpolation method is implemented to minimize the IFOV error. Experimental results show the proposed method can effectively minimize the FPN and PRNU.
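The superpixel structure behind the IFOV error can be sketched as follows; the 2x2 polarizer layout is an assumption (real microgrid layouts vary), and a spline interpolation of each channel, as in the paper, would replace the naive per-superpixel sampling shown here:

```python
import numpy as np

def stokes_from_microgrid(img):
    """Per-superpixel Stokes parameters from a microgrid image whose 2x2
    cells carry polarizers at (0, 45) / (135, 90) degrees. Because the
    four channels sample slightly different scene points, differencing
    them directly introduces IFOV error."""
    i0   = img[0::2, 0::2].astype(float)
    i45  = img[0::2, 1::2].astype(float)
    i135 = img[1::2, 0::2].astype(float)
    i90  = img[1::2, 1::2].astype(float)
    s0 = 0.5 * (i0 + i45 + i90 + i135)     # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    return s0, s1, s2, dolp

# Uniform scene fully polarized at 0 degrees: DoLP should be 1
img = np.tile(np.array([[2.0, 1.0], [1.0, 0.0]]), (4, 4))
s0, s1, s2, dolp = stokes_from_microgrid(img)
```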
Benedetti, L. R.; Holder, J. P.; Perkins, M.; ...
2016-02-26
We describe an experimental method to measure the gate profile of an x-ray framing camera and to determine several important functional parameters: relative gain (between strips), relative gain droop (within each strip), gate propagation velocity, gate width, and actual inter-strip timing. Several of these parameters cannot be measured accurately by any other technique. This method is then used to document cross talk-induced gain variations and artifacts created by radiation that arrives before the framing camera is actively amplifying x-rays. Electromagnetic cross talk can cause relative gains to vary significantly as inter-strip timing is varied. This imposes a stringent requirement for gain calibration. If radiation arrives before a framing camera is triggered, it can cause an artifact that manifests as a high-intensity, spatially varying background signal. Furthermore, we have developed a device that can be added to the framing camera head to prevent these artifacts.
Application of IR imaging for free-surface velocity measurement in liquid-metal systems
Hvasta, M. G.; Kolemen, E.; Fisher, A.
2017-01-05
Measuring free-surface, liquid-metal flow velocity is challenging to do in a reliable and accurate manner. This paper presents a non-invasive, easily calibrated method of measuring the surface velocities of open-channel liquid-metal flows using an IR camera. Unlike other spatially limited methods, this IR camera particle tracking technique provides full field-of-view data that can be used to better understand open-channel flows and determine surface boundary conditions. Lastly, this method could be implemented and automated for a wide range of liquid-metal experiments, even if they operate at high-temperatures or within strong magnetic fields.
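The particle-tracking core of such a method is displacement estimation between successive frames; a minimal FFT cross-correlation sketch with a synthetic tracer (not the authors' implementation):

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Integer-pixel displacement of frame_a relative to frame_b, from the
    peak of their FFT-based circular cross-correlation."""
    A = np.fft.fft2(frame_a - frame_a.mean())
    B = np.fft.fft2(frame_b - frame_b.mean())
    corr = np.fft.ifft2(A * np.conj(B)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts beyond half the patch size back to negative values
    dy = dy - corr.shape[0] if dy > corr.shape[0] // 2 else dy
    dx = dx - corr.shape[1] if dx > corr.shape[1] // 2 else dx
    return dy, dx

# Synthetic IR frames: a bright surface tracer moves by (2, 3) pixels
a = np.zeros((32, 32)); a[10:13, 8:11] = 1.0
b = np.zeros((32, 32)); b[12:15, 11:14] = 1.0
dy, dx = estimate_shift(b, a)   # surface velocity = shift / frame interval
```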
Bacterial contamination monitor
NASA Technical Reports Server (NTRS)
Rich, E.; Macleod, N. H.
1973-01-01
Economical, simple, and fast method uses apparatus which detects bacteria by photography. Apparatus contains camera, film assembly, calibrated light bulb, opaque plastic plate with built-in reflecting surface and transparent window section, opaque slide, plate with chemical packages, and cover containing roller attached to handle.
Virtual viewpoint synthesis in multi-view video system
NASA Astrophysics Data System (ADS)
Li, Fang; Yang, Shiqiang
2005-07-01
In this paper, we present a virtual viewpoint video synthesis algorithm designed to satisfy three aims: low computational cost, real-time interpolation, and acceptable video quality. In contrast with previous techniques, this method obtains incomplete 3D structure from neighboring video sources instead of recovering full 3D information from all video sources, so the computation is greatly reduced. This allows us to demonstrate our interactive multi-view video synthesis algorithm on a personal computer. Furthermore, by choosing feature points to build correspondences between frames captured by neighboring cameras, we do not require camera calibration. Finally, our method can be used when the angle between neighboring cameras is 25-30 degrees, much larger than in common computer vision experiments. In this way, our method can be applied in many applications such as live sports broadcasting and video conferencing.
Pose Self-Calibration of Stereo Vision Systems for Autonomous Vehicle Applications.
Musleh, Basam; Martín, David; Armingol, José María; de la Escalera, Arturo
2016-09-14
Nowadays, intelligent systems applied to vehicles have grown very rapidly; their goal is not only the improvement of safety, but also making autonomous driving possible. Many of these intelligent systems are based on making use of computer vision in order to know the environment and act accordingly. It is of great importance to be able to estimate the pose of the vision system because the measurement matching between the perception system (pixels) and the vehicle environment (meters) depends on the relative position between the perception system and the environment. A new method of camera pose estimation for stereo systems is presented in this paper, whose main contribution regarding the state of the art on the subject is the estimation of the pitch angle without being affected by the roll angle. The validation of the self-calibration method is accomplished by comparing it with relevant methods of camera pose estimation, where a synthetic sequence is used in order to measure the continuous error with a ground truth. This validation is enriched by the experimental results of the method in real traffic environments.
A Spectralon BRF Data Base for MISR Calibration Application
NASA Technical Reports Server (NTRS)
Bruegge, C.; Chrien, N.; Haner, D.
1999-01-01
The Multi-angle Imaging SpectroRadiometer (MISR) is an Earth observing sensor which will provide global retrievals of aerosols, clouds, and land surface parameters. Instrument specifications require high accuracy absolute calibration, as well as accurate camera-to-camera, band-to-band and pixel-to-pixel relative response determinations.
Camera calibration correction in shape from inconsistent silhouette
USDA-ARS?s Scientific Manuscript database
The use of shape from silhouette for reconstruction tasks is plagued by two types of real-world errors: camera calibration error and silhouette segmentation error. When either error is present, we call the problem the Shape from Inconsistent Silhouette (SfIS) problem. In this paper, we show how sm...
The calibration of video cameras for quantitative measurements
NASA Technical Reports Server (NTRS)
Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.
1993-01-01
Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.
A versatile calibration procedure for portable coded aperture gamma cameras and RGB-D sensors
NASA Astrophysics Data System (ADS)
Paradiso, V.; Crivellaro, A.; Amgarou, K.; de Lanaute, N. Blanc; Fua, P.; Liénard, E.
2018-04-01
The present paper proposes a versatile procedure for the geometrical calibration of coded aperture gamma cameras and RGB-D depth sensors, using only one radioactive point source and a simple experimental set-up. Calibration data is then used for accurately aligning radiation images retrieved by means of the γ-camera with the respective depth images computed with the RGB-D sensor. The system resulting from such a combination is thus able to retrieve, automatically, the distance of radioactive hotspots by means of pixel-wise mapping between gamma and depth images. This procedure is of great interest for a wide number of applications, ranging from precise automatic estimation of the shape and distance of radioactive objects to Augmented Reality systems. Incidentally, the corresponding results validated the choice of a perspective design model for a coded aperture γ-camera.
Updates to Post-Flash Calibration for the Advanced Camera for Surveys Wide Field Channel
NASA Astrophysics Data System (ADS)
Miles, Nathan
2018-03-01
This report presents a new technique for generating the post-flash calibration reference file for the Advanced Camera for Surveys (ACS) Wide Field Channel (WFC). The new method substantially reduces, if not eliminates altogether, the dark current artifacts arising from improper dark subtraction, while simultaneously preserving flat-field artifacts. The stability of the post-flash calibration reference file over time is measured using data taken yearly since 2012, and no statistically significant deviations are found. An analysis of all short-flashed darks taken every two days since January 2015 reveals a periodic modulation of the LED intensity on timescales of about one year. This effect is most readily explained by changes to the local temperature in the area surrounding the LED. However, a slight offset between the periods of the temperature and LED modulations leaves open the possibility that the effect is a chance observation of the two sinusoids at an unfortunate point in their beat cycle.
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles.
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F
2016-09-16
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents mostly depends on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor: a deep integration of a monocular camera and a 2D laser rangefinder with which the motion of the MAV is estimated. This realization is expected to be more flexible in terms of environments compared to laser-scan-matching approaches. The estimated ego-motion is then integrated into the MAV's navigation system. First, however, the relative pose between the two sensors is obtained with a proposed, improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, the covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results.
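A true P3P solver involves the roots of a quartic, but the underlying 3D-to-2D correspondence machinery can be illustrated with the simpler Direct Linear Transform (DLT), which estimates a 3×4 projection matrix from six or more correspondences. This is a hedged numpy sketch with synthetic data, not the paper's method:

```python
import numpy as np

def dlt_projection(X, x):
    """Estimate the 3x4 projection matrix from >=6 exact 3D-2D pairs (DLT)."""
    A = []
    for Xw, u in zip(X, x):
        Xh = np.append(Xw, 1.0)
        # Two linear constraints per correspondence from x ~ P Xh
        A.append(np.concatenate([np.zeros(4), -Xh, u[1] * Xh]))
        A.append(np.concatenate([Xh, np.zeros(4), -u[0] * Xh]))
    _, _, vt = np.linalg.svd(np.asarray(A))
    return vt[-1].reshape(3, 4)   # null vector = stacked rows of P

def project(P, X):
    Xh = np.hstack([X, np.ones((len(X), 1))])
    p = (P @ Xh.T).T
    return p[:, :2] / p[:, 2:]

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (10, 3)) + [0, 0, 5]              # points in front of camera
P_true = np.hstack([np.eye(3), [[0.1], [0.0], [0.0]]])   # small translation
x = project(P_true, X)

P_est = dlt_projection(X, x)
err = np.abs(project(P_est, X) - x).max()
print(err < 1e-6)  # prints True
```

In practice (and in the paper) the intrinsics are known, so only three points are needed and the minimal P3P formulation is preferred; DLT is shown here only because it fits in a few lines.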
Automatic Spatio-Temporal Flow Velocity Measurement in Small Rivers Using Thermal Image Sequences
NASA Astrophysics Data System (ADS)
Lin, D.; Eltner, A.; Sardemann, H.; Maas, H.-G.
2018-05-01
An automatic spatio-temporal flow velocity measurement approach using an uncooled thermal camera is proposed in this paper. The basic principle of the method is to track visible thermal features at the water surface in thermal camera image sequences. Radiometric and geometric calibrations are first performed to remove vignetting effects in the thermal imagery and to obtain the interior orientation parameters of the camera. An object-based unsupervised classification approach is then applied to detect the regions of interest for data referencing and thermal feature tracking. Subsequently, GCPs are extracted to orient the river image sequences and local hot points are identified as tracking features. Afterwards, accurate dense tracking outputs are obtained using the pyramidal Lucas-Kanade method. To validate the accuracy potential of the method, measurements obtained from thermal feature tracking are compared with reference measurements taken by a propeller gauge. The results show a great potential for automatic flow velocity measurement in small rivers using imagery from a thermal camera.
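The tracking step above relies on pyramidal Lucas-Kanade; the core of each pyramid level is a small least-squares problem on image gradients. The following is a self-contained, single-level sketch on a synthetic half-pixel shift; the window size and test pattern are arbitrary choices, not taken from the paper:

```python
import numpy as np

def lucas_kanade(img0, img1, pt, win=7):
    """Single-level Lucas-Kanade flow (dx, dy) at integer point pt=(row, col)."""
    r, c = pt
    h = win // 2
    # Take a (win+2)-sized patch so gradients are valid on the inner win x win core.
    I0 = img0[r-h-1:r+h+2, c-h-1:c+h+2].astype(float)
    I1 = img1[r-h-1:r+h+2, c-h-1:c+h+2].astype(float)
    Iy, Ix = np.gradient(I0)             # central differences
    It = I1 - I0
    Ix, Iy, It = (a[1:-1, 1:-1].ravel() for a in (Ix, Iy, It))
    A = np.column_stack([Ix, Iy])
    d, *_ = np.linalg.lstsq(A, -It, rcond=None)   # solve A d = -It
    return d

# Synthetic image pair: the pattern moves 0.5 pixel to the right.
ys, xs = np.mgrid[0:64, 0:64].astype(float)
pattern = lambda x, y: np.sin(0.4 * x) * np.cos(0.3 * y)
img0 = pattern(xs, ys)
img1 = pattern(xs - 0.5, ys)

dx, dy = lucas_kanade(img0, img1, (30, 30))
```

The pyramidal variant used by the authors wraps this solve in a coarse-to-fine loop so that displacements larger than about one pixel can still be recovered.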
Spatio-temporal alignment of multiple sensors
NASA Astrophysics Data System (ADS)
Zhang, Tinghua; Ni, Guoqiang; Fan, Guihua; Sun, Huayan; Yang, Biao
2018-01-01
Aiming to achieve the spatio-temporal alignment of multiple sensors on the same platform for space target observation, a joint spatio-temporal alignment method is proposed. To calibrate the camera parameters and measure camera attitude, an astronomical calibration method is proposed based on star chart simulation and on collinear invariant features of quadrilateral diagonals in the observed star chart. To satisfy temporal correspondence and spatial alignment similarity simultaneously, the method, building on this astronomical calibration and attitude measurement, folds the spatial and temporal alignment of the video into a joint alignment framework. The method is reinforced by exploiting the similarities and prior knowledge of the velocity vector field between adjacent frames, which is calculated with the SIFT Flow algorithm. The proposed method provides the highest spatio-temporal alignment accuracy compared to state-of-the-art methods on sequences recorded from multiple sensors at different times.
The suitability of lightfield camera depth maps for coordinate measurement applications
NASA Astrophysics Data System (ADS)
Rangappa, Shreedhar; Tailor, Mitul; Petzing, Jon; Kinnell, Peter; Jackson, Michael
2015-12-01
Plenoptic cameras can capture 3D information in one exposure without the need for structured illumination, allowing grey-scale depth maps of the captured image to be created. The Lytro, a consumer-grade plenoptic camera, provides a cost-effective method of measuring the depth of multiple objects under controlled lighting conditions. In this research, camera control variables, environmental sensitivity, image distortion characteristics, and the effective working range of two first-generation Lytro cameras were evaluated. In addition, a calibration process was created for the Lytro cameras to deliver three-dimensional output depth maps expressed in SI units (metres). The results show depth accuracy of +10.0 mm to -20.0 mm and depth repeatability of 0.5 mm. For the lateral X and Y coordinates, the accuracy was +1.56 μm to -2.59 μm and the repeatability was 0.25 μm.
Psychophysical Calibration of Mobile Touch-Screens for Vision Testing in the Field
NASA Technical Reports Server (NTRS)
Mulligan, Jeffrey B.
2015-01-01
The now ubiquitous nature of touch-screen displays in cell phones and tablet computers makes them an attractive option for vision testing outside of the laboratory or clinic. Accurate measurement of parameters such as contrast sensitivity, however, requires precise control of absolute and relative screen luminances. The nonlinearity of the display response (gamma) can be measured or checked using a minimum motion technique similar to that developed by Anstis and Cavanagh (1983) for the determination of isoluminance. While the relative luminances of the color primaries vary between subjects (due to factors such as individual differences in pre-retinal pigment densities), the gamma nonlinearity can be checked in the lab using a photometer. Here we compare results obtained using the psychophysical method with physical measurements for a number of different devices. In addition, we present a novel physical method using the device's built-in front-facing camera in conjunction with a mirror to jointly calibrate the camera and display. A high degree of consistency between devices is found, but some departures from ideal performance are observed. In spite of this, the effects of calibration errors and display artifacts on estimates of contrast sensitivity are found to be small.
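The psychophysical minimum-motion procedure itself is not reproduced here, but the quantity it checks, display gamma, is easy to state numerically: fit the exponent of L = L_max·(v/v_max)^γ from drive levels and measured luminances. A hedged sketch on synthetic photometer readings (the gamma value and luminance scale are invented):

```python
import numpy as np

def estimate_gamma(levels, luminance):
    """Fit L = L_max * (v / v_max)^gamma as a straight line in log-log space."""
    v = levels / levels.max()
    L = luminance / luminance.max()
    mask = (v > 0) & (L > 0)
    slope, _intercept = np.polyfit(np.log(v[mask]), np.log(L[mask]), 1)
    return slope

# Synthetic display response: gamma 2.2, peak luminance 300 cd/m^2
levels = np.arange(1, 256, dtype=float)
lum = 300.0 * (levels / 255.0) ** 2.2

g = estimate_gamma(levels, lum)
print(round(g, 2))  # -> 2.2
```

A real display departs from a pure power law near black, which is one reason a perceptual check such as the minimum-motion technique is a useful complement to a photometer fit.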
Model of an optical system's influence on sensitivity of microbolometric focal plane array
NASA Astrophysics Data System (ADS)
Gogler, Sławomir; Bieszczad, Grzegorz; Zarzycka, Alicja; Szymańska, Magdalena; Sosnowski, Tomasz
2012-10-01
Thermal imagers, and the infrared array sensors used in them, are subject to a calibration procedure during manufacturing in which their voltage sensitivity to incident radiation is evaluated. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not held to such elevated standards, it is still important that the image faithfully represent temperature variations across the scene. The detectors in a thermal camera are illuminated by infrared radiation transmitted through a specialized optical system, and each optical system influences the irradiation distribution across the sensor array. In this article, a model is proposed that describes the irradiation distribution across an array sensor working with the optical system used in the calibration set-up; the model accounts for the optical and geometrical properties of the array set-up. By means of Monte Carlo simulation, a large number of rays was traced to the sensor plane, which allowed the irradiation distribution across the image plane to be determined for different aperture-limiting configurations. The simulated results are compared with a proposed analytical expression. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
NASA Astrophysics Data System (ADS)
Gliss, Jonas; Stebel, Kerstin; Kylling, Arve; Solvejg Dinger, Anna; Sihler, Holger; Sudbø, Aasmund
2017-04-01
UV SO2 cameras have become a common method for monitoring SO2 emission rates from volcanoes. Scattered solar UV radiation is measured in two wavelength windows, typically around 310 nm and 330 nm (distinct / weak SO2 absorption), using interference filters. The data analysis comprises the retrieval of plume background intensities (to calculate plume optical densities), the camera calibration (to convert optical densities into SO2 column densities), the retrieval of gas velocities within the plume, and the retrieval of plume distances. SO2 emission rates are then typically retrieved along a projected plume cross section, for instance a straight line perpendicular to the plume propagation direction. Today, for most of the required analysis steps, several alternatives exist due to ongoing developments and improvements of the measurement technique. We present piscope, a cross-platform, open-source software toolbox for the analysis of UV SO2 camera data. The code is written in the Python programming language and emerged from the idea of a common analysis platform incorporating a selection of the most prevalent methods found in the literature. piscope includes several routines for plume background retrieval and routines for cell- and DOAS-based camera calibration, including two individual methods to identify the DOAS field of view (shape and position) within the camera images. Gas velocities can be retrieved either based on an optical flow analysis or using signal cross-correlation. A correction for signal dilution (due to atmospheric scattering) can be performed based on topographic features in the images; this requires distance retrievals to the topographic features used for the correction. These distances can be retrieved automatically, on a per-pixel basis, from intersections of individual pixel viewing directions with the local topography. The main features of piscope are presented based on a dataset recorded at Mt. Etna, Italy in September 2015.
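The cross-correlation velocity retrieval mentioned above boils down to extracting intensity time series at two plume cross sections, finding the time lag that maximizes their correlation, and dividing the known distance between the cross sections by that lag. A minimal numpy sketch with invented signal shapes and numbers (not piscope code):

```python
import numpy as np

def lag_samples(a, b):
    """Lag (in samples) of series b relative to a, via full cross-correlation."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

t = np.arange(500.0)
# A plume puff crosses the upstream line at t=100 and the downstream line at t=140.
upstream = np.exp(-((t - 100.0) / 20.0) ** 2)
downstream = np.exp(-((t - 140.0) / 20.0) ** 2)

lag = lag_samples(upstream, downstream)
dt, line_distance_m = 0.5, 80.0      # seconds per sample, metres between lines
velocity = line_distance_m / (lag * dt)
print(lag, velocity)  # -> 40 4.0
```

The optical-flow alternative estimates a dense velocity field per frame pair instead of a single bulk lag, at the price of being more sensitive to contrast and illumination.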
NASA Astrophysics Data System (ADS)
Chibunichev, A. G.; Kurkov, V. M.; Smirnov, A. V.; Govorov, A. V.; Mikhalin, V. A.
2016-10-01
Nowadays, aerial survey technology using systems based on unmanned aerial vehicles (UAVs) is becoming more popular. UAVs cannot physically carry professional aerial cameras, so consumer digital cameras are used instead. Such cameras usually have a rolling, lamellar or global shutter. Quite often the manufacturers and users of such aerial systems do not perform camera calibration; self-calibration techniques are used instead. However, this approach is not supported by extensive theoretical and practical research. In this paper we compare the results of phototriangulation based on laboratory, test-field and self-calibration. For the investigation we use the Zaoksky test area as an experimental field, which provides a dense network of targeted and natural control points. Racurs PHOTOMOD and Agisoft PhotoScan software were used in the evaluation. The results of the investigation, conclusions and practical recommendations are presented in this article.
Bore-sight calibration of the profile laser scanner using a large size exterior calibration field
NASA Astrophysics Data System (ADS)
Koska, Bronislav; Křemen, Tomáš; Štroner, Martin
2014-10-01
The bore-sight calibration procedure and results for a profile laser scanner, using a large exterior calibration field, are presented in this paper. The task is part of the Autonomous Mapping Airship (AMA) project, which aims to create a surveying system suitable for effective surveying of medium-sized areas (units to tens of square kilometers per day). As the project name suggests, an airship is used as the carrier. This vehicle has some specific properties, the most important being a high carrying capacity (15 kg), long flight time (3 hours), high operating safety and special flight characteristics such as stability of flight, in terms of vibrations, and the ability to fly at low speed. The high carrying capacity enables the use of high-quality sensors such as a professional infrared (IR) camera (FLIR SC645), a high-end visible-spectrum (VIS) digital camera and optics, a tactical-grade INS/GPS sensor (iMAR iTracerRT-F200) and a profile laser scanner (SICK LD-LRS1000). The calibration method is based on direct laboratory measurement of the coordinate offsets (lever-arm) and in-flight determination of the rotation offsets (bore-sights). The bore-sight determination is based on minimizing the squares of the individual point distances from measured planar surfaces.
Free LittleDog!: Towards Completely Untethered Operation of the LittleDog Quadruped
2007-08-01
helpful Intel Open Source Computer Vision (OpenCV) library [4] wherever possible rather than reimplementing many of the standard algorithms, however... correspondences between image points and world points, and feeding these to a camera calibration function, such as that provided by OpenCV, allows one to solve... OpenCV calibration function to that used for intrinsic calibration solves for T_board→camera_i.
Flight Calibration of the LROC Narrow Angle Camera
NASA Astrophysics Data System (ADS)
Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.
2016-04-01
Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ~1 meter size objects, and photometry respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000, but a signal-dependent additive correction is required and applied for DN < 600. A predictive model of detector temperature and dark level was developed to command dark level offset. This avoids images with a cutoff at DN = 0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
Medical-grade Sterilizable Target for Fluid-immersed Fetoscope Optical Distortion Calibration.
Nikitichev, Daniil I; Shakir, Dzhoshkun I; Chadebecq, François; Tella, Marcel; Deprest, Jan; Stoyanov, Danail; Ourselin, Sébastien; Vercauteren, Tom
2017-02-23
We have developed a calibration target for use with fluid-immersed endoscopes within the context of the GIFT-Surg (Guided Instrumentation for Fetal Therapy and Surgery) project. One of the aims of this project is to engineer novel, real-time image processing methods for intra-operative use in the treatment of congenital birth defects, such as spina bifida and the twin-to-twin transfusion syndrome. The developed target allows for the sterility-preserving optical distortion calibration of endoscopes within a few minutes. Good optical distortion calibration and compensation are important for mitigating undesirable effects like radial distortions, which not only hamper accurate imaging using existing endoscopic technology during fetal surgery, but also make acquired images less suitable for potentially very useful image computing applications, like real-time mosaicing. This paper proposes a novel fabrication method to create an affordable, sterilizable calibration target suitable for use in a clinical setup. The method involves etching a calibration pattern by laser-cutting a sandblasted stainless steel sheet. The target was validated using the camera calibration module provided by OpenCV, a state-of-the-art software library popular in the computer vision community.
Analysis of Brown camera distortion model
NASA Astrophysics Data System (ADS)
Nowakowski, Artur; Skarbek, Władysław
2013-10-01
Contemporary image acquisition devices introduce optical distortion into images, resulting in pixel displacement that needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze the orthogonality of its decentering distortion component with regard to radius. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of distortion parameter estimation is evaluated.
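In its common two-radial-term form, the Brown model maps normalized image coordinates as x_d = x(1 + k1·r² + k2·r⁴) + 2p1·xy + p2·(r² + 2x²), and symmetrically for y. A direct numpy transcription of that formula; the coefficient values are arbitrary examples, not calibration results:

```python
import numpy as np

def brown_distort(xy, k1, k2, p1, p2):
    """Apply Brown radial (k1, k2) + decentering (p1, p2) distortion
    to normalized image coordinates xy (N x 2)."""
    x, y = xy[:, 0], xy[:, 1]
    r2 = x ** 2 + y ** 2
    radial = 1 + k1 * r2 + k2 * r2 ** 2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x ** 2)
    yd = y * radial + p1 * (r2 + 2 * y ** 2) + 2 * p2 * x * y
    return np.column_stack([xd, yd])

pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.3, 0.2]])
out = brown_distort(pts, k1=-0.2, k2=0.05, p1=0.001, p2=-0.001)
# The principal point (0, 0) is a fixed point of the model.
print(out[0])  # -> [0. 0.]
```

Undistortion has no closed form; calibration toolboxes invert this map iteratively, which is one reason the parameter-stability question studied in the paper matters.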
RGB-D SLAM Based on Extended Bundle Adjustment with 2D and 3D Information
Di, Kaichang; Zhao, Qiang; Wan, Wenhui; Wang, Yexin; Gao, Yunjun
2016-01-01
In studies of the SLAM problem using an RGB-D camera, depth information and visual information, as the two types of primary measurement data, are rarely tightly coupled during refinement of the camera pose estimation. In this paper, a new method of RGB-D camera SLAM is proposed based on extended bundle adjustment with integrated 2D and 3D information on the basis of a new projection model. First, the geometric relationship between the image plane coordinates and the depth values is constructed through RGB-D camera calibration. Then, 2D and 3D feature points are automatically extracted and matched between consecutive frames to build a continuous image network. Finally, extended bundle adjustment based on the new projection model, which takes both image and depth measurements into consideration, is applied to the image network for high-precision pose estimation. Field experiments show that the proposed method performs notably better than the traditional method, and the experimental results demonstrate its effectiveness in improving localization accuracy. PMID:27529256
Corner detection and sorting method based on improved Harris algorithm in camera calibration
NASA Astrophysics Data System (ADS)
Xiao, Ying; Wang, Yonghong; Dan, Xizuo; Huang, Anqi; Hu, Yue; Yang, Lianxiang
2016-11-01
In the traditional Harris corner detection algorithm, the threshold used to eliminate false corners is selected manually. In order to detect corners automatically, an improved algorithm that combines Harris with the circular boundary theory of corners is proposed in this paper. After accurate corner coordinates are detected using the Harris and Forstner algorithms, false corners within the chessboard pattern of the calibration plate can be eliminated automatically using circular boundary theory. Moreover, a corner sorting method based on an improved calibration plate is proposed to eliminate false background corners and sort the remaining corners in order. Experimental results show that the proposed algorithms can eliminate all false corners and sort the remaining corners correctly and automatically.
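The Harris response that such thresholds are applied to is R = det(M) − k·trace(M)², where M is the windowed second-moment matrix of the image gradients. A compact, unoptimized numpy sketch using a box window instead of the usual Gaussian (window size, k, and the test image are arbitrary choices):

```python
import numpy as np

def harris_response(img, k=0.05, win=3):
    """Harris response R = det(M) - k*trace(M)^2 with a box window of size win."""
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    h = win // 2
    R = np.zeros_like(img, dtype=float)
    for r in range(h, img.shape[0] - h):
        for c in range(h, img.shape[1] - h):
            Sxx = Ixx[r-h:r+h+1, c-h:c+h+1].sum()
            Syy = Iyy[r-h:r+h+1, c-h:c+h+1].sum()
            Sxy = Ixy[r-h:r+h+1, c-h:c+h+1].sum()
            R[r, c] = (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
    return R

# A white square on black: the response peaks near the square's four corners,
# is negative along its edges, and is zero in flat regions.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
R = harris_response(img)
peak = tuple(int(v) for v in np.unravel_index(np.argmax(R), R.shape))
```

The paper's contribution is precisely to avoid hand-picking the cutoff applied to R, by cross-checking candidate maxima against a circular-boundary corner model.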
NASA Astrophysics Data System (ADS)
Gauthier, L. R.; Jansen, M. E.; Meyer, J. R.
2014-09-01
Camera motion is a potential problem when a video camera is used to perform dynamic displacement measurements. If the scene camera moves at the wrong time, the apparent motion of the object under study can easily be confused with the real motion of the object. In some cases, it is practically impossible to prevent camera motion, as for instance, when a camera is used outdoors in windy conditions. A method to address this challenge is described that provides an objective means to measure the displacement of an object of interest in the scene, even when the camera itself is moving in an unpredictable fashion at the same time. The main idea is to synchronously measure the motion of the camera and to use those data ex post facto to subtract out the apparent motion in the scene that is caused by the camera motion. The motion of the scene camera is measured by using a reference camera that is rigidly attached to the scene camera and oriented towards a stationary reference object. For instance, this reference object may be on the ground, which is known to be stationary. It is necessary to calibrate the reference camera by simultaneously measuring the scene images and the reference images at times when it is known that the scene object is stationary and the camera is moving. These data are used to map camera movement data to apparent scene movement data in pixel space and subsequently used to remove the camera movement from the scene measurements.
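The ex post facto correction described above amounts to two phases: during calibration, with the scene object known to be static, fit a map from reference-camera pixel motion to apparent scene motion; during measurement, subtract the mapped camera motion from the scene measurement. A least-squares sketch assuming the map is linear (the abstract does not specify its form, and all numbers here are synthetic):

```python
import numpy as np

# --- Calibration phase: scene object static, camera moving ---
rng = np.random.default_rng(2)
cam_disp = rng.normal(0.0, 2.0, (50, 2))           # reference-camera displacements (px)
A_true = np.array([[1.10, 0.00],
                   [0.05, 0.95]])                  # unknown pixel-space coupling
scene_apparent = cam_disp @ A_true.T               # apparent scene motion (scene is static)

# Fit the coupling matrix by linear least squares.
A_fit, *_ = np.linalg.lstsq(cam_disp, scene_apparent, rcond=None)
A_fit = A_fit.T

# --- Measurement phase: real object motion plus camera-induced motion ---
real = np.array([3.0, -1.5])                       # true object displacement (px)
cam = np.array([0.8, 0.4])                         # simultaneous camera displacement
measured = real + A_true @ cam                     # what the scene camera reports
corrected = measured - A_fit @ cam                 # camera motion subtracted out
```

With exact calibration data the fit recovers the coupling matrix and `corrected` equals the true displacement; with noisy data the residual error scales with the fit quality.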
Development of infrared scene projectors for testing fire-fighter cameras
NASA Astrophysics Data System (ADS)
Neira, Jorge E.; Rice, Joseph P.; Amon, Francine K.
2008-04-01
We have developed two types of infrared scene projectors for hardware-in-the-loop testing of thermal imaging cameras such as those used by fire-fighters. In one, direct projection, images are projected directly into the camera. In the other, indirect projection, images are projected onto a diffuse screen, which is then viewed by the camera. Both projectors use a digital micromirror array as the spatial light modulator, in the form of a Micromirror Array Projection System (MAPS) engine having resolution of 800 x 600 with aluminum-coated mirrors on a 17 micrometer pitch and a ZnSe protective window. Fire-fighter cameras are often based upon uncooled microbolometer arrays and typically have resolutions of 320 x 240 or lower. For direct projection, we use an argon-arc source, which provides spectral radiance equivalent to a 10,000 Kelvin blackbody over the 7 micrometer to 14 micrometer wavelength range, to illuminate the micromirror array. For indirect projection, an expanded 4 watt CO2 laser beam at a wavelength of 10.6 micrometers illuminates the micromirror array and the scene formed by the first-order diffracted light from the array is projected onto a diffuse aluminum screen. In both projectors, a well-calibrated reference camera is used to provide non-uniformity correction and brightness calibration of the projected scenes, and the fire-fighter cameras alternately view the same scenes. In this paper, we compare the two methods for this application and report on our quantitative results. Indirect projection has the advantage of more easily filling the wide field of view of the fire-fighter cameras, which is typically about 50 degrees. Direct projection utilizes the available light more efficiently, which will become important in emerging multispectral and hyperspectral applications.
A novel camera localization system for extending three-dimensional digital image correlation
NASA Astrophysics Data System (ADS)
Sabato, Alessandro; Reddy, Narasimha; Khan, Sameer; Niezrecki, Christopher
2018-03-01
The monitoring of civil, mechanical, and aerospace structures is important especially as these systems approach or surpass their design life. Often, Structural Health Monitoring (SHM) relies on sensing techniques for condition assessment. Advancements achieved in camera technology and optical sensors have made three-dimensional (3D) Digital Image Correlation (DIC) a valid technique for extracting structural deformations and geometry profiles. Prior to making stereophotogrammetry measurements, a calibration has to be performed to obtain the vision system's extrinsic and intrinsic parameters. This means that the position of the cameras relative to each other (i.e. separation distance, camera angles, etc.) must be determined. Typically, cameras are placed on a rigid bar to prevent any relative motion between the cameras. This constraint limits the utility of the 3D-DIC technique, especially as it is applied to monitor large-sized structures and from various fields of view. In this preliminary study, the design of a multi-sensor system is proposed to extend 3D-DIC's capability and allow for easier calibration and measurement. The suggested system relies on a MEMS-based Inertial Measurement Unit (IMU) and a 77 GHz radar sensor for measuring the orientation and relative distance of the stereo cameras. The feasibility of the proposed combined IMU-radar system is evaluated through laboratory tests, demonstrating its ability in determining the cameras' position in space for performing accurate 3D-DIC calibration and measurements.
Geometric Calibration of Full Spherical Panoramic Ricoh-Theta Camera
NASA Astrophysics Data System (ADS)
Aghayari, S.; Saadatseresht, M.; Omidalizarandi, M.; Neumann, I.
2017-05-01
A novel calibration process for the RICOH-THETA, a full-view fisheye camera, is proposed; such a camera has numerous applications as a low-cost sensor in disciplines such as photogrammetry, robotics and machine vision. Ricoh developed this camera in 2014; it consists of two lenses and is able to capture the whole surrounding environment in one shot. In this research, each lens is calibrated separately and the interior/relative orientation parameters (IOPs and ROPs) of the camera are determined on the basis of a calibration network designed on the central and side images captured by the two lenses. The designed calibration network is treated as a free distortion grid and applied to the measured control points in image space as correction terms by means of bilinear interpolation. After the corresponding corrections are performed, image coordinates are transformed to the unit sphere, an intermediate space between object space and image space, in the form of spherical coordinates. Afterwards, the IOPs and EOPs of each lens are determined separately through a statistical bundle adjustment procedure based on the collinearity condition equations. Subsequently, the ROPs of the two lenses are computed from both sets of EOPs. Our experiments show that by applying a 3×3 free distortion grid, image measurement residuals diminish from 1.5 to 0.25 degrees on the aforementioned unit sphere.
SU-E-T-570: New Quality Assurance Method Using Motion Tracking for 6D Robotic Couches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheon, W; Cho, J; Ahn, S
Purpose: To accommodate geometrically accurate patient positioning, robotic couches capable of 6 degrees of freedom have been introduced. However, conventional couch QA methods cannot test such couches with the necessary accuracy. We have therefore developed a camera-based motion detection and geometry calibration system for couch QA. Methods: Employing a Visual Tracking System (VTS, Bonita B10, Vicon, UK), which tracks infrared-reflective (IR) markers, camera calibration was conducted using a 5.7 × 5.7 × 5.7 cm³ cube with an IR marker attached at each corner. After positioning the robotic couch at the origin with the cube on the table top, the 3D coordinates of the cube's eight corners were acquired by the VTS in the VTS coordinate system. Next, positions in the reference (room) coordinate system were assigned using the known relation between the points. Finally, camera calibration was completed by finding the transformation matrix between the VTS and reference coordinate systems using the pseudo-inverse (least-squares) method. After the calibration, the accuracy of linear and rotational motions, as well as couch sagging, could be measured by analyzing continuously acquired data of the cube while the couch moves to a designated position. The accuracy of the developed software was verified through comparison with measurements from a laser tracker (FARO, Lake Mary, USA) for a robotic couch installed for proton therapy. Results: The VTS system tracked couch motion accurately and measured position in room coordinates. The VTS measurements and laser tracker data agreed within 1% for linear and rotational motions. Because the program analyzes motion in three dimensions, it can also compute couch sagging. Conclusion: The developed QA system provides sub-millimeter and sub-degree accuracy, which fulfills high-end couch QA requirements. This work was supported by the National Research Foundation of Korea, funded by the Ministry of Science, ICT & Future Planning (2013M2A2A7043507 and 2012M3A9B6055201).
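The transformation-matrix step described above can be sketched as a least-squares fit via the Moore-Penrose pseudo-inverse. This is a minimal illustration, not the authors' code; the function names and the 3×4 affine parameterization are assumptions:

```python
import numpy as np

def fit_affine_transform(src, dst):
    """Least-squares 3x4 transform mapping src -> dst via the pseudo-inverse.

    src, dst: (N, 3) arrays of corresponding 3D points (N >= 4), e.g. the
    cube corners in VTS coordinates and in room coordinates.
    Returns T (3, 4) such that dst ~= apply_transform(T, src).
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Homogeneous source coordinates: (N, 4)
    src_h = np.hstack([src, np.ones((len(src), 1))])
    # Solve src_h @ T.T = dst in the least-squares sense with the pseudo-inverse
    T = (np.linalg.pinv(src_h) @ dst).T
    return T

def apply_transform(T, pts):
    """Apply a 3x4 transform to (N, 3) points."""
    pts = np.asarray(pts, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    return pts_h @ T.T
```

With eight well-spread corners, the 8×4 design matrix is full rank and the fit absorbs both the rotation and the translation between the two frames.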
Automatic Calibration of an Airborne Imaging System to an Inertial Navigation Unit
NASA Technical Reports Server (NTRS)
Ansar, Adnan I.; Clouse, Daniel S.; McHenry, Michael C.; Zarzhitsky, Dimitri V.; Pagdett, Curtis W.
2013-01-01
This software automatically calibrates a camera or an imaging array to an inertial navigation system (INS) that is rigidly mounted to the array or imager. In effect, it recovers the coordinate frame transformation between the reference frame of the imager and the reference frame of the INS. This innovation can automatically derive the camera-to-INS alignment using image data only. The assumption is that the camera fixates on an area while the aircraft flies an orbit. The system then, fully automatically, solves for the camera orientation in the INS frame. No manual intervention or ground tie point data is required.
NASA Technical Reports Server (NTRS)
Chen, Fang-Jenq
1997-01-01
Flow visualization produces data in the form of two-dimensional images. If the optical components of a camera system are perfect, the transformation equations between the two-dimensional image and the three-dimensional object space are linear and easy to solve. However, real camera lenses introduce nonlinear distortions that affect the accuracy of the transformation unless proper corrections are applied. An iterative least-squares adjustment algorithm is developed to solve the nonlinear transformation equations incorporating distortion corrections. Experimental applications demonstrate that a relative precision on the order of 1 part in 40,000 is achievable without tedious laboratory calibrations of the camera.
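An iterative least-squares adjustment of this kind can be illustrated with a toy Gauss-Newton fit of a scale plus a single radial-distortion coefficient; the model and names here are illustrative stand-ins for the paper's full transformation equations:

```python
import numpy as np

def fit_scale_and_distortion(obj_pts, img_pts, iters=20):
    """Gauss-Newton least-squares fit of a toy image/object model.

    Model (an illustrative stand-in for the full transformation):
        img = s * obj * (1 + k1 * |obj|^2)
    i.e. a scale s plus one radial-distortion term k1.
    obj_pts, img_pts: (N, 2) arrays of corresponding points.
    Returns (s, k1).
    """
    obj = np.asarray(obj_pts, float)
    img = np.asarray(img_pts, float)
    r2 = np.sum(obj ** 2, axis=1)        # squared radius per point
    s, k1 = 1.0, 0.0                     # initial guess
    for _ in range(iters):
        warp = 1.0 + k1 * r2
        model = s * obj * warp[:, None]
        resid = (img - model).ravel()
        # Jacobian of the residuals w.r.t. (s, k1), stacked per coordinate
        J_s = -(obj * warp[:, None]).ravel()
        J_k = -(s * obj * r2[:, None]).ravel()
        J = np.column_stack([J_s, J_k])
        # Gauss-Newton step: delta = argmin ||J @ delta + resid||
        delta = np.linalg.lstsq(J, -resid, rcond=None)[0]
        s, k1 = s + delta[0], k1 + delta[1]
        if np.linalg.norm(delta) < 1e-12:
            break
    return s, k1
```

The real adjustment solves for the full set of transformation and distortion parameters the same way, just with a larger Jacobian.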
Kittaka, Daisuke; Takase, Tadashi; Akiyama, Masayuki; Nakazawa, Yasuo; Shinozuka, Akira; Shirai, Muneaki
2011-01-01
The (123)I-MIBG heart-to-mediastinum activity ratio (H/M) is commonly used as an indicator of relative myocardial (123)I-MIBG uptake. Because H/M ratios reflect myocardial sympathetic nerve function, they are a useful parameter for assessing regional myocardial sympathetic denervation in various cardiac diseases. However, H/M ratio values differ by site, gamma camera system, position and size of the region of interest (ROI), and collimator. In addition to these factors, the 529 keV scatter component may also affect the (123)I-MIBG H/M ratio. In this study, we examined whether H/M ratios measured on two different gamma camera systems correlate, and sought a formula for converting H/M ratios between them. Moreover, we assessed the feasibility of the (123)I dual-window (IDW) scatter correction method and compared H/M ratios with and without it. The H/M ratio displayed a good correlation between the two gamma camera systems, and we were able to derive a new H/M conversion formula. These results indicate that the IDW method is a useful scatter correction method for calculating (123)I-MIBG H/M ratios.
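The quantities involved reduce to simple arithmetic; a sketch, with the dual-window weighting factor k as an assumption (in practice it is determined per camera and collimator):

```python
def hm_ratio(heart_counts, mediastinum_counts):
    """Plain heart-to-mediastinum ratio from mean ROI counts."""
    return heart_counts / mediastinum_counts

def scatter_corrected_counts(photopeak, scatter_window, k=0.5):
    """Dual-window scatter correction (illustrative form).

    Subtracts a fraction k of the counts in a second energy window,
    placed over the high-energy (529 keV) scatter component, from the
    photopeak counts. The weighting factor k here is an assumption.
    """
    return photopeak - k * scatter_window
```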
Calibration of Action Cameras for Photogrammetric Purposes
Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo
2014-01-01
The use of action cameras for photogrammetric purposes is not widespread, because until recently the images provided by their sensors, in either still or video capture mode, were not large enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of an action camera, a careful and reliable self-calibration must be applied prior to any photogrammetric procedure, a relatively difficult task because of the camera's short focal length and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
Multiple Sensor Camera for Enhanced Video Capturing
NASA Astrophysics Data System (ADS)
Nagahara, Hajime; Kanki, Yoshinori; Iwai, Yoshio; Yachida, Masahiko
Camera resolution has improved drastically in response to the demand for high-quality digital images; for example, digital still cameras now offer several megapixels. Although video cameras have higher frame rates, their resolution is lower than that of still cameras. Thus, high resolution and high frame rate are incompatible in ordinary cameras on the market. This problem is difficult to solve with a single sensor, since it stems from the physical limitation of the pixel transfer rate. In this paper, we propose a multi-sensor camera for capturing resolution- and frame-rate-enhanced video. A common multi-CCD camera, such as a 3CCD color camera, uses identical CCDs to capture different spectral information. Our approach is to use sensors of different spatio-temporal resolution in a single camera cabinet to capture higher-resolution and higher-frame-rate information separately. We built a prototype camera which can capture high-resolution (2588×1958 pixels, 3.75 fps) and high-frame-rate (500×500 pixels, 90 fps) videos. We also propose a calibration method for the camera. As one application, we demonstrate an enhanced video (2128×1952 pixels, 90 fps) generated from the captured videos, showing the utility of the camera.
Binary pressure-sensitive paint measurements using miniaturised, colour, machine vision cameras
NASA Astrophysics Data System (ADS)
Quinn, Mark Kenneth
2018-05-01
Recent advances in machine vision technology and capability have led to machine vision cameras becoming applicable for scientific imaging. This study aims to demonstrate the applicability of machine vision colour cameras for the measurement of dual-component pressure-sensitive paint (PSP). The presence of a second luminophore component in the PSP mixture significantly reduces its inherent temperature sensitivity, increasing its applicability at low speeds. All of the devices tested are smaller than the cooled CCD cameras traditionally used and most are of significantly lower cost, thereby increasing the accessibility of such technology and techniques. Comparisons between three machine vision cameras, a three CCD camera, and a commercially available specialist PSP camera are made on a range of parameters, and a detailed PSP calibration is conducted in a static calibration chamber. The findings demonstrate that colour machine vision cameras can be used for quantitative, dual-component, pressure measurements. These results give rise to the possibility of performing on-board dual-component PSP measurements in wind tunnels or on real flight/road vehicles.
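A static-chamber PSP calibration of this kind is commonly expressed through the Stern-Volmer relation; a hedged sketch assuming the classic two-coefficient form (a dual-component paint may require more terms):

```python
import numpy as np

def fit_stern_volmer(pressures, intensities, p_ref, i_ref):
    """Fit the first-order Stern-Volmer relation used in PSP work:

        I_ref / I = A + B * (P / P_ref)

    Linear least squares for (A, B) from calibration points taken in a
    static chamber at known pressures. The two-coefficient form is an
    assumption; higher-order terms are often added in practice.
    """
    x = np.asarray(pressures, float) / p_ref
    y = i_ref / np.asarray(intensities, float)
    B, A = np.polyfit(x, y, 1)     # slope, intercept
    return A, B

def pressure_from_intensity(i, A, B, p_ref, i_ref):
    """Invert the calibrated relation to recover pressure from intensity."""
    return p_ref * ((i_ref / i) - A) / B
```

The same fit, applied per colour channel, is what turns a calibrated camera image into a pressure map.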
Huber, V; Huber, A; Kinna, D; Balboa, I; Collins, S; Conway, N; Drewelow, P; Maggi, C F; Matthews, G F; Meigs, A G; Mertens, Ph; Price, M; Sergienko, G; Silburn, S; Wynn, A; Zastrow, K-D
2016-11-01
The in situ absolute calibration of the JET real-time protection imaging system has been performed for the first time by means of a radiometric light source placed inside the JET vessel and operated by remote handling. The high accuracy of the calibration is confirmed by cross-validation of the near-infrared (NIR) cameras against each other, against thermal IR cameras, and against the beryllium evaporator, which led to successful protection of the JET first wall during the last campaign. The operating temperature ranges of the NIR protection cameras for the materials used on JET are Be 650-1600 °C, W coating 600-1320 °C, and W 650-1500 °C.
NASA Technical Reports Server (NTRS)
Russell, James W.; Marshall, Robert A.; Finley, Tom D.; Lawrence, George F.
1994-01-01
This report presents a description of the test apparatus and the method of testing the low frequency disturbance source characteristics of small pumps, fans, camera motors, and recorders that are typical of those used in microgravity science facilities. The test apparatus will allow both force and acceleration spectra of these disturbance devices to be obtained from acceleration measurements over the frequency range from 2 to 300 Hz. Some preliminary calibration results are presented.
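Obtaining a disturbance spectrum from an acceleration record is a standard FFT operation; a minimal sketch (force spectra follow by scaling with the effective mass, and the result would be band-limited to 2-300 Hz):

```python
import numpy as np

def acceleration_spectrum(signal, fs):
    """One-sided amplitude spectrum of an acceleration record.

    signal: samples of acceleration; fs: sampling rate in Hz.
    Returns (frequencies, amplitudes); amplitudes are in the same units
    as the input signal. Force spectra follow as F(f) = m * a(f) for an
    effective mass m.
    """
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    amps = np.abs(np.fft.rfft(signal)) * 2.0 / n   # one-sided scaling
    amps[0] /= 2.0                                 # DC bin is not doubled
    return freqs, amps
```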
Radiometric calibration of wide-field camera system with an application in astronomy
NASA Astrophysics Data System (ADS)
Vítek, Stanislav; Nasyrova, Maria; Stehlíková, Veronika
2017-09-01
The camera response function (CRF) is widely used to describe the relationship between scene radiance and image brightness. The most common application of the CRF is High Dynamic Range (HDR) reconstruction of the radiance maps of imaged scenes from a set of frames with different exposures. The main goal of this work is to provide an overview of CRF estimation algorithms and compare their outputs with results obtained under laboratory conditions. These algorithms, typically designed for multimedia content, are unfortunately of little use with astronomical image data, mostly due to its nature (blur, noise, and long exposures). Therefore, we propose an optimization of selected methods for use in an astronomical imaging application. Results are experimentally verified on a wide-field camera system using a Digital Single Lens Reflex (DSLR) camera.
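Once a CRF g(z) has been estimated, HDR radiance recovery for a pixel is a weighted average over exposures (Debevec-Malik style); a minimal sketch, with the hat-shaped weighting an assumption:

```python
import math

def merge_hdr(pixel_values, exposure_times, g, z_min=0, z_max=255):
    """Weighted HDR radiance recovery for one pixel.

    pixel_values: the same pixel observed in frames with the given
    exposure_times; g: an (already estimated) CRF mapping pixel value
    to log exposure. Mid-range samples get the most weight via a simple
    hat function (an assumption; other weightings are common).
    """
    def w(z):
        mid = 0.5 * (z_min + z_max)
        return (z - z_min) if z <= mid else (z_max - z)

    num = sum(w(z) * (g(z) - math.log(t))
              for z, t in zip(pixel_values, exposure_times))
    den = sum(w(z) for z in pixel_values)
    return math.exp(num / den)   # scene radiance, up to a global scale
```

For a perfectly linear sensor, g is simply the natural logarithm, and all exposures vote for the same radiance.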
The influence of the in situ camera calibration for direct georeferencing of aerial imagery
NASA Astrophysics Data System (ADS)
Mitishita, E.; Barrios, R.; Centeno, J.
2014-11-01
The direct determination of exterior orientation parameters (EOPs) of aerial images via GNSS/INS technologies is an essential prerequisite in photogrammetric mapping nowadays. Although direct sensor orientation technologies provide a high degree of automation in the process due to the GNSS/INS technologies, the accuracies of the obtained results depend on the quality of a group of parameters that accurately model the conditions of the system at the moment the job is performed. One sub-group of parameters (lever arm offsets and boresight misalignments) models the position and orientation of the sensors with respect to the IMU body frame, due to the impossibility of having all sensors at the same position and orientation on the airborne platform. Another sub-group of parameters models the internal characteristics of the sensor (IOP). A system calibration procedure has been recommended by worldwide studies to obtain accurate parameters (mounting and sensor characteristics) for applications of direct sensor orientation. Commonly, mounting and sensor characteristics are not stable; they can vary under different flight conditions. The system calibration requires a geometric arrangement of the flight and/or control points to decouple correlated parameters, which is not available in a conventional photogrammetric flight. Considering this difficulty, this study investigates the feasibility of in situ camera calibration to improve the accuracy of the direct georeferencing of aerial images. The camera calibration uses a minimum image block, extracted from the conventional photogrammetric flight, and a control point arrangement. A digital Vexcel UltraCam XP camera connected to a POS AV™ system was used to acquire two photogrammetric image blocks. The blocks have different flight directions and opposite flight lines. In situ calibration procedures to compute different sets of IOPs are performed and their results are analyzed and used in photogrammetric experiments.
The IOPs from the in situ camera calibration significantly improve the accuracy of the direct georeferencing. The results obtained from the experiments are shown and discussed.
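The role of the two parameter sub-groups can be made concrete with the standard direct-georeferencing equation; a schematic sketch with illustrative frames and names:

```python
import numpy as np

def direct_georeference(r_gnss, R_imu, lever_arm, R_boresight, r_img, scale):
    """Standard direct-georeferencing equation (schematic form):

        X_ground = X_GNSS + R_imu @ (a_lever + s * R_boresight @ x_image)

    a_lever: the lever-arm offset between GNSS antenna and camera;
    R_boresight: the boresight misalignment between camera and IMU body
    frame; s: the per-point image scale. Frame conventions and units are
    illustrative, not those of a specific POS AV implementation.
    """
    r_gnss = np.asarray(r_gnss, float)
    lever_arm = np.asarray(lever_arm, float)
    r_img = np.asarray(r_img, float)
    return r_gnss + R_imu @ (lever_arm + scale * (R_boresight @ r_img))
```

Errors in the IOPs enter through x_image, which is why refining them in situ propagates directly into ground-coordinate accuracy.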
NASA Technical Reports Server (NTRS)
Doty, Keith L
1992-01-01
The author has formulated a new, general model for specifying the kinematic properties of serial manipulators. The new model kinematic parameters do not suffer discontinuities when nominally parallel adjacent axes deviate from exact parallelism. From this new theory the author develops a first-order, lumped-parameter, calibration-model for the ARID manipulator. Next, the author develops a calibration methodology for the ARID based on visual and acoustic sensing. A sensor platform, consisting of a camera and four sonars attached to the ARID end frame, performs calibration measurements. A calibration measurement consists of processing one visual frame of an accurately placed calibration image and recording four acoustic range measurements. A minimum of two measurement protocols determine the kinematics calibration-model of the ARID for a particular region: assuming the joint displacements are accurately measured, the calibration surface is planar, and the kinematic parameters do not vary rapidly in the region. No theoretical or practical limitations appear to contra-indicate the feasibility of the calibration method developed here.
Holder, J P; Benedetti, L R; Bradley, D K
2016-11-01
Single hit pulse height analysis is applied to National Ignition Facility x-ray framing cameras to quantify gain and gain variation in a single micro-channel plate-based instrument. This method allows the separation of gain from detectability in these photon-detecting devices. While pulse heights measured by standard-DC calibration methods follow the expected exponential distribution at the limit of a compound-Poisson process, gain-gated pulse heights follow a more complex distribution that may be approximated as a weighted sum of a few exponentials. We can reproduce this behavior with a simple statistical-sampling model.
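The statistical model can be sketched as follows: exponential single-photon pulse heights in the standard-DC limit, and a weighted sum of exponentials for the gated case. This is an illustrative model, not the NIF calibration code:

```python
import math
import random

def sample_pulse_heights(mean_gain, n, seed=1):
    """Single-photoelectron MCP pulse heights drawn from an exponential
    distribution, the limit expected for a compound-Poisson gain process
    (illustrative model)."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / mean_gain) for _ in range(n)]

def mixture_pdf(x, weights, means):
    """Weighted sum of exponentials, approximating the richer pulse-height
    distribution seen when the gain is gated."""
    return sum(w / m * math.exp(-x / m) for w, m in zip(weights, means))
```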
Don't get burned: thermal monitoring of vessel sealing using a miniature infrared camera
NASA Astrophysics Data System (ADS)
Lin, Shan; Fichera, Loris; Fulton, Mitchell J.; Webster, Robert J.
2017-03-01
Miniature infrared cameras have recently come to market in a form factor that facilitates packaging in endoscopic or other minimally invasive surgical instruments. If absolute temperature measurements can be made with these cameras, they may be useful for non-contact monitoring of electrocautery-based vessel sealing, or other thermal surgical processes like thermal ablation of tumors. As a first step in evaluating the feasibility of optical medical thermometry with these new cameras, in this paper we explore how well thermal measurements can be made with them. These cameras measure the raw flux of incoming IR radiation, and we perform a calibration procedure to map their readings to absolute temperature values in the range between 40 and 150 °C. Furthermore, we propose and validate a method to estimate the spatial extent of heat spread created by a cautery tool based on the thermal images.
OpenCV and TYZX: video surveillance for tracking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Jim; Spencer, Andrew; Chu, Eric
2008-08-01
As part of the National Security Engineering Institute (NSEI) project, several sensors were developed in conjunction with an assessment algorithm. A camera system was developed in-house to track the locations of personnel within a secure room. In addition, a commercial, off-the-shelf (COTS) tracking system developed by TYZX was examined. TYZX is a Bay Area start-up that has developed its own tracking hardware and software which we use as COTS support for robust tracking. This report discusses the pros and cons of each camera system, how they work, a proposed data fusion method, and some visual results. Distributed, embedded image processing solutions show the most promise in their ability to track multiple targets in complex environments and in real-time. Future work on the camera system may include three-dimensional volumetric tracking by using multiple simple cameras, Kalman or particle filtering, automated camera calibration and registration, and gesture or path recognition.
NASA Astrophysics Data System (ADS)
Gogler, Slawomir; Bieszczad, Grzegorz; Krupinski, Michal
2013-10-01
Thermal imagers and the infrared array sensors used in them are subject to a calibration procedure and an evaluation of their voltage sensitivity to incident radiation during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represents temperature variations across the scene. Detectors used in a thermal camera are illuminated by infrared radiation transmitted through an infrared-transmitting optical system. Often an optical system, when exposed to a uniform Lambertian source, forms a non-uniform irradiation distribution in its image plane. In order to carry out an accurate non-uniformity correction, it is essential to correctly predict the irradiation distribution from a uniform source. In this article, a non-uniformity correction method is presented that takes the optical system's radiometry into account. Predictions of the irradiation distribution have been confronted with measured irradiance values. The presented radiometric model allows a fast and accurate non-uniformity correction to be carried out.
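A basic two-point (gain/offset) non-uniformity correction, the starting point that the radiometric model above refines, can be sketched as:

```python
import numpy as np

def two_point_nuc(raw_low, raw_high, target_low, target_high):
    """Per-pixel gain/offset from two uniform (flat-field) references.

    raw_low/raw_high: pixel responses to a cold and a hot uniform source;
    target_low/target_high: the desired output levels (e.g. the frame
    means). Returns (gain, offset) so that  corrected = gain * raw + offset.
    A full correction would additionally model the optics' non-uniform
    irradiance from a uniform Lambertian source, as the abstract notes.
    """
    gain = (target_high - target_low) / (raw_high - raw_low)
    offset = target_low - gain * raw_low
    return gain, offset
```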
Optical aberration correction for simple lenses via sparse representation
NASA Astrophysics Data System (ADS)
Cui, Jinlin; Huang, Wei
2018-04-01
Simple lenses with spherical surfaces are lightweight, inexpensive, highly flexible, and can be easily processed. However, they suffer from optical aberrations that lead to limitations in high-quality photography. In this study, we propose a set of computational photography techniques based on sparse signal representation to remove optical aberrations, thereby allowing the recovery of images captured through a single-lens camera. The primary advantage of the proposed method is that many prior point spread functions calibrated at different depths are successfully used for restoring visual images in a short time, which can be generally applied to nonblind deconvolution methods for solving the problem of the excessive processing time caused by the number of point spread functions. The optical software CODE V is applied for examining the reliability of the proposed method by simulation. The simulation results reveal that the suggested method outperforms the traditional methods. Moreover, the performance of a single-lens camera is significantly enhanced both qualitatively and perceptually. Particularly, the prior information obtained by CODE V can be used for processing the real images of a single-lens camera, which provides an alternative approach to conveniently and accurately obtain point spread functions of single-lens cameras.
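The paper's sparse-representation deconvolution is beyond a short sketch, but the non-blind deconvolution step it accelerates can be illustrated with a classical frequency-domain Wiener filter using one known PSF (a simpler stand-in, not the authors' method):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=0.01):
    """Non-blind Wiener deconvolution with a known, calibrated PSF.

    Assumes circular convolution with the PSF origin at index (0, 0).
    k is the regularization constant (an assumed noise-to-signal ratio);
    larger k suppresses noise amplification at weak frequencies.
    """
    H = np.fft.fft2(psf, s=blurred.shape)           # PSF transfer function
    B = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + k)           # Wiener filter
    return np.real(np.fft.ifft2(B * W))
```

Methods like the paper's trade this single global filter for learned/sparse priors, but the per-PSF cost structure is the same, which is why reusing many depth-dependent PSFs efficiently matters.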
Homography-based multiple-camera person-tracking
NASA Astrophysics Data System (ADS)
Turk, Matthew R.
2009-01-01
Multiple video cameras are cheaply installed overlooking an area of interest. While computerized single-camera tracking is well-developed, multiple-camera tracking is a relatively new problem. The main multi-camera problem is to give the same tracking label to all projections of a real-world target. This is called the consistent labelling problem. Khan and Shah (2003) introduced a method to use field of view lines to perform multiple-camera tracking. The method creates inter-camera meta-target associations when objects enter at the scene edges. They also said that a plane-induced homography could be used for tracking, but this method was not well described. Their homography-based system would not work if targets use only one side of a camera to enter the scene. This paper overcomes this limitation and fully describes a practical homography-based tracker. A new method to find the feet feature is introduced. The method works especially well if the camera is tilted, when using the bottom centre of the target's bounding-box would produce inaccurate results. The new method is more accurate than the bounding-box method even when the camera is not tilted. Next, a method is presented that uses a series of corresponding point pairs "dropped" by oblivious, live human targets to find a plane-induced homography. The point pairs are created by tracking the feet locations of moving targets that were associated using the field of view line method. Finally, a homography-based multiple-camera tracking algorithm is introduced. Rules governing when to create the homography are specified. The algorithm ensures that homography-based tracking only starts after a non-degenerate homography is found. The method works when not all four field of view lines are discoverable; only one line needs to be found to use the algorithm. To initialize the system, the operator must specify pairs of overlapping cameras. 
Aside from that, the algorithm is fully automatic and uses the natural movement of live targets for training. No calibration is required. Testing shows that the algorithm performs very well in real-world sequences. The consistent labelling problem is solved, even for targets that appear via in-scene entrances. Full occlusions are handled. Although implemented in Matlab, the multiple-camera tracking system runs at eight frames per second. A faster implementation would be suitable for real-world use at typical video frame rates.
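The plane-induced homography can be estimated from such corresponding point pairs with the standard DLT algorithm; a minimal sketch (no normalization or outlier handling, unlike a practical tracker):

```python
import numpy as np

def estimate_homography(src, dst):
    """Homography from >= 4 point correspondences via the direct linear
    transform (DLT). src, dst: (N, 2) arrays of matched points (e.g. the
    feet locations of the same target in two overlapping views).
    Returns a 3x3 H with H[2, 2] = 1.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows)
    # Null-space of A: right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pts):
    """Apply a homography to (N, 2) points."""
    pts = np.asarray(pts, float)
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]
```

A degenerate configuration (e.g. all points collinear) leaves the null-space ambiguous, which is why the algorithm above must only be run once a non-degenerate set of pairs has accumulated.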
NASA Astrophysics Data System (ADS)
Lussem, U.; Hollberg, J.; Menne, J.; Schellberg, J.; Bareth, G.
2017-08-01
Monitoring the spectral response of intensively managed grassland throughout the growing season allows optimizing fertilizer inputs by monitoring plant growth. For example, site-specific fertilizer application as part of precision agriculture (PA) management requires information within a short time. However, this requires field-based measurements with hyper- or multispectral sensors, which may not be feasible in day-to-day farming practice. Exploiting the information of RGB images from consumer-grade cameras mounted on unmanned aerial vehicles (UAV) can offer cost-efficient as well as near-real-time analysis of grasslands with high temporal and spatial resolution. The potential of RGB imagery-based vegetation indices (VI) from consumer-grade cameras mounted on UAVs has been explored recently in several studies. However, for multitemporal analyses it is desirable to calibrate the digital numbers (DN) of RGB images to physical units. In this study, we explored the comparability of the RGBVI from a consumer-grade camera mounted on a low-cost UAV to well-established vegetation indices from hyperspectral field measurements for applications in grassland. The study was conducted in 2014 on the Rengen Grassland Experiment (RGE) in Germany. Image DN values were calibrated into reflectance using the Empirical Line Method (Smith & Milton 1999). Depending on sampling date and VI, the correlation between the UAV-based RGBVI and VIs such as the NDVI resulted in R² values ranging from no correlation up to 0.9. These results indicate that calibrated RGB-based VIs have the potential to support or substitute hyperspectral field measurements to facilitate management decisions on grasslands.
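The two calibration ingredients named above, the Empirical Line Method and the RGBVI, can be sketched directly; the target reflectances in the example are made up:

```python
def empirical_line(dn, dn_dark, dn_bright, ref_dark, ref_bright):
    """Empirical Line Method: per-band linear mapping from digital
    numbers (DN) to reflectance, anchored on two field targets of known
    reflectance (dark and bright), after Smith & Milton (1999).
    """
    gain = (ref_bright - ref_dark) / (dn_bright - dn_dark)
    return ref_dark + gain * (dn - dn_dark)

def rgbvi(r, g, b):
    """RGB vegetation index:  RGBVI = (G^2 - R*B) / (G^2 + R*B),
    computed from calibrated (or raw) red, green, blue values."""
    return (g * g - r * b) / (g * g + r * b)
```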
DOE Office of Scientific and Technical Information (OSTI.GOV)
McBeth, R; Elder, D; Kesner, A
2016-06-15
Purpose: Y-90 Selective Internal Radiation Therapy (SIRT) is used to treat liver tumors, and by nature has variability in the percentage of the intended dose that is actually delivered. To determine the quality of the administration, pre- and post-administration activity measurements are taken and used to infer the percentage delivered. Vendor specifications indicate the use of an ion chamber to take these measurements. In our work, we investigated the accuracy of ion chambers and compared them to other detector systems. Methods: We have built phantoms, phantom holders, and protocols which allow us to measure our Y-90 doses with varying apparatuses: a dose calibrator, a Geiger counter, an ion chamber, a crystal-based thyroid probe, and a gamma camera. We have set up a system that has enabled us to gather data by measuring clinical Y-90 doses as they are used in the clinic with all of the instrumental methods. Five initial doses (25 measurements/acquisitions) had been taken at the time of this abstract submission. Results: Our initial results show that measurements acquired using scintillation-based detectors (thyroid probe and gamma camera) correlate better with the gold standard (i.e., the dose calibrator). Pearson correlations between the dose calibrator measurements and the GM counter, ion chamber, thyroid probe, and gamma camera were found to be 0.88, 0.83, 0.98, and 0.99, respectively. More acquisitions and analysis are planned to determine the precision of the systems, as well as optimal energy window settings. Conclusion: It is likely that current standard practice can be improved using scintillation crystal-based detectors. Such systems are more sensitive, can integrate signal, and can use energy discrimination. Furthermore, phantoms can be built to integrate with probe and gamma camera systems that are robust and provide reproducibility. Future work will include expanded acquisition and analysis.
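The reported comparison statistic is the ordinary Pearson correlation coefficient, which can be computed as:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between paired measurements,
    e.g. each detector's readings against the dose-calibrator values."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```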
NASA Astrophysics Data System (ADS)
Silva, T. S. F.; Torres, R. S.; Morellato, P.
2017-12-01
Vegetation phenology is a key component of ecosystem function and biogeochemical cycling, and is highly susceptible to climatic change. Phenological knowledge in the tropics is limited by lack of monitoring, traditionally done by laborious direct observation. Ground-based digital cameras can automate daily observations, but also offer limited spatial coverage. Imaging by low-cost Unmanned Aerial Systems (UAS) combines the fine resolution of ground-based methods with an unprecedented capability for spatial coverage, but challenges remain in producing color-consistent multitemporal images. We evaluated the applicability of multitemporal UAS imaging to monitor phenology in tropical altitudinal grasslands and forests, answering: 1) Can very-high-resolution aerial photography from conventional digital cameras be used to reliably monitor vegetative and reproductive phenology? 2) How is UAS monitoring affected by changes in illumination and by sensor physical limitations? We flew imaging missions monthly from Feb-16 to Feb-17, using a UAS equipped with an RGB Canon SX260 camera. Flights were carried out between 10am and 4pm, at 120-150 m a.g.l., yielding 5-10 cm spatial resolution. To compensate for illumination changes caused by time of day, season, and cloud cover, calibration was attempted using reference targets and empirical models, as well as color space transformations. For vegetative phenological monitoring, the multitemporal response was severely affected by changes in illumination conditions, strongly confounding the phenological signal. These variations could not be adequately corrected through calibration due to sensor limitations. For reproductive phenology, the very high resolution of the acquired imagery allowed discrimination of individual reproductive structures for some species, and their stark colorimetric differences from vegetative structures allowed detection of reproductive timing in the HSV color space, despite illumination effects.
We conclude that reliable vegetative phenology monitoring may exceed the capabilities of consumer cameras, but reproductive phenology can be successfully monitored for species with conspicuous reproductive structures. Further research is being conducted to improve calibration methods and information extraction through machine learning.
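Detecting conspicuous reproductive structures in the HSV color space can be sketched with a simple per-pixel hue/saturation test; the hue window and thresholds here are illustrative, not the study's values:

```python
import colorsys

def is_reproductive_pixel(r, g, b, hue_range=(0.9, 0.1), s_min=0.3, v_min=0.2):
    """Flag a pixel as a candidate reproductive structure by hue.

    Works in HSV space, where conspicuous flower colours separate from
    green vegetative tissue more robustly under varying illumination.
    r, g, b in [0, 1]; the default hue window (wrapping around red) and
    the saturation/value thresholds are assumptions for illustration.
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    lo, hi = hue_range
    # Handle hue windows that wrap around 0 (red)
    in_hue = (h >= lo or h <= hi) if lo > hi else (lo <= h <= hi)
    return in_hue and s >= s_min and v >= v_min
```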
A Semi-Automatic Image-Based Close Range 3D Modeling Pipeline Using a Multi-Camera Configuration
Rau, Jiann-Yeou; Yeh, Po-Chia
2012-01-01
The generation of photo-realistic 3D models is an important task for the digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and a multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters remain the same as the calibrated results, even when the target has changed. Based on this invariant character, the whole 3D modeling pipeline can be performed completely automatically once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. Subjects imaged include a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to the large quantity of antiques stored in museums. PMID:23112656
NASA Astrophysics Data System (ADS)
Niemeyer, F.; Schima, R.; Grenzdörffer, G.
2013-08-01
Numerous unmanned aerial systems (UAS) are currently flooding the market, with purpose-built UAVs used for the most diverse applications. Micro and mini UAS (maximum take-off weight up to 5 kg) are of particular interest because legal restrictions remain manageable and the payload capacity is sufficient for many imaging sensors. A camera system with four oblique and one nadir-looking camera is currently under development at the Chair for Geodesy and Geoinformatics. The so-called "Four Vision" camera system was successfully built and tested in the air. An MD4-1000 UAS from Microdrones is used as a carrier system. Lightweight industrial cameras are used, controlled by a central computer. For further photogrammetric image processing, each individual camera, as well as all the cameras together, has to be calibrated. This paper focuses on the determination of the relative orientation between the cameras with the "Australis" software and gives an overview of the results and experiences from test flights.
Distance error correction for time-of-flight cameras
NASA Astrophysics Data System (ADS)
Fuersattel, Peter; Schaller, Christian; Maier, Andreas; Riess, Christian
2017-06-01
The measurement accuracy of time-of-flight cameras is limited by properties of the scene and by systematic errors. These errors can amount to multiple centimeters, which may limit the applicability of these range sensors. In the past, different approaches have been proposed for improving the accuracy of these cameras. In this work, we propose a new method that improves two important aspects of range calibration. First, we propose a new checkerboard which is augmented by a gray-level gradient. With this addition it becomes possible to capture the calibration features for intrinsic and distance calibration at the same time. The gradient strip makes it possible to acquire a large number of distance measurements for different surface reflectivities, which results in more meaningful training data. Second, we present multiple new features which are used as input to a random forest regressor. By using random regression forests, we circumvent the problem of finding an accurate model for the measurement error. During application, a correction value for each individual pixel is estimated with the trained forest based on a specifically tailored feature vector. With our approach the measurement error can be reduced by more than 40% for the Mesa SR4000 and by more than 30% for the Microsoft Kinect V2. In our evaluation we also investigate the impact of the individual forest parameters and illustrate the importance of the individual features.
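The per-pixel correction idea — learn distance-dependent errors from training data, then predict a correction for each pixel from its feature vector — can be illustrated with a k-nearest-neighbour regressor standing in for the random forest (a simplified sketch; the single-feature choice and all numbers are assumptions, not the paper's):

```python
import numpy as np

class KNNCorrector:
    """Toy stand-in for the random regression forest: predicts a
    per-pixel correction as the mean error of the k nearest
    training feature vectors."""
    def __init__(self, k=3):
        self.k = k

    def fit(self, features, errors):
        self.F = np.asarray(features, float)  # N x D feature vectors
        self.e = np.asarray(errors, float)    # N observed errors
        return self

    def predict(self, f):
        d = np.linalg.norm(self.F - np.asarray(f, float), axis=1)
        idx = np.argsort(d)[:self.k]
        return self.e[idx].mean()

# Hypothetical training data: measured distance as the only feature,
# with a synthetic error that grows linearly with distance.
feats = [[1.0 + 0.1 * i] for i in range(31)]   # distances 1.0 .. 4.0 m
errs = [0.05 * f[0] for f in feats]            # 5% synthetic error
corrector = KNNCorrector(k=3).fit(feats, errs)
```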
Silvatti, Amanda P; Cerveri, Pietro; Telles, Thiago; Dias, Fábio A S; Baroni, Guido; Barros, Ricardo M L
2013-01-01
In this study we investigate the applicability of underwater 3D motion capture based on submerged video cameras, in terms of 3D accuracy analysis and trajectory reconstruction. Static points with the classical direct linear transform (DLT) solution, a moving wand with bundle adjustment, and a moving 2D plate with Zhang's method were considered for camera calibration. As an example of the final application, we reconstructed hand motion trajectories in different swimming styles and qualitatively compared them with Maglischo's model. Four highly trained male swimmers performed butterfly, breaststroke and freestyle tasks. The middle fingertip trajectories of both hands in the underwater phase were considered. The accuracy (mean absolute error) of the two calibration approaches (wand: 0.96 mm; 2D plate: 0.73 mm) was comparable to out-of-water results and far superior to the classical DLT results (9.74 mm). Among all the swimmers, the hand trajectories of the expert swimmer in each style were almost symmetric and in good agreement with Maglischo's model. The kinematic results highlight symmetry or asymmetry between the two hand sides, intra- and inter-subject variability in the motion patterns, and agreement or disagreement with the model. The two outcomes, calibration results and trajectory reconstruction, both move towards quantitative 3D underwater motion analysis.
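The classical DLT calibration compared above solves for the 3×4 projection matrix linearly from 3D-2D point correspondences. A minimal sketch on synthetic, noise-free data (not the study's implementation; camera parameters and points are illustrative):

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Direct linear transform: X is Nx3 world points, x is Nx2 image
    points (N >= 6, non-coplanar). Returns the 3x4 projection matrix
    minimizing the algebraic error, via SVD."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        P = [Xw, Yw, Zw, 1.0]
        A.append([0.0] * 4 + [-p for p in P] + [v * p for p in P])
        A.append(P + [0.0] * 4 + [-u * p for p in P])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)

def reproject(P, X):
    """Project Nx3 world points with projection matrix P."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]
```

With exact correspondences the recovered matrix reproduces the input projections to numerical precision; with noisy points one would normalize coordinates first (Hartley's preconditioning) for stability.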
Calibration of the Nikon 200 for Close Range Photogrammetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheriff, Lassana (City College, N.Y.; SLAC)
2010-08-25
The overall objective of this project is to study the stability and reproducibility of the calibration parameters of the Nikon D200 camera with a Nikkor 20 mm lens for close-range photogrammetric surveys. The well-known central perspective projection model is used to determine the camera parameters for interior orientation. The Brown model extends it with the introduction of radial distortion and other less critical variables. The calibration process requires a dense network of targets to be photographed at different angles. For faster processing, reflective coded targets are chosen. Two scenarios have been used to check the reproducibility of the parameters. The first one is a flat 2D wall with 141 coded targets and 12 custom targets that were previously measured with a laser tracker. The second one is a 3D Unistrut structure with a combination of coded targets and 3D reflective spheres. The study has shown that this setup is only stable during a short period of time. In conclusion, this camera is acceptable when calibrated before each use. Future work should include actual field tests and possible mechanical improvements, such as securing the lens to the camera body.
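The Brown model mentioned above adds radial and tangential (decentering) distortion terms to the central perspective projection. A minimal sketch of applying the model to normalized image coordinates (the coefficient values used below are illustrative, not calibrated D200 values):

```python
def brown_distort(xn, yn, k1, k2, p1, p2):
    """Apply the Brown distortion model to normalized image
    coordinates (xn, yn): radial terms k1, k2 and tangential
    (decentering) terms p1, p2."""
    r2 = xn ** 2 + yn ** 2
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2
    xd = xn * radial + 2.0 * p1 * xn * yn + p2 * (r2 + 2.0 * xn ** 2)
    yd = yn * radial + p1 * (r2 + 2.0 * yn ** 2) + 2.0 * p2 * xn * yn
    return xd, yd

# Hypothetical barrel distortion (negative k1) at a point on the x axis:
xd, yd = brown_distort(0.5, 0.0, k1=-0.1, k2=0.0, p1=0.0, p2=0.0)
```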
Development and application of 3-D foot-shape measurement system under different loads
NASA Astrophysics Data System (ADS)
Liu, Guozhong; Wang, Boxiong; Shi, Hui; Luo, Xiuzhi
2008-03-01
A 3-D foot-shape measurement system under different loads, based on the laser-line-scanning principle, was designed and a model of the measurement system was developed. 3-D foot-shape measurements without blind areas under different loads and automatic extraction of foot parameters are achieved with the system. A global calibration method for the CCD cameras, using a one-axis motion unit in the measurement system and specialized calibration kits, is presented. Errors caused by the nonlinearity of the CCD cameras and other devices, and by the installation of the one-axis motion platform, the laser plane and the toughened glass plane, can be eliminated by using a nonlinear coordinate mapping function and the Powell optimization method in calibration. Foot measurements under different loads were conducted for 170 participants, and the statistical foot-parameter results for male and female participants under the non-weight condition, as well as the changes of foot parameters under the half-body-weight, full-body-weight and over-body-weight conditions compared with the non-weight condition, are presented. 3-D foot-shape measurement under different loads makes custom shoe-making possible and shows great promise in shoe design, foot orthopaedic treatment, shoe size standardization, and the establishment of a foot database for consumers and athletes.
A Fully Sensorized Cooperative Robotic System for Surgical Interventions
Tovar-Arriaga, Saúl; Vargas, José Emilio; Ramos, Juan M.; Aceves, Marco A.; Gorrostieta, Efren; Kalender, Willi A.
2012-01-01
In this research a fully sensorized cooperative robot system for manipulation of needles is presented. The setup consists of a DLR/KUKA Light Weight Robot III especially designed for safe human/robot interaction, an FD-CT robot-driven angiographic C-arm system, and a navigation camera. Also, new control strategies for robot manipulation in the clinical environment are introduced. A method for fast calibration of the involved components and preliminary accuracy tests of the complete error chain are presented. Calibration of the robot with the navigation system has a residual error of 0.81 mm (rms) with a standard deviation of ±0.41 mm. The accuracy of the robotic system while targeting fixed points at different positions within the workspace is 1.2 mm (rms) with a standard deviation of ±0.4 mm. After calibration, and owing to closed-loop control, the absolute positioning error was reduced to that of the navigation camera, which is 0.35 mm (rms). The implemented control allows the robot to compensate for small patient movements. PMID:23012551
Navigation Aiding by a Hybrid Laser-Camera Motion Estimator for Micro Aerial Vehicles
Atman, Jamal; Popp, Manuel; Ruppelt, Jan; Trommer, Gert F.
2016-01-01
Micro Air Vehicles (MAVs) equipped with various sensors are able to carry out autonomous flights. However, the self-localization of autonomous agents is mostly dependent on Global Navigation Satellite Systems (GNSS). In order to provide an accurate navigation solution in the absence of GNSS signals, this article presents a hybrid sensor. The hybrid sensor is a deep integration of a monocular camera and a 2D laser rangefinder, so that the motion of the MAV can be estimated. This realization is expected to be more flexible across environments than laser-scan-matching approaches. The estimated ego-motion is then integrated in the MAV's navigation system. First, however, the relative pose between the two sensors is obtained with an improved calibration method. For both calibration and ego-motion estimation, 3D-to-2D correspondences are used and the Perspective-3-Point (P3P) problem is solved. Moreover, covariance estimation of the relative motion is presented. The experiments show very accurate calibration and navigation results. PMID:27649203
Method and apparatus for reading meters from a video image
Lewis, Trevor J.; Ferguson, Jeffrey J.
1997-01-01
A method and system to enable acquisition of data about an environment from one or more meters using video images. One or more meters are imaged by a video camera and the video signal is digitized. Then, each region of the digital image which corresponds to the indicator of the meter is calibrated and the video signal is analyzed to determine the value indicated by each meter indicator. Finally, from the value indicated by each meter indicator in the calibrated region, a meter reading is generated. The method and system offer the advantages of automatic data collection in a relatively non-intrusive manner without making any complicated or expensive electronic connections, and without requiring intensive manpower.
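For a dial meter, the final step described above — turning the calibrated indicator region into a reading — reduces to a linear map from needle angle to the meter's value range. A hypothetical sketch (the angle limits and value range below are assumptions, not values from the patent):

```python
import math

def needle_angle(tip_x, tip_y, pivot_x, pivot_y):
    """Angle of the needle (degrees) from the detected tip and the
    dial pivot in image coordinates (y axis pointing down)."""
    return math.degrees(math.atan2(pivot_y - tip_y, tip_x - pivot_x))

def needle_value(angle_deg, angle_min, angle_max, value_min, value_max):
    """Map a needle angle linearly onto the meter's value range,
    using the calibrated angles of the scale endpoints."""
    frac = (angle_deg - angle_min) / (angle_max - angle_min)
    return value_min + frac * (value_max - value_min)

# Hypothetical dial: scale sweeps from 45 deg (value 0) to 315 deg
# (value 100); the needle currently points at 135 deg.
reading = needle_value(135.0, 45.0, 315.0, 0.0, 100.0)
```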
Calibrating Images from the MINERVA Cameras
NASA Astrophysics Data System (ADS)
Mercedes Colón, Ana
2016-01-01
The MINiature Exoplanet Radial Velocity Array (MINERVA) consists of an array of robotic telescopes located on Mount Hopkins, Arizona, with the purpose of performing transit photometry and spectroscopy to find Earth-like planets around Sun-like stars. In order to make photometric observations, it is necessary to calibrate the CCD cameras of the telescopes to take into account possible instrument error in the data. In this project, we developed a pipeline that takes optical images and calibrates them using sky flats, darks, and biases to generate a transit light curve.
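The standard CCD reduction mentioned above combines the bias, dark and flat frames as follows (a generic sketch of the usual recipe, not the MINERVA pipeline itself; the dark frame is assumed already scaled to the science exposure time):

```python
import numpy as np

def calibrate_frame(raw, bias, dark, flat):
    """Standard CCD reduction: subtract bias and dark current from the
    raw science frame, then divide by the normalized flat field to
    remove pixel-to-pixel sensitivity variations."""
    science = raw - bias - dark
    flat_corr = flat - bias                      # bias-subtracted flat
    flat_norm = flat_corr / np.median(flat_corr) # unit-median flat
    return science / flat_norm
```

Applied per pixel, a uniform source imaged through a non-uniform flat field comes out uniform again after this correction.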
NASA Astrophysics Data System (ADS)
Wang, Sheng; Bandini, Filippo; Jakobsen, Jakob; Zarco-Tejada, Pablo J.; Köppl, Christian Josef; Haugård Olesen, Daniel; Ibrom, Andreas; Bauer-Gottwein, Peter; Garcia, Monica
2017-04-01
Unmanned Aerial Systems (UAS) can collect optical and thermal hyperspatial (<1 m) imagery at low cost and with flexible revisit times, regardless of cloud conditions. The reflectance and radiometric temperature signatures of the land surface, closely linked with vegetation structure and functioning, are already part of models that predict Evapotranspiration (ET) and Gross Primary Productivity (GPP) from satellites. However, challenges remain for operational monitoring with UAS compared to satellites: the payload capacity of most commercial UAS is less than 2 kg, miniaturized sensors have low signal-to-noise ratios, and the small field of view requires mosaicking hundreds of images and accurate orthorectification. In addition, wind gusts and lower platform stability require appropriate geometric and radiometric corrections. Finally, modeling fluxes on days without images is still an issue for both satellite and UAS applications. This study focuses on designing an operational UAS-based monitoring system, including payload design and sensor calibration, based on routine collection of optical and thermal images in a Danish willow field, to jointly monitor ET and GPP dynamics continuously at daily time steps. The payload (<2 kg) consists of a multispectral camera (Tetra Mini-MCA6), a thermal infrared camera (FLIR Tau 2), a digital camera (Sony RX-100) used to retrieve accurate digital elevation models (DEMs) for multispectral and thermal image orthorectification, and a standard GNSS single-frequency receiver (UBlox) or a real-time kinematic double-frequency system (Novatel Inc. flexpack6+OEM628). Geometric calibration of the digital and multispectral cameras was conducted to recover the intrinsic camera parameters. After geometric calibration, accurate DEMs with vertical errors of about 10 cm could be retrieved.
Radiometric calibration of the multispectral camera was conducted with an integrating sphere (Labsphere CSTM-USS-2000C), and the laboratory calibration showed that the camera-measured radiance had a bias within ±4.8%. The thermal camera was calibrated using a black body at varying target and ambient temperatures, resulting in a laboratory accuracy with an RMSE of 0.95 K. A joint model of ET and GPP was applied using two parsimonious, physiologically based models: a modified version of the Priestley-Taylor Jet Propulsion Laboratory model (Fisher et al., 2008; Garcia et al., 2013) and a Light Use Efficiency approach (Potter et al., 1993). Both models estimate ET and GPP under optimum potential conditions, down-regulated by the same biophysical constraints dependent on remote sensing and atmospheric data to reflect multiple stresses. Vegetation indices were calculated from the multispectral data to assess vegetation conditions, while thermal infrared imagery was used to compute a thermal inertia index to infer soil moisture constraints. To interpolate radiometric temperature between flights, a prognostic Surface Energy Balance model (Margulis et al., 2001) based on the force-restore method was applied in a data assimilation scheme to obtain continuous ET and GPP fluxes. With this operational system, regular flight campaigns with a hexacopter (DJI S900) were conducted at a Danish willow flux site (Risø) over the 2016 growing season. The observed energy, water and carbon fluxes from the Risø eddy covariance flux tower were used to validate the model simulation. This UAS monitoring system is suitable for agricultural management and land-atmosphere interaction studies.
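Vegetation indices such as those mentioned above are simple band combinations; the most common, NDVI, is computed per pixel from the red and NIR reflectance bands (a generic sketch, not the study's code; the epsilon guard and sample values are assumptions):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index, computed per pixel
    from NIR and red reflectance; eps guards against division by
    zero over dark pixels."""
    nir = np.asarray(nir, float)
    red = np.asarray(red, float)
    return (nir - red) / (nir + red + eps)

# Illustrative reflectances: healthy vegetation has high NIR and low
# red reflectance, giving NDVI well above zero.
v = ndvi(0.5, 0.1)
```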
Robust camera calibration for sport videos using court models
NASA Astrophysics Data System (ADS)
Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang
2003-12-01
We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a-priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.
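The Hough transform step described above votes each court-line pixel into a (ρ, θ) accumulator; peaks correspond to line candidates. A minimal sketch returning the strongest line (not the authors' implementation; the accumulator resolution and image diagonal are assumptions):

```python
import numpy as np

def hough_lines(points, n_theta=180, rho_res=1.0, img_diag=512):
    """Vote each edge pixel (x, y) into a (rho, theta) accumulator
    using rho = x*cos(theta) + y*sin(theta); return the (rho, theta)
    of the cell with the most votes."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    n_rho = int(2 * img_diag / rho_res) + 1
    acc = np.zeros((n_rho, n_theta), int)
    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + img_diag) / rho_res).astype(int)
        acc[idx, np.arange(n_theta)] += 1
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r * rho_res - img_diag, thetas[t]
```

In the full algorithm each accumulator peak becomes a court-line candidate for the subsequent combinatorial model matching.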
Geometric calibration of lens and filter distortions for multispectral filter-wheel cameras.
Brauers, Johannes; Aach, Til
2011-02-01
High-fidelity color image acquisition with a multispectral camera utilizes optical filters to separate the visible electromagnetic spectrum into several passbands. This is often realized with a computer-controlled filter wheel, where each position is equipped with an optical bandpass filter. For each filter wheel position, a grayscale image is acquired and the passbands are finally combined to a multispectral image. However, the different optical properties and non-coplanar alignment of the filters cause image aberrations since the optical path is slightly different for each filter wheel position. As in a normal camera system, the lens causes additional wavelength-dependent image distortions called chromatic aberrations. When transforming the multispectral image with these aberrations into an RGB image, color fringes appear, and the image exhibits a pincushion or barrel distortion. In this paper, we address both the distortions caused by the lens and by the filters. Based on a physical model of the bandpass filters, we show that the aberrations caused by the filters can be modeled by displaced image planes. The lens distortions are modeled by an extended pinhole camera model, which results in a remaining mean calibration error of only 0.07 pixels. Using an absolute calibration target, we then geometrically calibrate each passband and compensate for both lens and filter distortions simultaneously. We show that both types of aberrations can be compensated and present detailed results on the remaining calibration errors.
NASA Astrophysics Data System (ADS)
Santos, C. Almeida; Costa, C. Oliveira; Batista, J.
2016-05-01
The paper describes a kinematic model-based solution to estimate simultaneously the calibration parameters of the vision system and the full motion (6-DOF) of large civil engineering structures, namely long-deck suspension bridges, from a sequence of stereo images captured by digital cameras. Using an arbitrary number of images and assuming smooth structure motion, an Iterated Extended Kalman Filter is used to recursively estimate the projection matrices of the cameras and the structure's full motion (displacement and rotation) over time, supporting structural health monitoring requirements. Results related to the performance evaluation, obtained by numerical simulation and in real experiments, are reported. The real experiments were carried out in indoor and outdoor environments using a reduced structure model to impose controlled motions. In both cases, the results obtained with a minimum setup comprising only two cameras and four non-coplanar tracking points showed highly accurate on-line camera calibration and full structure motion estimation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nazareth, D; Malhotra, H; French, S
Purpose: Breast radiotherapy, particularly electronic compensation, may involve large dose gradients and difficult patient positioning problems. We have developed a simple self-calibrating augmented-reality system, which assists in accurately and reproducibly positioning the patient by displaying her live image from a single camera superimposed on the correct perspective projection of her 3D CT data. Our method requires only a standard digital camera capable of live-view mode, installed in the treatment suite at an approximately known orientation and position (rotation R; translation T). Methods: A 10-sphere calibration jig was constructed and CT imaged to provide a 3D model. The (R,T) relating the camera to the CT coordinate system were determined by acquiring a photograph of the jig and optimizing an objective function, which compares the true image points to points calculated with a given candidate R and T geometry. Using this geometric information, 3D CT patient data, viewed from the camera's perspective, is plotted using a Matlab routine. This image data is superimposed onto the real-time patient image acquired by the camera, and displayed using standard live-view software. This enables the therapists to view both the patient's current and desired positions, and to guide the patient into the correct position. The method was evaluated using an in-house developed bolus-like breast phantom, mounted on a supporting platform, which could be tilted at various angles to simulate treatment-like geometries. Results: Our system allowed breast phantom alignment with an accuracy of about 0.5 cm and 1 ± 0.5 degree. Better resolution would be possible using a camera with higher-zoom capabilities. Conclusion: We have developed an augmented-reality system which combines a perspective projection of a CT image with a patient's real-time optical image.
This system has the potential to improve patient setup accuracy during breast radiotherapy, and could possibly be used for other disease sites as well.
NASA Astrophysics Data System (ADS)
Klaessens, John H.; van der Veen, Albert; Verdaasdonk, Rudolf M.
2017-03-01
Recently, low-cost smartphone-based thermal cameras are being considered for use in a clinical setting for monitoring physiological temperature responses such as body temperature change, local inflammation, perfusion changes or (burn) wound healing. These thermal cameras contain uncooled micro-bolometers with an internal calibration check and have a temperature resolution of 0.1 degree. For clinical applications a fast quality measurement before use is required (absolute temperature check), and quality control (stability, repeatability, absolute temperature, absolute temperature differences) should be performed regularly. Therefore, a calibrated temperature phantom has been developed, based on thermistor heating at both ends of a black-coated metal strip, to create a controllable temperature gradient from room temperature (26 °C) up to 100 °C. The absolute temperatures on the strip are determined with five software-controlled PT-1000 sensors using lookup tables. In this study, three FLIR ONE cameras and one high-end camera were checked with this temperature phantom. The results show relatively good agreement between both the low-cost and high-end cameras and the phantom temperature gradient, with temperature differences of 1 degree up to 6 degrees between the cameras and the phantom. The measurements were repeated to assess absolute temperature and temperature stability over the sensor area. Both low-cost and high-end thermal cameras measured relative temperature changes with high accuracy and absolute temperatures with constant deviations. Low-cost smartphone-based thermal cameras can be a good alternative to high-end thermal cameras for routine clinical measurements appropriate to the research question, provided regular calibration checks for quality control are performed.
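The lookup-table conversion described above maps a PT-1000 resistance reading to an absolute temperature. A minimal sketch using linear interpolation over a coarse table (the table values below approximate the standard PT-1000 characteristic and would be much denser in a real phantom controller):

```python
import numpy as np

def pt1000_temperature(resistance_ohm):
    """Convert a PT-1000 resistance reading to temperature (deg C)
    by linear interpolation in a lookup table. The few table points
    here approximate the standard PT-1000 curve; a real controller
    would use a denser table or the Callendar-Van Dusen equation."""
    table_r = np.array([1000.0, 1039.0, 1077.9, 1116.7, 1155.4, 1194.0])
    table_t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
    return float(np.interp(resistance_ohm, table_r, table_t))
```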
Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery
NASA Astrophysics Data System (ADS)
Kwoh, L. K.; Huang, X.; Tan, W. J.
2012-07-01
XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands at 12 m resolution: 0.52-0.60 µm (Green), 0.63-0.69 µm (Red) and 0.76-0.89 µm (NIR). In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, Red and Green). These CCDs were physically separated in the focal plane and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the three bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separations agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs. red and green vs. red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixels for the NIR vs. red and green vs. red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT currently operates. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
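The polynomial compensation described above fits low-order polynomials to the sampled attitude angles so the platform jitter can be evaluated at any scan-line time. A minimal sketch of the fitting step (the synthetic roll coefficients below are assumptions, not XSAT values):

```python
import numpy as np

def fit_attitude(times, angles, degree=3):
    """Fit a low-order polynomial to sampled attitude angles (roll,
    pitch or yaw) over the image acquisition, returning a callable
    polynomial that evaluates the angle at any scan-line time."""
    coeffs = np.polyfit(times, angles, degree)
    return np.poly1d(coeffs)

# Hypothetical smooth roll jitter over a 10 s acquisition:
t = np.linspace(0.0, 10.0, 50)
roll = 0.1 + 0.02 * t - 0.005 * t**2 + 0.001 * t**3
roll_poly = fit_attitude(t, roll, degree=3)
```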
Real-Time View Correction for Mobile Devices.
Schops, Thomas; Oswald, Martin R; Speciale, Pablo; Yang, Shuoran; Pollefeys, Marc
2017-11-01
We present a real-time method for rendering novel virtual camera views from given RGB-D (color and depth) data of a different viewpoint. Missing color and depth information due to incomplete input or disocclusions is efficiently inpainted in a temporally consistent way. The inpainting takes the location of strong image gradients into account as likely depth discontinuities. We present our method in the context of a view correction system for mobile devices, and discuss how to obtain a screen-camera calibration and options for acquiring depth input. Our method has use cases in both augmented and virtual reality applications. We demonstrate the speed of our system and the visual quality of its results in multiple experiments in the paper as well as in the supplementary video.
Multi-camera sensor system for 3D segmentation and localization of multiple mobile robots.
Losada, Cristina; Mazo, Manuel; Palazuelos, Sira; Pizarro, Daniel; Marrón, Marta
2010-01-01
This paper presents a method for obtaining the motion segmentation and 3D localization of multiple mobile robots in an intelligent space using a multi-camera sensor system. The set of calibrated and synchronized cameras are placed in fixed positions within the environment (intelligent space). The proposed algorithm for motion segmentation and 3D localization is based on the minimization of an objective function. This function includes information from all the cameras, and it does not rely on previous knowledge or invasive landmarks on board the robots. The proposed objective function depends on three groups of variables: the segmentation boundaries, the motion parameters and the depth. For the objective function minimization, we use a greedy iterative algorithm with three steps that, after initialization of segmentation boundaries and depth, are repeated until convergence.
Research on Geometric Calibration of Spaceborne Linear Array Whiskbroom Camera
Sheng, Qinghong; Wang, Qi; Xiao, Hui; Wang, Qing
2018-01-01
The geometric calibration of a spaceborne thermal-infrared camera with high spatial resolution and wide coverage can set benchmarks for providing accurate geographical coordinates for the retrieval of land surface temperature. Using linear-array whiskbroom Charge-Coupled Device (CCD) arrays to image the Earth makes it possible to obtain wide-swath thermal-infrared images with high spatial resolution. Focusing on the whiskbroom characteristics of equal time intervals and unequal angles, the present study proposes a spaceborne linear-array-scanning imaging geometric model, whilst calibrating temporal system parameters and whiskbroom angle parameters. With the help of YG-14, China's first satellite equipped with thermal-infrared cameras of high spatial resolution, Anyang imagery and Taiyuan imagery are used to conduct a geometric calibration experiment and a verification test, respectively. Results have shown that the plane positioning accuracy without ground control points (GCPs) is better than 30 pixels and the plane positioning accuracy with GCPs is better than 1 pixel. PMID:29337885
First in-flight results of Pleiades 1A innovative methods for optical calibration
NASA Astrophysics Data System (ADS)
Kubik, Philippe; Lebègue, Laurent; Fourest, Sébastien; Delvit, Jean-Marc; de Lussy, Françoise; Greslou, Daniel; Blanchet, Gwendoline
2017-11-01
The PLEIADES program is a space Earth observation system led by France under the leadership of the French Space Agency (CNES). Since it was successfully launched on December 17th, 2011, the Pleiades 1A high-resolution optical satellite has been thoroughly tested and validated during the commissioning phase led by CNES. The whole system has been designed to deliver submetric optical images to users whose needs were taken into account very early in the design process. This satellite opens a new era in Europe, since its off-nadir viewing capability delivers worldwide 2-day access, and its great agility makes it possible to image numerous targets, strips and stereo coverage from the same orbit. Its imaging capability of more than 450 images of 20 km x 20 km per day can fulfill a broad spectrum of applications for both civilian and defence users. For an Earth-observing satellite with no on-board calibration source, the commissioning phase is a critical search for well-characterized Earth landscapes and ground patterns that have to be imaged by the camera in order to compute or fit the parameters of the viewing models. It may take a long time to get the required scenes with no cloud, whilst atmosphere corrections need simultaneous measurements that are not always possible. The paper focuses on new in-flight calibration methods that were prepared before the launch in the framework of the PLEIADES program: they take advantage of the satellite agility, which can deeply relax the operational constraints and may improve calibration accuracy. Many performances of the camera were assessed thanks to dedicated innovative methods that were successfully validated during the commissioning period: Modulation Transfer Function (MTF), refocusing, absolute calibration, and line-of-sight stability were estimated on stars and on the Moon. Detector normalization and radiometric noise were computed on specific pictures of the Earth with a dedicated guidance profile.
The geometric viewing frame was determined with a particular image acquisition combining different views of the same target. All these new methods are expected to play a key role in the future, when active optics will need a sophisticated in-flight calibration strategy.
Viking lander camera radiometry calibration report, volume 2
NASA Technical Reports Server (NTRS)
Wolf, M. R.; Atwood, D. L.; Morrill, M. E.
1977-01-01
The requirements, performance validation, and interfaces for the RADCAM program, which converts Viking lander camera image data to radiometric units, were established. A proposed algorithm is described, and an appendix summarizing the planned reduction of camera test data is included.
Camera Trajectory from Wide Baseline Images
NASA Astrophysics Data System (ADS)
Havlena, M.; Torii, A.; Pajdla, T.
2008-09-01
Camera trajectory estimation, which is closely related to structure-from-motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. There are essential issues for reliable camera trajectory estimation, for instance, the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible) and camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens converter, are used. The hardware which we use in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on converter with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction.
We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius r of an image point to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of image points to 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions, including MSER, Harris Affine, and Hessian Affine, in the acquired images. These features are an alternative to popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in its standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used, because the viewpoint can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of consecutive projections can be made for wide baseline images. This makes feature detection, description, and matching much more time-consuming than for short baseline images and limits real-time operation to low frame rate sequences. 
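The two-parameter back-projection model above is simple enough to sketch directly. The following is a minimal illustration; the parameter values used in the test are made up for demonstration, not the paper's calibration results:

```python
import math

def radius_to_angle(r, a, b):
    """Micusik two-parameter fish-eye model: theta = a*r / (1 + b*r**2).

    Maps the radial distance r of an image point (from the distortion
    center) to the angle theta between its optical ray and the optical axis.
    """
    return a * r / (1.0 + b * r * r)

def pixel_to_ray(u, v, cu, cv, a, b):
    """Back-project pixel (u, v) to a unit 3D ray in the camera frame.

    (cu, cv) is the distortion center; a and b are the calibrated model
    parameters (illustrative placeholders in the usage below).
    """
    du, dv = u - cu, v - cv
    r = math.hypot(du, dv)
    if r == 0.0:
        return (0.0, 0.0, 1.0)       # principal ray
    theta = radius_to_angle(r, a, b)
    s = math.sin(theta) / r          # scale radial offset onto the unit sphere
    return (du * s, dv * s, math.cos(theta))
```

After calibration, applying `pixel_to_ray` to every matched feature yields the 3D optical rays that the subsequent relative-orientation steps operate on.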
Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches that is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling, as suggested in prior work, to draw 5-tuples from the list of tentative matches ordered ascendingly by the distance of their descriptors, which may help to reduce the number of samples in RANSAC. From each 5-tuple, relative orientation is computed by solving the 5-point minimal relative orientation problem for calibrated cameras. Often, there are several models which are each supported by a large number of matches. Thus the chance that the correct model, even if it has the largest support, will be found by running a single RANSAC is small. Prior work suggested generating models by randomized sampling, as in RANSAC, but using soft (kernel) voting for a parameter instead of looking for the maximal support. The best model is then selected as the one with the parameter closest to the maximum in the accumulator space. In our case, we vote in a two-dimensional accumulator for the estimated camera motion direction. However, unlike that work, we do not cast votes directly from each sampled epipolar geometry but from the best epipolar geometries recovered by ordered sampling in RANSAC. With our technique, we could go up to 98.5% contamination by mismatches with effort comparable to what simple RANSAC requires at 84% contamination. The relative camera orientation with the motion direction closest to the maximum in the voting space is finally selected. As already mentioned in the first paragraph, the uses of camera trajectory estimates are quite wide. We have previously introduced a technique for measuring the size of the camera translation relative to the observed scene, which uses the dominant apical angle computed at the reconstructed scene points and is robust against mismatches. 
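The soft-voting step can be sketched as follows. The accumulator size, kernel width, and vote footprint are illustrative choices, not the paper's actual settings:

```python
import numpy as np

def kernel_vote_direction(directions, n_az=72, n_el=36, sigma=1.5):
    """Soft (kernel) voting for the camera motion direction.

    `directions` holds one unit translation estimate per candidate
    epipolar geometry recovered by RANSAC. Rather than trusting the
    single geometry with maximal support, each estimate casts a
    Gaussian-weighted vote into a 2D azimuth/elevation accumulator,
    and the strongest cell wins.
    """
    acc = np.zeros((n_el, n_az))
    d = np.asarray(directions, dtype=float)
    az = np.arctan2(d[:, 1], d[:, 0])            # azimuth in [-pi, pi]
    el = np.arcsin(np.clip(d[:, 2], -1.0, 1.0))  # elevation in [-pi/2, pi/2]
    i = np.clip(((el + np.pi / 2) / np.pi * n_el).astype(int), 0, n_el - 1)
    j = ((az + np.pi) / (2 * np.pi) * n_az).astype(int) % n_az
    for ii, jj in zip(i, j):
        # Spread each vote over a small Gaussian footprint ("soft" voting).
        for di in range(-2, 3):
            for dj in range(-2, 3):
                w = np.exp(-(di * di + dj * dj) / (2.0 * sigma ** 2))
                acc[min(max(ii + di, 0), n_el - 1), (jj + dj) % n_az] += w
    ii, jj = np.unravel_index(np.argmax(acc), acc.shape)
    el_c = (ii + 0.5) / n_el * np.pi - np.pi / 2
    az_c = (jj + 0.5) / n_az * 2 * np.pi - np.pi
    return np.array([np.cos(el_c) * np.cos(az_c),
                     np.cos(el_c) * np.sin(az_c),
                     np.sin(el_c)])
```

Even with a majority of wrong candidate geometries, the accumulator peak tends to sit near the true translation direction, which is what makes the approach tolerant to very high mismatch rates.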
The experiments demonstrated that the measure can be used to improve the robustness of camera path computation and of object recognition for methods which use a geometric constraint, e.g. the ground plane, as is done for the detection of pedestrians. Using the camera trajectories, perspective cutouts with a stabilized horizon are constructed, and an arbitrary object recognition routine designed to work with images acquired by perspective cameras can be used without any further modifications.
MicroCameras and Photometers (MCP) on board the TARANIS satellite
NASA Astrophysics Data System (ADS)
Farges, T.; Hébert, P.; Le Mer-Dachard, F.; Ravel, K.; Gaillac, S.
2017-12-01
TARANIS (Tool for the Analysis of Radiations from lightNing and Sprites) is a CNES micro satellite. Its main objective is to study impulsive transfers of energy between the Earth's atmosphere and the space environment. It will be sun-synchronous at an altitude of 700 km and will be launched in 2019 for at least 2 years. Its payload is composed of several electromagnetic instruments covering different wavelengths (from gamma rays to radio waves, including optical). TARANIS instruments are currently in the calibration and qualification phase. The purpose here is to present the MicroCameras and Photometers (MCP) design, to show its performance after its recent characterization, and finally to discuss the scientific objectives and how we intend to address them with MCP observations. The MicroCameras, developed by Sodern, are dedicated to the spatial description of TLEs and their parent lightning. They are able to differentiate sprites from lightning thanks to two narrow bands ([757-767 nm] and [772-782 nm]) that provide simultaneous pairs of images of an event. Simulation results of the differentiation method will be shown. After calibration and tests, the MicroCameras have now been delivered to the CNES for integration on the payload. The Photometers, developed by Bertin Technologies, will provide temporal measurements and spectral characteristics of TLEs and lightning. They are a key instrument because of their capability to detect TLEs on board and then switch all the instruments of the scientific payload into their high-resolution acquisition mode. The Photometers use four spectral bands ([170-260 nm], [332-342 nm], [757-767 nm] and [600-900 nm]) and have the same field of view as the cameras. The remote-controlled parameters of the on-board TLE detection algorithm have been tuned before launch using the electronic board and simulated or real event waveforms. After calibration, the Photometers are now going through environmental tests. 
They will be delivered to the CNES for integration on the payload in September 2017.
An Unusual View: MISR sees the Moon
2017-08-17
The job of the Multiangle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite is to view Earth. For more than 17 years, its nine cameras have stared downward 24 hours a day, faithfully collecting images used to study Earth's surface and atmosphere. On August 5, however, MISR captured some very unusual data as the Terra satellite performed a backflip in space. This maneuver was performed to allow MISR and the other instruments on Terra to catch a glimpse of the Moon, something that has been done only once before, in 2003. Why task an elderly satellite with such a radical maneuver? Since we can be confident that the Moon's brightness has remained very constant over the mission, MISR's images of the Moon can be used as a check of the instrument's calibration, allowing an independent verification of the procedures used to correct the images for any changes the cameras have experienced over their many years in space. If changes in the cameras' responses to light aren't properly accounted for, the images captured by MISR would make it appear as if Earth were growing darker or lighter, which would throw off scientists' efforts to characterize air pollution, cloud cover and Earth's climate. Because of this, the MISR team uses several methods to calibrate the data, all of which involve imaging something with a known (or independently measured) brightness and correcting the images to match that brightness. Every month, MISR views two onboard panels of a special material called Spectralon, which reflects sunlight in a very particular way. Periodically, this calibration is checked by a field team, which measures the brightness of a flat, uniformly colored surface on Earth, usually a dry desert lakebed, as MISR flies overhead. The lunar maneuver offers a third opportunity to check the brightness calibration of MISR's images. 
While viewing Earth, MISR's cameras are fixed at nine different angles, with one (called An) pointed straight down, four canted forward (Af, Bf, Cf, and Df) and four angled backward (Aa, Ba, Ca, and Da). The A, B, C, and D cameras have different focal lengths, with the most oblique (D) cameras having the longest focal lengths in order to preserve spatial resolution on the ground. During the lunar maneuver, however, the spacecraft rotated so that each camera saw the almost-full Moon straight on. This means that the different focal lengths produce images with different resolutions, and the D cameras produce the sharpest images. These grayscale images were made with raw data from the red spectral band of each camera. Because the spacecraft was constantly rotating while these images were taken, the images are "smeared" in the vertical direction, producing an oval-shaped Moon. They have been corrected here to restore the Moon to its true circular shape. https://photojournal.jpl.nasa.gov/catalog/PIA21876
Calibration of Contactless Pulse Oximetry
Bartula, Marek; Bresch, Erik; Rocque, Mukul; Meftah, Mohammed; Kirenko, Ihor
2017-01-01
BACKGROUND: Contactless, camera-based photoplethysmography (PPG) interrogates shallower skin layers than conventional contact probes, either transmissive or reflective. This raises questions on the calibratability of camera-based pulse oximetry. METHODS: We made video recordings of the foreheads of 41 healthy adults at 660 and 840 nm, and remote PPG signals were extracted. Subjects were in normoxic, hypoxic, and low temperature conditions. Ratio-of-ratios were compared to reference Spo2 from 4 contact probes. RESULTS: A calibration curve based on artifact-free data was determined for a population of 26 individuals. For an Spo2 range of approximately 83% to 100% and discarding short-term errors, a root mean square error of 1.15% was found with an upper 99% one-sided confidence limit of 1.65%. Under normoxic conditions, a decrease in ambient temperature from 23 to 7°C resulted in a calibration error of 0.1% (±1.3%, 99% confidence interval) based on measurements for 3 subjects. PPG signal strengths varied strongly among individuals from about 0.9 × 10−3 to 4.6 × 10−3 for the infrared wavelength. CONCLUSIONS: For healthy adults, the results present strong evidence that camera-based contactless pulse oximetry is fundamentally feasible because long-term (eg, 10 minutes) error stemming from variation among individuals expressed as A*rms is significantly lower (<1.65%) than that required by the International Organization for Standardization standard (<4%) with the notion that short-term errors should be added. A first illustration of such errors has been provided with A**rms = 2.54% for 40 individuals, including 6 with dark skin. Low signal strength and subject motion present critical challenges that will have to be addressed to make camera-based pulse oximetry practically feasible. PMID:27258081
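The ratio-of-ratios computation at the heart of camera-based pulse oximetry can be sketched as follows. The linear calibration coefficients `a` and `b` below are generic textbook-style placeholders, not the curve fitted in the study:

```python
import numpy as np

def ratio_of_ratios(ppg_red, ppg_ir):
    """Compute the PPG ratio-of-ratios R = (AC/DC)_red / (AC/DC)_ir.

    ppg_red, ppg_ir: 1D arrays of raw PPG samples at the two wavelengths
    (660 and 840 nm in the study). AC is taken here as the standard
    deviation of the pulsatile component, DC as the mean.
    """
    ac_red, dc_red = np.std(ppg_red), np.mean(ppg_red)
    ac_ir, dc_ir = np.std(ppg_ir), np.mean(ppg_ir)
    return (ac_red / dc_red) / (ac_ir / dc_ir)

def spo2_from_r(r, a=110.0, b=25.0):
    """Empirical linear calibration SpO2 = a - b*R (illustrative
    coefficients; the paper derives its own curve against reference
    contact probes over roughly the 83-100% range)."""
    return a - b * r
```

In practice the camera PPG signals are far noisier than this sketch suggests, which is why the study's calibration is built only on artifact-free segments.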
NASA Astrophysics Data System (ADS)
Bell, J. F.; Godber, A.; McNair, S.; Caplinger, M. A.; Maki, J. N.; Lemmon, M. T.; Van Beek, J.; Malin, M. C.; Wellington, D.; Kinch, K. M.; Madsen, M. B.; Hardgrove, C.; Ravine, M. A.; Jensen, E.; Harker, D.; Anderson, R. B.; Herkenhoff, K. E.; Morris, R. V.; Cisneros, E.; Deen, R. G.
2017-07-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted 2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) "true color" images, multispectral images in nine additional bands spanning 400-1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration.
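The quoted IFOV and FOV figures follow from the focal lengths and detector geometry under the usual pinhole relations. A quick sketch, assuming the KAI-2020's nominal 7.4 µm pixel pitch (not stated in the abstract):

```python
import math

def ifov_mrad(pixel_pitch_m, focal_length_m):
    """Instantaneous field of view per pixel (small-angle approximation):
    IFOV = pitch / f, returned in milliradians."""
    return pixel_pitch_m / focal_length_m * 1e3

def fov_deg(n_pixels, pixel_pitch_m, focal_length_m):
    """Full field of view across n_pixels of the detector:
    FOV = 2 * atan(n * pitch / (2 * f)), returned in degrees."""
    return 2.0 * math.degrees(
        math.atan(n_pixels * pixel_pitch_m / (2.0 * focal_length_m)))
```

With a 7.4 µm pitch, the M-34 (34 mm) gives about 0.22 mrad IFOV and roughly 20° across 1648 pixels, while the M-100 (100 mm) gives 0.074 mrad, matching the stated specifications.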
Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng
2017-06-20
The linear array push broom imaging mode is widely used for high resolution optical satellites (HROS). Using double-cameras attached by a high-rigidity support along with push broom imaging is one method to enlarge the field of view while ensuring high resolution. High accuracy image mosaicking is the key factor of the geometrical quality of complete stitched satellite imagery. This paper proposes a high accuracy image mosaicking approach based on the big virtual camera (BVC) in the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; then, each single image strip obtained by each TDI-CCD detector can be re-projected to the virtual detector of the big virtual camera coordinate system using forward-projection and backward-projection to obtain the corresponding single virtual image. After an on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper subtly uses the concept of the big virtual camera to obtain a stitched image and the corresponding high accuracy rational function model (RFM) for concurrent post processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining the geometric accuracy.
NASA Astrophysics Data System (ADS)
Mitishita, E.; Debiasi, P.; Hainosz, F.; Centeno, J.
2012-07-01
Digital photogrammetric products from the integration of imagery and lidar datasets are a reality nowadays. When the imagery and lidar surveys are performed together and the camera is connected to the lidar system, direct georeferencing can be applied to compute the exterior orientation parameters of the images. Direct georeferencing of the images requires accurate interior orientation parameters for photogrammetric applications. Camera calibration is the procedure applied to compute the interior orientation parameters (IOPs). Calibration research has established that, to obtain accurate IOPs, the calibration must be performed under the same conditions as the photogrammetric survey. This paper presents the methodology and experimental results of in situ self-calibration using a simultaneous image block and lidar dataset. The calibration results are analyzed and discussed. To perform this research, a test field was established in an urban area. A set of signalized points was placed on the test field for use as check points or control points. The photogrammetric images and lidar dataset of the test field were acquired simultaneously. Four flight strips were used to obtain a cross layout, flown in opposite directions (W-E, E-W, N-S and S-N). A Kodak DSC Pro SLR/c digital camera was connected to the lidar system. The coordinates of the exposure stations were computed from the lidar trajectory. Different layouts of vertical control points were used in the calibration experiments. The experiments use vertical coordinates from a precise differential GPS survey or computed by an interpolation procedure using the lidar dataset. The positions of the exposure stations are used as control points in the calibration procedure to eliminate the linear dependency within the group of interior and exterior orientation parameters. 
This linear dependency arises in the calibration procedure when vertical images and a flat test field are used. The mathematical correlations of the interior and exterior orientation parameters are analyzed and discussed, as are the accuracies of the calibration experiments.
NPS assessment of color medical displays using a monochromatic CCD camera
NASA Astrophysics Data System (ADS)
Roehrig, Hans; Gu, Xiliang; Fan, Jiahua
2012-02-01
This paper presents an approach to Noise Power Spectrum (NPS) assessment of color medical displays without using an expensive imaging colorimeter. The R, G and B color uniform patterns were shown on the display under study and the images were taken using a high resolution monochromatic camera. A colorimeter was used to calibrate the camera images. Synthetic intensity images were formed by the weighted sum of the R, G, B and the dark screen images. Finally the NPS analysis was conducted on the synthetic images. The proposed method replaces an expensive imaging colorimeter for NPS evaluation, which also suggests a potential solution for routine color medical display QA/QC in the clinical area, especially when imaging of display devices is desired.
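A minimal sketch of the NPS computation on such a synthetic intensity image might look like this; it is a straightforward periodogram estimate, since the paper's exact detrending and averaging choices are not specified in the abstract:

```python
import numpy as np

def nps_2d(image, pixel_pitch=1.0):
    """Noise Power Spectrum of a uniform-pattern capture.

    Detrend by subtracting the mean, then take NPS = |FFT|^2 scaled by
    the pixel area over the number of samples. In the paper's setup,
    `image` would be the weighted sum of the calibrated R, G, B and
    dark-screen captures; here any 2D array works.
    """
    img = np.asarray(image, dtype=float)
    img = img - img.mean()                      # remove the DC component
    ny, nx = img.shape
    f = np.fft.fftshift(np.fft.fft2(img))       # center zero frequency
    return (np.abs(f) ** 2) * (pixel_pitch ** 2) / (nx * ny)
```

By Parseval's theorem, the NPS integrates to the total (detrended) noise power of the patch, which is a convenient sanity check on any implementation.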
NPS assessment of color medical image displays using a monochromatic CCD camera
NASA Astrophysics Data System (ADS)
Roehrig, Hans; Gu, Xiliang; Fan, Jiahua
2012-10-01
This paper presents an approach to Noise Power Spectrum (NPS) assessment of color medical displays without using an expensive imaging colorimeter. The R, G and B color uniform patterns were shown on the display under study and the images were taken using a high resolution monochromatic camera. A colorimeter was used to calibrate the camera images. Synthetic intensity images were formed by the weighted sum of the R, G, B and the dark screen images. Finally the NPS analysis was conducted on the synthetic images. The proposed method replaces an expensive imaging colorimeter for NPS evaluation, which also suggests a potential solution for routine color medical display QA/QC in the clinical area, especially when imaging of display devices is desired.
NASA Technical Reports Server (NTRS)
McDowell, Mark (Inventor); Glasgow, Thomas K. (Inventor)
1999-01-01
A system and a method for measuring three-dimensional velocities at a plurality of points in a fluid employing at least two cameras positioned approximately perpendicular to one another. The cameras are calibrated to accurately represent image coordinates in a world coordinate system. The two-dimensional views of the cameras are recorded for image processing and centroid coordinate determination. Any overlapping particle clusters are decomposed into constituent centroids. The tracer particles are tracked on a two-dimensional basis and then stereo matched to obtain three-dimensional locations of the particles as a function of time so that velocities can be measured therefrom. The stereo imaging velocimetry technique of the present invention provides a full-field, quantitative, three-dimensional map of any optically transparent fluid which is seeded with tracer particles.
NASA Astrophysics Data System (ADS)
Lim, Sungsoo; Lee, Seohyung; Kim, Jun-geon; Lee, Daeho
2018-01-01
The around-view monitoring (AVM) system is one of the major applications of advanced driver assistance systems and intelligent transportation systems. We propose an on-line calibration method which can compensate for misalignments in AVM systems. Most AVM systems use fisheye undistortion, inverse perspective transformation, and geometrical registration methods. To perform these procedures, the parameters for each process must be known; the procedure by which the parameters are estimated is referred to as the initial calibration. However, using only the initial calibration data, we cannot compensate for misalignments caused by changes in a car's equilibrium. Moreover, even small changes such as tire pressure levels, passenger weight, or road conditions can affect a car's equilibrium. Therefore, to compensate for this misalignment, an additional technique is necessary, specifically an on-line calibration method. On-line calibration can recalculate the homographies, which can correct any degree of misalignment, using the unique features of ordinary parking lanes. To extract features from the parking lanes, this method uses corner detection and a pattern matching algorithm. From the extracted features, homographies are estimated using random sample consensus and parameter estimation. Finally, the misaligned epipolar geometries are compensated via the estimated homographies. Thus, the proposed method can render image planes parallel to the ground. This method does not require any designated patterns and can be used whenever a car is placed in a parking lot. The experimental results show the robustness and efficiency of the method.
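The homography re-estimation step can be sketched with a plain DLT solver inside a RANSAC loop over matched lane-corner points. The iteration count and inlier threshold below are illustrative, not the paper's settings:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: homography H with dst ~ H @ src.

    src, dst: (N, 2) arrays of matched points, N >= 4.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def ransac_homography(src, dst, n_iter=200, thresh=2.0, seed=0):
    """RANSAC over 4-point samples: candidate matches from parking-lane
    corners give hypotheses, the largest inlier set wins, and the final
    homography is refit on all inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(len(src), 4, replace=False)
        H = homography_dlt(src[idx], dst[idx])
        p = np.c_[src, np.ones(len(src))] @ H.T
        proj = p[:, :2] / p[:, 2:3]
        err = np.linalg.norm(proj - dst, axis=1)
        inliers = err < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return homography_dlt(src[best_inliers], dst[best_inliers])
```

The RANSAC stage is what makes the on-line recalibration tolerant to spurious corner matches from shadows or worn lane markings.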
Method and apparatus for calibrating a tiled display
NASA Technical Reports Server (NTRS)
Chen, Chung-Jen (Inventor); Johnson, Michael J. (Inventor); Chandrasekhar, Rajesh (Inventor)
2001-01-01
A display system that can be calibrated and re-calibrated with a minimal amount of manual intervention. To accomplish this, one or more cameras are provided to capture an image of the display screen. The resulting captured image is processed to identify any non-desirable characteristics, including visible artifacts such as seams, bands, rings, etc. Once the non-desirable characteristics are identified, an appropriate transformation function is determined. The transformation function is used to pre-warp the input video signal that is provided to the display such that the non-desirable characteristics are reduced or eliminated from the display. The transformation function preferably compensates for spatial non-uniformity, color non-uniformity, luminance non-uniformity, and other visible artifacts.
Photogrammetry research for FAST eleven-meter reflector panel surface shape measurement
NASA Astrophysics Data System (ADS)
Zhou, Rongwei; Zhu, Lichun; Li, Weimin; Hu, Jingwen; Zhai, Xuebing
2010-10-01
In order to design and manufacture the measuring equipment for the Five-hundred-meter Aperture Spherical radio Telescope (FAST) active reflector, measurement of each reflector panel's surface shape was presented, static measurement of the whole neutral spherical network of nodes was performed, and real-time dynamic measurement of the cable network's deformation was undertaken. In the implementation of FAST, reflector panel surface shape detection was completed before installation of the eleven-meter reflector panels. A binocular vision system was constructed based on the binocular stereo vision method from machine vision, and the eleven-meter reflector panel surface shape was measured photogrammetrically. The cameras were calibrated with feature points. Under a linear camera model, a lighting spot array was used as the calibration standard pattern, and the intrinsic and extrinsic parameters were acquired. Images were collected with the two cameras for digital image processing and analysis, feature points were extracted with a characteristic point detection algorithm, and the characteristic points were matched based on the epipolar constraint method. The three-dimensional reconstructed coordinates of the feature points were analyzed, and the reflector panel surface shape was established by curve and surface fitting. The error of the reflector panel surface shape was calculated to realize automatic measurement of the panel surface shape. The results show that the unit reflector panel surface inspection accuracy was 2.30 mm, within the standard deviation error limit of 5.00 mm. Compared with the requirements of reflector panel machining precision, photogrammetry has fine precision and operational feasibility for eleven-meter reflector panel surface shape measurement for FAST.
SHORT COMMUNICATION: An image processing approach to calibration of hydrometers
NASA Astrophysics Data System (ADS)
Lorefice, S.; Malengo, A.
2004-06-01
The usual method adopted for multipoint calibration of glass hydrometers is based on the measurement of the buoyancy by hydrostatic weighing when the hydrometer is plunged in a reference liquid up to the scale mark to be calibrated. An image processing approach is proposed by the authors to align the relevant scale mark with the reference liquid surface level. The method uses image analysis with a data processing technique and takes into account the perspective error. For this purpose a CCD camera with a pixel matrix of 604H × 576V and a lens of 16 mm focal length were used. High accuracy in the hydrometer reading was obtained as the resulting reading uncertainty was lower than 0.02 mm, about a fifth of the usual figure with the visual reading made by an operator.
MATE: Machine Learning for Adaptive Calibration Template Detection
Donné, Simon; De Vylder, Jonas; Goossens, Bart; Philips, Wilfried
2016-01-01
The problem of camera calibration is two-fold. On the one hand, the parameters are estimated from known correspondences between the captured image and the real world. On the other, these correspondences themselves—typically in the form of chessboard corners—need to be found. Many distinct approaches for this feature template extraction are available, often of large computational and/or implementational complexity. We exploit the generalized nature of deep learning networks to detect checkerboard corners: our proposed method is a convolutional neural network (CNN) trained on a large set of example chessboard images, which generalizes several existing solutions. The network is trained explicitly against noisy inputs, as well as inputs with large degrees of lens distortion. The trained network that we evaluate is as accurate as existing techniques while offering improved execution time and increased adaptability to specific situations with little effort. The proposed method is not only robust against the types of degradation present in the training set (lens distortions, and large amounts of sensor noise), but also to perspective deformations, e.g., resulting from multi-camera set-ups. PMID:27827920
Multi-camera digital image correlation method with distributed fields of view
NASA Astrophysics Data System (ADS)
Malowany, Krzysztof; Malesa, Marcin; Kowaluk, Tomasz; Kujawinska, Malgorzata
2017-11-01
A multi-camera digital image correlation (DIC) method and system for measurements of large engineering objects with distributed, non-overlapping areas of interest are described. The data obtained with individual 3D DIC systems are stitched by an algorithm which utilizes the positions of fiducial markers determined simultaneously by Stereo-DIC units and a laser tracker. The proposed calibration method enables reliable determination of transformations between local (3D DIC) and global coordinate systems. The applicability of the method was proven during in situ measurements of a hall made of arch-shaped (18 m span) self-supporting metal plates. The proposed method is highly recommended for 3D measurements of the shape and displacements of large and complex engineering objects observed from multiple directions, and it provides suitable data accuracy for further advanced structural integrity analysis of such objects.
Deviation rectification for dynamic measurement of rail wear based on coordinate sets projection
NASA Astrophysics Data System (ADS)
Wang, Chao; Ma, Ziji; Li, Yanfu; Zeng, Jiuzhen; Jin, Tan; Liu, Hongli
2017-10-01
Dynamic measurement of rail wear using a laser imaging system suffers from random vibrations in the laser-based imaging sensor which cause distorted rail profiles. In this paper, a simple and effective method for rectifying profile deviation is presented to address this issue. There are two main steps: profile recognition and distortion calibration. According to the constant camera and projector parameters, efficient recognition of measured profiles is achieved by analyzing the geometric difference between normal profiles and distorted ones. For a distorted profile, by constructing coordinate sets projecting from it to the standard one on triple projecting primitives, including the rail head inner line, rail waist curve and rail jaw, iterative extrinsic camera parameter self-compensation is implemented. The distortion is calibrated by projecting the distorted profile onto the x-y plane of a measuring coordinate frame, which is parallel to the rail cross section, to eliminate the influence of random vibrations in the laser-based imaging sensor. As well as evaluating the implementation with comprehensive experiments, we also compare our method with other published works. The results exhibit the effectiveness and superiority of our method for the dynamic measurement of rail wear.
Evaluation of the Quality of Action Cameras with Wide-Angle Lenses in Uav Photogrammetry
NASA Astrophysics Data System (ADS)
Hastedt, H.; Ekkel, T.; Luhmann, T.
2016-06-01
The application of light-weight cameras in UAV photogrammetry is required due to restrictions in payload. In general, consumer cameras with a normal lens type are applied to a UAV system. The availability of action cameras, like the GoPro Hero4 Black, with a wide-angle (fish-eye) lens offers new perspectives in UAV projects. In these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. The GoPro Hero4 is evaluated using different acquisition modes. It is investigated to what extent the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to the evaluation processes in UAV photogrammetry. Therefore, different calibration setups and processing procedures are assessed and discussed. Additionally, pre-correction of the initial distortion by GoPro Studio and its application to photogrammetric purposes is evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan. It is analysed to what extent a pre-calibration and pre-correction of a GoPro Hero4 will reinforce the reliability and accuracy of a flight scenario.
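The idealized geometry behind pre-correcting fish-eye footage for perspective processing can be sketched as follows. This uses the pure equidistant model; a real GoPro lens additionally needs the polynomial distortion terms estimated during calibration:

```python
import math

def fisheye_to_perspective_radius(r_f, f):
    """Map an equidistant fish-eye radial distance to its rectilinear
    (perspective) equivalent: theta = r_f / f, then r_p = f * tan(theta).

    r_f is the distance of a pixel from the image center and f the focal
    length, both in the same units. Valid for theta < 90 degrees.
    """
    theta = r_f / f
    return f * math.tan(theta)

def perspective_to_fisheye_radius(r_p, f):
    """Inverse mapping: r_f = f * atan(r_p / f)."""
    return f * math.atan(r_p / f)
```

Pre-correction tools effectively resample the image along this radial mapping, after which standard pinhole-model calibration and bundle adjustment can be applied.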
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taneja, S; Fru, L Che; Desai, V
Purpose: It is now commonplace to handle treatments of hyperthyroidism using iodine-131 as an outpatient procedure due to lower costs and less stringent federal regulations. The Nuclear Regulatory Commission has recently updated release guidelines for these procedures, but there is still a large uncertainty in the dose to the public. Current guidelines to minimize dose to the public require patients to remain isolated after treatment. The purpose of this study was to use a low-cost common device, such as a cell phone, to estimate the exposure emitted from a patient to the general public. Methods: Measurements were performed using an Apple iPhone 3GS and a Cs-137 irradiator. The charge-coupled device (CCD) camera on the phone was irradiated at exposure rates ranging from 0.1 mR/hr to 100 mR/hr, and 30-second videos were taken during irradiation with the camera lens covered by electrical tape. Interactions were detected as white pixels on a black background in each video. Both single threshold (ST) and colony counting (CC) methods were implemented in MATLAB®. Calibration curves were determined by comparing the total pixel intensity output from each method to the known exposure rate. Results: The calibration curve showed a linear relationship above 5 mR/hr for both analysis techniques. The number of events counted per unit exposure rate within the linear region was 19.5 ± 0.7 events/mR and 8.9 ± 0.4 events/mR for the ST and CC methods, respectively. Conclusion: Two algorithms were developed and show a linear relationship between photons detected by a CCD camera and low exposure rates in the range of 5 mR/hr to 100 mR/hr. Future work aims to refine this model by investigating the dose-rate and energy dependencies of the camera response. This algorithm allows for quantitative monitoring of exposure from patients treated with iodine-131 using a simple device outside the hospital.
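The single-threshold counting and linear calibration described above can be sketched as follows. The pixel threshold is an illustrative choice, while 19.5 events/mR is the ST calibration reported in the abstract:

```python
import numpy as np

def count_events(frames, threshold=50):
    """Single-threshold (ST) event counting: pixels whose value exceeds
    `threshold` in dark (lens-covered) video frames are counted as
    radiation interactions. `frames` is an (n, h, w) uint8 stack."""
    return int((np.asarray(frames) > threshold).sum())

def exposure_rate(frames, duration_s, events_per_mR=19.5):
    """Estimate the exposure rate in mR/hr from the event count, using
    the linear ST calibration (19.5 events/mR in the abstract)."""
    mR = count_events(frames) / events_per_mR
    return mR / (duration_s / 3600.0)
```

This counts every bright pixel individually; the colony-counting (CC) variant would instead group connected bright pixels into single events, which is why its events/mR slope is lower.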
Method and apparatus for reading meters from a video image
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, T.J.; Ferguson, J.J.
1995-12-31
A method and system enable acquisition of data about an environment from one or more meters using video images. One or more meters are imaged by a video camera and the video signal is digitized. Then, each region of the digital image which corresponds to the indicator of the meter is calibrated and the video signal is analyzed to determine the value indicated by each meter indicator. Finally, from the value indicated by each meter indicator in the calibrated region, a meter reading is generated. The method and system offer the advantages of automatic data collection in a relatively non-intrusive manner without making any complicated or expensive electronic connections, and without requiring intensive manpower.
Method and apparatus for reading meters from a video image
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, T.J.; Ferguson, J.J.
1997-09-30
A method and system to enable acquisition of data about an environment from one or more meters using video images. One or more meters are imaged by a video camera and the video signal is digitized. Then, each region of the digital image which corresponds to the indicator of the meter is calibrated and the video signal is analyzed to determine the value indicated by each meter indicator. Finally, from the value indicated by each meter indicator in the calibrated region, a meter reading is generated. The method and system offer the advantages of automatic data collection in a relatively non-intrusive manner without making any complicated or expensive electronic connections, and without requiring intensive manpower. 1 fig.
Reconstruction method for fringe projection profilometry based on light beams.
Li, Xuexing; Zhang, Zhijiang; Yang, Chen
2016-12-01
A novel reconstruction method for fringe projection profilometry, based on light beams, is proposed and verified by experiments. Commonly used calibration techniques require either projector calibration parameters or reference planes placed at many known positions. Introducing projector calibration can reduce the accuracy of the reconstruction result, and setting the reference planes at many known positions is a time-consuming process. Therefore, in this paper, a reconstruction method that does not require projector parameters is proposed, and only two reference planes are introduced. A series of light beams, each determined by the subpixel point-to-point map on the two reference planes, combined with their reflected light beams determined by the camera model, are used to calculate the 3D coordinates of the reconstruction points. Furthermore, the bundle adjustment strategy and the complementary gray-code phase-shifting method are utilized to ensure accuracy and stability. Qualitative and quantitative comparisons as well as experimental tests demonstrate the performance of the proposed approach, with a measurement accuracy of about 0.0454 mm.
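A light beam fixed by its subpixel intersections with the two reference planes is a 3D line, and the reconstructed point lies where it meets the camera ray. In practice the two lines are nearly but not exactly intersecting, so a common choice (assumed here, not stated in the abstract) is the midpoint of the shortest segment between them:

```python
import numpy as np

def closest_point_between_lines(p1, d1, p2, d2):
    """Midpoint of the shortest segment between lines p1 + t*d1 and
    p2 + s*d2 (assumes the lines are not parallel)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    # Minimize |(p1 + t*d1) - (p2 + s*d2)|^2 over t and s.
    t = (b * (d2 @ w) - c * (d1 @ w)) / denom
    s = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

# Illustrative beam through its hit points on two reference planes.
hit_plane1 = np.array([2.0, 3.0, 0.0])    # plane z = 0
hit_plane2 = np.array([2.5, 3.5, 100.0])  # plane z = 100
beam_dir = hit_plane2 - hit_plane1
```

The camera ray would come from the calibrated camera model; here both lines are purely synthetic.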
NASA Astrophysics Data System (ADS)
Alves, J.; Saraiva, A. C. V.; Campos, L. Z. D. S.; Pinto, O., Jr.; Antunes, L.
2014-12-01
This work presents a method for evaluating the location accuracy of the Lightning Location Systems (LLS) in operation in southeastern Brazil, using natural cloud-to-ground (CG) lightning flashes. This is done with a network of multiple high-speed cameras (the RAMMER network) installed in the Paraiba Valley region - SP - Brazil. The RAMMER network (Automated Multi-camera Network for Monitoring and Study of Lightning) is composed of four high-speed cameras operating at 2,500 frames per second. Three stationary black-and-white (B&W) cameras were situated in the cities of São José dos Campos and Caçapava. A fourth color camera was mobile (installed in a car), but operated in a fixed location during the observation period, within the city of São José dos Campos. The average distance among cameras was 13 kilometers. Each RAMMER sensor position was determined so that the network could observe the same lightning flash from different angles, and all recorded videos were GPS (Global Positioning System) time stamped, allowing comparisons of events between cameras and the LLS. Each RAMMER sensor is composed of a computer, a Phantom version 9.1 high-speed camera, and a GPS unit. The lightning cases analyzed in the present work were observed by at least two cameras, their positions were visually triangulated, and the results were compared with the BrasilDAT network, during the summer seasons of 2011/2012 and 2012/2013. The visual triangulation method is presented in detail. The calibration procedure showed an accuracy of 9 meters between the accurate GPS position of the triangulated object and the result of the visual triangulation method. Lightning return stroke positions estimated with the visual triangulation method were compared with LLS locations. Differences between solutions were not greater than 1.8 km.
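The core of visual triangulation from two cameras is intersecting two azimuth bearings on the ground plane. The sketch below is a generic two-ray intersection, not the paper's actual procedure; camera positions, angles, and the planar simplification are all illustrative assumptions.

```python
import math

def triangulate(p1, az1, p2, az2):
    """Intersect two azimuth bearings (radians, measured from the +x axis)
    observed from camera positions p1 and p2; returns the (x, y) position.
    Assumes the bearings are not parallel."""
    d1 = (math.cos(az1), math.sin(az1))
    d2 = (math.cos(az2), math.sin(az2))
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2D cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

With more than two cameras, a least-squares intersection of all bearings would reduce the sensitivity to any single camera's pointing error.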
Measurement of luminance noise and chromaticity noise of LCDs with a colorimeter and a color camera
NASA Astrophysics Data System (ADS)
Roehrig, H.; Dallas, W. J.; Krupinski, E. A.; Redford, Gary R.
2007-09-01
This communication focuses on the physical evaluation of the image quality of displays for applications in medical imaging. In particular, we were interested in the luminance noise as well as the chromaticity noise of LCDs. Luminance noise has been encountered in the study of monochrome LCDs for some time, but chromaticity noise is a new type of noise, first encountered when monochrome and color LCDs were compared in an ROC study. In the present study, one color and one monochrome 3-megapixel LCD were studied. Both were DICOM calibrated with equal dynamic range. We used a Konica Minolta Chroma Meter CS-200 as well as a Foveon color camera to estimate luminance and chrominance variations of the displays. We also used a simulation experiment to estimate luminance noise. The measurements with the colorimeter were consistent. The measurements with the Foveon color camera were very preliminary, as color cameras had never been used for image quality measurements; however, they were extremely promising. The measurements with the colorimeter and the simulation results showed that the luminance and chromaticity noise of the color LCD were larger than those of the monochrome LCD. Provided that an adequate calibration method and an image QA/QC program for color displays are available, we expect color LCDs may be ready for radiology in the very near future.
Design, implementation and accuracy of a prototype for medical augmented reality.
Pandya, Abhilash; Siadat, Mohammad-Reza; Auner, Greg
2005-01-01
This paper is focused on prototype development and accuracy evaluation of a medical Augmented Reality (AR) system. The accuracy of such a system is of critical importance for medical use, and is hence considered in detail. We analyze the individual error contributions and the system accuracy of the prototype. A passive articulated arm is used to track a calibrated end-effector-mounted video camera. The live video view is superimposed in real time with the synchronized graphical view of CT-derived segmented object(s) of interest within a phantom skull. The AR accuracy mostly depends on the accuracy of the tracking technology, the registration procedure, the camera calibration, and the image scanning device (e.g., a CT or MRI scanner). The accuracy of the Microscribe arm was measured to be 0.87 mm. After mounting the camera on the tracking device, the AR accuracy was measured to be 2.74 mm on average (standard deviation = 0.81 mm). After using data from a 2-mm-thick CT scan, the AR error remained essentially the same at an average of 2.75 mm (standard deviation = 1.19 mm). For neurosurgery, the acceptable error is approximately 2-3 mm, and our prototype approaches these accuracy requirements. The accuracy could be increased with a higher-fidelity tracking system and improved calibration and object registration. The design and methods of this prototype device can be extrapolated to current medical robotics (due to the kinematic similarity) and neuronavigation systems.
Preface: The Chang'e-3 lander and rover mission to the Moon
NASA Astrophysics Data System (ADS)
Ip, Wing-Huen; Yan, Jun; Li, Chun-Lai; Ouyang, Zi-Yuan
2014-12-01
The Chang'e-3 (CE-3) lander and rover mission to the Moon was an intermediate step in China's lunar exploration program, which will be followed by a sample return mission. The lander was equipped with a number of remote-sensing instruments including a pair of cameras (Landing Camera and Terrain Camera) for recording the landing process and surveying terrain, an extreme ultraviolet camera for monitoring activities in the Earth's plasmasphere, and a first-ever Moon-based ultraviolet telescope for astronomical observations. The Yutu rover successfully carried out close-up observations with the Panoramic Camera, mineralogical investigations with the VIS-NIR Imaging Spectrometer, study of elemental abundances with the Active Particle-induced X-ray Spectrometer, and pioneering measurements of the lunar subsurface with Lunar Penetrating Radar. This special issue provides a collection of key information on the instrumental designs, calibration methods and data processing procedures used by these experiments with a perspective of facilitating further analyses of scientific data from CE-3 in preparation for future missions.
Roberti, Joshua A.; SanClements, Michael D.; Loescher, Henry W.; Ayres, Edward
2014-01-01
Even though fine-root turnover is a highly studied topic, it is often poorly understood as a result of uncertainties inherent in its sampling, e.g., quantifying spatial and temporal variability. While many methods exist to quantify fine-root turnover, use of minirhizotrons has increased over the last two decades, making sensor errors another source of uncertainty. Currently, no standardized methodology exists to test and compare minirhizotron camera capability, imagery, and performance. This paper presents a reproducible, laboratory-based method by which minirhizotron cameras can be tested and validated in a traceable manner. The performance of camera characteristics was identified and test criteria were developed: we quantified the precision of camera location for successive images, estimated the trueness and precision of each camera's ability to quantify root diameter and root color, and also assessed the influence of heat dissipation introduced by the minirhizotron cameras and electrical components. We report detailed and defensible metrology analyses that examine the performance of two commercially available minirhizotron cameras. These cameras performed differently with regard to the various test criteria and uncertainty analyses. We recommend a defensible metrology approach to quantify the performance of minirhizotron camera characteristics and determine sensor-related measurement uncertainties prior to field use. This approach is also extensible to other digital imagery technologies. In turn, these approaches facilitate a greater understanding of measurement uncertainties (signal-to-noise ratio) inherent in the camera performance and allow such uncertainties to be quantified and mitigated so that estimates of fine-root turnover can be more confidently quantified. PMID:25391023
Videogrammetric Model Deformation Measurement Technique
NASA Technical Reports Server (NTRS)
Burner, A. W.; Liu, Tian-Shu
2001-01-01
The theory, methods, and applications of the videogrammetric model deformation (VMD) measurement technique used at NASA for wind tunnel testing are presented. The VMD technique, based on non-topographic photogrammetry, can determine static and dynamic aeroelastic deformation and attitude of a wind-tunnel model. Hardware of the system includes a video-rate CCD camera, a computer with an image acquisition frame grabber board, illumination lights, and retroreflective or painted targets on a wind tunnel model. Custom software includes routines for image acquisition, target-tracking/identification, target centroid calculation, camera calibration, and deformation calculations. Applications of the VMD technique at five large NASA wind tunnels are discussed.
Pancam: A Multispectral Imaging Investigation on the NASA 2003 Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.
2003-01-01
One of the six science payload elements carried on each of the NASA Mars Exploration Rovers (MER; Figure 1) is the Panoramic Camera System, or Pancam. Pancam consists of three major components: a pair of digital CCD cameras, the Pancam Mast Assembly (PMA), and a radiometric calibration target. The PMA provides the azimuth and elevation actuation for the cameras as well as a 1.5-meter-high vantage point from which to image. The calibration target provides a set of reference color and grayscale standards for calibration validation, and a shadow post for quantification of the direct vs. diffuse illumination of the scene. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover in up to 12 unique wavelengths. The major characteristics of Pancam are summarized.
Ensuring long-term stability of infrared camera absolute calibration.
Kattnig, Alain; Thetas, Sophie; Primot, Jérôme
2015-07-13
Absolute calibration of cryogenic 3-5 µm and 8-10 µm infrared cameras is notoriously unstable and thus has to be repeated before actual measurements. Moreover, the signal-to-noise ratio of the imagery is lowered, decreasing its quality. These performance degradations strongly lessen the suitability of infrared imaging. These defects are often blamed on detectors reaching a different "response state" after each return to cryogenic conditions, even after accounting for the detrimental effects of imperfect stray-light management. We show here that the detectors are not to blame and that the culprit can instead dwell in the proximity electronics. We identify an unexpected source of instability in the initial voltage of the integration capacitor of the detectors. We then show that this parameter can be easily measured and taken into account. In this way we demonstrate that a one-month-old calibration of a 3-5 µm camera has retained its validity.
Parameterizations for reducing camera reprojection error for robot-world hand-eye calibration
USDA-ARS?s Scientific Manuscript database
Accurate robot-world, hand-eye calibration is crucial to automation tasks. In this paper, we discuss the robot-world, hand-eye calibration problem, which has been modeled as the linear relationship AX = ZB, where X and Z are the unknown calibration matrices composed of rotation and translation ...
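The AX = ZB model can be sanity-checked with 4x4 homogeneous transforms: given ground-truth X and Z, each robot pose B implies a consistent camera pose A = Z B X⁻¹. The frame interpretations in the comments and all numeric values are illustrative assumptions, not taken from the manuscript.

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Ground-truth calibration matrices (illustrative values only).
X = make_T(rot_z(0.3), [0.1, 0.0, 0.2])    # e.g. robot-world transform
Z = make_T(rot_z(-0.5), [0.0, 0.05, 0.1])  # e.g. hand-eye transform
# A robot pose B and the camera pose A consistent with AX = ZB:
B = make_T(rot_z(1.1), [0.4, -0.2, 0.3])
A = Z @ B @ np.linalg.inv(X)
residual = np.linalg.norm(A @ X - Z @ B)
```

A real calibration runs this logic in reverse: many measured (A, B) pairs are collected and X, Z are solved for, e.g. by minimizing the reprojection error discussed in the title.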
New technology and techniques for x-ray mirror calibration at PANTER
NASA Astrophysics Data System (ADS)
Freyberg, Michael J.; Budau, Bernd; Burkert, Wolfgang; Friedrich, Peter; Hartner, Gisela; Misaki, Kazutami; Mühlegger, Martin
2008-07-01
The PANTER X-ray Test Facility has been utilized successfully for developing and calibrating X-ray astronomical instrumentation for observatories such as ROSAT, Chandra, XMM-Newton, Swift, etc. Future missions like eROSITA, SIMBOL-X, or XEUS require improved spatial resolution and broader energy band pass, both for optics and for cameras. Calibration campaigns at PANTER have made use of flight spare instrumentation for space applications; here we report on a new dedicated CCD camera for on-ground calibration, called TRoPIC. As the CCD is similar to ones used for eROSITA (pn-type, back-illuminated, 75 μm pixel size, frame store mode, 450 μm wafer thickness, etc.) it can serve as a prototype for eROSITA camera development. New techniques enable and enhance the analysis of measurements of eROSITA shells or silicon pore optics. Specifically, we show how sub-pixel resolution can be utilized to improve spatial resolution and subsequently the characterization of mirror shell quality and of point spread function parameters in particular, also relevant for position reconstruction of astronomical sources in orbit.
DeLorenzo, Christine; Papademetris, Xenophon; Staib, Lawrence H.; Vives, Kenneth P.; Spencer, Dennis D.; Duncan, James S.
2010-01-01
During neurosurgery, nonrigid brain deformation prevents preoperatively-acquired images from accurately depicting the intraoperative brain. Stereo vision systems can be used to track intraoperative cortical surface deformation and update preoperative brain images in conjunction with a biomechanical model. However, these stereo systems are often plagued with calibration error, which can corrupt the deformation estimation. In order to decouple the effects of camera calibration from the surface deformation estimation, a framework that can solve for disparate and often competing variables is needed. Game theory, which was developed to handle decision making in this type of competitive environment, has been applied to various fields from economics to biology. In this paper, game theory is applied to cortical surface tracking during neocortical epilepsy surgery and used to infer information about the physical processes of brain surface deformation and image acquisition. The method is successfully applied to eight in vivo cases, resulting in an 81% decrease in mean surface displacement error. This includes a case in which some of the initial camera calibration parameters had errors of 70%. Additionally, the advantages of using a game theoretic approach in neocortical epilepsy surgery are clearly demonstrated in its robustness to initial conditions. PMID:20129844
Mars Exploration Rover Navigation Camera in-flight calibration
NASA Astrophysics Data System (ADS)
Soderblom, Jason M.; Bell, James F.; Johnson, Jeffrey R.; Joseph, Jonathan; Wolff, Michael J.
2008-06-01
The Navigation Camera (Navcam) instruments on the Mars Exploration Rover (MER) spacecraft provide support for both tactical operations as well as scientific observations where color information is not necessary: large-scale morphology, atmospheric monitoring including cloud observations and dust devil movies, and context imaging for both the thermal emission spectrometer and the in situ instruments on the Instrument Deployment Device. The Navcams are a panchromatic stereoscopic imaging system built using identical charge-coupled device (CCD) detectors and nearly identical electronics boards as the other cameras on the MER spacecraft. Previous calibration efforts were primarily focused on providing a detailed geometric calibration in line with the principal function of the Navcams, to provide data for the MER navigation team. This paper provides a detailed description of a new Navcam calibration pipeline developed to provide an absolute radiometric calibration that we estimate to have an absolute accuracy of 10% and a relative precision of 2.5%. Our calibration pipeline includes steps to model and remove the bias offset, the dark current charge that accumulates in both the active and readout regions of the CCD, and the shutter smear. It also corrects pixel-to-pixel responsivity variations using flat-field images, and converts from raw instrument-corrected digital number values per second to units of radiance (W m^-2 nm^-1 sr^-1), or to radiance factor (I/F). We also describe here the initial results of two applications where radiance-calibrated Navcam data provide unique information for surface photometric and atmospheric aerosol studies.
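The sequence of corrections named in the abstract (bias offset, dark current, flat field, DN/s to radiance) follows a standard order that can be sketched as below. This is a generic toy pipeline under assumed names and values, not the actual MER Navcam code, and it omits the readout-region dark charge and shutter-smear terms.

```python
import numpy as np

def calibrate_frame(raw, bias, dark_rate, t_exp, flat, rad_coeff):
    """Toy radiometric calibration: bias removal, dark-current subtraction,
    flat-field correction, then DN/s -> radiance (W m^-2 nm^-1 sr^-1)
    via an assumed responsivity coefficient."""
    frame = raw.astype(float) - bias   # remove bias offset
    frame -= dark_rate * t_exp         # remove accumulated dark charge
    frame /= flat                      # pixel-to-pixel responsivity (flat field)
    dn_per_s = frame / t_exp           # instrument-corrected DN per second
    return dn_per_s * rad_coeff       # convert to radiance
```

Converting instead to radiance factor (I/F) would divide the radiance by the solar irradiance at the scene.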
NASA Astrophysics Data System (ADS)
Dong, Shidu; Yang, Xiaofan; He, Bo; Liu, Guojin
2006-11-01
Radiance coming from the interior of an uncooled infrared camera has a significant effect on the measured value of the temperature of the object. This paper presents a three-phase compensation scheme for coping with this effect. The first phase acquires the calibration data and forms the calibration function by least-squares fitting. Likewise, the second phase obtains the compensation data and builds the compensation function by fitting. With the aid of these functions, the third phase determines the temperature of the object of interest at any given ambient temperature. It is known that acquiring the compensation data of a camera is very time-consuming. To obtain the compensation data at a reasonable time cost, we propose a transplantable scheme. The idea of this scheme is to calculate the ratio between the central pixel's responsivity to the interior radiance of the child camera and that of the mother camera, and then to determine the compensation data of the child camera using this ratio and the compensation data of the mother camera. Experimental results show that both the child camera and the mother camera can measure the temperature of the object with an error of no more than 2°C.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tokurei, Shogo, E-mail: shogo.tokurei@gmail.com; Morishita, Junji, E-mail: junjim@med.kyushu-u.ac.jp
Purpose: The aim of this study is to propose a method for the quantitative evaluation of image quality of both monochrome and color liquid-crystal displays (LCDs) using a commercially available color digital camera. Methods: The intensities of the unprocessed red (R), green (G), and blue (B) signals of a camera vary depending on the spectral sensitivity of the image sensor used in the camera. For consistent evaluation of image quality for both monochrome and color LCDs, the unprocessed RGB signals of the camera were converted into gray scale signals that corresponded to the luminance of the LCD. Gray scale signals for the monochrome LCD were evaluated by using only the green channel signals of the camera. For the color LCD, the RGB signals of the camera were converted into gray scale signals by employing weighting factors (WFs) for each RGB channel. A line image displayed on the color LCD was simulated on the monochrome LCD by using a software application for subpixel driving in order to verify the WF-based conversion method. Furthermore, the results obtained by different types of commercially available color cameras and a photometric camera were compared to examine the consistency of the authors’ method. Finally, image quality for both the monochrome and color LCDs was assessed by measuring modulation transfer functions (MTFs) and Wiener spectra (WS). Results: The authors’ results demonstrated that the proposed method for calibrating the spectral sensitivity of the camera resulted in a consistent and reliable evaluation of the luminance of monochrome and color LCDs. The MTFs and WS showed different characteristics for the two LCD types owing to difference in the subpixel structure. The MTF in the vertical direction of the color LCD was superior to that of the monochrome LCD, although the WS in the vertical direction of the color LCD was inferior to that of the monochrome LCD as a result of luminance fluctuations in RGB subpixels.
Conclusions: The authors’ method based on the use of a commercially available color camera is useful to evaluate and understand the display performances of both monochrome and color LCDs in radiology departments.
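The weighting-factor conversion from camera RGB signals to a gray scale signal is a weighted channel sum. The weights below are illustrative placeholders; the paper's actual WFs depend on each camera's spectral sensitivity and are not reproduced here.

```python
import numpy as np

def rgb_to_gray(rgb, weights=(0.2, 0.7, 0.1)):
    """Weighted sum of the camera's R, G, B channels. For a monochrome
    LCD the paper uses only the green channel, i.e. weights (0, 1, 0);
    the default color weights here are assumed, not the authors' WFs."""
    w = np.asarray(weights, dtype=float)
    return np.tensordot(np.asarray(rgb, dtype=float), w, axes=([-1], [0]))
```

The same function applies per pixel to a whole H x W x 3 image, since the weighted sum is taken over the last axis.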
Wijeisnghe, Ruchire Eranga Henry; Cho, Nam Hyun; Park, Kibeom; Shin, Yongseung; Kim, Jeehyun
2013-12-01
In this study, we demonstrate an enhanced spectral calibration method for 1.3 μm spectral-domain optical coherence tomography (SD-OCT). The calibration method using a wavelength filter simplifies the SD-OCT system, and the axial resolution and the overall speed of the OCT system can be dramatically improved as well. An externally connected wavelength filter is utilized to obtain the mapping between wavenumber and pixel position. During the calibration process the wavelength filter is placed after a broadband source, connected through an optical circulator. The filtered spectrum, with a narrow line width of 0.5 nm, is detected using a line-scan camera. The method does not require a filter or a software recalibration algorithm for imaging, as it simply resamples the OCT signal from the detector array without employing rescaling or interpolation methods. One of the main drawbacks of SD-OCT, the broadening of the point spread functions (PSFs) with increasing imaging depth, can be compensated by increasing the wavenumber-linearization order. The sensitivity of our system was measured at 99.8 dB at an imaging depth of 2.1 mm, compared with the uncompensated case.
NASA Astrophysics Data System (ADS)
Daakir, M.; Pierrot-Deseilligny, M.; Bosser, P.; Pichard, F.; Thom, C.; Rabot, Y.
2016-03-01
Nowadays, Unmanned Aerial Vehicle (UAV) on-board photogrammetry is experiencing significant growth, due to the democratization of drone use in the civilian sector and to changes in the regulations governing the inclusion of UAVs in the airspace, which have become favorable to the development of professional activities. The fields of application of photogrammetry are diverse, for instance: architecture, geology, archaeology, mapping, industrial metrology, etc. Our research concerns the latter area. Vinci-Construction-Terrassement is a private company specialized in public earthworks that uses UAVs for metrology applications. This article deals with the maximum accuracy one can achieve with a coupled camera and GPS receiver system for direct georeferencing of Digital Surface Models (DSMs) without relying on Ground Control Point (GCP) measurements. It focuses especially on the lever-arm calibration. The proposed calibration method consists of two steps: the first involves the individual calibration of each sensor, i.e., determining the position of the optical center of the camera and of the GPS antenna phase center in a local coordinate system attached to the sensor. The second step is a 3D modeling of the UAV with its embedded sensors through a photogrammetric acquisition. Processing this acquisition allows the value of the lever-arm offset to be determined without using GCPs.
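Once the lever-arm offset is known, applying it amounts to rotating the body-frame offset into the mapping frame and adding it to the GPS antenna position. The sketch below shows that correction; the frame names, sign convention, and values are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def camera_center_from_gps(gps_pos, R_body_to_world, lever_arm_body):
    """Camera optical center = GPS antenna phase center plus the
    body-frame lever-arm offset rotated into the mapping frame."""
    return np.asarray(gps_pos) + R_body_to_world @ np.asarray(lever_arm_body)

# Illustrative: UAV yawed 90 degrees, lever arm 1 m along the body x axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
center = camera_center_from_gps([10.0, 20.0, 5.0], R, [1.0, 0.0, 0.0])
```

In direct georeferencing this correction is applied at each exposure, using the attitude estimated for that instant.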
Multi-view video segmentation and tracking for video surveillance
NASA Astrophysics Data System (ADS)
Mohammadi, Gelareh; Dufaux, Frederic; Minh, Thien Ha; Ebrahimi, Touradj
2009-05-01
Tracking moving objects is a critical step for smart video surveillance systems. Despite the increase in complexity, multiple-camera systems offer the undoubted advantages of covering wide areas and handling occlusions by exploiting the different viewpoints. The technical problems in multiple-camera systems are several: installation, calibration, object matching, switching, data fusion, and occlusion handling. In this paper, we address the issue of tracking moving objects in an environment covered by multiple uncalibrated cameras with overlapping fields of view, typical of most surveillance setups. Our main objective is to create a framework that can be used to integrate object-tracking information from multiple video sources. The proposed technique consists of the following steps. We first perform a single-view tracking algorithm on each camera view, and then apply a consistent object-labeling algorithm across all views. In the next step, we check the objects in each view separately for inconsistencies. Corresponding objects are extracted through a homography transform from one view to the other and vice versa. Having found the corresponding objects in the different views, we partition each object into homogeneous regions. In the last step, we apply the homography transform to find the region map of the first view in the second view and vice versa. For each region (in the main frame and the mapped frame) a set of descriptors is extracted to find the best match between the two views based on region-descriptor similarity. This method is able to deal with multiple objects. Track-management issues such as occlusion and the appearance and disappearance of objects are resolved using information from all views. The method is capable of tracking rigid and deformable objects, and this versatility makes it suitable for different application scenarios.
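The view-to-view mapping used for object correspondence above is a planar homography: pixel coordinates are lifted to homogeneous form, multiplied by a 3x3 matrix, and normalized. A minimal sketch (the matrix itself would be estimated from point matches between views, which is not shown here):

```python
import numpy as np

def apply_homography(H, pts):
    """Map an Nx2 array of pixel coordinates through a 3x3 homography H."""
    pts = np.asarray(pts, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    homog = np.hstack([pts, ones]) @ H.T  # lift to homogeneous coordinates
    return homog[:, :2] / homog[:, 2:3]   # perspective normalization
```

In practice H is estimated from at least four ground-plane correspondences, e.g. with a DLT or RANSAC-based fit.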
Virtual Reality Calibration for Telerobotic Servicing
NASA Technical Reports Server (NTRS)
Kim, W.
1994-01-01
A virtual reality calibration technique of matching a virtual environment of simulated graphics models in 3-D geometry and perspective with actual camera views of the remote site task environment has been developed to enable high-fidelity preview/predictive displays with calibrated graphics overlay on live video.
NASA Astrophysics Data System (ADS)
Chatterjee, Abhijit; Verma, Anurag
2016-05-01
The Advanced Wide Field Sensor (AWiFS) camera caters to the high temporal resolution requirement of the Resourcesat-2A mission, with a revisit time of 5 days. The AWiFS camera consists of four spectral bands, three in the visible and near IR and one in the short-wave infrared. The imaging concept in the VNIR bands is based on push-broom scanning using a linear-array silicon charge-coupled device (CCD) based Focal Plane Array (FPA). The On-Board Calibration unit for these CCD-based FPAs is used to monitor any degradation in the FPA over the entire mission life. Four LEDs are operated in constant-current mode, and 16 different light intensity levels are generated by electronically changing the exposure of the CCD throughout the calibration cycle. This paper describes the experimental setup and characterization results of various flight-model visible LEDs (λP = 650 nm) for the development of the On-Board Calibration unit of the Advanced Wide Field Sensor (AWiFS) camera of RESOURCESAT-2A. Various LED configurations have been studied to cover the dynamic range of the 6000-pixel silicon CCD based focal plane array from 20% to 60% of saturation during the night pass of the satellite, in order to identify degradation of detector elements. The paper also compares simulation and experimental results for the CCD output profile at different LED combinations in constant-current mode.
NASA Astrophysics Data System (ADS)
Kersten, T. P.; Stallmann, D.; Tschirschwitz, F.
2016-06-01
For the mapping of building interiors, various 2D and 3D indoor surveying systems are available today. These systems differ from each other in price and accuracy as well as in the effort required for fieldwork and post-processing. The Laboratory for Photogrammetry & Laser Scanning of HafenCity University (HCU) Hamburg has developed, as part of an industrial project, a low-cost indoor mapping system which enables systematic inventory mapping of interior facilities with low staffing requirements and reduced, measurable expenditure of time and effort. The modelling and evaluation of the recorded data take place later in the office. The indoor mapping system of HCU Hamburg consists of the following components: laser range finder, panorama head (pan-tilt unit), single-board computer (Raspberry Pi) with digital camera, and battery power supply. The camera is pre-calibrated in a photogrammetric test field under laboratory conditions; remaining systematic image errors are corrected simultaneously within the generation of the panorama image. For cost reasons the camera and laser range finder are not coaxially arranged on the panorama head. Therefore, the eccentricity and alignment of the laser range finder with respect to the camera must be determined in a system calibration. To verify the system accuracy and the system calibration, the laser points were compared with reference measurements from total stations. The differences from the reference were 4-5 mm for individual coordinates.
Pallone, Matthew J.; Meaney, Paul M.; Paulsen, Keith D.
2012-01-01
Purpose: Microwave tomographic image quality can be improved significantly with prior knowledge of the breast surface geometry. The authors have developed a novel laser scanning system that resides completely external to the imaging tank (and the aqueous environment), accurately recovers surface renderings of breast-shaped phantoms immersed within a cylindrical tank of coupling fluid, and overcomes the challenges associated with the optical distortions caused by refraction at the air, tank-wall, and liquid-bath interfaces. Methods: The scanner utilizes two laser line generators and a small CCD camera mounted concentrically on a rotating gantry about the microwave imaging tank. Various calibration methods were considered for optimizing the accuracy of the scanner in the presence of the optical distortions, including traditional ray tracing and image registration approaches. In this paper, the authors describe the construction and operation of the laser scanner, compare the efficacy of several calibration methods (analytical ray tracing and piecewise linear, polynomial, locally weighted mean, and thin-plate-spline (TPS) image registrations), and report outcomes from preliminary phantom experiments. Results: The results show that errors in calibrating camera angles and position prevented analytical ray tracing from achieving submillimeter accuracy in the surface renderings obtained from our scanner configuration. Conversely, calibration by image registration reliably attained mean surface errors of less than 0.5 mm depending on the geometric complexity of the object scanned. While each of the image registration approaches outperformed the ray tracing strategy, the authors found global polynomial methods produced the best compromise between average surface error and scanner robustness.
Conclusions: The laser scanning system provides a fast and accurate method of three dimensional surface capture in the aqueous environment commonly found in microwave breast imaging. Optical distortions imposed by the imaging tank and coupling bath diminished the effectiveness of the ray tracing approach; however, calibration through image registration techniques reliably produced scans of submillimeter accuracy. Tests of the system with breast-shaped phantoms demonstrated the successful implementation of the scanner for the intended application. PMID:22755695
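The global polynomial registration favored in the results above can be illustrated with a minimal numpy sketch: a second-order polynomial warp is fit by least squares to matched control points. The function names and the synthetic coefficients are illustrative, not taken from the paper.

```python
import numpy as np

def poly2_design(pts):
    """Second-order monomial design matrix: 1, x, y, x^2, x*y, y^2."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

def fit_poly2_registration(src, dst):
    """Least-squares fit of a global 2nd-order polynomial warp src -> dst."""
    coeffs, *_ = np.linalg.lstsq(poly2_design(src), dst, rcond=None)
    return coeffs  # shape (6, 2): one column per output coordinate

def apply_poly2_registration(coeffs, pts):
    """Map points through the fitted polynomial warp."""
    return poly2_design(pts) @ coeffs
```

With enough well-distributed control points (at least six, in practice many more), the fit averages out per-point measurement noise, which is one reason a global polynomial can be more robust than ray tracing with imperfect camera parameters.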
Temperature-Sensitive Coating Sensor Based on Hematite
NASA Technical Reports Server (NTRS)
Bencic, Timothy J.
2011-01-01
A temperature-sensitive coating, based on hematite (iron III oxide), has been developed to measure surface temperature using spectral techniques. The hematite powder is added to a binder that allows the mixture to be painted on the surface of a test specimen. The coating dynamically changes its relative spectral makeup or color with changes in temperature. The color changes from a reddish-brown appearance at room temperature (25 C) to a black-gray appearance at temperatures around 600 C. The color change is reversible and repeatable with temperature cycling from low to high and back to low temperatures. Detection of the spectral changes can be recorded by different sensors, including spectrometers, photodiodes, and cameras. Using a priori information obtained through calibration experiments in known thermal environments, the color change can then be calibrated to yield accurate quantitative temperature information. Temperature information can be obtained at a point, or over an entire surface, depending on the type of equipment used for data acquisition. Because this innovation uses spectrophotometry principles of operation, rather than the current methods, which use photoluminescence principles, white light can be used for illumination rather than high-intensity, short-wavelength excitation. The generation of high-intensity white light (or potentially filtered long-wavelength light) is much easier, and is used more prevalently for photography and video technologies. In outdoor tests, the Sun can be used for short durations as an illumination source as long as the amplitude remains relatively constant. The reflected light is also much higher in intensity than the emitted light from the inefficient current methods. Having a much brighter surface allows a wider array of detection schemes and devices.
Because color change is the principle of operation, high-quality, lower-cost digital cameras can be used for detection, as opposed to the high-cost imagers needed for intensity measurements with the current methods. Alternative methods of detection are possible to increase the measurement sensitivity. For example, a monochrome camera can be used with an appropriate filter and a radiometric measurement of normalized intensity change that is proportional to the change in coating temperature. Using different spectral regions yields different sensitivities and calibration curves for converting intensity change to temperature units. Alternatively, using a color camera, a ratio of the standard red, green, and blue outputs can be used as a self-referenced change. The blue region (less than 500 nm) does not change nearly as much as the red region (greater than 575 nm), so a ratio of color intensities will yield a calibrated temperature image. The new temperature sensor coating is easy to apply, is inexpensive, can contour complex shape surfaces, and can be a global surface measurement system based on spectrophotometry. The color change, or relative intensity change at different colors, makes optical detection under white-light illumination, and the associated interpretation, much easier than in the detection systems of the current methods.
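The self-referenced red-to-blue ratio described above can be sketched as a per-pixel lookup against a monotonic calibration curve. This is a minimal illustration; the calibration ratios and temperatures below are invented for the example, not measured hematite data.

```python
import numpy as np

def temperature_from_rgb(rgb, cal_ratio, cal_temp):
    """Map a per-pixel red/blue intensity ratio to temperature.

    rgb:       (H, W, 3) float image
    cal_ratio: 1-D array of R/B ratios measured at known temperatures (monotonic)
    cal_temp:  matching 1-D array of temperatures (deg C)
    """
    r = rgb[..., 0]
    b = rgb[..., 2]
    ratio = r / np.maximum(b, 1e-9)  # guard against division by zero
    # np.interp requires ascending x; flip if the ratio falls with temperature
    if cal_ratio[0] > cal_ratio[-1]:
        cal_ratio, cal_temp = cal_ratio[::-1], cal_temp[::-1]
    return np.interp(ratio, cal_ratio, cal_temp)
```

Because the ratio is self-referenced, it is insensitive to moderate changes in overall illumination level, which is what permits white-light (or even sunlight) illumination.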
Geometric Calibration and Radiometric Correction of the Maia Multispectral Camera
NASA Astrophysics Data System (ADS)
Nocerino, E.; Dubbini, M.; Menna, F.; Remondino, F.; Gattelli, M.; Covi, D.
2017-10-01
Multispectral imaging is a widely used remote sensing technique, whose applications range from agriculture to environmental monitoring, from food quality checks to cultural heritage diagnostics. A variety of multispectral imaging sensors are available on the market, many of them designed to be mounted on different platforms, especially small drones. This work focuses on the geometric and radiometric characterization of a brand-new, lightweight, low-cost multispectral camera, called MAIA. The MAIA camera is equipped with nine sensors, allowing for the acquisition of images in the visible and near-infrared parts of the electromagnetic spectrum. Two versions are available, characterised by different sets of band-pass filters, inspired by the sensors mounted on the WorldView-2 and Sentinel-2 satellites, respectively. The camera details and the developed procedures for geometric calibration and radiometric correction are presented in the paper.
SU-E-T-641: Proton Range Measurements Using a Geometrically Calibrated Liquid Scintillator Detector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hui, C; Robertson, D; Alsanea, F
2015-06-15
Purpose: The purpose of this work is to develop a geometric calibration method to accurately calculate physical distances within a liquid scintillator detector and to assess the accuracy, consistency, and robustness of proton beam range measurements when using a liquid scintillator detector system with the proposed geometric calibration process. Methods: We developed a geometric calibration procedure to accurately convert pixel locations in the camera frame into physical locations in the scintillator frame. To ensure accuracy, the geometric calibration was performed before each experiment. The liquid scintillator was irradiated with spot scanning proton beams of 94 energies in two deliveries. A CCD camera was used to capture the two-dimensional scintillation light profile of each of the proton energies. An algorithm was developed to automatically calculate the proton range from the acquired images. The measured range was compared to the nominal range to assess the accuracy of the detector. To evaluate the robustness of the detector between each setup, the experiments were repeated on three different days. To evaluate the consistency of the measurements between deliveries, three sets of measurements were acquired for each experiment. Results: Using this geometric calibration procedure, the proton beam ranges measured using the liquid scintillator system were all within 0.3 mm of the nominal range. The average difference between the measured and nominal ranges was −0.20 mm. The delivery-to-delivery standard deviation of the proton range measurement was 0.04 mm, and the setup-to-setup standard deviation of the measurement was 0.10 mm. Conclusion: The liquid scintillator system can measure the range of all 94 beams in just two deliveries. With the proposed geometric calibration, it can measure proton range with sub-millimeter accuracy, and the measurements were shown to be consistent between deliveries and setups.
Therefore, we conclude that the liquid scintillator system provides a reliable and convenient tool for proton range measurement. This project was supported in part by award number CA182450 from the National Cancer Institute.
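For a planar viewing geometry, a pixel-to-physical mapping like the one described can be sketched as a homography estimated by the direct linear transform (DLT). This is a generic illustration of such a geometric calibration, not the authors' exact procedure.

```python
import numpy as np

def fit_homography(px, phys):
    """Estimate the 3x3 homography mapping pixel coords to planar physical coords (DLT)."""
    A = []
    for (u, v), (x, y) in zip(px, phys):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # The homography is the null vector of A, i.e. the last right-singular vector
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_points(H, px):
    """Apply a homography to (N, 2) points, with perspective division."""
    p = np.column_stack([px, np.ones(len(px))]) @ H.T
    return p[:, :2] / p[:, 2:3]
```

At least four non-collinear correspondences are needed; performing the fit before each experiment, as above, absorbs any day-to-day drift in camera mounting.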
NASA Astrophysics Data System (ADS)
Liu, W.; Wang, H.; Liu, D.; Miu, Y.
2018-05-01
Precise geometric parameters are essential to ensure the positioning accuracy of space optical cameras. However, state-of-the-art on-orbit calibration methods inevitably suffer from long update cycles and poor timeliness. To this end, in this paper we exploit the optical auto-collimation principle and propose a real-time onboard calibration scheme for monitoring key geometric parameters. Specifically, in the proposed scheme, auto-collimation devices are first designed by installing collimated light sources, area-array CCDs, and prisms inside the satellite payload system. Through these devices, changes in the geometric parameters are elegantly converted into changes in the spot image positions. The variation of the geometric parameters can then be derived by extracting and processing the spot images. An experimental platform is set up to verify the feasibility and analyze the precision of the proposed scheme. The experimental results demonstrate that it is feasible to apply the optical auto-collimation principle to real-time onboard monitoring.
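The spot-extraction step can be illustrated by an intensity-weighted centroid, a common way to localize a collimated spot on an area-array CCD to sub-pixel precision. This is a generic sketch, not the authors' algorithm.

```python
import numpy as np

def spot_centroid(img):
    """Intensity-weighted centroid (x, y) of a background-subtracted spot image."""
    w = np.clip(img, 0.0, None)          # negative residuals would bias the centroid
    total = w.sum()
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return float((xs * w).sum() / total), float((ys * w).sum() / total)
```

Tracking the centroid over time turns a geometric parameter drift into a measurable spot displacement, which is the conversion the scheme above relies on.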
NASA Technical Reports Server (NTRS)
Giveona, Amir; Shaklan, Stuart; Kern, Brian; Noecker, Charley; Kendrick, Steve; Wallace, Kent
2012-01-01
In a setup similar to the self coherent camera, we have added a set of pinholes in the diffraction ring of the Lyot plane in a high-contrast stellar Lyot coronagraph. We describe a novel complex electric field reconstruction from image plane intensity measurements consisting of light in the coronagraph's dark hole interfering with light from the pinholes. The image plane field is modified by letting light through one pinhole at a time. In addition to estimation of the field at the science camera, this method allows for self-calibration of the probes by letting light through the pinholes in various permutations while blocking the main Lyot opening. We present results of estimation and calibration from the High Contrast Imaging Testbed along with a comparison to the pair-wise deformable mirror diversity based estimation technique. Tests are carried out in narrow-band light and over a composite 10% bandpass.
Geometrical distortion calibration of the stereo camera for the BepiColombo mission to Mercury
NASA Astrophysics Data System (ADS)
Simioni, Emanuele; Da Deppo, Vania; Re, Cristina; Naletto, Giampiero; Martellato, Elena; Borrelli, Donato; Dami, Michele; Aroldi, Gianluca; Ficai Veltroni, Iacopo; Cremonese, Gabriele
2016-07-01
The ESA-JAXA mission BepiColombo, to be launched in 2018, is devoted to the observation of Mercury, the innermost planet of the Solar System. SIMBIOSYS is its remote sensing suite, which consists of three instruments: the High Resolution Imaging Channel (HRIC), the Visible and Infrared Hyperspectral Imager (VIHI), and the Stereo Imaging Channel (STC). The latter will provide the global three-dimensional reconstruction of the Mercury surface, and it represents the first push-frame stereo camera on board a space satellite. Based on a new telescope design, STC combines the advantages of a compact single-detector camera with the convenience of a double-direction acquisition system; this solution minimizes mass and volume while performing push-frame imaging acquisition. The shared camera sensor is divided into six portions: four are covered with suitable filters; the other two, one looking forward and one backward with respect to the nadir direction, are covered with a panchromatic filter supplying stereo image pairs of the planet surface. The main STC scientific requirements are to reconstruct the Mercury surface in 3D with a vertical accuracy better than 80 m and to perform global imaging with a grid size of 65 m along-track at the periherm. The scope of this work is to present the on-ground geometric calibration pipeline for this original instrument. The selected STC off-axis configuration required the development of a new distortion map model. Additional considerations concern the detector, a Si-PIN hybrid CMOS, which is characterized by a high fixed-pattern noise. This had a great impact on the pre-calibration phases, compelling an uncommon approach to the definition of the spot centroids in the distortion calibration process. This work presents the results obtained during the calibration of STC concerning the distortion analysis at three different temperatures. These results are then used to define the corresponding distortion model of the camera.
Bell, James F.; Godber, A.; McNair, S.; Caplinger, M.A.; Maki, J.N.; Lemmon, M.T.; Van Beek, J.; Malin, M.C.; Wellington, D.; Kinch, K.M.; Madsen, M.B.; Hardgrove, C.; Ravine, M.A.; Jensen, E.; Harker, D.; Anderson, Ryan; Herkenhoff, Kenneth E.; Morris, R.V.; Cisneros, E.; Deen, R.G.
2017-01-01
The NASA Curiosity rover Mast Camera (Mastcam) system is a pair of fixed-focal length, multispectral, color CCD imagers mounted ~2 m above the surface on the rover's remote sensing mast, along with associated electronics and an onboard calibration target. The left Mastcam (M-34) has a 34 mm focal length, an instantaneous field of view (IFOV) of 0.22 mrad, and a FOV of 20° × 15° over the full 1648 × 1200 pixel span of its Kodak KAI-2020 CCD. The right Mastcam (M-100) has a 100 mm focal length, an IFOV of 0.074 mrad, and a FOV of 6.8° × 5.1° using the same detector. The cameras are separated by 24.2 cm on the mast, allowing stereo images to be obtained at the resolution of the M-34 camera. Each camera has an eight-position filter wheel, enabling it to take Bayer pattern red, green, and blue (RGB) “true color” images, multispectral images in nine additional bands spanning ~400–1100 nm, and images of the Sun in two colors through neutral density-coated filters. An associated Digital Electronics Assembly provides command and data interfaces to the rover, 8 Gb of image storage per camera, 11 bit to 8 bit companding, JPEG compression, and acquisition of high-definition video. Here we describe the preflight and in-flight calibration of Mastcam images, the ways that they are being archived in the NASA Planetary Data System, and the ways that calibration refinements are being developed as the investigation progresses on Mars. We also provide some examples of data sets and analyses that help to validate the accuracy and precision of the calibration.
Method for measuring surface shear stress magnitude and direction using liquid crystal coatings
NASA Technical Reports Server (NTRS)
Reda, Daniel C. (Inventor)
1995-01-01
A method is provided for determining surface shear magnitude and direction at every point on a surface. The surface is covered with a shear stress sensitive liquid crystal coating and illuminated by white light from a normal direction. A video camera is positioned at an oblique angle above the surface to observe the color of the liquid crystal at that angle. The shear magnitude and direction are derived from the color information. A method of calibrating the device is also provided.
Rectification of curved document images based on single view three-dimensional reconstruction.
Kang, Lai; Wei, Yingmei; Jiang, Jie; Bai, Liang; Lao, Songyang
2016-10-01
Since distortions in camera-captured document images significantly affect the accuracy of optical character recognition (OCR), distortion removal plays a critical role for document digitalization systems using a camera for image capturing. This paper proposes a novel framework that performs three-dimensional (3D) reconstruction and rectification of camera-captured document images. While most existing methods rely on additional calibrated hardware or multiple images to recover the 3D shape of a document page, or make a simple but not always valid assumption on the corresponding 3D shape, our framework is more flexible and practical since it only requires a single input image and is able to handle a general locally smooth document surface. The main contributions of this paper include a new iterative refinement scheme for baseline fitting from connected components of text line, an efficient discrete vertical text direction estimation algorithm based on convex hull projection profile analysis, and a 2D distortion grid construction method based on text direction function estimation using 3D regularization. In order to examine the performance of our proposed method, both qualitative and quantitative evaluation and comparison with several recent methods are conducted in our experiments. The experimental results demonstrate that the proposed method outperforms relevant approaches for camera-captured document image rectification, in terms of improvements on both visual distortion removal and OCR accuracy.
Real-time tricolor phase measuring profilometry based on CCD sensitivity calibration
NASA Astrophysics Data System (ADS)
Zhu, Lin; Cao, Yiping; He, Dawu; Chen, Cheng
2017-02-01
A real-time tricolor phase measuring profilometry (RTPMP) method based on charge coupled device (CCD) sensitivity calibration is proposed. Only one colour fringe pattern, whose red (R), green (G) and blue (B) components are respectively coded as three sinusoidal phase-shifting gratings with an equivalent shifting phase of 2π/3, is needed; it is sent to an appointed flash memory on a specialized digital light projector (SDLP). A specialized time-division multiplexing timing sequence actively controls the SDLP to project the fringe patterns in the R, G and B channels sequentially onto the measured object in one seventy-second of a second, and meanwhile actively controls a high-frame-rate monochrome CCD camera to capture the corresponding deformed patterns synchronously with the SDLP. Sufficient information for reconstructing the three-dimensional (3D) shape is thus obtained in one twenty-fourth of a second. Due to the different spectral sensitivities of the CCD camera to the R, G and B lights, the captured deformed patterns from the R, G and B channels cannot share the same peak and valley, which leads to lower accuracy or even failure to reconstruct the 3D shape. So a deformed pattern amending method based on CCD sensitivity calibration is developed to guarantee accurate 3D reconstruction. The experimental results verify the feasibility of the proposed RTPMP method, which can obtain the 3D shape at over the video frame rate of 24 frames per second, avoids colour crosstalk completely, and is effective for measuring real-time changing objects.
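Once the three channel images are amended to a common sensitivity, the wrapped phase follows from the standard three-step phase-shifting formula. A minimal numpy sketch, assuming symmetric shifts of −2π/3, 0, +2π/3 (the sign convention is illustrative):

```python
import numpy as np

def three_step_phase(i1, i2, i3):
    """Wrapped phase from three fringe images with phase shifts -2pi/3, 0, +2pi/3.

    For I_k = A + B*cos(phi + delta_k):
        I1 - I3        = sqrt(3) * B * sin(phi)
        2*I2 - I1 - I3 = 3 * B * cos(phi)
    so phi = atan2(sqrt(3)*(I1 - I3), 2*I2 - I1 - I3).
    """
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
```

This is exactly why unequal R/G/B sensitivities are fatal: a channel-dependent gain on I1-I3 no longer cancels in the arctangent, so the phase, and hence the depth, is biased until the sensitivity calibration equalizes the three channels.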
A calibration method immune to the projector errors in fringe projection profilometry
NASA Astrophysics Data System (ADS)
Zhang, Ruihua; Guo, Hongwei
2017-08-01
In the fringe projection technique, system calibration is a tedious task that establishes the mapping relationship between the object depths and the fringe phases. In particular, it is not easy to accurately determine the parameters of the projector in such a system, which may induce errors in the measurement results. To solve this problem, this paper proposes a new calibration method that uses the cross-ratio invariance of the system geometry for determining the phase-to-depth relations. In it, we analyze the epipolar geometry of the fringe projection system. On each epipolar plane, a depth variation along an incident ray induces a pixel movement along the epipolar line on the image plane of the camera. These depth variations and pixel movements are connected by projective transformations, under which the cross-ratio of each of them remains invariant. Based on this fact, we suggest measuring the depth map by use of this cross-ratio invariance. Firstly, we shift the reference board in its perpendicular direction to three positions with known depths, and measure their phase maps as the reference phase maps; secondly, when measuring an object, we calculate the object depth at each pixel by equating the cross-ratio of the depths to that of the corresponding pixels having the same phase on the image plane of the camera. This method is immune to errors sourced from the projector, including distortions both in the geometric shapes and in the intensity profiles of the projected fringe patterns. The experimental results demonstrate the proposed method to be feasible and valid.
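The per-pixel depth computation can be sketched directly: given three reference depths and the pixel positions sharing the same phase, the unknown depth follows from equating the two cross-ratios, which is a linear equation in z. This is a minimal illustration of the idea, not the authors' implementation.

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio of four collinear values, (a,b;c,d)."""
    return ((a - c) * (b - d)) / ((b - c) * (a - d))

def depth_from_cross_ratio(u, u1, u2, u3, z1, z2, z3):
    """Solve CR(z, z1; z2, z3) = CR(u, u1; u2, u3) for the unknown depth z.

    u, u1..u3: pixel positions along the epipolar line sharing the same phase
    z1..z3:    known depths of the three reference-board positions
    """
    k = cross_ratio(u, u1, u2, u3)
    # (z - z2)(z1 - z3) = k (z1 - z2)(z - z3)  ->  linear in z
    num = z2 * (z1 - z3) - k * (z1 - z2) * z3
    den = (z1 - z3) - k * (z1 - z2)
    return num / den
```

Because the cross-ratio is invariant under any projective transformation, the projector never enters the computation, which is the source of the method's immunity to projector distortions.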
Toward Active Control of Noise from Hot Supersonic Jets
2012-05-14
was developed that would allow for easy data sharing among the research teams. This format includes the acoustic data along with all calibration ... [Figure: (a) Far-Field Array Calibration; (b) MHz Rate PIV Camera Setup] ... The plenoptic camera is a similar setup to determine 3-D motion of the flow using a thick light sheet. 2.3 Update on CFD Progress: In the previous interim
Family Of Calibrated Stereometric Cameras For Direct Intraoral Use
NASA Astrophysics Data System (ADS)
Curry, Sean; Moffitt, Francis; Symes, Douglas; Baumrind, Sheldon
1983-07-01
In order to study empirically the relative efficiencies of different types of orthodontic appliances in repositioning teeth in vivo, we have designed and constructed a pair of fixed-focus, normal case, fully-calibrated stereometric cameras. One is used to obtain stereo photography of single teeth, at a scale of approximately 2:1, and the other is designed for stereo imaging of the entire dentition, study casts, facial structures, and other related objects at a scale of approximately 1:8. Twin lenses simultaneously expose adjacent frames on a single roll of 70 mm film. Physical flatness of the film is ensured by the use of a spring-loaded metal pressure plate. The film is forced against a 3/16" optical glass plate upon which is etched an array of 16 fiducial marks which divide the film format into 9 rectangular regions. Using this approach, it has been possible to produce photographs which are undistorted for qualitative viewing and from which quantitative data can be acquired by direct digitization of conventional photographic enlargements. We are in the process of designing additional members of this family of cameras. All calibration and data acquisition and analysis techniques previously developed will be directly applicable to these new cameras.
2007-09-01
the projective camera matrix (P), which is a 3x4 matrix that represents both the intrinsic and extrinsic parameters of a camera. It is used to... K contains the intrinsic parameters of the camera and [R | t] represents the extrinsic parameters of the camera. By definition, the extrinsic ... extrinsic parameters are known, then the camera is said to be calibrated. If only the intrinsic parameters are known, then the projective camera can
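The decomposition described in the excerpt, P = K [R | t], can be sketched as follows; the numeric values in the usage check are illustrative.

```python
import numpy as np

def camera_matrix(K, R, t):
    """Compose the 3x4 projective camera matrix P = K [R | t]."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def project(P, X):
    """Project (N, 3) world points to (N, 2) pixel coordinates."""
    Xh = np.column_stack([X, np.ones(len(X))])  # homogeneous coordinates
    x = Xh @ P.T
    return x[:, :2] / x[:, 2:3]                 # perspective division
```

For example, with focal length 800 px, principal point (320, 240), identity rotation and zero translation, the world point (1, 0, 2) projects to (720, 240): the point sits one unit off-axis at depth two, i.e. 400 px from the principal point.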
A novel 360-degree shape measurement using a simple setup with two mirrors and a laser MEMS scanner
NASA Astrophysics Data System (ADS)
Jin, Rui; Zhou, Xiang; Yang, Tao; Li, Dong; Wang, Chao
2017-09-01
There is no denying that 360-degree shape measurement technology plays an important role in the field of three-dimensional optical metrology. Traditional optical 360-degree shape measurement methods are mainly of two kinds: the first achieves 360-degree measurement by placing multiple scanners; the second obtains a 360-degree shape model through a high-precision rotating device. The former increases the number of scanners and is costly, while the latter's rotating devices make it time consuming. This paper presents a low-cost and fast optical 360-degree shape measurement method, which possesses the advantages of being fully static, fast and low cost. The measuring system consists of two mirrors at a certain angle, a laser projection system, a stereoscopic calibration block, and two cameras. Most importantly, the laser MEMS scanner can achieve precise movement of the laser stripes without any movement mechanism, improving the measurement accuracy and efficiency. Furthermore, a novel stereo calibration technology presented in this paper can achieve point cloud data registration and thereby obtain the 360-degree model of objects. A stereoscopic calibration block with special coded patterns on six sides is used in this novel stereo calibration method, through which we can quickly obtain the 360-degree models of objects.
Railway clearance intrusion detection method with binocular stereo vision
NASA Astrophysics Data System (ADS)
Zhou, Xingfang; Guo, Baoqing; Wei, Wei
2018-03-01
In the stages of railway construction and operation, objects intruding into the railway clearance greatly threaten the safety of railway operation, so real-time intrusion detection is of great importance. To overcome the depth insensitivity and shadow interference of single-image methods, an intrusion detection method with binocular stereo vision is proposed to reconstruct the 3D scene for locating objects and judging clearance intrusion. The binocular cameras are calibrated with Zhang Zhengyou's method. In order to improve the 3D reconstruction speed, a suspicious region is first determined by a background difference method on a single camera's image sequences. The image rectification, stereo matching and 3D reconstruction processes are only executed when there is a suspicious region. A transformation matrix from the Camera Coordinate System (CCS) to the Track Coordinate System (TCS) is computed with the gauge constant and used to transfer the 3D point clouds into the TCS; the 3D point clouds are then used to calculate the object position and intrusion in the TCS. Experiments in a railway scene show that the position precision is better than 10 mm. It is an effective way to detect clearance intrusion and can satisfy the requirements of railway applications.
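The coordinate transfer into the TCS followed by a clearance check can be sketched as below; the box-shaped gauge and its dimensions are placeholders, not the real railway clearance profile.

```python
import numpy as np

def ccs_to_tcs(points_ccs, R, t):
    """Rigid transform of (N, 3) points from camera to track coordinates:
    X_tcs = R @ X_ccs + t."""
    return points_ccs @ R.T + t

def intrudes_clearance(points_tcs, half_width=1.7, height=5.0):
    """Flag points inside a simplified box-shaped clearance gauge centred on the
    track axis (x lateral, z vertical; dimensions are illustrative)."""
    x, z = points_tcs[:, 0], points_tcs[:, 2]
    return (np.abs(x) < half_width) & (z > 0.0) & (z < height)
```

Running this test only on the point cloud of the suspicious region, as the paper does, keeps the per-frame cost low enough for real-time operation.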
Mars Exploration Rover Navigation Camera in-flight calibration
Soderblom, J.M.; Bell, J.F.; Johnson, J. R.; Joseph, J.; Wolff, M.J.
2008-01-01
The Navigation Camera (Navcam) instruments on the Mars Exploration Rover (MER) spacecraft provide support for tactical operations as well as scientific observations where color information is not necessary: large-scale morphology, atmospheric monitoring including cloud observations and dust devil movies, and context imaging for both the thermal emission spectrometer and the in situ instruments on the Instrument Deployment Device. The Navcams are a panchromatic stereoscopic imaging system built using identical charge-coupled device (CCD) detectors and nearly identical electronics boards as the other cameras on the MER spacecraft. Previous calibration efforts were primarily focused on providing a detailed geometric calibration in line with the principal function of the Navcams, to provide data for the MER navigation team. This paper provides a detailed description of a new Navcam calibration pipeline developed to provide an absolute radiometric calibration that we estimate to have an absolute accuracy of 10% and a relative precision of 2.5%. Our calibration pipeline includes steps to model and remove the bias offset, the dark current charge that accumulates in both the active and readout regions of the CCD, and the shutter smear. It also corrects pixel-to-pixel responsivity variations using flat-field images, and converts from raw instrument-corrected digital number values per second to units of radiance (W m⁻² nm⁻¹ sr⁻¹), or to radiance factor (I/F). We also describe here the initial results of two applications where radiance-calibrated Navcam data provide unique information for surface photometric and atmospheric aerosol studies. Copyright 2008 by the American Geophysical Union.
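A radiometric chain of this shape (bias and dark removal, flat-field, DN/s-to-radiance conversion) can be sketched generically; the shutter-smear step is omitted, and the coefficient names and values are illustrative, not the MER pipeline's.

```python
import numpy as np

def dn_to_radiance(raw_dn, t_exp, bias, dark_rate, flat, resp):
    """Sketch of a bias/dark/flat radiometric chain.

    raw_dn:    raw image in digital numbers (DN)
    t_exp:     exposure time in seconds
    bias:      fixed offset in DN
    dark_rate: dark current in DN per second
    flat:      normalized flat-field (pixel-to-pixel responsivity) image
    resp:      radiance per corrected DN/s (W m^-2 nm^-1 sr^-1 per DN/s)
    """
    dn = raw_dn.astype(float) - bias - dark_rate * t_exp  # remove offset + dark charge
    dn = dn / flat                                        # flat-field correction
    return resp * dn / t_exp                              # DN/s -> radiance
```

Dividing by exposure time before applying the responsivity coefficient is what makes images taken at different exposures directly comparable in radiance units.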
Microchannel plate cross-talk mitigation for spatial autocorrelation measurements
NASA Astrophysics Data System (ADS)
Lipka, Michał; Parniak, Michał; Wasilewski, Wojciech
2018-05-01
Microchannel plates (MCP) are the basis for many spatially resolved single-particle detectors such as ICCD or I-sCMOS cameras employing image intensifiers (II), MCPs with delay-line anodes for the detection of cold gas particles or Cherenkov radiation detectors. However, the spatial characterization provided by an MCP is severely limited by cross-talk between its microchannels, rendering MCP and II ill-suited for autocorrelation measurements. Here, we present a cross-talk subtraction method experimentally exemplified for an I-sCMOS based measurement of pseudo-thermal light second-order intensity autocorrelation function at the single-photon level. The method merely requires a dark counts measurement for calibration. A reference cross-correlation measurement certifies the cross-talk subtraction. While remaining universal for MCP applications, the presented cross-talk subtraction, in particular, simplifies quantum optical setups. With the possibility of autocorrelation measurements, the signal needs no longer to be divided into two camera regions for a cross-correlation measurement, reducing the experimental setup complexity and increasing at least twofold the simultaneously employable camera sensor region.
NASA Astrophysics Data System (ADS)
Humphreys, Kenneth; Ward, Tomas; Markham, Charles
2007-04-01
We present a camera-based device capable of capturing two photoplethysmographic (PPG) signals at two different wavelengths simultaneously, in a remote noncontact manner. The system comprises a complementary metal-oxide semiconductor camera and a dual-wavelength array of light-emitting diodes (760 and 880 nm). By alternately illuminating a region of tissue with each wavelength of light, and detecting the backscattered photons with the camera at a rate of 16 frames per wavelength, two multiplexed PPG waveforms are simultaneously captured. This process is the basis of pulse oximetry, and we describe how, with the inclusion of a calibration procedure, this system could be used as a noncontact pulse oximeter to measure arterial oxygen saturation (SpO2) remotely. Results are presented from an experiment on ten subjects, exhibiting normal SpO2 readings, that demonstrate the instrument's ability to capture signals from a range of subjects under realistic lighting and environmental conditions. We compare the signals captured by the noncontact system to a conventional PPG signal captured concurrently from a finger, and show by means of a J. Bland and D. Altman [Lancet 327, 307 (1986); Statistician 32, 307 (1983)] test that the noncontact device is comparable to a contact device as a monitor of heart rate. We highlight some considerations that should be made when using camera-based "integrative" sampling methods and demonstrate through simulation the suitability of the captured PPG signals for the application of existing pulse oximetry calibration procedures.
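The calibration procedure mentioned above rests on the classic pulse-oximetry "ratio of ratios": the AC/DC component of each wavelength's PPG waveform is compared, and the result is mapped to SpO2 through an empirical curve. A minimal sketch; the linear coefficients 110 and 25 are a common textbook illustration, not the authors' calibration.

```python
import numpy as np

def ratio_of_ratios(ppg_a, ppg_b):
    """Pulse-oximetry ratio R = (AC/DC at wavelength a) / (AC/DC at wavelength b)."""
    def ac_dc(sig):
        sig = np.asarray(sig, float)
        return (sig.max() - sig.min()) / sig.mean()  # crude AC and DC estimates
    return ac_dc(ppg_a) / ac_dc(ppg_b)

def spo2_from_ratio(R, a=110.0, b=25.0):
    """Empirical linear calibration SpO2 = a - b*R (coefficients illustrative)."""
    return a - b * R
```

In practice the AC component is extracted by band-pass filtering around the heart rate rather than a raw max-minus-min, but the ratio structure is the same.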
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Po-Feng; Tully, R. Brent; Jacobs, Bradley A.
In this paper, we extend the use of the tip of the red giant branch (TRGB) method to near-infrared wavelengths from the previously used I-band, using the Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3). Upon calibration of a color dependency of the TRGB magnitude, the IR TRGB yields a random uncertainty of ∼5% in relative distance. The IR TRGB methodology has an advantage over the previously used Advanced Camera for Surveys F606W and F814W filter set for galaxies that suffer from severe extinction. Using the IR TRGB methodology, we obtain distances toward three principal galaxies in the Maffei/IC 342 complex, which are located at low Galactic latitudes. New distance estimates using the TRGB method are 3.45 (+0.13/−0.13) Mpc for IC 342, 3.37 (+0.32/−0.23) Mpc for Maffei 1, and 3.52 (+0.32/−0.30) Mpc for Maffei 2. The uncertainties are dominated by uncertain extinction, especially for Maffei 1 and Maffei 2. Our IR calibration demonstrates the viability of the TRGB methodology for observations with the James Webb Space Telescope.
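The step from a calibrated TRGB magnitude to a distance is the standard distance modulus, μ = m − A − M_TRGB, here with an optional linear colour term standing in for the paper's calibrated colour dependency. A minimal sketch; the default absolute magnitude and colour coefficient are illustrative, not the paper's values.

```python
def trgb_distance_mpc(m_trgb, M_trgb=-4.05, a_color=0.0, color=0.0, extinction=0.0):
    """Distance in Mpc from the apparent TRGB magnitude.

    m_trgb:     apparent magnitude of the detected RGB tip
    M_trgb:     absolute TRGB magnitude at the fiducial colour (illustrative default)
    a_color:    slope of the linear colour term (illustrative)
    color:      colour offset from the fiducial colour
    extinction: line-of-sight extinction A in magnitudes
    """
    M = M_trgb + a_color * color
    mu = m_trgb - extinction - M          # distance modulus mu = 5 log10(d / 10 pc)
    return 10 ** ((mu - 25.0) / 5.0)      # mu = 5 log10(d_Mpc) + 25
```

The sensitivity to extinction is explicit here: a 0.1 mag error in A shifts the distance by about 5%, which is why the IR bands, where extinction is lower, help for low-latitude targets like Maffei 1 and 2.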
Prototypic Development and Evaluation of a Medium Format Metric Camera
NASA Astrophysics Data System (ADS)
Hastedt, H.; Rofallski, R.; Luhmann, T.; Rosenbauer, R.; Ochsner, D.; Rieke-Zapp, D.
2018-05-01
Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2-3 m in each direction) and large volumes (around 20 x 20 x 1-10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1-0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, for large-volume applications the availability of a metric camera would have several advantages: 1) high-quality optical components and stabilisation allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables a priori camera calibration, 3) a higher resulting precision can be expected. This article presents the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric. Its general accuracy potential is tested against calibrated lengths in a small-volume test environment based on the German Guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm are achieved across the different scenarios tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements to deformation estimation on a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2-0.4 mm is reached over a length of 28 m (given by a distance from a laser tracker network measurement). All analyses have proven the high stability of the interior orientation of the camera and indicate the applicability of a priori camera calibration for subsequent 3D measurements.
2D Measurements of the Balmer Series in Proto-MPEX using a Fast Visible Camera Setup
NASA Astrophysics Data System (ADS)
Lindquist, Elizabeth G.; Biewer, Theodore M.; Ray, Holly B.
2017-10-01
The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device with densities up to 1020 m-3 and temperatures up to 20 eV. Broadband spectral measurements show the visible emission spectra are solely due to the Balmer lines of deuterium. Monochromatic and RGB color Sanstreak SC1 Edgertronic fast visible cameras capture high speed video of plasmas in Proto-MPEX. The color camera is equipped with a long pass 450 nm filter and an internal Bayer filter to view the Dα line at 656 nm on the red channel and the Dβ line at 486 nm on the blue channel. The monochromatic camera has a 434 nm narrow bandpass filter to view the Dγ intensity. In the setup, a 50/50 beam splitter is used so both cameras image the same region of the plasma discharge. Camera images were aligned to each other by viewing a grid, ensuring 1 pixel registration between the two cameras. A uniform intensity calibrated white light source was used to perform a pixel-to-pixel relative and an absolute intensity calibration for both cameras. Python scripts combined the dual camera data, rendering the Dα, Dβ, and Dγ intensity ratios. Observations from Proto-MPEX discharges will be presented. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725.
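The ratio computation described above can be sketched per pixel with NumPy (the tiny arrays below stand in for the co-registered, intensity-calibrated camera frames):

```python
import numpy as np

def balmer_ratios(d_alpha, d_beta, d_gamma, eps=1e-12):
    """Pixel-wise D-beta/D-alpha and D-gamma/D-beta intensity ratios from
    co-registered, absolutely calibrated frames (arrays here illustrative).
    eps guards against division by zero in dark pixels."""
    d_alpha = np.asarray(d_alpha, dtype=float)
    d_beta = np.asarray(d_beta, dtype=float)
    d_gamma = np.asarray(d_gamma, dtype=float)
    return d_beta / (d_alpha + eps), d_gamma / (d_beta + eps)

r_ba, r_gb = balmer_ratios([[4.0, 2.0]], [[2.0, 1.0]], [[1.0, 0.5]])
```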
Improving Photometric Calibration of Meteor Video Camera Systems
NASA Technical Reports Server (NTRS)
Ehlert, Steven; Kingery, Aaron; Suggs, Robert
2016-01-01
We present the results of new calibration tests performed by the NASA Meteoroid Environment Office (MEO) designed to help quantify and minimize systematic uncertainties in meteor photometry from video camera observations. These systematic uncertainties can be categorized by two main sources: an imperfect understanding of the linearity correction for the MEO's Watec 902H2 Ultimate video cameras and uncertainties in meteor magnitudes arising from transformations between the Watec camera's Sony EX-View HAD bandpass and the bandpasses used to determine reference star magnitudes. To address the first point, we have measured the linearity response of the MEO's standard meteor video cameras using two independent laboratory tests on eight cameras. Our empirically determined linearity correction is critical for performing accurate photometry at low camera intensity levels. With regards to the second point, we have calculated synthetic magnitudes in the EX bandpass for reference stars. These synthetic magnitudes enable direct calculations of the meteor's photometric flux within the camera bandpass without requiring any assumptions of its spectral energy distribution. Systematic uncertainties in the synthetic magnitudes of individual reference stars are estimated at 0.20 mag, and are limited by the available spectral information in the reference catalogs. These two improvements allow for zero-points accurate to 0.05-0.10 mag in both filtered and unfiltered camera observations with no evidence for lingering systematics.
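A synthetic magnitude is a bandpass-weighted flux integral; a hedged sketch (the wavelength grid, flux, response, and zero point are illustrative, not the MEO's actual EX-bandpass data):

```python
import math

def synthetic_magnitude(wavelengths, flux, response, zero_point=0.0):
    """Synthetic magnitude of a reference star in a camera bandpass:
    m = -2.5 * log10( integral of F(lambda) * S(lambda) d lambda ) + ZP.
    The integral is a simple trapezoid rule over the sampled grid; in
    practice the zero point is tied to standard stars."""
    prod = [f * s for f, s in zip(flux, response)]
    integral = sum(0.5 * (prod[i] + prod[i + 1]) * (wavelengths[i + 1] - wavelengths[i])
                   for i in range(len(wavelengths) - 1))
    return -2.5 * math.log10(integral) + zero_point
```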
A practical indoor context-aware surveillance system with multi-Kinect sensors
NASA Astrophysics Data System (ADS)
Jia, Lili; You, Ying; Li, Tiezhu; Zhang, Shun
2014-11-01
In this paper we develop a novel practical application which gives scalable services to end users when abnormal activities are happening. The architecture of the application is presented, consisting of network infrared cameras and a communication module. In this intelligent surveillance system we use Kinect sensors as the input cameras. The Kinect is an infrared laser camera whose raw infrared sensor stream is accessible to the user. We install several Kinect sensors in one room to track human skeletons. Each sensor returns body positions as 15 coordinates in its own coordinate system. We use calibration algorithms to transform all the body position points into one unified coordinate system. With the body position points, we can infer the surveillance context. Furthermore, messages from the metadata index matrix are sent to a mobile phone through the communication module. The user is instantly made aware of an abnormal case in the room without having to check a website. In conclusion, theoretical analysis and experimental results in this paper show that the proposed system is reasonable and efficient. The application method introduced in this paper not only discourages criminals and assists police in the apprehension of suspects, but also enables end users to monitor indoor environments anywhere and anytime from their phones.
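The calibration step that maps each Kinect's skeleton points into one unified coordinate system is commonly solved as a least-squares rigid transform (Kabsch/Umeyama); a minimal sketch, assuming point correspondences between sensors are already known:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    3-D points in one sensor's frame onto corresponding points in the
    reference frame, via SVD of the cross-covariance (Kabsch algorithm)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```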
Improved iris localization by using wide and narrow field of view cameras for iris recognition
NASA Astrophysics Data System (ADS)
Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung
2013-10-01
Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
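The Z-distance estimate from the iris size in the WFOV image reduces to the pinhole similar-triangles relation; a minimal sketch (the ~11.7 mm iris diameter is the anthropometric prior mentioned above; the focal length in pixels is an illustrative value that would come from intrinsic calibration):

```python
def eye_distance_mm(focal_length_px, iris_diameter_px, iris_diameter_mm=11.7):
    """Pinhole-model distance from camera to eye:
    Z = f_px * real_diameter / apparent_diameter_px."""
    return focal_length_px * iris_diameter_mm / iris_diameter_px

# Illustrative: a 39-px iris under a 1000-px focal length puts the eye ~300 mm away.
z = eye_distance_mm(1000.0, 39.0)
```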
A Simple Spectrophotometer Using Common Materials and a Digital Camera
ERIC Educational Resources Information Center
Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal
2011-01-01
A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a…
Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraiskii, A V; Mironova, T V; Sultanov, T T
2010-09-10
A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of mercury spectrum and on the adjacent continuum of the incandescent lamp radiation spectrum. By means of the method suggested the homogeneity of holographic sensor swelling was studied in stationary and transient cases. (laser applications and other topics in quantum electronics)
3D measurement by digital photogrammetry
NASA Astrophysics Data System (ADS)
Schneider, Carl T.
1993-12-01
Photogrammetry is well known in geodetic surveys as aerial photogrammetry and in close range applications as architectural photogrammetry. The photogrammetric methods and algorithms, combined with digital cameras and digital image processing methods, are now being introduced for industrial applications such as automation and quality control. This paper describes the photogrammetric and digital image processing algorithms and the calibration methods. These algorithms and methods are demonstrated with application examples: a digital photogrammetric workstation as a mobile multi-purpose 3D measuring tool, and a tube measuring system as an example of a single-purpose tool.
Calibration of PVDF Film Transducers for the Cavitation Impact Measurement
NASA Astrophysics Data System (ADS)
Hujer, Jan; Müller, Miloš
2018-06-01
This paper describes an investigation of the influence of the protective layer thickness on the calibration sensitivity of PVDF film sensors for cavitation impact measurements. The PVDF film sensor is cast into an aluminium block. The drop ball method is used to measure the relation between the impact force and the voltage detected on the PVDF film sensor. The calibration constants are measured for three different protective layer thicknesses. Five different ball weights at a 400 mm drop height are used to reach the required impact force range. The ball positions for the evaluation of the impact force are measured with a high speed camera. The voltage signal detected on the PVDF film clamps was measured with a high speed digitizer. The measured signals are analysed in LabVIEW Signal Express.
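The drop-ball calibration pairs a kinematic estimate of the impact with the film's voltage response; a minimal sketch (the peak voltage and force below are hypothetical, not the paper's measurements):

```python
import math

def impact_velocity_ms(drop_height_m, g=9.81):
    """Ball speed at impact after free fall, drag neglected: v = sqrt(2 g h)."""
    return math.sqrt(2.0 * g * drop_height_m)

def pvdf_sensitivity(peak_voltage_v, peak_force_n):
    """Calibration constant of the film in volts per newton; peak values
    would come from the digitizer trace and the camera-based force evaluation."""
    return peak_voltage_v / peak_force_n

v = impact_velocity_ms(0.4)  # the 400 mm drop height used in the study
s = pvdf_sensitivity(2.5, 100.0)  # hypothetical peaks -> V/N
```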
Calibration method for video and radiation imagers
Cunningham, Mark F [Oak Ridge, TN; Fabris, Lorenzo [Knoxville, TN; Gee, Timothy F [Oak Ridge, TN; Goddard, Jr., James S.; Karnowski, Thomas P [Knoxville, TN; Ziock, Klaus-peter [Clinton, TN
2011-07-05
The relationship between the high energy radiation imager pixel (HERIP) coordinate and the real-world x-coordinate is determined by a least squares fit between the HERIP x-coordinate and the measured real-world x-coordinates of calibration markers that emit high energy radiation and reflect visible light. Upon calibration, a high energy radiation imager pixel position may be determined based on a real-world coordinate of a moving vehicle. Further, a scale parameter for said high energy radiation imager may be determined based on the real-world coordinate. The scale parameter depends on the y-coordinate of the moving vehicle as provided by a visible light camera. The high energy radiation imager may be employed to detect radiation from moving vehicles in multiple lanes, which correspondingly have different distances to the high energy radiation imager.
Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera
NASA Astrophysics Data System (ADS)
Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.
2016-04-01
The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90∘ field of view in monochrome mode and 60∘ field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793000 NAC and 207000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
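Camera models of this kind typically include a polynomial radial distortion term; a minimal sketch of such a term (the coefficients are illustrative; making them functions of band wavelength gives a wavelength-dependent model of the kind described for the WAC):

```python
def apply_radial_distortion(x, y, k1, k2=0.0):
    """Polynomial radial distortion on normalized image coordinates:
    (x, y) -> (x, y) * (1 + k1*r^2 + k2*r^4), with r^2 = x^2 + y^2."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Illustrative coefficient: a point at unit radius is pushed outward by 10%.
xd, yd = apply_radial_distortion(1.0, 0.0, 0.1)
```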
Advances in Gamma-Ray Imaging with Intensified Quantum-Imaging Detectors
NASA Astrophysics Data System (ADS)
Han, Ling
Nuclear medicine, an important branch of modern medical imaging, is an essential tool for both diagnosis and treatment of disease. As the fundamental element of nuclear medicine imaging, the gamma camera is able to detect gamma-ray photons emitted by radiotracers injected into a patient and form an image of the radiotracer distribution, reflecting biological functions of organs or tissues. Recently, an intensified CCD/CMOS-based quantum detector, called iQID, was developed in the Center for Gamma-Ray Imaging. Originally designed as a novel type of gamma camera, iQID demonstrated ultra-high spatial resolution (< 100 micron) and many other advantages over traditional gamma cameras. This work focuses on advancing this conceptually-proven gamma-ray imaging technology to make it ready for both preclinical and clinical applications. To start with, a Monte Carlo simulation of the key light-intensification device, i.e. the image intensifier, was developed, which revealed the dominating factor(s) that limit the energy resolution performance of the iQID cameras. For preclinical imaging applications, a previously-developed iQID-based single-photon-emission computed-tomography (SPECT) system, called FastSPECT III, was fully advanced in terms of data acquisition software, system sensitivity, and effective FOV by developing and adopting a new photon-counting algorithm, thicker columnar scintillation detectors, and a new system calibration method. Originally designed for mouse brain imaging, the system is now able to provide full-body mouse imaging with sub-350-micron spatial resolution. To further advance the iQID technology to include clinical imaging applications, a novel large-area iQID gamma camera, called LA-iQID, was developed from concept to prototype. Sub-mm system resolution in an effective FOV of 188 mm x 188 mm has been achieved. 
The camera architecture, system components, design and integration, data acquisition, camera calibration, and performance evaluation are presented in this work. Mounted on a castered counter-weighted clinical cart, the camera also features portable and mobile capabilities for easy handling and on-site applications at remote locations where hospital facilities are not available.
2012-08-17
This image shows the calibration target for the Chemistry and Camera (ChemCam) instrument on NASA's Curiosity rover. The calibration target is one square and a group of nine circles that look dark in the black-and-white image.
MAHLI Calibration Target in Ultraviolet Light
2012-02-07
During pre-flight testing in March 2011, the Mars Hand Lens Imager (MAHLI) camera on NASA's Mars rover Curiosity took this image of the MAHLI calibration target under illumination from MAHLI's two ultraviolet LEDs (light-emitting diodes).
Sevrin, Loïc; Noury, Norbert; Abouchi, Nacer; Jumel, Fabrice; Massot, Bertrand; Saraydaryan, Jacques
2015-01-01
An increasing number of systems use indoor positioning for many scenarios such as asset tracking, health care, games, manufacturing, logistics, shopping, and security. Many technologies are available and the use of depth cameras is becoming more and more attractive as this kind of device becomes affordable and easy to handle. This paper contributes to the effort of creating an indoor positioning system based on low cost depth cameras (Kinect). A method is proposed to optimize the calibration of the depth cameras, to describe the multi-camera data fusion and to specify a global positioning projection to maintain the compatibility with outdoor positioning systems. The monitoring of the people trajectories at home is intended for the early detection of a shift in daily activities which highlights disabilities and loss of autonomy. This system is meant to improve homecare health management at home for a better end of life at a sustainable cost for the community.
Development and calibration of a new gamma camera detector using large square Photomultiplier Tubes
NASA Astrophysics Data System (ADS)
Zeraatkar, N.; Sajedi, S.; Teimourian Fard, B.; Kaviani, S.; Akbarzadeh, A.; Farahani, M. H.; Sarkar, S.; Ay, M. R.
2017-09-01
Large area scintillation detectors applied in gamma cameras as well as Single Photon Emission Computed Tomography (SPECT) systems have a major role in in-vivo functional imaging. Most gamma detectors utilize a hexagonal arrangement of Photomultiplier Tubes (PMTs). In this work we applied large square-shaped PMTs with a row/column arrangement and positioning. The use of large square PMTs reduces dead zones on the detector surface. However, the conventional center of gravity method for positioning may not produce an acceptable result. Hence, the digital correlated signal enhancement (CSE) algorithm was optimized to obtain better linearity and spatial resolution in the developed detector. The performance of the developed detector was evaluated based on the NEMA-NU1-2007 standard. The acquired images using this method showed acceptable uniformity and linearity compared to three commercial gamma cameras. The intrinsic and extrinsic spatial resolutions with a low-energy high-resolution (LEHR) collimator at 10 cm from the surface of the detector were 3.7 mm and 7.5 mm, respectively. The energy resolution of the camera was measured to be 9.5%. The performance evaluation demonstrated that the developed detector maintains image quality with a reduced number of PMTs relative to the detection area.
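The conventional center-of-gravity (Anger) positioning that the CSE algorithm improves upon is a signal-weighted mean of PMT centre coordinates; a minimal sketch (signals and geometry here are illustrative):

```python
def anger_position(signals, pmt_centers):
    """Centre-of-gravity (Anger) estimate of the scintillation position:
    the mean of the PMT centre coordinates weighted by each PMT's signal."""
    total = float(sum(signals))
    x = sum(s * c[0] for s, c in zip(signals, pmt_centers)) / total
    y = sum(s * c[1] for s, c in zip(signals, pmt_centers)) / total
    return x, y

# Two PMTs on the x-axis; three quarters of the light lands on the far tube.
x, y = anger_position([1.0, 3.0], [(0.0, 0.0), (4.0, 0.0)])
```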
Satellite markers: a simple method for ground truth car pose on stereo video
NASA Astrophysics Data System (ADS)
Gil, Gustavo; Savino, Giovanni; Piantini, Simone; Pierini, Marco
2018-04-01
Predicting the future location of other cars is a must for advanced safety systems. The remote estimation of car pose, and particularly its heading angle, is key to predicting its future location. Stereo vision systems allow the 3D information of a scene to be recovered. Ground truth in this specific context is associated with referential information about the depth, shape and orientation of the objects present in the traffic scene. Creating 3D ground truth is a measurement and data fusion task associated with the combination of different kinds of sensors. The novelty of this paper is a method to generate ground truth car pose from video data alone. When the method is applied to stereo video, it also provides the extrinsic camera parameters for each camera at frame level, which are key to quantifying the performance of a stereo vision system when it is moving, because the system is subjected to undesired vibrations and/or leaning. We developed a video post-processing technique which employs a common camera calibration tool for the 3D ground truth generation. In our case study, we focus on accurate car heading angle estimation of a moving car under realistic imagery. As outcomes, our satellite marker method provides accurate car pose at frame level, and the instantaneous spatial orientation of each camera at frame level.
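Frame-level extrinsics of this kind are typically judged by the reprojection error of known marker points; a minimal sketch (the intrinsics, pose, and points below are illustrative, not the paper's data):

```python
import numpy as np

def reprojection_error(K, R, t, world_pts, image_pts):
    """RMS reprojection error of known 3-D points under a pinhole camera with
    intrinsics K and per-frame extrinsics (R, t): project each world point
    with P = K [R | t], divide out depth, and compare to the observed pixels."""
    P = K @ np.hstack([R, t.reshape(3, 1)])
    w = np.hstack([world_pts, np.ones((len(world_pts), 1))])
    proj = (P @ w.T).T
    proj = proj[:, :2] / proj[:, 2:3]        # perspective divide
    return float(np.sqrt(np.mean(np.sum((proj - image_pts) ** 2, axis=1))))
```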
BRDF invariant stereo using light transport constancy.
Wang, Liang; Yang, Ruigang; Davis, James E
2007-09-01
Nearly all existing methods for stereo reconstruction assume that scene reflectance is Lambertian and make use of brightness constancy as a matching invariant. We introduce a new invariant for stereo reconstruction called light transport constancy (LTC), which allows completely arbitrary scene reflectance (bidirectional reflectance distribution functions (BRDFs)). This invariant can be used to formulate a rank constraint on multiview stereo matching when the scene is observed by several lighting configurations in which only the lighting intensity varies. In addition, we show that this multiview constraint can be used with as few as two cameras and two lighting configurations. Unlike previous methods for BRDF invariant stereo, LTC does not require precisely configured or calibrated light sources or calibration objects in the scene. Importantly, the new constraint can be used to provide BRDF invariance to any existing stereo method whenever appropriate lighting variation is available.
Measurement of vibration using phase only correlation technique
NASA Astrophysics Data System (ADS)
Balachandar, S.; Vipin, K.
2017-08-01
A novel method for the measurement of vibration is proposed and demonstrated. The proposed experiment is based on laser triangulation and consists of a line laser, the object under test, and a high speed camera remotely controlled by software. The experiment involves launching a line-laser probe beam perpendicular to the axis of the vibrating object. The reflected probe beam is recorded by the high speed camera. The dynamic position of the line laser in the camera plane is governed by the magnitude and frequency of the vibrating test object. Using the phase correlation technique, the maximum distance travelled by the probe beam in the CCD plane is measured in terms of pixels using MATLAB. The actual displacement of the object in mm is obtained by calibration. Using the displacement data over time, other vibration-associated quantities such as acceleration, velocity and frequency are evaluated. Preliminary results of the proposed method are reported for accelerations from 1 g to 3 g and frequencies from 6 Hz to 26 Hz. The results closely match theoretical values. The advantage of the proposed method is that it is non-destructive, and using the phase correlation algorithm, subpixel displacement in the CCD plane can be measured with high accuracy.
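The core of phase-only correlation is the peak of the inverse FFT of the normalized cross-power spectrum; a minimal integer-pixel sketch in NumPy (the abstract's MATLAB implementation and its subpixel refinement are not reproduced here):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Integer-pixel shift of frame `a` relative to frame `b` via phase-only
    correlation: normalize the cross-power spectrum to unit magnitude so only
    phase survives, then locate the delta peak of its inverse FFT."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    R = A * np.conj(B)
    R = R / (np.abs(R) + 1e-12)              # keep phase only
    corr = np.abs(np.fft.ifft2(R))
    idx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak indices to signed shifts in [-N/2, N/2].
    return tuple(int(i) if i <= n // 2 else int(i) - n
                 for i, n in zip(idx, corr.shape))
```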
A New Calibration Method Using Low Cost MEM IMUs to Verify the Performance of UAV-Borne MMS Payloads
Chiang, Kai-Wei; Tsai, Meng-Lun; Naser, El-Sheimy; Habib, Ayman; Chu, Chien-Hsun
2015-01-01
Spatial information plays a critical role in remote sensing and mapping applications such as environment surveying and disaster monitoring. An Unmanned Aerial Vehicle (UAV)-borne mobile mapping system (MMS) can accomplish rapid spatial information acquisition under limited sky conditions with better mobility and flexibility than other means. This study proposes a long endurance Direct Geo-referencing (DG)-based fixed-wing UAV photogrammetric platform and two DG modules that each use different commercial Micro-Electro Mechanical Systems’ (MEMS) tactical grade Inertial Measurement Units (IMUs). Furthermore, this study develops a novel kinematic calibration method which includes lever arms, boresight angles and camera shutter delay to improve positioning accuracy. The new calibration method is then compared with the traditional calibration approach. The results show that the accuracy of the DG can be significantly improved by flying at a lower altitude using the new higher specification hardware. The new proposed method improves the accuracy of DG by about 20%. The preliminary results show that two-dimensional (2D) horizontal DG positioning accuracy is around 5.8 m at a flight height of 300 m using the newly designed tactical grade integrated Positioning and Orientation System (POS). The positioning accuracy in three-dimensions (3D) is less than 8 m. PMID:25808764
Oh, Hyun Jun; Yang, Il-Hyung
2016-01-01
Objectives: To propose a novel method for determining the three-dimensional (3D) root apex position of maxillary teeth using a two-dimensional (2D) panoramic radiograph image and a 3D virtual maxillary cast model. Methods: The subjects were 10 adult orthodontic patients treated with non-extraction. The multiple camera matrices were used to define transformative relationships between tooth images of the 2D panoramic radiographs and the 3D virtual maxillary cast models. After construction of the root apex-specific projective (RASP) models, overdetermined equations were used to calculate the 3D root apex position with a direct linear transformation algorithm and the known 2D co-ordinates of the root apex in the panoramic radiograph. For verification of the estimated 3D root apex position, the RASP and 3D-CT models were superimposed using a best-fit method. Then, the values of estimation error (EE; mean, standard deviation, minimum error and maximum error) between the two models were calculated. Results: The intraclass correlation coefficient values exhibited good reliability for the landmark identification. The mean EE of all root apices of maxillary teeth was 1.88 mm. The EE values, in descending order, were as follows: canine, 2.30 mm; first premolar, 1.93 mm; second premolar, 1.91 mm; first molar, 1.83 mm; second molar, 1.82 mm; lateral incisor, 1.80 mm; and central incisor, 1.53 mm. Conclusions: Camera calibration technology allows reliable determination of the 3D root apex position of maxillary teeth without the need for 3D-CT scan or tooth templates. PMID:26317151
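The overdetermined direct linear transformation step can be sketched as a homogeneous least-squares problem solved by SVD (the camera matrices and image points below are illustrative, not the study's data):

```python
import numpy as np

def dlt_triangulate(camera_matrices, image_points):
    """Recover a 3-D point from its 2-D projections under two or more known
    3x4 camera matrices: each observation (u, v) contributes the rows
    u*P[2]-P[0] and v*P[2]-P[1]; the point is the null vector of the stack,
    taken from the smallest singular value's right singular vector."""
    rows = []
    for P, (u, v) in zip(camera_matrices, image_points):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]
    return X[:3] / X[3]                       # dehomogenize
```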
NASA Technical Reports Server (NTRS)
Everett, Louis J.
1994-01-01
The work reported here demonstrates how to automatically compute the position and attitude of a targeting reflective alignment concept (TRAC) camera relative to the robot end effector. In the robotics literature this is known as the sensor registration problem. The registration problem is important to solve if TRAC images need to be related to robot position. Previously, when TRAC operated on the end of a robot arm, the camera had to be precisely located at the correct orientation and position. If this location is in error, then the robot may not be able to grapple an object even though the TRAC sensor indicates it should. In addition, if the camera is significantly far from the alignment it is expected to be at, TRAC may give incorrect feedback for the control of the robot. A simple example is if the robot operator thinks the camera is right side up but the camera is actually upside down, the camera feedback will tell the operator to move in an incorrect direction. The automatic calibration algorithm requires the operator to translate and rotate the robot by arbitrary amounts along (about) two coordinate directions. After the motion, the algorithm determines the transformation matrix from the robot end effector to the camera image plane. This report discusses the TRAC sensor registration problem.
Projector-Camera Systems for Immersive Training
2006-01-01
average to a sequence of 100 captured distortion-corrected images. The OpenCV library [OpenCV] was used for camera calibration. To correct for...rendering application [Treskunov, Pair, and Swartout, 2004]. It was transposed to take into account different matrix conventions between OpenCV and...Screen Imperfections. Proc. Workshop on Projector-Camera Systems (PROCAMS), Nice, France, IEEE. OpenCV: Open Source Computer Vision. [Available
Design and Calibration of a Dispersive Imaging Spectrometer Adaptor for a Fast IR Camera on NSTX-U
NASA Astrophysics Data System (ADS)
Reksoatmodjo, Richard; Gray, Travis; Princeton Plasma Physics Laboratory Team
2017-10-01
A dispersive spectrometer adaptor was designed, constructed and calibrated for use on a fast infrared camera employed to measure temperatures on the lower divertor tiles of the NSTX-U tokamak. This adaptor efficiently and evenly filters and distributes long-wavelength infrared photons between 8.0 and 12.0 microns across the 128x128 pixel detector of the fast IR camera. By determining the width of these separated wavelength bands across the camera detector, and then determining the corresponding average photon count for each photon wavelength, a very accurate measurement of the temperature, and thus the heat flux, of the divertor tiles can be made using Planck's law. This approach of designing an exterior dispersive adaptor for the fast IR camera allows accurate temperature measurements to be made of materials with unknown emissivity. Further, the relative simplicity and affordability of this adaptor design provides an attractive option over more expensive, slower, dispersive IR camera systems. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No. DE-AC02-09CH11466.
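Recovering temperature from measured radiance at a known wavelength amounts to inverting Planck's law; a minimal sketch (the band and temperature bracket are illustrative for the 8-12 micron range, and the paper's count-to-radiance calibration is not modeled):

```python
import math

H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m/s]
KB = 1.380649e-23    # Boltzmann constant [J/K]

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) from Planck's law [W sr^-1 m^-3]."""
    return (2.0 * H * C ** 2 / wavelength_m ** 5
            / math.expm1(H * C / (wavelength_m * KB * temp_k)))

def temperature_from_radiance(wavelength_m, radiance, lo=100.0, hi=3000.0):
    """Invert Planck's law for T by bisection; B is monotonically
    increasing in T, so the bracket [lo, hi] just has to contain T."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if planck_radiance(wavelength_m, mid) < radiance:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```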
NASA Astrophysics Data System (ADS)
Bruegge, Carol J.; Val, Sebastian; Diner, David J.; Jovanovic, Veljko; Gray, Ellyn; Di Girolamo, Larry; Zhao, Guangyu
2014-09-01
The Multi-angle Imaging SpectroRadiometer (MISR) has successfully operated on the EOS/ Terra spacecraft since 1999. It consists of nine cameras pointing from nadir to 70.5° view angle with four spectral channels per camera. Specifications call for a radiometric uncertainty of 3% absolute and 1% relative to the other cameras. To accomplish this, MISR utilizes an on-board calibrator (OBC) to measure camera response changes. Once every two months the two Spectralon panels are deployed to direct solar light into the cameras. Six photodiode sets measure the illumination levels, which are compared to MISR raw digital numbers to determine the radiometric gain coefficients used in Level 1 data processing. Although panel stability is not required, there has been little detectable change in panel reflectance, attributed to careful preflight handling techniques. The cameras themselves have degraded in radiometric response by 10% since launch, but calibration updates using the detector-based scheme have compensated for these drifts and allowed the radiance products to meet accuracy requirements. Validation using Sahara desert observations shows that there has been a drift of ~1% in the reported nadir-view radiance over a decade, common to all spectral bands.
Kinch, Kjartan M; Bell, James F; Goetz, Walter; Johnson, Jeffrey R; Joseph, Jonathan; Madsen, Morten Bo; Sohl-Dickstein, Jascha
2015-05-01
The Panoramic Cameras on NASA's Mars Exploration Rovers have each returned more than 17,000 images of their calibration targets. In order to make optimal use of this data set for reflectance calibration, a correction must be made for the presence of air fall dust. Here we present an improved dust correction procedure based on a two-layer scattering model, and we present a dust reflectance spectrum derived from long-term trends in the data set. The dust on the calibration targets appears brighter than dusty areas of the Martian surface. We derive detailed histories of dust deposition and removal revealing two distinct environments: at the Spirit landing site, half the year is dominated by dust deposition, the other half by dust removal, usually in brief, sharp events. At the Opportunity landing site the Martian year has a semiannual dust cycle with dust removal happening gradually throughout two removal seasons each year. The highest observed optical depth of settled dust on the calibration target is 1.5 on Spirit and 1.1 on Opportunity (at 601 nm). We derive a general prediction for dust deposition rates of 0.004 ± 0.001 in units of surface optical depth deposited per sol (Martian solar day) per unit atmospheric optical depth. We expect this procedure to lead to improved reflectance calibration of the Panoramic Camera data set. In addition, it is easily adapted to similar data sets from other missions in order to deliver improved reflectance calibration as well as data on dust reflectance properties and deposition and removal history.
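The quoted deposition rate lends itself to a simple accumulation sketch (deposition only; the removal events described in the abstract are not modeled here):

```python
# Settled-dust optical depth grows each sol in proportion to atmospheric
# optical depth, using the paper's rate of 0.004 per sol per unit
# atmospheric tau. Removal events would subtract from tau and are omitted.
DEPOSITION_RATE = 0.004

def accumulate_dust(atmospheric_tau_per_sol, tau0=0.0):
    """Return the settled-dust optical depth history over a sequence of sols."""
    tau = tau0
    history = []
    for tau_atm in atmospheric_tau_per_sol:
        tau += DEPOSITION_RATE * tau_atm
        history.append(tau)
    return history

# 100 sols at a constant atmospheric optical depth of 0.5
trace = accumulate_dust([0.5] * 100)
```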
NASA Technical Reports Server (NTRS)
Wachter, R.; Schou, Jesper; Rabello-Soares, M. C.; Miles, J. W.; Duvall, T. L., Jr.; Bush, R. I.
2011-01-01
We describe the imaging quality of the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO) as measured during the ground calibration of the instrument. We describe the calibration techniques and report our results for the final configuration of HMI. We present the distortion, modulation transfer function, stray light, image shifts introduced by moving parts of the instrument, best focus, field curvature, and the relative alignment of the two cameras. We investigate the gain and linearity of the cameras, and present the measured flat field.
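A measured flat field supports the standard correction of dividing out pixel-to-pixel gain variation. A generic sketch with synthetic frames (not HMI data) illustrates the idea:

```python
import numpy as np

def flat_field_correct(raw, dark, flat):
    """Standard flat-field correction: subtract the dark frame, then divide
    by the normalized flat so uniform illumination maps to a uniform image."""
    flat_norm = (flat - dark) / np.mean(flat - dark)
    return (raw - dark) / flat_norm

# Synthetic example: pixel-to-pixel gain variation of roughly +/-10%
rng = np.random.default_rng(0)
gain = 1.0 + 0.1 * rng.standard_normal((64, 64))
dark = np.full((64, 64), 100.0)
flat = dark + 1000.0 * gain   # flat-lamp exposure through the optics
raw = dark + 500.0 * gain     # uniform scene through the same optics
corrected = flat_field_correct(raw, dark, flat)
```

After correction the synthetic uniform scene is flat to numerical precision, since the per-pixel gain cancels exactly in the division.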
Lytro camera technology: theory, algorithms, performance analysis
NASA Astrophysics Data System (ADS)
Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio
2013-03-01
The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization, aided by the increase in computational power, that characterizes mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, treating the Lytro camera as a black box and using our interpretation of the image data saved by the camera. We present our findings based on our interpretation of the Lytro camera file structure, image calibration, and image rendering; in this context, artifacts and final image resolution are discussed.
Design, demonstration and testing of low F-number LWIR panoramic imaging relay optics
NASA Astrophysics Data System (ADS)
Furxhi, Orges; Frascati, Joe; Driggers, Ronald
2018-04-01
Panoramic imaging is inherently wide field of view. High-sensitivity uncooled Long Wave Infrared (LWIR) imaging requires low F-number optics. These two requirements result in short back working distance designs that, in addition to being costly, are challenging to integrate with commercially available uncooled LWIR cameras and cores. Common challenges include the relocation of the shutter flag, custom calibration of the camera dynamic range and NUC tables, focusing, and athermalization. Solutions to these challenges add to the system cost and make panoramic uncooled LWIR cameras commercially unattractive. In this paper, we present the design of Panoramic Imaging Relay Optics (PIRO) and show imagery and test results with one of the first prototypes. PIRO designs use several reflective surfaces (generally two) to relay a panoramic scene onto a real, donut-shaped image. The PIRO donut is imaged onto the focal plane of the camera using a commercial off-the-shelf (COTS) low F-number lens. This approach results in low component cost and effortless integration with pre-calibrated commercially available cameras and lenses.
Localization and Mapping Using a Non-Central Catadioptric Camera System
NASA Astrophysics Data System (ADS)
Khurana, M.; Armenakis, C.
2018-05-01
This work details the development of an indoor navigation and mapping system using a non-central catadioptric omnidirectional camera and its implementation for mobile applications. Omnidirectional catadioptric cameras find use in the navigation and mapping of robotic platforms owing to their wide field of view. Having a wider, potentially 360°, field of view allows the system to "see and move" more freely in the navigation space. A catadioptric camera system is a low-cost system consisting of a mirror and a camera; any perspective camera can be used. A platform was constructed to combine the mirror and a camera into a catadioptric system. A calibration method was developed to obtain the relative position and orientation between the two components so that they can be considered one monolithic system. The mathematical model for localizing the system was determined using conditions based on the reflective properties of the mirror. The obtained platform positions were then used to map the environment using epipolar geometry. Experiments were performed to test the mathematical models and the achieved localization and mapping accuracies of the system. An iterative process of positioning and mapping was applied to determine object coordinates of an indoor environment while navigating the mobile platform. Camera localization and the 3D coordinates of object points achieved decimetre-level accuracies.
Henrion, Sebastian; Spoor, Cees W; Pieters, Remco P M; Müller, Ulrike K; van Leeuwen, Johan L
2015-07-07
Images of underwater objects are distorted by refraction at the water-glass-air interfaces, and these distortions can lead to substantial errors when reconstructing the objects' position and shape. So far, aquatic locomotion studies have minimized refraction in their experimental setups and used the direct linear transform algorithm (DLT) to reconstruct position information, which does not model refraction explicitly. Here we present a refraction-corrected ray-tracing algorithm (RCRT) that reconstructs position information using Snell's law. We validated this reconstruction by calculating the 3D reconstruction error, the difference between the actual and reconstructed position of a marker. We found that the reconstruction error is small (typically less than 1%). Compared with the DLT algorithm, the RCRT has overall lower reconstruction errors, especially outside the calibration volume, and errors are essentially insensitive to camera position and orientation and to the number and position of the calibration points. To demonstrate the effectiveness of the RCRT, we tracked an anatomical marker on a seahorse recorded with four cameras to reconstruct the swimming trajectory for six different camera configurations. The RCRT algorithm is accurate and robust, allows cameras to be oriented at large angles of incidence, and facilitates the development of accurate tracking algorithms to quantify aquatic manoeuvres.
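The core of a refraction-corrected ray tracer is Snell's law applied at each interface. A minimal vector-form sketch for a single flat air-water interface (the RCRT itself traces water-glass-air and is not reproduced here):

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a flat interface with unit normal n
    (pointing toward the incident side), using Snell's law in vector form."""
    d = d / np.linalg.norm(d)
    cos_i = -np.dot(n, d)
    r = n1 / n2
    k = 1.0 - r**2 * (1.0 - cos_i**2)
    if k < 0:
        return None  # total internal reflection
    return r * d + (r * cos_i - np.sqrt(k)) * n

# Air-to-water refraction of a ray 30 degrees off the interface normal
n_air, n_water = 1.0, 1.333
d_in = np.array([np.sin(np.radians(30)), 0.0, -np.cos(np.radians(30))])
normal = np.array([0.0, 0.0, 1.0])
d_out = refract(d_in, normal, n_air, n_water)

# Exit angle from the normal; Snell predicts arcsin(sin(30°)/1.333) ≈ 22.0°
theta_out = np.degrees(np.arcsin(np.hypot(d_out[0], d_out[1])))
```

A full reconstruction would intersect such refracted rays from multiple calibrated cameras and solve for the 3D point minimizing the ray-to-ray distance.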
Ribeiro-Gomes, Krishna; Hernández-López, David; Ortega, José F; Ballesteros, Rocío; Poblete, Tomás; Moreno, Miguel A
2017-09-23
The acquisition, processing, and interpretation of thermal images from unmanned aerial vehicles (UAVs) is becoming a useful source of information for agronomic applications because of the higher temporal and spatial resolution of these products compared with those obtained from satellites. However, due to the low payload capacity of UAVs, they must mount lightweight, uncooled thermal cameras, in which the microbolometer is not stabilized to a constant temperature. This limits camera precision for many applications. Additionally, the low contrast of the thermal images makes the photogrammetry process inaccurate, which results in large errors in the generation of orthoimages. In this research, we propose the use of new calibration algorithms, based on neural networks, which consider the sensor temperature and the digital response of the microbolometer as input data. In addition, we evaluate the use of the Wallis filter for improving the quality of the photogrammetry process using structure-from-motion software. With the proposed calibration algorithm, the measurement accuracy increased from 3.55 °C with the original camera configuration to 1.37 °C. The implementation of the Wallis filter increased the number of tie-points from 58,000 to 110,000 and decreased the total positioning error from 7.1 m to 1.3 m.
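As a stand-in for the proposed calibration network (the architecture, training data, and functional form below are invented for illustration and are not the authors'), a small feed-forward network mapping (DN, sensor temperature) to scene temperature can be trained like this:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data (hypothetical): the target temperature depends on
# both the microbolometer's digital response (dn) and the sensor's own
# temperature (t_sensor), mimicking an unstabilized uncooled camera.
dn = rng.uniform(0.0, 1.0, 500)
t_sensor = rng.uniform(0.0, 1.0, 500)
t_true = 20.0 + 15.0 * dn - 5.0 * t_sensor + 2.0 * dn * t_sensor

X = np.column_stack([dn, t_sensor])
y = ((t_true - t_true.mean()) / t_true.std())[:, None]  # standardized target

# One hidden layer of 8 tanh units, trained by full-batch gradient descent.
W1 = 0.5 * rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)           # forward pass
    err = H @ W2 + b2 - y              # prediction error
    dH = (err @ W2.T) * (1.0 - H**2)   # backprop through tanh
    W2 -= lr * H.T @ err / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(0)

rmse_z = float(np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```

The key design point carried over from the abstract is the two-feature input: feeding the sensor temperature alongside the raw response lets the model compensate for microbolometer drift.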
Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor.
Naqvi, Rizwan Ali; Arsalan, Muhammad; Batchuluun, Ganbayar; Yoon, Hyo Sik; Park, Kang Ryoung
2018-02-03
A paradigm shift is required to prevent the increasing automobile accident deaths that are mostly due to the inattentive behavior of drivers. Knowledge of gaze region can provide valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents some challenges: dizziness due to long drives, extreme lighting variations, glasses reflections, and occlusions. Past studies on gaze detection in cars have been chiefly based on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, the error of accurately detecting the pupil center and corneal reflection center is increased in a car environment due to various environment light changes, reflections on glasses surface, and motion and optical blurring of captured eye image. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address this issue, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor considering driver head and eye movement that does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on open Columbia gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than the previous gaze classification methods.
Improving Photometric Calibration of Meteor Video Camera Systems.
Ehlert, Steven; Kingery, Aaron; Suggs, Robert
2017-09-01
We present the results of new calibration tests performed by the NASA Meteoroid Environment Office (MEO) designed to help quantify and minimize systematic uncertainties in meteor photometry from video camera observations. These systematic uncertainties can be categorized by two main sources: an imperfect understanding of the linearity correction for the MEO's Watec 902H2 Ultimate video cameras and uncertainties in meteor magnitudes arising from transformations between the Watec camera's Sony EX-View HAD bandpass and the bandpasses used to determine reference star magnitudes. To address the first point, we have measured the linearity response of the MEO's standard meteor video cameras using two independent laboratory tests on eight cameras. Our empirically determined linearity correction is critical for performing accurate photometry at low camera intensity levels. With regard to the second point, we have calculated synthetic magnitudes in the EX bandpass for reference stars. These synthetic magnitudes enable direct calculation of the meteor's photometric flux within the camera bandpass without requiring any assumptions about its spectral energy distribution. Systematic uncertainties in the synthetic magnitudes of individual reference stars are estimated at ∼0.20 mag, and are limited by the available spectral information in the reference catalogs. These two improvements allow for zero points accurate to ∼0.05-0.10 mag in both filtered and unfiltered camera observations with no evidence for lingering systematics. These improvements are essential to accurately measuring photometric masses of individual meteors and source mass indexes.
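Zero-point calibration against reference stars of known in-band magnitude reduces to averaging per-star offsets; a sketch with hypothetical values:

```python
import numpy as np

# Hypothetical reference stars: synthetic catalog magnitudes in the camera
# bandpass paired with the instrumental fluxes measured for them (ADU/s).
m_ref = np.array([4.2, 5.1, 5.8, 6.4, 7.0])
flux = np.array([12000.0, 5200.0, 2750.0, 1600.0, 920.0])

# m = -2.5 log10(flux) + ZP  =>  ZP is the mean offset over reference stars
zp_per_star = m_ref + 2.5 * np.log10(flux)
zp = zp_per_star.mean()
zp_err = zp_per_star.std(ddof=1) / np.sqrt(len(zp_per_star))

def meteor_magnitude(meteor_flux):
    """Apply the fitted zero point to an instrumental meteor flux."""
    return -2.5 * np.log10(meteor_flux) + zp
```

The scatter `zp_err` is the statistic that the abstract's ∼0.05-0.10 mag zero-point accuracy refers to; synthetic in-band magnitudes remove the bandpass-transformation term from that error budget.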
Radiometric Cross-Calibration of the HJ-1B IRS in the Thermal Infrared Spectral Band
NASA Astrophysics Data System (ADS)
Sun, K.
2012-12-01
Natural calamities occur frequently, and environmental pollution and degradation remain severe worldwide, constraining societal and economic development. Satellite remote sensing technology plays an important role in improving the monitoring of environmental pollution and natural disasters. Radiometric calibration is a precondition for quantitative remote sensing; its accuracy determines the quality of the retrieved parameters. Since the China Environment Satellite (HJ-1A/B) was launched successfully on September 6th, 2008, it has played an important role in the economic development of China. The satellite has four infrared bands, one of which is thermal infrared. As quantitative remote sensing applications grow in China, finding an appropriate calibration method becomes increasingly important. Many independent methods can be used for absolute radiometric calibration. In this paper, according to the characteristics of the thermal infrared channel of the HJ-1B thermal infrared multi-spectral camera, the thermal infrared spectral band of the HJ-1B IRS was cross-calibrated against MODIS data. First, the corresponding bands of the two sensors were identified. Second, MODTRAN was run to analyze the influence of differing spectral responses, satellite view zenith angle, atmospheric conditions, and temperature on the match factor. Finally, the band match factor was calculated at different temperatures, accounting for the dissimilar band responses of the matched bands. Seven images of Lake Qinghai acquired at different times were chosen as the calibration data. From the MODIS radiance and the match factor, the IRS radiance was calculated, and the calibration coefficients were then obtained by linearly regressing the radiance against the DN values.
We compared the result of this cross-calibration with that of the onboard blackbody calibration, and the consistency was good. The maximum difference in brightness temperature between HJ-1B IRS band 4 and MODIS band 31 is less than 1 K. Cross-calibration is therefore a rapid and economical way to obtain calibration coefficients for HJ-1B; however, the match factor calculation method needs further research in order to further improve cross-calibration precision.
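The final regression step can be sketched as follows; the match factor and the matched Lake Qinghai observations below are invented placeholders, not the paper's values:

```python
import numpy as np

# Hypothetical matched observations: MODIS band-31 radiances transferred to
# the IRS band via a spectral match factor, paired with IRS raw DN values
# from the same scenes (units: W m^-2 sr^-1 um^-1 for radiance).
match_factor = 0.96
modis_radiance = np.array([7.8, 8.4, 9.1, 9.7, 10.2, 10.9, 11.5])
irs_dn = np.array([1480.0, 1602.0, 1745.0, 1863.0, 1968.0, 2110.0, 2231.0])

irs_radiance = match_factor * modis_radiance

# Linear calibration model L = gain * DN + offset, fit by least squares
gain, offset = np.polyfit(irs_dn, irs_radiance, 1)
```

With `gain` and `offset` in hand, any IRS DN can be converted to at-sensor radiance, which is then compared against the onboard blackbody calibration as described above.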
Estimation of Anthocyanin Content of Berries by NIR Method
NASA Astrophysics Data System (ADS)
Zsivanovits, G.; Ludneva, D.; Iliev, A.
2010-01-01
Anthocyanin contents of fruits were estimated by a VIS spectrophotometer and compared with spectra measured by a NIR spectrophotometer (600-1100 nm, step 10 nm). The aim was to find a relationship between the NIR method and the traditional spectrophotometric method. The NIR testing protocol is easier, faster, and non-destructive. NIR spectra were prepared in pairs, reflectance and transmittance. A modular spectrocomputer, realized on the basis of a monochromator and peripherals from Bentham Instruments Ltd (GB), and a photometric camera created at the Canning Research Institute were used. An important feature of this camera is the possibility it offers for simultaneous measurement of both transmittance and reflectance with geometry patterns T0/180 and R0/45. The collected spectra were analyzed with CAMO Unscrambler 9.1 software using PCA, PLS, and PCR methods. Based on the analyzed spectra, quality- and quantity-sensitive calibrations were prepared. The results showed that the NIR method allows measurement of the total anthocyanin content in fresh berry fruits or processed products without destroying them.
Calibration Target as Seen by Mars Hand Lens Imager
2012-02-07
During pre-flight testing, the Mars Hand Lens Imager (MAHLI) camera on NASA's Mars rover Curiosity took this image of the MAHLI calibration target from a distance of 3.94 inches (10 centimeters).
A New Technique for Precision Photometry Using Alt/Az Telescopes
NASA Astrophysics Data System (ADS)
Kirkaptrick, Colin; Stacey, Piper; Swift, Jonathan
2018-06-01
We present and test a new method for flat field calibration of images obtained on telescopes with altitude-azimuth (Alt-Az) mounts. Telescopes using Alt-Az mounts typically employ a field “de-rotator” to account for changing parallactic angles of targets observed across the sky, or for long exposures of a single target. This “de-rotation” results in a changing orientation of the telescope optics with respect to the camera. This, in turn, can result in a flat field that is a function of camera orientation due to, for example, vignetting. In order to account for these changes we develop and test a new flat field technique using the observations of known transiting exoplanets.
NASA Astrophysics Data System (ADS)
Romanishkin, Igor D.; Grachev, Pavel V.; Pominova, Daria V.; Burmistrov, Ivan A.; Sildos, Ilmo; Vanetsev, Alexander S.; Orlovskaya, Elena O.; Orlovskii, Yuri V.; Loschenov, Victor B.; Ryabova, Anastasia V.
2018-04-01
In this work we investigated the use of composite crystalline core/shell nanoparticles LaF3:Nd3+(1%)@DyPO4 for fluorescence-based contactless thermometry, as well as the laser-induced hyperthermia effect, in an optical model of biological tissue with a modeled neoplasm. In preparation for this, a thermal calibration of the nanoparticles' luminescence spectra was carried out. The results of the spectroscopic temperature measurement were compared to infrared thermal camera measurements. The comparison showed a significant difference between the temperature recorded with the IR camera and the actual temperature of the nanoparticles at depth in the tissue model: the temperature calculated using the spectral method was up to 10 °C higher.
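Luminescence thermometry of this kind is commonly calibrated through a Boltzmann intensity ratio of emission lines from thermally coupled levels; a generic sketch (the level separation and prefactor are assumed illustrative values, not measured constants for LaF3:Nd3+@DyPO4):

```python
import numpy as np

kB = 8.617333262e-5  # Boltzmann constant, eV/K

# Boltzmann ratio thermometry: the intensity ratio of two emission lines
# from thermally coupled levels follows R = A * exp(-dE / (kB * T)).
dE = 0.12   # assumed energy separation of the coupled levels, eV
A = 4.0     # assumed pre-exponential constant from calibration

def ratio_from_temperature(T):
    return A * np.exp(-dE / (kB * T))

def temperature_from_ratio(R):
    """Invert the Boltzmann relation to read temperature from a line ratio."""
    return dE / (kB * np.log(A / R))

# Round trip at body-like temperature
T_est = temperature_from_ratio(ratio_from_temperature(310.0))
```

The attraction of the ratio method, consistent with the abstract's finding, is that it reads the nanoparticles' own temperature at depth, whereas an IR camera only sees the surface.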
DIGITAL CARTOGRAPHY OF THE PLANETS: NEW METHODS, ITS STATUS, AND ITS FUTURE.
Batson, R.M.
1987-01-01
A system has been developed that establishes a standardized cartographic database for each of the 19 planets and major satellites that have been explored to date. Compilation of the databases involves both traditional and newly developed digital image processing and mosaicking techniques, including radiometric and geometric corrections of the images. Each database, or digital image model (DIM), is a digital mosaic of spacecraft images that have been radiometrically and geometrically corrected and photometrically modeled. During compilation, ancillary data files such as radiometric calibrations and refined photometric values for all camera lens and filter combinations and refined camera-orientation matrices for all images used in the mapping are produced.
Automatic Camera Orientation and Structure Recovery with Samantha
NASA Astrophysics Data System (ADS)
Gherardi, R.; Toldo, R.; Garro, V.; Fusiello, A.
2011-09-01
SAMANTHA is a software package capable of computing camera orientation and structure recovery from a sparse block of casual images without human intervention. It can process either calibrated or uncalibrated images; in the latter case an autocalibration routine is run. Pictures are organized into a hierarchical tree which has single images as leaves and partial reconstructions as internal nodes. The method proceeds bottom up until it reaches the root node, corresponding to the final result. This framework is one order of magnitude faster than sequential approaches, inherently parallel, and less sensitive to the error accumulation that causes drift. We have verified the quality of our reconstructions both qualitatively, by producing compelling point clouds, and quantitatively, by comparing them with laser scans serving as ground truth.
Hardware in the Loop Performance Assessment of LIDAR-Based Spacecraft Pose Determination
Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele
2017-01-01
In this paper an original, easy to reproduce, semi-analytic calibration approach is developed for hardware-in-the-loop performance assessment of pose determination algorithms processing point cloud data, collected by imaging a non-cooperative target with LIDARs. The laboratory setup includes a scanning LIDAR, a monocular camera, a scaled-replica of a satellite-like target, and a set of calibration tools. The point clouds are processed by uncooperative model-based algorithms to estimate the target relative position and attitude with respect to the LIDAR. Target images, acquired by a monocular camera operated simultaneously with the LIDAR, are processed applying standard solutions to the Perspective-n-Points problem to get high-accuracy pose estimates which can be used as a benchmark to evaluate the accuracy attained by the LIDAR-based techniques. To this aim, a precise knowledge of the extrinsic relative calibration between the camera and the LIDAR is essential, and it is obtained by implementing an original calibration approach which does not need ad-hoc homologous targets (e.g., retro-reflectors) easily recognizable by the two sensors. The pose determination techniques investigated by this work are of interest to space applications involving close-proximity maneuvers between non-cooperative platforms, e.g., on-orbit servicing and active debris removal.
Detailed in situ laser calibration of the infrared imaging video bolometer for the JT-60U tokamak
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parchamy, H.; Peterson, B. J.; Konoshima, S.
2006-10-15
The infrared imaging video bolometer (IRVB) in JT-60U includes a single graphite-coated gold foil with an effective area of 9x7 cm² and a thickness of 2.5 μm. The thermal images of the foil resulting from the plasma radiation are provided by an IR camera. The calibration technique of the IRVB gives confidence in the absolute levels of the measured values of the plasma radiation. The in situ calibration is carried out in order to obtain local foil properties such as the thermal diffusivity κ and the product of the thermal conductivity k and the thickness t_f of the foil. These quantities are necessary for solving the two-dimensional heat diffusion equation of the foil which is used in the experiments. These parameters are determined by comparing the measured temperature profiles (for k·t_f) and their decays (for κ) with the corresponding results of a finite element model using the measured HeNe laser power profile as a known radiation power source. The infrared camera (Indigo/Omega) is calibrated by fitting the temperature rise of a heated plate to the resulting camera data using the Stefan-Boltzmann law.
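The Stefan-Boltzmann fit mentioned for the IR camera can be sketched as a linear fit of camera counts against radiated power; the plate temperatures and counts below are hypothetical, not JT-60U calibration data:

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Hypothetical calibration data: known heated-plate temperatures (K) and
# the camera counts recorded at each setpoint.
T_plate = np.array([300.0, 320.0, 340.0, 360.0, 380.0])
counts = np.array([1118.6, 1389.2, 1715.5, 2104.8, 2564.7])

# Linear model in radiated power: counts = a * sigma * T^4 + b
a, b = np.polyfit(SIGMA * T_plate**4, counts, 1)

def counts_to_temperature(c):
    """Invert the fitted Stefan-Boltzmann response to read temperature."""
    return ((c - b) / (a * SIGMA)) ** 0.25
```

In a real calibration the plate emissivity would multiply the σT⁴ term; it is absorbed into the fitted coefficient `a` here.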
Close Range Calibration of Long Focal Length Lenses in a Changing Environment
NASA Astrophysics Data System (ADS)
Robson, Stuart; MacDonald, Lindsay; Kyle, Stephen; Shortis, Mark R.
2016-06-01
University College London is currently developing a large-scale multi-camera system for dimensional control tasks in manufacturing, including part machining, assembly and tracking, as part of the Light Controlled Factory project funded by the UK Engineering and Physical Sciences Research Council. In parallel, as part of the EU LUMINAR project funded by the European Association of National Metrology Institutes, refraction models of the atmosphere in factory environments are being developed with the intent of modelling and eliminating the effects of temperature and other variations. The accuracy requirements for both projects are extremely demanding, so accordingly improvements in the modelling of both camera imaging and the measurement environment are essential. At the junction of these two projects lies close range camera calibration. The accurate and reliable calibration of cameras across a realistic range of atmospheric conditions in the factory environment is vital in order to eliminate systematic errors. This paper demonstrates the challenge of experimentally isolating environmental effects at the level of a few tens of microns. Longer lines of sight promote the use and calibration of a near perfect perspective projection from a Kern 75 mm lens with maximum radial distortion of the order of 0.5 μm. Coordination of a reference target array, representing a manufactured part, is achieved to better than 0.1 mm at a standoff of 8 m. More widely, results contribute to better sensor understanding, improved mathematical modelling of factory environments and more reliable coordination of targets to 0.1 mm and better over large volumes.
Calibration Plans for the Multi-angle Imaging SpectroRadiometer (MISR)
NASA Astrophysics Data System (ADS)
Bruegge, C. J.; Duval, V. G.; Chrien, N. L.; Diner, D. J.
1993-01-01
The EOS Multi-angle Imaging SpectroRadiometer (MISR) will study the ecology and climate of the Earth through acquisition of global multi-angle imagery. The MISR employs nine discrete cameras, each a push-broom imager. Of these, four point forward, four point aft and one views the nadir. Absolute radiometric calibration will be obtained pre-flight using high quantum efficiency (HQE) detectors and an integrating sphere source. After launch, instrument calibration will be provided using HQE detectors in conjunction with deployable diffuse calibration panels. The panels will be deployed at time intervals of one month and used to direct sunlight into the cameras, filling their fields-of-view and providing through-the-optics calibration. Additional techniques will be utilized to reduce systematic errors, and provide continuity as the methodology changes with time. For example, radiation-resistant photodiodes will also be used to monitor panel radiant exitance. These data will be acquired throughout the five-year mission, to maintain calibration in the latter years when it is expected that the HQE diodes will have degraded. During the mission, it is planned that the MISR will conduct semi-annual ground calibration campaigns, utilizing field measurements and higher resolution sensors (aboard aircraft or in-orbit platforms) to provide a check of the on-board hardware. These ground calibration campaigns are limited in number, but are believed to be the key to the long-term maintenance of MISR radiometric calibration.
Vann, C.
1998-03-24
The Laser Pulse Sampler (LPS) measures temporal pulse shape without the problems of a streak camera. Unlike the streak camera, the laser pulse directly illuminates a camera in the LPS, i.e., no additional equipment or energy conversions are required. The LPS has several advantages over streak cameras. The dynamic range of the LPS is limited only by the range of its camera, which for a cooled camera can be as high as 16 bits, i.e., 65,536. The LPS costs less because there are fewer components, and those components can be mass produced. The LPS is easier to calibrate and maintain because there is only one energy conversion, i.e., photons to electrons, in the camera.
Moonrungsee, Nuntaporn; Pencharee, Somkid; Jakmunee, Jaroon
2015-05-01
A field-deployable colorimetric analyzer based on an Android mobile phone was developed for the determination of available phosphorus content in soil. An inexpensive mobile phone with an embedded digital camera was used to photograph the chemical solution under test. The method involved a reaction of the phosphorus (orthophosphate form), ammonium molybdate and potassium antimonyl tartrate to form phosphomolybdic acid, which was reduced by ascorbic acid to produce the intensely colored molybdenum blue. A software program was developed for the phone to record and analyze the RGB color of the picture. A light-tight box with an LED light to control illumination was fabricated to improve the precision and accuracy of the measurement. Under the optimum conditions, the calibration graph was created by measuring the blue color intensity of a series of standard phosphorus solutions (0.0-1.0 mg P L⁻¹); the calibration equation obtained was then retained by the program for the analysis of sample solutions. The results obtained from the proposed method agreed well with the spectrophotometric method, with a detection limit of 0.01 mg P L⁻¹ and a sample throughput of about 40 h⁻¹. The developed system provided good accuracy (RE < 5%) and precision (RSD < 2%, intra- and inter-day), fast and cheap analysis, and is especially convenient for in-field soil analysis of the phosphorus nutrient.
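The calibration-graph step described above is a simple linear fit of color intensity against standard concentrations; a sketch in Python (the intensity values are invented for illustration, not the paper's measured data):

```python
import numpy as np

# invented blue-channel intensities for a series of standard phosphorus
# solutions -- illustrative stand-ins for the measured calibration data
conc = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])        # mg P L^-1
blue = np.array([5.0, 24.8, 45.1, 64.9, 85.2, 105.0])  # mean B intensity

# least-squares calibration line: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc, blue, 1)

def concentration(intensity):
    """Invert the calibration line to read a sample concentration."""
    return (intensity - intercept) / slope
```

The phone application would store `slope` and `intercept` and apply `concentration()` to each sample photograph.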
Figl, Michael; Ede, Christopher; Hummel, Johann; Wanschitz, Felix; Ewers, Rolf; Bergmann, Helmar; Birkfellner, Wolfgang
2005-11-01
Ever since the development of the first applications in image-guided therapy (IGT), the use of head-mounted displays (HMDs) was considered an important extension of existing IGT technologies. Several approaches to utilizing HMDs and modified medical devices for augmented reality (AR) visualization were implemented. These approaches include video see-through systems, semitransparent mirrors, modified endoscopes, and modified operating microscopes. Common to all these devices is the fact that a precise calibration between the display and three-dimensional coordinates in the patient's frame of reference is compulsory. In optical see-through devices based on complex optical systems such as operating microscopes or operating binoculars (as in the case of the system presented in this paper), this procedure can become increasingly difficult since precise camera calibration for every focus and zoom position is required. We present a method for fully automatic calibration of the operating binocular Varioscope M5 AR for the full range of zoom and focus settings available. Our method uses a special calibration pattern, a linear guide driven by a stepping motor, and special calibration software. The overlay error in the calibration plane was found to be 0.14-0.91 mm, which is less than 1% of the field of view. Using the motorized calibration rig presented in the paper, we were also able to assess the dynamic latency when viewing augmentation graphics on a mobile target; the spatial displacement due to latency was found to be in the range of 1.1-2.8 mm maximum, and the disparity between the true object and its computed overlay corresponded to a latency of 0.1 s. We conclude that the automatic calibration method presented in this paper is sufficient in terms of accuracy and time requirements for standard uses of optical see-through systems in a clinical environment.
The PanCam Instrument for the ExoMars Rover
NASA Astrophysics Data System (ADS)
Coates, A. J.; Jaumann, R.; Griffiths, A. D.; Leff, C. E.; Schmitz, N.; Josset, J.-L.; Paar, G.; Gunn, M.; Hauber, E.; Cousins, C. R.; Cross, R. E.; Grindrod, P.; Bridges, J. C.; Balme, M.; Gupta, S.; Crawford, I. A.; Irwin, P.; Stabbins, R.; Tirsch, D.; Vago, J. L.; Theodorou, T.; Caballo-Perucha, M.; Osinski, G. R.; PanCam Team
2017-07-01
The scientific objectives of the ExoMars rover are designed to answer several key questions in the search for life on Mars. In particular, the unique subsurface drill will address some of these, such as the possible existence and stability of subsurface organics. PanCam will establish the surface geological and morphological context for the mission, working in collaboration with other context instruments. Here, we describe the PanCam scientific objectives in geology, atmospheric science, and 3-D vision. We discuss the design of PanCam, which includes a stereo pair of Wide Angle Cameras (WACs), each of which has an 11-position filter wheel and a High Resolution Camera (HRC) for high-resolution investigations of rock texture at a distance. The cameras and electronics are housed in an optical bench that provides the mechanical interface to the rover mast and a planetary protection barrier. The electronic interface is via the PanCam Interface Unit (PIU), and power conditioning is via a DC-DC converter. PanCam also includes a calibration target mounted on the rover deck for radiometric calibration, fiducial markers for geometric calibration, and a rover inspection mirror.
Television camera as a scientific instrument
NASA Technical Reports Server (NTRS)
Smokler, M. I.
1970-01-01
A rigorous calibration program, coupled with a sophisticated data-processing program that introduced compensation for system response to correct photometry, geometric linearity, and resolution, converted a television camera into a quantitative measuring instrument. The output data are in the form of both numeric printout records and photographs.
NASA Astrophysics Data System (ADS)
Pagnutti, Mary; Ryan, Robert E.; Cazenavette, George; Gold, Maxwell; Harlan, Ryan; Leggett, Edward; Pagnutti, James
2017-01-01
A comprehensive radiometric characterization of raw-data format imagery acquired with the Raspberry Pi 3 and V2.1 camera module is presented. The Raspberry Pi is a high-performance single-board computer designed to educate and solve real-world problems. This small computer supports a camera module that uses a Sony IMX219 8 megapixel CMOS sensor. This paper shows that scientific and engineering-grade imagery can be produced with the Raspberry Pi 3 and its V2.1 camera module. Raw imagery is shown to be linear with exposure and gain (ISO), which is essential for scientific and engineering applications. Dark frame, noise, and exposure stability assessments along with flat fielding results, spectral response measurements, and absolute radiometric calibration results are described. This low-cost imaging sensor, when calibrated to produce scientific quality data, can be used in computer vision, biophotonics, remote sensing, astronomy, high dynamic range imaging, and security applications, to name a few.
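The dark-frame and flat-fielding steps named above follow the standard radiometric correction recipe; a minimal sketch with synthetic frames (the gain map and pixel values are illustrative, not measured Raspberry Pi data):

```python
import numpy as np

def flat_field_correct(raw, dark, flat, flat_dark):
    """Radiometric correction: subtract the dark frame, then divide by the
    mean-normalized, dark-subtracted flat field."""
    gain = flat - flat_dark
    gain = gain / gain.mean()
    return (raw - dark) / gain

# synthetic check: a known scene imaged through a pixel gain map
gain_map = np.array([[0.9, 1.1], [1.05, 0.95]])       # mean exactly 1.0
scene    = np.array([[100.0, 200.0], [150.0, 50.0]])
dark     = np.full((2, 2), 10.0)
raw      = scene * gain_map + dark
flat     = gain_map * 500.0 + dark                    # uniform-field exposure
recovered = flat_field_correct(raw, dark, flat, dark)
```

With a linear sensor response (as the abstract reports for raw imagery), this correction recovers the scene up to a global scale.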
Calibration of the motor-assisted robotic stereotaxy system: MARS.
Heinig, Maximilian; Hofmann, Ulrich G; Schlaefer, Alexander
2012-11-01
The motor-assisted robotic stereotaxy system presents a compact and lightweight robotic system for stereotactic neurosurgery. Our system is designed to position probes in the human brain for various applications, for example, deep brain stimulation. It features five fully automated axes. High positioning accuracy is of utmost importance in robotic neurosurgery. First, the key parameters of the robot's kinematics are determined using an optical tracking system. Next, the positioning errors at the center of the arc (which is equivalent to the target position in stereotactic interventions) are investigated using a set of perpendicular cameras. A modeless robot calibration method is introduced and evaluated. To conclude, the application accuracy of the robot is studied in a phantom trial. We identified the bending of the arc under load as the robot's main error source. A calibration algorithm was implemented to compensate for the deflection of the robot's arc. The mean error after the calibration was 0.26 mm, the 68.27th percentile was 0.32 mm, and the 95.45th was 0.50 mm. The kinematic properties of the robot were measured, and based on the results an appropriate calibration method was derived. With mean errors smaller than those of currently used mechanical systems, our results show that the robot's accuracy is appropriate for stereotactic interventions.
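The error statistics quoted above (mean, 68.27th and 95.45th percentiles, i.e. the 1σ and 2σ analogues) are straightforward to reproduce from a set of measured errors; a sketch with simulated data (the error distribution is invented for illustration):

```python
import numpy as np

# simulated post-calibration positioning errors in mm (illustrative only)
rng = np.random.default_rng(0)
errors_mm = np.abs(rng.normal(0.0, 0.3, size=500))

mean_err = errors_mm.mean()
p68, p95 = np.percentile(errors_mm, [68.27, 95.45])  # 1-sigma / 2-sigma analogues
```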
Embedded Augmented Reality Training System for Dynamic Human-Robot Cooperation
2009-10-01
through (OST) head-mounted displays (HMDs) still lack in usability and ergonomics because of their size, weight, resolution, and the hard-to-realize...with addressable focal planes [10], for example. Accurate and easy-to-use calibration routines for OST HMDs remain a challenging task; established...methods are based on matching of virtual over real objects [11], newer approaches use cameras looking directly through the HMD optics to exploit both
Bisi, Maria Cristina; Stagni, Rita; Caroselli, Alessio; Cappello, Angelo
2015-08-01
Inertial sensors are becoming widely used for the assessment of human movement in both clinical and research applications, thanks to their usability outside the laboratory. This work proposes a method for calibrating anatomical landmark positions in the wearable sensor reference frame with an easy-to-use, portable and low-cost device. An off-the-shelf camera, a stick and a pattern attached to the inertial sensor compose the device. The proposed technique is referred to as the video Calibrated Anatomical System Technique (vCAST). The absolute orientation of a synthetic femur was tracked both using the vCAST together with an inertial sensor and using stereo-photogrammetry as a reference. Anatomical landmark calibration showed a mean absolute error of 0.6±0.5 mm; these errors are smaller than those affecting the in-vivo identification of anatomical landmarks. The roll, pitch and yaw anatomical frame orientations showed root mean square errors close to the accuracy limit of the wearable sensor used (1°), highlighting the reliability of the proposed technique. In conclusion, the present paper proposes and preliminarily verifies the performance of a method (vCAST) for calibrating anatomical landmark positions in the wearable sensor reference frame: the technique requires little time, is highly portable, easy to implement and usable outside the laboratory.
Schneider, Adrian; Pezold, Simon; Baek, Kyung-Won; Marinov, Dilyan; Cattin, Philippe C
2016-09-01
PURPOSE: During the past five decades, laser technology has emerged and is nowadays part of a great number of scientific and industrial applications. In the medical field, the integration of laser technology is on the rise and has already been widely adopted in contemporary medical applications. However, using a laser to cut bone and perform general osteotomy tasks is new. In this paper, we describe a method to calibrate a laser-deflecting tilting mirror and integrate it into a sophisticated laser osteotome involving next-generation robots and optical tracking. METHODS: A mathematical model was derived which describes a controllable deflection mirror by the general projective transformation. This makes the application of well-known camera calibration methods possible. In particular, the direct linear transformation algorithm is applied to calibrate and integrate a laser-deflecting tilting mirror into the affine transformation chain of a surgical system. RESULTS: Experiments were performed on synthetically generated calibration input, and the calibration was tested with real data. The determined target registration errors at a working distance of 150 mm for both simulated input and real data agree at the declared noise level of the applied optical 3D tracking system: the evaluation of the synthetic input showed an error of 0.4 mm, and the error with the real data was 0.3 mm.
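Modeling the mirror as a general projective transformation is what lets the direct linear transformation (DLT) be applied; a 2-D homography version of DLT as a sketch (the transform and point set are synthetic, not the surgical system's calibration data):

```python
import numpy as np

def dlt_homography(src, dst):
    """Estimate a 3x3 projective transform H (dst ~ H src) from >= 4 point
    correspondences with the direct linear transformation (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)          # null-space vector = flattened H
    return H / H[2, 2]

def apply_h(H, p):
    """Apply H to a 2-D point in homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# synthetic check: recover a known projective transform from 5 points
H_true = np.array([[1.0, 0.2, 5.0], [0.1, 1.1, -3.0], [0.001, 0.002, 1.0]])
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 2.0)]
dst = [tuple(apply_h(H_true, p)) for p in src]
H_est = dlt_homography(src, dst)
```

With exact correspondences the design matrix has a one-dimensional null space, so the smallest singular vector recovers H up to scale.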
Kim, Michele M; Zhu, Timothy C
2013-02-02
During HPPH-mediated pleural photodynamic therapy (PDT), it is critical to determine the anatomic geometry of the pleural surface quickly, as there may be movement during treatment resulting in changes to the cavity. We have developed a laser scanning device for this purpose, which has the potential to obtain the surface geometry in real time. A red diode laser with a holographic template to create a pattern and a camera with auto-focusing abilities are used to scan the cavity. In conjunction with a calibration against a known surface, we can use methods of triangulation to reconstruct the surface. Using a chest phantom, we are able to obtain a 360 degree scan of the interior in under 1 minute. The chest phantom scan was compared to an existing CT scan to determine its accuracy. The laser-camera separation can be determined through the calibration with 2 mm accuracy. The device is best suited for environments that are on the scale of a chest cavity (between 10 cm and 40 cm). This technique has the potential to produce cavity geometry in real time during treatment, which would enable PDT treatment dosage to be determined with greater accuracy. Work is ongoing to build a miniaturized device with increased accuracy that moves the light source and camera via a fiber-optic bundle commonly used for endoscopy.
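In its simplest planar form, the triangulation underlying such a scanner reduces to the classic relation depth = baseline × focal length / pixel offset, with the laser-camera separation recoverable from one known-depth calibration view; a sketch (the numbers are illustrative, and the real device calibrates against a full known surface):

```python
def laser_depth_mm(pixel_offset, baseline_mm, focal_px):
    """Planar laser-camera triangulation: the laser spot's pixel offset from
    the optical axis maps to depth as z = baseline * focal / offset."""
    if pixel_offset <= 0:
        raise ValueError("spot must be offset from the optical axis")
    return baseline_mm * focal_px / pixel_offset

def calibrate_baseline_mm(known_depth_mm, pixel_offset, focal_px):
    """Recover the laser-camera separation from one known-depth target."""
    return known_depth_mm * pixel_offset / focal_px
```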
NASA Astrophysics Data System (ADS)
Zhang, Shuo; Liu, Shaochuang; Ma, Youqing; Qi, Chen; Ma, Hao; Yang, Huan
2017-06-01
The Chang'e-3 was the first lunar soft-landing probe of China. It was composed of a lander and a lunar rover. The Chang'e-3 successfully landed in the northwest of the Mare Imbrium on December 14, 2013. The lunar rover carried out movement, imaging and geological survey tasks after landing. The lunar rover was equipped with a stereo vision system made up of the Navcam system, the mast mechanism and an inertial measurement unit (IMU). The Navcam system was composed of two fixed-focal-length cameras. The mast mechanism was a robot with three revolute joints. The stereo vision system was used to determine the position of the lunar rover, generate digital elevation models (DEMs) of the surrounding region and plan the moving paths of the lunar rover. The stereo vision system must be calibrated before use. A control field could be built to calibrate the stereo vision system in a laboratory on the earth. However, the parameters of the stereo vision system would change after the launch, the orbital changes, the braking and the landing. Therefore, the stereo vision system should be self-calibrated on the moon. An integrated self-calibration method based on bundle block adjustment is proposed in this paper. The bundle block adjustment uses each bundle of rays as the basic adjustment unit, and the adjustment is implemented over the whole photogrammetric region. The stereo vision system can be self-calibrated with the proposed method under the unknown lunar environment, and all parameters can be estimated simultaneously. The experiment was conducted in a ground lunar simulation field. The proposed method was compared with other methods such as the CAHVOR method, the vanishing point method, the Denavit-Hartenberg method, the factorization method and the weighted least-squares method. The analysis showed that the accuracy of the proposed method was superior to those of the other methods. Finally, the proposed method was used in practice to self-calibrate the stereo vision system of the Chang'e-3 lunar rover on the moon.
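Bundle block adjustment minimizes reprojection error over all rays and parameters at once; as a minimal stand-in, here is a Gauss-Newton refinement of a single camera translation with a numeric Jacobian (the identity rotation, focal length, and point set are simplifying assumptions, not the rover's actual model):

```python
import numpy as np

def project(t, pts3d, f=800.0):
    """Pinhole projection for a camera at translation t (identity rotation,
    principal point at the image origin) -- a simplifying assumption."""
    c = pts3d - t
    return f * c[:, :2] / c[:, 2:3]

def refine_translation(t0, pts3d, obs, iters=10, eps=1e-6):
    """Gauss-Newton on the stacked reprojection residuals, the same idea the
    bundle block adjustment applies jointly to all parameter blocks."""
    t = np.array(t0, dtype=float)
    for _ in range(iters):
        r = (project(t, pts3d) - obs).ravel()
        J = np.empty((r.size, 3))
        for k in range(3):                         # forward-difference Jacobian
            d = np.zeros(3)
            d[k] = eps
            J[:, k] = ((project(t + d, pts3d) - obs).ravel() - r) / eps
        t -= np.linalg.lstsq(J, r, rcond=None)[0]  # Gauss-Newton update
    return t

# synthetic check: recover a known camera translation from 5 observed points
pts3d = np.array([[0.0, 0.0, 3.0], [1.0, 0.0, 2.5], [0.0, 1.0, 3.5],
                  [-1.0, 0.5, 2.0], [0.5, -1.0, 4.0]])
t_true = np.array([0.1, -0.05, 0.2])
obs = project(t_true, pts3d)
t_hat = refine_translation(np.zeros(3), pts3d, obs)
```

The full adjustment stacks residuals from every ray and solves for camera, mast, and structure parameters simultaneously; only the parameter block differs.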
Time-of-flight depth image enhancement using variable integration time
NASA Astrophysics Data System (ADS)
Kim, Sun Kwon; Choi, Ouk; Kang, Byongmin; Kim, James Dokyoon; Kim, Chang-Yeong
2013-03-01
Time-of-Flight (ToF) cameras are used for a variety of applications because they deliver depth information at a high frame rate. These cameras, however, suffer from challenging problems such as noise and motion artifacts. To increase the signal-to-noise ratio (SNR), the camera should calculate a distance based on a large amount of infrared light, which needs to be integrated over a long time. On the other hand, the integration time should be short enough to suppress motion artifacts. We propose a ToF depth imaging method that combines the advantages of short and long integration times, exploiting an image fusion scheme proposed for color imaging. To calibrate depth differences due to the change of integration times, a depth transfer function is estimated by analyzing the joint histogram of depths in the two images of different integration times. The depth images are then transformed into wavelet domains and fused into a depth image with suppressed noise and low motion artifacts. To evaluate the proposed method, we captured a moving bar of a metronome with different integration times. The experiment shows the proposed method can effectively remove motion artifacts while preserving a high SNR comparable to the depth images acquired during long integration times.
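The depth transfer function between integration times can be approximated directly from the joint samples; a per-bin median lookup table as a simple stand-in for the paper's joint-histogram analysis (the linear bias between the two synthetic depth maps below is invented):

```python
import numpy as np

def depth_transfer(short_d, long_d, bins=32, rng=(0.0, 4.0)):
    """Per-bin median lookup mapping short-integration depths onto the
    long-integration scale -- a crude stand-in for the joint-histogram fit."""
    edges = np.linspace(rng[0], rng[1], bins + 1)
    idx = np.clip(np.digitize(short_d, edges) - 1, 0, bins - 1)
    lut = np.full(bins, np.nan)
    for b in range(bins):
        m = idx == b
        if m.any():
            lut[b] = np.median(long_d[m])
    return edges, lut

# synthetic pair of depth maps with a fixed bias between integration times
short_d = np.linspace(0.5, 3.5, 1001)
long_d = 1.05 * short_d + 0.02
edges, lut = depth_transfer(short_d, long_d)
```

After this calibration, the two depth images live on a common scale and can be fused (in the paper, in the wavelet domain).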
A measurement system applicable for landslide experiments in the field.
Guo, Wen-Zhao; Xu, Xiang-Zhou; Wang, Wen-Long; Yang, Ji-Shan; Liu, Ya-Kun; Xu, Fei-Long
2016-04-01
Observation of gravity erosion in the field under strong sunshine and wind poses a challenge. Here, a novel topography meter together with a movable tent addresses the challenge. With the topography meter, a 3D geometric shape of the target surface can be digitally reconstructed. Before the commencement of a test, the laser generator position and the camera sightline should be adjusted with a sight calibrator. Typically, the topography meter can measure gravity erosion on slopes with gradients of 30°-70°. Two methods can be used to obtain a relatively clear video despite the extreme steepness of the slopes. One method is to rotate the laser source away from the slope to ensure that the camera sightline remains perpendicular to the laser plane. Another is to move the camera farther away from the slope, in which case the measured volume of the slope needs to be corrected; this method reduces distortion of the image. In addition, installation of tent poles with concrete columns helps to surmount the altitude difference on steep slopes. Results observed by the topography meter in real landslide experiments are rational and reliable.
Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera
NASA Technical Reports Server (NTRS)
Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.
1988-01-01
The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 A wavelength range.
Depth measurements through controlled aberrations of projected patterns.
Birch, Gabriel C; Tyo, J Scott; Schwiegerling, Jim
2012-03-12
Three-dimensional displays have become increasingly present in consumer markets. However, the ability to capture three-dimensional images in space-confined environments and without major modifications to current cameras is uncommon. Our goal is to create a simple modification to a conventional camera that allows for three-dimensional reconstruction. We require that such an imaging system have coincident imaging and illumination paths. Furthermore, we require that any three-dimensional modification to a camera also permit full-resolution 2D image capture. Here we present a method of extracting depth information with a single camera and an aberrated projected pattern. A commercial digital camera is used in conjunction with a projector system with astigmatic focus to capture images of a scene. By using an astigmatic projected pattern we can create two different focus depths for horizontal and vertical features of a projected pattern, thereby encoding depth. By designing an aberrated projected pattern, we are able to exploit this differential focus in post-processing designed to exploit the projected pattern and optical system. We are able to correlate the distance of an object at a particular transverse position from the camera to ratios of particular wavelet coefficients. We present information regarding construction, calibration, and images produced by this system. The nature of linking a projected pattern design and image processing algorithms is discussed.
Accuracy Assessment of GO Pro Hero 3 (black) Camera in Underwater Environment
NASA Astrophysics Data System (ADS)
Helmholz, P.; Long, J.; Munsie, T.; Belton, D.
2016-06-01
Modern digital cameras are increasing in quality whilst decreasing in size. In the last decade, a number of waterproof consumer digital cameras (action cameras) have become available, which often cost less than 500. A possible application of such action cameras is in the field of underwater photogrammetry, especially since the change of medium below water can in turn counteract the distortions present. The goal of this paper is to investigate the suitability of such action cameras for underwater photogrammetric applications, focusing on the stability of the camera and the accuracy of the derived coordinates. For this paper a series of image sequences was captured in a water tank. A calibration frame was placed in the water tank, allowing the calibration of the camera and the validation of the measurements using check points. The accuracy assessment covered three test sets operating three GoPro sports cameras of the same model (Hero 3 black). The test sets included the handling of the camera in a controlled manner, where the camera was only dunked into the water tank using 7 MP and 12 MP resolution, and a rough handling, where the camera was shaken as well as being removed from the waterproof case, using 12 MP resolution. The tests showed that camera stability was given, with a maximum standard deviation of the camera constant σc of 0.0031 mm for 7 MP (for an average c of 2.720 mm) and 0.0072 mm for 12 MP (for an average c of 3.642 mm). The residual test of the check points gave for the 7 MP test series the largest rms value of only 0.450 mm and the largest maximal residual of only 2.5 mm. For the 12 MP test series the maximum rms value is 0.653 mm.