NASA Technical Reports Server (NTRS)
1982-01-01
A design concept that will implement a mapping capability for the Orbital Camera Payload System (OCPS) when ground control points are not available is discussed. Through the use of stellar imagery collected by a pair of cameras whose optical axes are structurally related to the large format camera optical axis, the required pointing information is made available.
Method used to test the imaging consistency of a binocular camera's left and right optical systems
NASA Astrophysics Data System (ADS)
Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui
2016-09-01
For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing the overall imaging consistency. Conventional optical-system testing procedures, however, lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained from a multiple-threshold segmentation result, and the boundary is determined using the slope of the contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left and right images is established, and the imaging consistency is evaluated through the standard deviation σ of the grayscale difference D(x, y) between the left and right optical systems. Experiments demonstrate that the method is suitable for imaging consistency testing of binocular cameras: when the 3σ spread of the gray-level difference D(x, y) between the left and right optical systems does not exceed 5%, the design requirements are considered to have been met. The method is effective and paves the way for imaging consistency testing of binocular cameras.
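The acceptance rule quoted above is easy to state concretely. Below is a minimal sketch (not the authors' code; the array names, 8-bit full scale and synthetic data are assumptions) of computing the gray-level difference D(x, y) between registered left and right images of a uniform target and testing the 3σ/5% criterion:

```python
# Sketch of the consistency metric, assuming registered left/right images
# of a uniform integrating-sphere target (synthetic data, 8-bit full scale).
import numpy as np

def imaging_consistency(left, right, full_scale=255.0):
    """Return D(x, y), its standard deviation sigma, and the 3-sigma test."""
    d = left.astype(np.float64) - right.astype(np.float64)   # D(x, y)
    sigma = d.std()
    passed = 3.0 * sigma <= 0.05 * full_scale  # 3-sigma within 5% of full scale
    return d, sigma, passed

rng = np.random.default_rng(0)
left = rng.normal(128, 2, (480, 640))        # left-channel image
right = left + rng.normal(0, 1, left.shape)  # right channel, small mismatch
_, sigma, ok = imaging_consistency(left, right)
print(f"sigma = {sigma:.2f} DN, passes 3-sigma/5% criterion: {ok}")
```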
HIGH SPEED KERR CELL FRAMING CAMERA
Goss, W.C.; Gilley, L.F.
1964-01-01
The present invention relates to a high speed camera utilizing a Kerr cell shutter and a novel optical delay system having no moving parts. The camera can selectively photograph at least 6 frames within 9 × 10⁻⁸ seconds during any such time interval of an occurring event. The invention utilizes particularly an optical system which views and transmits 6 images of an event to a multi-channeled optical delay relay system. The delay relay system has optical paths of successively increased length in whole multiples of the first channel optical path length, into which optical paths the 6 images are transmitted. The successively delayed images are accepted from the exit of the delay relay system by an optical image focusing means, which in turn directs the images into a Kerr cell shutter disposed to intercept the image paths. A camera is disposed to simultaneously view and record the 6 images during a single exposure of the Kerr cell shutter. (AEC)
RESTORATION OF ATMOSPHERICALLY DEGRADED IMAGES. VOLUME 3.
AERIAL CAMERAS, LASERS, ILLUMINATION, TRACKING CAMERAS, DIFFRACTION, PHOTOGRAPHIC GRAIN, DENSITY, DENSITOMETERS, MATHEMATICAL ANALYSIS, OPTICAL SCANNING, SYSTEMS ENGINEERING, TURBULENCE, OPTICAL PROPERTIES, SATELLITE TRACKING SYSTEMS.
Optical registration of spaceborne low light remote sensing camera
NASA Astrophysics Data System (ADS)
Li, Chong-yang; Hao, Yan-hui; Xu, Peng-mei; Wang, Dong-jie; Ma, Li-na; Zhao, Ying-long
2018-02-01
To meet the high-precision requirements of optical registration for a spaceborne low-light remote sensing camera, dual-channel optical registration of the CCD and EMCCD is achieved with a high-magnification optical registration system. A system-integration optical registration and registration-accuracy scheme for a spaceborne low-light remote sensing camera with short focal depth and wide field of view is proposed in this paper, including an analysis of the parallel misalignment of the CCD and of the registration accuracy. Actual registration results show that the imaging is clear and that the MTF and registration accuracy meet requirements, providing an important guarantee for obtaining high-quality image data in orbit.
Exploring the imaging properties of thin lenses for cryogenic infrared cameras
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Verdet, Sebastien; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Grulois, Tatiana; Matallah, Noura
2016-05-01
Designing a cryogenic camera is a good strategy for miniaturizing and simplifying an infrared camera that uses a cooled detector. Indeed, integrating the optics inside the cold shield makes it simple to athermalize the design, guarantees a cold pupil, and relaxes the constraint of a long back focal length for short-focal-length systems. In this way, cameras made of a single lens or two lenses are viable systems with good optical features and good stability in image correction. However, this involves a relatively significant additional optical mass inside the dewar and thus increases the cool-down time of the camera. ONERA is currently exploring a minimalist strategy consisting in giving an imaging function to the thin optical plates that are found in conventional dewars. In this way, we could make a cryogenic camera that has the same cool-down time as a traditional dewar without an imaging function. Two examples will be presented: the first is a camera using a dual-band infrared detector, made of a lens outside the dewar and a lens inside the cold shield, the latter carrying the main optical power of the system. We were able to design a cold plano-convex lens with a thickness of less than 1 mm. The second example is an evolution of a former cryogenic camera called SOIE. We replaced the cold meniscus with a plano-convex Fresnel lens, decreasing the optical thermal mass by 66%. The performances of both cameras will be compared.
Multi-color pyrometry imaging system and method of operating the same
Estevadeordal, Jordi; Nirmalan, Nirm Velumylum; Tralshawala, Nilesh; Bailey, Jeremy Clyde
2017-03-21
A multi-color pyrometry imaging system for a high-temperature asset includes at least one viewing port in optical communication with at least one high-temperature component of the high-temperature asset. The system also includes at least one camera device in optical communication with the at least one viewing port. The at least one camera device includes a camera enclosure and at least one camera aperture defined in the camera enclosure. The at least one camera aperture is in optical communication with the at least one viewing port. The at least one camera device also includes a multi-color filtering mechanism coupled to the enclosure. The multi-color filtering mechanism is configured to sequentially transmit photons within a first predetermined wavelength band and transmit photons within a second predetermined wavelength band that is different than the first predetermined wavelength band.
Optical fiducial timing system for X-ray streak cameras with aluminum coated optical fiber ends
Nilson, David G.; Campbell, E. Michael; MacGowan, Brian J.; Medecki, Hector
1988-01-01
An optical fiducial timing system is provided for use with interdependent groups of X-ray streak cameras (18). The aluminum coated (80) ends of optical fibers (78) are positioned with the photocathodes (20, 60, 70) of the X-ray streak cameras (18). The other ends of the optical fibers (78) are placed together in a bundled array (90). A fiducial optical signal (96), comprised of 2.omega. or 1.omega. laser light, after introduction to the bundled array (90), travels to the aluminum coated (82) optical fiber ends and ejects quantities of electrons (84) that are recorded on the data recording media (52) of the X-ray streak cameras (18). Since both 2.omega. and 1.omega. laser light can travel long distances in optical fiber with only slight attenuation, the initial areal power density of the fiducial optical signal (96) is well below the damage threshold of the fused silica or other material that comprises the optical fibers (78, 90). Thus the fiducial timing system can be used repeatedly over long periods of time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riot, V J; Olivier, S; Bauman, B
2012-05-24
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.
Spickermann, Gunnar; Friederich, Fabian; Roskos, Hartmut G; Bolívar, Peter Haring
2009-11-01
We present a 64x48 pixel 2D electro-optical terahertz (THz) imaging system using a photonic mixing device time-of-flight camera as an optical demodulating detector array. The combination of electro-optic detection with a time-of-flight camera increases sensitivity drastically, enabling the use of a nonamplified laser source for high-resolution real-time THz electro-optic imaging.
Optical performance analysis of plenoptic camera systems
NASA Astrophysics Data System (ADS)
Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas
2014-09-01
Adding an array of microlenses in front of the sensor transforms a conventional camera into one that captures both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g. artificial re-focusing of photographs. Without the need for active illumination, it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in a decreased depth resolution. Moreover, the gain in angular information implies a degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of the possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated in order to verify the predicted optical characteristics.
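The refocusing application mentioned above can be illustrated with the classic shift-and-add algorithm over sub-aperture views. The sketch below is a generic illustration under stated assumptions (a synthetic 5 × 5-view light field, integer-pixel shifts), not the authors' simulation tool:

```python
# Shift-and-add refocusing over sub-aperture views -- a minimal sketch of
# the refocusing application named above (synthetic light field, integer
# shifts; parameter names are illustrative, not from the paper).
import numpy as np

def refocus(subviews, alpha):
    """subviews[u, v] is one sub-aperture image; alpha selects the
    synthetic focal plane by scaling the per-view shift."""
    nu, nv, h, w = subviews.shape
    cu, cv = (nu - 1) / 2.0, (nv - 1) / 2.0
    out = np.zeros((h, w))
    for u in range(nu):
        for v in range(nv):
            shift = (int(round(alpha * (u - cu))), int(round(alpha * (v - cv))))
            out += np.roll(subviews[u, v], shift, axis=(0, 1))
    return out / (nu * nv)

lf = np.random.rand(5, 5, 64, 64)   # 5x5 angular samples of a 64x64 scene
refocused = refocus(lf, alpha=1.0)  # one synthetic focal plane
```

The 5 × 5 angular sampling makes the trade-off concrete: those 25 views cost a factor of 25 in spatial sampling on the same sensor.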
NASA Astrophysics Data System (ADS)
Zelazny, Amy; Benson, Robert; Deegan, John; Walsh, Ken; Schmidt, W. David; Howe, Russell
2013-06-01
We describe the benefits to camera system SWaP-C associated with the use of aspheric molded glasses and optical polymers in the design and manufacture of optical components and elements. Both camera objectives and display eyepieces, typical for night vision man-portable EO/IR systems, are explored. We discuss optical trade-offs, system performance, and cost reductions associated with this approach in both visible and non-visible wavebands, specifically NIR and LWIR. Example optical models are presented, studied, and traded using this approach.
Combustion pinhole-camera system
Witte, A.B.
1982-05-19
A pinhole camera system is described that utilizes a sealed optical-purge assembly to provide optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.
Combustion pinhole camera system
Witte, Arvel B.
1984-02-21
A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.
Light field analysis and its applications in adaptive optics and surveillance systems
NASA Astrophysics Data System (ADS)
Eslami, Mohammed Ali
An image can only be as good as the optics of a camera or any other imaging system allows it to be. An imaging system is a transformation that takes a 3D world coordinate to a 2D image plane, through either a linear or a non-linear transfer function. Depending on the application at hand, some models of imaging systems are easier to use than others. The most well-known models for optical systems are the 1) pinhole model, 2) thin-lens model and 3) thick-lens model. Using light-field analysis, the connection between these different models is described, and a novel figure of merit is presented for choosing one optical model over another for certain applications. After analyzing these optical systems, their applications in plenoptic cameras for adaptive optics are introduced. A new technique to use a plenoptic camera to extract information about a localized distorted planar wavefront is described. CODEV simulations conducted in this thesis show that its performance is comparable to that of a Shack-Hartmann sensor and that it can potentially increase the dynamic range of angles that can be extracted, assuming a paraxial imaging system. As a final application, a novel dual-PTZ surveillance system to track a target through space is presented. 22× optical zoom lenses on high-resolution pan/tilt platforms recalibrate a master-slave relationship based on encoder readouts rather than complicated image processing algorithms for real-time target tracking. As the target moves out of a region of interest in the master camera, the camera is moved to force the target back into the region of interest. Once the master camera is moved, a precalibrated lookup table is interpolated to compute the relationship between the master and slave cameras. The homography that relates the pixels of the master camera to the pan/tilt settings of the slave camera then continues to follow the planar trajectories of targets as they move through space with high accuracy.
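Of the three models named above, the pinhole model is the simplest to state; a minimal sketch follows, with invented intrinsic parameters:

```python
# Hedged sketch of the pinhole model referenced above: a 3D world point is
# mapped to the 2D image plane by a perspective projection (values invented).
import numpy as np

K = np.array([[800.0, 0.0, 320.0],   # fx, skew, cx (hypothetical intrinsics)
              [0.0, 800.0, 240.0],   # fy, cy
              [0.0, 0.0, 1.0]])

def project_pinhole(point_cam: np.ndarray) -> np.ndarray:
    """Project a 3D point in camera coordinates to pixel coordinates."""
    uvw = K @ point_cam
    return uvw[:2] / uvw[2]          # perspective divide

print(project_pinhole(np.array([0.1, -0.05, 2.0])))  # -> pixel (u, v)
```

The thin- and thick-lens models add a finite aperture and principal planes on top of this same projective core, which is what makes the light-field connection between them tractable.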
NASA Astrophysics Data System (ADS)
Moore, Lori
Plenoptic cameras and Shack-Hartmann wavefront sensors are lenslet-based optical systems that do not form a conventional image. The addition of a lens array into these systems allows for the aberrations generated by the combination of the object and the optical components located prior to the lens array to be measured or corrected with post-processing. This dissertation provides a ray selection method to determine the rays that pass through each lenslet in a lenslet-based system. This first-order, ray trace method is developed for any lenslet-based system with a well-defined fore optic, where in this dissertation the fore optic is all of the optical components located prior to the lens array. For example, in a plenoptic camera the fore optic is a standard camera lens. Because a lens array at any location after the exit pupil of the fore optic is considered in this analysis, it is applicable to both plenoptic cameras and Shack-Hartmann wavefront sensors. Only a generic, unaberrated fore optic is considered, but this dissertation establishes a framework for considering the effect of an aberrated fore optic in lenslet-based systems. The rays from the fore optic that pass through a lenslet placed at any location after the fore optic are determined. This collection of rays is reduced to three rays that describe the entire lenslet ray set. The lenslet ray set is determined at the object, image, and pupil planes of the fore optic. The consideration of the apertures that define the lenslet ray set for an on-axis lenslet leads to three classes of lenslet-based systems. Vignetting of the lenslet rays is considered for off-axis lenslets. Finally, the lenslet ray set is normalized into terms similar to the field and aperture vector used to describe the aberrated wavefront of the fore optic. The analysis in this dissertation is complementary to other first-order models that have been developed for a specific plenoptic camera layout or Shack-Hartmann wavefront sensor application. This general analysis determines the location where the rays of each lenslet pass through the fore optic establishing a framework to consider the effect of an aberrated fore optic in a future analysis.
Optical Transient Monitor (OTM) for BOOTES Project
NASA Astrophysics Data System (ADS)
Páta, P.; Bernas, M.; Castro-Tirado, A. J.; Hudec, R.
2003-04-01
The Optical Transient Monitor (OTM) is software for the control of the three wide- and ultra-wide-field cameras of the BOOTES (Burst Observer and Optical Transient Exploring System) station. The OTM is PC based and is a powerful tool for taking images from two SBIG CCD cameras at the same time or from one camera only. The control program for the BOOTES cameras runs under Windows 98 or MS-DOS; a version for Windows 2000 is now in preparation. There are five main supported modes of operation. The OTM program can control the cameras and evaluate image data without human interaction.
Modeling of digital information optical encryption system with spatially incoherent illumination
NASA Astrophysics Data System (ADS)
Bondareva, Alyona P.; Cheremkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.
2015-10-01
State-of-the-art micromirror DMD spatial light modulators (SLMs) offer unprecedented frame rates of up to 30000 frames per second. This, in conjunction with a high-speed digital camera, should make it possible to build a high-speed optical encryption system. Results of modeling a digital-information optical encryption system with spatially incoherent illumination are presented. The input information is displayed on the first SLM and the encryption element on the second SLM. Factors taken into account include the resolution of the SLMs and camera, hologram reconstruction noise, camera noise and signal sampling. Results of numerical simulation demonstrate high speed (several gigabytes per second), low bit error rate and high cryptographic strength.
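A common mathematical model of spatially incoherent optical encryption (assumed here for illustration; the paper's exact optical scheme is not reproduced) treats the cipher image as the convolution of the input intensity image with the point-spread function of the encryption element, which makes decryption with a known key a deconvolution:

```python
# Assumed model, not the authors' scheme: cipher = image (*) key PSF.
# Decryption is regularized inverse filtering with the known key.
import numpy as np

def encrypt(image: np.ndarray, key_psf: np.ndarray) -> np.ndarray:
    """Circular convolution of the image with the key PSF via FFT."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(key_psf)))

def decrypt(cipher: np.ndarray, key_psf: np.ndarray, eps=1e-6) -> np.ndarray:
    """Wiener-like inverse filtering with the known key."""
    K = np.fft.fft2(key_psf)
    return np.real(np.fft.ifft2(np.fft.fft2(cipher) * np.conj(K) /
                                (np.abs(K) ** 2 + eps)))

rng = np.random.default_rng(1)
img = np.zeros((128, 128)); img[40:90, 40:90] = 1.0
key = rng.random((128, 128))                  # random-diffuser-style key PSF
restored = decrypt(encrypt(img, key), key)
print(np.allclose(restored, img, atol=1e-3))  # near-exact with the known key
```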
Coaxial fundus camera for ophthalmology
NASA Astrophysics Data System (ADS)
de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.
2015-09-01
A fundus camera for ophthalmology is a high-definition device which must provide low-light illumination of the human retina, high resolution at the retina and reflection-free images. These constraints make its optical design very sophisticated, but the most difficult to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and poor alignment renders the sophisticated optical design useless. In this work we developed a fully coaxial optical system for a non-mydriatic fundus camera. The illumination is performed by a LED ring, coaxial with the optical system and composed of IR and visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator and CCD lens are coaxial, making the final alignment easy to perform. The CCD-plus-capture-lens module is a CCTV camera with built-in autofocus and zoom, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.
Optical synthesizer for a large quadrant-array CCD camera: Center director's discretionary fund
NASA Technical Reports Server (NTRS)
Hagyard, Mona J.
1992-01-01
The objective of this program was to design and develop an optical device, an optical synthesizer, that focuses four contiguous quadrants of a solar image on four spatially separated CCD arrays that are part of a unique CCD camera system. This camera and the optical synthesizer will be part of the new NASA-Marshall Experimental Vector Magnetograph, an instrument developed to measure the Sun's magnetic field as accurately as present technology allows. The tasks undertaken in the program are outlined and the final detailed optical design is presented.
NASA Technical Reports Server (NTRS)
Kassak, John E.
1991-01-01
The objective of the operational television (OTV) technology was to develop a multiple camera system (up to 256 cameras) for NASA Kennedy installations where camera video, synchronization, control, and status data are transmitted bidirectionally via a single fiber cable at distances in excess of five miles. It is shown that the benefits (such as improved video performance, immunity from electromagnetic and radio-frequency interference, elimination of repeater stations, and greater system configuration flexibility) can be realized by applying the proven fiber-optic transmission concept. The control system will marry the lens, pan and tilt, and camera control functions into a modular Local Area Network (LAN) control network. Such a system does not exist commercially at present, since the television broadcast industry's current practice is to divorce the positional controls from the camera control system. The application software developed for this system will have direct applicability to similar systems in industry using LAN-based control systems.
NASA Astrophysics Data System (ADS)
Yu, Liping; Pan, Bing
2016-12-01
A low-cost, easy-to-implement but practical single-camera stereo-digital image correlation (DIC) system using a four-mirror adapter is established for accurate shape and three-dimensional (3D) deformation measurements. The mirror-assisted pseudo-stereo imaging system converts a single camera into two virtual cameras, which view a specimen from different angles and record the surface images of the test object onto the two halves of the camera sensor. To enable deformation measurement in non-laboratory conditions or extremely high temperature environments, an active imaging optical design, combining an actively illuminated monochromatic source with a coupled band-pass optical filter, is compactly integrated into the pseudo-stereo DIC system. The optical design, basic principles and implementation procedures of the established system for 3D profile and deformation measurements are described in detail. The effectiveness and accuracy of the established system are verified by measuring the profile of a regular cylinder surface and the displacements of a translated planar plate. As an application example, the established system is used to determine the tensile strains and Poisson's ratio of a composite solid propellant specimen during a stress relaxation test. Since the established single-camera stereo-DIC system only needs a single camera and presents strong robustness against variations in ambient light or the thermal radiation of a hot object, it demonstrates great potential in determining transient deformation in non-laboratory or high-temperature environments with the aid of a single high-speed camera.
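The correlation step that gives DIC its name is commonly formulated as maximizing the zero-normalized cross-correlation (ZNCC) between subsets; the integer-pixel search below is a generic sketch of that formulation, not the authors' code:

```python
# Generic DIC subset matching via ZNCC (a common formulation, assumed here;
# real systems add sub-pixel interpolation and shape functions).
import numpy as np

def zncc(f: np.ndarray, g: np.ndarray) -> float:
    f = f - f.mean(); g = g - g.mean()
    return float((f * g).sum() / np.sqrt((f * f).sum() * (g * g).sum()))

def match_subset(ref, search, top_left, size=21, radius=10):
    """Integer-pixel search for the subset displacement maximizing ZNCC."""
    y0, x0 = top_left
    f = ref[y0:y0 + size, x0:x0 + size]
    best, best_uv = -2.0, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            g = search[y0 + dy:y0 + dy + size, x0 + dx:x0 + dx + size]
            c = zncc(f, g)
            if c > best:
                best, best_uv = c, (dy, dx)
    return best_uv, best

rng = np.random.default_rng(2)
speckle = rng.random((200, 200))                  # synthetic speckle pattern
shifted = np.roll(speckle, (3, -2), axis=(0, 1))  # known displacement
print(match_subset(speckle, shifted, (80, 80)))   # -> ((3, -2), ~1.0)
```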
Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA
NASA Astrophysics Data System (ADS)
Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki
2017-11-01
SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid- and far-infrared astronomy with a cryogenically cooled 3-m class telescope, envisioned for launch in the early 2020s. The Mid-infrared Camera and Spectrometer (MCS) is a focal plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38 μm. MCS consists of two relay optical modules and the following four scientific optical modules: WFC (Wide Field Camera; 5' × 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2'.5 long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000) and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here, we present the optical design and expected optical performance of MCS. Most parts of the MCS optics adopt an off-axis reflective system to cover the wide wavelength range of 5-38 μm without chromatic aberration and to minimize problems due to changes in the shapes and refractive indices of materials from room temperature to cryogenic temperature. In order to achieve the demanding specifications of wide field of view, small F-number and large spectral resolving power in a compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]), a design method using free-form surfaces for compact reflective optics such as head-mounted displays. As a result, we have successfully designed compact reflective optics for MCS whose as-built performance achieves diffraction-limited image resolution.
Research on a solid-state streak camera based on an electro-optic crystal
NASA Astrophysics Data System (ADS)
Wang, Chen; Liu, Baiyu; Bai, Yonglin; Bai, Xiaohong; Tian, Jinshou; Yang, Wenzheng; Xian, Ouyang
2006-06-01
With excellent temporal resolution ranging from nanoseconds to sub-picoseconds, streak cameras are widely used for measuring ultrafast light phenomena, such as detecting synchrotron radiation, examining inertial confinement fusion targets, and measuring laser-induced discharge. In combination with appropriate optics or a spectroscope, the streak camera delivers intensity versus position (or wavelength) information on the ultrafast process. Current streak cameras are based on a sweep electric pulse and an image-converting tube with a wavelength-sensitive photocathode covering the x-ray to near-infrared region; this kind of streak camera is comparatively costly and complex. This paper describes the design and performance of a new style of streak camera based on an electro-optic crystal with a large electro-optic coefficient. The crystal streak camera achieves time resolution by direct photon-beam deflection using the electro-optic effect, and can replace the current streak camera from the visible to the near-infrared region. After computer-aided simulation, we designed a crystal streak camera with a potential time resolution between 1 ns and 10 ns. Further improvements in the sweep electric circuits and a crystal with a larger electro-optic coefficient, for example LN (γ33 = 33.6×10⁻¹² m/V), together with an optimized optical system, may lead to a time resolution of better than 1 ns.
A novel optical system design of light field camera
NASA Astrophysics Data System (ADS)
Wang, Ye; Li, Wenhua; Hao, Chenyang
2016-01-01
The structure main lens - microlens array (MLA) - imaging sensor is usually adopted in the optical system of a light field camera, and the MLA is the most important part of the optical system, collecting and recording the amplitude and phase information of the light field. In this paper, a novel optical system structure is proposed. The novel optical system is based on a 4f optical structure, and a micro-aperture array (MAA) is used instead of the MLA to realize the information acquisition of the 4D light field. We analyze the principle by which the novel optical system can acquire the light field information. At the same time, a simple MAA, a line-grating optical system, is designed with the ZEMAX software in this paper. The novel optical system is simulated by the line-grating optical system, multiple images are obtained in the image plane, and the imaging quality of the novel optical system is analyzed.
Plenoptic Imager for Automated Surface Navigation
NASA Technical Reports Server (NTRS)
Zollar, Byron; Milder, Andrew; Mayo, Michael
2010-01-01
An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprised of a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout, and a post-processor executing image-processing algorithms to provide ranging information.
640x480 PtSi Stirling-cooled camera system
NASA Astrophysics Data System (ADS)
Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; Coyle, Peter J.; Feder, Howard L.; Gilmartin, Harvey R.; Levine, Peter A.; Sauer, Donald J.; Shallcross, Frank V.; Demers, P. L.; Smalser, P. J.; Tower, John R.
1992-09-01
A Stirling-cooled 3-5 micron camera system has been developed. The camera employs a monolithic 640 × 480 PtSi-MOS focal plane array. The camera system achieves an NEDT of 0.10 K at a 30 Hz frame rate with f/1.5 optics (300 K background). At a spatial frequency of 0.02 cycles/mrad the vertical and horizontal minimum resolvable temperatures are in the range of MRT = 0.03 K (f/1.5 optics, 300 K background). The MOS focal plane array achieves a resolution of 480 TV lines per picture height, independent of background level and position within the frame.
NASA Technical Reports Server (NTRS)
Nabors, Sammy
2015-01-01
NASA offers companies an optical system that provides a unique panoramic perspective with a single camera. NASA's Marshall Space Flight Center has developed a technology that combines a panoramic refracting optic (PRO) lens with a unique detection system to acquire a true 360-degree field of view. Although current imaging systems can acquire panoramic images, they must use up to five cameras to obtain the full field of view. MSFC's technology obtains its panoramic images from one vantage point.
Scalar wave-optical reconstruction of plenoptic camera images.
Junker, André; Stenau, Tim; Brenner, Karl-Heinz
2014-09-01
We investigate the reconstruction of plenoptic camera images in a scalar wave-optical framework. Previous publications relating to this topic numerically simulate light propagation on the basis of ray tracing. However, due to the continuing miniaturization of hardware components, it can be assumed that in combination with low-aperture optical systems this technique may no longer be generally valid. Therefore, we study the differences between ray- and wave-optical object reconstructions of true plenoptic camera images. For this purpose we present a wave-optical reconstruction algorithm which can be run on a regular computer. Our findings show that a wave-optical treatment is capable of increasing the detail resolution of reconstructed objects.
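A standard building block for such scalar wave-optical simulations is angular-spectrum free-space propagation; the sketch below shows this generic step (sampling and wavelength values are invented, and this is not the authors' reconstruction algorithm):

```python
# Angular-spectrum propagation of a sampled complex field -- a generic
# scalar wave-optics step, assumed here for illustration.
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a square complex field a distance z (SI units throughout)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)   # evanescent components suppressed
    return np.fft.ifft2(np.fft.fft2(field) * H)

aperture = np.zeros((512, 512), complex)
aperture[240:272, 240:272] = 1.0                      # square aperture
out = angular_spectrum(aperture, 633e-9, 5e-6, 0.01)  # 1 cm at 633 nm
```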
Combined hostile fire and optics detection
NASA Astrophysics Data System (ADS)
Brännlund, Carl; Tidström, Jonas; Henriksson, Markus; Sjöqvist, Lars
2013-10-01
Snipers and other optically guided weapon systems are serious threats in military operations. We have studied a SWIR (Short Wave Infrared) camera-based system with the capability to detect and locate snipers both before and after a shot over a large field-of-view. The high frame rate SWIR camera allows resolution of the temporal profile of muzzle flashes, which are the infrared signature associated with the ejection of the bullet from the rifle. The capability to detect and discriminate sniper muzzle flashes with this system has been verified by FOI in earlier studies. In this work we have extended the system by adding a laser channel for optics detection. A laser diode with a slit-shaped beam profile is scanned over the camera field-of-view to detect retro-reflections from optical sights. The optics detection system has been tested at various distances up to 1.15 km, showing the feasibility of detecting rifle scopes in full daylight. The high speed camera gives the possibility to discriminate false alarms by analyzing the temporal data. The intensity variation caused by atmospheric turbulence enables discrimination of small sights from larger reflectors due to aperture averaging, even though the targets only cover a single pixel. It is shown that optics detection can be integrated in combination with muzzle flash detection by adding a scanning rectangular laser slit. The overall optics detection capability via continuous surveillance of a relatively large field-of-view looks promising. This type of multifunctional system may become an important tool to detect snipers before and after a shot.
The sequence measurement system of the IR camera
NASA Astrophysics Data System (ADS)
Geng, Ai-hui; Han, Hong-xia; Zhang, Hai-bo
2011-08-01
Currently, IR cameras are broadly used in optoelectronic tracking, optoelectronic measurement, fire control and optoelectronic countermeasure fields, but the output timing sequences of most IR cameras applied in practice are complex, and the timing documents supplied by the manufacturers are not detailed. To meet the requirement of continuous image transmission and image processing systems for detailed IR camera timing, a sequence measurement system for IR cameras was designed, and a detailed sequence measurement procedure for the applied IR camera was carried out. FPGA programming combined with SignalTap online observation is applied in the sequence measurement system; the precise sequence of the IR camera's output signal is obtained, and detailed documentation of the IR camera is supplied to the downstream continuous image transmission and image processing systems. The sequence measurement system consists of a CameraLink input interface, an LVDS input interface, an FPGA, a CameraLink output interface and so on, of which the FPGA is the key component. Both CameraLink-style and LVDS-style video signals can be accepted by the system, and because image processing and image memory cards usually use CameraLink as their input interface, the output of the sequence measurement system is also designed as a CameraLink interface. The system thus performs the IR camera's sequence measurement while also serving as an interface converter for some cameras. Inside the FPGA, the sequence measurement program, pixel clock modification, SignalTap file configuration and SignalTap online observation are integrated to realize precise measurement of the IR camera. The sequence measurement program, written in Verilog and combined with online observation using the SignalTap tool, counts the number of lines in one frame and the number of pixels in one line, and also determines the line offset and row offset of the image. For the complex output timing of IR cameras, the sequence measurement system accurately measures the sequence of the camera applied in the project, supplies detailed sequence documentation to downstream systems such as the image processing and image transmission systems, and gives the concrete parameters fval, lval, pixclk, line offset and row offset. Experiments show that the sequence measurement system obtains precise sequence measurements and works stably, laying a foundation for the downstream systems.
Mach-Zehnder based optical marker/comb generator for streak camera calibration
Miller, Edward Kirk
2015-03-03
This disclosure is directed to a method and apparatus for generating marker and comb indicia in an optical environment using a Mach-Zehnder (M-Z) modulator. High speed recording devices are configured to record image or other data defining a high speed event. To calibrate and establish time reference, the markers or combs are indicia which serve as timing pulses (markers) or a constant-frequency train of optical pulses (comb) to be imaged on a streak camera for accurate time based calibration and time reference. The system includes a camera, an optic signal generator which provides an optic signal to an M-Z modulator and biasing and modulation signal generators configured to provide input to the M-Z modulator. An optical reference signal is provided to the M-Z modulator. The M-Z modulator modulates the reference signal to a higher frequency optical signal which is output through a fiber coupled link to the streak camera.
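The comb generation can be understood from the textbook intensity transfer function of an M-Z modulator, sketched below with assumed parameter values; biasing at the null and driving sinusoidally yields an optical pulse train at twice the RF drive frequency:

```python
# Idealized M-Z intensity transfer (a textbook model, assumed here):
# T(V) = 0.5 * (1 + cos(pi*V/V_pi + phi_bias)).
import numpy as np

V_PI = 3.0                                    # half-wave voltage (assumed)

def mz_transmission(v, phi_bias=np.pi):       # biased at the null point
    return 0.5 * (1.0 + np.cos(np.pi * v / V_PI + phi_bias))

t = np.linspace(0, 1e-7, 2000)                # 100 ns window
drive = V_PI * np.sin(2 * np.pi * 100e6 * t)  # 100 MHz RF drive
comb = mz_transmission(drive)                 # ~20 pulses: 200 MHz pulse train
```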
Chang, Victoria C; Tang, Shou-Jiang; Swain, C Paul; Bergs, Richard; Paramo, Juan; Hogg, Deborah C; Fernandez, Raul; Cadeddu, Jeffrey A; Scott, Daniel J
2013-08-01
The influence of endoscopic video camera (VC) image quality on surgical performance has not been studied. Flexible endoscopes are used as substitutes for laparoscopes in natural orifice translumenal endoscopic surgery (NOTES), but their optics are originally designed for intralumenal use. Manipulable wired or wireless independent VCs might offer advantages for NOTES but are still under development. To measure the optical characteristics of 4 VC systems and to compare their impact on the performance of surgical suturing tasks. VC systems included a laparoscope (Storz 10 mm), a flexible endoscope (Olympus GIF 160), and 2 prototype deployable cameras (magnetic anchoring and guidance system [MAGS] Camera and PillCam). In a randomized fashion, the 4 systems were evaluated regarding standardized optical characteristics and surgical manipulations of previously validated ex vivo (fundamentals of laparoscopic surgery model) and in vivo (live porcine Nissen model) tasks; objective metrics (time and errors/precision) and combined surgeon (n = 2) performance were recorded. Subtle differences were detected for color tests, and field of view was variable (65°-115°). Suitable resolution was detected up to 10 cm for the laparoscope and MAGS camera but only at closer distances for the endoscope and PillCam. Compared with the laparoscope, surgical suturing performances were modestly lower for the MAGS camera and significantly lower for the endoscope (ex vivo) and PillCam (ex vivo and in vivo). This study documented distinct differences in VC systems that may be used for NOTES in terms of both optical characteristics and surgical performance. Additional work is warranted to optimize cameras for NOTES. Deployable systems may be especially well suited for this purpose.
Motionless active depth from defocus system using smart optics for camera autofocus applications
NASA Astrophysics Data System (ADS)
Amin, M. Junaid; Riza, Nabeel A.
2016-04-01
This paper describes a motionless active Depth from Defocus (DFD) system design suited for long working range camera autofocus applications. The design consists of an active illumination module that projects a scene illuminating coherent conditioned optical radiation pattern which maintains its sharpness over multiple axial distances allowing an increased DFD working distance range. The imager module of the system responsible for the actual DFD operation deploys an electronically controlled variable focus lens (ECVFL) as a smart optic to enable a motionless imager design capable of effective DFD operation. An experimental demonstration is conducted in the laboratory which compares the effectiveness of the coherent conditioned radiation module versus a conventional incoherent active light source, and demonstrates the applicability of the presented motionless DFD imager design. The fast response and no-moving-parts features of the DFD imager design are especially suited for camera scenarios where mechanical motion of lenses to achieve autofocus action is challenging, for example, in the tiny camera housings in smartphones and tablets. Applications for the proposed system include autofocus in modern day digital cameras.
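The depth information exploited by DFD can be illustrated with the standard thin-lens blur-circle relation (a textbook model with invented lens parameters, not the paper's full system model):

```python
# Thin-lens blur-circle relation commonly used in depth-from-defocus
# (assumed for illustration): the blur diameter encodes object depth.
def blur_diameter(u, f, sensor_dist, aperture):
    """Blur circle (m) for an object at distance u, focal length f,
    sensor at sensor_dist behind the lens, given aperture diameter."""
    v = u * f / (u - f)                  # in-focus image distance (1/f = 1/u + 1/v)
    return aperture * abs(v - sensor_dist) / v

f, A = 0.05, 0.05 / 2.8                  # hypothetical 50 mm lens at f/2.8
s = 2.0 * f / (2.0 - f)                  # sensor position focused at 2 m
for u in (1.0, 2.0, 4.0, 8.0):
    print(f"object at {u} m -> blur {blur_diameter(u, f, s, A) * 1e6:.1f} um")
```

Inverting this relation from measured blur is what a DFD system does; the active illumination pattern described above keeps the "texture" needed to estimate that blur sharp over a long axial range.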
Applying UV cameras for SO2 detection to distant or optically thick volcanic plumes
Kern, Christoph; Werner, Cynthia; Elias, Tamar; Sutton, A. Jeff; Lübcke, Peter
2013-01-01
Ultraviolet (UV) camera systems represent an exciting new technology for measuring two dimensional sulfur dioxide (SO2) distributions in volcanic plumes. The high frame rate of the cameras allows the retrieval of SO2 emission rates at time scales of 1 Hz or higher, thus allowing the investigation of high-frequency signals and making integrated and comparative studies with other high-data-rate volcano monitoring techniques possible. One drawback of the technique, however, is the limited spectral information recorded by the imaging systems. Here, a framework for simulating the sensitivity of UV cameras to various SO2 distributions is introduced. Both the wavelength-dependent transmittance of the optical imaging system and the radiative transfer in the atmosphere are modeled. The framework is then applied to study the behavior of different optical setups and used to simulate the response of these instruments to volcanic plumes containing varying SO2 and aerosol abundances located at various distances from the sensor. Results show that UV radiative transfer in and around distant and/or optically thick plumes typically leads to a lower sensitivity to SO2 than expected when assuming a standard Beer–Lambert absorption model. Furthermore, camera response is often non-linear in SO2 and dependent on distance to the plume and plume aerosol optical thickness and single scatter albedo. The model results are compared with camera measurements made at Kilauea Volcano (Hawaii) and a method for integrating moderate resolution differential optical absorption spectroscopy data with UV imagery to retrieve improved SO2 column densities is discussed.
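The Beer–Lambert baseline against which the radiative-transfer effects are compared can be written compactly; the sketch below uses a rough on-band absorption cross section and hypothetical pixel intensities, both assumptions:

```python
# Pure Beer-Lambert SO2 retrieval -- the simple model the paper shows breaks
# down for distant/optically thick plumes (constants are rough assumptions).
import numpy as np

SIGMA_SO2 = 3.0e-19   # cm^2/molecule, assumed on-band cross section

def apparent_absorbance(i_plume, i_background):
    """tau = -ln(I/I0) from on-band plume and plume-free background pixels."""
    return -np.log(i_plume / i_background)

def column_density(tau):
    """SO2 column density (molecules/cm^2) under pure Beer-Lambert."""
    return tau / SIGMA_SO2

i0, i = 1000.0, 870.0                    # hypothetical pixel intensities
tau = apparent_absorbance(i, i0)
print(f"tau = {tau:.3f}, column ~ {column_density(tau):.2e} molec/cm^2")
```

The paper's central point is that scattering in and around the plume makes the measured tau fall below this prediction, non-linearly in the true column, which is why the radiative-transfer modeling is needed.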
Optical analysis of a compound quasi-microscope for planetary landers
NASA Technical Reports Server (NTRS)
Wall, S. D.; Burcher, E. E.; Huck, F. O.
1974-01-01
A quasi-microscope concept, consisting of a facsimile camera augmented with an auxiliary lens as a magnifier, was introduced and analyzed. The performance achievable with this concept is primarily limited by a trade-off between resolution and object field; this approach leads to a limiting resolution of 20 microns when used with the Viking lander camera (which has an angular resolution of 0.04 deg). An optical system is analyzed which includes a field lens between the camera and the auxiliary lens to overcome this limitation. It is found that this system, referred to as a compound quasi-microscope, can provide improved resolution (to about 2 microns) and a larger object field. However, this improvement comes at the expense of increased complexity, special camera design requirements, and tighter tolerances on the distances between optical components.
NASA Technical Reports Server (NTRS)
Hertel, R. J.
1979-01-01
An electro-optical method to measure the aeroelastic deformations of wind tunnel models is examined. The multitarget tracking performance of one of the two electronic cameras comprising the stereo pair is modeled and measured. The properties of the targets at the model, the camera optics, target illumination, number of targets, acquisition time, target velocities, and tracker performance are considered. The electronic camera system is shown to be capable of locating, measuring, and following the positions of 5 to 50 targets attached to the model at measuring rates up to 5000 targets per second.
Intraocular camera for retinal prostheses: Refractive and diffractive lens systems
NASA Astrophysics Data System (ADS)
Hauer, Michelle Christine
The focus of this thesis is on the design and analysis of refractive, diffractive, and hybrid refractive/diffractive lens systems for a miniaturized camera that can be surgically implanted in the crystalline lens sac and is designed to work in conjunction with current and future generation retinal prostheses. The development of such an intraocular camera (IOC) would eliminate the need for an external head-mounted or eyeglass-mounted camera. Placing the camera inside the eye would allow subjects to use their natural eye movements for foveation (attention) instead of more cumbersome head tracking, would notably aid in personal navigation and mobility, and would also be significantly more psychologically appealing from the standpoint of personal appearances. The capability for accommodation with no moving parts or feedback control is incorporated by employing camera designs that exhibit nearly infinite depth of field. Such an ultracompact optical imaging system requires a unique combination of refractive and diffractive optical elements and relaxed system constraints derived from human psychophysics. This configuration necessitates an extremely compact, short focal-length lens system with an f-number close to unity. Initially, these constraints appear highly aggressive from an optical design perspective. However, after careful analysis of the unique imaging requirements of a camera intended to work in conjunction with the relatively low pixellation levels of a retinal microstimulator array, it becomes clear that such a design is not only feasible, but could possibly be implemented with a single lens system.
Concave Surround Optics for Rapid Multi-View Imaging
2006-11-01
…flexibility, large camera arrays are typically expensive and require significant effort to calibrate temporally, geometrically and chromatically … hard to assemble and calibrate. In this paper we present an optical system capable of rapidly moving the viewpoint around a scene. Our system … thus is amenable to capturing dynamic events, avoiding the need to construct and calibrate an array of cameras. We demonstrate the system with a high…
Exact optics - III. Schwarzschild's spectrograph camera revised
NASA Astrophysics Data System (ADS)
Willstrop, R. V.
2004-03-01
Karl Schwarzschild identified a system of two mirrors, each defined by conic sections, free of third-order spherical aberration, coma and astigmatism, and with a flat focal surface. He considered it impractical, because the field was too restricted. This system was rediscovered as a quadratic approximation to one of Lynden-Bell's `exact optics' designs which have wider fields. Thus the `exact optics' version has a moderate but useful field, with excellent definition, suitable for a spectrograph camera. The mirrors are strongly aspheric in both the Schwarzschild design and the exact optics version.
Programmable 10 MHz optical fiducial system for hydrodiagnostic cameras
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huen, T.
1987-07-01
A solid state light control system was designed and fabricated for use with hydrodiagnostic streak cameras of the electro-optic type. With its use, the film containing the streak images will have on it two time scales simultaneously exposed with the signal. This allows timing and cross timing, the latter being achieved with exposure-modulation marking of the time tick marks. The purpose of using two time scales will be discussed. The design is based on a microcomputer, resulting in a compact and easy-to-use instrument. The light source is a small red light-emitting diode. Time marking can be programmed in steps of 0.1 microseconds, with a range of 255 steps. The time accuracy is based on a precision 100 MHz quartz crystal, giving a divided-down 10 MHz system frequency. The light is guided by two small 100-micron-diameter optical fibers, which facilitates light coupling onto the input slit of an electro-optic streak camera. Three distinct groups of exposure modulation of the time tick marks can be independently set anywhere in the streak duration. This system has been successfully used in Fabry-Perot laser velocimeters for over four years in our laboratory. The microcomputer control section is also being used to provide optical fiducials to mechanical rotor cameras.
Measuring the spatial resolution of an optical system in an undergraduate optics laboratory
NASA Astrophysics Data System (ADS)
Leung, Calvin; Donnelly, T. D.
2017-06-01
Two methods of quantifying the spatial resolution of a camera are described, performed, and compared, with the objective of designing an imaging-system experiment for students in an undergraduate optics laboratory. With the goal of characterizing the resolution of a typical digital single-lens reflex (DSLR) camera, we motivate, introduce, and show agreement between traditional test-target contrast measurements and the technique of using Fourier analysis to obtain the modulation transfer function (MTF). The advantages and drawbacks of each method are compared. Finally, we explore the rich optical physics at work in the camera system by calculating the MTF as a function of wavelength and f-number. For example, we find that the Canon 40D demonstrates better spatial resolution at short wavelengths, in accordance with scalar diffraction theory, but is not diffraction-limited, being significantly affected by spherical aberration. The experiment and data analysis routines described here can be built and written in an undergraduate optics lab setting.
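One concrete realization of the Fourier approach (a sketch under stated assumptions, not necessarily the authors' routine) is to differentiate a measured edge profile into a line spread function and take its normalized Fourier magnitude as the MTF:

```python
# Edge-based MTF estimate: edge profile -> line spread function -> |FFT|.
# The synthetic logistic edge stands in for a real scan across a sharp edge.
import numpy as np

def mtf_from_edge(edge_profile: np.ndarray) -> np.ndarray:
    """edge_profile: 1D intensity scan across a sharp edge."""
    lsf = np.gradient(edge_profile.astype(np.float64))  # line spread function
    lsf /= lsf.sum()                                    # unit area
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                                 # MTF(0) = 1

x = np.linspace(-5, 5, 256)
edge = 1.0 / (1.0 + np.exp(-x / 0.7))   # synthetic blurred edge
curve = mtf_from_edge(edge)
print(curve[:5])                         # falls off with spatial frequency
```

A broader blur widens the line spread function and pulls the curve down at high spatial frequencies, which is exactly the wavelength and f-number dependence the experiment measures.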
Development of biostereometric experiments. [stereometric camera system
NASA Technical Reports Server (NTRS)
Herron, R. E.
1978-01-01
The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.
Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng
2017-06-20
The linear array push broom imaging mode is widely used for high resolution optical satellites (HROS). Using double-cameras attached by a high-rigidity support along with push broom imaging is one method to enlarge the field of view while ensuring high resolution. High accuracy image mosaicking is the key factor of the geometrical quality of complete stitched satellite imagery. This paper proposes a high accuracy image mosaicking approach based on the big virtual camera (BVC) in the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; then, each single image strip obtained by each TDI-CCD detector can be re-projected to the virtual detector of the big virtual camera coordinate system using forward-projection and backward-projection to obtain the corresponding single virtual image. After an on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper subtly uses the concept of the big virtual camera to obtain a stitched image and the corresponding high accuracy rational function model (RFM) for concurrent post processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining the geometric accuracy.
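The geometry of the virtual-detector step is heavily simplified in the sketch below: it only shows how single-detector strips with known column offsets are placed onto one virtual focal plane and averaged in their overlap, omitting the rigorous imaging model, calibration and RFM generation described in the paper:

```python
# Conceptual sketch of resampling strips onto a virtual focal plane
# (offsets are given directly; the real method derives them from rigorous
# orbit/attitude models and forward/backward projection).
import numpy as np

def stitch_to_virtual(strips, offsets, width):
    """strips: 2D arrays from individual TDI-CCD detectors;
    offsets: per-strip column offset on the virtual detector plane."""
    height = strips[0].shape[0]
    virtual = np.zeros((height, width))
    weight = np.zeros((height, width))
    for img, x0 in zip(strips, offsets):
        virtual[:, x0:x0 + img.shape[1]] += img
        weight[:, x0:x0 + img.shape[1]] += 1.0
    return virtual / np.maximum(weight, 1.0)  # average overlap regions

a = np.random.rand(100, 60)
b = np.hstack([a[:, 40:], np.random.rand(100, 40)])  # 20-column overlap
mosaic = stitch_to_virtual([a, b], offsets=[0, 40], width=100)
```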
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei
2016-01-01
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454
Liquid lens: advances in adaptive optics
NASA Astrophysics Data System (ADS)
Casey, Shawn Patrick
2010-12-01
'Liquid lens' technologies promise significant advancements in machine vision and optical communications systems. Adaptations for machine vision, human vision correction, and optical communications are used to exemplify the versatile nature of this technology. Utilization of liquid lens elements allows the cost effective implementation of optical velocity measurement. The project consists of a custom image processor, camera, and interface. The images are passed into customized pattern recognition and optical character recognition algorithms. A single camera would be used for both speed detection and object recognition.
Wide field/planetary camera optics study. [for the large space telescope
NASA Technical Reports Server (NTRS)
1979-01-01
Design feasibility of the baseline optical design concept was established for the wide field/planetary camera (WF/PC) and will be used with the space telescope (ST) to obtain high angular resolution astronomical information over a wide field. The design concept employs internal optics to relay the ST image to a CCD detector system. Optical design performance predictions, sensitivity and tolerance analyses, manufacturability of the optical components, and acceptance testing of the two mirror Cassegrain relays are discussed.
NASA Astrophysics Data System (ADS)
de Villiers, Jason; Jermy, Robert; Nicolls, Fred
2014-06-01
This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb-line method, allows an arbitrary number of radial and tangential distortion coefficients to be determined and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low-order models, despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted-to-undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion), allowing deterministic rates far exceeding real time. The focal length is determined to minimise the error in absolute photogrammetric positional measurement for both multi-camera and monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/LWIR camera array, and a simple laboratory optical helmet tracker.
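For reference, a sketch of the forward Brown (Brown-Conrady) distortion model named above, restricted to the two common tangential terms (the paper typically fits three); names and coefficient counts are illustrative:

```python
import numpy as np

def brown_distort(x, y, k, p):
    """Apply Brown's distortion model to normalized image coordinates.

    x, y : ideal (undistorted) coordinates relative to the principal point.
    k    : radial coefficients (k1, k2, ..., e.g. 5 terms as in the paper).
    p    : tangential coefficients; only p1, p2 are used in this sketch.
    """
    r2 = x * x + y * y
    radial = 1.0 + sum(ki * r2 ** (i + 1) for i, ki in enumerate(k))
    x_d = x * radial + 2 * p[0] * x * y + p[1] * (r2 + 2 * x * x)
    y_d = y * radial + p[0] * (r2 + 2 * y * y) + 2 * p[1] * x * y
    return x_d, y_d
```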
FOREX-A Fiber Optics Diagnostic System For Study Of Materials At High Temperatures And Pressures
NASA Astrophysics Data System (ADS)
Smith, D. E.; Roeske, F.
1983-03-01
We have successfully fielded a Fiber Optics Radiation EXperiment system (FOREX) designed for measuring material properties at high temperatures and pressures on an underground nuclear test. The system collects light from radiating materials and transmits it through several hundred meters of optical fibers to a recording station consisting of a streak camera with film readout. The use of fiber optics provides a faster time response than can presently be obtained with equalized coaxial cables over comparable distances. Fibers also have significant cost and physical size advantages over coax cables. The streak camera achieves a much higher information density than an equivalent oscilloscope system, and it also serves as the light detector. The result is a wide bandwidth high capacity system that can be fielded at a relatively low cost in manpower, space, and materials. For this experiment, the streak camera had a 120 ns time window with a 1.2 ns time resolution. Dynamic range for the system was about 1000. Beam current statistical limitations were approximately 8% for a 0.3 ns wide data point at one decade above the threshold recording intensity.
Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid
2016-06-13
Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
NASA Astrophysics Data System (ADS)
Bechis, K.; Pitruzzello, A.
2014-09-01
This presentation describes our ongoing research into using a ground-based light field camera to obtain passive, single-aperture 3D imagery of LEO objects. Light field cameras are an emerging and rapidly evolving technology for passive 3D imaging with a single optical sensor. The cameras use an array of lenslets placed in front of the camera focal plane, which provides angle of arrival information for light rays originating from across the target, allowing range to target and 3D image to be obtained from a single image using monocular optics. The technology, which has been commercially available for less than four years, has the potential to replace dual-sensor systems such as stereo cameras, dual radar-optical systems, and optical-LIDAR fused systems, thus reducing size, weight, cost, and complexity. We have developed a prototype system for passive ranging and 3D imaging using a commercial light field camera and custom light field image processing algorithms. Our light field camera system has been demonstrated for ground-target surveillance and threat detection applications, and this paper presents results of our research thus far into applying this technology to the 3D imaging of LEO objects. The prototype 3D imaging camera system developed by Northrop Grumman uses a Raytrix R5 C2GigE light field camera connected to a Windows computer with an nVidia graphics processing unit (GPU). The system has a frame rate of 30 Hz, and a software control interface allows for automated camera triggering and light field image acquisition to disk. Custom image processing software then performs the following steps: (1) image refocusing, (2) change detection, (3) range finding, and (4) 3D reconstruction. In Step (1), a series of 2D images are generated from each light field image; the 2D images can be refocused at up to 100 different depths. Currently, steps (1) through (3) are automated, while step (4) requires some user interaction. A key requirement for light field camera operation is that the target must be within the near-field (Fraunhofer distance) of the collecting optics. For example, in visible light the near-field of a 1-m telescope extends out to about 3,500 km, while the near-field of the AEOS telescope extends out over 46,000 km. For our initial proof of concept, we have integrated our light field camera with a 14-inch Meade LX600 advanced coma-free telescope, to image various surrogate ground targets at up to tens of kilometers range. Our experiments with the 14-inch telescope have assessed factors and requirements that are traceable and scalable to a larger-aperture system that would have the near-field distance needed to obtain 3D images of LEO objects. The next step would be to integrate a light field camera with a 1-m or larger telescope and evaluate its 3D imaging capability against LEO objects. 3D imaging of LEO space objects with light field camera technology can potentially provide a valuable new tool for space situational awareness, especially for those situations where laser or radar illumination of the target objects is not feasible.
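The near-field requirement quoted above can be checked with the standard Fraunhofer-distance formula d = 2D²/λ; the short check below assumes a 550 nm visible wavelength and an AEOS aperture of about 3.6 m, and reproduces the figures in the text:

```python
def fraunhofer_distance(aperture_m, wavelength_m=550e-9):
    """Near-field limit d = 2*D^2/lambda for an aperture of diameter D;
    targets closer than d are within the collecting optics' near field."""
    return 2.0 * aperture_m ** 2 / wavelength_m

# 1-m telescope: ~3.6e3 km, consistent with the ~3,500 km quoted above.
print(fraunhofer_distance(1.0) / 1e3, "km")
# ~3.6-m AEOS telescope: ~4.7e4 km, i.e. "over 46,000 km".
print(fraunhofer_distance(3.6) / 1e3, "km")
```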
MTF measurements on real time for performance analysis of electro-optical systems
NASA Astrophysics Data System (ADS)
Stuchi, Jose Augusto; Signoreto Barbarini, Elisa; Vieira, Flavio Pascoal; dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fatima Maria Mitsue; Castro Neto, Jarbas C.; Linhari Rodrigues, Evandro Luis
2012-06-01
The need for methods and tools that assist in determining the performance of optical systems is currently increasing. One of the most widely used methods of analyzing optical systems is to measure the Modulation Transfer Function (MTF). The MTF represents a direct and quantitative verification of image quality. This paper presents the implementation of software for calculating the MTF of electro-optical systems. The software was used for calculating the MTF of a Digital Fundus Camera, a Thermal Imager and an Ophthalmologic Surgery Microscope. The MTF information aids the analysis of alignment and the measurement of optical quality, and also defines the limiting resolution of optical systems. The results obtained with the Fundus Camera and Thermal Imager were compared with theoretical values. For the Microscope, the results were compared with the measured MTF of a Zeiss microscope, which is the quality standard for ophthalmological microscopes.
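One common way such software estimates MTF is the edge-based method: differentiate a measured edge spread function into a line spread function and take the magnitude of its Fourier transform. A minimal sketch of that pipeline, not necessarily the authors' implementation:

```python
import numpy as np

def mtf_from_edge(esf, dx=1.0):
    """Estimate MTF from a measured edge spread function (ESF).

    esf : 1-D array, oversampled intensity profile across a dark/bright edge.
    dx  : sample spacing. Returns (spatial frequencies, normalized MTF).
    """
    lsf = np.gradient(esf, dx)       # line spread function
    lsf = lsf * np.hanning(lsf.size) # window to suppress noise at the tails
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                    # normalize to unity at zero frequency
    freqs = np.fft.rfftfreq(lsf.size, dx)
    return freqs, mtf
```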
NASA Astrophysics Data System (ADS)
Daly, Michael J.; Muhanna, Nidal; Chan, Harley; Wilson, Brian C.; Irish, Jonathan C.; Jaffray, David A.
2014-02-01
A freehand, non-contact diffuse optical tomography (DOT) system has been developed for multimodal imaging with intraoperative cone-beam CT (CBCT) during minimally-invasive cancer surgery. The DOT system is configured for near-infrared fluorescence imaging with indocyanine green (ICG) using a collimated 780 nm laser diode and a near-infrared CCD camera (PCO Pixelfly USB). Depending on the intended surgical application, the camera is coupled to either a rigid 10 mm diameter endoscope (Karl Storz) or a 25 mm focal length lens (Edmund Optics). A prototype flat-panel CBCT C-Arm (Siemens Healthcare) acquires low-dose 3D images with sub-mm spatial resolution. A 3D mesh is extracted from CBCT for finite-element DOT implementation in NIRFAST (Dartmouth College), with the capability for soft/hard imaging priors (e.g., segmented lymph nodes). A stereoscopic optical camera (NDI Polaris) provides real-time 6D localization of reflective spheres mounted to the laser and camera. Camera calibration combined with tracking data is used to estimate intrinsic (focal length, principal point, non-linear distortion) and extrinsic (translation, rotation) lens parameters. Source/detector boundary data is computed from the tracked laser/camera positions using radiometry models. Target registration errors (TRE) between real and projected boundary points are ~1-2 mm for typical acquisition geometries. Pre-clinical studies using tissue phantoms are presented to characterize 3D imaging performance. This translational research system is under investigation for clinical applications in head-and-neck surgery including oral cavity tumour resection, lymph node mapping, and free-flap perforator assessment.
NASA Astrophysics Data System (ADS)
Kadosh, Itai; Sarusi, Gabby
2017-10-01
The use of dual cameras in parallax to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept where the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night vision capability while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. In order to maintain commonality of the two cameras, we propose to attach to one of the cameras a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and the additional upconversion layer, whose thickness is <1 μm. Such a layer should be attached in close proximity to the mobile device's visible-range camera sensor (the CMOS sensor). This paper presents, as a proof of concept, the design and optimization of such a SWIR objective that is mechanically form- and fit-compatible with the visible objective design but uses different lenses, in order to maintain commonality and adhere to the visible optical and mechanical design. Such a SWIR objective design is very challenging since it requires mimicking the original visible mobile camera lenses' sizes and mechanical housing. We present in depth a feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore-optics design.
Optical gas imaging (OGI) cameras have the unique ability to exploit the electromagnetic properties of fugitive chemical vapors to make invisible gases visible. This ability is extremely useful for industrial facilities trying to mitigate product losses from escaping gas and fac...
Virtual-stereo fringe reflection technique for specular free-form surface testing
NASA Astrophysics Data System (ADS)
Ma, Suodong; Li, Bo
2016-11-01
Due to their excellent ability to improve the performance of optical systems, free-form optics have attracted extensive interest in many fields, e.g. the optical design of astronomical telescopes, laser beam expanders, spectral imagers, etc. However, compared with traditional simple surfaces, testing such optics is usually more complex and difficult, which has long been a major barrier to their manufacture and application. Fortunately, owing to the rapid development of electronic devices and computer vision technology, the fringe reflection technique (FRT), with the advantages of simple system structure, high measurement accuracy and large dynamic range, is becoming a powerful tool for specular free-form surface testing. In order to obtain absolute surface shape distributions of test objects, two or more cameras are often required in the conventional FRT, which makes the system structure more complex and the measurement cost much higher. Furthermore, high-precision synchronization between the cameras is also a troublesome issue. To overcome these drawbacks, a virtual-stereo FRT for specular free-form surface testing is put forward in this paper. It is able to obtain absolute profiles with the help of a single biprism and one camera, while avoiding the problems of stereo FRT based on binocular or multi-ocular cameras. Preliminary experimental results demonstrate the feasibility of the proposed technique.
Adaptive Optics For Imaging Bright Objects Next To Dim Ones
NASA Technical Reports Server (NTRS)
Shao, Michael; Yu, Jeffrey W.; Malbet, Fabien
1996-01-01
Adaptive optics used in imaging optical systems, according to proposal, to enhance high-dynamic-range images (images of bright objects next to dim objects). Designed to alter wavefronts to correct for effects of scattering of light from small bumps on imaging optics. Original intended application of concept in advanced camera installed on Hubble Space Telescope for imaging of such phenomena as large planets near stars other than Sun. Also applicable to other high-quality telescopes and cameras.
CCD imaging system for the EUV solar telescope
NASA Astrophysics Data System (ADS)
Gong, Yan; Song, Qian; Ye, Bing-Xun
2006-01-01
In order to develop a detector suited to the space solar telescope, we have built a CCD camera system capable of working in the extreme ultraviolet (EUV) band, which is composed of a phosphor screen, an intensified system using a photocathode/micro-channel plate (MCP)/phosphor, an optical taper and a front-illuminated (FI) CCD chip without a screen window. All components were bonded together with optical glue. The working principle of the camera system is presented; moreover, we have employed a mesh experiment to calibrate and test the CCD camera system in the 15-24 nm band. A position resolution of about 19 μm is obtained at the wavelengths of 17.1 nm and 19.5 nm.
Vision System Measures Motions of Robot and External Objects
NASA Technical Reports Server (NTRS)
Talukder, Ashit; Matthies, Larry
2008-01-01
A prototype of an advanced robotic vision system both (1) measures its own motion with respect to a stationary background and (2) detects other moving objects and estimates their motions, all by use of visual cues. Like some prior robotic and other optoelectronic vision systems, this system is based partly on concepts of optical flow and visual odometry. Whereas prior optoelectronic visual-odometry systems have been limited to frame rates of no more than 1 Hz, a visual-odometry subsystem that is part of this system operates at a frame rate of 60 to 200 Hz, given optical-flow estimates. The overall system operates at an effective frame rate of 12 Hz. Moreover, unlike prior machine-vision systems for detecting motions of external objects, this system need not remain stationary: it can detect such motions while it is moving (even vibrating). The system includes a stereoscopic pair of cameras mounted on a moving robot. The outputs of the cameras are digitized, then processed to extract positions and velocities. The initial image-data-processing functions of this system are the same as those of some prior systems: Stereoscopy is used to compute three-dimensional (3D) positions for all pixels in the camera images. For each pixel of each image, optical flow between successive image frames is used to compute the two-dimensional (2D) apparent relative translational motion of the point transverse to the line of sight of the camera. The challenge in designing this system was to provide for utilization of the 3D information from stereoscopy in conjunction with the 2D information from optical flow to distinguish between motion of the camera pair and motions of external objects, compute the motion of the camera pair in all six degrees of translational and rotational freedom, and robustly estimate the motions of external objects, all in real time. To meet this challenge, the system is designed to perform the following image-data-processing functions: The visual-odometry subsystem (the subsystem that estimates the motion of the camera pair relative to the stationary background) utilizes the 3D information from stereoscopy and the 2D information from optical flow. It computes the relationship between the 3D and 2D motions and uses a least-mean-squares technique to estimate motion parameters. The least-mean-squares technique is suitable for real-time implementation when the number of external-moving-object pixels is smaller than the number of stationary-background pixels.
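The least-mean-squares motion estimate described above can be illustrated with the classic SVD-based rigid-motion fit between two sets of stereo-derived 3D points; this sketch omits the 2D optical-flow constraints the actual subsystem combines with the 3D data, and the function name is illustrative:

```python
import numpy as np

def fit_rigid_motion(P, Q):
    """Least-squares rotation R and translation t such that Q ~ R @ P + t.

    P, Q : (3, N) arrays of 3-D points (from stereoscopy) in two successive
    frames. Classic Kabsch/Procrustes solution via SVD.
    """
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (Q - cq) @ (P - cp).T                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
    R = U @ D @ Vt
    t = cq - R @ cp
    return R, t
```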
Optical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S; Seppala, L; Gilmore, K
2008-07-16
The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses withmore » clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band, filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.« less
Micro-optical system based 3D imaging for full HD depth image capturing
NASA Astrophysics Data System (ADS)
Park, Yong-Hwa; Cho, Yong-Chul; You, Jang-Woo; Park, Chang-Young; Yoon, Heesun; Lee, Sang-Hun; Kwon, Jong-Oh; Lee, Seung-Wan
2012-03-01
A 20 MHz-switching high-speed image shutter device for 3D image capturing and its application to a system prototype are presented. For 3D image capturing, the system utilizes the Time-of-Flight (TOF) principle by means of a 20 MHz high-speed micro-optical image modulator, a so-called 'optical shutter'. The high-speed image modulation is obtained using the electro-optic operation of a multi-layer stacked structure having diffractive mirrors and an optical resonance cavity which maximizes the magnitude of optical modulation. The optical shutter device is specially designed and fabricated with low resistance-capacitance cell structures having a small RC time constant. The optical shutter is positioned in front of a standard high-resolution CMOS image sensor and modulates the IR image reflected from the object to capture a depth image. The novel optical shutter device enables capture of a full-HD depth image with depth accuracy of mm scale, the largest depth-image resolution among state-of-the-art systems, which have been limited to VGA. The 3D camera prototype realizes a color/depth concurrent-sensing optical architecture to capture 14 Mp color and full-HD depth images simultaneously. The resulting high-definition color/depth images and their capturing device have a crucial impact on the 3D business ecosystem in the IT industry, especially as a 3D image sensing means in the fields of 3D cameras, gesture recognition, user interfaces, and 3D displays. This paper presents the MEMS-based optical shutter design, fabrication, characterization, 3D camera system prototype and image test results.
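The TOF principle underlying the shutter can be summarized by the standard phase-to-depth relation; the helper below is an illustration of that relation, not the prototype's processing chain:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_depth(phase_rad, f_mod=20e6):
    """Depth from the measured phase shift of a modulated illumination
    signal: d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * np.pi * f_mod)

# Unambiguous range at 20 MHz modulation is c / (2 * f_mod) ~ 7.5 m, so
# mm-scale depth accuracy corresponds to resolving phase to roughly
# 2*pi * 0.001 / 7.5 ~ 1 mrad.
print(tof_depth(np.pi / 2))  # quarter-cycle phase shift -> ~1.87 m
```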
Effect of camera angulation on adaptation of CAD/CAM restorations.
Parsell, D E; Anderson, B C; Livingston, H M; Rudd, J I; Tankersley, J D
2000-01-01
A significant concern with computer-assisted design/computer-assisted manufacturing (CAD/CAM)-produced prostheses is the accuracy of adaptation of the restoration to the preparation. The objective of this study is to determine the effect of operator-controlled camera misalignment on restoration adaptation. A CEREC 2 CAD/CAM unit (Sirona Dental Systems, Bensheim, Germany) was used to capture the optical impressions and machine the restorations. A Class I preparation was used as the standard preparation for optical impressions. Camera angles along the mesiodistal and buccolingual alignment were varied from the ideal orientation. Occlusal marginal gaps and sample height, width, and length were measured and compared to preparation dimensions. For clinical correlation, clinicians were asked to take optical impressions of mesio-occlusal preparations (Class II) on all four second molar sites, using a patient simulator. On the adjacent first molar occlusal surfaces, a preparation was machined such that camera angulation could be calculated from information taken from the optical impression. Degree of tilt and plane of tilt were compared to the optimum camera positions for those preparations. One-way analysis of variance and Dunnett C post hoc testing (alpha = 0.01) revealed little significant degradation in fit with camera angulation. Only the apical length fit was significantly degraded by excessive angulation. The CEREC 2 CAD/CAM system was found to be relatively insensitive to operator-induced errors attributable to camera misalignments of less than 5 degrees in either the buccolingual or the mesiodistal plane. The average camera tilt error generated by clinicians for all sites was 1.98 +/- 1.17 degrees.
Robotic Vehicle Communications Interoperability
1988-08-01
[Table residue: vehicle control functions (engine starter/cold start, fire suppression, fording control, fuel control, fuel tank selector, garage toggle, gear selector, hazard warning) and electro-optic sensor options (sensor switch, video, radar, IR thermal imaging system, image intensifier, laser ranger, video camera selector: forward/stereo/rear, sensor control).]
NASA Astrophysics Data System (ADS)
Motta, Danilo A.; Serillo, André; de Matos, Luciana; Yasuoka, Fatima M. M.; Bagnato, Vanderlei S.; Carvalho, Luis A. V.
2014-03-01
Glaucoma is the second leading cause of blindness in the world, and this number tends to increase as the life expectancy of the population rises. Glaucoma refers to eye conditions that damage the optic nerve. This nerve carries visual information from the eye to the brain, so damage to it compromises the patient's visual quality. In the majority of cases the damage to the optic nerve is irreversible and results from increased intraocular pressure. One of the main diagnostic challenges is detecting the disease early, because no symptoms are present in the initial stage; by the time it is detected, it is already at an advanced stage. Currently the evaluation of the optic disc is made with sophisticated fundus cameras, which are inaccessible to the majority of the Brazilian population. The purpose of this project is to develop a specific fundus camera, without fluorescein angiography or a red-free system, to acquire 3D images of the optic disc region. The innovation is a new, simplified design of a stereo-optical system that enables 3D image capture and, at the same time, quantitative measurements of the excavation and topography of the optic nerve; something traditional fundus cameras do not do. Dedicated hardware and software are being developed for this ophthalmic instrument, in order to permit quick capture and printing of high-resolution 3D images and videos of the optic disc region (20° field of view) in mydriatic and non-mydriatic modes.
Wavefront Sensing With Switched Lenses for Defocus Diversity
NASA Technical Reports Server (NTRS)
Dean, Bruce H.
2007-01-01
In an alternative hardware design for an apparatus used in image-based wavefront sensing, defocus diversity is introduced by means of fixed lenses that are mounted in a filter wheel (see figure) so that they can be alternately switched into a position in front of the focal plane of an electronic camera recording the image formed by the optical system under test. [The terms image-based, wavefront sensing, and defocus diversity are defined in the first of the three immediately preceding articles, Broadband Phase Retrieval for Image-Based Wavefront Sensing (GSC-14899-1).] Each lens in the filter wheel is designed so that the optical effect of placing it at the assigned position is equivalent to the optical effect of translating the camera a specified defocus distance along the optical axis. Heretofore, defocus diversity has been obtained by translating the imaging camera along the optical axis to various defocus positions. Because data must be taken at multiple, accurately measured defocus positions, it is necessary to mount the camera on a precise translation stage that must be calibrated for each defocus position and/or to use an optical encoder for measurement and feedback control of the defocus positions. Additional latency is introduced into the wavefront sensing process as the camera is translated to the various defocus positions. Moreover, if the optical system under test has a large focal length, the required defocus values are large, making it necessary to use a correspondingly bulky translation stage. By eliminating the need for translation of the camera, the alternative design simplifies and accelerates the wavefront-sensing process. This design is cost-effective in that the filterwheel/lens mechanism can be built from commercial catalog components. After initial calibration of the defocus value of each lens, a selected defocus value is introduced by simply rotating the filter wheel to place the corresponding lens in front of the camera. The rotation of the wheel can be automated by use of a motor drive, and further calibration is not necessary. Because a camera-translation stage is no longer needed, the size of the overall apparatus can be correspondingly reduced.
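The equivalence between a switched lens and a camera translation can be quantified with the standard defocus relation W_pv = Δz / (8 N² λ) for working f-number N; the numeric values below are illustrative assumptions, not parameters of the apparatus:

```python
def defocus_waves(dz_m, fnum, wavelength_m=633e-9):
    """Peak-to-valley defocus wavefront error, in waves, equivalent to
    translating the camera a distance dz along the optical axis:
        W_pv = dz / (8 * N**2 * lambda)   for working f-number N.
    """
    return dz_m / (8.0 * fnum ** 2 * wavelength_m)

# e.g. a 10 mm camera translation in an f/20 beam ~ 4.9 waves of defocus;
# each filter-wheel lens is calibrated to reproduce one such defocus value.
# Note also why long-focal-length systems need bulky stages: for fixed
# waves of defocus, the required dz grows with the square of the f-number.
print(defocus_waves(10e-3, 20))
```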
Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography
Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.
1972-01-01
Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. The ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but it also can provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a 135- to 180-line resolution, which is approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
NASA Astrophysics Data System (ADS)
Scaduto, L. C. N.; Carvalho, E. G.; Modugno, R. G.; Cartolano, R.; Evangelista, S. H.; Segoria, D.; Santos, A. G.; Stefani, M. A.; Castro Neto, J. C.
2017-11-01
The purpose of this paper is to present the optical system developed for the Wide Field Imaging Camera (WFI) that will be integrated into the CBERS 3 and 4 satellites (China-Brazil Earth Resources Satellite). This camera will be used for remote sensing of the Earth and is designed to operate at an altitude of 778 km. The optical system is designed for four spectral bands covering the range of wavelengths from blue to near infrared, and its field of view is +/-28.63°, which covers 866 km on the ground, with a ground resolution of 64 m at nadir. WFI has been developed by a consortium formed by Opto Electrônica S. A. and Equatorial Sistemas. In particular, we will present the optical analysis based on the Modulation Transfer Function (MTF) obtained during the Engineering Model (EM) phase and the optical tests performed to verify the requirements. Measurements of the optical system MTF have been performed using an interferometer at the wavelength of 632.8 nm, and global MTF tests (including the CCD and the signal-processing electronics) have been performed using a collimator with a slit target. The results obtained showed that the performance of the optical system meets the requirements of the project.
NASA Astrophysics Data System (ADS)
Zhang, Hua; Zeng, Luan
2017-11-01
Binocular stereoscopic vision can be used for close-range observation of space targets from space-based platforms. To address the problem that a traditional binocular vision system cannot work normally after being disturbed, an online self-referencing calibration method for a binocular stereo measuring camera is proposed. The method uses an auxiliary optical imaging device to insert the image of a standard reference object into the edge of the main optical path, imaging it on the same focal plane as the target; this is equivalent to placing a standard reference inside the binocular imaging optical system. When the position of the system or the parameters of the imaging device are disturbed, the image of the standard reference changes accordingly in the imaging plane while the physical position of the standard reference object does not change, so the camera's external parameters can be re-calibrated from the visual relationship of the standard reference object. The experimental results show that the maximum mean square error for the same object can be reduced from 72.88 mm to 1.65 mm when the right camera is deflected by 0.4° and the left camera is rotated by 0.2° in elevation. This method realizes online calibration of a binocular stereoscopic vision measurement system and can effectively improve the system's robustness to disturbance.
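Re-calibrating the external parameters from the imaged standard reference amounts to a pose (PnP) estimation from known 3D-2D correspondences; the sketch below uses OpenCV's generic solver as a stand-in for the paper's own visual-relationship method, and all names are illustrative:

```python
import cv2
import numpy as np

def recalibrate_extrinsics(ref_obj_pts, ref_img_pts, K, dist):
    """Re-estimate a camera's pose from the imaged standard reference.

    ref_obj_pts : (N, 3) known 3-D coordinates of the reference markers,
                  fixed in the system frame (N >= 4).
    ref_img_pts : (N, 2) their detected positions in the disturbed image.
    K, dist     : intrinsic matrix and distortion coefficients (assumed stable).
    Returns the camera rotation matrix and translation vector.
    """
    ok, rvec, tvec = cv2.solvePnP(
        ref_obj_pts.astype(np.float64),
        ref_img_pts.astype(np.float64),
        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> rotation matrix
    return R, tvec
```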
Development of a 3-D visible limiter imaging system for the HSX stellarator
NASA Astrophysics Data System (ADS)
Buelo, C.; Stephey, L.; Anderson, F. S. B.; Eisert, D.; Anderson, D. T.
2017-12-01
A visible camera diagnostic has been developed to study the Helically Symmetric eXperiment (HSX) limiter plasma interaction. A straight-line view from the camera location to the limiter was not possible due to the complex 3D stellarator geometry of HSX, so it was necessary to insert a mirror/lens system into the plasma edge. A custom support structure for this optical system, tailored to the HSX geometry, was designed and installed. This system holds the optics tube assembly at the required angle for the desired view, to both minimize system stress and facilitate robust and repeatable camera positioning. The camera system has been absolutely calibrated and, using Hα and C-III filters, can provide hydrogen and carbon photon fluxes, which can be converted into particle fluxes through an S/XB coefficient. The resulting measurements have been used to obtain the characteristic penetration length of hydrogen and C-III species. The hydrogen λiz value shows reasonable agreement with the value predicted by a 1D penetration length calculation.
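The photon-to-particle-flux conversion mentioned above follows the standard S/XB relation Γ = 4π I (S/XB); a one-line helper for clarity:

```python
import numpy as np

def particle_flux(photon_radiance, s_xb):
    """Convert a calibrated line-emission photon flux into a particle influx:

        Gamma = 4 * pi * I * (S/XB)

    photon_radiance : photons / (s * m^2 * sr) from the absolutely
                      calibrated H-alpha or C-III image.
    s_xb            : dimensionless ionizations-per-photon coefficient
                      for the line at the local Te and ne.
    Returns particles / (s * m^2).
    """
    return 4.0 * np.pi * photon_radiance * s_xb
```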
Arain, Nabeel A; Cadeddu, Jeffrey A; Best, Sara L; Roshek, Thomas; Chang, Victoria; Hogg, Deborah C; Bergs, Richard; Fernandez, Raul; Webb, Erin M; Scott, Daniel J
2012-04-01
This study aimed to evaluate the surgeon performance and workload of a next-generation magnetically anchored camera compared with laparoscopic and flexible endoscopic imaging systems for laparoscopic and single-site laparoscopy (SSL) settings. The cameras included a 5-mm 30° laparoscope (LAP), a magnetically anchored (MAGS) camera, and a flexible endoscope (ENDO). The three camera systems were evaluated using standardized optical characteristic tests. Each system was used in random order for visualization during performance of a standardized suturing task by four surgeons. Each participant performed three to five consecutive repetitions as a surgeon and also served as a camera driver for other surgeons. Ex vivo testing was conducted in a laparoscopic multiport and SSL layout using a box trainer. In vivo testing was performed only in the multiport configuration and used a previously validated live porcine Nissen model. Optical testing showed superior resolution for MAGS at 5 and 10 cm compared with LAP or ENDO. The field of view ranged from 39 to 99°. The depth of focus was almost three times greater for MAGS (6-270 mm) than for LAP (2-88 mm) or ENDO (1-93 mm). Both ex vivo and in vivo multiport combined surgeon performance was significantly better for LAP than for ENDO, but no significant differences were detected for MAGS. For multiport testing, workload ratings were significantly less ex vivo for LAP and MAGS than for ENDO and less in vivo for LAP than for MAGS or ENDO. For ex vivo SSL, no significant performance differences were detected, but camera drivers rated the workload significantly less for MAGS than for LAP or ENDO. The data suggest that the improved imaging element of the next-generation MAGS camera has optical and performance characteristics that meet or exceed those of the LAP or ENDO systems and that the MAGS camera may be especially useful for SSL. Further refinements of the MAGS camera are encouraged.
Optomechanical stability design of space optical mapping camera
NASA Astrophysics Data System (ADS)
Li, Fuqiang; Cai, Weijun; Zhang, Fengqin; Li, Na; Fan, Junjie
2018-01-01
According to the interior orientation elements and imaging quality requirements of the mapping application, and combined with an off-axis three-mirror anastigmat (TMA) system, a high-stability optomechanical design of a space optical mapping camera is introduced in this paper. The configuration is a coaxial TMA system used in an off-axis situation. First, the overall optical arrangement is described and an overview of the optomechanical packaging is provided. Zerodur glass, carbon fiber composite and carbon-fiber-reinforced silicon carbide (C/SiC) are widely used in the optomechanical structure, because their low coefficients of thermal expansion (CTE) reduce the thermal sensitivity of the mirrors and focal plane. Flexible and unloading supports are used in the reflector and camera supporting structures. The epoxy structural adhesive used for bonding optics to the metal structure is also introduced in this paper. The primary mirror is mounted by means of a three-point ball-joint flexure system attached to the back of the mirror. Then, in order to predict flexural displacements due to gravity, static finite element analysis (FEA) is performed on the primary mirror. The optical performance, in terms of peak-to-valley (PV) and root-mean-square (RMS) wavefront errors, is measured before and after assembly. Also, dynamic finite element analysis (FEA) of the whole optical arrangement is carried out to investigate the optomechanical performance. Finally, in order to evaluate the stability of the design, a thermal vacuum test and a vibration test are carried out, with the Modulation Transfer Function (MTF) and the elements of interior orientation used as the evaluation indices. Before and after the thermal vacuum and vibration tests, the MTF, focal distance and position of the principal point of the optical system are measured, and the results are as expected.
Tsiourlis, Georgios; Andreadakis, Stamatis; Konstantinidis, Pavlos
2009-01-01
The SITHON system, a fully wireless optical imaging system, integrating a network of in-situ optical cameras linking to a multi-layer GIS database operated by Control Operating Centres, has been developed in response to the need for early detection, notification and monitoring of forest fires. This article presents in detail the architecture and the components of SITHON, and demonstrates the first encouraging results of an experimental test with small controlled fires over Sithonia Peninsula in Northern Greece. The system has already been scheduled to be installed in some fire prone areas of Greece. PMID:22408536
Design framework for a spectral mask for a plenoptic camera
NASA Astrophysics Data System (ADS)
Berkner, Kathrin; Shroff, Sapna A.
2012-01-01
Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield. Such camera designs, capturing directional ray information, enable applications such as digital refocusing, rotation, or depth estimation. Only a few address capturing spectral information of the scene. It has been demonstrated that by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the pupil plane of the main lens, sampling of the spectral dimension of the plenoptic function is performed. As a result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial with spectral information captured with a single sensor. Little work has been performed so far on analyzing the effects of diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper we demonstrate simulation of a spectrally-coded plenoptic camera optical system via wave propagation analysis, evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for optimization of the spectral mask for a few sample applications.
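Wave-propagation analysis of the kind described is commonly built on the angular-spectrum method; a minimal propagation step between two planes (e.g. spectral mask, microlens array, detector) is sketched below under the assumption of a square sampled field, without claiming this is the authors' exact simulation:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z using the
    angular spectrum method (square grid, sample spacing dx)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)  # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)
```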
NASA Astrophysics Data System (ADS)
Masciotti, James M.; Rahim, Shaheed; Grover, Jarrett; Hielscher, Andreas H.
2007-02-01
We present a design for a frequency-domain instrument that allows for simultaneous gathering of magnetic resonance and diffuse optical tomographic imaging data. This small-animal imaging system combines the high anatomical resolution of magnetic resonance imaging (MRI) with the high temporal resolution and physiological information provided by diffuse optical tomography (DOT). The DOT hardware comprises laser diodes and an intensified CCD camera, which are modulated up to 1 GHz by radio frequency (RF) signal generators. An optical imaging head is designed to fit inside the 4 cm inner diameter of a 9.4 T MRI system. Graded-index fibers are used to transfer light between the optical hardware and the imaging head within the RF coil. Fiducial markers are integrated into the imaging head to allow the determination of the positions of the source and detector fibers on the MR images and to permit co-registration of MR and optical tomographic images. Detector fibers are arranged compactly and focused through a camera lens onto the photocathode of the intensified CCD camera.
Design of the high resolution optical instrument for the Pleiades HR Earth observation satellites
NASA Astrophysics Data System (ADS)
Lamard, Jean-Luc; Gaudin-Delrieu, Catherine; Valentini, David; Renard, Christophe; Tournier, Thierry; Laherrere, Jean-Marc
2017-11-01
As part of its contribution to Earth observation from space, ALCATEL SPACE designed, built and tested the high-resolution cameras for the European intelligence satellites HELIOS I and II. Through these programmes, ALCATEL SPACE enjoys an international reputation; its capability and experience in high-resolution instrumentation is recognised by most customers. Coming after the SPOT programme, it was decided to go ahead with the PLEIADES HR programme. PLEIADES HR is the optical high-resolution component of a larger optical and radar multi-sensor system, ORFEO, which is developed in cooperation between France and Italy for dual civilian and defense use. ALCATEL SPACE has been entrusted by CNES with the development of the high-resolution camera of the Earth observation satellites PLEIADES HR. The first optical satellite of the PLEIADES HR constellation will be launched in mid-2008; the second will follow in 2009. To minimize development costs, a mini-satellite approach has been selected, leading to a compact concept for the camera design. The paper describes the design and performance budgets of this novel high-resolution and large-field-of-view optical instrument with emphasis on its technological features. This new generation of camera represents a breakthrough in comparison with the previous SPOT cameras owing to a significant step in on-ground resolution, which approaches the capabilities of aerial photography. Recent advances in detector technology, optical fabrication and electronics make it possible for the PLEIADES HR camera to achieve its image quality performance goals while staying within weight and size restrictions normally considered suitable only for much lower performance systems. This camera design delivers superior performance using an innovative low-power, low-mass, scalable architecture, which provides a versatile approach for a variety of imaging requirements and allows for a wide range of accommodation possibilities with a mini-satellite class platform.
A small field of view camera for hybrid gamma and optical imaging
NASA Astrophysics Data System (ADS)
Lees, J. E.; Bugby, S. L.; Bhatia, B. S.; Jambi, L. K.; Alqahtani, M. S.; McKnight, W. R.; Ng, A. H.; Perkins, A. C.
2014-12-01
The development of compact low profile gamma-ray detectors has allowed the production of small field of view, hand held imaging devices for use at the patient bedside and in operating theatres. The combination of an optical and a gamma camera, in a co-aligned configuration, offers high spatial resolution multi-modal imaging giving a superimposed scintigraphic and optical image. This innovative introduction of hybrid imaging offers new possibilities for assisting surgeons in localising the site of uptake in procedures such as sentinel node detection. Recent improvements to the camera system along with results of phantom and clinical imaging are reported.
Holographic motion picture camera with Doppler shift compensation
NASA Technical Reports Server (NTRS)
Kurtz, R. L. (Inventor)
1976-01-01
A holographic motion picture camera for producing three-dimensional images by employing an elliptical optical system is reported. In one of the beam paths (the object or reference beam path) there is provided a motion compensator which enables the camera to photograph faster-moving objects.
Miniaturized unified imaging system using bio-inspired fluidic lens
NASA Astrophysics Data System (ADS)
Tsai, Frank S.; Cho, Sung Hwan; Qiao, Wen; Kim, Nam-Hyong; Lo, Yu-Hwa
2008-08-01
Miniaturized imaging systems have become ubiquitous as they are found in an ever-increasing number of devices, such as cellular phones, personal digital assistants, and web cameras. Until now, the design and fabrication methodology of such systems has not been significantly different from that of conventional cameras. The only established method to achieve focusing is by varying the lens distance. On the other hand, the variable-shape crystalline lens found in animal eyes offers inspiration for a more natural way of achieving an optical system with high functionality. Learning from the working concepts of optics in the animal kingdom, we developed bio-inspired fluidic lenses for a miniature universal imager with auto-focusing, macro, and super-macro capabilities. Because of the enormous dynamic range of fluidic lenses, the miniature camera can even function as a microscope. To compensate for the image quality difference between central vision and peripheral vision and the shape difference between a solid-state image sensor and a curved retina, we adopted a hybrid design consisting of fluidic lenses for tunability and fixed lenses for aberration and color dispersion correction. A design of the world's smallest surgical camera with 3X optical zoom capability is also demonstrated using the hybrid lens approach.
Retinal axial focusing and multi-layer imaging with a liquid crystal adaptive optics camera
NASA Astrophysics Data System (ADS)
Liu, Rui-Xue; Zheng, Xian-Liang; Li, Da-Yu; Xia, Ming-Liang; Hu, Li-Fa; Cao, Zhao-Liang; Mu, Quan-Quan; Xuan, Li
2014-09-01
With the help of adaptive optics (AO) technology, cellular-level imaging of the living human retina can be achieved. Aiming to reduce patient discomfort and to avoid potential drug-induced complications, we attempted to image the retina with a dilated pupil and frozen accommodation, without drugs. An optimized liquid crystal adaptive optics camera was adopted for retinal imaging. A novel eye-staring system was used for stimulating accommodation and fixating the imaging area. The illumination sources and imaging camera moved in linkage for focusing and imaging different layers. Four subjects with diverse degrees of myopia were imaged. Based on the optical properties of the human eye, the eye-staring system reduced the defocus to less than the typical ocular depth of focus; in this way, the illumination light can be projected onto a given retinal layer precisely. Since the defocus had been compensated by the eye-staring system, the adopted 512 × 512 liquid crystal spatial light modulator (LC-SLM) corrector provided the spatial fidelity needed to fully compensate high-order aberrations. The Strehl ratio for a subject with -8 diopter myopia was improved to 0.78, close to diffraction-limited imaging. By finely adjusting the axial displacement of the illumination sources and imaging camera, cone photoreceptors, blood vessels and the nerve fiber layer were clearly imaged.
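For context, the Maréchal approximation S ≈ exp(−(2πσ)²), with σ the RMS wavefront error in waves, links the reported Strehl ratio to the residual wavefront error left after correction; the inversion below is purely illustrative:

```python
import numpy as np

# Marechal approximation: Strehl ~ exp(-(2*pi*sigma)^2), sigma in waves.
# Inverting the reported post-correction Strehl of 0.78 gives the residual
# RMS wavefront error the corrector left uncompensated.
strehl = 0.78
sigma_waves = np.sqrt(-np.log(strehl)) / (2 * np.pi)
print(f"residual RMS error ~ {sigma_waves:.3f} waves (~lambda/{1 / sigma_waves:.0f})")
```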
Camera System MTF: combining optic with detector
NASA Astrophysics Data System (ADS)
Andersen, Torben B.; Granger, Zachary A.
2017-08-01
MTF is one of the most common metrics used to quantify the resolving power of an optical component. Extensive literature is dedicated to describing methods to calculate the Modulation Transfer Function (MTF) for stand-alone optical components such as a camera lens or telescope, and some literature addresses approaches to determine an MTF for the combination of an optic with a detector. The formulations pertaining to a combined electro-optical system MTF are mostly based on theory and on the assumption that the detector MTF is described only by the pixel pitch, which does not account for wavelength dependencies. When working with real hardware, detectors are often characterized by testing MTF at discrete wavelengths. This paper presents a method to simplify the calculation of a polychromatic system MTF when it is permissible to consider the detector MTF to be independent of wavelength.
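A sketch of the basic combination step: multiply the optic MTF by a detector MTF interpolated from discrete measured samples, falling back to the ideal pixel-aperture sinc model when no measurements are available (function and argument names are illustrative, not from the paper):

```python
import numpy as np

def system_mtf(f, mtf_optic, det_freqs=None, det_mtf_meas=None, pixel_pitch=None):
    """Combine optic and detector MTF at spatial frequencies f (cy/mm).

    If measured detector MTF samples (det_freqs, det_mtf_meas) are given,
    e.g. from tests at discrete wavelengths, interpolate them; otherwise
    use the ideal pixel-aperture model |sinc(f * p)| with pitch p in mm.
    """
    if det_mtf_meas is not None:
        mtf_det = np.interp(f, det_freqs, det_mtf_meas)
    else:
        mtf_det = np.abs(np.sinc(f * pixel_pitch))  # np.sinc is sin(pi x)/(pi x)
    return mtf_optic * mtf_det
```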
A rigid and thermally stable all ceramic optical support bench assembly for the LSST Camera
NASA Astrophysics Data System (ADS)
Kroedel, Matthias; Langton, J. Brian; Wahl, Bill
2017-09-01
This paper will present the ceramic design, fabrication and metrology results, and the assembly plan of the LSST camera optical bench structure, which uses the unique manufacturing features of the HB-Cesic technology. The optical bench assembly consists of a rigid "Grid" fabrication supporting individual raft plates, which mount sensor assemblies by way of a rigid kinematic support system, to meet extremely stringent requirements for focal plane planarity and stability.
Li, Jin; Liu, Zilong
2017-07-24
Remote sensing cameras in the visible/near-infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e. image quality here, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself, i.e. its optical system, image sensor, and electronics, limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which depends only on the camera itself and is stable and invariant to changes in ground targets, atmosphere, and environment on orbit or on the ground, is extracted using a pixel optical focal plane (PFP). The PFP produces multiple spatial-frequency targets, which are used to calculate the IMTF at different frequencies. The extracted IMTF, in combination with a constrained least-squares filter, is then used to compensate for the camera's transfer function, removing the imaging degradation imposed by the camera itself. This method is experimentally confirmed. Experiments on an on-orbit panchromatic camera indicate that the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56 compared to the case where the IMTF is not used. This opens the door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
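The compensation step can be illustrated with a generic constrained least-squares restoration filter driven by the extracted transfer function; this sketch treats the measured IMTF as a real-valued OTF (a simplifying assumption) and is not the authors' exact filter:

```python
import numpy as np

def cls_restore(image, otf, lam=0.01):
    """Constrained least-squares restoration given the camera's OTF:

        F_hat = conj(H) / (|H|^2 + lam * |P|^2) * G,

    with G the image spectrum and P the Laplacian smoothness constraint.
    """
    G = np.fft.fft2(image)
    # Laplacian regularization kernel, zero-padded to the image size and
    # rolled so its center sits at the (0, 0) frequency origin.
    p = np.zeros_like(image, dtype=float)
    p[:3, :3] = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], dtype=float)
    P = np.fft.fft2(np.roll(p, (-1, -1), axis=(0, 1)))
    H = otf
    F = np.conj(H) / (np.abs(H) ** 2 + lam * np.abs(P) ** 2) * G
    return np.real(np.fft.ifft2(F))
```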
Evaluation of multispectral plenoptic camera
NASA Astrophysics Data System (ADS)
Meng, Lingfei; Sun, Ting; Kosoglow, Rich; Berkner, Kathrin
2013-01-01
Plenoptic cameras enable capture of a 4D lightfield, allowing digital refocusing and depth estimation from data captured with a compact portable camera. Whereas most work on plenoptic camera design has been based on a simplistic geometric-optics characterization of the optical path only, little work has been done on optimizing end-to-end system performance for a specific application. Such design optimization requires design tools that include careful parameterization of the main lens elements as well as microlens array and sensor characteristics. In this paper we are interested in evaluating the performance of a multispectral plenoptic camera, i.e. a camera with spectral filters inserted into the aperture plane of the main lens. Such a camera enables single-snapshot spectral data acquisition [1-3]. We first describe in detail an end-to-end imaging system model for a spectrally coded plenoptic camera that we briefly introduced in [4]. Different performance metrics are defined to evaluate the spectral reconstruction quality. We then present a prototype which is developed based on a modified DSLR camera containing a lenslet array on the sensor and a filter array in the main lens. Finally we evaluate the spectral reconstruction performance of a spectral plenoptic camera based on both simulation and measurements obtained from the prototype.
NASA Astrophysics Data System (ADS)
Swain, Pradyumna; Mark, David
2004-09-01
The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised in the presence of curved CCD applications, in conjunction with large format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than their flat CCD counterparts, lightening the cameras and enabling faster scanning and 3D integration of objects moving within a planetary terrain environment. Preliminary experiments conducted at the Sarnoff Corporation indicate the feasibility of curved CCD imagers with acceptable electro-optic integrity. Currently, we are in the process of evaluating the electro-optic performance of a curved wafer-scale CCD imager. Detailed ray-trace modeling and experimental electro-optical performance data obtained from the curved imager will be presented at the conference.
Model of an optical system's influence on sensitivity of microbolometric focal plane array
NASA Astrophysics Data System (ADS)
Gogler, Sławomir; Bieszczad, Grzegorz; Zarzycka, Alicja; Szymańska, Magdalena; Sosnowski, Tomasz
2012-10-01
Thermal imagers and the infrared array sensors used therein are subject to a calibration procedure and an evaluation of their voltage sensitivity to incident radiation during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represents temperature variations across the scene. The detectors used in a thermal camera are illuminated by infrared radiation transmitted through a specialized optical system, and each optical system influences the irradiation distribution across the sensor array. In this article a model is proposed that describes the irradiation distribution across an array sensor working with the optical system used in the calibration set-up. The method takes into account the optical and geometrical properties of the array set-up. By means of Monte Carlo simulation, a large number of rays was traced to the sensor plane, which allowed the irradiation distribution across the image plane to be determined for different aperture-limiting configurations. The simulated results have been compared with a proposed analytical expression. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
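A toy version of the Monte Carlo step: launch rays from random points on a circular exit pupil toward random sensor positions and accumulate geometrically weighted hits into an irradiance map. The geometry and cos⁴-style weighting below are simplified stand-ins for the article's model, and all parameter values are illustrative:

```python
import numpy as np

def mc_irradiance(nx=64, rays=200_000, f=50.0, fnum=2.0, half_size=10.0):
    """Monte Carlo estimate of relative irradiance across a sensor plane.

    f         : pupil-to-sensor distance, mm.
    fnum      : f-number setting the exit-pupil diameter f / fnum.
    half_size : sensor half-width, mm. Returns an nx-by-nx map, peak = 1.
    """
    rng = np.random.default_rng(0)
    r = (f / fnum / 2.0) * np.sqrt(rng.random(rays))  # uniform over pupil disk
    th = 2 * np.pi * rng.random(rays)
    px, py = r * np.cos(th), r * np.sin(th)           # pupil launch points
    sx = rng.uniform(-half_size, half_size, rays)     # sensor hit points, mm
    sy = rng.uniform(-half_size, half_size, rays)
    d2 = (sx - px) ** 2 + (sy - py) ** 2 + f ** 2
    w = f ** 4 / d2 ** 2                              # cos^4(theta)-style falloff
    img, _, _ = np.histogram2d(sx, sy, bins=nx,
                               range=[[-half_size, half_size]] * 2, weights=w)
    return img / img.max()
```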
Navigating surgical fluorescence cameras using near-infrared optical tracking.
van Oosterom, Matthias; den Houting, David; van de Velde, Cornelis; van Leeuwen, Fijs
2018-05-01
Fluorescence guidance facilitates real-time intraoperative visualization of the tissue of interest. However, due to attenuation, the application of fluorescence guidance is restricted to superficial lesions. To overcome this shortcoming, we have previously applied three-dimensional surgical navigation to position the fluorescence camera in reach of the superficial fluorescent signal. Unfortunately, in open surgery, the near-infrared (NIR) optical tracking system (OTS) used for navigation also caused interference during NIR fluorescence imaging. In an attempt to support future implementation of navigated fluorescence cameras, different aspects of this interference were characterized and solutions were sought. Two commercial fluorescence cameras for open surgery were studied in (surgical) phantom and human tissue setups using two different NIR OTSs and one light-emitting-diode setup simulating an OTS. Following the outcome of these measurements, OTS settings were optimized. Measurements indicated the OTS interference was caused by: (1) spectral overlap between the OTS light and camera, (2) OTS light intensity, (3) OTS duty cycle, (4) OTS frequency, (5) fluorescence camera frequency, and (6) fluorescence camera sensitivity. By optimizing points 2 to 4, navigation of fluorescence cameras during open surgery could be facilitated. Optimization of the OTS and camera compatibility can be used to support navigated fluorescence guidance concepts. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Concept of electro-optical sensor module for sniper detection system
NASA Astrophysics Data System (ADS)
Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz
2010-10-01
The paper presents an initial concept of an electro-optical sensor unit for sniper detection. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. As part of a larger system it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data and sensor fusion techniques. Additionally, it is expected to provide some pre-shot detection capability. Acoustic (or radar) systems used for shot detection generally offer only after-the-shot information and cannot prevent an enemy attack, which in the case of a skilled sniper opponent usually means trouble. The passive imaging sensors presented in this paper, together with active systems detecting pointed optics, are capable of detecting specific shooter signatures, or at least the presence of suspect objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters, such as focal plane array size and type, focal length, and aperture, were chosen on the basis of the assumed tactical characteristics of the system (mainly detection range) and the current technology level. In order to provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image acquisition rate. The daylight camera operates as a support, providing a corresponding visual image that is easier for a human operator to comprehend. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot recording sequences are presented.
Multiple-aperture optical design for micro-level cameras using 3D-printing method
NASA Astrophysics Data System (ADS)
Peng, Wei-Jei; Hsu, Wei-Yao; Cheng, Yuan-Chieh; Lin, Wen-Lung; Yu, Zong-Ru; Chou, Hsiao-Yu; Chen, Fong-Zhi; Fu, Chien-Chung; Wu, Chong-Syuan; Huang, Chao-Tsung
2018-02-01
The design of an ultra-miniaturized camera using 3D-printing technology, printed directly onto the complementary metal-oxide semiconductor (CMOS) imaging sensor, is presented in this paper. The 3D-printed micro-optics are manufactured using femtosecond two-photon direct laser writing, and the figure error, which can reach submicron accuracy, is suitable for the optical system. Because the size of the micro-level camera is approximately several hundred micrometers, the resolution is greatly reduced and is strongly limited by the Nyquist frequency of the pixel pitch. To improve the reduced resolution, the single lens can be replaced by multiple-aperture lenses with dissimilar fields of view (FOV); stitching sub-images with different FOVs then achieves high resolution within the central region of the image. The reason is that the angular resolution of a lens with a smaller FOV is higher than that of a lens with a larger FOV, so after stitching, the angular resolution of the central area can be several times that of the outer area. For the same image circle, the image quality of the central area of the multi-lens system is significantly superior to that of a single lens. Foveated imaging by stitching FOVs breaks the resolution limit of the ultra-miniaturized imaging system, enabling applications such as biomedical endoscopy, optical sensing, and machine vision. In this study, an ultra-miniaturized camera with multi-aperture optics is designed and simulated for optimum optical performance.
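A quick back-of-envelope sketch in Python (all numbers are assumptions for illustration) shows the two quantities the abstract leans on: the Nyquist frequency set by the pixel pitch, and the finer angular sampling of a narrower-FOV channel.

```python
# Back-of-envelope sketch; pixel pitch, pixel count and FOVs are assumptions.
pitch_um = 1.12                              # CMOS pixel pitch
nyquist = 1.0/(2*pitch_um*1e-3)              # sampling limit in lp/mm
print(f"Nyquist limit: {nyquist:.0f} lp/mm")

n_px = 400                                   # pixels across one sub-image
for fov_deg in (60.0, 30.0, 15.0):           # wide -> narrow channels
    ifov_arcsec = fov_deg/n_px*3600          # angular sampling per pixel
    print(f"FOV {fov_deg:4.0f} deg -> {ifov_arcsec:6.0f} arcsec per pixel")
```

Halving the FOV over the same pixel count halves the per-pixel angle, which is why the stitched central region resolves finer detail.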
Interplanetary approach optical navigation with applications
NASA Technical Reports Server (NTRS)
Jerath, N.
1978-01-01
The use of optical data from onboard television cameras for the navigation of interplanetary spacecraft during the planet approach phase is investigated. Three optical data types were studied: the planet limb with auxiliary celestial references, the satellite-star method, and the planet-star two-camera method. Analysis and modelling issues related to the nature and information content of the optical methods were examined. Dynamic and measurement system modelling, data sequence design, measurement extraction, model estimation and orbit determination, as they relate to optical navigation, are discussed, and the various error sources are analyzed. The methodology developed was applied to the Mariner 9 and Viking Mars missions. Navigation accuracies were evaluated at the control and knowledge points, with particular emphasis on the combined use of radio and optical data. A parametric probability analysis technique was developed to evaluate navigation performance as a function of system reliabilities.
NASA Technical Reports Server (NTRS)
Dunham, Edward W.
2000-01-01
We developed the CCD camera system for the laboratory test demonstration and designed the optical system for this test. The camera system was delivered to Ames in April 1999, with continuing support mostly in the software area as the test progressed. The camera system has been operating successfully since delivery. The optical system performed well during the test. The laboratory demonstration activity is now nearly complete and is considered to be successful by the Technical Advisory Group, which met on 8 February 2000 at the SETI Institute. A final report for the Technical Advisory Group and NASA Headquarters will be produced in the next few months. This report will be a comprehensive report on all facets of the test, including those covered under this grant. A copy will be forwarded, if desired, when it is complete.
ARNICA, the Arcetri Near-Infrared Camera
NASA Astrophysics Data System (ADS)
Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.
1996-04-01
ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December 1992 to December 1993) and an observing run at the William Herschel Telescope, Canary Islands (December 1993). System performance is defined in terms of the efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, D.E.; Roeske, F.
We have successfully fielded a Fiber Optics Radiation Experiment system (FOREX) designed for measuring material properties at high temperatures and pressures in an underground nuclear test. The system collects light from radiating materials and transmits it through several hundred meters of optical fibers to a recording station consisting of a streak camera with film readout. The use of fiber optics provides a faster time response than can presently be obtained with equalized coaxial cables over comparable distances. Fibers also have significant cost and physical size advantages over coax cables. The streak camera achieves a much higher information density than an equivalent oscilloscope system, and it also serves as the light detector. The result is a wide bandwidth, high capacity system that can be fielded at a relatively low cost in manpower, space, and materials. For this experiment, the streak camera had a 120 ns time window with a 1.2 ns time resolution. Dynamic range for the system was about 1000. Beam current statistical limitations were approximately 8% for a 0.3 ns wide data point at one decade above the threshold recording intensity.
ATTICA family of thermal cameras in submarine applications
NASA Astrophysics Data System (ADS)
Kuerbitz, Gunther; Fritze, Joerg; Hoefft, Jens-Rainer; Ruf, Berthold
2001-10-01
Optronics Mast Systems (US: Photonics Mast Systems) are electro-optical devices which enable a submarine crew to observe the scenery above water while submerged. Unlike classical submarine periscopes they are non-hull-penetrating and therefore have no direct viewing capability. Typically they have electro-optical cameras for both the visual and an IR spectral band, with panoramic view and a stabilized line of sight. They can optionally be equipped with laser range-finders, antennas, etc. The brand name ATTICA (Advanced Two-dimensional Thermal Imager with CMOS-Array) designates a family of thermal cameras using focal-plane-array (FPA) detectors which can be tailored to a variety of requirements. The modular design of the ATTICA components allows the use of various detectors (InSb, CMT 3-5 μm, CMT 7-11 μm) for specific applications. By means of a microscanner, ATTICA cameras achieve full standard TV resolution using detectors with only 288 × 384 (US: 240 × 320) detector elements. A typical requirement for Optronics Mast Systems is a quick-look-around capability. For FPA cameras this implies the need for a 'descan' module, which can be incorporated in the ATTICA cameras without complications.
Prism-based single-camera system for stereo display
NASA Astrophysics Data System (ADS)
Zhao, Yue; Cui, Xiaoyu; Wang, Zhiguo; Chen, Hongsheng; Fan, Heyu; Wu, Teresa
2016-06-01
This paper combines a prism with a single camera and puts forward a low-cost method of stereo imaging. First, from the principles of geometrical optics, we derive the relationship between the prism single-camera system and a dual-camera system; from the principles of binocular vision, we derive the relationship between binocular viewing and the dual-camera system. We can thus establish the relationship between the prism single-camera system and binocular vision, and obtain the positional relation of prism, camera, and object that gives the best stereo display. Finally, using NVIDIA active shutter stereo glasses, we realize a three-dimensional (3-D) display of the object. The experimental results show that the proposed approach can use the prism single-camera system to simulate the various ways the eyes observe a scene. A stereo imaging system designed by the proposed method can faithfully restore the 3-D shape of the photographed object.
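For the equivalent dual-camera system that the prism creates, the standard rectified-stereo relation links disparity to depth. A minimal sketch, assuming the equivalent baseline and focal length are already known (the paper derives the actual values from the prism geometry; the numbers below are illustrative):

```python
def depth_from_disparity(f_px, baseline_mm, disparity_px):
    """Rectified-stereo relation Z = f*B/d applied to the virtual camera
    pair the prism splits the single camera into. f_px is the focal
    length in pixels and baseline_mm the equivalent baseline; both are
    assumed inputs here, derived from the prism geometry in the paper."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px*baseline_mm/disparity_px

# e.g. a 1200 px focal length, 40 mm equivalent baseline, 8 px disparity:
print(depth_from_disparity(1200.0, 40.0, 8.0))   # -> 6000.0 mm
```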
Image Intensifier Modules For Use With Commercially Available Solid State Cameras
NASA Astrophysics Data System (ADS)
Murphy, Howard; Tyler, Al; Lake, Donald W.
1989-04-01
A modular approach to design has contributed greatly to the success of the family of machine vision video equipment produced by EG&G Reticon during the past several years. Internal modularity allows high-performance area (matrix) and line scan cameras to be assembled from two or three electronic subassemblies with very low labor costs, and permits camera control and interface circuitry to be realized by assemblages of various modules suiting the needs of specific applications. Product modularity benefits equipment users in several ways. Modular matrix and line scan cameras are available in identical enclosures (Fig. 1), which allows enclosure components to be purchased in volume for economies of scale and allows field replacement or exchange of cameras within a customer-designed system to be easily accomplished. The cameras are optically aligned (boresighted) at final test; modularity permits optical adjustments to be made with the same precise test equipment for all camera varieties. The modular cameras contain two, or sometimes three, hybrid microelectronic packages (Fig. 2). These rugged and reliable "submodules" perform all of the electronic operations internal to the camera except for the image acquisition performed by the monolithic image sensor. Heat produced by electrical power dissipation in the electronic modules is conducted through low-resistance paths to the camera case by metal plates, which results in a thermally efficient and environmentally tolerant camera with low manufacturing costs. A modular approach has also been followed in the design of the camera control, video processor, and computer interface accessory called the Formatter (Fig. 3). This unit can be attached directly onto either a line scan or matrix modular camera to form a self-contained unit, or connected via a cable to retain the advantages inherent in a small, lightweight, and rugged image sensing component. Available modules permit the bus-structured Formatter to be configured as required by a specific camera application. Modular line and matrix scan cameras incorporating sensors with fiber optic faceplates (Fig. 4) are also available. These units retain the advantages of interchangeability, simple construction, ruggedness, and optical precision offered by the more common lens input units. Fiber optic faceplate cameras are used for a wide variety of applications. A common usage involves mating of the Reticon-supplied camera to a customer-supplied intensifier tube for low light level and/or short exposure time situations.
Optical Arc-Length Sensor For TIG Welding
NASA Technical Reports Server (NTRS)
Smith, Matthew A.
1990-01-01
A proposed subsystem of a tungsten/inert-gas (TIG) welding system measures the length of the welding arc optically. The arc is viewed by a video camera in one of three alternative optical configurations. Arc length is measured directly instead of inferred from voltage.
Harrison, Thomas C; Sigler, Albrecht; Murphy, Timothy H
2009-09-15
We describe a simple and low-cost system for intrinsic optical signal (IOS) imaging using stable LED light sources, basic microscopes, and commonly available CCD cameras. IOS imaging measures activity-dependent changes in the light reflectance of brain tissue, and can be performed with a minimum of specialized equipment. Our system uses LED ring lights that can be mounted on standard microscope objectives or video lenses to provide a homogeneous and stable light source, with less than 0.003% fluctuation across images averaged from 40 trials. We describe the equipment and surgical techniques necessary for both acute and chronic mouse preparations, and provide software that can create maps of sensory representations from images captured by inexpensive 8-bit cameras or by 12-bit cameras. The IOS imaging system can be adapted to commercial upright microscopes or custom macroscopes, eliminating the need for dedicated equipment or complex optical paths. This method can be combined with parallel high resolution imaging techniques such as two-photon microscopy.
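The core of the map-making step is trial averaging and normalization to baseline reflectance. A minimal sketch of that computation, with the array shapes and the baseline/stimulus split assumed for illustration:

```python
import numpy as np

def ios_map(trials):
    """Trial-averaged intrinsic-signal map. `trials` is an array of
    reflectance frames shaped (n_trials, n_frames, h, w); the first half
    of each trial is treated as baseline and the second half as the
    stimulus period (an assumed protocol -- real experiments time-lock
    the split to stimulus onset). Returns the dR/R map."""
    n = trials.shape[1]
    base = trials[:, :n//2].mean(axis=(0, 1))
    stim = trials[:, n//2:].mean(axis=(0, 1))
    return (stim - base)/base     # active tissue reflects slightly less

demo = np.random.default_rng(0).normal(1000.0, 1.0, (40, 20, 64, 64))
print(ios_map(demo).shape)        # -> (64, 64)
```

Averaging over many trials is what lets even 8-bit cameras resolve reflectance changes far below one grey level, as the 0.003% source stability figure suggests.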
Applications of optical fibers and miniature photonic elements in medical diagnostics
NASA Astrophysics Data System (ADS)
Blaszczak, Urszula; Gilewski, Marian; Gryko, Lukasz; Zajac, Andrzej; Kukwa, Andrzej; Kukwa, Wojciech
2014-05-01
The construction of endoscopes, known for decades, in particular small devices a few millimetres in diameter, is based on fibre-optic imaging bundles or bundles of fibres in the illumination system (usually with a halogen source). Commercially emerging CCD and CMOS cameras with sensor sizes below 5 mm, together with high-power LED solutions, make it possible to design and build modern endoscopes with many innovative properties. These constructions offer higher resolution and are relatively cheaper, especially given the integration of most functions on a single chip. These features of CMOS sensors shorten the cycle of bringing newly developed instruments to market. The paper describes the concept of an endoscope with a miniature camera built around a CMOS detector manufactured by OmniVision. A set of LEDs located at the operator side serves as the illumination system; a fibre-optic system and the camera lens shape the beam illuminating the observed tissue. Furthermore, to broaden the range of applications of the endoscope, the illuminator allows the spectral characteristics of the emitted light to be controlled. The paper presents an analysis of the basic parameters of the endoscope's optical system. The possibility of adjusting the magnification of the lens, the field of view of the camera, and its spatial resolution is discussed. Special attention is given to the selection of light sources for the illumination in terms of energy efficiency and the possibility of adjusting the colour of the emitted light in order to improve the quality of the image obtained by the camera.
Near infra-red astronomy with adaptive optics and laser guide stars at the Keck Observatory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Max, C.E.; Gavel, D.T.; Olivier, S.S.
1995-08-03
A laser guide star adaptive optics system is being built for the W. M. Keck Observatory's 10-meter Keck II telescope. Two new near infra-red instruments will be used with this system: a high-resolution camera (NIRC 2) and an echelle spectrometer (NIRSPEC). The authors describe the expected capabilities of these instruments for high-resolution astronomy, using adaptive optics with either a natural star or a sodium-layer laser guide star as a reference. They compare the expected performance of these planned Keck adaptive optics instruments with that predicted for the NICMOS near infra-red camera, which is scheduled to be installed on the Hubble Space Telescope in 1997.
Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy
NASA Technical Reports Server (NTRS)
1984-01-01
Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge-coupled device. The camera consists of an X-ray-sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.
ERIC Educational Resources Information Center
Biermann, Mark L.; Biermann, Lois A. A.
1996-01-01
Discusses descriptions of the way in which an optical system controls the quantity of light that reaches a point on the image plane, a basic feature of optical imaging systems such as cameras, telescopes, and microscopes. (JRH)
NASA Astrophysics Data System (ADS)
Bernas, Martin; Páta, Petr; Hudec, René; Soldán, Jan; Rezek, Tomáš; Castro-Tirado, Alberto J.
1998-05-01
Although there are several optical GRB follow-up systems in operation and/or in development, some of them with a very short response time, they will never be able to provide truly simultaneous (no-delay) and pre-burst optical data for GRBs. We report on the development and tests of a monitoring experiment expected to be put into test operation in 1998. The system should detect optical transients down to mag 6-7 (a few seconds' duration assumed) over a wide field of view. The system is based on double-CCD wide-field ST8 cameras. For real-time evaluation of the signal from both cameras, two TMS320C40 processors are used. Using two channels differing in spectral sensitivity, and processing a temporal sequence of images, allows us to eliminate man-made objects and defects of the CCD electronics. The system is controlled by a standard PC.
Digital optical correlator x-ray telescope alignment monitoring system
NASA Astrophysics Data System (ADS)
Lis, Tomasz; Gaskin, Jessica; Jasper, John; Gregory, Don A.
2018-01-01
The High-Energy Replicated Optics to Explore the Sun (HEROES) program is a balloon-borne x-ray telescope mission to observe hard x-rays (~20 to 70 keV) from the sun and multiple astrophysical targets. The payload consists of eight mirror modules with a total of 114 optics that are mounted on a 6-m-long optical bench. Each mirror module is complemented by a high-pressure xenon gas scintillation proportional counter. Attached to the payload is a camera that acquires star fields and then matches the acquired field to star maps to determine the pointing of the optical bench. Slight misalignments between the star camera, the optical bench, and the telescope elements attached to the optical bench may occur during flight due to mechanical shifts, thermal gradients, and gravitational effects. These misalignments can result in diminished imaging and reduced photon collection efficiency. To monitor these misalignments during flight, a supplementary Bench Alignment Monitoring System (BAMS) was added to the payload. BAMS hardware comprises two cameras mounted directly to the optical bench and rings of light-emitting diodes (LEDs) mounted onto the telescope components. The LEDs in these rings are mounted in a predefined, asymmetric pattern, and their positions are tracked using an optical/digital correlator. The BAMS analysis software is a digital adaption of an optical joint transform correlator. The aim is to enhance the observational proficiency of HEROES while providing insight into the magnitude of mechanically and thermally induced misalignments during flight. Results from a preflight test of the system are reported.
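The LED-tracking step reduces to locating a known pattern in each camera frame by correlation. A short FFT-based sketch, used here as a digital stand-in for the joint transform correlator the abstract describes (pattern, frame, and sizes are synthetic assumptions):

```python
import numpy as np

def locate_pattern(image, template):
    """Locate a reference LED pattern in a camera frame by FFT-based
    cross-correlation (a digital stand-in for the joint transform
    correlator described above). Returns the (row, col) of the pattern's
    corner; frame-to-frame drift of this position measures misalignment."""
    F = np.fft.fft2(image)
    T = np.fft.fft2(template, s=image.shape)
    corr = np.real(np.fft.ifft2(F*np.conj(T)))
    return np.unravel_index(np.argmax(corr), corr.shape)

rng = np.random.default_rng(0)
tmpl = rng.uniform(size=(16, 16))          # asymmetric LED pattern stand-in
frame = 0.1*rng.uniform(size=(128, 128))   # camera frame with background
frame[40:56, 70:86] += tmpl                # pattern placed at (40, 70)
print(locate_pattern(frame, tmpl))         # -> (40, 70)
```

An asymmetric pattern, as the paper specifies, guarantees the correlation peak is unique under rotation as well as translation.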
The spacecraft control laboratory experiment optical attitude measurement system
NASA Technical Reports Server (NTRS)
Welch, Sharon S.; Montgomery, Raymond C.; Barsky, Michael F.
1991-01-01
A stereo camera tracking system was developed to provide a near real-time measure of the position and attitude of the Spacecraft COntrol Laboratory Experiment (SCOLE). The SCOLE is a mockup of a shuttle-like vehicle with an attached flexible mast and (simulated) antenna, and was designed to provide a laboratory environment for the verification and testing of control laws for large flexible spacecraft. Actuators and sensors located on the shuttle and antenna sense the states of the spacecraft and allow the position and attitude to be controlled. The stereo camera tracking system consists of two position-sensitive-detector cameras which sense the locations of small infrared LEDs attached to the surface of the shuttle. Information on shuttle position and attitude is provided in six degrees of freedom. The design of this optical system, its calibration, and the tracking algorithm are described. The performance of the system is evaluated for yaw only.
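A minimal sketch of the position part of such a measurement: linear (DLT) triangulation of one LED from two calibrated cameras. The projection matrices and pixel coordinates are assumed inputs; the actual SCOLE estimator also recovers attitude by fitting several LEDs at once.

```python
import numpy as np

def triangulate_led(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one infrared LED seen by two
    calibrated cameras. P1, P2 are 3x4 projection matrices from the
    calibration; uv1, uv2 are the LED's image coordinates in each
    camera. Returns XYZ in the calibration frame."""
    A = np.vstack([
        uv1[0]*P1[2] - P1[0],
        uv1[1]*P1[2] - P1[1],
        uv2[0]*P2[2] - P2[0],
        uv2[1]*P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # null vector of A is the homogeneous X
    X = Vt[-1]
    return X[:3]/X[3]

# Toy check with two normalized cameras one unit apart:
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])
print(triangulate_led(P1, P2, (0.2, 0.1), (0.0, 0.1)))  # -> [1.0, 0.5, 5.0]
```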
Studying Upper-Limb Amputee Prosthesis Use to Inform Device Design
2015-10-01
the study. This equipment has included a modified GoPro head-mounted camera and a Vicon 13-camera optical motion capture system, which was not part...also completed for relevant members of the study team. 4. The head-mounted camera setup has been established (a modified GoPro Hero 3 with external
Design of microcontroller based system for automation of streak camera.
Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P
2010-08-01
A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high-voltage ramp signal is generated by a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.
Completely optical orientation determination for an unstabilized aerial three-line camera
NASA Astrophysics Data System (ADS)
Wohlfeil, Jürgen
2010-10-01
Aerial line cameras allow the fast acquisition of high-resolution images at low cost. Unfortunately, measuring the camera's orientation at the necessary rate and precision requires considerable effort unless extensive camera stabilization is used; but stabilization likewise entails high cost, weight, and power consumption. This contribution shows that it is possible to derive the absolute exterior orientation of an unstabilized line camera entirely from its images and global position measurements. The presented approach builds on previous work on determining the relative orientation of subsequent lines using optical information from the remote sensing system. The relative orientation is used to pre-correct the line images, in which homologous points can then be determined reliably using the SURF operator. Together with the position measurements, these points are used to determine the absolute orientation from the relative orientations via bundle adjustment of a block of overlapping line images. The approach was tested on a flight with the DLR's RGB three-line camera MFC. To evaluate the precision of the resulting orientation, measurements from a high-end navigation system and ground control points are used.
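A sketch of the homologous-point step with OpenCV. The paper uses the SURF operator; ORB is substituted here because SURF ships only with opencv-contrib builds, and two synthetic overlapping strips stand in for the pre-corrected line imagery.

```python
import cv2
import numpy as np

# Two synthetic overlapping "line image" strips stand in for the
# pre-corrected imagery (texture and overlap are assumptions).
rng = np.random.default_rng(0)
strip = rng.uniform(0, 255, (512, 2048)).astype(np.uint8)
strip = cv2.GaussianBlur(strip, (0, 0), 3)          # give the texture scale
img1, img2 = strip[:, :1200], strip[:, 800:2000]    # ~400 px overlap

orb = cv2.ORB_create(nfeatures=2000)                # ORB instead of SURF
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Homologous points like these, plus the GPS positions, feed the bundle
# adjustment that recovers the absolute exterior orientation.
pts = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches[:200]]
print(len(pts))
```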
The optical design of a visible adaptive optics system for the Magellan Telescope
NASA Astrophysics Data System (ADS)
Kopon, Derek
The Magellan Adaptive Optics system will achieve first light in November of 2012. This AO system contains several subsystems, including the 585-actuator concave adaptive secondary mirror, the Calibration Return Optic (CRO) alignment and calibration system, the CLIO 1-5 µm IR science camera, the movable guider camera and active optics assembly, and the W-Unit, which contains both the Pyramid Wavefront Sensor (PWFS) and the VisAO visible science camera. In this dissertation, we present details of the design, fabrication, assembly, alignment, and laboratory performance of the VisAO camera and its optical components. Many of these components required a custom design, such as the Spectral Differential Imaging Wollaston prisms and filters and the coronagraphic spots. One component, the Atmospheric Dispersion Corrector (ADC), required a unique triplet design that had until now never been fabricated and tested on sky. We present the design, laboratory, and on-sky results for our triplet ADC. We also present details of the CRO test setup and alignment. Because Magellan is a Gregorian telescope, the ASM is a concave ellipsoidal mirror. By simulating a star with a white light point source at the far conjugate, we can create a double-pass test of the whole system without the need for a real on-sky star. This allows us to test the AO system closed loop in the Arcetri test tower at its nominal design focal length and optical conjugates. The CRO test will also allow us to calibrate and verify the system off-sky at the Magellan telescope during commissioning and periodically thereafter. We present a design for a possible future upgrade path for a new visible Integral Field Spectrograph. By integrating a fiber array bundle at the VisAO focal plane, we can send light to a pre-existing facility spectrograph, such as LDSS3, which will allow 20 mas spatial sampling and R~1,800 spectra over the band 0.6-1.05 µm. This would be the highest spatial resolution IFU to date, either from the ground or in space.
NASA Astrophysics Data System (ADS)
Castro Marín, J. M.; Brown, V. J. G.; López Jiménez, A. C.; Rodríguez Gómez, J.; Rodrigo, R.
2001-05-01
The optical, spectroscopic infrared remote imaging system (OSIRIS) is an instrument carried on board the European Space Agency spacecraft Rosetta, to be launched in January 2003 to study the comet Wirtanen in situ. The electronic design of the mechanism controller board (MCB) system of the two OSIRIS optical cameras, the narrow angle camera and the wide angle camera, is described here. The system comprises two boards mounted on an aluminum frame as part of an electronics box that contains the power supply and the digital processor unit of the instrument. The mechanisms controlled by the MCB for each camera are the front door assembly and a filter wheel assembly. The front door assembly for each camera is driven by a four-phase, permanent-magnet stepper motor. Each filter wheel assembly consists of two wheels of eight filters each, with each wheel driven by a four-phase, variable-reluctance stepper motor. Every motor in all the assemblies also contains a redundant set of four stator phase windings that can be energized separately or in parallel with the main windings. All stepper motors are driven in both directions using the full-step unipolar mode of operation. The MCB also performs general housekeeping data acquisition for the OSIRIS instrument, i.e., mechanism position encoders and temperature measurements. The electronic design is novel in its use of field-programmable gate array (FPGA) devices, which avoid the now-traditional approach of a system controlled by microcontrollers and software. Electrical tests of the engineering model have been performed successfully, and the system is ready for space qualification after environmental testing. This system may be of interest to institutions involved in future space experiments with similar needs for mechanism control.
KAPAO Prime: Design and Simulation
NASA Astrophysics Data System (ADS)
McGonigle, Lorcan; Choi, P. I.; Severson, S. A.; Spjut, E.
2013-01-01
KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration over UV-NIR wavelengths at Pomona College's telescope atop Table Mountain. We present here the final optical system, KAPAO Prime, designed in the Zemax optical design software, which uses custom off-axis paraboloid mirrors (OAPs) to relay light to a Shack-Hartmann wavefront sensor, a deformable mirror, and the science cameras. KAPAO Prime delivers diffraction-limited imaging over the full 81" field of view of our optical camera at f/33, as well as over the smaller field of view of our NIR camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of our optical camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity (632.8 nm). Modeling of the temperature deformation of the breadboard in SolidWorks revealed 70 micron contractions along the edges of the board for a decrease of 75°F; when applied to the OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nm of wavefront error. Flexure modeling of the breadboard due to gravity is ongoing. We hope to begin alignment and testing of KAPAO Prime in Q1 2013.
Electro-optical system for gunshot detection: analysis, concept, and performance
NASA Astrophysics Data System (ADS)
Kastek, M.; Dulski, R.; Madura, H.; Trzaskawka, P.; Bieszczad, G.; Sosnowski, T.
2011-08-01
The paper discusses the technical possibilities of building an effective electro-optical sensor unit for sniper detection using infrared cameras. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. First, an analysis is presented of three distinct phases of sniper activity: before, during, and after the shot. On the basis of experimental data, the parameters defining the relevant sniper signatures were determined, which are essential in assessing the capability of an infrared camera to detect sniper activity. A sniper body and muzzle flash were analyzed as targets, the phenomena that make it possible to detect sniper activities in the infrared spectra were described, and the physical limitations were analyzed. The analyzed infrared systems were simulated using NVTherm software. Calculations were performed for several cameras equipped with different lenses and detector types. The simulation of detection ranges was performed for selected scenarios of sniper detection tasks. After the analysis of the simulation results, the technical specifications of an infrared sniper detection system required to provide the assumed detection range were discussed. Finally, an infrared camera setup was proposed which can detect a sniper at a range of 1000 meters.
Constraining the Optical Emission from the Double Pulsar System J0737-3039
NASA Astrophysics Data System (ADS)
Ferraro, F. R.; Mignani, R. P.; Pallanca, C.; Dalessandro, E.; Lanzoni, B.; Pellizzoni, A.; Possenti, A.; Burgay, M.; Camilo, F.; D'Amico, N.; Lyne, A. G.; Kramer, M.; Manchester, R. N.
2012-04-01
We present the first optical observations of the unique system J0737-3039 (composed of two pulsars, hereafter PSR-A and PSR-B). Ultra-deep optical observations, performed with the High Resolution Camera of the Advanced Camera for Surveys on board the Hubble Space Telescope, could not detect any optical emission from the system down to m_F435W = 27.0 and m_F606W = 28.3. The estimated optical flux limits are used to constrain the three-component (two thermal and one non-thermal) model recently proposed to reproduce the XMM-Newton X-ray spectrum. They suggest the presence of a break at low energies in the non-thermal power-law component of PSR-A and are compatible with the expected blackbody emission from the PSR-B surface. The corresponding efficiency of the optical emission from PSR-A's magnetosphere would be comparable to that of other Myr-old pulsars, thus suggesting that this parameter may not dramatically evolve over a timescale of a few Myr.
Time-resolved X-ray excited optical luminescence using an optical streak camera
NASA Astrophysics Data System (ADS)
Ward, M. J.; Regier, T. Z.; Vogt, J. M.; Gordon, R. A.; Han, W.-Q.; Sham, T. K.
2013-03-01
We report the development of a time-resolved XEOL (TR-XEOL) system that employs an optical streak camera. We have conducted TR-XEOL experiments at the Canadian Light Source (CLS) operating in single bunch mode with a 570 ns dark gap and 35 ps electron bunch pulse, and at the Advanced Photon Source (APS) operating in top-up mode with a 153 ns dark gap and 33.5 ps electron bunch pulse. To illustrate the power of this technique we measured the TR-XEOL of solid-solution nanopowders of gallium nitride - zinc oxide, and for the first time have been able to resolve near-band-gap (NBG) optical luminescence emission from these materials. Herein we will discuss the development of the streak camera TR-XEOL technique and its application to the study of these novel materials.
NASA Astrophysics Data System (ADS)
Ozolinsh, Maris; Paulins, Paulis
2017-09-01
An experimental setup is presented that allows the modeling of conditions in optical devices and in the eye at various degrees of scattering, such as cataract pathology in human eyes. Scattering in cells of polymer-dispersed liquid crystals (PDLCs) and 'Smart Glass' windows is used in the modeling experiments. Both applications serve as optical obstacles placed at different positions along the optical information pathway, either directly on the stimuli demonstration computer screen or mounted directly after the image-formation lens of a digital camera. The degree of scattering is changed continuously by applying an AC voltage of up to 30-80 V to the PDLC cell. The setup uses a camera with 14-bit depth and a 24 mm focal length lens. Light-emitting diodes and diode-pumped solid-state lasers emitting at different wavelengths are used as portable small-divergence light sources in the experiments. Image formation, the optical system's point spread function, modulation transfer functions, and system resolution limits are determined for such sample optical systems in student optics and optometry experimental exercises.
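The PSF-to-MTF step the exercise relies on is a one-line Fourier relation. A minimal sketch, with the sampling interval and a Gaussian test PSF as assumptions:

```python
import numpy as np

def mtf_from_psf(psf_1d, sample_mm):
    """MTF as the normalised magnitude of the Fourier transform of a
    measured 1-D spread function -- the relation used to go from the
    camera's recorded spot to a system resolution limit. Returns
    (frequency [cycles/mm], MTF)."""
    psf = np.asarray(psf_1d, float)
    mtf = np.abs(np.fft.rfft(psf))
    freq = np.fft.rfftfreq(psf.size, d=sample_mm)
    return freq, mtf/mtf[0]

# A Gaussian test PSF sampled at 5 um; increasing the PDLC scattering
# widens the spot and pulls the MTF cut-off toward lower frequencies.
x = np.arange(-32, 32)*0.005                     # positions [mm]
freq, mtf = mtf_from_psf(np.exp(-x**2/(2*0.02**2)), 0.005)
print(freq[mtf < 0.1][0])                        # rough resolution limit
```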
Omnidirectional Underwater Camera Design and Calibration
Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David
2015-01-01
This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS, in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction between the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows the testing of different housing geometries and optics to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows the estimation of all of the relevant camera parameters. Experimental results are presented which illustrate the performance of the calibration method and validate the approach. PMID:25774707
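The heart of such a FOV simulator is refraction of individual rays at the housing interfaces. A minimal sketch using the vector form of Snell's law for a flat air/acrylic/water port (indices and angles are nominal assumptions; the paper's housing is custom):

```python
import numpy as np

def refract(d, n, n1, n2):
    """Vector form of Snell's law: refract unit direction d at a surface
    with unit normal n (pointing toward the incoming-ray side), going
    from index n1 to n2. Returns None on total internal reflection."""
    cos_i = -np.dot(n, d)
    k = 1.0 - (n1/n2)**2*(1.0 - cos_i**2)
    if k < 0.0:
        return None
    return (n1/n2)*d + ((n1/n2)*cos_i - np.sqrt(k))*n

# A ray leaving the lens at 30 deg crosses a flat air/acrylic/water port.
theta = np.radians(30.0)
d = np.array([np.sin(theta), 0.0, np.cos(theta)])
normal = np.array([0.0, 0.0, -1.0])
d = refract(d, normal, 1.000, 1.492)       # air -> acrylic
d = refract(d, normal, 1.492, 1.333)       # acrylic -> water
print(np.degrees(np.arcsin(d[0])))         # ~22 deg: flat port narrows FOV
```

The 30-to-22 degree compression is exactly the effect that breaks the pinhole model and motivates per-ray simulation of candidate housing geometries.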
Lee, Sang-Won; Jeong, Hyun-Woo; Kim, Beop-Min
2010-01-01
We propose a high-speed spectral domain polarization-sensitive optical coherence tomography (SD-PS-OCT) system using a single camera and a 1x2 optical switch in the 1.3 µm region. The PS low-coherence interferometer used in the system is constructed with free-space optics. The reflected horizontal and vertical polarization light rays are delivered via the optical switch to a single spectrometer in turn. Therefore, our system costs less to build than those using dual spectrometers, and the timing and triggering processes are simpler from the viewpoints of both hardware and software. Our SD-PS-OCT has a sensitivity of 101.5 dB, an axial resolution of 8.2 µm, and an acquisition speed of 23,496 A-scans per second. We obtain the intensity, phase retardation, and fast-axis orientation images of a rat tail tendon ex vivo.
SFR test fixture for hemispherical and hyperhemispherical camera systems
NASA Astrophysics Data System (ADS)
Tamkin, John M.
2017-08-01
Optical testing of camera systems in volume production environments can often require expensive tooling and test fixturing. Wide field (fish-eye, hemispheric and hyperhemispheric) optical systems create unique challenges because of the inherent distortion, and difficulty in controlling reflections from front-lit high resolution test targets over the hemisphere. We present a unique design for a test fixture that uses low-cost manufacturing methods and equipment such as 3D printing and an Arduino processor to control back-lit multi-color (VIS/NIR) targets and sources. Special care with LED drive electronics is required to accommodate both global and rolling shutter sensors.
NASA Astrophysics Data System (ADS)
Da Deppo, V.; Naletto, G.; Nicolosi, P.; Zambolin, P.; De Cecco, M.; Debei, S.; Parzianello, G.; Ramous, P.; Zaccariotto, M.; Fornasier, S.; Verani, S.; Thomas, N.; Barthol, P.; Hviid, S. F.; Sebastian, I.; Meller, R.; Sierks, H.; Keller, H. U.; Barbieri, C.; Angrilli, F.; Lamy, P.; Rodrigo, R.; Rickman, H.; Wenzel, K. P.
2017-11-01
Rosetta is one of the cornerstone missions of the European Space Agency, intended to rendezvous with the comet 67P/Churyumov-Gerasimenko in 2014. The imaging instrument on board the satellite is OSIRIS (Optical, Spectroscopic and Infrared Remote Imaging System), a cooperation among several European institutes, which consists of two cameras: a Narrow Angle Camera (NAC) and a Wide Angle Camera (WAC). The WAC optical design is innovative: it adopts an all-reflecting, unvignetted and unobstructed two-mirror configuration which covers a 12° × 12° field of view with an F/5.6 aperture and gives a nominal contrast ratio of about 10^-4. The flight model of this camera has been successfully integrated and tested in our laboratories, and has been integrated on the satellite, which is now awaiting launch in February 2004. In this paper we describe the optical characteristics of the camera and summarize the results obtained so far with the preliminary calibration data. The analysis of the optical performance of this model shows good agreement between theoretical performance and experimental results.
NASA Astrophysics Data System (ADS)
Gogler, Slawomir; Bieszczad, Grzegorz; Krupinski, Michal
2013-10-01
Thermal imagers and the infrared array sensors used in them undergo a calibration procedure, including evaluation of their voltage sensitivity to incident radiation, during manufacturing. Calibration is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not held to such elevated standards, it is still important that the image faithfully represent temperature variations across the scene. The detectors in a thermal camera are illuminated by infrared radiation transmitted through an infrared-transmitting optical system. Often an optical system, when exposed to a uniform Lambertian source, forms a non-uniform irradiation distribution in its image plane. In order to carry out an accurate non-uniformity correction it is essential to correctly predict the irradiation distribution produced by a uniform source. In this article a non-uniformity correction method is presented that takes the optical system's radiometry into account. Predictions of the irradiation distribution have been confronted with measured irradiance values. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
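A sketch of one way such a model can enter a classic two-point non-uniformity correction: the predicted relative irradiance map is divided out before the per-pixel gain and offset are fitted, so genuine optical fall-off is not mistaken for detector non-uniformity. The function shape and the e_rel input are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def two_point_nuc(frames_t1, frames_t2, e_rel=1.0):
    """Two-point non-uniformity correction with an optical irradiance
    model. frames_t1/frames_t2: (n, h, w) frame stacks recorded at two
    blackbody temperatures; e_rel: the optics' predicted relative
    irradiance map, divided out so optical fall-off is not fitted as
    detector non-uniformity. Returns per-pixel (gain, offset);
    corrected = gain*(raw/e_rel) + offset."""
    m1 = frames_t1.mean(axis=0)/e_rel
    m2 = frames_t2.mean(axis=0)/e_rel
    gain = (m2.mean() - m1.mean())/(m2 - m1)
    offset = m1.mean() - gain*m1
    return gain, offset
```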
Quinn, Mark Kenneth; Spinosa, Emanuele; Roberts, David A
2017-07-25
Measurements of pressure-sensitive paint (PSP) have been performed using new, non-scientific-grade imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show the relevant imaging characteristics and the applicability of such imaging technology for PSP. Camera performance is benchmarked against standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access.
Electro-optic holography method for determination of surface shape and deformation
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-06-01
Current demanding engineering analysis and design applications require effective experimental methodologies for the characterization of surface shape and deformation. Such characterization is of primary importance in many applications because these quantities relate to the functionality, performance, and integrity of the objects of interest, especially in view of advances in concurrent engineering. In this paper, a new approach to the characterization of surface shape and deformation using a simple optical setup is described. The approach consists of a fiber-optic-based electro-optic holography (EOH) system built around an IR, temperature-tuned laser diode, a single-mode fiber-optic directional coupler assembly, and a video processing computer. The EOH can be arranged in multiple configurations, which include the three-camera, three-illumination, and speckle correlation modes. In particular, the three-camera mode is described, together with a brief description of the procedures for obtaining quantitative 3D shape and deformation information. A representative application of the three-camera EOH system demonstrates the viability of the approach as an effective engineering tool. A particular feature of this system and the procedure described in this paper is that the 3D quantitative data are written to data files which can be readily interfaced to commercial CAD/CAM environments.
Super-resolution in a defocused plenoptic camera: a wave-optics-based approach.
Sahin, Erdem; Katkovnik, Vladimir; Gotchev, Atanas
2016-03-01
Plenoptic cameras enable the capture of a light field with a single device. However, with traditional light field rendering procedures, they can provide only low-resolution two-dimensional images. Super-resolution is considered to overcome this drawback. In this study, we present a super-resolution method for the defocused plenoptic camera (Plenoptic 1.0), where the imaging system is modeled using wave optics principles and utilizing low-resolution depth information of the scene. We are particularly interested in super-resolution of in-focus and near in-focus scene regions, which constitute the most challenging cases. The simulation results show that the employed wave-optics model makes super-resolution possible for such regions as long as sufficiently accurate depth information is available.
Detection of pointing errors with CMOS-based camera in intersatellite optical communications
NASA Astrophysics Data System (ADS)
Yu, Si-yuan; Ma, Jing; Tan, Li-ying
2005-01-01
For very high data rates, intersatellite optical communications hold a potential performance edge over microwave communications. The acquisition and tracking problem is critical because of the narrow transmit beam. In some systems a single array detector performs both spatial acquisition and tracking to detect pointing errors, so both a wide field of view and a high update rate are required. Past systems tended to employ CCD-based cameras with complex readout arrangements, but the additional complexity reduces the applicability of the array-based tracking concept. With the development of CMOS arrays, CMOS-based cameras can implement the single-array-detector concept. The area-of-interest feature of a CMOS-based camera allows a PAT system to read out only a portion of the array, and the maximum allowed frame rate increases as the size of the area of interest decreases. A commercially available CMOS camera with 105 fps @ 640×480 is employed in our PAT simulation system, in which only part of the pixels are in fact used. Beams arriving at varying angles within the field of view are detected after passing through a Cassegrain telescope and focusing optics. Spot pixel values (8 bits per pixel) read out from the CMOS sensor are transmitted to a DSP subsystem via an IEEE 1394 bus, and pointing errors are computed with the centroid equation. Tests showed that: (1) 500 fps @ 100×100 is available in acquisition when the field of view is 1 mrad; (2) 3k fps @ 10×10 is available in tracking when the field of view is 0.1 mrad.
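The "centroid equation" applied to the area-of-interest pixels is the intensity-weighted mean position of the spot. A minimal sketch (the background threshold is an assumed value):

```python
import numpy as np

def spot_centroid(roi, threshold=10):
    """Intensity-weighted centroid of the beacon spot in an
    area-of-interest window -- the usual form of the centroid equation
    applied to the 8-bit pixel values. Pixels at or below `threshold`
    (an assumption) are zeroed so background noise does not bias the
    estimate. Returns (x, y) in pixels."""
    w = np.where(roi > threshold, roi.astype(float), 0.0)
    ys, xs = np.indices(roi.shape)
    return (xs*w).sum()/w.sum(), (ys*w).sum()/w.sum()

# The pointing error is this centroid minus the boresight pixel, scaled
# by the per-pixel field of view of the telescope and focusing optics.
```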
Single-Fiber Optical Link For Video And Control
NASA Technical Reports Server (NTRS)
Galloway, F. Houston
1993-01-01
A single optical fiber carries control signals to remote television cameras and video signals from the cameras. The fiber replaces multiconductor copper cable, with a consequent reduction in size, and no repeaters are needed. The system works with either multimode or single-mode fiber. The nonmetallic fiber provides immunity to electromagnetic interference at suboptical frequencies and is much less vulnerable to electronic eavesdropping and lightning strikes. The multigigahertz bandwidth is more than adequate for high-resolution television signals.
Optical stereo video signal processor
NASA Technical Reports Server (NTRS)
Craig, G. D. (Inventor)
1985-01-01
An optical video signal processor is described which produces, in real time, a two-dimensional cross-correlation of images received by a stereo camera system. The optical image from each camera is projected onto a respective liquid crystal light valve. The images on the light valves modulate light produced by an extended light source. This modulated light output, focused onto a video detector, becomes the two-dimensional cross-correlation and is a function of the range of a target with respect to the stereo camera. Alternate embodiments use the two-dimensional cross-correlation to determine target movement and target identification.
Optomechanical System Development of the AWARE Gigapixel Scale Camera
NASA Astrophysics Data System (ADS)
Son, Hui S.
Electronic focal plane arrays (FPA) such as CMOS and CCD sensors have dramatically improved to the point that digital cameras have essentially phased out film (except in very niche applications such as hobby photography and cinema). However, the traditional method of mating a single lens assembly to a single detector plane, as required for film cameras, is still the dominant design used in cameras today. The use of electronic sensors and their ability to capture digital signals that can be processed and manipulated post acquisition offers much more freedom of design at system levels and opens up many interesting possibilities for the next generation of computational imaging systems. The AWARE gigapixel scale camera is one such computational imaging system. By utilizing a multiscale optical design, in which a large aperture objective lens is mated with an array of smaller, well corrected relay lenses, we are able to build an optically simple system that is capable of capturing gigapixel scale images via post acquisition stitching of the individual pictures from the array. Properly shaping the array of digital cameras allows us to form an effectively continuous focal surface using off the shelf (OTS) flat sensor technology. This dissertation details developments and physical implementations of the AWARE system architecture. It illustrates the optomechanical design principles and system integration strategies we have developed through the course of the project by summarizing the results of the two design phases for AWARE: AWARE-2 and AWARE-10. These systems represent significant advancements in the pursuit of scalable, commercially viable snapshot gigapixel imaging systems and should serve as a foundation for future development of such systems.
NASA Astrophysics Data System (ADS)
Druart, Guillaume; Matallah, Noura; Guerineau, Nicolas; Magli, Serge; Chambon, Mathieu; Jenouvrier, Pierre; Mallet, Eric; Reibel, Yann
2014-06-01
Today, both military and civilian applications require miniaturized optical systems in order to give an imagery function to vehicles with small payload capacity. After the development of megapixel focal plane arrays (FPA) with micro-sized pixels, this miniaturization will become feasible with the integration of optical functions in the detector area. In the field of cooled infrared imaging systems, the detector area is the Detector-Dewar-Cooler Assembly (DDCA). SOFRADIR and ONERA have launched a new research and innovation partnership, called OSMOSIS, to develop disruptive technologies for DDCA to improve the performance and compactness of optronic systems. With this collaboration, we will break down the technological barriers of DDCA, a sealed and cooled environment dedicated to the infrared detectors, to explore Dewar-level integration of optics. This technological breakthrough will bring more compact multipurpose thermal imaging products, as well as new thermal capabilities such as 3D imagery or multispectral imagery. Previous developments will be recalled (SOIE and FISBI cameras) and new developments will be presented. In particular, we will focus on a dual-band MWIR-LWIR camera and a multichannel camera.
Minimizing camera-eye optical aberrations during the 3D reconstruction of retinal structures
NASA Astrophysics Data System (ADS)
Aldana-Iuit, Javier; Martinez-Perez, M. Elena; Espinosa-Romero, Arturo; Diaz-Uribe, Rufino
2010-05-01
3D reconstruction of blood vessels is a powerful visualization tool for physicians, since it allows them to refer to a qualitative representation of their subject of study. In this paper we propose a 3D reconstruction method for retinal vessels from fundus images. The reconstruction method proposed herein uses images of the same retinal structure in epipolar geometry. Images are preprocessed by the RISA system to segment blood vessels and obtain feature points for correspondences. The point correspondence problem is solved using correlation. LMedS analysis and the Graph Transformation Matching algorithm are used for outlier suppression. Camera projection matrices are computed with the normalized eight-point algorithm. Finally, we retrieve the 3D position of the retinal tree points by linear triangulation. In order to increase the power of visualization, 3D tree skeletons are represented by surfaces via generalized cylinders whose radii correspond to morphological measurements obtained by RISA. In this paper the complete calibration process, including the fundus camera and the optical properties of the eye, the so-called camera-eye system, is proposed. On one hand, the internal parameters of the fundus camera are obtained by classical algorithms using a reference pattern. On the other hand, we minimize the undesirable effects of the aberrations induced by the eyeball optical system, assuming that a contact enlarging lens corrects astigmatism, spherical and coma aberrations are reduced by changing the aperture size, and eye refractive errors are suppressed by adjusting camera focus during image acquisition. Evaluation of two self-calibration proposals and results of 3D blood vessel surface reconstruction are presented.
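As a hedged sketch of the geometric core of such a pipeline (not the RISA-specific processing), the normalized eight-point algorithm and linear triangulation can be chained with standard OpenCV calls. The inputs pts1/pts2 (Nx2 float arrays of corresponding feature points) and the intrinsic matrix K are assumed here.

import numpy as np
import cv2

def reconstruct_3d(pts1, pts2, K):
    # Eight-point estimate of the fundamental matrix from point correspondences.
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)
    E = K.T @ F @ K                     # essential matrix via the intrinsics
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at origin
    P2 = K @ np.hstack([R, t])                         # relative pose of second
    X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)  # homogeneous 4xN result
    return (X[:3] / X[3]).T                            # Euclidean 3D points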
ERIC Educational Resources Information Center
Ruiz, Michael J.
1982-01-01
The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.…
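One of the theoretical predictions such experiments are typically compared against comes directly from the thin lens equation, 1/f = 1/d_o + 1/d_i, with magnification m = -d_i/d_o. The sketch below uses illustrative numbers, not values from the article.

# Thin-lens prediction for a 50 mm lens focused on a subject 2 m away.
def thin_lens_image_distance(f_mm, object_distance_mm):
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

d_i = thin_lens_image_distance(50.0, 2000.0)
m = -d_i / 2000.0
print(f"image distance {d_i:.1f} mm, magnification {m:.4f}")
# -> image distance 51.3 mm, magnification -0.0256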
Optical character recognition of camera-captured images based on phase features
NASA Astrophysics Data System (ADS)
Diaz-Escobar, Julia; Kober, Vitaly
2015-09-01
Nowadays most digital information is acquired using mobile devices, especially smartphones. In particular, this creates opportunities for optical character recognition in camera-captured images. For this reason many recognition applications have been developed recently, such as recognition of license plates, business cards, receipts and street signs; document classification; augmented reality; language translation; and so on. Camera-captured images are usually affected by geometric distortions, nonuniform illumination, shadow, and noise, which make the recognition task difficult for existing systems. It is well known that the Fourier phase contains a great deal of important information independent of the Fourier magnitude. So, in this work we propose a phase-based recognition system exploiting phase-congruency features for illumination/scale invariance. The performance of the proposed system is tested in terms of misclassifications and false alarms with the help of computer simulation.
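As a minimal illustration of why Fourier phase matters (not the paper's full phase-congruency pipeline), phase-only correlation discards the magnitude spectrum and is therefore largely insensitive to uniform illumination changes; a and b are assumed grayscale image arrays.

import numpy as np

def phase_correlation(a, b):
    # Keep only the phase of the cross-power spectrum; magnitude is discarded.
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    # The correlation peak location estimates the translation between a and b.
    return np.unravel_index(np.argmax(corr), corr.shape)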
Diffraction-based optical sensor detection system for capture-restricted environments
NASA Astrophysics Data System (ADS)
Khandekar, Rahul M.; Nikulin, Vladimir V.
2008-04-01
The use of digital cameras and camcorders in prohibited areas presents a growing problem. Piracy in movie theaters results in huge revenue losses to the motion picture industry every year, but still image and video capture may present an even bigger threat if performed in high-security locations. While several attempts are being made to address this issue, an effective solution is yet to be found. We propose to approach this problem using a very commonly observed optical phenomenon. Cameras and camcorders use CCD and CMOS sensors, which include a number of photosensitive elements/pixels arranged in a regular fashion: photosites in CCD sensors and semiconductor elements in CMOS sensors. These are known to reflect a small fraction of incident light, but can also act as a diffraction grating, producing an optical response that can be used to identify the presence of such a sensor. A laser-based detection system is proposed that accounts for the elements in the optical train of the camera, as well as the eye safety of people who could be exposed to the optical beam. This paper presents preliminary experimental data, as well as proof-of-concept simulation results.
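A back-of-the-envelope sketch of the effect being exploited: the periodic pixel grid behaves like a reflective grating with period equal to the pixel pitch, so retro-reflected diffraction orders appear at angles satisfying d·sin(θ) = mλ. The wavelength and pitch below are illustrative assumptions, not values from the paper.

import math

wavelength_m = 532e-9   # assumed green probe laser
pitch_m = 2.2e-6        # assumed CMOS pixel pitch (grating period d)
for m in range(1, 4):
    s = m * wavelength_m / pitch_m
    if s <= 1.0:  # only propagating orders
        print(f"diffraction order {m}: {math.degrees(math.asin(s)):.1f} deg")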
Versatile microsecond movie camera
NASA Astrophysics Data System (ADS)
Dreyfus, R. W.
1980-03-01
A laboratory-type movie camera is described which satisfies many requirements in the range 1 microsec to 1 sec. The camera consists of a He-Ne laser and compatible state-of-the-art components; the primary components are an acoustooptic modulator, an electromechanical beam deflector, and a video tape system. The present camera is distinct in its operation in that submicrosecond laser flashes freeze the image motion while still allowing the simplicity of electromechanical image deflection in the millisecond range. The gating and pulse delay circuits of an oscilloscope synchronize the modulator and scanner relative to the subject being photographed. The optical table construction and electronic control enhance the camera's versatility and adaptability. The instant replay video tape recording allows for easy synchronization and immediate viewing of the results. Economy is achieved by using off-the-shelf components, optical table construction, and short assembly time.
NASA Astrophysics Data System (ADS)
Lee, Sang-Won; Jeong, Hyun-Woo; Kim, Beop-Min
2010-02-01
We demonstrated high-speed spectral domain polarization-sensitive optical coherence tomography (SD-PSOCT) in the 1.3-μm region using a single InGaAs line-scan camera and an optical switch. The polarization-sensitive low-coherence interferometer in the system was based on the original free-space PS-OCT system published by Hee et al. The horizontal and vertical polarization components split by a polarization beam splitter were delivered through an optical switch and detected in turn by a single spectrometer instead of dual spectrometers. The SD-PSOCT system had an axial resolution of 8.2 μm, a sensitivity of 101.5 dB, and an acquisition speed of 23,496 A-lines/s. We obtained the intensity, phase retardation, and fast-axis orientation images of a biological tissue. In addition, we calculated the averaged axial profiles of the phase retardation in human skin.
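Assuming the standard free-space PS-OCT relations (following Hee et al.), the three reported image types follow from the demodulated amplitudes and phases of the two polarization channels; this is a hedged sketch, not the authors' exact processing chain.

import numpy as np

def psoct_images(A_h, A_v, phi_h, phi_v):
    # A_h, A_v: demodulated amplitudes of the horizontal/vertical channels;
    # phi_h, phi_v: the corresponding phases (all arrays over depth).
    intensity = A_h**2 + A_v**2                  # polarization-insensitive reflectivity
    retardation = np.arctan2(A_v, A_h)           # phase retardation, 0..pi/2
    fast_axis = 0.5 * (np.pi - (phi_v - phi_h))  # fast-axis orientation estimate
    return intensity, retardation, fast_axis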
Mechanical Design of the LSST Camera
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordby, Martin; Bowden, Gordon; Foss, Mike
2008-06-13
The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It is comprised of three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.
The development of large-aperture test system of infrared camera and visible CCD camera
NASA Astrophysics Data System (ADS)
Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying
2015-10-01
Infrared camera and CCD camera dual-band imaging systems are widely used in many types of equipment and applications. If such a system is tested using the traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces both the cost and the time for installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position as the environmental temperature changes, which also improves the image quality of the large-field-of-view collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost. It should have a good market.
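The multiple-frame averaging step can be sketched as follows: for a static target, averaging N frames reduces zero-mean random noise by roughly √N. The grab_frame callable is a hypothetical frame-grabber interface, not part of the described system.

import numpy as np

def averaged_frame(grab_frame, n_frames=32):
    # Average n_frames of a static scene; noise std drops ~ 1/sqrt(n_frames).
    frames = [grab_frame().astype(np.float64) for _ in range(n_frames)]
    return np.mean(frames, axis=0)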
Cryogenic optical systems for the rapid infrared imager/spectrometer (RIMAS)
NASA Astrophysics Data System (ADS)
Capone, John I.; Content, David A.; Kutyrev, Alexander S.; Robinson, Frederick D.; Lotkin, Gennadiy N.; Toy, Vicki L.; Veilleux, Sylvain; Moseley, Samuel H.; Gehrels, Neil A.; Vogel, Stuart N.
2014-07-01
The Rapid Infrared Imager/Spectrometer (RIMAS) is designed to perform follow-up observations of transient astronomical sources at near infrared (NIR) wavelengths (0.9 - 2.4 microns). In particular, RIMAS will be used to perform photometric and spectroscopic observations of gamma-ray burst (GRB) afterglows to complement the Swift satellite's science goals. Upon completion, RIMAS will be installed on Lowell Observatory's 4.3 meter Discovery Channel Telescope (DCT) located in Happy Jack, Arizona. The instrument's optical design includes a collimator lens assembly, a dichroic to divide the wavelength coverage into two optical arms (0.9 - 1.4 microns and 1.4 - 2.4 microns respectively), and a camera lens assembly for each optical arm. Because the wavelength coverage extends out to 2.4 microns, all optical elements are cooled to ~70 K. Filters and transmission gratings are located on wheels prior to each camera, allowing the instrument to be quickly configured for photometry or spectroscopy. An athermal optomechanical design is being implemented to prevent the lenses from losing their room-temperature alignment as the system is cooled. The thermal expansion of the materials used in this design has been measured in the lab. Additionally, RIMAS has a guide camera consisting of four lenses to aid observers in passing light from target sources through spectroscopic slits. Efforts to align these optics are ongoing.
Geometrical calibration of television measuring systems with solid-state photodetectors
NASA Astrophysics Data System (ADS)
Matiouchenko, V. G.; Strakhov, V. V.; Zhirkov, A. O.
2000-11-01
Various optical measuring methods for deriving information about the size and form of objects are now used in different fields: mechanical engineering, medicine, art, and criminalistics. Measurement by means of digital television systems is one of these methods. The development of this approach has been promoted by the appearance on the market of small-sized television cameras and frame grabbers of various types and costs. Many television measuring systems use expensive cameras, but the accuracy performance of low-cost cameras is also of interest to system developers. For this reason, the inexpensive board-level (unhoused) camera SK1004CP (1/3-inch format, costing up to $40) and the Aver2000 frame grabber were used in the experiments.
Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.
2010-01-01
Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475
The system analysis of light field information collection based on the light field imaging
NASA Astrophysics Data System (ADS)
Wang, Ye; Li, Wenhua; Hao, Chenyang
2016-10-01
Augmented reality (AR) technology is becoming a focus of study, and the AR effect of light field imaging makes research on light field cameras attractive. Micro-array structures have been adopted in most light field information acquisition systems (LFIAS) since the emergence of the light field camera, mainly including micro lens array (MLA) and micro pinhole array (MPA) systems. This paper reviews the LFIAS structures commonly used in light field cameras in recent years, analyzed on the basis of geometrical optics. Meanwhile, this paper presents a novel LFIAS, a plane grating system, which we call a "micro aperture array (MAA)." This LFIAS is analyzed using information optics; the paper shows that there is little difference among the multiple images produced by the plane grating system, and that the plane grating system can collect and record the amplitude and phase information of the light field.
A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.
Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi
2016-08-30
This paper proposes a novel infrared camera array guidance system with the capability to track and provide the real-time position and speed of a fixed-wing unmanned aerial vehicle (UAV) during a landing process. The system mainly includes three novel parts: (1) an infrared camera array and near-infrared laser lamp based cooperative long-range optical imaging module; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with fixed-wing flights demonstrate that our infrared camera array system has the unique ability to guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for automatic, accurate UAV landing in Global Positioning System (GPS)-denied environments.
Lensless imaging for wide field of view
NASA Astrophysics Data System (ADS)
Nagahara, Hajime; Yagi, Yasushi
2015-02-01
It is desirable to engineer a small camera with a wide field of view (FOV) because of current developments in the field of wearable cameras and computing products, such as action cameras and Google Glass. However, typical approaches for achieving a wide FOV, such as attaching a fisheye lens or convex mirrors, require a trade-off between optics size and FOV. We propose camera optics that achieve a wide FOV while remaining small and lightweight. The proposed optics are a completely lensless, catoptric design. They contain four mirrors: two for wide viewing, and two for focusing the image on the camera sensor. The proposed optics are simple and easily miniaturized, since they use only mirrors and are therefore not susceptible to chromatic aberration. We have implemented prototype optics of our lensless concept, attached them to commercial charge-coupled device/complementary metal oxide semiconductor cameras, and conducted experiments to evaluate the feasibility of our proposed optics.
Towards designing an optical-flow based colonoscopy tracking algorithm: a comparative study
NASA Astrophysics Data System (ADS)
Liu, Jianfei; Subramanian, Kalpathi R.; Yoo, Terry S.
2013-03-01
Automatic co-alignment of optical and virtual colonoscopy images can supplement traditional endoscopic procedures, by providing more complete information of clinical value to the gastroenterologist. In this work, we present a comparative analysis of our optical flow based technique for colonoscopy tracking, in relation to current state of the art methods, in terms of tracking accuracy, system stability, and computational efficiency. Our optical-flow based colonoscopy tracking algorithm starts with computing multi-scale dense and sparse optical flow fields to measure image displacements. Camera motion parameters are then determined from the optical flow fields by employing a Focus of Expansion (FOE) constrained egomotion estimation scheme. We analyze the design choices involved in the three major components of our algorithm: dense optical flow, sparse optical flow, and egomotion estimation. Brox's optical flow method [1], due to its high accuracy, was used to compare and evaluate our multi-scale dense optical flow scheme. SIFT [6] and Harris-affine [7] features were used to assess the accuracy of the multi-scale sparse optical flow, because of their wide use in tracking applications; the FOE-constrained egomotion estimation was compared with collinear [2], image deformation [10] and image derivative [4] based egomotion estimation methods, to understand the stability of our tracking system. Two virtual colonoscopy (VC) image sequences were used in the study, since the exact camera parameters (for each frame) were known; dense optical flow results indicated that Brox's method was superior to multi-scale dense optical flow in estimating camera rotational velocities, but the final tracking errors were comparable, viz., 6 mm vs. 8 mm after the VC camera traveled 110 mm. Our approach was computationally more efficient, averaging 7.2 sec vs. 38 sec per frame. SIFT and Harris-affine features resulted in tracking errors of up to 70 mm, while our sparse optical flow error was 6 mm. The comparison among egomotion estimation algorithms showed that our FOE-constrained egomotion estimation method achieved the optimal balance between tracking accuracy and robustness. The comparative study demonstrated that our optical-flow based colonoscopy tracking algorithm maintains good accuracy and stability for routine use in clinical practice.
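A hedged sketch of the first stages of such a pipeline: dense optical flow between consecutive frames, then a least-squares focus-of-expansion estimate (each flow vector defines a line through its pixel; under forward motion these lines meet at the FOE). OpenCV's Farneback flow stands in here for the paper's multi-scale scheme.

import numpy as np
import cv2

def foe_from_frames(prev_gray, curr_gray):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    u, v = flow[..., 0].ravel(), flow[..., 1].ravel()
    x, y = xs.ravel().astype(np.float64), ys.ravel().astype(np.float64)
    # Line through (x, y) with direction (u, v): v*x0 - u*y0 = v*x - u*y.
    A = np.stack([v, -u], axis=1)
    b = v * x - u * y
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe  # (x0, y0) focus of expansion in pixel coordinates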
Air-borne shape measurement of parabolic trough collector fields
NASA Astrophysics Data System (ADS)
Prahl, Christoph; Röger, Marc; Hilgert, Christoph
2017-06-01
The optical and thermal efficiency of parabolic trough collector solar fields depends on the performance and assembly accuracy of components such as the concentrator and absorber. For the purposes of optical inspection/approval, yield analysis, localization of low-performing areas, and optimization of the solar field, it is essential to create a complete view of the optical properties of the field. Existing optical measurement tools are based on ground-based cameras, and face restrictions concerning speed, volume, and automation. QFly is an airborne qualification system which provides holistic and accurate information on the geometrical, optical, and thermal properties of the entire solar field. It consists of an unmanned aerial vehicle, cameras, and related software for flight path planning, data acquisition, and evaluation. This article presents recent advances in the QFly measurement system and proposes a methodology for holistic qualification of the complete solar field with minimum impact on plant operation.
NASA Astrophysics Data System (ADS)
Mi, Yuhe; Huang, Yifan; Li, Lin
2015-08-01
Based on the beacon photogrammetry location technique, a Dual Camera Photogrammetry (DCP) algorithm was used to assist helicopters landing on a ship. In this paper, ZEMAX was used to simulate two Charge Coupled Device (CCD) cameras imaging four beacons on both sides of the helicopter and to output the images to MATLAB. Target coordinate systems, image pixel coordinate systems, world coordinate systems, and camera coordinate systems were established respectively. According to the ideal pinhole imaging model, the rotation matrix and translation vector between the target coordinate system and the camera coordinate system could be obtained by using MATLAB to process the image information and solve the linear equations. On this basis, the ambient temperature and the positions of the beacons and cameras were varied in ZEMAX to test the accuracy of the DCP algorithm in complex sea states. The numerical simulation shows that in complex sea states, the position measurement accuracy can meet the requirements of the project.
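The pose step can be sketched with standard computer-vision tooling in place of the paper's MATLAB linear solve: given the four beacon positions in the helicopter (target) frame and their detected pixel coordinates, an ideal pinhole model yields the rotation and translation. The beacon layout, pixel values, and intrinsics below are placeholders, not values from the paper.

import numpy as np
import cv2

beacons_3d = np.array([[-2.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                       [0.0, -1.5, 0.0], [0.0, 1.5, 0.0]])  # assumed beacon layout (m)
pixels = np.array([[310.0, 242.0], [332.0, 241.0],
                   [321.0, 230.0], [320.0, 254.0]])          # detected image points
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])                              # assumed intrinsics

ok, rvec, tvec = cv2.solvePnP(beacons_3d, pixels, K, None)
R, _ = cv2.Rodrigues(rvec)  # rotation matrix (target frame -> camera frame)
# tvec is the translation of the target-frame origin in camera coordinates.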
Design, demonstration and testing of low F-number LWIR panoramic imaging relay optics
NASA Astrophysics Data System (ADS)
Furxhi, Orges; Frascati, Joe; Driggers, Ronald
2018-04-01
Panoramic imaging is inherently wide field of view. High-sensitivity uncooled Long Wave Infrared (LWIR) imaging requires low F-number optics. These two requirements result in short back working distance designs that, in addition to being costly, are challenging to integrate with commercially available uncooled LWIR cameras and cores. Common challenges include the relocation of the shutter flag, custom calibration of the camera dynamic range and NUC tables, focusing, and athermalization. Solutions to these challenges add to the system cost and make panoramic uncooled LWIR cameras commercially unattractive. In this paper, we present the design of Panoramic Imaging Relay Optics (PIRO) and show imagery and test results from one of the first prototypes. PIRO designs use several reflective surfaces (generally two) to relay a panoramic scene onto a real, donut-shaped image. The PIRO donut is imaged on the focal plane of the camera using a commercial-off-the-shelf (COTS) low F-number lens. This approach results in low component cost and effortless integration with pre-calibrated, commercially available cameras and lenses.
Integration of image capture and processing: beyond single-chip digital camera
NASA Astrophysics Data System (ADS)
Lim, SukHwan; El Gamal, Abbas
2001-05-01
An important trend in the design of digital cameras is the integration of capture and processing onto a single CMOS chip. Although integrating the components of a digital camera system onto a single chip significantly reduces system size and power, it does not fully exploit the potential advantages of integration. We argue that a key advantage of integration is the ability to exploit the high-speed imaging capability of the CMOS image sensor to enable new applications such as multiple capture for enhancing dynamic range, and to improve the performance of existing applications such as optical flow estimation. Conventional digital cameras operate at low frame rates, and it would be too costly, if not infeasible, to operate their chips at high frame rates. Integration solves this problem. The idea is to capture images at much higher frame rates than the standard frame rate, process the high frame rate data on chip, and output the video sequence and the application-specific data at the standard frame rate. This idea is applied to optical flow estimation, where significant performance improvements are demonstrated over methods using standard frame rate sequences. We then investigate the constraints on memory size and processing power that can be integrated with a CMOS image sensor in a 0.18 micrometer process and below. We show that enough memory and processing power can be integrated not only to perform the functions of a conventional camera system but also to perform applications such as real-time optical flow estimation.
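A minimal sketch of the multiple-capture idea named above: sample the scene several times within one frame period at increasing exposures, and for each pixel keep the longest unsaturated reading scaled to a common radiance estimate. The capture interface and full-well value are hypothetical.

import numpy as np

def multi_capture_hdr(captures, exposure_times, full_well=4095):
    # captures: raw frames of the same scene at the given exposure times.
    best = np.zeros(captures[0].shape, dtype=np.float64)
    for frame, t in sorted(zip(captures, exposure_times), key=lambda p: p[1]):
        unsaturated = frame < full_well
        best[unsaturated] = frame[unsaturated] / t  # longer exposures overwrite
    return best  # per-pixel radiance estimate with extended dynamic range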
Camera, handlens, and microscope optical system for imaging and coupled optical spectroscopy
NASA Technical Reports Server (NTRS)
Mungas, Greg S. (Inventor); Boynton, John (Inventor); Sepulveda, Cesar A. (Inventor); Nunes de Sepulveda, legal representative, Alicia (Inventor); Gursel, Yekta (Inventor)
2012-01-01
An optical system comprising two lens cells, each lens cell comprising multiple lens elements, to provide imaging over a very wide image distance and within a wide range of magnification by changing the distance between the two lens cells. An embodiment also provides scannable laser spectroscopic measurements within the field-of-view of the instrument.
Camera, handlens, and microscope optical system for imaging and coupled optical spectroscopy
NASA Technical Reports Server (NTRS)
Mungas, Greg S. (Inventor); Boynton, John (Inventor); Sepulveda, Cesar A. (Inventor); Nunes de Sepulveda, Alicia (Inventor); Gursel, Yekta (Inventor)
2011-01-01
An optical system comprising two lens cells, each lens cell comprising multiple lens elements, to provide imaging over a very wide image distance and within a wide range of magnification by changing the distance between the two lens cells. An embodiment also provides scannable laser spectroscopic measurements within the field-of-view of the instrument.
Solid state electro-optic color filter and iris
NASA Technical Reports Server (NTRS)
1974-01-01
Test results obtained have confirmed the practicality of the solid state electro-optic filters as an optical control element in a television system. Neutral-density control range in excess of 1000:1 has been obtained on sample filters. Test results, measurements in a complete camera system, discussions of problem areas, analytical comparisons, and recommendations for future investigations are included.
Time-Of-Flight Camera, Optical Tracker and Computed Tomography in Pairwise Data Registration.
Pycinski, Bartlomiej; Czajkowska, Joanna; Badura, Pawel; Juszczyk, Jan; Pietka, Ewa
2016-01-01
A growing number of medical applications, including minimally invasive surgery, depend on multi-modal or multi-sensor data processing. Fast and accurate 3D scene analysis, comprising data registration, seems to be crucial for the development of computer-aided diagnosis and therapy. The advancement of surface tracking systems based on optical trackers already plays an important role in surgical procedure planning. However, new modalities, like time-of-flight (ToF) sensors, widely explored in non-medical fields, are powerful and have the potential to become a part of computer-aided surgery set-ups. Connecting different acquisition systems promises to provide valuable support for operating room procedures. Therefore, a detailed analysis of the accuracy of such multi-sensor positioning systems is needed. We present a system combining pre-operative CT series with intra-operative ToF-sensor and optical tracker point clouds. The methodology comprises: optical sensor set-up and ToF-camera calibration procedures, data pre-processing algorithms, and a registration technique. The data pre-processing yields a surface in the case of CT, and point clouds for the ToF-sensor and the marker-driven optical tracker representations of an object of interest. The applied registration technique is based on the Iterative Closest Point (ICP) algorithm. The experiments validate the registration of each pair of modalities/sensors involving phantoms of four different human organs in terms of Hausdorff distance and mean absolute distance metrics. The best surface alignment was obtained for the CT and optical tracker combination, whereas the worst was for experiments involving the ToF-camera. The obtained accuracies encourage further development of multi-sensor systems. The presented discussion of the system's limitations and possible improvements, mainly related to the depth information produced by the ToF-sensor, is useful for computer-aided surgery developers.
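A compact sketch of the registration core under stated assumptions (point-to-point ICP with nearest-neighbor matching and the SVD/Kabsch rigid-transform solve); production systems add the outlier handling and calibration steps described above.

import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iters=50):
    # source, target: Nx3 and Mx3 clouds; returns rigid R, t mapping source onto target.
    src = source.copy()
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        _, idx = tree.query(src)          # nearest target point for each source point
        tgt = target[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                # best-fit rotation (Kabsch)
        t = mu_t - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total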
A detailed comparison of single-camera light-field PIV and tomographic PIV
NASA Astrophysics Data System (ADS)
Shi, Shengxian; Ding, Junfei; Atkinson, Callum; Soria, Julio; New, T. H.
2018-03-01
This paper presents a comprehensive comparison between single-camera light-field particle image velocimetry (LF-PIV) and multi-camera tomographic particle image velocimetry (Tomo-PIV). Simulation studies were first performed using synthetic light-field and tomographic particle images, which extensively examine the differences between these two techniques by varying key parameters such as the pixel to microlens ratio (PMR), the light-field camera to Tomo-camera pixel ratio (LTPR), the particle seeding density, and the number of tomographic cameras. Simulation results indicate that single-camera LF-PIV can achieve accuracy consistent with that of multi-camera Tomo-PIV, but requires a greater overall number of pixels. Experimental studies were then conducted by simultaneously measuring a low-speed jet flow with single-camera LF-PIV and four-camera Tomo-PIV systems. The experiments confirm that, given a sufficiently high pixel resolution, a single-camera LF-PIV system can indeed deliver volumetric velocity field measurements for an equivalent field of view with a spatial resolution commensurate with that of a multi-camera Tomo-PIV system, enabling accurate 3D measurements in applications where optical access is limited.
Plenoptic camera wavefront sensing with extended sources
NASA Astrophysics Data System (ADS)
Jiang, Pengzhi; Xu, Jieping; Liang, Yonghui; Mao, Hongjun
2016-09-01
The wavefront sensor is used in adaptive optics to detect atmospheric distortion, which is fed back to the deformable mirror to compensate for this distortion. Unlike the Shack-Hartmann sensor, which has been widely used with point sources, the plenoptic camera wavefront sensor has been proposed in recent years as an alternative wavefront sensor suitable for extended objects. In this paper, plenoptic camera wavefront sensing with extended sources is discussed systematically. Simulations are performed to investigate the wavefront measurement error and the closed-loop performance of the plenoptic sensor. The results show that there are an optimal lenslet size and an optimal number of pixels that yield the best performance. The RMS of the resulting corrected wavefront in the closed-loop adaptive optics system is less than 108 nm (0.2λ) when D/r0 ≤ 10 and the magnitude M ≤ 5. Our investigation indicates that the plenoptic sensor operates efficiently on extended sources in a closed-loop adaptive optics system.
NASA Astrophysics Data System (ADS)
Dani, Tiar; Rachman, Abdul; Priyatikanto, Rhorom; Religia, Bahar
2015-09-01
The increasing amount of space junk in orbit has raised the chances of debris falling in the Indonesian region. So far, three pieces of rocket-body debris have been found, in Bengkulu, Gorontalo, and Lampung. LAPAN has successfully developed software for monitoring space debris that passes over Indonesia at altitudes below 200 km. To support this software-based system, a hardware-based system built on optical instruments has been developed. The system, under development since early 2014, consists of two subsystems: a telescopic system and a wide-field system. The telescopic system uses CCD cameras and a reflecting telescope with relatively high sensitivity. The wide-field system uses DSLR cameras, binoculars, and a combination of a CCD with a DSLR lens. Methods and preliminary results of the systems will be presented.
A portable W-band radar system for enhancement of infrared vision in fire fighting operations
NASA Astrophysics Data System (ADS)
Klenner, Mathias; Zech, Christian; Hülsmann, Axel; Kühn, Jutta; Schlechtweg, Michael; Hahmann, Konstantin; Kleiner, Bernhard; Ulrich, Michael; Ambacher, Oliver
2016-10-01
In this paper, we present a millimeter wave radar system which will enhance the performance of infrared cameras used for fire-fighting applications. The radar module is compact and lightweight such that the system can be combined with inertial sensors and integrated in a hand-held infrared camera. This allows for precise distance measurements in harsh environmental conditions, such as tunnel or industrial fires, where optical sensors are unreliable or fail. We discuss the design of the RF front-end, the antenna and a quasi-optical lens for beam shaping as well as signal processing and demonstrate the performance of the system by in situ measurements in a smoke filled environment.
Kampik, A; Rapp, J
1979-02-01
A method of cinematography of the ocular fundus is introduced which, by connecting a camera to an indirect ophthalmoscope, allows recording of the monocular picture of the fundus as produced by the ophthalmic lens.
NASA Astrophysics Data System (ADS)
Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Yasushi, Kondo
2008-11-01
Visualization of explosion phenomena is very important and essential to evaluating the performance of explosive effects. The phenomena, however, generate blast waves and fragments from casings, so we must protect our visualization equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in severe blast environments, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to images taken with the lens directly coupled to the camera head. This confirms that the system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualization at angles that would be unachievable under normal circumstances.
1988-11-01
...point the sensor line of sight to a target. Both optical systems look out through windows... (...oxidizers.) The stability of the booster plume... The optical layout for the UV camera is as shown in Figure 1. ...tracking the UV plume outside the atmosphere, and handoff to the missile in the atmosphere with high-resolution optics... A camera was used on the rear platform for the visible observation.
Single-snapshot 2D color measurement by plenoptic imaging system
NASA Astrophysics Data System (ADS)
Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana
2014-03-01
Plenoptic cameras enable the capture of directional light ray information, allowing applications such as digital refocusing, depth estimation, and multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor, high-color-fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision with this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing, and color estimation processing. The optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of displays and show that it achieves a color accuracy of ΔE<0.01.
Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio; Rispoli, Attilio
2010-01-01
This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements under static conditions taken by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms). PMID:22315559
PRISM Spectrograph Optical Design
NASA Technical Reports Server (NTRS)
Chipman, Russell A.
1995-01-01
The objective of this contract is to explore optical design concepts for the PRISM spectrograph and produce a preliminary optical design. An exciting optical configuration has been developed which will allow both wavelength bands to be imaged onto the same detector array. At present the optical design is only partially complete because PRISM will require a fairly elaborate optical system to meet its specification for throughput (area*solid angle). The most complex part of the design, the spectrograph camera, is complete, providing proof of principle that a feasible design is attainable. This camera requires 3 aspheric mirrors to fit inside the 20x60 cm cross-section package. A complete design with reduced throughput (1/9th) has been prepared. The design documents the optical configuration concept. A suitable dispersing prism material, CdTe, has been identified for the prism spectrograph, after a comparison of many materials.
NASA Technical Reports Server (NTRS)
Cameron, R.; Aldcroft, T.; Podgorski, W. A.; Freeman, M. D.
2000-01-01
The aspect determination system of the Chandra X-ray Observatory plays a key role in realizing the full potential of Chandra's X-ray optics and detectors. We review the performance of the spacecraft hardware components and sub-systems, which provide information for both real time control of the attitude and attitude stability of the Chandra Observatory and also for more accurate post-facto attitude reconstruction. These flight components are comprised of the aspect camera (star tracker) and inertial reference units (gyros), plus the fiducial lights and fiducial transfer optics which provide an alignment null reference system for the science instruments and X-ray optics, together with associated thermal and structural components. Key performance measures will be presented for aspect camera focal plane data, gyro performance both during stable pointing and during maneuvers, alignment stability and mechanism repeatability.
QUANTITATIVE DETECTION OF ENVIRONMENTALLY IMPORTANT DYES USING DIODE LASER/FIBER-OPTIC RAMAN
A compact diode laser/fiber-optic Raman spectrometer is used for quantitative detection of environmentally important dyes. This system is based on diode laser excitation at 782 nm, fiber optic probe technology, an imaging spectrometer, and a state-of-the-art scientific CCD camera. ...
NASA Astrophysics Data System (ADS)
Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.
In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA’s Orbital Debris Program Office (ODPO), in honour of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescope and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom-written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing, including Geosynchronous survey, TLE (two-line element) tracking of individual catalogued debris in all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), and tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA’s Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).
NASA Technical Reports Server (NTRS)
Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.
2017-01-01
In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honor of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).
Development of Extinction Imagers for the Determination of Atmospheric Optical Extinction
2014-08-01
system resulting from the effects of both the optics and the camera system (including the electronics). The MSI sensor includes a fiber optic taper... small dots in Fig. 7-1 are due to the fiber optic taper in the system. The brighter region near the center is due to the lens optics. To apply the... a black target which was a hollow black box. Clearly it would be a major advantage if we could use "targets of opportunity" from a ship, and in...
A USB 2.0 computer interface for the UCO/Lick CCD cameras
NASA Astrophysics Data System (ADS)
Wei, Mingzhi; Stover, Richard J.
2004-09-01
The new UCO/Lick Observatory CCD camera uses a 200 MHz fiber optic cable to transmit image data and an RS232 serial line for low speed bidirectional command and control. Increasingly RS232 is a legacy interface supported on fewer computers. The fiber optic cable requires either a custom interface board that is plugged into the mainboard of the image acquisition computer to accept the fiber directly or an interface converter that translates the fiber data onto a widely used standard interface. We present here a simple USB 2.0 interface for the UCO/Lick camera. A single USB cable connects to the image acquisition computer and the camera's RS232 serial and fiber optic cables plug into the USB interface. Since most computers now support USB 2.0 the Lick interface makes it possible to use the camera on essentially any modern computer that has the supporting software. No hardware modifications or additions to the computer are needed. The necessary device driver software has been written for the Linux operating system which is now widely used at Lick Observatory. The complete data acquisition software for the Lick CCD camera is running on a variety of PC style computers as well as an HP laptop.
KAPAO Prime: Design and Simulation
NASA Astrophysics Data System (ADS)
McGonigle, Lorcan
2012-11-01
KAPAO (KAPAO A Pomona Adaptive Optics instrument) is a dual-band natural guide star adaptive optics system designed to measure and remove atmospheric aberration from Pomona College's telescope atop Table Mountain. We present here the final optical system, referred to as Prime, designed in the Zemax optical design software. Prime is characterized by diffraction-limited imaging over the full 73'' field of view of our Andor camera at f/33 as well as for our NIR Xenics camera at f/50. In Zemax, tolerances of 1% on OAP focal length and off-axis distance were shown to contribute an additional 4 nm of wavefront error (98% confidence) over the field of view of the Andor camera; the contribution from surface irregularity was determined analytically to be 40 nm for OAPs specified to λ/10 surface irregularity. Modeling of the temperature deformation of the breadboard in SolidWorks revealed 70 micron contractions along the edges of the board for a decrease of 75 °F; when applied to OAP positions, such displacements from the optimal layout are predicted to contribute an additional 20 nanometers of wavefront error. Flexure modeling of the breadboard due to gravity is ongoing. We hope to begin alignment and testing of Prime in Q1 2013.
A novel optical investigation technique for railroad track inspection and assessment
NASA Astrophysics Data System (ADS)
Sabato, Alessandro; Beale, Christopher H.; Niezrecki, Christopher
2017-04-01
Track failures due to cross tie degradation or loss of ballast support may result in a number of problems, ranging from simple service interruptions to derailments. Structural Health Monitoring (SHM) of railway track is important for safety reasons and to reduce downtime and maintenance costs. Current track inspection technologies for assessing track health are insufficient, and novel, cost-effective alternatives are needed. Advancements achieved in recent years in camera technology, optical sensors, and image-processing algorithms have made machine vision, Structure from Motion (SfM), and three-dimensional (3D) Digital Image Correlation (DIC) systems extremely appealing techniques for extracting structural deformations and geometry profiles. Therefore, optically based, non-contact measurement techniques may be used for assessing surface defects, rail and tie deflection profiles, and ballast condition. In this study, the design of two camera-based measurement systems is proposed for crosstie-ballast condition assessment and track examination purposes. The first consists of four pairs of cameras installed on the underside of a rail car to detect the induced deformation and displacement along the whole length of the track's cross tie using 3D DIC measurement techniques. The second consists of another set of cameras using SfM techniques to obtain a 3D rendering of the infrastructure from a series of two-dimensional (2D) images, in order to evaluate the state of the track qualitatively. The feasibility of the proposed optical systems is evaluated through extensive laboratory tests, demonstrating their ability to measure parameters of interest (e.g. the crosstie's full-field displacement, vertical deflection, shape, etc.) for assessment and SHM of railroad track.
Development of Communication Technology in Japan: The Hi-OVIS Project.
ERIC Educational Resources Information Center
Murata, Toshihiko
1981-01-01
Describes the two-way Highly Interactive Optical Visual Information System (Hi-OVIS), involving the transmission and reception of educational, advertising, and public service programing, which has been in experimental use in Japan since 1978. Utilizing fiber optics, the system equips each house with a keyboard, television, television camera, and…
High-speed optical 3D sensing and its applications
NASA Astrophysics Data System (ADS)
Watanabe, Yoshihiro
2016-12-01
This paper reviews high-speed optical 3D sensing technologies for obtaining the 3D shape of a target using a camera. The sensing speeds addressed range from 100 to 1000 fps, exceeding normal camera frame rates, which are typically 30 fps. In particular, contactless, active, and real-time systems are introduced. Three example applications of this type of sensing technology are also introduced: surface reconstruction from time-sequential depth images, high-speed 3D user interaction, and high-speed digital archiving.
Shuttle sortie electro-optical instruments study
NASA Technical Reports Server (NTRS)
1974-01-01
A study to determine the feasibility of adapting existing electro-optical instruments (designed and successfully used for ground operations) for use on a shuttle sortie flight, and of their performing satisfactorily in the space environment, is considered. The suitability of these two instruments (a custom-made image intensifier camera system and an off-the-shelf secondary electron conduction television camera) to support a barium ion cloud experiment was studied for two different modes of Spacelab operation: within the pressurized module and on the pallet.
Data rate enhancement of optical camera communications by compensating inter-frame gaps
NASA Astrophysics Data System (ADS)
Nguyen, Duy Thong; Park, Youngil
2017-07-01
Optical camera communications (OCC) is a convenient way of transmitting data between LED lamps and the image sensors that are included in most smart devices. Although many schemes have been suggested to increase the data rate of OCC systems, it is still much lower than that of photodiode-based LiFi systems. One major reason for this low data rate is the inter-frame gap (IFG) of the image sensor system, that is, the time gap between consecutive image frames. In this paper, we propose a way to compensate for the IFG efficiently with an interleaved Hamming coding scheme. The proposed scheme is implemented and its performance is measured.
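The coding building block named here can be sketched as a systematic Hamming(7,4) encoder, which leaves single-bit errors per codeword correctable at the receiver; the paper's specific interleaving pattern across frames is not reproduced.

import numpy as np

# Systematic Hamming(7,4) generator matrix: 4 data bits followed by 3 parity bits.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def hamming74_encode(nibble):
    return (np.array(nibble) @ G) % 2   # 4 data bits -> 7-bit codeword

print(hamming74_encode([1, 0, 1, 1]))   # -> [1 0 1 1 0 1 0]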
Practical aspects of modern interferometry for optical manufacturing quality control: Part 2
NASA Astrophysics Data System (ADS)
Smythe, Robert
2012-07-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space based satellite imaging and DVD and Blu-Ray disks are all enabled by phase shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful towards the practical use of interferometers. An understanding of the parameters that drive system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Practical aspects of modern interferometry for optical manufacturing quality control, Part 3
NASA Astrophysics Data System (ADS)
Smythe, Robert A.
2012-09-01
Modern phase shifting interferometers enable the manufacture of optical systems that drive the global economy. Semiconductor chips, solid-state cameras, cell phone cameras, infrared imaging systems, space-based satellite imaging, and DVD and Blu-Ray disks are all enabled by phase-shifting interferometers. Theoretical treatments of data analysis and instrument design advance the technology but often are not helpful toward the practical use of interferometers. An understanding of the parameters that drive the system performance is critical to produce useful results. Any interferometer will produce a data map and results; this paper, in three parts, reviews some of the key issues to minimize error sources in that data and provide a valid measurement.
Real-time Awake Animal Motion Tracking System for SPECT Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goddard Jr, James Samuel; Baba, Justin S; Lee, Seung Joon
Enhancements have been made in the development of a real-time optical pose measurement and tracking system that provides 3D position and orientation data for a single photon emission computed tomography (SPECT) imaging system for awake, unanesthetized, unrestrained small animals. Three optical cameras with infrared (IR) illumination view the head movements of an animal enclosed in a transparent burrow. Markers placed on the head provide landmark points for image segmentation. Strobed IR LEDs are synchronized to the cameras and illuminate the markers to prevent motion blur in each set of images. The system, using the three cameras, automatically segments the markers, detects missing data, rejects false reflections, performs trinocular marker correspondence, and calculates the 3D pose of the animal's head. Improvements have been made in methods for segmentation, tracking, and 3D calculation to give higher speed and more accurate measurements during a scan. The optical hardware has been installed within a Siemens MicroCAT II small animal scanner at Johns Hopkins without requiring functional changes to the scanner operation. The system has undergone testing using both phantoms and live mice and has been characterized in terms of speed, accuracy, robustness, and reliability. Experimental data showing these motion tracking results are given.
Adapting smartphones for low-cost optical medical imaging
NASA Astrophysics Data System (ADS)
Pratavieira, Sebastião.; Vollet-Filho, José D.; Carbinatto, Fernanda M.; Blanco, Kate; Inada, Natalia M.; Bagnato, Vanderlei S.; Kurachi, Cristina
2015-06-01
Optical images have been used in several medical situations to improve the diagnosis of lesions or to monitor treatments. However, most systems employ expensive scientific (CCD or CMOS) cameras and need computers to display and save the images, usually resulting in a high final cost for the system. Additionally, operating this sort of apparatus usually becomes more complex, requiring more specialized technical knowledge from the operator. Currently, the number of people using smartphone-like devices with built-in high-quality cameras is increasing, which might allow such devices to serve as efficient, lower-cost, portable imaging systems for medical applications. Thus, we aim to develop methods of adapting those devices to optical medical imaging techniques, such as fluorescence. In particular, smartphone covers were adapted to connect a smartphone-like device to widefield fluorescence imaging systems. These systems were used to detect lesions in different tissues, such as cervical and mouth/throat mucosa, and to monitor ALA-induced protoporphyrin-IX formation for photodynamic treatment of Cervical Intraepithelial Neoplasia. This approach may contribute significantly to low-cost, portable, and simple clinical optical imaging collection.
Visual tracking for multi-modality computer-assisted image guidance
NASA Astrophysics Data System (ADS)
Basafa, Ehsan; Foroughi, Pezhman; Hossbach, Martin; Bhanushali, Jasmine; Stolka, Philipp
2017-03-01
With optical cameras, many interventional navigation tasks previously relying on EM, optical, or mechanical guidance can be performed robustly, quickly, and conveniently. We developed a family of novel guidance systems based on wide-spectrum cameras and vision algorithms for real-time tracking of interventional instruments and multi-modality markers. These navigation systems support the localization of anatomical targets, support placement of imaging probe and instruments, and provide fusion imaging. The unique architecture - low-cost, miniature, in-hand stereo vision cameras fitted directly to imaging probes - allows for an intuitive workflow that fits a wide variety of specialties such as anesthesiology, interventional radiology, interventional oncology, emergency medicine, urology, and others, many of which see increasing pressure to utilize medical imaging and especially ultrasound, but have yet to develop the requisite skills for reliable success. We developed a modular system, consisting of hardware (the Optical Head containing the mini cameras) and software (components for visual instrument tracking with or without specialized visual features, fully automated marker segmentation from a variety of 3D imaging modalities, visual observation of meshes of widely separated markers, instant automatic registration, and target tracking and guidance on real-time multi-modality fusion views). From these components, we implemented a family of distinct clinical and pre-clinical systems (for combinations of ultrasound, CT, CBCT, and MRI), most of which have international regulatory clearance for clinical use. We present technical and clinical results on phantoms, ex- and in-vivo animals, and patients.
Zhao, Qiaole; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Raspe, Marcel; Jalink, Kees; Geert Sander de Jong, Jan; van Geest, Bert; Stoop, Karel; Young, Ian Ted
2012-12-01
We have built an all-solid-state camera that is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific charge coupled device design in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and optical transfer function have been studied through experiments. We are able to do lifetime measurement using our modulated, electron-multiplied fluorescence lifetime imaging microscope (MEM-FLIM) camera for various objects, e.g., fluorescein solution, fixed green fluorescent protein (GFP) cells, and GFP-actin stained live cells. A detailed comparison of a conventional microchannel plate (MCP)-based FLIM system and the MEM-FLIM system is presented. The MEM-FLIM camera shows higher resolution and a better image quality. The MEM-FLIM camera provides a new opportunity for performing frequency-domain FLIM.
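The phase and modulation lifetimes central to frequency-domain FLIM can be recovered from a stack of phase-stepped intensity images. Below is a minimal sketch of the standard homodyne estimator, assuming K equally spaced detector phase steps; the function name and reference-calibration arguments are illustrative, not the MEM-FLIM pipeline.

```python
import numpy as np

def flim_lifetimes(stack, f_mod, ref_phase=0.0, ref_mod=1.0):
    """Estimate frequency-domain FLIM lifetimes from a phase-step image stack.

    stack: (K, H, W) intensity images at K equally spaced detector phase
    steps; f_mod: modulation frequency in Hz; ref_phase/ref_mod: values
    from a zero-lifetime reference measurement.
    Returns (tau_phase, tau_mod) maps in seconds.
    """
    K = stack.shape[0]
    phases = 2 * np.pi * np.arange(K) / K
    # First Fourier coefficient over the phase axis gives the AC component.
    ac = np.tensordot(np.exp(-1j * phases), stack, axes=(0, 0)) * (2.0 / K)
    dc = stack.mean(axis=0)
    phi = np.angle(ac) - ref_phase                       # phase shift
    m = np.abs(ac) / np.maximum(dc, 1e-12) / ref_mod     # demodulation ratio
    w = 2 * np.pi * f_mod
    tau_phase = np.tan(phi) / w
    tau_mod = np.sqrt(np.clip(1.0 / m**2 - 1.0, 0, None)) / w
    return tau_phase, tau_mod
```

For a single-exponential decay the two estimates agree; their divergence is a common check for lifetime heterogeneity.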
Computational photography with plenoptic camera and light field capture: tutorial.
Lam, Edmund Y
2015-11-01
Photography is a cornerstone of imaging. Ever since cameras became consumer products more than a century ago, we have witnessed great technological progress in optics and recording mediums, with digital sensors replacing photographic films in most instances. The latest revolution is computational photography, which seeks to make image reconstruction computation an integral part of the image formation process; in this way, there can be new capabilities or better performance in the overall imaging system. A leading effort in this area is called the plenoptic camera, which aims at capturing the light field of an object; proper reconstruction algorithms can then adjust the focus after the image capture. In this tutorial paper, we first illustrate the concept of plenoptic function and light field from the perspective of geometric optics. This is followed by a discussion on early attempts and recent advances in the construction of the plenoptic camera. We will then describe the imaging model and computational algorithms that can reconstruct images at different focus points, using mathematical tools from ray optics and Fourier optics. Last, but not least, we will consider the trade-off in spatial resolution and highlight some research work to increase the spatial resolution of the resulting images.
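The post-capture refocusing described in the tutorial is commonly implemented by shifting each sub-aperture image in proportion to its position in the aperture and summing. A minimal shift-and-sum sketch follows, under assumed conventions (integer-pixel shifts, a 4D light field indexed L[u, v, y, x]); production code would interpolate fractional shifts.

```python
import numpy as np

def refocus(lightfield, alpha):
    """Shift-and-sum refocusing of a 4D light field L[u, v, y, x].

    alpha: relative refocus depth (1.0 keeps the original focal plane).
    Each sub-aperture image is shifted in proportion to its (u, v)
    offset from the aperture center, then all are averaged.
    """
    U, V, H, W = lightfield.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    shift = 1.0 - 1.0 / alpha
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            dy = int(round((u - uc) * shift))
            dx = int(round((v - vc) * shift))
            out += np.roll(lightfield[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)
```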
3D imaging and wavefront sensing with a plenoptic objective
NASA Astrophysics Data System (ADS)
Rodríguez-Ramos, J. M.; Lüke, J. P.; López, R.; Marichal-Hernández, J. G.; Montilla, I.; Trujillo-Sevilla, J.; Femenía, B.; Puga, M.; López, M.; Fernández-Valdivia, J. J.; Rosa, F.; Dominguez-Conde, C.; Sanluis, J. C.; Rodríguez-Ramos, L. F.
2011-06-01
Plenoptic cameras have been developed over the last years as a passive method for 3D scanning. Several superresolution algorithms have been proposed to compensate for the resolution decrease associated with lightfield acquisition through a microlens array. A number of multiview stereo algorithms have also been applied to extract depth information from plenoptic frames. Real-time systems have been implemented using specialized hardware such as Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). In this paper, we present our own implementations related to the aforementioned aspects, together with two new developments: a portable plenoptic objective that transforms any conventional 2D camera into a 3D CAFADIS plenoptic camera, and the novel use of a plenoptic camera as a wavefront phase sensor for adaptive optics (AO). The terrestrial atmosphere degrades telescope images due to the refractive index changes associated with turbulence; these changes require high-speed processing that justifies the use of GPUs and FPGAs. Artificial sodium laser guide stars (Na-LGS, 90 km high) must be used to obtain the reference wavefront phase and the optical transfer function of the system, but they are affected by defocus because of their finite distance to the telescope. Using the telescope as a plenoptic camera allows us to correct the defocus and to recover the wavefront phase tomographically. These advances significantly increase the versatility of the plenoptic camera and provide a new contribution relating the wave optics and computer vision fields, as many authors claim.
A phase space approach to imaging from limited data
NASA Astrophysics Data System (ADS)
Testorf, Markus E.
2015-09-01
The optical instrument function is used as the basis to develop optical system theory for imaging applications. The detection of optical signals is conveniently described as the overlap integral of the Wigner distribution functions of the instrument and the optical signal. Based on this framework, various optical imaging systems, including plenoptic cameras, phase-retrieval algorithms, and Shack-Hartmann sensors, are shown to acquire information about a domain in phase space with finite extension and finite resolution. It is demonstrated how phase-space optics can be used both to analyze imaging systems and to design methods for image reconstruction.
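As a rough illustration of the phase-space view, the discrete pseudo-Wigner distribution of a 1D signal can be computed directly from its instantaneous autocorrelation, and the detected signal modeled as the overlap of the instrument and signal distributions. This sketch is illustrative only, not the paper's formalism.

```python
import numpy as np

def wigner(f):
    """Discrete pseudo-Wigner distribution W[x, k] of a 1D complex signal."""
    N = len(f)
    fp = np.pad(np.asarray(f, dtype=complex), N)   # zero-pad to handle edges
    W = np.zeros((N, 2 * N), dtype=complex)
    m = np.arange(-N, N)
    for n in range(N):
        # Instantaneous autocorrelation f(x + m) f*(x - m)
        W[n] = fp[N + n + m] * np.conj(fp[N + n - m])
    return np.real(np.fft.fftshift(np.fft.fft(W, axis=1), axes=1))

# The detected signal is then modeled as the phase-space overlap
#   S = sum over x, k of W_instrument[x, k] * W_signal[x, k]
```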
Synchronization Design and Error Analysis of Near-Infrared Cameras in Surgical Navigation.
Cai, Ken; Yang, Rongqian; Chen, Huazhou; Huang, Yizhou; Wen, Xiaoyan; Huang, Wenhua; Ou, Shanxing
2016-01-01
The accuracy of optical tracking systems is important to scientists, and with the improvements reported in this regard, such systems have been applied to an increasing number of operations. To further enhance the accuracy of these systems and to reduce the effects of synchronization and field-of-view errors, this study introduces a field-programmable gate array (FPGA)-based synchronization control method, a method for measuring synchronization errors, and an error distribution map over the field of view. Synchronization control maximizes the parallel processing capability of the FPGA, and synchronization error measurement can effectively detect the errors caused by synchronization in an optical tracking system. The distribution of positioning errors across the field of view can be read from the aforementioned error distribution map; doctors can therefore perform surgeries in areas with few positioning errors, and the accuracy of optical tracking systems is considerably improved. The system is analyzed and validated in this study through experiments involving the proposed methods, which can eliminate positioning errors attributed to asynchronous cameras and differing fields of view.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.
Image quality testing of assembled IR camera modules
NASA Astrophysics Data System (ADS)
Winters, Daniel; Erichsen, Patrik
2013-10-01
Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements for the imaging performance of objectives and for proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle, etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high-resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.
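A common way to obtain the MTF in automated module tests of this kind is from an edge target: differentiate the edge spread function into a line spread function and take its normalized Fourier magnitude. The sketch below makes that assumption (the paper does not specify its measurement chain), and assumes the edge is roughly centered in the profile.

```python
import numpy as np

def mtf_from_edge(esf, dx):
    """MTF from an oversampled edge spread function (ESF).

    esf: 1D edge profile (e.g., from a slanted-edge target);
    dx: sample spacing in mm.
    Returns spatial frequencies (cycles/mm) and the MTF.
    """
    lsf = np.gradient(np.asarray(esf, dtype=float), dx)  # line spread function
    lsf *= np.hanning(len(lsf))                          # taper to suppress noise
    otf = np.fft.rfft(lsf)
    mtf = np.abs(otf) / np.abs(otf[0])                   # normalize to DC
    freqs = np.fft.rfftfreq(len(lsf), d=dx)
    return freqs, mtf
```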
Infrared detectors and test technology of cryogenic camera
NASA Astrophysics Data System (ADS)
Yang, Xiaole; Liu, Xingxin; Xing, Mailing; Ling, Long
2016-10-01
Cryogenic cameras, which are widely used in deep-space detection, cool down the optical system and support structure by cryogenic refrigeration technology, thereby improving sensitivity. The characteristics and design points of the infrared detector are discussed in combination with the camera's characteristics. At the same time, cryogenic-background test systems for the chip and for the detector assembly are established: the chip test system is based on a variable-temperature multilayer Dewar, and the assembly test system is based on a target and background simulator in a thermal vacuum environment. The core of the test is to establish a cryogenic background. The non-uniformity, dead-pixel ratio, and noise of the test results are given finally. The establishment of the test systems supports the design and calculation of infrared systems.
NASA Astrophysics Data System (ADS)
Baruch, Daniel; Abookasis, David
2017-04-01
The application of optical techniques as tools for biomedical research has generated substantial interest for the ability of such methodologies to simultaneously measure biochemical and morphological parameters of tissue. Ongoing optimization of optical techniques may introduce such tools as alternatives or complements to conventional methodologies. The common approach shared by current optical techniques lies in the acquisition of the tissue's optical properties (i.e., the absorption and reduced scattering coefficients) from reflected or transmitted light. Such optical parameters, in turn, provide detailed information regarding both the concentrations of clinically relevant chromophores and macroscopic structural variations in tissue. We couple a noncontact optical setup with a simple analysis algorithm to obtain the absorption and scattering coefficients of biological samples under test. Technically, a portable picoprojector projects serial sinusoidal patterns at low and high spatial frequencies, while the reflected diffuse light is simultaneously acquired through a single spectrometer and two separate CCD cameras, each fitted with a different bandpass filter at a nonisosbestic or an isosbestic wavelength. The spectrometer and camera channels fill the gaps in each other's capabilities, acquiring the optical properties of tissue at high spectral and spatial resolution. Experiments were performed on both tissue-mimicking phantoms and the hands of healthy human volunteers to quantify their optical properties as a proof of concept for the present technique. In a separate experiment, we derived the optical properties of the hand skin from the measured diffuse reflectance, based on a recently developed camera model. Additionally, tissue oxygen saturation levels measured by the system were found to agree well with reference values. Taken together, the present results demonstrate the potential of this integrated setup for diagnostic and research applications.
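Structured-illumination measurements of this kind typically demodulate the AC amplitude from three phase-shifted projections of each sinusoidal pattern; the AC and DC maps at low and high spatial frequency then yield absorption and reduced scattering. The three-phase scheme below is a standard convention and an assumption here, since the abstract does not state the demodulation method.

```python
import numpy as np

def sfdi_demodulate(i1, i2, i3):
    """Three-phase demodulation for sinusoidal structured illumination.

    i1, i2, i3: images under the same spatial frequency at phase offsets
    0, 2*pi/3, and 4*pi/3. Returns (AC amplitude, DC) maps.
    """
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)
    dc = (i1 + i2 + i3) / 3.0
    return ac, dc
```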
Photogrammetry of Apollo 15 photography, part C
NASA Technical Reports Server (NTRS)
Wu, S. S. C.; Schafer, F. J.; Jordan, R.; Nakata, G. M.; Derick, J. L.
1972-01-01
In the Apollo 15 mission, a mapping camera system, a 61 cm optical bar high-resolution panoramic camera, and a laser altimeter were used. The panoramic camera is described along with its several distortion sources, such as the cylindrical shape of the negative film surface, the scanning action of the lens, the image motion compensator, and the spacecraft motion. Film products were processed on a specifically designed analytical plotter.
Dworak, Volker; Selbeck, Joern; Dammer, Karl-Heinz; Hoffmann, Matthias; Zarezadeh, Ali Akbar; Bobda, Christophe
2013-01-24
The application of (smart) cameras for process control, mapping, and advanced imaging in agriculture has become an element of precision farming that facilitates the conservation of fertilizer, pesticides, and machine time. This technique additionally reduces the amount of energy required in terms of fuel. Although research activities have increased in this field, high camera prices reflect low adaptation to applications in all fields of agriculture. Smart, low-cost cameras adapted for agricultural applications can overcome this drawback. The normalized difference vegetation index (NDVI) for each image pixel is an applicable algorithm to discriminate plant information from the soil background enabled by a large difference in the reflectance between the near infrared (NIR) and the red channel optical frequency band. Two aligned charge coupled device (CCD) chips for the red and NIR channel are typically used, but they are expensive because of the precise optical alignment required. Therefore, much attention has been given to the development of alternative camera designs. In this study, the advantage of a smart one-chip camera design with NDVI image performance is demonstrated in terms of low cost and simplified design. The required assembly and pixel modifications are described, and new algorithms for establishing an enhanced NDVI image quality for data processing are discussed.
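The per-pixel NDVI computation described above is straightforward; a minimal sketch follows (the 0.3 threshold in the comment is an illustrative value, not from the paper).

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Per-pixel normalized difference vegetation index.

    nir, red: co-registered float arrays of NIR and red reflectance;
    eps avoids division by zero over dark pixels.
    """
    return (nir - red) / (nir + red + eps)

# Pixels with NDVI above a chosen threshold (e.g., 0.3) are treated as
# plant material; lower values are attributed to the soil background.
```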
Use of an UROV to develop 3-D optical models of submarine environments
NASA Astrophysics Data System (ADS)
Null, W. D.; Landry, B. J.
2017-12-01
The ability to rapidly obtain high-fidelity bathymetry is crucial for a broad range of engineering, scientific, and defense applications ranging from bridge scour, bedform morphodynamics, and coral reef health to unexploded ordnance detection and monitoring. The present work introduces the use of an Underwater Remotely Operated Vehicle (UROV) to develop 3-D optical models of submarine environments. The UROV used a Raspberry Pi camera mounted to a small servo which allowed for pitch control. Prior to video data collection, in situ camera calibration was conducted with the system. Multiple image frames were extracted from the underwater video for 3D reconstruction using Structure from Motion (SFM). This system provides a simple and cost effective solution to obtaining detailed bathymetry in optically clear submarine environments.
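A typical first step of such an SfM workflow is subsampling frames from the underwater video. Below is a minimal sketch using OpenCV; the output path pattern and frame step are hypothetical.

```python
import cv2

def extract_frames(video_path, out_pattern, step=15):
    """Save every `step`-th frame of a video as SfM input images.

    out_pattern: printf-style path, e.g. 'frames/frame_%04d.png'
    (hypothetical). Returns the number of frames written.
    """
    cap = cv2.VideoCapture(video_path)
    idx = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            cv2.imwrite(out_pattern % saved, frame)
            saved += 1
        idx += 1
    cap.release()
    return saved
```

The saved frames can then be fed to an SfM package for camera pose recovery and dense 3D reconstruction.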
Development of Flight Slit-Jaw Optics for Chromospheric Lyman-Alpha SpectroPolarimeter
NASA Technical Reports Server (NTRS)
Kubo, Masahito; Suematsu, Yoshinori; Kano, Ryohei; Bando, Takamasa; Hara, Hirohisa; Narukage, Noriyuki; Katsukawa, Yukio; Ishikawa, Ryoko; Ishikawa, Shin-nosuke; Kobiki, Toshihiko;
2015-01-01
In the CLASP sounding rocket experiment, a slit with a mirror-finished surround is placed near the focal point of the telescope. The light reflected by the mirror surface surrounding the slit is imaged by the slit-jaw optical system to obtain a secondary Lyman-alpha image. This image is used not only as a real-time image for selecting the rocket pointing direction during flight, but also as scientific data showing the spatial structure of the Lyman-alpha emission line intensity distribution in the solar chromosphere around the observation area of the spectropolarimeter. The slit-jaw optical system consists of a mirror unit containing two off-axis mirrors (a parabolic mirror and a folding mirror), a Lyman-alpha transmission filter, and a camera, forming an optical system of 1x magnification. The camera is supplied from the United States; fabrication and testing of everything else was carried out on the Japanese side. Because the slit-jaw optical system is difficult to access within the structure and must be installed in a location with little clearance, the optical elements that influence the optical performance and require fine adjustment are consolidated into the mirror unit. On the other hand, because of the alignment of the solar sensor at the US launch site, the Lyman-alpha transmission filter holder, including the filter, must be removable and is therefore a separate part from the mirror unit. To keep the structure simple, the stray light countermeasures are concentrated around the Lyman-alpha transmission filter. To overcome the difficulty of performing optical alignment at the Lyman-alpha wavelength, which is absorbed by the atmosphere, the following four steps were planned to reduce the alignment time. 1: Measure in advance the refractive index of the Lyman-alpha transmission filter at the Lyman-alpha wavelength (121.567 nm), and prepare a visible-light filter having the same optical path length at a visible wavelength (630 nm). 2: Before mounting the mirror unit on the CLASP structure, place a dummy slit and camera at their prescribed positions in a jig frame and complete the internal alignment adjustment. 3: Attach the mirror unit and the visible-light filter to the CLASP structure and adjust the focus position using the visible-light flight camera. 4: Replace the visible-light filter with the Lyman-alpha transmission filter and confirm, at the Lyman-alpha wavelength (under vacuum), that the required optical performance is achieved. Currently, the steps up to 3 are completed, and it has been confirmed in visible light that the optical performance satisfies the required values with sufficient margin. Also, by feeding sunlight into the slit-jaw optical system through the CLASP telescope, it has been confirmed that there is no vignetting in the field of view and that the stray light rejection meets the requirements.
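One way to read the surrogate-filter requirement in step 1 is as an optical-path-length match between the Lyman-alpha filter at 121.567 nm and the visible filter at 630 nm. With n the refractive index and t the thickness, the condition would be (our reading, not an equation from the abstract):

```latex
n_{630}\, t_{\mathrm{vis}} = n_{121.567}\, t_{\mathrm{Ly}\alpha}
\quad\Longrightarrow\quad
t_{\mathrm{vis}} = t_{\mathrm{Ly}\alpha}\,\frac{n_{121.567}}{n_{630}} .
```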
Preliminary Design of a Lightning Optical Camera and ThundEr (LOCATE) Sensor
NASA Technical Reports Server (NTRS)
Phanord, Dieudonne D.; Koshak, William J.; Rybski, Paul M.; Arnold, James E. (Technical Monitor)
2001-01-01
The preliminary design of an optical/acoustical instrument is described for making highly accurate real-time determinations of the location of cloud-to-ground (CG) lightning. The instrument, named the Lightning Optical Camera And ThundEr (LOCATE) sensor, will also image the clear and cloud-obscured lightning channels produced by CGs and cloud flashes, and will record the transient optical waveforms produced by these discharges. The LOCATE sensor will consist of a full (360 degrees) field-of-view optical camera for obtaining the CG channel image and azimuth, a sensitive thunder microphone for obtaining the CG range, and a fast photodiode system for time-resolving the lightning optical waveform. The optical waveform data will be used to discriminate CGs from cloud flashes. Together, the optical azimuth and thunder range are used to locate CGs, and it is anticipated that a network of LOCATE sensors would determine CG source locations to well within 100 meters. All of this would be accomplished at a relatively inexpensive cost compared to present RF lightning location technologies, though of course the detection range is limited and will be quantified in the future. The LOCATE sensor technology would have practical applications for electric power utility companies, government (e.g. NASA Kennedy Space Center lightning safety and warning), golf resort lightning safety, telecommunications, and other industries.
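The azimuth-plus-thunder ranging idea reduces to simple geometry: the thunder delay times the speed of sound gives range, and the camera image gives azimuth. A minimal sketch follows; the function and the temperature-dependent sound-speed approximation are illustrative, not the LOCATE algorithm.

```python
import math

def locate_cg(azimuth_deg, thunder_delay_s, temp_c=20.0):
    """Locate a cloud-to-ground strike from optical azimuth and thunder delay.

    Uses the common approximation c ~ 331.3 + 0.606*T m/s for the speed
    of sound in air at temperature T (deg C).
    Returns (east, north) offsets in meters and the range.
    """
    c_sound = 331.3 + 0.606 * temp_c
    r = c_sound * thunder_delay_s            # range in meters
    az = math.radians(azimuth_deg)           # azimuth measured from north
    east, north = r * math.sin(az), r * math.cos(az)
    return east, north, r
```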
Photothermal camera port accessory for microscopic thermal diffusivity imaging
NASA Astrophysics Data System (ADS)
Escola, Facundo Zaldívar; Kunik, Darío; Mingolo, Nelly; Martínez, Oscar Eduardo
2016-06-01
The design of a scanning photothermal accessory is presented, which can be attached to the camera port of commercial microscopes to measure thermal diffusivity maps with micrometer resolution. The device is based on the thermal expansion recovery technique, which measures the defocusing of a probe beam due to the curvature induced by the local heat delivered by a focused pump beam. The beam delivery and collecting optics are built using optical fiber technology, resulting in a robust optical system that provides collinear pump and probe beams without any alignment adjustment necessary. The quasiconfocal configuration for the signal collection using the same optical fiber sets very restrictive conditions on the positioning and alignment of the optical components of the scanning unit, and a detailed discussion of the design equations is presented. The alignment procedure is carefully described, resulting in a system so robust and stable that no further alignment is necessary for the day-to-day use, becoming a tool that can be used for routine quality control, operated by a trained technician.
Space imaging infrared optical guidance for autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu
2008-08-01
We have developed the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, based on an uncooled infrared camera and a focusing technique, to detect objects to be evaded and to set the drive path. For this purpose we built a servomotor drive system to control the focus function of the infrared camera lens. To determine the best focus position we use autofocus image processing based on the 4-term Daubechies wavelet transform, and the determined best focus position is then transformed into the distance of the object. We built an aluminum-frame ground vehicle, 900 mm long and 800 mm wide, to mount the autofocus infrared unit; the vehicle carries an Ackermann front steering system and a rear motor drive system. To confirm the guidance ability of the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, we conducted experiments on the detection by the infrared autofocus unit of an actual car on the road and of the roadside wall. As a result, the autofocus image processing based on the Daubechies wavelet transform clearly detects the best-focus image and gives the depth of the object from the infrared camera unit.
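A focus metric of the kind described can be built from the detail-band energy of a Daubechies wavelet decomposition: sharp images concentrate more energy in the high-frequency subbands. A minimal sketch assuming the 'db4' wavelet (the paper's exact 4-term formulation is not specified here):

```python
import numpy as np
import pywt

def wavelet_focus_measure(image):
    """Focus metric from Daubechies detail-band energy.

    Better-focused images concentrate more energy in the horizontal,
    vertical, and diagonal detail subbands of the 2D wavelet transform.
    """
    _, (cH, cV, cD) = pywt.dwt2(image.astype(float), "db4")
    return float(np.sum(cH**2) + np.sum(cV**2) + np.sum(cD**2))

# Usage: sweep the lens focus motor, evaluate the metric per frame, and
# take the position with maximal detail energy as the best-focus position;
# the lens calibration then maps that position to object distance.
```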
NASA Technical Reports Server (NTRS)
Katzberg, S. J.; Kelly, W. L., IV; Rowland, C. W.; Burcher, E. E.
1973-01-01
The facsimile camera is an optical-mechanical scanning device which has become an attractive candidate as an imaging system for planetary landers and rovers. This paper presents electronic techniques which permit the acquisition and reconstruction of high quality images with this device, even under varying lighting conditions. These techniques include a control for low frequency noise and drift, an automatic gain control, a pulse-duration light modulation scheme, and a relative spectral gain control. Taken together, these techniques allow the reconstruction of radiometrically accurate and properly balanced color images from facsimile camera video data. These techniques have been incorporated into a facsimile camera and reproduction system, and experimental results are presented for each technique and for the complete system.
Sweatt, William C.
1998-01-01
A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D.sub.source .apprxeq.0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.
Method and apparatus for acoustic imaging of objects in water
Deason, Vance A.; Telschow, Kenneth L.
2005-01-25
A method, system and underwater camera for acoustic imaging of objects in water or other liquids includes an acoustic source for generating an acoustic wavefront for reflecting from a target object as a reflected wavefront. The reflected acoustic wavefront deforms a screen on an acoustic side and correspondingly deforms the opposing optical side of the screen. An optical processing system is optically coupled to the optical side of the screen and converts the deformations on the optical side of the screen into an optical intensity image of the target object.
NASA Astrophysics Data System (ADS)
Cobos Arribas, Pedro; Monasterio Huelin Macia, Felix
2003-04-01
An FPGA-based hardware implementation of the Santos-Victor optical flow algorithm, useful in robot guidance applications, is described in this paper. The system contains an ALTERA FPGA (20K100), an interface with a digital camera, three VRAM memories to hold the input data, and output memories (a VRAM and an EDO) to hold the results. The system has been used previously to develop and test other vision algorithms, such as image compression and optical flow calculation with differential and correlation methods. The designed system allows connecting the digital camera, or the FPGA output (the results of the algorithms), to a PC through its FireWire or USB port. The problems encountered on this occasion have motivated the adoption of another hardware structure for certain vision algorithms with special requirements that need highly code-intensive processing.
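For reference, a correlation-method optical flow of the general kind mentioned can be sketched as block matching by minimum sum of absolute differences; this is a generic illustration, not the Santos-Victor formulation or the FPGA implementation.

```python
import numpy as np

def block_matching_flow(prev, curr, block=8, search=4):
    """Correlation-based optical flow by block matching.

    For each block in `prev`, find the displacement within +/-`search`
    pixels that minimizes the sum of absolute differences (SAD) in `curr`.
    Returns an integer flow field of shape (H//block, W//block, 2).
    """
    H, W = prev.shape
    flow = np.zeros((H // block, W // block, 2), dtype=int)
    for by in range(H // block):
        for bx in range(W // block):
            y, x = by * block, bx * block
            ref = prev[y:y + block, x:x + block].astype(float)
            best, best_dv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy and yy + block <= H and 0 <= xx and xx + block <= W:
                        cand = curr[yy:yy + block, xx:xx + block].astype(float)
                        sad = np.abs(ref - cand).sum()
                        if sad < best:
                            best, best_dv = sad, (dy, dx)
            flow[by, bx] = best_dv
    return flow
```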
Robustness of an artificially tailored fisheye imaging system with a curvilinear image surface
NASA Astrophysics Data System (ADS)
Lee, Gil Ju; Nam, Won Il; Song, Young Min
2017-11-01
Curved image sensors inspired by animal and insect eyes have provided a new development direction for next-generation digital cameras. It is known that natural fish eyes afford extremely wide field of view (FOV) imaging due to the geometrical properties of the spherical lens and hemispherical retina. However, inherent drawbacks, such as low off-axis illumination and the fabrication difficulty of a 'dome-like' hemispherical imager, limit the development of bio-inspired wide-FOV cameras. Here, a new type of fisheye imaging system is introduced that has a simple lens configuration with a curvilinear image surface, while maintaining high off-axis illumination and a wide FOV. Moreover, through comparisons with commercial conventional fisheye designs, it is determined that the volume and required number of optical elements of the proposed design are practical while preserving the fundamental optical performance. Detailed design guidelines for tailoring the proposed optical system are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goggin, L; Kilby, W; Noll, M
2015-06-15
Purpose: A technique using a scintillator-mirror-camera system to measure MLC leakage was developed to provide an efficient alternative to film dosimetry while maintaining high spatial resolution. This work describes the technique together with measurement uncertainties. Methods: Leakage measurements were made for the InCise™ MLC using the Logos XRV-2020A device. For each measurement approximately 170 leakage and background images were acquired using optimized camera settings. Average background was subtracted from each leakage frame before filtering the integrated leakage image to replace anomalous pixels. Pixel value to dose conversion was performed using a calibration image. Mean leakage was calculated within an ROI corresponding to the primary beam, and maximum leakage was determined by binning the image into overlapping 1mm x 1mm ROIs. 48 measurements were performed using 3 cameras and multiple MLC-linac combinations in varying beam orientations, with each compared to film dosimetry. Optical and environmental influences were also investigated. Results: Measurement time with the XRV-2020A was 8 minutes vs. 50 minutes using radiochromic film, and results were available immediately. Camera radiation exposure degraded measurement accuracy. With a relatively undamaged camera, mean leakage agreed with film measurement to ≤0.02% in 92% of cases and ≤0.03% in 100% (for maximum leakage the values were 88% and 96%), relative to the reference open-field dose. The estimated camera lifetime over which this agreement is maintained is at least 150 measurements, and it can be monitored using reference field exposures. A dependency on camera temperature was identified, and a reduction in sensitivity with distance from the image center due to optical distortion was characterized. Conclusion: With periodic monitoring of the degree of camera radiation damage, the XRV-2020A system can be used to measure MLC leakage. This represents a significant time saving when compared to the traditional film-based approach without any substantial reduction in accuracy.
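The described analysis chain (background subtraction, dose calibration, mean over the primary-beam ROI, maximum over overlapping ~1 mm ROIs) can be sketched as follows. The names, window size, and use of a sliding-mean filter are assumptions for illustration, not the authors' code.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def leakage_stats(frames, background, cal, open_dose, roi, win=5):
    """Mean and maximum relative MLC leakage from a camera frame stack.

    frames: (N, H, W) leakage images; background: (H, W) average background
    frame; cal: pixel-value-to-dose calibration factor; open_dose: reference
    open-field dose; roi: (r0, r1, c0, c1) primary-beam region; win: side of
    the sliding ROI in pixels (assumed to span about 1 mm x 1 mm).
    """
    dose = cal * np.clip(frames - background, 0, None).sum(axis=0)
    r0, r1, c0, c1 = roi
    d = dose[r0:r1, c0:c1]
    mean_leak = d.mean() / open_dose
    # Maximum over overlapping win x win ROIs via a sliding-window mean.
    max_leak = uniform_filter(d, size=win, mode="nearest").max() / open_dose
    return mean_leak, max_leak
```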
High Energy Replicated Optics to Explore the Sun: Hard X-Ray Balloon-Borne Telescope
NASA Technical Reports Server (NTRS)
Gaskin, Jessica; Apple, Jeff; StevensonChavis, Katherine; Dietz, Kurt; Holt, Marlon; Koehler, Heather; Lis, Tomasz; O'Connor, Brian; RodriquezOtero, Miguel; Pryor, Jonathan;
2013-01-01
Set to fly in the Fall of 2013 from Ft. Sumner, NM, the High Energy Replicated Optics to Explore the Sun (HEROES) mission is a collaborative effort between the NASA Marshall Space Flight Center and the Goddard Space Flight Center to upgrade an existing payload, the High Energy Replicated Optics (HERO) balloon-borne telescope, to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES science payload consists of 8 mirror modules, housing a total of 109 grazing-incidence optics. These modules are mounted on a carbon-fiber and aluminum optical bench 6 m from a matching array of high-pressure xenon gas scintillation proportional counters, which serve as the focal-plane detectors. The HERO gondola utilizes a differential GPS system (backed by a magnetometer) for coarse pointing in azimuth, and a shaft angle encoder plus inclinometer provides the coarse elevation. The HEROES payload will incorporate a new solar aspect system to supplement the existing star camera, for fine pointing during both the day and night. A mechanical shutter will be added to the star camera to protect it during solar observations. HEROES will also implement two novel alignment monitoring systems that will measure the alignment between the optical bench and the star camera and between the optics and detectors for improved pointing and post-flight data reconstruction. The overall payload will also be discussed. This mission is funded by the NASA HOPE (Hands On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.
Analysis of edge density fluctuation measured by trial KSTAR beam emission spectroscopy system
NASA Astrophysics Data System (ADS)
Nam, Y. U.; Zoletnik, S.; Lampert, M.; Kovácsik, Á.
2012-10-01
A beam emission spectroscopy (BES) system based on a direct-imaging avalanche photodiode (APD) camera has been designed for the Korea Superconducting Tokamak Advanced Research (KSTAR) device, and a trial system has been constructed and installed to evaluate the feasibility of the design. The system contains two cameras: an APD camera for the BES measurement and a fast visible camera for position calibration. Two pneumatically actuated mirrors are positioned in front of and behind the lens optics; the front mirror switches the measurement between the edge and core regions of the plasma, and the rear mirror switches between the APD and the visible camera. All systems worked properly, and the measured photon flux was reasonable, as expected from the simulation. While the measurement data from the trial system were limited, they revealed some interesting characteristics of KSTAR plasmas, suggesting future research with the fully installed BES system. The analysis results and the development plan will be presented in this paper.
Thermal infrared panoramic imaging sensor
NASA Astrophysics Data System (ADS)
Gutin, Mikhail; Tsui, Eddy K.; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey
2006-05-01
Panoramic cameras offer true real-time, 360-degree coverage of the surrounding area, valuable for a variety of defense and security applications, including force protection, asset protection, asset control, port security, perimeter security, video surveillance, border control, airport security, coastguard operations, search and rescue, intrusion detection, and many others. Automatic detection, location, and tracking of targets outside a protected area ensures maximum protection and at the same time reduces the workload on personnel, increases the reliability and confidence of target detection, and enables both man-in-the-loop and fully automated system operation. Thermal imaging provides the benefits of all-weather, 24-hour day/night operation with no downtime. In addition, thermal signatures of different target types facilitate better classification, beyond the limits set by the camera's spatial resolution. The useful range of catadioptric panoramic cameras is affected by their limited resolution; in many existing systems the resolution is optics-limited. Reflectors customarily used in catadioptric imagers introduce aberrations that may become significant at large camera apertures, such as those required in low-light and thermal imaging. Advantages of panoramic imagers with high image resolution include increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC), combines the strengths of improved, high-resolution panoramic optics with thermal imaging in the 8-14 micron spectral range, leveraged by intelligent video processing for automated detection, location, and tracking of moving targets. The work in progress supports the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to serve in a wide range of homeland security applications, as well as to serve the Army in tasks of improved situational awareness (SA) in defensive and offensive operations, and as a sensor node in tactical Intelligence, Surveillance, and Reconnaissance (ISR). The novel ViperView™ high-resolution panoramic thermal imager is the heart of the APTIS system. It features an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640×480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS system include network communications, advanced power management, and wakeup capability. Recent developments include image processing, optical design being expanded into the visible spectral range, and wireless communications design. This paper describes the development status of the APTIS system.
Quantifying the movement of multiple insects using an optical insect counter
USDA-ARS?s Scientific Manuscript database
An optical insect counter (OIC) was designed and tested. The new system integrated a line-scan camera and a vertical light sheet along with data collection and image processing software to count numbers of flying insects crossing a vertical plane defined by the light sheet. The system also allows ...
TH-AB-202-11: Spatial and Rotational Quality Assurance of 6DOF Patient Tracking Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belcher, AH; Liu, X; Grelewicz, Z
2016-06-15
Purpose: External tracking systems used for patient positioning and motion monitoring during radiotherapy are now capable of detecting both translations and rotations (6DOF). In this work, we develop a novel technique to evaluate the 6DOF performance of external motion tracking systems. We apply this methodology to an infrared (IR) marker tracking system and two 3D optical surface mapping systems in a common tumor 6DOF workspace. Methods: An in-house designed and built 6DOF parallel kinematics robotic motion phantom was used to follow input trajectories with sub-millimeter and sub-degree accuracy. The 6DOF positions of the robotic system were then tracked and recorded independently by three optical camera systems. A calibration methodology which associates the motion phantom and camera coordinate frames was first employed, followed by a comprehensive 6DOF trajectory evaluation, which spanned a full range of positions and orientations in a 20×20×16 mm and 5×5×5 degree workspace. The intended input motions were compared to the calibrated 6DOF measured points. Results: The technique found the accuracy of the IR marker tracking system to have maximal root mean square error (RMSE) values of 0.25 mm translationally and 0.09 degrees rotationally in any one axis, comparing intended 6DOF positions to positions measured by the IR camera. The 6DOF RMSE discrepancy for the first 3D optical surface tracking unit yielded maximal values of 0.60 mm and 0.11 degrees over the same 6DOF volume. An earlier generation 3D optical surface tracker was observed to have worse tracking capabilities than both the IR camera unit and the newer 3D surface tracking system, with maximal RMSE of 0.74 mm and 0.28 degrees within the same 6DOF evaluation space. Conclusion: The proposed technique was effective at evaluating the performance of 6DOF patient tracking systems. All systems examined exhibited tracking capabilities at the sub-millimeter and sub-degree level within a 6DOF workspace.
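The per-axis RMSE comparison of intended versus measured poses is simple to reproduce. A minimal sketch under the assumption of N recorded 6DOF samples per trajectory:

```python
import numpy as np

def per_axis_rmse(intended, measured):
    """Per-axis RMSE between intended and measured 6DOF poses.

    intended, measured: (N, 6) arrays of [x, y, z, rx, ry, rz]
    (mm and degrees). Returns a length-6 vector of RMSE values.
    """
    err = np.asarray(measured, float) - np.asarray(intended, float)
    return np.sqrt(np.mean(err**2, axis=0))
```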
Gamma Ray Burst Optical Counterpart Search Experiment (GROCSE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, H.S.; Ables, E.; Bionta, R.M.
GROCSE (Gamma-Ray Optical Counterpart Search Experiment) is a system of automated telescopes that search for simultaneous optical activity associated with gamma ray bursts in response to real-time burst notifications provided by the BATSE/BACODINE network. The first generation system, GROCSE 1, is sensitive down to Mv ≈ 8.5 and requires an average of 12 seconds to obtain the first images of the gamma ray burst error box defined by the BACODINE trigger. The collaboration is now constructing a second generation system which has a 4 second slewing time and can reach Mv ≈ 14 with a 5 second exposure. GROCSE 2 consists of 4 cameras on a single mount. Each camera views the night sky through a commercial Canon lens (f/1.8, focal length 200 mm) and utilizes a 2K x 2K Loral CCD. Lightweight and low-noise custom readout electronics were designed and fabricated for these CCDs. The total field of view of the 4 cameras is 17.6 x 17.6 degrees. GROCSE 2 will be operational by the end of 1995. In this paper, the authors present an overview of the GROCSE system and the results of measurements with a GROCSE 2 prototype unit.
Matovic, Milovan; Jankovic, Milica; Barjaktarovic, Marko; Jeremic, Marija
2017-01-01
After radioiodine therapy of differentiated thyroid cancer (DTC) patients, whole body scintigraphy (WBS) is a standard procedure before releasing the patient from the hospital. A common problem is the precise localization of regions where iodine-avid tissue is located; sometimes precise topographic localization of such regions is practically impossible. To address this problem, we developed a low-cost Vision-Fusion system for web-camera image acquisition simultaneous with routine scintigraphic whole-body acquisition, including an algorithm for fusing the images from both cameras. For image acquisition in the gamma part of the spectrum we used an e.cam dual-head gamma camera (Siemens, Erlangen, Germany) in WBS modality, with a matrix size of 256×1024 pixels and a bed speed of 6 cm/min, equipped with a high-energy collimator. For optical image acquisition in the visible part of the spectrum we used a web camera, model C905 (Logitech, USA), with Carl Zeiss® optics, a native resolution of 1600×1200 pixels, a 34° field of view, and 30 g weight, with the autofocus option turned "off" and auto white balance turned "on". The web camera is connected to the upper head of the gamma camera (GC) by a holder consisting of a lightweight aluminum rod and a plexiglass adapter. Our own Vision-Fusion software for image acquisition and coregistration was developed in the NI LabVIEW 2015 programming environment (National Instruments, Texas, USA) with two additional LabVIEW modules: NI Vision Acquisition Software (VAS) and NI Vision Development Module (VDM). The Vision Acquisition Software enables communication and control between the laptop computer and the web camera; the Vision Development Module is an image processing library used for image preprocessing and fusion. The software starts web-camera image acquisition before image acquisition starts on the GC and stops it when the GC completes its acquisition. The web camera runs in continuous acquisition mode with a frame rate f depending on the speed v of the patient bed (f = v/Δ_cm, where Δ_cm is a displacement step that can be changed in the Settings option of the Vision-Fusion software; by default, Δ_cm is set to 1 cm, corresponding to Δ_p = 15 pixels). All images captured while the patient's bed is moving are processed, and movement of the bed is verified using the cross-correlation of two successive images. After each image capture, the algorithm extracts the central region of interest (ROI) of the image, with the same width as the captured image (1600 pixels) and a height equal to the displacement Δ_p in pixels. All extracted central ROIs are placed next to each other in the overall whole-body image; stacking of narrow central ROIs introduces negligible distortion in the overall whole-body image. The first step in fusing the scintigram and the optical image was determining the spatial transformation between them. We performed an experiment with two markers (point sources of 1 MBq 99mTc pertechnetate) visible in both images (WBS and optical) to find the coordinate transformation between the images; the distance between the point markers is used for spatial coregistration of the gamma and optical images. At the end of the coregistration process, the gamma image is rescaled in the spatial domain and added to the optical image (green or red channel, with amplification changeable from the user interface). We tested our system on 10 patients with DTC who received radioiodine therapy (8 women and 2 men, average age 50.10±12.26 years). Five patients received 5.55 GBq, three 3.70 GBq, and two 1.85 GBq.
Whole-body scintigraphy and optical image acquisition were performed 72 hours after administration of the radioiodine therapy. Based on our first results during clinical testing, we conclude that the system can improve the diagnostic capability of whole-body scintigraphy to detect thyroid remnant tissue in patients with DTC after radioiodine therapy.
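The two-marker coregistration step can be sketched as a scale-plus-translation warp of the gamma image onto the optical frame, followed by addition into one color channel. A minimal illustration assuming (x, y) pixel coordinates and negligible rotation; the names and gain value are hypothetical.

```python
import numpy as np
import cv2

def fuse_gamma_optical(gamma, optical_rgb, m_gamma, m_optical, gain=0.7):
    """Fuse a rescaled scintigram into the green channel of an optical image.

    m_gamma, m_optical: (2, 2) arrays of (x, y) pixel coordinates of the
    same two point markers in each image. Marker spacing fixes the scale,
    the first marker fixes the translation; rotation assumed negligible.
    """
    m_gamma = np.asarray(m_gamma, dtype=np.float64)
    m_optical = np.asarray(m_optical, dtype=np.float64)
    scale = (np.linalg.norm(m_optical[1] - m_optical[0]) /
             np.linalg.norm(m_gamma[1] - m_gamma[0]))
    tx, ty = m_optical[0] - scale * m_gamma[0]
    M = np.float32([[scale, 0, tx], [0, scale, ty]])
    h, w = optical_rgb.shape[:2]
    warped = cv2.warpAffine(gamma.astype(np.float32), M, (w, h))
    fused = optical_rgb.astype(np.float32)
    fused[..., 1] += gain * 255.0 * warped / max(float(warped.max()), 1e-6)
    return np.clip(fused, 0, 255).astype(np.uint8)
```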
Optimal design of an earth observation optical system with dual spectral and high resolution
NASA Astrophysics Data System (ADS)
Yan, Pei-pei; Jiang, Kai; Liu, Kai; Duan, Jing; Shan, Qiusha
2017-02-01
With the increasing demand for high-resolution remote sensing images from military and civilian users, countries around the world are optimistic about the prospects of higher-resolution remote sensing imagery. Moreover, designing a visible/infrared integrated optical system has important value for earth observation: because a visible system cannot identify camouflage or observe at night, the visible camera should be combined with an infrared camera. An earth observation optical system with dual spectral bands and high resolution is designed. This paper mainly investigates the integrated design of the visible and infrared optical system, which makes the system lighter and smaller and achieves one satellite with two uses. The working waveband of the system covers the visible and the middle infrared (3-5 um). Clear dual-waveband imaging is achieved with a dispersive RC system. The focal length of the visible system is 3056 mm with an F/# of 10.91, and the focal length of the middle infrared system is 1120 mm with an F/# of 4. In order to suppress middle-infrared thermal radiation and stray light, a second imaging stage is adopted and the narcissus phenomenon is analyzed. The system is characterized by a simple structure, and the special requirements on the Modulation Transfer Function (MTF), spot size, energy concentration, distortion, etc. are all satisfied.
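As a consistency check on the quoted optics, the entrance-pupil diameter D = f/F# comes out the same for both channels, which is what one would expect from an integrated, shared-aperture design:

```latex
D_{\mathrm{vis}} = \frac{3056\ \mathrm{mm}}{10.91} \approx 280\ \mathrm{mm},
\qquad
D_{\mathrm{IR}} = \frac{1120\ \mathrm{mm}}{4} = 280\ \mathrm{mm}.
```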
Iterative Refinement of Transmission Map for Stereo Image Defogging Using a Dual Camera Sensor.
Kim, Heegwang; Park, Jinho; Park, Hasil; Paik, Joonki
2017-12-09
Recently, the stereo imaging-based image enhancement approach has attracted increasing attention in the field of video analysis. This paper presents a dual camera-based stereo image defogging algorithm. Optical flow is first estimated from the stereo foggy image pair, and the initial disparity map is generated from the estimated optical flow. Next, an initial transmission map is generated using the initial disparity map. Atmospheric light is then estimated using the color line theory. The defogged result is finally reconstructed using the estimated transmission map and atmospheric light. The proposed method can refine the transmission map iteratively. Experimental results show that the proposed method can successfully remove fog without color distortion. The proposed method can be used as a pre-processing step for an outdoor video analysis system and a high-end smartphone with a dual camera system.
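Once the transmission map t and atmospheric light A are estimated, the defogged radiance follows from inverting the standard atmospheric scattering model I = J·t + A·(1 - t). A minimal sketch follows; the t_min floor and the disparity-to-transmission comment reflect common conventions, not the authors' exact parameters.

```python
import numpy as np

def defog(I, t, A, t_min=0.1):
    """Recover scene radiance J from the atmospheric scattering model
    I = J * t + A * (1 - t), given transmission map t and airlight A.

    I: (H, W, 3) foggy image; t: (H, W) transmission in [0, 1];
    A: scalar or length-3 airlight. t is floored to avoid amplifying noise.
    """
    t = np.clip(t, t_min, 1.0)[..., None]
    return (I.astype(np.float32) - A) / t + A

# A typical transmission map from stereo disparity d (illustrative
# constants): depth z ~ f * B / d for focal length f and baseline B,
# then t = exp(-beta * z) for a scattering coefficient beta.
```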
Study of a quasi-microscope design for planetary landers
NASA Technical Reports Server (NTRS)
Giat, O.; Brown, E. B.
1973-01-01
The Viking Lander facsimile camera, in its present form, provides for a minimum object distance of 1.9 meters, at which distance its angular resolution of 0.0007 radian provides an object resolution of 1.33 millimeters. It was deemed desirable, especially for follow-on Viking missions, to provide means for examining Martian terrain at resolutions considerably higher than now provided. This led to the concept of the quasi-microscope, an attachment to be used in conjunction with the facsimile camera to convert it to a low-power microscope. The results are reported of an investigation to consider alternative optical configurations for the quasi-microscope and to develop optical designs for the selected system or systems. Initial requirements included consideration of object resolutions in the range of 2 to 50 micrometers, an available field of view of the order of 500 pixels, and no significant modifications to the facsimile camera.
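The quoted object resolution follows directly from the angular resolution and the minimum object distance:

```latex
\Delta x = \theta\, d = 0.0007\ \mathrm{rad} \times 1.9\ \mathrm{m} \approx 1.33\ \mathrm{mm}.
```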
Li, Jin; Liu, Zilong; Liu, Si
2017-02-20
In the on-board photographing process of satellite cameras, platform vibration can generate image motion, distortion, and smear, which seriously affect image quality and image positioning. In this paper, we create a mathematical model of the vibration modulation transfer function (VMTF) for a remote-sensing camera. The total MTF of the camera is reduced by the VMTF, which means the image quality is degraded. In order to avoid this degradation of the total MTF caused by vibrations, we use an Mn-20Cu-5Ni-2Fe (M2052) manganese-copper alloy to fabricate a vibration-isolation mechanism (VIM). The VIM transforms platform vibration energy into irreversible thermal energy through its internal twin-crystal structure. Our experiment shows the M2052 manganese-copper alloy is good enough to suppress image motion below 125 Hz, which is the vibration frequency of satellite platforms. The camera optical system has a higher MTF after suppressing the vibration with the M2052 material than before.
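For context, the standard textbook motion-blur MTF models (not necessarily the paper's VMTF derivation) are, for a linear smear of length a during the exposure, and for sinusoidal vibration of amplitude D on the focal plane with an exposure long compared to the vibration period:

```latex
\mathrm{MTF}_{\mathrm{linear}}(\nu) = \left|\frac{\sin(\pi a \nu)}{\pi a \nu}\right|,
\qquad
\mathrm{MTF}_{\mathrm{sin}}(\nu) = \left|J_0(2\pi D \nu)\right|,
```

where ν is spatial frequency and J0 is the zeroth-order Bessel function; the camera's total MTF is the static optical MTF multiplied by this vibration factor.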
Wavefront measurement of plastic lenses for mobile-phone applications
NASA Astrophysics Data System (ADS)
Huang, Li-Ting; Cheng, Yuan-Chieh; Wang, Chung-Yen; Wang, Pei-Jen
2016-08-01
In camera lenses for mobile-phone applications, all lens elements have been designed with aspheric surfaces because of the requirement for minimal total track length of the lenses. Due to the diffraction-limited optical design and precision assembly procedures, element inspection and lens performance measurement have become cumbersome in the production of mobile-phone cameras. Recently, wavefront measurements based on Shack-Hartmann sensors have been successfully implemented on injection-molded plastic lenses with aspheric surfaces. However, the application of wavefront measurement to small-sized plastic lenses has yet to be studied both theoretically and experimentally. In this paper, both an in-house-built and a commercial wavefront measurement system, configured on two optics structures, have been investigated by measuring wavefront aberrations of two lens elements from a mobile-phone camera. First, the wet-cell method was employed to verify aberrations due to residual birefringence in an injection-molded lens. Then, two lens elements of a mobile-phone camera, with large positive and negative power, were measured with aberrations expressed in Zernike polynomials to illustrate the effectiveness of wavefront measurement for troubleshooting defects in optical performance.
Multi-MGy Radiation Hardened Camera for Nuclear Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef
There is increasing interest in developing cameras for surveillance systems that monitor nuclear facilities or nuclear waste storage. In particular, for today's and the next generation of nuclear facilities, increased safety requirements following the Fukushima Daiichi disaster have to be considered. For some applications, radiation tolerance needs to reach doses in the MGy(SiO{sub 2}) range, whereas the most tolerant commercial or prototype products based on solid-state image sensors withstand doses of only a few kGy. The objective of this work is to present the radiation-hardening strategy developed by our research groups to enhance the tolerance of the various subparts of these imaging systems to ionizing radiation by working simultaneously at the component and system design levels. Developing a radiation-hardened camera requires combining several radiation-hardening strategies. In our case, we decided not to use the simplest one, the shielding approach. This approach is efficient but limits camera miniaturization and is not compatible with future integration in remote-handling or robotic systems. The hardening-by-component strategy is therefore mandatory to avoid the failure of one of the camera subparts at doses lower than the MGy. Concerning the image sensor itself, the chosen technology is a CMOS Image Sensor (CIS) designed by the ISAE team with custom pixel designs that mitigate the total ionizing dose (TID) effects occurring well below the MGy range in classical image sensors (e.g. Charge Coupled Devices (CCD), Charge Injection Devices (CID) and classical Active Pixel Sensors (APS)), such as the complete loss of functionality, the dark current increase, and the gain drop. We will present at the conference a comparative study of the radiation response of these radiation-hardened pixels with respect to conventional ones, demonstrating the efficiency of the choices made. The targeted strategy to develop the complete radiation-hard camera electronics will be exposed. Another important element of the camera is the optical system that transports the image from the scene to the image sensor. This arrangement of glass-based lenses is affected by radiation through two mechanisms: radiation-induced absorption and radiation-induced refractive index changes. The first limits the signal-to-noise ratio of the image, whereas the second directly affects the resolution of the camera. We will present at the conference a coupled simulation/experiment study of these effects for various commercial glasses, together with a vulnerability study of typical optical systems at MGy doses. The last very important part of the camera is the illumination system, which can be based on various technologies of emitting devices such as LEDs, SLEDs, or lasers. The most promising solutions for high radiation doses will be presented at the conference. In addition to this hardening-by-component approach, the global radiation tolerance of the camera can be drastically improved by working at the system level, combining innovative approaches, e.g. for the optical and illumination systems. We will present at the conference the developed approach that extends the camera lifetime up to the MGy dose range. (authors)
Time-Of-Flight Camera, Optical Tracker and Computed Tomography in Pairwise Data Registration
Badura, Pawel; Juszczyk, Jan; Pietka, Ewa
2016-01-01
Purpose: A growing number of medical applications, including minimally invasive surgery, depend on multi-modal or multi-sensor data processing. Fast and accurate 3D scene analysis, comprising data registration, seems to be crucial for the development of computer-aided diagnosis and therapy. The advancement of surface tracking systems based on optical trackers already plays an important role in surgical procedure planning. However, new modalities, like time-of-flight (ToF) sensors, widely explored in non-medical fields, are powerful and have the potential to become a part of computer-aided surgery set-ups. Connecting different acquisition systems promises to provide valuable support for operating room procedures. Therefore, a detailed analysis of the accuracy of such multi-sensor positioning systems is needed. Methods: We present a system combining pre-operative CT series with intra-operative ToF-sensor and optical tracker point clouds. The methodology contains: optical sensor set-up and ToF-camera calibration procedures, data pre-processing algorithms, and a registration technique. The data pre-processing yields a surface, in the case of CT, and point clouds for the ToF-sensor and marker-driven optical tracker representations of an object of interest. The applied registration technique is based on the Iterative Closest Point algorithm. Results: The experiments validate the registration of each pair of modalities/sensors involving phantoms of four various human organs in terms of the Hausdorff distance and mean absolute distance metrics. The best surface alignment was obtained for the CT and optical tracker combination, whereas the worst was for experiments involving the ToF-camera. Conclusion: The obtained accuracies encourage further development of multi-sensor systems. The presented substantive discussion concerning the system limitations and possible improvements, mainly related to the depth information produced by the ToF-sensor, is useful for computer-aided surgery developers. PMID:27434396
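The two reported surface-alignment metrics are straightforward to reproduce for registered point clouds. A small sketch using SciPy; `cloud_a` and `cloud_b` stand for any registered modality pair and are not names from the paper:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.distance import directed_hausdorff

def registration_metrics(cloud_a, cloud_b):
    """Hausdorff and mean absolute (nearest-neighbour) distance between two
    registered point clouds, each given as an (N, 3) array."""
    hausdorff = max(directed_hausdorff(cloud_a, cloud_b)[0],
                    directed_hausdorff(cloud_b, cloud_a)[0])
    nn_dist, _ = cKDTree(cloud_b).query(cloud_a)  # nearest point in B for each point in A
    return hausdorff, nn_dist.mean()
```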
NASA Astrophysics Data System (ADS)
Scaduto, Lucimara C. N.; Malavolta, Alexandre T.; Modugno, Rodrigo G.; Vales, Luiz F.; Carvalho, Erica G.; Evangelista, Sérgio; Stefani, Mario A.; de Castro Neto, Jarbas C.
2017-11-01
The first Brazilian remote sensing multispectral camera (MUX) is currently under development at Opto Eletronica S.A. It consists of a four-spectral-band sensor covering the 450 nm to 890 nm wavelength range. This camera will provide images with a 20 m ground resolution at nadir. The MUX camera is part of the payload of the upcoming Sino-Brazilian satellites CBERS 3&4 (China-Brazil Earth Resource Satellite). The preliminary alignment between the optical system and the CCD sensor, located at the focal plane assembly, was obtained in air, in a clean-room environment. A collimator was used for the performance evaluation of the camera. The preliminary performance evaluation of the optical channel was carried out by compensating the collimator focus position for changes in the test environment, since an air-to-vacuum transition defocuses this camera. It is therefore necessary to confirm that the alignment of the camera ensures that its best performance is reached under orbital vacuum conditions. For this reason, and as a further step in the development process, the MUX camera Qualification Model was tested and evaluated inside a thermo-vacuum chamber and submitted to an as-orbit vacuum environment. In this study, the influence of temperature fields was neglected. This paper reports on the performance evaluation and discusses the results for this camera when operating under the mentioned test conditions. The overall optical tests and results show that the "in air" adjustment method was suitable as a critical activity to guarantee that the equipment meets its design requirements.
An inexpensive programmable illumination microscope with active feedback.
Tompkins, Nathan; Fraden, Seth
2016-02-01
We have developed a programmable illumination system capable of tracking and illuminating numerous objects simultaneously using only low-cost and reused optical components. The active feedback control software allows for a closed-loop system that tracks and perturbs objects of interest automatically. Our system uses a static stage where the objects of interest are tracked computationally as they move across the field of view allowing for a large number of simultaneous experiments. An algorithmically determined illumination pattern can be applied anywhere in the field of view with simultaneous imaging and perturbation using different colors of light to enable spatially and temporally structured illumination. Our system consists of a consumer projector, camera, 35-mm camera lens, and a small number of other optical and scaffolding components. The entire apparatus can be assembled for under $4,000.
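A minimal version of such a camera-projector feedback loop fits in a few lines of OpenCV. This sketch is an assumption-laden illustration, not the authors' software: it omits the camera-to-projector coordinate mapping a real rig needs, and the threshold and spot radius are arbitrary:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # camera watching the sample stage
cv2.namedWindow("projector", cv2.WINDOW_NORMAL)
cv2.setWindowProperty("projector", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    pattern = np.zeros_like(frame)              # illumination pattern for the projector
    for c in contours:                          # one illumination spot per tracked object
        m = cv2.moments(c)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(pattern, (cx, cy), 20, (0, 0, 255), -1)

    cv2.imshow("projector", pattern)            # camera->projector homography omitted
    if cv2.waitKey(30) == 27:                   # Esc to quit
        break
```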
ERIC Educational Resources Information Center
Milshtein, Amy
1997-01-01
The University of Maryland at College Park installed 25 surveillance cameras to combat crime. A minimum of disruption occurred because unused twisted pair wires left in place when the conversion to a fiber optic telephone system was made could be used for the camera installations. The campus is safer, and its budget is intact. (RE)
Tracking a Head-Mounted Display in a Room-Sized Environment with Head-Mounted Cameras
1990-04-01
poor resolution and a very limited working volume [Wan90]. OPTOTRAK [Nor88] uses one camera with two dual-axis CCD infrared position sensors. Each... [Nor88] Northern Digital, trade literature on Optotrak - Northern Digital's Three Dimensional Optical Motion Tracking and Analysis System.
Optical design of the SuMIRe/PFS spectrograph
NASA Astrophysics Data System (ADS)
Pascal, Sandrine; Vives, Sébastien; Barkhouser, Robert; Gunn, James E.
2014-07-01
The SuMIRe Prime Focus Spectrograph (PFS), developed for the 8-m class SUBARU telescope, will consist of four identical spectrographs, each receiving 600 fibers from a 2394-fiber robotic positioner at the telescope prime focus. Each spectrograph includes three spectral channels to cover the wavelength range [0.38-1.26] um with a resolving power ranging between 2000 and 4000. A medium-resolution mode is also implemented to reach a resolving power of 5000 at 0.8 um. Each spectrograph is made of 4 optical units: the entrance unit, which produces three corrected collimated beams, and three camera units (one per spectral channel: "blue", "red", and "NIR"). The beam is split using two large dichroics, and in each arm the light is dispersed by large VPH gratings (about 280 x 280 mm). The proposed optical design was optimized to achieve the requested image quality while simplifying the manufacturing of the whole optical system. The camera design consists of an innovative Schmidt camera observing a large field of view (10 degrees) with a very fast beam (F/1.09). To achieve such performance, the classical spherical mirror is replaced by a catadioptric mirror (i.e. a meniscus lens with a reflective surface on the rear side of the glass, like a Mangin mirror). This article focuses on the optical architecture of the PFS spectrograph and the performance achieved. We will first describe the global optical design of the spectrograph. Then, we will focus on the Mangin-Schmidt camera design. The analysis of the optical performance and the results obtained are presented in the last section.
NASA Astrophysics Data System (ADS)
Su, Peng; Khreishi, Manal A. H.; Su, Tianquan; Huang, Run; Dominguez, Margaret Z.; Maldonado, Alejandro; Butel, Guillaume; Wang, Yuhao; Parks, Robert E.; Burge, James H.
2014-03-01
A software configurable optical test system (SCOTS) based on deflectometry was developed at the University of Arizona for rapidly, robustly, and accurately measuring precision aspheric and freeform surfaces. SCOTS uses a camera with an external stop to realize a Hartmann test in reverse. With the external camera stop as the reference, a coordinate measuring machine can be used to calibrate the SCOTS test geometry to a high accuracy. Systematic errors from the camera are carefully investigated and controlled. Camera pupil imaging aberration is removed with the external aperture stop. Imaging aberration and other inherent errors are suppressed with an N-rotation test. The performance of the SCOTS test is demonstrated with the measurement results from a 5-m-diameter Large Synoptic Survey Telescope tertiary mirror and an 8.4-m diameter Giant Magellan Telescope primary mirror. The results show that SCOTS can be used as a large-dynamic-range, high-precision, and non-null test method for precision aspheric and freeform surfaces. The SCOTS test can achieve measurement accuracy comparable to traditional interferometric tests.
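In deflectometry of this kind, each camera pixel is traced to the screen pixel it sees after reflection, and surface slopes then follow from the law of reflection. A linearized small-slope sketch under assumed flat-geometry conditions, not the calibrated SCOTS geometry described in the paper:

```python
import numpy as np

def surface_slopes(x_m, y_m, x_scr, y_scr, cam_xy, z_m2s, z_m2c):
    """Small-slope deflectometry slope estimate at mirror points (x_m, y_m).

    (x_scr, y_scr) : screen point seen by each camera pixel after reflection
    cam_xy         : (x, y) of the external camera stop
    z_m2s, z_m2c   : mirror-to-screen and mirror-to-camera distances
    """
    # Linearized law of reflection for a nearly flat surface at z = 0:
    #   2 * dw/dx ~ (x_m - x_scr)/z_m2s + (x_m - x_cam)/z_m2c
    sx = 0.5 * ((x_m - x_scr) / z_m2s + (x_m - cam_xy[0]) / z_m2c)
    sy = 0.5 * ((y_m - y_scr) / z_m2s + (y_m - cam_xy[1]) / z_m2c)
    return sx, sy   # integrate (e.g. Southwell zonal) to obtain the surface map
```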
Progress in passive submillimeter-wave video imaging
NASA Astrophysics Data System (ADS)
Heinz, Erik; May, Torsten; Born, Detlef; Zieger, Gabriel; Peiselt, Katja; Zakosarenko, Vyacheslav; Krause, Torsten; Krüger, André; Schulz, Marco; Bauer, Frank; Meyer, Hans-Georg
2014-06-01
Since 2007 we have been developing passive submillimeter-wave video cameras for personal security screening. In contrast to established portal-based millimeter-wave scanning techniques, these are suitable for stand-off or stealth operation. The cameras operate in the 350 GHz band and use arrays of superconducting transition-edge sensors (TES), reflector optics, and opto-mechanical scanners. Whereas the basic principle of these devices remains unchanged, there has been continuous development of the technical details, such as the detector array, the scanning scheme, and the readout, as well as system integration and performance. The latest prototype of this camera development features a linear array of 128 detectors and a linear scanner capable of a 25 Hz frame rate. Using different types of reflector optics, a field of view of 1 m x 2 m and a spatial resolution of 1-2 cm is provided at object distances of about 5-25 m. We present the concept of this camera and give details on system design and performance. Demonstration videos show its capability for hidden threat detection and illustrate possible application scenarios.
NASA Astrophysics Data System (ADS)
Zhang, Bing; Li, Kunyang
2018-02-01
The “Breakthrough Starshot” aims at sending near-speed-of-light cameras to nearby stellar systems in the future. Due to the relativistic effects, a transrelativistic camera naturally serves as a spectrograph, a lens, and a wide-field camera. We demonstrate this through a simulation of the optical-band image of the nearby galaxy M51 in the rest frame of the transrelativistic camera. We suggest that observing celestial objects using a transrelativistic camera may allow one to study the astronomical objects in a special way, and to perform unique tests on the principles of special relativity. We outline several examples that suggest transrelativistic cameras may make important contributions to astrophysics and suggest that the Breakthrough Starshot cameras may be launched in any direction to serve as a unique astronomical observatory.
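The spectrograph, lens, and wide-field effects all follow from relativistic aberration and Doppler boosting. A brief sketch of both formulas, with `beta` the camera speed in units of c (Starshot targets roughly 0.2):

```python
import numpy as np

def aberrated_angle(theta, beta):
    """Aberration: source-frame angle theta (rad, from the motion direction)
    maps to a smaller camera-frame angle, compressing the sky forward."""
    return np.arccos((np.cos(theta) + beta) / (1.0 + beta * np.cos(theta)))

def observed_frequency(nu_rest, theta, beta):
    """Doppler shift of light arriving from source-frame angle theta:
    blueshifted ahead of the camera, redshifted behind it."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    return nu_rest * gamma * (1.0 + beta * np.cos(theta))

print(np.rad2deg(aberrated_angle(np.deg2rad(90.0), 0.2)))  # ~78.5 deg: forward compression
```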
Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.
Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D
2016-12-01
Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.
Digital micromirror device camera with per-pixel coded exposure for high dynamic range imaging.
Feng, Wei; Zhang, Fumin; Wang, Weijing; Xing, Wei; Qu, Xinghua
2017-05-01
In this paper, we overcome the limited dynamic range of the conventional digital camera, and propose a method of realizing high dynamic range imaging (HDRI) from a novel programmable imaging system called a digital micromirror device (DMD) camera. The unique feature of the proposed new method is that the spatial and temporal information of incident light in our DMD camera can be flexibly modulated, and it enables the camera pixels always to have reasonable exposure intensity by DMD pixel-level modulation. More importantly, it allows different light intensity control algorithms used in our programmable imaging system to achieve HDRI. We implement the optical system prototype, analyze the theory of per-pixel coded exposure for HDRI, and put forward an adaptive light intensity control algorithm to effectively modulate the different light intensity to recover high dynamic range images. Via experiments, we demonstrate the effectiveness of our method and implement the HDRI on different objects.
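The per-pixel coded-exposure idea can be sketched as dividing the raw image by the DMD duty cycle and feeding saturation statistics back into the next mask. The control rule below is a plausible stand-in for illustration, not the paper's adaptive light intensity control algorithm:

```python
import numpy as np

def recover_radiance(raw, dmd_duty):
    """Relative scene radiance from a DMD-modulated exposure.

    raw      : sensor image (DN) captured through the per-pixel DMD mask
    dmd_duty : fraction of the frame time each mirror was 'on' (0..1]
    """
    return raw / np.maximum(dmd_duty, 1e-3)

def next_duty(raw, dmd_duty, lo=0.1, hi=0.9, full_scale=4095):
    """One feedback step: dim pixels heading for saturation, boost dark ones."""
    duty = dmd_duty.copy()
    duty[raw > hi * full_scale] *= 0.5       # halve exposure of bright pixels
    duty[raw < lo * full_scale] *= 2.0       # double exposure of dark pixels
    return np.clip(duty, 1.0 / 256, 1.0)     # assumed DMD duty-cycle resolution
```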
Performance and Calibration of H2RG Detectors and SIDECAR ASICs for the RATIR Camera
NASA Technical Reports Server (NTRS)
Fox, Ori D.; Kutyrev, Alexander S.; Rapchun, David A.; Klein, Christopher R.; Butler, Nathaniel R.; Bloom, Josh; de Diego, José A.; Simón Farah, Alejandro D.; Gehrels, Neil A.; Georgiev, Leonid;
2012-01-01
The Reionization And Transient Infra-Red (RATIR) camera has been built for rapid Gamma-Ray Burst (GRB) follow-up and will provide simultaneous optical and infrared photometric capabilities. The infrared portion of this camera incorporates two Teledyne HgCdTe HAWAII-2RG detectors, controlled by Teledyne's SIDECAR ASICs. While other ground-based systems have used the SIDECAR before, this system also utilizes Teledyne's JADE2 interface card and IDE development environment. Together, this setup comprises Teledyne's Development Kit, which is a bundled solution that can be efficiently integrated into future ground-based systems. In this presentation, we characterize the system's read noise, dark current, and conversion gain.
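Conversion gain is conventionally characterized with the mean-variance (photon transfer) method; a compact sketch assuming pairs of flat-field and bias frames, which is a standard approach rather than the authors' exact procedure:

```python
import numpy as np

def conversion_gain(flat1, flat2, bias1, bias2):
    """Photon-transfer estimate of conversion gain K in e-/DN.

    flat1, flat2 : two flat fields at identical illumination and exposure
    bias1, bias2 : two zero-exposure bias frames
    """
    signal = 0.5 * (flat1 + flat2) - 0.5 * (bias1 + bias2)   # mean signal in DN
    # Differencing matched frames cancels fixed-pattern noise; the variance of
    # the difference is twice the temporal variance of a single frame.
    shot_plus_read = 0.5 * np.var(flat1.astype(float) - flat2)
    read_var = 0.5 * np.var(bias1.astype(float) - bias2)
    return signal.mean() / (shot_plus_read - read_var)       # Poisson statistics
```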
OPALS: A COTS-based Tech Demo of Optical Communications
NASA Technical Reports Server (NTRS)
Oaida, Bogdan
2012-01-01
I. Objective: deliver video from the ISS to an optical ground terminal via an optical communications link. (a) JPL Phaeton/Early Career Hire (ECH) training project; (b) implemented as a Class-D payload; (c) downlink at approx. 30 Mb/s. II. Flight system: (a) optical head: beacon acquisition camera, downlink transmitter, 2-axis gimbal; (b) sealed container: laser, avionics, power distribution, digital I/O board. III. Implementation: (a) ground station: Optical Communications Telescope Laboratory at Table Mountain Facility; (b) flight system mounted to ISS FRAM as standard I/F, attached externally on the Express Logistics Carrier.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Evan; Goodale, Wing; Burns, Steve
There is a critical need to develop monitoring tools to track aerofauna (birds and bats) in three dimensions around wind turbines. New monitoring systems will reduce permitting uncertainty by increasing the understanding of how birds and bats are interacting with wind turbines, which will improve the accuracy of impact predictions. Biodiversity Research Institute (BRI), The University of Maine Orono School of Computing and Information Science (UMaine SCIS), HiDef Aerial Surveying Limited (HiDef), and SunEdison, Inc. (formerly First Wind) responded to this need by using stereo-optic cameras with near-infrared (nIR) technology to investigate new methods for documenting aerofauna behavior around wind turbines. The stereo-optic camera system used two synchronized high-definition video cameras with fisheye lenses and processing software that detected moving objects, which could be identified in post-processing. The stereo-optic imaging system offered the ability to extract 3-D position information from pairs of images captured from different viewpoints. Fisheye lenses allowed for a greater field of view, but required more complex image rectification to contend with fisheye distortion. The ability to obtain 3-D positions provided crucial data on the trajectory (speed and direction) of a target, which, when the technology is fully developed, will provide data on how animals are responding to and interacting with wind turbines. This project was focused on testing the performance of the camera system, improving video review processing time, advancing the 3-D tracking technology, and moving the system from Technology Readiness Level 4 to 5. To achieve these objectives, we determined the size and distance at which aerofauna (particularly eagles) could be detected and identified, created efficient data management systems, improved the video post-processing viewer, and attempted refinement of 3-D modeling with respect to fisheye lenses. The 29-megapixel camera system successfully captured 16,173 five-minute video segments in the field. During nighttime field trials using nIR, we found that bat-sized objects could not be detected more than 60 m from the camera system. This led to a decision to focus research efforts exclusively on daytime monitoring and to redirect resources towards improving the video post-processing viewer. We redesigned the bird event post-processing viewer, which substantially decreased the review time necessary to detect and identify flying objects. During daytime field trials, we determined that eagles could be detected up to 500 m away using the fisheye wide-angle lenses, and eagle-sized targets could be identified to species within 350 m of the camera system. We used distance sampling survey methods to describe the probability of detecting and identifying eagles and other aerofauna as a function of distance from the system. The previously developed 3-D algorithm for object isolation and tracking was tested, but the image rectification (flattening) required to obtain accurate distance measurements with fish-eye lenses was determined to be insufficient for distant eagles. We used MATLAB and OpenCV to improve fisheye lens rectification towards the center of the image, but accurate measurements towards the image corners could not be achieved. We believe that changing the fisheye lens to a rectilinear lens would greatly improve position estimation, but doing so would result in a decrease in viewing angle and depth of field.
Finally, we generated simplified shape profiles of birds to look for similarities between unknown animals and known species. With further development, this method could provide a mechanism for filtering large numbers of shapes to reduce data storage and processing. These advancements further refined the camera system and brought this new technology closer to market. Once commercialized, the stereo-optic camera system technology could be used to: a) research how different species interact with wind turbines in order to refine collision risk models and inform mitigation solutions; and b) monitor aerofauna interactions with terrestrial and offshore wind farms, replacing costly human observers and allowing for long-term monitoring in the offshore environment. The camera system will provide developers and regulators with data on the risk that wind turbines present to aerofauna, which will reduce uncertainty in the environmental permitting process.
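For reference, fisheye rectification of the kind attempted here is available in OpenCV's fisheye module; the intrinsics, distortion coefficients, frame size, and file name below are placeholders, and the `balance` parameter expresses exactly the coverage-versus-stretching trade-off noted above:

```python
import cv2
import numpy as np

# Placeholder intrinsics K and fisheye coefficients D (k1..k4), as produced by
# cv2.fisheye.calibrate from checkerboard views; not values from this project.
K = np.array([[1600.0, 0.0, 3288.0],
              [0.0, 1600.0, 2192.0],
              [0.0, 0.0, 1.0]])
D = np.array([[-0.05], [0.01], [-0.002], [0.0003]])
size = (6576, 4384)

# balance=0.0 favours an undistorted centre at the cost of cropping; larger
# values keep more of the fisheye field but stretch the corners.
new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
    K, D, size, np.eye(3), balance=0.0)
map1, map2 = cv2.fisheye.initUndistortRectifyMap(
    K, D, np.eye(3), new_K, size, cv2.CV_16SC2)

frame = cv2.imread("eagle_frame.png")       # hypothetical captured frame
rectified = cv2.remap(frame, map1, map2, cv2.INTER_LINEAR)
```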
NASA Astrophysics Data System (ADS)
Grasser, R.; Peyronneaudi, Benjamin; Yon, Kevin; Aubry, Marie
2015-10-01
CILAS, a subsidiary of Airbus Defense and Space, develops, manufactures, and sells laser-based optronics equipment for defense and homeland security applications. Part of its activity is related to active systems for threat detection, recognition, and identification. Active surveillance and active imaging systems are often required to achieve identification capability for long-range observation in adverse conditions. To ease the deployment of active imaging systems, which are often complex and expensive, CILAS proposes a new concept. It consists of the association of two devices working together. On one side, a patented versatile laser platform provides high-peak-power laser illumination for long-range observation. On the other side, a small camera add-on works as a fast optical switch to select only photons with a specific time of flight. The association of the versatile illumination platform and the fast optical switch presents itself as an independent body, the so-called "flash module", giving virtually any passive observation system gated active imaging capability in the NIR and SWIR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, K; Bin, Z; Wong, J
Purpose: We develop a novel dual-use configuration for a tri-modality CBCT / bioluminescence tomography (BLT) / fluorescence tomography (FT) imaging system with the SARRP that can function as a standalone system for longitudinal imaging research and on-board the SARRP to guide irradiation. BLT provides radiation guidance for soft tissue targets, while FT offers functional information allowing mechanistic investigations. Methods: The optical assembly includes a CCD camera, lens, filter wheel, 3-way mirrors, scanning fiber system, and light-tight enclosure. The rotating mirror system directs the optical signal from the animal surface to the camera at multiple projections over 180 degrees. The fiber-laser system serves as the external light source for the FT application. Multiple filters are used for multispectral imaging to enhance localization accuracy using BLT. The SARRP CBCT provides anatomical information and a geometric mesh for BLT/FT reconstruction. To facilitate dual use, the 3-way mirror system is cantilevered in front of the camera. The entire optical assembly is driven by a 1D linear stage to dock onto an independent mouse support bed for standalone application. After completion of on-board optical imaging, the system is retracted from the SARRP to allow irradiation of the mouse. Results: A tissue-simulating phantom and a mouse model with a luminescent light source are used to demonstrate the function of the dual-use optical system. Feasibility data have been obtained based on a manual-docking prototype. The center of mass of the light source determined in a living mouse with on-board BLT is within 1±0.2 mm of that with CBCT. The performance of the motorized system is expected to be the same and will be presented. Conclusion: We anticipate the motorized dual-use system will provide a significant efficiency gain over our manual-docking and off-line system. By also supporting off-line longitudinal studies independent of the SARRP, the dual-use system is a highly efficient and cost-effective platform to facilitate optical imaging for pre-clinical radiation research. The work is supported by NIH R01CA158100 and Xstrahl Ltd. Drs. John Wong and Iulian Iordachita receive royalty payments from a licensing agreement between Xstrahl Ltd and Johns Hopkins University. John Wong also has a consultant agreement with Xstrahl Ltd.
NASA Technical Reports Server (NTRS)
Ponseggi, B. G. (Editor); Johnson, H. C. (Editor)
1985-01-01
Papers are presented on the picosecond electronic framing camera, photogrammetric techniques using high-speed cineradiography, picosecond semiconductor lasers for characterizing high-speed image shutters, the measurement of dynamic strain by high-speed moire photography, the fast framing camera with independent frame adjustments, design considerations for a data recording system, and nanosecond optical shutters. Consideration is given to boundary-layer transition detectors, holographic imaging, laser holographic interferometry in wind tunnels, heterodyne holographic interferometry, a multispectral video imaging and analysis system, a gated intensified camera, a charge-injection-device profile camera, a gated silicon-intensified-target streak tube and nanosecond-gated photoemissive shutter tubes. Topics discussed include high time-space resolved photography of lasers, time-resolved X-ray spectrographic instrumentation for laser studies, a time-resolving X-ray spectrometer, a femtosecond streak camera, streak tubes and cameras, and a short pulse X-ray diagnostic development facility.
NASA Astrophysics Data System (ADS)
Yasuoka, Fatima M. M.; Matos, Luciana; Cremasco, Antonio; Numajiri, Mirian; Marcato, Rafael; Oliveira, Otavio G.; Sabino, Luis G.; Castro N., Jarbas C.; Bagnato, Vanderlei S.; Carvalho, Luis A. V.
2016-03-01
An optical system that conjugates the patient's pupil to the plane of a Hartmann-Shack (HS) wavefront sensor has been simulated using optical design software, and an optical bench prototype has been mounted using a mechanical eye device, beam splitter, illumination system, lenses, mirrors, mirrored prism, movable mirror, wavefront sensor, and CCD camera. The mechanical eye device is used to simulate aberrations of the eye. From this device the rays are emitted and travel through the beam splitter to the optical system. Some rays fall on the CCD camera and others pass through the optical system and finally reach the sensor. Eye models based on typical in vivo eye aberrations are constructed using the optical design software Zemax. The computer-aided outcomes of the HS images for each case are acquired, and these images are processed using customized techniques. The simulated and real images for low-order aberrations are compared using centroid coordinates to assure that the optical system is constructed precisely enough to match the simulated system. Afterwards, a simulated version of retinal images is constructed to show how these typical eyes would perceive an optotype positioned 20 ft away. Certain personalized corrections are allowed by eye doctors based on different Zernike polynomial values, and the optical images are rendered with the new parameters. Optical images of how that eye would see with or without corrections of certain aberrations are generated in order to determine which aberrations can be corrected and to what degree. The patient can then "personalize" the correction to their own satisfaction. This new approach to wavefront sensing is a promising change of paradigm towards the betterment of the patient-physician relationship.
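Low-order aberrations can be recovered from Hartmann-Shack centroid shifts by a linear least-squares fit to Zernike slope functions. A minimal sketch covering defocus and the two astigmatism terms only, using unnormalized Cartesian Zernike forms (an assumption; the paper does not state its normalization):

```python
import numpy as np

def fit_zernike_from_slopes(x, y, sx, sy):
    """Fit Zernike coefficients to HS spot displacements.

    x, y   : normalized pupil coordinates of the lenslet centres
    sx, sy : local wavefront slopes (centroid shift / lenslet focal length)
    """
    # Analytic derivatives of Z4 = 2(x^2+y^2)-1, Z5 = 2xy, Z6 = x^2-y^2
    dZx = np.column_stack([4 * x, 2 * y, 2 * x])
    dZy = np.column_stack([4 * y, 2 * x, -2 * y])
    A = np.vstack([dZx, dZy])
    b = np.concatenate([sx, sy])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs    # [defocus, oblique astigmatism, vertical astigmatism]
```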
Accurate and cost-effective MTF measurement system for lens modules of digital cameras
NASA Astrophysics Data System (ADS)
Chang, Gao-Wei; Liao, Chia-Cheng; Yeh, Zong-Mu
2007-01-01
For many years, the widening use of digital imaging products, e.g., digital cameras, has attracted much attention in the consumer electronics market. It is important, however, to measure and enhance the imaging performance of these digital cameras compared to that of conventional cameras (with photographic film). For example, diffraction arising from the miniaturization of the optical modules tends to decrease the image resolution. As a figure of merit, the modulation transfer function (MTF) has been broadly employed to estimate image quality. The objective of this paper is therefore to design and implement an accurate and cost-effective MTF measurement system for digital cameras. Once the MTF of the sensor array is known, that of the optical module can be obtained. In this approach, a spatial light modulator (SLM) is employed to modulate the spatial frequency of the light emitted from the source. The modulated light passing through the camera under test is consecutively detected by the sensors, and the corresponding images formed by the camera are acquired by a computer and processed by an algorithm that computes the MTF. Finally, an investigation of the measurement accuracy of various methods, such as the bar-target and spread-function methods, shows that our approach gives quite satisfactory results.
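With a sinusoidal SLM pattern, the MTF at each spatial frequency reduces to a ratio of Michelson modulations between the captured and displayed patterns. A sketch of that reduction; the percentile-based extrema are a robustness choice of this illustration, not part of the paper's algorithm:

```python
import numpy as np

def modulation(pattern):
    """Michelson modulation of a sinusoidal intensity pattern."""
    i_max, i_min = np.percentile(pattern, [99, 1])   # robust extrema
    return (i_max - i_min) / (i_max + i_min)

def mtf_at_frequency(captured, displayed):
    """MTF at one SLM frequency: output modulation over input modulation."""
    return modulation(captured) / modulation(displayed)

# Sweeping the SLM over a set of spatial frequencies and evaluating
# mtf_at_frequency for each captured image traces out the full MTF curve.
```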
Shin, Dongsuk; Pierce, Mark C; Gillenwater, Ann M; Williams, Michelle D; Richards-Kortum, Rebecca R
2010-06-23
Early detection is an essential component of cancer management. Unfortunately, visual examination can often be unreliable, and many settings lack the financial capital and infrastructure to operate PET, CT, and MRI systems. Moreover, the infrastructure and expense associated with surgical biopsy and microscopy are a challenge to establishing cancer screening/early detection programs in low-resource settings. Improvements in performance and declining costs have led to the availability of optoelectronic components, which can be used to develop low-cost diagnostic imaging devices for use at the point-of-care. Here, we demonstrate a fiber-optic fluorescence microscope using a consumer-grade camera for in vivo cellular imaging. The fiber-optic fluorescence microscope includes an LED light, an objective lens, a fiber-optic bundle, and a consumer-grade digital camera. The system was used to image an oral cancer cell line labeled with 0.01% proflavine. A human tissue specimen was imaged following surgical resection, enabling dysplastic and cancerous regions to be evaluated. The oral mucosa of a healthy human subject was imaged in vivo, following topical application of 0.01% proflavine. The fiber-optic microscope resolved individual nuclei in all specimens and tissues imaged. This capability allowed qualitative and quantitative differences between normal and precancerous or cancerous tissues to be identified. The optical efficiency of the system permitted imaging of the human oral mucosa in real time. Our results indicate this device as a useful tool to assist in the identification of early neoplastic changes in epithelial tissues. This portable, inexpensive unit may be particularly appropriate for use at the point-of-care in low-resource settings.
Phase and amplitude wave front sensing and reconstruction with a modified plenoptic camera
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Ko, Jonathan; Nelson, William; Davis, Christopher C.
2014-10-01
A plenoptic camera is a camera that can retrieve the direction and intensity distribution of light rays collected by the camera, and it allows for multiple reconstruction functions such as refocusing at a different depth and 3D microscopy. Its principle is to add a micro-lens array to a traditional high-resolution camera to form a semi-camera array that preserves redundant intensity distributions of the light field and facilitates back-tracing of rays through geometric knowledge of its optical components. Though the plenoptic camera was designed to process incoherent images, we found that it shows high potential in solving coherent illumination cases, such as sensing both the amplitude and phase information of a distorted laser beam. Based on our earlier introduction of a prototype modified plenoptic camera, we have developed the complete algorithm to reconstruct the wavefront of the incident light field. In this paper the algorithm and experimental results are demonstrated, and an improved version of this modified plenoptic camera is discussed. As a result, our modified plenoptic camera can serve as an advanced wavefront sensor compared with traditional Shack-Hartmann sensors in handling complicated cases such as coherent illumination in strong turbulence, where interference and discontinuity of wavefronts are common. Especially in wave propagation through atmospheric turbulence, this camera should provide a much more precise description of the light field, which would guide systems in adaptive optics to make intelligent analyses and corrections.
Yang, Xiaofeng; Wu, Wei; Wang, Guoan
2015-04-01
This paper presents a surgical optical navigation system with non-invasive, real-time positioning capability for open surgical procedures. The design is based on the principle of near-infrared fluorescence molecular imaging. In vivo fluorescence excitation technology, multi-channel spectral camera technology, and image fusion software were used. A visible and near-infrared ring LED excitation source, multi-channel band-pass filters, a spectral camera with 2-CCD optical sensor technology, and computer systems were integrated, and, as a result, a new surgical optical navigation system was successfully developed. When the near-infrared fluorescent agent is injected, the system can display anatomical images of the tissue surface and near-infrared fluorescent functional images of the surgical field simultaneously. The system can identify lymphatic vessels, lymph nodes, and tumor margins that the doctor cannot find with the naked eye intra-operatively. Our research will effectively guide the surgeon in removing tumor tissue and significantly improve the success rate of surgery. The technologies have obtained a national patent, with patent No. ZI. 2011 1 0292374. 1.
Line-Based Registration of Panoramic Images and LiDAR Point Clouds for Mobile Mapping.
Cui, Tingting; Ji, Shunping; Shan, Jie; Gong, Jianya; Liu, Kejian
2016-12-31
For multi-sensor integrated systems, such as the mobile mapping system (MMS), data fusion at sensor-level, i.e., the 2D-3D registration between an optical camera and LiDAR, is a prerequisite for higher level fusion and further applications. This paper proposes a line-based registration method for panoramic images and a LiDAR point cloud collected by a MMS. We first introduce the system configuration and specification, including the coordinate systems of the MMS, the 3D LiDAR scanners, and the two panoramic camera models. We then establish the line-based transformation model for the panoramic camera. Finally, the proposed registration method is evaluated for two types of camera models by visual inspection and quantitative comparison. The results demonstrate that the line-based registration method can significantly improve the alignment of the panoramic image and the LiDAR datasets under either the ideal spherical or the rigorous panoramic camera model, with the latter being more reliable.
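Under the ideal spherical model mentioned above, projecting a 3D point into the panorama is a purely angular mapping. A sketch with assumed equirectangular axis conventions (x forward, z up), which the paper does not specify:

```python
import numpy as np

def panoramic_project(points_cam, width, height):
    """Project 3D points (N x 3, camera frame) onto an ideal spherical panorama.

    Azimuth maps to image columns and elevation to rows (equirectangular)."""
    x, y, z = points_cam.T
    r = np.linalg.norm(points_cam, axis=1)
    azimuth = np.arctan2(y, x)             # [-pi, pi)
    elevation = np.arcsin(z / r)           # [-pi/2, pi/2]
    u = (azimuth + np.pi) / (2 * np.pi) * width
    v = (np.pi / 2 - elevation) / np.pi * height
    return np.column_stack([u, v])
```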
Mechanically assisted liquid lens zoom system for mobile phone cameras
NASA Astrophysics Data System (ADS)
Wippermann, F. C.; Schreiber, P.; Bräuer, A.; Berge, B.
2006-08-01
Camera systems with a small form factor are an integral part of today's mobile phones, which recently feature auto-focus functionality. Ready-to-market solutions without moving parts have been developed using electrowetting technology. Besides virtually no deterioration, easy control electronics, and simple, therefore cost-effective, fabrication, this type of liquid lens enables extremely fast settling times compared to mechanical approaches. As a next evolutionary step, mobile phone cameras will be equipped with zoom functionality. We present first-order considerations for the optical design of a miniaturized zoom system based on liquid lenses and compare it to its mechanical counterpart. We propose a design for a zoom lens with a zoom factor of 2.5, considering state-of-the-art commercially available liquid lens products. The lens possesses auto-focus capability and is based on liquid lenses and one additional mechanical actuator. The combination of liquid lenses and a single mechanical actuator enables extremely short settling times of about 20 ms for the auto focus and a simplified mechanical system design, leading to lower production cost and longer lifetime. The camera system has a mechanical outline of 24 mm in length and 8 mm in diameter. The lens, with f/# 3.5, provides market-relevant optical performance and is designed for an image circle of 6.25 mm (1/2.8" format sensor).
The Orbiter camera payload system's large-format camera and attitude reference system
NASA Technical Reports Server (NTRS)
Schardt, B. B.; Mollberg, B. H.
1985-01-01
The Orbiter camera payload system (OCPS) is an integrated photographic system carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a large-format camera (LFC), a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. A secondary and supporting system to the LFC is the attitude reference system (ARS), a dual-lens stellar camera array (SCA) and camera support structure. The SCA is a 70 mm film system that is rigidly mounted to the LFC lens support structure and, through the simultaneous acquisition of two star fields with each earth viewing LFC frame, makes it possible to precisely determine the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high-precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment. The full OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on Oct. 5, 1984, as a major payload aboard the STS-41G mission.
AO corrected satellite imaging from Mount Stromlo
NASA Astrophysics Data System (ADS)
Bennet, F.; Rigaut, F.; Price, I.; Herrald, N.; Ritchie, I.; Smith, C.
2016-07-01
The Research School of Astronomy and Astrophysics has been developing adaptive optics systems for space situational awareness. As part of this program we have developed satellite imaging using compact adaptive optics systems for small (1-2 m) telescopes such as those operated by Electro Optic Systems (EOS) at the Mount Stromlo Observatory. We have focused on making compact, simple, and high-performance AO systems using modern high-stroke, high-speed deformable mirrors and EMCCD cameras. We are able to track satellites down to magnitude 10 with a Strehl ratio in excess of 20% in median seeing.
Robust 3D Position Estimation in Wide and Unconstrained Indoor Environments
Mossel, Annette
2015-01-01
In this paper, a system for 3D position estimation in wide, unconstrained indoor environments is presented that employs infrared optical outside-in tracking of rigid-body targets with a stereo camera rig. To overcome limitations of state-of-the-art optical tracking systems, a pipeline for robust target identification and 3D point reconstruction has been investigated that enables camera calibration and tracking in environments with poor illumination, static and moving ambient light sources, occlusions and harsh conditions, such as fog. For evaluation, the system has been successfully applied in three different wide and unconstrained indoor environments, (1) user tracking for virtual and augmented reality applications, (2) handheld target tracking for tunneling and (3) machine guidance for mining. The results of each use case are discussed to embed the presented approach into a larger technological and application context. The experimental results demonstrate the system’s capabilities to track targets up to 100 m. Comparing the proposed approach to prior art in optical tracking in terms of range coverage and accuracy, it significantly extends the available tracking range, while only requiring two cameras and providing a relative 3D point accuracy with sub-centimeter deviation up to 30 m and low-centimeter deviation up to 100 m. PMID:26694388
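Once a target is identified in both views, the 3D point reconstruction step is classical two-view triangulation. A standard linear (DLT) sketch, not the authors' specific reconstruction pipeline:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from a calibrated stereo rig.

    P1, P2   : 3x4 projection matrices of the two cameras
    uv1, uv2 : matched pixel coordinates of the same target blob
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]     # homogeneous -> Euclidean coordinates
```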
Use of the Polarized Radiance Distribution Camera System in the RADYO Program
2011-01-28
"...characterization and validation of a high dynamic range radiance camera," Ocean Optics XX, Anchorage, AK, October 2010 (poster). G. Zibordi and K. J. Voss, "...on light in the ocean," submitted to Physics Today, Dec. 2010. H. Zhang and K. J. Voss, "On Hapke photometric model predictions on reflectance of..."
A novel camera localization system for extending three-dimensional digital image correlation
NASA Astrophysics Data System (ADS)
Sabato, Alessandro; Reddy, Narasimha; Khan, Sameer; Niezrecki, Christopher
2018-03-01
The monitoring of civil, mechanical, and aerospace structures is important, especially as these systems approach or surpass their design life. Often, Structural Health Monitoring (SHM) relies on sensing techniques for condition assessment. Advancements in camera technology and optical sensors have made three-dimensional (3D) Digital Image Correlation (DIC) a valid technique for extracting structural deformations and geometry profiles. Prior to making stereophotogrammetry measurements, a calibration has to be performed to obtain the vision system's extrinsic and intrinsic parameters. This means that the position of the cameras relative to each other (i.e., separation distance, camera angle, etc.) must be determined. Typically, the cameras are placed on a rigid bar to prevent any relative motion between them. This constraint limits the utility of the 3D-DIC technique, especially when it is applied to monitoring large structures from various fields of view. In this preliminary study, the design of a multi-sensor system is proposed to extend 3D-DIC's capability and allow for easier calibration and measurement. The suggested system relies on a MEMS-based Inertial Measurement Unit (IMU) and a 77 GHz radar sensor for measuring the orientation and relative distance of the stereo cameras. The feasibility of the proposed combined IMU-radar system is evaluated through laboratory tests, demonstrating its ability to determine the cameras' positions in space for performing accurate 3D-DIC calibration and measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birch, Gabriel Carisle; Griffin, John Clark
2015-01-01
The horizontal television lines (HTVL) metric has been the primary quantity used by division 6000 for camera resolution in high-consequence security systems. This document shows that HTVL measurements are fundamentally insufficient as a metric of camera resolution, and proposes a quantitative, standards-based methodology: measuring the camera system modulation transfer function (MTF), the most common and accepted metric of resolution in the optical science community. Because HTVL calculations are easily misinterpreted or poorly defined, we present several scenarios in which HTVL is frequently reported and discuss their problems. The MTF metric is discussed, and scenarios are presented with calculations showing the application of such a metric.
Isochromatic photoelasticity fringe patterns of PMMA in various shapes and stress applications
NASA Astrophysics Data System (ADS)
Manjit, Y.; Limpichaipanit, A.; Ngamjarurojana, A.
2018-03-01
The research focuses on isochromatic photoelastic fringe patterns in solid materials using reflection mode in a dark-field polariscope. The optical setup consists of a light source, polarizers, quarter-wave plates, a 577 nm optical pass filter, a compensator, and a digital camera system. The fringe patterns were produced on the sample, and fractional and integer fringe orders were observed using a Babinet compensator and the digital camera system. The samples were circular and rectangular PMMA plates coated with silver spray and compressed by a hydraulic system at the top and the bottom. The isochromatic fringe patterns were analyzed in horizontal and vertical positions. It was found that the applied force and the number of isochromatic photoelastic fringe orders depended on the shape of the sample, which reflects the stress distribution behavior.
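Once a fringe order N is read off with the compensator, the principal-stress difference follows from the stress-optic law; in reflection photoelasticity the light traverses the coating twice, hence the factor of two. The material fringe value and thickness below are illustrative numbers, not properties of the PMMA samples used here:

```python
def principal_stress_difference(n_fringe, f_sigma=7.0e3, thickness=3e-3):
    """Stress-optic law for a reflection polariscope:
        sigma_1 - sigma_2 = N * f_sigma / (2 * t)
    f_sigma   : material fringe value [N/m per fringe] (illustrative)
    thickness : coating/sample thickness [m]; result in Pa."""
    return n_fringe * f_sigma / (2.0 * thickness)

print(principal_stress_difference(2.5))   # ~2.9 MPa for fringe order 2.5
```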
Cascade Helps JPL Explore the Solar System
NASA Technical Reports Server (NTRS)
Burke, G. R.
1996-01-01
At Jet Propulsion Laboratory (JPL), we are involved with the unmanned exploration of the solar system. Unmanned probes observe the planet surfaces using radar and optical cameras to take a variety of measurements.
3D Indoor Positioning of UAVs with Spread Spectrum Ultrasound and Time-of-Flight Cameras
Aguilera, Teodoro
2017-01-01
This work proposes the use of a hybrid acoustic and optical indoor positioning system for the accurate 3D positioning of Unmanned Aerial Vehicles (UAVs). The acoustic module of this system is based on a Time-Code Division Multiple Access (T-CDMA) scheme, where the sequential emission of five spread spectrum ultrasonic codes is performed to compute the horizontal vehicle position following a 2D multilateration procedure. The optical module is based on a Time-Of-Flight (TOF) camera that provides an initial estimation for the vehicle height. A recursive algorithm programmed on an external computer is then proposed to refine the estimated position. Experimental results show that the proposed system can increase the accuracy of a solely acoustic system by 70–80% in terms of positioning mean square error. PMID:29301211
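The 2D multilateration step can be linearized by subtracting one range equation from the others, leaving a small least-squares problem. A sketch with placeholder beacon geometry; the real system's T-CDMA ranging and recursive refinement are not reproduced here:

```python
import numpy as np

def multilaterate_2d(beacons, ranges):
    """Linear least-squares 2D multilateration.

    beacons : (N, 2) known emitter positions, N >= 3
    ranges  : (N,) measured beacon-to-vehicle distances
    """
    b0, r0 = beacons[0], ranges[0]
    A = 2.0 * (beacons[1:] - b0)           # subtracting equation 0 cancels |p|^2
    rhs = (r0**2 - ranges[1:]**2
           + np.sum(beacons[1:]**2, axis=1) - np.sum(b0**2))
    xy, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return xy

beacons = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
print(multilaterate_2d(beacons, np.linalg.norm(beacons - [2.0, 3.0], axis=1)))
```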
High Speed Photographic Analysis Of Railgun Plasmas
NASA Astrophysics Data System (ADS)
Macintyre, I. B.
1985-02-01
Various experiments are underway at the Materials Research Laboratories, Australian Department of Defence, to develop a theory for the behaviour and propulsion action of plasmas in rail guns. Optical recording and imaging devices, with their low vulnerability to the effects of magnetic and electric fields present in the vicinity of electromagnetic launchers, have proven useful as diagnostic tools. This paper describes photoinstrumentation systems developed to provide visual qualitative assessment of the behaviour of plasma travelling along the bore of railgun launchers. In addition, a quantitative system is incorporated providing continuous data (on a microsecond time scale) of (a) Length of plasma during flight along the launcher bore. (b) Velocity of plasma. (c) Distribution of plasma with respect to time after creation. (d) Plasma intensity profile as it travels along the launcher bore. The evolution of the techniques used is discussed. Two systems were employed. The first utilized a modified high speed streak camera to record the light emitted from the plasma, through specially prepared fibre optic cables. The fibre faces external to the bore were then imaged onto moving film. The technique involved the insertion of fibres through the launcher body to enable the plasma to be viewed at discrete positions as it travelled along the launcher bore. Camera configuration, fibre optic preparation and experimental results are outlined. The second system utilized high speed streak and framing photography in conjunction with accurate sensitometric control procedures on the recording film. The two cameras recorded the plasma travelling along the bore of a specially designed transparent launcher. The streak camera, fitted with a precise slit size, recorded a streak image of the upper brightness range of the plasma as it travelled along the launcher's bore. The framing camera recorded an overall view of the launcher and the plasma path, to the maximum possible, governed by the film's ability to reproduce the plasma's brightness range. The instrumentation configuration, calibration, and film measurement using microdensitometer scanning techniques to evaluate inbore plasma behaviour, are also presented.
Preliminary optical design of PANIC, a wide-field infrared camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Rodríguez Gómez, J.; Lenzen, R.; Sánchez-Blanco, E.
2008-07-01
In this paper, we present the preliminary optical design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Calar Alto 2.2 m telescope. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. A mosaic of four Teledyne Hawaii-2RG 2k x 2k arrays is used as the detector and gives a field of view of 31.9 arcmin x 31.9 arcmin. This cryogenic instrument has been optimized for the Y, J, H, and K bands. Special care has been taken in the selection of the standard IR materials used for the optics in order to maximize the instrument throughput and to include the z band. The main challenges of this design are: to produce a well-defined internal pupil that allows the thermal background to be reduced by a cryogenic pupil stop; the correction of off-axis aberrations due to the large field available; the correction of chromatic aberration because of the wide spectral coverage; and the capability of introducing narrow-band filters (~1%) into the system while minimizing the degradation of the filter passband without a collimated stage in the camera. We show the optomechanical error budget and compensation strategy that allows our as-built design to meet the performance requirements from an optical point of view. Finally, we demonstrate the flexibility of the design by showing the performance of PANIC at the CAHA 3.5 m telescope.
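As a quick consistency check, the quoted plate scale and pixel pitch fix the camera's effective focal length:

```python
# 0.45 arcsec per 18 um pixel -> 25 arcsec/mm; f [mm] = 206265 / scale[arcsec/mm]
pixel_um, scale_arcsec = 18.0, 0.45
scale_per_mm = scale_arcsec / (pixel_um * 1e-3)   # 25 arcsec/mm
f_mm = 206265.0 / scale_per_mm                    # ~8250 mm, i.e. ~f/3.75 on 2.2 m
print(f_mm)
```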
An astronomy camera for low background applications in the 1. 0 to 2. 5. mu. m spectral region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaki, S.A.; Bailey, G.C.; Hagood, R.W.
1989-02-01
A short wavelength (1.0-2.5 μm) 128 x 128 focal plane array forms the heart of this low background astronomy camera system. The camera is designed to accept either a 128 x 128 HgCdTe array for the 1-2.5 μm spectral region or an InSb array for the 3-5 μm spectral region. A cryogenic folded optical system is used to control excess stray light, along with a cold eight-position filter wheel for spectral filtering. The camera head and electronics will also accept a 256 x 256 focal plane. Engineering evaluation of the complete system has been completed, along with two engineering runs at the JPL Table Mountain Observatory. System design, engineering performance, and sample imagery are presented in this paper.
Registration of an on-axis see-through head-mounted display and camera system
NASA Astrophysics Data System (ADS)
Luo, Gang; Rensing, Noa M.; Weststrate, Evan; Peli, Eli
2005-02-01
An optical see-through head-mounted display (HMD) system integrating a miniature camera that is aligned with the user's pupil is developed and tested. Such an HMD system has potential value in many augmented reality applications, in which registration of the virtual display to the real scene is one of the critical aspects. The camera's alignment to the user's pupil results in a simple yet accurate calibration and a low registration error across a wide range of depths. In practice, a small camera-eye misalignment may still occur in such a system due to the inevitable variation of the HMD wearing position with respect to the eye. The effects of such errors are measured. Calculation further shows that the registration error as a function of viewing distance behaves nearly the same for different virtual image distances, except for a shift. The impact of the prismatic effect of the display lens on registration is also discussed.
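The shift behaviour described above follows from simple parallax geometry. As a hedged illustration (not the authors' calculation): for a small camera-eye offset e, an object at distance D subtends an angular registration error of roughly e(1/D - 1/D_v) for a virtual image calibrated at distance D_v, so changing D_v only adds a constant offset to the error-versus-distance curve.

    import numpy as np

    e = 0.004                         # assumed 4 mm camera-eye offset (m)
    D = np.linspace(0.5, 10.0, 50)    # viewing distances (m)

    for D_v in (1.0, 3.0):            # two virtual image distances (m)
        err_rad = e * (1.0 / D - 1.0 / D_v)     # small-angle parallax error
        err_arcmin = np.degrees(err_rad) * 60.0
        # Curves for different D_v differ only by the constant e/D_v term,
        # i.e. a vertical shift, matching the behaviour reported above.
        print(D_v, err_arcmin[:3])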
A Normal Incidence X-ray Telescope (NIXT) sounding rocket payload
NASA Technical Reports Server (NTRS)
Golub, Leon
1989-01-01
Work on the High Resolution X-ray (HRX) Detector Program is described. In the laboratory and flight programs, multiple copies of a general purpose set of electronics which control the camera, signal processing and data acquisition, were constructed. A typical system consists of a phosphor convertor, image intensifier, a fiber optics coupler, a charge coupled device (CCD) readout, and a set of camera, signal processing and memory electronics. An initial rocket detector prototype camera was tested in flight and performed perfectly. An advanced prototype detector system was incorporated on another rocket flight, in which a high resolution heterojunction vidicon tube was used as the readout device for the H(alpha) telescope. The camera electronics for this tube were built in-house and included in the flight electronics. Performance of this detector system was 100 percent satisfactory. The laboratory X-ray system for operation on the ground is also described.
High-performance dual-speed CCD camera system for scientific imaging
NASA Astrophysics Data System (ADS)
Simpson, Raymond W.
1996-03-01
Traditionally, scientific camera systems were partitioned into a 'camera head' containing the CCD and its support circuitry and a camera controller, which provided analog-to-digital conversion, timing, control, computer interfacing, and power. A new, unitized high performance scientific CCD camera with dual-speed readout at 1 × 10^6 or 5 × 10^6 pixels per second, 12 bit digital gray scale, high performance thermoelectric cooling, and built-in composite video output is described. This camera provides all digital, analog, and cooling functions in a single compact unit. The new system incorporates the A/D converter, timing, control and computer interfacing in the camera, with the power supply remaining a separate remote unit. A 100 Mbyte/second serial link transfers data over copper or fiber media to a variety of host computers, including Sun, SGI, SCSI, PCI, EISA, and Apple Macintosh. Having all the digital and analog functions in the camera made it possible to modify this system for the Woods Hole Oceanographic Institution for use on a remotely controlled submersible vehicle. The oceanographic version achieves 16 bit dynamic range at 1.5 × 10^5 pixels/second, can be operated at depths of 3 kilometers, and transfers data to the surface via a real-time fiber optic link.
The Sensor Irony: How Reliance on Sensor Technology is Limiting Our View of the Battlefield
2010-05-10
...thermal) camera, as well as a laser illuminator/range finder. Similar to the MQ-1, the MQ-9 Reaper is primarily a strike asset for emerging targets... Wescam 14TS. Both systems have an electro-optical (daylight) TV camera, an infra-red (thermal) camera, as well as a laser illuminator/range finder...
Autonomous Vision Navigation for Spacecraft in Lunar Orbit
NASA Astrophysics Data System (ADS)
Bader, Nolan A.
NASA aims to achieve unprecedented navigational reliability for the first manned lunar mission of the Orion spacecraft in 2023. A technique for accomplishing this is to integrate autonomous feature tracking as an added means of improving position and velocity estimation. In this thesis, a template matching algorithm and optical sensor are tested onboard three simulated lunar trajectories using linear covariance techniques under various conditions. A preliminary characterization of the camera gives insight into its ability to determine azimuth and elevation angles to points on the surface of the Moon. A navigation performance analysis shows that an optical camera sensor can aid in decreasing position and velocity errors, particularly in a loss of communication scenario. Furthermore, it is found that camera quality and computational capability are driving factors affecting the performance of such a system.
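The thesis does not spell out the matcher, but zero-normalized cross-correlation is the textbook form of template matching for surface-feature tracking; the brute-force sketch below (all names hypothetical, written for clarity rather than speed) returns the best-matching pixel location of a template within a camera frame.

    import numpy as np

    def ncc_match(image, template):
        # Brute-force zero-normalized cross-correlation template match.
        # Returns ((row, col), score) of the best match in the image.
        th, tw = template.shape
        t = template - template.mean()
        t_norm = np.sqrt((t ** 2).sum())
        best_score, best_rc = -2.0, (0, 0)
        for r in range(image.shape[0] - th + 1):
            for c in range(image.shape[1] - tw + 1):
                w = image[r:r + th, c:c + tw]
                wz = w - w.mean()
                denom = np.sqrt((wz ** 2).sum()) * t_norm
                score = (wz * t).sum() / denom if denom > 0 else 0.0
                if score > best_score:
                    best_score, best_rc = score, (r, c)
        return best_rc, best_score

The matched pixel location, combined with the camera model, then yields the azimuth and elevation angles to the tracked point on the lunar surface.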
Speed of sound and photoacoustic imaging with an optical camera based ultrasound detection system
NASA Astrophysics Data System (ADS)
Nuster, Robert; Paltauf, Guenther
2017-07-01
CCD camera based optical ultrasound detection is a promising alternative approach for high resolution 3D photoacoustic imaging (PAI). To fully exploit its potential and to achieve an image resolution <50 μm, it is necessary to incorporate variations of the speed of sound (SOS) in the image reconstruction algorithm. Hence, in the proposed work the idea and a first implementation are presented, showing how speed of sound imaging can be added to a previously developed camera based PAI setup. The current setup provides SOS maps with a spatial resolution of 2 mm and an accuracy of the obtained absolute SOS values of about 1%. The proposed dual-modality setup has the potential to provide highly resolved and perfectly co-registered 3D photoacoustic and SOS images.
The HEROES Balloon-Borne Hard X-Ray Telescope
NASA Technical Reports Server (NTRS)
Wilson-Hodge, C.; Gaskin, J.; Christe, S.; Shih, A. Y.; Swartz, D. A.; Tennant, A. F.; Ramsey, B.; Kilaru, K.
2014-01-01
The High Energy Replicated Optics to Explore the Sun (HEROES) payload flew on a balloon from Ft. Sumner, NM, September 21-22, 2013. HEROES is sensitive from about 20-75 keV and comprises 8 optics modules (HPD approximately 33" as flown), each consisting of 13-14 nickel replicated optics shells and 8 matching Xenon-filled position-sensitive proportional counter detectors (dE/E=0.05 @ 60 keV). Our targets included the Sun, the Crab Nebula and pulsar and the black hole binary GRS 1915+105. HEROES was pointed using a day/night star camera system for astrophysical observations and a newly developed Solar Aspect System for solar observations (with a shutter protecting the star camera.) We have successfully detected the Crab Nebula. Analyses for GRS 1915+105 and the Sun are ongoing. In this presentation, I will describe the HEROES mission, the data analysis pipeline and calibrations, preliminary results, and plans for follow-on missions.
Self-aligning LED-based optical link
NASA Astrophysics Data System (ADS)
Shen, Thomas C.; Drost, Robert J.; Rzasa, John R.; Sadler, Brian M.; Davis, Christopher C.
2016-09-01
The steady advances in light-emitting diode (LED) technology have motivated the use of LEDs in optical wireless communication (OWC) applications such as indoor local area networks (LANs) and communication between mobile platforms (e.g., robots, vehicles). In contrast to traditional radio frequency (RF) wireless communication, OWC utilizes electromagnetic spectrum that is largely unregulated and unrestricted. OWC may be especially useful in RF-denied environments, in which RF communication may be prohibited or undesirable. However, OWC does present some challenges, including the need to maintain alignment between potentially moving nodes. We describe a novel system for link alignment that is composed of a hyperboloidal mirror, a camera, and a gimbal. The experimental system is able to use the mirror and camera to detect an LED beacon on a neighboring node and estimate its bearing (azimuth and elevation), point the gimbal towards the beacon, and establish an optical link.
The HEROES Balloon-borne Hard X-ray Telescope
NASA Astrophysics Data System (ADS)
Wilson-Hodge, Colleen; Gaskin, Jessica; Christe, Steven; Shih, Albert Y.; Swartz, Douglas A.; Tennant, Allyn F.; Ramsey, Brian; Kilaru, Kiranmayee
2014-08-01
The High Energy Replicated Optics to Explore the Sun (HEROES) payload flew on a balloon from Ft. Sumner, NM, September 21-22, 2013. HEROES is sensitive from about 20-75 keV and comprises 8 optics modules (HPD 33"), each consisting of 13-14 nickel replicated optics shells, and 8 matching Xenon-filled position-sensitive proportional counter detectors (dE/E=0.05 @ 60 keV). Our targets included the Sun, the Crab Nebula and pulsar, and the black hole binary GRS 1915+105. HEROES was pointed using a day/night star camera system for astrophysical observations and a newly developed Solar Aspect System for solar observations (with a shutter protecting the star camera). We have successfully imaged the Crab Nebula. Analyses for GRS 1915+105 and the Sun are ongoing. In this presentation, I will describe the HEROES mission, the data analysis pipeline and calibrations, preliminary results, and plans for follow-on missions.
Report of the facility definition team spacelab UV-Optical Telescope Facility
NASA Technical Reports Server (NTRS)
1975-01-01
Scientific requirements for the Spacelab Ultraviolet-Optical Telescope (SUOT) facility are presented. Specific programs involving high angular resolution imagery over wide fields, far ultraviolet spectroscopy, precisely calibrated spectrophotometry and spectropolarimetry over a wide wavelength range, and planetary studies, including high resolution synoptic imagery, are recommended. Specifications for the mounting configuration, instrument mounting system, optical parameters, and the pointing and stabilization system are presented. Concepts for the focal plane instruments are defined. The functional requirements of the direct imaging camera, far ultraviolet spectrograph, and the precisely calibrated spectrophotometer are detailed, and the planetary camera concept is outlined. Operational concepts described in detail are: the makeup and functions of the shuttle payload crew, extravehicular activity requirements, telescope control and data management, the payload operations control room, orbital constraints, and orbital interfaces (stabilization, maneuvering requirements and attitude control, contamination, utilities, and payload weight considerations).
The GCT camera for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Lapington, J. S.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Bose, R.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Buckley, J.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jankowsky, D.; Jegouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Laporte, P.; Leach, S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Minaya, I. A.; Mohrmann, L.; Molyneux, P.; Moore, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayede, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Varner, G.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium
2017-12-01
The Gamma Cherenkov Telescope (GCT) is one of the designs proposed for the Small Sized Telescope (SST) section of the Cherenkov Telescope Array (CTA). The GCT uses dual-mirror optics, resulting in a compact telescope with good image quality and a large field of view with a smaller, more economical, camera than is achievable with conventional single mirror solutions. The photon counting GCT camera is designed to record the flashes of atmospheric Cherenkov light from gamma and cosmic ray initiated cascades, which last only a few tens of nanoseconds. The GCT optics require that the camera detectors follow a convex surface with a radius of curvature of 1 m and a diameter of 35 cm, which is approximated by tiling the focal plane with 32 modules. The first camera prototype is equipped with multi-anode photomultipliers, each comprising an 8×8 array of 6×6 mm2 pixels to provide the required angular scale, adding up to 2048 pixels in total. Detector signals are shaped, amplified and digitised by electronics based on custom ASICs that provide digitisation at 1 GSample/s. The camera is self-triggering, retaining images where the focal plane light distribution matches predefined spatial and temporal criteria. The electronics are housed in the liquid-cooled, sealed camera enclosure. LED flashers at the corners of the focal plane provide a calibration source via reflection from the secondary mirror. The first GCT camera prototype underwent preliminary laboratory tests last year. In November 2015, the camera was installed on a prototype GCT telescope (SST-GATE) in Paris and was used to successfully record the first Cherenkov light of any CTA prototype, and the first Cherenkov light seen with such a dual-mirror optical system. A second full-camera prototype based on Silicon Photomultipliers is under construction. Up to 35 GCTs are envisaged for CTA.
2005-11-01
...is in relation to obstacles. Clearly, existing optical sensors are too large for this proposed system. Again, the approach utilizing biomimicry... results in our latest conflicts. The Predator, a medium altitude system cruising at 70 knots and equipped with electro-optical and infrared cameras... that exist today, but the vehicles are also platforms for new concepts outside the status quo... Biomimicry is a new...
Scientific Design of a High Contrast Integral Field Spectrograph for the Subaru Telescope
NASA Technical Reports Server (NTRS)
McElwain, Michael W.
2012-01-01
Ground based telescopes equipped with adaptive optics systems and specialized science cameras are now capable of directly detecting extrasolar planets. We present the scientific design for a high contrast integral field spectrograph for the Subaru Telescope. This lenslet based integral field spectrograph will be implemented into the new extreme adaptive optics system at Subaru, called SCExAO.
MARS: a mouse atlas registration system based on a planar x-ray projector and an optical camera
NASA Astrophysics Data System (ADS)
Wang, Hongkai; Stout, David B.; Taschereau, Richard; Gu, Zheng; Vu, Nam T.; Prout, David L.; Chatziioannou, Arion F.
2012-10-01
This paper introduces a mouse atlas registration system (MARS), composed of a stationary top-view x-ray projector and a side-view optical camera, coupled to a mouse atlas registration algorithm. This system uses the x-ray and optical images to guide a fully automatic co-registration of a mouse atlas with each subject, in order to provide anatomical reference for small animal molecular imaging systems such as positron emission tomography (PET). To facilitate the registration, a statistical atlas that accounts for inter-subject anatomical variations was constructed based on 83 organ-labeled mouse micro-computed tomography (CT) images. The statistical shape model and conditional Gaussian model techniques were used to register the atlas with the x-ray image and optical photo. The accuracy of the atlas registration was evaluated by comparing the registered atlas with the organ-labeled micro-CT images of the test subjects. The results showed excellent registration accuracy of the whole-body region, and good accuracy for the brain, liver, heart, lungs and kidneys. In its implementation, the MARS was integrated with a preclinical PET scanner to deliver combined PET/MARS imaging, and to facilitate atlas-assisted analysis of the preclinical PET images.
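For readers unfamiliar with the conditional Gaussian step, the idea is that the statistical atlas stores a joint Gaussian over observed coordinates (e.g., body-surface landmarks visible in the x-ray and optical images) and unobserved ones (internal organ landmarks); conditioning on the observed coordinates predicts the rest. A minimal sketch of the standard formula, not the authors' code:

    import numpy as np

    def conditional_gaussian(mu, cov, idx_obs, x_obs):
        # Condition a joint Gaussian (mu: (n,) array, cov: (n, n) array)
        # on the coordinates idx_obs taking values x_obs; returns the
        # conditional mean and covariance of the remaining coordinates.
        n = len(mu)
        idx_hid = np.setdiff1d(np.arange(n), idx_obs)
        S_oo = cov[np.ix_(idx_obs, idx_obs)]
        S_ho = cov[np.ix_(idx_hid, idx_obs)]
        S_hh = cov[np.ix_(idx_hid, idx_hid)]
        K = np.linalg.solve(S_oo, S_ho.T).T      # S_ho @ inv(S_oo)
        mu_cond = mu[idx_hid] + K @ (x_obs - mu[idx_obs])
        cov_cond = S_hh - K @ S_ho.T
        return mu_cond, cov_cond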
MARS: a mouse atlas registration system based on a planar x-ray projector and an optical camera.
Wang, Hongkai; Stout, David B; Taschereau, Richard; Gu, Zheng; Vu, Nam T; Prout, David L; Chatziioannou, Arion F
2012-10-07
This paper introduces a mouse atlas registration system (MARS), composed of a stationary top-view x-ray projector and a side-view optical camera, coupled to a mouse atlas registration algorithm. This system uses the x-ray and optical images to guide a fully automatic co-registration of a mouse atlas with each subject, in order to provide anatomical reference for small animal molecular imaging systems such as positron emission tomography (PET). To facilitate the registration, a statistical atlas that accounts for inter-subject anatomical variations was constructed based on 83 organ-labeled mouse micro-computed tomography (CT) images. The statistical shape model and conditional Gaussian model techniques were used to register the atlas with the x-ray image and optical photo. The accuracy of the atlas registration was evaluated by comparing the registered atlas with the organ-labeled micro-CT images of the test subjects. The results showed excellent registration accuracy of the whole-body region, and good accuracy for the brain, liver, heart, lungs and kidneys. In its implementation, the MARS was integrated with a preclinical PET scanner to deliver combined PET/MARS imaging, and to facilitate atlas-assisted analysis of the preclinical PET images.
Improving Photometric Calibration of Meteor Video Camera Systems
NASA Technical Reports Server (NTRS)
Ehlert, Steven; Kingery, Aaron; Cooke, William
2016-01-01
Current optical observations of meteors are commonly limited by systematic uncertainties in photometric calibration at the level of approximately 0.5 mag or higher. Future improvements to meteor ablation models, luminous efficiency models, or emission spectra will hinge on new camera systems and techniques that significantly reduce calibration uncertainties and can reliably perform absolute photometric measurements of meteors. In this talk we discuss the algorithms and tests that NASA's Meteoroid Environment Office (MEO) has developed to better calibrate photometric measurements for the existing All-Sky and Wide-Field video camera networks as well as for a newly deployed four-camera system for measuring meteor colors in Johnson-Cousins BV RI filters. In particular we will emphasize how the MEO has been able to address two long-standing concerns with the traditional procedure, discussed in more detail below.
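As an illustration of the core calibration step (our sketch, not the MEO pipeline), the photometric zero point ZP relates background-subtracted instrumental counts to catalog magnitudes via m = -2.5 log10(counts) + ZP; fitting ZP robustly against calibration stars in each frame, per filter, is the heart of absolute photometry.

    import numpy as np

    def fit_zero_point(counts, catalog_mag):
        # Robust zero-point fit from calibration stars in one frame.
        # counts: background-subtracted integrated counts per star.
        # catalog_mag: corresponding catalog magnitudes (same filter).
        inst_mag = -2.5 * np.log10(counts)
        resid = catalog_mag - inst_mag
        zp = np.median(resid)                           # robust to outliers
        sigma = 1.4826 * np.median(np.abs(resid - zp))  # robust scatter
        return zp, sigma

    # A meteor's magnitude then follows as -2.5*log10(meteor_counts) + zp.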
NASA Technical Reports Server (NTRS)
Fischer, Robert E. (Editor); Rogers, Philip J. (Editor)
1986-01-01
The present conference considers topics in the fields of optical systems design software, the design and analysis of optical systems, illustrative cases of advanced optical system design, the integration of optical designs into greater systems, and optical fabrication and testing techniques. Attention is given to an extended range diffraction-based merit function for lens design optimization, an assessment of technologies for stray light control and evaluation, the automated characterization of IR systems' spatial resolution, a spectrum of design techniques based on aberration theory, a three-field IR telescope, a large aperture zoom lens for 16-mm motion picture cameras, and the use of concave holographic gratings as monochromators. Also discussed are the use of aspherics in optical systems, glass choice procedures for periscope design, the fabrication and testing of unconventional optics, low mass mirrors for large optics, and the diamond grinding of optical surfaces on aspheric lens molds.
The TESS camera: modeling and measurements with deep depletion devices
NASA Astrophysics Data System (ADS)
Woods, Deborah F.; Vanderspek, Roland; MacDonald, Robert; Morgan, Edward; Villasenor, Joel; Thayer, Carolyn; Burke, Barry; Chesbrough, Christian; Chrisp, Michael; Clark, Kristin; Furesz, Gabor; Gonzales, Alexandria; Nguyen, Tam; Prigozhin, Gregory; Primeau, Brian; Ricker, George; Sauerwein, Timothy; Suntharalingam, Vyshnavi
2016-07-01
The Transiting Exoplanet Survey Satellite, a NASA Explorer-class mission in development, will discover planets around nearby stars, most notably Earth-like planets with potential for follow up characterization. The all-sky survey requires a suite of four wide field-of-view cameras with sensitivity across a broad spectrum. Deep depletion CCDs with a silicon layer of 100 μm thickness serve as the camera detectors, providing enhanced performance in the red wavelengths for sensitivity to cooler stars. The performance of the camera is critical for the mission objectives, with both the optical system and the CCD detectors contributing to the realized image quality. Expectations for image quality are studied using a combination of optical ray tracing in Zemax and simulations in Matlab to account for the interaction of the incoming photons with the 100 μm silicon layer. The simulations include a probabilistic model to determine the depth of travel in the silicon before the photons are converted to photo-electrons, and a Monte Carlo approach to charge diffusion. The charge diffusion model varies with the remaining depth for the photo-electron to traverse and the strength of the intermediate electric field. The simulations are compared with laboratory measurements acquired by an engineering unit camera with the TESS optical design and deep depletion CCDs. In this paper we describe the performance simulations and the corresponding measurements taken with the engineering unit camera, and discuss where the models agree well in predicted trends and where there are differences compared to observations.
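A toy version of the probabilistic depth and charge diffusion model described above can be sketched as follows. Assumptions on our part: exponential attenuation with a wavelength-dependent absorption length, and a lateral diffusion RMS that grows with the remaining silicon depth; the field-dependent model in the paper is more detailed.

    import numpy as np

    rng = np.random.default_rng(0)

    def charge_positions(n_photons, absorption_um, thickness_um=100.0,
                         sigma_max_um=4.0):
        # Toy Monte Carlo for a deep depletion CCD: sample photon
        # conversion depths, then spread the photo-electrons laterally
        # by an amount that scales with the remaining depth to traverse.
        z = rng.exponential(absorption_um, n_photons)
        z = z[z < thickness_um]          # photons that convert in the silicon
        remaining = thickness_um - z     # depth left to the collecting channel
        sigma = sigma_max_um * np.sqrt(remaining / thickness_um)
        x = rng.normal(0.0, sigma)       # lateral charge spread (um)
        y = rng.normal(0.0, sigma)
        return x, y

    # In this toy model, photons that convert deeper have less remaining
    # depth to diffuse across and hence contribute a tighter PSF core.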
An imaging system for PLIF/Mie measurements for a combusting flow
NASA Technical Reports Server (NTRS)
Wey, C. C.; Ghorashi, B.; Marek, C. J.; Wey, C.
1990-01-01
The equipment required to establish an imaging system can be divided into four parts: (1) the light source and beam shaping optics; (2) camera and recording; (3) image acquisition and processing; and (4) computer and output systems. A pulsed, Nd:YAG-pumped, frequency-doubled dye laser, which can freeze motion in the flowfield, is used as the illumination source. A set of lenses forms the laser beam into a sheet. The induced fluorescence is collected by a UV-enhanced lens and passes through a UV-enhanced microchannel plate intensifier which is optically coupled to a gated solid state CCD camera. The output of the camera is simultaneously displayed on a monitor and recorded on either a laser videodisc set or a Super VHS VCR. The videodisc set is controlled by a minicomputer via a connection to the RS-232C interface terminals. The imaging system is connected to the host computer by a bus repeater and can be multiplexed between four video input sources. Sample images from a planar shear layer experiment are presented to show the processing capability of the imaging system with the host computer.
Hand-held photomicroscopy system
NASA Technical Reports Server (NTRS)
Zabower, H. R.
1972-01-01
Photomicroscopy system, with simple optics and any standard microscope objective, is used with any type of motion picture, still, or television camera system. Device performs well under difficult environmental conditions and applies to work in ecological studies, field hospitals, and geological surveys.
Opto-mechanical design of the G-CLEF flexure control camera system
NASA Astrophysics Data System (ADS)
Oh, Jae Sok; Park, Chan; Kim, Jihun; Kim, Kang-Min; Chun, Moo-Young; Yu, Young Sam; Lee, Sungho; Nah, Jakyoung; Park, Sung-Joon; Szentgyorgyi, Andrew; McMuldroch, Stuart; Norton, Timothy; Podgorski, William; Evans, Ian; Mueller, Mark; Uomoto, Alan; Crane, Jeffrey; Hare, Tyson
2016-08-01
The GMT-Consortium Large Earth Finder (G-CLEF) is the first light instrument of the Giant Magellan Telescope (GMT). G-CLEF is a fiber-fed, optical band echelle spectrograph capable of extremely precise radial velocity measurement. KASI (Korea Astronomy and Space Science Institute) is responsible for the Flexure Control Camera (FCC) included in the G-CLEF Front End Assembly (GCFEA). The FCC is a kind of guide camera which monitors the field images focused on a fiber mirror to control the flexure and focus errors within the GCFEA. The FCC consists of five optical components: a collimator, including triple lenses, for producing a pupil; neutral density filters allowing a much brighter star to be used as a target or a guide; a tent prism as a focus analyzer for measuring the focus offset at the fiber mirror; a reimaging camera with three pairs of lenses for focusing the beam on a CCD focal plane; and a CCD detector for capturing the image of the fiber mirror. In this article, we present the optical and mechanical FCC designs, which have been modified after the PDR in April 2015.
Preliminary Electrical Designs for CTEX and AFIT Satellite Ground Station
2010-03-01
...with additional IO... [table residue: S-340 high-speed piezo tip/tilt platform model options with aluminum, Invar, and Zerodur glass mirrors] ...developed by RC Optics that uses internal steerable mirrors that point the optics without slewing the entire instrument. The imaging system is composed of... [diagram residue: attitude determination system, telescope assembly, CTEx imaging system, DCCU camera, motor/encoder assembly, FSM & control electronics, dwell mirror]
Low-cost mobile phone microscopy with a reversed mobile phone camera lens.
Switz, Neil A; D'Ambrosio, Michael V; Fletcher, Daniel A
2014-01-01
The increasing capabilities and ubiquity of mobile phones and their associated digital cameras offer the possibility of extending low-cost, portable diagnostic microscopy to underserved and low-resource areas. However, mobile phone microscopes created by adding magnifying optics to the phone's camera module have been unable to make use of the full image sensor due to the specialized design of the embedded camera lens, exacerbating the tradeoff between resolution and field of view inherent to optical systems. This tradeoff is acutely felt for diagnostic applications, where the speed and cost of image-based diagnosis is related to the area of the sample that can be viewed at sufficient resolution. Here we present a simple and low-cost approach to mobile phone microscopy that uses a reversed mobile phone camera lens added to an intact mobile phone to enable high quality imaging over a significantly larger field of view than standard microscopy. We demonstrate use of the reversed lens mobile phone microscope to identify red and white blood cells in blood smears and soil-transmitted helminth eggs in stool samples.
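The optics of the reversed-lens trick can be captured in a few lines: because the reversed lens is essentially identical to the embedded camera lens, the system operates near unit magnification, so the object-space field of view approaches the full sensor size. The numbers below are hypothetical, for illustration only.

    # Hypothetical phone-camera numbers (not from the paper)
    pixel_um = 1.4                    # assumed pixel pitch
    sensor_px = (3264, 2448)          # assumed 8 MP sensor

    magnification = 1.0               # reversed identical lens -> ~1:1
    fov_mm = tuple(n * pixel_um / magnification / 1000.0 for n in sensor_px)
    nyquist_res_um = 2 * pixel_um / magnification  # sampling-limited, best case
    print(f"field of view: {fov_mm[0]:.1f} x {fov_mm[1]:.1f} mm")
    print(f"finest sampled detail: ~{nyquist_res_um:.1f} um")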
Low-Cost Mobile Phone Microscopy with a Reversed Mobile Phone Camera Lens
Fletcher, Daniel A.
2014-01-01
The increasing capabilities and ubiquity of mobile phones and their associated digital cameras offer the possibility of extending low-cost, portable diagnostic microscopy to underserved and low-resource areas. However, mobile phone microscopes created by adding magnifying optics to the phone's camera module have been unable to make use of the full image sensor due to the specialized design of the embedded camera lens, exacerbating the tradeoff between resolution and field of view inherent to optical systems. This tradeoff is acutely felt for diagnostic applications, where the speed and cost of image-based diagnosis is related to the area of the sample that can be viewed at sufficient resolution. Here we present a simple and low-cost approach to mobile phone microscopy that uses a reversed mobile phone camera lens added to an intact mobile phone to enable high quality imaging over a significantly larger field of view than standard microscopy. We demonstrate use of the reversed lens mobile phone microscope to identify red and white blood cells in blood smears and soil-transmitted helminth eggs in stool samples. PMID:24854188
Alternatives for Military Space Radar
2007-01-01
...transmitted microwaves to produce images of the Earth's surface (somewhat akin to photographs produced by optical imaging). By providing their own... microwaves for illumination (rather than sunlight, as in an optical imaging system). By providing their own illumination, radars can produce... carry a variety of payloads, including electro-optical, infrared, and SAR imagers; a film camera; and signals-intelligence equipment. The aircraft's...
Ultrahigh resolution radiation imaging system using an optical fiber structure scintillator plate.
Yamamoto, Seiichi; Kamada, Kei; Yoshikawa, Akira
2018-02-16
High resolution imaging of radiation is required for such radioisotope distribution measurements as alpha particle detection in nuclear facilities or high energy physics experiments. For this purpose, we developed an ultrahigh resolution radiation imaging system using an optical fiber structure scintillator plate. We used a ~1-μm diameter fiber structured GdAlO3:Ce (GAP)/α-Al2O3 scintillator plate to reduce the light spread. The fiber structured scintillator plate was optically coupled to a tapered optical fiber plate to magnify the image and combined with a lens-based high sensitivity CCD camera. We observed the images of alpha particles with a spatial resolution of ~25 μm. For the beta particles, the images had various shapes, and the trajectories of the electrons were clearly observed in the images. For the gamma photons, the images also had various shapes, and the trajectories of the secondary electrons were observed in some of the images. These results show that combining an optical fiber structure scintillator plate with a tapered optical fiber plate and a high sensitivity CCD camera achieved ultrahigh resolution and is a promising method to observe the images of the interactions of radiation in a scintillator.
Low-cost panoramic infrared surveillance system
NASA Astrophysics Data System (ADS)
Kecskes, Ian; Engel, Ezra; Wolfe, Christopher M.; Thomson, George
2017-05-01
A nighttime surveillance concept consisting of a single surface omnidirectional mirror assembly and an uncooled Vanadium Oxide (VOx) longwave infrared (LWIR) camera has been developed. This configuration provides a continuous field of view spanning 360° in azimuth and more than 110° in elevation. Both the camera and the mirror are readily available, off-the-shelf, inexpensive products. The mirror assembly is marketed for use in the visible spectrum and requires only minor modifications to function in the LWIR spectrum. The compactness and portability of this optical package offers significant advantages over many existing infrared surveillance systems. The developed system was evaluated on its ability to detect moving, human-sized heat sources at ranges between 10 m and 70 m. Raw camera images captured by the system are converted from rectangular coordinates in the camera focal plane to polar coordinates and then unwrapped into the user's azimuth and elevation frame. Digital background subtraction and color mapping are applied to the images to increase the user's ability to extract moving items from background clutter. A second optical system, consisting of a commercially available 50 mm f/1.2 ATHERM lens and a second LWIR camera, is used to examine the details of objects of interest identified using the panoramic imager. A description of the components of the proof of concept is given, followed by a presentation of raw images taken by the panoramic LWIR imager. A description of the method by which these images are analyzed is given, along with a presentation of these results side-by-side with the output of the 50 mm LWIR imager and a panoramic visible light imager. Finally, a discussion of the concept and its future development are given.
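The rectangular-to-polar unwrapping step described above amounts to resampling the annular mirror image along rays of constant azimuth. A minimal sketch, assuming elevation maps linearly to radius (a real mirror profile would need its own calibration curve):

    import numpy as np

    def unwrap_panorama(img, cx, cy, r_min, r_max, n_az=1440, n_el=240):
        # Nearest-neighbour unwrap of an omnidirectional (annular) image
        # into an azimuth x elevation panorama.
        az = np.linspace(0.0, 2.0 * np.pi, n_az, endpoint=False)
        r = np.linspace(r_min, r_max, n_el)
        A, R = np.meshgrid(az, r)
        xs = np.clip((cx + R * np.cos(A)).astype(int), 0, img.shape[1] - 1)
        ys = np.clip((cy + R * np.sin(A)).astype(int), 0, img.shape[0] - 1)
        return img[ys, xs]

    # Motion extraction is then a background difference on the unwrapped
    # frames, e.g.: moving = np.abs(frame - background) > threshold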
Analysis of Photogrammetry Data from ISIM Mockup
NASA Technical Reports Server (NTRS)
Nowak, Maria; Hill, Mike
2007-01-01
During ground testing of the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST), the ISIM Optics group plans to use a photogrammetry measurement system for cryogenic calibration of specific target points on the ISIM composite structure, the Science Instrument optical benches, and other GSE equipment. This testing will occur in the Space Environmental Systems (SES) chamber at Goddard Space Flight Center. Close range photogrammetry is a three-dimensional metrology technique that uses triangulation to locate custom targets in three coordinates from a collection of digital photographs taken at various locations and orientations. These photos are connected using coded targets, special targets that are recognized by the software and thus allow the images to be correlated into a three-dimensional map of the targets, scaled via well calibrated scale bars. Photogrammetry solves for the camera locations and the target coordinates simultaneously through the bundling procedure contained in the V-STARS software, proprietary software owned by Geodetic Systems Inc. The primary objectives of the metrology performed on the ISIM mock-up were (1) to quantify the accuracy of the INCA3 photogrammetry camera on a representative full scale version of the ISIM structure at ambient temperature, by comparing the measurements obtained with this camera to measurements using the Leica laser tracker system, and (2) to empirically determine the smallest increment of target position movement that can be resolved by the PG camera in the test setup, i.e., its precision or resolution. In addition, the geometrical details of the test setup defined during the mockup testing, such as target locations and camera positions, will contribute to the final design of the photogrammetry system to be used on the ISIM Flight Structure.
Fast and compact internal scanning CMOS-based hyperspectral camera: the Snapscan
NASA Astrophysics Data System (ADS)
Pichette, Julien; Charle, Wouter; Lambrechts, Andy
2017-02-01
Imec has developed a process for the monolithic integration of optical filters on top of CMOS image sensors, leading to compact, cost-efficient and faster hyperspectral cameras. Linescan cameras are typically used in remote sensing or for conveyor belt applications. Translation of the target is not always possible for large objects or in many medical applications. Therefore, we introduce a novel camera, the Snapscan (patent pending), exploiting internal movement of a linescan sensor enabling fast and convenient acquisition of high-resolution hyperspectral cubes (up to 2048x3652x150 in spectral range 475-925 nm). The Snapscan combines the spectral and spatial resolutions of a linescan system with the convenience of a snapshot camera.
Small diameter, deep bore optical inspection system
Lord, David E.; Petrini, Richard R.; Carter, Gary W.
1981-01-01
An improved rod optic system for inspecting small diameter, deep bores. The system consists of a rod optic assembly utilizing a curved mirror at the end of the rod lens such that the optical path through the system is bent 90° to minimize optical distortion in examining the sides of a curved bore. The system is particularly useful in the examination of small bores for corrosion, and is capable of examining drill holes as small as 1/16 inch in diameter and up to 4 inches deep, for example. The positioning of the curved mirror allows simultaneous viewing, from shallow and right angle points of observation, of the same artifact (such as corrosion) in the bore hole. The improved rod optic system may be used for direct eye sighting, or in combination with a still camera or a low-light television monitor, particularly low-light color television.
Free-space laser communication system with rapid acquisition based on astronomical telescopes.
Wang, Jianmin; Lv, Junyi; Zhao, Guang; Wang, Gang
2015-08-10
The general structure of a free-space optical (FSO) communication system based on astronomical telescopes is proposed. The light path for astronomical observation and for communication can be easily switched. A separate camera is used as a star sensor to determine the pointing direction of the optical terminal's antenna. The new system exhibits rapid acquisition and is widely applicable in various astronomical telescope systems and wavelengths. We present a detailed analysis of the acquisition time, which can be decreased by one order of magnitude compared with traditional optical communication systems. Furthermore, we verify software algorithms and tracking accuracy.
Advanced Wavefront Sensing and Control Testbed (AWCT)
NASA Technical Reports Server (NTRS)
Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell
2010-01-01
The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, future technologies for wavefront sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate active optical devices such as a fast steering mirror, a deformable mirror, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as the initial results from the testbed hardware integration and tests.
The PAUCam readout electronics system
NASA Astrophysics Data System (ADS)
Jiménez, Jorge; Illa, José M.; Cardiel-Sas, Laia; de Vicente, Juan; Castilla, Javier; Casas, Ricard
2016-08-01
PAUCam is an optical camera with a wide field of view of 1 deg x 1 deg and up to 46 narrow and broad band filters. The camera is installed on the William Herschel Telescope (WHT) in the Canary Islands, Spain, and was successfully commissioned during the first period of 2015. The paper presents the main results from the readout electronics commissioning tests and includes an overview of the whole readout electronics system, its configuration, and its current performance.
2002-07-10
KENNEDY SPACE CENTER, FLA. -- With the engines removed from Endeavour, the flow line can be inspected. On the right, Gerry Kathka, with United Space Alliance, hands part of a fiber-optic camera system to Scott Minnick, left. Minnick wears a special viewing apparatus that sees where the camera is going. The inspection is the result of small cracks being discovered on the LH2 Main Propulsion System (MPS) flow liners in other orbiters. Endeavour is next scheduled to fly on mission STS-113.
NASA Astrophysics Data System (ADS)
Yamamoto, Seiichi; Suzuki, Mayumi; Kato, Katsuhiko; Watabe, Tadashi; Ikeda, Hayato; Kanai, Yasukazu; Ogata, Yoshimune; Hatazawa, Jun
2016-09-01
Although iodine-131 (I-131) is used for radionuclide therapy, high resolution images are difficult to obtain with conventional gamma cameras because of the high energy of I-131 gamma photons (364 keV). Cerenkov-light imaging is a possible method for beta-emitting radionuclides, and I-131 (606 keV maximum beta energy) is a candidate for obtaining high resolution images. We developed a high energy gamma camera system for the I-131 radionuclide and combined it with a Cerenkov-light imaging system to form a gamma-photon/Cerenkov-light hybrid imaging system, allowing the simultaneously measured images of these two modalities to be compared. The high energy gamma imaging detector used 0.85 mm × 0.85 mm × 10 mm thick GAGG scintillator pixels arranged in a 44 × 44 matrix with a 0.1 mm thick reflector, optically coupled to a Hamamatsu 2 in. square position sensitive photomultiplier tube (PSPMT: H12700 MOD). The gamma imaging detector was encased in a 2 cm thick tungsten shield, and a pinhole collimator was mounted on its top to form a gamma camera system. The Cerenkov-light imaging system was made of a high sensitivity cooled CCD camera and was combined with the gamma camera using optical mirrors to image the same area of the subject. With this configuration, we simultaneously imaged the gamma photons and the Cerenkov light from I-131 in the subjects. The spatial resolution and sensitivity of the gamma camera system for I-131 were, respectively, 3 mm FWHM and 10 cps/MBq for the high sensitivity collimator at 10 cm from the collimator surface. The spatial resolution of the Cerenkov-light imaging system was 0.64 mm FWHM at 10 cm from the system surface. Thyroid phantom and rat images were successfully obtained with the developed gamma-photon/Cerenkov-light hybrid imaging system, allowing direct comparison of these two modalities. Our developed hybrid imaging system will be useful for evaluating the advantages and disadvantages of these two modalities.
NASA Astrophysics Data System (ADS)
Nikolashkin, S. V.; Reshetnikov, A. A.
2017-11-01
The video surveillance system used during active rocket experiments at the Polar Geophysical Observatory "Tixie", and in studies of the effects of "Soyuz" vehicle launches from the "Vostochny" cosmodrome over the territory of the Republic of Sakha (Yakutia), is presented. The system consists of three AHD video cameras with different angles of view mounted on a common platform on a tripod, with the possibility of manual guiding. The main camera, with a high-sensitivity black and white CCD matrix (SONY EXview HAD II), is equipped, depending on the task, with either an "MTO-1000" lens (F = 1000 mm) or a "Jupiter-21M" lens (F = 300 mm) and is designed for more detailed imaging of luminous formations. The second camera is of the same type but has a 30 degree angle of view; it is intended for imaging the general scene and large objects, and also for referencing object coordinates against the stars. The third, color wide-angle camera (120 degrees) is designed for referencing against landmarks in the daytime; the optical axis of this channel is directed 60 degrees downward. The data are recorded on the hard disk of a four-channel digital video recorder. Tests of the original two-channel version of the system were conducted during the launch of a geophysical rocket at Tixie in September 2015 and demonstrated its effectiveness.
Study of optical techniques for the Ames unitary wind tunnels. Part 4: Model deformation
NASA Technical Reports Server (NTRS)
Lee, George
1992-01-01
A survey of systems capable of model deformation measurements was conducted. The survey included stereo-cameras, scanners, and digitizers; moire, holographic, and heterodyne interferometry techniques were also examined. Stereo-cameras with passive or active targets are currently deployed for model deformation measurements at NASA Ames and LaRC, Boeing, and ONERA. Scanners and digitizers are widely used in robotics, motion analysis, medicine, and similar fields, and some of them can meet the model deformation requirements. Commercial stereo-cameras, scanners, and digitizers are being improved in accuracy, reliability, and ease of operation, and a number of new systems are coming onto the market.
Process simulation in digital camera system
NASA Astrophysics Data System (ADS)
Toadere, Florin
2012-06-01
The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear, shift invariant and axial; the light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the point spread functions of the different optical components; we use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPG compression. We reconstruct the noisy blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed pattern noise and sharpen the image. Then come the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
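To make the back end of such a pipeline concrete, here is a minimal sketch (ours, not the paper's implementation) covering three of the blocks named above: demosaicing an RGGB Bayer mosaic (by 2x2 binning for brevity, rather than true interpolation), white balancing, and gamma encoding.

    import numpy as np

    def minimal_pipeline(raw, wb_gains=(2.0, 1.0, 1.6), gamma=2.2):
        # Toy camera back end: RGGB Bayer "demosaic" by 2x2 binning,
        # white balance, normalization, and gamma encoding.
        # raw: 2D float array, even dimensions, RGGB pattern assumed.
        r = raw[0::2, 0::2]
        g = 0.5 * (raw[0::2, 1::2] + raw[1::2, 0::2])
        b = raw[1::2, 1::2]
        rgb = np.stack([r * wb_gains[0], g * wb_gains[1], b * wb_gains[2]],
                       axis=-1)
        rgb = np.clip(rgb / rgb.max(), 0.0, 1.0)
        return rgb ** (1.0 / gamma)    # display-referred encoding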
Minimising back reflections from the common path objective in a fundus camera
NASA Astrophysics Data System (ADS)
Swat, A.
2016-11-01
Eliminating back reflections is critical in the design of a fundus camera with an internal illuminating system. As very little light is reflected from the retina, even excellent antireflective coatings do not sufficiently suppress ghost reflections; therefore the number of surfaces in the optics common to the illuminating and imaging paths shall be minimised. Typically a single aspheric objective is used. In the paper an alternative approach, an objective with all spherical surfaces, is presented. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Typically, back-reflection analysis comprises treating subsequent objective surfaces as mirrors, with reflections from the objective surfaces traced back through the imaging path. This approach can be applied in both sequential and nonsequential ray tracing. It is good enough for a system check but not very suitable for the early optimisation process in the optical system design phase. Standard ghost-control merit function operands are also available in sequential ray tracing, for example in the Zemax system, but these do not allow a back ray-trace in an alternative optical path (illumination vs. imaging). What is proposed in the paper is a complete method to incorporate ghost-reflected energy into the ray-tracing system merit function for sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical.
Exploring the Universe with the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
1990-01-01
A general overview is given of the operations, engineering challenges, and components of the Hubble Space Telescope. Deployment, checkout and servicing in space are discussed. The optical telescope assembly, focal plane scientific instruments, wide field/planetary camera, faint object spectrograph, faint object camera, Goddard high resolution spectrograph, high speed photometer, fine guidance sensors, second generation technology, and support systems and services are reviewed.
2002-07-10
KENNEDY SPACE CENTER, FLA. -- Scott Minnick, with United Space Alliance, places a fiber-optic camera inside the flow line on Endeavour. Minnick wears a special viewing apparatus that sees where the camera is going. The inspection is the result of small cracks being discovered on the LH2 Main Propulsion System (MPS) flow liners in other orbiters. Endeavour is next scheduled to fly on mission STS-113.
Geometric calibration of lens and filter distortions for multispectral filter-wheel cameras.
Brauers, Johannes; Aach, Til
2011-02-01
High-fidelity color image acquisition with a multispectral camera utilizes optical filters to separate the visible electromagnetic spectrum into several passbands. This is often realized with a computer-controlled filter wheel, where each position is equipped with an optical bandpass filter. For each filter wheel position, a grayscale image is acquired and the passbands are finally combined to a multispectral image. However, the different optical properties and non-coplanar alignment of the filters cause image aberrations since the optical path is slightly different for each filter wheel position. As in a normal camera system, the lens causes additional wavelength-dependent image distortions called chromatic aberrations. When transforming the multispectral image with these aberrations into an RGB image, color fringes appear, and the image exhibits a pincushion or barrel distortion. In this paper, we address both the distortions caused by the lens and by the filters. Based on a physical model of the bandpass filters, we show that the aberrations caused by the filters can be modeled by displaced image planes. The lens distortions are modeled by an extended pinhole camera model, which results in a remaining mean calibration error of only 0.07 pixels. Using an absolute calibration target, we then geometrically calibrate each passband and compensate for both lens and filter distortions simultaneously. We show that both types of aberrations can be compensated and present detailed results on the remaining calibration errors.
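The simultaneous compensation can be pictured as applying, per passband, an inverse of the fitted lens-plus-filter distortion model. As a hedged sketch (the paper's extended pinhole model has more terms), inverting a two-coefficient radial distortion by fixed-point iteration looks like this:

    import numpy as np

    def undistort_points(xy, k1, k2, center, focal):
        # Invert x_d = x_u * (1 + k1*r^2 + k2*r^4) by fixed-point
        # iteration; xy is an (N, 2) array of pixel coordinates.
        # Each filter-wheel position gets its own fitted (k1, k2, center).
        xn = (np.asarray(xy, float) - center) / focal  # normalized coords
        x = xn.copy()
        for _ in range(10):                            # converges quickly
            r2 = (x ** 2).sum(axis=1, keepdims=True)
            x = xn / (1.0 + k1 * r2 + k2 * r2 ** 2)
        return x * focal + center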
Space infrared telescope facility wide field and diffraction limited array camera (IRAC)
NASA Technical Reports Server (NTRS)
Fazio, Giovanni G.
1988-01-01
The wide-field and diffraction limited array camera (IRAC) is capable of two-dimensional photometry in either a wide-field or diffraction-limited mode over the wavelength range from 2 to 30 microns with a possible extension to 120 microns. A low-doped indium antimonide detector was developed for 1.8 to 5.0 microns, detectors were tested and optimized for the entire 1.8 to 30 micron range, beamsplitters were developed and tested for the 1.8 to 30 micron range, and tradeoff studies of the camera's optical system were performed. Data are presented on the performance of InSb, Si:In, Si:Ga, and Si:Sb array detectors bump-bonded to a multiplexed CMOS readout chip of the source-follower type at SIRTF operating backgrounds (≤ 1 × 10^8 ph/cm²/s) and temperatures (4 to 12 K). Some results at higher temperatures are also presented for comparison with the SIRTF-temperature results. Data are also presented on the performance of IRAC beamsplitters at room temperature at both 0 and 45 deg angle of incidence, and on the performance of the all-reflecting optical system baselined for the camera.
NASA Technical Reports Server (NTRS)
Kiplinger, Alan L.; Dennis, Brian R.; Orwig, Larry E.; Chen, P. C.
1988-01-01
A solid-state digital camera was developed for obtaining H alpha images of solar flares with 0.1 s time resolution. Beginning in the summer of 1988, this system will be operated in conjunction with SMM's hard X-ray burst spectrometer (HXRBS). Important electron time-of-flight effects that are crucial for determining the flare energy release processes should be detectable with these combined H alpha and hard X-ray observations. Charge-injection device (CID) cameras provide 128 x 128 pixel images simultaneously in the H alpha blue wing, line center, and red wing, or other wavelengths of interest. The data recording system employs a microprocessor-controlled electronic interface between each camera and a digital processor board that encodes the data into a serial bitstream for continuous recording by a standard video cassette recorder. Only a small fraction of the data will be permanently archived, through utilization of a direct memory access interface onto a VAX-750 computer. In addition to correlations with hard X-ray data, observations from the high speed H alpha camera will also be correlated with optical and microwave data, and with data from future MAX 1991 campaigns. Whether the recorded optical flashes are simultaneous with X-ray peaks to within 0.1 s, are delayed by tenths of seconds, or are even undetectable, the results will have implications for the validity of both thermal and nonthermal models of hard X-ray production.
Marshall Grazing Incidence X-ray Spectrometer (MaGIXS) Slit-Jaw Imaging System
NASA Astrophysics Data System (ADS)
Wilkerson, P.; Champey, P. R.; Winebarger, A. R.; Kobayashi, K.; Savage, S. L.
2017-12-01
The Marshall Grazing Incidence X-ray Spectrometer is a NASA sounding rocket payload providing a 0.6-2.5 nm spectrum with unprecedented spatial and spectral resolution. The instrument comprises a novel optical design featuring a Wolter-1 grazing incidence telescope, which produces a focused solar image on a slit plate, an identical pair of stigmatic optics, a planar diffraction grating, and a low-noise detector. When MaGIXS flies on a suborbital launch in 2019, a slit-jaw camera system will reimage the focal plane of the telescope, providing a reference for pointing the telescope on the solar disk and for aligning the data to supporting observations from satellites and other rockets. The telescope focuses the X-ray and EUV image of the Sun onto a plate covered with a phosphor coating that absorbs the EUV photons and fluoresces in visible light. This 10-week REU project was aimed at optimizing an off-axis mounted camera with 600-line resolution NTSC video for extremely low light imaging of the slit plate. Radiometric calculations indicate an intensity of less than 1 lux at the slit-jaw plane, which set the requirement for camera sensitivity. We selected a Watec 910DB EIA charge-coupled device (CCD) monochrome camera, which has a manufacturer-quoted sensitivity of 0.0001 lux at F1.2. A high magnification and low distortion lens was then identified to image the slit-jaw plane from a distance of approximately 10 cm. Tests with the selected CCD camera show that at extreme low-light levels we achieve a higher resolution than expected, with only a moderate drop in frame rate. Based on sounding rocket flight heritage, the launch vehicle attitude control system is known to stabilize the instrument pointing such that jitter does not degrade video quality for context imaging. Future steps towards implementation of the imaging system will include ruggedizing the flight camera housing and mounting the selected camera and lens combination to the instrument structure.
Lee, Peter; Bollensdorff, Christian; Quinn, T. Alexander; Wuskell, Joseph P.; Loew, Leslie M.; Kohl, Peter
2011-01-01
Background: Simultaneous optical mapping of multiple electrophysiologically relevant parameters in living myocardium is desirable for integrative exploration of mechanisms underlying heart rhythm generation under normal and pathophysiologic conditions. Current multiparametric methods are technically challenging, usually involving multiple sensors and moving parts, which contributes to high logistic and economic thresholds that prevent easy application of the technique.
Objective: The purpose of this study was to develop a simple, affordable, and effective method for spatially resolved, continuous, simultaneous, and multiparametric optical mapping of the heart, using a single camera.
Methods: We present a new method to simultaneously monitor multiple parameters using inexpensive off-the-shelf electronic components and no moving parts. The system comprises a single camera, commercially available optical filters, and light-emitting diodes (LEDs), integrated via microcontroller-based electronics for frame-accurate illumination of the tissue. For proof of principle, we illustrate measurement of four parameters, suitable for ratiometric mapping of membrane potential (di-4-ANBDQPQ) and intracellular free calcium (fura-2), in an isolated Langendorff-perfused rat heart during sinus rhythm and ectopy, induced by local electrical or mechanical stimulation.
Results: The pilot application demonstrates the suitability of this imaging approach for heart rhythm research in the isolated heart. In addition, locally induced excitation, whether stimulated electrically or mechanically, gives rise to similar ventricular propagation patterns.
Conclusion: Combining an affordable camera with suitable optical filters and microprocessor-controlled LEDs, single-sensor multiparametric optical mapping can be practically implemented in a simple yet powerful configuration and applied to heart rhythm research. The moderate system complexity and component cost are destined to lower the threshold to broader application of functional imaging and to ease implementation of more complex optical mapping approaches, such as multiparametric panoramic imaging. A proof-of-principle application confirmed that although electrically and mechanically induced excitation occur by different mechanisms, their electrophysiologic consequences downstream from the point of activation are not dissimilar. PMID:21459161
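The frame-accurate illumination scheme reduces, in essence, to a repeating excitation schedule locked to the camera's frame sync. The sketch below emulates a plausible four-slot schedule for the two ratiometric dyes named above; the slot order and names are our assumption, not the published firmware.

    from itertools import cycle

    # One excitation source per camera frame; frames 4k/4k+1 pair up for
    # the voltage (di-4-ANBDQPQ) ratio, frames 4k+2/4k+3 for the fura-2
    # calcium ratio (340/380 nm excitation).
    SCHEDULE = ["Vm_exc_A", "Vm_exc_B", "Ca_340nm", "Ca_380nm"]

    def led_for_frame(frame_index):
        # LED to fire during a given camera frame (frame-sync driven).
        return SCHEDULE[frame_index % len(SCHEDULE)]

    # In hardware, a microcontroller advances this schedule on each camera
    # frame-sync pulse, guaranteeing frame-accurate illumination.
    led_sequence = cycle(SCHEDULE)    # equivalent generator form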
A high-resolution multimode digital microscope system.
Salmon, Edward D; Shaw, Sidney L; Waters, Jennifer C; Waterman-Storer, Clare M; Maddox, Paul S; Yeh, Elaine; Bloom, Kerry
2013-01-01
This chapter describes the development of a high-resolution, multimode digital imaging system based on a wide-field epifluorescence and transmitted-light microscope and a cooled charge-coupled device (CCD) camera. The three main parts of this imaging system are a Nikon FXA microscope, a Hamamatsu C4880 cooled CCD camera, and the MetaMorph digital imaging system. This chapter presents various design criteria for the instrument and describes the major features of the microscope components, the cooled CCD camera, and the MetaMorph digital imaging system. The Nikon FXA upright microscope can produce high-resolution images for both epifluorescence and transmitted-light illumination without switching the objective or moving the specimen. The functional aspects of the microscope setup can be considered in terms of the imaging optics, the epi-illumination optics, the transillumination optics, the focus control, and the vibration isolation table. The instrument is somewhat specialized for microtubule and mitosis studies, but it is also applicable to a variety of problems in cellular imaging, including tracking proteins fused to green fluorescent protein in live cells. The instrument is also valuable for correlating the assembly dynamics of individual cytoplasmic microtubules (labeled by conjugating X-rhodamine to tubulin) with the dynamics of membranes of the endoplasmic reticulum (labeled with DiOC6) and of the cell cortex (by differential interference contrast) in migrating vertebrate epithelial cells. This imaging system also plays an important role in the analysis of mitotic mutants in the powerful yeast genetic system Saccharomyces cerevisiae. Copyright © 1998 Elsevier Inc. All rights reserved.
High Speed Digital Camera Technology Review
NASA Technical Reports Server (NTRS)
Clements, Sandra D.
2009-01-01
A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.
Pentacam Scheimpflug quantitative imaging of the crystalline lens and intraocular lens.
Rosales, Patricia; Marcos, Susana
2009-05-01
To implement geometrical and optical distortion correction methods for anterior segment Scheimpflug images obtained with a commercially available system (Pentacam, Oculus Optikgeräte GmbH), ray-tracing algorithms were implemented to obtain corrected ocular surface geometry from the original images captured by the Pentacam's CCD camera. As details of the optical layout were not fully provided by the manufacturer, an iterative procedure (based on imaging of calibrated spheres) was developed to estimate the camera lens specifications. The correction procedure was tested on Scheimpflug images of a physical water-cell model eye (with a polymethylmethacrylate cornea and a commercial IOL of known dimensions) and of a normal human eye previously measured with a Scheimpflug camera corrected for optical and geometrical distortion (Topcon SL-45 [Topcon Medical Systems Inc.] from the Vrije Universiteit, Amsterdam, the Netherlands). Uncorrected Scheimpflug images show flatter surfaces and thinner lenses than in reality. The application of geometrical and optical distortion correction algorithms improves the accuracy of the estimated anterior lens radii of curvature by 30% to 40% and of the estimated posterior lens radii by 50% to 100%. The average error in the retrieved radii was 0.37 and 0.46 mm for the anterior and posterior lens radii of curvature, respectively, and 0.048 mm for lens thickness. The Pentacam Scheimpflug system can be used to obtain quantitative information on the geometry of the crystalline lens, within the accuracy of state-of-the-art phakometry and biometry, provided that geometrical and optical distortion correction algorithms are applied. The technique could improve with exact knowledge of the technical specifications of the instrument, improved edge detection algorithms, consideration of aspheric and non-rotationally symmetric surfaces, and introduction of a crystalline lens gradient index.
NASA Tech Briefs, October 2005
NASA Technical Reports Server (NTRS)
2005-01-01
Topics covered include: Insect-Inspired Optical-Flow Navigation Sensors; Chemical Sensors Based on Optical Ring Resonators; A Broad-Band Phase-Contrast Wave-Front Sensor; Progress in Insect-Inspired Optical Navigation Sensors; Portable Airborne Laser System Measures Forest-Canopy Height; Deployable Wide-Aperture Array Antennas; Faster Evolution of More Multifunctional Logic Circuits; Video-Camera-Based Position-Measuring System; N-Type delta Doping of High-Purity Silicon Imaging Arrays; Avionics System Architecture Tool; Updated Chemical Kinetics and Sensitivity Analysis Code; Predicting Flutter and Forced Response in Turbomachinery; Upgrades of Two Computer Codes for Analysis of Turbomachinery; Program Facilitates CMMI Appraisals; Grid Visualization Tool; Program Computes Sound Pressures at Rocket Launches; Solar-System Ephemeris Toolbox; Data-Acquisition Software for PSP/TSP Wind-Tunnel Cameras; Corrosion-Prevention Capabilities of a Water-Borne, Silicone-Based, Primerless Coating; Sol-Gel Process for Making Pt-Ru Fuel-Cell Catalysts; Making Activated Carbon for Storing Gas; System Regulates the Water Contents of Fuel-Cell Streams; Five-Axis, Three-Magnetic-Bearing Dynamic Spin Rig; Modifications of Fabrication of Vibratory Microgyroscopes; Chamber for Growing and Observing Fungi; Electroporation System for Sterilizing Water; Thermoelectric Air/Soil Energy-Harvesting Device; Flexible Metal-Fabric Radiators; Actuated Hybrid Mirror Telescope; Optical Design of an Optical Communications Terminal; Algorithm for Identifying Erroneous Rain-Gauge Readings; Condition Assessment and End-of-Life Prediction System for Electric Machines and Their Loads; Lightweight Thermal Insulation for a Liquid-Oxygen Tank; Stellar Gyroscope for Determining Attitude of a Spacecraft; and Lifting Mechanism for the Mars Explorer Rover.
NASA Technical Reports Server (NTRS)
Defrere, D.; Hinz, P.; Downey, E.; Boehm, M.; Danchi, W. C.; Durney, O.; Ertel, S.; Hill, J. M.; Hoffmann, W. F.; Mennesson, B.;
2016-01-01
The Large Binocular Telescope Interferometer (LBTI) uses a near-infrared camera to measure the optical path length variations between the two AO-corrected apertures and to provide high-angular-resolution observations for all its science channels (1.5-13 microns). There is, however, a wavelength-dependent component to the atmospheric turbulence, which can introduce optical path length errors when observing at a wavelength different from that of the fringe-sensing camera. Water vapor in particular is highly dispersive, and its effect must be taken into account for high-precision infrared interferometric observations, as described previously for VLTI/MIDI or the Keck Interferometer Nuller. In this paper, we describe the new sensing approach that has been developed at the LBT to measure and monitor the optical path length fluctuations due to dry air and water vapor separately. After reviewing the current performance of the system for dry-air seeing compensation, we present simultaneous H-, K-, and N-band observations that illustrate the feasibility of our feed-forward approach to stabilize the path length fluctuations seen by the LBTI nuller.
A system for simulating aerial or orbital TV observations of geographic patterns
NASA Technical Reports Server (NTRS)
Latham, J. P.
1972-01-01
A system which simulates observation of the earth's surface by aerial or orbiting television devices has been developed. By projecting color slides of photographs taken by aircraft and orbiting sensors onto a rear-screen system, and altering the scale of the projected image, the screen position, or the TV camera position, it is possible to simulate alternative altitudes or optical systems. By altering the scan-line pattern in a COHU 3200 series camera from 525 to 945 scan lines, it is possible to study the implications of scan-line resolution for the detection and analysis of geographic patterns observed by orbiting TV systems.
NASA Astrophysics Data System (ADS)
Gomes, Gary G.
1986-05-01
A cost-effective and supportable color visual system has been developed to provide the necessary visual cues to United States Air Force B-52 bomber pilots training to become proficient at the task of in-flight refueling. This camera-model visual system approach is not suitable for all simulation applications, but provides a cost-effective alternative to digital image generation systems when high fidelity of a single movable object is required. The system consists of a three-axis gimballed KC-135 tanker model, a range-carriage-mounted, color-augmented monochrome television camera, interface electronics, a color light-valve projector, and an infinity-optics display system.
An optical system for detecting 3D high-speed oscillation of a single ultrasound microbubble
Liu, Yuan; Yuan, Baohong
2013-01-01
As contrast agents, microbubbles play a significant role in ultrasound imaging. Investigation of microbubble oscillation is crucial for microbubble characterization and detection. Unfortunately, 3-dimensional (3D) observation of microbubble oscillation is challenging and costly because of the bubble size (a few microns in diameter) and the high-speed dynamics under MHz ultrasound pressure waves. In this study, a cost-efficient optical confocal microscopic system combined with a gated, intensified charge-coupled device (ICCD) camera was developed to detect 3D microbubble oscillation. The capability of imaging high-speed microbubble oscillation at much lower cost than with an ultra-fast framing or streak camera system was demonstrated. In addition, microbubble oscillations along both the lateral (x and y) and axial (z) directions were demonstrated. Accordingly, this system is an excellent alternative for 3D investigation of high-speed microbubble oscillation, especially when budgets are limited. PMID:24049677
Measurement methods and accuracy analysis of Chang'E-5 Panoramic Camera installation parameters
NASA Astrophysics Data System (ADS)
Yan, Wei; Ren, Xin; Liu, Jianjun; Tan, Xu; Wang, Wenrui; Chen, Wangli; Zhang, Xiaoxia; Li, Chunlai
2016-04-01
Chang'E-5 (CE-5) is a lunar probe for the third phase of the China Lunar Exploration Project (CLEP), whose main scientific objectives are to implement lunar surface sampling and to return the samples to Earth. To achieve these goals, investigation of the lunar surface topography and geological structure within the sampling area is extremely important. The Panoramic Camera (PCAM) is one of the payloads mounted on the CE-5 lander. It consists of two optical systems installed on a rotating camera platform. Optical images of the sampling area can be obtained by PCAM in the form of two-dimensional images, and a stereo image pair can be formed from the left and right PCAM images, from which the lunar terrain can be reconstructed photogrammetrically. The installation parameters of PCAM with respect to the CE-5 lander are critical for calculating the exterior orientation elements (EO) of PCAM images, which are used for lunar terrain reconstruction. In this paper, the types of PCAM installation parameters and the coordinate systems involved are defined, and measurement methods combining camera images with optical coordinate observations are studied. The observation program and the specific solution methods for the installation parameters are then introduced. Solution accuracy is analyzed using observations from the PCAM scientific validation experiment, which tests the PCAM detection process, ground data processing methods, product quality, and so on. The analysis shows that the accuracy of the installation parameters affects the positional accuracy of corresponding image points in PCAM stereo images by less than 1 pixel, so the measurement methods and parameter accuracy studied in this paper meet the needs of engineering and scientific applications. Keywords: Chang'E-5 Mission; Panoramic Camera; Installation Parameters; Total Station; Coordinate Conversion
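For intuition about how such a stereo pair yields terrain coordinates, the sketch below triangulates a point from an idealized, rectified PCAM-like pair. The focal length and baseline are illustrative placeholders; the real pipeline derives exterior orientation elements from the measured installation parameters rather than assuming a rectified geometry.

```python
# Hedged sketch: depth from horizontal disparity for an idealized rectified
# stereo pair. All numeric parameters are illustrative assumptions.
def triangulate(x_left_px, x_right_px, y_px, focal_px, baseline_m):
    """Return (x, y, z) of a terrain point in the left-camera frame."""
    disparity = x_left_px - x_right_px        # pixels
    z = focal_px * baseline_m / disparity     # range to the terrain point
    x = x_left_px * z / focal_px              # lateral offset
    y = y_px * z / focal_px                   # vertical offset
    return x, y, z

print(triangulate(120.0, 80.0, 30.0, focal_px=2500.0, baseline_m=0.27))
```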
Cryogenic system for the ArTeMiS large submillimeter camera
NASA Astrophysics Data System (ADS)
Ercolani, E.; Relland, J.; Clerc, L.; Duband, L.; Jourdan, T.; Talvard, M.; Le Pennec, J.; Martignac, J.; Visticot, F.
2014-07-01
A new photonic camera has been developed in the framework of the ArTéMiS project (bolometer architecture for large-field-of-view ground-based telescopes in the submillimeter). This camera scans the sky in the submillimeter range at three wavelengths simultaneously (200 μm, 350 μm, and 450 μm) and is installed inside the APEX telescope, located at 5100 m above sea level in Chile. Bolometric detectors cooled to 300 mK are used in the camera, which is integrated in an original cryostat developed at the low temperature laboratory (SBT) of the INAC institute. This cryostat contains filters, optics, mirrors, and detectors, which have to be implemented according to mass, size, and stiffness requirements; as a result, the cryostat exhibits an unusual geometry. The inner structure of the cryostat is a 40 K plate which acts as an optical bench and is bound to the external vessel through two hexapods, one fixed and the other mobile via a ball bearing. Once the cryostat is cold, this arrangement enables all the different elements to be aligned with the optical axis. The cryogenic chain is built around a pulse tube cooler (40 K and 4 K) coupled to a double-stage helium sorption cooler (300 mK). The cryogenic and vacuum processes are managed by a Siemens PLC, and all the data are displayed and stored on a CEA SCADA system. This paper describes the mechanical and thermal design of the cryostat, its command and control, and the first thermal laboratory tests. This work was carried out in collaboration with the Astrophysics laboratory (SAp) of the IRFU institute. SAp and SBT installed the camera inside the Cassegrain cabin of APEX in July 2013.
NASA Technical Reports Server (NTRS)
Short, David A.; Lane, Robert E., Jr.; Winters, Katherine A.; Madura, John T.
2004-01-01
Clouds are highly effective in obscuring optical images of the Space Shuttle taken during its ascent by ground-based and airborne tracking cameras. Because the imagery is used for quick-look and post-flight engineering analysis, the Columbia Accident Investigation Board (CAIB) recommended the return-to-flight effort include an upgrade of the imaging system to enable it to obtain at least three useful views of the Shuttle from lift-off to at least solid rocket booster (SRB) separation (NASA 2003). The lifetimes of individual cloud elements capable of obscuring optical views of the Shuttle are typically 20 minutes or less. Therefore, accurately observing and forecasting cloud obscuration over an extended network of cameras poses an unprecedented challenge for the current state of observational and modeling techniques. In addition, even the best numerical simulations based on real observations will never reach "truth." In order to quantify the risk that clouds would obscure optical imagery of the Shuttle, a 3D model to calculate probabilistic risk was developed. The model was used to estimate the ability of a network of optical imaging cameras to obtain at least N simultaneous views of the Shuttle from lift-off to SRB separation in the presence of an idealized, randomized cloud field.
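As a toy illustration of the probabilistic question being asked (not the 3D cloud model itself), the sketch below Monte-Carlo-estimates the chance of obtaining at least N simultaneous views from M cameras, assuming independent per-camera clear probabilities; the actual model replaces this independence assumption with an idealized, randomized 3D cloud field.

```python
# Hedged sketch: probability of at least n_required simultaneous views from
# m_cameras, with independent per-camera clear probability p_clear. The
# CAIB-driven model described above used a correlated 3D cloud field instead.
import random

def prob_at_least_n_views(p_clear, m_cameras, n_required, trials=100_000):
    hits = 0
    for _ in range(trials):
        clear = sum(random.random() < p_clear for _ in range(m_cameras))
        hits += clear >= n_required
    return hits / trials

print(prob_at_least_n_views(p_clear=0.7, m_cameras=8, n_required=3))
```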
A Structured Light Sensor System for Tree Inventory
NASA Technical Reports Server (NTRS)
Chien, Chiun-Hong; Zemek, Michael C.
2000-01-01
Tree inventory refers to the measurement and estimation of marketable wood volume in a piece of land or forest for purposes such as investment or loan applications. Existing techniques rely on trained surveyors conducting measurements manually using simple optical or mechanical devices, and hence are time-consuming, subjective, and error-prone. Advances in computer vision techniques make it possible to conduct automatic measurements that are more efficient, objective, and reliable. This paper describes 3D measurement of tree diameters using a uniquely designed ensemble of two line-laser emitters rigidly mounted on a video camera. The proposed laser-camera system relies on the fixed distance between two parallel laser planes and the projections of the laser lines to calculate tree diameters. Performance of the laser-camera system is further enhanced by fusing information induced from the structured lighting with that contained in the video images. A comparison is made between the laser-camera sensor system and a stereo vision system previously developed for measuring tree diameters.
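The arithmetic can be sketched under one plausible reading of the geometry: the two parallel laser planes, a known distance apart, project two lines onto the trunk; their pixel separation fixes the local image scale at the trunk's range, and the trunk's pixel width then yields the diameter. The deployed system's exact geometry and its fusion with image data are not reproduced here.

```python
# Hedged sketch under an assumed two-plane geometry: the known gap between
# the laser planes calibrates mm-per-pixel at the trunk, which converts the
# trunk's pixel width into a diameter. Numbers are illustrative.
def trunk_diameter_mm(trunk_width_px, line_gap_px, plane_gap_mm):
    scale_mm_per_px = plane_gap_mm / line_gap_px   # local scale at the trunk
    return trunk_width_px * scale_mm_per_px

print(trunk_diameter_mm(trunk_width_px=180, line_gap_px=60, plane_gap_mm=100))
# -> 300.0 (a 30 cm trunk under these illustrative numbers)
```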
Weather and atmosphere observation with the ATOM all-sky camera
NASA Astrophysics Data System (ADS)
Jankowsky, Felix; Wagner, Stefan
2015-03-01
The Automatic Telescope for Optical Monitoring (ATOM) for H.E.S.S. is a 75 cm optical telescope which operates fully automatically. As there is no observer present during observations, an auxiliary all-sky camera serves as a weather monitoring system. This device takes an image of the whole sky every three minutes. The gathered data then undergo live analysis by astrometric comparison with a theoretical night-sky model, interpreting the absence of stars as cloud coverage. The sky monitor also serves as a tool for meteorological analysis of the observation site of the upcoming Cherenkov Telescope Array. This overview covers the design and benefits of the all-sky camera and gives an introduction to current efforts to integrate the device into the atmosphere analysis programme of H.E.S.S.
NASA Technical Reports Server (NTRS)
Chen, Fang-Jenq
1997-01-01
Flow visualization produces data in the form of two-dimensional images. If the optical components of a camera system were perfect, the transformation equations between the two-dimensional image and the three-dimensional object space would be linear and easy to solve. However, real camera lenses introduce nonlinear distortions that affect the accuracy of the transformation unless proper corrections are applied. An iterative least-squares adjustment algorithm is developed to solve the nonlinear transformation equations incorporating distortion corrections. Experimental applications demonstrate that a relative precision on the order of one part in 40,000 is achievable without tedious laboratory calibrations of the camera.
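The abstract does not give the exact distortion model, so the sketch below shows one common form of the idea: a single radial term k1 combined with a linear image-to-object transform, fitted by scanning k1 and solving the linear part by least squares at each step. The paper's own iterative adjustment refines all parameters jointly; this grid-search variant is only a stand-in.

```python
# Hedged sketch of distortion-corrected image-to-object calibration: one
# radial coefficient k1 plus an affine transform, scored in object space.
import numpy as np

def undistort(pts, k1, center):
    d = pts - center
    r2 = (d ** 2).sum(axis=1, keepdims=True)
    return center + d * (1.0 + k1 * r2)

def calibrate(image_pts, object_pts, center, k1_grid):
    """Return (rms, k1, affine) minimizing object-space residuals."""
    best = None
    for k1 in k1_grid:
        u = undistort(image_pts, k1, center)
        A = np.hstack([u, np.ones((len(u), 1))])        # affine design matrix
        M, *_ = np.linalg.lstsq(A, object_pts, rcond=None)
        rms = np.sqrt(((object_pts - A @ M) ** 2).mean())
        if best is None or rms < best[0]:
            best = (rms, k1, M)
    return best
```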
Monte-Carlo Simulation for Accuracy Assessment of a Single Camera Navigation System
NASA Astrophysics Data System (ADS)
Bethmann, F.; Luhmann, T.
2012-07-01
The paper describes a simulation-based optimization of an optical tracking system that is used as a 6DOF navigation system for neurosurgery. Compared to classical systems used in clinical navigation, the presented system has two unique properties: first, the system will be miniaturized and integrated into an operating microscope for neurosurgery; second, due to the miniaturization, a single-camera approach has been designed. Single-camera techniques for 6DOF measurement are especially sensitive to weak geometric configurations between the camera and the object. In addition, the achievable accuracy depends significantly on the geometric properties of the tracked objects (locators): besides the quality and stability of the targets used on the locator, their geometric configuration is of major importance. In the following, the development and investigation of a simulation program is presented which allows for the assessment and optimization of the system with respect to accuracy. Different system parameters can be altered, as can different scenarios representing the operational use of the system. Measurement deviations are estimated based on the Monte-Carlo method. Practical measurements validate the correctness of the numerical simulation results.
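The Monte-Carlo idea itself is compact: perturb the measured target coordinates with sensor noise, re-estimate the pose each trial, and report the scatter. For brevity the sketch below estimates a 2D rigid pose (Procrustes fit) instead of the paper's full 6DOF single-camera model; the locator geometry and noise level are illustrative.

```python
# Hedged sketch of Monte-Carlo accuracy assessment: noisy re-estimation of a
# rigid pose from a 4-target locator. 2D stand-in for the 6DOF problem.
import numpy as np

rng = np.random.default_rng(0)
locator = np.array([[0.0, 0.0], [40.0, 0.0], [40.0, 25.0], [0.0, 25.0]])  # mm

def rigid_pose(src, dst):
    """Least-squares 2D rotation + translation (Kabsch/Procrustes fit)."""
    sc, dc = src.mean(0), dst.mean(0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    R = Vt.T @ U.T
    return np.arctan2(R[1, 0], R[0, 0]), dc - R @ sc

angles = []
for _ in range(10_000):
    noisy = locator + rng.normal(scale=0.05, size=locator.shape)  # 50 um noise
    angle, _ = rigid_pose(locator, noisy)
    angles.append(np.degrees(angle))
print(f"1-sigma rotation error: {np.std(angles):.4f} deg")
```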
NASA Astrophysics Data System (ADS)
Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.
1990-10-01
Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and a wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, and technology aspects and solutions. Both HRSC and WAOSS are pushbroom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.
Traffic monitoring with distributed smart cameras
NASA Astrophysics Data System (ADS)
Sidla, Oliver; Rosner, Marcin; Ulm, Michael; Schwingshackl, Gert
2012-01-01
The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has great potential. Today, the automated analysis of traffic situations is still in its infancy; the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully captured and interpreted by a vision system. In this work we present steps towards a visual monitoring system which is designed to detect potentially dangerous traffic situations around a pedestrian crossing at a street intersection. The camera system is specifically designed to detect incidents in which the interaction of pedestrians and vehicles might develop into safety-critical encounters. The proposed system has been field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of three smart cameras, each built from a very compact PC hardware system in a weatherproof housing. Two cameras run vehicle detection and tracking software; one camera runs a pedestrian detection and tracking module based on the HOG detection principle. All three cameras use sparse optical flow computation on a low-resolution video stream to estimate the motion path and speed of objects. Geometric calibration of the cameras allows us to estimate the real-world coordinates of detected objects and to link the cameras together into one common reference system. This work describes the foundation for the different object detection modalities (pedestrians, vehicles) and explains the system setup, its design, and the evaluation results achieved so far.
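A rough sense of the two per-camera software modules can be given with off-the-shelf OpenCV building blocks. This is a generic stand-in, not the deployed system's code; the window stride, feature counts, and thresholds are illustrative.

```python
# Hedged sketch: HOG pedestrian detection plus sparse optical flow on
# consecutive low-resolution grayscale frames, in the spirit of the modules
# described above. Parameters are illustrative, not the fielded values.
import cv2
import numpy as np

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def process_pair(prev_gray, gray):
    boxes, _ = hog.detectMultiScale(gray, winStride=(8, 8))
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    flow = np.empty((0, 2), np.float32)
    if pts is not None:
        nxt, ok, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good = ok.ravel() == 1
        flow = (nxt - pts).reshape(-1, 2)[good]   # px displacement per frame
    return boxes, flow
```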
Computational cameras for moving iris recognition
NASA Astrophysics Data System (ADS)
McCloskey, Scott; Venkatesha, Sharath
2015-05-01
Iris-based biometric identification is increasingly used for facility access and other security applications. Like all methods that exploit visual information, however, iris systems are limited by the quality of captured images. Optical defocus due to a small depth of field (DOF) is one such challenge, as is the acquisition of sharply-focused iris images from subjects in motion. This manuscript describes the application of computational motion-deblurring cameras to the problem of moving iris capture, from the underlying theory to system considerations and performance data.
NASA Astrophysics Data System (ADS)
Ma, Chen; Cheng, Dewen; Xu, Chen; Wang, Yongtian
2014-11-01
The fundus camera is a complex optical system for retinal photography, involving illumination and imaging of the retina. Stray light is one of the most significant problems in fundus cameras, because the retina is so minimally reflective that back reflections from the cornea and any other optical surface are likely to be significantly greater than the light reflected from the retina. To provide maximum illumination to the retina while eliminating back reflections, a novel design of the illumination system used in a portable fundus camera is proposed. Internal illumination, in which the eyepiece is shared by both the illumination system and the imaging system but the condenser and the objective are separated by a beam splitter, is adopted for its high efficiency. To eliminate the strong stray light caused by the corneal center and to make full use of the light energy, the annular stop of conventional illumination systems is replaced by a fiber-coupled, ring-shaped light source that forms an annular beam; parameters including the size and divergence angle of the light source are specially designed. To further weaken the stray light, a polarized light source is used, and an analyzer plate is placed after the beam splitter in the imaging system. Simulation results show that the illumination uniformity at the fundus exceeds 90% and the stray light is within 1%. Finally, a proof-of-concept prototype was developed and retinal photos of an ophthalmophantom were captured. The experimental results show that ghost images and stray light have been greatly reduced to a level that does not interfere with professional diagnosis.
Small diameter, deep bore optical inspection system
Lord, D.E.; Petrini, R.R.; Carter, G.W.
An improved rod-optic system for inspecting small-diameter, deep bores is described. The system utilizes a curved mirror at the end of the rod lens such that the optical path through the system is bent 90° to minimize optical distortion in examining the sides of a bore. The system is particularly useful in the examination of small bores for corrosion, and is capable of examining drill holes 1/16 inch in diameter and up to 4 inches deep, for example. The positioning of the curved mirror allows simultaneous viewing of the same artifact (such as corrosion) in the bore hole from shallow-angle and right-angle points of observation. The improved rod-optic system may be used for direct eye sighting or in combination with a still camera or a low-light television monitor, particularly low-light color television.
Rover imaging system for the Mars rover/sample return mission
NASA Technical Reports Server (NTRS)
1993-01-01
In the past year, the conceptual design of a panoramic imager for the Mars Environmental Survey (MESUR) Pathfinder was finished. A prototype camera was built and its performance in the laboratory was tested; the performance of this camera was excellent. Based on this work, we have recently proposed a small, lightweight, rugged, and highly capable Mars Surface Imager (MSI) instrument for the MESUR Pathfinder mission. A key aspect of our approach to optimizing the MSI design is that we treat image gathering, coding, and restoration as a whole, rather than as separate and independent tasks. Our approach leads to higher image quality, especially in the representation of fine detail with good contrast and clarity, without increasing either the complexity of the camera or the amount of data transmission. We have made significant progress over the past year in both the overall MSI system design and the detailed design of the MSI optics. We have taken a simple panoramic camera and upgraded it substantially to become a prototype of the MSI flight instrument. The most recent version of the camera utilizes miniature wide-angle optics that image directly onto a 3-color, 2096-element CCD line array. There are several data-taking modes, providing resolution as high as 0.3 mrad/pixel. Analysis tasks performed or underway with the test data from the prototype camera include the following: construction of 3-D models of imaged scenes from stereo data, first for controlled scenes and later for field scenes; and checks on geometric fidelity, including alignment errors, mast vibration, and oscillation in the drive system. We have outlined a number of tasks planned for Fiscal Year '93 to prepare for submission of a flight instrument proposal for MESUR Pathfinder.
Adaptive optics with pupil tracking for high resolution retinal imaging
Sahin, Betul; Lamory, Barbara; Levecq, Xavier; Harms, Fabrice; Dainty, Chris
2012-01-01
Adaptive optics, when integrated into retinal imaging systems, compensates for rapidly changing ocular aberrations in real time and results in improved high resolution images that reveal the photoreceptor mosaic. Imaging the retina at high resolution has numerous potential medical applications, and yet for the development of commercial products that can be used in the clinic, the complexity and high cost of the present research systems have to be addressed. We present a new method to control the deformable mirror in real time based on pupil tracking measurements which uses the default camera for the alignment of the eye in the retinal imaging system and requires no extra cost or hardware. We also present the first experiments done with a compact adaptive optics flood illumination fundus camera where it was possible to compensate for the higher order aberrations of a moving model eye and in vivo in real time based on pupil tracking measurements, without the real time contribution of a wavefront sensor. As an outcome of this research, we showed that pupil tracking can be effectively used as a low cost and practical adaptive optics tool for high resolution retinal imaging because eye movements constitute an important part of the ocular wavefront dynamics. PMID:22312577
Portable fiber-optic taper coupled optical microscopy platform
NASA Astrophysics Data System (ADS)
Wang, Weiming; Yu, Yan; Huang, Hui; Ou, Jinping
2017-04-01
A fiber-optic taper coupled to a CMOS sensor offers high sensitivity, a compact structure, and low distortion in an imaging platform, so it is widely used in low-light, high-speed, and X-ray imaging systems. Moreover, the peculiarity of the coupled structure can meet the needs of microscopy imaging. Toward this end, we developed a microscopic imaging platform based on the coupling of a cellphone camera module and a fiber-optic taper for the measurement of human blood samples and Ascaris lumbricoides. The platform, weighing 70 grams, is based on the existing camera module of the smartphone and a fiber-optic taper providing a magnification factor of 6x. The top facet of the taper, on which samples are placed, serves as an irregular sampling grid for contact imaging. The magnified images of the sample, located on the bottom facet of the taper, are then projected onto the CMOS sensor. This paper introduces the portable medical imaging system based on fiber-optic coupling with CMOS and theoretically analyzes the feasibility of the system. The image data and processing results can either be stored in memory or transmitted to remote medical institutions for telemedicine. We validated the performance of this cellphone-based microscopy platform using human blood samples and a test target, achieving results comparable to a standard benchtop microscope.
Material of LAPAN's thermal IR camera equipped with two microbolometers in one aperture
NASA Astrophysics Data System (ADS)
Bustanul, A.; Irwan, P.; Andi M., T.
2017-11-01
Besides the wavelength range, there is another factor to consider in designing an optical system: the choice of material appropriate for the selected spectral bands. Because of the limited range of available materials and their expense, choosing and qualifying materials for infrared (IR) wavelengths is more difficult and complex than for the visible spectrum. We faced this problem while designing our thermal IR camera equipped with two microbolometers sharing one aperture. Two spectral bands, 3-4 μm (MWIR) and 8-12 μm (LWIR), were chosen for our thermal IR camera to address missions such as peatland fires, volcanic activity, and sea surface temperature (SST). Based on those bands, we chose the appropriate materials for LAPAN's IR camera optics. This paper describes the materials of LAPAN's IR camera equipped with two microbolometers in one aperture. First, we studied the properties of optical materials across IR technology, including the relevant bandwidths. The analysis then considered several aspects: transmission, index of refraction, and thermal properties, covering the index gradient and the coefficient of thermal expansion (CTE). We also used commercial software, Thermal Desktop/Sinda Fluint, to strengthen the process. Restrictions such as the space environment, low cost, and performance (mainly durability and transmission) were also considered throughout the trade-off work. The results of these analyses, both graphical and measured, indicate that the lenses of LAPAN's shared-aperture IR camera should be based on germanium/zinc selenide materials.
Optical designs for the Mars '03 rover cameras
NASA Astrophysics Data System (ADS)
Smith, Gregory H.; Hagerott, Edward C.; Scherr, Lawrence M.; Herkenhoff, Kenneth E.; Bell, James F.
2001-12-01
In 2003, NASA is planning to send two robotic rover vehicles to explore the surface of Mars. The spacecraft will land on airbags in different, carefully chosen locations. The search for evidence indicating conditions favorable for past or present life will be a high priority. Each rover will carry a total of ten cameras of five different types: a stereo pair of color panoramic cameras, a stereo pair of wide-field navigation cameras, one close-up camera on a movable arm, two stereo pairs of fisheye cameras for hazard avoidance, and one Sun sensor camera. This paper discusses the lenses for these cameras, including their specifications, design approaches, expected optical performance, prescriptions, and tolerances.
NASA Astrophysics Data System (ADS)
Theil, S.; Ammann, N.; Andert, F.; Franz, T.; Krüger, H.; Lehner, H.; Lingenauber, M.; Lüdtke, D.; Maass, B.; Paproth, C.; Wohlfeil, J.
2018-03-01
Since 2010, the German Aerospace Center has been working on the project Autonomous Terrain-based Optical Navigation (ATON). Its objective is the development of technologies that allow autonomous navigation of spacecraft in orbit around, and during landing on, celestial bodies such as the Moon, planets, asteroids, and comets. The project developed different image processing techniques and optical navigation methods as well as sensor data fusion. The setup, which is applicable to many exploration missions, consists of an inertial measurement unit, a laser altimeter, a star tracker, and one or more navigation cameras. In the past years, several milestones have been achieved. It started with the setup of a simulation environment including the detailed simulation of camera images. This was continued by hardware-in-the-loop tests in the Testbed for Robotic Optical Navigation (TRON), where images were generated by real cameras in a simulated, downscaled lunar landing scene. Data were recorded in helicopter flight tests and post-processed in real time to increase the maturity of the algorithms and to optimize the software. Recently, two more milestones have been achieved. In late 2016, the whole navigation system setup flew on an unmanned helicopter while processing all sensor information onboard in real time. For the latest milestone, the navigation system was tested in closed loop on the unmanned helicopter: the ATON navigation system provided the navigation state for the guidance and control of the unmanned helicopter, replacing the GPS-based standard navigation system. The paper gives an introduction to the ATON project and its concept. The methods and algorithms of ATON are briefly described, and the flight test results of the latest two milestones are presented and discussed.
Using Arago's spot to monitor optical axis shift in a Petzval refractor.
Bruns, Donald G
2017-03-10
Measuring the change in the optical alignment of a camera attached to a telescope is necessary to perform astrometric measurements. Camera movement when the telescope is refocused changes the plate constants, invalidating the calibration. Monitoring the shift in the optical axis requires a stable internal reference source. This is easily implemented in a Petzval refractor by adding an illuminated pinhole and a small obscuration that creates a spot of Arago on the camera. Measurements of the optical axis shift for a commercial telescope are given as an example.
Fiber-Optic Surface Temperature Sensor Based on Modal Interference.
Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc
2016-07-28
Spatially-integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation, by means of cost-effective image-processing of few-mode interference patterns. New developments in the field of sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on the image processing of interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system, consisting of an infrared camera and precision temperature probe. As a result, the optical technique is in line with the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: low heat capacity temperature measurement errors, easier spatial deployment, and replacement of multiple angle infrared camera shooting and the continuous monitoring of surfaces that are not visually accessible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thurman-Keup, R.; Lumpkin, A. H.; Thangaraj, J.
FAST is a facility at Fermilab that consists of a photoinjector, two superconducting capture cavities, one superconducting ILC-style cryomodule, and a small ring for studying non-linear, integrable beam optics called IOTA. This paper discusses the layout of the optical transport system that provides optical radiation to an externally located streak camera for bunch length measurements, and THz radiation to a Martin-Puplett interferometer, also for bunch length measurements. It accepts radiation from two synchrotron radiation ports in a chicane bunch compressor and a diffraction/transition radiation screen downstream of the compressor. It also has the potential to access signal from a transition radiation screen or YAG screen after the spectrometer magnet for measurements of energy-time correlations. Initial results from both the streak camera and the Martin-Puplett interferometer will be presented.
NASA Astrophysics Data System (ADS)
Ko, Jonathan; Wu, Chensheng; Davis, Christopher C.
2015-09-01
Adaptive optics has been widely used in the field of astronomy to correct for atmospheric turbulence while viewing images of celestial bodies. The slightly distorted incoming wavefronts are typically sensed with a Shack-Hartmann sensor and then corrected with a deformable mirror. Although this approach has proven effective for astronomical purposes, a new approach must be developed when correcting for the deep turbulence experienced in ground-to-ground optical systems. We propose the use of a modified plenoptic camera as a wavefront sensor capable of accurately representing an incoming wavefront that has been significantly distorted by strong turbulence conditions (Cn^2 > 10^-13 m^-2/3). An intelligent correction algorithm can then be developed to reconstruct the perturbed wavefront and use this information to drive a deformable mirror capable of correcting the major distortions. After the large distortions have been corrected, a secondary mode utilizing more traditional adaptive optics algorithms can take over to fine-tune the wavefront correction. This two-stage algorithm can find use in free-space optical communication systems, in directed energy applications, as well as for image correction purposes.
Magneto-optical system for high speed real time imaging.
Baziljevich, M; Barness, D; Sinvani, M; Perel, E; Shaulov, A; Yeshurun, Y
2012-08-01
A new magneto-optical system has been developed to expand the range of high-speed real-time magneto-optical imaging. A special source for the external magnetic field has also been designed, using a pump solenoid to rapidly excite the field coil. Together with careful modifications of the cryostat to reduce eddy currents, ramping rates reaching 3000 T/s have been achieved. Using a powerful laser as the light source, a custom-designed optical assembly, and a high-speed digital camera, real-time imaging rates up to 30,000 frames per second have been demonstrated.
Techniques for optically compressing light intensity ranges
Rushford, Michael C.
1989-01-01
A pinhole camera assembly for use in viewing an object having a relatively large light intensity range, for example a crucible containing molten uranium in an atomic vapor laser isotope separator (AVLIS) system, is disclosed herein. The assembly includes means for optically compressing the light intensity range appearing at its input sufficiently to make it receivable and decipherable by a standard video camera. A number of different means for compressing the intensity range are disclosed, including the use of photogray glass, the use of a pair of interference filters, and the utilization of a new liquid crystal notch filter in combination with an interference filter.
Cat-eye effect reflected beam profiles of an optical system with sensor array.
Gong, Mali; He, Sifeng; Guo, Rui; Wang, Wei
2016-06-01
In this paper, we propose an applicable propagation model for Gaussian beams passing through any cat-eye target, instead of the traditional simplification consisting of only a mirror placed at the focal plane of a lens. According to the model, the cat-eye effect of CCD cameras affected by defocus is numerically simulated. Excellent agreement between experimental results and theoretical analysis is obtained. It is found that the reflectivity distribution at the focal plane of the cat-eye optical lens has a great influence on the results, while the cat-eye effect reflected beam profiles of CCD cameras show obvious periodicity.
A novel dual-camera calibration method for 3D optical measurement
NASA Astrophysics Data System (ADS)
Gai, Shaoyan; Da, Feipeng; Dai, Xianqiang
2018-05-01
A novel dual-camera calibration method is presented. In classic methods, the camera parameters are usually calculated and optimized using the reprojection error. However, for a system designed for 3D optical measurement, this error does not directly reflect the quality of 3D reconstruction. The presented method uses a planar calibration plate. First, images of the calibration plate are captured from several orientations within the measurement range, and the initial parameters of the two cameras are obtained from these images. Then, the rotation and translation matrices that link the frames of the two cameras are calculated using the Centroid Distance Increment Matrix method, which reduces the degree of coupling between the parameters. Next, the 3D coordinates of the calibration points are reconstructed by space intersection and the reconstruction error is calculated; it is minimized to optimize the calibration parameters. This error directly indicates the quality of 3D reconstruction, so it is more suitable for assessing the quality of dual-camera calibration. The experiments show that the proposed method is convenient and accurate: there is no strict requirement on the calibration plate position during calibration, and the accuracy is improved significantly by the proposed method.
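The reconstruction-error criterion itself can be sketched with standard triangulation; the Centroid Distance Increment Matrix step is specific to the paper and is not reproduced here. The projection matrices below would come from the calibrated intrinsics and the estimated rotation/translation, and `cv2.triangulatePoints` is an existing OpenCV routine for space intersection.

```python
# Hedged sketch of the evaluation metric: reconstruct calibration points in
# 3D by space intersection and score the rig by mean distance to reference
# coordinates, rather than by image-plane reprojection error.
import cv2
import numpy as np

def reconstruction_error(P1, P2, pts1, pts2, ref_xyz):
    """P1, P2: 3x4 projection matrices; pts1, pts2: 2xN float image points."""
    Xh = cv2.triangulatePoints(P1, P2, pts1, pts2)    # 4xN homogeneous
    X = (Xh[:3] / Xh[3]).T                            # Nx3 Euclidean points
    return float(np.linalg.norm(X - ref_xyz, axis=1).mean())
```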
The Modular Optical Underwater Survey System
Amin, Ruhul; Richards, Benjamin L.; Misa, William F. X. E.; Taylor, Jeremy C.; Miller, Dianna R.; Rollo, Audrey K.; Demarke, Christopher; Ossolinski, Justin E.; Reardon, Russell T.; Koyanagi, Kyle H.
2017-01-01
The Pacific Islands Fisheries Science Center deploys the Modular Optical Underwater Survey System (MOUSS) to estimate the species-specific, size-structured abundance of commercially-important fish species in Hawaii and the Pacific Islands. The MOUSS is an autonomous stereo-video camera system designed for the in situ visual sampling of fish assemblages. This system is rated to 500 m and its low-light, stereo-video cameras enable identification, counting, and sizing of individuals at a range of 0.5–10 m. The modular nature of MOUSS allows for the efficient and cost-effective use of various imaging sensors, power systems, and deployment platforms. The MOUSS is in use for surveys in Hawaii, the Gulf of Mexico, and Southern California. In Hawaiian waters, the system can effectively identify individuals to a depth of 250 m using only ambient light. In this paper, we describe the MOUSS’s application in fisheries research, including the design, calibration, analysis techniques, and deployment mechanism. PMID:29019962
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rathore, Kavita; Munshi, Prabhat; Bhattacharjee, Sudeep
A new non-invasive diagnostic system has been developed for a Microwave Induced Plasma (MIP) to reconstruct tomographic images of its 2D emission profile. Compact MIP systems have wide application in industry as well as in research, for example in thrusters for space propulsion, high-current ion beams, and the creation of negative ions for heating of fusion plasmas. The emission profile depends on two crucial parameters, the electron temperature and density (over the entire spatial extent) of the plasma system. Emission tomography provides basic understanding of plasmas, and it is very useful for monitoring the internal structure of plasma phenomena without disturbing the actual processes. This paper presents the development of a compact, modular, and versatile Optical Emission Tomography (OET) tool for a cylindrical, magnetically confined MIP system. It has eight slit-hole cameras, each containing a complementary metal-oxide-semiconductor (CMOS) linear image sensor for light detection. Optical noise is reduced by using an aspheric lens and interference band-pass filters in each camera. The entire cylindrical plasma can be scanned with an automated sliding-ring mechanism arranged in a fan-beam data collection geometry. The camera design includes the possibility of incorporating different filters to select particular emission wavelengths from the plasma: band-pass filters for the argon emission lines at 750 nm, 772 nm, and 811 nm and for the hydrogen emission lines Hα (656 nm) and Hβ (486 nm). A convolution back projection algorithm is used to obtain the tomographic images of plasma emission lines. The paper mainly focuses on (a) the design of the OET system in detail and (b) a study of the emission profile for the 750 nm argon emission line to validate the system design.
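For readers unfamiliar with the reconstruction step, the sketch below runs a filtered (convolution) back projection on a synthetic parallel-beam sinogram using scikit-image. The instrument itself collects fan-beam data, which would be rebinned or handled with a fan-beam kernel; the phantom and filter choice here are illustrative only.

```python
# Hedged sketch: convolution back projection illustrated with scikit-image's
# parallel-beam iradon on a toy emission phantom. The OET system's fan-beam
# geometry and its specific kernel are not reproduced here.
import numpy as np
from skimage.transform import radon, iradon

phantom = np.zeros((128, 128))
phantom[40:90, 50:80] = 1.0                      # toy 2D emission profile
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=angles)          # stand-in for camera data
# note: older scikit-image versions name this parameter `filter`
recon = iradon(sinogram, theta=angles, filter_name="ramp")
```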
NASA Astrophysics Data System (ADS)
Joshi, V.; Wigdahl, J.; Nemeth, S.; Zamora, G.; Ebrahim, E.; Soliz, P.
2018-02-01
Retinal abnormalities associated with hypertensive retinopathy are useful in assessing the risk of cardiovascular disease, heart failure, and stroke. Assessing these risks as part of primary care can lead to a decrease in the incidence of cardiovascular disease-related deaths. Primary care is a resource-limited setting where low-cost retinal cameras may bring needed help without compromising care. We compared a low-cost handheld retinal camera to a traditional tabletop retinal camera on their optical characteristics and performance in detecting hypertensive retinopathy. A retrospective dataset of N=40 subjects (28 with hypertensive retinopathy, 12 controls) was used from a clinical study conducted at a primary care clinic in Texas. Non-mydriatic retinal fundus images were acquired using a Pictor Plus handheld camera (Volk Optical Inc.) and a Canon CR1-Mark II tabletop camera (Canon USA) during the same encounter. The images from each camera were graded by a licensed optometrist according to the universally accepted Keith-Wagener-Barker Hypertensive Retinopathy Classification System, three weeks apart to minimize memory bias. The sensitivity of the handheld camera to detect any level of hypertensive retinopathy was 86% compared to the Canon. Insufficient photographer skill produced 70% of the false-negative cases; the other 30% were due to the handheld camera's insufficient spatial resolution to resolve vascular changes such as minor A/V nicking and copper wiring, but these were associated with non-referable disease. Physician evaluation of the performance of the handheld camera indicates it is sufficient to provide high-risk patients with adequate follow-up and management.
Performance evaluation and clinical applications of 3D plenoptic cameras
NASA Astrophysics Data System (ADS)
Decker, Ryan; Shademan, Azad; Opfermann, Justin; Leonard, Simon; Kim, Peter C. W.; Krieger, Axel
2015-06-01
The observation and 3D quantification of arbitrary scenes using optical imaging systems is challenging, but increasingly necessary in many fields. This paper provides a technical basis for the application of plenoptic cameras in medical and medical robotics applications, and rigorously evaluates camera integration and performance in the clinical setting. It discusses plenoptic camera calibration and setup, and assesses plenoptic imaging in a clinically relevant context and in the context of other quantitative imaging technologies. We report the methods used for camera calibration, and precision and accuracy results in ideal and simulated surgical settings. Afterwards, we report performance during a surgical task. Test results showed the average precision of the plenoptic camera to be 0.90 mm, increasing to 1.37 mm for tissue across the calibrated FOV. The ideal accuracy was 1.14 mm. The camera showed submillimeter error during a simulated surgical task.
Hyperspectral imaging for food processing automation
NASA Astrophysics Data System (ADS)
Park, Bosoon; Lawrence, Kurt C.; Windham, William R.; Smith, Doug P.; Feldner, Peggy W.
2002-11-01
This paper presents research results demonstrating that hyperspectral imaging can be used effectively for detecting feces (from the duodenum, ceca, and colon) and ingesta on the surface of poultry carcasses, with potential application to real-time, on-line poultry processing for automatic safety inspection. The hyperspectral imaging system included a line-scan camera with a prism-grating-prism spectrograph, fiber-optic line lighting, motorized lens control, and hyperspectral image processing software. Hyperspectral image processing algorithms, specifically a band ratio of dual-wavelength (565 nm/517 nm) images followed by thresholding, were effective for identifying fecal and ingesta contamination of poultry carcasses. A multispectral imaging system, including a common-aperture camera with three optical trim filters (515.4 nm with 8.6-nm FWHM, 566.4 nm with 8.8-nm FWHM, and 631 nm with 10.2-nm FWHM), which were selected and validated with the hyperspectral imaging system, was developed for real-time, on-line application. The total image processing time for the multispectral images captured by the common-aperture camera was approximately 251 ms, or 3.99 frames/s. A preliminary test showed that the accuracy of the real-time multispectral imaging system in detecting feces and ingesta on corn/soybean-fed poultry carcasses was 96%. However, many false-positive spots that cause system errors were also detected.
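The band-ratio-plus-threshold step reduces to a few lines of array arithmetic. In the sketch below, the band indices and the threshold are placeholders for the calibrated 565 nm and 517 nm bands and the tuned decision value, neither of which is given numerically in the abstract.

```python
# Hedged sketch of the dual-wavelength band-ratio test: ratio the 565 nm and
# 517 nm bands pixel-wise and threshold. Indices and threshold are
# illustrative, not the calibrated values from the paper.
import numpy as np

def contamination_mask(cube, band_565, band_517, threshold=1.05):
    """cube: (rows, cols, bands) reflectance image; returns a boolean mask."""
    denom = np.clip(cube[:, :, band_517], 1e-6, None)   # avoid divide-by-zero
    return cube[:, :, band_565] / denom > threshold
```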
Non Contacting Evaluation of Strains and Cracking Using Optical and Infrared Imaging Techniques
1988-08-22
Compatible Zenith Z-386 microcomputer with plotter. II. 3-D Motion Measuring System: 1. Complete OPTOTRAK three-dimensional digitizing system; system includes ...acquisition unit (16 single-ended analog input channels); 3. Data Analysis Package software (KINEPLOT); 4. Extra OPTOTRAK camera (max 224 per system).
Application of PLZT electro-optical shutter to diaphragm of visible and mid-infrared cameras
NASA Astrophysics Data System (ADS)
Fukuyama, Yoshiyuki; Nishioka, Shunji; Chonan, Takao; Sugii, Masakatsu; Shirahata, Hiromichi
1997-04-01
(Pb0.91La0.09)(Zr0.65Ti0.35)0.9775O3 (PLZT 9/65/35), commonly used as an electro-optical shutter, exhibits large phase retardation at low applied voltage. The shutter has the following features: (1) high shutter speed, (2) wide optical transmittance, and (3) high optical density in the 'OFF' state. If the shutter is applied to the diaphragm of a video camera, it can protect the sensor from intense light. We have tested the basic characteristics of the PLZT electro-optical shutter and its imaging resolving power. The ratio of optical transmittance between the 'ON' and 'OFF' states was 1.1 × 10³. The response time of the PLZT shutter from the 'ON' state to the 'OFF' state was 10 microseconds. The MTF reduction when placing the PLZT shutter in front of the visible video camera lens was only 12 percent at a spatial frequency of 38 cycles/mm, which is the sensor resolution of the video camera. Moreover, we took visible images with the Si-CCD video camera: a He-Ne laser ghost image was observed in the 'ON' state, whereas the ghost image was totally shut out in the 'OFF' state. From these tests, we found that the PLZT shutter is useful as a diaphragm for visible video cameras. The measured optical transmittance of a PLZT wafer with no antireflection coating was 78 percent over the range from 2 to 6 microns.
Robust optical sensors for safety critical automotive applications
NASA Astrophysics Data System (ADS)
De Locht, Cliff; De Knibber, Sven; Maddalena, Sam
2008-02-01
Optical sensors for the automotive industry need to be robust, high performing, and low cost. This paper focuses on the impact of automotive requirements on optical sensor design and packaging. The main strategies for lowering optical sensor entry barriers in the automotive market are: calibration and tuning performed by the sensor manufacturer, on-chip sensor test modes to guarantee functional integrity during operation, and appropriate package technology. In conclusion, optical sensor applications in automotive are growing, and optical sensor robustness has matured to the level of safety-critical applications: Electrical Power Assisted Steering (EPAS) and Drive-by-Wire with systems based on optical linear arrays, and Automated Cruise Control (ACC), Lane Change Assist, and Driver Classification/Smart Airbag Deployment with systems based on camera imagers.
Accuracy of an optical active-marker system to track the relative motion of rigid bodies.
Maletsky, Lorin P; Sun, Junyi; Morton, Nicholas A
2007-01-01
The measurement of relative motion between two moving bones is commonly accomplished for in vitro studies by attaching to each bone a series of either passive or active markers in a fixed orientation to create a rigid body (RB). This work determined the accuracy of motion between two RBs using an Optotrak optical motion capture system with active infrared LEDs. The stationary noise in the system was quantified by recording the apparent change in position with the RBs stationary, and was found to be 0.04 degrees and 0.03 mm. Incremental 10-degree rotations and 10-mm translations were made using a tool more precise than the Optotrak. Increasing camera distance decreased the precision (increased the range of values observed for a set motion) and increased the rotation error (the bias between the measured and actual rotation). The relative positions of the RBs with respect to the camera-viewing plane had a minimal effect on the kinematics; therefore, for a given distance in the volume less than or close to the precalibrated camera distance, any motion was similarly reliable. For a typical operating setup, a 10-degree rotation showed a bias of 0.05 degrees and a 95% repeatability limit of 0.67 degrees. A 10-mm translation showed a bias of 0.03 mm and a 95% repeatability limit of 0.29 mm. To achieve a high level of accuracy it is important to keep the distance between the cameras and the markers near the distance at which the cameras were focused during calibration.
To develop a flying fish egg inspection system by a digital imaging base system
NASA Astrophysics Data System (ADS)
Chen, Chun-Jen; Jywe, Wenyuh; Hsieh, Tung-Hsien; Chen, Chien Hung
2015-07-01
This paper develops an automatic optical inspection system for flying fish egg quality inspection. The system consists of a 2-axis stage, a digital camera, a lens, an LED light source, a vacuum generator, a tube, and a tray. It automatically locates a particle on the flying fish egg tray, uses the stage to drive the tube onto the particle, and then picks up the particle with the straw and vacuum generator. The system's picking rate is about 30 particles per minute.
Multilevel microvibration test for performance predictions of a space optical load platform
NASA Astrophysics Data System (ADS)
Li, Shiqi; Zhang, Heng; Liu, Shiping; Wang, Yue
2018-05-01
This paper presents a framework for the multilevel microvibration analysis and test of a space optical load platform. The test framework is conducted on three levels: instrument, subsystem, and system. Disturbance source experimental investigations are performed to evaluate the vibration amplitude and study the vibration mechanism. Transfer characteristics of the space camera are validated by a subsystem test, which allows the calculation of transfer functions from the various disturbance sources to the optical performance outputs. To identify the influence of the source on spacecraft performance, a system-level microvibration measurement test has been performed on the ground. From the time-domain and spectrum analyses of the multilevel microvibration tests, we conclude that each disturbance source has a significant effect at its installation position. After transmission through mechanical links, the residual vibration falls to the background noise level. In addition, the angular microvibration of the platform jitter is concentrated mainly in rotation about the y-axis. This work is applied to a real application involving a high-resolution satellite camera system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardan, R; Popple, R; Dobelbower, M
Purpose: To demonstrate the ability to quickly generate an accurate collision avoidance map using multiple stereotactic cameras during simulation. Methods: Three Kinect stereotactic cameras were placed in the CT simulation room and optically calibrated to the DICOM isocenter. Immediately before scanning, the patient was optically imaged to generate a 3D polygon mesh, which was used to calculate the collision avoidance area using our previously developed framework. The mesh was visually compared to the CT scan body contour to ensure accurate coordinate alignment. To test the accuracy of the collision calculation, the patient and machine were physically maneuvered in the treatment room to the calculated collision boundaries. Results: The optical scan and collision calculation took 38.0 seconds and 2.5 seconds to complete, respectively. The collision prediction accuracy was determined using a receiver operating characteristic (ROC) analysis, where the true positive, true negative, false positive, and false negative values were 837, 821, 43, and 79 points, respectively. The ROC accuracy was 93.1% over the sampled collision space. Conclusion: We have demonstrated a fast and accurate framework for predicting collision avoidance, which can be determined during the normal simulation process. Because of its speed, the system could add a layer of safety with negligible impact on the normal patient simulation experience. This information could be used during treatment planning to explore feasible geometries when optimizing plans. Research supported by Varian Medical Systems.
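The reported 93.1% accuracy follows directly from the confusion-matrix counts given in the abstract; a quick check:

```python
tp, tn, fp, fn = 837, 821, 43, 79            # counts from the abstract
accuracy = (tp + tn) / (tp + tn + fp + fn)   # fraction of correctly classified points
print(f"accuracy = {accuracy:.1%}")          # -> 93.1%, matching the reported value
```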
Demonstrations of Optical Spectra with a Video Camera
ERIC Educational Resources Information Center
Kraftmakher, Yaakov
2012-01-01
The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera…
Optical Meteor Systems Used by the NASA Meteoroid Environment Office
NASA Technical Reports Server (NTRS)
Kingery, A. M.; Blaauw, R. C.; Cooke, W. J.; Moser, D. E.
2015-01-01
The NASA Meteoroid Environment Office (MEO) uses two main meteor camera networks to characterize the meteoroid environment: an all-sky system and a wide-field system, to study cm- and mm-size meteors respectively. The NASA All Sky Fireball Network consists of fifteen meteor video cameras in the United States, with plans to expand to eighteen cameras by the end of 2015. The camera design and the All-Sky Guided and Real-time Detection (ASGARD) meteor detection software [1, 2] were adopted from the University of Western Ontario's Southern Ontario Meteor Network (SOMN). After seven years of operation, the network has detected over 12,000 multi-station meteors, including meteors from at least 53 different meteor showers. The network is used for speed distribution determination, characterization of meteor showers and sporadic sources, and for informing the public on bright meteor events. The NASA Wide Field Meteor Network was established in December of 2012 with two cameras and expanded to eight cameras in December of 2014. The two-camera configuration saw 5470 meteors over two years of operation, and the eight-camera network has detected 3423 meteors in its first five months of operation (Dec 12, 2014 - May 12, 2015). We expect to see over 10,000 meteors per year with the expanded system. The cameras have a 20-degree field of view and an approximate limiting meteor magnitude of +5. The network's primary goal is determining the nightly shower and sporadic meteor fluxes. Both camera networks function almost fully autonomously, with little human interaction required for upkeep and analysis. The cameras send their data to a central server for storage and automatic analysis. Every morning the servers automatically generate an e-mail and web page containing an analysis of the previous night's events. The current status of the networks will be described, along with preliminary results. In addition, future projects, including CCD photometry and a broadband meteor color camera system, will be discussed.
Video semaphore decoding for free-space optical communication
NASA Astrophysics Data System (ADS)
Last, Matthew; Fisher, Brian; Ezekwe, Chinwuba; Hubert, Sean M.; Patel, Sheetal; Hollar, Seth; Leibowitz, Brian S.; Pister, Kristofer S. J.
2001-04-01
Using real-time image processing we have demonstrated a low bit-rate free-space optical communication system at a range of more than 20 km with an average optical transmission power of less than 2 mW. The transmitter is an autonomous one-cubic-inch microprocessor-controlled sensor node with a laser diode output. The receiver is a standard CCD camera with a 1-inch aperture lens, and both hardware and software implementations of the video semaphore decoding algorithm. With this system sensor data can be reliably transmitted 21 km from San Francisco to Berkeley.
NASA Astrophysics Data System (ADS)
Lee, Kyuhang; Ko, Jinseok; Wi, Hanmin; Chung, Jinil; Seo, Hyeonjin; Jo, Jae Heung
2018-06-01
The visible TV system used in the Korea Superconducting Tokamak Advanced Research device has been equipped with a periscope to minimize damage to its CCD pixels from neutron radiation. The periscope, more than 2.3 m in overall length, has been designed for the visible camera system with a semi-diagonal field of view as wide as 30° and an effective focal length as short as 5.57 mm. The design performance of the periscope includes a modulation transfer function greater than 0.25 at 68 cycles/mm with low distortion. The installed periscope system has confirmed image qualities as designed and comparable to those of its predecessor, but with a far lower probability of neutron damage to the camera.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, K; Zhang, B; Eslami, S
Purpose: We present a newly developed on-board optical tomography system for the SARRP. Innovative features include the compact design and a fast-acquisition optical method to perform 3D soft tissue radiation guidance. Because of the on-board feature and the combination of CBCT, diffuse optical tomography (DOT), and bioluminescence and fluorescence tomography (BLT and FT), this integrated system is expected to provide more accurate soft tissue guidance than an off-line system, as well as highly sensitive functional imaging in preclinical research. Methods: Images are acquired in the order of CBCT, DOT, and then BLT/FT, where the SARRP CBCT and DOT are used to provide the anatomical and optical property information to enhance the subsequent BLT/FT optical reconstruction. The SARRP stage is redesigned to include 9 embedded optical fibers in contact with the animal's skin. These fibers, connected to a white light lamp or laser, serve as the light sources for the DOT or FT, respectively. A CCD camera with an f/1.4 lens and a multi-spectral filter set is used as the optical detector and is mounted on a portable cart ready to dock into the SARRP. No radiation is delivered during optical image acquisition. A 3-way mirror system capable of 180-degree rotation around the animal reflects the optical signal to the camera at multiple projection angles. A special black-painted dome covers the stage and provides light shielding. Results: Spontaneous metastatic bioluminescent liver and lung tumor models will be used to validate the 3D BLT reconstruction. To demonstrate the capability of our FT system, the GastroSense 750 fluorescence agent will be used to image the mouse stomach and intestinal region in 3D. Conclusion: We expect that this integrated CBCT and optical tomography system on-board a SARRP will present new research opportunities for pre-clinical radiation research. Supported by NCI RO1-CA 158100.
Brown, David M; Juarez, Juan C; Brown, Andrea M
2013-12-01
A laser differential image-motion monitor (DIMM) system was designed and constructed as part of a turbulence characterization suite during the DARPA free-space optical experimental network experiment (FOENEX) program. The developed link measurement system measures the atmospheric coherence length (r0), atmospheric scintillation, and power in the bucket for the 1550 nm band. DIMM measurements are made with two separate apertures coupled to a single InGaAs camera. The angle of arrival (AoA) of the wavefront at each aperture can be calculated from the focal spot movements imaged by the camera. By utilizing a single camera for the simultaneous measurement of the focal spots, the correlation of the variance in the AoA allows a straightforward computation of r0, as in traditional DIMM systems. Standard measurements of scintillation and power in the bucket are made with the same apertures by redirecting a percentage of the incoming signals to InGaAs detectors integrated with logarithmic amplifiers for high sensitivity and high dynamic range. By leveraging two small apertures, the instrument forms a small size-and-weight configuration for mounting to actively tracking laser communication terminals to characterize link performance.
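A minimal sketch of the DIMM inversion implied above, assuming the standard Kolmogorov scaling var(ΔAoA) ∝ λ² r0^(-5/3) d^(-1/3); the constant K depends on aperture separation and the motion component measured, and is left as an assumed calibration input rather than the FOENEX instrument's actual value.

```python
def r0_from_dimm(var_aoa, wavelength=1550e-9, aperture_d=0.05, K=1.0):
    """Invert var = K * lambda**2 * r0**(-5/3) * d**(-1/3) for r0 (meters).

    var_aoa: variance of the differential angle of arrival (rad^2).
    K: dimensionless constant (assumed; depends on geometry and motion axis).
    """
    return (K * wavelength**2 * aperture_d**(-1.0 / 3.0) / var_aoa) ** (3.0 / 5.0)
```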
Compact 3D Camera for Shake-the-Box Particle Tracking
NASA Astrophysics Data System (ADS)
Hesseling, Christina; Michaelis, Dirk; Schneiders, Jan
2017-11-01
Time-resolved 3D-particle tracking usually requires the time-consuming optical setup and calibration of 3 to 4 cameras. Here, a compact four-camera housing has been developed. The performance of the system using Shake-the-Box processing (Schanz et al. 2016) is characterized. It is shown that the stereo-base is large enough for sensible 3D velocity measurements. Results from successful experiments in water flows using LED illumination are presented. For large-scale wind tunnel measurements, an even more compact version of the system is mounted on a robotic arm. Once calibrated for a specific measurement volume, the necessity for recalibration is eliminated even when the system moves around. Co-axial illumination is provided through an optical fiber in the middle of the housing, illuminating the full measurement volume from one viewing direction. Helium-filled soap bubbles are used to ensure sufficient particle image intensity. This way, the measurement probe can be moved around complex 3D-objects. By automatic scanning and stitching of recorded particle tracks, the detailed time-averaged flow field of a full volume of cubic meters in size is recorded and processed. Results from an experiment at TU-Delft of the flow field around a cyclist are shown.
Hinken, David; Schinke, Carsten; Herlufsen, Sandra; Schmidt, Arne; Bothe, Karsten; Brendel, Rolf
2011-03-01
We report in detail on the luminescence imaging setup developed in our laboratory over the last few years. In this setup, the luminescence emission of silicon solar cells or silicon wafers is analyzed quantitatively. Charge carriers are excited electrically (electroluminescence) using a power supply for carrier injection, or optically (photoluminescence) using a laser as the illumination source. The luminescence emission arising from the radiative recombination of the stimulated charge carriers is measured spatially resolved using a camera. We give details of the various components, including cameras, optical filters for electro- and photoluminescence, the semiconductor laser, and the four-quadrant power supply. We compare a silicon charge-coupled device (CCD) camera with a back-illuminated silicon CCD camera comprising an electron-multiplier gain and a complementary metal oxide semiconductor indium gallium arsenide camera. For the detection of the luminescence emission of silicon, we analyze the dominant noise sources along with the signal-to-noise ratio of all three cameras at different operating conditions.
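For readers comparing the three detectors, here is a minimal sketch of the per-pixel noise budget underlying such an SNR analysis; the shot/dark/read terms follow the standard camera noise model, and the example numbers are illustrative assumptions, not the paper's measured values.

```python
import math

def pixel_snr(signal_e, dark_e, read_noise_e):
    """SNR for one exposure; all quantities in electrons (read noise is rms).

    Shot and dark noise are Poisson, so their variances equal the collected
    signal and dark electrons; read noise adds in quadrature.
    """
    noise = math.sqrt(signal_e + dark_e + read_noise_e**2)
    return signal_e / noise

print(pixel_snr(signal_e=5e4, dark_e=100.0, read_noise_e=10.0))  # illustrative values
```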
Space telescope low scattered light camera - A model
NASA Technical Reports Server (NTRS)
Breckinridge, J. B.; Kuper, T. G.; Shack, R. V.
1982-01-01
A design approach for a camera to be used with the space telescope is given. Camera optics relay the system pupil onto an annular Gaussian ring apodizing mask to control scattered light. One- and two-dimensional models of ripple on the primary mirror were calculated. Scattered light calculations using ripple amplitudes between λ/20 and λ/200, with spatial correlations of the ripple across the primary mirror between 0.2 and 2.0 centimeters, indicate that the detection of an object a billion times fainter than a bright source in the field is possible. Detection of a Jovian-type planet in orbit about Alpha Centauri with a camera on the space telescope may be possible.
Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft
NASA Astrophysics Data System (ADS)
Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.
2015-02-01
Context. The ESA cometary mission Rosetta was launched in 2004. In the following years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle Camera and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure better coverage across the wavelength range of the cameras and to study possible dust contamination of the optics.
Design of optical axis jitter control system for multi beam lasers based on FPGA
NASA Astrophysics Data System (ADS)
Ou, Long; Li, Guohui; Xie, Chuanlin; Zhou, Zhiqiang
2018-02-01
A design of an optical axis closed-loop control system for multi-beam laser coherent combining based on an FPGA is introduced. The system uses piezoelectric-ceramic fast steering mirrors (FSMs) as actuators, a high-speed CMOS camera detecting the far-field spot of the multiple beams for optical sensing, and an FPGA-based controller for real-time optical axis jitter suppression. The optical axis centroid detection algorithm and a PID controller with anti-integral-saturation were implemented in the FPGA. Optimizing the logic circuit structure through resource reuse and pipelining reduced both the logic resources and the delay time, and the closed-loop bandwidth increased to 100 Hz. Laser jitter below 40 Hz was attenuated by 40 dB. The system is low cost and works stably.
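A minimal Python sketch of the two algorithms named above (spot centroid detection and PID with anti-windup), written in software form rather than FPGA logic; the gains and the integrator limit are placeholder assumptions.

```python
import numpy as np

def spot_centroid(img):
    """Intensity-weighted centroid (x, y) of the far-field spot image."""
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

class AntiWindupPID:
    def __init__(self, kp, ki, kd, i_limit):
        self.kp, self.ki, self.kd, self.i_limit = kp, ki, kd, i_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        # Clamp the integral term to suppress integral saturation (windup).
        self.integral = max(-self.i_limit,
                            min(self.i_limit, self.integral + error * dt))
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```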
Recent Developments In High Speed Lens Design At The NPRL
NASA Astrophysics Data System (ADS)
Mcdowell, M. W.; Klee, H. W.
1987-09-01
Although the lens provides the link between the high speed camera and the outside world, there has over the years been little evidence of co-operation between the optical design and high speed photography communities. It is still only too common for a manufacturer to develop a camera of improved performance and resolution and then to combine this with a standard camera lens. These lenses were often designed for a completely different recording medium and, more often than not, their use results in avoidable degradation of the overall system performance. There is a tendency to assume that a specialized lens would be too expensive and that pushing the aperture automatically implies more complex optical systems. In the present paper some recent South African developments in the design of large aperture lenses are described. The application of a new design principle, based on the work earlier this century of Bernhard Schmidt, shows that ultra-fast lenses need not be overly complex and a basic four-element lens configuration can be adapted to a wide variety of applications.
Blur spot limitations in distal endoscope sensors
NASA Astrophysics Data System (ADS)
Yaron, Avi; Shechterman, Mark; Horesh, Nadav
2006-02-01
In years past, the picture quality of electronic video systems was limited by the image sensor. At present, the resolution of miniature image sensors, as in medical endoscopy, is typically superior to the resolution of the optical system. This "excess resolution" is utilized by Visionsense to create stereoscopic vision. Visionsense has developed a single-chip stereoscopic camera that multiplexes the horizontal dimension of the image sensor into two (left and right) images, compensates for the blur phenomenon, and provides additional depth resolution without sacrificing planar resolution. The camera is based on a dual-pupil imaging objective and an image sensor coated with an array of microlenses (a plenoptic camera). The camera has the advantages of being compact, providing simultaneous acquisition of left and right images, and offering resolution comparable to a dual-chip stereoscopic camera with low- to medium-resolution imaging lenses. A stereoscopic vision system provides an improved 3-dimensional perspective of intra-operative sites that is crucial for advanced minimally invasive surgery and contributes to surgeon performance. An additional advantage of single-chip stereo sensors is improved tolerance to electronic signal noise.
Object recognition through turbulence with a modified plenoptic camera
NASA Astrophysics Data System (ADS)
Wu, Chensheng; Ko, Jonathan; Davis, Christopher
2015-03-01
Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device's resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or the use of adaptive optics. However, most of these methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects are suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky-image algorithmic analysis with fewer frames, which is more efficient. The details of our modified plenoptic cameras and image processing algorithms are introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated ones. Our results show that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer, and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" from ordinary cameras is not achievable.
Determination of feature generation methods for PTZ camera object tracking
NASA Astrophysics Data System (ADS)
Doyle, Daniel D.; Black, Jonathan T.
2012-06-01
Object detection and tracking using computer vision (CV) techniques have been widely applied to sensor fusion applications. Many papers continue to be written that speed up performance and increase the learning of artificially intelligent systems through improved algorithms, workload distribution, and information fusion. Military applications of real-time tracking systems are becoming more and more complex, with an ever-increasing need for fusion and CV techniques to actively track and control dynamic systems. Examples include the use of metrology systems for tracking and measuring micro air vehicles (MAVs) and autonomous navigation systems for controlling MAVs. This paper seeks to contribute to the determination of select tracking algorithms that best track a moving object using a pan/tilt/zoom (PTZ) camera, applicable to both of the examples presented. The feature generation algorithms compared in this paper are the trained Scale-Invariant Feature Transform (SIFT) and Speeded-Up Robust Features (SURF), the Mixture of Gaussians (MoG) background subtraction method, the Lucas-Kanade optical flow method (2000), and the Farneback optical flow method (2003). The matching algorithm used in this paper for the trained feature generation algorithms is the Fast Library for Approximate Nearest Neighbors (FLANN). The BSD-licensed OpenCV library is used extensively to demonstrate the viability of each algorithm and its performance. Initial testing is performed on a sequence of images from a stationary camera. Further testing is performed on sequences in which the PTZ camera is moving in order to capture the moving object. Comparisons are made based upon accuracy, speed, and memory.
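Two of the compared methods are available directly in OpenCV; below is a minimal sketch using the MOG2 background-subtraction variant and Farneback dense optical flow. The input file name is a placeholder, and the SIFT/SURF + FLANN matching path follows the same pattern but is omitted for brevity.

```python
import cv2

cap = cv2.VideoCapture("ptz_sequence.avi")        # placeholder input video
mog = cv2.createBackgroundSubtractorMOG2()        # MoG-style background model

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    fg_mask = mog.apply(frame)                    # foreground (moving object) mask
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    prev_gray = gray
```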
Depth measurements through controlled aberrations of projected patterns.
Birch, Gabriel C; Tyo, J Scott; Schwiegerling, Jim
2012-03-12
Three-dimensional displays have become increasingly present in consumer markets. However, the ability to capture three-dimensional images in space-confined environments, and without major modifications to current cameras, is uncommon. Our goal is to create a simple modification to a conventional camera that allows for three-dimensional reconstruction. We require that such an imaging system have coincident imaging and illumination paths. Furthermore, we require that any three-dimensional modification to a camera also permit full-resolution 2D image capture. Here we present a method of extracting depth information with a single camera and an aberrated projected pattern. A commercial digital camera is used in conjunction with a projector system with astigmatic focus to capture images of a scene. By using an astigmatically projected pattern we create two different focus depths for the horizontal and vertical features of the pattern, thereby encoding depth. By designing an aberrated projected pattern, we are able to exploit this differential focus in post-processing tailored to the projected pattern and optical system. We are able to correlate the distance of an object at a particular transverse position from the camera to ratios of particular wavelet coefficients. We present information regarding the construction and calibration of this system and the images it produces. The link between projected pattern design and image processing algorithms is discussed.
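A minimal sketch of the wavelet-ratio depth cue described above: with astigmatic projection, horizontal and vertical pattern features blur differently with distance, so the ratio of horizontal- to vertical-detail wavelet energy in a patch varies with depth. The 'haar' basis and single-level transform are assumptions, and mapping the ratio to physical distance would require the calibration the authors describe.

```python
import numpy as np
import pywt

def hv_detail_ratio(patch):
    """Ratio of horizontal- to vertical-detail wavelet energy in an image patch."""
    _, (cH, cV, _) = pywt.dwt2(patch.astype(float), "haar")
    return float(np.abs(cH).sum() / (np.abs(cV).sum() + 1e-9))
```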
Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6
NASA Technical Reports Server (NTRS)
Lee, George
1993-01-01
A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.
NASA Astrophysics Data System (ADS)
Hertel, R. J.; Hoilman, K. A.
1982-01-01
The effects of model vibration, camera and window nonlinearities, and aerodynamic disturbances in the optical path on the measurement of target position are examined. Window distortion, temperature and pressure changes, laminar and turbulent boundary layers, shock waves, target intensity, and target vibration are also studied. A general computer program was developed to trace optical rays through these disturbances. The use of a charge injection device camera as an alternative to the image dissector camera was examined.
Optical analysis of electro-optical systems by MTF calculus
NASA Astrophysics Data System (ADS)
Barbarini, Elisa Signoreto; Dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fátima Maria Mitsue; Castro Neto, Jarbas C.; Rodrigues, Evandro Luís Linhari
2011-08-01
One of the widely used methods for performance analysis of an optical system is the determination of the Modulation Transfer Function (MTF). The MTF represents a quantitative and direct measure of image quality, and, besides being an objective test, it can be used on concatenated optical systems. This paper presents the application of software called SMTF (Software Modulation Transfer Function), built on C++ and OpenCV, for MTF calculation on electro-optical systems. Through this technique, it is possible to develop a specific method to measure the real-time performance of a digital fundus camera, an infrared sensor, and an ophthalmological surgery microscope. Each optical instrument mentioned has a particular device to measure the MTF response, which is being developed. The MTF information then assists in analyzing the optical system alignment, and also defines its resolution limit from the MTF graph. The results obtained from the implemented software are compared with the theoretical MTF curves of the analyzed systems.
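One common way to compute an MTF curve of the kind SMTF produces (shown here as a generic sketch, not the SMTF implementation) is the edge-based method: differentiate a measured edge-spread function to obtain the line-spread function, then take the normalized FFT magnitude.

```python
import numpy as np

def mtf_from_esf(esf):
    """MTF from a 1D edge-spread function sampled across a slanted edge."""
    lsf = np.gradient(esf)             # line-spread function
    mtf = np.abs(np.fft.rfft(lsf))     # magnitude of the transfer function
    return mtf / mtf[0]                # normalize to 1 at zero spatial frequency
```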
Orion Optical Navigation Progress Toward Exploration: Mission 1
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher N.; Saley, David
2018-01-01
Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. It shares a history with the "method of lunar distances" that was used in the 18th century and gained some notoriety after its use by Captain James Cook during his 1768 Pacific voyage of the HMS Endeavor. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is being worked as a Government Furnished Equipment (GFE) project delivered as an application within the Core Flight Software of the Orion camera controller module. The mathematical formulation behind the initial ellipse fit in the image processing is detailed in Christian. The non-linear least squares refinement then follows the technique of Mortari as an estimation process of the planetary limb using the sigmoid function. The Orion optical navigation system uses a body fixed camera, a decision that was driven by mass and mechanism constraints. The general concept of operations involves a 2-hour pass once every 24 hours, with passes specifically placed before all maneuvers to supply accurate navigation information to guidance and targeting. The pass lengths are limited by thermal constraints on the vehicle since the OpNav attitude generally deviates from the thermally stable tail-to-sun attitude maintained during the rest of the orbit coast phase. Calibration is scheduled prior to every pass due to the unknown nature of thermal effects on the lens distortion and the mounting platform deformations between the camera and star trackers. The calibration technique is described in detail by Christian, et al. and simultaneously estimates the Brown-Conrady coefficients and the Star Tracker/Camera interlock angles. Accurate attitude information is provided by the star trackers during each pass. Figure 1 shows the various phases of lunar return navigation when the vehicle is in autonomous operation with lost ground communication. The midcourse maneuvers are placed to control the entry interface conditions to the desired corridor for safe landing. The general form of optical navigation on Orion is where still images of the Moon or Earth are processed to find the apparent angular diameter and centroid in the camera focal plane. This raw data is transformed into range and bearing angle measurements using planetary data and precise star tracker inertial attitude. The measurements are then sent to the main flight computer's Kalman filter to update the onboard state vector. The images are, of course, collected over an arc to converge the state and estimate velocity. The same basic technique was used by Apollo to satisfy loss-of-comm, but Apollo used manual crew sightings with a vehicle-integral sextant instead of autonomously processing optical imagery. The software development is past its Critical Design Review, and is progressing through test and certification for human rating. In support of this, a hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. Figure 2 shows the rig, which the test team has dubbed OCILOT (Orion Camera In the Loop Optical Testbed). 
Analysis performed to date shows a delivery that satisfies an allowable entry corridor as shown in Figure 3.
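The core range measurement described above has a simple closed form: a spherical body of known radius R subtending an apparent angular diameter θ lies at range R / sin(θ/2). A sketch with illustrative lunar numbers, not flight software values:

```python
import math

def range_from_angular_diameter(radius_km, theta_rad):
    """Range to a sphere of radius R subtending apparent angular diameter theta."""
    return radius_km / math.sin(theta_rad / 2.0)

# Illustrative check: the Moon (R ~ 1737.4 km) at a typical 0.52 deg diameter.
print(range_from_angular_diameter(1737.4, math.radians(0.52)))  # ~3.8e5 km
```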
Imaging spectroscopy using embedded diffractive optical arrays
NASA Astrophysics Data System (ADS)
Hinnrichs, Michele; Hinnrichs, Bradford
2017-09-01
Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera based on diffractive optic arrays. This approach to hyperspectral imaging has been demonstrated in all three infrared bands: SWIR, MWIR, and LWIR. The hyperspectral optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of this infrared hyperspectral sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics made up of an area array of diffractive optical elements, where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system small enough for a payload on a small satellite, mini-UAV, or commercial quadcopter, or for man-portable use. We also describe an application in which this spectral imaging technology is used to quantify the mass and volume flow rates of hydrocarbon gases. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. The detector array is divided into sub-images, one covered by each lenslet. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the number of different spectral images collected simultaneously in each frame of the camera. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame. This system spans the SWIR and MWIR bands with a single optical array and focal plane array.
A method for measuring aircraft height and velocity using dual television cameras
NASA Technical Reports Server (NTRS)
Young, W. R.
1977-01-01
A unique electronic optical technique, consisting of two closed circuit television cameras and timing electronics, was devised to measure an aircraft's horizontal velocity and height above ground without the need for airborne cooperative devices. The system is intended to be used where the aircraft has a predictable flight path and a height of less than 660 meters (2,000 feet) at or near the end of an air terminal runway, but is suitable for greater aircraft altitudes whenever the aircraft remains visible. Two television cameras, pointed at zenith, are placed in line with the expected path of travel of the aircraft. Velocity is determined by measuring the time it takes the aircraft to travel the measured distance between cameras. Height is determined by correlating this speed with the time required to cross the field of view of either camera. Preliminary tests with a breadboard version of the system and a small model aircraft indicate the technique is feasible.
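Under the stated geometry (both cameras pointed at zenith, level flight, the aircraft crossing the full field of view), the measurement reduces to two expressions: v = D / t_baseline, and, since the FOV footprint at height h is 2h·tan(θ/2) wide, h = v·t_fov / (2·tan(θ/2)). A sketch under those assumptions:

```python
import math

def velocity_and_height(baseline_m, t_baseline_s, t_fov_s, fov_deg):
    """Aircraft speed and height from the two timing measurements.

    baseline_m: ground distance between the two zenith-pointing cameras.
    t_baseline_s: time for the aircraft to travel between the cameras.
    t_fov_s: time for the aircraft to cross one camera's field of view.
    fov_deg: full field-of-view angle of the camera.
    """
    v = baseline_m / t_baseline_s
    h = v * t_fov_s / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    return v, h
```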
Researches on hazard avoidance cameras calibration of Lunar Rover
NASA Astrophysics Data System (ADS)
Li, Chunyan; Wang, Li; Lu, Xin; Chen, Jihua; Fan, Shenghong
2017-11-01
China's Lunar Lander and Rover will be launched in 2013. It will carry out the mission objectives of lunar soft landing and patrol exploration. The Lunar Rover has a forward-facing stereo camera pair (Hazcams) for hazard avoidance, and Hazcam calibration is essential for stereo vision. The Hazcam optics are f-theta fish-eye lenses with a 120°×120° horizontal/vertical field of view (FOV) and a 170° diagonal FOV. They introduce significant distortion, and the acquired images are quite warped, so conventional camera calibration algorithms no longer work well. A photogrammetric calibration method for the geometric model of this type of fish-eye optical construction is investigated in this paper. In the method, the Hazcam model is represented by collinearity equations with interior orientation and exterior orientation parameters [1] [2]. For high-precision applications, the calibration model is formulated with radial symmetric distortion and decentering distortion, as well as parameters to model affinity and shear, based on the fisheye deformation model [3] [4]. The proposed method has been applied to the stereo camera calibration system for the Lunar Rover.
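The f-theta fish-eye model named above maps incidence angle linearly to image radius (r = f·θ, rather than the perspective r = f·tan θ). A minimal projection sketch, omitting the radial, decentering, affinity, and shear terms of the full calibration model:

```python
import math

def ftheta_project(X, Y, Z, f):
    """Project a camera-frame point with the equidistant (f-theta) model."""
    theta = math.atan2(math.hypot(X, Y), Z)  # angle from the optical axis
    phi = math.atan2(Y, X)                   # azimuth around the axis
    r = f * theta                            # image radius, linear in theta
    return r * math.cos(phi), r * math.sin(phi)
```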
Refocusing distance of a standard plenoptic camera.
Hahne, Christopher; Aggoun, Amar; Velisavljevic, Vladan; Fiebig, Susanne; Pesch, Matthias
2016-09-19
Recent developments in computational photography enabled variation of the optical focus of a plenoptic camera after image exposure, also known as refocusing. Existing ray models in the field simplify the camera's complexity for the purpose of image and depth map enhancement, but fail to satisfyingly predict the distance to which a photograph is refocused. By treating a pair of light rays as a system of linear functions, it will be shown in this paper that its solution yields an intersection indicating the distance to a refocused object plane. Experimental work is conducted with different lenses and focus settings while comparing distance estimates with a stack of refocused photographs for which a blur metric has been devised. Quantitative assessments over a 24 m distance range suggest that predictions deviate by less than 0.35% in comparison to an optical design software. The proposed refocusing estimator assists in predicting object distances, such as in the prototyping stage of plenoptic cameras, and will be an essential feature in applications demanding high precision in synthetic focus or where depth map recovery is done by analyzing a stack of refocused photographs.
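The paper's central construction, a pair of rays treated as a system of linear functions, reduces to intersecting two lines; a minimal sketch, where the slopes and intercepts are arbitrary placeholders rather than values from the camera model:

```python
def ray_intersection(m1, b1, m2, b2):
    """Intersect z = m1*x + b1 with z = m2*x + b2; returns (x, z).

    For refocusing, the abscissa of the intersection is interpreted as the
    distance to the refocused object plane.
    """
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1
```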
ASPIRE - Airborne Spectro-Polarization InfraRed Experiment
NASA Astrophysics Data System (ADS)
DeLuca, E.; Cheimets, P.; Golub, L.; Madsen, C. A.; Marquez, V.; Bryans, P.; Judge, P. G.; Lussier, L.; McIntosh, S. W.; Tomczyk, S.
2017-12-01
Direct measurements of coronal magnetic fields are critical for taking the next step in active region and solar wind modeling and for building the next generation of physics-based space-weather models. We are proposing a new airborne instrument to make these key observations. Building on the successful Airborne InfraRed Spectrograph (AIR-Spec) experiment for the 2017 eclipse, we will design and build a spectro-polarimeter to measure coronal magnetic field during the 2019 South Pacific eclipse. The new instrument will use the AIR-Spec optical bench and the proven pointing, tracking, and stabilization optics. A new cryogenic spectro-polarimeter will be built focusing on the strongest emission lines observed during the eclipse. The AIR-Spec IR camera, slit jaw camera and data acquisition system will all be reused. The poster will outline the optical design and the science goals for ASPIRE.
A new adaptive light beam focusing principle for scanning light stimulation systems.
Bitzer, L A; Meseth, M; Benson, N; Schmechel, R
2013-02-01
In this article a novel principle for achieving optimal focusing conditions, or rather the smallest possible beam diameter, for scanning light stimulation systems is presented. It is based on the following methodology: First, a reference point on a camera sensor is introduced, where optimal focusing conditions are adjusted, and the distance between the light-focusing optic and the reference point is determined using a laser displacement sensor. In a second step, this displacement sensor is used to map the topography of the sample under investigation. Finally, the actual measurement is conducted using optimal focusing conditions at each measurement point on the sample surface, determined from the height difference between the camera sensor and the sample topography. This principle is independent of the measurement values, the optical or electrical properties of the sample, the light source used, or the selected wavelength. Furthermore, the samples can be tilted, rough, bent, or of different surface materials. In the following, the principle is implemented using an optical beam-induced current system, but it can in principle be applied to any other scanning light stimulation system. Measurements demonstrating its operation are shown, using a polycrystalline silicon solar cell.
Engine flow visualization using a copper vapor laser
NASA Technical Reports Server (NTRS)
Regan, Carolyn A.; Chun, Kue S.; Schock, Harold J., Jr.
1987-01-01
A flow visualization system has been developed to determine the air flow within the combustion chamber of a motored, axisymmetric engine. The engine has been equipped with a transparent quartz cylinder, allowing complete optical access to the chamber. A 40-Watt copper vapor laser is used as the light source. Its beam is focused down to a sheet approximately 1 mm thick. The light plane is passed through the combustion chamber, and illuminates oil particles which were entrained in the intake air. The light scattered off of the particles is recorded by a high speed rotating prism movie camera. A movie is then made showing the air flow within the combustion chamber for an entire four-stroke engine cycle. The system is synchronized so that a pulse generated by the camera triggers the laser's thyratron. The camera is run at 5,000 frames per second; the trigger drives one laser pulse per frame. This paper describes the optics used in the flow visualization system, the synchronization circuit, and presents results obtained from the movie. This is believed to be the first published study showing a planar observation of airflow in a four-stroke piston-cylinder assembly. These flow visualization results have been used to interpret flow velocity measurements previously obtained with a laser Doppler velocimetry system.
NASA Astrophysics Data System (ADS)
Kang, Sungil; Roh, Annah; Nam, Bodam; Hong, Hyunki
2011-12-01
This paper presents a novel vision system for people detection using an omnidirectional camera mounted on a mobile robot. In order to determine regions of interest (ROI), we compute a dense optical flow map using graphics processing units, which enable us to examine compliance with the ego-motion of the robot in a dynamic environment. Shape-based classification algorithms are employed to sort ROIs into human beings and nonhumans. The experimental results show that the proposed system detects people more precisely than previous methods.
NASA Technical Reports Server (NTRS)
1976-01-01
Development of the F/48, F/96 Planetary Camera for the Large Space Telescope is discussed. Instrument characteristics, optical design, and CCD camera submodule thermal design are considered along with structural subsystem and thermal control subsystem. Weight, electrical subsystem, and support equipment requirements are also included.
NASA Astrophysics Data System (ADS)
Cortes-Medellin, German; Parshley, Stephen; Campbell, Donald B.; Warnick, Karl F.; Jeffs, Brian D.; Ganesh, Rajagopalan
2016-08-01
This paper presents the current concept design for ALPACA (Advanced L-Band Phased Array Camera for Arecibo), an L-band cryogenic phased-array instrument proposed for the 305 m radio telescope at Arecibo. It includes a cryogenically cooled front end with 160 low-noise amplifiers, RF-over-fiber signal transport, and a digital beamformer with an instantaneous bandwidth of 312.5 MHz per channel. The camera will digitally form 40 simultaneous beams inside the available field of view of the Arecibo telescope optics, with an expected system temperature goal of 30 K.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nazareth, D; Malhotra, H; French, S
Purpose: Breast radiotherapy, particularly electronic compensation, may involve large dose gradients and difficult patient positioning problems. We have developed a simple self-calibrating augmented-reality system, which assists in accurately and reproducibly positioning the patient, by displaying her live image from a single camera superimposed on the correct perspective projection of her 3D CT data. Our method requires only a standard digital camera capable of live-view mode, installed in the treatment suite at an approximately-known orientation and position (rotation R; translation T). Methods: A 10-sphere calibration jig was constructed and CT imaged to provide a 3D model. The (R,T) relating the camera to the CT coordinate system were determined by acquiring a photograph of the jig and optimizing an objective function, which compares the true image points to points calculated with a given candidate R and T geometry. Using this geometric information, 3D CT patient data, viewed from the camera's perspective, is plotted using a Matlab routine. This image data is superimposed onto the real-time patient image, acquired by the camera, and displayed using standard live-view software. This enables the therapists to view both the patient's current and desired positions, and guide the patient into assuming the correct position. The method was evaluated using an in-house developed bolus-like breast phantom, mounted on a supporting platform, which could be tilted at various angles to simulate treatment-like geometries. Results: Our system allowed breast phantom alignment, with an accuracy of about 0.5 cm and 1 ± 0.5 degree. Better resolution could be possible using a camera with higher-zoom capabilities. Conclusion: We have developed an augmented-reality system, which combines a perspective projection of a CT image with a patient's real-time optical image. This system has the potential to improve patient setup accuracy during breast radiotherapy, and could possibly be used for other disease sites as well.
NASA Technical Reports Server (NTRS)
Ohl, Raymond G.; Dow, Thomas A.; Sohn, Alex
2004-01-01
We present highlights from the American Society for Precision Engineering's 2004 winter topical meeting entitled Free-Form Optics: Design, Fabrication, Metrology, Assembly. We emphasize those papers that are most relevant to astronomical optics. Optical surfaces that transcend the bounds of rotational symmetry have been implemented in novel optical systems with fantastic results since the release of Polaroid's first instant camera. Despite these successes, free-form optics have found only a few niche applications and have yet to enter the mainstream. The purpose of this meeting is to identify the state of the art of free-form optics design, fabrication, metrology and assembly and to identify the technical and logistical challenges that inhibit their widespread use. Issues that will be addressed include: What are free-form optics? How can optical systems be made better with free-form optics? How can designers use free-form optics? How can free-form optics be fabricated? How can they be measured? How are free-form optical systems assembled? Control of multi-axis systems.
Selecting among competing models of electro-optic, infrared camera system range performance
Nichols, Jonathan M.; Hines, James E.; Nichols, James D.
2013-01-01
Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range performance models. Model selection based on Akaike's Information Criterion (AIC) is presented for the type of data collected during a typical human observer and target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions for distances other than the specific set at which experimental trials were conducted.
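For reference, AIC itself is straightforward to compute once each candidate model has been fit; a minimal sketch with placeholder log-likelihoods and parameter counts, not the paper's fitted values:

```python
def aic(log_likelihood, k_params):
    """Akaike's Information Criterion: 2k - 2 ln L (lower is better)."""
    return 2 * k_params - 2 * log_likelihood

models = {"model_A": (-120.3, 3), "model_B": (-118.9, 5)}  # placeholder (lnL, k)
scores = {name: aic(ll, k) for name, (ll, k) in models.items()}
best = min(scores, key=scores.get)  # model with the lowest AIC is preferred
```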
Dawkins, M S; Roberts, S J; Cain, R J; Nickson, T; Donnelly, C A
2017-05-20
Footpad dermatitis and hockburn are serious welfare and economic issues for the production of broiler (meat) chickens. The authors here describe the use of an inexpensive camera system that monitors the movements of broiler flocks throughout their lives and suggest that it is possible to predict, even in young birds, the cross-sectional prevalence at slaughter of footpad dermatitis and hockburn before external signs are visible. The skew and kurtosis calculated from the authors' camera-based optical flow system had considerably more power to predict these outcomes in the 50 flocks reported here than water consumption, bodyweight or mortality and therefore have the potential to inform improved flock management through giving farmers early warning of welfare issues. Further trials are underway to establish the generality of the results.
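A minimal sketch of the flock-movement statistics named above: skew and kurtosis of the optical-flow magnitude distribution across a frame pair. The Farneback flow call is an assumption standing in for the authors' own optical-flow implementation.

```python
import cv2
import numpy as np
from scipy.stats import kurtosis, skew

def flow_shape_stats(prev_gray, gray):
    """Skew and kurtosis of per-pixel flow magnitudes between two gray frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2).ravel()
    return float(skew(mag)), float(kurtosis(mag))
```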
Flexible mobile robot system for smart optical pipe inspection
NASA Astrophysics Data System (ADS)
Kampfer, Wolfram; Bartzke, Ralf; Ziehl, Wolfgang
1998-03-01
Pipe damage can be inspected and graded using TV technology available on the market: remotely controlled vehicles carry a TV camera through the pipes. However, depending on the experience and capability of the operator, diagnostic failures cannot be avoided. The classification of damage requires knowledge of the exact geometrical dimensions of the damage, such as the width and depth of cracks, fractures, and defective connections. Within the framework of a joint R&D project, a sensor-based pipe inspection system named RODIAS has been developed with two partners from industry and a research institute. It consists of a remotely controlled mobile robot that carries intelligent sensors for on-line sewer inspection. The sensor package combines a 3D optical sensor with a laser distance sensor. The laser distance sensor is integrated into the optical system of the camera and can measure the distance between the camera and the object. The angle of view can be determined from the position of the pan-and-tilt unit. With coordinate transformations it is possible to calculate the spatial coordinates of every point in the video image, so the geometry of an object can be described exactly. The company Optimess has developed TriScan32, a special software package for pipe condition classification. The user can start complex measurements of profiles, pipe displacements, or crack widths simply by pressing a push-button. The measurement results are stored, together with other data such as verbal damage descriptions and digitized images, in a database.
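A minimal sketch of the coordinate transformation described above: given the laser-measured distance d and the pan/tilt angles of the camera head, the spatial coordinates of an image point follow from spherical-to-Cartesian conversion. The axis convention is an assumption; RODIAS's actual frame definition is not given here.

```python
import math

def pan_tilt_to_xyz(d, pan_rad, tilt_rad):
    """Camera-frame Cartesian coordinates from distance and pan/tilt angles."""
    x = d * math.cos(tilt_rad) * math.cos(pan_rad)
    y = d * math.cos(tilt_rad) * math.sin(pan_rad)
    z = d * math.sin(tilt_rad)
    return x, y, z
```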
Establishing imaging sensor specifications for digital still cameras
NASA Astrophysics Data System (ADS)
Kriss, Michael A.
2007-02-01
Digital still cameras (DSCs) have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor, or the newer Foveon buried-photodiode sensor. There is a strong tendency for consumers to consider only the number of megapixels in a camera and not the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude, and dynamic range. This paper provides a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range, and exposure latitude based on the physical nature of the imaging optics and the sensor characteristics (including pixel size, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full-well capacity in terms of electrons per square centimeter). Examples will be given for consumer, prosumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.
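Two of the figures of merit discussed above have simple closed forms; a sketch with typical (assumed) sensor values rather than figures from the paper:

```python
import math

full_well_e = 30000.0   # electrons per pixel at saturation (assumed typical value)
read_noise_e = 5.0      # rms read noise in electrons (assumed typical value)

dynamic_range_db = 20.0 * math.log10(full_well_e / read_noise_e)  # ~75.6 dB
dynamic_range_stops = math.log2(full_well_e / read_noise_e)       # ~12.6 stops
```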
Optics design of laser spotter camera for ex-CCD sensor
NASA Astrophysics Data System (ADS)
Nautiyal, R. P.; Mishra, V. K.; Sharma, P. K.
2015-06-01
Development of laser-based instruments such as laser range finders and laser target designators has gained prominence in modern military applications. The laser is aimed at the target with the help of a bore-sighted graticule, since the human eye cannot see the laser beam directly. Two types of detectors are available for viewing the laser spot, InGaAs detectors and Ex-CCD detectors, the latter being the cost-effective solution. In this paper the optics design for an Ex-CCD-based camera is discussed. The designed system is lightweight and compact and can see the 1064 nm pulsed laser spot up to a range of 5 km.
A Precision Metrology System for the Hubble Space Telescope Wide Field Camera 3 Instrument
NASA Technical Reports Server (NTRS)
Toland, Ronald W.
2003-01-01
The Wide Field Camera 3 (WFC3) instrument for the Hubble Space Telescope (HST) will replace the current Wide Field and Planetary Camera 2 (WFPC2). By providing higher throughput and sensitivity than WFPC2, and operating from the near-IR to the near-UV, WFC3 will once again bring the performance of HST above that of ground-based observatories. Crucial to the integration of the WFC3 optical bench is a pair of 2-axis cathetometers used to view targets which cannot be seen by other means once the bench is loaded into its enclosure. The setup and calibration of these cathetometers are described, along with results from a comparison of the cathetometer system with other metrology techniques.
NASA Technical Reports Server (NTRS)
McDowell, Mark (Inventor); Glasgow, Thomas K. (Inventor)
1999-01-01
A system and a method for measuring three-dimensional velocities at a plurality of points in a fluid, employing at least two cameras positioned approximately perpendicular to one another. The cameras are calibrated to accurately relate image coordinates to the world coordinate system. The two-dimensional views of the cameras are recorded for image processing and centroid coordinate determination. Any overlapping particle clusters are decomposed into constituent centroids. The tracer particles are tracked on a two-dimensional basis and then stereo-matched to obtain three-dimensional locations of the particles as a function of time, so that velocities can be measured therefrom. This stereo imaging velocimetry technique provides a full-field, quantitative, three-dimensional map of any optically transparent fluid which is seeded with tracer particles.
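The stereo-matching step can be sketched as follows: once a particle has been matched between the two views, each calibrated camera contributes a ray, and the 3-D position is taken as the midpoint of the shortest segment between the two (generally skew) rays. The camera model and matching logic are omitted; this is generic triangulation, not the patented implementation.

```python
# Generic two-ray triangulation: rays o + t*d from each camera; returns the
# midpoint of closest approach. Assumes the rays are not parallel.
import numpy as np

def triangulate(o1, d1, o2, d2):
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    b = d1 @ d2                 # cosine of the angle between the rays
    w = o1 - o2
    denom = 1.0 - b * b         # approaches 0 for parallel rays
    t1 = (b * (d2 @ w) - (d1 @ w)) / denom
    t2 = ((d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```

For cameras mounted approximately perpendicular to one another, as here, `denom` is close to 1 and the estimate is well conditioned.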
NASA Technical Reports Server (NTRS)
Monford, Leo G. (Inventor)
1990-01-01
Improved techniques are provided for alignment of two objects. The present invention is particularly suited for three-dimensional translation and three-dimensional rotational alignment of objects in outer space. A camera 18 is fixedly mounted to one object, such as a remote manipulator arm 10 of the spacecraft, while the planar reflective surface 30 is fixed to the other object, such as a grapple fixture 20. A monitor 50 displays in real-time images from the camera, such that the monitor displays both the reflected image of the camera and visible markings on the planar reflective surface when the objects are in proper alignment. The monitor may thus be viewed by the operator and the arm 10 manipulated so that the reflective surface is perpendicular to the optical axis of the camera, the roll of the reflective surface is at a selected angle with respect to the camera, and the camera is spaced a pre-selected distance from the reflective surface.
Improved docking alignment system
NASA Technical Reports Server (NTRS)
Monford, Leo G. (Inventor)
1988-01-01
Improved techniques are provided for the alignment of two objects. The present invention is particularly suited for 3-D translation and 3-D rotational alignment of objects in outer space. A camera is affixed to one object, such as a remote manipulator arm of the spacecraft, while the planar reflective surface is affixed to the other object, such as a grapple fixture. A monitor displays in real-time images from the camera such that the monitor displays both the reflected image of the camera and visible markings on the planar reflective surface when the objects are in proper alignment. The monitor may thus be viewed by the operator and the arm manipulated so that the reflective surface is perpendicular to the optical axis of the camera, the roll of the reflective surface is at a selected angle with respect to the camera, and the camera is spaced a pre-selected distance from the reflective surface.
Displacement and deformation measurement for large structures by camera network
NASA Astrophysics Data System (ADS)
Shang, Yang; Yu, Qifeng; Yang, Zhen; Xu, Zhiqiang; Zhang, Xiaohu
2014-03-01
A displacement and deformation measurement method for large structures based on a series-parallel connection camera network is presented. Taking the dynamic monitoring of a large-scale crane during lifting operations as an example, such a network is designed and the corresponding measurement method is studied. The crane body moves within a small range, while the crane arm moves within a large one; the displacement of the crane body, the displacement of the crane arm relative to the body, and the deformation of the arm are all measured. Compared with a purely serial or purely parallel camera network, the series-parallel design can measure not only the overall movement and displacement of a large structure but also the relative movement and deformation of parts of interest, with a relatively simple optical measurement system.
A novel simultaneous streak and framing camera without principle errors
NASA Astrophysics Data System (ADS)
Jingzhen, L.; Fengshan, S.; Ningwen, L.; Xiangdong, G.; Bin, H.; Qingyang, W.; Hongyi, C.; Yi, C.; Xiaowei, L.
2018-02-01
A novel simultaneous streak and framing camera with continuous access has been developed; the complete information it records is important for the exact interpretation and precise evaluation of many detonation events and shockwave phenomena. The camera, with a maximum imaging frequency of 2 × 10^6 frames/s and a maximum scanning velocity of 16.3 mm/μs, has fine imaging properties: an eigen resolution of over 40 lp/mm in the temporal direction and over 60 lp/mm in the spatial direction, zero framing-frequency principle error for framing records, a maximum time resolving power of 8 ns, and a scanning-velocity nonuniformity of 0.136% to -0.277% for streak records. Test data have verified the performance of the camera quantitatively. The camera, which acquires frames and streak simultaneously, parallax-free and on an identical time base, is characterized by a plane optical system at oblique incidence (as opposed to a space system), an innovative camera obscura without principle errors, and a high-speed motor-driven beryllium-like rotating mirror made of high-strength aluminum alloy with a cellular lateral structure. Experiments demonstrate that the camera is useful and reliable for taking high-quality pictures of detonation events.
Adjustment of multi-CCD-chip-color-camera heads
NASA Astrophysics Data System (ADS)
Guyenot, Volker; Tittelbach, Guenther; Palme, Martin
1999-09-01
The principle of beam-splitter multi-chip cameras consists in splitting an image into multiple images of different spectral ranges and distributing these onto separate black-and-white CCD sensors. The resulting electrical signals from the chips are recombined to produce a high-quality color picture on the monitor. Because this principle guarantees higher resolution and sensitivity than conventional single-chip camera heads, the greater effort is acceptable. Furthermore, multi-chip cameras obtain the complete spectral information for each individual object point, while single-chip systems must rely on interpolation. In a joint project, Fraunhofer IOF and STRACON GmbH (in the future, COBRA electronic GmbH) are developing methods for designing the optics and dichroic mirror system of such prism color beam-splitter devices. Additionally, techniques and equipment for the alignment and assembly of color-beam-splitter multi-CCD devices based on gluing with UV-curable adhesives have been developed.
A survey of camera error sources in machine vision systems
NASA Astrophysics Data System (ADS)
Jatko, W. B.
In machine vision applications, such as an automated inspection line, television cameras are commonly used to record scene intensity in a computer memory or frame buffer. Scene data from the image sensor can then be analyzed with a wide variety of feature-detection techniques. Many algorithms found in textbooks on image processing make the implicit simplifying assumption of an ideal input image with clearly defined edges and uniform illumination. The ideal image model is helpful to aid the student in understanding the principles of operation, but when these algorithms are blindly applied to real-world images the results can be unsatisfactory. This paper examines some common measurement errors found in camera sensors and their underlying causes, and possible methods of error compensation. The role of the camera in a typical image-processing system is discussed, with emphasis on the origination of signal distortions. The effects of such things as lighting, optics, and sensor characteristics are considered.
Photodetectors for the Advanced Gamma-ray Imaging System (AGIS)
NASA Astrophysics Data System (ADS)
Wagner, Robert G.; Advanced Gamma-ray Imaging System AGIS Collaboration
2010-03-01
The Advanced Gamma-Ray Imaging System (AGIS) is a concept for the next-generation very-high-energy gamma-ray observatory. Design goals include an order of magnitude better sensitivity, better angular resolution, and a lower energy threshold than existing Cherenkov telescopes. Each telescope is equipped with a camera that detects and records the Cherenkov-light flashes from air showers. The camera is composed of a pixelated focal plane of blue-sensitive and fast (nanosecond) photon detectors that detect the photon signal and convert it into an electrical one. Given the scale of AGIS, the camera must be reliable and cost-effective. The Schwarzschild-Couder optical design yields a smaller plate scale than present-day Cherenkov telescopes, enabling the use of more compact multi-pixel devices, including multianode photomultipliers or Geiger avalanche photodiodes. We present the conceptual design of the focal plane for the camera and results from testing candidate focal-plane sensors.
Orion Optical Navigation Progress Toward Exploration Mission 1
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher N.; Saley, David
2018-01-01
Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. The Orion emergency return system utilizing optical navigation has matured in design over the last several years and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is past its Critical Design Review and is progressing through test and certification for human rating. The filter architecture uses a square-root-free UDU covariance factorization. Linear Covariance Analysis (LinCov) was used to analyze the measurement models and the measurement error models on a representative EM-1 trajectory. The Orion EM-1 flight camera was calibrated at the Johnson Space Center (JSC) electro-optics lab; to permanently stake the focal length of the camera, a 500 mm focal-length refractive collimator was used. Two Engineering Design Unit (EDU) cameras and an EDU star tracker were used for a live-sky test in Denver. In-space imagery with high-fidelity truth metadata is rare, so these live-sky tests provide one of the closest real-world analogs to operational use. A hardware-in-the-loop test rig was developed in the JSC Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. The software is verified with synthetic images. Several hundred off-nominal images are also used to analyze robustness and fault detection in the software. These include effects such as stray light, excess radiation damage, and specular reflections, and are used to help verify the tuning parameters chosen for the algorithms, such as earth atmosphere bias, minimum pixel intensity, and star detection thresholds.
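The square-root-free UDU factorization mentioned above stores the covariance as P = U·diag(D)·Uᵀ with U unit upper-triangular, which helps preserve symmetry and positive-definiteness under finite-precision arithmetic. The sketch below is the generic textbook (Bierman-style) factorization, not Orion flight software.

```python
# Generic U-D factorization: P = U @ diag(D) @ U.T, U unit upper-triangular.
import numpy as np

def ud_factorize(P):
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    U, D = np.eye(n), np.zeros(n)
    for j in range(n - 1, -1, -1):          # work from the last column back
        D[j] = P[j, j] - np.sum(D[j+1:] * U[j, j+1:] ** 2)
        for i in range(j):
            U[i, j] = (P[i, j]
                       - np.sum(D[j+1:] * U[i, j+1:] * U[j, j+1:])) / D[j]
    return U, D

A = np.random.randn(4, 4)
P = A @ A.T + 4 * np.eye(4)                 # a synthetic covariance
U, D = ud_factorize(P)
assert np.allclose(U @ np.diag(D) @ U.T, P) # round-trip check
```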
Baikejiang, Reheman; Zhang, Wei; Li, Changqing
2017-01-01
Diffuse optical tomography (DOT) has attracted attention in the last two decades due to its intrinsic sensitivity in imaging chromophores of tissues such as hemoglobin, water, and lipid. However, DOT has not been clinically accepted yet due to its low spatial resolution caused by strong optical scattering in tissues. Structural guidance provided by an anatomical imaging modality enhances DOT imaging substantially. Here, we propose a computed tomography (CT) guided multispectral DOT imaging system for breast cancer imaging. To validate its feasibility, we have built a prototype DOT imaging system which consists of a laser at the wavelength of 650 nm and an electron-multiplying charge-coupled device (EMCCD) camera. We have validated the CT-guided DOT reconstruction algorithms with numerical simulations and phantom experiments, in which different imaging setup parameters, such as the number of projection measurements and the width of the measurement patch, have been investigated. Our results indicate that an air-cooled EMCCD camera is good enough for transmission-mode DOT imaging. We have also found that measurements at six angular projections are sufficient for DOT to reconstruct optical targets with 2 and 4 times absorption contrast when the CT guidance is applied. Finally, we describe our future research plan on integration of a multispectral DOT imaging system into a breast CT scanner.
Process control of laser conduction welding by thermal imaging measurement with a color camera.
Bardin, Fabrice; Morgan, Stephen; Williams, Stewart; McBride, Roy; Moore, Andrew J; Jones, Julian D C; Hand, Duncan P
2005-11-10
Conduction welding offers an alternative to keyhole welding; it is an intrinsically stable process because vaporization phenomena are minimal. However, as with keyhole welding, an on-line process-monitoring system is advantageous for quality assurance, to maintain the required penetration depth, which in conduction welding is more sensitive to changes in heat sinking. Maximum penetration is obtained when the surface temperature is just below the boiling point, so we normally wish to maintain the temperature at this level. We describe a two-color optical system that we have developed for real-time temperature-profile measurement of the conduction weld pool. The key feature of the system is the use of a standard complementary metal-oxide-semiconductor (CMOS) color camera, leading to a simplified low-cost optical setup. We present and discuss the real-time temperature measurement and control performance of the system when a defocused beam from a high-power Nd:YAG laser is used on 5 mm thick stainless steel workpieces.
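The two-color principle can be illustrated with ratio pyrometry: under Wien's approximation and a greybody assumption, the ratio of intensities in two spectral bands yields temperature independently of emissivity. The effective band wavelengths below are illustrative; a real Bayer-filter camera would need a radiometric calibration of both channels.

```python
# Ratio pyrometry sketch (Wien approximation, greybody assumption).
import numpy as np

C2 = 1.4388e-2                    # second radiation constant, m*K
LAM_R, LAM_G = 620e-9, 540e-9     # assumed effective band centres, m

def ratio_temperature(i_red, i_green):
    """Temperature in kelvin from the red/green intensity ratio."""
    ratio = np.asarray(i_red, dtype=float) / i_green
    return C2 * (1 / LAM_G - 1 / LAM_R) / (
        np.log(ratio) - 5 * np.log(LAM_G / LAM_R))
```

Applied per pixel, this turns the color image into a temperature map, which is what a control loop needs to hold the pool just below the boiling point.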
A vidicon camera for real time X-ray diffraction studies on polymers using synchrotron radiation
NASA Astrophysics Data System (ADS)
Prieske, W.; Riekel, C.; Koch, M. H. J.; Zachmann, H. G.
1983-04-01
A Westinghouse vidicon camera with a ZnS(Ag)- or Gd2O2S:Tb-coated fiber-optics plate has been used to study changes in the structure of oriented polyethylene terephthalate during heat treatment. The data were stored on videotape. Once completed, the system will allow the pictures to be read out via an analogue-to-digital converter into a PDP11/24 computer.
A DirtI Application for LBT Commissioning Campaigns
NASA Astrophysics Data System (ADS)
Borelli, J. L.
2009-09-01
In order to characterize the Gregorian focal stations and test the performance achieved by the Large Binocular Telescope (LBT) adaptive optics system, two infrared test cameras were constructed within a joint project between INAF (Osservatorio Astronomico di Bologna, Italy) and the Max Planck Institute for Astronomy (Germany). This paper describes the functionality of, and the successful results obtained with, the Daemon for the Infrared Test Camera Interface (DirtI) during commissioning campaigns.
Graphic design of pinhole cameras
NASA Technical Reports Server (NTRS)
Edwards, H. B.; Chu, W. P.
1979-01-01
The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
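The graphical method itself is not reproduced here, but the trade-off it optimizes has a well-known closed form: geometric blur grows with pinhole diameter d while diffraction blur grows as λf/d, and their combination is minimized near d = sqrt(2.44·λ·f). Treat the numeric factor as a common rule of thumb rather than the paper's result.

```python
# Rule-of-thumb optimal pinhole diameter (diffraction vs geometric blur).
import math

def optimal_pinhole_d(wavelength_m, focal_length_m):
    return math.sqrt(2.44 * wavelength_m * focal_length_m)

f, lam = 0.1, 550e-9                    # 100 mm pinhole-to-film distance, green light
d = optimal_pinhole_d(lam, f)           # ~0.37 mm
print(f"optimal pinhole {d*1e3:.2f} mm, working f-number ~{f/d:.0f}")
```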
NASA Technical Reports Server (NTRS)
Mollberg, Bernard H.; Schardt, Bruton B.
1988-01-01
The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.
Close-range photogrammetry with video cameras
NASA Technical Reports Server (NTRS)
Burner, A. W.; Snow, W. L.; Goad, W. K.
1985-01-01
Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.
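As a concrete illustration of the lens-distortion step (not the authors' exact implementation): the plumb-line method fits radial coefficients so that images of straight lines come out straight, after which image points are corrected with Brown's radial model. The coefficients and principal point below are placeholders.

```python
# Brown radial model: x_u = c + (x_d - c) * (1 + k1*r^2 + k2*r^4).
import numpy as np

def undistort_points(pts, centre, k1, k2):
    """Correct distorted image points (N x 2 array) radially about centre."""
    d = pts - centre
    r2 = np.sum(d * d, axis=1, keepdims=True)   # squared radius per point
    return centre + d * (1 + k1 * r2 + k2 * r2 ** 2)
```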
Close-Range Photogrammetry with Video Cameras
NASA Technical Reports Server (NTRS)
Burner, A. W.; Snow, W. L.; Goad, W. K.
1983-01-01
Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.
Optical Indoor Positioning System Based on TFT Technology.
Gőzse, István
2015-12-24
A novel indoor positioning system is presented in the paper. Like camera-based solutions, it is based on visual detection, but it differs conceptually from the classical approaches. First, the objects are marked by LEDs, and second, a special sensing unit, rather than a camera, is used to track the motion of the markers. This sensing unit realizes a modified pinhole camera model in which the light-sensing area is fixed and consists of a small number of sensing elements (photodiodes), and it is the hole that can be moved. The markers are tracked by controlling the motion of the hole such that the light of the LEDs always hits the photodiodes. The proposed concept has several advantages: apart from its low computational demands, it is insensitive to disturbing ambient light. Moreover, as every component of the system can be realized with simple and inexpensive elements, the overall cost of the system can be kept low.
Design and realization of adaptive optical principle system without wavefront sensing
NASA Astrophysics Data System (ADS)
Wang, Xiaobin; Niu, Chaojun; Guo, Yaxing; Han, Xiang'e.
2018-02-01
In this paper, we focus on improving the performance of free-space optical communication systems and investigate wavefront-sensorless adaptive optics. We use a phase-only liquid crystal spatial light modulator (SLM) as the wavefront corrector, and the optical intensity distribution of the distorted wavefront is detected by a CCD. We developed a wavefront controller based on ARM and software based on the Linux operating system; the controller drives both the CCD camera and the wavefront corrector. The experimental system contains two SLMs: one simulates atmospheric turbulence, and the other compensates the wavefront distortion. The experimental results show that the performance quality metric (the total gray value of 25 pixels) increases from 3037 to 4863 after 200 iterations. It is also demonstrated that our wavefront-sensorless adaptive optics system based on the SPGD algorithm performs well in compensating wavefront distortion.
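The SPGD (stochastic parallel gradient descent) iteration named above admits a compact statement: dither all corrector channels with a random ± perturbation, measure the image-quality metric for both signs, and step along the estimated gradient. Here measure_metric() and apply_voltages() are stand-ins for the CCD readout and the SLM interface.

```python
# One SPGD update of the control vector u.
import numpy as np

def spgd_step(u, gain, dither, measure_metric, apply_voltages):
    delta = dither * np.random.choice([-1.0, 1.0], size=u.shape)
    apply_voltages(u + delta)
    j_plus = measure_metric()      # e.g. total gray value of 25 pixels
    apply_voltages(u - delta)
    j_minus = measure_metric()
    return u + gain * (j_plus - j_minus) * delta
```

Iterating this step climbs the metric without any wavefront sensor, which is the sense in which the system is "wavefront-sensorless".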
Chavez-Burbano, Patricia; Rabadan, Jose; Perez-Jimenez, Rafael
2017-01-01
Due to the massive insertion of embedded cameras in a wide variety of devices and the generalized use of LED lamps, Optical Camera Communication (OCC) has been proposed as a practical solution for future Internet of Things (IoT) and smart city applications. The influence of mobility, weather conditions, solar radiation interference, and external light sources on Visible Light Communication (VLC) schemes has been addressed in previous works. Some authors have studied the spatial intersymbol interference from close emitters within an OCC system; however, it has not been characterized or measured as a function of the transmitted wavelengths. In this work, this interference has been experimentally characterized, and the Normalized Power Signal to Interference Ratio (NPSIR), which allows the interference to be determined easily in other implementations independently of the selected system devices, is also proposed. A set of experiments in a darkroom, working with RGB multi-LED transmitters and a general-purpose camera, was performed in order to obtain the NPSIR values and to validate the deduced equations for the 2D pixel representation of real distances. These parameters were used in the simulation of a wireless sensor network scenario in a small office, where the bit error rate (BER) of the communication link was calculated. The experiments show that the interference of other close emitters, in terms of distance and wavelength, can be easily determined with the NPSIR. Finally, the simulation validates the applicability of the deduced equations for scaling the initial results to real scenarios. PMID:28677613
Chavez-Burbano, Patricia; Guerra, Victor; Rabadan, Jose; Rodríguez-Esparragón, Dionisio; Perez-Jimenez, Rafael
2017-07-04
Due to the massive insertion of embedded cameras in a wide variety of devices and the generalized use of LED lamps, Optical Camera Communication (OCC) has been proposed as a practical solution for future Internet of Things (IoT) and smart city applications. The influence of mobility, weather conditions, solar radiation interference, and external light sources on Visible Light Communication (VLC) schemes has been addressed in previous works. Some authors have studied the spatial intersymbol interference from close emitters within an OCC system; however, it has not been characterized or measured as a function of the transmitted wavelengths. In this work, this interference has been experimentally characterized, and the Normalized Power Signal to Interference Ratio (NPSIR), which allows the interference to be determined easily in other implementations independently of the selected system devices, is also proposed. A set of experiments in a darkroom, working with RGB multi-LED transmitters and a general-purpose camera, was performed in order to obtain the NPSIR values and to validate the deduced equations for the 2D pixel representation of real distances. These parameters were used in the simulation of a wireless sensor network scenario in a small office, where the bit error rate (BER) of the communication link was calculated. The experiments show that the interference of other close emitters, in terms of distance and wavelength, can be easily determined with the NPSIR. Finally, the simulation validates the applicability of the deduced equations for scaling the initial results to real scenarios.
COBRA ATD multispectral camera response model
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of the five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data-fitting techniques were applied to these measured response curves to obtain nonlinear expressions that estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. The COBRA camera response model proved to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. This practical camera model was subsequently incorporated into the COBRA sensor performance evaluation and computational tools for research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response with measured experimental data are presented.
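The data-fitting idea can be sketched with a generic parametric response DN = a·(E·t·G)^p + dark fitted to measured calibration points. The functional form, parameter values, and data below are illustrative stand-ins, not the actual COBRA model terms.

```python
# Fit a hypothetical nonlinear camera response to calibration measurements.
import numpy as np
from scipy.optimize import curve_fit

def response(X, a, p, dark):
    irradiance, exposure, gain = X
    return a * (irradiance * exposure * gain) ** p + dark

irr  = np.array([0.5, 1.0, 2.0, 4.0, 8.0])          # relative irradiance
texp = np.full(5, 1e-3)                             # exposure time, s
gain = np.ones(5)                                   # intensifier gain setting
dn   = np.array([40.0, 72.0, 131.0, 240.0, 445.0])  # made-up counts

(a, p, dark), _ = curve_fit(response, (irr, texp, gain), dn,
                            p0=[3e4, 0.9, 5.0])
```

Because the fitted form is monotonic in irradiance, it is analytically invertible, one of the properties the abstract highlights.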
Application of single-image camera calibration for ultrasound augmented laparoscopic visualization
NASA Astrophysics Data System (ADS)
Liu, Xinyang; Su, He; Kang, Sukryool; Kane, Timothy D.; Shekhar, Raj
2015-03-01
Accurate calibration of laparoscopic cameras is essential for enabling many surgical visualization and navigation technologies such as the ultrasound-augmented visualization system that we have developed for laparoscopic surgery. In addition to accuracy and robustness, there is a practical need for a fast and easy camera calibration method that can be performed on demand in the operating room (OR). Conventional camera calibration methods are not suitable for OR use because they are lengthy and tedious: they require acquisition of multiple images of a target pattern in its entirety to produce a satisfactory result. In this work, we evaluated the performance of a single-image camera calibration tool (rdCalib; Percieve3D, Coimbra, Portugal) featuring automatic detection of corner points in the image, whether partial or complete, of a custom target pattern. Intrinsic camera parameters of 5-mm and 10-mm standard Stryker® laparoscopes obtained using rdCalib and the well-accepted OpenCV camera calibration method were compared. Target registration error (TRE), as a measure of camera calibration accuracy for our optical tracking-based AR system, was also compared between the two calibration methods. Based on our experiments, the single-image camera calibration yields consistent and accurate results (mean TRE = 1.18 ± 0.35 mm for the 5-mm scope and mean TRE = 1.13 ± 0.32 mm for the 10-mm scope), which are comparable to the results obtained using the OpenCV method with 30 images. The new single-image camera calibration method is promising for application to our augmented reality visualization system for laparoscopic surgery.
Application of single-image camera calibration for ultrasound augmented laparoscopic visualization
Liu, Xinyang; Su, He; Kang, Sukryool; Kane, Timothy D.; Shekhar, Raj
2017-01-01
Accurate calibration of laparoscopic cameras is essential for enabling many surgical visualization and navigation technologies such as the ultrasound-augmented visualization system that we have developed for laparoscopic surgery. In addition to accuracy and robustness, there is a practical need for a fast and easy camera calibration method that can be performed on demand in the operating room (OR). Conventional camera calibration methods are not suitable for OR use because they are lengthy and tedious: they require acquisition of multiple images of a target pattern in its entirety to produce a satisfactory result. In this work, we evaluated the performance of a single-image camera calibration tool (rdCalib; Percieve3D, Coimbra, Portugal) featuring automatic detection of corner points in the image, whether partial or complete, of a custom target pattern. Intrinsic camera parameters of 5-mm and 10-mm standard Stryker® laparoscopes obtained using rdCalib and the well-accepted OpenCV camera calibration method were compared. Target registration error (TRE), as a measure of camera calibration accuracy for our optical tracking-based AR system, was also compared between the two calibration methods. Based on our experiments, the single-image camera calibration yields consistent and accurate results (mean TRE = 1.18 ± 0.35 mm for the 5-mm scope and mean TRE = 1.13 ± 0.32 mm for the 10-mm scope), which are comparable to the results obtained using the OpenCV method with 30 images. The new single-image camera calibration method is promising for application to our augmented reality visualization system for laparoscopic surgery. PMID:28943703
2007-05-01
In general, off-axis imaging can cause distortion and astigmatism in the image if proper precautions are not taken. In this case, the lens selection ... astigmatism into the optical system. This astigmatism takes the form of a blurring in each image directed away from the optical axis. This blurring ... is non-trivial and makes particle identification nearly impossible. Images of particles from two of the off-axis cameras with the astigmatism present
Adaptive Beamforming Algorithms for High Resolution Microwave Imaging
1991-04-01
frequency- and phase-locked. With a system of radio camera size it must be assumed that oscillators will drift and, similarly, that electronic circuits in ... propagation-induced phase errors an array as large as the one under discussion is likely to experience different weather conditions across it. The nominal ... human optical system. Such a passing-scene display with human optical resolving power would be available to the airman at night as well as during the
Lock-In Imaging System for Detecting Disturbances in Fluid
NASA Technical Reports Server (NTRS)
Park, Yeonjoon (Inventor); Choi, Sang Hyouk (Inventor); King, Glen C. (Inventor); Elliott, James R. (Inventor); Dimarcantonio, Albert L. (Inventor)
2014-01-01
A lock-in imaging system is configured for detecting a disturbance in air. The system includes an airplane, an interferometer, and a telescopic imaging camera. The airplane includes a fuselage and a pair of wings. The airplane is configured for flight in air. The interferometer is operatively disposed on the airplane and configured for producing an interference pattern by splitting a beam of light into two beams along two paths and recombining the two beams at a junction point in a front flight path of the airplane during flight. The telescopic imaging camera is configured for capturing an image of the beams at the junction point. The telescopic imaging camera is configured for detecting the disturbance in air in an optical path, based on an index of refraction of the image, as detected at the junction point.
[Studies of vision by Leonardo da Vinci].
Berggren, L
2001-01-01
Leonardo was an advocate of the intromission theory of vision: light rays from the object to the eye caused visual perceptions, which were transported to the brain ventricles via a hollow optic nerve. Leonardo introduced wax injections to explore the ventricular system. Perceptions were assumed to go to the "senso comune" in the middle (3rd) ventricle, also the seat of the soul. The processing station "imprensiva" in the anterior lateral horns, together with memory ("memoria") in the posterior (4th) ventricle, integrated the visual perceptions into visual experience. Leonardo's sketches with circular lenses in the center of the eye reveal that his dependence on medieval optics prevailed over anatomical observations. Drawings of the anatomy of the sectioned eye are missing, although Leonardo had invented a new embedding technique: in order to dissect the eye without spilling its contents, the eye was first boiled in egg white and then cut. The procedure has since been repeated and showed that the ovoid lens becomes spherical after boiling. Leonardo described how light rays were refracted and reflected in the eye, but his imperfect anatomy prevented a development of physiological optics. He was, however, the first to compare the eye with a pinhole camera (camera obscura). Leonardo's drawings of the inverted pictures on the back wall of a camera obscura inspired its use as an instrument for artistic practice. The camera obscura was for centuries a model for explaining human vision.
Royo Sánchez, Ana Cristina; Aguilar Martín, Juan José; Santolaria Mazo, Jorge
2014-12-01
Motion capture systems are often used for checking and analyzing human motion in biomechanical applications. It is important, in this context, that the systems provide the best possible accuracy; among existing capture systems, optical systems are those with the highest accuracy. In this paper, the development of a new calibration procedure for optical human motion capture systems is presented, and its performance and effectiveness are checked by experimental validation. The new calibration procedure consists of two stages. In the first stage, initial estimates of the intrinsic and extrinsic parameters are sought, using the camera calibration method proposed by Tsai; these parameters are determined from the camera characteristics, the spatial position of the camera, and the center of the capture volume. In the second stage, a simultaneous nonlinear optimization of all parameters is performed to identify the optimal values that minimize the objective function. The objective function, in this case, minimizes two errors: the first is the distance error between two markers placed on a wand, and the second is the position and orientation error of the retroreflective markers of a static calibration object. The true coordinates of the two objects are calibrated on a coordinate measuring machine (CMM). The OrthoBio system is used to validate the new calibration procedure. The resulting errors are 90% lower than those from the previous calibration software and broadly comparable with results from a similarly configured Vicon system.
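The second stage can be sketched as a nonlinear least-squares problem whose residuals compare each reconstructed wand length with the CMM-calibrated value (the static-object pose residuals would be added analogously). triangulate() and frames are hypothetical stand-ins, not the paper's implementation.

```python
# Sketch of the joint refinement stage over all camera parameters.
import numpy as np
from scipy.optimize import least_squares

WAND_LENGTH = 0.500   # metres; assumed nominal value calibrated on the CMM

def wand_residuals(params, frames, triangulate):
    """One residual per frame: reconstructed wand length minus nominal."""
    res = []
    for obs_a, obs_b in frames:       # image observations of the two markers
        pa = triangulate(params, obs_a)
        pb = triangulate(params, obs_b)
        res.append(np.linalg.norm(pa - pb) - WAND_LENGTH)
    return np.asarray(res)

# fit = least_squares(wand_residuals, params0, args=(frames, triangulate))
```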
NASA Technical Reports Server (NTRS)
Behar, Alberto; Carsey, Frank; Lane, Arthur; Engelhardt, Herman
2006-01-01
An instrumentation system has been developed for studying interactions between a glacier or ice sheet and the underlying rock and/or soil. Prior borehole imaging systems have been used in well-drilling and mineral-exploration applications and for studying relatively thin valley glaciers, but have not been used for studying thick ice sheets like those of Antarctica. The system includes a cylindrical imaging probe that is lowered into a hole that has been bored through the ice to the ice/bedrock interface by use of an established hot-water-jet technique. The images acquired by the cameras yield information on the movement of the ice relative to the bedrock and on visible features of the lower structure of the ice sheet, including ice layers formed at different times, bubbles, and mineralogical inclusions. At the time of reporting the information for this article, the system had just been deployed in two boreholes on the Amery Ice Shelf in East Antarctica, following successful 2000-2001 deployments in four boreholes at Ice Stream C, West Antarctica, and a 2002 deployment at Black Rapids Glacier, Alaska. The probe is designed to operate at temperatures from -40 to +40 °C and to withstand the cold, wet, high-pressure [130-atm (13.2-MPa)] environment at the bottom of a water-filled borehole in ice as deep as 1.6 km. A current version is being outfitted to service 2.4-km-deep boreholes at the Rutford Ice Stream in West Antarctica. The probe contains a side-looking charge-coupled-device (CCD) camera that generates both a real-time analog video signal and a sequence of still-image data, and contains a digital videotape recorder. The probe also contains a downward-looking CCD analog video camera, plus halogen lamps to illuminate the fields of view of both cameras. The analog video outputs of the cameras are converted to optical signals that are transmitted to a surface station via optical fibers in a cable. Electric power is supplied to the probe through wires in the cable at a potential of 170 VDC. A DC-to-DC converter steps the supply down to 12 VDC for the lights, cameras, and image-data-transmission circuitry. Heat generated by dissipation of electric power in the probe is removed simply by conduction through the probe housing to the adjacent water and ice.
NASA Astrophysics Data System (ADS)
Liu, W.; Wang, H.; Liu, D.; Miu, Y.
2018-05-01
Precise geometric parameters are essential to ensure the positioning accuracy of space optical cameras. However, state-of-the-art on-orbit calibration methods inevitably suffer from long update cycles and poor timeliness. To this end, in this paper we exploit the optical auto-collimation principle and propose a real-time onboard calibration scheme for monitoring key geometric parameters. Specifically, in the proposed scheme, auto-collimation devices are first constructed by installing collimated light sources, area-array CCDs, and prisms inside the satellite payload system. Through these devices, changes in the geometric parameters are elegantly converted into changes in spot-image positions, and the variation of the geometric parameters can be derived by extracting and processing the spot images. An experimental platform is then set up to verify the feasibility and analyze the precision of the proposed scheme. The experimental results demonstrate that it is feasible to apply the optical auto-collimation principle to real-time onboard monitoring.
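The optical relation such a scheme relies on is the standard autocollimation one: a mirror tilt θ deflects the returned beam by 2θ, so a spot displacement d on the CCD behind a collimator of focal length f corresponds to θ = arctan(d/f)/2. The numbers below are illustrative, not from the paper.

```python
# Mirror tilt (arcseconds) from an autocollimation spot displacement.
import math

def tilt_from_spot_shift(d_mm, focal_mm):
    theta_rad = 0.5 * math.atan(d_mm / focal_mm)
    return math.degrees(theta_rad) * 3600.0

print(tilt_from_spot_shift(0.010, 500.0))  # 10 um shift, f = 500 mm -> ~2 arcsec
```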
Analysis of the effect on optical equipment caused by solar position in target flight measure
NASA Astrophysics Data System (ADS)
Zhu, Shun-hua; Hu, Hai-bin
2012-11-01
Optical equipment is widely used to measure flight parameters in target flight performance tests, but the equipment is sensitive to the sun's rays. To prevent direct sunlight from shining into the camera lens of the optical equipment when measuring target flight parameters, the angle between the observation direction and the line connecting the camera lens and the sun should be kept large. This article introduces a method for calculating the solar azimuth and altitude seen by the optical equipment at any time and any place on the earth, a model of the equipment's observation direction, and a model for calculating the angle between the observation direction and the line connecting the camera lens and the sun. A simulation of the effect of solar position on the optical equipment at different times, dates, months, and target flight directions is also given.
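The solar-position calculation such an analysis builds on can be sketched with the standard low-accuracy formula set (declination plus hour angle, roughly half-degree accuracy, ignoring the equation of time and refraction). Inputs are local solar time and geographic latitude.

```python
# Approximate solar altitude/azimuth (degrees) from latitude, day, solar hour.
import math

def solar_alt_az(lat_deg, day_of_year, solar_hour):
    lat = math.radians(lat_deg)
    decl = math.radians(23.44) * math.sin(
        2 * math.pi * (day_of_year + 284) / 365.0)   # solar declination
    h = math.radians(15.0 * (solar_hour - 12.0))     # hour angle
    alt = math.asin(math.sin(lat) * math.sin(decl)
                    + math.cos(lat) * math.cos(decl) * math.cos(h))
    az = math.atan2(-math.sin(h) * math.cos(decl),
                    math.sin(decl) * math.cos(lat)
                    - math.cos(decl) * math.sin(lat) * math.cos(h))
    return math.degrees(alt), (math.degrees(az) + 360.0) % 360.0
```

The angle to be kept large is then simply the angle between the unit vector toward the sun and the camera's viewing direction.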
NASA Astrophysics Data System (ADS)
Wang, Xiaoyong; Guo, Chongling; Hu, Yongli; He, Hongyan
2017-11-01
The primary and secondary mirrors of an on-axis three-mirror anastigmatic (TMA) space camera are connected and supported by its front mirror-body structure, which affects both the imaging performance and the stability of the camera. In this paper, a carbon-fiber-reinforced-plastic (CFRP) thin-walled cylinder and a titanium-alloy connecting rod have been used for the front mirror-body opto-mechanical structure of a long-focus, on-axis TMA space camera optical system. The front mirror-body structure has then been optimized by finite element analysis (FEA). Its performance has been verified by mechanical and vacuum experiments, confirming the validity of the engineering design.
Potential for application of an acoustic camera in particle tracking velocimetry.
Wu, Fu-Chun; Shao, Yun-Chuan; Wang, Chi-Kuei; Liou, Jim
2008-11-01
We explored the potential and limitations of applying an acoustic camera as the imaging instrument for particle tracking velocimetry. The strength of the acoustic camera is its usability in low-visibility environments where conventional optical cameras are ineffective, while its applicability is limited by lower temporal and spatial resolutions. We conducted a series of experiments in which acoustic and optical cameras were used to simultaneously image the rotational motion of tracer particles, allowing a comparison of the acoustic- and optical-based velocities. The results reveal that the greater fluctuations associated with the acoustic-based velocities are primarily attributed to the lower temporal resolution. The positive and negative biases induced by the lower spatial resolution are balanced, with the positive ones greater in magnitude but the negative ones greater in number. These biases decrease as the mean particle velocity increases and approach a minimum once the mean velocity exceeds the threshold value that can be sensed by the acoustic camera.
Invisible marker based augmented reality system
NASA Astrophysics Data System (ADS)
Park, Hanhoon; Park, Jong-Il
2005-07-01
Augmented reality (AR) has recently gained significant attention. Previous AR techniques usually need a fiducial marker with known geometry, or objects whose structure can be easily estimated, such as a cube. Placing a marker in the workspace of the user can be intrusive. To overcome this limitation, we present an AR system using invisible markers created/drawn with an infrared (IR) fluorescent pen. Two cameras are used, an IR camera and a visible camera, positioned on either side of a cold mirror so that their optical centers coincide. We track the invisible markers using the IR camera and visualize AR in the view of the visible camera. Additional algorithms are employed so that the system performs reliably against cluttered backgrounds. Experimental results are given to demonstrate the viability of the proposed system. As an application, the invisible marker can act as a Vision-Based Identity and Geometry (VBIG) tag, which can significantly extend the functionality of RFID. The invisible tag is like RFID in that it is not perceivable, but more powerful in that the tag information can be presented to the user by direct projection with a mobile projector or by visualizing AR on the screen of a mobile PDA.
A TV Camera System Which Extracts Feature Points For Non-Contact Eye Movement Detection
NASA Astrophysics Data System (ADS)
Tomono, Akira; Iida, Muneo; Kobayashi, Yukio
1990-04-01
This paper proposes a highly efficient camera system which extracts, irrespective of background, feature points such as the pupil, the corneal reflection image, and dot-marks pasted on a human face, in order to detect human eye movement by image processing. Two eye movement detection methods are suggested: one utilizing face orientation as well as pupil position, the other utilizing the pupil and corneal reflection images. A method of extracting these feature points using LEDs as illumination devices and a new TV camera system designed to record eye movement are proposed. Two kinds of infra-red LEDs are used. These LEDs are set up a short distance apart and emit polarized light of different wavelengths. One light source beams from near the optical axis of the lens and the other is some distance from the optical axis. The LEDs are operated in synchronization with the camera. The camera includes three CCD image pick-up sensors and a prism system with two boundary layers. Incident rays are separated into two wavelengths by the first boundary layer of the prism. One set of rays forms an image on CCD-3. The other set is split by the half-mirror layer of the prism and forms an image including the regularly reflected component (a polarizing filter is placed in front of CCD-1) and another image not including that component (no polarizing filter in front of CCD-2). Thus, three images with different reflection characteristics are obtained from the three CCDs. Experiments show that two kinds of subtraction operations between the three CCD output images accentuate the three kinds of feature points: the pupil, the corneal reflection images, and the dot-marks. Since the S/N ratio of the subtracted image is extremely high, the thresholding process is simple and allows reducing the intensity of the infra-red illumination. A high-speed image-processing apparatus using this camera system is described. Real-time processing of the subtraction, thresholding, and centroid calculation of the feature points is possible.
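The subtraction idea generalizes beyond this specific optical head: with on-axis illumination the pupil appears bright and with off-axis illumination dark, so differencing the two frames isolates it, and the centroid of the thresholded blob gives its position. The sketch below assumes 8-bit grayscale frames and an arbitrary threshold; it is a generic illustration, not the authors' apparatus.

```python
# Pupil localization by image subtraction, thresholding, and centroiding.
import numpy as np

def pupil_centroid(bright_pupil, dark_pupil, threshold=40):
    """Return the (row, col) centroid of the pupil, or None if not found."""
    diff = bright_pupil.astype(np.int16) - dark_pupil.astype(np.int16)
    mask = diff > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```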
Lee, Onseok; Park, Sunup; Kim, Jaeyoung; Oh, Chilhwan
2017-11-01
The visual scoring method has been used as a subjective evaluation of pigmentary skin disorders. The severity of pigmentary skin disease, especially melasma, is evaluated using a visual scoring method, the MASI (Melasma Area and Severity Index). This study differentiates between epidermal and dermal pigmented disease and was undertaken to develop methods to quantitatively measure the severity of pigmentary skin disorders under ultraviolet illumination. The optical imaging system consists of illumination (white LED, UV-A lamp) and image acquisition (DSLR camera, air-cooled CMOS CCD camera). Each camera is equipped with a polarizing filter to remove glare. For analysis of the visible- and UV-light images, the images are divided into the frontal, cheek, and chin regions of melasma patients, and each image then undergoes image processing. To reduce the curvature error in facial contours, a gradient mask is used. The new method of segmenting frontal and lateral facial images is more objective for face-area measurement than the MASI score, and image analysis of darkness and homogeneity is adequate to quantify the conventional MASI score. Under visible light, active lesion margins appear in both epidermal and dermal melanin, whereas under UV light melanin is found in the epidermis. This study objectively analyzes the severity of melasma and develops new methods of image analysis with ultraviolet optical imaging equipment. Based on the results of this study, our optical imaging system could be used as a valuable tool to assess the severity of pigmentary skin disease.
Study on portable optical 3D coordinate measuring system
NASA Astrophysics Data System (ADS)
Ren, Tongqun; Zhu, Jigui; Guo, Yinbiao
2009-05-01
A portable optical 3D coordinate measuring system based on digital close-range photogrammetry (CRP) technology and binocular stereo vision theory is researched. Three infrared LEDs with high stability are mounted on a hand-held target to provide the measurement features and establish the target coordinate system. Field calibration of the intersecting binocular measurement system composed of two cameras is performed by ray intersection using a reference ruler. The hand-held target, controlled via Bluetooth wireless communication, is moved freely to implement contact measurement; the position of its ceramic contact ball is pre-calibrated accurately. The coordinates of the target feature points are obtained with the binocular stereo vision model from the stereo image pairs taken by the cameras. Combining radius compensation for the contact ball with residual error correction, an object point can be resolved by transfer of axes, using the target coordinate system as an intermediary. This system is suitable for on-field large-scale measurement because of its excellent portability, high precision, wide measuring volume, great adaptability, and high degree of automation. Tests show that the measuring precision is close to ±0.1 mm/m.
Optical Characterization of the SPT-3G Camera
NASA Astrophysics Data System (ADS)
Pan, Z.; Ade, P. A. R.; Ahmed, Z.; Anderson, A. J.; Austermann, J. E.; Avva, J. S.; Thakur, R. Basu; Bender, A. N.; Benson, B. A.; Carlstrom, J. E.; Carter, F. W.; Cecil, T.; Chang, C. L.; Cliche, J. F.; Cukierman, A.; Denison, E. V.; de Haan, T.; Ding, J.; Dobbs, M. A.; Dutcher, D.; Everett, W.; Foster, A.; Gannon, R. N.; Gilbert, A.; Groh, J. C.; Halverson, N. W.; Harke-Hosemann, A. H.; Harrington, N. L.; Henning, J. W.; Hilton, G. C.; Holzapfel, W. L.; Huang, N.; Irwin, K. D.; Jeong, O. B.; Jonas, M.; Khaire, T.; Kofman, A. M.; Korman, M.; Kubik, D.; Kuhlmann, S.; Kuo, C. L.; Lee, A. T.; Lowitz, A. E.; Meyer, S. S.; Michalik, D.; Montgomery, J.; Nadolski, A.; Natoli, T.; Nguyen, H.; Noble, G. I.; Novosad, V.; Padin, S.; Pearson, J.; Posada, C. M.; Rahlin, A.; Ruhl, J. E.; Saunders, L. J.; Sayre, J. T.; Shirley, I.; Shirokoff, E.; Smecher, G.; Sobrin, J. A.; Stark, A. A.; Story, K. T.; Suzuki, A.; Tang, Q. Y.; Thompson, K. L.; Tucker, C.; Vale, L. R.; Vanderlinde, K.; Vieira, J. D.; Wang, G.; Whitehorn, N.; Yefremenko, V.; Yoon, K. W.; Young, M. R.
2018-05-01
The third-generation South Pole Telescope camera is designed to measure the cosmic microwave background across three frequency bands (centered at 95, 150 and 220 GHz) with ~16,000 transition-edge sensor (TES) bolometers. Each multichroic array element on a detector wafer has a broadband sinuous antenna that couples power to six TESs, one for each of the three observing bands and both polarizations, via lumped-element filters. Ten detector wafers populate the detector array, which is coupled to the sky via a large-aperture optical system. Here we present the frequency-band characterization with Fourier transform spectroscopy, measurements of optical time constants, beam properties, and optical and polarization efficiencies of the detector array. The detectors have frequency bands consistent with our simulations and high average optical efficiency: 86, 77 and 66% for the 95, 150 and 220 GHz detectors, respectively. The time constants of the detectors are mostly between 0.5 and 5 ms. The beam is round with the correct size, and the polarization efficiency is more than 90% for most of the bolometers.
W-Band Free Space Permittivity Measurement Setup for Candidate Radome Materials
NASA Technical Reports Server (NTRS)
Fralick, Dion T.
1997-01-01
This paper presents a measurement system used for W-band complex permittivity measurements performed in NASA Langley Research Center's Electromagnetics Research Branch. The system was used to characterize candidate radome materials for the passive millimeter wave (PMMW) camera experiment. The PMMW camera is a new-technology sensor intended to enable all-weather landings of civilian and military aircraft. The sensor is being developed under a NASA Technology Reinvestment program with TRW, McDonnell Douglas, Honeywell, and Composite Optics, Inc. as participants. The experiment is scheduled to be flight tested on the Air Force's 'Speckled Trout' aircraft in late 1997. The camera operates at W-band in a radiometric capacity and generates an image of the viewable field. Because the camera is a radiometer, the system is very sensitive to losses. Minimal transmission loss through the radome at the operating frequency, 89 GHz, was critical to the success of the experiment. This paper details the design, setup, calibration and operation of a free-space measurement system developed and used to characterize the candidate radome materials for this program.
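As a rough guide to why the permittivity matters here, the normal-incidence power transmission of a flat dielectric slab follows directly from its complex permittivity. The sketch below uses the standard single-layer (Airy) formula; the sample values are illustrative assumptions, not measured data from the paper.

    import numpy as np

    C = 299792458.0  # speed of light, m/s

    def slab_transmission(eps_r, tan_delta, thickness, freq):
        """Power transmission of a dielectric slab in air at normal incidence."""
        n = np.sqrt(eps_r * (1 - 1j * tan_delta))     # complex refractive index
        beta = 2 * np.pi * n * thickness * freq / C   # one-way phase through slab
        r = (1 - n) / (1 + n)                         # air-to-slab reflection coefficient
        t = (1 - r**2) * np.exp(1j * beta) / (1 - r**2 * np.exp(2j * beta))
        return abs(t)**2

    # e.g. a hypothetical 1 mm wall with eps_r = 3.0, tan_delta = 0.002 at 89 GHz
    print(slab_transmission(3.0, 0.002, 1e-3, 89e9))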
Pi of the Sky observation of GRB160625B
NASA Astrophysics Data System (ADS)
Opiela, Rafał; Batsch, Tadeusz; Castro-Tirado, Alberto Javier; Czyrkowski, Henryk; Ćwiek, Arkadiusz; Ćwiok, Mikołaj; Dąbrowski, Ryszard; Jelinek, Martin; Kasprowicz, Grzegorz; Majcher, Ariel; Małek, Katarzyna; Mankiewicz, Lech; Nawrocki, Krzysztof; Obara, Łukasz; Piotrowski, Lech; Siudek, Małgorzata; Sokołowski, Marcin; Wawrzaszek, Roman; Wrochna, Grzegorz; Zaremba, Marcin; Żarnecki, Aleksander Filip
2017-08-01
Pi of the Sky is a system of wide-field-of-view robotic telescopes that search for short-timescale astrophysical phenomena, especially prompt optical GRB emission. The system was designed for autonomous operation, monitoring a large fraction of the sky to a depth of 12m-13m with a time resolution of the order of 10 seconds. The custom-designed CCD cameras are equipped with Canon lenses (f = 85 mm, f/d = 1.2) and each covers 20° × 20° of the sky. The final system, with 16 cameras on 4 equatorial mounts, was completed in 2014 at the INTA El Arenosillo Test Centre in Spain. GRB160625B was an extremely bright GRB with three distinct emission episodes. The cameras of the Pi of the Sky observatory in Spain were not observing the position of GRB160625B prior to the first emission episode. Observations started only after receiving the Fermi/GBM trigger, about 140 seconds prior to the second emission episode. As the position estimate taken from the Fermi alert and used to point the telescope was not very accurate, the actual position of the burst happened to fall in the overlap region of two cameras, resulting in two independent sets of measurements. Light curves from both cameras were reconstructed using the Luiza framework. No object brighter than 12.4m (3σ limit) was observed prior to the second GRB emission. An optical flash was identified on an image whose exposure started 5.9 s before the time of the Fermi/LAT trigger, brightening to about 8m on the next image and then becoming gradually dimmer, fading below our sensitivity after about 400 s. Emission features measured in different spectral bands indicate that the three emission episodes of GRB160625B were dominated by distinct physical processes. Simultaneous observations at gamma-ray and optical wavelengths support the hypothesis that this was the first observed transition from thermal to non-thermal radiation in a single GRB. The main results of the combined analysis are presented.
Touch And Go Camera System (TAGCAMS) for the OSIRIS-REx Asteroid Sample Return Mission
NASA Astrophysics Data System (ADS)
Bos, B. J.; Ravine, M. A.; Caplinger, M.; Schaffner, J. A.; Ladewig, J. V.; Olds, R. D.; Norman, C. D.; Huish, D.; Hughes, M.; Anderson, S. K.; Lorenz, D. A.; May, A.; Jackman, C. D.; Nelson, D.; Moreau, M.; Kubitschek, D.; Getzandanner, K.; Gordon, K. E.; Eberhardt, A.; Lauretta, D. S.
2018-02-01
NASA's OSIRIS-REx asteroid sample return mission spacecraft includes the Touch And Go Camera System (TAGCAMS) three camera-head instrument. The purpose of TAGCAMS is to provide imagery during the mission to facilitate navigation to the target asteroid, confirm acquisition of the asteroid sample, and document asteroid sample stowage. The cameras were designed and constructed by Malin Space Science Systems (MSSS) based on requirements developed by Lockheed Martin and NASA. All three of the cameras are mounted to the spacecraft nadir deck and provide images in the visible part of the spectrum, 400-700 nm. Two of the TAGCAMS cameras, NavCam 1 and NavCam 2, serve as fully redundant navigation cameras to support optical navigation and natural feature tracking. Their boresights are aligned in the nadir direction with small angular offsets for operational convenience. The third TAGCAMS camera, StowCam, provides imagery to assist with and confirm proper stowage of the asteroid sample. Its boresight is pointed at the OSIRIS-REx sample return capsule located on the spacecraft deck. All three cameras have at their heart a 2592 × 1944 pixel complementary metal oxide semiconductor (CMOS) detector array that provides up to 12-bit pixel depth. All cameras also share the same lens design and a camera field of view of roughly 44° × 32° with a pixel scale of 0.28 mrad/pixel. The StowCam lens is focused to image features on the spacecraft deck, while both NavCam lens focus positions are optimized for imaging at infinity. A brief description of the TAGCAMS instrument and how it is used to support critical OSIRIS-REx operations is provided.
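A quick back-of-envelope check ties the quoted numbers together: the stated pixel scale times the array dimensions should roughly reproduce the stated field of view. This small-angle estimate (not the mission's actual calibration, which accounts for distortion) gives about 42° × 31°, in line with the quoted "roughly 44° × 32°" once distortion is included.

    import math

    SCALE = 0.28e-3  # rad per pixel, from the abstract
    for pixels in (2592, 1944):
        fov = math.degrees(pixels * SCALE)   # small-angle approximation
        print(f"{pixels} px -> {fov:.1f} deg")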
Comparison of parameters of modern cooled and uncooled thermal cameras
NASA Astrophysics Data System (ADS)
Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał
2017-10-01
During the design of a system employing thermal cameras, one always faces the problem of choosing the camera type best suited for the task. In many cases the choice is far from optimal, for several reasons. System designers often favor the tried and tested solutions they are used to; they do not follow the latest developments in infrared technology, and their choices are sometimes based on prejudice rather than facts. The paper presents the results of measurements of the basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive for the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards; however, the camera settings were not optimized for specific test conditions or parameter measurements. Instead, the settings used in normal camera operation were applied, to obtain realistic performance figures. For example, there were significant differences between the measured noise parameters and the catalogue data provided by manufacturers, due to the application of edge-detection filters used to increase detection and recognition ranges. The purpose of this paper is to help in choosing the optimal thermal camera for a particular application, answering the question of whether to opt for a cheaper microbolometer device or a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing thermal camera systems with cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and the best performing devices in each tested group were selected.
NASA Astrophysics Data System (ADS)
Blain, Pascal; Michel, Fabrice; Piron, Pierre; Renotte, Yvon; Habraken, Serge
2013-08-01
Noncontact optical measurement methods are essential tools in many industrial and research domains. A family of new noncontact optical measurement methods has been developed, based on the polarization-state splitting technique and monochromatic light projection as a way to overcome ambient lighting for in-situ measurement. Recent work on a birefringent element, the Savart plate, allows one to build a more flexible and robust interferometer. This interferometer is a multipurpose metrological device. On the one hand, the interferometer can be set in front of a charge-coupled device (CCD) camera; this optical measurement system is called a shearography interferometer and allows one to measure micro-displacements between two states of the studied object under coherent lighting. On the other hand, by producing and shifting multiple sinusoidal Young's interference patterns with this interferometer and using a CCD camera, it is possible to build a three-dimensional structured-light profilometer.
NASA Astrophysics Data System (ADS)
Holland, S. Douglas
1992-09-01
A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high-performance design that produces near-film-quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge-coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample-and-hold circuit before it is converted to a digital signal. The analog-to-digital converter has eight-bit resolution to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs: one is a general-purpose industrial standard port and the other a high-speed, high-performance application-specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
NASA Technical Reports Server (NTRS)
Holland, S. Douglas (Inventor)
1992-01-01
A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high-performance design that produces near-film-quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge-coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample-and-hold circuit before it is converted to a digital signal. The analog-to-digital converter has eight-bit resolution to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs: one is a general-purpose industrial standard port and the other a high-speed, high-performance application-specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.
Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno
2008-11-17
The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that: 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera; and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
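A minimal sketch of the kind of vignetting correction applied in such studies: fit a low-order radial gain model to a flat reference frame from the same camera and lens, then divide it out of every image. The function shape and polynomial order here are assumptions for illustration; the study's actual correction may differ.

    import numpy as np

    def devignette(image, flat):
        """Correct an image using a flat-field frame (same camera and lens)."""
        h, w = flat.shape
        y, x = np.mgrid[0:h, 0:w]
        r2 = (x - w / 2) ** 2 + (y - h / 2) ** 2
        r2 = r2 / r2.max()                  # normalized squared radius
        # Fit gain(r) = a0 + a1*r^2 + a2*r^4 to the normalized flat frame.
        A = np.stack([np.ones(h * w), r2.ravel(), (r2 ** 2).ravel()], axis=1)
        coeffs, *_ = np.linalg.lstsq(A, (flat / flat.mean()).ravel(), rcond=None)
        gain = (A @ coeffs).reshape(h, w)
        return image / gain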
Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena
NASA Astrophysics Data System (ADS)
Pei Wong, Choun; Subramaniam, R.
2018-05-01
The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.
Seeing in a Different Light--Using an Infrared Camera to Teach Heat Transfer and Optical Phenomena
ERIC Educational Resources Information Center
Wong, Choun Pei; Subramaniam, R.
2018-01-01
The infrared camera is a useful tool in physics education to 'see' in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.
Obituary: James Gilbert Baker, 1914-2005
NASA Astrophysics Data System (ADS)
Baker, Neal Kenton
2005-12-01
Dr. James Gilbert Baker, renowned astronomer and optical physicist, died 29 June 2005 at his home in Bedford, New Hampshire at the age of 90. Although his scientific interest was astronomy, his extraordinary ability in optical design led to the creation of hundreds of optical systems that supported astronomy, aerial reconnaissance, instant photography (Polaroid SX-70 camera), and the US space programs. He was the recipient of numerous awards for his creative work. He was born in Louisville, Kentucky, on 11 November 1914, the fourth child of Jesse B. Baker and Hattie M. Stallard. After graduating from Louisville DuPont Manual High, he went on to attend the University of Louisville, majoring in mathematics. He became very close to an astronomy professor, Dr. Moore, and many times used his telescopes to make nightly observations. While at the university, he built mirrors for his own telescopes and helped form the Louisville Astronomical Society in 1933. At the University of Louisville he also met his future wife, Elizabeth Katherine Breitenstein of Jefferson County, Kentucky. He received his BA in 1935, at the height of the Depression. He began his graduate work in astronomy at the Harvard College Observatory. After his MA (1936), he was appointed a Junior Fellow (1937-1943) in the prestigious Harvard Society of Fellows. He received his PhD in 1942 from Harvard in a rather unusual fashion, which is worth retelling. During an Astronomy Department dinner, Dr. Harlow Shapley (the director) asked him to give a talk. According to the "Courier-Journal Magazine", "Dr. Shapley stood up and proclaimed an on-the-spot departmental meeting and asked for a vote on recommending Baker for a Ph.D. on the basis of the 'oral exam' he had just finished. The vote was unanimous." It was at Harvard College Observatory, during this first stage of his career, that he collaborated with Donald H. Menzel, Lawrence H. Aller, and George H. Shortley on a landmark set of papers on the physical processes in gaseous nebulae. In addition to his theoretical work, he also began designing astronomical instruments of ever greater resolving power and wide-angle acceptance, which he described as "the royal way to new discoveries" [1]. He is well known for the Baker-Schmidt telescope and the Baker Super-Schmidt meteor camera. He was also a co-author, with George Z. Dimitroff, of a book entitled "Telescopes and Accessories" (1945). In 1948 he received an Honorary Doctorate from the University of Louisville. With the start of World War II, the U.S. Army sought to establish an aerial reconnaissance branch and placed Colonel George W. Goddard in charge of the project. After months of searching for an optical designer, Goddard asked for a recommendation from Dr. Mees [2] of Eastman Kodak. Following Dr. Mees's recommendation, Col. Goddard found this friendly and unassuming twenty-six-year-old Harvard graduate student to be the perfect candidate. He was impressed by Dr. Baker's originality in optical design and provided him with a small Army research contract in early 1941 for a wide-angle camera system. Goddard's "Victory Lens" project began on 20 May 1942, when he visited Dr. Baker's office at Harvard College Observatory and described the need for a lens of f/2.5 covering a 5x5 plate, to be made in huge quantities. Multiple designs were developed during the war effort. A hands-on man, Dr. Baker risked his life operating the cameras in many of the early test flights that carried the camera systems in unpressurized compartments on aircraft.
He was the director of the Observatory Optical Project at Harvard University from 1943 to 1945, and began his long consulting career with the Perkin-Elmer Corporation during this period. When the war ended, Harvard University decided to cease war-related projects; Dr. Baker's lab was subsequently moved to Boston University and was eventually spun off as the ITEK Corporation. He nevertheless continued as an associate professor and research associate at Harvard from 1946 to 1949. In 1948 he received the Presidential Medal for Merit for his work during World War II in the Office of Scientific Research and Development. In 1948 he moved from Cambridge, Massachusetts to Orinda, California and became a research associate of Lick Observatory for two years. He returned to Harvard in 1950. He had spent thousands of hours doing ray-trace calculations on a Marchant calculator to produce his first aerial cameras. To replace the tedious calculations by hand, Dr. Baker introduced the use of numerical computers into the field of optics. His ray-trace program was one of the first applications run on the Harvard Mark II (1947) computer. Later on, he developed his own methodology to optimize the performance of his optical designs. These optical design computer programs were a family affair, developed under his direction by his own children to support his highly sophisticated designs of the 1960s and 1970s. For most of his career, Dr. Baker was involved with large system concepts covering not only the camera but the camera delivery systems as well. As chairman of the U.S. Air Force Scientific Advisory Board, he recognized that national security requirements would demand optical designs of even greater resolving power, used from aircraft at extreme altitudes. The need for such a plane resulted in the creation of the U-2 system, consisting of a plane and camera functioning as a unit to create panoramic high-resolution aerial photographs. He formed Spica Incorporated in 1955 to perform the necessary optical design work for the US Government. The final design was a 36-inch f/10 system. Dr. Baker also designed the aircraft's periscope to allow the pilot to see his flight path. By 1958, he was almost solely responsible for all the cameras used in photoreconnaissance aircraft. He continued to serve on the President's Foreign Intelligence Advisory Board and on the Land Panel. Before the launch of Sputnik, he designed the Baker-Nunn satellite-tracking camera to support the Air Force's early satellite tracking and space surveillance networks. Because of his foresight, cameras were in place to track the Sputnik satellite in October 1957. These cameras allowed the precise orbital determination of all orbiting spacecraft for over three decades, until the tracking cameras were retired from service. He continued to advise top Government officials on the evolution of reconnaissance systems during the 1960s and 1970s. He received a Space Pioneer Award from the US Air Force. He received the Pioneers of National Reconnaissance Medal (2000) with the citation, "As a young Harvard astronomer, Dr. James G. Baker designed most of the lenses and many of the cameras used in aerial over flights of 'denied territory' enabling the success of the U.S. peacetime strategic reconnaissance policy." Around 1968, he undertook a consulting contract with the Polaroid Corporation after Dr. Edwin Land persuaded him that only he could design the optical system for his new SX-70 Land camera.
He was also responsible for the design of the Quintic focusing system for the Polaroid Spectra camera system, which employed a revolutionary combination of non-rotational aspherics to achieve the focusing function. In 1958 he became a Fellow of the Optical Society of America (OSA). In 1960 he was elected President of the Society for one year and helped establish the journal Applied Optics. He was the recipient of numerous OSA awards spanning the breadth of the field, and was honored with the Adolph Lomb Medal, the Ives Medal, the Fraunhofer Award, and the Richardson Medal. He was made an honorary member of OSA in 1993. He was also the recipient of the 1978 Gold Medal, the highest award of the International Society for Optical Engineering (SPIE). Furthermore, he was the recipient of the Elliott Cresson Medal of the Franklin Institute for his many innovations in astronomical tools. Dr. Baker was elected a member of the National Academy of Sciences (1965), the American Philosophical Society (1970), the American Academy of Arts and Sciences (1946), and the National Academy of Engineering (1979). He was a member of the American Astronomical Society, the International Astronomical Union, and the Astronomical Society of the Pacific. He authored numerous professional papers and held over fifty US patents. He maintained his affiliation with the Harvard College Observatory and the Smithsonian Astrophysical Observatory until he retired in 2003. Even after his retirement, he continued work at his home on a new telescope design that he told his family he should have discovered in 1940. Light was always his tool for understanding the Universe. An entry from his personal observation log, 7 January 1933, made after an evening of star gazing, reveals the pure inspiration of his efforts: "After all, it is the satisfaction obtained which benefits humanity, more than any other thing. It is in the satisfaction of greater human knowledge about the cosmos that the scientist is spurred on to greater efforts." James Baker fulfilled the destiny he had foreseen in 1933, living to see professional and amateur astronomers use his instruments and designs to further the understanding of the cosmos. What he had not predicted was that his cameras would also protect this nation for so many years. He is survived by his wife, his four children and five grandchildren.
[1] Oscar Bryant, "Astronomical Designs," in "Accent", the University of Louisville College of Arts and Sciences Alumni Newsletter, Spring 1994.
[2] George W. Goddard, Brigadier General, "Overview", 273.
Fukuda, Shinichi; Beheregaray, Simone; Hoshi, Sujin; Yamanari, Masahiro; Lim, Yiheng; Hiraoka, Takahiro; Yasuno, Yoshiaki; Oshika, Tetsuro
2013-12-01
To evaluate the ability of parameters measured by three-dimensional (3D) corneal and anterior segment optical coherence tomography (CAS-OCT) and a rotating Scheimpflug camera combined with a Placido topography system (Scheimpflug camera with topography) to discriminate between normal eyes and forme fruste keratoconus. Forty-eight eyes of 48 patients with keratoconus, 25 eyes of 25 patients with forme fruste keratoconus and 128 eyes of 128 normal subjects were evaluated. Anterior and posterior keratometric parameters (steep K, flat K, average K), elevation, topographic parameters, regular and irregular astigmatism (spherical, asymmetry, regular and higher-order astigmatism) and five pachymetric parameters (minimum, minimum-median, inferior-superior, inferotemporal-superonasal, vertical thinnest location of the cornea) were measured using 3D CAS-OCT and a Scheimpflug camera with topography. The area under the receiver operating curve (AUROC) was calculated to assess the discrimination ability. Compatibility and repeatability of both devices were evaluated. Posterior surface elevation showed higher AUROC values in discrimination analysis of forme fruste keratoconus using both devices. Both instruments showed significant linear correlations (p<0.05, Pearson's correlation coefficient) and good repeatability (ICCs: 0.885-0.999) for normal and forme fruste keratoconus. Posterior elevation was the best discrimination parameter for forme fruste keratoconus. Both instruments presented good correlation and repeatability for this condition.
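The discrimination metric used throughout the study is the area under the receiver operating curve (AUROC). The sketch below shows how such a value is computed from a single parameter; the posterior-elevation values are synthetic stand-ins, not the study's measurements.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    normal = rng.normal(5.0, 2.0, 128)   # invented values for 128 normal eyes
    ffkc = rng.normal(12.0, 4.0, 25)     # invented values for 25 forme fruste eyes
    y_true = np.r_[np.zeros(normal.size), np.ones(ffkc.size)]
    y_score = np.r_[normal, ffkc]
    print("AUROC:", roc_auc_score(y_true, y_score))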
Condenser for illuminating a ringfield camera with synchrotron emission light
Sweatt, W.C.
1996-04-30
The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into the ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of the camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, that images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser comprises a plurality of beams, with four mirrors corresponding to each beam plus two common mirrors. 9 figs.
Condenser for illuminating a ringfield camera with synchrotron emission light
Sweatt, William C.
1996-01-01
The present invention relates generally to the field of condensers for collecting light from a synchrotron radiation source and directing the light into the ringfield of a lithography camera. The present invention discloses a condenser comprising collecting, processing, and imaging optics. The collecting optics are comprised of concave and convex spherical mirrors that collect the light beams. The processing optics, which receive the light beams, are comprised of flat mirrors that converge and direct the light beams into a real entrance pupil of the camera in a symmetrical pattern. In the real entrance pupil are located flat mirrors, common to the beams emitted from the preceding mirrors, for generating substantially parallel light beams and for directing the beams toward the ringfield of the camera. Finally, the imaging optics are comprised of a spherical mirror, also common to the beams emitted from the preceding mirrors, that images the real entrance pupil through the resistive mask and into the virtual entrance pupil of the camera. Thus, the condenser comprises a plurality of beams, with four mirrors corresponding to each beam plus two common mirrors.
Human Age Estimation Method Robust to Camera Sensor and/or Face Movement
Nguyen, Dat Tien; Cho, So Ra; Pham, Tuyen Danh; Park, Kang Ryoung
2015-01-01
Human age can be employed in many useful real-life applications, such as customer service systems, automatic vending machines, entertainment, etc. To obtain age information, image-based age estimation systems have been developed that use information from the human face. However, current age estimation systems face limitations because of various factors: camera motion and optical blurring, facial expressions, gender, etc. Motion blur can be present in face images owing to movement of the camera sensor and/or movement of the face during image acquisition. Facial features in captured images are therefore transformed according to the amount of motion, which degrades the performance of age estimation systems. In this paper, the problem caused by motion blurring is addressed and a solution is proposed to make age estimation systems robust to the effects of motion blurring. Experimental results show that our method is more efficient at enhancing age estimation performance compared with systems that do not employ it. PMID:26334282
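The degradation being addressed can be reproduced for testing by convolving a face image with a linear motion-blur kernel, as in this sketch. The kernel length and angle are illustrative; the paper's blur model may be more elaborate.

    import numpy as np
    import cv2

    def motion_blur(image, length=15, angle_deg=0.0):
        """Apply a linear motion-blur kernel of the given length and angle."""
        k = np.zeros((length, length), np.float32)
        k[length // 2, :] = 1.0                       # horizontal line kernel
        center = ((length - 1) / 2.0, (length - 1) / 2.0)
        rot = cv2.getRotationMatrix2D(center, angle_deg, 1.0)
        k = cv2.warpAffine(k, rot, (length, length))  # rotate to desired angle
        k /= k.sum()                                  # normalize to unit gain
        return cv2.filter2D(image, -1, k)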
Optical design and stray light analysis for the JANUS camera of the JUICE space mission
NASA Astrophysics Data System (ADS)
Greggio, D.; Magrin, D.; Munari, M.; Zusi, M.; Ragazzoni, R.; Cremonese, G.; Debei, S.; Friso, E.; Della Corte, V.; Palumbo, P.; Hoffmann, H.; Jaumann, R.; Michaelis, H.; Schmitz, N.; Schipani, P.; Lara, L. M.
2015-09-01
The JUICE (JUpiter ICy moons Explorer) satellite of the European Space Agency (ESA) is dedicated to the detailed study of Jupiter and its moons. Among the whole instrument suite, JANUS (Jovis, Amorum ac Natorum Undique Scrutator) is the camera system of JUICE designed for imaging at visible wavelengths. It will conduct an in-depth study of Ganymede, Callisto and Europa, and explore most of the Jovian system and Jupiter itself, performing, in the case of Ganymede, a global mapping of the satellite at a resolution of 400 m/px. The optical design chosen to meet the scientific goals of JANUS is a three-mirror anastigmat in an off-axis configuration. To ensure that the achieved contrast is high enough to observe features on the surfaces of the satellites, we also performed a preliminary stray light analysis of the telescope. We provide here a short description of the optical design and present the procedure adopted to evaluate the stray light expected during the mapping phase of the surface of Ganymede. We also use the results of the first run of simulations to optimize the baffle design.
Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles
Cortés, Camilo; Unzueta, Luis; de los Reyes-Guzmán, Ana; Ruiz, Oscar E.; Flórez, Julián
2016-01-01
In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect the limb posture, as their kinematic models differ. To address these limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP on the rehabilitation exoskeleton. The GH joint angles are then estimated by combining the estimated marker poses with the exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch between the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method's accuracy to marker position estimation errors, due to system calibration errors and marker drift, has been carried out. The results show that, even with significant errors in the marker position estimation, the method's accuracy is adequate for RAR. PMID:27403044
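The core of the proposed hybrid estimation is a chain of rigid-body transforms: the marker pose measured by the exoskeleton-mounted cameras is composed with the exoskeleton Forward Kinematics so the limb pose is expressed in one common frame. A minimal sketch with 4x4 homogeneous matrices; all frame names and the identity placeholder values are hypothetical.

    import numpy as np

    T_world_cam = np.eye(4)     # camera pose from exoskeleton FK (placeholder)
    T_cam_marker = np.eye(4)    # marker pose from the optical MOCAP (placeholder)
    T_marker_limb = np.eye(4)   # fixed marker-to-limb offset from calibration

    # Limb segment pose in the world frame:
    T_world_limb = T_world_cam @ T_cam_marker @ T_marker_limb
    position = T_world_limb[:3, 3]
    rotation = T_world_limb[:3, :3]   # input to GH joint-angle extraction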
Visible-regime polarimetric imager: a fully polarimetric, real-time imaging system.
Barter, James D; Thompson, Harold R; Richardson, Christine L
2003-03-20
A fully polarimetric optical camera system has been constructed to obtain polarimetric information simultaneously from four synchronized charge-coupled device imagers at video frame rates of 60 Hz and a resolution of 640 x 480 pixels. The imagers view the same scene along the same optical axis by means of a four-way beam-splitting prism similar to ones used for multiple-imager, common-aperture color TV cameras. Appropriate polarizing filters in front of each imager provide the polarimetric information. Mueller matrix analysis of the polarimetric response of the prism, analyzing filters, and imagers is applied to the detected intensities in each imager as a function of the applied state of polarization over a wide range of linear and circular polarization combinations to obtain an average polarimetric calibration consistent to approximately 2%. Higher accuracies can be obtained by improvement of the polarimetric modeling of the splitting prism and by implementation of a pixel-by-pixel calibration.
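The data-reduction step behind such a calibration can be written compactly: each imager's intensity is a linear function of the incident Stokes vector, I = A S, so S is recovered by inverting the calibrated 4x4 analyzer matrix. The matrix below is the ideal one for 0/45/90-degree linear plus right-circular analyzers; in the real instrument, A would come from the Mueller-matrix calibration of prism, filters, and imagers described above.

    import numpy as np

    A = 0.5 * np.array([
        [1,  1,  0,  0],   # 0-deg linear analyzer
        [1,  0,  1,  0],   # 45-deg linear analyzer
        [1, -1,  0,  0],   # 90-deg linear analyzer
        [1,  0,  0,  1],   # right-circular analyzer
    ])
    A_INV = np.linalg.inv(A)

    def stokes(intensities):
        """Stokes vector (S0, S1, S2, S3) from the four registered imagers."""
        return A_INV @ np.asarray(intensities, dtype=float)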
Eccentricity error identification and compensation for high-accuracy 3D optical measurement
He, Dong; Liu, Xiaoli; Peng, Xiang; Ding, Yabin; Gao, Bruce Z
2016-01-01
The circular target has been widely used in various three-dimensional optical measurements, such as camera calibration, photogrammetry and structured light projection measurement systems. The identification and compensation of the systematic eccentricity error of the circular target caused by perspective projection is an important issue for ensuring accurate measurement. This paper introduces a novel approach for identifying and correcting the eccentricity error with the help of a concentric circles target. Compared with previous eccentricity error correction methods, our approach does not require knowledge of the geometric parameters of the measurement system regarding target and camera. The proposed approach is therefore very flexible in practical applications; in particular, it is also applicable when only one image with a single target is available. Experimental results are presented to demonstrate the efficiency and stability of the proposed approach for eccentricity error compensation. PMID:26900265
Eccentricity error identification and compensation for high-accuracy 3D optical measurement.
He, Dong; Liu, Xiaoli; Peng, Xiang; Ding, Yabin; Gao, Bruce Z
2013-07-01
The circular target has been widely used in various three-dimensional optical measurements, such as camera calibration, photogrammetry and structured light projection measurement systems. The identification and compensation of the systematic eccentricity error of the circular target caused by perspective projection is an important issue for ensuring accurate measurement. This paper introduces a novel approach for identifying and correcting the eccentricity error with the help of a concentric circles target. Compared with previous eccentricity error correction methods, our approach does not require knowledge of the geometric parameters of the measurement system regarding target and camera. The proposed approach is therefore very flexible in practical applications; in particular, it is also applicable when only one image with a single target is available. Experimental results are presented to demonstrate the efficiency and stability of the proposed approach for eccentricity error compensation.
Multithreaded hybrid feature tracking for markerless augmented reality.
Lee, Taehee; Höllerer, Tobias
2009-01-01
We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction.
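The frame-to-frame stage described above maps naturally onto pyramidal Lucas-Kanade optical flow as implemented in OpenCV; a minimal sketch follows, with illustrative parameter values rather than the paper's.

    import cv2

    def track(prev_gray, next_gray, prev_pts):
        """Propagate feature points from one grayscale frame to the next."""
        next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, next_gray, prev_pts, None,
            winSize=(21, 21), maxLevel=3)
        good = status.ravel() == 1       # keep only successfully tracked points
        return prev_pts[good], next_pts[good]

    # Seed points would come from a detector, e.g.:
    # prev_pts = cv2.goodFeaturesToTrack(prev_gray, 500, 0.01, 7)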
MagAO: Status and on-sky performance of the Magellan adaptive optics system
NASA Astrophysics Data System (ADS)
Morzinski, Katie M.; Close, Laird M.; Males, Jared R.; Kopon, Derek; Hinz, Phil M.; Esposito, Simone; Riccardi, Armando; Puglisi, Alfio; Pinna, Enrico; Briguglio, Runa; Xompero, Marco; Quirós-Pacheco, Fernando; Bailey, Vanessa; Follette, Katherine B.; Rodigas, T. J.; Wu, Ya-Lin; Arcidiacono, Carmelo; Argomedo, Javier; Busoni, Lorenzo; Hare, Tyson; Uomoto, Alan; Weinberger, Alycia
2014-07-01
MagAO is the new adaptive optics system with visible-light and infrared science cameras, located on the 6.5-m Magellan "Clay" telescope at Las Campanas Observatory, Chile. The instrument locks on natural guide stars (NGS) from 0th to 16th R-band magnitude, measures turbulence with a modulating pyramid wavefront sensor binnable from 28×28 to 7×7 subapertures, and uses a 585-actuator adaptive secondary mirror (ASM) to provide flat wavefronts to the two science cameras. MagAO is a mutated clone of the similar AO systems at the Large Binocular Telescope (LBT) at Mt. Graham, Arizona. The high-level AO loop controls up to 378 modes and operates at frame rates up to 1000 Hz. The instrument has two science cameras: VisAO, operating from 0.5 to 1 μm, and Clio2, operating from 1 to 5 μm. MagAO was installed in 2012 and successfully completed two commissioning runs in 2012-2013. In April 2014 we had our first science run that was open to the general Magellan community. Observers from Arizona, Carnegie, Australia, Harvard, MIT, Michigan, and Chile took observations in collaboration with the MagAO instrument team. Here we describe the MagAO instrument, describe our on-sky performance, and report our status as of summer 2014.
Multicolor pyrometer for materials processing in space
NASA Technical Reports Server (NTRS)
Frish, M. B.; Frank, J.; Baker, J. E.; Foutter, R. R.; Beerman, H.; Allen, M. G.
1990-01-01
This report documents the work performed by Physical Sciences Inc. (PSI), under contract to NASA JPL, during a 2.5-year SBIR Phase 2 Program. The program goals were to design, construct, and program a prototype passive imaging pyrometer capable of measuring, as accurately as possible, and controlling the temperature distribution across the surface of a moving object suspended in space. These goals were achieved and the instrument was delivered to JPL in November 1989. The pyrometer utilizes an optical system which operates at short wavelengths compared to the peak of the black-body spectrum for the temperature range of interest, thus minimizing errors associated with a lack of knowledge about the heated sample's emissivity. To cover temperatures from 900 to 2500 K, six wavelengths are available. The preferred wavelength for measurement of a particular temperature decreases as the temperature increases. Images at all six wavelengths are projected onto a single CCD camera concurrently. The camera and optical system have been calibrated to relate the measured intensity at each pixel to the temperature of the heated object. The output of the camera is digitized by a frame grabber installed in a personal computer and analyzed automatically to yield temperature information. The data can be used in a feedback loop to alter the status of computer-activated switches and thereby control a heating system.
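The short-wavelength design also enables the classic two-color trick: under the Wien approximation, the ratio of intensities at two wavelengths yields temperature, with a gray-body emissivity cancelling out. A minimal sketch of that relation (not the delivered instrument's full six-wavelength calibration):

    import math

    C2 = 1.4388e-2  # second radiation constant, m*K

    def ratio_temperature(i1, i2, lam1, lam2):
        """Two-color temperature, assuming equal emissivity at lam1 and lam2."""
        lhs = math.log(i1 / i2) - 5.0 * math.log(lam2 / lam1)
        return C2 * (1.0 / lam2 - 1.0 / lam1) / lhs

    # e.g. hypothetical intensities at 0.6 and 0.8 microns
    print(ratio_temperature(0.21, 1.0, 0.6e-6, 0.8e-6))  # ~2000 K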
Adaptive optics system for the IRSOL solar observatory
NASA Astrophysics Data System (ADS)
Ramelli, Renzo; Bucher, Roberto; Rossini, Leopoldo; Bianda, Michele; Balemi, Silvano
2010-07-01
We present a low-cost adaptive optics system developed for the solar observatory at Istituto Ricerche Solari Locarno (IRSOL), Switzerland. The Shack-Hartmann wavefront sensor is based on a Dalsa CCD camera with 256 × 256 pixels working at 1 kHz. The wavefront compensation is performed by a deformable mirror with 37 actuators and a tip-tilt mirror. Real-time control software has been developed on an RTAI-Linux PC, and Scicos/Scilab-based software was developed for online analysis of the system behavior. The software is completely open source.
Coherent infrared imaging camera (CIRIC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.
1995-07-01
New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP), coupled with Monolithic Microwave Integrated Circuit (MMIC) technology, are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras, which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.
Design and evaluation of a filter spectrometer concept for facsimile cameras
NASA Technical Reports Server (NTRS)
Kelly, W. L., IV; Jobson, D. J.; Rowland, C. W.
1974-01-01
The facsimile camera is an optical-mechanical scanning device which was selected as the imaging system for the Viking '75 lander missions to Mars. A concept which uses an interference filter-photosensor array to integrate a spectrometric capability with the basic imagery function of this camera was proposed for possible application to future missions. This paper is concerned with the design and evaluation of critical electronic circuits and components that are required to implement this concept. The feasibility of obtaining spectroradiometric data is demonstrated, and the performance of a laboratory model is described in terms of spectral range, angular and spectral resolution, and noise-equivalent radiance.
Precision machining of optical surfaces with subaperture correction technologies MRF and IBF
NASA Astrophysics Data System (ADS)
Schmelzer, Olaf; Feldkamp, Roman
2015-10-01
Precision optical elements are used in a wide range of technical instruments. Many optical systems, e.g. semiconductor inspection modules, laser heads for laser material processing or high-end movie cameras, contain precision optics, including aspherical or even freeform surfaces. Critical parameters for such systems are wavefront error, image field curvature and scattered light; accordingly, the lens parameters are also critical with respect to power and RMSi of the surface form error, as well as micro-roughness. How can these requirements be met? The emphasis of this discussion is on the application of subaperture correction technologies in the fabrication of high-end aspheres and freeforms. The presentation focuses on the technology chain necessary for the production of high-precision aspherical optical components, and on the characterization of the applied subaperture finishing tools MRF (magneto-rheological finishing) and IBF (ion beam figuring). These technologies open up the possibility of improving the performance of optical systems.
Multiple-target tracking implementation in the ebCMOS camera system: the LUSIPHER prototype
NASA Astrophysics Data System (ADS)
Doan, Quang Tuyen; Barbier, Remi; Dominjon, Agnes; Cajgfinger, Thomas; Guerin, Cyrille
2012-06-01
The domain of low-light imaging systems is progressing very fast, thanks to the evolution of detection and electron-multiplication technologies such as the emCCD (electron-multiplying CCD) and the ebCMOS (electron-bombarded CMOS). We present an ebCMOS camera system that is able to track, every 2 ms, more than 2000 targets with a mean number of photons per target lower than two. The point light sources (targets) are spots generated by a microlens array (Shack-Hartmann) of the kind used in adaptive optics. The Multiple-Target Tracking algorithm designed and implemented on a rugged workstation is described. The identification and tracking results and the performance of the system are presented and discussed.
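One way to realize the per-frame data association at the heart of a multiple-target tracker is a global minimum-distance assignment between predicted target positions and newly detected spot centroids, as sketched below. This is a generic formulation, not necessarily the exact algorithm implemented on the LUSIPHER workstation.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def associate(predicted, detected, gate=5.0):
        """Match (N,2) predicted positions to (M,2) detections; return index pairs."""
        cost = np.linalg.norm(predicted[:, None, :] - detected[None, :, :], axis=2)
        rows, cols = linear_sum_assignment(cost)   # Hungarian algorithm
        keep = cost[rows, cols] < gate             # reject matches beyond the gate (px)
        return rows[keep], cols[keep]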
Adaptive Optics for the Human Eye
NASA Astrophysics Data System (ADS)
Williams, D. R.
2000-05-01
Adaptive optics can extend not only the resolution of ground-based telescopes, but also that of the human eye. Both static and dynamic aberrations in the cornea and lens of the normal eye limit its optical quality. Though it is possible to correct defocus and astigmatism with spectacle lenses, higher-order aberrations remain. These aberrations blur vision and prevent us from seeing at the fundamental limits set by the retina and brain. They also limit the resolution of cameras used to image the living retina, cameras that are critical for the diagnosis and treatment of retinal disease. I will describe an adaptive optics system that measures the wave aberration of the eye in real time and compensates for it with a deformable mirror, endowing the human eye with unprecedented optical quality. This instrument provides fresh insight into the ultimate limits on human visual acuity, reveals for the first time images of the retinal cone mosaic responsible for color vision, and points the way to contact lenses and laser surgical methods that could enhance vision beyond what is currently possible. Supported by the NSF Science and Technology Center for Adaptive Optics, the National Eye Institute, and Bausch and Lomb, Inc.
X-ray and optical stereo-based 3D sensor fusion system for image-guided neurosurgery.
Kim, Duk Nyeon; Chae, You Seong; Kim, Min Young
2016-04-01
In neurosurgery, an image-guided operation is performed to confirm that the surgical instruments reach the exact lesion position. Among the multiple imaging modalities, an X-ray fluoroscope mounted on a C- or O-arm is widely used for monitoring the position of surgical instruments and the target position in the patient. However, frequent fluoroscopy can result in relatively high radiation doses, particularly for complex interventional procedures. The proposed system can reduce radiation exposure and provide accurate three-dimensional (3D) position information for the surgical instruments and the target position. X-ray and optical stereo vision systems have been proposed for the C- or O-arm. The two subsystems share the same optical axis and are calibrated simultaneously. This allows easy augmentation of the camera image and the X-ray image. Furthermore, the 3D measurements of both systems can be defined in a common coordinate space. The proposed dual stereoscopic imaging system is designed and implemented for mounting on an O-arm. The calibration error of the 3D coordinates of the optical stereo and X-ray stereo is within 0.1 mm in terms of the mean and the standard deviation. Furthermore, image augmentation with the camera image and the X-ray image using an artificial skull phantom was achieved. As the developed dual stereoscopic imaging system provides 3D coordinates of the point of interest in both optical images and fluoroscopic images, it can be used by surgeons to confirm the position of surgical instruments in 3D space with minimum radiation exposure and to verify whether the instruments have reached the surgical target observed in fluoroscopic images.
NASA Astrophysics Data System (ADS)
Farries, Mark; Ward, Jon; Valle, Stefano; Stephens, Gary; Moselund, Peter; van der Zanden, Koen; Napier, Bruce
2015-06-01
Mid-IR imaging spectroscopy has the potential to offer an effective tool for early cancer diagnosis. Current developments in bright supercontinuum sources, narrow-band acousto-optic tunable filters and fast cameras have made feasible a system that can be used for fast diagnosis of cancer in vivo at the point of care. The performance of a prototype system developed under the Minerva project is described.
Stereo optical guidance system for control of industrial robots
NASA Technical Reports Server (NTRS)
Powell, Bradley W. (Inventor); Rodgers, Mike H. (Inventor)
1992-01-01
A device for the generation of basic electrical signals which are supplied to a computerized processing complex for the operation of industrial robots. The system includes a stereo mirror arrangement for the projection of views from opposite sides of a visible indicia formed on a workpiece. The views are projected onto independent halves of the retina of a single camera. The camera retina is of the CCD (charge-coupled-device) type and is therefore capable of providing signals in response to the image projected thereupon. These signals are then processed for control of industrial robots or similar devices.
Optimization design of periscope type 3X zoom lens design for a five megapixel cellphone camera
NASA Astrophysics Data System (ADS)
Sun, Wen-Shing; Tien, Chuen-Lin; Pan, Jui-Wen; Chao, Yu-Hao; Chu, Pu-Yi
2016-11-01
This paper presents a periscope-type 3X zoom lens design for a five-megapixel cellphone camera. The optical system places a right-angle prism in front of the zoom lenses to fold the optical path by 90°, which keeps the zoom lens length to 6 mm, so the zoom lenses can be embedded in a mobile phone 6 mm thick. The zoom lenses comprise three groups with six elements. The half field of view varies from 30° to 10.89°, the effective focal length is adjusted from 3.142 mm to 9.426 mm, and the F-number changes from 2.8 to 5.13.
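The quoted specifications are self-consistent: the image semi-diagonal y' = f tan(HFOV) must stay fixed across the zoom range for a given sensor, and both zoom endpoints indeed give about 1.81 mm. A quick check using only numbers from the abstract:

    import math

    for f_mm, half_fov_deg in [(3.142, 30.0), (9.426, 10.89)]:
        y = f_mm * math.tan(math.radians(half_fov_deg))
        print(f"f = {f_mm} mm -> image semi-diagonal {y:.3f} mm")
    # both print ~1.81 mm, i.e. the same sensor corner height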
Towards next generation 3D cameras
NASA Astrophysics Data System (ADS)
Gupta, Mohit
2017-03-01
We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), the presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in the widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that address these long-standing problems. This includes designing 'all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover the shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed (<100 micron resolution) scans in extremely demanding scenarios with low-cost components. Several of these cameras are making a practical impact in industrial automation, being adopted in robotic inspection and assembly systems.
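For the time-of-flight side, the underlying principle is compact enough to state in a few lines: a continuous-wave ToF camera recovers depth from the phase shift of a modulated light signal. The modulation frequency below is an illustrative assumption, not a figure from the talk.

    import math

    C = 299792458.0  # speed of light, m/s

    def tof_depth(phase_rad, f_mod=20e6):
        """Depth from measured phase shift; unambiguous up to C / (2 * f_mod)."""
        return C * phase_rad / (4.0 * math.pi * f_mod)

    print(tof_depth(math.pi))  # half the ambiguity range: ~3.75 m at 20 MHz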
SU-F-T-232: Monthly Quality Assurance in External Beam Radiation Therapy Using a Single System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, K; Ji, T; Department of Radiation Oncology, The First Hospital, China Medical University, Shenyang, Liaoning
Purpose: Monthly quality assurance (QA) of external beam radiation therapy machines is time consuming, taking as long as 6-8 hours per machine, because multiple devices must be used and set up for the different QA procedures. We have developed a single system with rotational capability for the measurement of both optical light and radiation, which significantly reduces the time spent on monthly QA. Methods: A single system using mirrors, a phosphor screen and a CCD camera is housed on a cylindrical motor so that it can rotate 360 degrees. For monthly QA, the system is placed on the patient couch of the medical accelerator with the plane of the phosphor screen at isocenter for all measurements. For optical QA, such as the optical distance indicator, room lasers and light field, the optical image is collected directly with the camera. For radiation QA, such as beam profile, MLC speed, picket-fence test, collimator rotation, table rotation and gantry rotation, a brass build-up plate is attached to the top of the phosphor screen. Two brass plates with islands of different thickness were designed for photon and electron energy constancy checks. A flex map, a distortion map and a uniformity map were developed to calibrate the motor bearing, the camera/lens distortion, and the phosphor screen's measured response across the field. Results: Following the TG-142 guidelines for monthly QA with our system, the overall run time is reduced from 6-8 hours to 1.5 hours. The system's rotating design allows for a quick gantry radiation isocenter test that is also independent of the sag of the gantry and the EPID. Conclusion: Our system significantly shortens the time needed for monthly QA by unifying the tests within a single system. Future work will focus on extending the technology to brachytherapy, IMRT and proton therapy QA. This work is funded in part by a sponsored research grant from JPLC, which owns the Raven technology. John Wong is a co-founder of JPLC.
Blue camera of the Keck cosmic web imager, fabrication and testing
NASA Astrophysics Data System (ADS)
Rockosi, Constance; Cowley, David; Cabak, Jerry; Hilyard, David; Pfister, Terry
2016-08-01
The Keck Cosmic Web Imager (KCWI) is a new facility instrument being developed for the W. M. Keck Observatory and funded for construction by the Telescope System Instrumentation Program (TSIP) of the National Science Foundation (NSF). KCWI is a bench-mounted spectrograph for the Keck II right Nasmyth focal station, providing integral field spectroscopy over a seeing-limited field up to 20" x 33" in extent. Selectable Volume Phase Holographic (VPH) gratings provide high efficiency and spectral resolution in the range of 1000 to 20000. The dual-beam design of KCWI passed a Preliminary Design Review in summer 2011. The detailed design of the KCWI blue channel (350 to 700 nm) is now nearly complete, with the red channel (530 to 1050 nm) planned for a phased implementation contingent upon additional funding. KCWI builds on the experience of the Caltech team in implementing the Cosmic Web Imager (CWI), in operation since 2009 at Palomar Observatory. KCWI adds considerable flexibility to the CWI design, and will take full advantage of the excellent seeing and dark sky above Mauna Kea with a selectable nod-and-shuffle observing mode. In this paper, models of the expected KCWI sensitivity and background subtraction capability are presented, along with a detailed description of the instrument design. The KCWI team is led by Caltech (project management, design and implementation) in partnership with the University of California at Santa Cruz (camera optical and mechanical design) and the W. M. Keck Observatory (program oversight and observatory interfaces). The optical design of the blue camera for the Keck Cosmic Web Imager (KCWI) by Harland Epps of the University of California, Santa Cruz is a lens assembly consisting of eight spherical optical elements. Half the elements are calcium fluoride and all elements are air spaced. The design of the camera barrel is unique in that all the optics are secured in their respective cells with an RTV annulus without additional hardware such as retaining rings. The optical design and the robust lens mounting concept have allowed UCO/Lick to design a straightforward lens camera assembly. However, alignment sensitivity is a strict 15 μm for most elements. This drives the fabrication, assembly, and performance of the camera barrel.
A traffic situation analysis system
NASA Astrophysics Data System (ADS)
Sidla, Oliver; Rosner, Marcin
2011-01-01
The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has great potential. For example, embedded vision systems built into vehicles can be used as early warning systems, or stationary camera systems can modify the switching frequency of signals at intersections. Today the automated analysis of traffic situations is still in its infancy - the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully understood by a vision system. We present steps towards such a traffic monitoring system, which is designed to detect potentially dangerous traffic situations, especially incidents in which the interaction of pedestrians and vehicles might develop into safety-critical encounters. The proposed system is field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in an outdoor-capable housing. Two cameras run vehicle detection software including license plate detection and recognition; one camera runs a complex pedestrian detection and tracking module based on the HOG detection principle. As a supplement, all 3 cameras use additional optical flow computation on a low-resolution video stream in order to estimate the motion path and speed of objects. This work describes the foundation for all 3 object detection modalities (pedestrians, vehicles, license plates), and explains the system setup and its design.
NASA Technical Reports Server (NTRS)
Bozyan, Elizabeth P.; Hemenway, Paul D.; Argue, A. Noel
1990-01-01
Observations of a set of 89 extragalactic objects (EGOs) will be made with the Hubble Space Telescope Fine Guidance Sensors and Planetary Camera in order to link the HIPPARCOS Instrumental System to an extragalactic coordinate system. Most of the sources chosen for observation contain compact radio sources and stellarlike nuclei; 65 percent are optical variables beyond a 0.2 mag limit. To ensure proper exposure times, accurate mean magnitudes are necessary. In many cases, the average magnitudes listed in the literature were not adequate. The literature was searched for all relevant photometric information for the EGOs, and photometric parameters were derived, including mean magnitude, maximum range, and timescale of variability. This paper presents the results of that search and the parameters derived. The results will allow exposure times to be estimated such that an observed magnitude different from the tabular magnitude by 0.5 mag in either direction will not degrade the astrometric centering ability on a Planetary Camera CCD frame.
Optical Verification Laboratory Demonstration System for High Security Identification Cards
NASA Technical Reports Server (NTRS)
Javidi, Bahram
1997-01-01
Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured with a CCD camera) and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern, which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the primary pattern [1-3]. We have demonstrated experimentally an optical processor for security verification of objects, products, and persons. This demonstration is very important to encourage industries to consider the proposed system for research and development.
Passive radiation detection using optically active CMOS sensors
NASA Astrophysics Data System (ADS)
Dosiek, Luke; Schalk, Patrick D.
2013-05-01
Recently, there have been a number of small-scale and hobbyist successes in employing commodity CMOS-based camera sensors for radiation detection. For example, several smartphone applications initially developed for use in areas near the Fukushima nuclear disaster are capable of detecting radiation using a cell phone camera, provided opaque tape is placed over the lens. In all current useful implementations, it is required that the sensor not be exposed to visible light. We seek to build a system that does not have this restriction. While building such a system would require sophisticated signal processing, it would nevertheless provide great benefits. In addition to fulfilling their primary function of image capture, cameras would also be able to detect unknown radiation sources even when the danger is considered to be low or non-existent. By experimentally profiling the image artifacts generated by gamma ray and β particle impacts, algorithms are developed to identify the unique features of radiation exposure, while discarding optical interaction and thermal noise effects. Preliminary results focus on achieving this goal in a laboratory setting, without regard to integration time or computational complexity. However, future work will seek to address these additional issues.
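As a rough illustration of the artifact screening described above, the sketch below flags transient, few-pixel outliers in a frame stack and rejects large clusters, which are more likely genuine optical features. This is a minimal sketch only; the thresholds, the function name radiation_hits, and the cluster-size cut are assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy import ndimage

def radiation_hits(frames, k_sigma=6.0, max_cluster_px=12):
    """Flag transient, few-pixel spikes in a stack of frames.

    Radiation hits are single-frame, compact clusters; optical signal and
    thermal noise vary smoothly over time. Thresholds here are illustrative.
    """
    stack = np.asarray(frames, dtype=float)            # shape (T, H, W)
    mu, sigma = stack.mean(axis=0), stack.std(axis=0) + 1e-6
    masks = []
    for frame in stack:
        hot = (frame - mu) / sigma > k_sigma           # temporal outliers
        labels, n = ndimage.label(hot)                 # group into clusters
        sizes = np.asarray(ndimage.sum(hot, labels, index=range(1, n + 1)))
        small = 1 + np.flatnonzero(sizes <= max_cluster_px)
        masks.append(np.isin(labels, small))           # keep compact clusters
    return np.stack(masks)
```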
Color calibration of an RGB camera mounted in front of a microscope with strong color distortion.
Charrière, Renée; Hébert, Mathieu; Trémeau, Alain; Destouches, Nathalie
2013-07-20
This paper shows that color calibration of an RGB camera can be achieved even when the optical system in front of the camera introduces strong color distortion. In the present case, the optical system is a microscope containing a halogen lamp, with nonuniform irradiance on the viewed surface. The calibration method proposed in this work is based on an existing method, but is preceded by a three-step preprocessing of the RGB images that extracts relevant color information from the strongly distorted images, taking into account especially the nonuniform irradiance map and the perturbing texture due to the surface topology of the standard color calibration charts when observed at micrometric scale. The proposed color calibration process consists first of computing the average color of the color-chart patches viewed under the microscope; then computing white balance, gamma correction, and saturation enhancement; and finally applying a third-order polynomial regression color calibration transform. Despite the unusual conditions for color calibration, fairly good performance is achieved from a 48-patch Lambertian color chart: an average CIE-94 color difference on the color-chart colors of less than 2.5 units is obtained.
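To make the final step concrete, the sketch below fits a third-order polynomial regression from measured patch colors to reference chart values with ordinary least squares. It assumes the preprocessing (irradiance correction, white balance, gamma, saturation) has already been applied; the function names and the choice of target color space are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def poly3_features(rgb):
    """Full third-order polynomial expansion of RGB triplets (N x 3 in [0,1])."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    cols = [np.ones_like(r), r, g, b,                      # constant + linear
            r*r, g*g, b*b, r*g, r*b, g*b,                  # quadratic
            r**3, g**3, b**3, r*r*g, r*r*b, g*g*r,         # cubic
            g*g*b, b*b*r, b*b*g, r*g*b]
    return np.stack(cols, axis=1)                          # (N, 20)

def fit_color_transform(measured_rgb, reference_rgb):
    """Least-squares fit from measured patch averages to reference values."""
    A = poly3_features(measured_rgb)
    M, *_ = np.linalg.lstsq(A, reference_rgb, rcond=None)
    return M                                               # (20, 3)

def apply_color_transform(rgb, M):
    return poly3_features(rgb) @ M
```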
NASA Astrophysics Data System (ADS)
House, Rachael; Lasso, Andras; Harish, Vinyas; Baum, Zachary; Fichtinger, Gabor
2017-03-01
PURPOSE: Optical pose tracking of medical instruments is often used in image-guided interventions. Unfortunately, compared to commonly used computing devices, optical trackers tend to be large, heavy, and expensive devices. Compact 3D vision systems, such as Intel RealSense cameras, can capture 3D pose information at several magnitudes lower cost, size, and weight. We propose to use the Intel SR300 device for applications where it is not practical or feasible to use conventional trackers and where limited range and tracking accuracy are acceptable. We also put forward a vertebral level localization application utilizing the SR300 to reduce the risk of wrong-level surgery. METHODS: The SR300 was utilized as an object tracker by extending the PLUS toolkit to support data collection from RealSense cameras. Accuracy of the camera was tested by comparing to a high-accuracy optical tracker. CT images of a lumbar spine phantom were obtained and used to create a 3D model in 3D Slicer. The SR300 was used to obtain a surface model of the phantom. Markers were attached to the phantom and a pointer and tracked using the Intel RealSense SDK's built-in object tracking feature. 3D Slicer was used to align the CT image with the phantom using landmark registration and display the CT image overlaid on the optical image. RESULTS: Accuracy testing of the camera yielded a median position error of 3.3 mm (95th percentile 6.7 mm) and orientation error of 1.6° (95th percentile 4.3°) in a 20 x 16 x 10 cm workspace, constantly maintaining proper marker orientation. The model and surface aligned correctly, demonstrating the vertebral level localization application. CONCLUSION: The SR300 may be usable for pose tracking in medical procedures where limited accuracy is acceptable. Initial results suggest the SR300 is suitable for vertebral level localization.
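The accuracy figures above are simple robust statistics over paired measurements. A minimal sketch of how such numbers can be computed, assuming both trackers' points are already expressed in a common frame (e.g., after rigid landmark registration):

```python
import numpy as np

def position_error_stats(test_positions, ref_positions):
    """Median and 95th-percentile Euclidean position error (same units as input)."""
    err = np.linalg.norm(np.asarray(test_positions, float) -
                         np.asarray(ref_positions, float), axis=1)
    return np.median(err), np.percentile(err, 95)
```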
Design principles and applications of a cooled CCD camera for electron microscopy.
Faruqi, A R
1998-01-01
Cooled CCD cameras offer a number of advantages in recording electron microscope images with CCDs rather than film, including: immediate availability of the image in a digital format suitable for further computer processing, high dynamic range, excellent linearity and a high detective quantum efficiency for recording electrons. In one important respect, however, film has superior properties: the spatial resolution of the CCD detectors tested so far (in terms of point spread function or modulation transfer function) is inferior to film, and a great deal of our effort has been spent in designing detectors with improved spatial resolution. Various instrumental contributions to spatial resolution have been analysed, and in this paper we discuss the contribution of the phosphor-fibre optics system in this measurement. We have evaluated the performance of a number of detector components and parameters, e.g. different phosphors (and a scintillator) and optical coupling with lens or fibre optics with various demagnification factors, to improve the detector performance. The camera described in this paper, which is based on this analysis, uses a tapered fibre optics coupling between the phosphor and the CCD and is installed on a Philips CM12 electron microscope equipped to perform cryo-microscopy. The main use of the camera so far has been in recording electron diffraction patterns from two-dimensional crystals of bacteriorhodopsin, from wild type and from different trapped states during the photocycle. As one example of the type of data obtained with the CCD camera, a two-dimensional Fourier projection map from the trapped O-state is also included. With faster computers, it will soon be possible to undertake this type of work on an on-line basis. Also, with improvements in detector size and resolution, CCD detectors, already ideal for diffraction, will be able to compete with film in the recording of high resolution images.
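Since the comparison hinges on the point spread and modulation transfer functions, here is a minimal sketch of one common way to obtain an MTF curve from a measured one-dimensional line-spread function. It is a generic illustration, not the instrument's actual measurement procedure.

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_um):
    """MTF as the normalized magnitude of the Fourier transform of an LSF."""
    lsf = np.asarray(lsf, float)
    lsf = lsf / lsf.sum()                       # unit-area line-spread function
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                          # normalize to unity at DC
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_um / 1000.0)  # cycles/mm
    return freqs, mtf
```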
Design of optical system for binocular fundus camera.
Wu, Jun; Lou, Shiliang; Xiao, Zhitao; Geng, Lei; Zhang, Fang; Wang, Wen; Liu, Mengjia
2017-12-01
A non-mydriatic optical system for a binocular fundus camera has been designed in this paper. It can capture two images of the same fundus retinal region from different angles at the same time, and can be used to achieve three-dimensional reconstruction of the fundus. It is composed of an imaging system and an illumination system. In the imaging system, the Gullstrand-Le Grand eye model is used to simulate the normal human eye, and a schematic eye model is used to test the influence of ametropia of the human eye on imaging quality. An annular aperture and a black dot board are added to the illumination system so that it can eliminate stray light produced by corneal-reflected light and the ophthalmoscopic lens. Simulation results show that the MTF of each field of view at the cut-off frequency of 90 lp/mm is greater than 0.2, the system distortion value is -2.7%, the field curvature is less than 0.1 mm, and the radius of the Airy disc is 3.25 μm. This system has a strong ability for chromatic aberration correction and focusing, and can image the human fundus clearly over a diopter range from -10 D to +6 D (1 D = 1 m⁻¹).
The Zwicky Transient Facility Camera
NASA Astrophysics Data System (ADS)
Dekany, Richard; Smith, Roger M.; Belicki, Justin; Delacroix, Alexandre; Duggan, Gina; Feeney, Michael; Hale, David; Kaye, Stephen; Milburn, Jennifer; Murphy, Patrick; Porter, Michael; Reiley, Daniel J.; Riddle, Reed L.; Rodriguez, Hector; Bellm, Eric C.
2016-08-01
The Zwicky Transient Facility Camera (ZTFC) is a key element of the ZTF Observing System, the integrated system of optoelectromechanical instrumentation tasked to acquire the wide-field, high-cadence time-domain astronomical data at the heart of the Zwicky Transient Facility. The ZTFC consists of a compact cryostat with large vacuum window protecting a mosaic of 16 large, wafer-scale science CCDs and 4 smaller guide/focus CCDs, a sophisticated vacuum interface board which carries data as electrical signals out of the cryostat, an electromechanical window frame for securing externally inserted optical filter selections, and associated cryo-thermal/vacuum system support elements. The ZTFC provides an instantaneous 47 deg² field of view, limited by primary mirror vignetting in its Schmidt telescope prime focus configuration. We report here on the design and performance of the ZTF CCD camera cryostat and report results from extensive Joule-Thomson cryocooler tests that may be of broad interest to the instrumentation community.
Subaperture correlation based digital adaptive optics for full field optical coherence tomography.
Kumar, Abhishek; Drexler, Wolfgang; Leitgeb, Rainer A
2013-05-06
This paper proposes a sub-aperture correlation based numerical phase correction method for interferometric full field imaging systems, provided the complex object field information can be extracted. The method corrects for wavefront aberration at the pupil/Fourier transform plane without the need for adaptive optics, spatial light modulators (SLMs) or additional cameras. We show that this method does not require knowledge of any system parameters. In the simulation study, we consider a full field swept source OCT (FF SSOCT) system to show the working principle of the algorithm. Experimental results are presented for a technical and a biological sample to demonstrate the proof of principle.
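The core idea, splitting the recovered pupil field into subapertures whose individual images shift in proportion to the local wavefront slope, can be sketched as follows. This is a minimal illustration of the principle only; the paper's full algorithm additionally converts the measured shifts into a phase estimate and applies the correction, and all names here are assumptions.

```python
import numpy as np

def subaperture_shifts(pupil_field, n=4):
    """Estimate local wavefront tilt from image shifts of pupil subapertures.

    pupil_field: complex 2-D array (recovered object field at the pupil /
    Fourier plane). The pupil is split into n x n blocks; the image formed
    by each block shifts in proportion to the local phase slope, so the
    cross-correlation peak against a reference block's image measures it.
    """
    H, W = pupil_field.shape

    def block_image(i, j):
        sub = np.zeros_like(pupil_field)
        ys = slice(i * H // n, (i + 1) * H // n)
        xs = slice(j * W // n, (j + 1) * W // n)
        sub[ys, xs] = pupil_field[ys, xs]
        return np.abs(np.fft.ifft2(sub))

    ref = block_image(n // 2, n // 2)              # central block as reference
    shifts = np.zeros((n, n, 2))
    for i in range(n):
        for j in range(n):
            img = block_image(i, j)
            # cross-correlation via FFT; peak offset = relative image shift
            xc = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img)))
            peak = np.unravel_index(np.argmax(np.abs(xc)), xc.shape)
            shifts[i, j] = [(p + s // 2) % s - s // 2   # wrap to signed offset
                            for p, s in zip(peak, xc.shape)]
    return shifts
```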
Fast Fiber-Coupled Imaging Devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brockington, Samuel; Case, Andrew; Witherspoon, Franklin Douglas
HyperV Technologies Corp. has successfully designed, built and experimentally demonstrated a full scale 1024 pixel 100 MegaFrames/s fiber coupled camera with 12 or 14 bits, and record lengths of 32K frames, exceeding our original performance objectives. This high-pixel-count, fiber optically-coupled, imaging diagnostic can be used for investigating fast, bright plasma events. In Phase 1 of this effort, a 100 pixel fiber-coupled fast streak camera for imaging plasma jet profiles was constructed and successfully demonstrated. The resulting response from outside plasma physics researchers emphasized development of increased pixel performance as a higher priority over increasing pixel count. In this Phase 2 effort, HyperV therefore focused on increasing the sample rate and bit-depth of the photodiode pixel designed in Phase 1, while still maintaining a long record length and holding the cost per channel to levels which allowed up to 1024 pixels to be constructed. Cost per channel was $53.31, very close to our original target of $50 per channel. The system consists of an imaging "camera head" coupled to a photodiode bank with an array of optical fibers. The output of these fast photodiodes is then digitized at 100 Megaframes per second and stored in record lengths of 32,768 samples with bit depths of 12 to 14 bits per pixel. Longer record lengths are possible with additional memory. A prototype imaging system with up to 1024 pixels was designed and constructed and used to successfully take movies of very fast moving plasma jets as a demonstration of the camera performance capabilities. Some faulty electrical components on the 64 circuit boards resulted in only 1008 functional channels out of 1024 on this first generation prototype system. We experimentally observed backlit high speed fan blades in initial camera testing and then followed that with full movies and streak images of free flowing high speed plasma jets (at 30-50 km/s). Jet structure and jet collisions onto metal pillars in the path of the plasma jets were recorded in a single shot. This new fast imaging system is an attractive alternative to conventional fast framing cameras for applications and experiments where imaging events using existing techniques are inefficient or impossible. The development of HyperV's new diagnostic was split into two tracks: a next generation camera track, in which HyperV built, tested, and demonstrated a prototype 1024 channel camera at its own facility, and a second plasma community beta test track, where selected plasma physics programs received small systems of a few test pixels to evaluate the expected performance of a full scale camera on their experiments. These evaluations were performed as part of an unfunded collaboration with researchers at Los Alamos National Laboratory and the University of California at Davis. Results from the prototype 1024-pixel camera are discussed, as well as results from the collaborations with test pixel system deployment sites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rilling, M; Centre de Recherche sur le Cancer, Hôtel-Dieu de Québec, Quebec City, QC; Département de radio-oncologie, CHU de Québec, Quebec City, QC
2015-06-15
Purpose: The purpose of this work is to simulate a multi-focus plenoptic camera used as the measuring device in a real-time three-dimensional scintillation dosimeter. Simulating and optimizing this realistic optical system will bridge the technological gap between concept validation and a clinically viable tool that can provide highly efficient, accurate and precise measurements for dynamic radiotherapy techniques. Methods: The experimental prototype, previously developed for proof of concept purposes, uses an off-the-shelf multi-focus plenoptic camera. With an array of interleaved microlenses of different focal lengths, this camera records spatial and angular information of light emitted by a plastic scintillator volume. The three distinct microlens focal lengths were determined experimentally for use as baseline parameters by measuring image-to-object magnification for different distances in object space. A simulated plenoptic system was implemented using the non-sequential ray tracing software Zemax: this tool allows complete simulation of multiple optical paths by modeling interactions at interfaces such as scatter, diffraction, reflection and refraction. The active sensor was modeled based on the camera manufacturer specifications by a 2048×2048, 5 µm-pixel pitch sensor. Planar light sources, simulating the plastic scintillator volume, were employed for ray tracing simulations. Results: The microlens focal lengths were determined to be 384, 327 and 290 µm. A realistic multi-focus plenoptic system, with independently defined and optimizable specifications, was fully simulated. A f/2.9 and 54 mm-focal length Double Gauss objective was modeled as the system’s main lens. A three-focal length hexagonal microlens array of 250-µm thickness was designed, acting as an image-relay system between the main lens and sensor. Conclusion: Simulation of a fully modeled multi-focus plenoptic camera enables the decoupled optimization of the main lens and microlens specifications. This work leads the way to improving the 3D dosimeter’s achievable resolution, efficiency and build for providing a quality assurance tool fully meeting clinical needs. M.R. is financially supported by a Master’s Canada Graduate Scholarship from the NSERC. This research is also supported by the NSERC Industrial Research Chair in Optical Design.
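The magnification-based focal length measurement mentioned above reduces to a thin-lens line fit: with image-to-object magnification m = f/(d_o - f), the quantity 1/m is linear in the object distance d_o with slope 1/f. A minimal sketch under that thin-lens assumption (an illustration, not the authors' exact procedure):

```python
import numpy as np

def focal_length_from_magnification(object_dist_mm, magnification):
    """Thin-lens fit: 1/m = d_o/f - 1, so a line fit of 1/m vs d_o gives f."""
    d = np.asarray(object_dist_mm, float)
    inv_m = 1.0 / np.asarray(magnification, float)
    slope, _intercept = np.polyfit(d, inv_m, 1)   # intercept should be ~ -1
    return 1.0 / slope                            # focal length in mm
```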
Active imaging system with Faraday filter
Snyder, James J.
1993-01-01
An active imaging system has a low to medium powered laser transmitter and receiver wherein the receiver includes a Faraday filter with an ultranarrow optical bandpass and a bare (nonintensified) CCD camera. The laser is locked in the vicinity of the passband of the Faraday filter. The system has high sensitivity to the laser illumination while eliminating solar background.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chemerisov, S.; Bailey, J.; Heltemes, T.
A series of four one-day irradiations was conducted with 100Mo-enriched disk targets. After irradiation, the enriched disks were removed from the target and dissolved. The resulting solution was processed using a NorthStar RadioGenix™ 99mTc generator either at Argonne National Laboratory or at the NorthStar Medical Radioisotopes facility. Runs on the RadioGenix system produced inconsistent analytical results for 99mTc in the Tc/Mo solution. These inconsistencies were attributed to impurities in the solution or improper column packing. During the irradiations, the performance of the optical transition radiation (OTR) and infrared (IR) cameras was tested in a high-radiation field. The OTR cameras survived all irradiations, while the IR cameras failed every time. The addition of X-ray and neutron shielding improved camera survivability and decreased the number of upsets.
SHOK—The First Russian Wide-Field Optical Camera in Space
NASA Astrophysics Data System (ADS)
Lipunov, V. M.; Gorbovskoy, E. S.; Kornilov, V. G.; Panasyuk, M. I.; Amelushkin, A. M.; Petrov, V. L.; Yashin, I. V.; Svertilov, S. I.; Vedenkin, N. N.
2018-02-01
Onboard the Lomonosov spacecraft are installed two fast, fixed, very wide-field SHOK cameras. The main goal of this experiment is the observation of GRB optical emission before, during, and after the gamma-ray emission. The field of view of each of the cameras is placed within the gamma-ray burst detection area of the other devices located onboard the Lomonosov spacecraft. SHOK provides measurements of optical emission with a magnitude limit of ~9-10m on a single frame with an exposure of 0.2 seconds. The device is designed for continuous sky monitoring at optical wavelengths in a very wide field of view (1000 square degrees per camera), and for detection and localization of fast time-varying (transient) optical sources on the celestial sphere, including provisional and synchronous time recording of optical emission from the gamma-ray burst error boxes detected by the BDRG device and implemented by a control signal (alert trigger) from the BDRG. The Lomonosov spacecraft has two identical devices, SHOK1 and SHOK2. The core of each SHOK device is a high-speed 11-megapixel CCD. Each SHOK device is a monoblock consisting of an optical-emission observation node, an electronics node, elements of the mechanical construction, and the body.
NASA Astrophysics Data System (ADS)
Close, Laird M.; Males, Jared R.; Kopon, Derek A.; Gasho, Victor; Follette, Katherine B.; Hinz, Phil; Morzinski, Katie; Uomoto, Alan; Hare, Tyson; Riccardi, Armando; Esposito, Simone; Puglisi, Alfio; Pinna, Enrico; Busoni, Lorenzo; Arcidiacono, Carmelo; Xompero, Marco; Briguglio, Runa; Quiros-Pacheco, Fernando; Argomedo, Javier
2012-07-01
The heart of the 6.5 m Magellan AO system (MagAO) is a 585-actuator adaptive secondary mirror (ASM) with <1 ms response times (0.7 ms typically). This adaptive secondary will allow low-emissivity and high-contrast AO science. We fabricated a high-order (561 mode) pyramid wavefront sensor (similar to that now successfully used at the Large Binocular Telescope). The relatively high actuator count (and small projected ~23 cm pitch) allows moderate Strehls to be obtained by MagAO in the “visible” (0.63-1.05 μm). To take advantage of this we have fabricated an AO CCD science camera called "VisAO". Complete “end-to-end” closed-loop lab tests of MagAO achieve a solid, broad-band, 37% Strehl (122 nm rms) at 0.76 μm (i’) with the VisAO camera in 0.8” simulated seeing (13 cm r0 at V) with fast 33 mph winds and a 40 m L0, locked on an R=8 mag artificial star. These relatively high visible-wavelength Strehls are enabled by our powerful combination of a next-generation ASM and a pyramid WFS with 400 controlled modes and 1000 Hz sample speeds (similar to that used successfully on-sky at the LBT). Currently only the VisAO science camera is used for lab testing of MagAO, but this high level of measured performance (122 nm rms) promises even higher Strehls with our IR science cameras. On bright (R=8 mag) stars we should achieve very high Strehls (>70% at H) in the IR with the existing MagAO Clio2 (λ=1-5.3 μm) science camera/coronagraph, or even higher (~98% Strehl) in the mid-IR (8-26 μm) with the existing BLINC/MIRAC4 science camera in the future. To eliminate non-common-path vibrations, dispersions, and optical errors, the VisAO science camera is fed by a common-path advanced triplet ADC and is piggy-backed on the pyramid WFS optical board itself. Also, a high-speed shutter can be used to block periods of poor correction. The entire system passed CDR in June 2009, and we finished the closed-loop system-level testing phase in December 2011. Final system acceptance (“pre-ship” review) was passed in February 2012. In May 2012 the entire AO system was successfully shipped to Chile and fully tested/aligned. It is now in storage in the Magellan telescope clean room in anticipation of “First Light” scheduled for December 2012. An overview of the design, attributes, performance, and schedule for the Magellan AO system and its two science cameras is briefly presented here.
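As a quick consistency check on the quoted numbers, the Maréchal approximation relates rms wavefront error to Strehl, S ~ exp(-(2*pi*sigma/lambda)^2); plugging in 122 nm rms at 0.76 μm reproduces the reported ~37%:

```python
import numpy as np

# Marechal approximation: Strehl ~ exp(-(2*pi*sigma/lambda)**2)
sigma_nm, lam_nm = 122.0, 760.0                 # rms WFE and wavelength
strehl = np.exp(-(2 * np.pi * sigma_nm / lam_nm) ** 2)
print(f"Strehl ~ {strehl:.2f}")                 # ~0.36, matching the quoted 37%
```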
NASA Astrophysics Data System (ADS)
Guarino, V.; Vassiliev, V.; Buckley, J.; Byrum, K.; Falcone, A.; Fegan, S.; Finley, J.; Hanna, D.; Kaaret, P.; Konopelko, A.; Krawczynski, H.; Krennrich, F.; Romani, R.; Wagner, R.; Woods, M.
2009-05-01
The concept of a future ground-based gamma-ray observatory, AGIS, covering the energy range 20 GeV to 200 TeV, is based on an array of 50-100 imaging atmospheric Cherenkov telescopes (IACTs). The anticipated improvement of AGIS sensitivity, angular resolution, and reliability of operation imposes demanding technological and cost requirements on the design of the IACTs. In this submission, we focus on the optical and mechanical systems for a novel two-mirror aplanatic optical system of the Schwarzschild-Couder type, originally proposed by Schwarzschild. Emerging mirror production technologies based on replication processes, such as cold and hot glass slumping, cured CFRP, and electroforming, provide new opportunities for cost-effective solutions in the design of the optical system. We explore the capabilities of these mirror fabrication methods for the AGIS project, along with alignment methods for the optical systems. We also study a mechanical structure that provides support points for the mirrors and the camera, with a design driven by the requirement of minimizing deflections of the mirror support structures.
NASA Astrophysics Data System (ADS)
Rieke-Zapp, D.; Tecklenburg, W.; Peipe, J.; Hastedt, H.; Haig, Claudia
Recent tests on the geometric stability of several digital cameras that were not designed for photogrammetric applications have shown either that the accomplished accuracies in object space are limited or that the accuracy potential is not exploited to the fullest extent. A total of 72 calibrations were calculated with four different software products for eleven digital camera models with different hardware setups, some with mechanical fixation of one or more parts. The calibration procedure was chosen in accordance with a German guideline for evaluation of optical 3D measuring systems [VDI/VDE, VDI/VDE 2634 Part 1, 2002. Optical 3D Measuring Systems-Imaging Systems with Point-by-point Probing. Beuth Verlag, Berlin]. All images were taken with ringflashes, which is considered a standard method for close-range photogrammetry. In cases where the flash was mounted to the lens, the force exerted on the lens tube and the camera mount greatly reduced the accomplished accuracy. Mounting the ringflash to the camera instead resulted in a large improvement of accuracy in object space. For standard calibration, the best accuracies in object space were accomplished with a Canon EOS 5D and a 35 mm Canon lens whose focusing tube was fixed with epoxy (47 μm maximum absolute length measurement error in object space). The fixation of the Canon lens was fairly easy and inexpensive, resulting in a sevenfold increase in accuracy compared with the same lens type without modification. A similar accuracy was accomplished with a Nikon D3 when mounting the ringflash to the camera instead of the lens (52 μm maximum absolute length measurement error in object space). Parameterisation of geometric instabilities by introduction of an image-variant interior orientation in the calibration process improved results for most cameras. In this case, a modified Alpa 12 WA yielded the best results (29 μm maximum absolute length measurement error in object space). Extending the parameter model with FiBun software to model not only an image-variant interior orientation but also deformations in the sensor domain of the cameras showed significant improvements only for a small group of cameras. The Nikon D3 camera yielded the best overall accuracy (25 μm maximum absolute length measurement error in object space) with this calibration procedure, indicating at the same time the presence of image-invariant error in the sensor domain. Overall, the calibration results showed that digital cameras can be applied for an accurate photogrammetric survey and that only a little effort was sufficient to greatly improve the accuracy potential of digital cameras.
Final Optical Design of PANIC, a Wide-Field Infrared Camera for CAHA
NASA Astrophysics Data System (ADS)
Cárdenas, M. C.; Gómez, J. Rodríguez; Lenzen, R.; Sánchez-Blanco, E.
We present the Final Optical Design of PANIC (PAnoramic Near Infrared camera for Calar Alto), a wide-field infrared imager for the Ritchey-Chrétien focus of the Calar Alto 2.2 m telescope. This will be the first instrument built under the German-Spanish consortium that manages the Calar Alto observatory. The camera optical design is a folded single optical train that images the sky onto the focal plane with a plate scale of 0.45 arcsec per 18 μm pixel. The optical design produces a well-defined internal pupil, available for reducing the thermal background with a cryogenic pupil stop. A mosaic of four HAWAII-2RG 2k × 2k detectors, made by Teledyne, will give a field of view of 31.9 arcmin × 31.9 arcmin.
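The quoted plate scale pins down the camera's effective focal length through the usual small-angle relation, scale [arcsec/pixel] = 206265 * pixel / f. A two-line check (an inference from the quoted numbers, not a figure stated in the abstract):

```python
pixel_m, scale_arcsec = 18e-6, 0.45              # 18 um pixel, 0.45"/pixel
f_m = 206265 * pixel_m / scale_arcsec            # effective focal length
print(f"f ~ {f_m:.2f} m")                        # ~8.25 m, ~f/3.75 on the 2.2 m telescope
```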
Afocal viewport optics for underwater imaging
NASA Astrophysics Data System (ADS)
Slater, Dan
2014-09-01
A conventional camera can be adapted for underwater use by enclosing it in a sealed waterproof pressure housing with a viewport. The viewport, as the optical interface between water and air, must take into account both the camera and the optical characteristics of water while also providing a high-pressure water seal. Limited hydrospace visibility drives a need for wide-angle viewports. Practical optical interfaces between seawater and air vary from simple flat-plate windows to complex water-contact lenses. This paper first provides a brief overview of the physical and optical properties of the ocean environment along with suitable optical materials. This is followed by a discussion of the characteristics of various afocal underwater viewport types, including flat windows, domes and the Ivanoff corrector lens, a derivative of a Galilean wide-angle camera adapter. Several new and interesting optical designs derived from the Ivanoff corrector lens are presented, including a pair of very compact afocal viewport lenses that are compatible with both in-water and in-air environments and an afocal underwater hyper-hemispherical fisheye lens.
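The basic penalty of a flat window follows directly from Snell's law at the port: n_w * sin(theta_w) = sin(theta_a) for a thin flat plate. A minimal sketch of the resulting field-of-view loss (generic physics, not a design from the paper):

```python
import numpy as np

def in_water_half_fov(air_half_fov_deg, n_water=1.33):
    """Flat-port refraction: n_w * sin(theta_w) = sin(theta_a)."""
    theta_a = np.radians(air_half_fov_deg)
    return np.degrees(np.arcsin(np.sin(theta_a) / n_water))

print(round(in_water_half_fov(45.0), 1))   # ~32.1 deg: a flat window narrows the view
```

This narrowing, together with the refraction-induced distortion and chromatic error, is the motivation for dome ports and corrector lenses of the kind surveyed in the paper.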
The Design of Optical Sensor for the Pinhole/Occulter Facility
NASA Technical Reports Server (NTRS)
Greene, Michael E.
1990-01-01
Three optical sight sensor systems were designed, built and tested. Two optical line-of-sight sensor systems are capable of measuring the absolute pointing angle to the sun. The system is for use with the Pinhole/Occulter Facility (P/OF), a solar hard X-ray experiment to be flown from the Space Shuttle or Space Station. The first sensor consists of a pinhole camera with two pairs of perpendicularly mounted linear photodiode arrays to detect the intensity distribution of the solar image produced by the pinhole, track-and-hold circuitry for data reduction, an analog-to-digital converter, and a microcomputer. The deflection of the image center is calculated from these data using an approximation for the solar image. The second system consists of a pinhole camera with a pair of perpendicularly mounted linear photodiode arrays, amplification circuitry, threshold detection circuitry, and a microcomputer board. The deflection of the image is calculated by knowing the position of each pixel of the photodiode array and simply counting pixels until the threshold is surpassed. The third optical sensor system is capable of measuring the internal vibration of the P/OF between the mask and the base. It consists of a white light source, a mirror and a pair of perpendicularly mounted linear photodiode arrays to detect the intensity distribution of the image produced by the mirror, amplification circuitry, threshold detection circuitry, and a microcomputer board. The deflection of the image, and hence the vibration of the structure, is calculated by knowing the position of each pixel of the photodiode array and counting pixels until the threshold is surpassed.
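A minimal sketch of the pixel-counting scheme used by the second and third sensors, where the image position on each linear array is taken as the first pixel whose signal exceeds the threshold (the function name and pitch parameter are illustrative):

```python
import numpy as np

def first_crossing_mm(array_counts, threshold, pixel_pitch_mm):
    """Position of the first pixel whose signal exceeds the threshold."""
    above = np.asarray(array_counts) > threshold
    if not above.any():                      # no crossing: image off the array
        return None
    return np.argmax(above) * pixel_pitch_mm # index of first True pixel

# Two perpendicular arrays give the x and y deflection of the image.
```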
Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei
2016-01-01
High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can rapidly increase the temporal resolution several, or even hundreds, of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a temporal-resolution gain to 100 fps (frames per second) using a 25 fps camera. PMID:26959023
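To make the per-pixel coded exposure idea concrete, the toy model below staggers short exposures across k interleaved pixel groups within each camera frame, so a low-rate sensor samples k distinct time slots per frame; grouping the readout by phase then yields k-times-finer temporal sampling at reduced spatial resolution. This illustrates the general principle only, not the paper's three-element median quicksort reconstruction.

```python
import numpy as np

def staggered_exposure_frames(scene, k=4):
    """Toy model of per-pixel coded exposure on a DMD camera.

    scene: (T, H, W) high-rate scene, T a multiple of k. Pixels are split
    into k interleaved groups; within each camera frame (k scene steps
    long), group g integrates only during step g.
    """
    scene = np.asarray(scene, float)
    T, H, W = scene.shape
    phase = (np.arange(H)[:, None] + np.arange(W)[None, :]) % k  # pixel phase
    frames = []
    for t0 in range(0, T, k):
        frame = np.zeros((H, W))
        for g in range(k):
            frame[phase == g] = scene[t0 + g][phase == g]        # coded exposure
        frames.append(frame)
    return np.stack(frames), phase   # regroup by phase to recover k slots/frame
```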
The Panoramic Camera (PanCam) Instrument for the ESA ExoMars Rover
NASA Astrophysics Data System (ADS)
Griffiths, A.; Coates, A.; Jaumann, R.; Michaelis, H.; Paar, G.; Barnes, D.; Josset, J.
The recently approved ExoMars rover is the first element of the ESA Aurora programme and is slated to deliver the Pasteur exobiology payload to Mars by 2013. The 0.7 kg Panoramic Camera will provide multispectral stereo images with 65° field-of-view (1.1 mrad/pixel) and high resolution (85 µrad/pixel) monoscopic "zoom" images with 5° field-of-view. The stereo Wide Angle Cameras (WAC) are based on Beagle 2 Stereo Camera System heritage. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission as well as providing multispectral geological imaging, colour and stereo panoramic images, solar images for water vapour abundance and dust optical depth measurements and to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. Additionally, the High Resolution Camera (HRC) can be used for high resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls.
Model-based software engineering for an optical navigation system for spacecraft
NASA Astrophysics Data System (ADS)
Franz, T.; Lüdtke, D.; Maibaum, O.; Gerndt, A.
2017-09-01
The project Autonomous Terrain-based Optical Navigation (ATON) at the German Aerospace Center (DLR) is developing an optical navigation system for future landing missions on celestial bodies such as the moon or asteroids. Image data obtained by optical sensors can be used for autonomous determination of the spacecraft's position and attitude. Camera-in-the-loop experiments in the Testbed for Robotic Optical Navigation (TRON) laboratory and flight campaigns with unmanned aerial vehicles (UAVs) are performed to gather flight data for further development and to test the system in a closed-loop scenario. The software modules are executed in the C++ Tasking Framework that provides the means to concurrently run the modules in separated tasks, send messages between tasks, and schedule task execution based on events. Since the project is developed in collaboration with several institutes in different domains at DLR, clearly defined and well-documented interfaces are necessary. Preventing misconceptions caused by differences between various development philosophies and standards turned out to be challenging. After the first development cycles with manual Interface Control Documents (ICD) and manual implementation of the complex interactions between modules, we switched to a model-based approach. The ATON model covers a graphical description of the modules, their parameters and communication patterns. Type and consistency checks on this formal level help to reduce errors in the system. The model enables the generation of interfaces and unified data types as well as their documentation. Furthermore, the C++ code for the exchange of data between the modules and the scheduling of the software tasks is created automatically. With this approach, changing the data flow in the system or adding additional components (e.g., a second camera) has become trivial.
Mosad and Stream Vision For A Telerobotic, Flying Camera System
NASA Technical Reports Server (NTRS)
Mandl, William
2002-01-01
Two full-custom camera systems using the Multiplexed OverSample Analog to Digital (MOSAD) conversion technology for visible light sensing were built and demonstrated. They include a photo-gate sensor and a photo-diode sensor. Each system includes the camera assembly, a driver interface assembly, a frame grabber board with integrated decimator, and Windows 2000 compatible software for real-time image display. An array size of 320 x 240 with 16 micron pixel pitch was developed for compatibility with 0.3 inch CCTV optics. With 1.2 micron technology, a 73% fill factor was achieved. Noise measurements indicated 9 to 11 bits in operation, with 13.7 bits best case. Power measured under 10 milliwatts at 400 samples per second. Nonuniformity variation was below the noise floor. Pictures were taken with the different cameras during the characterization study to demonstrate the operable range. The successful conclusion of this program demonstrates the utility of MOSAD for NASA missions, providing superior performance over CMOS and lower cost and power consumption than CCD. The MOSAD approach also provides a path to radiation hardening for space-based applications.
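MOSAD-style conversion rests on oversampling followed by decimation. The sketch below runs a first-order sigma-delta modulator over an input scaled to [0, 1] and decimates the 1-bit stream with a boxcar average; it illustrates the oversample-then-decimate principle generically, since the actual MOSAD pixel circuit and decimation filter are not described in the abstract.

```python
import numpy as np

def sigma_delta(signal_01, osr=64):
    """First-order sigma-delta modulator followed by boxcar decimation.

    signal_01: input samples scaled to [0, 1]; osr: oversampling ratio.
    Generic illustration of oversampled conversion, not the MOSAD circuit.
    """
    x = np.repeat(np.asarray(signal_01, float), osr)   # oversample the input
    acc, bits = 0.0, np.empty_like(x)
    for i, s in enumerate(x):
        acc += s - (acc > 0.5)       # integrate error vs. 1-bit feedback
        bits[i] = acc > 0.5          # 1-bit quantizer output
    return bits.reshape(-1, osr).mean(axis=1)          # decimate to output rate
```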
A simple optical tweezers for trapping polystyrene particles
NASA Astrophysics Data System (ADS)
Shiddiq, Minarni; Nasir, Zulfa; Yogasari, Dwiyana
2013-09-01
Optical tweezers are optical traps. For decades, they have served as an optical tool that can trap and manipulate particles ranging from the very small, such as DNA molecules, to larger objects such as bacteria. The trapping force comes from the radiation pressure of laser light focused onto the particles. Optical tweezers have been used in many research areas such as atomic physics, medical physics, biophysics, and chemistry. Here, a simple optical tweezers setup has been constructed using a modified Leybold laboratory optical microscope. The ocular lens of the microscope has been removed for laser light and digital camera access. Light from a Coherent diode laser with wavelength λ = 830 nm and power 50 mW is sent through an oil-immersion objective lens with magnification 100x and NA 1.25 to a cell made from microscope slides containing polystyrene particles. Polystyrene particles with sizes of 3 μm and 10 μm are used. A CMOS Thorlabs camera type DCC1545M with USB interface and a 35 mm Thorlabs camera lens are connected to a desktop computer and used to monitor the trapping and measure the stiffness of the trap. The camera is accompanied by software that enables the user to capture and save images. The images are analyzed using ImageJ and a Scion macro. The polystyrene particles have been trapped successfully. The stiffness of the trap depends on the size of the particles and the power of the laser: it increases linearly with power and decreases as the particle size increases.
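One standard way to turn camera position tracks into a stiffness number is the equipartition theorem, k = k_B * T / var(x). The abstract does not state which estimator was used, so the sketch below is a generic illustration:

```python
import numpy as np

KB = 1.380649e-23   # Boltzmann constant, J/K

def trap_stiffness(x_m, temperature_k=295.0):
    """Equipartition estimate of trap stiffness along one axis, in N/m.

    x_m: tracked bead positions in metres (e.g., from camera images after
    pixel-to-metre calibration). Assumes a thermally equilibrated bead in
    a harmonic trap.
    """
    x = np.asarray(x_m, float)
    return KB * temperature_k / np.var(x)
```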
Lightweight helmet-mounted eye movement measurement system
NASA Technical Reports Server (NTRS)
Barnes, J. A.
1978-01-01
The helmet-mounted eye movement measuring system weighs 1,530 grams; the weight of the present aviators' helmet in standard form with the visor is 1,545 grams. The optical head is a standard NAC Eye-Mark. This optical head was mounted on a magnesium yoke, which in turn was attached to a slide cam mounted on the flight helmet. The slide cam allows one to adjust the eye-to-optics distance quite easily and to secure it so that the system will remain in calibration. The design of the yoke and slide cam is such that the subject can, in an emergency, move the optical head forward and upward to the stowed and locked position atop the helmet. This feature was necessary for flight safety. The television camera used in the system is a solid-state General Electric TN-2000 with a charge injection device (CID) imager in place of a vidicon.