Sample records for scientific camera system

  1. Utilizing ISS Camera Systems for Scientific Analysis of Lightning Characteristics and Comparison with ISS-LIS and GLM

    NASA Technical Reports Server (NTRS)

    Schultz, Christopher J.; Lang, Timothy J.; Leake, Skye; Runco, Mario, Jr.; Blakeslee, Richard J.

    2017-01-01

    Video and still frame images from cameras aboard the International Space Station (ISS) are used to inspire, educate, and provide a unique vantage point from low-Earth orbit that is second to none; however, these cameras have overlooked capabilities for contributing to scientific analysis of the Earth and near-space environment. The goal of this project is to study how georeferenced video/images from available ISS camera systems can be useful for scientific analysis, using lightning properties as a demonstration.

  2. Tests of commercial colour CMOS cameras for astronomical applications

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Reshetnyk, V. M.; Zhilyaev, B. E.

    2013-12-01

    We present some results of testing commercial colour CMOS cameras for astronomical applications. Colour CMOS sensors allow photometry to be performed in three filters simultaneously, which gives a great advantage compared with monochrome CCD detectors. The Bayer BGR colour system realized in colour CMOS sensors is close to the astronomical Johnson BVR system. The basic camera characteristics: read noise (e⁻/pix), thermal noise (e⁻/pix/s) and electronic gain (e⁻/ADU) for the commercial digital camera Canon 5D Mark III are presented. We give the same characteristics for the scientific high-performance cooled CCD camera system ALTA E47. Comparison of the test results for the Canon 5D Mark III and the ALTA E47 CCD shows that present-day commercial colour CMOS cameras can seriously compete with scientific CCD cameras in deep astronomical imaging.
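
    The electronic gain figure quoted above (e⁻/ADU) is commonly obtained with the mean-variance (photon transfer) method: for shot-noise-limited flat fields, variance grows linearly with mean signal and the slope of that line is 1/gain. The following is a minimal numpy sketch of the idea, not the authors' procedure; the file names and bias level are hypothetical:

        import numpy as np

        def photon_transfer_gain(flat_pairs, bias=0.0):
            """Estimate gain (e-/ADU) from pairs of flat frames at varying exposure.

            Differencing two flats taken at the same illumination cancels the
            fixed-pattern noise, leaving shot noise plus read noise.
            """
            means, variances = [], []
            for a, b in flat_pairs:  # a, b: 2-D frames in ADU
                signal = 0.5 * (a.astype(float) + b).mean() - bias
                var = 0.5 * (a.astype(float) - b).var()  # difference variance halved
                means.append(signal)
                variances.append(var)
            # variance = mean/gain + read_noise^2, so the fitted slope is 1/gain
            slope, intercept = np.polyfit(means, variances, 1)
            return 1.0 / slope

        # usage (hypothetical frame files):
        # gain = photon_transfer_gain([(np.load('flat1a.npy'), np.load('flat1b.npy'))])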

  3. High-performance dual-speed CCD camera system for scientific imaging

    NASA Astrophysics Data System (ADS)

    Simpson, Raymond W.

    1996-03-01

    Traditionally, scientific camera systems were partitioned into a 'camera head' containing the CCD and its support circuitry and a camera controller, which provided analog-to-digital conversion, timing, control, computer interfacing, and power. A new, unitized high-performance scientific CCD camera with dual-speed readout at 1 × 10⁶ or 5 × 10⁶ pixels per second, 12-bit digital gray scale, high-performance thermoelectric cooling, and built-in composite video output is described. This camera provides all digital, analog, and cooling functions in a single compact unit. The new system incorporates the A/D converter, timing, control and computer interfacing in the camera, with the power supply remaining a separate remote unit. A 100 Mbyte/second serial link transfers data over copper or fiber media to a variety of host computers, including Sun, SGI, SCSI, PCI, EISA, and Apple Macintosh. Having all the digital and analog functions in the camera made it possible to modify this system for the Woods Hole Oceanographic Institution for use on a remotely controlled submersible vehicle. The oceanographic version achieves 16-bit dynamic range at 1.5 × 10⁵ pixels/second, can be operated at depths of 3 kilometers, and transfers data to the surface via a real-time fiber-optic link.

  4. Timing generator of scientific grade CCD camera and its implementation based on FPGA technology

    NASA Astrophysics Data System (ADS)

    Si, Guoliang; Li, Yunfei; Guo, Yongfei

    2010-10-01

    The functions of the timing generator of a scientific-grade CCD camera are briefly presented: it generates the various pulse sequences required by the TDI-CCD, the video processor and the imaging data output, acting as the synchronous time coordinator of the CCD imaging unit. The IL-E2 TDI-CCD sensor produced by DALSA Co. Ltd. is used in the camera. The driving schedules of the IL-E2 TDI-CCD sensor have been examined in detail, and the timing generator has been designed accordingly. An FPGA is chosen as the hardware design platform, and the schedule generator is described in VHDL. The designed generator has successfully passed functional simulation with EDA software and was fitted into an XC2VP20-FF1152 (an FPGA product made by XILINX). The experiments indicate that the new method improves the integration level of the system. High reliability, stability and low power consumption of the scientific-grade CCD camera system are achieved. At the same time, the design and experiment period is sharply shortened.

  5. SpectraCAM SPM: a camera system with high dynamic range for scientific and medical applications

    NASA Astrophysics Data System (ADS)

    Bhaskaran, S.; Baiko, D.; Lungu, G.; Pilon, M.; VanGorden, S.

    2005-08-01

    A scientific camera system having high dynamic range, designed and manufactured by Thermo Electron for scientific and medical applications, is presented. The newly developed CID820 image sensor with preamplifier-per-pixel technology is employed in this camera system. The 4-megapixel imaging sensor has a raw dynamic range of 82 dB. Each highly transparent pixel is based on a preamplifier-per-pixel architecture and contains two photogates for non-destructive readout (NDRO) of the photon-generated charge. Readout is achieved via parallel row processing with on-chip correlated double sampling (CDS). The imager is capable of true random pixel access with a maximum operating speed of 4 MHz. The camera controller consists of a custom camera signal processor (CSP) with an integrated 16-bit A/D converter and a PowerPC-based CPU running an embedded Linux operating system. The imager is cooled to -40°C via a three-stage cooler to minimize dark current. The camera housing is sealed and is designed to maintain the CID820 imager in the evacuated chamber for at least 5 years. Thermo Electron has also developed custom software and firmware to drive the SpectraCAM SPM camera. Included in this firmware package is the new Extreme DR™ algorithm, which is designed to extend the effective dynamic range of the camera by several orders of magnitude, up to 32-bit dynamic range. The RACID Exposure graphical user interface image analysis software runs on a standard PC that is connected to the camera via Gigabit Ethernet.

  6. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras.

    PubMed

    Quinn, Mark Kenneth; Spinosa, Emanuele; Roberts, David A

    2017-07-25

    Measurements of pressure-sensitive paint (PSP) have been performed using new, non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show the relevant imaging characteristics and to demonstrate the applicability of such imaging technology for PSP. Details of camera performance are benchmarked and compared to standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access.

  7. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras

    PubMed Central

    Spinosa, Emanuele; Roberts, David A.

    2017-01-01

    Measurements of pressure-sensitive paint (PSP) have been performed using new, non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show the relevant imaging characteristics and to demonstrate the applicability of such imaging technology for PSP. Details of camera performance are benchmarked and compared to standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access. PMID:28757553

  8. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    USDA-ARS's Scientific Manuscript database

    Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications has not been well documented in related ...

  9. System Configuration and Operation Plan of Hayabusa2 DCAM3-D Camera System for Scientific Observation During SCI Impact Experiment

    NASA Astrophysics Data System (ADS)

    Ogawa, Kazunori; Shirai, Kei; Sawada, Hirotaka; Arakawa, Masahiko; Honda, Rie; Wada, Koji; Ishibashi, Ko; Iijima, Yu-ichi; Sakatani, Naoya; Nakazawa, Satoru; Hayakawa, Hajime

    2017-07-01

    An artificial impact experiment is scheduled for 2018-2019, in which an impactor will collide with asteroid 162173 Ryugu (1999 JU3) during the asteroid rendezvous phase of the Hayabusa2 spacecraft. The small carry-on impactor (SCI) will shoot a 2-kg projectile at 2 km/s to create a crater 1-10 m in diameter, with a subsequent ejecta curtain expected on a 100-m scale on an ideal sandy surface. A miniaturized deployable camera (DCAM3) unit will separate from the spacecraft at about 1 km from the impact point and simultaneously conduct optical observations of the experiment. We designed and developed a camera system (DCAM3-D) in the DCAM3, specialized for scientific observations of the impact phenomenon, in order to clarify the subsurface structure, construct impact theories applicable in a microgravity environment, and identify the impact point on the asteroid. The DCAM3-D system consists of a miniaturized camera with a wide angle of view and high focusing performance, high-speed radio communication devices, and control units with large data storage on both the DCAM3 unit and the spacecraft. These components were successfully developed under severe constraints of size, mass and power, and the whole DCAM3-D system has passed all tests verifying functions, performance, and environmental tolerance. Results indicated sufficient potential to conduct the scientific observations during the SCI impact experiment. An operation plan was carefully considered along with the configuration and time schedule of the impact experiment, and pre-programmed into the control unit before launch. In this paper, we describe details of the system design concept, specifications, and the operating plan of the DCAM3-D system, focusing on the feasibility of scientific observations.

  10. LAMOST CCD camera-control system based on RTS2

    NASA Astrophysics Data System (ADS)

    Tian, Yuan; Wang, Zheng; Li, Jian; Cao, Zi-Huang; Dai, Wei; Wei, Shou-Lin; Zhao, Yong-Heng

    2018-05-01

    The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) is the largest existing spectroscopic survey telescope, having 32 scientific charge-coupled-device (CCD) cameras for acquiring spectra. Stability and automation of the camera-control software are essential, but cannot be provided by the existing system. The Remote Telescope System 2nd Version (RTS2) is an open-source and automatic observatory-control system. However, all previous RTS2 applications were developed for small telescopes. This paper focuses on implementation of an RTS2-based camera-control system for the 32 CCDs of LAMOST. A virtual camera module inherited from the RTS2 camera module is built as a device component working on the RTS2 framework. To improve the controllability and robustness, a virtualized layer is designed using the master-slave software paradigm, and the virtual camera module is mapped to the 32 real cameras of LAMOST. The new system is deployed in the actual environment and experimentally tested. Finally, multiple observations are conducted using this new RTS2-framework-based control system. The new camera-control system is found to satisfy the requirements for automatic camera control in LAMOST. This is the first time that RTS2 has been applied to a large telescope, and provides a referential solution for full RTS2 introduction to the LAMOST observatory control system.
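
    The master-slave virtualization described here can be pictured with a short sketch: one virtual camera object presents a single camera interface while fanning each command out to the 32 real devices. The class and method names below are hypothetical illustrations of the pattern, not the actual RTS2 API:

        import concurrent.futures

        class SlaveCamera:
            """Stands in for one of the 32 real LAMOST CCD controllers."""
            def __init__(self, cam_id):
                self.cam_id = cam_id

            def expose(self, seconds):
                # a real implementation would talk to the camera over the network
                return f"camera {self.cam_id}: exposed {seconds}s"

        class VirtualCamera:
            """Master device mapped onto many slave cameras (master-slave pattern)."""
            def __init__(self, n_cameras=32):
                self.slaves = [SlaveCamera(i) for i in range(n_cameras)]

            def expose(self, seconds):
                # issue the exposure to all slaves in parallel; collect per-device status
                with concurrent.futures.ThreadPoolExecutor() as pool:
                    return list(pool.map(lambda s: s.expose(seconds), self.slaves))

        print(VirtualCamera().expose(30))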

  11. Television camera as a scientific instrument

    NASA Technical Reports Server (NTRS)

    Smokler, M. I.

    1970-01-01

    A rigorous calibration program, coupled with a sophisticated data-processing program that introduced compensation for system response to correct photometry, geometric linearity, and resolution, converted a television camera into a quantitative measuring instrument. The output data take the form of both numeric printout records and photographs.

  12. Multiplane and Spectrally-Resolved Single Molecule Localization Microscopy with Industrial Grade CMOS cameras.

    PubMed

    Babcock, Hazen P

    2018-01-29

    This work explores the use of industrial-grade CMOS cameras for single molecule localization microscopy (SMLM). We show that industrial-grade CMOS cameras approach the performance of scientific-grade CMOS cameras at a fraction of the cost. This makes it more economically feasible to construct high-performance imaging systems with multiple cameras that are capable of a diversity of applications. In particular, we demonstrate the use of industrial CMOS cameras for biplane, multiplane and spectrally resolved SMLM. We also provide open-source software for simultaneous control of multiple CMOS cameras and for reducing the acquired movies to super-resolution images.

  13. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  14. Using a trichromatic CCD camera for spectral skylight estimation.

    PubMed

    López-Alvarez, Miguel A; Hernández-Andrés, Javier; Romero, Javier; Olmo, F J; Cazorla, A; Alados-Arboledas, L

    2008-12-01

    In a previous work [J. Opt. Soc. Am. A 24, 942-956 (2007)] we showed how to design an optimum multispectral system aimed at spectral recovery of skylight. Since high-resolution multispectral images of skylight could be interesting for many scientific disciplines, here we also propose a nonoptimum but much cheaper and faster approach to achieve this goal by using a trichromatic RGB charge-coupled device (CCD) digital camera. The camera is attached to a fish-eye lens, hence permitting us to obtain a spectrum of every point of the skydome corresponding to each pixel of the image. In this work we show how to apply multispectral techniques to the sensors' responses of a common trichromatic camera in order to obtain skylight spectra from them. This spectral information is accurate enough to estimate experimental values of some climate parameters or to be used in algorithms for automatic cloud detection, among many other possible scientific applications.
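
    The multispectral estimation step can be illustrated with linear regression: given a training set of measured skylight spectra and the corresponding camera RGB responses, a least-squares estimator maps new RGB triplets to full spectra. The following is a minimal numpy sketch of that idea, not the authors' algorithm; the training data here are random placeholders:

        import numpy as np

        # training data: N samples of RGB responses (N x 3) and measured spectra (N x W)
        rng = np.random.default_rng(0)
        rgb_train = rng.random((200, 3))
        spectra_train = rng.random((200, 61))      # e.g. 400-700 nm in 5 nm steps

        # least-squares linear estimator: spectra ~ rgb @ M  (M is 3 x W)
        M, *_ = np.linalg.lstsq(rgb_train, spectra_train, rcond=None)

        def estimate_spectrum(rgb_pixel):
            """Recover an approximate skylight spectrum from one RGB triplet."""
            return rgb_pixel @ M

        print(estimate_spectrum(np.array([0.3, 0.5, 0.2])).shape)  # (61,)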

  15. Scientific Design of a High Contrast Integral Field Spectrograph for the Subaru Telescope

    NASA Technical Reports Server (NTRS)

    McElwain, Michael W.

    2012-01-01

    Ground based telescopes equipped with adaptive optics systems and specialized science cameras are now capable of directly detecting extrasolar planets. We present the scientific design for a high contrast integral field spectrograph for the Subaru Telescope. This lenslet based integral field spectrograph will be implemented into the new extreme adaptive optics system at Subaru, called SCExAO.

  16. Scientific and technical collaboration between Russian and Ukrainian researchers and manufacturers on the development of astronomical instruments equipped with advanced detection services

    NASA Astrophysics Data System (ADS)

    Vishnevsky, G. I.; Galyatkin, I. A.; Zhuk, A. A.; Iblyaminova, A. F.; Kossov, V. G.; Levko, G. V.; Nesterov, V. K.; Rivkind, V. L.; Rogalev, Yu. N.; Smirnov, A. V.; Gumerov, R. I.; Bikmaev, I. F.; Pinigin, G. I.; Shulga, A. V.; Kovalchyk, A. V.; Protsyuk, Yu. I.; Malevinsky, S. V.; Abrosimov, V. M.; Mironenko, V. N.; Savchenko, V. V.; Ivaschenko, Yu. N.; Andruk, V. M.; Dalinenko, I. N.; Vydrevich, M. G.

    2003-01-01

    The paper presents the possibilities and a list of tasks that are being solved through collaboration between research and production companies and astronomical observatories of Russia and Ukraine in the field of development, modernization and equipping of various telescopes (the AMC, RTT-150, Zeiss-600 and the Sazhen-S quantum-optical system) with advanced charge-coupled device (CCD) cameras. CCD imagers and digital CCD cameras designed and manufactured by the "Electron-Optronic" Research & Production Company, St Petersburg, to equip astronomical telescopes and scientific instruments are described.

  17. In-Situ Cameras for Radiometric Correction of Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Kautz, Jess S.

    The atmosphere distorts the spectrum of remotely sensed data, negatively affecting all forms of investigation of Earth's surface. To gather reliable data, it is vital that atmospheric corrections be accurate, yet the current state of the field does not account well for the benefits and costs of different correction algorithms. Ground spectral data are required to evaluate these algorithms better. This dissertation explores using cameras as radiometers as a means of gathering ground spectral data. I introduce techniques to implement a camera system for atmospheric correction using off-the-shelf parts. To aid the design of future camera systems for radiometric correction, methods are explored for estimating the system error prior to construction, followed by calibration and testing of the resulting camera system. Simulations are used to investigate the relationship between the reflectance accuracy of the camera system and the quality of atmospheric correction. In the design phase, read noise and filter choice are found to be the strongest sources of system error. I explain the calibration methods for the camera system, covering the problems of pixel-to-angle calibration and of adapting a web camera for scientific work. The camera system is tested in the field to estimate its ability to recover directional reflectance from BRF data. I estimate the error in the system due to the experimental set-up, then explore how the system error changes with different cameras, environmental set-ups and inversions. These experiments show the importance of the dynamic range of the camera and of the input ranges used for the PROSAIL inversion. Evidence that the camera can perform within the specification set in this dissertation for empirical line method (ELM) correction is evaluated. The analysis is concluded by simulating an ELM correction of a scene using various numbers of calibration targets and levels of system error, to find the number of cameras needed for a full-scale implementation.
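
    The empirical line method (ELM) referred to above fits a per-band linear relation between at-sensor signal and known ground-target reflectance; applying the fitted gain and offset converts the whole scene to reflectance. The following is a minimal sketch, assuming two or more calibration targets with known reflectance; all numbers are illustrative:

        import numpy as np

        # known target reflectances (fraction) and the at-sensor digital numbers
        # measured over those targets in one band; values are illustrative only
        reflectance = np.array([0.05, 0.30, 0.60])
        digital_number = np.array([210.0, 1480.0, 2950.0])

        # ELM: reflectance = gain * DN + offset, fitted by least squares per band
        gain, offset = np.polyfit(digital_number, reflectance, 1)

        def elm_correct(scene_band):
            """Convert a band of raw scene DNs to surface reflectance."""
            return gain * scene_band + offset

        print(elm_correct(np.array([500.0, 2000.0])))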

  18. Utilizing ISS Camera Systems for Scientific Analysis of Lightning Characteristics and comparison with ISS-LIS and GLM

    NASA Astrophysics Data System (ADS)

    Schultz, C. J.; Lang, T. J.; Leake, S.; Runco, M.; Blakeslee, R. J.

    2017-12-01

    Video and still frame images from cameras aboard the International Space Station (ISS) are used to inspire, educate, and provide a unique vantage point from low-Earth orbit that is second to none; however, these cameras have overlooked capabilities for contributing to scientific analysis of the Earth and near-space environment. The goal of this project is to study how georeferenced video/images from available ISS camera systems can be useful for scientific analysis, using lightning properties as a demonstration. Camera images from the crew cameras and high-definition video from the Chiba University Meteor Camera were combined with lightning data from the National Lightning Detection Network (NLDN), the ISS Lightning Imaging Sensor (ISS-LIS), the Geostationary Lightning Mapper (GLM) and lightning mapping arrays. These cameras provide significant spatial resolution advantages (~10 times or better) over ISS-LIS and GLM, but with lower temporal resolution. Therefore, they can serve as a complementary analysis tool for studying lightning and thunderstorm processes from space. Lightning sensor data, Visible Infrared Imaging Radiometer Suite (VIIRS) derived city light maps, and other geographic databases were combined with the ISS attitude and position data to reverse-geolocate each image or frame. An open-source Python toolkit has been developed to assist with this effort. Next, the locations and sizes of all flashes in each frame or image were computed and compared with flash characteristics from all available lightning datasets. This allowed for characterization of cloud features that are below the 4-km and 8-km resolution of ISS-LIS and GLM and that may reduce the light that reaches the ISS-LIS or GLM sensor. In the case of video, consecutive frames were overlaid to determine the rate of change of the light escaping cloud top. Characterization of the rate of change in geometry, more generally the radius, of light escaping cloud top was integrated with the NLDN, ISS-LIS and GLM data to understand how the peak rate of change and the peak area of each flash aligned with each lightning system in time. Flash features like leaders could be inferred from the video frames as well. Testing is being done to see whether leader speeds can be accurately calculated under certain circumstances.
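
    The frame-overlay step, tracking how the area of light escaping cloud top changes between consecutive video frames, can be sketched with simple thresholding and differencing. This is only an illustration of the idea, not the project's toolkit; the threshold and the synthetic frames are placeholders:

        import numpy as np

        def flash_area_series(frames, threshold=50):
            """Per-frame area (pixel count) of the lightning-illuminated region."""
            return np.array([(f > threshold).sum() for f in frames])

        def peak_rate_of_change(frames, fps=30.0, threshold=50):
            """Peak growth rate of the illuminated area, in pixels per second."""
            area = flash_area_series(frames, threshold)
            return np.max(np.diff(area)) * fps

        # frames: list of 2-D grayscale arrays from consecutive video frames
        frames = [np.random.default_rng(i).integers(0, 255, (480, 640)) for i in range(5)]
        print(peak_rate_of_change(frames))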

  19. Exploring the Universe with the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A general overview is given of the operations, engineering challenges, and components of the Hubble Space Telescope. Deployment, checkout and servicing in space are discussed. The optical telescope assembly, focal plane scientific instruments, wide field/planetary camera, faint object spectrograph, faint object camera, Goddard high resolution spectrograph, high speed photometer, fine guidance sensors, second generation technology, and support systems and services are reviewed.

  20. Measurement methods and accuracy analysis of Chang'E-5 Panoramic Camera installation parameters

    NASA Astrophysics Data System (ADS)

    Yan, Wei; Ren, Xin; Liu, Jianjun; Tan, Xu; Wang, Wenrui; Chen, Wangli; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    Chang'E-5 (CE-5) is a lunar probe for the third phase of the China Lunar Exploration Project (CLEP), whose main scientific objectives are to implement lunar surface sampling and to return the samples to Earth. To achieve these goals, investigation of the lunar surface topography and geological structure within the sampling area is extremely important. The Panoramic Camera (PCAM) is one of the payloads mounted on the CE-5 lander. It consists of two optical systems installed on a camera rotating platform. Optical images of the sampling area can be obtained by PCAM in the form of two-dimensional images, and a stereo image pair can be formed from the left and right PCAM images. The lunar terrain can then be reconstructed by photogrammetry. Installation parameters of PCAM with respect to the CE-5 lander are critical for the calculation of the exterior orientation elements (EO) of PCAM images, which are used for lunar terrain reconstruction. In this paper, the types of PCAM installation parameters and the coordinate systems involved are defined. Measurement methods combining camera images and optical coordinate observations are studied for this work. Research contents such as the observation program and specific solution methods for the installation parameters are then introduced. Parametric solution accuracy is analyzed according to observations obtained in the PCAM scientific validation experiment, which is used to test the authenticity of the PCAM detection process, ground data processing methods, product quality and so on. Analysis results show that the accuracy of the installation parameters affects the positional accuracy of corresponding image points of PCAM stereo images within 1 pixel. The measurement methods and parameter accuracy studied in this paper therefore meet the needs of engineering and scientific applications. Keywords: Chang'E-5 Mission; Panoramic Camera; Installation Parameters; Total Station; Coordinate Conversion

  1. First NAC Image Obtained in Mercury Orbit

    NASA Image and Video Library

    2017-12-08

    NASA image acquired: March 29, 2011 This is the first image of Mercury taken from orbit with MESSENGER’s Narrow Angle Camera (NAC). MESSENGER’s camera system, the Mercury Dual Imaging System (MDIS), has two cameras: the Narrow Angle Camera and the Wide Angle Camera (WAC). Comparison of this image with MESSENGER’s first WAC image of the same region shows the substantial difference between the fields of view of the two cameras. At 1.5°, the field of view of the NAC is seven times smaller than the 10.5° field of view of the WAC. This image was taken using MDIS’s pivot. MDIS is mounted on a pivoting platform and is the only instrument in MESSENGER’s payload capable of movement independent of the spacecraft. The other instruments are fixed in place, and most point down the spacecraft’s boresight at all times, relying solely on the guidance and control system for pointing. The 90° range of motion of the pivot gives MDIS a much-needed extra degree of freedom, allowing MDIS to image the planet’s surface at times when spacecraft geometry would normally prevent it from doing so. The pivot also gives MDIS additional imaging opportunities by allowing it to view more of the surface than that at which the boresight-aligned instruments are pointed at any given time. On March 17, 2011 (March 18, 2011, UTC), MESSENGER became the first spacecraft ever to orbit the planet Mercury. The mission is currently in the commissioning phase, during which spacecraft and instrument performance are verified through a series of specially designed checkout activities. In the course of the one-year primary mission, the spacecraft's seven scientific instruments and radio science investigation will unravel the history and evolution of the Solar System's innermost planet. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington

  2. Multiple-Agent Air/Ground Autonomous Exploration Systems

    NASA Technical Reports Server (NTRS)

    Fink, Wolfgang; Chao, Tien-Hsin; Tarbell, Mark; Dohm, James M.

    2007-01-01

    Autonomous systems of multiple-agent air/ground robotic units for exploration of the surfaces of remote planets are undergoing development. Modified versions of these systems could be used on Earth to perform tasks in environments dangerous or inaccessible to humans: examples of tasks could include scientific exploration of remote regions of Antarctica, removal of land mines, cleanup of hazardous chemicals, and military reconnaissance. A basic system according to this concept (see figure) would include a unit, suspended by a balloon or a blimp, that would be in radio communication with multiple robotic ground vehicles (rovers) equipped with video cameras and possibly other sensors for scientific exploration. The airborne unit would be free-floating, controlled by thrusters, or tethered either to one of the rovers or to a stationary object in or on the ground. Each rover would contain a semi-autonomous control system for maneuvering and would function under the supervision of a control system in the airborne unit. The rover maneuvering control system would utilize imagery from the onboard camera to navigate around obstacles. Avoidance of obstacles would also be aided by readout from an onboard (e.g., ultrasonic) sensor. Together, the rover and airborne control systems would constitute an overarching closed-loop control system to coordinate scientific exploration by the rovers.

  3. Evolution of the SOFIA tracking control system

    NASA Astrophysics Data System (ADS)

    Fiebig, Norbert; Jakob, Holger; Pfüller, Enrico; Röser, Hans-Peter; Wiedemann, Manuel; Wolf, Jürgen

    2014-07-01

    The airborne observatory SOFIA (Stratospheric Observatory for Infrared Astronomy) is undergoing a modernization of its tracking system. This includes new, highly sensitive tracking cameras, control computers, filter wheels and other equipment, as well as a major redesign of the control software. The experiences along the migration path from an aged 19″ VMEbus-based control system to the application of modern industrial PCs, from the VxWorks real-time operating system to embedded Linux, and to a state-of-the-art software architecture are presented. Further, a concept is presented for operating the new camera as a scientific instrument as well, in parallel with tracking.

  4. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2015-08-05

    This animation shows images of the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope, and the Earth - one million miles away. Credits: NASA/NOAA A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA’s Earth Polychromatic Imaging Camera (EPIC), a four-megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA). Read more: www.nasa.gov/feature/goddard/from-a-million-miles-away-na...

  5. From a Million Miles Away, NASA Camera Shows Moon Crossing Face of Earth

    NASA Image and Video Library

    2017-12-08

    This animation still image shows the far side of the moon, illuminated by the sun, as it crosses between the DSCOVR spacecraft's Earth Polychromatic Imaging Camera (EPIC) and telescope, and the Earth - one million miles away. Credits: NASA/NOAA A NASA camera aboard the Deep Space Climate Observatory (DSCOVR) satellite captured a unique view of the moon as it moved in front of the sunlit side of Earth last month. The series of test images shows the fully illuminated “dark side” of the moon that is never visible from Earth. The images were captured by NASA’s Earth Polychromatic Imaging Camera (EPIC), a four-megapixel CCD camera and telescope on the DSCOVR satellite orbiting 1 million miles from Earth. From its position between the sun and Earth, DSCOVR conducts its primary mission of real-time solar wind monitoring for the National Oceanic and Atmospheric Administration (NOAA). Read more: www.nasa.gov/feature/goddard/from-a-million-miles-away-na...

  6. Payload topography camera of Chang'e-3

    NASA Astrophysics Data System (ADS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-11-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.

  7. Engineer's drawing of Skylab 4 Far Ultraviolet Electronographic camera

    NASA Image and Video Library

    1973-11-19

    S73-36910 (November 1973) --- An engineer's drawing of the Skylab 4 Far Ultraviolet Electronographic camera (Experiment S201). Arrows point to various features and components of the camera. As the Comet Kohoutek streams through space at speeds of 100,000 miles per hour, the Skylab 4 crewmen will use the S201 UV camera to photograph features of the comet not visible from the Earth's surface. While the comet is some distance from the sun, the camera will be pointed through the scientific airlock in the wall of the Skylab space station Orbital Workshop (OWS). By using a movable mirror system built for the Ultraviolet Stellar Astronomy (S019) Experiment and rotating the space station, the S201 camera will be able to photograph the comet around the side of the space station. Photo credit: NASA

  8. Prototyping a 10 Gigabit-Ethernet Event-Builder for the CTA Camera Server

    NASA Astrophysics Data System (ADS)

    Hoffmann, Dirk; Houles, Julien

    2012-12-01

    While the Cherenkov Telescope Array will end its Preparatory Phase in 2012 or 2013 with the publication of a Technical Design Report, our lab has undertaken, within the French CTA community, the design and prototyping of a camera server: a PC-architecture computer used as a switchboard, assigned to each of a hundred telescopes to handle the maximum amount of scientific data recorded by that telescope. Our work aims at a hardware and software system that acquires the scientific raw data at optimal speed. We have evaluated the maximum performance that can be obtained by choosing standard (COTS) hardware and software (Linux) in conjunction with a 10 Gb/s switch.

  9. A multi-criteria approach to camera motion design for volume data animation.

    PubMed

    Hsu, Wei-Hsien; Zhang, Yubo; Ma, Kwan-Liu

    2013-12-01

    We present an integrated camera motion design and path generation system for building volume data animations. Creating animations is an essential task in presenting complex scientific visualizations. Existing visualization systems use an established animation function based on keyframes selected by the user. This approach is limited in providing the optimal in-between views of the data. Alternatively, camera motion planning in computer graphics and virtual reality is frequently focused on collision-free movement in a virtual walkthrough. For semi-transparent, fuzzy, or blobby volume data the collision-free objective becomes insufficient. Here, we provide a set of essential criteria focused on computing camera paths to establish effective animations of volume data. Our dynamic multi-criteria solver coupled with a force-directed routing algorithm enables rapid generation of camera paths. Once users review the resulting animation and evaluate the camera motion, they are able to determine how each criterion impacts path generation. In this paper, we demonstrate how incorporating this animation approach with an interactive volume visualization system reduces the effort in creating context-aware and coherent animations. This frees the user to focus on visualization tasks with the objective of gaining additional insight from the volume data.

  10. Removal of instrument signature from Mariner 9 television images of Mars

    NASA Technical Reports Server (NTRS)

    Green, W. B.; Jepsen, P. L.; Kreznar, J. E.; Ruiz, R. M.; Schwartz, A. A.; Seidman, J. B.

    1975-01-01

    The Mariner 9 spacecraft was inserted into orbit around Mars in November 1971. The two vidicon camera systems returned over 7300 digital images during orbital operations. The high volume of returned data and the scientific objectives of the Television Experiment made development of automated digital techniques for the removal of camera system-induced distortions from each returned image necessary. This paper describes the algorithms used to remove geometric and photometric distortions from the returned imagery. Enhancement processing of the final photographic products is also described.

  11. Toolkit for testing scientific CCD cameras

    NASA Astrophysics Data System (ADS)

    Uzycki, Janusz; Mankiewicz, Lech; Molak, Marcin; Wrochna, Grzegorz

    2006-03-01

    The CCD Toolkit (1) is a software tool for testing CCD cameras which allows one to measure important characteristics of a camera such as readout noise, total gain, dark current, 'hot' pixels, useful area, etc. The application performs a statistical analysis of images saved in files in the FITS format, commonly used in astronomy. The graphical interface is based on the ROOT package, which offers high functionality and flexibility. The program was developed in a way that ensures future compatibility with different operating systems: Windows and Linux. The CCD Toolkit was created for the "Pi of the Sky" project collaboration (2).
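
    The kind of measurement such a toolkit automates can be reproduced in a few lines: read noise from the spread of the difference of two bias frames, dark current from the slope of mean dark signal versus exposure time. A minimal sketch, assuming astropy is available; the FITS file names are hypothetical:

        import numpy as np
        from astropy.io import fits

        def read_noise_adu(bias_a_path, bias_b_path):
            """Read noise in ADU: std of the difference of two bias frames / sqrt(2)."""
            a = fits.getdata(bias_a_path).astype(float)
            b = fits.getdata(bias_b_path).astype(float)
            return (a - b).std() / np.sqrt(2.0)

        def dark_current_adu_per_s(dark_paths, exposures_s):
            """Dark current in ADU/pix/s: slope of mean dark signal vs exposure time."""
            means = [fits.getdata(p).astype(float).mean() for p in dark_paths]
            slope, _ = np.polyfit(exposures_s, means, 1)
            return slope

        # usage (hypothetical files):
        # print(read_noise_adu('bias1.fits', 'bias2.fits'))
        # print(dark_current_adu_per_s(['d10.fits', 'd60.fits', 'd120.fits'], [10, 60, 120]))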

  12. Indirectly Funded Research and Exploratory Development at the Applied Physics Laboratory, Fiscal Year 1978.

    DTIC Science & Technology

    1979-12-01

    used to reduce costs). The orbital data from the prototype ion composition telescope will not only be of great scientific interest - providing for... active device whose transfer function may be almost arbitrarily defined, and cost and production trends permit contemplation of networks containing... developing solid-state television camera systems based on CCD imagers. RCA hopes to produce a $500 color camera for consumer use. Fairchild and Texas

  13. Flow visualization by mobile phone cameras

    NASA Astrophysics Data System (ADS)

    Cierpka, Christian; Hain, Rainer; Buchmann, Nicolas A.

    2016-06-01

    Mobile smart phones have completely changed people's communication within the last ten years. However, these devices not only offer communication through different channels but also include features and applications for fun and recreation. In this respect, mobile phone cameras now offer relatively fast (up to 240 Hz) modes to capture high-speed videos of sport events or other fast processes. This article therefore explores the possibility of making use of this development, and of the widespread availability of these cameras, for velocity measurements in industrial or technical applications and in fluid dynamics education in high schools and at universities. The requirements for a simplistic PIV (particle image velocimetry) system are discussed. A model experiment of a free water jet was used to prove the concept, shed some light on the achievable quality and determine bottlenecks by comparing the results obtained with a mobile phone camera with data taken by a high-speed camera suited for scientific experiments.
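
    The core of any PIV evaluation, including a simplistic phone-camera system like the one discussed, is locating the cross-correlation peak between interrogation windows of two consecutive frames; the peak position gives the mean particle displacement. A minimal FFT-based sketch (the window size and synthetic test pattern are illustrative):

        import numpy as np

        def piv_displacement(win_a, win_b):
            """Mean particle displacement (dy, dx) between two interrogation
            windows via the FFT-based cross-correlation peak."""
            a = win_a - win_a.mean()
            b = win_b - win_b.mean()
            corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # wrap peak coordinates to signed displacements
            return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

        # synthetic test: shift a random particle pattern by (3, 5) pixels
        rng = np.random.default_rng(1)
        frame = rng.random((64, 64))
        shifted = np.roll(frame, (3, 5), axis=(0, 1))
        print(piv_displacement(frame, shifted))  # -> (3, 5)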

  14. Space Telescope maintenance and refurbishment

    NASA Technical Reports Server (NTRS)

    Trucks, H. F.

    1983-01-01

    The Space Telescope (ST) represents a new concept regarding spaceborne astronomical observatories. Maintenance crews will be brought to the orbital worksite to make repairs and replace scientific instruments. For major overhauls the telescope can be temporarily returned to earth with the aid of the Shuttle. It will, thus, be possible to conduct astronomical studies with the ST for two decades or more. The five first-generation scientific instruments used with the ST include a wide field/planetary camera, a faint object camera, a faint object spectrograph, a high resolution spectrograph, and a high speed photometer. Attention is given to the optical telescope assembly, the support systems module, aspects of mission and science operations, unscheduled maintenance, contingency orbital maintenance, planned on-orbit maintenance, ground maintenance, ground refurbishment, and ground logistics.

  15. Measurement of the timing behaviour of off-the-shelf cameras

    NASA Astrophysics Data System (ADS)

    Schatz, Volker

    2017-04-01

    This paper presents a measurement method suitable for investigating the timing properties of cameras. A single light source illuminates the camera detector starting with a varying defined delay after the camera trigger. Pixels from the recorded camera frames are summed up and normalised, and the resulting function is indicative of the overlap between illumination and exposure. This allows one to infer the trigger delay and the exposure time with sub-microsecond accuracy. The method is therefore of interest when off-the-shelf cameras are used in reactive systems or synchronised with other cameras. It can supplement radiometric and geometric calibration methods for cameras in scientific use. A closer look at the measurement results reveals deviations from the ideal camera behaviour of constant sensitivity limited to the exposure interval. One of the industrial cameras investigated retains a small sensitivity long after the end of the nominal exposure interval. All three investigated cameras show non-linear variations of sensitivity of O(10⁻³) to O(10⁻²) during exposure.
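
    The measurement principle lends itself to a short simulation: the summed, normalised signal as a function of illumination delay is the overlap of the light pulse with the exposure window, so the trigger delay and exposure time can be read off the edges of that function. A sketch under idealised assumptions (rectangular pulse and shutter; all timing values are placeholders):

        import numpy as np

        def overlap_signal(delay, trigger_delay=2e-4, exposure=1e-3, pulse=5e-5):
            """Normalised signal: overlap of a light pulse starting at `delay`
            with the exposure window [trigger_delay, trigger_delay + exposure]."""
            start = max(delay, trigger_delay)
            end = min(delay + pulse, trigger_delay + exposure)
            return max(0.0, end - start) / pulse

        delays = np.linspace(0, 2e-3, 2001)
        signal = np.array([overlap_signal(d) for d in delays])

        # the rising edge locates the trigger delay; the falling edge adds the exposure
        rising = delays[np.argmax(signal > 0.5)]
        falling = delays[len(signal) - 1 - np.argmax(signal[::-1] > 0.5)]
        print(f"trigger delay ~ {rising:.6f} s, exposure ~ {falling - rising:.6f} s")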

  16. Laying the foundation to use Raspberry Pi 3 V2 camera module imagery for scientific and engineering purposes

    NASA Astrophysics Data System (ADS)

    Pagnutti, Mary; Ryan, Robert E.; Cazenavette, George; Gold, Maxwell; Harlan, Ryan; Leggett, Edward; Pagnutti, James

    2017-01-01

    A comprehensive radiometric characterization of raw-data format imagery acquired with the Raspberry Pi 3 and V2.1 camera module is presented. The Raspberry Pi is a high-performance single-board computer designed to educate and solve real-world problems. This small computer supports a camera module that uses a Sony IMX219 8 megapixel CMOS sensor. This paper shows that scientific and engineering-grade imagery can be produced with the Raspberry Pi 3 and its V2.1 camera module. Raw imagery is shown to be linear with exposure and gain (ISO), which is essential for scientific and engineering applications. Dark frame, noise, and exposure stability assessments along with flat fielding results, spectral response measurements, and absolute radiometric calibration results are described. This low-cost imaging sensor, when calibrated to produce scientific quality data, can be used in computer vision, biophotonics, remote sensing, astronomy, high dynamic range imaging, and security applications, to name a few.
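
    Accessing the raw (pre-processed Bayer) data on which such a characterization relies is possible through the picamera library, which can append the raw sensor dump to a JPEG capture. A minimal sketch, assuming a Raspberry Pi with picamera installed (the fixed ISO value is an illustrative choice):

        import picamera
        import picamera.array

        # capture one frame and recover the raw Bayer data appended to the JPEG
        with picamera.PiCamera() as camera:
            camera.iso = 100                 # fix the gain so frames are comparable
            with picamera.array.PiBayerArray(camera) as output:
                camera.capture(output, 'jpeg', bayer=True)
                raw = output.array           # raw Bayer data as a numpy array
                rgb = output.demosaic()      # simple demosaic for inspection

        print(raw.shape, raw.dtype, raw.mean())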

  17. Automated geo/ortho registered aerial imagery product generation using the mapping system interface card (MSIC)

    NASA Astrophysics Data System (ADS)

    Bratcher, Tim; Kroutil, Robert; Lanouette, André; Lewis, Paul E.; Miller, David; Shen, Sylvia; Thomas, Mark

    2013-05-01

    The development concept paper for the MSIC system was first introduced in August 2012 by these authors. This paper describes the final assembly, testing, and commercial availability of the Mapping System Interface Card (MSIC). The 2.3 kg MSIC is a self-contained, compact, variable-configuration, low-cost real-time precision metadata annotator with embedded INS/GPS designed specifically for use in small aircraft. The MSIC was specifically designed to convert commercial-off-the-shelf (COTS) digital cameras and imaging/non-imaging spectrometers with Camera Link standard data streams into mapping systems for airborne emergency response and scientific remote sensing applications. COTS digital cameras and imaging/non-imaging spectrometers covering the ultraviolet through long-wave infrared wavelengths are important tools now readily available and affordable for use by emergency responders and scientists. The MSIC will significantly enhance the capability of emergency responders and scientists by providing a direct transformation of these important COTS sensor tools into low-cost real-time aerial mapping systems.

  18. A mobile device-based imaging spectrometer for environmental monitoring by attaching a lightweight small module to a commercial digital camera.

    PubMed

    Cai, Fuhong; Lu, Wen; Shi, Wuxiong; He, Sailing

    2017-11-15

    Spatially explicit data are essential for remote sensing of ecological phenomena, and recent innovations in mobile device platforms have led to an upsurge in on-site rapid detection. For instance, CMOS chips in smart phones and digital cameras serve as excellent sensors for scientific research. In this paper, a mobile device-based imaging spectrometer module (weighing about 99 g) is developed and mounted on a single-lens reflex camera. Utilizing this lightweight module, as well as commonly used photographic equipment, we demonstrate its utility through a series of on-site multispectral imaging experiments, including ocean (or lake) water-color sensing and plant reflectance measurement. Based on the experiments we obtain 3D spectral image cubes, which can be further analyzed for environmental monitoring. Moreover, our system can be applied to many kinds of cameras, e.g., aerial and underwater cameras. Therefore, any camera can be upgraded to an imaging spectrometer with the help of our miniaturized module. We believe it has the potential to become a versatile tool for on-site investigation in many applications.

  19. Using the OOI Cabled Array HD Camera to Explore Geophysical and Oceanographic Problems at Axial Seamount

    NASA Astrophysics Data System (ADS)

    Crone, T. J.; Knuth, F.; Marburg, A.

    2016-12-01

    A broad array of Earth science problems can be investigated using high-definition video imagery from the seafloor, ranging from those that are geological and geophysical in nature, to those that are biological and water-column related. A high-definition video camera was installed as part of the Ocean Observatory Initiative's core instrument suite on the Cabled Array, a real-time fiber optic data and power system that stretches from the Oregon Coast to Axial Seamount on the Juan de Fuca Ridge. This camera runs a 14-minute pan-tilt-zoom routine 8 times per day, focusing on locations of scientific interest on and near the Mushroom vent in the ASHES hydrothermal field inside the Axial caldera. The system produces 13 GB of lossless HD video every 3 hours, and at the time of this writing it has generated 2100 recordings totaling 28.5 TB since it began streaming data into the OOI archive in August of 2015. Because of the large size of this dataset, downloading the entirety of the video for long timescale investigations is not practical. We are developing a set of user-side tools for downloading single frames and frame ranges from the OOI HD camera raw data archive to aid users interested in using these data for their research. We use these tools to download about one year's worth of partial frame sets to investigate several questions regarding the hydrothermal system at ASHES, including the variability of bacterial "floc" in the water-column, and changes in high temperature fluid fluxes using optical flow techniques. We show that while these user-side tools can facilitate rudimentary scientific investigations using the HD camera data, a server-side computing environment that allows users to explore this dataset without downloading any raw video will be required for more advanced investigations to flourish.
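
    Pulling single frames or frame ranges out of archived video, as the user-side tools described here do, can be sketched with OpenCV. This is an illustration rather than the project's actual tooling; the file name stands in for a recording retrieved from the OOI raw data archive:

        import cv2

        def extract_frames(video_path, start, stop, step=1):
            """Yield (index, frame) for frames start..stop-1, sampling every `step`."""
            cap = cv2.VideoCapture(video_path)
            try:
                for idx in range(start, stop, step):
                    cap.set(cv2.CAP_PROP_POS_FRAMES, idx)   # seek to the requested frame
                    ok, frame = cap.read()
                    if not ok:
                        break
                    yield idx, frame
            finally:
                cap.release()

        # usage: save every 100th of the first 1000 frames of one recording
        for idx, frame in extract_frames('CAMHD_recording.mov', 0, 1000, 100):
            cv2.imwrite(f'frame_{idx:06d}.png', frame)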

  20. View From Camera Not Used During Curiosity's First Six Months on Mars

    NASA Image and Video Library

    2017-12-08

    This view of Curiosity's left-front and left-center wheels and of marks made by wheels on the ground in the "Yellowknife Bay" area comes from one of six cameras used on Mars for the first time more than six months after the rover landed. The left Navigation Camera (Navcam) linked to Curiosity's B-side computer took this image during the 223rd Martian day, or sol, of Curiosity's work on Mars (March 22, 2013). The wheels are 20 inches (50 centimeters) in diameter. Curiosity carries a pair of main computers, redundant to each other, in order to have a backup available if one fails. Each of the computers, A-side and B-side, also has other redundant subsystems linked to just that computer. Curiosity operated on its A-side from before the August 2012 landing until Feb. 28, when engineers commanded a switch to the B-side in response to a memory glitch on the A-side. One set of activities after switching to the B-side computer has been to check the six engineering cameras that are hard-linked to that computer. The rover's science instruments, including five science cameras, can each be operated by either the A-side or B-side computer, whichever is active. However, each of Curiosity's 12 engineering cameras is linked to just one of the computers. The engineering cameras are the Navigation Camera (Navcam), the Front Hazard-Avoidance Camera (Front Hazcam) and Rear Hazard-Avoidance Camera (Rear Hazcam). Each of those three named cameras has four cameras as part of it: two stereo pairs of cameras, with one pair linked to each computer. Only the pairs linked to the active computer can be used, and the A-side computer was active from before landing, in August, until Feb. 28. All six of the B-side engineering cameras have been used during March 2013 and checked out OK. Image Credit: NASA/JPL-Caltech

  1. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    PubMed

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  2. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy

    NASA Astrophysics Data System (ADS)

    Barabas, Federico M.; Masullo, Luciano A.; Stefani, Fernando D.

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  3. Sensor equipment of the German earth scientific airplane program

    NASA Technical Reports Server (NTRS)

    Seige, P.

    1975-01-01

    The German airplane program for earth scientific research supports the work of a vast staff of earth scientists from universities and federal agencies. Due to their fields of interest, which are in oceanography, hydrology, geology, ecology, and forestry, five test areas were selected which are spread all over Germany. The sensor package, which was designed in accordance with the requirements of this group of scientists, will be installed in a DO 28 D2 type airplane. The sensor equipment consists of a series of 70-mm cameras having different film/filter combinations, a photogrammetric camera, an infrared radiometer, an 11-channel multispectral line scanner, a LANDSAT-compatible radiometer, and a complex avionic system. Along with the airplane, a truck will be equipped with a set of radiometers and other sensor devices for extensive ground-truth measurement; this also includes a cherry picker.

  4. Active learning in camera calibration through vision measurement application

    NASA Astrophysics Data System (ADS)

    Li, Xiaoqin; Guo, Jierong; Wang, Xianchun; Liu, Changqing; Cao, Binfang

    2017-08-01

    Since cameras are increasingly used in scientific applications as well as in applications requiring precise visual information, effective calibration of such cameras is becoming more important. There are many reasons why measurements of objects are not accurate. The largest is lens distortion. Another detrimental influence on evaluation accuracy is caused by perspective distortions in the image, which occur whenever the camera cannot be mounted perpendicular to the objects being measured. Overall, it is very important for students to understand how to correct lens distortions, that is, camera calibration. If the camera is calibrated, images can be rectified, and it is then possible to obtain undistorted measurements in world coordinates. This paper presents how students can develop a sense of active learning for the mathematical camera model alongside the theoretical scientific basics. The authors present theoretical and practical lectures with the goal of deepening students' understanding of the mathematical models of area scan cameras and of building a practical vision measurement process by themselves.
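
    A classroom exercise of the kind described typically follows the standard chessboard workflow: detect corners in several views, estimate the camera matrix and distortion coefficients, then undistort. A minimal OpenCV sketch; the 9x6 board size and the image file names are assumptions:

        import cv2
        import numpy as np
        import glob

        # 3-D coordinates of the inner chessboard corners (z = 0 plane), 9x6 board assumed
        pattern = (9, 6)
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

        obj_points, img_points = [], []
        for path in glob.glob('calib_*.png'):          # hypothetical calibration images
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            if gray is None:
                continue
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_points.append(objp)
                img_points.append(corners)

        # estimate the camera matrix K and the lens distortion coefficients
        ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None)

        # rectify an image: undistorted measurements become possible afterwards
        img = cv2.imread('calib_0.png')
        cv2.imwrite('undistorted.png', cv2.undistort(img, K, dist))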

  5. The LST scientific instruments

    NASA Technical Reports Server (NTRS)

    Levin, G. M.

    1975-01-01

    Seven scientific instruments are presently being studied for use with the Large Space Telescope (LST). These instruments are the F/24 Field Camera, the F/48-F/96 Planetary Camera, the High Resolution Spectrograph, the Faint Object Spectrograph, the Infrared Photometer, and the Astrometer. These instruments are being designed as facility instruments to be replaceable during the life of the Observatory.

  6. Extreme Faint Flux Imaging with an EMCCD

    NASA Astrophysics Data System (ADS)

    Daigle, Olivier; Carignan, Claude; Gach, Jean-Luc; Guillaume, Christian; Lessard, Simon; Fortin, Charles-Anthony; Blais-Ouellette, Sébastien

    2009-08-01

    An EMCCD camera, designed from the ground up for extreme faint-flux imaging, is presented. CCCP, the CCD Controller for Counting Photons, has been integrated with a CCD97 EMCCD from e2v technologies into a scientific camera at the Laboratoire d’Astrophysique Expérimentale (LAE), Université de Montréal. This new camera achieves sub-electron readout noise and very low clock-induced charge (CIC) levels, which are mandatory for extreme faint-flux imaging. It has been characterized in the laboratory and used on the Observatoire du Mont Mégantic 1.6 m telescope. The performance of the camera is discussed, and experimental data together with the first scientific data are presented.

  7. Using DSLR cameras in digital holography

    NASA Astrophysics Data System (ADS)

    Hincapié-Zuluaga, Diego; Herrera-Ramírez, Jorge; García-Sucerquia, Jorge

    2017-08-01

    In Digital Holography (DH), the size of the two-dimensional image sensor that records the digital hologram plays a key role in the performance of this imaging technique; the larger the camera sensor, the better the quality of the final reconstructed image. Large-format scientific cameras are offered on the market, but their cost and availability limit their use as a first option when implementing DH. Nowadays, DSLR cameras provide an easy-access alternative that is worthwhile to explore. DSLR cameras are a widely available commercial option that, in comparison with traditional scientific cameras, offers a much lower cost per effective pixel over a large sensing area. However, in DSLR cameras, with their RGB pixel distribution, the sampling of information differs from the sampling in the monochrome cameras usually employed in DH. This fact has implications for their performance. In this work, we discuss why DSLR cameras are not extensively used for DH, taking into account the object-replication problem reported by different authors. Simulations of DH using monochrome and DSLR cameras are presented, and a theoretical explanation of the replication problem based on Fourier theory is also given. Experimental results of a DH implementation using a DSLR camera show the replication problem.
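    The replication effect is straightforward to reproduce numerically: keeping only one Bayer color plane multiplies the hologram by a sparse sampling mask, which convolves its spectrum with the mask's harmonics and creates replicas of the object term. A minimal sketch of such a simulation (an illustration of the mechanism, not the authors' code):

        import numpy as np

        # Synthetic off-axis hologram: fringes from a tilted reference wave.
        N = 512
        x, y = np.meshgrid(np.arange(N), np.arange(N))
        holo = 1 + np.cos(2 * np.pi * (0.12 * x + 0.07 * y))

        # Keep only the red sites of a Bayer mosaic (one pixel per 2x2 cell).
        mask = np.zeros((N, N))
        mask[0::2, 0::2] = 1.0

        # The spectrum of the masked hologram shows copies of the object term
        # displaced by half the monochrome sampling frequency -- the
        # replication discussed above.
        spec_full = np.abs(np.fft.fftshift(np.fft.fft2(holo)))
        spec_bayer = np.abs(np.fft.fftshift(np.fft.fft2(holo * mask)))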

  8. QUANTITATIVE DETECTION OF ENVIRONMENTALLY IMPORTANT DYES USING DIODE LASER/FIBER-OPTIC RAMAN

    EPA Science Inventory

    A compact diode laser/fiber-optic Raman spectrometer is used for quantitative detection of environmentally important dyes. The system is based on diode laser excitation at 782 nm, fiber-optic probe technology, an imaging spectrometer, and a state-of-the-art scientific CCD camera. ...

  9. ePix100 camera: Use and applications at LCLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carini, G. A., E-mail: carini@slac.stanford.edu; Alonso-Mori, R.; Blaj, G.

    2016-07-27

    The ePix100 x-ray camera is a new system designed and built at SLAC for experiments at the Linac Coherent Light Source (LCLS). The camera is the first member of a family of detectors built around a single hardware and software platform, supporting a variety of front-end chips. With a readout speed of 120 Hz, matching the LCLS repetition rate, a noise lower than 80 e- rms, and pixels of 50 µm × 50 µm, this camera offers a viable alternative to fast-readout, direct-conversion scientific CCDs in imaging mode. The detector, designed for applications such as X-ray Photon Correlation Spectroscopy (XPCS) and wavelength-dispersive X-ray Emission Spectroscopy (XES) in the energy range from 2 to 10 keV and above, comprises up to 0.5 Mpixels in a very compact form factor. In this paper, we report the performance of the camera during its first use at LCLS.

  10. Colors of active regions on comet 67P

    NASA Astrophysics Data System (ADS)

    Oklay, N.; Vincent, J.-B.; Sierks, H.; Besse, S.; Fornasier, S.; Barucci, M. A.; Lara, L.; Scholten, F.; Preusker, F.; Lazzarin, M.; Pajola, M.; La Forgia, F.

    2015-10-01

    The OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) scientific imager (Keller et al. 2007) has been successfully delivering images of comet 67P/Churyumov-Gerasimenko from both its wide angle camera (WAC) and narrow angle camera (NAC) since the arrival of ESA's Rosetta spacecraft at the comet. Both cameras are equipped with filters covering the wavelength range from about 200 nm to 1000 nm. The comet nucleus has been mapped with different combinations of the filters at resolutions up to 15 cm/px. Besides determining the surface morphology in great detail (Thomas et al. 2015), such high-resolution images provided us a means to unambiguously link some activity in the coma to a series of pits on the nucleus surface (Vincent et al. 2015).

  11. Application of low-noise CID imagers in scientific instrumentation cameras

    NASA Astrophysics Data System (ADS)

    Carbone, Joseph; Hutton, J.; Arnold, Frank S.; Zarnowski, Jeffrey J.; Vangorden, Steven; Pilon, Michael J.; Wadsworth, Mark V.

    1991-07-01

    CIDTEC has developed a PC-based instrumentation camera incorporating a preamplifier-per-row CID imager and a microprocessor/LCA camera controller. The camera takes advantage of CID X-Y addressability to randomly read individual pixels and potentially overlapping pixel subsets in true nondestructive (NDRO) as well as destructive readout modes. Using an oxy-nitride-fabricated CID and the NDRO readout technique, pixel full-well and noise levels of approximately 1×10^6 and 40 electrons, respectively, were measured. Data taken from test structures indicate that noise levels (which appear to be 1/f limited) can be reduced by a factor of two by eliminating the nitride under the preamplifier gate. Due to its software programmability, versatile readout capabilities, wide dynamic range, and extended UV/IR capability, this camera appears to be ideally suited for use in spectroscopy and other scientific applications.

  12. Multiple-camera tracking: UK government requirements

    NASA Astrophysics Data System (ADS)

    Hosmer, Paul

    2007-10-01

    The Imagery Library for Intelligent Detection Systems (i-LIDS) is the UK government's new standard for Video Based Detection Systems (VBDS). The standard was launched in November 2006 and evaluations against it began in July 2007. With the first four i-LIDS scenarios completed, the Home Office Scientific Development Branch (HOSDB) is looking toward the future of intelligent vision in the security surveillance market by adding a fifth scenario to the standard. The fifth i-LIDS scenario will concentrate on the development, testing and evaluation of systems for the tracking of people across multiple cameras. HOSDB and the Centre for the Protection of National Infrastructure (CPNI) identified a requirement to track targets across a network of CCTV cameras using both live and post-event imagery. The Detection and Vision Systems group at HOSDB was asked to determine the current state of the market and develop an in-depth Operational Requirement (OR) based on government end-user requirements. Using this OR, the i-LIDS team will develop a full i-LIDS scenario to aid the machine vision community in its development of multi-camera tracking systems. By defining a requirement for multi-camera tracking and building this into the i-LIDS standard, the UK government will provide a widely available tool that developers can use to help them turn theory and conceptual demonstrators into front-line applications. This paper will briefly describe the i-LIDS project and then detail the work conducted in building the new tracking aspect of the standard.

  13. Time-lapse cinemicroscopy.

    PubMed

    Riddle, P N

    1990-01-01

    Cinematography commenced as a scientific technique, used as a system for "slowing down" observed movement. Marey in 1888 (1) constructed, following a number of other ideas, a "Chambre Chronophotographique," which had practically all the elements of the modern cine camera. With this he made serial photographs (not transparencies) of various biological phenomena (2).

  14. Virtual Interactive Classroom: A New Technology for Distance Learning Developed

    NASA Technical Reports Server (NTRS)

    York, David W.; Babula, Maria

    1999-01-01

    The Virtual Interactive Classroom (VIC) allows Internet users, specifically students, to remotely control and access data from scientific equipment. This is a significant advantage for school systems that cannot afford experimental equipment but have Internet access and are seeking to improve science and math scores with current resources. A VIC Development Lab was established at Lewis to demonstrate that scientific equipment can be controlled by remote users over the Internet. Current projects include a wind tunnel, a room camera, a science table, and a microscope.

  15. Observation sequences and onboard data processing of Planet-C

    NASA Astrophysics Data System (ADS)

    Suzuki, M.; Imamura, T.; Nakamura, M.; Ishi, N.; Ueno, M.; Hihara, H.; Abe, T.; Yamada, T.

    Planet-C, or VCO (Venus Climate Orbiter), will carry 5 cameras, IR1 (IR 1-micrometer camera), IR2 (IR 2-micrometer camera), UVI (UV Imager), LIR (long-IR camera), and LAC (Lightning and Airglow Camera), operating in the UV-IR region to investigate the atmospheric dynamics of Venus. During the 30-hr orbit, designed to quasi-synchronize with the super-rotation of the Venus atmosphere, 3 groups of scientific observations will be carried out: (i) image acquisition by 4 cameras (IR1, IR2, UVI, LIR) for 20 min in every 2 hrs; (ii) LAC operation, only when VCO is within the Venus shadow; and (iii) radio occultation. These observation sequences will define the scientific outputs of the VCO program, but the sequences must be reconciled with command, telemetry downlink, and thermal/power conditions. To maximize the science data downlink, the data must be well compressed, and the compression efficiency and image quality have significant scientific importance in the VCO program. Images from the 4 cameras (IR1, IR2, and UVI: 1K×1K; LIR: 240×240) will be compressed using the JPEG2000 (J2K) standard. J2K was selected because it (a) produces no block noise, (b) is efficient, (c) supports both reversible and irreversible compression, (d) is patent/royalty free, and (e) is already implemented in academic and commercial software, ICs, and ASIC logic designs. Data compression efficiencies of J2K are about 0.3 (reversible) and 0.1 to 0.01 (irreversible). The DE (Digital Electronics) unit, which controls the 4 cameras and handles onboard data processing and compression, is in the concept design stage. It is concluded that the J2K data compression logic circuits using space …
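    The reversible/irreversible trade-off cited above can be explored with any JPEG2000 codec. A rough sketch using Pillow's OpenJPEG-backed encoder (purely illustrative; the synthetic frame and the 1:20 rate target are placeholders, and flight hardware uses dedicated logic rather than this API):

        import io
        import numpy as np
        from PIL import Image

        # Smooth synthetic 1K x 1K 8-bit frame standing in for an IR1/IR2/UVI image.
        frame = (np.outer(np.hanning(1024), np.hanning(1024)) * 255).astype(np.uint8)
        img = Image.fromarray(frame)

        def jp2_bytes(image, **kwargs):
            buf = io.BytesIO()
            image.save(buf, format="JPEG2000", **kwargs)
            return buf.tell()

        raw = frame.nbytes
        lossless = jp2_bytes(img, irreversible=False)  # reversible 5/3 wavelet
        lossy = jp2_bytes(img, irreversible=True,      # irreversible 9/7 wavelet
                          quality_mode="rates", quality_layers=[20])

        print("reversible ratio:", lossless / raw)
        print("irreversible ratio:", lossy / raw)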

  16. Quantitative evaluation of the accuracy and variance of individual pixels in a scientific CMOS (sCMOS) camera for computational imaging

    NASA Astrophysics Data System (ADS)

    Watanabe, Shigeo; Takahashi, Teruo; Bennett, Keith

    2017-02-01

    The"scientific" CMOS (sCMOS) camera architecture fundamentally differs from CCD and EMCCD cameras. In digital CCD and EMCCD cameras, conversion from charge to the digital output is generally through a single electronic chain, and the read noise and the conversion factor from photoelectrons to digital outputs are highly uniform for all pixels, although quantum efficiency may spatially vary. In CMOS cameras, the charge to voltage conversion is separate for each pixel and each column has independent amplifiers and analog-to-digital converters, in addition to possible pixel-to-pixel variation in quantum efficiency. The "raw" output from the CMOS image sensor includes pixel-to-pixel variability in the read noise, electronic gain, offset and dark current. Scientific camera manufacturers digitally compensate the raw signal from the CMOS image sensors to provide usable images. Statistical noise in images, unless properly modeled, can introduce errors in methods such as fluctuation correlation spectroscopy or computational imaging, for example, localization microscopy using maximum likelihood estimation. We measured the distributions and spatial maps of individual pixel offset, dark current, read noise, linearity, photoresponse non-uniformity and variance distributions of individual pixels for standard, off-the-shelf Hamamatsu ORCA-Flash4.0 V3 sCMOS cameras using highly uniform and controlled illumination conditions, from dark conditions to multiple low light levels between 20 to 1,000 photons / pixel per frame to higher light conditions. We further show that using pixel variance for flat field correction leads to errors in cameras with good factory calibration.

  17. Astronomers Find Elusive Planets in Decade-Old Hubble Data

    NASA Image and Video Library

    2017-12-08

    NASA image release Oct. 6, 2011 This is an image of the star HR 8799 taken by Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS) in 1998. A mask within the camera (coronagraph) blocks most of the light from the star. In addition, software has been used to digitally subtract more starlight. Nevertheless, scattered light from HR 8799 dominates the image, obscuring the faint planets. Object Name: HR 8799 Image Type: Astronomical Credit: NASA, ESA, and R. Soummer (STScI) To read more go to: www.nasa.gov/mission_pages/hubble/science/elusive-planets...

  18. Hubble Team Unveils Most Colorful View of Universe Captured by Space Telescope

    NASA Image and Video Library

    2014-06-04

    Astronomers using NASA's Hubble Space Telescope have assembled a comprehensive picture of the evolving universe – among the most colorful deep space images ever captured by the 24-year-old telescope. Researchers say the image, in a new study called the Ultraviolet Coverage of the Hubble Ultra Deep Field, provides the missing link in star formation. The Hubble Ultra Deep Field 2014 image is a composite of separate exposures taken from 2003 to 2012 with Hubble's Advanced Camera for Surveys and Wide Field Camera 3. Credit: NASA/ESA Read more: 1.usa.gov/1neD0se

  19. Low SWaP multispectral sensors using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. The dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4-band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches, including their passivity, spectral range, customization options, and scalable production.
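    As a sketch of how such a mosaic is handled in software, consistent with the color-processing analogy drawn above (the 2x2 tile layout here is hypothetical; actual layouts vary by sensor):

        import numpy as np

        def split_mosaic(raw):
            """Split a 2x2 RGB+NIR mosaic into four half-resolution planes.

            Assumes the hypothetical tile layout
                R   G
                NIR B
            """
            r   = raw[0::2, 0::2]
            g   = raw[0::2, 1::2]
            nir = raw[1::2, 0::2]
            b   = raw[1::2, 1::2]
            return r, g, b, nir

        # Each plane can then be interpolated back to full resolution
        # (e.g., bilinearly), exactly as in color demosaicing.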

  20. The Activity of Comet 67P/Churyumov-Gerasimenko as Seen by Rosetta/OSIRIS

    NASA Astrophysics Data System (ADS)

    Sierks, H.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Rickman, H.; Koschny, D.

    2015-12-01

    The Rosetta mission of the European Space Agency arrived at the target comet 67P/Churyumov-Gerasimenko on August 6, 2014. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. OSIRIS consists of a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field gas and dust coma investigations. OSIRIS observed the coma and the nucleus of comet 67P/C-G during the approach, arrival, and landing of PHILAE. OSIRIS continued comet monitoring and mapping of surface and activity in 2015, with high-resolution close fly-bys and remote wide-angle observations. The scientific results reveal a nucleus with two lobes and varied morphology. Active regions are located at steep cliffs and collapsed pits, which form collimated gas jets. Dust is accelerated by the gas, forming bright jet filaments and the large-scale, diffuse coma of the comet. We will present activity and surface changes observed in the Northern and Southern hemispheres and around perihelion passage.

  1. View of Scientific Instrument Module to be flown on Apollo 15

    NASA Image and Video Library

    1971-06-27

    S71-2250X (June 1971) --- A close-up view of the Scientific Instrument Module (SIM) to be flown for the first time on the Apollo 15 lunar landing mission. Mounted in a previously vacant sector of the Apollo Service Module (SM), the SIM carries specialized cameras and instrumentation for gathering scientific data from lunar orbit. SIM equipment includes a laser altimeter for accurate measurement of height above the lunar surface; a large-format panoramic camera for mapping, correlated with a metric camera and the laser altimeter for surface mapping; a gamma-ray spectrometer on a 25-foot extendible boom; a mass spectrometer on a 21-foot extendible boom; X-ray and alpha-particle spectrometers; and a subsatellite to be injected into lunar orbit carrying particle and magnetometer experiments and an S-band transponder.

  2. JCII Camera Museum: A unique museum that preserves and evaluates photographic artifacts, literature and artworks focusing on Japanese photography and related industries.

    NASA Astrophysics Data System (ADS)

    Ichikawa, Yasunori; Shirayama, Mari

    JCII Camera Museum is a unique photographic museum with three major departments: the camera museum, which collects, preserves, and exhibits historically valuable cameras and camera-related products; the photo salon, which collects, preserves, and exhibits various original photographic films and prints; and the library, which collects, preserves, and appraises photo-historical literature, including magazines, industrial histories, product catalogues, and scientific papers.

  3. Precision of FLEET Velocimetry Using High-speed CMOS Camera Systems

    NASA Technical Reports Server (NTRS)

    Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.

    2015-01-01

    Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. We also compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement for the un-intensified camera systems, which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 µs, precisions of 0.5 m/s in air and 0.2 m/s in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision High Speed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
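    A minimal sketch of the row-wise digital binning step (a generic post-processing illustration, not the authors' code):

        import numpy as np

        def bin_rows(img, n=8):
            """Sum groups of n adjacent rows in post-processing.

            Digital analogue of on-sensor binning: summing across the thin
            tagged line trades spatial resolution for signal-to-noise.
            Rows beyond a multiple of n are discarded.
            """
            h = (img.shape[0] // n) * n
            return img[:h].reshape(h // n, n, img.shape[1]).sum(axis=1)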

  4. Rosetta/OSIRIS - Nucleus morphology and activity of comet 67P/Churyumov-Gerasimenko

    NASA Astrophysics Data System (ADS)

    Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rickman, Hans; Rodrigo, Rafael; Koschny, Detlef

    2015-04-01

    ESA's Rosetta mission arrived at target comet 67P/Churyumov-Gerasimenko on August 6, 2014, after 10 years of cruise. OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) is the scientific imaging system onboard Rosetta. It comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field coma investigations. OSIRIS imaged the nucleus and coma of the comet from arrival throughout the mapping phase, PHILAE landing, early escort phase, and close fly-by. This overview paper will discuss the surface morphology and activity of the nucleus as seen in gas, dust, and local jets, as well as small-scale structures in the local topography.

  5. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  6. Scientific CCD technology at JPL

    NASA Technical Reports Server (NTRS)

    Janesick, J.; Collins, S. A.; Fossum, E. R.

    1991-01-01

    Charge-coupled devices (CCD's) were recognized for their potential as an imaging technology almost immediately following their conception in 1970. Twenty years later, they are firmly established as the technology of choice for visible imaging. While consumer applications of CCD's, especially the emerging home video camera market, dominated manufacturing activity, the scientific market for CCD imagers has become significant. Activity of the Jet Propulsion Laboratory and its industrial partners in the area of CCD imagers for space scientific instruments is described. Requirements for scientific imagers are significantly different from those needed for home video cameras, and are described. An imager for an instrument on the CRAF/Cassini mission is described in detail to highlight achieved levels of performance.

  7. Using virtual reality for science mission planning: A Mars Pathfinder case

    NASA Technical Reports Server (NTRS)

    Kim, Jacqueline H.; Weidner, Richard J.; Sacks, Allan L.

    1994-01-01

    NASA's Mars Pathfinder Project requires a Ground Data System (GDS) that supports both engineering and scientific payloads with reduced mission operations staffing and short planning schedules. Also, successful surface operation of the lander camera requires efficient mission planning and accurate pointing of the camera. To meet these challenges, a new software strategy was developed that integrates virtual reality technology with existing navigational ancillary information and image processing capabilities. The result is interactive, workstation-based application software that provides a high-resolution, 3-dimensional, stereo display of Mars as if it were viewed through the lander camera. The design, implementation strategy, and parametric specification phases for the development of this software were completed, and the prototype tested. When completed, the software will allow scientists and mission planners to access simulated and actual scenes of the Mars surface. The perspective from the lander camera will enable scientists to plan activities more accurately and completely. The application will also support the sequence and command generation process and will allow testing and verification of camera-pointing commands via simulation.

  8. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and the non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets, are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.

  9. In vivo burn diagnosis by camera-phone diffuse reflectance laser speckle detection.

    PubMed

    Ragol, S; Remer, I; Shoham, Y; Hazan, S; Willenz, U; Sinelnikov, I; Dronov, V; Rosenberg, L; Bilenca, A

    2016-01-01

    Burn diagnosis using laser speckle light typically employs widefield illumination of the burn region to produce two-dimensional speckle patterns from light backscattered from the entire irradiated tissue volume. Analysis of speckle contrast in these time-integrated patterns can then provide information on burn severity. Here, by contrast, we use point illumination to generate diffuse reflectance laser speckle patterns of the burn. By examining spatiotemporal fluctuations in these time-integrated patterns along the radial direction from the incident point beam, we show the ability to distinguish partial-thickness burns in a porcine model in vivo within the first 24 hours post-burn. Furthermore, our findings suggest that time-integrated diffuse reflectance laser speckle can be useful for monitoring burn healing over time post-burn. Unlike conventional diffuse reflectance laser speckle detection systems that utilize scientific or industrial-grade cameras, our system is designed with a camera-phone, demonstrating the potential for burn diagnosis with a simple imager.
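    For readers unfamiliar with the underlying metric, speckle contrast is conventionally computed as K = sigma/mean over a small sliding window. A generic sketch of that windowed estimator follows (the standard spatial computation, not the authors' radial spatiotemporal analysis):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def speckle_contrast(img, win=7):
            """Local speckle contrast K = sigma / mean in a win x win window."""
            img = img.astype(np.float64)
            mean = uniform_filter(img, win)
            mean_sq = uniform_filter(img * img, win)
            var = np.clip(mean_sq - mean * mean, 0, None)
            return np.sqrt(var) / np.clip(mean, 1e-12, None)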

  10. In vivo burn diagnosis by camera-phone diffuse reflectance laser speckle detection

    PubMed Central

    Ragol, S.; Remer, I.; Shoham, Y.; Hazan, S.; Willenz, U.; Sinelnikov, I.; Dronov, V.; Rosenberg, L.; Bilenca, A.

    2015-01-01

    Burn diagnosis using laser speckle light typically employs widefield illumination of the burn region to produce two-dimensional speckle patterns from light backscattered from the entire irradiated tissue volume. Analysis of speckle contrast in these time-integrated patterns can then provide information on burn severity. Here, by contrast, we use point illumination to generate diffuse reflectance laser speckle patterns of the burn. By examining spatiotemporal fluctuations in these time-integrated patterns along the radial direction from the incident point beam, we show the ability to distinguish partial-thickness burns in a porcine model in vivo within the first 24 hours post-burn. Furthermore, our findings suggest that time-integrated diffuse reflectance laser speckle can be useful for monitoring burn healing over time post-burn. Unlike conventional diffuse reflectance laser speckle detection systems that utilize scientific or industrial-grade cameras, our system is designed with a camera-phone, demonstrating the potential for burn diagnosis with a simple imager. PMID:26819831

  11. High-immersion three-dimensional display of the numerical computer model

    NASA Astrophysics Data System (ADS)

    Xing, Shujun; Yu, Xunbo; Zhao, Tianqi; Cai, Yuanfa; Chen, Duo; Chen, Zhidong; Sang, Xinzhu

    2013-08-01

    High-immersion three-dimensional (3D) displays are valuable tools for many applications, such as designing and constructing buildings and houses, industrial architecture design, aeronautics, scientific research, entertainment, media advertisement, and military uses. However, most technologies provide 3D display on screens in front of the viewer, parallel to the walls, and the sense of immersion is decreased. To get correct multi-view stereo ground images, the cameras' photosensitive surfaces should be parallel to the common focal plane, and the cameras' optical axes should be offset toward the center of the common focal plane in both the vertical and horizontal directions. It is very common to use virtual cameras, i.e., ideal pinhole cameras, to display 3D models in a computer system. We can use virtual cameras to simulate the shooting method of multi-view ground-based stereo images. Here, two virtual shooting methods for ground-based high-immersion 3D display are presented. The position of the virtual camera is determined by the position of the viewer's eyes in the real world. When the observer stands within the circumcircle of the 3D ground display, offset perspective projection virtual cameras are used. If the observer stands outside the circumcircle, offset perspective projection virtual cameras and orthogonal projection virtual cameras are adopted. In this paper, we mainly discuss the parameter setting of the virtual cameras. The near-clip-plane parameter setting is the main point of the first method, while the rotation angle of the virtual cameras is the main point of the second. To validate the results, we use Direct3D and OpenGL to render scenes from different viewpoints and generate a stereoscopic image. A realistic visualization system for 3D models is constructed and demonstrated for horizontal viewing, which provides high-immersion 3D visualization. The displayed 3D scenes are compared with the real objects in the real world.
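    The "offset perspective projection" above corresponds to an asymmetric (off-center) view frustum; a minimal sketch of such a projection matrix in the OpenGL convention (illustrative, with placeholder dimensions):

        import numpy as np

        def off_axis_frustum(l, r, b, t, n, f):
            """OpenGL-style asymmetric perspective matrix.

            l, r, b, t are the view-volume edges on the near plane; shifting
            them relative to the optical axis yields an offset perspective
            projection without rotating the camera.
            """
            return np.array([
                [2*n/(r-l), 0.0,        (r+l)/(r-l),  0.0],
                [0.0,       2*n/(t-b),  (t+b)/(t-b),  0.0],
                [0.0,       0.0,       -(f+n)/(f-n), -2*f*n/(f-n)],
                [0.0,       0.0,       -1.0,          0.0]])

        # Example: camera displaced 0.1 units to the right of the display
        # centre; the frustum edges shift opposite to the displacement.
        dx, near, far, half_w, half_h = 0.1, 0.5, 100.0, 0.4, 0.3
        M = off_axis_frustum(-half_w - dx, half_w - dx, -half_h, half_h, near, far)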

  12. Video control system for a drilling in furniture workpiece

    NASA Astrophysics Data System (ADS)

    Khmelev, V. L.; Satarov, R. N.; Zavyalova, K. V.

    2018-05-01

    Over the last 5 years, Russian industry has been undergoing robotization, and scientific groups have therefore received new tasks. One such task is machine vision systems that solve the problem of automatic quality control. Systems of this type cost several thousand dollars each, a price that is out of reach for regional small businesses. In this article, we describe the principle and algorithm of an inexpensive video control system that uses web cameras and a notebook or desktop computer as the computing unit.

  13. An integrated multispectral video and environmental monitoring system for the study of coastal processes and the support of beach management operations

    NASA Astrophysics Data System (ADS)

    Ghionis, George; Trygonis, Vassilis; Karydis, Antonis; Vousdoukas, Michalis; Alexandrakis, George; Drakopoulos, Panos; Amdreadis, Olympos; Psarros, Fotis; Velegrakis, Antonis; Poulos, Serafim

    2016-04-01

    Effective beach management requires environmental assessments that are based on sound science, are cost-effective and are available to beach users and managers in an accessible, timely and transparent manner. The most common problems are: 1) The available field data are scarce and of sub-optimal spatio-temporal resolution and coverage, 2) our understanding of local beach processes needs to be improved in order to accurately model/forecast beach dynamics under a changing climate, and 3) the information provided by coastal scientists/engineers in the form of data, models and scientific interpretation is often too complicated to be of direct use by coastal managers/decision makers. A multispectral video system has been developed, consisting of one or more video cameras operating in the visible part of the spectrum, a passive near-infrared (NIR) camera, an active NIR camera system, a thermal infrared camera and a spherical video camera, coupled with innovative image processing algorithms and a telemetric system for the monitoring of coastal environmental parameters. The complete system has the capability to record, process and communicate (in quasi-real time) high frequency information on shoreline position, wave breaking zones, wave run-up, erosion hot spots along the shoreline, nearshore wave height, turbidity, underwater visibility, wind speed and direction, air and sea temperature, solar radiation, UV radiation, relative humidity, barometric pressure and rainfall. An innovative, remotely-controlled interactive visual monitoring system, based on the spherical video camera (with 360°field of view), combines the video streams from all cameras and can be used by beach managers to monitor (in real time) beach user numbers, flow activities and safety at beaches of high touristic value. The high resolution near infrared cameras permit 24-hour monitoring of beach processes, while the thermal camera provides information on beach sediment temperature and moisture, can detect upwelling in the nearshore zone, and enhances the safety of beach users. All data can be presented in real- or quasi-real time and are stored for future analysis and training/validation of coastal processes models. Acknowledgements: This work was supported by the project BEACHTOUR (11SYN-8-1466) of the Operational Program "Cooperation 2011, Competitiveness and Entrepreneurship", co-funded by the European Regional Development Fund and the Greek Ministry of Education and Religious Affairs.

  14. The Visible Imaging System (VIS) for the Polar Spacecraft

    NASA Technical Reports Server (NTRS)

    Frank, L. A.; Sigwarth, J. B.; Craven, J. D.; Cravens, J. P.; Dolan, J. S.; Dvorsky, M. R.; Hardebeck, P. K.; Harvey, J. D.; Muller, D. W.

    1995-01-01

    The Visible Imaging System (VIS) is a set of three low-light-level cameras to be flown on the POLAR spacecraft of the Global Geospace Science (GGS) program, which is an element of the International Solar-Terrestrial Physics (ISTP) campaign. Two of these cameras share primary and some secondary optics and are designed to provide images of the nighttime auroral oval at visible wavelengths. A third camera is used to monitor the directions of the fields-of-view of these sensitive auroral cameras with respect to the sunlit Earth. The auroral emissions of interest include those from N2+ at 391.4 nm, O I at 557.7 and 630.0 nm, H I at 656.3 nm, and O II at 732.0 nm. The two auroral cameras have different spatial resolutions, about 10 and 20 km from a spacecraft altitude of 8 Earth radii (Re). The time to acquire and telemeter a 256 x 256-pixel image is about 12 s. The primary scientific objectives of this imaging instrumentation, together with the in-situ observations from the ensemble of ISTP spacecraft, are (1) quantitative assessment of the dissipation of magnetospheric energy into the auroral ionosphere, (2) an instantaneous reference system for the in-situ measurements, (3) development of a substantial model for energy flow within the magnetosphere, (4) investigation of the topology of the magnetosphere, and (5) delineation of the responses of the magnetosphere to substorms and variable solar wind conditions.

  15. Hubble Sees a Legion of Galaxies

    NASA Image and Video Library

    2017-12-08

    Peering deep into the early universe, this picturesque parallel field observation from the NASA/ESA Hubble Space Telescope reveals thousands of colorful galaxies swimming in the inky blackness of space. A few foreground stars from our own galaxy, the Milky Way, are also visible. In October 2013 Hubble’s Wide Field Camera 3 (WFC3) and Advanced Camera for Surveys (ACS) began observing this portion of sky as part of the Frontier Fields program. This spectacular skyscape was captured during the study of the giant galaxy cluster Abell 2744, otherwise known as Pandora’s Box. While one of Hubble’s cameras concentrated on Abell 2744, the other camera viewed this adjacent patch of sky near the cluster. Containing countless galaxies of various ages, shapes and sizes, this parallel field observation is nearly as deep as the Hubble Ultra-Deep Field. In addition to showcasing the stunning beauty of the deep universe in incredible detail, this parallel field — when compared to other deep fields — will help astronomers understand how similar the universe looks in different directions. Image credit: NASA, ESA and the HST Frontier Fields team (STScI)

  16. Space telescope optical telescope assembly/scientific instruments. Phase B: -Preliminary design and program definition study; Volume 2A: Planetary camera report

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Development of the F/48-F/96 Planetary Camera for the Large Space Telescope is discussed. Instrument characteristics, optical design, and the CCD camera submodule thermal design are considered, along with the structural and thermal control subsystems. Weight, electrical subsystem, and support equipment requirements are also included.

  17. Thermal Imaging in the Science Classroom

    ERIC Educational Resources Information Center

    Short, Daniel B.

    2012-01-01

    Thermal cameras are useful tools for use in scientific investigation and for teaching scientific concepts to students in the classroom. Demonstrations of scientific phenomena can be greatly enhanced visually by the use of this cutting-edge technology. (Contains 7 figures.)

  18. The Moon's North Pole

    NASA Image and Video Library

    2017-12-08

    NASA image release September 7, 2011 The Earth's moon has been an endless source of fascination for humanity for thousands of years. When at last Apollo 11 landed on the moon's surface in 1969, the crew found a desolate, lifeless orb, but one which still fascinates scientist and non-scientist alike. This image of the moon's north polar region was taken by the Lunar Reconnaissance Orbiter Camera, or LROC. One of the primary scientific objectives of LROC is to identify regions of permanent shadow and near-permanent illumination. Since the start of the mission, LROC has acquired thousands of Wide Angle Camera images approaching the north pole. From these images, scientists produced this mosaic, which is composed of 983 images taken over a one-month period during northern summer. This mosaic shows the pole when it is best illuminated; regions that are in shadow are candidates for permanent shadow. Image Credit: NASA/GSFC/Arizona State University

  19. Dynamic Geometry Capture with a Multi-View Structured-Light System

    DTIC Science & Technology

    2014-12-19

    … scientific and medical applications such as quantifying improvement in physical therapy and measuring unnatural poses in ergonomic studies. … cases with limited scene texture. This direct generation of surface geometry provides a distinct advantage over multi-camera-based systems. …

  20. a Novel Technique for Precision Geometric Correction of Jitter Distortion for the Europa Imaging System and Other Rolling-Shutter Cameras

    NASA Astrophysics Data System (ADS)

    Kirk, R. L.; Shepherd, M.; Sides, S. C.

    2018-04-01

    We use simulated images to demonstrate a novel technique for mitigating geometric distortions caused by platform motion ("jitter") as two-dimensional image sensors are exposed and read out line by line ("rolling shutter"). The results indicate that the Europa Imaging System (EIS) on NASA's Europa Clipper can likely meet its scientific goals requiring 0.1-pixel precision. We are therefore adapting the software used to demonstrate and test rolling shutter jitter correction to become part of the standard processing pipeline for EIS. The correction method will also apply to other rolling-shutter cameras, provided they have the operational flexibility to read out selected "check lines" at chosen times during the systematic readout of the frame area.
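    The geometric core of such a correction is a per-row resampling of the frame by the platform-motion offsets estimated, for example, from the check lines. A generic sketch of that resampling (an illustration of the idea, not the EIS pipeline):

        import numpy as np
        from scipy.ndimage import map_coordinates

        def correct_rolling_shutter(img, dx, dy):
            """Resample an image row by row given per-row jitter estimates.

            dx[i], dy[i] are the estimated offsets (pixels) of row i at its
            readout time; each output row is interpolated from the position
            the scene actually occupied when that row was exposed.
            """
            h, w = img.shape
            rows, cols = np.mgrid[0:h, 0:w].astype(np.float64)
            rows += dy[:, None]
            cols += dx[:, None]
            return map_coordinates(img, [rows, cols], order=1, mode="nearest")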

  1. Use of an UROV to develop 3-D optical models of submarine environments

    NASA Astrophysics Data System (ADS)

    Null, W. D.; Landry, B. J.

    2017-12-01

    The ability to rapidly obtain high-fidelity bathymetry is crucial for a broad range of engineering, scientific, and defense applications ranging from bridge scour, bedform morphodynamics, and coral reef health to unexploded ordnance detection and monitoring. The present work introduces the use of an Underwater Remotely Operated Vehicle (UROV) to develop 3-D optical models of submarine environments. The UROV used a Raspberry Pi camera mounted on a small servo, which allowed for pitch control. Prior to video data collection, in situ camera calibration was conducted with the system. Multiple image frames were extracted from the underwater video for 3D reconstruction using Structure from Motion (SfM). This system provides a simple and cost-effective solution for obtaining detailed bathymetry in optically clear submarine environments.
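    A minimal sketch of the frame-extraction step that feeds the SfM reconstruction (illustrative only; the file name, stride, and output directory are placeholders):

        import cv2

        # Pull every 15th frame from the dive video as SfM input; the
        # output directory is assumed to exist.
        cap = cv2.VideoCapture("urov_dive.mp4")
        idx = saved = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if idx % 15 == 0:
                cv2.imwrite(f"sfm/frame_{saved:05d}.png", frame)
                saved += 1
            idx += 1
        cap.release()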

  2. FROM THE HISTORY OF PHYSICS: Georgii L'vovich Shnirman: designer of fast-response instruments

    NASA Astrophysics Data System (ADS)

    Bashilov, I. P.

    1994-07-01

    A biography is given of the outstanding Russian scientist Georgii L'vovich Shnirman, whose scientific life had been 'top secret'. He was an experimental physicist and instrument designer, the founder of many branches of the Soviet instrument-making industry, the originator of a theory of electric methods of integration and differentiation, a theory of astasisation of pendulums, and also of original measurement methods. He was the originator and designer of automatic systems for the control of the measuring apparatus used at nuclear test sites and of automatic seismic station systems employed in monitoring nuclear tests. He also designed the first loop oscilloscopes in the Soviet Union, high-speed photographic and cine cameras (streak cameras, etc.), and many other unique instruments, including some mounted on moving objects.

  3. Lunar Satellite Snaps Image of Earth

    NASA Image and Video Library

    2014-05-07

    This image, captured Feb. 1, 2014, shows a colorized view of Earth from the moon-based perspective of NASA's Lunar Reconnaissance Orbiter. Credit: NASA/Goddard/Arizona State University -- NASA's Lunar Reconnaissance Orbiter (LRO) experiences 12 "earthrises" every day; however, LROC (short for LRO Camera) is almost always busy imaging the lunar surface, so only rarely does an opportunity arise such that LROC can capture a view of Earth. On Feb. 1, 2014, LRO pitched forward while approaching the moon's north pole, allowing the LROC Wide Angle Camera to capture Earth rising above Rozhdestvenskiy crater (112 miles, or 180 km, in diameter). Read more: go.nasa.gov/1oqMlgu

  4. Infrared Imaging Camera Final Report CRADA No. TC02061.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, E. V.; Nebeker, S.

    This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and the Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100-nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera, and the test results revealed that the camera exceeded presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two-year project. The project was not started on time due to changes in the IPP project funding conditions; the project funding was redirected through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes in Russian government export regulations. These changes were directed by export control regulations on the export of high-technology items that can be used to develop military weapons, and the IR camera was on the list of controlled items. After negotiations, the ISTC and the Russian government allowed the delivery of the camera to LLNL. There were no significant technical or business changes to the original project.

  5. Explore Galaxies Far, Far Away at Internet Speeds | Berkeley Lab

    Science.gov Websites

    Images for the Dark Energy Camera Legacy Survey (DECaLS) were taken by the 520-megapixel Dark Energy Survey Camera (DECam). Galaxy UGC 10041, imaged by DECaLS. Credit: Dustin Lang/University of Toronto.

  6. The Ocean in Depth - Ideas for Using Marine Technology in Science Communication

    NASA Astrophysics Data System (ADS)

    Gerdes, A.

    2009-04-01

    By deploying camera and video systems on remotely operated diving vehicles (ROVs), new and fascinating insights into the functioning of deep-ocean ecosystems, such as cold-water coral reef communities, can be gained. Moreover, mapping hot vents at mid-ocean ridge locations and exploring asphalt and mud volcanoes in the Gulf of Mexico and the Mediterranean Sea with the aid of video camera systems have illustrated the scientific value of state-of-the-art diving tools. In principle, the deployment of sophisticated marine technology on seagoing expeditions and its results (video tapes and photographs of fascinating submarine environments, publication of new scientific findings) offer unique opportunities for communicating marine sciences. Experience shows that an interest in marine technology can easily be stirred in laypersons if the deployment of underwater vehicles such as ROVs during seagoing expeditions is presented using catchwords like "discovery", "new frontier", "groundbreaking mission", etc. On the other hand, however, a number of restrictions and challenges have to be kept in mind. Communicating marine science in general, and the achievements of marine technology in particular, can only be successful with the application of a well-defined target-audience concept. While national and international TV stations and production companies are very much interested in using high-quality underwater video footage, the involvement of journalists and camera teams in seagoing expeditions entails a number of challenges: berths onboard research vessels are limited; safety aspects have to be considered; and copyright and utilisation questions regarding digitized video and photo material have to be handled with special care. To cite one example: on-board video material produced by professional TV teams cannot be used by the research institute that operated the expedition. This presentation aims at (1) informing members of the scientific community about new opportunities related to marine technology, (2) discussing challenges and limitations in cooperative projects with media, (3) presenting new ways of marketing scientific findings, and (4) promoting the interest of the media present at the EGU09 conference in cooperating with research institutes.

  7. The 1973 Smithsonian standard earth (3). [for the satellite geodesy program

    NASA Technical Reports Server (NTRS)

    Garoschkin, E. M. (Editor)

    1973-01-01

    The origins of the satellite geodesy program are described, starting with the International Geophysical Year, continuing through a number of international programs, and culminating with the National Geodetic Satellite Program. The philosophical basis for the Baker-Nunn camera and the laser ranging system, the evolution of international scientific cooperation, and the significance of the results are discussed.

  8. How much camera separation should be used for the capture and presentation of 3D stereoscopic imagery on binocular HMDs?

    NASA Astrophysics Data System (ADS)

    McIntire, John; Geiselman, Eric; Heft, Eric; Havig, Paul

    2011-06-01

    Designers, researchers, and users of binocular stereoscopic head- or helmet-mounted displays (HMDs) face the tricky issue of what imagery to present in their particular displays, and how to do so effectively. Stereoscopic imagery must often be created in-house with a 3D graphics program or from within a 3D virtual environment, or stereoscopic photos/videos must be carefully captured, perhaps for relaying to an operator in a teleoperative system. In such situations, the question arises as to what camera separation (real or virtual) is appropriate or desirable for end-users and operators. We review some of the relevant literature regarding the question of stereo-pair camera separation using desk-mounted or larger-scale stereoscopic displays, and apply our findings to potential HMD applications, including command and control, teleoperation, information and scientific visualization, and entertainment.
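    One practical way to reason about separation is to bound the resulting on-screen disparity. For parallel cameras with a sensor shift that places zero parallax at a chosen convergence distance, the disparity of a point at depth z has a simple closed form; a sketch of that textbook stereo geometry (not a prescription drawn from the reviewed literature):

        def screen_disparity(baseline, focal_px, z_conv, z):
            """Horizontal disparity (pixels) for parallel, sensor-shifted
            cameras with zero parallax at distance z_conv; keeping the
            magnitude within a comfort budget bounds the usable separation.
            """
            return focal_px * baseline * (1.0 / z_conv - 1.0 / z)

        # Example: 65 mm baseline, 1200 px focal length, converged at 2 m;
        # an object at 10 m lands at about 31 px of uncrossed disparity.
        d = screen_disparity(0.065, 1200, 2.0, 10.0)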

  9. Connecting Digital Repeat Photography to Ecosystem Fluxes in Inland Pacific Northwest, US Cropping Systems

    NASA Astrophysics Data System (ADS)

    Russell, E.; Chi, J.; Waldo, S.; Pressley, S. N.; Lamb, B. K.; Pan, W.

    2017-12-01

    Diurnal and seasonal gas fluxes vary by crop growth stage. Digital cameras are increasingly being used to monitor inter-annual changes in vegetation phenology in a variety of ecosystems. These cameras are not designed as scientific instruments but the information they gather can add value to established measurement techniques (i.e. eddy covariance). This work combined deconstructed digital images with eddy covariance data from five agricultural sites (1 fallow, 4 cropped) in the inland Pacific Northwest, USA. The data were broken down with respect to crop stage and management activities. The fallow field highlighted the camera response to changing net radiation, illumination, and rainfall. At the cropped sites, the net ecosystem exchange, gross primary production, and evapotranspiration were correlated with the greenness and redness values derived from the images over the growing season. However, the color values do not change quickly enough to respond to day-to-day variability in the flux exchange as the two measurement types are based on different processes. The management practices and changes in phenology through the growing season were not visible within the camera data though the camera did capture the general evolution of the ecosystem fluxes.
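    The greenness values referred to above are conventionally chromatic coordinates averaged over a fixed region of interest; a minimal sketch of the green chromatic coordinate, GCC = G/(R+G+B) (a standard phenocam-style index; the ROI handling is illustrative):

        import numpy as np

        def green_chromatic_coordinate(img, roi):
            """Mean GCC over a region of interest; redness is computed
            analogously with R in the numerator.

            img: (H, W, 3) RGB array; roi: (H, W) boolean mask.
            """
            rgb = img[roi].astype(np.float64)
            total = rgb.sum(axis=1)
            total[total == 0] = 1.0  # guard against all-black pixels
            return float(np.mean(rgb[:, 1] / total))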

  10. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Astrophysics Data System (ADS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA’s Orbital Debris Program Office (ODPO), in honour of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom-written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing, including Geosynchronous survey, TLE (two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA’s Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  11. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Technical Reports Server (NTRS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    2017-01-01

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honor of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescope and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom-written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing, including Geosynchronous survey, TLE (two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  12. Use of commercial off-the-shelf digital cameras for scientific data acquisition and scene-specific color calibration

    PubMed Central

    Akkaynak, Derya; Treibitz, Tali; Xiao, Bei; Gürkan, Umut A.; Allen, Justine J.; Demirci, Utkan; Hanlon, Roger T.

    2014-01-01

    Commercial off-the-shelf digital cameras are inexpensive and easy-to-use instruments that can be used for quantitative scientific data acquisition if images are captured in raw format and processed so that they maintain a linear relationship with scene radiance. Here we describe the image-processing steps required for consistent data acquisition with color cameras. In addition, we present a method for scene-specific color calibration that increases the accuracy of color capture when a scene contains colors that are not well represented in the gamut of a standard color-calibration target. We demonstrate applications of the proposed methodology in the fields of biomedical engineering, artwork photography, perception science, marine biology, and underwater imaging. PMID:24562030
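
    The core of such a calibration is a linear transform fitted from chart patches; the paper's scene-specific step augments the target with colors sampled from the scene itself. A minimal sketch of the generic least-squares fit, assuming linearized raw RGB (all values are illustrative stand-ins):

        import numpy as np

        measured  = np.random.rand(24, 3)     # camera RGB of chart patches
        reference = np.random.rand(24, 3)     # known target RGB of the same patches

        # 3x3 color-correction matrix, fitted in the least-squares sense:
        M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

        corrected = measured @ M              # apply to any linear RGB values
        rms = np.sqrt(((corrected - reference) ** 2).mean())
        print(f"RMS residual after correction: {rms:.4f}")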

  13. SU-D-BRC-07: System Design for a 3D Volumetric Scintillation Detector Using SCMOS Cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darne, C; Robertson, D; Alsanea, F

    2016-06-15

    Purpose: The purpose of this project is to build a volumetric scintillation detector for quantitative imaging of 3D dose distributions of proton beams accurately in near real-time. Methods: The liquid scintillator (LS) detector consists of a transparent acrylic tank (20×20×20 cm³) filled with a liquid scintillator that generates scintillation light when irradiated with protons. To track rapid spatial and dose variations in spot-scanning proton beams we used three scientific-complementary metal-oxide semiconductor (sCMOS) imagers (2560×2160 pixels). The cameras collect optical signal from three orthogonal projections. To reduce the system footprint, two mirrors oriented at 45° to the tank surfaces redirect scintillation light to the cameras for capturing the top and right views. Selection of fixed focal-length objective lenses for these cameras was based on their ability to provide large depth of field (DoF) and the required field of view (FoV). Multiple cross-hairs imprinted on the tank surfaces allow for correction of image distortions arising from camera perspective and refraction. Results: We determined that by setting the sCMOS to 16-bit dynamic range, truncating its FoV (1100×1100 pixels) to image the entire volume of the LS detector, and using a 5.6 ms integration time, the imaging rate can be ramped up to 88 frames per second (fps). A 20 mm focal-length lens provides a 20 cm imaging DoF and 0.24 mm/pixel resolution. A master-slave camera configuration enables the slave cameras to initiate image acquisition instantly (within 2 µs) after receiving a trigger signal. A computer with 128 GB RAM was used for spooling images from the cameras and can sustain a maximum recording time of 2 min per camera at 75 fps. Conclusion: The three sCMOS cameras are capable of high-speed imaging. They can therefore be used for quick, high-resolution, and precise mapping of dose distributions from scanned spot proton beams in three dimensions.
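
    A back-of-envelope check of the quoted spooling limit (our arithmetic, assuming 16-bit pixels): at the truncated 1100×1100 FoV and 75 fps one camera produces about 182 MB/s, which 128 GB could buffer for roughly 12 minutes, whereas full 2560×2160 frames give about 830 MB/s and roughly 2.6 minutes, much closer to the quoted 2 min per camera:

        def record_seconds(width, height, bytes_per_px, fps, ram_bytes):
            """Seconds of frames from one camera that fit in RAM."""
            return ram_bytes / (width * height * bytes_per_px * fps)

        ram = 128e9
        print(record_seconds(1100, 1100, 2, 75, ram) / 60, "min (1100x1100 FoV)")
        print(record_seconds(2560, 2160, 2, 75, ram) / 60, "min (full sensor)")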

  14. Free-form reflective optics for mid-infrared camera and spectrometer on board SPICA

    NASA Astrophysics Data System (ADS)

    Fujishiro, Naofumi; Kataza, Hirokazu; Wada, Takehiko; Ikeda, Yuji; Sakon, Itsuki; Oyabu, Shinki

    2017-11-01

    SPICA (Space Infrared Telescope for Cosmology and Astrophysics) is an astronomical mission optimized for mid- and far-infrared astronomy with a cryogenically cooled 3-m class telescope, envisioned for launch in the early 2020s. The Mid-infrared Camera and Spectrometer (MCS) is a focal plane instrument for SPICA with imaging and spectroscopic observing capabilities in the mid-infrared wavelength range of 5-38 μm. MCS consists of two relay optical modules and the following four scientific optical modules: WFC (Wide Field Camera; 5' × 5' field of view, f/11.7 and f/4.2 cameras), LRS (Low Resolution Spectrometer; 2.5' long slits, prism dispersers, f/5.0 and f/1.7 cameras, spectral resolving power R ∼ 50-100), MRS (Mid Resolution Spectrometer; echelles, integral field units by image slicer, f/3.3 and f/1.9 cameras, R ∼ 1100-3000) and HRS (High Resolution Spectrometer; immersed echelles, f/6.0 and f/3.6 cameras, R ∼ 20000-30000). Here, we present the optical design and expected optical performance of MCS. Most parts of the MCS optics adopt an off-axis reflective system to cover the wide wavelength range of 5-38 μm without chromatic aberration and to minimize problems due to changes in the shapes and refractive indices of materials from room temperature to cryogenic temperature. In order to achieve the demanding specifications of wide field of view, small F-number and large spectral resolving power in a compact size, we employed the paraxial and aberration analysis of off-axial optical systems (Araki 2005 [1]), a design method using free-form surfaces for compact reflective optics such as head-mounted displays. As a result, we have successfully designed compact reflective optics for MCS with as-built, diffraction-limited image resolution.

  15. Development of the focal plane PNCCD camera system for the X-ray space telescope eROSITA

    NASA Astrophysics Data System (ADS)

    Meidinger, Norbert; Andritschke, Robert; Ebermayer, Stefanie; Elbs, Johannes; Hälker, Olaf; Hartmann, Robert; Herrmann, Sven; Kimmel, Nils; Schächner, Gabriele; Schopper, Florian; Soltau, Heike; Strüder, Lothar; Weidenspointner, Georg

    2010-12-01

    A so-called PNCCD, a special type of CCD, was developed twenty years ago as the focal plane detector for the XMM-Newton X-ray astronomy mission of the European Space Agency (ESA). Based on this detector concept, and taking into account the experience of almost ten years of operation in space, a new X-ray CCD type was designed by the MPI semiconductor laboratory for an upcoming X-ray space telescope called eROSITA (extended Roentgen survey with an imaging telescope array). This space telescope will be equipped with seven X-ray mirror systems of Wolter-I type and seven CCD cameras placed in their foci. The instrumentation permits the exploration of the X-ray universe in the energy band from 0.3 up to 10 keV by spectroscopic measurements with a time resolution of 50 ms for a full image comprising 384×384 pixels. The main scientific goals are an all-sky survey and investigation of the mysterious 'Dark Energy'. The eROSITA space telescope, which is developed under the responsibility of the Max-Planck-Institute for extraterrestrial physics, is a scientific payload on the new Russian satellite Spectrum-Roentgen-Gamma (SRG). The mission is already approved by the responsible Russian and German space agencies. After launch in 2012, the satellite's destination is the Lagrange point L2. The planned observational program spans about seven years. We describe the design of the eROSITA camera system and present important test results achieved recently with the eROSITA prototype PNCCD detector, including a comparison of the eROSITA detector with the XMM-Newton detector.

  16. Breast Biopsy System

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Charge Coupled Devices (CCDs) are high technology silicon chips that convert light directly into electronic or digital images, which can be manipulated or enhanced by computers. When Goddard Space Flight Center (GSFC) scientists realized that existing CCD technology could not meet scientific requirements for the Hubble Space Telescope Imaging Spectrograph, GSFC contracted with Scientific Imaging Technologies, Inc. (SITe) to develop an advanced CCD. SITe then applied many of the NASA-driven enhancements to the manufacture of CCDs for digital mammography. The resulting device images breast tissue more clearly and efficiently. The LORAD Stereo Guide Breast Biopsy system incorporates SITe's CCD as part of a digital camera system that is replacing surgical biopsy in many cases. Known as stereotactic needle biopsy, the procedure is performed under local anesthesia with a needle and saves women time, pain, scarring, radiation exposure and money.

  17. KENNEDY SPACE CENTER, FLA. - The Window Observational Research Facility (WORF), seen in the Space Station Processing Facility, was designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.

    NASA Image and Video Library

    2003-09-08

    KENNEDY SPACE CENTER, FLA. - The Window Observational Research Facility (WORF), seen in the Space Station Processing Facility, was designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.

  18. KENNEDY SPACE CENTER, FLA. - Workers in the Space Station Processing Facility check out the Window Observational Research Facility (WORF), designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.

    NASA Image and Video Library

    2003-09-08

    KENNEDY SPACE CENTER, FLA. - Workers in the Space Station Processing Facility check out the Window Observational Research Facility (WORF), designed and built by the Boeing Co. at NASA’s Marshall Space Flight Center in Huntsville, Ala. WORF will be delivered to the International Space Station and placed in the rack position in front of the Destiny lab window, providing locations for attaching cameras, multi-spectral scanners and other instruments. WORF will support a variety of scientific and commercial experiments in areas of Earth systems and processes, global ecological changes in Earth’s biosphere, lithosphere, hydrosphere and climate system, Earth resources, natural hazards, and education. After installation, it will become a permanent focal point for Earth Science research aboard the space station.

  19. Line drawing Scientific Instrument Module and lunar orbital science package

    NASA Technical Reports Server (NTRS)

    1970-01-01

    A line drawing of the Scientific Instrument Module (SIM) with its lunar orbital science package. The SIM will be mounted in a previously vacant sector of the Apollo Service Module. It will carry specialized cameras and instrumentation for gathering lunar orbit scientific data.

  20. Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera

    NASA Technical Reports Server (NTRS)

    Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.

    1988-01-01

    The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 A wavelength range.

  1. Precision of FLEET Velocimetry Using High-Speed CMOS Camera Systems

    NASA Technical Reports Server (NTRS)

    Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.

    2015-01-01

    Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. We also compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as digital binning of the signal in adjacent pixel rows (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement for the un-intensified camera systems, which had low signal-to-noise ratios. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 microseconds, precisions of 0.5 meters per second in air and 0.2 meters per second in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision HighSpeed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and a longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
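
    A sketch of the row-wise digital binning described above, done in post-processing by summing groups of adjacent pixel rows (the group of 8 matches the quoted tagged-region thickness; the frame here is synthetic):

        import numpy as np

        def bin_rows(img, n):
            """Sum each group of n adjacent rows; img is H x W with H divisible by n."""
            h, w = img.shape
            return img.reshape(h // n, n, w).sum(axis=1)

        frame = np.random.poisson(5.0, (1024, 1024)).astype(float)
        binned = bin_rows(frame, 8)
        print(frame.shape, "->", binned.shape)   # (1024, 1024) -> (128, 1024)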

  2. The kinelite project. A new powerful motion analyser for spacelab and space station

    NASA Astrophysics Data System (ADS)

    Venet, M.; Pinard, H.; McIntyre, J.; Berthoz, A.; Lacquaniti, F.

    The goal of the Kinelite Project is to develop a space-qualified motion analysis system to be used in space by the scientific community, mainly to support neuroscience protocols. The measurement principle of Kinelite is to determine, by triangulation, the 3D position of small, lightweight, reflective markers positioned at the different points of interest. The scene is illuminated by infrared flashes and the reflected light is acquired by up to 8 precalibrated and synchronized CCD cameras. The main characteristics of the system are: camera field of view: 45°; number of cameras: 2 to 8; acquisition frequency: 25, 50, 100 or 200 Hz; CCD format: 256 × 256; number of markers: up to 64; 3D accuracy: 2 mm; main dimensions: 45 cm × 45 cm × 30 cm; mass: 23 kg; power consumption: less than 200 W. Kinelite will first fly aboard the NASA Spacelab; it will be used during the NEUROLAB mission (4/98) to support the "Frames of Reference and Internal Models" experiment (Principal Investigator: Prof. A. Berthoz; Co-Investigators: J. McIntyre, F. Lacquaniti).
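
    The triangulation principle is standard multi-camera geometry; below is a minimal two-camera linear (DLT) sketch, not Kinelite's own eight-camera implementation (the projection matrices and marker position are invented for illustration):

        import numpy as np

        def triangulate(P1, P2, x1, x2):
            """3D point from two 3x4 projection matrices and (u, v) pixel coords."""
            A = np.vstack([x1[0]*P1[2] - P1[0],
                           x1[1]*P1[2] - P1[1],
                           x2[0]*P2[2] - P2[0],
                           x2[1]*P2[2] - P2[1]])
            X = np.linalg.svd(A)[2][-1]           # null-space solution
            return X[:3] / X[3]

        P1 = np.hstack([np.eye(3), np.zeros((3, 1))])              # camera 1 at origin
        P2 = np.hstack([np.eye(3), np.array([[-0.3], [0], [0]])])  # 0.3 m baseline
        marker = np.array([0.1, 0.05, 2.0, 1.0])                   # ground truth
        x1 = (P1 @ marker)[:2] / (P1 @ marker)[2]
        x2 = (P2 @ marker)[:2] / (P2 @ marker)[2]
        print(triangulate(P1, P2, x1, x2))        # ~ [0.1, 0.05, 2.0]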

  3. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

    Context. The ESA cometary mission Rosetta was launched in 2004. In the following years, and until the spacecraft's hibernation in June 2011, the two cameras of the OSIRIS imaging system (the Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance stable to within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation is discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study possible dust contamination of the optics.
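
    For the absolute-flux side of such a calibration, a common form is a magnitude zero point fitted from standard-star images; a sketch under that assumption (the numbers are invented, not OSIRIS values):

        import numpy as np

        counts_per_s = np.array([15200.0, 8300.0, 31000.0])  # star signals (DN/s)
        catalog_mag  = np.array([9.12, 9.78, 8.35])          # known magnitudes

        zp = np.mean(catalog_mag + 2.5 * np.log10(counts_per_s))
        print(f"zero point = {zp:.3f}")

        # Calibrate any measured source with the fitted zero point:
        print(f"source magnitude = {-2.5 * np.log10(4200.0) + zp:.2f}")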

  4. Development of a data reduction expert assistant

    NASA Technical Reports Server (NTRS)

    Miller, Glenn E.

    1994-01-01

    This report documents the development and deployment of the Data Reduction Expert Assistant (DRACO). The system was successfully applied to two astronomical research projects. The first was the removal of cosmic ray artifacts from Hubble Space Telescope (HST) Wide Field Planetary Camera data. The second was the reduction and calibration of low-dispersion CCD spectra taken from a ground-based telescope. This has validated our basic approach and demonstrated the applicability of this technology. This work has been made available to the scientific community in two ways. First, we have published the work in the scientific literature and presented papers at relevant conferences. Secondly, we have made the entire system (including documentation and source code) available to the community via the World Wide Web.

  5. SPLASSH: Open source software for camera-based high-speed, multispectral in-vivo optical image acquisition

    PubMed Central

    Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.

    2010-01-01

    Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high-resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 FireWire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light-emitting diodes, which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software's framework and provide details to guide users with development of this and similar software. PMID:21258475
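
    One consequence of strobing LEDs in a fixed order synchronized to the frame clock is that the recorded stream demultiplexes into per-wavelength image stacks; a minimal sketch assuming a three-LED repeating sequence (the strobe order and array shapes are illustrative, not taken from SPLASSH):

        import numpy as np

        frames = np.random.rand(300, 256, 256)     # interleaved acquisition
        n_leds = 3                                  # assumed R, G, B strobe sequence
        channels = {c: frames[i::n_leds] for i, c in enumerate("RGB")}
        for c, stack in channels.items():
            print(c, stack.shape)                   # 100 frames per wavelength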

  6. View of Scientific Instrument Module to be flown on Apollo 15

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Close-up view of the Scientific Instrument Module (SIM) to be flown for the first time on the Apollo 15 mission. Mounted in a previously vacant sector of the Apollo Service Module, the SIM carries specialized cameras and instrumentation for gathering lunar orbit scientific data.

  7. SeeStar: an open-source, low-cost imaging system for subsea observations

    NASA Astrophysics Data System (ADS)

    Cazenave, F.; Kecy, C. D.; Haddock, S.

    2016-02-01

    Scientists and engineers at the Monterey Bay Aquarium Research Institute (MBARI) have collaborated to develop SeeStar, a modular, lightweight, self-contained, low-cost subsea imaging system for short- to long-term monitoring of marine ecosystems. SeeStar is composed of separate camera, battery, and LED lighting modules. Two versions of the system exist: one rated to 300 meters depth, the other rated to 1500 meters. Users can download plans and instructions from an online repository and build the system using low-cost off-the-shelf components. The system utilizes an easily programmable Arduino-based controller and the widely distributed GoPro camera. The system can be deployed in a variety of scenarios, taking still images and video, and can be operated either autonomously or tethered on a range of platforms, including ROVs, AUVs, landers, piers, and moorings. Several SeeStar systems have been built and used for scientific studies and engineering tests. The long-term goal of this project is to have a widely distributed marine imaging network across thousands of locations, to develop baselines of biological information.

  8. Developments on a SEM-based X-ray tomography system: Stabilization scheme and performance evaluation

    NASA Astrophysics Data System (ADS)

    Gomes Perini, L. A.; Bleuet, P.; Filevich, J.; Parker, W.; Buijsse, B.; Kwakman, L. F. Tz.

    2017-06-01

    Recent improvements in a SEM-based X-ray tomography system are described. In this type of equipment, X-rays are generated through the interaction between a highly focused electron beam and a geometrically confined anode target. Unwanted long-term drifts of the e-beam can lead to loss of X-ray flux or decreased spatial resolution in images. To circumvent this issue, a closed-loop control using FFT-based image correlation is integrated into the acquisition routine in order to provide in-line drift correction. The X-ray detection system consists of a state-of-the-art scientific CMOS camera (indirect detection), featuring high quantum efficiency (~60%) and low read-out noise (~1.2 electrons). The system performance is evaluated in terms of resolution, detectability, and scanning times for applications covering three different scientific fields: microelectronics, technical textiles, and materials science.
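
    The essence of FFT-based image correlation for drift correction is locating the correlation peak between successive frames; a minimal phase-correlation sketch (the closed-loop beam-steering part is omitted):

        import numpy as np

        def shift_estimate(a, b):
            """Estimate the (dy, dx) circular shift of image a relative to b."""
            F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
            corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            h, w = a.shape                      # map peak indices to signed shifts
            return (dy - h if dy > h // 2 else dy,
                    dx - w if dx > w // 2 else dx)

        img = np.random.rand(128, 128)
        drifted = np.roll(np.roll(img, 3, axis=0), -5, axis=1)
        print(shift_estimate(drifted, img))      # -> (3, -5)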

  9. The real-time learning mechanism of the Scientific Research Associates Advanced Robotic System (SRAARS)

    NASA Technical Reports Server (NTRS)

    Chen, Alexander Y.

    1990-01-01

    The Scientific Research Associates Advanced Robotic System (SRAARS) is an intelligent robotic system with autonomous learning capability in geometric reasoning. The system is equipped with one global intelligence center (GIC) and eight local intelligence centers (LICs). It controls sixteen links with fourteen active joints, which constitute two articulated arms, an extensible lower body, a vision system with two CCD cameras and a mobile base. The on-board knowledge-based system supports the learning controller with model representations of both the robot and the working environment. By consecutive verifying and planning procedures, hypothesis-and-test routines and a learning-by-analogy paradigm, the system autonomously builds up its own understanding of the relationship between itself (i.e., the robot) and the focused environment for the purposes of collision avoidance, motion analysis and object manipulation. The intelligence of SRAARS presents a valuable technical advantage for implementing robotic systems for space exploration and space station operations.

  10. NASA's "Webb-cam" Captures Engineers at Work on Webb at Johnson Space Center

    NASA Image and Video Library

    2017-05-30

    Now that NASA's James Webb Space Telescope has moved to NASA's Johnson Space Center in Houston, Texas, a special Webb camera was installed there to continue providing daily video feeds on the telescope's progress. Space enthusiasts, who are fascinated to see how this next generation space telescope has come together and how it is being tested, are able to see the telescope's progress as it happens by watching the Webb-cam feed online. The Webb camera at NASA's Johnson Space Center can be seen online at: jwst.nasa.gov/, with larger views of the cams available at: jwst.nasa.gov/webcam.html. Read more: go.nasa.gov/2rQYpT2 NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.

  11. View of model of Scientific Instrument Module to be flown on Apollo 15

    NASA Technical Reports Server (NTRS)

    1970-01-01

    Close-up view of a scale model of the Scientific Instrument Module (SIM) to be flown for the first time on the Apollo 15 mission. Mounted in a previously vacant sector of the Apollo service module, the SIM carries specialized cameras and instrumentation for gathering lunar orbit scientific data.

  12. Investigating Image Formation with a Camera Obscura: a Study in Initial Primary Science Teacher Education

    NASA Astrophysics Data System (ADS)

    Muñoz-Franco, Granada; Criado, Ana María; García-Carmona, Antonio

    2018-04-01

    This article presents the results of a qualitative study aimed at determining the effectiveness of the camera obscura as a didactic tool for understanding image formation (i.e., how it is possible to see objects, how their image is formed on the retina, and what the image formed on the retina is like compared to the object observed) in a context of scientific inquiry. The study involved 104 prospective primary teachers (PPTs) who were being trained in science teaching. To assess the effectiveness of this tool, an open questionnaire was applied before (pre-test) and after (post-test) the educational intervention. The data were analyzed by combining methods of inter- and intra-rater analysis. The results showed that more than half of the PPTs advanced in their ideas towards the desirable level of knowledge in relation to the phenomena studied. The conclusion reached is that the camera obscura, used in a context of scientific inquiry, is a useful tool for PPTs to improve their knowledge about image formation and to experience first-hand an authentic scientific inquiry during their teacher training.

  13. Comparison of scientific CMOS camera and webcam for monitoring cardiac pulse after exercise

    NASA Astrophysics Data System (ADS)

    Sun, Yu; Papin, Charlotte; Azorin-Peris, Vicente; Kalawsky, Roy; Greenwald, Stephen; Hu, Sijung

    2011-09-01

    In light of its capacity for remote physiological assessment over a wide range of anatomical locations, imaging photoplethysmography has become an attractive research area in the biomedical and clinical community. Amongst recent studies, two separate research directions have emerged: scientific-camera-based imaging PPG (iPPG) and webcam-based imaging PPG (wPPG). Little is known about the difference between these two techniques. To address this issue, a dual-channel imaging PPG system (iPPG and wPPG) using ambient light as the illumination source is introduced in this study. The performance of the two imaging PPG techniques was evaluated through the measurement of cardiac pulse acquired from the face of 10 male subjects before and after 10 min of cycling exercise. A time-frequency representation method was used to visualize the time-dependent behaviour of the heart rate. In comparison to the gold-standard contact PPG, both imaging PPG techniques exhibit comparable functional characteristics in the context of cardiac pulse assessment. Moreover, the synchronized ambient-light intensity recordings in the present study provide additional information for appraising the performance of the imaging PPG systems. This feasibility study thereby leads to a new route for non-contact monitoring of vital signs, with clear applications in triage and homecare.
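
    A generic reduction for either camera type (not necessarily the authors' processing chain) takes the mean green-channel intensity of a facial region per frame and reads the pulse rate off the dominant spectral peak in the physiological band; a sketch with a synthetic trace standing in for the video:

        import numpy as np

        fps, n = 30.0, 600                        # 20 s of video
        t = np.arange(n) / fps
        # Synthetic PPG trace: 1.2 Hz cardiac tone plus noise.
        sig = 0.01 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.005, n)

        spec = np.abs(np.fft.rfft(sig - sig.mean()))
        freqs = np.fft.rfftfreq(n, 1 / fps)
        band = (freqs > 0.7) & (freqs < 3.0)      # 42-180 beats per minute
        print(f"pulse ~ {freqs[band][np.argmax(spec[band])] * 60:.0f} bpm")  # ~72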

  14. Report of the facility definition team spacelab UV-Optical Telescope Facility

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Scientific requirements for the Spacelab Ultraviolet-Optical Telescope (SUOT) facility are presented. Specific programs involving high angular resolution imagery over wide fields, far ultraviolet spectroscopy, precisely calibrated spectrophotometry and spectropolarimetry over a wide wavelength range, and planetary studies, including high resolution synoptic imagery, are recommended. Specifications for the mounting configuration, instrument mounting system, optical parameters, and the pointing and stabilization system are presented. Concepts for the focal plane instruments are defined. The functional requirements of the direct imaging camera, far ultraviolet spectrograph, and the precisely calibrated spectrophotometer are detailed, and the planetary camera concept is outlined. Operational concepts described in detail are: the makeup and functions of the shuttle payload crew, extravehicular activity requirements, telescope control and data management, the payload operations control room, orbital constraints, and orbital interfaces (stabilization, maneuvering requirements and attitude control, contamination, utilities, and payload weight considerations).

  15. 'El Capitan's' Scientific Gems

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This mosaic of images taken by the panoramic camera onboard the Mars Exploration Rover Opportunity shows the rock region dubbed 'El Capitan,' which lies within the larger outcrop near the rover's landing site. 'El Capitan' is being studied in great detail using the scientific instruments on the rover's arm; images from the panoramic camera help scientists choose the locations for this compositional work. The millimeter-scale detail of the lamination covering these rocks can be seen. The face of the rock to the right of the mosaic may be a future target for grinding with the rover's rock abrasion tool.

  16. User guide for the USGS aerial camera Report of Calibration.

    USGS Publications Warehouse

    Tayman, W.P.

    1984-01-01

    Calibration and testing of aerial mapping cameras include the measurement of optical constants and the check for proper functioning of a number of complicated mechanical and electrical parts. For this purpose the US Geological Survey performs an operational-type photographic calibration. This paper is not strictly a scientific paper but rather a 'user guide' to the USGS Report of Calibration of an aerial mapping camera for compliance with both Federal and State mapping specifications. -Author

  17. In-flight performance of the Faint Object Camera of the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Greenfield, P.; Paresce, F.; Baxter, D.; Hodge, P.; Hook, R.; Jakobsen, P.; Jedrzejewski, R.; Nota, A.; Sparks, W. B.; Towers, N.

    1991-01-01

    An overview of the Faint Object Camera and its performance to date is presented. In particular, the detector's efficiency, the spatial uniformity of response, distortion characteristics, detector and sky background, detector linearity, spectrography, and operation are discussed. The effect of the severe spherical aberration of the telescope's primary mirror on the camera's point spread function is reviewed, as well as the impact it has on the camera's general performance. The scientific implications of the performance and the spherical aberration are outlined, with emphasis on possible remedies for spherical aberration, hardware remedies, and stellar population studies.

  18. The First Year of Croatian Meteor Network

    NASA Astrophysics Data System (ADS)

    Andreic, Zeljko; Segon, Damir

    2010-08-01

    The idea and a short history of the Croatian Meteor Network (CMN) are described. Based on the use of cheap surveillance cameras, standard PC TV cards and old PCs, the network allows schools, amateur societies and individuals to participate in a photographic meteor patrol program. The network has a strong educational component and many cameras are located at or around teaching facilities. Data obtained by these cameras are collected and processed by the scientific team of the network. Currently 14 cameras are operable, covering a large part of the Croatian sky; data gathering is fully functional, and the data reduction software is in the testing phase.

  19. Adapting smartphones for low-cost optical medical imaging

    NASA Astrophysics Data System (ADS)

    Pratavieira, Sebastião; Vollet-Filho, José D.; Carbinatto, Fernanda M.; Blanco, Kate; Inada, Natalia M.; Bagnato, Vanderlei S.; Kurachi, Cristina

    2015-06-01

    Optical images have been used in several medical situations to improve the diagnosis of lesions or to monitor treatments. However, most systems employ expensive scientific (CCD or CMOS) cameras and need computers to display and save the images, usually resulting in a high final cost for the system. Additionally, operating this sort of apparatus usually becomes more complex, requiring more and more specialized technical knowledge from the operator. Currently, the number of people using smartphone-like devices with built-in high-quality cameras is increasing, which might allow using such devices as efficient, lower-cost, portable imaging systems for medical applications. Thus, we aim to develop methods of adapting those devices to optical medical imaging techniques, such as fluorescence imaging. In particular, smartphone covers were adapted to connect a smartphone-like device to wide-field fluorescence imaging systems. These systems were used to detect lesions in different tissues, such as cervical and mouth/throat mucosa, and to monitor ALA-induced protoporphyrin-IX formation for photodynamic treatment of Cervical Intraepithelial Neoplasia. This approach may contribute significantly to low-cost, portable and simple clinical optical imaging collection.

  20. Wide Field Camera 3 Accommodations for HST Robotics Servicing Mission

    NASA Technical Reports Server (NTRS)

    Ginyard, Amani

    2005-01-01

    This slide presentation discusses the objectives of the Hubble Space Telescope (HST) Robotics Servicing and Deorbit Mission (HRSDM), reviews the Wide Field Camera 3 (WFC3), and also reviews the contamination accommodations for the WFC3. The objectives of the HRSDM are (1) to provide a disposal capability at the end of HST's useful life, (2) to upgrade the hardware by installing two new scientific instruments, replacing the Corrective Optics Space Telescope Axial Replacement (COSTAR) with the Cosmic Origins Spectrograph (COS) and the Wide Field/Planetary Camera-2 (WFPC2) with Wide Field Camera 3, and (3) to extend the scientific life of HST for a minimum of 5 years after servicing. Included are slides showing the Hubble Robotic Vehicle (HRV) and slides describing what the HRV contains. There are also slides describing the WFC3; the WFC3 accommodation also serves in part as a carrier for replacement gyroscopes for HST. Further slides discuss the contamination requirements for the Rate Sensor Units (RSUs), which are part of the Rate Gyroscope Assembly carried with the WFC3.

  1. Thermal Control of the Scientific Instrument Package in the Large Space Telescope

    NASA Technical Reports Server (NTRS)

    Hawks, K. H.

    1972-01-01

    The general thermal control system philosophy was to utilize passive control where feasible and to utilize active methods only where required for more accurate thermal control of the SIP components with narrow temperature tolerances. A thermal model of the SIP and a concept for cooling the SIP cameras are presented. The model and cooling concept have established a rationale for determining a Phase A baseline for SIP thermal control.

  2. The Hubble Space Telescope: UV, Visible, and Near-Infrared Pursuits

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer

    2010-01-01

    The Hubble Space Telescope continues to push the limits on world-class astrophysics. Cameras including the Advanced Camera for Surveys and the new panchromatic Wide Field Camera 3, which was installed in last year's successful servicing mission SM4, offer imaging from near-infrared through ultraviolet wavelengths. Spectroscopic studies of sources from black holes to exoplanet atmospheres are making great advances through the versatile use of STIS, the Space Telescope Imaging Spectrograph. The new Cosmic Origins Spectrograph, also installed last year, is the most sensitive UV spectrograph to fly in space and is uniquely suited to address particular scientific questions on galaxy halos, the intergalactic medium, and the cosmic web. With these outstanding capabilities on HST come complex needs for laboratory astrophysics support, including atomic and line-identification data. I will provide an overview of Hubble's current capabilities and the scientific programs and goals that particularly benefit from the studies of laboratory astrophysics.

  3. Quantitative Imaging with a Mobile Phone Microscope

    PubMed Central

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with cameras greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single-cell imaging. We find that the automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce the accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  4. The Beagle 2 Stereo Camera System: Scientific Objectives and Design Characteristics

    NASA Astrophysics Data System (ADS)

    Griffiths, A.; Coates, A.; Josset, J.; Paar, G.; Sims, M.

    2003-04-01

    The Stereo Camera System (SCS) will provide wide-angle (48 degree) multi-spectral stereo imaging of the Beagle 2 landing site in Isidis Planitia with an angular resolution of 0.75 milliradians. Based on the SpaceX Modular Micro-Imager, the SCS is composed of twin cameras (each with a 1024 by 1024 pixel frame-transfer CCD) and twin filter wheel units (with a combined total of 24 filters). The primary mission objective is to construct a digital elevation model of the area within reach of the lander's robot arm. The SCS specifications and the following baseline studies are described: panoramic RGB colour imaging of the landing site, and panoramic multi-spectral imaging at 12 distinct wavelengths to study the mineralogy of the landing site; solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged are multi-spectral observations of Phobos & Deimos (observations of the moons relative to background stars will be used to determine the lander's location and orientation relative to the Martian surface), monitoring of the landing site to detect temporal changes, observation of the actions and effects of the other PAW experiments (including rock texture studies with a close-up lens) and collaborative observations with the Mars Express orbiter instrument teams. Due to be launched in May of this year, the total system mass is 360 g, the required volume envelope is 747 cm³ and the average power consumption is 1.8 W. A 10 Mbit/s RS422 bus connects each camera to the lander common electronics.

  5. Three-dimensional shape measurement system applied to superficial inspection of non-metallic pipes for the hydrocarbons transport

    NASA Astrophysics Data System (ADS)

    Arciniegas, Javier R.; González, Andrés. L.; Quintero, L. A.; Contreras, Carlos R.; Meneses, Jaime E.

    2014-05-01

    Three-dimensional shape measurement is a subject that consistently attracts high scientific interest and provides information for medical, industrial and investigative applications, among others. In this paper, it is proposed to implement a three-dimensional (3D) reconstruction system for applications in the superficial inspection of non-metallic pipes for hydrocarbon transport. The system is formed by a CCD camera, a video projector and a laptop, and is based on the fringe projection technique. System functionality is demonstrated by evaluating the quality of the three-dimensional reconstructions obtained, which allow failures and defects on the surface of the object under study to be observed.
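
    The abstract does not state which phase-recovery algorithm is used; the textbook four-step phase-shifting scheme is a representative choice for fringe projection, recovering the wrapped phase from four patterns shifted by 90°:

        import numpy as np

        x = np.linspace(0, 4 * np.pi, 512)
        phase_true = np.angle(np.exp(1j * x))         # wrapped test phase
        # Four fringe patterns with 90-degree phase steps:
        I = [100 + 50 * np.cos(phase_true + k * np.pi / 2) for k in range(4)]

        phase = np.arctan2(I[3] - I[1], I[0] - I[2])  # wrapped phase estimate
        print(np.allclose(phase, phase_true))         # True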

  6. High Energy Replicated Optics to Explore the Sun Balloon-Borne Telescope: Astrophysical Pointing

    NASA Technical Reports Server (NTRS)

    Gaskin, Jessica; Wilson-Hodge, Colleen; Ramsey, Brian; Apple, Jeff; Dietz, Kurt; Tennant, Allyn; Swartz, Douglas; Christe, Steven D.; Shih, Albert

    2014-01-01

    On September 21, 2013, the High Energy Replicated Optics to Explore the Sun, or HEROES, balloon-borne x-ray telescope launched from the Columbia Scientific Balloon Facility's site in Fort Sumner, NM. The flight lasted for approximately 27 hours and the observational targets included the Sun and the astrophysical sources GRS 1915+105 and the Crab Nebula. Over the past year, the HEROES team upgraded the existing High Energy Replicated Optics (HERO) balloon-borne telescope to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES Project is a multi-NASA Center effort with team members at both Marshall Space Flight Center (MSFC) and Goddard Space Flight Center (GSFC), and is led by Co-PIs (one at each Center). The HEROES payload consists of the hard X-ray telescope HERO, developed at MSFC, combined with several new systems. To allow the HEROES telescope to make observations of the Sun, a new solar aspect system was added to supplement the existing star camera for fine pointing during both the day and night. A mechanical shutter was added to the star camera to protect it during solar observations, and two alignment monitoring systems were added for improved pointing and post-flight data reconstruction. This mission was funded by the NASA HOPE (Hands-On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.

  7. Binary pressure-sensitive paint measurements using miniaturised, colour, machine vision cameras

    NASA Astrophysics Data System (ADS)

    Quinn, Mark Kenneth

    2018-05-01

    Recent advances in machine vision technology and capability have led to machine vision cameras becoming applicable for scientific imaging. This study aims to demonstrate the applicability of machine vision colour cameras for the measurement of dual-component pressure-sensitive paint (PSP). The presence of a second luminophore component in the PSP mixture significantly reduces its inherent temperature sensitivity, increasing its applicability at low speeds. All of the devices tested are smaller than the cooled CCD cameras traditionally used, and most are of significantly lower cost, thereby increasing the accessibility of such technology and techniques. Comparisons between three machine vision cameras, a three-CCD camera, and a commercially available specialist PSP camera are made on a range of parameters, and a detailed PSP calibration is conducted in a static calibration chamber. The findings demonstrate that colour machine vision cameras can be used for quantitative, dual-component pressure measurements. These results give rise to the possibility of performing on-board dual-component PSP measurements in wind tunnels or on real flight/road vehicles.
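
    A common reduction for such paints (assumed here, with invented coefficients; not necessarily this study's exact procedure) ratios the pressure-sensitive channel against the reference channel to cancel illumination and paint-thickness variation, then applies a Stern-Volmer calibration I_ref/I = A + B(P/P_ref):

        import numpy as np

        A, B, P_ref = 0.2, 0.8, 101325.0                       # assumed calibration

        signal_ch    = np.random.uniform(0.8, 1.2, (64, 64))   # pressure-sensitive
        reference_ch = np.random.uniform(0.9, 1.1, (64, 64))   # reference luminophore

        ratio = reference_ch / signal_ch
        pressure = P_ref * (ratio - A) / B
        print(f"{pressure.mean():.0f} Pa (image mean)")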

  8. An arc control and protection system for the JET lower hybrid antenna based on an imaging system.

    PubMed

    Figueiredo, J; Mailloux, J; Kirov, K; Kinna, D; Stamp, M; Devaux, S; Arnoux, G; Edwards, J S; Stephen, A V; McCullen, P; Hogben, C

    2014-11-01

    Arcs are potentially the most dangerous events related to Lower Hybrid (LH) antenna operation. If left uncontrolled they can produce damage and cause plasma disruption by impurity influx. To address this issue, an arc real-time control and protection imaging system for the Joint European Torus (JET) LH antenna has been implemented. The LH system is one of the additional heating systems at JET. It comprises 24 microwave generators (klystrons, operating at 3.7 GHz) providing up to 5 MW of heating and current drive to the JET plasma. This is done through an antenna composed of an array of waveguides facing the plasma. The protection system presented here is based primarily on an imaging arc detection and real-time control system. It adapts the ITER-like wall hotspot protection system, using an identical CCD camera and real-time image processing unit. A filter has been installed to avoid saturation and spurious system triggers caused by ionization light. The antenna is divided into 24 Regions Of Interest (ROIs), each one corresponding to one klystron. If an arc precursor is detected in an ROI, power is reduced locally, avoiding the potential damage and plasma disruption that would otherwise follow. The power is subsequently reinstated if, during a defined interval of time, image analysis confirms that arcing is not present. This system was successfully commissioned during the restart phase and the beginning of the 2013 scientific campaign. Since its installation and commissioning, arcs and related phenomena have been prevented. In this contribution we briefly describe the camera, image processing, and real-time control systems. Most importantly, we demonstrate that an LH antenna arc protection system based on CCD camera imaging works. Examples of both controlled and uncontrolled LH arc events and their consequences are shown.
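
    The core detection logic amounts to per-klystron ROI monitoring with a local power command and a hold-off before restoration; a schematic sketch with placeholder thresholds, timings, and frame source (the JET implementation runs on a dedicated real-time image processing unit):

        import numpy as np

        THRESHOLD, HOLDOFF_FRAMES = 5000.0, 25

        def check_rois(frame, rois, quiet_count):
            """Return power commands per ROI; rois maps klystron -> (y0, y1, x0, x1)."""
            commands = {}
            for k, (y0, y1, x0, x1) in rois.items():
                if frame[y0:y1, x0:x1].sum() > THRESHOLD:
                    quiet_count[k] = 0
                    commands[k] = "reduce_power"
                else:
                    quiet_count[k] += 1
                    if quiet_count[k] == HOLDOFF_FRAMES:
                        commands[k] = "restore_power"
            return commands

        rois = {1: (0, 40, 0, 60), 2: (0, 40, 60, 120)}     # 2 of 24 klystrons
        quiet = {k: 0 for k in rois}
        frame = np.zeros((40, 120)); frame[5, 10] = 9999.0  # bright arc precursor
        print(check_rois(frame, rois, quiet))               # {1: 'reduce_power'}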

  9. A study on a portable fluorescence imaging system

    NASA Astrophysics Data System (ADS)

    Chang, Han-Chao; Wu, Wen-Hong; Chang, Chun-Li; Huang, Kuo-Cheng; Chang, Chung-Hsing; Chiu, Shang-Chen

    2011-09-01

    Fluorescence is the emission, by an organism or dye excited with UV light (200-405 nm), of light at a specific longer wavelength, usually visible or near-infrared (405-900 nm). During UV irradiation, a photosensitive agent is induced to start the photochemical reaction. Fluorescence images can be used for fluorescence diagnosis, followed by photodynamic therapy, of dental diseases and skin cancer, and have become a useful tool providing scientific evidence in many biomedical studies. However, most methods for acquiring fluorescence traces in biology remain primitive, relying on naked-eye observation and the researcher's subjective judgment. This article presents a portable camera for obtaining fluorescence images that compensates for the limitations of observer competence and subjective judgment. The portable camera offers a 375 nm UV-LED exciting light source, letting the user record fluorescence images that constitute persuasive scientific evidence. In addition, to raise the signal-to-noise ratio, the signal-processing module not only amplifies the fluorescence signal by up to 70% but also significantly decreases noise from environmental light, as shown in tests on a banknote and a nude mouse.

  10. Lunar Reconnaissance Orbiter Data Enable Science and Terrain Analysis of Potential Landing Sites in South Pole-Aitken Basin

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.

    2017-12-01

    Exploring the South Pole-Aitken basin (SPA), one of the key unsampled geologic terranes on the Moon, is a high priority for Solar System science. As the largest and oldest recognizable impact basin on the Moon, it anchors the heavy bombardment chronology. It is thus a key target for sample return to better understand the impact flux in the Solar System between the formation of the Moon and 3.9 Ga, when Imbrium, one of the last of the great lunar impact basins, formed. Exploration of SPA has implications for understanding early habitable environments on the terrestrial planets. Global mineralogical and compositional data exist from the Clementine UV-VIS camera, the Lunar Prospector Gamma Ray Spectrometer, the Moon Mineralogy Mapper (M3) on Chandrayaan-1, the Chang'E-1 Imaging Interferometer, the spectral suite on SELENE, and the Lunar Reconnaissance Orbiter Cameras (LROC) Wide Angle Camera (WAC) and Diviner thermal radiometer. Integration of data sets enables synergistic assessment of geology and the distribution of units across multiple spatial scales. Mineralogical assessment using hyperspectral data indicates spatial relationships with mineralogical signatures, e.g., central peaks of complex craters, consistent with inferred SPA basin structure and melt differentiation (Moriarty & Pieters, 2015, JGR-P 118). Delineation of mare, cryptomare, and nonmare surfaces is key to interpreting compositional mixing in the formation of SPA regolith, to interpreting remotely sensed data, and to the scientific assessment of landing sites. LROC Narrow Angle Camera (NAC) images show the location and distribution of >0.5 m boulders and fresh craters that constitute the main threats to automated landers and thus provide critical information for landing site assessment and planning. NAC images suitable for geometric stereo derivation, the digital terrain models derived from them (controlled with Lunar Orbiter Laser Altimeter (LOLA) data), and oblique NAC images made with large slews of the spacecraft are crucial to both scientific and landing-site assessments. These images, however, require favorable illumination and significant spacecraft resources, and thus make up only a small percentage of all of the images taken. It is essential for future exploration to support LRO's continued operation for these critical datasets.

  11. MOSES: a modular sensor electronics system for space science and commercial applications

    NASA Astrophysics Data System (ADS)

    Michaelis, Harald; Behnke, Thomas; Tschentscher, Matthias; Mottola, Stefano; Neukum, Gerhard

    1999-10-01

    The camera group of the DLR Institute of Space Sensor Technology and Planetary Exploration develops imaging instruments for scientific and space applications. One example is the ROLIS imaging system of the ESA scientific space mission Rosetta, which consists of a descent/down-looking imager and a close-up imager. Both are parts of the Rosetta Lander payload and will operate in the extreme environment of a cometary nucleus. The Rosetta Lander Imaging System (ROLIS) introduces a new concept for the sensor electronics, referred to as MOSES (Modular Sensor Electronics System). MOSES is a 3D-miniaturized CCD sensor electronics system built from single modules. Each of the modules has some flexibility and enables simple adaptation to specific application requirements. MOSES is mainly designed for space applications where high performance and high reliability are required. The concept, however, can also be used in other scientific or commercial applications. This paper describes the concept of MOSES, its characteristics, performance and applications.

  12. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F.; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair, each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

  13. Baffling system for the Wide Angle Camera (WAC) of ROSETTA mission

    NASA Astrophysics Data System (ADS)

    Brunello, Pierfrancesco; Peron, Fabio; Barbieri, Cesare; Fornasier, Sonia

    2000-10-01

    After the experience of the GIOTTO fly-by of comet Halley in 1986, the European Space Agency planned to improve the scientific knowledge of these astronomical objects by means of an even more ambitious rendezvous mission with another comet (P/Wirtanen). This mission, named ROSETTA, will run from 2003 to 2013, ending after the comet perihelion phase and including fly-bys of two asteroids of the main belt (140 Siwa and 4979 Otawara). The scientific priority of the mission is the in situ investigation of the cometary nucleus, with the aim of better understanding the formation and composition of planetesimals and their evolution over the last 4.5 billion years. In this context, the authors were involved in the design of the baffling for the Wide Angle Camera (WAC) of the imaging system (OSIRIS) carried on board the spacecraft. The scientific requirements for the WAC are: a large field of view (FOV) of 12° x 12° with a resolution of 100 μrad per pixel, UV response, and a contrast ratio of 10^-4 in order to detect gaseous and dusty features close to the nucleus of the comet. To achieve this performance, a fairly novel class of optical solutions employing off-axis sections of concentric mirrors was explored. Regarding baffling, the peculiar demand was the rejection of stray light generated by the optics for sources within the FOV, since the optical entrance aperture is located at the level of the secondary mirror (instead of the primary, as usual). This paper describes the baffle design and analyzes its performance, calculated by numerical simulation with ray-tracing methods, at different angles of incidence of the light, for sources both outside and inside the field of view.

  14. Accurate estimation of camera shot noise in the real-time

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.

    2017-10-01

    Nowadays digital cameras are essential parts of various technological processes and daily tasks. They are widely used in optics and photonics, astronomy, biology, and other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and videocameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: temporal noise includes the random component, while spatial noise includes the pattern component. Temporal noise can further be divided into signal-dependent shot noise and signal-independent dark temporal noise. The most widely used methods for measuring camera noise characteristics are standards such as EMVA Standard 1288, which allow precise measurement of shot and dark temporal noise but are difficult to implement and time-consuming. Earlier we proposed a method for measuring the temporal noise of photo- and videocameras based on the automatic segmentation of nonuniform targets (ASNT); only two frames are sufficient for noise measurement with the modified method. In this paper, we registered frames and estimated the shot and dark temporal noise of cameras in real time using the modified ASNT method. Estimation was performed for the following cameras: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PL-B781F (CMOS, 6.6 MP, 10-bit ADC), and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). The experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. The time for registering and processing the frames used for temporal noise estimation was measured: using a standard computer, frames were registered and processed in between a fraction of a second and several seconds. The accuracy of the obtained temporal noise values was also estimated.
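
    The two-frame idea can be sketched as follows: differencing two exposures of the same static, nonuniform scene cancels the fixed-pattern component, so the per-bin standard deviation of the difference (divided by the square root of 2) estimates temporal noise as a function of signal. The numpy sketch below illustrates this general principle only; it is not the authors' published ASNT algorithm.

        import numpy as np

        def temporal_noise_vs_signal(f1, f2, n_bins=64):
            """Temporal noise as a function of signal from two frames of the
            same static, nonuniform scene. Differencing the frames cancels
            the fixed-pattern (spatial) component; dividing the standard
            deviation of the difference by sqrt(2) gives the temporal noise."""
            f1 = f1.astype(np.float64)
            f2 = f2.astype(np.float64)
            mean = 0.5 * (f1 + f2)                 # per-pixel signal estimate
            diff = (f1 - f2).ravel()
            edges = np.linspace(mean.min(), mean.max(), n_bins + 1)
            idx = np.digitize(mean.ravel(), edges)
            signal, noise = [], []
            for b in range(1, n_bins + 1):
                sel = idx == b
                if sel.sum() < 100:                # skip sparsely populated bins
                    continue
                signal.append(mean.ravel()[sel].mean())
                noise.append(diff[sel].std() / np.sqrt(2.0))
            return np.array(signal), np.array(noise)

        # For a Poisson-limited sensor, noise**2 grows linearly with signal;
        # the intercept estimates the dark temporal noise.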

  15. Imaging of Mercury and Venus from a flyby

    USGS Publications Warehouse

    Murray, B.C.; Belton, M.J.S.; Danielson, G. Edward; Davies, M.E.; Kuiper, G.P.; O'Leary, B. T.; Suomi, V.E.; Trask, N.J.

    1971-01-01

    This paper describes the results of a study of an imaging experiment planned for the 1973 Mariner Venus/Mercury flyby mission. Scientific objectives, mission constraints, analysis of alternative systems, and the rationale for the final choice are presented. Severe financial constraints ruled out the best technical alternative for flyby imaging, a film/readout system, or even significant redesign of previous Mariner vidicon camera/tape recorder systems. The final selection was a vidicon camera quite similar to that used for Mariner Mars 1971, but with the capability of real-time transmission during the Venus and Mercury flybys. Real-time data return became possible through a dramatic increase in the communications bandwidth at only modest sacrifice in the quality of the returned pictures. Two identical long-focal-length cameras (1500 mm) were selected, and it will be possible to return several thousand pictures from both planets at resolutions ranging from the equivalent of Earth-based observations down to tenths of a kilometer at encounter. Systematic high-resolution ultraviolet photography of Venus is planned after encounter in an attempt to understand the nature of the mysterious ultraviolet markings and their apparent 4- to 5-day rotation period. Full-disk coverage in mosaics will produce pictures of both planets similar in quality to Earth-based telescopic pictures of the Moon. The increase of resolution, more than three orders of magnitude, will yield an exciting first look at two planets whose closeup appearance is unknown. © 1971.

  16. Sampling from the Museum of Forms: Photography and Visual Thinking in the Rise of Modern Statistics.

    ERIC Educational Resources Information Center

    Biocca, Frank

    Treating the camera as an information technology, this essay shows how the camera embodies a powerful theoretical disquisition on the nature of form, realism, and scientific vision. The first section presents a history of form, separate from matter, as something collectible in a library or museum. The second section discusses the photograph as a rival…

  17. Accurate color images: from expensive luxury to essential resource

    NASA Astrophysics Data System (ADS)

    Saunders, David R.; Cupitt, John

    2002-06-01

    Over ten years ago the National Gallery in London began a program to make digital images of paintings in the collection using a colorimetric imaging system. This was to provide a permanent record of the state of paintings against which future images could be compared to determine if any changes had occurred. It quickly became apparent that such images could be used not only for scientific purposes, but also in applications where transparencies were then being used, for example as source materials for printed books and catalogues or for computer-based information systems. During the 1990s we were involved in the development of a series of digital cameras that have combined the high color accuracy of the original 'scientific' imaging system with the familiarity and portability of a medium-format camera. This has culminated in the program of digitization now in progress at the National Gallery. By the middle of 2001 we will have digitized all the major paintings in the collection at a resolution of 10,000 pixels along their longest dimension and with calibrated color; we are on target to digitize the whole collection by the end of 2002. The images are available on-line within the museum for consultation, and Gallery departments can use the images in printed publications and on the Gallery's website. We describe the development of the imaging systems used at the National Gallery and how the research we have conducted into high-resolution, accurate color imaging has developed from being a peripheral, if harmless, research activity to becoming a central part of the Gallery's information and publication strategy. Finally, we discuss some outstanding issues, such as interfacing our color management procedures with the systems used by external organizations.
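
    At the core of any such colorimetric characterization is a mapping from linearized camera responses to a device-independent color space. A common minimal approach, shown below purely as an illustration (the Gallery's actual pipeline is more elaborate), is a 3x3 matrix fitted by least squares to measurements of a color chart.

        import numpy as np

        def fit_color_matrix(camera_rgb: np.ndarray, target_xyz: np.ndarray) -> np.ndarray:
            """Least-squares 3x3 matrix mapping linear camera RGB to CIE XYZ,
            fitted from (N, 3) arrays of linearized colour-chart patch values.
            A generic colorimetric characterization sketch, not the Gallery's
            actual calibration procedure."""
            X, *_ = np.linalg.lstsq(camera_rgb, target_xyz, rcond=None)
            return X.T  # so that xyz = M @ rgb for a single colour vector

        # Usage: M = fit_color_matrix(rgb_patches, xyz_patches); xyz = M @ pixel_rgb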

  18. X-ray detectors at the Linac Coherent Light Source.

    PubMed

    Blaj, Gabriel; Caragiulo, Pietro; Carini, Gabriella; Carron, Sebastian; Dragone, Angelo; Freytag, Dietrich; Haller, Gunther; Hart, Philip; Hasi, Jasmine; Herbst, Ryan; Herrmann, Sven; Kenney, Chris; Markovic, Bojan; Nishimura, Kurtis; Osier, Shawn; Pines, Jack; Reese, Benjamin; Segal, Julie; Tomada, Astrid; Weaver, Matt

    2015-05-01

    Free-electron lasers (FELs) present new challenges for camera development compared with conventional light sources. At SLAC a variety of technologies are being used to match the demands of the Linac Coherent Light Source (LCLS) and to support a wide range of scientific applications. In this paper an overview of X-ray detector design requirements at FELs is presented and the various cameras in use at SLAC are described for the benefit of users planning experiments or analysts looking at data. Features and operation of the CSPAD camera, which is currently deployed at LCLS, are discussed, and the ePix family, a new generation of cameras under development at SLAC, is introduced.

  19. X-ray detectors at the Linac Coherent Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaj, Gabriel; Caragiulo, Pietro; Carini, Gabriella

    Free-electron lasers (FELs) present new challenges for camera development compared with conventional light sources. At SLAC a variety of technologies are being used to match the demands of the Linac Coherent Light Source (LCLS) and to support a wide range of scientific applications. In this paper an overview of X-ray detector design requirements at FELs is presented and the various cameras in use at SLAC are described for the benefit of users planning experiments or analysts looking at data. Features and operation of the CSPAD camera, which is currently deployed at LCLS, are discussed, and the ePix family, a new generation of cameras under development at SLAC, is introduced.

  20. X-ray detectors at the Linac Coherent Light Source

    DOE PAGES

    Blaj, Gabriel; Caragiulo, Pietro; Carini, Gabriella; ...

    2015-04-21

    Free-electron lasers (FELs) present new challenges for camera development compared with conventional light sources. At SLAC a variety of technologies are being used to match the demands of the Linac Coherent Light Source (LCLS) and to support a wide range of scientific applications. In this paper an overview of X-ray detector design requirements at FELs is presented and the various cameras in use at SLAC are described for the benefit of users planning experiments or analysts looking at data. Features and operation of the CSPAD camera, which is currently deployed at LCLS, are discussed, and the ePix family, a new generation of cameras under development at SLAC, is introduced.

  1. X-ray detectors at the Linac Coherent Light Source

    PubMed Central

    Blaj, Gabriel; Caragiulo, Pietro; Carini, Gabriella; Carron, Sebastian; Dragone, Angelo; Freytag, Dietrich; Haller, Gunther; Hart, Philip; Hasi, Jasmine; Herbst, Ryan; Herrmann, Sven; Kenney, Chris; Markovic, Bojan; Nishimura, Kurtis; Osier, Shawn; Pines, Jack; Reese, Benjamin; Segal, Julie; Tomada, Astrid; Weaver, Matt

    2015-01-01

    Free-electron lasers (FELs) present new challenges for camera development compared with conventional light sources. At SLAC a variety of technologies are being used to match the demands of the Linac Coherent Light Source (LCLS) and to support a wide range of scientific applications. In this paper an overview of X-ray detector design requirements at FELs is presented and the various cameras in use at SLAC are described for the benefit of users planning experiments or analysts looking at data. Features and operation of the CSPAD camera, which is currently deployed at LCLS, are discussed, and the ePix family, a new generation of cameras under development at SLAC, is introduced. PMID:25931071

  2. An Investigation on the Use of Different Centroiding Algorithms and Star Catalogs in Astro-Geodetic Observations

    NASA Astrophysics Data System (ADS)

    Basoglu, Burak; Halicioglu, Kerem; Albayrak, Muge; Ulug, Rasit; Tevfik Ozludemir, M.; Deniz, Rasim

    2017-04-01

    In the last decade, the Turkish National Geodesy Commission has emphasized the importance of high-precision geoid determination at the local and national levels, and has placed the modernization of Turkey's national height system on its agenda. Several related projects have been realized in recent years. In Istanbul, a GNSS/Levelling geoid was defined in 2005 for the metropolitan area of the city with an accuracy of ±3.5 cm. In order to achieve a better accuracy in this area, the project "Local Geoid Determination with Integration of GNSS/Levelling and Astro-Geodetic Data" has been conducted at Istanbul Technical University and Bogazici University KOERI since January 2016, funded by The Scientific and Technological Research Council of Turkey. Within the scope of the project, the Digital Zenith Camera System is being modernized in terms of both hardware components and software. Particular attention is given to the star catalogues and to the centroiding algorithm used to identify the stars in the zenithal star field. During the test observations of the Digital Zenith Camera System performed between 2013 and 2016, final results were calculated using the PSF method for star centroiding and the second USNO CCD Astrograph Catalogue (UCAC2) for the reference star positions. This study aims to investigate the position accuracy of the star images by comparing different centroiding algorithms and available star catalogs used in astro-geodetic observations conducted with the digital zenith camera system.
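
    The simplest baseline against which PSF fitting is usually compared is a moment-based (center-of-mass) centroid. The sketch below is a generic illustration of that baseline, not the project's code.

        import numpy as np

        def intensity_weighted_centroid(stamp: np.ndarray, background: float = 0.0):
            """Centre-of-mass centroid of a star-image cutout, in pixel
            coordinates. A deliberately simple moment-based centroider for
            comparison against PSF fitting; illustrative only."""
            img = np.clip(stamp.astype(np.float64) - background, 0.0, None)
            total = img.sum()
            if total <= 0:
                raise ValueError("no flux above background in stamp")
            ys, xs = np.indices(img.shape)
            return (xs * img).sum() / total, (ys * img).sum() / total  # (x, y)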

  3. Wide-field spectrally resolved quantitative fluorescence imaging system: toward neurosurgical guidance in glioma resection

    NASA Astrophysics Data System (ADS)

    Xie, Yijing; Thom, Maria; Ebner, Michael; Wykes, Victoria; Desjardins, Adrien; Miserocchi, Anna; Ourselin, Sebastien; McEvoy, Andrew W.; Vercauteren, Tom

    2017-11-01

    In high-grade glioma surgery, tumor resection is often guided by intraoperative fluorescence imaging. 5-aminolevulinic acid-induced protoporphyrin IX (PpIX) provides fluorescent contrast between normal brain tissue and glioma tissue, thus achieving improved tumor delineation and prolonged patient survival compared with conventional white-light-guided resection. However, commercially available fluorescence imaging systems rely solely on visual assessment of fluorescence patterns by the surgeon, which makes the resection more subjective than necessary. We developed a wide-field spectrally resolved fluorescence imaging system utilizing a Generation II scientific CMOS camera and an improved computational model for the precise reconstruction of the PpIX concentration map. In our model, the tissue's optical properties and illumination geometry, which distort the fluorescent emission spectra, are considered. We demonstrate that the CMOS-based system can detect low PpIX concentration at short camera exposure times, while providing high-pixel resolution wide-field images. We show that total variation regularization improves the contrast-to-noise ratio of the reconstructed quantitative concentration map by approximately twofold. Quantitative comparison between the estimated PpIX concentration and tumor histopathology was also investigated to further evaluate the system.

  4. Sub-picosecond streak camera measurements at LLNL: From IR to x-rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuba, J; Shepherd, R; Booth, R

    An ultrafast streak camera with sub-picosecond resolution has recently been developed at LLNL. The camera is a versatile instrument with a wide operating wavelength range. Temporal resolution of up to 300 fs can be achieved, with routine operation at 500 fs. The streak camera has been operated over a wide wavelength range from the IR to x-rays up to 2 keV. In this paper we briefly review the main design features that result in the unique properties of the streak camera and present several of its scientific applications: (1) streak camera characterization using a Michelson interferometer in the visible range, (2) a temporally resolved study of a transient x-ray laser at 14.7 nm, which enabled us to vary the x-ray laser pulse duration from ~2-6 ps by changing the pump laser parameters, and (3) an example of a time-resolved spectroscopy experiment with the streak camera.

  5. An Automatic Image-Based Modelling Method Applied to Forensic Infography

    PubMed Central

    Zancajo-Blazquez, Sandra; Gonzalez-Aguilera, Diego; Gonzalez-Jorge, Higinio; Hernandez-Lopez, David

    2015-01-01

    This paper presents a new method based on 3D reconstruction from images that demonstrates the utility and integration of close-range photogrammetry and computer vision as an efficient alternative for modelling complex objects and scenarios of forensic infography. The results obtained confirm the validity of the method compared to other existing alternatives as it guarantees the following: (i) flexibility, permitting work with any type of camera (calibrated and non-calibrated, smartphone or tablet) and image (visible, infrared, thermal, etc.); (ii) automation, allowing the reconstruction of three-dimensional scenarios in the absence of manual intervention; and (iii) high-quality results, sometimes providing higher resolution than modern laser scanning systems. As a result, each ocular inspection of a crime scene performed by the scientific police with any camera can be transformed into a scaled 3D model. PMID:25793628

  6. NASA Webb Telescope

    NASA Image and Video Library

    2017-12-08

    NASA image release September 17, 2010 In preparation for a cryogenic test, NASA Goddard technicians install instrument mass simulators onto the James Webb Space Telescope ISIM structure. The ISIM structure supports and holds the four Webb telescope science instruments: the Mid-Infrared Instrument (MIRI), the Near-Infrared Camera (NIRCam), the Near-Infrared Spectrograph (NIRSpec) and the Fine Guidance Sensor (FGS). Credit: NASA/GSFC/Chris Gunn To learn more about the James Webb Space Telescope go to: www.jwst.nasa.gov/ NASA Goddard Space Flight Center contributes to NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s endeavors by providing compelling scientific knowledge to advance the Agency’s mission.

  7. Space Telescopes Reveal Secrets of Turbulent Black Hole

    NASA Image and Video Library

    2017-12-08

    NASA image release September 29, 2011 This image of the distant active galaxy Markarian 509 was taken in April 2007 with the Hubble Space Telescope's Wide Field Camera 2. To read more go to: www.nasa.gov/mission_pages/hubble/science/turbulent-black... Credit: NASA, ESA, G. Kriss (STScI), and J. de Plaa (SRON Netherlands Institute for Space Research); Acknowledgment: B. Peterson (Ohio State University) NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  8. ABRIXAS: scientific goal and mission concept

    NASA Astrophysics Data System (ADS)

    Predehl, Peter

    1999-10-01

    ABRIXAS is a small German satellite project with the goal of surveying the sky in the x-ray band between 0.5 and 12 keV, thereby extending the former ROSAT all-sky survey towards higher energies. It consists of seven highly nested Wolter-I mirror systems which share one common focal-plane camera, a CCD detector of the novel pn-type. ABRIXAS benefits from previously developed technologies and existing instruments. It was launched successfully into a low Earth orbit by a Russian KOSMOS rocket in late April 1999. A few hours after launch, however, the mission failed due to a battery problem. Currently, a repetition of the mission is under discussion, because both the scientific goal and the mission concept are still regarded as very attractive.

  9. Research Instruments

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The GENETI-SCANNER, the newest product of Perceptive Scientific Instruments, Inc. (PSI), rapidly scans slides and locates, digitizes, measures, and classifies specific objects and events in research and diagnostic applications. Founded by former NASA employees, PSI bases its primary product line on NASA image processing technology. The instruments perform karyotyping - a process employed in the analysis and classification of chromosomes - using a video camera mounted on a microscope. Images are digitized, enabling chromosome image enhancement. The system enables karyotyping to be done significantly faster, increasing productivity and lowering costs. The product is no longer being manufactured.

  10. Coordinates of anthropogenic features on the Moon

    NASA Astrophysics Data System (ADS)

    Wagner, R. V.; Nelson, D. M.; Plescia, J. B.; Robinson, M. S.; Speyerer, E. J.; Mazarico, E.

    2017-02-01

    High-resolution images from the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) reveal the landing locations of recent and historic spacecraft and associated impact sites across the lunar surface. Using multiple images of each site acquired between 2009 and 2015, an improved Lunar Reconnaissance Orbiter (LRO) ephemeris, and a temperature-dependent camera orientation model, we derived accurate coordinates (<12 m) for each soft-landed spacecraft, rover, deployed scientific payload, and spacecraft impact crater that we have identified. Accurate coordinates enhance the scientific interpretations of data returned by the surface instruments and of returned samples of the Apollo and Luna sites. In addition, knowledge of the sizes and positions of craters formed as the result of impacting spacecraft provides key benchmarks into the relationship between energy and crater size, as well as calibration points for reanalyzing seismic measurements acquired during the Apollo program. We identified the impact craters for the three spacecraft that impacted the surface during the LRO mission by comparing before and after NAC images.

  11. Coordinates of Anthropogenic Features on the Moon

    NASA Technical Reports Server (NTRS)

    Wagner, R. V.; Nelson, D. M.; Plescia, J. B.; Robinson, M. S.; Speyerer, E. J.; Mazarico, E.

    2016-01-01

    High-resolution images from the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) reveal the landing locations of recent and historic spacecraft and associated impact sites across the lunar surface. Using multiple images of each site acquired between 2009 and 2015, an improved Lunar Reconnaissance Orbiter (LRO) ephemeris, and a temperature-dependent camera orientation model, we derived accurate coordinates (less than 12 meters) for each soft-landed spacecraft, rover, deployed scientific payload, and spacecraft impact crater that we have identified. Accurate coordinates enhance the scientific interpretations of data returned by the surface instruments and of returned samples of the Apollo and Luna sites. In addition, knowledge of the sizes and positions of craters formed as the result of impacting spacecraft provides key benchmarks into the relationship between energy and crater size, as well as calibration points for reanalyzing seismic measurements acquired during the Apollo program. We identified the impact craters for the three spacecraft that impacted the surface during the LRO mission by comparing before and after NAC images.

  12. Guide for the Preparation of Scientific Papers for Publication. Second Edition.

    ERIC Educational Resources Information Center

    Martinsson, Anders

    Updating a 1968 publication, this document presents rules and explanatory comments for use by authors and editors involved in the preparation of a scientific manuscript for professional typesetting prior to publication. It is noted that the guidelines should also be useful for authors producing camera-ready typescript with word processing…

  13. PRoViScout: a planetary scouting rover demonstrator

    NASA Astrophysics Data System (ADS)

    Paar, Gerhard; Woods, Mark; Gimkiewicz, Christiane; Labrosse, Frédéric; Medina, Alberto; Tyler, Laurence; Barnes, David P.; Fritz, Gerald; Kapellos, Konstantinos

    2012-01-01

    Mobile systems exploring planetary surfaces will in the future require more autonomy than today. The EU FP7-SPACE project PRoViScout (2010-2012) establishes the building blocks of such autonomous exploration systems in terms of robotic vision, combining navigation and scientific target selection in a decision-based manner, and integrates them into a framework ready for, and exposed to, field demonstration. The PRoViScout on-board system consists of mission management components such as an Executive, a Mars Mission On-Board Planner and Scheduler, a Science Assessment Module, and Navigation & Vision Processing modules. The platform hardware consists of the rover with its sensors and pointing devices. We report on the major building blocks and their functions & interfaces, with emphasis on the computer vision parts such as image acquisition (using a novel zoomed 3D-Time-of-Flight & RGB camera), mapping from 3D-TOF data, panoramic image & stereo reconstruction, hazard and slope maps, visual odometry, and the recognition of potentially scientifically interesting targets.

  14. NICMOS status and plans

    NASA Technical Reports Server (NTRS)

    Thompson, Rodger I.

    1997-01-01

    The Near Infrared Camera and Multi-Object Spectrometer (NICMOS) has been in orbit for about 8 months. This is a report on its current status and future plans. Also included are some comments on particular aspects of data analysis concerning dark subtraction, shading, and removal of cosmic rays. At present NICMOS provides excellent images of high scientific content. Most of the observations utilize cameras 1 and 2, which are in excellent focus. Camera 3 is not yet within the range of the focus adjustment mechanism, but its current images are still of very good quality. In this paper we present the status of various aspects of the NICMOS instrument.

  15. The Mast Cameras and Mars Descent Imager (MARDI) for the 2009 Mars Science Laboratory

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Bell, J. F.; Cameron, J.; Dietrich, W. E.; Edgett, K. S.; Hallet, B.; Herkenhoff, K. E.; Lemmon, M. T.; Parker, T. J.; Sullivan, R. J.

    2005-01-01

    Based on operational experience gained during the Mars Exploration Rover (MER) mission, we proposed and were selected to conduct two related imaging experiments: (1) an investigation of the geology and short-term atmospheric vertical wind profile local to the Mars Science Laboratory (MSL) landing site using descent imaging, and (2) a broadly-based scientific investigation of the MSL locale employing visible and very near infra-red imaging techniques from a pair of mast-mounted, high resolution cameras. Both instruments share a common electronics design, a design also employed for the MSL Mars Hand Lens Imager (MAHLI) [1]. The primary differences between the cameras are in the nature and number of mechanisms and the specific optics tailored to each camera's requirements.

  16. Observation of Passive and Explosive Emissions at Stromboli with a Ground-based Hyperspectral TIR Camera

    NASA Astrophysics Data System (ADS)

    Smekens, J. F.; Mathieu, G.

    2015-12-01

    Scientific imaging techniques have progressed at a fast pace in recent years, thanks in part to great improvements in detector technology and to our ability to process large amounts of complex data using sophisticated software. Broadband thermal cameras are ubiquitously used for permanent monitoring of volcanic activity and have been used in a multitude of scientific applications, from tracking ballistics to studying the thermal evolution of lava flow fields and volcanic plumes. In parallel, UV cameras are now used at several volcano observatories to quantify daytime sulfur dioxide (SO2) emissions at very high frequency. In this work we present the results of the first deployment of a ground-based Thermal Infrared (TIR) Hyperspectral Imaging System (Telops Hyper-Cam LW) for the study of passive and explosive volcanic activity at Stromboli volcano, Italy. The instrument uses a Michelson spectrometer and Fourier Transform Infrared Spectrometry to produce hyperspectral datacubes of a scene (320x256 pixels) in the range 7.7-11.8 μm, with a spectral resolution of up to 0.25 cm^-1 and at frequencies of ~10 Hz. The activity at Stromboli is characterized by explosions of small magnitude, often containing significant amounts of gas and ash, separated by periods of quiescent degassing of 10-60 minutes. With our dataset, spanning about 5 days of monitoring, we are able to detect and track temporal variations of SO2 and ash emissions during both daytime and nighttime, ultimately allowing quantification of the mass of gas and ash ejected during and between explosive events. Although the high price and power consumption of the instrument are obstacles to its deployment as a monitoring tool, this type of data set offers unprecedented insight into the dynamic processes taking place at Stromboli, and could lead to a better understanding of the eruptive mechanisms at persistently active systems in general.

  17. Low-Cost Alternative for Signal Generators in the Physics Laboratory

    NASA Astrophysics Data System (ADS)

    Pathare, Shirish Rajan; Raghavendra, M. K.; Huli, Saurabhee

    2017-05-01

    Recently devices such as the optical mouse of a computer, webcams, Wii remote, and digital cameras have been used to record and analyze different physical phenomena quantitatively. Devices like tablets and smartphones are also becoming popular. Different scientific applications available at Google Play (Android devices) or the App Store (iOS devices) make them versatile. One can find many websites that provide information regarding various scientific applications compatible with these systems. A variety of smartphones/tablets are available with different types of sensors embedded. Some of them have sensors that are capable of measuring intensity of light, sound, and magnetic field. The camera of these devices has been used to study projectile motion, and the same device, along with a sensor, has been used to study the physical pendulum. Accelerometers have been used to study free and damped harmonic oscillations and to measure acceleration due to gravity. Using accelerometers and gyroscopes, angular velocity and centripetal acceleration have been measured. The coefficient of restitution for a ball bouncing on the floor has been measured using the application Oscilloscope on the iPhone. In this article, we present the use of an Android device as a low-cost alternative for a signal generator. We use the Signal Generator application installed on the Android device along with an amplifier circuit.

  18. Autonomous vision-based navigation for proximity operations around binary asteroids

    NASA Astrophysics Data System (ADS)

    Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo

    2018-02-01

    Future missions to small bodies demand a higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's asteroid impact mission (AIM). The main objective of AIM is the detailed characterization of binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV). Therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at a higher frequency. The objective of the autonomous navigation is to improve the on-board knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.

  19. Autonomous vision-based navigation for proximity operations around binary asteroids

    NASA Astrophysics Data System (ADS)

    Gil-Fernandez, Jesus; Ortega-Hernando, Guillermo

    2018-06-01

    Future missions to small bodies demand a higher level of autonomy in the Guidance, Navigation and Control system for higher scientific return and lower operational costs. Different navigation strategies have been assessed for ESA's asteroid impact mission (AIM). The main objective of AIM is the detailed characterization of binary asteroid Didymos. The trajectories for the proximity operations shall be intrinsically safe, i.e., no collision in presence of failures (e.g., spacecraft entering safe mode), perturbations (e.g., non-spherical gravity field), and errors (e.g., maneuver execution error). Hyperbolic arcs with sufficient hyperbolic excess velocity are designed to fulfil the safety, scientific, and operational requirements. The trajectory relative to the asteroid is determined using visual camera images. The ground-based trajectory prediction error at some points is comparable to the camera Field Of View (FOV). Therefore, some images do not contain the entire asteroid. Autonomous navigation can update the state of the spacecraft relative to the asteroid at a higher frequency. The objective of the autonomous navigation is to improve the on-board knowledge compared to the ground prediction. The algorithms shall fit in off-the-shelf, space-qualified avionics. This note presents suitable image processing and relative-state filter algorithms for autonomous navigation in proximity operations around binary asteroids.

  20. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using sufficient cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling those data to a central computer and processing them in real time is difficult when using low-cost, commercially available components. A newly developed system places the cameras on a combined power and data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security cameras. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost. The low power requirements of each camera allow the creation of a single imaging system comprising over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of an event is reported to the host computer in Cartesian coordinates computed from data correlation across multiple cameras. In this way, events in the field of view present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly generated by the cameras. This approach offers greater flexibility than conventional systems, without compromising performance, by using many small, low-cost cameras with overlapping fields of view. This means significantly increased viewing without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, because a single cable is shared for power and data, installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/vehicular monitoring systems are also potential applications.
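
    The abstract does not spell out the correlation scheme, but the essence of computing a Cartesian event position from multiple cameras can be illustrated with the classic two-ray triangulation below; the camera positions, bearing vectors, and least-squares midpoint construction are all our own illustrative assumptions, not the system's published algorithm.

        import numpy as np

        def triangulate_two_rays(p1, d1, p2, d2):
            """Midpoint of the closest approach between two bearing rays.
            p1, p2: camera positions; d1, d2: unit direction vectors toward
            the detected event (generic two-view triangulation sketch)."""
            p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
            # Solve [d1 -d2] [t1 t2]^T = p2 - p1 in the least-squares sense
            A = np.stack([d1, -d2], axis=1)
            t, *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
            q1, q2 = p1 + t[0] * d1, p2 + t[1] * d2  # closest points on each ray
            return 0.5 * (q1 + q2)

        # Example: two cameras 10 m apart both sight an event at 45 degrees
        print(triangulate_two_rays([0, 0, 0], [0.707, 0.707, 0],
                                   [10, 0, 0], [-0.707, 0.707, 0]))  # ~[5, 5, 0]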

  1. An arc control and protection system for the JET lower hybrid antenna based on an imaging system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Figueiredo, J., E-mail: joao.figueiredo@jet.efda.org; Mailloux, J.; Kirov, K.

    Arcs are the potentially most dangerous events related to Lower Hybrid (LH) antenna operation. If left uncontrolled they can produce damage and cause plasma disruption by impurity influx. To address this issue, an arc real-time control and protection imaging system for the Joint European Torus (JET) LH antenna has been implemented. The LH system is one of the additional heating systems at JET. It comprises 24 microwave generators (klystrons, operating at 3.7 GHz) providing up to 5 MW of heating and current drive to the JET plasma. This is done through an antenna composed of an array of waveguides facing the plasma. The protection system presented here is based primarily on an imaging arc detection and real-time control system. It has adapted the ITER-like wall hotspot protection system, using an identical CCD camera and real-time image processing unit. A filter has been installed to avoid saturation and spurious system triggers caused by ionization light. The antenna is divided into 24 Regions Of Interest (ROIs), each one corresponding to one klystron. If an arc precursor is detected in a ROI, power is reduced locally, avoiding potential damage and plasma disruption. The power is subsequently reinstated if, during a defined interval of time, image analysis confirms that arcing is not present. This system was successfully commissioned during the restart phase and the beginning of the 2013 scientific campaign. Since its installation and commissioning, arcs and related phenomena have been prevented. In this contribution we briefly describe the camera, image processing, and real-time control systems. Most importantly, we demonstrate that an LH antenna arc protection system based on CCD camera imaging works. Examples of both controlled and uncontrolled LH arc events and their consequences are shown.
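
    The per-klystron ROI logic can be caricatured in a few lines: split each frame into one region per klystron and flag regions whose brightness exceeds a trigger level. The sketch below is a toy illustration only; the ROI geometry, the detection criterion, and the threshold are assumptions, not the JET implementation.

        import numpy as np

        def arc_precursor_rois(frame: np.ndarray, threshold: float, n_rois: int = 24):
            """Return indices of regions whose peak brightness exceeds the
            threshold. ROIs are modeled as vertical strips, one per klystron
            (an assumption made for illustration)."""
            strips = np.array_split(frame, n_rois, axis=1)  # one strip per klystron
            return [i for i, roi in enumerate(strips) if roi.max() > threshold]

        # In a real-time loop one would reduce power on the flagged klystrons,
        # then restore it once later frames show the ROI back below threshold.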

  2. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
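
    The quoted repeat period of slightly more than 136 years is consistent with a time code built on a 32-bit seconds counter; note that this reading of the Geo-TimeCode format is our inference, not stated in the source.

        # 2^32 seconds is slightly more than 136 years, matching the quoted
        # repeat period (an inference about the counter width, not a
        # published specification of the Geo-TimeCode format).
        SECONDS_PER_YEAR = 365.25 * 24 * 3600
        print(2**32 / SECONDS_PER_YEAR)  # ~136.1 years before the counter wraps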

  3. Stellar Occultations in the Coma of Comet 67P/Churyumov-Gerasimenko Observed by the OSIRIS Camera System

    NASA Astrophysics Data System (ADS)

    Moissl, Richard; Kueppers, Michael

    2016-10-01

    In this paper we present the results of an analysis of a large part of the existing image data from the OSIRIS camera system onboard the Rosetta spacecraft, in which stars of sufficient brightness (down to a limiting magnitude of 6) have been observed through the coma of Comet 67P/Churyumov-Gerasimenko ("C-G"). Over the course of the Rosetta main mission the coma of the comet underwent large changes in density and structure, owing to the changing insolation along the orbit of C-G. We report on the changes of the stellar signals in the wavelength ranges covered by the filters of the OSIRIS Narrow-Angle (NAC) and Wide-Angle (WAC) cameras. Acknowledgements: OSIRIS was built by a consortium led by the Max-Planck-Institut für Sonnensystemforschung, Göttingen, Germany, in collaboration with CISAS, University of Padova, Italy, the Laboratoire d'Astrophysique de Marseille, France, the Instituto de Astrofísica de Andalucia, CSIC, Granada, Spain, the Scientific Support Office of the European Space Agency, Noordwijk, The Netherlands, the Instituto Nacional de Técnica Aeroespacial, Madrid, Spain, the Universidad Politéchnica de Madrid, Spain, the Department of Physics and Astronomy of Uppsala University, Sweden, and the Institut für Datentechnik und Kommunikationsnetze der Technischen Universität Braunschweig, Germany.

  4. Experiments with synchronized sCMOS cameras

    NASA Astrophysics Data System (ADS)

    Steele, Iain A.; Jermak, Helen; Copperwheat, Chris M.; Smith, Robert J.; Poshyachinda, Saran; Soonthorntham, Boonrucksar

    2016-07-01

    Scientific CMOS (sCMOS) cameras can combine low noise with high readout speeds, and they do not suffer the charge-multiplication noise that effectively reduces the quantum efficiency of electron-multiplying CCDs by a factor of 2. As such they have strong potential for fast photometry and polarimetry instrumentation. In this paper we describe the results of laboratory experiments using a pair of commercial off-the-shelf sCMOS cameras based around a four-transistor-per-pixel architecture. In particular, using both stable and pulsed light sources, we evaluate the timing precision that may be obtained when the camera readouts are synchronized either in software or electronically. We find that software synchronization can introduce an error of 200 ms. With electronic synchronization, any error is below the limit (50 ms) of our simple measurement technique.
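
    One simple way to quantify such an offset with a pulsed source is to cross-correlate the per-frame mean intensities recorded by the two cameras. The sketch below is a generic estimator of that kind, offered as an assumption-laden illustration rather than the paper's actual measurement technique.

        import numpy as np

        def estimate_offset(sig_a: np.ndarray, sig_b: np.ndarray, dt: float) -> float:
            """Relative timing offset between two cameras viewing the same
            pulsed light source, from per-frame mean intensities sampled at
            frame interval dt. Positive result: camera A lags camera B."""
            a = sig_a - sig_a.mean()
            b = sig_b - sig_b.mean()
            corr = np.correlate(a, b, mode="full")
            lag_frames = corr.argmax() - (len(b) - 1)  # lag in whole frames
            return lag_frames * dt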

  5. The PanCam Instrument for the ExoMars Rover

    NASA Astrophysics Data System (ADS)

    Coates, A. J.; Jaumann, R.; Griffiths, A. D.; Leff, C. E.; Schmitz, N.; Josset, J.-L.; Paar, G.; Gunn, M.; Hauber, E.; Cousins, C. R.; Cross, R. E.; Grindrod, P.; Bridges, J. C.; Balme, M.; Gupta, S.; Crawford, I. A.; Irwin, P.; Stabbins, R.; Tirsch, D.; Vago, J. L.; Theodorou, T.; Caballo-Perucha, M.; Osinski, G. R.; PanCam Team

    2017-07-01

    The scientific objectives of the ExoMars rover are designed to answer several key questions in the search for life on Mars. In particular, the unique subsurface drill will address some of these, such as the possible existence and stability of subsurface organics. PanCam will establish the surface geological and morphological context for the mission, working in collaboration with other context instruments. Here, we describe the PanCam scientific objectives in geology, atmospheric science, and 3-D vision. We discuss the design of PanCam, which includes a stereo pair of Wide Angle Cameras (WACs), each of which has an 11-position filter wheel and a High Resolution Camera (HRC) for high-resolution investigations of rock texture at a distance. The cameras and electronics are housed in an optical bench that provides the mechanical interface to the rover mast and a planetary protection barrier. The electronic interface is via the PanCam Interface Unit (PIU), and power conditioning is via a DC-DC converter. PanCam also includes a calibration target mounted on the rover deck for radiometric calibration, fiducial markers for geometric calibration, and a rover inspection mirror.

  6. View of the Columbia's open payload bay and the Canadian RMS

    NASA Image and Video Library

    1981-11-13

    STS002-12-833 (13 Nov. 1981) --- Clouds over Earth and black sky form the background for this unique photograph from the space shuttle Columbia in Earth orbit. The photograph was shot through the aft flight deck windows viewing the cargo bay. Part of the scientific payload of the Office of Space and Terrestrial Applications (OSTA-1) is visible in the open cargo bay. The astronauts inside Columbia's cabin were remotely operating the Canadian-built remote manipulator system (RMS). Note television cameras on its elbow and wrist pieces. Photo credit: NASA

  7. The Global Coronal Structure Investigation

    NASA Technical Reports Server (NTRS)

    Golub, Leon

    1998-01-01

    During the past year we have completed the changeover from the NIXT program to the new TXI sounding rocket program. The NIXT effort, aimed at evaluating the viability of the remaining portions of the NIXT hardware and design, has been finished and the portions of the NIXT which are viable and flightworthy, such as filters, mirror mounting hardware, electronics and telemetry interface systems, are now part of the new rocket payload. The backup NIXT multilayer-coated x-ray telescope and its mounting hardware have been completely fabricated and are being stored for possible future use in the TXI rocket. The H-alpha camera design is being utilized in the TXI program for real-time pointing verification and control via telemetry. A new H-alpha camera has been built, with a high-resolution RS170 CCD camera output. Two papers, summarizing scientific results from the NIXT rocket program, have been written and published this year: 1. "The Solar X-ray Corona," by L. Golub, Astrophysics and Space Science, 237, 33 (1996). 2. "Difficulties in Observing Coronal Structure," Keynote Paper, Proceedings STEPWG1 Workshop on Measurements and Analyses of the Solar 3D Magnetic Field, Solar Physics, 174, 99 (1997).

  8. Optical design and stray light analysis for the JANUS camera of the JUICE space mission

    NASA Astrophysics Data System (ADS)

    Greggio, D.; Magrin, D.; Munari, M.; Zusi, M.; Ragazzoni, R.; Cremonese, G.; Debei, S.; Friso, E.; Della Corte, V.; Palumbo, P.; Hoffmann, H.; Jaumann, R.; Michaelis, H.; Schmitz, N.; Schipani, P.; Lara, L. M.

    2015-09-01

    The JUICE (JUpiter ICy moons Explorer) satellite of the European Space Agency (ESA) is dedicated to the detailed study of Jupiter and its moons. Among the whole instrument suite, JANUS (Jovis, Amorum ac Natorum Undique Scrutator) is the camera system of JUICE designed for imaging at visible wavelengths. It will conduct an in-depth study of Ganymede, Callisto and Europa, and explore most of the Jovian system and Jupiter itself, performing, in the case of Ganymede, a global mapping of the satellite with a resolution of 400 m/px. The optical design chosen to meet the scientific goals of JANUS is a three mirror anastigmatic system in an off-axis configuration. To ensure that the achieved contrast is high enough to observe the features on the surface of the satellites, we also performed a preliminary stray light analysis of the telescope. We provide here a short description of the optical design and we present the procedure adopted to evaluate the stray-light expected during the mapping phase of the surface of Ganymede. We also use the results obtained from the first run of simulations to optimize the baffle design.

  9. Panoramic 3d Vision on the ExoMars Rover

    NASA Astrophysics Data System (ADS)

    Paar, G.; Griffiths, A. D.; Barnes, D. P.; Coates, A. J.; Jaumann, R.; Oberst, J.; Gao, Y.; Ellery, A.; Li, R.

    The Pasteur payload on the ESA ExoMars Rover 2011/2013 is designed to search for evidence of extant or extinct life either on or up to ~2 m below the surface of Mars. The rover will be equipped with a panoramic imaging system, to be developed by a UK, German, Austrian, Swiss, Italian and French team, for visual characterization of the rover's surroundings and (in conjunction with an infrared imaging spectrometer) remote detection of potential sample sites. The Panoramic Camera system consists of a wide-angle multispectral stereo pair with 65° field-of-view (WAC; 1.1 mrad/pixel) and a high-resolution monoscopic camera (HRC; current design having 59.7 µrad/pixel with 3.5° field-of-view). Its scientific goals and operational requirements can be summarized as follows:
    • Determination of objects to be investigated in situ by other instruments, for operations planning.
    • Backup and support for the rover visual navigation system (path planning, determination of subsequent rover positions and orientation/tilt within the 3d environment), and localization of the landing site (by stellar navigation or by combination of orbiter and ground panoramic images).
    • Geological characterization (using narrow-band geology filters) and cartography of the local environments (local Digital Terrain Model or DTM).
    • Study of atmospheric properties and variable phenomena near the Martian surface (e.g. aerosol opacity, water vapour column density, clouds, dust devils, meteors, surface frosts).
    • Geodetic studies (observations of Sun, bright stars, Phobos/Deimos).
    The performance of 3d data processing is a key element of mission planning and scientific data analysis. The 3d Vision Team within the Panoramic Camera development Consortium reports on the current status of development, consisting of the following items:
    • Hardware Layout & Engineering: The geometric setup of the system (location on the mast & viewing angles, mutual mounting between WAC and HRC) needs to be optimized w.r.t. fields of view, ranging capability (distance measurement capability), data rate, necessity of calibration targets, hardware & data interfaces to other subsystems (e.g. navigation), as well as accuracy impacts of sensor design and compression ratio.
    • Geometric Calibration: The geometric properties of the individual cameras including the various spectral filters, their mutual relations, and the dynamic geometrical relation between rover frame and cameras - with the mast in between - are precisely described by a calibration process. During surface operations these relations will be continuously checked and updated by photogrammetric means; environmental influences such as temperature, pressure and the Mars gravity will be taken into account.
    • Surface Mapping: Stereo imaging using the WAC stereo pair is used for the 3d reconstruction of the rover vicinity to identify, locate and characterize potentially interesting spots (3-10 for an experimental cycle to be performed within approx. 10-30 sols). The HRC is used for high-resolution imagery of these regions of interest, to be overlaid on the 3d reconstruction and potentially refined by shape-from-shading techniques. A quick processing result is crucial for time-critical operations planning, so emphasis is laid on automatic behaviour and intrinsic error-detection mechanisms. The mapping results will be continuously fused, updated and synchronized with the map used by the navigation system. The surface representation needs to take into account the different resolutions of HRC and WAC, as well as uncommon or even unexpected image acquisition modes such as long-range, wide-baseline stereo from different rover positions, or escape strategies in the case of loss of one of the stereo camera heads.
    • Panorama Mosaicking: The production of a high-resolution stereoscopic panorama is nowadays state of the art in computer vision. However, certain challenges such as the need for access to accurate spherical coordinates, maintenance of radiometric & spectral response in various spectral bands, fusion between HRC and WAC, super-resolution, and again the requirement of quick yet robust processing will add some complexity to the ground processing system.
    • Visualization for Operations Planning: Efficient operations planning is directly related to an ergonomic and well-performing visualization. It is intended to adapt existing tools to an integrated visualization solution for the purpose of scientific site characterization, view planning, reachability mapping/instrument placement of pointing sensors (including the panoramic imaging system itself), and selection of regions of interest.
    The main interfaces between the individual components as well as the first version of a user requirement document are currently under definition. Besides support for sensor layout and calibration, the 3d vision system will consist of 2-3 main modules to be used during ground processing & utilization of the ExoMars Rover panoramic imaging system.
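
    The surface-mapping step from a rectified stereo pair to a depth product follows the standard disparity route; the generic OpenCV sketch below (not the mission pipeline; the focal length and baseline arguments are placeholders) illustrates disparity computation by block matching and its conversion to depth via Z = f*B/d.

        import cv2
        import numpy as np

        def disparity_to_depth(left: np.ndarray, right: np.ndarray,
                               focal_px: float, baseline_m: float) -> np.ndarray:
            """Depth map from a rectified 8-bit grayscale stereo pair
            (generic OpenCV block matching, illustrative only)."""
            stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
            disp = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point
            depth = np.full(disp.shape, np.inf, dtype=np.float32)
            valid = disp > 0
            depth[valid] = focal_px * baseline_m / disp[valid]  # Z = f*B/d
            return depth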

  10. The space telescope scientific instruments

    NASA Technical Reports Server (NTRS)

    Leckrone, D. S.

    1980-01-01

    The paper describes the space telescope, with a 2.4 m aperture, to be launched to a 500 km altitude in late 1983. Five scientific instruments (four axial-bay and one radial-bay), including a wide-field and planetary camera, a faint-object camera, a faint-object spectrograph, and a high-speed photometer, are to be installed to conduct the initial observations. The axial instruments are constrained to envelopes with dimensions 0.9 x 0.9 x 2.2 m and their masses cannot exceed 317 kg. The observatory will also be equipped with fine-guidance sensors and a microprocessor. The design concepts of the instruments are outlined, and some of the astronomical capabilities, including studies of distant and local galaxies, the physical properties of quasars, and interrelations between quasars and active galactic nuclei, are mentioned.

  11. Space telescope scientific instruments

    NASA Technical Reports Server (NTRS)

    Leckrone, D. S.

    1979-01-01

    The paper describes the Space Telescope (ST) observatory, the design concepts of the five scientific instruments which will conduct the initial observatory observations, and summarizes their astronomical capabilities. The instruments are the wide-field and planetary camera (WFPC) which will receive the highest quality images, the faint-object camera (FOC) which will penetrate to the faintest limiting magnitudes and achieve the finest angular resolution possible, and the faint-object spectrograph (FOS), which will perform photon noise-limited spectroscopy and spectropolarimetry on objects substantially fainter than those accessible to ground-based spectrographs. In addition, the high resolution spectrograph (HRS) will provide higher spectral resolution with greater photometric accuracy than previously possible in ultraviolet astronomical spectroscopy, and the high-speed photometer will achieve precise time-resolved photometric observations of rapidly varying astronomical sources on short time scales.

  12. Automatic multi-camera calibration for deployable positioning systems

    NASA Astrophysics Data System (ADS)

    Axelsson, Maria; Karlsson, Mikael; Rudner, Staffan

    2012-06-01

    Surveillance with automated positioning and tracking of subjects and vehicles in 3D is desired in many defence and security applications. Camera systems with stereo or multiple cameras are often used for 3D positioning. In such systems, accurate camera calibration is needed to obtain a reliable 3D position estimate. There is also a need for automated camera calibration to facilitate fast deployment of semi-mobile multi-camera 3D positioning systems. In this paper we investigate a method for automatic calibration of the extrinsic camera parameters (relative camera pose and orientation) of a multi-camera positioning system. It is based on estimation of the essential matrix between each camera pair, using the 5-point method for intrinsically calibrated cameras. The method is compared to a manual calibration method using real HD video data from a field trial with a multi-camera positioning system, and is also evaluated on simulated data from a stereo camera model. The results show that the reprojection error of the automated camera calibration method is close to or smaller than that of the manual method, and that automated calibration can replace manual calibration.
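
    In practice, essential-matrix estimation from matched points is available off the shelf; the sketch below uses OpenCV's RANSAC-wrapped 5-point solver to recover the relative pose of an intrinsically calibrated camera pair. It illustrates the general technique only and is not the authors' implementation.

        import cv2
        import numpy as np

        def relative_pose(pts1: np.ndarray, pts2: np.ndarray, K: np.ndarray):
            """Relative camera pose from (N, 2) arrays of matched pixel
            coordinates, given the shared 3x3 intrinsic matrix K."""
            E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                              method=cv2.RANSAC,
                                              prob=0.999, threshold=1.0)
            # Decompose E into rotation R and unit-norm translation t
            _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
            return R, t  # t is known only up to scale from two views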

  13. The sequence measurement system of the IR camera

    NASA Astrophysics Data System (ADS)

    Geng, Ai-hui; Han, Hong-xia; Zhang, Hai-bo

    2011-08-01

IR cameras are currently widely used in electro-optical tracking, electro-optical measurement, fire control, and electro-optical countermeasures, but the output timing of most IR cameras applied in practice is complex, and the timing documentation supplied by manufacturers lacks detail. Because downstream image transmission and image processing systems require the detailed timing of the IR camera, a sequence measurement system for IR cameras was designed and a detailed timing measurement of the applied IR camera was carried out. FPGA programming combined with SignalTap online observation is used in the measurement system; the precise timing of the IR camera's output signal is obtained, and detailed timing documents are supplied to the downstream image transmission and image processing systems. The sequence measurement system consists of a CameraLink input interface, an LVDS input interface, an FPGA, and a CameraLink output interface, among which the FPGA is the key component. The system accepts video signals in both CameraLink and LVDS formats, and because image processing and image memory cards commonly use CameraLink as their input interface, the output of the measurement system is also CameraLink; the system thus measures the camera's timing and at the same time serves as an interface converter for some cameras. Inside the FPGA, the sequence measurement program, pixel clock modification, SignalTap file configuration, and SignalTap online observation are integrated to realize the precise measurement. The measurement program, written in Verilog and combined with SignalTap online observation, counts the number of lines in one frame and the number of pixels in one line, and also measures the line offset and row offset of the image. The sequence measurement system accurately measures the timing of the project-applied camera, supplies detailed timing documents to downstream systems such as the image processing and image transmission systems, and gives the concrete parameters of fval, lval, pixclk, line offset, and row offset. Experiments show that the system obtains precise timing measurements and works stably, laying a foundation for the downstream systems.
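
    The counting logic described above is simple enough to state compactly. Below is a minimal sketch of it in Python rather than the Verilog used on the FPGA, applied to fval/lval traces sampled once per pixel clock; the offset definitions are assumptions for illustration, not taken from the paper.

    ```python
    import numpy as np

    def rising_edges(sig):
        """Indices where a binary signal goes 0 -> 1."""
        sig = np.asarray(sig, dtype=bool)
        return np.flatnonzero(~sig[:-1] & sig[1:]) + 1

    def measure_timing(fval, lval):
        """Estimate frame timing from fval/lval traces sampled at pixclk."""
        fval = np.asarray(fval, dtype=bool)
        lval = np.asarray(lval, dtype=bool)
        f_start = rising_edges(fval)[0]                       # frame start
        f_end = np.flatnonzero(~fval[f_start:])[0] + f_start  # frame end
        l_starts = rising_edges(lval)
        l_starts = l_starts[(l_starts >= f_start) & (l_starts < f_end)]
        lines_per_frame = len(l_starts)
        # Pixels per line: length of the first lval-high run in the frame.
        pixels_per_line = np.flatnonzero(~lval[l_starts[0]:])[0]
        # Assumed definitions: line offset in pixclks from frame start to
        # the first active line; row offset in whole (blank) line periods.
        line_offset = l_starts[0] - f_start
        line_period = l_starts[1] - l_starts[0]
        row_offset = line_offset // line_period
        return lines_per_frame, pixels_per_line, line_offset, row_offset
    ```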

  14. Optical Constituents at the Mouth of the Columbia River: Variability and Signature in Remotely Sensed Reflectance

    DTIC Science & Technology

    2013-09-30

constructed at BIO, carried the new Machine Vision Floc Camera (MVFC), a Sequoia Scientific LISST 100x Type B, an RBR CTD, and two pressure-actuated...WetStar CDOM fluorometer, a Sequoia Scientific flow control switch, and a SeaBird 37 CTD. The flow-control switch allows the ac-9 to collect 0.2-um

  15. Geometrical distortion calibration of the stereo camera for the BepiColombo mission to Mercury

    NASA Astrophysics Data System (ADS)

    Simioni, Emanuele; Da Deppo, Vania; Re, Cristina; Naletto, Giampiero; Martellato, Elena; Borrelli, Donato; Dami, Michele; Aroldi, Gianluca; Ficai Veltroni, Iacopo; Cremonese, Gabriele

    2016-07-01

The ESA-JAXA mission BepiColombo, to be launched in 2018, is devoted to the observation of Mercury, the innermost planet of the Solar System. SIMBIOSYS is its remote sensing suite, which consists of three instruments: the High Resolution Imaging Channel (HRIC), the Visible and Infrared Hyperspectral Imager (VIHI), and the Stereo Imaging Channel (STC). The latter will provide the global three-dimensional reconstruction of the Mercury surface, and it represents the first push-frame stereo camera on board a space satellite. Based on a new telescope design, STC combines the advantages of a compact single-detector camera with the convenience of a double-direction acquisition system; this solution minimizes mass and volume while performing push-frame imaging acquisition. The shared camera sensor is divided into six portions: four are covered with suitable filters; the other two, one looking forward and one backwards with respect to the nadir direction, are covered with a panchromatic filter, supplying stereo image pairs of the planet surface. The main STC scientific requirements are to reconstruct the Mercury surface in 3D with a vertical accuracy better than 80 m and to perform global imaging with a grid size of 65 m along-track at the periherm. The scope of this work is to present the on-ground geometric calibration pipeline for this original instrument. The selected STC off-axis configuration forced the development of a new distortion map model. Additional considerations are connected to the detector, a Si-PIN hybrid CMOS, which is characterized by a high fixed pattern noise; this had a great impact on the pre-calibration phases, compelling the use of an uncommon approach to the definition of the spot centroids in the distortion calibration process. This work presents the results obtained during the calibration of STC concerning the distortion analysis at three different temperatures. These results are then used to define the corresponding distortion model of the camera.

  16. PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory

    NASA Astrophysics Data System (ADS)

    Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.

    2018-02-01

PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto, a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to KS and can also be equipped with narrowband filters. The instrument was delivered to the observatory in 2014 October and was commissioned at both telescopes between 2014 November and 2015 June. Science verification at the 2.2 m telescope was carried out during the second semester of 2015, and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests, and the PANIC instrument performance. We also present first-light data obtained during commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator, and the commissioning phase was carried out at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements, and the commissioning phase on sky has been accomplished.

  17. Miniaturized fundus camera

    NASA Astrophysics Data System (ADS)

    Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.

    2003-07-01

We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, as in other applications, a small, lightweight digital camera system can be extremely useful. We present a small wide-angle digital camera system whose handpiece is significantly smaller and lighter than in all other systems, and whose electronics are truly portable, fitting in a standard board case. The camera is designed to be offered at a competitive price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project on screening for ROP; telemedicine is a perfect application for this camera system, exploiting both of its advantages: portability and digital imaging.

  18. A method and results of color calibration for the Chang'e-3 terrain camera and panoramic camera

    NASA Astrophysics Data System (ADS)

    Ren, Xin; Li, Chun-Lai; Liu, Jian-Jun; Wang, Fen-Fei; Yang, Jian-Feng; Liu, En-Hai; Xue, Bin; Zhao, Ru-Jin

    2014-12-01

The terrain camera (TCAM) and panoramic camera (PCAM) are two of the major scientific payloads installed on the lander and rover of the Chang'e 3 mission, respectively. Both use a CMOS sensor covered with a Bayer color filter array to capture color images of the Moon's surface. The RGB values of the original images are specific to these two kinds of cameras, and there is an obvious color difference compared with human visual perception. This paper follows standards published by the International Commission on Illumination to establish a color correction model, designs the ground calibration experiment, and obtains the color correction coefficients. The image quality has been significantly improved and there is no obvious color difference in the corrected images. Ground experimental results show that: (1) Compared with uncorrected images, the average color difference of TCAM is 4.30, which has been reduced by 62.1%. (2) The average color differences of the left and right cameras in PCAM are 4.14 and 4.16, which have been reduced by 68.3% and 67.6% respectively.
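
    One common concrete form of such a correction, a 3 x 3 matrix fitted by least squares over calibration patches, is sketched below. It illustrates the general approach only; the paper's model follows the cited CIE standards, and its actual coefficients come from the ground calibration experiment.

    ```python
    import numpy as np

    def fit_color_correction(measured_rgb, reference_rgb):
        """Fit a linear color-correction matrix by least squares.

        measured_rgb:  Nx3 camera RGB values of calibration patches.
        reference_rgb: Nx3 target values of the same patches (e.g. values
        derived under a CIE standard illuminant/observer).
        Returns a 3x3 matrix M such that reference ~= measured @ M.
        """
        M, *_ = np.linalg.lstsq(measured_rgb, reference_rgb, rcond=None)
        return M

    # Applying it to an image flattened to (num_pixels, 3):
    # corrected = np.clip(flat_pixels @ M, 0, 255)
    ```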

  19. Current status of Polish Fireball Network

    NASA Astrophysics Data System (ADS)

    Wiśniewski, M.; Żołądek, P.; Olech, A.; Tyminski, Z.; Maciejewski, M.; Fietkiewicz, K.; Rudawska, R.; Gozdalski, M.; Gawroński, M. P.; Suchodolski, T.; Myszkiewicz, M.; Stolarz, M.; Polakowski, K.

    2017-09-01

The Polish Fireball Network (PFN) is a project to monitor regularly the sky over Poland in order to detect bright fireballs. In 2016 the PFN consisted of 36 continuously active stations with 57 sensitive analogue video cameras and 7 high resolution digital cameras. In our observations we also use spectroscopic and radio techniques. The PyFN software package for trajectory and orbit determination was developed. The PFN project is an example of successful participation of amateur astronomers who can provide valuable scientific data. The network is coordinated by astronomers from the Copernicus Astronomical Centre in Warsaw, Poland. In 2011-2015 the PFN cameras recorded 214,936 meteor events; using the PFN data and the UFOOrbit software, 34,609 trajectories and orbits were calculated. In the coming years we plan an intensive modernization of the network, including the installation of dozens of new digital cameras.

  20. Camera systems in human motion analysis for biomedical applications

    NASA Astrophysics Data System (ADS)

    Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.

    2015-05-01

Human Motion Analysis (HMA) has been one of the major interests among researchers in the fields of computer vision, artificial intelligence, and biomedical engineering and sciences. This is due to its wide and promising biomedical applications, namely bio-instrumentation for human-computer interfacing, surveillance systems for monitoring human behaviour, and the analysis of biomedical signals and images for diagnosis and rehabilitation. This paper provides an extensive review of the camera systems used in HMA and their taxonomy, including camera types, camera calibration, and camera configuration. The review focuses on evaluating camera system considerations for HMA specifically in biomedical applications. It is important in that it provides guidelines and recommendations for researchers and practitioners selecting a camera system for an HMA system for biomedical applications.

  1. Super Blood Moon Lunar Eclipse

    NASA Image and Video Library

    2017-12-08

Are you ready for tonight's #SuperBloodMoon Lunar Eclipse? Get your camera and find a great spot to snap a pic of the event, then share it with NASA in our Flickr group www.flickr.com/groups/superbloodmoon/ You can also share your photo with us starting at 10:00pm EDT tonight in the NASA photo contest here: go.nasa.gov/superbloodmoon-contest Learn more about this celestial event & when to look up to see it: bit.ly/1NVEwh5

  2. Advanced digital image archival system using MPEG technologies

    NASA Astrophysics Data System (ADS)

    Chang, Wo

    2009-08-01

Digital information and records are vital to the human race regardless of the nationalities and eras in which they were produced. Digital image content is produced at a rapid pace: cultural heritage is digitized, scientific and experimental data come from high speed imaging sensors, national defense satellite images from governments, medical and healthcare imaging records from hospitals, and personal photo collections from digital cameras. With these massive amounts of precious and irreplaceable data and knowledge, which standards-based technologies can be applied to preserve the data and yet provide an interoperable framework for accessing them across a variety of systems and devices? This paper presents an advanced digital image archival system that applies the international standard MPEG technologies to preserve digital image content.

  3. Health Promotion and Preventive Contents Performed During Reproduction System Learning; Observation in Senior High School

    NASA Astrophysics Data System (ADS)

    Yuniarti, E.; Fadilah, M.; Darussyamsu, R.; Nurhayati, N.

    2018-04-01

The rising number of cases of deviant sexual behavior among adolescents is significantly related to their level of knowledge about reproductive health. Thus teenagers, especially those of school age, need to receive complete information that emphasizes health promotion and prevention. This article describes the health promotion and prevention information delivered by teachers in the senior high school learning process on the topic of the reproduction system. The data were gathered through focused observation using an observation sheet and a camera recorder, and were analyzed descriptively. The results show that the promotion and prevention approach has been inadequately presented, for two reasons. First, promotion and prevention content is not explicitly required in the final assessment. Second, the explanations tend to refer to consequences framed in terms of social and religious norms rather than a scientific basis. It can be concluded that suggestions for promoting reproductive health and preventing reproductive health risks need to be put into practice with scientific explanations, as part of a specific program for improving adolescent reproductive health.

  4. A multipurpose camera system for monitoring Kīlauea Volcano, Hawai'i

    USGS Publications Warehouse

    Patrick, Matthew R.; Orr, Tim R.; Lee, Lopaka; Moniz, Cyril J.

    2015-01-01

    We describe a low-cost, compact multipurpose camera system designed for field deployment at active volcanoes that can be used either as a webcam (transmitting images back to an observatory in real-time) or as a time-lapse camera system (storing images onto the camera system for periodic retrieval during field visits). The system also has the capability to acquire high-definition video. The camera system uses a Raspberry Pi single-board computer and a 5-megapixel low-light (near-infrared sensitive) camera, as well as a small Global Positioning System (GPS) module to ensure accurate time-stamping of images. Custom Python scripts control the webcam and GPS unit and handle data management. The inexpensive nature of the system allows it to be installed at hazardous sites where it might be lost. Another major advantage of this camera system is that it provides accurate internal timing (independent of network connection) and, because a full Linux operating system and the Python programming language are available on the camera system itself, it has the versatility to be configured for the specific needs of the user. We describe example deployments of the camera at Kīlauea Volcano, Hawai‘i, to monitor ongoing summit lava lake activity. 
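
    The paper's control scripts are custom, but a minimal time-lapse loop in the same spirit might look like the following; the picamera calls are the standard Python interface to the Raspberry Pi camera module, while the cadence, file paths, and the use of the system clock in place of the GPS-disciplined timestamp are assumptions.

    ```python
    import time
    from datetime import datetime, timezone

    from picamera import PiCamera

    INTERVAL_S = 300  # one image every five minutes (assumed cadence)

    camera = PiCamera(resolution=(2592, 1944))  # full 5-megapixel frame
    camera.start_preview()
    time.sleep(2)  # let exposure and white balance settle

    while True:
        # The real system stamps images with GPS time, independent of any
        # network connection; the system clock stands in for it here.
        stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        camera.capture(f"/data/images/{stamp}.jpg")
        time.sleep(INTERVAL_S)
    ```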

  5. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration of other information within that video, namely focal and spectral information. The next part of the thesis demonstrates derivative works of CACTI: compressive extended depth of field and compressive spectral-temporal imaging. These works successfully show the technique's extension of temporal coding to improve sensing performance in these other dimensions. Geometrical-optics-related tradeoffs, such as the classic challenges of wide-field-of-view and high-resolution photography, have motivated the development of multiscale camera arrays. The advent of such designs less than a decade ago heralds a new era of research- and engineering-related challenges. One significant challenge is that of managing the focal volume (x,y,z) over wide fields of view and resolutions. The fourth chapter shows advances on focus and image quality assessment for a class of multiscale gigapixel cameras developed at Duke. Along the same line of work, we have explored methods for dynamic and adaptive addressing of focus via point spread function engineering. We demonstrate another form of temporal coding in the form of physical translation of the image plane from its nominal focal position, and demonstrate this technique's capability to generate arbitrary point spread functions.
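
    The temporal-coding idea at the heart of CACTI is compact enough to state as a forward model: each frame within one exposure is multiplied by a shifted copy of the aperture code, and the products integrate into a single snapshot. The sketch below writes that model down under those assumptions; reconstruction (not shown) inverts this underdetermined system with a sparsity prior, which is where the compressive-sensing machinery enters.

    ```python
    import numpy as np

    def cacti_snapshot(video, masks):
        """Forward model of coded-aperture compressive temporal imaging.

        video: (T, H, W) scene frames spanning one camera exposure.
        masks: (T, H, W) binary coding patterns, one per frame; in CACTI
        they are shifted copies of a single coded aperture, produced by
        physically translating it during the exposure.
        Returns one (H, W) coded measurement summed on the detector.
        """
        return np.sum(video * masks, axis=0)
    ```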

  6. The STARDUST Discovery Mission: Data from the Encounter with Comet Wild 2 and the Expected Sample Return

    NASA Technical Reports Server (NTRS)

    Sandford, Scott A.

    2004-01-01

On January 2, 2004, the STARDUST spacecraft made the closest ever flyby (236 km) of the nucleus of a comet - Comet Wild 2. During the flyby, the spacecraft collected samples of dust from the coma of the comet. These samples will be returned to Earth on January 15, 2006. After a brief preliminary examination to establish the nature of the returned samples, they will be made available to the general scientific community for study. In addition to its aerogel dust collector, the STARDUST spacecraft was also equipped with instruments that made in situ measurements of the comet during the flyby. These included several dust impact monitors, a mass spectrometer, and a camera. The spacecraft's communication system was also used to place dynamical constraints on the mass of the nucleus and the number of impacts the spacecraft had with large particles. The data taken by these instruments indicate that the spacecraft successfully captured coma samples. These instruments, particularly the camera, also demonstrated that Wild 2 is unlike any other object in the Solar System previously visited by a spacecraft. During my talk I will discuss the scientific goals of the STARDUST mission and provide an overview of its design and flight to date. I will then end with a description of the exciting data returned by the spacecraft during the recent encounter with Wild 2 and discuss what these data tell us about the nature of comets. It will probably come as no surprise that the encounter data raise as many (or more) new questions as they answer old ones.

  7. Low cost and open source multi-fluorescence imaging system for teaching and research in biology and bioengineering.

    PubMed

    Nuñez, Isaac; Matute, Tamara; Herrera, Roberto; Keymer, Juan; Marzullo, Timothy; Rudge, Timothy; Federici, Fernán

    2017-01-01

    The advent of easy-to-use open source microcontrollers, off-the-shelf electronics and customizable manufacturing technologies has facilitated the development of inexpensive scientific devices and laboratory equipment. In this study, we describe an imaging system that integrates low-cost and open-source hardware, software and genetic resources. The multi-fluorescence imaging system consists of readily available 470 nm LEDs, a Raspberry Pi camera and a set of filters made with low cost acrylics. This device allows imaging in scales ranging from single colonies to entire plates. We developed a set of genetic components (e.g. promoters, coding sequences, terminators) and vectors following the standard framework of Golden Gate, which allowed the fabrication of genetic constructs in a combinatorial, low cost and robust manner. In order to provide simultaneous imaging of multiple wavelength signals, we screened a series of long stokes shift fluorescent proteins that could be combined with cyan/green fluorescent proteins. We found CyOFP1, mBeRFP and sfGFP to be the most compatible set for 3-channel fluorescent imaging. We developed open source Python code to operate the hardware to run time-lapse experiments with automated control of illumination and camera and a Python module to analyze data and extract meaningful biological information. To demonstrate the potential application of this integral system, we tested its performance on a diverse range of imaging assays often used in disciplines such as microbial ecology, microbiology and synthetic biology. We also assessed its potential use in a high school environment to teach biology, hardware design, optics, and programming. Together, these results demonstrate the successful integration of open source hardware, software, genetic resources and customizable manufacturing to obtain a powerful, low cost and robust system for education, scientific research and bioengineering. All the resources developed here are available under open source licenses.
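
    As a rough sketch of what such an acquisition script can look like, the loop below drives the excitation LEDs from a GPIO pin and captures a frame with the Raspberry Pi camera. The pin number, cadence, and file layout are assumptions for illustration, not the authors' published code (which is available under open source licenses).

    ```python
    import time

    from picamera import PiCamera
    from RPi import GPIO

    LED_PIN = 18    # GPIO pin driving the 470 nm LEDs (assumed wiring)
    PERIOD_S = 600  # one time-lapse frame every 10 minutes (assumed)

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(LED_PIN, GPIO.OUT)

    with PiCamera() as camera:
        for frame in range(144):  # a 24-hour run at this cadence
            GPIO.output(LED_PIN, GPIO.HIGH)   # excitation light on
            time.sleep(1)                     # settle before exposure
            camera.capture(f"plate_{frame:04d}.jpg")
            GPIO.output(LED_PIN, GPIO.LOW)    # off between frames to
            time.sleep(PERIOD_S - 1)          # limit photobleaching
    GPIO.cleanup()
    ```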

  9. Overview of machine vision methods in x-ray imaging and microtomography

    NASA Astrophysics Data System (ADS)

    Buzmakov, Alexey; Zolotov, Denis; Chukalina, Marina; Nikolaev, Dmitry; Gladkov, Andrey; Ingacheva, Anastasia; Yakimchuk, Ivan; Asadchikov, Victor

    2018-04-01

Digital X-ray imaging has become widely used in science, medicine, and non-destructive testing, which allows modern digital image analysis to be used for automatic information extraction and interpretation. We give a short review of machine vision methods in scientific X-ray imaging and microtomography, including image processing, feature detection and extraction, image compression to increase camera throughput, microtomography reconstruction, visualization, and setup adjustment.

  10. Preface: The Chang'e-3 lander and rover mission to the Moon

    NASA Astrophysics Data System (ADS)

    Ip, Wing-Huen; Yan, Jun; Li, Chun-Lai; Ouyang, Zi-Yuan

    2014-12-01

    The Chang'e-3 (CE-3) lander and rover mission to the Moon was an intermediate step in China's lunar exploration program, which will be followed by a sample return mission. The lander was equipped with a number of remote-sensing instruments including a pair of cameras (Landing Camera and Terrain Camera) for recording the landing process and surveying terrain, an extreme ultraviolet camera for monitoring activities in the Earth's plasmasphere, and a first-ever Moon-based ultraviolet telescope for astronomical observations. The Yutu rover successfully carried out close-up observations with the Panoramic Camera, mineralogical investigations with the VIS-NIR Imaging Spectrometer, study of elemental abundances with the Active Particle-induced X-ray Spectrometer, and pioneering measurements of the lunar subsurface with Lunar Penetrating Radar. This special issue provides a collection of key information on the instrumental designs, calibration methods and data processing procedures used by these experiments with a perspective of facilitating further analyses of scientific data from CE-3 in preparation for future missions.

  11. Volumetric PIV with a Plenoptic Camera

    NASA Astrophysics Data System (ADS)

    Thurow, Brian; Fahringer, Tim

    2012-11-01

Plenoptic cameras have received attention recently due to their ability to computationally refocus an image after it has been acquired. We describe the development of a robust, economical and easy-to-use volumetric PIV technique using a unique plenoptic camera built in our laboratory. The tomographic MART algorithm is used to reconstruct pairs of 3D particle volumes with velocity determined using conventional cross-correlation techniques. 3D/3C velocity measurements (volumetric dimensions of 2.8'' × 1.9'' × 1.6'') of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. This work has been supported by the Air Force Office of Scientific Research (Grant #FA9550-100100576).
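
    For reference, the multiplicative update at the core of the MART reconstruction can be sketched as follows; the dense weight matrix, iteration count, and relaxation parameter are schematic stand-ins for the actual plenoptic-camera projection model.

    ```python
    import numpy as np

    def mart(W, g, n_iter=20, mu=1.0):
        """Multiplicative ART for tomographic particle reconstruction.

        W: (n_rays, n_vox) weights mapping voxel intensities to pixels.
        g: (n_rays,) recorded pixel intensities.
        Returns a non-negative voxel field f with W @ f ~= g.
        """
        f = np.ones(W.shape[1])
        for _ in range(n_iter):
            for j in range(len(g)):
                w = W[j]
                proj = w @ f
                if proj > 0:
                    # Raise or lower voxels along ray j multiplicatively;
                    # voxels with zero weight are left untouched.
                    f *= (g[j] / proj) ** (mu * w)
        return f
    ```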

  12. Video-rate nanoscopy enabled by sCMOS camera-specific single-molecule localization algorithms

    PubMed Central

    Huang, Fang; Hartwich, Tobias M. P.; Rivera-Molina, Felix E.; Lin, Yu; Duim, Whitney C.; Long, Jane J.; Uchil, Pradeep D.; Myers, Jordan R.; Baird, Michelle A.; Mothes, Walther; Davidson, Michael W.; Toomre, Derek; Bewersdorf, Joerg

    2013-01-01

    Newly developed scientific complementary metal–oxide–semiconductor (sCMOS) cameras have the potential to dramatically accelerate data acquisition in single-molecule switching nanoscopy (SMSN) while simultaneously increasing the effective quantum efficiency. However, sCMOS-intrinsic pixel-dependent readout noise substantially reduces the localization precision and introduces localization artifacts. Here we present algorithms that overcome these limitations and provide unbiased, precise localization of single molecules at the theoretical limit. In combination with a multi-emitter fitting algorithm, we demonstrate single-molecule localization super-resolution imaging at up to 32 reconstructed images/second (recorded at 1,600–3,200 camera frames/second) in both fixed and living cells. PMID:23708387
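
    The key idea, pixel-dependent readout variance entering the fit, can be sketched with a weighted least-squares Gaussian fit standing in for the paper's pixel-dependent maximum-likelihood estimator; shapes, starting values, and the weighting scheme below are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def localize_scmos(roi, read_var):
        """Fit a 2-D Gaussian to an sCMOS ROI with per-pixel noise weights.

        roi:      (K, K) region in photoelectrons (gain/offset corrected).
        read_var: (K, K) pixel-dependent readout-variance map from camera
                  calibration; ignoring it biases sCMOS localizations.
        """
        K = roi.shape[0]
        y, x = np.mgrid[0:K, 0:K]

        def model(p):
            n, x0, y0, s, b = p
            return n * np.exp(-((x - x0) ** 2 + (y - y0) ** 2)
                              / (2 * s ** 2)) + b

        def residuals(p):
            mu = model(p)
            # Total per-pixel variance: shot noise plus readout noise.
            w = np.sqrt(np.maximum(mu, 0) + read_var)
            return ((roi - mu) / w).ravel()

        p0 = (roi.max(), K / 2, K / 2, 1.3, float(np.median(roi)))
        fit = least_squares(residuals, p0)
        return fit.x[1], fit.x[2]  # x0, y0 in pixels
    ```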

  13. Ames Research Center Life Sciences Payload Project for Spacelab Mission 3

    NASA Technical Reports Server (NTRS)

    Callahan, P. X.; Tremor, J.; Lund, G.; Wagner, W. L.

    1983-01-01

    The Research Animal Holding Facility, developed to support rodent and squirrel monkey animal husbandry in the Spacelab environment, is to be tested during the Spacelab Mission 3 flight. The configuration and function of the payload hardware elements, the assembly and test program, the operational rationale, and the scientific approach of this mission are examined. Topics covered include animal life support systems, the squirrel monkey restraint, the camera-mirror system, the dynamic environment measurement system, the biotelemetry system, and the ground support equipment. Consideration is also given to animal pretests, loading the animals during their 12 hour light cycle, and animal early recovery after landing. This mission will be the first time that relatively large samples of monkeys and rats will be flown in space and also cared for and observed by man.

  14. Investigation of tracking systems properties in CAVE-type virtual reality systems

    NASA Astrophysics Data System (ADS)

    Szymaniak, Magda; Mazikowski, Adam; Meironke, Michał

    2017-08-01

In recent years, many scientific and industrial centers in the world have developed virtual reality systems or laboratories. One of the most advanced solutions is the Immersive 3D Visualization Lab (I3DVL), a CAVE-type (Cave Automatic Virtual Environment) laboratory. It contains two CAVE-type installations: a six-screen installation arranged in the form of a cube, and a four-screen installation, a simplified version of the previous one. The user's feeling of "immersion" and interaction with the virtual world depend on many factors, in particular on the accuracy of the user tracking system. In this paper the properties of the tracking systems applied in I3DVL were investigated. Two parameters were selected for analysis: the accuracy of the tracking system and the range over which markers are detected by the tracking system in the space of the CAVE. Measurements of system accuracy were performed for the six-screen installation, equipped with four tracking cameras, for the three axes X, Y, Z; rotation around the Y axis was also analyzed. The measured tracking system shows good linear and rotational accuracy. The biggest issue was the coverage of marker tracking inside the CAVE: it turned out that the tracking system loses sight of the markers in the corners of the installation. For comparison, in the simplified version of the CAVE (the four-screen installation), equipped with eight tracking cameras, this problem did not occur. The obtained results will allow the quality of the CAVE to be improved.

  15. Design and Fabrication of Nereid-UI: A Remotely Operated Underwater Vehicle for Oceanographic Access Under Ice

    NASA Astrophysics Data System (ADS)

    Whitcomb, L. L.; Bowen, A. D.; Yoerger, D.; German, C. R.; Kinsey, J. C.; Mayer, L. A.; Jakuba, M. V.; Gomez-Ibanez, D.; Taylor, C. L.; Machado, C.; Howland, J. C.; Kaiser, C. L.; Heintz, M.; Pontbriand, C.; Suman, S.; O'hara, L.

    2013-12-01

    The Woods Hole Oceanographic Institution and collaborators from the Johns Hopkins University and the University of New Hampshire are developing for the Polar Science Community a remotely-controlled underwater robotic vehicle capable of being tele-operated under ice under remote real-time human supervision. The Nereid Under-Ice (Nereid-UI) vehicle will enable exploration and detailed examination of biological and physical environments at glacial ice-tongues and ice-shelf margins, delivering high-definition video in addition to survey data from on board acoustic, chemical, and biological sensors. Preliminary propulsion system testing indicates the vehicle will be able to attain standoff distances of up to 20 km from an ice-edge boundary, as dictated by the current maximum tether length. The goal of the Nereid-UI system is to provide scientific access to under-ice and ice-margin environments that is presently impractical or infeasible. FIBER-OPTIC TETHER: The heart of the Nereid-UI system is its expendable fiber optic telemetry system. The telemetry system utilizes many of the same components pioneered for the full-ocean depth capable HROV Nereus vehicle, with the addition of continuous fiber status monitoring, and new float-pack and depressor designs that enable single-body deployment. POWER SYSTEM: Nereid-UI is powered by a pressure-tolerant lithium-ion battery system composed of 30 Ah prismatic pouch cells, arranged on a 90 volt bus and capable of delivering 15 kW. The cells are contained in modules of 8 cells, and groups of 9 modules are housed together in oil-filled plastic boxes. The power distribution system uses pressure tolerant components extensively, each of which have been individually qualified to 10 kpsi and operation between -20 C and 40 C. THRUSTERS: Nereid-UI will employ eight identical WHOI-designed thrusters, each with a frameless motor, oil-filled and individually compensated, and designed for low-speed (500 rpm max) direct drive. We expect an end-to-end propulsive efficiency of between 0.3 and 0.4 at a transit speed of 1 m/s based on testing conducted at WHOI. CAMERAS: Video imagery is one of the principal products of Nereid-UI. Two fiber-optic telemetry wavelengths deliver 1.5 Gb/s uncompressed HDSDI video to the support vessel in real time, supporting a Kongsberg OE14-522 hyperspherical pan and tilt HD camera and several utility cameras. PROJECT STATUS: The first shallow-water vehicle trials are scheduled for September 2013. The trials are designed to test core vehicle systems particularly the power system, main computer and control system, thrusters, video and telemetry system, and to refine camera, lighting and acoustic sensor placement for piloted and closed-loop control, especially as pertains to working near the underside of ice. Remaining vehicle design tasks include finalizing the single-body deployment concept and depressor, populating the scientific sensing suite, and the software development necessary to implement the planned autonomous return strategy. Final design and fabrication for these remaining components of the vehicle system will proceed through fall 2013, with trials under lake ice in early 2014, and potential polar trials beginning in 2014-15. SUPPORT: NSF OPP (ANT-1126311), WHOI, James Family Foundation, and George Frederick Jewett Foundation East.

  16. Design and performances of microcameras and photometers instruments on TARANIS satellite for an advanced characterization of Transient Luminous Event in the upper atmosphere

    NASA Astrophysics Data System (ADS)

    Le Mer-Dachard, Fanny; Cansot, Elodie; Hébert, Philippe; Farges, Thomas; Ravel, Karen; Gaillac, Stéphanie

    2015-10-01

The TARANIS mission aims at studying upper-atmosphere coupling with a scientific nadir-pointing microsatellite - of the CNES Myriade family - in a low-altitude orbit (700 km). The main objectives are to measure the occurrence of Transient Luminous Events (TLEs), impulsive energetic optical phenomena generated by storms according to recently discovered processes, and Terrestrial Gamma-ray Flashes (TGFs), together with their emissions and trigger factors. The TARANIS instruments are currently in the manufacturing, assembly, integration and testing phase. The MicroCameras and Photometers instruments (MCP) are in charge of the remote sensing of sprites and lightning at optical wavelengths. The MicroCameras instrument [MCP-MC] is an imager in the visible, and the Photometers instrument [MCP-PH] is a radiometer with four bands from UV to NIR, able to detect TLEs on board and to trigger the whole payload. The satellite will provide a complete survey of the atmosphere at low resolution together with high resolution data on sites of interest automatically detected on board. For the MC and PH instruments, CEA defined the scientific needs and is in charge of processing the data and providing scientific results; CNES specified the technical requirements of the two instruments and will run the in-flight commissioning; design, manufacturing and testing are under the responsibility of Sodern for the MicroCameras and Bertin Technologies for the Photometers. This article shortly describes the physical characteristics of TLEs and presents the final design of these instruments and the first measured performances.

  17. The dust environment of 67P/Churyumov-Gerasimenko as seen through Rosetta/OSIRIS

    NASA Astrophysics Data System (ADS)

    Tubiana, C.; Güttler, C.; Sierks, H.; Bertini, I.; Osiris Team

    2017-09-01

    The ESA's Rosetta spacecraft had the unique opportunity to be in the vicinity of comet 67P/Churyumov-Gerasimenko for 2.5 years, observing how the comet evolved while approaching the Sun, passing through perihelion and then moving back into the outer solar system. OSIRIS, the scientific camera system onboard Rosetta, imaged the nucleus and the comet dust environment during the entire mission. We studied the unresolved dust coma, investigating its diurnal and seasonal variations and providing insights into the dust composition. Hundreds of individual particles, identified in the thousands of images dedicated to dust studies, have been characterized in terms of color, size distribution, distance, light curves and orbits.

  18. A Tale of Two Comets: ISON

    NASA Image and Video Library

    2013-11-26

Release Date: November 25, 2013 MESSENGER image of comet C/2012 S1 (ISON) during its closest approach to Mercury. At that time, ISON was approximately 22.5 million miles (36.2 million kilometers) from MESSENGER and 42.1 million miles (67.8 million kilometers) from the Sun. The image is 7° by 4.7° in size and has been slightly magnified and smoothed to enhance the faint tail of the comet. The tail was oriented at an angle to MESSENGER at the time and is foreshortened in this image; however, some faint structure can still be seen. MESSENGER's cameras have been acquiring targeted observations of Encke since October 28 and ISON since October 26, although the first faint detections didn't come until early November. During the closest approach of each comet to Mercury, the Mercury Atmospheric and Surface Composition Spectrometer (MASCS) and X-Ray Spectrometer (XRS) instruments also targeted the comets. Observations of ISON conclude on November 26, when the comet passes too close to the Sun, but MESSENGER will continue to monitor Encke with both the imagers and spectrometers through early December. The MESSENGER spacecraft is the first ever to orbit the planet Mercury, and the spacecraft's seven scientific instruments and radio science investigation are unraveling the history and evolution of the Solar System's innermost planet. During the first two years of orbital operations, MESSENGER acquired over 150,000 images and extensive other data sets. MESSENGER is capable of continuing orbital operations until early 2015. Date acquired: 01:54:30 UTC on November 20, 2013 Instrument: Wide Angle Camera (WAC) of the Mercury Dual Imaging System (MDIS) Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington/Southwest Research Institute

  19. Motion Estimation Utilizing Range Detection-Enhanced Visual Odometry

    NASA Technical Reports Server (NTRS)

    Morris, Daniel Dale (Inventor); Chang, Hong (Inventor); Friend, Paul Russell (Inventor); Chen, Qi (Inventor); Graf, Jodi Seaborn (Inventor)

    2016-01-01

    A motion determination system is disclosed. The system may receive a first and a second camera image from a camera, the first camera image received earlier than the second camera image. The system may identify corresponding features in the first and second camera images. The system may receive range data comprising at least one of a first and a second range data from a range detection unit, corresponding to the first and second camera images, respectively. The system may determine first positions and the second positions of the corresponding features using the first camera image and the second camera image. The first positions or the second positions may be determined by also using the range data. The system may determine a change in position of the machine based on differences between the first and second positions, and a VO-based velocity of the machine based on the determined change in position.
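
    Once the range data has turned matched image features into 3-D points, one plausible realization of the "change in position" step is the standard SVD-based rigid alignment below. It is offered as an illustration of the idea, not as the patent's specific algorithm.

    ```python
    import numpy as np

    def rigid_transform(P, Q):
        """Least-squares rigid motion between two 3-D point sets.

        P, Q: Nx3 arrays of the same features at the earlier and later
        camera poses. Returns R, t with Q ~= P @ R.T + t.
        """
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)  # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # guard against reflection
        t = cQ - R @ cP
        return R, t

    # A VO-based velocity then follows from the displacement per frame:
    # v = np.linalg.norm(t) / dt
    ```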

  20. (abstract) Realization of a Faster, Cheaper, Better Mission and Its New Paradigm Star Tracker, the Advanced Stellar Compass

    NASA Technical Reports Server (NTRS)

    Eisenman, Allan Read; Liebe, Carl Christian; Joergensen, John Lief; Jensen, Gunnar Bent

    1997-01-01

The first Danish satellite, Ørsted, will be launched in August of 1997. The scientific objective of Ørsted is to perform a precision mapping of the Earth's magnetic field. Attitude data for the payload and the satellite are provided by the Advanced Stellar Compass (ASC) star tracker. The ASC consists of a CCD star camera and a capable microprocessor which operates by comparing the star image frames taken by the camera to its internal star catalogs.

  1. THE MARS ORBITER CAMERA IS INSTALLED ON THE MARS GLOBAL SURVEYOR

    NASA Technical Reports Server (NTRS)

    1996-01-01

    In the Payload Hazardous Servicing Facility at KSC, installation is under way of the Mars Orbiter Camera (MOC) on the Mars Global Surveyor spacecraft. The MOC is one of a suite of six scientific instruments that will gather data during a two-year period about Martian topography, mineral distribution and weather. The Mars Global Surveyor is slated for launch aboard a Delta II expendable launch vehicle on November 6, the beginning of a 20-day launch period.

  2. Absolute calibration of a charge-coupled device camera with twin beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meda, A.; Ruo-Berchera, I., E-mail: i.ruoberchera@inrim.it; Degiovanni, I. P.

    2014-09-08

We report on the absolute calibration of a Charge-Coupled Device (CCD) camera by exploiting quantum correlations. The method exploits a number of spatial pairwise quantum-correlated modes produced by spontaneous parametric down-conversion. We develop a measurement model accounting for all the uncertainty contributions, and we reach a relative uncertainty of 0.3% in the low-photon-flux regime. This represents a significant step forward for the characterization of (scientific) CCDs used in the mesoscopic light regime.
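
    The principle behind the method fits in a few lines: for ideal twin beams, the noise of the photon-number difference falls below the shot-noise level by exactly the detection efficiency. The sketch below states that textbook relation; the paper's full measurement model adds background, spatial-mode, and uncertainty corrections on top of it.

    ```python
    import numpy as np

    def twin_beam_efficiency(n1, n2):
        """Estimate detection efficiency from twin-beam count statistics.

        n1, n2: per-frame photoelectron counts integrated over two
        correlated (signal/idler) detection areas.
        For perfectly pairwise-correlated beams the noise reduction factor
        sigma = Var(n1 - n2) / <n1 + n2> equals 1 - eta.
        """
        sigma = np.var(n1 - n2) / np.mean(n1 + n2)
        return 1.0 - sigma
    ```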

  3. Differences in glance behavior between drivers using a rearview camera, parking sensor system, both technologies, or no technology during low-speed parking maneuvers.

    PubMed

    Kidd, David G; McCartt, Anne T

    2016-02-01

    This study characterized the use of various fields of view during low-speed parking maneuvers by drivers with a rearview camera, a sensor system, a camera and sensor system combined, or neither technology. Participants performed four different low-speed parking maneuvers five times. Glances to different fields of view the second time through the four maneuvers were coded along with the glance locations at the onset of the audible warning from the sensor system and immediately after the warning for participants in the sensor and camera-plus-sensor conditions. Overall, the results suggest that information from cameras and/or sensor systems is used in place of mirrors and shoulder glances. Participants with a camera, sensor system, or both technologies looked over their shoulders significantly less than participants without technology. Participants with cameras (camera and camera-plus-sensor conditions) used their mirrors significantly less compared with participants without cameras (no-technology and sensor conditions). Participants in the camera-plus-sensor condition looked at the center console/camera display for a smaller percentage of the time during the low-speed maneuvers than participants in the camera condition and glanced more frequently to the center console/camera display immediately after the warning from the sensor system compared with the frequency of glances to this location at warning onset. Although this increase was not statistically significant, the pattern suggests that participants in the camera-plus-sensor condition may have used the warning as a cue to look at the camera display. The observed differences in glance behavior between study groups were illustrated by relating it to the visibility of a 12-15-month-old child-size object. These findings provide evidence that drivers adapt their glance behavior during low-speed parking maneuvers following extended use of rearview cameras and parking sensors, and suggest that other technologies which augment the driving task may do the same. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needed to be built. We introduce the principle and development of the retina-like sensor. Coordinate transformation and sub-pixel interpolation of the image had to be implemented to handle the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.
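
    The coordinate transformation and sub-pixel interpolation step can be illustrated with a log-polar layout, a common model for retina-like sensors; the actual pixel distribution of the sensor in the paper may differ, so treat the following as a schematic.

    ```python
    import numpy as np

    def logpolar_to_cartesian(img_lp, out_size, r_min, r_max):
        """Resample a retina-like (log-polar) image onto a Cartesian grid.

        img_lp: (n_rings, n_sectors) array of sensor samples.
        Bilinear (sub-pixel) interpolation over the ring/sector grid plays
        the role of the interpolation step described in the abstract.
        """
        n_rings, n_sectors = img_lp.shape
        y, x = np.mgrid[0:out_size, 0:out_size]
        dx, dy = x - out_size / 2, y - out_size / 2
        r = np.hypot(dx, dy).clip(r_min, r_max)
        ring = np.log(r / r_min) / np.log(r_max / r_min) * (n_rings - 1)
        sector = (np.arctan2(dy, dx) % (2 * np.pi)) / (2 * np.pi) * n_sectors
        r0, s0 = np.floor(ring).astype(int), np.floor(sector).astype(int)
        r1 = np.clip(r0 + 1, 0, n_rings - 1)
        s0, s1 = s0 % n_sectors, (s0 + 1) % n_sectors
        fr, fs = ring - np.floor(ring), sector - np.floor(sector)
        return (img_lp[r0, s0] * (1 - fr) * (1 - fs)
                + img_lp[r1, s0] * fr * (1 - fs)
                + img_lp[r0, s1] * (1 - fr) * fs
                + img_lp[r1, s1] * fr * fs)
    ```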

  5. Small SWAP 3D imaging flash ladar for small tactical unmanned air systems

    NASA Astrophysics Data System (ADS)

    Bird, Alan; Anderson, Scott A.; Wojcik, Michael; Budge, Scott E.

    2015-05-01

    The Space Dynamics Laboratory (SDL), working with Naval Research Laboratory (NRL) and industry leaders Advanced Scientific Concepts (ASC) and Hood Technology Corporation, has developed a small SWAP (size, weight, and power) 3D imaging flash ladar (LAser Detection And Ranging) sensor system concept design for small tactical unmanned air systems (STUAS). The design utilizes an ASC 3D flash ladar camera and laser in a Hood Technology gyro-stabilized gimbal system. The design is an autonomous, intelligent, geo-aware sensor system that supplies real-time 3D terrain and target images. Flash ladar and visible camera data are processed at the sensor using a custom digitizer/frame grabber with compression. Mounted in the aft housing are power, controls, processing computers, and GPS/INS. The onboard processor controls pointing and handles image data, detection algorithms and queuing. The small SWAP 3D imaging flash ladar sensor system generates georeferenced terrain and target images with a low probability of false return and <10 cm range accuracy through foliage in real-time. The 3D imaging flash ladar is designed for a STUAS with a complete system SWAP estimate of <9 kg, <0.2 m3 and <350 W power. The system is modeled using LadarSIM, a MATLAB® and Simulink®- based ladar system simulator designed and developed by the Center for Advanced Imaging Ladar (CAIL) at Utah State University. We will present the concept design and modeled performance predictions.

  6. ICE stereocamera system - photogrammetric setup for retrieval and analysis of small scale sea ice topography

    NASA Astrophysics Data System (ADS)

    Divine, Dmitry; Pedersen, Christina; Karlsen, Tor Ivan; Aas, Harald; Granskog, Mats; Renner, Angelika; Spreen, Gunnar; Gerland, Sebastian

    2013-04-01

A new thin-ice Arctic paradigm requires reconsideration of the parameterizations of mass and energy exchange within the ocean-sea-ice-atmosphere system used in modern CGCMs. Such a reassessment requires a comprehensive collection of measurements made specifically on first-year pack ice, with a focus on the summer melt season, when the difference from the conditions typical of the earlier multi-year Arctic sea ice cover becomes most pronounced. Previous in situ studies have demonstrated the crucial importance of smaller scale (i.e. less than 10 m) surface topography features for the seasonal evolution of pack ice. During 2011-2012 NPI developed a helicopter-borne ICE stereocamera system intended for mapping sea ice surface topography and for aerial photography. The hardware comprises two Canon 5D Mark II cameras, a combined GPS/INS unit by Novatel, and a laser altimeter mounted in a single enclosure outside the helicopter; the unit is controlled by a PXI chassis mounted inside the helicopter cabin. The ICE stereocamera system was deployed for the first time during the 2012 summer field season. The hardware setup proved highly reliable and was used in about 30 helicopter flights over Arctic sea ice during July-September; being highly automated, it required minimal human supervision during in-flight operation. The camera system was mostly deployed in combination with the EM-bird, which measures sea-ice thickness, and this combination provides an integrated view of the sea ice cover along the flight track. During flight the cameras shot sequentially with a time interval of 1 second each to ensure sufficient overlap between subsequent images. The roughly 35,000 images of sea ice/water surface captured per camera sum to 6 TB of data collected during the first field season. The reconstruction of the digital elevation model of the sea ice surface will be done using the SOCET SET commercial software. Refraction at the water/air interface can also be taken into account, providing valuable data on melt pond coverage, depth and bottom topography - the primary goals for the system at its present stage. Preliminary analysis of the reconstructed 3D scenes of ponded first-year ice for selected sites has shown good agreement with in situ measurements, demonstrating the good scientific potential of the ICE stereocamera system.
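
    Since accounting for refraction at the water/air interface is highlighted above as the route to melt-pond depth, here is the geometric core of that correction; a flat pond surface and a single refractive index are simplifying assumptions relative to full two-media photogrammetry.

    ```python
    import numpy as np

    def refraction_corrected_depth(apparent_depth, incidence_deg,
                                   n_water=1.34):
        """Correct an apparent (photogrammetric) pond depth for refraction.

        Rays bend at the air/water interface (Snell's law), so stereo
        depths of submerged points are underestimated. For near-nadir
        views the correction approaches the familiar factor n_water.
        """
        i = np.radians(np.asarray(incidence_deg, dtype=float))
        r = np.arcsin(np.sin(i) / n_water)  # refracted angle in water
        ratio = np.where(i > 1e-6,
                         np.tan(i) / np.maximum(np.tan(r), 1e-12),
                         n_water)  # nadir limit: tan(i)/tan(r) -> n_water
        return apparent_depth * ratio
    ```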

  7. Near-infrared fluorescence imaging with a mobile phone (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ghassemi, Pejhman; Wang, Bohan; Wang, Jianting; Wang, Quanzeng; Chen, Yu; Pfefer, T. Joshua

    2017-03-01

Mobile phone cameras employ sensors with near-infrared (NIR) sensitivity, yet this capability has not been exploited for biomedical purposes. Removing the IR-blocking filter from a phone-based camera opens the door to a wide range of techniques and applications for inexpensive, point-of-care biophotonic imaging and sensing. This study provides proof of principle for one of these modalities - phone-based NIR fluorescence imaging. An imaging system was assembled using a 780 nm light source along with excitation and emission filters with 800 nm and 825 nm cut-off wavelengths, respectively. Indocyanine green (ICG) was used as an NIR fluorescence contrast agent in an ex vivo rodent model, a resolution test target, and a 3D-printed, tissue-simulating vascular phantom. Raw and processed images for red, green and blue pixel channels were analyzed for quantitative evaluation of fundamental performance characteristics including spectral sensitivity, detection linearity and spatial resolution, and mobile phone results were compared with a scientific CCD. The spatial resolution of the CCD system was consistently superior to that of the phone, and the green phone camera pixels showed better resolution than the blue or red channels. The CCD exhibited sensitivity similar to the processed red and blue pixel channels, yet a greater degree of detection linearity. Raw phone pixel data showed lower sensitivity but greater linearity than processed data. Overall, both qualitative and quantitative results provided strong evidence of the potential of phone-based NIR imaging, which may lead to a wide range of applications from cancer detection to glucose sensing.

  8. The PanCam Instrument for the ExoMars Rover

    PubMed Central

    Coates, A.J.; Jaumann, R.; Griffiths, A.D.; Leff, C.E.; Schmitz, N.; Josset, J.-L.; Paar, G.; Gunn, M.; Hauber, E.; Cousins, C.R.; Cross, R.E.; Grindrod, P.; Bridges, J.C.; Balme, M.; Gupta, S.; Crawford, I.A.; Irwin, P.; Stabbins, R.; Tirsch, D.; Vago, J.L.; Theodorou, T.; Caballo-Perucha, M.; Osinski, G.R.

    2017-01-01

The scientific objectives of the ExoMars rover are designed to answer several key questions in the search for life on Mars. In particular, the unique subsurface drill will address some of these, such as the possible existence and stability of subsurface organics. PanCam will establish the surface geological and morphological context for the mission, working in collaboration with other context instruments. Here, we describe the PanCam scientific objectives in geology, atmospheric science, and 3-D vision. We discuss the design of PanCam, which includes a stereo pair of Wide Angle Cameras (WACs), each of which has an 11-position filter wheel and a High Resolution Camera (HRC) for high-resolution investigations of rock texture at a distance. The cameras and electronics are housed in an optical bench that provides the mechanical interface to the rover mast and a planetary protection barrier. The electronic interface is via the PanCam Interface Unit (PIU), and power conditioning is via a DC-DC converter. PanCam also includes a calibration target mounted on the rover deck for radiometric calibration, fiducial markers for geometric calibration, and a rover inspection mirror. Key Words: Mars—ExoMars—Instrumentation—Geology—Atmosphere—Exobiology—Context. Astrobiology 17, 511–541.

  9. Science Activity Planner for the MER Mission

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey S.; Crockett, Thomas M.; Fox, Jason M.; Joswig, Joseph C.; Powell, Mark W.; Shams, Khawaja S.; Torres, Recaredo J.; Wallick, Michael N.; Mittman, David S.

    2008-01-01

    The Maestro Science Activity Planner is a computer program that assists human users in planning operations of the Mars Exploration Rover (MER) mission and visualizing scientific data returned from the MER rovers. Relative to its predecessors, this program is more powerful and easier to use. This program is built on the Java Eclipse open-source platform around a Web-browser-based user-interface paradigm to provide an intuitive user interface to Mars rovers and landers. This program affords a combination of advanced display and simulation capabilities. For example, a map view of terrain can be generated from images acquired by the High Resolution Imaging Science Experiment (HiRISE) instrument aboard the Mars Reconnaissance Orbiter spacecraft and overlaid with images from a navigation camera (more precisely, a stereoscopic pair of cameras) aboard a rover, and an interactive, annotated rover traverse path can be incorporated into the overlay. It is also possible to construct an overhead perspective mosaic image of terrain from navigation-camera images. This program can be adapted to similar use on other outer-space missions and is potentially adaptable to numerous terrestrial applications involving analysis of data, operations of robots, and planning of such operations for acquisition of scientific data.

  10. Impact of New Camera Technologies on Discoveries in Cell Biology.

    PubMed

    Stuurman, Nico; Vale, Ronald D

    2016-08-01

    New technologies can make previously invisible phenomena visible. Nowhere is this more obvious than in the field of light microscopy. Beginning with the observation of "animalcules" by Antonie van Leeuwenhoek, when he figured out how to achieve high magnification by shaping lenses, microscopy has advanced to this day by a continued march of discoveries driven by technical innovations. Recent advances in single-molecule-based technologies have achieved unprecedented resolution, and were the basis of the Nobel prize in Chemistry in 2014. In this article, we focus on developments in camera technologies and associated image processing that have been a major driver of technical innovations in light microscopy. We describe five types of developments in camera technology: video-based analog contrast enhancement, charge-coupled devices (CCDs), intensified sensors, electron multiplying gain, and scientific complementary metal-oxide-semiconductor cameras, which, together, have had major impacts in light microscopy. © 2016 Marine Biological Laboratory.

  11. Dawn: An Ion-Propelled Journey to the Beginning of the Solar System

    NASA Technical Reports Server (NTRS)

    Brophy, John R.; Rayman, Marc D.; Pavri, Betina

    2008-01-01

    The Dawn mission is designed to perform a scientific investigation of the two most massive main-belt asteroids, Vesta and Ceres. These bodies are believed to preserve records of the physical and chemical conditions present during the formation of the solar system. The mission uses an ion propulsion system to enable the single Dawn spacecraft and its complement of scientific instruments to orbit both of these asteroids. Dawn's three science instruments - the gamma ray and neutron detector, the visible and infrared mapping spectrometer, and the primary framing camera - were successfully tested after launch and are functioning normally. The ion propulsion system includes three ion thrusters of the type flown previously on NASA's Deep Space 1 mission. A minimum of two ion thrusters is necessary to accomplish the Dawn mission. Checkout of two of the ion thrusters was completed as planned within 30 days after launch. This activity confirmed that the spacecraft has two healthy ion thrusters. While further checkout activities are still in progress, the activities completed as of the end of October indicate that the spacecraft is well on its way toward being ready for the start of the thrusting-cruise phase of the mission beginning December 15th.

  12. Harpicon camera for HDTV

    NASA Astrophysics Data System (ADS)

    Tanada, Jun

    1992-08-01

    Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision") cameras, the specifications differed from those of present-day cameras, and cameras using all kinds of components, component arrangements, and appearances were developed into products, with much time spent on experimentation, design, fabrication, adjustment, and inspection. Recently, however, the know-how built up thus far in components, printed circuit boards, and wiring methods has been incorporated into camera fabrication, making it possible to make HDTV cameras by methods similar to those of the present system. In addition, more-efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanism parts, and software for both HDTV cameras and cameras that operate by the present system.

  13. KSC-07pd2638

    NASA Image and Video Library

    2007-09-28

    KENNEDY SPACE CENTER, FLA. -- In the Orbiter Processing Facility, members of the STS-122 crew look over cameras that will be used during the mission. From left are Mission Specialists Hans Schlegel and Rex Walheim. Schlegel represents the European Space Agency. The crew is at Kennedy Space Center to take part in a crew equipment interface test, which helps familiarize them with equipment and payloads for the mission. Among the activities standard to a CEIT are harness training, inspection of the thermal protection system and camera operation for planned extravehicular activities, or EVAs. The mission will carry and install the Columbus Lab, a multifunctional, pressurized laboratory that will be permanently attached to Node 2 of the space station to carry out experiments in materials science, fluid physics and biosciences, as well as to perform a number of technological applications. It is Europe’s largest contribution to the construction of the International Space Station and will support scientific and technological research in a microgravity environment. STS-122 is targeted for launch in December. Photo credit: NASA/Kim Shiflett

  14. KSC-07pd2637

    NASA Image and Video Library

    2007-09-28

    KENNEDY SPACE CENTER, FLA. -- In the Orbiter Processing Facility, members of the STS-122 crew look over cameras that will be used during the mission. From left are Mission Specialists Stanley Love, Hans Schlegel and Rex Walheim and Pilot Alan Poindexter. The crew is at Kennedy Space Center to take part in a crew equipment interface test, which helps familiarize them with equipment and payloads for the mission. Among the activities standard to a CEIT are harness training, inspection of the thermal protection system and camera operation for planned extravehicular activities, or EVAs. The mission will carry and install the Columbus Lab, a multifunctional, pressurized laboratory that will be permanently attached to Node 2 of the space station to carry out experiments in materials science, fluid physics and biosciences, as well as to perform a number of technological applications. It is Europe’s largest contribution to the construction of the International Space Station and will support scientific and technological research in a microgravity environment. STS-122 is targeted for launch in December. Photo credit: NASA/Kim Shiflett

  15. Inspection with Robotic Microscopic Imaging

    NASA Technical Reports Server (NTRS)

    Pedersen, Liam; Deans, Matthew; Kunz, Clay; Sargent, Randy; Chen, Alan; Mungas, Greg

    2005-01-01

    Future Mars rover missions will require more advanced onboard autonomy for increased scientific productivity and reduced mission operations cost. One such form of autonomy can be achieved by targeting precise science measurements to be made in a single command uplink cycle. In this paper, we present an overview of our solution to the subproblems of navigating a rover into place for microscopic imaging, mapping an instrument target point selected by an operator from far-away science-camera images to close-up hazard-camera images, verifying the safety of placing a contact instrument on a sample or finding nearby safe points, and analyzing the data that comes back from the rover. The system developed includes portions used in the Multiple Target Single Cycle Instrument Placement demonstration at NASA Ames in October 2004, and portions of the MI Toolkit delivered to the Athena Microscopic Imager Instrument Team for the MER mission still operating on Mars today. Some of the component technologies are also under consideration for MSL mission infusion.
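
    The geometric core of the target hand-off described above, back-projecting a pixel chosen in the far-away science camera and re-projecting it into the close-up hazard camera, can be sketched with an ideal pinhole model. All intrinsics, the relative pose, and the target range below are invented placeholders, not MER values.

        import numpy as np

        def backproject(pixel, depth, K):
            """Pixel + known range -> 3-D point in that camera's frame."""
            u, v = pixel
            x = (u - K[0, 2]) * depth / K[0, 0]
            y = (v - K[1, 2]) * depth / K[1, 1]
            return np.array([x, y, depth])

        def project(point_cam, K):
            """3-D point in a camera's frame -> pixel coordinates."""
            p = K @ point_cam
            return p[:2] / p[2]

        K_sci = np.array([[1500., 0., 512.], [0., 1500., 384.], [0., 0., 1.]])
        K_haz = np.array([[300., 0., 256.], [0., 300., 192.], [0., 0., 1.]])
        R = np.eye(3)                    # hazard camera's rotation w.r.t. science camera
        t = np.array([0.0, 0.5, -2.0])   # ... and translation in metres (both assumed)

        # Operator picks pixel (600, 400) in the science image; stereo gives 3.5 m range.
        target_sci = backproject((600, 400), depth=3.5, K=K_sci)
        target_haz = R @ target_sci + t
        print(project(target_haz, K_haz))   # where to look in the hazard-camera image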

  16. STS-35 MS Hoffman operates ASTRO-1 MPC on OV-102's aft flight deck

    NASA Image and Video Library

    1990-12-10

    STS035-12-015 (2-11 Dec 1990) --- Astronaut Jeffrey A. Hoffman, STS 35 mission specialist, uses a manual pointing controller (MPC) for the Astro-1 mission's Instrument Pointing System (IPS). By using the MPC, Hoffman and other crewmembers on Columbia's aft flight deck, were able to command the IPS, located in the cargo bay, to record astronomical data. Hoffman is serving the "Blue" shift which complemented the currently sleeping "Red" shift of crewmembers as the mission collected scientific data on a 24-hour basis. The scene was photographed with a 35mm camera.

  17. An attentive multi-camera system

    NASA Astrophysics Data System (ADS)

    Napoletano, Paolo; Tisato, Francesco

    2014-03-01

    Intelligent multi-camera systems that integrate computer vision algorithms are not error free, and thus both false positive and false negative detections need to be reviewed by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying video from each camera of the network. Nevertheless, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of one video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. We believe that this is the first work that tries to use a model of human visual attention for the dynamic selection of the camera view of a multi-camera system. The proposed method was evaluated in a given scenario and demonstrated its effectiveness with respect to other methods and a manually generated ground truth. The effectiveness was evaluated in terms of the number of correct best views generated by the method with respect to the camera views manually selected by a human operator.
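
    As a concrete illustration of the bottom-up half of such a model, the sketch below scores each camera's frame with a simple spectral-residual saliency map (used here as a generic stand-in for the authors' attention model, which also integrates top-down cues) and selects the highest-scoring view. The frames are synthetic.

        import numpy as np

        def spectral_residual_saliency(gray):
            """Bottom-up saliency map via the spectral-residual method."""
            f = np.fft.fft2(gray)
            log_amp, phase = np.log(np.abs(f) + 1e-8), np.angle(f)
            # 3x3 box-filtered log spectrum (wrap-around), subtracted as the residual
            avg = sum(np.roll(np.roll(log_amp, i, 0), j, 1)
                      for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
            return np.abs(np.fft.ifft2(np.exp((log_amp - avg) + 1j * phase))) ** 2

        def select_best_view(frames):
            """Index of the camera whose frame carries the most salient content."""
            return int(np.argmax([spectral_residual_saliency(f).mean() for f in frames]))

        # Mock three-camera network; camera 1 sees a bright structured blob.
        frames = [np.random.rand(64, 64) * 0.1 for _ in range(3)]
        frames[1][20:30, 20:30] += 1.0
        print(select_best_view(frames))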

  18. Design of an unmanned Martian polar exploration system

    NASA Technical Reports Server (NTRS)

    Baldwin, Curt; Chitwood, Denny; Demann, Brian; Ducheny, Jordan; Hampton, Richard; Kuhns, Jesse; Mercer, Amy; Newman, Shawn; Patrick, Chris; Polakowski, Tony

    1994-01-01

    The design of an unmanned Martian polar exploration system is presented. The system elements include subsystems for transportation of material from earth to Mars, study of the Martian north pole, power generation, and communications. Early next century, three Atlas 2AS launch vehicles will be used to insert three Earth-Mars transfer vehicles, or buses, into a low-energy transfer orbit. Capture at Mars will be accomplished by aerobraking into a circular orbit. Each bus contains four landers and a communications satellite. Six of the twelve total landers will be deployed at 60 deg intervals along 80 deg N, and the remaining six landers at 5 deg intervals along 30 deg E from 65 deg N to 90 deg N by a combination of retrorockets and parachutes. The three communications satellites will be deployed at altitudes of 500 km in circular polar orbits that are 120 deg out of phase. These placements maximize the polar coverage of the science and communications subsystems. Each lander contains scientific equipment, two microrovers, power supplies, communications equipment, and a science computer. The lander scientific equipment includes a microweather station, seismometer, thermal probe, x-ray spectrometer, camera, and sounding rockets. One rover, designed for short-range (less than 2 km) excursions from the lander, includes a mass spectrometer for mineral analysis, an auger/borescope system for depth profiling, a deployable thermal probe, and charge coupled device cameras for terrain visualization/navigation. The second rover, designed for longer-range (2-5 km) excursions from the lander, includes radar sounding/mapping equipment, a seismometer, and laser ranging devices. Power for all subsystems is supplied by a combination of solar cells, Ni-H batteries, and radioisotope thermoelectric generators. Communications are sequenced from rovers, sounding rockets, and remote sensors to the lander, then to the satellites, through the Deep Space Network to and from earth.

  19. Prism-based single-camera system for stereo display

    NASA Astrophysics Data System (ADS)

    Zhao, Yue; Cui, Xiaoyu; Wang, Zhiguo; Chen, Hongsheng; Fan, Heyu; Wu, Teresa

    2016-06-01

    This paper combines a prism with a single camera to put forward a low-cost method of stereo imaging. First, from the principles of geometrical optics, we derive the relationship between the prism single-camera system and a dual-camera system; from the principle of binocular vision, we derive the relationship between binocular viewing and the dual-camera system. We can thus establish the relationship between the prism single-camera system and binocular vision, and obtain the positional relation of prism, camera, and object that gives the best stereo display. Finally, using the active shutter stereo glasses of NVIDIA Company, we realize the three-dimensional (3-D) display of the object. The experimental results show that the proposed approach can use the prism single-camera system to simulate the various observation manners of the eyes. A stereo imaging system designed by the method proposed in this paper can faithfully restore the 3-D shape of the photographed object.
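
    Once the prism is reduced to an equivalent pair of virtual cameras, standard binocular triangulation applies: depth Z = fB/d for focal length f, baseline B, and disparity d. A minimal numeric sketch with assumed values rather than the paper's parameters:

        # Assumed parameters of the equivalent virtual stereo pair:
        f_px = 1200.0   # focal length in pixels
        base = 0.06     # equivalent baseline between the two virtual cameras, metres

        def depth_from_disparity(disparity_px):
            """Triangulated depth for a left/right disparity measured in pixels."""
            return f_px * base / disparity_px

        for d in (8.0, 16.0, 32.0):
            print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d):.2f} m")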

  20. Feasibility evaluation and study of adapting the attitude reference system to the Orbiter camera payload system's large format camera

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A design concept that will implement a mapping capability for the Orbiter Camera Payload System (OCPS) when ground control points are not available is discussed. Through the use of stellar imagery collected by a pair of cameras whose optical axes are structurally related to the large format camera's optical axis, such pointing information is made available.

  1. NASA's Solar Observing Fleet Watch Comet ISON's Journey Around the Sun

    NASA Image and Video Library

    2013-11-22

    Comet ISON makes its appearance into the higher-resolution HI-1 camera on the STEREO-A spacecraft. The dark "clouds" coming from the right are density enhancements in the solar wind, causing all the ripples in comet Encke's tail. These kinds of solar wind interactions give us valuable information about solar wind conditions near the sun. Note: the STEREO-A spacecraft is currently located on the other side of the Sun, so it sees a totally different geometry to what we see from Earth. Credit: Karl Battams/NASA/STEREO/CIOC

  2. Dark-cycle monitoring of biological subjects on Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Chuang, Sherry; Mian, Arshad

    1992-01-01

    The operational environment for biological research on Space Station Freedom will incorporate video technology for monitoring plant and animal subjects. The video coverage must include dark-cycle monitoring because early experiments will use rodents that are nocturnal and therefore most active during the dark part of the daily cycle. Scientific requirements for monitoring during the dark cycle are exacting. Infrared (IR) or near-IR sensors are required. The trade-offs between these two types of sensors are based on engineering constraints, sensitivity spectra, and the quality of imagery possible from each type. This paper presents results of a study conducted by the Biological Flight Research Projects Office in conjunction with the Spacecraft Data Systems Branch at ARC to investigate the use of charge-coupled-device (CCD) and IR cameras to meet the scientific requirements. Also examined is the effect of low levels of near-IR illumination on the circadian rhythm in rats.

  3. Hubble's Necklace

    NASA Image and Video Library

    2017-12-08

    Image released 11 Aug 2011. The "Necklace Nebula" is located 15,000 light-years away in the constellation Sagitta (the Arrow). In this composite image, taken on July 2, 2011, Hubble's Wide Field Camera 3 captured the glow of hydrogen (blue), oxygen (green), and nitrogen (red). The object, aptly named the Necklace Nebula, is a recently discovered planetary nebula, the glowing remains of an ordinary, Sun-like star. The nebula consists of a bright ring, measuring 12 trillion miles wide, dotted with dense, bright knots of gas that resemble diamonds in a necklace. Credit: NASA, ESA, and the Hubble Heritage Team (STScI/AURA)

  4. Smartphone schlieren and shadowgraph imaging

    NASA Astrophysics Data System (ADS)

    Settles, Gary S.

    2018-05-01

    Schlieren and shadowgraph techniques are used throughout the realm of scientific experimentation to reveal transparent refractive phenomena, but the requirement of large precise optics has kept them mostly out of reach of the public. New developments, including the ubiquity of smartphones with high-resolution digital cameras and the Background-Oriented Schlieren technique (BOS), which replaces the precise optics with digital image processing, have changed these circumstances. This paper demonstrates a number of different schlieren and shadowgraph setups and image examples based only on a smartphone, its software applications, and some inexpensive accessories. After beginning with a simple traditional schlieren system the emphasis is placed on what can be visualized and measured using BOS and digital slit-scan imaging on the smartphone. Thermal plumes, liquid mixing and glass are used as subjects of investigation. Not only recreational and experimental photography, but also serious scientific imaging can be done.
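
    The BOS principle is compact enough to sketch directly: the apparent shift of a textured background between a reference frame and a frame taken through the disturbance is recovered as a dense displacement field, whose magnitude maps the refractive deflections. The sketch below assumes OpenCV and two pre-captured image files with placeholder names.

        import cv2
        import numpy as np

        # Placeholder file names: a reference shot of the textured background and
        # a shot of the same background seen through the refractive disturbance.
        ref = cv2.imread("background_reference.png", cv2.IMREAD_GRAYSCALE)
        dist = cv2.imread("background_distorted.png", cv2.IMREAD_GRAYSCALE)

        # Dense optical flow approximates the apparent background displacement.
        flow = cv2.calcOpticalFlowFarneback(ref, dist, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        magnitude = np.linalg.norm(flow, axis=2)   # pixel shift ~ refractive deflection

        out = cv2.normalize(magnitude, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        cv2.imwrite("bos_schlieren.png", out)      # synthetic schlieren-style image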

  5. NASA's EPIC View of 2017 Eclipse Across America

    NASA Image and Video Library

    2017-08-22

    From a million miles out in space, NASA’s Earth Polychromatic Imaging Camera (EPIC) captured natural color images of the moon’s shadow crossing over North America on Aug. 21, 2017. EPIC is aboard NOAA’s Deep Space Climate Observatory (DSCOVR), where it photographs the full sunlit side of Earth every day, giving it a unique view of total solar eclipses. EPIC normally takes about 20 to 22 images of Earth per day, so this animation appears to speed up the progression of the eclipse. To see the images of Earth every day, go to: epic.gsfc.nasa.gov

  6. Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System

    NASA Astrophysics Data System (ADS)

    Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    In this paper, we present an automatic vision-based traffic sign recognition system that can detect and classify traffic signs at long distance under different lighting conditions. The recognition is realized in an originally proposed dual-focal active camera system, in which a telephoto camera is equipped as an assistant to a wide-angle camera. The telephoto camera can capture a high-resolution image of an object of interest in the field of view of the wide-angle camera; this image provides enough information for recognition when the resolution of the traffic sign in the wide-angle view is too low. In the proposed system, traffic sign detection and classification are processed separately on the images from the wide-angle and telephoto cameras. In addition, to detect traffic signs against complex backgrounds under different lighting conditions, we propose a color transformation that is invariant to lighting changes. This transformation highlights the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide-angle camera. After detection, the system actively captures a high-resolution image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on information from the wide-angle camera. In classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-resolution image from the telephoto camera. Finally, based on the proposed system, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution under different lighting conditions.
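
    The paper's particular colour transformation is its own; as a generic stand-in, intensity-normalised chromaticity removes first-order lighting changes, so a red sign keeps roughly the same chromaticity under bright or dim illumination. A minimal sketch with assumed thresholds:

        import numpy as np

        def chromaticity(rgb):
            """HxWx3 float RGB image -> intensity-normalised chromaticity."""
            return rgb / (rgb.sum(axis=2, keepdims=True) + 1e-6)

        def red_sign_mask(rgb, r_min=0.45, g_max=0.30):
            """Chromaticity thresholds for red sign pixels (values to be tuned)."""
            c = chromaticity(rgb.astype(np.float64))
            return (c[..., 0] > r_min) & (c[..., 1] < g_max)

        # The same red patch under bright and dim light gives the same mask:
        patch = np.array([[[200.0, 40.0, 40.0]]])
        print(red_sign_mask(patch), red_sign_mask(patch * 0.3))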

  7. Investigation of solar active regions at high resolution by balloon flights of the solar optical universal polarimeter, extended definition phase

    NASA Technical Reports Server (NTRS)

    Tarbell, Theodore D.

    1993-01-01

    Technical studies of the feasibility of balloon flights of the former Spacelab instrument, the Solar Optical Universal Polarimeter (SOUP), with a modern charge-coupled device (CCD) camera, to study the structure and evolution of solar active regions at high resolution, are reviewed. In particular, different CCD cameras were used with the SOUP filter at ground-based solar observatories to evaluate their performance and collect high-resolution images. High-resolution movies of the photosphere and chromosphere were successfully obtained using four different CCD cameras. Some of these data were collected in coordinated observations with the Yohkoh satellite during May-July 1992, and they are being analyzed scientifically along with the simultaneous X-ray observations.

  8. The Palomar Transient Factory: High Quality Realtime Data Processing in a Cost-Constrained Environment

    NASA Astrophysics Data System (ADS)

    Surace, J.; Laher, R.; Masci, F.; Grillmair, C.; Helou, G.

    2015-09-01

    The Palomar Transient Factory (PTF) is a synoptic sky survey in operation since 2009. PTF utilizes a 7.1 square degree camera on the Palomar 48-inch Schmidt telescope to survey the sky primarily at a single wavelength (R-band) at a rate of 1000-3000 square degrees a night. The data are used to detect and study transient and moving objects such as gamma ray bursts, supernovae and asteroids, as well as variable phenomena such as quasars and Galactic stars. The data processing system at IPAC handles realtime processing and detection of transients, solar system object processing, high photometric precision processing and light curve generation, and long-term archiving and curation. This was developed under an extremely limited budget profile in an unusually agile development environment. Here we discuss the mechanics of this system and our overall development approach. Although a significant scientific installation in and of itself, PTF also serves as the prototype for our next generation project, the Zwicky Transient Facility (ZTF). Beginning operations in 2017, ZTF will feature a 50 square degree camera which will enable scanning of the entire northern visible sky every night. ZTF in turn will serve as a stepping stone to the Large Synoptic Survey Telescope (LSST), a major NSF facility scheduled to begin operations in the early 2020s.

  9. KENNEDY SPACE CENTER, FLA. - In the Payload Hazardous Servicing Facility at KSC, installation is under way of the Mars Orbiter Camera (MOC) on the Mars Global Surveyor spacecraft. The MOC is one of a suite of six scientific instruments that will gather data about Martian topography, mineral distribution and weather during a two-year period. The Mars Global Surveyor is slated for launch aboard a Delta II expendable launch vehicle on Nov. 6, the beginning of a 20-day launch period.

    NASA Image and Video Library

    1996-08-19

    KENNEDY SPACE CENTER, FLA. - In the Payload Hazardous Servicing Facility at KSC, installation is under way of the Mars Orbiter Camera (MOC) on the Mars Global Surveyor spacecraft. The MOC is one of a suite of six scientific instruments that will gather data about Martian topography, mineral distribution and weather during a two-year period. The Mars Global Surveyor is slated for launch aboard a Delta II expendable launch vehicle on Nov. 6, the beginning of a 20-day launch period.

  10. Real Time Data/Video/Voice Uplink and Downlink for Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Harper, Doyal A.

    1997-01-01

    LFS was an educational outreach adventure which brought the excitement of astronomical exploration on NASA's Kuiper Airborne Observatory (KAO) to a nationwide audience of children and parents through live, interactive television, broadcast from the KAO at an altitude of 41,000 feet during an actual scientific observing mission. The project encompassed three KAO flights during the fall of 1995, including a short practice mission, a daytime observing flight from Moffett Field, California, to Houston, Texas, and a nighttime mission from Houston back to Moffett Field. The University of Chicago infrared research team participated in planning the program, developing auxiliary materials including background information and lesson plans, developing software that allowed students on the ground to control the telescope and on-board cameras via the Internet from the Adler Planetarium in Chicago, and acting as on-camera correspondents to explain and answer questions about the scientific research conducted during the flights.

  11. Man with a Mission: Jean-Dominique Cassini

    NASA Astrophysics Data System (ADS)

    Belkora, Leila

    2004-03-01

    Jean-Dominique Cassini, for whom the Cassini mission to Saturn is named, is best known for his early understanding of that planet's rings. This article is an overview of his influential career in astronomy and other scientific fields. Born in Italy in 1625 and formally educated at an early age, he was a professor of astronomy at the University of Bologna, a leading center of learning in the Europe of the time. He was an early observer of Jupiter, Mars, and Venus. He is best known for constructing a giant pinhole camera in a cathedral that he used with a meridian line on the floor to track the Sun's image through the year, thus providing the Catholic Church with a reliable calendar. Cassini also used the pinhole camera observations to calculate the variation in the distance between the Sun and Earth, thus lending support to the Copernican (Sun-centered) view of the solar system. Cassini moved to Paris at the request of King Louis XIV, originally to oversee the surveying needed for a new map system of France, but ultimately he took over as the director of the Paris Observatory. Cassini's descendants ran the observatory there for the following century.

  12. Microgravity experiment system utilizing a balloon

    NASA Astrophysics Data System (ADS)

    Namiki, M.; Ohta, S.; Yamagami, T.; Koma, Y.; Akiyama, H.; Hirosawa, H.; Nishimura, J.

    A system for microgravity experiments using a stratospheric balloon has been planned and developed at ISAS since 1978. A rocket-shaped chamber mounting the experiment apparatus is released from the balloon at around 30 km altitude. The microgravity period lasts from the release to the opening of the parachute and is controlled by an on-board sequential timer. Test flights were performed in 1980 and 1981. In September 1983 the first scientific experiment, observing the behavior and brain activity of fish in microgravity, was successfully carried out. The chamber is specially equipped with movie cameras and subtransmitters, and its release altitude is about 32 km. The microgravity level observed inside the chamber is less than 2.9 × 10⁻³ g for 10 s. Engineering aspects of the system used in the 1983 experiment are presented.

  13. The Sondrestrom Research Facility All-sky Imagers

    NASA Astrophysics Data System (ADS)

    Kendall, E. A.; Grill, M.; Gudmundsson, E.; Stromme, A.

    2010-12-01

    The Sondrestrom Upper Atmospheric Research Facility is located near Kangerlussuaq, Greenland, just north of the Arctic Circle and 100 km inland from the west coast of Greenland. The facility is operated by SRI International in Menlo Park, California, under the auspices of the U.S. National Science Foundation. Operating in Greenland since 1983, the Sondrestrom facility is host to more than 20 instruments, the majority of which provide unique and complementary information about the arctic upper atmosphere. Together these instruments advance our knowledge of upper atmospheric physics and determine how the tenuous neutral gas interacts with the charged space plasma environment. The suite of instrumentation supports many disciplines of research - from plate tectonics to auroral physics and space weather. The Sondrestrom facility has recently acquired two new all-sky imagers. In this paper, we present images from both new imagers, placing them in context with other instruments at the site and detailing to the community how to gain access to this new data set. The first new camera replaces the intensified auroral system which has been on site for nearly three decades. This new all-sky imager (ASI), designed and assembled by Keo Scientific Ltd., employs a medium format 180° fisheye lens coupled to a set of five 3-inch narrowband interference filters. The current filter suite allows operation at the following wavelengths: 750 nm, 557.7 nm, 777.4 nm, 630.0 nm, and 732/3 nm. Monochromatic images from the ASI are acquired at a specific filter and integration time as determined by a unique configuration file. Integrations as short as 0.5 sec can be commanded for exceptionally bright features. Preview images are posted to the internet in near real-time, with final images posted weeks later. While images are continuously collected in a "patrol mode," users can request special collection sequences for targeted experiments. The second new imager installed at the Sondrestrom facility is a color all-sky imager (CASI). The CASI instrument is a low-cost Keo Scientific Ltd. system similar to cameras designed for the THEMIS satellite ground-based imaging network. This camera captures all visible wavelengths simultaneously at a higher data rate than the ASI. While it is not possible to resolve fine spectral features as with narrowband filters on the ASI, this camera provides context on wavelengths not covered by other imagers, and makes it much simpler to distinguish clouds from airglow and aurora. As with the ASI, this imager collects data during periods of dark skies and the images are posted to the web for community viewing.

  14. A Power Efficient Exaflop Computer Design for Global Cloud System Resolving Climate Models.

    NASA Astrophysics Data System (ADS)

    Wehner, M. F.; Oliker, L.; Shalf, J.

    2008-12-01

    Exascale computers would allow routine ensemble modeling of the global climate system at the cloud system resolving scale. Power and cost requirements of traditional architecture systems are likely to delay such capability for many years. We present an alternative route to the exascale using embedded processor technology to design a system optimized for ultra high resolution climate modeling. These power efficient processors, used in consumer electronic devices such as mobile phones, portable music players, cameras, etc., can be tailored to the specific needs of scientific computing. We project that a system capable of integrating a kilometer scale climate model a thousand times faster than real time could be designed and built in a five year time scale for US$75M with a power consumption of 3MW. This is cheaper, more power efficient and sooner than any other existing technology.

  15. Structure-from-Motion for Calibration of a Vehicle Camera System with Non-Overlapping Fields-of-View in an Urban Environment

    NASA Astrophysics Data System (ADS)

    Hanel, A.; Stilla, U.

    2017-05-01

    Vehicle environment cameras observing traffic participants in the area around a car and interior cameras observing the car driver are important data sources for driver intention recognition algorithms. To combine information from both camera groups, a camera system calibration can be performed. Typically, there is no overlapping field of view between environment and interior cameras, and marked reference points are often unavailable in environments large enough to accommodate a car for the system calibration. In this contribution, a calibration method for a vehicle camera system with non-overlapping camera groups in an urban environment is described. A-priori images of an urban calibration environment taken with an external camera are processed with the structure-from-motion method to obtain an environment point cloud. Images of the vehicle interior, also taken with an external camera, are processed to obtain an interior point cloud. The two point clouds are tied to each other with images from both sets showing the same real-world objects. The point clouds are transformed into a self-defined vehicle coordinate system describing the vehicle movement. On demand, videos can be recorded with the vehicle cameras during a calibration drive. Poses of the vehicle environment cameras and interior cameras are estimated separately using ground control points from the respective point cloud, and all poses of a vehicle camera estimated for different video frames are optimized in a bundle adjustment. In an experiment, a point cloud is created from images of an underground car park, as well as a point cloud of the interior of a Volkswagen test car. Videos from two environment cameras and one interior camera are recorded. Results show that the vehicle camera poses are estimated successfully, especially when the car is not moving. Position standard deviations in the centimeter range can be achieved for all vehicle cameras. Relative distances between the vehicle cameras deviate by between one and ten centimeters from tachymeter reference measurements.
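
    The per-frame pose step described above amounts to camera resectioning from ground control points drawn from the point cloud. A minimal sketch using OpenCV's solvePnP; the control points, image measurements, and intrinsics are placeholders, not values from the experiment.

        import cv2
        import numpy as np

        # Four ground control points on a ground plane (taken from the point cloud),
        # in metres, and their measured pixel positions in one video frame.
        object_points = np.array([[0., 0., 0.], [1., 0., 0.],
                                  [1., 1., 0.], [0., 1., 0.]])
        image_points = np.array([[320., 240.], [420., 242.],
                                 [418., 340.], [322., 338.]])
        K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])  # intrinsics

        ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
        R, _ = cv2.Rodrigues(rvec)
        camera_position = (-R.T @ tvec).ravel()   # camera centre in point-cloud frame
        print(ok, camera_position)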

  16. Relative and Absolute Calibration of a Multihead Camera System with Oblique and Nadir Looking Cameras for a Uas

    NASA Astrophysics Data System (ADS)

    Niemeyer, F.; Schima, R.; Grenzdörffer, G.

    2013-08-01

    Numerous unmanned aerial systems (UAS) are currently flooding the market, purpose-designed for the most diverse applications. Micro and mini UAS (maximum take-off weight up to 5 kg) are of particular interest because the legal restrictions are still manageable and the payload capacities are sufficient for many imaging sensors. A camera system with four oblique and one nadir-looking camera is currently under development at the Chair for Geodesy and Geoinformatics. This so-called "Four Vision" camera system was successfully built and tested in the air. An MD4-1000 UAS from microdrones is used as the carrier system. Lightweight industrial cameras are used and controlled by a central computer. For further photogrammetric image processing, each individual camera, as well as all cameras together, has to be calibrated. This paper focuses on the determination of the relative orientation between the cameras with the "Australis" software and gives an overview of the results and experiences from test flights.
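
    Given the individually calibrated pose of each camera in a common frame, the fixed relative orientation between any two heads follows by composition. A small sketch under a world-to-camera convention, with invented poses standing in for the "Four Vision" geometry:

        import numpy as np

        def relative_pose(R_i, t_i, R_j, t_j):
            """Transform mapping camera-i coordinates into camera-j coordinates,
            given world-to-camera poses x_cam = R @ x_world + t."""
            R_ji = R_j @ R_i.T
            return R_ji, t_j - R_ji @ t_i

        def rot_z(deg):
            a = np.radians(deg)
            return np.array([[np.cos(a), -np.sin(a), 0.],
                             [np.sin(a),  np.cos(a), 0.],
                             [0., 0., 1.]])

        # Invented poses: nadir camera at the origin, one oblique head rotated
        # 45 degrees and offset 10 cm -- stand-ins for the calibrated values.
        R_rel, t_rel = relative_pose(np.eye(3), np.zeros(3),
                                     rot_z(45.0), np.array([0.10, 0.0, 0.0]))
        print(R_rel, t_rel)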

  17. Introduction of A New Toolbox for Processing Digital Images From Multiple Camera Networks: FMIPROT

    NASA Astrophysics Data System (ADS)

    Melih Tanis, Cemal; Nadir Arslan, Ali

    2017-04-01

    Webcam networks intended for scientific monitoring of ecosystems provide digital images and other environmental data for various studies. Other types of camera networks can also be used for scientific purposes, e.g., traffic webcams for phenological studies, or camera networks monitoring ski tracks and avalanches over mountains for hydrological studies. To efficiently harness the potential of these camera networks, easy-to-use software that can obtain and handle images from different networks with different protocols and standards is necessary. Numerous software packages for analyzing images from webcam networks are freely available, with different strengths not only in analyzing but also in post-processing digital images. But specifically for ease of use, applicability and scalability, a different set of features could be added; a more customized approach would be of high value, not only for analyzing images of comprehensive camera networks, but also for creating operational data extraction and processing with an easy-to-use toolbox. In this paper, we introduce such a toolbox: the Finnish Meteorological Institute Image PROcessing Tool (FMIPROT). FMIPROT currently has the following features: straightforward installation; no software dependencies requiring extra installations; communication with multiple camera networks; automatic downloading and handling of images; a user-friendly and simple interface; data filtering; visualization of results on customizable plots; and plugins that allow users to add their own algorithms. Current image analyses in FMIPROT include "Color Fraction Extraction" and "Vegetation Indices". Color fraction extraction calculates the fractions of red, green and blue in a region of interest, along with brightness and luminance parameters. The vegetation indices analysis is a collection of indices used in vegetation phenology and includes "Green Fraction" (green chromatic coordinate), "Green-Red Vegetation Index" and "Green Excess Index". A "Snow Cover Fraction" analysis, which detects snow-covered pixels in the images and georeferences them on a geospatial plane to calculate the snow cover fraction, is being implemented at the moment. FMIPROT is being developed during the EU Life+ MONIMET project, for which we mounted 28 cameras at 14 different sites in Finland as the MONIMET camera network. In this paper, we present details of FMIPROT and analysis results from the MONIMET camera network, and discuss planned future developments of FMIPROT.
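
    The three vegetation indices named above have standard definitions and can be sketched in a few lines; the region of interest here is a mock array, and the index names follow the abstract.

        import numpy as np

        def vegetation_indices(roi):
            """roi: HxWx3 float array of R, G, B digital numbers for one ROI."""
            r, g, b = (roi[..., i].mean() for i in range(3))
            return {
                "green_fraction": g / (r + g + b),   # green chromatic coordinate
                "grvi": (g - r) / (g + r),           # green-red vegetation index
                "green_excess": 2 * g - r - b,       # green excess index
            }

        roi = np.random.rand(50, 50, 3) * 255.0      # mock region of interest
        print(vegetation_indices(roi))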

  18. Compact Autonomous Hemispheric Vision System

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Cunningham, Thomas J.; Werne, Thomas A.; Eastwood, Michael L.; Walch, Marc J.; Staehle, Robert L.

    2012-01-01

    Solar System Exploration camera implementations to date have involved either single cameras with wide field-of-view (FOV) and consequently coarser spatial resolution, cameras on a movable mast, or single cameras necessitating rotation of the host vehicle to afford visibility outside a relatively narrow FOV. These cameras require detailed commanding from the ground or separate onboard computers to operate properly, and are incapable of making decisions based on image content that control pointing and downlink strategy. For color, a filter wheel having selectable positions was often added, which added moving parts, size, mass, and power, and reduced reliability. A system was developed based on a general-purpose miniature visible-light camera using advanced CMOS (complementary metal oxide semiconductor) imager technology. The baseline camera has a 92° FOV, and six cameras are arranged in an angled-up carousel fashion, with FOV overlaps such that the system has a 360° FOV in azimuth. A seventh camera, also with a 92° FOV, is installed normal to the plane of the other six cameras, giving the system a >90° FOV in elevation and completing the hemispheric vision system. A central unit houses the common electronics box (CEB) controlling the system (power conversion, data processing, memory, and control software). Stereo is achieved by adding a second system on a baseline, and color is achieved by stacking two more systems (for a total of three, each system equipped with its own filter). Two connectors on the bottom of the CEB provide a connection to a carrier (rover, spacecraft, balloon, etc.) for telemetry, commands, and power. This system has no moving parts. The system's onboard software (SW) supports autonomous operations such as pattern recognition and tracking.
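
    A quick check of the azimuthal geometry described above, assuming the stated six 92° cameras around a full circle:

        # Six 92-degree cameras covering 360 degrees of azimuth:
        n_cams, fov = 6, 92.0
        print((n_cams * fov - 360.0) / n_cams, "degrees of overlap per seam")  # 32.0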

  19. InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications

    NASA Technical Reports Server (NTRS)

    Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon

    1996-01-01

    In 1994, Inframetrics introduced the InfraCAM(TM), a high resolution hand-held thermal imager. As the world's smallest, lightest and lowest power PtSi based infrared camera, the InfraCAM is ideal for a wide range of industrial, non destructive testing, surveillance and scientific applications. In addition to numerous commercial applications, the light weight and low power consumption of the InfraCAM make it extremely valuable for adaptation to spaceborne applications. Consequently, the InfraCAM has been selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) space borne experiment. In this experiment, a solid fuel is ignited in a low gravity environment. The combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground based testing in preparation for space flight in the fall of 1995.

  20. Control and protection of outdoor embedded camera for astronomy

    NASA Astrophysics Data System (ADS)

    Rigaud, F.; Jegouzo, I.; Gaudemard, J.; Vaubaillon, J.

    2012-09-01

    The purpose of the CABERNET-PODET-MET (CAmera BEtter Resolution NETwork, Pole sur la Dynamique de l'Environnement Terrestre - Meteor) project is the automated observation, by triangulation with three cameras, of meteor showers, in order to calculate meteoroid trajectories and velocities. The scientific goal is to search for the parent body, comet or asteroid, of each observed meteor. Installing outdoor cameras to perform astronomical measurements for several years with high reliability requires a very specific design for the camera box. This contribution shows how we fulfilled the various functions of these boxes, such as cooling of the CCD, heating to melt snow and ice, and protection against moisture, lightning and sunlight. We present the principal and secondary functions, the product breakdown structure, the grid of criteria used to evaluate the technical solutions, the adopted technology products, and their implementation in multifunction subsets for miniaturization purposes. In managing this project, we aimed to minimize the manpower and development time for every part. In the appendix, we present measurements of the image quality evolution during CCD cooling, and some pictures of the prototype.

  1. A little-known 3-lens Catadioptric Camera by Bernhard Schmidt

    NASA Astrophysics Data System (ADS)

    Busch, Wolfgang; Ceragioli, Roger C.; Stephani, Walter

    2013-07-01

    The authors investigate a prototype 3-lens f/1 catadioptric camera, built in 1934 by the famous optician Bernhard Schmidt at the Hamburg-Bergedorf Observatory in Germany, where Schmidt worked before his death in 1935. The prototype is in the observatory's collection of Schmidt artifacts, but its nature was not understood before the authors' recent examination. It is an astronomical camera of a form known as 'Buchroeder-Houghton', consisting of a spherical mirror and a 3-element afocal corrector lens placed at the mirror's center of curvature. The design is named for R.A. Buchroeder and J.L. Houghton who independently published this and related forms of wide-field spherical-lens cameras after 1942. Schmidt died before he could publish his own design. The authors disassembled the prototype and measured its optical parameters. These they present together with a transmission test of the corrector lens. The authors also consider the theoretical performance of the design as built, the theory of Houghton cameras, Schmidt's possible path to his invention, and the place of the prototype in his scientific output.

  2. High Energy Replicated Optics to Explore the Sun: Hard X-Ray Balloon-Borne Telescope

    NASA Technical Reports Server (NTRS)

    Gaskin, Jessica; Apple, Jeff; StevensonChavis, Katherine; Dietz, Kurt; Holt, Marlon; Koehler, Heather; Lis, Tomasz; O'Connor, Brian; RodriquezOtero, Miguel; Pryor, Jonathan

    2013-01-01

    Set to fly in the Fall of 2013 from Ft. Sumner, NM, the High Energy Replicated Optics to Explore the Sun (HEROES) mission is a collaborative effort between the NASA Marshall Space Flight Center and the Goddard Space Flight Center to upgrade an existing payload, the High Energy Replicated Optics (HERO) balloon-borne telescope, to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES science payload consists of 8 mirror modules housing a total of 109 grazing-incidence optics. These modules are mounted on a carbon-fiber and aluminum optical bench 6 m from a matching array of high-pressure xenon gas scintillation proportional counters, which serve as the focal-plane detectors. The HERO gondola utilizes a differential GPS system (backed by a magnetometer) for coarse pointing in azimuth, while a shaft-angle encoder plus an inclinometer provide coarse elevation. The HEROES payload will incorporate a new solar aspect system to supplement the existing star camera for fine pointing during both day and night. A mechanical shutter will be added to the star camera to protect it during solar observations. HEROES will also implement two novel alignment monitoring systems that will measure the alignment between the optical bench and the star camera and between the optics and detectors for improved pointing and post-flight data reconstruction. The overall payload will also be discussed. This mission is funded by the NASA HOPE (Hands On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.

  3. High Energy Replicated Optics to Explore the Sun: Hard X-ray balloon-borne telescope

    NASA Astrophysics Data System (ADS)

    Gaskin, J.; Apple, J.; Chavis, K. S.; Dietz, K.; Holt, M.; Koehler, H.; Lis, T.; O'Connor, B.; Otero, M. R.; Pryor, J.; Ramsey, B.; Rinehart-Dawson, M.; Smith, L.; Sobey, A.; Wilson-Hodge, C.; Christe, S.; Cramer, A.; Edgerton, M.; Rodriguez, M.; Shih, A.; Gregory, D.; Jasper, J.; Bohon, S.

    Set to fly in the Fall of 2013 from Ft. Sumner, NM, the High Energy Replicated Optics to Explore the Sun (HEROES) mission is a collaborative effort between the NASA Marshall Space Flight Center and the Goddard Space Flight Center to upgrade an existing payload, the High Energy Replicated Optics (HERO) balloon-borne telescope, to make unique scientific measurements of the Sun and astrophysical targets during the same flight. The HEROES science payload consists of 8 mirror modules housing a total of 109 grazing-incidence optics. These modules are mounted on a carbon-fiber and aluminum optical bench 6 m from a matching array of high-pressure xenon gas scintillation proportional counters, which serve as the focal-plane detectors. The HERO gondola utilizes a differential GPS system (backed by a magnetometer) for coarse pointing in azimuth, while a shaft-angle encoder plus an inclinometer provide coarse elevation. The HEROES payload will incorporate a new solar aspect system to supplement the existing star camera for fine pointing during both day and night. A mechanical shutter will be added to the star camera to protect it during solar observations. HEROES will also implement two novel alignment monitoring systems that will measure the alignment between the optical bench and the star camera and between the optics and detectors for improved pointing and post-flight data reconstruction. The overall payload will also be discussed. This mission is funded by the NASA HOPE (Hands On Project Experience) Training Opportunity awarded by the NASA Academy of Program/Project and Engineering Leadership, in partnership with NASA's Science Mission Directorate, Office of the Chief Engineer and Office of the Chief Technologist.

  4. The research of adaptive-exposure on spot-detecting camera in ATP system

    NASA Astrophysics Data System (ADS)

    Qian, Feng; Jia, Jian-jun; Zhang, Liang; Wang, Jian-Yu

    2013-08-01

    High-precision acquisition, tracking, and pointing (ATP) is one of the key techniques of laser communication. The spot-detecting camera is used to detect the direction of the beacon in the laser communication link, so that the ATP system can obtain the position information of the communication terminal. The positioning accuracy of the camera directly determines the capability of the laser communication system, so the spot-detecting camera in a satellite-to-earth laser communication ATP system needs high precision in target detection: the positioning accuracy should be better than ±1 μrad. Spot-detecting cameras usually adopt a centroid algorithm to obtain the position of the light spot on the detector. When the intensity of the beacon is moderate, the centroid calculation is precise. But the intensity of the beacon changes greatly during communication, owing to distance, atmospheric scintillation, weather, etc. The output signal of the detector will be insufficient when the camera underexposes the beacon because of low light intensity; conversely, the output signal will saturate when the camera overexposes the beacon because of high light intensity. The accuracy of the centroid algorithm becomes worse if the spot-detecting camera underexposes or overexposes, and the positioning accuracy of the camera is then reduced markedly. To maintain accuracy, space-based cameras should regulate exposure time in real time according to the light intensity. The adaptive-exposure algorithm for a spot-detecting camera based on a complementary metal-oxide-semiconductor (CMOS) detector is analyzed. Based on the analytic results, a CMOS camera in a space-based laser communication system is described, which uses the adaptive-exposure algorithm to adjust exposure time. Test results from an imaging experiment system verify the design and prove that it can restrain the reduction of positioning accuracy caused by changing light intensity, so the camera maintains stable, high positioning accuracy during communication.
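
    The two ingredients discussed above, an intensity-weighted centroid for spot positioning and an exposure update that keeps the peak signal in the detector's linear range, can be sketched as follows. The threshold, full-well value, and target level are assumptions, not the paper's parameters.

        import numpy as np

        def spot_centroid(img, threshold=0.1):
            """Intensity-weighted centroid of pixels above a noise threshold."""
            w = np.where(img > threshold * img.max(), img, 0.0)
            ys, xs = np.indices(img.shape)
            return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()

        def update_exposure(t_exp, peak, full_well=4095, target=0.6):
            """Scale exposure time so the spot peak sits near `target` of full
            scale, avoiding both under-exposure and saturation."""
            return t_exp * (target * full_well) / max(peak, 1.0)

        # Mock 12-bit frame containing a Gaussian beacon spot at (40.2, 21.7):
        ys, xs = np.indices((64, 64))
        img = 3000.0 * np.exp(-((xs - 40.2) ** 2 + (ys - 21.7) ** 2) / 18.0)
        print(spot_centroid(img), update_exposure(10.0, img.max()))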

  5. Electrostatic camera system functional design study

    NASA Technical Reports Server (NTRS)

    Botticelli, R. A.; Cook, F. J.; Moore, R. F.

    1972-01-01

    A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

  6. An open-source, FireWire camera-based, Labview-controlled image acquisition system for automated, dynamic pupillometry and blink detection.

    PubMed

    de Souza, John Kennedy Schettino; Pinto, Marcos Antonio da Silva; Vieira, Pedro Gabrielle; Baron, Jerome; Tierra-Criollo, Carlos Julio

    2013-12-01

    The dynamic, accurate measurement of pupil size is extremely valuable for studying a large number of neuronal functions and dysfunctions. Despite tremendous and well-documented progress in image processing techniques for estimating pupil parameters, comparatively little work has been reported on the practical hardware issues involved in designing image acquisition systems for pupil analysis. Here, we describe and validate the basic features of such a system, which is based on a relatively compact, off-the-shelf, low-cost FireWire digital camera. We successfully implemented two configurable video recording modes: a continuous mode and an event-triggered mode. The interoperability of the whole system is guaranteed by a set of modular software components hosted on a personal computer and written in Labview. An offline analysis suite of image processing algorithms for automatically estimating pupillary and eyelid parameters was assessed using data obtained in human subjects. Our benchmark results show that such measurements can be done in a temporally precise way at a sampling frequency of up to 120 Hz and with an estimated maximum spatial resolution of 0.03 mm. Our software is made available free of charge to the scientific community, allowing end users to either use the software as is or modify it to suit their own needs. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
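
    The offline pupil-estimation step can be illustrated with a standard threshold-and-ellipse-fit approach, assuming OpenCV; the dark threshold and minimum area are assumed tuning parameters, and a failed fit can double as a crude blink indicator.

        import cv2
        import numpy as np

        def estimate_pupil(gray, dark_thresh=40, min_area=100):
            """Threshold the dark pupil, fit an ellipse to the largest contour."""
            _, mask = cv2.threshold(gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            pupil = max(contours, key=cv2.contourArea, default=None)
            if pupil is None or cv2.contourArea(pupil) < min_area or len(pupil) < 5:
                return None   # no plausible pupil -- e.g. during a blink
            (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(pupil)
            return {"center": (cx, cy), "diameter_px": (ax1 + ax2) / 2.0}

        # Synthetic eye frame: bright background with a dark circular pupil.
        frame = np.full((240, 320), 200, np.uint8)
        cv2.circle(frame, (160, 120), 25, 10, -1)
        print(estimate_pupil(frame))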

  7. Space telescope optical telescope assembly/scientific instruments. Phase B: Preliminary design and program definition study. Volume 2A. focal plane camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Trade studies were conducted to ensure the overall feasibility of the focal plane camera in a radial module. The primary variable in the trade studies was the location of the pickoff mirror, on-axis versus off-axis. The two alternatives were: (1) the standard (electromagnetic-focus) SECO submodule, and (2) the MOD 15 permanent-magnet-focus SECO submodule. The technical areas of concern were the packaging-affected parameters of thermal dissipation, focal plane obscuration, and image quality.

  8. ARTIST CONCEPT - ASTRONAUT WORDEN'S EXTRAVEHICULAR ACTIVITY (EVA) (APOLLO XV)

    NASA Image and Video Library

    1971-07-09

    S71-39614 (July 1971) --- An artist's concept of the Apollo 15 Command and Service Modules (CSM), showing two crewmembers performing a new-to-Apollo extravehicular activity (EVA). The figure at left represents astronaut Alfred M. Worden, command module pilot, connected by an umbilical tether to the CM, at right, where a figure representing astronaut James B. Irwin, lunar module pilot, stands at the open CM hatch. Worden is working with the panoramic camera in the Scientific Instrument Module (SIM). Behind Irwin is the 16mm data acquisition camera. Artwork by North American Rockwell.

  9. The Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Mather, John C.; Seery, Bernard (Technical Monitor)

    2001-01-01

    The Next Generation Space Telescope (NGST) is a 6-7 m class radiatively cooled telescope, planned for launch to the Lagrange point L2 in 2009 and to be built by a partnership of NASA, ESA, and CSA. The NGST science program calls for three core instruments: 1) a near-IR camera, 0.6-5 micrometers; 2) a near-IR multiobject spectrometer, 1-5 micrometers; and 3) a mid-IR camera and spectrometer, 5-28 micrometers. I will report on the scientific goals, the project status, and the recent reduction in aperture from the target of 8 m.

  10. Some Preliminary Scientific Results of Chang'E-3 Mission

    NASA Astrophysics Data System (ADS)

    Zou, Y.; Li, W.; Zheng, Y.; Li, H.

    2015-12-01

    The Chang'E-3 mission is the main task of Phase Two of the China Lunar Exploration Program (CLEP), and it is also China's first probe to land, operate, and rove on the Moon. The Chang'E-3 craft is composed of a lander and a rover, each carrying four scientific payloads. The landing site is located at 44.12 degrees north latitude and 19.51 degrees west longitude, in the northern part of Mare Imbrium; about 400 km to its west lies the landing site of the former Soviet probe Luna-17, and about 780 km to its southeast lies the Apollo-17 landing site. Unfortunately, after a series of scientific tests and surface explorations, the rover's motor controller communication suffered a breakdown on January 16, 2014, after which the four payloads on board the rover could no longer return data. Nevertheless, interesting scientific data were received and have been studied by Chinese scientists. During the landing, the landing camera acquired a total of 4,673 images with resolutions ranging from meters down to millimeters, and the lander and rover photographed each other from different points with the topography camera and the panoramic camera. Image data from the Lunar-based Ultraviolet Telescope (LUT) reveal characteristic changes in celestial brightness with time and provide an unprecedented constraint on the water content of the sunlit lunar exosphere. Observations by the Extreme Ultraviolet Camera (EUVC) show a transient weakened region of the Earth's plasmasphere; the event lasted about three hours and is thought to be related to changes in particle density in the mid-latitude ionosphere. Preliminary spectral and mineralogical results for the landing site have been derived from the data of the Visible and Near-infrared Imaging Spectrometer (VNIS). Seven major elements, including Mg, Al, Si, K, Ca, Ti, and Fe, have been identified by the Active Particle-induced X-ray Spectrometer (APXS). Observations by the Lunar Penetrating Radar (LPR) have revealed the configuration of the regolith, whose thickness varies from about 4 m to 6 m, and one layer of lunar rock was detected at a depth of about 330 m, which may have accumulated during a depositional hiatus of the mare basalts.

  11. Report on the eROSITA camera system

    NASA Astrophysics Data System (ADS)

    Meidinger, Norbert; Andritschke, Robert; Bornemann, Walter; Coutinho, Diogo; Emberger, Valentin; Hälker, Olaf; Kink, Walter; Mican, Benjamin; Müller, Siegfried; Pietschner, Daniel; Predehl, Peter; Reiffers, Jonas

    2014-07-01

    The eROSITA space telescope is currently being developed for the determination of cosmological parameters and the equation of state of dark energy via the evolution of clusters of galaxies. Furthermore, the instrument development was strongly motivated by the intention of performing the first imaging X-ray all-sky survey enabling measurements above 2 keV. eROSITA is a scientific payload on the Russian research satellite SRG. Its destination after launch is the Lagrangian point L2. The observational program of the observatory is divided into an all-sky survey and pointed observations and will take about 7.5 years in total. The instrument comprises an array of 7 identical, parallel-aligned telescopes. Each of the seven focal plane cameras is equipped with a PNCCD detector, an enhanced type of the XMM-Newton focal plane detector. This instrumentation permits spectroscopy and imaging of X-rays in the energy band from 0.3 keV to 10 keV with a field of view of 1.0 degree. The camera development is done at the Max Planck Institute for Extraterrestrial Physics. The key component of each camera is the PNCCD chip. This silicon sensor is a back-illuminated, fully depleted, column-parallel type of charge-coupled device. The image area of the 450 micron thick frame-transfer CCD comprises an array of 384 x 384 pixels, each with a size of 75 micron x 75 micron. Readout of the signal charge that is generated by an incident X-ray photon in the CCD is accomplished by an ASIC, the so-called eROSITA CAMEX. It provides 128 parallel analog signal processing channels but finally multiplexes the signals to one output, which feeds the detector signals to a fast 14-bit ADC. The read noise of this system is equivalent to a noise charge of about 2.5 electrons rms. We achieve an energy resolution close to the theoretical limit given by Fano noise (except for very low energies); for example, the FWHM at an energy of 5.9 keV is approximately 140 eV. The complete camera assembly comprises the camera head with the detector as the key component, the electronics for detector operation and data acquisition, and the filter wheel unit. In addition to the on-chip light-blocking filter directly deposited on the photon entrance window of the PNCCD, an external filter can be moved in front of the sensor, which also serves as contamination protection. Furthermore, an on-board calibration source emitting several fluorescence lines is accommodated on the filter wheel mechanism for the purpose of in-orbit calibration. Since the spectroscopic silicon sensors need to be cooled to -95°C to best mitigate radiation damage effects, an elaborate cooling system is necessary. It consists of two different types of heat pipes linking the seven detectors to two radiators. Based on tests with an engineering model, a flight design was developed for the camera and a qualification model has been built. The tests and the performance of this camera are presented in the following. In conclusion, an outlook on the flight cameras is given.
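    As a back-of-the-envelope check on the quoted figures, the Fano-limited resolution can be computed from the standard silicon values w ≈ 3.65 eV per electron-hole pair and Fano factor F ≈ 0.115 (both assumed here; neither is stated in the record), together with the quoted read noise of 2.5 electrons rms:

      \[
      \mathrm{FWHM}(E) \approx 2.355\, w \sqrt{\frac{F E}{w} + \sigma_r^2}
        = 2.355 \times 3.65\ \mathrm{eV} \times \sqrt{\frac{0.115 \times 5900}{3.65} + 2.5^2}
        \approx 119\ \mathrm{eV},
      \]

    so the measured FWHM of approximately 140 eV at 5.9 keV is indeed close to, though slightly above, the Fano limit.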

  12. Geocam Space: Enhancing Handheld Digital Camera Imagery from the International Space Station for Research and Applications

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.; Lee, Yeon Jin; Dille, Michael

    2016-01-01

    Handheld astronaut photography of the Earth has been collected from the International Space Station (ISS) since 2000, making it the most temporally extensive remotely sensed dataset from this unique Low Earth orbital platform. Exclusive use of digital handheld cameras to perform Earth observations from the ISS began in 2004. Nadir viewing imagery is constrained by the inclined equatorial orbit of the ISS to between 51.6 degrees North and South latitude; however, numerous oblique images of land surfaces above these latitudes are included in the dataset. While unmodified commercial off-the-shelf digital cameras provide only visible-wavelength, three-band spectral information of limited quality, current cameras used with long (400+ mm) lenses can obtain high-quality spatial information approaching 2 meters/ground pixel resolution. The dataset is freely available online at the Gateway to Astronaut Photography of Earth site (http://eol.jsc.nasa.gov) and now comprises over 2 million images. Despite this extensive image catalog, use of the data for scientific research, disaster response, commercial applications, and visualizations is minimal in comparison to other data collected from free-flying satellite platforms such as Landsat, Worldview, etc. This is due primarily to the lack of fully georeferenced data products: while current digital cameras typically have integrated GPS, this does not function in the Low Earth Orbit environment. The Earth Science and Remote Sensing (ESRS) Unit at NASA Johnson Space Center provides training in Earth Science topics to ISS crews, performs daily operations and Earth observation target delivery to crews through the Crew Earth Observations (CEO) Facility on board the ISS, and also catalogs digital handheld imagery acquired from orbit by manually adding descriptive metadata and determining an image geographic centerpoint using visual feature matching with other georeferenced data, e.g. Landsat, Google Earth, etc. The lack of full geolocation information native to the data makes it difficult to integrate astronaut photographs with other georeferenced data to facilitate quantitative analysis such as urban land cover/land use classification, change detection, or geologic mapping. The manual determination of image centerpoints is both time- and labor-intensive, leading to delays in releasing geolocated and cataloged data to the public, for example for the timely use of data in disaster response. The GeoCam Space project was funded by the ISS Program in 2015 to develop an on-orbit hardware and ground-based software system for increasing the efficiency of geolocating astronaut photographs from the ISS (Fig. 1). The Intelligent Robotics Group at NASA Ames Research Center leads the development of both the ground and on-orbit systems in collaboration with the ESRS Unit. The hardware component consists of modified smartphone elements, including cameras, a central processing unit, wireless Ethernet, and an inertial measurement unit (gyroscopes/accelerometers/magnetometers), reconfigured into a compact unit that attaches to the base of the current Nikon D4 camera (and its replacement, the Nikon D5) and connects using the standard Nikon peripheral connector or USB port. This provides secondary, side- and downward-facing cameras perpendicular to the primary camera pointing direction. The secondary cameras observe calibration targets with known internal X, Y, and Z positions affixed to the interior of the ISS to determine the camera pose corresponding to each image frame. This information is recorded by the GeoCam Space unit and indexed for correlation to the camera time recorded for each image frame. Data (image, EXIF header, and camera pose information) are transmitted to the ground software system (GeoRef) using the established Ku-band USOS downlink system. Following integration on the ground, the camera pose information provides an initial geolocation estimate for each individual image frame. This new capability represents a significant advance in geolocation over the manual feature-matching approach for both nadir and off-nadir viewing imagery. With the initial geolocation estimate, full georeferencing of an image is completed using the rapid tie-pointing interface in GeoRef, and the resulting data are added to the Gateway to Astronaut Photography of Earth online database in both GeoTIFF and Keyhole Markup Language (KML) formats. The integration of the GeoRef software component of GeoCam Space into the CEO image cataloging workflow is complete, and disaster response imagery acquired by the ISS crew is now fully georeferenced as a standard data product. The on-orbit hardware component (GeoSens) is in its final prototyping phase and is on schedule for launch to the ISS in late 2016. Installation and routine use of the GeoCam Space system for handheld digital camera photography from the ISS is expected to significantly improve the usefulness of this unique dataset for a variety of public- and private-sector applications.
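    The time-correlation step described above can be sketched as a nearest-timestamp lookup between an image's camera time and the recorded pose samples. The data structures and tolerance below are hypothetical; the actual GeoCam Space indexing scheme is not documented in this record.

      import bisect

      def nearest_pose(pose_times, poses, t_image, tol=0.5):
          """Return the pose sample closest in time to t_image, or None if no
          sample lies within tol seconds. pose_times must be sorted ascending."""
          i = bisect.bisect_left(pose_times, t_image)
          candidates = [j for j in (i - 1, i) if 0 <= j < len(pose_times)]
          if not candidates:
              return None
          best = min(candidates, key=lambda j: abs(pose_times[j] - t_image))
          return poses[best] if abs(pose_times[best] - t_image) <= tol else None

      # usage with illustrative IMU pose samples (e.g. attitude quaternions):
      pose_times = [0.0, 0.1, 0.2, 0.3]
      poses = ["q0", "q1", "q2", "q3"]
      print(nearest_pose(pose_times, poses, 0.17))   # -> "q2"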

  13. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

    The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral, ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised in the presence of curved CCD applications, in conjunction with large format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than their flat CCD counterparts, lightening the cameras and enabling faster scanning and 3D integration of objects moving within a planetary terrain environment. Preliminary experiments conducted at the Sarnoff Corporation indicate the feasibility of curved CCD imagers with acceptable electro-optic integrity. Currently, we are in the process of evaluating the electro-optic performance of a curved wafer-scale CCD imager. Detailed ray-trace modeling and experimental electro-optical performance data obtained from the curved imager will be presented at the conference.

  14. Blue Beaufort Sea Ice from Operation IceBridge

    NASA Image and Video Library

    2017-12-08

    Mosaic image of sea ice in the Beaufort Sea created by the Digital Mapping System (DMS) instrument aboard the IceBridge P-3B. The dark area in the middle of the image is open water seen through a lead, or opening, in the ice. Light blue areas are thick sea ice and dark blue areas are thinner ice formed as water in the lead refreezes. Leads are formed when cracks develop in sea ice as it moves in response to wind and ocean currents. DMS uses a modified digital SLR camera that points down through a window in the underside of the plane, capturing roughly one frame per second. These images are then combined into an image mosaic using specialized computer software. Credit: NASA/DMS

  15. Pattern-Recognition System for Approaching a Known Target

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Cheng, Yang

    2008-01-01

    A closed-loop pattern-recognition system is designed to provide guidance for maneuvering a small exploratory robotic vehicle (rover) on Mars to return to a landed spacecraft to deliver soil and rock samples that the spacecraft would subsequently bring back to Earth. The system could be adapted to terrestrial use in guiding mobile robots to approach known structures that humans could not approach safely, for such purposes as reconnaissance in military or law-enforcement applications, terrestrial scientific exploration, and removal of explosive or other hazardous items. The system has been demonstrated in experiments in which the Field Integrated Design and Operations (FIDO) rover (a prototype Mars rover equipped with a video camera for guidance) is made to return to a mockup of a Mars-lander spacecraft. The FIDO rover camera autonomously acquires an image of the lander from a distance of 125 m in an outdoor environment. Then, under guidance by an algorithm that performs fusion of multiple line and texture features in digitized images acquired by the camera, the rover traverses the intervening terrain, using features derived from images of the lander truss structure. Finally, by use of precise pattern matching for determining the position and orientation of the rover relative to the lander, the rover aligns itself with the bottom of ramps extending from the lander, in preparation for climbing the ramps to deliver samples to the lander. The most innovative aspect of the system is a set of pattern-recognition algorithms that govern a three-phase visual-guidance sequence for approaching the lander. During the first phase, a multifeature fusion algorithm integrates the outputs of a horizontal-line-detection algorithm and a wavelet-transform-based visual-area-of-interest algorithm for detecting the lander from a significant distance. The horizontal-line-detection algorithm determines candidate lander locations based on detection of the horizontal deck that is part of the lander.
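    The multifeature fusion idea can be illustrated with a toy example: normalize each detector's response map and combine them with weights, so that locations supported by both the horizontal-line and area-of-interest cues dominate. This is only a schematic sketch with assumed weights, not the JPL algorithm itself.

      import numpy as np

      def fuse(feature_maps, weights):
          """Weighted per-pixel fusion of normalized feature-detector responses."""
          fused = np.zeros_like(feature_maps[0], dtype=float)
          for fmap, w in zip(feature_maps, weights):
              rng = float(fmap.max() - fmap.min())
              fused += w * ((fmap - fmap.min()) / rng if rng > 0 else fmap * 0.0)
          return fused

      line_resp = np.random.rand(240, 320)   # stand-in for the horizontal-line detector
      aoi_resp = np.random.rand(240, 320)    # stand-in for the wavelet area-of-interest map
      score = fuse([line_resp, aoi_resp], [0.5, 0.5])
      y, x = np.unravel_index(np.argmax(score), score.shape)
      print("candidate lander location (pixel):", x, y)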

  16. 'EPIC' View of Africa and Europe from a Million Miles Away

    NASA Image and Video Library

    2015-07-29

    Africa is front and center in this image of Earth taken by a NASA camera on the Deep Space Climate Observatory (DSCOVR) satellite. The image, taken July 6 from a vantage point one million miles from Earth, was one of the first taken by NASA’s Earth Polychromatic Imaging Camera (EPIC). Central Europe is toward the top of the image with the Sahara Desert to the south, showing the Nile River flowing to the Mediterranean Sea through Egypt. The photographic-quality color image was generated by combining three separate images of the entire Earth taken a few minutes apart. The camera takes a series of 10 images using different narrowband filters -- from ultraviolet to near infrared -- to produce a variety of science products. The red, green and blue channel images are used in these Earth images. The DSCOVR mission is a partnership between NASA, the National Oceanic and Atmospheric Administration (NOAA) and the U.S. Air Force, with the primary objective to maintain the nation’s real-time solar wind monitoring capabilities, which are critical to the accuracy and lead time of space weather alerts and forecasts from NOAA. DSCOVR was launched in February to its planned orbit at the first Lagrange point or L1, about one million miles from Earth toward the sun. It’s from that unique vantage point that the EPIC instrument is acquiring images of the entire sunlit face of Earth. Data from EPIC will be used to measure ozone and aerosol levels in Earth’s atmosphere, cloud height, vegetation properties and a variety of other features. Image Credit: NASA

  17. Hubble Sees Pinwheel of Star Birth

    NASA Image and Video Library

    2017-12-08

    NASA image release October 19, 2010 Though the universe is chock full of spiral-shaped galaxies, no two look exactly the same. This face-on spiral galaxy, called NGC 3982, is striking for its rich tapestry of star birth, along with its winding arms. The arms are lined with pink star-forming regions of glowing hydrogen, newborn blue star clusters, and obscuring dust lanes that provide the raw material for future generations of stars. The bright nucleus is home to an older population of stars, which grow ever more densely packed toward the center. NGC 3982 is located about 68 million light-years away in the constellation Ursa Major. The galaxy spans about 30,000 light-years, one-third of the size of our Milky Way galaxy. This color image is composed of exposures taken by the Hubble Space Telescope's Wide Field Planetary Camera 2 (WFPC2), the Advanced Camera for Surveys (ACS), and the Wide Field Camera 3 (WFC3). The observations were taken between March 2000 and August 2009. The rich color range comes from the fact that the galaxy was photographed in visible and near-infrared light. Also used was a filter that isolates hydrogen emission emanating from bright star-forming regions dotting the spiral arms. The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI) conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc. in Washington, D.C. Credit: NASA, ESA, and the Hubble Heritage Team (STScI/AURA) Acknowledgment: A. Riess (STScI)

  18. Amateur and Professional Astronomers Team Up to Create a Cosmological Masterpiece

    NASA Image and Video Library

    2017-12-08

    To view a video of this story go to: www.flickr.com/photos/gsfc/8448332724 Working with astronomical image processors at the Space Telescope Science Institute in Baltimore, Md., renowned astro-photographer Robert Gendler has taken science data from the Hubble Space Telescope (HST) archive and combined it with his own ground-based observations to assemble a photo illustration of the magnificent spiral galaxy M106. Gendler retrieved archival Hubble images of M106 to assemble a mosaic of the center of the galaxy. He then used his own and fellow astro-photographer Jay GaBany's observations of M106 to combine with the Hubble data in areas where there was less coverage, and finally, to fill in the holes and gaps where no Hubble data existed. The center of the galaxy is composed almost entirely of HST data taken by the Advanced Camera for Surveys, Wide Field Camera 3, and Wide Field Planetary Camera 2 detectors. The outer spiral arms are predominantly HST data colorized with ground-based data taken by Gendler's and GaBany's 12.5-inch and 20-inch telescopes, located at very dark remote sites in New Mexico. The image also reveals the optical component of the "anomalous arms" of M106, seen here as red, glowing hydrogen emission. To read more go to: www.nasa.gov/mission_pages/hubble/science/m106.html Credit: NASA, ESA, the Hubble Heritage Team (STScI/AURA), R. Gendler (for the Hubble Heritage Team), and G. Bacon (STScI)

  19. A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.

    2009-01-01

    The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black-and-white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured for mounting on multiple vehicles, and they act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black-and-white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black-and-white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.

  20. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Astrophysics Data System (ADS)

    Bolas, Mark T.; Fisher, Scott S.

    1990-09-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.

  1. An integrated port camera and display system for laparoscopy.

    PubMed

    Terry, Benjamin S; Ruppert, Austin D; Steinhaus, Kristen R; Schoen, Jonathan A; Rentschler, Mark E

    2010-05-01

    In this paper, we built and tested the port camera, a novel, inexpensive, portable, and battery-powered laparoscopic tool that integrates the components of a vision system with a cannula port. This new device 1) minimizes the invasiveness of laparoscopic surgery by combining a camera port and tool port; 2) reduces the cost of laparoscopic vision systems by integrating an inexpensive CMOS sensor and LED light source; and 3) enhances laparoscopic surgical procedures by mechanically coupling the camera, tool port, and liquid crystal display (LCD) screen to provide an on-patient visual display. The port camera video system was compared to two laparoscopic video systems: a standard resolution unit from Karl Storz (model 22220130) and a high definition unit from Stryker (model 1188HD). Brightness, contrast, hue, colorfulness, and sharpness were compared. The port camera video is superior to the Storz scope and approximately equivalent to the Stryker scope. An ex vivo study was conducted to measure the operative performance of the port camera. The results suggest that simulated tissue identification and biopsy acquisition with the port camera is as efficient as with a traditional laparoscopic system. The port camera was successfully used by a laparoscopic surgeon for exploratory surgery and liver biopsy during a porcine surgery, demonstrating initial surgical feasibility.

  2. Tracing the growth of Milky Way-like galaxies

    NASA Image and Video Library

    2013-11-15

    This composite image shows examples of galaxies similar to our Milky Way at various stages of construction over a time span of 11 billion years. The galaxies are arranged according to time. Those on the left reside nearby; those at far right existed when the cosmos was about 2 billion years old. The bluish glow from young stars dominates the color of the galaxies on the right. The galaxies at left are redder from the glow of older stellar populations. Astronomers found the distant galaxies in two Hubble Space Telescope surveys: 3D-HST and the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey, or CANDELS. The observations were made in visible and near-infrared light by Hubble's Wide Field Camera 3 and Advanced Camera for Surveys. The nearby galaxies were taken from the Sloan Digital Sky Survey. This image traces Milky Way-like galaxies over most of cosmic history, revealing how they evolve over time. Hubble's sharp vision resolved the galaxies' shapes, showing that their bulges and disks grew simultaneously. Credit: NASA, ESA, P. van Dokkum (Yale University), S. Patel (Leiden University), and the 3D-HST Team

  3. Hubble Spies a Loopy Galaxy

    NASA Image and Video Library

    2015-02-02

    This NASA Hubble Space Telescope photo of NGC 7714 presents an especially striking view of the galaxy's smoke-ring-like structure. The golden loop is made of sun-like stars that have been pulled deep into space, far from the galaxy's center. The galaxy is located approximately 100 million light-years from Earth in the direction of the constellation Pisces. The universe is full of such galaxies that are gravitationally stretched and pulled and otherwise distorted in gravitational tug-o'-wars with bypassing galaxies. The companion galaxy doing the "taffy pulling" in this case, NGC 7715, lies just out of the field of view in this image. A very faint bridge of stars extends to the unseen companion. The close encounter has compressed interstellar gas to trigger bursts of star formation seen in bright blue arcs extending around NGC 7714's center. The gravitational disruption of NGC 7714 began between 100 million and 200 million years ago, at the epoch when dinosaurs ruled the Earth. The image was taken with the Wide Field Camera 3 and the Advanced Camera for Surveys in October 2011. Credit: NASA and ESA. Acknowledgment: A. Gal-Yam (Weizmann Institute of Science)

  4. Optical Constituents Along a River Mouth and Inlet: Variability and Signature in Remotely Sensed Reflectance, and: Optical Constituents at the Mouth of the Columbia River: Variability and Signature in Remotely Sensed Reflectance

    DTIC Science & Technology

    2013-09-30

    Vision Floc Camera (MVFC), a Sequoia Scientific LISST 100x Type B, an RBR CTD, and two pressure-actuated Niskin bottles. The Niskin bottles were...Eco bb2fl, that measures backscattering at 532 and 650 nm and CDOM fluorescence, a WetLabs WetStar CDOM fluorometer, a Sequoia Scientific flow

  5. A real-time camera calibration system based on OpenCV

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Wang, Hua; Guo, Huinan; Ren, Long; Zhou, Zuofeng

    2015-07-01

    Camera calibration is one of the essential steps in computer vision research. This paper describes a real-time camera calibration system based on OpenCV, developed and implemented in the VS2008 environment. Experimental results show that the system achieves simple and fast camera calibration with higher precision than MATLAB and without manual intervention, and that it can be widely used in various computer vision systems.
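    For reference, the core of a chessboard-based calibration with OpenCV's Python bindings looks roughly like the sketch below; the image file pattern and board geometry are assumptions, and the paper's own VS2008 implementation is not reproduced here.

      import glob
      import cv2
      import numpy as np

      BOARD = (9, 6)   # inner corners per chessboard row/column (assumed)
      objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
      objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)  # square size = 1 unit

      objpoints, imgpoints, size = [], [], None
      for path in glob.glob("calib_*.png"):          # hypothetical calibration images
          gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
          size = gray.shape[::-1]
          found, corners = cv2.findChessboardCorners(gray, BOARD, None)
          if found:
              objpoints.append(objp)
              imgpoints.append(corners)

      rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(objpoints, imgpoints, size, None, None)
      print("RMS reprojection error:", rms)   # quick quality check of the calibration
      print("intrinsic matrix:\n", K)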

  6. Validation of the Microsoft Kinect® camera system for measurement of lower extremity jump landing and squatting kinematics.

    PubMed

    Eltoukhy, Moataz; Kelly, Adam; Kim, Chang-Young; Jun, Hyung-Pil; Campbell, Richard; Kuenze, Christopher

    2016-01-01

    Cost-effective, quantifiable assessment of lower extremity movement represents a potential improvement over standard tools for the evaluation of injury risk. Ten healthy participants completed three trials of a drop jump, an overhead squat, and a single-leg squat task. Peak hip and knee kinematics were assessed using an 8-camera BTS Smart 7000DX motion analysis system and the Microsoft Kinect® camera system. The agreement and consistency between both uncorrected and corrected Kinect kinematic variables and the BTS camera system were assessed using intraclass correlation coefficients. Peak sagittal plane kinematics measured using the Microsoft Kinect® camera system explained a significant amount of variance [Range(hip) = 43.5-62.8%; Range(knee) = 67.5-89.6%] in peak kinematics measured using the BTS camera system. Across tasks, peak knee flexion angle and peak hip flexion were found to be consistent and in agreement when the Microsoft Kinect® camera system was directly compared to the BTS camera system, and these values were improved following application of a corrective factor. The Microsoft Kinect® may not be an appropriate surrogate for traditional motion analysis technology, but it may have potential applications as a real-time feedback tool in pathological or high injury risk populations.

  7. Evaluation of camera-based systems to reduce transit bus side collisions : phase II.

    DOT National Transportation Integrated Search

    2012-12-01

    The sideview camera system has been shown to eliminate blind zones by providing a view to the driver in real time. In order to provide the best integration of these systems, an integrated camera-mirror system (hybrid system) was developed and tes...

  8. Real time moving scene holographic camera system

    NASA Technical Reports Server (NTRS)

    Kurtz, R. L. (Inventor)

    1973-01-01

    A holographic motion picture camera system producing resolution of front surface detail is described. The system utilizes a beam of coherent light and means for dividing the beam into a reference beam for direct transmission to a conventional movie camera and two reflection signal beams for transmission to the movie camera by reflection from the front side of a moving scene. The system is arranged so that critical parts of the system are positioned on the foci of a pair of interrelated, mathematically derived ellipses. The camera has the theoretical capability of producing motion picture holograms of projectiles moving at speeds as high as 900,000 cm/sec (about 21,450 mph).

  9. Calibration Techniques for Accurate Measurements by Underwater Camera Systems

    PubMed Central

    Shortis, Mark

    2015-01-01

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems. PMID:26690172
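    One concrete first-order consequence of the refraction discussed in the review: behind a flat port, rays crossing the water-glass-air interfaces obey Snell's law, so a camera focused through a flat port behaves approximately as if its focal length were stretched by the refractive index of water (a standard small-angle approximation, not a full housing model):

      \[
      n_{\mathrm{air}} \sin\theta_a = n_w \sin\theta_w
      \quad\Longrightarrow\quad
      f_{\mathrm{eff}} \approx n_w f \approx 1.33\, f ,
      \]

    which is why an uncorrected in-air calibration overestimates the underwater field of view by roughly a third.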

  10. A detailed comparison of single-camera light-field PIV and tomographic PIV

    NASA Astrophysics Data System (ADS)

    Shi, Shengxian; Ding, Junfei; Atkinson, Callum; Soria, Julio; New, T. H.

    2018-03-01

    This paper presents a comprehensive comparison between single-camera light-field particle image velocimetry (LF-PIV) and multi-camera tomographic particle image velocimetry (Tomo-PIV). Simulation studies were first performed using synthetic light-field and tomographic particle images, which extensively examine the differences between the two techniques by varying key parameters such as the pixel to microlens ratio (PMR), the light-field camera to Tomo-camera pixel ratio (LTPR), the particle seeding density, and the number of tomographic cameras. Simulation results indicate that single-camera LF-PIV can achieve accuracy consistent with that of multi-camera Tomo-PIV, but requires the use of an overall greater number of pixels. Experimental studies were then conducted by simultaneously measuring a low-speed jet flow with single-camera LF-PIV and four-camera Tomo-PIV systems. The experiments confirm that, given a sufficiently high pixel resolution, a single-camera LF-PIV system can indeed deliver volumetric velocity field measurements for an equivalent field of view with a spatial resolution commensurate with that of a multi-camera Tomo-PIV system, enabling accurate 3D measurements in applications where optical access is limited.

  11. CCD image sensor induced error in PIV applications

    NASA Astrophysics Data System (ADS)

    Legrand, M.; Nogueira, J.; Vargas, A. A.; Ventas, R.; Rodríguez-Hidalgo, M. C.

    2014-06-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (~0.1 pixels), which is the order of magnitude that other typical PIV errors, such as peak-locking, may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a model for the magnitude of the CCD readout bias error. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match experiments performed with two 12-bit interline CCD cameras (MegaPlus ES 4.0/E incorporating the 4-megapixel Kodak KAI-4000M CCD sensor). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, which can then be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described.
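    To put the quoted ~0.1 pixel figure in context, a position bias that differs by Δb pixels between the two exposures of a PIV pair propagates directly into the velocity estimate. With pixel pitch d_px, magnification M, and pulse separation Δt, a generic error propagation (not the paper's specific two-constant model) gives:

      \[
      u_{\mathrm{bias}} = \frac{\Delta b \, d_{\mathrm{px}}}{M \,\Delta t},
      \]

    so an uncorrected readout bias can rival peak-locking as an error source whenever the two exposures are unevenly illuminated.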

  12. Photogrammetry System and Method for Determining Relative Motion Between Two Bodies

    NASA Technical Reports Server (NTRS)

    Miller, Samuel A. (Inventor); Severance, Kurt (Inventor)

    2014-01-01

    A photogrammetry system and method provide for determining the relative position between two objects. The system utilizes one or more imaging devices, such as high speed cameras, that are mounted on a first body, and three or more photogrammetry targets of a known location on a second body. The system and method can be utilized with cameras having fish-eye, hyperbolic, omnidirectional, or other lenses. The system and method do not require overlapping fields-of-view if two or more cameras are utilized. The system and method derive relative orientation by equally weighting information from an arbitrary number of heterogeneous cameras, all with non-overlapping fields-of-view. Furthermore, the system can make the measurements with arbitrary wide-angle lenses on the cameras.

  13. Advanced imaging system

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document describes the Advanced Imaging System (AIS) CCD-based camera. The AIS1 camera system was developed at Photometrics Ltd. in Tucson, Arizona, as part of a Phase 2 SBIR contract, No. NAS5-30171, from the NASA/Goddard Space Flight Center in Greenbelt, Maryland. The camera project was undertaken as part of the Space Telescope Imaging Spectrograph (STIS) project. This document is intended to serve as a complete manual for the use and maintenance of the camera system. All the different parts of the camera hardware and software are discussed, and complete schematics and source code listings are provided.

  14. KSC-98pc1370

    NASA Image and Video Library

    1998-10-16

    KENNEDY SPACE CENTER, FLA. -- Attached to the second stage of a Boeing Delta II at Pad 17A, Cape Canaveral Air Station, is the Students for the Exploration and Development of Space Satellite-1 (SEDSat-1). An international project, SEDSat-1 is a secondary payload on the Deep Space 1 mission and will be deployed 88 minutes after launch over Hawaii. The satellite includes cameras for imaging Earth, a unique attitude determination system, and amateur radio communication capabilities. Deep Space 1, targeted for launch on Oct. 24, is the first flight in NASA's New Millennium Program and is designed to validate 12 new technologies for scientific space missions of the next century

  15. KSC-98pc1369

    NASA Image and Video Library

    1998-10-16

    KENNEDY SPACE CENTER, FLA. -- Attached to the second stage of a Boeing Delta II at Pad 17A, Cape Canaveral Air Station, is the Students for the Exploration and Development of Space Satellite-1 (SEDSat-1). An international project, SEDSat-1 is a secondary payload on the Deep Space 1 mission and will be deployed 88 minutes after launch over Hawaii. The satellite includes cameras for imaging Earth, a unique attitude determination system, and amateur radio communication capabilities. Deep Space 1, targeted for launch on Oct. 24, is the first flight in NASA's New Millennium Program and is designed to validate 12 new technologies for scientific space missions of the next century

  16. New Occultation Systems and the 2005 July 11 Charon Occultation

    NASA Astrophysics Data System (ADS)

    Young, L. A.; French, R. G.; Gregory, B.; Olkin, C. B.; Ruhland, C.; Shoemaker, K.; Young, E. F.

    2005-08-01

    Charon's density is an important input to models of its formation and internal structure. Estimates range from 1.59 to 1.83 g/cm3 (Olkin et al. 2003, Icarus 164, 254), with Charon's radius as the main source of uncertainty. Reported values of Charon's radius from mutual events range from 593±13 km (Buie et al. 1992, Icarus 97, 211) to 621±21 km (Young & Binzel 1994, Icarus 108), while an occultation observed from a single site gives a lower limit on the radius of 601.5 km (Walker 1980, MNRAS 192, 47; Elliot & Young 1991, Icarus 89, 244). On 2005 July 11 UT (following this abstract submission date), Charon is predicted to occult the star C313.2. If successful, this event will be the first Charon occultation observed since 1980, and the first giving multiple chords across Charon's disk. This event is expected to measure Charon's radius to 1 km. Our team is observing from three telescopes in Chile: the 4.0-m Blanco and the 0.9-m telescopes at Cerro Tololo, and the 4.2-m SOAR telescope at Cerro Pachon. At SOAR, we will be using the camera from our new PHOT (Portable High-speed Occultation Telescopes) systems. The PHOT camera is a Princeton Instruments MicroMAX:512BFT from Roper Scientific, a 512×512 frame-transfer CCD with a read noise of only 3 electrons at the 100 kHz digitization rate. The camera's exposures are triggered by a custom-built, compact, stand-alone GPS-based pulse-train generator. A PHOT camera and pulse-train generator were used to observe the occultation of 2MASS 1275723153 by Pluto on 2005 June 15 UT from Sommers-Bausch Observatory in Boulder, Colorado; preliminary analysis shows this was at best a grazing occultation from this site and a successful engineering run for the July 11 Charon occultation. This work was supported, in part, by NSF AST-0321338 (EFY) and NASA NNG-05GF05G (LAY).

  17. Low-cost laser speckle contrast imaging of blood flow using a webcam.

    PubMed

    Richards, Lisa M; Kazmi, S M Shams; Davis, Janel L; Olin, Katherine E; Dunn, Andrew K

    2013-01-01

    Laser speckle contrast imaging has become a widely used tool for dynamic imaging of blood flow, both in animal models and in the clinic. Typically, laser speckle contrast imaging is performed using scientific-grade instrumentation. However, due to recent advances in camera technology, these expensive components may not be necessary to produce accurate images. In this paper, we demonstrate that a consumer-grade webcam can be used to visualize changes in flow, both in a microfluidic flow phantom and in vivo in a mouse model. A two-camera setup was used to simultaneously image with a high performance monochrome CCD camera and the webcam for direct comparison. The webcam was also tested with inexpensive aspheric lenses and a laser pointer for a complete low-cost, compact setup ($90, 5.6 cm length, 25 g). The CCD and webcam showed excellent agreement with the two-camera setup, and the inexpensive setup was used to image dynamic blood flow changes before and after a targeted cerebral occlusion.
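    The quantity underlying laser speckle contrast imaging is the local contrast K = σ/⟨I⟩ computed over a small sliding window; lower K indicates more motion blurring of the speckle pattern and hence faster flow. A minimal sketch follows (the 7x7 window is a common choice, assumed here; the input frame is a stand-in).

      import numpy as np
      from scipy.ndimage import uniform_filter

      def speckle_contrast(img, win=7):
          """Local contrast K = std/mean over a win x win sliding window."""
          img = img.astype(float)
          mean = uniform_filter(img, win)
          mean_sq = uniform_filter(img * img, win)
          var = np.maximum(mean_sq - mean * mean, 0.0)   # clip round-off negatives
          return np.sqrt(var) / np.maximum(mean, 1e-12)

      raw = np.random.rand(480, 640)   # stand-in for a raw speckle frame
      K = speckle_contrast(raw)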

  18. Low-cost laser speckle contrast imaging of blood flow using a webcam

    PubMed Central

    Richards, Lisa M.; Kazmi, S. M. Shams; Davis, Janel L.; Olin, Katherine E.; Dunn, Andrew K.

    2013-01-01

    Laser speckle contrast imaging has become a widely used tool for dynamic imaging of blood flow, both in animal models and in the clinic. Typically, laser speckle contrast imaging is performed using scientific-grade instrumentation. However, due to recent advances in camera technology, these expensive components may not be necessary to produce accurate images. In this paper, we demonstrate that a consumer-grade webcam can be used to visualize changes in flow, both in a microfluidic flow phantom and in vivo in a mouse model. A two-camera setup was used to simultaneously image with a high performance monochrome CCD camera and the webcam for direct comparison. The webcam was also tested with inexpensive aspheric lenses and a laser pointer for a complete low-cost, compact setup ($90, 5.6 cm length, 25 g). The CCD and webcam showed excellent agreement with the two-camera setup, and the inexpensive setup was used to image dynamic blood flow changes before and after a targeted cerebral occlusion. PMID:24156082

  19. A versatile photogrammetric camera automatic calibration suite for multispectral fusion and optical helmet tracking

    NASA Astrophysics Data System (ADS)

    de Villiers, Jason; Jermy, Robert; Nicolls, Fred

    2014-06-01

    This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb line method, allows many radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low order models despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted to undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion) allowing deterministic rates far exceeding real time. The focal length is determined to minimise the error in absolute photogrammetric positional measurement for both multi camera systems or monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/ LWIR camera array, and a simple laboratory optical helmet tracker.
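    For reference, the Brown (Brown-Conrady) model that such plumb-line calibrations fit maps an undistorted normalized image point (x, y) to its distorted location as in the sketch below; three radial and two tangential coefficients are used here purely to stay compact, rather than the 5 radial and 3 tangential quoted above.

      def brown_distort(x, y, k, p):
          """Apply Brown-Conrady distortion to normalized image coordinates.
          k = (k1, k2, k3) radial and p = (p1, p2) tangential coefficients."""
          r2 = x * x + y * y
          radial = 1.0 + k[0] * r2 + k[1] * r2 ** 2 + k[2] * r2 ** 3
          xd = x * radial + 2.0 * p[0] * x * y + p[1] * (r2 + 2.0 * x * x)
          yd = y * radial + p[0] * (r2 + 2.0 * y * y) + 2.0 * p[1] * x * y
          return xd, yd

      # mild barrel distortion (k1 < 0) pulls points toward the image centre:
      print(brown_distort(0.5, 0.25, k=(-0.2, 0.0, 0.0), p=(0.0, 0.0)))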

  20. Video systems for real-time oil-spill detection

    NASA Technical Reports Server (NTRS)

    Millard, J. P.; Arvesen, J. C.; Lewis, P. L.; Woolever, G. F.

    1973-01-01

    Three airborne television systems are being developed to evaluate techniques for oil-spill surveillance. These include a conventional TV camera, two cameras operating in a subtractive mode, and a field-sequential camera. False-color enhancement and wavelength and polarization filtering are also employed. The first of a series of flight tests indicates that an appropriately filtered conventional TV camera is a relatively inexpensive method of improving contrast between oil and water. False-color enhancement improves the contrast, but the problem caused by sun glint now limits the application to overcast days. Future effort will be aimed toward a one-camera system. Solving the sun-glint problem and developing the field-sequential camera into an operable system offers potential for color 'flagging' oil on water.

  1. Development of the science instrument CLUPI: the close-up imager on board the ExoMars rover

    NASA Astrophysics Data System (ADS)

    Josset, J.-L.; Beauvivre, S.; Cessa, V.; Martin, P.

    2017-11-01

    First mission of the Aurora Exploration Programme of ESA, ExoMars will demonstrate key flight and in situ enabling technologies and will pursue fundamental scientific investigations. Planned for launch in 2013, ExoMars will send a robotic rover to the surface of Mars. The Close-Up Imager (CLUPI) instrument is part of the rover's Pasteur payload and is mounted on the robotic arm. It is a robotic replacement for one of the most useful instruments of the field geologist: the hand lens. Imaging of the surfaces of rocks, soils, and wind-drift deposits at high resolution is crucial for understanding the geological context of any site where the Pasteur rover may be active on Mars. At the resolution provided by CLUPI (approx. 15 micrometer/pixel), rocks show a plethora of surface and internal structures, to name just a few: crystals in igneous rocks, sedimentary structures such as bedding, fracture mineralization, secondary minerals, details of the surface morphology, sediment components, surface marks in sediments, and soil particles. It is conceivable that even textures resulting from ancient biological activity can be visualized, such as fine lamination due to microbial mats (stromatolites) and textures resulting from colonies of filamentous microbes, potentially present in sediments and in palaeocavities in any rock type. CLUPI is a complete imaging system, consisting of an APS (Active Pixel Sensor) camera with 27° FOV optics. The sensor is sensitive to light between 400 and 900 nm with 12-bit digitization. The fixed-focus optics provides well-focused images of a 4 cm x 2.4 cm rock area at a distance of about 10 cm. This challenging camera system, at less than 200 g, is an independent scientific instrument linked to the rover's on-board computer via a SpaceWire interface. After a presentation of the science goals and specifications, the development of this complex, high-performance miniaturized imaging system is described.

  2. Localization and Mapping Using a Non-Central Catadioptric Camera System

    NASA Astrophysics Data System (ADS)

    Khurana, M.; Armenakis, C.

    2018-05-01

    This work details the development of an indoor navigation and mapping system using a non-central catadioptric omnidirectional camera and its implementation for mobile applications. Omnidirectional catadioptric cameras find use in the navigation and mapping of robotic platforms owing to their wide field of view. Having a wider field of view, or rather a potential 360° field of view, allows the system to "see and move" more freely in the navigation space. A catadioptric camera system is a low-cost system consisting of a mirror and a camera; any perspective camera can be used. A platform was constructed to combine the mirror and a camera into a catadioptric system. A calibration method was developed to obtain the relative position and orientation between the two components so that they can be treated as one monolithic system. The mathematical model for localizing the system was derived from conditions based on the reflective properties of the mirror. The obtained platform positions were then used to map the environment using epipolar geometry. Experiments were performed to test the mathematical models and the achieved localization and mapping accuracies of the system. An iterative process of positioning and mapping was applied to determine object coordinates of an indoor environment while navigating the mobile platform. Camera localization and the 3D coordinates of object points achieved decimetre-level accuracies.
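    The epipolar mapping step can be illustrated with OpenCV's central-camera primitives; note this is indicative only, since the paper's camera is non-central and its actual mirror-based model differs. The intrinsic matrix K and the matched pixel arrays are assumed given.

      import cv2
      import numpy as np

      def map_points(pts1, pts2, K):
          """Recover relative pose from matched points (Nx2 float arrays) taken
          at two platform positions, then triangulate 3-D object coordinates."""
          E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
          _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
          P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
          P2 = K @ np.hstack([R, t])
          X_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)   # 4xN homogeneous
          return (X_h[:3] / X_h[3]).T    # Nx3 coordinates, up to an overall scale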

  3. Design and realization of an AEC&AGC system for the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Hao yun

    2015-08-01

    An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. Normal AEC and AGC algorithms are not suitable for an aerial camera, since the camera always takes high-resolution photographs while moving at high speed. The AEC and AGC system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. An automatic gamma correction is applied before the image is output, so that the image is better suited for viewing and analysis by human eyes. The AEC and AGC system can avoid underexposure, overexposure, or image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment speed, high adaptability, and high reliability in severe, complex environments.
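    A schematic of the control policy described above: prefer a longer shutter until motion blur becomes the binding constraint at the current ground speed, then shift the remaining correction into gain. All thresholds and limits below are illustrative assumptions, not the paper's constants.

      def aec_agc(mean_level, shutter, gain, target=128.0, shutter_max_blur=1e-3,
                  shutter_limits=(1e-5, 1e-2), gain_limits=(1.0, 16.0)):
          """One iteration of a simple AEC/AGC loop. mean_level is mean image
          brightness (0-255); shutter is in seconds; shutter_max_blur is the
          longest shutter keeping blur acceptable at the current speed."""
          error = target / max(mean_level, 1.0)       # multiplicative correction
          shutter_new = shutter * error
          cap = min(shutter_max_blur, shutter_limits[1])
          if shutter_new <= cap:                       # exposure alone suffices
              return max(shutter_new, shutter_limits[0]), gain
          residual = shutter_new / cap                 # leftover correction -> gain
          return cap, min(max(gain * residual, gain_limits[0]), gain_limits[1])

      # dark scene at high speed: shutter saturates at the blur cap, gain rises
      print(aec_agc(mean_level=40.0, shutter=5e-4, gain=1.0))   # -> (0.001, 1.6)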

  4. Overview of a Hybrid Underwater Camera System

    DTIC Science & Technology

    2014-07-01

    meters), in increments of 200 ps. The camera is also equipped with a 6:1 motorized zoom lens. A precision miniature attitude and heading reference system (AHRS) ... [Figure (b), "LUCIE sub-systems," shows the LUCIE control and power distribution system, AHRS, pulsed laser, gated camera, and sonar transducer.]

  5. [Microeconomics of introduction of a PET system based on the revised Japanese National Insurance reimbursement system].

    PubMed

    Abe, Katsumi; Kosuda, Shigeru; Kusano, Shoichi; Nagata, Masayoshi

    2003-11-01

    It is crucial to evaluate the annual balance beforehand when an institution installs a PET system, because the revised Japanese national insurance reimbursement system set the cost of an FDG PET study at 75,000 yen. Break-even points were calculated for an 8-hour and a 24-hour operation of a PET system, based on the total costs reported. The break-even points were 13.4, 17.7, and 22.1 studies per day for the 1 cyclotron-1 PET camera, 1 cyclotron-2 PET cameras, and 1 cyclotron-3 PET cameras systems, respectively, in an ordinary PET system operation of 8 hours. They were 19.9, 25.5, and 31.2 studies per day, respectively, in a full PET system operation of 24 hours. The results indicate that no profit would accrue in an ordinary 8-hour PET system operation. The annual profit and the break-even point for the total cost, including the initial investment, would be 530 million yen and 2.8 years, respectively, in a 24-hour operation with the 1 cyclotron-3 PET cameras system.
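    The break-even arithmetic can be made explicit. With the reimbursement R = 75,000 yen per study from the record, and assuming roughly 250 operating days per year (an assumption for illustration, not a figure from the abstract), the 8-hour single-camera break-even point implies:

      \[
      N_{\mathrm{be}} = \frac{C_{\mathrm{day}}}{R}
      \quad\Longrightarrow\quad
      C_{\mathrm{day}} = 13.4 \times 75{,}000 \approx 1.0\ \text{million yen/day},
      \]

    or on the order of 250 million yen per year in total costs for the 1 cyclotron-1 PET camera configuration.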

  6. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Infrared camera and CCD camera dual-band imaging systems are widely used in many types of equipment and applications. If such a system is tested using the traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position as the environmental temperature changes, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, so it should find a good market.
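
    The multiple-frame averaging step relies on the fact that averaging N frames with independent zero-mean noise reduces the noise standard deviation by sqrt(N); a small numpy check:

        import numpy as np

        rng = np.random.default_rng(0)
        frames = [100.0 + rng.normal(0.0, 5.0, (64, 64)) for _ in range(16)]

        avg = np.mean(np.stack(frames), axis=0)   # multiple-frame average
        print(frames[0].std())                    # ~5.0 per single frame
        print(avg.std())                          # ~5/sqrt(16) = 1.25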

  7. Assessment of the DoD Embedded Media Program

    DTIC Science & Technology

    2004-09-01

    Weapons Systems Video, Gun Camera Video, and Lipstick Cameras: a SECDEF and CJCS message to commanders stated, “Put in place mechanisms and processes...of public communication activities.” The 10 February 2003 PAG stated, “Use of lipstick and helmet-mounted cameras on combat sorties is approved

  8. Drilling, sampling, and sample-handling system for China's asteroid exploration mission

    NASA Astrophysics Data System (ADS)

    Zhang, Tao; Zhang, Wenming; Wang, Kang; Gao, Sheng; Hou, Liang; Ji, Jianghui; Ding, Xilun

    2017-08-01

    Asteroid exploration is of significant importance in promoting our understanding of the solar system and the origin of life on Earth. A unique opportunity to study the near-Earth asteroid 99942 Apophis will occur in 2029, when it makes its closest approach to Earth. In the current work, a drilling, sampling, and sample-handling system (DSSHS) is proposed to penetrate the asteroid regolith, collect regolith samples at different depths, and distribute the samples to different scientific instruments for in situ analysis. In this system, a rotary-drilling method is employed for penetration, and an inner sampling tube is utilized to collect and discharge the regolith samples. The sampling tube can deliver samples of up to 84 mm3 in volume, from a maximum penetration depth of 300 mm, to 17 different ovens. To activate the release of volatile substances, the samples will be heated to a temperature of 600 °C in the ovens, and these substances will be analyzed by scientific instruments such as a mass spectrometer, an isotopic analyzer, and micro-cameras, among others. The DSSHS is capable of penetrating rocks with a hardness value of 6, and it can be used for China's asteroid exploration mission in the foreseeable future.

  9. Raduga experiment: Multizonal photographing the Earth from the Soyuz-22 spacecraft

    NASA Technical Reports Server (NTRS)

    Ziman, Y.; Chesnokov, Y.; Dunayev, B.; Aksenov, V.; Bykovskiy, V.; Ioaskhim, R.; Myuller, K.; Choppe, V.; Volter, V.

    1980-01-01

    The main results of the scientific research and 'Raduga' experiment are reported. Technical parameters are presented for the MKF-6 camera and the MSP-4 projector. Characteristics of the obtained materials and certain results of their processing are reported.

  10. Design of low noise imaging system

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Chen, Xiaolai

    2017-10-01

    In order to meet the needs of engineering applications for a low-noise imaging system under the global-shutter mode, a complete imaging system is designed based on the SCMOS (Scientific CMOS) image sensor CIS2521F. The paper introduces the hardware circuit and software system design. Based on an analysis of the key specifications and technologies of the imaging system, the paper selects the chips and adopts SCMOS + FPGA + DDRII + Camera Link as the processing architecture. It then introduces the entire system workflow and the design of the power supply and distribution unit. The software system, which consists of the SCMOS control module, image acquisition module, data cache control module, and transmission control module, is designed in Verilog and driven on a Xilinx FPGA. Imaging experiments show that the system delivers a 2560 x 2160 pixel resolution at a maximum frame rate of 50 fps. The imaging quality of the system satisfies the design requirements.
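
    A quick data-rate estimate shows why a DDRII frame buffer sits between the sensor and the Camera Link output; the 16-bit-per-pixel container is an assumption for illustration.

        width, height, fps = 2560, 2160, 50
        bytes_per_pixel = 2                       # assumed 16-bit pixel words
        rate = width * height * fps * bytes_per_pixel
        print(f"{rate / 1e6:.0f} MB/s")           # ~553 MB/s sustained

    At roughly 553 MB/s, the stream exceeds the roughly 255 MB/s of a base Camera Link configuration, which makes on-board DDRII buffering a natural design choice.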

  11. Lytro camera technology: theory, algorithms, performance analysis

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio

    2013-03-01

    The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization aided by the increase in computational power characterizing mobile computational photography. The plenoptic approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, treating the camera as a black box and using our interpretation of the image data it saves. We present our findings based on our interpretation of the Lytro camera file structure, image calibration, and image rendering; in this context, artifacts and final image resolution are discussed.
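
    The refocusing principle behind such cameras can be illustrated with a shift-and-add sketch over sub-aperture views; extracting those views from the raw microlens image, and Lytro's actual proprietary pipeline, are outside this fragment.

        import numpy as np

        def refocus(views, offsets, alpha):
            # Shift each sub-aperture view in proportion to its pupil
            # position (du, dv) and average; alpha selects the synthetic
            # focal plane.
            acc = np.zeros_like(views[0], dtype=np.float64)
            for view, (du, dv) in zip(views, offsets):
                shift = (round(alpha * du), round(alpha * dv))
                acc += np.roll(view, shift, axis=(0, 1))
            return acc / len(views)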

  12. Single-snapshot 2D color measurement by plenoptic imaging system

    NASA Astrophysics Data System (ADS)

    Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana

    2014-03-01

    Plenoptic cameras enable the capture of directional light-ray information, allowing applications such as digital refocusing, depth estimation, and multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor, high-color-fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color-measurement precision, we perform an end-to-end optimization of the system model that includes light-source information, object information, optical-system information, plenoptic image processing, and color-estimation processing. The optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color-measurement precision. We describe an application of our colorimetric camera to the color-shading evaluation of displays and show that it achieves a color accuracy of ΔE<0.01.
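
    As a reduced stand-in for the color-estimation stage, a linear matrix from the camera's filter responses to CIE XYZ can be fitted by least squares on training patches; the paper's end-to-end optimization jointly models light source, object, optics, and processing, which this sketch does not attempt.

        import numpy as np

        def fit_color_matrix(responses, xyz_reference):
            # responses: (n_patches, 3) measured XYZ-filter responses;
            # xyz_reference: (n_patches, 3) known CIE XYZ of the patches.
            M, *_ = np.linalg.lstsq(responses, xyz_reference, rcond=None)
            return M                      # estimate as: xyz = responses @ M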

  13. Energy-efficient lighting system for television

    DOEpatents

    Cawthorne, Duane C.

    1987-07-21

    A light control system for a television camera comprises an artificial light control system cooperating with an iris control system. The artificial light control system adjusts the power to the lamps illuminating the camera viewing area so as to provide only the artificial illumination necessary to maintain an adequate video signal when the camera iris is substantially open.

  14. NASA Hubble Sees Sparring Antennae Galaxies

    NASA Image and Video Library

    2013-11-15

    The NASA/ESA Hubble Space Telescope has snapped the best ever image of the Antennae Galaxies. Hubble has released images of these stunning galaxies twice before, once using observations from its Wide Field and Planetary Camera 2 (WFPC2) in 1997, and again in 2006 from the Advanced Camera for Surveys (ACS). Each of Hubble’s images of the Antennae Galaxies has been better than the last, due to upgrades made during the famous servicing missions, the last of which took place in 2009. The galaxies — also known as NGC 4038 and NGC 4039 — are locked in a deadly embrace. Once normal, sedate spiral galaxies like the Milky Way, the pair have spent the past few hundred million years sparring with one another. This clash is so violent that stars have been ripped from their host galaxies to form a streaming arc between the two. In wide-field images of the pair the reason for their name becomes clear — far-flung stars and streamers of gas stretch out into space, creating long tidal tails reminiscent of antennae. This new image of the Antennae Galaxies shows obvious signs of chaos. Clouds of gas are seen in bright pink and red, surrounding the bright flashes of blue star-forming regions — some of which are partially obscured by dark patches of dust. The rate of star formation is so high that the Antennae Galaxies are said to be in a state of starburst, a period in which all of the gas within the galaxies is being used to form stars. This cannot last forever and neither can the separate galaxies; eventually the nuclei will coalesce, and the galaxies will begin their retirement together as one large elliptical galaxy. This image uses visible and near-infrared observations from Hubble’s Wide Field Camera 3 (WFC3), along with some of the previously-released observations from Hubble’s Advanced Camera for Surveys (ACS). Credit: NASA/European Space Agency

  15. Morphology and Dynamics of Jets of Comet 67P Churyumov-Gerasimenko: Early Phase Development

    NASA Astrophysics Data System (ADS)

    Lin, Zhong-Yi; Ip, Wing-Huen; Lai, Ian-Lin; Lee, Jui-Chi; Pajola, Maurizio; Lara, Luisa; Gutierrez, Pedro; Rodrigo, Rafael; Bodewits, Dennis; A'Hearn, Mike; Vincent, Jean-Baptiste; Agarwal, Jessica; Keller, Uwe; Mottola, Stefano; Bertini, Ivano; Lowry, Stephen; Rozek, Agata; Liao, Ying; Rosetta Osiris Coi Team

    2015-04-01

    The scientific camera OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) onboard the Rosetta spacecraft comprises a Narrow Angle Camera (NAC) for nucleus surface and dust studies and a Wide Angle Camera (WAC) for wide-field investigations of the dust and gas coma. Described here is the dynamical behavior of jets in the dust coma, monitored continuously with dust filters from arrival at the comet (August 2014) through the mapping phase (October 2014). The analysis covers the time variability of the jets, their source regions, the excess brightness of jets relative to the averaged coma brightness, and the brightness distribution of dust jets along the projected distance. The jets detected between August and September originated mostly from the neck region (Hapi). Morphological changes appeared over a time scale of several days in September. The brightness slope of the dust jets is much steeper than that of the background coma, which might be related to sublimation or fragmentation of the emitted dust grains. Inter-comparison with results from other experiments will be necessary to understand the difference between the dust emitted from Hapi and that from the head and the body of the nucleus. The physical properties of the Hapi jets will be compared to the dust jets (and their source regions) that emerge as comet 67P moves around perihelion.

  16. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras.

    PubMed

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-06-24

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system.
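
    The resulting record might resemble the following hypothetical schema; every field name here is illustrative rather than the paper's actual format.

        from dataclasses import dataclass

        @dataclass
        class HumanMetadata:
            camera_id: str
            timestamp: float
            ground_xy: tuple      # world position after scene calibration
            height_m: float       # metric height from the 3D human model
            descriptor: list      # normalized appearance features

        def plausible_match(a: HumanMetadata, b: HumanMetadata, tol=0.05):
            # Cross-camera retrieval compares normalized, camera-independent
            # fields instead of raw image measurements.
            return abs(a.height_m - b.height_m) < tol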

  18. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1991-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, one or more television cameras, and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to the coordinates of a selected or nonselected camera.

  19. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1993-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, one or more television cameras, and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to the coordinates of a selected or nonselected camera.

  20. Components of the Early Apollo Scientific Experiments Package (EASEP)

    NASA Image and Video Library

    1969-07-20

    AS11-37-5551 (20 July 1969) --- Two components of the Early Apollo Scientific Experiments Package (EASEP) are seen deployed on the lunar surface in this view photographed from inside the Lunar Module (LM). In the far background is the Passive Seismic Experiment Package (PSEP); and to the right and closer to the camera is the Laser Ranging Retro-Reflector (LR-3). The footprints of Apollo 11 astronauts Neil A. Armstrong and Edwin E. Aldrin Jr. are very distinct in the lunar soil.

  1. Research on the electro-optical assistant landing system based on the dual camera photogrammetry algorithm

    NASA Astrophysics Data System (ADS)

    Mi, Yuhe; Huang, Yifan; Li, Lin

    2015-08-01

    Based on beacon photogrammetry location techniques, a Dual Camera Photogrammetry (DCP) algorithm was used to assist helicopters landing on a ship. In this paper, ZEMAX was used to simulate two Charge Coupled Device (CCD) cameras imaging four beacons on both sides of the helicopter and to output the images to MATLAB. Target coordinate systems, image pixel coordinate systems, world coordinate systems, and camera coordinate systems were established. According to the ideal pinhole imaging model, the rotation matrix and translation vector between the target coordinate system and the camera coordinate systems could be obtained by using MATLAB to process the image information and solve the linear equations. On this basis, the ambient temperature and the positions of the beacons and cameras were varied in ZEMAX to test the accuracy of the DCP algorithm in complex sea states. The numerical simulation shows that, in complex sea states, the position measurement accuracy can meet the requirements of the project.
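
    The pose-recovery core of such a pipeline corresponds to the classic perspective-n-point problem. A hedged OpenCV sketch follows; the beacon layout, image points, and intrinsic matrix are invented for illustration.

        import cv2
        import numpy as np

        # Four beacons in the helicopter (target) frame, metres (assumed layout).
        object_pts = np.array([[-2.0, 0.0,  1.0], [2.0, 0.0,  1.0],
                               [-2.0, 0.0, -1.0], [2.0, 0.0, -1.0]])
        # Their detected pixel positions in one camera (invented values).
        image_pts = np.array([[412.3, 300.1], [618.9, 302.4],
                              [410.7, 455.0], [620.2, 452.8]])
        # Assumed pinhole intrinsics (focal length and principal point in px).
        K = np.array([[1500.0,    0.0, 512.0],
                      [   0.0, 1500.0, 384.0],
                      [   0.0,    0.0,   1.0]])

        ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
        R, _ = cv2.Rodrigues(rvec)   # rotation matrix; tvec is the translation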

  2. MicroCameras and Photometers (MCP) on board the TARANIS satellite

    NASA Astrophysics Data System (ADS)

    Farges, T.; Hébert, P.; Le Mer-Dachard, F.; Ravel, K.; Gaillac, S.

    2017-12-01

    TARANIS (Tool for the Analysis of Radiations from lightNing and Sprites) is a CNES micro satellite. Its main objective is to study impulsive transfers of energy between the Earth's atmosphere and the space environment. It will be sun-synchronous at an altitude of 700 km and will be launched in 2019 for at least 2 years. Its payload is composed of several electromagnetic instruments covering different wavelengths (from gamma rays to radio waves, including optical). The TARANIS instruments are currently in the calibration and qualification phase. The purpose here is to present the MicroCameras and Photometers (MCP) design, to show its performance after its recent characterization, and finally to discuss the scientific objectives and how we intend to address them with MCP observations. The MicroCameras, developed by Sodern, are dedicated to the spatial description of TLEs and their parent lightning. They are able to differentiate sprites from lightning thanks to two narrow bands ([757-767 nm] and [772-782 nm]) that provide simultaneous pairs of images of an event. Simulation results of the differentiation method will be shown. After calibration and tests, the MicroCameras have been delivered to CNES for integration on the payload. The Photometers, developed by Bertin Technologies, will provide temporal measurements and spectral characteristics of TLEs and lightning. They are key instruments because of their capability to detect TLEs on board and then switch all the instruments of the scientific payload into their high-resolution acquisition mode. The Photometers use four spectral bands ([170-260 nm], [332-342 nm], [757-767 nm], and [600-900 nm]) and have the same field of view as the cameras. The remote-controlled parameters of the on-board TLE detection algorithm have been tuned before launch using the electronic board and simulated or real event waveforms. After calibration, the Photometers are now going through environmental tests. They will be delivered to CNES for integration on the payload in September 2017.
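
    The on-board detection step can be caricatured as a threshold trigger on the photometer waveform; the window length and the sigma multiplier below stand in for the remote-controlled parameters mentioned above.

        import numpy as np

        def tle_trigger(waveform, window=64, k=5.0):
            # Flag a transient when a short-window running mean exceeds the
            # background estimate by k standard deviations.
            background = np.median(waveform)
            noise = waveform.std()
            running_mean = np.convolve(waveform,
                                       np.ones(window) / window, mode="valid")
            return bool(np.any(running_mean > background + k * noise))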

  3. Low-cost digital dynamic visualization system

    NASA Astrophysics Data System (ADS)

    Asundi, Anand K.; Sajan, M. R.

    1995-05-01

    High-speed photographic systems like the image rotation camera, the Cranz-Schardin camera, and the drum camera are typically used for recording and visualizing dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore, they are all based on photographic film recording, which requires time-consuming and tedious wet processing of the films. Digital cameras are currently replacing conventional cameras to a certain extent in static experiments. Recently, there has been a lot of interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration (TDI) mode for digitally recording dynamic scenes. Applications in solid as well as fluid impact problems are presented.
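
    The TDI idea is that the same scene line is re-exposed at every stage while the accumulated charge shifts in synchrony with the image motion, so signal grows linearly with the stage count while independent noise grows only as its square root. A toy numpy check, with arbitrary noise level and stage count:

        import numpy as np

        rng = np.random.default_rng(1)
        scene = 1.2 + np.sin(np.linspace(0.0, 6.0, 512))   # one scene line
        n_stages, sigma = 64, 0.5

        single = scene + rng.normal(0.0, sigma, 512)
        tdi = sum(scene + rng.normal(0.0, sigma, 512) for _ in range(n_stages))

        print((single - scene).std())             # ~0.5
        print((tdi / n_stages - scene).std())     # ~0.5/sqrt(64): 8x SNR gain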

  4. Similar on the Inside (pre-grinding)

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This approximate true-color image taken by the panoramic camera on the Mars Exploration Rover Opportunity shows the rock called 'Pilbara,' located in the small crater dubbed 'Fram.' The rock appears to be dotted with the same 'blueberries,' or spherules, found at 'Eagle Crater.' Opportunity drilled into this rock with its rock abrasion tool. After analyzing the hole with the rover's scientific instruments, scientists concluded that Pilbara has a similar chemical make-up, and thus watery past, to rocks studied at Eagle Crater. This image was taken with the panoramic camera's 480-, 530- and 600-nanometer filters.

  5. Similar on the Inside (post-grinding)

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This approximate true-color image taken by the panoramic camera on the Mars Exploration Rover Opportunity shows the hole drilled into the rock called 'Pilbara,' which is located in the small crater dubbed 'Fram.' Opportunity drilled into this rock with its rock abrasion tool. The rock appears to be dotted with the same 'blueberries,' or spherules, found at 'Eagle Crater.' After analyzing the hole with the rover's scientific instruments, scientists concluded that Pilbara has a similar chemical make-up, and thus watery past, to rocks studied at Eagle Crater. This image was taken with the panoramic camera's 480-, 530- and 600-nanometer filters.

  6. Reflective correctors for the Hubble Space Telescope axial instruments

    NASA Technical Reports Server (NTRS)

    Bottema, Murk

    1993-01-01

    Reflective correctors to compensate for the spherical aberration of the Hubble Space Telescope are to be placed in front of three of the axial scientific instruments (a camera and two spectrographs) during the first scheduled refurbishment mission. The five correctors required are deployed from a new module that replaces the fourth axial instrument. Each corrector consists of a field mirror and an aspherical, aberration-correcting reimaging mirror. In the camera the angular resolution capability is restored, albeit over reduced fields, and in the spectrographs the potential for observations in crowded areas is regained, along with effective light collection at the slits.

  7. High-resolution CCD imaging alternatives

    NASA Astrophysics Data System (ADS)

    Brown, D. L.; Acker, D. E.

    1992-08-01

    High-resolution CCD color cameras have recently stimulated the interest of a large number of potential end-users for a wide range of practical applications. Real-time High Definition Television (HDTV) systems are now being used or considered for use in applications ranging from entertainment program origination through digital image storage to medical and scientific research. HDTV generation of electronic images offers significant cost and time-saving advantages over the use of film in such applications. Further, in still-image systems, electronic image capture is faster and more efficient than conventional image scanners: a CCD still camera can capture 3-dimensional objects into the computing environment directly, without having to shoot a picture on film, develop it, and then scan the image into a computer. Turning to extending CCD technology beyond broadcast, most standard production CCD sensor chips are made for broadcast-compatible systems. One popular CCD, the basis for this discussion, offers arrays of roughly 750 x 580 picture elements (pixels), a total array of approximately 435,000 pixels. FOR-A has developed a technique to increase the number of available pixels for a given image compared to that produced by the standard CCD itself: using an interlined CCD with an overall spatial structure several times larger than the photosensitive sensor areas, each of the CCD sensors is shifted in two dimensions in order to fill in the spatial gaps between adjacent sensors.
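
    A simplified model of that pixel-shift idea: four exposures displaced by half a pixel pitch are interleaved onto a grid of doubled density (the real interline-CCD geometry and readout are more involved).

        import numpy as np

        def interleave_quad(f00, f01, f10, f11):
            # f00: reference frame; f01: +x half-pixel shift;
            # f10: +y half-pixel shift; f11: both shifts.
            h, w = f00.shape
            out = np.empty((2 * h, 2 * w), dtype=np.float64)
            out[0::2, 0::2] = f00
            out[0::2, 1::2] = f01
            out[1::2, 0::2] = f10
            out[1::2, 1::2] = f11
            return out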

  8. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single-lens, reflex-viewing design with a horizontal pull-across of 15 perforations per frame. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.

  9. Development of biostereometric experiments. [stereometric camera system

    NASA Technical Reports Server (NTRS)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  10. A direct-view customer-oriented digital holographic camera

    NASA Astrophysics Data System (ADS)

    Besaga, Vira R.; Gerhardt, Nils C.; Maksimyak, Peter P.; Hofmann, Martin R.

    2018-01-01

    In this paper, we propose a direct-view digital holographic camera system consisting mostly of customer-oriented components. The camera system is based on standard photographic units such as camera sensor and objective and is adapted to operate under off-axis external white-light illumination. The common-path geometry of the holographic module of the system ensures direct-view operation. The system can operate in both self-reference and self-interference modes. As a proof of system operability, we present reconstructed amplitude and phase information of a test sample.

  11. Deep Space Positioning System

    NASA Technical Reports Server (NTRS)

    Vaughan, Andrew T. (Inventor); Riedel, Joseph E. (Inventor)

    2016-01-01

    A single, compact, low-power deep space positioning system (DPS) configured to determine the location of a spacecraft anywhere in the solar system and provide state information relative to the Earth, the Sun, or any remote object. For example, the DPS includes a first camera and, possibly, a second camera configured to capture a plurality of navigation images to determine the state of a spacecraft in the solar system. The second camera is located behind, or adjacent to, a secondary reflector of the first camera in the body of a telescope.

  12. The ESA mission to Comet Halley

    NASA Technical Reports Server (NTRS)

    Reinhard, R.

    1981-01-01

    The European Space Agency's Giotto mission plans for a launch in July 1985 with a Halley encounter in mid-March 1986, four weeks after the comet's perihelion passage. Giotto carries 10 scientific experiments: a camera; neutral, ion, and dust mass spectrometers; a dust impact detector system; various plasma analyzers; a magnetometer; and an optical probe. The instruments and the principles on which they are based are described, and the key experiment performance data are summarized. The launch constraints, the heliocentric transfer trajectory, and the encounter scenario are analyzed. The Giotto spacecraft's major design criteria, the spacecraft subsystems, and the ground system are described. The problem of hypervelocity dust particle impacts in the innermost part of the coma, the problem of spacecraft survival, and the adverse effects of impact-generated plasma around the spacecraft are considered.

  13. STS-42 MS/PLC Norman E. Thagard adjusts Rack 10 FES equipment in IML-1 module

    NASA Image and Video Library

    1992-01-30

    STS042-05-006 (22-30 Jan 1992) --- Astronaut Norman E. Thagard, payload commander, works with the Fluids Experiment System (FES) in the International Microgravity Laboratory (IML-1) science module. The FES is a NASA-developed facility that produces optical images of fluid flows during the processing of materials in space. The system's sophisticated optics consist of a laser to make holograms of samples and a video camera to record images of flows in and around samples. Thagard was joined by six fellow crewmembers for eight days of scientific research aboard Discovery in Earth orbit. Most of their on-duty time was spent in this IML-1 science module, positioned in the cargo bay and attached via a tunnel to Discovery's airlock.

  14. The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation

    NASA Astrophysics Data System (ADS)

    Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team

    2002-12-01

    The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024x2048 Mitel frame transfer CCD detector arrays, each having a 1024x1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel and a square FOV of 16° x 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with the adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from -55 °C to +5 °C. An onboard calibration target and fiducial marks provide the ability to validate the radiometric and geometric calibration on Mars. Pancam relies heavily on the JPL ICER wavelet compression algorithm to maximize data return within stringent mission downlink limits. The scientific goals of the Pancam investigation are to: (a) obtain monoscopic and stereoscopic image mosaics to assess the morphology, topography, and geologic context of each MER landing site; (b) obtain multispectral visible to short-wave near-IR images of selected regions to determine surface color and mineralogic properties; (c) obtain multispectral images over a range of viewing geometries to constrain surface photometric and physical properties; and (d) obtain images of the Martian sky, including direct images of the Sun, to determine dust and aerosol opacity and physical properties. In addition, Pancam also serves a variety of operational functions on the MER mission, including (e) serving as the primary Sun-finding camera for rover navigation; (f) resolving objects on the scale of the rover wheels at distances of ~100 m to help guide navigation decisions; (g) providing stereo coverage adequate for the generation of digital terrain models to help guide and refine rover traverse decisions; (h) providing high resolution images and other context information to guide the selection of the most interesting in situ sampling targets; and (i) supporting acquisition and release of exciting E/PO products.
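
    The quoted optical parameters are mutually consistent, as a short check shows:

        import math

        focal_len_mm, ifov_mrad, pixels = 42.0, 0.28, 1024

        pixel_pitch_um = focal_len_mm * ifov_mrad          # 11.76 um implied pitch
        fov_deg = math.degrees(pixels * ifov_mrad * 1e-3)  # ~16.4 deg, matching 16x16
        pupil_mm = focal_len_mm / 20.0                     # f/20 -> ~2.1 mm aperture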

  15. A Wide-Angle Camera for the Mobile Asteroid Surface Scout (MASCOT) on Hayabusa-2

    NASA Astrophysics Data System (ADS)

    Schmitz, N.; Koncz, A.; Jaumann, R.; Hoffmann, H.; Jobs, D.; Kachlicki, J.; Michaelis, H.; Mottola, S.; Pforte, B.; Schroeder, S.; Terzer, R.; Trauthan, F.; Tschentscher, M.; Weisse, S.; Ho, T.-M.; Biele, J.; Ulamec, S.; Broll, B.; Kruselburger, A.; Perez-Prieto, L.

    2014-04-01

    JAXA's Hayabusa-2 mission, an asteroid sample return mission, is scheduled for launch in December 2014, for a rendezvous with the C-type asteroid 1999 JU3 in 2018. MASCOT, the Mobile Asteroid Surface Scout [1], is a small lander designed to deliver ground truth for the orbiter's remote measurements, support the selection of sampling sites, and provide context for the returned samples. MASCOT's main objective is to investigate the landing site's geomorphology, the internal structure, texture and composition of the regolith (dust, soil and rocks), and the thermal, mechanical, and magnetic properties of the surface. MASCOT comprises a payload of four scientific instruments: camera, radiometer, magnetometer and hyperspectral microscope. The camera (MASCOT CAM) was designed and built by DLR's Institute of Planetary Research, together with Airbus DS Germany.

  16. Clinical applications of commercially available video recording and monitoring systems: inexpensive, high-quality video recording and monitoring systems for endoscopy and microsurgery.

    PubMed

    Tsunoda, Koichi; Tsunoda, Atsunobu; Ishimoto, ShinnIchi; Kimura, Satoko

    2006-01-01

    Dedicated charge-coupled device (CCD) camera systems for endoscopes and electronic fiberscopes are in widespread use. However, both are usually stationary in an office or examination room, and a wheeled cart is needed for mobility. The total costs of a CCD camera system and an electronic fiberscopy system are at least US $10,000 and US $30,000, respectively. Recently, the performance of audio and visual instruments has improved dramatically, with a concomitant reduction in their cost. Commercially available CCD video cameras with small monitors have become common. They provide excellent image quality and are much smaller and less expensive than previous models. The authors have developed adaptors for the popular mini-digital video (mini-DV) camera. The camera also provides video and acoustic output signals; therefore, the endoscopic images can be viewed on a large monitor simultaneously. The new system (a mini-DV video camera and an adaptor) costs only US $1,000. The system is therefore both cost-effective and useful in the outpatient clinic or casualty setting, or on house calls for patient education. In the future, the authors plan to introduce the clinical application of a high-vision camera and an infrared camera as medical instruments for clinical and research situations.

  17. [Evidence-based effectiveness of road safety interventions: a literature review].

    PubMed

    Novoa, Ana M; Pérez, Katherine; Borrell, Carme

    2009-01-01

    Only road safety interventions with scientific evidence supporting their effectiveness should be implemented. The objective of this study was to identify and summarize the available evidence on the effectiveness of road safety interventions in reducing road traffic collisions, injuries and deaths. All literature reviews published in scientific journals that assessed the effectiveness of one or more road safety interventions and whose outcome measure was road traffic crashes, injuries or fatalities were included. An exhaustive search was performed in scientific literature databases. The interventions were classified according to the evidence of their effectiveness in reducing road traffic injuries (effective interventions, insufficient evidence of effectiveness, ineffective interventions) following the structure of the Haddon matrix. Fifty-four reviews were included. Effective interventions were found before, during and after the collision, and across all factors: a) the individual: the graduated licensing system (31% road traffic injury reduction); b) the vehicle: electronic stability control system (2 to 41% reduction); c) the infrastructure: area-wide traffic calming (0 to 20%), and d) the social environment: speed cameras (7 to 30%). Certain road safety interventions are ineffective, mostly road safety education, and others require further investigation. The most successful interventions are those that reduce or eliminate the hazard and do not depend on changes in road users' behavior or on their knowledge of road safety issues. Interventions based exclusively on education are ineffective in reducing road traffic injuries.

  18. A Quasi-Static Method for Determining the Characteristics of a Motion Capture Camera System in a "Split-Volume" Configuration

    NASA Technical Reports Server (NTRS)

    Miller, Chris; Mulavara, Ajitkumar; Bloomberg, Jacob

    2001-01-01

    To confidently report any data collected from a video-based motion capture system, its functional characteristics must be determined, namely accuracy, repeatability, and resolution. Many researchers have examined these characteristics with motion capture systems, but they used only two cameras, positioned 90 degrees to each other. Everaert used 4 cameras, but all were aligned along major axes (two in x, one in y and z). Richards compared the characteristics of different commercially available systems set up in practical configurations, but all cameras viewed a single calibration volume. The purpose of this study was to determine the accuracy, repeatability, and resolution of a 6-camera Motion Analysis system in a split-volume configuration using a quasi-static methodology.

  19. Ultra-high resolution of radiocesium distribution detection based on Cherenkov light imaging

    NASA Astrophysics Data System (ADS)

    Yamamoto, Seiichi; Ogata, Yoshimune; Kawachi, Naoki; Suzui, Nobuo; Yin, Yong-Gen; Fujimaki, Shu

    2015-03-01

    After the nuclear disaster in Fukushima, radiocesium contamination became a serious scientific concern, and research on its effects on plants increased. In such plant studies, high-resolution images of radiocesium are required without contact with the subjects. Cherenkov light imaging of beta radionuclides has inherently high resolution and is promising for plant research. Since 137Cs and 134Cs emit beta particles, Cherenkov light imaging should be useful for imaging radiocesium distribution. Consequently, we developed and tested a Cherenkov light imaging system. We used a high-sensitivity cooled charge-coupled device (CCD) camera (Hamamatsu Photonics, ORCA2-ER) for imaging Cherenkov light from 137Cs. A bright lens (Xenon, F-number: 0.95, lens diameter: 25 mm) was mounted on the camera and placed in a black box. With a 100-μm 137Cs point source, we obtained 220-μm spatial resolution in the Cherenkov light image. With a 1-mm diameter, 320-kBq 137Cs point source, the source was distinguished within 2 s. We successfully obtained Cherenkov light images of a plant whose root was dipped in a 137Cs solution and of radiocesium-containing samples, as well as line and character phantom images, with our imaging system. Cherenkov light imaging is promising for high-resolution imaging of radiocesium distribution without contacting the subject.

  20. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  1. Wired and Wireless Camera Triggering with Arduino

    NASA Astrophysics Data System (ADS)

    Kauhanen, H.; Rönnholm, P.

    2017-10-01

    Synchronous triggering is an important task that allows simultaneous data capture from multiple cameras. Accurate synchronization enables 3D measurements of moving objects or from a moving platform. In this paper, we describe one wired and four wireless variations of Arduino-based low-cost remote trigger systems designed to provide a synchronous trigger signal for industrial cameras. Our wireless systems utilize 315 MHz or 434 MHz frequencies with noise filtering capacitors. In order to validate the synchronization accuracy, we developed a prototype of a rotating trigger detection system (named RoTriDeS). This system is suitable to detect the triggering accuracy of global shutter cameras. As a result, the wired system indicated an 8.91 μs mean triggering time difference between two cameras. Corresponding mean values for the four wireless triggering systems varied between 7.92 and 9.42 μs. Presented values include both camera-based and trigger-based desynchronization. Arduino-based triggering systems appeared to be feasible, and they have the potential to be extended to more complicated triggering systems.
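
    Given per-trigger timing measurements of the RoTriDeS kind, the reported figures reduce to simple statistics; the sample values below are invented for illustration only.

        import statistics

        # Hypothetical start-time differences between two cameras (microseconds).
        deltas_us = [8.1, 9.4, 8.7, 9.0, 8.5, 9.3, 8.9, 9.2]

        print(f"mean {statistics.mean(deltas_us):.2f} us, "
              f"jitter {statistics.stdev(deltas_us):.2f} us")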

  2. Assessment of photographs from wildlife monitoring cameras in Drakes Estero, Point Reyes National Seashore, California

    USGS Publications Warehouse

    Lellis, William A.; Blakeslee, Carrie J.; Allen, Laurie K.; Molnia, Bruce F.; Price, Susan D.; Bristol, R. Sky; Stewart, Brent

    2012-01-01

    Between 2007 and 2010, National Park Service (NPS) staff at the Point Reyes National Seashore, California, collected over 300,000 photographic images of Drakes Estero from remotely operated wildlife monitoring cameras. The purpose of the systems was to obtain photographic data to help understand possible relationships between anthropogenic activities and Pacific harbor seal (Phoca vitulina richardsi) behavior and distribution. The value of the NPS photographs for use in assessing the frequency and impacts of seal disturbance and displacement in Drakes Estero has been debated. In September 2011, the NPS determined that the photographs did not provide meaningful information for development of a Draft Environmental Impact Statement (DEIS) for the Drakes Bay Oyster Company Special Use Permit. Limitations of the photographs included lack of study design, poor photographic quality, inadequate field of view, incomplete estuary coverage, camera obstructions, and weather limitations. The Marine Mammal Commission (MMC) reviewed the scientific data underpinning the Drakes Bay Oyster Company DEIS in November 2011 and recommended further analysis of the NPS photographs for use in characterizing rates and consequences of seal disturbance (Marine Mammal Commission, 2011). In response to that recommendation, the NPS asked the U.S. Geological Survey (USGS) to conduct an independent review of the photographs and render an opinion on the utility of the remote camera data for informing the environmental impact analyses included in the DEIS. In consultation with the NPS, we selected the 2008 photographic dataset for detailed evaluation because it covers a full harbor seal breeding season (March 1 to June 30), provides two fields of view (two cameras were deployed), and represents a time period when cameras were most consistently deployed and maintained. The NPS requested that the photographs be evaluated in absence of other data or information pertaining to seal and human activity in the estuary and that we focus on the extent to which the photographs could be used in understanding the relationship between human activity (including commercial oyster production) and harbor seal disturbance and distribution in the estuary.

  3. Error modeling and analysis of star cameras for a class of 1U spacecraft

    NASA Astrophysics Data System (ADS)

    Fowler, David M.

    As spacecraft today become increasingly smaller, the demand for smaller components and sensors rises as well. The smartphone, a cutting-edge consumer technology, has an impressive collection of both sensors and processing capabilities and may have the potential to fill this demand in the spacecraft market. If the technologies of a smartphone can be used in space, the cost of building miniature satellites would drop significantly and give a boost to the aerospace and scientific communities. Concentrating on the problem of spacecraft orientation, this study lays the groundwork for determining the capabilities of a smartphone camera acting as a star camera. Orientations determined from star images taken with a smartphone camera are compared to those from higher-quality cameras in order to determine the associated accuracies. The results of the study reveal the abilities of low-cost off-the-shelf imagers in space and give a starting point for future research in the field. The study began with a complete geometric calibration of each analyzed imager, so that all comparisons start from the same base. After the cameras were calibrated, image processing techniques were introduced to correct for atmospheric, lens, and image sensor effects. Orientations for each test image are calculated by identifying the stars exposed on each image. Analyses of these orientations allow the overall errors of each camera to be defined and provide insight into the abilities of low-cost imagers.
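
    Once the stars on an image are identified and matched to catalog directions, the orientation follows from Wahba's problem; the SVD solution sketched below is the generic method, not necessarily this study's exact pipeline.

        import numpy as np

        def attitude_from_stars(body_vecs, inertial_vecs):
            # body_vecs, inertial_vecs: (n, 3) unit vectors of matched stars.
            # Find rotation R minimizing sum ||body - R @ inertial||^2.
            B = body_vecs.T @ inertial_vecs       # attitude profile matrix
            U, _, Vt = np.linalg.svd(B)
            d = np.sign(np.linalg.det(U @ Vt))    # enforce a proper rotation
            return U @ np.diag([1.0, 1.0, d]) @ Vt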

  4. Dynamic photoelasticity by TDI imaging

    NASA Astrophysics Data System (ADS)

    Asundi, Anand K.; Sajan, M. R.

    2001-06-01

    High-speed photographic systems like the image rotation camera, the Cranz-Schardin camera, and the drum camera are typically used for the recording and visualization of dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore, they are all based on photographic film recording, which requires time-consuming and tedious wet processing of the films. Digital cameras are replacing conventional cameras to a certain extent in static experiments. Recently, there has been much interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration mode for digitally recording dynamic photoelastic stress patterns. Applications to strobe and streak photoelastic pattern recording, and the system's limitations, are explained in the paper.

  5. Fiber optic TV direct

    NASA Technical Reports Server (NTRS)

    Kassak, John E.

    1991-01-01

    The objective of the operational television (OTV) technology effort was to develop a multiple-camera system (up to 256 cameras) for NASA Kennedy installations in which camera video, synchronization, control, and status data are transmitted bidirectionally via a single fiber cable at distances in excess of five miles. It is shown that benefits such as improved video performance, immunity from electromagnetic and radio-frequency interference, elimination of repeater stations, and greater system configuration flexibility can be realized by applying the proven fiber-optic transmission concept. The control system will marry the lens, pan-and-tilt, and camera control functions into a modular, Local Area Network (LAN)-based control network. Such a system does not exist commercially at present, since the television broadcast industry's current practice is to divorce the positional controls from the camera control system. The application software developed for this system will have direct applicability to similar systems in industry using LAN-based control systems.

  6. Levels of Autonomy and Autonomous System Performance Assessment for Intelligent Unmanned Systems

    DTIC Science & Technology

    2014-04-01

    LIDAR and camera sensors that is driven entirely by teleoperation would be AL 0. If that same robot used its LIDAR and camera data to generate a... obstacle detection, mapping, path planning. CMMAD semi-autonomous counter-mine system (Few 2010): Talon UGV, camera, LIDAR, metal detector... NCAP framework are performed on individual UMS components and do not require mission-level evaluations. For example, bench testing of camera, LIDAR

  7. Biomechanics Analysis of Combat Sport (Silat) By Using Motion Capture System

    NASA Astrophysics Data System (ADS)

    Zulhilmi Kaharuddin, Muhammad; Badriah Khairu Razak, Siti; Ikram Kushairi, Muhammad; Syawal Abd. Rahman, Mohamed; An, Wee Chang; Ngali, Z.; Siswanto, W. A.; Salleh, S. M.; Yusup, E. M.

    2017-01-01

    ‘Silat’ is a Malay traditional martial art that is practiced at both amateur and professional levels. The intensity of the motion spurs scientific research in biomechanics. The main purpose of this abstract is to present the biomechanics method used in the study of ‘silat’. Using a 3D depth-camera motion capture system, two subjects performed ‘Jurus Satu’ in three repetitions each, with one subject set as the benchmark for the research. The videos are captured and their data processed by the 3D depth-camera server system in the form of 16 3D body-joint coordinates, which are then transformed into displacement, velocity, and acceleration components using Microsoft Excel for data calculation and Matlab for simulation of the body. The translated data serve as input to differentiate both subjects’ execution of the ‘Jurus Satu’. Nine primary movements, with the addition of five secondary movements, are observed visually frame by frame from the simulation to find the exact frame in which each movement takes place. Further analysis differentiates the two subjects’ execution by referring to the mean and standard deviation of the joints for each stated parameter. The findings provide useful data on joint kinematic parameters, help improve the execution of ‘Jurus Satu’, and exhibit the process of learning a relatively unknown movement through the use of a motion capture system.
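
    The transformation of the tracked joint coordinates into velocity and acceleration amounts to finite differencing at the capture frame rate, for example:

        import numpy as np

        def joint_kinematics(positions, fps):
            # positions: (n_frames, 3) track of one joint in metres.
            dt = 1.0 / fps
            velocity = np.gradient(positions, dt, axis=0)       # m/s
            acceleration = np.gradient(velocity, dt, axis=0)    # m/s^2
            return velocity, acceleration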

  8. The imaging system design of three-line LMCCD mapping camera

    NASA Astrophysics Data System (ADS)

    Zhou, Huai-de; Liu, Jin-Guo; Wu, Xing-Xing; Lv, Shi-Liang; Zhao, Ying; Yu, Da

    2011-08-01

    In this paper, the authors first introduce the theory of the LMCCD (line-matrix CCD) mapping camera and the composition of its imaging system. Secondly, several pivotal designs of the imaging system are introduced, such as the design of the focal plane module, the video signal processing, the controller of the imaging system, the synchronization of photography between the forward, nadir, and backward cameras, and between the line and matrix CCDs of the nadir camera. Finally, the test results of the LMCCD mapping camera imaging system are presented, as follows: the precision of synchronous photography between the forward, nadir, and backward cameras is better than 4 ns, as is that between the line and matrix CCDs of the nadir camera; the photography interval of the nadir camera's line-matrix CCD satisfies the buffer requirements of the LMCCD focal plane module; the SNR tested in the laboratory is better than 95 under typical working conditions (solar incidence angle of 30°, earth-surface reflectivity of 0.3) for each CCD image; and the temperature of the focal plane module is kept below 30 °C over a 15-minute working period. All of these results satisfy the requirements on synchronous photography, focal plane module temperature control, and SNR, guaranteeing the precision needed for satellite photogrammetry.

  9. Optical Meteor Systems Used by the NASA Meteoroid Environment Office

    NASA Technical Reports Server (NTRS)

    Kingery, A. M.; Blaauw, R. C.; Cooke, W. J.; Moser, D. E.

    2015-01-01

    The NASA Meteoroid Environment Office (MEO) uses two main meteor camera networks to characterize the meteoroid environment: an all-sky system and a wide-field system, to study cm- and mm-size meteors respectively. The NASA All Sky Fireball Network consists of fifteen meteor video cameras in the United States, with plans to expand to eighteen cameras by the end of 2015. The camera design and the All-Sky Guided and Real-time Detection (ASGARD) meteor detection software [1, 2] were adopted from the University of Western Ontario's Southern Ontario Meteor Network (SOMN). After seven years of operation, the network has detected over 12,000 multi-station meteors, including meteors from at least 53 different meteor showers. The network is used for speed distribution determination, characterization of meteor showers and sporadic sources, and informing the public about bright meteor events. The NASA Wide Field Meteor Network was established in December 2012 with two cameras and expanded to eight cameras in December 2014. The network saw 5470 meteors over two years of operation with two cameras and detected 3423 meteors in the first five months of operation with eight cameras (Dec 12, 2014 - May 12, 2015). We expect to see over 10,000 meteors per year with the expanded system. The cameras have a 20-degree field of view and an approximate limiting meteor magnitude of +5. The network's primary goal is determining the nightly shower and sporadic meteor fluxes. Both camera networks function almost fully autonomously, with little human interaction required for upkeep and analysis. The cameras send their data to a central server for storage and automatic analysis. Every morning the server automatically generates an e-mail and web page containing an analysis of the previous night's events. The current status of the networks will be described, alongside preliminary results. In addition, future projects, including CCD photometry and a broadband meteor color camera system, will be discussed.

  10. The effectiveness of detection of splashed particles using a system of three integrated high-speed cameras

    NASA Astrophysics Data System (ADS)

    Ryżak, Magdalena; Beczek, Michał; Mazur, Rafał; Sochan, Agata; Bieganowski, Andrzej

    2017-04-01

    The phenomenon of splash, one of the factors causing erosion of the soil surface, is the subject of research by various scientific teams. One of the most efficient methods of observing and analyzing this phenomenon is the use of high-speed cameras that record particles at 2000 frames per second or more. Analysis of the splash phenomenon with high-speed cameras and specialized software can reveal, among other things, the number of ejected particles, their speeds, trajectories, and the distances over which they were transported. The paper presents an attempt to evaluate the efficiency of detection of splashed particles with a set of 3 cameras (Vision Research MIRO 310) and the Dantec Dynamics Studio software, using its 3D module (Volumetric PTV). In order to assess the effectiveness of estimating the number of particles, the experiment was performed on glass beads with a diameter of 0.5 mm (corresponding to the sand fraction). Water droplets with a diameter of 4.2 mm fell onto a sample from a height of 1.5 m. Two types of splashed particles were observed: particles with a low range (up to 18 mm), splashed at larger angles, and particles with a high range (up to 118 mm), splashed at smaller angles. The detection efficiency for the number of splashed particles estimated by the software was 45-65% for particles with a large range. The effectiveness of particle detection by the software was calculated by comparison with the number of beads that fell on an adhesive surface around the sample. This work was partly financed by the National Science Centre, Poland; project no. 2014/14/E/ST10/00851.

  11. 241-AZ-101 Waste Tank Color Video Camera System Shop Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WERRY, S.M.

    2000-03-23

    This report includes shop acceptance test results. The test was performed prior to installation at tank AZ-101. Both the camera system and camera purge system were originally sought and procured as a part of initial waste retrieval project W-151.

  12. Integrated inertial stellar attitude sensor

    NASA Technical Reports Server (NTRS)

    Brady, Tye M. (Inventor); Kourepenis, Anthony S. (Inventor); Wyman, Jr., William F. (Inventor)

    2007-01-01

    An integrated inertial stellar attitude sensor for an aerospace vehicle includes a star camera system, a gyroscope system, a controller system for synchronously integrating an output of said star camera system and an output of said gyroscope system into a stream of data, and a flight computer responsive to said stream of data for determining from the star camera system output and the gyroscope system output the attitude of the aerospace vehicle.

  13. The first satellite laser echoes recorded on the streak camera

    NASA Technical Reports Server (NTRS)

    Hamal, Karel; Prochazka, Ivan; Kirchner, Georg; Koidl, F.

    1993-01-01

    The application of a streak camera with a circular sweep to satellite laser ranging is described. The Modular Streak Camera system employing the circular sweep option was integrated into a conventional satellite laser ranging system, and experimental satellite tracking and ranging was performed. The first satellite laser echo streak camera records are presented.

  14. Webb Instruments Perfected to Microscopic Levels

    NASA Image and Video Library

    2014-06-20

    Dressed in a cleanroom suit to prevent contamination, Optics Technician Jeff Gum aligns a replacement Focal Plane Assembly (FPA) with a powerful three-dimensional microscope at NASA's Goddard Space Flight Center in Greenbelt, Md. This FPA will be installed on the Near Infrared Camera (NIRCam) instrument, which has unique components that are individually tailored to see in a particular infrared wavelength range. By using the microscope, Gum ensures the FPA detectors are characterized and ready for installation onto NIRCam, the James Webb Space Telescope's primary imager that will see the light from the earliest stars and galaxies that formed in the universe. Credit: NASA/Goddard/Chris Gunn

  15. Hubble Finds a Lenticular Galaxy Standing Out in the Crowd

    NASA Image and Video Library

    2017-12-08

    A lone source shines out brightly from the dark expanse of deep space, glowing softly against a picturesque backdrop of distant stars and colorful galaxies. Captured by the NASA/ESA Hubble Space Telescope’s Advanced Camera for Surveys (ACS), this scene shows PGC 83677, a lenticular galaxy — a galaxy type that sits between the more familiar elliptical and spiral varieties. It reveals both the relatively calm outskirts and intriguing core of PGC 83677. Here, studies have uncovered signs of a monstrous black hole that is spewing out high-energy X-rays and ultraviolet light. Credit: NASA/ESA/Hubble; acknowledgements: Judy Schmidt (Geckzilla)

  16. Baseline Design and Performance Analysis of Laser Altimeter for Korean Lunar Orbiter

    NASA Astrophysics Data System (ADS)

    Lim, Hyung-Chul; Neumann, Gregory A.; Choi, Myeong-Hwan; Yu, Sung-Yeol; Bang, Seong-Cheol; Ka, Neung-Hyun; Park, Jong-Uk; Choi, Man-Soo; Park, Eunseo

    2016-09-01

    Korea’s lunar exploration project includes the launching of an orbiter, a lander (including a rover), and an experimental orbiter (referred to as a lunar pathfinder). Laser altimeters have played an important scientific role in lunar, planetary, and asteroid exploration missions since their first use in 1971 onboard the Apollo 15 mission to the Moon. In this study, a laser altimeter was proposed as a scientific instrument for the Korean lunar orbiter, which will be launched by 2020, to study the global topography of the surface of the Moon and its gravitational field and to support other payloads such as a terrain mapping camera or spectral imager. This study presents the baseline design and performance model for the proposed laser altimeter. Additionally, the study discusses the expected performance based on numerical simulation results. The simulation results indicate that the design of system parameters satisfies performance requirements with respect to detection probability and range error even under unfavorable conditions.

  17. Dying Star Shrouded by a Blanket of Hailstones Forms the Bug Nebula

    NASA Image and Video Library

    2017-12-08

    Release Date: May 3, 2004 A Dying Star Shrouded by a Blanket of Hailstones Forms the Bug Nebula (NGC 6302) The Bug Nebula, NGC 6302, is one of the brightest and most extreme planetary nebulae known. The fiery, dying star at its center is shrouded by a blanket of icy hailstones. This NASA Hubble Wide Field Planetary Camera 2 image shows impressive walls of compressed gas, laced with trailing strands and bubbling outflows. Object Names: NGC 6302, Bug Nebula Image Type: Astronomical Credit: NASA, ESA and A. Zijlstra (UMIST, Manchester, UK) To learn more about this image go to: hubblesite.org/gallery/album/nebula/pr2004046a/

  18. Electronic camera-management system for 35-mm and 70-mm film cameras

    NASA Astrophysics Data System (ADS)

    Nielsen, Allan

    1993-01-01

    Military and commercial test facilities have been tasked with the need for increasingly sophisticated data collection and data reduction. A state-of-the-art electronic control system for high-speed 35 mm and 70 mm film cameras designed to meet these tasks is described. Data collection in today's test range environment is difficult at best. The need for a completely integrated image and data collection system is mandated by the increasingly complex test environment. Instrumentation film cameras have been used on test ranges to capture images for decades. Their high frame rates coupled with exceptionally high resolution make them an essential part of any test system. In addition to documenting test events, today's camera system is required to perform many additional tasks. Data reduction to establish TSPI (time-space-position information) may be performed after a mission and is subject to all of the variables present in documenting the mission. A typical scenario would consist of multiple cameras located on tracking mounts capturing the event along with azimuth and elevation position data. Corrected data can then be reduced using each camera's time and position deltas and calculating the TSPI of the object using triangulation. An electronic camera control system designed to meet these requirements has been developed by Photo-Sonics, Inc. The feedback received from test technicians at range facilities throughout the world led Photo-Sonics to design the features of this control system. These prominent new features include: a comprehensive safety management system, full local or remote operation, frame rate accuracy of less than 0.005 percent, and phase-locking capability to IRIG-B. In fact, IRIG-B phase-lock operation of multiple cameras can reduce the time-distance delta of a test object traveling at Mach 1 to less than one inch during data reduction.
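
    To make the triangulation step concrete, the following sketch (illustrative only, not Photo-Sonics' implementation) estimates a target position from the azimuth/elevation pointing data of two time-synchronized tracking mounts at known sites, via a least-squares intersection of the two viewing rays; all coordinates and angles are made-up example values.

      # Least-squares intersection of two viewing rays given by camera
      # site positions and azimuth/elevation pointing angles.
      import numpy as np

      def ray_direction(az_deg, el_deg):
          """Unit vector for azimuth (clockwise from north/+Y) and elevation."""
          az, el = np.radians(az_deg), np.radians(el_deg)
          return np.array([np.cos(el) * np.sin(az),
                           np.cos(el) * np.cos(az),
                           np.sin(el)])

      def triangulate(p1, d1, p2, d2):
          """Point closest to both rays p_i + t_i * d_i (least squares)."""
          A = np.array([[d1 @ d1, -(d1 @ d2)],
                        [d1 @ d2, -(d2 @ d2)]])
          b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
          t1, t2 = np.linalg.solve(A, b)
          return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

      p1 = np.array([0.0, 0.0, 10.0])      # camera site 1 (m, example)
      p2 = np.array([5000.0, 0.0, 15.0])   # camera site 2 (m, example)
      d1 = ray_direction(45.0, 30.0)       # az/el from mount 1 (example)
      d2 = ray_direction(315.0, 28.0)      # az/el from mount 2 (example)
      print(triangulate(p1, d1, p2, d2))   # estimated target position (m)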

  19. X-Ray Computed Tomography Monitors Damage in Composites

    NASA Technical Reports Server (NTRS)

    Baaklini, George Y.

    1997-01-01

    The NASA Lewis Research Center recently codeveloped a state-of-the-art x-ray CT facility (designated SMS SMARTSCAN model 100-112 CITA by Scientific Measurement Systems, Inc., Austin, Texas). This multipurpose, modularized, digital x-ray facility includes an imaging system for digital radiography, CT, and computed laminography. The system consists of a 160-kV microfocus x-ray source, a solid-state charge-coupled device (CCD) area detector, a five-axis object-positioning subassembly, and a Sun SPARCstation-based computer system that controls data acquisition and image processing. The x-ray source provides a beam spot size down to 3 microns. The area detector system consists of a 50- by 50- by 3-mm-thick terbium-doped glass fiber-optic scintillation screen, a right-angle mirror, and a scientific-grade, digital CCD camera with a resolution of 1000 by 1018 pixels and 10-bit digitization at ambient cooling. The digital output is recorded with a high-speed, 16-bit frame grabber that allows data to be binned. The detector can be configured to provide a small field-of-view, approximately 45 by 45 mm in cross section, or a larger field-of-view, approximately 60 by 60 mm in cross section. Whenever the highest spatial resolution is desired, the small field-of-view is used, and for larger samples with some reduction in spatial resolution, the larger field-of-view is used.

  20. Modulated electron-multiplied fluorescence lifetime imaging microscope: all-solid-state camera for fluorescence lifetime imaging.

    PubMed

    Zhao, Qiaole; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Raspe, Marcel; Jalink, Kees; Geert Sander de Jong, Jan; van Geest, Bert; Stoop, Karel; Young, Ian Ted

    2012-12-01

    We have built an all-solid-state camera that is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific charge coupled device design in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and optical transfer function have been studied through experiments. We are able to do lifetime measurement using our modulated, electron-multiplied fluorescence lifetime imaging microscope (MEM-FLIM) camera for various objects, e.g., fluorescein solution, fixed green fluorescent protein (GFP) cells, and GFP-actin stained live cells. A detailed comparison of a conventional microchannel plate (MCP)-based FLIM system and the MEM-FLIM system is presented. The MEM-FLIM camera shows higher resolution and a better image quality. The MEM-FLIM camera provides a new opportunity for performing frequency-domain FLIM.
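
    For context, frequency-domain FLIM generally recovers the lifetime from the measured phase shift and modulation depth of the emission relative to the excitation; the formulas below are the standard textbook estimators, not anything specific to the MEM-FLIM camera, and the numeric values are invented examples.

      # Standard frequency-domain lifetime estimators at modulation
      # frequency f: tau_phase = tan(phi)/(2*pi*f) and
      # tau_mod = sqrt(1/m^2 - 1)/(2*pi*f).
      import math

      def tau_from_phase(phi_rad, f_hz):
          return math.tan(phi_rad) / (2 * math.pi * f_hz)

      def tau_from_modulation(m, f_hz):
          return math.sqrt(1.0 / m**2 - 1.0) / (2 * math.pi * f_hz)

      f = 40e6                           # modulation frequency, 40 MHz (example)
      phi = math.radians(45.0)           # measured phase shift (example)
      m = 0.71                           # measured modulation depth (example)
      print(tau_from_phase(phi, f))      # ~4.0 ns
      print(tau_from_modulation(m, f))   # ~4.0 ns

    For a mono-exponential decay the two estimates coincide; a systematic difference between them indicates a mixture of lifetimes in the pixel.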

  1. Light-Directed Ranging System Implementing Single Camera System for Telerobotics Applications

    NASA Technical Reports Server (NTRS)

    Wells, Dennis L. (Inventor); Li, Larry C. (Inventor); Cox, Brian J. (Inventor)

    1997-01-01

    A laser-directed ranging system has utility for use in various fields, such as telerobotics applications and other applications involving physically handicapped individuals. The ranging system includes a single video camera and a directional light source such as a laser mounted on a camera platform, and a remotely positioned operator. In one embodiment, the position of the camera platform is controlled by three servo motors to orient the roll axis, pitch axis and yaw axis of the video cameras, based upon an operator input such as head motion. The laser is offset vertically and horizontally from the camera, and the laser/camera platform is directed by the user to point the laser and the camera toward a target device. The image produced by the video camera is processed to eliminate all background images except for the spot created by the laser. This processing is performed by creating a digital image of the target prior to illumination by the laser, and then eliminating common pixels from the subsequent digital image which includes the laser spot. A reference point is defined at a point in the video frame, which may be located outside of the image area of the camera. The disparity between the digital image of the laser spot and the reference point is calculated for use in a ranging analysis to determine range to the target.
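
    The differencing step described above can be sketched as follows; the threshold and the parallel-axis range formula are illustrative assumptions, since the patent's actual ranging analysis depends on the calibrated laser/camera offset geometry.

      # Isolate the laser spot by differencing frames captured with the
      # laser off and on, then range from the spot-to-reference disparity.
      import numpy as np

      def find_laser_spot(frame_off, frame_on, threshold=30):
          """Centroid (row, col) of pixels brightened by the laser."""
          diff = frame_on.astype(int) - frame_off.astype(int)
          ys, xs = np.nonzero(diff > threshold)   # common pixels cancel out
          if xs.size == 0:
              return None
          return ys.mean(), xs.mean()

      def range_from_disparity(spot_x, ref_x, baseline_m, focal_px):
          """Triangulated range, assuming the laser axis is parallel to
          the camera axis with a horizontal offset of baseline_m."""
          return baseline_m * focal_px / abs(spot_x - ref_x)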

  2. Observations of the Perseids 2007 with SPOSH cameras

    NASA Astrophysics Data System (ADS)

    Oberst, J.; Flohrer, J.; Tost, W.; Elgner, S.; Koschny, D.; McAuliffe, J.

    2008-09-01

    A large number of Perseid meteors were captured during a 2007 campaign carried out in Germany and Austria using SPOSH (Smart Panoramic Optical Sensor Head) cameras. The SPOSH camera (developed at DLR and Jena Optronik under contract to ESA/ESTEC) has a custom-made optical system with a field of view of 120° x 120° (170° x 170° over the image diagonal) and features a back-illuminated 1024 x 1024 CCD, which ensures high sensitivity as well as high geometric and photometric accuracy. Images are taken at a rate of one every two seconds. Currently 4 SPOSH cameras are available, two of which are equipped with rotating shutters to obtain meteor speed information. The 4 SPOSH cameras were deployed at Neustrelitz and Liebenhof (near Berlin, Germany), as well as Gahberg and Kanzelhöhe (Austria). Two more commercial cameras (Canon EOS) at separate locations were included in our campaign to ensure multiple observations of the meteors in the case of bad weather. Images were taken during the nights of August 10-14, with excellent viewing conditions at all stations during the night of the Perseid maximum, Aug 12/13. Following the campaign, geometric calibrations of the images and comprehensive searches for meteors in the data were carried out. We recorded more than 3300 meteors, among which there were 400 double-station observations. During the peak of the shower, 180 meteors were recorded within 30 minutes from Kanzelhöhe alone (the observatory, at an altitude of 1500 m, had extremely clear sky). Hence, we have an unusually large data set, which we estimate includes meteors as faint as m=+6. Besides Perseids, a number of sporadic meteors and members of other showers were identified. A full trajectory analysis has been performed for a good number of meteors so far, with most data still awaiting further analysis. This poster presentation will give a full account of the scientific results of the campaign. Furthermore, we will report lessons learned from the handling of the 2007 campaign, including modified instrumentation, an optimized set-up procedure for the stations, and streamlined processing with computer-aided meteor detection in images. The campaign was carried out with the involvement of students and trainees from the Technical University Berlin and enjoyed funding support from EuroPlanet.

  3. Fast auto-acquisition tomography tilt series by using HD video camera in ultra-high voltage electron microscope.

    PubMed

    Nishi, Ryuji; Cao, Meng; Kanaji, Atsuko; Nishida, Tomoki; Yoshida, Kiyokazu; Isakozawa, Shigeto

    2014-11-01

    The ultra-high voltage electron microscope (UHVEM) H-3000, with the world's highest acceleration voltage of 3 MV, can observe remarkable three-dimensional microstructures in microns-thick samples [1]. Acquiring a tilt series for electron tomography is laborious work, so an automatic technique is highly desirable. We proposed the Auto-Focus system using image Sharpness (AFS) [2,3] for UHVEM tomography tilt-series acquisition. In this method, five images with different defocus values are first acquired and their image sharpness is calculated. The sharpness values are then fitted to a quasi-Gaussian function to determine the best focus value [3]. Defocused images acquired by the slow-scan CCD (SS-CCD) camera (Hitachi F486BK) are of high quality, but one minute is needed to acquire five defocused images. In this study, we introduce a high-definition video camera (HD video camera; Hamamatsu Photonics K.K. C9721S) for fast image acquisition [4]. It is an analog camera, but the camera image is captured by a PC with an effective image resolution of 1280×1023 pixels. This resolution is lower than the 4096×4096 pixels of the SS-CCD camera; however, the HD video camera captures one image in only 1/30 second. In exchange for the faster acquisition, the S/N of the images is low. To improve the S/N, 22 captured frames are integrated so that the sharpness of each image has a sufficiently low fitting error. As a countermeasure against the low resolution, we selected a large defocus step, typically five times the manual defocus step, to discriminate between different defocused images. By using the HD video camera for the autofocus process, the time consumed by each autofocus procedure was reduced to about six seconds. Correcting the image position took one second, so the total correction time was seven seconds, shorter by one order of magnitude than with the SS-CCD camera. When we used the SS-CCD camera for final image capture, it took 30 seconds to record one tilt image, and a tilt series of 61 images could be obtained within 30 minutes. Accuracy and repeatability were good enough for practical use (Fig. 1: objective lens current change with tilt angle during acquisition of a tomography series; sample: a rat hepatocyte, thickness: 2 µm, magnification: 4k, acc. voltage: 2 MV; tilt angle range ±60 degrees with a 2-degree step. Two series acquired in the same area were almost identical, and the deviation was smaller than the minimum manual step, so the autofocus worked well). We successfully halved the total acquisition time of a tomography tilt series. We also developed computer-aided three-dimensional (3D) visualization and analysis software for electron tomography, "HawkC", which can sectionalize 3D data semi-automatically [5,6]. If this auto-acquisition system is used together with the IMOD reconstruction software [7] and HawkC, on-line UHVEM tomography becomes possible; such a system could assist pathology examination in the future. This work was supported by the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan, under a Grant-in-Aid for Scientific Research (Grant No. 23560024, 23560786), and by SENTAN, Japan Science and Technology Agency, Japan.
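
    A minimal sketch of the AFS idea, assuming gradient energy as the sharpness metric and a plain Gaussian as the fitted model (the papers cited above define the exact metric and the quasi-Gaussian form):

      # Compute sharpness for a set of defocus values and fit a Gaussian;
      # the fitted peak position is taken as the best focus.
      import numpy as np
      from scipy.optimize import curve_fit

      def sharpness(img):
          """Gradient-energy sharpness of a 2-D image array."""
          gy, gx = np.gradient(img.astype(float))
          return float(np.mean(gx**2 + gy**2))

      def gaussian(x, a, mu, sigma, offset):
          return a * np.exp(-(x - mu)**2 / (2 * sigma**2)) + offset

      def best_focus(defocus_values, images):
          """Fit sharpness(defocus) and return the peak defocus value."""
          s = np.array([sharpness(im) for im in images])
          p0 = [s.max() - s.min(), defocus_values[np.argmax(s)],
                np.ptp(defocus_values) / 4, s.min()]
          popt, _ = curve_fit(gaussian, defocus_values, s, p0=p0)
          return popt[1]   # mu = defocus value of maximum sharpness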

  4. Fuzzy logic control for camera tracking system

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Fritz, R. H.; Giarratano, J.; Jani, Yashvant

    1992-01-01

    A concept utilizing fuzzy theory has been developed for a camera tracking system to provide support for proximity operations and traffic management around the Space Station Freedom. Fuzzy sets and fuzzy logic based reasoning are used in a control system which utilizes images from a camera and generates required pan and tilt commands to track and maintain a moving target in the camera's field of view. This control system can be implemented on a fuzzy chip to provide an intelligent sensor for autonomous operations. Capabilities of the control system can be expanded to include approach, handover to other sensors, caution and warning messages.
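
    As a rough illustration of the idea (a generic fuzzy-style controller, not the system described above), the target's pixel offset from the image center can be mapped to a pan-rate command through triangular membership functions and centroid defuzzification; all breakpoints and rates below are invented values.

      # Tiny fuzzy-style pan controller: memberships over the normalized
      # horizontal error weight fixed rate commands.
      def tri(x, a, b, c):
          """Triangular membership peaking at b, zero outside [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def pan_rate(err):
          """err: target x-offset from image center, normalized to [-1, 1]."""
          rules = [(tri(err, -1.5, -1.0, -0.2), -10.0),   # far left
                   (tri(err, -0.6, -0.25, 0.0),  -3.0),   # left
                   (tri(err, -0.25, 0.0, 0.25),   0.0),   # centered
                   (tri(err,  0.0,  0.25, 0.6),   3.0),   # right
                   (tri(err,  0.2,  1.0,  1.5),  10.0)]   # far right
          den = sum(w for w, _ in rules)
          return sum(w * u for w, u in rules) / den if den else 0.0

      print(pan_rate(0.4))   # modest rightward pan rate (deg/s, illustrative)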

  5. Single-Command Approach and Instrument Placement by a Robot on a Target

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Cheng, Yang

    2005-01-01

    AUTOAPPROACH is a computer program that enables a mobile robot to approach a target autonomously, starting from a distance of as much as 10 m, in response to a single command. AUTOAPPROACH is used in conjunction with (1) software that analyzes images acquired by stereoscopic cameras aboard the robot and (2) navigation and path-planning software that utilizes odometer readings along with the output of the image-analysis software. Intended originally for application to an instrumented, wheeled robot (rover) in scientific exploration of Mars, AUTOAPPROACH could be adapted to terrestrial applications, notably including the robotic removal of land mines and other unexploded ordnance. A human operator generates the approach command by selecting the target in images acquired by the robot cameras. The approach path consists of multiple legs. Feature points are derived from images that contain the target and are thereafter tracked to correct odometric errors and iteratively refine estimates of the position and orientation of the robot relative to the target on successive legs. The approach is terminated when the robot attains the position and orientation required for placing a scientific instrument at the target. The workspace of the robot arm is then autonomously checked for self/terrain collisions prior to the deployment of the scientific instrument onto the target.

  6. Variation in detection among passive infrared triggered-cameras used in wildlife research

    USGS Publications Warehouse

    Damm, Philip E.; Grand, James B.; Barnett, Steven W.

    2010-01-01

    Precise and accurate estimates of demographics such as age structure, productivity, and density are necessary in determining habitat and harvest management strategies for wildlife populations. Surveys using automated cameras are becoming an increasingly popular tool for estimating these parameters. However, most camera studies fail to incorporate detection probabilities, leading to parameter underestimation. The objective of this study was to determine the sources of heterogeneity in detection for trail cameras that incorporate a passive infrared (PIR) triggering system sensitive to heat and motion. Images were collected at four baited sites within the Conecuh National Forest, Alabama, using three cameras at each site operating continuously over the same seven-day period. Detection was estimated for four groups of animals based on taxonomic group and body size. Our hypotheses of detection considered variation among bait sites and cameras. The best model (w=0.99) estimated different rates of detection for each camera in addition to different detection rates for four animal groupings. Factors that explain this variability might include poor manufacturing tolerances, variation in PIR sensitivity, animal behavior, and species-specific infrared radiation. Population surveys using trail cameras with PIR systems must incorporate detection rates for individual cameras. Incorporating time-lapse triggering systems into survey designs should eliminate issues associated with PIR systems.

  7. A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.

    PubMed

    Yang, Tao; Li, Guangpo; Li, Jing; Zhang, Yanning; Zhang, Xiaoqiang; Zhang, Zhuoyue; Li, Zhi

    2016-08-30

    This paper proposes a novel infrared camera array guidance system with the capability to track and provide the real-time position and speed of a fixed-wing Unmanned Aerial Vehicle (UAV) during a landing process. The system mainly includes three novel parts: (1) an infrared camera array and near-infrared laser lamp based cooperative long-range optical imaging module; (2) a large-scale outdoor camera array calibration module; and (3) a laser marker detection and 3D tracking module. Extensive automatic landing experiments with fixed-wing flights demonstrate that our infrared camera array system has the unique ability to guide the UAV to land safely and accurately in real time. Moreover, the measurement and control distance of our system is more than 1000 m. The experimental results also demonstrate that our system can be used for UAV automatic accurate landing in Global Position System (GPS)-denied environments.

  8. Global calibration of multi-cameras with non-overlapping fields of view based on photogrammetry and reconfigurable target

    NASA Astrophysics Data System (ADS)

    Xia, Renbo; Hu, Maobang; Zhao, Jibin; Chen, Songlin; Chen, Yueling

    2018-06-01

    Multi-camera vision systems are often needed to achieve large-scale and high-precision measurement because these systems have larger fields of view (FOV) than a single camera. Multiple cameras may have no or narrow overlapping FOVs in many applications, which pose a huge challenge to global calibration. This paper presents a global calibration method for multi-cameras without overlapping FOVs based on photogrammetry technology and a reconfigurable target. Firstly, two planar targets are fixed together and made into a long target according to the distance between the two cameras to be calibrated. The relative positions of the two planar targets can be obtained by photogrammetric methods and used as invariant constraints in global calibration. Then, the reprojection errors of target feature points in the two cameras’ coordinate systems are calculated at the same time and optimized by the Levenberg–Marquardt algorithm to find the optimal solution of the transformation matrix between the two cameras. Finally, all the camera coordinate systems are converted to the reference coordinate system in order to achieve global calibration. Experiments show that the proposed method has the advantages of high accuracy (the RMS error is 0.04 mm) and low cost and is especially suitable for on-site calibration.
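
    Prior to the optimization, the initial camera-to-camera transform follows from simple pose chaining: each camera observes one of the two planar targets, and the photogrammetrically measured target-to-target transform ties the two chains together. A minimal sketch with homogeneous 4x4 poses (all numeric values are placeholders; in practice the camera-target poses come from target pose estimation and T_t1_t2 from photogrammetry):

      # Chain homogeneous transforms: T_c1_c2 = T_c1_t1 @ T_t1_t2 @ inv(T_c2_t2),
      # where T_a_b maps coordinates from frame b into frame a.
      import numpy as np

      def make_pose(R, t):
          T = np.eye(4)
          T[:3, :3] = R
          T[:3, 3] = t
          return T

      def invert_pose(T):
          R, t = T[:3, :3], T[:3, 3]
          return make_pose(R.T, -R.T @ t)

      T_c1_t1 = make_pose(np.eye(3), [0.0, 0.0, 2.0])   # target 1 in camera 1
      T_c2_t2 = make_pose(np.eye(3), [0.0, 0.0, 2.5])   # target 2 in camera 2
      T_t1_t2 = make_pose(np.eye(3), [1.2, 0.0, 0.0])   # fixed target-to-target

      T_c1_c2 = T_c1_t1 @ T_t1_t2 @ invert_pose(T_c2_t2)
      print(T_c1_c2)   # pose of camera 2 expressed in the camera 1 frame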

  9. Uav Cameras: Overview and Geometric Calibration Benchmark

    NASA Astrophysics Data System (ADS)

    Cramer, M.; Przybilla, H.-J.; Zurhorst, A.

    2017-08-01

    Different UAV platforms and sensors are already used in mapping, many of them equipped with (sometimes modified) cameras known from the consumer market. Even though these systems normally fulfil their requested mapping accuracy, the question arises: which system performs best? This calls for a benchmark to check selected UAV-based camera systems in well-defined, reproducible environments. Such a benchmark is attempted in this work. Nine different cameras used on UAV platforms, representing typical camera classes, are considered. The focus here is on geometry, which is tightly linked to the geometric calibration of the system. In most applications the calibration is performed in-situ, i.e. calibration parameters are obtained as part of the project data itself. This is often motivated by the fact that consumer cameras do not keep a constant interior geometry and thus cannot be seen as metric cameras. Still, some of the commercial systems are quite stable over time, as has been proven by repeated (terrestrial) calibration runs. Already (pre-)calibrated systems may offer advantages, especially when the block geometry of the project does not allow for a stable and sufficient in-situ calibration; in such scenarios, close-to-metric UAV cameras may have advantages. Empirical airborne test flights in a calibration field have shown how block geometry influences the estimated calibration parameters and how consistently the parameters from lab calibration can be reproduced.

  10. A novel camera localization system for extending three-dimensional digital image correlation

    NASA Astrophysics Data System (ADS)

    Sabato, Alessandro; Reddy, Narasimha; Khan, Sameer; Niezrecki, Christopher

    2018-03-01

    The monitoring of civil, mechanical, and aerospace structures is important, especially as these systems approach or surpass their design life. Often, Structural Health Monitoring (SHM) relies on sensing techniques for condition assessment. Advancements achieved in camera technology and optical sensors have made three-dimensional (3D) Digital Image Correlation (DIC) a valid technique for extracting structural deformations and geometry profiles. Prior to making stereophotogrammetry measurements, a calibration has to be performed to obtain the vision system's extrinsic and intrinsic parameters. This means that the position of the cameras relative to each other (i.e. separation distance, camera angles, etc.) must be determined. Typically, cameras are placed on a rigid bar to prevent any relative motion between the cameras. This constraint limits the utility of the 3D-DIC technique, especially as it is applied to monitor large-sized structures and from various fields of view. In this preliminary study, the design of a multi-sensor system is proposed to extend 3D-DIC's capability and allow for easier calibration and measurement. The suggested system relies on a MEMS-based Inertial Measurement Unit (IMU) and a 77 GHz radar sensor for measuring the orientation and relative distance of the stereo cameras. The feasibility of the proposed combined IMU-radar system is evaluated through laboratory tests, demonstrating its ability to determine the camera positions in space for performing accurate 3D-DIC calibration and measurements.

  11. Depth Perception In Remote Stereoscopic Viewing Systems

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.; Von Sydow, Marika

    1989-01-01

    Report describes theoretical and experimental studies of perception of depth by human operators through stereoscopic video systems. Purpose of such studies is to optimize dual-camera configurations used to view workspaces of remote manipulators at distances of 1 to 3 m from cameras. According to analysis, static stereoscopic depth distortion is decreased, without decreasing stereoscopic depth resolution, by increasing camera-to-object and intercamera distances and camera focal length. Analysis further predicts that dynamic stereoscopic depth distortion is reduced by rotating cameras around the center of a circle passing through the point of convergence of the viewing axes and the first nodal points of the two camera lenses.

  12. Multi-color pyrometry imaging system and method of operating the same

    DOEpatents

    Estevadeordal, Jordi; Nirmalan, Nirm Velumylum; Tralshawala, Nilesh; Bailey, Jeremy Clyde

    2017-03-21

    A multi-color pyrometry imaging system for a high-temperature asset includes at least one viewing port in optical communication with at least one high-temperature component of the high-temperature asset. The system also includes at least one camera device in optical communication with the at least one viewing port. The at least one camera device includes a camera enclosure and at least one camera aperture defined in the camera enclosure. The at least one camera aperture is in optical communication with the at least one viewing port. The at least one camera device also includes a multi-color filtering mechanism coupled to the enclosure. The multi-color filtering mechanism is configured to sequentially transmit photons within a first predetermined wavelength band and transmit photons within a second predetermined wavelength band that is different from the first predetermined wavelength band.
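
    The underlying principle of ratio (multi-color) pyrometry can be stated as a generic textbook relation, separate from the patented apparatus: under the Wien approximation and a graybody assumption (equal emissivity in both bands), the ratio of intensities measured in two wavelength bands determines the temperature.

      # Graybody ratio pyrometry with the Wien approximation:
      # I(lam, T) ~ lam^-5 * exp(-C2/(lam*T)), so the band ratio I1/I2
      # can be inverted in closed form for T.
      import math

      C2 = 1.4388e-2   # second radiation constant (m*K)

      def wien_intensity(lam, T):
          return lam**-5 * math.exp(-C2 / (lam * T))

      def temperature_from_ratio(ratio, lam1, lam2):
          """Solve I1/I2 = (lam2/lam1)^5 * exp(-(C2/T)(1/lam1 - 1/lam2))."""
          return C2 * (1 / lam2 - 1 / lam1) / (
              math.log(ratio) - 5 * math.log(lam2 / lam1))

      lam1, lam2 = 0.75e-6, 0.95e-6   # example filter bands (m)
      r = wien_intensity(lam1, 1500.0) / wien_intensity(lam2, 1500.0)
      print(temperature_from_ratio(r, lam1, lam2))   # recovers ~1500 K

    Sequential acquisition through two filter bands, as in the patent's filtering mechanism, supplies exactly the two band intensities that this ratio requires.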

  13. A sophisticated lander for scientific exploration of Mars: scientific objectives and implementation of the Mars-96 Small Station

    NASA Astrophysics Data System (ADS)

    Linkin, V.; Harri, A.-M.; Lipatov, A.; Belostotskaja, K.; Derbunovich, B.; Ekonomov, A.; Khloustova, L.; Kremnev, R.; Makarov, V.; Martinov, B.; Nenarokov, D.; Prostov, M.; Pustovalov, A.; Shustko, G.; Järvinen, I.; Kivilinna, H.; Korpela, S.; Kumpulainen, K.; Lehto, A.; Pellinen, R.; Pirjola, R.; Riihelä, P.; Salminen, A.; Schmidt, W.; Siili, T.; Blamont, J.; Carpentier, T.; Debus, A.; Hua, C. T.; Karczewski, J.-F.; Laplace, H.; Levacher, P.; Lognonné, Ph.; Malique, C.; Menvielle, M.; Mouli, G.; Pommereau, J.-P.; Quotb, K.; Runavot, J.; Vienne, D.; Grunthaner, F.; Kuhnke, F.; Musmann, G.; Rieder, R.; Wänke, H.; Economou, T.; Herring, M.; Lane, A.; McKay, C. P.

    1998-02-01

    A mission to Mars including two Small Stations, two Penetrators and an Orbiter was launched at Baikonur, Kazakhstan, on 16 November 1996. This was called the Mars-96 mission. The Small Stations were expected to land in September 1997 (Ls approximately 178°), nominally in the Amazonis-Arcadia region at locations (33 N, 169.4 W) and (37.6 N, 161.9 W). The fourth stage of the Mars-96 launcher malfunctioned and hence the mission was lost. However, the state-of-the-art concept of the Small Station can be applied to future Martian lander missions. Also, from the manufacturing and performance point of view, the Mars-96 Small Station could be built as such at low cost, and be fairly easily accommodated on almost any forthcoming Martian mission. This is primarily due to the very simple interface between the Small Station and the spacecraft. The Small Station is a sophisticated piece of equipment. With a total available power of approximately 400 mW, the Station successfully supports an ambitious scientific program. The Station accommodates a panoramic camera, an alpha-proton-x-ray spectrometer, a seismometer, a magnetometer, an oxidant instrument, equipment for meteorological observations, and sensors for atmospheric measurement during the descent phase, including images taken by a descent phase camera. The total mass of the Small Station with payload on the Martian surface, including the airbags, is only 32 kg. Lander observations on the surface of Mars combined with data from Orbiter instruments will shed light on contemporary Mars and its evolution. As in the Mars-96 mission, specific science goals could be exploration of the interior and surface of Mars, investigation of the structure and dynamics of the atmosphere, the role of water and other materials containing volatiles, and in situ studies of the atmospheric boundary layer processes. To achieve the scientific goals of the mission the lander should carry a versatile set of instruments. The Small Station accommodates devices for atmospheric measurements, geophysical and geochemical studies of the Martian surface and interior, and cameras for descent phase and panoramic views. These instruments would be able to contribute remarkably to the process of solving some of the scientific puzzles of Mars.

  14. A sophisticated lander for scientific exploration of Mars: scientific objectives and implementation of the Mars-96 Small Station.

    PubMed

    Linkin, V; Harri, A M; Lipatov, A; Belostotskaja, K; Derbunovich, B; Ekonomov, A; Khloustova, L; Kremnev, R; Makarov, V; Martinov, B; Nenarokov, D; Prostov, M; Pustovalov, A; Shustko, G; Jarvinen, I; Kivilinna, H; Korpela, S; Kumpulainen, K; Lehto, A; Pellinen, R; Pirjola, R; Riihela, P; Salminen, A; Schmidt, W; McKay, C P

    1998-01-01

    A mission to Mars including two Small Stations, two Penetrators and an Orbiter was launched at Baikonur, Kazakhstan, on 16 November 1996. This was called the Mars-96 mission. The Small Stations were expected to land in September 1997 (Ls approximately 178 degrees), nominally in the Amazonis-Arcadia region at locations (33 N, 169.4 W) and (37.6 N, 161.9 W). The fourth stage of the Mars-96 launcher malfunctioned and hence the mission was lost. However, the state-of-the-art concept of the Small Station can be applied to future Martian lander missions. Also, from the manufacturing and performance point of view, the Mars-96 Small Station could be built as such at low cost, and be fairly easily accommodated on almost any forthcoming Martian mission. This is primarily due to the very simple interface between the Small Station and the spacecraft. The Small Station is a sophisticated piece of equipment. With a total available power of approximately 400 mW, the Station successfully supports an ambitious scientific program. The Station accommodates a panoramic camera, an alpha-proton-x-ray spectrometer, a seismometer, a magnetometer, an oxidant instrument, equipment for meteorological observations, and sensors for atmospheric measurement during the descent phase, including images taken by a descent phase camera. The total mass of the Small Station with payload on the Martian surface, including the airbags, is only 32 kg. Lander observations on the surface of Mars combined with data from Orbiter instruments will shed light on contemporary Mars and its evolution. As in the Mars-96 mission, specific science goals could be exploration of the interior and surface of Mars, investigation of the structure and dynamics of the atmosphere, the role of water and other materials containing volatiles, and in situ studies of the atmospheric boundary layer processes. To achieve the scientific goals of the mission the lander should carry a versatile set of instruments. The Small Station accommodates devices for atmospheric measurements, geophysical and geochemical studies of the Martian surface and interior, and cameras for descent phase and panoramic views. These instruments would be able to contribute remarkably to the process of solving some of the scientific puzzles of Mars.

  15. Single-camera stereo-digital image correlation with a four-mirror adapter: optimized design and validation

    NASA Astrophysics Data System (ADS)

    Yu, Liping; Pan, Bing

    2016-12-01

    A low-cost, easy-to-implement but practical single-camera stereo-digital image correlation (DIC) system using a four-mirror adapter is established for accurate shape and three-dimensional (3D) deformation measurements. The mirror-assisted pseudo-stereo imaging system converts a single camera into two virtual cameras, which view a specimen from different angles and record the surface images of the test object onto the two halves of the camera sensor. To enable deformation measurement in non-laboratory conditions or extreme high-temperature environments, an active imaging optical design, combining an actively illuminated monochromatic source with a coupled band-pass optical filter, is compactly integrated into the pseudo-stereo DIC system. The optical design, basic principles and implementation procedures of the established system for 3D profile and deformation measurements are described in detail. The effectiveness and accuracy of the established system are verified by measuring the profile of a regular cylinder surface and the displacements of a translated planar plate. As an application example, the established system is used to determine the tensile strains and Poisson's ratio of a composite solid propellant specimen during a stress relaxation test. Since the established single-camera stereo-DIC system needs only a single camera and presents strong robustness against variations in ambient light or the thermal radiation of a hot object, it demonstrates great potential for determining transient deformation in non-laboratory or high-temperature environments with the aid of a single high-speed camera.

  16. Rapid matching of stereo vision based on fringe projection profilometry

    NASA Astrophysics Data System (ADS)

    Zhang, Ruihua; Xiao, Yi; Cao, Jian; Guo, Hongwei

    2016-09-01

    Stereo matching is the most important core part of stereo vision, yet many problems in stereo matching technology remain to be solved. For smooth surfaces on which feature points are not easy to extract, this paper adds a projector to the stereo vision measurement system and applies fringe projection techniques: because corresponding points extracted from the left and right camera images share the same phase, rapid stereo matching can be realized. The mathematical model of the measurement system is established and the three-dimensional (3D) surface of the measured object is reconstructed. This measurement method not only broadens the application fields of optical 3D measurement technology and enriches the knowledge base of the field, but also offers the potential for a commercialized measurement system in practical projects, which has significant scientific and economic value.
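
    The phase-equality matching criterion can be sketched under simplifying assumptions (a rectified image pair, with unwrapped phase increasing monotonically along each row): for every left-image pixel, the corresponding right-image column is the one with equal phase, located to sub-pixel precision by linear interpolation.

      # Phase-based correspondence along a rectified epipolar line.
      import numpy as np

      def match_row(phase_left_row, phase_right_row):
          """Per left column, the sub-pixel right column of equal phase."""
          cols = np.arange(len(phase_right_row), dtype=float)
          # Interpolate column as a function of (monotonic) right phase,
          # evaluated at each left-pixel phase; NaN where out of range.
          return np.interp(phase_left_row, phase_right_row, cols,
                           left=np.nan, right=np.nan)

      # Synthetic check: a 3.5-px disparity shifts the phase ramp.
      x = np.arange(100, dtype=float)
      disparity = x - match_row(0.2 * x, 0.2 * (x + 3.5))
      print(disparity[10:15])   # ~3.5 px away from the image border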

  17. Hubble Sees an Ancient Globular Cluster

    NASA Image and Video Library

    2017-12-08

    This image captures the stunning NGC 6535, a globular cluster 22,000 light-years away in the constellation of Serpens (The Serpent) that measures one light-year across. Globular clusters are tightly bound groups of stars which orbit galaxies. The large mass in the rich stellar centre of the globular cluster pulls the stars inward to form a ball of stars. The word globulus, from which these clusters take their name, is Latin for small sphere. Globular clusters are generally very ancient objects formed around the same time as their host galaxy. To date, no new star formation has been observed within a globular cluster, which explains the abundance of aging yellow stars in this image, most of them containing very few heavy elements. NGC 6535 was first discovered in 1852 by English astronomer John Russell Hind. The cluster would have appeared to Hind as a small, faint smudge through his telescope. Now, over 160 years later, instruments like the Advanced Camera for Surveys (ACS) and Wide Field Camera 3 (WFC3) on the NASA/European Space Agency (ESA) Hubble Space Telescope allow us to marvel at the cluster and its contents in greater detail. Credit: ESA/Hubble & NASA, Acknowledgement: Gilles Chapdelaine

  18. NASA Telescopes Help Discover Surprisingly Young Galaxy

    NASA Image and Video Library

    2017-12-08

    NASA image release April 12, 2011 Astronomers have uncovered one of the youngest galaxies in the distant universe, with stars that formed 13.5 billion years ago, a mere 200 million years after the Big Bang. The finding addresses questions about when the first galaxies arose, and how the early universe evolved. NASA's Hubble Space Telescope was the first to spot the newfound galaxy. Detailed observations from the W.M. Keck Observatory on Mauna Kea in Hawaii revealed the observed light dates to when the universe was only 950 million years old; the universe formed about 13.7 billion years ago. Infrared data from both Hubble and NASA's Spitzer Space Telescope revealed the galaxy's stars are quite mature, having formed when the universe was just a toddler at 200 million years old. The galaxy's image is being magnified by the gravity of a massive cluster of galaxies (Abell 383) parked in front of it, making it appear 11 times brighter. This phenomenon is called gravitational lensing. Hubble imaged the lensing galaxy Abell 383 with the Wide Field Camera 3 and the Advanced Camera for Surveys in November 2010 through March 2011. Credit: NASA, ESA, J. Richard (Center for Astronomical Research/Observatory of Lyon, France), and J.-P. Kneib (Astrophysical Laboratory of Marseille, France)

  19. Mitigation of Atmospheric Effects on Imaging Systems

    DTIC Science & Technology

    2004-03-31

    focal length. The imaging system had two cameras: an Electrim camera sensitive in the visible (0.6 µm) waveband and an Amber QWIP infrared camera...sensitive in the 9-micron region. The Amber QWIP infrared camera had 256x256 pixels, pixel pitch 38 µm, focal length of 1.8 m, FOV of 5.4x5.4 mr...each day. Unfortunately, signals from the different read ports of the Electrim camera picked up noise on their way to the digitizer, and this resulted

  20. Autocalibration of a projector-camera system.

    PubMed

    Okatani, Takayuki; Deguchi, Koichiro

    2005-12-01

    This paper presents a method for calibrating a projector-camera system that consists of multiple projectors (or multiple poses of a single projector), a camera, and a planar screen. We consider the problem of estimating the homography between the screen and the image plane of the camera or the screen-camera homography, in the case where there is no prior knowledge regarding the screen surface that enables the direct computation of the homography. It is assumed that the pose of each projector is unknown while its internal geometry is known. Subsequently, it is shown that the screen-camera homography can be determined from only the images projected by the projectors and then obtained by the camera, up to a transformation with four degrees of freedom. This transformation corresponds to arbitrariness in choosing a two-dimensional coordinate system on the screen surface and when this coordinate system is chosen in some manner, the screen-camera homography as well as the unknown poses of the projectors can be uniquely determined. A noniterative algorithm is presented, which computes the homography from three or more images. Several experimental results on synthetic as well as real images are shown to demonstrate the effectiveness of the method.
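
    For background, a homography between two planes can be estimated from four or more point correspondences with the standard direct linear transform (DLT); this is a generic building block shown for orientation, not the paper's autocalibration method, which specifically avoids requiring known screen coordinates.

      # Standard DLT: estimate H with x' ~ H x from point correspondences.
      import numpy as np

      def dlt_homography(src, dst):
          """src, dst: (N, 2) arrays of corresponding points, N >= 4."""
          A = []
          for (x, y), (u, v) in zip(src, dst):
              A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
              A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
          _, _, Vt = np.linalg.svd(np.asarray(A))   # null vector of A
          H = Vt[-1].reshape(3, 3)
          return H / H[2, 2]

      src = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
      dst = np.array([[10, 12], [110, 8], [118, 102], [6, 98]], float)
      H = dlt_homography(src, dst)
      p = H @ np.array([1.0, 0.0, 1.0])
      print(p[:2] / p[2])   # ~[110, 8]: H maps src points onto dst points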

  1. LST and instrument considerations. [modular design

    NASA Technical Reports Server (NTRS)

    Levin, G. M.

    1974-01-01

    In order that the LST meet its scientific objectives and also be a National Astronomical Space Facility during the 1980's and 1990's, broad requirements have been levied by the scientific community. These scientific requirements can be directly translated into design requirements and specifications for the scientific instruments. The instrument ensemble design must be consistent with a 15-year operational lifetime. Downtime for major repair/refurbishment or instrument updating must be minimized. The overall efficiency and performance of the instruments should be maximized. Modularization of instruments and instrument subsystems, some degree of on-orbit servicing (both repair and replacement), on-axis location, minimizing the number of reflections within instruments, minimizing polarization effects, and simultaneous operation of the F/24 camera with other instruments are just a few of the design guidelines and specifications which can and will be met in order that these broader scientific requirements be satisfied.

  2. ACT-Vision: active collaborative tracking for multiple PTZ cameras

    NASA Astrophysics Data System (ADS)

    Broaddus, Christopher; Germano, Thomas; Vandervalk, Nicholas; Divakaran, Ajay; Wu, Shunguang; Sawhney, Harpreet

    2009-04-01

    We describe a novel scalable approach for the management of a large number of Pan-Tilt-Zoom (PTZ) cameras deployed outdoors for persistent tracking of humans and vehicles, without resorting to the large fields of view of associated static cameras. Our system, Active Collaborative Tracking - Vision (ACT-Vision), is essentially a real-time operating system that can control hundreds of PTZ cameras to ensure uninterrupted tracking of target objects while maintaining image quality and coverage of all targets using a minimal number of sensors. The system ensures the visibility of targets between PTZ cameras by using criteria such as distance from sensor and occlusion.

  3. Traffic monitoring with distributed smart cameras

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Rosner, Marcin; Ulm, Michael; Schwingshackl, Gert

    2012-01-01

    The observation and monitoring of traffic with smart vision systems for the purpose of improving traffic safety has great potential. Today the automated analysis of traffic situations is still in its infancy--the patterns of vehicle motion and pedestrian flow in an urban environment are too complex to be fully captured and interpreted by a vision system. In this work we present steps towards a visual monitoring system which is designed to detect potentially dangerous traffic situations around a pedestrian crossing at a street intersection. The camera system is specifically designed to detect incidents in which the interaction of pedestrians and vehicles might develop into safety-critical encounters. The proposed system has been field-tested at a real pedestrian crossing in the City of Vienna for the duration of one year. It consists of a cluster of 3 smart cameras, each of which is built from a very compact PC hardware system in a weatherproof housing. Two cameras run vehicle detection and tracking software; one camera runs a pedestrian detection and tracking module based on the HOG detection principle. All 3 cameras use sparse optical flow computation in a low-resolution video stream in order to estimate the motion path and speed of objects. Geometric calibration of the cameras allows us to estimate the real-world coordinates of detected objects and to link the cameras together into one common reference system. This work describes the foundation for the different object detection modalities (pedestrians, vehicles) and explains the system setup, its design, and the evaluation results we have achieved so far.
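
    The flow-based speed estimation can be sketched with OpenCV's pyramidal Lucas-Kanade tracker; the ground-plane homography H_ground that converts pixel positions to metric coordinates is assumed to come from the geometric calibration step mentioned above, and the function below is an illustration rather than the deployed software.

      # Sparse optical flow between consecutive frames, then conversion
      # of the tracked motion to a ground-plane speed in m/s.
      import cv2
      import numpy as np

      def object_speed(prev_gray, next_gray, pts_px, H_ground, dt_s):
          """Median ground-plane speed of tracked feature points.
          pts_px: float32 array of shape (N, 1, 2), e.g. from
          cv2.goodFeaturesToTrack on the low-resolution stream."""
          nxt, status, _err = cv2.calcOpticalFlowPyrLK(
              prev_gray, next_gray, pts_px, None)
          ok = status.ravel() == 1
          p0 = cv2.perspectiveTransform(pts_px[ok], H_ground)
          p1 = cv2.perspectiveTransform(nxt[ok], H_ground)
          dists = np.linalg.norm((p1 - p0).reshape(-1, 2), axis=1)
          return float(np.median(dists)) / dt_s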

  4. Listen; There's a Hell of a Good Universe Next Door; Let's Go

    NASA Technical Reports Server (NTRS)

    Rigby, Jane R.

    2012-01-01

    Scientific research is key to our nation's technological and economic development. One can attempt to focus research toward specific applications, but science has a way of surprising us. Think for example of the "charge-coupled device", which was originally invented for memory storage but became the modern digital camera that is used everywhere from camera phones to the Hubble Space Telescope. Using digital cameras, Hubble has taken pictures that reach back 12 billion light-years into the past, when the Universe was only 1-2 billion years old. Such results would never have been possible with the film cameras Hubble was originally supposed to use. Over the past two decades, Hubble and other telescopes have shown us much about the Universe -- many of these results are shocking. Our galaxy is swarming with planets; most of the mass in the Universe is invisible; and our Universe is accelerating ever faster and faster for unknown reasons. Thus, we live in a "hell of a good universe", to quote e.e. cummings, that we fundamentally don't understand. This means that you, as young scientists, have many worlds to discover.

  5. An Intraocular Camera for Retinal Prostheses: Restoring Sight to the Blind

    NASA Astrophysics Data System (ADS)

    Stiles, Noelle R. B.; McIntosh, Benjamin P.; Nasiatka, Patrick J.; Hauer, Michelle C.; Weiland, James D.; Humayun, Mark S.; Tanguay, Armand R., Jr.

    Implantation of an intraocular retinal prosthesis represents one possible approach to the restoration of sight in those with minimal light perception due to photoreceptor degenerating diseases such as retinitis pigmentosa and age-related macular degeneration. In such an intraocular retinal prosthesis, a microstimulator array attached to the retina is used to electrically stimulate still-viable retinal ganglion cells that transmit retinotopic image information to the visual cortex by means of the optic nerve, thereby creating an image percept. We describe herein an intraocular camera that is designed to be implanted in the crystalline lens sac and connected to the microstimulator array. Replacement of an extraocular (head-mounted) camera with the intraocular camera restores the natural coupling of head and eye motion associated with foveation, thereby enhancing visual acquisition, navigation, and mobility tasks. This research is in no small part inspired by the unique scientific style and research methodologies that many of us have learned from Prof. Richard K. Chang of Yale University, and is included herein as an example of the extent and breadth of his impact and legacy.

  6. Conceptual design for an AIUC multi-purpose spectrograph camera using DMD technology

    NASA Astrophysics Data System (ADS)

    Rukdee, S.; Bauer, F.; Drass, H.; Vanzi, L.; Jordan, A.; Barrientos, F.

    2017-02-01

    Current and upcoming massive astronomical surveys are expected to discover a torrent of objects that need ground-based follow-up observations to characterize their nature. For transient objects in particular, rapid and efficient early spectroscopic identification is needed, and a small-field Integral Field Unit (IFU) would mitigate traditional slit losses and acquisition time. To this end, we present the design of a Digital Micromirror Device (DMD) multi-purpose spectrograph camera capable of running in several modes: traditional long-slit, small-field patrol IFU, multi-object, and full-field IFU mode via Hadamard spectra reconstruction. The AIUC Optical multi-purpose CAMera (AIUCOCAM) is a low-resolution spectrograph camera of R ~ 1,600 covering the spectral range 0.45-0.85 μm. We employ a VPH grating as the disperser, which is removable to allow an imaging mode. This spectrograph is envisioned for use on a 1-2 m class telescope in Chile to take advantage of good site conditions. We present design decisions and challenges for a cost-effective robotized spectrograph. The resulting instrument is remarkably versatile, capable of addressing a wide range of scientific topics.
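
    The Hadamard-mode reconstruction rests on the multiplexing identity H Hᵀ = nI: spectra measured through n coded DMD mask patterns are demultiplexed with a single matrix product. A minimal sketch assuming idealized ±1 coding (practical DMDs realize on/off masks via an S-matrix instead):

      # Hadamard multiplexing: measure y = H @ x, recover x = H.T @ y / n.
      import numpy as np
      from scipy.linalg import hadamard

      n = 64
      H = hadamard(n)                # n x n matrix of +/-1 entries (n = 2^k)
      x = np.random.rand(n)          # unknown source spectrum (synthetic)
      y = H @ x                      # one detector reading per mask pattern
      x_rec = H.T @ y / n            # demultiplexing, since H @ H.T = n * I
      print(np.allclose(x_rec, x))   # True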

  7. Automatic lightning detection and photographic system

    NASA Technical Reports Server (NTRS)

    Wojtasinski, R. J.; Holley, L. D.; Gray, J. L.; Hoover, R. B. (Inventor)

    1972-01-01

    A system is presented for monitoring and recording lightning strokes within a predetermined area with a camera having an electrically operated shutter with means for advancing the film in the camera after activating the shutter. The system includes an antenna for sensing lightning strikes which, in turn, generates a signal that is fed to an electronic circuit which generates signals for operating the shutter of the camera. Circuitry is provided for preventing activation of the shutter as the film in the camera is being advanced.

  8. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  9. Software Aids Visualization Of Mars Pathfinder Mission

    NASA Technical Reports Server (NTRS)

    Weidner, Richard J.

    1996-01-01

    Report describes Simulator for Imager for Mars Pathfinder (SIMP) computer program. SIMP generates "virtual reality" display of view through video camera on Mars lander spacecraft of Mars Pathfinder mission, along with display of pertinent textual and graphical data, for use by scientific investigators in planning sequences of activities for mission.

  10. Modeling Cometary Coma with a Three Dimensional, Anisotropic Multiple Scattering Distributed Processing Code

    NASA Technical Reports Server (NTRS)

    Luchini, Chris B.

    1997-01-01

    Development of camera and instrument simulations for space exploration requires the development of scientifically accurate models of the objects to be studied. Several planned cometary missions have prompted the development of a three dimensional, multi-spectral, anisotropic multiple scattering model of cometary coma.

  11. Development of an Extra-vehicular (EVA) Infrared (IR) Camera Inspection System

    NASA Technical Reports Server (NTRS)

    Gazarik, Michael; Johnson, Dave; Kist, Ed; Novak, Frank; Antill, Charles; Haakenson, David; Howell, Patricia; Pandolf, John; Jenkins, Rusty; Yates, Rusty

    2006-01-01

    Designed to fulfill a critical inspection need for the Space Shuttle Program, the EVA IR Camera System can detect cracks and subsurface defects in the Reinforced Carbon-Carbon (RCC) sections of the Space Shuttle's Thermal Protection System (TPS). The EVA IR Camera performs this detection by taking advantage of the natural thermal gradients induced in the RCC by solar flux and thermal emission from the Earth. This instrument is a compact, low-mass, low-power solution (1.2 cm3, 1.5 kg, 5.0 W) for TPS inspection that exceeds existing requirements for feature detection. Taking advantage of ground-based IR thermography techniques, the EVA IR Camera System provides the Space Shuttle Program with a solution that can be accommodated by the existing inspection system. The EVA IR Camera System augments the visible and laser inspection systems and finds cracks and subsurface damage that are not measurable by the other sensors, thus filling a critical gap in the Space Shuttle's inspection needs. This paper discusses the on-orbit RCC inspection measurement concept and requirements, and then presents a detailed description of the EVA IR Camera System design.

  12. The PanCam instrument on the 2018 Exomars rover: Scientific objectives

    NASA Astrophysics Data System (ADS)

    Jaumann, Ralf; Coates, Andrew; Hauber, Ernst; Hoffmann, Harald; Schmitz, Nicole; Le Deit, Laetitia; Tirsch, Daniela; Paar, Gerhard; Griffiths, Andrew

    2010-05-01

    The ExoMars Panoramic Camera System is an imaging suite of three camera heads to be mounted on the ExoMars rover's mast, with the boresight 1.8 m above ground. As of the ExoMars Pasteur Payload Design Review (PDR) in 2009, PanCam consists of two identical wide-angle cameras (WAC) with fixed focal length lenses, and a high-resolution camera (HRC) with an automatic focus mechanism, placed adjacent to the right WAC. The WAC stereo pair provides binocular vision for stereoscopic studies as well as 12 filter positions (per camera) for stereoscopic colour imaging and scientific multispectral studies. The stereo baseline of the pair is 500 mm. The two WACs have 22 mm focal length, f/10 lenses that illuminate detectors with 1024 × 1024 pixels. The WAC lenses are fixed, with an optimal focus set to 4 m and a focus range from 1.2 m (corresponding to the nearest view of the calibration target on the rover deck) to infinity. The HRC is able to focus between 0.9 m (the distance to a drill core on the rover's sample tray) and infinity. The instantaneous fields of view of the WAC and HRC are 580 μrad/pixel and 83 μrad/pixel, respectively. The corresponding resolutions (in mm/pixel) at a distance of 2 m are 1.2 (WAC) and 0.17 (HRC); at 100 m distance they are 58 (WAC) and 8.3 (HRC). WAC and HRC will be geometrically co-aligned. The main scientific goal of PanCam is the geological characterisation of the environment in which the rover is operating, providing the context for investigations carried out by the other instruments of the Pasteur payload. PanCam data will serve as a bridge between orbital data (high-resolution images from HRSC, CTX, and HiRISE, and spectrometer data from OMEGA and CRISM) and the data acquired in situ on the Martian surface. The position of the HRC on top of the rover's mast enables detailed panoramic inspection of surface features over the full horizontal range of 360°, even at large distances, an important prerequisite for identifying the scientifically most promising targets and planning the rover's traverse. Key to the success of PanCam is the provision of data that allow the determination of rock lithology, either of boulders on the surface or of outcrops. This task requires high spatial resolution as well as colour capabilities. The stereo images provide complementary information on the three-dimensional properties (i.e., the shape) of rocks. As an example, the degree of rounding of rocks as a result of fluvial transport can reveal the erosional history of the investigated particles, with possible implications for the chronology and intensity of rock-water interaction. The identification of the lithology and geological history of rocks will strongly benefit from the co-aligned views of the WAC (colour, stereo) and HRC (high spatial resolution), which will ensure that 3D and multispectral information is available together with fine-scale textural information for each scene. Stereo information is also of utmost importance for the determination of outcrop geometry (e.g., strike and dip of layered sequences), which helps to understand the emplacement history of sedimentary and volcanic rocks (e.g., cross-bedding, unconformities, etc.). PanCam will further reveal physical soil properties, such as cohesion, by imaging sites where the soil is disturbed by the rover's wheels and the drill. Another essential task of PanCam is the imaging of samples (from the drill) before ingestion into the rover for further analysis by other instruments. PanCam can be tilted vertically and will also study the atmosphere (e.g., dust loading, opacity, clouds) and aeolian processes related to surface-atmosphere interactions, such as dust devils.
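
    The quoted surface resolutions follow directly from the instantaneous fields of view (scale = IFOV × distance); a quick consistency check in Python:

        # Ground scale from the instantaneous field of view: scale = IFOV [rad] * distance
        IFOV = {"WAC": 580e-6, "HRC": 83e-6}    # rad/pixel, values from the abstract

        for cam, ifov in IFOV.items():
            for d in (2.0, 100.0):              # distances in metres
                print(f"{cam} at {d:5.1f} m: {ifov * d * 1e3:.2f} mm/pixel")
        # -> WAC: 1.16 mm at 2 m, 58 mm at 100 m; HRC: 0.17 mm at 2 m, 8.3 mm at 100 m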

  13. Acquisition of Scientific Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noland, Lynn

    2014-05-16

    Whitworth University constructed a 63,000 sq. ft. biology and chemistry building, which opened in the Fall of 2011. This project provided new state-of-the-art science instrumentation, enabling Whitworth students to develop skills and knowledge that are directly transferable to practical applications, thus enhancing their ability to compete and perform in the scientific workforce. Additionally, STEM faculty undertake outreach programs in the area schools, bringing students to our campus to engage in activities with our science students. The ability to work with current instrumentation helps make science exciting for middle school and high school students and gets them thinking about careers in science. 14 items that benefit instruction and research in the departments of biology, chemistry, and health sciences were purchased following the university's purchasing policy. They are: Cadaver Dissection Tables with Exhaust Chamber and accessories, Research Microscope with DF DIC, Phase and Fluorescence illumination with DP72 Camera, Microscope with Fluorescence, Microcomputer-controlled ultracentrifuge, Ultracentrifuge rotor, Variable Temperature steam pressure sterilizer, Alliance HPLC System, DNA Speedvac, Gel Documentation System, BioPac MP150, Glovebox personal workstation, Lyophilizer, NanoDrop 2000/2000c Spectrophotometer, CO2 Incubator.

  14. The future of space imaging. Report of a community-based study of an advanced camera for the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Brown, Robert A. (Editor)

    1993-01-01

    The scientific and technical basis for an Advanced Camera (AC) for the Hubble Space Telescope (HST) is discussed. In March 1992, the NASA Program Scientist for HST invited the Space Telescope Science Institute to conduct a community-based study of an AC, which would be installed on a scheduled HST servicing mission in 1999. The study had three phases: a broad community survey of views on candidate science programs and the required performance of the AC, an analysis of technical issues relating to its implementation, and a panel of experts to formulate conclusions and prioritize recommendations. From the assessment of the imaging tasks astronomers have proposed for or desired from HST, we believe the most valuable 1999 instrument would be a camera with both near-ultraviolet/optical (NUVO) and far-ultraviolet (FUV) sensitivity, and with both wide-field and high-resolution options.

  15. Digital Earth Watch: Investigating the World with Digital Cameras

    NASA Astrophysics Data System (ADS)

    Gould, A. D.; Schloss, A. L.; Beaudry, J.; Pickle, J.

    2015-12-01

    Every digital camera, including the smart phone camera, can be a scientific tool. Pictures contain millions of color intensity measurements organized spatially, allowing us to measure properties of objects in the images. This presentation will demonstrate how digital pictures can be used for a variety of studies, with a special emphasis on using repeat digital photographs to study change over time in outdoor settings with a Picture Post. Demonstrations will include using inexpensive color filters to take pictures that enhance features in images, such as unhealthy leaves on plants or clouds in the sky. Software available at no cost from the Digital Earth Watch (DEW) website, which lets students explore light, color and pixels, manipulate color in images, and make measurements, will be demonstrated. DEW and Picture Post were developed with support from NASA. Please visit our websites: DEW: http://dew.globalsystemsscience.org ; Picture Post: http://picturepost.unh.edu

  16. Collaborative web-based annotation of video footage of deep-sea life, ecosystems and geological processes

    NASA Astrophysics Data System (ADS)

    Kottmann, R.; Ratmeyer, V.; Pop Ristov, A.; Boetius, A.

    2012-04-01

    More and more seagoing scientific expeditions use video-controlled research platforms such as Remotely Operated Vehicles (ROV), Autonomous Underwater Vehicles (AUV), and towed camera systems. These produce many hours of video material which contains detailed and scientifically highly valuable footage of the biological, chemical, geological, and physical aspects of the oceans. Many of the videos contain unique observations of unknown life-forms which are rare and cannot be sampled and studied otherwise. To make such video material accessible online and to create a collaborative annotation environment, the "Video Annotation and processing platform" (V-App) was developed. A first, solely web-based installation for ROV videos is set up at the German Center for Marine Environmental Sciences (available at http://videolib.marum.de). It allows users to search and watch videos with a standard web browser based on the HTML5 standard. Moreover, V-App implements social web technologies allowing a distributed, world-wide scientific community to collaboratively annotate videos anywhere, at any time. Its fully implemented features include:
    • User login system for fine-grained permission and access control
    • Video watching
    • Video search using keywords, geographic position, depth and time range, and any combination thereof
    • Video annotation organised in themes (tracks), such as biology and geology, in standard or full-screen mode
    • Annotation keyword management: administrative users can add, delete, and update single keywords for annotation, or upload sets of keywords from Excel sheets
    • Download of products for scientific use
    This unique web application helps make costly ROV videos available online (estimated costs range between 5,000 and 10,000 Euros per hour, depending on the combination of ship and ROV). Moreover, with this system each expert annotation adds instantly available and valuable knowledge to otherwise uncharted material.

  17. Trade-off between TMA and RC configurations for JANUS camera

    NASA Astrophysics Data System (ADS)

    Greggio, D.; Magrin, D.; Munari, M.; Paolinetti, R.; Turella, A.; Zusi, M.; Cremonese, G.; Debei, S.; Della Corte, V.; Friso, E.; Hoffmann, H.; Jaumann, R.; Michaelis, H.; Mugnuolo, R.; Olivieri, A.; Palumbo, P.; Ragazzoni, R.; Schmitz, N.

    2016-07-01

    JANUS (Jovis Amorum Ac Natorum Undique Scrutator) is a high-resolution visible camera designed for the ESA space mission JUICE (Jupiter Icy moons Explorer). The main scientific goal of JANUS is to observe the surface of the Jupiter satellites Ganymede and Europa in order to characterize their physical and geological properties. During the design phases, we have proposed two possible optical configurations: a Three Mirror Anastigmat (TMA) and a Ritchey-Chrétien (RC) both matching the performance requirements. Here we describe the two optical solutions and compare their performance both in terms of achieved optical quality, sensitivity to misalignment and stray light performances.

  18. A 3D photographic capsule endoscope system with full field of view

    NASA Astrophysics Data System (ADS)

    Ou-Yang, Mang; Jeng, Wei-De; Lai, Chien-Cheng; Kung, Yi-Chinn; Tao, Kuan-Heng

    2013-09-01

    Current capsule endoscopes use one camera to capture the surface image in the intestine. They can only observe an abnormal point, but cannot provide exact information about it. Using two cameras can generate 3D images, but the visual plane changes while the capsule endoscope rotates, so the two cameras cannot capture complete image information. To solve this problem, this research provides a new kind of capsule endoscope to capture 3D images: 'A 3D photographic capsule endoscope system'. The system uses three cameras to capture images in real time. The advantage is an increase in the viewing range of up to 2.99 times with respect to the two-camera system. Combined with a 3D monitor, the system provides exact information about symptom points, helping doctors diagnose disease.

  19. A Versatile Time-Lapse Camera System Developed by the Hawaiian Volcano Observatory for Use at Kilauea Volcano, Hawaii

    USGS Publications Warehouse

    Orr, Tim R.; Hoblitt, Richard P.

    2008-01-01

    Volcanoes can be difficult to study up close. Because it may be days, weeks, or even years between important events, direct observation is often impractical. In addition, volcanoes are often inaccessible due to their remote location and (or) harsh environmental conditions. An eruption adds another level of complexity to what already may be a difficult and dangerous situation. For these reasons, scientists at the U.S. Geological Survey (USGS) Hawaiian Volcano Observatory (HVO) have, for years, built camera systems to act as surrogate eyes. With the recent advances in digital-camera technology, these eyes are rapidly improving. One type of photographic monitoring involves the use of near-real-time network-enabled cameras installed at permanent sites (Hoblitt and others, in press). Time-lapse camera systems, on the other hand, provide an inexpensive, easily transportable monitoring option that offers more versatility in site location. While time-lapse systems lack near-real-time capability, they provide higher image resolution and can be rapidly deployed in areas where the sophisticated telemetry required by the networked camera systems is not practical. This report describes the latest generation (as of 2008) of time-lapse camera system used by HVO for photograph acquisition in remote and hazardous sites on Kilauea Volcano.

  20. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-08-31

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience, and interest.
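
    The viewing-angle/DOF trade-off the authors measure can be previewed with the standard thin-lens depth-of-field formulas. Below is a minimal sketch; the lens parameters (16 mm f/2.0, ~12 μm circle of confusion, 70 cm subject distance) are purely hypothetical, not those of the paper's camera:

        def depth_of_field(f_mm, N, c_mm, s_mm):
            # Thin-lens DOF: hyperfocal distance H, then near/far limits of
            # acceptable focus for a subject at distance s (all lengths in mm).
            H = f_mm ** 2 / (N * c_mm) + f_mm
            near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
            far = s_mm * (H - f_mm) / (H - s_mm) if s_mm < H else float("inf")
            return near, far

        # Hypothetical gaze-camera lens: 16 mm f/2.0, c ~ 2 pixels of 6 um, user at 70 cm
        near, far = depth_of_field(f_mm=16.0, N=2.0, c_mm=0.012, s_mm=700.0)
        print(f"in focus from {near:.0f} mm to {far:.0f} mm")   # ~658-748 mm

    A longer focal length narrows the viewing angle and shrinks this in-focus band, which is exactly the trade-off that head-movement statistics help resolve.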

  1. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User’s Head Movement

    PubMed Central

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience, and interest. PMID:27589768

  2. The Orbiter camera payload system's large-format camera and attitude reference system

    NASA Technical Reports Server (NTRS)

    Schardt, B. B.; Mollberg, B. H.

    1985-01-01

    The Orbiter camera payload system (OCPS) is an integrated photographic system carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a large-format camera (LFC), a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. A secondary and supporting system to the LFC is the attitude reference system (ARS), a dual-lens stellar camera array (SCA) and camera support structure. The SCA is a 70 mm film system that is rigidly mounted to the LFC lens support structure and, through the simultaneous acquisition of two star fields with each earth viewing LFC frame, makes it possible to precisely determine the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high-precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment. The full OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on Oct. 5, 1984, as a major payload aboard the STS-41G mission.

  3. Nuclear medicine imaging system

    DOEpatents

    Bennett, Gerald W.; Brill, A. Bertrand; Bizais, Yves J.; Rowe, R. Wanda; Zubal, I. George

    1986-01-07

    A nuclear medicine imaging system having two large field of view scintillation cameras mounted on a rotatable gantry and being movable diametrically toward or away from each other is disclosed. In addition, each camera may be rotated about an axis perpendicular to the diameter of the gantry. The movement of the cameras allows the system to be used for a variety of studies, including positron annihilation, and conventional single photon emission, as well as static orthogonal dual multi-pinhole tomography. In orthogonal dual multi-pinhole tomography, each camera is fitted with a seven pinhole collimator to provide seven views from slightly different perspectives. By using two cameras at an angle to each other, improved sensitivity and depth resolution are achieved. The computer system and interface acquires and stores a broad range of information in list mode, including patient physiological data, energy data over the full range detected by the cameras, and the camera position. The list mode acquisition permits the study of attenuation as a result of Compton scatter, as well as studies involving the isolation and correlation of energy with a range of physiological conditions.

  4. Nuclear medicine imaging system

    DOEpatents

    Bennett, Gerald W.; Brill, A. Bertrand; Bizais, Yves J. C.; Rowe, R. Wanda; Zubal, I. George

    1986-01-01

    A nuclear medicine imaging system having two large field of view scintillation cameras mounted on a rotatable gantry and being movable diametrically toward or away from each other is disclosed. In addition, each camera may be rotated about an axis perpendicular to the diameter of the gantry. The movement of the cameras allows the system to be used for a variety of studies, including positron annihilation, and conventional single photon emission, as well as static orthogonal dual multi-pinhole tomography. In orthogonal dual multi-pinhole tomography, each camera is fitted with a seven pinhole collimator to provide seven views from slightly different perspectives. By using two cameras at an angle to each other, improved sensitivity and depth resolution are achieved. The computer system and interface acquires and stores a broad range of information in list mode, including patient physiological data, energy data over the full range detected by the cameras, and the camera position. The list mode acquisition permits the study of attenuation as a result of Compton scatter, as well as studies involving the isolation and correlation of energy with a range of physiological conditions.

  5. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    USGS Publications Warehouse

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  6. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    PubMed

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  7. Preliminary optical design of the stereo channel of the imaging system simbiosys for the BepiColombo ESA mission

    NASA Astrophysics Data System (ADS)

    Da Deppo, Vania; Naletto, Giampiero; Cremonese, Gabriele; Debei, Stefano; Flamini, Enrico

    2017-11-01

    The paper describes the optical design and performance budget of a novel catadioptric instrument chosen as baseline for the Stereo Channel (STC) of the imaging system SIMBIOSYS for the BepiColombo ESA mission to Mercury. The main scientific objective is the 3D global mapping of the entire surface of Mercury with a scale factor of 50 m per pixel at periherm in four different spectral bands. The system consists of two twin cameras looking at +/-20° from nadir and sharing some components, such as the relay element in front of the detector and the detector itself. The field of view of each channel is 4° x 4° with a scale factor of 23''/pixel. The system guarantees good optical performance with Ensquared Energy of the order of 80% in one pixel. For the straylight suppression, an intermediate field stop is foreseen, which gives the possibility to design an efficient baffling system.
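
    For orientation, the two quoted scale factors are mutually consistent: dividing the 50 m/pixel surface scale by the 23''/pixel angular scale gives the implied periherm altitude.

        import math
        scale_rad = 23 * math.pi / (180 * 3600)    # 23 arcsec/pixel in rad/pixel
        print(50.0 / scale_rad / 1e3)              # implied periherm altitude, ~448 km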

  8. ARNICA, the Arcetri Near-Infrared Camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.

    1996-04-01

    ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December, 1992 to December 1993), and an observing run at the William Herschel Telescope, Canary Islands (December, 1993). System performance is defined in terms of efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources.

  9. Video auto stitching in multicamera surveillance system

    NASA Astrophysics Data System (ADS)

    He, Bin; Zhao, Gang; Liu, Qifang; Li, Yangyang

    2012-01-01

    This paper concerns the problem of automatic video stitching in a multi-camera surveillance system. Previous approaches have used multiple calibrated cameras for video mosaics in large-scale monitoring applications. In this work, we formulate video stitching as a multi-image registration and blending problem in which not all cameras need to be calibrated, except a few selected master cameras. SURF is used to find matched pairs of image key points from different cameras, and the camera pose is then estimated and refined. A homography matrix is employed to calculate overlapping pixels, and finally a boundary resampling algorithm blends the images. Simulation results demonstrate the efficiency of our method.
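
    A minimal sketch of this registration-and-blending pipeline in Python/OpenCV follows. ORB stands in for SURF here (SURF ships only with opencv-contrib), and the seam handling is a naive overwrite rather than the paper's boundary resampling:

        import cv2
        import numpy as np

        def stitch_pair(img1, img2):
            # Detect and match keypoints between the two views.
            orb = cv2.ORB_create(2000)
            k1, d1 = orb.detectAndCompute(img1, None)
            k2, d2 = orb.detectAndCompute(img2, None)
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
            # Estimate the homography mapping img2's pixels into img1's frame.
            src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            # Warp img2 onto a wider canvas and lay img1 over the overlap.
            h, w = img1.shape[:2]
            canvas = cv2.warpPerspective(img2, H, (w * 2, h))
            canvas[0:h, 0:w] = img1    # naive seam; real systems blend the boundary
            return canvas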

  10. Video auto stitching in multicamera surveillance system

    NASA Astrophysics Data System (ADS)

    He, Bin; Zhao, Gang; Liu, Qifang; Li, Yangyang

    2011-12-01

    This paper concerns the problem of automatic video stitching in a multi-camera surveillance system. Previous approaches have used multiple calibrated cameras for video mosaics in large-scale monitoring applications. In this work, we formulate video stitching as a multi-image registration and blending problem in which not all cameras need to be calibrated, except a few selected master cameras. SURF is used to find matched pairs of image key points from different cameras, and the camera pose is then estimated and refined. A homography matrix is employed to calculate overlapping pixels, and finally a boundary resampling algorithm blends the images. Simulation results demonstrate the efficiency of our method.

  11. Detecting method of subjects' 3D positions and experimental advanced camera control system

    NASA Astrophysics Data System (ADS)

    Kato, Daiichiro; Abe, Kazuo; Ishikawa, Akio; Yamada, Mitsuho; Suzuki, Takahito; Kuwashima, Shigesumi

    1997-04-01

    Steady progress is being made in the development of an intelligent robot camera capable of automatically shooting pictures with a powerful sense of reality, or of tracking objects whose shooting requires advanced techniques. Currently, only experienced broadcasting cameramen can provide these pictures. To develop an intelligent robot camera with these abilities, we need to clearly understand how a broadcasting cameraman assesses his shooting situation and how his camera is moved during shooting. We use a real-time analyzer to study a cameraman's work and his gaze movements at studios and during sports broadcasts. Here, we have developed a method for detecting subjects' 3D positions and an experimental camera control system to help us further understand the movements required for an intelligent robot camera. The features are as follows: (1) two sensor cameras shoot a moving subject and detect colors, producing its 3D coordinates; (2) the system can drive a camera based on camera movement data obtained by a real-time analyzer. 'Moving shoot' is the name we have given to the object position detection technology on which this system is based. We used it in a soccer game, producing computer graphics showing how players moved. These results will also be reported.
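
    The 3D position detection step can be illustrated with textbook stereo triangulation; a minimal sketch for an idealized rectified camera pair (the focal length and baseline are made-up numbers, not those of the experimental system):

        def triangulate(x_left, x_right, y, f_px, baseline_m):
            # Rectified stereo: depth Z = f * B / disparity, then back-project
            # X and Y through the pinhole model (image coordinates relative
            # to the principal point, in pixels).
            d = x_left - x_right
            Z = f_px * baseline_m / d
            X = x_left * Z / f_px
            Y = y * Z / f_px
            return X, Y, Z

        # Made-up rig: 800 px focal length, 0.5 m baseline
        print(triangulate(x_left=120.0, x_right=100.0, y=40.0, f_px=800.0, baseline_m=0.5))
        # -> (3.0, 1.0, 20.0): the subject is 20 m away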

  12. Performance Characteristics For The Orbiter Camera Payload System's Large Format Camera (LFC)

    NASA Astrophysics Data System (ADS)

    Mollberg, Bernard H.

    1981-11-01

    The Orbiter Camera Payload System, the OCPS, is an integrated photographic system which is carried into Earth orbit as a payload in the Shuttle Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), which is a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. The primary design objective for the LFC was to maximize all system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment.

  13. Eye gaze tracking for endoscopic camera positioning: an application of a hardware/software interface developed to automate Aesop.

    PubMed

    Ali, S M; Reisner, L A; King, B; Cao, A; Auner, G; Klein, M; Pandya, A K

    2008-01-01

    A redesigned motion control system for the medical robot Aesop allows automating and programming its movements. An IR eye tracking system has been integrated with this control interface to implement an intelligent, autonomous eye gaze-based laparoscopic positioning system. A laparoscopic camera held by Aesop can be moved based on the data from the eye tracking interface to keep the user's gaze point region at the center of a video feedback monitor. This system setup provides autonomous camera control that works around the surgeon, providing an optimal robotic camera platform.

  14. Spinoff 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.

  15. ESTADIUS: A High Motion "One Arcsec" Daytime Attitude Estimation System for Stratospheric Applications

    NASA Astrophysics Data System (ADS)

    Montel, J.; Andre, Y.; Mirc, F.; Etcheto, P.; Evrard, J.; Bray, N.; Saccoccio, M.; Tomasini, L.; Perot, E.

    2015-09-01

    ESTADIUS is an autonomous, accurate, daytime attitude estimation system for stratospheric balloons that require a high level of attitude measurement and stability. The system has been developed by CNES. ESTADIUS is based on star sensor and gyrometer data fusion within an extended Kalman filter. The star sensor is composed of a 16 MPixel visible-CCD camera and a large-aperture camera lens (focal length of 135 mm, aperture f/1.8, 10° x 15° field of view or FOV) which provides very accurate star measurements due to the very small pixel angular size. This also allows detecting stars against a bright sky background. The gyrometer is a 0.01°/h performance class Fiber Optic Gyroscope (FOG). The system is adapted to work down to an altitude of ~25 km, even under high kinematic conditions. Key elements of ESTADIUS are: daytime use (as well as night time), autonomy (automatic recognition of constellations), high angular rate robustness (a few deg/s, thanks to the high performance of attitude propagation), stray-light robustness (thanks to a high performance baffle), and high accuracy (<1", 1σ). Four stratospheric qualification flights were performed very successfully in 2010/2011 and 2013/2014 in Kiruna (Sweden) and Timmins (Canada). ESTADIUS will allow long stratospheric flights with a unique attitude estimation system, avoiding the restriction of night/day conditions at launch. The first operational flight of ESTADIUS will be in 2015 for the PILOT scientific mission (led by IRAP and CNES in France). Further balloon missions such as CIDRE will use the system. ESTADIUS is probably the first autonomous, large-FOV, daytime stellar attitude measurement system. This paper details the technical features and in-flight results.
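
    A heavily simplified, single-axis sketch of this kind of star sensor/gyrometer fusion is given below; the noise values are assumptions for illustration, and the real system is a full extended Kalman filter over 3D attitude:

        import numpy as np

        def fuse(gyro_rates, star_fixes, dt, q=1e-8, r=(4.8e-6) ** 2):
            # Scalar Kalman filter on one attitude angle: propagate with the
            # gyro rate, correct whenever a star fix is available (None = no fix).
            # r ~ (1 arcsec)^2 in rad^2; q models gyro drift (assumed values).
            theta, P = 0.0, 1e-4
            history = []
            for w, z in zip(gyro_rates, star_fixes):
                theta += w * dt            # propagate attitude
                P += q * dt                # covariance grows with gyro noise
                if z is not None:          # star-tracker update
                    K = P / (P + r)
                    theta += K * (z - theta)
                    P *= 1.0 - K
                history.append(theta)
            return np.array(history)

        # e.g. constant 0.01 rad/s rotation, a star fix every 10th sample
        rates = [0.01] * 100
        fixes = [0.001 * (i + 1) if i % 10 == 9 else None for i in range(100)]
        print(fuse(rates, fixes, dt=0.1)[-1])   # converges near 0.1 rad

    The gyro carries the attitude through fast motions between star fixes, while the star fixes bound the accumulated drift, which is how the system stays accurate at a few deg/s.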

  16. Clementine mission

    NASA Astrophysics Data System (ADS)

    Rustan, Pedro L.

    1995-01-01

    The U.S. Department of Defense (DoD) and the National Aeronautics and Space Administration (NASA) started a cooperative program in 1992 to flight qualify recently developed lightweight technologies in a radiation-stressed environment. The spacecraft, referred to as Clementine, was designed, built, and launched in less than a two-year period. The spacecraft was launched into a high-inclination orbit from Vandenberg Air Force Base in California on a Titan IIG launch vehicle in January 1994. The spacecraft was injected into a 420 by 3000 km orbit around the Moon and remained there for over two months. Unfortunately, after successfully completing the lunar phase of the mission, a software malfunction prevented the accomplishment of the near-Earth asteroid (NEA) phase. Some of the technologies incorporated in the Clementine spacecraft include: a 370 gram, 7 watt star tracker camera; a 500 gram, 6 watt UV/Vis camera; a 1600 gram, 30 watt indium antimonide focal plane array NIR camera; a 1650 gram, 30 watt mercury cadmium telluride LWIR camera; and a LIDAR camera which consists of a Nd:YAG diode-pumped laser for ranging and an intensified photocathode charge-coupled detector for imaging. The scientific results of the mission will first be analyzed by a NASA-selected team, and will then be available to the entire community.

  17. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification.

    PubMed

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-07-08

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
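
    A toy sketch of the adaptive-selection idea follows (this is not the paper's FIS, whose inputs and rule base differ): shoulder membership functions over two scene statistics decide which camera's candidate is passed to the CNN verifier.

        def falling(x, lo, hi):
            # Shoulder membership: 1 below lo, 0 above hi, linear in between.
            if x <= lo:
                return 1.0
            if x >= hi:
                return 0.0
            return (hi - x) / (hi - lo)

        def select_camera(visible_brightness, thermal_contrast):
            # Dark scenes favour the FIR candidate; low thermal contrast
            # (body temperature ~ background) favours the visible candidate.
            dark = falling(visible_brightness, 0.1, 0.5)
            flat_ir = falling(thermal_contrast, 0.1, 0.4)
            return "FIR" if dark >= flat_ir else "visible"

        print(select_camera(visible_brightness=0.05, thermal_contrast=0.8))  # night -> FIR
        print(select_camera(visible_brightness=0.7, thermal_contrast=0.05))  # hot noon -> visible

    The payoff described in the abstract is that only one candidate image reaches the CNN, halving the verification workload relative to dual-input designs.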

  18. Application of infrared uncooled cameras in surveillance systems

    NASA Astrophysics Data System (ADS)

    Dulski, R.; Bareła, J.; Trzaskawka, P.; PiÄ tkowski, T.

    2013-10-01

    The recent necessity to protect military bases, convoys, and patrols has given serious impetus to the development of multisensor security systems for perimeter protection. Among the most important devices used in such systems are IR cameras. The paper discusses the technical possibilities and limitations of using an uncooled IR camera in a multi-sensor surveillance system for perimeter protection. Effective detection ranges depend on the class of the sensor used and on the observed scene itself. Application of an IR camera increases the probability of intruder detection regardless of the time of day or weather conditions, while simultaneously decreasing the false alarm rate produced by the surveillance system. The role of IR cameras in the system is discussed, as well as the technical possibilities for detecting a human being. A comparison of commercially available IR cameras capable of achieving the desired ranges was made. The required spatial resolution for detection, recognition, and identification was calculated. The simulation of detection ranges was done using a new model for predicting target acquisition performance which uses the Targeting Task Performance (TTP) metric. Like its predecessor, the Johnson criteria, the new model ties range performance to image quality. The scope of the presented analysis is limited to the estimation of detection, recognition, and identification ranges for typical thermal cameras with uncooled microbolometer focal plane arrays. This type of camera is most widely used in security systems because of its competitive price-to-performance ratio. Detection, recognition, and identification range calculations were made, and the results for devices with selected technical specifications were compared and discussed.
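
    As a simpler stand-in for the TTP-based simulation, the older Johnson criteria give back-of-envelope range estimates from sensor geometry alone; a sketch in which all parameter values are illustrative assumptions:

        def johnson_range(target_size_m, ifov_mrad, cycles_required):
            # Johnson-criteria estimate: range at which the camera places the
            # required number of resolvable cycles across a target's critical
            # dimension. One cycle ~ 2 pixels; classic thresholds are ~1 cycle
            # (detection), ~4 (recognition), ~8 (identification). Geometry
            # only: no atmosphere or optics MTF, which the TTP model adds.
            pixels_per_cycle = 2.0
            ifov_rad = ifov_mrad * 1e-3
            return target_size_m / (cycles_required * pixels_per_cycle * ifov_rad)

        for task, n in (("detection", 1), ("recognition", 4), ("identification", 8)):
            print(f"{task}: {johnson_range(1.8, 0.5, n):.0f} m")
        # 1.8 m human, assumed 0.5 mrad IFOV -> 1800 m / 450 m / 225 m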

  19. Imaging_Earth_With_MUSES

    NASA Image and Video Library

    2017-07-11

    Commercial businesses and scientific researchers have a new capability to capture digital imagery of Earth, thanks to MUSES: the Multiple User System for Earth Sensing facility. This platform on the outside of the International Space Station is capable of holding four different payloads, ranging from high-resolution digital cameras to hyperspectral imagers, which will support Earth science observations in agricultural awareness, air quality, disaster response, fire detection, and many other research topics. MUSES program manager Mike Soutullo explains the system and its unique features including the ability to change and upgrade payloads using the space station’s Canadarm2 and Special Purpose Dexterous Manipulator. For more information about MUSES, please visit: https://www.nasa.gov/mission_pages/station/research/news/MUSES For more on ISS science, https://www.nasa.gov/mission_pages/station/research/index.html or follow us on Twitter @ISS_research

  20. The James Webb Space Telescope's Near-Infrared Camera (NIRCam): Making Models, Building Understanding

    NASA Astrophysics Data System (ADS)

    McCarthy, D. W., Jr.; Lebofsky, L. A.; Higgins, M. L.; Lebofsky, N. R.

    2011-09-01

    Since 2003, the Near-Infrared Camera (NIRCam) science team for the James Webb Space Telescope (JWST) has conducted "Train the Trainer" workshops for adult leaders of the Girl Scouts of the USA (GSUSA), engaging them in the process of scientific inquiry and equipping them to host astronomy-related activities at the troop level. Training includes topics in basic astronomy (night sky, phases of the Moon, the scale of the Solar System and beyond, stars, galaxies, telescopes, etc.) as well as JWST-specific research areas in extra-solar planetary systems and cosmology, to pave the way for girls and women to understand the first images from JWST. Participants become part of our world-wide network of 160 trainers teaching young women essential STEM-related concepts using astronomy, the night sky environment, applied math, engineering, and critical thinking.

  1. Video model deformation system for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1983-01-01

    A photogrammetric closed circuit television system to measure model deformation at the National Transonic Facility is described. The photogrammetric approach was chosen because of its inherent rapid data recording of the entire object field. Video cameras are used to acquire data instead of film cameras due to the inaccessibility of cameras which must be housed within the cryogenic, high pressure plenum of this facility. A rudimentary theory section is followed by a description of the video-based system and control measures required to protect cameras from the hostile environment. Preliminary results obtained with the same camera placement as planned for NTF are presented and plans for facility testing with a specially designed test wing are discussed.

  2. Evaluation of thermal cameras in quality systems according to ISO 9000 or EN 45000 standards

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof

    2001-03-01

    According to the international standards ISO 9001-9004 and EN 45001-45003, industrial plants and accreditation laboratories that have implemented quality systems according to these standards are required to evaluate the uncertainty of their measurements. Manufacturers of thermal cameras do not offer any data that would enable estimation of the measurement uncertainty of these imagers. Difficulty in determining the measurement uncertainty is an important limitation of thermal cameras for applications in industrial plants and cooperating accreditation laboratories that have implemented these quality systems. A set of parameters for characterizing commercial thermal cameras, a measuring set, some results of testing these cameras, a mathematical model of uncertainty, and software that enables quick calculation of the uncertainty of temperature measurements made with thermal cameras are presented in this paper.

  3. Mars Exploration Rover Navigation Camera in-flight calibration

    NASA Astrophysics Data System (ADS)

    Soderblom, Jason M.; Bell, James F.; Johnson, Jeffrey R.; Joseph, Jonathan; Wolff, Michael J.

    2008-06-01

    The Navigation Camera (Navcam) instruments on the Mars Exploration Rover (MER) spacecraft provide support for both tactical operations as well as scientific observations where color information is not necessary: large-scale morphology, atmospheric monitoring including cloud observations and dust devil movies, and context imaging for both the thermal emission spectrometer and the in situ instruments on the Instrument Deployment Device. The Navcams are a panchromatic stereoscopic imaging system built using identical charge-coupled device (CCD) detectors and nearly identical electronics boards as the other cameras on the MER spacecraft. Previous calibration efforts were primarily focused on providing a detailed geometric calibration in line with the principal function of the Navcams, to provide data for the MER navigation team. This paper provides a detailed description of a new Navcam calibration pipeline developed to provide an absolute radiometric calibration that we estimate to have an absolute accuracy of 10% and a relative precision of 2.5%. Our calibration pipeline includes steps to model and remove the bias offset, the dark current charge that accumulates in both the active and readout regions of the CCD, and the shutter smear. It also corrects pixel-to-pixel responsivity variations using flat-field images, and converts from raw instrument-corrected digital number values per second to units of radiance (W m-2 nm-1 sr-1), or to radiance factor (I/F). We also describe here the initial results of two applications where radiance-calibrated Navcam data provide unique information for surface photometric and atmospheric aerosol studies.
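
    The calibration steps described above can be sketched as follows; the array names, the responsivity constant, and the omission of the shutter-smear correction are simplifications for illustration, not the actual MER pipeline:

        import numpy as np

        def calibrate(raw_dn, exposure_s, bias, dark_rate, flat, resp):
            # 1. subtract the bias offset frame
            dn = raw_dn.astype(float) - bias
            # 2. remove dark current accumulated during the exposure (DN/s map)
            dn -= dark_rate * exposure_s
            # 3. divide by the normalised flat-field to even out pixel response
            dn /= flat
            # 4. convert DN/s to radiance via a responsivity constant
            return dn / exposure_s / resp    # W m^-2 nm^-1 sr^-1

        frame = np.full((4, 4), 1200.0)
        rad = calibrate(frame, exposure_s=0.25, bias=120.0, dark_rate=8.0,
                        flat=np.ones((4, 4)), resp=5.0e4)
        print(rad[0, 0])   # (1200 - 120 - 2) / 0.25 / 5e4 = 0.08624

    Dividing by I/F instead would additionally normalise by the solar radiance at Mars; either way, the key point is that every step up to the final scaling is a per-pixel array operation.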

  4. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    The number of network cameras has been growing rapidly in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale, analyzing the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g. different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.

  5. LSST camera control system

    NASA Astrophysics Data System (ADS)

    Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael

    2006-06-01

    The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.
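
    A toy sketch of the master/slave message exchange described above (illustrative only; these classes are not the LSST CCS interfaces):

        from dataclasses import dataclass

        @dataclass
        class Message:
            source: str      # module name that produced the message
            kind: str        # "command", "status", ...
            payload: dict

        class MasterControlModule:
            # The MCM fans commands out to subsystem control modules and
            # collects the status messages they return.
            def __init__(self, modules):
                self.modules = modules           # name -> handler callable

            def command(self, target, payload):
                request = Message("MCM", "command", payload)
                return self.modules[target](request)

        def shutter_module(msg):
            # A stand-in subsystem module: acknowledge and echo the command.
            return Message("shutter", "status", {"ok": True, "echo": msg.payload})

        mcm = MasterControlModule({"shutter": shutter_module})
        print(mcm.command("shutter", {"action": "open", "ms": 15}))

    The design point this illustrates is that control stays local to each subsystem, with coordination happening only through the message interface to the master.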

  6. A system for extracting 3-dimensional measurements from a stereo pair of TV cameras

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.; Cunningham, R.

    1976-01-01

    Obtaining accurate three-dimensional (3-D) measurements from a stereo pair of TV cameras is a task requiring camera modeling, calibration, and the matching of the two images of a real 3-D point in the two TV pictures. A system which models and calibrates the cameras and pairs the two images of a real-world point in the two pictures, either manually or automatically, was implemented. This system is operating and provides a three-dimensional measurement resolution of + or - mm at distances of about 2 m.

  7. A compact high-definition low-cost digital stereoscopic video camera for rapid robotic surgery development.

    PubMed

    Carlson, Jay; Kowalczuk, Jędrzej; Psota, Eric; Pérez, Lance C

    2012-01-01

    Robotic surgical platforms require vision feedback systems, which often consist of low-resolution, expensive, single-imager analog cameras. These systems are retooled for 3D display by simply doubling the cameras and outboard control units. Here, a fully-integrated digital stereoscopic video camera employing high-definition sensors and a class-compliant USB video interface is presented. This system can be used with low-cost PC hardware and consumer-level 3D displays for tele-medical surgical applications including military medical support, disaster relief, and space exploration.

  8. Applications of a shadow camera system for energy meteorology

    NASA Astrophysics Data System (ADS)

    Kuhn, Pascal; Wilbert, Stefan; Prahl, Christoph; Garsche, Dominik; Schüler, David; Haase, Thomas; Ramirez, Lourdes; Zarzalejo, Luis; Meyer, Angela; Blanc, Philippe; Pitz-Paal, Robert

    2018-02-01

    Downward-facing shadow cameras might play a major role in future energy meteorology. Shadow cameras directly image shadows on the ground from an elevated position. They are used to validate other systems (e.g. all-sky imager based nowcasting systems, cloud speed sensors or satellite forecasts) and can potentially provide short term forecasts for solar power plants. Such forecasts are needed for electricity grids with high penetrations of renewable energy and can help to optimize plant operations. In this publication, two key applications of shadow cameras are briefly presented.

  9. Nonholonomic camera-space manipulation using cameras mounted on a mobile base

    NASA Astrophysics Data System (ADS)

    Goodwine, Bill; Seelinger, Michael J.; Skaar, Steven B.; Ma, Qun

    1998-10-01

    The body of work called `Camera Space Manipulation' is an effective and proven method of robotic control. Essentially, this technique identifies and refines the input-output relationship of the plant using estimation methods and drives the plant open-loop to its target state. 3D `success' of the desired motion, i.e., the end effector of the manipulator engages a target at a particular location with a particular orientation, is guaranteed when there is camera space success in two cameras which are adequately separated. Very accurate, sub-pixel positioning of a robotic end effector is possible using this method. To date, however, most efforts in this area have primarily considered holonomic systems. This work addresses the problem of nonholonomic camera space manipulation by considering the problem of a nonholonomic robot with two cameras and a holonomic manipulator on board the nonholonomic platform. While perhaps not as common in robotics, such a combination of holonomic and nonholonomic degrees of freedom are ubiquitous in industry: fork lifts and earth moving equipment are common examples of a nonholonomic system with an on-board holonomic actuator. The nonholonomic nature of the system makes the automation problem more difficult due to a variety of reasons; in particular, the target location is not fixed in the image planes, as it is for holonomic systems (since the cameras are attached to a moving platform), and there is a fundamental `path dependent' nature of nonholonomic kinematics. This work focuses on the sensor space or camera-space-based control laws necessary for effectively implementing an autonomous system of this type.

  10. Backing collisions: a study of drivers' eye and backing behaviour using combined rear-view camera and sensor systems.

    PubMed

    Hurwitz, David S; Pradhan, Anuj; Fisher, Donald L; Knodler, Michael A; Muttart, Jeffrey W; Menon, Rajiv; Meissner, Uwe

    2010-04-01

    Backing crash injuries can be severe; approximately 200 of the 2,500 reported injuries of this type per year to children under the age of 15 years result in death. Technology for assisting drivers when backing has had limited success in preventing backing crashes. Two questions are addressed: Why is the reduction in backing crashes moderate when rear-view cameras are deployed? Could rear-view cameras augment sensor systems? 46 drivers (36 experimental, 10 control) completed 16 parking trials over 2 days (eight trials per day). Experimental participants were provided with a sensor camera system; controls were not. Three crash scenarios were introduced. The setting was a parking facility at UMass Amherst, USA. The subjects were 46 drivers (33 men, 13 women), average age 29 years, who were Massachusetts residents licensed within the USA for an average of 9.3 years. Vehicles were equipped with a rear-view camera and a sensor system-based parking aid. Outcome measures were subjects' eye fixations while driving and researcher observation of collisions with objects during backing. Only 20% of drivers looked at the rear-view camera before backing, and 88% of those did not crash. Of those who did not look at the rear-view camera before backing, 46% looked after the sensor warned the driver. This study indicates that drivers not only attend to an audible warning, but will look at a rear-view camera if available. Evidence suggests that when used appropriately, rear-view cameras can mitigate the occurrence of backing crashes, particularly when paired with an appropriate sensor system.

  11. Backing collisions: a study of drivers’ eye and backing behaviour using combined rear-view camera and sensor systems

    PubMed Central

    Hurwitz, David S; Pradhan, Anuj; Fisher, Donald L; Knodler, Michael A; Muttart, Jeffrey W; Menon, Rajiv; Meissner, Uwe

    2012-01-01

    Context Backing crash injuries can be severe; approximately 200 of the 2,500 reported injuries of this type per year to children under the age of 15 years result in death. Technology for assisting drivers when backing has limited success in preventing backing crashes. Objectives Two questions are addressed: Why is the reduction in backing crashes moderate when rear-view cameras are deployed? Could rear-view cameras augment sensor systems? Design 46 drivers (36 experimental, 10 control) completed 16 parking trials over 2 days (eight trials per day). Experimental participants were provided with a sensor camera system; controls were not. Three crash scenarios were introduced. Setting Parking facility at UMass Amherst, USA. Subjects 46 drivers (33 men, 13 women), average age 29 years, who were Massachusetts residents licensed within the USA for an average of 9.3 years. Interventions Vehicles equipped with a rear-view camera and sensor system-based parking aid. Main Outcome Measures Subjects' eye fixations while driving and researchers' observations of collisions with objects during backing. Results Only 20% of drivers looked at the rear-view camera before backing, and 88% of those did not crash. Of those who did not look at the rear-view camera before backing, 46% looked after the sensor warned the driver. Conclusions This study indicates that drivers not only attend to an audible warning, but will look at a rear-view camera if available. Evidence suggests that when used appropriately, rear-view cameras can mitigate the occurrence of backing crashes, particularly when paired with an appropriate sensor system. PMID:20363812

  12. Coincidence ion imaging with a fast frame camera

    NASA Astrophysics Data System (ADS)

    Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen

    2014-12-01

    A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the PMT processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of the PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting momentum-matched co-fragment pairs (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide.
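
    The real-time centroiding and intensity/peak-height correlation described above might be sketched as follows; the thresholding, connected-component labelling and rank-order matching are illustrative simplifications, not the authors' production algorithm.

    ```python
    import numpy as np
    from scipy import ndimage

    def centroid_ion_spots(frame, threshold):
        """Find ion spots in one camera frame: sub-pixel centroids plus the
        integrated intensity later matched against PMT peak heights."""
        labels, n = ndimage.label(frame > threshold)
        idx = range(1, n + 1)
        centroids = np.array(ndimage.center_of_mass(frame, labels, idx))
        intensities = np.array(ndimage.sum(frame, labels, idx))
        return centroids, intensities

    def match_spots_to_tof_peaks(intensities, peak_heights):
        """Rank-order pairing: brighter spots pair with taller time-of-flight
        peaks (a simplification of the correlation step in the abstract)."""
        return list(zip(np.argsort(intensities)[::-1],
                        np.argsort(peak_heights)[::-1]))
    ```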

  13. Development of the radial neutron camera system for the HL-2A tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y. P., E-mail: zhangyp@swip.ac.cn; Yang, J. W.; Liu, Yi

    2016-06-15

    A new radial neutron camera system has been developed and operated recently in the HL-2A tokamak to measure spatially and temporally resolved 2.5 MeV D-D fusion neutrons, enhancing the understanding of energetic-ion physics. The camera mainly consists of a multichannel collimator, liquid-scintillation detectors, shielding systems, and a data acquisition system. Measurements of the D-D fusion neutrons using the camera were successfully performed during the 2015 HL-2A experiment campaign. The measurements show that the distribution of the fusion neutrons in the HL-2A plasma has a peaked profile, suggesting that the neutral-beam-injection beam ions in the plasma have a peaked distribution. It also suggests that the neutrons are primarily produced by beam-target reactions in the plasma core region. The measurement results from the neutron camera are consistent with the results of both a standard ²³⁵U fission chamber and NUBEAM neutron calculations. In this paper, the new radial neutron camera system on HL-2A and the first experimental results are described.

  14. Coordinating High-Resolution Traffic Cameras : Developing Intelligent, Collaborating Cameras for Transportation Security and Communications

    DOT National Transportation Integrated Search

    2015-08-01

    Cameras are used prolifically to monitor transportation incidents, infrastructure, and congestion. Traditional camera systems often require human monitoring and only offer low-resolution video. Researchers for the Exploratory Advanced Research (EAR) ...

  15. A Remote-Control Airship for Coastal and Environmental Research

    NASA Astrophysics Data System (ADS)

    Puleo, J. A.; O'Neal, M. A.; McKenna, T. E.; White, T.

    2008-12-01

    The University of Delaware recently acquired an 18 m (60 ft) remote-control airship capable of carrying a 36 kg (120 lb) scientific payload for coastal and environmental research. By combining the benefits of tethered balloons (stable dwell time) and powered aircraft (ability to navigate), the platform allows for high-resolution data collection in both time and space. The platform was developed by Galaxy Blimps, LLC of Dallas, TX for collecting high-definition video of sporting events. The airship can fly to altitudes of at least 600 m (2000 ft) reaching speeds between zero and 18 m/s (35 knots) in winds up to 13 m/s (25 knots). Using a hand-held console and radio transmitter, a ground-based operator can manipulate the orientation and throttle of two gasoline engines, and the orientation of four fins. Airship location is delivered to the operator through a data downlink from an onboard altimeter and global positioning system (GPS) receiver. Scientific payloads are easily attached to a rail system on the underside of the blimp. Data collection can be automated (fixed time intervals) or triggered by a second operator using a second hand-held console. Data can be stored onboard or transmitted in real-time to a ground-based computer. The first science mission (Fall 2008) is designed to collect images of tidal inundation of a salt marsh to support numerical modeling of water quality in the Murderkill River Estuary in Kent County, Delaware (a tributary of Delaware Bay in the USA Mid-Atlantic region). Time-sequenced imagery will be collected by a ten-megapixel camera and a thermal-infrared imager mounted in separate remote-control, gyro-stabilized camera mounts on the blimp. Live video feeds will be transmitted to the instrument operator on the ground. Resulting time series data will ultimately be used to compare/update independent estimates of inundation based on LiDAR elevations and a suite of tide and temperature gauges.

  16. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    PubMed Central

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing. PMID:28981533

  17. Hubble Eyes Galaxy as it Gets a Cosmic Hair Ruffling

    NASA Image and Video Library

    2014-08-01

    From objects as small as Newton's apple to those as large as a galaxy, no physical body is free from the stern bonds of gravity, as evidenced in this stunning picture captured by the Wide Field Camera 3 and Advanced Camera for Surveys onboard the NASA/ESA Hubble Space Telescope. Here we see two spiral galaxies engaged in a cosmic tug-of-war — but in this contest, there will be no winner. The structures of both objects are slowly distorted to resemble new forms, and in some cases, merge together to form new, super galaxies. A similar fate awaits the Milky Way Galaxy, which will ultimately merge with our closest galactic partner, the Andromeda Galaxy. There is no need to panic, however, as this process takes several hundreds of millions of years. Not all interacting galaxies result in mergers, though. Whether a merger occurs depends on the mass of each galaxy, as well as the relative velocities of the two bodies. It is quite possible that the event pictured here, romantically named 2MASX J06094582-2140234, will avoid a merger altogether and will merely distort the arms of each spiral without colliding — the cosmic equivalent of a hair ruffling! These galactic interactions also trigger new regions of star formation in the galaxies involved, causing them to be extremely luminous in the infrared part of the spectrum. For this reason, these types of galaxies are referred to as LIRGs, or Luminous Infrared Galaxies. This image was taken as part of a Hubble survey of the central regions of LIRGs in the local Universe, which also used the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) instrument. Credit: ESA/Hubble & NASA; Acknowledgement: Luca Limatola

  18. AMIE SMART-1: review of results and legacy 10 years after launch

    NASA Astrophysics Data System (ADS)

    Josset, Jean-Luc; Souchon, Audrey; Josset, Marie; Foing, Bernard

    2014-05-01

    The Advanced Moon micro-Imager Experiment (AMIE) camera was launched in September 2003 onboard the ESA SMART-1 spacecraft. We review the technical characteristics, scientific objectives and results of the instrument, 10 years after its launch. The AMIE camera is an ultra-compact imaging system that includes a tele-objective with a 5.3° x 5.3° field of view and an imaging sensor of 1024 x 1024 pixels. It is dedicated to spectral imaging with three spectral filters (750, 915 and 960 nm), photometric measurements (filter-free CCD area), and the Laser-link experiment (laser filter at 847 nm). The AMIE camera was designed to acquire high-resolution images of the lunar surface, in white light and for specific spectral bands, under a number of different viewing conditions and geometries. Specifically, its main scientific objectives included: (i) imaging of high latitude regions in the southern hemisphere, in particular the South Pole Aitken basin and the permanently shadowed regions close to the South Pole; (ii) determination of the photometric properties of the lunar surface from observations at different phase angles (physical properties of the regolith); (iii) multi-band imaging for constraining the chemical and mineral composition of the surface; (iv) detection and characterisation of lunar non-mare volcanic units; (v) study of lithological variations from impact craters and implications for crustal heterogeneity. The study of AMIE images enhanced the knowledge of the lunar surface, in particular regarding photometric modelling and surface physical properties of localized lunar areas and geological units. References: http://scholar.google.nl/scholar?q=smart-1+amie We acknowledge ESA, member states, industry and institutes for their contribution, and the members of the AMIE Team: J.-L. Josset, P. Plancke, Y. Langevin, P. Cerroni, M. C. De Sanctis, P. Pinet, S. Chevrel, S. Beauvivre, B.A. Hofmann, M. Josset, D. Koschny, M. Almeida, K. Muinonen, J. Piironen, M. A. Barucci, P. Ehrenfreund, Yu. Shkuratov, V. Shevchenko, Z. Sodnik, S. Mancuso, F. Ankersen, B.H. Foing, and other associated scientists, collaborators, students and colleagues.

  19. Engineering study for pallet adapting the Apollo laser altimeter and photographic camera system for the Lidar Test Experiment on orbital flight tests 2 and 4

    NASA Technical Reports Server (NTRS)

    Kuebert, E. J.

    1977-01-01

    A Laser Altimeter and Mapping Camera System was included in the Apollo Lunar Orbital Experiment Missions. The backup system, never used in the Apollo Program, is available for use in the Lidar Test Experiments on STS Orbital Flight Tests 2 and 4. Studies were performed to assess the problems associated with installation and operation of the Mapping Camera System in the STS. They covered the photographic capabilities of the Mapping Camera System, its mechanical and electrical interfaces with the STS, documentation, operation and survivability in the expected environments, ground support equipment, testing, and field support.

  20. Utilization and viability of biologically-inspired algorithms in a dynamic multiagent camera surveillance system

    NASA Astrophysics Data System (ADS)

    Mundhenk, Terrell N.; Dhavale, Nitin; Marmol, Salvador; Calleja, Elizabeth; Navalpakkam, Vidhya; Bellman, Kirstie; Landauer, Chris; Arbib, Michael A.; Itti, Laurent

    2003-10-01

    In view of the growing complexity of computational tasks and their design, we propose that certain interactive systems may be better designed by utilizing computational strategies based on the study of the human brain. Compared with current engineering paradigms, brain theory offers the promise of improved self-organization and adaptation to the current environment, freeing the programmer from having to address those issues in a procedural manner when designing and implementing large-scale complex systems. To advance this hypothesis, we discuss a multi-agent surveillance system where 12 agent CPUs, each with its own camera, compete and cooperate to monitor a large room. To cope with the overload of image data streaming from 12 cameras, we take inspiration from the primate's visual system, which allows the animal to perform real-time selection of the few most conspicuous locations in the visual input. This is accomplished by having each camera agent utilize the bottom-up, saliency-based visual attention algorithm of Itti and Koch (Vision Research 2000;40(10-12):1489-1506) to scan the scene for objects of interest. Real-time operation is achieved using a distributed version that runs on a 16-CPU Beowulf cluster composed of the agent computers. The algorithm guides cameras to track and monitor salient objects based on maps of color, orientation, intensity, and motion. To spread camera view points or create cooperation in monitoring highly salient targets, camera agents bias each other by increasing or decreasing the weight of different feature vectors in other cameras, using mechanisms similar to excitation and suppression that have been documented in electrophysiology, psychophysics and imaging studies of low-level visual processing. In addition, if cameras need to compete for computing resources, allocation of computational time is weighted based upon the history of each camera. A camera agent that has a history of seeing more salient targets is more likely to obtain computational resources. The system demonstrates the viability of biologically inspired systems in real-time tracking. In future work we plan on implementing additional biological mechanisms for cooperative management of both the sensor and processing resources in this system that include top-down biasing for target specificity as well as novelty and the activity of the tracked object in relation to sensitive features of the environment.
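
    A heavily reduced sketch of the bottom-up saliency computation referenced above: center-surround differences on intensity and color-opponency channels (after Itti and Koch), plus an optional motion channel, combined into one conspicuity map. The scale pairs, normalization and channel choices here are assumptions, and the full model also includes orientation maps, which this sketch omits.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def saliency_map(rgb, motion=None):
        """Reduced Itti-Koch-style saliency for one frame (H, W, 3) uint8."""
        img = rgb.astype(float) / 255.0
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        channels = [(r + g + b) / 3.0,      # intensity
                    r - g,                  # red/green opponency
                    b - (r + g) / 2.0]      # blue/yellow opponency
        if motion is not None:
            channels.append(motion.astype(float))
        sal = np.zeros(img.shape[:2])
        for ch in channels:
            for center, surround in [(1, 4), (2, 8)]:   # center-surround scale pairs
                cs = np.abs(gaussian_filter(ch, center) - gaussian_filter(ch, surround))
                sal += (cs - cs.min()) / (cs.max() - cs.min() + 1e-9)
        return sal / sal.max()   # peak of this map = most conspicuous location
    ```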

  1. A telephoto camera system with shooting direction control by gaze detection

    NASA Astrophysics Data System (ADS)

    Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro

    2015-05-01

    For safe driving, it is important for the driver to check traffic conditions, such as traffic lights or traffic signs, as early as possible. If an on-vehicle camera captures images of the objects needed to understand traffic conditions from a long distance and shows them to the driver, the driver can understand traffic conditions earlier. To image distant objects clearly, the focal length of the camera must be long; but with a long focal length, an on-vehicle camera does not have a large enough field of view to check traffic conditions. Therefore, in order to obtain the necessary images from a long distance, the camera must combine a long focal length with a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took a telescopic image and displayed it to the driver. However, in that study the driver used a touch panel to indicate the shooting direction, which can disturb driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, to avoid disturbing driving. The proposed system is composed of a gaze detector and an active telephoto camera whose shooting direction is controlled. We adopt a non-wearable detection method to avoid hindering driving. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. We confirmed experimentally that the proposed system captures images of the location straight ahead of the subject's gaze.

  2. SPARTAN Near-IR Camera | SOAR

    Science.gov Websites

    SPARTAN Near-IR Camera: System Overview. The Spartan Infrared Camera is a high spatial resolution near-IR imager. Spartan has a focal plane consisting of four …

  3. Control system for several rotating mirror camera synchronization operation

    NASA Astrophysics Data System (ADS)

    Liu, Ningwen; Wu, Yunfeng; Tan, Xianxiang; Lai, Guoji

    1997-05-01

    This paper introduces a single-chip microcomputer control system for the synchronized operation of several rotating-mirror high-speed cameras. The system consists of four parts: the microcomputer control unit (including the synchronization part, precise measurement part and time delay part), the shutter control unit, the motor driving unit and the high-voltage pulse generator unit. The control system has been used to control the synchronized operation of GSI cameras (driven by a motor) and FJZ-250 rotating-mirror cameras (driven by a gas-driven turbine). We have obtained films of the same object from different directions at different speeds or at the same speed.

  4. Evaluation of the MSFC facsimile camera system as a tool for extraterrestrial geologic exploration

    NASA Technical Reports Server (NTRS)

    Wolfe, E. W.; Alderman, J. D.

    1971-01-01

    The utility of the Marshall Space Flight Center (MSFC) facsimile camera system for extraterrestrial geologic exploration was investigated during the spring of 1971 near Merriam Crater in northern Arizona. Although the system with its present hard-wired recorder operates erratically, the imagery showed that the camera could be developed into a prime imaging tool for automated missions. Its utility would be enhanced by the development of computer techniques that use the digital camera output to construct topographic maps, and it needs increased resolution for examining near-field details. A supplementary imaging system may be necessary for hand-specimen examination at low magnification.

  5. The use of consumer depth cameras for 3D surface imaging of people with obesity: A feasibility study.

    PubMed

    Wheat, J S; Clarkson, S; Flint, S W; Simpson, C; Broom, D R

    2018-05-21

    Three-dimensional (3D) surface imaging is a viable alternative to traditional body morphology measures, but the feasibility of using this technique with people with obesity has not been fully established. Therefore, the aim of this study was to investigate the validity, repeatability and acceptability of a consumer depth camera 3D surface imaging system in imaging people with obesity. The concurrent validity of the depth-camera-based system was investigated by comparing measures of mid-trunk volume to a gold standard. The repeatability and acceptability of the depth camera system was assessed in people with obesity at a clinic. There was evidence of a fixed systematic difference between the depth camera system and the gold standard but excellent correlation between volume estimates (r² = 0.997), with little evidence of proportional bias. The depth camera system was highly repeatable: low typical error (0.192 L), high intraclass correlation coefficient (>0.999) and low technical error of measurement (0.64%). Depth-camera-based 3D surface imaging was also acceptable to people with obesity. It is feasible (valid, repeatable and acceptable) to use a low-cost, flexible 3D surface imaging system to monitor the body size and shape of people with obesity in a clinical setting.
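
    For readers wanting to reproduce the repeatability statistics quoted above, a minimal sketch of the standard test-retest formulas, assuming two repeated measurements per participant (the paper's exact procedure may differ; the ICC needs an ANOVA or mixed-model formulation not shown here):

    ```python
    import numpy as np

    def repeatability_metrics(trial1, trial2):
        """Typical error (SD of differences / sqrt(2)), technical error of
        measurement as % of the grand mean, and Pearson r^2 between trials."""
        t1, t2 = np.asarray(trial1, float), np.asarray(trial2, float)
        diffs = t1 - t2
        typical_error = diffs.std(ddof=1) / np.sqrt(2.0)
        tem_percent = 100.0 * typical_error / np.concatenate([t1, t2]).mean()
        r = np.corrcoef(t1, t2)[0, 1]
        return typical_error, tem_percent, r ** 2
    ```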

  6. Development of a camera casing suited for cryogenic and vacuum applications

    NASA Astrophysics Data System (ADS)

    Delaquis, S. C.; Gornea, R.; Janos, S.; Lüthi, M.; von Rohr, Ch Rudolf; Schenk, M.; Vuilleumier, J.-L.

    2013-12-01

    We report on the design, construction, and operation of a PID temperature-controlled and vacuum-tight camera casing. The camera casing contains a commercial digital camera and a lighting system. The design of the camera casing and its components are discussed in detail. Pictures taken by this cryo-camera while immersed in argon vapour and liquid nitrogen are presented. The cryo-camera can provide a live view inside cryogenic set-ups and allows video to be recorded.
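
    A minimal sketch of the kind of PID loop used for the casing's temperature control; the gains, sample time and heater interface are illustrative assumptions, not the authors' implementation.

    ```python
    class PID:
        """Discrete PID controller holding the casing at a set temperature."""

        def __init__(self, kp, ki, kd, setpoint, dt):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.setpoint, self.dt = setpoint, dt
            self.integral = 0.0
            self.prev_error = None

        def update(self, measured_temp):
            """Return heater drive (arbitrary units) for one sample period."""
            error = self.setpoint - measured_temp
            self.integral += error * self.dt
            derivative = (0.0 if self.prev_error is None
                          else (error - self.prev_error) / self.dt)
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative
    ```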

  7. Use and validation of mirrorless digital single light reflex camera for recording of vitreoretinal surgeries in high definition

    PubMed Central

    Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish

    2018-01-01

    Purpose: The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Methods: Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Results: Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Conclusion: Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching. PMID:29283133

  8. Use and validation of mirrorless digital single light reflex camera for recording of vitreoretinal surgeries in high definition.

    PubMed

    Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish

    2018-01-01

    The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching.

  9. Enhanced Early View of Ceres from Dawn

    NASA Image and Video Library

    2014-12-05

    As the Dawn spacecraft flies through space toward the dwarf planet Ceres, the unexplored world appears to its camera as a bright light in the distance, full of possibility for scientific discovery. This view was acquired as part of a final calibration of the science camera before Dawn's arrival at Ceres. To accomplish this, the camera needed to take pictures of a target that appears just a few pixels across. On Dec. 1, 2014, Ceres was about nine pixels in diameter, nearly perfect for this calibration. The images provide data on very subtle optical properties of the camera that scientists will use when they analyze and interpret the details of some of the pictures returned from orbit. Ceres is the bright spot in the center of the image. Because the dwarf planet is much brighter than the stars in the background, the camera team selected a long exposure time to make the stars visible. The long exposure made Ceres appear overexposed, and exaggerated its size; this was corrected by superimposing a shorter exposure of the dwarf planet in the center of the image. A cropped, magnified view of Ceres appears in the inset image at lower left. The image was taken on Dec. 1, 2014 with the Dawn spacecraft's framing camera, using a clear spectral filter. Dawn was about 740,000 miles (1.2 million kilometers) from Ceres at the time. Ceres is 590 miles (950 kilometers) across and was discovered in 1801. http://photojournal.jpl.nasa.gov/catalog/PIA19050

  10. A high-speed digital camera system for the observation of rapid H-alpha fluctuations in solar flares

    NASA Technical Reports Server (NTRS)

    Kiplinger, Alan L.; Dennis, Brian R.; Orwig, Larry E.

    1989-01-01

    Researchers developed a prototype digital camera system for obtaining H-alpha images of solar flares with 0.1 s time resolution. They intend to operate this system in conjunction with SMM's Hard X-Ray Burst Spectrometer, with X-ray instruments which will be available on the Gamma Ray Observatory and eventually with the Gamma Ray Imaging Device (GRID), and with the High Resolution Gamma-Ray and Hard X-Ray Spectrometer (HIREGS) which are being developed for the Max '91 program. The digital camera has recently proven to be successful as a one-camera system operating in the blue wing of H-alpha during the first Max '91 campaign. Construction and procurement of a second and possibly a third camera for simultaneous observations at other wavelengths are underway, as are analyses of the campaign data.

  11. Cameras Improve Navigation for Pilots, Drivers

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Advanced Scientific Concepts Inc. (ASC), of Santa Barbara, California, received SBIR awards and other funding from the Jet Propulsion Laboratory, Johnson Space Center, and Langley Research Center to develop and refine its 3D flash LIDAR technologies for space applications. Today, ASC's NASA-derived technology is sold to assist with collision avoidance, navigation, and object tracking.

  12. 77 FR 58813 - Western Pacific Fisheries; Approval of a Marine Conservation Plan for American Samoa

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-24

    .... Determining genetic connectivity of coral reef ecosystems in the Samoa archipelago; 21. Surveying fish... quality of deep reef habitat through use of drop cameras; 32. Coral recruitment survey and monitoring; 33... scientific awareness of junior biologist; and 46. Monitoring of coral reefs in Independent Samoa. Objective 7...

  13. Low-Cost Alternative for Signal Generators in the Physics Laboratory

    ERIC Educational Resources Information Center

    Pathare, Shirish Rajan; Raghavendra, M. K.; Huli, Saurabhee

    2017-01-01

    Recently, devices such as the optical mouse of a computer, webcams, the Wii remote, and digital cameras have been used to record and analyze different physical phenomena quantitatively. Devices like tablets and smartphones are also becoming popular. Different scientific applications available at Google Play (Android devices) or the App Store (iOS…

  14. Multigenerational Learning for Expanding the Educational Involvement of Bilinguals Experiencing Academic Difficulties

    ERIC Educational Resources Information Center

    Martínez-Álvarez, Patricia

    2017-01-01

    Focusing on two bilingual children experiencing learning difficulties, I explore the scientific representations these students generate in an afterschool programme where they have opportunities to exercise agency. In the programme, children use a digital camera to document science in their lives and engage in conversations about the products they…

  15. Taking on the Heat--A Narrative Account of How Infrared Cameras Invite Instant Inquiry

    ERIC Educational Resources Information Center

    Haglund, Jesper; Jeppsson, Fredrik; Schönborn, Konrad J.

    2016-01-01

    Integration of technology, social learning and scientific models offers pedagogical opportunities for science education. A particularly interesting area is thermal science, where students often struggle with abstract concepts, such as heat. In taking on this conceptual obstacle, we explore how hand-held infrared (IR) visualization technology can…

  16. Saturn Apollo Program

    NASA Image and Video Library

    1967-08-01

    The Apollo Telescope Mount (ATM), designed and developed by the Marshall Space Flight Center, served as the primary scientific instrument unit aboard the Skylab. The ATM contained eight complex astronomical instruments designed to observe the Sun over a wide spectrum from visible light to x-rays. This photo depicts a mockup of the ATM contamination monitor camera and photometer.

  17. Science Opportunities with the Near-IR Camera (NIRCam) on the James Webb Space Telescope (JWST)

    NASA Technical Reports Server (NTRS)

    Beichman, Charles A.; Rieke, Marcia; Eisenstein, Daniel; Greene, Thomas P.; Krist, John; McCarthy, Don; Meyer, Michael; Stansberry, John

    2012-01-01

    The Near-Infrared Camera (NIRCam) on the James Webb Space Telescope (JWST) offers revolutionary gains in sensitivity throughout the 1-5 micrometer region. NIRCam will enable great advances in all areas of astrophysics, from the composition of objects in our own Kuiper Belt and the physical properties of planets orbiting nearby stars to the formation of stars and the detection of the youngest galaxies in the Universe. NIRCam also plays an important role in the initial alignment of JWST and the long-term maintenance of its image quality. NIRCam is presently undergoing instrument Integration and Test in preparation for delivery to the JWST project. Key near-term milestones include the completion of cryogenic testing of the entire instrument; demonstration of scientific and wavefront sensing performance requirements; testing of replacement H2RG detector arrays; and an analysis of coronagraphic performance in light of measured telescope wavefront characteristics. This paper summarizes the performance of NIRCam, the scientific and education/outreach goals of the science team, and some results of the on-going testing program.

  18. Accuracy Potential and Applications of MIDAS Aerial Oblique Camera System

    NASA Astrophysics Data System (ADS)

    Madani, M.

    2012-07-01

    Airborne oblique cameras such as the Fairchild T-3A were initially used for military reconnaissance in the 1930s. A modern professional digital oblique camera such as MIDAS (Multi-camera Integrated Digital Acquisition System) is used to generate lifelike three-dimensional imagery for visualizations, GIS applications, architectural modeling, city modeling, games, simulators, etc. Oblique imagery provides the best vantage for assessing and reviewing changes to the local government tax base, property valuation assessment, and the buying and selling of residential/commercial properties, supporting better decisions in a more timely manner. Oblique imagery is also used for infrastructure monitoring, helping ensure the safe operation of transportation, utilities, and facilities. Sanborn Mapping Company acquired one MIDAS from TrackAir in 2011. This system consists of four tilted (45 degrees) cameras and one vertical camera connected to a dedicated data acquisition computer system. The five digital cameras are based on the Canon EOS 1DS Mark3 with Zeiss lenses. The CCD size is 5,616 by 3,744 (21 MPixels) with a pixel size of 6.4 microns. Multiple flights using different camera configurations (nadir/oblique (28 mm/50 mm) and (50 mm/50 mm)) were flown over downtown Colorado Springs, Colorado. Boresight flights for the 28 mm nadir camera were flown at 600 m and 1,200 m, and for the 50 mm nadir camera at 750 m and 1,500 m. Cameras were calibrated by using a 3D cage and multiple convergent images utilizing the Australis model. In this paper, the MIDAS system is described; a number of real data sets collected during the aforementioned flights are presented together with their associated flight configurations; the data processing workflow, system calibration and quality control workflows are highlighted; and the achievable accuracy is presented in some detail. This study revealed that an accuracy of about 1 to 1.5 GSD (Ground Sample Distance) for planimetry and about 2 to 2.5 GSD for the vertical can be achieved. Remaining systematic errors were modeled by analyzing residuals using a correction grid. The results of the final bundle adjustments are sufficient to enable Sanborn to produce DEMs/DTMs and orthophotos from the nadir imagery and create 3D models using georeferenced oblique imagery.
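
    The accuracy figures above are quoted in multiples of GSD. As a worked illustration using the geometry quoted in the abstract (6.4 µm pixels, 28 mm and 50 mm lenses, 45° tilt, 600 m flying height), the ground sample distance can be sketched as follows; the formulas are the standard pinhole approximations, not Sanborn's production model.

    ```python
    import math

    def gsd_nadir(pixel_size_m, focal_length_m, altitude_m):
        """Ground sample distance of the nadir camera."""
        return pixel_size_m * altitude_m / focal_length_m

    def gsd_oblique(pixel_size_m, focal_length_m, altitude_m, tilt_deg):
        """Approximate along-look GSD at the image centre of a tilted camera:
        slant range grows as H/cos(t); the ground footprint stretches by
        another factor of 1/cos(t)."""
        t = math.radians(tilt_deg)
        slant = altitude_m / math.cos(t)
        return (pixel_size_m * slant / focal_length_m) / math.cos(t)

    print(gsd_nadir(6.4e-6, 0.028, 600))        # ~0.14 m for the 28 mm nadir camera
    print(gsd_oblique(6.4e-6, 0.050, 600, 45))  # ~0.15 m for a 50 mm oblique camera
    ```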

  19. Compensation for positioning error of industrial robot for flexible vision measuring system

    NASA Astrophysics Data System (ADS)

    Guo, Lei; Liang, Yajun; Song, Jincheng; Sun, Zengyu; Zhu, Jigui

    2013-01-01

    The positioning error of the robot is a main factor in the accuracy of a flexible coordinate measuring system consisting of a universal industrial robot and a visual sensor. Present compensation methods for positioning error, based on a kinematic model of the robot, have a significant limitation: they are not effective over the whole measuring space. A new compensation method for the positioning error of the robot, based on vision measuring techniques, is presented. One approach is to set global control points in the measured field and attach an orientation camera to the vision sensor. The global control points are then measured by the orientation camera to calculate the transformation from the current position of the sensor system to the global coordinate system, and the positioning error of the robot is compensated. Another approach is to set control points on the vision sensor and two large-field cameras behind the sensor. The three-dimensional coordinates of the control points are then measured and the pose and position of the sensor are calculated in real time. Experimental results show that the RMS spatial positioning error is 3.422 mm with a single camera and 0.031 mm with dual cameras. The conclusion is that the single-camera method needs algorithmic improvement for higher accuracy, while the accuracy of the dual-camera method is suitable for application.
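
    The control-point compensation described above amounts to re-estimating a rigid transform from measured control-point coordinates to their known global coordinates. A minimal sketch using the standard SVD (Kabsch) solution follows; the choice of solver is an assumption here, since the abstract does not spell out its method.

    ```python
    import numpy as np

    def rigid_transform(P, Q):
        """Best-fit rotation R and translation t mapping points P onto Q,
        e.g. control points measured in the sensor frame onto their known
        global coordinates. P, Q: (N, 3) corresponding point sets."""
        P, Q = np.asarray(P, float), np.asarray(Q, float)
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)                      # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cQ - R @ cP
        return R, t
    ```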

  20. PubMed Central

    Baum, S.; Sillem, M.; Ney, J. T.; Baum, A.; Friedrich, M.; Radosa, J.; Kramer, K. M.; Gronwald, B.; Gottschling, S.; Solomayer, E. F.; Rody, A.; Joukhadar, R.

    2017-01-01

    Introduction Minimally invasive operative techniques are being used increasingly in gynaecological surgery. The expansion of the laparoscopic operation spectrum is in part the result of improved imaging. This study investigates the practical advantages of using 3D cameras in routine surgical practice. Materials and Methods Two different 3-dimensional camera systems were compared with a 2-dimensional HD system; the operating surgeon's experiences were documented immediately postoperatively using a questionnaire. Results Significant advantages were reported for suturing and cutting of anatomical structures when using the 3D compared to 2D camera systems. There was only a slight advantage for coagulating. The use of 3D cameras significantly improved the general operative visibility and in particular the representation of spatial depth compared to 2-dimensional images. There was not a significant advantage for image width. Depiction of adhesions and retroperitoneal neural structures was significantly improved by the stereoscopic cameras, though this did not apply to blood vessels, ureter, uterus or ovaries. Conclusion 3-dimensional cameras were particularly advantageous for the depiction of fine anatomical structures due to improved spatial depth representation compared to 2D systems. 3D cameras provide the operating surgeon with a monitor image that more closely resembles actual anatomy, thus simplifying laparoscopic procedures. PMID:28190888

  1. Analysis of edge density fluctuation measured by trial KSTAR beam emission spectroscopy system

    NASA Astrophysics Data System (ADS)

    Nam, Y. U.; Zoletnik, S.; Lampert, M.; Kovácsik, Á.

    2012-10-01

    A beam emission spectroscopy (BES) system based on a direct-imaging avalanche photodiode (APD) camera has been designed for the Korea Superconducting Tokamak Advanced Research (KSTAR) device, and a trial system has been constructed and installed to evaluate the feasibility of the design. The system contains two cameras: an APD camera for the BES measurement and a fast visible camera for position calibration. Two pneumatically actuated mirrors are positioned at the front and rear of the lens optics. The front mirror can switch the measurement between the edge and core regions of the plasma, and the rear mirror can switch between the APD and the visible camera. All systems worked properly and the measured photon flux was reasonable, as expected from simulation. While the measurement data from the trial system were limited, they revealed some interesting characteristics of KSTAR plasmas, suggesting future research with the fully installed BES system. The analysis results and the development plan are presented in this paper.

  2. Real-time tracking and fast retrieval of persons in multiple surveillance cameras of a shopping mall

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; Baan, Jan; Landsmeer, Sander; Kruszynski, Chris; van Antwerpen, Gert; Dijk, Judith

    2013-05-01

    The capability to track individuals in CCTV cameras is important for surveillance applications at large areas such as train stations, airports and shopping centers. However, it is laborious to track and trace people over multiple cameras. In this paper, we present a system for real-time tracking and fast interactive retrieval of persons in video streams from multiple static surveillance cameras. This system is demonstrated in a shopping mall, where the cameras are positioned without overlapping fields of view and have different lighting conditions. The results show that the system allows an operator to find the origin or destination of a person more efficiently. Misses are reduced by 37%, which is a significant improvement.

  3. FPGA Based Adaptive Rate and Manifold Pattern Projection for Structured Light 3D Camera System †

    PubMed Central

    Lee, Sukhan

    2018-01-01

    The quality of the captured point cloud and the scanning speed of a structured-light 3D camera system depend on its capability to handle object surfaces with a large reflectance variation, traded off against the required number of patterns to be projected. In this paper, we propose and implement a flexible embedded framework that is capable of triggering the camera once or multiple times to capture one or more projections within a single camera exposure setting. This allows the 3D camera system to synchronize the camera and projector even for mismatched frame rates, so that the system is capable of projecting different types of patterns for different scan-speed applications. This lets the system capture a high-quality 3D point cloud even for surfaces with a large reflectance variation while achieving a high scan speed. The proposed framework is implemented on a Field Programmable Gate Array (FPGA), where the camera trigger is adaptively generated in such a way that the position and the number of triggers are automatically determined according to camera exposure settings. In other words, the projection frequency is adaptive to different scanning applications without altering the architecture. In addition, the proposed framework is unique in that it does not require any external memory for storage, because pattern pixels are generated in real time, which minimizes the complexity and size of the application-specific integrated circuit (ASIC) design and implementation. PMID:29642506
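
    A minimal host-side sketch of the trigger-planning arithmetic the FPGA performs (deriving the number and position of triggers from the exposure setting). The centring policy and all names are assumptions for illustration; the actual design generates these signals in hardware.

    ```python
    def plan_triggers(exposure_s, pattern_period_s, n_patterns_wanted):
        """Return trigger offsets (seconds into the exposure) for as many
        projector patterns as fit inside one camera exposure window."""
        n_fit = int(exposure_s // pattern_period_s)
        n = min(n_fit, n_patterns_wanted)
        if n == 0:
            raise ValueError("exposure too short for even one pattern")
        slack = exposure_s - n * pattern_period_s
        start = slack / 2.0                      # centre the burst in the window
        return [start + i * pattern_period_s for i in range(n)]

    # e.g. a 10 ms exposure with a 2.5 ms pattern period fits 4 patterns:
    print(plan_triggers(0.010, 0.0025, 6))       # [0.0, 0.0025, 0.005, 0.0075]
    ```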

  4. Combined use of a priori data for fast system self-calibration of a non-rigid multi-camera fringe projection system

    NASA Astrophysics Data System (ADS)

    Stavroulakis, Petros I.; Chen, Shuxiao; Sims-Waterhouse, Danny; Piano, Samanta; Southon, Nicholas; Bointon, Patrick; Leach, Richard

    2017-06-01

    In non-rigid fringe projection 3D measurement systems, where either the camera or projector setup can change significantly between measurements or the object needs to be tracked, self-calibration has to be carried out frequently to keep the measurements accurate. In fringe projection systems, it is common to use methods developed initially for photogrammetry for the calibration of the camera(s) in the system in terms of extrinsic and intrinsic parameters. To calibrate the projector(s), an extra correspondence between a pre-calibrated camera and an image created by the projector is performed. These recalibration steps are usually time-consuming and involve the measurement of calibrated patterns on planes before the actual object can continue to be measured after a motion of a camera or projector has been introduced into the setup; hence they do not facilitate fast 3D measurement of objects when frequent experimental setup changes are necessary. By employing and combining a priori information via inverse rendering, on-board sensors and deep learning, and by leveraging a graphics processing unit (GPU), we assess a fine camera pose estimation method which is based on optimising the rendering of a model of the scene and the object to match the view from the camera. We find that the success of this calibration pipeline can be greatly improved by using adequate a priori information from the aforementioned sources.

  5. Study of Pitch Attitude Estimation Using a High-Definition TV (HDTV) Camera on the Japanese Lunar Explorer SELENE (KAGUYA)

    NASA Astrophysics Data System (ADS)

    Sobue, Shinichi; Yamazaki, Junichi; Matsumoto, Shuichi; Konishi, Hisahiro; Maejima, Hironori; Sasaki, Susumu; Kato, Manabu; Mitsuhashi, Seiji; Tachino, Junichi

    The lunar explorer SELENE (also called KAGUYA) carried thirteen scientific mission instruments to reveal the origin and evolution of the Moon and to investigate its possible future utilization. In addition to the scientific instruments, a high-definition TV (HDTV) camera provided by the Japan Broadcasting Corporation (NHK) was carried on KAGUYA to promote public outreach. We usually use housekeeping telemetry data to derive the satellite attitude, along with orbit determination and propagated information. However, it takes time to derive this information, since orbit determination and propagation calculations require the use of the orbital model. When a malfunction of a KAGUYA reaction wheel occurred, we did not have correct attitude information, which means we did not have a correct orbit determination in a timely fashion. However, when we checked the HDTV movies, we found that horizon information on the lunar surface, derived from the HDTV moving images used as a horizon sensor, was very useful for detecting the attitude of KAGUYA. We then compared this information with the attitude information derived from orbital telemetry to validate the accuracy of the HDTV-derived estimation. The comparison shows good agreement for the pitch attitude, and we could estimate the pitch-angle change during KAGUYA mission operations simply and quickly. In this study, we show the usefulness of the HDTV camera as a horizon sensor.
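
    A toy version of the horizon-sensor idea: locate the horizon edge in each image column, average the rows, and convert the offset from the optical axis into a pitch angle. Real processing must also account for the horizon dip with altitude, lens distortion and KAGUYA's pointing geometry; this sketch assumes a distortion-free camera and is not the study's actual algorithm.

    ```python
    import numpy as np

    def horizon_pitch(gray, fov_v_deg):
        """Estimate pitch (degrees) from the lunar horizon in one frame.
        gray: (H, W) intensity image; fov_v_deg: vertical field of view."""
        h, w = gray.shape
        grad = np.abs(np.diff(gray.astype(float), axis=0))
        horizon_rows = grad.argmax(axis=0)        # strongest vertical edge per column
        mean_row = horizon_rows.mean()
        deg_per_pixel = fov_v_deg / h
        return (h / 2.0 - mean_row) * deg_per_pixel   # + means horizon above centre
    ```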

  6. KMTNET: A Network of 1.6 m Wide-Field Optical Telescopes Installed at Three Southern Observatories

    NASA Astrophysics Data System (ADS)

    Kim, Seung-Lee; Lee, Chung-Uk; Park, Byeong-Gon; Kim, Dong-Jin; Cha, Sang-Mok; Lee, Yongseok; Han, Cheongho; Chun, Moo-Young; Yuk, Insoo

    2016-02-01

    The Korea Microlensing Telescope Network (KMTNet) is a wide-field photometric system installed by the Korea Astronomy and Space Science Institute (KASI). Here, we present the overall technical specifications of the KMTNet observation system, test observation results, the data transfer and image processing procedure, and finally, the KMTNet science programs. The system consists of three 1.6 m wide-field optical telescopes equipped with mosaic CCD cameras of 18k by 18k pixels. Each telescope provides a 2.0 by 2.0 degree (4 square degree) field of view. We have finished installing all three telescopes and cameras sequentially at the Cerro-Tololo Inter-American Observatory (CTIO) in Chile, the South African Astronomical Observatory (SAAO) in South Africa, and the Siding Spring Observatory (SSO) in Australia. This network of telescopes, which is spread over three different continents at a similar latitude of about -30 degrees, enables 24-hour continuous monitoring of targets observable in the Southern Hemisphere. The test observations showed good image quality that meets the seeing requirement of less than 1.0 arcsec in I-band. All of the observation data are transferred to the KMTNet data center at KASI via international network communication and are processed with the KMTNet data pipeline. The primary scientific goal of the KMTNet is to discover numerous extrasolar planets toward the Galactic bulge by using the gravitational microlensing technique, especially Earth-mass planets in the habitable zone. During the non-bulge season, the system is used for wide-field photometric survey science on supernovae, asteroids, and external galaxies.

  7. Utilizing the Cyberforest live sound system with social media to remotely conduct woodland bird censuses in Central Japan.

    PubMed

    Saito, Kaoru; Nakamura, Kazuhiko; Ueta, Mutsuyuki; Kurosawa, Reiko; Fujiwara, Akio; Kobayashi, Hill Hiroki; Nakayama, Masaya; Toko, Ayako; Nagahama, Kazuyo

    2015-11-01

    We have developed a system that streams and archives live sound from remote areas across Japan via an unmanned automatic camera. The system was used to carry out pilot bird censuses in woodland; this allowed us to examine the use of live sound transmission and the role of social media as a mediator in remote scientific monitoring. The system has been streaming sounds 8 h per day for more than five years. We demonstrated that: (1) the transmission of live sound from a remote woodland could be used effectively to monitor birds in a remote location; (2) the simultaneous involvement of several participants via Internet Relay Chat to listen to live sound transmissions could enhance the accuracy of census data collection; and (3) interactions through Twitter allowed members of the public to engage or help with the remote monitoring of birds and experience inaccessible nature through the use of novel technologies.

  8. Alternative images for perpendicular parking : a usability test of a multi-camera parking assistance system.

    DOT National Transportation Integrated Search

    2004-10-01

    The parking assistance system evaluated consisted of four outward facing cameras whose images could be presented on a monitor on the center console. The images presented varied in the location of the virtual eye point of the camera (the height above ...

  9. A low-cost dual-camera imaging system for aerial applicators

    USDA-ARS?s Scientific Manuscript database

    Agricultural aircraft provide a readily available remote sensing platform as low-cost and easy-to-use consumer-grade cameras are being increasingly used for aerial imaging. In this article, we report on a dual-camera imaging system we recently assembled that can capture RGB and near-infrared (NIR) i...

  10. A smart telerobotic system driven by monocular vision

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R. J. P.; Maccato, A.; Wlczek, P.; Denney, B.; Scheerer, J.

    1994-01-01

    A robotic system that accepts autonomously generated motion and control commands is described. The system provides images from the monocular vision of a camera mounted on a robot's end effector, eliminating the need for traditional guidance targets that must be predetermined and specifically identified. The telerobotic vision system presents different views of the targeted object relative to the camera, based on a single camera image and knowledge of the target's solid geometry.
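
    Recovering a target's pose from a single camera image given its known solid geometry is the classic perspective-n-point (PnP) problem. The sketch below uses OpenCV's generic solver; the point coordinates and intrinsics are placeholder values, and the paper's actual feature-matching and pose method may differ.

    ```python
    import numpy as np
    import cv2

    # Known 3D feature points on the target, in the target's own frame (metres).
    object_points = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0],
                              [0.2, 0.1, 0.0], [0.0, 0.1, 0.0]], dtype=np.float32)
    # Their detected pixel positions in the single end-effector camera image.
    image_points = np.array([[320, 240], [400, 242],
                             [398, 200], [322, 198]], dtype=np.float32)
    # Placeholder pinhole intrinsics (fx = fy = 800 px, principal point at centre).
    K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float32)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    # rvec/tvec give the target pose relative to the camera, from which
    # motion commands for the end effector could be derived.
    ```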

  11. X-ray microbeam stand-alone facility for cultured cells irradiation

    NASA Astrophysics Data System (ADS)

    Bożek, Sebastian; Bielecki, Jakub; Wiecheć, Anna; Lekki, Janusz; Stachura, Zbigniew; Pogoda, Katarzyna; Lipiec, Ewelina; Tkocz, Konrad; Kwiatek, Wojciech M.

    2017-03-01

    The article describes an X-ray microbeam standalone facility dedicated to the irradiation of living cultured cells. The article can serve as a guide for the construction of such facilities, as it proceeds from engineering details, through mathematical modeling and experimental procedures, to preliminary experimental results and conclusions. The presented system consists of an open-type X-ray tube with microfocusing down to about 2 μm, an X-ray focusing system with optical elements arranged in the nested Kirkpatrick-Baez (or Montel) geometry, a sample stand and an optical microscope with a scientific digital CCD camera. For beam visualisation, an X-ray-sensitive CCD camera and a spectral detector are used, as well as a scintillator screen combined with the microscope. A method of precise one-by-one irradiation of previously chosen cells is presented, as well as a fast method of uniform irradiation of a chosen sample area. Mathematical models of the beam and the cell, with calculations of kerma and dose, are presented. Experiments on the dose-effect relationship, the kinetics of DNA double-strand break repair, and micronuclei observation were performed on PC-3 (prostate cancer) cultured cells. The cells were seeded and irradiated on Mylar foil, which covered a hole drilled in the Petri dish. DNA lesions were visualised with the γ-H2AX marker combined with Alexa Fluor 488 fluorescent dye.

  12. A compact electron spectrometer for an LWFA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lumpkin, A.; Crowell, R.; Li, Y.

    2007-01-01

    The use of a laser wakefield accelerator (LWFA) beam as a driver for a compact free-electron laser (FEL) has been proposed recently. A project is underway at Argonne National Laboratory (ANL) to operate an LWFA in the bubble regime and to use the quasi-monoenergetic electron beam as a driver for a 3-m-long undulator for generation of sub-ps UV radiation. The Terawatt Ultrafast High Field Facility (TUHFF) in the Chemistry Division provides the 20-TW peak power laser. A compact electron spectrometer whose initial fields of 0.45 T provide energy coverage of 30-200 MeV has been selected to characterize the electron beams. The system is based on the Ecole Polytechnique design used for their LWFA and incorporates the 5-cm-long permanent magnet dipole, the LANEX scintillator screen located at the dispersive plane, a Roper Scientific 16-bit MCP-intensified CCD camera, and a Bergoz ICT for complementary charge measurements. Test results on the magnets, the 16-bit camera, and the ICT will be described, and initial electron beam data will be presented as available. Other challenges will also be addressed.
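
    As a back-of-the-envelope check of the dispersion such a spectrometer provides, the sketch below maps electron energy to transverse deflection behind a short dipole (bend radius r = p/(qB), simple geometry). The 0.45 T field and 5 cm magnet length come from the abstract; the 30 cm drift to the scintillator screen is an assumed value for illustration.

    ```python
    import numpy as np

    C = 299792458.0   # speed of light, m/s
    E0 = 0.511e6      # electron rest energy, eV

    def deflection_mm(E_MeV, B=0.45, L=0.05, drift=0.30):
        """Transverse deflection at a screen a given drift behind the dipole."""
        E = E_MeV * 1e6
        p_eV = np.sqrt(E**2 - E0**2)        # momentum in eV/c
        r = p_eV / (B * C)                  # bend radius in metres: p[eV/c]/(B*c)
        theta = np.arcsin(L / r)            # bend angle accumulated in the magnet
        sagitta = r * (1 - np.cos(theta))   # displacement inside the magnet
        return 1e3 * (sagitta + drift * np.tan(theta))

    for e in (30, 60, 100, 200):            # energy range quoted in the abstract
        print(e, "MeV ->", round(deflection_mm(e), 1), "mm")
    ```

    Higher-energy electrons bend less, so the 30-200 MeV range maps onto a band of positions along the LANEX screen at the dispersive plane.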

  13. Can we see photosynthesis? Magnifying the tiny color changes of plant green leaves using Eulerian video magnification

    NASA Astrophysics Data System (ADS)

    Taj-Eddin, Islam A. T. F.; Afifi, Mahmoud; Korashy, Mostafa; Ahmed, Ali H.; Cheng, Ng Yoke; Hernandez, Evelyng; Abdel-Latif, Salma M.

    2017-11-01

    Plant aliveness is proven through laboratory experiments and special scientific instruments. We aim to detect the degree of animation of plants based on the magnification of the small color changes in the plant's green leaves using Eulerian video magnification. Capturing the video under a controlled environment, e.g., using a tripod and direct-current light sources, reduces camera movements and minimizes light fluctuations; we aim to reduce the external factors as much as possible. The acquired video is then stabilized and a proposed algorithm is used to reduce the illumination variations. Finally, Eulerian magnification is utilized to magnify the color changes in the light-invariant video. The proposed system does not require any special-purpose instruments, as it uses a digital camera with a regular frame rate. The results of magnified color changes on both natural and plastic leaves show that the live green leaves exhibit color changes, in contrast to the plastic leaves. Hence, we can argue that the color changes of the leaves are due to biological operations, such as photosynthesis. To date, this is possibly the first work that focuses on visually interpreting some biological operations of plants without any special-purpose instruments.
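
    The core of Eulerian video magnification is a temporal band-pass of every pixel, amplified and added back to the original. The sketch below uses a Butterworth filter over a spatially blurred copy (a stand-in for the Gaussian/Laplacian pyramid of the published method); the pass-band and gain values are illustrative only.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from scipy.signal import butter, filtfilt

    def eulerian_color_magnify(frames, fps, lo=0.2, hi=1.0, alpha=50.0):
        """Bare-bones Eulerian magnification.
        frames: (T, H, W, C) floats in [0, 1]; lo/hi (Hz) bracket the band
        of slow colour changes to amplify."""
        blurred = np.stack([gaussian_filter(f, sigma=(5, 5, 0)) for f in frames])
        b, a = butter(2, [lo / (fps / 2.0), hi / (fps / 2.0)], btype="band")
        bandpassed = filtfilt(b, a, blurred, axis=0)   # temporal band-pass per pixel
        return np.clip(frames + alpha * bandpassed, 0.0, 1.0)
    ```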

  14. Got Point Clouds: Characterizing Canopy Structure With Active and Passive Sensors

    NASA Astrophysics Data System (ADS)

    Popescu, S. C.; Malambo, L.; Sheridan, R.; Putman, E.; Murray, S.; Rooney, W.; Rajan, N.

    2016-12-01

    Unmanned Aerial Systems (UAS) provide the means to acquire highly customized aerial data at local scale with a multitude of sensors. UAS allow us to obtain affordable, repeated observations of canopy structure for agricultural and natural resources applications by using passive optical sensors, such as cameras and photogrammetric techniques, and active sensors, such as lidar (Light Detection and Ranging). The objectives of this presentation are to: (1) offer a brief overview of UAS used for agriculture and natural resources studies, (2) describe experiences in conducting agricultural phenotyping and forest vegetation measurements, and (3) give details on the methodology developed for image and lidar data processing for characterizing the three-dimensional structure of plant canopies. The UAS types used for this purpose included rotary platforms, such as quadcopters, hexacopters, and octocopters, with a payload capacity of up to 19 lbs. The sensors that collected data over two crop seasons include multispectral cameras in the visible color spectrum and near-infrared, and UAS-lidar. For ground reference data we used terrestrial lidar scanners and field measurements. Results comparing UAS and terrestrial measurements show high correlation and open new areas of scientific investigation of crop canopies previously not possible with affordable techniques.
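
    A crude sketch of turning a UAS lidar or photogrammetric point cloud into a canopy height model by gridding: per-cell maximum return minus a per-cell ground estimate. Using the per-cell minimum as "ground" is a strong simplifying assumption; proper ground classification is the hard part of real processing workflows and is not shown here.

    ```python
    import numpy as np

    def canopy_height_model(points, cell=0.25):
        """Grid an (N, 3) point cloud into a canopy height raster (metres).
        cell: grid resolution in the same units as the point coordinates."""
        xyz = np.asarray(points, float)
        ix = ((xyz[:, 0] - xyz[:, 0].min()) / cell).astype(int)
        iy = ((xyz[:, 1] - xyz[:, 1].min()) / cell).astype(int)
        surface = np.full((ix.max() + 1, iy.max() + 1), np.nan)
        ground = np.full_like(surface, np.nan)
        for x, y, z in zip(ix, iy, xyz[:, 2]):
            surface[x, y] = z if np.isnan(surface[x, y]) else max(surface[x, y], z)
            ground[x, y] = z if np.isnan(ground[x, y]) else min(ground[x, y], z)
        return surface - ground   # crude CHM: per-cell max minus per-cell min
    ```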

  15. Developing Short Films of Geoscience Research

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Webley, P. W.; Dehn, J.; Harrild, M.; Kienenberger, D.; Salganek, M.

    2015-12-01

    In today's prevalence of social media and networking, video products are becoming increasingly useful to communicate research quickly and effectively to a diverse audience, for outreach activities as well as within the research community and to funding agencies. Due to the observational nature of geoscience, researchers often take photos and video footage to document fieldwork or to record laboratory experiments. Here we present how researchers can become more effective storytellers by collaborating with filmmakers to produce short documentary films of their research. We will focus on the use of traditional high-definition (HD) camcorders and HD DSLR cameras to record the scientific story, while our research topic focuses on the use of remote sensing techniques, specifically thermal infrared imaging that is often used to analyze time-varying natural processes such as volcanic hazards. By capturing the story in the thermal infrared wavelength range, in addition to the traditional red-green-blue (RGB) color space, the audience is able to experience the world differently. We will develop a short film specifically designed using thermal infrared cameras that illustrates how visual storytellers can use these new tools to capture unique and important aspects of their research, convey their passion for earth systems science, as well as engage and captivate the viewer.

  16. Post-Flight Estimation of Motion of Space Structures: Part 1

    NASA Technical Reports Server (NTRS)

    Brugarolas, Paul; Breckenridge, William

    2008-01-01

    A computer program estimates the relative positions and orientations of two space structures from data on the angular positions and distances of fiducial objects on one structure as measured by a target tracking electronic camera and laser range finders on another structure. The program is written specifically for determining the relative alignments of two antennas, connected by a long truss, deployed in outer space from a space shuttle. The program is based partly on transformations among the various coordinate systems involved in the measurements and on a nonlinear mathematical model of vibrations of the truss. The program implements a Kalman filter that blends the measurement data with data from the model. Using time series of measurement data from the tracking camera and range finders, the program generates time series of data on the relative position and orientation of the antennas. A similar program described in a prior NASA Tech Briefs article was used onboard for monitoring the structures during flight. The present program is more precise and designed for use on Earth in post-flight processing of the measurement data to enable correction, for antenna motions, of scientific data acquired by use of the antennas.
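
    The blending step the abstract describes is the standard linear Kalman filter predict/update cycle; a generic sketch follows (Python/NumPy; the F, Q, H, R matrices stand in for the truss vibration model and the camera/range-finder sensor models, which are not given in the record):

    ```python
    import numpy as np

    def kalman_step(x, P, z, F, Q, H, R):
        """One predict/update cycle of a linear Kalman filter.

        x, P: state estimate and covariance from the previous step.
        z:    new measurement (e.g., camera angles and laser ranges).
        F, Q: state-transition model and process noise (structure model).
        H, R: measurement model and measurement noise.
        """
        # Predict: propagate the state through the structural model.
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update: blend the prediction with the measurement.
        y = z - H @ x_pred                     # innovation
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new
    ```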

  17. Metadata Exporter for Scientific Photography Management

    NASA Astrophysics Data System (ADS)

    Staudigel, D.; English, B.; Delaney, R.; Staudigel, H.; Koppers, A.; Hart, S.

    2005-12-01

    Photographs have become an increasingly important medium, especially with the advent of digital cameras. It has become inexpensive to take photographs and quickly post them on a website. However informative photos may be, they still need to be displayed in a convenient way, and be cataloged in such a manner that makes them easily locatable. Managing the great number of photographs that digital cameras allow and creating a format for efficient dissemination of the information related to the photos is a tedious task. Products such as Apple's iPhoto have greatly eased the task of managing photographs. However, they often have limitations: un-customizable metadata fields and poor metadata extraction tools limit their scientific usefulness. A solution to this persistent problem is a customizable metadata exporter. On the ALIA expedition, we successfully managed the thousands of digital photos we took. We did this with iPhoto and a version of the exporter that is now available to the public under the name "CustomHTMLExport" (http://www.versiontracker.com/dyn/moreinfo/macosx/27777), currently undergoing formal beta testing. This software allows the use of customized metadata fields (including description, time, date, GPS data, etc.), which are exported along with the photo. It can also produce webpages with this data straight from iPhoto, in a much more flexible way than is already allowed. With this tool it becomes very easy to manage and distribute scientific photos.
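
    As a rough illustration of the underlying idea — pulling customizable metadata fields out of photos and exporting them as a catalog — here is a hypothetical Python sketch using Pillow. It is not the CustomHTMLExport plugin itself, and the chosen EXIF fields and function name are examples:

    ```python
    import csv
    from pathlib import Path
    from PIL import Image, ExifTags

    FIELDS = ["DateTime", "Make", "Model"]  # customizable fields of interest

    def export_metadata(photo_dir, out_csv):
        """Extract selected EXIF fields from each JPEG and write a catalog."""
        with open(out_csv, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["file"] + FIELDS)
            for path in sorted(Path(photo_dir).glob("*.jpg")):
                exif = Image.open(path).getexif()
                # Map numeric EXIF tag IDs to human-readable names.
                named = {ExifTags.TAGS.get(k, k): v for k, v in exif.items()}
                writer.writerow([path.name] + [named.get(k, "") for k in FIELDS])
    ```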

  18. Issues in implementing services for a wireless web-enabled digital camera

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Sampat, Nitin; Fisher, Yoram; Canosa, John; Noel, Nicholas

    2001-05-01

    The competition in the exploding digital photography market has caused vendors to explore new ways to increase their return on investment. A common view among industry analysts is that increasingly it will be services provided by these cameras, and not the cameras themselves, that will provide the revenue stream. These services will be coupled to e-Appliance based Communities. In addition, the rapidly increasing need to upload images to the Internet for photo-finishing services as well as the need to download software upgrades to the camera is driving many camera OEMs to evaluate the benefits of using the wireless web to extend their enterprise systems. Currently, creating a viable e-appliance such as a digital camera coupled with a wireless web service requires more than just a competency in product development. This paper will evaluate the system implications in the deployment of recurring revenue services and enterprise connectivity of a wireless, web-enabled digital camera. These include, among other things, an architectural design approach for services such as device management, synchronization, billing, connectivity, security, etc. Such an evaluation will assist, we hope, anyone designing or connecting a digital camera to the enterprise systems.

  19. Very High-Speed Digital Video Capability for In-Flight Use

    NASA Technical Reports Server (NTRS)

    Corda, Stephen; Tseng, Ting; Reaves, Matthew; Mauldin, Kendall; Whiteman, Donald

    2006-01-01

    A digital video camera system has been qualified for use in flight on the NASA supersonic F-15B Research Testbed aircraft. This system is capable of very-high-speed color digital imaging at flight speeds up to Mach 2. The components of this system have been ruggedized and shock-mounted in the aircraft to survive the severe pressure, temperature, and vibration of the flight environment. The system includes two synchronized camera subsystems installed in fuselage-mounted camera pods (see Figure 1). Each camera subsystem comprises a camera controller/recorder unit and a camera head. The two camera subsystems are synchronized by use of an M-Hub(TradeMark) synchronization unit. Each camera subsystem is capable of recording at a rate up to 10,000 pictures per second (pps). A state-of-the-art complementary metal oxide/semiconductor (CMOS) sensor in the camera head has a maximum resolution of 1,280 × 1,024 pixels at 1,000 pps. Exposure times of the electronic shutter of the camera range from 1/200,000 of a second to full open. The recorded images are captured in a dynamic random-access memory (DRAM) and can be downloaded directly to a personal computer or saved on a compact flash memory card. In addition to the high-rate recording of images, the system can display images in real time at 30 pps. Inter Range Instrumentation Group (IRIG) time code can be inserted into the individual camera controllers or into the M-Hub unit. The video data could also be used to obtain quantitative, three-dimensional trajectory information. The first use of this system was in support of the Space Shuttle Return to Flight effort. Data were needed to help in understanding how thermally insulating foam is shed from a space shuttle external fuel tank during launch. The cameras captured images of simulated external tank debris ejected from a fixture mounted under the centerline of the F-15B aircraft. Digital video was obtained at subsonic and supersonic flight conditions, including speeds up to Mach 2 and altitudes up to 50,000 ft (15.24 km). The digital video was used to determine the structural survivability of the debris in a real flight environment and quantify the aerodynamic trajectories of the debris.

  20. Bandit: Technologies for Proximity Operations of Teams of Sub-10Kg Spacecraft

    DTIC Science & Technology

    2007-10-16

    and adding a dedicated overhead camera system. As will be explained below, the forced-air system did not work and the existing system has proven too...erratic to justify the expense of the camera system. 6DOF Software Simulator. The existing Java-based graphical 6DOF simulator was to be improved for...proposed camera system for a nonfunctional table. The C-9 final report is enclosed. [Figure captions: Figure 1, forced-air table schematic; Figure 2, caption truncated.]

  1. Science with the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Gardner, Jonathan P.

    2006-01-01

    The scientific capabilities of the James Webb Space Telescope (JWST) fall into four themes. The End of the Dark Ages: First Light and Reionization theme seeks to identify the first luminous sources to form and to determine the ionization history of the universe. The Assembly of Galaxies theme seeks to determine how galaxies and the dark matter, gas, stars, metals, morphological structures, and active nuclei within them evolved from the epoch of reionization to the present. The Birth of Stars and Protoplanetary Systems theme seeks to unravel the birth and early evolution of stars, from infall onto dust-enshrouded protostars, to the genesis of planetary systems. The Planetary Systems and the Origins of Life theme seeks to determine the physical and chemical properties of planetary systems around nearby stars and of our own, and investigate the potential for life in those systems. To enable these four science themes, JWST will be a large (6.5m) cold (50K) telescope launched to the second Earth-Sun Lagrange point early in the next decade. It is the successor to the Hubble Space Telescope, and is a partnership of NASA, ESA and CSA. JWST will have three instruments: The Near-Infrared Camera and the Near-Infrared multi-object Spectrograph will cover the wavelength range 0.6 to 5 microns, while the Mid-Infrared Instrument will do both imaging and spectroscopy from 5 to 27 microns. I review the status and capabilities of the observatory and instruments in the context of the major scientific goals.

  2. A Reconfigurable Real-Time Compressive-Sampling Camera for Biological Applications

    PubMed Central

    Fu, Bo; Pitter, Mark C.; Russell, Noah A.

    2011-01-01

    Many applications in biology, such as long-term functional imaging of neural and cardiac systems, require continuous high-speed imaging. This is typically not possible, however, using commercially available systems. The frame rate and the recording time of high-speed cameras are limited by the digitization rate and the capacity of on-camera memory. Further restrictions are often imposed by the limited bandwidth of the data link to the host computer. Even if the system bandwidth is not a limiting factor, continuous high-speed acquisition results in very large volumes of data that are difficult to handle, particularly when real-time analysis is required. In response to this issue, many cameras allow a predetermined, rectangular region of interest (ROI) to be sampled; however, this approach lacks flexibility and is blind to the image region outside of the ROI. We have addressed this problem by building a camera system using a randomly-addressable CMOS sensor. The camera has a low bandwidth, but is able to capture continuous high-speed images of an arbitrarily defined ROI, using most of the available bandwidth, while simultaneously acquiring low-speed, full frame images using the remaining bandwidth. In addition, the camera is able to use the full-frame information to recalculate the positions of targets and update the high-speed ROIs without interrupting acquisition. In this way the camera is capable of imaging moving targets at high-speed while simultaneously imaging the whole frame at a lower speed. We have used this camera system to monitor the heartbeat and blood cell flow of a water flea (Daphnia) at frame rates in excess of 1500 fps. PMID:22028852
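
    The ROI-following behavior can be sketched as follows (Python/NumPy; `update_roi`, the brightness-threshold rule, and the square window are illustrative assumptions, not the camera's actual logic):

    ```python
    import numpy as np

    def update_roi(full_frame, roi_size, threshold=0.5):
        """Re-center a high-speed ROI using a low-speed full frame.

        Finds the centroid of bright pixels (the target) and returns the
        top-left corner of a roi_size x roi_size window around it.
        """
        mask = full_frame > threshold * full_frame.max()
        ys, xs = np.nonzero(mask)
        if ys.size == 0:
            return 0, 0                      # no target found; default window
        cy, cx = ys.mean(), xs.mean()        # target centroid
        half = roi_size // 2
        # Clamp the window so it stays on the sensor.
        top = int(np.clip(cy - half, 0, full_frame.shape[0] - roi_size))
        left = int(np.clip(cx - half, 0, full_frame.shape[1] - roi_size))
        return top, left
    ```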

  3. Method used to test the imaging consistency of binocular camera's left-right optical system

    NASA Astrophysics Data System (ADS)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing the overall imaging consistency. Conventional optical-system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and the right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained from a multiple-threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left-right images is established, and the imaging consistency can be evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for carrying out imaging consistency testing for binocular cameras. When the 3σ distribution of the imaging gray difference D(x, y) between the left and right optical systems of the binocular camera does not exceed 5%, it is believed that the design requirements have been achieved. This method can be used effectively and paves the way for imaging consistency testing of binocular cameras.
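
    The consistency metric itself is easy to state in code; a minimal sketch, assuming the left and right images are already registered and normalized to [0, 1]:

    ```python
    import numpy as np

    def imaging_consistency(left, right, limit=0.05):
        """Evaluate left/right imaging consistency per the 3-sigma criterion.

        left, right: registered grayscale images as float arrays in [0, 1].
        Returns the grayscale-difference standard deviation and whether
        3*sigma is within the 5% requirement quoted in the abstract.
        """
        d = left.astype(float) - right.astype(float)   # D(x, y)
        sigma = d.std()
        return sigma, 3.0 * sigma <= limit
    ```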

  4. RGB color calibration for quantitative image analysis: the "3D thin-plate spline" warping approach.

    PubMed

    Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado

    2012-01-01

    In recent years, the need to numerically define color by its coordinates in n-dimensional space has increased strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines to quantitatively compare samples' color during workflow with many devices. Several software programs are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colors in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches. The first is based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Square analysis. Moreover, to explore device variability and resolution, two different cameras were adopted and, for each sensor, three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach achieved a very high calibration efficiency, opening the possibility of revolutionary in-field application of color quantification not only in food sciences, but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, it allows the use of low-cost instruments while still returning scientifically sound quantitative data.
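
    A comparable calibration can be sketched with SciPy's radial-basis interpolator, which offers a thin-plate-spline kernel (a sketch standing in for the authors' Matlab code; the patch arrays and [0, 1] value range are assumptions):

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def tps_color_calibration(measured, reference):
        """Fit a 3D thin-plate-spline warp from measured to reference RGB.

        measured, reference: (N, 3) arrays of chart-patch colors in [0, 1].
        Returns a function mapping an (M, 3) array of colors to calibrated ones.
        """
        warp = RBFInterpolator(measured, reference, kernel="thin_plate_spline")
        return lambda rgb: np.clip(warp(rgb), 0.0, 1.0)

    # Example: calibrate every pixel of an image of shape (H, W, 3).
    # calibrate = tps_color_calibration(chart_measured, chart_reference)
    # calibrated = calibrate(image.reshape(-1, 3)).reshape(image.shape)
    ```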

  5. University Students Join NASA on Trip to Hawaiian Volcano

    NASA Image and Video Library

    2017-12-08

    Team kite This kite was part of the scientific tool kit. It carried a camera that can be used to make high-resolution mosaics of the study site. Credit: NASA/GSFC/Jacob Bleacher In June, five student journalists from Stony Brook University packed their hiking boots and hydration packs and joined a NASA-funded science team for 10 days on the lava fields of Kilauea, an active Hawaiian volcano. Kilauea’s lava fields are an ideal place to test equipment designed for use on Earth’s moon or Mars, because volcanic activity shaped so much of those terrains. The trip was part of an interdisciplinary program called RIS4E – short for Remote, In Situ, and Synchrotron Studies for Science and Exploration – which is designed to prepare for future exploration of the moon, near-Earth asteroids and the moons of Mars. To read reports from the RIS4E journalism students about their experiences in Hawaii, visit ReportingRIS4E.com

  6. Obituary: James Gilbert Baker, 1914-2005

    NASA Astrophysics Data System (ADS)

    Baker, Neal Kenton

    2005-12-01

    Dr. James Gilbert Baker, renowned astronomer and optical physicist, died 29 June 2005 at his home in Bedford, New Hampshire at the age of 90. Although his scientific interest was astronomy, his extraordinary ability in optical design led to the creation of hundreds of optical systems that supported astronomy, aerial reconnaissance, instant photography (Polaroid SX70 camera), and the US space programs. He was the recipient of numerous awards for his creative work. He was born in Louisville, Kentucky, on 11 November 1914, the fourth child of Jesse B. Baker and Hattie M. Stallard. After graduating from Louisville DuPont Manual High, he went on to attend the University of Louisville, majoring in Mathematics. He became very close to an astronomy professor, Dr. Moore, and many times used his telescopes to do nightly observations. While at the university, he built mirrors for his own telescopes and helped form the Louisville Astronomical Society in 1933. At the University of Louisville, he also met his future wife, Elizabeth Katherine Breitenstein of Jefferson County, Kentucky. He received his BA in 1935 at the height of the Depression. He began his graduate work in astronomy at the Harvard College Observatory. After his MA (1936), he was appointed a Junior Fellow (1937-1943) in the prestigious Harvard Society of Fellows. He received his PhD in 1942 from Harvard in a rather unusual fashion, which is worth retelling. During an Astronomy Department dinner, Dr. Harlow Shapley (the director) asked him to give a talk. According to the "Courier-Journal Magazine", "Dr. Shapley stood up and proclaimed an on-the-spot departmental meeting and asked for a vote on recommending Baker for a Ph.D. on the basis of the 'oral exam' he had just finished. The vote was unanimous." It was at Harvard College Observatory during this first stage of his career that he collaborated with Donald H. Menzel, Lawrence H. Aller, and George H. Shortley on a landmark set of papers on the physical processes in gaseous nebulae. In addition to his theoretical work, he also began designing astronomical instruments with ever greater resolving powers and wide-angle acceptance, which he described as "the royal way to new discoveries."1 He is well known for the Baker-Schmidt telescope and the Baker Super Schmidt meteor camera. He was also a co-author with George Z. Dimitroff of a book entitled "Telescopes and Accessories" (1945). In 1948 he received an Honorary Doctorate from the University of Louisville. With the start of World War II, the U.S. Army sought to establish an aerial reconnaissance branch and placed the project in the charge of Colonel George W. Goddard. After months of searching for an optical designer, he asked for a recommendation from Dr. Mees2 of Eastman Kodak. Following the recommendations of Dr. Mees, Col. Goddard found this friendly and unassuming twenty-six-year-old graduate student at Harvard to be the perfect candidate. He was impressed by Dr. Baker's originality in optical design and provided him with a small Army research contract in early 1941 for a wide-angle camera system. Goddard's "Victory Lens" project began on 20 May 1942 when he visited Dr. Baker's office at Harvard College Observatory and described the need for a lens of f/2.5 covering a 5x5 plate, to be made in huge quantities. Multiple designs were developed during the war effort. A hands-on man, Dr. Baker risked his life operating the cameras in many of the early test flights that carried the camera systems in unpressurized compartments on aircraft. 
He was the director of the Observatory Optical Project at Harvard University from 1943 to 1945. He began his long consulting career with the Perkin Elmer Corporation during this period. When the war ended, Harvard University decided to cease war-related projects and, subsequently, Dr. Baker's lab was moved to Boston University and was eventually spun off as ITEK Corporation. However, he continued to be an associate professor and research associate at Harvard from 1946 to 1949. In 1948 he received the Presidential Medal for Merit for his work during World War II in the Office of Scientific Research and Development. In 1948, he moved to Orinda, California from Cambridge, Massachusetts and became a research associate of Lick Observatory for two years. He returned to Harvard in 1950. He had spent thousands of hours doing ray trace calculations on a Marchant calculator to produce his first aerial cameras. To replace the tedious calculations by hand, Dr. Baker introduced the use of numerical computers into the field of optics. His ray-trace program was one of the first applications run on the Harvard Mark II (1947) computer. Later on, he developed his own methodology to optimize the performance of his optical designs. These optical design computer programs were a family affair, developed under his direction by his own children to support his highly sophisticated designs of the 1960s and 1970s. For most of his career, Dr. Baker was involved with large system concepts covering not only the camera, but the camera delivery systems as well. As chairman of the U.S. Air Force Scientific Advisory Board, he recognized that national security requirements would require optical designs of even greater resolving power using aircraft at extreme altitudes. The need for such a plane resulted in the creation of the U-2 system consisting of a plane and camera functioning as a unit to create panoramic high-resolution aerial photographs. He formed Spica Incorporated in 1955 to perform the necessary optical design work for the US Government. The final design was a 36-inch f/10 system. Dr. Baker also designed the aircraft's periscope to allow the pilot to see his flight path. By 1958, he was almost solely responsible for all the cameras used in photoreconnaissance aircraft. He continued to serve on the President's Foreign Intelligence Advisory Board and on the Land Panel. Before the launch of Sputnik, he designed the Baker-Nunn satellite-tracking camera to support the Air Force's early satellite tracking and space surveillance networks. Because of his foresight, cameras were in place to track the Sputnik Satellite in October 1957. These cameras allowed the precise orbital determination of all orbiting spacecraft for over three decades until the tracking cameras were retired from service. He continued to advise top Government officials in the evolution of reconnaissance systems during the 1960s and 1970s. He received a Space Pioneer Award from the US Air Force. He received the Pioneers of National Reconnaissance Medal (2000) with the citation, "As a young Harvard astronomer, Dr. James G. Baker designed most of the lenses and many of the cameras used in aerial over flights of 'denied territory' enabling the success of the U.S. peacetime strategic reconnaissance policy." Around 1968, he undertook a consulting contract with Polaroid Corporation after Dr. Edwin Land persuaded him that only he could design the optical system for his new SX-70 Land camera. 
He was also responsible for the design of the Quintic focusing system for the Polaroid Spectra Camera system that employed a revolutionary combination of non-rotational aspherics to achieve the focusing function. In 1958 he became a Fellow of the Optical Society of America (OSA). In 1960 he was elected President of the Society for one year and helped establish the journal Applied Optics. He was the recipient of numerous OSA awards, spanning the breadth of the field, and was honored with the Adolf Lomb Award, Ives Medal, Fraunhofer Award, and Richardson Award. He was made an honorary member of OSA in 1993. He also was the recipient of the 1978 Gold Medal, the highest award of the International Society of Optical Engineers (SPIE). Furthermore, he was the recipient of the Elliott Cresson Medal of the Franklin Institute for his many innovations in astronomical tools. Dr. Baker was elected a Member of the National Academy of Sciences (1965), the American Philosophical Society (1970), the American Academy of Arts and Sciences (1946), and the National Academy of Engineering (1979). He was a member of the American Astronomical Society, the International Astronomical Union, and the Astronomical Society of the Pacific. He authored numerous professional papers and held over fifty US patents. He maintained his affiliation with the Harvard College Observatory and the Smithsonian Astrophysical Observatory until he retired in 2003. Even after his retirement in 2003, he continued work at his home on a new telescope design that he told his family he should have discovered in 1940. Light was always his tool to the understanding of the Universe. An entry from his personal observation log, 7 January 1933, made after an evening of star gazing reveals the pure inspiration of his efforts: "After all, it is the satisfaction obtained which benefits humanity, more than any other thing. It is in the satisfaction of greater human knowledge about the cosmos that the scientist is spurred on to greater efforts." James Baker fulfilled the destiny he had foreseen in 1933, living to see professional and amateur astronomers use his instruments and designs to further the understanding of the cosmos. He had not, however, foreseen that his cameras would also protect this nation for so many years. He is survived by his wife, his four children and five grandchildren. 1 Oscar Bryant, "Astronomical Designs," in "Accent", the University of Louisville College of Arts and Sciences Alumni Newsletter, Spring 1994. 2 George W. Goddard, Brigadier General, "Overview", 273.

  7. Measurement of reach envelopes with a four-camera Selective Spot Recognition (SELSPOT) system

    NASA Technical Reports Server (NTRS)

    Stramler, J. H., Jr.; Woolford, B. J.

    1983-01-01

    The basic Selective Spot Recognition (SELSPOT) system is essentially a system which uses infrared LEDs and a 'camera' with an infrared-sensitive photodetector, a focusing lens, and some A/D electronics to produce a digital output representing an X and Y coordinate for each LED for each camera. When the data are synthesized across all cameras with appropriate calibrations, an XYZ set of coordinates is obtained for each LED at a given point in time. Attention is given to the operating modes, a system checkout, and reach envelopes and software. The Video Recording Adapter (VRA) represents the main addition to the basic SELSPOT system. The VRA contains a microprocessor and other electronics which permit user selection of several options and some interaction with the system.
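
    The synthesis of per-camera X,Y outputs into XYZ coordinates is classically done by linear (DLT) triangulation; a sketch for two calibrated views follows (Python/NumPy; the 3x4 projection matrices are assumed to come from the system calibration, and the function name is illustrative):

    ```python
    import numpy as np

    def triangulate(P1, P2, xy1, xy2):
        """Linear (DLT) triangulation of one point from two calibrated views.

        P1, P2:   3x4 camera projection matrices from calibration.
        xy1, xy2: (x, y) image coordinates of the same LED in each view.
        Returns the XYZ coordinates of the LED.
        """
        x1, y1 = xy1
        x2, y2 = xy2
        # Each view contributes two linear constraints on the 3D point.
        A = np.array([x1 * P1[2] - P1[0],
                      y1 * P1[2] - P1[1],
                      x2 * P2[2] - P2[0],
                      y2 * P2[2] - P2[1]])
        # Solve A X = 0 by SVD; take the last right singular vector.
        X = np.linalg.svd(A)[2][-1]
        return X[:3] / X[3]   # de-homogenize
    ```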

  8. A goggle navigation system for cancer resection surgery

    NASA Astrophysics Data System (ADS)

    Xu, Junbin; Shao, Pengfei; Yue, Ting; Zhang, Shiwu; Ding, Houzhu; Wang, Jinkun; Xu, Ronald

    2014-02-01

    We describe a portable fluorescence goggle navigation system for cancer margin assessment during oncologic surgeries. The system consists of a computer, a head mount display (HMD) device, a near infrared (NIR) CCD camera, a miniature CMOS camera, and a 780 nm laser diode excitation light source. The fluorescence and the background images of the surgical scene are acquired by the CCD camera and the CMOS camera respectively, co-registered, and displayed on the HMD device in real-time. The spatial resolution and the co-registration deviation of the goggle navigation system are evaluated quantitatively. The technical feasibility of the proposed goggle system is tested in an ex vivo tumor model. Our experiments demonstrate the feasibility of using a goggle navigation system for intraoperative margin detection and surgical guidance.
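
    The co-registration and overlay step can be sketched as a homography warp plus alpha blending (Python/OpenCV; the homography H is assumed to come from a one-time two-camera calibration, and the green pseudo-color is an illustrative display choice, not the authors' rendering):

    ```python
    import cv2
    import numpy as np

    def overlay_fluorescence(background, fluorescence, H, alpha=0.6):
        """Warp the NIR fluorescence image into the visible camera's frame
        and blend the two for real-time display on the HMD.

        background:   BGR frame from the CMOS background camera.
        fluorescence: grayscale frame from the NIR CCD camera.
        H:            3x3 homography mapping NIR pixels to CMOS pixels.
        """
        h, w = background.shape[:2]
        warped = cv2.warpPerspective(fluorescence, H, (w, h))
        # Render fluorescence as a green pseudo-color over the scene.
        pseudo = np.zeros_like(background)
        pseudo[:, :, 1] = warped          # green channel
        return cv2.addWeighted(background, 1.0, pseudo, alpha, 0)
    ```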

  9. Observation of Possible Lava Tube Skylights by SELENE cameras

    NASA Astrophysics Data System (ADS)

    Haruyama, Junichi; Hiesinger, Harald; van der Bogert, Carolyn

    We have discovered three deep hole structures on the Moon in images from the Terrain Camera and Multi-band Imager on SELENE. These holes have large depth-to-diameter ratios: Marius Hills Hole (MHH) is 65 m in diameter and 88-90 m in depth, Mare Tranquillitatis Hole (MTH) is 120 x 110 m in diameter and 180 m in depth, and Mare Ingenii Hole (MIH) is 140 x 110 m in diameter and deeper than 90 m. Neither volcanic material from the holes nor dike-related pit craters are seen around the holes. They are possible lava tube skylights. These holes, and the lava tubes possibly connected to them, are of great scientific interest and have high potential as lunar base sites.

  10. Laser Technology in Interplanetary Exploration: The Past and the Future

    NASA Technical Reports Server (NTRS)

    Smith, David E.

    2000-01-01

    Laser technology has been used in planetary exploration for many years but it has only been in the last decade that laser altimeters and ranging systems have been selected as flight instruments alongside cameras, spectrometers, magnetometers, etc. Today we have an active laser system operating at Mars and another destined for the asteroid Eros. A few years ago a laser ranging system on the Clementine mission changed much of our thinking about the moon and in a few years laser altimeters will be on their way to Mercury, and also to Europa. Along with the increased capabilities and reliability of laser systems has come the realization that precision ranging to the surface of planetary bodies from orbiting spacecraft enables more scientific problems to be addressed, including many associated with planetary rotation, librations, and tides. In addition, new Earth-based laser ranging systems working with similar systems on other planetary bodies in an asynchronous transponder mode will be able to make interplanetary ranging measurements at the few cm level and will advance our understanding of solar system dynamics and relativistic physics.

  11. OpenCV and TYZX : video surveillance for tracking.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Jim; Spencer, Andrew; Chu, Eric

    2008-08-01

    As part of the National Security Engineering Institute (NSEI) project, several sensors were developed in conjunction with an assessment algorithm. A camera system was developed in-house to track the locations of personnel within a secure room. In addition, a commercial, off-the-shelf (COTS) tracking system developed by TYZX was examined. TYZX is a Bay Area start-up that has developed its own tracking hardware and software which we use as COTS support for robust tracking. This report discusses the pros and cons of each camera system, how they work, a proposed data fusion method, and some visual results. Distributed, embedded image processing solutions show the most promise in their ability to track multiple targets in complex environments and in real-time. Future work on the camera system may include three-dimensional volumetric tracking by using multiple simple cameras, Kalman or particle filtering, automated camera calibration and registration, and gesture or path recognition.
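
    A minimal OpenCV tracking loop of the kind discussed — background subtraction followed by contour centroids — might look like the following (a sketch, not the in-house NSEI system; the blob-area threshold and function name are assumptions):

    ```python
    import cv2
    import numpy as np

    def track_people(video_path, min_area=500):
        """Yield blob centroids per frame using background subtraction."""
        cap = cv2.VideoCapture(video_path)
        subtractor = cv2.createBackgroundSubtractorMOG2()
        kernel = np.ones((3, 3), np.uint8)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)
            # Remove speckle noise, then find connected foreground regions.
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            for c in contours:
                if cv2.contourArea(c) > min_area:   # ignore small blobs
                    x, y, w, h = cv2.boundingRect(c)
                    yield x + w // 2, y + h // 2    # blob centroid
        cap.release()
    ```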

  12. Software for minimalistic data management in large camera trap studies

    PubMed Central

    Krishnappa, Yathin S.; Turner, Wendy C.

    2014-01-01

    The use of camera traps is now widespread and their importance in wildlife studies well understood. Camera trap studies can produce millions of photographs and there is a need for software to help manage photographs efficiently. In this paper, we describe a software system that was built to successfully manage a large behavioral camera trap study that produced more than a million photographs. We describe the software architecture and the design decisions that shaped the evolution of the program over the study’s three-year period. The software system has the ability to automatically extract metadata from images, and add customized metadata to the images in a standardized format. The software system can be installed as a standalone application on popular operating systems. It is minimalistic, scalable and extendable so that it can be used by small teams or individual researchers for a broad variety of camera trap studies. PMID:25110471

  13. SL3-108-01295

    NASA Image and Video Library

    1973-07-01

    SL3-108-1295 (July-September 1973) --- A close-up view of astronaut Jack R. Lousma, Skylab 3 pilot, taking a hot bath in the crew quarters of the Orbital Workshop (OWS) of the Skylab space station cluster in Earth orbit. This picture was taken with a hand-held 35mm Nikon camera. Astronauts Lousma, Alan L. Bean and Owen K. Garriott remained with the Skylab space station in orbit for 59 days conducting numerous medical, scientific and technological experiments. In deploying the shower facility the shower curtain is pulled up from the floor and attached to the ceiling. The water comes through a push-button shower head attached to a flexible hose. Water is drawn off by a vacuum system. Photo credit: NASA

  14. Basic development of a small balloon-mounted telemetry and its operation system by university students

    NASA Astrophysics Data System (ADS)

    Yamamoto, Masa-yuki; Kakinami, Yoshihiro; Kono, Hiroki

    In Japan, high-altitude balloons for scientific observation have long been launched by JAXA. Such a balloon can reach 50 km altitude without imposing tight environmental conditions on onboard equipment, and it operates at lower cost than a sounding rocket; however, development of large-scale scientific observation balloons by university laboratories is still difficult. Coupled with recent improvements in semiconductor sensors, laboratory-level experiments using small weather balloons have become easier in recent years. Owing to the advantage of wide land areas, launches of such small balloons have been carried out many times, especially in continental countries (e.g., Near Space Ventures, Inc., 2013). Although the balloon is very small, with a diameter of 6 feet, about 2 kg of loading capacity remains for payloads, after excluding extra buoyancy and the weight of the balloon itself, enough to send them up to about 35 km altitude. However, operation of such balloons is not common in Japan because precise prediction of the payload landing area is difficult; balloon releases thus remain high-risk. In this study, we aim to carry out practical engineering experiments with weather balloons in Japan, to be used for scientific observation at the university-laboratory level in an educational context. Here we report on an approach, currently in progress, to developing devices for a small tethered balloon. We evaluated the accuracy of altitude measurement using a laboratory-developed altitude data logger that consists of a GPS module and a barometric altimeter. The diameter of the balloon was about 1.4 m. Filled with about 1440 L of helium, it produced a buoyancy of about 15.7 N. Taking into account the total weight, including the mooring equipment, the available payload mass is about 1100 g. Using an FDM (fused deposition modeling) 3D printer together with 3D CAD design software, we designed and manufactured a camera-platform-style antenna rotator that automatically tracks the balloon direction based on the received GPS data, serving as a ground-based balloon operation system with automatic control software for the tracking system. Toward a future telemetry system onboard a small weather balloon, we have also developed an onboard data logger system. In this presentation, the system configuration of the automatic tracking system will be introduced in more detail. The telemetry system onboard the small balloon is currently under development. We plan to send the GPS coordinates, temperature, pressure, and humidity data measured by the onboard sensors to the ground. A monitoring camera, a 3-axis accelerometer, geomagnetic azimuth measurement, and power monitoring were added to the developed data logger system. The acquired data will be stored on an SD card onboard as well as transmitted to the ground. Environmental tests were performed in the laboratory using a vacuum chamber with pressure sensors and a constant-temperature reservoir. In this presentation, introducing the data obtained through the development of a prototype balloon system, we will discuss our recent results and remaining problems.
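
    The pointing computation for such an antenna rotator reduces to azimuth/elevation angles from two GPS fixes; a sketch under a flat-earth approximation (adequate at the short ranges of a small tethered or nearby balloon) follows, with illustrative function and variable names:

    ```python
    import math

    def look_angles(lat_g, lon_g, alt_g, lat_b, lon_b, alt_b):
        """Azimuth/elevation (degrees) from a ground station to a balloon.

        Uses a local flat-earth approximation: latitude/longitude offsets
        are converted to north/east distances around the ground station.
        """
        R = 6_371_000.0                      # mean Earth radius, m
        north = math.radians(lat_b - lat_g) * R
        east = (math.radians(lon_b - lon_g)
                * R * math.cos(math.radians(lat_g)))
        up = alt_b - alt_g
        azimuth = math.degrees(math.atan2(east, north)) % 360.0
        elevation = math.degrees(math.atan2(up, math.hypot(north, east)))
        return azimuth, elevation
    ```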

  15. Calibration of a dual-PTZ camera system for stereo vision

    NASA Astrophysics Data System (ADS)

    Chang, Yau-Zen; Hou, Jung-Fu; Tsao, Yi Hsiang; Lee, Shih-Tseng

    2010-08-01

    In this paper, we propose a calibration process for the intrinsic and extrinsic parameters of dual-PTZ camera systems. The calibration is based on a complete definition of six coordinate systems fixed at the image planes, and the pan and tilt rotation axes of the cameras. Misalignments between estimated and ideal coordinates of image corners are formed into cost values minimized by the Nelder-Mead simplex optimization method. Experimental results show that the system is able to obtain 3D coordinates of objects with a consistent accuracy of 1 mm when the distance between the dual-PTZ camera set and the objects is from 0.9 to 1.1 meters.
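
    The optimization step maps naturally onto SciPy's Nelder-Mead implementation; in this sketch the parameter vector and the `project` model that generates ideal corner positions are placeholders for the paper's six-coordinate-system formulation:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def calibrate(params0, observed_corners, project):
        """Estimate camera parameters by minimizing corner misalignment.

        params0:          initial guess for intrinsic/extrinsic parameters.
        observed_corners: (N, 2) detected image-corner coordinates.
        project:          model mapping parameters -> (N, 2) ideal corners.
        """
        def cost(params):
            # Sum of squared distances between estimated and ideal corners.
            return np.sum((project(params) - observed_corners) ** 2)

        result = minimize(cost, params0, method="Nelder-Mead",
                          options={"xatol": 1e-6, "fatol": 1e-9})
        return result.x
    ```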

  16. Geometrical calibration television measuring systems with solid state photodetectors

    NASA Astrophysics Data System (ADS)

    Matiouchenko, V. G.; Strakhov, V. V.; Zhirkov, A. O.

    2000-11-01

    Various optical measuring methods for deriving information about the size and form of objects are now used in different fields: mechanical engineering, medicine, art, and criminalistics. Measurement by means of digital television systems is one of these methods. The development of this field is promoted by the appearance on the market of small-sized television cameras and frame grabbers of various types and costs. Many television measuring systems use expensive cameras, but the accuracy performance of low-cost cameras is also of interest to system developers. For this reason, the inexpensive mountingless camera SK1004CP (1/3" format, costing up to $40) and the Aver2000 frame grabber were used in the experiments.

  17. Hubble Sees a "Mess of Stars"

    NASA Image and Video Library

    2015-08-14

    Bursts of pink and red, dark lanes of mottled cosmic dust, and a bright scattering of stars — this NASA/ESA Hubble Space Telescope image shows part of a messy barred spiral galaxy known as NGC 428. It lies approximately 48 million light-years away from Earth in the constellation of Cetus (The Sea Monster). Although a spiral shape is still just about visible in this close-up shot, overall NGC 428’s spiral structure appears to be quite distorted and warped, thought to be a result of a collision between two galaxies. There also appears to be a substantial amount of star formation occurring within NGC 428 — another telltale sign of a merger. When galaxies collide their clouds of gas can merge, creating intense shocks and hot pockets of gas, and often triggering new waves of star formation. NGC 428 was discovered by William Herschel in December 1786. More recently a type of supernova designated SN2013ct was discovered within the galaxy by Stuart Parker of the BOSS (Backyard Observatory Supernova Search) project in Australia and New Zealand, although it is unfortunately not visible in this image. This image was captured by Hubble’s Advanced Camera for Surveys (ACS) and Wide Field and Planetary Camera 2 (WFPC2). Image credit: ESA/Hubble and NASA and S. Smartt (Queen's University Belfast), Acknowledgements: Nick Rose and Flickr user pennine cloud

  18. Hubble Catches a Galaxy Duo by the "Hare"

    NASA Image and Video Library

    2017-12-08

    This image from the NASA/ESA Hubble Space Telescope shows the unusual galaxy IRAS 06076-2139, found in the constellation Lepus (The Hare). Hubble’s Wide Field Camera 3 (WFC3) and Advanced Camera for Surveys (ACS) instruments observed the galaxy from a distance of 500 million light-years. This particular object stands out from the crowd by actually being composed of two separate galaxies rushing past each other at about 2 million kilometers (1,243,000 miles) per hour. This speed is most likely too fast for them to merge and form a single galaxy. However, because of their small separation of only about 20,000 light-years, the galaxies will distort one another through the force of gravity while passing each other, changing their structures on a grand scale. Such galactic interactions are a common sight for Hubble, and have long been a field of study for astronomers. The intriguing behaviors of interacting galaxies take many forms; galactic cannibalism, galaxy harassment and even galaxy collisions. The Milky Way itself will eventually fall victim to the latter, merging with the Andromeda Galaxy in about 4.5 billion years. The fate of our galaxy shouldn’t be alarming though: while galaxies are populated by billions of stars, the distances between individual stars are so large that hardly any stellar collisions will occur. Credit: ESA/Hubble & NASA

  19. Using Image Analysis to Explore Changes In Bacterial Mat Coverage at the Base of a Hydrothermal Vent within the Caldera of Axial Seamount

    NASA Astrophysics Data System (ADS)

    Knuth, F.; Crone, T. J.; Marburg, A.

    2017-12-01

    The Ocean Observatories Initiative's (OOI) Cabled Array is delivering real-time high-definition video data from an HD video camera (CAMHD), installed at the Mushroom hydrothermal vent in the ASHES hydrothermal vent field within the caldera of Axial Seamount, an active submarine volcano located approximately 450 kilometers off the coast of Washington at a depth of 1,542 m. Every three hours the camera pans, zooms and focuses in on nine distinct scenes of scientific interest across the vent, producing 14-minute-long videos during each run. This standardized video sampling routine enables scientists to programmatically analyze the content of the video using automated image analysis techniques. Each scene-specific time series dataset can service a wide range of scientific investigations, including the estimation of bacterial flux into the system by quantifying chemosynthetic bacterial clusters (floc) present in the water column, relating periodicity in hydrothermal vent fluid flow to earth tides, measuring vent chimney growth in response to changing hydrothermal fluid flow rates, or mapping the patterns of fauna colonization, distribution and composition across the vent over time. We are currently investigating the seventh scene in the sampling routine, focused on the bacterial mat covering the seafloor at the base of the vent. We quantify the change in bacterial mat coverage over time using image analysis techniques, and examine the relationship between mat coverage, fluid flow processes, episodic chimney collapse events, and other processes observed by Cabled Array instrumentation. This analysis is being conducted using cloud-enabled computer vision processing techniques, programmatic image analysis, and time-lapse video data collected over the course of the first CAMHD deployment, from November 2015 to July 2016.
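
    A per-frame coverage estimate of the kind described can be sketched as simple color segmentation (Python/OpenCV); the HSV bounds for "mat-colored" pixels are illustrative guesses to be tuned on real footage, and the production analysis uses more elaborate cloud-based computer-vision processing:

    ```python
    import cv2
    import numpy as np

    def mat_coverage(frame, lo=(0, 0, 150), hi=(180, 60, 255)):
        """Fraction of a frame covered by whitish bacterial mat.

        frame:  BGR image of the scene at the vent base.
        lo, hi: HSV bounds selecting low-saturation, bright pixels
                (illustrative values, to be tuned on real footage).
        """
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lo, hi)
        return float(np.count_nonzero(mask)) / mask.size
    ```

    Applied to the same scene across the deployment, such a coverage fraction yields the time series that can then be compared against fluid-flow and chimney-collapse events.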

  20. Hubble Finds a Little Gem

    NASA Image and Video Library

    2015-08-07

    This colorful bubble is a planetary nebula called NGC 6818, also known as the Little Gem Nebula. It is located in the constellation of Sagittarius (The Archer), roughly 6,000 light-years away from us. The rich glow of the cloud is just over half a light-year across — humongous compared to its tiny central star — but still a little gem on a cosmic scale. When stars like the sun enter "retirement," they shed their outer layers into space to create glowing clouds of gas called planetary nebulae. This ejection of mass is uneven, and planetary nebulae can have very complex shapes. NGC 6818 shows knotty filament-like structures and distinct layers of material, with a bright and enclosed central bubble surrounded by a larger, more diffuse cloud. Scientists believe that the stellar wind from the central star propels the outflowing material, sculpting the elongated shape of NGC 6818. As this fast wind smashes through the slower-moving cloud it creates particularly bright blowouts at the bubble’s outer layers. Hubble previously imaged this nebula back in 1997 with its Wide Field Planetary Camera 2, using a mix of filters that highlighted emission from ionized oxygen and hydrogen. This image, while from the same camera, uses different filters to reveal a different view of the nebula. Image credit: ESA/Hubble & NASA, Acknowledgement: Judy Schmidt

  1. Coincidence electron/ion imaging with a fast frame camera

    NASA Astrophysics Data System (ADS)

    Li, Wen; Lee, Suk Kyoung; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander; Fan, Lin

    2015-05-01

    A new time- and position-sensitive particle detection system based on a fast frame CMOS camera is developed for coincidence electron/ion imaging. The system is composed of three major components: a conventional microchannel plate (MCP)/phosphor screen electron/ion imager, a fast frame CMOS camera and a high-speed digitizer. The system collects the positional information of ions/electrons from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of MCPs processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of electron/ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide. We further show that a time resolution of 30 ps can be achieved when measuring the electron TOF spectrum, and this enables the new system to achieve a good energy resolution along the TOF axis.
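
    The real-time centroiding step — locating spots in each frame and integrating their intensity for correlation with TOF peak heights — can be sketched with SciPy's labeling tools (the threshold value and function name are assumptions, not the authors' algorithm):

    ```python
    from scipy import ndimage

    def centroid_spots(frame, threshold):
        """Locate electron/ion spots in one camera frame.

        Returns (y, x, integrated_intensity) per spot; the integrated
        intensity is the quantity correlated with TOF peak heights.
        """
        labels, n = ndimage.label(frame > threshold)
        idx = range(1, n + 1)
        centroids = ndimage.center_of_mass(frame, labels, idx)
        sums = ndimage.sum_labels(frame, labels, idx)  # scipy >= 1.6
        return [(cy, cx, s) for (cy, cx), s in zip(centroids, sums)]
    ```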

  2. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    USDA-ARS?s Scientific Manuscript database

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  3. Coincidence ion imaging with a fast frame camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei

    2014-12-15

    A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of a PMT processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of a PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide.

  4. Research into a Single-aperture Light Field Camera System to Obtain Passive Ground-based 3D Imagery of LEO Objects

    NASA Astrophysics Data System (ADS)

    Bechis, K.; Pitruzzello, A.

    2014-09-01

    This presentation describes our ongoing research into using a ground-based light field camera to obtain passive, single-aperture 3D imagery of LEO objects. Light field cameras are an emerging and rapidly evolving technology for passive 3D imaging with a single optical sensor. The cameras use an array of lenslets placed in front of the camera focal plane, which provides angle of arrival information for light rays originating from across the target, allowing range to target and 3D image to be obtained from a single image using monocular optics. The technology, which has been commercially available for less than four years, has the potential to replace dual-sensor systems such as stereo cameras, dual radar-optical systems, and optical-LIDAR fused systems, thus reducing size, weight, cost, and complexity. We have developed a prototype system for passive ranging and 3D imaging using a commercial light field camera and custom light field image processing algorithms. Our light field camera system has been demonstrated for ground-target surveillance and threat detection applications, and this paper presents results of our research thus far into applying this technology to the 3D imaging of LEO objects. The prototype 3D imaging camera system developed by Northrop Grumman uses a Raytrix R5 C2GigE light field camera connected to a Windows computer with an nVidia graphics processing unit (GPU). The system has a frame rate of 30 Hz, and a software control interface allows for automated camera triggering and light field image acquisition to disk. Custom image processing software then performs the following steps: (1) image refocusing, (2) change detection, (3) range finding, and (4) 3D reconstruction. In Step (1), a series of 2D images are generated from each light field image; the 2D images can be refocused at up to 100 different depths. Currently, steps (1) through (3) are automated, while step (4) requires some user interaction. A key requirement for light field camera operation is that the target must be within the near-field (Fraunhofer distance) of the collecting optics. For example, in visible light the near-field of a 1-m telescope extends out to about 3,500 km, while the near-field of the AEOS telescope extends out over 46,000 km. For our initial proof of concept, we have integrated our light field camera with a 14-inch Meade LX600 advanced coma-free telescope, to image various surrogate ground targets at up to tens of kilometers range. Our experiments with the 14-inch telescope have assessed factors and requirements that are traceable and scalable to a larger-aperture system that would have the near-field distance needed to obtain 3D images of LEO objects. The next step would be to integrate a light field camera with a 1-m or larger telescope and evaluate its 3D imaging capability against LEO objects. 3D imaging of LEO space objects with light field camera technology can potentially provide a valuable new tool for space situational awareness, especially for those situations where laser or radar illumination of the target objects is not feasible.
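
    The near-field figures quoted above follow from the Fraunhofer distance 2D²/λ; a quick check in Python (assuming a visible wavelength of about 550 nm and AEOS's published 3.67-m aperture):

    ```python
    def fraunhofer_distance(aperture_m, wavelength_m=550e-9):
        """Near-field limit d = 2 * D**2 / wavelength for an aperture D."""
        return 2.0 * aperture_m ** 2 / wavelength_m

    # 1-m telescope: ~3.6e6 m (~3,600 km), close to the ~3,500 km quoted.
    print(fraunhofer_distance(1.0) / 1000.0)
    # AEOS (3.67-m aperture): ~4.9e7 m, i.e. over 46,000 km as quoted.
    print(fraunhofer_distance(3.67) / 1000.0)
    ```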

  5. The ExoMars PanCam Instrument

    NASA Astrophysics Data System (ADS)

    Griffiths, Andrew; Coates, Andrew; Muller, Jan-Peter; Jaumann, Ralf; Josset, Jean-Luc; Paar, Gerhard; Barnes, David

    2010-05-01

    The ExoMars mission has evolved into a joint European-US mission to deliver a trace gas orbiter and a pair of rovers to Mars in 2016 and 2018 respectively. The European rover will carry the Pasteur exobiology payload including the 1.56 kg Panoramic Camera. PanCam will provide multispectral stereo images with a 34 deg horizontal field-of-view (580 microrad/pixel) from its Wide-Angle Cameras (WAC) and colour monoscopic "zoom" images with a 5 deg horizontal field-of-view (83 microrad/pixel) from its High Resolution Camera (HRC). The stereo Wide Angle Cameras (WAC) are based on Beagle 2 Stereo Camera System heritage [1]. Integrated with the WACs and HRC into the PanCam optical bench (which helps the instrument meet its planetary protection requirements) is the PanCam interface unit (PIU), which provides image storage, a Spacewire interface to the rover and DC-DC power conversion. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission [2] as well as providing multispectral geological imaging, colour and stereo panoramic images and solar images for water vapour abundance and dust optical depth measurements. The High Resolution Camera (HRC) can be used for high resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls. Additionally HRC will be used to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. In short, PanCam provides the overview and context for the ExoMars experiment locations, required to enable the exobiology aims of the mission. In addition to these baseline capabilities, further enhancements to PanCam are possible to increase its effectiveness for astrobiology and planetary exploration: 1. Rover Inspection Mirror (RIM) 2. Organics Detection by Fluorescence Excitation (ODFE) LEDs [3-6] 3. UVIS broadband UV Flux and Opacity Determination (UVFOD) photodiode. This paper will discuss the scientific objectives and resource impacts of these enhancements. References: 1. Griffiths, A.D., Coates, A.J., Josset, J.-L., Paar, G., Hofmann, B., Pullan, D., Ruffer, P., Sims, M.R., Pillinger, C.T., The Beagle 2 stereo camera system, Planet. Space Sci. 53, 1466-1488, 2005. 2. Paar, G., Oberst, J., Barnes, D.P., Griffiths, A.D., Jaumann, R., Coates, A.J., Muller, J.P., Gao, Y., Li, R., 2007, Requirements and Solutions for ExoMars Rover Panoramic Camera 3D Vision Processing, abstract submitted to EGU meeting, Vienna, 2007. 3. Storrie-Lombardi, M.C., Hug, W.F., McDonald, G.D., Tsapin, A.I., and Nealson, K.H. 2001. Hollow cathode ion lasers for deep ultraviolet Raman spectroscopy and fluorescence imaging. Rev. Sci. Ins., 72 (12), 4452-4459. 4. Nealson, K.H., Tsapin, A., and Storrie-Lombardi, M. 2002. Searching for life in the universe: unconventional methods for an unconventional problem. International Microbiology, 5, 223-230. 5. Mormile, M.R. and Storrie-Lombardi, M.C. 2005. The use of ultraviolet excitation of native fluorescence for identifying biomarkers in halite crystals. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 246-253. 6. Storrie-Lombardi, M.C. 2005. Post-Bayesian strategies to optimize astrobiology instrument suites: lessons from Antarctica and the Pilbara. Astrobiology and Planetary Missions (R. B. Hoover, G. V. Levin and A. Y. Rozanov, Eds.), Proc. SPIE, 5906, 288-301.

  6. An evaluation of video cameras for collecting observational data on sanctuary-housed chimpanzees (Pan troglodytes).

    PubMed

    Hansen, Bethany K; Fultz, Amy L; Hopper, Lydia M; Ross, Stephen R

    2018-05-01

    Video cameras are increasingly being used to monitor captive animals in zoo, laboratory, and agricultural settings. This technology may also be useful in sanctuaries with large and/or complex enclosures. However, the cost of camera equipment and the lack of formal evaluations of camera use in sanctuary settings make it challenging for facilities to decide whether and how to implement this technology. To address this, we evaluated the feasibility of using a video camera system to monitor chimpanzees at Chimp Haven. We viewed a group of resident chimpanzees in a large forested enclosure and compared observations collected in person with those collected via remote video cameras. We found that, via camera, the observer viewed fewer chimpanzees in some outdoor locations (GLMM post hoc test: est. = 1.4503, SE = 0.1457, Z = 9.951, p < 0.001) and identified a lower proportion of chimpanzees (GLMM post hoc test: est. = -2.17914, SE = 0.08490, Z = -25.666, p < 0.001) than with in-person observations. However, the observer could view the 2 ha enclosure 15 times faster by camera than in person. In addition to these results, we provide recommendations for animal facilities considering the installation of a video camera system. Despite some limitations of remote monitoring, we posit that there are substantial benefits to using camera systems in sanctuaries to facilitate animal care and observational research. © 2018 Wiley Periodicals, Inc.

  7. Object tracking using multiple camera video streams

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusion: an object that is only partially visible, or not visible at all, in one camera may be fully visible in another. Object registration is achieved by determining the location of common features of the moving object across simultaneous frames, and perspective differences are adjusted. Combining information from the images of multiple cameras increases the robustness of the tracking process. Motion tracking is achieved by detecting the anomalies caused by each object's movement across frames in time, both in each individual stream and in the combined video information. The path of each object is determined heuristically. Detection accuracy depends on the speed of the object as well as on variations in its direction of motion. Faster cameras increase accuracy but limit the allowable complexity of the algorithm, since less processing time is available per frame. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that the scenes of at least two nearby cameras overlap. An object can then be tracked over long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.
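    A minimal sketch of the per-camera detection stage is below (Python with OpenCV; the file names, the background-subtraction approach, and the thresholds are illustrative assumptions, not the authors' implementation). Each synchronized frame pair is segmented for motion independently; a full system would then register the detections across views to resolve occlusions.

    ```python
    # Moving-object detection on two time-synchronized streams: a sketch.
    # "cam0.avi" and "cam1.avi" are hypothetical synchronized recordings.
    import cv2
    import numpy as np

    caps = [cv2.VideoCapture(p) for p in ("cam0.avi", "cam1.avi")]
    subtractors = [cv2.createBackgroundSubtractorMOG2() for _ in caps]
    kernel = np.ones((3, 3), np.uint8)

    def detect(frame, subtractor, min_area=500):
        """Return bounding boxes of moving blobs in one frame."""
        mask = subtractor.apply(frame)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # suppress speckle
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours
                if cv2.contourArea(c) >= min_area]

    while True:
        grabs = [cap.read() for cap in caps]
        if not all(ok for ok, _ in grabs):
            break
        # Streams are synchronized, so each iteration sees the same instant.
        per_view = [detect(frame, sub) for (_, frame), sub in zip(grabs, subtractors)]
        # Cross-view registration (matching common features, adjusting for
        # perspective) would happen here; this sketch only reports counts.
        print([len(d) for d in per_view])
    ```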

  8. Real-time depth camera tracking with geometrically stable weight algorithm

    NASA Astrophysics Data System (ADS)

    Fu, Xingyin; Zhu, Feng; Qi, Feng; Wang, Mingming

    2017-03-01

    We present an approach for real-time camera tracking with a depth stream. Existing methods are prone to drift in scenes without sufficient geometric information. First, we propose a new weighting method for the iterative closest point (ICP) algorithm commonly used in real-time dense mapping and tracking systems. By detecting uncertainty in the pose and increasing the weight of points that constrain unstable transformations, our system achieves accurate and robust trajectory estimation. Our pipeline can be fully parallelized on the GPU and incorporated seamlessly into current real-time depth camera tracking systems. Second, we compare the state-of-the-art weighting algorithms and propose a weight degradation algorithm matched to the measurement characteristics of a consumer depth camera. Third, we use NVIDIA Kepler shuffle instructions during warp and block reductions to improve the efficiency of our system. Results on the public TUM RGB-D benchmark demonstrate that our camera tracking system achieves state-of-the-art results in both accuracy and efficiency.
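    For context, the step that such per-point weights plug into is the standard linearized point-to-plane ICP update, sketched below in Python/NumPy. The uniform `weights` argument is a placeholder; the paper's geometric-stability weights would be supplied there. This is a generic sketch, not the paper's code.

    ```python
    # One weighted point-to-plane ICP step (sketch).
    # Minimizes sum_i w_i * (n_i . (p_i + cross(omega, p_i) + t - q_i))^2,
    # using the small-angle linearization R ~ I + [omega]_x, x = (omega, t).
    import numpy as np

    def weighted_icp_step(src, dst, normals, weights):
        """src, dst: (N,3) corresponding points; normals: (N,3) normals of dst;
        weights: (N,) per-correspondence weights. Returns (R, t)."""
        A = np.hstack([np.cross(src, normals), normals])   # (N,6) Jacobian
        b = np.einsum("ij,ij->i", dst - src, normals)      # signed residuals
        sw = np.sqrt(weights)                              # weighted least squares
        x, *_ = np.linalg.lstsq(sw[:, None] * A, sw * b, rcond=None)
        wx, wy, wz, tx, ty, tz = x
        R = np.array([[1., -wz,  wy],
                      [ wz,  1., -wx],
                      [-wy,  wx,  1.]])                    # linearized rotation
        return R, np.array([tx, ty, tz])
    ```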

  9. Intraocular camera for retinal prostheses: Refractive and diffractive lens systems

    NASA Astrophysics Data System (ADS)

    Hauer, Michelle Christine

    The focus of this thesis is the design and analysis of refractive, diffractive, and hybrid refractive/diffractive lens systems for a miniaturized camera that can be surgically implanted in the crystalline lens sac and is designed to work in conjunction with current and future generation retinal prostheses. The development of such an intraocular camera (IOC) would eliminate the need for an external head-mounted or eyeglass-mounted camera. Placing the camera inside the eye would allow subjects to use their natural eye movements for foveation (attention) instead of more cumbersome head tracking, would notably aid personal navigation and mobility, and would also be significantly more psychologically appealing from the standpoint of personal appearance. The capability for accommodation with no moving parts or feedback control is incorporated by employing camera designs that exhibit nearly infinite depth of field. Such an ultracompact optical imaging system requires a unique combination of refractive and diffractive optical elements and relaxed system constraints derived from human psychophysics. This configuration necessitates an extremely compact, short focal-length lens system with an f-number close to unity. Initially, these constraints appear highly aggressive from an optical design perspective. However, after careful analysis of the unique imaging requirements of a camera intended to work in conjunction with the relatively low pixellation levels of a retinal microstimulator array, it becomes clear that such a design is not only feasible, but could possibly be implemented with a single lens system.
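    To see why a short focal length can yield an effectively infinite depth of field even at f/1, consider the hyperfocal distance with illustrative numbers (these values are assumptions for the sake of the example, not taken from the thesis; the coarse pixellation of a retinal stimulator array permits a large circle of confusion c):

    ```latex
    H \approx \frac{f^2}{N\,c}
      = \frac{(2\ \mathrm{mm})^2}{1 \times 0.05\ \mathrm{mm}}
      = 80\ \mathrm{mm},
    ```

    so with the lens focused at H, everything from about H/2 = 40 mm out to infinity is rendered acceptably sharp, with no moving parts or focus control.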

  10. Orbital-science investigation: Part C: photogrammetry of Apollo 15 photography

    USGS Publications Warehouse

    Wu, Sherman S.C.; Schafer, Francis J.; Jordan, Raymond; Nakata, Gary M.; Derick, James L.

    1972-01-01

    Mapping of large areas of the Moon by photogrammetric methods was not seriously considered until the Apollo 15 mission. In this mission, a mapping camera system and a 61-cm optical-bar high-resolution panoramic camera, as well as a laser altimeter, were used. The mapping camera system comprises a 7.6-cm metric terrain camera and a 7.6-cm stellar camera mounted in a fixed angular relationship (an angle of 96° between the two camera axes). The metric camera has a glass focal-plane plate with reseau grids. Its ground-resolution capability from an altitude of 110 km is approximately 20 m. Because of the auxiliary stellar camera and the laser altimeter, the resulting metric photography can be used not only for medium- and small-scale cartographic or topographic maps, but can also provide a basis for establishing a lunar geodetic network. The optical-bar panoramic camera has a resolution of 135 to 180 lines per millimetre, which corresponds to approximately 1 to 2 m of ground resolution from an altitude of 110 km. Very large scale specialized topographic maps for supporting geologic studies of lunar-surface features can be produced from the stereoscopic coverage provided by this camera.
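    The quoted panoramic-camera figures are mutually consistent, as a quick check shows (illustrative arithmetic, not from the report): at 110 km altitude the 61-cm focal length gives an image scale of about 1:180,000, so film resolution of 135 to 180 lines per millimetre projects to roughly 1.0 to 1.3 m on the ground.

    ```latex
    \text{scale} = \frac{H}{f} = \frac{110{,}000\ \mathrm{m}}{0.61\ \mathrm{m}} \approx 1.8 \times 10^{5},
    \qquad
    \mathrm{GSD} \approx \frac{\text{scale}}{135\ \mathrm{lines/mm}} \approx 1.3\ \mathrm{m}
    \quad \text{(about 1.0 m at 180 lines/mm).}
    ```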

  11. University of Virginia suborbital infrared sensing experiment

    NASA Astrophysics Data System (ADS)

    Holland, Stephen; Nunnally, Clayton; Armstrong, Sarah; Laufer, Gabriel

    2002-03-01

    An Orion sounding rocket launched from Wallops Flight Facility carried a University of Virginia payload to an altitude of 47 km and returned infrared measurements of the Earth's upper atmosphere and video images of the ocean. The payload launch was the result of a three-year undergraduate design project by a multi-disciplinary student group from the University of Virginia and James Madison University. As part of a new multi-year design course, undergraduate students designed, built, tested, and participated in the launch of a suborbital platform from which atmospheric remote sensors and other scientific experiments could operate. The first launch included a simplified atmospheric measurement system intended to demonstrate full system operation and remote sensing capabilities during suborbital flight. A thermoelectrically cooled HgCdTe infrared detector, with peak sensitivity at 10 micrometers, measured upwelling radiation, and a small camera and VCR system, aligned with the infrared sensor, provided a ground reference. Additionally, a simple orientation sensor, consisting of three photodiodes equipped with red, green, and blue dichroic filters, was tested. Temperature measurements of the upper atmosphere were successfully obtained during the flight. Video images were successfully recorded on board the payload and proved a valuable tool in the data analysis process. The photodiode system, intended as a replacement for the camera and VCR system, functioned well despite low signal amplification. This fully integrated and flight-tested payload will serve as a platform for future atmospheric sensing experiments. It is currently being modified for a second suborbital flight that will incorporate a gas filter correlation radiometry (GFCR) instrument to measure the distribution of stratospheric methane, and imaging capabilities to record the chlorophyll distribution in Metompkin Bay as an indicator of pollution runoff.

  12. Digital dental photography. Part 4: choosing a camera.

    PubMed

    Ahmad, I

    2009-06-13

    With so many cameras and systems on the market, choosing the right one for your practice is a daunting task. As described in Part 1 of this series, a digital single lens reflex (DSLR) camera is an ideal choice for dental use, enabling the capture of portraits, close-up or macro images of the dentition, and study casts. However, for the sake of completeness, some other camera systems used in dentistry are also discussed.

  13. The new camera calibration system at the US Geological Survey

    USGS Publications Warehouse

    Light, D.L.

    1992-01-01

    Modern computerized photogrammetric instruments are capable of utilizing both radial and decentering camera calibration parameters, which can increase plotting accuracy over that of the older analog instrumentation of previous decades. Also, recent design improvements in aerial cameras have minimized distortion and increased the resolving power of camera systems, which should improve the performance of the overall photogrammetric process. In concert with these improvements, the Geological Survey has adopted the rigorous mathematical model for camera calibration developed by Duane Brown. The Geological Survey's calibration facility and the additional calibration parameters now provided in the USGS calibration certificate are reviewed. -Author
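    For reference, the Brown model referred to here combines radial and decentering (tangential) distortion terms; a common form (the exact parameter count varies by implementation) is:

    ```latex
    x_d = x\,(1 + K_1 r^2 + K_2 r^4 + K_3 r^6) + P_1\,(r^2 + 2x^2) + 2 P_2\,x y,
    \\
    y_d = y\,(1 + K_1 r^2 + K_2 r^4 + K_3 r^6) + P_2\,(r^2 + 2y^2) + 2 P_1\,x y,
    ```

    where (x, y) are image coordinates relative to the principal point, r^2 = x^2 + y^2, the K_i are the radial coefficients, and P_1, P_2 are the decentering coefficients.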

  14. 3D display for enhanced tele-operation and other applications

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Pezzaniti, J. Larry; Vaden, Justin; Hyatt, Brian; Morris, James; Chenault, David; Bodenhamer, Andrew; Pettijohn, Bradley; Tchon, Joe; Barnidge, Tracy; Kaufman, Seth; Kingston, David; Newell, Scott

    2010-04-01

    In this paper, we report on the use of a 3D vision field upgrade kit for the TALON robot, consisting of a replacement flat-panel stereoscopic display and multiple stereo camera systems. An assessment of the system's use for robotic driving, manipulation, and surveillance operations was conducted. The upgrade kit comprises a replacement display, a replacement mast camera with zoom, auto-focus, and variable convergence, and a replacement gripper camera with fixed focus and zoom. The stereo mast camera allows for improved driving and situational awareness as well as scene survey. The stereo gripper camera allows for improved manipulation in typical TALON missions.

  15. An on-line calibration algorithm for external parameters of visual system based on binocular stereo cameras

    NASA Astrophysics Data System (ADS)

    Wang, Liqiang; Liu, Zhen; Zhang, Zhonghua

    2014-11-01

    Stereo vision is key to visual measurement, robot vision, and autonomous navigation. Before a stereo vision system can be used, the intrinsic parameters of each camera and the external parameters of the system must be calibrated. In engineering practice, the intrinsic parameters remain unchanged after the cameras are calibrated, but the positional relationship between the cameras can change because of vibration, knocks, and pressure, for example in the vicinity of railways or motor workshops. Especially for large baselines, even minute changes in translation or rotation can affect the epipolar geometry and scene triangulation to such a degree that the visual system becomes unusable. A technique for both real-time checking and on-line recalibration of the external parameters of a stereo system is therefore particularly important. This paper presents an on-line method for checking and recalibrating the positional relationship between stereo cameras. In epipolar geometry, the external parameters of the cameras can be obtained by factorization of the fundamental matrix, which offers a way to calculate the external camera parameters without any special targets. If the intrinsic camera parameters are known, the external parameters of the system can be calculated from a number of randomly matched points. The process is: (i) estimate the fundamental matrix from the feature point correspondences; (ii) compute the essential matrix from the fundamental matrix; (iii) obtain the external parameters by decomposition of the essential matrix. In the fundamental matrix estimation step, traditional methods are sensitive to noise and cannot guarantee estimation accuracy. We consider the feature distribution in the actual scene images and introduce a regional weighted normalization algorithm to improve the accuracy of the fundamental matrix estimation. In contrast to traditional algorithms, experiments on simulated data show that the method improves the robustness and accuracy of the fundamental matrix estimation. Finally, an experiment computing the relationship of a pair of stereo cameras demonstrates the accuracy of the algorithm.
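    The F -> E -> (R, t) chain in steps (i)-(iii) maps directly onto standard OpenCV calls, as in the sketch below (Python; RANSAC stands in for the paper's regional weighted normalization, and shared intrinsics K for both cameras are an assumption).

    ```python
    # Recover stereo extrinsics from matched image points: a sketch of the
    # F -> E -> decomposition chain (not the paper's weighted estimator).
    import cv2
    import numpy as np

    def extrinsics_from_matches(pts1, pts2, K):
        """pts1, pts2: (N,2) float arrays of corresponding points in the two
        views; K: 3x3 intrinsic matrix (assumed shared). Returns R, t; t is
        recovered only up to scale, since epipolar geometry cannot fix it."""
        # (i) Fundamental matrix from correspondences (RANSAC for robustness).
        F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
        # (ii) Essential matrix from the fundamental matrix and intrinsics.
        E = K.T @ F @ K
        # (iii) Decompose E; the cheirality check selects the valid (R, t).
        mask = inliers.ravel() == 1
        _, R, t, _ = cv2.recoverPose(E, pts1[mask], pts2[mask], K)
        return R, t
    ```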

  16. The Surgeon's View: Comparison of Two Digital Video Recording Systems in Veterinary Surgery.

    PubMed

    Giusto, Gessica; Caramello, Vittorio; Comino, Francesco; Gandini, Marco

    2015-01-01

    Video recording and photography during surgical procedures are useful in veterinary medicine for several reasons, including legal, educational, and archival purposes. Many systems are available, such as hand-held cameras, light-mounted cameras, and head cameras. We chose a reasonably priced head camera that is among the smallest video cameras available. To best describe its possible uses and advantages, we recorded video and images of eight different surgical cases and procedures, in both hospital and field settings. All procedures were recorded both with the head-mounted camera and with a commercial hand-held photo camera. Sixteen volunteers (eight senior clinicians and eight final-year students) then completed an evaluation questionnaire. Both cameras produced high-quality photographs and videos, but observers rated the head camera significantly better with regard to point of view and their understanding of the surgical operation. The head camera was considered significantly more useful for teaching surgical procedures. Interestingly, senior clinicians tended to assign lower scores than students. The head camera we tested is an effective, easy-to-use tool for recording surgeries and various veterinary procedures in all situations, with no need for assistance from a dedicated operator. It can be a valuable aid for veterinarians working in all fields of the profession and a useful tool for veterinary surgical education.

  17. RESTORATION OF ATMOSPHERICALLY DEGRADED IMAGES. VOLUME 3.

    DTIC Science & Technology

    AERIAL CAMERAS, LASERS, ILLUMINATION, TRACKING CAMERAS, DIFFRACTION, PHOTOGRAPHIC GRAIN, DENSITY, DENSITOMETERS, MATHEMATICAL ANALYSIS, OPTICAL SCANNING, SYSTEMS ENGINEERING, TURBULENCE, OPTICAL PROPERTIES, SATELLITE TRACKING SYSTEMS.

  18. Development of two-framing camera with large format and ultrahigh speed

    NASA Astrophysics Data System (ADS)

    Jiang, Xiaoguo; Wang, Yuan; Wang, Yi

    2012-10-01

    A high-speed imaging facility is important and necessary for building a time-resolved measurement system with multi-framing capability. A framing camera that satisfies the demands of both high speed and large format must be specially developed for the ultrahigh-speed research field. A two-framing camera system with high sensitivity and time resolution has been developed and used for diagnosing the electron beam parameters of the Dragon-I linear induction accelerator (LIA). The camera system, which adopts the principle of beam splitting in the image space behind a lens of long focal length, mainly consists of a lens-coupled gated image intensifier, a CCD camera, and a high-speed shutter trigger device based on a programmable integrated circuit. The fastest gating time is about 3 ns, and the interval between the two frames can be adjusted discretely in steps of 0.5 ns. Both the gating time and the interval time can be independently tuned up to a maximum of about 1 s. Two images, each 1024×1024 pixels, can be captured simultaneously with the developed camera. Besides, the camera system possesses good linearity, uniform spatial response, and an equivalent background illumination as low as 5 electrons/pix/sec, which fully meets the measurement requirements of the Dragon-I LIA.

  19. Vision guided landing of an autonomous helicopter in hazardous terrain

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Montgomery, Jim

    2005-01-01

    Future robotic space missions will employ a precision soft-landing capability that will enable exploration of previously inaccessible sites of strong scientific significance. To enable this capability, a fully autonomous onboard system that identifies and avoids hazardous features such as steep slopes and large rocks is required. Such a system will also provide greater functionality in unstructured terrain for unmanned aerial vehicles. This paper describes an algorithm for landing hazard avoidance based on images from a single moving camera. The core of the algorithm is an efficient application of structure from motion to generate a dense elevation map of the landing area. Hazards are then detected in this map and a safe landing site is selected. The algorithm has been implemented on an autonomous helicopter testbed and demonstrated four times, resulting in the first autonomous landing of an unmanned helicopter in unknown and hazardous terrain.
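    A minimal sketch of the hazard-detection and site-selection stage on a dense elevation map is below (Python; the slope and roughness thresholds and the distance-transform selection rule are illustrative assumptions, not the paper's exact criteria).

    ```python
    # Pick a safe landing site from a dense elevation map: a sketch.
    import numpy as np
    from scipy import ndimage

    def safe_landing_site(elev, cell=0.25, max_slope_deg=10.0, max_rough_m=0.1):
        """elev: (H,W) elevation map in meters; cell: grid spacing in meters.
        Returns ((row, col), clearance_m) of the cell farthest from any hazard."""
        # Slope from finite differences of the elevation map.
        gy, gx = np.gradient(elev, cell)
        slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
        # Roughness: deviation from a local mean surface (box-filter approximation).
        rough = np.abs(elev - ndimage.uniform_filter(elev, size=9))
        hazard = (slope_deg > max_slope_deg) | (rough > max_rough_m)
        # Prefer the site with the largest clearance from hazardous cells.
        clearance = ndimage.distance_transform_edt(~hazard) * cell
        site = np.unravel_index(np.argmax(clearance), clearance.shape)
        return site, clearance[site]
    ```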

  20. Apollo 15 Mission Report

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A detailed discussion is presented of the Apollo 15 mission, which conducted exploration of the Moon over longer periods, greater ranges, and with more instruments for scientific data acquisition than previous missions. The topics include trajectory, lunar surface science, inflight science and photography, command and service module performance, lunar module performance, lunar surface operational equipment, the pilot's report, biomedical evaluation, mission support performance, assessment of mission objectives, a launch phase summary, an anomaly summary, and vehicle and equipment descriptions. The capability of transporting larger payloads and extending time on the Moon was demonstrated. The ground-controlled TV camera allowed greater real-time participation by earth-bound personnel. The crew operated more as scientists and relied more on the ground support team for systems monitoring. The modified pressure garment and portable life support system provided better mobility and extended EVA time. The lunar roving vehicle and the lunar communications relay unit were also demonstrated.
