Computer image generation: Reconfigurability as a strategy in high fidelity space applications
NASA Technical Reports Server (NTRS)
Bartholomew, Michael J.
1989-01-01
The demand for realistic, high fidelity computer image generation systems to support space simulation is well established. However, as the number and diversity of space applications increase, the complexity and cost of computer image generation systems also increase. One strategy used to harmonize cost with varied requirements is the establishment of a reconfigurable image generation system that can be adapted rapidly and easily to meet new and changing requirements. The reconfigurability strategy is discussed through the life cycle of system conception, specification, design, implementation, operation, and support for high fidelity computer image generation systems. The discussion is limited to those issues directly associated with the reconfigurability and adaptability of a specialized scene generation system in a multi-faceted space applications environment. Examples and insights gained through the recent development and installation of the Improved Multi-function Scene Generation System at the Johnson Space Center Systems Engineering Simulator are reviewed and compared with current simulator industry practices. The results are clear: the strategy of reconfigurability applied to space simulation requirements provides a viable path to supporting diverse applications with an adaptable computer image generation system.
Space imaging measurement system based on fixed lens and moving detector
NASA Astrophysics Data System (ADS)
Akiyama, Akira; Doshida, Minoru; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu
2006-08-01
We have developed a Space Imaging Measurement System, based on a fixed lens and a fast-moving detector, for the control of an autonomous ground vehicle. Space measurement is the most important task in the development of an autonomous ground vehicle. In this study we move the detector back and forth along the optical axis at a fast rate to measure three-dimensional image data. The system is well suited to an autonomous ground vehicle because it does not emit any optical energy to measure distance, which preserves safety. Because it uses a digital camera operating in the visible range, it acquires three-dimensional image data at lower cost than an imaging laser system. Many pieces of narrow-field space imaging measurement data can be combined to construct wide-range three-dimensional data, which improves image recognition of the object space. To achieve fast movement of the detector, we built a counter-mass balance into the mechanical crank system of the Space Imaging Measurement System, and we added a duct to suppress optical noise from rays not passing through the lens. The object distance is derived from the focus distance corresponding to the best-focused image, which is selected as the image with the maximum standard deviation among the standard deviations of the series of images.
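A minimal sketch of the focus-search step described above, under assumed names and values: the frame with the largest gray-level standard deviation is taken as best focused, and a thin-lens relation (an assumption here; the abstract only states that object distance is derived from the focus distance) maps the corresponding detector position to an object distance.

```python
import numpy as np

def best_focus_distance(images, detector_positions_mm, focal_length_mm=50.0):
    """images: list of 2-D arrays captured while the detector sweeps the optical axis."""
    # Sharpness metric: standard deviation of pixel intensities per frame.
    sharpness = [float(np.std(img)) for img in images]
    i_best = int(np.argmax(sharpness))
    v = detector_positions_mm[i_best]          # lens-to-detector (image) distance
    # Thin-lens equation 1/f = 1/u + 1/v  ->  object distance u (assumed mapping).
    u = 1.0 / (1.0 / focal_length_mm - 1.0 / v)
    return i_best, u

# Example with synthetic frames: the middle frame has the highest contrast.
frames = [np.random.normal(0, s, (64, 64)) for s in (1.0, 3.0, 1.2)]
positions = [51.0, 52.0, 53.0]
print(best_focus_distance(frames, positions))
```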
Low-cost space-varying FIR filter architecture for computational imaging systems
NASA Astrophysics Data System (ADS)
Feng, Guotong; Shoaib, Mohammed; Schwartz, Edward L.; Dirk Robinson, M.
2010-01-01
Recent research demonstrates the advantage of designing electro-optical imaging systems by jointly optimizing the optical and digital subsystems. The optical systems designed using this joint approach intentionally introduce large and often space-varying optical aberrations that produce blurry optical images. Digital sharpening restores reduced contrast due to these intentional optical aberrations. Computational imaging systems designed in this fashion have several advantages including extended depth-of-field, lower system costs, and improved low-light performance. Currently, most consumer imaging systems lack the necessary computational resources to compensate for these optical systems with large aberrations in the digital processor. Hence, the exploitation of the advantages of the jointly designed computational imaging system requires low-complexity algorithms enabling space-varying sharpening. In this paper, we describe a low-cost algorithmic framework and associated hardware enabling the space-varying finite impulse response (FIR) sharpening required to restore largely aberrated optical images. Our framework leverages the space-varying properties of optical images formed using rotationally-symmetric optical lens elements. First, we describe an approach to leverage the rotational symmetry of the point spread function (PSF) about the optical axis allowing computational savings. Second, we employ a specially designed bank of sharpening filters tuned to the specific radial variation common to optical aberrations. We evaluate the computational efficiency and image quality achieved by using this low-cost space-varying FIR filter architecture.
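A hedged sketch of the radial filter-bank idea described above: pixels are binned by distance from the optical axis, and each annulus is sharpened with its own small FIR kernel. The three-zone split and the unsharp-masking kernels are illustrative placeholders, not the filters designed in the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def radial_sharpen(image, kernels):
    """kernels: list of 2-D FIR kernels, one per radial zone (inner -> outer)."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2.0, xx - w / 2.0)
    edges = np.linspace(0, r.max() + 1e-6, len(kernels) + 1)
    out = np.zeros_like(image, dtype=float)
    for k, kern in enumerate(kernels):
        filtered = convolve(image.astype(float), kern, mode="nearest")
        zone = (r >= edges[k]) & (r < edges[k + 1])
        out[zone] = filtered[zone]
    return out

# Progressively stronger unsharp-masking kernels toward the image edge.
identity = np.zeros((3, 3)); identity[1, 1] = 1.0
lap = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], dtype=float)
kernels = [identity + g * lap for g in (0.2, 0.5, 0.9)]
print(radial_sharpen(np.random.rand(128, 128), kernels).shape)
```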
Space-based infrared sensors of space target imaging effect analysis
NASA Astrophysics Data System (ADS)
Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang
2018-02-01
Target identification is one of the core problems of a ballistic missile defense system, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a point-source imaging model of an exo-atmospheric ballistic target for space-based infrared sensors; it then simulates the infrared imaging of the ballistic target from two aspects, the space-based sensor's camera parameters and the target characteristics, and analyzes the effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target imaging.
A phase space approach to imaging from limited data
NASA Astrophysics Data System (ADS)
Testorf, Markus E.
2015-09-01
The optical instrument function is used as the basis for developing optical system theory for imaging applications. The detection of optical signals is conveniently described as the overlap integral of the Wigner distribution functions of the instrument and the optical signal. Based on this framework, various optical imaging systems, including plenoptic cameras, phase-retrieval algorithms, and Shack-Hartmann sensors, are shown to acquire information about a domain in phase space with finite extension and finite resolution. It is demonstrated how phase-space optics can be used both to analyze imaging systems and to design methods for image reconstruction.
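As an illustrative aside (not taken from the paper), the quantity this framework is built on can be computed numerically: a discrete pseudo-Wigner distribution of a sampled 1-D field, and the phase-space overlap integral between a signal and an instrument function. The periodic extension and the Gaussian test fields below are simplifying assumptions.

```python
import numpy as np

def wigner(f):
    """Discrete pseudo-Wigner distribution of a 1-D complex signal."""
    f = np.asarray(f, dtype=complex)
    n = len(f)
    W = np.zeros((n, n), dtype=complex)
    s = np.arange(-(n // 2), n // 2)              # lag index, centered order
    for x in range(n):
        ip = (x + s) % n                          # periodic extension for simplicity
        im = (x - s) % n
        kernel = f[ip] * np.conj(f[im])           # Hermitian in the lag variable
        W[x] = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(kernel)))
    return W.real

def phase_space_overlap(W_signal, W_instrument):
    """Detected signal modelled as the overlap integral of two WDFs."""
    return float(np.sum(W_signal * W_instrument))

x = np.linspace(-8, 8, 128)
signal = np.exp(-x**2)                            # Gaussian test field
instrument = np.exp(-(x - 1.0)**2 / 0.5)          # shifted, narrower instrument function
print(phase_space_overlap(wigner(signal), wigner(instrument)))
```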
Computer graphics testbed to simulate and test vision systems for space applications
NASA Technical Reports Server (NTRS)
Cheatham, John B.; Wu, Chris K.; Lin, Y. H.
1991-01-01
A system was developed for displaying computer graphics images of space objects, and its use was demonstrated as a testbed for evaluating vision systems for space applications. In order to evaluate vision systems, it is desirable to be able to control all factors involved in creating the images used for processing by the vision system. Considerable time and expense are involved in building accurate physical models of space objects. Also, precise location of the model relative to the viewer and accurate location of the light source require additional effort. As part of this project, graphics models of space objects such as the Solarmax satellite were created so that the user can control the light direction and the relative position of the object and the viewer. The work is also aimed at providing control of hue, shading, noise, and shadows for use in demonstrating and testing image processing techniques. The simulated camera data can provide XYZ coordinates, pitch, yaw, and roll for the models. A physical model is also being used to provide comparison of camera images with the graphics images.
How We Get Pictures from Space. NASA Facts (Revised Edition).
ERIC Educational Resources Information Center
Haynes, Robert
This booklet discusses image processing from spacecraft in deep space. The camera system on board the spacecraft, the Deep Space Network (DSN), and the image processing system are described. A table listing photographs taken by unmanned spacecraft from 1959-1977 is provided. (YP)
Image-based systems for space surveillance: from images to collision avoidance
NASA Astrophysics Data System (ADS)
Pyanet, Marine; Martin, Bernard; Fau, Nicolas; Vial, Sophie; Chalte, Chantal; Beraud, Pascal; Fuss, Philippe; Le Goff, Roland
2011-11-01
In many space systems, imaging is a core technology for fulfilling the mission requirements. Depending on the application, the needs and constraints differ, and imaging systems can offer a large variety of configurations in terms of wavelength, resolution, field of view, focal length, or sensitivity. Adequate image processing algorithms allow the extraction of the needed information and the interpretation of images. As a prime contractor for many major civil and military projects, Astrium ST is heavily involved in the proposition, development, and realization of new image-based techniques and systems for space-related purposes. Among the different applications, space surveillance is a major stake for the future of space transportation. Indeed, studies show that the number of debris objects in orbit is growing exponentially, and the existing population of small and medium debris is a concrete threat to operational satellites. This paper presents Astrium ST activities regarding space surveillance for space situational awareness (SSA) and space traffic management (STM). Among other possible SSA architectures, the relevance of a ground-based optical station network is investigated. The objective is to detect and track space debris and maintain an exhaustive, accurate, and up-to-date catalogue in order to assess collision risk for satellites and space vehicles. The system is composed of different types of optical stations dedicated to specific functions (survey, passive tracking, active tracking), distributed around the globe. To support these investigations, two in-house operational breadboards were implemented and are operated for survey and tracking purposes. This paper focuses on the Astrium ST end-to-end optical-based survey concept. For the detection of new debris, a network of wide field-of-view survey stations is considered: those stations are able to detect small objects, and the associated image processing (detection and tracking) allows a preliminary restitution of their orbits.
An all-reflective wide-angle flat-field telescope for space
NASA Technical Reports Server (NTRS)
Hallam, K. L.; Howell, B. J.; Wilson, M. E.
1984-01-01
An all-reflective wide-angle flat-field telescope (WAFFT) designed and built at Goddard Space Flight Center demonstrates the markedly improved wide-angle imaging capability which can be achieved with a design based on a recently announced class of unobscured 3-mirror optical systems. Astronomy and earth observation missions in space dictate the necessity or preference for wide-angle all-reflective systems which can provide UV through IR wavelength coverage and tolerate the space environment. An initial prototype unit has been designed to meet imaging requirements suitable for monitoring the ultraviolet sky from space. The unobscured f/4, 36 mm efl system achieves a full 20 x 30 deg field of view with resolution over a flat focal surface that is well matched for use with advanced ultraviolet image array detectors. Aspects of the design and fabrication approach, which have especially important bearing on the system solution, are reviewed; and test results are compared with the analytic performance predictions. Other possible applications of the WAFFT class of imaging system are briefly discussed. The exceptional wide-angle, high quality resolution, and very wide spectral coverage of the WAFFT-type optical system could make it a very important tool for future space research.
Taoka, Toshiaki; Masutani, Yoshitaka; Kawai, Hisashi; Nakane, Toshiki; Matsuoka, Kiwamu; Yasuno, Fumihiko; Kishimoto, Toshifumi; Naganawa, Shinji
2017-04-01
The activity of the glymphatic system is impaired in animal models of Alzheimer's disease (AD). We evaluated the activity of the human glymphatic system in cases of AD with a diffusion-based technique called diffusion tensor image analysis along the perivascular space (DTI-ALPS). Diffusion tensor images were acquired to calculate diffusivities along the x, y, and z axes in the plane of the lateral ventricle body in 31 patients. We evaluated the diffusivity along the perivascular spaces, as well as along projection fibers and association fibers separately, to acquire an index of diffusivity along the perivascular space (ALPS-index), and correlated these measures with the Mini-Mental State Examination (MMSE) score. We found a significant negative correlation between the MMSE score and diffusivity along the projection fibers and association fibers. We also observed a significant positive correlation between diffusivity along the perivascular spaces, expressed as the ALPS-index, and the MMSE score, indicating lower water diffusivity along the perivascular space with greater AD severity. The activity of the glymphatic system may thus be evaluated with diffusion images. Lower diffusivity along the perivascular space on DTI-ALPS seems to reflect impairment of the glymphatic system. This method may be useful for evaluating the activity of the glymphatic system.
Aperture Mask for Unambiguous Parity Determination in Long Wavelength Imagers
NASA Technical Reports Server (NTRS)
Bos, Brent
2011-01-01
A document discusses a new parity pupil mask design that allows users to unambiguously determine the image space coordinate system of all the James Webb Space Telescope (JWST) science instruments by using two out-of-focus images. This is an improvement over existing mask designs that could not completely eliminate the coordinate system parity ambiguity at a wavelength of 5.6 microns. To mitigate the problem of how the presence of diffraction artifacts can obscure the pupil mask detail, this innovation has been created with specifically designed edge features so that the image space coordinate system parity can be determined in the presence of diffraction, even at long wavelengths.
Global Interior Robot Localisation by a Colour Content Image Retrieval System
NASA Astrophysics Data System (ADS)
Chaari, A.; Lelandais, S.; Montagne, C.; Ahmed, M. Ben
2007-12-01
We propose a new global localisation approach to determine a coarse position of a mobile robot in structured indoor space using colour-based image retrieval techniques. We use an original method of colour quantisation based on the baker's transformation to extract a two-dimensional colour pallet combining spatial and vicinity-related information with the colourimetric aspect of the original image. We conceive several retrieval approaches leading to a specific similarity measure that integrates the spatial organisation of colours in the pallet. The baker's transformation provides a quantisation of the image into a space where colours that are nearby in the original space are also nearby in the output space, thereby providing dimensionality reduction and invariance to minor changes in the image, whereas the similarity measure provides partial invariance to translation, small changes in viewpoint, and scale factor. In addition, we developed a hierarchical search module based on the classification of images by room. This hierarchical module reduces the search space and improves the system's performance. Results are then compared with those obtained using colour histograms with several similarity measures. In this paper, we focus on colour-based features to describe indoor images; a finalised system must obviously integrate other types of signature such as shape and texture.
MOSES: a modular sensor electronics system for space science and commercial applications
NASA Astrophysics Data System (ADS)
Michaelis, Harald; Behnke, Thomas; Tschentscher, Matthias; Mottola, Stefano; Neukum, Gerhard
1999-10-01
The camera group of the DLR Institute of Space Sensor Technology and Planetary Exploration is developing imaging instruments for scientific and space applications. One example is the ROLIS imaging system of the ESA scientific space mission `Rosetta', which consists of a descent/downlooking imager and a close-up imager. Both are parts of the Rosetta Lander payload and will operate in the extreme environment of a cometary nucleus. The Rosetta Lander Imaging System (ROLIS) will introduce a new concept for the sensor electronics, which is referred to as MOSES (Modular Sensor Electronics System). MOSES is a 3D-miniaturized CCD sensor electronics system based on single modules. Each of the modules has some flexibility and enables simple adaptation to specific application requirements. MOSES is mainly designed for space applications where high performance and high reliability are required. The concept, however, can also be used in other scientific or commercial applications. This paper describes the concept of MOSES, its characteristics, performance, and applications.
Knowledge-based machine vision systems for space station automation
NASA Technical Reports Server (NTRS)
Ranganath, Heggere S.; Chipman, Laure J.
1989-01-01
Computer vision techniques which have the potential for use on the space station and related applications are assessed. A knowledge-based vision system (expert vision system) and the development of a demonstration system for it are described. This system implements some of the capabilities that would be necessary in a machine vision system for the robot arm of the laboratory module in the space station. A Perceptics 9200e image processor, on a host VAXstation, was used to develop the demonstration system. In order to use realistic test images, photographs of actual space shuttle simulator panels were used. The system's capabilities of scene identification and scene matching are discussed.
NASA Astrophysics Data System (ADS)
Newswander, T.; Riesland, David W.; Miles, Duane; Reinhart, Lennon
2017-09-01
For space optical systems that image extended scenes such as earth-viewing systems, modulation transfer function (MTF) test data is directly applicable to system optical resolution. For many missions, it is the most direct metric for establishing the best focus of the instrument. Additionally, MTF test products can be combined to predict overall imaging performance. For fixed focus instruments, finding the best focus during ground testing is critical to achieving good imaging performance. The ground testing should account for the full-imaging system, operational parameters, and operational environment. Testing the full-imaging system removes uncertainty caused by breaking configurations and the combination of multiple subassembly test results. For earth viewing, the imaging system needs to be tested at infinite conjugate. Operational environment test conditions should include temperature and vacuum. Optical MTF testing in the presence of operational vibration and gravity release is less straightforward and may not be possible on the ground. Gravity effects are mitigated by testing in multiple orientations. Many space telescope systems are designed and built to have optimum performance in a gravity-free environment. These systems can have imaging performance that is dominated by aberration including astigmatism. This paper discusses how the slanted edge MTF test is applied to determine the best focus of a space optical telescope in ground testing accounting for gravity sag effects. Actual optical system test results and conclusions are presented.
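A simplified sketch of the slanted-edge procedure the paper relies on, stated as an assumption rather than the authors' exact pipeline: an edge spread function (ESF) is extracted across an edge, its derivative gives the line spread function (LSF), the magnitude of the LSF's Fourier transform gives the MTF, and best focus is the setting whose MTF is highest at a chosen spatial frequency. Sub-pixel projection and binning along the edge slant are omitted here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def mtf_from_edge(edge_image):
    esf = edge_image.mean(axis=0)                 # average rows across the edge
    lsf = np.gradient(esf)                        # ESF derivative -> LSF
    lsf = lsf * np.hanning(len(lsf))              # window to limit noise leakage
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                           # normalize to 1 at zero frequency

def best_focus(edge_images, focus_positions, freq_bin=8):
    scores = [mtf_from_edge(img)[freq_bin] for img in edge_images]
    return focus_positions[int(np.argmax(scores))]

# Synthetic defocus sweep: blur an ideal edge by different amounts.
edge = np.tile(np.repeat([0.0, 1.0], 64), (32, 1))
sweep = [gaussian_filter1d(edge, s, axis=1) for s in (3.0, 1.0, 2.0)]
print(best_focus(sweep, [-0.2, 0.0, +0.2]))       # -> 0.0 (least blurred frame)
```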
The exploration of outer space with cameras: A history of the NASA unmanned spacecraft missions
NASA Astrophysics Data System (ADS)
Mirabito, M. M.
The use of television cameras and other video imaging devices to explore the solar system's planetary bodies with unmanned spacecraft is chronicled. Attention is given to the missions and the imaging devices, beginning with the Ranger 7 moon mission, which featured the first successfully operated electrooptical subsystem, six television cameras with vidicon image sensors. NASA established a network of parabolic, ground-based antennas on the earth (the Deep Space Network) to receive signals from spacecraft travelling farther than 16,000 km into space. The image processing and enhancement techniques used to convert spacecraft data transmissions into black and white and color photographs are described, together with the technological requirements that drove the development of the various systems. Terrestrial applications of the planetary imaging systems are explored, including medical and educational uses. Finally, the implementation and functional characteristics of CCDs are detailed, noting their installation on the Space Telescope.
An advanced scanning method for space-borne hyper-spectral imaging system
NASA Astrophysics Data System (ADS)
Wang, Yue-ming; Lang, Jun-Wei; Wang, Jian-Yu; Jiang, Zi-Qing
2011-08-01
Space-borne hyper-spectral imagery is an important means for studies and applications in earth science, and high cost efficiency can be achieved through optimized system design. In this paper, an advanced scanning method is proposed that helps implement an imaging system with both high temporal and high spatial resolution. The revisit frequency and effective working time of space-borne hyper-spectral imagers can be greatly improved by adopting a two-axis scanning system if spatial resolution and radiometric accuracy are not harshly demanded. In order to avoid the quality degradation caused by image rotation, a two-axis rotation scheme is presented based on the analysis and simulation of the two-dimensional scanning motion path and its features. Further improvement of the imager's detection ability under conditions of small solar altitude angle and low surface reflectance can be realized by ground motion compensation on the pitch axis. The structure and control performance are also described. An intelligent integration of two-dimensional scanning and image motion compensation is elaborated in this paper. With this technology, sun-synchronous hyper-spectral imagers are able to make quick visits to hot spots, acquiring hyper-spectral images with both high spatial and high temporal resolution, which enables rapid response to emergencies. The result has reference value for developing operational space-borne hyper-spectral imagers.
NASA Astrophysics Data System (ADS)
Mehta, Shalin B.; Sheppard, Colin J. R.
2010-05-01
Various methods that use a large illumination aperture (i.e., partially coherent illumination) have been developed for making transparent (i.e., phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement; coherent illumination has instead been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully, due to the lack of a general, physically insightful, and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate five different partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and the resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for analysis, calibration, and design of partially coherent imaging methods.
Image Analysis via Fuzzy-Reasoning Approach: Prototype Applications at NASA
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steven J.
2004-01-01
A set of imaging techniques based on a Fuzzy Reasoning (FR) approach was built for NASA at Kennedy Space Center (KSC) to perform complex real-time visual safety prototype tasks, such as detection and tracking of moving Foreign Object Debris (FOD) during Space Shuttle liftoff and visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad. The system has also shown promise in enhancing X-ray images used to screen hard-covered items, leading to better visualization. Its capability was used as well during the image analysis of the Space Shuttle Columbia accident. These FR-based imaging techniques include novel proprietary adaptive image segmentation, image edge extraction, and image enhancement. A Probabilistic Neural Network (PNN) scheme available from the NeuroShell(TM) Classifier and optimized via a Genetic Algorithm (GA) was also used along with this set of novel imaging techniques to add powerful learning and image classification capabilities. Prototype applications built using these techniques have received NASA Space Act Awards, including a Board Action Award, and are currently being filed for patents by NASA; they are being offered for commercialization through the Research Triangle Institute (RTI), an internationally recognized corporation in scientific research and technology development. Companies from different fields, including security, medical, text digitization, and aerospace, are currently in the process of licensing these technologies from NASA.
Estimating optical imaging system performance for space applications
NASA Technical Reports Server (NTRS)
Sinclair, K. F.
1972-01-01
The critical system elements of an optical imaging system are identified and a method for an initial assessment of system performance is presented. A generalized imaging system is defined. A system analysis is considered, followed by a component analysis. An example of the method is given using a film imaging system.
High resolution metric imaging payload
NASA Astrophysics Data System (ADS)
Delclaud, Y.
2017-11-01
Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, within the framework of earth observation systems able to provide military and government users with metric images from space. This leadership has allowed Alcatel to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, C.
1995-02-01
Views of the Solar System has been created as an educational tour of the solar system. It contains images and information about the Sun, planets, moons, asteroids and comets found within the solar system. The image processing for many of the images was done by the author. This tour uses hypertext to allow space travel by simply clicking on a desired planet. This causes information and images about the planet to appear on screen. While on a planet page, hyperlinks travel to pages about the moons and other relevant available resources. Unusual terms are linked to and defined in the Glossary page. Statistical information of the planets and satellites can be browsed through lists sorted by name, radius and distance. History of Space Exploration contains information about rocket history, early astronauts, space missions, spacecraft and detailed chronology tables of space exploration. The Table of Contents page has links to all of the various pages within Views Of the Solar System.
Image degradation characteristics and restoration based on regularization for diffractive imaging
NASA Astrophysics Data System (ADS)
Zhi, Xiyang; Jiang, Shikai; Zhang, Wei; Wang, Dawei; Li, Yun
2017-11-01
The diffractive membrane optical imaging system is an important development direction for ultra-large-aperture, lightweight space cameras. However, physics-based diffractive imaging degradation characteristics and the corresponding image restoration methods have received relatively little study. In this paper, the image quality degradation model for the diffractive imaging system is first deduced mathematically from diffraction theory, and the degradation characteristics are analyzed. On this basis, a novel regularization model of image restoration that contains multiple prior constraints is established. After that, a solving approach is presented for the resulting equation with coexisting multiple norms and multiple regularization (prior) parameters. Subsequently, a space-variant PSF image restoration method for the large-aperture diffractive imaging system is proposed, combining the model with a block-based treatment of isoplanatic regions. Experimentally, the proposed algorithm demonstrates its capacity to achieve multi-objective improvement, including MTF enhancement, dispersion correction, noise and artifact suppression, and preservation of image detail, and it produces satisfactory visual quality. This can provide a scientific basis for applications and has potential prospects for future space applications of diffractive membrane imaging technology.
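A hedged sketch of the block idea mentioned above, not the paper's multi-prior model: the image is split into patches small enough to be treated as isoplanatic, each patch is deconvolved with its local PSF using a simple Tikhonov/Wiener-style frequency-domain filter, and the patches are reassembled. The Gaussian PSF model, the patch size, and the regularization weight are placeholders.

```python
import numpy as np

def wiener_patch(patch, psf, reg=1e-2):
    pad = np.zeros(patch.shape)
    pad[:psf.shape[0], :psf.shape[1]] = psf
    pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    P = np.fft.fft2(pad)
    H = np.conj(P) / (np.abs(P) ** 2 + reg)       # Tikhonov-regularized inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(patch) * H))

def restore_space_variant(image, psf_for_patch, patch=64):
    out = np.zeros_like(image, dtype=float)
    for i in range(0, image.shape[0], patch):
        for j in range(0, image.shape[1], patch):
            blk = image[i:i + patch, j:j + patch]
            out[i:i + patch, j:j + patch] = wiener_patch(blk, psf_for_patch(i, j))
    return out

def gaussian_psf(i, j, size=9):
    # Placeholder space-variant PSF: blur grows with distance from the image centre.
    sigma = 1.0 + 0.004 * np.hypot(i - 256, j - 256)
    y, x = np.mgrid[-size // 2 + 1:size // 2 + 1, -size // 2 + 1:size // 2 + 1]
    k = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return k / k.sum()

print(restore_space_variant(np.random.rand(512, 512), gaussian_psf).shape)
```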
NASA Technical Reports Server (NTRS)
Albrecht, R.; Barbieri, C.; Adorf, H.-M.; Corrain, G.; Gemmo, A.; Greenfield, P.; Hainaut, O.; Hook, R. N.; Tholen, D. J.; Blades, J. C.
1994-01-01
Images of the Pluto-Charon system were obtained with the Faint Object Camera (FOC) of the Hubble Space Telescope (HST) after the refurbishment of the telescope. The images are of superb quality, allowing the determination of radii, fluxes, and albedos. Attempts were made to improve the resolution of the already diffraction limited images by image restoration. These yielded indications of surface albedo distributions qualitatively consistent with models derived from observations of Pluto-Charon mutual eclipses.
Design and implementation of a PC-based image-guided surgical system.
Stefansic, James D; Bass, W Andrew; Hartmann, Steven L; Beasley, Ryan A; Sinha, Tuhin K; Cash, David M; Herline, Alan J; Galloway, Robert L
2002-11-01
In interactive, image-guided surgery, current physical space position in the operating room is displayed on various sets of medical images used for surgical navigation. We have developed a PC-based surgical guidance system (ORION) which synchronously displays surgical position on up to four image sets and updates them in real time. There are three essential components which must be developed for this system: (1) accurately tracked instruments; (2) accurate registration techniques to map physical space to image space; and (3) methods to display and update the image sets on a computer monitor. For each of these components, we have developed a set of dynamic link libraries in MS Visual C++ 6.0 supporting various hardware tools and software techniques. Surgical instruments are tracked in physical space using an active optical tracking system. Several of the different registration algorithms were developed with a library of robust math kernel functions, and the accuracy of all registration techniques was thoroughly investigated. Our display was developed using the Win32 API for windows management and tomographic visualization, a frame grabber for live video capture, and OpenGL for visualization of surface renderings. We have begun to use this current implementation of our system for several surgical procedures, including open and minimally invasive liver surgery.
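The abstract names registration as one of the three essential components; as a generic illustration (not ORION's actual algorithm), here is a standard SVD-based point-based rigid registration that maps physical-space fiducials to image space, followed by a fiducial registration error check. All names and values are illustrative.

```python
import numpy as np

def rigid_register(physical_pts, image_pts):
    """Least-squares rotation R and translation t with image ≈ R @ physical + t."""
    p_c = physical_pts.mean(axis=0)
    q_c = image_pts.mean(axis=0)
    H = (physical_pts - p_c).T @ (image_pts - q_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = q_c - R @ p_c
    return R, t

def fre(R, t, physical_pts, image_pts):
    """Fiducial registration error: RMS distance after applying the transform."""
    mapped = physical_pts @ R.T + t
    return float(np.sqrt(np.mean(np.sum((mapped - image_pts) ** 2, axis=1))))

# Synthetic check: recover a known rotation/translation from 6 fiducials.
rng = np.random.default_rng(0)
P = rng.uniform(-50, 50, (6, 3))
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([5.0, -3.0, 12.0])
R_est, t_est = rigid_register(P, Q)
print(round(fre(R_est, t_est, P, Q), 6))          # ~0 for noise-free fiducials
```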
The utility of polarized heliospheric imaging for space weather monitoring.
DeForest, C E; Howard, T A; Webb, D F; Davies, J A
2016-01-01
A polarizing heliospheric imager is a critical next generation tool for space weather monitoring and prediction. Heliospheric imagers can track coronal mass ejections (CMEs) as they cross the solar system, using sunlight scattered by electrons in the CME. This tracking has been demonstrated to improve the forecasting of impact probability and arrival time for Earth-directed CMEs. Polarized imaging allows locating CMEs in three dimensions from a single vantage point. Recent advances in heliospheric imaging have demonstrated that a polarized imager is feasible with current component technology. Developing this technology to a high technology readiness level is critical for space weather relevant imaging from either a near-Earth or deep-space mission. In this primarily technical review, we develop preliminary hardware requirements for a space weather polarizing heliospheric imager system and outline possible ways to flight qualify and ultimately deploy the technology operationally on upcoming specific missions. We consider deployment as an instrument on NOAA's Deep Space Climate Observatory follow-on near the Sun-Earth L1 Lagrange point, as a stand-alone constellation of smallsats in low Earth orbit, or as an instrument located at the Sun-Earth L5 Lagrange point. The critical first step is the demonstration of the technology, in either a science or prototype operational mission context.
Concepts for image management and communication system for space vehicle health management
NASA Astrophysics Data System (ADS)
Alsafadi, Yasser; Martinez, Ralph
On a space vehicle, the Crew Health Care System will handle minor accidents or illnesses immediately, thereby eliminating the necessity of early mission termination or emergency rescue. For practical reasons, only trained personnel with limited medical experience can be available on space vehicles to render preliminary health care, so there is a need to communicate with medical experts at different locations on Earth. The Interplanetary Image Management and Communication System (IIMACS) will be a bridge between worlds, delivering medical images acquired in space to physicians at different medical centers on Earth. This paper discusses the implementation of IIMACS by extending the Global Picture Archiving and Communication System (GPACS) being developed to interconnect medical centers on Earth. Furthermore, this paper explores the system requirements of IIMACS and different user scenarios. Our conclusion is that IIMACS is feasible using the maturing technology base of GPACS.
Projection x-space magnetic particle imaging.
Goodwill, Patrick W; Konkle, Justin J; Zheng, Bo; Saritas, Emine U; Conolly, Steven M
2012-05-01
Projection magnetic particle imaging (MPI) can improve imaging speed by over 100-fold over traditional 3-D MPI. In this work, we derive the 2-D x-space signal equation and 2-D image equation, and introduce the concepts of signal fading and resolution loss for a projection MPI imager. We then describe the design and construction of an x-space projection MPI scanner with a field gradient of 2.35 T/m across a 10 cm free magnet bore. The system has an expected resolution of 3.5 × 8.0 mm using Resovist tracer, and an experimental resolution of 3.8 × 8.4 mm. The system images 2.5 cm × 5.0 cm partial fields of view (FOVs) at 10 frames/s, and acquires a full field of view of 10 cm × 5.0 cm in 4 s. We conclude by imaging a resolution phantom, a complex "Cal" phantom, and mice injected with Resovist tracer, and we experimentally confirm the theoretically predicted x-space spatial resolution.
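For context, a minimal 1-D sketch of the x-space reconstruction idea this scanner family is named after, written under simplifying assumptions (a triangular field-free-point trajectory, a synthetic Lorentzian-like point spread, no DC recovery or partial-FOV stitching): the received signal is divided by the instantaneous FFP velocity and gridded to the FFP position.

```python
import numpy as np

def xspace_grid_1d(signal, ffp_pos, ffp_vel, grid_edges, v_min=1e-3):
    """Velocity-compensate the signal and average it onto a spatial grid."""
    keep = np.abs(ffp_vel) > v_min                 # drop turn-around samples
    norm = signal[keep] / ffp_vel[keep]            # divide by signed FFP velocity
    idx = np.digitize(ffp_pos[keep], grid_edges) - 1
    img = np.zeros(len(grid_edges) - 1)
    hits = np.zeros_like(img)
    for i, v in zip(idx, norm):
        if 0 <= i < img.size:
            img[i] += v
            hits[i] += 1
    return img / np.maximum(hits, 1)

# Triangular FFP sweep across a 10 cm line and a synthetic point source at +1 cm.
t = np.linspace(0.0, 1.0, 8000)
x_ffp = 0.05 * (2 / np.pi) * np.arcsin(np.sin(2 * np.pi * 5 * t))   # metres
v_ffp = np.gradient(x_ffp, t)
psf = lambda d: 1.0 / (1.0 + (d / 0.004) ** 2)                      # ~4 mm half-width blur
sig = psf(x_ffp - 0.01) * v_ffp                                     # signal ∝ PSF · velocity
edges = np.linspace(-0.05, 0.05, 101)
image = xspace_grid_1d(sig, x_ffp, v_ffp, edges)
print(edges[np.argmax(image)])                                      # ≈ 0.01 m
```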
Sampling and Visualizing Creases with Scale-Space Particles
Kindlmann, Gordon L.; Estépar, Raúl San José; Smith, Stephen M.; Westin, Carl-Fredrik
2010-01-01
Particle systems have gained importance as a methodology for sampling implicit surfaces and segmented objects to improve mesh generation and shape analysis. We propose that particle systems have a significantly more general role in sampling structure from unsegmented data. We describe a particle system that computes samplings of crease features (i.e. ridges and valleys, as lines or surfaces) that effectively represent many anatomical structures in scanned medical data. Because structure naturally exists at a range of sizes relative to the image resolution, computer vision has developed the theory of scale-space, which considers an n-D image as an (n + 1)-D stack of images at different blurring levels. Our scale-space particles move through continuous four-dimensional scale-space according to spatial constraints imposed by the crease features, a particle-image energy that draws particles towards scales of maximal feature strength, and an inter-particle energy that controls sampling density in space and scale. To make scale-space practical for large three-dimensional data, we present a spline-based interpolation across scale from a small number of pre-computed blurrings at optimally selected scales. The configuration of the particle system is visualized with tensor glyphs that display information about the local Hessian of the image, and the scale of the particle. We use scale-space particles to sample the complex three-dimensional branching structure of airways in lung CT, and the major white matter structures in brain DTI. PMID:19834216
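A hedged sketch of the scale axis described above, not the authors' implementation: a small number of pre-computed Gaussian blurrings define the scale-space stack, and values at intermediate scales are obtained by spline interpolation along the scale axis (cubic splines here; the specific scale samples are placeholders for the paper's optimally selected scales).

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.interpolate import CubicSpline

def build_scale_stack(image, sigmas):
    """Stack of pre-computed blurrings: axis 0 is the scale axis."""
    return np.stack([gaussian_filter(image, s) for s in sigmas])

def sample_scale_space(stack, sigmas, x, y, sigma_query):
    """Value of the (n+1)-D scale-space at spatial point (x, y) and scale sigma."""
    spline = CubicSpline(sigmas, stack[:, y, x])
    return float(spline(sigma_query))

img = np.random.rand(128, 128)
sigmas = [0.5, 1.0, 2.0, 4.0, 8.0]          # placeholder scale samples
stack = build_scale_stack(img, sigmas)
print(sample_scale_space(stack, sigmas, x=40, y=60, sigma_query=3.0))
```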
Image Capture and Display Based on Embedded Linux
NASA Astrophysics Data System (ADS)
Weigong, Zhang; Suran, Di; Yongxiang, Zhang; Liming, Li
To meet the requirement of building a highly reliable communication system, SpaceWire was selected for the integrated electronic system, and its performance needed to be tested. As part of this testing work, the goal of this paper is to transmit image data from a CMOS camera through SpaceWire and display real-time images on a graphical user interface built with Qt on an embedded Linux and ARM development platform. A point-to-point transmission mode was chosen; the test results showed that the images received at the two communication ends were consistent and delivered in succession, suggesting that SpaceWire can transmit the data reliably.
Sustained Space Superiority: A National Strategy for the United States
2002-08-01
of systems related to the initial concept continued. The U.S. Army developed the Nike Zeus system, and the United States conducted the first...resolution imaging satellite was Space Imaging on September 24, 1999. It plans to capture thirty to forty percent of the commercial imagery market ...actively supported commercial space companies in order to open new international markets, Congress imposed restrictions on these commercial
Space shuttle visual simulation system design study
NASA Technical Reports Server (NTRS)
1973-01-01
The current and near-future state-of-the-art in visual simulation equipment technology is related to the requirements of the space shuttle visual system. Image source, image sensing, and displays are analyzed on a subsystem basis, and the principal conclusions are used in the formulation of a recommended baseline visual system. Perceptibility and visibility are also analyzed.
Space Radar Image of Wadi Kufra, Libya
1998-04-14
The ability of a sophisticated radar instrument to image large regions of the world from space, using different frequencies that can penetrate dry sand cover, produced the discovery in this image: a previously unknown branch of an ancient river, buried under thousands of years of windblown sand in a region of the Sahara Desert in North Africa. This area is near the Kufra Oasis in southeast Libya, centered at 23.3 degrees north latitude, 22.9 degrees east longitude. The image was acquired by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar (SIR-C/X-SAR) when it flew aboard the space shuttle Endeavour on its 60th orbit on October 4, 1994. This SIR-C image reveals a system of old, now inactive stream valleys, called "paleodrainage systems." http://photojournal.jpl.nasa.gov/catalog/PIA01310
NASA Technical Reports Server (NTRS)
Pendergast, Karl J.; Schauwecker, Christopher J.
1998-01-01
Third in the series of NASA great observatories, the Advanced X-Ray Astrophysics Facility (AXAF) is scheduled for launch from the Space Shuttle in November of 1998. Following in the path of the Hubble Space Telescope and the Compton Gamma Ray Observatory, this observatory will image light at X-ray wavelengths, facilitating the detailed study of such phenomena as supernovae and quasars. The AXAF project is sponsored by the Marshall Space Flight Center in Huntsville, Alabama. Because of exacting requirements on the performance of the AXAF optical system, it was necessary to reduce the transmission of reaction wheel jitter disturbances to the observatory. This reduction was accomplished via use of a passive mechanical isolation system to interface the reaction wheels with the spacecraft central structure. In addition to presenting a description of the spacecraft, the isolation system, and the key image quality requirement flowdown, this paper details the analyses performed in support of system-level imaging performance requirement verification. These analyses include the identification of system-level requirement suballocations, quantification of imaging and pointing performance, and formulation of unit-level isolation system transmissibility requirements. Given in comparison to the non-isolated system imaging performance, the results of these analyses clearly illustrate the effectiveness of an innovative reaction wheel passive isolation system.
Nonlinear research of an image motion stabilization system embedded in a space land-survey telescope
NASA Astrophysics Data System (ADS)
Somov, Yevgeny; Butyrin, Sergey; Siguerdidjane, Houria
2017-01-01
We consider an image motion stabilization system embedded in a space telescope for scanning optoelectronic observation of terrestrial targets. A model of this system is presented that takes into account the physical hysteresis of the piezo-ceramic driver and the time delay introduced in forming the digital control. We present the elaborated algorithms for discrete filtering and digital control, results of an analysis of the image motion velocity oscillations in the telescope focal plane, and methods for ground and in-flight verification of the system.
A precise method for adjusting the optical system of laser sub-aperture
NASA Astrophysics Data System (ADS)
Song, Xing; Zhang, Xue-min; Yang, Jianfeng; Xue, Li
2018-02-01
In order to meet the requirements of modern astronomical observation and warfare, the resolution of space telescopes needs to be improved. Sub-aperture stitching imaging is one method of improving resolution; it can be used not only in ground-based and space-based large optical systems but also in laser transmission and microscopic imaging. The large-aperture primary mirror of a sub-aperture stitching imaging system is composed of multiple sub-mirrors distributed according to certain rules. All sub-mirrors are off-axis mirrors, so the alignment of a sub-aperture stitching imaging system is more complicated than that of a single off-axis optical system. An alignment method based on auto-collimation imaging and interferometric imaging is introduced in this paper. Using this alignment method, a sub-aperture stitching imaging system composed of 12 sub-mirrors was assembled with high resolution; the beam coincidence precision is better than 0.01 mm, and the system wave aberration is better than 0.05λ.
Web Mining for Web Image Retrieval.
ERIC Educational Resources Information Center
Chen, Zheng; Wenyin, Liu; Zhang, Feng; Li, Mingjing; Zhang, Hongjiang
2001-01-01
Presents a prototype system for image retrieval from the Internet using Web mining. Discusses the architecture of the Web image retrieval prototype; document space modeling; user log mining; and image retrieval experiments to evaluate the proposed system. (AEF)
2004-02-04
KENNEDY SPACE CENTER, FLA. - Armando Oliu, Final Inspection Team lead for the Shuttle program, speaks to reporters about the aid the Image Analysis Lab is giving the FBI in a kidnapping case. Oliu oversees the image lab that is using an advanced SGI® TP9500 data management system to review the tape of the kidnapping in progress in Sarasota, Fla. KSC installed the new $3.2 million system in preparation for Return to Flight of the Space Shuttle fleet. The lab is studying the Sarasota kidnapping video to provide any new information possible to law enforcement officers. KSC is joining NASA’s Marshall Space Flight Center in Alabama in reviewing the tape.
NASA Astrophysics Data System (ADS)
Tian, Biao; Liu, Yang; Xu, Shiyou; Chen, Zengping
2014-01-01
Interferometric inverse synthetic aperture radar (InISAR) imaging provides complementary information to monostatic inverse synthetic aperture radar (ISAR) imaging. This paper proposes a new InISAR imaging system for space targets based on wideband direct sampling using two antennas. The system is easier to realize in engineering than a three-receiver system, since the motion trajectory of space targets can be known in advance. In the preprocessing step, high-speed motion compensation is carried out by designing an adaptive matched filter containing the target speed obtained from narrow-band information. Then, coherent processing and the keystone transform for ISAR imaging are adopted to preserve the phase history at each antenna. Through appropriate collocation of the system, image registration and phase unwrapping can be avoided. For situations where this condition is not satisfied, the influence of baseline variation is analyzed and a compensation method is adopted. The corresponding target size can be obtained by interferometric processing of the two complex ISAR images. Experimental results prove the validity of the analysis and of the three-dimensional imaging algorithm.
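A hedged sketch of the interferometric step described above, under the standard far-field assumption (not necessarily the authors' exact processing): after the two antennas' ISAR images are formed on a common range-Doppler grid, the per-scatterer phase difference is mapped to the cross-baseline coordinate via z ≈ λ·R·Δφ / (2π·L). The baseline L, range R, wavelength, and synthetic scatterer below are illustrative values.

```python
import numpy as np

def inisar_height(img_a, img_b, wavelength, baseline, slant_range, power_thresh=0.1):
    """Cross-baseline coordinate for strong scatterers from two registered complex ISAR images."""
    interferogram = img_a * np.conj(img_b)
    phase = np.angle(interferogram)                       # assumed small / already unwrapped
    strong = np.abs(img_a) > power_thresh * np.abs(img_a).max()
    z = wavelength * slant_range * phase / (2 * np.pi * baseline)
    return np.where(strong, z, np.nan)

# Synthetic single scatterer 2 m off the reference plane.
lam, L, R = 0.03, 1.0, 1.0e6                              # X-band, 1 m baseline, 1000 km range
img_a = np.zeros((64, 64), complex); img_b = np.zeros((64, 64), complex)
img_a[32, 40] = 1.0
img_b[32, 40] = np.exp(-1j * 2 * np.pi * L * 2.0 / (lam * R))
print(np.nanmax(np.abs(inisar_height(img_a, img_b, lam, L, R))))   # ≈ 2.0
```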
Gu, X; Fang, Z-M; Liu, Y; Lin, S-L; Han, B; Zhang, R; Chen, X
2014-01-01
Three-dimensional fluid-attenuated inversion recovery magnetic resonance imaging of the inner ear after intratympanic injection of gadolinium, together with magnetic resonance imaging scoring of the perilymphatic space, were used to investigate the positive identification rate of hydrops and determine the technique's diagnostic value for delayed endolymphatic hydrops. Twenty-five patients with delayed endolymphatic hydrops underwent pure tone audiometry, bithermal caloric testing, vestibular-evoked myogenic potential testing and three-dimensional magnetic resonance imaging of the inner ear after bilateral intratympanic injection of gadolinium. The perilymphatic space of the scanned images was analysed to investigate the positive identification rate of endolymphatic hydrops. According to the magnetic resonance imaging scoring of the perilymphatic space and the diagnostic standard, 84 per cent of the patients examined had endolymphatic hydrops. In comparison, the positive identification rates for vestibular-evoked myogenic potential and bithermal caloric testing were 52 per cent and 72 per cent respectively. Three-dimensional magnetic resonance imaging after intratympanic injection of gadolinium is valuable in the diagnosis of delayed endolymphatic hydrops and its classification. The perilymphatic space scoring system improved the diagnostic accuracy of magnetic resonance imaging.
REVIEW OF DEVELOPMENTS IN SPACE REMOTE SENSING FOR MONITORING RESOURCES.
Watkins, Allen H.; Lauer, D.T.; Bailey, G.B.; Moore, D.G.; Rohde, W.G.
1984-01-01
Space remote sensing systems are compared for suitability in assessing and monitoring the Earth's renewable resources. Systems reviewed include the Landsat Thematic Mapper (TM), the National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR), the French Systeme Probatoire d'Observation de la Terre (SPOT), the German Shuttle Pallet Satellite (SPAS) Modular Optoelectronic Multispectral Scanner (MOMS), the European Space Agency (ESA) Spacelab Metric Camera, the National Aeronautics and Space Administration (NASA) Large Format Camera (LFC) and Shuttle Imaging Radar (SIR-A and -B), the Russian Meteor satellite BIK-E and fragment experiments and MKF-6M and KATE-140 camera systems, the ESA Earth Resources Satellite (ERS-1), the Japanese Marine Observation Satellite (MOS-1) and Earth Resources Satellite (JERS-1), the Canadian Radarsat, the Indian Resources Satellite (IRS), and systems proposed or planned by China, Brazil, Indonesia, and others. Also reviewed are the concepts for a 6-channel Shuttle Imaging Spectroradiometer, a 128-channel Shuttle Imaging Spectrometer Experiment (SISEX), and the U. S. Mapsat.
Supervised pixel classification using a feature space derived from an artificial visual system
NASA Technical Reports Server (NTRS)
Baxter, Lisa C.; Coggins, James M.
1991-01-01
Image segmentation involves labelling pixels according to their membership in image regions, which requires an understanding of what a region is. Using supervised pixel classification, the paper investigates how groups of pixels labelled manually according to perceived image semantics map onto the feature space created by an Artificial Visual System. The multiscale structure of regions is investigated, and it is shown that pixels form clusters based on their geometric roles in the image intensity function, not on image semantics. A tentative abstract definition of a 'region' is proposed based on this behavior.
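A hedged sketch of the kind of experiment described: each pixel gets a feature vector of Gaussian-smoothed intensities and gradient magnitudes at several scales (a generic stand-in for the Artificial Visual System features, whose exact definition is not given here), and a simple supervised classifier is trained on a subset of manually labelled pixels. The synthetic image, labels, and scale choices are placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, gaussian_gradient_magnitude
from sklearn.neighbors import KNeighborsClassifier

def pixel_features(image, scales=(1.0, 2.0, 4.0)):
    """Per-pixel multiscale feature vectors: smoothed intensity and gradient magnitude."""
    feats = []
    for s in scales:
        feats.append(gaussian_filter(image, s))
        feats.append(gaussian_gradient_magnitude(image, s))
    return np.stack(feats, axis=-1).reshape(-1, 2 * len(scales))

image = np.random.rand(64, 64)
labels = (image > 0.5).astype(int).ravel()        # stand-in for manual region labels
X = pixel_features(image)
train = np.random.default_rng(1).choice(len(X), 500, replace=False)
clf = KNeighborsClassifier(n_neighbors=5).fit(X[train], labels[train])
print((clf.predict(X) == labels).mean())          # agreement with the labels
```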
Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process
NASA Technical Reports Server (NTRS)
Dominquez, Jesus A.; Klinko, Steve J.
2007-01-01
Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented on a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and to estimate the trajectory and velocity of the foam that caused the accident.
A real-time MTFC algorithm of space remote-sensing camera based on FPGA
NASA Astrophysics Data System (ADS)
Zhao, Liting; Huang, Gang; Lin, Zhe
2018-01-01
A real-time MTFC (modulation transfer function compensation) algorithm for a space remote-sensing camera, implemented on an FPGA, was designed. The algorithm provides real-time image processing to enhance image clarity while the remote-sensing camera is operating on-orbit. The image restoration algorithm adopts a modular design. The on-orbit MTF measurement module calculates the edge spread function (ESF), the line spread function (LSF), the ESF difference operation, the normalized MTF, and the MTFC parameters. The MTFC image filtering and noise suppression module implements the filtering algorithm and effectively suppresses noise. The design uses System Generator for the image processing blocks, simplifying the system design structure and the redesign process. The image gray-level gradient, point sharpness, edge contrast, and mid-to-high frequency content are enhanced. The image SNR after restoration is reduced by less than 1 dB compared to the original image. The image restoration system can be widely used in various fields.
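A hedged sketch of the MTFC step itself, not the FPGA implementation: a compensation filter is built from a measured MTF curve as a regularized inverse and applied in the frequency domain. The exponential "measured MTF" below is synthetic, and a real on-board implementation would typically use a small spatial-domain FIR kernel rather than an FFT.

```python
import numpy as np

def mtfc_filter(shape, mtf_radial, reg=0.05):
    """2-D frequency response H(f) = MTF / (MTF^2 + reg) from a radial MTF curve."""
    fy = np.fft.fftfreq(shape[0])
    fx = np.fft.fftfreq(shape[1])
    fr = np.hypot(*np.meshgrid(fy, fx, indexing="ij")) / 0.5   # 0..1 of Nyquist
    mtf = mtf_radial(np.clip(fr, 0.0, 1.0))
    return mtf / (mtf ** 2 + reg)                              # regularized inverse

def apply_mtfc(image, H):
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

measured_mtf = lambda f: np.exp(-3.0 * f)          # placeholder for an on-orbit MTF measurement
img = np.random.rand(256, 256)
H = mtfc_filter(img.shape, measured_mtf)
restored = apply_mtfc(img, H)
print(restored.shape)
```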
NASA Technical Reports Server (NTRS)
1992-01-01
This document describes the Advanced Imaging System CCD based camera. The AIS1 camera system was developed at Photometric Ltd. in Tucson, Arizona as part of a Phase 2 SBIR contract No. NAS5-30171 from the NASA/Goddard Space Flight Center in Greenbelt, Maryland. The camera project was undertaken as a part of the Space Telescope Imaging Spectrograph (STIS) project. This document is intended to serve as a complete manual for the use and maintenance of the camera system. All the different parts of the camera hardware and software are discussed and complete schematics and source code listings are provided.
Integrating Space Systems Operations at the Marine Expeditionary Force Level
2015-06-01
Electromagnetic Interference ENVI Environment for Visualizing Images EW Electronic Warfare FA40 Space Operations Officer FEC Fires and Effects...Information Facility SFE Space Force Enhancement SIGINT Signals Intelligence SSA Space Situational Awareness SSE Space Support Element STK Systems...April 23, 2015. • GPS Interference and Navigation Tool (GIANT) for providing GPS accuracy prediction reports • Systems Toolkit (STK) Analysis
NASA Technical Reports Server (NTRS)
1989-01-01
The Marshall Space Flight Center annual report summarizes their advanced studies, research programs, and technological developments. Areas covered include: transportation systems; space systems such as Gravity Probe-B and Gamma Ray Imaging Telescope; data systems; microgravity science; astronomy and astrophysics; solar, magnetospheric, and atomic physics; aeronomy; propulsion; materials and processes; structures and dynamics; automated systems; space systems; and avionics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kotasidis, Fotis A., E-mail: Fotis.Kotasidis@unige.ch; Zaidi, Habib; Geneva Neuroscience Centre, Geneva University, CH-1205 Geneva
2014-06-15
Purpose: The Ingenuity time-of-flight (TF) PET/MR is a recently developed hybrid scanner combining the molecular imaging capabilities of PET with the excellent soft tissue contrast of MRI. It is becoming common practice to characterize the system's point spread function (PSF) and understand its variation under spatial transformations to guide clinical studies and potentially use it within resolution recovery image reconstruction algorithms. Furthermore, due to the system's utilization of overlapping and spherically symmetric Kaiser-Bessel basis functions during image reconstruction, its image space PSF and reconstructed spatial resolution could be affected by the selection of the basis function parameters. Hence, a detailed investigation into the multidimensional basis function parameter space is needed to evaluate the impact of these parameters on spatial resolution. Methods: Using an array of 12 × 7 printed point sources, along with a custom made phantom, and with the MR magnet on, the system's spatially variant image-based PSF was characterized in detail. Moreover, basis function parameters were systematically varied during reconstruction (list-mode TF OSEM) to evaluate their impact on the reconstructed resolution and the image space PSF. Following the spatial resolution optimization, phantom and clinical studies were subsequently reconstructed using representative basis function parameters. Results: Based on the analysis and under standard basis function parameters, the axial and tangential components of the PSF were found to be almost invariant under spatial transformations (∼4 mm), while the radial component varied modestly from 4 to 6.7 mm. Using a systematic investigation into the basis function parameter space, the spatial resolution was found to degrade for basis functions with a large radius and small shape parameter. However, it was found that optimizing the spatial resolution in the reconstructed PET images, while maintaining good basis function superposition and keeping the image representation error to a minimum, is feasible, with the parameter combination range depending upon the scanner's intrinsic resolution characteristics. Conclusions: Using the printed point source array as an MR-compatible methodology for experimentally measuring the scanner's PSF, the system's spatially variant resolution properties were successfully evaluated in image space. Overall the PET subsystem exhibits excellent resolution characteristics, mainly because the raw data are not under-sampled/rebinned, enabling the spatial resolution to be dictated by the scanner's intrinsic resolution and the image reconstruction parameters. Due to the impact of these parameters on the resolution properties of the reconstructed images, the image space PSF varies both under spatial transformations and with basis function parameter selection. Nonetheless, for a range of basis function parameters, the image space PSF remains unaffected, with the range depending on the scanner's intrinsic resolution properties.
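A hedged sketch of the basis function family whose parameters are being explored, assuming the generalized Kaiser-Bessel "blob" profile of Lewitt, b(r) = (sqrt(1-(r/a)^2))^m · I_m(alpha·sqrt(1-(r/a)^2)) / I_m(alpha) for r ≤ a; the radius a and shape parameter alpha can then be varied and their effect on the blob width (one proxy for reconstructed resolution) inspected. The parameter values below are illustrative, not the scanner defaults.

```python
import numpy as np
from scipy.special import iv                      # modified Bessel function I_m

def kaiser_bessel_blob(r, a=2.0, alpha=10.0, m=2):
    """Generalized Kaiser-Bessel blob profile (assumed standard form)."""
    r = np.asarray(r, dtype=float)
    z = np.sqrt(np.clip(1.0 - (r / a) ** 2, 0.0, None))
    return np.where(r <= a, (z ** m) * iv(m, alpha * z) / iv(m, alpha), 0.0)

def fwhm(radii, profile):
    """Full width at half maximum of a radially decreasing profile."""
    half = profile.max() / 2.0
    return 2.0 * radii[np.argmin(np.abs(profile - half))]

r = np.linspace(0, 3, 3001)
for a, alpha in [(2.0, 10.0), (2.5, 6.0), (2.5, 14.0)]:
    print(a, alpha, round(fwhm(r, kaiser_bessel_blob(r, a, alpha)), 3))
```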
Design of a concise Féry-prism hyperspectral imaging system based on multi-configuration
NASA Astrophysics Data System (ADS)
Dong, Wei; Nie, Yun-feng; Zhou, Jin-song
2013-08-01
To meet the needs of spaceborne and airborne hyperspectral imaging systems for light weight, simplicity, and high spatial resolution, a novel design of a Féry-prism hyperspectral imaging system based on the Zemax multi-configuration method is presented. The structure is arranged by analyzing optical monochromatic aberrations theoretically, and the resulting optical layout is concise. The design is founded on an Offner relay configuration in which the secondary mirror is replaced by a Féry prism with curved surfaces and a reflective front face. Because of this reflection, the light beam passes through the Féry prism twice, which improves spectral resolution and enhances image quality at the same time. The result shows that the system achieves light weight and simplicity compared with other hyperspectral imaging systems: it is composed of merely two spherical mirrors and one achromatized Féry prism that performs both the dispersion and imaging functions, making the structure concise and compact. The average spectral resolution is 6.2 nm; the MTFs over the 0.45-1.00 um spectral range are greater than 0.75; the RMS spot sizes are less than 2.4 um; the maximum smile is less than 10% of a pixel, while the keystone is less than 2.8% of a pixel; and the image quality approaches the diffraction limit. The design result shows that a hyperspectral imaging system with a modified Féry prism substituting for the secondary mirror of an Offner relay is feasible in both theory and practice, and possesses the merits of a simple structure, convenient optical alignment, good image quality, high spatial and spectral resolution, and adjustable dispersive nonlinearity. The system satisfies the requirements of airborne and spaceborne hyperspectral imaging.
Experiment and application of soft x-ray grazing incidence optical scattering phenomena
NASA Astrophysics Data System (ADS)
Chen, Shuyan; Li, Cheng; Zhang, Yang; Su, Liping; Geng, Tao; Li, Kun
2017-08-01
For short-wavelength imaging systems, surface scattering is one of the important factors degrading imaging performance. Studying the non-intuitive surface scatter effects that result from practical optical fabrication tolerances is necessary for evaluating the optical performance of high-resolution short-wavelength imaging systems. In this paper, soft X-ray optical scattering distributions are measured with a soft X-ray reflectometer installed in our laboratory, for different sample mirrors, wavelengths, and grazing angles. Then, for a space solar telescope, these scattered-light distributions are combined with a numerical surface-scattering model of a grazing incidence imaging system to compute the PSF and encircled energy of the telescope's optical system. The analysis and computation show that surface scattering severely degrades the imaging performance of grazing incidence systems.
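The encircled-energy figure of merit mentioned above can be computed from any sampled PSF; the sketch below does this for a placeholder Gaussian PSF, since the measured scatter-broadened PSF of the telescope is not reproduced here.

```python
# Sketch: encircled-energy fraction of a sampled PSF as a function of radius.
# The Gaussian PSF below is only a stand-in for the scatter-broadened PSF
# computed from the measured soft X-ray scattering distributions.
import numpy as np

n = 256
y, x = np.mgrid[:n, :n]
cx = cy = (n - 1) / 2.0
r = np.hypot(x - cx, y - cy)

psf = np.exp(-0.5 * (r / 6.0) ** 2)          # placeholder PSF
psf /= psf.sum()                              # normalize total energy to 1

radii = np.arange(1, 60)
encircled = [psf[r <= rad].sum() for rad in radii]
for rad, frac in zip(radii[::10], encircled[::10]):
    print(f"r = {rad:3d} px  encircled energy = {frac:.3f}")
```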
Research on application of several tracking detectors in APT system
NASA Astrophysics Data System (ADS)
Liu, Zhi
2005-01-01
The APT system is the key technology in free-space optical communication, and the acquisition and tracking detector is the key component of the APT system. Several candidate detectors can be used, such as the CCD, QAPD, and CMOS imager. The characteristics of these detectors, i.e., their structures and working schemes, are quite different. This paper gives a thorough comparison of the usage and working principles of the CCD and CMOS imager, and discusses key parameters such as tracking error, noise, and power consumption. The paper concludes that the CMOS imager is a good candidate detector for the APT system in free-space optical communication.
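A common ingredient in the tracking-error analysis for such detectors is the sub-pixel centroid of the beacon spot on the focal plane; the sketch below estimates it from a simulated noisy frame, with the spot model and noise level as illustrative assumptions.

```python
# Sketch: intensity-weighted centroid of a beacon spot on a pixelated detector,
# the basic measurement behind the tracking-error analysis for CCD/CMOS sensors.
# The spot model and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 64
y, x = np.mgrid[:n, :n]
true_pos = (30.3, 33.7)                                   # (row, col), sub-pixel
spot = np.exp(-0.5 * (((y - true_pos[0]) / 2.0) ** 2 +
                      ((x - true_pos[1]) / 2.0) ** 2))
frame = spot + rng.normal(0.0, 0.01, spot.shape)          # add read noise

frame = np.clip(frame - frame.mean(), 0, None)            # crude background removal
total = frame.sum()
row_c = (frame * y).sum() / total
col_c = (frame * x).sum() / total
print(f"estimated centroid: ({row_c:.2f}, {col_c:.2f}), true: {true_pos}")
```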
Holm, Thomas; Gallo, Kevin P.; Bailey, Bryan
2010-01-01
The Committee on Earth Observation Satellites is an international group that coordinates civil space-borne observations of the Earth, and provides the space component of the Global Earth Observing System of Systems (GEOSS). The CEOS Virtual Constellations concept was implemented in an effort to engage and coordinate disparate Earth observing programs of CEOS member agencies and ultimately facilitate their contribution in supplying the space-based observations required to satisfy the requirements of the GEOSS. The CEOS initially established Study Teams for four prototype constellations that included precipitation, land surface imaging, ocean surface topography, and atmospheric composition. The basic mission of the Land Surface Imaging (LSI) Constellation [1] is to promote the efficient, effective, and comprehensive collection, distribution, and application of space-acquired image data of the global land surface, especially to meet societal needs of the global population, such as those addressed by the nine Group on Earth Observations (GEO) Societal Benefit Areas (SBAs) of agriculture, biodiversity, climate, disasters, ecosystems, energy, health, water, and weather. The LSI Constellation Portal is the result of an effort to address important goals within the LSI Constellation mission and provide resources to assist in planning for future space missions that might further contribute to meeting those goals.
2004-02-04
KENNEDY SPACE CENTER, FLA. - Armando Oliu, Final Inspection Team lead for the Shuttle program, speaks to reporters about the aid the Image Analysis Lab is giving the FBI in a kidnapping case. Behind him at right is Mike Rein, External Affairs division chief. Oliu oversees the image lab that is using an advanced SGI® TP9500 data management system to review the tape of the kidnapping in progress in Sarasota, Fla. KSC installed the new $3.2 million system in preparation for Return to Flight of the Space Shuttle fleet. The lab is studying the Sarasota kidnapping video to provide any new information possible to law enforcement officers. KSC is joining NASA’s Marshall Space Flight Center in Alabama in reviewing the tape.
Phase space methods in HMD systems
NASA Astrophysics Data System (ADS)
Babington, James
2017-06-01
We consider using phase space techniques and methods in analysing optical ray propagation in head mounted display systems. Two examples are considered that illustrate the concepts and methods. Firstly, a shark tooth freeform geometry, and secondly, a waveguide geometry that replicates a pupil in one dimension. Classical optics and imaging in particular provide a natural stage to employ phase space techniques, albeit as a constrained system. We consider how phase space provides a global picture of the physical ray trace data. As such, this gives a complete optical world history of all of the rays propagating through the system. Using this data one can look at, for example, how aberrations arise on a surface by surface basis. These can be extracted numerically from phase space diagrams in the example of a freeform imaging prism. For the waveguide geometry, phase space diagrams provide a way of illustrating how replicated pupils behave and what these imply for design considerations such as tolerances.
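As a minimal illustration of a ray-based phase-space picture, the sketch below propagates a fan of paraxial rays through free space and a thin lens and reports their (height, angle) coordinates; the focal length and distances are arbitrary values, not the HMD geometries analysed in the paper.

```python
# Sketch: paraxial phase-space (height y, angle u) of a ray fan traced through
# free space and a thin lens, the kind of (y, u) diagram discussed above.
# Focal length and propagation distances are arbitrary illustrative values.
import numpy as np

def propagate(rays, d):          # free-space transfer over distance d
    y, u = rays
    return np.vstack([y + d * u, u])

def thin_lens(rays, f):          # thin-lens refraction, focal length f
    y, u = rays
    return np.vstack([y, u - y / f])

# ray fan: a grid of launch heights and angles (phase-space cells)
y0 = np.repeat(np.linspace(-1.0, 1.0, 5), 5)
u0 = np.tile(np.linspace(-0.05, 0.05, 5), 5)
rays = np.vstack([y0, u0])

rays = propagate(rays, 20.0)     # to the lens
rays = thin_lens(rays, 15.0)
rays = propagate(rays, 30.0)     # towards the image region

# each column is one ray's (y, u) phase-space coordinate after the system
print(np.round(rays[:, :5], 4))
```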
2014-08-04
The ARAPAIMA (Application for Resident Space Object Proximity Analysis and IMAging) mission is carried out by a 6U CubeSat-class satellite equipped with a warm gas propulsion system and an attitude determination and control subsystem (ADCS) for a proximity operation and imaging satellite mission.
NIAC Phase II Orbiting Rainbows: Future Space Imaging with Granular Systems
NASA Technical Reports Server (NTRS)
Quadrelli, Marco B.; Basinger, Scott; Arumugam, Darmindra; Swartzlander, Grover
2017-01-01
Inspired by the light scattering and focusing properties of distributed optical assemblies in Nature, such as rainbows and aerosols, and by recent laboratory successes in optical trapping and manipulation, we propose a unique combination of space optics and autonomous robotic system technology to enable a new vision of space system architecture, with applications to ultra-lightweight space optics and, ultimately, in-situ space system fabrication. Typically, the cost of an optical system is driven by the size and mass of the primary aperture. The ideal system is a cloud of spatially disordered dust-like objects that can be optically manipulated: it is highly reconfigurable, fault-tolerant, and allows very large aperture sizes at low cost. This new concept is based on recent understanding of the physics of optical manipulation of small particles in the laboratory and on the engineering of distributed ensembles of spacecraft swarms to shape an orbiting cloud of micron-sized objects. In the same way that optical tweezers have revolutionized micro- and nano-manipulation of objects, our breakthrough concept will enable new large-scale NASA mission applications and develop new technology in the areas of astrophysical imaging systems and remote sensing, because the cloud can operate as an adaptive optical imaging sensor. While establishing the feasibility of constructing a single aperture out of the cloud is the main topic of this work, it is clear that multiple orbiting aerosol lenses could also combine their power to synthesize a much larger aperture in space and enable challenging goals such as exo-planet detection. Furthermore, this effort could establish the feasibility of key issues related to the material properties, remote manipulation, and autonomy characteristics of the cloud in orbit. Several types of endeavors (science missions) could be enabled by this approach: new astrophysical imaging systems; exo-planet searches; large apertures that allow unprecedented high resolution to discern continents and important features of other planets; hyperspectral imaging; adaptive systems; spectroscopic imaging through the limb; and stable optical systems at Lagrange points. Furthermore, future micro-miniaturization might hold promise of extending our dust-aperture concept to other, more exciting smart-dust concepts with additional capabilities. Our objective in Phase II was to experimentally and numerically investigate how to optically manipulate and maintain the shape of an orbiting cloud of dust-like matter so that it can function as an adaptable, ultra-lightweight surface. Our solution is based on the aperture being an engineered granular medium instead of a conventional monolithic aperture. This allows apertures to be built at reduced cost, enables extremely fault-tolerant apertures that cannot otherwise be made, and directly enables classes of missions for exoplanet detection based on Fourier spectroscopy with tight angular resolution, as well as innovative radar systems for remote sensing. In this task, we have examined the advanced feasibility of a crosscutting concept that contributes new technological approaches for space imaging systems, autonomous systems, and space applications of optical manipulation. The proposed investigation has matured the concept that we started in Phase I to TRL 3, identifying technology gaps and candidate system architectures for the space-borne cloud as an aperture.
Utilizing the Southwest Ultraviolet Imaging System (SwUIS) on the International Space Station
NASA Astrophysics Data System (ADS)
Schindhelm, Eric; Stern, S. Alan; Ennico-Smith, Kimberly
2013-09-01
We present the Southwest Ultraviolet Imaging System (SwUIS), a compact, low-cost instrument designed for remote sensing observations from a manned platform in space. It has two chief configurations: a high spatial resolution mode with a 7-inch Maksutov-Cassegrain telescope, and a large field-of-view camera mode using a lens assembly. It can operate with either an intensified CCD or an electron-multiplying CCD camera. Interchangeable filters and lenses enable broadband and narrowband imaging at UV/visible/near-infrared wavelengths over a range of spatial resolutions. SwUIS has flown previously on Space Shuttle flights STS-85 and STS-93, where it recorded multiple UV images of planets, comets, and vulcanoids. We describe the instrument and its capabilities in detail. SwUIS's broad wavelength coverage and versatile range of hardware configurations make it an attractive option for use as a facility instrument for Earth science and astronomical imaging investigations aboard the International Space Station.
Efficient characterization of phase space mapping in axially symmetric optical systems
NASA Astrophysics Data System (ADS)
Barbero, Sergio; Portilla, Javier
2018-01-01
Phase space mapping, typically between an object and image plane, characterizes an optical system within a geometrical optics framework. We propose a novel conceptual frame to characterize the phase mapping in axially symmetric optical systems for arbitrary object locations, not restricted to a specific object plane. The idea is based on decomposing the phase mapping into a set of bivariate equations corresponding to different values of the radial coordinate on a specific object surface (most likely the entrance pupil). These equations are then approximated through bivariate Chebyshev interpolation at Chebyshev nodes, which guarantees uniform convergence. Additionally, we propose the use of a new concept (effective object phase space), defined as the set of points of the phase space at the first optical element (typically the entrance pupil) that are effectively mapped onto the image surface. The effective object phase space provides, by means of an inclusion test, a way to avoid tracing rays that do not reach the image surface.
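A minimal sketch of the fitting step, assuming a stand-in mapping function in place of a ray-traced phase mapping: bivariate Chebyshev interpolation at Chebyshev nodes using numpy's polynomial module.

```python
# Sketch: fitting a bivariate function on Chebyshev nodes with a 2D Chebyshev
# expansion, the kind of approximation described above for phase-space mapping.
# The test function below is only a stand-in for a ray-traced mapping.
import numpy as np
from numpy.polynomial import chebyshev as C

def mapping(x, y):               # placeholder for the true phase mapping
    return np.sin(2.0 * x) * np.cos(y) + 0.1 * x * y

deg = (8, 8)                     # Chebyshev degrees in each variable
xk = C.chebpts1(deg[0] + 1)      # Chebyshev nodes of the first kind in [-1, 1]
yk = C.chebpts1(deg[1] + 1)
X, Y = np.meshgrid(xk, yk, indexing="ij")

V = C.chebvander2d(X.ravel(), Y.ravel(), deg)        # design matrix
coef, *_ = np.linalg.lstsq(V, mapping(X, Y).ravel(), rcond=None)
coef = coef.reshape(deg[0] + 1, deg[1] + 1)

# evaluate the interpolant off the nodes and check the error
xt, yt = 0.3, -0.7
approx = float(C.chebval2d(xt, yt, coef))
print(f"approx = {approx:.6f}, exact = {mapping(xt, yt):.6f}")
```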
Iterative methods for dose reduction and image enhancement in tomography
Miao, Jianwei; Fahimian, Benjamin Pooya
2012-09-18
A system and method for creating a three-dimensional cross-sectional image of an object by the reconstruction of its projections that have been iteratively refined through modification in object space and Fourier space is disclosed. The invention provides systems and methods for use with any tomographic imaging system that reconstructs an object from its projections. In one embodiment, the invention presents a method to eliminate interpolations present in conventional tomography. The method has been experimentally shown to provide higher resolution and improved image quality parameters over existing approaches. A primary benefit of the method is radiation dose reduction, since the invention can produce an image of a desired quality with fewer projections than conventional methods require.
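A generic sketch of this style of reconstruction, alternating a Fourier-space data constraint with an object-space positivity constraint; it is not the patented algorithm itself, and the object, sampling mask, and iteration count are illustrative.

```python
# Sketch: a generic iterative loop that alternates between enforcing measured
# Fourier-space data and a simple object-space constraint (positivity), in the
# spirit of the reconstruction described above. This is not the patented
# algorithm itself; the sampling pattern and constraints are illustrative.
import numpy as np

rng = np.random.default_rng(1)
obj = np.zeros((64, 64))
obj[20:44, 24:40] = 1.0                          # simple test object

F_true = np.fft.fft2(obj)
mask = rng.random(obj.shape) < 0.35              # "measured" Fourier samples
measured = F_true * mask

recon = np.zeros_like(obj)
for _ in range(200):
    F = np.fft.fft2(recon)
    F[mask] = measured[mask]                     # Fourier-space data constraint
    recon = np.real(np.fft.ifft2(F))
    recon[recon < 0] = 0.0                       # object-space positivity

err = np.linalg.norm(recon - obj) / np.linalg.norm(obj)
print(f"relative reconstruction error: {err:.3f}")
```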
NASA Technical Reports Server (NTRS)
Nein, M. E.; Davis, B. G.
1982-01-01
The Coherent Optical System of Modular Imaging Collectors (COSMIC) is the design concept for a phase-coherent optical telescope array that may be placed in earth orbit by the Space Shuttle in the 1990s. The initial system module is a minimum redundancy array whose photon collecting area is three times larger than that of the Space Telescope, and which possesses a one-dimensional resolution of better than 0.01 arcsec in the visible range. Thermal structural requirements are assessed. Although the coherent beam combination requirements will be met by an active control system, the COSMIC structural/thermal design must meet more stringent performance criteria than even those of the Space Telescope.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Y; Mutic, S; Du, D
Purpose: To evaluate the feasibility of using the weighted hybrid iterative spiral k-space encoded estimation (WHISKEE) technique to improve the spatial resolution of tracking images for onboard MR image guided radiation therapy (MR-IGRT). Methods: MR tracking images of the abdomen and pelvis had been acquired from healthy volunteers using the ViewRay onboard MR-IGRT system (ViewRay Inc., Oakwood Village, OH) at a spatial resolution of 2.0 mm * 2.0 mm * 5.0 mm. The tracking MR images were acquired using the TrueFISP sequence. The temporal resolution had to be traded off to 2 frames per second (FPS) to achieve the 2.0 mm in-plane spatial resolution. All MR images were imported into MATLAB. K-space data were synthesized through the Fourier transform of the MR images. A mask was created to select k-space points that corresponded to the undersampled spiral k-space trajectory with an acceleration (or undersampling) factor of 3. The mask was applied to the fully sampled k-space data to synthesize the undersampled k-space data. The WHISKEE method was applied to the synthesized undersampled k-space data to reconstruct tracking MR images at 6 FPS. As a comparison, the undersampled k-space data were also reconstructed using the zero-padding technique. The reconstructed images were compared to the original image. The relative reconstruction error was evaluated as the percentage of the norm of the difference image over the norm of the original image. Results: Compared to the zero-padding technique, the WHISKEE method was able to reconstruct MR images with better image quality. It significantly reduced the relative reconstruction error from 39.5% to 3.1% for the pelvis image and from 41.5% to 4.6% for the abdomen image at an acceleration factor of 3. Conclusion: We demonstrated that it is possible to use the WHISKEE method to expedite MR image acquisition for onboard MR-IGRT systems and achieve good spatial and temporal resolution simultaneously. Y. Hu and O. Green receive travel reimbursement from ViewRay. S. Mutic has consulting and research agreements with ViewRay. Q. Zeng, R. Nana, J.L. Patrick, S. Shvartsman and J.F. Dempsey are ViewRay employees.
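The evaluation pipeline described above (synthesized k-space, undersampling mask, reconstruction, relative error as a percentage of the original image norm) can be sketched as follows for the zero-padding baseline only; WHISKEE itself is not reproduced, and the phantom and mask are stand-ins.

```python
# Sketch of the evaluation pipeline described above, using zero-filled
# reconstruction only (WHISKEE itself is not reproduced here): synthesize
# k-space from an image, apply an undersampling mask, reconstruct, and report
# the relative error as a percentage of the original image norm.
# The phantom and the random mask are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(2)
img = np.zeros((128, 128))
img[32:96, 40:88] = 1.0
img += rng.normal(0, 0.02, img.shape)            # stand-in "acquired" image

kspace = np.fft.fftshift(np.fft.fft2(img))       # synthesized k-space

# keep roughly 1/3 of the samples, always retaining the k-space centre
mask = rng.random(img.shape) < (1.0 / 3.0)
c = img.shape[0] // 2
mask[c - 8:c + 8, c - 8:c + 8] = True

zero_filled = np.real(np.fft.ifft2(np.fft.ifftshift(kspace * mask)))

rel_err = 100.0 * np.linalg.norm(zero_filled - img) / np.linalg.norm(img)
print(f"zero-filled relative reconstruction error: {rel_err:.1f}%")
```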
Fully Three-Dimensional Virtual-Reality System
NASA Technical Reports Server (NTRS)
Beckman, Brian C.
1994-01-01
Proposed virtual-reality system presents visual displays to simulate free flight in three-dimensional space. System, virtual space pod, is testbed for control and navigation schemes. Unlike most virtual-reality systems, virtual space pod would not depend for orientation on ground plane, which hinders free flight in three dimensions. Space pod provides comfortable seating, convenient controls, and dynamic virtual-space images for virtual traveler. Controls include buttons plus joysticks with six degrees of freedom.
Mobile Aerial Tracking and Imaging System (MATrIS) for Aeronautical Research
NASA Technical Reports Server (NTRS)
Banks, Daniel W.; Blanchard, Robert C.; Miller, Geoffrey M.
2004-01-01
A mobile, rapidly deployable ground-based system to track and image targets of aeronautical interest has been developed. Targets include reentering reusable launch vehicles as well as atmospheric and transatmospheric vehicles. The optics were designed to image targets in the visible and infrared wavelengths. To minimize acquisition cost and development time, the system uses commercially available hardware and software where possible. The conception and initial funding of this system originated with a study of ground-based imaging of global aerothermal characteristics of reusable launch vehicle configurations. During that study the National Aeronautics and Space Administration teamed with the Missile Defense Agency/Innovative Science and Technology Experimentation Facility to test techniques and analysis on two Space Shuttle flights.
Distance preservation in color image transforms
NASA Astrophysics Data System (ADS)
Santini, Simone
1999-12-01
Most current image processing systems work on color images, and color is a precious perceptual clue for determining image similarity. Working with color images, however, is not the same thing as working with images taking values in a 3D Euclidean space. Not only are color spaces bounded, but the characteristics of the observer endow the space with a 'perceptual' metric that in general does not correspond to the metric naturally inherited from R3. This paper studies the problem of filtering color images abstractly. It begins by determining the properties of the color sum and color product operations such that the desirable properties of orthonormal bases will be preserved. The paper then defines a general scheme, based on the action of the additive group on the color space, by which operations that satisfy the required properties can be defined.
Closeup oblique view of the aft fuselage of the Orbiter ...
Close-up oblique view of the aft fuselage of the Orbiter Discovery looking forward and port as the last Space Shuttle Main Engine is being removed; it can be seen on the left side of the image frame. Note that one of the Orbiter Maneuvering System/Reaction Control System pods has been removed while the other remains. Also note that the body flap, below the engine positions, has a protective covering to prevent damage to the High-temperature Reusable Surface Insulation tiles. This image was taken inside the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Closeup oblique view of the aft fuselage of the Orbiter ...
Close-up oblique view of the aft fuselage of the Orbiter Discovery looking forward and starboard as the last Space Shuttle Main Engine is being removed; it can be seen on the right side of the image frame. Note that one of the Orbiter Maneuvering System/Reaction Control System pods has been removed while the other remains. Also note that the body flap, below the engine positions, has a protective covering to prevent damage to the High-temperature Reusable Surface Insulation tiles. This image was taken inside the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Virtual performer: single camera 3D measuring system for interaction in virtual space
NASA Astrophysics Data System (ADS)
Sakamoto, Kunio; Taneji, Shoto
2006-10-01
The authors developed interaction media systems in 3D virtual space. In these systems, a musician virtually plays an instrument such as the theremin in the virtual space, or a performer puts on a show using a virtual character such as a puppet. This interactive virtual media system consists of image capture, measurement of the performer's position, detection and recognition of motions, and video image synthesis using a personal computer. In this paper, we propose some applications of interaction media systems: a virtual musical instrument and a superimposed CG character. Moreover, this paper describes the method of measuring the positions of the performer, his/her head, and both eyes using a single camera.
The Panchromatic STARBurst IRregular Dwarf Survey (STARBIRDS): Observations and Data Archive
NASA Astrophysics Data System (ADS)
McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D.
2015-06-01
Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux calibrated, background subtracted images that are registered to the same world coordinate system. Additionally, a set of images is available in which all images are cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging. Based on observations made with the NASA/ESA Hubble Space Telescope, and obtained from the Hubble Legacy Archive, which is a collaboration between the Space Telescope Science Institute (STScI/NASA), the Space Telescope European Coordinating Facility (ST-ECF/ESA), and the Canadian Astronomy Data Centre (CADC/NRC/CSA).
NASA Astrophysics Data System (ADS)
Zhang, Hua; Zeng, Luan
2017-11-01
Binocular stereoscopic vision can be used for space-based close-range observation of space targets. In order to solve the problem that a traditional binocular vision system cannot work normally after being disturbed, an online calibration method for a binocular stereo measuring camera with a self-reference is proposed. The method uses an auxiliary optical imaging device to insert the image of a standard reference object into the edge of the main optical path so that it is imaged with the target on the same focal plane, which is equivalent to placing a standard reference inside the binocular imaging optical system. When the position of the system or the imaging device parameters are disturbed, the image of the standard reference changes accordingly in the imaging plane, while the position of the standard reference object itself does not change; the camera's external parameters can then be re-calibrated from the visual relationship of the standard reference object. The experimental results show that the maximum mean square error for the same object can be reduced from 72.88 mm to 1.65 mm when the right camera is deflected by 0.4 degrees and the left camera is rotated by 0.2 degrees in elevation. This method realizes online calibration of a binocular stereoscopic vision measurement system and can effectively improve the anti-jamming ability of the system.
Benioff, Paul
2009-01-01
This work is based on the field of reference frames based on quantum representations of real and complex numbers described in other work. Here frame domains are expanded to include space and time lattices. Strings of qukits are described as hybrid systems as they are both mathematical and physical systems. As mathematical systems they represent numbers. As physical systems in each frame the strings have a discrete Schrodinger dynamics on the lattices. The frame field has an iterative structure such that the contents of a stage j frame have images in a stage j - 1 (parent) frame. A discussion of parent frame images includes the proposal that points of stage j frame lattices have images as hybrid systems in parent frames. The resulting association of energy with images of lattice point locations, as hybrid systems states, is discussed. Representations and images of other physical systems in the different frames are also described.
Accuracy of lineaments mapping from space
NASA Technical Reports Server (NTRS)
Short, Nicholas M.
1989-01-01
The use of Landsat and other space imaging systems for lineament detection is analyzed in terms of their effectiveness in recognizing and mapping fractures and faults, and the results of several studies providing a quantitative assessment of lineament mapping accuracy are discussed. The cases under investigation include a Landsat image of the surface overlying a part of the Anadarko Basin of Oklahoma, Landsat images and selected radar imagery of major lineament systems distributed over much of the Canadian Shield, and space imagery covering a part of the East African Rift in Kenya. It is demonstrated that space imagery can detect a significant portion of a region's fracture pattern; however, significant fractions of the faults and fractures recorded on a field-produced geological map are missing from the imagery, as is evident in the Kenya case.
Interactive data-processing system for metallurgy
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1978-01-01
Equipment indicates that the system can rapidly and accurately process metallurgical and materials-processing data for a wide range of applications. Advantages include an increase in contrast between areas on an image, the ability to analyze images via operator-written programs, and space available for storing images.
Development of a Sunspot Tracking System
NASA Technical Reports Server (NTRS)
Taylor, Jaime R.
1998-01-01
Large solar flares produce a significant amount of energetic particles which pose a hazard for human activity in space. In the hope of understanding flare mechanisms and thus better predicting solar flares, NASA's Marshall Space Flight Center (MSFC) developed an experimental vector magnetograph (EXVM) polarimeter to measure the Sun's magnetic field. The EXVM will be used to perform ground-based solar observations and will provide a proof of concept for the design of a similar instrument for the Japanese Solar-B space mission. The EXVM typically operates for a period of several minutes. During this time there is image motion due to atmospheric fluctuation and telescope wind loading. To optimize the EXVM performance, an image motion compensation device (sunspot tracker) is needed. The sunspot tracker consists of two parts, an image motion determination system and an image deflection system. For image motion determination, a CCD or CID camera is used to digitize an image, then an algorithm is applied to determine the motion. This motion or error signal is sent to the image deflection system, which moves the image back to its original location. Both of these systems are under development. Two algorithms are available for sunspot tracking which require the use of only one row and one column of image data. To implement these algorithms, two identical independent systems are being developed, one for each axis of motion. Two CID cameras have been purchased; the data from each camera will be used to determine image motion for each direction. The error signal generated by the tracking algorithm will be sent to an image deflection system consisting of an actuator and a mirror constrained to move about one axis. Magnetostrictive actuators were chosen over piezoelectric actuators to move the mirror because of their larger driving force and larger range of motion. The actuator and mirror mounts are currently under development.
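One simple way to estimate image motion along a single axis from one row of pixel data is cross-correlation against a reference row, as sketched below; this is only an illustration of the single-row/single-column idea, not the specific MSFC tracking algorithm.

```python
# Sketch: estimating image motion along one axis from a single row of pixel
# data by cross-correlating it against a reference row. This illustrates the
# single-row/single-column idea described above; it is not the specific
# MSFC tracking algorithm.
import numpy as np

def shift_from_row(reference, current):
    ref = reference - reference.mean()
    cur = current - current.mean()
    corr = np.correlate(cur, ref, mode="full")
    return np.argmax(corr) - (len(ref) - 1)      # integer-pixel shift estimate

x = np.arange(512)
reference_row = np.exp(-0.5 * ((x - 200) / 15.0) ** 2)   # sunspot-like profile
shifted_row = np.roll(reference_row, 7)                  # simulated image motion

print("estimated shift (pixels):", shift_from_row(reference_row, shifted_row))
```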
NASA Technical Reports Server (NTRS)
Lum, Henry, Jr.
1988-01-01
Information on systems autonomy is given in viewgraph form. Information is given on space systems integration, intelligent autonomous systems, automated systems for in-flight mission operations, the Systems Autonomy Demonstration Project on the Space Station Thermal Control System, the architecture of an autonomous intelligent system, artificial intelligence research issues, machine learning, and real-time image processing.
An algorithm for pavement crack detection based on multiscale space
NASA Astrophysics Data System (ADS)
Liu, Xiang-long; Li, Qing-quan
2006-10-01
Conventional human-visual and manual field pavement crack detection methods are very costly, time-consuming, dangerous, labor-intensive, and subjective. They have various drawbacks, such as a high degree of variability in the measurement results, an inability to provide meaningful quantitative information, an almost inevitable tendency toward inconsistencies in crack details over space and across evaluations, and long measurement cycles. With the development of public transportation and the growth of material flow systems, conventional methods can no longer meet the demand; automatic pavement-state data gathering and analysis systems have therefore become the focus of the industry's attention. Developments in computer technology, digital image acquisition, image processing, and multi-sensor technology have made such systems possible, but the complexity of the image processing has always made data processing and analysis the bottleneck of the whole system. Accordingly, a robust and highly efficient parallel pavement crack detection algorithm based on multiscale space is proposed in this paper. The proposed method is based on the facts that: (1) the crack pixels in pavement images are darker than their surroundings and continuous; (2) the threshold values of gray-level pavement images are strongly related to the mean value and standard deviation of the pixel gray intensities. The multiscale-space method is used to improve the data processing speed and minimize the effects of image noise. Experimental results demonstrate that the advantages are remarkable: (1) it can correctly discover tiny cracks, even in very noisy pavement images; (2) the efficiency and accuracy of the proposed algorithm are superior; (3) its application-dependent nature can simplify the design of the entire system.
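A minimal sketch of the darker-than-background thresholding rule tied to the image mean and standard deviation, applied at several smoothing scales; the synthetic pavement image and the factor k are illustrative assumptions.

```python
# Sketch: thresholding a pavement image at multiple scales using a threshold
# tied to the mean and standard deviation of pixel intensities (cracks are
# darker than their surroundings), in the spirit of the approach above.
# The synthetic image and the factor k are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
img = 0.7 + rng.normal(0, 0.05, (200, 200))      # bright, noisy pavement texture
img[100, 20:180] -= 0.25                          # a thin, dark horizontal crack

k = 2.0
crack_mask = np.zeros(img.shape, dtype=bool)
for sigma in (0.0, 1.0, 2.0):                     # multiple scales
    smoothed = gaussian_filter(img, sigma) if sigma > 0 else img
    thresh = smoothed.mean() - k * smoothed.std() # darker-than-background rule
    crack_mask |= smoothed < thresh

print("crack pixels detected:", int(crack_mask.sum()))
```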
NASA Technical Reports Server (NTRS)
Youngquist, Robert C. (Inventor); Moerk, Steven (Inventor)
1999-01-01
An imaging system is described which can be used either to passively search for sources of ultrasonics or as an active phase imaging system, which can image fires, gas leaks, or air temperature gradients. This system uses an array of ultrasonic receivers coupled to an ultrasound collector or lens to provide an electronic image of the ultrasound intensity in a selected angular region of space. A system is described which includes a video camera to provide a visual reference to the region being examined for ultrasonic signals.
Image pattern recognition supporting interactive analysis and graphical visualization
NASA Technical Reports Server (NTRS)
Coggins, James M.
1992-01-01
Image Pattern Recognition attempts to infer properties of the world from image data. Such capabilities are crucial for making measurements from satellite or telescope images related to Earth and space science problems. Such measurements can be the required product itself, or the measurements can be used as input to a computer graphics system for visualization purposes. At present, the field of image pattern recognition lacks a unified scientific structure for developing and evaluating image pattern recognition applications. The overall goal of this project is to begin developing such a structure. This report summarizes results of a 3-year research effort in image pattern recognition addressing the following three principal aims: (1) to create a software foundation for the research and identify image pattern recognition problems in Earth and space science; (2) to develop image measurement operations based on Artificial Visual Systems; and (3) to develop multiscale image descriptions for use in interactive image analysis.
Identification of geostationary satellites using polarization data from unresolved images
NASA Astrophysics Data System (ADS)
Speicher, Andy
In order to protect critical military and commercial space assets, the United States Space Surveillance Network must have the ability to positively identify and characterize all space objects. Unfortunately, positive identification and characterization of space objects is a manual and labor intensive process today since even large telescopes cannot provide resolved images of most space objects. Since resolved images of geosynchronous satellites are not technically feasible with current technology, another method of distinguishing space objects was explored that exploits the polarization signature from unresolved images. The objective of this study was to collect and analyze visible-spectrum polarization data from unresolved images of geosynchronous satellites taken over various solar phase angles. Different collection geometries were used to evaluate the polarization contribution of solar arrays, thermal control materials, antennas, and the satellite bus as the solar phase angle changed. Since materials on space objects age due to the space environment, it was postulated that their polarization signature may change enough to allow discrimination of identical satellites launched at different times. The instrumentation used in this experiment was a United States Air Force Academy (USAFA) Department of Physics system that consists of a 20-inch Ritchey-Chretien telescope and a dual focal plane optical train fed with a polarizing beam splitter. A rigorous calibration of the system was performed that included corrections for pixel bias, dark current, and response. Additionally, the two channel polarimeter was calibrated by experimentally determining the Mueller matrix for the system and relating image intensity at the two cameras to Stokes parameters S0 and S1. After the system calibration, polarization data was collected during three nights on eight geosynchronous satellites built by various manufacturers and launched several years apart. Three pairs of the eight satellites were identical buses to determine if identical buses could be correctly differentiated. When Stokes parameters were plotted against time and solar phase angle, the data indicates that there were distinguishing features in S0 (total intensity) and S1 (linear polarization) that may lead to positive identification or classification of each satellite.
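For an ideal two-channel polarimeter behind a polarizing beam splitter, S0 and S1 follow from a calibrated linear system relating channel intensities to Stokes parameters, as sketched below; the 2x2 matrix is an idealized assumption rather than the instrument's measured Mueller-matrix calibration.

```python
# Sketch: recovering Stokes S0 and S1 from the two channels of a polarizing
# beam-splitter polarimeter via a calibrated linear system, as described above.
# The 2x2 system matrix below is an idealized assumption, not the measured
# Mueller-matrix calibration of the USAFA instrument.
import numpy as np

# Ideal polarizing beam splitter: I_parallel = (S0 + S1)/2, I_perp = (S0 - S1)/2
A = 0.5 * np.array([[1.0,  1.0],
                    [1.0, -1.0]])

def stokes_from_channels(i_par, i_perp):
    """Solve A @ [S0, S1] = [i_par, i_perp] for each measurement."""
    meas = np.vstack([i_par, i_perp])
    return np.linalg.solve(A, meas)              # rows: S0, S1

# example: three measurements along a pass over solar phase angle
i_par = np.array([1.10, 1.32, 1.05])
i_perp = np.array([0.95, 1.08, 1.01])
S0, S1 = stokes_from_channels(i_par, i_perp)
print("S0:", np.round(S0, 3), " S1:", np.round(S1, 3), " S1/S0:", np.round(S1 / S0, 3))
```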
Overcoming Dynamic Disturbances in Imaging Systems
NASA Technical Reports Server (NTRS)
Young, Eric W.; Dente, Gregory C.; Lyon, Richard G.; Chesters, Dennis; Gong, Qian
2000-01-01
We develop and discuss a methodology with the potential to yield a significant reduction in complexity, cost, and risk of space-borne optical systems in the presence of dynamic disturbances. More robust systems almost certainly will be a result as well. Many future space-based and ground-based optical systems will employ optical control systems to enhance imaging performance. The goal of the optical control subsystem is to determine the wavefront aberrations and remove them. Ideally reducing an aberrated image of the object under investigation to a sufficiently clear (usually diffraction-limited) image. Control will likely be distributed over several elements. These elements may include telescope primary segments, telescope secondary, telescope tertiary, deformable mirror(s), fine steering mirror(s), etc. The last two elements, in particular, may have to provide dynamic control. These control subsystems may become elaborate indeed. But robust system performance will require evaluation of the image quality over a substantial range and in a dynamic environment. Candidate systems for improvement in the Earth Sciences Enterprise could include next generation Landsat systems or atmospheric sensors for dynamic imaging of individual, severe storms. The technology developed here could have a substantial impact on the development of new systems in the Space Science Enterprise; such as the Next Generation Space Telescope(NGST) and its follow-on the Next NGST. Large Interferometric Systems of non-zero field, such as Planet Finder and Submillimeter Probe of the Evolution of Cosmic Structure, could benefit. These systems most likely will contain large, flexible optomechanical structures subject to dynamic disturbance. Furthermore, large systems for high resolution imaging of planets or the sun from space may also benefit. Tactical and Strategic Defense systems will need to image very small targets as well and could benefit from the technology developed here. We discuss a novel speckle imaging technique with the potential to separate dynamic aberrations from static aberrations. Post-processing of a set of image data, using an algorithm based on this technique, should work for all but the lowest light levels and highest frequency dynamic environments. This technique may serve to reduce the complexity of the control system and provide for robust, fault-tolerant, reduced risk operation. For a given object, a short exposure image is "frozen" on the focal plane in the presence of the environmental disturbance (turbulence, jitter, etc.). A key factor is that this imaging data exhibits frame-to-frame linear shift invariance. Therefore, although the Point Spread Function is varying from frame to frame, the source is fixed; and each short exposure contains object spectrum data out to the diffraction limit of the imaging system. This novel speckle imaging technique uses the Knox-Thompson method. The magnitude of the complex object spectrum is straightforward to determine by well-established approaches. The phase of the complex object spectrum is decomposed into two parts. One is a single-valued function determined by the divergence of the optical phase gradient. The other is a multi-valued function determined by the circulation of the optical phase gradient-"hidden phase." Finite difference equations are developed for the phase. The novelty of this approach is captured in the inclusion of this "hidden phase." 
This technique allows the diffraction-limited reconstruction of the object from the ensemble of short exposure frames while simultaneously estimating the phase as a function of time from a set of exposures.
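A toy sketch of the frame-ensemble averaging behind this approach: accumulating a Labeyrie power spectrum and a Knox-Thompson-style cross spectrum from randomly jittered short exposures of a fixed object; the frame model and parameters are illustrative stand-ins, not real turbulence data.

```python
# Sketch: accumulating the Labeyrie power spectrum and a Knox-Thompson-style
# cross spectrum from an ensemble of short-exposure frames. The frame model
# (random sub-pixel jitter of a fixed object) is a toy stand-in for real
# turbulence/jitter data; offsets and frame counts are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n, frames = 64, 200
y, x = np.mgrid[:n, :n]

power = np.zeros((n, n))
cross = np.zeros((n, n), dtype=complex)          # cross spectrum for a 1-pixel offset
for _ in range(frames):
    dy, dx = rng.normal(0, 1.5, 2)               # frame-to-frame image motion
    frame = np.exp(-0.5 * (((y - 32 - dy) / 3.0) ** 2 + ((x - 30 - dx) / 5.0) ** 2))
    F = np.fft.fft2(frame)
    power += np.abs(F) ** 2                      # Labeyrie: object modulus information
    cross += F * np.conj(np.roll(F, -1, axis=1)) # KT: phase-difference information

power /= frames
cross /= frames
# the phase of `cross` approximates the object-spectrum phase difference
# between neighbouring spatial frequencies along one axis
print("mean KT phase difference near the origin:",
      np.round(np.angle(cross[0, :4]), 3))
```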
General view of the flight deck of the Orbiter Discovery ...
General view of the flight deck of the Orbiter Discovery looking forward along the approximate center line of the orbiter at the center console. The Multifunction Electronic Display System (MEDS) is evident in the mid-ground center of this image; this system was a major upgrade from the previous analog display system. The commander's station is on the port side, or left in this view, and the pilot's station is on the starboard side, or right in this view. Note the grab bar in the upper center of the image, which was primarily used for commander and pilot ingress with the orbiter in a vertical position on the launch pad. Also note that the forward observation windows have protective covers over them. This image was taken at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Guo, Xiaohu; Dong, Liquan; Zhao, Yuejin; Jia, Wei; Kong, Lingqin; Wu, Yijian; Li, Bing
2015-04-01
Wavefront coding (WFC) technology is adopted in a space optical system to resolve the problem of defocus caused by temperature differences or by vibration from satellite motion. According to the theory of WFC, we calculate and optimize the phase mask parameter of a cubic phase mask plate, which is used in an on-axis three-mirror Cassegrain (TMC) telescope system. The simulation analysis and the experimental results indicate that the defocused modulation transfer function curves and the corresponding blurred images have excellent consistency over a range of 10 times the depth of focus (DOF) of the original TMC system. After digital image processing with a Wiener filter, the spatial resolution of the restored images reaches 57.14 line pairs/mm. The results demonstrate that WFC technology in the TMC system has superior performance in extending the DOF and reduced sensitivity to defocus, which is of great value in resolving the problem of defocus in space optical systems.
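The Wiener restoration step can be sketched as follows, assuming a placeholder Gaussian OTF in place of the cubic-phase-mask OTF and an illustrative noise-to-signal constant K.

```python
# Sketch: Wiener-filter restoration of a blurred image given the system OTF,
# as used for decoding the wavefront-coded images above. The Gaussian OTF is a
# placeholder; a cubic-phase-mask OTF would be substituted in practice, and
# the noise-to-signal constant K is an illustrative choice.
import numpy as np

rng = np.random.default_rng(5)
n = 128
img = np.zeros((n, n))
img[40:90, 50:80] = 1.0                           # simple test scene

fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
H = np.exp(-((fx ** 2 + fy ** 2) / (2 * 0.05 ** 2)))   # placeholder OTF

blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
blurred += rng.normal(0, 0.01, img.shape)

K = 0.01                                          # noise-to-signal regularizer
W = np.conj(H) / (np.abs(H) ** 2 + K)             # Wiener filter
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))

err = np.linalg.norm(restored - img) / np.linalg.norm(img)
print(f"relative restoration error: {err:.3f}")
```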
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.
2011-10-01
In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from the conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of thermal infrared (TIR) remote sensing systems, presenting the equations and methods necessary for modeling in that regime.
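One of the basic radiometric ingredients of TIR modeling is blackbody spectral radiance; a small sketch using Planck's law is given below, with illustrative temperatures and wavelengths (this is background physics, not an excerpt of the PICASSO equations).

```python
# Sketch: blackbody spectral radiance (Planck's law), the basic radiometric
# quantity behind thermal-infrared signal modeling. Temperatures and
# wavelengths are illustrative.
import numpy as np

H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e8        # speed of light, m/s
KB = 1.380649e-23       # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance in W / (m^2 sr m)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = np.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

wavelengths = np.array([8e-6, 10e-6, 12e-6])      # LWIR band, metres
for wl in wavelengths:
    print(f"{wl * 1e6:4.1f} um  L(300 K) = {planck_radiance(wl, 300.0):.3e} W m^-2 sr^-1 m^-1")
```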
International Space Station from Space Shuttle Endeavour
NASA Technical Reports Server (NTRS)
2007-01-01
The crew of the Space Shuttle Endeavour took this spectacular image of the International Space Station during the STS118 mission, August 8-21, 2007. The image was acquired by an astronaut through one of the crew cabin windows, looking back over the length of the Shuttle. This oblique (looking at an angle from vertical, rather than straight down towards the Earth) image was acquired almost one hour after late inspection activities had begun. The sensor head of the Orbiter Boom Sensor System is visible at image top left. The entire Space Station is visible at image bottom center, set against the backdrop of the Ionian Sea approximately 330 kilometers below it. Other visible features of the southeastern Mediterranean region include the toe and heel of Italy's 'boot' at image lower left, and the western coastlines of Albania and Greece, which extend across image center. Farther towards the horizon, the Aegean and Black Seas are also visible. Featured astronaut photograph STS118-E-9469 was acquired by the STS-118 crew on August 19, 2007, with a Kodak 760C digital camera using a 28 mm lens, and is provided by the ISS Crew Earth Observations experiment and Image Science and Analysis Laboratory at Johnson Space Center.
Solar polar orbit radio telescope for space weather forecast
NASA Astrophysics Data System (ADS)
Wu, J.; Wang, C.; Wang, S.; Wu, J.; Sun, W.; Cai, J.; Yan, Y.
Radio emission from dense plasma can be detected at low radio frequencies. An image of such plasma clouds over the entire inner interplanetary space has long been a desired input for space weather forecasting and ICME propagation studies. Taking such an image from within the ecliptic plane may not fully reveal what is happening around the Sun, not only because of the blockage by the Sun, but also because most ICMEs propagate at low solar latitudes, near the ecliptic plane. It is therefore proposed to launch a solar polar orbit radio telescope to acquire images of high-density plasma clouds over the entire inner interplanetary space. Low radio frequency imaging requires a large antenna aperture in space. It is therefore proposed to use existing passive synthetic aperture radiometer technology to reduce the mass and complexity of the deployment system for the large antenna. In order to reduce the antenna mass by using a minimum number of elements, a zero-redundancy antenna element design can be used with a rotating, time-shared sampling system. A preliminary assessment study shows the mission is feasible.
Ultrasound Imaging System Video
NASA Technical Reports Server (NTRS)
2002-01-01
In this video, astronaut Peggy Whitson uses the Human Research Facility (HRF) Ultrasound Imaging System in the Destiny Laboratory of the International Space Station (ISS) to image her own heart. The Ultrasound Imaging System provides three-dimensional image enlargement of the heart and other organs, muscles, and blood vessels. It is capable of high resolution imaging in a wide range of applications, both research and diagnostic, such as echocardiography (ultrasound of the heart), abdominal, vascular, gynecological, muscle, tendon, and transcranial ultrasound.
Space Object and Light Attribute Rendering (SOLAR) Projection System
2017-05-08
A state-of-the-art planetarium-style projection system called Space Object and Light Attribute Rendering (SOLAR) was developed at the University at Buffalo for emulation of a variety of close-proximity and long-range imaging experiments. Distribution unlimited; approved for public release.
NASA Astrophysics Data System (ADS)
Li, Senhu; Sarment, David
2015-12-01
Minimally invasive neurosurgery needs intraoperative imaging updates and highly efficient image guidance systems to facilitate the procedure. An automatic image-guided system used with a compact, mobile intraoperative CT imager is introduced in this work. A tracking frame was designed that can be easily attached to a commercially available skull clamp. With the known geometry of the fiducials and tracking sensor arranged on this rigid frame, fabricated by high-precision 3D printing, an accurate, fully automatic registration method was developed in a simple and low-cost approach; the frame also helps in estimating the errors from fiducial localization in image space, through image processing, and in patient space, through calibration of the tracking frame. Our phantom study shows a fiducial registration error of 0.348 +/- 0.028 mm, compared with a manual registration error of 1.976 +/- 0.778 mm. The system in this study provides robust and accurate image-to-patient registration without interrupting the routine surgical workflow or requiring user interaction during the neurosurgery.
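The registration and fiducial registration error (FRE) quoted above can be illustrated with a generic point-based rigid registration (Kabsch/Umeyama) on synthetic fiducials, as sketched below; the coordinates, noise level, and transform are assumptions, and this is not the product software.

```python
# Sketch: point-based rigid registration between fiducial positions localized
# in image space and in patient (tracker) space, plus the fiducial registration
# error (FRE). The fiducial coordinates and noise level are synthetic; this is
# a generic Kabsch solution, not the system described above.
import numpy as np

rng = np.random.default_rng(6)
image_pts = rng.uniform(-50, 50, (6, 3))                 # fiducials in image space (mm)

# simulate the "true" patient-space transform plus localization noise
angle = np.deg2rad(25.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([100.0, -20.0, 35.0])
patient_pts = image_pts @ R_true.T + t_true + rng.normal(0, 0.2, image_pts.shape)

# Kabsch: best-fit rotation and translation mapping image space -> patient space
ci, cp = image_pts.mean(0), patient_pts.mean(0)
U, _, Vt = np.linalg.svd((image_pts - ci).T @ (patient_pts - cp))
D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
R = Vt.T @ D @ U.T
t = cp - R @ ci

residuals = patient_pts - (image_pts @ R.T + t)
fre = np.sqrt((np.linalg.norm(residuals, axis=1) ** 2).mean())
print(f"FRE = {fre:.3f} mm")
```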
Lunar Volatile System Dynamics: Observations Enabled by the Deep Space Gateway
NASA Astrophysics Data System (ADS)
Honniball, C. I.; Lucey, P. G.; Petro, N.; Hurley, D.; Farrell, W.
2018-02-01
A UV spectrometer-imager and IR spectrometer are proposed to solve questions regarding the lunar volatile system. The instrument takes advantage of highly elliptical orbits and the thermal management system of the Deep Space Gateway.
The development of a specialized processor for a space-based multispectral earth imager
NASA Astrophysics Data System (ADS)
Khedr, Mostafa E.
2008-10-01
This work was done in the Department of Computer Engineering, Lvov Polytechnic National University, Lvov, Ukraine, as a thesis entitled "Space Imager Computer System for Raw Video Data Processing" [1]. It describes the synthesis and practical implementation of a specialized computer system for raw data control and processing onboard a satellite multispectral earth imager. The computer system is intended for satellites with resolution in the range of one meter and 12-bit precision. The design is based mostly on general off-the-shelf components such as FPGAs, plus custom-designed software for interfacing with a PC and test equipment. The designed system was successfully manufactured and is now fully functional in orbit.
Image Processing for Cameras with Fiber Bundle Image Relay
Optical fiber bundles have been used to couple between a curved focal surface and planar image sensors; the fiber-coupled imaging system considered here couples a monocentric lens to six discrete CMOS focal planes. The locally space-variant system impulse response is characterized at various stages, including the monocentric lens image and vignetting effects, and the image data from the discrete sensors are stitched together into a single panorama; processed images from the prototype are then compared against those taken with another system.
Integration of stereotactic ultrasonic data into an interactive image-guided neurosurgical system
NASA Astrophysics Data System (ADS)
Shima, Daniel W.; Galloway, Robert L., Jr.
1998-06-01
Stereotactic ultrasound can be incorporated into an interactive, image-guided neurosurgical system by using an optical position sensor to define the location of an intraoperative scanner in physical space. A C program has been developed that communicates with the OptotrakTM system developed by Northern Digital Inc. to optically track the three-dimensional position and orientation of a fan-shaped area defined with respect to a hand-held probe (i.e., a virtual B-mode ultrasound fan beam). Volumes of CT and MR head scans from the same patient are registered to a location in physical space using a point-based technique. The coordinates of the virtual fan beam in physical space are continuously calculated and updated on the fly. During each program loop, the CT and MR data volumes are reformatted along the same plane and displayed as two fan-shaped images that correspond to the current physical-space location of the virtual fan beam. When the reformatted preoperative tomographic images are eventually paired with a real-time intraoperative ultrasound image, a neurosurgeon will be able to use the unique information of each imaging modality (e.g., the high resolution and tissue contrast of CT and MR and the real-time functionality of ultrasound) in a complementary manner to identify structures in the brain more easily and to guide surgical procedures more effectively.
Vision-based overlay of a virtual object into real scene for designing room interior
NASA Astrophysics Data System (ADS)
Harasaki, Shunsuke; Saito, Hideo
2001-10-01
In this paper, we introduce a geometric registration method for augmented reality (AR) and an application system, an interior simulator, in which a virtual (CG) object can be overlaid onto a real-world scene. The interior simulator is developed as an example AR application of the proposed method. Using the interior simulator, users can visually simulate the placement of virtual furniture and articles in a living room, viewing it from many different locations and orientations in real time, so that they can easily design the room interior without placing real furniture and articles. In our system, two base images of the real-world space are captured from two different views to define a projective coordinate frame for the 3D object space. Each projective view of a virtual object in the base images is then registered interactively. After this coordinate determination, an image sequence of the real-world space is captured by a hand-held camera while non-metric feature points are tracked, and the virtual object is overlaid onto the image sequence using the projective relationships between the images. With the proposed system, 3D position tracking devices, such as magnetic trackers, are not required to overlay virtual objects. Experimental results demonstrate that 3D virtual furniture can be overlaid onto an image sequence of a living room scene at nearly video rate (20 frames per second).
a Clustering-Based Approach for Evaluation of EO Image Indexing
NASA Astrophysics Data System (ADS)
Bahmanyar, R.; Rigoll, G.; Datcu, M.
2013-09-01
The volume of Earth Observation data is increasing immensely, on the order of several terabytes a day. Therefore, to explore and investigate the content of this huge amount of data, more sophisticated Content-Based Information Retrieval (CBIR) systems are in high demand. These systems should be able not only to discover unknown structures behind the data, but also to provide results relevant to users' queries. Since in any retrieval system the images are processed based on a discrete set of their features (i.e., feature descriptors), studying and assessing the structure of the feature space built by different feature descriptors is of high importance. In this paper, we introduce a clustering-based approach to study the content of image collections. In our approach, we claim that using both internal and external evaluation of clusters for different feature descriptors helps in understanding the structure of the feature space. Moreover, the semantic understanding that users have of the images can also be assessed. To validate the performance of our approach, we used an annotated Synthetic Aperture Radar (SAR) image collection. Quantitative results, together with a visualization of the feature space, demonstrate the applicability of our approach.
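A minimal sketch of the kind of internal/external cluster evaluation described above is given below; the clustering algorithm, descriptor dimensionality, and scores chosen here (k-means with silhouette and adjusted Rand index) are assumptions for illustration, not necessarily the paper's exact choices.

```python
# Hedged sketch: score one feature descriptor's feature space with an internal
# measure (silhouette) and an external measure (agreement with annotations).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, adjusted_rand_score

def evaluate_feature_space(features, annotations, n_clusters=10):
    """features: (n_images, n_dims) descriptors; annotations: semantic labels."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(features)
    internal = silhouette_score(features, km.labels_)        # cluster structure
    external = adjusted_rand_score(annotations, km.labels_)  # user semantics
    return internal, external

# Toy usage with random descriptors and labels standing in for a SAR collection.
X = np.random.rand(200, 64)
y = np.random.randint(0, 5, 200)
print(evaluate_feature_space(X, y, n_clusters=5))
```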
NASA Technical Reports Server (NTRS)
1984-01-01
Among the topics discussed are NASA's land remote sensing plans for the 1980s, the evolution of Landsat 4 and the performance of its sensors, the Landsat 4 thematic mapper image processing system radiometric and geometric characteristics, data quality, image data radiometric analysis and spectral/stratigraphic analysis, and thematic mapper agricultural, forest resource and geological applications. Also covered are geologic applications of side-looking airborne radar, digital image processing, the large format camera, the RADARSAT program, the SPOT 1 system's program status, distribution plans, and simulation program, Space Shuttle multispectral linear array studies of the optical and biological properties of terrestrial land cover, orbital surveys of solar-stimulated luminescence, the Space Shuttle imaging radar research facility, and Space Shuttle-based polar ice sounding altimetry.
Effect analysis of oil paint on the space optical contamination
NASA Astrophysics Data System (ADS)
Lu, Chun-lian; Lv, He; Han, Chun-xu; Wei, Hai-Bin
2013-08-01
Contamination of spacecraft surfaces is a hot topic in spacecraft environment engineering and environmental protection of spacecraft. Since the 20th century, many American satellites have malfunctioned because of space contamination. Space optical systems are usually exposed to the external space environment. Particulate contamination of optical systems degrades their detection ability; we call this optical damage. It also degrades the spectral imaging quality of the whole system. In this paper, the effects of contamination on spectral imaging are discussed. An experiment was designed to observe the magnitude of the effect. We used numerical curve fitting to analyze the relationship between the optical damage factor (the transmittance decay factor) and the contamination degree of the optical system. We give results for six specific wavelengths from 450 to 700 nm and obtain the functional relationship between the optical damage factor and the contamination degree. Three colors of oil paint were chosen for comparison. Through numerical curve fitting and data processing, we obtained the mass thickness for the different colors of oil paint at which the transmittance decreased to 50% and 30%. Comparisons and research conclusions are given. From these comparisons, we draw conclusions about the contamination effects of oil paint on the spectral imaging system.
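The sketch below illustrates the kind of curve fitting described above under a Beer-Lambert-style exponential assumption; the functional form and the sample data are illustrative stand-ins, not the paper's measured curves.

```python
# Illustrative sketch only: fit an exponential decay of transmittance versus
# contaminant mass thickness and read off the thickness at which transmittance
# falls to 50% and 30%. The model form and data are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def transmittance(m, alpha):
    """T(m) = exp(-alpha * m), where m is the contaminant mass thickness."""
    return np.exp(-alpha * m)

# Hypothetical measurements for one paint colour at one wavelength.
m_data = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])        # mass thickness (a.u.)
T_data = np.array([1.00, 0.82, 0.68, 0.45, 0.21, 0.05])  # measured transmittance

(alpha_fit,), _ = curve_fit(transmittance, m_data, T_data, p0=[0.5])
for target in (0.5, 0.3):
    print(f"T = {target:.0%} at mass thickness {-np.log(target) / alpha_fit:.2f}")
```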
ERIC Educational Resources Information Center
Haapaniemi, Peter
1990-01-01
Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to a tiny fraction of the space required by the originals, and discusses current applications. Highlights include the image processing system at the National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…
Preliminary design of a satellite observation system for Space Station Freedom
NASA Technical Reports Server (NTRS)
Cabe, Greg (Editor); Gallagher, Chris; Wilson, Brian; Rehfeld, James; Maurer, Alexa; Stern, Dan; Nualart, Jaime; Le, Xuan-Trang
1992-01-01
Degobah Satellite Systems (DSS), in cooperation with the Universities Space Research Association (USRA), NASA Johnson Space Center (JSC), and the University of Texas, has completed the preliminary design of a satellite system to provide inexpensive on-demand video images of all or any portion of Space Station Freedom (SSF). DSS has narrowed the scope of the project to complement the work done by Mr. Dennis Wells at Johnson Space Center. This three-month project has resulted in completion of the preliminary design of AERCAM, the Autonomous Extravehicular Robotic Camera, detailed in this design report. This report begins by providing information on the project background, describing the mission objectives, constraints, and assumptions. Preliminary designs for the primary concept and satellite subsystems are then discussed in detail. Included in the technical portion of the report are detailed descriptions of an advanced imaging system and of docking and safing systems that ensure compatibility with the SSF. The report concludes by describing management procedures and project costs.
The Goddard Space Flight Center Program to develop parallel image processing systems
NASA Technical Reports Server (NTRS)
Schaefer, D. H.
1972-01-01
Parallel image processing, defined as image processing in which all points of an image are operated upon simultaneously, is discussed. Coherent optical, noncoherent optical, and electronic methods are considered as parallel image processing techniques.
Space imaging infrared optical guidance for autonomous ground vehicle
NASA Astrophysics Data System (ADS)
Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Yamada, Hirofumi; Ishii, Hiromitsu
2008-08-01
We have developed the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, based on an uncooled infrared camera and a focusing technique, to detect objects to be avoided and to set the drive path. For this purpose we built a servomotor drive system to control the focus function of the infrared camera lens. To determine the best focus position we use autofocus image processing based on the four-term Daubechies wavelet transform. The determined best-focus position is then converted to the distance of the object. We built an aluminum-frame ground vehicle, 900 mm long and 800 mm wide, to mount the autofocus infrared unit. The vehicle uses an Ackermann front steering system and a rear motor drive system. To confirm the guidance ability of the Space Imaging Infrared Optical Guidance for Autonomous Ground Vehicle, we conducted experiments on the detection ability of the infrared autofocus unit against an actual car on the road and a roadside wall. As a result, the autofocus image processing based on the Daubechies wavelet transform detects the best-focus image clearly and gives the depth of the object from the infrared camera unit.
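A hedged sketch of a wavelet-based focus measure is shown below: the energy of the detail coefficients of a single-level 2-D Daubechies transform, computed with PyWavelets. Whether 'db2' (the four-coefficient Daubechies wavelet) matches the paper's "4-term" transform exactly is an assumption, and the metric is an illustration rather than the authors' implementation.

```python
# Hedged sketch: focus metric from the detail-coefficient energy of a 2-D DWT.
import numpy as np
import pywt

def wavelet_focus_measure(image, wavelet="db2"):
    """'db2' is the four-coefficient Daubechies wavelet (an assumption here)."""
    _, (cH, cV, cD) = pywt.dwt2(np.asarray(image, dtype=float), wavelet)
    return float((cH**2).sum() + (cV**2).sum() + (cD**2).sum())

# Usage: sweep the lens focus, score every frame, and keep the sharpest one;
# the lens position of that frame is then mapped to object distance.
# best_idx = int(np.argmax([wavelet_focus_measure(f) for f in focus_stack]))
```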
Flight simulator with spaced visuals
NASA Technical Reports Server (NTRS)
Gilson, Richard D. (Inventor); Thurston, Marlin O. (Inventor); Olson, Karl W. (Inventor); Ventola, Ronald W. (Inventor)
1980-01-01
A flight simulator arrangement wherein a conventional, movable-base flight trainer is combined with a visual cue display surface spaced a predetermined distance from an eye position within the trainer. Thus, three degrees of motion freedom (roll, pitch, and crab) are provided for a visual, proprioceptive, and vestibular cue system by the trainer, while the remaining geometric visual cue image alterations are developed by a video system. A geometric approach to computing the runway image eliminates the need to electronically compute trigonometric functions, while utilization of a line generator and a designated vanishing point at the video system raster permits facile development of the images of the longitudinal edges of the runway.
Axial nonimaging characteristics of imaging lenses: discussion.
Siew, Ronian
2016-05-01
At observation planes away from the image plane, an imaging lens is a nonimaging optic. We examine the variation of axial irradiance with distance in image space and highlight the following little-known observation for discussion: on a per-unit-area basis, the position of the highest concentration in image space is generally not at the focal plane. This characteristic is contrary to common experience, and it offers an additional degree of freedom for the design of detection systems. It would also apply to lenses with negative refractive index. The position of peak concentration and its irradiance depend upon the location and irradiance of the image. As such, this discussion also includes a close examination of expressions for image irradiance and explains how they are related to irradiance calculations beyond the image plane. This study is restricted to rotationally symmetric refractive imaging systems with incoherent extended Lambertian sources.
NASA Facts: How We Get Pictures from Space
NASA Technical Reports Server (NTRS)
Haynes, Robert
1987-01-01
The past 25 years of space travel and exploration have generated an unprecedented quantity of data from planetary systems. Images taken in space and telemetered back to Earth have greatly aided scientists in formulating better and more accurate theories about the nature and origin of our solar system. The procedures and spacecraft systems used to gather the data are explained.
Real-time processing of interferograms for monitoring protein crystal growth on the Space Station
NASA Technical Reports Server (NTRS)
Choudry, A.; Dupuis, N.
1988-01-01
The possibility of using microscopic interferometric techniques to monitor the growth of protein crystals on the Space Station is studied. Digital image processing techniques are used to develop a system for the real-time analysis of microscopic interferograms of nucleation sites during protein crystal growth. Features of the optical setup and the image processing system are discussed and experimental results are presented.
Popova, I I; Orlov, O I; Matsnev, E I; Revyakin, Yu G
2016-01-01
The paper reports the results of testing several diagnostic video systems enabling digital imaging of the ENT organs, teeth, and jaws. The authors substantiate the criteria for choosing and integrating imaging systems into the future LOR kit on the Russian segment of the International Space Station, developed for examination and downlink of high-quality images of cosmonauts' ENT organs, periodontium, and teeth.
Large-viewing-angle electroholography by space projection
NASA Astrophysics Data System (ADS)
Sato, Koki; Obana, Kazuki; Okumura, Toshimichi; Kanaoka, Takumi; Nishikawa, Satoko; Takano, Kunihiko
2004-06-01
The hologram image specification is a full-parallax 3D image. In this case we can obtain a more natural 3D image because focusing and convergence are coincident with each other. We aim at a practical electro-holography system, because in conventional electro-holography the image viewing angle is very small; this is due to the limited display pixel size. We are now developing a new method for a large viewing angle based on space projection. A white laser illuminates a single DMD panel (a time-shared CGH of the three RGB colors). A 3D space screen composed of very small water particles is used to reconstruct the 3D image with a large viewing angle through scattering from the water particles.
Task-based design of a synthetic-collimator SPECT system used for small animal imaging.
Lin, Alexander; Kupinski, Matthew A; Peterson, Todd E; Shokouhi, Sepideh; Johnson, Lindsay C
2018-05-07
In traditional multipinhole SPECT systems, image multiplexing - the overlapping of pinhole projection images - may occur on the detector, which can inhibit quality image reconstructions due to photon-origin uncertainty. One proposed system to mitigate the effects of multiplexing is the synthetic-collimator SPECT system. In this system, two detectors, a silicon detector and a germanium detector, are placed at different distances behind the multipinhole aperture, allowing for image detection to occur at different magnifications and photon energies, resulting in higher overall sensitivity while maintaining high resolution. The unwanted effects of multiplexing are reduced by utilizing the additional data collected from the front silicon detector. However, determining optimal system configurations for a given imaging task requires efficient parsing of the complex parameter space, to understand how pinhole spacings and the two detector distances influence system performance. In our simulation studies, we use the ensemble mean-squared error of the Wiener estimator (EMSE_W) as the figure of merit to determine optimum system parameters for the task of estimating the uptake of a 123I-labeled radiotracer in three different regions of a computer-generated mouse brain phantom. The segmented phantom map is constructed by using data from the MRM NeAt database and allows for a reduction in the dimensionality of the system matrix, which improves the computational efficiency of scanning the system's parameter space. To contextualize our results, the Wiener estimator is also compared against a region-of-interest estimator using maximum-likelihood reconstructed data. Our results show that the synthetic-collimator SPECT system outperforms traditional multipinhole SPECT systems in this estimation task. We also find that image multiplexing plays an important role in the system design of the synthetic-collimator SPECT system, with optimal germanium detector distances occurring at maxima in the derivative of the percent multiplexing function. Furthermore, we report that improved task performance can be achieved by using an adaptive system design in which the germanium detector distance may vary with projection angle. Finally, in our comparative study, we find that the Wiener estimator outperforms the conventional region-of-interest estimator. Our work demonstrates how this optimization method has the potential to quickly and efficiently explore vast parameter spaces, providing insight into the behavior of competing factors, which are otherwise very difficult to calculate and study using other existing means. © 2018 American Association of Physicists in Medicine.
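As a linear-Gaussian illustration of the figure of merit described above (not the paper's simulated SPECT model), the sketch below computes the ensemble mean-squared error of the Wiener (LMMSE) estimator for region uptakes; the system matrix and covariances are placeholders.

```python
# Hedged sketch: EMSE of the Wiener estimator in a linear-Gaussian model.
# A, K_theta, and K_noise are placeholders, not the paper's SPECT system.
import numpy as np

def wiener_emse(A, K_theta, K_noise):
    """EMSE_W = trace of the Wiener-estimator error covariance.

    A        : (n_meas, n_regions) system matrix mapping uptakes to mean data
    K_theta  : (n_regions, n_regions) prior covariance of the uptakes
    K_noise  : (n_meas, n_meas) measurement-noise covariance
    """
    S = A @ K_theta @ A.T + K_noise              # data covariance
    gain = K_theta @ A.T @ np.linalg.inv(S)      # Wiener gain
    err_cov = K_theta - gain @ A @ K_theta       # posterior (error) covariance
    return float(np.trace(err_cov))

# A lower EMSE_W indicates a better configuration for the estimation task;
# scanning detector distances or pinhole spacings amounts to re-evaluating A.
rng = np.random.default_rng(0)
A = rng.random((50, 3))
print(wiener_emse(A, K_theta=np.eye(3), K_noise=0.1 * np.eye(50)))
```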
Interior view of the external airlock of the Orbiter Discovery. ...
Interior view of the external airlock of the Orbiter Discovery. In the lower portion of the image is the Aft Hatch and in the upper portion of the image is the Upper Hatch. This photograph was taken in the Orbiter Processing Facility at the Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Design of an automated imaging system for use in a space experiment
NASA Technical Reports Server (NTRS)
Hartz, William G.; Bozzolo, Nora G.; Lewis, Catherine C.; Pestak, Christopher J.
1991-01-01
An experiment on an orbiting platform examines mass transfer across gas-liquid and liquid-liquid interfaces. It employs an imaging system with real-time image analysis. The design includes optical design, imager selection and integration, positioner control, image recording, software development for processing, and interfaces to telemetry. It addresses the constraints of weight, volume, and electric power associated with placing the experiment in the Space Shuttle cargo bay. Challenging elements of the design are: imaging and recording a 200-micron-diameter bubble with a resolution of 2 microns to serve as a primary source of data; varying frame rates from 500 frames per second to 1 frame per second, depending on the experiment phase; and providing three-dimensional information to determine the shape of the bubble.
Research-grade CMOS image sensors for remote sensing applications
NASA Astrophysics Data System (ADS)
Saint-Pe, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Martin-Gonthier, Philippe; Corbiere, Franck; Belliot, Pierre; Estribeau, Magali
2004-11-01
Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market has long been dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA and ESA). All through the 90s, and thanks to their steadily improving performance, CIS started to be used successfully for more and more demanding space applications, from vision and control functions requiring low-level performance to guidance applications requiring medium-level performance. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this paper presents the existing and foreseen ways to reach high-level electro-optical performance for CIS. The developments and performances of CIS prototypes built using an imaging CMOS process are presented in the corresponding section.
Sparse magnetic resonance imaging reconstruction using the bregman iteration
NASA Astrophysics Data System (ADS)
Lee, Dong-Hoon; Hong, Cheol-Pyo; Lee, Man-Woo
2013-01-01
Magnetic resonance imaging (MRI) reconstruction needs many samples that are acquired sequentially using phase-encoding gradients, which directly determines the scan time of the MRI system and makes it long. Therefore, many researchers have studied ways to reduce the scan time, especially compressed sensing (CS), which exploits image sparsity to reconstruct from fewer samples when k-space is not fully sampled. Recently, an iterative technique based on the Bregman method was developed for denoising. The Bregman iteration improves on total variation (TV) regularization by gradually recovering the fine-scale structures that are usually lost in TV regularization. In this study, we investigated sparse-sampling image reconstruction using the Bregman iteration for a low-field MRI system to improve its temporal resolution and to validate its usefulness. Images were obtained with a 0.32 T MRI scanner (Magfinder II, SCIMEDIX, Korea) of a phantom and an in-vivo human brain in a head coil. We applied random k-space sampling, and we set the sampling ratio to half of the fully sampled k-space. The Bregman iteration was used to generate the final images from the reduced data. We also calculated root-mean-square-error (RMSE) values from error images obtained using various numbers of Bregman iterations. Our reconstructed images using the Bregman iteration for sparsely sampled data showed good results compared with the original images. Moreover, the RMSE values showed that the sparsely reconstructed phantom and human images converged to the original images. We confirmed the feasibility of sparse-sampling image reconstruction using the Bregman iteration with a low-field MRI system and obtained good results. Although our results used half the sampling ratio, this method should help increase the temporal resolution of low-field MRI systems.
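A hedged sketch of the general workflow is shown below: retrospectively undersample k-space to about half of the phase-encoding lines, zero-fill, and apply split-Bregman total-variation denoising from scikit-image. This is a crude stand-in for a full iterative CS reconstruction with a data-consistency step, intended only to illustrate the idea.

```python
# Hedged sketch (not the authors' code): random k-space undersampling followed
# by split-Bregman TV denoising of the zero-filled image.
import numpy as np
from skimage import data
from skimage.restoration import denoise_tv_bregman

img = data.camera().astype(float) / 255.0
k_full = np.fft.fft2(img)

# Keep roughly half of the phase-encoding lines at random (a crude CS mask).
rng = np.random.default_rng(0)
mask = rng.random(img.shape[0]) < 0.5
k_sparse = np.zeros_like(k_full)
k_sparse[mask, :] = k_full[mask, :]

zero_filled = np.abs(np.fft.ifft2(k_sparse))
recon = denoise_tv_bregman(zero_filled, weight=10.0)

rmse = np.sqrt(np.mean((recon - img) ** 2))
print(f"RMSE vs. fully sampled image: {rmse:.4f}")
```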
A spatial registration method for navigation system combining O-arm with spinal surgery robot
NASA Astrophysics Data System (ADS)
Bai, H.; Song, G. L.; Zhao, Y. W.; Liu, X. Z.; Jiang, Y. X.
2018-05-01
Minimally invasive spinal surgery has become increasingly popular in recent years because it reduces the chance of post-operative complications. However, the procedure is complicated and the surgical view in minimally invasive surgery is limited. To increase the quality of percutaneous pedicle screw placement, the O-arm, a mobile intraoperative imaging system, is used to assist surgery, and robot navigation systems combined with the O-arm are increasingly common. One of the major problems in a surgical navigation system is associating the patient space with the intraoperative image space. This study proposes a spatial registration method for a spinal surgical robot navigation system, which uses the O-arm to scan a calibration phantom containing metal calibration spheres. First, metal artifacts are reduced in the CT slices, and the circles in the images are identified based on moment invariants; the position of each calibration sphere in image space is thereby obtained. The registration matrix is then computed using the ICP algorithm. Finally, the position error is calculated to verify the feasibility and accuracy of the registration method.
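The sketch below illustrates the ICP step mentioned above with a minimal point-to-point implementation (nearest-neighbour correspondences plus an SVD rigid update); it is an illustration, not the paper's code, and the `source`/`target` point clouds are placeholders for the pedestal model samples and the sphere positions extracted from the O-arm CT.

```python
# Hedged sketch: minimal point-to-point ICP with a Kabsch/SVD rigid update.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping points P onto Q."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

def icp(source, target, n_iter=30):
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(n_iter):
        _, idx = tree.query(src)                 # closest-point correspondences
        R_step, t_step = best_rigid_transform(src, target[idx])
        src = src @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step   # accumulate the transform
    return R, t
```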
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raj, Sunny; Jha, Sumit Kumar; Pullum, Laura L.
Validating the correctness of human detection vision systems is crucial for safety applications such as pedestrian collision avoidance in autonomous vehicles. The enormous space of possible inputs to such an intelligent system makes it difficult to design test cases for such systems. In this report, we present our tool MAYA that uses an error model derived from a convolutional neural network (CNN) to explore the space of images similar to a given input image, and then tests the correctness of a given human or object detection system on such perturbed images. We demonstrate the capability of our tool on the pre-trained Histogram-of-Oriented-Gradients (HOG) human detection algorithm implemented in the popular OpenCV toolset and the Caffe object detection system pre-trained on the ImageNet benchmark. Our tool may serve as a testing resource for the designers of intelligent human and object detection systems.
ARES I AND ARES V CONCEPT IMAGE
NASA Technical Reports Server (NTRS)
2008-01-01
This concept image shows NASA's next generation launch vehicle systems standing side by side. Ares I, left, is the crew launch vehicle that will carry the Orion crew exploration vehicle to space. Ares V is the cargo launch vehicle that will deliver large-scale hardware, including the lunar lander, to space.
NASA Astrophysics Data System (ADS)
Avdyushev, V.; Banshchikova, M.; Chuvashov, I.; Kuzmin, A.
2017-09-01
The paper presents the capabilities of the "Vector-M" software for diagnostics of the ionosphere state from auroral emission images and from plasma characteristics measured in different orbits, as part of a space-weather monitoring and control system. The "Vector-M" software was developed by the celestial mechanics and astrometry department of Tomsk State University in collaboration with the Space Research Institute (Moscow) and the Central Aerological Observatory of the Russian Federal Service for Hydrometeorology and Environmental Monitoring. The software is intended for calculating attendant geophysical and astronomical information for the spacecraft's centre of mass and for the observation space in the experiment with the auroral imager Aurovisor-VIS/MP in the orbit of the prospective Meteor-MP spacecraft.
Attitude determination for high-accuracy submicroradian jitter pointing on space-based platforms
NASA Astrophysics Data System (ADS)
Gupta, Avanindra A.; van Houten, Charles N.; Germann, Lawrence M.
1990-10-01
A description of the requirement definition process is given for a new wideband attitude determination subsystem (ADS) for image motion compensation (IMC) systems. The subsystem consists of either lateral accelerometers functioning in differential pairs or gas-bearing gyros as high-frequency sensors, with CCD-based star trackers as low-frequency sensors. To minimize error, the sensor signals are combined with a mixing filter that does not allow phase distortion. The two ADS models are introduced into an IMC simulation to predict measurement error, correction capability, and residual image jitter for a variety of system parameters. The IMC three-axis testbed is utilized to simulate an incoming beam in inertial space. Results demonstrate that both mechanical and electronic IMC meet the requirements of image stabilization for space-based observation at submicroradian jitter levels. Currently available technology may be employed to implement IMC systems.
Space Radar Image of Baikal Lake, Russia
NASA Technical Reports Server (NTRS)
1994-01-01
This is an X-band black-and-white image of the forests east of Lake Baikal in the Jablonowy Mountains of Russia. The image is centered at 52.5 degrees north latitude and 116 degrees east longitude near the mining town of Bukatschatscha. This image was acquired by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar aboard the space shuttle Endeavour on October 4, 1994, during the second flight of the spaceborne radar. This area is part of an international research project known as the Taiga Aerospace Investigation using Geographic Information System Applications.
Test Image by Mars Descent Imager
2010-07-19
Ken Edgett, deputy principal investigator for NASA Mars Descent Imager, holds a ruler used as a depth-of-field test target. The instrument took this image inside the Malin Space Science Systems clean room in San Diego, CA, during calibration testing.
State of the art in video system performance
NASA Technical Reports Server (NTRS)
Lewis, Michael J.
1990-01-01
The closed-circuit television (CCTV) system onboard the Space Shuttle comprises cameras, a video signal switching and routing unit (VSU), and a Space Shuttle video tape recorder. However, this system is inadequate for use with many experiments that require video imaging. In order to assess the state of the art in video technology and data storage systems, a survey was conducted of High Resolution, High Frame Rate Video Technology (HHVT) products. The performance of state-of-the-art solid state cameras and image sensors, video recording systems, data transmission devices, and data storage systems is shown graphically against users' requirements.
IoSiS: a radar system for imaging of satellites in space
NASA Astrophysics Data System (ADS)
Jirousek, M.; Anger, S.; Dill, S.; Schreiber, E.; Peichl, M.
2017-05-01
Space debris is nowadays one of the main threats to satellite systems, especially in low Earth orbit (LEO). More than 700,000 debris objects with the potential to destroy or damage a satellite are estimated to exist. The effects of an impact often are not identifiable directly from the ground, and high-resolution radar images are helpful in analyzing possible damage. Therefore DLR is currently developing a radar system called IoSiS (Imaging of Satellites in Space), based on an existing steering antenna structure and our multi-purpose high-performance radar system GigaRad for experimental investigations. GigaRad is a multi-channel system operating at X band and using a bandwidth of up to 4.4 GHz in the IoSiS configuration, providing fully separated transmit (TX) and receive (RX) channels and separate antennas. For the observation of small satellites or space debris, a high-power traveling-wave-tube amplifier (TWTA) is mounted close to the TX antenna feed. For the experimental phase, IoSiS uses a 9 m TX and a 1 m RX antenna mounted on a common steerable positioner. High-resolution radar images are obtained by using inverse synthetic aperture radar (ISAR) techniques. Guided tracking of known objects during an overpass allows wide azimuth observation angles; thus, azimuth resolution comparable to the range resolution can be achieved. This paper outlines the main technical characteristics of the IoSiS radar system, including the basic setup of the antenna, the radar instrument with its RF error correction, and the measurement strategy. A short description of a simulation tool for the whole instrument and of the expected images is also given.
Closeup view of the payload bay side of the aft ...
Close-up view of the payload bay side of the aft fuselage bulkhead of the Orbiter Discovery. This image shows detailed portions of the Remote Manipulator System and the Orbiter Maneuvering System/Reaction Control System pods. This photograph was taken in the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
The progress of sub-pixel imaging methods
NASA Astrophysics Data System (ADS)
Wang, Hu; Wen, Desheng
2014-02-01
This paper reviews the principles and characteristics of sub-pixel imaging technology, its current development status at home and abroad, and the latest research developments. Because sub-pixel imaging achieves high resolution for optical remote sensors, flexible operating modes, and miniaturization with no moving parts, the imaging system is well suited to space remote sensors and its application prospects are very extensive. It is quite possibly a research and development direction for future space optical remote sensing technology.
Application of field dependent polynomial model
NASA Astrophysics Data System (ADS)
Janout, Petr; Páta, Petr; Skala, Petr; Fliegel, Karel; Vítek, Stanislav; Bednář, Jan
2016-09-01
Extremely wide-field imaging systems have many advantages for capturing large scenes, whether in microscopy, all-sky cameras, or security technologies. The large viewing angle comes at the cost of the aberrations inherent in these imaging systems. Modeling wavefront aberrations using Zernike polynomials has long been known and is widely used. Our method does not model the system aberrations as a wavefront, but directly models the aberrated point spread function of the imaging system. This is a very complicated task, and with conventional methods it was difficult to achieve the desired accuracy. Our optimization technique, which searches for the coefficients of space-variant Zernike polynomials, can be described as a comprehensive model for ultra-wide-field imaging systems. The advantage of this model is that it describes the whole space-variant system, unlike the majority of models, which treat the system as partly invariant. A difficulty is that the size of the modeled point spread function is comparable to the pixel size; issues associated with sampling, pixel size, and the pixel sensitivity profile must therefore be taken into account in the design. The model was verified on a series of laboratory test patterns, on test images of laboratory light sources, and finally on real images obtained with the extremely wide-field imaging system WILLIAM. Modeling results for this system are presented in this article.
NASA Technical Reports Server (NTRS)
1992-01-01
To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.
A knowledge-based machine vision system for space station automation
NASA Technical Reports Server (NTRS)
Chipman, Laure J.; Ranganath, H. S.
1989-01-01
A simple knowledge-based approach to the recognition of objects in man-made scenes is being developed. Specifically, the system under development is a proposed enhancement to a robot arm for use in the space station laboratory module. The system will take a request from a user to find a specific object, and locate that object by using its camera input and information from a knowledge base describing the scene layout and attributes of the object types included in the scene. In order to use realistic test images in developing the system, researchers are using photographs of actual NASA simulator panels, which provide similar types of scenes to those expected in the space station environment. Figure 1 shows one of these photographs. In traditional approaches to image analysis, the image is transformed step by step into a symbolic representation of the scene. Often the first steps of the transformation are done without any reference to knowledge of the scene or objects. Segmentation of an image into regions generally produces a counterintuitive result in which regions do not correspond to objects in the image. After segmentation, a merging procedure attempts to group regions into meaningful units that will more nearly correspond to objects. Here, researchers avoid segmenting the image as a whole, and instead use a knowledge-directed approach to locate objects in the scene. The knowledge-based approach to scene analysis is described and the categories of knowledge used in the system are discussed.
Space charge effects in ultrafast electron diffraction and imaging
NASA Astrophysics Data System (ADS)
Tao, Zhensheng; Zhang, He; Duxbury, P. M.; Berz, Martin; Ruan, Chong-Yu
2012-02-01
Understanding space charge effects is central for the development of high-brightness ultrafast electron diffraction and microscopy techniques for imaging material transformation with atomic scale detail at the fs to ps timescales. We present methods and results for direct ultrafast photoelectron beam characterization employing a shadow projection imaging technique to investigate the generation of ultrafast, non-uniform, intense photoelectron pulses in a dc photo-gun geometry. Combined with N-particle simulations and an analytical Gaussian model, we elucidate three essential space-charge-led features: the pulse lengthening following a power-law scaling, the broadening of the initial energy distribution, and the virtual cathode threshold. The impacts of these space charge effects on the performance of the next generation high-brightness ultrafast electron diffraction and imaging systems are evaluated.
Time-of-Flight Microwave Camera.
Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh
2015-10-05
Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
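As a toy illustration of how an FMCW receiver yields depth (range is proportional to the beat frequency between the transmitted and received chirps, R = c·f_beat·T/(2B)), the sketch below simulates an ideal dechirped beat tone and recovers the range from its spectrum; the chirp parameters are illustrative, not the camera's actual sweep settings.

```python
# Hedged sketch: FMCW beat-frequency-to-range conversion on a synthetic tone.
import numpy as np

c = 3e8          # speed of light (m/s)
B = 4e9          # swept bandwidth (Hz), e.g. 8-12 GHz
T = 1e-3         # chirp duration (s)
fs = 2e6         # baseband sampling rate (Hz)
R_true = 2.4     # target range (m)

f_beat_true = 2 * R_true * B / (c * T)
t = np.arange(int(T * fs)) / fs
beat = np.cos(2 * np.pi * f_beat_true * t)       # ideal dechirped beat signal

spec = np.abs(np.fft.rfft(beat))
f_beat_est = np.fft.rfftfreq(len(beat), 1 / fs)[np.argmax(spec)]
print(f"estimated range: {c * f_beat_est * T / (2 * B):.2f} m")
```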
Computer graphics testbed to simulate and test vision systems for space applications
NASA Technical Reports Server (NTRS)
Cheatham, John B.
1991-01-01
Research activity has shifted from computer graphics and vision systems to the broader scope of applying concepts of artificial intelligence to robotics. Specifically, the research is directed toward developing Artificial Neural Networks, Expert Systems, and Laser Imaging Techniques for Autonomous Space Robots.
Optics: Light, Color, and Their Uses. An Educator's Guide With Activities In Science and Mathematics
NASA Technical Reports Server (NTRS)
2000-01-01
This document includes information on the Chandra X-Ray Observatory, the Hubble Space Telescope, the Next Generation Space Telescope, Soft X-Ray Imager, and the Lightning Imaging System. Classroom activities from grades K-12 are included, focusing on light and color, using mirrors, lenses, prisms, and filters.
Oblique view at ground level looking at the aft and ...
Oblique view at ground level looking at the aft and port side of the Orbiter Discovery in the Vehicle Assembly Building at NASA's Kennedy Space Center. Note that the Orbiter Maneuvering System/Reaction Control System pods and the Shuttle Main Engines are removed in this image. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Distance measurement based on light field geometry and ray tracing.
Chen, Yanqin; Jin, Xin; Dai, Qionghai
2017-01-09
In this paper, we propose a geometric optical model to measure the distances of object planes in a light field image. The proposed geometric optical model is composed of two sub-models based on ray tracing: an object-space model and an image-space model. The two theoretical sub-models are derived for on-axis point light sources. In the object-space model, light rays propagate into the main lens and refract inside it following the refraction theorem. In the image-space model, light rays exit from emission positions on the main lens and subsequently impinge on the image sensor with different imaging diameters. The relationships between the imaging diameters of objects and their corresponding emission positions on the main lens are investigated by utilizing refocusing and the similar-triangle principle. By combining the two sub-models and tracing light rays back to object space, the relationships between objects' imaging diameters and the corresponding distances of the object planes are derived. The performance of the proposed geometric optical model is compared with existing approaches using different configurations of hand-held plenoptic 1.0 cameras, and real experiments are conducted using a preliminary imaging system. Results demonstrate that the proposed model outperforms existing approaches in terms of accuracy and exhibits good performance over a general imaging range.
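A hedged sketch of the similar-triangles relationship underlying such a model is given below for a thin main lens: a point at object distance z images at v = 1/(1/f − 1/z), and a sensor fixed at distance s records a spot of diameter d = A·|s − v|/v, which can be inverted for distance. The parameter values are illustrative, not the plenoptic camera's actual optics.

```python
# Hedged sketch: thin-lens spot ("imaging") diameter versus object distance,
# and its inversion for the far-side case (object beyond the in-focus plane).
import numpy as np

def imaging_diameter(z, f=0.05, A=0.025, s=0.055):
    v = 1.0 / (1.0 / f - 1.0 / z)        # thin-lens image distance
    return A * abs(s - v) / v            # similar triangles behind the lens

def distance_from_diameter(d, f=0.05, A=0.025, s=0.055):
    # Invert d = A*(s - v)/v, valid when v < s (object farther than best focus).
    v = A * s / (A + d)
    return 1.0 / (1.0 / f - 1.0 / v)     # thin lens solved for object distance

z = 1.2                                  # metres
d = imaging_diameter(z)
print(f"spot diameter {d*1e3:.3f} mm -> recovered distance {distance_from_diameter(d):.2f} m")
```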
A large-area gamma-ray imaging telescope system
NASA Technical Reports Server (NTRS)
Koch, D. G.
1983-01-01
The concept definition of using the External Tank (ET) of the Space Shuttle as the basis for constructing a large area gamma ray imaging telescope in space is detailed. The telescope will be used to locate and study cosmic sources of gamma rays of energy greater than 100 MeV. Both the telescope properties and the means whereby an ET is used for this purpose are described. A parallel is drawn between those systems that would be common to both a Space Station and this ET application. In addition, those systems necessary for support of the telescope can form the basis for using the ET as part of the Space Station. The major conclusions of this concept definition are that the ET is ideal for making into a gamma ray telescope, and that this telescope will provide a substantial increase in collecting area.
NASA Technical Reports Server (NTRS)
Macdonald, H.; Waite, W. P.; Kaupp, V. H.; Bridges, L. C.; Storm, M.
1983-01-01
Comparisons between LANDSAT MSS imagery, and aircraft and space radar imagery from different geologic environments in the United States, Panama, Colombia, and New Guinea demonstrate the interdependence of radar system geometry and terrain configuration for optimum retrieval of geologic information. Illustrations suggest that in the case of space radars (SIR-A in particular), the ability to acquire multiple look-angle/look-direction radar images of a given area is more valuable for landform mapping than further improvements in spatial resolution. Radar look-angle is concluded to be one of the most important system parameters of a space radar designed to be used for geologic reconnaissance mapping. The optimum set of system parameters must be determined for imaging different classes of landform features and tailoring the look-angle to local topography.
DDGIPS: a general image processing system in robot vision
NASA Astrophysics Data System (ADS)
Tian, Yuan; Ying, Jun; Ye, Xiuqing; Gu, Weikang
2000-10-01
Real-time image processing is the key task in robot vision. Owing to the limitations of hardware techniques, many algorithm-oriented firmware systems were designed in the past, but their architectures were not flexible enough to serve as a multi-algorithm development system. With the rapid development of microelectronics, many high-performance DSP chips and high-density FPGA chips have become available, making it possible to construct a more flexible architecture for a real-time image processing system. In this paper, a Double DSP General Image Processing System (DDGIPS) is presented. We construct a two-DSP, FPGA-based computational system with two TMS320C6201s. The TMS320C6x devices are fixed-point processors based on an advanced VLIW CPU, which has eight functional units, including two multipliers and six arithmetic logic units. These features make the C6x a good candidate for a general-purpose system. In our system, each of the two TMS320C6201s has a local memory space, and they also share a system memory space which enables them to intercommunicate and exchange data efficiently. At the same time, they can be directly interconnected in a star-shaped architecture. All of this is under the control of an FPGA group. As the core of the system, the FPGA plays a very important role: it takes charge of DSP control, DSP communication, memory space access arbitration, and the communication between the system and the host machine. By reconfiguring the FPGA, all of the interconnections between the two DSPs or between DSP and FPGA can be changed. In this way, users can easily rebuild the real-time image processing system according to the data stream and the task of the application, gaining great flexibility.
Fast iterative image reconstruction using sparse matrix factorization with GPU acceleration
NASA Astrophysics Data System (ADS)
Zhou, Jian; Qi, Jinyi
2011-03-01
Statistically based iterative approaches for image reconstruction have gained much attention in medical imaging. An accurate system matrix that defines the mapping from the image space to the data space is the key to high-resolution image reconstruction. However, an accurate system matrix is often associated with high computational cost and huge storage requirements. Here we present a method to address this problem by using sparse matrix factorization and parallel computing on a graphics processing unit (GPU). We factor the accurate system matrix into three sparse matrices: a sinogram blurring matrix, a geometric projection matrix, and an image blurring matrix. The sinogram blurring matrix models the detector response. The geometric projection matrix is based on a simple line integral model. The image blurring matrix compensates for the line-of-response (LOR) degradation due to the simplified geometric projection matrix. The geometric projection matrix is precomputed, while the sinogram and image blurring matrices are estimated by minimizing the difference between the factored system matrix and the original system matrix. The resulting factored system matrix has far fewer nonzero elements than the original system matrix and thus substantially reduces the storage and computation cost. The smaller size also allows an efficient implementation of the forward and back projectors on GPUs, which have a limited amount of memory. Our simulation studies show that the proposed method can dramatically reduce the computation cost of high-resolution iterative image reconstruction. The proposed technique is applicable to image reconstruction for different imaging modalities, including x-ray CT, PET, and SPECT.
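A minimal sketch of the factored-system-matrix idea is shown below with placeholder sparse factors (identity blurs and a random sparse projector rather than fitted models): the system matrix is applied as B_sino·G·B_img inside a few MLEM iterations without ever forming the dense matrix.

```python
# Hedged sketch: apply a factored system matrix A ~ B_sino @ G @ B_img inside
# MLEM. The three factors are placeholders purely to show the plumbing.
import numpy as np
import scipy.sparse as sp

n_pix, n_bins = 64 * 64, 96 * 60
rng = np.random.default_rng(0)

G = sp.random(n_bins, n_pix, density=0.002, random_state=0, format="csr")
B_sino = sp.eye(n_bins, format="csr")   # sinogram blurring (detector response)
B_img = sp.eye(n_pix, format="csr")     # image blurring (LOR compensation)

def forward(x):  return B_sino @ (G @ (B_img @ x))
def backward(y): return B_img.T @ (G.T @ (B_sino.T @ y))

x_true = 50.0 * rng.random(n_pix)
y = rng.poisson(forward(x_true)).astype(float)   # noisy sinogram data

sens = backward(np.ones(n_bins)) + 1e-12         # sensitivity image A^T 1
x = np.ones(n_pix)
for _ in range(10):                              # MLEM: x <- x/sens * A^T(y / Ax)
    x *= backward(y / (forward(x) + 1e-12)) / sens
```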
NASA Technical Reports Server (NTRS)
Jain, A. (Inventor)
1978-01-01
Significant height information of ocean waves, or peaks of rough terrain, is obtained by compressing the radar signal over different widths of the available chirp or Doppler bandwidths and cross-correlating one of these images with each of the others. Upon plotting a fixed (e.g., zero) component of the cross-correlation values as the spacing is increased over some empirically determined range, the system is calibrated. To measure height with the system, a spacing value is selected and a cross-correlation value is determined between two intensity images at the selected frequency spacing; the measured height follows from the slope of the cross-correlation values. Both electronic and optical radar signal data compressors and cross-correlators are disclosed for implementation of the system.
Research-grade CMOS image sensors for demanding space applications
NASA Astrophysics Data System (ADS)
Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre
2004-06-01
Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market has long been dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). All through the 90s, and thanks to their steadily improving performance, CIS started to be used successfully for more and more demanding applications, from vision and control functions requiring low-level performance to guidance applications requiring medium-level performance. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optical performance for CIS. The developments of CIS prototypes built using an imaging CMOS process and of devices based on improved designs will be presented.
Real-time automatic registration in optical surgical navigation
NASA Astrophysics Data System (ADS)
Lin, Qinyong; Yang, Rongqian; Cai, Ken; Si, Xuan; Chen, Xiuwen; Wu, Xiaoming
2016-05-01
An image-guided surgical navigation system requires the improvement of the patient-to-image registration time to enhance the convenience of the registration procedure. A critical step in achieving this aim is performing a fully automatic patient-to-image registration. This study reports on a design of custom fiducial markers and the performance of a real-time automatic patient-to-image registration method using these markers on the basis of an optical tracking system for rigid anatomy. The custom fiducial markers are designed to be automatically localized in both patient and image spaces. An automatic localization method is performed by registering a point cloud sampled from the three dimensional (3D) pedestal model surface of a fiducial marker to each pedestal of fiducial markers searched in image space. A head phantom is constructed to estimate the performance of the real-time automatic registration method under four fiducial configurations. The head phantom experimental results demonstrate that the real-time automatic registration method is more convenient, rapid, and accurate than the manual method. The time required for each registration is approximately 0.1 s. The automatic localization method precisely localizes the fiducial markers in image space. The averaged target registration error for the four configurations is approximately 0.7 mm. The automatic registration performance is independent of the positions relative to the tracking system and the movement of the patient during the operation.
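The sketch below illustrates paired-point rigid registration and a target registration error check with synthetic marker coordinates (placeholders for the automatically localized fiducials); it uses SciPy's Rotation.align_vectors and is an illustration, not the system's implementation.

```python
# Hedged sketch: fiducial (paired-point) rigid registration and TRE check.
import numpy as np
from scipy.spatial.transform import Rotation

def register_paired_points(patient_pts, image_pts):
    """Rigid transform (R, t) mapping patient space onto image space."""
    cp, ci = patient_pts.mean(axis=0), image_pts.mean(axis=0)
    rot, _ = Rotation.align_vectors(image_pts - ci, patient_pts - cp)
    R = rot.as_matrix()
    return R, ci - R @ cp

# Synthetic ground-truth pose and four fiducial markers with localization noise.
rng = np.random.default_rng(1)
R_true = Rotation.from_euler("xyz", [10, -5, 30], degrees=True).as_matrix()
t_true = np.array([12.0, -4.0, 7.5])
patient = rng.random((4, 3)) * 100.0
image = patient @ R_true.T + t_true + rng.normal(0, 0.2, (4, 3))

R, t = register_paired_points(patient, image)
target = np.array([50.0, 60.0, 40.0])            # hypothetical surgical target
tre = np.linalg.norm((target @ R.T + t) - (target @ R_true.T + t_true))
print(f"target registration error: {tre:.2f} mm")
```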
Imaging of near-Earth space plasma.
Mitchell, Cathryn N
2002-12-15
This paper describes the technique of imaging the ionosphere using tomographic principles. It reports on current developments and speculates on the future of this research area. Recent developments in computing and ionospheric measurement, together with the sharing of data via the internet, now allow us to envisage a time when high-resolution, real-time images and 'movies' of the ionosphere will be possible for radio communications planning. There is great potential to use such images for improving our understanding of the physical processes controlling the behaviour of the ionosphere. While real-time images and movies of the electron concentration are now almost possible, forecasting of ionospheric morphology is still in its early stages. It has become clear that the ionosphere cannot be considered as a system in isolation, and consequently new research projects to link together models of the solar-terrestrial system, including the Sun, solar wind, magnetosphere, ionosphere and thermosphere, are now being proposed. The prospect is now on the horizon of assimilating data from the entire solar-terrestrial system to produce a real-time computer model and 'space weather' forecast. The role of tomography in imaging beyond the ionosphere to include the whole near-Earth space-plasma realm is yet to be realized, and provides a challenging prospect for the future. Finally, exciting possibilities exist in applying such methods to image the atmospheres and ionospheres of other planets.
Direct-to-digital holography reduction of reference hologram noise and fourier space smearing
Voelkl, Edgar
2006-06-27
Systems and methods are described for reduction of reference hologram noise and reduction of Fourier space smearing, especially in the context of direct-to-digital holography (off-axis interferometry). A method of reducing reference hologram noise includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference image waves; and transforming the corresponding plurality of reference image waves into a reduced noise reference image wave. A method of reducing smearing in Fourier space includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference complex image waves; transforming the corresponding plurality of reference image waves into a reduced noise reference complex image wave; recording a hologram of an object; processing the hologram of the object into an object complex image wave; and dividing the complex image wave of the object by the reduced noise reference complex image wave to obtain a reduced smearing object complex image wave.
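A numpy sketch of the averaging-and-division steps described in the patent abstract is shown below; the off-axis sideband extraction that would produce each complex image wave is omitted, and the synthetic waves are placeholders.

```python
# Hedged sketch: average many reference complex image waves to reduce noise,
# then divide the object complex image wave by the averaged reference.
import numpy as np

def reduce_reference_noise(reference_waves):
    """Combine many reference complex image waves into one low-noise wave."""
    return np.mean(np.stack(reference_waves), axis=0)

def remove_smearing(object_wave, reference_waves, eps=1e-12):
    """Divide the object wave by the averaged (reduced-noise) reference wave."""
    ref = reduce_reference_noise(reference_waves)
    return object_wave / (ref + eps)

# Toy usage with synthetic complex waves standing in for processed holograms.
rng = np.random.default_rng(0)
shape = (256, 256)
ref_waves = [np.exp(1j * 0.3)
             + 0.05 * (rng.normal(size=shape) + 1j * rng.normal(size=shape))
             for _ in range(16)]
obj_wave = np.full(shape, np.exp(1j * 0.8)) * np.exp(1j * 0.3)  # object phase x carrier
corrected = remove_smearing(obj_wave, ref_waves)
print(np.angle(corrected.mean()))                # ~0.8 rad: the object phase remains
```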
Navigation for the new millennium: Autonomous navigation for Deep Space 1
NASA Technical Reports Server (NTRS)
Reidel, J. E.; Bhaskaran, S.; Synnott, S. P.; Desai, S. D.; Bollman, W. E.; Dumont, P. J.; Halsell, C. A.; Han, D.; Kennedy, B. M.; Null, G. W.;
1997-01-01
The autonomous optical navigation system technology for the Deep Space 1 (DS1) mission is reported on. The DS1 navigation system will be the first to use autonomous navigation in deep space. The system's tasks are to: perform interplanetary cruise orbit determination using images of distant asteroids; control and maintain the orbit of the spacecraft with an ion propulsion system and conventional thrusters; and perform late knowledge updates of target position during close flybys in order to facilitate high-quality data return from asteroid McAuliffe and comet West-Kohoutek-Ikemura. To accomplish these tasks, the following functions are required: picture planning; image processing; dynamical modeling and integration; planetary ephemeris and star catalog handling; orbit determination; data filtering and estimation; maneuver estimation; and spacecraft ephemeris updating. These systems and functions are described and preliminary performance data are presented.
HERCULES/MSI: a multispectral imager with geolocation for STS-70
NASA Astrophysics Data System (ADS)
Simi, Christopher G.; Kindsfather, Randy; Pickard, Henry; Howard, William, III; Norton, Mark C.; Dixon, Roberta
1995-11-01
A multispectral intensified CCD imager combined with a ring laser gyroscope based inertial measurement unit was flown on the Space Shuttle Discovery from July 13-22, 1995 (Space Transportation System Flight No. 70, STS-70). The camera includes a six-position filter wheel, a third-generation image intensifier, and a CCD camera. The camera is integrated with a laser gyroscope system that determines the ground position of the imagery to an accuracy of better than three nautical miles. The camera has two modes of operation: a panchromatic mode for high-magnification imaging [ground sample distance (GSD) of 4 m], or a multispectral mode consisting of six different user-selectable spectral ranges at reduced magnification (12 m GSD). This paper discusses the system hardware and technical trade-offs involved with camera optimization, and presents imagery observed during the shuttle mission.
Advanced Image Processing for NASA Applications
NASA Technical Reports Server (NTRS)
Le Moigne, Jacqueline
2007-01-01
The future of space exploration will involve cooperating fleets of spacecraft or sensor webs geared towards coordinated and optimal observation of Earth Science phenomena. The main advantage of such systems is to utilize multiple viewing angles as well as multiple spatial and spectral resolutions of sensors carried on multiple spacecraft but acting collaboratively as a single system. Within this framework, our research focuses on all areas related to sensing in collaborative environments, which means systems utilizing intracommunicating spatially distributed sensor pods or crafts being deployed to monitor or explore different environments. This talk will describe the general concept of sensing in collaborative environments, will give a brief overview of several technologies developed at NASA Goddard Space Flight Center in this area, and then will concentrate on specific image processing research related to that domain, specifically image registration and image fusion.
3D GeoWall Analysis System for Shuttle External Tank Foreign Object Debris Events
NASA Technical Reports Server (NTRS)
Brown, Richard; Navard, Andrew; Spruce, Joseph
2010-01-01
An analytical, advanced imaging method has been developed for the initial monitoring and identification of foam debris and similar anomalies that occur post-launch in reference to the space shuttle's external tank (ET). Remote sensing technologies have been used to perform image enhancement and analysis on high-resolution, true-color images collected with the DCS 760 Kodak digital camera located in the right umbilical well of the space shuttle. Improvements to the camera, using filters, have added sharpness/definition to the image sets; however, image review/analysis of the ET has been limited by the fact that the images acquired by umbilical cameras during launch are two-dimensional, and are usually nonreferenceable between frames due to rotation and translation of the ET as it falls away from the space shuttle. Use of stereo pairs of these images can provide strong visual indicators that immediately portray depth perception of damaged areas or movement of fragments between frames that is not perceivable in two-dimensional images. A stereoscopic image visualization system has been developed to allow 3D depth perception of stereo-aligned image pairs taken from in-flight umbilical and handheld digital shuttle cameras. This new system has been developed to augment and optimize existing 2D monitoring capabilities. Using this system, candidate sequential image pairs are identified for transformation into stereo viewing pairs. Image orientation is corrected using control points (similar points) between frames to place the two images in proper X-Y viewing perspective. The images are then imported into the WallView stereo viewing software package. The collected control points are used to generate a transformation equation that is used to re-project one image and effectively co-register it to the other image. The co-registered, oriented image pairs are imported into a WallView image set and are used as a 3D stereo analysis slide show. Multiple sequential image pairs can be used to allow forensic review of temporal phenomena between pairs. The observer, while wearing linear polarized glasses, is able to review image pairs in passive 3D stereo.
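The control-point co-registration step lends itself to a short numerical illustration. The sketch below, a hedged example with invented tie-point coordinates rather than the WallView workflow itself, estimates an affine transform from matched control points by least squares and reports the residual registration error.

```python
# Hedged sketch of the co-registration step described above: estimate an affine
# transform from manually picked control points (tie points) in two sequential
# frames, then map one frame's coordinates into the other's. Point values are
# illustrative only.
import numpy as np

# Matched control points: (x, y) in frame A and the same features in frame B.
pts_a = np.array([[10.0, 12.0], [200.0, 30.0], [50.0, 180.0], [220.0, 210.0]])
pts_b = np.array([[14.5, 10.2], [205.1, 25.8], [52.9, 179.4], [224.8, 205.9]])

# Solve [x_b, y_b] = [x_a, y_a, 1] @ affine in the least-squares sense.
design = np.hstack([pts_a, np.ones((len(pts_a), 1))])     # N x 3
affine, *_ = np.linalg.lstsq(design, pts_b, rcond=None)   # 3 x 2

def warp_points(pts):
    """Re-project frame-A coordinates into frame-B coordinates."""
    return np.hstack([pts, np.ones((len(pts), 1))]) @ affine

residual = np.linalg.norm(warp_points(pts_a) - pts_b, axis=1)
print("per-point registration residual (pixels):", residual.round(2))
```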
UniSat-5: a space-based optical system for space debris monitoring
NASA Astrophysics Data System (ADS)
Di Roberto, Riccardo; Cappelletti, Chantal
2012-07-01
Micro-satellite missions, thanks to the miniaturization of electronic components, now have a broader range of applications. The Gauss Group at the School of Aerospace Engineering has been a pioneer in educational micro-satellites, namely with the UNISAT and EDUSAT missions. Moreover, it has long been involved in space debris related studies, such as optical observations as well as mitigation. A new project is under development for a compact digital imaging system. Its purpose will be in situ observation of space debris on board the UniSat-5 micro-satellite. One of the key advantages of observing on orbit is that many atmospheric phenomena would be avoided, such as diffraction and EM absorption. Hence images would gain more contrast, and the solar spectral irradiance would be higher across the whole visible spectrum. Earlier limitations of power and instrument size prevented the inclusion of these payloads in educational satellite missions. The system is composed of an optical tube, a camera, C-band and S-band transceivers and two antennas. The system is independent from the rest of the spacecraft. The optical tube is a Schmidt-Cassegrain reflector, and the magnitude limit is 13. The camera is equipped with a panchromatic 5 Mpix sensor, capable of direct video streaming as well as local storage of recorded images. The transceivers operate on the ISM 2.4 GHz and 5 GHz Wi-Fi bands; they provide stand-alone communication capabilities to the payload, and the Unisat-5 OBDH can switch between the two. Both transceivers are connected to their respective custom-designed patch antennas. The ground segment consists of a high gain antenna dish, which will use the same transceiver flown on board the spacecraft as the feed, in order to establish a TCP/IP wireless link. Every component of this system is a consumer grade product. The price reduction of cutting-edge imaging technology therefore allows the use of professional instruments that, combined with the new wireless technology developed for commercially available RF equipment, make an affordable, stand-alone system for digital imaging in space possible. The space debris observation will work in tandem with the attitude determination system, as well as the orbit determination system. The UniSat-5 micro-satellite will be launched during Q4 2012 by a Kosmotras DNEPR LV, and it will be injected into a Sun Synchronous Orbit. UniSat-5 will be the first university satellite for space debris monitoring, and it will test the technology for the future design of a formation flight for on-orbit optical debris detection. This paper deals with the space debris observation system on board UniSat-5 and the observation strategies adopted for the proposed mission.
Ship Detection Using High Resolution Satellite Imagery and Space-Based AIS
NASA Astrophysics Data System (ADS)
Hannevik, Tonje Nanette; Skauen, Andreas N.; Olsen, R. B.
2013-03-01
This paper presents a trial carried out in the Malangen area close to Tromsø city in the north of Norway in September 2010. High resolution Synthetic Aperture Radar (SAR) images from RADARSAT-2 were used to analyse how SAR images and cooperative reporting can be combined. Data from the Automatic Identification System, both land-based and space-based, have been used to identify detected vessels in the SAR images. The paper presents results of ship detection in high resolution RADARSAT-2 Standard Quad-Pol images, and how these results together with land-based and space-based AIS can be used. Some examples of tracking of vessels are also shown.
Autonomous control systems: applications to remote sensing and image processing
NASA Astrophysics Data System (ADS)
Jamshidi, Mohammad
2001-11-01
One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear, interconnected, and information on the system is so uncertain that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, integrated manufacturing plants, the Hubble Telescope, the International Space Station, etc. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems the size of the soft computing control architecture will be nearly infinite. In this paper new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and enhancement of analog and digital images.
Rod-to-rod spacing illuminating device
Fodor, G.; Gaal, P.S.
1984-03-14
A system for obtaining an image of an object includes at least one light source having an incandescent filament. An image of the filament is projected onto an object to be observed. Using light reflected from the object, an image of the object is generated. Such a system may employ a television camera to generate the image, and is especially suited for remote observation of objects.
Backscatter X-Ray Development for Space Vehicle Thermal Protection Systems
NASA Astrophysics Data System (ADS)
Bartha, Bence B.; Hope, Dale; Vona, Paul; Born, Martin; Corak, Tony
2011-06-01
The Backscatter X-Ray (BSX) imaging technique is used for various single-sided inspection purposes. Previously developed BSX techniques for spray-on-foam insulation (SOFI) have been used for detecting defects in Space Shuttle External Tank foam insulation. The developed BSX hardware and techniques are currently being enhanced to advance Non-Destructive Evaluation (NDE) methods for future space vehicle applications. Various Thermal Protection System (TPS) materials were inspected using the enhanced BSX imaging techniques, investigating the capability of the method to detect voids and other discontinuities at various locations within each material. Calibration standards were developed for the TPS materials in order to characterize and develop enhanced BSX inspection capabilities. The ability of the BSX technique to detect both manufactured and natural defects was also studied and compared to through-transmission x-ray techniques. The energy of the x-ray, source to object distance, angle of x-ray, focal spot size and x-ray detector configurations were parameters playing a significant role in the sensitivity of the BSX technique to image various materials and defects. Image processing of the results also showed a significant increase in the sensitivity of the technique. The experimental results showed BSX to be a viable inspection technique for space vehicle TPS systems.
SPACE WARPS - I. Crowdsourcing the discovery of gravitational lenses
NASA Astrophysics Data System (ADS)
Marshall, Philip J.; Verma, Aprajita; More, Anupreeta; Davis, Christopher P.; More, Surhud; Kapadia, Amit; Parrish, Michael; Snyder, Chris; Wilcox, Julianne; Baeten, Elisabeth; Macmillan, Christine; Cornen, Claude; Baumer, Michael; Simpson, Edwin; Lintott, Chris J.; Miller, David; Paget, Edward; Simpson, Robert; Smith, Arfon M.; Küng, Rafael; Saha, Prasenjit; Collett, Thomas E.
2016-01-01
We describe SPACE WARPS, a novel gravitational lens discovery service that yields samples of high purity and completeness through crowdsourced visual inspection. Carefully produced colour composite images are displayed to volunteers via a web-based classification interface, which records their estimates of the positions of candidate lensed features. Images of simulated lenses, as well as real images which lack lenses, are inserted into the image stream at random intervals; this training set is used to give the volunteers instantaneous feedback on their performance, as well as to calibrate a model of the system that provides dynamical updates to the probability that a classified image contains a lens. Low-probability systems are retired from the site periodically, concentrating the sample towards a set of lens candidates. Having divided 160 deg^2 of Canada-France-Hawaii Telescope Legacy Survey imaging into some 430 000 overlapping 82 by 82 arcsec tiles and displaying them on the site, we were joined by around 37 000 volunteers who contributed 11 million image classifications over the course of eight months. This stage 1 search reduced the sample to 3381 images containing candidates; these were then refined in stage 2 to yield a sample that we expect to be over 90 per cent complete and 30 per cent pure, based on our analysis of the volunteers' performance on training images. We comment on the scalability of the SPACE WARPS system to the wide field survey era, based on our projection that searches of 10^5 images could be performed by a crowd of 10^5 volunteers in 6 d.
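The training-image calibration can be illustrated with a toy Bayesian update; the sketch below is a conceptual stand-in for the probability model, not the authors' exact formulation, and the volunteer skill values and prior are invented.

```python
# Illustrative sketch (not the authors' exact model) of how training images let
# a crowdsourcing system update the probability that a subject contains a lens.
# Each volunteer is summarized by P("LENS" | lens) and P("NOT" | dud),
# estimated from their performance on the inserted training images.
def update_lens_probability(p_lens, said_lens, p_hit, p_reject):
    """One Bayesian update of P(lens) after a single volunteer classification.

    p_hit    = volunteer's P(classify LENS | image is a lens)
    p_reject = volunteer's P(classify NOT  | image is a dud)
    """
    if said_lens:
        likelihood_lens, likelihood_dud = p_hit, 1.0 - p_reject
    else:
        likelihood_lens, likelihood_dud = 1.0 - p_hit, p_reject
    numerator = likelihood_lens * p_lens
    return numerator / (numerator + likelihood_dud * (1.0 - p_lens))

# A subject starts at an assumed survey prior and is classified by three volunteers.
p = 1e-3  # assumed prior probability that a random tile contains a lens
for said_lens, p_hit, p_reject in [(True, 0.8, 0.9), (True, 0.6, 0.7), (False, 0.9, 0.95)]:
    p = update_lens_probability(p, said_lens, p_hit, p_reject)
print(f"posterior lens probability: {p:.4g}")  # retire the subject if p stays low
```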
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalay, Berfin; Demiralp, Metin
2014-10-06
The expectation value definitions over an extended space from the considered Hilbert space of the system under consideration are given in another paper of the second author in this symposium. There, in that paper, the conceptuality rather than the specification is emphasized. This work uses that conceptuality to investigate the time evolutions of the position-related operators' expectation values, not in the standard meaning but rather in a new version of the definition, over not the original Hilbert space but the space obtained by extensions via introducing the images of the given initial wave packet under the positive integer powers of the system Hamiltonian. These images may not reside in the same space as the initial wave packet when certain singularities appear in the structure of the system Hamiltonian. This may break down the existence of the integrals in the definitions of the expectation values. The cure is the use of basis functions in the abovementioned extended space and the sandwiching of the target operator, whose expectation value is in question, by an appropriately chosen operator guaranteeing the existence of the relevant integrals. The work focuses specifically on hydrogen-like quantum systems whose Hamiltonians contain a polar singularity at the origin.
NASA Astrophysics Data System (ADS)
Culp, Robert D.; Lewis, Robert A.
1989-05-01
Papers are presented on advances in guidance, navigation, and control; guidance and control storyboard displays; attitude referenced pointing systems; guidance, navigation, and control for specialized missions; and recent experiences. Other topics of importance to support the application of guidance and control to the space community include concept design and performance test of a magnetically suspended single-gimbal control moment gyro; design, fabrication and test of a prototype double gimbal control moment gyroscope for the NASA Space Station; the Circumstellar Imaging Telescope Image Motion Compensation System providing ultra-precise control on the Space Station platform; pinpointing landing concepts for the Mars Rover Sample Return mission; and space missile guidance and control simulation and flight testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pittman, Jeffery P.; Cassidy, Stephen R.; Mosey, Whitney LC
2013-07-31
Pacific Northwest National Laboratory (PNNL) and the Pacific Northwest Site Office (PNSO) have recently completed an effort to identify the current state of the campus and gaps that exist with regards to space needs, facilities and infrastructure. This effort has been used to establish a campus strategy to ensure PNNL is ready to further the United States (U.S.) Department of Energy (DOE) mission. Ten-year business projections and the impacts on space needs were assessed and incorporated into the long-term facility plans. In identifying/quantifying the space needs for PNNL, the following categories were addressed: Multi-purpose Programmatic (wet chemistry and imaging laboratory space), Strategic (Systems Engineering and Computation Analytics, and Collaboration space), Remediation (space to offset the loss of the Research Technology Laboratory [RTL] Complex due to decontamination and demolition), and Optimization (the exit of older and less cost-effective facilities). The findings of the space assessment indicate a need for wet chemistry space, imaging space, and strategic space needs associated with systems engineering and collaboration space.
SPECT System Optimization Against A Discrete Parameter Space
Meng, L. J.; Li, N.
2013-01-01
In this paper, we present an analytical approach for optimizing the design of a static SPECT system or optimizing the sampling strategy with a variable/adaptive SPECT imaging hardware against an arbitrarily given set of system parameters. This approach has three key aspects. First, it is designed to operate over a discretized system parameter space. Second, we have introduced an artificial concept of virtual detector as the basic building block of an imaging system. With a SPECT system described as a collection of the virtual detectors, one can convert the task of system optimization into a process of finding the optimum imaging time distribution (ITD) across all virtual detectors. Third, the optimization problem (finding the optimum ITD) could be solved with a block-iterative approach or other non-linear optimization algorithms. In essence, the resultant optimum ITD could provide a quantitative measure of the relative importance (or effectiveness) of the virtual detectors and help to identify the system configuration or sampling strategy that leads to an optimum imaging performance. Although we are using SPECT imaging as a platform to demonstrate the system optimization strategy, this development also provides a useful framework for system optimization problems in other modalities, such as positron emission tomography (PET) and X-ray computed tomography (CT) [1, 2]. PMID:23587609
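A minimal sketch of the imaging-time-distribution idea, assuming an invented sensitivity matrix and a simple information-style figure of merit (not the paper's exact objective or block-iterative algorithm):

```python
# Hedged sketch of optimizing an imaging-time distribution (ITD) across
# "virtual detectors": allocate a fixed total imaging time T over the detectors
# to maximize a simple information-style figure of merit. The figure of merit
# and the multiplicative update below are illustrative stand-ins, not the
# authors' exact formulation.
import numpy as np

rng = np.random.default_rng(1)
n_detectors, n_voxels, total_time = 32, 64, 100.0
S = rng.random((n_detectors, n_voxels))     # sensitivity of detector i to voxel j

t = np.full(n_detectors, total_time / n_detectors)  # start from a uniform ITD
for _ in range(200):
    coverage = t @ S + 1e-9                 # information accumulated per voxel
    grad = S @ (1.0 / coverage)             # d/dt_i of sum_j log(coverage_j)
    t *= grad                               # multiplicative (block-iterative-style) update
    t *= total_time / t.sum()               # project back onto the total-time budget

# Detectors with large optimum t are the "important" ones for this object/system.
print("top-5 virtual detectors:", np.argsort(t)[-5:])
```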
Space Radar Image of Baikal Lake, Russia
1999-05-01
This is an X-band black-and-white image of the forests east of Lake Baikal in the Jablonowy Mountains of Russia. The image is centered at 52.5 degrees north latitude and 116 degrees east longitude near the mining town of Bukatschatscha. This image was acquired by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar aboard the space shuttle Endeavour on October 4, 1994, during the second flight of the spaceborne radar. This area is part of an international research project known as the Taiga Aerospace Investigation using Geographic Information System Applications. http://photojournal.jpl.nasa.gov/catalog/PIA01754
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. Armando Oliu, Final Inspection Team lead for the Shuttle program, speaks to reporters about the aid the Image Analysis Lab is giving the FBI in a kidnapping case. Oliu oversees the image lab that is using an advanced SGI TP9500 data management system to review the tape of the kidnapping in progress in Sarasota, Fla. KSC installed the new $3.2 million system in preparation for Return to Flight of the Space Shuttle fleet. The lab is studying the Sarasota kidnapping video to provide any new information possible to law enforcement officers. KSC is joining NASA's Marshall Space Flight Center in Alabama in reviewing the tape.
Direct imaging of extra-solar planets with stationary occultations viewed by a space telescope
NASA Technical Reports Server (NTRS)
Elliot, J. L.
1978-01-01
The use of a telescope in space to detect planets outside the solar system by means of imaging at optical wavelengths is discussed. If the 'black' limb of the moon is utilized as an occulting edge, a hypothetical Jupiter-Sun system could be detected at a distance as great as 10 pc, and a signal-to-noise ratio of 9 could be achieved in less than 20 min with a 2.4 m telescope in space. An orbit for the telescope is proposed; this orbit could achieve a stationary lunar occultation of any star for a period of nearly two hours.
NASA Astrophysics Data System (ADS)
Wei, Liqing; Xiao, Xizhong; Wang, Yueming; Zhuang, Xiaoqiong; Wang, Jianyu
2017-11-01
Space-borne hyperspectral imagery is an important tool for earth sciences and industrial applications. Higher spatial and spectral resolutions have been sought persistently, although they come at the cost of more power, larger volume and greater weight in a space-borne spectral imager design. To miniaturize the hyperspectral imager and optimize the spectral splitting method, several approaches are compared in this paper. A spectral time delay integration (TDI) method with a high-transmittance Integrated Stepwise Filter (ISF) is proposed. With this method, an ISF imaging spectrometer with TDI can achieve higher system sensitivity than a traditional prism/grating imaging spectrometer. In addition, the ISF imaging spectrometer performs well in suppressing the infrared background radiation produced by the instrument. A compact shortwave infrared (SWIR) hyperspectral imager prototype based on HgCdTe, covering the spectral range of 2.0-2.5 μm with 6 TDI stages, was designed and integrated. To investigate the performance of the ISF spectrometer, a method to derive the optimal blocking band curve of the ISF is introduced, along with known error characteristics. To assess the spectral performance of the ISF system, a new spectral calibration based on blackbody radiation with temperature scanning is proposed. The results of the imaging experiment showed the merits of the ISF. The ISF has great application prospects in the field of high-sensitivity, high-resolution space-borne hyperspectral imagery.
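The TDI principle can be shown with a few lines of code; the sketch below is generic along-track TDI with a toy scene and a cyclic shift standing in for image motion, not the ISF instrument's processing chain.

```python
# Minimal illustration of time delay integration (TDI): successive frames are
# shifted by the known along-track image motion and summed, so signal grows
# linearly with the number of stages while uncorrelated noise grows as sqrt(N).
import numpy as np

rng = np.random.default_rng(2)
rows, cols, stages = 64, 64, 6
scene = rng.random((rows, cols))

# Each stage sees the same ground strip displaced by one row; a cyclic shift
# (np.roll) stands in for the along-track image motion in this toy example.
frames = [np.roll(scene, k, axis=0) + 0.2 * rng.standard_normal((rows, cols))
          for k in range(stages)]

# TDI: undo the known per-stage shift, then accumulate the stages.
tdi = sum(np.roll(frame, -k, axis=0) for k, frame in enumerate(frames)) / stages

print("noise std, single stage vs {}-stage TDI:".format(stages),
      round(float((frames[0] - scene).std()), 3),
      round(float((tdi - scene).std()), 3))
```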
Mars Descent Imager for Curiosity
2010-07-19
A pocketknife provides scale for this image of the Mars Descent Imager camera; the camera will fly on the Curiosity rover of NASA Mars Science Laboratory mission. Malin Space Science Systems, San Diego, Calif., supplied the camera for the mission.
THE PANCHROMATIC STARBURST IRREGULAR DWARF SURVEY (STARBIRDS): OBSERVATIONS AND DATA ARCHIVE
DOE Office of Scientific and Technical Information (OSTI.GOV)
McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D., E-mail: kmcquinn@astro.umn.edu
2015-06-22
Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux-calibrated, background-subtracted images that are registered to the same world coordinate system. Additionally, a set of images is available that are all cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging.
Multidimensionally encoded magnetic resonance imaging.
Lin, Fa-Hsuan
2013-07-01
Magnetic resonance imaging (MRI) typically achieves spatial encoding by measuring the projection of a q-dimensional object over q-dimensional spatial bases created by linear spatial encoding magnetic fields (SEMs). Recently, imaging strategies using nonlinear SEMs have demonstrated potential advantages for reconstructing images with higher spatiotemporal resolution and reducing peripheral nerve stimulation. In practice, nonlinear SEMs and linear SEMs can be used jointly to further improve the image reconstruction performance. Here, we propose the multidimensionally encoded (MDE) MRI to map a q-dimensional object onto a p-dimensional encoding space where p > q. MDE MRI is a theoretical framework linking imaging strategies using linear and nonlinear SEMs. Using a system of eight surface SEM coils with an eight-channel radiofrequency coil array, we demonstrate the five-dimensional MDE MRI for a two-dimensional object as a further generalization of PatLoc imaging and O-space imaging. We also present a method of optimizing spatial bases in MDE MRI. Results show that MDE MRI with a higher dimensional encoding space can reconstruct images more efficiently and with a smaller reconstruction error when the k-space sampling distribution and the number of samples are controlled. Copyright © 2012 Wiley Periodicals, Inc.
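A toy version of the encoding idea helps fix notation: a one-dimensional object (q = 1) is encoded with a linear and a quadratic spatial encoding field (p = 2) and recovered by least squares. The field shapes and sampling pattern below are invented for the example, not the PatLoc or O-space configurations.

```python
# Toy sketch of multidimensional encoding: a 1-D object (q = 1) is encoded with
# both a linear and a quadratic spatial encoding field (p = 2), and the image is
# recovered by least squares. Field shapes and sampling are illustrative only.
import numpy as np

n = 64
x = np.linspace(-1, 1, n)
obj = np.exp(-((x - 0.3) / 0.15) ** 2) + 0.5 * np.exp(-((x + 0.4) / 0.1) ** 2)

# Encoding "moments" for the linear (x) and nonlinear (x^2) SEMs.
k_lin = 2 * np.pi * np.arange(-16, 16)            # 32 linear-phase samples
k_quad = 2 * np.pi * np.arange(-8, 8)             # 16 quadratic-phase samples
K1, K2 = [a.ravel() for a in np.meshgrid(k_lin, k_quad)]

# Each row of E is one sample of the 2-D encoding space applied to the 1-D object.
E = np.exp(-1j * (np.outer(K1, x) + np.outer(K2, x ** 2)))
signal = E @ obj

recon, *_ = np.linalg.lstsq(E, signal, rcond=None)
print("max reconstruction error:", np.abs(recon - obj).max())
```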
Thermal/vacuum measurements of the Herschel space telescope by close-range photogrammetry
NASA Astrophysics Data System (ADS)
Parian, J. Amiri; Cozzani, A.; Appolloni, M.; Casarosa, G.
2017-11-01
In the framework of the development of a videogrammetric system to be used in thermal vacuum chambers at the European Space Research and Technology Centre (ESTEC) and other sites across Europe, the design of a network using micro-cameras was specified by the European Space Agency (ESA)-ESTEC. The selected test set-up is the photogrammetric test of the Herschel Satellite Flight Model in the ESTEC Large Space Simulator. The photogrammetric system will be used to verify the Herschel Telescope alignment and Telescope positioning with respect to the Cryostat Vacuum Vessel (CVV) inside the Large Space Simulator during Thermal-Vacuum/Thermal-Balance test phases. We designed a close-range photogrammetric network by heuristic simulation and a videogrammetric system with an overall accuracy of 1:100,000. A semi-automated image acquisition system, able to work at low temperatures (-170°C) in order to acquire images according to the designed network, has been constructed by ESA-ESTEC. In this paper we present the videogrammetric system and sub-systems and the results of real measurements with a representative setup, similar to the Herschel spacecraft set-up, realized in the ESTEC Test Centre.
Characterization of low-mass deformable mirrors and ASIC drivers for high-contrast imaging
NASA Astrophysics Data System (ADS)
Mejia Prada, Camilo; Yao, Li; Wu, Yuqian; Roberts, Lewis C.; Shelton, Chris; Wu, Xingtao
2017-09-01
The development of compact, high performance Deformable Mirrors (DMs) is one of the most important technological challenges for high-contrast imaging on space missions. Microscale Inc. has fabricated and characterized piezoelectric stack actuator deformable mirrors (PZT-DMs) and Application-Specific Integrated Circuit (ASIC) drivers for direct integration. The DM-ASIC system is designed to eliminate almost all cables, enabling a very compact optical system with low mass and low power consumption. We report on the optical tests used to evaluate the performance of the DM and ASIC units. We also compare the results to the requirements for space-based high-contrast imaging of exoplanets.
Dependence of quantitative accuracy of CT perfusion imaging on system parameters
NASA Astrophysics Data System (ADS)
Li, Ke; Chen, Guang-Hong
2017-03-01
Deconvolution is a popular method to calculate parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress noise associated with the process. These additional complexities confound understanding of the deconvolution-based CTP imaging system and of how its quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need for answering this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly during an emergent clinical situation (e.g. diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters to the CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments of CTP imaging technology for better quantification accuracy and lower radiation dose.
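The deconvolution step that this analysis targets can be sketched directly; the example below builds the AIF convolution matrix, inverts it with truncated SVD (the truncation threshold standing in for the regularization strength), and reads CBF from the recovered flow-scaled residue function. All curves and values are synthetic.

```python
# Hedged sketch of deconvolution-based CT perfusion: build the convolution
# matrix from the arterial input function (AIF), invert it with truncated SVD,
# and read CBF off the peak of the recovered flow-scaled residue function.
import numpy as np

dt, n = 1.0, 60                                  # 1 s sampling, 60 time points
t = np.arange(n) * dt
aif = (t ** 3) * np.exp(-t / 1.5); aif /= aif.max()         # synthetic AIF
residue = np.exp(-t / 4.0)                                  # exponential residue, MTT = 4 s
cbf_true = 0.6                                              # arbitrary flow value
tissue = cbf_true * dt * np.convolve(aif, residue)[:n]      # tissue curve C(t)
tissue += 0.01 * np.random.default_rng(3).standard_normal(n)

# Lower-triangular (Toeplitz) convolution matrix built from the AIF.
A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])

# Truncated-SVD pseudo-inverse; lam plays the role of the regularization strength.
U, s, Vt = np.linalg.svd(A)
lam = 0.1 * s.max()
s_inv = np.where(s > lam, 1.0 / s, 0.0)
k_est = Vt.T @ (s_inv * (U.T @ tissue))          # flow-scaled residue function
print("estimated CBF:", k_est.max().round(3), "true:", cbf_true)
```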
Hard X-ray imaging facility for space shuttle: A scientific and conceptual engineering study
NASA Technical Reports Server (NTRS)
Peterson, L. E.; Hudson, H. S.; Hurford, G.; Schneible, D.
1976-01-01
A shuttle-accommodated instrument for imaging hard X-rays in the study of nonthermal particles and high temperature particles in various solar and cosmic phenomena was defined and its feasibility demonstrated. The imaging system configuration is described as well as the electronics, aspect systems, mechanical and thermal properties and the ground support equipment.
Yothers, Mitchell P; Browder, Aaron E; Bumm, Lloyd A
2017-01-01
We have developed a real-space method to correct distortion due to thermal drift and piezoelectric actuator nonlinearities on scanning tunneling microscope images using Matlab. The method uses the known structures typically present in high-resolution atomic and molecularly resolved images as an internal standard. Each image feature (atom or molecule) is first identified in the image. The locations of each feature's nearest neighbors are used to measure the local distortion at that location. The local distortion map across the image is simultaneously fit to our distortion model, which includes thermal drift in addition to piezoelectric actuator hysteresis and creep. The image coordinates of the features and image pixels are corrected using an inverse transform from the distortion model. We call this technique the thermal-drift, hysteresis, and creep transform. Performing the correction in real space allows defects, domain boundaries, and step edges to be excluded with a spatial mask. Additional real-space image analyses are now possible with these corrected images. Using graphite(0001) as a model system, we show lattice fitting to the corrected image, averaged unit cell images, and symmetry-averaged unit cell images. Statistical analysis of the distribution of the image features around their best-fit lattice sites measures the aggregate noise in the image, which can be expressed as feature confidence ellipsoids.
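A stripped-down version of the correction idea is sketched below: measured feature positions on a hexagonal lattice are regressed against their ideal sites to recover a single global linear distortion, which is then inverted. The published method additionally models drift, hysteresis and creep; the lattice and distortion values here are invented.

```python
# Simplified sketch of the correction idea: regress measured feature positions
# against their ideal lattice sites to recover a global linear distortion, then
# apply the inverse transform to the coordinates. (The published method also
# models thermal drift, hysteresis, and creep; this toy fits only one affine term.)
import numpy as np

rng = np.random.default_rng(4)
a = 0.246                                            # graphite lattice constant, nm
i, j = np.meshgrid(np.arange(10), np.arange(10))
ideal = np.stack([a * (i + 0.5 * j), a * (np.sqrt(3) / 2) * j], axis=-1).reshape(-1, 2)

true_distortion = np.array([[1.03, 0.02], [-0.01, 0.97]])   # shear + scale errors
measured = ideal @ true_distortion.T + 0.002 * rng.standard_normal(ideal.shape)

# Least-squares fit of the distortion matrix, then invert it to correct coordinates.
D, *_ = np.linalg.lstsq(ideal, measured, rcond=None)        # measured ~ ideal @ D
corrected = measured @ np.linalg.inv(D)

print("residual RMS after correction (nm):",
      round(float(np.sqrt(((corrected - ideal) ** 2).mean())), 4))
```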
Effect of multiple circular holes Fraunhofer diffraction for the infrared optical imaging
NASA Astrophysics Data System (ADS)
Lu, Chunlian; Lv, He; Cao, Yang; Cai, Zhisong; Tan, Xiaojun
2014-11-01
With the development of infrared optics, infrared optical imaging systems play an increasingly important role in modern optical imaging systems. Infrared optical imaging is used in industry, agriculture, medicine, the military and transportation. However, for infrared optical imaging systems that are exposed for a long time, contamination will affect the imaging. When contamination accumulates on the lens surface of the optical system, it affects diffraction. The contaminated lens can be treated as the complement of a screen of multiple circular holes undergoing Fraunhofer diffraction. According to Babinet's principle, the diffraction of the imaging system can then be obtained. Therefore, by studying multiple-circular-hole Fraunhofer diffraction, conclusions can be drawn about its effect on infrared imaging. This paper mainly studies the effect of multiple-circular-hole Fraunhofer diffraction on optical imaging. First, we introduce the theory of Fraunhofer diffraction and the Point Spread Function. The Point Spread Function is a basic tool to evaluate the image quality of an optical system, and Fraunhofer diffraction will affect it. Then, the results of multiple-circular-hole Fraunhofer diffraction are given for different hole sizes and hole spacings. We choose hole sizes from 0.1 mm to 1 mm and hole spacings from 0.3 mm to 0.8 mm. The infrared wavebands for optical imaging are chosen from 1 μm to 5 μm. We use MATLAB to simulate the light intensity distribution of multiple-circular-hole Fraunhofer diffraction. Finally, three-dimensional diffraction maps of light intensity are given for comparison.
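The far-field calculation described here is straightforward to reproduce numerically: the Fraunhofer intensity is proportional to the squared magnitude of the Fourier transform of the aperture transmission, so a grid of circular holes can be evaluated with a 2-D FFT. The hole size, spacing and wavelength in the sketch are arbitrary examples, and the sketch is in Python rather than MATLAB.

```python
# Illustrative numerical version of the calculation described above: the
# Fraunhofer intensity of an aperture is proportional to |FT(transmission)|^2,
# so a 3 x 3 grid of circular holes can be evaluated with a 2-D FFT.
import numpy as np

n, pitch = 1024, 5e-6                         # grid points and sample spacing (m)
x = (np.arange(n) - n / 2) * pitch
X, Y = np.meshgrid(x, x)

hole_radius, hole_spacing = 0.2e-3, 0.5e-3    # e.g. 0.2 mm holes on a 0.5 mm pitch
aperture = np.zeros((n, n))
for cx in (-hole_spacing, 0.0, hole_spacing):
    for cy in (-hole_spacing, 0.0, hole_spacing):
        aperture[(X - cx) ** 2 + (Y - cy) ** 2 <= hole_radius ** 2] = 1.0

# Fraunhofer pattern ~ |FT(aperture)|^2; fftshift centres the zero frequency.
far_field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
intensity = np.abs(far_field) ** 2
intensity /= intensity.max()

wavelength = 3e-6                              # 3 um, mid-infrared example
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=pitch))
angles = wavelength * freqs                    # small-angle diffraction angles (rad)
print("angle per FFT bin (rad):", float(angles[n // 2 + 1] - angles[n // 2]))
print("expected grating-order angle lambda/spacing (rad):", wavelength / hole_spacing)
```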
Readying ISIM for its First Thermal Vacuum Test
2017-12-08
Engineers work with the Integrated Science Instrument Module for the James Webb Space Telescope inside the thermal vacuum chamber at NASA's Goddard Space Flight Center in Greenbelt, Md. The ISIM and the ISIM System Integration Fixture that holds the ISIM Electronics Compartment was recently lifted inside the chamber for its first thermal vacuum test. In this image one of the ISIM's many protective blanket layers is pulled back. The blankets will be removed during testing. Image credit: NASA/Chris Gunn
NASA Technical Reports Server (NTRS)
Stapelfeldt, Karl R.; Brenner, Michael P.; Warfield, Keith R.; Dekens, Frank G.; Belikov, Ruslan; Brugarolas, Paul B.; Bryden, Geoffrey; Cahoy, Kerri L.; Chakrabarti, Supriya; Dubovitsky, Serge;
2014-01-01
"Exo-C" is NASA's first community study of a modest aperture space telescope designed for high contrast observations of exoplanetary systems. The mission will be capable of taking optical spectra of nearby exoplanets in reflected light, discover previously undetected planets, and imaging structure in a large sample of circumstellar disks. It will obtain unique science results on planets down to super-Earth sizes and serve as a technology pathfinder toward an eventual flagship-class mission to find and characterize habitable exoplanets. We present the mission/payload design and highlight steps to reduce mission cost/risk relative to previous mission concepts. At the study conclusion in 2015, NASA will evaluate it for potential development at the end of this decade. Keywords: Exoplanets, high contrast imaging, optical astronomy, space mission concepts
Ground testing of prototype hardware and processing algorithms for a Wide Area Space Surveillance System (WASSS)
Neil Goldstein, Rainer A. ...
2013-09-01
... at Magdalena Ridge Observatory using the prototype Wide Area Space Surveillance System (WASSS) camera, which has a 4 x 60 field-of-view, < 0.05 ... objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and a Principal Component Analysis based image ...
Evaluation and testing of image quality of the Space Solar Extreme Ultraviolet Telescope
NASA Astrophysics Data System (ADS)
Peng, Jilong; Yi, Zhong; Zhou, Shuhong; Yu, Qian; Hou, Yinlong; Wang, Shanshan
2018-01-01
For the space solar extreme ultraviolet telescope, the star point test cannot be performed in the X-ray band (the 19.5 nm band) because no light source is bright enough. In this paper, the point spread function of the optical system is calculated to evaluate the imaging performance of the telescope system. Combined with the actual surface errors from processing, such as small-grinding-head processing and magnetorheological processing, the optical design software Zemax and the data analysis software MATLAB are used to directly calculate the system point spread function of the space solar extreme ultraviolet telescope. MATLAB code is written to generate the required surface error grid data. These surface error data are loaded onto the specified surface of the telescope system by using Dynamic Data Exchange (DDE) communication, which connects Zemax and MATLAB. Because different processing methods lead to surface errors of different size, distribution and spatial frequency, their impact on imaging also differs. Therefore, the characteristics of the surface error of different machining methods are studied. Combined with each error's position in the optical system and simulation of its influence on image quality, this is of great significance for a reasonable choice of processing technology. Additionally, we have also analyzed the relationship between the surface error and the image quality evaluation. In order to ensure that the final processed mirror meets the image quality requirements, one or several evaluation methods for the surface error should be chosen according to its spatial frequency characteristics.
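The core computation, a point spread function derived from a surface error map, can be reproduced in a standalone sketch that skips Zemax and the DDE link: the surface error is converted to wavefront error, a complex pupil function is formed, and the PSF is its squared Fourier transform. The pupil, error map and wavelength below are illustrative only.

```python
# Hedged sketch of computing a PSF directly from a surface error map: convert
# surface error to wavefront error, form the complex pupil function, and take
# |FFT|^2. This mirrors the general procedure described above without Zemax.
import numpy as np

n, wavelength = 512, 19.5e-9                    # EUV band of the telescope (m)
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
pupil = (x ** 2 + y ** 2) <= 1.0                # unobscured circular pupil (toy case)

# Example surface error: a low-order ripple with 1 nm amplitude, standing in
# for a residual map from small-grinding-head or magnetorheological processing.
surface_error = 1e-9 * np.cos(6 * np.pi * x) * np.cos(4 * np.pi * y)
wavefront_error = 2.0 * surface_error           # reflection doubles the surface error

field = pupil * np.exp(1j * 2 * np.pi * wavefront_error / wavelength)
psf = np.abs(np.fft.fftshift(np.fft.fft2(field, s=(2 * n, 2 * n)))) ** 2
psf_ideal = np.abs(np.fft.fft2(pupil.astype(complex), s=(2 * n, 2 * n))) ** 2
print("approximate Strehl ratio:", round(float(psf.max() / psf_ideal.max()), 3))
```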
Wave-Optics Analysis of Pupil Imaging
NASA Technical Reports Server (NTRS)
Dean, Bruce H.; Bos, Brent J.
2006-01-01
Pupil imaging performance is analyzed from the perspective of physical optics. A multi-plane diffraction model is constructed by propagating the scalar electromagnetic field, surface by surface, along the optical path comprising the pupil imaging optical system. Modeling results are compared with pupil images collected in the laboratory. The experimental setup, although generic for pupil imaging systems in general, has application to the James Webb Space Telescope (JWST) optical system characterization where the pupil images are used as a constraint to the wavefront sensing and control process. Practical design considerations follow from the diffraction modeling which are discussed in the context of the JWST Observatory.
Utilization of the Space Vision System as an Augmented Reality System For Mission Operations
NASA Technical Reports Server (NTRS)
Maida, James C.; Bowen, Charles
2003-01-01
Augmented reality is a technique whereby computer generated images are superimposed on live images for visual enhancement. Augmented reality can also be characterized as dynamic overlays when computer generated images are registered with moving objects in a live image. This technique has been successfully implemented, with low to medium levels of registration precision, in an NRA funded project entitled, "Improving Human Task Performance with Luminance Images and Dynamic Overlays". Future research is already being planned to also utilize a laboratory-based system where more extensive subject testing can be performed. However successful this might be, the problem will still be whether such a technology can be used with flight hardware. To answer this question, the Canadian Space Vision System (SVS) will be tested as an augmented reality system capable of improving human performance where the operation requires indirect viewing. This system has already been certified for flight and is currently flown on each shuttle mission for station assembly. Successful development and utilization of this system in a ground-based experiment will expand its utilization for on-orbit mission operations. Current research and development regarding the use of augmented reality technology is being simulated using ground-based equipment. This is an appropriate approach for development of symbology (graphics and annotation) optimal for human performance and for development of optimal image registration techniques. It is anticipated that this technology will become more pervasive as it matures. Because we know what and where almost everything is on ISS, this reduces the registration problem and improves the computer model of that reality, making augmented reality an attractive tool, provided we know how to use it. This is the basis for current research in this area. However, there is a missing element to this process. It is the link from this research to the current ISS video system and to flight hardware capable of utilizing this technology. This is the basis for this proposed Space Human Factors Engineering project, the determination of the display symbology within the performance limits of the Space Vision System that will objectively improve human performance. This utilization of existing flight hardware will greatly reduce the costs of implementation for flight. Besides being used onboard shuttle and space station and as a ground-based system for mission operational support, it also has great potential for science and medical training and diagnostics, remote learning, team learning, video/media conferencing, and educational outreach.
A study of photon propagation in free-space based on hybrid radiosity-radiance theorem.
Chen, Xueli; Gao, Xinbo; Qu, Xiaochao; Liang, Jimin; Wang, Lin; Yang, Da'an; Garofalakis, Anikitos; Ripoll, Jorge; Tian, Jie
2009-08-31
Noncontact optical imaging has attracted increasing attention in recent years due to its significant advantages in detection sensitivity, spatial resolution, image quality and system simplicity compared with contact measurement. However, simulating photon transport in free space remains extremely challenging because of the complexity of the optical system. To address this, this paper proposes an analytical model for photon propagation in free space based on a hybrid radiosity-radiance theorem (HRRT). It combines Lambert's cosine law and the radiance theorem to handle the influence of the complicated lens and to simplify the photon transport process in the optical system. The performance of the proposed model is evaluated and validated with numerical simulations and physical experiments. Qualitative comparison results of the flux distribution at the detector are presented. In particular, error analysis demonstrates the feasibility and potential of the proposed model for simulating photon propagation in free space.
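The free-space ingredient of the hybrid model reduces to a cosine-weighted inverse-square transfer between a Lambertian patch and a detector pixel; the numbers in the sketch below are examples, and the lens model of the paper is omitted.

```python
# Small numerical illustration of the free-space transfer used by the hybrid
# model: power from a Lambertian surface patch to a detector patch follows the
# radiance theorem with two cosine factors and an inverse-square law.
import numpy as np

L = 1.0e3          # surface radiance (W / m^2 / sr), example value
dA_s = 1.0e-6      # emitting patch area (m^2)
dA_d = 4.0e-6      # detector pixel area (m^2)

s = np.array([0.0, 0.0, 0.0])          # patch position
d = np.array([0.02, 0.0, 0.10])        # detector pixel position (m)
n_s = np.array([0.0, 0.0, 1.0])        # patch normal
n_d = np.array([0.0, 0.0, -1.0])       # detector normal (facing the patch)

r_vec = d - s
r = np.linalg.norm(r_vec)
cos_s = np.dot(n_s, r_vec) / r         # Lambert's cosine law at the emitter
cos_d = np.dot(n_d, -r_vec) / r        # foreshortening at the detector

flux = L * dA_s * dA_d * cos_s * cos_d / r ** 2
print(f"power collected by the pixel: {flux:.3e} W")
```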
Comparing features sets for content-based image retrieval in a medical-case database
NASA Astrophysics Data System (ADS)
Muller, Henning; Rosset, Antoine; Vallee, Jean-Paul; Geissbuhler, Antoine
2004-04-01
Content-based image retrieval systems (CBIRSs) have frequently been proposed for use in medical image databases and PACS. Still, only few systems have been developed and used in a real clinical environment. It rather seems that medical professionals define their needs and computer scientists develop systems based on data sets they receive, with little or no interaction between the two groups. A first study on the diagnostic use of medical image retrieval also shows an improvement in diagnostics when using CBIRSs, which underlines the potential importance of this technique. This article explains the use of an open source image retrieval system (GIFT - GNU Image Finding Tool) for the retrieval of medical images in the medical case database system CasImage that is used in daily clinical routine in the university hospitals of Geneva. Although the base system of GIFT shows an unsatisfactory performance, even small changes in the feature space are shown to significantly improve the retrieval results. The performance of variations in feature space with respect to color (gray level) quantizations and changes in texture analysis (Gabor filters) is compared. Whereas stock photography relies mainly on colors for retrieval, medical images need a large number of gray levels for successful retrieval, especially when executing feedback queries. The results also show that a too fine granularity in the gray levels lowers the retrieval quality, especially with single-image queries. For the evaluation of the retrieval performance, a subset of the entire case database of more than 40,000 images is taken, with a total of 3752 images. Ground truth was generated by a user who defined the expected query result of a perfect system by selecting images relevant to a given query image. The results show that a smaller number of gray levels (32 - 64) leads to a better retrieval performance, especially when using relevance feedback. The use of more scales and directions for the Gabor filters in the texture analysis also leads to improved results, but response time goes up equally due to the larger feature space. CBIRSs can be of great use in managing large medical image databases. They make it possible to find images that might otherwise be lost for research and publications. They also give students the possibility to navigate within large image repositories. In the future, CBIR might also become more important in case-based reasoning and evidence-based medicine to support diagnostics, as first studies show good results.
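A minimal sketch of the kind of feature-space experiment discussed here: images quantized to a chosen number of gray levels are compared by histogram intersection. The database below is random noise purely to exercise the code; GIFT's actual feature set is much richer.

```python
# Minimal sketch: quantize images to a given number of gray levels, build
# normalized histograms, and rank database images by histogram intersection
# with the query. The number of gray levels (32, 64, ...) is the tunable choice.
import numpy as np

def gray_histogram(image, levels=32):
    """Normalized histogram of an 8-bit image quantized to `levels` gray levels."""
    quantized = (image.astype(np.float64) / 256.0 * levels).astype(int).clip(0, levels - 1)
    hist = np.bincount(quantized.ravel(), minlength=levels).astype(np.float64)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    return np.minimum(h1, h2).sum()            # 1.0 = identical distributions

rng = np.random.default_rng(5)
database = [rng.integers(0, 256, (128, 128), dtype=np.uint8) for _ in range(50)]
query = database[7]                            # query with a known database image

scores = [histogram_intersection(gray_histogram(query), gray_histogram(img))
          for img in database]
print("best match index (expect 7):", int(np.argmax(scores)))
```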
Closeup side view of Space Shuttle Main Engine (SSME) 2059 ...
Close-up side view of Space Shuttle Main Engine (SSME) 2059 mounted in a SSME Engine Handler near the Drying Area in the High Bay section of the SSME Processing Facility. The prominent features of the SSME in this view are the hot-gas expansion nozzle extending from the approximate image center toward the image right. The main-engine components extend from the approximate image center toward image right until it meets up with the mount for the SSME Engine Handler. The engine is rotated to a position where the major components in the view are the Low-Pressure Fuel Turbopump Discharge Duct with reflective foil insulation on the upper side of the engine, the Low-Pressure Oxidizer Turbopump and its Discharge Duct on the right side of the engine assembly extending down and wrapping under the bottom side of the assembly to the High-Pressure Oxidizer Turbopump. The High-Pressure Oxidizer Turbopump Discharge Duct exits the turbopump and extends up to the top side of the assembly where it enters the main oxidizer valve. The sphere on the lower side of the engine assembly is an accumulator that is part of the SSME's POGO suppression system. - Space Transportation System, Space Shuttle Main Engine, Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Space Images for NASA JPL Android Version
NASA Technical Reports Server (NTRS)
Nelson, Jon D.; Gutheinz, Sandy C.; Strom, Joshua R.; Arca, Jeremy M.; Perez, Martin; Boggs, Karen; Stanboli, Alice
2013-01-01
This software addresses the demand for easily accessible NASA JPL images and videos by providing a user friendly and simple graphical user interface that can be run via the Android platform from any location where Internet connection is available. This app is complementary to the iPhone version of the application. A backend infrastructure stores, tracks, and retrieves space images from the JPL Photojournal and Institutional Communications Web server, and catalogs the information into a streamlined rating infrastructure. This system consists of four distinguishing components: image repository, database, server-side logic, and Android mobile application. The image repository contains images from various JPL flight projects. The database stores the image information as well as the user rating. The server-side logic retrieves the image information from the database and categorizes each image for display. The Android mobile application is an interfacing delivery system that retrieves the image information from the server for each Android mobile device user. Also created is a reporting and tracking system for charting and monitoring usage. Unlike other Android mobile image applications, this system uses the latest emerging technologies to produce image listings based directly on user input. This allows for countless combinations of images returned. The backend infrastructure uses industry-standard coding and database methods, enabling future software improvement and technology updates. The flexibility of the system design framework permits multiple levels of display possibilities and provides integration capabilities. Unique features of the software include image/video retrieval from a selected set of categories, image Web links that can be shared among e-mail users, sharing to Facebook/Twitter, marking as user's favorites, and image metadata searchable for instant results.
The properties of borderlines in discontinuous conservative systems
NASA Astrophysics Data System (ADS)
Wang, X.-M.; Fang, Z.-J.
2006-02-01
The properties of the set of borderline images in discontinuous conservative systems are investigated. The invertible system in which a stochastic web was found in 1999 is re-discussed here. The result shows that the set of images of the borderline actually forms the same stochastic web. The web has two typical local fine structures. Firstly, in some parts of the web the borderline crosses the manifold of hyperbolic points, so that the chaotic diffusion is damped greatly; secondly, in other parts of phase space many holes and elliptic islands appear in the stochastic layer. This local structure shows infinite self-similarity. The noninvertible system in which the so-called chaotic quasi-attractor was found in [X.-M. Wang et al., Eur. Phys. J. D 19, 119 (2002)] is also studied here. The numerical investigation shows that such a chaotic quasi-attractor is confined by the preceding lower-order images of the borderline. The mechanism of this confinement is revealed: a forbidden zone exists that no orbit can visit, which is the sub-phase space on one side of the first image of the borderline. Each order of the images of the forbidden zone can be qualitatively divided into two sub-phase regions: one is the so-called escaping region that provides the orbit with an escaping channel; the other is the so-called dissipative region where the contraction of phase space occurs.
X-ray and optical stereo-based 3D sensor fusion system for image-guided neurosurgery.
Kim, Duk Nyeon; Chae, You Seong; Kim, Min Young
2016-04-01
In neurosurgery, an image-guided operation is performed to confirm that the surgical instruments reach the exact lesion position. Among the multiple imaging modalities, an X-ray fluoroscope mounted on a C- or O-arm is widely used for monitoring the position of surgical instruments and the target position of the patient. However, frequently used fluoroscopy can result in relatively high radiation doses, particularly for complex interventional procedures. The proposed system can reduce radiation exposure and provide accurate three-dimensional (3D) position information of surgical instruments and the target position. X-ray and optical stereo vision systems have been proposed for the C- or O-arm. The two subsystems share the same optical axis and are calibrated simultaneously. This provides easy augmentation of the camera image and the X-ray image. Further, the 3D measurements of both systems can be defined in a common coordinate space. The proposed dual stereoscopic imaging system is designed and implemented for mounting on an O-arm. The calibration error of the 3D coordinates of the optical stereo and X-ray stereo is within 0.1 mm in terms of the mean and the standard deviation. Further, image augmentation with the camera image and the X-ray image using an artificial skull phantom is achieved. As the developed dual stereoscopic imaging system provides 3D coordinates of the point of interest in both optical images and fluoroscopic images, it can be used by surgeons to confirm the position of surgical instruments in 3D space with minimum radiation exposure and to verify whether the instruments reach the surgical target observed in fluoroscopic images.
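The 3D measurement itself comes down to triangulation from calibrated views; the sketch below uses linear (DLT) triangulation with invented projection matrices, not the system's actual calibration.

```python
# Hedged sketch of how a calibrated stereo pair yields 3-D coordinates of a
# point of interest: with projection matrices P1 and P2 from calibration, the
# homogeneous 3-D point is recovered by linear (DLT) triangulation.
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])   # toy shared intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])             # first view
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])  # second view, 10 cm baseline

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([0.05, -0.02, 0.60])                        # instrument tip (m)
x1, x2 = project(P1, X_true), project(P2, X_true)

# DLT: each view contributes two rows of A; the solution is the null vector of A.
A = np.vstack([x1[0] * P1[2] - P1[0], x1[1] * P1[2] - P1[1],
               x2[0] * P2[2] - P2[0], x2[1] * P2[2] - P2[1]])
_, _, Vt = np.linalg.svd(A)
X_est = Vt[-1, :3] / Vt[-1, 3]
print("triangulation error (mm):", 1e3 * np.linalg.norm(X_est - X_true))
```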
A comparison of imaging methods for use in an array biosensor
NASA Technical Reports Server (NTRS)
Golden, Joel P.; Ligler, Frances S.
2002-01-01
An array biosensor has been developed which uses an actively-cooled, charge-coupled device (CCD) imager. In an effort to save money and space, a complementary metal-oxide semiconductor (CMOS) camera and photodiode were tested as replacements for the cooled CCD imager. Different concentrations of CY5 fluorescent dye in glycerol were imaged using the three different detection systems with the same imaging optics. Signal discrimination above noise was compared for each of the three systems.
Diagnostic ultrasound at MACH 20: retroperitoneal and pelvic imaging in space.
Jones, J A; Sargsyan, A E; Barr, Y R; Melton, S; Hamilton, D R; Dulchavsky, S A; Whitson, P A
2009-07-01
An operationally available diagnostic imaging capability augments spaceflight medical support by facilitating the diagnosis, monitoring and treatment of medical or surgical conditions, by improving medical outcomes and, thereby, by lowering medical mission impacts and the probability of crew evacuation due to medical causes. Microgravity-related physiological changes occurring during spaceflight can affect the genitourinary system and potentially cause conditions such as urinary retention or nephrolithiasis for which ultrasonography (U/S) would be a useful diagnostic tool. This study describes the first genitourinary ultrasound examination conducted in space, and evaluates image quality, frame rate, resolution requirements, real-time remote guidance of nonphysician crew medical officers, and on-orbit tools that can augment image acquisition. A nonphysician crew medical officer (CMO) astronaut, with minimal training in U/S, performed a self-examination of the genitourinary system onboard the International Space Station, using a Philips/ATL Model HDI-5000 ultrasound imaging unit located in the International Space Station Human Research Facility. The CMO was remotely guided by voice commands from experienced, earth-based sonographers stationed in Mission Control Center in Houston. The crewmember, with guidance, was able to acquire all of the target images. Real-time and still U/S images received at Mission Control Center in Houston were of sufficient quality for the images to be diagnostic for multiple potential genitourinary applications. Microgravity-based ultrasound imaging can provide diagnostic quality images of the retroperitoneum and pelvis, offering improved diagnosis and treatment for onboard medical contingencies. Successful completion of complex sonographic examinations can be obtained even with minimally trained nonphysician ultrasound operators, with the assistance of ground-based real-time guidance.
Huang, Yongyang; Badar, Mudabbir; Nitkowski, Arthur; Weinroth, Aaron; Tansu, Nelson; Zhou, Chao
2017-01-01
Space-division multiplexing optical coherence tomography (SDM-OCT) is a recently developed parallel OCT imaging method that achieves a multi-fold speed improvement. However, the assembly of the fiber-optic components used in the first prototype system was labor-intensive and susceptible to errors. Here, we demonstrate a high-speed SDM-OCT system using an integrated photonic chip that can be reliably manufactured with high precision and low per-unit cost. A three-layer cascade of 1 × 2 splitters was integrated in the photonic chip to split the incident light into 8 parallel imaging channels with ~3.7 mm optical delay in air between adjacent channels. High-speed imaging (~1 s/volume) of porcine eyes ex vivo and wide-field imaging (~18.0 × 14.3 mm²) of human fingers in vivo were demonstrated with the chip-based SDM-OCT system. PMID:28856055
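The fixed optical delay between channels is what lets one detector record all channels at once: each channel's image lands in a different depth window of the same long axial scan. Below is a hedged sketch of that demultiplexing idea; the axial sampling density is an assumed parameter rather than a value from the paper.

    import numpy as np

    n_channels = 8
    delay_air_mm = 3.7          # optical delay between adjacent channels (in air)
    px_per_mm = 120             # assumed axial sampling density of the A-scan

    def demultiplex(a_scan):
        """Split one deep A-scan into per-channel depth windows (illustrative)."""
        window = int(delay_air_mm * px_per_mm)
        return [np.asarray(a_scan)[i * window:(i + 1) * window] for i in range(n_channels)]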
Image processing occupancy sensor
Brackney, Larry J.
2016-09-27
A system and method for detecting occupants in a building automation system environment using image-based occupancy detection and position determination. In one example, the system includes an image-processing occupancy sensor that detects the number and position of occupants within a space that has controllable building elements such as lighting and ventilation diffusers. Based on the number and position of the occupants, the system can finely control these elements to optimize conditions for the occupants and to optimize energy usage, among other advantages.
Space Telescopes Reveal Secrets of Turbulent Black Hole
2017-12-08
NASA image release September 29, 2011 This image of the distant active galaxy Markarian 509 was taken in April 2007 with the Hubble Space Telescope's Wide Field Camera 2. To read more go to: www.nasa.gov/mission_pages/hubble/science/turbulent-black... Credit: NASA, ESA, G. Kriss (STScI), and J. de Plaa (SRON Netherlands Institute for Space Research); Acknowledgment: B. Peterson (Ohio State University)
Binary-space-partitioned images for resolving image-based visibility.
Fu, Chi-Wing; Wong, Tien-Tsin; Tong, Wai-Shun; Tang, Chi-Keung; Hanson, Andrew J
2004-01-01
We propose a novel 2D representation for 3D visibility sorting, the Binary-Space-Partitioned Image (BSPI), to accelerate real-time image-based rendering. BSPI is an efficient 2D realization of a 3D BSP tree, which is commonly used in computer graphics for time-critical visibility sorting. Since the overall structure of a BSP tree is encoded in a BSPI, traversing a BSPI is comparable to traversing the corresponding BSP tree. BSPI performs visibility sorting efficiently and accurately in the 2D image space by warping the reference image triangle-by-triangle instead of pixel-by-pixel. Multiple BSPIs can be combined to solve "disocclusion," when an occluded portion of the scene becomes visible at a novel viewpoint. Our method is highly automatic, including a tensor voting preprocessing step that generates candidate image partition lines for BSPIs, filters the noisy input data by rejecting outliers, and interpolates missing information. Our system has been applied to a variety of real data, including stereo, motion, and range images.
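Since a BSPI encodes the structure of a BSP tree, the visibility order it yields for a viewpoint is the familiar back-to-front traversal sketched below. The node fields are hypothetical and the sketch only illustrates the ordering principle, not the image-warping machinery of the paper.

    def back_to_front(node, eye, out):
        """Append image triangles to `out` in back-to-front order for viewpoint `eye`."""
        if node is None:
            return
        # signed distance of the viewpoint from the node's partition plane
        side = sum(n * e for n, e in zip(node.plane_normal, eye)) + node.plane_offset
        near, far = (node.front, node.back) if side >= 0 else (node.back, node.front)
        back_to_front(far, eye, out)    # subtree on the far side of the plane first
        out.extend(node.triangles)      # triangles lying on the partition plane
        back_to_front(near, eye, out)   # subtree on the viewer's side last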
Barra da Tijuca, Rio de Janeiro, from Space
2017-12-08
While gymnasts leap, cyclists pedal and divers twirl for Olympic gold in Rio de Janeiro, Brazil, several NASA Earth Observing satellites catch glimpses of the city and its surroundings from space. This image shows how Rio Olympic Park appeared to the Operational Land Imager (OLI), a sensor on Landsat 8, last September as the city prepared for the 2016 Summer Olympic Games. Image credit: Landsat 8/NASA Earth Observatory
Multiple Hypotheses Image Segmentation and Classification With Application to Dietary Assessment
Zhu, Fengqing; Bosch, Marc; Khanna, Nitin; Boushey, Carol J.; Delp, Edward J.
2016-01-01
We propose a method for dietary assessment to automatically identify and locate food in a variety of images captured during controlled and natural eating events. Two concepts are combined to achieve this: a set of segmented objects can be partitioned into perceptually similar object classes based on global and local features; and perceptually similar object classes can be used to assess the accuracy of image segmentation. These ideas are implemented by generating multiple segmentations of an image to select stable segmentations based on the classifier’s confidence score assigned to each segmented image region. Automatic segmented regions are classified using a multichannel feature classification system. For each segmented region, multiple feature spaces are formed. Feature vectors in each of the feature spaces are individually classified. The final decision is obtained by combining class decisions from individual feature spaces using decision rules. We show improved accuracy of segmenting food images with classifier feedback. PMID:25561457
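The selection step can be read as a simple loop: produce several candidate segmentations, classify every region in each, and keep the segmentation whose regions the classifier scores most confidently. The sketch below illustrates that idea with placeholder segment() and classify() callables; it is not the authors' code.

    def select_stable_segmentation(image, segmenters, classify):
        """Pick the candidate segmentation with the highest mean classifier confidence."""
        best_regions, best_score = None, float("-inf")
        for segment in segmenters:
            regions = segment(image)                       # one candidate segmentation
            scores = [classify(r).confidence for r in regions]
            mean_conf = sum(scores) / max(len(scores), 1)  # classifier feedback signal
            if mean_conf > best_score:
                best_regions, best_score = regions, mean_conf
        return best_regions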
Multiple hypotheses image segmentation and classification with application to dietary assessment.
Zhu, Fengqing; Bosch, Marc; Khanna, Nitin; Boushey, Carol J; Delp, Edward J
2015-01-01
We propose a method for dietary assessment to automatically identify and locate food in a variety of images captured during controlled and natural eating events. Two concepts are combined to achieve this: a set of segmented objects can be partitioned into perceptually similar object classes based on global and local features; and perceptually similar object classes can be used to assess the accuracy of image segmentation. These ideas are implemented by generating multiple segmentations of an image to select stable segmentations based on the classifier's confidence score assigned to each segmented image region. Automatic segmented regions are classified using a multichannel feature classification system. For each segmented region, multiple feature spaces are formed. Feature vectors in each of the feature spaces are individually classified. The final decision is obtained by combining class decisions from individual feature spaces using decision rules. We show improved accuracy of segmenting food images with classifier feedback.
NASA-HBCU Space Science and Engineering Research Forum Proceedings
NASA Technical Reports Server (NTRS)
Sanders, Yvonne D. (Editor); Freeman, Yvonne B. (Editor); George, M. C. (Editor)
1989-01-01
The proceedings of the Historically Black Colleges and Universities (HBCU) forum are presented. A wide range of research topics from plant science to space science and related academic areas was covered. The sessions were divided into the following subject areas: Life science; Mathematical modeling, image processing, pattern recognition, and algorithms; Microgravity processing, space utilization and application; Physical science and chemistry; Research and training programs; Space science (astronomy, planetary science, asteroids, moon); Space technology (engineering, structures and systems for application in space); Space technology (physics of materials and systems for space applications); and Technology (materials, techniques, measurements).
Merging Real-Time and Retrospective Data Services, NOAA's Solar X-Ray Imager
NASA Astrophysics Data System (ADS)
Wilkinson, D. C.
2004-12-01
The ground systems team for NOAA's first Solar X-ray Imager (SXI) proposed a merger of real-time and retrospective data services with two goals in mind. First, it was anticipated that this would be a more economical approach than legacy systems that divided these services between two separate organizations within NOAA. Second, unifying these services would naturally provide a simpler and more consistent public interface for all SXI data users. The implementation of this innovative approach has been successful on both counts. NOAA's Space Environment Center (SEC) receives the telemetry stream from SXI and generates the raw and processed imagery used in its Space Weather alert and forecast services. These data are instantaneously transferred to NOAA's National Geophysical Data Center through a combination of data push and pull protocols. The result is an interface that provides access to all SXI data, including images that are less than two minutes old. The success of this system has prompted its use in the ground systems design for the SXI and Space Environment Monitor (SEM) data collected from GOES-N, scheduled for launch in December 2004.
Multi-beam range imager for autonomous operations
NASA Technical Reports Server (NTRS)
Marzwell, Neville I.; Lee, H. Sang; Ramaswami, R.
1993-01-01
For space operations from Space Station Freedom, a real-time range imager will be very valuable for refueling and docking as well as space exploration operations. For these applications, as well as many other robotics and remote-ranging applications, a small, portable, power-efficient, robust range imager capable of ranging over a few tens of kilometers with 10 cm accuracy is needed. The system developed is based on a well-known pseudo-random modulation technique applied to a laser transmitter, combined with a novel range-resolution enhancement technique. In this technique, the transmitter is modulated at a relatively low frequency, on the order of a few MHz, to enhance the signal-to-noise ratio and to ease the stringent systems engineering requirements while still achieving very high resolution. The desired resolution cannot easily be attained by other conventional approaches. The engineering model of the system is being designed to obtain better than 10 cm range accuracy simply by implementing a high-precision clock circuit. In this paper we present the principle of the pseudo-random noise (PN) lidar system and the results of the proof-of-concept experiment.
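In PN ranging, the round-trip delay is recovered as the code lag that maximizes the cross-correlation between the transmitted pseudo-random sequence and the received return. The sketch below illustrates that principle under assumed parameters; it is not the engineering model described in the paper.

    import numpy as np

    def pn_range(tx_code, rx_signal, chip_period_s, c=3.0e8):
        """Estimate range from the correlation peak between the PN code and the return."""
        n = len(tx_code)
        lags = range(len(rx_signal) - n + 1)
        corr = [float(np.dot(tx_code, rx_signal[k:k + n])) for k in lags]
        delay_s = int(np.argmax(corr)) * chip_period_s   # round-trip delay estimate
        return c * delay_s / 2.0                         # one-way range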
The Great Observatories All-Sky LIRG Survey: Herschel Image Atlas and Aperture Photometry
NASA Astrophysics Data System (ADS)
Chu, Jason K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Díaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.
2017-04-01
Far-infrared images and photometry are presented for 201 Luminous and Ultraluminous Infrared Galaxies [LIRGs: log(L_IR/L_⊙) = 11.00-11.99, ULIRGs: log(L_IR/L_⊙) = 12.00-12.99] in the Great Observatories All-Sky LIRG Survey (GOALS), based on observations with the Herschel Space Observatory Photodetector Array Camera and Spectrometer (PACS) and the Spectral and Photometric Imaging Receiver (SPIRE) instruments. The image atlas displays each GOALS target in the three PACS bands (70, 100, and 160 μm) and the three SPIRE bands (250, 350, and 500 μm), optimized to reveal structures at both high and low surface brightness levels, with images scaled to simplify comparison of structures in the same physical areas of ~100 × 100 kpc². Flux densities of companion galaxies in merging systems are provided where possible, depending on their angular separation and the spatial resolution in each passband, along with integrated system fluxes (sum of components). This data set constitutes the imaging and photometric component of the GOALS Herschel OT1 observing program, and is complementary to atlases presented for the Hubble Space Telescope, Spitzer Space Telescope, and Chandra X-ray Observatory. Collectively, these data will enable a wide range of detailed studies of active galactic nucleus and starburst activity within the most luminous infrared galaxies in the local universe. Based on Herschel Space Observatory observations. Herschel is an ESA space observatory with science instruments provided by the European-led Principal Investigator consortia, and important participation from NASA.
NASA Technical Reports Server (NTRS)
Seeley, John S. (Editor); Lear, John W. (Editor); Russak, Sidney L. (Editor); Monfils, Andre (Editor)
1986-01-01
Papers are presented on such topics as the development of the Imaging Spectrometer for Shuttle and space platform applications; the in-flight calibration of pushbroom remote sensing instruments for the SPOT program; buttable detector arrays for 1.55-1.7 micron imaging; the design of the Improved Stratospheric and Mesospheric Sounder on the Upper Atmosphere Research Satellite; and SAGE II design and in-orbit performance. Consideration is also given to the Shuttle Imaging Radar-B/C instruments; the Venus Radar Mapper multimode radar system design; various ISO instruments (ISOCAM, ISOPHOT, and SWS and LWS); and instrumentation for the Space Infrared Telescope Facility.
2002-02-01
This photograph depicts the Solar X-Ray Imager (SXI) being installed in the X-Ray Calibration Facility (XRCF) vacuum chamber for testing at the Marshall Space Flight Center (MSFC). The XRCF vacuum chamber simulates a space environment with low temperature and pressure. The x-ray images from SXI on the Geostationary Operational Environmental Satellite-12 (GOES-12) will be used by the National Oceanic and Atmospheric Administration (NOAA) and U.S. Air Force to forecast the intensity and speed of solar disturbances that could destroy satellite electronics or disrupt long-distance radio communications. The SXI will observe solar flares, coronal mass ejections, coronal holes, and active regions in the x-ray region of the electromagnetic spectrum. These features are the dominant sources of disturbances in space weather. The imager instrument consists of a telescope assembly with a 6.3-inch (16-centimeter) diameter grazing incidence mirror and a detector system. The imager was developed, tested, and calibrated by MSFC, in conjunction with the NASA Goddard Space Flight Center and U.S. Air Force.
Automated eye blink detection and correction method for clinical MR eye imaging.
Wezel, Joep; Garpebring, Anders; Webb, Andrew G; van Osch, Matthias J P; Beenakker, Jan-Willem M
2017-07-01
To implement an on-line monitoring system that detects eye blinks during ocular MRI using field probes, and to reacquire corrupted k-space lines by means of an automatic feedback system integrated with the MR scanner. Six healthy subjects were scanned on a 7 Tesla whole-body MRI system using a custom-built receive coil. Subjects were asked to blink multiple times during the MR scan. The local magnetic field changes were detected with an external fluorine-based field probe positioned close to the eye. When an eye blink produced a field shift greater than a threshold level, this was communicated in real time to the MR system, which immediately reacquired the motion-corrupted k-space lines. The uncorrected images, using the original motion-corrupted data, showed severe artifacts, whereas the corrected images, using the reacquired data, provided an image quality similar to images acquired without blinks. Field probes can successfully detect eye blinks during MRI scans. By automatically reacquiring the eye blink-corrupted data, high-quality MR images of the eye can be acquired. Magn Reson Med 78:165-171, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
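The feedback rule is essentially a per-line threshold test on the field-probe trace. A minimal sketch of that logic follows, with the data layout and threshold as assumptions rather than details from the paper.

    def corrupted_lines(field_probe_trace, line_boundaries, threshold):
        """Return indices of k-space lines whose field-probe excursion exceeds the threshold."""
        bad = []
        for i, (start, stop) in enumerate(line_boundaries):   # (start, stop) sample indices per line
            if max(abs(s) for s in field_probe_trace[start:stop]) > threshold:
                bad.append(i)                                 # flag this line for reacquisition
        return bad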
Time-of-Flight Microwave Camera
Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh
2015-01-01
Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598
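For the FMCW receiver, depth comes from the beat frequency between the transmitted and received chirps: for a linear sweep of bandwidth B over time T, a target at range R gives f_b = 2RB/(cT). The sketch below simply evaluates that relation; the 4 GHz bandwidth matches the 8-12 GHz band quoted above, while the sweep time is an assumed illustrative value.

    def fmcw_range(beat_freq_hz, bandwidth_hz=4.0e9, sweep_time_s=1.0e-3, c=3.0e8):
        """Range of a target from the FMCW beat frequency: R = f_b * c * T / (2 * B)."""
        return beat_freq_hz * c * sweep_time_s / (2.0 * bandwidth_hz)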
Implementing An Image Understanding System Architecture Using Pipe
NASA Astrophysics Data System (ADS)
Luck, Randall L.
1988-03-01
This paper describes PIPE and how it can be used to implement an image understanding system. Image understanding is the process of developing a description of an image in order to make decisions about its contents. The tasks of image understanding are generally split into low-level vision and high-level vision. Low-level vision is performed by PIPE, a high-performance parallel processor with an architecture specifically designed for processing video images at up to 60 fields per second. High-level vision is performed by one of several types of serial or parallel computers, depending on the application. An additional processor called ISMAP performs the conversion from iconic image space to symbolic feature space. ISMAP plugs into one of PIPE's slots and is memory-mapped into the high-level processor. Thus it forms the high-speed link between the low- and high-level vision processors. The mechanisms for bottom-up, data-driven processing and top-down, model-driven processing are discussed.
Time-of-Flight Microwave Camera
NASA Astrophysics Data System (ADS)
Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh
2015-10-01
Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
Near-ultraviolet imaging of Jupiter's satellite Io with the Hubble Space Telescope
NASA Technical Reports Server (NTRS)
Paresce, F.; Sartoretti, P.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Boksenberg, A.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.
1992-01-01
The surface of Jupiter's Galilean satellite Io has been resolved for the first time in the near ultraviolet at 2850 A by the Faint Object Camera (FOC) on the Hubble Space Telescope (HST). The restored images reveal significant surface structure down to the resolution limit of the optical system corresponding to approximately 250 km at the sub-earth point.
Floating aerial 3D display based on the freeform-mirror and the improved integral imaging system
NASA Astrophysics Data System (ADS)
Yu, Xunbo; Sang, Xinzhu; Gao, Xin; Yang, Shenwu; Liu, Boyang; Chen, Duo; Yan, Binbin; Yu, Chongxiu
2018-09-01
A floating aerial three-dimensional (3D) display based on a freeform mirror and an improved integral imaging system is demonstrated. In traditional integral imaging (II), the distortion originating from lens aberration warps the elemental images and severely degrades the visual effect. To correct the distortion of the observed pixels and to improve image quality, a directional diffuser screen (DDS) is introduced. However, the improved integral imaging system can hardly present realistic images with large off-screen depth, which limits the floating aerial visual experience. To display the 3D image in free space, an off-axis reflection system with a freeform mirror is designed. By combining the improved II and the designed freeform optical element, a floating aerial 3D image is presented.
Astronauts Sullivan and Leestma perform in-space simulation of refueling
1984-10-14
S84-43432 (11 Oct. 1984) --- Appearing small in the center background of this image, astronauts Kathryn D. Sullivan, left, and David C. Leestma, both 41-G mission specialists, perform an in-space simulation of refueling another spacecraft in orbit. Their station on the space shuttle Challenger is the orbital refueling system (ORS), positioned on the mission peculiar equipment support structure (MPESS). The Large Format Camera (LFC) is left of the two mission specialists. In the left foreground is the antenna for the shuttle imaging radar (SIR-B) system onboard. The Canadian-built remote manipulator system (RMS) is positioned to allow close-up recording capability of the busy scene. A 50mm lens on a 70mm camera was used to photograph this scene. Photo credit: NASA
Hubble Team Unveils Most Colorful View of Universe Captured by Space Telescope
2014-06-04
Astronomers using NASA's Hubble Space Telescope have assembled a comprehensive picture of the evolving universe, among the most colorful deep space images ever captured by the 24-year-old telescope. Researchers say the image, part of a new study called the Ultraviolet Coverage of the Hubble Ultra Deep Field, provides the missing link in star formation. The Hubble Ultra Deep Field 2014 image is a composite of separate exposures taken from 2003 to 2012 with Hubble's Advanced Camera for Surveys and Wide Field Camera 3. Credit: NASA/ESA Read more: 1.usa.gov/1neD0se
A frameless stereotaxic operating microscope for neurosurgery.
Friets, E M; Strohbehn, J W; Hatch, J F; Roberts, D W
1989-06-01
A new system, which we call the frameless stereotaxic operating microscope, is discussed. Its purpose is to display CT or other image data in the operating microscope in the correct scale, orientation, and position without the use of a stereotaxic frame. A nonimaging ultrasonic rangefinder allows the position of the operating microscope and the position of the patient to be determined. Discrete fiducial points on the patient's external anatomy are located in both image space and operating room space, linking the image data and the operating room. Physician-selected image information, e.g., tumor contours or guidance to predetermined targets, is projected through the optics of the operating microscope using a miniature cathode ray tube and a beam splitter. Projected images superpose the surgical field, reconstructed from image data to match the focal plane of the operating microscope. The algorithms on which the system is based are described, and the sources and effects of errors are discussed. The system's performance is simulated, providing an estimate of accuracy. Two phantoms are used to measure accuracy experimentally. Clinical results and observations are given.
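Linking image space and operating-room space from matched fiducial points is a rigid point-based registration. The sketch below shows a standard SVD (Kabsch) solution of that problem as an illustration of the linking step; it is not the authors' algorithm, which relied on an ultrasonic rangefinder for localization.

    import numpy as np

    def register_fiducials(img_pts, room_pts):
        """Rigid transform (R, t) mapping image-space fiducials onto operating-room fiducials."""
        ci, cr = img_pts.mean(axis=0), room_pts.mean(axis=0)
        H = (img_pts - ci).T @ (room_pts - cr)               # cross-covariance of centered points
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
        R = Vt.T @ D @ U.T
        t = cr - R @ ci
        return R, t                                          # apply as R @ p + t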
Lightning over Equatorial Africa
NASA Technical Reports Server (NTRS)
2002-01-01
These two images were taken 9 seconds apart as the STS-97 Space Shuttle flew over equatorial Africa east of Lake Volta on December 11, 2000. The top of the large thunderstorm, roughly 20 km across, is illuminated by a full moon and frequent bursts of lightning. Because the Space Shuttle travels at about 7 km/sec, the astronauts' perspective on this storm system becomes more oblique over the 9-second interval between photographs. The images were taken with a Nikon 35 mm camera equipped with a 400 mm lens and high-speed (800 ISO) color negative film. Images are STS097-351-9 and STS097-351-12, provided and archived by the Earth Science and Image Analysis Laboratory, Johnson Space Center. Additional images taken by astronauts can be viewed at NASA-JSC's Gateway to Astronaut Photography of Earth at http://eol.jsc.nasa.gov/
Geologic interpretation of space shuttle radar images of Indonesia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sabing, F.F.
1983-11-01
The National Aeronautics and Space Administration (NASA) space shuttle mission in November 1981 acquired images of parts of the earth with a synthetic aperture radar system at a wavelength of 23.5 cm (9.3 in.) and spatial resolution of 38 m (125 ft). This report describes the geologic interpretation of 1:250,000-scale images of Irian Jaya and eastern Kalimantan, Indonesia, where the all-weather capability of radar penetrates the persistent cloud cover. The inclined look direction of radar enhances subtle topographic features that may be the expression of geologic structures. On the Indonesian images, the following terrain categories are recognizable for geologic mapping: carbonate, clastic, volcanic, alluvial and coastal, melange, and metamorphic, as well as undifferentiated bedrock. Regional and local geologic structures are well expressed on the images.
NASA Astrophysics Data System (ADS)
Bachche, Shivaji; Oka, Koichi
2013-06-01
This paper presents a comparative study of various color space models to determine the most suitable one for the detection of green sweet peppers. The images were captured using CCD and infrared cameras and processed with Halcon image processing software. An LED ring around the camera neck was used as artificial lighting to enhance the feature parameters. For color images, the CieLab, YIQ, YUV, HSI, and HSV color space models were selected for image processing, whereas for infrared images the grayscale color space was used. For color images, the HSV color space model gave the best results, with a high percentage of green sweet pepper detection, followed by the HSI model, as both provide information in terms of hue/lightness/chroma or hue/lightness/saturation, which is often more relevant for discriminating the fruit in an image at a specific threshold value. Overlapped fruits or fruits covered by leaves are detected better with the HSV color space model, because the reflection feature of the fruits produces a higher histogram response than that of the leaves. The IR 80 optical filter failed to distinguish fruits in the images because the filter blocks useful feature information. Computation of the 3D coordinates of recognized green sweet peppers was also conducted, in which the Halcon software provided the location and orientation of the fruits accurately. The depth accuracy along the Z axis was also examined; a camera-to-fruit distance of 500 to 600 mm was found suitable for computing depth precisely when the baseline between the two cameras was maintained at 100 mm.
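A hue/saturation rule of the kind this comparison favors can be written in a few lines. The sketch below converts RGB pixels to HSV and keeps those falling in a green band; the threshold values are illustrative assumptions, not the values used in the study.

    import colorsys
    import numpy as np

    def green_mask(rgb, hue_range=(0.18, 0.45), min_sat=0.25, min_val=0.15):
        """Boolean mask of pixels whose HSV values fall in an assumed 'green pepper' band."""
        h, w, _ = rgb.shape
        mask = np.zeros((h, w), dtype=bool)
        for y in range(h):
            for x in range(w):
                hh, ss, vv = colorsys.rgb_to_hsv(*(rgb[y, x] / 255.0))
                mask[y, x] = (hue_range[0] <= hh <= hue_range[1]
                              and ss >= min_sat and vv >= min_val)
        return mask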
Large-aperture space optical system testing based on the scanning Hartmann.
Wei, Haisong; Yan, Feng; Chen, Xindong; Zhang, Hao; Cheng, Qiang; Xue, Donglin; Zeng, Xuefeng; Zhang, Xuejun
2017-03-10
Based on the Hartmann testing principle, this paper proposes a novel image quality testing technology that applies to large-aperture space optical systems. Compared with the traditional testing method using a large-aperture collimator, the scanning Hartmann testing technology has great advantages due to its simple structure, low cost, and ability to perform wavefront measurement of an optical system. The basic testing principle of the scanning Hartmann technology, the data processing method, and the simulation process are presented in this paper. Simulation results are also given to verify the feasibility of the technology. Furthermore, a measuring system is developed to conduct a wavefront measurement experiment on a 200 mm aperture optical system. The small root-mean-square (RMS) deviation (6.3%) between the experimental and interferometric results indicates that the testing system can measure low-order aberrations correctly, which means that the scanning Hartmann testing technology is able to test the imaging quality of a large-aperture space optical system.
NASA Technical Reports Server (NTRS)
Goward, Samuel N.; Townshend, John R.; Zanoni, Vicki; Policelli, Fritz; Stanley, Tom; Ryan, Robert; Holekamp, Kara; Underwood, Lauren; Pagnutti, Mary; Fletcher, Rose
2003-01-01
In an effort to more fully explore the potential of commercial remotely sensed land data sources, the NASA Earth Science Enterprise (ESE) implemented an experimental Scientific Data Purchase (SDP) that solicited bids from the private sector to meet ESE-user data needs. The images from the Space Imaging IKONOS system provided a particularly good match to current ESE missions such as Terra and Landsat 7 and therefore serve as a focal point in this analysis.
Systematic Calibration for a Backpacked Spherical Photogrammetry Imaging System
NASA Astrophysics Data System (ADS)
Rau, J. Y.; Su, B. W.; Hsiao, K. W.; Jhan, J. P.
2016-06-01
A spherical camera can observe the environment with almost a 720-degree field of view in one shot, which is useful for augmented reality, environment documentation, and mobile mapping applications. This paper aims to develop a spherical photogrammetry imaging system for 3D measurement through a backpacked mobile mapping system (MMS). The equipment comprises a Ladybug-5 spherical camera, a tactical-grade positioning and orientation system (POS), i.e. SPAN-CPT, an odometer, etc. This research aims to directly apply the photogrammetric space intersection technique for 3D mapping from a spherical image stereo-pair. For this purpose, several systematic calibration procedures are required, including lens distortion calibration, relative orientation calibration, boresight calibration for direct georeferencing, and spherical image calibration. Lens distortion is severe in the Ladybug-5 camera's six original images. For spherical image mosaicking from these six original images, we propose using their relative orientation and correcting their lens distortion at the same time. However, the constructed spherical image still contains systematic error, which reduces the 3D measurement accuracy. For direct georeferencing, we establish a ground control field for boresight/lever-arm calibration. We can then apply the calibrated parameters to obtain the exterior orientation parameters (EOPs) of all spherical images. In the end, the 3D positioning accuracy after space intersection is evaluated, including EOPs obtained by the structure-from-motion method.
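Space intersection from a spherical stereo pair works on viewing directions rather than image-plane coordinates. The sketch below converts a pixel of an equirectangular spherical image into a unit direction in the camera frame; the equirectangular mapping is an assumption for illustration, not necessarily the projection used by the authors.

    import math

    def pixel_to_direction(u, v, width, height):
        """Unit viewing direction for pixel (u, v) of an equirectangular spherical image."""
        lon = (u / width) * 2.0 * math.pi - math.pi       # longitude: -pi .. +pi
        lat = math.pi / 2.0 - (v / height) * math.pi      # latitude: +pi/2 (top) .. -pi/2
        x = math.cos(lat) * math.sin(lon)
        y = math.sin(lat)
        z = math.cos(lat) * math.cos(lon)
        return (x, y, z)   # intersect rays from two stations to obtain the 3D point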
NASA Technical Reports Server (NTRS)
1994-01-01
Charge Coupled Devices (CCDs) are high-technology silicon chips that convert light directly into electronic or digital images, which can be manipulated or enhanced by computers. When Goddard Space Flight Center (GSFC) scientists realized that existing CCD technology could not meet the scientific requirements for the Hubble Space Telescope Imaging Spectrograph, GSFC contracted with Scientific Imaging Technologies, Inc. (SITe) to develop an advanced CCD. SITe then applied many of the NASA-driven enhancements to the manufacture of CCDs for digital mammography. The resulting device images breast tissue more clearly and efficiently. The LORAD Stereo Guide Breast Biopsy system incorporates SITe's CCD as part of a digital camera system that is replacing surgical biopsy in many cases. Known as stereotactic needle biopsy, it is performed under local anesthesia with a needle and saves women time, pain, scarring, radiation exposure, and money.
Three-dimensional imaging of the craniofacial complex.
Nguyen, Can X.; Nissanov, Jonathan; Öztürk, Cengizhan; Nuveen, Michiel J.; Tuncay, Orhan C.
2000-02-01
Orthodontic treatment requires the rearrangement of craniofacial complex elements in three planes of space, yet oddly the diagnosis is done with two-dimensional images. Here we report on a three-dimensional (3D) imaging system that employs the stereoimaging method of structured light to capture the facial image. The images can subsequently be integrated with 3D cephalometric tracings derived from lateral and PA films (www.clinorthodres.com/cor-c-070). The accuracy of the reconstruction obtained with this inexpensive system is about 400 µm.
Grzelakowski, Krzysztof P
2016-05-01
Since its introduction, the importance of complementary k||-space (LEED) and real-space (LEEM) information in the investigation of surface science phenomena has been widely demonstrated over the last five decades. In this paper we report the application of a novel kind of electron spectromicroscope, the Dual Emission Electron spectroMicroscope (DEEM), with two independent electron optical channels for quasi-simultaneous reciprocal- and real-space imaging, in an investigation of a Cs-covered Mo(110) single crystal using an 800 eV electron beam from an "in-lens" electron gun system developed for sample illumination. With the DEEM spectromicroscope it is possible to observe dynamic, irreversible processes at surfaces in the energy-filtered real space and in the corresponding energy-filtered k||-space quasi-simultaneously in two independent imaging columns. The novel concept of high-energy electron beam sample illumination in cathode-lens-based microscopes allows chemically selective imaging and analysis under laboratory conditions. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. Armando Oliu, Final Inspection Team lead for the Shuttle program, speaks to reporters about the aid the Image Analysis Lab is giving the FBI in a kidnapping case. Behind him at right is Mike Rein, External Affairs division chief. Oliu oversees the image lab that is using an advanced SGI TP9500 data management system to review the tape of the kidnapping in progress in Sarasota, Fla. KSC installed the new $3.2 million system in preparation for Return to Flight of the Space Shuttle fleet. The lab is studying the Sarasota kidnapping video to provide any new information possible to law enforcement officers. KSC is joining NASA's Marshall Space Flight Center in Alabama in reviewing the tape.
Earth benefits from space life sciences
NASA Technical Reports Server (NTRS)
Garshnek, V.; Nicogossian, A. E.; Griffiths, L.
1988-01-01
The applications to medicine of various results from space exploration are examined. Improvements have been made in the management of cardiovascular disease, in particular the use of the ultrasonic scanner to image arteries in three dimensions, the use of excimer lasers to disrupt arterial plaques in coronary blood vessels, and the use of advanced electrodes for cardiac monitoring. A bone stiffness analyzer has helped to diagnose osteoporosis and aid in its treatment. An automated light microscope system is used for chromosome analysis, and an X-ray image intensifier called Lixiscope is used in emergency medical care. An advanced portable defibrillator has been developed for the heart, and an insulin delivery system has been derived from space microminiaturization techniques.
A Low-Power High-Speed Smart Sensor Design for Space Exploration Missions
NASA Technical Reports Server (NTRS)
Fang, Wai-Chi
1997-01-01
A low-power, high-speed smart sensor system based on a large-format active pixel sensor (APS) integrated with a programmable neural processor for space exploration missions is presented. The concept of building an advanced smart sensing system is demonstrated by a system-level microchip design composed of an APS sensor, a programmable neural processor, and an embedded microprocessor in an SOI CMOS technology. This ultra-fast smart sensor system-on-a-chip design mimics what is inherent in biological vision systems. Moreover, it is programmable and capable of performing ultra-fast machine vision processing at all levels, including image acquisition, image fusion, image analysis, scene interpretation, and control functions. The system provides about one tera-operation-per-second computing power, a two-order-of-magnitude increase over state-of-the-art microcomputers. Its high performance is due to massively parallel computing structures, high data throughput rates, fast learning capabilities, and an advanced VLSI system-on-a-chip implementation.
Hyperspectral imaging from space: Warfighter-1
NASA Astrophysics Data System (ADS)
Cooley, Thomas; Seigel, Gary; Thorsos, Ivan
1999-01-01
The Air Force Research Laboratory Integrated Space Technology Demonstrations (ISTD) Program Office has partnered with Orbital Sciences Corporation (OSC) to complement the commercial satellite's high-resolution panchromatic imaging and Multispectral imaging (MSI) systems with a moderate resolution Hyperspectral imaging (HSI) spectrometer camera. The program is an advanced technology demonstration utilizing a commercially based space capability to provide unique functionality in remote sensing technology. This leveraging of commercial industry to enhance the value of the Warfighter-1 program utilizes the precepts of acquisition reform and is a significant departure from the old-school method of contracting for government managed large demonstration satellites with long development times and technology obsolescence concerns. The HSI system will be able to detect targets from the spectral signature measured by the hyperspectral camera. The Warfighter-1 program will also demonstrate the utility of the spectral information to theater military commanders and intelligence analysts by transmitting HSI data directly to a mobile ground station that receives and processes the data. After a brief history of the project origins, this paper will present the details of the Warfighter-1 system and expected results from exploitation of HSI data as well as the benefits realized by this collaboration between the Air Force and commercial industry.
Performance benefits and limitations of a camera network
NASA Astrophysics Data System (ADS)
Carr, Peter; Thomas, Paul J.; Hornsey, Richard
2005-06-01
Visual information is of vital significance to both animals and artificial systems. The majority of mammals rely on two images, each with a resolution of 10^7-10^8 'pixels' per image. At the other extreme are insect eyes where the field of view is segmented into 10^3-10^5 images, each comprising effectively one pixel/image. The great majority of artificial imaging systems lie nearer to the mammalian characteristics in this parameter space, although electronic compound eyes have been developed in this laboratory and elsewhere. If the definition of a vision system is expanded to include networks or swarms of sensor elements, then schools of fish, flocks of birds and ant or termite colonies occupy a region where the number of images and the pixels/image may be comparable. A useful system might then have 10^5 imagers, each with about 10^4-10^5 pixels. Artificial analogs to these situations include sensor webs, smart dust and co-ordinated robot clusters. As an extreme example, we might consider the collective vision system represented by the imminent existence of ~10^9 cellular telephones, each with a one-megapixel camera. Unoccupied regions in this resolution-segmentation parameter space suggest opportunities for innovative artificial sensor network systems. Essential for the full exploitation of these opportunities is the availability of custom CMOS image sensor chips whose characteristics can be tailored to the application. Key attributes of such a chip set might include integrated image processing and control, low cost, and low power. This paper compares selected experimentally determined system specifications for an inward-looking array of 12 cameras with the aid of a camera-network model developed to explore the tradeoff between camera resolution and the number of cameras.
Focus detection by shearing interference of vortex beams for non-imaging systems.
Li, Xiongfeng; Zhan, Shichao; Liang, Yiyong
2018-02-10
In focus detection for non-imaging systems, the common image-based methods are not available. Interference techniques are also seldom used, because only the degree of defocus, and hardly any information about its direction, can be derived from the fringe spacing. In this paper, we propose a vortex-beam-based shearing interference system for focus detection in a focused laser direct-writing system, where a vortex beam is already present. Both simulated and experimental results show that fork-like features appear in the interference patterns due to the presence of an optical vortex, which makes it possible to distinguish the degree and direction of defocus simultaneously. The theoretical fringe spacing and resolution of this method are derived. A resolution of 0.79 μm is achieved under the experimental combination of parameters, and it can be further improved with the help of image processing algorithms and closed-loop control in the future. Finally, the influence of incomplete collimation and of the wedge angle of the shear plate is discussed. This focus detection approach is particularly appropriate for non-imaging systems containing one or more focused vortex beams.
Closeup oblique view of the aft fuselage of the Orbiter ...
Close-up oblique view of the aft fuselage of the Orbiter Discovery looking forward and port with the Space Shuttle Main Engines (SSME) and Orbiter Maneuvering System/Reaction Control System pods still in place. However, the heat shields have been removed from the SSMEs, providing a good view toward the interior of the aft fuselage. This image was taken inside the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Closeup oblique view of the aft fuselage of the Orbiter ...
Close-up oblique view of the aft fuselage of the Orbiter Discovery looking forward and starboard with the Space Shuttle Main Engines (SSME) and Orbiter Maneuvering System/Reaction Control System pods removed. The openings for the SSMEs have been covered with a flexible barrier to create a positive pressure envelope inside of the aft fuselage. This image was taken inside the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Closeup oblique view of the aft fuselage of the Orbiter ...
Close-up oblique view of the aft fuselage of the Orbiter Discovery looking forward and starboard with the Space Shuttle Main Engines (SSME) and Orbiter Maneuvering System/Reaction Control System pods still in place. However, the heat shields have been removed from the SSMEs, providing a good view toward the interior of the aft fuselage. This image was taken inside the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
NASA Technical Reports Server (NTRS)
1994-01-01
An aerial color infrared (CIR) mapping system developed by Kennedy Space Center enables Florida's Charlotte County to accurately appraise its citrus groves while reducing appraisal costs. The technology was further advanced by development of a dual video system making it possible to simultaneously view images of the same area and detect changes. An image analysis system automatically surveys and photo interprets grove images as well as automatically counts trees and reports totals. The system, which saves both time and money, has potential beyond citrus grove valuation.
The James Webb Space Telescope and its Potential for Exoplanet Science
NASA Technical Reports Server (NTRS)
Clampin, Mark
2008-01-01
The James Webb Space Telescope (JWST) is a large-aperture (6.5 meter), cryogenic space telescope with a suite of near- and mid-infrared instruments covering the wavelength range of 0.6 microns to 28 microns. JWST's primary science goal is to detect and characterize the first galaxies. It will also study the assembly of galaxies, star formation, and the formation and evolution of planetary systems. Recent progress in hardware development for the observatory will be presented, including a discussion of the status of JWST's optical system and beryllium mirror fabrication, progress with sunshield prototypes, and recent changes in the integration and test configuration. We also review the expected scientific performance of the observatory for observations of exosolar planets by means of transit imaging and spectroscopy and direct imaging, and we review the recent discovery of Fomalhaut B and its implications for debris disk imaging and exoplanet detection with JWST.
NASA Technical Reports Server (NTRS)
1998-01-01
Positive Systems has worked in conjunction with Stennis Space Center to design the ADAR System 5500. This is a four-band airborne digital imaging system used to capture multispectral imagery similar to that available from satellite platforms such as Landsat, SPOT and the new generation of high resolution satellites. Positive Systems has provided remote sensing services for the development of digital aerial camera systems and software for commercial aerial imaging applications.
PET image reconstruction: a robust state space approach.
Liu, Huafeng; Tian, Yi; Shi, Pengcheng
2005-01-01
Statistical iterative reconstruction algorithms have shown improved image quality over conventional nonstatistical methods in PET by using accurate system response models and measurement noise models. Strictly speaking, however, PET measurements, pre-corrected for accidental coincidences, are neither Poisson nor Gaussian distributed and thus do not meet the basic assumptions of these algorithms. In addition, the difficulty of determining the proper system response model also greatly affects the quality of the reconstructed images. In this paper, we explore the use of state space principles for the estimation of the activity map in tomographic PET imaging. The proposed strategy formulates the organ activity distribution through tracer kinetics models and the photon-counting measurements through observation equations, thus making it possible to unify the dynamic and static reconstruction problems in a general framework. Further, it coherently treats the uncertainties of the statistical model of the imaging system and the noisy nature of the measurement data. Since the H(infinity) filter seeks minimax-error estimates without any assumptions about the system and data noise statistics, it is particularly suited to PET image reconstruction, where the statistical properties of the measurement data and the system model are very complicated. The performance of the proposed framework is evaluated using Shepp-Logan simulated phantom data and real phantom data, with favorable results.
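To make the state-space formulation concrete, the sketch below shows a generic linear-Gaussian predict/update step (a Kalman-style recursion) in which the activity vector evolves as x_k = A x_{k-1} + w and the counts obey y_k = C x_k + v. This is only an illustration of the formulation; the paper argues for an H(infinity) filter precisely because it avoids the noise-statistics assumptions this sketch makes, and the matrices here are placeholders.

    import numpy as np

    def state_space_step(x, P, y, A, C, Q, R):
        """One predict/update step for x_k = A x_{k-1} + w, y_k = C x_k + v (illustrative)."""
        x_pred = A @ x                         # predicted activity
        P_pred = A @ P @ A.T + Q               # predicted covariance
        S = C @ P_pred @ C.T + R               # innovation covariance
        K = P_pred @ C.T @ np.linalg.inv(S)    # gain
        x_new = x_pred + K @ (y - C @ x_pred)  # update with measured counts y
        P_new = (np.eye(len(x)) - K @ C) @ P_pred
        return x_new, P_new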
Cameras Reveal Elements in the Short Wave Infrared
NASA Technical Reports Server (NTRS)
2010-01-01
Goodrich ISR Systems Inc. (formerly Sensors Unlimited Inc.), based out of Princeton, New Jersey, received Small Business Innovation Research (SBIR) contracts from the Jet Propulsion Laboratory, Marshall Space Flight Center, Kennedy Space Center, Goddard Space Flight Center, Ames Research Center, Stennis Space Center, and Langley Research Center to assist in advancing and refining indium gallium arsenide imaging technology. Used on the Lunar Crater Observation and Sensing Satellite (LCROSS) mission in 2009 for imaging the short wave infrared wavelengths, the technology has dozens of applications in military, security and surveillance, machine vision, medical, spectroscopy, semiconductor inspection, instrumentation, thermography, and telecommunications.
Infrared image enhancement using H(infinity) bounds for surveillance applications.
Qidwai, Uvais
2008-08-01
In this paper, two algorithms have been presented to enhance the infrared (IR) images. Using the autoregressive moving average model structure and H(infinity) optimal bounds, the image pixels are mapped from the IR pixel space into normal optical image space, thus enhancing the IR image for improved visual quality. Although H(infinity)-based system identification algorithms are very common now, they are not quite suitable for real-time applications owing to their complexity. However, many variants of such algorithms are possible that can overcome this constraint. Two such algorithms have been developed and implemented in this paper. Theoretical and algorithmic results show remarkable enhancement in the acquired images. This will help in enhancing the visual quality of IR images for surveillance applications.
2010-02-20
S130-E-012478 (20 Feb. 2010) --- Backdropped by Earth's horizon and the blackness of space, a partial view of space shuttle Endeavour's payload bay, vertical stabilizer, orbital maneuvering system (OMS) pods, Remote Manipulator System/Orbiter Boom Sensor System (RMS/OBSS) and docking mechanism are featured in this image photographed by an STS-130 crew member from an aft flight deck window.
Design of compact off-axis four-mirror anastigmatic system for space communications
NASA Astrophysics Data System (ADS)
Zhao, Fa-cai; Sun, Quan-she; Chen, Kun-feng; Zhu, Xing-bang; Wang, Shao-shui; Wang, Guo-quan; Zheng, Xiang-liang
2013-08-01
The deployment of advanced hyperspectral imaging and other Earth-sensing instruments onboard Earth-observing satellites is driving the demand for high-data-rate communications. Space laser communications technology offers the potential for significantly increasing data return capability from space to Earth. Compared to current state-of-the-art radio frequency communications links, lasercom links operate at much higher carrier frequencies. The use of higher carrier frequencies implies a much smaller diffraction loss, which, in turn, results in a much higher efficiency in delivering the signal energy. Optical communications meet the required data rates with small, low-mass, and low-power communications packages. The communications optical system assembly typically consists of a front aperture, a reflection- or refraction-type telescope, with or without a solar rejection filter, aft optics, fine-pointing mirrors, and array detectors. An optical system used in space laser communications usually has a long focal length and large aperture compared with common optical systems, so reflective optical systems are widely used. An unobstructed four-mirror anastigmatic telescope system is proposed, modified from the geometric optics theory of common-axis three-mirror systems. The intermediate image lies between the secondary and tertiary mirrors. To fold the optical path, the four-mirror design adds a plane reflective mirror at the intermediate image. The design was analyzed, and a system with an effective aperture of 200 mm and a field of view of 1.0° x 1.0° was designed; the total length and magnification are 700 mm and 20, respectively. The system has the advantages of large magnification, relatively short physical size, and loose manufacturing tolerances.
NASA Technical Reports Server (NTRS)
Pool, Sam Lee
1988-01-01
Because the prolonged stay on board the Space Station will increase the risk of possible inflight medical problems relative to that on Skylab missions, the Health Maintenance Facility (HMF) planned for the Space Station is much more sophisticated than the small clinics of the Skylab missions. The development of the HMF is directed by the consideration of three primary factors: prevention, diagnosis, and treatment of injuries and illnesses that may occur in flight. The major components of the HMF include the clinical laboratory, pharmacy, imaging system, critical-care system, patient-restraint system, data-management system, exercise system, surgical system, electrophysiologic-monitoring system, intravenous-fluid system, dental system, and hyperbaric-treatment-support system.
Event-based Sensing for Space Situational Awareness
NASA Astrophysics Data System (ADS)
Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.
A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both daytime and nighttime (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during daytime hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate. In addition to low-bandwidth communications requirements, the low weight, low power, and high speed make them ideally suited to meeting the demanding challenges of space-based SSA systems. Results from these experiments and the systems developed highlight the applicability of event-based sensors to ground- and space-based SSA tasks.
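The output these sensors produce is commonly handled as a stream of (x, y, timestamp, polarity) tuples; accumulating events over a short time window gives a frame-like image that conventional detection steps can use. The sketch below illustrates that data model; the field layout and window length are assumptions, not details of the trial systems described above.

    import numpy as np

    def accumulate_events(events, width, height, t_start, t_window):
        """Signed per-pixel count of events falling in [t_start, t_start + t_window)."""
        frame = np.zeros((height, width), dtype=np.int32)
        for x, y, t, polarity in events:              # each event: pixel, timestamp, ON/OFF polarity
            if t_start <= t < t_start + t_window:
                frame[y, x] += 1 if polarity > 0 else -1
        return frame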
HICO and RAIDS Experiment Payload - Hyperspectral Imager for the Coastal Ocean
NASA Technical Reports Server (NTRS)
Corson, Mike
2009-01-01
HICO and RAIDS Experiment Payload - Hyperspectral Imager For The Coastal Ocean (HREP-HICO) will operate a visible and near-infrared (VNIR) Maritime Hyperspectral Imaging (MHSI) system, to detect, identify and quantify coastal geophysical features from the International Space Station.
NASA Technical Reports Server (NTRS)
Schultz, Christopher J.; Lang, Timothy J.; Leake, Skye; Runco, Mario, Jr.; Blakeslee, Richard J.
2017-01-01
Video and still-frame images from cameras aboard the International Space Station (ISS) are used to inspire, educate, and provide a unique vantage point from low-Earth orbit that is second to none; however, these cameras have overlooked capabilities for contributing to scientific analysis of the Earth and the near-space environment. The goal of this project is to study how georeferenced video and images from available ISS camera systems can be useful for scientific analysis, using lightning properties as a demonstration.
NASA Technical Reports Server (NTRS)
Ando, K.
1982-01-01
A substantial technology base of solid-state pushbroom sensors exists and is in the process of further evolution at both GSFC and JPL. Technologies being developed relate to short wave infrared (SWIR) detector arrays; HgCdTe hybrid detector arrays; InSb linear and area arrays; passive coolers; spectral beam splitters; the deposition of spectral filters on detector arrays; and the functional design of the shuttle/space platform imaging spectrometer (SIS) system. Spatial and spectral characteristics of field, aircraft, and space multispectral sensors are summarized. The status, field of view, and resolution of foreign land observing systems are included.
First Solar System Results of the Spitzer Space Telescope
NASA Technical Reports Server (NTRS)
VanCleve, J.; Cruikshank, D. P.; Stansberry, J. A.; Burgdorf, M. J.; Devost, D.; Emery, J. P.; Fazio, G.; Fernandez, Y. R.; Glaccum, W.; Grillmair, C.
2004-01-01
The Spitzer Space Telescope, formerly known as SIRTF, is now operational and delivers unprecedented sensitivity for the observation of Solar System targets. Spitzer's capabilities and first general results were presented at the January 2004 AAS meeting. In this poster, we focus on Spitzer's performance for moving targets, and the first Solar System results. Spitzer has three instruments, IRAC, IRS, and MIPS. IRAC (InfraRed Array Camera) provides simultaneous images at wavelengths of 3.6, 4.5, 5.8, and 8.0 microns. IRS (InfraRed Spectrograph) has 4 modules providing low-resolution (R=60-120) spectra from 5.3 to 40 microns, high-resolution (R=600) spectra from 10 to 37 microns, and an autonomous target acquisition system (PeakUp) which includes small-field imaging at 15 microns. MIPS (Multiband Imaging Photometer for SIRTF) does imaging photometry at 24, 70, and 160 microns and low-resolution (R=15-25) spectroscopy (SED) between 55 and 96 microns. Guaranteed Time Observer (GTO) programs include the moons of the outer Solar System, Pluto, Centaurs, Kuiper Belt Objects, and comets.
Flash LIDAR Systems for Planetary Exploration
NASA Astrophysics Data System (ADS)
Dissly, Richard; Weinberg, J.; Weimer, C.; Craig, R.; Earhart, P.; Miller, K.
2009-01-01
Ball Aerospace offers a mature, highly capable 3D flash-imaging LIDAR system for planetary exploration. Multi-mission applications include orbital, standoff, and surface terrain mapping; long-distance and rapid close-in ranging; descent and surface navigation; and rendezvous and docking. Our flash LIDAR is an optical, time-of-flight, topographic imaging system, leveraging innovations in focal plane arrays, readout integrated circuit real-time processing, and compact and efficient pulsed laser sources. Due to its modular design, it can be easily tailored to satisfy a wide range of mission requirements. Flash LIDAR offers several distinct advantages over traditional scanning systems. The entire scene within the sensor's field of view is imaged with a single laser flash. This directly produces an image with each pixel already correlated in time, making the sensor resistant to the relative motion of a target subject. Additionally, images may be produced at rates much faster than are possible with a scanning system. And because the system captures a new complete image with each flash, optical glint and clutter are easily filtered and discarded. This allows for imaging under any lighting condition and makes the system virtually insensitive to stray light. Finally, because there are no moving parts, our flash LIDAR system is highly reliable and has a long life expectancy. As an industry leader in laser active sensor system development, Ball Aerospace has been working for more than four years to mature flash LIDAR systems for space applications, and is now under contract to provide the Vision Navigation System for NASA's Orion spacecraft. Our system uses heritage optics and electronics from our star tracker products, and space-qualified lasers similar to those used in our CALIPSO LIDAR, which has been in continuous operation since 2006, providing more than 1.3 billion laser pulses to date.
Three-dimensional face model reproduction method using multiview images
NASA Astrophysics Data System (ADS)
Nagashima, Yoshio; Agawa, Hiroshi; Kishino, Fumio
1991-11-01
This paper describes a method of reproducing three-dimensional face models using multi-view images for a virtual space teleconferencing system that achieves a realistic visual presence for teleconferencing. The goal of this research, as an integral component of a virtual space teleconferencing system, is to generate a three-dimensional face model from facial images and synthesize images of the model as viewed virtually from different angles, with natural shadowing to suit the lighting conditions of the virtual space. The proposed method is as follows: first, front and side view images of the human face are taken by TV cameras. The 3D data of facial feature points are obtained from the front and side views by an image processing technique based on the color, shape, and correlation of face components. Using these 3D data, prepared base face models, representing typical Japanese male and female faces, are modified to approximate the input facial image. The personal face model, representing the individual character, is then reproduced. Next, an oblique view image is taken by a TV camera. The feature points of the oblique view image are extracted using the same image processing technique. A more precise personal model is reproduced by fitting the boundary of the personal face model to the boundary of the oblique view image. The modified boundary of the personal face model is determined by using the face direction, namely the rotation angle, which is detected based on the extracted feature points. After the 3D model is established, new images are synthesized by mapping facial texture onto the model.
Flight Results from the HST SM4 Relative Navigation Sensor System
NASA Technical Reports Server (NTRS)
Naasz, Bo; Eepoel, John Van; Queen, Steve; Southward, C. Michael; Hannah, Joel
2010-01-01
On May 11, 2009, Space Shuttle Atlantis roared off of Launch Pad 39A en route to the Hubble Space Telescope (HST) to undertake its final servicing of HST, Servicing Mission 4. Onboard Atlantis was a small payload called the Relative Navigation Sensor experiment, which included three cameras of varying focal ranges and avionics to record images and estimate, in real time, the relative position and attitude (aka "pose") of the telescope during rendezvous and deployment. The avionics package, known as SpaceCube and developed at the Goddard Space Flight Center, performed image processing using field programmable gate arrays to accelerate this process, and in addition executed two different pose algorithms in parallel: the Goddard Natural Feature Image Recognition and the ULTOR Passive Pose and Position Engine (P3E) algorithms.
Image fusion pitfalls for cranial radiosurgery.
Jonker, Benjamin P
2013-01-01
Stereotactic radiosurgery requires imaging to define both the stereotactic space in which the treatment is delivered and the target itself. Image fusion is the process of using rotation and translation to bring a second image set into alignment with the first image set. This allows the potential concurrent use of multiple image sets to define the target and stereotactic space. While a single magnetic resonance imaging (MRI) sequence alone can be used for delineation of the target and fiducials, there may be significant advantages to using additional imaging sets, including other MRI sequences, computed tomography (CT) scans, and advanced imaging sets such as catheter-based angiography, diffusion tensor imaging-based fiber tracking, and positron emission tomography, in order to more accurately define the target and surrounding critical structures. Stereotactic space is usually defined by detection of fiducials on the stereotactic head frame or mask system. Unfortunately, MRI sequences are susceptible to geometric distortion, whereas CT scans do not face this problem (although they have poorer resolution of the target in most cases). Thus image fusion can allow the definition of stereotactic space to proceed from the geometrically accurate CT images at the same time as using MRI to define the target. The use of image fusion is associated with a risk of error introduced by inaccuracies of the fusion process, as well as workflow changes that, if not properly accounted for, can mislead the treating clinician. The purpose of this review is to describe the uses of image fusion in stereotactic radiosurgery as well as its potential pitfalls.
Advanced sensor-simulation capability
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.
1990-09-01
This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.
NASA Technical Reports Server (NTRS)
Poulton, C. E.
1975-01-01
Comparative statistics were presented on the capability of LANDSAT-1 and three of the Skylab remote sensing systems (S-190A, S-190B, S-192) for the recognition and inventory of analogous natural vegetations and landscape features important in resource allocation and management. Two analogous regions presenting vegetational zonation from salt desert to alpine conditions above the timberline were observed, emphasizing the visual interpretation mode in the investigation. An hierarchical legend system was used as the basic classification of all land surface features. Comparative tests were run on image identifiability with the different sensor systems, and mapping and interpretation tests were made both in monocular and stereo interpretation with all systems except the S-192. Significant advantage was found in the use of stereo from space when image analysis is by visual or visual-machine-aided interactive systems. Some cost factors in mapping from space are identified. The various image types are compared and an operational system is postulated.
NASA Technical Reports Server (NTRS)
Nabors, Sammy
2015-01-01
NASA offers companies an optical system that provides a unique panoramic perspective with a single camera. NASA's Marshall Space Flight Center has developed a technology that combines a panoramic refracting optic (PRO) lens with a unique detection system to acquire a true 360-degree field of view. Although current imaging systems can acquire panoramic images, they must use up to five cameras to obtain the full field of view. MSFC's technology obtains its panoramic images from one vantage point.
The design of visible system for improving the measurement accuracy of imaging points
NASA Astrophysics Data System (ADS)
Shan, Qiu-sha; Li, Gang; Zeng, Luan; Liu, Kai; Yan, Pei-pei; Duan, Jing; Jiang, Kai
2018-02-01
Binocular stereoscopic measurement technology has wide applications in robot vision and 3D measurement, and measurement precision is a very important factor; in 3D coordinate measurement in particular, high measurement accuracy places stringent requirements on the distortion of the optical system. In order to improve the measurement accuracy of imaging points and reduce their distortion, the optical system must satisfy the requirement of an extra-low distortion value of less than 0.1%. A transmissive visible optical lens with a telecentric beam path in image space was therefore designed, adopting the imaging model of binocular stereo vision and imaging a drone at a finite distance. The optical system adopts a complex double-Gauss structure, with the pupil stop placed on the focal plane of the rear lens group so that the system exit pupil lies at infinity, realizing the telecentric beam path in image space. The main optical parameters of the system are as follows: the spectral range is the visible waveband, the effective focal length is f' = 30 mm, the relative aperture is 1/3, and the field of view is 21°. The final design results show that the RMS spot size of the optical lens at the maximum field of view is 2.3 μm, which is less than one pixel (3.45 μm); the distortion is less than 0.1%, so the system has the advantage of extra-low distortion and avoids subsequent image distortion correction; the modulation transfer function of the optical lens is 0.58 (@145 lp/mm), so the imaging quality of the system is close to the diffraction limit; and the system has a simple structure and satisfies the requirements of the optical specifications. Finally, measurement of the drone at a finite distance was achieved based on the imaging model of binocular stereo vision.
Identifying explosives using broadband millimeter-wave imaging
NASA Astrophysics Data System (ADS)
Weatherall, James C.; Yam, Kevin; Barber, Jeffrey; Smith, Barry T.; Smith, Peter R.; Greca, Joseph
2017-05-01
Millimeter-wave imaging is employed in Advanced Imaging Technology (AIT) systems to screen personnel for concealed explosives and weapons. AIT systems deployed in airports auto-detect potential threats by highlighting their location on a generic outline of a person using imaging data collected over a range of frequencies. We show how the spectral information in the imaging data can be used to identify the composition of an anomalous object, in particular whether it is an explosive material. The discriminative value of the technique was previously illustrated on military sheet explosive using millimeter-wave reflection data at frequencies of 18-40 GHz, and on commercial explosives using 2-18 GHz, but the free-space measurement was limited to a single horn with a large-area sample. This work extends the method to imaging data collected at high resolution with an 18-40 GHz imaging system. The identification of explosives is accomplished by extracting the dielectric constant from the free-space, multifrequency data. The reflection coefficient is a function of frequency because of propagation effects associated with the material's complex dielectric constant, which include interference from multiple reflections and energy loss in the sample. The dielectric constant is obtained by numerically fitting the reflection coefficient as a function of frequency to an optical model. In principle, the implementation of this technique in standoff imaging systems would allow threat assessment to be accomplished within the scope of millimeter-wave screening.
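As an illustration of the fitting step described above, the following is a hedged sketch that fits a complex dielectric constant to a normal-incidence slab-reflection model with SciPy; the specific optical model, slab thickness, parameter bounds, and all names are assumptions rather than the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import least_squares

C = 3e8  # speed of light, m/s

def slab_reflection(freq_hz, eps_real, eps_imag, thickness_m):
    """Normal-incidence reflection coefficient of a dielectric slab in free space."""
    n = np.sqrt(eps_real - 1j * eps_imag)          # complex refractive index
    r01 = (1 - n) / (1 + n)                        # air/dielectric Fresnel coefficient
    beta = 2 * np.pi * freq_hz * n * thickness_m / C
    phase = np.exp(-2j * beta)                     # round-trip propagation inside the slab
    return r01 * (1 - phase) / (1 - r01**2 * phase)

def fit_dielectric(freq_hz, measured_refl_mag, thickness_m, guess=(3.0, 0.1)):
    """Least-squares fit of (eps', eps'') to the measured |reflection| spectrum."""
    def residual(p):
        model = np.abs(slab_reflection(freq_hz, p[0], p[1], thickness_m))
        return model - measured_refl_mag
    return least_squares(residual, x0=guess, bounds=([1.0, 0.0], [20.0, 5.0])).x

# Synthetic demonstration over 18-40 GHz for a 6 mm slab with eps = 2.9 - 0.05j
freqs = np.linspace(18e9, 40e9, 200)
truth = np.abs(slab_reflection(freqs, 2.9, 0.05, 6e-3))
print(fit_dielectric(freqs, truth, 6e-3))   # recovers approximately [2.9, 0.05]
```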
Direct Images, Fields of Hilbert Spaces, and Geometric Quantization
NASA Astrophysics Data System (ADS)
Lempert, László; Szőke, Róbert
2014-04-01
Geometric quantization often produces not one Hilbert space to represent the quantum states of a classical system but a whole family H_s of Hilbert spaces, and the question arises if the spaces H_s are canonically isomorphic. Axelrod et al. (J. Diff. Geo. 33:787-902, 1991) and Hitchin (Commun. Math. Phys. 131:347-380, 1990) suggest viewing H_s as fibers of a Hilbert bundle H, introduce a connection on H, and use parallel transport to identify different fibers. Here we explore to what extent this can be done. First we introduce the notion of smooth and analytic fields of Hilbert spaces, and prove that if an analytic field over a simply connected base is flat, then it corresponds to a Hermitian Hilbert bundle with a flat connection and path-independent parallel transport. Second we address a general direct image problem in complex geometry: pushing forward a Hermitian holomorphic vector bundle along a non-proper map. We give criteria for the direct image to be a smooth field of Hilbert spaces. Third we consider quantizing an analytic Riemannian manifold M by endowing TM with the family of adapted Kähler structures from Lempert and Szőke (Bull. Lond. Math. Soc. 44:367-374, 2012). This leads to a direct image problem. When M is homogeneous, we prove the direct image is an analytic field of Hilbert spaces. For certain such M, but not all, the direct image is even flat, which means that in those cases quantization is unique.
Abboud, Talal; Bamsey, Matthew; Paul, Anna-Lisa; Graham, Thomas; Braham, Stephen; Noumeir, Rita; Berinstain, Alain; Ferl, Robert
2013-01-01
Higher plants are an integral part of strategies for sustained human presence in space. Space-based greenhouses have the potential to provide closed-loop recycling of oxygen, water and food. Plant monitoring systems with the capacity to remotely observe the condition of crops in real time within these systems would permit operators to take immediate action to ensure optimum system yield and reliability. One such plant health monitoring technique involves the use of reporter genes driving fluorescent proteins as biological sensors of plant stress. In 2006 an initial prototype green fluorescent protein imager system was deployed at the Arthur Clarke Mars Greenhouse located in the Canadian High Arctic. This prototype demonstrated the advantages of this biosensor technology and underscored the challenges in collecting and managing telemetric data from exigent environments. We present here the design and deployment of a second prototype imaging system deployed within and connected to the infrastructure of the Arthur Clarke Mars Greenhouse. This is the first imager to run autonomously for one year in the un-crewed greenhouse, with command and control conducted through the greenhouse satellite control system. Images were saved locally in high resolution and sent telemetrically in low resolution. Imager hardware is described, including the custom-designed LED growth light and fluorescent excitation light boards, filters, data acquisition and control system, and basic sensing and environmental control. Several critical lessons learned related to the hardware of small plant growth payloads are also elaborated. PMID:23486220
Quick acquisition and recognition method for the beacon in deep space optical communications.
Wang, Qiang; Liu, Yuefei; Ma, Jing; Tan, Liying; Yu, Siyuan; Li, Changjiang
2016-12-01
In deep space optical communications, it is very difficult to acquire the beacon given the long communication distance, and acquisition efficiency is essential for establishing and holding the optical communication link. Here we propose a quick acquisition and recognition method for the beacon in deep space optical communications based on the characteristics of the deep space optical link. To identify the beacon from the background light efficiently, we utilize the maximum similarity between the collected image and the reference image for accurate recognition and acquisition of the beacon within the area of uncertainty. First, the collected image and the reference image are processed by the Fourier-Mellin transform. Second, image sampling and image matching are applied for accurate positioning of the beacon. Finally, a field programmable gate array (FPGA)-based system is used to verify and realize this method. The experimental results showed that the acquisition time for the beacon was as fast as 8.1 s. Future application of this method in the system design of deep space optical communication will be beneficial.
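The abstract does not give implementation details; the sketch below illustrates only the image-matching idea with a plain phase-correlation step (the Fourier-Mellin preprocessing that handles rotation and scale is omitted), and the function name, synthetic test, and sign convention are assumptions.

```python
import numpy as np

def phase_correlation(reference, captured):
    """Locate the offset that best aligns `captured` with `reference`.

    A minimal stand-in for the matching step: the peak of the normalized
    cross-power spectrum marks the most similar alignment of the two images.
    """
    F1 = np.fft.fft2(reference)
    F2 = np.fft.fft2(captured)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12     # keep phase information only
    correlation = np.abs(np.fft.ifft2(cross_power))
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    # map wrap-around indices to signed offsets
    if dy > reference.shape[0] // 2:
        dy -= reference.shape[0]
    if dx > reference.shape[1] // 2:
        dx -= reference.shape[1]
    return dy, dx, correlation.max()

# Synthetic test: a bright "beacon" spot at (30, 30) in the reference, (35, 27) in the capture
ref = np.zeros((64, 64)); ref[30, 30] = 1.0
cap = np.zeros((64, 64)); cap[35, 27] = 1.0
print(phase_correlation(ref, cap))   # offset from the captured spot to the reference spot: (-5, 3)
```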
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Rongyu; Zhao, Changyin; Zhang, Xiaoxiang, E-mail: cyzhao@pmo.ac.cn
The data reduction method for optical space debris observations has many similarities with the one adopted for surveying near-Earth objects; however, due to several specific issues, image degradation is particularly critical, which makes it difficult to obtain precise astrometry. An automatic image reconstruction method was developed to improve the astrometry precision for space debris, based on mathematical morphology operators. Variable structural elements along multiple directions are adopted for image transformation, and all the resultant images are then stacked to obtain a final result. To investigate its efficiency, trial observations were made with Global Positioning System satellites, and the astrometric accuracy improvement was obtained by comparison with the reference positions. The results of our experiments indicate that the influence of degradation in astrometric CCD images is reduced, and the position accuracy of both objects and reference stars is improved distinctly. Our technique will contribute significantly to optical data reduction and high-precision astrometry for space debris.
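One possible reading of the "variable structural elements along multiple directions ... stacked" step is sketched below using OpenCV morphological opening with rotated line elements and a per-pixel maximum; the element length, angle set, and choice of the opening operator are assumptions, not the authors' exact parameters.

```python
import cv2
import numpy as np

def directional_line_kernel(length, angle_deg):
    """Binary line-shaped structuring element of given length and orientation."""
    kernel = np.zeros((length, length), dtype=np.uint8)
    cv2.line(kernel, (0, length // 2), (length - 1, length // 2), 1, 1)
    rot = cv2.getRotationMatrix2D((length / 2 - 0.5, length / 2 - 0.5), angle_deg, 1.0)
    return cv2.warpAffine(kernel, rot, (length, length), flags=cv2.INTER_NEAREST)

def morphological_stack(image, length=9, angles=range(0, 180, 15)):
    """Open the image with line elements along multiple directions and stack the results."""
    stacked = np.zeros_like(image)
    for angle in angles:
        kernel = directional_line_kernel(length, angle)
        opened = cv2.morphologyEx(image, cv2.MORPH_OPEN, kernel)
        stacked = np.maximum(stacked, opened)   # per-pixel maximum over all directions
    return stacked

# reconstructed = morphological_stack(cv2.imread("ccd_frame.png", cv2.IMREAD_GRAYSCALE))
```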
NASA's EPIC View of 2017 Eclipse Across America
2017-08-22
From a million miles out in space, NASA's Earth Polychromatic Imaging Camera (EPIC) captured natural color images of the moon's shadow crossing over North America on Aug. 21, 2017. EPIC is aboard NOAA's Deep Space Climate Observatory (DSCOVR), where it photographs the full sunlit side of Earth every day, giving it a unique view of total solar eclipses. EPIC normally takes about 20 to 22 images of Earth per day, so this animation appears to speed up the progression of the eclipse. To see the images of Earth every day, go to: epic.gsfc.nasa.gov
Heuristic approach to image registration
NASA Astrophysics Data System (ADS)
Gertner, Izidor; Maslov, Igor V.
2000-08-01
Image registration, i.e., the correct mapping of images obtained from different sensor readings onto a common reference frame, is a critical part of multi-sensor ATR/AOR systems based on readings from different types of sensors. In order to fuse two different sensor readings of the same object, the readings have to be put into a common coordinate system. This task can be formulated as an optimization problem in the space of all possible affine transformations of an image. In this paper, a combination of heuristic methods is explored to register gray-scale images. A modification of the Genetic Algorithm is used as the first step in the global search for the optimal transformation. It covers the entire search space with (randomly or heuristically) scattered probe points and helps significantly reduce the search space to a subspace of potentially most successful transformations. Due to its discrete character, however, the Genetic Algorithm in general cannot converge as it approaches the optimum. Its termination point can be specified either as some predefined number of generations or as the achievement of a certain acceptable convergence level. To refine the search, potential optimal subspaces are searched using Tabu Search and Simulated Annealing methods, which are better suited to local search.
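The following compact sketch illustrates the two-stage idea: a global genetic search over affine parameters followed by a simple random local refinement standing in for the Tabu Search / Simulated Annealing stage. The parameterization, fitness measure, population settings, and refinement loop are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.ndimage import affine_transform

def warp(image, params):
    """Apply an affine transform given params = (angle_rad, scale, tx, ty)."""
    angle, scale, tx, ty = params
    c, s = np.cos(angle), np.sin(angle)
    matrix = scale * np.array([[c, -s], [s, c]])
    return affine_transform(image, matrix, offset=(ty, tx), order=1)

def fitness(params, moving, reference):
    """Negative mean squared error between the warped moving image and the reference."""
    return -np.mean((warp(moving, params) - reference) ** 2)

def register(moving, reference, pop_size=40, generations=30, seed=0):
    rng = np.random.default_rng(seed)
    # initial population scattered over plausible rotation/scale/translation ranges
    pop = np.column_stack([
        rng.uniform(-0.5, 0.5, pop_size),     # rotation (rad)
        rng.uniform(0.8, 1.2, pop_size),      # scale
        rng.uniform(-10, 10, (pop_size, 2)),  # tx, ty (pixels)
    ])
    for _ in range(generations):
        scores = np.array([fitness(p, moving, reference) for p in pop])
        elite = pop[np.argsort(scores)[-pop_size // 4:]]           # keep the best quarter
        children = elite[rng.integers(len(elite), size=pop_size - len(elite))]
        children = children + rng.normal(0, [0.02, 0.01, 0.5, 0.5], children.shape)
        pop = np.vstack([elite, children])
    best = max(pop, key=lambda p: fitness(p, moving, reference))
    # crude random local refinement in place of the Tabu / simulated-annealing stage
    for _ in range(200):
        candidate = best + rng.normal(0, [0.005, 0.002, 0.1, 0.1])
        if fitness(candidate, moving, reference) > fitness(best, moving, reference):
            best = candidate
    return best

# best_params = register(moving_image, reference_image)
```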
Imaging Beyond What Man Can See
NASA Technical Reports Server (NTRS)
May, George; Mitchell, Brian
2004-01-01
Three lightweight, portable hyperspectral sensor systems have been built that capture energy from 200 to 1700 nanometers (ultraviolet to shortwave infrared). The sensors incorporate a line scanning technique that requires no relative movement between the target and the sensor. This unique capability, combined with portability, opens up new uses of hyperspectral imaging for laboratory and field environments. Each system has a GUI-based software package that allows the user to communicate with the imaging device for setting spatial resolution, spectral bands and other parameters. NASA's Space Partnership Development has sponsored these innovative developments and their application to human problems on Earth and in space. Hyperspectral datasets have been captured and analyzed in numerous areas including precision agriculture, food safety, biomedical imaging, and forensics. Discussion of research results will include real-time detection of food contaminants, mold and toxin research on corn, identifying counterfeit documents, non-invasive wound monitoring, and aircraft applications. Future research will include development of a thermal infrared hyperspectral sensor that will support natural resource applications on Earth and thermal analyses during long-duration space flight. This paper incorporates a variety of disciplines and imaging technologies that have been linked together to allow the expansion of remote sensing across both traditional and non-traditional boundaries.
Hybrid vision activities at NASA Johnson Space Center
NASA Technical Reports Server (NTRS)
Juday, Richard D.
1990-01-01
NASA's Johnson Space Center in Houston, Texas, is active in several aspects of hybrid image processing. (The term hybrid image processing refers to a system that combines digital and photonic processing). The major thrusts are autonomous space operations such as planetary landing, servicing, and rendezvous and docking. By processing images in non-Cartesian geometries to achieve shift invariance to canonical distortions, researchers use certain aspects of the human visual system for machine vision. That technology flow is bidirectional; researchers are investigating the possible utility of video-rate coordinate transformations for human low-vision patients. Man-in-the-loop teleoperations are also supported by the use of video-rate image-coordinate transformations, as researchers plan to use bandwidth compression tailored to the varying spatial acuity of the human operator. Technological elements being developed in the program include upgraded spatial light modulators, real-time coordinate transformations in video imagery, synthetic filters that robustly allow estimation of object pose parameters, convolutionally blurred filters that have continuously selectable invariance to such image changes as magnification and rotation, and optimization of optical correlation done with spatial light modulators that have limited range and couple both phase and amplitude in their response.
NASA Astrophysics Data System (ADS)
Tan, Ru-Chao; Lei, Tong; Zhao, Qing-Min; Gong, Li-Hua; Zhou, Zhi-Hong
2016-12-01
To improve the slow processing speed of classical image encryption algorithms and enhance the security of private color images, a new quantum color image encryption algorithm based on a hyper-chaotic system is proposed, in which the sequences generated by Chen's hyper-chaotic system are used to scramble and diffuse the three components of the original color image. Subsequently, the quantum Fourier transform is exploited to fulfill the encryption. Numerical simulations show that the presented quantum color image encryption algorithm possesses a large key space to resist illegal attacks, sensitive dependence on initial keys, a uniform distribution of gray values in the encrypted image, and weak correlation between adjacent pixels in the cipher-image.
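For illustration only, the sketch below generates a chaotic keystream and applies a classical scramble-and-diffuse step to each colour component; it uses the 3-D Chen system instead of the 4-D hyper-chaotic variant and omits the quantum Fourier transform stage entirely, so every constant, equation form, and function name here is an assumption, not the paper's algorithm.

```python
import numpy as np

def chen_sequence(n, a=35.0, b=3.0, c=28.0, dt=0.001, state=(0.1, 0.2, 0.3)):
    """Iterate the (3-D) Chen chaotic system with forward-Euler steps and return
    n samples of each state variable (a stand-in for the hyper-chaotic generator)."""
    x, y, z = state
    samples = np.empty((n, 3))
    for i in range(n):
        dx = a * (y - x)
        dy = (c - a) * x - x * z + c * y
        dz = x * y - b * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        samples[i] = (x, y, z)
    return samples

def encrypt_channel(channel, keystream):
    """Scramble pixel positions, then diffuse values by XOR with the keystream."""
    flat = channel.flatten()
    order = np.argsort(keystream)                     # chaotic permutation (scrambling)
    scrambled = flat[order]
    key_bytes = (np.abs(keystream) * 1e6 % 256).astype(np.uint8)
    return (scrambled ^ key_bytes).reshape(channel.shape), order

def encrypt_rgb(image):
    """Encrypt each colour component with its own chaotic sequence."""
    seq = chen_sequence(image.shape[0] * image.shape[1])
    return [encrypt_channel(image[..., k].astype(np.uint8), seq[:, k])[0]
            for k in range(3)]

# cipher_r, cipher_g, cipher_b = encrypt_rgb(rgb_array)
```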
A comparison of radiosity with current methods of sound level prediction in commercial spaces
NASA Astrophysics Data System (ADS)
Beamer, C. Walter, IV; Muehleisen, Ralph T.
2002-11-01
The ray tracing and image methods (and variations thereof) are widely used for the computation of sound fields in architectural spaces. The ray tracing and image methods are best suited for spaces with mostly specular reflecting surfaces. The radiosity method, a method based on solving a system of energy balance equations, is best applied to spaces with mainly diffusely reflective surfaces. Because very few spaces are either purely specular or purely diffuse, all methods must deal with both types of reflecting surfaces. A comparison of the radiosity method to other methods for the prediction of sound levels in commercial environments is presented. [Work supported by NSF.]
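A minimal sketch of the kind of energy-balance system the radiosity method solves, assuming patch radiosities B, source terms E, diffuse reflectances rho, and a form-factor matrix F; the names, the form-factor convention, and the toy numbers are illustrative assumptions.

```python
import numpy as np

def solve_radiosity(emission, reflectance, form_factors):
    """Solve the radiosity energy-balance system B = E + diag(rho) @ F @ B.

    emission     : direct acoustic power per unit area leaving each patch (E_i)
    reflectance  : diffuse reflection coefficient of each patch (rho_i)
    form_factors : F[i, j], fraction of energy leaving patch j that reaches patch i
    """
    n = len(emission)
    A = np.eye(n) - np.diag(reflectance) @ form_factors
    return np.linalg.solve(A, emission)

# Tiny illustrative 3-patch "room": only patch 0 is driven by the source
E = np.array([1.0, 0.0, 0.0])
rho = np.array([0.3, 0.5, 0.7])
F = np.array([[0.0, 0.4, 0.6],
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])
print(solve_radiosity(E, rho, F))   # steady-state radiosity of each patch
```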
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. - In the Vehicle Assembly Building at NASA's Kennedy Space Center, a digital still camera has been mounted in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following ET separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. - In the Vehicle Assembly Building at NASA's Kennedy Space Center, workers check the digital still camera they will mount in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following the tank's separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. - In the Vehicle Assembly Building at NASA's Kennedy Space Center, a worker mounts a digital still camera in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following the ET separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. - In the Vehicle Assembly Building at NASA's Kennedy Space Center, workers prepare a digital still camera they will mount in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following its separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. - In the Vehicle Assembly Building at NASA's Kennedy Space Center, workers prepare a digital still camera they will mount in the External Tank (ET) umbilical well on the aft end of Space Shuttle Discovery. The camera is being used to obtain and downlink high-resolution images of the disconnect point on the ET following the ET separation from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
Efficient and automatic image reduction framework for space debris detection based on GPU technology
NASA Astrophysics Data System (ADS)
Diprima, Francesco; Santoni, Fabio; Piergentili, Fabrizio; Fortunato, Vito; Abbattista, Cristoforo; Amoruso, Leonardo
2018-04-01
In recent years, the increasing number of space debris objects has triggered the need for a distributed monitoring system for the prevention of possible space collisions. Space surveillance based on ground telescopes allows monitoring of the traffic of Resident Space Objects (RSOs) in Earth orbit. This space debris surveillance has several applications, such as orbit prediction and conjunction assessment. In this paper, an optimized and performance-oriented pipeline for source extraction, intended for the automatic detection of space debris in optical data, is proposed. The detection method is based on morphological operations and the Hough transform for lines. Near real-time detection is obtained using General Purpose computing on Graphics Processing Units (GPGPU). The high degree of processing parallelism provided by GPGPU allows data analysis to be split over thousands of threads in order to process big datasets within a limited computational time. The implementation has been tested on a large and heterogeneous image data set, containing satellites from different orbit ranges imaged in multiple observation modes (i.e., sidereal and object tracking). These images were taken during an observation campaign performed from the EQUO (EQUatorial Observatory) observatory located at the Broglio Space Center (BSC) in Kenya, which is part of the ASI-Sapienza Agreement.
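A CPU-only sketch of the morphology-plus-Hough detection stages, without the GPGPU acceleration described in the paper; the thresholds, kernel size, and Hough parameters are assumptions.

```python
import cv2
import numpy as np

def detect_streaks(frame, canny_lo=50, canny_hi=150, min_len=30, max_gap=5):
    """Detect line-like debris streaks in a single optical frame (CPU sketch).

    Mirrors the morphology + Hough-transform-for-lines stages, on one frame.
    """
    # suppress isolated hot pixels while keeping elongated structures
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    cleaned = cv2.morphologyEx(frame, cv2.MORPH_OPEN, kernel)
    edges = cv2.Canny(cleaned, canny_lo, canny_hi)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=min_len, maxLineGap=max_gap)
    return [] if lines is None else [tuple(l[0]) for l in lines]

# frame = cv2.imread("observation_frame.png", cv2.IMREAD_GRAYSCALE)
# print(detect_streaks(frame))
```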
Charge-coupled device image sensor study
NASA Technical Reports Server (NTRS)
1973-01-01
The design specifications and predicted performance characteristics of a Charge-Coupled Device Area Imager and a Charge-Coupled Device Linear Imager are presented. The Imagers recommended are intended for use in space-borne imaging systems and therefore would meet the requirements for the intended application. A unique overlapping metal electrode structure and a buried channel structure are described. Reasons for the particular imager designs are discussed.
Thermographic Imaging of the Space Shuttle During Re-Entry Using a Near Infrared Sensor
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.; Horvath, Thomas J.; Kerns, Robbie V.; Burke, Eric R.; Taylor, Jeff C.; Spisz, Tom; Gibson, David M.; Shea, Edward J.; Mercer, C. David; Schwartz, Richard J.;
2012-01-01
High resolution calibrated near infrared (NIR) imagery of the Space Shuttle Orbiter was obtained during hypervelocity atmospheric re-entry of the STS-119, STS-125, STS-128, STS-131, STS-132, STS-133, and STS-134 missions. This data has provided information on the distribution of surface temperature and the state of the airflow over the windward surface of the Orbiter during descent. The thermal imagery complemented data collected with onboard surface thermocouple instrumentation. The spatially resolved global thermal measurements made during the Orbiter's hypersonic re-entry will provide critical flight data for reducing the uncertainty associated with present day ground-to-flight extrapolation techniques and current state-of-the-art empirical boundary-layer transition or turbulent heating prediction methods. Laminar and turbulent flight data is critical for the validation of physics-based, semi-empirical boundary-layer transition prediction methods as well as stimulating the validation of laminar numerical chemistry models and the development of turbulence models supporting NASA's next-generation spacecraft. In this paper we provide details of the NIR imaging system used on both air and land-based imaging assets. The paper will discuss calibrations performed on the NIR imaging systems that permitted conversion of captured radiant intensity (counts) to temperature values. Image processing techniques are presented to analyze the NIR data for vignetting distortion, best resolution, and image sharpness. Keywords: HYTHIRM, Space Shuttle thermography, hypersonic imaging, near infrared imaging, histogram analysis, singular value decomposition, eigenvalue image sharpness
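The abstract notes that calibration converts radiant intensity (counts) to temperature; a hedged sketch of one generic way this can be done is shown below, using a linear gain/offset fit against blackbody references followed by numerical inversion of Planck's law at an assumed effective NIR wavelength. The wavelength, calibration points, and count values are hypothetical, not the paper's calibration.

```python
import numpy as np
from scipy.optimize import brentq

H = 6.626e-34    # Planck constant (J s)
C = 2.998e8      # speed of light (m/s)
KB = 1.381e-23   # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (Planck's law)."""
    return (2 * H * C**2 / wavelength_m**5) / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

def calibrate_gain_offset(counts, temps_k, wavelength_m=1.0e-6):
    """Fit counts = gain * L(T) + offset from views of blackbody references."""
    L = planck_radiance(wavelength_m, np.asarray(temps_k))
    gain, offset = np.polyfit(L, counts, 1)
    return gain, offset

def counts_to_temperature(count, gain, offset, wavelength_m=1.0e-6):
    """Invert the calibration numerically to recover surface temperature from counts."""
    radiance = (count - offset) / gain
    return brentq(lambda T: planck_radiance(wavelength_m, T) - radiance, 200.0, 3000.0)

# Hypothetical blackbody calibration points at 600 K and 1200 K
gain, offset = calibrate_gain_offset(counts=[1.2e4, 9.8e5], temps_k=[600.0, 1200.0])
print(counts_to_temperature(2.0e5, gain, offset))   # recovered temperature in kelvin
```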
SAR processing using SHARC signal processing systems
NASA Astrophysics Data System (ADS)
Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.
1998-09-01
Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.
Real-space Wigner-Seitz Cells Imaging of Potassium on Graphite via Elastic Atomic Manipulation
Yin, Feng; Koskinen, Pekka; Kulju, Sampo; Akola, Jaakko; Palmer, Richard E.
2015-01-01
Atomic manipulation in scanning tunnelling microscopy (STM), conventionally a tool to build nanostructures one atom at a time, is here employed to enable the atomic-scale imaging of a model low-dimensional system. Specifically, we use low-temperature STM to investigate an ultra-thin film (4 atomic layers) of potassium created by epitaxial growth on a graphite substrate. The STM images display an unexpected honeycomb feature, which corresponds to a real-space visualization of the Wigner-Seitz cells of the close-packed surface K atoms. Density functional simulations indicate that this behaviour arises from the elastic, tip-induced vertical manipulation of potassium atoms during imaging, i.e. elastic atomic manipulation, and reflects the ultrasoft properties of the surface under strain. The method may be generally applicable to other soft systems, e.g. molecular or biomolecular systems. PMID:25651973
Coronagraphic Imaging of Debris Disks from a High Altitude Balloon Platform
NASA Technical Reports Server (NTRS)
Unwin, Stephen; Traub, Wesley; Bryden, Geoffrey; Brugarolas, Paul; Chen, Pin; Guyon, Olivier; Hillenbrand, Lynne; Kasdin, Jeremy; Krist, John; Macintosh, Bruce;
2012-01-01
Debris disks around nearby stars are tracers of the planet formation process, and they are a key element of our understanding of the formation and evolution of extrasolar planetary systems. With multi-color images of a significant number of disks, we can probe important questions: can we learn about planetary system evolution; what materials are the disks made of; and can they reveal the presence of planets? Most disks are known to exist only through their infrared flux excesses as measured by the Spitzer Space Telescope, and through images measured by Herschel. The brightest, most extended disks have been imaged with HST, and a few, such as Fomalhaut, can be observed using ground-based telescopes. But the number of good images is still very small, and there are none of disks with densities as low as the disk associated with the asteroid belt and Edgeworth-Kuiper belt in our own Solar System. Direct imaging of disks is a major observational challenge, demanding high angular resolution and extremely high dynamic range close to the parent star. The ultimate experiment requires a space-based platform, but demonstrating much of the needed technology, mitigating the technical risks of a space-based coronagraph, and performing valuable measurements of circumstellar debris disks, can be done from a high-altitude balloon platform. In this paper we present a balloon-borne telescope experiment based on the Zodiac II design that would undertake compelling studies of a sample of debris disks.
Coronagraphic Imaging of Debris Disks from a High Altitude Balloon Platform
NASA Technical Reports Server (NTRS)
Unwin, Stephen; Traub, Wesley; Bryden, Geoffrey; Brugarolas, Paul; Chen, Pin; Guyon, Olivier; Hillenbrand, Lynne; Krist, John; Macintosh, Bruce; Mawet, Dimitri;
2012-01-01
Debris disks around nearby stars are tracers of the planet formation process, and they are a key element of our understanding of the formation and evolution of extrasolar planetary systems. With multi-color images of a significant number of disks, we can probe important questions: can we learn about planetary system evolution; what materials are the disks made of; and can they reveal the presence of planets? Most disks are known to exist only through their infrared flux excesses as measured by the Spitzer Space Telescope, and through images measured by Herschel. The brightest, most extended disks have been imaged with HST, and a few, such as Fomalhaut, can be observed using ground-based telescopes. But the number of good images is still very small, and there are none of disks with densities as low as the disk associated with the asteroid belt and Edgeworth-Kuiper belt in our own Solar System. Direct imaging of disks is a major observational challenge, demanding high angular resolution and extremely high dynamic range close to the parent star. The ultimate experiment requires a space-based platform, but demonstrating much of the needed technology, mitigating the technical risks of a space-based coronagraph, and performing valuable measurements of circumstellar debris disks, can be done from a high-altitude balloon platform. In this paper we present a balloon-borne telescope concept based on the Zodiac II design that could undertake compelling studies of a sample of debris disks.
High-Definition Television (HDTV) Images for Earth Observations and Earth Science Applications
NASA Technical Reports Server (NTRS)
Robinson, Julie A.; Holland, S. Douglas; Runco, Susan K.; Pitts, David E.; Whitehead, Victor S.; Andrefouet, Serge M.
2000-01-01
As part of Detailed Test Objective 700-17A, astronauts acquired Earth observation images from orbit using a high-definition television (HDTV) camcorder. Here we provide a summary of qualitative findings following completion of tests during missions STS (Space Transportation System)-93 and STS-99. We compared HDTV imagery stills to images taken using payload bay video cameras, a Hasselblad film camera, and an electronic still camera. We also evaluated the potential for motion video observations of changes in sunlight and the use of multi-aspect viewing to image aerosols. Spatial resolution and color quality are far superior in HDTV images compared to National Television Systems Committee (NTSC) video images. Thus, HDTV provides the first viable option for video-based remote sensing observations of Earth from orbit. Although, under ideal conditions, HDTV images have less spatial resolution than medium-format film cameras such as the Hasselblad, under some conditions on orbit the HDTV images acquired compared favorably with the Hasselblad. Of particular note was the quality of color reproduction in the HDTV images. HDTV and electronic still camera (ESC) images were not compared with matched fields of view, and so spatial resolution could not be compared for the two image types. However, the color reproduction of the HDTV stills was truer than the colors in the ESC images. As HDTV becomes the operational video standard for the Space Shuttle and Space Station, HDTV has great potential as a source of Earth-observation data. Planning for the conversion from NTSC to HDTV video standards should include planning for Earth data archiving and distribution.
ERIC Educational Resources Information Center
Kingwell, Jeff
1996-01-01
Data management systems for earth science information gathered from space are being affected by two related trends: (1) a move from ad hoc systems established for particular projects to a longer lasting national and global infrastructure; and (2) an emphasis on efficient service delivery in an era of diminishing resources for national space…
Spaceborne electronic imaging systems
NASA Technical Reports Server (NTRS)
1971-01-01
Criteria and recommended practices for the design of the spaceborne elements of electronic imaging systems are presented. A spaceborne electronic imaging system is defined as a device that collects energy in some portion of the electromagnetic spectrum with detector(s) whose direct output is an electrical signal that can be processed (using direct transmission or delayed transmission after recording) to form a pictorial image. This definition encompasses both image tube systems and scanning point-detector systems. The intent was to collect the design experience and recommended practice of the several systems possessing the common denominator of acquiring images from space electronically and to maintain the system viewpoint rather than pursuing specialization in devices. The devices may be markedly different physically, but each was designed to provide a particular type of image within particular limitations. Performance parameters which determine the type of system selected for a given mission and which influence the design include: Sensitivity, Resolution, Dynamic range, Spectral response, Frame rate/bandwidth, Optics compatibility, Image motion, Radiation resistance, Size, Weight, Power, and Reliability.
2004-02-04
KENNEDY SPACE CENTER, FLA. - Reporters are eager to hear from Armando Oliu about the aid the Image Analysis Lab is giving the FBI in a kidnapping case. Oliu, Final Inspection Team lead for the Shuttle program, oversees the lab that is using an advanced SGI® TP9500 data management system to review the tape of the kidnapping in progress in Sarasota, Fla. KSC installed the new $3.2 million system in preparation for Return to Flight of the Space Shuttle fleet. The lab is studying the Sarasota kidnapping video to provide any new information possible to law enforcement officers. KSC is joining NASA’s Marshall Space Flight Center in Alabama in reviewing the tape.
Interactive Scene Analysis Module - A sensor-database fusion system for telerobotic environments
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; Vazquez, Sixto L.; Goode, Plesent W.
1992-01-01
Accomplishing a task with telerobotics typically involves a combination of operator control/supervision and a 'script' of preprogrammed commands. These commands usually assume that the locations of various objects in the task space conform to some internal representation (database) of that task space. The ability to quickly and accurately verify the task environment against the internal database would improve the robustness of these preprogrammed commands. In addition, the on-line initialization and maintenance of a task space database is difficult for operators using Cartesian coordinates alone. This paper describes the Interactive Scene Analysis Module (ISAM), developed to provide task-space database initialization and verification utilizing 3-D graphic overlay modelling, video imaging, and laser-radar-based range imaging. Through the fusion of task-space database information and image sensor data, a verifiable task-space model is generated, providing location and orientation data for objects in a task space. This paper also describes applications of the ISAM in the Intelligent Systems Research Laboratory (ISRL) at NASA Langley Research Center, and discusses its performance relative to representation accuracy and operator interface efficiency.
MOSAIC - A space-multiplexing technique for optical processing of large images
NASA Technical Reports Server (NTRS)
Athale, Ravindra A.; Astor, Michael E.; Yu, Jeffrey
1993-01-01
A technique for Fourier processing of images larger than the space-bandwidth products of conventional or smart spatial light modulators and two-dimensional detector arrays is described. The technique involves a spatial combination of subimages displayed on individual spatial light modulators to form a phase-coherent image, which is subsequently processed with Fourier optical techniques. Because of the technique's similarity with the mosaic technique used in art, the processor used is termed an optical MOSAIC processor. The phase accuracy requirements of this system were studied by computer simulation. It was found that phase errors of less than lambda/8 did not degrade the performance of the system and that the system was relatively insensitive to amplitude nonuniformities. Several schemes for implementing the subimage combination are described. Initial experimental results demonstrating the validity of the mosaic concept are also presented.
Three-dimensional tracking and imaging laser scanner for space operations
NASA Astrophysics Data System (ADS)
Laurin, Denis G.; Beraldin, J. A.; Blais, Francois; Rioux, Marc; Cournoyer, Luc
1999-05-01
This paper presents the development of a laser range scanner (LARS) as a three-dimensional sensor for space applications. The scanner is a versatile system capable of surface imaging, target ranging, and tracking. It is capable of short-range (0.5 m to 20 m) and long-range (20 m to 10 km) sensing using triangulation and time-of-flight (TOF) methods, respectively. At short range (1 m), the resolution is sub-millimeter and drops gradually with distance (2 cm at 10 m). For long range, the TOF provides a constant resolution of plus or minus 3 cm, independent of range. The LARS could complement the existing Canadian Space Vision System (CSVS) for robotic manipulation. As an active vision system, the LARS is immune to sunlight and adverse lighting; this is a major advantage over the CSVS, as outlined in this paper. The LARS could also replace existing radar systems used for rendezvous and docking. There are clear advantages of an optical system over a microwave radar in terms of size, mass, power, and precision. Equipped with two high-speed galvanometers, the laser can be steered to address any point in a 30° x 30° field of view. The scanning can be continuous (raster scan, Lissajous) or direct (random). This gives the scanner the ability to register high-resolution 3D images of range and intensity (up to 4000 x 4000 pixels) and to perform point target tracking as well as object recognition and geometrical tracking. The imaging capability of the scanner using an eye-safe laser is demonstrated. An efficient fiber laser delivers 60 mW of CW power or 3 μJ pulses at 20 kHz for TOF operation. Implementation of search and track of multiple targets is also demonstrated. For a single target, refresh rates up to 137 Hz are possible. Considerations for space qualification of the scanner are discussed. Typical space operations, such as docking, object attitude tracking, and inspections are described.
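The two ranging modes mentioned above reduce to simple geometric relations; the sketch below shows a time-of-flight range computed from the pulse round-trip time and a triangulation range computed from the projection and detection angles across the sensor baseline. The numeric examples and function names are illustrative assumptions.

```python
from math import tan, radians

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_time_s):
    """Long-range mode: range from the pulse round-trip time (time of flight)."""
    return C * round_trip_time_s / 2.0

def range_from_triangulation(baseline_m, projection_angle_deg, detection_angle_deg):
    """Short-range mode: range from the laser projection and detection angles
    measured at the two ends of the sensor baseline."""
    a = radians(projection_angle_deg)
    b = radians(detection_angle_deg)
    # perpendicular distance from the baseline to the illuminated spot
    return baseline_m * tan(a) * tan(b) / (tan(a) + tan(b))

print(range_from_tof(66.7e-9))                     # ~10 m target
print(range_from_triangulation(0.2, 80.0, 84.0))   # ~0.7 m target
```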
Characterization of steel rebar spacing using synthetic aperture radar imaging
NASA Astrophysics Data System (ADS)
Hu, Jie; Tang, Qixiang; Twumasi, Jones Owusu; Yu, Tzuyang
2018-03-01
Steel rebar is a vital component in reinforced concrete (RC) and prestressed concrete structures, since it provides mechanical function to those structures. Damage to steel rebars can lead to the premature failure of concrete structures. Characterization of steel rebars using nondestructive evaluation (NDE) offers engineers and decision makers important information for the effective repair of aging concrete structures. Among existing NDE techniques, microwave/radar NDE has proven to be a promising technique for surface and subsurface sensing of concrete structures. The objective of this paper is to use microwave/radar NDE to characterize steel rebar grids in free space, as a basis for the subsurface sensing of steel rebars inside RC structures. A portable 10-GHz radar system based on synthetic aperture radar (SAR) imaging was used in this paper. The effect of rebar grid spacing was considered and used to define subsurface steel rebar grids. Five rebar grid spacings were used: 12.7 cm (5 in.), 17.78 cm (7 in.), 22.86 cm (9 in.), 27.94 cm (11 in.), and 33.02 cm (13 in.). #3 rebars were used in all grid specimens. All SAR images were collected inside an anechoic chamber. It was found that SAR images can successfully capture the change of rebar grid spacing and be used for quantifying the spacing of rebar grids. Empirical models were proposed to estimate actual rebar spacing and contour area from SAR images.
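As a hedged illustration of how grid spacing might be quantified from a SAR amplitude image, the sketch below projects the image onto one axis and measures the separation of intensity peaks; the peak-detection parameters and the synthetic example are assumptions, not the paper's empirical models.

```python
import numpy as np
from scipy.signal import find_peaks

def estimate_rebar_spacing(sar_image, pixel_size_cm, axis=1):
    """Estimate rebar grid spacing from a SAR amplitude image.

    Rebars appear as regularly spaced bright returns; projecting the image along
    one axis and measuring the distance between intensity peaks gives the spacing.
    """
    profile = sar_image.mean(axis=axis)
    peaks, _ = find_peaks(profile, height=profile.mean(), distance=5)
    if len(peaks) < 2:
        return None
    return float(np.mean(np.diff(peaks)) * pixel_size_cm)

# Synthetic example: bright rows every 25 pixels with 0.5 cm pixels -> 12.5 cm spacing
img = np.random.rand(200, 200) * 0.1
img[::25, :] += 1.0
print(estimate_rebar_spacing(img, pixel_size_cm=0.5, axis=1))
```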
Navigator Accuracy Requirements for Prospective Motion Correction
Maclaren, Julian; Speck, Oliver; Stucht, Daniel; Schulze, Peter; Hennig, Jürgen; Zaitsev, Maxim
2010-01-01
Prospective motion correction in MR imaging is becoming increasingly popular to prevent the image artefacts that result from subject motion. Navigator information is used to update the position of the imaging volume before every spin excitation so that lines of acquired k-space data are consistent. Errors in the navigator information, however, result in residual errors in each k-space line. This paper presents an analysis linking noise in the tracking system to the power of the resulting image artefacts. An expression is formulated for the required navigator accuracy based on the properties of the imaged object and the desired resolution. Analytical results are compared with computer simulations and experimental data. PMID:19918892
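A minimal simulation in the spirit of the analysis described above: residual per-line translation errors are applied as linear phases in k-space and the resulting relative artifact power is reported. The phantom, noise model, and field-of-view value are assumptions, not the paper's expressions.

```python
import numpy as np

def simulate_residual_motion_artifact(image, sigma_mm, fov_mm=256.0, seed=0):
    """Corrupt each phase-encode line with the residual translation error left
    after prospective correction and return the relative artifact power.

    A line at ky acquired with an uncorrected in-plane shift dy picks up a linear
    phase exp(-2j*pi*ky*dy/FOV); artifact power grows with the tracking noise sigma.
    """
    ny, nx = image.shape
    kspace = np.fft.fftshift(np.fft.fft2(image))
    rng = np.random.default_rng(seed)
    ky = np.arange(ny) - ny // 2
    dy = rng.normal(0.0, sigma_mm, ny)             # residual navigator error per line
    phase = np.exp(-2j * np.pi * ky * dy / fov_mm)
    corrupted = np.fft.ifft2(np.fft.ifftshift(kspace * phase[:, None]))
    artifact = np.abs(corrupted) - np.abs(image)
    return np.sum(artifact ** 2) / np.sum(np.abs(image) ** 2)

# Example: a simple phantom and two navigator noise levels
phantom = np.zeros((128, 128)); phantom[40:90, 40:90] = 1.0
print(simulate_residual_motion_artifact(phantom, sigma_mm=0.1))
print(simulate_residual_motion_artifact(phantom, sigma_mm=1.0))
```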
NASA Astrophysics Data System (ADS)
Zhao, Feng; Frietman, Edward E. E.; Han, Zhong; Chen, Ray T.
1999-04-01
A characteristic feature of a conventional von Neumann computer is that computing power is delivered by a single processing unit. Although increasing the clock frequency improves the performance of the computer, the switching speed of the semiconductor devices and the finite speed at which electrical signals propagate along the bus set the boundaries. Architectures containing large numbers of nodes can solve this performance dilemma, although the main obstacles in designing such systems are the difficulties in finding solutions that guarantee efficient communication among the nodes. Exchanging data becomes a real bottleneck when all nodes are connected by a shared resource. Only optics, due to its inherent parallelism, can relieve that bottleneck. Here, we explore a multi-faceted free-space image distributor to be used in optical interconnects for massively parallel processing. In this paper, physical and optical models of the image distributor are developed, from the diffraction theory of light waves to optical simulations. The general features and the performance of the image distributor are also described, and a new structure for the image distributor and simulations of it are discussed. From the digital simulations and experiments, it is found that the multi-faceted free-space image distribution technique is well suited to free-space optical interconnection in massively parallel processing and that the new structure of the multi-faceted free-space image distributor would perform better.
Performance of Scattering Matrix Decomposition and Color Spaces for Synthetic Aperture Radar Imagery
2010-03-01
This report covers color spaces and synthetic aperture radar (SAR) multicolor imaging, including colorimetry fundamentals (the RGB and CMY color spaces) and scattering matrix decomposition techniques applied to imagery from space-based polarimetric SAR systems.
NASA Technical Reports Server (NTRS)
1999-01-01
This video gives a brief history of the Jet Propulsion Laboratory, current missions, and what the future may hold. Scenes include various planets in the solar system, robotic exploration of space, discussions on the Hubble Space Telescope, the source of life, and solar winds. This video was narrated by Jodie Foster. Animations include: a close-up image of the Moon; close-up images of the surface of Mars; robotic exploration of Mars; the first mapping assignment of Mars; animated views of Jupiter; animated views of Saturn; and views of a giant storm on Neptune called the Great Dark Spot.
General view of the mid deck of the Orbiter Discovery ...
General view of the mid deck of the Orbiter Discovery during pre-launch preparations. Note the payload and mission specialists' seats. The seats are removed, packed, and stowed during on-orbit activities. Also note the black panels at the right of the image; they are protective panels used for preparation of the orbiter and astronaut ingress while the orbiter is in its vertical launch position. This image was taken at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Manned observations technology development, FY 1992 report
NASA Technical Reports Server (NTRS)
Israel, Steven
1992-01-01
This project evaluated the suitability of the NASA/JSC developed electronic still camera (ESC) digital image data for Earth observations from the Space Shuttle, as a first step to aid planning for Space Station Freedom. Specifically, image resolution achieved from the Space Shuttle using the current ESC system, which is configured with a Loral 15 mm x 15 mm (1024 x 1024 pixel array) CCD chip on the focal plane of a Nikon F4 camera, was compared to that of current handheld 70 mm Hasselblad 500 EL/M film cameras.
A generic FPGA-based detector readout and real-time image processing board
NASA Astrophysics Data System (ADS)
Sarpotdar, Mayuresh; Mathew, Joice; Safonova, Margarita; Murthy, Jayant
2016-07-01
For space-based astronomical observations, it is important to have a mechanism to capture the digital output from the standard detector for further on-board analysis and storage. We have developed a generic (application-wise) field-programmable gate array (FPGA) board to interface with an image sensor, a method to generate the clocks required to read the image data from the sensor, and a real-time on-chip image processor system which can be used for various image processing tasks. The FPGA board is applied as the image processor board in the Lunar Ultraviolet Cosmic Imager (LUCI) and a star sensor (StarSense) - instruments developed by our group. In this paper, we discuss the various design considerations for this board and its applications in future balloon and possible space flights.
ISIM Lowered into Thermal Vacuum Chamber
2017-12-08
An overhead glimpse inside the thermal vacuum chamber at NASA's Goddard Space Flight Center in Greenbelt, Md., as engineers ready the James Webb Space Telescope's Integrated Science Instrument Module (ISIM), just lowered into the chamber for its first thermal vacuum test. The ISIM and the ISIM System Integration Fixture that holds the ISIM Electronics Compartment are completely covered in protective blankets to shield them from contamination. Image credit: NASA/Chris Gunn
The infrared video image pseudocolor processing system
NASA Astrophysics Data System (ADS)
Zhu, Yong; Zhang, JiangLing
2003-11-01
The infrared video image pseudo-color processing system, with emphasis on the algorithm and its implementation for representing a measured object's 2D temperature distribution using pseudo-color technology, is introduced in this paper. The thermal image data of the measured object are an objective presentation of its surface temperature distribution, whereas color has a close relationship with a viewer's subjective cognition. Pseudo-color technology bridges this gap between subjectivity and objectivity, representing the measured object's temperature distribution intuitively and directly. The pseudo-color algorithm is based on distance in IHS color space, and from it a definition of pseudo-color visual resolution is put forward. The software (which realizes the map from the sample data to the color space) and the hardware (which carries out the conversion from the color space to the palette in HDL) cooperate, so a two-level mapping, a logical map and a physical map, is presented. The system has been widely used in failure diagnosis of electric power devices, fire protection for lifesaving, and recently even SARS detection in China.
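As a rough illustration of the idea of mapping temperature data to color, the following minimal Python sketch maps a 2D temperature field through a hue ramp in HSV space. It is not the IHS-distance algorithm described above; the function name and temperature values are purely illustrative.

    import numpy as np
    import colorsys

    def pseudocolor(temps, t_min, t_max):
        """Map a 2D temperature array to RGB by sweeping hue from blue (cold) to red (hot).

        Illustrative only: the published mapping is defined by distances in IHS space;
        here a simple HSV hue ramp is used as a stand-in.
        """
        temps = np.asarray(temps, dtype=float)
        norm = np.clip((temps - t_min) / (t_max - t_min), 0.0, 1.0)
        rgb = np.zeros(temps.shape + (3,))
        for idx, v in np.ndenumerate(norm):
            hue = (1.0 - v) * 2.0 / 3.0          # 240 deg (blue) -> 0 deg (red)
            rgb[idx] = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        return rgb

    # Example: a synthetic 4x4 "thermal image" spanning 20-80 degrees C
    field = np.linspace(20.0, 80.0, 16).reshape(4, 4)
    colored = pseudocolor(field, 20.0, 80.0)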
NASA Technical Reports Server (NTRS)
1999-01-01
Duncan Technologies, Inc., (DTI) developed an infrared imaging system for detection of hydrogen flames in the Space Shuttle Main Engines. The product is the result of a NASA Small Business Innovation Research (SBIR) award from the Stennis Space Center.
A conceptual design for an exoplanet imager
NASA Astrophysics Data System (ADS)
Hyland, David C.; Winkeller, Jon; Mosher, Robert; Momin, Anif; Iglesias, Gerardo; Donnellan, Quentin; Stanley, Jerry; Myers, Storm; Whittington, William G.; Asazuma, Taro; Slagle, Kami; Newton, Lindsay; Bourgeois, Scott; Tejeda, Donny; Young, Brian; Shaver, Nick; Cooper, Jacob; Underwood, Dennis; Perkins, James; Morea, Nathan; Goodnight, Ryan; Colunga, Aaron; Peltier, Scott; Singleton, Zane; Brashear, John; McPherson, Ronald; Guillory, Winston; Patel, Sunil; Stovall, Rachel; Meyer, Ryall; Eberle, Patrick; Morrison, Cole; Mong, Chun Yu
2007-09-01
This paper reports the results of a design study for an exoplanet imaging system. The design team consisted of the students in the "Electromagnetic Sensing for Space-Borne Imaging" class taught by the principal author in the Spring 2005 semester. The design challenge was to devise a space system capable of forming 10x10 pixel images of terrestrial-class planets out to 10 parsecs, observing in the 9.0 to 17.0 micron range. It was presumed that this system would operate after the Terrestrial Planet Finder had been deployed and had identified a number of planetary systems for more detailed imaging. The design team evaluated a large number of tradeoffs, starting with the use of a single monolithic telescope, versus a truss-mounted sparse aperture, versus a formation of free-flying telescopes. Having selected the free-flyer option, the team studied a variety of sensing technologies, including amplitude interferometry, intensity correlation imaging (ICI, based on the Hanbury Brown-Twiss effect and phase retrieval), heterodyne interferometry, and direct electric field reconstruction. Intensity correlation imaging was found to have several advantages: it does not require combiner spacecraft, nanometer-level control of the relative positions, or diffraction-limited optics. Orbit design, telescope design, spacecraft structural design, thermal management, and communications architecture trades were also addressed. A six-spacecraft design involving non-repeating baselines was selected. By varying the overall scale of the baselines it was found possible to unambiguously characterize an entire multi-planet system, to image the parent star and, for the largest baseline scales, to determine 10x10 pixel images of individual planets.
STS-133 crew members Mike Barratt and Nicole Stott in cupola
2010-06-08
JSC2010-E-090701 (8 June 2010) --- Several computer monitors are featured in this image photographed during an STS-133 exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth.
A visualization system for CT based pulmonary fissure analysis
NASA Astrophysics Data System (ADS)
Pu, Jiantao; Zheng, Bin; Park, Sang Cheol
2009-02-01
In this study we describe a visualization system for pulmonary fissures depicted on CT images. The purpose is to provide clinicians with an intuitive perception of a patient's lung anatomy through an interactive examination of fissures, enhancing their understanding and accurate diagnosis of lung diseases. This system consists of four key components: (1) region-of-interest segmentation; (2) three-dimensional surface modeling; (3) fissure type classification; and (4) an interactive user interface, by which the extracted fissures are displayed flexibly in different space domains including image space, geometric space, and mixed space using simple toggling "on" and "off" operations. In this system, the different visualization modes allow users not only to examine the fissures themselves but also to analyze the relationship between fissures and their surrounding structures. In addition, the users can adjust thresholds interactively to visualize the fissure surface under different scanning and processing conditions. Such a visualization tool is expected to facilitate investigation of structures near the fissures and provide an efficient "visual aid" for other applications such as treatment planning and assessment of therapeutic efficacy as well as education of medical professionals.
NASA Technical Reports Server (NTRS)
1999-01-01
This narrow angle image taken by Cassini's camera system of the Moon is one of the best of a sequence of narrow angle frames taken as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. The 80 millisecond exposure was taken through a spectral filter centered at 0.33 microns; the filter bandpass was 85 Angstroms wide. The spatial scale of the image is about 1.4 miles per pixel (about 2.3 kilometers). The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ. Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona. Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.
Holkenbrink, Patrick F.
1978-01-01
Landsat data are received by National Aeronautics and Space Administration (NASA) tracking stations and converted into digital form on high-density tapes (HDTs) by the Image Processing Facility (IPF) at the Goddard Space Flight Center (GSFC), Greenbelt, Maryland. The HDTs are shipped to the EROS Data Center (EDC) where they are converted into customer products by the EROS Data Center digital image processing system (EDIPS). This document describes in detail one of these products: the computer-compatible tape (CCT) produced from Landsat-1, -2, and -3 multispectral scanner (MSS) data and Landsat-3 only return-beam vidicon (RBV) data. Landsat-1 and -2 RBV data will not be processed by IPF/EDIPS to CCT format.
Conference on Space and Military Applications of Automation and Robotics
NASA Technical Reports Server (NTRS)
1988-01-01
Topics addressed include: robotics; deployment strategies; artificial intelligence; expert systems; sensors and image processing; robotic systems; guidance, navigation, and control; aerospace and missile system manufacturing; and telerobotics.
Diemoz, Paul C; Vittoria, Fabio A; Olivo, Alessandro
2016-05-16
Previous studies on edge illumination (EI) X-ray phase-contrast imaging (XPCi) have investigated the nature and amplitude of the signal provided by this technique. However, the response of the imaging system to different object spatial frequencies had never been explicitly considered and studied. This is required in order to predict the performance of a given EI setup for different classes of objects. To this end, in the present work we derive analytical expressions for the contrast transfer function of an EI imaging system, using the near-field approximation, and study its dependence upon the main experimental parameters. We then exploit these results to compare the frequency response of an EI system with that of a free-space propagation XPCi system. The results achieved in this work can be useful for predicting the signals obtainable for different types of objects and also as a basis for new retrieval methods.
Space Optic Manufacturing - X-ray Mirror
NASA Technical Reports Server (NTRS)
1998-01-01
NASA's Space Optics Manufacturing Center has been working to expand our view of the universe via sophisticated new telescopes. The Optics Center's goal is to develop low-cost, advanced space optics technologies for the NASA program in the 21st century - including the long-term goal of imaging Earth-like planets in distant solar systems. To reduce the cost of mirror fabrication, Marshall Space Flight Center (MSFC) has developed replication techniques and the machinery and materials to replicate electro-formed nickel mirrors. The process allows precisely shaped mandrels to be fabricated and then used and reused as masters for replicating high-quality mirrors. This image shows a lightweight replicated x-ray mirror with gold coatings applied.
Terahertz Tools Advance Imaging for Security, Industry
NASA Technical Reports Server (NTRS)
2010-01-01
Picometrix, a wholly owned subsidiary of Advanced Photonix Inc. (API), of Ann Arbor, Michigan, invented the world's first commercial terahertz system. The company improved the portability and capabilities of their systems through Small Business Innovation Research (SBIR) agreements with Langley Research Center to provide terahertz imaging capabilities for inspecting the space shuttle external tanks and orbiters. Now API's systems make use of the unique imaging capacity of terahertz radiation on manufacturing floors, for thickness measurements of coatings, pharmaceutical tablet production, and even art conservation.
NASA Astrophysics Data System (ADS)
Speicher, Andy; Matin, Mohammad; Tippets, Roger; Chun, Francis
2014-09-01
In order to protect critical military and commercial space assets, the United States Space Surveillance Network must have the ability to positively identify and characterize all space objects. Unfortunately, positive identification and characterization of space objects is a manual and labor intensive process today since even large telescopes cannot provide resolved images of most space objects. The objective of this study was to calibrate a system to exploit the optical signature of unresolved geosynchronous satellite images by collecting polarization data in the visible wavelengths for the purpose of revealing discriminating features. These features may lead to positive identification or classification of each satellite. The system was calibrated with an algorithm and process that takes raw observation data from a two-channel polarimeter and converts it to Stokes parameters S0 and S1. This instrumentation is a new asset for the United States Air Force Academy (USAFA) Department of Physics and consists of one 20-inch Ritchey-Chretien telescope and a dual focal plane system fed with a polarizing beam splitter. This study calibrated the system and collected preliminary polarization data on five geosynchronous satellites to validate performance. Preliminary data revealed that each of the five satellites had a different polarization signature that could potentially lead to identification in future studies.
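As a rough illustration of converting two-channel polarimeter measurements to the Stokes parameters S0 and S1 mentioned above, the minimal Python sketch below assumes the beam splitter separates two orthogonal linear-polarization components; the actual USAFA calibration pipeline (including instrumental corrections) is not reproduced here.

    import numpy as np

    def stokes_s0_s1(i_parallel, i_perp):
        """Convert two orthogonal linear-polarization intensities to Stokes S0 and S1.

        Assumes the two focal planes record the 0-degree and 90-degree components;
        instrumental-polarization corrections used in a real calibration are omitted.
        """
        i_parallel = np.asarray(i_parallel, dtype=float)
        i_perp = np.asarray(i_perp, dtype=float)
        s0 = i_parallel + i_perp          # total intensity
        s1 = i_parallel - i_perp          # linear polarization along the 0/90 degree axis
        return s0, s1

    # The normalized quantity s1 / s0 gives the degree of linear polarization
    # along this axis for each observation of a satellite.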
NASA Overview (K-12, Educators, and General Public)
NASA Technical Reports Server (NTRS)
Ericsson, Aprille Joy
2003-01-01
This viewgraph presentation provides an overview of NASA activities intended for recruitment of employees. It includes NASA's vision statement and mission, images of solar system bodies and the Sojourner rover, as well as information on the Aqua satellite and the Stratospheric Aerosol and Gas Experiment III (SAGE III). Images of experimental aircraft, a space shuttle, and the Hubble Space Telescope (HST) are shown, and a section on mission planning is included.
A rapid and robust gradient measurement technique using dynamic single-point imaging.
Jang, Hyungseok; McMillan, Alan B
2017-09-01
We propose a new gradient measurement technique based on dynamic single-point imaging (SPI), which allows simple, rapid, and robust measurement of the k-space trajectory. To enable gradient measurement, we utilize the variable field-of-view (FOV) property of dynamic SPI, which is dependent on gradient shape. First, one-dimensional (1D) dynamic SPI data are acquired from a targeted gradient axis, and then relative FOV scaling factors between 1D images or k-spaces at varying encoding times are found. These relative scaling factors are the relative k-space positions that can be used for image reconstruction. The gradient measurement technique also can be used to estimate the gradient impulse response function for reproducible gradient estimation as a linear time invariant system. The proposed measurement technique was used to improve reconstructed image quality in 3D ultrashort echo, 2D spiral, and multi-echo bipolar gradient-echo imaging. In multi-echo bipolar gradient-echo imaging, measurement of the k-space trajectory allowed the use of a ramp-sampled trajectory for improved acquisition speed (approximately 30%) and more accurate quantitative fat and water separation in a phantom. The proposed dynamic SPI-based method allows fast k-space trajectory measurement with a simple implementation and no additional hardware for improved image quality. Magn Reson Med 78:950-962, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
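To make the central idea concrete, the relative scaling between 1D images acquired at different encoding times can be estimated numerically, for example by a brute-force correlation search over candidate scale factors. The Python sketch below is purely illustrative of that one step (the function name, grid of scales, and resampling convention are assumptions), not the published dynamic SPI pipeline.

    import numpy as np

    def relative_scale(img_ref, img_t, scales=np.linspace(0.2, 1.0, 801)):
        """Estimate the relative FOV scaling between a reference 1D image and a
        1D image at a later encoding time by maximizing normalized correlation
        over a grid of candidate scale factors."""
        img_ref = np.asarray(img_ref, dtype=float)
        img_t = np.asarray(img_t, dtype=float)
        n = img_ref.size
        x = np.arange(n) - n / 2.0
        best_scale, best_score = scales[0], -np.inf
        for s in scales:
            # Resample the reference image onto a grid stretched by 1/s
            resampled = np.interp(x / s, x, img_ref, left=0.0, right=0.0)
            a = resampled - resampled.mean()
            b = img_t - img_t.mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            score = float(a @ b / denom) if denom > 0 else -np.inf
            if score > best_score:
                best_scale, best_score = s, score
        return best_scale  # interpreted as a relative k-space position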
Satellite image collection optimization
NASA Astrophysics Data System (ADS)
Martin, William
2002-09-01
Imaging satellite systems represent a high capital cost. Optimizing the collection of images is critical for both satisfying customer orders and building a sustainable satellite operations business. We describe the functions of an operational, multivariable, time dynamic optimization system that maximizes the daily collection of satellite images. A graphical user interface allows the operator to quickly see the results of "what if" adjustments to an image collection plan. Used for both long-range planning and daily collection scheduling of Space Imaging's IKONOS satellite, the satellite control and tasking (SCT) software allows collection commands to be altered up to 10 min before upload to the satellite.
Design of an imaging spectrometer for earth observation using freeform mirrors
NASA Astrophysics Data System (ADS)
Peschel, T.; Damm, C.; Beier, M.; Gebhardt, A.; Risse, S.; Walter, I.; Sebastian, I.; Krutz, D.
2017-09-01
In 2017 the new hyperspectral DLR Earth Sensing Imaging Spectrometer (DESIS) will be integrated in the Multi-User-System for Earth Sensing (MUSES) platform [1] installed on the International Space Station (ISS).
Deep Space 1 Ion Engine Completed a 3-Year Journey
NASA Technical Reports Server (NTRS)
Sovey, James S.; Patterson, Michael J.; Rawlin, Vincent K.; Hamley, John A.
2001-01-01
A xenon ion engine and power processor system, which was developed by the NASA Glenn Research Center in partnership with the Jet Propulsion Laboratory and Boeing Electron Dynamic Devices, completed nearly 3 years of operation aboard the Deep Space 1 spacecraft. The 2.3-kW ion engine, which provided primary propulsion and two-axis attitude control, thrusted for more than 16,000 hr and consumed more than 70 kg of xenon propellant. The Deep Space 1 spacecraft was launched on October 24, 1998, to validate 12 futuristic technologies, including the ion-propulsion system. After the technology validation process was successfully completed, the Deep Space 1 spacecraft flew by the small asteroid Braille on July 29, 1999. The final objective of this mission was to encounter the active comet Borrelly, which is about 6 miles long. The ion engine was on a thrusting schedule to navigate the Deep Space 1 spacecraft to within 1400 miles of the comet. Since the hydrazine used for spacecraft attitude control was in short supply, the ion engine also provided two-axis attitude control to conserve the hydrazine supply for the Borrelly encounter. The comet encounter took place on September 22, 2001. Dr. Marc Rayman, project manager of Deep Space 1 at the Jet Propulsion Laboratory, said, "Deep Space 1 plunged into the heart of the comet Borrelly and has lived to tell every detail of its spine-tingling adventure! The images are even better than the impressive images of comet Halley taken by Europe's Giotto spacecraft in 1986." The Deep Space 1 mission, which successfully tested the 12 high-risk, advanced technologies and captured the best images ever taken of a comet, was voluntarily terminated on December 18, 2001. The successful demonstration of the 2-kW-class ion propulsion system technology is now providing mission planners with off-the-shelf flight hardware. Higher power, next generation ion propulsion systems are being developed for large flagship missions, such as outer planet explorers and sample-return missions.
[Sub-field imaging spectrometer design based on Offner structure].
Wu, Cong-Jun; Yan, Chang-Xiang; Liu, Wei; Dai, Hu
2013-08-01
To satisfy imaging spectrometers' requirements of miniaturization, light weight, and large field of view in space applications, the current optical design of imaging spectrometers with the Offner structure was analyzed, and a simple method to design an imaging spectrometer with a concave grating, based on current approaches, was given. Using the method offered, a sub-field imaging spectrometer was designed for a 400 km orbit altitude, a 0.4-1.0 μm wavelength range, an F-number of 5, a 720 mm focal length, and a 4.3 degree total field of view. Optical fiber was used to transfer the image at the telescope's focal plane to three slits arranged in the same plane so as to achieve sub-field imaging. A CCD detector with 1024 x 1024 pixels of 18 μm x 18 μm was used to receive the image of the three slits after dispersion. Based on ZEMAX optimization and tolerance analysis, the system can satisfy 5 nm spectral resolution and 5 m spatial resolution, and the MTF is over 0.62 at 28 lp/mm. The field of view of the system is almost 3 times that of similar instruments used in space probes.
NASA Astrophysics Data System (ADS)
Florian, Michael K.; Gladders, Michael D.; Li, Nan; Sharon, Keren
2016-01-01
The sample of cosmological strong lensing systems has been steadily growing in recent years and with the advent of the next generation of space-based survey telescopes, the sample will reach into the thousands. The accuracy of strong lens models relies on robust identification of multiple image families of lensed galaxies. For the most massive lenses, often more than one background galaxy is magnified and multiply imaged, and even in the cases of only a single lensed source, identification of counter images is not always robust. Recently, we have shown that the Gini coefficient in space-telescope-quality imaging is a measurement of galaxy morphology that is relatively well-preserved by strong gravitational lensing. Here, we investigate its usefulness as a diagnostic for the purposes of image family identification and show that it can remove some of the degeneracies encountered when using color as the sole diagnostic, and can do so without the need for additional observations since whenever a color is available, two Gini coefficients are as well.
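For reference, the Gini coefficient of pixel fluxes used in this context follows the standard morphology definition (sorted absolute fluxes weighted by rank). The short Python sketch below implements that general formula; it is not the authors' exact pipeline, and the usage comment is only a suggested workflow.

    import numpy as np

    def gini_coefficient(pixel_fluxes):
        """Gini coefficient of a set of pixel flux values, as commonly used in
        galaxy morphology: 0 for perfectly uniform flux, approaching 1 when the
        flux is concentrated in a few pixels."""
        f = np.sort(np.abs(np.ravel(pixel_fluxes)))
        n = f.size
        if n < 2 or f.mean() == 0:
            return 0.0
        i = np.arange(1, n + 1)
        return float(np.sum((2 * i - n - 1) * f) / (f.mean() * n * (n - 1)))

    # Suggested use: compare Gini values of candidate counter-images extracted
    # from a lensed field; similar values (together with color) support
    # membership in the same multiple-image family.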
Tan, Ek T; Lee, Seung-Kyun; Weavers, Paul T; Graziani, Dominic; Piel, Joseph E; Shu, Yunhong; Huston, John; Bernstein, Matt A; Foo, Thomas K F
2016-09-01
To investigate the effects on echo planar imaging (EPI) distortion of using high gradient slew rates (SR) of up to 700 T/m/s for in vivo human brain imaging, with a dedicated, head-only gradient coil. Simulation studies were first performed to determine the expected echo spacing and distortion reduction in EPI. A head gradient of 42-cm inner diameter and with asymmetric transverse coils was then installed in a whole-body, conventional 3T magnetic resonance imaging (MRI) system. Human subject imaging was performed on five subjects to determine the effects of EPI on echo spacing and signal dropout at various gradient slew rates. The feasibility of whole-brain imaging at 1.5 mm-isotropic spatial resolution was demonstrated with gradient-echo and spin-echo diffusion-weighted EPI. As compared to a whole-body gradient coil, the EPI echo spacing in the head-only gradient coil was reduced by 48%. Simulation and in vivo results, respectively, showed up to 25-26% and 19% improvement in signal dropout. Whole-brain imaging with EPI at 1.5 mm spatial resolution provided good whole-brain coverage, spatial linearity, and low spatial distortion effects. Our results of human brain imaging with EPI using the compact head gradient coil at slew rates higher than in conventional whole-body MR systems demonstrate substantially improved image distortion, and point to a potential for benefits to non-EPI pulse sequences. J. Magn. Reson. Imaging 2016;44:653-664. © 2016 International Society for Magnetic Resonance in Medicine.
Visual information processing; Proceedings of the Meeting, Orlando, FL, Apr. 20-22, 1992
NASA Technical Reports Server (NTRS)
Huck, Friedrich O. (Editor); Juday, Richard D. (Editor)
1992-01-01
Topics discussed in these proceedings include nonlinear processing and communications; feature extraction and recognition; image gathering, interpolation, and restoration; image coding; and wavelet transform. Papers are presented on noise reduction for signals from nonlinear systems; driving nonlinear systems with chaotic signals; edge detection and image segmentation of space scenes using fractal analyses; a vision system for telerobotic operation; a fidelity analysis of image gathering, interpolation, and restoration; restoration of images degraded by motion; and information, entropy, and fidelity in visual communication. Attention is also given to image coding methods and their assessment, hybrid JPEG/recursive block coding of images, modified wavelets that accommodate causality, modified wavelet transform for unbiased frequency representation, and continuous wavelet transform of one-dimensional signals by Fourier filtering.
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
Compact Video Microscope Imaging System Implemented in Colloid Studies
NASA Technical Reports Server (NTRS)
McDowell, Mark
2002-01-01
Photographs show a fiber-optic light source, a microscope and charge-coupled device (CCD) camera head connected to the camera body, the CCD camera body feeding data to an image acquisition board in a PC, and a Cartesian robot controlled via a PC board. The Compact Microscope Imaging System (CMIS) is a diagnostic tool with intelligent controls for use in space, industrial, medical, and security applications. CMIS can be used in situ with a minimum amount of user intervention. This system can scan, find areas of interest in, focus on, and acquire images automatically. Many multiple-cell experiments require microscopy for in situ observations; this is feasible only with compact microscope systems. CMIS is a miniature machine vision system that combines intelligent image processing with remote control. The software also has a user-friendly interface, which can be used independently of the hardware for further post-experiment analysis. CMIS has been successfully developed in the SML Laboratory at the NASA Glenn Research Center, adapted for use in colloid studies, and is available for telescience experiments. The main innovations this year are an improved interface, optimized algorithms, and the ability to control conventional full-sized microscopes in addition to compact microscopes. The CMIS software-hardware interface is being integrated into our SML Analysis package, which will be a robust general-purpose image-processing package that can handle over 100 space and industrial applications.
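Automatic focusing of the kind described above is commonly driven by a per-frame sharpness score. The minimal Python sketch below uses image variance as a generic focus metric; it is a stand-in, not the CMIS implementation, and the function names are hypothetical.

    import numpy as np

    def sharpness(image):
        """Simple focus metric: variance of pixel intensities.
        An autofocus loop steps the stage through focus positions, scores each
        frame with a metric like this, and keeps the best-scoring position."""
        return float(np.asarray(image, dtype=float).var())

    def best_focus(frames):
        """Return the index of the sharpest frame in a through-focus stack."""
        return int(np.argmax([sharpness(f) for f in frames]))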
Overcoming turbulence-induced space-variant blur by using phase-diverse speckle.
Thelen, Brian J; Paxman, Richard G; Carrara, David A; Seldin, John H
2009-01-01
Space-variant blur occurs when imaging through volume turbulence over sufficiently large fields of view. Space-variant effects are particularly severe in horizontal-path imaging, slant-path (air-to-ground or ground-to-air) geometries, and ground-based imaging of low-elevation satellites or astronomical objects. In these geometries, the isoplanatic angle can be comparable to or even smaller than the diffraction-limited resolution angle. We report on a postdetection correction method that seeks to correct for the effects of space-variant aberrations, with the goal of reconstructing near-diffraction-limited imagery. Our approach has been to generalize the method of phase-diverse speckle (PDS) by using a physically motivated distributed-phase-screen model. Simulation results are presented that demonstrate the reconstruction of near-diffraction-limited imagery under both matched and mismatched model assumptions. In addition, we present evidence that PDS could be used as a beaconless wavefront sensor in a multiconjugate adaptive optics system when imaging extended scenes.
Simultaneous binary hash and features learning for image retrieval
NASA Astrophysics Data System (ADS)
Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.
2016-05-01
Content-based image retrieval systems have plenty of applications in the modern world. The most important one is image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique. This is the main reason why this kind of automatic image processing has attracted so much attention during recent years. Despite rather substantial progress in the field, semantically meaningful image retrieval still remains a challenging task. The main issue here is the demand to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for simultaneous learning of global image features and binary hash codes. Our approach provides a mapping from a pixel-based image representation to a hash-value space while trying to preserve as much of the semantic image content as possible. We use a deep learning methodology to generate image descriptions with the properties of similarity preservation and statistical independence. The main advantage of our approach over existing ones is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results than general techniques. The framework presented in the paper for data-dependent image hashing is based on the use of two different kinds of neural networks: convolutional neural networks for image description and an autoencoder for feature-to-hash-space mapping. Experimental results confirm that our approach shows promising results in comparison with other state-of-the-art methods.
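To illustrate only the retrieval mechanics of binary hashing (not the learned autoencoder mapping described above), the toy Python sketch below binarizes feature vectors with a random linear projection and searches by Hamming distance. All sizes and names are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def binarize(features, projection):
        """Hash feature vectors to binary codes by the sign of a linear projection.
        The paper learns this mapping with an autoencoder; a random projection is
        used here purely to show how retrieval works once codes exist."""
        return (np.asarray(features) @ projection > 0).astype(np.uint8)

    def hamming_search(query_code, database_codes, k=5):
        """Return indices of the k database items closest in Hamming distance."""
        dists = np.count_nonzero(database_codes != query_code, axis=1)
        return np.argsort(dists)[:k]

    # Toy usage: 1000 images described by 128-D features, hashed to 32 bits
    features = rng.normal(size=(1000, 128))
    proj = rng.normal(size=(128, 32))
    codes = binarize(features, proj)
    top5 = hamming_search(codes[0], codes, k=5)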
Image fusion pitfalls for cranial radiosurgery
Jonker, Benjamin P.
2013-01-01
Stereotactic radiosurgery requires imaging to define both the stereotactic space in which the treatment is delivered and the target itself. Image fusion is the process of using rotation and translation to bring a second image set into alignment with the first image set. This allows the potential concurrent use of multiple image sets to define the target and stereotactic space. While a single magnetic resonance imaging (MRI) sequence alone can be used for delineation of the target and fiducials, there may be significant advantages to using additional imaging sets including other MRI sequences, computed tomography (CT) scans, and advanced imaging sets such as catheter-based angiography, diffusion tensor imaging-based fiber tracking and positron emission tomography in order to more accurately define the target and surrounding critical structures. Stereotactic space is usually defined by detection of fiducials on the stereotactic head frame or mask system. Unfortunately MRI sequences are susceptible to geometric distortion, whereas CT scans do not face this problem (although they have poorer resolution of the target in most cases). Thus image fusion can allow the definition of stereotactic space to proceed from the geometrically accurate CT images at the same time as using MRI to define the target. The use of image fusion is associated with risk of error introduced by inaccuracies of the fusion process, as well as workflow changes that if not properly accounted for can mislead the treating clinician. The purpose of this review is to describe the uses of image fusion in stereotactic radiosurgery as well as its potential pitfalls. PMID:23682338
JSC Shuttle Mission Simulator (SMS) visual system payload bay video image
NASA Technical Reports Server (NTRS)
1981-01-01
This space shuttle orbiter payload bay (PLB) video image is used in JSC's Fixed Based (FB) Shuttle Mission Simulator (SMS). The image is projected inside the FB-SMS crew compartment during mission simulation training. The FB-SMS is located in the Mission Simulation and Training Facility Bldg 5.
Optical analysis of crystal growth
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Passeur, Andrea; Harper, Sabrina
1994-01-01
Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.
High-Speed Observer: Automated Streak Detection in SSME Plumes
NASA Technical Reports Server (NTRS)
Rieckoff, T. J.; Covan, M.; OFarrell, J. M.
2001-01-01
A high frame rate digital video camera installed on test stands at Stennis Space Center has been used to capture images of Space Shuttle main engine plumes during test. These plume images are processed in real time to detect and differentiate anomalous plume events occurring during a time interval on the order of 5 msec. Such speed yields near instantaneous availability of information concerning the state of the hardware. This information can be monitored by the test conductor or by other computer systems, such as the integrated health monitoring system processors, for possible test shutdown before occurrence of a catastrophic engine failure.
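The underlying idea of flagging transient bright events between successive high-speed frames can be illustrated with a simple frame-differencing sketch in Python. This is a generic stand-in, not the deployed SSME streak-detection algorithm; the thresholds and function names are assumptions.

    import numpy as np

    def flag_anomaly(prev_frame, frame, diff_threshold=40, min_pixels=25):
        """Flag a frame as anomalous if enough pixels brighten sharply relative
        to the previous frame. Returns (is_anomalous, boolean mask of pixels)."""
        diff = frame.astype(np.int32) - prev_frame.astype(np.int32)
        bright = diff > diff_threshold
        return bool(np.count_nonzero(bright) >= min_pixels), bright

    # In a real-time loop, each new frame is compared against the previous one
    # and an alert is raised within a few milliseconds when the flag is set.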
Tan, Ek T.; Lee, Seung-Kyun; Weavers, Paul T.; Graziani, Dominic; Piel, Joseph E.; Shu, Yunhong; Huston, John; Bernstein, Matt A.; Foo, Thomas K.F.
2016-01-01
Purpose To investigate the effects on echo planar imaging (EPI) distortion of using high gradient slew rates (SR) of up to 700 T/m/s for in-vivo human brain imaging, with a dedicated, head-only gradient coil. Materials and Methods Simulation studies were first performed to determine the expected echo spacing and distortion reduction in EPI. A head gradient of 42-cm inner diameter and with asymmetric transverse coils was then installed in a whole-body, conventional 3T MRI system. Human subject imaging was performed on five subjects to determine the effects of EPI on echo spacing and signal dropout at various gradient slew rates. The feasibility of whole-brain imaging at 1.5 mm-isotropic spatial resolution was demonstrated with gradient-echo and spin-echo diffusion-weighted EPI. Results As compared to a whole-body gradient coil, the EPI echo spacing in the head-only gradient coil was reduced by 48%. Simulation and in vivo results, respectively, showed up to 25-26% and 19% improvement in signal dropout. Whole-brain imaging with EPI at 1.5 mm spatial resolution provided good whole-brain coverage, spatial linearity, and low spatial distortion effects. Conclusion Our results of human brain imaging with EPI using the compact head gradient coil at slew rates higher than in conventional whole-body MR systems demonstrate substantially improved image distortion, and point to a potential for benefits to non-EPI pulse sequences. PMID:26921117
Hyperspectral Systems Increase Imaging Capabilities
NASA Technical Reports Server (NTRS)
2010-01-01
In 1983, NASA started developing hyperspectral systems to image in the ultraviolet and infrared wavelengths. In 2001, the first on-orbit hyperspectral imager, Hyperion, was launched aboard the Earth Observing-1 spacecraft. Based on the hyperspectral imaging sensors used in Earth observation satellites, Stennis Space Center engineers and Institute for Technology Development researchers collaborated on a new design that was smaller and used an improved scanner. Featured in Spinoff 2007, the technology is now exclusively licensed by Themis Vision Systems LLC, of Richmond, Virginia, and is widely used in medical and life sciences, defense and security, forensics, and microscopy.
A new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system
NASA Astrophysics Data System (ADS)
Wang, Zhen; Huang, Xia; Li, Yu-Xia; Song, Xiao-Na
2013-01-01
We propose a new image encryption algorithm on the basis of the fractional-order hyperchaotic Lorenz system. In the process of generating the key stream, the system parameters and the derivative order are embedded in the proposed algorithm to enhance security. The algorithm is examined through detailed security analyses, including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. The experimental results demonstrate that the proposed image encryption scheme has the advantages of a large key space and high security for practical image encryption.
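To show the general flavor of chaos-based image encryption, the Python sketch below generates a keystream from the classical (integer-order) Lorenz system and XORs it with the image. It is only a simplified illustration: the paper uses a fractional-order hyperchaotic variant with additional design elements, and this toy stream is not cryptographically secure.

    import numpy as np

    def lorenz_keystream(n_bytes, x0=0.1, y0=0.2, z0=0.3,
                         sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.001):
        """Generate a byte keystream from the classical Lorenz system via Euler
        integration (illustration only; not the fractional-order system)."""
        x, y, z = x0, y0, z0
        stream = np.empty(n_bytes, dtype=np.uint8)
        for i in range(n_bytes):
            for _ in range(10):                 # several integration steps per output byte
                dx = sigma * (y - x)
                dy = x * (rho - z) - y
                dz = x * y - beta * z
                x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            stream[i] = int(abs(x) * 1e6) % 256
        return stream

    def xor_image(image, key_state=(0.1, 0.2, 0.3)):
        """Encrypt or decrypt a uint8 image by XOR with the chaotic keystream."""
        flat = image.ravel()
        ks = lorenz_keystream(flat.size, *key_state)
        return (flat ^ ks).reshape(image.shape)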
Imaging detectors and electronics—a view of the future
NASA Astrophysics Data System (ADS)
Spieler, Helmuth
2004-09-01
Imaging sensors and readout electronics have made tremendous strides in the past two decades. The application of modern semiconductor fabrication techniques and the introduction of customized monolithic integrated circuits have made large-scale imaging systems routine in high-energy physics. This technology is now finding its way into other areas, such as space missions, synchrotron light sources, and medical imaging. I review current developments and discuss the promise and limits of new technologies. Several detector systems are described as examples of future trends. The discussion emphasizes semiconductor detector systems, but I also include recent developments for large-scale superconducting detector arrays.
Process simulation in digital camera system
NASA Astrophysics Data System (ADS)
Toadere, Florin
2012-06-01
The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear shift invariant and axial. The light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses, and filters and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of the Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPG compression. We reconstruct the noisy blurred image by blending differently exposed images in order to reduce the photon shot noise, we also filter the fixed-pattern noise, and we sharpen the image. Then we have the color processing blocks: white balancing, color correction, gamma correction, and conversion from XYZ color space to RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
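A much-reduced Python sketch of the tail end of such a pipeline (white balance, XYZ-to-sRGB conversion, gamma encoding) is shown below. The matrix is the standard D65 XYZ-to-linear-sRGB matrix; everything else (function names, gains, the simple power-law gamma) is an assumption and not the authors' full simulation.

    import numpy as np

    # Standard XYZ (D65) -> linear sRGB matrix
    XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                            [-0.9689,  1.8758,  0.0415],
                            [ 0.0557, -0.2040,  1.0570]])

    def white_balance(rgb, gains=(1.0, 1.0, 1.0)):
        """Apply per-channel gains (e.g., gray-world estimates) to an HxWx3 image."""
        return rgb * np.asarray(gains, dtype=float)

    def xyz_to_srgb(xyz):
        """Convert an HxWx3 XYZ image to linear sRGB and clip to [0, 1]."""
        return np.clip(xyz @ XYZ_TO_SRGB.T, 0.0, 1.0)

    def gamma_encode(linear_rgb, gamma=2.2):
        """Simple power-law gamma (the true sRGB curve also has a linear toe segment)."""
        return np.power(linear_rgb, 1.0 / gamma)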
Dragon Spacecraft grappled by SSRMS
2014-04-20
View of the SpaceX Dragon Commercial Resupply Services-3 (CRS-3) spacecraft grappled by the Canadarm2 Space Station Remote Manipulator System (SSRMS) during Expedition 39. Image was released by flight engineer 3 (FE3) on Instagram.
Deep space navigation systems and operations
NASA Technical Reports Server (NTRS)
Jordan, J. F.
1981-01-01
The history of the deep space navigation system developed by NASA is outlined. Its application to the Mariner, Viking, and Pioneer missions is reviewed. Voyager navigation results for Jupiter and Saturn are commented on, and velocity correction in relation to fuel expenditure and computer time is discussed. The navigation requirements of the Galileo and Venus Orbiting Imaging Radar (VOIR) missions are assessed. The measurement and data processing systems are described.
NASA Technical Reports Server (NTRS)
Nashman, Marilyn; Chaconas, Karen J.
1988-01-01
The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined, and in particular the image processing hardware and software used to extract features at low levels of sensory processing for tasks representative of those envisioned for the Space Station, such as assembly and maintenance, are described.
Efficient space-time sampling with pixel-wise coded exposure for high-speed imaging.
Liu, Dengyu; Gu, Jinwei; Hitomi, Yasunobu; Gupta, Mohit; Mitsunaga, Tomoo; Nayar, Shree K
2014-02-01
Cameras face a fundamental trade-off between spatial and temporal resolution. Digital still cameras can capture images with high spatial resolution, but most high-speed video cameras have relatively low spatial resolution. It is hard to overcome this trade-off without incurring a significant increase in hardware costs. In this paper, we propose techniques for sampling, representing, and reconstructing the space-time volume to overcome this trade-off. Our approach has two important distinctions compared to previous works: 1) We achieve sparse representation of videos by learning an overcomplete dictionary on video patches, and 2) we adhere to practical hardware constraints on sampling schemes imposed by architectures of current image sensors, which means that our sampling function can be implemented on CMOS image sensors with modified control units in the future. We evaluate components of our approach, sampling function and sparse representation, by comparing them to several existing approaches. We also implement a prototype imaging system with pixel-wise coded exposure control using a liquid crystal on silicon device. System characteristics such as field of view and modulation transfer function are evaluated for our imaging system. Both simulations and experiments on a wide range of scenes show that our method can effectively reconstruct a video from a single coded image while maintaining high spatial resolution.
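The forward model of pixel-wise coded exposure (a single captured image formed by per-pixel binary exposure codes integrating over the space-time volume) can be written in a few lines, as in the Python sketch below. The dictionary-based sparse reconstruction is not reproduced, and the example code pattern is an assumption chosen only for illustration.

    import numpy as np

    def coded_exposure_capture(video, code):
        """Forward model of pixel-wise coded exposure.

        video: (T, H, W) space-time volume of the scene
        code:  (T, H, W) binary per-pixel exposure pattern (each pixel integrates
               light only in the frames where its code is 1)
        Returns the single coded image of shape (H, W)."""
        video = np.asarray(video, dtype=float)
        code = np.asarray(code, dtype=float)
        return (video * code).sum(axis=0)

    # Example pattern: each pixel is exposed in exactly one randomly chosen frame
    T, H, W = 9, 64, 64
    rng = np.random.default_rng(1)
    code = np.zeros((T, H, W))
    on_frame = rng.integers(0, T, size=(H, W))
    code[on_frame, np.arange(H)[:, None], np.arange(W)[None, :]] = 1.0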
Human visual system-based color image steganography using the contourlet transform
NASA Astrophysics Data System (ADS)
Abdul, W.; Carré, P.; Gaborit, P.
2010-01-01
We present a steganographic scheme based on the contourlet transform which uses the contrast sensitivity function (CSF) to control the strength of insertion of the hidden information in a perceptually uniform color space. The CIELAB color space is used as it is well suited for steganographic applications: any change in the CIELAB color space has a corresponding effect on the human visual system, and it is very important for steganographic schemes to be undetectable by the human visual system (HVS). The perceptual decomposition of the contourlet transform gives it a natural advantage over other decompositions as it can be molded with respect to the human perception of different frequencies in an image. The evaluation of the imperceptibility of the steganographic scheme with respect to the color perception of the HVS is done using standard methods such as the structural similarity (SSIM) and CIEDE2000. The robustness of the inserted watermark is tested against JPEG compression.
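As a rough illustration of the kind of imperceptibility check described above, the Python sketch below computes the mean CIE76 color difference between a cover and a stego image already converted to CIELAB. CIE76 is used here only as a simpler stand-in for the CIEDE2000 metric reported in the paper.

    import numpy as np

    def mean_delta_e_cie76(lab_cover, lab_stego):
        """Mean CIE76 color difference between cover and stego images, both given
        as HxWx3 arrays in CIELAB coordinates."""
        diff = np.asarray(lab_cover, dtype=float) - np.asarray(lab_stego, dtype=float)
        return float(np.sqrt((diff ** 2).sum(axis=-1)).mean())

    # Rule of thumb: a mean delta-E well below about 2 is generally taken to be
    # imperceptible to a human observer.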
Krishna Kumar, P; Araki, Tadashi; Rajan, Jeny; Saba, Luca; Lavra, Francesco; Ikeda, Nobutaka; Sharma, Aditya M; Shafique, Shoaib; Nicolaides, Andrew; Laird, John R; Gupta, Ajay; Suri, Jasjit S
2017-08-01
Monitoring of cerebrovascular diseases via carotid ultrasound has started to become a routine. The measurement of image-based lumen diameter (LD) or inter-adventitial diameter (IAD) is a promising approach for quantification of the degree of stenosis. The manual measurements of LD/IAD are not reliable, subjective and slow. The curvature associated with the vessels along with non-uniformity in the plaque growth poses further challenges. This study uses a novel and generalized approach for automated LD and IAD measurement based on a combination of spatial transformation and scale-space. In this iterative procedure, the scale-space is first used to get the lumen axis, which is then used with the spatial image transformation paradigm to get a transformed image. The scale-space is then reapplied to retrieve the lumen region and boundary in the transformed framework. Then, inverse transformation is applied to display the results in the original image framework. B-mode ultrasound images of the left and right common carotid arteries of 202 patients (404 carotid images) were retrospectively analyzed. The validation of our algorithm was done against two manual expert tracings. The coefficient of correlation between the two manual tracings for LD was 0.98 (p < 0.0001) and 0.99 (p < 0.0001), respectively. The precision of merit between the manual expert tracings and the automated system was 97.7 and 98.7%, respectively. The experimental analysis demonstrated superior performance of the proposed method over conventional approaches. Several statistical tests demonstrated the stability and reliability of the automated system.
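The first ingredient of that pipeline, a Gaussian scale-space of the ultrasound image, can be built in a few lines, as the Python sketch below shows. It illustrates only this preprocessing step under assumed scale values; the spatial-transformation and boundary-extraction stages of the published method are not reproduced.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def gaussian_scale_space(image, sigmas=(1.0, 2.0, 4.0, 8.0)):
        """Stack of Gaussian-smoothed versions of an ultrasound image at
        increasing scales. Coarser scales suppress speckle so that a dark,
        elongated lumen region can be traced more robustly."""
        img = np.asarray(image, dtype=float)
        return np.stack([gaussian_filter(img, sigma=s) for s in sigmas])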
1999-04-01
NASA's Space Optics Manufacturing Center has been working to expand our view of the universe via sophisticated new telescopes. The Optics Center's goal is to develop low-cost, advanced space optics technologies for the NASA program in the 21st century - including the long-term goal of imaging Earth-like planets in distant solar systems. To reduce the cost of mirror fabrication, Marshall Space Flight Center (MSFC) has developed replication techniques and the machinery and materials to replicate electro-formed nickel mirrors. The process allows precisely shaped mandrels to be fabricated and then used and reused as masters for replicating high-quality mirrors. Image shows Dr. Alan Shapiro cleaning a mirror mandrel to be coated with a highly reflective, high-density coating in the Large Aperture Coating Chamber, MSFC Space Optics Manufacturing Technology Center (SOMTC).
Front-end ASICs for high-energy astrophysics in space
NASA Astrophysics Data System (ADS)
Gevin, O.; Limousin, O.; Meuris, A.
2016-07-01
In most embedded imaging systems for space applications, high granularity and the increasing size of focal planes justify an almost systematic use of integrated circuits. To fulfill challenging requirements for excellent spatial and energy resolution, integrated circuits must fit the sensors perfectly and interface with the system in such a way as to simultaneously optimize noise, geometry, and architecture. Moreover, very low power consumption and radiation tolerance are mandatory to envision use onboard a payload in space. Consequently, being part of an optimized detection system for space, the integrated circuit is specifically designed for each application and becomes an Application Specific Integrated Circuit (ASIC). The paper focuses on mixed analog and digital signal ASICs for spectro-imaging systems in the keV-MeV energy band. The first part of the paper summarizes the main advantages conferred by the use of front-end ASICs for high-energy astrophysics instruments on space missions. Space qualification of ASICs requires the chip to be radiation hard. The paper briefly describes some of the typical hardening techniques and gives some guidelines that an ASIC designer should follow to choose the most efficient technology for a project. The first task of the front-end electronics is to convert the charge coming from the detector into a voltage. For most silicon detectors (CCD, DEPFET, SDD) this conversion happens in the detector itself. For other sensor materials, charge preamplifiers perform the conversion. The paper briefly describes the different key parameters of charge preamplifiers and the binding parameters for the design. Filtering is generally mandatory in order to increase the signal-to-noise ratio or to reduce the duration of the signal. After a brief review of the main noise sources, the paper reviews noise-filtering techniques that are commonly used in integrated circuit designs. The way sensors and ASICs are interconnected plays a major role in the noise performance of detection systems. The geometry of a sensor is therefore critical and drives the ASIC design. The second part of the paper takes the geometry of the detector as a story line to explore different kinds of ASIC structures and architectures. From the simple single-channel ASIC for CCDs to the most advanced 3D ASIC prototypes used to build dead-zone-free imaging systems, the paper reports on different families of circuits for spectro-imaging systems. It emphasizes a variety of designer choices, from all around the world, in different space missions.
Polaris Instrument Development and PARI Experience
NASA Astrophysics Data System (ADS)
Stewart, Nathan
2011-01-01
At the Pisgah Astronomical Research Institute (PARI) in Rosman, NC, I spent 8 weeks as the NC Space Grant/J. Donald Cline Astronomy Scholar. I developed multiple projects and assisted as a mentor for the PARI Space Science Lab and Duke TIP high school gifted student programs, which both took place during my stay. My main focus was the development of the Polaris imaging telescope. This telescope is used to take images of the pulsating variable star Polaris. These readings are used to make seeing estimates for the air column above PARI. The system stores and archives images and analyzes them for magnitude change and movement of the stellar image. In addition to the Polaris project, I developed a solar panel voltage and charge monitoring system, which involved working with charge controllers and photovoltaic technology. I developed a charging scheme using a Flexmax 60 charge controller. Data are recorded and transmitted via optical fiber for analysis and correlation with solar zenith angle.
Support of imaging radar for the shuttle system and subsystem definition study, phase 2
NASA Technical Reports Server (NTRS)
1974-01-01
An orbital microwave imaging radar system suggested for use in conjunction with the space shuttle is presented. Several applications of the system are described, including agriculture, meteorology, terrain analysis, various types of mapping, petroleum and mineral exploration, oil spill detection and sea and lake ice monitoring. The design criteria, which are based on the requirements of the above applications, are discussed.
NASA Astrophysics Data System (ADS)
Polichtchouk, Yuri; Tokareva, Olga; Bulgakova, Irina V.
2003-03-01
Methodological problems of space image processing for assessment of the impact of atmospheric pollution on forest ecosystems using geoinformation systems are addressed. The approach to quantitative assessment of the impact of atmospheric pollution on forest ecosystems is based on calculating the relative areas of forest landscapes that lie inside atmospheric pollution zones. The landscape structure of forested territories in the southern part of Western Siberia is determined on the basis of processing of medium-resolution space images from the Resource-O spacecraft. Particularities of modeling atmospheric pollution zones caused by gas flaring on the territories of oil fields are considered. Pollution zones were revealed by modeling the dispersal of contaminants in the atmosphere with standard models. The areas of polluted landscapes are calculated as a function of the atmospheric pollution level.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. - In the Orbiter Processing Facility, United Space Alliance worker Craig Meyer fits an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle's Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
Development of a simultaneous optical/PET imaging system for awake mice
NASA Astrophysics Data System (ADS)
Takuwa, Hiroyuki; Ikoma, Yoko; Yoshida, Eiji; Tashima, Hideaki; Wakizaka, Hidekatsu; Shinaji, Tetsuya; Yamaya, Taiga
2016-09-01
Simultaneous measurements of multiple physiological parameters are essential for the study of brain disease mechanisms and the development of suitable therapies to treat them. In this study, we developed a measurement system for simultaneous optical imaging and PET of awake mice. The key elements of this system are the OpenPET, optical imaging, and a fixation apparatus for an awake mouse. The OpenPET is our original open-type PET geometry, which can be used in combination with another device because of its easily accessible open space. A small prototype of the axial shift single-ring OpenPET (AS-SROP) was used. The objective lens for optical imaging, with a mounted charge-coupled device camera, was placed inside the open space of the AS-SROP. Our original fixation apparatus to hold an awake mouse was also applied. As a first application of this system, simultaneous measurements of cerebral blood flow (CBF) by laser speckle imaging (LSI) and [11C]raclopride-PET were performed under control and 5% CO2 inhalation (hypercapnia) conditions. Our system successfully obtained the CBF and [11C]raclopride radioactivity concentration simultaneously. Accumulation of [11C]raclopride was observed in the striatum, where the density of dopamine D2 receptors is high. LSI measurements could be performed stably for more than 60 minutes. Increased CBF induced by hypercapnia was observed, while CBF under the control condition was stable. We conclude that our imaging system should be useful for investigating the mechanisms of brain diseases in awake animal models.
Coded-aperture Compton camera for gamma-ray imaging
NASA Astrophysics Data System (ADS)
Farber, Aaron M.
This dissertation describes the development of a novel gamma-ray imaging system concept and presents results from Monte Carlo simulations of the new design. Current designs for large field-of-view gamma cameras suitable for homeland security applications implement either a coded aperture or a Compton scattering geometry to image a gamma-ray source. Both of these systems require large, expensive position-sensitive detectors in order to work effectively. By combining characteristics of both of these systems, a new design can be implemented that does not require such expensive detectors and that can be scaled down to a portable size. This new system has significant promise in homeland security, astronomy, botany and other fields, while future iterations may prove useful in medical imaging, other biological sciences and other areas, such as non-destructive testing. A proof-of-principle study of the new gamma-ray imaging system has been performed by Monte Carlo simulation. Various reconstruction methods have been explored and compared. General-Purpose Graphics-Processor-Unit (GPGPU) computation has also been incorporated. The resulting code is a primary design tool for exploring variables such as detector spacing, material selection and thickness and pixel geometry. The advancement of the system from a simple 1-dimensional simulation to a full 3-dimensional model is described. Methods of image reconstruction are discussed and results of simulations consisting of both a 4 x 4 and a 16 x 16 object space mesh have been presented. A discussion of the limitations and potential areas of further study is also presented.
A Wigner-based ray-tracing method for imaging simulations
NASA Astrophysics Data System (ADS)
Mout, B. M.; Wick, M.; Bociort, F.; Urbach, H. P.
2015-09-01
The Wigner Distribution Function (WDF) forms an alternative representation of the optical field. It can be a valuable tool for understanding and classifying optical systems. Furthermore, it possesses properties that make it suitable for optical simulations: both the intensity and the angular spectrum can be easily obtained from the WDF and the WDF remains constant along the paths of paraxial geometrical rays. In this study we use these properties by implementing a numerical Wigner-Based Ray-Tracing method (WBRT) to simulate diffraction effects at apertures in free-space and in imaging systems. Both paraxial and non-paraxial systems are considered and the results are compared with numerical implementations of the Rayleigh-Sommerfeld and Fresnel diffraction integrals to investigate the limits of the applicability of this approach. The results of the different methods are in good agreement when simulating free-space diffraction or calculating point spread functions (PSFs) for aberration-free imaging systems, even at numerical apertures exceeding the paraxial regime. For imaging systems with aberrations, the PSFs of WBRT diverge from the results using diffraction integrals. For larger aberrations WBRT predicts negative intensities, suggesting that this model is unable to deal with aberrations.
Hubble Sees the Wings of a Butterfly: The Twin Jet Nebula
2015-08-26
The shimmering colors visible in this NASA/ESA Hubble Space Telescope image show off the remarkable complexity of the Twin Jet Nebula. The new image highlights the nebula’s shells and its knots of expanding gas in striking detail. Two iridescent lobes of material stretch outwards from a central star system. Within these lobes two huge jets of gas are streaming from the star system at speeds in excess of one million kilometers (621,400 miles) per hour. Read more: go.nasa.gov/1hGASfl Credit: ESA/Hubble & NASA, Acknowledgement: Judy Schmidt
NASA Astrophysics Data System (ADS)
Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy; Martin, Richard E.
2013-05-01
Additive manufacturing is a rapidly growing field in which 3-dimensional parts can be produced layer by layer. NASA's electron beam freeform fabrication (EBF3) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF3 technology are weight savings to support space missions, rapid prototyping in a zero gravity environment, and improved vehicle readiness. The EBF3 system is composed of 3 main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam is used to melt the wire, and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality deposit, a near infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for deposit assessment metrics.
3D Modelling of an Indoor Space Using a Rotating Stereo Frame Camera System
NASA Astrophysics Data System (ADS)
Kang, J.; Lee, I.
2016-06-01
Sophisticated indoor design and growing development in urban architecture make indoor spaces more complex, and these spaces are often connected directly to public transportation such as subway and train stations. As a result, many activities that used to take place outdoors have moved into indoor spaces. Constant development of technology has also raised people's expectations of services such as location-awareness services in indoor spaces. It is therefore necessary to develop a low-cost system to create 3D models of indoor spaces on which such services can be based. In this paper, we introduce a rotating stereo frame camera system with two cameras and generate an indoor 3D model using the system. First, we selected a test site and acquired images eight times during one day with different positions and heights of the system. The measurements were complemented by object control points obtained from a total station. Because the data were obtained from different positions and heights, it was possible to form various combinations of data and to choose several suitable combinations as input data. Next, we generated the 3D model of the test site using commercial software with the chosen input data. The last step was to evaluate the accuracy of the indoor model generated from the selected input data. In summary, this paper introduces a low-cost system to acquire indoor spatial data and to generate a 3D model from the images acquired by the system. Through these experiments, we show that the introduced system is suitable for generating indoor spatial information. The proposed low-cost system can be applied to indoor services based on such spatial information.
Marshall Space Flight Center In-House Earned Value System (EVS)
NASA Technical Reports Server (NTRS)
Smith, Donnie
2004-01-01
The Earned Value System (EVS) is a project management budgeting and scheduling process for in-house project and institutional applications. This viewgraph presentation includes images of the system's computer interface.
2013-01-15
S48-E-007 (12 Sept 1991) --- Astronaut James F. Buchli, mission specialist, catches snack crackers as they float in the weightless environment of the earth-orbiting Discovery. This image was transmitted by the Electronic Still Camera, Development Test Objective (DTO) 648. The ESC is making its initial appearance on a Space Shuttle flight. Electronic still photography is a new technology that enables a camera to electronically capture and digitize an image with resolution approaching film quality. The digital image is stored on removable hard disks or small optical disks, and can be converted to a format suitable for downlink transmission or enhanced using image processing software. The Electronic Still Camera (ESC) was developed by the Man-Systems Division at the Johnson Space Center and is the first model in a planned evolutionary development leading to a family of high-resolution digital imaging devices. H. Don Yeates, JSC's Man-Systems Division, is program manager for the ESC. THIS IS A SECOND GENERATION PRINT MADE FROM AN ELECTRONICALLY PRODUCED NEGATIVE
Quantitative phase imaging using grating-based quadrature phase interferometer
NASA Astrophysics Data System (ADS)
Wu, Jigang; Yaqoob, Zahid; Heng, Xin; Cui, Xiquan; Yang, Changhuei
2007-02-01
In this paper, we report the use of holographic gratings, which act as the free-space equivalent of the 3x3 fiber-optic coupler, to perform full-field phase imaging. By recording two harmonically related gratings in the same holographic plate, we are able to obtain a nontrivial phase shift between different output ports of the gratings-based Mach-Zehnder interferometer. The phase difference can be adjusted by changing the relative phase of the recording beams when recording the hologram. We have built a Mach-Zehnder interferometer using harmonically related holographic gratings with 600 and 1200 lines/mm spacing. Two CCD cameras at the output ports of the gratings-based Mach-Zehnder interferometer are used to record the full-field quadrature interferograms, which are subsequently processed to reconstruct the phase image. The imaging system has ~12X magnification with a ~420 μm x 315 μm field of view. To demonstrate the capability of our system, we have successfully performed phase imaging of a pure phase object and of a Paramecium caudatum.
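The reconstruction step lends itself to a short illustration. Below is a minimal sketch, assuming two ideal output ports in exact quadrature, I_c = A + B·cos(φ) and I_s = A + B·sin(φ), with the background A known (e.g., from a reference frame); the array sizes and the Gaussian test phase are illustrative stand-ins, not the authors' data or code.

```python
import numpy as np

# Minimal sketch: recover a phase map from two quadrature interferograms,
# I_cos = A + B*cos(phi) and I_sin = A + B*sin(phi), as produced by two output
# ports with a 90-degree relative phase shift (illustrative, not the paper's code).
ny, nx = 315, 420                      # pixels spanning the ~420x315 um field
y, x = np.mgrid[0:ny, 0:nx]
phi_true = 2.0 * np.exp(-((x - nx/2)**2 + (y - ny/2)**2) / (2 * 60.0**2))

A, B = 1.0, 0.5                        # background level and fringe modulation
I_cos = A + B * np.cos(phi_true)
I_sin = A + B * np.sin(phi_true)

# With A known, the wrapped phase follows directly from the two channels.
phi_rec = np.arctan2(I_sin - A, I_cos - A)

print("max reconstruction error (rad):", np.abs(phi_rec - phi_true).max())
```

Because the test phase stays within (-π, π], no unwrapping step is needed in this toy example; a real specimen would generally require one.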
Navarro, Pedro J; Alonso, Diego; Stathis, Kostas
2016-01-01
We develop an automated image processing system for detecting microaneurysms (MAs) in diabetic patients. Diabetic retinopathy is one of the main causes of preventable blindness in working-age diabetic people, and the presence of an MA is one of its first signs. We transform the eye fundus images to the L*a*b* color space in order to process the L* and a* channels separately, looking for MAs in each of them. We then fuse the results and finally send the MA candidates to a k-nearest neighbors classifier for final assessment. The performance of the method, measured against 50 images with an ophthalmologist's hand-drawn ground truth, shows high sensitivity (100%) and accuracy (84%), with running times around 10 s. This kind of automatic image processing application is important for reducing the burden on the public health system associated with the diagnosis of diabetic retinopathy, given the high number of potential patients that need periodic screening.
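A minimal sketch of the pipeline described above, under assumed details: the toy candidate detector, feature set, and synthetic training data here are illustrative stand-ins, not the authors' implementation; only the overall flow (L*a*b* split, per-channel candidates, fusion, k-NN scoring) follows the abstract.

```python
import numpy as np
from skimage.color import rgb2lab
from sklearn.neighbors import KNeighborsClassifier

def candidate_features(channel, thresh=0.9):
    """Toy candidate detector: flag locally bright pixels and return
    (row, col, normalized intensity) triplets as crude features."""
    norm = (channel - channel.min()) / (channel.max() - channel.min() + 1e-9)
    rows, cols = np.nonzero(norm > thresh)
    return np.column_stack([rows, cols, norm[rows, cols]])

def detect_microaneurysms(rgb_image, knn):
    lab = rgb2lab(rgb_image)                       # L*, a*, b* channels
    cand_L = candidate_features(lab[..., 0])       # candidates from L*
    cand_a = candidate_features(lab[..., 1])       # candidates from a*
    candidates = np.vstack([cand_L, cand_a])       # fuse the two candidate lists
    if len(candidates) == 0:
        return candidates, np.array([])
    return candidates, knn.predict(candidates)     # final k-NN assessment

# Usage with synthetic training data (placeholder for a hand-labelled set).
rng = np.random.default_rng(0)
X_train = rng.random((200, 3))
y_train = (X_train[:, 2] > 0.5).astype(int)        # 1 = MA, 0 = spurious
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)

image = rng.random((64, 64, 3))                    # stand-in fundus image
cands, labels = detect_microaneurysms(image, knn)
print(len(cands), "candidates,", int(labels.sum()), "classified as MA")
```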
A laboratory demonstration of the capability to image an Earth-like extrasolar planet.
Trauger, John T; Traub, Wesley A
2007-04-12
The detection and characterization of an Earth-like planet orbiting a nearby star requires a telescope with an extraordinarily large contrast at small angular separations. At visible wavelengths, an Earth-like planet would be 1 x 10^-10 times fainter than the star at angular separations of typically 0.1 arcsecond or less. There are several proposed space telescope systems that could, in principle, achieve this. Here we report a laboratory experiment that reaches these limits. We have suppressed the diffracted and scattered light near a star-like source to a level of 6 x 10^-10 times the peak intensity in individual coronagraph images. In a series of such images, together with simple image processing, we have effectively reduced this to a residual noise level of about 0.1 x 10^-10. This demonstrates that a coronagraphic telescope in space could detect and spectroscopically characterize nearby exoplanetary systems, with the sensitivity to image an 'Earth-twin' orbiting a nearby star.
Application of ultrasound processed images in space: Quantitative assessment of diffuse affectations
NASA Astrophysics Data System (ADS)
Pérez-Poch, A.; Bru, C.; Nicolau, C.
The purpose of this study was to evaluate diffuse affectations of the liver using texture image processing techniques. Ultrasound diagnostic equipment is the modality of choice for use in space environments because it is free from hazardous effects on health. However, because highly trained radiologists are needed to assess the images, this imaging method is mainly applied to focal lesions rather than to diffuse (non-focal) ones. We conducted a clinical study on 72 patients with different degrees of chronic hepatopathies and a control group of 18 individuals. All subjects' clinical reports and biopsy results were compared with the degree of affectation calculated by our computer system, thus validating the method. Full statistical results are given in the present paper, showing a good correlation (r = 0.61) between the pathologist's report and the analysis of the heterogeneity of the processed liver images. This computer system for analyzing diffuse affectations may be used in situ or via a telemedicine link to the ground.
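A minimal sketch of one way such a texture heterogeneity measure could look; the patch-mean statistic and the synthetic "healthy" versus "diffuse" images below are assumptions for illustration, not the authors' method.

```python
import numpy as np

# Illustrative heterogeneity index for a liver ultrasound region of interest:
# tile the ROI into small patches and use the normalized spread of per-patch
# mean echogenicity as a scalar measure of diffuse heterogeneity.

def heterogeneity_index(roi, patch=8):
    rows = roi.shape[0] // patch
    cols = roi.shape[1] // patch
    means = [roi[r*patch:(r+1)*patch, c*patch:(c+1)*patch].mean()
             for r in range(rows) for c in range(cols)]
    return float(np.std(means) / (np.mean(means) + 1e-9))   # normalized spread

rng = np.random.default_rng(0)
healthy = rng.normal(100.0, 5.0, size=(64, 64))              # fairly uniform speckle
diffuse = healthy + 20.0 * rng.random((8, 8)).repeat(8, 0).repeat(8, 1)
print(heterogeneity_index(healthy), heterogeneity_index(diffuse))
```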
Design Analysis of a Space Based Chromotomographic Hyperspectral Imaging Experiment
2010-03-01
[Abstract not fully recoverable from the source extraction. Surviving fragments include part of a tilt-platform specification table (S-340 models with aluminum, Invar/Zerodur glass, titanium/BK7 glass, and steel mirror options) and text noting that the instrument is composed of a telescope, two grating spectrometers, calibration lamps, and focal plane electronics and cooling system; that the telescope is a three-mirror design; and that the advanced hyperspectral imager experiment for coastal bathymetry will closely mirror the proposed space-based chromotomographic hyperspectral imager.]
Hubble Space Telescope, Faint Object Camera
NASA Technical Reports Server (NTRS)
1981-01-01
This drawing illustrates the Hubble Space Telescope's (HST's) Faint Object Camera (FOC). The FOC reflects light down one of two optical pathways. The light enters a detector after passing through filters or through devices that can block out light from bright objects. Light from bright objects is blocked out to enable the FOC to see background images. The detector intensifies the image, then records it much like a television camera. For faint objects, images can be built up over long exposure times. The total image is translated into digital data, transmitted to Earth, and then reconstructed. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from a low-Earth orbit. By placing the telescope in space, astronomers are able to collect data free of the distorting effects of the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than that visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors.
NASA Technical Reports Server (NTRS)
Wu, Diana Terri; Ricco, Antonio Joseph; Lera, Matthew P.; Timucin, Linda R.; Parra, Macarena P.
2012-01-01
Nanosatellites offer frequent, low-cost space access as secondary payloads on launches of larger conventional satellites. We summarize the payload science and technology of the Microsatellite in-situ Space Technologies (MisST) nanosatellite for conducting automated biological experiments. The payload (two fused 10-cm cubes) includes 1) an integrated fluidics system that maintains organism viability and supports growth and 2) a fixed-focus imager with fluorescence and scattered-light imaging capabilities. The payload monitors temperature, pressure and relative humidity, and actively controls temperature. C. elegans (nematode, 50 μm diameter x 1 mm long) was selected as a model organism due to previous space science experience, its completely sequenced genome, size, hardiness, and the variety of strains available. Three strains were chosen: two green GFP-tagged strains and one red tdTomato-tagged strain that label intestinal, nerve, and pharyngeal cells, respectively. The integrated fluidics system includes bioanalytical and reservoir modules. The former consists of four 150 μL culture wells and a 4x5 mm imaging zone; the latter includes two 8 mL fluid reservoirs for reagent and waste storage. The fluidic system is fabricated using multilayer polymer rapid prototyping: laser cutting, precision machining, die cutting, and pressure-sensitive adhesives; it also includes eight solenoid-operated valves and one mini peristaltic pump. Young larval-stage (L2) nematodes are loaded in C. elegans Maintenance Media (CeMM) in the bioanalytical module during pre-launch assembly. By the time orbit is established, the worms have grown to sufficient density to be imaged and are fed fresh CeMM. The strains are pumped sequentially into the imaging area, imaged, then pumped into waste. Reagent storage utilizes polymer bags under slight pressure to prevent bubble formation in wells or channels. The optical system images green and red fluorescence bands by excitation with blue (473 nm peak) and amber (587 nm peak) LEDs; it achieves 8 μm lateral resolution using a CMOS imaging chip (as configured for serial data speeds) or 4 μm resolution using USB imaging chips. The imager consists of a modified commercial off-the-shelf CMOS chip camera; amber, blue, and white LEDs; and a relay lens and dual-band filters to obviate moving parts while supporting both fluorescence wavelengths.
Characteristics of mist 3D screen for projection type electro-holography
NASA Astrophysics Data System (ADS)
Sato, Koki; Okumura, Toshimichi; Kanaoka, Takumi; Koizumi, Shinya; Nishikawa, Satoko; Takano, Kunihiko
2006-01-01
A hologram provides a full-parallax 3D image. In this case a more natural 3D image is obtained because focusing and convergence coincide with each other. We aim for a practical electro-holography system; in conventional electro-holography the image viewing angle is very small because of the limited display pixel size. We are therefore developing a new method to obtain a large viewing angle by space projection. White laser light is irradiated onto a single DMD panel displaying a time-shared CGH of the three RGB colors. A 3D space screen formed by very small water particles (mist) is used to reconstruct the 3D image with a large viewing angle through scattering from the water particles.
Design of a space-based infrared imaging interferometer
NASA Astrophysics Data System (ADS)
Hart, Michael; Hope, Douglas; Romeo, Robert
2017-07-01
Present space-based optical imaging sensors are expensive. Launch costs are dictated by weight and size, and system design must take into account the low fault tolerance of a system that cannot be readily accessed once deployed. We describe the design and first prototype of the space-based infrared imaging interferometer (SIRII) that aims to mitigate several aspects of the cost challenge. SIRII is a six-element Fizeau interferometer intended to operate in the short-wave and midwave IR spectral regions over a 6×6 mrad field of view. The volume is smaller by a factor of three than a filled-aperture telescope with equivalent resolving power. The structure and primary optics are fabricated from light-weight space-qualified carbon fiber reinforced polymer; they are easy to replicate and inexpensive. The design is intended to permit one-time alignment during assembly, with no need for further adjustment once on orbit. A three-element prototype of the SIRII imager has been constructed with a unit telescope primary mirror diameter of 165 mm and edge-to-edge baseline of 540 mm. The optics, structure, and interferometric signal processing principles draw on experience developed in ground-based astronomical applications designed to yield the highest sensitivity and resolution with cost-effective optical solutions. The initial motivation for the development of SIRII was the long-term collection of technical intelligence from geosynchronous orbit, but the scalable nature of the design will likely make it suitable for a range of IR imaging scenarios.
Lens Systems for Sky Surveys and Space Surveillance
NASA Astrophysics Data System (ADS)
Ackermann, M.; McGraw, J.; Zimmer, P.
2013-09-01
Since the early days of astrophotography, lens systems have played a key role in capturing images of the night sky. The first images were attempted with visual refractors. These were soon followed by color-corrected refractors and finally by specially designed photo-refractors. Being telescopes, these instruments were of long focus and imaged narrow fields of view. Simple photographic lenses were soon put into service to capture wide-field images. These lenses also had the advantage of requiring shorter exposure times than were possible using large refractors. Eventually, lenses were specifically designed for astrophotography. With the introduction of the Schmidt camera and related catadioptric systems, the popularity of astrograph lenses declined, but surprisingly, a few remained in use. Over the last 30 years, as small CCDs have displaced large photographic plates, lens systems have again found favor for their ability to image great swaths of sky in a relatively small and simple package. In this paper, we follow the development of lens-based astrograph systems from their beginnings through the current use of both commercial and custom lens systems for sky surveys and space surveillance. Some of the optical milestones discussed include the early Petzval-type portrait lenses, the Ross astrographic lens, and the current generation of optics such as the commercial 200 mm camera lens by Canon and the Russian VT-53e in service with ISON.
The Revolution in Earth and Space Science Education.
ERIC Educational Resources Information Center
Barstow, Daniel; Geary, Ed; Yazijian, Harvey
2002-01-01
Explains the changing nature of earth and space science education such as using inquiry-based teaching, how technology allows students to use satellite images in inquiry-based investigations, the consideration of earth and space as a whole system rather than a sequence of topics, and increased student participation in learning opportunities. (YDS)
Alternatives for Military Space Radar
2007-01-01
transmitted microwaves to produce images of the Earth's surface (somewhat akin to photographs produced by optical imaging). By providing their own microwaves for illumination (rather than sunlight, as in an optical imaging system), radars can produce... carry a variety of payloads, including electro-optical, infrared, and SAR imagers; a film camera; and signals-intelligence equipment. The aircraft's...
Space environments and their effects on space automation and robotics
NASA Technical Reports Server (NTRS)
Garrett, Henry B.
1990-01-01
Automated and robotic systems will be exposed to a variety of environmental anomalies as a result of adverse interactions with the space environment. As an example, the coupling of electrical transients into control systems, due to EMI from plasma interactions and solar array arcing, may cause spurious commands that could be difficult to detect and correct in time to prevent damage during critical operations. Spacecraft glow and space debris could introduce false imaging information into optical sensor systems. The presentation provides a brief overview of the primary environments (plasma, neutral atmosphere, magnetic and electric fields, and solid particulates) that cause such adverse interactions. The descriptions, while brief, are intended to provide a basis for the other papers presented at this conference which detail the key interactions with automated and robotic systems. Given the growing complexity and sensitivity of automated and robotic space systems, an understanding of adverse space environments will be crucial to mitigating their effects.
Closeup view of the aft fuselage looking forward along the ...
Close-up view of the aft fuselage looking forward along the approximate centerline of the Orbiter Discovery looking at the expansion nozzles of the Space Shuttle Main Engines (SSME) and the Orbiter Maneuvering System. Also in the view is the orbiter's body flap with a protective covering over the High-temperature Reusable Surface Insulation tiles on the surface facing the SSMEs. This image was taken inside the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Space and Earth Science Data Compression Workshop
NASA Technical Reports Server (NTRS)
Tilton, James C. (Editor)
1991-01-01
The workshop explored opportunities for data compression to enhance the collection and analysis of space and Earth science data. The focus was on scientists' data requirements, as well as constraints imposed by the data collection, transmission, distribution, and archival systems. The workshop consisted of several invited papers; two described information systems for space and Earth science data, four depicted analysis scenarios for extracting information of scientific interest from data collected by Earth orbiting and deep space platforms, and a final one was a general tutorial on image data compression.
Second generation spectrograph for the Hubble Space Telescope
NASA Astrophysics Data System (ADS)
Woodgate, B. E.; Boggess, A.; Gull, T. R.; Heap, S. R.; Krueger, V. L.; Maran, S. P.; Melcher, R. W.; Rebar, F. J.; Vitagliano, H. D.; Green, R. F.; Wolff, S. C.; Hutchings, J. B.; Jenkins, E. B.; Linsky, J. L.; Moos, H. W.; Roesler, F.; Shine, R. A.; Timothy, J. G.; Weistrop, D. E.; Bottema, M.; Meyer, W.
1986-01-01
The preliminary design for the Space Telescope Imaging Spectrograph (STIS), which has been selected by NASA for definition study for future flight as a second-generation instrument on the Hubble Space Telescope (HST), is presented. STIS is a two-dimensional spectrograph that will operate from 1050 Å to 11,000 Å at the limiting HST resolution of 0.05 arcsec FWHM, with spectral resolutions of 100, 1200, 20,000, and 100,000 and a maximum field of view of 50 x 50 arcsec. Its basic operating modes include an echelle mode, a long-slit mode, a slitless spectrograph mode, coronagraphic spectroscopy, photon time-tagging, and direct imaging. Research objectives are active galactic nuclei, the intergalactic medium, global properties of galaxies, the origin of stellar systems, stellar spectral variability, and spectrographic mapping of solar system processes.
Pagoulatos, N; Edwards, W S; Haynor, D R; Kim, Y
1999-12-01
The use of stereotactic systems has been one of the main approaches for image-based guidance of the surgical tool within the brain. The main limitation of stereotactic systems is that they are based on preoperative images that might become outdated and invalid during the course of surgery. Ultrasound (US) is considered the most practical and cost-effective intraoperative imaging modality, but US images inherently have a low signal-to-noise ratio. Integrating intraoperative US with stereotactic systems has recently been attempted. In this paper, we present a new system for interactively registering two-dimensional US and three-dimensional magnetic resonance (MR) images. This registration is based on tracking the US probe with a dc magnetic position sensor. We have performed an extensive analysis of the errors of our system by using a custom-built phantom. The registration error between the MR and the position sensor space was found to have a mean value of 1.78 mm and a standard deviation of 0.18 mm. The registration error between US and MR space was dependent on the distance of the target point from the US probe face. For a 3.5-MHz phased one-dimensional array transducer and a depth of 6 cm, the mean value of the registration error was 2.00 mm and the standard deviation was 0.75 mm. The registered MR images were reconstructed using either zeroth-order or first-order interpolation. The ease of use and the interactive nature of our system (approximately 6.5 frames/s for 344 x 310 images and first-order interpolation on a Pentium II 450 MHz) demonstrates its potential to be used in the operating room.
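A minimal sketch of the coordinate chain implied above, with assumed names: a US pixel is mapped through a fixed probe-calibration transform into the sensor frame, through the tracked sensor pose into world space, and through the MR-to-world registration into MR space. The identity matrices in the usage example are placeholders, not calibration results from the paper.

```python
import numpy as np

# Illustrative transform chain for sensor-tracked 2-D US to 3-D MR registration.

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def us_pixel_to_mr(px, py, pixel_size_mm, T_probe_calib, T_sensor_pose, T_mr_from_world):
    """Map a US image pixel (px, py) to MR coordinates (mm)."""
    p_image = np.array([px * pixel_size_mm, py * pixel_size_mm, 0.0, 1.0])
    p_world = T_sensor_pose @ T_probe_calib @ p_image
    return (T_mr_from_world @ p_world)[:3]

# Usage with identity calibration and a simple sensor pose (illustration only).
T_calib = homogeneous(np.eye(3), np.zeros(3))
T_pose = homogeneous(np.eye(3), np.array([10.0, 20.0, 30.0]))   # sensor at (10,20,30) mm
T_mr = homogeneous(np.eye(3), np.zeros(3))
print(us_pixel_to_mr(172, 155, 0.3, T_calib, T_pose, T_mr))
```

In the system described above, the residual errors of the two links in this chain (sensor-to-MR and US-to-sensor) are exactly the 1.78 mm and depth-dependent 2.00 mm figures reported.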
Real-time Flare Detection in Ground-Based Hα Imaging at Kanzelhöhe Observatory
NASA Astrophysics Data System (ADS)
Pötzi, W.; Veronig, A. M.; Riegler, G.; Amerstorfer, U.; Pock, T.; Temmer, M.; Polanec, W.; Baumgartner, D. J.
2015-03-01
Kanzelhöhe Observatory (KSO) regularly performs high-cadence full-disk imaging of the solar chromosphere in the Hα and Ca ii K spectral lines as well as of the solar photosphere in white light. In the frame of ESA's (European Space Agency) Space Situational Awareness (SSA) program, a new system for real-time Hα data provision and automatic flare detection was developed at KSO. The data and events detected are published in near real-time at ESA's SSA Space Weather portal (http://swe.ssa.esa.int/web/guest/kso-federated). In this article, we describe the Hα instrument, the image-recognition algorithms we developed, and the implementation into the KSO Hα observing system. We also present the evaluation results of the real-time data provision and flare detection for a period of five months. The Hα data provision worked for 99.96 % of the images, with a mean time lag of four seconds between image recording and online provision. Within the given criteria for the automatic image-recognition system (at least three Hα images are needed for a positive detection), all flares with an area ≥ 50 micro-hemispheres that were located within 60° of the solar center and occurred during the KSO observing times were detected, 87 events in total. The automatically determined flare importance and brightness classes were correct in ~85 % of cases. The mean flare positions in heliographic longitude and latitude were correct to within ~1°. The median of the absolute differences between the flare start and peak times from the automatic detections and the official NOAA (and KSO) visual flare reports was 3 min (1 min).
NASA Astrophysics Data System (ADS)
Seager, Sara; Cash, Webster C.; Kasdin, N. Jeremy; Sparks, William B.; Turnbull, Margaret C.; Kuchner, Marc J.; Roberge, Aki; Domagal-Goldman, Shawn; Shaklan, Stuart; Thomson, Mark; Lisman, Doug; Martin, Suzanne; Cady, Eric; Webb, David
2014-06-01
"Exo-S" is NASA's first directed community study of a starshade and telescope system for space-based discovery and characterization of exoplanets by direct imaging. Under a cost cap of $1B, Exo-S will use a modestly sized starshade (also known as an "external occulter") and a modest aperture space telescope for high contrast observations of exoplanetary systems. The Exo-S will obtain spectra of a subset of its newly discovered exoplanets as well as already known Jupiter-mass exoplanets. Exo-S will be capable of reaching down to the discovery of Earth-size planets in the habitable zones of twenty sun-like stars, with a favorable few accessible for spectral characterization. We present highlights of the science goals, the mission design, and technology milestones already reached. At the study conclusion in 2015, NASA will evaluate the Exo-S concept for potential development at the end of this decade.
NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1
NASA Technical Reports Server (NTRS)
Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)
1990-01-01
The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects are presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.
End-to-end imaging information rate advantages of various alternative communication systems
NASA Technical Reports Server (NTRS)
Rice, R. F.
1982-01-01
The efficiency of various deep space communication systems which are required to transmit both imaging and a typically error sensitive class of data called general science and engineering (gse) are compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as two orders of magnitude increase in imaging information rate compared to a single channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts as well as efforts to apply them are provided in support of the system analysis.
X-space MPI: magnetic nanoparticles for safe medical imaging.
Goodwill, Patrick William; Saritas, Emine Ulku; Croft, Laura Rose; Kim, Tyson N; Krishnan, Kannan M; Schaffer, David V; Conolly, Steven M
2012-07-24
One quarter of all iodinated contrast X-ray clinical imaging studies are now performed on Chronic Kidney Disease (CKD) patients. Unfortunately, the iodine contrast agent used in X-ray is often toxic to CKD patients' weak kidneys, leading to significant morbidity and mortality. Hence, we are pioneering a new medical imaging method, called Magnetic Particle Imaging (MPI), to replace X-ray and CT iodinated angiography, especially for CKD patients. MPI uses magnetic nanoparticle contrast agents that are much safer than iodine for CKD patients. MPI already offers superb contrast and extraordinary sensitivity. The iron oxide nanoparticle tracers required for MPI are also used in MRI, and some are already approved for human use, but the contrast agents are far more effective at illuminating blood vessels when used in the MPI modality. We have recently developed a systems theoretic framework for MPI called x-space MPI, which has already dramatically improved the speed and robustness of MPI image reconstruction. X-space MPI has allowed us to optimize the hardware for five MPI scanners. Moreover, x-space MPI provides a powerful framework for optimizing the size and magnetic properties of the iron oxide nanoparticle tracers used in MPI. Currently MPI nanoparticles have diameters in the 10-20 nanometer range, enabling millimeter-scale resolution in small animals. X-space MPI theory predicts that larger nanoparticles could enable up to 250 micrometer resolution imaging, which would represent a major breakthrough in safe imaging for CKD patients.
Direct Imaging of Stellar Surfaces: Results from the Stellar Imager (SI) Vision Mission Study
NASA Technical Reports Server (NTRS)
Carpenter, Kenneth; Schrijver, Carolus; Karovska, Margarita
2006-01-01
The Stellar Imager (SI) is a UV-Optical, Space-Based Interferometer designed to enable 0.1 milli-arcsecond (mas) spectral imaging of stellar surfaces and stellar interiors (via asteroseismology) and of the Universe in general. SI is identified as a "Flagship and Landmark Discovery Mission'' in the 2005 Sun Solar System Connection (SSSC) Roadmap and as a candidate for a "Pathways to Life Observatory'' in the Exploration of the Universe Division (EUD) Roadmap (May, 2005). The ultra-sharp images of the Stellar Imager will revolutionize our view of many dynamic astrophysical processes: The 0.1 mas resolution of this deep-space telescope will transform point sources into extended sources, and snapshots into evolving views. SI's science focuses on the role of magnetism in the Universe, particularly on magnetic activity on the surfaces of stars like the Sun. SI's prime goal is to enable long-term forecasting of solar activity and the space weather that it drives in support of the Living With a Star program in the Exploration Era. SI will also revolutionize our understanding of the formation of planetary systems, of the habitability and climatology of distant planets, and of many magneto-hydrodynamically controlled processes in the Universe. In this paper we will discuss the results of the SI Vision Mission Study, elaborating on the science goals of the SI Mission and a mission architecture that could meet those goals.
Applications of artificial intelligence V; Proceedings of the Meeting, Orlando, FL, May 18-20, 1987
NASA Technical Reports Server (NTRS)
Gilmore, John F. (Editor)
1987-01-01
The papers contained in this volume focus on current trends in applications of artificial intelligence. Topics discussed include expert systems, image understanding, artificial intelligence tools, knowledge-based systems, heuristic systems, manufacturing applications, and image analysis. Papers are presented on expert system issues in automated, autonomous space vehicle rendezvous; traditional versus rule-based programming techniques; applications to the control of optional flight information; methodology for evaluating knowledge-based systems; and real-time advisory system for airborne early warning.
BRIGHT PROMINENCE ERUPTION (DECEMBER 14, 2012)
2012-12-17
The Sun blows a robust prominence out into space (Dec. 10, 2012). The outer image, from the STEREO-A COR1 coronagraph, has been changed from green to red to complement the green Sun image, taken in extreme UV light. The movie covers six hours of activity. Kind of Christmassy looking, isn't it? Some of the prominence falls back towards the Sun, although the disturbance as a whole continues out into the solar system. Credit: NASA/GSFC/STEREO
Complex Pupil Masks for Aberrated Imaging of Closely Spaced Objects
NASA Astrophysics Data System (ADS)
Reddy, A. N. K.; Sagar, D. K.; Khonina, S. N.
2017-12-01
The current approach demonstrates the suppression of optical side-lobes and the contraction of the main lobe in the composite image of two object points of an optical system under defocusing, when asymmetric phase edges are imposed over the apodized circular aperture. The resolution of two point sources with different intensity ratios is discussed in terms of the modified Sparrow criterion, as a function of the degree of coherence of the illumination, the intensity difference, and the degree of asymmetric phase masking. Here we introduce and explore the effects of focus aberration (defect of focus) on the two-point resolution of optical systems. Results on the aberrated composite image of closely spaced objects with an amplitude mask and asymmetric phase masks form a significant contribution to astronomical and microscopic observations.
Closeup view of the reflective insulation protecting the Crew Compartment ...
Close-up view of the reflective insulation protecting the Crew Compartment bulkhead, orbiter structure and landing gear housing in the void created by the removal of the Forward Reaction Control System Module from the forward section of the Orbiter Discovery. This image was taken from the service platform in the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Detail view of the port side of the payload bay ...
Detail view of the port side of the payload bay of the Orbiter Discovery. This view shows Remote Manipulator System, Canadarm, sensors in the center of the image and a close-up view of a small segment of the orbiter's radiator panel. This photograph was taken in the Orbiter Processing Facility at the Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Design and Implementation of a Novel Portable 360° Stereo Camera System with Low-Cost Action Cameras
NASA Astrophysics Data System (ADS)
Holdener, D.; Nebiker, S.; Blaser, S.
2017-11-01
The demand for capturing indoor spaces is rising with the digitalization trend in the construction industry. An efficient solution for measuring challenging indoor environments is mobile mapping. Image-based systems with 360° panoramic coverage allow rapid data acquisition and can be processed into georeferenced 3D images hosted in cloud-based 3D geoinformation services. For the multiview stereo camera system presented in this paper, 360° coverage is achieved with a layout consisting of five horizontal stereo image pairs in a circular arrangement. The design is implemented as a low-cost solution based on a 3D-printed camera rig and action cameras with fisheye lenses. The fisheye stereo system is successfully calibrated with accuracies sufficient for the applied measurement task. A comparison of 3D distances with reference data yields maximum deviations of 3 cm for typical indoor distances of 2-8 m. The automatic computation of coloured point clouds from the stereo pairs is also demonstrated.
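For the distance comparison mentioned above, the underlying relation is ordinary stereo triangulation. Below is a minimal sketch assuming an already rectified pinhole pair (in practice the fisheye images would first be undistorted and rectified); the focal length, baseline, and pixel coordinates are illustrative, not the system's calibration.

```python
import numpy as np

# Rectified stereo relation: depth Z = f * B / d, where f is the focal length
# in pixels, B the stereo baseline, and d the horizontal disparity.

def stereo_point(u_left, u_right, v, f_px, baseline_m, cx, cy):
    d = u_left - u_right                    # disparity in pixels
    Z = f_px * baseline_m / d               # depth along the optical axis
    X = (u_left - cx) * Z / f_px            # lateral offset
    Y = (v - cy) * Z / f_px                 # vertical offset
    return np.array([X, Y, Z])

# Example: 1200 px focal length, 20 cm baseline, 60 px disparity -> Z = 4 m.
print(stereo_point(u_left=700, u_right=640, v=500, f_px=1200.0,
                   baseline_m=0.2, cx=640.0, cy=480.0))
```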
Detail view of the interior of the flight deck looking ...
Detail view of the interior of the flight deck looking forward showing the overhead control panels. Note that the flight deck windows have protective covers over them in this view. This image can be digitally stitched with image HAER No. TX-116-A-19 to expand the view to include the Commander and Pilot positions during ascent and reentry and landing. This view was taken in the Orbiter Processing Facility at the Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Deformable Mirrors Correct Optical Distortions
NASA Technical Reports Server (NTRS)
2010-01-01
By combining the high sensitivity of space telescopes with revolutionary imaging technologies consisting primarily of adaptive optics, the Terrestrial Planet Finder is slated to have imaging power 100 times greater than the Hubble Space Telescope. To this end, Boston Micromachines Corporation, of Cambridge, Massachusetts, received Small Business Innovation Research (SBIR) contracts from the Jet Propulsion Laboratory for space-based adaptive optical technology. The work resulted in a microelectromechanical systems (MEMS) deformable mirror (DM) called the Kilo-DM. The company now offers a full line of MEMS DMs, which are being used in observatories across the world, in laser communication, and microscopy.
Hyperspectral Imaging of human arm
NASA Technical Reports Server (NTRS)
2003-01-01
ProVision Technologies, a NASA research partnership center at Stennis Space Center in Mississippi, has developed a new hyperspectral imaging (HSI) system that is much smaller than the original large units used aboard remote sensing aircraft and satellites. The new apparatus is about the size of a breadbox. Health-related applications of HSI include non-invasive analysis of human skin to characterize wounds and wound healing rates (especially important for space travelers, who heal more slowly) and determining whether burns are first-, second-, or third-degree without resorting to painful punch biopsies. The work is sponsored under NASA's Space Product Development (SPD) program.
System and method for generating motion corrected tomographic images
Gleason, Shaun S [Knoxville, TN; Goddard, Jr., James S.
2012-05-01
A method and related system for generating motion-corrected tomographic images includes the steps of illuminating a region of interest (ROI) to be imaged, the ROI being part of an unrestrained live subject and having at least three spaced-apart optical markers thereon. Simultaneous images of the markers are acquired from a first and a second camera at different angles. Motion data comprising the 3D position and orientation of the markers relative to an initial reference position are then calculated. Motion-corrected tomographic data are then obtained from the ROI using the motion data, and motion-corrected tomographic images are reconstructed from these data.
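A minimal sketch of the marker-based pose recovery such a system relies on, assuming three non-collinear markers and a least-squares (Kabsch/Procrustes) fit; this illustrates the principle only and is not the patented implementation.

```python
import numpy as np

# Recover the rigid-body motion (R, t) that maps reference marker positions P
# to current marker positions Q; the same R, t can then be used to map
# tomographic events back into the reference frame.

def rigid_transform(P, Q):
    """Find R, t minimizing sum_i ||R @ P_i + t - Q_i||^2 (Kabsch algorithm)."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)                                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # exclude reflections
    R = Vt.T @ D @ U.T
    t = Qc - R @ Pc
    return R, t

# Reference marker positions (mm) and the same markers after the subject moves.
P = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 8.0, 0.0]])
angle = np.deg2rad(12.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.5, -0.7, 2.0])

R, t = rigid_transform(P, Q)
print(np.allclose(R, R_true), np.round(t, 3))
```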
NASA Astrophysics Data System (ADS)
Lin, Alexander; Johnson, Lindsay C.; Shokouhi, Sepideh; Peterson, Todd E.; Kupinski, Matthew A.
2015-03-01
In synthetic-collimator SPECT imaging, two detectors are placed at different distances behind a multi-pinhole aperture. This configuration allows for image detection at different magnifications and photon energies, resulting in higher overall sensitivity while maintaining high resolution. Image multiplexing, the undesired overlapping between images due to photon-origin uncertainty, may occur in both detector planes and is often present in the second detector plane due to its greater magnification. However, artifact-free image reconstruction is possible by combining data from both the front detector (little to no multiplexing) and the back detector (noticeable multiplexing). When the two detectors are used in tandem, spatial resolution is increased, allowing for a higher sensitivity-to-detector-area ratio. Due to the variability in detector distances and pinhole spacings found in synthetic-collimator SPECT systems, a large parameter space must be examined to determine optimal imaging configurations. We chose to assess image quality based on the task of estimating activity in various regions of a mouse brain. Phantom objects were simulated using mouse brain data from the Magnetic Resonance Microimaging Neurological Atlas (MRM NeAt) and projected at different angles through models of a synthetic-collimator SPECT system developed by collaborators at Vanderbilt University. Uptake in the different brain regions was modeled as being normally distributed about predetermined means and variances. We computed the performance of the Wiener estimator for the task of estimating activity in different regions of the mouse brain. Our results demonstrate the utility of the method for optimizing synthetic-collimator system design.
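A minimal sketch of the Wiener (linear MMSE) estimator used as the task-based figure of merit, under assumed Gaussian priors and a random stand-in system matrix; the dimensions and covariances are illustrative, not the Vanderbilt system model.

```python
import numpy as np

# Linear Wiener estimator: with projection data g = H*theta + n, prior mean
# theta_bar, prior covariance K_theta, and noise covariance K_n, the MMSE
# estimate is theta_hat = theta_bar + K_theta H^T (H K_theta H^T + K_n)^-1 (g - H theta_bar).

rng = np.random.default_rng(1)
n_regions, n_meas = 6, 200

H = rng.random((n_meas, n_regions))          # system matrix (detector sensitivities)
theta_bar = np.full(n_regions, 10.0)         # prior mean uptake per brain region
K_theta = np.diag(np.full(n_regions, 4.0))   # prior variances
K_n = np.eye(n_meas)                         # measurement noise covariance

theta_true = theta_bar + rng.multivariate_normal(np.zeros(n_regions), K_theta)
g = H @ theta_true + rng.normal(0.0, 1.0, n_meas)

S = H @ K_theta @ H.T + K_n
W = K_theta @ H.T @ np.linalg.inv(S)         # Wiener gain
theta_hat = theta_bar + W @ (g - H @ theta_bar)

# The ensemble mean-squared error, tr(K_theta - W H K_theta), is the scalar
# compared across candidate detector spacings and pinhole layouts.
print("estimate:", np.round(theta_hat, 2))
print("EMSE:", np.trace(K_theta - W @ H @ K_theta))
```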
A micro-vibration generated method for testing the imaging quality on ground of space remote sensing
NASA Astrophysics Data System (ADS)
Gu, Yingying; Wang, Li; Wu, Qingwen
2018-03-01
In this paper, a novel method is proposed that simulates satellite platform micro-vibration and tests, on the ground, the impact of micro-vibration on the imaging quality of a space optical remote sensor. The method reproduces the on-orbit micro-vibration of the satellite platform in terms of vibrational degrees of freedom, spectrum, magnitude, and coupling path. Experiment results show that the relative error of acceleration control is within 7% at frequencies from 7 Hz to 40 Hz. Utilizing this method, a system-level test of the impact of micro-vibration on the imaging quality of a space optical remote sensor can be realized. The method has important applications in testing the micro-vibration tolerance margin of an optical remote sensor, verifying its vibration isolation and suppression performance, and exploring the principles of micro-vibration impact on its imaging quality.
2004-02-04
KENNEDY SPACE CENTER, FLA. - One of the world’s highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data (shown here) in preparation for the shuttle fleet’s return to flight, is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. The system, developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., allows multiple-person collaboration, highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]
Software for Managing an Archive of Images
NASA Technical Reports Server (NTRS)
Hallai, Charles; Jones, Helene; Callac, Chris
2003-01-01
This is a revised draft of the innovators' report on Software for Managing an Archive of Images. The SSC Multimedia Archive is an automated electronic system to manage images, acquired both by film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken both digitally and with a traditional film-based camera, along with metadata about each image.
The Long-Wave Infrared Earth Image as a Pointing Reference for Deep-Space Optical Communications
NASA Astrophysics Data System (ADS)
Biswas, A.; Piazzolla, S.; Peterson, G.; Ortiz, G. G.; Hemmati, H.
2006-11-01
Optical communications from space require an absolute pointing reference. Whereas at near-Earth and even planetary distances out to Mars and Jupiter a laser beacon transmitted from Earth can serve as such a pointing reference, for farther distances extending to the outer reaches of the solar system, the means for meeting this requirement remains an open issue. We discuss in this article the prospects and consequences of utilizing the Earth image sensed in the long-wave infrared (LWIR) spectral band as a beacon to satisfy the absolute pointing requirements. We have used data from satellite-based thermal measurements of Earth to synthesize images at various ranges and have shown the centroiding accuracies that can be achieved with prospective LWIR image sensing arrays. The nonuniform emissivity of Earth causes a mispointing bias error term that exceeds a provisional pointing budget allocation when using simple centroiding algorithms. Other issues related to implementing thermal imaging of Earth from deep space for the purposes of providing a pointing reference are also reported.
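A minimal sketch of the simple intensity-weighted centroiding referred to above; the synthetic nonuniform "Earth" disk is an assumption chosen to make the emissivity-induced bias visible, not data from the article.

```python
import numpy as np

# Intensity-weighted (first-moment) centroid of a 2-D LWIR image, used here as
# the pointing reference; nonuniform emissivity shifts the centroid away from
# the geometric center, producing the mispointing bias discussed above.

def centroid(image):
    """Return the intensity-weighted (row, col) centroid of a 2-D image."""
    image = np.asarray(image, dtype=float)
    total = image.sum()
    rows = np.arange(image.shape[0])
    cols = np.arange(image.shape[1])
    r_c = (image.sum(axis=1) * rows).sum() / total
    c_c = (image.sum(axis=0) * cols).sum() / total
    return r_c, c_c

# A disk whose left half radiates more than the right: the geometric center is
# (32, 32), but the centroid is pulled toward the brighter hemisphere.
y, x = np.mgrid[0:64, 0:64]
disk = ((x - 32)**2 + (y - 32)**2) <= 20**2
image = disk * np.where(x < 32, 1.2, 0.8)
print(centroid(image))        # row ~32, col below 32
```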
The New CCSDS Image Compression Recommendation
NASA Technical Reports Server (NTRS)
Yeh, Pen-Shu; Armbruster, Philippe; Kiely, Aaron; Masschelein, Bart; Moury, Gilles; Schaefer, Christoph
2005-01-01
The Consultative Committee for Space Data Systems (CCSDS) data compression working group has recently adopted a recommendation for image data compression, with a final release expected in 2005. The algorithm adopted in the recommendation consists of a two-dimensional discrete wavelet transform of the image, followed by progressive bit-plane coding of the transformed data. The algorithm can provide both lossless and lossy compression, and allows a user to directly control the compressed data volume or the fidelity with which the wavelet-transformed data can be reconstructed. The algorithm is suitable for both frame-based image data and scan-based sensor data, and has applications for near-Earth and deep-space missions. The standard will be accompanied by free software sources on a future web site. An Application-Specific Integrated Circuit (ASIC) implementation of the compressor is currently under development. This paper describes the compression algorithm along with the requirements that drove the selection of the algorithm. Performance results and comparisons with other compressors are given for a test set of space images.
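As a structural illustration only: the Recommendation specifies a particular multi-level wavelet and bit-plane coder, whereas this sketch substitutes a single-level integer Haar transform and raw bit-plane emission to show the two stages named above (transform, then most-significant-plane-first progressive coding).

```python
import numpy as np

# Toy stand-in for the CCSDS structure: 2-D transform followed by progressive
# bit-plane output. This is not the CCSDS 122.0 algorithm itself.

def haar2d(img):
    """Single-level integer Haar transform, subbands arranged in one array."""
    img = img.astype(np.int64)
    a = img[:, 0::2] + img[:, 1::2]          # horizontal sums
    d = img[:, 0::2] - img[:, 1::2]          # horizontal differences
    ll = a[0::2, :] + a[1::2, :]
    lh = a[0::2, :] - a[1::2, :]
    hl = d[0::2, :] + d[1::2, :]
    hh = d[0::2, :] - d[1::2, :]
    return np.block([[ll, hl], [lh, hh]])

def bit_planes(coeffs):
    """Yield the sign plane, then magnitude bit planes, most significant first."""
    mag = np.abs(coeffs)
    yield (coeffs < 0).astype(np.uint8)              # sign plane
    top = int(mag.max()).bit_length()
    for b in range(top - 1, -1, -1):
        yield ((mag >> b) & 1).astype(np.uint8)      # one bit plane per pass

# Truncating the plane stream after any plane gives a lossy image; keeping all
# planes allows lossless reconstruction of the integer transform coefficients.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8))
stream = list(bit_planes(haar2d(image)))
print(len(stream), "planes; first magnitude plane:\n", stream[1])
```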
NASA Astrophysics Data System (ADS)
Chuang, Cheng-Hung; Chen, Yen-Lin
2013-02-01
This study presents a steganographic optical image encryption system based on reversible data hiding and double random phase encoding (DRPE) techniques. Conventional optical image encryption systems can securely transmit valuable images using an encryption method for possible application in optical transmission systems. Steganographic optical image encryption based on the DRPE technique has been investigated as a way to hide secret data in encrypted images. However, the DRPE technique is vulnerable to attacks, and many of the data hiding methods used in DRPE systems can distort the decrypted images. The proposed system, based on reversible data hiding, uses a JBIG2 compression scheme to achieve lossless decrypted image quality and performs a prior encryption process. Thus, the DRPE technique enables a more secure optical encryption process. The proposed method extracts and compresses the bit planes of the original image using the lossless JBIG2 technique. The secret data are embedded in the remaining storage space. The RSA algorithm can cipher the compressed binary bits and secret data for advanced security. Experimental results show that the proposed system achieves a high data embedding capacity and lossless reconstruction of the original images.
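A minimal sketch of classical double random phase encoding, the optical-encryption stage named above; the mask generation and image are illustrative, and the reversible data hiding, JBIG2, and RSA stages of the proposed system are not shown.

```python
import numpy as np

# Classical DRPE: multiply the image by a random phase mask in the spatial
# domain and by a second one in the Fourier domain; the ciphertext is a
# noise-like complex field, and decryption applies the conjugate masks in
# reverse order.

rng = np.random.default_rng(42)
N = 64
image = rng.random((N, N))                         # stand-in plaintext image

phase1 = np.exp(2j * np.pi * rng.random((N, N)))   # spatial-domain key mask
phase2 = np.exp(2j * np.pi * rng.random((N, N)))   # Fourier-domain key mask

def drpe_encrypt(img):
    return np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

def drpe_decrypt(cipher):
    return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phase2)) * np.conj(phase1)

cipher = drpe_encrypt(image)
recovered = np.real(drpe_decrypt(cipher))
print("max decryption error:", np.abs(recovered - image).max())
```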
Mobile Aerial Tracking and Imaging System (MATRIS) for Aeronautical Research
NASA Technical Reports Server (NTRS)
Banks, Daniel W.; Blanchard, R. C.; Miller, G. M.
2004-01-01
A mobile, rapidly deployable ground-based system to track and image targets of aeronautical interest has been developed. Targets include reentering reusable launch vehicles (RLVs) as well as atmospheric and transatmospheric vehicles. The optics were designed to image targets in the visible and infrared wavelengths. To minimize acquisition cost and development time, the system uses commercially available hardware and software where possible. The conception and initial funding of this system originated with a study of ground-based imaging of global aerothermal characteristics of RLV configurations. During that study NASA teamed with the Missile Defense Agency/Innovative Science and Technology Experimentation Facility (MDA/ISTEF) to test techniques and analysis on two Space Shuttle flights.
1975-09-30
systems, a linear model results in an object f being mapped into an image g by a point-spread-function matrix H; thus, with noise, g = Hf + n (1). The simplest... linear models for imaging systems are given by space-invariant point spread functions (SIPSF), in which case H is block circulant. If the linear model is... {I1, ..., Ik-1} is a set of two-dimensional indices, each distinct and prior to k. Modeling Procedure: to derive the linear predictor (block LP of figure...
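A minimal sketch of the space-invariant linear model reconstructed above: applying the block-circulant H is equivalent to circular convolution with the PSF, which can be carried out with 2-D FFTs. The PSF and noise level are assumptions for illustration, not values from the report.

```python
import numpy as np

# g = H f + n for a space-invariant PSF: H f is circular convolution of the
# object with the PSF, implemented here in the Fourier domain.

rng = np.random.default_rng(3)
f = rng.random((32, 32))                 # object
psf = np.zeros((32, 32))
psf[:3, :3] = 1.0 / 9.0                  # 3x3 box blur, wrapped (circulant)

H_f = np.real(np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(psf)))   # H @ f
g = H_f + rng.normal(0.0, 0.01, f.shape)                          # g = Hf + n
print(g.shape)
```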
2007-01-01
primary scientific objectives: (1) Learn how planetary systems form from protostellar disks, and how they acquire their inhomogeneous composition; (2)... characterize the family of extrasolar planetary systems by imaging the structure in debris disks to understand how and where planets of different...
Differential morphology and image processing.
Maragos, P
1996-01-01
Image processing via mathematical morphology has traditionally used geometry to intuitively understand morphological signal operators and set or lattice algebra to analyze them in the space domain. We provide a unified view and analytic tools for morphological image processing that is based on ideas from differential calculus and dynamical systems. This includes ideas on using partial differential or difference equations (PDEs) to model distance propagation or nonlinear multiscale processes in images. We briefly review some nonlinear difference equations that implement discrete distance transforms and relate them to numerical solutions of the eikonal equation of optics. We also review some nonlinear PDEs that model the evolution of multiscale morphological operators and use morphological derivatives. Among the new ideas presented, we develop some general 2-D max/min-sum difference equations that model the space dynamics of 2-D morphological systems (including the distance computations) and some nonlinear signal transforms, called slope transforms, that can analyze these systems in a transform domain in ways conceptually similar to the application of Fourier transforms to linear systems. Thus, distance transforms are shown to be bandpass slope filters. We view the analysis of the multiscale morphological PDEs and of the eikonal PDE solved via weighted distance transforms as a unified area in nonlinear image processing, which we call differential morphology, and briefly discuss its potential applications to image processing and computer vision.
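A minimal sketch of one of the discrete distance transforms discussed above, written as the two-pass min-sum recursion (a city-block chamfer); this is a textbook illustration of the idea, not the paper's code.

```python
import numpy as np

# City-block distance transform as a min-sum difference equation: the distance
# at each pixel is the minimum over causal neighbors of (neighbor distance + 1),
# applied in a forward raster pass and then a backward raster pass.

def distance_transform(mask):
    """mask: boolean array, True on the object; returns the city-block (L1)
    distance from every pixel to the nearest True pixel."""
    big = mask.size + 1
    d = np.where(mask, 0, big).astype(float)
    rows, cols = d.shape
    for r in range(rows):                    # forward pass (top-left to bottom-right)
        for c in range(cols):
            if r > 0:
                d[r, c] = min(d[r, c], d[r - 1, c] + 1)
            if c > 0:
                d[r, c] = min(d[r, c], d[r, c - 1] + 1)
    for r in range(rows - 1, -1, -1):        # backward pass (bottom-right to top-left)
        for c in range(cols - 1, -1, -1):
            if r < rows - 1:
                d[r, c] = min(d[r, c], d[r + 1, c] + 1)
            if c < cols - 1:
                d[r, c] = min(d[r, c], d[r, c + 1] + 1)
    return d

seed = np.zeros((7, 7), dtype=bool)
seed[3, 3] = True
print(distance_transform(seed))    # L1 distances from the center pixel
```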
First Space VLBI Observations and Images Using the VLBA and VSOP
NASA Astrophysics Data System (ADS)
Romney, J. D.; Benson, J. M.; Claussen, M. J.; Desai, K. M.; Flatters, C.; Mioduszewski, A. J.; Ulvestad, J. S.
1997-12-01
The National Radio Astronomy Observatory (NRAO) is a participant in the VSOP Space VLBI mission, an international collaboration led by Japan's Institute of Space and Astronautical Science. NRAO has committed up to 30% of scheduled observing time on the Very Long Baseline Array (VLBA), and corresponding correlation resources, to Space VLBI observations. The NRAO Space VLBI Project, funded by NASA, has been working for several years to complete the necessary enhancements to the VLBA correlator and the AIPS image processing system. These developments were completed by the time of the successful launch of the VSOP mission's Halca spacecraft on 1997 February 12. As part of the in-orbit checkout phase, the first Space VLBI fringes from a VLBA observation were detected on 1997 June 12, and the VSOP mission's first images, in both the 1.6- and 5-GHz bands, were obtained shortly thereafter. In-orbit test observations continued through early September, with the first General Observing Time (GOT) scientific observations beginning in July. Through mid-October, a total of 20 Space VLBI observations, comprising 190 hours, had been completed at the VLBA correlator. This paper reviews the unique features of correlation and imaging of Space VLBI observations. These include, for correlation, the ephemeris for an orbiting VLBI ``station'' which is not fixed on the surface of the earth, and the requirement to close the loop on the phase-transfer process from a frequency standard on the ground to the spacecraft. Images from a number of early tests and scientific observations are presented. NRAO's user-support program, providing expert assistance in data analysis to Space VLBI observers, is also described.
A mathematical model of neuro-fuzzy approximation in image classification
NASA Astrophysics Data System (ADS)
Gopalan, Sasi; Pinto, Linu; Sheela, C.; Arun Kumar M., N.
2016-06-01
Image digitization and the explosion of the World Wide Web have made traditional search an inefficient method for retrieving the required grassland image data from a large database. For a given input query image, a Content-Based Image Retrieval (CBIR) system retrieves similar images from a large database. Advances in technology have increased the use of grassland image data in diverse areas such as agriculture, art galleries, education, and industry. In all of these areas it is necessary to retrieve grassland image data efficiently from a large database in order to perform an assigned task and make a suitable decision. This paper proposes a CBIR system that is based on grassland image properties and uses a feed-forward back-propagation neural network for effective image retrieval. Fuzzy memberships play an important role in the input space of the proposed system, leading to a combined neuro-fuzzy approximation in image classification. The mathematical model of the proposed CBIR system gives more clarity about the fuzzy-neuro approximation and the convergence of the image features in a grassland image.
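A hedged sketch of the general idea (fuzzified inputs feeding a back-propagation classifier), not the authors' model: the feature vectors, fuzzy set centres, and labels below are synthetic placeholders, and scikit-learn's MLPClassifier stands in for the feed-forward back-propagation network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def fuzzy_memberships(features, centers, sigma=1.0):
    """Gaussian membership of each feature vector to each fuzzy set centre."""
    d2 = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.random((200, 8))                            # stand-in per-image feature vectors
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)           # stand-in class labels
centers = X[rng.choice(len(X), 5, replace=False)]   # fuzzy set centres

M = fuzzy_memberships(X, centers)                   # fuzzified input space
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(M, y)
print(clf.score(M, y))
```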
2004-02-04
KENNEDY SPACE CENTER, FLA. - These towers are part of one of the world’s highest performing visual film analysis systems, developed to review and analyze previous shuttle flight data in preparation for the shuttle fleet’s return to flight. The system is being used today for another purpose. NASA has permitted its use in helping to analyze a film that shows a recent kidnapping in progress in Florida. Developed by NASA, United Space Alliance (USA) and Silicon Graphics Inc., the system allows multiple-person collaboration, highly detailed manipulation and evaluation of specific imagery. The system is housed in the Image Analysis Facility inside the Vehicle Assembly Building. [Photo taken Aug. 15, 2003, courtesy of Terry Wallace, SGI]
Andersen, Flemming; Watanabe, Hideaki; Bjarkam, Carsten; Danielsen, Erik H; Cumming, Paul
2005-07-15
The analysis of physiological processes in the brain by positron emission tomography (PET) is facilitated when images are spatially normalized to a standard coordinate system. Thus, PET activation studies of the human brain frequently employ the common stereotaxic coordinates of Talairach. We have developed an analogous stereotaxic coordinate system for the brain of the Gottingen miniature pig, based on automatic co-registration of magnetic resonance (MR) images obtained in 22 male pigs. The origin of the pig brain stereotaxic space (0, 0, 0) was arbitrarily placed in the centroid of the pineal gland as identified on the average MRI template. The orthogonal planes were imposed using the line between stereotaxic zero and the optic chiasm. A series of mean MR images in the coronal, sagittal and horizontal planes were generated. To test the utility of the common coordinate system for functional imaging studies of minipig brain, we calculated cerebral blood flow (CBF) maps from normal minipigs and from minipigs with a syndrome of parkinsonism induced by 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) poisoning. These maps were transformed from the native space into the common stereotaxic space. After global normalization of these maps, an undirected search for differences between the groups was then performed using statistical parametric mapping. Using this method, we detected a statistically significant focal increase in CBF in the left cerebellum of the MPTP-lesioned group. We expect the present approach to be of general use in the statistical parametric mapping of CBF and other physiological parameters in the living pig brain.
NASA Technical Reports Server (NTRS)
1996-01-01
Topics considered include: survey objectives; technologies for non-invasive imaging of the subsurface; cost; data requirements and sources; climatic conditions; hydrology and geology; chemicals; magnetometry; electrical methods (resistivity, potential); optical-style imaging; reflection/refraction seismics; gravitometry; photo-acoustic activation; well drilling and borehole analysis; comparative assessment matrix; ground sensors; choice of neutron sources; logistics of operations; system requirements; and health and safety plans.
View of ISS taken during the STS-122 Approach
2008-02-09
S122-E-007027 (9 Feb. 2008) --- This digital still image of the International Space Station was photographed through an overhead window on the Space Shuttle Atlantis as the two spacecraft approached each other for a Feb. 9 docking. While STS-122 astronauts were recording photos of their home for the next several days, crew members aboard the ISS were clicking images of the shuttle, with the primary focus being on its thermal protection system.
Correspondence Search Mitigation Using Feature Space Anti-Aliasing
2007-01-01
trackers are widely used in astro-inertial navigation systems for long-range aircraft, space navigation, and ICBM guidance. When ground images are to be ... the frequency-domain representation of the point spread function, H(fx, fy), is called the optical transfer function. Applying the Fourier transform to the ... frequency-domain representation of the image: I(fx, fy, t) = O(fx, fy, t) H(fx, fy) (4). In most conditions, the projected scene can be treated as a ...
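The relation I(fx, fy) = O(fx, fy) H(fx, fy) from this snippet can be demonstrated with a small sketch (mine, not the report's code): the image spectrum is the object spectrum multiplied by the OTF, i.e. the Fourier transform of the centred PSF. The Gaussian PSF below is an illustrative stand-in.

```python
import numpy as np

def image_spectrum(obj, psf):
    """I(fx, fy) = O(fx, fy) * H(fx, fy): multiply the object spectrum by the OTF."""
    otf = np.fft.fft2(np.fft.ifftshift(psf), s=obj.shape)  # OTF = FT of centred PSF
    O = np.fft.fft2(obj)
    I = O * otf
    img = np.real(np.fft.ifft2(I))           # blurred image back in the space domain
    return I, img

# toy usage: Gaussian PSF, random object
obj = np.random.rand(128, 128)
yy, xx = np.mgrid[-64:64, -64:64]
psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2)); psf /= psf.sum()
I, blurred = image_spectrum(obj, psf)
```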
2017-12-08
The gravitational field surrounding this massive cluster of galaxies, Abell 68, acts as a natural lens in space to brighten and magnify the light coming from very distant background galaxies. Like a fun house mirror, lensing creates a fantasy landscape of arc-like images and mirror images of background galaxies. The foreground cluster is 2 billion light-years away, and the lensed images come from galaxies far behind it. In this photo, the image of a spiral galaxy at upper left has been stretched and mirrored into a shape similar to that of a simulated alien from the classic 1970s computer game "Space Invaders!" A second, less distorted image of the same galaxy appears to the left of the large, bright elliptical galaxy. In the upper right of the photo is another striking feature of the image that is unrelated to gravitational lensing. What appears to be purple liquid dripping from a galaxy is a phenomenon called ram-pressure stripping. The gas clouds within the galaxy are being stripped out and heated up as the galaxy passes through a region of denser intergalactic gas. This image was taken in infrared light by Hubble’s Wide Field Camera 3, and combined with near-infrared observations from Hubble’s Advanced Camera for Surveys. The image is based in part on data spotted by Nick Rose in the Hubble’s Hidden Treasures image processing competition. The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center in Greenbelt, Md., manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore, Md., conducts Hubble science operations. STScI is operated by the Association of Universities for Research in Astronomy, Inc., in Washington. Credit: NASA and ESA Acknowledgement: N. Rose For image files and more information about Abell 68, visit: hubblesite.org/news/2013/09 www.spacetelescope.org/news/heic04 heritage.stsci.edu/2013/09 www.spacetelescope.org/projects/hiddentreasures/
NASA Astrophysics Data System (ADS)
Robbins, Woodrow E.
1988-01-01
The present conference discusses topics in novel technologies and techniques of three-dimensional imaging, human factors-related issues in three-dimensional display system design, three-dimensional imaging applications, and image processing for remote sensing. Attention is given to a 19-inch parallactiscope, a chromostereoscopic CRT-based display, the 'SpaceGraph' true three-dimensional peripheral, advantages of three-dimensional displays, holographic stereograms generated with a liquid crystal spatial light modulator, algorithms and display techniques for four-dimensional Cartesian graphics, an image processing system for automatic retina diagnosis, the automatic frequency control of a pulsed CO2 laser, and a three-dimensional display of magnetic resonance imaging of the spine.
OmniBird: a miniature PTZ NIR sensor system for UCAV day/night autonomous operations
NASA Astrophysics Data System (ADS)
Yi, Steven; Li, Hui
2007-04-01
Through SBIR funding from NAVAIR, we have successfully developed an innovative, miniaturized, and lightweight PTZ UCAV imager called OmniBird for UCAV taxiing. The proposed OmniBird fits in a small space. The designed zoom capability allows it to acquire focused images of targets ranging from 10 to 250 feet. The innovative panning mechanism also gives the system a field of view of +/- 100 degrees within the limited space provided (6 cubic inches). The integrated optics, camera sensor, and mechanics solution allows the OmniBird to stay optically aligned and shock-proof in harsh environments.
2009-09-24
ISS020-E-041981 (24 Sept. 2009) --- The exterior of the Japanese Kibo complex of the International Space Station and the station's Canadarm2 (bottom) are featured in this image photographed by an Expedition 20 crew member on the station. European Space Agency astronaut Frank De Winne and NASA astronaut Nicole Stott, both Expedition 20 flight engineers, used the controls of the Japanese Experiment Module Robotic Manipulator System (JEM-RMS) in Kibo to grapple and transfer two Japanese payloads from the Exposed Pallet to their Exposed Facility locations -- first HICO/Hyperspectral Imager for the Coastal Ocean & RAIDS/Remote Atmospheric and Ionospheric Detection System (HREP), then Superconducting Submillimeter-wave Limb-emission Sounder (SMILES).
Son, Jung-Young; Saveljev, Vladmir V; Kim, Jae-Soon; Kim, Sung-Sik; Javidi, Bahram
2004-09-10
The viewing zone of autostereoscopic imaging systems that use lenticular, parallax-barrier, and microlens-array plates as the viewing-zone-forming optics is analyzed in order to verify the image-quality differences between different locations of the zone. The viewing zone consists of many subzones. The images seen at most of these subzones are composed of at least one image strip selected from the total number of different view images displayed. These different view images are not mixed but patched to form a complete image. This image patching deteriorates the quality of the image seen at different subzones. We attempt to quantify the quality of the image seen at these viewing subzones by taking the inverse of the number of different view images patched together at different subzones. Although the combined viewing zone can be extended to almost all of the front space of the imaging system, in reality it is limited mainly by the image quality.
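The quality metric described above (the inverse of the number of different view images patched together at a subzone) is simple enough to state directly; the small sketch below is my own illustration of it, not the authors' code.

```python
def subzone_quality(num_patched_views):
    """Image-quality proxy for a viewing subzone: the inverse of the number of
    different view images patched together there (1.0 = a single, unmixed view)."""
    if num_patched_views < 1:
        raise ValueError("a subzone shows at least one view image")
    return 1.0 / num_patched_views

# e.g. a subzone stitched from strips of 3 different view images
print(subzone_quality(3))  # ~0.33
```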
Note: An improved 3D imaging system for electron-electron coincidence measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Yun Fei; Lee, Suk Kyoung; Adhikari, Pradip
We demonstrate an improved imaging system that can achieve highly efficient 3D detection of two electrons in coincidence. The imaging system is based on a fast frame complementary metal-oxide semiconductor camera and a high-speed waveform digitizer. We have shown previously that this detection system is capable of 3D detection of ions and electrons with good temporal and spatial resolution. Here, we show that with a new timing analysis algorithm, this system can achieve an unprecedented dead-time (<0.7 ns) and dead-space (<1 mm) when detecting two electrons. A true zero dead-time detection is also demonstrated.
Note: An improved 3D imaging system for electron-electron coincidence measurements
NASA Astrophysics Data System (ADS)
Lin, Yun Fei; Lee, Suk Kyoung; Adhikari, Pradip; Herath, Thushani; Lingenfelter, Steven; Winney, Alexander H.; Li, Wen
2015-09-01
We demonstrate an improved imaging system that can achieve highly efficient 3D detection of two electrons in coincidence. The imaging system is based on a fast frame complementary metal-oxide semiconductor camera and a high-speed waveform digitizer. We have shown previously that this detection system is capable of 3D detection of ions and electrons with good temporal and spatial resolution. Here, we show that with a new timing analysis algorithm, this system can achieve an unprecedented dead-time (<0.7 ns) and dead-space (<1 mm) when detecting two electrons. A true zero dead-time detection is also demonstrated.
NASA Technical Reports Server (NTRS)
1995-01-01
NASA's Technology Transfer Office at Stennis Space Center worked with the Johns Hopkins Wilmer Eye Institute in Baltimore, Md., to incorporate software originally developed by NASA to process satellite images into the Low Vision Enhancement System (LVES). The LVES, referred to as 'ELVIS' by its users, is a portable image processing system that could make it possible to improve a person's vision by enhancing and altering images to compensate for impaired eyesight. The system consists of two orientation cameras, a zoom camera, and a video projection system. The headset and hand-held control weigh about two pounds each. Pictured is Jacob Webb, the first Mississippian to use the LVES.
Automated Formosat Image Processing System for Rapid Response to International Disasters
NASA Astrophysics Data System (ADS)
Cheng, M. C.; Chou, S. C.; Chen, Y. C.; Chen, B.; Liu, C.; Yu, S. J.
2016-06-01
FORMOSAT-2, Taiwan's first remote sensing satellite, was successfully launched in May of 2004 into a Sun-synchronous orbit at 891 kilometers of altitude. With its daily revisit feature, the 2-m panchromatic and 8-m multi-spectral resolution images captured have been used for research and operations in various societal benefit areas. This paper details the orchestration of various tasks conducted by different institutions in Taiwan in the effort to respond to international disasters. The institutes involved include Taiwan's space agency, the National Space Organization (NSPO); the Center for Satellite Remote Sensing Research of National Central University; the GIS Center of Feng-Chia University; and the National Center for High-performance Computing. Since each institution has its own mandate, the coordinated tasks range from receiving emergency observation requests, scheduling and tasking of satellite operations, and downlink to ground stations, to image processing (including data injection and ortho-rectification) and delivery of image products. With the lessons learned from working with international partners, the FORMOSAT Image Processing System has been extensively automated and streamlined with the goal of shortening the time between request and delivery in an efficient manner. The integrated team has developed an Application Interface to its system platform that provides functions for searching the archive catalogue, requesting data services, mission planning, inquiring about service status, and downloading images. This automated system enables timely image acquisition and substantially increases the value of the data product. An example outcome of these efforts, from a recent response supporting Sentinel Asia during the Nepal earthquake, is demonstrated herein.
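As a rough illustration of the kind of Application Interface functions listed above (catalogue search, service request, status inquiry, product download), here is a hedged sketch of a client. The base URL, endpoint paths, and field names are entirely hypothetical; only the Python `requests` calls are real library API.

```python
import requests

BASE = "https://example.invalid/formosat-api"   # hypothetical endpoint

def search_archive(bbox, start, end):
    """Search the archive catalogue for scenes covering a bounding box and date range."""
    r = requests.get(f"{BASE}/catalogue", params={"bbox": bbox, "start": start, "end": end})
    r.raise_for_status()
    return r.json()

def request_acquisition(aoi_geojson, priority="disaster"):
    """Submit an emergency observation request for mission planning."""
    r = requests.post(f"{BASE}/requests", json={"aoi": aoi_geojson, "priority": priority})
    r.raise_for_status()
    return r.json()["request_id"]

def poll_status(request_id):
    """Inquire about the processing status of a service request."""
    return requests.get(f"{BASE}/requests/{request_id}").json()["status"]

def download_product(product_id, path):
    """Download an ortho-rectified image product once processing is complete."""
    with requests.get(f"{BASE}/products/{product_id}", stream=True) as r:
        r.raise_for_status()
        with open(path, "wb") as f:
            for chunk in r.iter_content(chunk_size=1 << 20):
                f.write(chunk)
```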
Adaptive Optics For Imaging Bright Objects Next To Dim Ones
NASA Technical Reports Server (NTRS)
Shao, Michael; Yu, Jeffrey W.; Malbet, Fabien
1996-01-01
Adaptive optics used in imaging optical systems, according to proposal, to enhance high-dynamic-range images (images of bright objects next to dim objects). Designed to alter wavefronts to correct for effects of scattering of light from small bumps on imaging optics. Original intended application of concept in advanced camera installed on Hubble Space Telescope for imaging of such phenomena as large planets near stars other than Sun. Also applicable to other high-quality telescopes and cameras.
Vision requirements for Space Station applications
NASA Technical Reports Server (NTRS)
Crouse, K. R.
1985-01-01
Problems that will be encountered by computer vision systems in Space Station operations are discussed, along with solutions being examined at Johnson Space Center. Lighting cannot be controlled in space, nor can the random presence of reflective surfaces. Task-oriented capabilities are to include docking to moving objects, identification of unexpected objects during autonomous flights to different orbits, and diagnosis of damage and repair requirements by autonomous Space Station inspection robots. The approaches being examined to provide these and other capabilities are television and IR sensors, advanced pattern recognition programs feeding on data from laser probes, laser radar for robot eyesight, and arrays of SMART sensors for automated location and tracking of target objects. Attention is also being given to liquid crystal light valves for optical processing of images for comparison with on-board electronic libraries of images.
NASA Astrophysics Data System (ADS)
Li, Xiaoliang; Luo, Lei; Li, Pengwei; Yu, Qingkui
2018-03-01
The image sensor in a satellite optical communication system may generate noise due to space irradiation damage, leading to deviations in the determination of the light spot centroid. Based on irradiation test data for CMOS devices, simulated defect spots of different sizes have been used to calculate the centroid deviation value with a grey-level centroid algorithm. The impact on the tracking and pointing accuracy of the system has been analyzed. The results show that both the number and the position of irradiation-induced defect pixels contribute to spot centroid deviation, and that larger spots show less deviation. Finally, considering space radiation damage, suggestions are made regarding constraints on spot size selection.
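A minimal sketch of the grey-level centroid calculation and of how a single defect (hot) pixel shifts it; the Gaussian spot, defect location, and amplitude are illustrative assumptions, not the paper's test data.

```python
import numpy as np

def grey_level_centroid(spot):
    """Grey-level (intensity-weighted) centroid of a spot image, returned as (x, y)."""
    total = spot.sum()
    ys, xs = np.indices(spot.shape)
    return (xs * spot).sum() / total, (ys * spot).sum() / total

# toy experiment: compare the centroid before and after injecting a hot defect pixel
yy, xx = np.mgrid[0:31, 0:31]
spot = np.exp(-((xx - 15.0)**2 + (yy - 15.0)**2) / (2 * 4.0**2))
clean = grey_level_centroid(spot)
spot[5, 25] += 2.0          # irradiation-induced hot pixel (illustrative value)
damaged = grey_level_centroid(spot)
print("centroid deviation:", np.hypot(damaged[0] - clean[0], damaged[1] - clean[1]))
```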
Compact Microscope Imaging System With Intelligent Controls Improved
NASA Technical Reports Server (NTRS)
McDowell, Mark
2004-01-01
The Compact Microscope Imaging System (CMIS) with intelligent controls is a diagnostic microscope analysis tool with intelligent controls for use in space, industrial, medical, and security applications. This compact miniature microscope, which can perform tasks usually reserved for conventional microscopes, has unique advantages in the fields of microscopy, biomedical research, inline process inspection, and space science. Its unique approach integrates a machine vision technique with an instrumentation and control technique that provides intelligence via the use of adaptive neural networks. The CMIS system was developed at the NASA Glenn Research Center specifically for interface detection used for colloid hard spheres experiments; biological cell detection for patch clamping, cell movement, and tracking; and detection of anode and cathode defects for laboratory samples using microscope technology.
Optical ranked-order filtering using threshold decomposition
Allebach, Jan P.; Ochoa, Ellen; Sweeney, Donald W.
1990-01-01
A hybrid optical/electronic system performs median filtering and related ranked-order operations using threshold decomposition to encode the image. Threshold decomposition transforms the nonlinear neighborhood ranking operation into a linear space-invariant filtering step followed by a point-to-point threshold comparison step. Spatial multiplexing allows parallel processing of all the threshold components as well as recombination by a second linear, space-invariant filtering step. An incoherent optical correlation system performs the linear filtering, using a magneto-optic spatial light modulator as the input device and a computer-generated hologram in the filter plane. Thresholding is done electronically. By adjusting the value of the threshold, the same architecture is used to perform median, minimum, and maximum filtering of images. A totally optical system is also disclosed.
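The threshold-decomposition principle in this abstract can be mimicked digitally with a short sketch (my own, standing in for the optical correlator): each binary threshold component is passed through a linear, space-invariant box-sum filter, compared point-by-point against half the window population, and the comparison results are summed over all thresholds to reconstruct the median. Changing the rank threshold to 1 or to the full window size gives maximum or minimum filtering with the same architecture.

```python
import numpy as np
from scipy.ndimage import convolve

def median_by_threshold_decomposition(img, size=3, levels=256):
    """Median filter of an integer-valued image via threshold decomposition.

    For each grey level t, the binary component (img >= t) is filtered by a
    linear space-invariant box sum, then thresholded against half the window
    population; summing the results over t reconstructs the median image.
    """
    kernel = np.ones((size, size))
    half = (size * size + 1) // 2          # rank for the median; use 1 (max) or size*size (min)
    out = np.zeros(img.shape, dtype=np.int32)
    for t in range(1, levels):
        component = (img >= t).astype(np.float64)             # threshold decomposition
        counts = convolve(component, kernel, mode="nearest")  # linear SI filtering
        out += (counts >= half - 0.5).astype(np.int32)        # point-wise threshold compare
    return out
```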
Mercury Transit (Composite Image)
2017-12-08
On May 9, 2016, Mercury passed directly between the sun and Earth. This event – which happens about 13 times each century – is called a transit. NASA’s Solar Dynamics Observatory, or SDO, studies the sun 24/7 and captured the entire seven-and-a-half-hour event. This composite image of Mercury’s journey across the sun was created with visible-light images from the Helioseismic and Magnetic Imager on SDO. Image Credit: NASA's Goddard Space Flight Center/SDO/Genna Duberstein
Wave analysis of a plenoptic system and its applications
NASA Astrophysics Data System (ADS)
Shroff, Sapna A.; Berkner, Kathrin
2013-03-01
Traditional imaging systems directly image a 2D object plane on to the sensor. Plenoptic imaging systems contain a lenslet array at the conventional image plane and a sensor at the back focal plane of the lenslet array. In this configuration the data captured at the sensor is not a direct image of the object. Each lenslet effectively images the aperture of the main imaging lens at the sensor. Therefore the sensor data retains angular light-field information which can be used for a posteriori digital computation of multi-angle images and axially refocused images. If a filter array, containing spectral filters or neutral density or polarization filters, is placed at the pupil aperture of the main imaging lens, then each lenslet images the filters on to the sensor. This enables the digital separation of multiple filter modalities giving single snapshot, multi-modal images. Due to the diversity of potential applications of plenoptic systems, their investigation is increasing. As the application space moves towards microscopes and other complex systems, and as pixel sizes become smaller, the consideration of diffraction effects in these systems becomes increasingly important. We discuss a plenoptic system and its wave propagation analysis for both coherent and incoherent imaging. We simulate a system response using our analysis and discuss various applications of the system response pertaining to plenoptic system design, implementation and calibration.
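The geometric part of the description above (each lenslet images the main-lens aperture, so the same pixel offset under every lenslet corresponds to one viewing angle) can be sketched with a simple rearrangement of the raw sensor data. This is a generic sub-aperture extraction under an idealized, integer lenslet pitch; it is not the authors' wave-optics analysis, and the sensor size and pitch below are placeholders.

```python
import numpy as np

def subaperture_views(raw, lenslet_px):
    """Rearrange a plenoptic raw image into an array of sub-aperture views.

    `raw` contains one lenslet image every `lenslet_px` pixels in each direction;
    taking the same pixel offset (u, v) under every lenslet yields one view of
    the scene as seen through one part of the main-lens aperture.
    """
    H, W = raw.shape
    ny, nx = H // lenslet_px, W // lenslet_px
    raw = raw[:ny * lenslet_px, :nx * lenslet_px]
    # reshape to (ny, u, nx, v), then reorder so views are indexed by (u, v)
    blocks = raw.reshape(ny, lenslet_px, nx, lenslet_px)
    return blocks.transpose(1, 3, 0, 2)   # shape (u, v, ny, nx)

raw = np.random.rand(480, 640)            # stand-in sensor data, 8-px lenslet pitch
views = subaperture_views(raw, 8)
center_view = views[4, 4]                 # on-axis view; other indices give other angles
```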
Widefield TSCSPC-systems with large-area-detectors: application in simultaneous multi-channel-FLIM
NASA Astrophysics Data System (ADS)
Stepanov, Sergei; Bakhlanov, Sergei; Drobchenko, Evgeny; Eckert, Hann-Jörg; Kemnitz, Klaus
2010-11-01
Novel proximity-type Time- and Space-Correlated Single Photon Counting (TSCSPC) crossed-delay-line (DL)- and multi-anode (MA)-systems of outstanding performance and homogeneity were developed, using large-area detector heads of 25 and 40 mm diameter. Instrument response functions IRF(space) = (60 +/- 5) μm FWHM and IRF(time) = (28 +/- 3) ps FWHM were achieved over the full 12 cm2 area of the detector. Deadtime at a throughput of 10^5 cps is 10% for the "high-resolution" system and 5% in the "video" system at 10^6 cps, at slightly reduced time- and space resolution. A fluorescence lifetime of (3.5 +/- 1) ps can be recovered from multi-exponential dynamics of a single living cyanobacterium (Acaryochloris marina). The present large-area detectors are particularly useful in simultaneous multichannel applications, such as 2-colour anisotropy or 4-colour lifetime imaging, utilizing dual- or quad-view image splitters. The long-term stability, low-excitation-intensity (<100 mW/cm2) widefield systems enable minimal-invasive observation, without significant bleaching or photodynamic reactions, thus allowing long-period observation of up to several hours in living cells.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. In the Orbiter Processing Facility, from left, United Space Alliance workers Loyd Turner, Craig Meyer and Erik Visser prepare to conduct a fit check of an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle's Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
NASA Technical Reports Server (NTRS)
2004-01-01
KENNEDY SPACE CENTER, FLA. In the Orbiter Processing Facility, from left, United Space Alliance workers Loyd Turner, Craig Meyer and Erik Visser conduct a fit check of an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle's Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
2004-09-17
KENNEDY SPACE CENTER, FLA. - In the Orbiter Processing Facility, from left, United Space Alliance workers Loyd Turner, Craig Meyer and Erik Visser conduct a fit check of an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle’s Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
2004-09-17
KENNEDY SPACE CENTER, FLA. - In the Orbiter Processing Facility, from left, United Space Alliance workers Loyd Turner, Craig Meyer and Erik Visser prepare to conduct a fit check of an External Tank (ET) digital still camera in the right-hand liquid oxygen umbilical well on Space Shuttle Atlantis. NASA is pursuing use of the camera, beginning with the Shuttle’s Return To Flight, to obtain and downlink high-resolution images of the ET following separation of the ET from the orbiter after launch. The Kodak camera will record 24 images, at one frame per 1.5 seconds, on a flash memory card. After orbital insertion, the crew will transfer the images from the memory card to a laptop computer. The files will then be downloaded through the Ku-band system to the Mission Control Center in Houston for analysis.
Image Processor Electronics (IPE): The High-Performance Computing System for NASA SWIFT Mission
NASA Technical Reports Server (NTRS)
Nguyen, Quang H.; Settles, Beverly A.
2003-01-01
Gamma Ray Bursts (GRBs) are believed to be the most powerful explosions that have occurred in the Universe since the Big Bang and are a mystery to the scientific community. Swift, a NASA mission that includes international participation, was designed and built in preparation for a 2003 launch to help to determine the origin of Gamma Ray Bursts. Locating the position in the sky where a burst originates requires intensive computing, because the duration of a GRB can range between a few milliseconds up to approximately a minute. The instrument data system must constantly accept multiple images representing large regions of the sky that are generated by sixteen gamma ray detectors operating in parallel. It then must process the received images very quickly in order to determine the existence of possible gamma ray bursts and their locations. The high-performance instrument data computing system that accomplishes this is called the Image Processor Electronics (IPE). The IPE was designed, built and tested by NASA Goddard Space Flight Center (GSFC) in order to meet these challenging requirements. The IPE is a small size, low power and high performing computing system for space applications. This paper addresses the system implementation and the system hardware architecture of the IPE. The paper concludes with the IPE system performance that was measured during end-to-end system testing.
Compact Microscope Imaging System Developed
NASA Technical Reports Server (NTRS)
McDowell, Mark
2001-01-01
The Compact Microscope Imaging System (CMIS) is a diagnostic tool with intelligent controls for use in space, industrial, medical, and security applications. The CMIS can be used in situ with a minimum amount of user intervention. This system, which was developed at the NASA Glenn Research Center, can scan, find areas of interest, focus, and acquire images automatically. Large numbers of multiple cell experiments require microscopy for in situ observations; this is only feasible with compact microscope systems. CMIS is a miniature machine vision system that combines intelligent image processing with remote control capabilities. The software also has a user-friendly interface that can be used independently of the hardware for post-experiment analysis. CMIS has potential commercial uses in the automated online inspection of precision parts, medical imaging, security industry (examination of currency in automated teller machines and fingerprint identification in secure entry locks), environmental industry (automated examination of soil/water samples), biomedical field (automated blood/cell analysis), and microscopy community. CMIS will improve research in several ways: It will expand the capabilities of MSD experiments utilizing microscope technology. It may be used in lunar and Martian experiments (Rover Robot). Because of its reduced size, it will enable experiments that were not feasible previously. It may be incorporated into existing shuttle orbiter and space station experiments, including glove-box-sized experiments as well as ground-based experiments.
Comparison of ISS Power System Telemetry with Analytically Derived Data for Shadowed Cases
NASA Technical Reports Server (NTRS)
Fincannon, H. James
2002-01-01
Accurate International Space Station (ISS) power prediction requires the quantification of solar array shadowing. Prior papers have discussed the NASA Glenn Research Center (GRC) ISS power system tool SPACE (System Power Analysis for Capability Evaluation) and its integrated shadowing algorithms. On-orbit telemetry has become available that permits the correlation of theoretical shadowing predictions with actual data. This paper documents the comparison of a shadowing metric (total solar array current) as derived from SPACE predictions and on-orbit flight telemetry data for representative significant shadowing cases. Images from flight video recordings and the SPACE computer program graphical output are used to illustrate the comparison. The accuracy of the SPACE shadowing capability is demonstrated for the cases examined.
O-space with high resolution readouts outperforms radial imaging.
Wang, Haifeng; Tam, Leo; Kopanoglu, Emre; Peters, Dana C; Constable, R Todd; Galiana, Gigi
2017-04-01
While O-Space imaging is well known to accelerate image acquisition beyond traditional Cartesian sampling, its advantages compared to undersampled radial imaging, the linear trajectory most akin to O-Space imaging, have not been detailed. In addition, previous studies have focused on ultrafast imaging with very high acceleration factors and relatively low resolution. The purpose of this work is to directly compare O-Space and radial imaging in their potential to deliver highly undersampled images of high resolution and minimal artifacts, as needed for diagnostic applications. We report that the greatest advantages to O-Space imaging are observed with extended data acquisition readouts. A sampling strategy that uses high resolution readouts is presented and applied to compare the potential of radial and O-Space sequences to generate high resolution images at high undersampling factors. Simulations and phantom studies were performed to investigate whether use of extended readout windows in O-Space imaging would increase k-space sampling and improve image quality, compared to radial imaging. Experimental O-Space images acquired with high resolution readouts show fewer artifacts and greater sharpness than radial imaging with equivalent scan parameters. Radial images taken with longer readouts show stronger undersampling artifacts, which can cause small or subtle image features to disappear. These features are preserved in a comparable O-Space image. High resolution O-Space imaging yields highly undersampled images of high resolution and minimal artifacts. The additional nonlinear gradient field improves image quality beyond conventional radial imaging. Copyright © 2016 Elsevier Inc. All rights reserved.
ARIES: Enabling Visual Exploration and Organization of Art Image Collections.
Crissaff, Lhaylla; Wood Ruby, Louisa; Deutch, Samantha; DuBois, R Luke; Fekete, Jean-Daniel; Freire, Juliana; Silva, Claudio
2018-01-01
Art historians have traditionally used physical light boxes to prepare exhibits or curate collections. On a light box, they can place slides or printed images, move the images around at will, group them as desired, and visually compare them. The transition to digital images has rendered this workflow obsolete. Now, art historians lack well-designed, unified interactive software tools that effectively support the operations they perform with physical light boxes. To address this problem, we designed ARIES (ARt Image Exploration Space), an interactive image manipulation system that enables the exploration and organization of fine digital art. The system allows images to be compared in multiple ways, offering dynamic overlays analogous to a physical light box, and supporting advanced image comparisons and feature-matching functions, available through computational image processing. We demonstrate the effectiveness of our system in supporting art historians' tasks through real use cases.
NASA Technical Reports Server (NTRS)
Hickey, J. S.
1983-01-01
The Mesoscale Analysis and Space Sensor (MASS) Data Management and Analysis System developed by Atsuko Computing International (ACI) on the MASS HP-1000 Computer System within the Systems Dynamics Laboratory of the Marshall Space Flight Center is described. The MASS Data Management and Analysis System was successfully implemented and is utilized daily by atmospheric scientists to graphically display and analyze large volumes of conventional and satellite-derived meteorological data. The scientists can interactively process various atmospheric data types (Sounding, Single Level, Grid, and Image) using the MASS (AVE80) software, which shares common data and user inputs, thereby reducing overhead, optimizing execution time, and enhancing user flexibility, usability, and understandability of the total system/software capabilities. In addition, ACI installed eight APPLE III graphics/imaging computer terminals in individual scientists' offices and integrated them into the MASS HP-1000 Computer System, providing a significant enhancement to the overall research environment.
Hennessy, Rosanna C; Stougaard, Peter; Olsson, Stefan
2017-03-21
Here, we report the development of a microplate reader-based system for visualizing gene expression dynamics in living bacterial cells in response to a fungus, in space and in real time. A bacterium expressing the red fluorescent protein mCherry, fused to the promoter region of the regulator gene nunF and indicating activation of an antifungal secondary metabolite gene cluster, was used as a reporter system. Time-lapse image recordings of the reporter red signal and a green signal from fluorescent metabolites, combined with microbial growth measurements, showed that nunF-regulated gene transcription is switched on when the bacterium enters the deceleration growth phase and upon physical encounter with fungal hyphae. This novel technique enables real-time live imaging of samples through time-series, multi-channel automatic recordings, using a microplate reader as both incubator and image recorder, and is of general use to researchers. The technique can aid in deciding when to sample destructively for other methods, e.g., transcriptomics and mass spectrometry imaging, to study gene expression and the metabolites exchanged during the interaction.
NASA Technical Reports Server (NTRS)
Ragusa, James M.; Orwig, Gary; Gilliam, Michael; Blacklock, David; Shaykhian, Ali
1994-01-01
Status is given of an applications investigation on the potential for using an expert system shell for classification and retrieval of high resolution, digital, color space shuttle closeout photography. This NASA funded activity has focused on the use of integrated information technologies to intelligently classify and retrieve still imagery from a large, electronically stored collection. A space shuttle processing problem is identified, a working prototype system is described, and commercial applications are identified. A conclusion reached is that the developed system has distinct advantages over the present manual system and cost efficiencies will result as the system is implemented. Further, commercial potential exists for this integrated technology.
Advances in Sensors and Their Integration into Aircraft Guidance and Control Systems,
1983-06-01
this function taking account of the limitations of the existing aircraft systems such as: (a) cockpit space, (b) use of existing controls, particularly ... electrostatically focused under the influence of high potentials to form an electron image on a thin silicon wafer target upon which a very tightly spaced ... matrix of p-n junctions has been formed. The spacing of the diodes is of the order of n m. A gain mechanism results because the photoelectrons
Color Retinal Image Enhancement Based on Luminosity and Contrast Adjustment.
Zhou, Mei; Jin, Kai; Wang, Shaoze; Ye, Juan; Qian, Dahong
2018-03-01
Many common eye diseases and cardiovascular diseases can be diagnosed through retinal imaging. However, due to uneven illumination, image blurring, and low contrast, retinal images with poor quality are not useful for diagnosis, especially in automated image analyzing systems. Here, we propose a new image enhancement method to improve color retinal image luminosity and contrast. A luminance gain matrix, which is obtained by gamma correction of the value channel in the HSV (hue, saturation, and value) color space, is used to enhance the R, G, and B (red, green and blue) channels, respectively. Contrast is then enhanced in the luminosity channel of L * a * b * color space by CLAHE (contrast-limited adaptive histogram equalization). Image enhancement by the proposed method is compared to other methods by evaluating quality scores of the enhanced images. The performance of the method is mainly validated on a dataset of 961 poor-quality retinal images. Quality assessment (range 0-1) of image enhancement of this poor dataset indicated that our method improved color retinal image quality from an average of 0.0404 (standard deviation 0.0291) up to an average of 0.4565 (standard deviation 0.1000). The proposed method is shown to achieve superior image enhancement compared to contrast enhancement in other color spaces or by other related methods, while simultaneously preserving image naturalness. This method of color retinal image enhancement may be employed to assist ophthalmologists in more efficient screening of retinal diseases and in development of improved automated image analysis for clinical diagnosis.
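A hedged OpenCV sketch of the two steps described above (a gamma-derived luminance gain applied per channel, followed by CLAHE on the L channel in L*a*b* space). The gamma value, clip limit, and tile size are illustrative placeholders, not the parameters reported in the paper.

```python
import cv2
import numpy as np

def enhance_retinal_image(bgr, gamma=0.8, clip=2.0, tiles=(8, 8)):
    """Luminosity + contrast enhancement sketch for a colour retinal image (uint8 BGR).

    Step 1: gamma-correct the HSV value channel and use its ratio to the original
    value channel as a per-pixel luminance gain on the B, G, R channels.
    Step 2: apply CLAHE to the L channel in L*a*b* space.
    """
    img = bgr.astype(np.float32) / 255.0
    v = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)[:, :, 2].astype(np.float32) / 255.0
    gain = np.power(np.clip(v, 1e-6, 1.0), gamma) / np.clip(v, 1e-6, 1.0)
    balanced = np.clip(img * gain[..., None], 0.0, 1.0)

    lab = cv2.cvtColor((balanced * 255).astype(np.uint8), cv2.COLOR_BGR2LAB)
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=tiles)
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
```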
NASA Astrophysics Data System (ADS)
Goldstein, N.; Dressler, R. A.; Richtsmeier, S. S.; McLean, J.; Dao, P. D.; Murray-Krezan, J.; Fulcoly, D. O.
2013-09-01
Recent ground testing of a wide area camera system and automated star removal algorithms has demonstrated the potential to detect, quantify, and track deep space objects using small aperture cameras and on-board processors. The camera system, which was originally developed for a space-based Wide Area Space Surveillance System (WASSS), operates in a fixed-stare mode, continuously monitoring a wide swath of space and differentiating celestial objects from satellites based on differential motion across the field of view. It would have the greatest utility in a LEO orbit to provide automated and continuous monitoring of deep space with high refresh rates, and with particular emphasis on the GEO belt and GEO transfer space. Continuous monitoring allows a concept of change detection and custody maintenance not possible with existing sensors. The detection approach is equally applicable to Earth-based sensor systems. A distributed system of such sensors, either Earth-based or space-based, could provide automated, persistent night-time monitoring of all of deep space. The continuous monitoring provides a daily record of the light curves of all GEO objects above a certain brightness within the field of view. The daily updates of satellite light curves offer a means to identify specific satellites, to note changes in orientation and operational mode, and to cue other SSA assets for higher resolution queries. The data processing approach may also be applied to larger-aperture, higher resolution camera systems to extend the sensitivity towards dimmer objects. In order to demonstrate the utility of the WASSS system and data processing, a ground-based field test was conducted in October 2012. We report here the results of the observations made at Magdalena Ridge Observatory using the prototype WASSS camera, which has a 4×60° field of view, <0.05° resolution, a 2.8 cm2 aperture, and the ability to view within 4° of the sun. A single camera pointed at the GEO belt provided a continuous night-long record of the intensity and location of more than 50 GEO objects detected within the camera's 60-degree field of view, with a detection sensitivity similar to the camera's shot noise limit of Mv=13.7. Performance is anticipated to scale with aperture area, allowing the detection of dimmer objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and an image processing algorithm that exploits the different angular velocities of celestial objects and SOs. Principal Components Analysis (PCA) is used to filter out all objects moving with the velocity of the celestial frame of reference. The resulting filtered images are projected back into an Earth-centered frame of reference, or into any other relevant frame of reference, and co-added to form a series of images of the GEO objects as a function of time. The PCA approach not only removes the celestial background, but it also removes systematic variations in system calibration, sensor pointing, and atmospheric conditions. The resulting images are shot-noise limited, and can be exploited to automatically identify deep space objects, produce approximate state vectors, and track their locations and intensities as a function of time.
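As a rough illustration of the PCA background-removal idea (not the authors' pipeline, which also handles frame registration and re-projection), the sketch below subtracts the leading principal components of a registered frame stack; those components capture the star field and slow systematic variations, leaving objects with different apparent motion in the residual. The component count and frame stack are assumptions.

```python
import numpy as np

def pca_background_filter(frames, n_components=3):
    """Remove the quasi-static / celestial background from a frame stack via PCA.

    `frames` has shape (n_frames, H, W), assumed registered to the celestial frame.
    The leading principal components capture the star field and slow systematic
    variations; subtracting their projection leaves residual images dominated by
    objects that move differently, e.g. GEO satellites. Requires
    n_frames > n_components.
    """
    n, H, W = frames.shape
    X = frames.reshape(n, -1)
    Xc = X - X.mean(axis=0)
    # right singular vectors = principal components in pixel space
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components]
    residual = Xc - (Xc @ V.T) @ V
    return residual.reshape(n, H, W)
```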
Recognition of blurred images by the method of moments.
Flusser, J; Suk, T; Saic, S
1996-01-01
The article is devoted to the feature-based recognition of blurred images acquired by a linear shift-invariant imaging system against an image database. The proposed approach consists of describing images by features that are invariant with respect to blur and recognizing images in the feature space. The PSF identification and image restoration are not required. A set of symmetric blur invariants based on image moments is introduced. A numerical experiment is presented to illustrate the utilization of the invariants for blurred image recognition. Robustness of the features is also briefly discussed.
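A hedged sketch of the moment machinery underlying such features: central moments of the image, with the normalized third-order moments used as simple blur-insensitive quantities for a centrally symmetric PSF. The full invariant sets proposed in the paper are more elaborate; this is only an illustration of the building blocks.

```python
import numpy as np

def central_moment(img, p, q):
    """Central moment mu_pq of a grey-level image treated as a 2-D density."""
    ys, xs = np.indices(img.shape, dtype=np.float64)
    m00 = img.sum()
    xbar = (xs * img).sum() / m00
    ybar = (ys * img).sum() / m00
    return (((xs - xbar) ** p) * ((ys - ybar) ** q) * img).sum()

def simple_blur_invariants(img):
    """Third-order central moments normalized by mu00.

    For blur by a centrally symmetric PSF these quantities stay (approximately)
    unchanged, which makes them usable as blur-invariant features.
    """
    m00 = img.sum()
    return np.array([central_moment(img, p, q) / m00
                     for (p, q) in [(3, 0), (2, 1), (1, 2), (0, 3)]])
```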
NASA Astrophysics Data System (ADS)
Petrochenko, Andrey; Konyakhin, Igor
2017-06-01
In connection with the development of robotics, a variety of systems for three-dimensional reconstruction and mapping from image sets received from optical sensors have become increasingly popular. The main objective of technical and robot vision is the detection, tracking, and classification of objects in the space in which these systems and robots operate [15,16,18]. Two-dimensional images sometimes do not contain sufficient information to address particular problems: constructing a map of the surrounding area for route planning; identifying objects and tracking their relative position and movement; or selecting objects and their attributes to complement the knowledge base. Three-dimensional reconstruction of the surrounding space allows information to be obtained on the relative positions of objects, their shape, and their surface texture. Systems that learn on the basis of three-dimensional reconstruction can, from the results of comparison, produce two-dimensional images of the three-dimensional model, which allows volumetric objects to be recognized in flat images. The problem of the relative orientation of industrial robots with the ability to build three-dimensional scenes of controlled surfaces is becoming topical nowadays.
Looking at Earth from Space: Teacher's Guide with Activities for Earth and Space Science
NASA Technical Reports Server (NTRS)
Steele, Colleen (Editor); Steele, Colleen; Ryan, William F.
1995-01-01
The Maryland Pilot Earth Science and Technology Education Network (MAPS-NET) project was sponsored by the National Aeronautics and Space Administration (NASA) to enrich teacher preparation and classroom learning in the area of Earth system science. This publication includes a teacher's guide that replicates material taught during a graduate-level course of the project and activities developed by the teachers. The publication was developed to provide teachers with a comprehensive approach to using satellite imagery to enhance science education. The teacher's guide is divided into topical chapters and enables teachers to expand their knowledge of the atmosphere, common weather patterns, and remote sensing. Topics include: weather systems and satellite imagery including mid-latitude weather systems; wave motion and the general circulation; cyclonic disturbances and baroclinic instability; clouds; additional common weather patterns; satellite images and the internet; environmental satellites; orbits; and ground station set-up. Activities are listed by suggested grade level and include the following topics: using weather symbols; forecasting the weather; cloud families and identification; classification of cloud types through infrared Automatic Picture Transmission (APT) imagery; comparison of visible and infrared imagery; cold fronts; to ski or not to ski (imagery as a decision making tool), infrared and visible satellite images; thunderstorms; looping satellite images; hurricanes; intertropical convergence zone; and using weather satellite images to enhance a study of the Chesapeake Bay. A list of resources is also included.
Marginal Space Deep Learning: Efficient Architecture for Volumetric Image Parsing.
Ghesu, Florin C; Krubasik, Edward; Georgescu, Bogdan; Singh, Vivek; Yefeng Zheng; Hornegger, Joachim; Comaniciu, Dorin
2016-05-01
Robust and fast solutions for anatomical object detection and segmentation support the entire clinical workflow from diagnosis, patient stratification, therapy planning, intervention, and follow-up. Current state-of-the-art techniques for parsing volumetric medical image data are typically based on machine learning methods that exploit large annotated image databases. Two main challenges need to be addressed: the efficiency of scanning high-dimensional parametric spaces, and the need for representative image features, which require significant manual engineering effort. We propose a pipeline for object detection and segmentation in the context of volumetric image parsing, solving a two-step learning problem: anatomical pose estimation and boundary delineation. For this task we introduce Marginal Space Deep Learning (MSDL), a novel framework exploiting both the strengths of efficient object parametrization in hierarchical marginal spaces and the automated feature design of Deep Learning (DL) network architectures. In the 3D context, the application of deep learning systems is limited by the very high complexity of the parametrization. More specifically, 9 parameters are necessary to describe a restricted affine transformation in 3D, resulting in a prohibitive number of billions of scanning hypotheses. The mechanism of marginal space learning provides excellent run-time performance by learning classifiers in clustered, high-probability regions in spaces of gradually increasing dimensionality. To further increase computational efficiency and robustness, our system learns sparse adaptive data sampling patterns that automatically capture the structure of the input. Given the object localization, we propose a DL-based active shape model to estimate the non-rigid object boundary. Experimental results are presented on the aortic valve in ultrasound using an extensive dataset of 2891 volumes from 869 patients, showing significant improvements of up to 45.2% over the state-of-the-art. To our knowledge, this is the first successful demonstration of the potential of DL for detection and segmentation in full 3D data with parametrized representations.
Campbell-Washburn, Adrienne E; Xue, Hui; Lederman, Robert J; Faranesh, Anthony Z; Hansen, Michael S
2016-06-01
MRI-guided interventions demand high frame rate imaging, making fast imaging techniques such as spiral imaging and echo planar imaging (EPI) appealing. In this study, we implemented a real-time distortion correction framework to enable the use of these fast acquisitions for interventional MRI. Distortions caused by gradient waveform inaccuracies were corrected using the gradient impulse response function (GIRF), which was measured by standard equipment and saved as a calibration file on the host computer. This file was used at runtime to calculate the predicted k-space trajectories for image reconstruction. Additionally, the off-resonance reconstruction frequency was modified in real time to interactively deblur spiral images. Real-time distortion correction for arbitrary image orientations was achieved in phantoms and healthy human volunteers. The GIRF-predicted k-space trajectories matched measured k-space trajectories closely for spiral imaging. Spiral and EPI image distortion was visibly improved using the GIRF-predicted trajectories. The GIRF calibration file showed no systematic drift in 4 months and was demonstrated to correct distortions after 30 min of continuous scanning despite gradient heating. Interactive off-resonance reconstruction was used to sharpen anatomical boundaries during continuous imaging. This real-time distortion correction framework will enable the use of these high frame rate imaging methods for MRI-guided interventions. Magn Reson Med 75:2278-2285, 2016. © 2015 Wiley Periodicals, Inc.
Campbell-Washburn, Adrienne E; Xue, Hui; Lederman, Robert J; Faranesh, Anthony Z; Hansen, Michael S
2015-01-01
Purpose MRI-guided interventions demand high frame-rate imaging, making fast imaging techniques such as spiral imaging and echo planar imaging (EPI) appealing. In this study, we implemented a real-time distortion correction framework to enable the use of these fast acquisitions for interventional MRI. Methods Distortions caused by gradient waveform inaccuracies were corrected using the gradient impulse response function (GIRF), which was measured by standard equipment and saved as a calibration file on the host computer. This file was used at runtime to calculate the predicted k-space trajectories for image reconstruction. Additionally, the off-resonance reconstruction frequency was modified in real-time to interactively de-blur spiral images. Results Real-time distortion correction for arbitrary image orientations was achieved in phantoms and healthy human volunteers. The GIRF predicted k-space trajectories matched measured k-space trajectories closely for spiral imaging. Spiral and EPI image distortion was visibly improved using the GIRF predicted trajectories. The GIRF calibration file showed no systematic drift in 4 months and was demonstrated to correct distortions after 30 minutes of continuous scanning despite gradient heating. Interactive off-resonance reconstruction was used to sharpen anatomical boundaries during continuous imaging. Conclusions This real-time distortion correction framework will enable the use of these high frame-rate imaging methods for MRI-guided interventions. PMID:26114951
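A simplified sketch of the GIRF idea described in these two records (not the authors' implementation): the nominal gradient waveform is distorted by the measured system response in the frequency domain, and the predicted k-space trajectory follows from integrating the resulting waveform. Resampling the GIRF onto the waveform's frequency grid, multi-axis coupling, and unit conventions are glossed over here.

```python
import numpy as np

def girf_predicted_kspace(nominal_grad, girf, dt, gamma=42.577e6):
    """Predict the actual k-space trajectory for one gradient axis.

    nominal_grad : (N,) nominal gradient waveform [T/m]
    girf         : (N,) complex gradient impulse response, assumed sampled on the
                   same frequency grid as np.fft.fft of the waveform
    dt           : gradient raster time [s]
    gamma        : gyromagnetic ratio [Hz/T]
    """
    # distort the nominal waveform with the measured system response
    actual_grad = np.real(np.fft.ifft(np.fft.fft(nominal_grad) * girf))
    # k(t) = gamma * cumulative integral of G(t) dt  (in cycles/m)
    return gamma * np.cumsum(actual_grad) * dt
```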
Luma-chroma space filter design for subpixel-based monochrome image downsampling.
Fang, Lu; Au, Oscar C; Cheung, Ngai-Man; Katsaggelos, Aggelos K; Li, Houqiang; Zou, Feng
2013-10-01
In general, subpixel-based downsampling can achieve higher apparent resolution of the down-sampled images on LCD or OLED displays than pixel-based downsampling. With the frequency domain analysis of subpixel-based downsampling, we discover special characteristics of the luma-chroma color transform choice for monochrome images. With these, we model the anti-aliasing filter design for subpixel-based monochrome image downsampling as a human visual system-based optimization problem with a two-term cost function and obtain a closed-form solution. One cost term measures the luminance distortion and the other term measures the chrominance aliasing in our chosen luma-chroma space. Simulation results suggest that the proposed method can achieve sharper down-sampled gray/font images compared with conventional pixel and subpixel-based methods, without noticeable color fringing artifacts.
A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.
Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo
2015-01-01
The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical storage space. Alternatively, cloud computing arises as an extensive, low-cost, and reconfigurable resource. However, medical images contain patient information that cannot be made available in a public cloud. Therefore, a mechanism to anonymize these images is needed. This poster presents a solution to this issue by taking digital images from the PACS, converting the information contained in each image file into a NoSQL database, and using cloud computing to store the digital images.
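A minimal sketch of the idea, assuming pydicom as the DICOM reader and MongoDB (pymongo) as the NoSQL store; the tag list is illustrative and far from a complete de-identification profile, and large images would need GridFS rather than plain documents.

```python
# Illustrative only: a minimal de-identification pass before cloud storage.
# The tag list is NOT a complete DICOM de-identification profile.
import pydicom
from pymongo import MongoClient

IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate", "PatientAddress"]

def anonymize_and_store(dicom_path, collection):
    ds = pydicom.dcmread(dicom_path)
    for tag in IDENTIFYING_TAGS:
        if hasattr(ds, tag):
            setattr(ds, tag, "")                      # blank out patient identifiers
    doc = {
        "sop_instance_uid": str(ds.SOPInstanceUID),   # keep a technical identifier
        "modality": str(ds.Modality),
        "pixel_data": ds.PixelData,                   # raw image bytes; >16 MB would
    }                                                 # require GridFS instead
    collection.insert_one(doc)                        # NoSQL (MongoDB) document store

# Usage (hypothetical connection string and names):
# client = MongoClient("mongodb://cloud-host:27017")
# anonymize_and_store("image.dcm", client.pacs_backup.images)
```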
Color model comparative analysis for breast cancer diagnosis using H and E stained images
NASA Astrophysics Data System (ADS)
Li, Xingyu; Plataniotis, Konstantinos N.
2015-03-01
Digital cancer diagnosis is a research realm where signal processing techniques are used to analyze and classify color histopathology images. Unlike grayscale image analysis of magnetic resonance imaging or X-ray, colors in histopathology images convey a large amount of histological information and thus play a significant role in cancer diagnosis. Although color information is widely used in histopathology work, to date there have been few studies on color model selection for feature extraction in cancer diagnosis schemes. This paper addresses the problem of color space selection for digital cancer classification using H and E stained images, and investigates the effectiveness of various color models (RGB, HSV, CIE L*a*b*, and a stain-dependent H and E decomposition model) in breast cancer diagnosis. In particular, we build a diagnosis framework as a comparison benchmark and take specific concerns of medical decision systems into account in the evaluation. The evaluation methodologies include feature discriminative power evaluation and final diagnosis performance comparison. Experimentation on a publicly accessible histopathology image set suggests that the H and E decomposition model outperforms the other assessed color spaces. To explain the varied performance of the color spaces, our analysis via mutual information estimation demonstrates that color components in the H and E model are less dependent, and thus most feature discriminative power is collected in one channel instead of being spread out among channels as in other color spaces.
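For illustration, channel dependence of the kind discussed above might be quantified with a simple histogram-based mutual information estimate; this is a generic sketch, not the estimator used in the paper.

```python
import numpy as np

def mutual_information(x, y, bins=64):
    """Histogram-based mutual information (in bits) between two image channels."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of channel x
    py = pxy.sum(axis=0, keepdims=True)      # marginal of channel y
    nz = pxy > 0                             # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```

Lower mutual information between channels indicates less redundancy, i.e. the channels carry more complementary information.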
The Design of Optical Sensor for the Pinhole/Occulter Facility
NASA Technical Reports Server (NTRS)
Greene, Michael E.
1990-01-01
Three optical sight sensor systems were designed, built and tested. Two optical line-of-sight sensor systems are capable of measuring the absolute pointing angle to the sun. The system is for use with the Pinhole/Occulter Facility (P/OF), a solar hard X-ray experiment to be flown from the Space Shuttle or Space Station. The sensor consists of a pinhole camera with two pairs of perpendicularly mounted linear photodiode arrays to detect the intensity distribution of the solar image produced by the pinhole, track-and-hold circuitry for data reduction, an analog-to-digital converter, and a microcomputer. The deflection of the image center is calculated from these data using an approximation for the solar image. A second system consists of a pinhole camera with a pair of perpendicularly mounted linear photodiode arrays, amplification circuitry, threshold detection circuitry, and a microcomputer board. The deflection of the image is calculated by knowing the position of each pixel of the photodiode array and simply counting pixels until the threshold is surpassed. A third optical sensor system is capable of measuring the internal vibration of the P/OF between the mask and base. The system consists of a white light source, a mirror, and a pair of perpendicularly mounted linear photodiode arrays to detect the intensity distribution of the image produced by the mirror, amplification circuitry, threshold detection circuitry, and a microcomputer board. The deflection of the image, and hence the vibration of the structure, is calculated by knowing the position of each pixel of the photodiode array and counting pixels until the threshold is surpassed.
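A minimal sketch in the spirit of the second sensor's pixel-counting scheme; the array signal, threshold, and pixel pitch are assumed inputs rather than the flight hardware's actual parameters.

```python
import numpy as np

def image_deflection(diode_array, threshold, pixel_pitch_um):
    """Estimate image-edge deflection along one photodiode array by counting
    pixels until the signal first exceeds a threshold."""
    above = np.flatnonzero(diode_array > threshold)
    if above.size == 0:
        raise ValueError("threshold never exceeded")
    first_pixel = int(above[0])              # pixel count until threshold crossing
    return first_pixel * pixel_pitch_um      # deflection in micrometres
```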
NASA Technical Reports Server (NTRS)
Partridge, James D.
2002-01-01
'NASA is preparing to launch the Next Generation Space Telescope (NGST). This telescope will be larger than the Hubble Space Telescope, be launched on an Atlas missile rather than the Space Shuttle, have a segmented primary mirror, and be placed in a higher orbit. All these differences pose significant challenges.' This effort addresses the challenge of implementing an algorithm, designed by Philip Olivier and members of SOMTC (Space Optics Manufacturing Technology Center), for aligning the segments of the primary mirror during initial deployment. The implementation was to be performed on the SIBOA (Systematic Image Based Optical Alignment) test bed. Unfortunately, hardware/software issues concerning SIBOA and an extended algorithm development period prevented testing before the end of the study period. Properties of the digital camera were studied and understood, resulting in the current ability to select optimal settings with regard to saturation. The study succeeded in manually capturing several images of two stacked segments at various relative phases. These images can be used to calibrate the algorithm for future implementation. Currently the system is ready for testing.
NASA Astrophysics Data System (ADS)
Nguyen, An Hung; Guillemette, Thomas; Lambert, Andrew J.; Pickering, Mark R.; Garratt, Matthew A.
2017-09-01
Image registration is a fundamental image processing technique. It is used to spatially align two or more images that have been captured at different times, from different sensors, or from different viewpoints. Many algorithms have been proposed for this task, the most common being the well-known Lucas-Kanade (LK) and Horn-Schunck approaches. However, the main limitation of these approaches is the computational complexity of the large number of iterations necessary for successful alignment of the images. Previously, a multi-pass image interpolation algorithm (MP-I2A) was developed to considerably reduce the number of iterations required for successful registration compared with the LK algorithm. This paper develops a kernel-warping algorithm (KWA), a modified version of the MP-I2A, which requires fewer iterations to successfully register two images and less memory space for a field-programmable gate array (FPGA) implementation than the MP-I2A. These reductions increase the feasibility of implementing the proposed algorithm on FPGAs with very limited memory space and other hardware resources. A two-FPGA system, rather than a single-FPGA system, is developed to implement the KWA in order to compensate for the insufficient hardware resources of a single FPGA, and to increase the parallel processing ability and scalability of the system.
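For context, a translation-only Lucas-Kanade iteration can be written in a few lines; this generic sketch (integer shifts, gradients taken on the reference image) is not the MP-I2A or KWA, but shows the iterative structure whose cost those algorithms aim to reduce.

```python
import numpy as np

def lucas_kanade_translation(ref, moving, iters=20):
    """Estimate (dx, dy) such that moving(x + d) ~= ref(x), i.e. the shift that
    aligns `moving` to the reference, via Gauss-Newton iterations."""
    ref = ref.astype(float)
    moving = moving.astype(float)
    gy, gx = np.gradient(ref)                           # gradients approximated on ref
    A = np.stack([gx.ravel(), gy.ravel()], axis=1)      # Jacobian w.r.t. (dx, dy)
    AtA = A.T @ A
    dx = dy = 0.0
    for _ in range(iters):
        # sample moving at x + d (integer shifts only, for brevity)
        warped = np.roll(moving, (-int(round(dy)), -int(round(dx))), axis=(0, 1))
        err = (ref - warped).ravel()
        ddx, ddy = np.linalg.solve(AtA, A.T @ err)      # 2x2 normal equations
        dx, dy = dx + ddx, dy + ddy
    return dx, dy
```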
Mapping experiment with space station
NASA Technical Reports Server (NTRS)
Wu, S. S. C.
1986-01-01
Mapping of the Earth from space stations can be approached in two ways. One is to collect gravity data for defining a topographic datum using the Earth's gravity field in terms of spherical harmonics. The other is to explore techniques for mapping topography using either optical or radar images, with or without reference to ground control points. Without ground control points, an integrated camera system can be designed. With ground control points, the position of the space station (camera station) can be precisely determined at any instant. Therefore, terrestrial topography can be precisely mapped either by conventional photogrammetric methods or by current digital technology of image correlation. For the mapping experiment, it is proposed to establish four ground points either in North America or Africa (including the Sahara desert). If this experiment is successfully accomplished, it may also be applied to defense charting systems.
NASA Astrophysics Data System (ADS)
Lang, Jun
2015-03-01
In this paper, we propose a novel color image encryption method using Color Blend (CB) and Chaos Permutation (CP) operations in the reality-preserving multiple-parameter fractional Fourier transform (RPMPFRFT) domain. The original color image is first exchanged and mixed randomly from the standard red-green-blue (RGB) color space to an R′G′B′ color space by rotating the color cube with a random angle matrix. Then the RPMPFRFT is employed to change the pixel values of the color image: the three components of the scrambled RGB color space are transformed by the RPMPFRFT with three different transform pairs, respectively. Compared with transforms having complex-valued output, the RPMPFRFT ensures that the output is real, which saves image storage space and is convenient for transmission in practical applications. To further enhance the security of the encryption system, the output of the former steps is scrambled by juxtaposing sections of the image in the reality-preserving multiple-parameter fractional Fourier domains, with the alignment of sections determined by two coupled chaotic logistic maps. The parameters in the Color Blend, Chaos Permutation and RPMPFRFT operations are regarded as the key of the encryption algorithm. The proposed color image encryption can also be applied to encrypt three gray images by transforming them into the three RGB color components of a specially constructed color image. Numerical simulations demonstrate that the proposed algorithm is feasible, secure, sensitive to keys and robust to noise attack and data loss.
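As a simplified, hypothetical illustration of chaos-driven permutation (a single logistic map rather than the paper's two coupled maps), the sketch below derives a reversible pixel permutation from a keyed chaotic sequence.

```python
import numpy as np

def logistic_permutation(n, x0=0.3456, r=3.99, burn_in=1000):
    """Build a pixel permutation from a logistic map x <- r*x*(1-x).
    The key is (x0, r); sorting the chaotic sequence yields the permutation."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1.0 - x)
    seq = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return np.argsort(seq)            # permutation of 0..n-1

def chaos_permute(img, perm):
    """Apply the permutation to pixel positions (grayscale or color image)."""
    flat = img.reshape(-1, img.shape[-1]) if img.ndim == 3 else img.ravel()
    return flat[perm].reshape(img.shape)
```

Decryption applies the inverse permutation, obtained with `np.argsort(perm)`.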
Rahman, Md Mahmudur; Antani, Sameer K; Demner-Fushman, Dina; Thoma, George R
2015-10-01
This article presents an approach to biomedical image retrieval by mapping image regions to local concepts where images are represented in a weighted entropy-based concept feature space. The term "concept" refers to perceptually distinguishable visual patches that are identified locally in image regions and can be mapped to a glossary of imaging terms. Further, the visual significance (e.g., visualness) of concepts is measured as the Shannon entropy of pixel values in image patches and is used to refine the feature vector. Moreover, the system can assist the user in interactively selecting a region-of-interest (ROI) and searching for similar image ROIs. Further, a spatial verification step is used as a postprocessing step to improve retrieval results based on location information. The hypothesis that such approaches would improve biomedical image retrieval is validated through experiments on two different data sets, which are collected from open access biomedical literature.
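A minimal sketch of the entropy-based "visualness" measure as described, assuming 8-bit intensities; the binning is an assumption.

```python
import numpy as np

def patch_entropy(patch, bins=256):
    """Shannon entropy (bits) of the pixel-intensity distribution in a patch,
    used as a proxy for the visual significance ('visualness') of a concept."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking the log
    return float(-np.sum(p * np.log2(p)))
```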
NASA Astrophysics Data System (ADS)
Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.
2017-09-01
Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated at lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite System (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.
Operation and Performance of the Mars Exploration Rover Imaging System on the Martian Surface
NASA Technical Reports Server (NTRS)
Maki, Justin N.; Litwin, Todd; Herkenhoff, Ken
2005-01-01
This slide presentation details the Mars Exploration Rover (MER) imaging system. Over 144,000 images have been gathered from all Mars missions, with 83.5% of them gathered by MER. Each rover has 9 cameras (Navcam, front and rear Hazcam, Pancam, Microscopic Imager, Descent Camera, Engineering Camera, Science Camera) and produces 1024 x 1024 (1 megapixel) images in the same format. All onboard image processing code is implemented in flight software and includes extensive processing capabilities such as autoexposure, flat field correction, image orientation, thumbnail generation, subframing, and image compression. Ground image processing is done at the Jet Propulsion Laboratory's Multimission Image Processing Laboratory using Video Image Communication and Retrieval (VICAR), while stereo processing (left/right pairs) provides raw images, radiometric correction, solar energy maps, triangulation (Cartesian 3-space), and slope maps.
General view of the Orbiter Discovery in the Orbiter Processing ...
General view of the Orbiter Discovery in the Orbiter Processing Facility at Kennedy Space Center showing the payload bay doors open exposing the heat-dissipating radiator panels located on the inside of the payload bay doors. Also in the view is the boom portion of the boom sensor system deployed as part of the return to flight procedures after STS-107 to inspect the orbiter's thermal protection system. The Remote Manipulator System, the "Canadarm", and the airlock are seen in the background of the image. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
STS-93 MS Hawley works with data associated with the OCA on the middeck
2013-11-18
STS093-327-004 (23-27 July 1999) --- Astronaut Steven A. Hawley works with data associated with the Orbital Communications Adapter (OCA) on the middeck of the Space Shuttle Columbia. Not far away from him is the window-mounted instrument which supports the Southwest Ultraviolet Imaging System (SWUIS). SWUIS is an innovative telescope/charge-coupled device camera system designed to image planets and other solar system bodies.
Video Guidance, Landing, and Imaging system (VGLIS) for space missions
NASA Technical Reports Server (NTRS)
Schappell, R. T.; Knickerbocker, R. L.; Tietz, J. C.; Grant, C.; Flemming, J. C.
1975-01-01
The feasibility of an autonomous video guidance system that is capable of observing a planetary surface during terminal descent and selecting the most acceptable landing site was demonstrated. The system was breadboarded and "flown" on a physical simulator consisting of a control panel and monitor, a dynamic simulator, and a PDP-9 computer. The breadboard VGLIS consisted of an image dissector camera and the appropriate processing logic. Results are reported.
Super-resolution for scanning light stimulation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bitzer, L. A.; Neumann, K.; Benson, N., E-mail: niels.benson@uni-due.de
Super-resolution (SR) is a technique used in digital image processing to overcome the resolution limitation of imaging systems. In this process, a single high resolution image is reconstructed from multiple low resolution images. SR is commonly used for CCD and CMOS (Complementary Metal-Oxide-Semiconductor) sensor images, as well as for medical applications, e.g., magnetic resonance imaging. Here, we demonstrate that super-resolution can be applied with scanning light stimulation (LS) systems, which are common to obtain space-resolved electro-optical parameters of a sample. For our purposes, the Projection Onto Convex Sets (POCS) was chosen and modified to suit the needs of LS systems. To demonstrate the SR adaption, an Optical Beam Induced Current (OBIC) LS system was used. The POCS algorithm was optimized by means of OBIC short circuit current measurements on a multicrystalline solar cell, resulting in a mean square error reduction of up to 61% and improved image quality.
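For illustration only, a bare-bones POCS-style super-resolution loop for the idealized case of integer-shifted, box-downsampled low-resolution frames; the OBIC-specific modifications described in the paper are not reproduced here.

```python
import numpy as np

def pocs_sr(lr_images, shifts, scale=2, iters=25):
    """Minimal POCS-style super-resolution. `lr_images` are low-resolution frames,
    `shifts` their (dy, dx) offsets in high-resolution pixels (integers); the
    observation model is a plain box average over scale x scale blocks."""
    H, W = lr_images[0].shape[0] * scale, lr_images[0].shape[1] * scale
    hr = np.kron(lr_images[0], np.ones((scale, scale)))    # initial guess (frame 0
                                                           # assumed to have zero shift)
    for _ in range(iters):
        for lr, (dy, dx) in zip(lr_images, shifts):
            shifted = np.roll(hr, (-dy, -dx), axis=(0, 1))
            # simulate the observation: box average over scale x scale blocks
            sim = shifted.reshape(H // scale, scale, W // scale, scale).mean(axis=(1, 3))
            err = lr - sim
            # project onto the data-consistency set: spread the residual uniformly
            corr = np.kron(err, np.ones((scale, scale)))
            hr += np.roll(corr, (dy, dx), axis=(0, 1))
    return hr
```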
Several considerations with respect to the future of digital photography and photographic printing
NASA Astrophysics Data System (ADS)
Tuijn, Chris; Mahy, Marc F.
2000-12-01
Digital cameras are no longer exotic gadgets used by a privileged group of early adopters. More and more people realize that there are obvious advantages to the digital solution over the conventional film-based workflow. Claiming that prints on paper are no longer necessary in the digital workflow, however, would be similar to reviving the myth of the paperless office. Often, people still like to share their memories on paper, and this for a variety of reasons. There are still some hurdles to be overcome in order to make the digital dream come true. In this paper, we will give a survey of the different workflows in digital photography. The local, semi-local and Internet solutions will be discussed, as well as the preferred output systems for each of these solutions. When discussing output systems, we immediately think of appropriate color management solutions. In the second part of this paper, we will discuss the major color management issues appearing in digital photography. A clear separation between the image acquisition and the image rendering phases will be made. After a quick survey of the different image restoration and enhancement techniques, we will make some reflections on the ideal color exchange space; the enhanced image should be delivered in this exchange space and, from there, the standard color management transformations can be applied to transfer the image from this exchange space to the native color space of the output device. We will also discuss some color gamut characteristics and color management problems of different types of photographic printers that can occur during this conversion process.
NASA Technical Reports Server (NTRS)
Vaughan, Andrew T. (Inventor); Riedel, Joseph E. (Inventor)
2016-01-01
A single, compact, lower power deep space positioning system (DPS) configured to determine a location of a spacecraft anywhere in the solar system, and provide state information relative to Earth, Sun, or any remote object. For example, the DPS includes a first camera and, possibly, a second camera configured to capture a plurality of navigation images to determine a state of a spacecraft in a solar system. The second camera is located behind, or adjacent to, a secondary reflector of a first camera in a body of a telescope.
Imaging Thermal He(+)in Geospace from the Lunar Surface
NASA Technical Reports Server (NTRS)
Gallagher, D. L.; Sandel, B. R.; Adrian, Mark L.; Goldstein, Jerry; Jahn, Joerg-Micha; Spasojevic, Maria; Griffin, Brand
2007-01-01
By mass, thermal plasma dominates near-earth space and strongly influences the transport of energy and mass into the earth's atmosphere. It is proposed to play an important role in modifying the strength of space weather storms by its presence in regions of magnetic reconnection in the dayside magnetopause and in the near to mid-magnetotail. Ionospheric-origin thermal plasma also represents the most significant potential loss of atmospheric mass from our planet over geological time. Knowledge of the loss of convected thermal plasma into the solar wind versus its recirculation across high latitudes and through the magnetospheric flanks into the magnetospheric tail will enable determination of the mass balance for this mass-dominant component of the Geospace system and of its influence on global magnetospheric processes that are critical to space weather prediction and hence to the impact of space processes on human technology in space and on Earth. Our proposed concept addresses this basic issue of Geospace dynamics by imaging thermal He(+) ions in extreme ultraviolet light with an instrument on the lunar surface. The concept is derived from the highly successful Extreme Ultraviolet imager (EUV) flown on the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE) spacecraft. From the lunar surface an advanced EUV imager is anticipated to have much higher sensitivity, lower background noise, and higher communication bandwidth back to Earth. From the near-magnetic equatorial location on the lunar surface, such an imager would be ideally located to follow thermal He(+) ions to high latitudes, into the magnetospheric flanks, and into the magnetotail.
NASA Technical Reports Server (NTRS)
Kelly, Patrick L.; Fox, Ori D.; Filippenko, Alexei V.; Cenko, S. Bradley; Prato, Lisa; Schaefer, Gail; Shen, Ken J.; Zheng, WeiKang; Graham, Melissa L.; Tucker, Brad E.
2014-01-01
We constrain the properties of the progenitor system of the highly reddened Type Ia supernova (SN Ia) 2014J in Messier 82 (M82; d ≈ 3.5 Mpc). We determine the supernova (SN) location using Keck-II K-band adaptive optics images, and we find no evidence for flux from a progenitor system in pre-explosion near-ultraviolet through near-infrared Hubble Space Telescope (HST) images. Our upper limits exclude systems having a bright red giant companion, including symbiotic novae with luminosities comparable to that of RS Ophiuchi. While the flux constraints are also inconsistent with predictions for comparatively cool He-donor systems (T ≈ 35,000 K), we cannot preclude a system similar to V445 Puppis. The progenitor constraints are robust across a wide range of R_V and A_V values, but significantly greater values than those inferred from the SN light curve and spectrum would yield proportionally brighter luminosity limits. The comparatively faint flux expected from a binary progenitor system consisting of white dwarf stars would not have been detected in the pre-explosion HST imaging. Infrared HST exposures yield more stringent constraints on the luminosities of very cool (T < 3000 K) companion stars than was possible in the case of SN Ia 2011fe.
Real-time implementation of camera positioning algorithm based on FPGA & SOPC
NASA Astrophysics Data System (ADS)
Yang, Mingcao; Qiu, Yuehong
2014-09-01
In recent years, developments in positioning algorithms and FPGAs have made fast, accurate, real-time camera positioning achievable. Based on an in-depth study of embedded hardware and dual-camera positioning systems, this thesis sets up an infrared optical positioning system built on an FPGA and an SOPC, which enables real-time positioning of marker points in space. The completed work includes: (1) a CMOS sensor, driven by FPGA hardware, is used to extract the pixels of three target objects; visible-light LEDs serve as the target points of the instrument. (2) Prior to extraction of the feature point coordinates, the image is filtered (median filtering) to suppress noise introduced by the platform and the physical properties of the system. (3) Marker point coordinates are extracted by the FPGA hardware circuit, using a new iterative threshold selection method for image segmentation; the segmented binary image is then labeled, and the coordinates of the feature points are calculated by the center-of-gravity method. (4) Direct linear transformation (DLT) and epipolar constraints are applied to the three-dimensional reconstruction of space coordinates from the planar-array CMOS system. A dual-core SOPC (system on a programmable chip) is used so that matching and coordinate operations run separately, increasing processing speed.
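A minimal sketch of the center-of-gravity step on a thresholded image; the image and threshold are assumed inputs, and real hardware would perform this with fixed-point accumulators.

```python
import numpy as np

def centroid_by_gravity(image, threshold):
    """Locate a feature point as the intensity-weighted center of gravity
    of the pixels above a threshold."""
    w = np.where(image > threshold, image.astype(float), 0.0)
    total = w.sum()
    if total == 0:
        raise ValueError("no pixels above threshold")
    ys, xs = np.indices(image.shape)
    return (xs * w).sum() / total, (ys * w).sum() / total   # (x, y) in pixels
```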
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.; Tietjen, Alan B.; Horvath, Thomas J.; Tomek, Deborah M.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Bush, Brett C.; Mercer, C. David; Shea, Edward J.
2010-01-01
High resolution calibrated near infrared (NIR) imagery was obtained of the Space Shuttle's reentry during the STS-119, STS-125, and STS-128 missions. The infrared imagery was collected using a US Navy NP-3D Orion aircraft using a long-range infrared optical package referred to as Cast Glance. The slant ranges between the Space Shuttle and Cast Glance were approximately 26-41 nautical miles at the point of closest approach. The Hypersonic Thermodynamic Infrared Measurements (HYTHIRM) project was a NASA Langley-led endeavor sponsored by the NASA Engineering Safety Center, the Space Shuttle Program Office and the NASA Aeronautics Research Mission Directorate to demonstrate a quantitative thermal imaging capability. HYTHIRM required several mission tools to acquire the imagery. These tools include pre-mission acquisition simulations of the Shuttle trajectory in relationship to the Cast Glance aircraft flight path, radiance modeling to predict the infrared response of the Shuttle, and post-mission analysis tools to process the infrared imagery into quantitative temperature maps. The spatially resolved global thermal measurements made during the Shuttle's hypersonic reentry provide valuable flight data for reducing the uncertainty associated with present-day ground-to-flight extrapolation techniques and current state-of-the-art empirical boundary-layer transition or turbulent heating prediction methods. Laminar and turbulent flight data are considered critical for the development of turbulence models supporting NASA's next-generation spacecraft. This paper will provide the motivation and details behind the use of an upgraded NIR imaging system used onboard a Navy Cast Glance aircraft and describe the characterizations and procedures performed to obtain quantitative temperature maps. A brief description and assessment will be provided of the previously used analog NIR camera, along with image examples from Shuttle missions STS-121 and STS-115 and a solar tower test. These thermal observations confirmed the challenges of long-range acquisition during reentry, which are due to unknown atmospheric conditions, image saturation, vibration, etc. This provides the motivation for the use of a digital NIR sensor. The characterizations performed on the digital NIR sensor included radiometric, spatial, and spectral measurements using blackbody radiation sources and known targets. An assessment of the collected data for three Space Shuttle atmospheric reentries, STS-119, STS-125, and STS-128, is provided along with a description of various events of interest captured using the digital NIR imaging system, such as RCS firings and boundary layer transitions. Lastly, the process used to convert the raw image counts to quantitative temperatures is presented along with comparisons to the Space Shuttle's onboard thermocouples.
Large space telescope, phase A. Volume 4: Scientific instrument package
NASA Technical Reports Server (NTRS)
1972-01-01
The design and characteristics of the scientific instrument package for the Large Space Telescope are discussed. The subjects include: (1) general scientific objectives, (2) package system analysis, (3) scientific instrumentation, (4) imaging photoelectric sensors, (5) environmental considerations, and (6) reliability and maintainability.
Interior view of the Flight Deck looking forward, the Commander's ...
Interior view of the Flight Deck looking forward, the Commander's seat and controls are on the left and the pilot's seat and controls are on the right of the view. Note that the flight deck windows have protective covers over them in this view. This image can be digitally stitched with image HAER No. TX-116-A-20 to expand the view to include the overhead control panels of the flight deck. This view was taken in the Orbiter Processing Facility at the Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
The architecture of a video image processor for the space station
NASA Technical Reports Server (NTRS)
Yalamanchili, S.; Lee, D.; Fritze, K.; Carpenter, T.; Hoyme, K.; Murray, N.
1987-01-01
The architecture of a video image processor for space station applications is described. The architecture was derived from a study of the requirements of algorithms that are necessary to produce the desired functionality of many of these applications. Architectural options were selected based on a simulation of the execution of these algorithms on various architectural organizations. A great deal of emphasis was placed on the ability of the system to evolve and grow over the lifetime of the space station. The result is a hierarchical parallel architecture that is characterized by high-level language programmability, modularity, and extensibility, and that can meet the required performance goals.
NASA Astrophysics Data System (ADS)
Zhou, Nanrun; Chen, Weiwei; Yan, Xinyu; Wang, Yunqian
2018-06-01
In order to obtain higher encryption efficiency, a bit-level quantum color image encryption scheme exploiting a quantum cross-exchange operation and a 5D hyper-chaotic system is designed. Additionally, to enhance the scrambling effect, a quantum channel swapping operation is employed to swap the gray values of corresponding pixels. The proposed color image encryption algorithm has a larger key space and higher security, since the 5D hyper-chaotic system has more complex dynamic behavior, better randomness and unpredictability than low-dimensional hyper-chaotic systems. Simulations and theoretical analyses demonstrate that the presented bit-level quantum color image encryption scheme outperforms its classical counterparts in efficiency and security.
2017-12-08
Carina Nebula Details: Great Clouds Credit for Hubble Image: NASA, ESA, N. Smith (University of California, Berkeley), and The Hubble Heritage Team (STScI/AURA) Credit for CTIO Image: N. Smith (University of California, Berkeley) and NOAO/AURA/NSF The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute conducts Hubble science operations. Goddard is responsible for HST project management, including mission and science operations, servicing missions, and all associated development activities. To learn more about the Hubble Space Telescope go here: www.nasa.gov/mission_pages/hubble/main/index.html
Multispectral Image Processing for Plants
NASA Technical Reports Server (NTRS)
Miles, Gaines E.
1991-01-01
The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.
Geometry-Based Observability Metric
NASA Technical Reports Server (NTRS)
Eaton, Colin; Naasz, Bo
2012-01-01
The Satellite Servicing Capabilities Office (SSCO) is currently developing and testing Goddard s Natural Feature Image Recognition (GNFIR) software for autonomous rendezvous and docking missions. GNFIR has flight heritage and is still being developed and tailored for future missions with non-cooperative targets: (1) DEXTRE Pointing Package System on the International Space Station, (2) Relative Navigation System (RNS) on the Space Shuttle for the fourth Hubble Servicing Mission.
Natural texture retrieval based on perceptual similarity measurement
NASA Astrophysics Data System (ADS)
Gao, Ying; Dong, Junyu; Lou, Jianwen; Qi, Lin; Liu, Jun
2018-04-01
A typical texture retrieval system performs feature comparison and might not be able to make human-like judgments of image similarity. Meanwhile, it is commonly known that perceptual texture similarity is difficult to describe with traditional image features. In this paper, we propose a new texture retrieval scheme based on perceptual texture similarity. The key to the proposed scheme is that the prediction of perceptual similarity is performed by learning a non-linear mapping from image feature space to perceptual texture space using Random Forest. We test the method on a natural texture dataset and apply it to a new wallpaper dataset. Experimental results demonstrate that the proposed texture retrieval scheme with perceptual similarity improves the retrieval performance over traditional image features.
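A hedged sketch of the feature-to-perceptual-space mapping using scikit-learn's RandomForestRegressor; the feature matrix `X` and perceptual coordinates `Y` (e.g., derived from human similarity ratings) are assumed inputs, and the distance-based similarity below is one plausible choice rather than the paper's exact measure.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_perceptual_mapping(X, Y, n_estimators=200, random_state=0):
    """Learn a non-linear map from image feature space (X) to perceptual
    texture space (Y) with a Random Forest regressor."""
    rf = RandomForestRegressor(n_estimators=n_estimators, random_state=random_state)
    rf.fit(X, Y)
    return rf

def perceptual_similarity(rf, feat_a, feat_b):
    """Rank textures by distance in the predicted perceptual space."""
    pa, pb = rf.predict(np.vstack([feat_a, feat_b]))
    return -np.linalg.norm(pa - pb)   # higher value = more perceptually similar
```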
Image processing applications: From particle physics to society
NASA Astrophysics Data System (ADS)
Sotiropoulou, C.-L.; Luciano, P.; Gkaitatzis, S.; Citraro, S.; Giannetti, P.; Dell'Orso, M.
2017-01-01
We present an embedded system for extremely efficient real-time pattern recognition execution, enabling technological advancements with both scientific and social impact. It is a compact, fast, low consumption processing unit (PU) based on a combination of Field Programmable Gate Arrays (FPGAs) and the full custom associative memory chip. The PU has been developed for real time tracking in particle physics experiments, but delivers flexible features for potential application in a wide range of fields. It has been proposed to be used in accelerated pattern matching execution for Magnetic Resonance Fingerprinting (biomedical applications), in real time detection of space debris trails in astronomical images (space applications) and in brain emulation for image processing (cognitive image processing). We illustrate the potentiality of the PU for the new applications.
NASA Technical Reports Server (NTRS)
Kharkovsky, S.; Zoughi, R.; Hepburn, Frank L.
2007-01-01
In recent years, continuous-wave near-field and lens-focused millimeter wave imaging systems have been used effectively to produce high-resolution images of metallic structures covered with spray-on foam insulation (SOFI), such as the Space Shuttle external fuel tank. However, for some specific structures a certain interference pattern may be superimposed on the produced images. There are methods by which the influence of this unwanted interference can be reduced, such as the incorporation of an incidence angle and the proper use of signal polarization. This paper presents the basics of this problem and describes the use of these methods for reducing the unwanted influence through specific examples.
Determination of technical readiness for an atmospheric carbon imaging spectrometer
NASA Astrophysics Data System (ADS)
Mobilia, Joseph; Kumer, John B.; Palmer, Alice; Sawyer, Kevin; Mao, Yalan; Katz, Noah; Mix, Jack; Nast, Ted; Clark, Charles S.; Vanbezooijen, Roel; Magoncelli, Antonio; Baraze, Ronald A.; Chenette, David L.
2013-09-01
The geoCARB sensor uses a 4-channel push broom slit-scan infrared imaging grating spectrometer to measure the absorption spectra of sunlight reflected from the ground in narrow wavelength regions. The instrument is designed for flight at geostationary orbit to provide mapping of greenhouse gases over continental scales, several times per day, with a spatial resolution of a few kilometers. The sensor provides multiple daily maps of column-averaged mixing ratios of CO2, CH4, and CO over the regions of interest, which enables flux determination at unprecedented time, space, and accuracy scales. The geoCARB sensor development is based on our experience in successful implementation of advanced space-deployed optical instruments for remote sensing. A few recent examples include the Atmospheric Imaging Assembly (AIA) and Helioseismic and Magnetic Imager (HMI) on the geostationary Solar Dynamics Observatory (SDO), the Space Based Infrared System (SBIRS GEO-1) and the Interface Region Imaging Spectrograph (IRIS), along with sensors under development, the Near Infrared Camera (NIRCam) for James Webb (JWST), and the Global Lightning Mapper (GLM) and Solar UltraViolet Imager (SUVI) for the GOES-R series. The Tropospheric Infrared Mapping Spectrometer (TIMS), developed in part through the NASA Instrument Incubator Program (IIP), provides an important part of the strong technological foundation for geoCARB. The paper discusses subsystem heritage and technology readiness levels for these subsystems. The system level flight technology readiness and methods used to determine this level are presented along with plans to enhance the level.
2015-02-04
In this image, Mercury's horizon cuts a striking edge against the stark blackness of space. On the right, sunlight harshly brings the landscape into relief while on the left, the surface is shrouded in the darkness of night. This image was acquired as part of MDIS's limb imaging campaign. Once per week, MDIS captures images of Mercury's limb, with an emphasis on imaging the southern hemisphere limb. These limb images provide information about Mercury's shape and complement measurements of topography made by the Mercury Laser Altimeter (MLA) of Mercury's northern hemisphere. The MESSENGER spacecraft is the first ever to orbit the planet Mercury, and the spacecraft's seven scientific instruments and radio science investigation are unraveling the history and evolution of the Solar System's innermost planet. In the mission's more than three years of orbital operations, MESSENGER has acquired over 250,000 images and extensive other data sets. MESSENGER is capable of continuing orbital operations until early 2015. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington
Improved Resolution Optical Time Stretch Imaging Based on High Efficiency In-Fiber Diffraction.
Wang, Guoqing; Yan, Zhijun; Yang, Lei; Zhang, Lin; Wang, Chao
2018-01-12
The most overlooked challenges in ultrafast optical time stretch imaging (OTSI) are sacrificed spatial resolution and higher optical loss. These challenges originate from the optical diffraction devices used in OTSI, which encode the image into the spectra of ultrashort optical pulses. Conventional free-space diffraction gratings, as widely used in existing OTSI systems, suffer from several inherent drawbacks: limited diffraction efficiency in a non-Littrow configuration due to inherent zeroth-order reflection, high coupling loss between free-space gratings and optical fibers, bulky footprint, and more importantly, sacrificed imaging resolution due to non-full-aperture illumination for individual wavelengths. Here we report resolution-improved and diffraction-efficient OTSI using in-fiber diffraction for the first time to our knowledge. The key to overcoming the existing challenges is a 45° tilted fiber grating (TFG), which serves as a compact in-fiber diffraction device offering improved diffraction efficiency (up to 97%), inherent compatibility with optical fibers, and improved imaging resolution owing to almost full-aperture illumination for all illumination wavelengths. Imaging of a fast-moving object at 46 m/s at 50 million frames per second with improved imaging resolution has been demonstrated. This conceptually new in-fiber diffraction design opens the way towards cost-effective, compact and high-resolution OTSI systems for image-based high-throughput detection and measurement.
Discrete Fourier Transform in a Complex Vector Space
NASA Technical Reports Server (NTRS)
Dean, Bruce H. (Inventor)
2015-01-01
An image-based phase retrieval technique has been developed that can be used on board a space-based iterative transformation system. Image-based wavefront sensing is computationally demanding due to the floating-point nature of the process. The discrete Fourier transform (DFT) calculation is presented in "diagonal" form. By diagonal we mean that a transformation of basis is introduced by an application of the similarity transform of linear algebra. The current method exploits the diagonal structure of the DFT in a special way, particularly when parts of the calculation do not have to be repeated at each iteration, in order to converge to an acceptable solution for focusing an image.
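One way to read the "diagonal form" in plain linear-algebra terms (a general statement about the unitary DFT matrix, not necessarily the patent's specific construction):

```latex
% The N-point unitary DFT matrix F, with entries F_{mn} = e^{-2\pi i mn/N}/\sqrt{N},
% is normal, so it is diagonalized by a similarity transform built from its
% eigenvectors, with eigenvalues restricted to the fourth roots of unity:
\[
  F = V \Lambda V^{-1}, \qquad
  \Lambda = \operatorname{diag}(\lambda_1,\ldots,\lambda_N), \qquad
  \lambda_k \in \{1,\,-1,\,i,\,-i\},
\]
\[
  \hat{x} = F x = V\,\Lambda\,(V^{-1}x),
\]
% so once the data are expressed in the new basis (V^{-1}x), applying the DFT
% reduces to elementwise multiplication by the diagonal matrix \Lambda, and the
% change-of-basis factors need not be recomputed at every iteration.
```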
Boundary-layer transition and global skin friction measurement with an oil-fringe imaging technique
NASA Technical Reports Server (NTRS)
Monson, Daryl J.; Mateer, George G.; Menter, Florian R.
1993-01-01
A new oil-fringe imaging system skin friction (FISF) technique to measure skin friction on wind tunnel models is presented. In the method used to demonstrate the technique, lines of oil are applied on surfaces that connect the intended sets of measurement points, and then a wind tunnel is run so that the oil thins and forms interference fringes that are spaced in proportion to local skin friction. After a run the fringe spacings are imaged with a CCD-array digital camera and measured on a computer. Skin friction and transition measurements on a two-dimensional wing are presented and compared with computational predictions.
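For context, under the classical constant-shear thin-oil-film approximation with normal-incidence viewing, the fringe spacing is proportional to the wall shear stress, which is presumably the relation the FISF technique exploits; the exact working equation of the method may differ.

```latex
% A thin oil film under constant shear \tau_w thins as h(x,t) = \mu\, x/(\tau_w t);
% interference fringes occur at thickness increments \Delta h = \lambda_0/(2 n_{\mathrm{oil}}),
% so after run time t the fringe spacing \Delta x satisfies
\[
  \tau_w \;=\; \frac{2\, n_{\mathrm{oil}}\, \mu\, \Delta x}{\lambda_0\, t},
\]
% i.e. the measured fringe spacing \Delta x grows in proportion to the local skin
% friction \tau_w (\mu: oil viscosity, \lambda_0: wavelength of the imaging light).
```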
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2008-08-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
Development of a CCTV system for welder training and monitoring of Space Shuttle Main Engine welds
NASA Technical Reports Server (NTRS)
Gordon, S. S.; Flanigan, L. A.; Dyer, G. E.
1987-01-01
A Weld Operator's Remote Monitoring System (WORMS) for remote viewing of manual and automatic GTA welds has been developed for use in Space Shuttle Main Engine (SSME) manufacturing. This system utilizes fiberoptics to transmit images from a receiving lens to a small closed-circuit television (CCTV) camera. The camera converts the image to an electronic signal, which is sent to a videotape recorder (VTR) and a monitor. The overall intent of this system is to provide a clearer, more detailed view of welds than is available by direct observation. This system has six primary areas of application: (1) welder training; (2) viewing of joint penetration; (3) viewing visually inaccessible welds; (4) quality control and quality assurance; (5) remote joint tracking and adjustment of variables in machine welds; and (6) welding research and development. This paper describes WORMS and how it applies to each application listed.
Imaging characteristics of photogrammetric camera systems
Welch, R.; Halliday, J.
1973-01-01
In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were assessed, yielding procedures for analyzing image quality and for predicting and comparing performance capabilities. © 1973.
P91-1 ARGOS spacecraft thermal control
NASA Astrophysics Data System (ADS)
Sadunas, Jonas; Baginski, Ben; McCarthy, Daniel
1993-07-01
The P91-1, or ARGOS, is a Department of Defense (DOD) funded Space Test Program (STP) satellite managed by the Space and Missile Systems Center Space and Small Launch Vehicle Programs Office (SMC/CUL). Rockwell International Space Systems Division is the space vehicle prime contractor. The P91-1 mission is to fly a suite of eight experiments in a 450 nautical mile sun-synchronous orbit dedicated to three-dimensional UV imaging of the ionosphere, X-ray source mapping, navigation, space debris characterization, performance characterization of high-temperature superconductivity RF devices, and on-orbit demonstration of an electrical propulsion system. The primary purpose of this paper is to acquaint the thermal control community, and potential future follow-on mission users, with the thermal control characteristics of the spacecraft, experiment/SV thermal integration aspects, and test verification plans.
2012-11-08
S48-E-013 (15 Sept 1991) --- The Upper Atmosphere Research Satellite (UARS) in the payload bay of the earth-orbiting Discovery. UARS is scheduled for deploy on flight day three of the STS-48 mission. Data from UARS will enable scientists to study ozone depletion in the stratosphere, or upper atmosphere. This image was transmitted by the Electronic Still Camera (ESC), Development Test Objective (DTO) 648. The ESC is making its initial appearance on a Space Shuttle flight. Electronic still photography is a new technology that enables a camera to electronically capture and digitize an image with resolution approaching film quality. The digital image is stored on removable hard disks or small optical disks, and can be converted to a format suitable for downlink transmission or enhanced using image processing software. The Electronic Still Camera (ESC) was developed by the Man-Systems Division at the Johnson Space Center and is the first model in a planned evolutionary development leading to a family of high-resolution digital imaging devices. H. Don Yeates, JSC's Man-Systems Division, is program manager for the ESC. THIS IS A SECOND GENERATION PRINT MADE FROM AN ELECTRONICALLY PRODUCED NEGATIVE.
2014-07-02
Date acquired: May 05, 2014 Today's color image features both Mercury's terminator and limb. The terminator is the striking separation of night and day on Mercury. It is seen in this image with the change from dark, on the left of the image, to light. Mercury's limb is also captured, as we can see the edge between sunlit Mercury and space. The MESSENGER spacecraft is the first ever to orbit the planet Mercury, and the spacecraft's seven scientific instruments and radio science investigation are unraveling the history and evolution of the Solar System's innermost planet. During the first two years of orbital operations, MESSENGER acquired over 150,000 images and extensive other data sets. MESSENGER is capable of continuing orbital operations until early 2015. Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington
AlDahlawi, Ismail; Prasad, Dheerendra; Podgorsak, Matthew B
2017-05-01
The Gamma Knife Icon comes with an integrated cone-beam CT (CBCT) for image-guided stereotactic treatment deliveries. The CBCT can be used for defining the Leksell stereotactic space from imaging without the need for the traditional invasive frame system, which also allows for frameless thermoplastic-mask stereotactic treatments (single or fractionated) with the Gamma Knife unit. In this study, we used an in-house built marker tool to evaluate the stability of the CBCT-based stereotactic space and its agreement with the standard frame-based stereotactic space. We imaged the tool with a CT indicator box using our CT simulator at the beginning, middle, and end of the study period (6 weeks) to determine the frame-based stereotactic space. The tool was also scanned with the Icon's CBCT on a daily basis throughout the study period, and the CBCT images were used to determine the CBCT-based stereotactic space. The coordinates of each marker were determined in each CT and CBCT scan using the Leksell GammaPlan treatment planning software. The magnitudes of the vector difference between the means of each marker in frame-based and CBCT-based stereotactic space ranged from 0.21 to 0.33 mm, indicating good agreement of the CBCT-based and frame-based stereotactic space definitions. Scanning 4 months later showed good prolonged stability of the CBCT-based stereotactic space definition. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
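The reported agreement metric can be computed directly; a minimal sketch, assuming the marker coordinates have already been exported from the planning software into arrays:

```python
import numpy as np

def marker_agreement(frame_coords, cbct_coords):
    """Magnitude of the vector difference between the mean position of each
    marker in frame-based vs. CBCT-based stereotactic coordinates (mm).

    frame_coords, cbct_coords: arrays of shape (n_scans, n_markers, 3)."""
    diff = frame_coords.mean(axis=0) - cbct_coords.mean(axis=0)   # (n_markers, 3)
    return np.linalg.norm(diff, axis=1)                           # one value per marker
```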
Names-to-Mars Chip for InSight Spacecraft
2015-12-17
The dime-size microchip in this close-up image carries 826,923 names that will go to Mars on NASA InSight lander. The image was taken in November 2015 inside a clean room at Lockheed Martin Space Systems, Denver, where the lander was built.
a Novel Approach to Camera Calibration Method for Smart Phones Under Road Environment
NASA Astrophysics Data System (ADS)
Lee, Bijun; Zhou, Jian; Ye, Maosheng; Guo, Yuan
2016-06-01
Monocular vision-based lane departure warning systems have been increasingly used in advanced driver assistance systems (ADAS). Using lane marker detection and identification, we propose an automatic and efficient camera calibration method for smart phones. First, we detect lane marker features in perspective space and calculate the edges of the lane markers in image sequences. Second, because the widths of lane markers and road lanes are fixed under the standard structural road environment, we can automatically build a transformation matrix between perspective space and 3D space and obtain a local map in the vehicle coordinate system. In order to verify the validity of this method, we installed a smart phone in the 'Tuzhi' self-driving car of Wuhan University and recorded more than 100 km of image data on roads in Wuhan. According to the results, the calculated positions of the lane markers are accurate enough for the self-driving car to run smoothly on the road.
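A minimal sketch of the perspective-to-road-plane mapping using OpenCV; the image points, marker dimensions, and vehicle-frame coordinates below are hypothetical placeholders, not values from the paper.

```python
import numpy as np
import cv2

# Four image-space corners of one detected lane-marker segment (pixels) and their
# assumed metric positions on the road plane in vehicle coordinates (metres),
# based on a standard marker width/length. All values are illustrative.
img_pts = np.float32([[612, 540], [668, 540], [700, 700], [580, 700]])
road_pts = np.float32([[0.0, 10.0], [0.15, 10.0], [0.15, 6.0], [0.0, 6.0]])

M = cv2.getPerspectiveTransform(img_pts, road_pts)   # perspective -> road plane

def image_to_vehicle(points_px):
    """Map pixel coordinates to vehicle-frame road coordinates (metres)."""
    pts = np.float32(points_px).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, M).reshape(-1, 2)
```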
An intelligent space for mobile robot localization using a multi-camera system.
Rampinelli, Mariana; Covre, Vitor Buback; de Queiroz, Felippe Mendonça; Vassallo, Raquel Frizera; Bastos-Filho, Teodiano Freire; Mazo, Manuel
2014-08-15
This paper describes an intelligent space, whose objective is to localize and control robots or robotic wheelchairs to help people. Such an intelligent space has 11 cameras distributed in two laboratories and a corridor. The cameras are fixed in the environment, and image capturing is done synchronously. The system was programmed as a client/server with TCP/IP connections, and a communication protocol was defined. The client coordinates the activities inside the intelligent space, and the servers provide the information needed for that. Once the cameras are used for localization, they have to be properly calibrated. Therefore, a calibration method for a multi-camera network is also proposed in this paper. A robot is used to move a calibration pattern throughout the field of view of the cameras. Then, the captured images and the robot odometry are used for calibration. As a result, the proposed algorithm provides a solution for multi-camera calibration and robot localization at the same time. The intelligent space and the calibration method were evaluated under different scenarios using computer simulations and real experiments. The results demonstrate the proper functioning of the intelligent space and validate the multi-camera calibration method, which also improves robot localization.
Massively parallel information processing systems for space applications
NASA Technical Reports Server (NTRS)
Schaefer, D. H.
1979-01-01
NASA is developing massively parallel systems for ultra-high-speed processing of digital image data collected by satellite-borne instrumentation. Such systems contain thousands of processing elements. Work is underway on the design and fabrication of the 'Massively Parallel Processor', a ground computer containing 16,384 processing elements arranged in a 128 x 128 array. This computer uses existing technology. Advanced work includes the development of semiconductor chips containing thousands of feedthrough paths. Massively parallel image analog-to-digital conversion technology is also being developed. The goal is to provide compact computers suitable for real-time onboard processing of images.
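As a loose analogy for the 128 x 128 array of processing elements (vectorized NumPy standing in for true SIMD hardware), the sketch below applies the same instruction to every pixel of a 128 x 128 image tile at once; the operations and data are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
tile = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)   # one image tile

# One "SIMD step": every element compares its pixel against a threshold.
bright_mask = tile > 200

# Another step: every element averages its 4-connected neighbours.
padded = np.pad(tile.astype(np.float32), 1, mode="edge")
smoothed = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0

print(int(bright_mask.sum()), "pixels flagged;", smoothed.shape, "tile smoothed")
```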
Reduce Fluid Experiment System: Flight data from the IML-1 Mission
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Harper, Sabrina
1995-01-01
Processing and data reduction of holographic images from the International Microgravity Laboratory 1 (IML-1) present some interesting challenges in determining the effects of microgravity on crystal growth processes. Use of several processing techniques, including the Computerized Holographic Image Processing System and the Software Development Package (SDP-151), will provide fundamental information for holographic and schlieren analysis of the space flight data.
NASA Technical Reports Server (NTRS)
Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy R.; Martin, Richard E.
2013-01-01
Additive manufacturing is a rapidly growing field in which 3-dimensional parts are produced layer by layer. NASA's electron beam free-form fabrication (EBF³) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF³ technology are weight savings to support space missions, rapid prototyping in a zero-gravity environment, and improved vehicle readiness. The EBF³ system is composed of 3 main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam is used to melt the wire, and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality weld, a near-infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for weld assessment metrics.
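A hedged sketch of one common way to calibrate a NIR camera for temperature measurement (the paper's actual procedure may differ): fit a smooth curve relating blackbody setpoint temperatures to recorded camera counts, then apply the inverse mapping to melt-pool images. All temperatures and count values below are invented for illustration.

```python
import numpy as np

# Hypothetical calibration points: blackbody temperatures (deg C) and the mean
# NIR camera counts recorded at each setpoint.
temps_c = np.array([900.0, 1000.0, 1100.0, 1200.0, 1300.0, 1400.0])
counts = np.array([410.0, 980.0, 2100.0, 4100.0, 7300.0, 12200.0])

# Radiance grows steeply with temperature, so fit temperature against
# log(counts); a low-order polynomial is usually adequate over a narrow band.
coeffs = np.polyfit(np.log(counts), temps_c, deg=2)
counts_to_temp = np.poly1d(coeffs)

# Apply the calibration to a melt-pool image (synthetic here).
rng = np.random.default_rng(2)
image_counts = rng.uniform(500, 12000, size=(240, 320))
temperature_map = counts_to_temp(np.log(image_counts))
print("peak apparent temperature (deg C):", round(float(temperature_map.max()), 1))
```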
NASA Technical Reports Server (NTRS)
1995-01-01
Intelligent Vision Systems, Inc. (InVision) needed image acquisition technology that was reliable in bad weather for its TDS-200 Traffic Detection System. InVision researchers used information from NASA Tech Briefs and assistance from Johnson Space Center to finish the system. The NASA technology used was developed for Earth-observing imaging satellites: charge coupled devices, in which silicon chips convert light directly into electronic or digital images. The TDS-200 consists of sensors mounted above traffic on poles or span wires, enabling two sensors to view an intersection; a "swing and sway" feature to compensate for movement of the sensors; a combination of electronic shutter and gain control; and sensor output to an image digital signal processor, still frame video and optionally live video.
NASA Astrophysics Data System (ADS)
Blois, G.; Sambrook Smith, G.; Best, J.; Hardy, R.; Lead, J.
2008-12-01
Most natural rivers have beds of loose, cohesionless sediment that form a porous bed, thus permitting significant interactions between the free flow above the bed and that within the pore spaces. Many unresolved problems in channel engineering and ecohydraulics are related to an incomplete understanding of this interstitial flow. For example, the mechanisms of pollutant transport and prediction of river bed morphodynamics may be strongly influenced by flow occurring within the pore spaces. While this lack of understanding has been widely acknowledged, the direct experimental investigation of flow within the pore spaces has been restricted by the practical difficulties in collecting such data. This has also created drawbacks in the numerical modeling of pore flow, as there remains a dearth of robust experimental data with which to validate such models. In order to help address these issues, we present details of a new endoscopic PIV system designed to tackle some of the challenges highlighted above. The work presented in this paper is also being used to validate a numerical model that is being developed as part of this project. A fully endoscopic PIV system has been developed to collect velocity and turbulence data for flow within the pore space of a gravel bed. The system comprises a pulsed Nd:YAG laser that provides high-intensity illumination for single-exposure pairs of images on a high-resolution digital camera. The use of rigid endoscopes for both the laser light source and camera allows measurement of quasi-instantaneous flow fields by high-resolution PIV images (2352 × 1728 pixels). In the first instance, the endoscopic PIV system has been used to study flow within an artificial pore space model constructed from 38 and 51 mm diameter spheres, used to represent a simplified version of a natural gravel-bed river. A cross-correlation processing approach has been applied to the PIV images, and the processing parameters have been optimized for the experimental conditions. A series of instantaneous two-dimensional flow fields in a simple pore space has been reconstructed, permitting quantification of the mean flow. An asymmetric flow structure was observed, showing the strong dependence of flow on the bed geometry and the presence of the free surface. Preliminary results are discussed here in order to highlight the critical aspects of the technique. Illumination from the laser endoscope must be optimized in terms of angle of divergence, uniformity, and stability, since any source of irregular illumination causes strong reflections from the surface of the spheres, resulting in saturation of large image areas. The preliminary results obtained demonstrate the utility of the fully endoscopic PIV technique for investigation of flow structure in pore spaces. Further developments of the technique will include improving light uniformity, removing reflections from images, and increasing the illuminated portion of the pore space area.
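The cross-correlation processing step mentioned above can be illustrated with a minimal FFT-based sketch that recovers the pixel displacement between two interrogation windows; the window size and synthetic texture are assumptions, not the experimental settings.

```python
import numpy as np

def window_displacement(win_a, win_b):
    """Pixel displacement between two interrogation windows via FFT
    cross-correlation (peak location relative to the window centre)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    centre = np.array(corr.shape) // 2
    return np.array(peak) - centre          # (dy, dx) in pixels

# Synthetic check: shift a random texture by (3, -2) pixels between exposures.
rng = np.random.default_rng(3)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))
print("estimated displacement (dy, dx):", window_displacement(frame_b, frame_a))
```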
2015-04-20
Every day of every year, NASA satellites provide useful data about our home planet, and along the way, some beautiful images as well. This video includes satellite images of Earth in 2014 from NASA and its partners, as well as photos and a time-lapse video from the International Space Station. We have also included a range of data visualizations, model runs, and a conceptual animation that were produced in 2014 (though in some cases they use data from earlier years). Credit: NASA's Goddard Space Flight Center. NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.
Optical ranked-order filtering using threshold decomposition
Allebach, J.P.; Ochoa, E.; Sweeney, D.W.
1987-10-09
A hybrid optical/electronic system performs median filtering and related ranked-order operations using threshold decomposition to encode the image. Threshold decomposition transforms the nonlinear neighborhood ranking operation into a linear space-invariant filtering step followed by a point-to-point threshold comparison step. Spatial multiplexing allows parallel processing of all the threshold components as well as recombination by a second linear, space-invariant filtering step. An incoherent optical correlation system performs the linear filtering, using a magneto-optic spatial light modulator as the input device and a computer-generated hologram in the filter plane. Thresholding is done electronically. By adjusting the value of the threshold, the same architecture is used to perform median, minimum, and maximum filtering of images. A totally optical system is also disclosed. 3 figs.
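A small sketch of the threshold-decomposition equivalence exploited by the architecture, implemented digitally for clarity: each binary threshold slice is passed through the same linear, space-invariant (averaging) filter, re-thresholded point-wise, and the slices are summed, reproducing the direct median filter. The 3 x 3 window and synthetic image are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter, median_filter

def median_by_threshold_decomposition(img, size=3):
    """Median filtering via threshold decomposition: binary slices, one linear
    space-invariant filter per slice, a point-wise threshold, then a sum."""
    img = img.astype(np.int64)
    out = np.zeros_like(img)
    for t in range(1, int(img.max()) + 1):
        slice_t = (img >= t).astype(np.float64)            # binary decomposition
        local_mean = uniform_filter(slice_t, size=size)    # linear, space-invariant
        out += (local_mean >= 0.5).astype(np.int64)        # point-wise threshold
    return out

rng = np.random.default_rng(4)
image = rng.integers(0, 16, size=(64, 64))                 # few gray levels keep the loop short
assert np.array_equal(median_by_threshold_decomposition(image),
                      median_filter(image, size=3, mode="reflect"))
print("threshold-decomposition result matches direct median filtering")
```

Swapping the 0.5 threshold for values near 0 or 1 turns the same structure into the maximum or minimum filter, mirroring the adjustable-threshold behaviour described in the abstract.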
Wavefront sensing in space: flight demonstration II of the PICTURE sounding rocket payload
NASA Astrophysics Data System (ADS)
Douglas, Ewan S.; Mendillo, Christopher B.; Cook, Timothy A.; Cahoy, Kerri L.; Chakrabarti, Supriya
2018-01-01
A NASA sounding rocket for high-contrast imaging with a visible nulling coronagraph, the Planet Imaging Concept Testbed Using a Rocket Experiment (PICTURE) payload, has made two suborbital attempts to observe the warm dust disk inferred around Epsilon Eridani. The first flight, in 2011, demonstrated a 5 mas fine pointing system in space. The reduced flight data from the second launch, on November 25, 2015, presented herein, demonstrate active sensing of wavefront phase in space. Despite several anomalies in flight, post facto reduction of the phase-stepping interferometer data provides insight into the wavefront sensing precision and the system stability for a portion of the pupil. These measurements show the actuation of a 32 × 32-actuator microelectromechanical system deformable mirror. The wavefront sensor reached a median precision of 1.4 nm per pixel, with 95% of samples between 0.8 and 12.0 nm per pixel. The median system stability, including telescope and coronagraph wavefront errors other than tip, tilt, and piston, was 3.6 nm per pixel, with 95% of samples between 1.2 and 23.7 nm per pixel.
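The summary statistics quoted above (the median and the 2.5-97.5 percentile range of per-pixel values) can be reproduced from any sample array as in the short sketch below; the synthetic values are placeholders for the flight data.

```python
import numpy as np

# Illustrative only: per-pixel wavefront-error samples in nanometres
# (shape: n_samples x n_pixels); real values come from the phase-stepping data.
rng = np.random.default_rng(5)
samples_nm = np.abs(rng.normal(0.0, 3.0, size=(500, 1024)))

median_nm = np.median(samples_nm)
lo_nm, hi_nm = np.percentile(samples_nm, [2.5, 97.5])
print(f"median {median_nm:.1f} nm; 95% of samples between {lo_nm:.1f} and {hi_nm:.1f} nm")
```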
MS Hadfield and MS Parazynski raise the SSRMS from the SLP during an EVA for STS-100
2001-04-22
STS100-714-015 (22 April 2001) --- Astronauts Scott E. Parazynski (center frame) and Chris A. Hadfield (partially obscured) prepare to unpack the new Space Station Remote Manipulator System (SSRMS) or Canadarm2 during the first of two STS-100 space walks. Hadfield represents the Canadian Space Agency (CSA). The image was exposed with a 70mm camera from inside the Space Shuttle Endeavour's crew cabin.
NASA Technical Reports Server (NTRS)
Rice, R. F.; Hilbert, E. E. (Inventor)
1976-01-01
A space communication system incorporating a concatenated Reed-Solomon/Viterbi coding channel is discussed for transmitting compressed and uncompressed data from a spacecraft to a data processing center on Earth. Imaging (and other) data are first compressed into source blocks, which are then coded by a Reed-Solomon coder and interleaver, followed by a convolutional encoder. The received data are first decoded by a Viterbi decoder, followed by a Reed-Solomon decoder and deinterleaver. The output of the latter is then decompressed, based on the compression criteria used in compressing the data in the spacecraft. The decompressed data are processed to reconstruct an approximation of the original data-producing condition or images.
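Two pieces of the concatenated channel can be sketched compactly: the block interleaver that spreads channel error bursts across Reed-Solomon codewords, and a toy rate-1/2, constraint-length-3 convolutional encoder. The interleaver depth, the generators (7 and 5 octal), and the data below are illustrative rather than the system's actual parameters; the Reed-Solomon coder and Viterbi decoder are omitted.

```python
import numpy as np

def interleave(symbols, depth):
    """Write symbols row-wise into a depth x n matrix and read column-wise,
    so a burst of channel errors is spread across several RS codewords."""
    return np.asarray(symbols).reshape(depth, -1).T.ravel()

def deinterleave(symbols, depth):
    return np.asarray(symbols).reshape(-1, depth).T.ravel()

def conv_encode(bits):
    """Toy rate-1/2, K=3 convolutional encoder: two output bits per input bit."""
    state = [0, 0]
    out = []
    for b in bits:
        out.append(b ^ state[0] ^ state[1])   # generator 111 (7 octal)
        out.append(b ^ state[1])              # generator 101 (5 octal)
        state = [b, state[0]]
    return out

codewords = np.arange(20)                     # stand-in for RS-coded symbols
assert np.array_equal(deinterleave(interleave(codewords, depth=4), depth=4), codewords)
print(conv_encode([1, 0, 1, 1]))              # channel symbols from the inner code
```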
OrbView-3 Initial On-Orbit Characterization
NASA Technical Reports Server (NTRS)
Ross, Kent; Blonski, Slawomir; Holekamp, Kara; Pagnutti, Mary; Zanoni, Vicki; Carver, David; Fendley, Debbie; Smith, Charles
2004-01-01
NASA at Stennis Space Center (SSC) established a Space Act Agreement with Orbital Sciences Corporation (OSC) and ORBIMAGE Inc. to collaborate on the characterization of the OrbView-3 system and its imagery products and to develop characterization techniques further. In accordance with the agreement, NASA performed an independent radiometric, spatial, and geopositional accuracy assessment of OrbView-3 imagery acquired before completion of the system's initial on-orbit checkout. OSC acquired OrbView-3 imagery over SSC from July 2003 through January 2004, and NASA collected ground reference information coincident with many of these acquisitions. After evaluating all acquisitions, NASA deemed two multispectral images and five panchromatic images useful for characterization. NASA then performed radiometric, spatial, and geopositional characterizations.
NASA Technical Reports Server (NTRS)
Henderson, F. B. (Editor); Rock, B. N. (Editor)
1983-01-01
Consideration is given to: the applications of near-infrared spectroscopy to geological reconnaissance and exploration from space; imaging systems for identifying the spectral properties of geological materials in the visible and near-infrared; and Thematic Mapper (TM) data analysis. Consideration is also given to descriptions of individual geological remote sensing systems, including: GEO-SPAS; SPOT; the Thermal Infrared Multispectral Scanner (TIMS); and the Shuttle Imaging Radars A and B (SIR-A and SIR-B). Additional topics include: the importance of geobotany in geological remote sensing; achromatic holographic stereograms from Landsat MSS data; and the availability and applications of NOAA's non-Landsat satellite data archive.
Yin, X X; Ng, B W-H; Ramamohanarao, K; Baghai-Wadji, A; Abbott, D
2012-09-01
It has been shown that magnetic resonance images (MRIs) with a sparse representation in a transformed domain, e.g. spatial finite differences (FD) or the discrete cosine transform (DCT), can be restored from undersampled k-space by applying current compressive sampling theory. This paper presents a model-based method for the restoration of MRIs. The reduced-order model, in which a full system response is projected onto a subspace of lower dimensionality, has been used to accelerate image reconstruction by reducing the size of the involved linear system. In this paper, the singular value threshold (SVT) technique is applied as a denoising scheme to reduce and select the model order of the inverse Fourier transform image, and to restore multi-slice breast MRIs that have been compressively sampled in k-space. The restored MRIs with SVT denoising show reduced sampling errors compared to direct MRI restoration via spatial FD or DCT. Compressive sampling is a technique for finding sparse solutions to underdetermined linear systems. The sparsity implicit in MRIs is exploited to reconstruct the image from significantly undersampled k-space. The challenge, however, is that the random undersampling introduces incoherent artifacts, adding noise-like interference to the sparsely represented image. Recovery algorithms in the literature are not capable of fully removing these artifacts, so it is necessary to introduce a denoising procedure to improve the quality of image recovery. This paper applies a singular value threshold algorithm to reduce the model order of the image basis functions, which allows further improvement of the quality of image reconstruction with removal of noise artifacts. The principle of the denoising scheme is to reconstruct the sparse MRI matrices optimally with a lower rank by selecting a smaller number of dominant singular values. The singular value threshold algorithm is performed by minimizing the nuclear norm of the difference between the sampled image and the recovered image. It is shown that this algorithm improves the ability of previous image reconstruction algorithms to remove noise artifacts while significantly improving the quality of MRI recovery.
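A minimal sketch of the singular value thresholding step only (not the full reconstruction pipeline): soft-threshold the singular values of a matrix whose low-rank part stands in for the image content and whose residual stands in for incoherent undersampling artifacts. The matrix sizes, rank, and threshold below are assumptions for illustration.

```python
import numpy as np

def svt(matrix, tau):
    """Singular value thresholding: soft-threshold the singular values by tau,
    which both denoises and reduces the effective model order (rank)."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    rank = int(np.count_nonzero(s_shrunk))
    return (u[:, :rank] * s_shrunk[:rank]) @ vt[:rank, :], rank

# Synthetic stand-in for a zero-filled reconstruction from undersampled k-space:
# a low-rank "image" plus incoherent, noise-like aliasing artifacts.
rng = np.random.default_rng(6)
clean = rng.standard_normal((128, 8)) @ rng.standard_normal((8, 128))   # rank 8
noisy = clean + 0.5 * rng.standard_normal((128, 128))

denoised, kept_rank = svt(noisy, tau=10.0)
err_before = np.linalg.norm(noisy - clean) / np.linalg.norm(clean)
err_after = np.linalg.norm(denoised - clean) / np.linalg.norm(clean)
print(f"kept rank {kept_rank}; relative error {err_before:.3f} -> {err_after:.3f}")
```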
The HR 4796A Debris System: Discovery of Extensive Exo-ring Dust Material
NASA Astrophysics Data System (ADS)
Schneider, Glenn; Debes, John H.; Grady, Carol A.; Gáspár, Andras; Henning, Thomas; Hines, Dean C.; Kuchner, Marc J.; Perrin, Marshall; Wisniewski, John P.
2018-02-01
The optically and IR-bright and starlight-scattering HR 4796A ringlike debris disk is one of the most- (and best-) studied exoplanetary debris systems. The presence of a yet-undetected planet has been inferred (or suggested) from the narrow width and inner/outer truncation radii of its r = 1.″05 (77 au) debris ring. We present new, highly sensitive Hubble Space Telescope (HST) visible-light images of the HR 4796A circumstellar debris system and its environment over a very wide range of stellocentric angles from 0.″32 (23 au) to ≈15″ (1100 au). These very high-contrast images were obtained with the Space Telescope Imaging Spectrograph (STIS) using six-roll PSF template–subtracted coronagraphy suppressing the primary light of HR 4796A, with three image-plane occulters, and simultaneously subtracting the background light from its close angular proximity M2.5V companion. The resulting images unambiguously reveal the debris ring embedded within a much larger, morphologically complex, and biaxially asymmetric exo-ring scattering structure. These images at visible wavelengths are sensitive to and map the spatial distribution, brightness, and radial surface density of micron-size particles over 5 dex in surface brightness. These particles in the exo-ring environment may be unbound from the system and interacting with the local ISM. Herein, we present a new morphological and photometric view of the larger-than-prior-seen HR 4796A exoplanetary debris system with sensitivity to small particles at stellocentric distances an order of magnitude greater than has previously been observed.
Unconventional imaging with contained granular media
NASA Astrophysics Data System (ADS)
Quadrelli, Marco B.; Basinger, Scott; Sidick, Erkin
2017-09-01
Typically, the cost of a space-borne imaging system is driven by the size and mass of the primary aperture. The solution that we propose constructs an imaging system in space in which the nonlinear optical properties of a cloud of micron-sized particles, shaped into a specific surface by electromagnetic means, allow one to form a very large and lightweight aperture for an optical system, hence reducing overall mass and cost. Recent work at JPL has investigated the feasibility of a granular imaging system, concluding that such a system could be built and controlled in orbit. We conducted experiments and simulations of the optical response of a granular lens. In all cases, the optical response, measured by the modulation transfer function (MTF), of hexagonal reflectors was closely comparable to that of a conventional spherical mirror. We conducted further analyses by evaluating the sensitivity to fill factor and grain shape, and found a marked sensitivity to fill factor but no sensitivity to grain shape. We also found that at fill factors as low as 30%, the reflection from a granular lens is still excellent. Furthermore, we replaced the monolithic primary mirror in an existing integrated model of an optical system (the WFIRST coronagraph) with a granular lens, and found that a granular lens useful for exoplanet detection provides excellent contrast levels. We present our testbed and simulation results in this paper.
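The MTF evaluation used as the figure of merit above can be sketched numerically: form an incoherent PSF from a pupil mask, take the magnitude of its Fourier transform, and normalise. The toy comparison of a filled versus a roughly 30%-filled random aperture below illustrates the method and the fill-factor sensitivity; it is not a reproduction of the JPL models.

```python
import numpy as np

def mtf_from_psf(psf):
    """Modulation transfer function as the normalised magnitude of the
    optical transfer function (the Fourier transform of the PSF)."""
    psf = psf / psf.sum()
    otf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf)))
    return np.abs(otf) / np.abs(otf).max()

# Toy comparison: a filled circular aperture versus a sparsely filled
# ("granular") aperture with the same outer diameter.
n = 256
y, x = np.indices((n, n)) - n // 2
full_pupil = (x**2 + y**2 <= 100**2).astype(float)
rng = np.random.default_rng(7)
granular_pupil = full_pupil * (rng.random((n, n)) < 0.3)   # ~30% fill factor

for name, pupil in [("full", full_pupil), ("granular", granular_pupil)]:
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2    # incoherent PSF
    mtf = mtf_from_psf(psf)
    print(name, "MTF at a mid spatial frequency:", round(float(mtf[n // 2, n // 2 + 50]), 3))
```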
Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)
2001-01-01
The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes the uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.
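A generic stand-in for the fuzzy preprocessing and segmentation stages (the paper's actual membership functions and rules are not described here): map intensities to memberships, apply the classic contrast-intensification operator, and threshold to obtain candidate defect regions for a downstream ANN/GA classifier. The image, crossover point, and threshold are illustrative assumptions.

```python
import numpy as np

def fuzzy_enhance(gray, crossover=0.5, passes=2):
    """Fuzzy contrast intensification: map intensities to memberships in [0, 1]
    and repeatedly apply the INT operator about the crossover point."""
    mu = np.clip(gray.astype(np.float64) / 255.0, 0.0, 1.0)
    for _ in range(passes):
        low = mu < crossover
        mu = np.where(low, 2.0 * mu**2, 1.0 - 2.0 * (1.0 - mu)**2)
    return mu

def segment(gray, threshold=0.5):
    """Flag pixels whose enhanced membership exceeds a threshold as candidate
    anomaly/defect regions for the learned classifier."""
    return fuzzy_enhance(gray) > threshold

rng = np.random.default_rng(8)
frame = rng.integers(0, 120, size=(64, 64)).astype(np.uint8)   # dim background
frame[20:24, 10:50] = 230                                      # bright streak = "defect"
mask = segment(frame)
print("candidate defect pixels:", int(mask.sum()))
```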
NASA Technical Reports Server (NTRS)
Mckee, James W.
1988-01-01
This final report describes the accomplishments of the General Purpose Intelligent Sensor Interface task of the Applications of Artificial Intelligence to Space Station grant for the period from October 1, 1987 through September 30, 1988. Portions of the First Biannual Report not revised will not be included but only referenced. The goal is to develop an intelligent sensor system that will simplify the design and development of expert systems using sensors of the physical phenomena as a source of data. This research will concentrate on the integration of image processing sensors and voice processing sensors with a computer designed for expert system development. The result of this research will be the design and documentation of a system in which the user will not need to be an expert in such areas as image processing algorithms, local area networks, image processor hardware selection or interfacing, television camera selection, voice recognition hardware selection, or analog signal processing. The user will be able to access data from video or voice sensors through standard LISP statements without any need to know about the sensor hardware or software.
The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox
NASA Astrophysics Data System (ADS)
Harris, A. T., III; Goodman, J.; Justice, B.
2014-12-01
As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.