Sample records for airborne real-time imaging

  1. Integrated micro-optofluidic platform for real-time detection of airborne microorganisms

    NASA Astrophysics Data System (ADS)

    Choi, Jeongan; Kang, Miran; Jung, Jae Hee

    2015-11-01

    We demonstrate an integrated micro-optofluidic platform for real-time, continuous detection and quantification of airborne microorganisms. Measurements of the fluorescence and light scattering from single particles in a microfluidic channel are used to determine the total particle number concentration and the microorganism number concentration in real-time. The system performance is examined by evaluating standard particle measurements with various sample flow rates and the ratios of fluorescent to non-fluorescent particles. To apply this method to real-time detection of airborne microorganisms, airborne Escherichia coli, Bacillus subtilis, and Staphylococcus epidermidis cells were introduced into the micro-optofluidic platform via bioaerosol generation, and a liquid-type particle collection setup was used. We demonstrate successful discrimination of SYTO82-dyed fluorescent bacterial cells from other residue particles in a continuous and real-time manner. In comparison with traditional microscopy cell counting and colony culture methods, this micro-optofluidic platform is not only more accurate in terms of the detection efficiency for airborne microorganisms but it also provides additional information on the total particle number concentration.
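
    A minimal sketch (not the authors' code) of how number concentrations can be derived from such single-particle measurements: each scattering event is treated as one particle, a fluorescence peak above a threshold marks a dyed microorganism, and counts are normalized by the sampled liquid volume. The threshold, flow rate, and field names below are illustrative assumptions.

    ```python
    # Hypothetical sketch: number concentrations from single-particle events.
    # Each scattering event = one particle; fluorescence above a threshold marks
    # a dyed microorganism. Threshold, flow rate, and field names are invented.
    def number_concentrations(events, flow_rate_ul_min, duration_s, fluor_threshold=50.0):
        """events: list of dicts with 'scatter' and 'fluorescence' peak heights."""
        sampled_volume_ml = (flow_rate_ul_min / 1000.0) * (duration_s / 60.0)
        total = len(events)
        fluorescent = sum(1 for e in events if e["fluorescence"] > fluor_threshold)
        return {
            "total_per_ml": total / sampled_volume_ml,
            "microorganisms_per_ml": fluorescent / sampled_volume_ml,
            "fluorescent_fraction": fluorescent / total if total else 0.0,
        }

    # Example: 1200 particles in 60 s at 2 uL/min, 300 above the fluorescence threshold
    demo = [{"scatter": 1.0, "fluorescence": 80.0}] * 300 + \
           [{"scatter": 1.0, "fluorescence": 5.0}] * 900
    print(number_concentrations(demo, flow_rate_ul_min=2.0, duration_s=60.0))
    ```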

  2. Integrated micro-optofluidic platform for real-time detection of airborne microorganisms

    PubMed Central

    Choi, Jeongan; Kang, Miran; Jung, Jae Hee

    2015-01-01

    We demonstrate an integrated micro-optofluidic platform for real-time, continuous detection and quantification of airborne microorganisms. Measurements of the fluorescence and light scattering from single particles in a microfluidic channel are used to determine the total particle number concentration and the microorganism number concentration in real-time. The system performance is examined by evaluating standard particle measurements with various sample flow rates and the ratios of fluorescent to non-fluorescent particles. To apply this method to real-time detection of airborne microorganisms, airborne Escherichia coli, Bacillus subtilis, and Staphylococcus epidermidis cells were introduced into the micro-optofluidic platform via bioaerosol generation, and a liquid-type particle collection setup was used. We demonstrate successful discrimination of SYTO82-dyed fluorescent bacterial cells from other residue particles in a continuous and real-time manner. In comparison with traditional microscopy cell counting and colony culture methods, this micro-optofluidic platform is not only more accurate in terms of the detection efficiency for airborne microorganisms but it also provides additional information on the total particle number concentration. PMID:26522006

  3. The Waypoint Planning Tool: Real Time Flight Planning for Airborne Science

    NASA Technical Reports Server (NTRS)

    He, Yubin; Blakeslee, Richard; Goodman, Michael; Hall, John

    2010-01-01

    NASA Earth science research utilizes both spaceborne and airborne real time observations in the planning and operations of its field campaigns. The coordination of air and space components is critical to achieve the goals and objectives and ensure the success of an experiment. Spaceborne imagery provides regular and continual coverage of the Earth and is a significant component in all NASA field experiments. Real time visible and infrared geostationary images from GOES satellites and multi-spectral data from the many elements of the NASA suite of instruments aboard the TRMM, Terra, Aqua, Aura, and other NASA satellites have become the norm. Similarly, the NASA Airborne Science Program draws upon a rich pool of instrumented aircraft. The NASA McDonnell Douglas DC-8, Lockheed P3 Orion, DeHavilland Twin Otter, King Air B200, and Gulfstream-III are all staples of NASA's well-stocked, versatile hangar. A key component in many field campaigns is coordinating the aircraft with satellite overpasses, other airplanes, and the constantly evolving, dynamic weather conditions. Given the variables involved, developing a good flight plan that meets the objectives of the field experiment can be a challenging and time-consuming task. Planning a research aircraft mission within the context of meeting the science objectives is a complex task because it is much more than flying from point A to B. Flight plans typically consist of flying a series of transects or involve dynamic path changes when "chasing" a hurricane or forest fire. These aircraft flight plans are typically designed by the mission scientists and then verified and implemented by the navigator or pilot. Flight planning can be an arduous task requiring frequent sanity checks by the flight crew. This requires real time situational awareness of the weather conditions that affect the aircraft track. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an

  4. Real-time simulation of an airborne radar for overwater approaches

    NASA Technical Reports Server (NTRS)

    Karmarkar, J.; Clark, D.

    1982-01-01

    Software developed to provide a real time simulation of an airborne radar for overwater approaches to oil rig platforms is documented. The simulation is used to study advanced concepts for enhancement of airborne radar approaches (ARA) in order to reduce crew workload, improve approach tracking precision, and reduce weather minimums. ARAs are currently used for offshore helicopter operations to and from oil rigs.

  5. The Waypoint Planning Tool: Real Time Flight Planning for Airborne Science

    NASA Astrophysics Data System (ADS)

    He, M.; Goodman, H. M.; Blakeslee, R.; Hall, J. M.

    2010-12-01

    NASA Earth science research utilizes both spaceborne and airborne real time observations in the planning and operations of its field campaigns. The coordination of air and space components is critical to achieve the goals and objectives and ensure the success of an experiment. Spaceborne imagery provides regular and continual coverage of the Earth and is a significant component in all NASA field experiments. Real time visible and infrared geostationary images from GOES satellites and multi-spectral data from the many elements of the NASA suite of instruments aboard the TRMM, Terra, Aqua, Aura, and other NASA satellites have become the norm. Similarly, the NASA Airborne Science Program draws upon a rich pool of instrumented aircraft. The NASA McDonnell Douglas DC-8, Lockheed P3 Orion, DeHavilland Twin Otter, King Air B200, and Gulfstream-III are all staples of NASA’s well-stocked, versatile hangar. A key component in many field campaigns is coordinating the aircraft with satellite overpasses, other airplanes, and the constantly evolving, dynamic weather conditions. Given the variables involved, developing a good flight plan that meets the objectives of the field experiment can be a challenging and time-consuming task. Planning a research aircraft mission within the context of meeting the science objectives is a complex task because it is much more than flying from point A to B. Flight plans typically consist of flying a series of transects or involve dynamic path changes when “chasing” a hurricane or forest fire. These aircraft flight plans are typically designed by the mission scientists and then verified and implemented by the navigator or pilot. Flight planning can be an arduous task requiring frequent sanity checks by the flight crew. This requires real time situational awareness of the weather conditions that affect the aircraft track. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool

  6. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    USDA-ARS's Scientific Manuscript database

    Images captured from airborne imaging systems have the advantages of relatively low cost, high spatial resolution, and real/near-real-time availability. Multiple images taken from one or more flight lines could be used to generate a high-resolution mosaic image, which could be useful for diverse rem...
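
    The mosaicking step itself can be illustrated with a common feature-based approach; the sketch below uses OpenCV's high-level Stitcher in scans mode and stands in for, rather than reproduces, the techniques compared in the manuscript. File names are placeholders.

    ```python
    # Illustrative feature-based mosaicking of airborne frames with OpenCV's
    # high-level Stitcher (SCANS mode suits nadir-looking aerial imagery).
    # Generic example only, not the manuscript's compared techniques.
    import cv2

    frames = [cv2.imread(name) for name in ["frame_001.jpg", "frame_002.jpg", "frame_003.jpg"]]
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, mosaic = stitcher.stitch(frames)

    if status == cv2.Stitcher_OK:
        cv2.imwrite("mosaic.jpg", mosaic)
    else:
        print("Stitching failed with status", status)
    ```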

  7. Target detection method by airborne and spaceborne images fusion based on past images

    NASA Astrophysics Data System (ADS)

    Chen, Shanjing; Kang, Qing; Wang, Zhenggang; Shen, ZhiQiang; Pu, Huan; Han, Hao; Gu, Zhongzheng

    2017-11-01

    To address the problems that remote sensing target detection methods make poor use of past remote sensing data over a target area and cannot accurately recognize camouflaged targets, a target detection method based on the fusion of airborne and spaceborne images with past imagery is proposed in this paper. A past spaceborne remote sensing image of the target area is used as the background. The airborne and spaceborne remote sensing data are fused and target features are extracted by registering the airborne and spaceborne images, extracting target change features, suppressing background noise, and extracting artificial target features from the real-time airborne optical remote sensing image. Finally, a support vector machine is used to detect and recognize targets in the fused feature data. The experimental results show that the proposed method combines the change features of the target area in airborne and spaceborne remote sensing images with the target detection algorithm and achieves good detection and recognition performance for both camouflaged and non-camouflaged targets.
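
    The final classification stage can be sketched as follows, under the assumption that registration, change-feature extraction, and background suppression have already produced fused feature vectors; the placeholder arrays and SVM settings below are illustrative, not the authors' implementation.

    ```python
    # Illustrative final stage: an SVM classifier applied to fused
    # airborne/spaceborne feature vectors. The feature arrays and labels are
    # random placeholders standing in for the paper's extracted features.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 16))      # fused change / artificial-target features (placeholder)
    y = rng.integers(0, 2, size=200)    # 1 = target, 0 = background (placeholder labels)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```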

  8. Apparatus for real-time airborne particulate radionuclide collection and analysis

    DOEpatents

    Smart, John E.; Perkins, Richard W.

    2001-01-01

    An improved apparatus for collecting and analyzing an airborne particulate radionuclide having a filter mounted in a housing, the housing having an air inlet upstream of the filter and an air outlet downstream of the filter, wherein an air stream flows therethrough. The air inlet receives the air stream; the filter collects the airborne particulate radionuclide and permits a filtered air stream to pass through the air outlet. The improvement which permits real-time counting is a gamma-detecting germanium diode mounted downstream of the filter in the filtered air stream. The gamma-detecting germanium diode is spaced apart from the downstream side of the filter by a minimum distance that gives substantially maximum counting detection while permitting substantially free air flow through the filter and uniform particulate radionuclide deposition on the filter.

  9. Estimating forest structural characteristics using the airborne LiDAR scanning system and a near-real time profiling laser system

    NASA Astrophysics Data System (ADS)

    Zhao, Kaiguang

    LiDAR (Light Detection and Ranging) directly measures canopy vertical structures and provides an effective remote sensing solution for accurate and spatially explicit mapping of forest characteristics, such as canopy height and Leaf Area Index. However, many factors, such as large data volumes and high costs for data acquisition, preclude the operational and practical use of most currently available LiDARs for frequent and large-scale mapping. At the same time, a growing need is arising for real-time remote sensing platforms, e.g., to provide timely information for urgent applications. This study aims to develop an airborne profiling LiDAR system, featuring on-the-fly data processing, for near-real-time or real-time forest inventory. The development of such a system involves implementing on-board data processing and analysis as well as building useful regression-based models to relate LiDAR measurements to forest biophysical parameters. This work established a paradigm for an on-the-fly airborne profiling LiDAR system to inventory regional forest resources in real or near real time. The system was developed based on an existing portable airborne laser system (PALS) that had previously been assembled at NASA by Dr. Ross Nelson. Key issues in automating PALS as an on-the-fly system were addressed, including the design of an archetype for the system workflow, the development of efficient and robust algorithms for automatic data processing and analysis, the development of effective regression models to predict forest biophysical parameters from LiDAR measurements, and the implementation of an integrated software package incorporating all of the above developments. This work exploited the untapped potential of airborne laser profilers for real-time forest inventory and therefore documented an initial step toward developing airborne-laser-based, on-the-fly, real-time forest inventory systems. Results from this work demonstrated the utility and effectiveness of
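
    The regression-modelling step can be sketched as below: simple canopy-height metrics are computed from a profiling-LiDAR segment and related to a forest parameter by ordinary least squares. The metric choices and linear form are assumptions for illustration, not PALS's actual models.

    ```python
    # Hedged sketch of regression-based estimation: relate profiling-LiDAR
    # canopy-height metrics to a forest biophysical parameter (e.g., biomass).
    # Metrics and model form are illustrative, not the dissertation's models.
    import numpy as np

    def canopy_metrics(height_profile):
        """height_profile: 1-D array of canopy heights (m) along a flight segment."""
        h = np.asarray(height_profile, dtype=float)
        return np.array([h.mean(), np.quantile(h, 0.9), (h > 2.0).mean()])  # mean, p90, cover

    def fit(segments, parameter_values):
        """Least-squares fit of parameter = b0 + b1*mean + b2*p90 + b3*cover."""
        X = np.vstack([np.concatenate(([1.0], canopy_metrics(s))) for s in segments])
        coef, *_ = np.linalg.lstsq(X, np.asarray(parameter_values, float), rcond=None)
        return coef

    def predict(coef, segment):
        return float(coef @ np.concatenate(([1.0], canopy_metrics(segment))))
    ```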

  10. Application of real-time PCR for total airborne bacterial assessment: Comparison with epifluorescence microscopy and culture-dependent methods

    NASA Astrophysics Data System (ADS)

    Rinsoz, Thomas; Duquenne, Philippe; Greff-Mirguet, Guylaine; Oppliger, Anne

    Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and the inability to count non-culturable or non-viable bacteria. Consequently, the quantitative assessment of bioaerosols is often underestimated. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method, which should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load that we obtained from real-time PCR and epifluorescence methods are comparable; however, our analysis of sewage treatment plants indicates that these methods give values 270-290-fold greater than those obtained by the "impaction on nutrient agar" method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning, and simple method that could be used to estimate the airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.

  11. Real-time photo-magnetic imaging.

    PubMed

    Nouizi, Farouk; Erkol, Hakan; Luk, Alex; Unlu, Mehmet B; Gulsen, Gultekin

    2016-10-01

    We previously introduced a new high-resolution diffuse optical imaging modality termed photo-magnetic imaging (PMI). PMI irradiates the object under investigation with near-infrared light and monitors the variations of temperature using magnetic resonance thermometry (MRT). In this paper, we present a real-time PMI image reconstruction algorithm that uses analytic methods to solve the forward problem and assemble the Jacobian matrix much faster. The new algorithm is validated using real MRT-measured temperature maps. In fact, it accelerates the reconstruction process by more than 250 times compared to a single iteration of the FEM-based algorithm, which opens the possibility of real-time PMI.
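
    The kind of update a precomputed Jacobian makes fast can be sketched generically: a Tikhonov-regularized linear step that maps the mismatch between measured and modelled MRT temperature maps to an absorption update. This is a textbook formulation under assumed variable names, not the paper's analytic algorithm.

    ```python
    # Generic linearized reconstruction step enabled by a precomputed Jacobian:
    # solve a regularized normal equation mapping the temperature mismatch to an
    # absorption update. Textbook Tikhonov/Gauss-Newton form, not the paper's code.
    import numpy as np

    def update_absorption(J, measured_T, modelled_T, mu, lam=1e-3):
        """J: (n_meas, n_vox) sensitivity matrix; mu: current absorption estimate."""
        residual = measured_T.ravel() - modelled_T.ravel()
        A = J.T @ J + lam * np.eye(J.shape[1])
        delta = np.linalg.solve(A, J.T @ residual)
        return mu + delta
    ```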

  12. Detection of airborne genetically modified maize pollen by real-time PCR.

    PubMed

    Folloni, Silvia; Kagkli, Dafni-Maria; Rajcevic, Bojan; Guimarães, Nilson C C; Van Droogenbroeck, Bart; Valicente, Fernando H; Van den Eede, Guy; Van den Bulcke, Marc

    2012-09-01

    The cultivation of genetically modified (GM) crops has raised numerous concerns in the European Union and other parts of the world about their environmental and economic impact. Outcrossing of genetically modified organisms (GMO) in particular was from the beginning a critical issue, as airborne pollen has been considered an important route of GMO dispersal. Here, we investigate the use of airborne pollen sampling combined with microscopic analysis and molecular PCR analysis as an approach to monitor GM maize cultivation in a specific area. Field trial experiments in the European Union and South America demonstrated the applicability of the approach under different climate conditions, in rural and semi-urban environments, even at very low levels of airborne pollen. The study documents in detail the sampling of GM pollen, sample DNA extraction, and real-time PCR analysis. Our results suggest that this 'GM pollen monitoring by bioaerosol sampling and PCR screening' approach might represent a useful aid in the surveillance of GM-free areas, centres of origin, and natural reserves. © 2012 Blackwell Publishing Ltd.

  13. Flexible real-time magnetic resonance imaging framework.

    PubMed

    Santos, Juan M; Wright, Graham A; Pauly, John M

    2004-01-01

    The extension of MR imaging to new applications has demonstrated the limitations of the architecture of current real-time systems. Traditional real-time implementations provide continuous acquisition of data and modification of basic sequence parameters on the fly. We have extended the concept of real-time MRI by designing a system that drives the examination from a real-time localizer and is then reconfigured for different imaging modes. Upon operator request or automatic feedback, the system can immediately generate a new pulse sequence or change fundamental aspects of the acquisition such as gradient waveforms, excitation pulses, and scan planes. This framework has been implemented by connecting a data processing and control workstation to a conventional clinical scanner. Key components in the design of this framework are the data communication and control mechanisms, reconstruction algorithms optimized for real-time performance and adaptability, a flexible user interface, and extensible user interaction. In this paper we describe the various components that comprise this system. Applications implemented in this framework include real-time catheter tracking embedded in high-frame-rate real-time imaging and immediate switching between a real-time localizer and high-resolution volume imaging for coronary angiography applications.

  14. Real-time detection of airborne fluorescent bioparticles in Antarctica

    NASA Astrophysics Data System (ADS)

    Crawford, Ian; Gallagher, Martin W.; Bower, Keith N.; Choularton, Thomas W.; Flynn, Michael J.; Ruske, Simon; Listowski, Constantino; Brough, Neil; Lachlan-Cope, Thomas; Fleming, Zoë L.; Foot, Virginia E.; Stanley, Warren R.

    2017-12-01

    We demonstrate, for the first time, continuous real-time observations of airborne bio-fluorescent aerosols recorded at the British Antarctic Survey's Halley VI Research Station, located on the Brunt Ice Shelf close to the Weddell Sea coast (lat 75°34'59'' S, long 26°10'0'' W), during the Antarctic summer of 2015. As part of the NERC MAC (Microphysics of Antarctic Clouds) aircraft aerosol cloud interaction project, observations with a real-time ultraviolet-light-induced fluorescence (UV-LIF) spectrometer were conducted to quantify concentrations of airborne biological particles along with dust particles as a function of wind speed and direction over a 3-week period. Significant, intermittent enhancements of both non- and bio-fluorescent particles were observed to varying degrees in very specific wind directions and during strong wind events. Analysis of the particle UV-induced emission spectra, particle sizes and shapes recorded during these events suggests that the majority of particles were likely a subset of dust with weak fluorescence emission responses. A minor fraction, however, were likely primary biological particles that were very strongly fluorescent, with a subset identified as likely being pollen based on comparison with laboratory data obtained using the same instrument. A strong correlation of bio-fluorescent particles with wind speed was observed in some, but not all, periods. Interestingly, the ratio of fluorescent particles to total particle concentration also increased significantly with wind speed during these events. The enhancement in concentrations of these particles could be interpreted as resuspension from the local ice surface but is more likely due to emissions from distal sources within Antarctica as well as intercontinental transport. Likely distal sources identified by back trajectory analyses and dispersion modelling were the coastal ice margin zones in Halley Bay consisting of bird colonies with likely associated high bacterial

  15. An embedded multi-core parallel model for real-time stereo imaging

    NASA Astrophysics Data System (ADS)

    He, Wenjing; Hu, Jian; Niu, Jingyu; Li, Chuanrong; Liu, Guangyu

    2018-04-01

    Real-time processing based on embedded systems will enhance the application capability of stereo imaging for LiDAR and hyperspectral sensors. Research on task partitioning and scheduling strategies for embedded multiprocessor systems started relatively late compared with that for PC computers. In this paper, aimed at an embedded multi-core processing platform, a parallel model for stereo imaging is studied and verified. After analyzing the computational load, throughput capacity, and buffering requirements, a two-stage pipeline parallel model based on message transmission is established. This model can be applied to fast stereo imaging for airborne sensors with various characteristics. To demonstrate the feasibility and effectiveness of the parallel model, parallel software was designed using test flight data, based on the 8-core DSP processor TMS320C6678. The results indicate that the design performed well in workload distribution and achieved a speed-up ratio of up to 6.4.
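
    The structure of such a two-stage, message-passing pipeline can be sketched in a few lines; the version below uses Python processes and bounded queues purely to illustrate the staging and buffering idea, not the TMS320C6678 DSP implementation.

    ```python
    # Minimal two-stage pipeline based on message passing: stage 1 preprocesses
    # data blocks and forwards them; stage 2 performs a placeholder imaging step.
    # Structure only; the paper's DSP implementation is not reproduced here.
    import multiprocessing as mp

    def stage1(inq, outq):
        while (block := inq.get()) is not None:
            outq.put([x * 2 for x in block])            # placeholder preprocessing
        outq.put(None)                                  # propagate end-of-stream

    def stage2(inq):
        while (block := inq.get()) is not None:
            print("imaged block, checksum =", sum(block))  # placeholder imaging step

    if __name__ == "__main__":
        q1, q2 = mp.Queue(maxsize=4), mp.Queue(maxsize=4)  # bounded queues model buffering limits
        p1 = mp.Process(target=stage1, args=(q1, q2))
        p2 = mp.Process(target=stage2, args=(q2,))
        p1.start(); p2.start()
        for i in range(8):
            q1.put(list(range(i, i + 16)))              # feed raw data blocks
        q1.put(None)                                    # end-of-stream marker
        p1.join(); p2.join()
    ```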

  16. The Way Point Planning Tool: Real Time Flight Planning for Airborne Science

    NASA Technical Reports Server (NTRS)

    He, Yubin; Blakeslee, Richard; Goodman, Michael; Hall, John

    2012-01-01

    Airborne real time observations are a major component of NASA's Earth science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns; the coordination of the aircraft with satellite overpasses, other airplanes, and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks successfully. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool (WPT), an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints), with point-and-click mouse capabilities on a digital map filled with real-time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analyses during and after each campaign helped identify both issues and new requirements, initiating the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis. The development of this waypoint tool is directly affected by advances in GIS/mapping technologies. From the standalone Google Earth application and simple KML functionalities to the Google Earth Plugin and Java Web Start/Applet on the web platform, as well as to the rising open-source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. The newly innovated, cross-platform, modular designed
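
    Since the tool's early versions exported waypoints through simple KML for Google Earth, the essence of that output can be sketched as below; the coordinates, track, and file name are invented for illustration and this is not the tool's actual code.

    ```python
    # Illustrative sketch: write a waypoint list as a KML LineString for display
    # in Google Earth, in the spirit of the tool's KML output. Coordinates and
    # file name are invented placeholders.
    waypoints = [  # (longitude, latitude, altitude_m)
        (-86.68, 34.73, 8000.0),
        (-86.20, 35.10, 8000.0),
        (-85.90, 35.40, 8000.0),
    ]

    coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in waypoints)
    kml = f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>Planned flight track</name>
        <LineString>
          <altitudeMode>absolute</altitudeMode>
          <coordinates>{coords}</coordinates>
        </LineString>
      </Placemark>
    </kml>
    """
    with open("flight_plan.kml", "w") as f:
        f.write(kml)
    ```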

  17. Near-real-time TOMS, telecommunications and meteorological support for the 1987 Airborne Antarctic Ozone Experiment

    NASA Technical Reports Server (NTRS)

    Ardanuy, P.; Victorine, J.; Sechrist, F.; Feiner, A.; Penn, L.

    1988-01-01

    The goal of the 1987 Airborne Antarctic Ozone Experiment was to improve the understanding of the mechanisms involved in the formation of the Antarctic ozone hole. Total ozone data taken by the Nimbus-7 Total Ozone Mapping Spectrometer (TOMS) played a central role in the successful outcome of the experiment. During the experiment, the near-real-time TOMS total ozone observations were supplied within hours of real time to the operations center in Punta Arenas, Chile. The final report summarizes the role which Research and Data Systems (RDS) Corporation played in the support of the experiment. RDS provided telecommunications to support the science and operations efforts for the Airborne Antarctic Ozone Experiment, and supplied near-real-time weather information to ensure flight and crew safety; designed and installed the telecommunications network to link NASA-GSFC, the United Kingdom Meteorological Office (UKMO), Palmer Station, and the European Center for Medium-Range Weather Forecasts (ECMWF) to the operation at Punta Arenas; engineered and installed stations and other stand-alone systems to collect data from designated low-orbiting polar satellites and beacons; provided analyses of Nimbus-7 TOMS data and backup data products to Punta Arenas; and provided synoptic meteorological data analysis and reduction.

  18. Miniaturized Airborne Imaging Central Server System

    NASA Technical Reports Server (NTRS)

    Sun, Xiuhong

    2011-01-01

    In recent years, some remote-sensing applications require advanced airborne multi-sensor systems to provide high-performance reflective and emissive spectral imaging measurements rapidly over large areas. The key challenge is associated with a black-box back-end system that operates a suite of cutting-edge imaging sensors to simultaneously collect high-throughput reflective and emissive spectral imaging data with precision georeference. This back-end system needs to be portable, easy to use, and reliable, with advanced onboard processing. The innovation of the black-box back end is a miniaturized airborne imaging central server system (MAICSS). MAICSS integrates a complex embedded system of systems, with dedicated power and signal electronic circuits inside, to serve a suite of configurable cutting-edge electro-optical (EO), long-wave infrared (LWIR), and medium-wave infrared (MWIR) cameras, a hyperspectral imaging scanner, and a GPS and inertial measurement unit (IMU) for atmospheric and surface remote sensing. Its compatible sensor packages include NASA's 1,024 × 1,024-pixel LWIR quantum well infrared photodetector (QWIP) imager; a 60.5-megapixel BuckEye EO camera; and a fast (e.g., 200+ scanlines/s), wide-swath (e.g., 1,920+ pixels) CCD/InGaAs imager-based visible/near-infrared reflectance (VNIR) and shortwave infrared (SWIR) imaging spectrometer. MAICSS records continuous, precision-georeferenced, and time-tagged multisensor throughputs to mass storage devices at a high aggregate rate, typically 60 MB/s for its LWIR/EO payload. MAICSS is a complete stand-alone imaging server instrument with an easy-to-use software package for either autonomous data collection or interactive airborne operation. Advanced multisensor data acquisition and onboard processing software features have been implemented for MAICSS. With the onboard processing for real time image development, correction, histogram-equalization, compression, georeference, and

  19. Real-Time Airborne Gamma-Ray Background Estimation Using NASVD with MLE and Radiation Transport for Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulisek, Jonathan A.; Schweppe, John E.; Stave, Sean C.

    2015-06-01

    Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments, for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements. This method is built upon the noise-adjusted singular value decomposition (NASVD) technique that was previously developed for estimating the potassium (K), uranium (U), and thorium (T) concentrations in soil post-flight. The method can be calibrated using K, U, and T spectra determined from radiation transport simulations along with basis functions, which may be determined empirically by applying maximum likelihood estimation (MLE) to previously measured airborne gamma-ray spectra. The method was applied to both measured and simulated airborne gamma-ray spectra, with and without man-made radiological source injections. Compared to schemes based on simple averaging, this technique was less sensitive to background contamination from the injected man-made sources and may be particularly useful when the gamma-ray background changes frequently during the course of the flight.
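
    The NASVD core of the method can be sketched as follows, assuming Poisson counting noise: spectra are scaled by the square root of the mean spectrum, a truncated SVD keeps the leading components, and the smoothed spectra are rescaled. The MLE calibration against K, U, and T transport spectra is not reproduced, and the component count is an illustrative choice.

    ```python
    # Sketch of the NASVD step: noise-adjust spectra by sqrt(mean spectrum)
    # (approximating Poisson noise), keep the leading singular components, and
    # reconstruct smoothed spectra. Calibration with K/U/T basis functions and
    # MLE, as described in the report, is not shown here.
    import numpy as np

    def nasvd_smooth(spectra, n_components=4):
        """spectra: (n_records, n_channels) airborne gamma-ray count spectra."""
        S = np.asarray(spectra, dtype=float)
        scale = np.sqrt(np.clip(S.mean(axis=0), 1e-6, None))   # per-channel noise adjustment
        A = S / scale
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        k = n_components
        smooth = (U[:, :k] * s[:k]) @ Vt[:k]                   # rank-k reconstruction
        return smooth * scale                                  # undo noise adjustment
    ```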

  20. Real-time inspection by submarine images

    NASA Astrophysics Data System (ADS)

    Tascini, Guido; Zingaretti, Primo; Conte, Giuseppe

    1996-10-01

    A real-time application of computer vision concerning tracking and inspection of a submarine pipeline is described. The objective is to develop automatic procedures for supporting human operators in the real-time analysis of images acquired by means of cameras mounted on underwater remotely operated vehicles (ROVs). Implementation of such procedures gives rise to a human-machine system for underwater pipeline inspection that can automatically detect and signal the presence of the pipe, of its structural or accessory elements, and of dangerous or alien objects in its neighborhood. The possibility of modifying the image acquisition rate in the simulations performed on video-recorded images is used to prove that the system performs all necessary processing with acceptable robustness, working in real time up to a speed of about 2.5 kn, well above that allowed by current ROVs and safety constraints.

  1. Real-time neutron imaging of gas turbines

    NASA Astrophysics Data System (ADS)

    Stewart, P. A. E.

    1987-06-01

    The current status of real-time neutron radiography imaging is briefly reviewed, and results of tests carried out on cold neutron sources are reported. In particular, attention is given to demonstrations of neutron radiography on a running gas turbine engine. The future role of real-time neutron imaging in engineering diagnostics is briefly discussed.

  2. Ames Lab 101: Real-Time 3D Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Song

    2010-08-02

    Ames Laboratory scientist Song Zhang explains his real-time 3-D imaging technology. The technique can be used to create high-resolution, real-time, precise, 3-D images for use in healthcare, security, and entertainment applications.

  3. Ames Lab 101: Real-Time 3D Imaging

    ScienceCinema

    Zhang, Song

    2017-12-22

    Ames Laboratory scientist Song Zhang explains his real-time 3-D imaging technology. The technique can be used to create high-resolution, real-time, precise, 3-D images for use in healthcare, security, and entertainment applications.

  4. Real-time optical image processing techniques

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang

    1988-01-01

    Nonlinear real-time optical processing based on spatial pulse frequency modulation has been pursued through the analysis, design, and fabrication of pulse-frequency-modulated halftone screens and the modification of micro-channel spatial light modulators (MSLMs). Micro-channel spatial light modulators are modified via the Fabry-Perot method to achieve the high gamma operation required for nonlinear operation. Real-time nonlinear processing was performed using the halftone screen and MSLM. The experiments showed the effectiveness of the thresholding and also showed the need for higher SBP for image processing. The Hughes LCLV has been characterized and found to yield high gamma (about 1.7) when operated in low-frequency, low-bias mode. Cascading two LCLVs should also provide enough gamma for nonlinear processing. In this case, the SBP of the LCLV is sufficient but the uniformity of the LCLV needs improvement. Applications investigated include image correlation, computer generation of holograms, pseudo-color image encoding for image enhancement, and associative retrieval in neural processing. The discovery of the only known optical method for dynamic range compression of an input image in real time by using GaAs photorefractive crystals is reported. Finally, a new architecture for nonlinear multiple-sensory neural processing has been suggested.

  5. Real-time hyperspectral imaging for food safety applications

    USDA-ARS's Scientific Manuscript database

    Multispectral imaging systems with selected bands can commonly be used for real-time applications in food processing. Recent research has demonstrated that several image processing methods, including binning, noise-removal filtering, and appropriate morphological analysis, in real-time mode can remove most fa...

  6. Airborne imaging spectrometers developed in China

    NASA Astrophysics Data System (ADS)

    Wang, Jianyu; Xue, Yongqi

    1998-08-01

    Airborne imaging spectral technology, a principal means of airborne remote sensing, has developed rapidly in recent years, both worldwide and in China. This paper describes the Modular Airborne Imaging Spectrometer (MAIS), the Operational Modular Airborne Imaging Spectrometer (OMAIS), and the Pushbroom Hyperspectral Imagery (PHI) instrument, which have been developed or are being developed in the Airborne Remote Sensing Lab of the Shanghai Institute of Technical Physics, CAS.

  7. In situ real-time measurement of physical characteristics of airborne bacterial particles

    NASA Astrophysics Data System (ADS)

    Jung, Jae Hee; Lee, Jung Eun

    2013-12-01

    Bioaerosols, including aerosolized bacteria, viruses, and fungi, are associated with public health and environmental problems. One promising control method to reduce the harmful effects of bioaerosols is thermal inactivation via a continuous-flow high-temperature short-time (HTST) system. However, variations in bioaerosol physical characteristics - for example, particle size and shape - during the continuous-flow inactivation process can change the transport properties in air, which can affect particle deposition in the human respiratory system or the filtration efficiency of ventilation systems. Real-time particle monitoring techniques are a desirable alternative to the time-consuming process of microscopic analysis that is conventionally used in sampling and particle characterization. Here, we report in situ real-time optical scattering measurements of the physical characteristics of airborne bacterial particles following an HTST process in a continuous-flow system. Our results demonstrate that the aerodynamic diameter of bacterial aerosols decreases when exposed to a high-temperature environment, and that the shape of the bacterial cells is significantly altered. These variations in physical characteristics, measured by optical scattering, were found to be in agreement with the results of scanning electron microscopy analysis.

  8. Real-Time Imaging System for the OpenPET

    NASA Astrophysics Data System (ADS)

    Tashima, Hideaki; Yoshida, Eiji; Kinouchi, Shoko; Nishikido, Fumihiko; Inadama, Naoko; Murayama, Hideo; Suga, Mikio; Haneishi, Hideaki; Yamaya, Taiga

    2012-02-01

    The OpenPET and its real-time imaging capability have great potential for real-time tumor tracking in medical procedures such as biopsy and radiation therapy. For the real-time imaging system, we intend to use the one-pass list-mode dynamic row-action maximum likelihood algorithm (DRAMA) and implement it using general-purpose computing on graphics processing units (GPGPU) techniques. However, it is difficult to make consistent reconstructions in real time because the amount of list-mode data acquired in PET scans may be large depending on the level of radioactivity, and the reconstruction speed depends on the amount of list-mode data. In this study, we developed a system to control the data used in the reconstruction step while retaining quantitative performance. In the proposed system, the data transfer control system limits the event counts to be used in the reconstruction step according to the reconstruction speed, and the reconstructed images are properly intensified by using the ratio of the used counts to the total counts. We implemented the system on a small OpenPET prototype system and evaluated the performance in terms of real-time tracking ability by displaying reconstructed images in which the intensity was compensated. The intensity of the displayed images correlated properly with the original count rate, and a frame rate of 2 frames per second was achieved with an average delay time of 2.1 s.
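
    The count-control idea described above can be sketched directly: reconstruct each frame from at most as many list-mode events as the reconstruction can handle in time, then rescale the image by the ratio of total to used counts so the displayed intensity still tracks the true count rate. The toy back-projection below is a stand-in for the list-mode DRAMA/GPU reconstruction.

    ```python
    # Sketch of event throttling with intensity compensation. `reconstruct` is a
    # toy stand-in for the one-pass list-mode DRAMA reconstruction on the GPU.
    import numpy as np

    def reconstruct(events, shape=(64, 64)):
        img = np.zeros(shape)
        for x, y in events:                      # toy back-projection stand-in
            img[y % shape[0], x % shape[1]] += 1.0
        return img

    def throttled_frame(frame_events, max_events, shape=(64, 64)):
        total = len(frame_events)
        used = frame_events[:max_events] if total > max_events else frame_events
        img = reconstruct(used, shape)
        if used:
            img *= total / len(used)             # compensate intensity for dropped events
        return img
    ```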

  9. Real-Time Confocal Imaging Of The Living Eye

    NASA Astrophysics Data System (ADS)

    Jester, James V.; Cavanagh, H. Dwight; Essepian, John; Shields, William J.; Lemp, Michael A.

    1989-12-01

    In 1986, we adapted the Tandem Scanning Reflected Light Microscope of Petran and Hadraysky to permit non-invasive, confocal imaging of the living eye in real time. We were the first to obtain stable, confocal optical sections in vivo from human and animal eyes. Using confocal imaging systems we have now studied living, normal volunteers, rabbits, cats, and primates sequentially, non-invasively, and in real time. The continued development of real-time confocal imaging systems will unlock the door to a new field of cell biology involving, for the first time, the study of dynamic cellular processes in living organ systems. Toward this end we have concentrated our initial studies on three areas: (1) evaluation of confocal microscope systems for real-time image acquisition; (2) studies of the living normal cornea (epithelium, stroma, endothelium) in humans and other species; and (3) sequential wound-healing responses in the cornea of single animals to lamellar-keratectomy injury (cellular migration, inflammation, scarring). We believe that this instrument represents an important new paradigm for research in cell biology and pathology and that it will fundamentally alter all experimental and clinical approaches in future years.

  10. Real-time digital signal processing for live electro-optic imaging.

    PubMed

    Sasagawa, Kiyotaka; Kanno, Atsushi; Tsuchiya, Masahiro

    2009-08-31

    We present an imaging system that enables real-time magnitude and phase detection of modulated signals and its application to a Live Electro-optic Imaging (LEI) system, which realizes instantaneous visualization of RF electric fields. The real-time acquisition of magnitude and phase images of a modulated optical signal at 5 kHz is demonstrated by imaging with a Si-based high-speed CMOS image sensor and real-time signal processing with a digital signal processor. In the LEI system, RF electric fields are probed with light via an electro-optic crystal plate and downconverted to an intermediate frequency by parallel optical heterodyning, which can be detected with the image sensor. The artifacts caused by the optics and the image sensor characteristics are corrected by image processing. As examples, we demonstrate real-time visualization of electric fields from RF circuits.
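
    The per-pixel magnitude and phase detection can be illustrated with a simple digital lock-in: frames sampling the intermediate-frequency signal are mixed with sine and cosine references and averaged. The frame rate, IF, and array shapes below are assumptions; the system's DSP pipeline and artifact corrections are not reproduced.

    ```python
    # Per-pixel digital lock-in sketch: mix a stack of frames sampling an
    # intermediate-frequency (IF) signal with quadrature references and average
    # to obtain magnitude and phase images. Parameters are illustrative.
    import numpy as np

    def magnitude_phase(frames, frame_rate_hz, if_hz):
        """frames: (n_frames, H, W) array of intensity samples."""
        n = frames.shape[0]
        t = np.arange(n) / frame_rate_hz
        ref_c = np.cos(2 * np.pi * if_hz * t)[:, None, None]
        ref_s = np.sin(2 * np.pi * if_hz * t)[:, None, None]
        I = (frames * ref_c).mean(axis=0)        # in-phase component
        Q = (frames * ref_s).mean(axis=0)        # quadrature component
        return 2 * np.hypot(I, Q), np.arctan2(Q, I)
    ```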

  11. Switched Antenna Array Tile for Real-Time Microwave Imaging Aperture

    DTIC Science & Technology

    2016-06-26

    Switched Antenna Array Tile for Real-Time Microwave Imaging Aperture. Moulder, William F.; Majewski, Janusz J.; Coldwell, Charles M.; Krieger, James D. ... (fragmentary DTIC record; recoverable text includes the Fig. 1 caption, "Diagram of real-time imaging array, with fabricated antenna tile," and the conclusions: "A switched array tile to be used in a real-time imaging aperture has been presented. Design and realization of the tile were ...")

  12. Real-time imaging of quantum entanglement.

    PubMed

    Fickler, Robert; Krenn, Mario; Lapkiewicz, Radek; Ramelow, Sven; Zeilinger, Anton

    2013-01-01

    Quantum entanglement is widely regarded as one of the most prominent features of quantum mechanics and quantum information science. Although photonic entanglement is routinely studied in many experiments nowadays, its signature has been out of the grasp of real-time imaging. Here we show that modern technology, namely triggered intensified charge-coupled device (ICCD) cameras, is fast and sensitive enough to image in real time the effect of the measurement of one photon on its entangled partner. To quantitatively verify the non-classicality of the measurements, we determine the detected photon number and error margin from the registered intensity image within a certain region. Additionally, the use of the ICCD camera allows us to demonstrate the high flexibility of the setup in creating any desired spatial-mode entanglement, which suggests that visual imaging in quantum optics not only provides a better intuitive understanding of entanglement but will also improve applications of quantum science.

  13. Real-Time Imaging of Quantum Entanglement

    PubMed Central

    Fickler, Robert; Krenn, Mario; Lapkiewicz, Radek; Ramelow, Sven; Zeilinger, Anton

    2013-01-01

    Quantum entanglement is widely regarded as one of the most prominent features of quantum mechanics and quantum information science. Although photonic entanglement is routinely studied in many experiments nowadays, its signature has been out of the grasp of real-time imaging. Here we show that modern technology, namely triggered intensified charge-coupled device (ICCD) cameras, is fast and sensitive enough to image in real time the effect of the measurement of one photon on its entangled partner. To quantitatively verify the non-classicality of the measurements, we determine the detected photon number and error margin from the registered intensity image within a certain region. Additionally, the use of the ICCD camera allows us to demonstrate the high flexibility of the setup in creating any desired spatial-mode entanglement, which suggests that visual imaging in quantum optics not only provides a better intuitive understanding of entanglement but will also improve applications of quantum science. PMID:23715056

  14. Real-time PCR detection of toxigenic Fusarium in airborne and settled grain dust and associations with trichothecene mycotoxins.

    PubMed

    Halstensen, Anne Straumfors; Nordby, Karl-Christian; Eduard, Wijnand; Klemsdal, Sonja Sletner

    2006-12-01

    Inhalation of immunomodulating mycotoxins produced by Fusarium spp., which are commonly found in grain dust, may imply health risks for grain farmers. Airborne Fusarium and mycotoxin exposure levels are mainly unknown due to difficulties in identifying Fusarium and mycotoxins in personal aerosol samples. We used a novel real-time PCR method to quantify the fungal trichodiene synthase gene (tri5) and DNA specific to F. langsethiae and F. avenaceum in airborne and settled grain dust, determined the personal inhalant exposure level to toxigenic Fusarium during various activities, and evaluated whether quantitative measurements of Fusarium-DNA could predict trichothecene levels in grain dust. Airborne Fusarium-DNA was detected in personal samples even from short tasks (10-60 min). The median Fusarium-DNA level was significantly higher in settled than in airborne grain dust (p < 0.001), and only the F. langsethiae-DNA levels correlated significantly between settled and airborne dust (r_s = 0.20, p = 0.003). Both F. langsethiae-DNA and tri5-DNA were associated with HT-2 and T-2 toxins (r_s = 0.24-0.71, p < 0.05 to p < 0.01) in settled dust, and could thus be suitable as indicators for HT-2 and T-2. The median personal inhalant exposure to specific toxigenic Fusarium spp. was less than 1 genome m^-3, but the exposure ranged from 0 to 10^5 genomes m^-3. This study is the first to apply real-time PCR to personal samples of inhalable grain dust for the quantification of tri5 and species-specific Fusarium-DNA, which may have potential for risk assessments of inhaled trichothecenes.
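
    The quantification step of such an assay is typically a log-linear standard curve relating Cq to copy number, with the result normalized by the sampled air volume; the sketch below uses invented calibration values and volumes, since the abstract does not give the assay's actual calibration.

    ```python
    # Hedged sketch of real-time PCR quantification: fit a standard curve
    # (Cq vs. log10 copies) and convert a sample Cq to genomes per cubic metre
    # of air. Calibration values and air volume are invented for illustration.
    import numpy as np

    std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
    std_cq     = np.array([33.1, 29.8, 26.4, 23.0, 19.7])   # example calibration run

    slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)

    def genomes_per_m3(sample_cq, air_volume_m3):
        copies = 10 ** ((sample_cq - intercept) / slope)
        return copies / air_volume_m3

    print(round(genomes_per_m3(sample_cq=27.5, air_volume_m3=0.45), 1))
    ```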

  15. Classification and overview of research in real-time imaging

    NASA Astrophysics Data System (ADS)

    Sinha, Purnendu; Gorinsky, Sergey V.; Laplante, Phillip A.; Stoyenko, Alexander D.; Marlowe, Thomas J.

    1996-10-01

    Real-time imaging has application in areas such as multimedia, virtual reality, medical imaging, and remote sensing and control. Recently, the imaging community has witnessed a tremendous growth in research and new ideas in these areas. To lend structure to this growth, we outline a classification scheme and provide an overview of current research in real-time imaging. For convenience, we have categorized references by research area and application.

  16. Towards real-time medical diagnostics using hyperspectral imaging technology

    NASA Astrophysics Data System (ADS)

    Bjorgan, Asgeir; Randeberg, Lise L.

    2015-07-01

    Hyperspectral imaging provides non-contact, high-resolution spectral images which have substantial diagnostic potential. This can be used, for example, for diagnosis and early detection of arthritis in finger joints. Processing speed is currently a limitation for clinical use of the technique. A real-time system for analysis and visualization using GPU processing and threaded CPU processing is presented. Images showing blood oxygenation, blood volume fraction, and vessel-enhanced images are among the data calculated in real time. This study shows the potential of real-time processing in this context. A combination of the processing modules will be used in the detection of arthritic finger joints from hyperspectral reflectance and transmittance data.
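
    A strongly simplified version of the per-pixel chromophore estimation behind such oxygenation and blood-volume maps is sketched below: pixel absorbance is modelled as a linear mix of oxy- and deoxy-haemoglobin extinction spectra plus an offset and solved by least squares. The Beer-Lambert-style model and variable names are assumptions, not the paper's GPU pipeline.

    ```python
    # Simplified per-pixel fit: absorbance ~ HbO2 and Hb extinction spectra plus
    # an offset; report blood volume (HbO2 + Hb) and oxygenation (HbO2 fraction).
    # Illustrative model only, not the paper's processing modules.
    import numpy as np

    def fit_oxygenation(reflectance, eps_hbo2, eps_hb):
        """reflectance: (n_bands, H, W); eps_*: (n_bands,) extinction coefficients."""
        absorbance = -np.log(np.clip(reflectance, 1e-6, None))
        n_bands, H, W = absorbance.shape
        A = np.column_stack([eps_hbo2, eps_hb, np.ones(n_bands)])      # design matrix
        coef, *_ = np.linalg.lstsq(A, absorbance.reshape(n_bands, -1), rcond=None)
        hbo2, hb = coef[0].reshape(H, W), coef[1].reshape(H, W)
        total = hbo2 + hb
        oxygenation = np.divide(hbo2, total, out=np.zeros_like(total), where=total > 0)
        return total, oxygenation
    ```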

  17. Real-Time Symbol Extraction From Grey-Level Images

    NASA Astrophysics Data System (ADS)

    Massen, R.; Simnacher, M.; Rosch, J.; Herre, E.; Wuhrer, H. W.

    1988-04-01

    A VME-bus image pipeline processor for extracting vectorized contours from grey-level images in real time is presented. This 3-giga-operation-per-second processor uses large-kernel convolvers and new nonlinear neighbourhood processing algorithms to compute true 1-pixel-wide, noise-free contours without thresholding, even from grey-level images with quite varying edge sharpness. The local edge orientation is used as an additional cue to compute a list of vectors describing the closed and open contours in real time and to dump a CAD-like symbolic image description into a symbol memory at pixel clock rate.

  18. Real-time monitoring of non-viable airborne particles correlates with airborne colonies and represents an acceptable surrogate for daily assessment of cell-processing cleanroom performance.

    PubMed

    Raval, Jay S; Koch, Eileen; Donnenberg, Albert D

    2012-10-01

    Airborne particulate monitoring is mandated as a component of good manufacturing practice. We present a procedure developed to monitor and interpret airborne particulates in an International Organization for Standardization (ISO) class 7 cleanroom used for the cell processing of Section 351 and Section 361 products. We collected paired viable and non-viable airborne particle data over a period of 1 year in locations chosen to provide a range of air quality. We used receiver operator characteristic (ROC) analysis to determine empirically the relationship between non-viable and viable airborne particle counts. Viable and non-viable particles were well correlated (r^2 = 0.78), with outlier observations at the low end of the scale (non-viable particles without detectable airborne colonies). ROC analysis predicted viable counts ≥ 0.5 per cubic foot (a limit set by the United States Pharmacopeia) at an action limit of ≥ 32 000 particles (≥ 0.5 µm) per cubic foot, with 95.6% sensitivity and 50% specificity. This limit was exceeded 2.6 times during 18 months of retrospective daily cleanroom data (an expected false alarm rate of 1.3 times/year). After implementing this action limit, we were alerted in real time to an air-handling failure undetected by our hospital facilities management. A rational action limit for non-viable particles was determined based on the correlation with airborne colonies. Reaching or exceeding the action limit of 32 000 non-viable particles per cubic foot triggers suspension of cleanroom cell-processing activities, deep cleaning, investigation of air handling, and a deviation management process. Our full procedure for particle monitoring is available as an online supplement.
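
    The threshold-selection logic behind such an action limit can be sketched as a simple scan over candidate non-viable counts, computing sensitivity and specificity against the ≥0.5 colonies per cubic foot criterion. The data arrays below are placeholders; the published limit of 32 000 came from the authors' own paired measurements.

    ```python
    # Sketch of deriving an action limit from paired viable/non-viable counts:
    # scan candidate thresholds and keep the highest one that still meets the
    # required sensitivity. Placeholder data, not the authors' dataset.
    import numpy as np

    def choose_action_limit(nonviable, viable, viable_limit=0.5, min_sensitivity=0.95):
        nonviable, viable = np.asarray(nonviable, float), np.asarray(viable, float)
        positive = viable >= viable_limit
        best = None
        for t in np.unique(nonviable):                 # candidate thresholds, ascending
            flagged = nonviable >= t
            sens = (flagged & positive).sum() / max(positive.sum(), 1)
            spec = (~flagged & ~positive).sum() / max((~positive).sum(), 1)
            if sens >= min_sensitivity:
                best = (t, sens, spec)                 # highest threshold meeting sensitivity
        return best
    ```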

  19. Real-time monitoring of non-viable airborne particles correlates with airborne colonies and represents an acceptable surrogate for daily assessment of cell-processing cleanroom performance

    PubMed Central

    RAVAL, JAY S.; KOCH, EILEEN; DONNENBERG, ALBERT D.

    2014-01-01

    Background aims: Airborne particulate monitoring is mandated as a component of good manufacturing practice. We present a procedure developed to monitor and interpret airborne particulates in an International Organization for Standardization (ISO) class 7 cleanroom used for the cell processing of Section 351 and Section 361 products. Methods: We collected paired viable and non-viable airborne particle data over a period of 1 year in locations chosen to provide a range of air quality. We used receiver operator characteristic (ROC) analysis to determine empirically the relationship between non-viable and viable airborne particle counts. Results: Viable and non-viable particles were well correlated (r^2 = 0.78), with outlier observations at the low end of the scale (non-viable particles without detectable airborne colonies). ROC analysis predicted viable counts ≥ 0.5 per cubic foot (a limit set by the United States Pharmacopeia) at an action limit of ≥ 32 000 particles (≥ 0.5 µm) per cubic foot, with 95.6% sensitivity and 50% specificity. This limit was exceeded 2.6 times during 18 months of retrospective daily cleanroom data (an expected false alarm rate of 1.3 times/year). After implementing this action limit, we were alerted in real time to an air-handling failure undetected by our hospital facilities management. Conclusions: A rational action limit for non-viable particles was determined based on the correlation with airborne colonies. Reaching or exceeding the action limit of 32 000 non-viable particles per cubic foot triggers suspension of cleanroom cell-processing activities, deep cleaning, investigation of air handling, and a deviation management process. Our full procedure for particle monitoring is available as an online supplement. PMID:22746538

  20. Study on real-time images compounded using spatial light modulator

    NASA Astrophysics Data System (ADS)

    Xu, Jin; Chen, Zhebo; Ni, Xuxiang; Lu, Zukang

    2007-01-01

    Image compounding technology is often used in film and film production. Conventionally, image compounding relies on image-processing algorithms: the useful objects, details, background, or other elements are first extracted from the source images, and then all of this information is compounded into one image. With this method the film system needs a powerful processor, because the processing is complex, and the compounded image is obtained only after some delay. In this paper, we introduce a new method of real-time image compounding; using this method, compounding can be performed at the same time as the movie is shot. The system is made up of two camera lenses, a spatial light modulator array, and an image sensor. The spatial light modulator could be a liquid crystal display (LCD), liquid crystal on silicon (LCoS), thin-film-transistor liquid crystal display (TFT-LCD), Deformable Micro-mirror Device (DMD), and so on. First, one camera lens (the first imaging lens) images the object onto the spatial light modulator's panel. Second, we output an image to the panel of the spatial light modulator; the image of the object and the image output by the spatial light modulator are then spatially compounded on the panel. Third, the other camera lens (the second imaging lens) images the compounded image onto the image sensor. After these three steps, we obtain the compounded image from the image sensor. Because the spatial light modulator can output images continuously, the compounding is also continuous, and the procedure is completed in real time. Using this method, if we want to put a real object into an invented background, we can output the invented background scene on the spatial light modulator, and the real object will be imaged by the first imaging lens; we then obtain the compounded images from the image sensor in real time. In the same way, if we want to put a real background with an invented object, we can output the

  1. Real-time microstructural and functional imaging and image processing in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Westphal, Volker

    Optical Coherence Tomography (OCT) is a noninvasive optical imaging technique that allows high-resolution cross-sectional imaging of tissue microstructure, achieving a spatial resolution of about 10 µm. OCT is similar to B-mode ultrasound (US) except that it uses infrared light instead of ultrasound. In contrast to US, no coupling gel is needed, simplifying image acquisition. Furthermore, the fiber-optic implementation of OCT is compatible with endoscopes. In recent years, the transition from slow-imaging bench-top systems to real-time clinical systems has been under way. This has led to a variety of applications, namely in ophthalmology, gastroenterology, dermatology and cardiology. First, this dissertation demonstrates that OCT is capable of imaging and differentiating clinically relevant tissue structures in the gastrointestinal tract. A careful in vitro correlation study between endoscopic OCT images and corresponding histological slides was performed. Besides structural imaging, OCT systems were further developed for functional imaging, for example to visualize blood flow. Previously, imaging flow in small vessels in real time was not possible. For this research, a new processing scheme similar to real-time Doppler in US was introduced. It was implemented in dedicated hardware to allow real-time acquisition and overlaid display of blood flow in vivo. A sensitivity of 0.5 mm/s was achieved. Optical coherence microscopy (OCM) is a variation of OCT that improves the resolution even further, to a few micrometers. Advances made in the OCT scan engine for the Doppler setup enabled real-time imaging in vivo with OCM. In order to generate geometrically correct images for all of the previous applications in real time, extensive image processing algorithms were developed. Algorithms for the correction of distortions due to non-telecentric scanning, nonlinear scan mirror movements, and refraction were developed and demonstrated. This has led to interesting new
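
    The "processing scheme similar to real-time Doppler in US" mentioned above is commonly a Kasai-style autocorrelation estimator; a generic software version is sketched below as an assumption, not the dissertation's hardware implementation. The phase of the lag-one autocorrelation across repeated A-lines gives a velocity-proportional shift.

    ```python
    # Generic Kasai-style colour-Doppler estimator: the phase of the lag-one
    # autocorrelation over repeated A-lines is proportional to axial flow.
    # Sketch only; the dissertation's dedicated-hardware scheme is not shown.
    import numpy as np

    def doppler_phase_shift(iq):
        """iq: complex array (n_repeats, n_depth) of repeated A-line samples."""
        r1 = np.sum(iq[1:] * np.conj(iq[:-1]), axis=0)   # lag-one autocorrelation per depth
        return np.angle(r1)                              # phase shift, proportional to axial velocity
    ```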

  2. Feasibility study: real-time 3-D ultrasound imaging of the brain.

    PubMed

    Smith, Stephen W; Chu, Kengyeh; Idriss, Salim F; Ivancevich, Nikolas M; Light, Edward D; Wolf, Patrick D

    2004-10-01

    We tested the feasibility of real-time, 3-D ultrasound (US) imaging in the brain. The 3-D scanner uses a matrix phased-array transducer of 512 transmit channels and 256 receive channels operating at 2.5 MHz with a 15-mm diameter footprint. The real-time system scans a 65 degrees pyramid, producing up to 30 volumetric scans per second, and features up to five image planes as well as 3-D rendering, 3-D pulsed-wave and color Doppler. In a human subject, the real-time 3-D scans produced simultaneous transcranial horizontal (axial), coronal and sagittal image planes and real-time volume-rendered images of the gross anatomy of the brain. In a transcranial sheep model, we obtained real-time 3-D color flow Doppler scans and perfusion images using bolus injection of contrast agents into the internal carotid artery.

  3. Real-Time Aggressive Image Data Compression

    DTIC Science & Technology

    1990-03-31

    implemented with higher degrees of modularity, concurrency, and higher levels of machine intelligence, thereby providing higher data-throughput rates... Project Summary. Project Title: Real-Time Aggressive Image Data Compression. Principal Investigators: Dr. Yih-Fang Huang and Dr. Ruey-wen Liu. Institution... Summary: The objective of the proposed research is to develop reliable algorithms that can achieve aggressive image data compression (with a compression

  4. Time-gated real-time pump-probe imaging spectroscopy

    NASA Astrophysics Data System (ADS)

    Ferrari, Raffaele; D'Andrea, Cosimo; Bassi, Andrea; Valentini, Gianluca; Cubeddu, Rinaldo

    2007-07-01

    An experimental technique which allows one to perform pump-probe transient absorption spectroscopy in real-time is an important tool to study irreversible processes. This is particularly interesting in the case of biological samples which easily deteriorate upon exposure to light pulses, with the formation of permanent photoproducts and structural changes. In particular pump-probe spectroscopy can provide fundamental information for the design of optical chromophores. In this work a real-time pump-probe imaging spectroscopy system has been realized and we have explored the possibility to further reduce the number of laser pulses by using a time-gated camera. We believe that the use of a time-gated camera can provide an important step towards the final goal of pump-probe single shot spectroscopy.

  5. Deep architecture neural network-based real-time image processing for image-guided radiotherapy.

    PubMed

    Mori, Shinichiro

    2017-08-01

    To develop real-time image processing for image-guided radiotherapy, we evaluated several neural network models for use with different imaging modalities, including X-ray fluoroscopic image denoising. Setup images of prostate cancer patients were acquired with two oblique X-ray fluoroscopic units. Two types of residual network were designed: a convolutional autoencoder (rCAE) and a convolutional neural network (rCNN). We varied the convolutional kernel size and the number of convolutional layers for both networks, and the number of pooling and upsampling layers for the rCAE. The ground-truth images were generated by applying the contrast-limited adaptive histogram equalization (CLAHE) method of image processing. Network models were trained so that the quality of the output image, produced from the unprocessed input image, stayed close to that of the ground-truth image. For the image-denoising evaluation, noisy input images were used for training. More than 6 convolutional layers with convolutional kernels >5×5 improved image quality; however, this did not allow real-time imaging. After applying a pair of pooling and upsampling layers to both networks, rCAEs with >3 convolutions each and rCNNs with >12 convolutions with a pair of pooling and upsampling layers achieved real-time processing at 30 frames per second (fps) with acceptable image quality. The suggested networks achieved real-time image processing for contrast enhancement and image denoising on a conventional modern personal computer. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
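    As a hedged illustration of the residual-network idea described above, the following PyTorch-style sketch builds a small residual denoiser with one pooling/upsampling pair. The channel count, kernel size, and layer arrangement are assumptions and do not reproduce the paper's exact rCNN/rCAE architectures or its CLAHE-based training targets.

      import torch
      import torch.nn as nn

      class ResidualDenoiser(nn.Module):
          # Minimal residual CNN with one pooling/upsampling pair (illustrative only).
          def __init__(self, channels=32, kernel_size=3):
              super().__init__()
              pad = kernel_size // 2
              self.encode = nn.Sequential(
                  nn.Conv2d(1, channels, kernel_size, padding=pad), nn.ReLU(inplace=True),
                  nn.MaxPool2d(2),                                  # halves the resolution
                  nn.Conv2d(channels, channels, kernel_size, padding=pad), nn.ReLU(inplace=True),
              )
              self.decode = nn.Sequential(
                  nn.Upsample(scale_factor=2, mode="nearest"),      # restores the resolution
                  nn.Conv2d(channels, 1, kernel_size, padding=pad),
              )

          def forward(self, x):
              # Residual formulation: the network predicts a correction added to the input.
              return x + self.decode(self.encode(x))

      # Training would compare the output against, e.g., a CLAHE-enhanced target image;
      # only the forward pass is sketched here.
      model = ResidualDenoiser()
      denoised = model(torch.randn(1, 1, 256, 256))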

  6. Real-time Avatar Animation from a Single Image.

    PubMed

    Saragih, Jason M; Lucey, Simon; Cohn, Jeffrey F

    2011-01-01

    A real time facial puppetry system is presented. Compared with existing systems, the proposed method requires no special hardware, runs in real time (23 frames-per-second), and requires only a single image of the avatar and user. The user's facial expression is captured through a real-time 3D non-rigid tracking system. Expression transfer is achieved by combining a generic expression model with synthetically generated examples that better capture person specific characteristics. Performance of the system is evaluated on avatars of real people as well as masks and cartoon characters.

  7. Real-time Avatar Animation from a Single Image

    PubMed Central

    Saragih, Jason M.; Lucey, Simon; Cohn, Jeffrey F.

    2014-01-01

    A real time facial puppetry system is presented. Compared with existing systems, the proposed method requires no special hardware, runs in real time (23 frames-per-second), and requires only a single image of the avatar and user. The user’s facial expression is captured through a real-time 3D non-rigid tracking system. Expression transfer is achieved by combining a generic expression model with synthetically generated examples that better capture person specific characteristics. Performance of the system is evaluated on avatars of real people as well as masks and cartoon characters. PMID:24598812

  8. Monitoring airborne molecular contamination: a quantitative and qualitative comparison of real-time and grab-sampling techniques

    NASA Astrophysics Data System (ADS)

    Shupp, Aaron M.; Rodier, Dan; Rowley, Steven

    2007-03-01

    Monitoring and controlling Airborne Molecular Contamination (AMC) has become essential in deep ultraviolet (DUV) photolithography for both optimizing yields and protecting tool optics. A variety of technologies have been employed for both real-time and grab-sample monitoring. Real-time monitoring has the advantage of quickly identifying "spikes" and upset conditions, while 2- to 24-hour (or longer) grab sampling allows for extremely low detection limits by concentrating the mass of the target contaminant over a period of time. Employing a combination of both monitoring techniques affords the highest degree of control, the lowest detection limits, and the most detailed data possible in terms of speciation. As happens with many technologies, there can be concern regarding the accuracy and agreement between real-time and grab-sample methods. This study utilizes side-by-side comparisons of two different real-time monitors operating in parallel with both liquid impingers and dry sorbent tubes to measure NIST-traceable gas standards as well as real-world samples. By measuring in parallel, a truly valid comparison is made between methods while verifying the results against a certified standard. The final outcome of this investigation is that a dry sorbent tube grab-sample technique produced results that agreed, in terms of accuracy, with NIST-traceable standards and with the two real-time techniques, Ion Mobility Spectrometry (IMS) and Pulsed Fluorescence Detection (PFD), while a traditional liquid impinger technique showed discrepancies.

  9. Handheld real-time volumetric imaging of the spine: technology development.

    PubMed

    Tiouririne, Mohamed; Nguyen, Sarah; Hossack, John A; Owen, Kevin; William Mauldin, F

    2014-03-01

    Technical difficulties, poor image quality and reliance on pattern identifications represent some of the drawbacks of two-dimensional ultrasound imaging of spinal bone anatomy. To overcome these limitations, this study sought to develop real-time volumetric imaging of the spine using a portable handheld device. The device measured 19.2 cm × 9.2 cm × 9.0 cm and imaged at 5 MHz centre frequency. 2D imaging under conventional ultrasound and volumetric (3D) imaging in real time was achieved and verified by inspection using a custom spine phantom. Further device performance was assessed and revealed a 75-min battery life and an average frame rate of 17.7 Hz in volumetric imaging mode. The results suggest that real-time volumetric imaging of the spine is a feasible technique for more intuitive visualization of the spine. These results may have important ramifications for a large array of neuraxial procedures.

  10. Real Time Target Tracking in a Phantom Using Ultrasonic Imaging

    NASA Astrophysics Data System (ADS)

    Xiao, X.; Corner, G.; Huang, Z.

    In this paper we present a real-time ultrasound image guidance method suitable for tracking the motion of tumors. A 2D ultrasound-based motion tracking system was evaluated. A robot was used to control the focused ultrasound and position it at the target that had been segmented from a real-time ultrasound video. Tracking accuracy and precision were investigated using a lesion-mimicking phantom. Experiments were conducted, and the results show the efficiency of the image guidance algorithm. This work could serve as a foundation for combining real-time ultrasound image tracking with MRI thermometry monitoring in non-invasive surgery.

  11. Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise

    NASA Astrophysics Data System (ADS)

    Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.

    2015-04-01

    In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities as well as traffic authorities when dealing with disasters and large public events. One focus lies on the acquisition of high resolution aerial imagery, its fully automatic processing, analysis and near real-time provision to decision makers in emergency situations. For this purpose, a camera system was developed to be operated from a helicopter with light-weight processing units and a microwave link for fast data transfer. In order to meet end-users' requirements, DLR works closely with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which is deployed nationwide in the case of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution, the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.

  12. Development and Evaluation of Real-Time Volumetric Compton Gamma-Ray Imaging

    NASA Astrophysics Data System (ADS)

    Barnowski, Ross Wegner

    An approach to gamma-ray imaging has been developed that enables near real-time volumetric (3D) imaging of unknown environments, thus improving the utility of gamma-ray imaging for source-search and radiation mapping applications. The approach, herein dubbed scene data fusion (SDF), is based on integrating mobile radiation imagers with real time tracking and scene reconstruction algorithms to enable a mobile mode of operation and 3D localization of gamma-ray sources. The real-time tracking allows the imager to be moved throughout the environment or around a particular object of interest, obtaining the multiple perspectives necessary for standoff 3D imaging. A 3D model of the scene, provided in real-time by a simultaneous localization and mapping (SLAM) algorithm, can be incorporated into the image reconstruction, reducing the reconstruction time and improving imaging performance. The SDF concept is demonstrated in this work with a Microsoft Kinect RGB-D sensor, a real-time SLAM solver, and two different mobile gamma-ray imaging platforms. The first is a cart-based imaging platform known as the Volumetric Compton Imager (VCI), comprising two 3D position-sensitive high purity germanium (HPGe) detectors, exhibiting excellent gamma-ray imaging characteristics, but with limited mobility due to the size and weight of the cart. The second system is the High Efficiency Multimodal Imager (HEMI), a hand-portable gamma-ray imager comprising 96 individual 1-cm3 CdZnTe crystals arranged in a two-plane, active-mask configuration. The HEMI instrument has poorer energy and angular resolution than the VCI, but is truly hand-portable, allowing the SDF concept to be tested in multiple environments and for more challenging imaging scenarios. An iterative algorithm based on Compton kinematics is used to reconstruct the gamma-ray source distribution in all three spatial dimensions. Each of the two mobile imaging systems is used to demonstrate SDF for a variety of scenarios, including

  13. Digital Image Support in the ROADNet Real-time Monitoring Platform

    NASA Astrophysics Data System (ADS)

    Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.

    2004-12-01

    The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.

  14. A computational approach to real-time image processing for serial time-encoded amplified microscopy

    NASA Astrophysics Data System (ADS)

    Oikawa, Minoru; Hiyama, Daisuke; Hirayama, Ryuji; Hasegawa, Satoki; Endo, Yutaka; Sugie, Takahisa; Tsumura, Norimichi; Kuroshima, Mai; Maki, Masanori; Okada, Genki; Lei, Cheng; Ozeki, Yasuyuki; Goda, Keisuke; Shimobaba, Tomoyoshi

    2016-03-01

    High-speed imaging is an indispensable technique, particularly for identifying or analyzing fast-moving objects. The serial time-encoded amplified microscopy (STEAM) technique was proposed to enable us to capture images with a frame rate 1,000 times faster than using conventional methods such as CCD (charge-coupled device) cameras. The application of this high-speed STEAM imaging technique to a real-time system, such as flow cytometry for a cell-sorting system, requires successively processing a large number of captured images with high throughput in real time. We are now developing a high-speed flow cytometer system including a STEAM camera. In this paper, we describe our approach to processing these large amounts of image data in real time. We use an analog-to-digital converter with up to 7.0 Gsamples/s and 8-bit resolution for capturing the output voltage signal that encodes grayscale images from the STEAM camera. Therefore, the direct data output from the STEAM camera amounts to 7.0 Gbyte/s continuously. We provided a field-programmable gate array (FPGA) device as a digital signal pre-processor for image reconstruction and for finding objects in a microfluidic channel at high data rates in real time. We also utilized graphics processing unit (GPU) devices to accelerate the identification of the reconstructed images. We built our prototype system, which includes a STEAM camera, an FPGA device and a GPU device, and evaluated its performance in the real-time identification of small particles (beads), serving as virtual biological cells, flowing through a microfluidic channel.

  15. Image denoising for real-time MRI.

    PubMed

    Klosowski, Jakob; Frahm, Jens

    2017-03-01

    To develop an image noise filter suitable for MRI in real time (acquisition and display), which preserves small isolated details and efficiently removes background noise without introducing blur, smearing, or patch artifacts. The proposed method extends the nonlocal means algorithm to adapt the influence of the original pixel value according to a simple measure for patch regularity. Detail preservation is improved by a compactly supported weighting kernel that closely approximates the commonly used exponential weight, while an oracle step ensures efficient background noise removal. Denoising experiments were conducted on real-time images of healthy subjects reconstructed by regularized nonlinear inversion from radial acquisitions with pronounced undersampling. The filter leads to a signal-to-noise ratio (SNR) improvement of at least 60% without noticeable artifacts or loss of detail. The method visually compares to more complex state-of-the-art filters such as the block-matching three-dimensional (BM3D) filter and in certain cases better matches the underlying noise model. Acceleration of the computation to more than 100 complex frames per second using graphics processing units is straightforward. The sensitivity of nonlocal means to small details can be significantly increased by the simple strategies presented here, which allows partial restoration of SNR in iteratively reconstructed images without introducing a noticeable time delay or image artifacts. Magn Reson Med 77:1340-1352, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
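    For orientation, the following NumPy sketch shows a plain nonlocal-means filter of the kind the paper extends; the patch-regularity adaptation of the centre-pixel weight, the compactly supported kernel, and the oracle step are deliberately omitted, and the patch size, search-window size, and smoothing parameter are assumed values.

      import numpy as np

      def nlm_denoise(img, patch=3, search=7, h=0.1):
          # Plain nonlocal means: every pixel becomes a weighted average of pixels whose
          # surrounding patches look similar, with weights exp(-d2 / h**2), where d2 is the
          # mean squared patch difference inside a small search window.
          pr, sr = patch // 2, search // 2
          padded = np.pad(img.astype(float), pr + sr, mode="reflect")
          out = np.zeros_like(img, dtype=float)
          for i in range(img.shape[0]):
              for j in range(img.shape[1]):
                  ci, cj = i + pr + sr, j + pr + sr
                  ref = padded[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
                  weights, values = [], []
                  for di in range(-sr, sr + 1):
                      for dj in range(-sr, sr + 1):
                          ni, nj = ci + di, cj + dj
                          cand = padded[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                          d2 = np.mean((ref - cand) ** 2)
                          weights.append(np.exp(-d2 / h ** 2))
                          values.append(padded[ni, nj])
                  out[i, j] = np.dot(weights, values) / np.sum(weights)
          return out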

  16. Low-level processing for real-time image analysis

    NASA Technical Reports Server (NTRS)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.
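    A minimal software analogue of the pipeline's data flow (raw image to edge map plus window statistics) is sketched below; the threshold and window are assumed parameters, and the hardware chain-code and edge-clustering stages are not reproduced.

      import numpy as np
      from scipy.ndimage import sobel

      def edge_map_and_stats(image, threshold, window=(slice(100, 200), slice(100, 200))):
          # Software analogue of the hardware data flow: raw image -> gradient -> edge map,
          # plus peak intensity / peak gradient inside a programmable window (used above
          # for iris and focus control).  `threshold` and `window` are assumed parameters.
          img = image.astype(float)
          gx = sobel(img, axis=1)
          gy = sobel(img, axis=0)
          gradient = np.hypot(gx, gy)
          edge_map = gradient > threshold
          stats = {
              "peak_intensity": img[window].max(),
              "peak_gradient": gradient[window].max(),
          }
          return edge_map, stats

      edges, stats = edge_map_and_stats(np.random.rand(480, 640), threshold=0.8)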

  17. Dragon Ears airborne acoustic array: CSP analysis applied to cross array to compute real-time 2D acoustic sound field

    NASA Astrophysics Data System (ADS)

    Cerwin, Steve; Barnes, Julie; Kell, Scott; Walters, Mark

    2003-09-01

    This paper describes development and application of a novel method to accomplish real-time solid angle acoustic direction finding using two 8-element orthogonal microphone arrays. The developed prototype system was intended for localization and signature recognition of ground-based sounds from a small UAV. Recent advances in computer speeds have enabled the implementation of microphone arrays in many audio applications. Still, the real-time presentation of a two-dimensional sound field for the purpose of audio target localization is computationally challenging. In order to overcome this challenge, a cross-power spectrum phase (CSP) technique was applied to each 8-element arm of a 16-element cross array to provide audio target localization. In this paper, we describe the technique and compare it with two other commonly used techniques: the cross-spectral matrix and MUSIC. The results show that the CSP technique applied to two 8-element orthogonal arrays provides a computationally efficient solution with reasonable accuracy and tolerable artifacts, sufficient for real-time applications. Additional topics include development of a synchronized 16-channel transmitter and receiver to relay the airborne data to the ground-based processor and presentation of test data demonstrating both ground-mounted operation and airborne localization of ground-based gunshots and loud engine sounds.
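    The CSP approach referenced above is commonly implemented as a phase-transform (GCC-PHAT) cross-correlation; the sketch below estimates the inter-microphone delay for a single element pair under that assumption. Applying it to pairs along each 8-element arm would yield one bearing per arm; the sampling rate and signal names are placeholders, not the prototype's values.

      import numpy as np

      def csp_delay(x, y, fs):
          # Cross-power spectrum phase (phase transform) delay estimate between two channels.
          # A positive return value means y lags x.  Zero-padding to len(x)+len(y) avoids
          # circular wrap-around.
          n = len(x) + len(y)
          X = np.fft.rfft(x, n=n)
          Y = np.fft.rfft(y, n=n)
          cross = np.conj(X) * Y
          cross /= np.abs(cross) + 1e-12            # keep only the phase
          corr = np.fft.irfft(cross, n=n)
          corr = np.roll(corr, n // 2)              # place zero lag at the centre
          lag = int(np.argmax(corr)) - n // 2
          return lag / fs

      # Synthetic check: y is x delayed by 25 samples at fs = 48 kHz -> about 0.52 ms.
      rng = np.random.default_rng(0)
      x = rng.standard_normal(4096)
      y = np.concatenate([np.zeros(25), x[:-25]])
      print(csp_delay(x, y, fs=48_000.0))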

  18. Ultrashort Microwave-Pumped Real-Time Thermoacoustic Breast Tumor Imaging System.

    PubMed

    Ye, Fanghao; Ji, Zhong; Ding, Wenzheng; Lou, Cunguang; Yang, Sihua; Xing, Da

    2016-03-01

    We report the design of a real-time thermoacoustic (TA) scanner dedicated to imaging deep breast tumors and investigate its imaging performance. The TA imaging system is composed of an ultrashort microwave pulse generator and a ring transducer array with 384 elements. By vertically scanning the transducer array that encircles the breast phantom, we achieve real-time, 3D thermoacoustic imaging (TAI) with an imaging speed of 16.7 frames per second. The stability of the microwave energy and its distribution in the cling-skin acoustic coupling cup are measured. The results indicate that there is a nearly uniform electromagnetic field in each XY-imaging plane. Three plastic tubes filled with salt water are imaged dynamically to evaluate the real-time performance of our system, followed by 3D imaging of an excised breast tumor embedded in a breast phantom. Finally, to demonstrate the potential for clinical applications, the excised breast of a ewe embedded with an ex vivo human breast tumor is imaged clearly with a contrast of about 1:2.8. The high imaging speed, large field of view, and 3D imaging performance of our dedicated TAI system provide the potential for clinical routine breast screening.

  19. Ship detection based on rotation-invariant HOG descriptors for airborne infrared images

    NASA Astrophysics Data System (ADS)

    Xu, Guojing; Wang, Jinyan; Qi, Shengxiang

    2018-03-01

    Infrared thermal imagery is widely used on various kinds of aircraft because it can be applied at any time of day, and detecting ships in infrared images has attracted considerable research interest in recent years. In the case of downward-looking infrared imagery, in order to overcome the uncertainty of the target's imaged attitude due to the unknown positional relationship between the aircraft and the target, we propose a new infrared ship detection method which integrates a rotation-invariant gradient direction histogram (Circle Histogram of Oriented Gradient, C-HOG) descriptor and a support vector machine (SVM) classifier. In detail, the proposed method uses HOG descriptors to express the local features of infrared images, adapting to changes in illumination and overcoming sea-clutter effects. Unlike the traditional computation of the HOG descriptor, we subdivide the image into annular spatial bins instead of rectangular sub-regions, and then a Radial Gradient Transform (RGT) is applied to the gradients to obtain rotation-invariant histogram information. Considering the engineering requirements of airborne, real-time application, we train an SVM on infrared sample images of ship targets and non-target backgrounds to discriminate real ships from false targets. Experimental results show that the proposed method performs well in both robustness and run time for infrared ship detection at different rotation angles.

  20. Achieving real-time capsule endoscopy (CE) video visualization through panoramic imaging

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Xie, Jean; Mui, Peter; Leighton, Jonathan A.

    2013-02-01

    In this paper, we present a novel real-time capsule endoscopy (CE) video visualization concept based on panoramic imaging. Typical CE videos run about 8 hours and are manually reviewed by physicians to locate diseases such as bleeding and polyps. To date, there is no commercially available tool capable of providing stabilized and processed CE video that is easy to analyze in real time, so the burden on physicians' disease-finding efforts is considerable. In fact, since the CE camera sensor has a limited forward-looking view and a low image frame rate (typically 2 frames per second), and captures very close-range images of the GI tract surface, it is no surprise that traditional visualization methods based on tracking and registration often fail. This paper presents a novel concept for real-time CE video stabilization and display. Instead of working directly on traditional forward-looking FOV (field of view) images, we work on panoramic images to bypass many problems facing traditional imaging modalities. Methods for panoramic image generation based on optical lens principles, leading to real-time data visualization, will be presented. In addition, non-rigid panoramic image registration methods will be discussed.

  1. Real-time chirp-coded imaging with a programmable ultrasound biomicroscope.

    PubMed

    Bosisio, Mattéo R; Hasquenoph, Jean-Michel; Sandrin, Laurent; Laugier, Pascal; Bridal, S Lori; Yon, Sylvain

    2010-03-01

    Ultrasound biomicroscopy (UBM) of mice can provide a testing ground for new imaging strategies. The UBM system presented in this paper facilitates the development of imaging and measurement methods with programmable design, arbitrary waveform coding, broad bandwidth (2-80 MHz), digital filtering, programmable processing, RF data acquisition, multithread/multicore real-time display, and rapid mechanical scanning in real time. Chirp and conventional impulse imaging (31 and 46 MHz center frequencies) of a wire phantom under fast sectorial scanning (0.7° ms-1, 20 frames/s one-way image rate) are compared. Axial and lateral resolutions at the focus with chirps approach impulse-imaging resolutions. Chirps yield a 10-15 dB gain in SNR and a 2-3 mm gain in imaging depth. Real-time impulse and chirp-coded imaging (at 10-5 frames/s) are demonstrated in the mouse, in vivo. The system's open structure favors the test and implementation of new sequences.
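    Chirp-coded imaging relies on pulse compression: the received echo is matched-filtered with the transmitted chirp, trading a long, low-peak-power excitation for the SNR and penetration gains quoted above. The sketch below is a generic simulation of that step with assumed bandwidth, sampling rate, amplitudes, and delay; it is not the biomicroscope's implementation.

      import numpy as np
      from scipy.signal import chirp, correlate

      fs = 400e6                                        # assumed sampling rate (Hz)
      t = np.arange(0, 2e-6, 1 / fs)                    # 2-microsecond coded excitation
      tx = chirp(t, f0=20e6, f1=42e6, t1=t[-1], method="linear")   # assumed band near 31 MHz

      # Simulated received line: a weak, delayed copy of the chirp buried in noise.
      rng = np.random.default_rng(0)
      rx = 0.02 * rng.standard_normal(4096)
      delay = 900
      rx[delay:delay + tx.size] += 0.1 * tx

      # Pulse compression: matched filtering concentrates the long chirp back into a short
      # pulse, recovering axial resolution while keeping the coded excitation's energy.
      compressed = correlate(rx, tx, mode="full")
      echo_sample = int(np.argmax(np.abs(compressed))) - (tx.size - 1)
      print("echo starts near sample", echo_sample)     # ~900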

  2. Volumetric Real-Time Imaging Using a CMUT Ring Array

    PubMed Central

    Choe, Jung Woo; Oralkan, Ömer; Nikoozadeh, Amin; Gencel, Mustafa; Stephens, Douglas N.; O’Donnell, Matthew; Sahn, David J.; Khuri-Yakub, Butrus T.

    2012-01-01

    A ring array provides a very suitable geometry for forward-looking volumetric intracardiac and intravascular ultrasound imaging. We fabricated an annular 64-element capacitive micromachined ultrasonic transducer (CMUT) array featuring a 10-MHz operating frequency and a 1.27-mm outer radius. A custom software suite was developed to run on a PC-based imaging system for real-time imaging using this device. This paper presents simulated and experimental imaging results for the described CMUT ring array. Three different imaging methods—flash, classic phased array (CPA), and synthetic phased array (SPA)—were used in the study. For SPA imaging, two techniques to improve the image quality—Hadamard coding and aperture weighting—were also applied. The results show that SPA with Hadamard coding and aperture weighting is a good option for ring-array imaging. Compared with CPA, it achieves better image resolution and comparable signal-to-noise ratio at a much faster image acquisition rate. Using this method, a fast frame rate of up to 463 volumes per second is achievable if limited only by the ultrasound time of flight; with the described system we reconstructed three cross-sectional images in real-time at 10 frames per second, which was limited by the computation time in synthetic beamforming. PMID:22718870

  3. Volumetric real-time imaging using a CMUT ring array.

    PubMed

    Choe, Jung Woo; Oralkan, Ömer; Nikoozadeh, Amin; Gencel, Mustafa; Stephens, Douglas N; O'Donnell, Matthew; Sahn, David J; Khuri-Yakub, Butrus T

    2012-06-01

    A ring array provides a very suitable geometry for forward-looking volumetric intracardiac and intravascular ultrasound imaging. We fabricated an annular 64-element capacitive micromachined ultrasonic transducer (CMUT) array featuring a 10-MHz operating frequency and a 1.27-mm outer radius. A custom software suite was developed to run on a PC-based imaging system for real-time imaging using this device. This paper presents simulated and experimental imaging results for the described CMUT ring array. Three different imaging methods--flash, classic phased array (CPA), and synthetic phased array (SPA)--were used in the study. For SPA imaging, two techniques to improve the image quality--Hadamard coding and aperture weighting--were also applied. The results show that SPA with Hadamard coding and aperture weighting is a good option for ring-array imaging. Compared with CPA, it achieves better image resolution and comparable signal-to-noise ratio at a much faster image acquisition rate. Using this method, a fast frame rate of up to 463 volumes per second is achievable if limited only by the ultrasound time of flight; with the described system we reconstructed three cross-sectional images in real-time at 10 frames per second, which was limited by the computation time in synthetic beamforming.

  4. A Review on Real-Time 3D Ultrasound Imaging Technology

    PubMed Central

    Zeng, Zhaozheng

    2017-01-01

    Real-time three-dimensional (3D) ultrasound (US) has attracted considerable attention in medical research because it provides interactive feedback to help clinicians acquire high-quality images as well as timely spatial information of the scanned area, and hence is necessary in intraoperative ultrasound examinations. Many publications have reported real-time or near real-time visualization of 3D ultrasound using volumetric probes or routinely used two-dimensional (2D) probes. So far, a review on how to design an interactive system with appropriate processing algorithms has been missing, resulting in a lack of systematic understanding of the relevant technology. In this article, previous and the latest work on designing real-time or near real-time 3D ultrasound imaging systems is reviewed. Specifically, the data acquisition techniques, reconstruction algorithms, volume rendering methods, and clinical applications are presented. Moreover, the advantages and disadvantages of state-of-the-art approaches are discussed in detail. PMID:28459067

  5. A Review on Real-Time 3D Ultrasound Imaging Technology.

    PubMed

    Huang, Qinghua; Zeng, Zhaozheng

    2017-01-01

    Real-time three-dimensional (3D) ultrasound (US) has attracted considerable attention in medical research because it provides interactive feedback to help clinicians acquire high-quality images as well as timely spatial information of the scanned area, and hence is necessary in intraoperative ultrasound examinations. Many publications have reported real-time or near real-time visualization of 3D ultrasound using volumetric probes or routinely used two-dimensional (2D) probes. So far, a review on how to design an interactive system with appropriate processing algorithms has been missing, resulting in a lack of systematic understanding of the relevant technology. In this article, previous and the latest work on designing real-time or near real-time 3D ultrasound imaging systems is reviewed. Specifically, the data acquisition techniques, reconstruction algorithms, volume rendering methods, and clinical applications are presented. Moreover, the advantages and disadvantages of state-of-the-art approaches are discussed in detail.

  6. An image retrieval framework for real-time endoscopic image retargeting.

    PubMed

    Ye, Menglong; Johns, Edward; Walter, Benjamin; Meining, Alexander; Yang, Guang-Zhong

    2017-08-01

    Serial endoscopic examinations of a patient are important for early diagnosis of malignancies in the gastrointestinal tract. However, retargeting for optical biopsy is challenging due to extensive tissue variations between examinations, requiring the method to be tolerant to these changes whilst enabling real-time retargeting. This work presents an image retrieval framework for inter-examination retargeting. We propose both a novel image descriptor tolerant of long-term tissue changes and a novel descriptor matching method in real time. The descriptor is based on histograms generated from regional intensity comparisons over multiple scales, offering stability over long-term appearance changes at the higher levels, whilst remaining discriminative at the lower levels. The matching method then learns a hashing function using random forests, to compress the string and allow for fast image comparison by a simple Hamming distance metric. A dataset that contains 13 in vivo gastrointestinal videos was collected from six patients, representing serial examinations of each patient, which includes videos captured with significant time intervals. Precision-recall for retargeting shows that our new descriptor outperforms a number of alternative descriptors, whilst our hashing method outperforms a number of alternative hashing approaches. We have proposed a novel framework for optical biopsy in serial endoscopic examinations. A new descriptor, combined with a novel hashing method, achieves state-of-the-art retargeting, with validation on in vivo videos from six patients. Real-time performance also allows for practical integration without disturbing the existing clinical workflow.
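    The learned descriptor and random-forest hashing are specific to the paper; the sketch below only illustrates the final matching step they enable, i.e. comparing packed binary codes with a Hamming distance. The code length, database size, and random codes are placeholders.

      import numpy as np

      def pack_codes(bits):
          # Pack an (n_images, n_bits) boolean array into bytes for compact storage.
          return np.packbits(bits.astype(np.uint8), axis=1)

      def hamming_nearest(query_bits, database_packed):
          # Return (index, distance) of the database code closest to the query in Hamming
          # distance; the binary codes would come from the learned hashing function above.
          query_packed = np.packbits(query_bits.astype(np.uint8))
          xor = np.bitwise_xor(database_packed, query_packed)     # differing bits, per byte
          distances = np.unpackbits(xor, axis=1).sum(axis=1)      # popcount per entry
          best = int(np.argmin(distances))
          return best, int(distances[best])

      # Tiny usage example with random 128-bit codes standing in for 1000 database images.
      rng = np.random.default_rng(1)
      db_bits = rng.integers(0, 2, size=(1000, 128)).astype(bool)
      db_packed = pack_codes(db_bits)
      idx, dist = hamming_nearest(db_bits[42], db_packed)         # retrieves entry 42, distance 0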

  7. Quantitative real-time monitoring of multi-elements in airborne particulates by direct introduction into an inductively coupled plasma mass spectrometer

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshinari; Sato, Hikaru; Hiyoshi, Katsuhiro; Furuta, Naoki

    2012-10-01

    A new calibration system for real-time determination of trace elements in airborne particulates was developed. Airborne particulates were directly introduced into an inductively coupled plasma mass spectrometer, and the concentrations of 15 trace elements were determined by means of an external calibration method. External standard solutions were nebulized by an ultrasonic nebulizer (USN) coupled with a desolvation system, and the resulting aerosol was introduced into the plasma. The efficiency of sample introduction via the USN was calculated by two methods: (1) the introduction of a Cr standard solution via the USN was compared with introduction of a Cr(CO)6 standard gas via a standard gas generator and (2) the aerosol generated by the USN was trapped on filters and then analyzed. The Cr introduction efficiencies obtained by the two methods were the same, and the introduction efficiencies of the other elements were equal to the introduction efficiency of Cr. Our results indicated that our calibration method for introduction efficiency worked well for the 15 elements (Ti, V, Cr, Mn, Co, Ni, Cu, Zn, As, Mo, Sn, Sb, Ba, Tl and Pb). The real-time data and the filter-collection data agreed well for elements with low-melting oxides (V, Co, As, Mo, Sb, Tl, and Pb). In contrast, the real-time data were smaller than the filter-collection data for elements with high-melting oxides (Ti, Cr, Mn, Ni, Cu, Zn, Sn, and Ba). This result implies that the oxides of these 8 elements were not completely fused, vaporized, atomized, and ionized in the initial radiation zone of the inductively coupled plasma. However, quantitative real-time monitoring can be realized after correction for the element recoveries which can be calculated from the ratio of real-time data/filter-collection data.
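    The recovery correction described in the last sentence is simple arithmetic; the sketch below walks through it with hypothetical numbers only. A per-element recovery factor is calibrated from a previous campaign as the ratio of real-time to filter-collection results and then applied to later real-time readings, which matters most for the elements with high-melting (refractory) oxides.

      # Hypothetical numbers only: recovery = (real-time result) / (filter-collection result).
      calib_realtime = {"Mn": 12.0, "Ba": 3.2, "V": 4.1}    # ng/m3, earlier campaign, real time
      calib_filter   = {"Mn": 19.5, "Ba": 6.1, "V": 4.2}    # ng/m3, co-located filter analysis

      recovery = {el: calib_realtime[el] / calib_filter[el] for el in calib_realtime}

      new_realtime = {"Mn": 8.4, "Ba": 2.0, "V": 3.0}       # a later real-time measurement
      corrected = {el: new_realtime[el] / recovery[el] for el in new_realtime}

      for el, value in corrected.items():
          print(f"{el}: recovery {recovery[el]:.2f}, corrected {value:.1f} ng/m3")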

  8. Real-time two-dimensional temperature imaging using ultrasound.

    PubMed

    Liu, Dalong; Ebbini, Emad S

    2009-01-01

    We present a system for real-time 2D imaging of temperature change in tissue media using pulse-echo ultrasound. The frontend of the system is a SonixRP ultrasound scanner with a research interface giving us the capability of controlling the beam sequence and accessing radio frequency (RF) data in real time. The beamformed RF data is streamed to the backend of the system, where the data is processed using a two-dimensional temperature estimation algorithm running on the graphics processing unit (GPU). The estimated temperature is displayed in real time, providing feedback that can be used for real-time control of the heating source. We have verified our system with an elastography tissue-mimicking phantom and in vitro porcine heart tissue; excellent repeatability and sensitivity were demonstrated.

  9. Mapping methane emissions using the airborne imaging spectrometer AVIRIS-NG

    NASA Astrophysics Data System (ADS)

    Thorpe, A. K.; Frankenberg, C.; Thompson, D. R.; Duren, R. M.; Bue, B. D.; Green, R. O.

    2017-12-01

    The next-generation Airborne Visible/Infrared Imaging Spectrometer (AVIRIS-NG) has been used to survey large regions and map methane plumes with unambiguous identification of emission source locations. This capability is aided by real-time detection and geolocation of gas plumes, permitting adaptive surveys and communication to ground teams for rapid follow-up. We present results from AVIRIS-NG flight campaigns in Colorado, New Mexico, and California. Hundreds of plumes were observed, reflecting emissions from the energy sector that include hydraulic fracturing, gas processing plants, tanks, pumpjacks, and pipeline leaks. In some cases, plumes observed by AVIRIS-NG resulted in mitigation. Additional examples will be shown for methane from dairy lagoons, landfills, and natural emissions, as well as carbon dioxide from power plants and refineries. We describe the unique capabilities of airborne imaging spectrometers to augment other measurement techniques by efficiently surveying key regions for methane point sources and supporting timely assessment and mitigation. We summarize the outlook for near- and longer-term monitoring capabilities, including future satellite systems. Figure caption: AVIRIS-NG true color image subset with superimposed methane plume showing retrieved gas concentrations. The plume extends 200 m downwind of the southern edge of the well pad. Google Earth imagery with finer spatial resolution is also included (red box), indicating the tanks in the inset scene as the source of emissions. Five wells are located at the center of this well pad and all use horizontal drilling to produce mostly natural gas.

  10. Near Real-Time Image Reconstruction

    NASA Astrophysics Data System (ADS)

    Denker, C.; Yang, G.; Wang, H.

    2001-08-01

    In recent years, post-facto image-processing algorithms have been developed to achieve diffraction-limited observations of the solar surface. We present a combination of frame selection, speckle-masking imaging, and parallel computing which provides real-time, diffraction-limited, 256×256 pixel images at a 1-minute cadence. Our approach to achieving diffraction-limited observations is complementary to adaptive optics (AO). At the moment, AO is limited by the fact that it corrects wavefront aberrations only for a field of view comparable to the isoplanatic patch. This limitation does not apply to speckle-masking imaging. However, speckle-masking imaging relies on short-exposure images, which limits its spectroscopic applications. The parallel processing of the data is performed on a Beowulf-class computer which utilizes off-the-shelf, mass-market technologies to provide high computational performance for scientific calculations and applications at low cost. Beowulf computers have a great potential, not only for image reconstruction, but for any kind of complex data reduction. Immediate access to high-level data products and direct visualization of dynamic processes on the Sun are two of the advantages to be gained.
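    Of the steps listed above, frame selection is the simplest to illustrate; the sketch below keeps the sharpest fraction of a burst of short-exposure frames using RMS granulation contrast as an assumed selection metric. The speckle-masking reconstruction and the parallelization across a Beowulf cluster are not reproduced.

      import numpy as np

      def select_frames(burst, keep_fraction=0.1):
          # Keep the sharpest short-exposure frames from a burst of shape (n_frames, H, W),
          # using RMS granulation contrast as the selection metric (an assumed choice).
          contrast = burst.std(axis=(1, 2)) / burst.mean(axis=(1, 2))
          n_keep = max(1, int(keep_fraction * burst.shape[0]))
          best = np.argsort(contrast)[::-1][:n_keep]        # highest-contrast frames first
          return burst[best]

      burst = 1.0 + 0.05 * np.random.default_rng(0).standard_normal((100, 256, 256))
      selected = select_frames(burst)                       # 10 frames kept for reconstruction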

  11. Magneto-optical system for high speed real time imaging.

    PubMed

    Baziljevich, M; Barness, D; Sinvani, M; Perel, E; Shaulov, A; Yeshurun, Y

    2012-08-01

    A new magneto-optical system has been developed to expand the range of high speed real time magneto-optical imaging. A special source for the external magnetic field has also been designed, using a pump solenoid to rapidly excite the field coil. Together with careful modifications of the cryostat, to reduce eddy currents, ramping rates reaching 3000 T/s have been achieved. Using a powerful laser as the light source, a custom designed optical assembly, and a high speed digital camera, real time imaging rates up to 30,000 frames per second have been demonstrated.

  12. Magneto-optical system for high speed real time imaging

    NASA Astrophysics Data System (ADS)

    Baziljevich, M.; Barness, D.; Sinvani, M.; Perel, E.; Shaulov, A.; Yeshurun, Y.

    2012-08-01

    A new magneto-optical system has been developed to expand the range of high speed real time magneto-optical imaging. A special source for the external magnetic field has also been designed, using a pump solenoid to rapidly excite the field coil. Together with careful modifications of the cryostat, to reduce eddy currents, ramping rates reaching 3000 T/s have been achieved. Using a powerful laser as the light source, a custom designed optical assembly, and a high speed digital camera, real time imaging rates up to 30,000 frames per second have been demonstrated.

  13. UWGSP7: a real-time optical imaging workstation

    NASA Astrophysics Data System (ADS)

    Bush, John E.; Kim, Yongmin; Pennington, Stan D.; Alleman, Andrew P.

    1995-04-01

    With the development of UWGSP7, the University of Washington Image Computing Systems Laboratory has a real-time workstation for continuous-wave (cw) optical reflectance imaging. Recent discoveries in optical science and imaging research have suggested potential practical use of the technology as a medical imaging modality and identified the need for a machine to support these applications in real time. The UWGSP7 system was developed to provide researchers with a high-performance, versatile tool for use in optical imaging experiments with the eventual goal of bringing the technology into clinical use. One of several major applications of cw optical reflectance imaging is tumor imaging, which uses a light-absorbing dye that preferentially sequesters in tumor tissue. This property could be used to locate tumors and to identify tumor margins intraoperatively. Cw optical reflectance imaging consists of illumination of a target with a band-limited light source and monitoring the light transmitted by or reflected from the target. While continuously illuminating the target, a control image is acquired and stored. A dye is injected into a subject and a sequence of data images is acquired and processed. The data images are aligned with the control image and then subtracted to obtain a signal representing the change in optical reflectance over time. This signal can be enhanced by digital image processing and displayed in pseudo-color. This type of emerging imaging technique requires a computer system that is versatile and adaptable. The UWGSP7 utilizes a VESA local bus PC as a host computer running the Windows NT operating system and includes ICSL-developed add-on boards for image acquisition and processing. The image acquisition board is used to digitize and format the analog signal from the input device into digital frames and to average frames into images. To accommodate different input devices, the camera interface circuitry is designed on a small mezzanine board
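    The control-subtraction and pseudo-color display described above can be mimicked digitally; the sketch below subtracts an averaged pre-injection control image from each data frame and renders one difference frame with a false-color map. The random test data and the choice of colormap are assumptions.

      import numpy as np
      import matplotlib.pyplot as plt

      def reflectance_change(control, frames):
          # Subtract the pre-injection control image from each data frame; the result is
          # the change in optical reflectance over time.
          return frames.astype(float) - control.astype(float)

      rng = np.random.default_rng(0)
      control = rng.random((256, 256))
      frames = control + 0.05 * rng.random((10, 256, 256))   # simulated post-dye frames
      delta = reflectance_change(control, frames)

      plt.imshow(delta[-1], cmap="jet")                      # pseudo-color display
      plt.colorbar(label="reflectance change (a.u.)")
      plt.show()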

  14. Handheld real-time volumetric 3-D gamma-ray imaging

    NASA Astrophysics Data System (ADS)

    Haefner, Andrew; Barnowski, Ross; Luke, Paul; Amman, Mark; Vetter, Kai

    2017-06-01

    This paper presents the concept of real-time fusion of gamma-ray imaging and visual scene data for a hand-held mobile Compton imaging system in 3-D. The ability to obtain and integrate both gamma-ray and scene data from a mobile platform enables improved capabilities in the localization and mapping of radioactive materials. This not only enhances the ability to localize these materials, but it also provides important contextual information of the scene which once acquired can be reviewed and further analyzed subsequently. To demonstrate these concepts, the high-efficiency multimode imager (HEMI) is used in a hand-portable implementation in combination with a Microsoft Kinect sensor. This sensor, in conjunction with open-source software, provides the ability to create a 3-D model of the scene and to track the position and orientation of HEMI in real-time. By combining the gamma-ray data and visual data, accurate 3-D maps of gamma-ray sources are produced in real-time. This approach is extended to map the location of radioactive materials within objects with unknown geometry.

  15. Real-time biscuit tile image segmentation method based on edge detection.

    PubMed

    Matić, Tomislav; Aleksi, Ivan; Hocenski, Željko; Kraus, Dieter

    2018-05-01

    In this paper we propose a novel real-time Biscuit Tile Segmentation (BTS) method for images from ceramic tile production line. BTS method is based on signal change detection and contour tracing with a main goal of separating tile pixels from background in images captured on the production line. Usually, human operators are visually inspecting and classifying produced ceramic tiles. Computer vision and image processing techniques can automate visual inspection process if they fulfill real-time requirements. Important step in this process is a real-time tile pixels segmentation. BTS method is implemented for parallel execution on a GPU device to satisfy the real-time constraints of tile production line. BTS method outperforms 2D threshold-based methods, 1D edge detection methods and contour-based methods. Proposed BTS method is in use in the biscuit tile production line. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Near Real Time Review of Instrument Performance using the Airborne Data Processing and Analysis Software Package

    NASA Astrophysics Data System (ADS)

    Delene, D. J.

    2014-12-01

    Research aircraft that conduct atmospheric measurements carry an increasing array of instrumentation. While on-board personnel constantly review instrument parameters and time series plots, there is an overwhelming number of items. Furthermore, directing the aircraft flight takes up much of the flight scientist's time. Typically, a flight engineer is given the responsibility of reviewing the status of on-board instruments. While major issues like not receiving data are quickly identified during a flight, subtle issues like low but believable concentration measurements may go unnoticed. Therefore, it is critical to review data after a flight in near real time. The Airborne Data Processing and Analysis (ADPAA) software package used by the University of North Dakota automates the post-processing of aircraft flight data. Utilizing scripts to process the measurements recorded by data acquisition systems enables the generation of data files within an hour of flight completion. The ADPAA Cplot visualization program enables plots to be quickly generated that enable timely review of all recorded and processed parameters. Near real time review of aircraft flight data enables instrument problems to be identified, investigated and fixed before conducting another flight. On one flight, near real time data review resulted in the identification of unusually low measurements of cloud condensation nuclei, and rapid data visualization enabled the timely investigation of the cause. As a result, a leak was found and fixed before the next flight. Hence, with the high cost of aircraft flights, it is critical to find and fix instrument problems in a timely manner. The use of automated processing scripts and quick visualization software enables scientists to review aircraft flight data in near real time to identify potential problems.

  17. Real-time intraoperative fluorescence imaging system using light-absorption correction.

    PubMed

    Themelis, George; Yoo, Jung Sun; Soh, Kwang-Sup; Schulz, Ralf; Ntziachristos, Vasilis

    2009-01-01

    We present a novel fluorescence imaging system developed for real-time interventional imaging applications. The system implements a correction scheme that improves the accuracy of epi-illumination fluorescence images for light intensity variation in tissues. The implementation is based on the use of three cameras operating in parallel, utilizing a common lens, which allows for the concurrent collection of color, fluorescence, and light attenuation images at the excitation wavelength from the same field of view. The correction is based on a ratio approach of fluorescence over light attenuation images. Color images and video is used for surgical guidance and for registration with the corrected fluorescence images. We showcase the performance metrics of this system on phantoms and animals, and discuss the advantages over conventional epi-illumination systems developed for real-time applications and the limits of validity of corrected epi-illumination fluorescence imaging.
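    The correction itself is a per-pixel ratio of the fluorescence image to the co-registered light-attenuation image at the excitation wavelength; the sketch below shows that step only, with an assumed normalisation and a small regularisation constant, and does not model the three-camera acquisition.

      import numpy as np

      def corrected_fluorescence(fluorescence, attenuation, eps=1e-6):
          # Divide the fluorescence image by the co-registered light-attenuation image
          # acquired at the excitation wavelength; dark (strongly absorbing) regions would
          # otherwise suppress the fluorescence signal.  `eps` is an assumed regularisation.
          f = fluorescence.astype(float)
          a = attenuation.astype(float)
          a = a / a.max()                                    # normalise the attenuation image
          return f / (a + eps)

      rng = np.random.default_rng(0)
      corrected = corrected_fluorescence(rng.random((512, 512)), 0.2 + rng.random((512, 512)))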

  18. A customizable system for real-time image processing using the Blackfin DSProcessor and the MicroC/OS-II real-time kernel

    NASA Astrophysics Data System (ADS)

    Coffey, Stephen; Connell, Joseph

    2005-06-01

    This paper presents a development platform for real-time image processing based on the ADSP-BF533 Blackfin processor and the MicroC/OS-II real-time operating system (RTOS). MicroC/OS-II is a completely portable, ROMable, pre-emptive, real-time kernel. The Blackfin Digital Signal Processors (DSPs), incorporating the Analog Devices/Intel Micro Signal Architecture (MSA), are a broad family of 16-bit fixed-point products with a dual Multiply Accumulate (MAC) core. In addition, they have a rich instruction set with variable instruction length and both DSP and MCU functionality thus making them ideal for media based applications. Using the MicroC/OS-II for task scheduling and management, the proposed system can capture and process raw RGB data from any standard 8-bit greyscale image sensor in soft real-time and then display the processed result using a simple PC graphical user interface (GUI). Additionally, the GUI allows configuration of the image capture rate and the system and core DSP clock rates thereby allowing connectivity to a selection of image sensors and memory devices. The GUI also allows selection from a set of image processing algorithms based in the embedded operating system.

  19. Subcellular real-time in vivo imaging of intralymphatic and intravascular cancer-cell trafficking

    NASA Astrophysics Data System (ADS)

    McElroy, M.; Hayashi, K.; Kaushal, S.; Bouvet, M.; Hoffman, Robert M.

    2008-02-01

    With the use of fluorescent cells labeled with green fluorescent protein (GFP) in the nucleus and red fluorescent protein (RFP) in the cytoplasm and a highly sensitive small animal imaging system with both macro-optics and micro-optics, we have developed subcellular real-time imaging of cancer cell trafficking in live mice. Dual-color cancer cells were injected by a vascular route in an abdominal skin flap in nude mice. The mice were imaged with an Olympus OV100 small animal imaging system with a sensitive CCD camera and four objective lenses, parcentered and parfocal, enabling imaging from macrocellular to subcellular. We observed the nuclear and cytoplasmic behavior of cancer cells in real time in blood vessels as they moved by various means or adhered to the vessel surface in the abdominal skin flap. During extravasation, real-time dual-color imaging showed that cytoplasmic processes of the cancer cells exited the vessels first, with nuclei following along the cytoplasmic projections. Both cytoplasm and nuclei underwent deformation during extravasation. Different cancer cell lines seemed to strongly vary in their ability to extravasate. We have also developed real-time imaging of cancer cell trafficking in lymphatic vessels. Cancer cells labeled with GFP and/or RFP were injected into the inguinal lymph node of nude mice. The labeled cancer cells trafficked through lymphatic vessels where they were imaged via a skin flap in real-time at the cellular level until they entered the axillary lymph node. The bright dual-color fluorescence of the cancer cells and the real-time microscopic imaging capability of the Olympus OV100 enabled imaging the trafficking cancer cells in both blood vessels and lymphatics. With the dual-color cancer cells and the highly sensitive imaging system described here, the subcellular dynamics of cancer metastasis can now be observed in live mice in real time.

  20. Real-time airborne gamma-ray background estimation using NASVD with MLE and radiation transport for calibration

    NASA Astrophysics Data System (ADS)

    Kulisek, J. A.; Schweppe, J. E.; Stave, S. C.; Bernacki, B. E.; Jordan, D. V.; Stewart, T. N.; Seifert, C. E.; Kernan, W. J.

    2015-06-01

    Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this challenge, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements without the need for human analyst intervention. The method can be calibrated using radiation transport simulations along with data from previous flights over areas for which the isotopic composition need not be known. Over the examined measured and simulated data sets, the method generated accurate background estimates even in the presence of a strong 60Co source. The potential to track large and abrupt changes in background spectral shape and magnitude was demonstrated. The method can be implemented fairly easily in most modern computing languages and environments.
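    As a hedged sketch of the noise-adjusted SVD (NASVD) component named in the title, the code below rescales each spectral channel by the square root of the mean spectrum (so Poisson noise becomes roughly uniform), truncates to the leading singular components, and scales back. The component count is an assumed parameter, and the MLE fitting and transport-based calibration described above are not reproduced.

      import numpy as np

      def nasvd_smooth(spectra, n_components=4):
          # spectra: (n_records, n_channels) array of counts.  Scale channels by the square
          # root of the mean spectrum so Poisson noise is roughly uniform, keep the leading
          # singular components, and scale back.
          spectra = np.asarray(spectra, dtype=float)
          scale = np.sqrt(np.clip(spectra.mean(axis=0), 1e-12, None))
          scaled = spectra / scale
          u, s, vt = np.linalg.svd(scaled, full_matrices=False)
          approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
          return approx * scale

      smoothed = nasvd_smooth(np.random.default_rng(0).poisson(50.0, size=(500, 256)))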

  1. SWUIS-A: A Versatile, Low-Cost UV/VIS/IR Imaging System for Airborne Astronomy and Aeronomy Research

    NASA Technical Reports Server (NTRS)

    Durda, Daniel D.; Stern, S. Alan; Tomlinson, William; Slater, David C.; Vilas, Faith

    2001-01-01

    We have developed and successfully flight-tested on 14 different airborne missions the hardware and techniques for routinely conducting valuable astronomical and aeronomical observations from high-performance, two-seater military-type aircraft. The SWUIS-A (Southwest Universal Imaging System - Airborne) system consists of an image-intensified CCD camera with broad band response from the near-UV to the near IR, high-quality foreoptics, a miniaturized video recorder, an aircraft-to-camera power and telemetry interface with associated camera controls, and associated cables, filters, and other minor equipment. SWUIS-A's suite of high-quality foreoptics gives it selectable, variable focal length/variable field-of-view capabilities. The SWUIS-A camera frames at 60 Hz video rates, which is a key requirement for both jitter compensation and high time resolution (useful for occultation, lightning, and auroral studies). Broadband SWUIS-A image coadds can exceed a limiting magnitude of V = 10.5 in <1 sec with dark sky conditions. A valuable attribute of SWUIS-A airborne observations is the fact that the astronomer flies with the instrument, thereby providing Space Shuttle-like "payload specialist" capability to "close-the-loop" in real-time on the research done on each research mission. Key advantages of the small, high-performance aircraft on which we can fly SWUIS-A include significant cost savings over larger, more conventional airborne platforms, worldwide basing obviating the need for expensive, campaign-style movement of specialized large aircraft and their logistics support teams, and ultimately faster reaction times to transient events. Compared to ground-based instruments, airborne research platforms offer superior atmospheric transmission, the mobility to reach remote and often-times otherwise unreachable locations over the Earth, and virtually-guaranteed good weather for observing the sky. Compared to space-based instruments, airborne platforms typically offer

  2. The design of real time infrared image generation software based on Creator and Vega

    NASA Astrophysics Data System (ADS)

    Wang, Rui-feng; Wu, Wei-dong; Huo, Jun-xiu

    2013-09-01

    Considering the requirements for highly realistic, real-time dynamic infrared imagery in infrared image simulation, a method for designing a real-time infrared image simulation application on the VC++ platform is proposed, based on the visual simulation software Creator and Vega. The functions of Creator are introduced briefly, and the main features of the Vega development environment are analyzed. Methods for infrared modeling of targets and backgrounds are presented, the design flow chart of the real-time IR image generation software is given, and the functions of the TMM Tool, the MAT Tool, and the sensor module are explained; the real-time performance of the software is also addressed.

  3. Real-time image mosaicing for medical applications.

    PubMed

    Loewke, Kevin E; Camarillo, David B; Jobst, Christopher A; Salisbury, J Kenneth

    2007-01-01

    In this paper we describe the development of a robotically-assisted image mosaicing system for medical applications. The processing occurs in real-time due to a fast initial image alignment provided by robotic position sensing. Near-field imaging, defined by relatively large camera motion, requires translations as well as pan and tilt orientations to be measured. To capture these measurements we use 5-d.o.f. sensing along with a hand-eye calibration to account for sensor offset. This sensor-based approach speeds up the mosaicing, eliminates cumulative errors, and readily handles arbitrary camera motions. Our results have produced visually satisfactory mosaics on a dental model but can be extended to other medical images.

  4. A Novel, Real-Time, In Vivo Mouse Retinal Imaging System

    PubMed Central

    Butler, Mark C.; Sullivan, Jack M.

    2015-01-01

    Purpose To develop an efficient, low-cost instrument for robust real-time imaging of the mouse retina in vivo, and assess system capabilities by evaluating various animal models. Methods Following multiple disappointing attempts to visualize the mouse retina during a subretinal injection using commercially available systems, we identified the key limitation to be inadequate illumination due to off-axis illumination and poor optical train optimization. Therefore, we designed a paraxial illumination system for a Greenough-type stereo dissecting microscope incorporating an optimized optical launch and an efficiently coupled fiber optic delivery system. Excitation and emission filters control spectral bandwidth. A color charge-coupled device (CCD) camera is coupled to the microscope for image capture. Although the field of view (FOV) is constrained by the small pupil aperture, the high optical power of the mouse eye, and the long working distance (needed for surgical manipulations), these limitations can be compensated for by eye positioning in order to observe the entire retina. Results The retinal imaging system delivers an adjustable narrow beam to the dilated pupil with minimal vignetting. The optic nerve, vasculature, and posterior pole are crisply visualized and the entire retina can be observed through eye positioning. Normal and degenerative retinal phenotypes can be followed over time. Subretinal or intraocular injection procedures are followed in real time. Real-time, intravenous fluorescein angiography for the live mouse has been achieved. Conclusions A novel device is established for real-time viewing and image capture of the small animal retina during subretinal injections for preclinical gene therapy studies. PMID:26551329

  5. A Novel, Real-Time, In Vivo Mouse Retinal Imaging System.

    PubMed

    Butler, Mark C; Sullivan, Jack M

    2015-11-01

    To develop an efficient, low-cost instrument for robust real-time imaging of the mouse retina in vivo, and assess system capabilities by evaluating various animal models. Following multiple disappointing attempts to visualize the mouse retina during a subretinal injection using commercially available systems, we identified the key limitation to be inadequate illumination due to off-axis illumination and poor optical train optimization. Therefore, we designed a paraxial illumination system for a Greenough-type stereo dissecting microscope incorporating an optimized optical launch and an efficiently coupled fiber optic delivery system. Excitation and emission filters control spectral bandwidth. A color charge-coupled device (CCD) camera is coupled to the microscope for image capture. Although the field of view (FOV) is constrained by the small pupil aperture, the high optical power of the mouse eye, and the long working distance (needed for surgical manipulations), these limitations can be compensated for by eye positioning in order to observe the entire retina. The retinal imaging system delivers an adjustable narrow beam to the dilated pupil with minimal vignetting. The optic nerve, vasculature, and posterior pole are crisply visualized and the entire retina can be observed through eye positioning. Normal and degenerative retinal phenotypes can be followed over time. Subretinal or intraocular injection procedures are followed in real time. Real-time, intravenous fluorescein angiography for the live mouse has been achieved. A novel device is established for real-time viewing and image capture of the small animal retina during subretinal injections for preclinical gene therapy studies.

  6. Toroidal sensor arrays for real-time photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Bychkov, Anton S.; Cherepetskaya, Elena B.; Karabutov, Alexander A.; Makarov, Vladimir A.

    2017-07-01

    This article addresses theoretical and numerical investigation of image formation in photoacoustic (PA) imaging with complex-shaped concave sensor arrays. The spatial resolution and the size of the sensitivity region of PA and laser ultrasonic (LU) imaging systems are assessed using sensitivity maps and spatial resolution maps in the image plane. This paper also discusses the relationship between the size of the high-sensitivity region and the spatial resolution of real-time imaging systems utilizing toroidal arrays. It is shown that the use of arrays with toroidal geometry significantly improves the diagnostic capabilities of PA and LU imaging for investigating biological objects, rocks, and composite materials.

  7. Real-time dynamic display of registered 4D cardiac MR and ultrasound images using a GPU

    NASA Astrophysics Data System (ADS)

    Zhang, Q.; Huang, X.; Eagleson, R.; Guiraudon, G.; Peters, T. M.

    2007-03-01

    In minimally invasive image-guided surgical interventions, different imaging modalities, such as magnetic resonance imaging (MRI), computed tomography (CT), and real-time three-dimensional (3D) ultrasound (US), can provide complementary, multi-spectral image information. Multimodality dynamic image registration is a well-established approach that permits real-time diagnostic information to be enhanced by placing lower-quality real-time images within a high quality anatomical context. For the guidance of cardiac procedures, it would be valuable to register dynamic MRI or CT with intraoperative US. However, in practice, either the high computational cost prohibits such real-time visualization of volumetric multimodal images in a real-world medical environment, or else the resulting image quality is not satisfactory for accurate guidance during the intervention. Modern graphics processing units (GPUs) provide the programmability, parallelism and increased computational precision to begin to address this problem. In this work, we first outline our research on dynamic 3D cardiac MR and US image acquisition, real-time dual-modality registration and US tracking. Then we describe image processing and optimization techniques for 4D (3D + time) cardiac image real-time rendering. We also present our multimodality 4D medical image visualization engine, which directly runs on a GPU in real-time by exploiting the advantages of the graphics hardware. In addition, techniques such as multiple transfer functions for different imaging modalities, dynamic texture binding, advanced texture sampling and multimodality image compositing are employed to facilitate the real-time display and manipulation of the registered dual-modality dynamic 3D MR and US cardiac datasets.

  8. Real-time Imaging Orientation Determination System to Verify Imaging Polarization Navigation Algorithm

    PubMed Central

    Lu, Hao; Zhao, Kaichun; Wang, Xiaochu; You, Zheng; Huang, Kaoli

    2016-01-01

    Bio-inspired imaging polarization navigation, which senses skylight polarization to provide navigation information, has advantages of high precision and interference resistance over polarization navigation sensors that use photodiodes. Although various types of imaging polarimeters exist, they may not be suitable for research on imaging polarization navigation algorithms. To verify the algorithm, a real-time imaging orientation determination system was designed and implemented. Essential calibration procedures for this type of system, including camera parameter calibration and calibration of the inconsistency of the complementary metal oxide semiconductor (CMOS) sensors, were discussed, designed, and implemented. The calibration results were used to undistort and rectify the multi-camera system. An orientation determination experiment was conducted. The results indicated that the calibrated system could acquire and process the polarized skylight images and resolve orientation with the algorithm in real time. An orientation determination algorithm based on image processing was tested on the system, and its performance and properties were evaluated. The update rate of the algorithm was over 1 Hz, the error was within 0.313°, and the population standard deviation was 0.148° without any data filtering. PMID:26805851

  9. Acquisition performance of LAPAN-A3/IPB multispectral imager in real-time mode of operation

    NASA Astrophysics Data System (ADS)

    Hakim, P. R.; Permala, R.; Jayani, A. P. S.

    2018-05-01

    LAPAN-A3/IPB satellite was launched in June 2016, and its multispectral imager has been producing images covering Indonesia. To improve its support for remote sensing applications, the imager should produce images of high quality and quantity. To increase the quantity of LAPAN-A3/IPB multispectral images captured, image acquisition can be executed in real-time mode from the LAPAN ground station in Bogor when the satellite passes over western Indonesia. This research analyses the performance of LAPAN-A3/IPB multispectral imager acquisition in real-time mode, in terms of image quality and quantity, under the assumption of several on-board and ground segment limitations. Results show that in real-time operation mode the LAPAN-A3/IPB multispectral imager can produce twice as much image coverage as in recorded mode. However, the images produced in real-time mode have slightly degraded quality due to the image compression process involved. Based on the analyses carried out in this research, it is recommended to use the real-time acquisition mode whenever possible, except in circumstances that strictly do not allow any quality degradation of the images produced.

  10. Use of Airborne Hyperspectral Data in the Simulation of Satellite Images

    NASA Astrophysics Data System (ADS)

    de Miguel, Eduardo; Jimenez, Marcos; Ruiz, Elena; Salido, Elena; Gutierrez de la Camara, Oscar

    2016-08-01

    The simulation of future images is part of the development phase of most Earth Observation missions. This simulation frequently uses images acquired by airborne instruments as a starting point. These instruments provide the required flexibility in acquisition parameters (time, date, illumination and observation geometry, etc.) and high spectral and spatial resolution, well above the target values (as required by simulation tools). However, there are a number of important problems hampering the use of airborne imagery. One of these problems is that observation zenith angles (OZA) are far from those that the missions to be simulated would use. We examine this problem by evaluating the difference in ground reflectance estimated from airborne images for different observation/illumination geometries. Next, we analyze a solution for simulation purposes, in which a Bidirectional Reflectance Distribution Function (BRDF) model is attached to an image of the isotropic surface reflectance. The results obtained confirm the need for reflectance anisotropy correction when using airborne images to create a reflectance map for simulation purposes. However, this correction should not be used without providing the corresponding estimate of the BRDF, in the form of model parameters, to the simulation teams.
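
    As a rough, hedged illustration of attaching an empirical BRDF model to multi-angular reflectance samples, the sketch below fits a Walthall-type model, rho = a*tv^2 + b*tv*cos(phi) + c, by least squares and normalizes the samples to a nadir-equivalent value; the angle conventions, synthetic coefficients, and function names are illustrative assumptions, not the procedure used by the authors.

```python
import numpy as np

def fit_walthall(rho, view_zenith_rad, rel_azimuth_rad):
    """Fit rho = a*tv**2 + b*tv*cos(phi) + c by least squares."""
    tv, phi = view_zenith_rad, rel_azimuth_rad
    A = np.column_stack([tv ** 2, tv * np.cos(phi), np.ones_like(tv)])
    coeffs, *_ = np.linalg.lstsq(A, rho, rcond=None)
    return coeffs                      # (a, b, c); c approximates nadir reflectance

def normalize_to_nadir(rho, view_zenith_rad, rel_azimuth_rad, coeffs):
    """Remove the modeled directional effect, returning nadir-equivalent reflectance."""
    a, b, c = coeffs
    modeled = a * view_zenith_rad ** 2 + b * view_zenith_rad * np.cos(rel_azimuth_rad) + c
    return rho * c / modeled

# Example: synthetic observations of one surface class at several view geometries
tv = np.radians([5.0, 20.0, 35.0, 20.0, 35.0])
phi = np.radians([0.0, 0.0, 0.0, 180.0, 180.0])
rho = 0.30 + 0.05 * tv ** 2 + 0.04 * tv * np.cos(phi)   # synthetic anisotropic reflectance
coeffs = fit_walthall(rho, tv, phi)
print(np.round(coeffs, 3), normalize_to_nadir(rho, tv, phi, coeffs).round(3))
```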

  11. Real-time, T-ray imaging using a sub-terahertz gyrotron

    NASA Astrophysics Data System (ADS)

    Han, Seong-Tae; Torrezan, Antonio C.; Sirigiri, Jagadishwar R.; Shapiro, Michael A.; Temkin, Richard J.

    2012-06-01

    We demonstrated real-time, active, T-ray imaging using a 0.46 THz gyrotron capable of producing 16 W in continuous wave operation and a pyroelectric array camera with 124-by-124 pixels. An expanded Gaussian beam from the gyrotron was used to maintain the power density above the detection level of the pyroelectric array over the area of the irradiated object. Real-time imaging at a video rate of 48 Hz was achieved through the use of the built-in chopper of the camera. Potential applications include fast scanning for security purposes and for quality control of dry or frozen foods.

  12. Internet Telepresence by Real-Time View-Dependent Image Generation with Omnidirectional Video Camera

    NASA Astrophysics Data System (ADS)

    Morita, Shinji; Yamazawa, Kazumasa; Yokoya, Naokazu

    2003-01-01

    This paper describes a new networked telepresence system which realizes virtual tours into a visualized dynamic real world without significant time delay. Our system is realized by the following three steps: (1) video-rate omnidirectional image acquisition, (2) transportation of an omnidirectional video stream via the internet, and (3) real-time view-dependent perspective image generation from the omnidirectional video stream. Our system is applicable to real-time telepresence in situations where the real world to be seen is far from the observation site, because the time delay from a change of the user's viewing direction to the change of the displayed image is small and does not depend on the actual distance between the two sites. Moreover, multiple users can look around from a single viewpoint in a visualized dynamic real world in different directions at the same time. In experiments, we have shown that the proposed system is useful for internet telepresence.
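
    As a hedged sketch of step (3), view-dependent perspective image generation, the snippet below renders a pinhole-camera view in a chosen yaw/pitch direction from an equirectangular omnidirectional frame using nearest-neighbour sampling; the array shapes, angle conventions, and function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def perspective_from_equirect(pano, yaw, pitch, fov_deg=60.0, out_w=640, out_h=480):
    """Sample a perspective view (yaw/pitch in radians) from an equirectangular panorama."""
    H, W = pano.shape[:2]
    f = 0.5 * out_w / np.tan(0.5 * np.radians(fov_deg))   # focal length in pixels
    xs = (np.arange(out_w) - 0.5 * out_w) / f
    ys = (np.arange(out_h) - 0.5 * out_h) / f
    xv, yv = np.meshgrid(xs, ys)
    dirs = np.stack([xv, yv, np.ones_like(xv)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate camera-frame rays into the panorama frame (pitch about x, then yaw about y)
    cy, sy, cp, sp = np.cos(yaw), np.sin(yaw), np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = dirs @ (Ry @ Rx).T
    lon = np.arctan2(d[..., 0], d[..., 2])            # -pi..pi across panorama width
    lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))    # -pi/2..pi/2 across panorama height
    u = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    v = ((lat / np.pi + 0.5) * (H - 1)).astype(int)
    return pano[v, u]

# Example: a synthetic 512x1024 RGB panorama, looking 30 degrees to the right
pano = np.random.default_rng(0).integers(0, 255, (512, 1024, 3), dtype=np.uint8)
print(perspective_from_equirect(pano, yaw=np.radians(30), pitch=0.0).shape)   # (480, 640, 3)
```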

  13. Determination of pasture quality using airborne hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Pullanagari, R. R.; Kereszturi, G.; Yule, Ian J.; Irwin, M. E.

    2015-10-01

    Pasture quality is a critical determinant which influences animal performance (live weight gain, milk and meat production) and animal health. Assessment of pasture quality is therefore required to assist farmers with grazing planning and management, and with benchmarking between seasons and years. Traditionally, pasture quality is determined by field sampling, which is laborious, expensive and time consuming, and the information is not available in real time. Hyperspectral remote sensing has the potential to accurately quantify the biochemical composition of pasture over wide areas in great spatial detail. In this study an airborne imaging spectrometer (AisaFENIX, Specim) was used, covering a spectral range of 380-2500 nm with 448 spectral bands. A case study of a 600 ha hill country farm in New Zealand is used to illustrate the use of the system. Radiometric and atmospheric corrections, along with automated georectification of the imagery using a Digital Elevation Model (DEM), were applied to the raw images to convert them into geocoded reflectance images. Then a multivariate statistical method, partial least squares (PLS) regression, was applied to estimate pasture quality attributes such as crude protein (CP) and metabolisable energy (ME) from canopy reflectance. The results from this study revealed that estimates of CP and ME had an R2 of 0.77 and 0.79, and an RMSECV of 2.97 and 0.81, respectively. By utilizing these regression models, spatial maps were created over the imaged area. These pasture quality maps can be used for adopting precision agriculture practices, which improves farm profitability and environmental sustainability.
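
    As a hedged sketch of the regression step described above (not the authors' exact pipeline), the snippet below uses scikit-learn's partial least squares regression with cross-validation to predict crude protein from canopy reflectance spectra; the file names, number of latent variables, and array shapes are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical inputs: one row of reflectance per field sample, plus lab-measured CP (%)
X = np.load("canopy_reflectance.npy")   # shape (n_samples, 448 bands), hypothetical file
y = np.load("crude_protein.npy")        # shape (n_samples,), hypothetical file

pls = PLSRegression(n_components=12)    # latent-variable count would be tuned by CV in practice
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

ss_res = np.sum((y - y_cv) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("R2 (CV):", 1 - ss_res / ss_tot)
print("RMSECV: ", np.sqrt(np.mean((y - y_cv) ** 2)))
```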

  14. Measuring the complexity of design in real-time imaging software

    NASA Astrophysics Data System (ADS)

    Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.

    2007-02-01

    Due to the intricacies in the algorithms involved, the design of imaging software is considered to be more complex than non-image processing software (Sangwan et al, 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and nonimage processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provides a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
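
    To make the idea of code-level complexity measurement concrete, the sketch below computes an approximate McCabe cyclomatic complexity per function from a Python syntax tree; the simplified counting rules are an assumption for illustration and do not reproduce the metric suites or tools cited above.

```python
import ast

# Node types counted as decision points (simplified McCabe-style rules)
DECISION_NODES = (ast.If, ast.For, ast.While, ast.And, ast.Or,
                  ast.ExceptHandler, ast.IfExp, ast.comprehension)

def cyclomatic_complexity(func_node):
    """1 + number of decision points inside the function body."""
    return 1 + sum(isinstance(n, DECISION_NODES) for n in ast.walk(func_node))

def report(source_code):
    tree = ast.parse(source_code)
    return {n.name: cyclomatic_complexity(n)
            for n in ast.walk(tree)
            if isinstance(n, (ast.FunctionDef, ast.AsyncFunctionDef))}

example = """
def threshold(img, t):
    out = []
    for row in img:
        out.append([1 if p > t else 0 for p in row])
    return out
"""
print(report(example))   # {'threshold': 4}: 1 + for-loop + comprehension + conditional expression
```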

  15. Registration of angiographic image on real-time fluoroscopic image for image-guided percutaneous coronary intervention.

    PubMed

    Kim, Dongkue; Park, Sangsoo; Jeong, Myung Ho; Ryu, Jeha

    2018-02-01

    In percutaneous coronary intervention (PCI), cardiologists must study two different X-ray image sources: a fluoroscopic image and an angiogram. Manipulating a guidewire while alternately monitoring the two separate images on separate screens requires a deep understanding of the anatomy of the coronary vessels and substantial training. We propose 2D/2D spatiotemporal image registration of the two sources in a single image in order to provide cardiologists with enhanced visual guidance in PCI. The proposed 2D/2D spatiotemporal registration method uses cross-correlation of the ECG series associated with each image source to temporally synchronize the two separate images and register an angiographic image onto the fluoroscopic image. A guidewire centerline is then extracted from the fluoroscopic image in real time, and the alignment of the centerline with the vessel outlines of the chosen angiographic image is optimized using the iterative closest point algorithm for spatial registration. A proof-of-concept evaluation with a phantom coronary vessel model and engineering students showed an error reduction of more than 74% in wrong insertions into nontarget branches compared with the non-registration method, and a reduction of more than 47% in task completion time for guidewire manipulation in very difficult tasks. Evaluation with a small number of experienced doctors shows a potentially significant reduction in both task completion time and error rate for difficult tasks. The total registration time with real-procedure X-ray (angiographic and fluoroscopic) images is about 60 ms, which is within the fluoroscopic image acquisition rate of 15 Hz. By providing cardiologists with better visual guidance in PCI, the proposed spatiotemporal image registration method is shown to be useful in advancing the guidewire into the coronary vessel branches, especially those difficult to insert into.
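
    The temporal-synchronization step can be sketched briefly: the lag between the two ECG series is estimated by cross-correlation, so that frames at matching cardiac phases can be paired before the spatial (ICP) registration. The synthetic signal model, sampling rate, and function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ecg_lag(x, y, fs=1000.0):
    """Return how much trace x lags trace y, in seconds (positive means x is delayed)."""
    a = (x - x.mean()) / x.std()
    b = (y - y.mean()) / y.std()
    corr = np.correlate(a, b, mode="full")
    lag_samples = np.argmax(corr) - (len(b) - 1)
    return lag_samples / fs

def beat(tt):
    """Crude 1 Hz R-wave train used as a stand-in ECG."""
    return np.exp(-((tt % 1.0 - 0.5) ** 2) / 0.001)

# Example: the first trace is a copy of the second, delayed by 0.25 s
t = np.arange(0, 5, 1 / 1000.0)
print(ecg_lag(beat(t - 0.25), beat(t)))   # ~0.25
```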

  16. An integrated compact airborne multispectral imaging system using embedded computer

    NASA Astrophysics Data System (ADS)

    Zhang, Yuedong; Wang, Li; Zhang, Xuguo

    2015-08-01

    An integrated, compact airborne multispectral imaging system with an embedded-computer-based control system was developed for multispectral imaging from small aircraft. The system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system), and an embedded computer. The embedded computer offers excellent universality and expansibility and has advantages in volume and weight for an airborne platform, so it can meet the requirements of the control system of the integrated airborne multispectral imaging system. The embedded computer controls camera parameter settings, filter wheel and stabilized platform operation, and image and POS data acquisition, and it stores the images and data. Peripheral devices can be connected through the ports of the embedded computer, which makes system operation and management of the stored image data straightforward. The airborne multispectral imaging system has the advantages of small volume, multiple functions, and good expansibility. Imaging experiment results show that this system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.

  17. An automatic detection method for the boiler pipe header based on real-time image acquisition

    NASA Astrophysics Data System (ADS)

    Long, Yi; Liu, YunLong; Qin, Yongliang; Yang, XiangWei; Li, DengKe; Shen, DingJie

    2017-06-01

    Generally, an endoscope is used to inspect the inner part of the boiler pipe header in thermal power plants. However, because the endoscope hose is operated manually, the length and angle of the inserted probe cannot be controlled precisely. In addition, there is a large observation blind spot, limited by the length of the endoscope wire. To solve these problems, an automatic detection method for the boiler pipe header based on real-time image acquisition and simulation comparison techniques is proposed. A magnetic crawler with permanent-magnet wheels carries the real-time image acquisition device to perform the crawling work and collect live scene images. Using the location obtained from the positioning auxiliary device, the position of the real-time detection image is calibrated in a virtual 3-D model. By comparing the real-time detection images with the computer simulation images, defects or foreign matter that has fallen in can be accurately located, making repair and clean-up more convenient.

  18. Real-time three-dimensional optical coherence tomography image-guided core-needle biopsy system.

    PubMed

    Kuo, Wei-Cheng; Kim, Jongsik; Shemonski, Nathan D; Chaney, Eric J; Spillman, Darold R; Boppart, Stephen A

    2012-06-01

    Advances in optical imaging modalities, such as optical coherence tomography (OCT), enable us to observe tissue microstructure at high resolution and in real time. Currently, core-needle biopsies are guided by external imaging modalities such as ultrasound imaging and x-ray computed tomography (CT) for breast and lung masses, respectively. These image-guided procedures are frequently limited by spatial resolution when using ultrasound imaging, or by temporal resolution (rapid real-time feedback capabilities) when using x-ray CT. One feasible approach is to perform OCT within small gauge needles to optically image tissue microstructure. However, to date, no system or core-needle device has been developed that incorporates both three-dimensional OCT imaging and tissue biopsy within the same needle for true OCT-guided core-needle biopsy. We have developed and demonstrate an integrated core-needle biopsy system that utilizes catheter-based 3-D OCT for real-time image-guidance for target tissue localization, imaging of tissue immediately prior to physical biopsy, and subsequent OCT imaging of the biopsied specimen for immediate assessment at the point-of-care. OCT images of biopsied ex vivo tumor specimens acquired during core-needle placement are correlated with corresponding histology, and computational visualization of arbitrary planes within the 3-D OCT volumes enables feedback on specimen tissue type and biopsy quality. These results demonstrate the potential for using real-time 3-D OCT for needle biopsy guidance by imaging within the needle and tissue during biopsy procedures.

  19. Video enhancement workbench: an operational real-time video image processing system

    NASA Astrophysics Data System (ADS)

    Yool, Stephen R.; Van Vactor, David L.; Smedley, Kirk G.

    1993-01-01

    Video image sequences can be exploited in real-time, giving analysts rapid access to information for military or criminal investigations. Video-rate dynamic range adjustment subdues fluctuations in image intensity, thereby assisting discrimination of small or low- contrast objects. Contrast-regulated unsharp masking enhances differentially shadowed or otherwise low-contrast image regions. Real-time removal of localized hotspots, when combined with automatic histogram equalization, may enhance resolution of objects directly adjacent. In video imagery corrupted by zero-mean noise, real-time frame averaging can assist resolution and location of small or low-contrast objects. To maximize analyst efficiency, lengthy video sequences can be screened automatically for low-frequency, high-magnitude events. Combined zoom, roam, and automatic dynamic range adjustment permit rapid analysis of facial features captured by video cameras recording crimes in progress. When trying to resolve small objects in murky seawater, stereo video places the moving imagery in an optimal setting for human interpretation.
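
    Several of the per-frame enhancements mentioned above (frame averaging against zero-mean noise, dynamic range adjustment via histogram equalization, and unsharp masking) can be sketched in a few lines of OpenCV/NumPy; the camera index, averaging weight, and kernel sizes below are illustrative assumptions rather than parameters of the described workbench.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)    # hypothetical video source
avg = None
ALPHA = 0.1                  # weight of the newest frame in the running average

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Temporal frame averaging (exponential moving average of recent frames)
    avg = gray if avg is None else (1 - ALPHA) * avg + ALPHA * gray
    denoised = avg.astype(np.uint8)

    # Dynamic range adjustment via histogram equalization
    equalized = cv2.equalizeHist(denoised)

    # Unsharp masking: add back a scaled high-pass component
    blurred = cv2.GaussianBlur(equalized, (9, 9), 0)
    sharpened = cv2.addWeighted(equalized, 1.5, blurred, -0.5, 0)

    cv2.imshow("enhanced", sharpened)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```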

  20. Real-time moving objects detection and tracking from airborne infrared camera

    NASA Astrophysics Data System (ADS)

    Zingoni, Andrea; Diani, Marco; Corsini, Giovanni

    2017-10-01

    Detecting and tracking moving objects in real-time from an airborne infrared (IR) camera offers interesting possibilities in video surveillance, remote sensing and computer vision applications, such as monitoring large areas simultaneously, quickly changing the point of view on the scene and pursuing objects of interest. To fully exploit such a potential, versatile solutions are needed, but, in the literature, the majority of them works only under specific conditions about the considered scenario, the characteristics of the moving objects or the aircraft movements. In order to overcome these limitations, we propose a novel approach to the problem, based on the use of a cheap inertial navigation system (INS), mounted on the aircraft. To exploit jointly the information contained in the acquired video sequence and the data provided by the INS, a specific detection and tracking algorithm has been developed. It consists of three main stages performed iteratively on each acquired frame. The detection stage, in which a coarse detection map is computed, using a local statistic both fast to calculate and robust to noise and self-deletion of the targeted objects. The registration stage, in which the position of the detected objects is coherently reported on a common reference frame, by exploiting the INS data. The tracking stage, in which the steady objects are rejected, the moving objects are tracked, and an estimation of their future position is computed, to be used in the subsequent iteration. The algorithm has been tested on a large dataset of simulated IR video sequences, recreating different environments and different movements of the aircraft. Promising results have been obtained, both in terms of detection and false alarm rate, and in terms of accuracy in the estimation of position and velocity of the objects. In addition, for each frame, the detection and tracking map has been generated by the algorithm, before the acquisition of the subsequent frame, proving its
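
    The detection stage described above relies on a local statistic computed over each frame. A hedged sketch of one such statistic is shown below: pixels deviating from their local mean by more than a multiple of the local standard deviation are flagged in a coarse detection map. The window size, threshold, and synthetic frame are illustrative assumptions, not the authors' values.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coarse_detection_map(frame, win=15, k=5.0):
    """Flag pixels deviating from their local mean by more than k local standard deviations."""
    f = frame.astype(np.float64)
    local_mean = uniform_filter(f, size=win)
    local_sq = uniform_filter(f * f, size=win)
    local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 1e-12))
    return np.abs(f - local_mean) > k * local_std

# Example: a synthetic IR frame with a small warm target on a noisy background
rng = np.random.default_rng(0)
frame = rng.normal(100.0, 2.0, (240, 320))
frame[120:122, 160:162] += 30.0                 # 2x2 target, 15 sigma above the noise
print(coarse_detection_map(frame).sum())        # ~4: the target pixels are flagged
```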

  1. Towards real-time quantitative optical imaging for surgery

    NASA Astrophysics Data System (ADS)

    Gioux, Sylvain

    2017-07-01

    There is a pressing clinical need to provide image guidance during surgery. Currently, assessment of tissue that needs to be resected or avoided is performed subjectively leading to a large number of failures, patient morbidity and increased healthcare cost. Because near-infrared (NIR) optical imaging is safe, does not require contact, and can provide relatively deep information (several mm), it offers unparalleled capabilities for providing image guidance during surgery. In this work, we introduce a novel concept that enables the quantitative imaging of endogenous molecular information over large fields-of-view. Because this concept can be implemented in real-time, it is amenable to provide video-rate endogenous information during surgery.

  2. Ultrasound contrast agent imaging: Real-time imaging of the superharmonics

    NASA Astrophysics Data System (ADS)

    Peruzzini, D.; Viti, J.; Tortoli, P.; Verweij, M. D.; de Jong, N.; Vos, H. J.

    2015-10-01

    Currently, in medical ultrasound contrast agent (UCA) imaging the second harmonic scattering of the microbubbles is regularly used. This scattering is in competition with the signal that is caused by nonlinear wave propagation in tissue. It was reported that UCA imaging based on the third or higher harmonics, i.e. "superharmonic" imaging, shows better contrast. However, the superharmonic scattering has a lower signal level compared to e.g. second harmonic signals. This study investigates the contrast-to-tissue ratio (CTR) and signal to noise ratio (SNR) of superharmonic UCA scattering in a tissue/vessel mimicking phantom using a real-time clinical scanner. Numerical simulations were performed to estimate the level of harmonics generated by the microbubbles. Data were acquired with a custom built dual-frequency cardiac phased array probe. Fundamental real-time images were produced while beam formed radiofrequency (RF) data was stored for further offline processing. The phantom consisted of a cavity filled with UCA surrounded by tissue mimicking material. The acoustic pressure in the cavity of the phantom was 110 kPa (MI = 0.11) ensuring non-destructivity of UCA. After processing of the acquired data from the phantom, the UCA-filled cavity could be clearly observed in the images, while tissue signals were suppressed at or below the noise floor. The measured CTR values were 36 dB, >38 dB, and >32 dB, for the second, third, and fourth harmonic respectively, which were in agreement with those reported earlier for preliminary contrast superharmonic imaging. The single frame SNR values (in which `signal' denotes the signal level from the UCA area) were 23 dB, 18 dB, and 11 dB, respectively. This indicates that noise, and not the tissue signal, is the limiting factor for the UCA detection when using the superharmonics in nondestructive mode.

  3. Real-Time Investigation of Tuberculosis Transmission: Developing the Respiratory Aerosol Sampling Chamber (RASC).

    PubMed

    Wood, Robin; Morrow, Carl; Barry, Clifton E; Bryden, Wayne A; Call, Charles J; Hickey, Anthony J; Rodes, Charles E; Scriba, Thomas J; Blackburn, Jonathan; Issarow, Chacha; Mulder, Nicola; Woodward, Jeremy; Moosa, Atica; Singh, Vinayak; Mizrahi, Valerie; Warner, Digby F

    2016-01-01

    Knowledge of the airborne nature of respiratory disease transmission owes much to the pioneering experiments of Wells and Riley over half a century ago. However, the mechanical, physiological, and immunopathological processes which drive the production of infectious aerosols by a diseased host remain poorly understood. Similarly, very little is known about the specific physiological, metabolic and morphological adaptations which enable pathogens such as Mycobacterium tuberculosis (Mtb) to exit the infected host, survive exposure to the external environment during airborne carriage, and adopt a form that is able to enter the respiratory tract of a new host, avoiding innate immune and physical defenses to establish a nascent infection. As a first step towards addressing these fundamental knowledge gaps which are central to any efforts to interrupt disease transmission, we developed and characterized a small personal clean room comprising an array of sampling devices which enable isolation and representative sampling of airborne particles and organic matter from tuberculosis (TB) patients. The complete unit, termed the Respiratory Aerosol Sampling Chamber (RASC), is instrumented to provide real-time information about the particulate output of a single patient, and to capture samples via a suite of particulate impingers, impactors and filters. Applying the RASC in a clinical setting, we demonstrate that a combination of molecular and microbiological assays, as well as imaging by fluorescence and scanning electron microscopy, can be applied to investigate the identity, viability, and morphology of isolated aerosolized particles. Importantly, from a preliminary panel of active TB patients, we observed the real-time production of large numbers of airborne particles including Mtb, as confirmed by microbiological culture and polymerase chain reaction (PCR) genotyping. Moreover, direct imaging of captured samples revealed the presence of multiple rod-like Mtb organisms whose

  4. Real-Time Investigation of Tuberculosis Transmission: Developing the Respiratory Aerosol Sampling Chamber (RASC)

    PubMed Central

    Wood, Robin; Morrow, Carl; Barry, Clifton E.; Bryden, Wayne A.; Call, Charles J.; Hickey, Anthony J.; Rodes, Charles E.; Scriba, Thomas J.; Blackburn, Jonathan; Issarow, Chacha; Mulder, Nicola; Woodward, Jeremy; Moosa, Atica; Singh, Vinayak; Mizrahi, Valerie; Warner, Digby F.

    2016-01-01

    Knowledge of the airborne nature of respiratory disease transmission owes much to the pioneering experiments of Wells and Riley over half a century ago. However, the mechanical, physiological, and immunopathological processes which drive the production of infectious aerosols by a diseased host remain poorly understood. Similarly, very little is known about the specific physiological, metabolic and morphological adaptations which enable pathogens such as Mycobacterium tuberculosis (Mtb) to exit the infected host, survive exposure to the external environment during airborne carriage, and adopt a form that is able to enter the respiratory tract of a new host, avoiding innate immune and physical defenses to establish a nascent infection. As a first step towards addressing these fundamental knowledge gaps which are central to any efforts to interrupt disease transmission, we developed and characterized a small personal clean room comprising an array of sampling devices which enable isolation and representative sampling of airborne particles and organic matter from tuberculosis (TB) patients. The complete unit, termed the Respiratory Aerosol Sampling Chamber (RASC), is instrumented to provide real-time information about the particulate output of a single patient, and to capture samples via a suite of particulate impingers, impactors and filters. Applying the RASC in a clinical setting, we demonstrate that a combination of molecular and microbiological assays, as well as imaging by fluorescence and scanning electron microscopy, can be applied to investigate the identity, viability, and morphology of isolated aerosolized particles. Importantly, from a preliminary panel of active TB patients, we observed the real-time production of large numbers of airborne particles including Mtb, as confirmed by microbiological culture and polymerase chain reaction (PCR) genotyping. Moreover, direct imaging of captured samples revealed the presence of multiple rod-like Mtb organisms whose

  5. Compact wearable dual-mode imaging system for real-time fluorescence image-guided surgery.

    PubMed

    Zhu, Nan; Huang, Chih-Yu; Mondal, Suman; Gao, Shengkui; Huang, Chongyuan; Gruev, Viktor; Achilefu, Samuel; Liang, Rongguang

    2015-09-01

    A wearable all-plastic imaging system for real-time fluorescence image-guided surgery is presented. The compact size of the system is especially suitable for applications in the operating room. The system consists of a dual-mode imaging system, see-through goggle, autofocusing, and auto-contrast tuning modules. The paper will discuss the system design and demonstrate the system performance.

  6. Compact wearable dual-mode imaging system for real-time fluorescence image-guided surgery

    PubMed Central

    Zhu, Nan; Huang, Chih-Yu; Mondal, Suman; Gao, Shengkui; Huang, Chongyuan; Gruev, Viktor; Achilefu, Samuel; Liang, Rongguang

    2015-01-01

    A wearable all-plastic imaging system for real-time fluorescence image-guided surgery is presented. The compact size of the system is especially suitable for applications in the operating room. The system consists of a dual-mode imaging system, see-through goggle, autofocusing, and auto-contrast tuning modules. The paper will discuss the system design and demonstrate the system performance. PMID:26358823

  7. GPU-Based Real-Time Volumetric Ultrasound Image Reconstruction for a Ring Array

    PubMed Central

    Choe, Jung Woo; Nikoozadeh, Amin; Oralkan, Ömer; Khuri-Yakub, Butrus T.

    2014-01-01

    Synthetic phased array (SPA) beamforming with Hadamard coding and aperture weighting is an optimal option for real-time volumetric imaging with a ring array, a particularly attractive geometry in intracardiac and intravascular applications. However, the imaging frame rate of this method is limited by the immense computational load required in synthetic beamforming. For fast imaging with a ring array, we developed graphics processing unit (GPU)-based, real-time image reconstruction software that exploits massive data-level parallelism in beamforming operations. The GPU-based software reconstructs and displays three cross-sectional images at 45 frames per second (fps). This frame rate is 4.5 times higher than that for our previously-developed multi-core CPU-based software. In an alternative imaging mode, it shows one B-mode image rotating about the axis and its maximum intensity projection (MIP), processed at a rate of 104 fps. This paper describes the image reconstruction procedure on the GPU platform and presents the experimental images obtained using this software. PMID:23529080
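
    The computational load referred to above comes largely from delay-and-sum operations over many transmit-receive pairs and image points. The toy sketch below (plain NumPy on the CPU, with a generic element geometry and random channel data; not the authors' ring-array SPA/Hadamard scheme or GPU kernels) shows the core summation whose data-level parallelism a GPU implementation exploits.

```python
import numpy as np

C = 1540.0          # speed of sound, m/s
FS = 40e6           # sampling rate, Hz (hypothetical)

def das_image(rf, elem_xy, grid_xy):
    """Delay-and-sum: rf is (n_tx, n_rx, n_samples) channel data, elem_xy (n_elem, 2), grid_xy (n_pix, 2)."""
    n_tx, n_rx, n_samp = rf.shape
    img = np.zeros(len(grid_xy))
    for tx in range(n_tx):
        d_tx = np.linalg.norm(grid_xy - elem_xy[tx], axis=1)
        for rx in range(n_rx):
            d_rx = np.linalg.norm(grid_xy - elem_xy[rx], axis=1)
            idx = np.round((d_tx + d_rx) / C * FS).astype(int)   # round-trip delay in samples
            valid = idx < n_samp
            img[valid] += rf[tx, rx, idx[valid]]
    return img

# Tiny example: 8 elements on a 2 mm ring, random channel data, 100 image points (shape demo only)
n = 8
ang = np.linspace(0, 2 * np.pi, n, endpoint=False)
elems = 0.002 * np.column_stack([np.cos(ang), np.sin(ang)])
grid = np.column_stack([np.zeros(100), np.linspace(0.001, 0.02, 100)])
rf = np.random.default_rng(0).normal(size=(n, n, 4000))
print(das_image(rf, elems, grid).shape)   # (100,)
```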

  8. Evaluation of Optical Sonography for Real-Time Breast Imaging and Biopsy Guidance

    DTIC Science & Technology

    2002-08-01

    supported through images of target standards and subjective validation using images of human anatomy. Keywords: Diffractive Energy Imaging ... a real-time imaging technology that uses the principles of acoustical holography to produce unique images of the human anatomy. The ADI technology is

  9. Real-time FPGA-based radar imaging for smart mobility systems

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio; Neri, Bruno

    2016-04-01

    The paper presents an X-band FMCW (Frequency Modulated Continuous Wave) Radar Imaging system, called X-FRI, for surveillance in smart mobility applications. X-FRI allows for detecting the presence of targets (e.g. obstacles in a railway crossing or urban road crossing, or ships in a small harbor), as well as their speed and their position. With respect to alternative solutions based on LIDAR or camera systems, X-FRI operates in real-time also in bad lighting and weather conditions, night and day. The radio-frequency transceiver is realized through COTS (Commercial Off The Shelf) components on a single-board. An FPGA-based baseband platform allows for real-time Radar image processing.
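
    The range measurement underlying such an FMCW imaging radar follows from the beat frequency between the transmitted and received chirps, R = c * f_beat * T / (2 * B). The sketch below uses hypothetical chirp parameters and a synthetic beat tone purely to illustrate the relation; it is not the X-FRI processing chain.

```python
import numpy as np

C = 3e8          # speed of light, m/s
B = 150e6        # chirp bandwidth, Hz (hypothetical)
T = 1e-3         # chirp duration, s (hypothetical)

def beat_to_range(f_beat_hz):
    """Range from beat frequency for a linear FMCW chirp: R = c * f_beat * T / (2 * B)."""
    return C * f_beat_hz * T / (2.0 * B)

def range_from_beat_signal(beat_signal, fs):
    """Estimate target range from the dominant peak of the de-chirped (beat) signal."""
    spec = np.abs(np.fft.rfft(beat_signal))
    freqs = np.fft.rfftfreq(len(beat_signal), 1.0 / fs)
    f_beat = freqs[np.argmax(spec[1:]) + 1]          # skip the DC bin
    return beat_to_range(f_beat)

# Example: a target at 120 m yields a beat tone at 2*B*R/(c*T) = 120 kHz
fs = 1e6
t = np.arange(round(T * fs)) / fs
f_beat_true = 2.0 * B * 120.0 / (C * T)
print(range_from_beat_signal(np.cos(2.0 * np.pi * f_beat_true * t), fs))   # ~120.0
```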

  10. Instant snapshot of the internal structure of Unzen lava dome, Japan with airborne muography

    PubMed Central

    Tanaka, Hiroyuki K. M.

    2016-01-01

    An emerging elementary particle imaging technique called muography has increasingly been used to resolve the internal structures of volcanoes with a spatial resolution of less than 100 m. However, land-based muography requires several days at least to acquire satisfactory image contrast and thus, it has not been a practical tool to diagnose the erupting volcano in a real time manner. To address this issue, airborne muography was implemented for the first time, targeting Heisei-Shinzan lava dome of Unzen volcano, Japan. Obtained in 2.5 hours, the resultant image clearly showed the density contrast inside the dome, which is essential information to predict the magnitude of the dome collapse. Since airborne muography is not restricted by topographic conditions for apparatus placements, we anticipate that the technique is applicable to creating images of this type of lava dome evolution from various angles in real time. PMID:28008978

  11. Performance enhancement of various real-time image processing techniques via speculative execution

    NASA Astrophysics Data System (ADS)

    Younis, Mohamed F.; Sinha, Purnendu; Marlowe, Thomas J.; Stoyenko, Alexander D.

    1996-03-01

    In real-time image processing, an application must satisfy a set of timing constraints while ensuring the semantic correctness of the system. Because of the natural structure of digital data, pure data and task parallelism have been used extensively in real-time image processing to accelerate the handling time of image data. These types of parallelism are based on splitting the execution load performed by a single processor across multiple nodes. However, execution of all parallel threads is mandatory for correctness of the algorithm. On the other hand, speculative execution is an optimistic execution of part(s) of the program based on assumptions on program control flow or variable values. Rollback may be required if the assumptions turn out to be invalid. Speculative execution can enhance average, and sometimes worst-case, execution time. In this paper, we target various image processing techniques to investigate applicability of speculative execution. We identify opportunities for safe and profitable speculative execution in image compression, edge detection, morphological filters, and blob recognition.
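
    The speculation-and-rollback idea can be sketched as follows: while the real branch condition is still being evaluated, the branch predicted to win is started optimistically, and its result is discarded if the prediction proves wrong. The stage functions and the thread-pool mechanism below are illustrative assumptions, not the paper's framework.

```python
from concurrent.futures import ThreadPoolExecutor

def expensive_predicate(frame):
    """Placeholder decision, e.g. 'does this frame contain detail worth refining?' (hypothetical)."""
    return sum(frame) % 2 == 0

def refine(frame):          # branch taken when the predicate is True
    return [p * 2 for p in frame]

def passthrough(frame):     # branch taken when the predicate is False
    return frame

def process(frame, pool):
    guess = pool.submit(refine, frame)            # speculative execution of the likely branch
    take_refined = expensive_predicate(frame)     # meanwhile, evaluate the real condition
    if take_refined:
        return guess.result()                     # speculation paid off
    guess.cancel()                                # rollback: best-effort cancel; result is ignored
    return passthrough(frame)

with ThreadPoolExecutor(max_workers=2) as pool:
    print(process([1, 2, 3, 4], pool))            # [2, 4, 6, 8]
```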

  12. Assessment of bacterial pathogens in fresh rainwater and airborne particulate matter using Real-Time PCR

    NASA Astrophysics Data System (ADS)

    Kaushik, Rajni; Balasubramanian, Rajasekhar

    2012-01-01

    Bacterial pathogens in airborne particulate matter (PM) and in rainwater (RW) were detected using a robust and sensitive Real-Time PCR method. Both RW and PM were collected simultaneously in the tropical atmosphere of Singapore and analysed for the presence of selected bacterial pathogens and potential pathogens of health concern (Escherichia coli, Klebsiella pneumoniae, Pseudomonas aeruginosa and Aeromonas hydrophila). These pathogens were found to be prevalent in both PM and RW samples, with E. coli being the most prevalent potential pathogen in both types of samples. The temporal distributions of these pathogens in PM and RW were found to be similar to each other. Using the proposed microbiological technique, the atmospheric deposition (dry and wet deposition) of bacterial pathogens to lakes and reservoirs can be studied in view of growing concerns about outbreaks of waterborne diseases.

  13. Real time 3D structural and Doppler OCT imaging on graphics processing units

    NASA Astrophysics Data System (ADS)

    Sylwestrzak, Marcin; Szlag, Daniel; Szkulmowski, Maciej; Gorczyńska, Iwona; Bukowska, Danuta; Wojtkowski, Maciej; Targowski, Piotr

    2013-03-01

    In this report, the application of graphics processing unit (GPU) programming for real-time 3D Fourier-domain Optical Coherence Tomography (FdOCT) imaging, with implementation of Doppler algorithms for visualization of flows in capillary vessels, is presented. Generally, the time required to process FdOCT data on the main processor of the computer (CPU) constitutes the main limitation for real-time imaging. Employing additional algorithms, such as Doppler OCT analysis, makes this processing even more time consuming. Recently developed GPUs, which offer very high computational power, provide a solution to this problem. Taking advantage of them for massively parallel data processing allows for real-time imaging in FdOCT. The presented software for structural and Doppler OCT allows complete processing and visualization of 2D data consisting of 2000 A-scans generated from 2048-pixel spectra at a frame rate of about 120 fps. The 3D imaging in the same mode, for volume data built of 220 × 100 A-scans, is performed at a rate of about 8 frames per second. In this paper the software architecture, the organization of the threads, and the optimizations applied are described. For illustration, screen shots recorded during real-time imaging of a phantom (a homogeneous water solution of Intralipid in a glass capillary) and of the human eye in vivo are presented.
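
    The per-A-scan processing that the GPU pipeline above parallelizes can be sketched compactly: each raw spectrum is background-subtracted, apodized, and Fourier-transformed into a depth profile. The synthetic spectra, array sizes, and function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def process_ascans(spectra):
    """spectra: (n_ascans, 2048) raw fringe spectra -> (n_ascans, 1024) depth profiles in dB."""
    bg = spectra.mean(axis=0)                        # fixed-pattern background estimate
    fringes = (spectra - bg) * np.hanning(spectra.shape[1])
    depth = np.fft.rfft(fringes, axis=1)[:, :1024]   # positive-frequency half = depth profile
    return 20.0 * np.log10(np.abs(depth) + 1e-12)

# Example: 2000 synthetic spectra, each with one reflector (a single fringe frequency)
rng = np.random.default_rng(1)
phases = rng.uniform(0.0, 2.0 * np.pi, (2000, 1))
k = np.arange(2048)[None, :]
spectra = 1.0 + 0.3 * np.cos(2.0 * np.pi * 200.0 * k / 2048.0 + phases)
bscan = process_ascans(spectra)
print(bscan.shape, int(bscan[0].argmax()))   # (2000, 1024) and a peak near depth bin 200
```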

  14. MO-FG-BRD-00: Real-Time Imaging and Tracking Techniques for Intrafractional Motion Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    Intrafraction target motion is a prominent complicating factor in the accurate targeting of radiation within the body. Methods compensating for target motion during treatment, such as gating and dynamic tumor tracking, depend on the delineation of target location as a function of time during delivery. A variety of techniques for target localization have been explored and are under active development; these include beam-level imaging of radio-opaque fiducials, fiducial-less tracking of anatomical landmarks, tracking of electromagnetic transponders, optical imaging of correlated surrogates, and volumetric imaging within treatment delivery. The Joint Imaging and Therapy Symposium will provide an overview of the techniques for real-time imaging and tracking, with special focus on emerging modes of implementation across different modalities. In particular, the symposium will explore developments in 1) beam-level kilovoltage X-ray imaging techniques, 2) EPID-based megavoltage X-ray tracking, 3) dynamic tracking using electromagnetic transponders, and 4) MRI-based soft-tissue tracking during radiation delivery. Learning Objectives: understand the fundamentals of real-time imaging and tracking techniques; learn about emerging techniques in the field of real-time tracking; distinguish between the advantages and disadvantages of different tracking modalities; understand the role of real-time tracking techniques within the clinical delivery work-flow.

  15. Real-time Fluorescence Image-Guided Oncologic Surgery

    PubMed Central

    Mondal, Suman B.; Gao, Shengkui; Zhu, Nan; Liang, Rongguang; Gruev, Viktor; Achilefu, Samuel

    2014-01-01

    Medical imaging plays a critical role in cancer diagnosis and planning. Many of these patients rely on surgical intervention for curative outcomes. This requires careful identification of the primary and microscopic tumors and the complete removal of cancer. Although there have been efforts to adapt traditional imaging modalities for intraoperative image guidance, they suffer from several constraints such as a large hardware footprint, high operation cost, and disruption of the surgical workflow. Because of the ease of image acquisition, relatively low-cost devices, and intuitive operation, optical imaging methods have received tremendous interest for use in real-time image-guided surgery. To improve imaging depth under low interference by tissue autofluorescence, many of these applications utilize light in the near-infrared (NIR) wavelengths, which is invisible to human eyes. With the availability of a wide selection of tumor-avid contrast agents, and advancements in imaging sensors and electronic and optical designs, surgeons are able to combine different attributes of NIR optical imaging techniques to improve treatment outcomes. The emergence of diverse commercial and experimental image guidance systems, which are in various stages of clinical translation, attests to the potentially high impact of intraoperative optical imaging methods in improving the speed of oncologic surgery with high accuracy and minimal margin positivity. PMID:25287689

  16. Ultrasound contrast agent imaging: Real-time imaging of the superharmonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peruzzini, D.; Viti, J.; Tortoli, P.; Verweij, M. D.; de Jong, N.; Vos, H. J.

    2015-10-28

    Currently, in medical ultrasound contrast agent (UCA) imaging the second harmonic scattering of the microbubbles is regularly used. This scattering is in competition with the signal that is caused by nonlinear wave propagation in tissue. It was reported that UCA imaging based on the third or higher harmonics, i.e. "superharmonic" imaging, shows better contrast. However, the superharmonic scattering has a lower signal level compared to e.g. second harmonic signals. This study investigates the contrast-to-tissue ratio (CTR) and signal to noise ratio (SNR) of superharmonic UCA scattering in a tissue/vessel mimicking phantom using a real-time clinical scanner. Numerical simulations were performed to estimate the level of harmonics generated by the microbubbles. Data were acquired with a custom built dual-frequency cardiac phased array probe. Fundamental real-time images were produced while beam formed radiofrequency (RF) data was stored for further offline processing. The phantom consisted of a cavity filled with UCA surrounded by tissue mimicking material. The acoustic pressure in the cavity of the phantom was 110 kPa (MI = 0.11) ensuring non-destructivity of UCA. After processing of the acquired data from the phantom, the UCA-filled cavity could be clearly observed in the images, while tissue signals were suppressed at or below the noise floor. The measured CTR values were 36 dB, >38 dB, and >32 dB, for the second, third, and fourth harmonic respectively, which were in agreement with those reported earlier for preliminary contrast superharmonic imaging. The single frame SNR values (in which 'signal' denotes the signal level from the UCA area) were 23 dB, 18 dB, and 11 dB, respectively. This indicates that noise, and not the tissue signal, is the limiting factor for the UCA detection when using the superharmonics in nondestructive mode.

  17. Characterizing Articulation in Apraxic Speech Using Real-Time Magnetic Resonance Imaging.

    PubMed

    Hagedorn, Christina; Proctor, Michael; Goldstein, Louis; Wilson, Stephen M; Miller, Bruce; Gorno-Tempini, Maria Luisa; Narayanan, Shrikanth S

    2017-04-14

    Real-time magnetic resonance imaging (MRI) and accompanying analytical methods are shown to capture and quantify salient aspects of apraxic speech, substantiating and expanding upon evidence provided by clinical observation and acoustic and kinematic data. Analysis of apraxic speech errors within a dynamic systems framework is provided and the nature of pathomechanisms of apraxic speech discussed. One adult male speaker with apraxia of speech was imaged using real-time MRI while producing spontaneous speech, repeated naming tasks, and self-paced repetition of word pairs designed to elicit speech errors. Articulatory data were analyzed, and speech errors were detected using time series reflecting articulatory activity in regions of interest. Real-time MRI captured two types of apraxic gestural intrusion errors in a word pair repetition task. Gestural intrusion errors in nonrepetitive speech, multiple silent initiation gestures at the onset of speech, and covert (unphonated) articulation of entire monosyllabic words were also captured. Real-time MRI and accompanying analytical methods capture and quantify many features of apraxic speech that have been previously observed using other modalities while offering high spatial resolution. This patient's apraxia of speech affected the ability to select only the appropriate vocal tract gestures for a target utterance, suppressing others, and to coordinate them in time.

  18. Real-Time Intravascular Ultrasound and Photoacoustic Imaging

    PubMed Central

    VanderLaan, Donald; Karpiouk, Andrei; Yeager, Doug; Emelianov, Stanislav

    2018-01-01

    Combined intravascular ultrasound and photoacoustic imaging (IVUS/IVPA) is an emerging hybrid modality being explored as a means of improving the characterization of atherosclerotic plaque anatomical and compositional features. While initial demonstrations of the technique have been encouraging, they have been limited by catheter rotation and by data acquisition, display, and processing rates on the order of several seconds per frame, as well as by the use of off-line image processing. Herein, we present a complete IVUS/IVPA imaging system and method capable of real-time IVUS/IVPA imaging, with online data acquisition, image processing and display of both IVUS and IVPA images. The integrated IVUS/IVPA catheter is fully contained within a 1 mm outer diameter torque cable coupled on the proximal end to a custom-designed spindle enabling optical and electrical coupling to system hardware, including a nanosecond-pulsed laser with a controllable pulse repetition frequency capable of greater than 10 kHz, a motor and servo drive, an ultrasound pulser/receiver, and a 200 MHz digitizer. The system performance is characterized and demonstrated on a vessel-mimicking phantom with an embedded coronary stent intended to provide IVPA contrast within the context of an IVUS image. PMID:28092507

  19. Real-time image-processing algorithm for markerless tumour tracking using X-ray fluoroscopic imaging.

    PubMed

    Mori, S

    2014-05-01

    To ensure accuracy in respiratory-gating treatment, X-ray fluoroscopic imaging is used to detect tumour position in real time. Detection accuracy is strongly dependent on image quality, particularly positional differences between the patient and treatment couch. We developed a new algorithm to improve the quality of images obtained in X-ray fluoroscopic imaging and report the preliminary results. Two oblique X-ray fluoroscopic images were acquired using a dynamic flat panel detector (DFPD) for two patients with lung cancer. The weighting factor was applied to the DFPD image in respective columns, because most anatomical structures, as well as the treatment couch and port cover edge, were aligned in the superior-inferior direction when the patient lay on the treatment couch. The weighting factors for the respective columns were varied until the standard deviation of the pixel values within the image region was minimized. Once the weighting factors were calculated, the quality of the DFPD image was improved by applying the factors to multiframe images. Applying the image-processing algorithm produced substantial improvement in the quality of images, and the image contrast was increased. The treatment couch and irradiation port edge, which were not related to a patient's position, were removed. The average image-processing time was 1.1 ms, showing that this fast image processing can be applied to real-time tumour-tracking systems. These findings indicate that this image-processing algorithm improves the image quality in patients with lung cancer and successfully removes objects not related to the patient. Our image-processing algorithm might be useful in improving gated-treatment accuracy.
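
    The column-weighting idea described above can be approximated with a simple closed-form normalization: choosing one weight per column so that column means are equalized suppresses column-aligned structures such as the couch and port edges and reduces the standard deviation over the region. The sketch below is a stand-in for the iterative weight search described in the abstract; the synthetic frame and parameter values are assumptions.

```python
import numpy as np

def column_weights(reference_frame):
    """Per-column weights that equalize column means over a reference frame."""
    col_means = reference_frame.mean(axis=0)
    return reference_frame.mean() / np.maximum(col_means, 1e-6)

def apply_weights(frame, weights):
    """Apply the same per-column weights to subsequent (multiframe) images."""
    return frame * weights[None, :]

# Example: a synthetic fluoroscopic frame with a bright vertical band (e.g. a couch edge)
rng = np.random.default_rng(0)
frame = rng.normal(100.0, 5.0, (256, 256))
frame[:, 120:140] += 40.0                          # column-aligned structure
w = column_weights(frame)
print(frame.std(), apply_weights(frame, w).std())  # the standard deviation drops after weighting
```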

  20. A Real Time System for Multi-Sensor Image Analysis through Pyramidal Segmentation

    DTIC Science & Technology

    1992-01-30

    L. Rudin, S. Osher, G. Koepfler, J.-M. Morel. Experiments with reconnaissance photography, multi-sensor satellite imagery, and medical CT and MRI multi-band data have shown great practical potential for a real-time system for multi-sensor image analysis through pyramidal segmentation.

  1. Real-time in vivo imaging of human lymphatic system using an LED-based photoacoustic/ultrasound imaging system

    NASA Astrophysics Data System (ADS)

    Kuniyil Ajith Singh, Mithun; Agano, Toshitaka; Sato, Naoto; Shigeta, Yusuke; Uemura, Tetsuji

    2018-02-01

    Non-invasive in vivo imaging of the lymphatic system is of paramount importance for analyzing the functions of lymphatic vessels and for investigating their contribution to metastasis. Recently, we introduced a multi-wavelength real-time LED-based photoacoustic/ultrasound system (AcousticX). In this work, for the first time, we demonstrate that AcousticX is capable of real-time imaging of the human lymphatic system. Results demonstrate the capability of this system to image vascular and lymphatic vessels simultaneously. This could potentially provide detailed information regarding the interconnected roles of the lymphatic and vascular systems in various diseases, thereby fostering the growth of therapeutic interventions.

  2. Rapid Diagnosis of Tuberculosis by Real-Time High-Resolution Imaging of Mycobacterium tuberculosis Colonies.

    PubMed

    Ghodbane, Ramzi; Asmar, Shady; Betzner, Marlena; Linet, Marie; Pierquin, Joseph; Raoult, Didier; Drancourt, Michel

    2015-08-01

    Culture remains the cornerstone of diagnosis for pulmonary tuberculosis, but the fastidiousness of Mycobacterium tuberculosis may delay culture-based diagnosis for weeks. We evaluated the performance of real-time high-resolution imaging for the rapid detection of M. tuberculosis colonies growing on a solid medium. A total of 50 clinical specimens, including 42 sputum specimens, 4 stool specimens, 2 bronchoalveolar lavage fluid specimens, and 2 bronchial aspirate fluid specimens, were prospectively inoculated into (i) a commercially available Middlebrook broth, in which mycobacterial growth was detected indirectly by measuring oxygen consumption (standard protocol), and (ii) a home-made solid medium incubated in an incubator featuring real-time high-resolution imaging of colonies (real-time protocol). Isolates were identified by Ziehl-Neelsen staining and matrix-assisted laser desorption ionization-time of flight mass spectrometry. The standard protocol yielded 14/50 (28%) M. tuberculosis isolates, not significantly different from the 13/50 (26%) isolates found using the real-time protocol (P = 1.00 by Fisher's exact test), and its contamination rate of 1/50 (2%) was not significantly different from the 2/50 (4%) rate of the real-time protocol (P = 1.00). The real-time imaging protocol showed a 4.4-fold reduction in time to detection, 82 ± 54 h versus 360 ± 142 h (P < 0.05). These preliminary data provide proof of concept that real-time high-resolution imaging of M. tuberculosis colonies is a new technology that shortens the time to growth detection and the laboratory diagnosis of pulmonary tuberculosis. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  3. Battlefield radar imaging through airborne millimetric wave SAR (Synthetic Aperture Radar)

    NASA Astrophysics Data System (ADS)

    Carletti, U.; Daddio, E.; Farina, A.; Morabito, C.; Pangrazi, R.; Studer, F. A.

    Airborne synthetic aperture radar (SAR), operating in the millimetric-wave (mmw) region, is discussed with reference to a battlefield surveillance application. The SAR system provides high-resolution real-time imaging of the battlefield and moving target detection under adverse environmental conditions (e.g., weather, dust, smoke, obscurants). The most relevant and original aspects of the system are the band of operation (i.e., mmw in lieu of the more traditional microwave region) and the use of an unmanned platform. The former implies reduced weight and size requirements, thus allowing the use of small unmanned platforms. The latter enhances the system's operational effectiveness by permitting the accomplishment of recognition missions in depth beyond the FEBA. An overall system architecture based on the onboard sensor, the platform, the communication equipment, and a mobile ground station is described. The main areas of ongoing investigation are presented: the simulation of the end-to-end system, and the critical technological issues such as the mmw antenna, the transmitter, the signal processor for image formation and platform attitude error compensation, and the detection and imaging of moving targets.

  4. Real time blood testing using quantitative phase imaging.

    PubMed

    Pham, Hoa V; Bhaduri, Basanta; Tangella, Krishnarao; Best-Popescu, Catherine; Popescu, Gabriel

    2013-01-01

    We demonstrate a real-time blood testing system that can provide remote diagnosis with minimal human intervention in economically challenged areas. Our instrument combines novel advances in label-free optical imaging with parallel computing. Specifically, we use quantitative phase imaging to extract red blood cell morphology with nanoscale sensitivity and NVIDIA's CUDA programming language to perform real-time cellular-level analysis. While the blood smear is translated through focus, our system is able to segment and analyze all the cells in the one-megapixel field of view at a rate of 40 frames/s. The diagnostic parameters measured from each cell (e.g., surface area, sphericity, and minimum cylindrical diameter) are not available with current state-of-the-art clinical instruments. In addition, we show that our instrument correctly recovers the red blood cell volume distribution, as evidenced by the excellent agreement with cell counter results obtained on normal patients and those with microcytic and macrocytic anemia. The final data output by our instrument are arrays of numbers associated with these morphological parameters, not images. Thus, the memory necessary to store the data is of the order of kilobytes, which allows for remote transmission via, for example, the cellular network. We envision that such a system will dramatically increase access to blood testing and, furthermore, may pave the way to digital hematology.
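
    To make one of the listed morphological parameters concrete, the snippet below computes cell sphericity from a measured volume and surface area using the standard definition (surface area of a volume-equivalent sphere divided by the measured surface area). The numerical values are hypothetical placeholders, not data from the paper.

```python
import numpy as np

def sphericity(volume_um3, surface_area_um2):
    """Classical sphericity: surface area of a sphere with the same volume,
    divided by the cell's measured surface area (1.0 for a perfect sphere)."""
    return np.pi ** (1.0 / 3.0) * (6.0 * volume_um3) ** (2.0 / 3.0) / surface_area_um2

# Hypothetical red-blood-cell-like values (illustrative only):
print(sphericity(volume_um3=90.0, surface_area_um2=135.0))  # ~0.72
```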

  5. Real-time Magnetic Resonance Imaging Guidance for Cardiovascular Procedures

    PubMed Central

    Horvath, Keith A.; Li, Ming; Mazilu, Dumitru; Guttman, Michael A.; McVeigh, Elliot R.

    2008-01-01

    Magnetic resonance imaging (MRI) of the cardiovascular system has proven to be an invaluable diagnostic tool. Given its ability to provide real-time imaging, MRI guidance of intraoperative procedures can offer superb visualization, which can facilitate a variety of interventions and minimize the trauma of the operations as well. In addition to anatomic detail, MRI can provide intraoperative assessment of organ and device function. Instruments and devices can be marked to enhance visualization and tracking. All of this is an advance over standard x-ray or ultrasonic imaging. PMID:18395633

  6. Towards real-time diffuse optical tomography for imaging brain functions cooperated with Kalman estimator

    NASA Astrophysics Data System (ADS)

    Wang, Bingyuan; Zhang, Yao; Liu, Dongyuan; Ding, Xuemei; Dan, Mai; Pan, Tiantian; Wang, Yihan; Li, Jiao; Zhou, Zhongxing; Zhang, Limin; Zhao, Huijuan; Gao, Feng

    2018-02-01

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive neuroimaging method for monitoring cerebral hemodynamics through optical changes measured at the scalp surface. It has played an increasingly important role in the psychology and medical imaging communities. Real-time imaging of brain function using NIRS makes it possible to explore sophisticated human brain functions that were previously unexplored. The Kalman estimator has frequently been used in combination with modified Beer-Lambert law (MBLL) based optical topology (OT) for real-time brain function imaging. However, the spatial resolution of OT is low, hampering its application to exploring more complicated brain functions. In this paper, we develop a real-time imaging method combining diffuse optical tomography (DOT) and a Kalman estimator, which substantially improves the spatial resolution. Instead of presenting only a spatially distributed image of the changes in the absorption coefficients at each time point during the recording process, the method provides a single image that is updated in real time by the Kalman estimator; each voxel of this image represents the amplitude of the hemodynamic response function (HRF) associated with that voxel. We evaluate the method with simulation experiments, demonstrating that it can obtain images with more reliable spatial resolution. Furthermore, a statistical analysis is conducted to help decide whether a voxel in the field of view is activated or not.
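
    As a hedged illustration of combining a per-voxel Kalman estimator with a modelled HRF regressor, the sketch below performs one scalar Kalman update per voxel under a random-walk state model. The state/measurement model and the noise variances are assumptions chosen for illustration, not the authors' exact formulation.

```python
import numpy as np

def kalman_update(x, P, z, h, q=1e-4, r=1e-2):
    """One scalar Kalman step per voxel.
    x, P : current HRF-amplitude estimate and its variance (arrays over voxels)
    z    : new measurement, e.g. the reconstructed absorption change per voxel
    h    : value of the modelled HRF regressor at this time point
    q, r : assumed process and measurement noise variances."""
    P = P + q                          # predict (random-walk state model)
    K = P * h / (h * h * P + r)        # Kalman gain
    x = x + K * (z - h * x)            # update amplitude estimate
    P = (1.0 - K * h) * P              # update estimate variance
    return x, P

# Example over 1000 voxels at one time point (synthetic placeholders):
x, P = np.zeros(1000), np.ones(1000)
x, P = kalman_update(x, P, z=np.random.randn(1000), h=0.8)
```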

  7. A fiducial detection algorithm for real-time image guided IMRT based on simultaneous MV and kV imaging

    PubMed Central

    Mao, Weihua; Riaz, Nadeem; Lee, Louis; Wiersma, Rodney; Xing, Lei

    2008-01-01

    The advantage of highly conformal dose techniques such as 3DCRT and IMRT is limited by intrafraction organ motion. A new approach to gain near real-time 3D positions of internally implanted fiducial markers is to analyze simultaneous onboard kV beam and treatment MV beam images (from fluoroscopic or electronic portal image devices). Before we can use this real-time image guidance for clinical 3DCRT and IMRT treatments, four outstanding issues need to be addressed. (1) How will fiducial motion blur the image and hinder tracking fiducials? kV and MV images are acquired while the tumor is moving at various speeds. We find that a fiducial can be successfully detected at a maximum linear speed of 1.6 cm/s. (2) How does MV beam scattering affect kV imaging? We investigate this by varying MV field size and kV source to imager distance, and find that common treatment MV beams do not hinder fiducial detection in simultaneous kV images. (3) How can one detect fiducials on images from 3DCRT and IMRT treatment beams when the MV fields are modified by a multileaf collimator (MLC)? The presented analysis is capable of segmenting an MV field from the blocking MLC and detecting visible fiducials. This enables the calculation of nearly real-time 3D positions of markers during a real treatment. (4) Is the analysis fast enough to track fiducials in nearly real time? Multiple methods are adopted to predict marker positions and reduce search regions. The average detection time per frame for three markers in a 1024×768 image was reduced to 0.1 s or less. Solving these four issues paves the way to tracking moving fiducial markers throughout a 3DCRT or IMRT treatment. Altogether, these four studies demonstrate that our algorithm can track fiducials in real time, on degraded kV images (MV scatter), in rapidly moving tumors (fiducial blurring), and even provide useful information in the case when some fiducials are blocked from view by the MLC. This technique can provide a gating signal

  8. Effective image differencing with convolutional neural networks for real-time transient hunting

    NASA Astrophysics Data System (ADS)

    Sedaghat, Nima; Mahabal, Ashish

    2018-06-01

    Large sky surveys are increasingly relying on image subtraction pipelines for real-time (and archival) transient detection. In this process one has to contend with a varying point-spread function (PSF) and small brightness variations in many sources, as well as artefacts resulting from saturated stars and, in general, matching errors. Very often the differencing is done with a reference image that is deeper than the individual images, and the attendant difference in noise characteristics can also lead to artefacts. We present here a deep-learning approach to transient detection that encapsulates all the steps of a traditional image-subtraction pipeline - image registration, background subtraction, noise removal, PSF matching and subtraction - in a single real-time convolutional network. Once trained, the method works lightning-fast and, given that it performs multiple steps in one go, the time saved and false positives eliminated for multi-CCD surveys like the Zwicky Transient Facility and the Large Synoptic Survey Telescope will be immense, as millions of subtractions will be needed per night.
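
    As a rough sketch of the kind of single-network differencing described above, the toy PyTorch model below takes a new image and a deeper reference image stacked as two channels and outputs a per-pixel transient score map. The architecture, layer sizes, and cutout size are illustrative assumptions, not the published network.

```python
import torch
import torch.nn as nn

class TransientNet(nn.Module):
    """Toy fully convolutional network: the new image and the reference image
    are stacked as two channels; the output is a per-pixel transient score map
    (illustrative only, not the architecture used in the paper)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1), nn.Sigmoid(),
        )

    def forward(self, new_img, ref_img):
        return self.net(torch.stack([new_img, ref_img], dim=1))

# Usage on a single 64x64 cutout pair (random placeholder data):
model = TransientNet()
score = model(torch.rand(1, 64, 64), torch.rand(1, 64, 64))  # shape (1, 1, 64, 64)
```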

  9. Real-Time On-Board Processing Validation of MSPI Ground Camera Images

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.

    2010-01-01

    The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on-board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using a Xilinx Virtex-5 FPGA, which includes PowerPC440 processors, we have implemented a least-squares fitting algorithm that extracts intensity and polarimetric parameters in real time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.
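
    To illustrate the general form of such a least-squares parameter extraction, the sketch below fits per-pixel samples to a simple linear model d_k = I + Q·m_k + U·n_k with known modulation basis functions. The basis functions, sample count, and noise level here are placeholders; MSPI's actual modulation model, channel layout, and FPGA implementation differ.

```python
import numpy as np

def fit_stokes(samples, m_basis, n_basis):
    """Linear least-squares fit of d_k ~ I + Q*m_k + U*n_k for one pixel.
    `m_basis` and `n_basis` stand in for the instrument's known modulation
    functions sampled at the same instants as `samples`."""
    A = np.column_stack([np.ones_like(m_basis), m_basis, n_basis])
    coeffs, *_ = np.linalg.lstsq(A, samples, rcond=None)
    intensity, q, u = coeffs
    return intensity, q, u

# Synthetic demonstration with arbitrary basis functions:
t = np.linspace(0, 2 * np.pi, 32, endpoint=False)
m, n = np.cos(2 * t), np.sin(2 * t)
d = 10.0 + 1.5 * m - 0.8 * n + 0.05 * np.random.randn(t.size)
print(fit_stokes(d, m, n))  # approximately (10.0, 1.5, -0.8)
```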

  10. The Real Time Correction of Stereoscopic Images: From the Serial to a Parallel Treatment

    NASA Astrophysics Data System (ADS)

    Irki, Zohir; Devy, Michel; Achour, Karim; Azzaz, Mohamed Salah

    2008-06-01

    Rectification of stereoscopic images is the task of replacing the acquired images with images that have the same properties but are simpler to use in the subsequent stages of stereovision. The use of pre-calculated tables, built during an off-line calibration step, made it possible to carry out off-line rectification of stereoscopic images, and an improvement of these tables made real-time rectification possible. In this paper, we describe a further improvement of the real-time correction approach so that it can be implemented on an FPGA. This improvement takes into account both the real-time requirements of the correction and the resources available on the target FPGA, a Stratix 1S40F780C5.
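
    A minimal sketch of table-driven rectification is given below, assuming the calibration step has already produced, for every pixel of the rectified image, the integer source coordinates to sample. It is nearest-neighbour only; interpolation and the FPGA datapath described in the paper are omitted.

```python
import numpy as np

def rectify(image, map_rows, map_cols):
    """Nearest-neighbour rectification using pre-computed lookup tables.
    `map_rows` and `map_cols` give, for every pixel of the rectified image,
    the (row, col) to sample in the raw image; in practice these tables come
    from the off-line calibration step."""
    return image[map_rows, map_cols]

# Toy example: an identity mapping for a 4x4 image.
img = np.arange(16).reshape(4, 4)
rr, cc = np.indices(img.shape)
print(np.array_equal(rectify(img, rr, cc), img))  # True
```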

  11. Liver tumor boundaries identified intraoperatively using real-time indocyanine green fluorescence imaging.

    PubMed

    Zhang, Ya-Min; Shi, Rui; Hou, Jian-Cun; Liu, Zi-Rong; Cui, Zi-Lin; Li, Yang; Wu, Di; Shi, Yuan; Shen, Zhong-Yang

    2017-01-01

    Clear delineation between tumors and normal tissues is ideal for real-time surgical navigation imaging. We investigated the application of indocyanine green (ICG) fluorescence imaging navigation, using an intraoperative administration method, in liver resection. Fifty patients who underwent liver resection were divided into two groups based on clinical situation and operative purpose. In group I, the sizes of superficial liver tumors were determined and tiny tumors were identified. In group II, the liver resection margin was determined and real-time navigation was performed. ICG was injected intravenously at the beginning of the operation, and the liver surface was observed with a photodynamic eye (PDE). Liver resection margins were determined using the PDE. Fluorescence contrast between normal liver and tumor tissues was obvious in 32 of 35 patients. A boundary for half the liver or specific liver segments was determined in nine patients by examining the portal vein anatomy after ICG injection. Eight small tumors not observed preoperatively were detected; the smallest was 2 mm. ICG fluorescence imaging navigation is a promising, simple, and safe tool for routine real-time intraoperative imaging during hepatic resection and clinical exploration in hepatocellular carcinoma, offering high sensitivity for identifying liver resection margins and detecting tiny superficial tumors.

  12. Application of linear array imaging techniques to the real-time inspection of airframe structures and substructures

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1995-01-01

    Development and application of linear array imaging technologies to address specific aging-aircraft inspection issues is described. Real-time, videotaped images of specimens constructed to simulate typical types of flaws encountered in the inspection of aircraft structures were obtained with an unmodified commercial linear-array medical scanner. Results suggest that information regarding the characteristics, location, and interface properties of specific types of flaws in materials and structures may be obtained from the images acquired with a linear array. Furthermore, linear array imaging may offer the advantage of being able to compare 'good' regions with 'flawed' regions simultaneously, and in real time. Real-time imaging permits the inspector to obtain image information from various views and provides the opportunity to observe the effects of introducing specific interventions. Observation of an image in real time can offer the operator the ability to 'interact' with the inspection process, thus providing new capabilities and, perhaps, new approaches to nondestructive inspection.

  13. Real-time Image Processing for Microscopy-based Label-free Imaging Flow Cytometry in a Microfluidic Chip.

    PubMed

    Heo, Young Jin; Lee, Donghyeon; Kang, Junsu; Lee, Keondo; Chung, Wan Kyun

    2017-09-14

    Imaging flow cytometry (IFC) is an emerging technology that acquires single-cell images at high throughput for analysis of a cell population. The rich information that comes from the high sensitivity and spatial resolution of a single-cell microscopic image is beneficial for single-cell analysis in various biological applications. In this paper, we present a fast image-processing pipeline (R-MOD: Real-time Moving Object Detector) based on deep learning for high-throughput microscopy-based label-free IFC in a microfluidic chip. The R-MOD pipeline acquires all single-cell images of cells in flow and identifies the acquired images in real time with minimal hardware, consisting of a microscope and a high-speed camera. Experiments show that R-MOD is fast and accurate (500 fps and 93.3% mAP) and is expected to be a powerful tool for biomedical and clinical applications.

  14. Real-time image restoration for iris recognition systems.

    PubMed

    Kang, Byung Jun; Park, Kang Ryoung

    2007-12-01

    In the field of biometrics, it has been reported that iris recognition techniques have shown high levels of accuracy because unique patterns of the human iris, which has very many degrees of freedom, are used. However, because conventional iris cameras have small depth-of-field (DOF) areas, input iris images can easily be blurred, which can lead to lower recognition performance, since iris patterns are transformed by the blurring caused by optical defocusing. To overcome these problems, an autofocusing camera can be used. However, this inevitably increases the cost, size, and complexity of the system. Therefore, we propose a new real-time iris image-restoration method, which can increase the camera's DOF without requiring any additional hardware. This paper presents five novelties as compared to previous works: 1) by excluding eyelash and eyelid regions, it is possible to obtain more accurate focus scores from input iris images; 2) the parameter of the point spread function (PSF) can be estimated in terms of camera optics and measured focus scores; therefore, parameter estimation is more accurate than it has been in previous research; 3) because the PSF parameter can be obtained by using a predetermined equation, iris image restoration can be done in real-time; 4) by using a constrained least square (CLS) restoration filter that considers noise, performance can be greatly enhanced; and 5) restoration accuracy can also be enhanced by estimating the weight value of the noise-regularization term of the CLS filter according to the amount of image blurring. Experimental results showed that iris recognition errors when using the proposed restoration method were greatly reduced as compared to those results achieved without restoration or those achieved using previous iris-restoration methods.
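
    As a hedged sketch of the constrained least-squares (CLS) restoration step mentioned in novelty 4, the code below applies the classical frequency-domain CLS filter with a Laplacian smoothness constraint. The PSF is assumed known here (in the paper it is estimated from focus scores and camera optics), and the regularization weight is fixed rather than estimated from the amount of blurring.

```python
import numpy as np

def cls_restore(blurred, psf, gamma=0.01):
    """Constrained least-squares deconvolution sketch.
    `psf` is assumed to be aligned so that its centre maps to the array
    origin; `gamma` weights the Laplacian smoothness (noise) constraint."""
    shape = blurred.shape
    H = np.fft.fft2(psf, s=shape)                       # blur transfer function
    lap = np.zeros(shape)
    lap[:3, :3] = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]])
    P = np.fft.fft2(lap)                                # Laplacian operator
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + gamma * np.abs(P) ** 2)
    return np.real(np.fft.ifft2(F))
```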

  15. Real-time image reconstruction and display system for MRI using a high-speed personal computer.

    PubMed

    Haishi, T; Kose, K

    1998-09-01

    A real-time NMR image reconstruction and display system was developed using a high-speed personal computer and optimized for the 32-bit multitasking Microsoft Windows 95 operating system. The system was operated at various CPU clock frequencies by changing the motherboard clock frequency and the processor/bus frequency ratio. When the Pentium CPU was used at a 200 MHz clock frequency, the reconstruction time for one 128 x 128 pixel image was 48 ms and that for the image display on the enlarged 256 x 256 pixel window was about 8 ms. NMR imaging experiments were performed with three fast imaging sequences (FLASH, multishot EPI, and one-shot EPI) to demonstrate the ability of the real-time system. It was concluded that, in most cases, a high-speed PC would be the best choice for the image reconstruction and display system for real-time MRI. Copyright 1998 Academic Press.
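
    The core of such a reconstruction is a 2-D inverse FFT of each acquired k-space frame; the sketch below shows that step in isolation. Gridding, filtering, and display scaling, which a real system would include, are omitted, and the random data are placeholders.

```python
import numpy as np

def reconstruct(kspace):
    """Basic Fourier reconstruction of one 128 x 128 k-space frame: inverse
    2-D FFT followed by magnitude, with fftshift so the image is centred."""
    return np.abs(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(kspace))))

# Placeholder k-space frame:
image = reconstruct(np.random.randn(128, 128) + 1j * np.random.randn(128, 128))
print(image.shape)  # (128, 128)
```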

  16. Robot-assisted real-time magnetic resonance image-guided transcatheter aortic valve replacement.

    PubMed

    Miller, Justin G; Li, Ming; Mazilu, Dumitru; Hunt, Tim; Horvath, Keith A

    2016-05-01

    Real-time magnetic resonance imaging (rtMRI)-guided transcatheter aortic valve replacement (TAVR) offers improved visualization, real-time imaging, and pinpoint accuracy with device delivery. Unfortunately, performing a TAVR in a MRI scanner can be a difficult task owing to limited space and an awkward working environment. Our solution was to design a MRI-compatible robot-assisted device to insert and deploy a self-expanding valve from a remote computer console. We present our preliminary results in a swine model. We used an MRI-compatible robotic arm and developed a valve delivery module. A 12-mm trocar was inserted in the apex of the heart via a subxiphoid incision. The delivery device and nitinol stented prosthesis were mounted on the robot. Two continuous real-time imaging planes provided a virtual real-time 3-dimensional reconstruction. The valve was deployed remotely by the surgeon via a graphic user interface. In this acute nonsurvival study, 8 swine underwent robot-assisted rtMRI TAVR for evaluation of feasibility. Device deployment took a mean of 61 ± 5 seconds. Postdeployment necropsy was performed to confirm correlations between imaging and actual valve positions. These results demonstrate the feasibility of robotic-assisted TAVR using rtMRI guidance. This approach may eliminate some of the challenges of performing a procedure while working inside of an MRI scanner, and may improve the success of TAVR. It provides superior visualization during the insertion process, pinpoint accuracy of deployment, and, potentially, communication between the imaging device and the robotic module to prevent incorrect or misaligned deployment. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  17. Real time non invasive imaging of fatty acid uptake in vivo

    PubMed Central

    Henkin, Amy H.; Cohen, Allison S.; Dubikovskaya, Elena A.; Park, Hyo Min; Nikitin, Gennady F.; Auzias, Mathieu G.; Kazantzis, Melissa; Bertozzi, Carolyn R.; Stahl, Andreas

    2012-01-01

    Detection and quantification of fatty acid fluxes in animal model systems following physiological, pathological, or pharmacological challenges is key to our understanding of complex metabolic networks as these macronutrients also activate transcription factors and modulate signaling cascades including insulin-sensitivity. To enable non-invasive, real-time, spatiotemporal quantitative imaging of fatty acid fluxes in animals, we created a bioactivatable molecular imaging probe based on long-chain fatty acids conjugated to a reporter molecule (luciferin). We show that this probe faithfully recapitulates cellular fatty acid uptake and can be used in animal systems as a valuable tool to localize and quantitate in real-time lipid fluxes such as intestinal fatty acid absorption and brown adipose tissue activation. This imaging approach should further our understanding of basic metabolic processes and pathological alterations in multiple disease models. PMID:22928772

  18. Real-time three-dimensional digital image correlation for biomedical applications

    NASA Astrophysics Data System (ADS)

    Wu, Rong; Wu, Hua; Arola, Dwayne; Zhang, Dongsheng

    2016-10-01

    Digital image correlation (DIC) has been successfully applied to evaluating the mechanical behavior of biological tissues. A three-dimensional (3-D) DIC system has been developed and applied to examining the motion of bones in the human foot. To achieve accurate, real-time displacement measurements, an algorithm including matching between sequential images and image pairs has been developed. The system was used to monitor the movement of markers attached to a precisely motorized stage. The accuracy of the proposed technique for in-plane and out-of-plane measurements was found to be -0.25% and 1.17%, respectively. Two biomedical applications are presented. In the experiment involving the foot arch, a human cadaver lower leg and foot specimen was subjected to vertical compressive loads of up to 700 N at a rate of 10 N/s, and the 3-D motions of the bones in the foot were monitored in real time. In the experiment involving the distal tibiofibular syndesmosis, a human cadaver lower leg and foot specimen was subjected to a monotonic rotational torque of up to 5 Nm at a speed of 5 deg per min, and the relative displacements of the tibia and fibula were monitored in real time. Results showed that the system could reach a frequency of up to 16 Hz with 6 points measured simultaneously. This technique sheds new light on measuring the 3-D motion of bones in biomechanical studies.

  19. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  20. A FPGA-based architecture for real-time image matching

    NASA Astrophysics Data System (ADS)

    Wang, Jianhui; Zhong, Sheng; Xu, Wenhui; Zhang, Weijun; Cao, Zhiguo

    2013-10-01

    Image matching is a fundamental task in computer vision. It is used to establish correspondence between two images of the same scene taken from different viewpoints or at different times. However, its large computational complexity has been a challenge for most embedded systems. This paper proposes a single-FPGA image matching system, which consists of SIFT feature detection, BRIEF descriptor extraction, and BRIEF matching. It optimizes the FPGA architecture for SIFT feature detection to reduce FPGA resource utilization. Moreover, we also implement BRIEF description and matching on the FPGA. The proposed system can perform image matching at 30 fps (frames per second) for 1280x720 images. Its processing speed can meet the demand of most real-life computer vision applications.
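
    For reference, the sketch below shows software-level brute-force BRIEF matching by Hamming distance between binary descriptors. The descriptor length, matching threshold, and random test data are assumptions; the FPGA pipeline in the paper implements this logic in hardware rather than in Python.

```python
import numpy as np

def match_brief(desc_a, desc_b, max_dist=64):
    """Brute-force BRIEF matching: descriptors are boolean arrays of shape
    (n_keypoints, n_bits); each descriptor in A is paired with the descriptor
    in B at the smallest Hamming distance, kept only if below `max_dist`."""
    # Hamming distance matrix via element-wise XOR and popcount
    dists = np.count_nonzero(desc_a[:, None, :] != desc_b[None, :, :], axis=2)
    best = dists.argmin(axis=1)
    keep = dists[np.arange(len(desc_a)), best] < max_dist
    return [(i, j) for i, (j, k) in enumerate(zip(best, keep)) if k]

# Random 256-bit descriptors as placeholders:
a = np.random.rand(5, 256) > 0.5
b = np.random.rand(7, 256) > 0.5
print(match_brief(a, b, max_dist=256))  # list of (index_in_a, index_in_b) pairs
```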

  1. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aimed at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on rapid estimates of P-wave magnitude, which generally contain large uncertainties and suffer from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, the JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following magnitude updates even decreased to M6.3-6.6. The magnitude estimate finally stabilized at M8.1 after about two minutes, which consequently led to underestimated tsunami heights. Using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would have been theoretically possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture process. In general, what happened on the fault could be robustly imaged with a time delay of about 30 s using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique can help to reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  2. Real-time image processing for passive mmW imagery

    NASA Astrophysics Data System (ADS)

    Kozacik, Stephen; Paolini, Aaron; Bonnett, James; Harrity, Charles; Mackrides, Daniel; Dillon, Thomas E.; Martin, Richard D.; Schuetz, Christopher A.; Kelmelis, Eric; Prather, Dennis W.

    2015-05-01

    The transmission characteristics of millimeter waves (mmWs) make them suitable for many applications in defense and security, from airport preflight scanning to penetrating degraded visual environments such as brownout or heavy fog. While the cold sky provides sufficient illumination for these images to be taken passively in outdoor scenarios, this utility comes at a cost; the diffraction limit of the longer wavelengths involved leads to lower resolution imagery compared to the visible or IR regimes, and the low power levels inherent to passive imagery allow the data to be more easily degraded by noise. Recent techniques leveraging optical upconversion have shown significant promise, but are still subject to fundamental limits in resolution and signal-to-noise ratio. To address these issues we have applied techniques developed for visible and IR imagery to decrease noise and increase resolution in mmW imagery. We have developed these techniques into fieldable software, making use of GPU platforms for real-time operation of computationally complex image processing algorithms. We present data from a passive, 77 GHz, distributed aperture, video-rate imaging platform captured during field tests at full video rate. These videos demonstrate the increase in situational awareness that can be gained through applying computational techniques in real-time without needing changes in detection hardware.

  3. Real-time compound sonography of the rotator-cuff: evaluation of artefact reduction and image definition.

    PubMed

    De Candia, Alessandro; Doratiotto, Stefsano; Paschina, Elio; Segatto, Enrica; Pelizzo, Francesco; Bazzocchi, Massimo

    2003-04-01

    The aim of this study was to compare real time compound sonography with conventional sonography in the evaluation of rotator cuff tears. A prospective study was performed on 50 supraspinatus tendons in 101 patients treated by surgical acromioplasty. The surgeon described 33 (66%) full-thickness tears and 17 (34%) partial-thickness tears. All tendons were examined by conventional sonography and real time compound sonography on the day before surgery. The techniques were compared by evaluating the images for freedom from artefacts, contrast resolution and overall image definition. Real time compound sonography proved to be superior to conventional sonography as regards freedom from artefacts in 50 cases out of 50 (100%). It was superior to conventional sonography in evaluating the image contrast resolution in 45 cases out of 50 (90%), and superior to conventional sonography in overall image definition in 45 out of 50 cases (90%). Real-time compound sonography reduces the intrinsic artefacts of conventional sonography and allows better overall image definition. In particular, the digital technique allowed us to study the rotator cuff with better contrast resolution and sharper and more detailed images than did conventional sonography.

  4. Two dimensional microcirculation mapping with real time spatial frequency domain imaging

    NASA Astrophysics Data System (ADS)

    Zheng, Yang; Chen, Xinlin; Lin, Weihao; Cao, Zili; Zhu, Xiuwei; Zeng, Bixin; Xu, M.

    2018-02-01

    We present a spatial frequency domain imaging (SFDI) study of local hemodynamics in the human finger cuticle of healthy volunteers performing paced breathing and the forearm of healthy young adults performing normal breathing with our recently developed Real Time Single Snapshot Multiple Frequency Demodulation - Spatial Frequency Domain Imaging (SSMD-SFDI) system. A two-layer model was used to map the concentrations of deoxy-, oxy-hemoglobin, melanin, epidermal thickness and scattering properties at the subsurface of the forearm and the finger cuticle. The oscillations of the concentrations of deoxy- and oxy-hemoglobin at the subsurface of the finger cuticle and forearm induced by paced breathing and normal breathing, respectively, were found to be close to out-of-phase, attributed to the dominance of the blood flow modulation by paced breathing or heartbeat. Our results suggest that the real time SFDI platform may serve as one effective imaging modality for microcirculation monitoring.

  5. Real-time in vivo Cherenkoscopy imaging during external beam radiation therapy.

    PubMed

    Zhang, Rongxiao; Gladstone, David J; Jarvis, Lesley A; Strawbridge, Rendall R; Jack Hoopes, P; Friedman, Oscar D; Glaser, Adam K; Pogue, Brian W

    2013-11-01

    Cherenkov radiation is induced when charged particles travel through a dielectric medium (such as biological tissue) faster than the speed of light in that medium. Detection of this radiation or of excited luminescence during megavoltage external beam radiotherapy (EBRT) could enable a new approach to superficial dose estimation, functional imaging, and quality assurance for radiation therapy dosimetry. In this letter, the first in vivo Cherenkov images from real-time Cherenkoscopy during EBRT are presented. The imaging system consisted of a time-gated intensified charge-coupled device (ICCD) coupled with a commercial lens. The ICCD was synchronized to the linear accelerator to detect Cherenkov photons only during the 3.25-μs radiation bursts. Images of a tissue phantom under irradiation show that the intensity of Cherenkov emission is directly proportional to radiation dose, and images can be acquired at 4.7 frames/s with SNR>30. Cherenkoscopy was obtained from the superficial regions of a canine oral tumor during planned, Institutional Animal Care and Use Committee approved, conventional (therapeutically appropriate) EBRT irradiation. Coregistration between photography and Cherenkoscopy validated that Cherenkov photons were detected from the planned treatment region. Real-time images correctly monitored the beam field changes corresponding to the planned dynamic wedge movement, with accurate extent of the overall beam field and the expected cold and hot regions.

  6. Programmable Real-time Clinical Photoacoustic and Ultrasound Imaging System

    PubMed Central

    Kim, Jeesu; Park, Sara; Jung, Yuhan; Chang, Sunyeob; Park, Jinyong; Zhang, Yumiao; Lovell, Jonathan F.; Kim, Chulhong

    2016-01-01

    Photoacoustic imaging has attracted interest for its capacity to capture functional spectral information with high spatial and temporal resolution in biological tissues. Several photoacoustic imaging systems have been commercialized recently, but they are variously limited by non-clinically relevant designs, immobility, single anatomical utility (e.g., breast only), or non-programmable interfaces. Here, we present a real-time clinical photoacoustic and ultrasound imaging system which consists of an FDA-approved clinical ultrasound system integrated with a portable laser. The system is completely programmable, has an intuitive user interface, and can be adapted for different applications by switching handheld imaging probes with various transducer types. The customizable photoacoustic and ultrasound imaging system is intended to meet the diverse needs of medical researchers performing both clinical and preclinical photoacoustic studies. PMID:27731357

  7. Programmable Real-time Clinical Photoacoustic and Ultrasound Imaging System.

    PubMed

    Kim, Jeesu; Park, Sara; Jung, Yuhan; Chang, Sunyeob; Park, Jinyong; Zhang, Yumiao; Lovell, Jonathan F; Kim, Chulhong

    2016-10-12

    Photoacoustic imaging has attracted interest for its capacity to capture functional spectral information with high spatial and temporal resolution in biological tissues. Several photoacoustic imaging systems have been commercialized recently, but they are variously limited by non-clinically relevant designs, immobility, single anatomical utility (e.g., breast only), or non-programmable interfaces. Here, we present a real-time clinical photoacoustic and ultrasound imaging system which consists of an FDA-approved clinical ultrasound system integrated with a portable laser. The system is completely programmable, has an intuitive user interface, and can be adapted for different applications by switching handheld imaging probes with various transducer types. The customizable photoacoustic and ultrasound imaging system is intended to meet the diverse needs of medical researchers performing both clinical and preclinical photoacoustic studies.

  8. Magnetic Particle Imaging for Real-Time Perfusion Imaging in Acute Stroke.

    PubMed

    Ludewig, Peter; Gdaniec, Nadine; Sedlacik, Jan; Forkert, Nils D; Szwargulski, Patryk; Graeser, Matthias; Adam, Gerhard; Kaul, Michael G; Krishnan, Kannan M; Ferguson, R Matthew; Khandhar, Amit P; Walczak, Piotr; Fiehler, Jens; Thomalla, Götz; Gerloff, Christian; Knopp, Tobias; Magnus, Tim

    2017-10-24

    The fast and accurate assessment of cerebral perfusion is fundamental for the diagnosis and successful treatment of stroke patients. Magnetic particle imaging (MPI) is a new radiation-free tomographic imaging method with a superior temporal resolution compared to other conventional imaging methods. In addition, MPI scanners can be built as prehospital mobile devices, which require less complex infrastructure than computed tomography (CT) and magnetic resonance imaging (MRI). With these advantages, MPI could accelerate stroke diagnosis and treatment, thereby improving outcomes. Our objective was to investigate the capability of MPI to detect perfusion deficits in a murine model of ischemic stroke. Cerebral ischemia was induced by inserting a microfilament into the internal carotid artery in C57BL/6 mice, thereby blocking blood flow into the middle cerebral artery. After the injection of a contrast agent (superparamagnetic iron oxide nanoparticles) specifically tailored for MPI, cerebral perfusion and vascular anatomy were assessed by the MPI scanner within seconds. To validate and compare our MPI data, we performed perfusion imaging with a small-animal MRI scanner. MPI detected the perfusion deficits in the ischemic brain, comparable to those detected with MRI but in real time. For the first time, we showed that MPI could be used as a diagnostic tool for relevant diseases in vivo, such as ischemic stroke. Due to its shorter image acquisition times and increased temporal resolution compared to MRI or CT, we expect that MPI offers the potential to improve stroke imaging and treatment.

  9. Real-time quantitative fluorescence imaging using a single snapshot optical properties technique for neurosurgical guidance

    NASA Astrophysics Data System (ADS)

    Valdes, Pablo A.; Angelo, Joseph; Gioux, Sylvain

    2015-03-01

    Fluorescence imaging has shown promise as an adjunct to improve the extent of resection in neurosurgery and oncologic surgery. Nevertheless, current fluorescence imaging techniques do not account for the heterogeneous attenuation effects of tissue optical properties. In this work, we present a novel imaging system that performs real time quantitative fluorescence imaging using Single Snapshot Optical Properties (SSOP) imaging. We developed the technique and performed initial phantom studies to validate the quantitative capabilities of the system for intraoperative feasibility. Overall, this work introduces a novel real-time quantitative fluorescence imaging method capable of being used intraoperatively for neurosurgical guidance.

  10. An airborne thematic thermal infrared and electro-optical imaging system

    NASA Astrophysics Data System (ADS)

    Sun, Xiuhong; Shu, Peter

    2011-08-01

    This paper describes an advanced Airborne Thematic Thermal InfraRed and Electro-Optical Imaging System (ATTIREOIS) and its potential applications. The ATTIREOIS sensor payload consists of two sets of advanced Focal Plane Arrays (FPAs) - a broadband Thermal InfraRed Sensor (TIRS) and a four (4) band Multispectral Electro-Optical Sensor (MEOS) - to approximate Landsat ETM+ bands 1,2,3,4, and 6, and LDCM bands 2,3,4,5, and 10+11. The airborne TIRS is a 3-axis stabilized payload capable of providing 3D photogrammetric images with a 1,850-pixel swath width via pushbroom operation. MEOS has a total of 116 million simultaneous sensor counts, capable of providing 3 cm spatial resolution multispectral orthophotos for continuous airborne mapping. ATTIREOIS is a complete standalone and easy-to-use portable imaging instrument for light aerial vehicle deployment. Its miniaturized backend data system operates all ATTIREOIS imaging sensor components, an INS/GPS, and an e-Gimbal™ Control Electronic Unit (ECU) with a data throughput of 300 Megabytes/sec. The backend provides advanced onboard processing, performing autonomous raw sensor imagery development, TIRS image track-recovery reconstruction, LWIR/VNIR multi-band co-registration, and photogrammetric image processing. With geometric optics and boresight calibrations, the ATTIREOIS data products are directly georeferenced with an accuracy of approximately one meter. A prototype ATTIREOIS has been configured, and its sample LWIR/EO image data will be presented. Potential applications of ATTIREOIS include: 1) providing timely and cost-effective, precisely and directly georeferenced surface emissive and solar reflective LWIR/VNIR multispectral images via a private Google Earth Globe to enhance NASA's Earth science research capabilities; and 2) underflying satellites to support satellite measurement calibration and validation observations.

  11. Grayscale image segmentation for real-time traffic sign recognition: the hardware point of view

    NASA Astrophysics Data System (ADS)

    Cao, Tam P.; Deng, Guang; Elton, Darrell

    2009-02-01

    In this paper, we study several grayscale-based image segmentation methods for real-time road sign recognition applications on an FPGA hardware platform. The performance of different image segmentation algorithms under different lighting conditions is initially compared using PC simulation. Based on these results and analysis, suitable algorithms are implemented and tested on a real-time FPGA speed sign detection system. Experimental results show that the system using segmented images uses significantly fewer hardware resources on the FPGA while maintaining comparable system performance. The system is capable of processing 60 live video frames per second.
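
    One common grayscale segmentation candidate for this kind of comparison is a global Otsu threshold; the sketch below computes it from the histogram of an 8-bit frame. It is an illustrative baseline only and does not reproduce the specific algorithms compared in the paper.

```python
import numpy as np

def otsu_threshold(gray):
    """Histogram-based Otsu threshold for an 8-bit grayscale frame: pick the
    threshold that maximises the between-class variance of the two classes."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Usage: binary = gray >= otsu_threshold(gray)
```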

  12. Real-time implementation of a multispectral mine target detection algorithm

    NASA Astrophysics Data System (ADS)

    Samson, Joseph W.; Witter, Lester J.; Kenton, Arthur C.; Holloway, John H., Jr.

    2003-09-01

    Spatial-spectral anomaly detection (the "RX algorithm") has been exploited on the USMC's Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) and several associated technology base studies, and has been found to be a useful method for the automated detection of surface-emplaced antitank land mines in airborne multispectral imagery. RX is a complex image processing algorithm that involves the direct spatial convolution of a target/background mask template over each multispectral image, coupled with a spatially variant background spectral covariance matrix estimation and inversion. The RX throughput on the ATD was about 38X real time using a single Sun UltraSparc system. An effort to demonstrate RX in real time began in FY01. We now report the development and demonstration of a Field Programmable Gate Array (FPGA) solution that achieves a real-time implementation of the RX algorithm at video rates using COBRA ATD data. The approach uses an Annapolis Microsystems Firebird PMC card containing a Xilinx XCV2000E FPGA with over 2,500,000 logic gates and 18 MBytes of memory. A prototype system was configured using a Tek Microsystems VME board with dual PowerPC G4 processors and two PMC slots. The RX algorithm was translated from its C programming implementation into the VHDL language and synthesized into gates that were loaded into the FPGA. The VHDL/synthesizer approach allows key RX parameters to be quickly changed and a new implementation automatically generated. Reprogramming the FPGA is done rapidly and in-circuit. Implementation of the RX algorithm in a single FPGA is a major first step toward achieving real-time land mine detection.
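
    For orientation, the sketch below computes RX-style anomaly scores as the Mahalanobis distance of each pixel spectrum from a background estimate. It uses a single global mean and covariance for brevity, whereas the COBRA implementation described above uses a spatially variant, locally estimated background and a convolved target/background mask.

```python
import numpy as np

def rx_scores(cube, eps=1e-6):
    """Global RX anomaly score per pixel for a (rows, cols, bands) cube:
    Mahalanobis distance of each pixel spectrum from the background mean,
    using a single global covariance estimate (regularised by `eps`)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + eps * np.eye(bands)
    cov_inv = np.linalg.inv(cov)
    diff = X - mu
    scores = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return scores.reshape(rows, cols)

# Placeholder 4-band image; bright scores flag spectrally anomalous pixels.
print(rx_scores(np.random.rand(64, 64, 4)).shape)  # (64, 64)
```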

  13. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2004-12-01

    This paper describes application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects or phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with some specific activity states. The AM-SMI produces correlation between spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation (AM) of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.

  14. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2005-01-01

    This paper describes application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects or phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with some specific activity states. The AM-SMI produces correlation between spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation (AM) of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.

  15. MO-FG-BRD-04: Real-Time Imaging and Tracking Techniques for Intrafractional Motion Management: MR Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Low, D.

    2015-06-15

    Intrafraction target motion is a prominent complicating factor in the accurate targeting of radiation within the body. Methods compensating for target motion during treatment, such as gating and dynamic tumor tracking, depend on the delineation of target location as a function of time during delivery. A variety of techniques for target localization have been explored and are under active development; these include beam-level imaging of radio-opaque fiducials, fiducial-less tracking of anatomical landmarks, tracking of electromagnetic transponders, optical imaging of correlated surrogates, and volumetric imaging within treatment delivery. The Joint Imaging and Therapy Symposium will provide an overview of the techniques for real-time imaging and tracking, with special focus on emerging modes of implementation across different modalities. In particular, the symposium will explore developments in 1) Beam-level kilovoltage X-ray imaging techniques, 2) EPID-based megavoltage X-ray tracking, 3) Dynamic tracking using electromagnetic transponders, and 4) MRI-based soft-tissue tracking during radiation delivery. Learning Objectives: (1) Understand the fundamentals of real-time imaging and tracking techniques; (2) learn about emerging techniques in the field of real-time tracking; (3) distinguish between the advantages and disadvantages of different tracking modalities; (4) understand the role of real-time tracking techniques within the clinical delivery workflow.

  16. MO-FG-BRD-02: Real-Time Imaging and Tracking Techniques for Intrafractional Motion Management: MV Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berbeco, R.

    2015-06-15

    Intrafraction target motion is a prominent complicating factor in the accurate targeting of radiation within the body. Methods compensating for target motion during treatment, such as gating and dynamic tumor tracking, depend on the delineation of target location as a function of time during delivery. A variety of techniques for target localization have been explored and are under active development; these include beam-level imaging of radio-opaque fiducials, fiducial-less tracking of anatomical landmarks, tracking of electromagnetic transponders, optical imaging of correlated surrogates, and volumetric imaging within treatment delivery. The Joint Imaging and Therapy Symposium will provide an overview of the techniques for real-time imaging and tracking, with special focus on emerging modes of implementation across different modalities. In particular, the symposium will explore developments in 1) Beam-level kilovoltage X-ray imaging techniques, 2) EPID-based megavoltage X-ray tracking, 3) Dynamic tracking using electromagnetic transponders, and 4) MRI-based soft-tissue tracking during radiation delivery. Learning Objectives: (1) Understand the fundamentals of real-time imaging and tracking techniques; (2) learn about emerging techniques in the field of real-time tracking; (3) distinguish between the advantages and disadvantages of different tracking modalities; (4) understand the role of real-time tracking techniques within the clinical delivery workflow.

  17. MO-FG-BRD-03: Real-Time Imaging and Tracking Techniques for Intrafractional Motion Management: EM Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keall, P.

    2015-06-15

    Intrafraction target motion is a prominent complicating factor in the accurate targeting of radiation within the body. Methods compensating for target motion during treatment, such as gating and dynamic tumor tracking, depend on the delineation of target location as a function of time during delivery. A variety of techniques for target localization have been explored and are under active development; these include beam-level imaging of radio-opaque fiducials, fiducial-less tracking of anatomical landmarks, tracking of electromagnetic transponders, optical imaging of correlated surrogates, and volumetric imaging within treatment delivery. The Joint Imaging and Therapy Symposium will provide an overview of the techniques for real-time imaging and tracking, with special focus on emerging modes of implementation across different modalities. In particular, the symposium will explore developments in 1) Beam-level kilovoltage X-ray imaging techniques, 2) EPID-based megavoltage X-ray tracking, 3) Dynamic tracking using electromagnetic transponders, and 4) MRI-based soft-tissue tracking during radiation delivery. Learning Objectives: (1) Understand the fundamentals of real-time imaging and tracking techniques; (2) learn about emerging techniques in the field of real-time tracking; (3) distinguish between the advantages and disadvantages of different tracking modalities; (4) understand the role of real-time tracking techniques within the clinical delivery workflow.

  18. A novel real time imaging platform to quantify macrophage phagocytosis.

    PubMed

    Kapellos, Theodore S; Taylor, Lewis; Lee, Heyne; Cowley, Sally A; James, William S; Iqbal, Asif J; Greaves, David R

    2016-09-15

    Phagocytosis of pathogens, apoptotic cells and debris is a key feature of macrophage function in host defense and tissue homeostasis. Quantification of macrophage phagocytosis in vitro has traditionally been technically challenging. Here we report the optimization and validation of the IncuCyte ZOOM® real-time imaging platform for macrophage phagocytosis based on pHrodo® pathogen bioparticles, which only fluoresce when localized in the acidic environment of the phagolysosome. Image analysis and fluorescence quantification were performed with the automated IncuCyte™ Basic Software. Titration of the bioparticle number showed that the system is more sensitive than a spectrofluorometer, as it can detect phagocytosis using 20× fewer E. coli bioparticles. We exemplified the power of this real-time imaging platform by studying phagocytosis of murine alveolar, bone marrow and peritoneal macrophages. We further demonstrate the ability of this platform to study modulation of the phagocytic process, as pharmacological inhibitors of phagocytosis suppressed bioparticle uptake in a concentration-dependent manner, whereas opsonins augmented phagocytosis. We also investigated the effects of macrophage polarization on E. coli phagocytosis. Priming of bone marrow-derived macrophages (BMDMs) with M2 stimuli, such as IL-4 and IL-10, resulted in higher engulfment of bioparticles in comparison with M1 polarization. Moreover, we demonstrated that tolerization of BMDMs with lipopolysaccharide (LPS) results in impaired E. coli bioparticle phagocytosis. This novel real-time assay will enable researchers to quantify macrophage phagocytosis with a higher degree of accuracy and sensitivity and will allow investigation of limited populations of primary phagocytes in vitro. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  19. Imaging the eye fundus with real-time en-face spectral domain optical coherence tomography

    PubMed Central

    Bradu, Adrian; Podoleanu, Adrian Gh.

    2014-01-01

    Real-time display of processed en-face spectral domain optical coherence tomography (SD-OCT) images is important for diagnosis. However, because of the many data processing steps required, such as fast Fourier transformation (FFT), data re-sampling, spectral shaping, apodization and zero padding, followed by a software cut of the acquired 3D volume to produce an en-face slice, conventional high-speed SD-OCT cannot render an en-face OCT image in real time. Recently we demonstrated a Master/Slave (MS)-OCT method that is highly parallelizable, as it provides reflectivity values of points at depth within an A-scan in parallel. This allows direct production of en-face images. In addition, the MS-OCT method does not require data linearization, which further simplifies the processing. The computation in our previous paper was, however, time consuming. In this paper we present an optimized algorithm that can be used to provide en-face MS-OCT images much more quickly. Using such an algorithm we demonstrate around 10 times faster production of sets of en-face OCT images than previously obtained, as well as simultaneous real-time display of up to 4 en-face OCT images of 200 × 200 pixels from the fovea and the optic nerve of a volunteer. We also demonstrate 3D and B-scan OCT images obtained from sets of MS-OCT C-scans, i.e. with no FFT and no intermediate step of generation of A-scans. PMID:24761303
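
    The Master/Slave reconstruction described above can be pictured with a short Python sketch: instead of an FFT per A-scan, each measured channeled spectrum is compared against a set of pre-recorded masks, one per depth, so every en-face pixel at every depth can be computed independently and in parallel. The array shapes, the simple complex-correlation comparison, and all names below are illustrative assumptions, not the authors' optimized algorithm.

        import numpy as np

        def ms_oct_enface(spectra, masks):
            """Master/Slave-style en-face OCT reconstruction (illustrative sketch).

            spectra : (ny, nx, nk) channeled spectra, one per transverse position
            masks   : (nz, nk) complex masks pre-recorded with a mirror at nz depths
            returns : (nz, ny, nx) stack of en-face reflectivity images
            """
            ny, nx, nk = spectra.shape
            flat = spectra.reshape(-1, nk)                 # (ny*nx, nk)
            corr = flat @ np.conj(masks).T                 # correlate with every mask
            return np.abs(corr).T.reshape(masks.shape[0], ny, nx)

        # synthetic example: four 200 x 200 en-face images from 512-point spectra
        rng = np.random.default_rng(0)
        spectra = rng.standard_normal((200, 200, 512))
        masks = rng.standard_normal((4, 512)) + 1j * rng.standard_normal((4, 512))
        enface_stack = ms_oct_enface(spectra, masks)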

  20. Real-time nondestructive monitoring of the gas tungsten arc welding (GTAW) process by combined airborne acoustic emission and non-contact ultrasonics

    NASA Astrophysics Data System (ADS)

    Zhang, Lu; Basantes-Defaz, Alexandra-Del-Carmen; Abbasi, Zeynab; Yuhas, Donald; Ozevin, Didem; Indacochea, Ernesto

    2018-03-01

    Welding is a key manufacturing process for many industries and may introduce defects into the welded parts, causing significant negative impacts and potentially ruining high-cost pieces. Therefore, a real-time process monitoring method is important to implement to avoid producing a low-quality weld. Because of the high surface temperature and possible contamination of the surface by contact transducers, the welding process should be monitored via non-contact transducers. In this paper, airborne acoustic emission (AE) transducers tuned at 60 kHz and non-contact ultrasonic testing (UT) transducers tuned at 500 kHz are implemented for real-time weld monitoring. AE is a passive nondestructive evaluation method that listens to the process noise and provides information about the uniformity of the manufacturing process. UT provides more quantitative information about weld defects. One of the most common weld defects, burn-through, is investigated. The influences of weld defects on AE signatures (time-driven data) and UT signals (received signal energy, change in peak frequency) are presented. The level of burn-through damage is defined by using a single method or combined AE/UT methods.
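
    The UT signatures named above (received signal energy and a change in peak frequency) can each be computed from a received waveform in a few lines; the sketch below is a generic illustration under assumed sampling parameters, not the authors' processing chain.

        import numpy as np

        def ut_features(signal, fs):
            """Return (energy, peak_frequency) of a received ultrasonic waveform."""
            signal = np.asarray(signal, dtype=float)
            energy = float(np.sum(signal ** 2))                 # received signal energy
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
            peak_frequency = float(freqs[np.argmax(spectrum)])  # a shift can indicate a defect
            return energy, peak_frequency

        # example: a decaying 500 kHz tone burst sampled at 10 MHz
        fs = 10e6
        t = np.arange(0, 200e-6, 1.0 / fs)
        burst = np.sin(2 * np.pi * 500e3 * t) * np.exp(-t / 50e-6)
        print(ut_features(burst, fs))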

  1. Real-time Visualization and Quantification of Retrograde Cardioplegia Delivery using Near Infrared Fluorescent Imaging

    PubMed Central

    Rangaraj, Aravind T.; Ghanta, Ravi K.; Umakanthan, Ramanan; Soltesz, Edward G.; Laurence, Rita G.; Fox, John; Cohn, Lawrence H.; Bolman, R. M.; Frangioni, John V.; Chen, Frederick Y.

    2009-01-01

    Background and Aim of the Study Homogeneous delivery of cardioplegia is essential for myocardial protection during cardiac surgery. Presently, there exist no established methods to quantitatively assess cardioplegia distribution intraoperatively and determine when retrograde cardioplegia is required. In this study, we evaluate the feasibility of near infrared (NIR) imaging for real-time visualization of cardioplegia distribution in a porcine model. Methods A portable, intraoperative, real-time NIR imaging system was utilized. NIR fluorescent cardioplegia solution was developed by incorporating indocyanine green (ICG) into crystalloid cardioplegia solution. Real-time NIR imaging was performed while the fluorescent cardioplegia solution was infused via the retrograde route in 5 ex-vivo normal porcine hearts and in 5 ex-vivo porcine hearts status post left anterior descending (LAD) coronary artery ligation. Horizontal cross-sections of the hearts were obtained at proximal, middle, and distal LAD levels. Videodensitometry was performed to quantify distribution of fluorophore content. Results The progressive distribution of cardioplegia was clearly visualized with NIR imaging. Complete visualization of retrograde distribution occurred within 4 minutes of infusion. Videodensitometry revealed that retrograde cardioplegia primarily distributed to the left ventricle and anterior septum. In hearts with LAD ligation, antegrade cardioplegia did not distribute to the anterior left ventricle. This deficiency was compensated for with retrograde cardioplegia supplementation. Conclusions Incorporation of ICG into cardioplegia allows real-time visualization of cardioplegia delivery via NIR imaging. This technology may prove useful in guiding intraoperative decisions pertaining to when retrograde cardioplegia is mandated. PMID:19016995

  2. Real-time visualization and quantification of retrograde cardioplegia delivery using near infrared fluorescent imaging.

    PubMed

    Rangaraj, Aravind T; Ghanta, Ravi K; Umakanthan, Ramanan; Soltesz, Edward G; Laurence, Rita G; Fox, John; Cohn, Lawrence H; Bolman, R M; Frangioni, John V; Chen, Frederick Y

    2008-01-01

    Homogeneous delivery of cardioplegia is essential for myocardial protection during cardiac surgery. Presently, there exist no established methods to quantitatively assess cardioplegia distribution intraoperatively and determine when retrograde cardioplegia is required. In this study, we evaluate the feasibility of near infrared (NIR) imaging for real-time visualization of cardioplegia distribution in a porcine model. A portable, intraoperative, real-time NIR imaging system was utilized. NIR fluorescent cardioplegia solution was developed by incorporating indocyanine green (ICG) into crystalloid cardioplegia solution. Real-time NIR imaging was performed while the fluorescent cardioplegia solution was infused via the retrograde route in five ex vivo normal porcine hearts and in five ex vivo porcine hearts status post left anterior descending (LAD) coronary artery ligation. Horizontal cross-sections of the hearts were obtained at proximal, middle, and distal LAD levels. Videodensitometry was performed to quantify distribution of fluorophore content. The progressive distribution of cardioplegia was clearly visualized with NIR imaging. Complete visualization of retrograde distribution occurred within 4 minutes of infusion. Videodensitometry revealed retrograde cardioplegia, primarily distributed to the left ventricle (LV) and anterior septum. In hearts with LAD ligation, antegrade cardioplegia did not distribute to the anterior LV. This deficiency was compensated for with retrograde cardioplegia supplementation. Incorporation of ICG into cardioplegia allows real-time visualization of cardioplegia delivery via NIR imaging. This technology may prove useful in guiding intraoperative decisions pertaining to when retrograde cardioplegia is mandated.

  3. Research on Airborne SAR Imaging Based on Esc Algorithm

    NASA Astrophysics Data System (ADS)

    Dong, X. T.; Yue, X. J.; Zhao, Y. H.; Han, C. M.

    2017-09-01

    Owing to its ability to obtain abundant information flexibly, accurately and quickly, airborne SAR is significant in the field of Earth observation and many other applications. Ideally the flight paths are straight lines, but in reality this is not the case, since some deviation from the ideal path is impossible to avoid. A small disturbance from the ideal line has a major effect on the signal phase, dramatically deteriorating the quality of SAR images and data. Therefore, to get accurate echo information and radar images, it is essential to measure and compensate for nonlinear motion of the antenna trajectories. By compensating each flying trajectory to its reference track, the MOCO method corrects the linear and quadratic phase errors caused by nonlinear antenna trajectories. Position and Orientation System (POS) data is used to acquire accurate motion attitudes and spatial positions of the antenna phase centre (APC). In this paper, the extended chirp scaling (ECS) algorithm is used to process echo data of airborne SAR. An experiment is done using VV-polarization raw data of a C-band airborne SAR. Quality evaluations of compensated and uncompensated SAR images are done in the experiment; the former always performs better than the latter. After MOCO processing, azimuth ambiguity declines, the peak side lobe ratio (PSLR) improves effectively, and the resolution of the images improves markedly. The result shows the validity and operability of the imaging process for airborne SAR.
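
    The heart of first-order motion compensation is removing the phase error caused by the line-of-sight deviation of the antenna phase centre from the reference track, which for the two-way path is delta_phi = 4 * pi * delta_r / lambda. The Python sketch below applies that correction pulse by pulse to range-compressed data; it is a simplified, single-reference-range illustration derived from that standard relation, not the ECS/MOCO chain used in the paper.

        import numpy as np

        def first_order_moco(range_compressed, los_deviation, wavelength):
            """Apply first-order motion compensation to range-compressed SAR data.

            range_compressed : (n_pulses, n_range) complex samples
            los_deviation    : (n_pulses,) line-of-sight deviation (m) of the APC
                               from the reference track, e.g. derived from POS data
            wavelength       : radar wavelength (m)
            """
            phase_error = 4.0 * np.pi * los_deviation / wavelength   # two-way path
            correction = np.exp(-1j * phase_error)[:, None]          # one factor per pulse
            return range_compressed * correction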

  4. Real-time image-based B-mode ultrasound image simulation of needles using tensor-product interpolation.

    PubMed

    Zhu, Mengchen; Salcudean, Septimiu E

    2011-07-01

    In this paper, we propose an interpolation-based method for simulating rigid needles in B-mode ultrasound images in real time. We parameterize the needle B-mode image as a function of needle position and orientation. We collect needle images under various spatial configurations in a water-tank using a needle guidance robot. Then we use multidimensional tensor-product interpolation to simulate images of needles with arbitrary poses and positions using collected images. After further processing, the interpolated needle and seed images are superimposed on top of phantom or tissue image backgrounds. The similarity between the simulated and the real images is measured using a correlation metric. A comparison is also performed with in vivo images obtained during prostate brachytherapy. Our results, carried out for both the convex (transverse plane) and linear (sagittal/para-sagittal plane) arrays of a trans-rectal transducer indicate that our interpolation method produces good results while requiring modest computing resources. The needle simulation method we present can be extended to the simulation of ultrasound images of other wire-like objects. In particular, we have shown that the proposed approach can be used to simulate brachytherapy seeds.
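
    Tensor-product interpolation of the pre-acquired needle images can be pictured as blending the images recorded at the grid poses surrounding the requested pose, with weights formed from the outer product of one-dimensional linear weights. The sketch below does this for a two-parameter pose grid (depth, angle); the real system interpolates over more pose dimensions, so the grid layout and names here are illustrative assumptions.

        import numpy as np

        def interp_needle_image(image_grid, depths, angles, depth, angle):
            """Bilinear (tensor-product) interpolation of needle B-mode images.

            image_grid     : (n_depths, n_angles, H, W) images on a regular pose grid
            depths, angles : sorted 1-D arrays of the grid coordinates
            depth, angle   : requested pose, assumed to lie inside the grid
            """
            i = np.clip(np.searchsorted(depths, depth) - 1, 0, len(depths) - 2)
            j = np.clip(np.searchsorted(angles, angle) - 1, 0, len(angles) - 2)
            td = (depth - depths[i]) / (depths[i + 1] - depths[i])
            ta = (angle - angles[j]) / (angles[j + 1] - angles[j])
            # tensor-product weights = outer product of the 1-D linear weights
            w = np.array([[(1 - td) * (1 - ta), (1 - td) * ta],
                          [td * (1 - ta),       td * ta]])
            block = image_grid[i:i + 2, j:j + 2]                    # (2, 2, H, W)
            return np.tensordot(w, block, axes=([0, 1], [0, 1]))    # (H, W)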

  5. Miniature real-time intraoperative forward-imaging optical coherence tomography probe

    PubMed Central

    Joos, Karen M.; Shen, Jin-Hui

    2013-01-01

    Optical coherence tomography (OCT) has a tremendous global impact upon the ability to diagnose, treat, and monitor eye diseases. A miniature 25-gauge forward-imaging OCT probe with a disposable tip was developed for real-time intraoperative ocular imaging of posterior pole and peripheral structures to improve vitreoretinal surgery. The scanning range was 2 mm when the probe tip was held 3-4 mm from the tissue surface. The axial resolution was 4-6 µm and the lateral resolution was 25-35 µm. The probe was used to image cellophane tape and multiple ocular structures. PMID:24009997

  6. A flexible software architecture for scalable real-time image and video processing applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty of reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. Topic-based filtering, in which messages are published to topics, is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
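
    As a rough illustration of the topic-based publish/subscribe routing described for the messaging layer, the minimal broker below registers subscriber callbacks per topic and dispatches each published message only to the subscribers of that topic. The class and topic names are assumptions for illustration, not the paper's API.

        from collections import defaultdict
        from typing import Any, Callable

        class TopicBroker:
            """Minimal topic-based publish/subscribe message router."""

            def __init__(self) -> None:
                self._subscribers = defaultdict(list)  # topic -> list of callbacks

            def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
                self._subscribers[topic].append(callback)

            def publish(self, topic: str, message: Any) -> None:
                # route the message only to subscribers of this topic
                for callback in self._subscribers.get(topic, []):
                    callback(message)

        broker = TopicBroker()
        broker.subscribe("frames/raw", lambda frame: print("processing", frame["id"]))
        broker.publish("frames/raw", {"id": 1, "data": b"..."})
        broker.publish("frames/stats", {"fps": 25})   # no subscribers: silently dropped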

  7. SU-F-J-54: Towards Real-Time Volumetric Imaging Using the Treatment Beam and KV Beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, M; Rozario, T; Liu, A

    Purpose: Existing real-time imaging uses dual (orthogonal) kV beam fluoroscopies and may result in a significant amount of extra radiation to patients, especially for prolonged treatment cases. In addition, kV projections only provide 2D information, which is insufficient for in vivo dose reconstruction. We propose real-time volumetric imaging using prior knowledge of pre-treatment 4D images and real-time 2D transit data of the treatment beam and kV beam. Methods: The pre-treatment multi-snapshot volumetric images are used to simulate 2D projections of both the treatment beam and the kV beam, respectively, for each treatment field defined by the control point. During radiation delivery, the transit signals acquired by the electronic portal imaging device (EPID) are processed for every projection and compared with the pre-calculation by cross-correlation for phase matching and thus 3D snapshot identification, i.e. real-time volumetric imaging. The data processing involves taking logarithmic ratios of EPID signals with respect to the air scan to reduce modeling uncertainties in head scatter fluence and EPID response. Simulated 2D projections are also used to pre-calculate confidence levels in phase matching. Treatment beam projections that have a low confidence level either in pre-calculation or real-time acquisition will trigger kV beams so that complementary information can be exploited. In case both the treatment beam and kV beam return low confidence in phase matching, a predicted phase based on linear regression will be generated. Results: Simulation studies indicated treatment beams provide sufficient confidence in phase matching for most cases. At times of low confidence from treatment beams, kV imaging provides sufficient confidence in phase matching due to its complementary configuration. Conclusion: The proposed real-time volumetric imaging utilizes the treatment beam and triggers kV beams for complementary information when the treatment beam alone does not provide
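
    A hedged sketch of the phase-matching step outlined above: take the logarithmic ratio of the measured EPID frame to the air scan, compare it by normalized cross-correlation against the pre-calculated log-ratio projection of each breathing phase, and trigger kV imaging when the best correlation falls below a confidence threshold. The threshold value and all names below are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def match_phase(epid_frame, air_scan, phase_templates, confidence_threshold=0.9):
            """Match a transit EPID frame to one of the pre-computed breathing phases.

            epid_frame      : (H, W) measured transit signal
            air_scan        : (H, W) open-field (air) scan used for normalization
            phase_templates : (n_phases, H, W) pre-calculated log-ratio projections
            returns (best_phase_index, confidence, trigger_kv)
            """
            measured = np.log(epid_frame / air_scan).ravel()
            measured = (measured - measured.mean()) / (measured.std() + 1e-12)
            scores = []
            for template in phase_templates:
                t = template.ravel()
                t = (t - t.mean()) / (t.std() + 1e-12)
                scores.append(float(np.dot(measured, t) / measured.size))  # Pearson correlation
            best = int(np.argmax(scores))
            confidence = scores[best]
            trigger_kv = confidence < confidence_threshold   # fall back to the kV beam
            return best, confidence, trigger_kv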

  8. Real-time near IR (1310 nm) imaging of CO2 laser ablation of enamel.

    PubMed

    Darling, Cynthia L; Fried, Daniel

    2008-02-18

    The high transparency of dental enamel in the near-IR (NIR) can be exploited for real-time imaging of ablation crater formation during drilling with lasers. NIR images were acquired with an InGaAs focal plane array and a NIR zoom microscope during drilling incisions in human enamel samples with a λ = 9.3 µm CO2 laser operating at repetition rates of 50-300 Hz, with and without a water spray. Crack formation, dehydration and thermal changes were observed during ablation. These initial images demonstrate the potential of NIR imaging to monitor laser-ablation events in real time, providing information about the mechanism of ablation and allowing evaluation of the potential for peripheral thermal and mechanical damage.

  9. Transient imaging for real-time tracking around a corner

    NASA Astrophysics Data System (ADS)

    Klein, Jonathan; Laurenzis, Martin; Hullin, Matthias

    2016-10-01

    Non-line-of-sight imaging is a fascinating emerging area of research and is expected to have an impact in numerous application fields, including civilian and military sensing. Human perception and situational awareness could be extended by sensing shapes and movement around a corner in future scenarios. Rather than seeing through obstacles directly, non-line-of-sight imaging relies on analyzing indirect reflections of light that traveled around the obstacle. In previous work, transient imaging was established as the key mechanism enabling the extraction of useful information from such reflections. So far, a number of different approaches based on transient imaging have been proposed, with back projection being the most prominent one. Different hardware setups have been used for the acquisition of the required data; however, all of them have severe drawbacks such as limited image quality, long capture time or very high prices. In this paper we propose the analysis of synthetic transient renderings to gain more insight into transient light transport. With this simulated data, we are no longer bound to the imperfect data of real systems and gain more flexibility and control over the analysis. In a second part, we use the insights of our analysis to formulate a novel reconstruction algorithm. It uses an adapted light simulation to formulate an inverse problem which is solved in an analysis-by-synthesis fashion. Through rigorous optimization of the reconstruction, it then becomes possible to track known objects outside the line of sight in real time. Due to the forward formulation of the light transport, the algorithm is easily expandable to more general scenarios or different hardware setups. We therefore expect it to become a viable alternative to the classic back projection approach in the future.

  10. Image enhancement of real-time television to benefit the visually impaired.

    PubMed

    Wolffsohn, James S; Mukhopadhyay, Ditipriya; Rubinstein, Martin

    2007-09-01

    To examine the use of real-time, generic edge detection, image processing techniques to enhance the television viewing of the visually impaired. Prospective, clinical experimental study. One hundred and two sequential visually impaired participants (average age 73.8 ± 14.8 years; 59% female) in a single center optimized a dynamic television image with respect to edge detection filter (Prewitt, Sobel, or the two combined), color (red, green, blue, or white), and intensity (one to 15 times) of the overlaid edges. They then rated the original television footage compared with a black-and-white image displaying the edges detected and the original television image with the detected edges overlaid in the chosen color and at the intensity selected. Footage of news, an advertisement, and the end of program credits were subjectively assessed in a random order. A Prewitt filter was preferred (44%) compared with the Sobel filter (27%) or a combination of the two (28%). Green and white were equally popular for displaying the detected edges (32%), with blue (22%) and red (14%) less so. The average preferred edge intensity was 3.5 ± 1.7 times. The image-enhanced television was significantly preferred to the original (P < .001), which in turn was preferred to viewing the detected edges alone (P < .001) for each of the footage clips. Preference was not dependent on the condition causing visual impairment. Seventy percent were definitely willing to buy a set-top box that could achieve these effects for a reasonable price. Simple generic edge detection image enhancement options can be performed on television in real time and significantly enhance viewing for the visually impaired.
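
    A minimal sketch of this kind of enhancement, run once per video frame: detect edges with a Prewitt filter, then blend them over the original frame in a chosen colour at an adjustable intensity (the study's mean preferred intensity was about 3.5 times). The blending scheme and default values are illustrative assumptions, not the implementation evaluated in the study.

        import numpy as np
        from scipy import ndimage

        def overlay_edges(frame_rgb, color=(0, 255, 0), gain=3.5):
            """Overlay Prewitt-detected edges on an RGB frame.

            frame_rgb : (H, W, 3) uint8 image
            color     : RGB colour of the overlaid edges
            gain      : edge intensity multiplier
            """
            gray = frame_rgb.astype(float).mean(axis=2)
            gx = ndimage.prewitt(gray, axis=1)
            gy = ndimage.prewitt(gray, axis=0)
            edges = np.hypot(gx, gy)
            edges = gain * edges / (edges.max() + 1e-12)       # scaled edge strength
            alpha = np.clip(edges, 0.0, 1.0)[..., None]        # per-pixel blend weight
            out = (1 - alpha) * frame_rgb + alpha * np.array(color, dtype=float)
            return out.astype(np.uint8)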

  11. Preoperative magnetic resonance and intraoperative ultrasound fusion imaging for real-time neuronavigation in brain tumor surgery.

    PubMed

    Prada, F; Del Bene, M; Mattei, L; Lodigiani, L; DeBeni, S; Kolev, V; Vetrano, I; Solbiati, L; Sakas, G; DiMeco, F

    2015-04-01

    Brain shift and tissue deformation during surgery for intracranial lesions are the main current limitations of neuro-navigation (NN), which relies mainly on preoperative imaging. Ultrasound (US), being a real-time imaging modality, is becoming progressively more widespread during neurosurgical procedures, but most neurosurgeons, trained on axial computed tomography (CT) and magnetic resonance imaging (MRI) slices, lack specific US training and have difficulty recognizing anatomic structures with the same confidence as in preoperative imaging. Therefore, real-time intraoperative fusion imaging (FI) between preoperative imaging and intraoperative ultrasound (ioUS) for virtual navigation (VN) is highly desirable. We describe our procedure for real-time navigation during surgery for different cerebral lesions. We performed fusion imaging with virtual navigation for patients undergoing surgery for brain lesion removal using an ultrasound-based real-time neuro-navigation system that fuses intraoperative cerebral ultrasound with preoperative MRI and simultaneously displays an MRI slice coplanar to an ioUS image. 58 patients underwent surgery at our institution for intracranial lesion removal with image guidance using a US system equipped with fusion imaging for neuro-navigation. In all cases the initial (external) registration error obtained by the corresponding anatomical landmark procedure was below 2 mm and the craniotomy was correctly placed. The transdural window gave satisfactory US image quality and the lesion was always detectable and measurable on both axes. Brain shift/deformation correction was successfully employed in 42 cases to restore the co-registration during surgery. The accuracy of ioUS/MRI fusion/overlapping was confirmed intraoperatively under direct visualization of anatomic landmarks and the error was < 3 mm in all cases (100 %). Neuro-navigation using intraoperative US integrated with preoperative MRI is reliable, accurate

  12. Real-time look-up table-based color correction for still image stabilization of digital cameras without using frame memory

    NASA Astrophysics Data System (ADS)

    Luo, Lin-Bo; An, Sang-Woo; Wang, Chang-Shuai; Li, Ying-Chun; Chong, Jong-Wha

    2012-09-01

    Digital cameras usually decrease exposure time to capture motion-blur-free images. However, this operation will generate an under-exposed image with a low-budget complementary metal-oxide semiconductor image sensor (CIS). Conventional color correction algorithms can efficiently correct under-exposed images; however, they are generally not performed in real time and need at least one frame memory if they are implemented in hardware. The authors propose a real-time look-up table-based color correction method that corrects under-exposed images in hardware without using frame memory. The method utilizes histogram matching of two preview images, which are exposed for a long and a short time, respectively, to construct an improved look-up table (ILUT) and then corrects the captured under-exposed image in real time. Because the ILUT is calculated in real time before processing the captured image, this method does not require frame memory to buffer image data, and therefore can greatly reduce the cost of the CIS. This method supports not only single image capture but also bracketing to capture three images at a time. The proposed method was implemented in a hardware description language and verified on a field-programmable gate array with a 5 M CIS. Simulations show that the system can perform in real time at low cost and can correct the color of under-exposed images well.
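
    The improved look-up table can be pictured as a per-channel histogram match between the short- and long-exposure previews, applied afterwards to the captured under-exposed frame. The sketch below shows a standard CDF-matching construction in Python; it illustrates the general idea only and is not the authors' hardware ILUT.

        import numpy as np

        def build_lut(short_exposure, long_exposure):
            """Build a 256-entry LUT per channel by matching preview histograms.

            Both inputs are (H, W, 3) uint8 previews of the same scene.
            """
            luts = []
            for c in range(3):
                src = short_exposure[..., c].ravel()
                ref = long_exposure[..., c].ravel()
                src_cdf = np.cumsum(np.bincount(src, minlength=256)) / src.size
                ref_cdf = np.cumsum(np.bincount(ref, minlength=256)) / ref.size
                # map each source level to the reference level with the closest CDF
                lut = np.interp(src_cdf, ref_cdf, np.arange(256))
                luts.append(np.round(lut).astype(np.uint8))
            return np.stack(luts, axis=1)                      # (256, 3)

        def apply_lut(image, luts):
            """Correct a captured under-exposed uint8 image with the LUT."""
            out = np.empty_like(image)
            for c in range(3):
                out[..., c] = luts[image[..., c], c]
            return out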

  13. Real time diffuse reflectance polarisation spectroscopy imaging to evaluate skin microcirculation

    NASA Astrophysics Data System (ADS)

    O'Doherty, Jim; Henricson, Joakim; Nilsson, Gert E.; Anderson, Chris; Leahy, Martin J.

    2007-07-01

    This article describes the theoretical development and design of a real-time microcirculation imaging system, an extension of a technology previously developed by our group. The technology utilises polarisation spectroscopy, a technique used to selectively gate photons returning from various compartments of human skin tissue, namely from the superficial layers of the epidermis and from the deeper backscattered light of the dermal matrix. A consumer-end digital camcorder captures colour data with three individual CCDs, and a custom designed light source consisting of a 24-LED ring light provides broadband illumination over the 400 nm - 700 nm wavelength region. The theory developed leads to an image processing algorithm whose output scales linearly with increasing red blood cell (RBC) concentration. Processed images are displayed online in real time at a rate of 25 frames per second, at a frame size of 256 x 256 pixels, limited only by computer RAM and processing speed. General demonstrations of the technique in vivo show several advantages over similar technology.
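
    The abstract does not reproduce the algorithm itself; as a loosely hedged illustration of this class of polarisation-spectroscopy processing, a per-pixel red-green contrast of the cross-polarised colour image, normalised by the red channel, is often used as a quantity that grows approximately linearly with RBC concentration. The expression, calibration gain and clipping below are assumptions for illustration, not the authors' published algorithm.

        import numpy as np

        def rbc_index_map(cross_pol_rgb, gain=1.0):
            """Generic red-green contrast measure from a cross-polarised RGB frame.

            cross_pol_rgb : (H, W, 3) image in linear sensor units
            gain          : calibration constant mapping the ratio to an RBC-related index
            """
            r = cross_pol_rgb[..., 0].astype(float)
            g = cross_pol_rgb[..., 1].astype(float)
            ratio = (r - g) / (r + 1e-6)   # haemoglobin absorbs green more strongly than red
            return gain * np.clip(ratio, 0.0, None)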

  14. Near-Real-Time Earth Observation Data Supporting Wildfire Management

    NASA Astrophysics Data System (ADS)

    Ambrosia, V. G.; Zajkowski, T.; Quayle, B.

    2013-12-01

    During disaster events, the most critical element needed by responding personnel and management teams is situational intelligence / awareness. During rapidly evolving events such as wildfires, the need for timely information is critical to save lives, property and resources. The wildfire management agencies in the US rely heavily on remote sensing information, both from airborne platforms and from orbital assets. The ability to readily have information from those systems, not just data, is critical to effective control and damage mitigation. NASA has been collaborating with the USFS to mature and operationalize various asset-information capabilities to improve knowledge of fire-prone areas, monitor wildfire events in real time, assess the effectiveness of fire management strategies, and provide rapid post-fire assessment for recovery operations. Specific examples of near-real-time remote sensing asset utility include daily MODIS data employed to assess fire potential / wildfire hazard areas and national-scale hot-spot detection, airborne thermal sensor data collected during wildfire events to inform management strategies, EO-1 ALI 'pointable' satellite sensor data to assess fire-retardant application effectiveness, and Landsat 8 and other sensor data to derive burn severity indices for post-fire remediation work. These cases, in which near-real-time data was used operationally during the previous few fire seasons, will be presented.

  15. Method and apparatus for real time imaging and monitoring of radiotherapy beams

    DOEpatents

    Majewski, Stanislaw [Yorktown, VA; Proffitt, James [Newport News, VA; Macey, Daniel J [Birmingham, AL; Weisenberger, Andrew G [Yorktown, VA

    2011-11-01

    A method and apparatus for real time imaging and monitoring of radiation therapy beams is designed to preferentially distinguish and image low energy radiation from high energy secondary radiation emitted from a target as the result of therapeutic beam deposition. A detector having low sensitivity to high energy photons combined with a collimator designed to dynamically image in the region of the therapeutic beam target is used.

  16. Computer-aided diagnosis of colorectal polyp histology by using a real-time image recognition system and narrow-band imaging magnifying colonoscopy.

    PubMed

    Kominami, Yoko; Yoshida, Shigeto; Tanaka, Shinji; Sanomura, Yoji; Hirakawa, Tsubasa; Raytchev, Bisser; Tamaki, Toru; Koide, Tetsusi; Kaneda, Kazufumi; Chayama, Kazuaki

    2016-03-01

    It is necessary to establish cost-effective examinations and treatments for diminutive colorectal tumors that consider the treatment risk and the surveillance interval after treatment. The Preservation and Incorporation of Valuable Endoscopic Innovations (PIVI) committee of the American Society for Gastrointestinal Endoscopy published a statement recommending the establishment of endoscopic techniques that support the resect-and-discard strategy. The aims of this study were to evaluate whether our newly developed real-time image recognition system can predict histologic diagnoses of colorectal lesions depicted on narrow-band imaging and whether it can satisfy the PIVI recommendations. We enrolled 41 patients who had undergone endoscopic resection of 118 colorectal lesions (45 nonneoplastic lesions and 73 neoplastic lesions). We compared the results of real-time image recognition system analysis with those of narrow-band imaging diagnosis and evaluated the correlation between image analysis and the pathological results. Concordance between the endoscopic diagnosis and diagnosis by the real-time image recognition system with a support vector machine output value was 97.5% (115/118). Accuracy between the histologic findings of diminutive colorectal lesions (polyps) and diagnosis by the real-time image recognition system with a support vector machine output value was 93.2% (sensitivity, 93.0%; specificity, 93.3%; positive predictive value (PPV), 93.0%; and negative predictive value, 93.3%). Although further investigation is necessary to establish our computer-aided diagnosis system, this real-time image recognition system may satisfy the PIVI recommendations and be useful for predicting the histology of colorectal tumors. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  17. Real-time imaging through strongly scattering media: seeing through turbid media, instantly

    PubMed Central

    Sudarsanam, Sriram; Mathew, James; Panigrahi, Swapnesh; Fade, Julien; Alouini, Mehdi; Ramachandran, Hema

    2016-01-01

    Numerous everyday situations like navigation, medical imaging and rescue operations require viewing through optically inhomogeneous media. This is a challenging task, as photons propagate predominantly diffusively (rather than ballistically) due to random multiple scattering off the inhomogeneities. Real-time imaging with ballistic light under continuous-wave illumination is even more challenging due to the extremely weak signal, necessitating voluminous data processing. Here we report imaging through strongly scattering media in real time and at rates several times the critical flicker frequency of the eye, so that motion is perceived as continuous. Two factors contributed to the speedup of more than three orders of magnitude over conventional techniques: the use of a simplified algorithm enabling processing of data on the fly, and the utilisation of the task and data parallelization capabilities of typical desktop computers. The extreme simplicity of the technique, and its implementation with present-day low-cost technology, promises its utility in a variety of devices in maritime, aerospace, rail and road transport, in medical imaging and in defence. It is of equal interest to the general public and to adventure sportspeople such as hikers, divers and mountaineers, who frequently encounter situations requiring real-time imaging through obscuring media. As a specific example, navigation under poor visibility is examined. PMID:27114106

  18. High Resolution Near Real Time Image Processing and Support for MSSS Modernization

    NASA Astrophysics Data System (ADS)

    Duncan, R. B.; Sabol, C.; Borelli, K.; Spetka, S.; Addison, J.; Mallo, A.; Farnsworth, B.; Viloria, R.

    2012-09-01

    This paper describes image enhancement software applications engineering development work that has been performed in support of Maui Space Surveillance System (MSSS) Modernization. It also includes R&D and transition activity that has been performed over the past few years with the objective of providing increased space situational awareness (SSA) capabilities. This includes Air Force Research Laboratory (AFRL) use of an FY10 Dedicated High Performance Investment (DHPI) cluster award, and our selection and planned use of an FY12 DHPI award. We provide an introduction to image processing of electro-optical (EO) telescope sensor data and an overview of high resolution image enhancement and near real time processing status. We then describe recent image enhancement applications development and support for MSSS Modernization and results to date, and end with a discussion of desired future development work and conclusions. Significant improvements to image processing enhancement have been realized over the past several years, including a key application that has realized more than a 10,000-times speedup compared to the original R&D code, and a greater than 72-times speedup over the past few years. The latest version of this code maintains software efficiency for post-mission processing while providing optimization for image processing of data from a new EO sensor at MSSS. Additional work has also been performed to develop low latency, near real time processing of data collected by the ground-based sensor during overhead passes of space objects.

  19. Real-time contrast ultrasound muscle perfusion imaging with intermediate-power imaging coupled with acoustically durable microbubbles.

    PubMed

    Seol, Sang-Hoon; Davidson, Brian P; Belcik, J Todd; Mott, Brian H; Goodman, Reid M; Ammi, Azzdine; Lindner, Jonathan R

    2015-06-01

    There is growing interest in limb contrast-enhanced ultrasound (CEU) perfusion imaging for the evaluation of peripheral artery disease. Because of low resting microvascular blood flow in skeletal muscle, signal enhancement during limb CEU is prohibitively low for real-time imaging. The aim of this study was to test the hypothesis that this obstacle can be overcome by intermediate- rather than low-power CEU when performed with an acoustically resilient microbubble agent. Viscoelastic properties of Definity and Sonazoid were assessed by measuring bulk modulus during incremental increases in ambient pressure to 200 mm Hg. Comparison of in vivo microbubble destruction and signal enhancement at a mechanical index (MI) of 0.1 to 0.4 was performed by sequential reduction in pulsing interval from 10 to 0.05 sec during limb CEU at 7 MHz in mice and 1.8 MHz in dogs. Destruction was also assessed by broadband signal generation during passive cavitation detection. Real-time CEU perfusion imaging with destruction-replenishment was then performed at 1.8 MHz in dogs using an MI of 0.1, 0.2, or 0.3. Sonazoid had a higher bulk modulus than Definity (66 ± 12 vs 29 ± 2 kPa, P = .02) and exhibited less inertial cavitation (destruction) at MIs ≥ 0.2. On in vivo CEU, maximal signal intensity increased incrementally with MI for both agents and was equivalent between agents except at an MI of 0.1 (60% and 85% lower for Sonazoid at 7 and 1.8 MHz, respectively, P < .05). However, on progressive shortening of the pulsing interval, Definity was nearly completely destroyed at MIs ≥ 0.2 at 1.8 and 7 MHz, whereas Sonazoid was destroyed only at 1.8 MHz at MIs ≥ 0.3. As a result, real-time CEU perfusion imaging demonstrated approximately fourfold greater enhancement for Sonazoid at an MI of 0.3 to 0.4. Robust signal enhancement during real-time CEU perfusion imaging of the limb is possible when using intermediate-power imaging coupled with a durable microbubble

  20. A Real-Time Ultraviolet Radiation Imaging System Using an Organic Photoconductive Image Sensor†

    PubMed Central

    Okino, Toru; Yamahira, Seiji; Yamada, Shota; Hirose, Yutaka; Odagawa, Akihiro; Kato, Yoshihisa; Tanaka, Tsuyoshi

    2018-01-01

    We have developed a real-time ultraviolet (UV) imaging system that can visualize both invisible UV light and a visible (VIS) background scene in an outdoor environment. As the UV/VIS image sensor, an organic photoconductive film (OPF) imager is employed. The OPF has an intrinsically higher sensitivity in the UV wavelength region than conventional consumer Complementary Metal Oxide Semiconductor (CMOS) image sensors (CIS) or Charge Coupled Devices (CCD). As particular examples, imaging of a hydrogen flame and of corona discharge is demonstrated. UV images overlaid on background scenes are produced simply by on-board background subtraction. The system is capable of imaging UV signals four orders of magnitude weaker than the VIS background. It is applicable not only to future hydrogen supply stations but also to other UV/VIS monitoring systems requiring UV sensitivity under strong visible radiation environments, such as power supply substations. PMID:29361742

  1. Mid-level image representations for real-time heart view plane classification of echocardiograms.

    PubMed

    Penatti, Otávio A B; Werneck, Rafael de O; de Almeida, Waldir R; Stein, Bernardo V; Pazinato, Daniel V; Mendes Júnior, Pedro R; Torres, Ricardo da S; Rocha, Anderson

    2015-11-01

    In this paper, we explore mid-level image representations for real-time heart view plane classification of 2D echocardiogram ultrasound images. The proposed representations rely on bags of visual words, successfully used by the computer vision community in visual recognition problems. An important element of the proposed representations is the image sampling with large regions, drastically reducing the execution time of the image characterization procedure. Throughout an extensive set of experiments, we evaluate the proposed approach against different image descriptors for classifying four heart view planes. The results show that our approach is effective and efficient for the target problem, making it suitable for use in real-time setups. The proposed representations are also robust to different image transformations, e.g., downsampling, noise filtering, and different machine learning classifiers, keeping classification accuracy above 90%. Feature extraction can be performed at 30 fps, or 60 fps in some cases. This paper also includes an in-depth review of the literature in the area of automatic echocardiogram view classification, giving the reader a thorough comprehension of this field of study. Copyright © 2015 Elsevier Ltd. All rights reserved.
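
    A compact sketch of a bag-of-visual-words pipeline with large-region sampling, using scikit-learn for the codebook and classifier. The patch size, codebook size and classifier choice are illustrative assumptions rather than the paper's exact configuration.

        import numpy as np
        from sklearn.cluster import MiniBatchKMeans
        from sklearn.svm import LinearSVC

        def extract_regions(image, patch=32, stride=32):
            """Densely sample large square regions and flatten them into descriptors."""
            H, W = image.shape
            return np.asarray([image[y:y + patch, x:x + patch].ravel()
                               for y in range(0, H - patch + 1, stride)
                               for x in range(0, W - patch + 1, stride)], dtype=float)

        def bovw_histogram(image, codebook):
            """Quantize the region descriptors and return a normalized word histogram."""
            words = codebook.predict(extract_regions(image))
            hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
            return hist / (hist.sum() + 1e-12)

        def train_view_classifier(train_images, train_labels, n_words=128):
            descriptors = np.vstack([extract_regions(im) for im in train_images])
            codebook = MiniBatchKMeans(n_clusters=n_words, n_init=3).fit(descriptors)
            features = np.array([bovw_histogram(im, codebook) for im in train_images])
            classifier = LinearSVC().fit(features, train_labels)
            return codebook, classifier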

  2. Comparison of Three Real-Time Measurement Methods for Airborne Ultrafine Particles in the Silicon Alloy Industry.

    PubMed

    Kero, Ida Teresia; Jørgensen, Rikke Bramming

    2016-09-01

    The aim of this study was to compare the applicability and the correlation between three commercially available instruments capable of detection, quantification, and characterization of ultrafine airborne particulate matter in the industrial setting of a tapping area in a silicon alloy production plant. The number concentration of ultrafine particles was evaluated using an Electric Low Pressure Impactor (ELPI(TM)), a Fast Mobility Particle Sizer (FMPS(TM)), and a Condensation Particle Counter (CPC). The results are discussed in terms of particle size distribution and temporal variations linked to process operations. The instruments show excellent temporal covariation and the correlation between the FMPS and ELPI is good. The advantage of the FMPS is the excellent time- and size resolution of the results. The main advantage of the ELPI is the possibility to collect size-fractionated samples of the dust for subsequent analysis by, for example, electron microscopy. The CPC does not provide information about the particle size distribution and its correlation to the other two instruments is somewhat poor. Nonetheless, the CPC gives basic, real-time information about the ultrafine particle concentration and can therefore be used for source identification.

  3. Comparison of Three Real-Time Measurement Methods for Airborne Ultrafine Particles in the Silicon Alloy Industry

    PubMed Central

    Kero, Ida Teresia; Jørgensen, Rikke Bramming

    2016-01-01

    The aim of this study was to compare the applicability and the correlation between three commercially available instruments capable of detection, quantification, and characterization of ultrafine airborne particulate matter in the industrial setting of a tapping area in a silicon alloy production plant. The number concentration of ultrafine particles was evaluated using an Electric Low Pressure Impactor (ELPI(TM)), a Fast Mobility Particle Sizer (FMPS(TM)), and a Condensation Particle Counter (CPC). The results are discussed in terms of particle size distribution and temporal variations linked to process operations. The instruments show excellent temporal covariation and the correlation between the FMPS and ELPI is good. The advantage of the FMPS is the excellent time- and size resolution of the results. The main advantage of the ELPI is the possibility to collect size-fractionated samples of the dust for subsequent analysis by, for example, electron microscopy. The CPC does not provide information about the particle size distribution and its correlation to the other two instruments is somewhat poor. Nonetheless, the CPC gives basic, real-time information about the ultrafine particle concentration and can therefore be used for source identification. PMID:27598180

  4. Imaging multicellular specimens with real-time optimized tiling light-sheet selective plane illumination microscopy

    PubMed Central

    Fu, Qinyi; Martin, Benjamin L.; Matus, David Q.; Gao, Liang

    2016-01-01

    Despite the progress made in selective plane illumination microscopy, high-resolution 3D live imaging of multicellular specimens remains challenging. Tiling light-sheet selective plane illumination microscopy (TLS-SPIM) with real-time light-sheet optimization was developed to respond to the challenge. It improves the 3D imaging ability of SPIM in resolving complex structures and optimizes SPIM live imaging performance by using a real-time adjustable tiling light sheet and creating a flexible compromise between spatial and temporal resolution. We demonstrate the 3D live imaging ability of TLS-SPIM by imaging cellular and subcellular behaviours in live C. elegans and zebrafish embryos, and show how TLS-SPIM can facilitate cell biology research in multicellular specimens by studying left-right symmetry breaking behaviour of C. elegans embryos. PMID:27004937

  5. Cellular Neural Network for Real Time Image Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vagliasindi, G.; Arena, P.; Fortuna, L.

    2008-03-12

    Since their introduction in 1988, Cellular Nonlinear Networks (CNNs) have found a key role as image processing instruments. Thanks to their structure they are able to process individual pixels in a parallel way, providing fast image processing capabilities that have been applied to a wide range of fields, among which is nuclear fusion. In recent years, indeed, visible and infrared video cameras have become more and more important in tokamak fusion experiments for the twofold aim of understanding the physics and monitoring the safety of the operation. Examining the output of these cameras in real time can provide significant information for plasma control and safety of the machines. The potential of CNNs can be exploited to this aim. To demonstrate the feasibility of the approach, CNN image processing has been applied to several tasks both at the Frascati Tokamak Upgrade (FTU) and the Joint European Torus (JET).

  6. In-Vivo Real-Time X-ray μ-Imaging

    NASA Astrophysics Data System (ADS)

    Dammer, Jiri; Holy, Tomas; Jakubek, Jan; Jakubek, Martin; Pospisil, Stanislav; Vavrík, Daniel

    2007-11-01

    The technique of X-ray transmission imaging has been available for more than 100 years and is still one of the fastest and easiest ways to study the internal structure of living biological samples. Advances in semiconductor technology in recent years have made it possible to fabricate new types of X-ray detectors with direct conversion of the interacting X-ray photon to an electric signal. Semiconductor pixel detectors in particular seem very promising. Compared to the film technique, they provide single-quantum, real-time digital information about the studied object with high resolution, high sensitivity and broad dynamic range. Pixel detector-based imaging is promising as a new tool in the field of small animal imaging, for cancer research and for observation of dynamic processes inside organisms. These detectors open up new possibilities, for instance for researchers to perform non-invasive studies of tissue for mutations or pathologies and to monitor disease progression or response to therapy.

  7. Real-time magnetic resonance imaging of cardiac function and flow—recent progress

    PubMed Central

    Zhang, Shuo; Joseph, Arun A.; Voit, Dirk; Schaetz, Sebastian; Merboldt, Klaus-Dietmar; Unterberg-Buchwald, Christina; Hennemuth, Anja; Lotz, Joachim

    2014-01-01

    Cardiac structure, function and flow are most commonly studied by ultrasound, X-ray and magnetic resonance imaging (MRI) techniques. However, cardiovascular MRI is hitherto limited to electrocardiogram (ECG)-synchronized acquisitions and therefore often results in compromised quality for patients with arrhythmias or inabilities to comply with requested protocols—especially with breath-holding. Recent advances in the development of novel real-time MRI techniques now offer dynamic imaging of the heart and major vessels with high spatial and temporal resolution, so that examinations may be performed without the need for ECG synchronization and during free breathing. This article provides an overview of technical achievements, physiological validations, preliminary patient studies and translational aspects for a future clinical scenario of cardiovascular MRI in real time. PMID:25392819

  8. Towards real-time image deconvolution: application to confocal and STED microscopy

    PubMed Central

    Zanella, R.; Zanghirati, G.; Cavicchioli, R.; Zanni, L.; Boccacci, P.; Bertero, M.; Vicidomini, G.

    2013-01-01

    Although deconvolution can improve the quality of any type of microscope, the high computational time required has so far limited its widespread adoption. Here we demonstrate the ability of the scaled-gradient-projection (SGP) method to provide accelerated versions of the most used algorithms in microscopy. To achieve further increases in efficiency, we also consider implementations on graphics processing units (GPUs). We test the proposed algorithms both on synthetic and real data of confocal and STED microscopy. Combining the SGP method with the GPU implementation we achieve speed-up factors from about 25 to 690 with respect to the conventional algorithm. The excellent results obtained on STED microscopy images demonstrate the synergy between super-resolution techniques and image deconvolution. Further, the real-time processing allows conserving one of the most important properties of STED microscopy, i.e., the ability to provide fast sub-diffraction resolution recordings. PMID:23982127
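
    For context on what is being accelerated: one of the most used deconvolution algorithms in fluorescence microscopy is the Richardson-Lucy (maximum-likelihood) iteration, which scaled-gradient-projection methods speed up by scaling and projecting the update step. The plain iteration, with FFT-based circular convolution, is sketched below as a reference baseline; it is not the authors' SGP or GPU code.

        import numpy as np

        def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
            """Plain Richardson-Lucy deconvolution with circular (FFT) convolution.

            observed : non-negative image (any dimensionality)
            psf      : point spread function, same shape as `observed`, centred
            """
            psf = psf / psf.sum()
            otf = np.fft.rfftn(np.fft.ifftshift(psf))
            estimate = np.full(observed.shape, observed.mean(), dtype=float)
            for _ in range(n_iter):
                blurred = np.fft.irfftn(np.fft.rfftn(estimate) * otf, s=observed.shape)
                ratio = observed / (blurred + eps)
                # multiplicative update: estimate *= H^T(observed / H(estimate))
                estimate *= np.fft.irfftn(np.fft.rfftn(ratio) * np.conj(otf), s=observed.shape)
            return estimate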

  9. Geodetic Imaging for Rapid Assessment of Earthquakes: Airborne Laser Scanning (ALS)

    NASA Astrophysics Data System (ADS)

    Carter, W. E.; Shrestha, R. L.; Glennie, C. L.; Sartori, M.; Fernandez-Diaz, J.; National CenterAirborne Laser Mapping Operational Center

    2010-12-01

    To the residents of an area struck by a strong earthquake, quantitative information on damage to the infrastructure, and its attendant impact on relief and recovery efforts, is urgent and of primary concern. To earth scientists a strong earthquake offers an opportunity to learn more about earthquake mechanisms, and to compare their models with the real world, in hopes of one day being able to accurately predict the precise locations, magnitudes, and times of large (and potentially disastrous) earthquakes. Airborne laser scanning (also referred to as airborne LiDAR or Airborne Laser Swath Mapping) is particularly well suited for rapid assessment of earthquakes, both for immediately estimating the damage to infrastructure and for providing information for the scientific study of earthquakes. ALS observations collected at low altitude (500-1000 m) from a relatively slow (70-100 m/sec) aircraft can provide dense (5-15 points/m2) sets of surface features (buildings, vegetation, ground), extending over hundreds of square kilometers, with turn-around times of several hours to a few days. The actual response time to any given event depends on several factors, including bureaucratic issues such as approval of funds, export license formalities, and clearance to fly over the area to be mapped, as well as operational factors such as the deployment of the aircraft and ground teams, which may also take a number of days for remote locations. Of course, the need for immediate mapping of earthquake damage generally is not as urgent in remote regions with less infrastructure and few inhabitants. During August 16-19, 2010, the National Center for Airborne Laser Mapping (NCALM) mapped the area affected by the magnitude 7.2 El Mayor-Cucapah Earthquake (Northern Baja California Earthquake), which occurred on April 4, 2010, and was felt throughout southern California, Arizona, Nevada, and Baja California North, Mexico. From initial ground observations the fault rupture appeared to extend 75 km

  10. Microwave tomography for an effective imaging in GPR on UAV/airborne observational platforms

    NASA Astrophysics Data System (ADS)

    Soldovieri, Francesco; Catapano, Ilaria; Ludeno, Giovanni

    2017-04-01

    GPR was originally conceived as a non-invasive diagnostic technique working in contact with the underground or structure to be investigated. In recent years, however, several challenging needs and opportunities have made it necessary to work with antennas that are not in contact with the structure under investigation. This necessity arises, for example, in the case of landmine detection, but also in cultural heritage diagnostics. Another field of application concerns forward-looking GPR, aimed at shallow hidden targets ahead of the platform (vehicle) carrying the GPR [1]. Finally, a recent application concerns the deployment of airborne/UAV GPR, which offers several advantages in terms of large-scale surveys and freedom from logistics constraints [2]. For all the above-mentioned cases, the interest is in the development of effective data processing able to perform the imaging task in real time. The presentation will show different data processing strategies, based on microwave tomography [1,2], for reliable and real-time imaging in the case of GPR platforms far from the interface of the structure/underground to be investigated. [1] I. Catapano, A. Affinito, A. Del Moro, G. Alli, and F. Soldovieri, "Forward-Looking Ground-Penetrating Radar via a Linear Inverse Scattering Approach," IEEE Transactions on Geoscience and Remote Sensing, vol. 53, pp. 5624-5633, Oct. 2015. [2] I. Catapano, L. Crocco, Y. Krellmann, G. Triltzsch, and F. Soldovieri, "A tomographic approach for helicopter-borne ground penetrating radar imaging," IEEE Geosci. Remote Sens. Lett., vol. 9, no. 3, pp. 378-382, May 2012.

  11. Imaging the small animal cardiovascular system in real-time with multispectral optoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Taruttis, Adrian; Herzog, Eva; Razansky, Daniel; Ntziachristos, Vasilis

    2011-03-01

    Multispectral Optoacoustic Tomography (MSOT) is an emerging technique for high resolution macroscopic imaging with optical and molecular contrast. We present cardiovascular imaging results from a multi-element real-time MSOT system recently developed for studies on small animals. Anatomical features relevant to cardiovascular disease, such as the carotid arteries, the aorta and the heart, are imaged in mice. The system's fast acquisition time, in tens of microseconds, allows images free of motion artifacts from heartbeat and respiration. Additionally, we present in-vivo detection of optical imaging agents, gold nanorods, at high spatial and temporal resolution, paving the way for molecular imaging applications.

  12. Real-time distortion correction of spiral and echo planar images using the gradient system impulse response function.

    PubMed

    Campbell-Washburn, Adrienne E; Xue, Hui; Lederman, Robert J; Faranesh, Anthony Z; Hansen, Michael S

    2016-06-01

    MRI-guided interventions demand high frame rate imaging, making fast imaging techniques such as spiral imaging and echo planar imaging (EPI) appealing. In this study, we implemented a real-time distortion correction framework to enable the use of these fast acquisitions for interventional MRI. Distortions caused by gradient waveform inaccuracies were corrected using the gradient impulse response function (GIRF), which was measured by standard equipment and saved as a calibration file on the host computer. This file was used at runtime to calculate the predicted k-space trajectories for image reconstruction. Additionally, the off-resonance reconstruction frequency was modified in real time to interactively deblur spiral images. Real-time distortion correction for arbitrary image orientations was achieved in phantoms and healthy human volunteers. The GIRF-predicted k-space trajectories matched measured k-space trajectories closely for spiral imaging. Spiral and EPI image distortion was visibly improved using the GIRF-predicted trajectories. The GIRF calibration file showed no systematic drift in 4 months and was demonstrated to correct distortions after 30 min of continuous scanning despite gradient heating. Interactive off-resonance reconstruction was used to sharpen anatomical boundaries during continuous imaging. This real-time distortion correction framework will enable the use of these high frame rate imaging methods for MRI-guided interventions. Magn Reson Med 75:2278-2285, 2016. © 2015 Wiley Periodicals, Inc.
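
    The GIRF-based prediction amounts to filtering the nominal gradient waveform with the measured transfer function in the frequency domain and integrating the result to obtain the k-space trajectory. A minimal single-axis Python sketch is below; the sampling assumptions and the linear time-invariant model are simplifications, not the authors' full reconstruction pipeline.

        import numpy as np

        GAMMA_HZ_PER_T = 42.577478518e6   # gyromagnetic ratio of 1H

        def predict_k_trajectory(nominal_gradient, girf_spectrum, dt):
            """Predict the actual k-space trajectory for one gradient axis.

            nominal_gradient : (n,) programmed gradient waveform in T/m, sampled every dt
            girf_spectrum    : (n,) complex transfer function sampled on np.fft.fftfreq(n, dt)
            dt               : dwell time in seconds
            """
            actual_gradient = np.fft.ifft(np.fft.fft(nominal_gradient) * girf_spectrum).real
            # k(t) = gamma * integral of G(t') dt', in cycles per metre
            k_predicted = GAMMA_HZ_PER_T * np.cumsum(actual_gradient) * dt
            return actual_gradient, k_predicted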

  13. Real-time distortion correction of spiral and echo planar images using the gradient system impulse response function

    PubMed Central

    Campbell-Washburn, Adrienne E; Xue, Hui; Lederman, Robert J; Faranesh, Anthony Z; Hansen, Michael S

    2015-01-01

    Purpose MRI-guided interventions demand high frame-rate imaging, making fast imaging techniques such as spiral imaging and echo planar imaging (EPI) appealing. In this study, we implemented a real-time distortion correction framework to enable the use of these fast acquisitions for interventional MRI. Methods Distortions caused by gradient waveform inaccuracies were corrected using the gradient impulse response function (GIRF), which was measured by standard equipment and saved as a calibration file on the host computer. This file was used at runtime to calculate the predicted k-space trajectories for image reconstruction. Additionally, the off-resonance reconstruction frequency was modified in real-time to interactively de-blur spiral images. Results Real-time distortion correction for arbitrary image orientations was achieved in phantoms and healthy human volunteers. The GIRF predicted k-space trajectories matched measured k-space trajectories closely for spiral imaging. Spiral and EPI image distortion was visibly improved using the GIRF predicted trajectories. The GIRF calibration file showed no systematic drift in 4 months and was demonstrated to correct distortions after 30 minutes of continuous scanning despite gradient heating. Interactive off-resonance reconstruction was used to sharpen anatomical boundaries during continuous imaging. Conclusions This real-time distortion correction framework will enable the use of these high frame-rate imaging methods for MRI-guided interventions. PMID:26114951

  14. Real-time magnetic resonance imaging-guided transcatheter aortic valve replacement.

    PubMed

    Miller, Justin G; Li, Ming; Mazilu, Dumitru; Hunt, Tim; Horvath, Keith A

    2016-05-01

    To demonstrate the feasibility of real-time magnetic resonance imaging (rtMRI)-guided transcatheter aortic valve replacement (TAVR) with an active guidewire and an MRI-compatible valve delivery catheter system in a swine model. The CoreValve system was minimally modified to be MRI-compatible by replacing the stainless steel components with fluoroplastic resin and high-density polyethylene components. Eight swine weighing 60-90 kg underwent rtMRI-guided TAVR with an active guidewire through a left subclavian approach. Two imaging planes (long-axis view and short-axis view) were used simultaneously for real-time imaging during implantation. Successful deployment was performed without rapid ventricular pacing or cardiopulmonary bypass. Postdeployment images were acquired to evaluate the final valve position in addition to valvular and cardiac function. Our results show that the CoreValve can be easily and effectively deployed through a left subclavian approach using rtMRI guidance, a minimally modified valve delivery catheter system, and an active guidewire. This method allows superior visualization before deployment, thereby allowing placement of the valve with pinpoint accuracy. rtMRI has the added benefit of the ability to perform immediate postprocedural functional assessment, while eliminating the morbidity associated with radiation exposure, rapid ventricular pacing, contrast media renal toxicity, and a more invasive procedure. Use of a commercially available device brings this rtMRI-guided approach closer to clinical reality. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  15. Real-time emulation of neural images in the outer retinal circuit.

    PubMed

    Hasegawa, Jun; Yagi, Tetsuya

    2008-12-01

    We describe a novel real-time system that emulates the architecture and functionality of the vertebrate retina. This system reconstructs the neural images formed by the retinal neurons in real time by using a combination of analog and digital systems consisting of a neuromorphic silicon retina chip, a field-programmable gate array, and a digital computer. While the silicon retina carries out the spatial filtering of input images instantaneously, using the embedded resistive networks that emulate the receptive field structure of the outer retinal neurons, the digital computer carries out the temporal filtering of the spatially filtered images to emulate the dynamical properties of the outer retinal circuits. Neural images comprising 128 x 128 bipolar cells are emulated at a frame rate of 62.5 Hz. Emulations of the responses to the Hermann grid, a spot of light, and an annulus of light demonstrated that the system responds as expected from previous physiological and psychophysical observations. Furthermore, the emulated dynamics of neural images in response to natural scenes revealed the complex nature of retinal neuron activity. We have concluded that the system reflects the spatiotemporal responses of bipolar cells in the vertebrate retina. The proposed emulation system is expected to aid in understanding the visual computation in the retina and the brain.
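
    As a toy sketch of the kind of processing being emulated (not the actual chip or FPGA code), the snippet below applies a centre-surround difference-of-Gaussians spatial filter, standing in for the resistive-network receptive fields, followed by a first-order temporal low-pass filter standing in for the digital stage, on a stream of 128 x 128 frames at 62.5 Hz. Kernel sizes and the time constant are arbitrary.

    ```python
    import numpy as np

    def gaussian_kernel(size, sigma):
        """Normalized 2-D Gaussian kernel."""
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        return k / k.sum()

    def spatial_filter(frame, k_center, k_surround):
        """Centre-surround (difference-of-Gaussians) filtering.
        Uses circular convolution via FFT, which is adequate for this toy example."""
        def conv(img, ker):
            pad = np.zeros_like(img)
            pad[:ker.shape[0], :ker.shape[1]] = ker
            return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
        return conv(frame, k_center) - conv(frame, k_surround)

    # Emulate a stream of 128 x 128 frames at 62.5 Hz (frame period = 16 ms)
    rng = np.random.default_rng(0)
    k_c, k_s = gaussian_kernel(9, 1.0), gaussian_kernel(9, 3.0)
    dt, tau = 1 / 62.5, 0.05          # frame period and temporal time constant [s]
    alpha = dt / (tau + dt)           # first-order low-pass coefficient
    state = np.zeros((128, 128))      # emulated "bipolar cell" output image

    for _ in range(10):
        frame = rng.random((128, 128))           # stand-in for camera input
        s = spatial_filter(frame, k_c, k_s)      # outer-retina receptive field
        state += alpha * (s - state)             # temporal dynamics (low-pass)

    print("emulated neural image range:", state.min(), state.max())
    ```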

  16. Diffraction-limited real-time terahertz imaging by optical frequency up-conversion in a DAST crystal.

    PubMed

    Fan, Shuzhen; Qi, Feng; Notake, Takashi; Nawata, Kouji; Takida, Yuma; Matsukawa, Takeshi; Minamide, Hiroaki

    2015-03-23

    Real-time terahertz (THz) wave imaging has wide applications in areas such as security, industry, biology, medicine, pharmacy, and the arts. This report describes real-time room-temperature THz imaging by nonlinear optical frequency up-conversion in an organic 4-dimethylamino-N'-methyl-4'-stilbazolium tosylate (DAST) crystal, with high resolution reaching the diffraction limit. THz-wave images were converted to the near infrared region and then captured using an InGaAs camera in a tandem imaging system. The resolution of the imaging system was analyzed. Diffraction and interference of the THz wave were observed in the experiments. Videos are supplied to show how the interference pattern varies as the sample is moved and tilted.

  17. A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer

    NASA Astrophysics Data System (ADS)

    Luckman, Adrian J.; Allinson, Nigel M.

    1989-03-01

    A low cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a Reduced Instruction Set Computer (RISC) architecture, the ARM (Acorn Risc Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system is comprised of real-time video digitising hardware which interfaces directly to the Archimedes memory, and software to provide an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line with programmable parameters such as sampling rate and gain. Software support includes a work environment for image capture and processing with pixel, neighbourhood and global operators. A friendly user interface is provided with the help of the Archimedes Operating System WIMP (Windows, Icons, Mouse and Pointer) Manager. Windows provide a convenient way of handling images on the screen and program control is directed mostly by pop-up menus.
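
    For orientation only, the snippet below shows the flavour of the pixel and neighbourhood operators such an environment provides (a negation and a 3 x 3 mean filter); it is a generic sketch, not the Archimedes software.

    ```python
    import numpy as np

    def negate(img):
        """Pixel operator: photographic negative of an 8-bit image."""
        return 255 - img

    def mean3x3(img):
        """Neighbourhood operator: 3x3 mean filter (borders left untouched)."""
        out = img.astype(np.float32)
        acc = np.zeros_like(out[1:-1, 1:-1])
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                acc += img[1 + dy: img.shape[0] - 1 + dy,
                           1 + dx: img.shape[1] - 1 + dx]
        out[1:-1, 1:-1] = acc / 9.0
        return out.astype(np.uint8)

    # A made-up frame: 256 video lines of 640 samples each
    frame = np.random.randint(0, 256, (256, 640), dtype=np.uint8)
    print(mean3x3(negate(frame)).shape)
    ```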

  18. Imaging technique for real-time temperature monitoring during cryotherapy of lesions.

    PubMed

    Petrova, Elena; Liopo, Anton; Nadvoretskiy, Vyacheslav; Ermilov, Sergey

    2016-11-01

    Noninvasive real-time temperature imaging during thermal therapies is able to significantly improve clinical outcomes. An optoacoustic (OA) temperature monitoring method is proposed for noninvasive real-time thermometry of vascularized tissue during cryotherapy. The universal temperature-dependent optoacoustic response (ThOR) of red blood cells (RBCs) is employed to convert reconstructed OA images to temperature maps. To obtain the temperature calibration curve for intensity-normalized OA images, we measured ThOR of 10 porcine blood samples in the range of temperatures from 40°C to −16°C and analyzed the data for single measurement variations. The nonlinearity (ΔTmax) and the temperature of zero OA response (T0) of the calibration curve were found equal to 11.4±0.1°C and −13.8±0.1°C, respectively. The morphology of RBCs was examined before and after the data collection confirming cellular integrity and intracellular compartmentalization of hemoglobin. For temperatures below 0°C, which are of particular interest for cryotherapy, the accuracy of a single temperature measurement was ±1°C, which is consistent with the clinical requirements. Validation of the proposed OA temperature imaging technique was performed for slow and fast cooling of blood samples embedded in tissue-mimicking phantoms.
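
    The conversion from intensity-normalized OA images to temperature maps amounts to inverting a temperature calibration curve. The snippet below is a minimal sketch of that step under the assumption of a monotonic, tabulated calibration; the numbers are invented (only loosely anchored to the zero-response temperature reported above) and the function names are hypothetical, not the authors' calibration model.

    ```python
    import numpy as np

    # Hypothetical tabulated calibration: normalized OA intensity vs. temperature.
    # Values are illustrative only; the true ThOR curve must be measured.
    temp_cal = np.array([-16.0, -13.8, -10.0, 0.0, 10.0, 20.0, 30.0, 40.0])   # degC
    oa_cal   = np.array([-0.04,  0.00,  0.07, 0.25, 0.45, 0.63, 0.82, 1.00])  # a.u.

    def oa_to_temperature(oa_image):
        """Map an intensity-normalized OA image to a temperature map by
        inverting the (assumed monotonic) calibration curve."""
        return np.interp(oa_image, oa_cal, temp_cal)

    # Example: a small normalized OA image acquired during cooling
    oa_image = np.array([[0.95, 0.40], [0.10, 0.01]])
    print(oa_to_temperature(oa_image))
    ```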

  19. Imaging technique for real-time temperature monitoring during cryotherapy of lesions

    PubMed Central

    Petrova, Elena; Liopo, Anton; Nadvoretskiy, Vyacheslav; Ermilov, Sergey

    2016-01-01

    Noninvasive real-time temperature imaging during thermal therapies is able to significantly improve clinical outcomes. An optoacoustic (OA) temperature monitoring method is proposed for noninvasive real-time thermometry of vascularized tissue during cryotherapy. The universal temperature-dependent optoacoustic response (ThOR) of red blood cells (RBCs) is employed to convert reconstructed OA images to temperature maps. To obtain the temperature calibration curve for intensity-normalized OA images, we measured ThOR of 10 porcine blood samples in the range of temperatures from 40°C to −16°C and analyzed the data for single measurement variations. The nonlinearity (ΔTmax) and the temperature of zero OA response (T0) of the calibration curve were found equal to 11.4±0.1°C and −13.8±0.1°C, respectively. The morphology of RBCs was examined before and after the data collection confirming cellular integrity and intracellular compartmentalization of hemoglobin. For temperatures below 0°C, which are of particular interest for cryotherapy, the accuracy of a single temperature measurement was ±1°C, which is consistent with the clinical requirements. Validation of the proposed OA temperature imaging technique was performed for slow and fast cooling of blood samples embedded in tissue-mimicking phantoms. PMID:27822579

  20. Efficient Imaging and Real-Time Display of Scanning Ion Conductance Microscopy Based on Block Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Li, Gongxin; Li, Peng; Wang, Yuechao; Wang, Wenxue; Xi, Ning; Liu, Lianqing

    2014-07-01

    Scanning Ion Conductance Microscopy (SICM) is one kind of Scanning Probe Microscopy (SPM), and it is widely used for imaging soft samples because of its many distinctive advantages. However, the scanning speed of SICM is much slower than that of other SPMs. Compressive sensing (CS) can improve scanning speed tremendously by sampling below the rate demanded by the Shannon sampling theorem, but it still requires considerable time for image reconstruction. Block compressive sensing can be applied to SICM imaging to further reduce the reconstruction time of sparse signals, and it has the additional, unique benefit of enabling real-time image display during SICM imaging. In this article, a new method of dividing blocks and a new matrix arithmetic operation are proposed to build the block compressive sensing model, and several experiments were carried out to verify the superiority of block compressive sensing in reducing imaging time and enabling real-time display in SICM imaging.
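
    A compact sketch of the block-wise idea follows: each image block is measured with its own small random matrix and reconstructed independently (here with ISTA under a DCT sparsity assumption), so a block can be displayed as soon as it has been recovered. The block size, measurement ratio, and solver are illustrative choices, not those of the paper.

    ```python
    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II matrix (sparsifying basis for smooth image blocks)."""
        k = np.arange(n)[:, None]
        i = np.arange(n)[None, :]
        D = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
        D[0, :] = np.sqrt(1.0 / n)
        return D

    def ista(A, y, lam=0.05, iters=200):
        """A few iterations of ISTA to solve min ||y - Ax||^2/2 + lam*||x||_1."""
        x = np.zeros(A.shape[1])
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
        for _ in range(iters):
            x = x + A.T @ (y - A @ x) / L
            x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)
        return x

    rng = np.random.default_rng(1)
    B = 8                                      # block size (8x8 pixels)
    D = dct_matrix(B)
    Psi = np.kron(D, D).T                      # orthonormal 2-D DCT basis (64 x 64)
    m = 24                                     # measurements per block (~37% sampling)
    Phi = rng.standard_normal((m, B * B)) / np.sqrt(m)

    block = np.outer(np.linspace(0, 1, B), np.ones(B))   # smooth test block
    y = Phi @ block.ravel()                    # compressed measurements of this block
    coeffs = ista(Phi @ Psi, y)                # recover sparse DCT coefficients
    recon = (Psi @ coeffs).reshape(B, B)       # block ready for immediate display
    print("block reconstruction error:", np.linalg.norm(recon - block))
    ```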

  1. Image Corruption Detection in Diffusion Tensor Imaging for Post-Processing and Real-Time Monitoring

    PubMed Central

    Li, Yue; Shea, Steven M.; Lorenz, Christine H.; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer from a significant number of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when multiple data points are corrupted this method may no longer correctly identify and reject them. In this paper, we introduce a new criterion called “corrected Inter-Slice Intensity Discontinuity” (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies. PMID:24204551
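
    The abstract does not give the cISID formula, so the sketch below only illustrates the general shape of an inter-slice intensity-discontinuity check: compare each slice's mean intensity with the average of its neighbours and flag slices whose discontinuity is a robust outlier. The names, the discontinuity measure, and the threshold are all hypothetical.

    ```python
    import numpy as np

    def slice_discontinuity_scores(volume):
        """Per-slice discontinuity: deviation of a slice's mean intensity from the
        average of its two neighbours (edge slices reuse their single neighbour).
        A simplified stand-in for the cISID criterion, not its exact form."""
        means = volume.reshape(volume.shape[0], -1).mean(axis=1)
        padded = np.concatenate([means[:1], means, means[-1:]])
        neighbour_avg = 0.5 * (padded[:-2] + padded[2:])
        return np.abs(means - neighbour_avg)

    def flag_corrupted_slices(volume, n_mad=4.0):
        """Flag slices whose discontinuity exceeds median + n_mad * MAD."""
        s = slice_discontinuity_scores(volume)
        med = np.median(s)
        mad = np.median(np.abs(s - med)) + 1e-12
        return np.where(s > med + n_mad * mad)[0]

    rng = np.random.default_rng(0)
    vol = rng.normal(100.0, 1.0, size=(40, 64, 64))   # 40 slices of a toy DWI volume
    vol[17] *= 0.5                                     # simulate motion-induced dropout
    print("flagged slices:", flag_corrupted_slices(vol))
    ```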

  2. Visualisation and quantitative analysis of the rodent malaria liver stage by real time imaging.

    PubMed

    Ploemen, Ivo H J; Prudêncio, Miguel; Douradinha, Bruno G; Ramesar, Jai; Fonager, Jannik; van Gemert, Geert-Jan; Luty, Adrian J F; Hermsen, Cornelus C; Sauerwein, Robert W; Baptista, Fernanda G; Mota, Maria M; Waters, Andrew P; Que, Ivo; Lowik, Clemens W G M; Khan, Shahid M; Janse, Chris J; Franke-Fayard, Blandine M D

    2009-11-18

    The quantitative analysis of Plasmodium development in the liver in laboratory animals and in cultured cells is hampered by low parasite infection rates and the complicated methods required to monitor intracellular development. As a consequence, this important phase of the parasite's life cycle has been poorly studied compared to blood stages, for example in screening anti-malarial drugs. Here we report the use of a transgenic P. berghei parasite, PbGFP-Luc(con), expressing the bioluminescent reporter protein luciferase to visualize and quantify parasite development in liver cells both in culture and in live mice using real-time luminescence imaging. The reporter-parasite based quantification in cultured hepatocytes by real-time imaging or using a microplate reader correlates very well with established quantitative RT-PCR methods. For the first time the liver stage of Plasmodium is visualized in whole bodies of live mice and we were able to discriminate as few as 1-5 infected hepatocytes per liver in mice using 2D-imaging and to identify individual infected hepatocytes by 3D-imaging. The analysis of liver infections by whole body imaging shows a good correlation with quantitative RT-PCR analysis of extracted livers. The luminescence-based analysis of the effects of various drugs on in vitro hepatocyte infection shows that this method can effectively be used for in vitro screening of compounds targeting Plasmodium liver stages. Furthermore, by analysing the effect of primaquine and tafenoquine in vivo we demonstrate the applicability of real time imaging to assess parasite drug sensitivity in the liver. The simplicity and speed of quantitative analysis of liver-stage development by real-time imaging compared to the PCR methodologies, as well as the possibility to analyse liver development in live mice without surgery, opens up new possibilities for research on Plasmodium liver infections and for validating the effect of drugs and vaccines on the liver stage of

  3. Visualisation and Quantitative Analysis of the Rodent Malaria Liver Stage by Real Time Imaging

    PubMed Central

    Douradinha, Bruno G.; Ramesar, Jai; Fonager, Jannik; van Gemert, Geert-Jan; Luty, Adrian J. F.; Hermsen, Cornelus C.; Sauerwein, Robert W.; Baptista, Fernanda G.; Mota, Maria M.; Waters, Andrew P.; Que, Ivo; Lowik, Clemens W. G. M.; Khan, Shahid M.; Janse, Chris J.; Franke-Fayard, Blandine M. D.

    2009-01-01

    The quantitative analysis of Plasmodium development in the liver in laboratory animals and in cultured cells is hampered by low parasite infection rates and the complicated methods required to monitor intracellular development. As a consequence, this important phase of the parasite's life cycle has been poorly studied compared to blood stages, for example in screening anti-malarial drugs. Here we report the use of a transgenic P. berghei parasite, PbGFP-Luc(con), expressing the bioluminescent reporter protein luciferase to visualize and quantify parasite development in liver cells both in culture and in live mice using real-time luminescence imaging. The reporter-parasite based quantification in cultured hepatocytes by real-time imaging or using a microplate reader correlates very well with established quantitative RT-PCR methods. For the first time the liver stage of Plasmodium is visualized in whole bodies of live mice and we were able to discriminate as few as 1–5 infected hepatocytes per liver in mice using 2D-imaging and to identify individual infected hepatocytes by 3D-imaging. The analysis of liver infections by whole body imaging shows a good correlation with quantitative RT-PCR analysis of extracted livers. The luminescence-based analysis of the effects of various drugs on in vitro hepatocyte infection shows that this method can effectively be used for in vitro screening of compounds targeting Plasmodium liver stages. Furthermore, by analysing the effect of primaquine and tafenoquine in vivo we demonstrate the applicability of real time imaging to assess parasite drug sensitivity in the liver. The simplicity and speed of quantitative analysis of liver-stage development by real-time imaging compared to the PCR methodologies, as well as the possibility to analyse liver development in live mice without surgery, opens up new possibilities for research on Plasmodium liver infections and for validating the effect of drugs and vaccines on the liver stage of

  4. Image segmentation based upon topological operators: real-time implementation case study

    NASA Astrophysics Data System (ADS)

    Mahmoudi, R.; Akil, M.

    2009-02-01

    In many image-processing applications, thinning and crest restoration are of considerable interest. The preferred algorithms for these procedures are those able to act directly on grayscale images while preserving topology, but their high computational cost remains the major drawback to their adoption. In this paper we present an efficient hardware implementation, on a RISC processor, of two powerful thinning and crest-restoration algorithms developed by our team. The proposed implementation reduces execution time. A segmentation chain applied to medical imaging serves as a concrete example to illustrate the improvements brought by optimization at both the algorithmic and architectural levels. In particular, use of the SSE instruction set of x86_32 processors (Pentium IV, 3.06 GHz) enables real-time performance: a throughput of 33 images (512*512) per second is achieved.

  5. Global meteorological data facility for real-time field experiments support and guidance

    NASA Technical Reports Server (NTRS)

    Shipham, Mark C.; Shipley, Scott T.; Trepte, Charles R.

    1988-01-01

    A Global Meteorological Data Facility (GMDF) has been constructed to provide economical real-time meteorological support to atmospheric field experiments. After collection and analysis of meteorological data sets at a central station, tailored meteorological products are transmitted to experiment field sites using conventional ground link or satellite communication techniques. The GMDF supported the Global Tropospheric Experiment Amazon Boundary Layer Experiment (GTE-ABLE II) based in Manaus, Brazil, during July and August 1985; an arctic airborne lidar survey mission for the Polar Stratospheric Clouds (PSC) experiment during January 1986; and the Genesis of Atlantic Lows Experiment (GALE) during January, February and March 1986. GMDF structure is similar to the UNIDATA concept, including meteorological data from the Zephyr Weather Transmission Service, a mode AAA GOES downlink, and dedicated processors for image manipulation, transmission and display. The GMDF improved field experiment operations in general, with the greatest benefits arising from the ability to communicate with field personnel in real time.

  6. Real-time intravascular photoacoustic-ultrasound imaging of lipid-laden plaque at speed of video-rate level

    NASA Astrophysics Data System (ADS)

    Hui, Jie; Cao, Yingchun; Zhang, Yi; Kole, Ayeeshik; Wang, Pu; Yu, Guangli; Eakins, Gregory; Sturek, Michael; Chen, Weibiao; Cheng, Ji-Xin

    2017-03-01

    Intravascular photoacoustic-ultrasound (IVPA-US) imaging is an emerging hybrid modality for the detection of lipid-laden plaques by providing simultaneous morphological and lipid-specific chemical information of an artery wall. The clinical utility of IVPA-US technology requires real-time imaging and display at video-rate speeds. Here, we demonstrate a compact and portable IVPA-US system capable of imaging at up to 25 frames per second in real-time display mode. This unprecedented imaging speed was achieved by concurrent innovations in excitation laser source, rotary joint assembly, 1 mm IVPA-US catheter, differentiated A-line strategy, and real-time image processing and display algorithms. By imaging pulsatile motion at different imaging speeds, 16 frames per second was deemed to be adequate to suppress motion artifacts from cardiac pulsation for in vivo applications. Our lateral resolution results further verified the number of A-lines used for a cross-sectional IVPA image reconstruction. The translational capability of this system for the detection of lipid-laden plaques was validated by ex vivo imaging of an atherosclerotic human coronary artery at 16 frames per second, which showed strong correlation to gold-standard histopathology.

  7. Real-time Flare Detection in Ground-Based Hα Imaging at Kanzelhöhe Observatory

    NASA Astrophysics Data System (ADS)

    Pötzi, W.; Veronig, A. M.; Riegler, G.; Amerstorfer, U.; Pock, T.; Temmer, M.; Polanec, W.; Baumgartner, D. J.

    2015-03-01

    Kanzelhöhe Observatory (KSO) regularly performs high-cadence full-disk imaging of the solar chromosphere in the Hα and Ca ii K spectral lines as well as in the solar photosphere in white light. In the frame of ESA's (European Space Agency) Space Situational Awareness (SSA) program, a new system for real-time Hα data provision and automatic flare detection was developed at KSO. The data and events detected are published in near real-time at ESA's SSA Space Weather portal (http://swe.ssa.esa.int/web/guest/kso-federated). In this article, we describe the Hα instrument, the image-recognition algorithms we developed, and the implementation into the KSO Hα observing system. We also present the evaluation results of the real-time data provision and flare detection for a period of five months. The Hα data provision worked in 99.96 % of the images, with a mean time lag of four seconds between image recording and online provision. Within the given criteria for the automatic image-recognition system (at least three Hα images are needed for a positive detection), all flares with an area ≥ 50 micro-hemispheres that were located within 60° of the solar center and occurred during the KSO observing times were detected, 87 events in total. The automatically determined flare importance and brightness classes were correct in ~85 % of cases. The mean flare positions in heliographic longitude and latitude were correct to within ~1°. The median of the absolute differences for the flare start and peak times from the automatic detections in comparison with the official NOAA (and KSO) visual flare reports was 3 min (1 min).
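
    The selection criteria quoted above (at least three consecutive Hα detections, area ≥ 50 micro-hemispheres, within 60° of disk centre) can be restated as a simple filter over candidate events; the sketch below does exactly that with invented data structures and is not the KSO recognition code.

    ```python
    from dataclasses import dataclass
    import math

    @dataclass
    class Candidate:
        """A candidate brightening found by an image-recognition step (hypothetical)."""
        area_mh: float        # area in micro-hemispheres
        lon_deg: float        # heliographic longitude
        lat_deg: float        # heliographic latitude
        n_images: int         # number of consecutive H-alpha images it appears in

    def angular_distance_from_centre(lon_deg, lat_deg):
        """Great-circle angle between the event and the apparent disk centre [deg],
        ignoring the small solar B0 tilt for simplicity."""
        lon, lat = math.radians(lon_deg), math.radians(lat_deg)
        return math.degrees(math.acos(math.cos(lat) * math.cos(lon)))

    def is_reportable_flare(c, min_area=50.0, max_angle=60.0, min_images=3):
        """Apply the published KSO selection criteria to a candidate event."""
        return (c.area_mh >= min_area
                and c.n_images >= min_images
                and angular_distance_from_centre(c.lon_deg, c.lat_deg) <= max_angle)

    events = [Candidate(120.0, 25.0, -10.0, 5), Candidate(30.0, 5.0, 5.0, 4),
              Candidate(200.0, 70.0, 20.0, 6)]
    print([is_reportable_flare(e) for e in events])   # [True, False, False]
    ```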

  8. Unprocessed real-time imaging of vitreoretinal surgical maneuvers using a microscope-integrated spectral-domain optical coherence tomography system.

    PubMed

    Hahn, Paul; Migacz, Justin; O'Connell, Rachelle; Izatt, Joseph A; Toth, Cynthia A

    2013-01-01

    We have recently developed a microscope-integrated spectral-domain optical coherence tomography (MIOCT) device towards intrasurgical cross-sectional imaging of surgical maneuvers. In this report, we explore the capability of MIOCT to acquire real-time video imaging of vitreoretinal surgical maneuvers without post-processing modifications. Standard 3-port vitrectomy was performed in human during scheduled surgery as well as in cadaveric porcine eyes. MIOCT imaging of human subjects was performed in healthy normal volunteers and intraoperatively at a normal pause immediately following surgical manipulations, under an Institutional Review Board-approved protocol, with informed consent from all subjects. Video MIOCT imaging of live surgical manipulations was performed in cadaveric porcine eyes by carefully aligning B-scans with instrument orientation and movement. Inverted imaging was performed by lengthening of the reference arm to a position beyond the choroid. Unprocessed MIOCT imaging was successfully obtained in healthy human volunteers and in human patients undergoing surgery, with visualization of post-surgical changes in unprocessed single B-scans. Real-time, unprocessed MIOCT video imaging was successfully obtained in cadaveric porcine eyes during brushing of the retina with the Tano scraper, peeling of superficial retinal tissue with intraocular forceps, and separation of the posterior hyaloid face. Real-time inverted imaging enabled imaging without complex conjugate artifacts. MIOCT is capable of unprocessed imaging of the macula in human patients undergoing surgery and of unprocessed, real-time, video imaging of surgical maneuvers in model eyes. These capabilities represent an important step towards development of MIOCT for efficient, real-time imaging of manipulations during human surgery.

  9. The near real time image navigation of pictures returned by Voyager 2 at Neptune

    NASA Technical Reports Server (NTRS)

    Underwood, Ian M.; Bachman, Nathaniel J.; Taber, William L.; Wang, Tseng-Chan; Acton, Charles H.

    1990-01-01

    The development of a process for performing image navigation in near real time is described. The process was used to accurately determine the camera pointing for pictures returned by the Voyager 2 spacecraft at Neptune Encounter. Image navigation improves knowledge of the pointing of an imaging instrument at a particular epoch by correlating the spacecraft-relative locations of target bodies in inertial space with the locations of their images in a picture taken at that epoch. More than 8,500 pictures returned by Voyager 2 at Neptune were processed in near real time. The results were used in several applications, including improving pointing knowledge for nonimaging instruments ('C-smithing'), making 'Neptune, the Movie', and providing immediate access to geometrical quantities similar to those traditionally supplied in the Supplementary Experiment Data Record.

  10. Fluorescence particle detector for real-time quantification of viable organisms in air

    NASA Astrophysics Data System (ADS)

    Luoma, Greg; Cherrier, Pierre P.; Piccioni, Marc; Tanton, Carol; Herz, Steve; DeFreez, Richard K.; Potter, Michael; Girvin, Kenneth L.; Whitney, Ronald

    2002-02-01

    The ability to detect viable organisms in air in real time is important in a number of applications. Detecting high levels of airborne organisms in hospitals can prevent post-operative infections and the spread of diseases. Monitoring levels of airborne viable organisms in pharmaceutical facilities can ensure safe production of drugs or vaccines. Monitoring airborne bacterial levels in meat processing plants can help to prevent contamination of food products. Monitoring the level of airborne organisms in bio-containment facilities can ensure that proper procedures are being followed. Finally, detecting viable organisms in real time is a key to defending against biological agent attacks. This presentation describes the development and performance of a detector, based on fluorescence particle counting technology, in which an ultraviolet laser is used to count particles by light scattering and to elicit fluorescence from specific biomolecules found only in living organisms. The resulting detector can specifically detect airborne particles containing living organisms from among the large majority of other particles normally present in air. Efforts to develop the core sensor technology, focusing on integrating a UV laser with a specially designed particle-counting cell, will be highlighted. The hardware and software used to capture information from the sensor and to provide an alarm in the presence of unusual biological aerosol content will also be described. Finally, results from experiments to test the performance of the detector will be presented.

  11. A real-time chirp-coded imaging system with tissue attenuation compensation.

    PubMed

    Ramalli, A; Guidi, F; Boni, E; Tortoli, P

    2015-07-01

    In ultrasound imaging, pulse compression methods based on the transmission (TX) of long coded pulses and matched receive filtering can be used to improve the penetration depth while preserving the axial resolution (coded-imaging). The performance of most of these methods is affected by the frequency dependent attenuation of tissue, which causes mismatch of the receiver filter. This, together with the involved additional computational load, has probably so far limited the implementation of pulse compression methods in real-time imaging systems. In this paper, a real-time low-computational-cost coded-imaging system operating on the beamformed and demodulated data received by a linear array probe is presented. The system has been implemented by extending the firmware and the software of the ULA-OP research platform. In particular, pulse compression is performed by exploiting the computational resources of a single digital signal processor. Each image line is produced in less than 20 μs, so that, e.g., 192-line frames can be generated at up to 200 fps. Although the system may work with a large class of codes, this paper has been focused on the test of linear frequency modulated chirps. The new system has been used to experimentally investigate the effects of tissue attenuation so that the design of the receive compression filter can be accordingly guided. Tests made with different chirp signals confirm that, although the attainable compression gain in attenuating media is lower than the theoretical value expected for a given TX Time-Bandwidth product (BT), good SNR gains can be obtained. For example, by using a chirp signal having BT=19, a 13 dB compression gain has been measured. By adapting the frequency band of the receiver to the band of the received echo, the signal-to-noise ratio and the penetration depth have been further increased, as shown by real-time tests conducted on phantoms and in vivo. In particular, a 2.7 dB SNR increase has been measured through a
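
    For orientation, a small sketch of linear-FM chirp compression by matched filtering follows, with a crude frequency-dependent attenuation applied to the echo to show why the receive filter becomes mismatched; the bandwidth, duration, and attenuation law are arbitrary and unrelated to the ULA-OP settings.

    ```python
    import numpy as np

    fs = 50e6                          # sampling rate [Hz]
    T = 10e-6                          # chirp duration [s]
    f0, f1 = 3e6, 7e6                  # swept band [Hz] -> BT = 4 MHz * 10 us = 40
    t = np.arange(0, T, 1 / fs)

    # Linear frequency-modulated (chirp) transmit pulse
    k = (f1 - f0) / T
    tx = np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

    # Crude frequency-dependent attenuation of the echo (higher frequencies lose more)
    TX = np.fft.rfft(tx)
    freqs = np.fft.rfftfreq(len(tx), 1 / fs)
    echo = np.fft.irfft(TX * np.exp(-freqs / 10e6), n=len(tx))

    # Matched filter = time-reversed replica of the transmitted chirp
    compressed_ideal = np.convolve(tx, tx[::-1])
    compressed_atten = np.convolve(echo, tx[::-1])

    gain_loss_db = 20 * np.log10(compressed_ideal.max() / compressed_atten.max())
    print(f"peak loss due to attenuation mismatch: {gain_loss_db:.1f} dB")
    ```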

  12. Semi-physical Simulation of the Airborne InSAR based on Rigorous Geometric Model and Real Navigation Data

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; yuquan, Liu; Xijuan, Yue; Yinghui, Zhao

    2014-03-01

    Raw signal simulation is a useful tool for the system design, mission planning, processing algorithm testing, and inversion algorithm design of Synthetic Aperture Radar (SAR). Because of the wide-ranging and high-frequency variations of the aircraft's trajectory and attitude, and the limited accuracy of the Position and Orientation System (POS) recordings, it is difficult to quantitatively study the sensitivity of the key parameters of an airborne Interferometric SAR (InSAR) system, i.e., the baseline length and inclination, the absolute phase, and the orientation of the antennas, which poses challenges for its applications. Furthermore, imprecise estimation of the installation offsets between the Global Positioning System (GPS), the Inertial Measurement Unit (IMU), and the InSAR antennas compounds the issue. An airborne InSAR simulation based on a rigorous geometric model and real navigation data is proposed in this paper, providing a way to quantitatively study the key parameters and to evaluate their effect on airborne InSAR applications such as photogrammetric mapping, high-resolution Digital Elevation Model (DEM) generation, and surface deformation mapping by differential InSAR. The simulation can also provide a reference for the optimal design of the InSAR system and for the improvement of InSAR data processing technologies such as motion compensation, imaging, image co-registration, and application parameter retrieval.
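
    As a reminder of why the baseline length and inclination and the absolute phase are the sensitive quantities, the standard textbook across-track InSAR relations can be written as below (a generic form, with signs and the factor p depending on convention and on single- versus repeat-pass operation; this is not the simulator's specific model):

    ```latex
    % Textbook across-track InSAR geometry: interferometric phase vs. baseline,
    % and target height from the look angle (flat-Earth, single-scatterer case).
    \[
    \begin{aligned}
      \phi &= \frac{2\pi p}{\lambda}\,(r_1 - r_2)
            \;\approx\; \frac{2\pi p}{\lambda}\, B \sin(\theta - \alpha), \\
      h    &= H - r_1 \cos\theta ,
    \end{aligned}
    \]
    ```

    where φ is the absolute interferometric phase, λ the wavelength, B and α the baseline length and inclination, θ the look angle, r1 the slant range from the reference antenna, H the flight altitude, and p = 1 or 2 for single- or repeat-pass operation; errors in B, α, or φ therefore propagate directly into the reconstructed height h.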

  13. A real-time electronic imaging system for solar X-ray observations from sounding rockets

    NASA Technical Reports Server (NTRS)

    Davis, J. M.; Ting, J. W.; Gerassimenko, M.

    1979-01-01

    A real-time imaging system for displaying the solar coronal soft X-ray emission, focussed by a grazing incidence telescope, is described. The design parameters of the system, which is to be used primarily as part of a real-time control system for a sounding rocket experiment, are identified. Their achievement with a system consisting of a microchannel plate, for the conversion of X-rays into visible light, and a slow-scan vidicon, for recording and transmission of the integrated images, is described in detail. The system has a quantum efficiency better than 8% above 8 Å, a dynamic range of 1000 coupled with a sensitivity to single photoelectrons, and provides a spatial resolution of 15 arc seconds over a field of view of 40 x 40 arc minutes. The incident radiation is filtered to eliminate wavelengths longer than 100 Å. Each image contains 3.93 x 10^5 bits of information and is transmitted to the ground where it is processed by a mini-computer and displayed in real-time on a standard TV monitor.

  14. Airborne Four-Dimensional Flight Management in a Time-based Air Traffic Control Environment

    NASA Technical Reports Server (NTRS)

    Williams, David H.; Green, Steven M.

    1991-01-01

    Advanced Air Traffic Control (ATC) systems are being developed which contain time-based (4D) trajectory predictions of aircraft. Airborne flight management systems (FMS) exist or are being developed with similar 4D trajectory generation capabilities. Differences between the ATC generated profiles and those generated by the airborne 4D FMS may introduce system problems. A simulation experiment was conducted to explore integration of a 4D equipped aircraft into a 4D ATC system. The NASA Langley Transport Systems Research Vehicle cockpit simulator was linked in real time to the NASA Ames Descent Advisor ATC simulation for this effort. Candidate procedures for handling 4D equipped aircraft were devised and traffic scenarios established which required time delays absorbed through speed control alone or in combination with path stretching. Dissimilarities in 4D speed strategies between airborne and ATC generated trajectories were tested in these scenarios. The 4D procedures and FMS operation were well received by airline pilot test subjects, who achieved an arrival accuracy at the metering fix of 2.9 seconds standard deviation time error. The amount and nature of the information transmitted during a time clearance were found to be somewhat of a problem using the voice radio communication channel. Dissimilarities between airborne and ATC-generated speed strategies were found to be a problem when the traffic remained on established routes. It was more efficient for 4D equipped aircraft to fly trajectories with similar, though less fuel efficient, speeds which conform to the ATC strategy. Heavy traffic conditions, where time delays forced off-route path stretching, were found to produce a potential operational benefit of the airborne 4D FMS.

  15. Application of the airborne ocean color imager for commercial fishing

    NASA Technical Reports Server (NTRS)

    Wrigley, Robert C.

    1993-01-01

    The objective of the investigation was to develop a commercial remote sensing system for providing near-real-time data (within one day) in support of commercial fishing operations. The Airborne Ocean Color Imager (AOCI) had been built for NASA by Daedalus Enterprises, Inc., but it needed certain improvements, data processing software, and a delivery system to make it into a commercial system for fisheries. Two products were developed to support this effort: the AOCI with its associated processing system and an information service for both commercial and recreational fisheries to be created by Spectro Scan, Inc. The investigation achieved all technical objectives: improving the AOCI, creating software for atmospheric correction and bio-optical output products, georeferencing the output products, and creating a delivery system to get those products into the hands of commercial and recreational fishermen in near-real-time. The first set of business objectives involved Daedalus Enterprises and was also achieved: they have an improved AOCI and new data processing software with a set of example data products for fisheries applications to show their customers. Daedalus' marketing activities showed the need for simplification of the product for fisheries, but they successfully marketed the current version to an Italian consortium. The second set of business objectives tasked Spectro Scan with providing an information service; these objectives could not be achieved because Spectro Scan was unable to obtain the venture capital necessary to start up operations.

  16. Development of real-time extensometer based on image processing

    NASA Astrophysics Data System (ADS)

    Adinanta, H.; Puranto, P.; Suryadi

    2017-04-01

    An extensometer system was developed using a high-definition web camera as the main sensor to track object position. The system applied digital image processing techniques to measure changes in object position. The position measurement was performed in real time, so the system could directly show the actual position along both the x- and y-axes. In this research, the relation between pixel displacement and object position change was characterized. The system was tested by moving the target over a range of 20 cm in intervals of 1 mm. To verify long-term performance, i.e., the stability and linearity of continuous measurements on both axes, the measurement was conducted for 83 hours. The results show that this image-processing-based extensometer has both good stability and linearity.
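
    A minimal sketch of the measurement principle reads as follows: locate a high-contrast target in each frame by normalized cross-correlation with a template and convert the pixel shift to millimetres with a previously characterized scale factor. The camera interface is omitted and the calibration constant is invented.

    ```python
    import numpy as np

    MM_PER_PIXEL = 0.05   # hypothetical calibration factor from the characterization step

    def locate_target(frame, template):
        """Return (row, col) of the best match of `template` in `frame`
        using normalized cross-correlation evaluated at every offset."""
        fh, fw = frame.shape
        th, tw = template.shape
        tz = (template - template.mean()) / (template.std() + 1e-12)
        best, best_pos = -np.inf, (0, 0)
        for r in range(fh - th + 1):
            for c in range(fw - tw + 1):
                win = frame[r:r + th, c:c + tw]
                wz = (win - win.mean()) / (win.std() + 1e-12)
                score = float((wz * tz).mean())
                if score > best:
                    best, best_pos = score, (r, c)
        return best_pos

    # Synthetic example: a bright square target that moves by 7 pixels in x
    target = np.zeros((10, 10)); target[2:8, 2:8] = 1.0
    frame0 = np.zeros((60, 80)); frame0[20:30, 30:40] = target
    frame1 = np.zeros((60, 80)); frame1[20:30, 37:47] = target

    (y0, x0), (y1, x1) = locate_target(frame0, target), locate_target(frame1, target)
    print("displacement: x = %.2f mm, y = %.2f mm"
          % ((x1 - x0) * MM_PER_PIXEL, (y1 - y0) * MM_PER_PIXEL))
    ```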

  17. Hard real-time beam scheduler enables adaptive images in multi-probe systems

    NASA Astrophysics Data System (ADS)

    Tobias, Richard J.

    2014-03-01

    Real-time embedded-system concepts were adapted to allow an imaging system to responsively control the firing of multiple probes. Large-volume, operator-independent (LVOI) imaging would increase the diagnostic utility of ultrasound. An obstacle to this innovation is the inability of current systems to drive multiple transducers dynamically. Commercial systems schedule scanning with static lists of beams to be fired and processed; here we allow an imager to adapt to changing beam schedule demands, as an intelligent response to incoming image data. An example of scheduling changes is demonstrated with a flexible duplex mode two-transducer application mimicking LVOI imaging. Embedded-system concepts allow an imager to responsively control the firing of multiple probes. Operating systems use powerful dynamic scheduling algorithms, such as fixed priority preemptive scheduling. Even real-time operating systems cannot meet the timing constraints required for ultrasound. Particularly for Doppler modes, events must be scheduled with sub-nanosecond precision, and acquired data are useless if this requirement is not met. A successful scheduler needs unique characteristics. To get close to what would be needed in LVOI imaging, we show two transducers scanning different parts of a subject's leg. When one transducer notices flow in a region where the two scans overlap, the system reschedules the other transducer to start flow mode and alter its beams to get a view of the observed vessel and produce a flow measurement. The second transducer does this in a focused region only. This demonstrates key attributes of a successful LVOI system, such as robustness against obstructions and adaptive self-correction.
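
    A toy sketch of the scheduling idea follows: beam-firing events from both probes sit in one priority queue ordered by due time and priority, and a detection on one probe injects re-planned, higher-priority Doppler events for the other. The class names, timings, and rescheduling rule are invented; the sub-nanosecond timing the paper requires ultimately has to be enforced in hardware, not in software like this.

    ```python
    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class BeamEvent:
        fire_time_us: float                 # when the beam must fire (microseconds)
        priority: int                       # lower value = more urgent
        probe: str = field(compare=False)
        mode: str = field(compare=False)    # "bmode" or "doppler"

    def schedule_bmode(queue, probe, start_us, n_beams, pri_us=100.0):
        for i in range(n_beams):
            heapq.heappush(queue, BeamEvent(start_us + i * pri_us, 1, probe, "bmode"))

    def reschedule_for_flow(queue, probe, start_us, n_beams, prf_us=50.0):
        """Inject higher-priority Doppler beams for `probe` after a flow detection."""
        for i in range(n_beams):
            heapq.heappush(queue, BeamEvent(start_us + i * prf_us, 0, probe, "doppler"))

    queue = []
    schedule_bmode(queue, "probe_A", 0.0, 5)
    schedule_bmode(queue, "probe_B", 10.0, 5)

    fired = 0
    while queue:
        ev = heapq.heappop(queue)
        fired += 1
        # Pretend probe_A sees flow on its 3rd beam: re-plan probe_B on the fly.
        if ev.probe == "probe_A" and ev.mode == "bmode" and fired == 5:
            reschedule_for_flow(queue, "probe_B", ev.fire_time_us + 25.0, 4)
        print(f"t={ev.fire_time_us:7.1f} us  {ev.probe:8s} {ev.mode}")
    ```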

  18. Characterizing Articulation in Apraxic Speech Using Real-Time Magnetic Resonance Imaging

    ERIC Educational Resources Information Center

    Hagedorn, Christina; Proctor, Michael; Goldstein, Louis; Wilson, Stephen M.; Miller, Bruce; Gorno-Tempini, Maria Luisa; Narayanan, Shrikanth S.

    2017-01-01

    Purpose: Real-time magnetic resonance imaging (MRI) and accompanying analytical methods are shown to capture and quantify salient aspects of apraxic speech, substantiating and expanding upon evidence provided by clinical observation and acoustic and kinematic data. Analysis of apraxic speech errors within a dynamic systems framework is provided…

  19. Students' Reading Images in Kinematics: The Case of Real-Time Graphs.

    ERIC Educational Resources Information Center

    Testa, Italo; Monroy, Gabriella; Sassi, Elena

    2002-01-01

    Describes a study in which secondary school students were called upon to read and interpret documents containing images of real-time kinematics graphs specially designed to address common learning problems and minimize iconic difficulties. Makes suggestions regarding the acquisition of some specific capabilities that are needed to avoid…

  20. The first clinical treatment with kilovoltage intrafraction monitoring (KIM): A real-time image guidance method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keall, Paul J., E-mail: paul.keall@sydney.edu.au; O’Brien, Ricky; Huang, Chen-Yu

    Purpose: Kilovoltage intrafraction monitoring (KIM) is a real-time image guidance method that uses widely available radiotherapy technology, i.e., a gantry-mounted x-ray imager. The authors report on the geometric and dosimetric results of the first patient treatment using KIM which occurred on September 16, 2014. Methods: KIM uses current and prior 2D x-ray images to estimate the 3D target position during cancer radiotherapy treatment delivery. KIM software was written to process kilovoltage (kV) images streamed from a standard C-arm linear accelerator with a gantry-mounted kV x-ray imaging system. A 120° pretreatment kV imaging arc was acquired to build the patient-specific 2D to 3D motion correlation. The kV imager was activated during the megavoltage (MV) treatment, a dual arc VMAT prostate treatment, to estimate the 3D prostate position in real-time. All necessary ethics, legal, and regulatory requirements were met for this clinical study. The quality assurance processes were completed and peer reviewed. Results: During treatment, a prostate position offset of nearly 3 mm in the posterior direction was observed with KIM. This position offset did not trigger a gating event. After the treatment, the prostate motion was independently measured using kV/MV triangulation, resulting in a mean difference of less than 0.6 mm and standard deviation of less than 0.6 mm in each direction. The accuracy of the marker segmentation was visually assessed during and after treatment and found to be performing well. During treatment, there were no interruptions due to performance of the KIM software. Conclusions: For the first time, KIM has been used for real-time image guidance during cancer radiotherapy. The measured accuracy and precision were both submillimeter for the first treatment fraction. This clinical translational research milestone paves the way for the broad implementation of real-time image guidance to facilitate the detection and correction of geometric

  1. qF-SSOP: real-time optical property corrected fluorescence imaging

    PubMed Central

    Valdes, Pablo A.; Angelo, Joseph P.; Choi, Hak Soo; Gioux, Sylvain

    2017-01-01

    Fluorescence imaging is well suited to provide image guidance during resections in oncologic and vascular surgery. However, the distorting effects of tissue optical properties on the emitted fluorescence are poorly compensated for on even the most advanced fluorescence image guidance systems, leading to subjective and inaccurate estimates of tissue fluorophore concentrations. Here we present a novel fluorescence imaging technique that performs real-time (i.e., video rate) optical property corrected fluorescence imaging. We perform full field of view simultaneous imaging of tissue optical properties using Single Snapshot of Optical Properties (SSOP) and fluorescence detection. The estimated optical properties are used to correct the emitted fluorescence with a quantitative fluorescence model to provide quantitative fluorescence-Single Snapshot of Optical Properties (qF-SSOP) images with less than 5% error. The technique is rigorous, fast, and quantitative, enabling ease of integration into the surgical workflow with the potential to improve molecular guidance intraoperatively. PMID:28856038

  2. Real-time operation without a real-time operating system for instrument control and data acquisition

    NASA Astrophysics Data System (ADS)

    Klein, Randolf; Poglitsch, Albrecht; Fumi, Fabio; Geis, Norbert; Hamidouche, Murad; Hoenle, Rainer; Looney, Leslie; Raab, Walfried; Viehhauser, Werner

    2004-09-01

    We are building the Field-Imaging Far-Infrared Line Spectrometer (FIFI LS) for the US-German airborne observatory SOFIA. The detector read-out system is driven by a clock signal at a certain frequency. This signal has to be provided, and all other sub-systems have to work synchronously to this clock. The data generated by the instrument have to be received by a computer in a timely manner. Usually these requirements are met with a real-time operating system (RTOS). In this presentation we show how we meet these demands differently, avoiding the rigidity of an RTOS. Digital I/O cards with large buffers separate the asynchronously working computers from the synchronously working instrument. The advantage is that the data processing computers do not need to process the data in real-time. It is sufficient that the computer can process the incoming data stream on average. But since the data are read in synchronously, the problem of relating commands to responses (data) has to be solved: the data arrive at a fixed rate. The receiving I/O card buffers the data until the computer can access it. To relate the data to commands sent previously, the data are tagged by counters in the read-out electronics. These counters count the system's heartbeat and signals derived from it. The heartbeat and the control signals synchronous with the heartbeat are sent by an I/O card working as a pattern generator. Its buffer is continuously programmed with a pattern which is clocked out on the control lines. A counter in the I/O card keeps track of the number of pattern words clocked out. By reading this counter, the computer knows the state of the instrument and knows the meaning of the data that will arrive with a certain time-tag.
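
    A software analogue of this buffering-and-tagging scheme is sketched below: a 'hardware' thread emits one tagged data word per heartbeat tick into a large FIFO, while a non-real-time consumer drains the FIFO in bursts and uses the tag to look up which previously issued command the data belong to. The threading structure, timings, and names are purely illustrative, not the FIFI LS software.

    ```python
    import queue
    import threading
    import time

    fifo = queue.Queue(maxsize=10000)     # stands in for the buffered digital I/O card
    heartbeat = 0
    commands = {}                         # heartbeat count -> command active from that tick

    def synchronous_readout(n_words, period_s=0.001):
        """'Instrument' side: emits one tagged data word per heartbeat tick."""
        global heartbeat
        for _ in range(n_words):
            heartbeat += 1
            fifo.put((heartbeat, f"sample@{heartbeat}"))   # (tag, data word)
            time.sleep(period_s)                           # fixed, clock-like cadence

    def asynchronous_consumer(n_words):
        """Computer side: drains the FIFO whenever it gets around to it and
        uses the tag to look up which command the data belongs to."""
        received = 0
        while received < n_words:
            time.sleep(0.01)                               # deliberately bursty
            while not fifo.empty() and received < n_words:
                tag, word = fifo.get()
                cmd_tick = max((t for t in commands if t <= tag), default=None)
                cmd = commands.get(cmd_tick, "none")
                print(f"tag {tag:3d}: {word}  (under command '{cmd}')")
                received += 1

    commands[1] = "configure_grating"
    commands[25] = "start_scan"                            # command issued mid-stream

    producer = threading.Thread(target=synchronous_readout, args=(50,))
    consumer = threading.Thread(target=asynchronous_consumer, args=(50,))
    producer.start(); consumer.start()
    producer.join(); consumer.join()
    ```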

  3. Real-time visual communication to aid disaster recovery in a multi-segment hybrid wireless networking system

    NASA Astrophysics Data System (ADS)

    Al Hadhrami, Tawfik; Wang, Qi; Grecos, Christos

    2012-06-01

    When natural disasters or other large-scale incidents occur, obtaining accurate and timely information on the developing situation is vital to effective disaster recovery operations. High-quality video streams and high-resolution images, if available in real time, would provide an invaluable source of current situation reports to the incident management team. Meanwhile, a disaster often causes significant damage to the communications infrastructure. Therefore, another essential requirement for disaster management is the ability to rapidly deploy a flexible incident area communication network. Such a network would facilitate the transmission of real-time video streams and still images from the disrupted area to remote command and control locations. In this paper, a comprehensive end-to-end video/image transmission system between an incident area and a remote control centre is proposed and implemented, and its performance is experimentally investigated. In this study a hybrid multi-segment communication network is designed that seamlessly integrates terrestrial wireless mesh networks (WMNs), distributed wireless visual sensor networks, an airborne platform with video camera balloons, and a Digital Video Broadcasting- Satellite (DVB-S) system. By carefully integrating all of these rapidly deployable, interworking and collaborative networking technologies, we can fully exploit the joint benefits provided by WMNs, WSNs, balloon camera networks and DVB-S for real-time video streaming and image delivery in emergency situations among the disaster hit area, the remote control centre and the rescue teams in the field. The whole proposed system is implemented in a proven simulator. Through extensive simulations, the real-time visual communication performance of this integrated system has been numerically evaluated, towards a more in-depth understanding in supporting high-quality visual communications in such a demanding context.

  4. Geodetic imaging with airborne LiDAR: the Earth's surface revealed.

    PubMed

    Glennie, C L; Carter, W E; Shrestha, R L; Dietrich, W E

    2013-08-01

    The past decade has seen an explosive increase in the number of peer reviewed papers reporting new scientific findings in geomorphology (including fans, channels, floodplains and landscape evolution), geologic mapping, tectonics and faulting, coastal processes, lava flows, hydrology (especially snow and runoff routing), glaciers and geo-archaeology. A common genesis of such findings is often newly available decimeter resolution 'bare Earth' geodetic images, derived from airborne laser swath mapping, a.k.a. airborne LiDAR, observations. In this paper we trace nearly a half century of advances in geodetic science made possible by space age technology, such as the invention of short-pulse-length high-pulse-rate lasers, solid state inertial measurement units, chip-based high speed electronics and the GPS satellite navigation system, that today make it possible to map hundreds of square kilometers of terrain in hours, even in areas covered with dense vegetation or shallow water. To illustrate the impact of the LiDAR observations we present examples of geodetic images that are not only stunning to the eye, but help researchers to develop quantitative models explaining how terrain evolved to its present form, and how it will likely change with time. Airborne LiDAR technology continues to develop quickly, promising ever more scientific discoveries in the years ahead.

  5. Applications of Near Real-Time Image and Fire Products from MODIS

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Ilavajhala, S.; Teague, M.; Ye, G.; Masuoka, E.; Davies, D.; Murphy, K. J.; Michael, K.

    2010-12-01

    NASA’s MODIS Rapid Response Project (http://rapidfire.sci.gsfc.nasa.gov/) has been providing MODIS fire detections and imagery in near real-time since 2001. The Rapid Response system is part of the Land and Atmospheres Near-real time Capability for EOS (LANCE-MODIS) system. Current capabilities include providing MODIS imagery in true color and false color band combinations, a vegetation index, and temperature - in both uncorrected swath format and geographically corrected subset regions. The geographically-corrected subsets images cover the world's land areas and adjoining waters, as well as the entire Arctic and Antarctic. These data are available within a few hours of data acquisition. The images are accessed by large number of user communities to obtain a rapid, 250 meter-resolution overview of ground conditions for fire management, crop and famine monitoring and forecasting, disaster response (fires, oil spills, floods, storms), dust and aerosol monitoring, aviation (tracking volcanic ash), monitoring sea ice conditions, environmental monitoring, and more. In addition, the scientific community uses imagery to locate phenomena of interest prior to ordering and processing data and to support the day-to-day planning of field campaigns. The MODIS Rapid Response project has also been providing a near real-time data feed on fire locations and MODIS imagery subsets to the Fire Information for Resource Management System (FIRMS) project (http://maps.geog.umd.edu/firms). FIRMS provides timely availability of fire location information, which is essential in preventing and fighting large forest/wild fires. Products are available through a WebGIS for visualizing MODIS hotspots and MCD45 Burned Area images, an email alerting tool to deliver fire data on daily/weekly/near real-time basis, active data downloads in formats such as shape, KML, CSV, WMS, etc., along with MODIS imagery subsets. FIRMS’ user base covers more than 100 countries and territories. A recent user

  6. Interlaced photoacoustic and ultrasound imaging system with real-time coregistration for ovarian tissue characterization

    NASA Astrophysics Data System (ADS)

    Alqasemi, Umar; Li, Hai; Yuan, Guangqian; Kumavor, Patrick; Zanganeh, Saeid; Zhu, Quing

    2014-07-01

    Coregistered ultrasound (US) and photoacoustic imaging are emerging techniques for mapping the echogenic anatomical structure of tissue and its corresponding optical absorption. We report a 128-channel imaging system with real-time coregistration of the two modalities, which provides up to 15 coregistered frames per second limited by the laser pulse repetition rate. In addition, the system integrates a compact transvaginal imaging probe with a custom-designed fiber optic assembly for in vivo detection and characterization of human ovarian tissue. We present the coregistered US and photoacoustic imaging system structure, the optimal design of the PC interfacing software, and the reconfigurable field programmable gate array operation and optimization. Phantom experiments of system lateral resolution and axial sensitivity evaluation, examples of the real-time scanning of a tumor-bearing mouse, and ex vivo human ovaries studies are demonstrated.

  7. Telerobotic system concept for real-time soft-tissue imaging during radiotherapy beam delivery.

    PubMed

    Schlosser, Jeffrey; Salisbury, Kenneth; Hristov, Dimitre

    2010-12-01

    The curative potential of external beam radiation therapy is critically dependent on having the ability to accurately aim radiation beams at intended targets while avoiding surrounding healthy tissues. However, existing technologies are incapable of real-time, volumetric, soft-tissue imaging during radiation beam delivery, when accurate target tracking is most critical. The authors address this challenge in the development and evaluation of a novel, minimally interfering, telerobotic ultrasound (U.S.) imaging system that can be integrated with existing medical linear accelerators (LINACs) for therapy guidance. A customized human-safe robotic manipulator was designed and built to control the pressure and pitch of an abdominal U.S. transducer while avoiding LINAC gantry collisions. A haptic device was integrated to remotely control the robotic manipulator motion and U.S. image acquisition outside the LINAC room. The ability of the system to continuously maintain high quality prostate images was evaluated in volunteers over extended time periods. Treatment feasibility was assessed by comparing a clinically deployed prostate treatment plan to an alternative plan in which beam directions were restricted to sectors that did not interfere with the transabdominal U.S. transducer. To demonstrate imaging capability concurrent with delivery, robot performance and U.S. target tracking in a phantom were tested with a 15 MV radiation beam active. Remote image acquisition and maintenance of image quality with the haptic interface was successfully demonstrated over 10 min periods in representative treatment setups of volunteers. Furthermore, the robot's ability to maintain a constant probe force and desired pitch angle was unaffected by the LINAC beam. For a representative prostate patient, the dose-volume histogram (DVH) for a plan with restricted sectors remained virtually identical to the DVH of a clinically deployed plan. With reduced margins, as would be enabled by real-time

  8. Quantitative real-time imaging of glutathione

    USDA-ARS?s Scientific Manuscript database

    Glutathione plays many important roles in biological processes; however, the dynamic changes of glutathione concentrations in living cells remain largely unknown. Here, we report a reversible reaction-based fluorescent probe—designated as RealThiol (RT)—that can quantitatively monitor the real-time ...

  9. Real-time reconstruction of three-dimensional brain surface MR image using new volume-surface rendering technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watanabe, T.; Momose, T.; Oku, S.

    It is essential to obtain realistic brain surface images, in which sulci and gyri are easily recognized, when examining the correlation between functional (PET or SPECT) and anatomical (MRI) brain studies. The volume rendering technique (VRT) is commonly employed to make three-dimensional (3D) brain surface images. This technique, however, takes considerable time to make only one 3D image. Therefore it has not been practical to make the brain surface images in arbitrary directions on a real-time basis using ordinary work stations or personal computers. The surface rendering technique (SRT), on the other hand, is much less computationally demanding, but the quality of resulting images is not satisfactory for our purpose. A new computer algorithm has been developed to make 3D brain surface MR images very quickly using a volume-surface rendering technique (VSRT), in which the quality of resulting images is comparable to that of VRT and computation time to SRT. In VSRT the process of volume rendering is done only once to the direction of the normal vector of each surface point, rather than each time a new view point is determined as in VRT. Subsequent reconstruction of the 3D image uses a similar algorithm to that of SRT. Thus we can obtain brain surface MR images of sufficient quality viewed from any direction on a real-time basis using an easily available personal computer (Macintosh Quadra 800). The calculation time to make a 3D image is less than 1 sec. in VSRT, while that is more than 15 sec. in the conventional VRT. The difference of resulting image quality between VSRT and VRT is almost imperceptible. In conclusion, our new technique for real-time reconstruction of 3D brain surface MR image is very useful and practical in the functional and anatomical correlation study.

  10. Real-time volume rendering of 4D image using 3D texture mapping

    NASA Astrophysics Data System (ADS)

    Hwang, Jinwoo; Kim, June-Sic; Kim, Jae Seok; Kim, In Young; Kim, Sun Il

    2001-05-01

    A four-dimensional image is 3D volume data that varies with time. It is used to represent deforming or moving objects in applications such as virtual surgery or 4D ultrasound. It is difficult to render a 4D image with conventional ray-casting or shear-warp factorization methods because of their long rendering times or the pre-processing stage required whenever the volume data change. Even when 3D texture mapping is used, repeated volume loading is time-consuming in 4D image rendering. In this study, we propose a method to reduce data loading time by using the coherence between the currently loaded volume and the previously loaded volume in order to achieve real-time rendering based on 3D texture mapping. The volume data are divided into small bricks, and each brick to be loaded is tested for similarity to the one already loaded in memory. If the brick passes the test, it is defined as a 3D texture by OpenGL functions. Later, the texture slices of the brick are mapped onto polygons and blended by OpenGL blending functions. All bricks undergo this test. Fifty continuously deforming volumes are rendered at interactive rates on an SGI ONYX. Real-time volume rendering based on 3D texture mapping is currently available on PCs.
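
    A schematic sketch of the brick-coherence test follows: the new volume is split into bricks, and a brick is (re)uploaded, standing in for redefining it as an OpenGL 3D texture, only if it differs from the cached copy by more than a tolerance. The brick size, difference metric, and tolerance are arbitrary choices, not those of the paper.

    ```python
    import numpy as np

    BRICK = 16          # brick edge length in voxels
    TOL = 0.01          # mean-absolute-difference tolerance for "unchanged" bricks

    def iter_bricks(shape, size=BRICK):
        for z in range(0, shape[0], size):
            for y in range(0, shape[1], size):
                for x in range(0, shape[2], size):
                    yield (slice(z, z + size), slice(y, y + size), slice(x, x + size))

    def upload_changed_bricks(new_volume, cached_volume):
        """Return how many bricks would need to be re-uploaded as 3D textures."""
        uploads = 0
        for sl in iter_bricks(new_volume.shape):
            if np.mean(np.abs(new_volume[sl] - cached_volume[sl])) > TOL:
                cached_volume[sl] = new_volume[sl]   # stand-in for a glTexImage3D upload
                uploads += 1
        return uploads

    rng = np.random.default_rng(0)
    frame_t0 = rng.random((64, 64, 64)).astype(np.float32)
    frame_t1 = frame_t0.copy()
    frame_t1[:16, :16, :16] += 0.2               # only one corner of the volume deforms

    cache = frame_t0.copy()
    total = (64 // BRICK) ** 3
    print(f"re-uploaded {upload_changed_bricks(frame_t1, cache)} of {total} bricks")
    ```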

  11. A design of real time image capturing and processing system using Texas Instrument's processor

    NASA Astrophysics Data System (ADS)

    Wee, Toon-Joo; Chaisorn, Lekha; Rahardja, Susanto; Gan, Woon-Seng

    2007-09-01

    In this work, we developed and implemented an image capturing and processing system equipped with the capability of capturing images from an input video in real time. The input video can come from a PC, a video camcorder or a DVD player. We developed two modes of operation. In the first mode, an input image from the PC is processed on the processing board (a development platform with a digital signal processor) and displayed on the PC. In the second mode, the currently captured image from the video camcorder (or DVD player) is processed on the board but displayed on an LCD monitor. The major difference between our system and other existing systems is that the image-processing functions are performed on the board instead of the PC (so that the functions can be used for further development on the board). The user controls the operation of the board through a Graphical User Interface (GUI) on the PC. To enable smooth image data transfer between the PC and the board, we employed Real Time Data Transfer (RTDX) technology to create a link between them. For image processing, we developed three main groups of functions: (1) point processing; (2) filtering; and (3) 'others'. Point processing includes rotation, negation and mirroring. The filtering category provides median, adaptive, smoothing and sharpening filters in the time domain. The 'others' category provides auto-contrast adjustment, edge detection, segmentation and sepia color; these functions either add an effect to the image or enhance it. We developed and implemented our system in C/C# on the TMS320DM642 (DM642) board from Texas Instruments (TI). The system was showcased at the College of Engineering (CoE) exhibition 2006 at Nanyang Technological University (NTU), where more than 40 users tried it. The results demonstrate that our system is adequate for real-time image capturing. Our system can be used or applied for

  12. Augmented reality based real-time subcutaneous vein imaging system

    PubMed Central

    Ai, Danni; Yang, Jian; Fan, Jingfan; Zhao, Yitian; Song, Xianzheng; Shen, Jianbing; Shao, Ling; Wang, Yongtian

    2016-01-01

    A novel 3D reconstruction and fast imaging system for subcutaneous veins by augmented reality is presented. The study was performed to reduce the failure rate and time required in intravenous injection by providing augmented vein structures that back-project superimposed veins on the skin surface of the hand. Images of the subcutaneous vein are captured by two industrial cameras with extra reflective near-infrared lights. The veins are then segmented by a multiple-feature clustering method. Vein structures captured by the two cameras are matched and reconstructed based on the epipolar constraint and homographic property. The skin surface is reconstructed by active structured light with spatial encoding values and fusion displayed with the reconstructed vein. The vein and skin surface are both reconstructed in the 3D space. Results show that the structures can be precisely back-projected to the back of the hand for further augmented display and visualization. The overall system performance is evaluated in terms of vein segmentation, accuracy of vein matching, feature points distance error, duration times, accuracy of skin reconstruction, and augmented display. All experiments are validated with sets of real vein data. The imaging and augmented system produces good imaging and augmented reality results with high speed. PMID:27446690

  13. Augmented reality based real-time subcutaneous vein imaging system.

    PubMed

    Ai, Danni; Yang, Jian; Fan, Jingfan; Zhao, Yitian; Song, Xianzheng; Shen, Jianbing; Shao, Ling; Wang, Yongtian

    2016-07-01

    A novel 3D reconstruction and fast imaging system for subcutaneous veins by augmented reality is presented. The study was performed to reduce the failure rate and time required in intravenous injection by providing augmented vein structures that back-project superimposed veins on the skin surface of the hand. Images of the subcutaneous vein are captured by two industrial cameras with extra reflective near-infrared lights. The veins are then segmented by a multiple-feature clustering method. Vein structures captured by the two cameras are matched and reconstructed based on the epipolar constraint and homographic property. The skin surface is reconstructed by active structured light with spatial encoding values and fusion displayed with the reconstructed vein. The vein and skin surface are both reconstructed in the 3D space. Results show that the structures can be precisely back-projected to the back of the hand for further augmented display and visualization. The overall system performance is evaluated in terms of vein segmentation, accuracy of vein matching, feature points distance error, duration times, accuracy of skin reconstruction, and augmented display. All experiments are validated with sets of real vein data. The imaging and augmented system produces good imaging and augmented reality results with high speed.

  14. Real-time terahertz imaging through self-mixing in a quantum-cascade laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wienold, M., E-mail: martin.wienold@dlr.de; Rothbart, N.; Hübers, H.-W.

    2016-07-04

    We report on a fast self-mixing approach for real-time, coherent terahertz imaging based on a quantum-cascade laser and a scanning mirror. Due to a fast deflection of the terahertz beam, images with frame rates up to several Hz are obtained, eventually limited by the mechanical inertia of the employed scanning mirror. A phase modulation technique allows for the separation of the amplitude and phase information without the necessity of parameter fitting routines. We further demonstrate the potential for transmission imaging.

  15. MO-FG-BRD-01: Real-Time Imaging and Tracking Techniques for Intrafractional Motion Management: Introduction and KV Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fahimian, B.

    2015-06-15

    Intrafraction target motion is a prominent complicating factor in the accurate targeting of radiation within the body. Methods compensating for target motion during treatment, such as gating and dynamic tumor tracking, depend on the delineation of target location as a function of time during delivery. A variety of techniques for target localization have been explored and are under active development; these include beam-level imaging of radio-opaque fiducials, fiducial-less tracking of anatomical landmarks, tracking of electromagnetic transponders, optical imaging of correlated surrogates, and volumetric imaging within treatment delivery. The Joint Imaging and Therapy Symposium will provide an overview of the techniques for real-time imaging and tracking, with special focus on emerging modes of implementation across different modalities. In particular, the symposium will explore developments in 1) beam-level kilovoltage X-ray imaging techniques, 2) EPID-based megavoltage X-ray tracking, 3) dynamic tracking using electromagnetic transponders, and 4) MRI-based soft-tissue tracking during radiation delivery. Learning Objectives: (1) understand the fundamentals of real-time imaging and tracking techniques; (2) learn about emerging techniques in the field of real-time tracking; (3) distinguish between the advantages and disadvantages of different tracking modalities; (4) understand the role of real-time tracking techniques within the clinical delivery workflow.

  16. Two-dimensional airborne ultrasound real-time linear array scanner--applied to screening for scoliosis.

    PubMed

    Mauritzson, L; Ilver, J; Benoni, G; Lindström, K; Willner, S

    1991-01-01

    Diagnostic ultrasound is an established, noninvasive and harmless method for imaging the shape and appearance of organs and other tissues inside the body, and it has been used in many clinical applications for more than three decades. We have now applied some of this well-known technique, together with airborne ultrasound, in medical applications to build equipment for anthropometric investigation outside the body, e.g., measuring and registering the shape and form of the human back. This is mostly done for screening young people in an attempt to find patients developing scoliosis, and to circumvent some of the disadvantages of the traditional screening method in this field of medical application.

  17. Real-time image annotation by manifold-based biased Fisher discriminant analysis

    NASA Astrophysics Data System (ADS)

    Ji, Rongrong; Yao, Hongxun; Wang, Jicheng; Sun, Xiaoshuai; Liu, Xianming

    2008-01-01

    Automatic linguistic annotation is a promising solution to bridge the semantic gap in content-based image retrieval. However, two crucial issues are not well addressed in state-of-the-art annotation algorithms: 1. the Small Sample Size (3S) problem in keyword classifier/model learning; 2. most annotation algorithms cannot be extended to real-time online use because of their low computational efficiency. This paper presents a novel Manifold-based Biased Fisher Discriminant Analysis (MBFDA) algorithm to address these two issues by transductive semantic learning and keyword filtering. To address the 3S problem, co-training-based manifold learning is adopted for keyword model construction. To achieve real-time annotation, a Biased Fisher Discriminant Analysis (BFDA) based semantic feature reduction algorithm is presented for keyword confidence discrimination and semantic feature reduction. Different from all existing annotation methods, MBFDA views image annotation from a novel eigen-semantic-feature (corresponding to keywords) selection perspective. As demonstrated in experiments, our manifold-based biased Fisher discriminant analysis annotation algorithm outperforms classical and state-of-the-art annotation methods (1. K-NN expansion; 2. one-to-all SVM; 3. PWC-SVM) in both computation time and annotation accuracy by a large margin.

  18. Real-time intravital imaging of pH variation associated with osteoclast activity.

    PubMed

    Maeda, Hiroki; Kowada, Toshiyuki; Kikuta, Junichi; Furuya, Masayuki; Shirazaki, Mai; Mizukami, Shin; Ishii, Masaru; Kikuchi, Kazuya

    2016-08-01

    Intravital imaging by two-photon excitation microscopy (TPEM) has been widely used to visualize cell functions. However, small molecular probes (SMPs), commonly used for cell imaging, cannot be simply applied to intravital imaging because of the challenge of delivering them into target tissues, as well as their undesirable physicochemical properties for TPEM imaging. Here, we designed and developed a functional SMP with an active-targeting moiety, higher photostability, and a fluorescence switch and then imaged target cell activity by injecting the SMP into living mice. The combination of the rationally designed SMP with a fluorescent protein as a reporter of cell localization enabled quantitation of osteoclast activity and time-lapse imaging of its in vivo function associated with changes in cell deformation and membrane fluctuations. Real-time imaging revealed heterogenic behaviors of osteoclasts in vivo and provided insights into the mechanism of bone resorption.

  19. In situ real-time imaging of self-sorted supramolecular nanofibres

    NASA Astrophysics Data System (ADS)

    Onogi, Shoji; Shigemitsu, Hajime; Yoshii, Tatsuyuki; Tanida, Tatsuya; Ikeda, Masato; Kubota, Ryou; Hamachi, Itaru

    2016-08-01

    Self-sorted supramolecular nanofibres—a multicomponent system that consists of several types of fibre, each composed of distinct building units—play a crucial role in complex, well-organized systems with sophisticated functions, such as living cells. Designing and controlling self-sorting events in synthetic materials and understanding their structures and dynamics in detail are important elements in developing functional artificial systems. Here, we describe the in situ real-time imaging of self-sorted supramolecular nanofibre hydrogels consisting of a peptide gelator and an amphiphilic phosphate. The use of appropriate fluorescent probes enabled the visualization of self-sorted fibres entangled in two and three dimensions through confocal laser scanning microscopy and super-resolution imaging, with 80 nm resolution. In situ time-lapse imaging showed that the two types of fibre have different formation rates and that their respective physicochemical properties remain intact in the gel. Moreover, we directly visualized stochastic non-synchronous fibre formation and observed a cooperative mechanism.

  20. TH-CD-207A-08: Simulated Real-Time Image Guidance for Lung SBRT Patients Using Scatter Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redler, G; Cifter, G; Templeton, A

    2016-06-15

    Purpose: To develop a comprehensive Monte Carlo-based model for the acquisition of scatter images of patient anatomy in real time during lung SBRT treatment. Methods: During SBRT treatment, images of patient anatomy can be acquired from scattered radiation. To rigorously examine the utility of scatter images for image guidance, a model is developed using MCNP code to simulate scatter images of phantoms and lung cancer patients. The model is validated by comparing experimental and simulated images of phantoms of different complexity. The differentiation between tissue types is investigated by imaging objects of known composition (water, lung, and bone equivalent). A lung tumor phantom, simulating the materials and geometry encountered during lung SBRT treatments, is used to investigate image noise properties for various quantities of delivered radiation (monitor units, MU). Patient scatter images are simulated using the validated simulation model. 4DCT patient data are converted to an MCNP input geometry accounting for different tissue compositions and densities. Lung tumor phantom images acquired with decreasing imaging time (decreasing MU) are used to model the expected noise amplitude in patient scatter images, producing realistic simulated patient scatter images with varying temporal resolution. Results: Image intensity in simulated and experimental scatter images of tissue-equivalent objects (water, lung, bone) matches within the uncertainty (∼3%). Lung tumor phantom images agree as well. Specifically, tumor-to-lung contrast matches within the uncertainty. The addition of random noise approximating the quantum noise in experimental images to simulated patient images shows that scatter imaging of lung tumors can provide images in as little as 0.5 seconds with CNR ∼2.7. Conclusions: A scatter imaging simulation model is developed and validated using experimental phantom scatter images. Following validation, lung cancer patient scatter images are simulated. These

  1. Real-time feedback control of twin-screw wet granulation based on image analysis.

    PubMed

    Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György

    2018-06-04

    The present paper reports the first dynamic image analysis-based feedback control of a continuous twin-screw wet granulation process. Granulation of a blend of lactose and starch was selected as a model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with image analysis software developed by the authors. Validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by controlling the liquid feeding rate of the pump through a PC, based on the particle size results determined in real time. After the establishment of the feedback control, the system could correct for different real-life disturbances, creating a Process Analytically Controlled Technology (PACT) that guarantees real-time monitoring and control of granule quality. In the event of changes or adverse trends in the particle size, the system can automatically compensate for the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important in the case of continuous pharmaceutical technologies. Copyright © 2018 Elsevier B.V. All rights reserved.
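
    The abstract does not give the control law; a minimal proportional-control sketch of the liquid-feed feedback idea is shown below. The setpoint, gain, pump limits and function name are illustrative assumptions, not the authors' controller.

```python
def update_liquid_feed_rate(current_rate, measured_d50_um, target_d50_um=400.0,
                            gain_ml_per_um=0.002, min_rate=0.0, max_rate=20.0):
    """One step of a simple proportional controller for twin-screw wet granulation.

    If the measured median granule size (d50) is below the target, more liquid is
    added; if it is above, the liquid feed is reduced. All numbers are illustrative.
    """
    error_um = target_d50_um - measured_d50_um        # positive -> granules too small
    new_rate = current_rate + gain_ml_per_um * error_um
    return max(min_rate, min(max_rate, new_rate))     # clamp to pump limits

# Example: granules measured at 350 um against a 400 um target -> feed rate rises slightly.
print(update_liquid_feed_rate(current_rate=5.0, measured_d50_um=350.0))
```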

  2. Real-Time Nonlinear Optical Information Processing.

    DTIC Science & Technology

    1979-06-01

    operations are presented. One approach realizes the halftone method of nonlinear optical processing in real time by replacing the conventional...photographic recording medium with a real-time image transducer. In the second approach halftoning is eliminated and the real-time device is used directly

  3. Real-time image processing of TOF range images using a reconfigurable processor system

    NASA Astrophysics Data System (ADS)

    Hussmann, S.; Knoll, F.; Edeler, T.

    2011-07-01

    During the last years, Time-of-Flight sensors have had a significant impact on research fields in machine vision. In comparison with stereo vision systems and laser range scanners, they combine the advantages of active sensors, providing accurate distance measurements, and camera-based systems, recording a 2D matrix at a high frame rate. Moreover, low-cost 3D imaging has the potential to open a wide field of additional applications and solutions in markets like consumer electronics, multimedia, digital photography, robotics and medical technologies. This paper focuses on the 4-phase-shift algorithm currently implemented in this type of sensor. The most time-critical operation of the phase-shift algorithm is the arctangent function. In this paper a novel hardware implementation of the arctangent function using a reconfigurable processor system is presented and benchmarked against the state-of-the-art CORDIC arctangent algorithm. Experimental results show that the proposed algorithm is well suited for real-time processing of the range images of TOF cameras.
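
    For reference, the textbook 4-phase-shift demodulation that feeds the arctangent stage can be sketched as follows. The modulation frequency and sign convention here are illustrative assumptions; the paper's contribution is the hardware arctangent, not reproduced here.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_range_from_phases(a0, a1, a2, a3, f_mod=20e6):
    """4-phase-shift demodulation for a ToF pixel array.

    a0..a3 : arrays of correlation samples at 0, 90, 180 and 270 degrees.
    f_mod  : modulation frequency in Hz (20 MHz chosen only for illustration).
    Sign conventions differ between sensors; this follows one common choice.
    """
    i = a0.astype(np.float64) - a2                   # in-phase component
    q = a3.astype(np.float64) - a1                   # quadrature component
    phase = np.mod(np.arctan2(q, i), 2 * np.pi)      # the arctangent is the costly step
    amplitude = 0.5 * np.hypot(i, q)
    distance = C * phase / (4 * np.pi * f_mod)       # unambiguous up to C / (2 * f_mod)
    return distance, amplitude
```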

  4. Enhancements and Evolution of the Real Time Mission Monitor

    NASA Technical Reports Server (NTRS)

    Goodman, Michael; Blakeslee, Richard; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn

    2008-01-01

    The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources, to enable real time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. RTMM has proven extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, mission scientists, instrument scientists and program managers alike appreciate the contributions that RTMM makes to their flight projects. We have received numerous plaudits from a wide variety of scientists who used RTMM during recent field campaigns including the 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), 2008 Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) missions, the 2007-2008 NOAA-NASA Aerosonde Hurricane flights and the 2008 Soil Moisture Active-Passive Validation Experiment (SMAP-VEX). Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface. Individual flight legs are automatically calculated for altitude, latitude, longitude, flight leg distance, cumulative distance, flight leg time, cumulative time, and

  5. Enhancements and Evolution of the Real Time Mission Monitor

    NASA Astrophysics Data System (ADS)

    Goodman, M.; Blakeslee, R.; Hardin, D.; Hall, J.; He, Y.; Regner, K.

    2008-12-01

    The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources, to enable real time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual earth application. RTMM has proven extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, mission scientists, instrument scientists and program managers alike appreciate the contributions that RTMM makes to their flight projects. RTMM has received numerous plaudits from a wide variety of scientists who used RTMM during recent field campaigns including the 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), 2008 Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) missions, the 2007-2008 NOAA-NASA Aerosonde Hurricane flights and the 2008 Soil Moisture Active-Passive Validation Experiment (SMAP-VEX). Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface. Individual flight legs are automatically calculated for altitude, latitude, longitude, flight leg distance, cumulative distance, flight leg time, cumulative time, and

  6. Real-time photoacoustic imaging of rat deep brain: hemodynamic responses to hypoxia

    NASA Astrophysics Data System (ADS)

    Kawauchi, Satoko; Iwazaki, Hideaki; Ida, Taiichiro; Hosaka, Tomoya; Kawaguchi, Yasushi; Nawashiro, Hiroshi; Sato, Shunichi

    2013-03-01

    Hemodynamic responses of the brain to hypoxia or ischemia are of major interest in neurosurgery and neuroscience. In this study, we performed real-time transcutaneous PA imaging of the rat brain exposed to hypoxic stress and investigated depth-resolved responses of the brain, including the hippocampus. A linear-array 8-channel 10-MHz ultrasonic sensor (measurement length, 10 mm) was placed on the shaved scalp. Nanosecond, 570-nm and 595-nm light pulses were used to excite PA signals indicating cerebral blood volume (CBV) and blood deoxygenation, respectively. Under spontaneous respiration, the inhalation gas was switched from air to nitrogen, and then to oxygen, during which real-time PA imaging was performed continuously. High-contrast PA signals were observed from the depth regions corresponding to the scalp, skull, cortex and hippocampus. After the onset of hypoxia, PA signals at 595 nm increased immediately in both the cortex and hippocampus for about 1.5 min, showing hemoglobin deoxygenation. On the other hand, PA signals at 570 nm coming from these regions did not increase in the early phase but started to increase about 1.5 min after the onset of hypoxia, indicating reactive hyperemia in response to hypoxia. During hypoxia, PA signals coming from the scalp decreased transiently, presumably due to a compensatory response in the peripheral tissue to preserve blood perfusion in the brain. Reoxygenation caused a gradual recovery of these PA signals. These findings demonstrate the usefulness of PA imaging for real-time, depth-resolved observation of cerebral hemodynamics.

  7. Compact camera technologies for real-time false-color imaging in the SWIR band

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Jennings, Todd; Snikkers, Marco

    2013-11-01

    Previously, real-time false-colored multispectral imaging was not available in a true snapshot, single compact imager. Recent technology improvements now allow this technique to be used in practical applications. This paper covers those advancements as well as a case study of its use in UAVs, where the technology is enabling new remote sensing methodologies.

  8. Real-time automatic fiducial marker tracking in low contrast cine-MV images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Wei-Yang; Lin, Shu-Fang; Yang, Sheng-Chang

    2013-01-15

    Purpose: To develop a real-time automatic method for tracking implanted radiographic markers in low-contrast cine-MV patient images used in image-guided radiation therapy (IGRT). Methods: Intrafraction motion tracking using radiotherapy beam-line MV images has gained some attention recently in IGRT because no additional imaging dose is introduced. However, MV images have much lower contrast than kV images; therefore a robust and automatic algorithm for marker detection in MV images is a prerequisite. Previous marker detection methods are all based on template matching or its derivatives. Template matching needs to match object shapes that change significantly with implantation and projection angle. While these methods require a large number of templates to cover various situations, they are often forced to use a smaller number of templates to reduce the computational load because they all require an exhaustive search in the region of interest. The authors solve this problem by the synergetic use of modern but well-tested computer vision and artificial intelligence techniques; specifically, the authors detect implanted markers utilizing discriminant analysis for initialization and use mean-shift feature-space analysis for sequential tracking. This novel approach avoids exhaustive search by exploiting the temporal correlation between consecutive frames and makes it possible to perform more sophisticated detection at the beginning to improve the accuracy, followed by ultrafast sequential tracking after the initialization. The method was evaluated and validated using 1149 cine-MV images from two prostate IGRT patients and compared with manual marker detection results from six researchers. The average of the manual detection results is considered the ground truth for comparisons. Results: The average root-mean-square errors of our real-time automatic tracking method from the ground truth are 1.9 and 2.1 pixels for the two patients (0.26 mm
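
    The sequential-tracking step rests on mean-shift mode seeking, sketched below on a per-pixel likelihood map. The discriminant-analysis initialization is not reproduced; the weight map, window size and function name are assumptions for illustration only.

```python
import numpy as np

def mean_shift_track(weight_map, start_xy, half_win=12, n_iter=20, tol=0.5):
    """Mean-shift mode seeking on a per-pixel weight (likelihood) map.

    weight_map : 2D array where large values indicate marker-like pixels
                 (any likelihood image will do for this sketch).
    start_xy   : (x, y) marker position carried over from the previous frame.
    Returns the refined (x, y) position in the current frame.
    """
    x, y = float(start_xy[0]), float(start_xy[1])
    h, w = weight_map.shape
    for _ in range(n_iter):
        x0, x1 = int(max(0, x - half_win)), int(min(w, x + half_win + 1))
        y0, y1 = int(max(0, y - half_win)), int(min(h, y + half_win + 1))
        window = weight_map[y0:y1, x0:x1]
        total = window.sum()
        if total <= 0:
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        new_x = (xs * window).sum() / total      # weighted centroid = one mean-shift step
        new_y = (ys * window).sum() / total
        converged = np.hypot(new_x - x, new_y - y) < tol
        x, y = new_x, new_y
        if converged:
            break
    return x, y
```

    Because the search window only needs to cover the inter-frame motion of the marker, this avoids the exhaustive search that template matching would require over the whole region of interest.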

  9. Real-time co-registered ultrasound and photoacoustic imaging system based on FPGA and DSP architecture

    NASA Astrophysics Data System (ADS)

    Alqasemi, Umar; Li, Hai; Aguirre, Andres; Zhu, Quing

    2011-03-01

    Co-registering ultrasound (US) and photoacoustic (PA) imaging is a logical extension to conventional ultrasound because both modalities provide complementary information on tumor morphology, tumor vasculature and hypoxia for cancer detection and characterization. In addition, both modalities are capable of providing real-time images for clinical applications. In this paper, a Field Programmable Gate Array (FPGA) and Digital Signal Processor (DSP) module-based real-time US/PA imaging system is presented. The system provides real-time US/PA data acquisition and image display at up to 5 fps using the currently implemented DSP board. It can be upgraded to 15 fps, which is the maximum pulse repetition rate of the laser used, by implementing an advanced DSP module. Additionally, the photoacoustic RF data for each frame are saved for further off-line processing. The system front end consists of eight 16-channel modules made of commercial and customized circuits. Each 16-channel module consists of two commercial 8-channel receiving boards and one FPGA board from Analog Devices. Each receiving board contains an IC that combines 8-channel low-noise amplifiers, variable-gain amplifiers, anti-aliasing filters, and ADCs in a single chip with a sampling frequency of 40 MHz. The FPGA board captures the LVDS Double Data Rate (DDR) digital output of the receiving board and performs data conditioning and sub-beamforming. A customized 16-channel transmission circuit is connected to the two receiving boards for US pulse-echo (PE) mode data acquisition. A DSP module uses an External Memory Interface (EMIF) to interface with the eight 16-channel modules through a customized adaptor board. The DSP transfers either sub-beamformed data (US pulse-echo mode or PA imaging mode) or raw data from the FPGA boards to its DDR2 memory through the EMIF link; it then performs additional processing and transfers the data to the PC for further image processing. The PC code
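
    The reconstruction underlying PA mode is delay-and-sum beamforming; a serial software sketch is given below. In the actual system the partial (sub-)beamforming runs on the FPGAs and the DSP completes the sum, whereas here everything is done in one place for clarity; the grid, sampling rate and names are illustrative assumptions.

```python
import numpy as np

def das_photoacoustic(rf, fs, c, element_x, image_x, image_z):
    """Delay-and-sum reconstruction of one photoacoustic frame (software sketch).

    rf        : (n_elements, n_samples) RF data with t = 0 at the laser pulse.
    fs        : sampling rate in Hz; c : speed of sound in m/s.
    element_x : lateral positions of the array elements (m).
    image_x, image_z : 1D grids of pixel positions (m).
    Photoacoustic delays are one-way (source to element), unlike pulse-echo.
    """
    n_elem, n_samp = rf.shape
    image = np.zeros((image_z.size, image_x.size))
    for iz, z in enumerate(image_z):
        for ix, x in enumerate(image_x):
            dist = np.sqrt((element_x - x) ** 2 + z ** 2)     # one-way path length
            idx = np.round(dist / c * fs).astype(int)         # sample index per channel
            valid = idx < n_samp
            image[iz, ix] = rf[np.arange(n_elem)[valid], idx[valid]].sum()
    return image
```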

  10. Real-Time Imaging with Frequency Scanning Array Antenna for Industrial Inspection Applications at W band

    NASA Astrophysics Data System (ADS)

    Larumbe, Belen; Laviada, Jaime; Ibáñez-Loinaz, Asier; Teniente, Jorge

    2018-01-01

    A real-time imaging system based on a frequency scanning antenna for conveyor belt setups is presented in this paper. The frequency scanning antenna together with an inexpensive parabolic reflector operates at the W band enabling the detection of details with dimensions in the order of 2 mm. In addition, a low level of sidelobes is achieved by optimizing unequal dividers to window the power distribution for sidelobe reduction. Furthermore, the quality of the images is enhanced by the radiation pattern properties. The performance of the system is validated by showing simulation as well as experimental results obtained in real time, proving the feasibility of these kinds of frequency scanning antennas for cost-effective imaging applications.

  11. Airborne Remote Sensing of River Flow and Morphology

    NASA Astrophysics Data System (ADS)

    Zuckerman, S.; Anderson, S. P.; McLean, J.; Redford, R.

    2014-12-01

    River morphology, surface slope and flow are some of the fundamental measurements required for surface water monitoring and hydrodynamic research. This paper describes a method of combining bathymetric lidar with space-time processing of mid-wave infrared (MWIR) imagery to simultaneously measure bathymetry, currents and surface slope from an airborne platform. In May 2014, Areté installed a Pushbroom Imaging Lidar for Littoral Surveillance (PILLS) and a FLIR SC8000 MWIR imaging system sampling at 2 Hz in a small twin-engine aircraft. Data were collected over the lower Colorado River between Picacho Park and Parker. PILLS is a compact bathymetric lidar based on streak-tube sensor technology. It provides channel and bank topography and water surface elevation at 1-meter horizontal scales with 25 cm vertical accuracy. Surface currents are derived from the MWIR imagery by tracking surface features using a cross-correlation algorithm. This approach enables the retrieval of currents along extended reaches at the forward speed of the aircraft, with spatial resolutions down to 5 m and accuracy better than 10 cm/s. The fused airborne data capture current and depth variability on scales of meters over tens of kilometers, collected in just a few minutes. The airborne MWIR current retrievals are combined with the bathymetric lidar data to calculate river discharge, which is then compared with real-time streamflow stations. The results highlight the potential for improving our understanding of complex river environments with simultaneous collections from multiple airborne sensors.
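
    The cross-correlation tracking of thermal surface features can be sketched as below; this is a generic FFT cross-correlation patch tracker, not the authors' processing chain, and the sign convention and function names are assumptions for illustration.

```python
import numpy as np

def patch_shift(patch_t0, patch_t1):
    """Pixel shift of features in patch_t1 relative to patch_t0 via FFT cross-correlation."""
    f0 = np.fft.fft2(patch_t0 - patch_t0.mean())
    f1 = np.fft.fft2(patch_t1 - patch_t1.mean())
    corr = np.fft.fftshift(np.fft.ifft2(f1 * np.conj(f0)).real)   # correlation surface
    peak_y, peak_x = np.unravel_index(np.argmax(corr), corr.shape)
    cy, cx = corr.shape[0] // 2, corr.shape[1] // 2
    return peak_x - cx, peak_y - cy                               # integer-pixel shift

def surface_current(patch_t0, patch_t1, pixel_size_m, dt_s):
    """Convert the tracked feature shift between two MWIR frames into a velocity (m/s)."""
    dx, dy = patch_shift(patch_t0, patch_t1)
    return dx * pixel_size_m / dt_s, dy * pixel_size_m / dt_s
```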

  12. Real-time windowing in imaging radar using FPGA technique

    NASA Astrophysics Data System (ADS)

    Ponomaryov, Volodymyr I.; Escamilla-Hernandez, Enrique

    2005-02-01

    Imaging radar uses high-frequency electromagnetic waves reflected from different objects to estimate their parameters. Pulse compression is a standard signal processing technique used to minimize the peak transmission power, maximize the SNR, and obtain better resolution. Usually, pulse compression is achieved using a matched filter. The side-lobe level in imaging radar can be reduced using special weighting functions. There are many well-known weighting functions (Hamming, Hanning, Blackman, Chebyshev, Blackman-Harris, Kaiser-Bessel, etc.) widely used in signal processing applications. Field Programmable Gate Arrays (FPGAs) offer great benefits such as rapid implementation, dynamic reconfiguration, and field programmability. This reconfigurability makes FPGAs a better solution than custom-made integrated circuits. This work aims at demonstrating a reasonably flexible implementation of linear-FM signal generation and pulse compression using Matlab, Simulink, and System Generator. Employing an FPGA and the mentioned software, we propose a pulse compression design on FPGA using classical and novel windowing techniques to reduce the side-lobe level. This increases the ability to detect small or closely spaced targets in imaging radar. The parallelism that the FPGA offers in real-time processing makes it possible to realize the proposed algorithms. The paper also presents experimental results of the proposed windowing procedure in a marine radar with the following parameters: the signal is linear FM (chirp); the frequency deviation ΔF is 9.375 MHz; the pulse width T is 3.2 μs; the matched filter has 800 taps; the sampling frequency is 253.125 MHz. The reduction of side-lobe levels was realized in real time, permitting better resolution of small targets.
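
    A software sketch of windowed pulse compression for a linear-FM pulse follows. The sampling rate and the window choice here are illustrative; the paper's design runs on an FPGA and considers several other windows.

```python
import numpy as np

def compress_chirp(rx, fs=253.125e6, pulse_width=3.2e-6, bandwidth=9.375e6, window="hamming"):
    """Matched-filter pulse compression of a linear-FM (chirp) pulse with windowing.

    rx : received baseband samples (1D complex array).
    The window weights the matched filter to lower the range side-lobes at the
    cost of a slightly wider main lobe.
    """
    n_ref = int(round(pulse_width * fs))
    t = np.arange(n_ref) / fs
    k = bandwidth / pulse_width                        # chirp rate (Hz/s)
    ref = np.exp(1j * np.pi * k * t**2)                # transmitted chirp replica
    w = np.hamming(n_ref) if window == "hamming" else np.ones(n_ref)
    h = np.conj(ref[::-1]) * w                         # windowed matched filter
    n_fft = int(2 ** np.ceil(np.log2(len(rx) + n_ref - 1)))
    return np.fft.ifft(np.fft.fft(rx, n_fft) * np.fft.fft(h, n_fft))[:len(rx)]
```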

  13. Real-time image processing for particle tracking velocimetry

    NASA Astrophysics Data System (ADS)

    Kreizer, Mark; Ratner, David; Liberzon, Alex

    2010-01-01

    We present a novel high-speed particle tracking velocimetry (PTV) experimental system. Its novelty lies in the FPGA-based, real-time image processing "on camera": instead of an image, the camera transfers to the computer, over a network card, only the relevant information about the identified flow tracers. The system is therefore ideal for remote particle tracking in research and industrial applications, since the camera can be controlled and data can be transferred over any high-bandwidth network. We present the hardware and the open-source software aspects of the PTV experiments. The tracking results of the new experimental system have been compared with flow visualization and particle image velocimetry measurements. The canonical flow in the central cross section of a cubic cavity (1:1:1 aspect ratio) in our lid-driven cavity apparatus is used for validation purposes. The downstream secondary eddy (DSE) is the sensitive portion of this flow, and its size was measured with increasing Reynolds number (via increasing belt velocity). The size of the DSE estimated from the flow visualization, PIV and compressed PTV is shown to agree within the experimental uncertainty of the methods applied.
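
    A software analogue of the on-camera reduction (threshold, label, centroid) is sketched below; the threshold, minimum area and function name are illustrative assumptions, and the real system performs this on the camera's FPGA.

```python
import numpy as np
from scipy import ndimage

def detect_tracers(frame, threshold, min_area=3):
    """Detect tracer particles and return their sub-pixel centroids and sizes.

    Only (x, y, area) per particle would leave the camera, not the full image.
    """
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.empty((0, 3))
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = np.where(areas >= min_area)[0] + 1          # label numbers of real tracers
    if keep.size == 0:
        return np.empty((0, 3))
    centroids = ndimage.center_of_mass(frame, labels, index=keep)   # intensity-weighted
    return np.array([(cx, cy, a) for (cy, cx), a in zip(centroids, areas[keep - 1])])
```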

  14. Real-time image dehazing using local adaptive neighborhoods and dark-channel-prior

    NASA Astrophysics Data System (ADS)

    Valderrama, Jesus A.; Díaz-Ramírez, Víctor H.; Kober, Vitaly; Hernandez, Enrique

    2015-09-01

    A real-time algorithm for single-image dehazing is presented. The algorithm is based on the calculation of local neighborhoods of the hazed image inside a moving window. The local neighborhoods are constructed by computing rank-order statistics. Next, the dark-channel-prior approach is applied to the local neighborhoods to estimate the transmission function of the scene. With the suggested approach there is no need to apply a refining algorithm, such as soft matting, to the estimated transmission. To achieve a high processing rate, the proposed algorithm is implemented by exploiting massive parallelism on a graphics processing unit (GPU). Computer simulations are carried out to test the performance of the proposed algorithm in terms of dehazing efficiency and processing speed. These tests are performed using several synthetic and real images. The obtained results are analyzed and compared with those obtained with existing dehazing algorithms.
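
    For orientation, the classic dark-channel-prior transmission estimate and haze-model inversion are sketched below. The paper replaces the refinement step with rank-order local neighborhoods and runs on a GPU; neither is reproduced here, and the patch size and omega are the usual illustrative values.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def estimate_transmission(hazy_rgb, atmosphere, patch=15, omega=0.95):
    """Dark-channel-prior transmission estimate over local neighborhoods.

    hazy_rgb   : float image in [0, 1], shape (H, W, 3).
    atmosphere : estimated airlight per channel, shape (3,).
    """
    normalized = hazy_rgb / np.maximum(atmosphere, 1e-6)
    dark = minimum_filter(normalized.min(axis=2), size=patch)   # local dark channel
    return np.clip(1.0 - omega * dark, 0.1, 1.0)

def dehaze(hazy_rgb, atmosphere, t):
    # invert the haze model I = J*t + A*(1 - t)
    return np.clip((hazy_rgb - atmosphere) / t[..., None] + atmosphere, 0.0, 1.0)
```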

  15. Transvaginal photoacoustic imaging probe and system based on a multiport fiber-optic beamsplitter and a real time imager for ovarian cancer detection

    NASA Astrophysics Data System (ADS)

    Kumavor, Patrick D.; Alqasemi, Umar; Tavakoli, Behnoosh; Li, Hai; Yang, Yi; Zhu, Quing

    2013-03-01

    This paper presents a real-time transvaginal photoacoustic imaging probe for imaging human ovaries in vivo. The probe consists of a high-throughput (up to 80%) 1 x 19 fiber-optic beamsplitter, a commercial array ultrasound transducer, and a protective fiber sheath. The beamsplitter has an input fiber with a 940-micron core diameter and 36 output fibers with 240-micron core diameters. The 36 small-core output fibers surround the ultrasound transducer and deliver light to the tissue during imaging. A protective sheath, modeled on the form of the transducer using a 3D printer, encloses the transducer and the fiber array. A real-time image acquisition system collects and processes the photoacoustic RF signals from the transducer and displays the formed images on a monitor in real time. Additionally, the system is capable of co-registered pulse-echo ultrasound imaging. In this way, we obtain both morphological and functional information from the ovarian tissue. Photoacoustic images of malignant human ovaries taken ex vivo with the probe revealed vascular networks that were distinguishable from those of normal ovaries, making the probe potentially useful for characterizing ovarian tissue.

  16. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000-acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management. Few studies have examined the usefulness of high-resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 × 10⁶ pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, the pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).

  17. Processing, Cataloguing and Distribution of Uas Images in Near Real Time

    NASA Astrophysics Data System (ADS)

    Runkel, I.

    2013-08-01

    Why are UAS generating such hype? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps, such as data processing and data dissemination to the customer, need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution, which is the focus of this presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data-processing computer, or stored on a mobile device and connected to the data-processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata are converted into an ISO-conformant format, and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, that is the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready for use in GIS applications, image processing or direct interpretation via web applications, wherever you want. The whole processing chain is built in a generic manner and can be adapted to a multitude of applications. The UAV images can be processed and catalogued as single ortho images or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (Web Processing Services), image enhancement and image analysis workflows such as change detection layers can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server. The image analyst has no data and no software on his local computer. This workflow has proven to be fast, stable and accurate. It is designed to support time-critical applications for security demands - the images

  18. A Real-Time Imaging System for Stereo Atomic Microscopy at SPring-8's BL25SU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsushita, Tomohiro; Guo, Fang Zhun; Muro, Takayuki

    2007-01-19

    We have developed a real-time photoelectron angular distribution (PEAD) and Auger-electron angular distribution (AEAD) imaging system at SPring-8 BL25SU, Japan. In addition, a real-time imaging system for circular dichroism (CD) studies of PEAD/AEAD has been newly developed. Two PEAD images recorded with left- and right-circularly polarized light can be regarded as a stereo image of the atomic arrangement. A two-dimensional display-type mirror analyzer (DIANA) has been installed at the beamline, making it possible to record PEAD/AEAD patterns with an acceptance angle of ±60 deg. in real time. The twin-helical undulators at BL25SU enable helicity switching of the circularly polarized light at 10 Hz, 1 Hz or 0.1 Hz. In order to realize real-time measurements of the CD of the PEAD/AEAD, the CCD camera must be synchronized to the switching frequency. The VME computer that controls the ID is connected to the measurement computer with two BNC cables, and the helicity information is sent using TTL signals. For maximum flexibility, rather than using a hardware shutter synchronized with the TTL signal, we have developed software to synchronize the CCD shutter with the TTL signal. We have succeeded in synchronizing the CCD camera in both the 1 Hz and 0.1 Hz modes.

  19. A real-time computer for monitoring a rapid-scanning Fourier spectrometer

    NASA Technical Reports Server (NTRS)

    Michel, G.

    1973-01-01

    A real-time Fourier computer has been designed and tested as part of the Lunar and Planetary Laboratory's program of airborne infrared astronomy using Fourier spectroscopy. The value and versatility of this device are demonstrated with specific examples of laboratory and in-flight applications.

  20. A real time quality control application for animal production by image processing.

    PubMed

    Sungur, Cemil; Özkan, Halil

    2015-11-01

    Standards of hygiene and health are of major importance in food production, and quality control has become obligatory in this field. Thanks to rapidly developing technologies, automatic and safe quality control of food production is now possible. For this purpose, image-processing-based quality control systems are being employed in industrial applications to analyze the quality of food products. In this study, quality control of chicken (Gallus domesticus) eggs was achieved using a real-time image-processing technique. In order to execute the quality control processes, a conveying mechanism was used. Eggs passing on a conveyor belt were continuously photographed in real time by cameras located above the belt. The images obtained were processed by various methods and techniques. Using digital instrumentation, the volume of the eggs was measured, broken/cracked eggs were separated, and dirty eggs were identified. In accordance with international standards for classifying the quality of eggs, the class of the separated eggs was determined through a fuzzy implication model. According to tests carried out on thousands of eggs, a quality control process with an accuracy of 98% was possible. © 2014 Society of Chemical Industry.

  1. Real-time image sequence segmentation using curve evolution

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Liu, Weisong

    2001-04-01

    In this paper, we describe a novel approach to image sequence segmentation and its real-time implementation. This approach uses the 3D structure tensor to produce a more robust frame difference signal and uses curve evolution to extract whole objects. Our algorithm is implemented on a standard PC running the Windows operating system with video capture from a USB camera that is a standard Windows video capture device. Using the Windows standard video I/O functionalities, our segmentation software is highly portable and easy to maintain and upgrade. In its current implementation on a Pentium 400, the system can perform segmentation at 5 frames/sec with a frame resolution of 160 by 120.

  2. Parallel algorithm of real-time infrared image restoration based on total variation theory

    NASA Astrophysics Data System (ADS)

    Zhu, Ran; Li, Miao; Long, Yunli; Zeng, Yaoyuan; An, Wei

    2015-10-01

    Image restoration is a necessary preprocessing step for infrared remote sensing applications. Traditional methods remove the noise but penalize the gradients corresponding to edges too heavily. Image restoration techniques based on variational approaches can solve this over-smoothing problem thanks to their well-defined mathematical modeling of the restoration procedure. The total variation (TV) of the infrared image is introduced as an L1 regularization term added to the objective energy functional. This converts the restoration process into the optimization of a functional involving a fidelity term to the image data plus a regularization term. Infrared image restoration with the TV-L1 model fully exploits the remote sensing data and preserves information at edges caused by clouds. The numerical implementation algorithm is presented in detail. Analysis indicates that the structure of this algorithm can easily be parallelized. Therefore a parallel implementation of the TV-L1 filter based on a multicore architecture with shared memory is proposed for real-time infrared remote sensing systems. Massive computation over the image data is performed in parallel by cooperating threads running simultaneously on multiple cores. Several groups of synthetic infrared image data are used to validate the feasibility and effectiveness of the proposed parallel algorithm. A quantitative analysis measuring the restored image quality against the input image is presented. Experimental results show that the TV-L1 filter can restore a varying background image reasonably and that its performance meets the requirements of real-time image processing.
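
    A serial, smoothed gradient-descent sketch of the TV-L1 energy is shown below for orientation. The paper's contribution is the multicore parallel implementation and its own numerical scheme, neither of which is reproduced; the step size, iteration count and smoothing epsilon are illustrative assumptions.

```python
import numpy as np

def tv_l1_restore(f, lam=1.0, n_iter=200, dt=0.2, eps=1e-3):
    """Smoothed gradient descent for the TV-L1 model  min_u |grad u| + lam * |u - f|.

    f : noisy infrared frame (2D float array). Periodic boundaries via np.roll
    keep the sketch short; a production code would handle borders explicitly.
    """
    f = f.astype(np.float64)
    u = f.copy()
    for _ in range(n_iter):
        # forward differences of u
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + eps**2)
        px, py = ux / mag, uy / mag
        # divergence via backward differences (adjoint of the forward difference)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        fidelity = (u - f) / np.sqrt((u - f)**2 + eps**2)   # smoothed L1 data term
        u += dt * (div - lam * fidelity)
    return u
```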

  3. GPU accelerated real-time confocal fluorescence lifetime imaging microscopy (FLIM) based on the analog mean-delay (AMD) method

    PubMed Central

    Kim, Byungyeon; Park, Byungjun; Lee, Seungrag; Won, Youngjae

    2016-01-01

    We demonstrated GPU accelerated real-time confocal fluorescence lifetime imaging microscopy (FLIM) based on the analog mean-delay (AMD) method. Our algorithm was verified for various fluorescence lifetimes and photon numbers. The GPU processing time was faster than the physical scanning time for images up to 800 × 800, and more than 149 times faster than a single core CPU. The frame rate of our system was demonstrated to be 13 fps for a 200 × 200 pixel image when observing maize vascular tissue. This system can be utilized for observing dynamic biological reactions, medical diagnosis, and real-time industrial inspection. PMID:28018724
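
    The AMD principle itself is compact: for a single-exponential decay, the lifetime equals the difference between the intensity-weighted mean delay of the fluorescence waveform and that of the instrument response. A minimal sketch, with the discretization and names assumed for illustration (the paper's GPU pipeline is not reproduced):

```python
import numpy as np

def amd_lifetime(t, fluorescence, irf):
    """Analog mean-delay (AMD) lifetime estimate from digitized waveforms.

    t            : time axis in seconds.
    fluorescence : measured fluorescence pulse (background-subtracted).
    irf          : instrument response recorded under the same conditions.
    """
    mean_fluo = np.sum(t * fluorescence) / np.sum(fluorescence)   # mean arrival time
    mean_irf = np.sum(t * irf) / np.sum(irf)
    return mean_fluo - mean_irf
```

    Because only two weighted sums per pixel are needed, rather than an iterative decay fit, the per-pixel cost is small enough for real-time operation once parallelized.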

  4. Real-Time Visualization of Tissue Ischemia

    NASA Technical Reports Server (NTRS)

    Bearman, Gregory H. (Inventor); Chrien, Thomas D. (Inventor); Eastwood, Michael L. (Inventor)

    2000-01-01

    A real-time display of tissue ischemia is discussed, which comprises three CCD video cameras, each with a narrow-bandwidth filter at the correct wavelength. The cameras simultaneously view an area of tissue suspected of having ischemic regions through beamsplitters. The output from each camera is adjusted to give the correct signal intensity for combining with the others into an image for display. If necessary, a digital signal processor (DSP) can implement algorithms for image enhancement prior to display. Current DSP engines are fast enough for real-time display. Measurement at three wavelengths, combined into a real-time red-green-blue (RGB) video display with a digital signal processing (DSP) board to implement image algorithms, provides direct visualization of ischemic areas.
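
    The fusion step amounts to mapping the three gain-adjusted narrowband frames onto the R, G and B display channels; a minimal sketch follows. The normalization scheme and function names are assumptions, and any ischemia-specific algebra from the patent is not reproduced.

```python
import numpy as np

def compose_rgb(band_r, band_g, band_b):
    """Fuse three narrow-band camera frames into one RGB display frame.

    Each input is a 2D array from one filtered camera; per-band percentile
    normalization stands in for the per-camera gain adjustment.
    """
    def normalize(x):
        x = x.astype(np.float32)
        lo, hi = np.percentile(x, (1, 99))
        return np.clip((x - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return np.dstack([normalize(band_r), normalize(band_g), normalize(band_b)])
```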

  5. Redox-initiated hydrogel system for detection and real-time imaging of cellulolytic enzyme activity.

    PubMed

    Malinowska, Klara H; Verdorfer, Tobias; Meinhold, Aylin; Milles, Lukas F; Funk, Victor; Gaub, Hermann E; Nash, Michael A

    2014-10-01

    Understanding the process of biomass degradation by cellulolytic enzymes is of urgent importance for biofuel and chemical production. Optimizing pretreatment conditions and improving enzyme formulations both require assays to quantify saccharification products on solid substrates. Typically, such assays are performed using freely diffusing fluorophores or dyes that measure reducing polysaccharide chain ends. These methods have thus far not allowed spatial localization of hydrolysis activity to specific substrate locations with identifiable morphological features. Here we describe a hydrogel reagent signaling (HyReS) system that amplifies saccharification products and initiates crosslinking of a hydrogel that localizes to locations of cellulose hydrolysis, allowing for imaging of the degradation process in real time. Optical detection of the gel in a rapid parallel format on synthetic and natural pretreated solid substrates was used to quantify activity of T. emersonii and T. reesei enzyme cocktails. When combined with total internal reflection fluorescence microscopy and AFM imaging, the reagent system provided a means to visualize enzyme activity in real-time with high spatial resolution (<2 μm). These results demonstrate the versatility of the HyReS system in detecting cellulolytic enzyme activity and suggest new opportunities in real-time chemical imaging of biomass depolymerization. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. High speed, real-time, camera bandwidth converter

    DOEpatents

    Bower, Dan E; Bloom, David A; Curry, James R

    2014-10-21

    Image data from a CMOS sensor with 10 bit resolution is reformatted in real time to allow the data to stream through communications equipment that is designed to transport data with 8 bit resolution. The incoming image data has 10 bit resolution. The communication equipment can transport image data with 8 bit resolution. Image data with 10 bit resolution is transmitted in real-time, without a frame delay, through the communication equipment by reformatting the image data.
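
    The patent abstract does not specify the reformatting scheme; one straightforward way to carry 10-bit pixels over an 8-bit transport without losing resolution or buffering a whole frame is to pack four pixels into five bytes, sketched below. This is purely illustrative and not necessarily the patented method.

```python
import numpy as np

def pack_10bit_to_bytes(pixels):
    """Pack 10-bit pixels into a plain 8-bit byte stream (4 pixels -> 5 bytes).

    pixels : 1D uint16 array, length a multiple of 4, values < 1024.
    The receiver reverses the bit slicing to recover the original 10-bit values.
    """
    p = pixels.astype(np.uint16).reshape(-1, 4)
    out = np.empty((p.shape[0], 5), dtype=np.uint8)
    out[:, 0] = p[:, 0] >> 2                              # p0 bits 9..2
    out[:, 1] = ((p[:, 0] & 0x3) << 6) | (p[:, 1] >> 4)   # p0 bits 1..0 | p1 bits 9..4
    out[:, 2] = ((p[:, 1] & 0xF) << 4) | (p[:, 2] >> 6)   # p1 bits 3..0 | p2 bits 9..6
    out[:, 3] = ((p[:, 2] & 0x3F) << 2) | (p[:, 3] >> 8)  # p2 bits 5..0 | p3 bits 9..8
    out[:, 4] = p[:, 3] & 0xFF                            # p3 bits 7..0
    return out.reshape(-1)
```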

  7. Real-time Crystal Growth Visualization and Quantification by Energy-Resolved Neutron Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tremsin, Anton S.; Perrodin, Didier; Losko, Adrian S.

    Energy-resolved neutron imaging is investigated as a real-time diagnostic tool for visualization and in-situ measurements of "blind" processes. This technique is demonstrated for the Bridgman-type crystal growth enabling remote and direct measurements of growth parameters crucial for process optimization. The location and shape of the interface between liquid and solid phases are monitored in real-time, concurrently with the measurement of elemental distribution within the growth volume and with the identification of structural features with a ~100 μm spatial resolution. Such diagnostics can substantially reduce the development time between exploratory small scale growth of new materials and their subsequent commercial production. This technique is widely applicable and is not limited to crystal growth processes.

  8. Real-time Crystal Growth Visualization and Quantification by Energy-Resolved Neutron Imaging.

    PubMed

    Tremsin, Anton S; Perrodin, Didier; Losko, Adrian S; Vogel, Sven C; Bourke, Mark A M; Bizarri, Gregory A; Bourret, Edith D

    2017-04-20

    Energy-resolved neutron imaging is investigated as a real-time diagnostic tool for visualization and in-situ measurements of "blind" processes. This technique is demonstrated for the Bridgman-type crystal growth enabling remote and direct measurements of growth parameters crucial for process optimization. The location and shape of the interface between liquid and solid phases are monitored in real-time, concurrently with the measurement of elemental distribution within the growth volume and with the identification of structural features with a ~100 μm spatial resolution. Such diagnostics can substantially reduce the development time between exploratory small scale growth of new materials and their subsequent commercial production. This technique is widely applicable and is not limited to crystal growth processes.

  9. Real-time Crystal Growth Visualization and Quantification by Energy-Resolved Neutron Imaging

    NASA Astrophysics Data System (ADS)

    Tremsin, Anton S.; Perrodin, Didier; Losko, Adrian S.; Vogel, Sven C.; Bourke, Mark A. M.; Bizarri, Gregory A.; Bourret, Edith D.

    2017-04-01

    Energy-resolved neutron imaging is investigated as a real-time diagnostic tool for visualization and in-situ measurements of “blind” processes. This technique is demonstrated for the Bridgman-type crystal growth enabling remote and direct measurements of growth parameters crucial for process optimization. The location and shape of the interface between liquid and solid phases are monitored in real-time, concurrently with the measurement of elemental distribution within the growth volume and with the identification of structural features with a ~100 μm spatial resolution. Such diagnostics can substantially reduce the development time between exploratory small scale growth of new materials and their subsequent commercial production. This technique is widely applicable and is not limited to crystal growth processes.

  10. Real-time Crystal Growth Visualization and Quantification by Energy-Resolved Neutron Imaging

    DOE PAGES

    Tremsin, Anton S.; Perrodin, Didier; Losko, Adrian S.; ...

    2017-04-20

    Energy-resolved neutron imaging is investigated as a real-time diagnostic tool for visualization and in-situ measurements of "blind" processes. This technique is demonstrated for the Bridgman-type crystal growth enabling remote and direct measurements of growth parameters crucial for process optimization. The location and shape of the interface between liquid and solid phases are monitored in real-time, concurrently with the measurement of elemental distribution within the growth volume and with the identification of structural features with a ~100 μm spatial resolution. Such diagnostics can substantially reduce the development time between exploratory small scale growth of new materials and their subsequent commercial production. This technique is widely applicable and is not limited to crystal growth processes.

  11. Real-time Crystal Growth Visualization and Quantification by Energy-Resolved Neutron Imaging

    PubMed Central

    Tremsin, Anton S.; Perrodin, Didier; Losko, Adrian S.; Vogel, Sven C.; Bourke, Mark A.M.; Bizarri, Gregory A.; Bourret, Edith D.

    2017-01-01

    Energy-resolved neutron imaging is investigated as a real-time diagnostic tool for visualization and in-situ measurements of “blind” processes. This technique is demonstrated for the Bridgman-type crystal growth enabling remote and direct measurements of growth parameters crucial for process optimization. The location and shape of the interface between liquid and solid phases are monitored in real-time, concurrently with the measurement of elemental distribution within the growth volume and with the identification of structural features with a ~100 μm spatial resolution. Such diagnostics can substantially reduce the development time between exploratory small scale growth of new materials and their subsequent commercial production. This technique is widely applicable and is not limited to crystal growth processes. PMID:28425461

  12. Real time quantitative imaging for semiconductor crystal growth, control and characterization

    NASA Technical Reports Server (NTRS)

    Wargo, Michael J.

    1991-01-01

    A quantitative real time image processing system has been developed which can be software-reconfigured for semiconductor processing and characterization tasks. In thermal imager mode, 2D temperature distributions of semiconductor melt surfaces (900-1600 C) can be obtained with temperature and spatial resolutions better than 0.5 C and 0.5 mm, respectively, as demonstrated by analysis of melt surface thermal distributions. Temporal and spatial image processing techniques and multitasking computational capabilities convert such thermal imaging into a multimode sensor for crystal growth control. A second configuration of the image processing engine in conjunction with bright and dark field transmission optics is used to nonintrusively determine the microdistribution of free charge carriers and submicron sized crystalline defects in semiconductors. The IR absorption characteristics of wafers are determined with 10-micron spatial resolution and, after calibration, are converted into charge carrier density.

  13. A Miniature Forward-imaging B-scan Optical Coherence Tomography Probe to Guide Real-time Laser Ablation

    PubMed Central

    Li, Zhuoyan; Shen, Jin H.; Kozub, John A.; Prasad, Ratna; Lu, Pengcheng; Joos, Karen M.

    2014-01-01

    Background and Objective Investigations have shown that pulsed lasers tuned to 6.1 μm in wavelength are capable of ablating ocular and neural tissue with minimal collateral damage. This study investigated whether a miniature B-scan forward-imaging optical coherence tomography (OCT) probe can be combined with the laser to provide real-time visual feedback during laser incisions. Study Design/Methods and Materials A miniature 25-gauge B-scan forward-imaging OCT probe was developed and combined with a 250 μm hollow-glass waveguide to permit delivery of 6.1 μm laser energy. A gelatin mixture and both porcine corneal and retinal tissues were simultaneously imaged and lased (6.1 μm, 10 Hz, 0.4-0.7 mJ) through air. The ablation studies were observed and recorded in real time. The crater dimensions were measured using OCT imaging software (Bioptigen, Durham, NC). Histological analysis was performed on the ocular tissues. Results The combined miniature forward-imaging OCT and mid-infrared laser-delivery probe successfully imaged real-time tissue ablation in gelatin, corneal tissue, and retinal tissue. Application of a constant number of 60 pulses at 0.5 mJ/pulse to the gelatin resulted in a mean crater depth of 123 ± 15 μm. For the corneal tissue, there was a significant correlation between the number of pulses used and depth of the lased hole (Pearson correlation coefficient = 0.82; P = 0.0002). Histological analysis of the cornea and retina tissues showed discrete holes with minimal thermal damage. Conclusions A combined miniature OCT and laser-delivery probe can monitor real-time tissue laser ablation. With additional testing and improvements, this novel instrument has the future possibility of effectively guiding surgeries by simultaneously imaging and ablating tissue. PMID:24648326

  14. Real-time millimeter-wave imaging radiometer for avionic synthetic vision

    NASA Astrophysics Data System (ADS)

    Lovberg, John A.; Chou, Ri-Chee; Martin, Christopher A.

    1994-07-01

    ThermoTrex Corporation (TTC) has developed an imaging radiometer, the passive microwave camera (PMC), that uses an array of frequency-scanned antennas coupled to a multi-channel acousto-optic (Bragg cell) spectrum analyzer to form visible images of a scene through acquisition of thermal blackbody radiation in the millimeter-wave spectrum. The output of the Bragg cell is imaged by a standard video camera and passed to a computer for normalization and display at real-time frame rates. One application of this system could be its incorporation into an enhanced vision system to provide pilots with a clear view of the runway during fog and other adverse weather conditions. The unique PMC system architecture will allow compact large-aperture implementations because of its flat antenna sensor. Other potential applications include air traffic control, all-weather area surveillance, fire detection, and security. This paper describes the architecture of the TTC PMC and shows examples of images acquired with the system.

  15. Image quality specification and maintenance for airborne SAR

    NASA Astrophysics Data System (ADS)

    Clinard, Mark S.

    2004-08-01

    Specification, verification, and maintenance of image quality over the lifecycle of an operational airborne SAR begin with the specification for the system itself. Verification of image quality-oriented specification compliance can be enhanced by including a specification requirement that a vendor provide appropriate imagery at the various phases of the system life cycle. The nature and content of the imagery appropriate for each stage of the process depends on the nature of the test, the economics of collection, and the availability of techniques to extract the desired information from the data. At the earliest lifecycle stages, Concept and Technology Development (CTD) and System Development and Demonstration (SDD), the test set could include simulated imagery to demonstrate the mathematical and engineering concepts being implemented thus allowing demonstration of compliance, in part, through simulation. For Initial Operational Test and Evaluation (IOT&E), imagery collected from precisely instrumented test ranges and targets of opportunity consisting of a priori or a posteriori ground-truthed cultural and natural features are of value to the analysis of product quality compliance. Regular monitoring of image quality is possible using operational imagery and automated metrics; more precise measurements can be performed with imagery of instrumented scenes, when available. A survey of image quality measurement techniques is presented along with a discussion of the challenges of managing an airborne SAR program with the scarce resources of time, money, and ground-truthed data. Recommendations are provided that should allow an improvement in the product quality specification and maintenance process with a minimal increase in resource demands on the customer, the vendor, the operational personnel, and the asset itself.

  16. Magnetic particle imaging: advancements and perspectives for real-time in vivo monitoring and image-guided therapy

    NASA Astrophysics Data System (ADS)

    Pablico-Lansigan, Michele H.; Situ, Shu F.; Samia, Anna Cristina S.

    2013-05-01

    Magnetic particle imaging (MPI) is an emerging biomedical imaging technology that allows the direct quantitative mapping of the spatial distribution of superparamagnetic iron oxide nanoparticles. MPI's increased sensitivity and short image acquisition times foster the creation of tomographic images with high temporal and spatial resolution. The contrast and sensitivity of MPI is envisioned to transcend those of other medical imaging modalities presently used, such as magnetic resonance imaging (MRI), X-ray scans, ultrasound, computed tomography (CT), positron emission tomography (PET) and single photon emission computed tomography (SPECT). In this review, we present an overview of the recent advances in the rapidly developing field of MPI. We begin with a basic introduction of the fundamentals of MPI, followed by some highlights over the past decade of the evolution of strategies and approaches used to improve this new imaging technique. We also examine the optimization of iron oxide nanoparticle tracers used for imaging, underscoring the importance of size homogeneity and surface engineering. Finally, we present some future research directions for MPI, emphasizing the novel and exciting opportunities that it offers as an important tool for real-time in vivo monitoring. All these opportunities and capabilities that MPI presents are now seen as potential breakthrough innovations in timely disease diagnosis, implant monitoring, and image-guided therapeutics.

  17. Three-dimensional real-time imaging of bi-phasic flow through porous media

    NASA Astrophysics Data System (ADS)

    Sharma, Prerna; Aswathi, P.; Sane, Anit; Ghosh, Shankar; Bhattacharya, S.

    2011-11-01

    We present a scanning laser-sheet video imaging technique to image bi-phasic flow in three-dimensional porous media in real time with pore-scale spatial resolution, i.e., 35 μm and 500 μm for directions parallel and perpendicular to the flow, respectively. The technique is illustrated for the case of viscous fingering. Using suitable image processing protocols, both the morphology and the movement of the two-fluid interface were quantitatively estimated. Furthermore, a macroscopic parameter such as the displacement efficiency obtained from a microscopic (pore-scale) analysis demonstrates the versatility and usefulness of the method.

  18. Real-time computational photon-counting LiDAR

    NASA Astrophysics Data System (ADS)

    Edgar, Matthew; Johnson, Steven; Phillips, David; Padgett, Miles

    2018-03-01

    The availability of compact, low-cost, and high-speed MEMS-based spatial light modulators has generated widespread interest in alternative sampling strategies for imaging systems utilizing single-pixel detectors. The development of compressed sensing schemes for real-time computational imaging may have promising commercial applications for high-performance detectors, where the availability of focal plane arrays is expensive or otherwise limited. We discuss the research and development of a prototype light detection and ranging (LiDAR) system via direct time of flight, which utilizes a single high-sensitivity photon-counting detector and fast-timing electronics to recover millimeter-accuracy three-dimensional images in real time. The development of low-cost real-time computational LiDAR systems could have importance for applications in security, defense, and autonomous vehicles.
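
    The direct time-of-flight ranging described above reduces to converting photon arrival times into a round-trip delay and hence a range. A minimal sketch in Python, assuming timestamped photon events in seconds and taking the histogram peak as the return; the bin width, timing jitter, and background level below are illustrative assumptions, and the compressed-sensing image reconstruction is not reproduced here.

      import numpy as np

      C = 299_792_458.0  # speed of light [m/s]

      def range_from_timestamps(arrival_times, bin_width=50e-12):
          """Estimate target range from photon arrival times (direct time of flight)."""
          bins = np.arange(arrival_times.min(), arrival_times.max() + bin_width, bin_width)
          counts, edges = np.histogram(arrival_times, bins=bins)
          peak = np.argmax(counts)
          peak_delay = 0.5 * (edges[peak] + edges[peak + 1])  # centre of the peak bin
          return C * peak_delay / 2.0                         # halve for the round trip

      # Illustrative usage: photons from a target ~15 m away plus uniform background.
      rng = np.random.default_rng(6)
      signal = rng.normal(2 * 15.0 / C, 100e-12, size=500)    # ~100 ps timing jitter
      background = rng.uniform(0, 200e-9, size=200)
      print(range_from_timestamps(np.concatenate([signal, background])))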

  19. Attitude-error compensation for airborne down-looking synthetic-aperture imaging lidar

    NASA Astrophysics Data System (ADS)

    Li, Guang-yuan; Sun, Jian-feng; Zhou, Yu; Lu, Zhi-yong; Zhang, Guo; Cai, Guang-yu; Liu, Li-ren

    2017-11-01

    Target-coordinate transformation in the lidar spot of the down-looking synthetic-aperture imaging lidar (SAIL) was performed, and the attitude errors were deduced in the process of imaging, according to the principle of the airborne down-looking SAIL. The influence of the attitude errors on the imaging quality was analyzed theoretically. A compensation method for the attitude errors was proposed and theoretically verified. An airborne down-looking SAIL experiment was performed and yielded the same results. A point-by-point error-compensation method for solving the azimuthal-direction space-dependent attitude errors was also proposed.

  20. Real-time imaging of subarachnoid hemorrhage in piglets with electrical impedance tomography.

    PubMed

    Dai, Meng; Wang, Liang; Xu, Canhua; Li, Lianfeng; Gao, Guodong; Dong, Xiuzhen

    2010-09-01

    Subarachnoid hemorrhage (SAH) is one of the most severe medical emergencies in neurosurgery. Early detection or diagnosis would significantly reduce the rate of disability and mortality, and improve the prognosis of the patients. Although the present medical imaging techniques generally have high sensitivity to identify bleeding, the use of an additional, non-invasive imaging technique capable of continuously monitoring SAH is required to prevent contingent bleeding or re-bleeding. In this study, electrical impedance tomography (EIT) was applied to detect the onset of SAH modeled on eight piglets in real time, with the subsequent process being monitored continuously. The experimental SAH model was introduced by one-time injection of 5 ml fresh autologous arterial blood into the cisterna magna. Results showed that resistivity variations within the brain caused by the added blood could be detected using the EIT method and may be associated not only with the resistivity difference among brain tissues, but also with variations of cerebrospinal fluid dynamics. In conclusion, EIT has unique potential for use in clinical practice to provide invaluable real-time neuroimaging data for SAH after the improvement of electrode design, anisotropic realistic modeling and instrumentation.

  1. A generic FPGA-based detector readout and real-time image processing board

    NASA Astrophysics Data System (ADS)

    Sarpotdar, Mayuresh; Mathew, Joice; Safonova, Margarita; Murthy, Jayant

    2016-07-01

    For space-based astronomical observations, it is important to have a mechanism to capture the digital output from the standard detector for further on-board analysis and storage. We have developed a generic (application-wise) field-programmable gate array (FPGA) board to interface with an image sensor, a method to generate the clocks required to read the image data from the sensor, and a real-time image processor system (on-chip) which can be used for various image processing tasks. The FPGA board is applied as the image processor board in the Lunar Ultraviolet Cosmic Imager (LUCI) and a star sensor (StarSense) - instruments developed by our group. In this paper, we discuss the various design considerations for this board and its applications in future balloon and possible space flights.

  2. Real-time Interpolation for True 3-Dimensional Ultrasound Image Volumes

    PubMed Central

    Ji, Songbai; Roberts, David W.; Hartov, Alex; Paulsen, Keith D.

    2013-01-01

    We compared trilinear interpolation to voxel nearest neighbor and distance-weighted algorithms for fast and accurate processing of true 3-dimensional ultrasound (3DUS) image volumes. In this study, the computational efficiency and interpolation accuracy of the 3 methods were compared on the basis of a simulated 3DUS image volume, 34 clinical 3DUS image volumes from 5 patients, and 2 experimental phantom image volumes. We show that trilinear interpolation improves interpolation accuracy over both the voxel nearest neighbor and distance-weighted algorithms yet achieves real-time computational performance that is comparable to the voxel nearest neighbor algorithm (1–2 orders of magnitude faster than the distance-weighted algorithm) as well as the fastest pixel-based algorithms for processing tracked 2-dimensional ultrasound images (0.035 seconds per 2-dimensional cross-sectional image [76,800 pixels interpolated, or 0.46 ms/1000 pixels] and 1.05 seconds per full volume with a 1-mm3 voxel size [4.6 million voxels interpolated, or 0.23 ms/1000 voxels]). On the basis of these results, trilinear interpolation is recommended as a fast and accurate interpolation method for rectilinear sampling of 3DUS image acquisitions, which is required to facilitate subsequent processing and display during operating room procedures such as image-guided neurosurgery. PMID:21266563
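
    As a concrete illustration of the interpolation kernel compared above, the following is a minimal trilinear-interpolation sketch in Python/NumPy; the volume layout (indexed as volume[z, y, x]) and the clamping behaviour are assumptions of this sketch, not details taken from the paper.

      import numpy as np

      def trilinear_interpolate(volume, points):
          """Trilinearly interpolate a 3D volume at fractional (z, y, x) coordinates.

          volume : (D, H, W) ndarray
          points : (N, 3) ndarray of fractional voxel coordinates
          """
          zyx = np.asarray(points, dtype=float)
          upper = np.array(volume.shape) - 1
          zyx = np.clip(zyx, 0, upper - 1e-9)          # keep the +1 neighbour in bounds

          z0, y0, x0 = np.floor(zyx.T).astype(int)
          z1, y1, x1 = z0 + 1, y0 + 1, x0 + 1
          dz, dy, dx = (zyx - np.stack([z0, y0, x0], axis=1)).T

          # Weighted sum of the eight surrounding voxels.
          return (volume[z0, y0, x0] * (1 - dz) * (1 - dy) * (1 - dx) +
                  volume[z1, y0, x0] * dz * (1 - dy) * (1 - dx) +
                  volume[z0, y1, x0] * (1 - dz) * dy * (1 - dx) +
                  volume[z0, y0, x1] * (1 - dz) * (1 - dy) * dx +
                  volume[z1, y1, x0] * dz * dy * (1 - dx) +
                  volume[z1, y0, x1] * dz * (1 - dy) * dx +
                  volume[z0, y1, x1] * (1 - dz) * dy * dx +
                  volume[z1, y1, x1] * dz * dy * dx)

      # Illustrative usage: resample a random volume at a few fractional positions.
      vol = np.random.default_rng(0).random((32, 64, 64))
      samples = trilinear_interpolate(vol, [[10.5, 20.25, 30.75], [3.0, 4.0, 5.0]])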

  3. Real-time interpolation for true 3-dimensional ultrasound image volumes.

    PubMed

    Ji, Songbai; Roberts, David W; Hartov, Alex; Paulsen, Keith D

    2011-02-01

    We compared trilinear interpolation to voxel nearest neighbor and distance-weighted algorithms for fast and accurate processing of true 3-dimensional ultrasound (3DUS) image volumes. In this study, the computational efficiency and interpolation accuracy of the 3 methods were compared on the basis of a simulated 3DUS image volume, 34 clinical 3DUS image volumes from 5 patients, and 2 experimental phantom image volumes. We show that trilinear interpolation improves interpolation accuracy over both the voxel nearest neighbor and distance-weighted algorithms yet achieves real-time computational performance that is comparable to the voxel nearest neighbor algorithm (1-2 orders of magnitude faster than the distance-weighted algorithm) as well as the fastest pixel-based algorithms for processing tracked 2-dimensional ultrasound images (0.035 seconds per 2-dimensional cross-sectional image [76,800 pixels interpolated, or 0.46 ms/1000 pixels] and 1.05 seconds per full volume with a 1-mm(3) voxel size [4.6 million voxels interpolated, or 0.23 ms/1000 voxels]). On the basis of these results, trilinear interpolation is recommended as a fast and accurate interpolation method for rectilinear sampling of 3DUS image acquisitions, which is required to facilitate subsequent processing and display during operating room procedures such as image-guided neurosurgery.

  4. Real-time registration of 3D to 2D ultrasound images for image-guided prostate biopsy.

    PubMed

    Gillies, Derek J; Gardi, Lori; De Silva, Tharindu; Zhao, Shuang-Ren; Fenster, Aaron

    2017-09-01

    During image-guided prostate biopsy, needles are targeted at tissues that are suspicious of cancer to obtain specimens for histological examination. Unfortunately, patient motion causes targeting errors when using an MR-transrectal ultrasound (TRUS) fusion approach to augment the conventional biopsy procedure. This study aims to develop an automatic motion correction algorithm approaching the frame rate of an ultrasound system to be used in fusion-based prostate biopsy systems. Two modes of operation have been investigated for the clinical implementation of the algorithm: motion compensation using a single user-initiated correction performed prior to biopsy, and real-time continuous motion compensation performed automatically as a background process. Retrospective 2D and 3D TRUS patient images acquired prior to biopsy gun firing were registered using an intensity-based algorithm utilizing normalized cross-correlation and Powell's method for optimization. 2D and 3D images were downsampled and cropped to estimate the optimal amount of image information that would perform registrations quickly and accurately. The optimal search order during optimization was also analyzed to avoid local optima in the search space. Error in the algorithm was computed using target registration errors (TREs) from manually identified homologous fiducials in a clinical patient dataset. The algorithm was evaluated for real-time performance using the two different modes of clinical implementation by way of user-initiated and continuous motion compensation methods on a tissue-mimicking prostate phantom. After implementation in a TRUS-guided system with an image downsampling factor of 4, the proposed approach resulted in a mean ± std TRE and computation time of 1.6 ± 0.6 mm and 57 ± 20 ms, respectively. The user-initiated mode performed registrations for in-plane, out-of-plane, and roll motions with computation times of 108 ± 38 ms, 60 ± 23 ms, and 89 ± 27 ms, respectively, and corresponding
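
    The registration step described above (normalized cross-correlation optimized with Powell's method) can be illustrated with a small SciPy sketch. The translation-only motion model, the image names, and the synthetic test data are assumptions of this sketch; the clinical pipeline's cropping, downsampling, and 3D search are not reproduced.

      import numpy as np
      from scipy.ndimage import shift as nd_shift
      from scipy.optimize import minimize

      def ncc(a, b):
          """Normalized cross-correlation between two images of equal shape."""
          a = (a - a.mean()) / (a.std() + 1e-12)
          b = (b - b.mean()) / (b.std() + 1e-12)
          return float((a * b).mean())

      def register_translation(fixed, moving, x0=(0.0, 0.0)):
          """Find the in-plane shift that best aligns `moving` to `fixed`."""
          def cost(t):
              warped = nd_shift(moving, shift=t, order=1, mode='nearest')
              return -ncc(fixed, warped)      # maximizing NCC = minimizing its negative
          res = minimize(cost, x0=np.asarray(x0), method='Powell')
          return res.x, -res.fun

      # Illustrative usage: recover a known synthetic shift (the estimate undoes it).
      rng = np.random.default_rng(1)
      fixed = rng.random((128, 128))
      moving = nd_shift(fixed, shift=(3.0, -2.0), order=1, mode='nearest')
      est_shift, score = register_translation(fixed, moving)
      print(est_shift, score)                 # approximately (-3, 2), NCC near 1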

  5. Real-time computer treatment of THz passive device images with the high image quality

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not designed only for passive THz devices: it can be applied to any such device and to active THz imaging systems as well. We applied our code to the computer processing of images captured by four passive THz imaging devices manufactured by different companies; images produced by different devices usually require different spatial filters. The performance of the current version of the code is greater than one image per second for a THz image with more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times using parallel image-processing algorithms. We developed original spatial filters which allow one to see objects with sizes of less than 2 cm in imagery produced by passive THz devices that captured objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosives, ordinary explosives, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and represent a very promising solution to the security problem.
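
    The spatial-filtering stage described above can be illustrated with a simple bank of linear filters applied to a single frame. The Gaussian-smoothing and unsharp-masking kernels below are generic placeholders chosen for this sketch, not the authors' filters, and the parameter values are arbitrary.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def filter_bank(image, sigmas=(1.0, 2.0, 4.0), sharpen_amount=1.5):
          """Produce several filtered versions of one frame from a small filter bank."""
          outputs = {}
          for sigma in sigmas:
              smoothed = gaussian_filter(image, sigma=sigma)            # denoising
              outputs[f"gauss_{sigma}"] = smoothed
              outputs[f"unsharp_{sigma}"] = image + sharpen_amount * (image - smoothed)
          return outputs

      # Illustrative usage on a synthetic noisy frame.
      frame = np.random.default_rng(2).normal(size=(120, 160))
      results = filter_bank(frame)             # one input frame, several output images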

  6. Toward real-time quantum imaging with a single pixel camera

    DOE PAGES

    Lawrie, B. J.; Pooser, R. C.

    2013-03-19

    In this paper, we present a workbench for the study of real-time quantum imaging by measuring the frame-by-frame quantum noise reduction of multi-spatial-mode twin beams generated by four wave mixing in Rb vapor. Exploiting the multiple spatial modes of this squeezed light source, we utilize spatial light modulators to selectively pass macropixels of quantum correlated modes from each of the twin beams to a high quantum efficiency balanced detector. Finally, in low-light-level imaging applications, the ability to measure the quantum correlations between individual spatial modes and macropixels of spatial modes with a single pixel camera will facilitate compressive quantum imaging with sensitivity below the photon shot noise limit.

  7. Using dual-energy x-ray imaging to enhance automated lung tumor tracking during real-time adaptive radiotherapy.

    PubMed

    Menten, Martin J; Fast, Martin F; Nill, Simeon; Oelfke, Uwe

    2015-12-01

    Real-time, markerless localization of lung tumors with kV imaging is often inhibited by ribs obscuring the tumor and poor soft-tissue contrast. This study investigates the use of dual-energy imaging, which can generate radiographs with reduced bone visibility, to enhance automated lung tumor tracking for real-time adaptive radiotherapy. kV images of an anthropomorphic breathing chest phantom were experimentally acquired and radiographs of actual lung cancer patients were Monte-Carlo-simulated at three imaging settings: low-energy (70 kVp, 1.5 mAs), high-energy (140 kVp, 2.5 mAs, 1 mm additional tin filtration), and clinical (120 kVp, 0.25 mAs). Regular dual-energy images were calculated by weighted logarithmic subtraction of high- and low-energy images and filter-free dual-energy images were generated from clinical and low-energy radiographs. The weighting factor to calculate the dual-energy images was determined by means of a novel objective score. The usefulness of dual-energy imaging for real-time tracking with an automated template matching algorithm was investigated. Regular dual-energy imaging was able to increase tracking accuracy in left-right images of the anthropomorphic phantom as well as in 7 out of 24 investigated patient cases. Tracking accuracy remained comparable in three cases and decreased in five cases. Filter-free dual-energy imaging was only able to increase accuracy in 2 out of 24 cases. In four cases no change in accuracy was observed and tracking accuracy worsened in nine cases. In 9 out of 24 cases, it was not possible to define a tracking template due to poor soft-tissue contrast regardless of input images. The mean localization errors using clinical, regular dual-energy, and filter-free dual-energy radiographs were 3.85, 3.32, and 5.24 mm, respectively. Tracking success was dependent on tumor position, tumor size, imaging beam angle, and patient size. This study has highlighted the influence of patient anatomy on the success rate of real-time
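
    The dual-energy combination described above is, at its core, a weighted logarithmic subtraction of the high- and low-energy radiographs. A minimal sketch, assuming co-registered intensity images; the weighting factor below is an arbitrary example value, whereas the paper selects it with its own objective score, which is not reproduced here.

      import numpy as np

      def dual_energy_image(low_kv, high_kv, w):
          """Weighted logarithmic subtraction to suppress bone contrast.

          low_kv, high_kv : co-registered radiographs (positive intensities)
          w               : scalar weight chosen to cancel the bone signal
          """
          eps = 1e-6  # avoid log(0) in detector regions with no signal
          return np.log(high_kv + eps) - w * np.log(low_kv + eps)

      # Illustrative usage with random arrays standing in for radiographs.
      rng = np.random.default_rng(3)
      low = rng.uniform(0.1, 1.0, size=(256, 256))
      high = rng.uniform(0.1, 1.0, size=(256, 256))
      de = dual_energy_image(low, high, w=0.6)   # w = 0.6 is a placeholder value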

  8. Using dual-energy x-ray imaging to enhance automated lung tumor tracking during real-time adaptive radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menten, Martin J., E-mail: martin.menten@icr.ac.uk; Fast, Martin F.; Nill, Simeon

    2015-12-15

    Purpose: Real-time, markerless localization of lung tumors with kV imaging is often inhibited by ribs obscuring the tumor and poor soft-tissue contrast. This study investigates the use of dual-energy imaging, which can generate radiographs with reduced bone visibility, to enhance automated lung tumor tracking for real-time adaptive radiotherapy. Methods: kV images of an anthropomorphic breathing chest phantom were experimentally acquired and radiographs of actual lung cancer patients were Monte-Carlo-simulated at three imaging settings: low-energy (70 kVp, 1.5 mAs), high-energy (140 kVp, 2.5 mAs, 1 mm additional tin filtration), and clinical (120 kVp, 0.25 mAs). Regular dual-energy images were calculated by weighted logarithmic subtraction of high- and low-energy images and filter-free dual-energy images were generated from clinical and low-energy radiographs. The weighting factor to calculate the dual-energy images was determined by means of a novel objective score. The usefulness of dual-energy imaging for real-time tracking with an automated template matching algorithm was investigated. Results: Regular dual-energy imaging was able to increase tracking accuracy in left–right images of the anthropomorphic phantom as well as in 7 out of 24 investigated patient cases. Tracking accuracy remained comparable in three cases and decreased in five cases. Filter-free dual-energy imaging was only able to increase accuracy in 2 out of 24 cases. In four cases no change in accuracy was observed and tracking accuracy worsened in nine cases. In 9 out of 24 cases, it was not possible to define a tracking template due to poor soft-tissue contrast regardless of input images. The mean localization errors using clinical, regular dual-energy, and filter-free dual-energy radiographs were 3.85, 3.32, and 5.24 mm, respectively. Tracking success was dependent on tumor position, tumor size, imaging beam angle, and patient size. Conclusions: This study has highlighted the

  9. Real-time volumetric image reconstruction and 3D tumor localization based on a single x-ray projection image for lung cancer radiotherapy.

    PubMed

    Li, Ruijiang; Jia, Xun; Lewis, John H; Gu, Xuejun; Folkerts, Michael; Men, Chunhua; Jiang, Steve B

    2010-06-01

    To develop an algorithm for real-time volumetric image reconstruction and 3D tumor localization based on a single x-ray projection image for lung cancer radiotherapy. Given a set of volumetric images of a patient at N breathing phases as the training data, deformable image registration was performed between a reference phase and the other N-1 phases, resulting in N-1 deformation vector fields (DVFs). These DVFs can be represented efficiently by a few eigenvectors and coefficients obtained from principal component analysis (PCA). By varying the PCA coefficients, new DVFs can be generated, which, when applied on the reference image, lead to new volumetric images. A volumetric image can then be reconstructed from a single projection image by optimizing the PCA coefficients such that its computed projection matches the measured one. The 3D location of the tumor can be derived by applying the inverted DVF on its position in the reference image. The algorithm was implemented on graphics processing units (GPUs) to achieve real-time efficiency. The training data were generated using a realistic and dynamic mathematical phantom with ten breathing phases. The testing data were 360 cone beam projections corresponding to one gantry rotation, simulated using the same phantom with a 50% increase in breathing amplitude. The average relative image intensity error of the reconstructed volumetric images is 6.9% +/- 2.4%. The average 3D tumor localization error is 0.8 +/- 0.5 mm. On an NVIDIA Tesla C1060 GPU card, the average computation time for reconstructing a volumetric image from each projection is 0.24 s (range: 0.17 to 0.35 s). The authors have shown the feasibility of reconstructing volumetric images and localizing tumor positions in 3D in near real-time from a single x-ray image.
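
    The PCA parameterization described above can be sketched as follows: the N-1 training deformation vector fields (DVFs) are flattened into rows, a principal component basis is extracted, and new DVFs are generated as the mean plus a weighted sum of the leading components. The optimization against a measured projection and the GPU implementation are omitted, and all array names and sizes are illustrative.

      import numpy as np

      def fit_dvf_pca(dvfs, n_components=3):
          """PCA of deformation vector fields.

          dvfs : (N, M) array, each row a flattened DVF from one breathing phase
          Returns the mean DVF and the leading principal components.
          """
          mean = dvfs.mean(axis=0)
          centered = dvfs - mean
          _, _, vt = np.linalg.svd(centered, full_matrices=False)   # rows of vt = components
          return mean, vt[:n_components]

      def synthesize_dvf(mean, components, coeffs):
          """Generate a new DVF from a set of PCA coefficients."""
          return mean + np.asarray(coeffs) @ components

      # Illustrative usage: 9 training DVFs of a toy 10x10x10 grid (3 vector components).
      rng = np.random.default_rng(4)
      training = rng.normal(size=(9, 10 * 10 * 10 * 3))
      mean, pcs = fit_dvf_pca(training, n_components=3)
      new_dvf = synthesize_dvf(mean, pcs, coeffs=[0.5, -0.2, 0.1])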

  10. In-vivo, real-time cross-sectional images of retina using a GPU enhanced master slave optical coherence tomography system

    NASA Astrophysics Data System (ADS)

    Bradu, Adrian; Kapinchev, Konstantin; Barnes, Frederick; Podoleanu, Adrian

    2016-03-01

    In our previous reports we demonstrated a novel Fourier domain optical coherence tomography method, Master Slave optical coherence tomography (MS-OCT), that does not require resampling of data and can deliver en-face images from several depths simultaneously. While ideally suited for delivering information from a selected depth, MS-OCT has so far been inferior to conventional FFT-based OCT in terms of the time required to produce cross-sectional images. Here, we demonstrate that by taking advantage of the parallel processing capabilities offered by the MS-OCT method, cross-sectional OCT images of the human retina can be produced in real-time by assembling several T-scans from different depths. We analyze the conditions that ensure a real-time B-scan imaging operation, and demonstrate in-vivo real-time images from the human fovea and the optic nerve, of comparable resolution and sensitivity to those produced using the traditional Fourier-domain-based method.

  11. Solar Demon: near real-time solar eruptive event detection on SDO/AIA images

    NASA Astrophysics Data System (ADS)

    Kraaikamp, Emil; Verbeeck, Cis

    Solar flares, dimmings and EUV waves have been observed routinely in extreme ultra-violet (EUV) images of the Sun since 1996. These events are closely associated with coronal mass ejections (CMEs), and therefore provide useful information for early space weather alerts. The Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) generates such a massive dataset that it becomes impossible to find most of these eruptive events manually. Solar Demon is a set of automatic detection algorithms that attempts to solve this problem by providing both near real-time warnings of eruptive events and a catalog of characterized events. Solar Demon has been designed to detect and characterize dimmings, EUV waves, as well as solar flares in near real-time on SDO/AIA data. The detection modules are running continuously at the Royal Observatory of Belgium on both quick-look data and synoptic science data. The output of Solar Demon can be accessed in near real-time on the Solar Demon website, and includes images, movies, light curves, and the numerical evolution of several parameters. Solar Demon is the result of collaboration between the FP7 projects AFFECTS and COMESEP. Flare detections of Solar Demon are integrated into the COMESEP alert system. Here we present the Solar Demon detection algorithms and their output. We will focus on the algorithm and its operational implementation. Examples of interesting flare, dimming and EUV wave events, and general statistics of the detections made so far during solar cycle 24 will be presented as well.

  12. Real-Time Classification of Hand Motions Using Ultrasound Imaging of Forearm Muscles.

    PubMed

    Akhlaghi, Nima; Baker, Clayton A; Lahlou, Mohamed; Zafar, Hozaifah; Murthy, Karthik G; Rangwala, Huzefa S; Kosecka, Jana; Joiner, Wilsaan M; Pancrazio, Joseph J; Sikdar, Siddhartha

    2016-08-01

    Surface electromyography (sEMG) has been the predominant method for sensing electrical activity for a number of applications involving muscle-computer interfaces, including myoelectric control of prostheses and rehabilitation robots. Ultrasound imaging for sensing mechanical deformation of functional muscle compartments can overcome several limitations of sEMG, including the inability to differentiate between deep contiguous muscle compartments, low signal-to-noise ratio, and lack of a robust graded signal. The objective of this study was to evaluate the feasibility of real-time graded control using a computationally efficient method to differentiate between complex hand motions based on ultrasound imaging of forearm muscles. Dynamic ultrasound images of the forearm muscles were obtained from six able-bodied volunteers and analyzed to map muscle activity based on the deformation of the contracting muscles during different hand motions. Each participant performed 15 different hand motions, including digit flexion, different grips (i.e., power grasp and pinch grip), and grips in combination with wrist pronation. During the training phase, we generated a database of activity patterns corresponding to different hand motions for each participant. During the testing phase, novel activity patterns were classified using a nearest neighbor classification algorithm based on that database. The average classification accuracy was 91%. Real-time image-based control of a virtual hand showed an average classification accuracy of 92%. Our results demonstrate the feasibility of using ultrasound imaging as a robust muscle-computer interface. Potential clinical applications include control of multiarticulated prosthetic hands, stroke rehabilitation, and fundamental investigations of motor control and biomechanics.
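
    The nearest-neighbour classification step described above can be sketched with scikit-learn, assuming each ultrasound activity pattern has already been reduced to a fixed-length feature vector; the feature extraction, the number of features, and the random stand-in data below are assumptions of this sketch, not the study's pipeline.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      # Stand-in activity patterns: 15 hand motions x 10 repetitions, 64 features each.
      rng = np.random.default_rng(5)
      n_motions, n_reps, n_features = 15, 10, 64
      X_train = rng.normal(size=(n_motions * n_reps, n_features))
      y_train = np.repeat(np.arange(n_motions), n_reps)     # one label per hand motion

      clf = KNeighborsClassifier(n_neighbors=1)             # nearest-neighbour rule
      clf.fit(X_train, y_train)

      X_test = rng.normal(size=(5, n_features))             # novel activity patterns
      print(clf.predict(X_test))                            # predicted motion labels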

  13. Intraoperative brain hemodynamic response assessment with real-time hyperspectral optical imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Laurence, Audrey; Pichette, Julien; Angulo-Rodríguez, Leticia M.; Saint Pierre, Catherine; Lesage, Frédéric; Bouthillier, Alain; Nguyen, Dang Khoa; Leblond, Frédéric

    2016-03-01

    Following normal neuronal activity, there is an increase in cerebral blood flow and cerebral blood volume to provide oxygenated hemoglobin to active neurons. For abnormal activity such as epileptiform discharges, this hemodynamic response may be inadequate to meet the high metabolic demands. To verify this hypothesis, we developed a novel hyperspectral imaging system able to monitor real-time cortical hemodynamic changes during brain surgery. The imaging system is directly integrated into a surgical microscope, using the white-light source for illumination. A snapshot hyperspectral camera is used for detection (4x4 mosaic filter array detecting 16 wavelengths simultaneously). We present calibration experiments where phantoms made of intralipid and food dyes were imaged. Relative concentrations of three dyes were recovered at a video rate of 30 frames per second. We also present hyperspectral recordings during brain surgery of epileptic patients with concurrent electrocorticography recordings. Relative concentration maps of oxygenated and deoxygenated hemoglobin were extracted from the data, allowing real-time studies of hemodynamic changes with a good spatial resolution. Finally, we present preliminary results on phantoms obtained with an integrated spatial frequency domain imaging system to recover tissue optical properties. This additional module, used together with the hyperspectral imaging system, will allow quantification of hemoglobin concentration maps. Our hyperspectral imaging system offers a new tool to analyze hemodynamic changes, especially in the case of epileptiform discharges. It also offers an opportunity to study brain connectivity by analyzing correlations between hemodynamic responses of different tissue regions.
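
    Recovering relative chromophore concentrations from a measured attenuation spectrum, as described above, can be illustrated with a linear least-squares unmixing against known extinction spectra (a modified Beer-Lambert sketch). The extinction values and wavelength count below are placeholders, not the instrument's calibration, and scattering corrections are ignored.

      import numpy as np

      def unmix_chromophores(absorbance, extinction_matrix):
          """Least-squares estimate of relative chromophore concentrations.

          absorbance        : (W,) measured attenuation at W wavelengths
          extinction_matrix : (W, C) extinction coefficients of C chromophores
          """
          conc, *_ = np.linalg.lstsq(extinction_matrix, absorbance, rcond=None)
          return conc

      # Illustrative usage with placeholder spectra for HbO2 and Hb at 16 wavelengths.
      rng = np.random.default_rng(6)
      E = rng.uniform(0.1, 1.0, size=(16, 2))               # placeholder extinction spectra
      true_conc = np.array([0.7, 0.3])
      measured = E @ true_conc + rng.normal(0, 0.01, 16)
      print(unmix_chromophores(measured, E))                # approximately [0.7, 0.3]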

  14. Ultrahigh field magnetic resonance and colour Doppler real-time fusion imaging of the orbit--a hybrid tool for assessment of choroidal melanoma.

    PubMed

    Walter, Uwe; Niendorf, Thoralf; Graessl, Andreas; Rieger, Jan; Krüger, Paul-Christian; Langner, Sönke; Guthoff, Rudolf F; Stachs, Oliver

    2014-05-01

    A combination of magnetic resonance images with real-time high-resolution ultrasound known as fusion imaging may improve ophthalmologic examination. This study was undertaken to evaluate the feasibility of orbital high-field magnetic resonance and real-time colour Doppler ultrasound image fusion and navigation. This case study, performed between April and June 2013, included one healthy man (age, 47 years) and two patients (one woman, 57 years; one man, 67 years) with choroidal melanomas. All cases underwent 7.0-T magnetic resonance imaging using a custom-made ocular imaging surface coil. The Digital Imaging and Communications in Medicine volume data set was then loaded into the ultrasound system for manual registration of the live ultrasound image and fusion imaging examination. Data registration, matching and then volume navigation were feasible in all cases. Fusion imaging provided real-time imaging capabilities and high tissue contrast of choroidal tumour and optic nerve. It also allowed adding a real-time colour Doppler signal on magnetic resonance images for assessment of vasculature of tumour and retrobulbar structures. The combination of orbital high-field magnetic resonance and colour Doppler ultrasound image fusion and navigation is feasible. Multimodal fusion imaging promises to foster assessment and monitoring of choroidal melanoma and optic nerve disorders. • Orbital magnetic resonance and colour Doppler ultrasound real-time fusion imaging is feasible • Fusion imaging combines the spatial and temporal resolution advantages of each modality • Magnetic resonance and ultrasound fusion imaging improves assessment of choroidal melanoma vascularisation.

  15. Unmanned Airborne System Deployment at Turrialba Volcano for Real Time Eruptive Cloud Measurements

    NASA Astrophysics Data System (ADS)

    Diaz, J. A.; Pieri, D. C.; Fladeland, M. M.; Bland, G.; Corrales, E.; Alan, A., Jr.; Alegria, O.; Kolyer, R.

    2015-12-01

    The development of small unmanned aerial systems (sUAS) with a variety of instrument packages enables in situ and proximal remote sensing measurements of volcanic plumes, even when the active conditions of the volcano do not allow volcanologists and emergency response personnel to get too close to the erupting crater. This has been demonstrated this year by flying an sUAS through the heavy ash-driven erupting volcanic cloud of Turrialba Volcano, while conducting real-time in situ measurements of gases over the crater summit. The event also achieved the collection of newly released ash samples from the erupting volcano. The interception of the Turrialba ash cloud occurred during the CARTA 2015 field campaign carried out as part of an ongoing program for remote sensing satellite calibration and validation purposes, using active volcanic plumes. These deployments are timed to support overflights of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) onboard the NASA Terra satellite on a bimonthly basis using airborne platforms such as tethered balloons and free-flying fixed-wing small UAVs at altitudes up to 12.5 kft ASL within about a 5 km radius of the summit crater. The onboard instrument includes the MiniGas payload which consists of an array of single electrochemical and infrared gas detectors (SO2, H2S, CO2), temperature, pressure, relative humidity and GPS sensors, all connected to an Arduino-based board, with data collected at 1 Hz. Data are both stored onboard and sent by telemetry to the ground operator within a 3 km range. The UAV can also carry visible and infrared cameras as well as other payloads, such as a UAV-MS payload that is currently under development for mass spectrometer-based in situ measurements. The presentation describes the ongoing UAV-based in situ remote sensing validation program at Turrialba Volcano, the results of a fly-through of the eruptive cloud, as well as future plans to continue these efforts. Work presented here was

  16. Real-time simulation of thermal shadows with EMIT

    NASA Astrophysics Data System (ADS)

    Klein, Andreas; Oberhofer, Stefan; Schätz, Peter; Nischwitz, Alfred; Obermeier, Paul

    2016-05-01

    Modern missile systems use infrared imaging for tracking or target detection algorithms. The development and validation processes of these missile systems need high-fidelity simulations capable of stimulating the sensors in real-time with infrared image sequences from a synthetic 3D environment. The Extensible Multispectral Image Generation Toolset (EMIT) is a modular software library developed at MBDA Germany for the generation of physics-based infrared images in real-time. EMIT is able to render radiance images in full 32-bit floating point precision using state-of-the-art computer graphics cards and advanced shader programs. An important functionality of an infrared image generation toolset is the simulation of thermal shadows as these may cause matching errors in tracking algorithms. However, for real-time simulations, such as hardware-in-the-loop (HWIL) simulations of infrared seekers, thermal shadows are often neglected or precomputed as they require a thermal balance calculation in four dimensions (3D geometry plus time, extending up to several hours into the past). In this paper we will show the novel real-time thermal simulation of EMIT. Our thermal simulation is capable of simulating thermal effects in real-time environments, such as thermal shadows resulting from the occlusion of direct and indirect irradiance. We conclude our paper with the practical use of EMIT in a missile HWIL simulation.

  17. The implementation of CMOS sensors within a real time digital mammography intelligent imaging system: The I-ImaS System

    NASA Astrophysics Data System (ADS)

    Esbrand, C.; Royle, G.; Griffiths, J.; Speller, R.

    2009-07-01

    The integration of technology with healthcare has undoubtedly propelled the medical imaging sector well into the twenty-first century. The concept of digital imaging introduced during the 1970s has since paved the way for established imaging techniques where digital mammography, phase-contrast imaging and CT imaging are just a few examples. This paper presents a prototype intelligent digital mammography system designed and developed by a European consortium. The final system, the I-ImaS system, utilises CMOS monolithic active pixel sensor (MAPS) technology promoting on-chip data processing, enabling the acts of data processing and image acquisition to be achieved simultaneously; consequently, statistical analysis of tissue is achievable in real-time for the purpose of x-ray beam modulation via a feedback mechanism during the image acquisition procedure. The imager implements a dual array of twenty 520 pixel × 40 pixel CMOS MAPS sensing devices with a 32 μm pixel size, each individually coupled to a 100 μm thick thallium-doped structured CsI scintillator. This paper presents the first intelligent images of real excised breast tissue obtained from the prototype system, where the x-ray exposure was modulated via the statistical information extracted from the breast tissue itself. Conventional images were experimentally acquired where the statistical analysis of the data was done off-line, resulting in the production of simulated real-time intelligently optimised images. The results obtained indicate that real-time image optimisation using the statistical information extracted from the breast as a feedback mechanism is beneficial and foreseeable in the near future.

  18. DETECTION AND IDENTIFICATION OF TOXIC AIR POLLUTANTS USING AIRBORNE LWIR HYPERSPECTRAL IMAGING

    EPA Science Inventory

    Airborne longwave infrared (LWIR) hyperspectral imagery was utilized to detect and identify gaseous chemical release plumes at sites in southern Texas. The Airborne Hyperspectral Imager (AHI), developed by the University of Hawaii, was flown over a petrochemical facility and a ...

  19. Real-time iterative monitoring of radiofrequency ablation tumor therapy with 15O-water PET imaging.

    PubMed

    Bao, Ande; Goins, Beth; Dodd, Gerald D; Soundararajan, Anuradha; Santoyo, Cristina; Otto, Randal A; Davis, Michael D; Phillips, William T

    2008-10-01

    A method that provides real-time image-based monitoring of solid tumor therapy to ensure complete tumor eradication during image-guided interventional therapy would be a valuable tool. The short, 2-min half-life of (15)O makes it possible to perform repeated PET imaging at 20-min intervals at multiple time points before and after image-guided therapy. In this study, (15)O-water PET was evaluated as a tool to provide real-time feedback and iterative image guidance to rapidly monitor the intratumoral coverage of radiofrequency (RF) ablation therapy. Tumor RF ablation therapy was performed on head and neck squamous cell carcinoma (SCC) xenograft tumors (length, approximately 23 mm) in 6 nude rats. The tumor in each animal was ablated with RF (1-cm active size ablation catheter, 70 degrees C for 5 min) twice in 2 separate tumor regions with a 20-min separation. The (15)O-water PET images were acquired before RF ablation and after the first RF and second RF ablations using a small-animal PET scanner. In each PET session, approximately 100 MBq of (15)O-water in 1.0 mL of saline were injected intravenously into each animal. List-mode PET images were acquired for 7 min starting 20 s before injection. PET images were reconstructed by 2-dimensional ordered-subset expectation maximization into single-frame images and dynamic images at 10 s/frame. PET images were displayed and analyzed with software. Pre-RF ablation images demonstrate that (15)O-water accumulates in tumors with (15)O activity reaching peak levels immediately after administration. After RF ablation, the ablated region had almost zero activity, whereas the unablated tumor tissue continued to have a high (15)O-water accumulation. Using image feedback, the RF probe was repositioned to a tumor region with residual (15)O-water uptake and then ablated. The second RF ablation in this new region of the tumor resulted in additional ablation of the solid tumor, with a corresponding decrease in activity on the (15)O

  20. Efficient Parallel Levenberg-Marquardt Model Fitting towards Real-Time Automated Parametric Imaging Microscopy

    PubMed Central

    Zhu, Xiang; Zhang, Dianwen

    2013-01-01

    We present a fast, accurate and robust parallel Levenberg-Marquardt minimization optimizer, GPU-LMFit, which is implemented on graphics processing unit for high performance scalable parallel model fitting processing. GPU-LMFit can provide a dramatic speed-up in massive model fitting analyses to enable real-time automated pixel-wise parametric imaging microscopy. We demonstrate the performance of GPU-LMFit for the applications in superresolution localization microscopy and fluorescence lifetime imaging microscopy. PMID:24130785
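
    The per-pixel fitting problem that GPU-LMFit parallelizes can be illustrated on the CPU with SciPy's Levenberg-Marquardt solver; the mono-exponential decay model, the synthetic 8x8 image stack, and the starting guesses below are assumptions of this sketch, not the authors' GPU implementation.

      import numpy as np
      from scipy.optimize import least_squares

      def decay_model(params, t):
          amplitude, tau = params
          return amplitude * np.exp(-t / tau)

      def fit_pixel(t, y, p0=(1.0, 1.0)):
          """Levenberg-Marquardt fit of a mono-exponential decay to one pixel's signal."""
          residuals = lambda p: decay_model(p, t) - y
          return least_squares(residuals, x0=p0, method='lm').x

      # Illustrative usage: fit every pixel of a tiny synthetic 8x8 time-resolved stack.
      rng = np.random.default_rng(7)
      t = np.linspace(0, 5, 32)
      true_tau = rng.uniform(0.5, 2.0, size=(8, 8))
      stack = np.exp(-t[None, None, :] / true_tau[..., None]) + rng.normal(0, 0.01, (8, 8, 32))
      tau_map = np.array([[fit_pixel(t, stack[i, j])[1] for j in range(8)] for i in range(8)])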

  1. Highly Portable Airborne Multispectral Imaging System

    NASA Technical Reports Server (NTRS)

    Lehnemann, Robert; Mcnamee, Todd

    2001-01-01

    A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1 to 1.5 km. The system was developed especially for use in coastal environments and is well suited for performing remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.

  2. Embedded real-time image processing hardware for feature extraction and clustering

    NASA Astrophysics Data System (ADS)

    Chiu, Lihu; Chang, Grant

    2003-08-01

    Printronix, Inc. uses scanner-based image systems to perform print quality measurements for line-matrix printers. The size of the image samples and image definition required make commercial scanners convenient to use. The image processing is relatively well defined, and we are able to simplify many of the calculations into hardware equations and "c" code. The process of rapidly prototyping the system using DSP-based "c" code gets the algorithms well defined early in the development cycle. Once a working system is defined, the rest of the process involves splitting the task up for the FPGA and the DSP implementation. Deciding which of the two to use, the DSP or the FPGA, is a simple matter of trial benchmarking. There are two kinds of benchmarking: one for speed and the other for memory. The more memory-intensive algorithms should run in the DSP, and the simple real-time tasks can use the FPGA most effectively. Once the task is split, we can decide on which platform each algorithm should be executed. This involves prototyping all the code in the DSP, then timing various blocks of the algorithm. Slow routines can be optimized using the compiler tools and, if further reduction in time is needed, moved into tasks that the FPGA can perform.

  3. MO-AB-BRA-02: A Novel Scatter Imaging Modality for Real-Time Image Guidance During Lung SBRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redler, G; Bernard, D; Templeton, A

    2015-06-15

    Purpose: A novel scatter imaging modality is developed and its feasibility for image-guided radiation therapy (IGRT) during stereotactic body radiation therapy (SBRT) for lung cancer patients is assessed using analytic and Monte Carlo models as well as experimental testing. Methods: During treatment, incident radiation interacts and scatters from within the patient. The presented methodology forms an image of patient anatomy from the scattered radiation for real-time localization of the treatment target. A radiographic flat panel-based pinhole camera provides spatial information regarding the origin of detected scattered radiation. An analytical model is developed, which provides a mathematical formalism for describing the scatter imaging system. Experimental scatter images are acquired by irradiating an object using a Varian TrueBeam accelerator. The differentiation between tissue types is investigated by imaging simple objects of known compositions (water, lung, and cortical bone equivalent). A lung tumor phantom, simulating materials and geometry encountered during lung SBRT treatments, is fabricated and imaged to investigate image quality for various quantities of delivered radiation. Monte Carlo N-Particle (MCNP) code is used for validation and testing by simulating scatter image formation using the experimental pinhole camera setup. Results: Analytical calculations, MCNP simulations, and experimental results when imaging the water, lung, and cortical bone equivalent objects show close agreement, thus validating the proposed models and demonstrating that scatter imaging differentiates these materials well. Lung tumor phantom images have sufficient contrast-to-noise ratio (CNR) to clearly distinguish tumor from surrounding lung tissue. CNR=4.1 and CNR=29.1 for 10MU and 5000MU images (equivalent to 0.5 and 250 second images), respectively. Conclusion: Lung SBRT provides favorable treatment outcomes, but depends on accurate target localization. A
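
    The contrast-to-noise ratio quoted above can be computed from region-of-interest statistics. A minimal sketch, assuming the common definition CNR = |mean(tumor ROI) - mean(background ROI)| / std(background ROI); the ROI geometry and the synthetic image are placeholders, not the paper's measurement setup.

      import numpy as np

      def contrast_to_noise_ratio(image, tumor_mask, background_mask):
          """CNR between a tumor ROI and a background ROI in a single image."""
          tumor = image[tumor_mask]
          background = image[background_mask]
          return abs(tumor.mean() - background.mean()) / background.std()

      # Illustrative usage with a synthetic image containing a bright square "tumor".
      rng = np.random.default_rng(8)
      img = rng.normal(0.0, 1.0, size=(64, 64))
      img[20:30, 20:30] += 4.0
      tumor = np.zeros(img.shape, dtype=bool); tumor[20:30, 20:30] = True
      background = np.zeros(img.shape, dtype=bool); background[40:60, 40:60] = True
      print(contrast_to_noise_ratio(img, tumor, background))   # roughly 4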

  4. An approach to real-time magnetic resonance imaging for speech production

    NASA Astrophysics Data System (ADS)

    Narayanan, Shrikanth; Nayak, Krishna; Byrd, Dani; Lee, Sungbok

    2003-04-01

    Magnetic resonance imaging has served as a valuable tool for studying primarily static postures in speech production. Now, recent improvements in imaging techniques, particularly in temporal resolution, are making it possible to examine the dynamics of vocal tract shaping during speech. Examples include Mady et al. (2001, 2002) (8 images/second, T1 fast gradient echo) and Demolin et al. (2000) (4-5 images/second, ultra fast turbo spin echo sequence). The present study uses a non-2D-FFT acquisition strategy (spiral k-space trajectory) on a GE Signa 1.5T CV/i scanner with a low-flip angle spiral gradient echo originally developed for cardiac imaging [Kerr et al. (1997), Nayak et al. (2001)] with reconstruction rates of 8-10 images/second. The experimental stimuli included English sentences varying the syllable position of /n, r, l/ (spoken by 2 subjects) and Tamil sentences varying among five liquids (spoken by one subject). The imaging parameters were the following: 15 deg flip angle, 20 interleaves, 6.7 ms TR, 1.88 mm resolution over a 20 cm FOV, 5 mm slice thickness, and 2.4 ms spiral readouts. Data show clear real-time movements of the lips, tongue and velum. Sample movies and data analysis strategies will be presented. Segmental durations, positions, and inter-articulator timing can all be quantitatively evaluated. [Work supported by NIH.]

  5. Calibration Of Airborne Visible/IR Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Vane, G. A.; Chrien, T. G.; Miller, E. A.; Reimer, J. H.

    1990-01-01

    Paper describes laboratory spectral and radiometric calibration of Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) applied to all AVIRIS science data collected in 1987. Describes instrumentation and procedures used and demonstrates that calibration accuracy achieved exceeds design requirements. Developed for use in remote-sensing studies in such disciplines as botany, geology, hydrology, and oceanography.

  6. The 2nd Generation Real Time Mission Monitor (RTMM) Development

    NASA Technical Reports Server (NTRS)

    Blakeslee, Richard; Goodman, Michael; Meyer, Paul; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn; Conover, Helen; Smith, Tammy; Lu, Jessica

    2009-01-01

    The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. In order to improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow the end-user to easily configure RTMM applications based on their mission-specific requirements and objectives. This second-generation RTMM is being redesigned to take advantage of the Google plug-in capabilities to run multiple applications in a web browser rather than the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture such that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new technology software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will greatly increase flexibility of the user to choose which science data sets and support applications to view and/or use. The improvements brought about by the RTMM 2nd generation system will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more

  7. A real-time remote video streaming platform for ultrasound imaging.

    PubMed

    Ahmadi, Mehdi; Gross, Warren J; Kadoury, Samuel

    2016-08-01

    Ultrasound is a viable imaging technology in remote and resource-limited areas. Ultrasonography is a user-dependent skill that requires a high degree of training and hands-on experience. However, only a limited number of skillful sonographers are located in remote areas. In this work, we aim to develop a real-time video streaming platform which allows specialist physicians to remotely monitor ultrasound exams. To this end, an ultrasound stream is captured and transmitted through a wireless network to remote computers, smart-phones and tablets. In addition, the system is equipped with a camera to track the position of the ultrasound probe. The main advantage of our work is using an open source platform for video streaming, which gives us more control over streaming parameters than the available commercial products. The transmission delays of the system are evaluated for several ultrasound video resolutions, and the results show that ultrasound videos close to high-definition (HD) resolution can be received and displayed on an Android tablet with a delay of 0.5 seconds, which is acceptable for accurate real-time diagnosis.

  8. Tomographic Imaging of a Forested Area By Airborne Multi-Baseline P-Band SAR.

    PubMed

    Frey, Othmar; Morsdorf, Felix; Meier, Erich

    2008-09-24

    In recent years, various attempts have been undertaken to obtain information about the structure of forested areas from multi-baseline synthetic aperture radar data. Tomographic processing of such data has been demonstrated for airborne L-band data, but the quality of the focused tomographic images is limited by several factors. In particular, the common Fourier-based focusing methods are susceptible to irregular and sparse sampling, two problems that are unavoidable in the case of multi-pass, multi-baseline SAR data acquired by an airborne system. In this paper, a tomographic focusing method based on the time-domain back-projection algorithm is proposed, which maintains the geometric relationship between the original sensor positions and the imaged target and is therefore able to cope with irregular sampling without introducing any approximations with respect to the geometry. The tomographic focusing quality is assessed by analysing the impulse response of simulated point targets and an in-scene corner reflector. In particular, several tomographic slices of a volume representing a forested area are given. The respective P-band tomographic data set, consisting of eleven flight tracks, was acquired by the airborne E-SAR sensor of the German Aerospace Center (DLR).
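
    To make the time-domain back-projection idea concrete, the sketch below coherently accumulates phase-compensated contributions from several (possibly irregularly spaced) tracks at arbitrary voxel positions. It assumes range-focused complex profiles per track and an illustrative P-band centre frequency, uses nearest-sample lookup instead of interpolation, and omits all radiometric details of the actual processor.

    # Conceptual sketch of tomographic focusing by time-domain back-projection across
    # multiple flight tracks (heavily simplified; the centre frequency is an assumption).
    import numpy as np

    C = 3e8
    FC = 435e6                    # illustrative P-band centre frequency (Hz)
    WAVELENGTH = C / FC

    def tomographic_bp(profiles, track_positions, range_axis, voxels):
        """
        profiles        : list of 1-D complex range profiles, one per track
        track_positions : (K, 3) sensor positions, one per track (may be irregular)
        range_axis      : 1-D array of slant ranges of the samples in each profile
        voxels          : (V, 3) positions where reflectivity is to be estimated
        Returns a length-V complex reflectivity estimate.
        """
        image = np.zeros(len(voxels), dtype=complex)
        for profile, pos in zip(profiles, track_positions):
            r = np.linalg.norm(voxels - pos, axis=1)          # exact slant range per voxel
            idx = np.clip(np.searchsorted(range_axis, r), 0, len(profile) - 1)
            image += profile[idx] * np.exp(4j * np.pi * r / WAVELENGTH)   # phase compensation
        return image / len(profiles)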

  9. Real-time DNA Amplification and Detection System Based on a CMOS Image Sensor.

    PubMed

    Wang, Tiantian; Devadhasan, Jasmine Pramila; Lee, Do Young; Kim, Sanghyo

    2016-01-01

    In the present study, we developed a polypropylene well-integrated complementary metal oxide semiconductor (CMOS) platform to perform the loop-mediated isothermal amplification (LAMP) technique for real-time DNA amplification and detection simultaneously. An amplification-coupled detection system directly measures the photon number changes based on the generation of magnesium pyrophosphate and color changes. The photon number decreases during the amplification process. The CMOS image sensor observes the photons and converts them into digital units with the aid of an analog-to-digital converter (ADC). In addition, UV-spectral studies, optical color intensity detection, pH analysis, and electrophoresis detection were carried out to verify the efficiency of the CMOS sensor-based LAMP system. Moreover, Clostridium perfringens was utilized as a proof-of-concept target for the new system. We anticipate that this CMOS image sensor-based LAMP method will enable the creation of cost-effective, label-free, optical, real-time and portable molecular diagnostic devices.

  10. Small real time detection satellites for MDA using hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Nakaya, Daiki; Yanagida, Hiroki; Shin, Satori; Ito, Tomonori; Takeuchi, Yusuke

    2017-10-01

    Hyperspectral images are now used in fields such as agriculture, cosmetics, and space exploration, a spread driven by ongoing efforts toward miniaturization and cost reduction. This paper describes a low-cost, small Hyperspectral Camera (HSC) under development and a method of utilizing it. In the proposed real-time detection system for MDA (Maritime Domain Awareness), government agencies would fly these cameras on small satellites. The envisioned application is early detection of unidentified floating objects, such as disguised fishing ships and submarines.

  11. Real-time, label-free, intraoperative visualization of peripheral nerves and micro-vasculatures using multimodal optical imaging techniques

    PubMed Central

    Cha, Jaepyeong; Broch, Aline; Mudge, Scott; Kim, Kihoon; Namgoong, Jung-Man; Oh, Eugene; Kim, Peter

    2018-01-01

    Accurate, real-time identification and display of critical anatomic structures, such as the nerve and vasculature structures, are critical for reducing complications and improving surgical outcomes. Human vision is frequently limited in clearly distinguishing and contrasting these structures. We present a novel imaging system, which enables noninvasive visualization of critical anatomic structures during surgical dissection. Peripheral nerves are visualized by a snapshot polarimetry that calculates the anisotropic optical properties. Vascular structures, both venous and arterial, are identified and monitored in real-time using a near-infrared laser-speckle-contrast imaging. We evaluate the system by performing in vivo animal studies with qualitative comparison by contrast-agent-aided fluorescence imaging. PMID:29541506

  12. Tile-Image Merging and Delivering for Virtual Camera Services on Tiled-Display for Real-Time Remote Collaboration

    NASA Astrophysics Data System (ADS)

    Choe, Giseok; Nang, Jongho

    The tiled-display system has been used as a Computer Supported Cooperative Work (CSCW) environment, in which multiple local (and/or remote) participants cooperate using some shared applications whose outputs are displayed on a large-scale and high-resolution tiled-display, which is controlled by a cluster of PC's, one PC per display. In order to make the collaboration effective, each remote participant should be aware of all CSCW activities on the tiled-display system in real-time. This paper presents a capturing and delivering mechanism of all activities on the tiled-display system to remote participants in real-time. In the proposed mechanism, the screen images of all PC's are periodically captured and delivered to the Merging Server that maintains separate buffers to store the captured images from the PCs. The mechanism selects one tile image from each buffer, merges the images to make a screen shot of the whole tiled-display, clips a Region of Interest (ROI), compresses and streams it to remote participants in real-time. A technical challenge in the proposed mechanism is how to select a set of tile images, one from each buffer, for merging so that the tile images displayed at the same time on the tiled-display can be properly merged together. This paper presents three selection algorithms: a sequential selection algorithm, a capturing-time-based algorithm, and a capturing-time and visual-consistency-based algorithm. It also proposes a mechanism of providing several virtual cameras on the tiled-display system to remote participants by concurrently clipping several different ROI's from the same merged tiled-display images, and delivering them after compressing with video encoders requested by the remote participants. By interactively changing and resizing his/her own ROI, a remote participant can check the activities on the tiled-display effectively. Experiments on a 3 × 2 tiled-display system show that the proposed merging algorithm can build a tiled-display image
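
    A minimal sketch of the capturing-time-based selection idea follows: each tile buffer holds timestamped frames, and for every tile the newest frame not newer than the slowest tile's latest capture time is chosen, so the merged screenshot mixes frames that actually coexisted on the display. The buffer layout and names are assumptions for illustration, not the paper's implementation.

    # Illustrative capturing-time-based tile selection (hypothetical buffer layout).
    import bisect

    def select_tiles(buffers):
        """
        buffers: dict tile_id -> list of (capture_time, frame), sorted by capture_time
        Returns dict tile_id -> frame chosen for merging.
        """
        # Reference time: the newest capture time of the slowest (most lagging) tile.
        t_ref = min(buf[-1][0] for buf in buffers.values() if buf)
        selected = {}
        for tile_id, buf in buffers.items():
            if not buf:
                continue
            times = [t for t, _ in buf]
            i = bisect.bisect_right(times, t_ref) - 1     # newest frame captured <= t_ref
            selected[tile_id] = buf[max(i, 0)][1]
        return selected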

  13. Real-time 3D ultrasound imaging of infant tongue movements during breast-feeding.

    PubMed

    Burton, Pat; Deng, Jing; McDonald, Daren; Fewtrell, Mary S

    2013-09-01

    Whether infants use suction or peristaltic tongue movements or a combination to extract milk during breast-feeding is controversial. The aims of this pilot study were 1] to evaluate the feasibility of using 3D ultrasound scanning to visualise infant tongue movements; and 2] to ascertain whether peristaltic tongue movements could be demonstrated during breast-feeding. 15 healthy term infants, aged 2 weeks to 4 months were scanned during breast-feeding, using a real-time 3D ultrasound system, with a 7 MHz transducer placed sub-mentally. 1] The method proved feasible, with 72% of bi-plane datasets and 56% of real-time 3D datasets providing adequate coverage [>75%] of the infant tongue. 2] Peristaltic tongue movement was observed in 13 of 15 infants [83%] from real-time or reformatted truly mid-sagittal views under 3D guidance. This is the first study to demonstrate the feasibility of using 3D ultrasound to visualise infant tongue movements during breast-feeding. Peristaltic infant tongue movement was present in the majority of infants when the image plane was truly mid-sagittal but was not apparent if the image was slightly off the mid-sagittal plane. This should be considered in studies investigating the relative importance of vacuum and peristalsis for milk transfer. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Real-Time Two-Dimensional Magnetic Particle Imaging for Electromagnetic Navigation in Targeted Drug Delivery.

    PubMed

    Le, Tuan-Anh; Zhang, Xingming; Hoshiar, Ali Kafash; Yoon, Jungwon

    2017-09-07

    Magnetic nanoparticles (MNPs) are effective drug carriers. By using electromagnetic actuated systems, MNPs can be controlled noninvasively in a vascular network for targeted drug delivery (TDD). Although drugs can reach their target location through capturing schemes of MNPs by permanent magnets, drugs delivered to non-target regions can affect healthy tissues and cause undesirable side effects. Real-time monitoring of MNPs can improve the targeting efficiency of TDD systems. In this paper, a two-dimensional (2D) real-time monitoring scheme has been developed for an MNP guidance system. Resovist particles 45 to 65 nm in diameter (5 nm core) can be monitored in real-time (update rate = 2 Hz) in 2D. The proposed 2D monitoring system allows dynamic tracking of MNPs during TDD and renders magnetic particle imaging-based navigation more feasible.

  15. Research on intelligent scenic security early warning platform based on high resolution image: real scene linkage and real-time LBS

    NASA Astrophysics Data System (ADS)

    Li, Baishou; Huang, Yu; Lan, Guangquan; Li, Tingting; Lu, Ting; Yao, Mingxing; Luo, Yuandan; Li, Boxiang; Qian, Yongyou; Gao, Yujiu

    2015-12-01

    This paper designs and implements a security monitoring system for tourists within a scenic area. Scenic-area staff can automatically perceive and monitor visitors in real time, while visitors can determine their own location within the area in real time and obtain 3D imaging of the scenic area. The early-warning function can also maintain a "parent-child" association, helping prevent elderly people and children from getting lost or wandering. The results provide a theoretical basis and practical reference for building an effective security early-warning platform and for the further development of virtual reality.

  16. In vivo real-time cavitation imaging in moving organs

    NASA Astrophysics Data System (ADS)

    Arnal, B.; Baranger, J.; Demene, C.; Tanter, M.; Pernot, M.

    2017-02-01

    The stochastic nature of cavitation requires visualization of the cavitation cloud in real time and in a discriminative manner for the safe use of focused ultrasound therapy. This visualization is sometimes possible with standard echography, but it strongly depends on the quality of the scanner and is hindered by the difficulty of discriminating the cloud from highly reflecting tissue signals in different organs. A specific approach would therefore permit clear validation of the cavitation position and activity. Detecting signals from a specific source with high sensitivity is a major problem in ultrasound imaging. Based on plane or diverging wave sonications, ultrafast ultrasonic imaging dramatically increases temporal resolution, and the larger amount of acquired data permits increased sensitivity in Doppler imaging. Here, we investigate a spatiotemporal singular value decomposition of ultrafast radiofrequency data to discriminate bubble clouds from tissue based on their different spatiotemporal motion and echogenicity during histotripsy. We introduce an automated way to determine the parameters of this filtering. This method clearly outperforms standard temporal filtering techniques, with a bubble-to-tissue contrast of at least 20 dB in vitro in a moving phantom and in vivo in porcine liver.
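
    The core of the spatiotemporal singular value decomposition filter can be sketched in a few lines of NumPy: the ultrafast frame stack is reshaped into a space-time (Casorati) matrix and the first singular components, dominated by tissue, are discarded. The fixed rejection threshold below is an assumption made for illustration; the paper describes an automated choice of this parameter.

    # Minimal spatiotemporal SVD clutter filter for a stack of beamformed frames
    # of shape (nz, nx, nt). The number of rejected components is fixed here only
    # for illustration; the automated threshold selection is not reproduced.
    import numpy as np

    def svd_cavitation_filter(frames, n_reject=5):
        nz, nx, nt = frames.shape
        casorati = frames.reshape(nz * nx, nt)              # space x time matrix
        u, s, vh = np.linalg.svd(casorati, full_matrices=False)
        s_filtered = s.copy()
        s_filtered[:n_reject] = 0.0                         # drop the tissue-dominated subspace
        return ((u * s_filtered) @ vh).reshape(nz, nx, nt)

    # Example with random data standing in for an acquisition:
    rng = np.random.default_rng(0)
    stack = rng.standard_normal((64, 64, 200))
    bubble_signal = svd_cavitation_filter(stack, n_reject=5)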

  17. High Resolution Near Real Time Image Processing and Support for MSSS Modernization

    DTIC Science & Technology

    2012-09-01

    Report documentation excerpt (reporting period 2012). The available fragments describe the current CONOPS for PCID/ASPIRE high-resolution post-processing (depicted in Fig. 4) and note that experiments were performed and subsequently addressed in papers and presentations that demonstrated system behavior.

  18. Review of Real-Time 3-Dimensional Image Guided Radiation Therapy on Standard-Equipped Cancer Radiation Therapy Systems: Are We at the Tipping Point for the Era of Real-Time Radiation Therapy?

    PubMed

    Keall, Paul J; Nguyen, Doan Trang; O'Brien, Ricky; Zhang, Pengpeng; Happersett, Laura; Bertholet, Jenny; Poulsen, Per R

    2018-04-14

    To review real-time 3-dimensional (3D) image guided radiation therapy (IGRT) on standard-equipped cancer radiation therapy systems, focusing on clinically implemented solutions. Three groups in 3 continents have clinically implemented novel real-time 3D IGRT solutions on standard-equipped linear accelerators. These technologies encompass kilovoltage, combined megavoltage-kilovoltage, and combined kilovoltage-optical imaging. The cancer sites treated span pelvic and abdominal tumors for which respiratory motion is present. For each method the 3D-measured motion during treatment is reported. After treatment, dose reconstruction was used to assess the treatment quality in the presence of motion with and without real-time 3D IGRT. The geometric accuracy was quantified through phantom experiments. A literature search was conducted to identify additional real-time 3D IGRT methods that could be clinically implemented in the near future. The real-time 3D IGRT methods were successfully clinically implemented and have been used to treat more than 200 patients. Systematic target position shifts were observed using all 3 methods. Dose reconstruction demonstrated that the delivered dose is closer to the planned dose with real-time 3D IGRT than without real-time 3D IGRT. In addition, compromised target dose coverage and variable normal tissue doses were found without real-time 3D IGRT. The geometric accuracy results with real-time 3D IGRT had a mean error of <0.5 mm and a standard deviation of <1.1 mm. Numerous additional articles exist that describe real-time 3D IGRT methods using standard-equipped radiation therapy systems that could also be clinically implemented. Multiple clinical implementations of real-time 3D IGRT on standard-equipped cancer radiation therapy systems have been demonstrated. Many more approaches that could be implemented were identified. These solutions provide a pathway for the broader adoption of methods to make radiation therapy more accurate

  19. Real-time imaging for cerebral ischemia in rats using the multi-wavelength handheld photoacoustic system

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Hang; Xu, Yu; Chan, Kim Chuan; Mehta, Kalpesh; Thakor, Nitish; Liao, Lun-De

    2017-02-01

    Stroke is the second leading cause of death worldwide. Rapid and precise diagnosis is essential to expedite clinical decisions and improve functional outcomes in stroke patients; therefore, real-time imaging plays an important role in providing crucial information for post-stroke recovery analysis. In this study, based on a multi-wavelength laser and an 18.5 MHz array-based ultrasound platform, a real-time handheld photoacoustic (PA) system was developed to evaluate cerebrovascular functions pre- and post-stroke in rats. Using this system, hemodynamic information such as cerebral blood volume (CBV) can be acquired for assessment. One rat stroke model (i.e., photothrombotic ischemia (PTI)) was employed for evaluating the effect of local ischemia. To achieve better intrinsic PA contrast, Vantage and COMSOL simulations were applied to optimize the light delivery (e.g., the interval between the two arms) from the customized fiber bundle, while a phantom experiment was conducted to evaluate the imaging performance of the system. Results of the phantom experiment showed that hairs (~150 μm diameter) and pencil lead (500 μm diameter) can be imaged clearly. In addition, results of in vivo experiments demonstrated that stroke symptoms can be observed in the PTI model post-stroke. In the near future, with the help of a PA-specific contrast agent, the system would be able to achieve blood-brain barrier leakage imaging post-stroke. Overall, the real-time handheld PA system holds great potential in disease models involving impairments of cerebrovascular function.

  20. Real-time volumetric relative dosimetry for magnetic resonance—image-guided radiation therapy (MR-IGRT)

    NASA Astrophysics Data System (ADS)

    Lee, Hannah J.; Kadbi, Mo; Bosco, Gary; Ibbott, Geoffrey S.

    2018-02-01

    The integration of magnetic resonance imaging (MRI) with linear accelerators (linac) has enabled the use of 3D MR-visible gel dosimeters for real-time verification of volumetric dose distributions. Several iron-based radiochromic 3D gels were created in-house then imaged and irradiated in a pre-clinical 1.5 T-7 MV MR-Linac. MR images were acquired using a range of balanced-fast field echo (b-FFE) sequences during irradiation to assess the contrast and dose response in irradiated regions and to minimize the presence of MR artifacts. Out of four radiochromic 3D gel formulations, the FOX 3D gel was found to provide superior MR contrast in the irradiated regions. The FOX gels responded linearly with respect to real-time dose and the signal remained stable post-irradiation for at least 20 min. The response of the FOX gel also was found to be unaffected by the radiofrequency and gradient fields created by the b-FFE sequence during irradiation. A reusable version of the FOX gel was used for b-FFE sequence optimization to reduce artifacts by increasing the number of averages at the expense of temporal resolution. Regardless of the real-time MR sequence used, the FOX 3D gels responded linearly to dose with minimal magnetic field effects due to the strong 1.5 T field or gradient fields present during imaging. These gels can easily be made in-house using non-reusable and reusable formulations depending on the needs of the clinic, and the results of this study encourage further applications of 3D gels for MR-IGRT applications.

  1. Real-time Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Rahman, Zia-Ur; Jobson, Daniel J.; Woodell, Glenn A.; Harrah, Steven D.

    2005-01-01

    Flying in poor visibility conditions, such as rain, snow, fog or haze, is inherently dangerous. However these conditions can occur at nearly any location, so inevitably pilots must successfully navigate through them. At NASA Langley Research Center (LaRC), under support of the Aviation Safety and Security Program Office and the Systems Engineering Directorate, we are developing an Enhanced Vision System (EVS) that combines image enhancement and synthetic vision elements to assist pilots flying through adverse weather conditions. This system uses a combination of forward-looking infrared and visible sensors for data acquisition. A core function of the system is to enhance and fuse the sensor data in order to increase the information content and quality of the captured imagery. These operations must be performed in real-time for the pilot to use while flying. For image enhancement, we are using the LaRC patented Retinex algorithm since it performs exceptionally well for improving low-contrast range imagery typically seen during poor visibility conditions. In general, real-time operation of the Retinex requires specialized hardware. To date, we have successfully implemented a single-sensor real-time version of the Retinex on several different Digital Signal Processor (DSP) platforms. In this paper we give an overview of the EVS and its performance requirements for real-time enhancement and fusion and we discuss our current real-time Retinex implementations on DSPs.
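
    For orientation only, a generic single-scale Retinex is sketched below: the logarithm of the image minus the logarithm of a Gaussian-blurred surround, followed by rescaling for display. This is not the LaRC patented multiscale Retinex nor its real-time DSP implementation; it merely illustrates the class of operation being accelerated.

    # Generic single-scale Retinex sketch (illustrative only; not the LaRC algorithm).
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def single_scale_retinex(image, sigma=80.0, eps=1e-6):
        img = image.astype(np.float64) + eps
        surround = gaussian_filter(img, sigma=sigma) + eps   # large-scale illumination estimate
        r = np.log(img) - np.log(surround)                   # reflectance-like component
        r = (r - r.min()) / (r.max() - r.min() + eps)        # rescale for 8-bit display
        return (255.0 * r).astype(np.uint8)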

  2. Real-time enhanced vision system

    NASA Astrophysics Data System (ADS)

    Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.; Harrah, Steven D.

    2005-05-01

    Flying in poor visibility conditions, such as rain, snow, fog or haze, is inherently dangerous. However these conditions can occur at nearly any location, so inevitably pilots must successfully navigate through them. At NASA Langley Research Center (LaRC), under support of the Aviation Safety and Security Program Office and the Systems Engineering Directorate, we are developing an Enhanced Vision System (EVS) that combines image enhancement and synthetic vision elements to assist pilots flying through adverse weather conditions. This system uses a combination of forward-looking infrared and visible sensors for data acquisition. A core function of the system is to enhance and fuse the sensor data in order to increase the information content and quality of the captured imagery. These operations must be performed in real-time for the pilot to use while flying. For image enhancement, we are using the LaRC patented Retinex algorithm since it performs exceptionally well for improving low-contrast range imagery typically seen during poor visibility conditions. In general, real-time operation of the Retinex requires specialized hardware. To date, we have successfully implemented a single-sensor real-time version of the Retinex on several different Digital Signal Processor (DSP) platforms. In this paper we give an overview of the EVS and its performance requirements for real-time enhancement and fusion and we discuss our current real-time Retinex implementations on DSPs.

  3. Automatic image fusion of real-time ultrasound with computed tomography images: a prospective comparison between two auto-registration methods.

    PubMed

    Cha, Dong Ik; Lee, Min Woo; Kim, Ah Yeong; Kang, Tae Wook; Oh, Young-Taek; Jeong, Ja-Yeon; Chang, Jung-Woo; Ryu, Jiwon; Lee, Kyong Joon; Kim, Jaeil; Bang, Won-Chul; Shin, Dong Kuk; Choi, Sung Jin; Koh, Dalkwon; Seo, Bong Koo; Kim, Kyunga

    2017-11-01

    Background A major drawback of conventional manual image fusion is that the process may be complex, especially for less-experienced operators. Recently, two automatic image fusion techniques called Positioning and Sweeping auto-registration have been developed. Purpose To compare the accuracy and required time for image fusion of real-time ultrasonography (US) and computed tomography (CT) images between Positioning and Sweeping auto-registration. Material and Methods Eighteen consecutive patients referred for planning US for radiofrequency ablation or biopsy for focal hepatic lesions were enrolled. Image fusion using both auto-registration methods was performed for each patient. Registration error, time required for image fusion, and number of point locks used were compared using the Wilcoxon signed rank test. Results Image fusion was successful in all patients. Positioning auto-registration was significantly faster than Sweeping auto-registration for both initial (median, 11 s [range, 3-16 s] vs. 32 s [range, 21-38 s]; P < 0.001) and complete (median, 34.0 s [range, 26-66 s] vs. 47.5 s [range, 32-90 s]; P = 0.001) image fusion. Registration error of Positioning auto-registration was significantly higher for initial image fusion (median, 38.8 mm [range, 16.0-84.6 mm] vs. 18.2 mm [range, 6.7-73.4 mm]; P = 0.029), but not for complete image fusion (median, 4.75 mm [range, 1.7-9.9 mm] vs. 5.8 mm [range, 2.0-13.0 mm]; P = 0.338). Number of point locks required to refine the initially fused images was significantly higher with Positioning auto-registration (median, 2 [range, 2-3] vs. 1 [range, 1-2]; P = 0.012). Conclusion Positioning auto-registration offers faster image fusion between real-time US and pre-procedural CT images than Sweeping auto-registration. The final registration error is similar between the two methods.

  4. Color reproduction and processing algorithm based on real-time mapping for endoscopic images.

    PubMed

    Khan, Tareq H; Mohammed, Shahed K; Imtiaz, Mohammad S; Wahid, Khan A

    2016-01-01

    In this paper, we present a real-time preprocessing algorithm for enhancing endoscopic images. A novel dictionary-based color mapping algorithm is used for reproducing the color information from a theme image. The theme image is selected from a nearby anatomical location. A database of color endoscopy images for different locations is prepared for this purpose. The color map is dynamic, as its contents change with the change of the theme image. This method is used on low-contrast grayscale white-light images and raw narrow-band images to highlight the vascular and mucosa structures and to colorize the images. It can also be applied to enhance the tone of color images. Statistical visual representations and universal image quality measures show that the proposed method can highlight the mucosa structure compared to other methods. The color similarity has been verified using Delta E color difference, structure similarity index, mean structure similarity index and structure and hue similarity. The color enhancement was measured using a color enhancement factor, which shows considerable improvements. The proposed algorithm has low and linear time complexity, which results in higher execution speed than other related works.

  5. Real-time three-dimensional imaging of epidermal splitting and removal by high-definition optical coherence tomography.

    PubMed

    Boone, Marc; Draye, Jean Pierre; Verween, Gunther; Pirnay, Jean-Paul; Verbeken, Gilbert; De Vos, Daniel; Rose, Thomas; Jennes, Serge; Jemec, Gregor B E; Del Marmol, Véronique

    2014-10-01

    While real-time 3-D evaluation of human skin constructs is needed, only 2-D non-invasive imaging techniques are available. The aim of this paper is to evaluate the potential of high-definition optical coherence tomography (HD-OCT) for real-time 3-D assessment of the epidermal splitting and decellularization. Human skin samples were incubated with four different agents: Dispase II, NaCl 1 M, sodium dodecyl sulphate (SDS) and Triton X-100. Epidermal splitting, dermo-epidermal junction, acellularity and 3-D architecture of dermal matrices were evaluated by High-definition optical coherence tomography before and after incubation. Real-time 3-D HD-OCT assessment was compared with 2-D en face assessment by reflectance confocal microscopy (RCM). (Immuno) histopathology was used as control. HD-OCT imaging allowed real-time 3-D visualization of the impact of selected agents on epidermal splitting, dermo-epidermal junction, dermal architecture, vascular spaces and cellularity. RCM has a better resolution (1 μm) than HD-OCT (3 μm), permitting differentiation of different collagen fibres, but HD-OCT imaging has deeper penetration (570 μm) than RCM imaging (200 μm). Dispase II and NaCl treatments were found to be equally efficient in the removal of the epidermis from human split-thickness skin allografts. However, a different epidermal splitting level at the dermo-epidermal junction could be observed and confirmed by immunolabelling of collagen type IV and type VII. Epidermal splitting occurred at the level of the lamina densa with dispase II and above the lamina densa (in the lamina lucida) with NaCl. The 3-D architecture of dermal papillae and dermis was more affected by Dispase II on HD-OCT which corresponded with histopathologic (orcein staining) fragmentation of elastic fibres. With SDS treatment, the epidermal removal was incomplete as remnants of the epidermal basal cell layer remained attached to the basement membrane on the dermis. With Triton X-100 treatment

  6. A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging.

    PubMed

    Jiang, J; Hall, T J

    2007-07-07

    Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast that provides new diagnostically valuable information. Previous works (Hall et al 2003 Ultrasound Med. Biol. 29 427, Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is a column-based algorithm (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to a row-based algorithm (a row is defined by a line of data perpendicular to the ultrasound beam direction). Then, displacement estimates from its adjacent columns provide good guidance for motion tracking in a significantly reduced search region to reduce computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separated tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized in a Windows system as a stand-alone ANSI C++ program. Results of preliminary tests, using numerical and tissue-mimicking phantoms, and in vivo tissue data, suggest that high contrast strain images can be consistently obtained with frame rates (10 frames s(-1)) that exceed our previous methods.
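
    A much-simplified sketch of column-based, guided block matching follows: the displacement estimate from the previous column seeds a small search window in the current one, which keeps the search region (and cost) small. The real algorithm propagates outward from the centre of the ROI and processes columns in parallel; the window sizes and the single left-to-right pass here are illustrative assumptions.

    # Simplified column-based, guided block matching for axial displacement estimation.
    import numpy as np

    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
        return float((a * b).sum() / denom)

    def track_columns(pre, post, win=32, search=4):
        """pre, post: 2-D RF frames (samples x A-lines). Returns axial shifts in samples."""
        n_samp, n_cols = pre.shape
        rows = range(0, n_samp - win, win)
        disp = np.zeros((len(rows), n_cols))
        for col in range(n_cols):
            for ri, r0 in enumerate(rows):
                ref = pre[r0:r0 + win, col]
                guess = int(disp[ri, col - 1]) if col > 0 else 0   # guidance from previous column
                best_score, best_d = -2.0, 0
                for d in range(guess - search, guess + search + 1):
                    lo = r0 + d
                    if lo < 0 or lo + win > n_samp:
                        continue
                    score = ncc(ref, post[lo:lo + win, col])
                    if score > best_score:
                        best_score, best_d = score, d
                disp[ri, col] = best_d
        return disp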

  7. A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging

    NASA Astrophysics Data System (ADS)

    Jiang, J.; Hall, T. J.

    2007-07-01

    Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast that provides new diagnostically valuable information. Previous works (Hall et al 2003 Ultrasound Med. Biol. 29 427, Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is a column-based algorithm (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to a row-based algorithm (a row is defined by a line of data perpendicular to the ultrasound beam direction). Then, displacement estimates from its adjacent columns provide good guidance for motion tracking in a significantly reduced search region to reduce computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separated tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized in a Windows® system as a stand-alone ANSI C++ program. Results of preliminary tests, using numerical and tissue-mimicking phantoms, and in vivo tissue data, suggest that high contrast strain images can be consistently obtained with frame rates (10 frames s-1) that exceed our previous methods.

  8. Cherenkov Video Imaging Allows for the First Visualization of Radiation Therapy in Real Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarvis, Lesley A., E-mail: Lesley.a.jarvis@hitchcock.org; Norris Cotton Cancer Center at the Dartmouth-Hitchcock Medical Center, Lebanon, New Hampshire; Zhang, Rongxiao

    Purpose: To determine whether Cherenkov light imaging can visualize radiation therapy in real time during breast radiation therapy. Methods and Materials: An intensified charge-coupled device (CCD) camera was synchronized to the 3.25-μs radiation pulses of the clinical linear accelerator with the intensifier set at ×100. Cherenkov images were acquired continuously (2.8 frames/s) during fractionated whole breast irradiation, with each frame an accumulation of 100 radiation pulses (approximately 5 monitor units). Results: The first patient images ever created are used to illustrate that Cherenkov emission can be visualized as a video during conditions typical for breast radiation therapy, even with complex treatment plans, mixed energies, and modulated treatment fields. Images were generated correlating to the superficial dose received by the patient and potentially the location of the resulting skin reactions. Major blood vessels are visible in the image, providing the potential to use these as biological landmarks for improved geometric accuracy. The potential for this system to detect radiation therapy misadministrations, which can result from hardware malfunction or patient positioning setup errors during individual fractions, is shown. Conclusions: Cherenkoscopy is a unique method for visualizing surface dose, resulting in real-time quality control. We propose that this system could detect radiation therapy errors in everyday clinical practice at a time when these errors can be corrected to result in improved safety and quality of radiation therapy.

  9. Real time magnetic resonance guided endomyocardial local delivery

    PubMed Central

    Corti, R; Badimon, J; Mizsei, G; Macaluso, F; Lee, M; Licato, P; Viles-Gonzalez, J F; Fuster, V; Sherman, W

    2005-01-01

    Objective: To investigate the feasibility of targeting various areas of left ventricle myocardium under real time magnetic resonance (MR) imaging with a customised injection catheter equipped with a miniaturised coil. Design: A needle injection catheter with a mounted resonant solenoid circuit (coil) at its tip was designed and constructed. A 1.5 T MR scanner with customised real time sequence combined with in-room scan running capabilities was used. With this system, various myocardial areas within the left ventricle were targeted and injected with a gadolinium-diethylenetriaminepentaacetic acid (DTPA) and Indian ink mixture. Results: Real time sequencing at 10 frames/s allowed clear visualisation of the moving catheter and its transit through the aorta into the ventricle, as well as targeting of all ventricle wall segments without further image enhancement techniques. All injections were visualised by real time MR imaging and verified by gross pathology. Conclusion: The tracking device allowed real time in vivo visualisation of catheters in the aorta and left ventricle as well as precise targeting of myocardial areas. The use of this real time catheter tracking may enable precise and adequate delivery of agents for tissue regeneration. PMID:15710717

  10. Tablet disintegration studied by high-resolution real-time magnetic resonance imaging.

    PubMed

    Quodbach, Julian; Moussavi, Amir; Tammer, Roland; Frahm, Jens; Kleinebudde, Peter

    2014-01-01

    The present work employs recent advances in high-resolution real-time magnetic resonance imaging (MRI) to investigate the disintegration process of tablets containing disintegrants. A temporal resolution of 75 ms and a spatial resolution of 80 × 80 µm with a section thickness of only 600 µm were achieved. The histograms of MRI videos were quantitatively analyzed with MATLAB. The mechanisms of action of six commercially available disintegrants, the influence of relative tablet density, and the impact of disintegrant concentration were examined. Crospovidone seems to be the only disintegrant acting by a shape memory effect, whereas the others mainly swell. A higher relative density of tablets containing croscarmellose sodium leads to a more even distribution of water within the tablet matrix but hardly impacts the disintegration kinetics. Increasing the polacrilin potassium disintegrant concentration leads to a quicker and more thorough disintegration process. Real-time MRI emerges as a valuable tool to visualize and investigate the process of tablet disintegration.

  11. Diagnosis of Gastroesophageal Reflux Disease Using Real-time Magnetic Resonance Imaging

    PubMed Central

    Zhang, Shuo; Joseph, Arun A.; Gross, Lisa; Ghadimi, Michael; Frahm, Jens; Beham, Alexander W.

    2015-01-01

    A small angle (His angle) between the oesophagus and the fundus of the stomach is considered to act as a flap valve and anti-reflux barrier. A wide angle results in dysfunction of the oesophagogastric junction and subsequently in gastroesophageal reflux disease (GERD). Here, we used real-time magnetic resonance imaging (MRI) at 50 ms resolution (20 frames per second) in 12 volunteers and 12 patients with GERD to assess transport of pineapple juice through the oesophagogastric junction and reflux during Valsalva. We found that the intra-abdominal part of the oesophagus was bent towards the left side, resulting in an angle of 75.3° ± 17.4°, which was significantly larger during Valsalva (P = 0.017). Reflux and several underlying pathologies were detected in 11 out of 12 patients. Our data visualize oesophagogastric junction physiology and disprove the flap valve hypothesis. Further, non-invasive real-time MRI has considerable potential for the diagnosis of causative pathologies leading to GERD.

  12. Real-Time Two-Dimensional Magnetic Particle Imaging for Electromagnetic Navigation in Targeted Drug Delivery

    PubMed Central

    Le, Tuan-Anh; Zhang, Xingming; Hoshiar, Ali Kafash; Yoon, Jungwon

    2017-01-01

    Magnetic nanoparticles (MNPs) are effective drug carriers. By using electromagnetic actuated systems, MNPs can be controlled noninvasively in a vascular network for targeted drug delivery (TDD). Although drugs can reach their target location through capturing schemes of MNPs by permanent magnets, drugs delivered to non-target regions can affect healthy tissues and cause undesirable side effects. Real-time monitoring of MNPs can improve the targeting efficiency of TDD systems. In this paper, a two-dimensional (2D) real-time monitoring scheme has been developed for an MNP guidance system. Resovist particles 45 to 65 nm in diameter (5 nm core) can be monitored in real-time (update rate = 2 Hz) in 2D. The proposed 2D monitoring system allows dynamic tracking of MNPs during TDD and renders magnetic particle imaging-based navigation more feasible. PMID:28880220

  13. A DICOM Based Collaborative Platform for Real-Time Medical Teleconsultation on Medical Images.

    PubMed

    Maglogiannis, Ilias; Andrikos, Christos; Rassias, Georgios; Tsanakas, Panayiotis

    2017-01-01

    The paper deals with the design of a Web-based platform for real-time medical teleconsultation on medical images. The proposed platform combines the principles of heterogeneous Workflow Management Systems (WfMSs), the peer-to-peer networking architecture and the SPA (Single-Page Application) concept, to facilitate medical collaboration among healthcare professionals geographically distributed. The presented work leverages state-of-the-art features of the web to support peer-to-peer communication using the WebRTC (Web Real Time Communication) protocol and client-side data processing for creating an integrated collaboration environment. The paper discusses the technical details of implementation and presents the operation of the platform in practice along with some initial results.

  14. Airborne measurements in the infrared using FTIR-based imaging hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Puckrin, E.; Turcotte, C. S.; Lahaie, P.; Dubé, D.; Lagueux, P.; Farley, V.; Marcotte, F.; Chamberland, M.

    2009-09-01

    Hyperspectral ground mapping is being used to an ever-increasing extent for numerous applications in the military, geology and environmental fields. The different regions of the electromagnetic spectrum help produce information of differing nature. The visible, near-infrared and short-wave infrared radiation (400 nm to 2.5 μm) has been mostly used to analyze reflected solar light, while the mid-wave (3 to 5 μm) and long-wave (8 to 12 μm or thermal) infrared senses the self-emission of molecules directly, enabling the acquisition of data during night time. Push-broom dispersive sensors have been typically used for airborne hyperspectral mapping. However, extending the spectral range towards the mid-wave and long-wave infrared brings performance limitations due to the self-emission of the sensor itself. The Fourier-transform spectrometer technology has been extensively used in the infrared spectral range due to its high transmittance as well as throughput and multiplex advantages, thereby reducing the sensor self-emission problem. Telops has developed the Hyper-Cam, a rugged and compact infrared hyperspectral imager. The Hyper-Cam is based on the Fourier-transform technology, yielding high spectral resolution and enabling high-accuracy radiometric calibration. It provides passive signature measurement capability, with up to 320x256 pixels at spectral resolutions of up to 0.25 cm-1. The Hyper-Cam has been used on the ground in several field campaigns, including the demonstration of standoff chemical agent detection. More recently, the Hyper-Cam has been integrated into an airplane to provide airborne measurement capabilities. A special pointing module was designed to compensate for airplane attitude and forward motion. To our knowledge, the Hyper-Cam is the first commercial airborne hyperspectral imaging sensor based on Fourier-transform infrared technology. The first airborne measurements and some preliminary performance criteria for the Hyper-Cam are presented in

  15. Airborne measurements in the infrared using FTIR-based imaging hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Puckrin, E.; Turcotte, C. S.; Lahaie, P.; Dubé, D.; Farley, V.; Lagueux, P.; Marcotte, F.; Chamberland, M.

    2009-05-01

    Hyperspectral ground mapping is being used to an ever-increasing extent for numerous applications in the military, geology and environmental fields. The different regions of the electromagnetic spectrum help produce information of differing nature. The visible, near-infrared and short-wave infrared radiation (400 nm to 2.5 μm) has been mostly used to analyze reflected solar light, while the mid-wave (3 to 5 μm) and long-wave (8 to 12 μm or thermal) infrared senses the self-emission of molecules directly, enabling the acquisition of data during night time. Push-broom dispersive sensors have been typically used for airborne hyperspectral mapping. However, extending the spectral range towards the mid-wave and long-wave infrared brings performance limitations due to the self-emission of the sensor itself. The Fourier-transform spectrometer technology has been extensively used in the infrared spectral range due to its high transmittance as well as throughput and multiplex advantages, thereby reducing the sensor self-emission problem. Telops has developed the Hyper-Cam, a rugged and compact infrared hyperspectral imager. The Hyper-Cam is based on the Fourier-transform technology, yielding high spectral resolution and enabling high-accuracy radiometric calibration. It provides passive signature measurement capability, with up to 320x256 pixels at spectral resolutions of up to 0.25 cm-1. The Hyper-Cam has been used on the ground in several field campaigns, including the demonstration of standoff chemical agent detection. More recently, the Hyper-Cam has been integrated into an airplane to provide airborne measurement capabilities. A special pointing module was designed to compensate for airplane attitude and forward motion. To our knowledge, the Hyper-Cam is the first commercial airborne hyperspectral imaging sensor based on Fourier-transform infrared technology. The first airborne measurements and some preliminary performance criteria for the Hyper-Cam are presented in

  16. Real-Time 3d Reconstruction from Images Taken from AN Uav

    NASA Astrophysics Data System (ADS)

    Zingoni, A.; Diani, M.; Corsini, G.; Masini, A.

    2015-08-01

    We designed a method for creating 3D models of objects and areas from two aerial images acquired from a UAV. The models are generated automatically and in real-time, and consist of dense and true-colour reconstructions of the considered areas, which give the operator the impression of being physically present within the scene. The proposed method only needs a cheap compact camera, mounted on a small UAV. No additional instrumentation is necessary, so the costs are very limited. The method consists of two main parts: the design of the acquisition system and the 3D reconstruction algorithm. In the first part, the choices for the acquisition geometry and for the camera parameters are optimized, in order to yield the best performance. In the second part, a reconstruction algorithm extracts the 3D model from the two acquired images, maximizing the accuracy under the real-time constraint. A test was performed in monitoring a construction yard, obtaining very promising results. Highly realistic and easy-to-interpret 3D models of objects and areas of interest were produced in less than one second, with an accuracy of about 0.5 m. Given these characteristics, the designed method is suitable for video-surveillance, remote sensing and monitoring, especially in those applications that require intuitive and reliable information quickly, such as disaster monitoring, search and rescue, and area surveillance.
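
    For illustration of the kind of two-view computation involved (this is not the authors' algorithm), the sketch below derives a dense disparity map from a rectified aerial image pair with OpenCV block matching and converts it to metric depth via Z = f·B/d. The file names, focal length in pixels, and baseline are placeholders.

    # Generic two-view dense reconstruction sketch (placeholder inputs, not the paper's method).
    import cv2
    import numpy as np

    left = cv2.imread("frame_left.png", cv2.IMREAD_GRAYSCALE)     # assumed rectified pair
    right = cv2.imread("frame_right.png", cv2.IMREAD_GRAYSCALE)

    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right).astype(np.float32) / 16.0   # fixed-point -> pixels

    focal_px = 1200.0      # assumed focal length in pixels
    baseline_m = 8.0       # assumed distance between the two acquisition positions (m)

    valid = disparity > 0
    depth_m = np.zeros_like(disparity)
    depth_m[valid] = focal_px * baseline_m / disparity[valid]           # Z = f * B / d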

  17. Interfacing An Intelligent Decision-Maker To A Real-Time Control System

    NASA Astrophysics Data System (ADS)

    Evers, D. C.; Smith, D. M.; Staros, C. J.

    1984-06-01

    This paper discusses some of the practical aspects of implementing expert systems in a real-time environment. There is a conflict between the needs of a process control system and the computational load imposed by intelligent decision-making software. The computation required to manage a real-time control problem is primarily concerned with routine calculations which must be executed in real time. On most current hardware, non-trivial AI software should not be forced to operate under real-time constraints. In order for the system to work efficiently, the two processes must be separated by a well-defined interface. Although the precise nature of the task separation will vary with the application, the definition of the interface will need to follow certain fundamental principles in order to provide functional separation. This interface was successfully implemented in the expert scheduling software currently running the automated chemical processing facility at Lockheed-Georgia. Potential applications of this concept in the areas of airborne avionics and robotics will be discussed.

  18. Dedicated hardware processor and corresponding system-on-chip design for real-time laser speckle imaging.

    PubMed

    Jiang, Chao; Zhang, Hongyan; Wang, Jia; Wang, Yaru; He, Heng; Liu, Rui; Zhou, Fangyuan; Deng, Jialiang; Li, Pengcheng; Luo, Qingming

    2011-11-01

    Laser speckle imaging (LSI) is a noninvasive and full-field optical imaging technique which produces two-dimensional blood flow maps of tissues from the raw laser speckle images captured by a CCD camera without scanning. We present a hardware-friendly algorithm for the real-time processing of laser speckle imaging. The algorithm is developed and optimized specifically for LSI processing in the field programmable gate array (FPGA). Based on this algorithm, we designed a dedicated hardware processor for real-time LSI in FPGA. The pipeline processing scheme and parallel computing architecture are introduced into the design of this LSI hardware processor. When the LSI hardware processor is implemented in the FPGA running at the maximum frequency of 130 MHz, up to 85 raw images with the resolution of 640×480 pixels can be processed per second. Meanwhile, we also present a system on chip (SOC) solution for LSI processing by integrating the CCD controller, memory controller, LSI hardware processor, and LCD display controller into a single FPGA chip. This SOC solution also can be used to produce an application specific integrated circuit for LSI processing.
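
    The per-pixel quantity such an LSI pipeline computes is the spatial speckle contrast K = σ/μ over a small sliding window, commonly converted to a 1/K² flow index. The NumPy sketch below shows this computation in software; the window size and the flow index are conventional choices, not details of this particular FPGA processor.

    # Spatial laser speckle contrast K = sigma/mu over a sliding window (software reference).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_contrast(raw, window=7):
        raw = raw.astype(np.float64)
        mean = uniform_filter(raw, size=window)
        mean_sq = uniform_filter(raw * raw, size=window)
        var = np.maximum(mean_sq - mean * mean, 0.0)     # local variance, clipped at zero
        return np.sqrt(var) / (mean + 1e-9)

    def flow_index(k):
        return 1.0 / (k * k + 1e-9)                      # higher values ~ faster flow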

  19. Real-time imaging of the growth-inhibitory effect of JS399-19 on Fusarium.

    PubMed

    Wollenberg, Rasmus D; Donau, Søren S; Nielsen, Thorbjørn T; Sørensen, Jens L; Giese, Henriette; Wimmer, Reinhard; Søndergaard, Teis E

    2016-11-01

    Real-time imaging was used to study the effects of a novel Fusarium-specific cyanoacrylate fungicide (JS399-19) on growth and morphology of four Fusarium sp. This fungicide targets the motor domain of type I myosin. Fusarium graminearum PH-1, Fusarium solani f. sp. pisi 77-13-4, Fusarium avenaceum IBT8464, and Fusarium avenaceum 05001, which has a K216Q amino-acid substitution at the resistance-implicated site in its myosin type I motor domain, were analyzed. Real-time imaging shows that JS399-19 inhibits fungal growth but not to the extent previously reported. The fungicide causes the hypha to become entangled and unable to extend vertically. This implies that type I myosin in Fusarium is essential for hyphal and mycelia propagation. The K216Q substitution correlates with reduced susceptibility in F. avenaceum. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Comparative performance between compressed and uncompressed airborne imagery

    NASA Astrophysics Data System (ADS)

    Phan, Chung; Rupp, Ronald; Agarwal, Sanjeev; Trang, Anh; Nair, Sumesh

    2008-04-01

    The US Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD), Countermine Division is evaluating the compressibility of airborne multi-spectral imagery for mine and minefield detection applications. Of particular interest is the highest image compression rate that can be afforded without degrading image quality for war fighters in the loop or the performance of the near real-time mine detection algorithm. The JPEG-2000 compression standard is used to perform data compression. Both lossless and lossy compressions are considered. A multi-spectral anomaly detector such as RX (Reed & Xiaoli), which is widely used as a core algorithm baseline in airborne mine and minefield detection on different mine types, minefields, and terrains to identify potential individual targets, is used to compare the mine detection performance. This paper presents the compression scheme and compares detection performance results between compressed and uncompressed imagery for various levels of compression. The compression efficiency is evaluated, and its dependence upon different backgrounds and other factors is documented and presented using multi-spectral data.
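
    The RX baseline named above is, in its global form, a Mahalanobis distance of each pixel spectrum from the scene mean. A minimal sketch follows; local-window RX variants and the study's exact evaluation protocol are not reproduced here.

    # Global RX anomaly detector: Mahalanobis distance of each pixel spectrum from the scene mean.
    import numpy as np

    def rx_detector(cube):
        """cube: (rows, cols, bands) multispectral image -> (rows, cols) RX scores."""
        rows, cols, bands = cube.shape
        x = cube.reshape(-1, bands).astype(np.float64)
        mu = x.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(x, rowvar=False))   # pseudo-inverse guards against singularity
        d = x - mu
        scores = np.einsum("ij,jk,ik->i", d, cov_inv, d)
        return scores.reshape(rows, cols)

    # e.g. compare rx_detector(original_cube) with rx_detector(decompressed_cube) to see
    # how a given compression level shifts the anomaly statistics used for mine detection.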

  1. Real-time blind image deconvolution based on coordinated framework of FPGA and DSP

    NASA Astrophysics Data System (ADS)

    Wang, Ze; Li, Hang; Zhou, Hua; Liu, Hongjun

    2015-10-01

    Image restoration plays a crucial role in several important application domains. As algorithms become much more complex and their computational requirements increase, there has been a significant rise in the need for accelerated implementations. In this paper, we focus on an efficient real-time image processing system for a blind iterative deconvolution method based on the Richardson-Lucy (R-L) algorithm. We study the characteristics of the algorithm, and an image restoration processing system based on the coordinated framework of FPGA and DSP (CoFD) is presented. Single-precision floating-point processing units with a small-scale cascade and special FFT/IFFT processing modules are adopted to guarantee the accuracy of the processing. Finally, comparative experiments were performed. The system can process a blurred image of 128×128 pixels within 32 milliseconds and is up to three or four times faster than traditional multi-DSP systems.
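
    The basic Richardson-Lucy update that such a system accelerates can be written compactly for a known PSF, as sketched below. The blind variant alternates this multiplicative update between the image and the PSF estimates; that alternation, and the FPGA/DSP partitioning, are omitted here.

    # Richardson-Lucy iteration for a known PSF (reference implementation, not the CoFD design).
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(blurred, psf, n_iter=30):
        blurred = blurred.astype(np.float64)
        estimate = np.full_like(blurred, blurred.mean())
        psf_flipped = psf[::-1, ::-1]
        for _ in range(n_iter):
            reblurred = fftconvolve(estimate, psf, mode="same")
            ratio = blurred / (reblurred + 1e-12)
            estimate *= fftconvolve(ratio, psf_flipped, mode="same")   # multiplicative update
        return estimate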

  2. The use of real-time ultrasound imaging for biofeedback of lumbar multifidus muscle contraction in healthy subjects.

    PubMed

    Van, Khai; Hides, Julie A; Richardson, Carolyn A

    2006-12-01

    Randomized controlled trial. To determine if the provision of visual biofeedback using real-time ultrasound imaging enhances the ability to activate the multifidus muscle. Increasingly clinicians are using real-time ultrasound as a form of biofeedback when re-educating muscle activation. The effectiveness of this form of biofeedback for the multifidus muscle has not been reported. Healthy subjects were randomly divided into groups that received different forms of biofeedback. All subjects received clinical instruction on how to activate the multifidus muscle isometrically prior to testing and verbal feedback regarding the amount of multifidus contraction, which occurred during 10 repetitions (acquisition phase). In addition, 1 group received visual biofeedback (watched the multifidus muscle contract) using real-time ultrasound imaging. All subjects were reassessed a week later (retention phase). Subjects from both groups improved their voluntary contraction of the multifidus muscle in the acquisition phase (P<.001) and the ability to recruit the multifidus muscle differed between groups (P<.05), with subjects in the group that received visual ultrasound biofeedback achieving greater improvements. In addition, the group that received visual ultrasound biofeedback retained their improvement in performance from week 1 to week 2 (P>.90), whereas the performance of the other group decreased (P<.05). Real-time ultrasound imaging can be used to provide visual biofeedback and improve performance and retention in the ability to activate the multifidus muscle in healthy subjects.

  3. FPGA implementation of image dehazing algorithm for real time applications

    NASA Astrophysics Data System (ADS)

    Kumar, Rahul; Kaushik, Brajesh Kumar; Balasubramanian, R.

    2017-09-01

    Weather degradation such as haze, fog, and mist severely reduces the effective range of visual surveillance. This degradation is a spatially varying phenomenon, which makes the problem non-trivial. Dehazing is an essential preprocessing stage in applications such as long-range imaging, border security, and intelligent transportation systems. However, these applications require low latency from the preprocessing block. In this work, the single-image dark channel prior algorithm is modified and implemented for fast processing with comparable visual quality of the restored image/video. Although the conventional single-image dark channel prior algorithm is computationally expensive, it yields impressive results. Moreover, a two-stage image dehazing architecture is introduced, wherein the dark channel and airlight are estimated in the first stage, while the transmission map and intensity restoration are computed in the subsequent stages. The algorithm is implemented using Xilinx Vivado software and validated on a Xilinx zc702 development board, which contains an Artix7-equivalent Field Programmable Gate Array (FPGA) and an ARM Cortex A9 dual-core processor. Additionally, a high-definition multimedia interface (HDMI) has been incorporated for video feed and display purposes. The results show that the dehazing algorithm attains 29 frames per second at a resolution of 1920x1080, which is suitable for real-time applications. The design utilizes 9 18K_BRAM, 97 DSP_48, 6508 FFs and 8159 LUTs.
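
    To make the two-stage split described above concrete, the sketch below implements the basic (non-streaming) dark channel prior pipeline: dark channel and airlight in stage one, transmission map and radiance recovery in stage two. The patch size and thresholds are conventional values, and none of the FPGA-specific modifications are reproduced.

    # Basic dark channel prior dehazing (software reference; no FPGA-specific changes).
    import numpy as np
    from scipy.ndimage import minimum_filter

    def dehaze(img, patch=15, omega=0.95, t0=0.1):
        """img: float RGB image with values in [0, 1]."""
        # Stage 1: dark channel and airlight estimation
        dark = minimum_filter(img.min(axis=2), size=patch)
        n_bright = max(int(dark.size * 0.001), 1)                     # brightest 0.1% of dark channel
        idx = np.unravel_index(np.argsort(dark, axis=None)[-n_bright:], dark.shape)
        airlight = img[idx].max(axis=0)                               # per-channel airlight

        # Stage 2: transmission map and radiance recovery
        norm = img / np.maximum(airlight, 1e-6)
        transmission = 1.0 - omega * minimum_filter(norm.min(axis=2), size=patch)
        t = np.clip(transmission, t0, 1.0)[..., None]
        return np.clip((img - airlight) / t + airlight, 0.0, 1.0)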

  4. Near Real-Time Automatic Marine Vessel Detection on Optical Satellite Images

    NASA Astrophysics Data System (ADS)

    Máttyus, G.

    2013-05-01

    Vessel monitoring and surveillance is important for maritime safety and security, environmental protection and border control. Ship monitoring systems based on Synthetic Aperture Radar (SAR) satellite images are operational. On SAR images, ships made of metal with sharp edges appear as bright dots and edges, so they can be well distinguished from the water. Since radar is independent of sunlight and can also acquire images in cloudy weather and rain, it provides a reliable service. Vessel detection from spaceborne optical images (VDSOI) can extend SAR-based systems by providing more frequent revisit times and overcoming some drawbacks of SAR images (e.g. lower spatial resolution, difficult human interpretation). Optical satellite images (OSI) can have a higher spatial resolution, thus enabling the detection of smaller vessels and enhancing vessel type classification. Human interpretation of an optical image is also easier than that of a SAR image. In this paper I present a rapid automatic vessel detection method which uses pattern recognition methods originally developed in the computer vision field. In the first step I train a binary classifier from image samples of vessels and background. The classifier uses simple features which can be calculated very fast. For detection, the classifier is slid across the image in various directions and scales. The detector has a cascade structure which rejects most of the background in the early stages, leading to faster execution. The detections are grouped together to avoid multiple detections. Finally, the position, size (i.e. length and width) and heading of the vessels are extracted from the contours of the vessels. The presented method is parallelized, thus it runs fast (in minutes for a 16000 × 16000 pixel image) on a multicore computer, enabling near real-time applications, e.g. one hour from image acquisition to end user.
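
    A schematic Python sketch of the multi-scale sliding-window cascade described above is given below; the (score_fn, threshold) stage interface, window size, step and scale set are hypothetical placeholders for the paper's fast features, and the grouping of overlapping hits is left out.

        def detect_vessels(image, stages, window=(24, 72), step=8, scales=(1, 2, 3)):
            # `stages` is a list of (score_fn, threshold) pairs ordered from cheapest to
            # most expensive; a window is rejected as background as soon as one stage
            # scores below its threshold, which is what keeps the cascade fast.
            h, w = window
            hits = []
            for s in scales:
                img = image[::s, ::s]  # crude subsampling stands in for a proper image pyramid
                for y in range(0, img.shape[0] - h + 1, step):
                    for x in range(0, img.shape[1] - w + 1, step):
                        patch = img[y:y + h, x:x + w]
                        if all(score(patch) >= thr for score, thr in stages):
                            hits.append((y * s, x * s, h * s, w * s))  # full-resolution coords
            return hits  # overlapping hits would still be grouped into single detections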

  5. A system for the real-time display of radar and video images of targets

    NASA Technical Reports Server (NTRS)

    Allen, W. W.; Burnside, W. D.

    1990-01-01

    Described here is a software and hardware system for the real-time display of radar and video images for use in a measurement range. The main purpose is to give the reader a clear idea of the software and hardware design and its functions. The system is designed around a Tektronix XD88-30 graphics workstation, used to display radar images superimposed on video images of the actual target. The system's purpose is to provide a platform for the analysis and documentation of radar images and their associated targets in a menu-driven, user-oriented environment.

  6. Real-time hyperspectral fluorescence imaging of pancreatic β-cell dynamics with the image mapping spectrometer

    PubMed Central

    Elliott, Amicia D.; Gao, Liang; Ustione, Alessandro; Bedard, Noah; Kester, Robert; Piston, David W.; Tkaczyk, Tomasz S.

    2012-01-01

    The development of multi-colored fluorescent proteins, nanocrystals and organic fluorophores, along with the resulting engineered biosensors, has revolutionized the study of protein localization and dynamics in living cells. Hyperspectral imaging has proven to be a useful approach for such studies, but this technique is often limited by low signal and insufficient temporal resolution. Here, we present an implementation of a snapshot hyperspectral imaging device, the image mapping spectrometer (IMS), which acquires full spectral information simultaneously from each pixel in the field without scanning. The IMS is capable of real-time signal capture from multiple fluorophores with high collection efficiency (∼65%) and image acquisition rate (up to 7.2 fps). To demonstrate the capabilities of the IMS in cellular applications, we have combined fluorescent protein (FP)-FRET and [Ca2+]i biosensors to measure simultaneously intracellular cAMP and [Ca2+]i signaling in pancreatic β-cells. Additionally, we have compared quantitatively the IMS detection efficiency with a laser-scanning confocal microscope. PMID:22854044

  7. Implementation of real-time nonuniformity correction with multiple NUC tables using FPGA in an uncooled imaging system

    NASA Astrophysics Data System (ADS)

    Oh, Gyong Jin; Kim, Lyang-June; Sheen, Sue-Ho; Koo, Gyou-Phyo; Jin, Sang-Hun; Yeo, Bo-Yeon; Lee, Jong-Ho

    2009-05-01

    This paper presents a real-time implementation of Non-Uniformity Correction (NUC). Two-point correction and one-point correction with a shutter were carried out in an uncooled imaging system intended for a missile application. To design a small, lightweight and high-speed imaging system for a missile system, an SoPC (System on a Programmable Chip) comprising an FPGA and a soft-core processor (MicroBlaze) was used. Real-time NUC and the generation of control signals are implemented in the FPGA. In addition, three different NUC tables were made to shorten the operating time and to reduce power consumption over a large range of environmental temperatures. The imaging system consists of optics and four electronics boards: a detector interface board, an analog-to-digital converter board, a detector signal generation board and a power supply board. To evaluate the imaging system, the NETD was measured. The NETD was less than 160 mK at three different environmental temperatures.
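
    The two-point correction itself reduces to a per-pixel gain and offset; the Python sketch below (software rather than firmware) shows how such a table could be built from uniform reference frames and how one of several tables might be selected by ambient temperature. Variable names and the table-selection rule are assumptions, not the paper's implementation.

        import numpy as np

        def build_nuc_table(cold_frames, hot_frames, t_cold, t_hot):
            # Per-pixel gain/offset from averaged frames of two uniform blackbody references.
            cold = cold_frames.mean(axis=0)
            hot = hot_frames.mean(axis=0)
            gain = (t_hot - t_cold) / (hot - cold)   # responsivity (gain) correction
            offset = t_cold - gain * cold            # fixed-pattern offset correction
            return gain, offset

        def apply_nuc(raw, tables, ambient):
            # Pick the table built closest to the current ambient temperature, then apply it.
            gain, offset = min(tables.items(), key=lambda kv: abs(kv[0] - ambient))[1]
            return gain * raw + offset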

  8. Real-time fluorescence imaging of the DNA damage repair response during mitosis.

    PubMed

    Miwa, Shinji; Yano, Shuya; Yamamoto, Mako; Matsumoto, Yasunori; Uehara, Fuminari; Hiroshima, Yukihiko; Toneri, Makoto; Murakami, Takashi; Kimura, Hiroaki; Hayashi, Katsuhiro; Yamamoto, Norio; Efimova, Elena V; Tsuchiya, Hiroyuki; Hoffman, Robert M

    2015-04-01

    The response to DNA damage during mitosis was visualized using real-time fluorescence imaging of focus formation by the DNA-damage repair (DDR) response protein 53BP1 linked to green fluorescent protein (GFP) (53BP1-GFP) in the MiaPaCa-2(Tet-On) pancreatic cancer cell line. To observe 53BP1-GFP foci during mitosis, MiaPaCa-2(Tet-On) 53BP1-GFP cells were imaged every 30 min by confocal microscopy. Time-lapse imaging demonstrated that 11.4 ± 2.1% of the mitotic MiaPaCa-2(Tet-On) 53BP1-GFP cells had increased focus formation over time. Non-mitotic cells did not have an increase in 53BP1-GFP focus formation over time. Some of the mitotic MiaPaCa-2(Tet-On) 53BP1-GFP cells with focus formation became apoptotic. The results of the present report suggest that DNA strand breaks occur during mitosis and undergo repair, which may cause some of the mitotic cells to enter apoptosis in a phenomenon possibly related to mitotic catastrophe. © 2014 Wiley Periodicals, Inc.

  9. Real-time near-IR imaging of laser-ablation crater evolution in dental enamel

    NASA Astrophysics Data System (ADS)

    Darling, Cynthia L.; Fried, Daniel

    2007-02-01

    We have shown that the enamel of the tooth is almost completely transparent near 1310-nm in the near-infrared and that near-IR (NIR) imaging has considerable potential for the optical discrimination of sound and demineralized tissue and for observing defects in the interior of the tooth. Lasers are now routinely used for many applications in dentistry, including the ablation of dental caries. The objective of this study was to test the hypothesis that real-time NIR imaging can be used to monitor laser ablation under varying conditions to assess peripheral thermal and transient-stress-induced damage and to measure the rate and efficiency of ablation. Moreover, NIR imaging may have considerable potential for monitoring the removal of demineralized areas of the tooth during cavity preparations. Sound human tooth sections of approximately 3-mm thickness were irradiated by a CO2 laser under varying conditions with and without a water spray. The incision area in the interior of each sample was imaged using a tungsten-halogen lamp with a band-pass filter centered at 1310-nm, combined with an InGaAs focal plane array and a NIR zoom microscope in transillumination. Due to the high transparency of enamel at 1310-nm, laser incisions were clearly visible to the dentin-enamel junction, and crack formation, dehydration and irreversible thermal changes were observed during ablation. This study showed that there is great potential for near-IR imaging to monitor laser-ablation events in real-time: to assess safe laser operating parameters by imaging thermal and stress-induced damage, to elucidate the mechanisms involved in ablation such as dehydration, and to monitor the removal of demineralized enamel.

  10. Strategies of statistical windows in PET image reconstruction to improve the user’s real time experience

    NASA Astrophysics Data System (ADS)

    Moliner, L.; Correcher, C.; Gimenez-Alventosa, V.; Ilisie, V.; Alvarez, J.; Sanchez, S.; Rodríguez-Alvarez, M. J.

    2017-11-01

    Nowadays, the increase in the computational power of modern computers, together with state-of-the-art reconstruction algorithms, makes it possible to obtain Positron Emission Tomography (PET) images in practically real time. These facts open the door to new applications such as radiopharmaceutical tracking inside the body or the use of PET for image-guided procedures, such as biopsy interventions, among others. This work is a proof of concept that aims to improve the user experience with real-time PET images. Fixed, incremental, overlapping, sliding and hybrid windows are the different statistical combinations of data blocks used to generate intermediate images in order to follow the path of the activity in the Field Of View (FOV). To evaluate these different combinations, a point source was placed in a dedicated breast PET device and moved along the FOV. These acquisitions are reconstructed according to the different statistical windows, resulting in a smoother transition of positions for the image reconstructions that use the sliding and hybrid windows.
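
    The statistical windows amount to different ways of grouping consecutive data blocks into the set reconstructed for each intermediate image; a minimal Python sketch of a sliding window is shown below, with the block length, width and stride as assumed parameters and reconstruct() standing in for whatever algorithm the scanner uses.

        def sliding_windows(blocks, width, stride):
            # Each yielded window of `width` consecutive data blocks is reconstructed into
            # one intermediate image; stride < width makes the windows overlap, smoothing
            # the apparent motion of the source between successive frames.
            for start in range(0, len(blocks) - width + 1, stride):
                yield blocks[start:start + width]

        # Example: short acquisition blocks combined into 4-block windows updated every block.
        # for window in sliding_windows(list_mode_blocks, width=4, stride=1):
        #     image = reconstruct(window)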

  11. Real-time, wide-area hyperspectral imaging sensors for standoff detection of explosives and chemical warfare agents

    NASA Astrophysics Data System (ADS)

    Gomer, Nathaniel R.; Tazik, Shawna; Gardner, Charles W.; Nelson, Matthew P.

    2017-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the detection and analysis of targets located within complex backgrounds. HSI can detect threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Unfortunately, current-generation HSI systems have size, weight, and power limitations that prohibit their use for field-portable and/or real-time applications. Current-generation systems commonly provide an inefficient area search rate, require close proximity to the target for screening, and/or are not capable of making real-time measurements. ChemImage Sensor Systems (CISS) is developing a variety of real-time, wide-field hyperspectral imaging systems that utilize shortwave infrared (SWIR) absorption and Raman spectroscopy. SWIR HSI sensors provide wide-area imagery at or near real-time detection speeds. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rates (due to small laser spot sizes) and lack of eye safety. SWIR HSI sensors have been integrated into mobile, robot-based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide a higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors, focusing on sensor design and detection results.

  12. Real-time embedded atmospheric compensation for long-range imaging using the average bispectrum speckle method

    NASA Astrophysics Data System (ADS)

    Curt, Petersen F.; Bodnar, Michael R.; Ortiz, Fernando E.; Carrano, Carmen J.; Kelmelis, Eric J.

    2009-02-01

    While imaging over long distances is critical to a number of security and defense applications, such as homeland security and launch tracking, current optical systems are limited in resolving power. This is largely a result of the turbulent atmosphere in the path between the region under observation and the imaging system, which can severely degrade captured imagery. There are a variety of post-processing techniques capable of recovering this obscured image information; however, the computational complexity of such approaches has prohibited real-time deployment and hampers the usability of these technologies in many scenarios. To overcome this limitation, we have designed and manufactured an embedded image processing system based on commodity hardware which can compensate for these atmospheric disturbances in real-time. Our system consists of a reformulation of the average bispectrum speckle method coupled with a high-end FPGA processing board, and employs modular I/O capable of interfacing with most common digital and analog video transport methods (composite, component, VGA, DVI, SDI, HD-SDI, etc.). By leveraging the custom, reconfigurable nature of the FPGA, we have achieved performance twenty times faster than a modern desktop PC, in a form-factor that is compact, low-power, and field-deployable.

  13. Novel ultrasonic real-time scanner featuring servo controlled transducers displaying a sector image.

    PubMed

    Matzuk, T; Skolnick, M L

    1978-07-01

    This paper describes a new real-time servo-controlled sector scanner that produces high resolution images and has functionally programmable features similar to phased array systems, but possesses the simplicity of design and low cost best achievable in a mechanical sector scanner. The unique feature is the transducer head, which contains a single moving part--the transducer--enclosed within a light-weight, hand-held, and vibration-free case. The frame rate, sector width and stop-action angle are all operator programmable. The frame rate can be varied from 12 to 30 frames s-1 and the sector width from 0 degrees to 60 degrees. Conversion from sector to time-motion (T/M) mode is instant, and two options are available: a freeze-position high-density T/M and a low-density T/M obtainable simultaneously during sector visualization. Unusual electronic features are automatic gain control, electronic recording of images on video tape in rf format, and the ability to post-process images during video playback to extract the T/M display and to change the time gain control (tgc) and image size.

  14. Real-Time Time-Frequency Two-Dimensional Imaging of Ultrafast Transient Signals in Solid-State Organic Materials

    PubMed Central

    Takeda, Jun; Ishida, Akihiro; Makishima, Yoshinori; Katayama, Ikufumi

    2010-01-01

    In this review, we demonstrate real-time time-frequency two-dimensional (2D) pump-probe imaging spectroscopy implemented on a single-shot basis and applicable to excited-state dynamics in solid-state organic and biological materials. Using this technique, we could successfully map ultrafast time-frequency 2D transient absorption signals of β-carotene in solid films over wide temporal and spectral ranges with a very short accumulation time of 20 ms per frame. The results obtained indicate the high potential of this technique as a powerful and unique spectroscopic tool for observing the ultrafast excited-state dynamics of organic and biological materials in the solid state, which undergo rapid photodegradation. PMID:22399879

  15. Imaging systems and algorithms to analyze biological samples in real-time using mobile phone microscopy.

    PubMed

    Shanmugam, Akshaya; Usmani, Mohammad; Mayberry, Addison; Perkins, David L; Holcomb, Daniel E

    2018-01-01

    Miniaturized imaging devices have pushed the boundaries of point-of-care imaging, but existing mobile-phone-based imaging systems do not exploit the full potential of smart phones. This work demonstrates the use of simple imaging configurations to deliver superior image quality and the ability to handle a wide range of biological samples. Results presented in this work are from analysis of fluorescent beads under fluorescence imaging, as well as helminth eggs and freshwater mussel larvae under white light imaging. To demonstrate versatility of the systems, real time analysis and post-processing results of the sample count and sample size are presented in both still images and videos of flowing samples.

  16. Preliminary evaluation of the airborne imaging spectrometer for vegetation analysis

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Woodcock, C. E.

    1984-01-01

    The primary goal of the project was to provide ground truth and manual interpretation of data from an experimental flight of the Airborne Infrared Spectrometer (AIS) for a naturally vegetated test site. Two field visits were made; one trip to note snow conditions and temporally related vegetation states at the time of the sensor overpass, and a second trip following acquisition of prints of the AIS images for field interpretation. Unfortunately, the ability to interpret the imagery was limited by the quality of the imagery due to the experimental nature of the sensor.

  17. Laser-Generated Ultrasonic Source for a Real-Time Dry-Contact Imaging System

    NASA Astrophysics Data System (ADS)

    Petculescu, G.; Zhou, Y.; Komsky, I.; Krishnaswamy, S.

    2006-03-01

    A laser-generated ultrasonic source, to be used with a real-time imaging device, was developed. The ultrasound is generated in the thermoelastic regime, in a composite layer composed of absorbing particles (carbon) and silicone rubber. The composite layer plays three roles: absorption, constriction and dry coupling. The central frequency of the generated pulse was controlled by varying the absorption depth of the generation layer. The maximum peak frequency obtained was 4 MHz. When additional constriction was applied to the composite layer, the amplitude of the generated signal increased further, due to the large thermal expansion coefficient of the silicone. Images were taken using the laser-generated ultrasonic source.

  18. Real-time image processing for non-contact monitoring of dynamic displacements using smartphone technologies

    NASA Astrophysics Data System (ADS)

    Min, Jae-Hong; Gelo, Nikolas J.; Jo, Hongki

    2016-04-01

    The smartphone application newly developed in this study, named RINO, allows measuring absolute dynamic displacements and processing them in real time using state-of-the-art smartphone technologies, such as a high-performance graphics processing unit (GPU), in addition to the already powerful CPU and memory, an embedded high-speed/high-resolution camera, and open-source computer vision libraries. A carefully designed color-patterned target and a user-adjustable crop filter enable accurate and fast image processing, allowing up to 240 fps for complete displacement calculation and real-time display. The performance of the developed smartphone application is experimentally validated, showing accuracy comparable to that of a conventional laser displacement sensor.
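
    A toy Python/NumPy sketch of the displacement step is given below: the target is located by colour thresholding within an already-cropped frame and its centroid shift is scaled to millimetres. The single-colour threshold, the reference centroid and the mm-per-pixel scale are assumptions; the actual app uses a designed colour pattern, a user-adjustable crop filter and GPU processing.

        import numpy as np

        def target_displacement(frame_rgb, ref_centroid, mm_per_pixel=0.5):
            # Segment a saturated red patch of the target (threshold values are assumed).
            r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
            mask = (r > 150) & (g < 90) & (b < 90)
            ys, xs = np.nonzero(mask)
            if xs.size == 0:
                return None                      # target not found in this frame
            centroid = np.array([xs.mean(), ys.mean()])
            return (centroid - ref_centroid) * mm_per_pixel   # displacement in mm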

  19. Detection of hidden objects using a real-time 3-D millimeter-wave imaging system

    NASA Astrophysics Data System (ADS)

    Rozban, Daniel; Aharon, Avihai; Levanon, Assaf; Abramovich, Amir; Yitzhaky, Yitzhak; Kopeika, N. S.

    2014-10-01

    Millimeter (mm) and sub-mm wavelengths, or the terahertz (THz) band, have several properties that motivate their use in imaging for security applications such as the recognition of hidden objects, dangerous materials and aerosols, imaging through walls as in hostage situations, and imaging in bad weather conditions. There is no known ionization hazard for biological tissue, and atmospheric degradation of THz radiation is relatively low for practical imaging distances. We recently developed a new technology for the detection of THz radiation. This technology is based on very inexpensive plasma neon indicator lamps, also known as Glow Discharge Detectors (GDDs), that can be used as very sensitive THz radiation detectors. Using them, we designed and constructed a Focal Plane Array (FPA) and obtained recognizable 2-dimensional THz images of both dielectric and metallic objects. Using THz waves, it is shown here that even concealed weapons made of dielectric material can be detected. An example is an image of a knife concealed inside a leather bag and also under heavy clothing. Three-dimensional imaging using radar methods can enhance those images, since it allows the isolation of the concealed objects from the body and from environmental clutter such as nearby furniture or other people. The GDDs enable direct heterodyning between the electric field of the target signal and the reference signal, eliminating the requirement for expensive mixers, sources, and Low Noise Amplifiers (LNAs). We expanded the ability of the FPA so that we are able to obtain recognizable 2-dimensional THz images in real time. We show here that THz detection of objects in three dimensions, using FMCW principles, is also applicable in real time. This imaging system is also shown to be capable of imaging objects at long range, allowing standoff detection of suspicious objects and humans from large distances.

  20. Delivery performance of conventional aircraft by terminal-area, time-based air traffic control: A real-time simulation evaluation

    NASA Technical Reports Server (NTRS)

    Credeur, Leonard; Houck, Jacob A.; Capron, William R.; Lohr, Gary W.

    1990-01-01

    A description and results are presented of a study to measure the performance and reaction of airline flight crews, in a full-workload DC-9 cockpit, flying in a real-time simulation of an air traffic control (ATC) concept called Traffic Intelligence for the Management of Efficient Runway-scheduling (TIMER). The experimental objectives were to verify earlier fast-time TIMER time-delivery precision results and to obtain data for the validation or refinement of existing computer models of pilot/airborne performance. Experimental data indicated a runway-threshold interarrival-time-error standard deviation in the range of 10.4 to 14.1 seconds. Other real-time system performance parameters measured include approach speeds, response time to controller turn instructions, bank angles employed, and ATC controller message delivery-time errors.

  1. Real-Time Mapping Spectroscopy on the Ground, in the Air, and in Space

    NASA Astrophysics Data System (ADS)

    Thompson, D. R.; Allwood, A.; Chien, S.; Green, R. O.; Wettergreen, D. S.

    2016-12-01

    Real-time data interpretation can benefit both remote in situ exploration and remote sensing. Basic analyses at the sensor can monitor instrument performance and reveal invisible science phenomena in real time. This promotes situational awareness for remote robotic explorers or campaign decision makers, enabling adaptive data collection, reduced downlink requirements, and coordinated multi-instrument observations. Fast analysis is ideal for mapping spectrometers providing unambiguous, quantitative geophysical measurements. This presentation surveys recent computational advances in real-time spectroscopic analysis for Earth science and planetary exploration. Spectral analysis at the sensor enables new operations concepts that significantly improve science yield. Applications include real-time detection of fugitive greenhouse emissions by airborne monitoring, real-time cloud screening and mineralogical mapping by orbital spectrometers, and adaptive measurement by the PIXL instrument on the Mars 2020 rover. Copyright 2016 California Institute of Technology. All Rights Reserved. We acknowledge support of the US Government, NASA, the Earth Science Division and Terrestrial Ecology program.

  2. Real-time photoacoustic and ultrasound dual-modality imaging system facilitated with graphics processing unit and code parallel optimization.

    PubMed

    Yuan, Jie; Xu, Guan; Yu, Yao; Zhou, Yu; Carson, Paul L; Wang, Xueding; Liu, Xiaojun

    2013-08-01

    Photoacoustic tomography (PAT) offers structural and functional imaging of living biological tissue with highly sensitive optical absorption contrast and excellent spatial resolution comparable to medical ultrasound (US) imaging. We report the development of a fully integrated PAT and US dual-modality imaging system, which performs signal scanning, image reconstruction, and display for both photoacoustic (PA) and US imaging all in a truly real-time manner. The back-projection (BP) algorithm for PA image reconstruction is optimized to reduce the computational cost and facilitate parallel computation on a state of the art graphics processing unit (GPU) card. For the first time, PAT and US imaging of the same object can be conducted simultaneously and continuously, at a real-time frame rate, presently limited by the laser repetition rate of 10 Hz. Noninvasive PAT and US imaging of human peripheral joints in vivo were achieved, demonstrating the satisfactory image quality realized with this system. Another experiment, simultaneous PAT and US imaging of contrast agent flowing through an artificial vessel, was conducted to verify the performance of this system for imaging fast biological events. The GPU-based image reconstruction software code for this dual-modality system is open source and available for download from http://sourceforge.net/projects/patrealtime.
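
    The back-projection step being parallelised is, at its core, a delay-and-sum over sensors; the serial Python/NumPy sketch below shows that core. Array shapes, units and the speed of sound are our assumptions, and the GPU kernel, interpolation and weighting of the actual system are omitted.

        import numpy as np

        def backproject(signals, fs, sensor_xy, grid_xy, c=1500.0):
            # signals: (n_sensors, n_samples) RF data sampled at fs [Hz];
            # sensor_xy, grid_xy: (n, 2) positions in metres; c: speed of sound [m/s].
            image = np.zeros(len(grid_xy))
            n_samples = signals.shape[1]
            for s, pos in enumerate(sensor_xy):
                dist = np.linalg.norm(grid_xy - pos, axis=1)   # sensor-to-pixel distances
                idx = np.round(dist / c * fs).astype(int)      # time of flight in samples
                valid = idx < n_samples
                image[valid] += signals[s, idx[valid]]         # delay-and-sum accumulation
            return image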

  3. Near Real-Time Georeference of Unmanned Aerial Vehicle Images for Post-Earthquake Response

    NASA Astrophysics Data System (ADS)

    Wang, S.; Wang, X.; Dou, A.; Yuan, X.; Ding, L.; Ding, X.

    2018-04-01

    The rapid collection of Unmanned Aerial Vehicle (UAV) remote sensing images plays an important role in quickly delivering disaster information and monitoring seriously damaged objects after an earthquake. However, for the hundreds of UAV images collected in one flight sortie, the traditional data processing methods are image stitching and three-dimensional reconstruction, which take one to several hours and slow the disaster response. If manual searching is employed instead, much more time is spent selecting images, and the selected images have no spatial reference. Therefore, a near-real-time rapid georeferencing method for UAV remote sensing disaster data is proposed in this paper. The UAV images are georeferenced using the position and attitude data collected by the UAV flight control system, and the georeferenced data are organized by means of the world file format developed by ESRI. The rapid georeferencing software for UAV images is written in C# using the Geospatial Data Abstraction Library (GDAL). The results show that up to one thousand UAV images can be georeferenced within one minute, which meets the demand of rapid disaster response and is of great value in disaster emergency applications.
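
    The world file itself is a six-line plain-text sidecar; the short Python sketch below (the paper's software is written in C#) writes one for a single image, with the pixel size, rotation terms and upper-left coordinates shown as placeholder values.

        def write_world_file(path, pixel_size, upper_left, rotation=(0.0, 0.0)):
            # ESRI world file lines: x pixel size, the two rotation terms, negative y
            # pixel size, and the map x/y of the centre of the upper-left pixel.
            values = [pixel_size, rotation[0], rotation[1], -pixel_size,
                      upper_left[0], upper_left[1]]
            with open(path, "w") as f:
                f.write("\n".join(f"{v:.10f}" for v in values) + "\n")

        # e.g. write_world_file("IMG_0001.jgw", 0.05, (356100.25, 3457894.75))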

  4. Fast interactive real-time volume rendering of real-time three-dimensional echocardiography: an implementation for low-end computers

    NASA Technical Reports Server (NTRS)

    Saracino, G.; Greenberg, N. L.; Shiota, T.; Corsi, C.; Lamberti, C.; Thomas, J. D.

    2002-01-01

    Real-time three-dimensional echocardiography (RT3DE) is an innovative cardiac imaging modality. However, partly due to lack of user-friendly software, RT3DE has not been widely accepted as a clinical tool. The object of this study was to develop and implement a fast and interactive volume renderer of RT3DE datasets designed for a clinical environment where speed and simplicity are not secondary to accuracy. Thirty-six patients (20 regurgitation, 8 normal, 8 cardiomyopathy) were imaged using RT3DE. Using our newly developed software, all 3D data sets were rendered in real-time throughout the cardiac cycle and assessment of cardiac function and pathology was performed for each case. The real-time interactive volume visualization system is user friendly and instantly provides consistent and reliable 3D images without expensive workstations or dedicated hardware. We believe that this novel tool can be used clinically for dynamic visualization of cardiac anatomy.

  5. Ultrasonic Phased Array Compressive Imaging in Time and Frequency Domain: Simulation, Experimental Verification and Real Application

    PubMed Central

    Bai, Zhiliang; Chen, Shili; Jia, Lecheng; Zeng, Zhoumo

    2018-01-01

    Embracing the fact that one can recover certain signals and images from far fewer measurements than traditional methods use, compressive sensing (CS) provides solutions to the huge amounts of data collected in phased array-based material characterization. This article describes how a CS framework can be utilized to effectively compress ultrasonic phased array images in the time and frequency domains. By projecting the image onto its Discrete Cosine Transform domain, a novel scheme was implemented to verify the potential of CS for data reduction, as well as to explore its reconstruction accuracy. The results from CIVA simulations indicate that both time- and frequency-domain CS can accurately reconstruct array images using fewer samples than the Nyquist theorem requires. For experimental verification with three types of artificial flaws, a considerable data reduction can be achieved with defects clearly preserved, but it is currently impossible to break the Nyquist limitation in the time domain. Fortunately, qualified recovery in the frequency domain makes this possible, marking a real breakthrough for phased array image reconstruction. As a case study, the proposed CS procedure is applied to the inspection of an engine cylinder cavity containing different pit defects, and the results show that orthogonal matching pursuit (OMP)-based CS guarantees the performance for real applications. PMID:29738452
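
    For reference, a minimal orthogonal matching pursuit recovery is sketched below in Python/NumPy; A is taken to be the product of the sensing matrix and the DCT dictionary, and the fixed sparsity level is an assumed stopping rule rather than the authors' exact criterion.

        import numpy as np

        def omp(A, y, n_nonzero):
            # Greedily select dictionary columns until `n_nonzero` coefficients are used.
            residual = y.copy()
            support = []
            x = np.zeros(A.shape[1])
            for _ in range(n_nonzero):
                support.append(int(np.argmax(np.abs(A.T @ residual))))      # best new atom
                coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)  # re-fit support
                residual = y - A[:, support] @ coeffs
            x[support] = coeffs
            return x   # sparse coefficients; the image is x mapped back through the DCT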

  6. Rapid Detection of Ceratocystis platani Inoculum by Quantitative Real-Time PCR Assay

    PubMed Central

    Ghelardini, Luisa; Belbahri, Lassaâd; Quartier, Marion; Santini, Alberto

    2013-01-01

    Ceratocystis platani is the causal agent of canker stain of plane trees, a lethal disease able to kill mature trees in one or two successive growing seasons. The pathogen is a quarantine organism and has a negative impact on anthropogenic and natural populations of plane trees. Contaminated sawdust produced during pruning and sanitation fellings can contribute to disease spread. The goal of this study was to design a rapid, real-time quantitative PCR assay to detect a C. platani airborne inoculum. Airborne inoculum traps (AITs) were placed in an urban setting in the city of Florence, Italy, where the disease was present. Primers and TaqMan minor groove binder (MGB) probes were designed to target cerato-platanin (CP) and internal transcribed spacer 2 (ITS2) genes. The detection limits of the assay were 0.05 pg/μl and 2 fg/μl of fungal DNA for CP and ITS, respectively. Pathogen detection directly from AITs demonstrated specificity and high sensitivity for C. platani, detecting DNA concentrations as low as 1.2 × 10−2 to 1.4 × 10−2 pg/μl, corresponding to ∼10 conidia per ml. Airborne inoculum traps were able to detect the C. platani inoculum within 200 m of the closest symptomatic infected plane tree. The combination of airborne trapping and real-time quantitative PCR assay provides a rapid and sensitive method for the specific detection of a C. platani inoculum. This technique may be used to identify the period of highest risk of pathogen spread in a site, thus helping disease management. PMID:23811499

  7. Long-Term Tracking of a Specific Vehicle Using Airborne Optical Camera Systems

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Rosenbaum, D.; Runge, H.; Cerra, D.; Mattyus, G.; Reinartz, P.

    2016-06-01

    In this paper we present two low-cost, airborne sensor systems capable of long-term vehicle tracking. Based on the properties of the sensors, a method for automatic real-time, long-term tracking of individual vehicles is presented. This combines detection and tracking of the vehicle in low-frame-rate image sequences and applies the lagged Cell Transmission Model (CTM) to handle longer tracking outages occurring in complex traffic situations, e.g. tunnels. The CTM uses the traffic conditions in the proximity of the target vehicle and estimates its motion to predict the position where it reappears. The method is validated on an airborne image sequence acquired from a helicopter. Several reference vehicles are tracked within a range of 500 m in a complex urban traffic situation. An artificial tracking outage of 240 m is simulated, which is handled by the CTM. For this, all the vehicles in close proximity are automatically detected and tracked to estimate the basic density-flow relations of the CTM. Finally, the real and simulated trajectories of the reference vehicles in the outage are compared, showing good correspondence even in congested traffic situations.

  8. A Low-Cost Digital Microscope with Real-Time Fluorescent Imaging Capability.

    PubMed

    Hasan, Md Mehedi; Alam, Mohammad Wajih; Wahid, Khan A; Miah, Sayem; Lukong, Kiven Erique

    2016-01-01

    This paper describes the development of a prototype of a low-cost digital fluorescent microscope built from commercial off-the-shelf (COTS) components. The prototype was tested to detect malignant tumor cells taken from a living organism in a preclinical setting. This experiment was accomplished by using Alexa Fluor 488 conjugate dye attached to the cancer cells. Our prototype utilizes a torch along with an excitation filter as a light source for fluorophore excitation, a dichroic mirror to reflect the excitation light and pass the green light emitted from the sample under test, and a barrier filter to permit only the appropriate wavelengths. The system is built around a microscope, using its optical zooming property and an assembly of exciter filter, dichroic mirror and transmitter filter. The microscope is connected to a computer or laptop through a universal serial bus (USB), which allows real-time transmission of the captured fluorescence images as well as real-time control of the microscope. The designed system has features comparable to those of high-end commercial fluorescent microscopes while reducing cost, power, weight and size.

  9. A Low-Cost Digital Microscope with Real-Time Fluorescent Imaging Capability

    PubMed Central

    Hasan, Md. Mehedi; Wahid, Khan A.; Miah, Sayem; Lukong, Kiven Erique

    2016-01-01

    This paper describes the development of a prototype of a low-cost digital fluorescent microscope built from commercial off-the-shelf (COTS) components. The prototype was tested to detect malignant tumor cells taken from a living organism in a preclinical setting. This experiment was accomplished by using Alexa Fluor 488 conjugate dye attached to the cancer cells. Our prototype utilizes a torch along with an excitation filter as a light source for fluorophore excitation, a dichroic mirror to reflect the excitation light and pass the green light emitted from the sample under test, and a barrier filter to permit only the appropriate wavelengths. The system is built around a microscope, using its optical zooming property and an assembly of exciter filter, dichroic mirror and transmitter filter. The microscope is connected to a computer or laptop through a universal serial bus (USB), which allows real-time transmission of the captured fluorescence images as well as real-time control of the microscope. The designed system has features comparable to those of high-end commercial fluorescent microscopes while reducing cost, power, weight and size. PMID:27977709

  10. Real-time terahertz wave imaging by nonlinear optical frequency up-conversion in a 4-dimethylamino-N'-methyl-4'-stilbazolium tosylate crystal

    NASA Astrophysics Data System (ADS)

    Fan, Shuzhen; Qi, Feng; Notake, Takashi; Nawata, Kouji; Matsukawa, Takeshi; Takida, Yuma; Minamide, Hiroaki

    2014-03-01

    Real-time terahertz (THz) wave imaging has wide applications in areas such as security, industry, biology, medicine, pharmacy, and art. In this letter, we report on real-time, room-temperature THz imaging by nonlinear optical frequency up-conversion in an organic 4-dimethylamino-N'-methyl-4'-stilbazolium tosylate crystal. The active projection-imaging system consisted of (1) THz wave generation, (2) THz-near-infrared hybrid optics, (3) THz wave up-conversion, and (4) an InGaAs camera working at 60 frames per second. The pumping laser system consisted of two optical parametric oscillators pumped by a nanosecond frequency-doubled Nd:YAG laser. THz-wave images of handmade samples at 19.3 THz were taken, and videos of a moving sample and of a moving ruler covered with a black polyethylene film were supplied online to show the real-time capability. Thanks to the high speed and high responsivity of this technology, real-time THz imaging with a higher signal-to-noise ratio than a commercially available THz micro-bolometer camera was proven to be feasible. By changing the phase-matching condition, i.e., by changing the wavelength of the pumping laser, we suggest that THz imaging within a narrow frequency band of interest, selectable over a wide range from approximately 2 to 30 THz, is possible.

  11. An investigation of articulatory setting using real-time magnetic resonance imaging

    PubMed Central

    Ramanarayanan, Vikram; Goldstein, Louis; Byrd, Dani; Narayanan, Shrikanth S.

    2013-01-01

    This paper presents an automatic procedure to analyze articulatory setting in speech production using real-time magnetic resonance imaging of the moving human vocal tract. The procedure extracts frames corresponding to inter-speech pauses, speech-ready intervals and absolute rest intervals from magnetic resonance imaging sequences of read and spontaneous speech elicited from five healthy speakers of American English and uses automatically extracted image features to quantify vocal tract posture during these intervals. Statistical analyses show significant differences between vocal tract postures adopted during inter-speech pauses and those at absolute rest before speech; the latter also exhibits a greater variability in the adopted postures. In addition, the articulatory settings adopted during inter-speech pauses in read and spontaneous speech are distinct. The results suggest that adopted vocal tract postures differ on average during rest positions, ready positions and inter-speech pauses, and might, in that order, involve an increasing degree of active control by the cognitive speech planning mechanism. PMID:23862826

  12. High throughput web inspection system using time-stretch real-time imaging

    NASA Astrophysics Data System (ADS)

    Kim, Chanju

    Photonic time-stretch is a novel technology that enables the capture of fast, rare and non-repetitive events. It operates in real time, with the ability to record over long periods of time while maintaining fine temporal resolution. The powerful properties of photonic time-stretch have already been employed in various fields of application such as analog-to-digital conversion, spectroscopy, laser scanning and microscopy. Further expanding the scope, we fully exploit the time-stretch technology to demonstrate a high-throughput web inspection system. Web inspection, namely surface inspection, is a nondestructive evaluation method which is crucial for semiconductor wafer and thin-film production. We report a dark-field web inspection system with a line-scan rate of 90.9 MHz, which is up to 1000 times faster than conventional inspection instruments. The manufacturing of high-quality semiconductor wafers and thin films may directly benefit from this technology, as it can easily locate defects with an area of less than 10 µm x 10 µm while allowing a maximum web flow speed of 1.8 km/s. The thesis provides an overview of our web inspection technique, followed by a description of the photonic time-stretch technique, which is the keystone of our system. A detailed explanation of each component is covered to provide a quantitative understanding of the system. Finally, imaging results from a hard-disk sample and flexible films are presented along with a performance analysis of the system. This project was the first application of time-stretch to industrial inspection, and was conducted under financial support and with close involvement by Hitachi, Ltd.

  13. Real time animation of space plasma phenomena

    NASA Technical Reports Server (NTRS)

    Jordan, K. F.; Greenstadt, E. W.

    1987-01-01

    In pursuit of real time animation of computer simulated space plasma phenomena, the code was rewritten for the Massively Parallel Processor (MPP). The program creates a dynamic representation of the global bowshock which is based on actual spacecraft data and designed for three dimensional graphic output. This output consists of time slice sequences which make up the frames of the animation. With the MPP, 16384, 512 or 4 frames can be calculated simultaneously depending upon which characteristic is being computed. The run time was greatly reduced which promotes the rapid sequence of images and makes real time animation a foreseeable goal. The addition of more complex phenomenology in the constructed computer images is now possible and work proceeds to generate these images.

  14. Region Segmentation in the Frequency Domain Applied to Upper Airway Real-Time Magnetic Resonance Images

    PubMed Central

    Narayanan, Shrikanth

    2009-01-01

    We describe a method for unsupervised region segmentation of an image using its spatial frequency domain representation. The algorithm was designed to process large sequences of real-time magnetic resonance (MR) images containing the 2-D midsagittal view of a human vocal tract airway. The segmentation algorithm uses an anatomically informed object model, whose fit to the observed image data is hierarchically optimized using a gradient descent procedure. The goal of the algorithm is to automatically extract the time-varying vocal tract outline and the position of the articulators to facilitate the study of the shaping of the vocal tract during speech production. PMID:19244005

  15. An automated data exploitation system for airborne sensors

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike

    2014-06-01

    Advanced wide-area persistent surveillance (WAPS) sensor systems on manned or unmanned airborne vehicles are essential for wide-area urban security monitoring in order to protect our people and our warfighters from terrorist attacks. Currently, human (imagery) analysts process huge data collections from full motion video (FMV) for data exploitation and analysis (real-time and forensic), providing slow and inaccurate results. An Automated Data Exploitation System (ADES) is urgently needed. In this paper, we present a recently developed ADES for airborne vehicles under heavy urban background clutter conditions. This system includes four processes: (1) fast image registration, stabilization, and mosaicking; (2) advanced non-linear morphological moving target detection; (3) robust multiple-target (vehicle, dismount, and human) tracking (up to 100 target tracks); and (4) moving or static target/object recognition (super-resolution). Test results with real FMV data indicate that our ADES can reliably detect, track, and recognize multiple vehicles under heavy urban background clutter. Furthermore, our example shows that the ADES, as a baseline platform, can provide abnormal vehicle behavior detection to help imagery analysts quickly track down potential threats and crimes.
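
    As a rough illustration of the moving-target-detection stage (process 2), the Python/SciPy sketch below differences two already-registered frames, cleans the change mask morphologically and keeps only blobs above a minimum size. It is a generic stand-in for the paper's non-linear morphological detector, and the thresholds are assumptions.

        import numpy as np
        from scipy import ndimage

        def moving_target_mask(prev_frame, frame, diff_thresh=25, min_area=40):
            # Frames must already be co-registered; difference them in a signed type.
            diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
            mask = diff > diff_thresh
            mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))  # drop speckle
            labels, n = ndimage.label(mask)
            sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
            keep_labels = np.nonzero(sizes >= min_area)[0] + 1
            return np.isin(labels, keep_labels)   # boolean mask of candidate movers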

  16. TDC-based readout electronics for real-time acquisition of high resolution PET bio-images

    NASA Astrophysics Data System (ADS)

    Marino, N.; Saponara, S.; Ambrosi, G.; Baronti, F.; Bisogni, M. G.; Cerello, P.; Ciciriello, F.; Corsi, F.; Fanucci, L.; Ionica, M.; Licciulli, F.; Marzocca, C.; Morrocchi, M.; Pennazio, F.; Roncella, R.; Santoni, C.; Wheadon, R.; Del Guerra, A.

    2013-02-01

    Positron emission tomography (PET) is a clinical and research tool for in vivo metabolic imaging. The demand for better image quality entails continuous research to improve PET instrumentation. In clinical applications, PET image quality benefits from the time-of-flight (TOF) feature. Indeed, by measuring the photons' arrival times at the detectors with a resolution of less than 100 ps, the annihilation point can be estimated with centimeter resolution. This leads to better noise levels, contrast and clarity of detail in the images, using either analytical or iterative reconstruction algorithms. This work discusses a silicon photomultiplier (SiPM)-based, magnetic-field-compatible TOF-PET module with depth-of-interaction (DOI) correction. The detector features a 3D architecture with two tiles of SiPMs coupled to a single LYSO scintillator on both of its faces. The real-time front-end electronics is based on a current-mode ASIC in which a low-input-impedance, fast current buffer achieves the required time resolution. A pipelined time-to-digital converter (TDC) measures and digitizes the arrival time and the energy of the events with timestamps of 100 ps and 400 ps, respectively. An FPGA clusters the data and evaluates the DOI, with a simulated z resolution of the PET image of 1.4 mm FWHM.

  17. Imaging systems and algorithms to analyze biological samples in real-time using mobile phone microscopy

    PubMed Central

    Mayberry, Addison; Perkins, David L.; Holcomb, Daniel E.

    2018-01-01

    Miniaturized imaging devices have pushed the boundaries of point-of-care imaging, but existing mobile-phone-based imaging systems do not exploit the full potential of smart phones. This work demonstrates the use of simple imaging configurations to deliver superior image quality and the ability to handle a wide range of biological samples. Results presented in this work are from analysis of fluorescent beads under fluorescence imaging, as well as helminth eggs and freshwater mussel larvae under white light imaging. To demonstrate versatility of the systems, real time analysis and post-processing results of the sample count and sample size are presented in both still images and videos of flowing samples. PMID:29509786

  18. Real-time broadband terahertz spectroscopic imaging by using a high-sensitivity terahertz camera

    NASA Astrophysics Data System (ADS)

    Kanda, Natsuki; Konishi, Kuniaki; Nemoto, Natsuki; Midorikawa, Katsumi; Kuwata-Gonokami, Makoto

    2017-02-01

    Terahertz (THz) imaging has a strong potential for applications because many molecules have fingerprint spectra in this frequency region. Spectroscopic imaging in the THz region is a promising technique to fully exploit this characteristic. However, the performance of conventional techniques is restricted by the requirement of multidimensional scanning, which implies an image data acquisition time of several minutes. In this study, we propose and demonstrate a novel broadband THz spectroscopic imaging method that enables real-time image acquisition using a high-sensitivity THz camera. By exploiting the two-dimensionality of the detector, a broadband multi-channel spectrometer near 1 THz was constructed with a reflection type diffraction grating and a high-power THz source. To demonstrate the advantages of the developed technique, we performed molecule-specific imaging and high-speed acquisition of two-dimensional (2D) images. Two different sugar molecules (lactose and D-fructose) were identified with fingerprint spectra, and their distributions in one-dimensional space were obtained at a fast video rate (15 frames per second). Combined with the one-dimensional (1D) mechanical scanning of the sample, two-dimensional molecule-specific images can be obtained only in a few seconds. Our method can be applied in various important fields such as security and biomedicine.

  19. Thermal Infrared Spectral Imager for Airborne Science Applications

    NASA Technical Reports Server (NTRS)

    Johnson, William R.; Hook, Simon J.; Mouroulis, Pantazis; Wilson, Daniel W.; Gunapala, Sarath D.; Hill, Cory J.; Mumolo, Jason M.; Eng, Bjorn T.

    2009-01-01

    An airborne thermal hyperspectral imager is under development which utilizes the compact Dyson optical configuration and quantum well infrared photo detector (QWIP) focal plane array. The Dyson configuration uses a single monolithic prism-like grating design which allows for a high throughput instrument (F/1.6) with minimal ghosting, stray-light and large swath width. The configuration has the potential to be the optimal imaging spectroscopy solution for lighter-than-air (LTA) vehicles and unmanned aerial vehicles (UAV) due to its small form factor and relatively low power requirements. The planned instrument specifications are discussed as well as design trade-offs. Calibration testing results (noise equivalent temperature difference, spectral linearity and spectral bandwidth) and laboratory emissivity plots from samples are shown using an operational testbed unit which has similar specifications as the final airborne system. Field testing of the testbed unit was performed to acquire plots of apparent emissivity for various known standard minerals (such as quartz). A comparison is made using data from the ASTER spectral library.

  20. Microbubble responses to a similar mechanical index with different real-time perfusion imaging techniques.

    PubMed

    Porter, Thomas R; Oberdorfer, Joseph; Rafter, Patrick; Lof, John; Xie, Feng

    2003-08-01

    The purpose of this study was to determine differences in contrast enhancement and microbubble destruction rates with current commercially available low-mechanical index (MI) real-time perfusion imaging modalities. A tissue-mimicking phantom was developed that had vessels at 3 cm (near field) and 9 cm (far field) from a real-time transducer. Perfluorocarbon-exposed sonicated dextrose albumin microbubbles (PESDA) were injected proximal to a mixing chamber, and then passed through these vessels while the region was insonified with either pulses of alternating polarity with pulse inversion Doppler (PID) or pulses of alternating amplitude by power modulation (PM) at MIs of 0.1, 0.2 and 0.3. Effluent microbubble concentration, contrast intensity and the slope of digital contrast intensity vs. time were measured. Our results demonstrated that microbubble destruction already occurs with PID at an MI of 0.1. Contrast intensity seen with PID was less than with PM. Therefore, differences in contrast enhancement and microbubble destruction rates occur at a similar MI setting when using different real-time pulse sequence schemes.

  1. Rapid turn-around mapping of wildfires and disasters with airborne infrared imagery from the new FireMapper® 2.0 and OilMapper systems

    Treesearch

    James W. Hoffman; Lloyd L. Coulter; Philip J Riggan

    2005-01-01

    The new FireMapper® 2.0 and OilMapper airborne, infrared imaging systems operate in a "snapshot" mode. Both systems feature the real time display of single image frames, in any selected spectral band, on a daylight readable tablet PC. These single frames are displayed to the operator with full temperature calibration in color or grayscale renditions. A rapid...

  2. Aerosol-fluorescence spectrum analyzer: real-time measurement of emission spectra of airborne biological particles

    NASA Astrophysics Data System (ADS)

    Hill, Steven C.; Pinnick, Ronald G.; Nachman, Paul; Chen, Gang; Chang, Richard K.; Mayo, Michael W.; Fernandez, Gilbert L.

    1995-10-01

    We have assembled an aerosol-fluorescence spectrum analyzer (AFS), which can measure the fluorescence spectra and elastic scattering of airborne particles as they flow through a laser beam. The aerosols traverse a scattering cell where they are illuminated with intense (50 kW/cm2) light inside the cavity of an argon-ion laser operating at 488 nm. This AFS can obtain fluorescence spectra of individual dye-doped polystyrene microspheres as small as 0.5 µm in diameter. The spectra obtained from microspheres doped with pink and green-yellow dyes are clearly different. We have also detected the fluorescence spectra of airborne particles (although not single particles) made from various

  3. Conceptual design and proof-of-principle testing of the real-time multispectral imaging system MANTIS

    NASA Astrophysics Data System (ADS)

    Vijvers, W. A. J.; Mumgaard, R. T.; Andrebe, Y.; Classen, I. G. J.; Duval, B. P.; Lipschultz, B.

    2017-12-01

    The Multispectral Advanced Narrowband Tokamak Imaging System (MANTIS) is proposed to resolve the steep temperature and density gradients in the scrape-off layer of tokamaks in real-time. The initial design is to deliver two-dimensional distributions of key plasma parameters of the TCV tokamak to a real-time control system in order to enable novel control strategies, while providing new insights into power exhaust physics in the full offline analysis. This paper presents the conceptual system design, the mechanical and optical design of a prototype that was built to assess the optical performance, and the results of the first proof-of-principle tests of the prototype. These demonstrate a central resolving power of 50-46 line pairs per millimeter (CTF50) in the first four channels. For the additional channels, the sharpness is a factor two worse for the odd channels (likely affected by sub-optimal alignment), while the even channels continue the trend observed for the first four channels of 3% degradation per channel. This is explained by the self-cancellation of off-axis aberrations, which is an attractive property of the chosen optical design. The results show that at least a 10-channel real-time multispectral imaging system is feasible.

  4. Real-time line matching from stereo images using a nonparametric transform of spatial relations and texture information

    NASA Astrophysics Data System (ADS)

    Park, Jonghee; Yoon, Kuk-Jin

    2015-02-01

    We propose a real-time line matching method for stereo systems. To achieve real-time performance while retaining a high level of matching precision, we first propose a nonparametric transform to represent the spatial relations between neighboring lines and nearby textures as a binary stream. Since the length of a line can vary across images, the matching costs between lines are computed within an overlap area (OA) based on the binary stream. The OA is determined for each line pair by employing the properties of a rectified image pair. Finally, the line correspondence is determined using a winner-takes-all method with a left-right consistency check. To reduce the computational time requirements further, we filter out unreliable matching candidates in advance based on their rectification properties. The performance of the proposed method was compared with state-of-the-art methods in terms of the computational time, matching precision, and recall. The proposed method required 47 ms to match lines from an image pair in the KITTI dataset with an average precision of 95%. We also verified the proposed method under image blur, illumination variation, and viewpoint changes.
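
    The matching step described above can be pictured with the small Python/NumPy sketch below: binary descriptors are compared by normalised Hamming distance, the winner-takes-all choice is made in both directions, and only mutually consistent pairs below a cost threshold are kept. Descriptor construction, the overlap-area restriction and the candidate pre-filtering are omitted, and the threshold is an assumption.

        import numpy as np

        def match_lines(desc_left, desc_right, max_cost=0.3):
            # desc_left/desc_right: (n, bits) boolean arrays, one descriptor per line.
            cost = np.array([[np.mean(dl != dr) for dr in desc_right] for dl in desc_left])
            left_best = cost.argmin(axis=1)
            right_best = cost.argmin(axis=0)
            matches = []
            for i, j in enumerate(left_best):
                # Left-right consistency check: keep a pair only if each picks the other.
                if right_best[j] == i and cost[i, j] <= max_cost:
                    matches.append((i, int(j)))
            return matches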

  5. Solid-State Multi-Sensor Array System for Real Time Imaging of Magnetic Fields and Ferrous Objects

    NASA Astrophysics Data System (ADS)

    Benitez, D.; Gaydecki, P.; Quek, S.; Torres, V.

    2008-02-01

    In this paper the development of a solid-state-sensor-based system for real-time imaging of magnetic fields and ferrous objects is described. The system comprises 1089 magneto-inductive solid-state sensors arranged in a 2D matrix of 33×33 rows and columns, equally spaced to cover an area of approximately 300 by 300 mm. The sensor array is located within a large current-carrying coil. Data are sampled from the sensors by several DSP control units and streamed to a host computer via a USB 2.0 interface, and the image is generated and displayed at a rate of 20 frames per minute. The development of the instrumentation has been complemented by extensive numerical modeling of field distribution patterns using boundary element methods. The system was originally intended for deployment in the non-destructive evaluation (NDE) of reinforced concrete. Nevertheless, the system is not only capable of producing real-time, live video images of a metal target embedded within any opaque medium; it also allows the real-time visualization and determination of the magnetic field distribution emitted by either permanent magnets or current-carrying geometries. Although this system was initially developed for the NDE arena, it also has potential applications in many other fields, including medicine, security, manufacturing, quality assurance and design involving magnetic fields.

  6. Performance of an airborne imaging 92/183 GHz radiometer during the Bering Sea Marginal Ice Zone Experiment (MIZEX-WEST)

    NASA Technical Reports Server (NTRS)

    Gagliano, J. A.; Mcsheehy, J. J.; Cavalieri, D. J.

    1983-01-01

    An airborne imaging 92/183 GHz radiometer was recently flown onboard NASA's Convair 990 research aircraft during the February 1983 Bering Sea Marginal Ice Zone Experiment (MIZEX-WEST). The 92 GHz portion of the radiometer was used to gather ice signature data and to generate real-time millimeter wave images of the marginal ice zone. Dry atmospheric conditions in the Arctic resulted in good surface ice signature data for the 183 GHz double sideband (DSB) channel situated ±8.75 GHz away from the water vapor absorption line. The radiometer's beam scanner imaged the marginal ice zone over a ±45 degree swath angle about the aircraft nadir position. The aircraft altitude was 30,000 feet (9.20 km) maximum and 3,000 feet (0.92 km) minimum during the various data runs. Calculations of the minimum detectable target (ice) size for the radiometer as a function of aircraft altitude were performed. In addition, the change in the atmospheric attenuation at 92 GHz under varying weather conditions was incorporated into the target size calculations. A radiometric image of surface ice at 92 GHz in the marginal ice zone is included.

  7. Real-time and sub-wavelength ultrafast coherent diffraction imaging in the extreme ultraviolet.

    PubMed

    Zürch, M; Rothhardt, J; Hädrich, S; Demmler, S; Krebs, M; Limpert, J; Tünnermann, A; Guggenmos, A; Kleineberg, U; Spielmann, C

    2014-12-08

    Coherent Diffraction Imaging is a technique to study matter with nanometer-scale spatial resolution based on coherent illumination of the sample with hard X-ray, soft X-ray or extreme ultraviolet light delivered from synchrotrons or more recently X-ray Free-Electron Lasers. This robust technique simultaneously allows quantitative amplitude and phase contrast imaging. Laser-driven high harmonic generation XUV-sources allow table-top realizations. However, the low conversion efficiency of lab-based sources imposes either a large scale laser system or long exposure times, preventing many applications. Here we present a lensless imaging experiment combining a high numerical aperture (NA = 0.8) setup with a high average power fibre laser driven high harmonic source. The high flux and narrow-band harmonic line at 33.2 nm enables either sub-wavelength spatial resolution close to the Abbe limit (Δr = 0.8λ) for long exposure time, or sub-70 nm imaging in less than one second. The unprecedented high spatial resolution, compactness of the setup together with the real-time capability paves the way for a plethora of applications in fundamental and life sciences.

  8. Real-time endoscopic image orientation correction system using an accelerometer and gyrosensor.

    PubMed

    Lee, Hyung-Chul; Jung, Chul-Woo; Kim, Hee Chan

    2017-01-01

    The discrepancy between spatial orientations of an endoscopic image and a physician's working environment can make it difficult to interpret endoscopic images. In this study, we developed and evaluated a device that corrects the endoscopic image orientation using an accelerometer and gyrosensor. The acceleration of gravity and angular velocity were retrieved from the accelerometer and gyrosensor attached to the handle of the endoscope. The rotational angle of the endoscope handle was calculated using a Kalman filter with transmission delay compensation. Technical evaluation of the orientation correction system was performed using a camera by comparing the optical rotational angle from the captured image with the rotational angle calculated from the sensor outputs. For the clinical utility test, fifteen anesthesiology residents performed a video endoscopic examination of an airway model with and without using the orientation correction system. The participants reported numbers written on papers placed at the left main, right main, and right upper bronchi of the airway model. The correctness and the total time it took participants to report the numbers were recorded. During the technical evaluation, errors in the calculated rotational angle were less than 5 degrees. In the clinical utility test, there was a significant time reduction when using the orientation correction system compared with not using the system (median, 52 vs. 76 seconds; P = .012). In this study, we developed a real-time endoscopic image orientation correction system, which significantly improved physician performance during a video endoscopic exam.
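
    The paper fuses accelerometer and gyrosensor data with a Kalman filter that also compensates for transmission delay; the sketch below substitutes a simpler complementary filter to illustrate the basic roll-angle fusion idea. The filter gain, sampling convention, and axis assignment are assumptions, not the authors' implementation.

```python
import numpy as np

def roll_from_accel(ax, ay):
    """Roll angle (rad) of the handle estimated from the gravity vector."""
    return np.arctan2(ax, ay)

def fuse_orientation(accel, gyro_z, dt, alpha=0.98):
    """Complementary-filter stand-in for the paper's Kalman filter.

    accel: (N, 2) array of (ax, ay) accelerometer samples
    gyro_z: (N,) angular rate about the endoscope axis (rad/s)
    Returns the fused roll-angle estimate for each sample; the corrected
    image would then be rotated by the negative of this angle.
    """
    theta = roll_from_accel(*accel[0])
    out = np.empty(len(gyro_z))
    for k, (a, w) in enumerate(zip(accel, gyro_z)):
        theta_gyro = theta + w * dt              # integrate angular rate
        theta_acc = roll_from_accel(*a)          # absolute but noisy reference
        theta = alpha * theta_gyro + (1 - alpha) * theta_acc
        out[k] = theta
    return out
```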

  9. Toward real-time tumor margin identification in image-guided robotic brain tumor resection

    NASA Astrophysics Data System (ADS)

    Hu, Danying; Jiang, Yang; Belykh, Evgenii; Gong, Yuanzheng; Preul, Mark C.; Hannaford, Blake; Seibel, Eric J.

    2017-03-01

    For patients with malignant brain tumors (glioblastomas), a safe maximal resection of the tumor is critical for an increased survival rate. However, complete resection of the cancer is hard to achieve because of the invasive nature of these tumors: the margins blur from frank tumor into more normal-appearing brain tissue that single cells or clusters of malignant cells may nonetheless have invaded. Recent developments in fluorescence imaging techniques have shown great potential for improved surgical outcomes by providing surgeons with intraoperative, contrast-enhanced visualization of tumor tissue during neurosurgery. Current near-infrared (NIR) fluorophores, such as indocyanine green (ICG), cyanine5.5 (Cy5.5), and 5-aminolevulinic acid (5-ALA)-induced protoporphyrin IX (PpIX), are showing clinical potential for targeting and guiding resection of such tumors. Real-time tumor margin identification in NIR imaging could help both surgeons and patients by reducing the operation time and space required by other imaging modalities such as intraoperative MRI, and it has the potential to integrate with robotically assisted surgery. In this paper, a segmentation method based on the Chan-Vese model was developed for identifying tumor boundaries in an ex-vivo mouse brain from relatively noisy fluorescence images acquired by a multimodal scanning fiber endoscope (mmSFE). Tumor contours were obtained iteratively by minimizing an energy function formed by a level set function and the segmentation model. Quantitative segmentation metrics based on the tumor-to-background (T/B) ratio were evaluated. The results demonstrated the feasibility of detecting brain tumor margins in quasi-real time, with the potential to yield more precise brain tumor resection techniques or even robotic interventions in the future.
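
    As an illustration of the segmentation step, the following is a minimal two-phase, piecewise-constant Chan-Vese-style iteration. It approximates curvature regularization by Gaussian smoothing of the level-set function and omits the paper's specific energy weighting and mmSFE preprocessing, so it is a sketch rather than the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def chan_vese_segment(img, n_iter=200, dt=0.5, smooth=1.0):
    """Minimal two-phase Chan-Vese-style level-set segmentation.

    img: 2-D float array (e.g. a noisy fluorescence frame), values in [0, 1].
    Returns a boolean mask of the segmented (tumor-like) region, defined as
    the set where the level-set function is positive.
    """
    phi = np.full_like(img, -1.0)
    h, w = img.shape
    phi[h // 4: 3 * h // 4, w // 4: 3 * w // 4] = 1.0   # initial contour
    for _ in range(n_iter):
        inside = phi > 0
        c1 = img[inside].mean() if inside.any() else 0.0     # mean inside
        c2 = img[~inside].mean() if (~inside).any() else 0.0 # mean outside
        # gradient-descent step on the region (data) term of the energy
        force = (img - c2) ** 2 - (img - c1) ** 2
        phi = gaussian_filter(phi + dt * force, smooth)       # crude regularizer
    return phi > 0

# toy usage: a bright blob on a dark, noisy background
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (96, 96))
img[30:60, 30:60] += 0.6
mask = chan_vese_segment(np.clip(img, 0, 1))
```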

  10. Real-time 3D internal marker tracking during arc radiotherapy by the use of combined MV kV imaging

    NASA Astrophysics Data System (ADS)

    Liu, W.; Wiersma, R. D.; Mao, W.; Luxton, G.; Xing, L.

    2008-12-01

    To minimize the adverse dosimetric effect caused by tumor motion, it is desirable to have real-time knowledge of the tumor position throughout the beam delivery process. A promising technique to realize the real-time image guided scheme in external beam radiation therapy is through the combined use of MV and onboard kV beam imaging. The success of this MV-kV triangulation approach for fixed-gantry radiation therapy has been demonstrated. With the increasing acceptance of modern arc radiotherapy in the clinic, a timely and clinically important question is whether the image guidance strategy can be extended to arc therapy to provide the urgently needed real-time tumor motion information. While conceptually feasible, there are a number of theoretical and practical issues specific to arc delivery that need to be resolved before clinical implementation. The purpose of this work is to establish a robust procedure of system calibration for combined MV and kV imaging for internal marker tracking during arc delivery and to demonstrate the feasibility and accuracy of the technique. A commercially available LINAC equipped with an onboard kV imager and electronic portal imaging device (EPID) was used for the study. A custom-built phantom with multiple ball bearings was used to calibrate the stereoscopic MV-kV imaging system to provide the transformation parameters from imaging pixels to 3D world coordinates. The accuracy of the fiducial tracking system was examined using a 4D motion phantom capable of moving in accordance with a pre-programmed trajectory. Overall, the spatial accuracy of MV-kV fiducial tracking during the arc delivery process for normal adult breathing amplitude and period was found to be better than 1 mm. For fast motion, the results depended on the imaging frame rate. The RMS error ranged from ~0.5 mm for the normal adult breathing pattern to ~1.5 mm for more extreme cases with a low imaging frame rate of 3.4 Hz. In general, highly accurate real-time marker tracking during arc delivery is achievable with the combined MV-kV imaging approach, provided an adequate imaging frame rate is used.

  11. Real-time 3D internal marker tracking during arc radiotherapy by the use of combined MV-kV imaging.

    PubMed

    Liu, W; Wiersma, R D; Mao, W; Luxton, G; Xing, L

    2008-12-21

    To minimize the adverse dosimetric effect caused by tumor motion, it is desirable to have real-time knowledge of the tumor position throughout the beam delivery process. A promising technique to realize the real-time image guided scheme in external beam radiation therapy is through the combined use of MV and onboard kV beam imaging. The success of this MV-kV triangulation approach for fixed-gantry radiation therapy has been demonstrated. With the increasing acceptance of modern arc radiotherapy in the clinic, a timely and clinically important question is whether the image guidance strategy can be extended to arc therapy to provide the urgently needed real-time tumor motion information. While conceptually feasible, there are a number of theoretical and practical issues specific to arc delivery that need to be resolved before clinical implementation. The purpose of this work is to establish a robust procedure of system calibration for combined MV and kV imaging for internal marker tracking during arc delivery and to demonstrate the feasibility and accuracy of the technique. A commercially available LINAC equipped with an onboard kV imager and electronic portal imaging device (EPID) was used for the study. A custom-built phantom with multiple ball bearings was used to calibrate the stereoscopic MV-kV imaging system to provide the transformation parameters from imaging pixels to 3D world coordinates. The accuracy of the fiducial tracking system was examined using a 4D motion phantom capable of moving in accordance with a pre-programmed trajectory. Overall, the spatial accuracy of MV-kV fiducial tracking during the arc delivery process for normal adult breathing amplitude and period was found to be better than 1 mm. For fast motion, the results depended on the imaging frame rate. The RMS error ranged from approximately 0.5 mm for the normal adult breathing pattern to approximately 1.5 mm for more extreme cases with a low imaging frame rate of 3.4 Hz. In general, highly accurate real-time marker tracking during arc delivery is achievable with the combined MV-kV imaging approach, provided an adequate imaging frame rate is used.
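
    The geometric core of MV-kV triangulation is intersecting, in a least-squares sense, the two back-projected rays from the MV and kV images. A minimal sketch follows; it assumes the calibration already provides the source positions and the world-coordinate position of the detected marker on each imager, which is what the ball-bearing phantom calibration supplies.

```python
import numpy as np

def triangulate(src_mv, pix_mv, src_kv, pix_kv):
    """Least-squares triangulation of a fiducial from one MV and one kV ray.

    Each ray is defined by the beam source position and the world-coordinate
    position of the detected marker on the imager.  Returns the 3-D point
    minimizing the summed squared distance to the two rays.
    """
    src_mv, pix_mv, src_kv, pix_kv = map(np.asarray, (src_mv, pix_mv, src_kv, pix_kv))
    origins = np.vstack([src_mv, src_kv]).astype(float)
    dirs = np.vstack([pix_mv - src_mv, pix_kv - src_kv]).astype(float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, u in zip(origins, dirs):
        proj = np.eye(3) - np.outer(u, u)   # projector orthogonal to the ray
        A += proj
        b += proj @ o
    return np.linalg.solve(A, b)

# toy check: both rays pass through a known point, which should be recovered
x_true = np.array([1.0, 2.0, 3.0])
print(triangulate(np.array([0.0, 0.0, 100.0]), x_true,
                  np.array([100.0, 0.0, 0.0]), x_true))
```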

  12. Fast super-resolution with affine motion using an adaptive Wiener filter and its application to airborne imaging.

    PubMed

    Hardie, Russell C; Barnard, Kenneth J; Ordonez, Raul

    2011-12-19

    Fast nonuniform interpolation based super-resolution (SR) has traditionally been limited to applications with translational interframe motion. This is in part because such methods are based on an underlying assumption that the warping and blurring components in the observation model commute. For translational motion this is the case, but it is not true in general. This presents a problem for applications such as airborne imaging where translation may be insufficient. Here we present a new Fourier domain analysis to show that, for many image systems, an affine warping model with limited zoom and shear approximately commutes with the point spread function when diffraction effects are modeled. Based on this important result, we present a new fast adaptive Wiener filter (AWF) SR algorithm for non-translational motion and study its performance with affine motion. The fast AWF SR method employs a new smart observation window that allows us to precompute all the needed filter weights for any type of motion without sacrificing much of the full performance of the AWF. We evaluate the proposed algorithm using simulated data and real infrared airborne imagery that contains a thermal resolution target allowing for objective resolution analysis.
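
    The commutation claim can be probed numerically: for a mild affine warp and a smooth blur kernel, blur-then-warp and warp-then-blur should give nearly identical images. The snippet below uses a Gaussian kernel and arbitrary warp parameters as stand-ins for the paper's diffraction-based PSF and estimated interframe motion.

```python
import numpy as np
from scipy.ndimage import affine_transform, gaussian_filter

# illustrative test image and operators (placeholders, not the paper's model)
rng = np.random.default_rng(1)
img = gaussian_filter(rng.random((128, 128)), 2.0)

A = np.array([[1.02, 0.01],       # mild zoom + shear
              [0.00, 0.99]])

def warp(x):
    return affine_transform(x, A, order=3)

def blur(x):
    return gaussian_filter(x, sigma=1.5)

a = warp(blur(img))               # blur, then warp
b = blur(warp(img))               # warp, then blur
err = np.linalg.norm(a - b) / np.linalg.norm(a)
print(f"relative difference between the two orderings: {err:.3%}")
```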

  13. Interactive real time flow simulations

    NASA Technical Reports Server (NTRS)

    Sadrehaghighi, I.; Tiwari, S. N.

    1990-01-01

    An interactive real time flow simulation technique is developed for an unsteady channel flow. A finite-volume algorithm in conjunction with a Runge-Kutta time stepping scheme was developed for two-dimensional Euler equations. A global time step was used to accelerate convergence of steady-state calculations. A raster image generation routine was developed for high speed image transmission which allows the user to have direct interaction with the solution development. In addition to theory and results, the hardware and software requirements are discussed.
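
    The time integration described above amounts to advancing a semi-discrete system du/dt = R(u) with a multi-stage Runge-Kutta scheme. The sketch below shows a classical four-stage step with a generic residual operator standing in for the finite-volume flux balance of the 2-D Euler equations; the original code's stage coefficients and global time-step logic are not reproduced.

```python
import numpy as np

def rk4_step(u, dt, residual):
    """One classical four-stage Runge-Kutta step for du/dt = R(u).

    `residual` would be the finite-volume flux-balance operator evaluated
    on the current solution; here it is left as a generic callable.
    """
    k1 = residual(u)
    k2 = residual(u + 0.5 * dt * k1)
    k3 = residual(u + 0.5 * dt * k2)
    k4 = residual(u + dt * k3)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# toy usage: simple exponential decay as a stand-in residual
u = np.ones(10)
for _ in range(100):
    u = rk4_step(u, 0.01, lambda v: -v)
```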

  14. Real-time integrated photoacoustic and ultrasound (PAUS) imaging system to guide interventional procedures: ex vivo study.

    PubMed

    Wei, Chen-Wei; Nguyen, Thu-Mai; Xia, Jinjun; Arnal, Bastien; Wong, Emily Y; Pelivanov, Ivan M; O'Donnell, Matthew

    2015-02-01

    Because of depth-dependent light attenuation, bulky, low-repetition-rate lasers are usually used in most photoacoustic (PA) systems to provide sufficient pulse energies to image at depth within the body. However, integrating these lasers with real-time clinical ultrasound (US) scanners has been problematic because of their size and cost. In this paper, an integrated PA/US (PAUS) imaging system is presented operating at frame rates >30 Hz. By employing a portable, low-cost, low-pulse-energy (~2 mJ/pulse), high-repetition-rate (~1 kHz), 1053-nm laser, and a rotating galvo-mirror system enabling rapid laser beam scanning over the imaging area, the approach is demonstrated for potential applications requiring a few centimeters of penetration. In particular, we demonstrate here real-time (30 Hz frame rate) imaging (by combining multiple single-shot sub-images covering the scan region) of an 18-gauge needle inserted into a piece of chicken breast with subsequent delivery of an absorptive agent at more than 1-cm depth to mimic PAUS guidance of an interventional procedure. A signal-to-noise ratio of more than 35 dB is obtained for the needle in an imaging area 2.8 × 2.8 cm (depth × lateral). Higher frame rate operation is envisioned with an optimized scanning scheme.

  15. Frame Rate Considerations for Real-Time Abdominal Acoustic Radiation Force Impulse Imaging

    PubMed Central

    Fahey, Brian J.; Palmeri, Mark L.; Trahey, Gregg E.

    2008-01-01

    With the advent of real-time Acoustic Radiation Force Impulse (ARFI) imaging, elevated frame rates are both desirable and relevant from a clinical perspective. However, fundamental limitations on frame rates are imposed by thermal safety concerns related to incident radiation force pulses. Abdominal ARFI imaging utilizes a curvilinear scanning geometry that results in markedly different tissue heating patterns than those previously studied for linear arrays or mechanically-translated concave transducers. Finite Element Method (FEM) models were used to simulate these tissue heating patterns and to analyze the impact of tissue heating on frame rates available for abdominal ARFI imaging. A perfusion model was implemented to account for cooling effects due to blood flow and frame rate limitations were evaluated in the presence of normal, reduced and negligible tissue perfusions. Conventional ARFI acquisition techniques were also compared to ARFI imaging with parallel receive tracking in terms of thermal efficiency. Additionally, thermocouple measurements of transducer face temperature increases were acquired to assess the frame rate limitations imposed by cumulative heating of the imaging array. Frame rates sufficient for many abdominal imaging applications were found to be safely achievable utilizing available ARFI imaging techniques. PMID:17521042

  16. Real-time access of large volume imagery through low-bandwidth links

    NASA Astrophysics Data System (ADS)

    Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew

    2010-04-01

    Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded through rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access as an innovative approach based on standard off-the-shelf techniques to solving this problem. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. Our solution system is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.

  17. Real-time automatic registration in optical surgical navigation

    NASA Astrophysics Data System (ADS)

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Si, Xuan; Chen, Xiuwen; Wu, Xiaoming

    2016-05-01

    An image-guided surgical navigation system requires the improvement of the patient-to-image registration time to enhance the convenience of the registration procedure. A critical step in achieving this aim is performing a fully automatic patient-to-image registration. This study reports on a design of custom fiducial markers and the performance of a real-time automatic patient-to-image registration method using these markers on the basis of an optical tracking system for rigid anatomy. The custom fiducial markers are designed to be automatically localized in both patient and image spaces. An automatic localization method is performed by registering a point cloud sampled from the three dimensional (3D) pedestal model surface of a fiducial marker to each pedestal of fiducial markers searched in image space. A head phantom is constructed to estimate the performance of the real-time automatic registration method under four fiducial configurations. The head phantom experimental results demonstrate that the real-time automatic registration method is more convenient, rapid, and accurate than the manual method. The time required for each registration is approximately 0.1 s. The automatic localization method precisely localizes the fiducial markers in image space. The averaged target registration error for the four configurations is approximately 0.7 mm. The automatic registration performance is independent of the positions relative to the tracking system and the movement of the patient during the operation.
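
    Once the custom fiducials have been localized in both spaces, the registration itself reduces to a least-squares rigid transform between two small corresponding point sets. A standard SVD (Kabsch) solution is sketched below; the correspondence search and the point-cloud-to-pedestal matching described in the abstract are assumed to have been done already.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding fiducial centers in patient
    and image space.  Standard Kabsch/SVD solution with reflection guard.
    """
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t

# toy check: recover a known rotation about z plus a translation
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
src = np.random.default_rng(0).random((4, 3))
dst = src @ R_true.T + np.array([10.0, -5.0, 2.0])
R, t = rigid_register(src, dst)
```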

  18. Impact of orthodontic appliances on the quality of craniofacial anatomical magnetic resonance imaging and real-time speech imaging.

    PubMed

    Wylezinska, Marzena; Pinkstone, Marie; Hay, Norman; Scott, Andrew D; Birch, Malcolm J; Miquel, Marc E

    2015-12-01

    The aim of this work was to investigate the effects of commonly used orthodontic appliances on the magnetic resonance (MR) image quality of the craniofacial region, with special interest in the soft palate and velopharyngeal wall using real-time speech imaging sequences and in anatomical imaging of the temporomandibular joints (TMJ) and pituitary gland. Common orthodontic appliances were studied on a 1.5 T scanner using standard spin and gradient echo sequences (based on the American Society for Testing and Materials standard test method) and sequences previously applied for high-resolution anatomical imaging and dynamic real-time imaging during speech. Images were evaluated for the presence and size of artefacts. Metallic orthodontic appliances had different effects on image quality. The most extensive individual effects were associated with the presence of a stainless steel archwire, particularly when combined with stainless steel brackets and stainless steel molar bands. With those appliances, the diagnostic quality of MR speech and palate images will most likely be severely degraded, or speech imaging and imaging of the pituitary and TMJ will not be possible. All non-metallic appliances, non-metallic appliances with Ni/Cr reinforcement, and Ni/Ti alloy appliances were of little concern. The results of the study are only valid at 1.5 T and for the sequences and devices used, and cannot necessarily be extrapolated to all sequences and devices. Furthermore, both the geometry and size of some appliances are subject dependent, and consequently the effects on image quality can vary between subjects. Therefore, the results presented in this article should be treated as a guide when assessing the risks of image quality degradation rather than an absolute evaluation of possible artefacts. Appliances manufactured from stainless steel cause extensive artefacts, which may render images non-diagnostic. The presence and type of orthodontic appliances should therefore always be included in the patient referral.

  19. Real-time 4D electrical resistivity imaging of tracer transport within an energetically stimulated fracture zone

    NASA Astrophysics Data System (ADS)

    Johnson, T. C.

    2016-12-01

    Hydraulic fracture stimulation is used extensively in the subsurface energy sector to improve access between energy bearing formations and production boreholes. However, large uncertainties exist concerning the location and extent of stimulated fractures, and concerning the behavior of flow within those fractures. This uncertainty often results in significant risks, including induced seismicity and contamination of potable groundwater aquifers. Time-lapse electrical resistivity tomography (ERT) is a proven method of imaging fluid flow within fracture networks, by imaging the change in bulk conductivity induced by the presence of an electrically anomalous tracer within the fracture. In this work we demonstrate characterization and flow monitoring of a stimulated fracture using real-time four-dimensional ERT imaging within an unsaturated rhyolite formation. After stimulation, a conductive tracer was injected into the fracture zone. ERT survey data were continuously and autonomously collected, pre-processed on site, submitted to an off-site high performance computing system for inversion, and returned to the field for inspection. Surveys were collected at approximately 12 minute intervals. Data transmission and inversion required approximately 2 minutes per survey. The time-lapse imaging results show the dominant flow-paths within the stimulated fracture zone, thereby revealing the location and extent of the fracture, and the behavior of tracer flow within the fracture. Ultimately real-time imaging will enable site operators to better understand stimulation operations, and control post-stimulation reservoir operations for optimal performance and environmental protection.

  20. Detection of infusate leakage in the brain using real-time imaging of convection-enhanced delivery.

    PubMed

    Varenika, Vanja; Dickinson, Peter; Bringas, John; LeCouteur, Richard; Higgins, Robert; Park, John; Fiandaca, Massimo; Berger, Mitchel; Sampson, John; Bankiewicz, Krystof

    2008-11-01

    The authors have shown that convection-enhanced delivery (CED) of gadoteridol-loaded liposomes (GDLs) into different regions of normal monkey brain results in predictable, widespread distribution of this tracking agent as detected by real-time MR imaging. They also have found that this tracking technique allows monitoring of the distribution of similar nanosized agents such as therapeutic liposomes and viral vectors. A limitation of this procedure is the unexpected leakage of liposomes out of targeted parenchyma or malignancies into sulci and ventricles. The aim of the present study was to evaluate the efficacy of CED after the onset of these types of leakage. The authors documented this phenomenon in a study of 5 nonhuman primates and 7 canines, comprising 54 CED infusion sessions. Approximately 20% of these infusions resulted in leakage into cerebral ventricles or sulci. All of the infusions and leakage events were monitored with real-time MR imaging. The authors created volume-distributed versus volume-infused graphs for each infusion session. These graphs revealed the rate of distribution of GDL over the course of each infusion and allowed the authors to evaluate the progress of CED before and after leakage. The distribution of therapeutics within the target structure ceased to increase or resulted in significant attenuation after the onset of leakage. An analysis of the cases in this study revealed that leakage undermines the efficacy of CED. These findings reiterate the importance of real-time MR imaging visualization during CED to ensure an accurate, robust distribution of therapeutic agents.

  1. Can activity within the external abdominal oblique be measured using real-time ultrasound imaging?

    PubMed

    John, E K; Beith, I D

    2007-11-01

    Differences in the function of the anterolateral abdominal muscles have been the subject of much investigation, but primarily using electromyography. Recently, changes in the thickness of transversus abdominis and internal oblique measured from real-time ultrasound images have been shown to represent activity within these muscles. However, it is still unclear whether such a change in thickness in external oblique similarly represents activity within that muscle. The purpose of this study was to investigate the relationship between change in thickness and muscle activity in the external oblique using real-time ultrasound and surface electromyography. Simultaneous measurements of electromyography and real-time ultrasound images of external oblique were studied in up to 24 subjects during two tasks compared with the muscle at rest: (1) isometric trunk rotation and (2) drawing in the lower abdomen. Changes in muscle thickness correlated significantly with electromyography during isometric trunk rotation in the majority of subjects, but with a significant difference between subjects. In contrast, the relationship between change in thickness and electrical activity in the muscle when drawing in the lower abdomen was significant in less than 50% of subjects, and the muscle often became thinner. Thickness changes of external oblique can be used as a valid indicator of electromyographic activity during isometric trunk rotation, though the relationship is not as good as previously published data for transversus abdominis. Thickness changes of external oblique measured during lower abdominal drawing-in cannot be used to detect activity within this muscle.

  2. Computing the total atmospheric refraction for real-time optical imaging sensor simulation

    NASA Astrophysics Data System (ADS)

    Olson, Richard F.

    2015-05-01

    Fast and accurate computation of light path deviation due to atmospheric refraction is an important requirement for real-time simulation of optical imaging sensor systems. A large body of existing literature covers various methods for application of Snell's Law to the light path ray tracing problem. This paper provides a discussion of the adaptation to real-time simulation of atmospheric refraction ray tracing techniques used in mid-1980s LOWTRAN releases. The refraction ray trace algorithm published in a LOWTRAN-6 technical report by Kneizys et al. has been coded in MATLAB for development and in C for simulation use. To this published algorithm we have added tuning parameters for variable path segment lengths, and extensions for Earth-grazing and exoatmospheric "near Earth" ray paths. Model atmosphere properties used to exercise the refraction algorithm were obtained from tables published in another LOWTRAN-6 related report. The LOWTRAN-6 based refraction model is applicable to atmospheric propagation at wavelengths in the IR and visible bands of the electromagnetic spectrum. It has been used during the past two years by engineers at the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) in support of several advanced imaging sensor simulations. Recently, a faster (but sufficiently accurate) method using Gauss-Chebyshev quadrature integration for evaluating the refraction integral was adopted.
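
    Gauss-Chebyshev quadrature evaluates integrals with a 1/sqrt(1-x^2) weight using closed-form nodes and uniform weights, which is what makes it attractive for a fast refraction-integral evaluation. A generic sketch follows; the change of variables that maps a specific refraction integral onto this weighted form is problem dependent and is not shown here.

```python
import numpy as np

def gauss_chebyshev(f, n=16):
    """Gauss-Chebyshev quadrature of integral_{-1}^{1} f(x)/sqrt(1-x^2) dx.

    Nodes x_k = cos((2k-1)*pi/(2n)) with uniform weights pi/n.  A refraction
    integral would first be mapped onto [-1, 1] with the 1/sqrt(1-x^2)
    weight factored out.
    """
    k = np.arange(1, n + 1)
    x = np.cos((2 * k - 1) * np.pi / (2 * n))
    return np.pi / n * np.sum(f(x))

# check against the exact weighted integral of x^2, which equals pi/2
print(gauss_chebyshev(lambda x: x ** 2), np.pi / 2)
```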

  3. Single-cell real-time imaging of transgene expression upon lipofection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiume, Giuseppe; Di Rienzo, Carmine; NEST, Scuola Normale Superiore and Istituto Nanoscienze-CNR, Piazza San Silvestro 12, 56127, Pisa

    2016-05-20

    Here we address the process of lipofection by quantifying the expression of a genetically-encoded fluorescent reporter at the single-cell level, and in real-time, by confocal imaging in live cells. The Lipofectamine gold-standard formulation is compared to the alternative promising DC-Chol/DOPE formulation. In both cases, we report that only dividing cells are able to produce a detectable amount of the fluorescent reporter protein. Notably, by measuring fluorescence over time in each pair of daughter cells, we find that Lipofectamine-based transfection statistically yields a remarkably higher degree of “symmetry” in protein expression between daughter cells as compared to DC-Chol/DOPE. A model is envisioned in which the degree of symmetry of protein expression is linked to the number of bioavailable DNA copies within the cell before nuclear breakdown. Reported results open new perspectives for the understanding of the lipofection mechanism and define a new experimental platform for the quantitative comparison of transfection reagents. -- Highlights: •The process of lipofection is followed by quantifying the transgene expression in real time. •The Lipofectamine gold-standard is compared to the promising DC-Chol/DOPE formulation. •We report that only dividing cells are able to produce the fluorescent reporter protein. •The degree of symmetry of protein expression in daughter cells is linked to DNA bioavailability. •A new experimental platform for the quantitative comparison of transfection reagents is proposed.

  4. Real-time imaging of inflation-induced ATP release in the ex vivo rat lung.

    PubMed

    Furuya, Kishio; Tan, Ju Jing; Boudreault, Francis; Sokabe, Masahiro; Berthiaume, Yves; Grygorczyk, Ryszard

    2016-11-01

    Extracellular ATP and other nucleotides are important autocrine/paracrine mediators that regulate diverse processes critical for lung function, including mucociliary clearance, surfactant secretion, and local blood flow. Cellular ATP release is mechanosensitive; however, the impact of physical stimuli on ATP release during breathing has never been tested in intact lungs in real time and remains elusive. In this pilot study, we investigated inflation-induced ATP release in rat lungs ex vivo by real-time luciferin-luciferase (LL) bioluminescence imaging coupled with simultaneous infrared tissue imaging to identify ATP-releasing sites. With LL solution introduced into air spaces, brief inflation of such edematous lung (1 s, ~20 cmH2O) induced transient (<30 s) ATP release in a limited number of air-inflated alveolar sacs during their recruitment/opening. Released ATP reached concentrations of ~10^-6 M, relevant for autocrine/paracrine signaling, but it remained spatially restricted to single alveolar sacs or their clusters. ATP release was stimulus dependent: prolonged (100 s) inflation evoked long-lasting ATP release that terminated upon alveoli deflation/derecruitment while cyclic inflation/suction produced cyclic ATP release. With LL introduced into blood vessels, inflation induced transient ATP release in many small patchlike areas the size of alveolar sacs. Findings suggest that inflation induces ATP release in both alveoli and the surrounding blood capillary network; the functional units of ATP release presumably consist of alveolar sacs or their clusters. Our study demonstrates the feasibility of real-time ATP release imaging in ex vivo lungs and provides the first direct evidence of inflation-induced ATP release in lung air spaces and in pulmonary blood capillaries, highlighting the importance of purinergic signaling in lung function. Copyright © 2016 the American Physiological Society.

  5. Back-to-back optical coherence tomography-ultrasound probe for co-registered three-dimensional intravascular imaging with real-time display

    NASA Astrophysics Data System (ADS)

    Li, Jiawen; Ma, Teng; Jing, Joseph; Zhang, Jun; Patel, Pranav M.; Shung, K. Kirk; Zhou, Qifa; Chen, Zhongping

    2014-03-01

    We have developed a novel integrated optical coherence tomography (OCT)-intravascular ultrasound (IVUS) probe, with a 1.5-mm-long rigid part and 0.9-mm outer diameter, for real-time intracoronary imaging of atherosclerotic plaques and guiding interventional procedures. By placing the OCT ball lens and the 45-MHz IVUS single-element transducer back-to-back at the same axial position, this probe provides automatically co-registered, co-axial OCT-IVUS imaging. To demonstrate its capability, 3D OCT-IVUS imaging of a pig's coronary artery displayed in real time in polar coordinates, as well as images of two major types of advanced plaques in human cadaver coronary segments, was obtained using this probe and our upgraded system. Histology validation is also presented.

  6. Real-time full-motion color Flash lidar for target detection and identification

    NASA Astrophysics Data System (ADS)

    Nelson, Roy; Coppock, Eric; Craig, Rex; Craner, Jeremy; Nicks, Dennis; von Niederhausern, Kurt

    2015-05-01

    Greatly improved understanding of areas and objects of interest can be gained when real-time, full-motion Flash LiDAR is fused with inertial navigation data and multi-spectral context imagery. On its own, full-motion Flash LiDAR provides the opportunity to exploit the z dimension for improved intelligence vs. 2-D full-motion video (FMV). The intelligence value of this data is enhanced when it is combined with inertial navigation data to produce an extended, georegistered data set suitable for a variety of analyses. Further, when fused with multispectral context imagery, the typical point cloud becomes a rich 3-D scene that is intuitively obvious to the user and allows rapid cognitive analysis with little or no training. Ball Aerospace has developed and demonstrated a real-time, full-motion LiDAR system that fuses context imagery (VIS to MWIR demonstrated) and inertial navigation data in real time, and can stream these information-rich geolocated/fused 3-D scenes from an airborne platform. In addition, since the higher-resolution context camera is boresighted and frame-synchronized to the LiDAR camera and the LiDAR camera is an array sensor, techniques have been developed to rapidly interpolate the LiDAR pixel values, creating a point cloud that has the same resolution as the context camera, effectively creating a high-definition (HD) LiDAR image. This paper presents a design overview of the Ball TotalSight™ LiDAR system along with typical results over urban and rural areas collected from both rotary and fixed-wing aircraft. We conclude with a discussion of future work.

  7. Real-Time Spaceborne Synthetic Aperture Radar Float-Point Imaging System Using Optimized Mapping Methodology and a Multi-Node Parallel Accelerating Technique

    PubMed Central

    Li, Bingyi; Chen, Liang; Yu, Wenyue; Xie, Yizhuang; Bian, Mingming; Zhang, Qingjun; Pang, Long

    2018-01-01

    With the development of satellite payload technology and very large-scale integrated (VLSI) circuit technology, on-board real-time synthetic aperture radar (SAR) imaging systems have facilitated rapid response to disasters. A key goal of the on-board SAR imaging system design is to achieve high real-time processing performance under severe size, weight, and power consumption constraints. This paper presents a multi-node prototype system for real-time SAR imaging processing. We decompose the commonly used chirp scaling (CS) SAR imaging algorithm into two parts according to the computing features. The linearization and logic-memory optimum allocation methods are adopted to realize the nonlinear part in a reconfigurable structure, and the two-part bandwidth balance method is used to realize the linear part. Thus, float-point SAR imaging processing can be integrated into a single Field Programmable Gate Array (FPGA) chip instead of relying on distributed technologies. A single processing node requires 10.6 s and consumes 17 W to focus stripmap SAR raw data with a 25-km swath width, 5-m resolution, and a granularity of 16,384 × 16,384. The design methodology of the multi-FPGA parallel accelerating system under real-time constraints is introduced. As a proof of concept, a prototype with four processing nodes and one master node is implemented using a Xilinx xc6vlx315t FPGA. The weight and volume of one single machine are 10 kg and 32 cm × 24 cm × 20 cm, respectively, and the power consumption is under 100 W. The real-time performance of the proposed design is demonstrated on Chinese Gaofen-3 stripmap continuous imaging. PMID:29495637

  8. Simultaneous mapping of pan and sentinel lymph nodes for real-time image-guided surgery.

    PubMed

    Ashitate, Yoshitomo; Hyun, Hoon; Kim, Soon Hee; Lee, Jeong Heon; Henary, Maged; Frangioni, John V; Choi, Hak Soo

    2014-01-01

    The resection of regional lymph nodes in the basin of a primary tumor is of paramount importance in surgical oncology. Although sentinel lymph node mapping is now the standard of care in breast cancer and melanoma, over 20% of patients require a completion lymphadenectomy. Yet, there is currently no technology available that can image all lymph nodes in the body in real time, or assess both the sentinel node and all nodes simultaneously. In this study, we report an optical fluorescence technology that is capable of simultaneous mapping of pan lymph nodes (PLNs) and sentinel lymph nodes (SLNs) in the same subject. We developed near-infrared fluorophores, which have fluorescence emission maxima either at 700 nm or at 800 nm. One was injected intravenously for identification of all regional lymph nodes in a basin, and the other was injected locally for identification of the SLN. Using the dual-channel FLARE intraoperative imaging system, we could identify and resect all PLNs and SLNs simultaneously. The technology we describe enables simultaneous, real-time visualization of both PLNs and SLNs in the same subject.

  9. Real-time image processing for label-free enrichment of Actinobacteria cultivated in picolitre droplets.

    PubMed

    Zang, Emerson; Brandes, Susanne; Tovar, Miguel; Martin, Karin; Mech, Franziska; Horbert, Peter; Henkel, Thomas; Figge, Marc Thilo; Roth, Martin

    2013-09-21

    The majority of today's antimicrobial therapeutics is derived from secondary metabolites produced by Actinobacteria. While it is generally assumed that less than 1% of Actinobacteria species from soil habitats have been cultivated so far, classic screening approaches fail to supply new substances, often due to limited throughput and frequent rediscovery of already known strains. To overcome these restrictions, we implement high-throughput cultivation of soil-derived Actinobacteria in microfluidic pL-droplets by generating more than 600,000 pure cultures per hour from a spore suspension that can subsequently be incubated for days to weeks. Moreover, we introduce triggered imaging with real-time image-based droplet classification as a novel universal method for pL-droplet sorting. Growth-dependent droplet sorting at frequencies above 100 Hz is performed for label-free enrichment and extraction of microcultures. The combination of both cultivation of Actinobacteria in pL-droplets and real-time detection of growing Actinobacteria has great potential in screening for yet unknown species as well as their undiscovered natural products.

  10. A Green Synthesis of Carbon Nanoparticle from Honey for Real-Time Photoacoustic Imaging.

    PubMed

    Wu, Lina; Cai, Xin; Nelson, Kate; Xing, Wenxin; Xia, Jun; Zhang, Ruiying; Stacy, Allen J; Luderer, Micah; Lanza, Gregory M; Wang, Lihong V; Shen, Baozhong; Pan, Dipanjan

    2013-01-01

    Imaging sentinel lymph nodes (SLN) could provide us with critical information about the progression of a cancerous disease. Real-time high-resolution intraoperative photoacoustic imaging (PAI) in conjunction with a near-infrared (NIR) probe may offer the opportunity for immediate imaging, direct identification and resection of the SLN, or collection of tissue samples. In this work, a commercially amenable synthetic methodology is revealed for developing luminescent carbon nanoparticles with rapid clearance properties. A one-pot "green" technique is pursued, which involves rapid surface passivation of carbon nanoparticles with organic macromolecules (e.g. polysorbate, polyethyleneglycol) under solvent-free conditions. Interestingly, the naked carbon nanoparticles are derived, for the first time, from commercial food-grade honey. The surface-coated particles are markedly smaller (~7 nm) than the previously explored particles (gold, SWNT, copper) for SLN imaging. Results indicate an exceptionally rapid signal enhancement (~2 min) of the SLN. Owing to their strong optical absorption in the near-infrared region, tiny size, and rapid lymphatic transport, this platform offers great potential for faster resection of the SLN and may lower complications caused by axillary investigation, mismarking with dyes, or low-resolution imaging techniques.

  11. Real-Time Optical Image Processing Techniques

    DTIC Science & Technology

    1988-10-31

    Real-time nonlinear optical image processing based on pulse frequency modulation has been pursued through the analysis, design, and fabrication of pulse frequency modulated halftone screens and the modification of micro-channel spatial light modulators (MSLMs) required for non-linear operation. Real-time nonlinear processing was performed using the halftone screen and MSLM, and the experiments showed the effectiveness of the approach.

  12. DSP+FPGA-based real-time histogram equalization system of infrared image

    NASA Astrophysics Data System (ADS)

    Gu, Dongsheng; Yang, Nansheng; Pi, Defu; Hua, Min; Shen, Xiaoyan; Zhang, Ruolan

    2001-10-01

    Histogram modification is a simple but effective method to enhance an infrared image. Because different infrared images have different characteristics, several methods exist to equalize an infrared image's histogram, such as the traditional HE (histogram equalization) method and the improved HP (histogram projection) and PE (plateau equalization) methods. To realize all of these methods in a single system, the system must have a large amount of memory and very high processing speed. In our system, we introduce a DSP + FPGA based real-time processing architecture to handle these tasks together. The FPGA implements the part common to these methods, while the DSP implements the method-specific parts. The choice of method and its parameters can be entered from a keyboard or a computer. In this way, the system is powerful yet easy to operate and maintain. In this article, we present the system block diagram and the software flow chart of the methods, and finally we show an infrared image and its histogram before and after histogram equalization.
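
    The HE and PE variants mentioned above differ only in whether the histogram is clipped at a plateau value before the cumulative mapping is built. A software sketch of that common structure is shown below; the bit depth, plateau value, and output range are illustrative choices, not the parameters of the DSP+FPGA system.

```python
import numpy as np

def plateau_equalize(img, plateau=None, out_levels=256):
    """Histogram/plateau equalization for a high-bit-depth infrared frame.

    With plateau=None this reduces to conventional histogram equalization
    (HE); clipping the histogram at a plateau value gives the PE variant.
    img must contain non-negative integer pixel values.
    """
    hist = np.bincount(img.ravel(), minlength=int(img.max()) + 1)
    if plateau is not None:
        hist = np.minimum(hist, plateau)        # clip dominant background bins
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]                               # normalize to [0, 1]
    lut = np.round(cdf * (out_levels - 1)).astype(np.uint8)
    return lut[img]                              # map every pixel through the LUT

# toy 14-bit infrared-like frame
frame = np.random.default_rng(2).integers(0, 2 ** 14, (240, 320))
display = plateau_equalize(frame, plateau=200)
```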

  13. Dual-mode photoacoustic and ultrasound system for real-time in-vivo ovarian cancer imaging

    NASA Astrophysics Data System (ADS)

    Mostafa, Atahar; Nandy, Sreyankar; Amidi, Eghbal; Zhu, Quing

    2018-02-01

    More than 80% of ovarian cancers are diagnosed at late stages, and the survival rate is less than 50%. Currently, there is no effective screening technique available, and transvaginal US can only tell whether the ovaries are enlarged. We have developed a new real-time co-registered US and photoacoustic system for in vivo imaging and characterization of ovaries. US is used to localize the ovaries, and photoacoustic imaging provides functional information about ovarian tissue angiogenesis and oxygen saturation. The system consists of a tunable laser and a commercial US system from Alpinion Inc. The Alpinion system is capable of providing channel data for both US pulse-echo and photoacoustic imaging and can be programmed to display US and photoacoustic images side by side or in co-registered mode. A transvaginal ultrasound probe with a 6-MHz center frequency and a bandwidth of 3-10 MHz is coupled with four optical fibers surrounding the US probe to deliver light to the tissue. The light from the optical fibers is homogenized to ensure that the power delivered to the tissue surface is below the FDA-required limit. Physicians can easily navigate the probe, use US to locate the ovaries, and then turn on the photoacoustic mode to obtain real-time tumor vasculature and sO2 saturation maps. With the optimized system, we have successfully imaged a first group of 7 patients with malignant, abnormal, and benign ovaries. The results show that both the photoacoustic signal strength and its spatial distribution differ between malignant, abnormal, and benign ovaries.

  14. Real-time sound speed correction using golden section search to enhance ultrasound imaging quality

    NASA Astrophysics Data System (ADS)

    Yoon, Chong Ook; Yoon, Changhan; Yoo, Yangmo; Song, Tai-Kyong; Chang, Jin Ho

    2013-03-01

    In medical ultrasound imaging, high-performance beamforming is important for enhancing spatial and contrast resolution. A modern receive dynamic beamformer uses a constant sound speed, typically assumed to be 1540 m/s, in generating receive focusing delays [1], [2]. However, this assumption degrades spatial and contrast resolution, particularly when imaging obese patients or the breast, since the true sound speed is significantly lower than the assumed value [3]; the sound speed in fatty tissue is around 1450 m/s. In our previous study, it was demonstrated that modified nonlinear anisotropic diffusion is capable of determining an optimal sound speed and that the proposed method is a useful tool to improve ultrasound image quality [4], [5]. In that study, however, we used at least 21 iterations to find an optimal sound speed, which may not be viable for real-time applications. In this paper, we demonstrate that the number of iterations can be dramatically reduced using the GSS (golden section search) method with minimal error. To evaluate the performance of the proposed method, in vitro experiments were conducted with a tissue-mimicking phantom. To emulate a heterogeneous medium, the phantom was immersed in water. In the experiments, the number of iterations was reduced from 21 to 7 with the GSS method, and the maximum error in lateral resolution between the direct search and GSS was less than 1%. These results indicate that the proposed method can be implemented in real time to improve image quality in medical ultrasound imaging.
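
    Golden section search shrinks the search bracket by a constant factor per iteration while reusing one of the two interior cost evaluations, which is why it needs far fewer beamforming passes than an exhaustive sweep. A generic sketch follows; the image-quality cost that would be evaluated at each candidate sound speed is left as a black box, and the toy cost below is only a placeholder.

```python
import numpy as np

def golden_section_search(f, lo, hi, tol=1.0):
    """Minimize a unimodal cost f(c) over [lo, hi] by golden section search.

    For sound-speed correction, f(c) would beamform one frame at speed c and
    return an image-quality cost (e.g. negative sharpness).
    """
    g = (np.sqrt(5.0) - 1.0) / 2.0            # inverse golden ratio ~0.618
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)   # two interior probe points
    while (b - a) > tol:
        if f(c) < f(d):
            b, d = d, c                       # minimum lies in [a, d]; reuse c
            c = b - g * (b - a)
        else:
            a, c = c, d                       # minimum lies in [c, b]; reuse d
            d = a + g * (b - a)
    return 0.5 * (a + b)

# toy cost with a minimum near 1450 m/s (fatty-tissue sound speed)
best = golden_section_search(lambda c: (c - 1450.0) ** 2, 1400.0, 1600.0)
print(best)
```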

  15. Investigation of an acoustical holography system for real-time imaging

    NASA Astrophysics Data System (ADS)

    Fecht, Barbara A.; Andre, Michael P.; Garlick, George F.; Shelby, Ronald L.; Shelby, Jerod O.; Lehman, Constance D.

    1998-07-01

    A new prototype imaging system based on ultrasound transmission through the object of interest -- acoustical holography -- was developed which incorporates significant improvements in acoustical and optical design. This system is being evaluated for potential clinical application in the musculoskeletal system, interventional radiology, pediatrics, monitoring of tumor ablation, vascular imaging and breast imaging. System limiting resolution was estimated using a line-pair target with decreasing line thickness and equal separation. For a swept frequency beam from 2.6 - 3.0 MHz, the minimum resolution was 0.5 lp/mm. Apatite crystals were suspended in castor oil to approximate breast microcalcifications. Crystals from 0.425 - 1.18 mm in diameter were well resolved in the acoustic zoom mode. Needle visibility was examined with both a 14-gauge biopsy needle and a 0.6 mm needle. The needle tip was clearly visible throughout the dynamic imaging sequence as it was slowly inserted into a RMI tissue-equivalent breast biopsy phantom. A selection of human images was acquired in several volunteers: a 25 year-old female volunteer with normal breast tissue, a lateral view of the elbow joint showing muscle fascia and tendon insertions, and the superficial vessels in the forearm. Real-time video images of these studies will be presented. In all of these studies, conventional sonography was used for comparison. These preliminary investigations with the new prototype acoustical holography system showed favorable results in comparison to state-of-the-art pulse-echo ultrasound and demonstrate it to be suitable for further clinical study. The new patient interfaces will facilitate orthopedic soft tissue evaluation, study of superficial vascular structures and potentially breast imaging.

  16. Real-time high-velocity resolution color Doppler OCT

    NASA Astrophysics Data System (ADS)

    Westphal, Volker; Yazdanfar, Siavash; Rollins, Andrew M.; Izatt, Joseph A.

    2001-05-01

    Color Doppler optical coherence tomography (CDOCT; also called optical Doppler tomography) is a noninvasive optical imaging technique that allows micron-scale physiological flow mapping simultaneous with morphological OCT imaging. Current systems for real-time endoscopic optical coherence tomography (EOCT) would be enhanced by the capability to visualize sub-surface blood flow for applications in early cancer diagnosis and the management of bleeding ulcers. Unfortunately, previous implementations of CDOCT have either been too computationally expensive (employing Fourier or Hilbert transform techniques) to allow real-time imaging of flow, or have been restricted to imaging of excessively high flow velocities when used in real time. We have developed a novel Doppler OCT signal-processing strategy capable of imaging physiological flow rates in real time. This strategy employs cross-correlation processing of sequential A-scans in an EOCT image, as opposed to the autocorrelation processing described previously. To measure Doppler shifts in the kHz range using this technique, it was necessary to stabilize the EOCT interferometer center frequency, eliminate parasitic phase noise, and construct a digital cross-correlation unit able to correlate signals of megahertz bandwidth at a fixed lag of up to a few ms. The performance of the color Doppler OCT system was evaluated in a flow phantom, demonstrating a minimum detectable flow velocity of ~0.8 mm/s at a data acquisition rate of 8 images/second (with 480 A-scans/image) using a handheld probe. Dynamic flow imaging, including freehand use of the probe, was shown. Flow was also detectable in a phantom in combination with a clinically usable endoscopic probe.
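
    In such a scheme, the velocity estimate comes from the phase of the lag-one cross-correlation between sequential A-scans. A simplified sketch is given below for complex-valued A-scans; the spatial averaging window, correlator hardware, and exact probe parameters of the described system are not modeled, and the wavelength and refractive-index values are illustrative placeholders.

```python
import numpy as np

def doppler_velocity(a_prev, a_curr, fs, lam=1.3e-6, n=1.35):
    """Estimate the axial velocity profile from two sequential complex A-scans.

    The phase of the lag-one cross-correlation between adjacent A-scans gives
    the Doppler phase shift per A-scan period; fs is the A-scan rate (Hz),
    lam the centre wavelength (m), and n the tissue refractive index.
    """
    corr = a_curr * np.conj(a_prev)             # per-pixel lag-one correlation
    dphi = np.angle(corr)                       # Doppler phase shift (rad)
    return dphi * fs * lam / (4.0 * np.pi * n)  # axial velocity (m/s)

# toy example: a uniform phase ramp corresponding to ~0.5 mm/s
fs = 4000.0
v_true = 0.5e-3
dphi = 4 * np.pi * 1.35 * v_true / (1.3e-6 * fs)
a1 = np.exp(1j * np.zeros(512))
a2 = a1 * np.exp(1j * dphi)
print(doppler_velocity(a1, a2, fs).mean())
```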

  17. The analysis of complex mixed-radiation fields using near real-time imaging.

    PubMed

    Beaumont, Jonathan; Mellor, Matthew P; Joyce, Malcolm J

    2014-10-01

    A new mixed-field imaging system has been constructed at Lancaster University using the principles of collimation and back projection to passively locate and assess sources of neutron and gamma-ray radiation. The system was set up at the University of Manchester where three radiation sources: (252)Cf, a lead-shielded (241)Am/Be and a (22)Na source were imaged. Real-time discrimination was used to find the respective components of the neutron and gamma-ray fields detected by a single EJ-301 liquid scintillator, allowing separate images of neutron and gamma-ray emitters to be formed. (252)Cf and (22)Na were successfully observed and located in the gamma-ray image; however, the (241)Am/Be was not seen owing to surrounding lead shielding. The (252)Cf and (241)Am/Be neutron sources were seen clearly in the neutron image, demonstrating the advantage of this mixed-field technique over a gamma-ray-only image where the (241)Am/Be source would have gone undetected. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. SU-G-JeP3-07: Real-Time Image Guided Radiation Therapy for Heterotopic Ossification in Patients After Hip Replacement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, A; Jiang, S; Timmerman, R

    Purpose: To demonstrate the feasibility of using CBCT in a real-time image guided radiation therapy (IGRT) workflow for single-fraction treatment of heterotopic ossification (HO) in patients after hip replacement. In this real-time procedure, all steps, from simulation, imaging, and planning to treatment delivery, are performed at the treatment unit in one appointment time slot. This work promotes real-time treatment to create a paradigm shift in single-fraction radiation therapy. Methods: An integrated real-time IGRT procedure for HO was developed and tested for radiation treatment of heterotopic ossification in patients after hip replacement. After CBCT images are acquired at the linac and sent to the treatment planning system, the physician determines the field and/or draws a block. Subsequently, a simple 2D AP/PA plan with a prescription of 700 cGy is created on-the-fly for the physician to review. Once the physician approves the plan, the patient is treated in the same simulation position. This real-time treatment requires the team of attending physician, physicist, therapists, and dosimetrist to work in harmony to complete all the steps in a timely manner. Results: Ten patients have been treated with this real-time procedure, using the same beam arrangement and prescription as our clinically standard CT-based 2D plans. The average time for these procedures was 52.9 ± 10.7 minutes from the time the patient entered the treatment room until he or she exited, and 37.7 ± 8.6 minutes from the start of CBCT until the last beam was delivered. Conclusion: The real-time IGRT procedure for HO treatment has been tested and implemented as a clinically accepted procedure. This one-time appointment greatly reduces waiting time, which is especially valuable when patients are in high levels of pain, and provides a convenient approach for the whole clinical staff. Other disease sites will also be tested with this new technology.

  19. Image quality in real-time teleultrasound of infant hip exam over low-bandwidth internet links: a transatlantic feasibility study.

    PubMed

    Martinov, Dobrivoje; Popov, Veljko; Ignjatov, Zoran; Harris, Robert D

    2013-04-01

    Evolution of communication systems, especially internet-based technologies, has probably affected radiology more than any other medical specialty. The tremendous increase in internet bandwidth has enabled a true revolution in image transmission and easy remote viewing of static images and real-time video streams. Previous reports of real-time telesonography, such as systems developed for emergency situations and humanitarian work, rely on highly compressed images used by a remote sonologist to guide and supervise an inexperienced examiner. We believe that remote sonology could also be utilized in teleultrasound examination of the infant hip. We tested the feasibility of a low-cost teleultrasound system for the infant hip and performed data analysis on the transmitted and original images. Transmission of data was accomplished with Remote Ultrasound (RU), a software package specifically designed for teleultrasound transmission over limited internet bandwidth. While image analysis of image pairs revealed a statistically significant loss of information, panel evaluation failed to recognize any clinical difference between the original saved and transmitted still images.

  20. First demonstration of a vehicle mounted 250GHz real time passive imager

    NASA Astrophysics Data System (ADS)

    Mann, Chris

    2009-05-01

    This paper describes the design and performance of a ruggedized passive terahertz imager; the frequency of operation is a 40 GHz band centred around 250 GHz. This system has been specifically targeted at vehicle-mounted operation, outdoors in extreme environments. The unit incorporates temperature stabilization along with an anti-vibration chassis and is sealed to allow it to be used in a dusty environment. Within the system, a 250 GHz heterodyne detector array is mated with optics and a scanner to allow real-time imaging out to 100 meters. First applications are envisaged to be stand-off, person-borne IED detection out to 30 meters, but the unique properties of this frequency band present other potential uses, such as seeing through smoke and fog. The possibility for use as a landing aid is discussed. A detailed description of the system design and video examples of typical imaging output will be presented.

  1. Real-time and quantitative isotropic spatial resolution susceptibility imaging for magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Pi, Shiqiang; Liu, Wenzhong; Jiang, Tao

    2018-03-01

    The magnetic transparency of biological tissue allows the magnetic nanoparticle (MNP) to be a promising functional sensor and contrast agent. The complex susceptibility of MNPs, strongly influenced by particle concentration, the excitation magnetic field, and their surrounding microenvironment, has significant implications for biomedical applications. Therefore, magnetic susceptibility imaging of high spatial resolution will give more detailed information during the process of MNP-aided diagnosis and therapy. In this study, we present a novel spatial magnetic susceptibility extraction method for MNPs under a gradient magnetic field, a low-frequency drive magnetic field, and a weak high-frequency magnetic field. Based on this novel method, magnetic particle susceptibility imaging (MPSI) with millimeter-level spatial resolution (<3 mm) was achieved using our homemade imaging system. The experimental results corroborate that the MPSI is real-time (1 s per frame acquisition), quantitative, and of isotropic high resolution.

  2. Toward Simultaneous Real-Time Fluoroscopic and Nuclear Imaging in the Intervention Room.

    PubMed

    Beijst, Casper; Elschot, Mattijs; Viergever, Max A; de Jong, Hugo W A M

    2016-01-01

    To investigate the technical feasibility of hybrid simultaneous fluoroscopic and nuclear imaging. An x-ray tube, an x-ray detector, and a gamma camera were positioned in one line, enabling imaging of the same field of view. Since a straightforward combination of these elements would block the lines of view, a gamma camera setup was developed to be able to view around the x-ray tube. A prototype was built by using a mobile C-arm and a gamma camera with a four-pinhole collimator. By using the prototype, test images were acquired and sensitivity, resolution, and coregistration error were analyzed. Nuclear images (two frames per second) were acquired simultaneously with fluoroscopic images. Depending on the distance from point source to detector, the system resolution was 1.5-1.9 cm full width at half maximum, the sensitivity was (0.6-1.5) × 10⁻⁵ counts per decay, and the coregistration error was -0.13 to 0.15 cm. With good spatial and temporal alignment of both modalities throughout the field of view, fluoroscopic images can be shown in grayscale and corresponding nuclear images in color overlay. Measurements obtained with the hybrid imaging prototype device that combines simultaneous fluoroscopic and nuclear imaging of the same field of view have demonstrated the feasibility of real-time simultaneous hybrid imaging in the intervention room.

  3. Advances in real-time millimeter-wave imaging radiometers for avionic synthetic vision

    NASA Astrophysics Data System (ADS)

    Lovberg, John A.; Chou, Ri-Chee; Martin, Christopher A.; Galliano, Joseph A., Jr.

    1995-06-01

    Millimeter-wave imaging has advantages over conventional visible or infrared imaging for many applications because millimeter-wave signals can travel through fog, snow, dust, and clouds with much less attenuation than infrared or visible light waves. Additionally, passive imaging systems avoid many problems associated with active radar imaging systems, such as radar clutter, glint, and multi-path return. ThermoTrex Corporation previously reported on its development of a passive imaging radiometer that uses an array of frequency-scanned antennas coupled to a multichannel acousto-optic spectrum analyzer (Bragg-cell) to form visible images of a scene through the acquisition of thermal blackbody radiation in the millimeter-wave spectrum. The output from the Bragg cell is imaged by a standard video camera and passed to a computer for normalization and display at real-time frame rates. An application of this system is its incorporation as part of an enhanced vision system to provide pilots with a synthetic view of a runway in fog and during other adverse weather conditions. Ongoing improvements to a 94 GHz imaging system and examples of recent images taken with this system will be presented. Additionally, the development of dielectric antennas and an electro-optic-based processor for improved system performance, and the development of an 'ultra-compact' 220 GHz imaging system will be discussed.

  4. Real-time Data Processing and Visualization for the Airborne Scanning High-resolution Interferometer Sounder (S-HIS)

    NASA Astrophysics Data System (ADS)

    Taylor, J. K.; Revercomb, H. E.; Hoese, D.; Garcia, R. K.; Smith, W. L.; Weisz, E.; Tobin, D. C.; Best, F. A.; Knuteson, R. O.; Sullivan, D. V.; Barnes, C. M.; Van Gilst, D. P.

    2015-12-01

    The Hurricane and Severe Storm Sentinel (HS3) is a five-year NASA mission targeted to enhance the understanding of the formation and evolution of hurricanes in the Atlantic basin. Measurements were made from two NASA Global Hawk Unmanned Aircraft Systems (UAS) during the 2012 through 2014 hurricane seasons, with flights conducted from the NASA Wallops Flight Facility. The Global Hawk aircraft are capable of high altitude flights with durations of up to 30 hours, which allow extensive observations over distant storms, not typically possible with manned aircraft. The two NASA Global Hawks were equipped with instrument suites to study the storm environment, and inner core structure and processes, respectively. The Scanning High-resolution Interferometer Sounder (S-HIS), designed and built by the University of Wisconsin (UW) Space Science and Engineering Center (SSEC), measures emitted thermal radiation at high spectral resolution between 3.3 and 18 microns. The radiance measurements are used to obtain temperature and water vapor profiles of the Earth's atmosphere. The S-HIS spatial resolution is 2 km at nadir, across a 40 km ground swath from a nominal altitude of 20 kilometers. Since 1998, the S-HIS has participated in 33 field campaigns and has proven to be extremely dependable, effective, and highly accurate. It has flown on the NASA ER-2, DC-8, Proteus, WB-57, and Global Hawk airborne platforms. The UW S-HIS infrared sounder instrument is equipped with a real-time ground data processing system capable of delivering atmospheric profiles, radiance data, and engineering status to mission support scientists - all within less than one minute from the time of observation. This ground data processing system was assembled by a small team using existing software and proven practical techniques similar to a satellite ground system architecture. This summary outlines the design overview for the system and illustrates the data path, content, and outcomes.

  5. Real-time blind deconvolution of retinal images in adaptive optics scanning laser ophthalmoscopy

    NASA Astrophysics Data System (ADS)

    Li, Hao; Lu, Jing; Shi, Guohua; Zhang, Yudong

    2011-06-01

    With the use of adaptive optics (AO), ocular aberrations can be compensated to obtain high-resolution images of the living human retina. However, the wavefront correction is not perfect due to wavefront measurement error and hardware restrictions. Thus, it is necessary to use a deconvolution algorithm to recover the retinal images. In this paper, a blind deconvolution technique called the Incremental Wiener filter is used to restore adaptive optics confocal scanning laser ophthalmoscope (AOSLO) images. The point-spread function (PSF) measured by the wavefront sensor is only used as an initial value of our algorithm. We also implement the Incremental Wiener filter on a graphics processing unit (GPU) in real time. When the image size is 512 × 480 pixels, six iterations of our algorithm take only about 10 ms. Retinal blood vessels as well as cells in retinal images are restored by our algorithm, and the PSFs are also revised. Retinal images with and without adaptive optics are both restored. The results show that the Incremental Wiener filter reduces noise and improves image quality.
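
    The record above names the restoration technique but not its core operation. Below is a minimal NumPy sketch of the non-blind Wiener filtering step that such a method builds on; the incremental/blind variant described in the abstract additionally refines the PSF between iterations. The function name and the noise-to-signal parameter `nsr` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def wiener_deconvolve(image, psf, nsr=0.01):
    """One Wiener filtering step: restore `image` given a PSF estimate.

    `nsr` is an assumed noise-to-signal power ratio; in a blind scheme the PSF
    passed in would start from the wavefront-sensor estimate and be updated on
    each iteration.
    """
    # Zero-pad the PSF to the image size and centre it at the origin.
    psf_pad = np.zeros_like(image, dtype=float)
    h, w = psf.shape
    psf_pad[:h, :w] = psf
    psf_pad = np.roll(psf_pad, (-(h // 2), -(w // 2)), axis=(0, 1))

    H = np.fft.fft2(psf_pad)
    G = np.fft.fft2(image)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)  # Wiener filter in the Fourier domain
    return np.real(np.fft.ifft2(W * G))
```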

  6. Real-time photoacoustic imaging of prostate brachytherapy seeds using a clinical ultrasound system.

    PubMed

    Kuo, Nathanael; Kang, Hyun Jae; Song, Danny Y; Kang, Jin U; Boctor, Emad M

    2012-06-01

    Prostate brachytherapy is a popular prostate cancer treatment option that involves the permanent implantation of radioactive seeds into the prostate. However, the contemporary brachytherapy procedure is limited by the lack of an imaging system that can provide real-time seed-position feedback. While many other imaging systems have been proposed, photoacoustic imaging has emerged as a potentially ideal modality to address this need, since it could easily be incorporated into the current ultrasound system used in the operating room. We present such a photoacoustic imaging system built around a clinical ultrasound system to achieve the task of visualizing and localizing seeds. We performed several experiments to analyze the effects of various parameters on the appearance of brachytherapy seeds in photoacoustic images. We also imaged multiple seeds in an ex vivo dog prostate phantom to demonstrate the possibility of using this system in a clinical setting. Although the system is still in its infancy, these initial results for the application of prostate brachytherapy seed localization are highly promising.

  7. Airborne Hyperspectral Imaging of Seagrass and Coral Reef

    NASA Astrophysics Data System (ADS)

    Merrill, J.; Pan, Z.; Mewes, T.; Herwitz, S.

    2013-12-01

    This talk presents the process of project preparation, airborne data collection, data pre-processing, and comparative analysis for a series of airborne hyperspectral projects focused on the mapping of seagrass and coral reef communities in the Florida Keys. As part of a series of large collaborative projects funded by the NASA ROSES program and the Florida Fish and Wildlife Conservation Commission and administered by the NASA UAV Collaborative, a series of airborne hyperspectral datasets was collected over six sites in the Florida Keys in May 2012, October 2012, and May 2013 by Galileo Group, Inc. using a manned Cessna 172 and NASA's SIERRA Unmanned Aerial Vehicle. Precise solar and tidal data were used to calculate airborne collection parameters and develop flight plans designed to optimize data quality. Two independent Visible and Near-Infrared (VNIR) hyperspectral imaging systems covering 400-1000 nm were used to collect imagery over six Areas of Interest (AOIs). Multiple collections were performed over all sites within strict solar windows in the mornings and afternoons. Independently developed pre-processing algorithms were employed to radiometrically correct, synchronize, and georectify individual flight lines, which were then combined into color-balanced mosaics for each Area of Interest. The use of two different hyperspectral sensors, as well as environmental variations between collections, allows for the comparative analysis of data quality and the iterative refinement of flight planning and collection parameters.

  8. Real time in vivo imaging and measurement of serine protease activity in the mouse hippocampus using a dedicated complementary metal-oxide semiconductor imaging device.

    PubMed

    Ng, David C; Tamura, Hideki; Tokuda, Takashi; Yamamoto, Akio; Matsuo, Masamichi; Nunoshita, Masahiro; Ishikawa, Yasuyuki; Shiosaka, Sadao; Ohta, Jun

    2006-09-30

    The aim of the present study is to demonstrate the application of complementary metal-oxide semiconductor (CMOS) imaging technology for studying the mouse brain. By using a dedicated CMOS image sensor, we have successfully imaged and measured brain serine protease activity in vivo, in real time, and for an extended period of time. We have developed a biofluorescence imaging device by packaging the CMOS image sensor in a way that enables an on-chip imaging configuration. In this configuration, no optics are required; instead, an excitation filter is applied directly onto the sensor to replace the filter cube block found in conventional fluorescence microscopes. The fully packaged device measures 350 μm thick × 2.7 mm wide, consists of an array of 176 × 144 pixels, and is small enough for measurement inside a single hemisphere of the mouse brain, while still providing sufficient imaging resolution. In the experiment, intraperitoneally injected kainic acid induced upregulation of serine protease activity in the brain. These events were captured in real time by imaging and measuring the fluorescence from a fluorogenic substrate that detected this activity. The entire device, which weighs less than 1% of the body weight of the mouse, holds promise for studying freely moving animals.

  9. FIR filters for hardware-based real-time multi-band image blending

    NASA Astrophysics Data System (ADS)

    Popovic, Vladan; Leblebici, Yusuf

    2015-02-01

    Creating panoramic images has become a popular feature in modern smart phones, tablets, and digital cameras. A user can create a 360-degree field-of-view photograph from only several images. The quality of the resulting image is related to the number of source images, their brightness, and the algorithm used for their stitching and blending. One of the algorithms that provides excellent results in terms of background color uniformity and reduction of ghosting artifacts is multi-band blending. The algorithm relies on decomposition of the image into multiple frequency bands using a dyadic filter bank; hence, the results are also highly dependent on the filter bank used. In this paper we analyze the performance of FIR filters used for multi-band blending. We present a set of five filters that showed the best results in both the literature and our experiments. The set includes a Gaussian filter, biorthogonal wavelets, and custom-designed maximally flat and equiripple FIR filters. The presented results of the filter comparison are based on several no-reference metrics for image quality. We conclude that the 5/3 biorthogonal wavelet produces the best results on average, especially when its short length is considered. Furthermore, we propose a real-time FPGA implementation of the blending algorithm, using a 2D non-separable systolic filtering scheme. Its pipeline architecture does not require hardware multipliers and is able to achieve very high operating frequencies. The implemented system is able to process 91 fps at 1080p (1920×1080) image resolution.
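
    To make the multi-band idea concrete, here is a hedged two-band Python/NumPy sketch (not the paper's FPGA design): low frequencies are mixed with a smoothed seam mask and high frequencies with the sharp mask, which is the mechanism that suppresses visible seams and ghosting. A full implementation repeats this per pyramid level, and the low-pass filter used here (Gaussian, via SciPy) is only one of the candidates the paper compares.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def two_band_blend(img_a, img_b, mask, sigma=5.0):
    """Blend two aligned grayscale images with a two-band decomposition.

    `mask` is 1 where img_a should dominate and 0 where img_b should; `sigma`
    controls the width of the low-frequency transition (illustrative value).
    """
    low_a, low_b = gaussian_filter(img_a, sigma), gaussian_filter(img_b, sigma)
    high_a, high_b = img_a - low_a, img_b - low_b

    soft_mask = gaussian_filter(mask.astype(float), sigma)
    low = soft_mask * low_a + (1.0 - soft_mask) * low_b   # wide transition for low band
    high = mask * high_a + (1.0 - mask) * high_b          # sharp transition for high band
    return low + high
```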

  10. High-accuracy and real-time 3D positioning, tracking system for medical imaging applications based on 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Xue, Yuan; Cheng, Teng; Xu, Xiaohai; Gao, Zeren; Li, Qianqian; Liu, Xiaojing; Wang, Xing; Song, Rui; Ju, Xiangyang; Zhang, Qingchuan

    2017-01-01

    This paper presents a system for positioning markers and tracking the pose of a rigid object with 6 degrees of freedom in real time using 3D digital image correlation, with two examples of medical imaging applications. The traditional DIC method was improved to meet real-time requirements by simplifying the computations of the integer-pixel search. Experiments were carried out and the results indicated that the new method improved the computational efficiency by about 4-10 times in comparison with the traditional DIC method. The system is aimed at orthognathic surgery navigation, in order to track the maxilla segment after LeFort I osteotomy. Experiments showed that the noise for a static point was at the level of 10⁻³ mm and the measurement accuracy was 0.009 mm. The system was also demonstrated on skin surface shape evaluation of a hand during finger stretching exercises, indicating great potential for tracking muscle and skin movements.
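
    As an illustration of the integer-pixel stage referred to above, the following is a brute-force Python/NumPy sketch of locating a marker patch by zero-normalised cross-correlation; the paper's contribution is precisely in accelerating this kind of search, and a sub-pixel refinement would follow around the best integer offset. Function and variable names are illustrative.

```python
import numpy as np

def integer_pixel_search(reference_patch, search_image):
    """Exhaustive integer-pixel search by zero-normalised cross-correlation (ZNCC)."""
    ph, pw = reference_patch.shape
    sh, sw = search_image.shape
    ref = reference_patch - reference_patch.mean()
    ref_norm = np.sqrt((ref ** 2).sum())

    best_score, best_pos = -np.inf, (0, 0)
    for y in range(sh - ph + 1):
        for x in range(sw - pw + 1):
            win = search_image[y:y + ph, x:x + pw]
            w = win - win.mean()
            denom = ref_norm * np.sqrt((w ** 2).sum())
            if denom == 0:
                continue
            score = float((ref * w).sum() / denom)   # ZNCC in [-1, 1]
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score  # top-left offset of the best match and its score
```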

  11. Real-time Awake Animal Motion Tracking System for SPECT Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goddard Jr, James Samuel; Baba, Justin S; Lee, Seung Joon

    Enhancements have been made in the development of a real-time optical pose measurement and tracking system that provides 3D position and orientation data for a single photon emission computed tomography (SPECT) imaging system for awake, unanesthetized, unrestrained small animals. Three optical cameras with infrared (IR) illumination view the head movements of an animal enclosed in a transparent burrow. Markers placed on the head provide landmark points for image segmentation. Strobed IR LEDs are synchronized to the cameras and illuminate the markers to prevent motion blur in each set of images. The system, using the three cameras, automatically segments the markers, detects missing data, rejects false reflections, performs trinocular marker correspondence, and calculates the 3D pose of the animal's head. Improvements have been made in methods for segmentation, tracking, and 3D calculation to give higher speed and more accurate measurements during a scan. The optical hardware has been installed within a Siemens MicroCAT II small animal scanner at Johns Hopkins without requiring functional changes to the scanner operation. The system has undergone testing using both phantoms and live mice and has been characterized in terms of speed, accuracy, robustness, and reliability. Experimental data showing these motion tracking results are given.

  12. Real time processor for array speckle interferometry

    NASA Astrophysics Data System (ADS)

    Chin, Gordon; Florez, Jose; Borelli, Renan; Fong, Wai; Miko, Joseph; Trujillo, Carlos

    1989-02-01

    The authors are constructing a real-time processor to acquire image frames, perform array flat-fielding, execute a 64 x 64 element two-dimensional complex FFT (fast Fourier transform) and average the power spectrum, all within the 25 ms coherence time for speckles at near-IR (infrared) wavelength. The processor will be a compact unit controlled by a PC with real-time display and data storage capability. This will provide the ability to optimize observations and obtain results on the telescope rather than waiting several weeks before the data can be analyzed and viewed with offline methods. The image acquisition and processing, design criteria, and processor architecture are described.
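
    The processing chain this record describes (array flat-fielding, a 64 × 64 complex FFT, and power-spectrum averaging within the speckle coherence time) can be summarised in a few lines of NumPy. The sketch below is a software illustration of that chain only, not the hardware processor; the array shapes and variable names are assumptions.

```python
import numpy as np

def average_power_spectrum(frames, flat_field):
    """Accumulate the mean power spectrum of flat-fielded speckle frames.

    `frames` is an (n_frames, 64, 64) stack of short-exposure images and
    `flat_field` a 64 x 64 detector response map.
    """
    acc = np.zeros(frames.shape[1:], dtype=float)
    for frame in frames:
        corrected = frame / flat_field        # array flat-fielding
        spectrum = np.fft.fft2(corrected)     # 64 x 64 complex FFT
        acc += np.abs(spectrum) ** 2          # accumulate the power spectrum
    return acc / len(frames)
```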

  13. Real time processor for array speckle interferometry

    NASA Technical Reports Server (NTRS)

    Chin, Gordon; Florez, Jose; Borelli, Renan; Fong, Wai; Miko, Joseph; Trujillo, Carlos

    1989-01-01

    The authors are constructing a real-time processor to acquire image frames, perform array flat-fielding, execute a 64 x 64 element two-dimensional complex FFT (fast Fourier transform) and average the power spectrum, all within the 25 ms coherence time for speckles at near-IR (infrared) wavelength. The processor will be a compact unit controlled by a PC with real-time display and data storage capability. This will provide the ability to optimize observations and obtain results on the telescope rather than waiting several weeks before the data can be analyzed and viewed with offline methods. The image acquisition and processing, design criteria, and processor architecture are described.

  14. Registration of fast cine cardiac MR slices to 3D preprocedural images: toward real-time registration for MRI-guided procedures

    NASA Astrophysics Data System (ADS)

    Smolikova, Renata; Wachowiak, Mark P.; Drangova, Maria

    2004-05-01

    Interventional cardiac magnetic resonance (MR) procedures are the subject of an increasing number of research studies. Typically, during the procedure only two-dimensional images of oblique slices can be presented to the interventionalist in real time. There is a clear benefit to being able to register the real-time 2D slices to a previously acquired 3D computed tomography (CT) or MR image of the heart. Results from a study of the accuracy of registration of 2D cardiac images of an anesthetized pig to a 3D volume obtained in diastole are presented. Fast cine MR images representing twenty phases of the cardiac cycle were obtained of a 2D slice in a known oblique orientation. The 2D images were initially mis-oriented by translations ranging from 2 to 20 mm and rotations of ±10 degrees about all three axes. Images from all 20 cardiac phases were registered to examine the effect of timing between the 2D image and the 3D pre-procedural image. Linear registration using mutual information computed with 64 histogram bins yielded the highest accuracy. For the diastolic phases, mean translation and rotation errors ranged between 0.91 and 1.32 mm and between 1.73 and 2.10 degrees. Scans acquired at other phases also had high accuracy. These results are promising for the use of real-time MR in image-guided cardiac interventions, and demonstrate the feasibility of registering 2D oblique MR slices to previously acquired single-phase volumes without preprocessing.
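
    For readers unfamiliar with the similarity metric mentioned above, here is a short Python/NumPy sketch of mutual information computed from a 64-bin joint histogram; in the registration loop this value is maximised over the pose of the 2D slice relative to the corresponding reslice of the 3D volume. This is a generic illustration under those assumptions, not the authors' registration code.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Mutual information between two equally sized intensity images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                       # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)             # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)             # marginal of img_b
    nz = pxy > 0                                    # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```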

  15. Laser speckle imaging allows real-time intraoperative blood flow assessment during neurosurgical procedures.

    PubMed

    Hecht, Nils; Woitzik, Johannes; König, Susanne; Horn, Peter; Vajkoczy, Peter

    2013-07-01

    Currently, there is no adequate technique for intraoperative monitoring of cerebral blood flow (CBF). To evaluate laser speckle imaging (LSI) for assessment of relative CBF, LSI was performed in 30 patients who underwent direct surgical revascularization for treatment of arteriosclerotic cerebrovascular disease (ACVD), Moyamoya disease (MMD), or giant aneurysms, and in 8 control patients who underwent intracranial surgery for reasons other than hemodynamic compromise. The applicability and sensitivity of LSI were investigated through baseline perfusion and CO2 reactivity testing. The dynamics of LSI were assessed during bypass test occlusion and flow initiation procedures. Laser speckle imaging permitted robust (pseudo-)quantitative assessment of relative microcirculatory flow, and standard bypass grafting resulted in significantly higher postoperative baseline perfusion values in ACVD and MMD. The applicability and sensitivity of LSI were shown by a significantly reduced CO2 reactivity in ACVD (9.6±9%) and MMD (8.5±8%) compared with control (31.2±5%; P<0.0001). In high- and intermediate-flow bypass patients, LSI was characterized by a dynamic real-time response to acute perfusion changes and ultimately confirmed sufficient flow substitution through the bypass graft. Thus, LSI can be used for sensitive, continuous, and non-invasive real-time visualization and measurement of relative cortical CBF at excellent spatial-temporal resolution.

  16. Real-Time Integrated Photoacoustic and Ultrasound (PAUS) Imaging System to Guide Interventional Procedures: Ex Vivo Study

    PubMed Central

    Wei, Chen-Wei; Nguyen, Thu-Mai; Xia, Jinjun; Arnal, Bastien; Wong, Emily Y.; Pelivanov, Ivan M.; O’Donnell, Matthew

    2015-01-01

    Because of depth-dependent light attenuation, bulky, low-repetition-rate lasers are usually used in most photoacoustic (PA) systems to provide sufficient pulse energies to image at depth within the body. However, integrating these lasers with real-time clinical ultrasound (US) scanners has been problematic because of their size and cost. In this paper, an integrated PA/US (PAUS) imaging system is presented operating at frame rates >30 Hz. By employing a portable, low-cost, low-pulse-energy (~2 mJ/pulse), high-repetition-rate (~1 kHz), 1053-nm laser, and a rotating galvo-mirror system enabling rapid laser beam scanning over the imaging area, the approach is demonstrated for potential applications requiring a few centimeters of penetration. In particular, we demonstrate here real-time (30 Hz frame rate) imaging (by combining multiple single-shot sub-images covering the scan region) of an 18-gauge needle inserted into a piece of chicken breast with subsequent delivery of an absorptive agent at more than 1-cm depth to mimic PAUS guidance of an interventional procedure. A signal-to-noise ratio of more than 35 dB is obtained for the needle in an imaging area 2.8 × 2.8 cm (depth × lateral). Higher frame rate operation is envisioned with an optimized scanning scheme. PMID:25643081

  17. Instant Grainification: Real-Time Grain-Size Analysis from Digital Images in the Field

    NASA Astrophysics Data System (ADS)

    Rubin, D. M.; Chezar, H.

    2007-12-01

    Over the past few years, digital cameras and underwater microscopes have been developed to collect in-situ images of sand-sized bed sediment, and software has been developed to measure grain size from those digital images (Chezar and Rubin, 2004; Rubin, 2004; Rubin et al., 2006). Until now, all image processing and grain-size analysis was done back in the office where images were uploaded from cameras and processed on desktop computers. Computer hardware has become small and rugged enough to process images in the field, which for the first time allows real-time grain-size analysis of sand-sized bed sediment. We present such a system consisting of weatherproof tablet computer, open source image-processing software (autocorrelation code of Rubin, 2004, running under Octave and Cygwin), and digital camera with macro lens. Chezar, H., and Rubin, D., 2004, Underwater microscope system: U.S. Patent and Trademark Office, patent number 6,680,795, January 20, 2004. Rubin, D.M., 2004, A simple autocorrelation algorithm for determining grain size from digital images of sediment: Journal of Sedimentary Research, v. 74, p. 160-165. Rubin, D.M., Chezar, H., Harney, J.N., Topping, D.J., Melis, T.S., and Sherwood, C.R., 2006, Underwater microscope for measuring spatial and temporal changes in bed-sediment grain size: USGS Open-File Report 2006-1360.
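
    In the spirit of the autocorrelation method cited above (Rubin, 2004), the following Python/NumPy sketch computes a normalised autocorrelation curve for a sediment image: coarser sand stays correlated over larger pixel offsets, so the decay of this curve, compared against curves from calibration images of known grain size, yields an estimate of grain size. The implementation details here (FFT-based circular autocorrelation, the `max_lag` cutoff) are illustrative rather than taken from the cited code.

```python
import numpy as np

def autocorrelation_curve(image, max_lag=30):
    """Normalised autocorrelation of a grayscale sediment image versus pixel lag."""
    img = image.astype(float)
    img -= img.mean()
    f = np.fft.fft2(img)
    acf = np.real(np.fft.ifft2(f * np.conj(f)))   # Wiener-Khinchin: circular autocorrelation
    acf /= acf[0, 0]                              # unit correlation at zero lag
    # Average the horizontal and vertical profiles out to max_lag pixels.
    return 0.5 * (acf[0, :max_lag] + acf[:max_lag, 0])
```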

  18. Airborne Microwave Imaging of River Velocities

    NASA Technical Reports Server (NTRS)

    Plant, William J.

    2002-01-01

    The objective of this project was to determine whether airborne microwave remote sensing systems can measure river surface currents with sufficient accuracy to make them prospective instruments with which to monitor river flow from space. The approach was to fly a coherent airborne microwave Doppler radar, developed by APL/UW, on a light airplane along several rivers in western Washington state over an extended period of time. The fundamental quantity obtained by this system to measure river currents is the mean offset of the Doppler spectrum. Since this quantity can be obtained from interferometric synthetic aperture radars (INSARs), which can be flown in space, this project provided a cost-effective means for determining the suitability of spaceborne INSAR for measuring river flow.

  19. Integrating and Visualizing Tropical Cyclone Data Using the Real Time Mission Monitor

    NASA Technical Reports Server (NTRS)

    Goodman, H. Michael; Blakeslee, Richard; Conover, Helen; Hall, John; He, Yubin; Regner, Kathryn

    2009-01-01

    The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources, to enable real time decision-making for airborne and ground validation experiments. Developed at the NASA Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. RTMM is extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, scientists, and managers appreciate the contributions that RTMM makes to their flight projects. A broad spectrum of interdisciplinary scientists used RTMM during field campaigns including the hurricane-focused 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 NOAA-NASA Aerosonde Hurricane Noel flight, 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), plus a soil moisture (SMAP-VEX) and two arctic research experiments (ARCTAS) in 2008. Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface. Individual flight legs are automatically calculated "on the fly". The resultant flight plan is then immediately posted to the Google Earth-based RTMM for interested scientists to view the planned flight track and subsequently compare it to the actual real time flight progress. We are planning additional capabilities to RTMM including collaborations with the Jet Propulsion

  20. Imaging the Directed Transport of Single Engineered RNA Transcripts in Real-Time Using Ratiometric Bimolecular Beacons

    PubMed Central

    Zhang, Xuemei; Zajac, Allison L.; Huang, Lingyan; Behlke, Mark A.; Tsourkas, Andrew

    2014-01-01

    The relationship between RNA expression and cell function can often be difficult to decipher due to the presence of both temporal and sub-cellular processing of RNA. These intricacies of RNA regulation can often be overlooked when only acquiring global measurements of RNA expression. This has led to the development of several tools that allow for the real-time imaging of individual engineered RNA transcripts in living cells. Here, we describe a new technique that utilizes an oligonucleotide-based probe, the ratiometric bimolecular beacon (RBMB), to image RNA transcripts that were engineered to contain 96 tandem repeats of the RBMB target sequence in the 3′-untranslated region. Binding of RBMBs to the target RNA resulted in discrete bright fluorescent spots, representing individual transcripts, that could be imaged in real time. Since RBMBs are a synthetic probe, the use of photostable, bright, and red-shifted fluorophores led to a high signal-to-background ratio. RNA motion was readily characterized by both mean squared displacement and moment scaling spectrum analyses. These analyses revealed clear examples of directed, Brownian, and subdiffusive movements. PMID:24454933

  1. Real-time high dynamic range laser scanning microscopy

    NASA Astrophysics Data System (ADS)

    Vinegoni, C.; Leon Swisher, C.; Fumene Feruglio, P.; Giedt, R. J.; Rousso, D. L.; Stapleton, S.; Weissleder, R.

    2016-04-01

    In conventional confocal/multiphoton fluorescence microscopy, images are typically acquired under ideal settings and after extensive optimization of parameters for a given structure or feature, often resulting in information loss from other image attributes. To overcome the problem of selective data display, we developed a new method that extends the imaging dynamic range in optical microscopy and improves the signal-to-noise ratio. Here we demonstrate how real-time and sequential high dynamic range microscopy facilitates automated three-dimensional neural segmentation. We address reconstruction and segmentation performance on samples with different size, anatomy and complexity. Finally, in vivo real-time high dynamic range imaging is also demonstrated, making the technique particularly relevant for longitudinal imaging in the presence of physiological motion and/or for quantification of in vivo fast tracer kinetics during functional imaging.
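
    The record above does not spell out how the extended dynamic range is assembled, so the following is a hedged Python/NumPy sketch of one generic way to merge two acquisitions of the same field of view taken at different detector gains: use the high-gain channel where it is unsaturated and fall back to the rescaled low-gain channel elsewhere. The gain ratio, the saturation level, and the assumption that inputs are normalised to [0, 1] are all illustrative, not details from the paper.

```python
import numpy as np

def merge_exposures(low_gain, high_gain, gain_ratio, saturation=0.95):
    """Merge low-gain and high-gain images (both normalised to [0, 1]) into an HDR frame.

    `gain_ratio` is the assumed high/low detector gain ratio used to bring the
    low-gain channel onto the same radiometric scale as the high-gain channel.
    """
    low = low_gain.astype(float) * gain_ratio                 # rescale low-gain data
    high = high_gain.astype(float)
    # Weight is 1 where the high-gain channel is far from saturation, 0 where saturated.
    w = np.clip((saturation - high) / saturation, 0.0, 1.0)
    return w * high + (1.0 - w) * low
```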

  2. A Bayesian approach to real-time 3D tumor localization via monoscopic x-ray imaging during treatment delivery.

    PubMed

    Li, Ruijiang; Fahimian, Benjamin P; Xing, Lei

    2011-07-01

    Monoscopic x-ray imaging with on-board kV devices is an attractive approach for real-time image guidance in modern radiation therapy such as VMAT or IMRT, but it falls short in providing reliable information along the direction of the imaging x-ray. By effectively taking into consideration projection data at prior times and/or angles through a Bayesian formalism, the authors develop an algorithm for real-time and full 3D tumor localization with a single x-ray imager during treatment delivery. First, a prior probability density function is constructed using the 2D tumor locations on the projection images acquired during patient setup. Whenever an x-ray image is acquired during the treatment delivery, the corresponding 2D tumor location on the imager is used to update the likelihood function. The unresolved third dimension is obtained by maximizing the posterior probability distribution. The algorithm can also be used in a retrospective fashion when all the projection images during the treatment delivery are used for 3D localization purposes. The algorithm does not involve complex optimization of any model parameter and therefore can be used in a "plug-and-play" fashion. The authors validated the algorithm using (1) simulated 3D linear and elliptic motion and (2) 3D tumor motion trajectories of a lung and a pancreas patient reproduced by a physical phantom. Continuous kV images were acquired over a full gantry rotation with the Varian TrueBeam on-board imaging system. Three scenarios were considered: fluoroscopic setup, cone beam CT setup, and retrospective analysis. For the simulation study, the RMS 3D localization error is 1.2 and 2.4 mm for the linear and elliptic motions, respectively. For the phantom experiments, the 3D localization error is < 1 mm on average and < 1.5 mm at the 95th percentile in the lung and pancreas cases for all three scenarios. The difference in 3D localization error for different scenarios is small and is not statistically significant. The proposed
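
    To illustrate the geometric idea behind resolving the third dimension, here is a toy Python/NumPy sketch: the kV projection pins the tumor to a line through the x-ray source, and for a Gaussian prior the maximum of the posterior along that line has a closed form. The real algorithm builds its prior from setup projections and updates a likelihood at each frame; the Gaussian prior, the ray parameterisation, and all names below are simplifying assumptions.

```python
import numpy as np

def map_position_along_ray(ray_origin, ray_dir, prior_mean, prior_cov):
    """MAP estimate of the tumor position constrained to the imaging ray.

    Maximises a 3D Gaussian prior N(prior_mean, prior_cov) over points
    p(t) = ray_origin + t * ray_dir; the maximiser has a closed form.
    """
    d = ray_dir / np.linalg.norm(ray_dir)
    prec = np.linalg.inv(prior_cov)                      # prior precision matrix
    t_star = (d @ prec @ (prior_mean - ray_origin)) / (d @ prec @ d)
    return ray_origin + t_star * d                       # estimated 3D position
```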

  3. Real-time implementation of optimized maximum noise fraction transform for feature extraction of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Wu, Yuanfeng; Gao, Lianru; Zhang, Bing; Zhao, Haina; Li, Jun

    2014-01-01

    We present a parallel implementation of the optimized maximum noise fraction (G-OMNF) transform algorithm for feature extraction of hyperspectral images on commodity graphics processing units (GPUs). The proposed approach explored the algorithm's data-level concurrency and optimized the computing flow. We first defined a three-dimensional grid in which each thread calculates a sub-block of data, to easily facilitate the spatial and spectral neighborhood data searches in noise estimation, which is one of the most important steps involved in OMNF. Then, we optimized the processing flow and computed the noise covariance matrix before computing the image covariance matrix to reduce the original hyperspectral image data transmission. These optimization strategies can greatly improve the computing efficiency and can be applied to other feature extraction algorithms. The proposed parallel feature extraction algorithm was implemented on an Nvidia Tesla GPU using the compute unified device architecture and the basic linear algebra subroutines library. Through experiments on several real hyperspectral images, our GPU parallel implementation provides a significant speedup of the algorithm compared with the CPU implementation, especially for highly data-parallelizable and arithmetically intensive algorithm parts, such as noise estimation. In order to further evaluate the effectiveness of G-OMNF, we used two different applications, spectral unmixing and classification, for evaluation. Considering the sensor scanning rate and the data acquisition time, the proposed parallel implementation met the requirements for on-board real-time feature extraction.
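
    For context on what the GPU version parallelises, below is a CPU-side Python/NumPy sketch of a basic MNF transform: estimate the noise covariance from neighbouring-pixel differences, then solve the generalised eigenproblem of data versus noise covariance and keep the highest-SNR components. The optimised (OMNF) noise estimator in the paper uses a combined spatial/spectral neighbourhood; simple horizontal differencing is assumed here for brevity, and the function name is illustrative.

```python
import numpy as np

def mnf_transform(cube, n_components=10):
    """Maximum noise fraction transform of a hyperspectral cube shaped (rows, cols, bands)."""
    r, c, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    X -= X.mean(axis=0)

    # Noise estimate from horizontal neighbour differences (simplifying assumption).
    noise = (cube[:, 1:, :].astype(float) - cube[:, :-1, :].astype(float)).reshape(-1, b) / np.sqrt(2.0)
    cov_noise = np.cov(noise, rowvar=False)
    cov_data = np.cov(X, rowvar=False)

    # Generalised eigenproblem cov_data v = lambda cov_noise v, sorted by decreasing SNR.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(cov_noise, cov_data))
    order = np.argsort(eigvals.real)[::-1]
    V = eigvecs[:, order[:n_components]].real
    return (X @ V).reshape(r, c, n_components)
```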

  4. Replacing missing data between airborne SAR coherent image pairs

    DOE PAGES

    Musgrove, Cameron H.; West, James C.

    2017-07-31

    For synthetic aperture radar systems, missing data samples can cause severe image distortion. When multiple, coherent data collections exist and the missing data samples do not overlap between collections, there exists the possibility of replacing data samples between collections. For airborne radar, the known and unknown motion of the aircraft prevents direct data sample replacement to repair image features. Finally, this paper presents a method to calculate the necessary phase corrections to enable data sample replacement using only the collected radar data.

  5. Replacing missing data between airborne SAR coherent image pairs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musgrove, Cameron H.; West, James C.

    For synthetic aperture radar systems, missing data samples can cause severe image distortion. When multiple, coherent data collections exist and the missing data samples do not overlap between collections, there exists the possibility of replacing data samples between collections. For airborne radar, the known and unknown motion of the aircraft prevents direct data sample replacement to repair image features. Finally, this paper presents a method to calculate the necessary phase corrections to enable data sample replacement using only the collected radar data.

  6. Imager-to-Radiometer In-flight Cross Calibration: RSP Radiometric Comparison with Airborne and Satellite Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Cairns, Brian; Wasilewski, Andrzej

    2016-01-01

    This work develops a method to compare the radiometric calibration between a radiometer and imagers hosted on aircraft and satellites. The radiometer is the airborne Research Scanning Polarimeter (RSP), which takes multi-angle, photo-polarimetric measurements in several spectral channels. The RSP measurements used in this work were coincident with measurements made by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), which was on the same aircraft. These airborne measurements were also coincident with an overpass of the Landsat 8 Operational Land Imager (OLI). First we compare the RSP and OLI radiance measurements to AVIRIS, since the spectral response of the multispectral instruments can be used to synthesize a spectrally equivalent signal from the imaging spectrometer data. We then explore a method that uses AVIRIS as a transfer between RSP and OLI to show that the radiometric traceability of a satellite-based imager can be used to calibrate a radiometer despite differences in spectral channel sensitivities. This calibration transfer shows agreement within the uncertainties of the various instruments for most spectral channels.

  7. A new ultrasonic real-time scanner featuring a servo-controlled transducer displaying a sector image.

    PubMed

    Skolnick, M L; Matzuk, T

    1978-08-01

    This paper describes a new real-time servo-controlled sector scanner that produces high-resolution images similar to phased-array systems, but possesses the simplicity of design and low cost best achievable in a mechanical sector scanner. Its unique feature is the transducer head, which contains a single moving part--the transducer. Frame rates vary from 0 to 30 frames per second and the sector angle from 0 to 60 degrees. Abdominal applications include differentiation of vascular structures, detection of small masses, imaging of diagonally oriented organs, survey scanning, and demonstration of regions difficult to image with contact scanners. Cardiac uses are also described.

  8. New generation of magnetic and luminescent nanoparticles for in vivo real-time imaging

    PubMed Central

    Lacroix, Lise-Marie; Delpech, Fabien; Nayral, Céline; Lachaize, Sébastien; Chaudret, Bruno

    2013-01-01

    A new generation of optimized contrast agents is emerging, based on metallic nanoparticles (NPs) and semiconductor nanocrystals for, respectively, magnetic resonance imaging (MRI) and near-infrared (NIR) fluorescent imaging techniques. Compared with established contrast agents, such as iron oxide NPs or organic dyes, these NPs benefit from several advantages: their magnetic and optical properties can be tuned through size, shape and composition engineering, their efficiency can exceed by several orders of magnitude that of contrast agents in clinical use, their surface can be modified to incorporate specific targeting agents and antifouling polymers to increase blood circulation time and tumour recognition, and they can possibly be integrated in complex architectures to yield multi-modal imaging agents. In this review, we will report the materials of choice based on the understanding of the basic physics of NIR and MRI techniques and their corresponding syntheses as NPs. Surface engineering, water transfer and specific targeting will be highlighted prior to their first use for in vivo real-time imaging. Highly efficient NPs that are safer and target-specific are likely to enter clinical application in the near future. PMID:24427542

  9. Axial Tomography from Digitized Real Time Radiography

    DOE R&D Accomplishments Database

    Zolnay, A. S.; McDonald, W. M.; Doupont, P. A.; McKinney, R. L.; Lee, M. M.

    1985-01-18

    Axial tomography from digitized real-time radiographs provides a useful tool for industrial radiography and tomography. The components of this system are: x-ray source, image intensifier, video camera, video line extractor and digitizer, and data storage and reconstruction computers. With this system it is possible to view a two-dimensional x-ray image in real time at each angle of rotation and select the tomography plane of interest by choosing which video line to digitize. The digitization of a video line requires less than a second, making data acquisition relatively short. Further improvements to this system are planned, and initial results are reported.
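
    The reconstruction step implied above (one digitised video line per rotation angle forming a sinogram row) can be illustrated with a short Python sketch using NumPy and SciPy. This is plain unfiltered back-projection for clarity; a practical reconstruction would ramp-filter each row first, and nothing here is taken from the original system.

```python
import numpy as np
from scipy.ndimage import rotate

def backproject(sinogram, angles_deg):
    """Unfiltered back-projection of a sinogram whose rows are video lines per angle.

    `sinogram` has shape (n_angles, n_detectors); each row is smeared across the
    image plane at its acquisition angle and the results are summed.
    """
    n_angles, n_det = sinogram.shape
    recon = np.zeros((n_det, n_det), dtype=float)
    for row, theta in zip(sinogram, angles_deg):
        smear = np.tile(row, (n_det, 1))                       # constant along the ray direction
        recon += rotate(smear, theta, reshape=False, order=1)  # rotate to the view angle
    return recon * np.pi / (2 * n_angles)                      # standard back-projection scaling
```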

  10. Proceedings of the Airborne Imaging Spectrometer Data Analysis Workshop

    NASA Technical Reports Server (NTRS)

    Vane, G. (Editor); Goetz, A. F. H. (Editor)

    1985-01-01

    The Airborne Imaging Spectrometer (AIS) Data Analysis Workshop was held at the Jet Propulsion Laboratory on April 8 to 10, 1985. It was attended by 92 people who heard reports on 30 investigations currently under way using AIS data that have been collected over the past two years. Written summaries of 27 of the presentations are in these Proceedings. Many of the results presented at the Workshop are preliminary because most investigators have been working with this fundamentally new type of data for only a relatively short time. Nevertheless, several conclusions can be drawn from the Workshop presentations concerning the value of imaging spectrometry to Earth remote sensing. First, work with AIS has shown that direct identification of minerals through high spectral resolution imaging is a reality for a wide range of materials and geological settings. Second, there are strong indications that high spectral resolution remote sensing will enhance the ability to map vegetation species. There are also good indications that imaging spectrometry will be useful for biochemical studies of vegetation. Finally, there are a number of new data analysis techniques under development which should lead to more efficient and complete information extraction from imaging spectrometer data. The results of the Workshop indicate that as experience is gained with this new class of data, and as new analysis methodologies are developed and applied, the value of imaging spectrometry should increase.

  11. Novel Insights into the Proteus mirabilis Crystalline Biofilm Using Real-Time Imaging

    PubMed Central

    Wilks, Sandra A.; Fader, Mandy J.; Keevil, C. William

    2015-01-01

    The long-term use of indwelling catheters results in a high risk of urinary tract infections (UTIs) and blockage. Blockages often occur from crystalline deposits, formed as the pH rises due to the action of urease-producing bacteria, the most commonly found species being Proteus mirabilis. These crystalline biofilms have been found to develop on all catheter materials, with P. mirabilis attaching to all surfaces and forming encrustations. Previous studies have mainly relied on electron microscopy to describe this process, but there remains a lack of understanding of the stages of biofilm formation. Using an advanced light microscopy technique, episcopic differential interference contrast (EDIC) microscopy combined with epifluorescence (EF), we describe a non-destructive, non-contact, real-time imaging method used to track all stages of biofilm development from initial single cell attachment to complex crystalline biofilm formation. Using a simple six-well plate system, attachment of P. mirabilis (in artificial urine) to sections of silicone and hydrogel latex catheters was tracked over time (up to 24 days). Using EDIC and EF we show how initial attachment occurred in less than 1 h following exposure to P. mirabilis. This was rapidly followed by an accumulation of additional material (indicated to be carbohydrate based using lectin staining) and the presence of highly elongated, motile cells. After 24 h exposure, a layer developed above this conditioning film and within 4 days the entire surface (of both catheter materials) was covered with diffuse crystalline deposits with defined crystals embedded. Using three-dimensional image reconstruction software, cells of P. mirabilis were seen covering the crystal surfaces. EDIC microscopy could resolve these four components of the complex crystalline biofilm and the close relationship between P. mirabilis and the crystals. This real-time imaging technique permits study of this complex biofilm development with no risk

  12. Novel Insights into the Proteus mirabilis Crystalline Biofilm Using Real-Time Imaging.

    PubMed

    Wilks, Sandra A; Fader, Mandy J; Keevil, C William

    2015-01-01

    The long-term use of indwelling catheters results in a high risk of urinary tract infections (UTIs) and blockage. Blockages often occur from crystalline deposits, formed as the pH rises due to the action of urease-producing bacteria, the most commonly found species being Proteus mirabilis. These crystalline biofilms have been found to develop on all catheter materials, with P. mirabilis attaching to all surfaces and forming encrustations. Previous studies have mainly relied on electron microscopy to describe this process, but there remains a lack of understanding of the stages of biofilm formation. Using an advanced light microscopy technique, episcopic differential interference contrast (EDIC) microscopy combined with epifluorescence (EF), we describe a non-destructive, non-contact, real-time imaging method used to track all stages of biofilm development from initial single cell attachment to complex crystalline biofilm formation. Using a simple six-well plate system, attachment of P. mirabilis (in artificial urine) to sections of silicone and hydrogel latex catheters was tracked over time (up to 24 days). Using EDIC and EF we show how initial attachment occurred in less than 1 h following exposure to P. mirabilis. This was rapidly followed by an accumulation of additional material (indicated to be carbohydrate based using lectin staining) and the presence of highly elongated, motile cells. After 24 h exposure, a layer developed above this conditioning film and within 4 days the entire surface (of both catheter materials) was covered with diffuse crystalline deposits with defined crystals embedded. Using three-dimensional image reconstruction software, cells of P. mirabilis were seen covering the crystal surfaces. EDIC microscopy could resolve these four components of the complex crystalline biofilm and the close relationship between P. mirabilis and the crystals. This real-time imaging technique permits study of this complex biofilm development with no risk

  13. Collaborative real-time motion video analysis by human observer and image exploitation algorithms

    NASA Astrophysics Data System (ADS)

    Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

    2015-05-01

    Motion video analysis is a challenging task, especially in real-time applications. In most safety- and security-critical applications, a human observer is an obligatory part of the overall analysis system. Over the last years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into the current video exploitation systems. In this paper, a system design is introduced which strives to combine both the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work where we showed the benefits for the human observer by means of a user interface which utilizes the human visual focus of attention, revealed by the eye gaze direction, for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving target acquisition in video images than traditional computer mouse selection. The system design also builds on prior work we did on automated target detection, segmentation, and tracking algorithms. Besides the system design, a first pilot study is presented, in which we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy-to-use interaction technique for performing selection operations on moving targets in videos in order to initialize an object tracking function.

  14. Improvements and Additions to NASA Near Real-Time Earth Imagery

    NASA Technical Reports Server (NTRS)

    Cechini, Matthew; Boller, Ryan; Baynes, Kathleen; Schmaltz, Jeffrey; DeLuca, Alexandar; King, Jerome; Thompson, Charles; Roberts, Joe; Rodriguez, Joshua; Gunnoe, Taylor

    2016-01-01

    For many years, the NASA Global Imagery Browse Services (GIBS) has worked closely with the Land, Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) system to provide near real-time imagery visualizations of AIRS (Atmospheric Infrared Sounder), MLS (Microwave Limb Sounder), MODIS (Moderate Resolution Imaging Spectrometer), OMI (Ozone Monitoring Instrument), and recently VIIRS (Visible Infrared Imaging Radiometer Suite) science parameters. These visualizations are readily available through standard web services and the NASA Worldview client. Access to near real-time imagery provides a critical capability to GIBS and Worldview users. GIBS continues to focus on improving its commitment to providing near real-time imagery for end-user applications. The focus of this presentation will be the following completed or planned GIBS system and imagery enhancements relating to near real-time imagery visualization.

  15. Real-time FPGA architectures for computer vision

    NASA Astrophysics Data System (ADS)

    Arias-Estrada, Miguel; Torres-Huitzil, Cesar

    2000-03-01

    This paper presents an architecture for real-time generic convolution of a mask and an image. The architecture is intended for fast low-level image processing. The FPGA-based architecture takes advantage of the availability of registers in FPGAs to implement an efficient and compact module to process the convolutions. The architecture is designed to minimize the number of accesses to the image memory and is based on parallel modules with internal pipeline operation in order to improve its performance. The architecture is prototyped in an FPGA, but it can be implemented on a dedicated VLSI device to reach higher clock frequencies. Complexity issues, FPGA resource utilization, FPGA limitations, and real-time performance are discussed. Some results are presented and discussed.
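
    The memory-access pattern described above can be mimicked in software. The sketch below is a Python/NumPy illustration (not the FPGA design itself): a line buffer holds only the most recent kernel-height rows, so every image pixel is read from memory exactly once while the sliding window is formed from the buffer. Border handling is simply skipped, and the kernel is applied in correlation form; flip it beforehand for strict convolution.

```python
import numpy as np
from collections import deque

def streaming_filter(image, kernel):
    """Sliding-window filtering that reads each image pixel exactly once, row by row."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=float)

    line_buffer = deque(maxlen=kh)                     # holds the most recent kh rows
    for y in range(h):
        line_buffer.append(image[y].astype(float))     # the single read of this row
        if len(line_buffer) < kh:
            continue                                   # window not yet full
        window_rows = np.stack(line_buffer)            # kh x w working set
        for x in range(w - kw + 1):
            out[y - kh + 1, x] = float((window_rows[:, x:x + kw] * kernel).sum())
    return out
```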

  16. Real-time Internet connections: implications for surgical decision making in laparoscopy.

    PubMed

    Broderick, T J; Harnett, B M; Doarn, C R; Rodas, E B; Merrell, R C

    2001-08-01

    To determine whether a low-bandwidth Internet connection can provide adequate image quality to support remote real-time surgical consultation. Telemedicine has been used to support care at a distance through the use of expensive equipment and broadband communication links. In the past, the operating room has been an isolated environment that has been relatively inaccessible for real-time consultation. Recent technological advances have permitted videoconferencing over low-bandwidth, inexpensive Internet connections. If these connections are shown to provide adequate video quality for surgical applications, low-bandwidth telemedicine will open the operating room environment to remote real-time surgical consultation. Surgeons performing a laparoscopic cholecystectomy in Ecuador or the Dominican Republic shared real-time laparoscopic images with a panel of surgeons at the parent university through a dial-up Internet account. The connection permitted video and audio teleconferencing to support real-time consultation as well as the transmission of real-time images and store-and-forward images for observation by the consultant panel. A total of six live consultations were analyzed. In addition, paired local and remote images were "grabbed" from the video feed during these laparoscopic cholecystectomies. Nine of these paired images were then placed into a Web-based tool designed to evaluate the effect of transmission on image quality. The authors showed for the first time the ability to identify critical anatomic structures in laparoscopy over a low-bandwidth connection via the Internet. The consultant panel of surgeons correctly remotely identified biliary and arterial anatomy during six laparoscopic cholecystectomies. Within the Web-based questionnaire, 15 surgeons could not blindly distinguish the quality of local and remote laparoscopic images. Low-bandwidth, Internet-based telemedicine is inexpensive, effective, and almost ubiquitous. Use of these inexpensive

  17. Real-Time Internet Connections: Implications for Surgical Decision Making in Laparoscopy

    PubMed Central

    Broderick, Timothy J.; Harnett, Brett M.; Doarn, Charles R.; Rodas, Edgar B.; Merrell, Ronald C.

    2001-01-01

    Objective To determine whether a low-bandwidth Internet connection can provide adequate image quality to support remote real-time surgical consultation. Summary Background Data Telemedicine has been used to support care at a distance through the use of expensive equipment and broadband communication links. In the past, the operating room has been an isolated environment that has been relatively inaccessible for real-time consultation. Recent technological advances have permitted videoconferencing over low-bandwidth, inexpensive Internet connections. If these connections are shown to provide adequate video quality for surgical applications, low-bandwidth telemedicine will open the operating room environment to remote real-time surgical consultation. Methods Surgeons performing a laparoscopic cholecystectomy in Ecuador or the Dominican Republic shared real-time laparoscopic images with a panel of surgeons at the parent university through a dial-up Internet account. The connection permitted video and audio teleconferencing to support real-time consultation as well as the transmission of real-time images and store-and-forward images for observation by the consultant panel. A total of six live consultations were analyzed. In addition, paired local and remote images were “grabbed” from the video feed during these laparoscopic cholecystectomies. Nine of these paired images were then placed into a Web-based tool designed to evaluate the effect of transmission on image quality. Results The authors showed for the first time the ability to identify critical anatomic structures in laparoscopy over a low-bandwidth connection via the Internet. The consultant panel of surgeons correctly remotely identified biliary and arterial anatomy during six laparoscopic cholecystectomies. Within the Web-based questionnaire, 15 surgeons could not blindly distinguish the quality of local and remote laparoscopic images. Conclusions Low-bandwidth, Internet-based telemedicine is inexpensive

  18. Real time three-dimensional space video rate sensors for millimeter waves imaging based very inexpensive plasma LED lamps

    NASA Astrophysics Data System (ADS)

    Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S.; Rozban, Daniel; Abramovich, Amir

    2014-10-01

    In recent years, much effort has been invested to develop inexpensive but sensitive Millimeter Wave (MMW) detectors that can be used in focal plane arrays (FPAs) in order to implement real-time MMW imaging. Real-time MMW imaging systems are required for many varied applications in fields such as homeland security, medicine, communications, military products, and space technology, mainly because this radiation has high penetration and good navigability through dust storms, fog, heavy rain, dielectric materials, biological tissue, and diverse materials. Moreover, the atmospheric attenuation in this range of the spectrum is relatively low and the scattering is also low compared to NIR and VIS. The lack of inexpensive room-temperature imaging systems makes it difficult to provide a suitable MMW system for many of the above applications. In the last few years we have advanced the research and development of sensors using very inexpensive (30-50 cents) Glow Discharge Detector (GDD) plasma indicator lamps as MMW detectors. This paper presents three kinds of GDD-lamp-based Focal Plane Arrays (FPAs). The three cameras differ in the number of detectors, the scanning operation, and the detection method. The 1st and 2nd generations are an 8 × 8 pixel array and an 18 × 2 mono-rail scanner array, respectively; both use direct detection and are limited to fixed imaging. The most recently designed sensor is a multiplexed 16 × 16 GDD FPA. It permits real-time video-rate imaging at 30 frames/sec and comprehensive 3D MMW imaging. The principle of detection in this sensor is a frequency modulated continuous wave (FMCW) scheme in which each of the 16 GDD pixel lines is sampled simultaneously. Direct detection is also possible and can be performed through a user-friendly interface. This FPA sensor is built from 256 commercial GDD lamps (3 mm diameter International Light, Inc., Peabody, MA, model 527 Ne indicator lamps) as pixel detectors. All three sensors are fully supported
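
    For context, the standard FMCW relation linking the measured beat frequency to target range is (a textbook identity given here for illustration; the symbols are not parameters taken from this paper):

        R = \frac{c\, f_b\, T_s}{2B}

    where f_b is the beat frequency between the transmitted and received chirps, B the sweep bandwidth, T_s the sweep duration, and c the speed of light.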

  19. Real time SAR processing

    NASA Technical Reports Server (NTRS)

    Premkumar, A. B.; Purviance, J. E.

    1990-01-01

    A simplified model for the SAR imaging problem is presented. The model is based on the geometry of the SAR system. Using this model an expression for the entire phase history of the received SAR signal is formulated. From the phase history, it is shown that the range and the azimuth coordinates for a point target image can be obtained by processing the phase information during the intrapulse and interpulse periods respectively. An architecture for a VLSI implementation for the SAR signal processor is presented which generates images in real time. The architecture uses a small number of chips, a new correlation processor, and an efficient azimuth correlation process.
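
    As a sketch of the kind of phase history such a geometric model yields (the standard broadside point-target approximation, with symbols defined here rather than taken from the paper): for a platform moving at velocity v past a target at closest range R_0, the two-way phase over slow (interpulse) time t is approximately

        \phi(t) \approx -\frac{4\pi}{\lambda}\left( R_0 + \frac{v^2 t^2}{2 R_0} \right)

    a quadratic, chirp-like phase whose matched filtering gives the azimuth coordinate, while the intrapulse (fast-time) delay gives the range coordinate.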

  20. Design of teleoperation system with a force-reflecting real-time simulator

    NASA Technical Reports Server (NTRS)

    Hirata, Mitsunori; Sato, Yuichi; Nagashima, Fumio; Maruyama, Tsugito

    1994-01-01

    We developed a force-reflecting teleoperation system that uses a real-time graphic simulator. This system eliminates the effects of communication time delays in remote robot manipulation. The simulator provides the operator with a predictive display and feedback of computed contact forces through a six-degree-of-freedom (6-DOF) master arm on a real-time basis. With this system, peg-in-hole tasks involving round-trip communication time delays of up to a few seconds were performed at three support levels: a real image alone, a predictive display with a real image, and a real-time graphic simulator with computed-contact-force reflection and a predictive display. The experimental results indicate that the best teleoperation efficiency was achieved by using the force-reflecting simulator with the two images: the shortest work time, the lowest maximum sensor reading, and a 100 percent success rate were obtained. These results demonstrate the effectiveness of simulated force reflection for teleoperation.

  1. Real-time high dynamic range laser scanning microscopy

    PubMed Central

    Vinegoni, C.; Leon Swisher, C.; Fumene Feruglio, P.; Giedt, R. J.; Rousso, D. L.; Stapleton, S.; Weissleder, R.

    2016-01-01

    In conventional confocal/multiphoton fluorescence microscopy, images are typically acquired under ideal settings and after extensive optimization of parameters for a given structure or feature, often resulting in information loss from other image attributes. To overcome the problem of selective data display, we developed a new method that extends the imaging dynamic range in optical microscopy and improves the signal-to-noise ratio. Here we demonstrate how real-time and sequential high dynamic range microscopy facilitates automated three-dimensional neural segmentation. We address reconstruction and segmentation performance on samples with different size, anatomy and complexity. Finally, in vivo real-time high dynamic range imaging is also demonstrated, making the technique particularly relevant for longitudinal imaging in the presence of physiological motion and/or for quantification of in vivo fast tracer kinetics during functional imaging. PMID:27032979
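
    As a rough illustration of one way multiple exposures can be merged into an extended-dynamic-range frame, the Python sketch below performs a generic weighted fusion under assumed inputs (co-registered frames, linear detector response); it is not the acquisition or reconstruction method of this paper.

        import numpy as np

        def merge_hdr(frames, exposure_times, saturation=0.95):
            """Merge co-registered frames taken at different exposure settings.

            frames: list of 2D arrays normalized to [0, 1]; exposure_times: relative
            exposures. Saturated or near-zero pixels get low weight so each pixel is
            dominated by the exposure that recorded it with the best SNR.
            """
            num = np.zeros_like(frames[0], dtype=float)
            den = np.zeros_like(frames[0], dtype=float)
            for img, t in zip(frames, exposure_times):
                w = np.clip(1.0 - np.abs(2.0 * img - 1.0), 1e-3, None)  # hat weight, peaks at mid-intensity
                w[img > saturation] = 1e-3                              # discount clipped pixels
                num += w * img / t                                      # exposure-normalized radiance estimate
                den += w
            return num / den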

  2. Real-time implementations of image segmentation algorithms on shared memory multicore architecture: a survey (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Akil, Mohamed

    2017-05-01

    Real-time processing is getting more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis, and as a consequence many different approaches for image segmentation have been proposed. The watershed transform is a well-known image segmentation tool, but it is a very data-intensive task. To achieve acceleration and obtain real-time processing of watershed algorithms, parallel architectures and programming models for multicore computing have been developed. This paper surveys approaches for the parallel implementation of sequential watershed algorithms on multicore general-purpose CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, it is necessary to explore different strategies (parallelization/distribution/distributed scheduling) combined with different acceleration and optimization techniques to enhance parallelism. In this paper, we compare various parallelizations of sequential watershed algorithms on shared-memory multicore architectures. We analyze the performance measurements of each parallel implementation and the impact of the different sources of overhead on the performance of the parallel implementations. In this comparison study, we also discuss the advantages and disadvantages of the parallel programming models; thus, we compare OpenMP (an application programming interface for multiprocessing) with Pthreads (POSIX Threads) to illustrate the impact of each parallel programming model on the performance of the parallel implementations.
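
    To make the shared-memory data-partitioning pattern concrete, a minimal sketch follows; it uses a Python process pool over image tiles purely to illustrate the divide-process-merge strategy, not the OpenMP or Pthreads implementations surveyed here, and the per-tile routine is a placeholder rather than a real watershed.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def segment_tile(tile):
            # Placeholder per-tile routine; a real watershed would also need
            # boundary handling so that labels agree across tile seams.
            return (tile > tile.mean()).astype(np.uint8)

        def parallel_segment(image, tile_rows=4, workers=4):
            tiles = np.array_split(image, tile_rows, axis=0)   # partition the image into row bands
            with ProcessPoolExecutor(max_workers=workers) as pool:
                labeled = list(pool.map(segment_tile, tiles))  # process tiles in parallel
            return np.concatenate(labeled, axis=0)             # merge partial results

    On platforms that spawn rather than fork worker processes, the pool call should sit under an if __name__ == "__main__": guard.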

  3. High-Speed Real-Time Resting-State fMRI Using Multi-Slab Echo-Volumar Imaging

    PubMed Central

    Posse, Stefan; Ackley, Elena; Mutihac, Radu; Zhang, Tongsheng; Hummatov, Ruslan; Akhtari, Massoud; Chohan, Muhammad; Fisch, Bruce; Yonas, Howard

    2013-01-01

    We recently demonstrated that ultra-high-speed real-time fMRI using multi-slab echo-volumar imaging (MEVI) significantly increases sensitivity for mapping task-related activation and resting-state networks (RSNs) compared to echo-planar imaging (Posse et al., 2012). In the present study we characterize the sensitivity of MEVI for mapping RSN connectivity dynamics, comparing independent component analysis (ICA) and a novel seed-based connectivity analysis (SBCA) that combines sliding-window correlation analysis with meta-statistics. This SBCA approach is shown to minimize the effects of confounds, such as movement, and CSF and white matter signal changes, and enables real-time monitoring of RSN dynamics at time scales of tens of seconds. We demonstrate highly sensitive mapping of eloquent cortex in the vicinity of brain tumors and arterio-venous malformations, and detection of abnormal resting-state connectivity in epilepsy. In patients with motor impairment, resting-state fMRI provided focal localization of sensorimotor cortex compared with more diffuse activation in task-based fMRI. The fast acquisition speed of MEVI enabled segregation of cardiac-related signal pulsation using ICA, which revealed distinct regional differences in pulsation amplitude and waveform, elevated signal pulsation in patients with arterio-venous malformations, and a trend toward reduced pulsatility in gray matter of patients compared with healthy controls. Mapping cardiac pulsation in cortical gray matter may carry important functional information that distinguishes healthy from diseased tissue vasculature. This novel fMRI methodology is particularly promising for mapping eloquent cortex in patients with neurological disease who have a variable degree of cooperation in task-based fMRI. In conclusion, ultra-high-speed real-time fMRI enhances the sensitivity of mapping the dynamics of resting-state connectivity and cerebro-vascular pulsatility for clinical and neuroscience research applications
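
    A minimal NumPy sketch of sliding-window seed-based correlation of the general kind described (the array shapes are assumptions, and the confound regression and meta-statistics of the paper's SBCA are omitted):

        import numpy as np

        def sliding_window_connectivity(voxels, seed, win=40, step=5):
            """voxels: (T, N) voxel time series; seed: (T,) seed time series.

            Returns an array of shape (num_windows, N) holding, for each window,
            the Pearson correlation of every voxel with the seed.
            """
            T, N = voxels.shape
            maps = []
            for start in range(0, T - win + 1, step):
                v = voxels[start:start + win]
                s = seed[start:start + win]
                v = (v - v.mean(axis=0)) / (v.std(axis=0) + 1e-12)
                s = (s - s.mean()) / (s.std() + 1e-12)
                maps.append(v.T @ s / win)   # correlation per voxel in this window
            return np.array(maps)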

  4. Acoustic radiation force impulse imaging for real-time observation of lesion development during radiofrequency ablation procedures

    NASA Astrophysics Data System (ADS)

    Fahey, Brian J.; Trahey, Gregg E.

    2005-04-01

    When performing radiofrequency ablation (RFA) procedures, physicians currently have little or no feedback concerning the success of the treatment until follow-up assessments are made days to weeks later. To be successful, RFA must induce a thermal lesion of sufficient volume to completely destroy a target tumor or completely isolate an aberrant cardiac pathway. Although ultrasound, computed tomography (CT), and CT-based fluoroscopy have found use in guiding RFA treatments, they are deficient in giving accurate assessments of lesion size or boundaries during procedures. As induced thermal lesion size can vary considerably from patient to patient, the current lack of real-time feedback during RFA procedures is troublesome. We have developed a technique for real-time monitoring of thermal lesion size during RFA procedures utilizing acoustic radiation force impulse (ARFI) imaging. In both ex vivo and in vivo tissues, ARFI imaging provided better thermal lesion contrast and better overall appreciation for lesion size and boundaries relative to conventional sonography. The thermal safety of ARFI imaging for use at clinically realistic depths was also verified through the use of finite element method models. As ARFI imaging is implemented entirely on a diagnostic ultrasound scanner, it is a convenient, inexpensive, and promising modality for monitoring RFA procedures in vivo.

  5. Airborne Tactical Crossload Planner

    DTIC Science & Technology

    2017-12-01

    ...set out in the Airborne Standard Operating Procedure (ASOP). Subject terms: crossload, airborne, optimization, integer linear programming. ...they land to their respective sub-mission locations. In this thesis, we formulate and implement an integer linear program called the Tactical... to meet any desired crossload objectives. We demonstrate TCP with two real-world tactical problems from recent airborne operations: one by the

  6. Isolation of chicken taste buds for real-time Ca2+ imaging.

    PubMed

    Kudo, Ken-ichi; Kawabata, Fuminori; Nomura, Toumi; Aridome, Ayumi; Nishimura, Shotaro; Tabata, Shoji

    2014-10-01

    We isolated chicken taste buds and used a real-time Ca2+ imaging technique to investigate the functions of the taste cells. With RT-PCR, we found that isolated chicken taste bud-like cell subsets express chicken gustducin messenger RNA. Immunocytochemical techniques revealed that the cell subsets were also immunopositive for chicken gustducin. These results provided strong evidence that the isolated cell subsets contain chicken taste buds. The isolated cell subsets were spindle-shaped and approximately 61-75 μm wide and 88-98 μm long, characteristics similar to those of sectioned chicken taste buds. Using Ca2+ imaging, we observed the buds' response to 2 mmol/L quinine hydrochloride (a bitter substance) and their response to a mixture of 25 mmol/L L-glutamic acid monopotassium salt monohydrate and 1 mmol/L inosine 5'-monophosphate disodium salt (umami substances). The present study is the first morphological demonstration of isolated chicken taste buds, and our results indicate that the isolated taste buds were intact and functional and that approaches for examining the taste senses of the chicken using Ca2+ imaging can be informative. © 2014 Japanese Society of Animal Science.

  7. Real-time sono-photoacoustic imaging of gold nanoemulsions

    NASA Astrophysics Data System (ADS)

    Arnal, Bastien; Wei, Chen-Wei; Perez, Camilo; Lombardo, Michael; Pelivanov, Ivan M.; Pozzo, Danilo; O'Donnell, Matthew

    2015-03-01

    Phase transition contrast agents were first introduced in ultrasound (US) in the form of perfluorocarbon droplets. When their size is reduced to the nanoscale, surface tension dominates their stability and high pressure is required to vaporize them using long US emissions at high frequencies. Our group recently showed that nanoemulsion beads (100-300 nm) coated with gold nanospheres could be used as non-linear contrast agents. Beads can be vaporized with light only, inducing stronger photoacoustic signals by increasing thermal expansion. A photoacoustic cavitation threshold study (US: 1.2 MHz; laser: 750 nm, 10-ns pulses) shows that the vaporization thresholds of NEB-GNS can be greatly reduced using simultaneous light and US excitations. The resulting signal is driven only by the pressure amplitude for a fluence higher than 2.4 mJ/cm2. At diagnostic exposures, it is possible to capture very high signals from the vaporized beads at concentrations reduced to 10 pM with optical absorption smaller than 0.01 cm-1. A real-time imaging mode selectively isolating vaporization signals was implemented on a Verasonics system. A linear US probe (L74, 3 MHz) launched short US bursts before light was emitted from the laser. Vaporization of NEB-GNS resulted in a persistent 30-dB signal enhancement compared to a dye with the same absorption. Specific vaporization signals were retrieved in phantom experiments with US scatterers. This technique, called sonophotoacoustics, has great potential for targeted molecular imaging and therapy using compact nanoprobes with potentially high penetrability into tissue.

  8. Impact of Real-Time Image Gating on Spot Scanning Proton Therapy for Lung Tumors: A Simulation Study.

    PubMed

    Kanehira, Takahiro; Matsuura, Taeko; Takao, Seishin; Matsuzaki, Yuka; Fujii, Yusuke; Fujii, Takaaki; Ito, Yoichi M; Miyamoto, Naoki; Inoue, Tetsuya; Katoh, Norio; Shimizu, Shinichi; Umegaki, Kikuo; Shirato, Hiroki

    2017-01-01

    To investigate the effectiveness of real-time-image gated proton beam therapy for lung tumors and to establish a suitable size for the gating window (GW). A proton beam gated by a fiducial marker entering a preassigned GW (as monitored by 2 fluoroscopy units) was used with 7 lung cancer patients. Seven treatment plans were generated: real-time-image gated proton beam therapy with GW sizes of ±1, 2, 3, 4, 5, and 8 mm and free-breathing proton therapy. The prescribed dose was 70 Gy (relative biological effectiveness)/10 fractions to 99% of the target. Each of the 3-dimensional marker positions in the time series was associated with the appropriate 4-dimensional computed tomography phase. The 4-dimensional dose calculations were performed. The dose distribution in each respiratory phase was deformed into the end-exhale computed tomography image. The D99 and D5 to D95 of the clinical target volume scaled by the prescribed dose with criteria of D99 >95% and D5 to D95 <5%, V20 for the normal lung, and treatment times were evaluated. Gating windows ≤ ±2 mm fulfilled the CTV criteria for all patients (whereas the criteria were not always met for GWs ≥ ±3 mm) and gave an average reduction in V20 of more than 17.2% relative to free-breathing proton therapy (whereas GWs ≥ ±4 mm resulted in similar or increased V20). The average (maximum) irradiation times were 384 seconds (818 seconds) for the ±1-mm GW, but less than 226 seconds (292 seconds) for the ±2-mm GW. The maximum increased considerably at ±1-mm GW. Real-time-image gated proton beam therapy with a GW of ±2 mm was demonstrated to be suitable, providing good dose distribution without greatly extending treatment time. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Impact of Real-Time Image Gating on Spot Scanning Proton Therapy for Lung Tumors: A Simulation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanehira, Takahiro; Matsuura, Taeko, E-mail: matsuura@med.hokudai.ac.jp; Global Station for Quantum Medical Science and Engineering, Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo

    Purpose: To investigate the effectiveness of real-time-image gated proton beam therapy for lung tumors and to establish a suitable size for the gating window (GW). Methods and Materials: A proton beam gated by a fiducial marker entering a preassigned GW (as monitored by 2 fluoroscopy units) was used with 7 lung cancer patients. Seven treatment plans were generated: real-time-image gated proton beam therapy with GW sizes of ±1, 2, 3, 4, 5, and 8 mm and free-breathing proton therapy. The prescribed dose was 70 Gy (relative biological effectiveness)/10 fractions to 99% of the target. Each of the 3-dimensional marker positions in the time series was associated with the appropriate 4-dimensional computed tomography phase. The 4-dimensional dose calculations were performed. The dose distribution in each respiratory phase was deformed into the end-exhale computed tomography image. The D99 and D5 to D95 of the clinical target volume scaled by the prescribed dose with criteria of D99 >95% and D5 to D95 <5%, V20 for the normal lung, and treatment times were evaluated. Results: Gating windows ≤ ±2 mm fulfilled the CTV criteria for all patients (whereas the criteria were not always met for GWs ≥ ±3 mm) and gave an average reduction in V20 of more than 17.2% relative to free-breathing proton therapy (whereas GWs ≥ ±4 mm resulted in similar or increased V20). The average (maximum) irradiation times were 384 seconds (818 seconds) for the ±1-mm GW, but less than 226 seconds (292 seconds) for the ±2-mm GW. The maximum increased considerably at the ±1-mm GW. Conclusion: Real-time-image gated proton beam therapy with a GW of ±2 mm was demonstrated to be suitable, providing good dose distribution without greatly extending treatment time.

  10. Real-time Three-dimensional Echocardiography: From Diagnosis to Intervention.

    PubMed

    Orvalho, João S

    2017-09-01

    Echocardiography is one of the most important diagnostic tools in veterinary cardiology, and one of the greatest recent developments is real-time three-dimensional imaging. Real-time three-dimensional echocardiography is a new ultrasonography modality that provides comprehensive views of the cardiac valves and congenital heart defects. The main advantages of this technique, particularly real-time three-dimensional transesophageal echocardiography, are the ability to visualize the catheters, and balloons or other devices, and the ability to image the structure that is undergoing intervention with unprecedented quality. This technique may become one of the main choices for the guidance of interventional cardiology procedures. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Real-time soft tissue motion estimation for lung tumors during radiotherapy delivery.

    PubMed

    Rottmann, Joerg; Keall, Paul; Berbeco, Ross

    2013-09-01

    To provide real-time lung tumor motion estimation during radiotherapy treatment delivery without the need for implanted fiducial markers or additional imaging dose to the patient. 2D radiographs from the therapy beam's-eye-view (BEV) perspective are captured at a frame rate of 12.8 Hz with a frame grabber allowing direct RAM access to the image buffer. An in-house developed real-time soft tissue localization algorithm is utilized to calculate soft tissue displacement from these images in real-time. The system is tested with a Varian TX linear accelerator and an AS-1000 amorphous silicon electronic portal imaging device operating at a resolution of 512 × 384 pixels. The accuracy of the motion estimation is verified with a dynamic motion phantom. Clinical accuracy was tested on lung SBRT images acquired at 2 fps. Real-time lung tumor motion estimation from BEV images without fiducial markers is successfully demonstrated. For the phantom study, a mean tracking error <1.0 mm [root mean square (rms) error of 0.3 mm] was observed. The tracking rms accuracy on BEV images from a lung SBRT patient (≈20 mm tumor motion range) is 1.0 mm. The authors demonstrate for the first time real-time markerless lung tumor motion estimation from BEV images alone. The described system can operate at a frame rate of 12.8 Hz and does not require prior knowledge to establish traceable landmarks for tracking on the fly. The authors show that the geometric accuracy is similar to (or better than) previously published markerless algorithms not operating in real-time.
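
    The in-house localization algorithm is not detailed in this abstract; as a generic stand-in, markerless tracking is often illustrated by matching a reference template against each incoming frame with normalized cross-correlation, as in the hedged Python sketch below (the template, brute-force search, and search strategy are illustrative assumptions, not the published method).

        import numpy as np

        def track_template(frame, template):
            """Return the (row, col) of the best normalized-cross-correlation match."""
            th, tw = template.shape
            t = (template - template.mean()) / (template.std() + 1e-12)
            best, best_pos = -np.inf, (0, 0)
            for r in range(frame.shape[0] - th + 1):
                for c in range(frame.shape[1] - tw + 1):
                    patch = frame[r:r + th, c:c + tw]
                    p = (patch - patch.mean()) / (patch.std() + 1e-12)
                    score = float((p * t).mean())      # normalized cross-correlation
                    if score > best:
                        best, best_pos = score, (r, c)
            return best_pos

    A real-time system would restrict the search to a small region around the previous position and typically use an FFT-based correlation rather than this exhaustive loop.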

  12. A robust real-time abnormal region detection framework from capsule endoscopy images

    NASA Astrophysics Data System (ADS)

    Cheng, Yanfen; Liu, Xu; Li, Huiping

    2009-02-01

    In this paper we present a novel method to detect abnormal regions from capsule endoscopy images. Wireless Capsule Endoscopy (WCE) is a recent technology in which a capsule with an embedded camera is swallowed by the patient to visualize the gastrointestinal tract. One challenge is that a single diagnostic procedure produces over 50,000 images, making the physicians' reviewing process expensive. The reviewing process involves identifying images containing abnormal regions (tumor, bleeding, etc.) within this large image sequence. In this paper we construct a novel framework for robust and real-time abnormal region detection from large amounts of capsule endoscopy images. The detected potential abnormal regions can be labeled automatically for further review by physicians, thereby reducing the overall reviewing effort. The abnormal region detection framework has the following advantages: 1) Trainable. Users can define and label any type of abnormal region they want to find; the abnormal regions, such as tumor, bleeding, etc., can be pre-defined and labeled using the graphical user interface tool we provide. 2) Efficient. Due to the large amount of image data, detection speed is very important; our system can detect very efficiently at different scales due to the integral image features we use. 3) Robust. After feature selection we use a cascade of classifiers to further enforce the detection accuracy.
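
    The integral-image features mentioned above let any rectangular sum be evaluated with four table lookups; a minimal sketch of the standard summed-area table (generic code, not taken from the paper):

        import numpy as np

        def integral_image(img):
            """Summed-area table with a zero-padded first row and column."""
            ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
            ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
            return ii

        def rect_sum(ii, r0, c0, r1, c1):
            """Sum of img[r0:r1, c0:c1] using four lookups into the integral image."""
            return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]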

  13. PixonVision real-time Deblurring Anisoplanaticism Corrector (DAC)

    NASA Astrophysics Data System (ADS)

    Hier, R. G.; Puetter, R. C.

    2007-09-01

    DigiVision, Inc. and PixonImaging LLC have teamed to develop a real-time Deblurring Anisoplanaticism Corrector (DAC) for the Army. The DAC measures the geometric image warp caused by anisoplanaticism and removes it to rectify and stabilize (dejitter) the incoming image. Each new geometrically corrected image field is combined into a running-average reference image. The image averager employs a higher-order filter that uses temporal bandpass information to help identify true motion of objects and thereby adaptively moderate the contribution of each new pixel to the reference image. This result is then passed to a real-time PixonVision video processor (see paper 6696-04; note that the DAC also first dehazes the incoming video), where additional blur from high-order seeing effects is removed, the image is spatially denoised, and contrast is adjusted in a spatially adaptive manner. We plan to implement the entire algorithm within a few large modern FPGAs on a circuit board for video use. Obvious applications are within the DOD, surveillance and intelligence, security, and law enforcement communities. Prototype hardware is scheduled to be available in late 2008. To demonstrate the capabilities of the DAC, we present a software simulation of the algorithm applied to real atmosphere-corrupted video data collected by Sandia Labs.

  14. Real Time Computer Graphics From Body Motion

    NASA Astrophysics Data System (ADS)

    Fisher, Scott; Marion, Ann

    1983-10-01

    This paper focuses on the recent emergence and development of real-time, computer-aided body tracking technologies and their use in combination with various computer graphics imaging techniques. The convergence of these technologies in our research results in an interactive display environment in which multiple representations of a given body motion can be displayed in real time. Specific reference to entertainment applications is described in the development of a real-time, interactive stage set in which dancers can 'draw' with their bodies as they move through the space of the stage or manipulate virtual elements of the set with their gestures.

  15. Airborne net-centric multi-INT sensor control, display, fusion, and exploitation systems

    NASA Astrophysics Data System (ADS)

    Linne von Berg, Dale C.; Lee, John N.; Kruer, Melvin R.; Duncan, Michael D.; Olchowski, Fred M.; Allman, Eric; Howard, Grant

    2004-08-01

    The NRL Optical Sciences Division has initiated a multi-year effort to develop and demonstrate an airborne net-centric suite of multi-intelligence (multi-INT) sensors and exploitation systems for real-time target detection and targeting product dissemination. The goal of this Net-centric Multi-Intelligence Fusion Targeting Initiative (NCMIFTI) is to develop an airborne real-time intelligence gathering and targeting system that can be used to detect concealed, camouflaged, and mobile targets. The multi-INT sensor suite will include high-resolution visible/infrared (EO/IR) dual-band cameras, hyperspectral imaging (HSI) sensors in the visible-to-near infrared, short-wave and long-wave infrared (VNIR/SWIR/LWIR) bands, Synthetic Aperture Radar (SAR), electronics intelligence sensors (ELINT), and off-board networked sensors. Other sensors are also being considered for inclusion in the suite to address unique target detection needs. Integrating a suite of multi-INT sensors on a single platform should optimize real-time fusion of the on-board sensor streams, thereby improving the detection probability and reducing the false alarms that occur in reconnaissance systems that use single-sensor types on separate platforms, or that use independent target detection algorithms on multiple sensors. In addition to the integration and fusion of the multi-INT sensors, the effort is establishing an open-systems net-centric architecture that will provide a modular "plug and play" capability for additional sensors and system components and provide distributed connectivity to multiple sites for remote system control and exploitation.

  16. Real-time trace ambient ammonia monitor for haze prevention

    NASA Astrophysics Data System (ADS)

    Nishimura, Katsumi; Sakaguchi, Yuhei; Crosson, Eric; Wahl, Edward; Rella, Chris

    2007-05-01

    In photolithography, haze prevention is of critical importance to integrated circuit chip manufacturers. Numerous studies have established that the presence of ammonia in the photolithography tool can cause haze to form on optical surfaces, resulting in permanent damage to costly deep ultra-violet optics. Ammonia is emitted into wafer fab air by various semiconductor processes, including coating steps in the track and CMP. The workers in the clean room also emit a significant amount of ammonia. Chemical filters are typically used to remove airborne contamination from critical locations, but their lifetime and coverage cannot offer complete protection. Therefore, constant or periodic monitoring of airborne ammonia at parts-per-trillion (ppt) levels is critical to ensure the integrity of the lithography process. Real-time monitoring can ensure that an accidental ammonia release in the clean room is detected before any optics are damaged. We have developed a transportable, highly accurate, highly specific, real-time trace gas monitor that detects ammonia using Cavity Ring-Down Spectroscopy (CRDS). The trace gas monitor requires no calibration gas standards and can measure ammonia with 200 ppt sensitivity in five minutes with little or no baseline drift. In addition, the high spectral resolution of CRDS makes the analyzer less susceptible to interference from other gases when compared to other detection methods. In this paper we describe the monitor, focus on its performance, discuss the results of a careful comparison with ion chromatography (IC), and present field data measured inside the aligner and the reticle stocker at a semiconductor fab.
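
    The underlying CRDS measurement is the decay time of light in a high-finesse cavity; the standard relation between ring-down time and absorption (a textbook CRDS identity, not an instrument specification from this paper) is

        \alpha = \frac{1}{c}\left( \frac{1}{\tau} - \frac{1}{\tau_0} \right)

    where \tau_0 and \tau are the ring-down times of the empty and absorber-filled cavity, c is the speed of light, and \alpha is the absorption coefficient from which the ammonia concentration is inferred.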

  17. TH-AB-202-02: Real-Time Verification and Error Detection for MLC Tracking Deliveries Using An Electronic Portal Imaging Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J Zwan, B; Central Coast Cancer Centre, Gosford, NSW; Colvill, E

    2016-06-15

    Purpose: The added complexity of the real-time adaptive multi-leaf collimator (MLC) tracking increases the likelihood of undetected MLC delivery errors. In this work we develop and test a system for real-time delivery verification and error detection for MLC tracking radiotherapy using an electronic portal imaging device (EPID). Methods: The delivery verification system relies on acquisition and real-time analysis of transit EPID image frames acquired at 8.41 fps. In-house software was developed to extract the MLC positions from each image frame. Three comparison metrics were used to verify the MLC positions in real-time: (1) field size, (2) field location, and (3) field shape. The delivery verification system was tested for 8 VMAT MLC tracking deliveries (4 prostate and 4 lung) where real patient target motion was reproduced using a Hexamotion motion stage and a Calypso system. Sensitivity and detection delay were quantified for various types of MLC and system errors. Results: For both the prostate and lung test deliveries the MLC-defined field size was measured with an accuracy of 1.25 cm² (1 SD). The field location was measured with an accuracy of 0.6 mm and 0.8 mm (1 SD) for lung and prostate respectively. Field location errors (i.e. tracking in the wrong direction) with a magnitude of 3 mm were detected within 0.4 s of occurrence in the X direction and 0.8 s in the Y direction. Systematic MLC gap errors were detected as small as 3 mm. The method was not found to be sensitive to random MLC errors and individual MLC calibration errors up to 5 mm. Conclusion: EPID imaging may be used for independent real-time verification of MLC trajectories during MLC tracking deliveries. Thresholds have been determined for error detection and the system has been shown to be sensitive to a range of delivery errors.

  18. Image quality on the Kuiper Airborne Observatory. I - Results of the first flight series

    NASA Technical Reports Server (NTRS)

    Elliot, J. L.; Dunham, E. W.; Baron, R. L.; Watts, A. W.; Kruse, S. E.; Rose, W. C.; Gillespie, C. M., Jr.

    1989-01-01

    The NASA Kuiper Airborne Observatory (KAO) was flown three times during June and July, 1984 in order to study the causes of the poor seeing obtained with the 0.9-m telescope. High-speed pressure and temperature sensors were placed in the telescope cavity. Several thousand stellar images were recorded under various flight and optical configurations. It is found that the long-exposure image size is affected by telescope tracking errors, imperfect optics, poor optical alignment, telescope and instrument vibration, thermal fluctuations in the telescope cavity, and density fluctuations in the shear layer that forms the boundary between the cavity air and outside air. Possible ways to improve the quality of the images are discussed.

  19. Bioanalytical Applications of Real-Time ATP Imaging Via Bioluminescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruenhagen, Jason Alan

    The research discussed within involves the development of novel applications of real-time imaging of adenosine 5'-triphosphate (ATP). ATP was detected via bioluminescence and the firefly luciferase-catalyzed reaction of ATP and luciferin. The use of a microscope and an imaging detector allowed for spatially resolved quantitation of ATP release. Employing this method, applications in both biological and chemical systems were developed. First, the mechanism by which the compound 48/80 induces release of ATP from human umbilical vein endothelial cells (HUVECs) was investigated. Numerous enzyme activators and inhibitors were utilized to probe the second messenger systems involved in release. Compound 48/80 activated a Gq-type protein to initiate ATP release from HUVECs. Ca2+ imaging along with ATP imaging revealed that activation of phospholipase C and induction of intracellular Ca2+ signaling were necessary for release of ATP. Furthermore, activation of protein kinase C inhibited the activity of phospholipase C and thus decreased the magnitude of ATP release. This novel release mechanism was compared to the existing theories of extracellular release of ATP. Bioluminescence imaging was also employed to examine the role of ATP in the field of neuroscience. The central nervous system (CNS) was dissected from the freshwater snail Lymnaea stagnalis. Electrophysiological experiments demonstrated that the neurons of the Lymnaea were not damaged by any of the components of the imaging solution. ATP was continuously released by the ganglia of the CNS for over eight hours and varied from ganglion to ganglion and within individual ganglia. Addition of the neurotransmitters K+ and serotonin increased release of ATP in certain regions of the Lymnaea CNS. Finally, the ATP imaging technique was investigated for the study of drug release systems. MCM-41-type mesoporous nanospheres were loaded with ATP and end-capped with mercaptoethanol functionalized Cd

  20. Science Measurement Requirements for Imaging Spectrometers from Airborne to Spaceborne

    NASA Technical Reports Server (NTRS)

    Green, Robert O.; Asner, Gregory P.; Boardman, Joseph; Ungar, Stephen; Mouroulis, Pantazis

    2006-01-01

    This slide presentation reviews the objectives of the work to create imaging spectrometers. The science objectives are to remotely determine the properties of the surface and atmosphere (physics, chemistry and biology) revealed by the interaction of electromagnetic energy with matter via spectroscopy. It presents a review of the understanding of spectral, radiometric and spatial science measurement requirements for imaging spectrometers based upon science research results from past and current airborne and spaceborne instruments. It also examines the future requirements that will enable the next level of imaging spectroscopy science.

  1. Machine vision for real time orbital operations

    NASA Technical Reports Server (NTRS)

    Vinz, Frank L.

    1988-01-01

    Machine vision for automation and robotic operation of Space Station era systems has the potential for increasing the efficiency of orbital servicing, repair, assembly, and docking tasks. A machine vision research project is described in which a TV camera is used for inputting visual data to a computer so that image processing may be achieved for real-time control of these orbital operations. A technique has resulted from this research which reduces computer memory requirements and greatly increases typical computational speed such that it has the potential for development into a real-time orbital machine vision system. This technique is called AI BOSS (Analysis of Images by Box Scan and Syntax).

  2. A system for real-time measurement of the brachial artery diameter in B-mode ultrasound images.

    PubMed

    Gemignani, Vincenzo; Faita, Francesco; Ghiadoni, Lorenzo; Poggianti, Elisa; Demi, Marcello

    2007-03-01

    The measurement of the brachial artery diameter is frequently used in clinical studies for evaluating the flow-mediated dilation and, in conjunction with the blood pressure value, for assessing arterial stiffness. This paper presents a system for computing the brachial artery diameter in real-time by analyzing B-mode ultrasound images. The method is based on a robust edge detection algorithm which is used to automatically locate the two walls of the vessel. The measure of the diameter is obtained with subpixel precision and with a temporal resolution of 25 samples/s, so that the small dilations induced by the cardiac cycle can also be retrieved. The algorithm is implemented on a standalone video processing board which acquires the analog video signal from the ultrasound equipment. Results are shown in real-time on a graphical user interface. The system was tested both on synthetic ultrasound images and in clinical studies of flow-mediated dilation. Accuracy, robustness, and intra/inter observer variability of the method were evaluated.
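
    As a hedged sketch of the general idea, wall edges along an intensity profile can be located at gradient extrema and refined to subpixel precision with a parabolic fit; the Python below is a generic illustration under those assumptions, not the published algorithm, which is more robust and runs on dedicated video-processing hardware.

        import numpy as np

        def vessel_diameter(column, pixel_mm):
            """Estimate lumen diameter along one image column (intensity profile).

            Finds the two strongest opposite-signed gradient peaks (near/far wall)
            and refines each to subpixel precision with a parabolic fit.
            """
            g = np.gradient(column.astype(float))

            def refine(i):
                if 0 < i < len(g) - 1:
                    denom = g[i - 1] - 2 * g[i] + g[i + 1]
                    if abs(denom) > 1e-12:
                        return i + 0.5 * (g[i - 1] - g[i + 1]) / denom
                return float(i)

            near = refine(int(np.argmax(g)))    # one wall: strongest positive gradient
            far = refine(int(np.argmin(g)))     # other wall: strongest negative gradient
            return abs(far - near) * pixel_mm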

  3. Real time imaging of peripheral nerve vasculature using optical coherence angiography

    NASA Astrophysics Data System (ADS)

    Vasudevan, Srikanth; Kumsa, Doe; Takmakov, Pavel; Welle, Cristin G.; Hammer, Daniel X.

    2016-03-01

    The peripheral nervous system (PNS) carries bidirectional information between the central nervous system and distal organs. PNS stimulation has been widely used in medical devices for therapeutic indications, such as bladder control and seizure cessation. Investigational uses of PNS stimulation include providing sensory feedback for improved control of prosthetic limbs. While nerve safety has been well documented for stimulation parameters used in marketed devices, novel PNS stimulation devices may require alternative stimulation paradigms to achieve maximum therapeutic benefit. Improved testing paradigms to assess the safety of stimulation will expedite the development process for novel PNS stimulation devices. The objective of this research is to assess peripheral nerve vascular changes in real-time with optical coherence angiography (OCA). A 1300-nm OCA system was used to image vasculature changes in the rat sciatic nerve in the region around a surface contacting single electrode. Nerves and vasculature were imaged without stimulation for 180 minutes to quantify resting blood vessel diameter. Walking track analysis was used to assess motor function before and 6 days following experiments. There was no significant change in vessel diameter between baseline and other time points in all animals. Motor function tests indicated the experiments did not impair functionality. We also evaluated the capabilities to image the nerve during electrical stimulation in a pilot study. Combining OCA with established nerve assessment methods can be used to study the effects of electrical stimulation safety on neural and vascular tissue in the periphery.

  4. Camera selection for real-time in vivo radiation treatment verification systems using Cherenkov imaging.

    PubMed

    Andreozzi, Jacqueline M; Zhang, Rongxiao; Glaser, Adam K; Jarvis, Lesley A; Pogue, Brian W; Gladstone, David J

    2015-02-01

    To identify achievable camera performance and hardware needs in a clinical Cherenkov imaging system for real-time, in vivo monitoring of the surface beam profile on patients, as novel visual information, documentation, and possible treatment verification for clinicians. Complementary metal-oxide-semiconductor (CMOS), charge-coupled device (CCD), intensified charge-coupled device (ICCD), and electron multiplying-intensified charge coupled device (EM-ICCD) cameras were investigated to determine Cherenkov imaging performance in a clinical radiotherapy setting, with one emphasis on the maximum supportable frame rate. Where possible, the image intensifier was synchronized using a pulse signal from the Linac in order to image with room lighting conditions comparable to patient treatment scenarios. A solid water phantom irradiated with a 6 MV photon beam was imaged by the cameras to evaluate the maximum frame rate for adequate Cherenkov detection. Adequate detection was defined as an average electron count in the background-subtracted Cherenkov image region of interest in excess of 0.5% (327 counts) of the 16-bit maximum electron count value. Additionally, an ICCD and an EM-ICCD were each used clinically to image two patients undergoing whole-breast radiotherapy to compare clinical advantages and limitations of each system. Intensifier-coupled cameras were required for imaging Cherenkov emission on the phantom surface with ambient room lighting; standalone CMOS and CCD cameras were not viable. The EM-ICCD was able to collect images from a single Linac pulse delivering less than 0.05 cGy of dose at 30 frames/s (fps) and pixel resolution of 512 × 512, compared to an ICCD which was limited to 4.7 fps at 1024 × 1024 resolution. An intensifier with higher quantum efficiency at the entrance photocathode in the red wavelengths [30% quantum efficiency (QE) vs previous 19%] promises at least 8.6 fps at a resolution of 1024 × 1024 and lower monetary cost than the EM-ICCD. The
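
    For reference, the detection criterion quoted above is simply 0.5% of the 16-bit full scale:

        0.005 \times 2^{16} = 327.68 \approx 327 \ \text{counts}

    i.e., the 327-count threshold corresponds to truncating this value to an integer.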

  5. SU-D-209-03: Radiation Dose Reduction Using Real-Time Image Processing in Interventional Radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanal, K; Moirano, J; Zamora, D

    Purpose: To characterize changes in radiation dose after introducing a new real-time image processing technology in interventional radiology systems. Methods: Interventional radiology (IR) procedures are increasingly complex, at times requiring substantial time and radiation dose. The risk of inducing tissue reactions as well as long-term stochastic effects such as radiation-induced cancer is not trivial. To reduce this risk, IR systems are increasingly equipped with dose reduction technologies. Recently, ClarityIQ (Philips Healthcare) technology was installed in our existing neuroradiology IR (NIR) and vascular IR (VIR) suites, respectively. ClarityIQ includes real-time image processing that reduces noise/artifacts, enhances images, and sharpens edges while also reducing radiation dose rates. We reviewed 412 NIR (175 pre- and 237 post-ClarityIQ) procedures and 329 VIR (156 pre- and 173 post-ClarityIQ) procedures performed at our institution pre- and post-ClarityIQ implementation. NIR procedures were primarily classified as interventional or diagnostic. VIR procedures included drain port, drain placement, tube change, mesenteric, and implanted venous procedures. Air Kerma (AK, in units of mGy) was documented for all the cases using a commercial radiation exposure management system. Results: When considering all NIR procedures, median AK decreased from 1194 mGy to 561 mGy. When considering all VIR procedures, median AK decreased from 49 to 14 mGy. Both NIR and VIR exhibited a decrease in AK exceeding 50% after ClarityIQ implementation, a statistically significant (p<0.05) difference. Of the 5 most common VIR procedures, all median AK values decreased, but significance (p<0.05) was only reached in venous access (N=53), angio mesenteric (N=41), and drain placement procedures (N=31). Conclusion: ClarityIQ can reduce dose significantly for both NIR and VIR procedures. Image quality was not assessed in conjunction with the dose reduction.

  6. Application of the Karhunen-Loeve transform temporal image filter to reduce noise in real-time cardiac cine MRI

    NASA Astrophysics Data System (ADS)

    Ding, Yu; Chung, Yiu-Cho; Raman, Subha V.; Simonetti, Orlando P.

    2009-06-01

    Real-time dynamic magnetic resonance imaging (MRI) typically sacrifices the signal-to-noise ratio (SNR) to achieve higher spatial and temporal resolution. Spatial and/or temporal filtering (e.g., low-pass filtering or averaging) of dynamic images improves the SNR at the expense of edge sharpness. We describe the application of a temporal filter for dynamic MR image series based on the Karhunen-Loeve transform (KLT) to remove random noise without blurring stationary or moving edges and requiring no training data. In this paper, we present several properties of this filter and their effects on filter performance, and propose an automatic way to find the filter cutoff based on the autocorrelation of the eigenimages. Numerical simulation and in vivo real-time cardiac cine MR image series spanning multiple cardiac cycles acquired using multi-channel sensitivity-encoded MRI, i.e., parallel imaging, are used to validate and demonstrate these properties. We found that in this application, the noise standard deviation was reduced to 42% of the original with no apparent image blurring by using the proposed filter cutoff. Greater noise reduction can be achieved by increasing the length of the image series. This advantage of KLT filtering provides flexibility in the form of another scan parameter to trade for SNR.
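
    To make the idea concrete, a minimal NumPy sketch of KLT (temporal principal component) filtering of an image series is shown below; the array shapes and the fixed cutoff are assumptions, and the paper's automatic cutoff based on eigenimage autocorrelation is not reproduced.

        import numpy as np

        def klt_temporal_filter(series, keep):
            """series: (T, H, W) image time series; keep: number of temporal
            eigencomponents to retain. Returns the filtered series."""
            T, H, W = series.shape
            X = series.reshape(T, -1).astype(float)
            mean = X.mean(axis=0)
            Xc = X - mean
            C = Xc @ Xc.T / Xc.shape[1]            # T x T temporal covariance
            vals, vecs = np.linalg.eigh(C)         # eigenvalues in ascending order
            basis = vecs[:, -keep:]                # dominant temporal components
            Xf = basis @ (basis.T @ Xc) + mean     # project, reconstruct, restore mean
            return Xf.reshape(T, H, W)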

  7. Orientation of airborne laser scanning point clouds with multi-view, multi-scale image blocks.

    PubMed

    Rönnholm, Petri; Hyyppä, Hannu; Hyyppä, Juha; Haggrén, Henrik

    2009-01-01

    Comprehensive 3D modeling of our environment requires integration of terrestrial and airborne data, which is collected, preferably, using laser scanning and photogrammetric methods. However, integration of these multi-source data requires accurate relative orientations. In this article, two methods for solving relative orientation problems are presented. The first method includes registration by minimizing the distances between an airborne laser point cloud and a 3D model. The 3D model was derived from photogrammetric measurements and terrestrial laser scanning points. The first method was used as a reference and for validation. Having completed registration in the object space, the relative orientation between images and laser point cloud is known. The second method utilizes an interactive orientation method between a multi-scale image block and a laser point cloud. The multi-scale image block includes both aerial and terrestrial images. Experiments with the multi-scale image block revealed that the accuracy of a relative orientation increased when more images were included in the block. The orientations of the first and second methods were compared. The comparison showed that correct rotations were the most difficult to detect accurately by using the interactive method. Because the interactive method forces laser scanning data to fit with the images, inaccurate rotations cause corresponding shifts to image positions. However, in a test case, in which the orientation differences included only shifts, the interactive method could solve the relative orientation of an aerial image and airborne laser scanning data repeatedly within a couple of centimeters.

  8. Orientation of Airborne Laser Scanning Point Clouds with Multi-View, Multi-Scale Image Blocks

    PubMed Central

    Rönnholm, Petri; Hyyppä, Hannu; Hyyppä, Juha; Haggrén, Henrik

    2009-01-01

    Comprehensive 3D modeling of our environment requires integration of terrestrial and airborne data, which is collected, preferably, using laser scanning and photogrammetric methods. However, integration of these multi-source data requires accurate relative orientations. In this article, two methods for solving relative orientation problems are presented. The first method includes registration by minimizing the distances between an airborne laser point cloud and a 3D model. The 3D model was derived from photogrammetric measurements and terrestrial laser scanning points. The first method was used as a reference and for validation. Having completed registration in the object space, the relative orientation between images and laser point cloud is known. The second method utilizes an interactive orientation method between a multi-scale image block and a laser point cloud. The multi-scale image block includes both aerial and terrestrial images. Experiments with the multi-scale image block revealed that the accuracy of a relative orientation increased when more images were included in the block. The orientations of the first and second methods were compared. The comparison showed that correct rotations were the most difficult to detect accurately by using the interactive method. Because the interactive method forces laser scanning data to fit with the images, inaccurate rotations cause corresponding shifts to image positions. However, in a test case, in which the orientation differences included only shifts, the interactive method could solve the relative orientation of an aerial image and airborne laser scanning data repeatedly within a couple of centimeters. PMID:22454569

  9. Position tracking of moving liver lesion based on real-time registration between 2D ultrasound and 3D preoperative images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weon, Chijun; Hyun Nam, Woo; Lee, Duhgoon

    Purpose: Registration between 2D ultrasound (US) and 3D preoperative magnetic resonance (MR) (or computed tomography, CT) images has been studied recently for US-guided intervention. However, the existing techniques have some limits, either in the registration speed or the performance. The purpose of this work is to develop a real-time and fully automatic registration system between two intermodal images of the liver, and subsequently an indirect lesion positioning/tracking algorithm based on the registration result, for image-guided interventions. Methods: The proposed position tracking system consists of three stages. In the preoperative stage, the authors acquire several 3D preoperative MR (or CT) images at different respiratory phases. Based on the transformations obtained from nonrigid registration of the acquired 3D images, they then generate a 4D preoperative image along the respiratory phase. In the intraoperative preparatory stage, they properly attach a 3D US transducer to the patient’s body and fix its pose using a holding mechanism. They then acquire a couple of respiratory-controlled 3D US images. Via the rigid registration of these US images to the 3D preoperative images in the 4D image, the pose information of the fixed-pose 3D US transducer is determined with respect to the preoperative image coordinates. As feature(s) to use for the rigid registration, they may choose either internal liver vessels or the inferior vena cava. Since the latter is especially useful in patients with a diffuse liver disease, the authors newly propose using it. In the intraoperative real-time stage, they acquire 2D US images in real-time from the fixed-pose transducer. For each US image, they select candidates for its corresponding 2D preoperative slice from the 4D preoperative MR (or CT) image, based on the predetermined pose information of the transducer. The correct corresponding image is then found among those candidates via real-time 2D registration based

  10. Gamma-ray imaging system for real-time measurements in nuclear waste characterisation

    NASA Astrophysics Data System (ADS)

    Caballero, L.; Albiol Colomer, F.; Corbi Bellot, A.; Domingo-Pardo, C.; Leganés Nieto, J. L.; Agramunt Ros, J.; Contreras, P.; Monserrate, M.; Olleros Rodríguez, P.; Pérez Magán, D. L.

    2018-03-01

    A compact, portable and large field-of-view gamma camera that is able to identify, locate and quantify gamma-ray emitting radioisotopes in real-time has been developed. The device delivers spectroscopic and imaging capabilities that enable its use in a variety of nuclear waste characterisation scenarios, such as radioactivity monitoring in nuclear power plants and, more specifically, the decommissioning of nuclear facilities. The technical development of this apparatus and some examples of its application in field measurements are reported in this article. The performance of the presented gamma camera is also benchmarked against other conventional techniques.

  11. Real-Time Systems

    DTIC Science & Technology

    1992-02-01

    Postgraduate School Autonomous Underwater Vehicle (AUV) are then examined. Subject terms: autonomous underwater vehicle (AUV), hard real-time system, real-time operating system, real-time programming language, real-time system, soft real-time system.

  12. Real time thermal imaging for analysis and control of crystal growth by the Czochralski technique

    NASA Technical Reports Server (NTRS)

    Wargo, M. J.; Witt, A. F.

    1992-01-01

    A real time thermal imaging system with temperature resolution better than +/- 0.5 C and spatial resolution of better than 0.5 mm has been developed. It has been applied to the analysis of melt surface thermal field distributions in both Czochralski and liquid encapsulated Czochralski growth configurations. The sensor can provide single/multiple point thermal information; a multi-pixel averaging algorithm has been developed which permits localized, low noise sensing and display of optical intensity variations at any location in the hot zone as a function of time. Temperature distributions are measured by extraction of data along a user selectable linear pixel array and are simultaneously displayed, as a graphic overlay, on the thermal image.

  13. Satellite on-board real-time SAR processor prototype

    NASA Astrophysics Data System (ADS)

    Bergeron, Alain; Doucet, Michel; Harnisch, Bernd; Suess, Martin; Marchese, Linda; Bourqui, Pascal; Desnoyers, Nicholas; Legros, Mathieu; Guillot, Ludovic; Mercier, Luc; Châteauneuf, François

    2017-11-01

    A Compact Real-Time Optronic SAR Processor has been successfully developed and tested up to a Technology Readiness Level of 4 (TRL4), the breadboard validation in a laboratory environment. SAR, or Synthetic Aperture Radar, is an active system allowing day and night imaging independent of the cloud coverage of the planet. The SAR raw data is a set of complex data for range and azimuth, which cannot be compressed. Specifically, for planetary missions and unmanned aerial vehicle (UAV) systems with limited communication data rates this is a clear disadvantage. SAR images are typically processed electronically by applying dedicated Fourier transformations. This, however, can also be performed optically in real-time. Originally, the first SAR images were processed optically. The optical Fourier processor architecture provides inherent parallel computing capabilities, allowing real-time SAR data processing and thus the ability for compression and strongly reduced communication bandwidth requirements for the satellite. SAR signal return data are in general complex data. Both amplitude and phase must be combined optically in the SAR processor for each range and azimuth pixel. Amplitude and phase are generated by dedicated spatial light modulators and superimposed by an optical relay set-up. The spatial light modulators display the full complex raw data information over a two-dimensional format, one for the azimuth and one for the range. Since the entire signal history is displayed at once, the processor operates in parallel, yielding real-time performance, i.e., without a resulting bottleneck. Processing of both azimuth and range information is performed in a single pass. This paper focuses on the onboard capabilities of the compact optical SAR processor prototype that allows in-orbit processing of SAR images. Examples of processed ENVISAT ASAR images are presented. Various SAR processor parameters such as processing capabilities, image quality (point target analysis), weight and

  14. Early warning by near-real time disturbance monitoring (Invited)

    NASA Astrophysics Data System (ADS)

    Verbesselt, J.; Zeileis, A.; Herold, M.

    2013-12-01

    Near real-time monitoring of ecosystem disturbances is critical for rapidly assessing and addressing impacts on carbon dynamics, biodiversity, and socio-ecological processes. Satellite remote sensing enables cost-effective and accurate monitoring at frequent time steps over large areas. Yet, generic methods to detect disturbances within newly captured satellite images are lacking. We propose a multi-purpose, time-series-based disturbance detection approach that identifies and models stable historical variation to enable change detection within newly acquired data. Satellite image time series of vegetation greenness provide a global record of terrestrial vegetation productivity over the past decades. Here, we assess and demonstrate the method by applying it to (1) real-world satellite greenness image time series between February 2000 and July 2011 covering Somalia, to detect drought-related vegetation disturbances, and (2) Landsat image time series, to detect forest disturbances. First, the results illustrate that disturbances are successfully detected in near real-time while being robust to seasonality and noise. Second, major drought-related disturbances corresponding with the most drought-stressed regions in Somalia are detected from mid-2010 onwards. Third, the method can be applied to Landsat image time series having a lower temporal data density. Furthermore, the method can analyze in-situ or satellite data time series of biophysical indicators from local to global scale since it is fast, does not depend on thresholds, and does not require time series gap filling. While the data and methods used are appropriate for proof-of-concept development of global-scale disturbance monitoring, specific applications (e.g., drought or deforestation monitoring) mandate integration within an operational monitoring framework. Furthermore, the real-time monitoring method is implemented in an open-source environment and is freely available in the BFAST package for R. Information
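
    A toy Python sketch of the underlying monitoring idea, i.e., fit stable historical variation and flag new observations that fall outside its expected range (a simple harmonic regression assuming roughly 23 observations per year; this is an illustration, not the BFAST algorithm itself, which is available as an R package):

        import numpy as np

        def fit_history(t, y, period=23.0):
            """Least-squares fit of trend + one seasonal harmonic to a history window.
            t: time index (e.g., 16-day composites, ~23 per year); y: vegetation index."""
            X = np.column_stack([np.ones_like(t, dtype=float), t,
                                 np.sin(2 * np.pi * t / period),
                                 np.cos(2 * np.pi * t / period)])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ coef
            return coef, resid.std()

        def is_disturbance(t_new, y_new, coef, sigma, period=23.0, k=3.0):
            """Flag a new observation more than k sigma below the expected seasonal value."""
            x = np.array([1.0, t_new, np.sin(2 * np.pi * t_new / period),
                          np.cos(2 * np.pi * t_new / period)])
            return y_new < x @ coef - k * sigma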

  15. Real-time soft tissue motion estimation for lung tumors during radiotherapy delivery

    PubMed Central

    Rottmann, Joerg; Keall, Paul; Berbeco, Ross

    2013-01-01

    Purpose: To provide real-time lung tumor motion estimation during radiotherapy treatment delivery without the need for implanted fiducial markers or additional imaging dose to the patient. Methods: 2D radiographs from the therapy beam's-eye-view (BEV) perspective are captured at a frame rate of 12.8 Hz with a frame grabber allowing direct RAM access to the image buffer. An in-house developed real-time soft tissue localization algorithm is utilized to calculate soft tissue displacement from these images in real-time. The system is tested with a Varian TX linear accelerator and an AS-1000 amorphous silicon electronic portal imaging device operating at a resolution of 512 × 384 pixels. The accuracy of the motion estimation is verified with a dynamic motion phantom. Clinical accuracy was tested on lung SBRT images acquired at 2 fps. Results: Real-time lung tumor motion estimation from BEV images without fiducial markers is successfully demonstrated. For the phantom study, a mean tracking error <1.0 mm [root mean square (rms) error of 0.3 mm] was observed. The tracking rms accuracy on BEV images from a lung SBRT patient (≈20 mm tumor motion range) is 1.0 mm. Conclusions: The authors demonstrate for the first time real-time markerless lung tumor motion estimation from BEV images alone. The described system can operate at a frame rate of 12.8 Hz and does not require prior knowledge to establish traceable landmarks for tracking on the fly. The authors show that the geometric accuracy is similar to (or better than) previously published markerless algorithms not operating in real-time. PMID:24007146
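
    The in-house soft-tissue localization algorithm is not detailed in the abstract; purely as a generic stand-in for markerless tracking on BEV frames, the sketch below locates a template by normalized cross-correlation within a small search window around the previous position, using synthetic data. All sizes and the search radius are illustrative assumptions.

      # Generic markerless tracking sketch: locate a soft-tissue template in a
      # beam's-eye-view frame with normalized cross-correlation over a small
      # search window. This is a conceptual stand-in, not the authors' algorithm.
      import numpy as np

      def ncc(a, b):
          a = a - a.mean(); b = b - b.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum())
          return (a * b).sum() / denom if denom > 0 else 0.0

      def track(frame, template, prev_xy, search=20):
          th, tw = template.shape
          best, best_xy = -np.inf, prev_xy
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  y, x = prev_xy[0] + dy, prev_xy[1] + dx
                  if y < 0 or x < 0 or y + th > frame.shape[0] or x + tw > frame.shape[1]:
                      continue
                  score = ncc(frame[y:y + th, x:x + tw], template)
                  if score > best:
                      best, best_xy = score, (y, x)
          return best_xy

      # Synthetic example: a Gaussian blob that moved 5 pixels inferiorly.
      yy, xx = np.mgrid[0:384, 0:512]
      blob = lambda cy, cx: np.exp(-((yy - cy)**2 + (xx - cx)**2) / 50.0)
      frame0, frame1 = blob(190, 260), blob(195, 260)
      template = frame0[180:200, 250:270]
      print(track(frame1, template, prev_xy=(180, 250)))   # -> (185, 250)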

  16. Enabling Real-Time Volume Rendering of Functional Magnetic Resonance Imaging on an iOS Device.

    PubMed

    Holub, Joseph; Winer, Eliot

    2017-12-01

    Powerful non-invasive imaging technologies like computed tomography (CT), ultrasound, and magnetic resonance imaging (MRI) are used daily by medical professionals to diagnose and treat patients. While 2D slice viewers have long been the standard, many tools allowing 3D representations of digital medical data are now available. The newest imaging advancement, functional MRI (fMRI) technology, has shifted medical imaging from viewing static anatomy to viewing dynamic physiology over time (4D), particularly for studying brain activity. Combined with the rapid adoption of mobile devices for everyday work, this creates a need to visualize fMRI data on tablets or smartphones. However, there are few mobile tools available to visualize 3D MRI data, let alone 4D fMRI data. Building volume rendering tools on mobile devices to visualize 3D and 4D medical data is challenging given the limited computational power of the devices. This paper describes research that explored the feasibility of performing real-time 3D and 4D volume raycasting on a tablet device. The prototype application was tested on a 9.7" iPad Pro using two different fMRI datasets of brain activity. The results show that mobile raycasting is able to achieve between 20 and 40 frames per second for traditional 3D datasets, depending on the sampling interval, and up to 9 frames per second for 4D data. While the prototype application did not always achieve true real-time interaction, these results clearly demonstrate that visualizing 3D and 4D digital medical data is feasible with a properly constructed software framework.
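
    To make the sampling-interval/frame-rate trade-off concrete, the sketch below casts a single ray through a volume with a configurable step and front-to-back alpha compositing; the step size is the knob the abstract ties to frame rate. This is a plain-Python illustration of the raycasting loop, not the iOS prototype, which would run the equivalent per-pixel code in a GPU shader; the transfer function and early-termination threshold are assumptions.

      # Minimal CPU sketch of volume raycasting for one ray: sample the volume
      # at a fixed step and composite front-to-back. Halving the step roughly
      # doubles the samples per ray, hence the frame-rate dependence on the
      # sampling interval. Transfer function and thresholds are illustrative.
      import numpy as np

      def cast_ray(volume, origin, direction, step=0.5, max_len=200.0):
          color, alpha = 0.0, 0.0
          direction = direction / np.linalg.norm(direction)
          for i in range(int(max_len / step)):
              p = origin + i * step * direction
              idx = tuple(np.round(p).astype(int))
              if any(c < 0 or c >= s for c, s in zip(idx, volume.shape)):
                  break                            # ray left the volume
              sample = volume[idx]                 # scalar intensity in [0, 1]
              a = sample * 0.05                    # simple linear transfer function
              color += (1.0 - alpha) * a * sample  # front-to-back compositing
              alpha += (1.0 - alpha) * a
              if alpha > 0.95:                     # early ray termination
                  break
          return color

      vol = np.random.rand(64, 64, 64).astype(np.float32)
      print(cast_ray(vol, origin=np.array([0.0, 32.0, 32.0]),
                     direction=np.array([1.0, 0.0, 0.0])))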

  17. SU-G-JeP1-11: Feasibility Study of Markerless Tracking Using Dual Energy Fluoroscopic Images for Real-Time Tumor-Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Shibuya, K; Sawada, A

    Purpose: A new real-time tumor-tracking radiotherapy (RTRT) system was installed in our institution. This system consists of two x-ray tubes and color image intensifiers (I.I.s). A fiducial marker implanted near the tumor is tracked using color fluoroscopic images; however, implantation of the fiducial marker is very invasive. Color fluoroscopic images improve recognition of the tumor, but they are not suitable for tracking the tumor without a fiducial marker. The purpose of this study was to investigate the feasibility of markerless tracking using dual-energy colored fluoroscopic images for the real-time tumor-tracking radiotherapy system. Methods: Colored fluoroscopic images of static and moving phantoms containing a simulated tumor (30 mm diameter sphere) were experimentally acquired using the RTRT system. The programmable respiratory motion phantom was driven with a sinusoidal pattern in the cranio-caudal direction (amplitude: 20 mm, time: 4 s). The x-ray conditions were set to 55 kV, 50 mA and 105 kV, 50 mA for low and high energy, respectively. Dual-energy images were calculated by weighted logarithmic subtraction of the high- and low-energy RGB images. The usefulness of dual-energy imaging for real-time tracking with an automated template image matching algorithm was investigated. Results: The proposed dual-energy subtraction improves the contrast between tumor and background by suppressing bone structure. For the static phantom, results showed high tracking accuracy using dual-energy subtraction images. For the moving phantom, results showed good tracking accuracy using dual-energy subtraction images. However, tracking accuracy depended on tumor position, tumor size, and x-ray conditions. Conclusion: We demonstrated the feasibility of markerless tracking using dual-energy fluoroscopic images for a real-time tumor-tracking radiotherapy system. Furthermore, it is needed to
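
    The dual-energy computation described above is a weighted logarithmic subtraction of the two energy frames; the sketch below applies it to a synthetic pair, with the weight w chosen (here, illustratively) so that the simulated bone attenuation cancels while a soft-tissue target is preserved. Tracking would then proceed by template matching on the resulting dual-energy frames.

      # Weighted logarithmic subtraction of high- and low-energy fluoroscopic
      # frames to suppress bone and enhance soft-tissue (tumor) contrast.
      # The weight w is illustrative and chosen to cancel the simulated bone
      # attenuation in this toy example; in practice it would be tuned.
      import numpy as np

      def dual_energy(high, low, w=0.87, eps=1e-6):
          """Return the dual-energy image log(I_high) - w * log(I_low)."""
          return np.log(high + eps) - w * np.log(low + eps)

      # Synthetic 2D example: bone attenuates strongly in both frames,
      # the soft-tissue target mostly in the low-energy frame.
      high = np.ones((100, 100)); low = np.ones((100, 100))
      high[40:60, 40:60] *= 0.5;  low[40:60, 40:60] *= 0.45   # bone
      low[20:30, 20:30] *= 0.7                                # soft-tissue target
      de = dual_energy(high, low)
      print(de[45, 45], de[25, 25])   # bone largely cancelled, target preserved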

  18. Improved image guidance technique for minimally invasive mitral valve repair using real-time tracked 3D ultrasound

    NASA Astrophysics Data System (ADS)

    Rankin, Adam; Moore, John; Bainbridge, Daniel; Peters, Terry

    2016-03-01

    In the past ten years, numerous new surgical and interventional techniques have been developed for treating heart valve disease without the need for cardiopulmonary bypass. Heart valve repair is now being performed in a blood-filled environment, reinforcing the need for accurate and intuitive imaging techniques. Previous work has demonstrated how augmenting ultrasound with virtual representations of specific anatomical landmarks can greatly simplify interventional navigation challenges and increase patient safety. However, these techniques often complicate interventions by requiring additional steps to manually define and initialize the virtual models. Furthermore, overlaying virtual elements onto real-time image data can obstruct the view of salient image information. To address these limitations, a system was developed that uses real-time volumetric ultrasound alongside magnetically tracked tools, presented in an augmented virtuality environment, to provide a streamlined navigation guidance platform. In phantom studies simulating a beating-heart navigation task, procedure duration and tool path metrics achieved performance comparable to previous work in augmented virtuality techniques and considerable improvement over standard-of-care ultrasound guidance.
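
    The guidance platform described above hinges on mapping magnetically tracked tools into the coordinate frame of the live ultrasound volume. As a minimal, generic sketch of that coordinate chain (not the authors' calibration or values), the code below places a tool tip into the image frame by composing homogeneous transforms from the tracker with a fixed probe calibration; every matrix is an illustrative placeholder.

      # Minimal sketch of the coordinate chain behind tracked-tool guidance:
      # a tool-tip point in the tool-sensor frame is mapped into the ultrasound
      # image frame by chaining homogeneous transforms reported by the magnetic
      # tracker and a fixed probe calibration. All values are placeholders.
      import numpy as np

      def homogeneous(R, t):
          T = np.eye(4)
          T[:3, :3] = R
          T[:3, 3] = t
          return T

      # Poses of the tool sensor and the probe sensor in the tracker frame (as
      # streamed by the tracking system), plus the probe-sensor-to-image
      # calibration obtained offline.
      T_tracker_tool = homogeneous(np.eye(3), np.array([10.0, 5.0, 120.0]))
      T_tracker_probe = homogeneous(np.eye(3), np.array([0.0, 0.0, 100.0]))
      T_probe_image = homogeneous(np.eye(3), np.array([-20.0, -20.0, 0.0]))

      tip_in_tool = np.array([0.0, 0.0, 50.0, 1.0])   # tip offset along the shaft

      # image <- probe-sensor <- tracker <- tool-sensor
      T_image_tracker = np.linalg.inv(T_probe_image) @ np.linalg.inv(T_tracker_probe)
      tip_in_image = T_image_tracker @ T_tracker_tool @ tip_in_tool
      print(tip_in_image[:3])   # where to draw the virtual tool in the US volume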

  19. Real time mitigation of atmospheric turbulence in long distance imaging using the lucky region fusion algorithm with FPGA and GPU hardware acceleration

    NASA Astrophysics Data System (ADS)

    Jackson, Christopher Robert

    "Lucky-region" fusion (LRF) is a synthetic imaging technique that has proven successful in enhancing the quality of images distorted by atmospheric turbulence. The LRF algorithm selects sharp regions of an image obtained from a series of short exposure frames, and fuses the sharp regions into a final, improved image. In previous research, the LRF algorithm had been implemented on a PC using the C programming language. However, the PC did not have sufficient sequential processing power to handle real-time extraction, processing and reduction required when the LRF algorithm was applied to real-time video from fast, high-resolution image sensors. This thesis describes two hardware implementations of the LRF algorithm to achieve real-time image processing. The first was created with a VIRTEX-7 field programmable gate array (FPGA). The other developed using the graphics processing unit (GPU) of a NVIDIA GeForce GTX 690 video card. The novelty in the FPGA approach is the creation of a "black box" LRF video processing system with a general camera link input, a user controller interface, and a camera link video output. We also describe a custom hardware simulation environment we have built to test the FPGA LRF implementation. The advantage of the GPU approach is significantly improved development time, integration of image stabilization into the system, and comparable atmospheric turbulence mitigation.

  20. Discriminating Bio-aerosols from Non-Bio-aerosols in Real-Time by Pump-Probe Spectroscopy

    PubMed Central

    Sousa, Gustavo; Gaulier, Geoffrey; Bonacina, Luigi; Wolf, Jean-Pierre

    2016-01-01

    The optical identification of bioaerosols in the atmosphere, and their discrimination from combustion-related particles, is a major issue for real-time, field-compatible instruments. In the present paper, we show that by embedding advanced pump-probe depletion spectroscopy schemes in a portable instrument, it is possible to discriminate amino-acid-containing airborne particles (bacteria, humic particles, etc.) from polycyclic-aromatic-hydrocarbon-containing combustion particles (diesel droplets, soot, vehicle exhaust) with high selectivity. Our real-time, multi-modal device provides, in addition to the pump-probe depletion information, fluorescence spectra (over 32 channels), fluorescence lifetime, and Mie scattering patterns for each individual particle flowing through the probed air. PMID:27619546