Sample records for horizontal visibility algorithm

  1. Topological properties of the limited penetrable horizontal visibility graph family

    NASA Astrophysics Data System (ADS)

    Wang, Minggang; Vilela, André L. M.; Du, Ruijin; Zhao, Longfeng; Dong, Gaogao; Tian, Lixin; Stanley, H. Eugene

    2018-05-01

    The limited penetrable horizontal visibility graph algorithm was recently introduced to map time series into complex networks. In this work, we extend this algorithm to create a directed-limited penetrable horizontal visibility graph and an image-limited penetrable horizontal visibility graph. We define the two algorithms and provide theoretical results on the topological properties of these graphs associated with different types of real-valued series. We perform several numerical simulations to check the accuracy of our theoretical results. Finally, we present an application of the directed-limited penetrable horizontal visibility graph to measure the irreversibility of real-valued time series, and an application of the image-limited penetrable horizontal visibility graph to discriminate noise from chaos. We also propose a method to measure systematic risk using the image-limited penetrable horizontal visibility graph, and the empirical results show the effectiveness of our proposed algorithms.
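As an illustration (a minimal plain-Python sketch, not the authors' implementation), the limited penetrable criterion can be written by allowing up to a penetrable distance `rho` of intermediate points to block the horizontal line of sight; `rho = 0` recovers the ordinary horizontal visibility graph:

```python
def lphvg_edges(series, rho=1):
    """Limited penetrable horizontal visibility graph, O(n^2) sketch.

    Nodes i < j are linked when at most `rho` intermediate points k
    satisfy series[k] >= min(series[i], series[j]); rho = 0 recovers
    the ordinary horizontal visibility graph."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            blockers = sum(1 for k in range(i + 1, j)
                           if series[k] >= min(series[i], series[j]))
            if blockers <= rho:
                edges.add((i, j))
    return edges
```

For example, in the series [1, 2, 3] the pair (0, 2) is blocked by the middle point in the ordinary HVG but becomes an edge once one penetration is allowed.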

  2. Predicting catastrophes of non-autonomous networks with visibility graphs and horizontal visibility

    NASA Astrophysics Data System (ADS)

    Zhang, Haicheng; Xu, Daolin; Wu, Yousheng

    2018-05-01

    Prediction of potential catastrophes in engineering systems is a challenging problem. We make a first attempt to construct a complex network to predict catastrophes of a multi-modular floating system in advance of their occurrence. Response time series of the system can be mapped into a virtual network by using the visibility graph or horizontal visibility algorithm. The topological characteristics of the networks can then be used to forecast catastrophes of the system. Numerical results show an obvious correspondence between the variation of topological characteristics and the onset of catastrophes. A Catastrophe Index (CI) is proposed as a numerical indicator of a qualitative change from a stable state to a catastrophic state. The two approaches, the visibility graph and horizontal visibility algorithms, are compared using this index in a reliability analysis with different data lengths and sampling frequencies. The virtual network technique is potentially extendable to catastrophe prediction in other engineering systems.

  3. Indoor positioning algorithm combined with angular vibration compensation and the trust region technique based on received signal strength-visible light communication

    NASA Astrophysics Data System (ADS)

    Wang, Jin; Li, Haoxu; Zhang, Xiaofeng; Wu, Rangzhong

    2017-05-01

    Indoor positioning using visible light communication has become a topic of intensive research in recent years. Because the normal of the receiver always deviates from that of the transmitter in practice, positioning systems that require the receiver normal to be aligned with the transmitter normal have large positioning errors. Some algorithms take the angular vibrations into account; nevertheless, these positioning algorithms cannot meet the requirements of high accuracy or low complexity. A visible light positioning algorithm combined with angular vibration compensation is proposed. The angle information from an accelerometer or other angle acquisition device is used to calculate the angle of incidence even when the receiver is not horizontal. Meanwhile, a received signal strength technique with high accuracy is employed to determine the location. Moreover, an eight-light-emitting-diode (LED) system model is provided to improve the accuracy. The simulation results show that the proposed system can achieve a low positioning error with low complexity, and the eight-LED system exhibits improved performance. Furthermore, trust region-based positioning is proposed to determine three-dimensional locations and achieves high accuracy in both the horizontal and the vertical components.
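The compensation idea can be illustrated with a textbook Lambertian line-of-sight channel model (hypothetical parameter values, not the paper's system model): the accelerometer-measured tilt enters only through the incidence angle, and ranging inverts the received signal strength:

```python
import math

def received_power(r, h, p_t=1.0, m=1, area=1e-4, tilt=0.0):
    """Lambertian line-of-sight VLC channel (a textbook sketch with
    hypothetical parameters, not the paper's system model): an LED at
    height h points straight down, the receiver sits at horizontal
    offset r and is tilted by `tilt` radians toward the LED. The
    accelerometer-supplied tilt enters only through the incidence
    angle, which is the idea behind angular-vibration compensation."""
    d = math.hypot(r, h)
    cos_phi = h / d                     # emission angle w.r.t. LED normal
    psi = math.acos(h / d) - tilt       # incidence angle after tilt
    return (p_t * (m + 1) * area / (2 * math.pi * d * d)
            * cos_phi ** m * max(math.cos(psi), 0.0))

def invert_range(p_meas, h, tilt=0.0, r_max=10.0):
    """Bisection on the horizontal offset r; for small tilt the power
    decreases monotonically with r, so the inversion is well posed."""
    lo, hi = 0.0, r_max
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if received_power(mid, h, tilt=tilt) > p_meas:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A round trip (forward model, then inversion) recovers the true horizontal offset, and tilting the receiver toward the LED increases the received power, as the incidence-angle term predicts.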

  4. A characterization of horizontal visibility graphs and combinatorics on words

    NASA Astrophysics Data System (ADS)

    Gutin, Gregory; Mansour, Toufik; Severini, Simone

    2011-06-01

    A Horizontal Visibility Graph (HVG) is defined in association with an ordered set of non-negative reals. HVGs realize a methodology in the analysis of time series, their degree distribution being a good discriminator between randomness and chaos [B. Luque, L. Lacasa, F. Ballesteros, J. Luque, Horizontal visibility graphs: exact results for random time series, Phys. Rev. E 80 (2009) 046103]. We prove that a graph is an HVG if and only if it is outerplanar and has a Hamilton path. Therefore, an HVG is a noncrossing graph, as defined in algebraic combinatorics [P. Flajolet, M. Noy, Analytic combinatorics of noncrossing configurations, Discrete Math. 204 (1999) 203-229]. Our characterization of HVGs implies a linear-time recognition algorithm. Treating ordered sets as words, we characterize subfamilies of HVGs, highlighting various connections with combinatorial statistics and introducing the notion of a visible pair. With this technique, we determine asymptotically the average number of edges of HVGs.
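The cited Luque et al. result, P(k) = (1/3)(2/3)^(k-2) for the degree distribution of HVGs built from i.i.d. random series, is easy to check numerically with the standard stack-based construction (a generic sketch, unrelated to this paper's recognition algorithm):

```python
import random
from collections import Counter

def hvg_degrees(series):
    """Degrees of the horizontal visibility graph via the standard
    O(n) stack construction (a generic sketch): node i links to j > i
    iff every value strictly between them is below
    min(series[i], series[j])."""
    deg = [0] * len(series)
    stack = []                      # indices holding decreasing values
    for j, xj in enumerate(series):
        while stack and series[stack[-1]] < xj:
            deg[stack.pop()] += 1   # xj terminates visibility of smaller tops
            deg[j] += 1
        if stack:                   # first not-smaller value still sees xj
            deg[stack[-1]] += 1
            deg[j] += 1
        stack.append(j)
    return deg

# Luque et al. predict P(k) = (1/3) * (2/3)**(k - 2) for i.i.d. series.
rng = random.Random(42)
counts = Counter(hvg_degrees([rng.random() for _ in range(20000)]))
p2 = counts[2] / 20000              # predicted 1/3
p3 = counts[3] / 20000              # predicted 2/9
```

With 20,000 i.i.d. uniform samples the empirical fractions land within a few per mille of the predicted 1/3 and 2/9.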

  5. Novel lidar algorithms for atmospheric slant-range visibility, planetary boundary layer height, meteorological phenomena and atmospheric layering measurements

    NASA Astrophysics Data System (ADS)

    Pantazis, Alexandros; Papayannis, Alexandros; Georgoussis, Georgios

    2018-04-01

    In this paper we present the development of novel algorithms and techniques implemented within the Laser Remote Sensing Laboratory (LRSL) of the National Technical University of Athens (NTUA), in collaboration with Raymetrics S.A., in order to incorporate them into a 3-Dimensional (3D) lidar. The lidar transmits at 355 nm in the eye-safe region, and the measurements are then transposed to the visual range at 550 nm, according to the World Meteorological Organization (WMO) and International Civil Aviation Organization (ICAO) rules of daytime visibility. These algorithms are able to provide horizontal, slant and vertical visibility for control tower aircraft controllers and meteorologists, but also from the pilot's point of view. Other algorithms are also provided for the detection of atmospheric layering in any given direction and vertical angle, along with the detection of the Planetary Boundary Layer Height (PBLH).

  6. Multifractal analysis of charged particle distributions using horizontal visibility graph and sandbox algorithm

    NASA Astrophysics Data System (ADS)

    Mali, P.; Mukhopadhyay, A.; Manna, S. K.; Haldar, P. K.; Singh, G.

    2017-03-01

    Horizontal visibility graphs (HVGs) and the sandbox (SB) algorithm, usually applied for the multifractal characterization of complex networks converted from time series measurements, are used here to characterize the fluctuations in pseudorapidity densities of singly charged particles produced in high-energy nucleus-nucleus collisions. Besides obtaining the degree distribution associated with event-wise pseudorapidity distributions, the common set of observables typical of any multifractality measurement is studied in 16O-Ag/Br and 32S-Ag/Br interactions, each at an incident laboratory energy of 200 GeV/nucleon. For a better understanding, we systematically compare the experiment with a Monte Carlo model simulation based on Ultra-relativistic Quantum Molecular Dynamics (UrQMD). Our results suggest that the HVG-SB technique is an efficient tool that can characterize multifractality in multiparticle emission data and, in some cases, is even superior to other methods more commonly used in this regard.
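A minimal sketch of the sandbox recipe on an arbitrary network (our generic reading of the SB algorithm, not the authors' code) estimates the generalized dimension D_q from the scaling of the node "mass" inside boxes grown around random centers:

```python
import math
import random

def sandbox_dimension(adj, radii, q, n_centers=200, seed=7):
    """Sandbox estimate of the generalized dimension D_q (q != 1) of a
    network; a sketch of the usual SB recipe, not the paper's code.
    The 'mass' M(r) is the fraction of nodes within shortest-path
    distance r of a random center; D_q is the least-squares slope of
    ln <M(r)**(q - 1)> against (q - 1) * ln r."""
    rng = random.Random(seed)
    nodes = list(adj)
    n = len(nodes)
    xs, ys = [], []
    for r in radii:
        total = 0.0
        for _ in range(n_centers):
            center = rng.choice(nodes)
            seen, frontier = {center}, [center]
            for _ in range(r):          # BFS shells out to radius r
                frontier = [v for u in frontier for v in adj[u]
                            if v not in seen]
                seen.update(frontier)
            total += (len(seen) / n) ** (q - 1)
        xs.append((q - 1) * math.log(r))
        ys.append(math.log(total / n_centers))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check on a 1-D chain, whose dimension should be close to 1.
chain = {i: [j for j in (i - 1, i + 1) if 0 <= j < 4000]
         for i in range(4000)}
d2 = sandbox_dimension(chain, radii=[4, 8, 16, 32], q=2)
```

In the HVG-SB pipeline the adjacency dictionary would come from the horizontal visibility graph of the pseudorapidity series rather than from a chain.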

  7. Multifractal analysis of multiparticle emission data in the framework of visibility graph and sandbox algorithm

    NASA Astrophysics Data System (ADS)

    Mali, P.; Manna, S. K.; Mukhopadhyay, A.; Haldar, P. K.; Singh, G.

    2018-03-01

    Multiparticle emission data in nucleus-nucleus collisions are studied in a graph theoretical approach. The sandbox algorithm used to analyze complex networks is employed to characterize the multifractal properties of the visibility graphs associated with the pseudorapidity distribution of charged particles produced in high-energy heavy-ion collisions. Experimental data on 28Si+Ag/Br interactions at laboratory energy Elab = 14.5A GeV, and 16O+Ag/Br and 32S+Ag/Br interactions both at Elab = 200A GeV, are used in this analysis. We observe a scale-free nature of the degree distributions of the visibility and horizontal visibility graphs associated with the event-wise pseudorapidity distributions. Equivalent event samples simulated by ultra-relativistic quantum molecular dynamics produce degree distributions that are almost identical to the respective experiment. However, the multifractal variables obtained by using the sandbox algorithm for the experiment differ to some extent from the respective simulated results.

  8. Complex network approach to fractional time series

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manshour, Pouya

    In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has recently been proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
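The Spearman step mentioned at the end can be sketched in plain Python (hypothetical parameter choices; `hvg_degrees` is the standard stack-based HVG construction, not the author's code). For i.i.d. noise, large values tend to see more neighbours, so the rank correlation between degrees and values is clearly positive:

```python
import random

def hvg_degrees(series):
    """O(n) stack construction of horizontal visibility graph degrees
    (generic sketch): i links to j > i iff all intermediate values are
    strictly below min(series[i], series[j])."""
    deg = [0] * len(series)
    stack = []
    for j, xj in enumerate(series):
        while stack and series[stack[-1]] < xj:
            deg[stack.pop()] += 1
            deg[j] += 1
        if stack:
            deg[stack[-1]] += 1
            deg[j] += 1
        stack.append(j)
    return deg

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of average ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1                  # run of tied values
            for k in range(i, j + 1):
                r[order[k]] = (i + j) / 2.0   # average rank for ties
            i = j + 1
        return r
    ra, rb = ranks(a), ranks(b)
    ma, mb = sum(ra) / len(ra), sum(rb) / len(rb)
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    va = sum((x - ma) ** 2 for x in ra) ** 0.5
    vb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (va * vb)

rng = random.Random(0)
xs = [rng.random() for _ in range(3000)]
rho_deg = spearman(hvg_degrees(xs), xs)
```

In the paper's setting the same coefficient, computed for fractional noises, is what disambiguates the anti-persistent cases.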

  9. Infrared remote sensing of the vertical and horizontal distribution of clouds

    NASA Technical Reports Server (NTRS)

    Chahine, M. T.; Haskins, R. D.

    1982-01-01

    An algorithm has been developed to derive the horizontal and vertical distribution of clouds from the same set of infrared radiance data used to retrieve atmospheric temperature profiles. The method leads to the determination of the vertical atmospheric temperature structure and the cloud distribution simultaneously, providing information on heat sources and sinks, storage rates and transport phenomena in the atmosphere. Experimental verification of this algorithm was obtained using the 15-micron data measured by the NOAA-VTPR temperature sounder. After correcting for water vapor emission, the results show that the cloud cover derived from 15-micron data is less than that obtained from visible data.

  10. Plane representations of graphs and visibility between parallel segments

    NASA Astrophysics Data System (ADS)

    Tamassia, R.; Tollis, I. G.

    1985-04-01

    Several layout compaction strategies for VLSI are based on the concept of visibility between parallel segments, where two parallel segments of a given set are said to be visible if they can be joined by a segment orthogonal to them that does not intersect any other segment. This paper studies visibility representations of graphs, which are constructed by mapping vertices to horizontal segments and edges to vertical segments drawn between visible vertex-segments. Clearly, every graph that admits such a representation must be planar. The authors consider three types of visibility representations and give complete characterizations of the classes of graphs that admit them. Furthermore, they present linear time algorithms for testing the existence of and constructing visibility representations of planar graphs.

  11. Time reversibility from visibility graphs of nonstationary processes

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas; Flanagan, Ryan

    2015-08-01

    Visibility algorithms are a family of methods to map time series into networks, with the aim of describing the structure of time series and their underlying dynamical properties in graph-theoretical terms. Here we explore some properties of both natural and horizontal visibility graphs associated with several nonstationary processes, and we pay particular attention to their capacity to assess time irreversibility. Nonstationary signals are (infinitely) irreversible by definition (independently of whether the process is Markovian or producing entropy at a positive rate), and thus the link between entropy production and time series irreversibility has only been explored in nonequilibrium stationary states. Here we show that the visibility formalism naturally induces a new working definition of time irreversibility, which allows us to quantify several degrees of irreversibility for stationary and nonstationary series, yielding finite values that can be used to efficiently assess the presence of memory and off-equilibrium dynamics in nonstationary processes without the need to differentiate or detrend them. We provide rigorous results complemented by extensive numerical simulations on several classes of stochastic processes.
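A common visibility-based irreversibility score compares the out- and in-degree distributions of the time-directed HVG via a Kullback-Leibler divergence. The sketch below (a generic recipe, not the authors' code) restricts the sum to the shared support to keep the finite-sample estimate finite; a reversible i.i.d. series scores near zero, while the strongly irreversible chaotic logistic map scores visibly higher:

```python
import math
import random
from collections import Counter

def hvg_in_out_degrees(series):
    """In/out degrees of the directed HVG, links pointing forward in
    time; generic O(n) stack construction sketch."""
    n = len(series)
    in_d, out_d = [0] * n, [0] * n
    stack = []
    for j, xj in enumerate(series):
        while stack and series[stack[-1]] < xj:
            out_d[stack.pop()] += 1
            in_d[j] += 1
        if stack:
            out_d[stack[-1]] += 1
            in_d[j] += 1
        stack.append(j)
    return in_d, out_d

def kld_irreversibility(series):
    """KL divergence between out- and in-degree distributions, summed
    over the shared support (a pragmatic choice to avoid infinities)."""
    in_d, out_d = hvg_in_out_degrees(series)
    n = len(series)
    p_out, p_in = Counter(out_d), Counter(in_d)
    return sum((c / n) * math.log((c / n) / (p_in[k] / n))
               for k, c in p_out.items() if p_in[k] > 0)

x, logistic = 0.4, []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)          # fully chaotic logistic map
    logistic.append(x)
rng = random.Random(1)
noise = [rng.random() for _ in range(5000)]
```

Comparing `kld_irreversibility(logistic)` against `kld_irreversibility(noise)` illustrates the stationary end of the spectrum discussed in the abstract; the nonstationary extension replaces the global degree distributions with their suitably normalized counterparts.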

  12. From standard alpha-stable Lévy motions to horizontal visibility networks: dependence of multifractal and Laplacian spectrum

    NASA Astrophysics Data System (ADS)

    Zou, Hai-Long; Yu, Zu-Guo; Anh, Vo; Ma, Yuan-Lin

    2018-05-01

    In recent years, researchers have proposed several methods to transform time series (such as those of fractional Brownian motion) into complex networks. In this paper, we construct horizontal visibility networks (HVNs) based on the α-stable Lévy motion. We aim to study how the multifractal and Laplacian spectra of the transformed networks depend on the parameters α and β of the α-stable Lévy motion. First, we employ the sandbox algorithm to compute the mass exponents and multifractal spectrum to investigate the multifractality of these HVNs. Then we perform least-squares fits to find possible relations of the average fractal dimension, the average information dimension and the average correlation dimension against α, using several methods of model selection. We also investigate possible dependence relations on α of the eigenvalues and energy calculated from the Laplacian and normalized Laplacian operators of the constructed HVNs. All of these constructions and estimates will help us to evaluate the validity and usefulness of the mappings between time series and networks, especially between time series of α-stable Lévy motions and HVNs.

  13. Impact of spatial resolution on cirrus infrared satellite retrievals in the presence of cloud heterogeneity

    NASA Astrophysics Data System (ADS)

    Fauchez, T.; Platnick, S. E.; Meyer, K.; Zhang, Z.; Cornet, C.; Szczap, F.; Dubuisson, P.

    2015-12-01

    Cirrus clouds are an important part of the Earth radiation budget, but an accurate assessment of their role remains highly uncertain. Cirrus optical properties such as Cloud Optical Thickness (COT) and ice crystal effective particle size are often retrieved with a combination of Visible/Near-InfraRed (VNIR) and ShortWave-InfraRed (SWIR) reflectance channels. Alternatively, Thermal InfraRed (TIR) techniques, such as the Split Window Technique (SWT), have demonstrated better accuracy for retrieving the small effective radii of thin cirrus. However, current global operational algorithms for both retrieval methods assume that cloudy pixels are horizontally homogeneous (Plane Parallel Approximation (PPA)) and independent (Independent Pixel Approximation (IPA)). The impact of these approximations on ice cloud retrievals needs to be understood and, as far as possible, corrected. Horizontal heterogeneity effects in the TIR spectrum are dominated mainly by the PPA bias, which depends primarily on the COT subpixel heterogeneity; for solar reflectance channels, in addition to the PPA bias, the IPA can lead to significant retrieval errors due to significant photon horizontal transport between cloudy columns, as well as brightening and shadowing effects that are more difficult to quantify. The TIR range is thus particularly relevant for characterizing thin cirrus clouds as accurately as possible. Heterogeneity effects in the TIR are evaluated as a function of spatial resolution in order to estimate the optimal spatial resolution for TIR retrieval applications.
These investigations are performed using a cirrus 3D cloud generator (3DCloud), a 3D radiative transfer code (3DMCPOL), and two retrieval algorithms, namely the operational MODIS retrieval algorithm (MOD06) and a research-level SWT algorithm.

  14. Quality Assessment of the Cobel-Isba Numerical Forecast System of Fog and Low Clouds

    NASA Astrophysics Data System (ADS)

    Bergot, Thierry

    2007-06-01

    Short-term forecasting of fog is a difficult issue that can have a large societal impact. Fog appears in the surface boundary layer and is driven by the interactions between the land surface and the lower layers of the atmosphere. These interactions are still not well parameterized in current operational NWP models, and a new methodology based on local observations, an adaptive assimilation scheme and a local numerical model is tested. The proposed numerical forecast method for foggy conditions has been run for three years at Paris-CdG international airport. This test over a long period allows an in-depth evaluation of forecast quality. The study demonstrates that detailed 1-D models, including detailed physical parameterizations and high vertical resolution, can reasonably represent the major features of the life cycle of fog (onset, development and dissipation) up to +6 h. The error on the forecast onset and burn-off times is typically 1 h. The major weakness of the methodology is related to the evolution of low clouds (stratus lowering). Even when the occurrence of fog is well forecast, the value of the horizontal visibility is only crudely forecast. Improvements in the microphysical parameterization and in the translation algorithm converting NWP prognostic variables into a corresponding horizontal visibility seem necessary to forecast the value of the visibility accurately.
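To make the "translation algorithm" concrete, here is an illustrative stand-in (an assumption on our part, not the actual Cobel-Isba scheme): the Kunkel (1984) fog-extinction parameterization turns a prognostic liquid water content into an extinction coefficient, and Koschmieder's 2% contrast law turns that into a visibility:

```python
def visibility_from_lwc(lwc_gm3):
    """Illustrative NWP-variable-to-visibility translation: Kunkel
    (1984) fog extinction, beta = 144.7 * C**0.88 (km^-1, C in g m^-3),
    combined with Koschmieder's law V = -ln(0.02)/beta = 3.912/beta.
    An assumed example, not the Cobel-Isba translation algorithm."""
    beta = 144.7 * lwc_gm3 ** 0.88   # extinction coefficient, km^-1
    return 3.912 / beta              # horizontal visibility, km
```

A dense fog with 0.1 g m^-3 of liquid water maps to a visibility of roughly 0.2 km, and visibility decreases monotonically as the water content grows; errors in the microphysics therefore propagate directly into the forecast visibility value, which is the weakness the abstract points to.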

  15. Visibility graphs and symbolic dynamics

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas; Just, Wolfram

    2018-07-01

    Visibility algorithms are a family of geometric and ordering criteria by which a real-valued time series of N data points is mapped into a graph of N nodes. This graph has been shown to often inherit in its topology nontrivial properties of the series structure, and can thus be seen as a combinatorial representation of a dynamical system. Here we explore in some detail the relation between visibility graphs and symbolic dynamics. To do that, we consider the degree sequence of horizontal visibility graphs generated by the one-parameter logistic map, for a range of values of the parameter for which the map shows chaotic behaviour. Numerically, we observe that in the chaotic region the block entropies of these sequences systematically converge to the Lyapunov exponent of the time series. Hence, Pesin's identity suggests that these block entropies are converging to the Kolmogorov-Sinai entropy of the physical measure, which ultimately suggests that the algorithm is implicitly and adaptively constructing phase space partitions which might have the generating property. To give analytical insight, we explore the relation k(x), x ∈ [0, 1] that, for a given datum with value x, assigns in graph space a node with degree k. In the case of the out-degree sequence, this relation is indeed a piecewise-constant function. By making use of explicit methods and tools from symbolic dynamics we are able to show analytically that the algorithm indeed performs an effective partition of the phase space and that such a partition is naturally expressed as a countable union of subintervals, where the endpoints of each subinterval are related to the fixed point structure of the iterates of the map and the subinterval enumeration is associated with particular ordering structures that we call motifs.
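The block-entropy computation described here can be sketched with a plug-in estimator on the HVG degree sequence of the fully chaotic logistic map (a generic sketch, not the authors' code; the paper reports that such estimates converge toward the Lyapunov exponent, ln 2 ≈ 0.693 at r = 4):

```python
import math
from collections import Counter

def hvg_degrees(series):
    """Standard O(n) stack construction of HVG degrees (generic sketch)."""
    deg = [0] * len(series)
    stack = []
    for j, xj in enumerate(series):
        while stack and series[stack[-1]] < xj:
            deg[stack.pop()] += 1
            deg[j] += 1
        if stack:
            deg[stack[-1]] += 1
            deg[j] += 1
        stack.append(j)
    return deg

def block_entropy(seq, m):
    """Shannon entropy (nats) of the empirical distribution of the
    overlapping length-m blocks of a symbol sequence."""
    blocks = Counter(tuple(seq[i:i + m]) for i in range(len(seq) - m + 1))
    total = sum(blocks.values())
    return -sum(c / total * math.log(c / total) for c in blocks.values())

# Degree sequence of the HVG of the fully chaotic logistic map (r = 4).
x, series = 0.4, []
for _ in range(20000):
    x = 4.0 * x * (1.0 - x)
    series.append(x)
deg = hvg_degrees(series)
H1, H2, H3 = (block_entropy(deg, m) for m in (1, 2, 3))
h1, h2 = H2 - H1, H3 - H2   # entropy-rate estimates, nats per symbol
```

Plotting h_m for growing block length m is what reveals the convergence toward the Lyapunov exponent that the abstract describes; the plug-in estimator above is the simplest (and most finite-sample-biased) way to obtain it.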

  16. Algorithm theoretical baseline for formaldehyde retrievals from S5P TROPOMI and from the QA4ECV project

    NASA Astrophysics Data System (ADS)

    De Smedt, Isabelle; Theys, Nicolas; Yu, Huan; Danckaert, Thomas; Lerot, Christophe; Compernolle, Steven; Van Roozendael, Michel; Richter, Andreas; Hilboll, Andreas; Peters, Enno; Pedergnana, Mattia; Loyola, Diego; Beirle, Steffen; Wagner, Thomas; Eskes, Henk; van Geffen, Jos; Folkert Boersma, Klaas; Veefkind, Pepijn

    2018-04-01

    On board the Copernicus Sentinel-5 Precursor (S5P) platform, the TROPOspheric Monitoring Instrument (TROPOMI) is a double-channel, nadir-viewing grating spectrometer measuring solar back-scattered earthshine radiances in the ultraviolet, visible, near-infrared, and shortwave infrared with global daily coverage. In the ultraviolet range, its spectral resolution and radiometric performance are equivalent to those of its predecessor OMI, but its horizontal resolution at true nadir is improved by an order of magnitude. This paper introduces the formaldehyde (HCHO) tropospheric vertical column retrieval algorithm implemented in the S5P operational processor and comprehensively describes its various retrieval steps. Furthermore, algorithmic improvements developed in the framework of the EU FP7-project QA4ECV are described for future updates of the processor. Detailed error estimates are discussed in the light of Copernicus user requirements and needs for validation are highlighted. Finally, verification results based on the application of the algorithm to OMI measurements are presented, demonstrating the performances expected for TROPOMI.

  17. Comparison between GPR measurements and ultrasonic tomography with different inversion algorithms: an application to the base of an ancient Egyptian sculpture

    NASA Astrophysics Data System (ADS)

    Sambuelli, L.; Bohm, G.; Capizzi, P.; Cardarelli, E.; Cosentino, P.

    2011-09-01

    By late 2008, one of the most important pieces of the 'Museo delle Antichità Egizie' of Turin, the sculpture of the Pharaoh with the god Amun, was planned to be one of the masterpieces of a travelling exhibition in Japan. The 'Fondazione Museo delle Antichità Egizie di Torino', which manages the museum, was concerned about the integrity of the base of the statue, which presents visible signs of restoration dating back to the early 19th century. We were asked to estimate the persistence of the visible fractures, to search for unknown ones and to provide information about the overall mechanical strength of the base. To tackle the first question, a GPR reflection survey along three sides of the base was performed and the results were assembled in a 3D rendering. As far as the second question is concerned, two parallel, horizontal ultrasonic 2D tomograms across the base were made. We acquired, for each section, 723 ultrasonic signals corresponding to different transmitter and receiver positions. The tomographic data were inverted using four different software packages based upon different algorithms. The obtained velocity images were then compared with each other, with the GPR results and with the visible fractures in the base. A critical analysis of the comparisons is finally presented.

  18. Public Address Set AN/UIQ-10 (XLW-1)

    DTIC Science & Technology

    1972-01-01

    … 320 Hz: no visible indication. Major horizontal broadband resonance: center located on case over relay bracket, 300 to 500 Hz, peak at 375 Hz. … Hz: no visible indication. Minor horizontal broadband resonance: center located on case over relay bracket, 150 to 190 Hz, peak at … Hz: slight …

  19. Imaging System Performance and Visibility as Affected by the Physical Environment

    DTIC Science & Technology

    2013-09-30

    … devoted to the topic of light propagation and imaging across the air-sea interface and within the surface boundary layer of natural water bodies. … Zaneveld and Pegau (2003) was used to estimate the horizontal visibility of a black target, y: y = 4.8 / α, (2) where α is the … attenuation coefficient at 532 nm, was necessary for predictions of horizontal visibility of a black target. Equations (2) and (3) were applied to IOP data
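Equation (2) from the snippet is directly computable:

```python
def black_target_visibility(alpha):
    """Horizontal visibility y of a black target from Eq. (2) of the
    snippet: y = 4.8 / alpha, where alpha is the attenuation
    coefficient (m^-1) at 532 nm, giving y in meters."""
    return 4.8 / alpha
```

For example, an attenuation coefficient of 0.24 m^-1 gives a horizontal visibility of 20 m, and doubling the attenuation halves the visibility.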

  20. 14 CFR 77.19 - Civil airport imaginary surfaces.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., having visibility minimums greater than three-fourths of a statute mile; (v) 4,000 feet for that end of a... visibility minimums as low as three-fourths statute mile; and (vi) 16,000 feet for precision instrument... existing or planned for that runway end. (a) Horizontal surface. A horizontal plane 150 feet above the...

  1. 14 CFR 77.19 - Civil airport imaginary surfaces.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., having visibility minimums greater than three-fourths of a statute mile; (v) 4,000 feet for that end of a... visibility minimums as low as three-fourths statute mile; and (vi) 16,000 feet for precision instrument... existing or planned for that runway end. (a) Horizontal surface. A horizontal plane 150 feet above the...

  2. 14 CFR 77.19 - Civil airport imaginary surfaces.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., having visibility minimums greater than three-fourths of a statute mile; (v) 4,000 feet for that end of a... visibility minimums as low as three-fourths statute mile; and (vi) 16,000 feet for precision instrument... existing or planned for that runway end. (a) Horizontal surface. A horizontal plane 150 feet above the...

  3. Half-unit weighted bilinear algorithm for image contrast enhancement in capsule endoscopy

    NASA Astrophysics Data System (ADS)

    Rukundo, Olivier

    2018-04-01

    This paper proposes a novel enhancement method based exclusively on the bilinear interpolation algorithm for capsule endoscopy images. The proposed method does not convert the original RGB image components to HSV or any other color space or model; instead, it processes the RGB components directly. In each component, a group of four adjacent pixels and a half-unit weight in the bilinear weighting function are used to calculate the average pixel value, identical for each pixel in that particular group. After these calculations, groups of identical pixels are overlapped successively in the horizontal and vertical directions to achieve a preliminary-enhanced image. The final-enhanced image is achieved by halving the sum of the original and preliminary-enhanced image pixels. Quantitative and qualitative experiments were conducted focusing on pairwise comparisons between original and enhanced images. The final-enhanced images generally have the best diagnostic quality and give more detail about the visibility of vessels and structures in capsule endoscopy images.
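Our reading of the described steps can be sketched as follows (an assumed interpretation, not the author's implementation): each 2x2 group is replaced by its equal-weight (half-unit) average to form the preliminary-enhanced channel, and the final value halves the sum of original and preliminary pixels:

```python
def half_unit_bilinear_enhance(img):
    """Sketch of the half-unit weighted bilinear idea, applied to one
    channel given as a list of rows (our reading of the abstract, not
    the author's code): average each 2x2 neighbourhood with equal
    (half-unit) weights, then blend the result with the original."""
    h, w = len(img), len(img[0])

    def px(r, c):                       # edge-replicated pixel access
        return img[min(r, h - 1)][min(c, w - 1)]

    prelim = [[(px(r, c) + px(r + 1, c)
                + px(r, c + 1) + px(r + 1, c + 1)) / 4.0
               for c in range(w)] for r in range(h)]
    return [[(img[r][c] + prelim[r][c]) / 2.0 for c in range(w)]
            for r in range(h)]
```

Applied to each of the R, G and B channels independently, a flat region is left untouched while sharp local contrast is softened, which matches the smoothing-plus-blend behaviour the abstract describes.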

  4. Image registration for a UV-Visible dual-band imaging system

    NASA Astrophysics Data System (ADS)

    Chen, Tao; Yuan, Shuang; Li, Jianping; Xing, Sheng; Zhang, Honglong; Dong, Yuming; Chen, Liangpei; Liu, Peng; Jiao, Guohua

    2018-06-01

    The detection of corona discharge is an effective way for early fault diagnosis of power equipment. UV-Visible dual-band imaging can detect and locate corona discharge spots under all-weather conditions. In this study, we introduce an image registration protocol for this dual-band imaging system. The protocol consists of UV image denoising and affine transformation model establishment. We report the algorithm details of the UV image preprocessing and affine transformation model establishment, and the relevant experiments for verification of their feasibility. The denoising algorithm was based on a correlation operation between raw UV images and a continuous mask, and the transformation model was established by using corner features and a statistical method. Finally, an image fusion test was carried out to verify the accuracy of the affine transformation model. The average position displacement errors between the corona discharge and the equipment fault at different distances over a 2.5 m to 20 m range are 1.34 mm and 1.92 mm in the horizontal and vertical directions, respectively, which is precise enough for most industrial applications. The resulting protocol is not only expected to improve the efficiency and accuracy of such an imaging system for locating corona discharge spots, but also to provide a more general reference for the calibration of various dual-band imaging systems in practice.

  5. The variation of cloud amount and light rainy days under heavy pollution over South China during 1960-2009.

    PubMed

    Fu, Chuanbo; Dan, Li

    2018-01-01

    Ground observation data were used to analyze the variation of cloud amount and light precipitation over South China during 1960-2009. The total cloud cover (TCC) decreased in this period, whereas the low cloud cover (LCC) showed the obvious opposite change, with increasing trends. LCP, defined as low cloud cover/total cloud cover, increased, and small rainy days (< 10 mm day⁻¹) decreased significantly (at the 0.001 significance level) during the past 50 years, which is attributed to the enhanced levels of air pollution in the form of anthropogenic aerosols. The horizontal visibility and sunshine duration are used to depict the anthropogenic aerosol loading. When horizontal visibility declines to 20 km or sunshine duration decreases to 5 h per day, LCC increases by 52% or more and LCP increases significantly. The correlation coefficients between LCC and horizontal visibility or sunshine duration are -0.533 and -0.927, and the values between LCP and horizontal visibility or sunshine duration are -0.849 and -0.641, all significant at the 0.001 level. The results indicate that aerosols likely impacted the long-term trend of cloud amount and light precipitation over South China.

  6. Physics-Based GOES Product for Use in NREL's National Solar Radiation Database: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Manajit; Habte, Aron; Gotseff, Peter

    The Global Solar Insolation Project (GSIP) is an operational physical model from the National Oceanic and Atmospheric Administration (NOAA) that computes global horizontal irradiance (GHI) using the visible and infrared channel measurements from Geostationary Operational Environmental Satellites (GOES). GSIP uses a two-stage scheme that retrieves cloud properties and uses those properties in the Satellite Algorithm for Surface Radiation Budget (SASRAB) model to calculate surface radiation. The National Renewable Energy Laboratory, the University of Wisconsin, and NOAA have recently collaborated to adapt GSIP to create a high temporal- and spatial-resolution data set. The data sets are currently being incorporated into the widely used National Solar Radiation Data Base.

  7. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Ming; Yu, Hengyong, E-mail: hengyong-yu@ieee.org

    2015-10-15

    Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++ and linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphics processing units.

  8. Analytic reconstruction algorithms for triple-source CT with horizontal data truncation.

    PubMed

    Chen, Ming; Yu, Hengyong

    2015-10-01

    This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++ and linked via a MEX interface. A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphics processing units.

  9. Efficient Geometric Sound Propagation Using Visibility Culling

    NASA Astrophysics Data System (ADS)

    Chandak, Anish

    2011-07-01

    Simulating propagation of sound can improve the sense of realism in interactive applications such as video games and can lead to better designs in engineering applications such as architectural acoustics. In this thesis, we present geometric sound propagation techniques which are faster than prior methods and map well to upcoming parallel multi-core CPUs. We model specular reflections by using the image-source method and model finite-edge diffraction by using the well-known Biot-Tolstoy-Medwin (BTM) model. We accelerate the computation of specular reflections by applying novel visibility algorithms, FastV and AD-Frustum, which compute visibility from a point. We accelerate finite-edge diffraction modeling by applying a novel visibility algorithm which computes visibility from a region. Our visibility algorithms are based on frustum tracing and exploit recent advances in fast ray-hierarchy intersections, data-parallel computations, and scalable, multi-core algorithms. The AD-Frustum algorithm adapts its computation to the scene complexity and allows small errors in computing specular reflection paths for higher computational efficiency. FastV and our visibility algorithm from a region are general, object-space, conservative visibility algorithms that together significantly reduce the number of image sources compared to other techniques while preserving the same accuracy. Our geometric propagation algorithms are an order of magnitude faster than prior approaches for modeling specular reflections and two to ten times faster for modeling finite-edge diffraction. Our algorithms are interactive, scale almost linearly on multi-core CPUs, and can handle large, complex, and dynamic scenes. We also compare the accuracy of our sound propagation algorithms with other methods. Once sound propagation is performed, it is desirable to listen to the propagated sound in interactive and engineering applications. 
We can generate smooth, artifact-free output audio signals by applying efficient audio-processing algorithms. We also present the first efficient audio-processing algorithm for scenarios with a simultaneously moving source and moving receiver (MS-MR), which incurs less than 25% overhead compared to the static source and moving receiver (SS-MR) or moving source and static receiver (MS-SR) scenarios.
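The image-source construction used above for specular reflections can be sketched with basic vector geometry: mirror the source across the reflecting plane, then intersect the segment from the receiver to the image source with the plane to recover the bounce point. This is a minimal first-order sketch (the thesis pairs it with visibility culling to prune image sources); the function names are illustrative.

```python
import numpy as np

def image_source(source, plane_point, plane_normal):
    """Mirror a point source across a wall plane (specular reflection)."""
    s = np.asarray(source, float)
    p = np.asarray(plane_point, float)
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    return s - 2.0 * np.dot(s - p, n) * n

def reflection_path(source, receiver, plane_point, plane_normal):
    """First-order specular path: the wall hit point is where the segment
    from the receiver to the image source crosses the wall plane."""
    img = image_source(source, plane_point, plane_normal)
    r = np.asarray(receiver, float)
    p = np.asarray(plane_point, float)
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    t = np.dot(p - r, n) / np.dot(img - r, n)
    hit = r + t * (img - r)
    return img, hit

# Source and receiver 1 m above a floor at z = 0: the image lies at z = -1,
# and by symmetry the bounce point is midway between them on the floor.
img, hit = reflection_path([0.0, 0.0, 1.0], [2.0, 0.0, 1.0],
                           [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
```

A useful property of the construction: the reflected path length equals the straight-line distance from the receiver to the image source.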

  10. SAR image dataset of military ground targets with multiple poses for ATR

    NASA Astrophysics Data System (ADS)

    Belloni, Carole; Balleri, Alessio; Aouf, Nabil; Merlet, Thomas; Le Caillec, Jean-Marc

    2017-10-01

    Automatic Target Recognition (ATR) is the task of automatically detecting and classifying targets. Recognition using Synthetic Aperture Radar (SAR) images is interesting because SAR images can be acquired at night and under any weather conditions, whereas optical sensors operating in the visible band do not have this capability. Existing SAR ATR algorithms have mostly been evaluated using the MSTAR dataset [1]. The problem with MSTAR is that some of the proposed ATR methods have shown good classification performance even when the targets were hidden [2], suggesting the presence of a bias in the dataset. Evaluations of SAR ATR techniques are currently challenging due to the lack of publicly available data in the SAR domain. In this paper, we present a high resolution SAR dataset consisting of images of a set of ground military target models taken at various aspect angles. The dataset can be used for a fair evaluation and comparison of SAR ATR algorithms. We applied the Inverse Synthetic Aperture Radar (ISAR) technique to echoes from targets rotating on a turntable and illuminated with a stepped frequency waveform. The targets in the database consist of four variants of two 1.7 m-long models of T-64 and T-72 tanks. The gun, the turret position and the depression angle are varied to form 26 different sequences of images. The emitted signal spanned the frequency range from 13 GHz to 18 GHz to achieve a bandwidth of 5 GHz sampled with 4001 frequency points. The resolution obtained with respect to the size of the model targets is comparable to typical values obtained using SAR airborne systems. Single-polarized (Horizontal-Horizontal) images are generated using the backprojection algorithm [3]. A total of 1480 images are produced using a 20° integration angle. The images in the dataset are organized into a suggested training and testing set to facilitate a standard evaluation of SAR ATR algorithms.

  11. Alignment-free detection of horizontal gene transfer between closely related bacterial genomes.

    PubMed

    Domazet-Lošo, Mirjana; Haubold, Bernhard

    2011-09-01

    Bacterial epidemics are often caused by strains that have acquired their increased virulence through horizontal gene transfer. Due to this association with disease, the detection of horizontal gene transfer continues to receive attention from microbiologists and bioinformaticians alike. Most software for detecting transfer events is based on alignments of sets of genes or of entire genomes. But despite great advances in the design of algorithms and computer programs, genome alignment remains computationally challenging. We have therefore developed an alignment-free algorithm for rapidly detecting horizontal gene transfer between closely related bacterial genomes. Our implementation of this algorithm is called alfy, for "ALignment Free local homologY", and is freely available from http://guanine.evolbio.mpg.de/alfy/. In this comment we demonstrate the application of alfy to the genomes of Staphylococcus aureus. We also argue that, contrary to popular belief and in spite of increasing computer speed, algorithmic optimization is becoming more, not less, important as genome data continue to accumulate at the present rate.
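As a rough illustration of the alignment-free idea, a k-mer set comparison sidesteps the alignment step entirely. Note this Jaccard-style sketch is only an analogy: alfy itself detects local homology via exact matches, not k-mer overlap.

```python
def kmer_set(seq, k=4):
    """All overlapping k-mers of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def kmer_distance(a, b, k=4):
    """1 - Jaccard similarity of k-mer sets: a crude alignment-free distance
    (alfy itself finds exact local-homology matches, not k-mer overlap)."""
    sa, sb = kmer_set(a, k), kmer_set(b, k)
    return 1.0 - len(sa & sb) / len(sa | sb)
```

Identical sequences give a distance of 0.0, and sequences sharing no k-mers give 1.0; recently transferred segments would show up as unexpectedly low distance between otherwise divergent genomes.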

  12. Efficient Prediction of Low-Visibility Events at Airports Using Machine-Learning Regression

    NASA Astrophysics Data System (ADS)

    Cornejo-Bueno, L.; Casanova-Mateo, C.; Sanz-Justo, J.; Cerro-Prada, E.; Salcedo-Sanz, S.

    2017-11-01

    We address the prediction of low-visibility events at airports using machine-learning regression. The proposed model successfully forecasts low-visibility events in terms of the runway visual range at the airport, with the use of support-vector regression, neural networks (multi-layer perceptrons and extreme-learning machines) and Gaussian-process algorithms. We assess the performance of these algorithms based on real data collected at the Valladolid airport, Spain. We also propose a study of the atmospheric variables measured at a nearby tower related to low-visibility atmospheric conditions, since they are considered as the inputs of the different regressors. A pre-processing procedure of these input variables with wavelet transforms is also described. The results show that the proposed machine-learning algorithms are able to predict low-visibility events well. The Gaussian process is the best algorithm among those analyzed, obtaining over 98% of the correct classification rate in low-visibility events when the runway visual range is >1000 m, and about 80% under this threshold. The performance of all the machine-learning algorithms tested is clearly affected in extreme low-visibility conditions (<500 m). However, we show improved results of all the methods when data from a neighbouring meteorological tower are included, and also with a pre-processing scheme using a wavelet transform. Also presented are results of the algorithm performance in daytime and nighttime conditions, and for different prediction time horizons.
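The regression-then-threshold scheme can be sketched on synthetic data: fit a regressor to the runway visual range (RVR) and declare a low-visibility event when the prediction falls below a threshold. Ordinary least squares stands in here for the study's SVR/neural/Gaussian-process regressors, and the predictors, coefficients and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictors standing in for the tower measurements used in
# the study; the linear relation and noise level are invented.
n = 200
X = rng.uniform(0.0, 1.0, size=(n, 2))
rvr = 2000.0 - 1500.0 * X[:, 0] + 300.0 * X[:, 1] + rng.normal(0.0, 50.0, n)

# Ordinary least squares stands in for the SVR/neural/Gaussian-process
# regressors compared in the paper.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, rvr, rcond=None)
pred = A @ coef

# Regression-then-threshold: declare a low-visibility event if RVR < 1000 m.
event_true = rvr < 1000.0
event_pred = pred < 1000.0
accuracy = float(np.mean(event_true == event_pred))
```

Misclassifications concentrate near the 1000 m boundary, which is consistent with the paper's observation that performance degrades in the extreme low-visibility regime.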

  13. The parametric modified limited penetrable visibility graph for constructing complex networks from time series

    NASA Astrophysics Data System (ADS)

    Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang

    2018-02-01

    This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of the limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The addition of a view angle provides a new approach to characterizing the dynamic structure of the time series that is invisible to the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as actual natural gas price data from different regions. The empirical results indicate that the PMLPVG algorithm can distinguish the different time series from each other. Meanwhile, the analysis results for the natural gas price data using PMLPVG are consistent with detrended fluctuation analysis (DFA). The results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
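The underlying limited penetrable criterion that PMLPVG modifies can be sketched directly: two samples are linked if at most p intermediate samples "penetrate" the natural visibility line between them, and p = 0 recovers the ordinary visibility graph. A brute-force O(n^3) sketch:

```python
def lpvg_edges(series, p=1):
    """Limited penetrable visibility graph: i and j are linked if at most p
    intermediate samples rise to or above the straight (natural) visibility
    line between them; p = 0 recovers the ordinary visibility graph."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            blocked = sum(
                1 for k in range(i + 1, j)
                if series[k] >= series[i] + (series[j] - series[i]) * (k - i) / (j - i)
            )
            if blocked <= p:
                edges.add((i, j))
    return edges
```

For the toy series [3, 1, 2, 0, 4], p = 0 yields 7 links while p = 1 links all 10 pairs, showing how the penetrable distance densifies the network.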

  14. Continuous Data Assimilation for a 2D Bénard Convection System Through Horizontal Velocity Measurements Alone

    NASA Astrophysics Data System (ADS)

    Farhat, Aseel; Lunasin, Evelyn; Titi, Edriss S.

    2017-06-01

    In this paper we propose a continuous data assimilation (downscaling) algorithm for a two-dimensional Bénard convection problem. Specifically we consider the two-dimensional Boussinesq system of a layer of incompressible fluid between two solid horizontal walls, with no-normal flow and stress-free boundary conditions on the walls, and the fluid is heated from the bottom and cooled from the top. In this algorithm, we incorporate the observables as a feedback (nudging) term in the evolution equation of the horizontal velocity. We show that under an appropriate choice of the nudging parameter and the size of the spatial coarse mesh observables, and under the assumption that the observed data are error free, the solution of the proposed algorithm converges at an exponential rate, asymptotically in time, to the unique exact unknown reference solution of the original system, associated with the observed data on the horizontal component of the velocity.
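The nudging (feedback) mechanism can be illustrated on a far simpler system than the 2D Boussinesq equations: below, a Lorenz-63 "truth" run is tracked by an assimilating copy that observes only the first component, with an invented gain mu. The exponential convergence mirrors the flavor of the theorem's conclusion, but the system, gain and step size are purely illustrative.

```python
import numpy as np

def lorenz(u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 system (illustrative stand-in)."""
    x, y, z = u
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

dt, steps, mu = 1e-3, 20000, 50.0
truth = np.array([1.0, 1.0, 1.0])      # unknown reference solution
assim = np.array([-5.0, 5.0, 20.0])    # assimilating copy, wrong start

for _ in range(steps):
    obs = truth[0]                     # observe only the first component
    du = lorenz(assim)
    du[0] -= mu * (assim[0] - obs)     # nudging (feedback) term
    truth = truth + dt * lorenz(truth)
    assim = assim + dt * du

err = np.linalg.norm(assim - truth)    # small after t = 20 despite the bad start
```

Despite observing a single component, the full state of the assimilating copy locks onto the reference trajectory, which is the essence of the downscaling result.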

  15. Dust transport over Iraq and northwest Iran associated with winter Shamal: A case study

    NASA Astrophysics Data System (ADS)

    Abdi Vishkaee, Farhad; Flamant, Cyrille; Cuesta, Juan; Oolman, Larry; Flamant, Pierre; Khalesifard, Hamid R.

    2012-02-01

    Dynamical processes leading to dust emission over Syria and Iraq, in response to a strong winter Shamal event as well as the subsequent transport of dust over Iraq and northwest Iran, are analyzed on the basis of a case study (22-23 February 2010) using a suite of ground-based and spaceborne remote sensing platforms together with modeling tools. Surface measurements on 22 February show a sharp reduction in horizontal visibility over Iraq occurring shortly after the passage of a cold front (behind which the northwesterly Shamal winds were blowing) and that visibilities could be as low as 1 km on average for 1-2 days in the wake of the front. The impact of the southwesterly Kaus winds blowing ahead (east) of the Shamal winds on dust emission over Iraq is also highlighted. Unlike what is observed over Iraq, low near-surface horizontal visibilities (<1 km) over northwest Iran are observed well after the passage of the cold front on 23 February, generally in the hours following sunrise. Ground-based lidar measurements acquired in Zanjan show that, in the wake of the front, dust from Syria/Iraq was transported in an elevated 1 to 1.5 km thick plume separated from the surface during the night/morning of 23 February. After sunrise, strong turbulence in the developing convective boundary layer led to mixing of the dust into the boundary layer and in turn to a sharp reduction of the horizontal visibility in Zanjan. The timing of the reduction of surface horizontal visibility in other stations over northwest Iran (Tabriz, Qom, and Tehran) is consistent with the downward mixing of dust in the planetary boundary layer just after sunset, as evidenced in Zanjan. This study sheds new light on the processes responsible for dust emission and transport over Iraq and northwest Iran in connection with winter Shamal events. Enhanced knowledge of these processes is key for improving dust forecasts in this region.

  16. Observation of dust emission and transport over Iraq and northwest Iran associated with winter Shamal

    NASA Astrophysics Data System (ADS)

    Flamant, C.; Abdi Vishkaee, F.; Cuesta, J.; Khalesifard, H.; Oolman, L.; Flamant, P.

    2012-04-01

    Dynamical processes leading to dust emission over Syria and Iraq, in response to a strong winter Shamal event as well as the subsequent transport of dust over Iraq and northwest Iran, are analyzed on the basis of a case study (22-23 February 2010) using a suite of ground-based and space-borne remote sensing platforms together with modeling tools. Surface measurements on 22 February show a sharp reduction in horizontal visibility over Iraq occurring shortly after the passage of a cold front (behind which the northwesterly Shamal winds were blowing) and that visibilities could be as low as 1 km on average for one to two days in the wake of the front. The impact of the southwesterly Kaus winds blowing ahead (east) of the Shamal winds on dust emission over Iraq is also highlighted. Unlike what is observed over Iraq, low near-surface horizontal visibilities (less than 1 km) over northwest Iran are observed well after the passage of the cold front on 23 February, generally in the hours following sunrise. Ground-based lidar measurements acquired in Zanjan show that, in the wake of the front, dust from Syria/Iraq was transported in an elevated 1 to 1.5 km thick plume separated from the surface during the night/morning of 23 February. After sunrise, strong turbulence in the developing convective boundary layer led to mixing of the dust into the boundary layer and in turn to a sharp reduction of the horizontal visibility in Zanjan. The timing of the reduction of surface horizontal visibility in other stations over northwest Iran (Tabriz, Qom and Tehran) is consistent with the downward mixing of dust in the PBL just after sunset, as evidenced in Zanjan. This study sheds new light on the processes responsible for dust emission and transport over Iraq and northwest Iran in connection with winter Shamal events. Enhanced knowledge of these processes is key for improving dust forecasts in this region.

  17. Numerical Modeling of One-Dimensional Steady-State Flow and Contaminant Transport in a Horizontally Heterogeneous Unconfined Aquifer with an Uneven Base

    EPA Science Inventory

    Algorithms and a short description of the D1_Flow program for numerical modeling of one-dimensional steady-state flow in horizontally heterogeneous aquifers with uneven sloping bases are presented. The algorithms are based on the Dupuit-Forchheimer approximations. The program per...

  18. Sparse Reconstruction of Electric Fields from Radial Magnetic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yeates, Anthony R.

    2017-02-10

    Accurate estimates of the horizontal electric field on the Sun’s visible surface are important not only for estimating the Poynting flux of magnetic energy into the corona but also for driving time-dependent magnetohydrodynamic models of the corona. In this paper, a method is developed for estimating the horizontal electric field from a sequence of radial-component magnetic field maps. This problem of inverting Faraday’s law has no unique solution. Unfortunately, the simplest solution (a divergence-free electric field) is not realistically localized in regions of nonzero magnetic field, as would be expected from Ohm’s law. Our new method generates instead a localized solution, using a basis pursuit algorithm to find a sparse solution for the electric field. The method is shown to perform well on test cases where the input magnetic maps are flux balanced in both Cartesian and spherical geometries. However, we show that if the input maps have a significant imbalance of flux—usually arising from data assimilation—then it is not possible to find a localized, realistic, electric field solution. This is the main obstacle to driving coronal models from time sequences of solar surface magnetic maps.

  19. Improvements to the CATS Cloud-Aerosol Data Products and Implications for the Space-Based Lidar Data Record

    NASA Astrophysics Data System (ADS)

    Yorks, J. E.; McGill, M. J.; Nowottnick, E. P.; Palm, S. P.; Hlavka, D. L.; Selmer, P. A.; Rodier, S. D.; Vaughan, M.; Pauly, R.

    2017-12-01

    The Cloud-Aerosol Transport System (CATS) is an elastic backscatter lidar that has generated over 175 billion laser pulses on-orbit from the International Space Station (ISS) since February 2015. The CATS instrument was designed to demonstrate new in-space technologies for future Earth Science missions while also providing properties of clouds and aerosols such as: layer height/thickness, backscatter, optical depth, extinction, and feature type. Despite the "tech demo" nature of CATS and the lack of a funded science team, the research community is increasingly embracing CATS data. New CATS data products, the most accurate yet, were released in the summer of 2017. The major algorithm changes made in L1B Version 2-08 (V2-08) focused on the backscatter calibration and the inclusion of a new flag to notify users of granules with depolarization ratio values of poor quality. Several changes were made to the molecular folding correction factor and calibration algorithms that result in favorable comparisons between CATS, CALIPSO, and modeled Rayleigh 1064 nm backscatter profiles. Given that the 1064 nm attenuated total backscatter and depolarization ratio are used to retrieve nearly all L2O data products, the accuracy of the L2O products has also improved. Several changes were made in CATS L2O Version 2-00 data products to improve cloud and aerosol detection. The CATS L2O data now includes layer detection at both 5 and 60 km horizontal resolutions to increase daytime detection of thin cirrus and aerosol layers over land. Horizontal persistence tests prevent superficial "striping" that was visible in vertical feature mask images for horizontally homogeneous cloud and aerosol layers. Also, the absolute uncertainties for all the L2O parameters are now reported in the CATS data products. Given the uncertain status of continued CALIPSO operations, these updated CATS data products may be the only space-based lidar data record that continues into the 2018 timeframe.

  20. Comparison among GPR measurements and ultrasonic tomographies with different inversion strategies applied to the basement of an ancient egyptian sculpture.

    NASA Astrophysics Data System (ADS)

    Sambuelli, Luigi; Bohm, Gualtiero; Capizzi, Patrizia; Cardarelli, Ettore; Cosentino, Pietro; D'Onofrio, Laurent; Marchisio, Mario

    2010-05-01

    In late 2008, one of the most important pieces of the "Museo delle Antichità Egizie" in Turin, the sculpture of the Pharaoh with the god Amun, was planned to be one of the masterpieces of a travelling exhibition in Japan. The "Fondazione Museo delle Antichità Egizie di Torino", which manages the museum, was concerned about the integrity of the basement of the statue, which presents visible signs of restorations dating back to the early 19th century. The questions put by the museum managers were to estimate the internal extension of some visible fractures, to search for unknown internal ones, and to provide information about the overall mechanical strength of the basement. To tackle the first and second questions, a GPR reflection survey of the basement along three sides was performed and the results were assembled in a 3D rendering. As far as the third question is concerned, two parallel, horizontal ultrasonic 2D tomographies across the basement were made with a source-receiver layout able to acquire, for each section, 723 ultrasonic signals corresponding to different transmitter and receiver positions. The ultrasonic tomographic data were inverted using different software based upon different algorithms. The obtained velocity images were then compared with the GPR results and with the visible joints on the basement. A critical analysis of the comparisons is finally presented.

  1. Multiscale limited penetrable horizontal visibility graph for analyzing nonlinear time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong; Zhang, Shan-Shan

    2016-10-01

    The visibility graph has established itself as a powerful tool for analyzing time series. In this paper we develop a novel multiscale limited penetrable horizontal visibility graph (MLPHVG). We use nonlinear time series from two typical complex systems, i.e., EEG signals and two-phase flow signals, to demonstrate the effectiveness of our method. Combining MLPHVG and a support vector machine, we detect epileptic seizures from EEG signals recorded from healthy subjects and epilepsy patients, and the classification accuracy is 100%. In addition, we derive MLPHVGs from oil-water two-phase flow signals and find that the average clustering coefficient at different scales allows faithfully identifying and characterizing three typical oil-water flow patterns. These findings render our MLPHVG method particularly useful for analyzing nonlinear time series from the perspective of multiscale network analysis.
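The two ingredients of MLPHVG, the horizontal visibility criterion and a multiscale coarse-graining, can be sketched as follows. The non-overlapping window average used here is one common coarse-graining scheme and may differ in detail from the paper's procedure.

```python
def hvg_edges(x):
    """Horizontal visibility graph: i < j are linked iff every intermediate
    sample lies strictly below both endpoints."""
    edges = set()
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

def coarse_grain(x, s):
    """Non-overlapping window averages at scale s (one common multiscale scheme)."""
    return [sum(x[i:i + s]) / s for i in range(0, len(x) - s + 1, s)]

series = [2.0, 1.0, 3.0, 0.5, 2.5, 1.5]
g1 = hvg_edges(series)                    # graph at scale 1
g2 = hvg_edges(coarse_grain(series, 2))   # graph at scale 2
```

Network measures such as the average clustering coefficient would then be computed on the graph at each scale, which is how the flow patterns are discriminated in the paper.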

  2. Efficiently Sorting Zoo-Mesh Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, R; Max, N; Silva, C

    The authors describe the SXMPVO algorithm for performing a visibility ordering of zoo-meshed polyhedra. The algorithm runs in practice in linear time, and the visibility ordering it produces is exact.

  3. Gamma-Weighted Discrete Ordinate Two-Stream Approximation for Computation of Domain Averaged Solar Irradiance

    NASA Technical Reports Server (NTRS)

    Kato, S.; Smith, G. L.; Barker, H. W.

    2001-01-01

    An algorithm is developed for the gamma-weighted discrete ordinate two-stream approximation that computes profiles of domain-averaged shortwave irradiances for horizontally inhomogeneous cloudy atmospheres. The algorithm assumes that frequency distributions of cloud optical depth at unresolved scales can be represented by a gamma distribution though it neglects net horizontal transport of radiation. This algorithm is an alternative to the one used in earlier studies that adopted the adding method. At present, only overcast cloudy layers are permitted.
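For the direct beam, the gamma weighting has a convenient closed form: averaging exp(-tau/mu0) over a gamma distribution with mean tau_mean and shape nu gives (1 + tau_mean/(nu*mu0))^(-nu), which is just the gamma moment generating function. The sketch below checks this identity numerically; it illustrates only the gamma-weighting idea and is not the two-stream algorithm itself.

```python
import math

def gamma_weighted_transmittance(tau_mean, nu, mu0, nbins=20000, tau_max=200.0):
    """Midpoint-rule average of the direct-beam transmittance exp(-tau/mu0)
    over a gamma distribution of optical depth (mean tau_mean, shape nu)."""
    dt = tau_max / nbins
    norm = (nu / tau_mean) ** nu / math.gamma(nu)
    total = 0.0
    for i in range(nbins):
        tau = (i + 0.5) * dt
        pdf = norm * tau ** (nu - 1.0) * math.exp(-nu * tau / tau_mean)
        total += math.exp(-tau / mu0) * pdf * dt
    return total

tau_mean, nu, mu0 = 10.0, 1.5, 0.5
numeric = gamma_weighted_transmittance(tau_mean, nu, mu0)
analytic = (1.0 + tau_mean / (nu * mu0)) ** (-nu)  # gamma MGF closed form
```

The averaged transmittance is much larger than exp(-tau_mean/mu0), which is why accounting for subgrid variability of optical depth matters for domain-averaged irradiance.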

  4. Cultural differences in asymmetric beliefs of interpersonal knowledge in vertical and horizontal relationships.

    PubMed

    Cho, Yohan; Otani, Hajime; Han, Kyunghee; Van Horn, K Roger

    2010-01-01

    Previous studies have reported that our interpersonal knowledge shows an asymmetry; that is, we tend to believe that we know and understand other people's thoughts and feelings better than other people know and understand our own thoughts and feelings. In the present study, the authors compared American (114 men, 192 women) and Korean (99 men, 98 women) students to examine whether the asymmetry is greater in collectivistic than in individualistic culture in two types of relationships: horizontal (with best friends) and vertical (with parents). On all three items (Know, Understand, and Visibility), asymmetry was found for both horizontal and vertical relationships. Further, the Understand and Visibility items showed greater asymmetry for the Korean group than for the American group. It was concluded that asymmetry is greater in collectivistic than in individualistic culture. The cultural differences can be explained by self-consistency, sensitivity to social consequences, parent-child interaction, and living arrangements.

  5. Improving a maximum horizontal gradient algorithm to determine geological body boundaries and fault systems based on gravity data

    NASA Astrophysics Data System (ADS)

    Van Kha, Tran; Van Vuong, Hoang; Thanh, Do Duc; Hung, Duong Quoc; Anh, Le Duc

    2018-05-01

    The maximum horizontal gradient method was first proposed by Blakely and Simpson (1986) for determining the boundaries between geological bodies with different densities. The method involves the comparison of a center point with its eight nearest neighbors in four directions within each 3 × 3 calculation grid. The horizontal location and magnitude of the maximum values are found by interpolating a second-order polynomial through the trio of points provided that the magnitude of the middle point is greater than its two nearest neighbors in one direction. In theoretical models of multiple sources, however, the above condition does not allow the maximum horizontal locations to be fully located, and it could be difficult to correlate the edges of complicated sources. In this paper, the authors propose an additional condition to identify more maximum horizontal locations within the calculation grid. This additional condition will improve the method algorithm for interpreting the boundaries of magnetic and/or gravity sources. The improved algorithm was tested on gravity models and applied to gravity data for the Phu Khanh basin on the continental shelf of the East Vietnam Sea. The results show that the additional locations of the maximum horizontal gradient could be helpful for connecting the edges of complicated source bodies.
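The Blakely-Simpson scan described above can be sketched as follows: within each 3 × 3 window of the gradient-magnitude grid, the centre is tested against its two neighbours along each of the four directions, and where it exceeds both, a second-order polynomial through the three values refines the peak location. This sketch implements only the original criterion, not the paper's additional condition.

```python
import numpy as np

def max_horizontal_gradient_points(g):
    """Scan each 3x3 window of a gradient-magnitude grid: along each of the
    four directions (row, column, two diagonals), if the centre exceeds both
    neighbours, refine the peak via the vertex of the parabola through the
    three values (the denominator is negative whenever the centre is a
    strict maximum, so it is never zero here)."""
    peaks = []
    rows, cols = g.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            b = g[r, c]
            for dr, dc in [(0, 1), (1, 0), (1, 1), (1, -1)]:
                a, cc = g[r - dr, c - dc], g[r + dr, c + dc]
                if b > a and b > cc:
                    off = 0.5 * (a - cc) / (a - 2.0 * b + cc)
                    peaks.append((r + off * dr, c + off * dc))
    return peaks

# A unit-amplitude ridge along column 2 of a 5x5 grid: every refined peak
# should sit exactly on that column.
g = np.zeros((5, 5))
g[:, 2] = 1.0
peaks = max_horizontal_gradient_points(g)
```

The paper's improvement relaxes the "strictly greater than both neighbours" test so that additional maximum locations are kept, which helps connect the edges of complicated sources.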

  6. 33 CFR 67.05-1 - Arrangement of obstruction lights.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Arrangement of obstruction lights... for Lights § 67.05-1 Arrangement of obstruction lights. (a) Structures having a maximum horizontal... light visible for 360°. (b) Structures having a maximum horizontal dimension of over 30 feet, but not in...

  7. 33 CFR 67.05-1 - Arrangement of obstruction lights.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Arrangement of obstruction lights... for Lights § 67.05-1 Arrangement of obstruction lights. (a) Structures having a maximum horizontal... light visible for 360°. (b) Structures having a maximum horizontal dimension of over 30 feet, but not in...

  8. 33 CFR 67.05-1 - Arrangement of obstruction lights.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Arrangement of obstruction lights... for Lights § 67.05-1 Arrangement of obstruction lights. (a) Structures having a maximum horizontal... light visible for 360°. (b) Structures having a maximum horizontal dimension of over 30 feet, but not in...

  9. 33 CFR 67.05-1 - Arrangement of obstruction lights.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Arrangement of obstruction lights... for Lights § 67.05-1 Arrangement of obstruction lights. (a) Structures having a maximum horizontal... light visible for 360°. (b) Structures having a maximum horizontal dimension of over 30 feet, but not in...

  10. 33 CFR 67.05-1 - Arrangement of obstruction lights.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Arrangement of obstruction lights... for Lights § 67.05-1 Arrangement of obstruction lights. (a) Structures having a maximum horizontal... light visible for 360°. (b) Structures having a maximum horizontal dimension of over 30 feet, but not in...

  11. Real-time 3-D X-ray and gamma-ray viewer

    NASA Technical Reports Server (NTRS)

    Yin, L. I. (Inventor)

    1983-01-01

    A multi-pinhole aperture lead screen forms an equal plurality of invisible mini-images having dissimilar perspectives of an X-ray and gamma-ray emitting object (ABC) onto a nearby phosphor layer. This layer provides visible light mini-images directly to a visible light image intensifier. A viewing screen, having an equal number of dissimilar perspective apertures distributed across its face in a geometric pattern identical to the lead screen, provides a viewer with a real, pseudoscopic image (A'B'C') of the object with full horizontal and vertical parallax. Alternatively, a third screen, identical to the viewing screen and spaced apart from a second visible light image intensifier, may be positioned between the first image intensifier and the viewing screen, thereby providing the viewer with a virtual, orthoscopic image (A"B"C") of the object (ABC) with full horizontal and vertical parallax.

  12. Infrared and visible image fusion based on total variation and augmented Lagrangian.

    PubMed

    Guo, Hanqi; Ma, Yong; Mei, Xiaoguang; Ma, Jiayi

    2017-11-01

    This paper proposes a new algorithm for infrared and visible image fusion based on gradient transfer that achieves fusion by preserving the intensity of the infrared image and then transferring gradients in the corresponding visible one to the result. Gradient transfer suffers from the problems of low dynamic range and detail loss because it ignores the intensity of the visible image. The new algorithm solves these problems by providing additive intensity from the visible image to balance the intensity between the infrared image and the visible one. It formulates the fusion task as an l1-l1-TV minimization problem and then employs variable splitting and an augmented Lagrangian to convert the unconstrained problem into a constrained one that can be solved in the framework of the alternating direction method of multipliers. Experiments demonstrate that the new algorithm achieves better fusion results, with high computational efficiency, in both qualitative and quantitative tests than gradient transfer and most state-of-the-art methods.

  13. Formally Verified Practical Algorithms for Recovery from Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Caesar A.

    2009-01-01

    In this paper, we develop and formally verify practical algorithms for recovery from loss of separation. The formal verification is performed in the context of a criteria-based framework. This framework provides rigorous definitions of horizontal and vertical maneuver correctness that guarantee divergence and achieve horizontal and vertical separation. The algorithms are shown to be independently correct, that is, separation is achieved when only one aircraft maneuvers, and implicitly coordinated, that is, separation is also achieved when both aircraft maneuver. In this paper we improve the horizontal criteria over our previous work. An important benefit of the criteria approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).

  14. Tomographic reconstruction of an aerosol plume using passive multiangle observations from the MISR satellite instrument

    NASA Astrophysics Data System (ADS)

    Garay, Michael J.; Davis, Anthony B.; Diner, David J.

    2016-12-01

    We present initial results using computed tomography to reconstruct the three-dimensional structure of an aerosol plume from passive observations made by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. MISR views the Earth from nine different angles at four visible and near-infrared wavelengths. Adopting the 672 nm channel, we treat each view as an independent measure of aerosol optical thickness along the line of sight at 1.1 km resolution. A smoke plume over dark water is selected as it provides a more tractable lower boundary condition for the retrieval. A tomographic algorithm is used to reconstruct the horizontal and vertical aerosol extinction field for one along-track slice from the path of all camera rays passing through a regular grid. The results compare well with ground-based lidar observations from a nearby Micropulse Lidar Network site.

  15. Algorithms to automate gap-dependent integral tuning for the 2.8-meter long horizontal field undulator with a dynamic force compensation mechanism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Joseph Z., E-mail: x@anl.gov; Vasserman, Isaac; Strelnikov, Nikita

    2016-07-27

    A 2.8-meter long horizontal field prototype undulator with a dynamic force compensation mechanism has been developed and tested at the Advanced Photon Source (APS) at Argonne National Laboratory (Argonne). The magnetic tuning of the undulator integrals has been automated and accomplished by applying magnetic shims. A detailed description of the algorithms and performance is reported.

  16. Characterization of Surface Reflectance Variation Effects on Remote Sensing

    NASA Technical Reports Server (NTRS)

    Pearce, W. A.

    1984-01-01

    The use of Monte Carlo radiative transfer codes to simulate, at visible and infrared wavelengths, the effects on remote sensing of variables that affect classification is examined. These variables include detector viewing angle, atmospheric aerosol size distribution, aerosol vertical and horizontal distribution (e.g., finite clouds), the form of the bidirectional ground reflectance function, and horizontal variability of reflectance type and reflectivity (albedo). These simulations are used to characterize the sensitivity of observables (intensity and polarization) to variations in the underlying physical parameters, both to improve algorithms for the removal of atmospheric effects and to identify techniques that can improve classification accuracy. It was necessary to revise and validate the simulation codes (CTRANS, ARTRAN, and the Mie scattering code) to improve efficiency and accommodate a new operational environment, and to build the basic software tools for acquisition and off-line manipulation of simulation results. Initial calculations compare cases in which increasing amounts of aerosol are shifted into the stratosphere while maintaining a constant optical depth. For moderate aerosol optical depth, the effect on the spread function is to scale it linearly, as would be expected from a single-scattering model. Varying the viewing angle appears to produce the same qualitative effect as modifying the vertical optical depth (for Lambertian ground reflectance).

  17. Adaptive inversion algorithm for 1.5 μm visibility lidar incorporating in situ Angstrom wavelength exponent

    NASA Astrophysics Data System (ADS)

    Shang, Xiang; Xia, Haiyun; Dou, Xiankang; Shangguan, Mingjia; Li, Manyi; Wang, Chong

    2018-07-01

    An eye-safe 1.5 μm visibility lidar that accounts for the in situ particle size distribution is presented in this work; it can be deployed in crowded places such as airports. In such a case, the measured extinction coefficient at 1.5 μm must be converted to that at 0.55 μm for visibility retrieval. Although several models have been established since 1962, accurate wavelength conversion remains a challenge. An adaptive inversion algorithm for the 1.5 μm visibility lidar is proposed and demonstrated using the in situ Angstrom wavelength exponent, which is derived from an aerosol spectrometer. The impacts of the particle size distribution of atmospheric aerosols and of the Rayleigh backscattering of atmospheric molecules are taken into account. Using the 1.5 μm visibility lidar, visibility with a temporal resolution of 5 min was measured over 48 h in Hefei (31.83° N, 117.25° E). The average visibility error between the new method and a visibility sensor (Vaisala PWD52) is 5.2%, with an R-square value of 0.96, while the relative error between another reference visibility lidar at 532 nm and the visibility sensor is 6.7%, with an R-square value of 0.91. All results agree well with each other, demonstrating the accuracy and stability of the algorithm.
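    The two relations at the core of such a retrieval are standard and compact: the Angstrom law scales extinction between wavelengths, and Koschmieder's relation (with the 3.912 factor corresponding to a 2% contrast threshold) converts 0.55 μm extinction into visibility. A toy sketch under those two relations only, omitting the paper's treatment of molecular (Rayleigh) backscatter and its adaptive inversion:

```python
def visibility_km(sigma_1550, angstrom):
    """Convert an extinction coefficient measured at 1.55 um (per km)
    to 0.55 um via the Angstrom law, then apply Koschmieder's relation."""
    sigma_550 = sigma_1550 * (1.55 / 0.55) ** angstrom  # Angstrom scaling
    return 3.912 / sigma_550                            # 2% contrast threshold

# A larger Angstrom exponent (smaller particles) implies more extinction
# at 0.55 um, hence lower reported visibility for the same 1.55-um signal.
```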

  18. 49 CFR 213.113 - Defective rails.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... across the web, originating from a bolt hole, and progressing on a path either inclined upward toward the... horizontally along the head/web or base/web fillet, or they may progress into and through the head or base to... depression is visible on the rail head only, the sagging or drooping is also visible in the head/web fillet...

  19. Rotation-invariant features for multi-oriented text detection in natural images.

    PubMed

    Yao, Cong; Zhang, Xin; Bai, Xiang; Liu, Wenyu; Ma, Yi; Tu, Zhuowen

    2013-01-01

    Texts in natural scenes carry rich semantic information, which can be used to assist a wide range of applications, such as object recognition, image/video retrieval, mapping/navigation, and human computer interaction. However, most existing systems are designed to detect and recognize horizontal (or near-horizontal) texts. Due to the increasing popularity of mobile-computing devices and applications, detecting texts of varying orientations from natural images under less controlled conditions has become an important but challenging task. In this paper, we propose a new algorithm to detect texts of varying orientations. Our algorithm is based on a two-level classification scheme and two sets of features specially designed for capturing the intrinsic characteristics of texts. To better evaluate the proposed method and compare it with the competing algorithms, we generate a comprehensive dataset with various types of texts in diverse real-world scenes. We also propose a new evaluation protocol, which is more suitable for benchmarking algorithms for detecting texts in varying orientations. Experiments on benchmark datasets demonstrate that our system compares favorably with the state-of-the-art algorithms when handling horizontal texts and achieves significantly enhanced performance on variant texts in complex natural scenes.

  20. Optimisation of the mean boat velocity in rowing.

    PubMed

    Rauter, G; Baumgartner, L; Denoth, J; Riener, R; Wolf, P

    2012-01-01

    In rowing, motor learning may be facilitated by augmented feedback that displays the ratio between actual mean boat velocity and maximal achievable mean boat velocity. To provide this ratio, the aim of this work was to develop and evaluate an algorithm calculating an individual maximal mean boat velocity. The algorithm optimised the horizontal oar movement under constraints such as the individual range of the horizontal oar displacement, individual timing of catch and release and an individual power-angle relation. Immersion and turning of the oar were simplified, and the seat movement of a professional rower was implemented. The feasibility of the algorithm, and of the associated ratio between actual boat velocity and optimised boat velocity, was confirmed by a study on four subjects: as expected, advanced rowing skills resulted in higher ratios, and the maximal mean boat velocity depended on the range of the horizontal oar displacement.

  1. Time series analysis of S&P 500 index: A horizontal visibility graph approach

    NASA Astrophysics Data System (ADS)

    Vamvakaris, Michail D.; Pantelous, Athanasios A.; Zuev, Konstantin M.

    2018-05-01

    The behavior of stock prices has been studied thoroughly throughout the last century, and contradictory results have been reported in the corresponding literature. In this paper, a network-theoretical approach is used to investigate how crises affected the behavior of US stock prices. We analyze high-frequency data from the S&P 500 via the Horizontal Visibility Graph method and find that all major crises that took place worldwide in the last twenty years significantly affected the behavior of the price index. Nevertheless, we observe that each of those crises impacted the index in a different way and to a different degree. Interestingly, our results suggest that the predictability of the price-index series increases during periods of crisis.
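    The horizontal visibility criterion behind the method is compact: two samples are connected when every sample strictly between them lies below both. A minimal brute-force sketch (not the authors' code; linear-time stack-based constructions exist for long series):

```python
def hvg_edges(series):
    """Horizontal visibility graph: samples i < j are linked when every
    value strictly between them is lower than both endpoints."""
    edges = []
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            if all(series[k] < min(series[i], series[j])
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges

# Adjacent samples are always mutually visible; the 2 at index 2
# blocks the (1, 3) pair because it is not below min(x1, x3) = 1.
print(hvg_edges([3, 1, 2, 4]))  # → [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3)]
```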

  2. Modeling and analysis of LWIR signature variability associated with 3D and BRDF effects

    NASA Astrophysics Data System (ADS)

    Adler-Golden, Steven; Less, David; Jin, Xuemin; Rynes, Peter

    2016-05-01

    Algorithms for retrieval of surface reflectance, emissivity or temperature from a spectral image almost always assume uniform illumination across the scene and horizontal surfaces with Lambertian reflectance. When these algorithms are used to process real 3-D scenes, the retrieved "apparent" values contain the strong, spatially dependent variations in illumination as well as surface bidirectional reflectance distribution function (BRDF) effects. This is especially problematic with horizontal or near-horizontal viewing, where many observed surfaces are vertical, and where horizontal surfaces can show strong specularity. The goals of this study are to characterize long-wavelength infrared (LWIR) signature variability in a HSI 3-D scene and develop practical methods for estimating the true surface values. We take advantage of synthetic near-horizontal imagery generated with the high-fidelity MultiService Electro-optic Signature (MuSES) model, and compare retrievals of temperature and directional-hemispherical reflectance using standard sky downwelling illumination and MuSES-based non-uniform environmental illumination.

  3. Robust Vision-Based Pose Estimation Algorithm for AN Uav with Known Gravity Vector

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.

    2016-06-01

    Accurate estimation of camera external orientation with respect to a known object is one of the central problems in photogrammetry and computer vision. In recent years this problem has been gaining increasing attention in the field of UAV autonomous flight. Such applications require real-time performance and robustness of the external orientation estimation algorithm. The accuracy of the solution depends strongly on the number of reference points visible in the given image. The problem has an analytical solution only if 3 or more reference points are visible. However, in limited-visibility conditions it is often necessary to perform external orientation with only 2 visible reference points. In that case a solution can be found if the direction of the gravity vector in the camera coordinate system is known. A number of algorithms for external orientation estimation for the case of 2 known reference points and a gravity vector have been developed to date. Most of these algorithms provide an analytical solution in the form of a polynomial equation that is subject to large errors for complex reference point configurations. This paper focuses on the development of a new computationally effective and robust algorithm for external orientation based on the positions of 2 known reference points and a gravity vector. The algorithm's implementation for guidance of a Parrot AR.Drone 2.0 micro-UAV is discussed. Experimental evaluation proved the algorithm's computational efficiency and robustness against errors in reference point positions and complex configurations.

  4. Algorithm for automatic analysis of electro-oculographic data

    PubMed Central

    2013-01-01

    Background: Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. Methods: The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. Results: The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. Conclusion: The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement. PMID:24160372

  5. Algorithm for automatic analysis of electro-oculographic data.

    PubMed

    Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti

    2013-10-25

    Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with a 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate measurement.
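    The amplitude-threshold idea can be illustrated with a minimal fixed-threshold velocity detector. This toy is not the authors' auto-calibrating algorithm (which estimates thresholds from signal features); the sampling rate and threshold here are arbitrary assumptions.

```python
def detect_events(eog, fs, vel_thresh):
    """Flag runs of samples whose absolute sample-to-sample velocity
    exceeds a fixed threshold; each run is one candidate saccade/blink.

    Returns (start, end) index pairs into the velocity trace."""
    vel = [(eog[i + 1] - eog[i]) * fs for i in range(len(eog) - 1)]
    events, start = [], None
    for i, v in enumerate(vel):
        if abs(v) > vel_thresh:
            if start is None:
                start = i
        elif start is not None:
            events.append((start, i))
            start = None
    if start is not None:
        events.append((start, len(vel)))
    return events

# One fast upward deflection between velocity samples 2 and 5 of a toy trace.
print(detect_events([0, 0, 0, 1, 2, 3, 3, 3], fs=1.0, vel_thresh=0.5))  # → [(2, 5)]
```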

  6. Three-dimensional high-precision indoor positioning strategy using Tabu search based on visible light communication

    NASA Astrophysics Data System (ADS)

    Peng, Qi; Guan, Weipeng; Wu, Yuxiang; Cai, Ye; Xie, Canyu; Wang, Pengfei

    2018-01-01

    This paper proposes a three-dimensional (3-D) high-precision indoor positioning strategy using Tabu search based on visible light communication. Tabu search is a powerful global optimization algorithm, and 3-D indoor positioning can be cast as an optimization problem, so the optimal receiver coordinate can be obtained with the Tabu search algorithm. To the best of our knowledge, this is the first time the Tabu search algorithm has been applied to visible light positioning. Each light-emitting diode (LED) in the system broadcasts a unique identity (ID) and transmits the ID information. When the receiver detects optical signals carrying ID information from different LEDs, 3-D high-precision indoor positioning is realized, via the global optimization of the Tabu search algorithm, once the fitness value meets certain conditions. Simulation results show that the average positioning error is 0.79 cm and the maximum error is 5.88 cm. An extended trajectory-tracking experiment also shows that 95.05% of positioning errors are below 1.428 cm. It can be concluded from these data that 3-D indoor positioning based on the Tabu search algorithm meets the requirements of centimeter-level indoor positioning. The algorithm is effective and practical, and is superior to other existing methods for visible light indoor positioning.
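    The framing of positioning as an optimization problem can be sketched with a toy range-based cost and a bare-bones Tabu search over a 3-D grid of candidate positions. The anchor layout, step size, tabu-list length, and noise-free ranges below are my assumptions for illustration, not the paper's configuration or fitness function.

```python
import math

ANCHORS = [(0, 0, 3), (4, 0, 3), (0, 4, 3), (4, 4, 3)]  # hypothetical LED positions (m)
TRUE_POS = (1.5, 2.0, 1.0)                              # receiver to recover
RANGES = [math.dist(TRUE_POS, a) for a in ANCHORS]      # noise-free demo ranges

def cost(p):
    """Sum of squared range residuals for candidate position p."""
    return sum((math.dist(p, a) - r) ** 2 for a, r in zip(ANCHORS, RANGES))

def tabu_search(start, step=0.25, iters=150, tabu_len=25):
    """Greedy neighbourhood search with a short-term tabu memory:
    always take the best non-tabu move, even if it is uphill."""
    cur = best = start
    tabu = []
    for _ in range(iters):
        nbrs = [(cur[0] + dx * step, cur[1] + dy * step, cur[2] + dz * step)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
                if (dx, dy, dz) != (0, 0, 0)]
        admissible = [p for p in nbrs if p not in tabu] or nbrs
        cur = min(admissible, key=cost)
        tabu.append(cur)
        tabu = tabu[-tabu_len:]        # forget old moves
        if cost(cur) < cost(best):
            best = cur
    return best

best = tabu_search((0.0, 0.0, 0.0))
```

Note that with all anchors coplanar on the ceiling there is a mirror solution above them; the search started below the ceiling settles on the physical one.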

  7. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing

    PubMed Central

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-01-01

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Unlike other pipeline installations, HAB requires a more precise and automated guidance system for practical projects. This paper proposes an economical, enhanced automated optical guidance system based on optimization research of a light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system was applied in a hot water pipeline installation project, with accuracy controlled to within 2 mm over a 48-m distance, providing accurate line and grade control and verifying the feasibility and reliability of the guidance system. PMID:29462855

  8. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    PubMed

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Unlike other pipeline installations, HAB requires a more precise and automated guidance system for practical projects. This paper proposes an economical, enhanced automated optical guidance system based on optimization research of a light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system was applied in a hot water pipeline installation project, with accuracy controlled to within 2 mm over a 48-m distance, providing accurate line and grade control and verifying the feasibility and reliability of the guidance system.

  9. Image Reconstruction in Radio Astronomy with Non-Coplanar Synthesis Arrays

    NASA Astrophysics Data System (ADS)

    Goodrick, L.

    2015-03-01

    Traditional radio astronomy imaging techniques assume that the interferometric array is coplanar with a small field of view, and that the two-dimensional Fourier relationship between brightness and visibility remains valid, allowing the Fast Fourier Transform to be used. In practice, to acquire more accurate data, non-coplanar baseline effects need to be incorporated, as small height variations in the array plane introduce the w spatial frequency component. This component adds an additional phase shift to the incoming signals. There are two approaches to account for non-coplanar baseline effects: either the full three-dimensional brightness and visibility model can be used to reconstruct an image, or the non-coplanar effects can be removed, reducing the three-dimensional relationship to the two-dimensional one. This thesis describes and implements the w-projection and w-stacking algorithms. The aim of these algorithms is to account for the phase error introduced by non-coplanar synthesis array configurations, making the recovered visibilities more faithful to the actual brightness distribution; this is done by reducing the 3D visibilities to a 2D visibility model. The algorithms also have the added benefit of wide-field imaging, although w-stacking supports a wider field of view at the cost of more FFT bin support. For w-projection, the w-term is accounted for in the visibility domain by convolving it out of the problem with a convolution kernel, allowing the use of the two-dimensional Fast Fourier Transform. Similarly, the w-stacking algorithm applies a phase correction to image layers in the image domain to produce an intensity model that accounts for the non-coplanar baseline effects. This project considers the KAT7 array for simulation and analysis of the limitations and advantages of both algorithms. Additionally, a variant of the Högbom CLEAN algorithm was used, which employs contour trimming for extended-source emission flagging. The CLEAN algorithm is an iterative two-dimensional deconvolution method that can further improve image fidelity by removing the effects of the point spread function, which can obscure source data.
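    The basic Högbom CLEAN loop is short enough to sketch: find the residual peak, subtract a scaled, shifted copy of the point spread function there, and record the component. A plain-list toy on a synthetic point source, without the contour-trimming variant used in the thesis:

```python
def hogbom_clean(dirty, psf, gain=0.1, niter=100, threshold=1e-3):
    """Iteratively subtract the shifted, scaled PSF at the residual peak;
    returns (components, residual)."""
    res = [row[:] for row in dirty]            # working copy of the dirty image
    comps = []
    h, w = len(res), len(res[0])
    cy, cx = len(psf) // 2, len(psf[0]) // 2   # PSF centre pixel
    for _ in range(niter):
        # locate the residual peak (by absolute value)
        py, px = max(((y, x) for y in range(h) for x in range(w)),
                     key=lambda t: abs(res[t[0]][t[1]]))
        peak = res[py][px]
        if abs(peak) < threshold:
            break
        comps.append((py, px, gain * peak))
        # subtract the scaled PSF centred on the peak
        for y in range(len(psf)):
            for x in range(len(psf[0])):
                ry, rx = py + y - cy, px + x - cx
                if 0 <= ry < h and 0 <= rx < w:
                    res[ry][rx] -= gain * peak * psf[y][x]
    return comps, res

# A single point source of flux 2 convolved with a 3x3 PSF, centred at (2, 2).
psf = [[0.1, 0.2, 0.1], [0.2, 1.0, 0.2], [0.1, 0.2, 0.1]]
dirty = [[0.0] * 5 for _ in range(5)]
for y in range(3):
    for x in range(3):
        dirty[1 + y][1 + x] = 2.0 * psf[y][x]
comps, res = hogbom_clean(dirty, psf, gain=1.0)
```

With a unit loop gain and an exactly matching PSF, one iteration recovers the source and the residual vanishes; real reductions use a small gain (e.g. 0.1) for stability.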

  10. Earth Observations taken by the Expedition 10 crew

    NASA Image and Video Library

    2005-01-17

    ISS010-E-13680 (17 January 2005) --- The border of Galveston and Brazoria Counties in Texas is visible in this electronic still camera's image, as photographed by the Expedition 10 crew onboard the International Space Station. Polly Ranch, near Friendswood, is visible west of Interstate Highway 45 (right side). FM528 goes horizontally through the middle, and FM518 runs vertically through frame center, with the two roads intersecting near Friendswood.

  11. Increasing of visibility on the pedestrian crossing by the additional lighting systems

    NASA Astrophysics Data System (ADS)

    Baleja, Richard; Bos, Petr; Novak, Tomas; Sokansky, Karel; Hanusek, Tomas

    2017-09-01

    Pedestrian crossings are critical places for road accidents between pedestrians and motor vehicles. For this reason, it is very important to pay increased attention when pedestrian crossings are designed, and it is necessary to take into account all factors that may contribute to higher safety. Additional lighting systems for pedestrian crossings are one of them, and such systems must fulfil the requirements for higher visibility from the point of view of car drivers approaching from both directions. This paper describes the criteria for a suitable additional lighting system on pedestrian crossings: vertical illuminance on the crossing from the driver's view, horizontal illuminance on the crossing, horizontal illuminance both in front of and behind the crossing on the road, and their acceptable ratios. The article also describes the choice of light colour (correlated colour temperature) and its influence on visibility, and presents case designs of additional lighting systems for pedestrian crossings, together with measurements of realized systems by luxmeters and luminance cameras and their evaluation.

  12. Multiplex visibility graphs to investigate recurrent neural network dynamics

    NASA Astrophysics Data System (ADS)

    Bianchi, Filippo Maria; Livi, Lorenzo; Alippi, Cesare; Jenssen, Robert

    2017-03-01

    A recurrent neural network (RNN) is a universal approximator of dynamical systems, whose performance often depends on sensitive hyperparameters. Tuning them properly may be difficult and is typically based on a trial-and-error approach. In this work, we adopt a graph-based framework to interpret and characterize the internal dynamics of a class of RNNs called echo state networks (ESNs). We design principled unsupervised methods to derive hyperparameter configurations yielding maximal ESN performance, expressed in terms of prediction error and memory capacity. In particular, we propose to model the time series generated by each neuron's activations with a horizontal visibility graph, whose topological properties have been shown to be related to the underlying system dynamics. Subsequently, the horizontal visibility graphs associated with all neurons become layers of a larger structure called a multiplex. We show that topological properties of such a multiplex reflect important features of ESN dynamics that can be used to guide the tuning of its hyperparameters. Results obtained on several benchmarks and a real-world dataset of telephone call data records show the effectiveness of the proposed methods.

  13. Multiplex visibility graphs to investigate recurrent neural network dynamics

    PubMed Central

    Bianchi, Filippo Maria; Livi, Lorenzo; Alippi, Cesare; Jenssen, Robert

    2017-01-01

    A recurrent neural network (RNN) is a universal approximator of dynamical systems, whose performance often depends on sensitive hyperparameters. Tuning them properly may be difficult and is typically based on a trial-and-error approach. In this work, we adopt a graph-based framework to interpret and characterize the internal dynamics of a class of RNNs called echo state networks (ESNs). We design principled unsupervised methods to derive hyperparameter configurations yielding maximal ESN performance, expressed in terms of prediction error and memory capacity. In particular, we propose to model the time series generated by each neuron's activations with a horizontal visibility graph, whose topological properties have been shown to be related to the underlying system dynamics. Subsequently, the horizontal visibility graphs associated with all neurons become layers of a larger structure called a multiplex. We show that topological properties of such a multiplex reflect important features of ESN dynamics that can be used to guide the tuning of its hyperparameters. Results obtained on several benchmarks and a real-world dataset of telephone call data records show the effectiveness of the proposed methods. PMID:28281563

  14. A synthetic visual plane algorithm for visibility computation in consideration of accuracy and efficiency

    NASA Astrophysics Data System (ADS)

    Yu, Jieqing; Wu, Lixin; Hu, Qingsong; Yan, Zhigang; Zhang, Shaoliang

    2017-12-01

    Visibility computation is of great interest to location optimization, environmental planning, ecology, and tourism, and many algorithms have been developed for it. In this paper, we propose a novel method of visibility computation, called the synthetic visual plane (SVP), to achieve better performance with respect to efficiency, accuracy, or both. The method uses a global horizon, a synthesis of the line-of-sight information of all nearer points, to determine the visibility of a point, which makes it an accurate visibility method. We discretize the horizon to gain efficiency; after discretization, the accuracy and efficiency of SVP depend on the scale of discretization (i.e., the zone width). The method is more accurate at smaller zone widths, but this requires a longer running time, so users must strike a balance between accuracy and efficiency at their discretion. According to our experiments, SVP is less accurate but more efficient than R2 if the zone width is set to one grid cell; however, it becomes more accurate than R2 when the zone width is set to 1/24 grid cell, while continuing to run as fast as or faster than R2. Although SVP is less efficient than the reference-plane and depth-map algorithms, it is superior to both in accuracy.
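    The global-horizon idea is easy to show on a single terrain profile: keep a running maximum of elevation angles seen from the viewer, and a farther point is visible only if it rises above that horizon. A 1-D sketch (SVP itself generalizes this to 2-D grids with discretized horizon zones):

```python
def visible_profile(elev, dx=1.0):
    """Visibility of each point along a terrain profile from index 0:
    a point is visible iff the tangent of its elevation angle exceeds
    the horizon formed by all nearer points."""
    vis = [True]                 # the viewer sees its own cell
    horizon = float("-inf")      # running max of elevation-angle tangents
    for i in range(1, len(elev)):
        tan_angle = (elev[i] - elev[0]) / (i * dx)
        vis.append(tan_angle > horizon)
        horizon = max(horizon, tan_angle)
    return vis

# The ridge of height 2 at distance 1 hides everything behind it that
# does not rise above the line of sight over its crest.
print(visible_profile([0, 2, 1, 3]))  # → [True, True, False, False]
```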

  15. Vertical coherence in mantle heterogeneity from global seismic data

    NASA Astrophysics Data System (ADS)

    Boschi, L.; Becker, T. W.

    2011-10-01

    The vertical coherence of mantle structure is of importance for a range of dynamic issues including convective mass transport and the geochemical evolution of Earth. Here, we use seismic data to infer the most likely depth ranges of strong, global changes in the horizontal pattern of mantle heterogeneity. We apply our algorithm to a comprehensive set of measurements, including various shear- and compressional-wave delay times and Love- and Rayleigh-wave fundamental mode and overtone dispersion, so that tomography resolution is as high as possible at all mantle depths. We find that vertical coherence is minimum at ∼100 km and ∼800 km depths, corresponding to the base of the lithosphere and the transition between upper and lower mantle, respectively. The D″ layer is visible, but not as prominent as the shallower features. The rest of the lower mantle is, essentially, vertically coherent. These findings are consistent with slab stagnation at depths around, and perhaps below, the 660-km phase transition, and inconsistent with global, chemically distinct, mid-mantle layering.

  16. Research on multi-source image fusion technology in haze environment

    NASA Astrophysics Data System (ADS)

    Ma, GuoDong; Piao, Yan; Li, Bing

    2017-11-01

    In a haze environment, the visible image collected by a single sensor expresses the shape, color, and texture details of the target well, but because of the haze its sharpness is low and parts of the target are lost. An infrared image collected by a single sensor captures thermal radiation and penetrates haze well, so it clearly shows the target subject, but it loses detail information. Therefore, a multi-source image fusion method is proposed to exploit their respective advantages. First, the improved Dark Channel Prior algorithm is used to preprocess the hazy visible image. Second, the improved SURF algorithm is used to register the infrared image and the haze-free visible image. Finally, a weighted fusion algorithm based on information complementarity is used to fuse the images. Experiments show that the proposed method improves the clarity of the visible target and highlights occluded infrared targets for target recognition.

  17. Holographic Lens for Pilot’s Head-Up Display

    DTIC Science & Technology

    1976-02-01

    The dynamically-stabilized recording apparatus for the full-scale transmission hologram lens was designed and assembled in Phase 2... for which the fringe visibility measured is 0.707... Coherence length for TEM00 is 3.5 cm... Measured single frequency... horizontal focal surfaces of the T90-N8-21.9 hologram lens... Chief ray efficiency measured as a function of vertical and horizontal field

  18. On the degree distribution of horizontal visibility graphs associated with Markov processes and dynamical systems: diagrammatic and variational approaches

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas

    2014-09-01

    Dynamical processes can be transformed into graphs through a family of mappings called visibility algorithms, enabling the possibility of (i) making empirical time series analysis and signal processing and (ii) characterizing classes of dynamical systems and stochastic processes using the tools of graph theory. Recent works show that the degree distribution of these graphs encapsulates much information on the signals' variability, and therefore constitutes a fundamental feature for statistical learning purposes. However, exact solutions for the degree distributions are only known in a few cases, such as for uncorrelated random processes. Here we analytically explore these distributions in a list of situations. We present a diagrammatic formalism which computes for all degrees their corresponding probability as a series expansion in a coupling constant which is the number of hidden variables. We offer a constructive solution for general Markovian stochastic processes and deterministic maps. As case tests we focus on Ornstein-Uhlenbeck processes, fully chaotic and quasiperiodic maps. Whereas only for certain degree probabilities can all diagrams be summed exactly, in the general case we show that the perturbation theory converges. In a second part, we make use of a variational technique to predict the complete degree distribution for special classes of Markovian dynamics with fast-decaying correlations. In every case we compare the theory with numerical experiments.
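    The horizontal visibility mapping behind these degree distributions is compact enough to state in code: two samples are linked when every sample strictly between them is smaller than both. A minimal sketch (a plain O(n²) scan, not an optimized implementation):

```python
def horizontal_visibility_graph(series):
    """Return the HVG edge set as index pairs (i, j), i < j.
    x[i] and x[j] are linked iff x[k] < min(x[i], x[j]) for all i < k < j."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))            # consecutive samples always see each other
        blocker = series[i + 1]          # tallest sample strictly between i and j
        for j in range(i + 2, n):
            if blocker < min(series[i], series[j]):
                edges.add((i, j))
            blocker = max(blocker, series[j])
    return edges

def degree_sequence(series):
    """Node degrees of the HVG, the raw material of the degree distribution."""
    deg = [0] * len(series)
    for i, j in horizontal_visibility_graph(series):
        deg[i] += 1
        deg[j] += 1
    return deg
```

For an i.i.d. series, the resulting degree distribution is known exactly, P(k) = (1/3)(2/3)^(k-2) for k ≥ 2, which is the uncorrelated baseline the abstract refers to.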

  19. An Automatic Image Processing System for Glaucoma Screening

    PubMed Central

    Alodhayb, Sami; Lakshminarayanan, Vasudevan

    2017-01-01

    Horizontal and vertical cup-to-disc ratios are the most crucial parameters used clinically to detect glaucoma or monitor its progress, and they are manually evaluated from retinal fundus images of the optic nerve head. Given the scarcity of glaucoma experts and the growing glaucoma population, automatically calculated horizontal and vertical cup-to-disc ratios (HCDR and VCDR, resp.) can be useful for glaucoma screening. We report on two algorithms to calculate the HCDR and VCDR. In the algorithms, level set and inpainting techniques were developed for segmenting the disc, while thresholding using a Type-II fuzzy approach was developed for segmenting the cup. The results from the algorithms were verified against the manual markings of images from a dataset of glaucomatous images (retinal fundus images for glaucoma analysis (RIGA dataset)) by six ophthalmologists. The algorithms' accuracy for HCDR and VCDR combined was 74.2%. Only the accuracy of manual markings by one ophthalmologist was higher than the algorithms' accuracy. The algorithms' best agreement was with the markings by ophthalmologist number 1, in 230 (41.8%) of the total tested images. PMID:28947898
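    Once the disc and cup have been segmented, the HCDR and VCDR reduce to ratios of bounding extents of the two regions. A minimal sketch assuming clean binary masks as input; the function names and layout are illustrative, not from the paper:

```python
import numpy as np

def cup_to_disc_ratios(disc_mask, cup_mask):
    """HCDR and VCDR from binary segmentation masks (rows = vertical,
    columns = horizontal): ratio of cup extent to disc extent per axis."""
    def extent(mask, axis):
        idx = np.where(mask.any(axis=axis))[0]   # rows/columns touching the region
        return idx.max() - idx.min() + 1
    hcdr = extent(cup_mask, axis=0) / extent(disc_mask, axis=0)  # widths
    vcdr = extent(cup_mask, axis=1) / extent(disc_mask, axis=1)  # heights
    return hcdr, vcdr
```

In practice the hard part is the segmentation itself (the level set, inpainting, and Type-II fuzzy thresholding steps the abstract describes); the ratio computation is the trivial final step.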

  20. Trial of a slant visual range measuring device

    NASA Technical Reports Server (NTRS)

    Streicher, J.; Muenkel, C.; Borchardt, H.

    1992-01-01

    Each year, fog at airports renders some landing operations difficult or impossible. The visibility that the pilot of a landing aircraft can expect is then the most important piece of information. Visibility may steadily decrease or increase with altitude, but the existing sensors at an airport cannot distinguish these cases. If visibility decreases with altitude, one has the worst case: ground fog. The standard visibility sensor, the transmissometer, determines only the horizontal visual range, which underestimates the real visibility a pilot has on the landing approach. Described here is a new technique to measure the slant visual range, making use of a slant-scanning device, an eye-safe laser radar. A comparison with commercial visibility sensors shows that it is possible to measure visibilities with the slant-looking laser radar in the range from 50 meters up to 2000 meters, and even to distinguish inhomogeneities such as ground fog.
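    The link between a transmissometer reading and a reported visual range is the Beer-Lambert law plus a contrast threshold. A sketch using the WMO 5% contrast threshold for meteorological optical range; the abstract does not state which threshold its instruments use, so the constant here is an assumption:

```python
import math

def extinction_from_transmittance(T, baseline_m):
    """Beer-Lambert: T = exp(-sigma * L) over the transmissometer baseline L,
    so the extinction coefficient is sigma = -ln(T) / L."""
    return -math.log(T) / baseline_m

def meteorological_optical_range(sigma):
    """WMO meteorological optical range with a 5% contrast threshold:
    V = ln(20) / sigma (about 3.0 / sigma)."""
    return math.log(20.0) / sigma
```

A slant-looking lidar effectively measures sigma along the approach path instead of horizontally at the surface, which is why it can separate ground fog from a homogeneous haze layer.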

  1. Detection of unmanned aerial vehicles using a visible camera system.

    PubMed

    Hu, Shuowen; Goldman, Geoffrey H; Borel-Donohue, Christoph C

    2017-01-20

    Unmanned aerial vehicles (UAVs) flown by adversaries are an emerging asymmetric threat to homeland security and the military. To help address this threat, we developed and tested a computationally efficient UAV detection algorithm consisting of horizon finding, motion feature extraction, blob analysis, and coherence analysis. We compare the performance of this algorithm against two variants, one using the difference image intensity as the motion features and another using higher-order moments. The proposed algorithm and its variants are tested using field test data of a group 3 UAV acquired with a panoramic video camera in the visible spectrum. The performance of the algorithms was evaluated using receiver operating characteristic curves. The results show that the proposed approach had the best performance compared to the two algorithmic variants.

  2. Combined VIS-IR spectrometer with vertical probe beam

    NASA Astrophysics Data System (ADS)

    Protopopov, V.

    2017-12-01

    A prototype of a combined visible-infrared spectrometer with a vertical probe beam was designed and tested. The combined spectral range is 0.4-20 μm with a spatial resolution of 1 mm. Basic features include the ability to measure both visibly transparent and opaque substances, as well as buried structures such as those in the semiconductor industry; horizontal orientation of the sample, including semiconductor wafers; and a reflection mode of operation, delivering twice the sensitivity of the transmission mode.

  3. Benefit of a 360-degree horizontal turn following premedication with simethicone on image quality during gastroendoscopy: a randomized controlled trial

    PubMed Central

    Wang, Chunhua; Liu, Haiyan; Wang, Xiuming; Shen, Xiaochun; Yang, Yingying; Sun, Wenjing; Yan, Qingjun; Cao, Yan; Wang, Xueqin; Lan, Chunhui; Chen, Dongfeng

    2015-01-01

    Objectives: To investigate whether a 360-degree horizontal turn after oral premedication with simethicone improves mucosal visibility during gastroendoscopic examination, and to determine the proper time at which to turn the patient. Methods: This study involved 993 patients scheduled for gastroendoscopy. After oral premedication with simethicone, patients were randomly assigned to three groups: Group A waited 20 min before gastroendoscopy; Group B waited 5/10/15/20 min (in separate subgroups) and was then turned 360 degrees just before gastroendoscopy; Group C was turned 360 degrees immediately and then waited 5/10/15/20 min (in separate subgroups) before examination. The sum of the gastric mucosal visibility scores (MVS) was calculated after the examination, together with the proportion of images with higher visibility scores for the mucosal surface; lower scores indicate better visibility. Results: In Groups B and C, subgroups with waiting times of more than 10 min had lower mean total MVS than Group A. The MVS of the four subgroups of Group B did not differ from those of Group C. Conclusion: Oral premedication with simethicone followed by an immediate change of body posture (a 360-degree turn) and a 10-min wait increases image quality during gastroendoscopy and effectively decreases the premedication time. PMID:26064342

  4. Benefit of a 360-degree horizontal turn following premedication with simethicone on image quality during gastroendoscopy: a randomized controlled trial.

    PubMed

    Wang, Chunhua; Liu, Haiyan; Wang, Xiuming; Shen, Xiaochun; Yang, Yingying; Sun, Wenjing; Yan, Qingjun; Cao, Yan; Wang, Xueqin; Lan, Chunhui; Chen, Dongfeng

    2015-01-01

    To investigate whether a 360-degree horizontal turn after oral premedication with simethicone improves mucosal visibility during gastroendoscopic examination, and to determine the proper time at which to turn the patient. This study involved 993 patients scheduled for gastroendoscopy. After oral premedication with simethicone, patients were randomly assigned to three groups: Group A waited 20 min before gastroendoscopy; Group B waited 5/10/15/20 min (in separate subgroups) and was then turned 360 degrees just before gastroendoscopy; Group C was turned 360 degrees immediately and then waited 5/10/15/20 min (in separate subgroups) before examination. The sum of the gastric mucosal visibility scores (MVS) was calculated after the examination, together with the proportion of images with higher visibility scores for the mucosal surface; lower scores indicate better visibility. In Groups B and C, subgroups with waiting times of more than 10 min had lower mean total MVS than Group A. The MVS of the four subgroups of Group B did not differ from those of Group C. Oral premedication with simethicone followed by an immediate change of body posture (a 360-degree turn) and a 10-min wait increases image quality during gastroendoscopy and effectively decreases the premedication time.

  5. Flight Path Synthesis and HUD Scaling for V/STOL Terminal Area Operations

    DOT National Transportation Integrated Search

    1995-04-01

    A two-circle horizontal flightpath synthesis algorithm for Vertical/Short Takeoff and Landing (V/STOL) terminal area operations is presented. This algorithm provides a flightpath that is tangential to the aircraft's velocity vector at the inst...

  6. Peak-to-average power ratio reduction in orthogonal frequency division multiplexing-based visible light communication systems using a modified partial transmit sequence technique

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Deng, Honggui; Ren, Shuang; Tang, Chengying; Qian, Xuewen

    2018-01-01

    We propose an efficient partial transmit sequence (PTS) technique based on a genetic algorithm and a peak-value optimization algorithm (GAPOA) to reduce the high peak-to-average power ratio (PAPR) in visible light communication systems based on orthogonal frequency division multiplexing (VLC-OFDM). After analyzing the pros and cons of the hill-climbing algorithm, we propose the POA, with its strong local search ability, to further process signals whose PAPR still exceeds the threshold after processing by the genetic algorithm (GA). To verify the effectiveness of the proposed technique, we evaluate the PAPR and bit error rate (BER) performance and compare them with the PTS technique based on GA (GA-PTS), the PTS technique based on genetic and hill-climbing algorithms (GH-PTS), and the PTS based on the shuffled frog leaping and hill-climbing algorithms (SFLAHC-PTS). The results show that our technique has not only better PAPR performance but also lower computational complexity and BER than the GA-PTS, GH-PTS, and SFLAHC-PTS techniques.
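    The PAPR metric and the PTS idea can be sketched as follows: partition the frequency-domain symbol into sub-blocks, rotate each by a candidate phase, and keep the phase vector whose time-domain signal has the lowest PAPR. This is a small exhaustive-search baseline that the GA/POA search in the paper approximates for larger block counts; the block count and phase alphabet here are illustrative defaults, not the paper's parameters.

```python
import numpy as np
from itertools import product

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10.0 * np.log10(p.max() / p.mean())

def pts_exhaustive(X, n_blocks=4, phases=(1, -1, 1j, -1j)):
    """Minimise PAPR over per-block phase rotations of the frequency-domain
    symbol X, searching all phase vectors. Returns (best PAPR in dB, phases)."""
    blocks = np.array_split(np.arange(len(X)), n_blocks)
    best = None
    for ph in product(phases, repeat=n_blocks):
        Xr = X.astype(complex)
        for b, w in zip(blocks, ph):
            Xr[b] = Xr[b] * w            # rotate this sub-block
        val = papr_db(np.fft.ifft(Xr))   # PAPR of the time-domain signal
        if best is None or val < best[0]:
            best = (val, ph)
    return best
```

The search space grows as |phases|^n_blocks, which is exactly why the paper replaces exhaustive search with a GA followed by a local peak-value optimization.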

  7. Plume Detection and Plume Top Height Estimation using SLSTR

    NASA Astrophysics Data System (ADS)

    Virtanen, Timo H.; Kolmonen, Pekka; Sogacheva, Larisa; Rodriguez, Edith; Saponaro, Giulia; de Leeuw, Gerrit

    2017-04-01

    We present preliminary results on ash and desert dust plume detection and plume top height estimation based on satellite data from the Sea and Land Surface Temperature Radiometer (SLSTR) aboard Sentinel-3, launched in 2016. The methods are based on the previously developed AATSR Correlation Method (ACM) height estimation algorithm, which used data from the preceding similar instrument, the Advanced Along Track Scanning Radiometer (AATSR). The height estimate exploits the stereo-viewing capability of SLSTR, which makes it possible to determine the parallax between the satellite's 55° backward and nadir views, and thus the corresponding height. Ash plume detection is based on the brightness temperature difference between thermal infrared (TIR) channels centered at 11 and 12 μm, which shows characteristic signals for both desert dust and ash plumes. The SLSTR instrument provides a unique combination of dual-view capability and a wavelength range from the visible to the thermal infrared, rendering it an ideal instrument for this work. Accurate information on volcanic ash position is important for air traffic safety. The ACM algorithm can provide valuable data on both horizontal and vertical ash dispersion, which may be useful for comparison with other volcanic ash and desert dust retrieval methods and with dispersion models. The current work is being carried out as part of the H2020 project EUNADICS-AV ("European Natural Disaster Coordination and Information System for Aviation"), which started in October 2016.
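    The geometry of the stereo height estimate can be sketched with the simplest flat-ground model: a feature at height h appears displaced between two views by roughly h times the difference of the view-angle tangents. This omits the image-matching step that actually measures the parallax, and the angles are treated as constants; the operational ACM algorithm is more involved.

```python
import math

def plume_height_from_parallax(parallax_m, oblique_view_deg=55.0,
                               nadir_view_deg=0.0):
    """Height from the along-track parallax (in metres on the ground) between
    an oblique and a near-nadir view: d = h * (tan a1 - tan a2)."""
    dtan = (math.tan(math.radians(oblique_view_deg))
            - math.tan(math.radians(nadir_view_deg)))
    return parallax_m / dtan
```

With a 55° backward view and a nadir view, one kilometre of plume height produces roughly 1.4 km of along-track displacement, which is what makes the parallax measurable at SLSTR pixel scales.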

  8. Two Procedures to Flag Radio Frequency Interference in the UV Plane

    NASA Astrophysics Data System (ADS)

    Sekhar, Srikrishna; Athreya, Ramana

    2018-07-01

    We present two algorithms to identify and flag radio frequency interference (RFI) in radio interferometric imaging data. The first algorithm utilizes the redundancy of visibilities inside a UV cell in the visibility plane to identify corrupted data, while varying the detection threshold in accordance with the observed reduction in noise with radial UV distance. In the second algorithm, we propose a scheme to detect faint RFI in the visibility time-channel (TC) plane of baselines. The efficacy of identifying RFI in the residual visibilities is reduced by the presence of ripples due to inaccurate subtraction of the strongest sources. This can be due to several reasons including primary beam asymmetries and other direction-dependent calibration errors. We eliminated these ripples by clipping the corresponding peaks in the associated Fourier plane. RFI was detected in the ripple-free TC plane but was flagged in the original visibilities. Application of these two algorithms to five different 150 MHz data sets from the GMRT resulted in a reduction in image noise of 20%–50% throughout the field along with a reduction in systematics and a corresponding increase in the number of detected sources. However, in comparing the mean flux densities before and after flagging RFI, we find a differential change with the fainter sources (25σ < S < 100 mJy) showing a change of ‑6% to +1% relative to the stronger sources (S > 100 mJy). We are unable to explain this effect, but it could be related to the CLEAN bias known for interferometers.

  9. LAWS simulation: Sampling strategies and wind computation algorithms

    NASA Technical Reports Server (NTRS)

    Emmitt, G. D. A.; Wood, S. A.; Houston, S. H.

    1989-01-01

    In general, work has continued on developing and evaluating algorithms designed to manage the Laser Atmospheric Wind Sounder (LAWS) lidar pulses and to compute the horizontal wind vectors from the line-of-sight (LOS) measurements. These efforts fall into three categories: Improvements to the shot management and multi-pair algorithms (SMA/MPA); observing system simulation experiments; and ground-based simulations of LAWS.

  10. Research on horizontal displacement monitoring of deep soil based on a distributed optical fibre sensor

    NASA Astrophysics Data System (ADS)

    Huang, Xiaodi; Wang, Yuan; Sun, Yangyang; Zhang, Qinghua; Zhang, Zhenglin; You, Zewei; Ma, Yuan

    2018-01-01

    The traditional method for measuring the horizontal displacement of deep soil uses an inclinometer for piecewise measurement followed by manual readings, which takes a long time and is error-prone; in addition, the interference immunity and long-term stability of inclinometers are poor. In this paper, a technique for monitoring horizontal displacement based on distributed optical fibres is introduced. The relationship between strain and deflection was described by a theoretical model, and the strain distribution of the inclinometer tube was measured by cables laid on its surface, so that the deflection of the tube could be calculated by a difference algorithm and regarded as the horizontal displacement of the deep soil. The monitoring technology developed in this paper not only overcomes the shortcomings of traditional inclinometer technology by realizing automatic real-time monitoring but also allows for distributed measurement. An experiment emulating expected engineering conditions was conducted, and the deflection calculated from the strain was compared with an inclinometer. The results demonstrated that the relative error between the distributed optical fibre sensors and the inclinometer was less than 8.0%, verifying both the feasibility of using distributed optical fibre to monitor the horizontal displacement of soil and the rationality of the theoretical model and difference algorithm. Applying distributed optical fibre to monitor the horizontal displacement of deep soil in foundation pit and slope engineering can more accurately evaluate safety during construction.
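    The strain-to-deflection step can be sketched as a double integration, assuming the standard beam relation that curvature equals the measured surface strain divided by the fibre's offset from the tube's neutral axis, with zero slope and zero deflection at the anchored base. The names and boundary conditions are illustrative; the paper's own difference algorithm may differ in detail.

```python
import numpy as np

def deflection_from_strain(strain, z, fibre_offset):
    """Deflection profile of an inclinometer tube from a distributed strain
    profile along depth z: curvature kappa = strain / fibre_offset, then a
    cumulative trapezoidal double integration (fixed at the base z[0])."""
    kappa = np.asarray(strain, dtype=float) / fibre_offset

    def cumtrapz0(y, x):
        out = np.zeros_like(y)
        out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(x))
        return out

    slope = cumtrapz0(kappa, np.asarray(z, dtype=float))   # first integral
    return cumtrapz0(slope, np.asarray(z, dtype=float))    # second integral
```

For a constant curvature kappa the deflection is the analytic kappa·z²/2, which gives a direct sanity check on the integration.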

  11. Measuring visibility using smartphones

    NASA Astrophysics Data System (ADS)

    Friesen, Jan; Bialon, Raphael; Claßen, Christoph; Graffi, Kalman

    2017-04-01

    Spatial information on fog density is an important parameter for ecohydrological studies in cloud forests. The Dhofar cloud forest in Southern Oman exhibits a close interaction between fog, trees, and rainfall. During the three-month monsoon season the trees capture substantial amounts of horizontal precipitation from fog, which increases net precipitation below the tree canopy. As fog density measurements are scarce, a smartphone app was designed to measure visibility. Because different smartphone models use different hardware components, it is important to assess the visibility measurement across a suite of devices. In this study we tested five smartphones/tablets (Google/LG Nexus 5X, Huawei P8 lite, Huawei Y3, HTC Nexus 9, and Samsung Galaxy S4 mini) against a digital camera (Sony DSLR-A900) and visual visibility observations. Visibility was assessed from photos using image entropy, from the number of visible targets, and from WiFi signal strength using RSSI. Results show clear relationships between object distance and fog density, yet a considerable spread across the different smartphone/tablet units is evident.
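    The image-entropy measure can be sketched as the Shannon entropy of the grey-level histogram; fog flattens contrast, so entropy drops as visibility falls. The bin count and bit units are conventional choices, not specified in the abstract.

```python
import numpy as np

def image_entropy(gray, bins=256):
    """Shannon entropy (in bits) of a grey-level histogram; lower values
    indicate the flat, low-contrast images typical of dense fog."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                         # drop empty bins (0 * log 0 = 0)
    return float(-(p * np.log2(p)).sum())
```

A completely uniform frame scores 0 bits, while a scene with many distinct grey levels scores close to log2(bins), which is why the metric correlates with how much of the scene is still visible through the fog.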

  12. Fusion of Visible and Thermal Descriptors Using Genetic Algorithms for Face Recognition Systems.

    PubMed

    Hermosilla, Gabriel; Gallardo, Francisco; Farias, Gonzalo; San Martin, Cesar

    2015-07-23

    The aim of this article is to present a new face recognition system based on the fusion of visible and thermal features obtained from the most current local matching descriptors by maximizing face recognition rates through the use of genetic algorithms. The article considers a comparison of the performance of the proposed fusion methodology against five current face recognition methods and classic fusion techniques used commonly in the literature. These were selected by considering their performance in face recognition. The five local matching methods and the proposed fusion methodology are evaluated using the standard visible/thermal database, the Equinox database, along with a new database, the PUCV-VTF, designed for visible-thermal studies in face recognition and described for the first time in this work. The latter is created considering visible and thermal image sensors with different real-world conditions, such as variations in illumination, facial expression, pose, occlusion, etc. The main conclusions of this article are that two variants of the proposed fusion methodology surpass current face recognition methods and the classic fusion techniques reported in the literature, attaining recognition rates of over 97% and 99% for the Equinox and PUCV-VTF databases, respectively. The fusion methodology is very robust to illumination and expression changes, as it combines thermal and visible information efficiently by using genetic algorithms, thus allowing it to choose optimal face areas where one spectrum is more representative than the other.

  13. Fusion of Visible and Thermal Descriptors Using Genetic Algorithms for Face Recognition Systems

    PubMed Central

    Hermosilla, Gabriel; Gallardo, Francisco; Farias, Gonzalo; San Martin, Cesar

    2015-01-01

    The aim of this article is to present a new face recognition system based on the fusion of visible and thermal features obtained from the most current local matching descriptors by maximizing face recognition rates through the use of genetic algorithms. The article considers a comparison of the performance of the proposed fusion methodology against five current face recognition methods and classic fusion techniques used commonly in the literature. These were selected by considering their performance in face recognition. The five local matching methods and the proposed fusion methodology are evaluated using the standard visible/thermal database, the Equinox database, along with a new database, the PUCV-VTF, designed for visible-thermal studies in face recognition and described for the first time in this work. The latter is created considering visible and thermal image sensors with different real-world conditions, such as variations in illumination, facial expression, pose, occlusion, etc. The main conclusions of this article are that two variants of the proposed fusion methodology surpass current face recognition methods and the classic fusion techniques reported in the literature, attaining recognition rates of over 97% and 99% for the Equinox and PUCV-VTF databases, respectively. The fusion methodology is very robust to illumination and expression changes, as it combines thermal and visible information efficiently by using genetic algorithms, thus allowing it to choose optimal face areas where one spectrum is more representative than the other. PMID:26213932

  14. Infrared and visible images registration with adaptable local-global feature integration for rail inspection

    NASA Astrophysics Data System (ADS)

    Tang, Chaoqing; Tian, Gui Yun; Chen, Xiaotian; Wu, Jianbo; Li, Kongjing; Meng, Hongying

    2017-12-01

    Active thermography provides infrared images that contain sub-surface defect information, while visible images reveal only surface information. Mapping infrared information onto visible images offers more comprehensive visualization for decision-making in rail inspection. However, the common information available for registration is limited by the modality difference at both the local and global levels. For example, a rail track with low temperature contrast reveals rich detail in visible images but appears blurry in its infrared counterpart. This paper proposes a registration algorithm called Edge-Guided Speeded-Up-Robust-Features (EG-SURF) to address this issue. Rather than sequentially integrating local and global information in the matching stage, which suffers from the bucket effect, this algorithm adaptively integrates local and global information into a descriptor to gather more common information before matching. This adaptability has two facets: an adaptable weighting factor between local and global information, and an adaptable main-direction accuracy. The local information is extracted using SURF, while the global information is represented by shape context from edges. Meanwhile, in the shape context generation process, edges are weighted according to local scale and decomposed into bins in a vector-decomposition manner to provide a more accurate descriptor. The proposed algorithm is qualitatively and quantitatively validated in experiments on eddy current pulsed thermography scenes. In comparison with other algorithms, better performance is achieved.

  15. ASRC RSS Data

    DOE Data Explorer

    Kiedron, Peter

    2008-01-15

    Once every minute between sunrise and sunset, the Rotating Shadowband Spectroradiometer (RSS) simultaneously measures three irradiances: total horizontal, diffuse horizontal, and direct normal, in the near-ultraviolet, visible, and near-infrared range (approximately 370-1050 nm), at 512 (RSS103) or 1024 (RSS102 and RSS105) adjacent spectral resolving elements (pixels). The resolution is pixel (wavelength) dependent and differs from instrument to instrument. The reported irradiances are cosine-response corrected, and their radiometric calibration is based on incandescent lamp calibrators traceable to the NIST irradiance scale. The units are W/m²/nm.
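    The three reported components are tied together by the usual closure relation for shadowband radiometers, GHI = DHI + DNI·cos(SZA), which is how such measurements are routinely cross-checked. A sketch; the function name and the idea of applying it as a quality check here are illustrative, not taken from the data documentation.

```python
import math

def total_horizontal(diffuse_horizontal, direct_normal, solar_zenith_deg):
    """Component closure: global horizontal irradiance equals diffuse
    horizontal plus the direct-normal beam projected onto the horizontal."""
    return (diffuse_horizontal
            + direct_normal * math.cos(math.radians(solar_zenith_deg)))
```

Comparing the measured total-horizontal channel against this sum, wavelength by wavelength, is a simple consistency test for the cosine-response correction the abstract mentions.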

  16. Single underwater image enhancement based on color cast removal and visibility restoration

    NASA Astrophysics Data System (ADS)

    Li, Chongyi; Guo, Jichang; Wang, Bo; Cong, Runmin; Zhang, Yan; Wang, Jian

    2016-05-01

    Images taken under underwater conditions usually have color cast and serious loss of contrast and visibility, making them inconvenient for observation and analysis. In order to address these problems, an underwater image-enhancement method is proposed. A simple yet effective underwater image color cast removal algorithm is first presented based on optimization theory. Then, based on the minimum information loss principle and the inherent relationship among the medium transmission maps of the three color channels in an underwater image, an effective visibility restoration algorithm is proposed to recover the visibility, contrast, and natural appearance of degraded underwater images. To evaluate the performance of the proposed method, qualitative comparison, quantitative comparison, and a color accuracy test are conducted. Experimental results demonstrate that the proposed method can effectively remove color cast, improve contrast and visibility, and recover the natural appearance of degraded underwater images. Additionally, the proposed method is comparable to and even better than several state-of-the-art methods.

  17. MATLAB-based algorithm to estimate depths of isolated thin dike-like sources using higher-order horizontal derivatives of magnetic anomalies.

    PubMed

    Ekinci, Yunus Levent

    2016-01-01

    This paper presents an easy-to-use open source computer algorithm (code) for estimating the depths of isolated single thin dike-like source bodies by using numerical second-, third-, and fourth-order horizontal derivatives computed from observed magnetic anomalies. The approach does not require a priori information and uses some filters of successive graticule spacings. The computed higher-order horizontal derivative datasets are used to solve nonlinear equations for depth determination. The solutions are independent from the magnetization and ambient field directions. The practical usability of the developed code, designed in MATLAB R2012b (MathWorks Inc.), was successfully examined using some synthetic simulations with and without noise. The algorithm was then used to estimate the depths of some ore bodies buried in different regions (USA, Sweden, and Canada). Real data tests clearly indicated that the obtained depths are in good agreement with those of previous studies and drilling information. Additionally, a state-of-the-art inversion scheme based on particle swarm optimization produced comparable results to those of the higher-order horizontal derivative analyses in both synthetic and real anomaly cases. Accordingly, the proposed code is verified to be useful in interpreting isolated single thin dike-like magnetized bodies and may be an alternative processing technique. The open source code can be easily modified and adapted to suit the benefits of other researchers.

  18. Visibility in the topology of complex networks

    NASA Astrophysics Data System (ADS)

    Tsiotas, Dimitrios; Charakopoulos, Avraam

    2018-09-01

    Taking its inspiration from the visibility algorithm proposed by Lacasa et al. (2008) to convert a time series into a complex network, this paper develops a novel expansion of that algorithm that generates a visibility graph from a complex network instead of a time series. The purpose of this approach is to apply the idea of visibility from the field of time-series analysis to complex networks in order to interpret the network topology as a landscape. Visibility in complex networks is a multivariate property producing an associated visibility graph that maps the ability of a node "to see" other nodes in the network that lie beyond the range of its neighborhood, in terms of a control attribute. Within this context, this paper examines the visibility topology produced by connectivity (degree) in comparison with the original (source) network, in order to detect what patterns or forces describe the mechanism by which a network is converted to a visibility graph. The overall analysis shows that visibility is a property that increases connectivity in networks, that it may contribute to pattern recognition (including detection of scale-free topology), and that it is worth applying to complex networks in order to reveal structure beyond the range of a node's neighborhood. Generally, this paper promotes interdisciplinary research in complex networks, providing new insights for network science.

  19. Evaluating the visibility of presentation slides

    NASA Astrophysics Data System (ADS)

    Sugawara, Genki; Umezu, Nobuyuki

    2017-03-01

    Presentations built with slide software such as PowerPoint are widely given in offices and schools, and opportunities to present are now so common that ordinary people need to improve their presentation skills. One key factor in a successful presentation is the visibility of the slides, alongside the content itself. We propose an algorithm to numerically evaluate the visibility of presentation slides. Our method receives a presentation as a set of images and removes the background from each slide to extract characters and figures. The algorithm then evaluates visibility according to the number and size of characters, their colors, and the figure layout. The slide evaluation criteria are based on a series of experiments with 20 participants that parameterized typical values for visual elements in slides. The algorithm, implemented on an iMac, takes 0.5 s to evaluate a slide image. The evaluation score is a value between 0 and 100, and users can improve slide pages with lower scores. Our future work includes a series of experiments with various presentations and extending our method into a web-based rating service for learning presentation skills.

  20. The composition of the lunar crust: Radiative transfer modeling and analysis of lunar visible and near-infrared spectra

    NASA Astrophysics Data System (ADS)

    Cahill, Joshua T. S.

    This dissertation has two focuses: (1) the evaluation and validation of algorithms used for analysis of lunar visible and near-infrared data sets, and (2) the determination of lunar surface and sub-surface crustal composition by virtue of these algorithms. To that end, the results and interpretation reported herein further enhance knowledge of lunar ferroan anorthosite (FAN) and magnesium-suite (Mg-suite) mineralogy, chemistry, and distribution on and in our Moon's crust.

  1. The Feasibility of 3d Point Cloud Generation from Smartphones

    NASA Astrophysics Data System (ADS)

    Alsubaie, N.; El-Sheimy, N.

    2016-06-01

    This paper proposes a new technique for increasing the accuracy of the direct geo-referenced image-based 3D point cloud generated from low-cost sensors in smartphones. The smartphone's motion sensors are used to directly acquire the Exterior Orientation Parameters (EOPs) of the captured images. These EOPs, along with the Interior Orientation Parameters (IOPs) of the camera/phone, are used to reconstruct the image-based 3D point cloud. However, because smartphone motion sensors suffer from poor GPS accuracy, accumulated drift and high signal noise, inaccurate 3D mapping solutions often result. Therefore, horizontal and vertical linear features, visible in each image, are extracted and used as constraints in the bundle adjustment procedure. These constraints correct the relative position and orientation of the 3D mapping solution. Once the enhanced EOPs are estimated, the semi-global matching algorithm (SGM) is used to generate the image-based dense 3D point cloud. Statistical analysis and assessment are implemented herein, in order to demonstrate the feasibility of 3D point cloud generation from the consumer-grade sensors in smartphones.

  2. First Look at the Upper Tropospheric Ozone Mixing Ratio from OMI Estimated using the Cloud Slicing Technique

    NASA Technical Reports Server (NTRS)

    Bhartia, Pawan K.; Ziemke, Jerry; Chandra, Sushil; Joiner, Joanna; Vassilkov, Alexandra; Taylor, Steven; Yang, Kai; Ahn, Chang-Woo

    2004-01-01

    The Cloud Slicing technique has emerged as a powerful tool for the study of ozone in the upper troposphere. In this technique one examines the variation with cloud height of the above-cloud column ozone derived from backscattered ultraviolet instruments, such as TOMS, to determine the ozone mixing ratio. For this technique to work properly one needs an instrument with relatively good horizontal resolution and very good signal to noise in measuring above-cloud column ozone. In addition, one needs the (radiatively) effective cloud pressure rather than the cloud-top pressure, because the ultraviolet photons received by a satellite instrument are scattered from inside the cloud rather than from the top. For this study we use data from the OMI sensor, which was recently launched on the EOS Aura satellite. OMI is a UV-Visible backscattering instrument with a nadir pixel size of 13 x 24 km. The effective cloud pressure is derived from a new algorithm based on Rotational Raman Scattering and O2-O2 absorption in the 340-400 nm band of OMI.
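    The core of cloud slicing reduces to a linear fit: the volume mixing ratio in the layer sampled by the clouds is proportional to the slope of above-cloud column ozone against effective cloud pressure. A minimal sketch with synthetic numbers (not OMI data); the conversion constant 1.27 ppmv per (DU/hPa) follows from 1 DU = 2.69e16 molecules/cm^2 and hydrostatic balance:

```python
import numpy as np

def cloud_slice_mixing_ratio(cloud_pressure_hpa, column_ozone_du):
    """Ozone volume mixing ratio (ppbv) from a linear fit of
    above-cloud column ozone (DU) against cloud pressure (hPa)."""
    slope, _ = np.polyfit(cloud_pressure_hpa, column_ozone_du, 1)  # DU/hPa
    return 1.27e3 * slope  # ppmv/(DU/hPa) * 1000 -> ppbv

# Synthetic scene: 50 ppbv of ozone in the layer sampled by the clouds.
p = np.array([300., 350., 400., 450., 500.])          # hPa
omega = 250.0 + (50e-3 / 1.27) * (p - 300.0)          # DU
print(round(cloud_slice_mixing_ratio(p, omega), 1))   # → 50.0
```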

  3. Determination of sub-daily glacier uplift and horizontal flow velocity with time-lapse images using ImGRAFT

    NASA Astrophysics Data System (ADS)

    Egli, Pascal; Mankoff, Ken; Mettra, François; Lane, Stuart

    2017-04-01

    This study investigates the application of feature tracking algorithms to monitoring of glacier uplift. Several publications have confirmed the occurrence of an uplift of the glacier surface in the late morning hours of the mid to late ablation season. This uplift is thought to be caused by high sub-glacial water pressures at the onset of melt caused by overnight-deposited sediment that blocks subglacial channels. We use time-lapse images from a camera mounted in front of the glacier tongue of Haut Glacier d'Arolla during August 2016 in combination with a Digital Elevation Model and GPS measurements in order to investigate the phenomenon of glacier uplift using the feature tracking toolbox ImGRAFT. Camera position is corrected for all images and the images are geo-rectified using Ground Control Points visible in every image. Changing lighting conditions due to different sun angles create substantial noise and complicate the image analysis. A small glacier uplift of the order of 5 cm over a time span of 3 hours may be observed on certain days, confirming previous research.
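    The core operation behind feature-tracking toolboxes such as ImGRAFT is template matching by normalized cross-correlation between image pairs. A self-contained sketch of that operation (this is not ImGRAFT's implementation, and real glacier imagery would also need the geo-rectification step described above):

```python
import numpy as np

def track_feature(ref, search, template_slice):
    """Locate a template from `ref` inside `search` by normalized
    cross-correlation; return the (row, col) of the best match."""
    tpl = ref[template_slice]
    th, tw = tpl.shape
    tz = (tpl - tpl.mean()) / tpl.std()
    best, best_rc = -np.inf, (0, 0)
    for r in range(search.shape[0] - th + 1):
        for c in range(search.shape[1] - tw + 1):
            win = search[r:r + th, c:c + tw]
            s = win.std()
            if s == 0:
                continue                      # flat window: NCC undefined
            ncc = np.mean(tz * (win - win.mean()) / s)
            if ncc > best:
                best, best_rc = ncc, (r, c)
    return best_rc

rng = np.random.default_rng(0)
img = rng.random((40, 40))
moved = np.roll(np.roll(img, 3, axis=0), 2, axis=1)   # scene shifted (3, 2)
print(track_feature(img, moved, np.s_[10:20, 10:20]))  # → (13, 12)
```

The displacement of the matched position relative to the template origin gives the motion vector between the two images.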

  4. Perceived shifts of flashed stimuli by visible and invisible object motion.

    PubMed

    Watanabe, Katsumi; Sato, Takashi R; Shimojo, Shinsuke

    2003-01-01

    Perceived positions of flashed stimuli can be altered by motion signals in the visual field-position capture (Whitney and Cavanagh, 2000 Nature Neuroscience 3 954-959). We examined whether position capture of flashed stimuli depends on the spatial relationship between moving and flashed stimuli, and whether the phenomenal permanence of a moving object behind an occluding surface (tunnel effect; Michotte 1950 Acta Psychologica 7 293-322) can produce position capture. Observers saw two objects (circles) moving vertically in opposite directions, one in each visual hemifield. Two horizontal bars were simultaneously flashed at horizontally collinear positions with the fixation point at various timings. When the movement of the object was fully visible, the flashed bar appeared shifted in the motion direction of the circle. But this position-capture effect occurred only when the bar was presented ahead of or on the moving circle. Even when the motion trajectory was covered by an opaque surface and the bar was flashed after complete occlusion of the circle, the position-capture effect was still observed, though the positional asymmetry was less clear. These results show that movements of both visible and 'hidden' objects can modulate the perception of positions of flashed stimuli and suggest that a high-level representation of 'objects in motion' plays an important role in the position-capture effect.

  5. Three-flat test with plates in horizontal posture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vannoni, Maurizio; Molesini, Giuseppe

    2008-04-20

    Measuring flats in the horizontal posture with interferometers is analyzed in detail, taking into account the sag produced by gravity. A mathematical expression of the bending is provided for a plate supported at three unevenly spaced locations along the edge. It is shown that the azimuthal terms of the deformation can be recovered from a three-flat measuring procedure, while the pure radial terms can only be estimated. The effectiveness of the iterative algorithm for data processing is also demonstrated. Experimental comparison on a set of three flats in horizontal and upright posture is provided.

  6. Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography

    PubMed Central

    2018-01-01

    The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy. However, the performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for light-weight and portable horizontal eye gaze angle estimation suitable for a broad range of applications; for instance, hearing aids could steer the directivity of their microphones in the direction of the user’s eye gaze. PMID:29304120

  7. Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography.

    PubMed

    Hládek, Ľuboš; Porr, Bernd; Brimijoin, W Owen

    2018-01-01

    The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy. However, the performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for light-weight and portable horizontal eye gaze angle estimation suitable for a broad range of applications; for instance, hearing aids could steer the directivity of their microphones in the direction of the user's eye gaze.
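    The saccade-integration idea can be sketched in a few lines: detect saccades as samples where the EOG slew rate exceeds a velocity threshold, then accumulate their amplitudes into an absolute angle. The velocity threshold and the µV-per-degree gain below are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

def gaze_from_saccades(eog_uv, fs_hz, gain_uv_per_deg=10.0,
                       vel_thresh_uv_per_s=500.0):
    """Gaze-angle estimate (degrees) per sample, by summing the EOG
    changes that occur during detected saccades."""
    vel = np.gradient(eog_uv) * fs_hz                  # uV/s
    is_saccade = np.abs(vel) > vel_thresh_uv_per_s
    step = np.where(is_saccade, np.gradient(eog_uv), 0.0)
    return np.cumsum(step) / gain_uv_per_deg           # degrees

fs = 100.0
eog = np.concatenate([np.zeros(50),
                      np.full(50, 100.0),    # +10 deg saccade
                      np.full(50, -50.0)])   # -15 deg saccade
angle = gaze_from_saccades(eog, fs)
print(round(float(angle[-1]), 1))  # → -5.0
```

Because slow drift never crosses the velocity threshold, it does not accumulate into the angle, which is the advantage over plain integration of a drifting EOG baseline.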

  8. A Formal Framework for the Analysis of Algorithms That Recover From Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.

    2008-01-01

    We present a mathematical framework for the specification and verification of state-based conflict resolution algorithms that recover from loss of separation. In particular, we propose rigorous definitions of horizontal and vertical maneuver correctness that yield horizontal and vertical separation, respectively, in a bounded amount of time. We also provide sufficient conditions for independent correctness, i.e., separation under the assumption that only one aircraft maneuvers, and for implicitly coordinated correctness, i.e., separation under the assumption that both aircraft maneuver. An important benefit of this approach is that different aircraft can execute different algorithms and implicit coordination will still be achieved, as long as they all meet the explicit criteria of the framework. Towards this end we have sought to make the criteria as general as possible. The framework presented in this paper has been formalized and mechanically verified in the Prototype Verification System (PVS).

  9. MER-DIMES : a planetary landing application of computer vision

    NASA Technical Reports Server (NTRS)

    Cheng, Yang; Johnson, Andrew; Matthies, Larry

    2005-01-01

    During the Mars Exploration Rovers (MER) landings, the Descent Image Motion Estimation System (DIMES) was used for horizontal velocity estimation. The DIMES algorithm combines measurements from a descent camera, a radar altimeter and an inertial measurement unit. To deal with large changes in scale and orientation between descent images, the algorithm uses altitude and attitude measurements to rectify image data to a level ground plane. Feature selection and tracking are employed in the rectified data to compute the horizontal motion between images. Differences of motion estimates are then compared to inertial measurements to verify correct feature tracking. DIMES combines sensor data from multiple sources in a novel way to create a low-cost, robust and computationally efficient velocity estimation solution, and DIMES is the first use of computer vision to control a spacecraft during planetary landing. In this paper, the detailed implementation of the DIMES algorithm and the results from the two landings on Mars are presented.

  10. A Horizontal Tilt Correction Method for Ship License Numbers Recognition

    NASA Astrophysics Data System (ADS)

    Liu, Baolong; Zhang, Sanyuan; Hong, Zhenjie; Ye, Xiuzi

    2018-02-01

    An automatic ship license numbers (SLNs) recognition system plays a significant role in intelligent waterway transportation systems, since it can be used to identify ships by recognizing the characters in SLNs. Tilt occurs frequently in many SLNs because the monitors and the ships usually have large vertical or horizontal angles, which decreases the accuracy and robustness of an SLNs recognition system significantly. In this paper, we present a horizontal tilt correction method for SLNs. For an input tilted SLN image, the proposed method accomplishes the correction task through three main steps. First, an MSER-based characters' center-points computation algorithm is designed to compute the accurate center-points of the characters contained in the input SLN image. Second, an L1-L2 distance-based straight line is fitted to the computed center-points using an M-estimator algorithm; the tilt angle is estimated at this stage. Finally, based on the computed tilt angle, an affine rotation is applied to correct the input SLN to horizontal. The proposed method is tested on 200 tilted SLN images and proves effective, with a tilt correction rate of 80.5%.
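    The last two steps reduce to fitting a line through the character centers and rotating by its angle. A minimal sketch, substituting ordinary least squares for the paper's M-estimator and omitting the MSER center-point detection:

```python
import numpy as np

def tilt_angle_deg(centers_xy):
    """Tilt angle (degrees) of a least-squares line through the centers."""
    x, y = centers_xy[:, 0], centers_xy[:, 1]
    slope, _ = np.polyfit(x, y, 1)
    return np.degrees(np.arctan(slope))

def rotation_matrix(angle_deg):
    """2x2 matrix rotating points by -angle_deg, undoing the tilt."""
    a = np.radians(-angle_deg)
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

# Character centers on a line of slope 0.2 (about 11.3 degrees of tilt).
centers = np.array([[0., 0.], [10., 2.], [20., 4.], [30., 6.]])
ang = tilt_angle_deg(centers)
corrected = centers @ rotation_matrix(ang).T   # centers now lie horizontally
print(round(float(ang), 1))  # → 11.3
```

An M-estimator would down-weight outlier centers (e.g. a mis-detected character) before the fit, which plain least squares does not.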

  11. Design of a sensor network for structural health monitoring of a full-scale composite horizontal tail

    NASA Astrophysics Data System (ADS)

    Gao, Dongyue; Wang, Yishou; Wu, Zhanjun; Rahim, Gorgin; Bai, Shengbao

    2014-05-01

    The detection capability of a given structural health monitoring (SHM) system strongly depends on its sensor network placement. In order to minimize the number of sensors while maximizing the detection capability, optimal design of the PZT sensor network placement is necessary for SHM of a full-scale composite horizontal tail. In this study, the sensor network optimization was simplified as a problem of determining the sensor array placement between stiffeners to achieve the desired coverage rate. First, an analysis of the structural layout and load distribution of a composite horizontal tail was performed. The constraint conditions of the optimal design were presented. Then, the SHM algorithm of the composite horizontal tail under static load was proposed. Based on the given SHM algorithm, a sensor network was designed for the full-scale composite horizontal tail structure. Effective profiles of cross-stiffener paths (CRPs) and uncross-stiffener paths (URPs) were estimated by a Lamb wave propagation experiment in a multi-stiffener composite specimen. Based on the coverage rate and the redundancy requirements, a seven-sensor array-network was chosen as the optimal sensor network for each airfoil. Finally, a preliminary SHM experiment was performed on a typical composite aircraft structure component. The reliability of the SHM result for a composite horizontal tail structure under static load was validated. In the result, the red zone represented the delamination damage. The detection capability of the optimized sensor network was verified by SHM of a full-scale composite horizontal tail; all the diagnosis results were obtained in two minutes. The result showed that all the damage in the monitoring region was covered by the sensor network.

  12. jsc2003e15407

    NASA Image and Video Library

    2000-01-09

    JSC2003-E-15407 (9 Jan. 1990) --- A 35mm still camera located in the umbilical well of the Space Shuttle Columbia took this photograph of the external fuel tank (ET) after it was dropped from the launch stack as the shuttle headed for Earth-orbit on Jan. 9, 1990 for the STS-32 mission. Several large divots are visible near the forward ET/orbiter bipod and smaller divots are visible on the H2 tank acreage. The vertical streak and the horizontal bands were the results of repairs done prior to launch.

  13. IrisDenseNet: Robust Iris Segmentation Using Densely Connected Fully Convolutional Networks in the Images by Visible Light and Near-Infrared Light Camera Sensors

    PubMed Central

    Arsalan, Muhammad; Naqvi, Rizwan Ali; Kim, Dong Seop; Nguyen, Phong Ha; Owais, Muhammad; Park, Kang Ryoung

    2018-01-01

    The recent advancements in computer vision have opened new horizons for deploying biometric recognition algorithms in mobile and handheld devices. Similarly, accurate iris recognition is now much needed in unconstrained scenarios. These environments make the acquired iris image exhibit occlusion, low resolution, blur, unusual glint, ghost effect, and off-angles. The prevailing segmentation algorithms cannot cope with these constraints. In addition, owing to the unavailability of near-infrared (NIR) light, iris recognition in visible light environment makes the iris segmentation challenging with the noise of visible light. Deep learning with convolutional neural networks (CNN) has brought a considerable breakthrough in various applications. To address the iris segmentation issues in challenging situations by visible light and near-infrared light camera sensors, this paper proposes a densely connected fully convolutional network (IrisDenseNet), which can determine the true iris boundary even with inferior-quality images by using better information gradient flow between the dense blocks. In the experiments conducted, five datasets of visible light and NIR environments were used. For visible light environment, noisy iris challenge evaluation part-II (NICE-II selected from UBIRIS.v2 database) and mobile iris challenge evaluation (MICHE-I) datasets were used. For NIR environment, the institute of automation, Chinese academy of sciences (CASIA) v4.0 interval, CASIA v4.0 distance, and IIT Delhi v1.0 iris datasets were used. Experimental results showed the optimal segmentation of the proposed IrisDenseNet and its excellent performance over existing algorithms for all five datasets. PMID:29748495

  14. IrisDenseNet: Robust Iris Segmentation Using Densely Connected Fully Convolutional Networks in the Images by Visible Light and Near-Infrared Light Camera Sensors.

    PubMed

    Arsalan, Muhammad; Naqvi, Rizwan Ali; Kim, Dong Seop; Nguyen, Phong Ha; Owais, Muhammad; Park, Kang Ryoung

    2018-05-10

    The recent advancements in computer vision have opened new horizons for deploying biometric recognition algorithms in mobile and handheld devices. Similarly, accurate iris recognition is now much needed in unconstrained scenarios. These environments make the acquired iris image exhibit occlusion, low resolution, blur, unusual glint, ghost effect, and off-angles. The prevailing segmentation algorithms cannot cope with these constraints. In addition, owing to the unavailability of near-infrared (NIR) light, iris recognition in visible light environment makes the iris segmentation challenging with the noise of visible light. Deep learning with convolutional neural networks (CNN) has brought a considerable breakthrough in various applications. To address the iris segmentation issues in challenging situations by visible light and near-infrared light camera sensors, this paper proposes a densely connected fully convolutional network (IrisDenseNet), which can determine the true iris boundary even with inferior-quality images by using better information gradient flow between the dense blocks. In the experiments conducted, five datasets of visible light and NIR environments were used. For visible light environment, noisy iris challenge evaluation part-II (NICE-II selected from UBIRIS.v2 database) and mobile iris challenge evaluation (MICHE-I) datasets were used. For NIR environment, the institute of automation, Chinese academy of sciences (CASIA) v4.0 interval, CASIA v4.0 distance, and IIT Delhi v1.0 iris datasets were used. Experimental results showed the optimal segmentation of the proposed IrisDenseNet and its excellent performance over existing algorithms for all five datasets.

  15. UGS video target detection and discrimination

    NASA Astrophysics Data System (ADS)

    Roberts, G. Marlon; Fitzgerald, James; McCormack, Michael; Steadman, Robert; Vitale, Joseph D.

    2007-04-01

    This project focuses on developing electro-optic algorithms which rank images by their likelihood of containing vehicles and people. These algorithms have been applied to images obtained from Textron's Terrain Commander 2 (TC2) Unattended Ground Sensor system. The TC2 is a multi-sensor surveillance system used in military applications. It combines infrared, acoustic, seismic, magnetic, and electro-optic sensors to detect nearby targets. When targets are detected by the seismic and acoustic sensors, the system is triggered and images are taken in the visible and infrared spectrum. The original Terrain Commander system occasionally captured and transmitted an excessive number of images, sometimes triggered by undesirable targets such as swaying trees. This wasted communications bandwidth, increased power consumption, and resulted in a large amount of end-user time being spent evaluating unimportant images. The algorithms discussed here help alleviate these problems. These algorithms are currently optimized for infra-red images, which give the best visibility in a wide range of environments, but could be adapted to visible imagery as well. It is important that the algorithms be robust, with minimal dependency on user input. They should be effective when tracking varying numbers of targets of different sizes and orientations, despite the low resolutions of the images used. Most importantly, the algorithms must be appropriate for implementation on a low-power processor in real time. This would enable us to maintain frame rates of 2 Hz for effective surveillance operations. Throughout our project we have implemented several algorithms, and used an appropriate methodology to quantitatively compare their performance. They are discussed in this paper.

  16. Visible-infrared micro-spectrometer based on a preaggregated silver nanoparticle monolayer film and an infrared sensor card

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Peng, Jing-xiao; Ho, Ho-pui; Song, Chun-yuan; Huang, Xiao-li; Zhu, Yong-yuan; Li, Xing-ao; Huang, Wei

    2018-01-01

    By using a preaggregated silver nanoparticle monolayer film and an infrared sensor card, we demonstrate a miniature spectrometer design that covers a broad wavelength range from visible to infrared with high spectral resolution. The spectral contents of an incident probe beam are reconstructed by solving a matrix equation with a smoothing simulated annealing algorithm. The proposed spectrometer offers significant advantages over current instruments that are based on Fourier transform and grating dispersion, in terms of size, resolution, spectral range, cost and reliability. The spectrometer contains three components, used for dispersion, frequency conversion and detection. Disordered silver nanoparticles in the dispersion component reduce the fabrication complexity. An infrared sensor card in the conversion component broadens the operational spectral range of the system into the visible and infrared bands. Since the CCD used in the detection component provides a very large number of intensity measurements, the final spectrum can be reconstructed with high resolution. As an additional feature of our algorithm for solving the matrix equation, which is suitable for reconstructing both broadband and narrowband signals, we have adopted a smoothing step based on simulated annealing. This step improves the accuracy of the spectral reconstruction.
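    The reconstruction step in such computational spectrometers amounts to inverting y = A x, where A is a calibrated response matrix (one row per detector pixel, one column per wavelength). A minimal sketch using Tikhonov-regularized least squares in place of the paper's smoothing simulated-annealing solver, on a synthetic response matrix:

```python
import numpy as np

def reconstruct_spectrum(A, y, lam=1e-3):
    """Solve min ||A x - y||^2 + lam ||x||^2 for the spectrum x."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

rng = np.random.default_rng(1)
A = rng.random((200, 40))        # 200 detector pixels, 40 wavelength bins
# Narrowband test spectrum: a Gaussian peak centered on bin 15.
x_true = np.exp(-0.5 * ((np.arange(40) - 15.0) / 3.0) ** 2)
y = A @ x_true                   # noiseless measurements
x_hat = reconstruct_spectrum(A, y)
print(float(np.max(np.abs(x_hat - x_true))) < 1e-2)  # → True
```

With many more measurements than wavelength bins, the inversion is well posed; the regularization weight `lam` trades noise suppression against bias, much as the smoothing step does in the annealing solver.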

  17. Optimal viewing position in vertically and horizontally presented Japanese words.

    PubMed

    Kajii, N; Osaka, N

    2000-11-01

    In the present study, the optimal viewing position (OVP) phenomenon in Japanese Hiragana was investigated, with special reference to a comparison between the vertical and the horizontal meridians in the visual field. In the first experiment, word recognition scores were determined while the eyes were fixating predetermined locations in vertically and horizontally displayed words. Similar to what has been reported for Roman scripts, OVP curves, which were asymmetric with respect to the beginning of words, were observed in both conditions. However, this asymmetry was less pronounced for vertically than for horizontally displayed words. In the second experiment, the visibility of individual characters within strings was examined for the vertical and horizontal meridians. As for Roman characters, letter identification scores were better in the right than in the left visual field. However, identification scores did not differ between the upper and the lower sides of fixation along the vertical meridian. The results showed that the model proposed by Nazir, O'Regan, and Jacobs (1991) cannot entirely account for the OVP phenomenon. A model in which visual and lexical factors are combined is proposed instead.

  18. Subspace algorithms for identifying separable-in-denominator 2D systems with deterministic-stochastic inputs

    NASA Astrophysics Data System (ADS)

    Ramos, José A.; Mercère, Guillaume

    2016-12-01

    In this paper, we present an algorithm for identifying two-dimensional (2D) causal, recursive and separable-in-denominator (CRSD) state-space models in the Roesser form with deterministic-stochastic inputs. The algorithm implements the N4SID, PO-MOESP and CCA methods, which are well known in the literature on 1D system identification, but here we do so for the 2D CRSD Roesser model. The algorithm solves the 2D system identification problem by maintaining the constraint structure imposed by the problem (i.e. Toeplitz and Hankel) and computes the horizontal and vertical system orders, system parameter matrices and covariance matrices of a 2D CRSD Roesser model. From a computational point of view, the algorithm has been presented in a unified framework, where the user can select which of the three methods to use. Furthermore, the identification task is divided into three main parts: (1) computing the deterministic horizontal model parameters, (2) computing the deterministic vertical model parameters and (3) computing the stochastic components. Specific attention has been paid to the computation of a stabilised Kalman gain matrix and a positive real solution when required. The efficiency and robustness of the unified algorithm have been demonstrated via a thorough simulation example.

  19. Surface reflectance retrieval from satellite and aircraft sensors - Results of sensors and algorithm comparisons during FIFE

    NASA Technical Reports Server (NTRS)

    Markham, B. L.; Halthore, R. N.; Goetz, S. J.

    1992-01-01

    Visible to shortwave infrared radiometric data collected by a number of remote sensing instruments on aircraft and satellite platforms were compared over common areas in the First International Satellite Land Surface Climatology Project (ISLSCP) Field Experiment (FIFE) site on August 4, 1989, to assess their radiometric consistency and the adequacy of atmospheric correction algorithms. The instruments in the study included the Landsat 5 Thematic Mapper (TM), the SPOT 1 high-resolution visible (HRV) 1 sensor, the NS001 Thematic Mapper simulator, and the modular multispectral radiometers (MMRs). Atmospheric correction routines analyzed were an algorithm developed for FIFE, LOWTRAN 7, and 5S. A comparison between corresponding bands of the SPOT 1 HRV 1 and the Landsat 5 TM sensors indicated that the two instruments were radiometrically consistent to within about 5 percent. Retrieved surface reflectance factors using the FIFE algorithm over one site under clear atmospheric conditions indicated a capability to determine near-nadir surface reflectance factors to within about 0.01 at a reflectance of 0.06 in the visible (0.4-0.7 microns) and about 0.30 in the near infrared (0.7-1.2 microns) for all but the NS001 sensor. All three atmospheric correction procedures produced absolute reflectances to within 0.005 in the visible and near infrared. In the shortwave infrared (1.2-2.5 microns) region the three algorithms differed in the retrieved surface reflectances primarily owing to differences in predicted gaseous absorption. Although uncertainties in the measured surface reflectance in the shortwave infrared precluded definitive results, the 5S code appeared to predict gaseous transmission marginally more accurately than LOWTRAN 7.

  20. Combining Visible and Infrared Spectral Tests for Dust Identification

    NASA Technical Reports Server (NTRS)

    Zhou, Yaping; Levy, Robert; Kleidman, Richard; Remer, Lorraine; Mattoo, Shana

    2016-01-01

    The MODIS Dark Target aerosol algorithm over Ocean (DT-O) uses spectral reflectance in the visible, near-IR and SWIR wavelengths to determine aerosol optical depth (AOD) and Angstrom Exponent (AE). Even though DT-O does have "dust-like" models to choose from, dust is not identified a priori before inversion. The "dust-like" models are not true "dust models", as they are spherical and do not have enough absorption at short wavelengths, so retrieved AOD and AE for dusty regions tend to be biased. The inference of "dust" is based on post-processing criteria for AOD and AE applied by users. Dust aerosol has known spectral signatures in the near-UV (deep blue), visible, and thermal infrared (TIR) wavelength regions. Multiple dust detection algorithms have been developed over the years with varying detection capabilities. Here, we test a few of these dust detection algorithms to determine whether they can be useful in informing the choices made by the DT-O algorithm. We evaluate the following methods: the multichannel imager (MCI) algorithm, which uses spectral threshold tests in the (0.47, 0.64, 0.86, 1.38, 2.26, 3.9, 11.0, 12.0 micrometer) channels and a spatial uniformity test [Zhao et al., 2010], and the NOAA dust aerosol index (DAI), which uses spectral contrast in the blue channels (412 nm and 440 nm) [Ciren and Kundragunta, 2014]. The MCI is already included as tests within the "Wisconsin" (MOD35) cloud mask algorithm.
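    A toy version of a blue-band spectral-contrast test in the spirit of the DAI: absorbing dust darkens 412 nm relative to 440 nm, so a large positive contrast flags dust. The threshold and the sample reflectances are illustrative assumptions, not the operational NOAA values:

```python
import numpy as np

def dust_flag(r412, r440, threshold=0.05):
    """Flag pixels whose 412/440 nm reflectance contrast suggests dust."""
    contrast = (r440 - r412) / (r440 + r412)
    return contrast > threshold

r412 = np.array([0.08, 0.12])   # dusty pixel, clear pixel
r440 = np.array([0.10, 0.12])
flags = dust_flag(r412, r440)   # dusty pixel flagged, clear pixel not
print(flags)
```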

  1. Evaluation of Driver Visibility from Mobile LIDAR Data and Weather Conditions

    NASA Astrophysics Data System (ADS)

    González-Jorge, H.; Díaz-Vilariño, L.; Lorenzo, H.; Arias, P.

    2016-06-01

    Visibility of drivers is crucial to ensure road safety. Visibility is influenced by two main factors: the geometry of the road and the weather present therein. The present work depicts an approach for automatic visibility evaluation using mobile LiDAR data and climate information provided by weather stations located in the neighbourhood of the road. The methodology is based on a ray-tracing algorithm to detect occlusions from point clouds, with the purpose of identifying the visible area from each driver position. The resulting data are normalized with the climate information to provide a polyline with an accurate area of visibility. Visibility ranges from 25 m (heavy fog) to more than 10,000 m (clean atmosphere). Values over 250 m are not taken into account for road safety purposes, since this value corresponds to the maximum braking distance of a vehicle. Two case studies are evaluated: an urban road in the city of Vigo (Spain) and an inter-urban road between the city of Ourense and the village of Castro Caldelas (Spain). In both cases, data from the Galician Weather Agency (Meteogalicia) are used. The algorithm shows promising results, allowing the detection of particularly dangerous areas from the viewpoint of driver visibility. The mountain road between Ourense and Castro Caldelas, with a great presence of slopes and sharp curves, is of special interest for this type of application. In this case, poor visibility can contribute especially to pedestrians or cyclists traveling on the road shoulders being run over.
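    The two-stage idea (geometric occlusion from the point cloud, then a weather cap) can be sketched in 2-D: cast a fan of headings from the driver position, take the distance to the nearest occluding point in each angular bin, and clip every range at the meteorological visibility. The point cloud and bin width below are synthetic placeholders, not the paper's mobile LiDAR data:

```python
import numpy as np

def visibility_polyline(driver_xy, points_xy, met_visibility_m,
                        n_bins=360, max_range_m=10000.0):
    """Per-heading visible range (m) around the driver position."""
    d = points_xy - driver_xy
    dist = np.hypot(d[:, 0], d[:, 1])
    ang = np.arctan2(d[:, 1], d[:, 0])                    # -pi..pi
    bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    ranges = np.full(n_bins, max_range_m)
    np.minimum.at(ranges, bins, dist)                     # nearest occluder
    return np.minimum(ranges, met_visibility_m)           # weather cap

pts = np.array([[100.0, 0.0], [0.0, 30.0]])               # two occluders
r = visibility_polyline(np.zeros(2), pts, met_visibility_m=250.0)
print(float(r.min()), float(r.max()))  # → 30.0 250.0
```

With a 250 m cap (the braking-distance limit cited above), open headings report 250 m and occluded headings report the distance to the nearest point, which is the polyline the paper normalizes with weather data.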

  2. Application of Deconvolution Algorithm of Point Spread Function in Improving Image Quality: An Observer Preference Study on Chest Radiography.

    PubMed

    Chae, Kum Ju; Goo, Jin Mo; Ahn, Su Yeon; Yoo, Jin Young; Yoon, Soon Ho

    2018-01-01

    To evaluate the preference of observers for the image quality of chest radiography using the deconvolution algorithm of the point spread function (PSF) (TRUVIEW ART algorithm, DRTECH Corp.) compared with that of original chest radiography for visualization of anatomic regions of the chest. Fifty prospectively enrolled pairs of posteroanterior chest radiographs, collected with the standard protocol and with the additional TRUVIEW ART algorithm, were compared by four chest radiologists. This algorithm corrects scattered signals generated by a scintillator. Readers independently evaluated the visibility of 10 anatomical regions and overall image quality on a 5-point preference scale. The significance of the differences in reader preference was tested with a Wilcoxon signed-rank test. All four readers preferred the images processed with the algorithm over those without it for all 10 anatomical regions (mean, 3.6; range, 3.2-4.0; p < 0.001) and for overall image quality (mean, 3.8; range, 3.3-4.0; p < 0.001). The most preferred anatomical regions were the azygoesophageal recess, thoracic spine, and unobscured lung. The visibility of chest anatomical structures with the PSF deconvolution algorithm applied was superior to that of the original chest radiography.
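    PSF deconvolution in general can be illustrated with a frequency-domain (Wiener) filter on a 1-D profile; this sketches the generic technique only, not the vendor's proprietary TRUVIEW ART algorithm:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, snr=100.0):
    """Deconvolve `blurred` by a known `psf` (same length, centered at
    index 0) with a Wiener filter; `snr` sets the regularization."""
    H = np.fft.fft(psf)
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)   # regularized inverse
    return np.real(np.fft.ifft(np.fft.fft(blurred) * G))

n = 64
signal = np.zeros(n); signal[20] = 1.0; signal[40] = 0.5   # two sharp features
psf = np.zeros(n); psf[[0, 1, -1]] = [0.5, 0.25, 0.25]     # symmetric blur
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(psf)))
restored = wiener_deconvolve(blurred, psf, snr=1e6)
print(int(np.argmax(restored)))  # → 20
```

The `1/snr` term keeps the filter stable where the PSF's frequency response approaches zero, at the cost of not restoring those frequencies; scatter-correction algorithms face the same trade-off between sharpening and noise amplification.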

  3. Spectral and spatial variability of undisturbed and disturbed grass under different view and illumination directions

    NASA Astrophysics Data System (ADS)

    Borel-Donohue, Christoph C.; Shivers, Sarah Wells; Conover, Damon

    2017-05-01

    It is well known that disturbed grass-covered surfaces show variability with view and illumination conditions. A good example is a grass field in a soccer stadium that shows stripes indicating in which direction the grass was mowed. These spatial variations are due to a complex interplay of the spectral characteristics of grass blades, their density, length, and orientations. Viewing a grass surface from nadir or from near-horizontal directions results in observing different components. Views from a vertical direction show more variation due to reflections from the randomly oriented grass blades and their shadows. Views from near horizontal show a mixture of reflected and transmitted light from grass blades. An experiment was performed on a mowed grass surface on which paths of simulated heavy foot traffic were laid down in different directions. High-spatial-resolution hyperspectral data cubes were taken with an imaging spectrometer covering the visible through near infrared over a period of several hours. Ground-truth grass reflectance spectra of undisturbed and disturbed areas were obtained with a handheld spectrometer. Close-range images of selected areas were taken with a handheld camera and then used to reconstruct the 3D geometry of the grass using structure-from-motion algorithms. Computer graphics rendering, using ray tracing of reconstructed and procedurally created grass surfaces, was used to compute BRDF models. In this paper, we discuss differences between observed and simulated spectral and spatial variability. Based on the measurements and/or simulations, we derive simple spectral index methods to detect spatial disturbances and apply scattering models.

  4. Analysis and high-resolution modeling of a dense sea fog event over the Yellow Sea

    NASA Astrophysics Data System (ADS)

    Fu, Gang; Guo, Jingtian; Xie, Shang-Ping; Duan, Yihong; Zhang, Meigen

    2006-10-01

    A ubiquitous feature of the Yellow Sea (YS) is the frequent occurrence of sea fog in the spring and summer seasons. An extremely dense sea fog event was observed around the Shandong Peninsula in the morning of 11 April 2004. This fog patch, with a spatial scale of several hundred kilometers and a duration of about 20 h, reduced the horizontal visibility to less than 20 m in some locations and caused a series of traffic collisions and 12 injuries on the coastal stretch of a major highway. In this paper, almost all available observational data, including Geostationary Operational Environmental Satellite (GOES)-9 visible satellite imagery, the objectively reanalyzed final run analysis (FNL) data issued by the National Center for Environmental Prediction (NCEP), and the sounding data of Qingdao and Dalian, as well as the latest 4.4 version of the Regional Atmospheric Modeling System (RAMS) model, were employed to investigate this sea fog case. Its evolutionary process and the environmental conditions that led to the fog formation were examined using GOES-9 visible satellite imagery and sounding observations. In order to better understand the fog formation mechanism, a high-resolution RAMS simulation with a 4 km × 4 km grid was designed. The simulation was initialized and validated with the FNL data. A 30-h run starting from 18 UTC 10 April 2004 reproduced the main characteristics of this fog event. The simulated area of lower horizontal visibility agreed reasonably well with the sea fog region identified from the satellite imagery. The advection cooling effect seemed to play a significant role in the fog formation.

  5. Perception-oriented fusion of multi-sensor imagery: visible, IR, and SAR

    NASA Astrophysics Data System (ADS)

    Sidorchuk, D.; Volkov, V.; Gladilin, S.

    2018-04-01

    This paper addresses the problem of fusing optical (visible and thermal domain) data and radar data for the purpose of visualization. These types of images typically contain a great deal of complementary information, and their joint visualization can be more useful and convenient for a human user than a set of individual images. To solve the image fusion problem we propose a novel algorithm that exploits peculiarities of human color perception and is based on grey-scale structural visualization. The benefits of the presented algorithm are exemplified with satellite imagery.

  6. Pattern-Recognition System for Approaching a Known Target

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance; Cheng, Yang

    2008-01-01

    A closed-loop pattern-recognition system is designed to provide guidance for maneuvering a small exploratory robotic vehicle (rover) on Mars to return to a landed spacecraft to deliver soil and rock samples that the spacecraft would subsequently bring back to Earth. The system could be adapted to terrestrial use in guiding mobile robots to approach known structures that humans could not approach safely, for such purposes as reconnaissance in military or law-enforcement applications, terrestrial scientific exploration, and removal of explosive or other hazardous items. The system has been demonstrated in experiments in which the Field Integrated Design and Operations (FIDO) rover (a prototype Mars rover equipped with a video camera for guidance) is made to return to a mockup of a Mars-lander spacecraft. The FIDO rover camera autonomously acquires an image of the lander from a distance of 125 m in an outdoor environment. Then, under guidance by an algorithm that performs fusion of multiple line and texture features in digitized images acquired by the camera, the rover traverses the intervening terrain, using features derived from images of the lander truss structure. Then, by use of precise pattern matching for determining the position and orientation of the rover relative to the lander, the rover aligns itself with the bottom of ramps extending from the lander, in preparation for climbing the ramps to deliver samples to the lander. The most innovative aspect of the system is a set of pattern-recognition algorithms that govern a three-phase visual-guidance sequence for approaching the lander. During the first phase, a multifeature fusion algorithm integrates the outputs of a horizontal-line-detection algorithm and a wavelet-transform-based visual-area-of-interest algorithm for detecting the lander from a significant distance.
The horizontal-line-detection algorithm is used to determine candidate lander locations based on detection of a horizontal deck that is part of the lander.
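    The flight implementation is not given in the abstract, but a minimal stand-in for horizontal-line detection (thresholding the vertical intensity gradient and looking for long edge runs along each row) can be sketched as follows; the function name and all parameter values are illustrative assumptions.

```python
import numpy as np

def detect_horizontal_lines(image, min_run=20, grad_thresh=0.2):
    # Flag rows containing long horizontal edges: take the vertical
    # intensity gradient, threshold it, and look for long runs of
    # edge pixels along each row. Rows with a run of at least
    # `min_run` edge pixels are candidate horizontal-line locations
    # (e.g. the lander deck in the abstract).
    grad = np.abs(np.diff(image.astype(float), axis=0))
    edges = grad > grad_thresh
    rows = []
    for y in range(edges.shape[0]):
        run = best = 0
        for v in edges[y]:
            run = run + 1 if v else 0
            best = max(best, run)
        if best >= min_run:
            rows.append(y)
    return rows
```

    A real detector would also tolerate small slopes and gaps, for instance via a Hough transform restricted to near-horizontal angles.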

  7. Geometric Hitting Set for Segments of Few Orientations

    DOE PAGES

    Fekete, Sandor P.; Huang, Kan; Mitchell, Joseph S. B.; ...

    2016-01-13

    Here we study several natural instances of the geometric hitting set problem for input consisting of sets of line segments (and rays, lines) having a small number of distinct slopes. These problems model path monitoring (e.g., on road networks) using the fewest sensors (the "hitting points"). We give approximation algorithms for cases including (i) lines of 3 slopes in the plane, (ii) vertical lines and horizontal segments, (iii) pairs of horizontal/vertical segments. Lastly, we give hardness and hardness-of-approximation results for these problems. We prove that the hitting set problem for vertical lines and horizontal rays is polynomially solvable.

  8. Optimized 3D stitching algorithm for whole body SPECT based on transition error minimization (TEM)

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan

    2017-02-01

    Standard Single Photon Emission Computed Tomography (SPECT) has a limited field of view (FOV) and cannot provide a 3D image of the entire body in a single acquisition. To produce a 3D whole-body SPECT image, two to five overlapping SPECT FOVs from head to foot are acquired and assembled using image stitching. Most commercial software from medical imaging manufacturers applies direct mid-slice stitching to avoid the blurring or ghosting caused by 3D image blending. Due to intensity changes across the middle slice of the overlapped images, direct mid-slice stitching often produces visible seams in the coronal and sagittal views and in the maximum intensity projection (MIP). In this study, we propose an optimized algorithm to reduce the visibility of stitching edges. The new algorithm computes, based on transition error minimization (TEM), a 3D stitching interface between two overlapped 3D SPECT images. To test the suggested algorithm, four studies of 2-FOV whole-body SPECT were used, covering two different reconstruction methods (filtered back projection (FBP) and ordered subset expectation maximization (OSEM)) as well as two different radiopharmaceuticals (Tc-99m MDP for bone metastases and I-131 MIBG for neuroblastoma tumors). Relative transition errors of the stitched whole-body SPECT with mid-slice stitching and with the TEM-based algorithm were measured for objective evaluation. Preliminary experiments showed that the new algorithm reduced the visibility of the stitching interface in the coronal, sagittal, and MIP views. The average relative transition error was reduced from 56.7% with mid-slice stitching to 11.7% with TEM-based stitching. The proposed algorithm also avoids blurring artifacts by preserving the noise properties of the original SPECT images.
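    The transition-error-minimization idea (cutting through the overlap where the two images agree best, instead of always cutting at the mid-slice) can be illustrated in 2D. The sketch below is a simplified per-column version of the 3D interface described in the abstract, with a hypothetical function name.

```python
import numpy as np

def tem_stitch(top, bottom, overlap):
    # Stitch two images that share `overlap` rows by choosing, per
    # column, the cut row in the overlap where the intensity jump
    # |top - bottom| is smallest (transition-error minimization).
    a = top[-overlap:].astype(float)    # overlap region, top image
    b = bottom[:overlap].astype(float)  # overlap region, bottom image
    cut = np.abs(a - b).argmin(axis=0)  # per-column minimal-transition row
    # Above the cut take the top image, at/below it take the bottom.
    fused = np.where(np.arange(overlap)[:, None] < cut[None, :], a, b)
    out = np.vstack([top[:-overlap], fused, bottom[overlap:]])
    return out, cut
```

    Mid-slice stitching corresponds to forcing `cut = overlap // 2` everywhere; letting the cut follow the data is what suppresses the visible seam.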

  9. Signature Verification Using N-tuple Learning Machine.

    PubMed

    Maneechot, Thanin; Kitjaidure, Yuttana

    2005-01-01

    This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures captured on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely the horizontal and vertical pen-tip position (x-y position), the pen-tip pressure, and the pen altitude angles. Verification uses the N-tuple technique with Gaussian thresholding.
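    A minimal WISARD-style N-tuple learner over binary feature vectors can be sketched as follows; this is a generic illustration that assumes already-binarized features, not the authors' signature-verification system.

```python
import random

class NTupleClassifier:
    # WISARD-style N-tuple learner: random groups of n input bits
    # address small lookup tables; training marks the addresses seen
    # for a class, and scoring counts how many tables recognize them.
    def __init__(self, input_bits, n, tuples, seed=0):
        rng = random.Random(seed)
        positions = list(range(input_bits))
        self.tuples = [rng.sample(positions, n) for _ in range(tuples)]
        self.memory = {}  # (label, tuple_id, address) -> seen

    def _addresses(self, bits):
        for t, pos in enumerate(self.tuples):
            addr = 0
            for p in pos:
                addr = (addr << 1) | bits[p]
            yield t, addr

    def train(self, bits, label):
        for t, addr in self._addresses(bits):
            self.memory[(label, t, addr)] = True

    def score(self, bits, label):
        # Number of tuples whose address was seen during training.
        return sum((label, t, addr) in self.memory
                   for t, addr in self._addresses(bits))
```

    In a verification setting, a signature is accepted when its score against the claimed writer's class exceeds a threshold.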

  10. Accurate visible speech synthesis based on concatenating variable length motion capture data.

    PubMed

    Ma, Jiyong; Cole, Ron; Pellom, Bryan; Ward, Wayne; Wise, Barbara

    2006-01-01

    We present a novel approach to synthesizing accurate visible speech based on searching and concatenating optimal variable-length units in a large corpus of motion capture data. Based on a set of visual prototypes selected on a source face and a corresponding set designated for a target face, we propose a machine learning technique to automatically map the facial motions observed on the source face to the target face. In order to model the long-distance coarticulation effects in visible speech, a large-scale corpus that covers the most common syllables in English was collected, annotated and analyzed. For any input text, a search algorithm to locate the optimal sequences of concatenated units for synthesis is described. A new algorithm to adapt lip motions from a generic 3D face model to a specific 3D face model is also proposed. A complete, end-to-end visible speech animation system is implemented based on the approach. This system is currently used in more than 60 kindergarten through third grade classrooms to teach students to read using a lifelike conversational animated agent. To evaluate the quality of the visible speech produced by the animation system, both subjective and objective evaluations were conducted. The evaluation results show that the proposed approach is accurate and powerful for visible speech synthesis.

  11. Route towards cylindrical cloaking at visible frequencies using an optimization algorithm

    NASA Astrophysics Data System (ADS)

    Rottler, Andreas; Krüger, Benjamin; Heitmann, Detlef; Pfannkuche, Daniela; Mendach, Stefan

    2012-12-01

    We derive a model based on the Maxwell-Garnett effective-medium theory that describes a cylindrical cloaking shell composed of metal rods which are radially aligned in a dielectric host medium. We propose and demonstrate a minimization algorithm that calculates for given material parameters the optimal geometrical parameters of the cloaking shell such that its effective optical parameters fit best to the required permittivity distribution for cylindrical cloaking. By means of sophisticated full-wave simulations we find that a cylindrical cloak with good performance using silver as the metal can be designed with our algorithm for wavelengths in the red part of the visible spectrum (623 nm < λ < 773 nm). We also present a full-wave simulation of such a cloak at an exemplary wavelength of λ = 729 nm (ℏω = 1.7 eV) which indicates that our model is useful to find design rules of cloaks with good cloaking performance. Our calculations investigate a structure that is easy to fabricate using standard preparation techniques and therefore pave the way to a realization of guiding light around an object at visible frequencies, thus rendering it invisible.

  12. Surface normal coupling to multiple-slot and cover-slotted silicon nanocrystalline waveguides and ring resonators

    NASA Astrophysics Data System (ADS)

    Covey, John; Chen, Ray T.

    2014-03-01

    Grating couplers are ideal for coupling into the tightly confined propagation modes of semiconductor waveguides. In addition, nonlinear optics has benefited from the sub-diffraction-limit confinement of horizontal slot waveguides. By combining these two advancements, slot-based nonlinear optics with mode areas less than 0.02 μm² can become as routine as twisting fiber connectors together. Surface normal fiber alignment to a chip is also highly desirable from time, cost, and manufacturing considerations. To meet these considerable design challenges, a custom genetic algorithm is created which, starting from purely random designs, creates a unique four-stage grating coupler for two novel horizontal slot waveguide platforms. For horizontal multiple-slot waveguides filled with silicon nanocrystal, a theoretical fiber-to-waveguide coupling efficiency of 68% is obtained. For thin silicon waveguides clad with optically active silicon nanocrystal, known as cover-slot waveguides, a theoretical fiber-to-waveguide coupling efficiency of 47% is obtained, and 1 dB and 3 dB theoretical bandwidths of 70 nm and 150 nm are obtained, respectively. Both waveguide platforms are fabricated from scratch, and their respective on-chip grating couplers are experimentally measured from a standard single-mode fiber array that is mounted surface normally. The horizontal multiple-slot grating coupler achieved an experimental 60% coupling efficiency, and the horizontal cover-slot grating coupler achieved an experimental 38.7% coupling efficiency, with an extrapolated 1 dB bandwidth of 66 nm. This report demonstrates the promise of genetic algorithm-based design by reducing to practice the first large-bandwidth vertical grating coupler to a novel silicon nanocrystal horizontal cover-slot waveguide.

  13. Practical calibration of design data to technical capabilities of horizontal directional drilling rig

    NASA Astrophysics Data System (ADS)

    Toropov, S. Yu; Toropov, V. S.

    2018-05-01

    In order to design trenchless pipeline passages more accurately, a technique has been developed for calculating the passage profile based on specific parameters of the horizontal directional drilling rig, including the range of possible drilling angles and a list of compatible drill pipe sets. The algorithm for calculating the parameters of the trenchless passage profile is shown in the paper. This algorithm is based on taking into account the features of HDD technology, namely its three different production stages. The authors take into account that the passage profile is formed at the first stage of passage construction, that is, when drilling a pilot well. The algorithm calculates the profile from the parameters of the drill pipes used and the angles of their deviation relative to each other during the pilot drilling. This approach allows us to unambiguously calibrate the designed profile to the capabilities of the HDD rig and the auxiliary and navigation equipment used in the construction process.

  14. Wind turbine blade shear web disbond detection using rotor blade operational sensing and data analysis.

    PubMed

    Myrent, Noah; Adams, Douglas E; Griffith, D Todd

    2015-02-28

    A wind turbine blade's structural dynamic response is simulated and analysed with the goal of characterizing the presence and severity of a shear web disbond. Computer models of a 5 MW offshore utility-scale wind turbine were created to develop effective algorithms for detecting such damage. Through data analysis and with the use of blade measurements, a shear web disbond was quantified according to its length. An aerodynamic sensitivity study was conducted to ensure robustness of the detection algorithms. In all analyses, the blade's flap-wise acceleration and root-pitching moment were the clearest indicators of the presence and severity of a shear web disbond. A combination of blade and non-blade measurements was formulated into a final algorithm for the detection and quantification of the disbond. The probability of detection was 100% for the optimized wind speed ranges in laminar, 30% horizontal shear and 60% horizontal shear conditions. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  15. A Comparison of the AFGL Flash, Draper Dart and AWS Haze Models with the Rand Wetta Model for Calculating Atmospheric Contrast Reduction.

    DTIC Science & Technology

    1982-03-01

    52 ILLUSTRATIONS Figure I Horizontal Visibility Profiles for Stair-Step and Exponential Extinction Coefficient...background reflectances. These values were then numerically integrated (via a combination of Simpson's and Newton's 3/8th rules) and compared with the

  16. 46 CFR 72.04-1 - Navigation bridge visibility.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... meet the following requirements: (a) The field of vision from the navigation bridge, whether the vessel... degrees. (2) From the conning position, the horizontal field of vision extends over an arc from at least...) From each bridge wing, the field of vision extends over an arc from at least 45 degrees on the opposite...

  17. 46 CFR 92.03-1 - Navigation bridge visibility.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... after September 7, 1990, must meet the following requirements: (a) The field of vision from the... obstruction must not exceed 5 degrees. (2) From the conning position, the horizontal field of vision extends... paragraph (a)(1) of this section. (3) From each bridge wing, the field of vision extends over an arc from at...

  18. 46 CFR 190.02-1 - Navigation bridge visibility.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... September 7, 1990, must meet the following requirements: (a) The field of vision from the navigation bridge... not exceed 5 degrees. (2) From the conning position, the horizontal field of vision extends over an...)(1) of this section. (3) From each bridge wing, the field of vision extends over an arc from at least...

  19. 46 CFR 72.04-1 - Navigation bridge visibility.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... meet the following requirements: (a) The field of vision from the navigation bridge, whether the vessel... degrees. (2) From the conning position, the horizontal field of vision extends over an arc from at least...) From each bridge wing, the field of vision extends over an arc from at least 45 degrees on the opposite...

  20. 46 CFR 108.801 - Navigation bridge visibility.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... September 7, 1990, must meet the following requirements: (a) The field of vision from the navigation bridge... not exceed 5 degrees. (2) From the conning position, the horizontal field of vision extends over an...)(1) of this section. (3) From each bridge wing, the field of vision extends over an arc from at least...

  1. Propagating figured wood in black walnut

    Treesearch

    James R. McKenna; Wayne A. Geyer; Keith E. Woeste; Daniel L. Cassens

    2015-01-01

    Figured black walnut lumber is a specialty wood product that commands a high price for manufacturing fine furniture and interior paneling. Two common figured grain patterns occur in walnut; they are known as "fiddle-back" or "curly" grain, depending on the number of horizontal lines visible in the grain of the finished wood. The occurrence of...

  2. Detection of lettuce discoloration using hyperspectral reflectance imaging

    USDA-ARS?s Scientific Manuscript database

    Rapid visible/near-infrared (VNIR) hyperspectral imaging methods, employing both a single waveband algorithm and multi-spectral algorithms, were developed in order to classify the discoloration of lettuce. Reflectance spectra for sound and discolored lettuce surfaces were extracted from hyperspectra...

  3. Analysis of estimation algorithms for CDTI and CAS applications

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1985-01-01

    Estimation algorithms for Cockpit Display of Traffic Information (CDTI) and Collision Avoidance System (CAS) applications were analyzed and/or developed. The algorithms are based on actual or projected operational and performance characteristics of an Enhanced TCAS II traffic sensor developed by Bendix and the Federal Aviation Administration. Three algorithm areas are examined and discussed: horizontal (x and y) position, range, and altitude estimation. Raw estimation errors are quantified using Monte Carlo simulations developed for each application; the raw errors are then used to infer impacts on the CDTI and CAS applications. Applications of smoothing algorithms to CDTI problems are also discussed briefly. Technical conclusions are summarized based on the analysis of simulation results.

  4. Regularized inversion of controlled source audio-frequency magnetotelluric data in horizontally layered transversely isotropic media

    NASA Astrophysics Data System (ADS)

    Zhou, Jianmei; Wang, Jianxun; Shang, Qinglong; Wang, Hongnian; Yin, Changchun

    2014-04-01

    We present an algorithm for inverting controlled source audio-frequency magnetotelluric (CSAMT) data in horizontally layered transversely isotropic (TI) media. Popular inversion methods (e.g. Occam's inversion) parameterize the medium into a large number of fixed-thickness layers and reconstruct only the conductivities, which prevents recovery of sharp interfaces between layers. In this paper, we simultaneously reconstruct all the model parameters, including both the horizontal and vertical conductivities and the layer depths. Applying the perturbation principle and the dyadic Green's function in TI media, we derive the analytic expression of the Fréchet derivatives of the CSAMT responses with respect to all the model parameters in the form of Sommerfeld integrals. A regularized iterative inversion method is established to simultaneously reconstruct all the model parameters. Numerical results show that including the depths of the layer interfaces in the inversion significantly improves the results: the algorithm not only reconstructs the sharp interfaces between layers but also obtains conductivities close to the true values.
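    A single regularized (Tikhonov-damped) Gauss-Newton update of the kind used by such iterative inversions can be sketched as follows, assuming the Fréchet derivative matrix J has already been evaluated at the current model; the function name and damping choice are illustrative, not the paper's exact scheme.

```python
import numpy as np

def tikhonov_step(J, residual, model, reg=0.1):
    # One damped Gauss-Newton update: solve
    #   min ||J dm - residual||^2 + reg^2 ||dm||^2
    # via the equivalent augmented least-squares system, then update
    # the model. The regularization stabilizes the ill-posed inversion.
    n = J.shape[1]
    A = np.vstack([J, reg * np.eye(n)])
    b = np.concatenate([residual, np.zeros(n)])
    dm, *_ = np.linalg.lstsq(A, b, rcond=None)
    return model + dm
```

    In the paper's setting the model vector would hold both conductivities and layer depths, so one such step adjusts the interfaces and the conductivities together.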

  5. Identification and Reconfigurable Control of Impaired Multi-Rotor Drones

    NASA Technical Reports Server (NTRS)

    Stepanyan, Vahram; Krishnakumar, Kalmanje; Bencomo, Alfredo

    2016-01-01

    The paper presents an algorithm for the control and safe landing of impaired multi-rotor drones when one or more motors fail, simultaneously or in any sequence. It includes three main components: an identification block, a reconfigurable control block, and a decision-making block. The identification block monitors each motor's load characteristics and the current drawn, based on which failures are detected. The control block generates the required total thrust and three-axis torques for the altitude, horizontal position, and/or orientation control of the drone based on time-scale separation and nonlinear dynamic inversion. The horizontal displacement is controlled by modulating the roll and pitch angles. The decision-making algorithm maps the total thrust and three torques into the individual motor thrusts based on the information provided by the identification block. The drone continues mission execution as long as the functioning motors keep it controllable. Otherwise, the controller is switched to a safe mode, which gives up yaw control and commands a safe landing spot and descent rate while maintaining a horizontal attitude.

  6. On the Complexity of Duplication-Transfer-Loss Reconciliation with Non-Binary Gene Trees.

    PubMed

    Kordi, Misagh; Bansal, Mukul S

    2017-01-01

    Duplication-Transfer-Loss (DTL) reconciliation has emerged as a powerful technique for studying gene family evolution in the presence of horizontal gene transfer. DTL reconciliation takes as input a gene family phylogeny and the corresponding species phylogeny, and reconciles the two by postulating speciation, gene duplication, horizontal gene transfer, and gene loss events. Efficient algorithms exist for finding optimal DTL reconciliations when the gene tree is binary. However, gene trees are frequently non-binary. With such non-binary gene trees, the reconciliation problem seeks to find a binary resolution of the gene tree that minimizes the reconciliation cost. Given the prevalence of non-binary gene trees, many efficient algorithms have been developed for this problem in the context of the simpler Duplication-Loss (DL) reconciliation model. Yet, no efficient algorithms exist for DTL reconciliation with non-binary gene trees and the complexity of the problem remains unknown. In this work, we resolve this open question by showing that the problem is, in fact, NP-hard. Our reduction applies to both the dated and undated formulations of DTL reconciliation. By resolving this long-standing open problem, this work will spur the development of both exact and heuristic algorithms for this important problem.

  7. Retrieval and Validation of aerosol optical properties from AHI measurements: impact of surface reflectance assumption

    NASA Astrophysics Data System (ADS)

    Lim, H.; Choi, M.; Kim, J.; Go, S.; Chan, P.; Kasai, Y.

    2017-12-01

    This study retrieves aerosol optical properties (AOPs) with a spectral matching method, using three visible channels and one near-infrared channel (470, 510, 640, and 860 nm). The method requires a look-up table (LUT) built from radiative transfer modeling. Cloud detection is one of the most important processes for guaranteeing the quality of the AOPs. Since the AHI has several infrared channels, which are very advantageous for cloud detection, clouds can be removed using brightness temperature difference (BTD) and spatial variability tests. The Yonsei Aerosol Retrieval (YAER) algorithm is designed for dark surfaces, so bright surfaces (e.g., desert, snow) must be masked first. We then account for the characteristics of the land and ocean surface reflectance using the three visible channels. The known surface reflectivity problem at high latitudes is addressed in this algorithm by selecting appropriate channels through improved tests. Over land, we retrieve the AOPs by obtaining the visible surface reflectance from the NIR to normalized difference vegetation index shortwave infrared (NDVIswir) relationship. Because the ESR tends to be underestimated over urban and cropland areas, we improved the visible surface reflectance by considering the urban effect. In this version, the ocean surface reflectance uses the new Cox and Munk method, which accounts for the ocean bidirectional reflectance distribution function (BRDF); inputs to this method include wind speed, chlorophyll, and salinity. Based on validation against sun-photometer measurements from the AErosol RObotic NETwork (AERONET), we confirm that the quality of the Aerosol Optical Depth (AOD) from the YAER algorithm is comparable to the product from the Japan Aerospace Exploration Agency (JAXA) retrieval algorithm. Future updates include improving the land surface reflectance with a hybrid approach and considering non-spherical aerosols, which will further improve the YAER algorithm, particularly dust retrievals over bright surfaces in East Asia.
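    The spectral matching step against a precomputed LUT can be illustrated with a nearest-entry search in the least-squares sense; the LUT layout below (a list of (AOD, simulated reflectances) pairs) is an assumed simplification of a real multi-dimensional table.

```python
def retrieve_aod(observed_refl, lut):
    # Pick the LUT entry whose simulated channel reflectances best
    # match the observation in the least-squares sense (the spectral
    # matching idea); `lut` holds (aod, simulated_reflectances) pairs.
    def misfit(entry):
        aod, sim = entry
        return sum((o - s) ** 2 for o, s in zip(observed_refl, sim))
    best_aod, _ = min(lut, key=misfit)
    return best_aod
```

    A real retrieval would interpolate in the LUT over geometry, aerosol model, and surface reflectance rather than pick the single nearest node.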

  8. GOME-2 Tropospheric Ozone Profile Retrievals from Joint UV/Visible Measurement

    NASA Astrophysics Data System (ADS)

    Liu, X.; Zoogman, P.; Chance, K.; Cai, Z.; Nowlan, C. R.; Huang, G.; Gonzalez Abad, G.

    2016-12-01

    It has been shown from sensitivity studies that adding visible measurements in the Chappuis ozone band to UV measurements in the Hartley/Huggins ozone bands can significantly enhance retrieval sensitivity to lower tropospheric ozone from backscattered solar radiances, due to deeper photon penetration to the surface in the visible than in the ultraviolet. The first NASA EVI (Earth Venture Instrument) TEMPO (Tropospheric Emissions: Monitoring of Pollution) instrument is being developed to measure backscattered solar radiation in two channels (290-490 and 540-740 nm) and make atmospheric pollution measurements over North America from geostationary orbit. However, this retrieval enhancement has yet to be demonstrated from existing measurements due to the weak ozone absorption in the visible, strong interference from surface reflectance and aerosols, and the requirement of accurate radiometric calibration across different spectral channels. We present GOME-2 retrievals from joint UV/visible measurements using the SAO ozone profile retrieval algorithm, to directly explore the retrieval improvement in lower tropospheric ozone from additional visible measurements. To reduce the retrieval interference from surface reflectance, we add characterization of surface spectral reflectance in the visible, based on combining EOFs (Empirical Orthogonal Functions) derived from ASTER and other surface reflectance spectra with MODIS BRDF climatology, into the ozone profile algorithm. The impacts of various types of aerosols and surface BRDF on the retrievals will be investigated. In addition, we will also perform empirical radiometric calibration of the GOME-2 data based on radiative transfer simulations. We will evaluate the improvement of the joint UV/visible retrieval over the UV-only retrieval based on fitting quality and validation against ozonesonde observations.

  9. Orthodontically induced eruption of a horizontally impacted maxillary central incisor.

    PubMed

    Rizzatto, Susana Maria Deon; de Menezes, Luciane Macedo; Allgayer, Susiane; Batista, Eraldo Luiz; Freitas, Maria Perpétua Mota; Loro, Raphael Carlos Drumond

    2013-07-01

    This case report presents the clinical features and periodontal findings in a patient with a horizontally impacted maxillary central incisor that had been exposed and aligned after a closed-eruption surgical technique. By combining three treatment stages (maxillary expansion, crown exposure surgery, and induced eruption), the horizontally impacted incisor was successfully moved into proper position. The patient finished treatment with a normal and stable occlusion between the maxillary and mandibular arches, and an adequate width of attached gingiva, even in the area surrounding the crown. The 5-year follow-up of stability and periodontal health demonstrated esthetic and functional outcomes after orthodontically induced tooth eruption. Clinical evaluation showed that the treated central incisor had periodontal clinical variables related to visible plaque, bleeding on probing, width of attached gingiva, and crown length that resembled the contralateral incisor. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  10. Space Radar Image of Lisbon, Portugal

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This radar image of Lisbon, Portugal illustrates the different land use patterns that are present in coastal Portugal. Lisbon, the national capital, lies on the north bank of the Rio Tejo where the river enters the Atlantic Ocean. The city center appears as the bright area in the center of the image. The green area west of the city center is a large city park called the Parque Florestal de Monsanto. The Lisbon Airport is visible east of the city. The Rio Tejo forms a large bay just east of the city. Many agricultural fields are visible as a patchwork pattern east of the bay. Suburban housing can be seen on the southern bank of the river. Spanning the river is the Ponte 25 de Abril, a large suspension bridge similar in architecture to San Francisco's Golden Gate Bridge. The image was acquired on April 19, 1994 and is centered at 38.8 degrees north latitude, 9.2 degrees west longitude. North is towards the upper right. The image is 50 kilometers by 30 kilometers (31 miles by 19 miles). The colors in this image represent the following radar channels and polarizations: red is L-band, horizontally transmitted and received; green is L-band, horizontally transmitted and vertically received; and blue is C-band, horizontally transmitted and vertically received. SIR-C/X-SAR, a joint mission of the German, Italian, and the United States space agencies, is part of NASA's Mission to Planet Earth.

  11. Trace Gas Retrievals from the GeoTASO Aircraft Instrument

    NASA Astrophysics Data System (ADS)

    Nowlan, C. R.; Liu, X.; Leitch, J. W.; Liu, C.; Gonzalez Abad, G.; Chance, K.; Cole, J.; Delker, T.; Good, W. S.; Murcray, F.; Ruppert, L.; Soo, D.; Loughner, C.; Follette-Cook, M. B.; Janz, S. J.; Kowalewski, M. G.; Pickering, K. E.; Zoogman, P.; Al-Saadi, J. A.

    2015-12-01

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) instrument is a passive remote sensing instrument capable of making 2-D measurements of trace gases and aerosols from aircraft. The instrument measures backscattered UV and visible radiation, allowing the retrieval of trace gas amounts below the aircraft at horizontal resolutions on the order of 250 m x 250 m. GeoTASO was originally developed under NASA's Instrument Incubator Program as a test-bed instrument for the Geostationary Coastal and Air Pollution Events (GEO-CAPE) decadal survey mission, and is now also part of risk reduction for the upcoming Tropospheric Emissions: Monitoring of Pollution (TEMPO) and Geostationary Environment Monitoring Spectrometer (GEMS) geostationary satellite missions. We present spatially resolved observations of ozone, nitrogen dioxide, formaldehyde and sulfur dioxide over urban areas and power plants from flights during the DISCOVER-AQ field campaigns in Texas and Colorado, as well as comparisons with observations made by ground-based Pandora spectrometers, in situ monitoring instruments and other aircraft instruments deployed during these campaigns. These measurements at various times of day are providing a very useful data set for testing and improving TEMPO and GEMS retrieval algorithms, as well as demonstrating prototype validation strategies.

  12. Automated Visibility & Cloud Cover Measurements with a Solid State Imaging System

    DTIC Science & Technology

    1989-03-01

    GL-TR-89-0061; SIO Ref. 89-7; MPL-U-26/89. Automated Visibility & Cloud Cover Measurements with a Solid-State Imaging System, by Richard W. Johnson et al. The report discusses ground-based imaging systems, their optics and control algorithms, and their initial deployment and preliminary application.

  13. Views of the Mir Space Station during rendezvous

    NASA Image and Video Library

    1997-05-16

    STS084-350-023 (15-24 May 1997) --- A Space Shuttle point-of-view frame showing the docking port and target during rendezvous with Russia's Mir Space Station. The picture should be held horizontally with the retracted Kristall solar array at top. Other elements partially visible are Kvant-2 (left), Spektr (right) and Core Module (bottom).

  14. 46 CFR 32.16-1 - Navigation bridge visibility-T/ALL.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., must meet the following requirements: (a) The field of vision from the navigation bridge, whether the... degrees. (2) From the conning position, the horizontal field of vision extends over an arc from at least...) From each bridge wing, the field of vision extends over an arc from at least 45 degrees on the opposite...

  15. 33 CFR 164.15 - Navigation bridge visibility.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... ports must be such that the field of vision from the navigation bridge conforms as closely as possible... horizontal field of vision must extend over an arc from at least 22.5 degrees abaft the beam on one side of... of vision must extend over an arc from at least 45 degrees on the opposite bow, through dead ahead...

  16. 33 CFR 164.15 - Navigation bridge visibility.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... ports must be such that the field of vision from the navigation bridge conforms as closely as possible... horizontal field of vision must extend over an arc from at least 22.5 degrees abaft the beam on one side of... of vision must extend over an arc from at least 45 degrees on the opposite bow, through dead ahead...

  17. Ends in Themselves: Theorizing the Practice of University-School Partnering through Horizontalidad

    ERIC Educational Resources Information Center

    Campano, Gerald; Honeyford, Michelle A.; Sanchez, Lenny; Zanden, Sarah Vander

    2010-01-01

    In this article we share our current thinking about the methodology of collaborations for change and make visible our own attempts to theorize the practice of university-school partnering. We suggest that a fruitful new direction for research may involve turning to the Global South and the Latin American idea of horizontalidad [horizontalism],…

  18. Electrostatic Deformation of Liquid Surfaces by a Charged Rod and a Van De Graaff Generator

    ERIC Educational Resources Information Center

    Slisko, Josip; García-Molina, Rafael; Abril, Isabel

    2014-01-01

    Authors of physics textbooks frequently use the deflection of a thin, vertically falling water jet by a charged balloon, comb, or rod as a visually appealing and conceptually relevant example of electrostatic attraction. Nevertheless, no attempts are made to explore whether these charged bodies could cause visible deformation of a horizontal water…

  19. Decay Times and Quality Factors for a Resonance Apparatus

    ERIC Educational Resources Information Center

    Stephens, Heather; Tam, Austin; Moloney, Michael

    2011-01-01

    The commercial resonance demonstration apparatus shown in Fig. 1 exhibits curious behavior. It consists of three pairs of slender spring-steel rods attached to a horizontal bar. When one of the rods is pulled aside and released, the rod of corresponding length is excited into visible motion, but the other rods remain apparently stationary. This…

  20. Increase in the Accuracy of Calculating Length of Horizontal Cable SCS in Civil Engineering

    NASA Astrophysics Data System (ADS)

    Semenov, A.

    2017-11-01

    A modification of the method for calculating horizontal cable consumption in SCS installed at civil engineering facilities is proposed. The proposed procedure preserves the simplicity of the prototype and provides a 5 percent increase in accuracy. The achieved accuracy values are justified, and their agreement with the practice of real projects is demonstrated. The method is developed to the level of an engineering algorithm and formalized in the form of the 12/70 rule.

  1. A fast fusion scheme for infrared and visible light images in NSCT domain

    NASA Astrophysics Data System (ADS)

    Zhao, Chunhui; Guo, Yunting; Wang, Yulei

    2015-09-01

    Fusion of infrared and visible light images is an effective way to obtain a simultaneous visualization of the background detail provided by the visible light image and the hidden-target information provided by the infrared image, which is more suitable for browsing and further processing. Two crucial components of infrared and visible light image fusion are improving fusion performance and reducing computational burden. In this paper, a novel fusion algorithm named pixel information estimation is proposed, which determines fusion weights by evaluating the information of each pixel and is well suited to visible light and infrared image fusion, with better fusion quality and lower time consumption. Besides, a fast realization of the non-subsampled contourlet transform is also proposed to improve computational efficiency. To verify the advantage of the proposed method, this paper compares it with several popular methods on six evaluation metrics over four different image groups. Experimental results show that the proposed algorithm achieves more effective results with much less time consumption and performs well in both subjective evaluation and objective indicators.
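    A minimal sketch of per-pixel weighted fusion, using local variance as a simple stand-in for the paper's pixel-information estimate (the actual estimator and the NSCT decomposition are not reproduced here; images are random placeholders):

```python
import numpy as np

def local_variance(img, radius=2):
    """Local variance over a sliding window; a simple information proxy."""
    pad = np.pad(img, radius, mode='reflect')
    win = np.lib.stride_tricks.sliding_window_view(pad, (2*radius+1, 2*radius+1))
    return win.var(axis=(-1, -2))

def fuse(visible, infrared, eps=1e-6):
    """Weight each pixel by its local variance in each source image."""
    wv = local_variance(visible)
    wi = local_variance(infrared)
    return (wv * visible + wi * infrared) / (wv + wi + eps)

rng = np.random.default_rng(1)
vis = rng.random((64, 64))   # placeholder visible-light image
ir = rng.random((64, 64))    # placeholder infrared image
fused = fuse(vis, ir)
print(fused.shape, float(fused.min()), float(fused.max()))
```

    Pixels that carry more local structure in one modality dominate the fused output there, which is the general intuition behind information-driven weighting.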

  2. Visible Light Image-Based Method for Sugar Content Classification of Citrus

    PubMed Central

    Wang, Xuefeng; Wu, Chunyan; Hirafuji, Masayuki

    2016-01-01

    Visible light imaging of citrus fruit from Mie Prefecture, Japan, was performed to determine whether an algorithm could be developed to predict sugar content. This nondestructive classification showed that accurate segmentation of different images can be realized by a correlation analysis based on a threshold value of the coefficient of determination. There is an obvious correlation between the sugar content of citrus fruit and certain parameters of the color images. The selected image parameters were combined by an addition algorithm, and the sugar content of citrus fruit can then be predicted by the dummy variable method. The results showed that small but orange citrus fruits often have a high sugar content. The study shows that it is possible to predict the sugar content of citrus fruit and to classify fruit by sugar content using light in the visible spectrum, without the need for an additional light source. PMID:26811935
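    A hedged sketch of the dummy-variable idea: ordinary least squares on color features plus a 0/1 size indicator. The features, coefficients, and data below are synthetic inventions for illustration, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in data: per-fruit color features and a size dummy
# (1 = small fruit). The "true" relation mimics "small, orange fruit -> sweeter".
n = 120
r_mean = rng.uniform(0.5, 1.0, n)     # normalized mean red channel
g_mean = rng.uniform(0.3, 0.8, n)     # normalized mean green channel
small = rng.integers(0, 2, n)         # dummy variable
sugar = 8.0 + 4.0 * r_mean - 2.0 * g_mean + 1.5 * small + rng.normal(0, 0.3, n)

# Ordinary least squares with an intercept and the dummy term.
X = np.column_stack([np.ones(n), r_mean, g_mean, small])
beta, *_ = np.linalg.lstsq(X, sugar, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((sugar - pred)**2) / np.sum((sugar - sugar.mean())**2)
print("coefficients:", np.round(beta, 2), "R^2:", round(r2, 3))
```

    The dummy variable lets a categorical attribute (size class) shift the regression intercept without needing a separate model per class.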

  3. Algorithm for Calculating the Dissociation Constants of Ampholytes in Nonbuffer Systems

    NASA Astrophysics Data System (ADS)

    Lysova, S. S.; Skripnikova, T. A.; Zevatskii, Yu. E.

    2018-05-01

    An algorithm for calculating the dissociation constants of ampholytes in aqueous solutions is developed on the basis of spectrophotometric data in the UV and visible ranges without pH measurements of a medium and without buffer solutions. The proposed algorithm has been experimentally tested for five ampholytes of different strengths. The relative error of measuring dissociation constants is less than 5%.

  4. Two Topics in Data Analysis: Sample-based Optimal Transport and Analysis of Turbulent Spectra from Ship Track Data

    NASA Astrophysics Data System (ADS)

    Kuang, Simeng Max

    This thesis contains two topics in data analysis. The first topic consists of the introduction of algorithms for sample-based optimal transport and barycenter problems. In chapter 1, a family of algorithms is introduced to solve both the L2 optimal transport problem and the Wasserstein barycenter problem. Starting from a theoretical perspective, the new algorithms are motivated from a key characterization of the barycenter measure, which suggests an update that reduces the total transportation cost and stops only when the barycenter is reached. A series of general theorems is given to prove the convergence of all the algorithms. We then extend the algorithms to solve sample-based optimal transport and barycenter problems, in which only finite sample sets are available instead of underlying probability distributions. A unique feature of the new approach is that it compares sample sets in terms of the expected values of a set of feature functions, which at the same time induce the function space of optimal maps and can be chosen by users to incorporate their prior knowledge of the data. All the algorithms are implemented and applied to various synthetic examples and practical applications. On synthetic examples it is found that both the SOT algorithm and the SCB algorithm are able to find the true solution and often converge in a handful of iterations. On more challenging applications, including Gaussian mixture models, color transfer and shape transform problems, the algorithms give very good results throughout despite the very different nature of the corresponding datasets. In chapter 2, a preconditioning procedure is developed for the L2 and more general optimal transport problems. The procedure is based on a family of affine map pairs, which transforms the original measures into two new measures that are closer to each other, while preserving the optimality of solutions.
It is proved that the preconditioning procedure minimizes the remaining transportation cost among all admissible affine maps. The procedure can be used on both continuous measures and finite sample sets from distributions. In numerical examples, the procedure is applied to multivariate normal distributions, to a two-dimensional shape transform problem and to color transfer problems. For the second topic, we present an extension to anisotropic flows of the recently developed Helmholtz and wave-vortex decomposition method for one-dimensional spectra measured along ship or aircraft tracks in Buhler et al. (J. Fluid Mech., vol. 756, 2014, pp. 1007-1026). While in the original method the flow was assumed to be homogeneous and isotropic in the horizontal plane, we allow the flow to have a simple kind of horizontal anisotropy that is chosen in a self-consistent manner and can be deduced from the one-dimensional power spectra of the horizontal velocity fields and their cross-correlation. The key result is that an exact and robust Helmholtz decomposition of the horizontal kinetic energy spectrum can be achieved in this anisotropic flow setting, which then also allows the subsequent wave-vortex decomposition step. The new method is developed theoretically and tested with encouraging results on challenging synthetic data as well as on ocean data from the Gulf Stream.
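    A concrete special case that illustrates the affine-map idea: between two Gaussian measures, the L2-optimal transport map is itself affine and known in closed form. The sketch below (not the thesis's preconditioning procedure, just the textbook Gaussian case) builds that map and pushes samples forward.

```python
import numpy as np

def sqrtm_spd(M):
    """Matrix square root of a symmetric positive-definite matrix."""
    w, v = np.linalg.eigh(M)
    return (v * np.sqrt(w)) @ v.T

def gaussian_ot_map(m1, S1, m2, S2):
    """Closed-form L2-optimal (affine) transport map between two Gaussians:
    T(x) = m2 + A (x - m1), A = S1^{-1/2} (S1^{1/2} S2 S1^{1/2})^{1/2} S1^{-1/2}."""
    S1h = sqrtm_spd(S1)
    S1hi = np.linalg.inv(S1h)
    A = S1hi @ sqrtm_spd(S1h @ S2 @ S1h) @ S1hi
    return lambda x: m2 + (x - m1) @ A.T

rng = np.random.default_rng(3)
m1, S1 = np.array([0.0, 0.0]), np.array([[2.0, 0.5], [0.5, 1.0]])
m2, S2 = np.array([3.0, -1.0]), np.array([[1.0, -0.3], [-0.3, 0.5]])
x = rng.multivariate_normal(m1, S1, size=20000)
y = gaussian_ot_map(m1, S1, m2, S2)(x)

print("pushforward mean:", np.round(y.mean(axis=0), 2))
print("pushforward cov:\n", np.round(np.cov(y.T), 2))
```

    One can verify algebraically that A S1 A = S2, so the pushforward of the first Gaussian is exactly the second; sample statistics of `y` reflect this up to Monte Carlo error.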

  5. Determination of thiamine HCl and pyridoxine HCl in pharmaceutical preparations using UV-visible spectrophotometry and genetic algorithm based multivariate calibration methods.

    PubMed

    Ozdemir, Durmus; Dinc, Erdal

    2004-07-01

    Simultaneous determination of binary mixtures of pyridoxine hydrochloride and thiamine hydrochloride in a vitamin combination using UV-visible spectrophotometry with classical least squares (CLS) and three newly developed genetic algorithm (GA) based multivariate calibration methods was demonstrated. The three genetic multivariate calibration methods are Genetic Classical Least Squares (GCLS), Genetic Inverse Least Squares (GILS) and Genetic Regression (GR). The sample data set contains the UV-visible spectra of 30 synthetic mixtures (8 to 40 microg/ml) of these vitamins and 10 tablets containing 250 mg of each vitamin. The spectra cover the range from 200 to 330 nm in 0.1 nm intervals. Several calibration models were built with the four methods for the two components. Overall, the standard error of calibration (SEC) and the standard error of prediction (SEP) for the synthetic data were in the range of <0.01 to 0.43 microg/ml for all four methods. The SEP values for the tablets were in the range of 2.91 to 11.51 mg/tablet. A comparison of genetic-algorithm-selected wavelengths for each component using the GR method is also included.
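    A minimal sketch of the CLS step for a two-component Beer's-law system (A = C K): estimate the pure-component sensitivity matrix K from calibration mixtures, then invert it to predict unknown concentrations. The Gaussian absorption bands below are invented stand-ins for the two vitamins' spectra, not measured data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic two-component system: Gaussian absorption bands stand in for
# the two vitamins' absorptivity spectra; Beer's law gives A = C @ K.
wl = np.linspace(200.0, 330.0, 261)
k1 = np.exp(-0.5 * ((wl - 245.0) / 12.0) ** 2)
k2 = np.exp(-0.5 * ((wl - 290.0) / 15.0) ** 2)
K_true = np.vstack([k1, k2])                     # (2, n_wavelengths)

C_cal = rng.uniform(8.0, 40.0, size=(30, 2))     # 30 calibration mixtures
A_cal = C_cal @ K_true + 1e-3 * rng.standard_normal((30, wl.size))

# CLS step 1: estimate K from the calibration set.
K_hat, *_ = np.linalg.lstsq(C_cal, A_cal, rcond=None)

# CLS step 2: predict concentrations of an unknown mixture from its spectrum.
C_unknown = np.array([[15.0, 25.0]])
A_unknown = C_unknown @ K_true
C_pred, *_ = np.linalg.lstsq(K_hat.T, A_unknown.T, rcond=None)
print("predicted concentrations:", np.round(C_pred.T, 2))
```

    The GA variants in the paper differ mainly in selecting which wavelengths enter this least-squares step; the linear-algebra core stays the same.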

  6. The analysis of polar clouds from AVHRR satellite data using pattern recognition techniques

    NASA Technical Reports Server (NTRS)

    Smith, William L.; Ebert, Elizabeth

    1990-01-01

    The cloud cover in a set of summertime and wintertime AVHRR data from the Arctic and Antarctic regions was analyzed using a pattern recognition algorithm. The data were collected by the NOAA-7 satellite on 6 to 13 Jan. and 1 to 7 Jul. 1984 between 60 deg and 90 deg north and south latitude in 5 spectral channels, at the Global Area Coverage (GAC) resolution of approximately 4 km. This data embodied a Polar Cloud Pilot Data Set which was analyzed by a number of research groups as part of a polar cloud algorithm intercomparison study. This study was intended to determine whether the additional information contained in the AVHRR channels (beyond the standard visible and infrared bands on geostationary satellites) could be effectively utilized in cloud algorithms to resolve some of the cloud detection problems caused by low visible and thermal contrasts in the polar regions. The analysis described makes use of a pattern recognition algorithm which estimates the surface and cloud classification, cloud fraction, and surface and cloudy visible (channel 1) albedo and infrared (channel 4) brightness temperatures on a 2.5 x 2.5 deg latitude-longitude grid. In each grid box several spectral and textural features were computed from the calibrated pixel values in the multispectral imagery, then used to classify the region into one of eighteen surface and/or cloud types using the maximum likelihood decision rule. A slightly different version of the algorithm was used for each season and hemisphere because of differences in categories and because of the lack of visible imagery during winter. The classification of the scene is used to specify the optimal AVHRR channel for separating clear and cloudy pixels using a hybrid histogram-spatial coherence method. This method estimates values for cloud fraction, clear and cloudy albedos and brightness temperatures in each grid box. 
The choice of a class-dependent AVHRR channel allows for better separation of clear and cloudy pixels than does a global choice of a visible and/or infrared threshold. The classification also prevents erroneous estimates of large fractional cloudiness in areas of cloudfree snow and sea ice. The hybrid histogram-spatial coherence technique and the advantages of first classifying a scene in the polar regions are detailed. The complete Polar Cloud Pilot Data Set was analyzed and the results are presented and discussed.
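    The maximum likelihood decision rule mentioned above can be sketched as follows for Gaussian class models. The two classes, their feature means, and covariances are invented toy values (loosely styled on channel-1 albedo and channel-4 brightness temperature), not the study's eighteen trained classes.

```python
import numpy as np

# Two illustrative classes in a 2-feature space (e.g. ch1 albedo, ch4 Tb in K):
# "snow" (bright, cold) vs "water" (dark, warm). Values are toy assumptions.
means = {"snow": np.array([0.8, 245.0]), "water": np.array([0.08, 275.0])}
covs = {"snow": np.diag([0.01, 25.0]), "water": np.diag([0.002, 9.0])}

def log_likelihood(x, m, S):
    """Log of the multivariate Gaussian density (up to nothing; full form)."""
    d = x - m
    Si = np.linalg.inv(S)
    return -0.5 * (d @ Si @ d + np.log(np.linalg.det(S)) + len(m) * np.log(2 * np.pi))

def classify(x):
    """Maximum likelihood decision rule: pick the class maximizing the density."""
    return max(means, key=lambda c: log_likelihood(x, means[c], covs[c]))

samples = [np.array([0.75, 250.0]), np.array([0.1, 272.0])]
print([classify(s) for s in samples])
```

    In practice the class means and covariances are estimated from labeled training regions, and textural features are appended to the spectral ones before classification.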

  7. Parameter estimation by Differential Search Algorithm from horizontal loop electromagnetic (HLEM) data

    NASA Astrophysics Data System (ADS)

    Alkan, Hilal; Balkaya, Çağlayan

    2018-02-01

    We present an efficient inversion tool for parameter estimation from horizontal loop electromagnetic (HLEM) data using the Differential Search Algorithm (DSA), a recently proposed swarm-intelligence-based metaheuristic. The depth, dip, and origin of a thin subsurface conductor causing the anomaly are the parameters estimated by the HLEM method, commonly known as Slingram. The applicability of the developed scheme was first tested on two synthetically generated anomalies with and without noise content. Two control parameters affecting the convergence characteristics of the algorithm were tuned for these anomalies, which include one and two conductive bodies, respectively. The tuned control parameters yielded more successful statistical results than the parameter couples widely used in DSA applications. Two field anomalies measured over a dipping graphitic shale in Northern Australia were then considered, and the algorithm provided depth estimations in good agreement with those of previous studies and drilling information. Furthermore, the efficiency and reliability of the results were investigated via probability density functions. We conclude that DSA, characterized by a simple algorithmic structure, is an efficient and promising metaheuristic for other relatively low-dimensional geophysical inverse problems. Researchers familiar with the developed scheme, which is easy to use and flexible, can readily modify and extend it for their own optimization problems.
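    To illustrate metaheuristic inversion of a low-dimensional anomaly model, the sketch below uses plain differential evolution (not DSA itself, and not the Slingram forward model) on an invented Lorentzian-shaped anomaly with three unknowns: amplitude, origin, and depth.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy forward model: a Lorentzian-shaped anomaly parameterized by
# amplitude, origin and depth (a stand-in for the real Slingram response).
x = np.linspace(-50.0, 50.0, 101)
def forward(p):
    amp, x0, depth = p
    return amp * depth**2 / ((x - x0)**2 + depth**2)

truth = np.array([1.0, 5.0, 10.0])
data = forward(truth)

def misfit(p):
    return np.sum((forward(p) - data)**2)

# Plain differential evolution (a stand-in metaheuristic, not DSA).
lo, hi = np.array([0.1, -20.0, 1.0]), np.array([5.0, 20.0, 30.0])
pop = rng.uniform(lo, hi, size=(40, 3))
F, CR = 0.7, 0.9
for _ in range(300):
    for i in range(len(pop)):
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        trial = np.clip(a + F * (b - c), lo, hi)
        mask = rng.random(3) < CR
        trial = np.where(mask, trial, pop[i])
        if misfit(trial) < misfit(pop[i]):
            pop[i] = trial      # greedy selection keeps the better candidate

best = min(pop, key=misfit)
print("estimated (amplitude, origin, depth):", np.round(best, 2))
```

    Population-based methods like this need no derivatives of the forward model, which is what makes them attractive for geophysical misfit surfaces with awkward gradients.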

  8. Angular and Seasonal Variation of Spectral Surface Reflectance Ratios: Implications for the Remote Sensing of Aerosol over Land

    NASA Technical Reports Server (NTRS)

    Remer, L. A.; Wald, A. E.; Kaufman, Y. J.

    1999-01-01

    We obtain valuable information on the angular and seasonal variability of surface reflectance using a hand-held spectrometer from a light aircraft. The data are used to test a procedure that allows us to estimate visible surface reflectance from the longer wavelength 2.1 micrometer channel (mid-IR). Estimating or avoiding surface reflectance in the visible is a vital first step in most algorithms that retrieve aerosol optical thickness over land targets. The data indicate that specular reflection found when viewing targets from the forward direction can severely corrupt the relationships between the visible and 2.1 micrometer reflectance that were derived from nadir data. There is a month-by-month variation in the ratios between the visible and the mid-IR, weakly correlated with the Normalized Difference Vegetation Index (NDVI). If specular reflection is not avoided, the errors resulting from estimating surface reflectance from the mid-IR exceed the acceptable limit of Δρ ≈ 0.01 in roughly 40% of the cases, using the current algorithm. This is reduced to 25% of the cases if specular reflection is avoided. An alternative method that uses path radiance rather than explicitly estimating visible surface reflectance results in similar errors. The two methods have different strengths and weaknesses that require further study.
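    A minimal sketch of the ratio-based estimate, assuming the widely used MODIS-heritage nadir relationships (red ≈ ρ2.1/2, blue ≈ ρ2.1/4; the exact ratios used in the study may differ), together with a check against the Δρ ≈ 0.01 limit quoted above:

```python
def estimate_visible_reflectance(rho_mid_ir):
    """MODIS-heritage nadir ratios (assumed): red ~ rho2.1/2, blue ~ rho2.1/4."""
    return rho_mid_ir / 2.0, rho_mid_ir / 4.0

def within_limit(estimated, actual, limit=0.01):
    """Is the estimate within the acceptable Delta-rho limit?"""
    return abs(estimated - actual) <= limit

red_est, blue_est = estimate_visible_reflectance(0.10)  # hypothetical 2.1 um value
print(red_est, blue_est, within_limit(red_est, 0.055))  # 0.055: hypothetical truth
```

    The abstract's point is that off-nadir specular reflection breaks these nadir-derived ratios, pushing the error past the 0.01 limit in a large fraction of cases.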

  9. [Analysis of visible extinction spectrum of particle system and selection of optimal wavelength].

    PubMed

    Sun, Xiao-gang; Tang, Hong; Yuan, Gui-bin

    2008-09-01

    In the total light scattering particle sizing technique, the extinction spectrum of a particle system contains information about the particle size and refractive index. The visible extinction spectra of common monomodal and bimodal R-R particle size distributions were computed, and the variation in the visible extinction spectrum with particle size and refractive index was analyzed. The wavelengths at which the second-order differential extinction spectrum was discontinuous were selected as measurement wavelengths. Furthermore, the minimum and maximum wavelengths in the visible region were also selected as measurement wavelengths. The genetic algorithm was used as the inversion method under the dependent model. Computer simulation and experiments illustrate that it is feasible to analyze the extinction spectrum and use this selection method of the optimal wavelength in total light scattering particle sizing. The rough contour of the particle size distribution can be determined after analysis of the visible extinction spectrum, so the search range of the particle size parameter is reduced in the optimization algorithm, and a more accurate inversion result can then be obtained using the selection method. The inversion results of monomodal and bimodal distributions remain satisfactory when 1% stochastic noise is added to the transmission extinction measurement values.
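    A rough sketch of the wavelength-selection step on a smooth synthetic spectrum: since a numerically sampled spectrum has no true discontinuities, "discontinuity of the second-order differential" is approximated here by the largest third-derivative magnitude, and the band endpoints are added as in the abstract. The spectrum shape is invented, not a computed R-R extinction spectrum.

```python
import numpy as np

# Synthetic extinction spectrum over the visible band (arbitrary shape).
wl = np.linspace(400.0, 760.0, 181)           # nm
ext = np.exp(-((wl - 520.0) / 60.0)**2) + 0.4 * np.exp(-((wl - 650.0) / 30.0)**2)

# Second-order differential of the spectrum.
d2 = np.gradient(np.gradient(ext, wl), wl)

# Candidate measurement wavelengths: points where the second differential
# changes most sharply (largest |third derivative|), as a discontinuity proxy.
d3 = np.abs(np.gradient(d2, wl))
candidates = wl[np.argsort(d3)[-4:]]

# Plus the minimum and maximum wavelengths of the visible region.
selected = np.sort(np.concatenate([[wl[0], wl[-1]], candidates]))
print("measurement wavelengths (nm):", np.round(selected, 1))
```

    These selected wavelengths would then feed the genetic-algorithm inversion as the measurement set.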

  10. Guided Growth of Horizontal p-Type ZnTe Nanowires

    PubMed Central

    2016-01-01

    A major challenge toward large-scale integration of nanowires is the control over their alignment and position. A possible solution to this challenge is the guided growth process, which enables the synthesis of well-aligned horizontal nanowires that grow according to specific epitaxial or graphoepitaxial relations with the substrate. However, the guided growth of horizontal nanowires was demonstrated for a limited number of materials, most of which exhibit unintentional n-type behavior. Here we demonstrate the vapor–liquid–solid growth of guided horizontal ZnTe nanowires and nanowalls displaying p-type behavior on four different planes of sapphire. The growth directions of the nanowires are determined by epitaxial relations between the nanowires and the substrate or by a graphoepitaxial effect that guides their growth along nanogrooves or nanosteps along the surface. We characterized the crystallographic orientations and elemental composition of the nanowires using transmission electron microscopy and photoluminescence. The optoelectronic and electronic properties of the nanowires were studied by fabricating photodetectors and top-gate thin film transistors. These measurements showed that the guided ZnTe nanowires are p-type semiconductors and are photoconductive in the visible range. The guided growth of horizontal p-type nanowires opens up the possibility of parallel nanowire integration into functional systems with a variety of potential applications not available by other means. PMID:27885331

  11. Guided Growth of Horizontal p-Type ZnTe Nanowires.

    PubMed

    Reut, Gilad; Oksenberg, Eitan; Popovitz-Biro, Ronit; Rechav, Katya; Joselevich, Ernesto

    2016-08-04

    A major challenge toward large-scale integration of nanowires is the control over their alignment and position. A possible solution to this challenge is the guided growth process, which enables the synthesis of well-aligned horizontal nanowires that grow according to specific epitaxial or graphoepitaxial relations with the substrate. However, the guided growth of horizontal nanowires was demonstrated for a limited number of materials, most of which exhibit unintentional n-type behavior. Here we demonstrate the vapor-liquid-solid growth of guided horizontal ZnTe nanowires and nanowalls displaying p-type behavior on four different planes of sapphire. The growth directions of the nanowires are determined by epitaxial relations between the nanowires and the substrate or by a graphoepitaxial effect that guides their growth along nanogrooves or nanosteps along the surface. We characterized the crystallographic orientations and elemental composition of the nanowires using transmission electron microscopy and photoluminescence. The optoelectronic and electronic properties of the nanowires were studied by fabricating photodetectors and top-gate thin film transistors. These measurements showed that the guided ZnTe nanowires are p-type semiconductors and are photoconductive in the visible range. The guided growth of horizontal p-type nanowires opens up the possibility of parallel nanowire integration into functional systems with a variety of potential applications not available by other means.

  12. Effects of Inboard Horizontal Field of View Display Limitations on Pilot Path Control During Total In-Flight Simulator (TIFS) Flight Test

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Parrish, Russell V.; Williams, Steven P.; Lavell, Jeffrey S.

    1999-01-01

    A flight test was conducted aboard Calspan's Total In-Flight Simulator (TIFS) aircraft by researchers within the External Visibility System (XVS) element of the High-Speed Research program. The purpose was to investigate the effects of inboard horizontal field of view (FOV) display limitations on pilot path control and to learn about the TIFS capabilities and limitations for possible use in future XVS flight tests. The TIFS cockpit windows were masked to represent the front XVS display area and the High-Speed Civil Transport side windows, as viewed by the pilot. Masking limited the forward FOV to 40 deg. horizontal and 50 deg. vertical for the basic flight condition, with an increase of 10 deg. horizontal in the inboard direction for the increased-FOV flight condition. Two right-hand approach tasks (base-downwind-final) with a left crosswind on final were performed by three pilots using visual flight rules at Niagara Falls Airport. Each of the two tasks had three replicates for both horizontal FOV conditions, resulting in twelve approaches per test subject. Limited objective data showed that the increase in inboard FOV had no effect (deficiencies in objective data measurement capabilities were noted). However, subjective results showed that the 50 deg. FOV was preferred over the 40 deg. FOV.

  13. Cross-modal face recognition using multi-matcher face scores

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Blasch, Erik

    2015-05-01

    The performance of face recognition can be improved using information fusion of multimodal images and/or multiple algorithms. When multimodal face images are available, cross-modal recognition is meaningful for security and surveillance applications. For example, a probe face is a thermal image (especially at nighttime), while only visible face images are available in the gallery database. Matching a thermal probe face onto the visible gallery faces requires crossmodal matching approaches. A few such studies were implemented in facial feature space with medium recognition performance. In this paper, we propose a cross-modal recognition approach, where multimodal faces are cross-matched in feature space and the recognition performance is enhanced with stereo fusion at image, feature and/or score level. In the proposed scenario, there are two cameras for stereo imaging, two face imagers (visible and thermal images) in each camera, and three recognition algorithms (circular Gaussian filter, face pattern byte, linear discriminant analysis). A score vector is formed with three cross-matched face scores from the aforementioned three algorithms. A classifier (e.g., k-nearest neighbor, support vector machine, binomial logical regression [BLR]) is trained then tested with the score vectors by using 10-fold cross validations. The proposed approach was validated with a multispectral stereo face dataset from 105 subjects. Our experiments show very promising results: ACR (accuracy rate) = 97.84%, FAR (false accept rate) = 0.84% when cross-matching the fused thermal faces onto the fused visible faces by using three face scores and the BLR classifier.
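    A minimal sketch of score-level fusion: classify 3-element score vectors (one score per matcher) with a plain k-nearest-neighbor rule. The score distributions are synthetic stand-ins, a holdout split replaces the paper's 10-fold cross-validation, and the three matchers are only represented by their score outputs.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 3-element score vectors from three matchers:
# genuine (same subject) pairs score higher than impostor pairs.
genuine = rng.normal(0.7, 0.1, size=(200, 3))
impostor = rng.normal(0.4, 0.1, size=(200, 3))
X = np.vstack([genuine, impostor])
y = np.array([1] * 200 + [0] * 200)

def knn_predict(X_train, y_train, x, k=5):
    """Majority vote among the k nearest training score vectors."""
    idx = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    return int(np.round(y_train[idx].mean()))

# Simple holdout split instead of the paper's 10-fold cross-validation.
perm = rng.permutation(len(X))
train, test = perm[:300], perm[300:]
correct = sum(knn_predict(X[train], y[train], X[t]) == y[t] for t in test)
print(f"accuracy: {correct / len(test):.2%}")
```

    Fusing at the score level keeps the classifier input tiny (one number per matcher), which is why several different recognition algorithms can be combined cheaply.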

  14. Parallel SOR methods with a parabolic-diffusion acceleration technique for solving an unstructured-grid Poisson equation on 3D arbitrary geometries

    NASA Astrophysics Data System (ADS)

    Zapata, M. A. Uh; Van Bang, D. Pham; Nguyen, K. D.

    2016-05-01

    This paper presents a parallel algorithm for the finite-volume discretisation of the Poisson equation on three-dimensional arbitrary geometries. The proposed method is formulated by using a 2D horizontal block domain decomposition and interprocessor data communication techniques with message passing interface. The horizontal unstructured-grid cells are reordered according to the neighbouring relations and decomposed into blocks using a load-balanced distribution to give all processors an equal amount of elements. In this algorithm, two parallel successive over-relaxation methods are presented: a multi-colour ordering technique for unstructured grids based on distributed memory and a block method using reordering index following similar ideas of the partitioning for structured grids. In all cases, the parallel algorithms are implemented with a combination of an acceleration iterative solver. This solver is based on a parabolic-diffusion equation introduced to obtain faster solutions of the linear systems arising from the discretisation. Numerical results are given to evaluate the performances of the methods showing speedups better than linear.
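    The multi-colour SOR idea can be sketched on a structured grid, where it reduces to classical red-black ordering: all points of one colour depend only on points of the other colour, so each colour can be updated in parallel. This is a serial, structured-grid analogue of the paper's unstructured multi-colour scheme, not its implementation (and the parabolic-diffusion acceleration is omitted).

```python
import numpy as np

# Red-black SOR for the 2-D Poisson problem -lap(u) = f on the unit square
# with zero Dirichlet boundaries. Within each colour, the update is fully
# parallelizable, which is the point of multi-colour ordering.
n = 32
h = 1.0 / (n + 1)
f = np.ones((n, n))
u = np.zeros((n + 2, n + 2))        # includes the zero boundary ring
omega = 1.7                          # over-relaxation factor

ii, jj = np.meshgrid(np.arange(1, n + 1), np.arange(1, n + 1), indexing="ij")
for sweep in range(500):
    for colour in (0, 1):            # 0 = "red" points, 1 = "black" points
        mask = ((ii + jj) % 2 == colour)
        gs = 0.25 * (u[ii + 1, jj] + u[ii - 1, jj] + u[ii, jj + 1] + u[ii, jj - 1]
                     + h * h * f[ii - 1, jj - 1])
        u[1:-1, 1:-1][mask] = ((1 - omega) * u[1:-1, 1:-1] + omega * gs)[mask]

# Residual of the discrete system should be near zero after convergence.
lap = (4 * u[1:-1, 1:-1] - u[2:, 1:-1] - u[:-2, 1:-1]
       - u[1:-1, 2:] - u[1:-1, :-2]) / (h * h)
print("max residual:", float(np.abs(lap - f).max()))
```

    On an unstructured grid the same idea needs a graph colouring of the cells, which is what the paper's reordering step provides.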

  15. Reconciliation of Gene and Species Trees

    PubMed Central

    Rusin, L. Y.; Lyubetskaya, E. V.; Gorbunov, K. Y.; Lyubetsky, V. A.

    2014-01-01

    The first part of the paper briefly overviews the problem of gene and species trees reconciliation with the focus on defining and algorithmic construction of the evolutionary scenario. Basic ideas are discussed for the aspects of mapping definitions, costs of the mapping and evolutionary scenario, imposing time scales on a scenario, incorporating horizontal gene transfers, binarization and reconciliation of polytomous trees, and construction of species trees and scenarios. The review does not intend to cover the vast diversity of literature published on these subjects. Instead, the authors strove to overview the problem of the evolutionary scenario as a central concept in many areas of evolutionary research. The second part provides detailed mathematical proofs for the solutions of two problems: (i) inferring a gene evolution along a species tree accounting for various types of evolutionary events and (ii) reconciling trees into a single species tree when only gene duplications and losses are allowed. All proposed algorithms have a cubic time complexity and are mathematically proved to find exact solutions. Solving algorithms for problem (ii) can be naturally extended to incorporate horizontal transfers, other evolutionary events, and time scales on the species tree. PMID:24800245

  16. Parallel text rendering by a PostScript interpreter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kritskii, S.P.; Zastavnoi, B.A.

    1994-11-01

    The most radical method of increasing the performance of devices controlled by PostScript interpreters may be the use of multiprocessor controllers. This paper presents a method for parallelizing the operation of a PostScript interpreter for rendering text. The proposed method is based on decomposition of the outlines of letters into horizontal strips covering equal areas. The subroutines thus obtained are distributed to the processors in a network and then filled in by conventional sequential algorithms. A special algorithm has been developed for dividing the outlines of characters into subroutines so that each may be colored independently of the others. The algorithm uses special estimates to verify that the partition is correct, so that the corresponding outlines are divided into horizontal strips. A method is presented for finding such estimates. Two different processing approaches are presented. In the first, one of the processors performs the decomposition of the outlines and distributes the strips to the remaining processors, which are responsible for the rendering. In the second approach, the decomposition process is itself distributed among the processors in the network.

  17. Lunar Pole Illumination and Communications Maps Computed from GSSR Elevation Data

    NASA Technical Reports Server (NTRS)

    Bryant, Scott

    2009-01-01

    A Digital Elevation Model of the lunar south pole was produced using Goldstone Solar System RADAR (GSSR) data obtained in 2006. This model has 40-meter horizontal resolution and about 5-meter relative vertical accuracy. The Digital Elevation Model was used to compute average solar illumination and Earth visibility within 100 kilometers of the lunar south pole. The elevation data were converted into local terrain horizon masks, then converted into lunar-centric latitude and longitude coordinates. The horizon masks were compared to the latitude-longitude regions bounding the maximum Sun and Earth motions relative to the Moon. Estimates of Earth visibility were computed by integrating the area of the region bounding the Earth's motion that lay below the horizon mask. Solar illumination and other metrics were computed similarly. Proposed lunar south pole base sites were examined in detail, with the best site showing yearly solar power availability of 92 percent and Direct-To-Earth (DTE) communication availability of about 50 percent. A similar analysis of the lunar south pole used an older GSSR Digital Elevation Model with 600-meter horizontal resolution. The paper also explores using a heliostat to reduce the photovoltaic power system mass and complexity.
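
    The horizon-mask step lends itself to a compact sketch: march outward from the site along each azimuth, keep the highest elevation angle encountered, and test a target direction against the mask. A toy version, assuming a small regular DEM, ignoring Earth/Moon curvature, and using illustrative helper names:

```python
import numpy as np

def horizon_mask(dem, cell_m, site, n_az=360):
    """Horizon elevation angle (rad, floored at 0) per azimuth bin for one DEM site."""
    si, sj = site
    z0 = dem[si, sj]
    nr, nc = dem.shape
    mask = np.zeros(n_az)
    for k in range(n_az):
        az = 2.0 * np.pi * k / n_az
        di, dj = np.cos(az), np.sin(az)
        step = 1
        while True:
            i = int(round(si + step * di))
            j = int(round(sj + step * dj))
            if not (0 <= i < nr and 0 <= j < nc):
                break                      # ray left the DEM
            r = step * cell_m              # horizontal distance to this cell
            mask[k] = max(mask[k], np.arctan2(dem[i, j] - z0, r))
            step += 1
    return mask

def visible(mask, az_rad, el_rad):
    """A target at (azimuth, elevation) is visible when it clears the mask."""
    k = int(round(az_rad / (2.0 * np.pi) * len(mask))) % len(mask)
    return el_rad > mask[k]
```

    Averaging `visible` over the band of (azimuth, elevation) positions swept by the Sun or the Earth gives the illumination and DTE availability fractions described above.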

  18. Radiative transfer simulations of the two-dimensional ocean glint reflectance and determination of the sea surface roughness.

    PubMed

    Lin, Zhenyi; Li, Wei; Gatebe, Charles; Poudyal, Rajesh; Stamnes, Knut

    2016-02-20

    An optimized discrete-ordinate radiative transfer model (DISORT3) with a pseudo-two-dimensional bidirectional reflectance distribution function (BRDF) is used to simulate and validate ocean glint reflectances at an infrared wavelength (1036 nm) by matching model results with a complete set of BRDF measurements obtained from the NASA cloud absorption radiometer (CAR) deployed on an aircraft. The surface roughness is then obtained through a retrieval algorithm and is used to extend the simulation into the visible spectral range where diffuse reflectance becomes important. In general, the simulated reflectances and surface roughness information are in good agreement with the measurements, and the diffuse reflectance in the visible, ignored in current glint algorithms, is shown to be important. The successful implementation of this new treatment of ocean glint reflectance and surface roughness in DISORT3 will help improve glint correction algorithms in current and future ocean color remote sensing applications.

  19. Radiative Transfer Simulations of the Two-Dimensional Ocean Glint Reflectance and Determination of the Sea Surface Roughness

    NASA Technical Reports Server (NTRS)

    Lin, Zhenyi; Li, Wei; Gatebe, Charles; Poudyal, Rajesh; Stamnes, Knut

    2016-01-01

    An optimized discrete-ordinate radiative transfer model (DISORT3) with a pseudo-two-dimensional bidirectional reflectance distribution function (BRDF) is used to simulate and validate ocean glint reflectances at an infrared wavelength (1036 nm) by matching model results with a complete set of BRDF measurements obtained from the NASA cloud absorption radiometer (CAR) deployed on an aircraft. The surface roughness is then obtained through a retrieval algorithm and is used to extend the simulation into the visible spectral range where diffuse reflectance becomes important. In general, the simulated reflectances and surface roughness information are in good agreement with the measurements, and the diffuse reflectance in the visible, ignored in current glint algorithms, is shown to be important. The successful implementation of this new treatment of ocean glint reflectance and surface roughness in DISORT3 will help improve glint correction algorithms in current and future ocean color remote sensing applications.

  20. The use of a laser ceilometer for sky condition determination

    NASA Astrophysics Data System (ADS)

    Nadolski, Vickie L.; Bradley, James T.

    The use of a laser ceilometer for determining sky condition is presented, with emphasis on the operation of the ceilometer, the sky-condition-reporting algorithm, and how the laser ceilometer and the sky-condition algorithm are used together to give a report suitable for aircraft operations and meteorological applications. The sampling and processing features of the Vaisala ceilometer produced a detailed and accurate cloud base 'signature' by taking 254 measurement samples of the energy scattered back from a single laser pulse as the pulse traveled from the surface to 12,000 ft. The transit time from the projection of the laser pulse to its backscattering from a cloud element and subsequent return to a collocated receiver is measured and a cloud height element computed. Attention is given to the development of a vertical visibility concept and of a vertical-visibility algorithm, as well as the strengths and limitations of the sky condition report.
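
    The range-gating arithmetic is simple enough to sketch: each of the 254 backscatter samples corresponds to a height bin, and cloud base is taken as the first bin whose return exceeds a threshold. A toy version; the threshold, bin scheme, and function names are illustrative, not Vaisala's actual processing:

```python
C_M_PER_S = 299_792_458.0   # speed of light
FT_PER_M = 3.28084

def cloud_base_ft(backscatter, max_height_ft=12_000.0, threshold=5.0):
    """Height (ft) of the first range bin whose backscatter exceeds a threshold."""
    bin_ft = max_height_ft / len(backscatter)
    for k, s in enumerate(backscatter):
        if s > threshold:
            return (k + 0.5) * bin_ft    # bin centre
    return None                          # no cloud detected

def round_trip_time_s(height_ft):
    """Two-way travel time of the laser pulse to a reflector at height_ft."""
    return 2.0 * (height_ft / FT_PER_M) / C_M_PER_S
```

    The full 12,000 ft round trip takes only about 24 microseconds, which is why all 254 samples of one pulse can be gated from a single shot.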

  1. Conceptual optimization using genetic algorithms for tube in tube structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pârv, Bianca Roxana; Hulea, Radu; Mojolic, Cristian

    2015-03-10

    The purpose of this article is to optimize tube-in-tube structural systems for tall buildings under horizontal wind loads. The horizontal wind load is well known to be the main criterion when choosing the structural system and the types and dimensions of structural elements in the majority of tall buildings. Thus, the structural response under horizontal wind loads is analyzed for 40-story buildings with a total height of 120 meters; the horizontal dimensions are 30 m × 30 m for the first two optimization problems and 15 m × 15 m for the third. The optimization problems take the cross-section area as the objective function, the building displacement < the admissible displacement (H/500) as the restriction, and the cross-section dimensions of the structural elements as the variables.

  2. Vorticity and divergence in the solar photosphere

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Noyes, Robert W.; Tarbell, Theodore D.; Title, Alan M.

    1995-01-01

    We have studied an outstanding sequence of continuum images of the solar granulation from Pic du Midi Observatory. We have calculated the horizontal vector flow field using a correlation tracking algorithm, and from this determined three scalar fields: the vertical component of the curl, the horizontal divergence, and the horizontal flow speed. The divergence field has substantially longer coherence time and more power than does the curl field. Statistically, curl is better correlated with regions of negative divergence - that is, the vertical vorticity is higher in downflow regions, suggesting excess vorticity in intergranular lanes. The average value of the divergence is largest (i.e., outflow is largest) where the horizontal speed is large; we associate these regions with exploding granules. A numerical simulation of general convection also shows similar statistical differences between curl and divergence. Some individual small bright points in the granulation pattern show large local vorticities.
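
    Deriving the three scalar fields from a gridded horizontal flow is a short central-difference computation. A minimal sketch, assuming a regular grid with the flow components (vx, vy) already produced by the tracking step (which is not shown):

```python
import numpy as np

def flow_scalars(vx, vy, dx=1.0):
    """Divergence, vertical curl, and speed of a horizontal flow on a regular grid.

    Arrays are indexed [y, x], so axis=1 differentiates along x and axis=0 along y.
    """
    div = np.gradient(vx, dx, axis=1) + np.gradient(vy, dx, axis=0)    # dvx/dx + dvy/dy
    curl = np.gradient(vy, dx, axis=1) - np.gradient(vx, dx, axis=0)   # dvy/dx - dvx/dy
    speed = np.hypot(vx, vy)
    return div, curl, speed
```

    As a sanity check, a pure radial outflow (vx = x, vy = y) has constant divergence 2 and zero curl, while solid-body rotation (vx = -y, vy = x) has zero divergence and constant curl 2.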

  3. Invisible data matrix detection with smart phone using geometric correction and Hough transform

    NASA Astrophysics Data System (ADS)

    Sun, Halit; Uysalturk, Mahir C.; Karakaya, Mahmut

    2016-04-01

    Two-dimensional data matrices are used in many different areas to provide quick and automatic data entry into computer systems. Their most common usage is to automatically read labeled products (books, medicines, food, etc.) and recognize them. In Turkey, alcoholic beverages and tobacco products are labeled and tracked with invisible data matrices for public safety and tax purposes. In this application, since the data matrices are printed on a special paper with a pigmented ink, they cannot be seen under daylight. When red LEDs are utilized for illumination and the reflected light is filtered, invisible data matrices become visible and can be decoded by special barcode readers. Owing to the physical dimensions and price of such readers, and the special training required to use them, cheap, small-sized, and easily carried domestic mobile invisible data matrix reader systems are needed that can be delivered to every inspector in the law enforcement units. In this paper, we first developed an apparatus attached to the smartphone that includes a red LED light and a high-pass filter. Then, we developed an algorithm to process the images captured by smartphones and to decode all information stored in the invisible data matrix images. The proposed algorithm mainly involves four stages. In the first step, the data matrix code is processed with the Hough transform to find the "L"-shaped finder pattern. In the second step, the borders of the data matrix are found using convex hull and corner detection methods. Afterwards, the distortion of the invisible data matrix is corrected by a geometric correction technique and the size of every module is normalized to a rectangular shape. Finally, the invisible data matrix is scanned line by line along the horizontal axis to decode it. Based on the results obtained from real test images of invisible data matrices captured with a smartphone, the proposed algorithm shows high accuracy and a low error rate.

  4. Detection of Intracranial Signatures of Interictal Epileptiform Discharges from Concurrent Scalp EEG.

    PubMed

    Spyrou, Loukianos; Martín-Lopez, David; Valentín, Antonio; Alarcón, Gonzalo; Sanei, Saeid

    2016-06-01

    Interictal epileptiform discharges (IEDs) are transient neural electrical activities that occur in the brain of patients with epilepsy. A problem with the inspection of IEDs from the scalp electroencephalogram (sEEG) is that for a subset of epileptic patients there are no visually discernible IEDs on the scalp, rendering scalp-based inspection ineffective, both for detection purposes and for algorithm evaluation. On the other hand, intracranially placed electrodes yield a much higher incidence of visible IEDs than concurrent scalp electrodes. In this work, we utilize concurrent scalp and intracranial EEG (iEEG) from a group of temporal lobe epilepsy (TLE) patients with a low number of scalp-visible IEDs. The aim is to determine whether, by considering the timing information of the IEDs from iEEG, the concurrent sEEG contains enough information for the IEDs to be reliably distinguished from non-IED segments. We develop an automatic detection algorithm which is tested in a leave-subject-out fashion, where each test subject's detection algorithm is based on the other patients' data. The algorithm obtained a [Formula: see text] accuracy in recognizing scalp IED from non-IED segments, with [Formula: see text] accuracy when trained and tested on the same subject. It was also able to identify non-scalp-visible IED events for most patients with a low number of false positive detections. Our results represent a proof of concept that IED information for TLE patients is contained in scalp EEG even if it is not visually identifiable, and also that between-subject differences in the IED topology and shape are small enough that a generic algorithm can be used.

  5. Concept development for the ITER equatorial port visible/infrared wide angle viewing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reichle, R.; Beaumont, B.; Boilson, D.

    2012-10-15

    The ITER equatorial port visible/infrared wide angle viewing system concept is developed from the measurement requirements. The proposed solution situates 4 viewing systems in the equatorial ports 3, 9, 12, and 17 with 4 views each (looking at the upper target, the inner divertor, and tangentially left and right). This gives sufficient coverage. The spatial resolution of the divertor view is 2 times higher than that of the other views. To compensate for vacuum-vessel movements, an optical hinge concept is proposed. Compactness and low neutron streaming are achieved by orienting the port plug doglegs horizontally. Calibration methods, risks, and R&D topics are outlined.

  6. Evaluating some computer enhancement algorithms that improve the visibility of cometary morphology

    NASA Technical Reports Server (NTRS)

    Larson, Stephen M.; Slaughter, Charles D.

    1992-01-01

    Digital enhancement of cometary images is a necessary tool in studying cometary morphology. Many image processing algorithms, some developed specifically for comets, have been used to enhance the subtle, low contrast coma and tail features. We compare some of the most commonly used algorithms on two different images to evaluate their strong and weak points, and conclude that there currently exists no single 'ideal' algorithm, although the radial gradient spatial filter gives the best overall result. This comparison should aid users in selecting the best algorithm to enhance particular features of interest.

  7. Analyzing Visibility Configurations.

    PubMed

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the extracted feature vectors. Our method enables perceptually motivated level-of-detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes as well as in real applications.

  8. The analysis of the pilot's cognitive and decision processes

    NASA Technical Reports Server (NTRS)

    Curry, R. E.

    1975-01-01

    Articles are presented on pilot performance in zero-visibility precision approach, failure detection by pilots during automatic landing, experiments in pilot decision-making during simulated low visibility approaches, a multinomial maximum likelihood program, and a random search algorithm for laboratory computers. Other topics discussed include detection of system failures in multi-axis tasks and changes in pilot workload during an instrument landing.

  9. Superscattering of light optimized by a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Mirzaei, Ali; Miroshnichenko, Andrey E.; Shadrivov, Ilya V.; Kivshar, Yuri S.

    2014-07-01

    We analyse scattering of light from multi-layer plasmonic nanowires and employ a genetic algorithm for optimizing the scattering cross section. We apply the mode-expansion method using experimental data for material parameters to demonstrate that our genetic algorithm allows designing realistic core-shell nanostructures with the superscattering effect achieved at any desired wavelength. This approach can be employed for optimizing both superscattering and cloaking at different wavelengths in the visible spectral range.

  10. Fusion of infrared and visible images based on saliency scale-space in frequency domain

    NASA Astrophysics Data System (ADS)

    Chen, Yanfei; Sang, Nong; Dan, Zhiping

    2015-12-01

    A fusion algorithm for infrared and visible images based on saliency scale-space in the frequency domain is proposed. Human attention is directed towards salient targets, which convey the most important information in the image. For the given registered infrared and visible images, visual features are first extracted to obtain the input hypercomplex matrix. Secondly, the Hypercomplex Fourier Transform (HFT) is used to obtain the salient regions of the infrared and visible images: the amplitude spectrum of the input hypercomplex matrix is convolved with a low-pass Gaussian kernel of an appropriate scale, which is equivalent to an image saliency detector. The saliency maps are obtained by reconstructing the 2D signal using the original phase and the amplitude spectrum, filtered at the scale selected by minimizing saliency-map entropy. Thirdly, the salient regions are fused with adaptive weighting fusion rules, and the non-salient regions are fused with a rule based on region energy (RE) and region sharpness (RS); the fused image is then obtained. Experimental results show that the presented algorithm preserves the rich spectral information of the visible image and effectively captures thermal target information at different scales of the infrared image.
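
    The frequency-domain saliency step can be sketched for a single channel with an ordinary FFT: smooth the amplitude spectrum with a Gaussian low-pass kernel, keep the original phase, reconstruct, and square. The paper's method uses a hypercomplex FFT over several feature channels, so this single-channel version and its scale values are only illustrative:

```python
import numpy as np

def saliency_map(img, sigma=3.0):
    """Frequency-domain saliency of a 2D array: smoothed amplitude + original phase."""
    F = np.fft.fft2(img)
    amp, phase = np.abs(F), np.angle(F)
    h, w = img.shape
    y, x = np.mgrid[:h, :w]
    g = np.exp(-(((x - w // 2) ** 2 + (y - h // 2) ** 2) / (2.0 * sigma ** 2)))
    g /= g.sum()
    # circular convolution of the amplitude spectrum with the Gaussian kernel
    amp_s = np.real(np.fft.ifft2(np.fft.fft2(amp) * np.fft.fft2(np.fft.ifftshift(g))))
    return np.abs(np.fft.ifft2(amp_s * np.exp(1j * phase))) ** 2

def entropy(m):
    p = (m / m.sum()).ravel()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def best_scale(img, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Scale selection: keep the saliency map with minimal entropy."""
    return min(((s, saliency_map(img, s)) for s in sigmas),
               key=lambda sv: entropy(sv[1]))
```

    On an image consisting of a small bright blob on a flat background, the map concentrates its energy on the blob, which is the behaviour the fusion rule above relies on.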

  11. Development and Positioning Accuracy Assessment of Single-Frequency Precise Point Positioning Algorithms by Combining GPS Code-Pseudorange Measurements with Real-Time SSR Corrections

    PubMed Central

    Kim, Miso; Park, Kwan-Dong

    2017-01-01

    We have developed a suite of real-time precise point positioning programs to process GPS pseudorange observables, and validated their performance through static and kinematic positioning tests. To correct inaccurate broadcast orbits and clocks, and account for signal delays occurring from the ionosphere and troposphere, we applied State Space Representation (SSR) error corrections provided by the Seoul Broadcasting System (SBS) in South Korea. Site displacements due to solid earth tide loading are also considered for the purpose of improving the positioning accuracy, particularly in the height direction. When the developed algorithm was tested under static positioning, Kalman-filtered solutions produced a root-mean-square error (RMSE) of 0.32 and 0.40 m in the horizontal and vertical directions, respectively. For the moving platform, the RMSE was found to be 0.53 and 0.69 m in the horizontal and vertical directions. PMID:28598403
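
    The epoch-wise filtering can be illustrated with a toy Kalman filter for a static receiver that observes its position directly at each epoch. This is a simplified stand-in, not the authors' full PPP measurement model (no pseudorange geometry, clocks, or SSR corrections):

```python
import numpy as np

def kalman_static(obs, r_var, q_var=1e-4):
    """Toy Kalman filter: state = 3D position modelled as a slow random walk
    (process noise q_var); each epoch supplies a noisy position observation
    with variance r_var (H = I)."""
    x = np.array(obs[0], dtype=float)
    P = np.eye(3) * 100.0                # large initial uncertainty
    Q = np.eye(3) * q_var
    R = np.eye(3) * r_var
    for z in obs[1:]:
        P = P + Q                        # predict: static position, growing P
        K = P @ np.linalg.inv(P + R)     # Kalman gain
        x = x + K @ (np.asarray(z, dtype=float) - x)
        P = (np.eye(3) - K) @ P
    return x, P
```

    With many epochs the filtered position error drops well below the single-epoch observation noise, which is the mechanism behind the sub-metre static RMSE quoted above.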

  12. Effects of Device on Video Head Impulse Test (vHIT) Gain.

    PubMed

    Janky, Kristen L; Patterson, Jessie N; Shepard, Neil T; Thomas, Megan L A; Honaker, Julie A

    2017-10-01

    Numerous video head impulse test (vHIT) devices are available commercially; however, gain is not calculated uniformly. An evaluation of these devices/algorithms in healthy controls and patients with vestibular loss is necessary for comparing and synthesizing work that utilizes different devices and gain calculations. Using three commercially available vHIT devices/algorithms, the purpose of the present study was to compare: (1) horizontal canal vHIT gain among devices/algorithms in normal control subjects; (2) the effects of age on vHIT gain for each device/algorithm in normal control subjects; and (3) the clinical performance of horizontal canal vHIT gain between devices/algorithms for differentiating normal versus abnormal vestibular function. Prospective. Sixty-one normal control adult subjects (range 20-78) and eleven adults with unilateral or bilateral vestibular loss (range 32-79). vHIT was administered using three different devices/algorithms, randomized in order, for each subject on the same day: (1) Impulse (Otometrics, Schaumburg, IL; monocular eye recording, right eye only; using area-under-the-curve gain), (2) EyeSeeCam (Interacoustics, Denmark; monocular eye recording, left eye only; using instantaneous gain), and (3) VisualEyes (MicroMedical, Chatham, IL; binocular eye recording; using position gain). There was a significant mean difference in vHIT gain among devices/algorithms for both the normal control and vestibular loss groups. vHIT gain was significantly larger in the ipsilateral direction of the eye used to measure gain; however, in spite of the significant mean differences in vHIT gain among devices/algorithms and the significant directional bias, classification of "normal" versus "abnormal" gain was consistent across all compared devices/algorithms, with the exception of instantaneous gain at 40 msec. There was no effect of age on vHIT gain up to 78 years regardless of the device/algorithm. These findings support that vHIT gain differs significantly between devices/algorithms, suggesting that care should be taken when making direct comparisons of absolute gain values between devices/algorithms.
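
    Two of the compared gain definitions can be sketched on synthetic velocity traces; a plain trapezoidal rule and a nearest-sample lookup stand in for each device's proprietary processing:

```python
import numpy as np

def auc_gain(t, head_vel, eye_vel):
    """Area-under-the-curve gain: integrated eye speed / integrated head speed."""
    trapz = lambda y: float(np.sum((y[1:] + y[:-1]) * 0.5 * np.diff(t)))
    return trapz(np.abs(eye_vel)) / trapz(np.abs(head_vel))

def instantaneous_gain(t, head_vel, eye_vel, at_s=0.040):
    """Eye/head velocity ratio at a fixed latency (e.g. 40 ms after impulse onset)."""
    k = int(np.argmin(np.abs(t - at_s)))
    return float(eye_vel[k] / head_vel[k])
```

    On a clean trace where the eye response is a uniformly scaled copy of the head impulse both definitions agree; on real data they can diverge, which is one source of the between-device differences reported above.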

  13. Stereo reconstruction from multiperspective panoramas.

    PubMed

    Li, Yin; Shum, Heung-Yeung; Tang, Chi-Keung; Szeliski, Richard

    2004-01-01

    A new approach to computing a panoramic (360 degrees) depth map is presented in this paper. Our approach uses a large collection of images taken by a camera whose motion has been constrained to planar concentric circles. We resample regular perspective images to produce a set of multiperspective panoramas and then compute depth maps directly from these resampled panoramas. Our panoramas sample uniformly in three dimensions: rotation angle, inverse radial distance, and vertical elevation. The use of multiperspective panoramas eliminates the limited overlap present in the original input images and, thus, the problems that arise in conventional multibaseline stereo can be avoided. Our approach differs from stereo matching of single-perspective panoramic images taken from different locations, where the epipolar constraints are sine curves. For our multiperspective panoramas, the epipolar geometry, to a first-order approximation, consists of horizontal lines. Therefore, any traditional stereo algorithm can be applied to multiperspective panoramas with little modification. In this paper, we describe two reconstruction algorithms. The first is a cylinder sweep algorithm that uses a small number of resampled multiperspective panoramas to obtain dense 3D reconstruction. The second algorithm, in contrast, uses a large number of multiperspective panoramas and takes advantage of the approximate horizontal epipolar geometry inherent in multiperspective panoramas. It comprises a novel and efficient 1D multibaseline matching technique, followed by tensor voting to extract the depth surface. Experiments show that our algorithms are capable of producing comparably high-quality depth maps, which can be used for applications such as view interpolation.

  14. Multiscale Analysis of Time Irreversibility Based on Phase-Space Reconstruction and Horizontal Visibility Graph Approach

    NASA Astrophysics Data System (ADS)

    Zhang, Yongping; Shang, Pengjian; Xiong, Hui; Xia, Jianan

    Time irreversibility is an important property of nonequilibrium dynamic systems. A visibility graph approach was recently proposed and is generally effective for measuring the time irreversibility of time series; however, its results may be unreliable when dealing with high-dimensional systems. In this work, we consider the joint concept of time irreversibility and adopt the phase-space reconstruction technique to improve this visibility graph approach. Compared with the previous approach, the improved approach gives a more accurate estimate of the irreversibility of time series and is more effective at distinguishing irreversible from reversible stochastic processes. We also use this approach to extract multiscale irreversibility, accounting for the multiple inherent dynamics of time series. Finally, we apply the approach to detect the multiscale irreversibility of financial time series, and succeed in distinguishing periods of financial crisis from plateau periods. In addition, the separation of Asian stock indexes from the other indexes becomes clearly visible at larger time scales. Simulations and real data support the effectiveness of the improved approach in detecting time irreversibility.
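
    The horizontal visibility criterion underlying such graph constructions is compact: samples i < j are linked when every intermediate value stays strictly below min(x_i, x_j). A minimal pure-Python sketch of the undirected graph:

```python
def hvg_edges(x):
    """Edges of the horizontal visibility graph of a series x."""
    n = len(x)
    edges = []
    for i in range(n - 1):
        blocker = float('-inf')          # highest value seen strictly between i and j
        for j in range(i + 1, n):
            if blocker < min(x[i], x[j]):
                edges.append((i, j))     # every intermediate lies below both endpoints
            blocker = max(blocker, x[j])
            if blocker >= x[i]:          # nothing beyond j can see i any more
                break
    return edges
```

    A classical benchmark: for an i.i.d. random series the mean degree of the HVG approaches 4, a property widely used to separate uncorrelated noise from chaos.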

  15. Long-term visibility variation in Athens (1931-2013): a proxy for local and regional atmospheric aerosol loads

    NASA Astrophysics Data System (ADS)

    Founda, Dimitra; Kazadzis, Stelios; Mihalopoulos, Nikolaos; Gerasopoulos, Evangelos; Lianou, Maria; Raptis, Panagiotis I.

    2016-09-01

    This study explores the interdecadal variability and trends of surface horizontal visibility in the urban area of Athens from 1931 to 2013, using the historical archives of the National Observatory of Athens (NOA). A prominent deterioration of visibility in the city was detected, with the long-term linear trend amounting to -2.8 km per decade (p < 0.001) over the entire study period. This was not accompanied by any significant trend in relative humidity or precipitation over the same period. A slight recovery of visibility levels seems to have been established in the most recent decade (2004-2013). It was found that very good visibility (> 20 km) occurred at a frequency of 34 % before the 1950s, while this percentage drops to just 2 % during the decade 2004-2013. The rapid impairment of the visual air quality in Athens around the 1950s points to increased levels of air pollution on a local and/or regional scale, related to high urbanization rates and/or increased anthropogenic emissions on a global scale at that period. Visibility was found to be negatively correlated with relative humidity and positively correlated with wind speed, the correlations being statistically significant in certain periods. The wind regime, mainly the wind direction and the corresponding air mass origin, was found to strongly control visibility levels in Athens. The comparison of visibility variation in Athens and at a non-urban reference site on the island of Crete revealed similar negative trends over the common period of observations. This suggests that, apart from local sources, visibility in Athens is largely determined by aerosol load of regional origin. AVHRR and MODIS satellite-derived aerosol optical depth (AOD) retrievals over Athens and surface measurements of PM10 confirmed the relation of visibility to aerosol load.

  16. Distributed Water Pollution Source Localization with Mobile UV-Visible Spectrometer Probes in Wireless Sensor Networks.

    PubMed

    Ma, Junjie; Meng, Fansheng; Zhou, Yuexi; Wang, Yeyao; Shi, Ping

    2018-02-16

    Pollution accidents that occur in surface waters, especially in drinking water source areas, greatly threaten the urban water supply system. During water pollution source localization, pollutant spreading conditions are complicated and pollutant concentrations vary over a wide range. This paper provides a scalable total solution, investigating a distributed localization method in wireless sensor networks equipped with mobile ultraviolet-visible (UV-visible) spectrometer probes. A wireless sensor network is defined for water quality monitoring, where unmanned surface vehicles and buoys serve as mobile and stationary nodes, respectively. Both types of nodes carry UV-visible spectrometer probes to acquire in-situ measurements of multiple water quality parameters, with a self-adaptive optical path mechanism designed to flexibly adjust the measurement range. A novel distributed algorithm, called Dual-PSO, is proposed to search for the water pollution source: one particle swarm optimization (PSO) procedure computes the water quality multi-parameter measurements on each node, utilizing UV-visible absorption spectra, and another finds the global solution for the pollution source position, treating the mobile nodes as particles. In addition, the algorithm uses entropy to dynamically identify the most sensitive parameter during the search. Experimental results demonstrate that online multi-parameter monitoring of a drinking water source area with a wide dynamic range is achieved by this wireless sensor network and that water pollution sources are localized efficiently with low-cost mobile node paths.
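
    The global-best PSO loop at the core of such a search can be sketched in a few lines. The objective, bounds, and coefficients below are illustrative; the paper's Dual-PSO nests two such procedures (one per-node spectral fit, one over mobile node positions):

```python
import numpy as np

def pso_minimize(f, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Global-best PSO: particles move under inertia plus random pulls
    toward their personal best and the swarm best."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest = x.copy()
    pval = np.array([f(p) for p in x])
    for _ in range(iters):
        g = pbest[np.argmin(pval)]                     # swarm best so far
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                     # keep particles in bounds
        val = np.array([f(p) for p in x])
        improved = val < pval
        pbest[improved], pval[improved] = x[improved], val[improved]
    return pbest[np.argmin(pval)], float(pval.min())
```

    In the source-localization setting, f(p) would score how well a candidate source position p explains the concentrations measured across the network; here a simple quadratic bowl serves as a stand-in objective.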

  17. Distributed Water Pollution Source Localization with Mobile UV-Visible Spectrometer Probes in Wireless Sensor Networks

    PubMed Central

    Zhou, Yuexi; Wang, Yeyao; Shi, Ping

    2018-01-01

    Pollution accidents that occur in surface waters, especially in drinking water source areas, greatly threaten the urban water supply system. During water pollution source localization, pollutant spreading conditions are complicated and pollutant concentrations vary over a wide range. This paper provides a scalable total solution, investigating a distributed localization method in wireless sensor networks equipped with mobile ultraviolet-visible (UV-visible) spectrometer probes. A wireless sensor network is defined for water quality monitoring, where unmanned surface vehicles and buoys serve as mobile and stationary nodes, respectively. Both types of nodes carry UV-visible spectrometer probes to acquire in-situ measurements of multiple water quality parameters, with a self-adaptive optical path mechanism designed to flexibly adjust the measurement range. A novel distributed algorithm, called Dual-PSO, is proposed to search for the water pollution source: one particle swarm optimization (PSO) procedure computes the water quality multi-parameter measurements on each node, utilizing UV-visible absorption spectra, and another finds the global solution for the pollution source position, treating the mobile nodes as particles. In addition, the algorithm uses entropy to dynamically identify the most sensitive parameter during the search. Experimental results demonstrate that online multi-parameter monitoring of a drinking water source area with a wide dynamic range is achieved by this wireless sensor network and that water pollution sources are localized efficiently with low-cost mobile node paths. PMID:29462929

  18. Remeasuring tree heights on permanent plots using rectangular coordinates and one angle per tree

    Treesearch

    Robert L. Neal

    1973-01-01

    Heights of permanent sample trees with tops visible from any point can be measured from that point with any clinometer, measuring one vertical angle per tree. Two horizontal angles and one additional vertical angle per observation point are necessary to orient the point to the plot. Permanently recorded coordinates and elevations of tree locations are used with the...

  19. Target discrimination of man-made objects using passive polarimetric signatures acquired in the visible and infrared spectral bands

    NASA Astrophysics Data System (ADS)

    Lavigne, Daniel A.; Breton, Mélanie; Fournier, Georges; Charette, Jean-François; Pichette, Mario; Rivet, Vincent; Bernier, Anne-Pier

    2011-10-01

    Surveillance operations and search and rescue missions regularly exploit electro-optic imaging systems to detect targets of interest in both the civilian and military communities. By incorporating the polarization of light as supplementary information in such electro-optic imaging systems, it is possible to increase their target discrimination capabilities, since man-made objects are known to depolarize light in a different manner than natural backgrounds do. Because electromagnetic radiation emitted and reflected from a smooth surface observed near a grazing angle becomes partially polarized in the visible and infrared wavelength bands, additional information about the shape, roughness, shading, and surface temperatures of difficult targets can be extracted by effectively processing such reflected/emitted polarized signatures. This paper presents a set of polarimetric image processing algorithms devised to extract meaningful information from a broad range of man-made objects. Passive polarimetric signatures are acquired in the visible, shortwave infrared, midwave infrared, and longwave infrared bands using a fully automated imaging system developed at DRDC Valcartier. A fusion algorithm is used to enable the discrimination of some objects lying in shadowed areas. Performance metrics, derived from the computed Stokes parameters, characterize the degree of polarization of man-made objects. Field experiments conducted during winter and summer demonstrate: 1) the utility of the imaging system to collect polarized signatures of different objects in the visible and infrared spectral bands, and 2) the enhanced performance of target discrimination and fusion algorithms that exploit the polarized signatures of man-made objects against cluttered backgrounds.
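    The degree-of-polarization metrics mentioned above derive from the Stokes parameters; a minimal sketch of that computation using the standard formulas, with invented intensity values:

```python
import math

def degree_of_linear_polarization(s0, s1, s2):
    """Degree of linear polarization from the first three Stokes parameters:
    DoLP = sqrt(S1^2 + S2^2) / S0, where S0 is the total intensity."""
    if s0 <= 0:
        raise ValueError("total intensity S0 must be positive")
    return math.sqrt(s1 * s1 + s2 * s2) / s0

# Intensities measured through a polarizer at 0, 45, 90 and 135 degrees give
# S0 = I0 + I90, S1 = I0 - I90, S2 = I45 - I135 (invented sample values).
i0, i45, i90, i135 = 0.8, 0.5, 0.2, 0.5
s0, s1, s2 = i0 + i90, i0 - i90, i45 - i135
dolp = degree_of_linear_polarization(s0, s1, s2)
```

    A strongly depolarizing natural background would drive `dolp` toward zero, while a smooth man-made surface near grazing incidence yields a higher value.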

  20. Toward automated face detection in thermal and polarimetric thermal imagery

    NASA Astrophysics Data System (ADS)

    Gordon, Christopher; Acosta, Mark; Short, Nathan; Hu, Shuowen; Chan, Alex L.

    2016-05-01

    Visible spectrum face detection algorithms perform reliably under controlled lighting conditions. However, variations in illumination and the application of cosmetics can distort the features used by common face detectors, thereby degrading their detection performance. Thermal and polarimetric thermal facial imaging are relatively invariant to illumination and robust to the application of makeup, because they measure emitted radiation instead of reflected light. The objective of this work is to evaluate a government off-the-shelf wavelet-based naïve-Bayes face detection algorithm and a commercial off-the-shelf Viola-Jones cascade face detection algorithm on face imagery acquired in different spectral bands. New classifiers were trained using the Viola-Jones cascade object detection framework with preprocessed facial imagery. Preprocessing with Difference of Gaussians (DoG) filtering reduces the modality gap between facial signatures across the different spectral bands, enabling more correlated histogram of oriented gradients (HOG) features to be extracted from the preprocessed thermal and visible face images. Since the availability of training data is much more limited in the thermal spectrum than in the visible spectrum, it is not feasible to train a robust multi-modal face detector using thermal imagery alone. A large training dataset was therefore constituted from DoG-filtered visible and thermal imagery and used to generate a custom-trained Viola-Jones detector. A 40% increase in face detection rate was achieved on a testing dataset, compared to the performance of a pre-trained/baseline face detector. Insights gained in this research are valuable for the development of more robust multi-modal face detectors.
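    The DoG preprocessing step can be sketched in one dimension; this is a generic band-pass illustration (the sigmas and signals are invented), not the paper's exact filter:

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1D Gaussian kernel of the given half-width."""
    k = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """1D convolution with edge clamping at the borders."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, kv in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += signal[idx] * kv
        out.append(acc)
    return out

def dog_filter(signal, sigma_narrow=1.0, sigma_wide=2.0, radius=6):
    """Difference-of-Gaussians band-pass: subtracting a wide blur from a
    narrow one removes flat illumination offsets and suppresses fine noise."""
    narrow = convolve(signal, gaussian_kernel(sigma_narrow, radius))
    wide = convolve(signal, gaussian_kernel(sigma_wide, radius))
    return [a - b for a, b in zip(narrow, wide)]

# A constant offset (illumination bias) is removed entirely by the DoG,
# while an intensity edge still produces a response.
flat = dog_filter([5.0] * 32)
edge = dog_filter([0.0] * 16 + [1.0] * 16)
```

    Removing the low-frequency illumination component is what narrows the gap between visible and thermal facial signatures before HOG features are extracted.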

  1. Space radar image of New York City

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This radar image shows the New York City metropolitan area. The island of Manhattan appears in the center of the image. The green-colored rectangle on Manhattan is Central Park. This image was acquired by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar (SIR-C/X-SAR) aboard the space shuttle Endeavour on October 10, 1994. North is toward the upper right. The area shown is 75.0 kilometers by 48.8 kilometers (46.5 miles by 30.2 miles). The image is centered at 40.7 degrees north latitude and 73.8 degrees west longitude. In general, light blue areas correspond to dense urban development, green areas to moderately vegetated zones, and black areas to bodies of water. The Hudson River is the black strip that runs from the left edge to the upper right corner of the image. It separates New Jersey, in the upper left of the image, from New York. The Atlantic Ocean is at the bottom of the image, where two barrier islands along the southern shore of Long Island are also visible. John F. Kennedy International Airport is visible above these islands. Long Island Sound, separating Long Island from Connecticut, is the dark area right of the center of the image. Many bridges are visible in the image, including the Verrazano Narrows, George Washington, and Brooklyn bridges. The radar illumination is from the left of the image; this causes some urban zones to appear red because the streets are at a perpendicular angle to the radar pulse. The colors in this image were obtained using the following radar channels: red represents the L-band (horizontally transmitted and received); green represents the L-band (horizontally transmitted, vertically received); blue represents the C-band (horizontally transmitted, vertically received). Radar images like this one could be used as a tool for city planners and resource managers to map and monitor land use patterns, since the radar imaging system clearly resolves the variety of landscapes in the area as well as the density of urban development.

  2. A hybrid algorithm for the segmentation of books in libraries

    NASA Astrophysics Data System (ADS)

    Hu, Zilong; Tang, Jinshan; Lei, Liang

    2016-05-01

    This paper proposes an algorithm for book segmentation based on images of bookshelves. The algorithm consists of three parts. The first part is pre-processing, which aims to eliminate or reduce the effects of image noise and illumination conditions. The second part is near-horizontal line detection based on the Canny edge detector; it separates a bookshelf image into multiple sub-images so that each sub-image contains an individual shelf. The last part is book segmentation: in each shelf image, near-vertical lines are detected, and the obtained lines are used to segment the books. The proposed algorithm was tested on bookshelf images taken in the OPIE library at MTU, and the experimental results demonstrate good performance.
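    After edge detection, the shelf/spine split reduces to classifying detected line segments by angle; a minimal sketch, assuming segments have already been extracted (e.g., by a Hough transform):

```python
import math

def classify_segments(segments, tol_deg=10.0):
    """Split detected edge segments into near-horizontal and near-vertical
    groups by their angle to the x-axis (tol_deg is the allowed deviation).
    Near-horizontal segments suggest shelf edges; near-vertical ones suggest
    book spines."""
    horiz, vert = [], []
    for (x1, y1, x2, y2) in segments:
        angle = abs(math.degrees(math.atan2(y2 - y1, x2 - x1))) % 180.0
        if angle <= tol_deg or angle >= 180.0 - tol_deg:
            horiz.append((x1, y1, x2, y2))
        elif abs(angle - 90.0) <= tol_deg:
            vert.append((x1, y1, x2, y2))
    return horiz, vert

segments = [(0, 0, 100, 3),   # shelf edge: near-horizontal
            (40, 0, 42, 80),  # book spine: near-vertical
            (0, 0, 50, 50)]   # diagonal clutter: ignored
horiz, vert = classify_segments(segments)
```

    The near-horizontal group would drive the shelf split into sub-images, and the near-vertical group the per-shelf book segmentation.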

  3. Document localization algorithms based on feature points and straight lines

    NASA Astrophysics Data System (ADS)

    Skoryukina, Natalya; Shemiakina, Julia; Arlazarov, Vladimir L.; Faradjev, Igor

    2018-04-01

    An important part of a planar rectangular object analysis system is localization: the estimation of the projective transform from the template image of an object to its photograph. The system also includes subsystems for the selection and recognition of text fields, the usage of contexts, etc. In this paper, three localization algorithms are described. All of the algorithms use feature points, and two of them also analyze near-horizontal and near-vertical lines in the photograph. The algorithms and their combinations are tested on a dataset of real document photographs. A method of localization quality estimation is also proposed, which allows the localization subsystem to be configured independently of the quality of the other subsystems.

  4. Detecting Horizontal Gene Transfer between Closely Related Taxa

    PubMed Central

    Adato, Orit; Ninyo, Noga; Gophna, Uri; Snir, Sagi

    2015-01-01

    Horizontal gene transfer (HGT), the transfer of genetic material between organisms, is crucial for genetic innovation and the evolution of genome architecture. Existing HGT detection algorithms rely on a strong phylogenetic signal distinguishing the transferred sequence from ancestral (vertically derived) genes in its recipient genome. Detecting HGT between closely related species or strains is challenging, as the phylogenetic signal is usually weak and the nucleotide composition is normally nearly identical. Nevertheless, detecting HGT between congeneric species or strains is of great importance, especially in clinical microbiology, where understanding the emergence of new virulent and drug-resistant strains is crucial, and often time-sensitive. We developed a novel, self-contained technique named Near HGT, based on the synteny index, to measure the divergence of a gene from its native genomic environment, and used it to identify candidate HGT events between closely related strains. The method confirms candidate transferred genes based on the constant relative mutability (CRM). Using CRM, the algorithm assigns a confidence score based on “unusual” sequence divergence. A gene exhibiting exceptional deviations according to both the synteny and mutability criteria is considered a validated HGT product. We first applied the technique to a set of three E. coli strains and detected several highly probable horizontally acquired genes. We then compared the method to existing HGT detection tools using a larger strain data set. When combined with additional approaches, our new algorithm provides a richer picture and brings us closer to the goal of detecting all newly acquired genes in a particular strain. PMID:26439115
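    A simplified version of the synteny index underlying Near HGT can be sketched as follows (toy gene orders; the published definition may differ in details such as window handling at contig ends):

```python
def synteny_index(genome_a, genome_b, gene, k=3):
    """Synteny index of `gene`: the number of common genes in windows of k
    genes on each side of `gene` in the two genome orderings. A horizontally
    transferred gene tends to sit in a foreign neighborhood, lowering its SI."""
    def neighborhood(genome):
        i = genome.index(gene)
        return set(genome[max(0, i - k):i] + genome[i + 1:i + 1 + k])
    return len(neighborhood(genome_a) & neighborhood(genome_b))

native = ["a", "b", "c", "g", "d", "e", "f"]
same = ["a", "b", "c", "g", "d", "e", "f"]       # identical context: high SI
recipient = ["x", "y", "g", "z", "w", "v", "u"]  # g moved to a foreign context
```

    A low synteny index flags a candidate transfer, which the CRM criterion then confirms or rejects.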

  5. Research support of the WETNET Program

    NASA Technical Reports Server (NTRS)

    Estes, John E.; Mcgwire, Kenneth C.; Scepan, Joseph; Henderson, SY; Lawless, Michael

    1995-01-01

    This study examines various aspects of the Microwave Vegetation Index (MVI). MVI is a derived signal created by differencing the spectral response of the 37 GHz horizontally and vertically polarized passive microwave signals. The microwave signal employed to derive this index is thought to be primarily influenced by vegetation structure, vegetation growth, standing water, and precipitation. The state of California is the study site for this research. Imagery from the Special Sensor Microwave/Imager (SSM/I) is used for the creation of the MVI datasets analyzed in this research. The object of this research is to determine whether MVI corresponds with some quantifiable vegetation parameter (such as vegetation density) or whether the index is more affected by known biogeophysical parameters such as antecedent precipitation. A secondary question is whether the vegetation attributes that MVI is employed to determine can be more easily and accurately evaluated by other remote sensing means. An important associated question to be addressed in the study is the effect of different multi-temporal compositing techniques on the derived MVI dataset. This work advances our understanding of the fundamental nature of MVI by studying vegetation as a mixture of structural types, such as forest and grassland. The study further advances our understanding by creating multitemporal precipitation datasets to compare the effects of precipitation upon MVI. This work will help to lay the groundwork for the use of passive microwave spectral information either as an adjunct to visible and near infrared imagery in areas where that is feasible, or for the use of passive microwave data alone in areas of moderate cloud coverage. In this research, an MVI dataset, spanning the period February 15, 1989 through April 25, 1990, has been created using National Aeronautics and Space Administration (NASA) supplied brightness temperature data.
Information from the DMSP satellite 37 GHz wavelength SSM/I sensor in both horizontal and vertical polarization has been processed using the MVI algorithm. In conjunction with the MVI algorithm a multitemporal compositing technique was used to create datasets that correspond to 14 day periods. In this technical report, Section Two contains background information on the State of California and the three MVI study sites. Section Three describes the methods used to create the MVI and independent variables datasets. Section Four presents the results of the experiment. Section Five summarizes and concludes the work.
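    The MVI computation and the 14-day compositing step can be sketched as follows; the choice of a minimum-value compositing rule here is an assumption for illustration, as the report's actual rule is not specified in this abstract:

```python
def microwave_vegetation_index(tb_37v, tb_37h):
    """MVI as described above: the difference between vertically and
    horizontally polarized 37 GHz brightness temperatures (kelvin)."""
    return tb_37v - tb_37h

def composite(daily_values, period=14):
    """Illustrative multitemporal compositing: reduce each `period`-day run
    of daily MVI values to its minimum (one of several possible rules)."""
    return [min(daily_values[i:i + period])
            for i in range(0, len(daily_values), period)]

# Invented daily (V, H) brightness-temperature pairs.
daily = [microwave_vegetation_index(v, h)
         for v, h in [(260.0, 250.0), (262.0, 255.0), (258.0, 252.0)]]
```

    Each composited value then represents one 14-day period of the multitemporal MVI dataset.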

  6. Comparison of a single-view and a double-view aerosol optical depth retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Henderson, Bradley G.; Chylek, Petr

    2003-11-01

    We compare the results of a single-view and a double-view aerosol optical depth (AOD) retrieval algorithm applied to image pairs acquired over NASA Stennis Space Center, Mississippi. The image data were acquired by the Department of Energy's (DOE) Multispectral Thermal Imager (MTI), a pushbroom satellite imager with 15 bands from the visible to the thermal infrared. MTI has the ability to acquire imagery in pairs in which the first image is a near-nadir view and the second is off-nadir with a zenith angle of approximately 60°. A total of 15 image pairs were used in the analysis. For a given image pair, AOD retrieval is performed twice: once using a single-view algorithm applied to the near-nadir image, and again using a double-view algorithm. Errors for both retrievals are computed by comparing the results to AERONET AOD measurements obtained at the same time and place. The single-view algorithm showed an RMS error about the mean of 0.076 in AOD units, whereas the double-view algorithm showed a modest improvement with an RMS error of 0.06. The single-view errors show a positive bias, which is presumed to result from the empirical relationship used to determine ground reflectance in the visible. A plot of the double-view algorithm's AOD error versus time shows a noticeable trend that is interpreted as calibration drift. When this trend is removed, the RMS error of the double-view algorithm drops to 0.030. The single-view algorithm qualitatively appears to perform better during the spring and summer, whereas the double-view algorithm seems less sensitive to season.

  7. Accurate Grid-based Clustering Algorithm with Diagonal Grid Searching and Merging

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Ye, Chengcheng; Zhu, Erzhou

    2017-09-01

    With the advent of big data, data mining technology has attracted more and more attention. As an important data analysis method, grid clustering is fast but has relatively low accuracy. This paper presents an improved clustering algorithm that combines grid and density parameters. The algorithm first divides the data space into valid and invalid meshes using the grid parameters. Secondly, starting from the first point of the diagonal of the grids, the algorithm merges the valid meshes in the “horizontal right, vertical down” direction. Furthermore, through boundary grid processing, the invalid grids are searched and merged when the adjacent left, above, and diagonal-direction grids are all valid. By doing this, the accuracy of clustering is improved. The experimental results show that the proposed algorithm is accurate and relatively fast compared with some popular algorithms.
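    The first stage, marking valid meshes by a density threshold, can be sketched as follows (cell size and threshold are illustrative; the diagonal merge and boundary processing are omitted for brevity):

```python
def grid_cells(points, cell_size, density_threshold):
    """Divide 2D points into square grid cells and mark a cell 'valid' when
    it holds at least density_threshold points. Valid cells are the units the
    later merge step joins into clusters."""
    counts = {}
    for (x, y) in points:
        key = (int(x // cell_size), int(y // cell_size))
        counts[key] = counts.get(key, 0) + 1
    return {key for key, n in counts.items() if n >= density_threshold}

points = [(0.1, 0.1), (0.2, 0.3), (0.4, 0.2),  # dense cell (0, 0)
          (5.5, 5.5)]                          # lone outlier: invalid cell
valid = grid_cells(points, cell_size=1.0, density_threshold=2)
```

    Because membership is decided per cell rather than per point pair, this stage runs in linear time in the number of points, which is the source of grid clustering's speed.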

  8. Effective visibility analysis method in virtual geographic environment

    NASA Astrophysics Data System (ADS)

    Li, Yi; Zhu, Qing; Gong, Jianhua

    2008-10-01

    Visibility analysis in a virtual geographic environment has broad applications in many aspects of social life. In practical use, however, efficiency and accuracy need to be improved, and human vision restrictions must be considered. This paper first introduces a highly efficient 3D data modeling method that generates and organizes 3D data models using R-tree and LOD techniques. It then presents a new visibility algorithm that achieves real-time viewshed calculation while accounting for occlusion by the DEM and 3D building models, as well as human-vision restrictions on viewshed generation. Finally, an experiment is conducted to show that the visibility analysis is fast and accurate enough to meet the demands of digital city applications.
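    The core viewshed test, line-of-sight over a DEM, can be sketched with a simple elevation-angle sweep; this is a textbook simplification (unit cell spacing, no 3D building models or human-vision limits), not the paper's algorithm:

```python
def line_of_sight(dem, observer, target, eye_height=1.7):
    """Check whether `target` is visible from `observer` over a DEM grid by
    sampling cells along the straight line between them: the target is
    visible if its elevation angle is not below the steepest intermediate
    angle (tangents are compared, which preserves the ordering)."""
    (r0, c0), (r1, c1) = observer, target
    h0 = dem[r0][c0] + eye_height
    steps = max(abs(r1 - r0), abs(c1 - c0))
    if steps == 0:
        return True
    max_angle = float("-inf")
    for s in range(1, steps):
        r = round(r0 + (r1 - r0) * s / steps)
        c = round(c0 + (c1 - c0) * s / steps)
        dist = ((r - r0) ** 2 + (c - c0) ** 2) ** 0.5
        max_angle = max(max_angle, (dem[r][c] - h0) / dist)
    dist = ((r1 - r0) ** 2 + (c1 - c0) ** 2) ** 0.5
    return (dem[r1][c1] - h0) / dist >= max_angle

dem = [[0, 0, 0, 0],
       [0, 0, 9, 0],  # a tall obstacle between observer and far cell
       [0, 0, 0, 0]]
```

    Running this test for every cell around the observer yields the viewshed; R-tree and LOD structures serve to prune which cells and models need testing at all.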

  9. A new evaluation method research for fusion quality of infrared and visible images

    NASA Astrophysics Data System (ADS)

    Ge, Xingguo; Ji, Yiguo; Tao, Zhongxiang; Tian, Chunyan; Ning, Chengda

    2017-03-01

    In order to objectively evaluate the fusion effect of infrared and visible images, a fusion evaluation method based on energy-weighted average structural similarity and an edge information retention value is proposed to address the drawbacks of existing evaluation methods. The evaluation index of this method is given, and evaluation experiments are conducted on infrared and visible image fusion results obtained under different algorithms and environments on the basis of this index. The experimental results show that the objective evaluation index is consistent with the subjective evaluation results, which indicates that the method is a practical and effective measure of fusion image quality.

  10. A model-based approach for detection of runways and other objects in image sequences acquired using an on-board camera

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Devadiga, Sadashiva; Tang, Yuan-Liang

    1994-01-01

    This research was initiated as a part of the Advanced Sensor and Imaging System Technology (ASSIST) program at NASA Langley Research Center. The primary goal of this research is the development of image analysis algorithms for the detection of runways and other objects using an on-board camera. Initial effort was concentrated on images acquired using a passive millimeter wave (PMMW) sensor. The images obtained using PMMW sensors under poor visibility conditions due to atmospheric fog are characterized by very low spatial resolution but good image contrast compared to those images obtained using sensors operating in the visible spectrum. Algorithms developed for analyzing these images using a model of the runway and other objects are described in Part 1 of this report. Experimental verification of these algorithms was limited to a sequence of images simulated from a single frame of PMMW image. Subsequent development and evaluation of algorithms was done using video image sequences. These images have better spatial and temporal resolution compared to PMMW images. Algorithms for reliable recognition of runways and accurate estimation of spatial position of stationary objects on the ground have been developed and evaluated using several image sequences. These algorithms are described in Part 2 of this report. A list of all publications resulting from this work is also included.

  11. Direct endoscopic video registration for sinus surgery

    NASA Astrophysics Data System (ADS)

    Mirota, Daniel; Taylor, Russell H.; Ishii, Masaru; Hager, Gregory D.

    2009-02-01

    Advances in computer vision have made possible robust 3D reconstruction of monocular endoscopic video. These reconstructions accurately represent the visible anatomy and, once registered to pre-operative CT data, enable a navigation system to track directly through video, eliminating the need for an external tracking system. Video registration provides the means for a direct interface between an endoscope and a navigation system and allows a shorter chain of rigid-body transformations to be used to solve the patient/navigation-system registration. To solve this registration step we propose a new 3D-3D registration algorithm based on Trimmed Iterative Closest Point (TrICP) [1] and the z-buffer algorithm [2]. The algorithm takes as input a 3D point cloud of relative scale with the origin at the camera center, an isosurface from the CT, and an initial guess of the scale and location. Our algorithm utilizes only the visible polygons of the isosurface from the current camera location during each iteration, to minimize the search area of the target region and robustly reject outliers of the reconstruction. We present example registrations in the sinus passage applicable to both sinus surgery and transnasal surgery. To evaluate the algorithm's performance we compare it to registration via Optotrak and report the closest point-to-surface distance error. We show our algorithm has a mean closest distance error of 0.2268 mm.

  12. Twenty-four year record of Northern Hemisphere snow cover derived from passive microwave remote sensing

    NASA Astrophysics Data System (ADS)

    Armstrong, Richard L.; Brodzik, Mary Jo

    2003-04-01

    Snow cover is an important variable for climate and hydrologic models due to its effects on energy and moisture budgets. Seasonal snow can cover more than 50% of the Northern Hemisphere land surface during the winter, resulting in snow cover being the land surface characteristic responsible for the largest annual and interannual differences in albedo. Passive microwave satellite remote sensing can augment measurements based on visible satellite data alone because of its ability to acquire data through most clouds or during darkness, as well as to provide a measure of snow depth or water equivalent. It is now possible to monitor the global fluctuation of snow cover over a 24 year period using passive microwave data (Scanning Multichannel Microwave Radiometer (SMMR), 1978-1987, and Special Sensor Microwave/Imager (SSM/I), 1987-present). Evaluation of snow extent derived from passive microwave algorithms is presented through comparison with the NOAA Northern Hemisphere snow extent data. For the period 1978 to 2002, both passive microwave and visible data sets show a similar pattern of inter-annual variability, although the maximum snow extents derived from the microwave data are consistently less than those provided by the visible satellite data, and the visible data typically show higher monthly variability. During shallow snow conditions of the early winter season, microwave data consistently indicate less snow-covered area than the visible data. This underestimate of snow extent results from the fact that shallow snow cover (less than about 5.0 cm) does not provide a scattering signal of sufficient strength to be detected by the algorithms. As the snow cover continues to build during the months of January through March, as well as on into the melt season, agreement between the two data types continually improves.
This occurs because as the snow becomes deeper and the layered structure more complex, the negative spectral gradient driving the passive microwave algorithm is enhanced. Trends in annual averages are similar, decreasing at rates of approximately 2% per decade. The only region where the passive microwave data consistently indicate snow and the visible data do not is over the Tibetan Plateau and surrounding mountain areas. In the effort to determine the accuracy of the microwave algorithm over this region we are acquiring surface snow observations through a collaborative study with CAREERI/Lanzhou. In order to provide an optimal snow cover product in the future, we are developing a procedure that blends snow extent maps derived from MODIS data with snow water equivalent maps derived from both SSM/I and AMSR.
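    The negative spectral gradient driving such retrievals can be sketched as follows; the Chang-style coefficient of roughly 4.77 mm/K is a commonly cited illustrative value, not necessarily the one used in this study:

```python
def swe_from_spectral_gradient(tb_19h, tb_37h, coeff=4.77):
    """Snow water equivalent (mm) from the spectral gradient between 19 and
    37 GHz horizontally polarized brightness temperatures (kelvin). Deep snow
    scatters the 37 GHz signal more strongly, so the gradient grows with
    snow mass; negative gradients are clipped to zero (no snow signal)."""
    gradient = tb_19h - tb_37h
    return max(0.0, coeff * gradient)

shallow = swe_from_spectral_gradient(250.0, 249.0)  # weak signal, near the
                                                    # detection limit
deep = swe_from_spectral_gradient(250.0, 230.0)     # strong scattering
bare = swe_from_spectral_gradient(240.0, 245.0)     # no snow signal
```

    The shallow-snow underestimate described above corresponds to the regime where this gradient is too small to exceed the algorithm's detection threshold.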

  13. Infrared and visible image fusion based on robust principal component analysis and compressed sensing

    NASA Astrophysics Data System (ADS)

    Li, Jun; Song, Minghui; Peng, Yuanxi

    2018-03-01

    Current infrared and visible image fusion methods do not achieve adequate information extraction, i.e., they cannot extract the target information from infrared images while retaining the background information from visible images. Moreover, most of them have high complexity and are time-consuming. This paper proposes an efficient image fusion framework for infrared and visible images on the basis of robust principal component analysis (RPCA) and compressed sensing (CS). The novel framework consists of three phases. First, RPCA decomposition is applied to the infrared and visible images to obtain their sparse and low-rank components, which represent the salient features and background information of the images, respectively. Second, the sparse and low-rank coefficients are fused by different strategies. On the one hand, the measurements of the sparse coefficients are obtained by the random Gaussian matrix, and they are then fused by the standard deviation (SD) based fusion rule. Next, the fused sparse component is obtained by reconstructing the result of the fused measurement using the fast continuous linearized augmented Lagrangian algorithm (FCLALM). On the other hand, the low-rank coefficients are fused using the max-absolute rule. Subsequently, the fused image is superposed by the fused sparse and low-rank components. For comparison, several popular fusion algorithms are tested experimentally. By comparing the fused results subjectively and objectively, we find that the proposed framework can extract the infrared targets while retaining the background information in the visible images. Thus, it exhibits state-of-the-art performance in terms of both fusion effects and timeliness.
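    Of the two fusion strategies described, the max-absolute rule applied to the low-rank coefficients is the simplest to illustrate (the coefficient vectors below are invented toy data):

```python
def fuse_max_abs(coeffs_a, coeffs_b):
    """Max-absolute fusion rule: at each position keep whichever coefficient
    has the larger magnitude, so the stronger structure from either source
    survives in the fused result."""
    return [a if abs(a) >= abs(b) else b for a, b in zip(coeffs_a, coeffs_b)]

infrared_lowrank = [0.2, -0.9, 0.1, 0.4]
visible_lowrank = [0.5, 0.3, -0.2, 0.4]
fused = fuse_max_abs(infrared_lowrank, visible_lowrank)
```

    In the full framework this rule fuses the low-rank (background) components, while the sparse (target) components go through the CS measurement-and-reconstruction path.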

  14. Accelerated speckle imaging with the ATST visible broadband imager

    NASA Astrophysics Data System (ADS)

    Wöger, Friedrich; Ferayorni, Andrew

    2012-09-01

    The Advanced Technology Solar Telescope (ATST), a 4 meter class telescope for observations of the solar atmosphere currently in its construction phase, will generate data at rates of the order of 10 TB/day with its state-of-the-art instrumentation. The high-priority ATST Visible Broadband Imager (VBI) instrument alone will create two data streams with a bandwidth of 960 MB/s each. Because of the related data handling issues, these data will be post-processed with speckle interferometry algorithms in near-real time at the telescope using cost-effective Graphics Processing Unit (GPU) technology that is supported by the ATST Data Handling System. In this contribution, we lay out the VBI-specific approach to its image processing pipeline, put this into the context of the underlying ATST Data Handling System infrastructure, and describe in detail how the algorithms were redesigned to exploit data parallelism in the speckle image reconstruction. An algorithm redesign is often required to efficiently speed up an application using GPU technology; we have chosen NVIDIA's CUDA language as the basis for our implementation. We present preliminary results on the algorithms' performance using our test facilities and, based on these results, give a conservative estimate of the requirements for a full system that could achieve near-real-time performance at the ATST.

  15. Experimental results for correlation-based wavefront sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poyneer, L A; Palmer, D W; LaFortune, K N

    2005-07-01

    Correlation wave-front sensing can improve Adaptive Optics (AO) system performance in two key areas. For point-source-based AO systems, Correlation is more accurate, more robust to changing conditions, and provides lower noise than a centroiding algorithm. Experimental results from the Lick AO system and the SSHCL laser AO system confirm this. For remote imaging, Correlation enables the use of extended objects for wave-front sensing. Results from short horizontal-path experiments show the algorithm's properties and requirements.
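    The essence of correlation wave-front sensing, locating the correlation peak between a reference and a live subaperture image, can be sketched in one dimension with invented intensity profiles:

```python
def correlation_shift(reference, live, max_shift=5):
    """Estimate the translation between two 1D intensity profiles by locating
    the peak of their cross-correlation. In a Shack-Hartmann sensor this
    shift per subaperture gives the local wave-front slope."""
    n = len(reference)
    best_shift, best_score = 0, float("-inf")
    for shift in range(-max_shift, max_shift + 1):
        score = sum(reference[i] * live[i + shift]
                    for i in range(n)
                    if 0 <= i + shift < n)
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

ref = [0, 0, 1, 3, 1, 0, 0, 0, 0]
live = [0, 0, 0, 0, 1, 3, 1, 0, 0]  # same feature moved 2 samples right
shift = correlation_shift(ref, live)
```

    Unlike a centroid, the correlation peak uses the whole scene structure, which is why it works on extended objects and degrades gracefully in low light.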

  16. Improving Nocturnal Fire Detection with the VIIRS Day-Night Band

    NASA Technical Reports Server (NTRS)

    Polivka, Thomas N.; Wang, Jun; Ellison, Luke T.; Hyer, Edward J.; Ichoku, Charles M.

    2016-01-01

    Building on existing techniques for satellite remote sensing of fires, this paper takes advantage of the day-night band (DNB) aboard the Visible Infrared Imaging Radiometer Suite (VIIRS) to develop the Firelight Detection Algorithm (FILDA), which characterizes fire pixels based on both visible-light and infrared (IR) signatures at night. By adjusting the fire pixel selection criteria to include visible-light signatures, FILDA allows for significantly improved detection of pixels with smaller and/or cooler subpixel hotspots than the operational Interface Data Processing System (IDPS) algorithm. VIIRS scenes with near-coincident Advanced Spaceborne Thermal Emission and Reflection (ASTER) overpasses are examined after applying the operational VIIRS fire product algorithm and including a modified "candidate fire pixel selection" approach from FILDA that lowers the 4-µm brightness temperature (BT) threshold but includes a minimum DNB radiance. FILDA is shown to be effective in detecting gas flares and characterizing fire lines during large forest fires (such as the Rim Fire in California and the High Park fire in Colorado). Compared with the operational VIIRS fire algorithm for the study period, FILDA shows a large increase (up to 90%) in the number of detected fire pixels that can be verified with the finer resolution ASTER data (90 m). Part (30%) of this increase is likely due to the combined use of the DNB and lower 4-µm BT thresholds for fire detection in FILDA. Although further studies are needed, quantitative use of the DNB to improve fire detection could lead to reduced response times to wildfires and better estimates of fire characteristics (smoldering and flaming) at night.
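    The dual-criteria candidate selection can be sketched as follows; all threshold values are illustrative assumptions, not the operational FILDA constants:

```python
def is_candidate_fire_pixel(bt_4um, dnb_radiance,
                            bt_thresh=320.0, bt_thresh_low=305.0,
                            dnb_min=5e-9):
    """Sketch of the dual-criteria candidate selection described above: a
    pixel qualifies on the 4-um brightness temperature (kelvin) alone, or on
    a lower BT threshold combined with a minimum visible-light (DNB)
    radiance. The threshold values here are invented for illustration."""
    if bt_4um >= bt_thresh:
        return True
    return bt_4um >= bt_thresh_low and dnb_radiance >= dnb_min

hot = is_candidate_fire_pixel(330.0, 0.0)          # IR alone suffices
glowing = is_candidate_fire_pixel(310.0, 1e-8)     # cooler, but visibly lit
dark_warm = is_candidate_fire_pixel(310.0, 1e-10)  # warm pixel, no light
```

    The second branch is what lets FILDA keep smaller or cooler subpixel hotspots that a BT-only threshold would reject.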

  17. Loosely coupled level sets for retinal layers and drusen segmentation in subjects with dry age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Novosel, Jelena; Wang, Ziyuan; de Jong, Henk; Vermeer, Koenraad A.; van Vliet, Lucas J.

    2016-03-01

    Optical coherence tomography (OCT) is used to produce high-resolution three-dimensional images of the retina, which permit the investigation of retinal irregularities. In dry age-related macular degeneration (AMD), a chronic eye disease that causes central vision loss, disruptions such as drusen and changes in retinal layer thicknesses occur which could be used as biomarkers for disease monitoring and diagnosis. Due to the topology disrupting pathology, existing segmentation methods often fail. Here, we present a solution for the segmentation of retinal layers in dry AMD subjects by extending our previously presented loosely coupled level sets framework which operates on attenuation coefficients. In eyes affected by AMD, Bruch's membrane becomes visible only below the drusen and our segmentation framework is adapted to delineate such a partially discernible interface. Furthermore, the initialization stage, which tentatively segments five interfaces, is modified to accommodate the appearance of drusen. This stage is based on Dijkstra's algorithm and combines prior knowledge on the shape of the interface, gradient and attenuation coefficient in the newly proposed cost function. This prior knowledge is incorporated by varying the weights for horizontal, diagonal and vertical edges. Finally, quantitative evaluation of the accuracy shows a good agreement between manual and automated segmentation.
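    The Dijkstra-based initialization with direction-dependent prior weights can be sketched on a small cost grid; the weights and cost values below are illustrative assumptions, not the paper's actual cost function:

```python
import heapq

def shortest_interface_path(cost, w_h=1.0, w_d=1.4, w_v=2.0):
    """Dijkstra's algorithm across an image-column graph, as in the
    initialization stage described above: an interface runs from the left
    edge to the right edge, and horizontal, diagonal and vertical moves
    carry different prior weights that encode shape knowledge."""
    rows, cols = len(cost), len(cost[0])
    settled = {}
    heap = [(cost[r][0], r, 0) for r in range(rows)]  # may start on any left-edge cell
    heapq.heapify(heap)
    while heap:
        d, r, c = heapq.heappop(heap)
        if (r, c) in settled:
            continue
        settled[(r, c)] = d
        if c == cols - 1:
            return d  # the first right-edge node settled has the minimal cost
        moves = [(0, 1, w_h), (-1, 1, w_d), (1, 1, w_d), (-1, 0, w_v), (1, 0, w_v)]
        for dr, dc, w in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in settled:
                heapq.heappush(heap, (d + w * cost[nr][nc], nr, nc))
    return None

# Low-cost cells mark a (hypothetical) layer interface; the path hugs them.
cost = [[9, 9, 9, 9],
        [1, 1, 9, 9],
        [2, 9, 1, 1]]
total = shortest_interface_path(cost)
```

    Penalizing vertical moves more than horizontal ones encodes the prior that retinal interfaces are predominantly horizontal, even when drusen deform them locally.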

  18. Airport Traffic Conflict Detection and Resolution Algorithm Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.; Chartrand, Ryan C.; Wilson, Sara R.; Commo, Sean A.; Ballard, Kathryn M.; Otero, Sharon D.; Barker, Glover D.

    2016-01-01

    Two conflict detection and resolution (CD&R) algorithms for the terminal maneuvering area (TMA) were evaluated in a fast-time batch simulation study at the National Aeronautics and Space Administration (NASA) Langley Research Center. One CD&R algorithm, developed at NASA, was designed to enhance surface situation awareness and provide cockpit alerts of potential conflicts during runway, taxi, and low altitude air-to-air operations. The second algorithm, Enhanced Traffic Situation Awareness on the Airport Surface with Indications and Alerts (SURF IA), was designed to increase flight crew awareness of the runway environment and facilitate an appropriate and timely response to potential conflict situations. The purpose of the study was to evaluate the performance of the aircraft-based CD&R algorithms during various runway, taxiway, and low altitude scenarios, with multiple levels of CD&R system equipage and various levels of horizontal position accuracy. Algorithm performance was assessed through various metrics including the collision rate, nuisance and missed alert rate, and alert toggling rate. The data suggest that, in general, alert toggling, nuisance and missed alerts, and unnecessary maneuvering occurred more frequently as the position accuracy was reduced. Collision avoidance was more effective when all of the aircraft were equipped with CD&R and maneuvered to avoid a collision after an alert was issued. In order to reduce the number of unwanted (nuisance) alerts when taxiing across a runway, a buffer is needed between the hold line and the alerting zone so alerts are not generated when an aircraft is behind the hold line. All of the results support RTCA horizontal position accuracy requirements for performing a CD&R function to reduce the likelihood and severity of runway incursions and collisions.

  19. iTrack: instrumented mobile electrooculography (EOG) eye-tracking in older adults and Parkinson's disease.

    PubMed

    Stuart, Samuel; Hickey, Aodhán; Galna, Brook; Lord, Sue; Rochester, Lynn; Godfrey, Alan

    2017-01-01

    Detection of saccades (fast eye-movements) within raw mobile electrooculography (EOG) data involves complex algorithms which typically process data acquired during seated static tasks only. Processing of data during dynamic tasks such as walking is relatively rare and complex, particularly in older adults or people with Parkinson's disease (PD). Development of algorithms that can be easily implemented to detect saccades is required. This study aimed to develop an algorithm for the detection and measurement of saccades in EOG data during static (sitting) and dynamic (walking) tasks, in older adults and PD. Eye-tracking via mobile EOG and an infrared (IR) eye-tracker (with video) was performed with a group of older adults (n = 10) and PD participants (n = 10) (⩾50 years). Horizontal saccades made between targets set 5°, 10° and 15° apart were first measured while seated. Horizontal saccades were then measured while a participant walked and executed a 40° turn left and right. The EOG algorithm was evaluated by comparing the number of correct saccade detections and agreement (ICC(2,1)) between output from visual inspection of eye-tracker videos and the IR eye-tracker. The EOG algorithm detected 75-92% of saccades compared to video inspection and IR output during static testing, with fair to excellent agreement (ICC(2,1) 0.49-0.93). However, during walking, EOG saccade detection fell to 42-88% compared to video inspection or IR output, with poor to excellent agreement (ICC(2,1) 0.13-0.88) between methodologies. The algorithm was robust during seated testing but less so during walking, which was likely due to increased measurement and analysis error with a dynamic task. Future studies may consider a combination of EOG and IR for comprehensive measurement.
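    A common starting point for such algorithms is a velocity-threshold detector. The sketch below is our own simplification with an assumed 50 deg/s threshold, not the algorithm developed in the paper; it flags onsets where the EOG velocity crosses the threshold:

```python
import numpy as np

def detect_saccades(eog, fs, v_thresh=50.0):
    """Return sample indices where saccades begin in a 1-D EOG trace.

    eog: eye position in degrees; fs: sampling rate in Hz.  A saccade
    onset is the first sample of a run whose absolute velocity exceeds
    v_thresh (deg/s); the 50 deg/s default is illustrative only."""
    vel = np.abs(np.gradient(eog) * fs)          # deg/s
    above = vel > v_thresh
    padded = np.concatenate(([False], above))
    # rising edges of the supra-threshold mask
    return np.flatnonzero(~padded[:-1] & padded[1:])

# toy trace at 100 Hz: fixation, a 10-degree saccade, fixation
eog = np.concatenate([np.zeros(50), np.linspace(0, 10, 6)[1:], np.full(50, 10.0)])
onsets = detect_saccades(eog, fs=100.0)
```

During walking, motion artifacts raise the baseline velocity, which is one reason a fixed threshold degrades in dynamic tasks.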

  20. Ash loading and insolation at Hanford, Washington during and after the eruption of Mount St. Helens

    NASA Technical Reports Server (NTRS)

    Laulainen, N. S.

    1982-01-01

    The effects of volcanic ash suspended in the atmosphere on the incident solar radiation were monitored at the Hanford Meteorological Station (HMS) subsequent to the major eruption of Mount St. Helens on May 18, 1980. Passage of the ash plume over Hanford resulted in a very dramatic decrease of solar radiation intensity to zero. A reduction in visibility to less than 1 km was observed, as great quantities of ash fell out of the plume onto the ground. Ash loading in the atmosphere remained very high for several days following the eruption, primarily as a result of resuspension from the surface. Visibilities remained low (2 to 8 km) during this period. Estimates of atmospheric turbidity were made from the ratio of diffuse-to-direct solar radiation; these turbidities were used to estimate extinction along a horizontal path, a quantity which can be related to visibility. Comparisons of observed and estimated visibilities were very good, in spite of the rather coarse approximations used in the estimates. Atmospheric clarity and visibility improved to near pre-eruption conditions following a period of rain showers. The diffuse-to-direct ratio of solar radiation provided a useful index for estimating volcanic ash loading of the atmosphere.
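    The link between horizontal extinction and visibility invoked above is conventionally the Koschmieder relation; a one-line sketch with the standard 2% contrast threshold:

```python
import math

def visibility_km(beta_per_km, contrast_threshold=0.02):
    """Koschmieder relation: meteorological visual range (km) from the
    horizontal extinction coefficient (km^-1).  With the conventional
    2% contrast threshold this is the familiar V = 3.912 / beta."""
    return math.log(1.0 / contrast_threshold) / beta_per_km
```

Under this relation, the 2 to 8 km visibilities reported above correspond to horizontal extinction coefficients of roughly 2 down to 0.5 per km.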

  1. A method for evaluating horizontal well pumping tests.

    PubMed

    Langseth, David E; Smyth, Andrew H; May, James

    2004-01-01

    Predicting the future performance of horizontal wells under varying pumping conditions requires estimates of basic aquifer parameters, notably transmissivity and storativity. For vertical wells, there are well-established methods for estimating these parameters, typically based on either the recovery from induced head changes in a well or from the head response in observation wells to pumping in a test well. Comparable aquifer parameter estimation methods for horizontal wells have not been presented in the ground water literature. Formation parameter estimation methods based on measurements of pressure in horizontal wells have been presented in the petroleum industry literature, but these methods have limited applicability for ground water evaluation and are based on pressure measurements in only the horizontal well borehole, rather than in observation wells. This paper presents a simple and versatile method by which pumping test procedures developed for vertical wells can be applied to horizontal well pumping tests. The method presented here uses the principle of superposition to represent the horizontal well as a series of partially penetrating vertical wells. This concept is used to estimate a distance from an observation well at which a vertical well that has the same total pumping rate as the horizontal well will produce the same drawdown as the horizontal well. This equivalent distance may then be associated with an observation well for use in pumping test algorithms and type curves developed for vertical wells. The method is shown to produce good results for confined aquifers and unconfined aquifers in the absence of delayed yield response. For unconfined aquifers, the presence of delayed yield response increases the method error.
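    The superposition construction can be sketched numerically: represent the horizontal well as n point wells each pumping Q/n, sum their Theis drawdowns at the observation point, and solve for the single-well distance that reproduces that drawdown. The sketch below is our own, for a confined aquifer with illustrative parameter choices and a series approximation of the well function; it is not the authors' implementation:

```python
import numpy as np

def well_function(u, terms=30):
    """Theis well function W(u) via its convergent series expansion
    (adequate for the small arguments typical of pumping tests)."""
    s = -0.5772156649 - np.log(u) + u
    term = u
    for n in range(2, terms):
        term *= -u * (n - 1) / (n * n)
        s += term
    return s

def theis_drawdown(r, t, Q, T, S):
    """Drawdown at radius r (m) and time t (s) for pumping rate Q (m^3/s),
    transmissivity T (m^2/s) and storativity S."""
    return Q / (4.0 * np.pi * T) * well_function(r * r * S / (4.0 * T * t))

def equivalent_distance(obs_xy, well_ends, n_seg, t, Q, T, S):
    """Distance at which a single vertical well pumping Q produces the
    same drawdown as a horizontal well modelled as n_seg point wells."""
    (x0, y0), (x1, y1) = well_ends
    xs, ys = np.linspace(x0, x1, n_seg), np.linspace(y0, y1, n_seg)
    rs = np.hypot(xs - obs_xy[0], ys - obs_xy[1])
    target = theis_drawdown(rs, t, Q / n_seg, T, S).sum()
    lo, hi = rs.min() / 10.0, rs.max() * 10.0   # drawdown falls with r
    for _ in range(80):                         # bisection
        mid = 0.5 * (lo + hi)
        if theis_drawdown(mid, t, Q, T, S) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The returned equivalent distance necessarily lies between the nearest and farthest well segments, and can then be fed into standard vertical-well type curves.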

  2. Olympus Mons

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    The largest volcano on Mars, and the largest in the solar system, centered at 18.4 °N, 133.1 °W. It coincides with the albedo feature visible from Earth known as Nix Olympica, the `Snows of Olympus', after the mountain in Greece which in legend was the home of the gods. Olympus Mons rises to a height of 27 km above Mars's mean surface level, and in its greatest horizontal dimension measures 624 k...

  3. Horizontal visibility graphs generated by type-I intermittency

    NASA Astrophysics Data System (ADS)

    Núñez, Ángel M.; Luque, Bartolo; Lacasa, Lucas; Gómez, Jose Patricio; Robledo, Alberto

    2013-05-01

    The type-I intermittency route to (or out of) chaos is investigated within the horizontal visibility (HV) graph theory. For that purpose, we address the trajectories generated by unimodal maps close to an inverse tangent bifurcation and construct their associated HV graphs. We show how the alternation of laminar episodes and chaotic bursts imprints a fingerprint in the resulting graph structure. Accordingly, we derive a phenomenological theory that predicts quantitative values for several network parameters. In particular, we predict that the characteristic power-law scaling of the mean length of laminar trend sizes is fully inherited by the variance of the graph degree distribution, in good agreement with the numerics. We also report numerical evidence on how the characteristic power-law scaling of the Lyapunov exponent as a function of the distance to the tangent bifurcation is inherited in the graph by an analogous scaling of block entropy functionals defined on the graph. Furthermore, we are able to recast the full set of HV graphs generated by intermittent dynamics into a renormalization-group framework, where the fixed points of its graph-theoretical renormalization-group flow account for the different types of dynamics. We also establish that the nontrivial fixed point of this flow coincides with the tangency condition and that the corresponding invariant graph exhibits extremal entropic properties.
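    The underlying horizontal visibility construction can be sketched in a few lines; this is the generic O(n^2) criterion, not the renormalization-group machinery of the paper:

```python
def horizontal_visibility_graph(series):
    """Edge list of the horizontal visibility graph of a time series:
    samples i < j are linked iff every sample strictly between them is
    lower than both values (the horizontal visibility criterion)."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))       # consecutive samples always linked
        top = series[i + 1]         # running max of the gap between i and j
        for j in range(i + 2, n):
            if series[i] > top and series[j] > top:
                edges.add((i, j))
            top = max(top, series[j])
    return sorted(edges)

def degrees(series):
    """Degree sequence, e.g. for estimating the degree distribution."""
    k = [0] * len(series)
    for i, j in horizontal_visibility_graph(series):
        k[i] += 1
        k[j] += 1
    return k
```

Applied to the laminar/chaotic trajectories discussed above, the variance of the resulting degree sequence is the network quantity predicted to inherit the power-law scaling of the laminar trend lengths.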

  4. In vitro reproduction of incisal/occlusal cupping/cratering.

    PubMed

    Dzakovich, John J; Oslak, Robert R

    2013-06-01

    Occlusal cupping/cratering (depressed dentin surrounded by elevated rims of enamel) has been postulated to be the result of abrasion, bruxism, attrition, acid erosion, stress corrosion, or a combination of these. The primary etiology or the multifactorial sequence of occlusal cupping/cratering remains scientifically unsubstantiated. The purpose of this study was to reproduce occlusal/incisal cupping/cratering in vitro. This study was designed to create cupping/cratering on the occlusal surfaces of extracted human teeth rather than to quantify the amount of lost tooth structure caused by abrasion. One name-brand toothbrush was tested with 2 different dentifrices (of different abrasive potentials [low and high]) and water only (nonabrasive) on extracted human teeth. Six specimens of 4 teeth each (24 teeth) were subjected to horizontal brushing in a 1:1 toothpaste/water slurry and water only. The control group, brushed with water only, demonstrated no visible loss of tooth structure. Each of the specimens brushed with toothpaste, regardless of the degree of abrasivity, demonstrated visible wear of the dentin, resulting in occlusal/incisal cupping/cratering. Pronounced cupping/cratering was caused by horizontal brushing with commercial toothpastes. Brushing in water demonstrated no visual loss of occlusal tooth structure. (J Prosthet Dent 2013;109:384-391). Copyright © 2013 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  5. Development of visible/infrared/microwave agriculture classification and biomass estimation algorithms, volume 2. [Oklahoma and Texas

    NASA Technical Reports Server (NTRS)

    Rosenthal, W. D.; Mcfarland, M. J.; Theis, S. W.; Jones, C. L. (Principal Investigator)

    1982-01-01

    Agricultural crop classification models using two or more spectral regions (visible through microwave) were developed and tested and biomass was estimated by including microwave with visible and infrared data. The study was conducted at Guymon, Oklahoma and Dalhart, Texas utilizing aircraft multispectral data and ground truth soil moisture and biomass information. Results indicate that inclusion of C, L, and P band active microwave data from look angles greater than 35 deg from nadir with visible and infrared data improved crop discrimination and biomass estimates compared to results using only visible and infrared data. The active microwave frequencies were sensitive to different biomass levels. In addition, two indices, one using only active microwave data and the other using data from the middle and near infrared bands, were well correlated to total biomass.

  6. Modified retrieval algorithm for three types of precipitation distribution using x-band synthetic aperture radar

    NASA Astrophysics Data System (ADS)

    Xie, Yanan; Zhou, Mingliang; Pan, Dengke

    2017-10-01

    The forward-scattering model is introduced to describe the response of the normalized radar cross section (NRCS) of precipitation observed with synthetic aperture radar (SAR). Since the distribution of near-surface rainfall is related to the near-surface rainfall rate and a horizontal distribution factor, a retrieval algorithm called modified regression empirical and model-oriented statistical (M-M), based on Volterra integration theory, is proposed. Compared with the model-oriented statistical and Volterra integration (MOSVI) algorithm, the key difference is that the M-M algorithm retrieves the near-surface rainfall rate with a modified regression empirical algorithm rather than a linear regression formula. Half of the empirical parameters in the weighted integral are eliminated, and a smaller average relative error is achieved for rainfall rates below 100 mm/h. The proposed algorithm can therefore provide high-precision rainfall information.

  7. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.
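    The soundness-and-completeness claim for polynomial trajectories can be illustrated with elementary root finding: the squared horizontal separation minus D^2 is itself a polynomial, so loss of separation on [0, T] is decidable up to root-finding precision. A minimal, unverified sketch in the same spirit as, but not identical to, the paper's formally verified algorithm:

```python
import numpy as np
from numpy.polynomial import polynomial as P

def horizontal_conflict(px, py, D, T):
    """Decide whether two aircraft with polynomial relative position
    (px, py: coefficient lists, lowest order first) lose horizontal
    separation D at some time in [0, T]."""
    # g(t) = px(t)^2 + py(t)^2 - D^2 is a polynomial in t
    g = P.polysub(P.polyadd(P.polymul(px, px), P.polymul(py, py)), [D**2])
    ts = [0.0, T]
    for r in np.roots(g[::-1]):      # numpy.roots wants highest order first
        if abs(r.imag) < 1e-9 and 0.0 < r.real < T:
            ts.append(r.real)
    ts = sorted(ts)
    # g has constant sign between consecutive roots, so sampling the
    # endpoints and interval midpoints decides the question
    samples = ts + [0.5 * (a + b) for a, b in zip(ts, ts[1:])]
    return any(P.polyval(t, g) < 0.0 for t in samples)
```

Floating-point root finding is the weak point of this sketch; the paper's contribution is precisely a decision procedure whose soundness and completeness are machine-checked rather than numerical.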

  8. Identification of periods of clear sky irradiance in time series of GHI measurements

    DOE PAGES

    Reno, Matthew J.; Hansen, Clifford W.

    2016-01-18

    In this study, we present a simple algorithm for identifying periods of time with broadband global horizontal irradiance (GHI) similar to that occurring during clear sky conditions from a time series of GHI measurements. Other available methods to identify these periods do so by identifying periods with clear sky conditions using additional measurements, such as direct or diffuse irradiance. Our algorithm compares characteristics of the time series of measured GHI with the output of a clear sky model without requiring additional measurements. We validate our algorithm using data from several locations by comparing our results with those obtained from a clear sky detection algorithm, and with satellite and ground-based sky imagery.
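    The window-comparison idea can be sketched as follows; the window length and tolerances below are illustrative placeholders, not the published criteria:

```python
import numpy as np

def clear_periods(ghi, ghi_clear, win=10, mean_tol=75.0, max_tol=75.0,
                  slope_tol=8.0):
    """Flag samples whose sliding window of measured GHI agrees with a
    clear-sky model in mean, maximum and point-to-point slope.  The
    window length and tolerances are illustrative, not the fitted ones."""
    ghi, ghi_clear = np.asarray(ghi, float), np.asarray(ghi_clear, float)
    clear = np.zeros(len(ghi), dtype=bool)
    for i in range(len(ghi) - win + 1):
        m, c = ghi[i:i + win], ghi_clear[i:i + win]
        if (abs(m.mean() - c.mean()) < mean_tol and
                abs(m.max() - c.max()) < max_tol and
                np.max(np.abs(np.diff(m) - np.diff(c))) < slope_tol):
            clear[i:i + win] = True      # whole window looks clear
    return clear

# toy check: constant clear-sky level with a 5-sample cloud dip
ghi_clear = np.full(30, 800.0)
ghi = ghi_clear.copy()
ghi[15:20] = 400.0
flags = clear_periods(ghi, ghi_clear)
```

To the best of our knowledge, a maintained implementation of the published algorithm, with its fitted thresholds, is available as `pvlib.clearsky.detect_clearsky` in the pvlib-python library.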

  10. Representing Visibility for Siting Problems

    DTIC Science & Technology

    1994-04-01

    Figure 4.13, Alternative Tessellation of R2 Observations; Figure 4.14, De-Cluttered Alternative Tessellation from R3. …basis for a component of the next release of the AirLand Battle Environment (ALBE) software suite. TEC is also using algorithms developed as part… The details of calculating a LOS often receive relatively small mention in descriptions of visibility analyses but can have a…

  11. Simulating Soft Shadows with Graphics Hardware,

    DTIC Science & Technology

    1997-01-15

    This radiance texture is analogous to the mesh of radiosity values computed in a radiosity algorithm. Unlike a radiosity algorithm, however, our… discretely. Several researchers have explored continuous visibility methods for soft shadow computation and radiosity mesh generation. With this approach… times of several seconds [9]. Most radiosity methods discretize each surface into a mesh of elements and then use discrete methods such as ray…

  12. Forming positive-negative images using conditioned partial measurements from reference arm in ghost imaging.

    PubMed

    Wen, Jianming

    2012-09-01

    A recent thermal ghost imaging experiment implemented in Wu's group [Chin. Phys. Lett. 279, 074216 (2012)] showed that both positive and negative images can be constructed by applying a novel algorithm. This algorithm allows us to form the images using partial measurements from the reference arm (even though they never pass through the object), conditioned on the object arm. In this paper, we present a simple theory that explains the experimental observation and provides an in-depth understanding of conventional ghost imaging. In particular, we theoretically show that the visibility of images formed through such an algorithm is not bounded by the standard value 1/3. In fact, it can ideally grow up to unity (with reduced imaging quality). Thus, the algorithm described here not only offers an alternative way to decode the spatial correlation of thermal light, but also mimics a "bandpass filter" that removes the constant background, so that the visibility, or imaging contrast, is improved. We further show that, conditioned on one still object present in the test arm, it is possible to construct the object's image by sampling the available reference data.
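    The conditioning scheme can be mimicked with a toy numerical experiment: average reference speckle frames conditioned on whether the bucket (object-arm) signal sits above or below its mean. The setup below is our own illustration, not the experiment's optical configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def conditional_ghost(frames, bucket):
    """Average reference-arm frames conditioned on the object-arm
    (bucket) signal: above-mean frames build a positive image and the
    remaining frames a negative one."""
    hi = bucket > bucket.mean()
    return frames[hi].mean(axis=0), frames[~hi].mean(axis=0)

# toy demo: random speckle and a single transmitting 'object' pixel
frames = rng.random((2000, 8, 8))
obj = np.zeros((8, 8))
obj[3, 4] = 1.0
bucket = (frames * obj).sum(axis=(1, 2))   # light collected behind the object
pos, neg = conditional_ghost(frames, bucket)
```

In the positive image the object pixel stands out above the uniform background, while in the negative image it dips below it, which is the positive/negative pairing described above.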

  13. An improvement of the measurement of time series irreversibility with visibility graph approach

    NASA Astrophysics Data System (ADS)

    Wu, Zhenyu; Shang, Pengjian; Xiong, Hui

    2018-07-01

    We propose a method to improve the measurement of real-valued time series irreversibility that combines two tools: the directed horizontal visibility graph and the Kullback-Leibler divergence. The degree of time irreversibility is estimated by the Kullback-Leibler divergence between the in- and out-degree distributions of the associated visibility graph. In our work, we reframe the in- and out-degree distributions by encoding them with the different embedding dimensions used in calculating permutation entropy (PE). With this improved method, we can not only estimate time series irreversibility efficiently, but also detect it across multiple dimensions. We verify the validity of our method and then estimate the amount of time irreversibility of series generated by chaotic maps as well as global stock markets over the period 2005-2015. The results show that the amount of time irreversibility peaks at embedding dimension d = 3 for both the simulated series and the financial markets.
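    The core measurement, without the permutation-entropy-style embedding refinement proposed in the paper, can be sketched as follows:

```python
import numpy as np

def directed_hvg_degrees(x):
    """In- and out-degree sequences of the directed horizontal
    visibility graph (edges point forward in time)."""
    n = len(x)
    k_in, k_out = np.zeros(n, int), np.zeros(n, int)
    for i in range(n - 1):
        top = -np.inf               # running max strictly between i and j
        for j in range(i + 1, n):
            if x[i] > top and x[j] > top:
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if top >= x[i]:
                break               # nothing beyond j is visible from i
    return k_in, k_out

def kld_irreversibility(x):
    """Kullback-Leibler divergence between the out- and in-degree
    distributions; zero for a statistically reversible series.  Bins
    empty in either distribution are skipped, a common convention."""
    k_in, k_out = directed_hvg_degrees(x)
    kmax = int(max(k_in.max(), k_out.max()))
    p_in = np.bincount(k_in, minlength=kmax + 1) / len(x)
    p_out = np.bincount(k_out, minlength=kmax + 1) / len(x)
    m = (p_in > 0) & (p_out > 0)
    return float(np.sum(p_out[m] * np.log(p_out[m] / p_in[m])))

# demo: logistic-map series (dissipative chaotic dynamics)
x = [0.3]
for _ in range(499):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))
kld_chaos = kld_irreversibility(np.array(x))
```

A monotone series has identical in- and out-degree histograms, so its divergence is exactly zero, which makes a convenient sanity check.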

  14. Coordination Logic for Repulsive Resolution Maneuvers

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony J.; Munoz, Cesar A.; Dutle, Aaron M.

    2016-01-01

    This paper presents an algorithm for determining the direction an aircraft should maneuver in the event of a potential conflict with another aircraft. The algorithm is implicitly coordinated, meaning that with perfectly reliable computations and information, it will independently provide directional information that is guaranteed to be coordinated without any additional information exchange or direct communication. The logic is inspired by the logic of TCAS II, the airborne system designed to reduce the risk of mid-air collisions between aircraft. TCAS II provides pilots with only vertical resolution advice, while the proposed algorithm, using a similar logic, provides implicitly coordinated vertical and horizontal directional advice.

  15. Hydrotropism and its interaction with gravitropism in maize roots

    NASA Technical Reports Server (NTRS)

    Takahashi, H.; Scott, T. K.

    1991-01-01

    We have partially characterized root hydrotropism and its interaction with gravitropism in maize (Zea mays L.). Roots of Golden Cross Bantam 70, which require light for orthogravitropism, showed positive hydrotropism, bending upward when placed horizontally below a hydrostimulant (moist cheesecloth) in 85% relative humidity (RH) and in total darkness. However, the light-exposed roots of Golden Cross Bantam 70 or roots of a normal maize cultivar, Burpee Snow Cross, showed positive gravitropism under the same conditions, bending downward when placed horizontally below the hydrostimulant in 85% RH. Light-exposed roots of Golden Cross Bantam 70 placed at 70 degrees below the horizontal plane responded positively hydrotropically, but gravitropism overcame the hydrotropism when the roots were placed at 45 degrees below the horizontal. Roots placed vertically with the tip down in 85% RH bent to the side toward the hydrostimulant in both cultivars, and light conditions did not affect the response. Such vertical roots did not respond when the humidity was maintained near saturation. These results suggest that hydrotropic and gravitropic responses interact with one another depending on the intensity of one or both factors. Removal of the approximately 1.5 millimeter root tip blocked both hydrotropic and gravitropic responses in the two cultivars. However, removal of visible root tip mucilage did not affect hydrotropism or gravitropism in either cultivar.

  16. High-precision approach to localization scheme of visible light communication based on artificial neural networks and modified genetic algorithms

    NASA Astrophysics Data System (ADS)

    Guan, Weipeng; Wu, Yuxiang; Xie, Canyu; Chen, Hao; Cai, Ye; Chen, Yingcong

    2017-10-01

    An indoor positioning algorithm based on visible light communication (VLC) is presented. This algorithm calculates a three-dimensional (3-D) coordinate in an indoor optical wireless environment that includes sufficient orders of multipath reflections from the reflecting surfaces of the room. Leveraging the global optimization ability of the genetic algorithm (GA), an innovative framework for 3-D position estimation based on a modified genetic algorithm is proposed. Unlike other techniques using VLC for positioning, the proposed system can achieve indoor 3-D localization without making assumptions about the height or acquiring the orientation angle of the mobile terminal. Simulation results show that an average localization error of less than 1.02 cm can be achieved. In addition, most VLC positioning systems neglect the effect of reflection, which limits their performance, makes the results less accurate in a real scenario, and leaves relatively larger positioning errors at the corners. We therefore take the first-order reflection into consideration and use an artificial neural network to model the nonlinear channel. The studies show that, with nonlinear matching of the direct and reflected channels, the average positioning error at the four corners decreases from 11.94 to 0.95 cm. The algorithm thus emerges as an effective and practical method for indoor localization and outperforms other existing indoor wireless localization approaches.

  17. Pattern recognition applied to infrared images for early alerts in fog

    NASA Astrophysics Data System (ADS)

    Boucher, Vincent; Marchetti, Mario; Dumoulin, Jean; Cord, Aurélien

    2014-09-01

    Fog conditions cause severe car accidents in western countries because of the poor visibility they induce. Fog occurrence and intensity are still very difficult for weather services to predict. Infrared cameras can detect and identify objects in fog when visibility is too low for the human eye, and over the past years cost-effective infrared cameras have been fitted to some vehicles to enable such detection. Pattern recognition algorithms based on Canny filtering and the Hough transform are, in turn, common tools applied to images. Building on these facts, a joint research program between IFSTTAR and Cerema has studied the benefit of infrared images obtained in a fog tunnel during natural fog dissipation. Pattern recognition algorithms were applied specifically to road signs, whose shape is usually associated with a specific meaning (circular for a speed limit, triangular for an alert, …). Road signs were detected in the infrared images early enough, relative to images in the visible spectrum, to trigger useful alerts for Advanced Driver Assistance Systems.

  18. Development of GK-2A cloud optical and microphysical properties retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Yum, S. S.; Um, J.

    2017-12-01

    Cloud and aerosol radiative forcing is one of the largest uncertainties in climate change prediction. To reduce this uncertainty, remote sensing observations of cloud radiative and microphysical properties have been used since the 1970s, and the corresponding remote sensing techniques and instruments have been developed. As part of this effort, Geo-KOMPSAT-2A (Geostationary Korea Multi-Purpose Satellite-2A, GK-2A) will be launched in 2018. On GK-2A, the Advanced Meteorological Imager (AMI) is the primary instrument, with 3 visible, 3 near-infrared, and 10 infrared channels. To retrieve optical and microphysical properties of clouds from AMI measurements, a preliminary version of the new cloud retrieval algorithm for GK-2A was developed and several validation tests were conducted. The algorithm retrieves cloud optical thickness (COT), cloud effective radius (CER), liquid water path (LWP), and ice water path (IWP), and is therefore named the Daytime Cloud Optical thickness, Effective radius and liquid and ice Water path (DCOEW) algorithm. The DCOEW uses cloud reflectance at visible and near-infrared channels as input data. An optimal estimation (OE) approach, which requires appropriate a priori values and measurement error information, is used to retrieve COT and CER. LWP and IWP are calculated using previously determined empirical relationships between COT/CER and cloud water path. To validate the retrieved cloud properties, DCOEW output was compared with other operational satellite data. For COT and CER validation, two data sets were used: the MODIS MYD06 cloud product, to compare against algorithms that likewise use cloud reflectance at visible and near-IR channels as input, and COT (2B-TAU) and CER (2C-ICE) data retrieved from the CloudSat cloud profiling radar (W-band, 94 GHz), for validation against cloud products based on microwave measurements. For cloud water path validation, AMSR-2 Level-3 cloud liquid water data were used. Detailed results will be shown at the conference.

  19. Fuzzy logic-based analogue forecasting and hybrid modelling of horizontal visibility

    NASA Astrophysics Data System (ADS)

    Tuba, Zoltán; Bottyán, Zsolt

    2018-04-01

    Forecasting visibility is one of the greatest challenges in aviation meteorology. At the same time, highly accurate visibility forecasts can significantly reduce, or make avoidable, weather-related risk in aviation. To improve visibility forecasting, this research links fuzzy logic-based analogue forecasting and post-processed numerical weather prediction model outputs in a hybrid forecast. The performance of the analogue forecasting model was improved by applying the Analytic Hierarchy Process. A linear combination of the two outputs was then used to create an ultra-short-term hybrid visibility prediction that gradually shifts the focus from statistical to numerical products, taking advantage of each during the forecast period. This makes it possible to bring the numerical visibility forecast closer to the observations even when it is initially wrong. A complete verification of the categorical forecasts was carried out, with results for persistence and terminal aerodrome forecasts (TAF) included for comparison. The average Heidke Skill Score (HSS) over the examined airports is very similar for the analogue and hybrid forecasts, even at the end of the forecast period, where the analogue share of the final hybrid output is only 0.1-0.2. In case of poor visibility (1000-2500 m), however, the hybrid (0.65) and analogue (0.64) forecasts have a similar average HSS in the first 6 h of the forecast period, and perform better than persistence (0.60) or TAF (0.56). An important achievement is that the hybrid model takes the physics and dynamics of the atmosphere into account through the increasing share of the numerical weather prediction; despite this, its performance matches the most effective visibility forecasting methods and does not follow the poor verification results of purely numerical outputs.
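    The time-varying linear combination can be sketched as follows; the linear 0.8-to-0.1 weight schedule is an illustrative stand-in for the fitted weights, and the function name is ours:

```python
import numpy as np

def hybrid_visibility(analogue, nwp, w_analogue=None):
    """Linear combination of an analogue and an NWP visibility forecast
    whose analogue weight decays over the forecast period, shifting the
    focus from the statistical to the numerical product."""
    analogue, nwp = np.asarray(analogue, float), np.asarray(nwp, float)
    if w_analogue is None:
        # illustrative schedule: analogue dominates early, NWP late
        w_analogue = np.linspace(0.8, 0.1, len(analogue))
    return w_analogue * analogue + (1.0 - w_analogue) * nwp
```

Because the analogue term is anchored to recent observations, the blend pulls an initially wrong NWP forecast toward the observed visibility at short lead times.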

  20. The Ozone Mapping and Profiler Suite (OMPS) Limb Profiler (LP) Version 1 aerosol extinction retrieval algorithm: theoretical basis

    NASA Astrophysics Data System (ADS)

    Loughman, Robert; Bhartia, Pawan K.; Chen, Zhong; Xu, Philippe; Nyaku, Ernest; Taha, Ghassan

    2018-05-01

    The theoretical basis of the Ozone Mapping and Profiler Suite (OMPS) Limb Profiler (LP) Version 1 aerosol extinction retrieval algorithm is presented. The algorithm uses an assumed bimodal lognormal aerosol size distribution to retrieve aerosol extinction profiles at 675 nm from OMPS LP radiance measurements. A first-guess aerosol extinction profile is updated by iteration using the Chahine nonlinear relaxation method, based on comparisons between the measured radiance profile at 675 nm and the radiance profile calculated by the Gauss-Seidel limb-scattering (GSLS) radiative transfer model for a spherical-shell atmosphere. This algorithm is discussed in the context of previous limb-scattering aerosol extinction retrieval algorithms, and the most significant error sources are enumerated. The retrieval algorithm is limited primarily by uncertainty about the aerosol phase function. Horizontal variations in aerosol extinction, which violate the spherical-shell atmosphere assumed in the version 1 algorithm, may also limit the quality of the retrieved aerosol extinction profiles significantly.

  1. A threshold-based fixed predictor for JPEG-LS image compression

    NASA Astrophysics Data System (ADS)

    Deng, Lihua; Huang, Zhenghua; Yao, Shoukui

    2018-03-01

    In JPEG-LS, the fixed predictor based on the median edge detector (MED) detects only horizontal and vertical edges, and thus produces large prediction errors near diagonal edges. In this paper, we propose a threshold-based edge detection scheme for the fixed predictor. The proposed scheme detects not only horizontal and vertical edges but also diagonal ones. For certain thresholds, the proposed scheme reduces to other existing schemes, so it can also be regarded as an integration of those schemes. With a suitable threshold, the accuracy of horizontal and vertical edge detection is higher than that of the existing median edge detection in JPEG-LS. Thus, the proposed fixed predictor outperforms the existing JPEG-LS predictors on all images tested, while the complexity of the overall algorithm is kept at a similar level.

  2. 48. MAIN WAREHOUSE THIRD LEVEL Elevator drive mechanism is ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    48. MAIN WAREHOUSE - THIRD LEVEL Elevator drive mechanism is seen to the right, while drive wheels, belt wheels and chain drives are visible in the wooden wall framing. The horizontal metal conveyor (at the top of the wall just under the inverted 'V' brace) is part of the empty can supply system connected to the external can conveyor. See Photo No. 28. - Hovden Cannery, 886 Cannery Row, Monterey, Monterey County, CA

  3. Pattern recognition of visible and near-infrared spectroscopy from bayberry juice by use of partial least squares and a backpropagation neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cen Haiyan; Bao Yidan; He Yong

    2006-10-10

    Visible and near-infrared reflectance (visible-NIR) spectroscopy is applied to discriminate different varieties of bayberry juice. The discrimination of visible-NIR spectra from samples is a matter of pattern recognition. Using partial least squares (PLS), each spectrum is reduced to a small number of factors, which are then taken as the input of a backpropagation neural network (BPNN). Through training and prediction, three different varieties of bayberry juice are classified based on the output of the BPNN. In addition, a mathematical model is built and the algorithm is optimized. With proper parameters in the training set, 100% accuracy is obtained by the BPNN. Thus it is concluded that PLS analysis combined with a BPNN is an alternative for pattern recognition based on visible and NIR spectroscopy.
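
The PLS factor-extraction step described above can be sketched with the classic NIPALS iteration: each spectrum is compressed to a few latent scores, which would then serve as inputs to the BPNN. The BPNN itself is omitted here; any small feed-forward classifier could take its place. The data below are random stand-ins for real spectra.

```python
import numpy as np

# Minimal NIPALS-style PLS1 score extraction (sketch, not a full PLS library).
def pls_scores(X, y, n_components=2):
    """Return the PLS score matrix T (n_samples x n_components)."""
    X = X - X.mean(axis=0)
    y = (y - y.mean()).astype(float)
    T = np.zeros((X.shape[0], n_components))
    for k in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)          # weight vector
        t = X @ w                       # scores for this latent factor
        p = X.T @ t / (t @ t)           # loadings
        X = X - np.outer(t, p)          # deflate X
        y = y - t * (y @ t) / (t @ t)   # deflate y
        T[:, k] = t
    return T

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 50))           # 30 spectra, 50 wavelengths (synthetic)
y = rng.integers(0, 3, size=30)         # 3 juice varieties (synthetic labels)
T = pls_scores(X, y, n_components=3)
print(T.shape)  # (30, 3)
```

The 50-dimensional spectra are reduced to 3 mutually orthogonal score columns, which is the dimensionality reduction the abstract feeds into the neural network.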

  4. In vivo estimation of target registration errors during augmented reality laparoscopic surgery.

    PubMed

    Thompson, Stephen; Schneider, Crispin; Bosi, Michele; Gurusamy, Kurinchi; Ourselin, Sébastien; Davidson, Brian; Hawkes, David; Clarkson, Matthew J

    2018-06-01

    Successful use of augmented reality for laparoscopic surgery requires that the surgeon has a thorough understanding of the likely accuracy of any overlay. Whilst the accuracy of such systems can be estimated in the laboratory, it is difficult to extend such methods to the in vivo clinical setting. Herein we describe a novel method that enables the surgeon to estimate in vivo errors during use. We show that the method enables quantitative evaluation of in vivo data gathered with the SmartLiver image guidance system. The SmartLiver system utilises an intuitive display to enable the surgeon to compare the positions of landmarks visible in both a projected model and in the live video stream. From this the surgeon can estimate the system accuracy when using the system to locate subsurface targets not visible in the live video. Visible landmarks may be either point or line features. We test the validity of the algorithm using an anatomically representative liver phantom, applying simulated perturbations to achieve clinically realistic overlay errors. We then apply the algorithm to in vivo data. The phantom results show that using projected errors of surface features provides a reliable predictor of subsurface target registration error for a representative human liver shape. Applying the algorithm to in vivo data gathered with the SmartLiver image-guided surgery system shows that the system is capable of accuracies around 12 mm; however, achieving this reliably remains a significant challenge. We present an in vivo quantitative evaluation of the SmartLiver image-guided surgery system, together with a validation of the evaluation algorithm. This is the first quantitative in vivo analysis of an augmented reality system for laparoscopic surgery.

  5. Effect of Non-rigid Registration Algorithms on Deformation Based Morphometry: A Comparative Study with Control and Williams Syndrome Subjects

    PubMed Central

    Han, Zhaoying; Thornton-Wells, Tricia A.; Dykens, Elisabeth M.; Gore, John C.; Dawant, Benoit M.

    2014-01-01

    Deformation Based Morphometry (DBM) is a widely used method for characterizing anatomical differences across groups. DBM is based on the analysis of the deformation fields generated by non-rigid registration algorithms, which warp the individual volumes to a DBM atlas. Although several studies have compared non-rigid registration algorithms for segmentation tasks, few studies have compared the effect of the registration algorithms on group differences that may be uncovered through DBM. In this study, we compared group atlas creation and DBM results obtained with five well-established non-rigid registration algorithms using thirteen subjects with Williams Syndrome (WS) and thirteen Normal Control (NC) subjects. The five non-rigid registration algorithms include: (1) The Adaptive Bases Algorithm (ABA); (2) The Image Registration Toolkit (IRTK); (3) The FSL Nonlinear Image Registration Tool (FSL); (4) The Automatic Registration Tool (ART); and (5) the normalization algorithm available in SPM8. Results indicate that the choice of algorithm has little effect on the creation of group atlases. However, regions of differences between groups detected with DBM vary from algorithm to algorithm both qualitatively and quantitatively. The unique nature of the data set used in this study also permits comparison of visible anatomical differences between the groups and regions of difference detected by each algorithm. Results show that the interpretation of DBM results is difficult. Four out of the five algorithms we have evaluated detect bilateral differences between the two groups in the insular cortex, the basal ganglia, orbitofrontal cortex, as well as in the cerebellum. These correspond to differences that have been reported in the literature and that are visible in our samples. But our results also show that some algorithms detect regions that are not detected by the others and that the extent of the detected regions varies from algorithm to algorithm. 
These results suggest that using more than one algorithm when performing DBM studies would increase confidence in the results. Properties of the algorithms such as the similarity measure they maximize and the regularity of the deformation fields, as well as the location of differences detected with DBM, also need to be taken into account in the interpretation process. PMID:22459439

  6. Use and limitations of ASHRAE solar algorithms in solar energy utilization studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sowell, E.F.

    1978-01-01

    Algorithms for computer calculation of solar radiation based on cloud cover data, recommended by the ASHRAE Task Group on Energy Requirements for Buildings, are examined for applicability in solar utilization studies. The implementation is patterned after a well-known computer program, NBSLD. The results of these algorithms, including horizontal and tilted surface insolation and useful energy collectable, are compared to observations and to results obtained by the Liu and Jordan method. For purposes of comparison, data for Riverside, CA from 1960 through 1963 are examined. It is shown that horizontal values so predicted are frequently less than 10% and always less than 23% in error when compared to averages of hourly measurements during important collection hours in 1962. Average daily errors range from -14 to 9% over the year. When averaged on an hourly basis over four years, there is a 21% maximum discrepancy compared to the Liu and Jordan method. Corresponding tilted-surface discrepancies are slightly higher, as are those for useful energy collected. Possible sources of these discrepancies and errors are discussed. Limitations of the algorithms and various implementations are examined, and it is suggested that certain assumptions acceptable for building loads analysis may not be acceptable for solar utilization studies. In particular, it is shown that the method of separating diffuse and direct components in the presence of clouds requires careful consideration in order to achieve accuracy and efficiency in any implementation.
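
The Liu and Jordan comparison above rests on the isotropic-sky tilted-surface model, which can be sketched as follows: total tilted irradiance is the sum of beam, sky-diffuse, and ground-reflected components. The beam geometry factor `R_b` is taken as an input here rather than computed from solar angles, and the ground albedo default is illustrative.

```python
from math import cos, radians

# Isotropic-sky (Liu and Jordan) tilted-surface irradiance sketch.
def tilted_irradiance(I_beam, I_diffuse, tilt_deg, rho_ground=0.2, R_b=1.0):
    """All irradiances in W/m^2; tilt is surface slope from horizontal."""
    beta = radians(tilt_deg)
    sky = I_diffuse * (1 + cos(beta)) / 2            # isotropic sky diffuse
    ground = (I_beam + I_diffuse) * rho_ground * (1 - cos(beta)) / 2
    return I_beam * R_b + sky + ground

# A horizontal surface (tilt 0) sees just beam + diffuse:
print(tilted_irradiance(600.0, 200.0, tilt_deg=0.0))  # 800.0
```

The sensitivity the abstract highlights, separating `I_beam` from `I_diffuse` under cloudy skies, enters this model directly: an error in the split changes both the beam and sky-diffuse terms on a tilted surface even when their horizontal sum is correct.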

  7. Operational space trajectory tracking control of robot manipulators endowed with a primary controller of synthetic joint velocity.

    PubMed

    Moreno-Valenzuela, Javier; González-Hernández, Luis

    2011-01-01

    In this paper, a new control algorithm for operational space trajectory tracking control of robot arms is introduced. The new algorithm does not require velocity measurement and is based on (1) a primary controller which incorporates an algorithm to obtain synthesized velocity from joint position measurements and (2) a secondary controller which computes the desired joint acceleration and velocity required to achieve operational space motion control. The theory of singularly perturbed systems is crucial for the analysis of the closed-loop system trajectories. In addition, the practical viability of the proposed algorithm is explored through real-time experiments in a two degrees-of-freedom horizontal planar direct-drive arm. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  8. A modified beam-to-earth transformation to measure short-wavelength internal waves with an acoustic Doppler current profiler

    USGS Publications Warehouse

    Scotti, A.; Butman, B.; Beardsley, R.C.; Alexander, P.S.; Anderson, S.

    2005-01-01

    The algorithm used to transform velocity signals from beam coordinates to earth coordinates in an acoustic Doppler current profiler (ADCP) relies on the assumption that the currents are uniform over the horizontal distance separating the beams. This condition may be violated by (nonlinear) internal waves, which can have wavelengths as small as 100-200 m. In this case, the standard algorithm combines velocities measured at different phases of a wave and produces horizontal velocities that increasingly differ from true velocities with distance from the ADCP. Observations made in Massachusetts Bay show that currents measured with a bottom-mounted upward-looking ADCP during periods when short-wavelength internal waves are present differ significantly from currents measured by point current meters, except very close to the instrument. These periods are flagged with high error velocities by the standard ADCP algorithm. In this paper measurements from the four spatially diverging beams and the backscatter intensity signal are used to calculate the propagation direction and celerity of the internal waves. Once this information is known, a modified beam-to-earth transformation that combines appropriately lagged beam measurements can be used to obtain current estimates in earth coordinates that compare well with pointwise measurements. © 2005 American Meteorological Society.
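
The standard transform whose homogeneity assumption the paper modifies can be sketched for a 4-beam Janus ADCP: opposing along-beam velocities are differenced for horizontal components and summed for the vertical, with an error velocity serving as the quality flag mentioned above. Sign and scaling conventions vary by manufacturer; this is illustrative only.

```python
from math import sin, cos, radians

# Sketch of the standard beam-to-instrument transform for a 4-beam Janus ADCP.
# b1/b2 and b3/b4 are opposing along-beam velocities (m/s); theta is the beam
# angle from vertical.
def beam_to_instrument(b1, b2, b3, b4, theta_deg=20.0):
    th = radians(theta_deg)
    u = (b1 - b2) / (2 * sin(th))              # horizontal, beam 1-2 plane
    v = (b3 - b4) / (2 * sin(th))              # horizontal, beam 3-4 plane
    w = (b1 + b2 + b3 + b4) / (4 * cos(th))    # vertical
    err = (b1 + b2 - b3 - b4) / (4 * sin(th))  # error velocity (quality flag)
    return u, v, w, err

# Horizontally uniform flow gives zero error velocity; a short internal wave
# sampled at different phases by the diverging beams would not.
u, v, w, err = beam_to_instrument(0.1, -0.1, 0.0, 0.0)
print(round(err, 6))  # 0.0
```

Because each beam pair assumes both beams see the same current, a wave shorter than the beam separation contaminates `u`, `v`, and `err` exactly as the abstract describes; the paper's fix lags the beam series before combining them.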

  9. A novel clinical decision support algorithm for constructing complete medication histories.

    PubMed

    Long, Ju; Yuan, Michael Juntao

    2017-07-01

    A patient's complete medication history is a crucial element for physicians to develop a full understanding of the patient's medical conditions and treatment options. However, due to the fragmented nature of medical data, assembling this history can be very time-consuming, and it is often impossible for physicians to construct a complete medication history for complex patients. In this paper, we describe an accurate, computationally efficient and scalable algorithm to construct a medication history timeline. The algorithm is developed and validated on 1 million random prescription records from a large national prescription data aggregator. Our evaluation shows that the algorithm can be scaled horizontally on demand, making it suitable for future delivery in a cloud-computing environment. We also propose that this cloud-based medication history computation algorithm could be integrated into Electronic Medical Records, enabling informed clinical decision-making at the point of care. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Skyglow effects in UV and visible spectra: Radiative fluxes

    NASA Astrophysics Data System (ADS)

    Kocifaj, Miroslav; Solano Lamphar, Hector Antonio

    2013-09-01

    Several studies have tried to understand the mechanisms and effects of radiative transfer under different night-sky conditions. However, most of these studies are limited to the various effects of visible spectra, even though the invisible parts of the electromagnetic spectrum can pose a more profound threat to nature. One visible threat is what is popularly termed skyglow, caused by injudiciously situated or designed artificial night lighting systems which degrade desired sky viewing. Since lamp emissions are not limited to the visible electromagnetic spectrum, it is necessary to consider the complete spectrum of such lamps in order to understand the physical behaviour of diffuse radiation at terrain level. In this paper, the downward diffuse radiative flux is computed in a two-stream approximation and the obtained ultraviolet spectral radiative fluxes are inter-related with luminous fluxes. Such a method then permits an estimate of ultraviolet radiation if the traditionally measured illuminance on a horizontal plane is available. The utility of such a comparison of two spectral bands is shown using the different lamp types employed in street lighting. The data demonstrate that it is insufficient to specify lamp type and its visible flux production independently of each other. UV emissions also have to be considered by modellers and environmental scientists, because some light sources can be fairly important pollutants in the near ultraviolet and can affect both living organisms and the ambient environment.

  11. Optimization lighting layout based on gene density improved genetic algorithm for indoor visible light communications

    NASA Astrophysics Data System (ADS)

    Liu, Huanlin; Wang, Xin; Chen, Yong; Kong, Deqian; Xia, Peijie

    2017-05-01

    For an indoor visible light communication system, the layout of the LED lamps affects the uniformity of the received power on the communication plane. In order to find an optimized lighting layout that meets both lighting and communication needs, a gene density genetic algorithm (GDGA) is proposed. In the GDGA, a gene encodes the abscissa-ordinate pair of one LED, and an individual represents a complete LED layout in the room. A segmented crossover operation and a gene mutation strategy based on gene density are put forward to make the received power on the communication plane more uniform and to increase the population's diversity. A weighted difference function between individuals is designed as the fitness function of the GDGA, so as to retain individuals carrying useful LED-layout genetic information and to ensure the global convergence of the GDGA. Compared with the square layout and the circular layout, the optimized layout achieved by the GDGA increases power uniformity by 83.3%, 83.1% and 55.4%, respectively. Furthermore, the convergence of the GDGA is verified against an evolutionary algorithm (EA). Experimental results show that the GDGA can quickly find an approximation of the optimal layout.

  12. Extraction of line properties based on direction fields.

    PubMed

    Kutka, R; Stier, S

    1996-01-01

    The authors present a new set of algorithms for segmenting lines, mainly blood vessels in X-ray images, and extracting properties such as their intensities, diameters, and center lines. The authors developed a tracking algorithm that checks rules taking the properties of vessels into account. The tools even detect veins, arteries, or catheters of two pixels in diameter and with poor contrast. Compared with other algorithms, such as the Canny line detector or anisotropic diffusion, the authors extract a smoother and connected vessel tree without artifacts in the image background. As the tools depend on common intermediate results, they are very fast when used together. The authors' results will support the 3-D reconstruction of the vessel tree from stereoscopic projections. Moreover, the authors make use of their line intensity measure for enhancing and improving the visibility of vessels in 3-D X-ray images. The processed images are intended to support radiologists in diagnosis, radiation therapy planning, and surgical planning. Radiologists verified the improved quality of the processed images and the enhanced visibility of relevant details, particularly fine blood vessels.

  13. Indoor high precision three-dimensional positioning system based on visible light communication using modified genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Guan, Weipeng; Li, Simin; Wu, Yuxiang

    2018-04-01

    To improve the precision of indoor positioning and realize three-dimensional positioning, a reversed indoor positioning system based on visible light communication (VLC) using a genetic algorithm (GA) is proposed. In order to solve the problem of interference between signal sources, CDMA modulation is used: each light-emitting diode (LED) in the system broadcasts a unique identity (ID) code. The receiver receives a mixed signal from all the LED reference points; owing to the orthogonality of the spreading codes in CDMA modulation, the ID and intensity attenuation information of each LED can be obtained. According to the positioning principle of received signal strength (RSS), the coordinates of the receiver can then be determined. Due to system noise and imperfections of the devices used in the system, the distances between the receiver and the transmitters deviate from their real values, resulting in positioning error. By introducing error correction factors into the global parallel search of the genetic algorithm, the coordinates of the receiver in three-dimensional space can be determined precisely. Both simulation and experimental results show that in practical application scenarios the proposed positioning system can realize a high-precision positioning service.
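
The RSS positioning principle mentioned above can be sketched with the standard Lambertian line-of-sight channel model: received optical power falls off with the square of distance, so each measured power yields a distance estimate, which real systems (and the paper's GA step) must then correct for noise and device imperfections. The parameter values below are arbitrary illustrations.

```python
from math import pi, cos, sqrt

# Lambertian LOS channel sketch: received power from one LED.
def received_power(P_t, d, m=1, A=1e-4, phi=0.0, psi=0.0):
    """P_t: transmit power, d: distance (m), m: Lambertian order,
    A: detector area (m^2), phi/psi: irradiance/incidence angles (rad)."""
    return P_t * (m + 1) * A * cos(phi) ** m * cos(psi) / (2 * pi * d ** 2)

# Invert the model above, assuming normal incidence (phi = psi = 0).
def distance_from_rss(P_r, P_t, m=1, A=1e-4):
    return sqrt(P_t * (m + 1) * A / (2 * pi * P_r))

P_r = received_power(1.0, 2.5)                # simulate a reading at 2.5 m
print(round(distance_from_rss(P_r, 1.0), 6))  # 2.5
```

With three or more such distance estimates the receiver's 3-D coordinates follow by trilateration; the abstract's error-correction factors compensate for the bias each estimate carries in practice.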

  14. Cloud Detection with the Earth Polychromatic Imaging Camera (EPIC)

    NASA Technical Reports Server (NTRS)

    Meyer, Kerry; Marshak, Alexander; Lyapustin, Alexei; Torres, Omar; Wang, Yugie

    2011-01-01

    The Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR) would provide a unique opportunity for Earth and atmospheric research due not only to its Lagrange point sun-synchronous orbit, but also to the potential for synergistic use of spectral channels in both the UV and visible spectrum. As a prerequisite for most applications, the ability to detect the presence of clouds in a given field of view, known as cloud masking, is of utmost importance. It serves to determine both the potential for cloud contamination in clear-sky applications (e.g., land surface products and aerosol retrievals) and clear-sky contamination in cloud applications (e.g., cloud height and property retrievals). To this end, a preliminary cloud mask algorithm has been developed for EPIC that applies thresholds to reflected UV and visible radiances, as well as to reflected radiance ratios. This algorithm has been tested with simulated EPIC radiances over both land and ocean scenes, with satisfactory results. These test results, as well as algorithm sensitivity to potential instrument uncertainties, will be presented.
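
The kind of threshold test described above can be sketched as follows: a pixel is flagged cloudy when its visible reflectance exceeds a surface-dependent threshold, or when a UV/visible reflectance ratio over a dark surface looks spectrally flat. The threshold values and the ratio rule are invented for illustration and are not EPIC's.

```python
# Hypothetical threshold-based cloud mask sketch (illustrative values only).
def simple_cloud_mask(refl_vis, refl_uv, is_ocean,
                      vis_thresh_ocean=0.2, vis_thresh_land=0.35,
                      ratio_thresh=1.1):
    """Return True if the pixel is flagged cloudy."""
    thresh = vis_thresh_ocean if is_ocean else vis_thresh_land
    if refl_vis > thresh:
        return True  # bright pixel: likely cloud over either surface
    # Clouds are spectrally flat: a UV/visible ratio near 1 over a dark
    # ocean surface suggests cloud rather than clear water.
    return is_ocean and refl_vis > 0.1 and refl_uv / refl_vis < ratio_thresh

print(simple_cloud_mask(0.45, 0.5, is_ocean=False))  # True (bright pixel)
print(simple_cloud_mask(0.05, 0.2, is_ocean=True))   # False (dark ocean)
```

Real masks layer many such tests and carry confidence levels; the point here is only the structure of combining radiance thresholds with band ratios, as the abstract describes.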

  15. Automatic Texture Mapping of Architectural and Archaeological 3d Models

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Stallmann, D.

    2012-07-01

    Today, detailed, complete and exact 3D models with photo-realistic textures are increasingly demanded for numerous applications in architecture and archaeology. Manual texture mapping of 3D models from digital photographs with software packages such as Maxon Cinema 4D, Autodesk 3ds Max or Maya still requires a complex and time-consuming workflow, so procedures for automatic texture mapping of 3D models are in demand. In this paper two automatic procedures are presented. The first procedure generates textured 3D surface models by web services, while the second procedure textures already existing 3D models with the software tmapper. The program tmapper is based on the Multi Layer 3D image (ML3DImage) algorithm and is developed in the programming language C++. The studies show that visibility analysis using the ML3DImage algorithm alone is not sufficient to obtain acceptable automatic texture mapping results. To overcome the visibility problem, the Point Cloud Painter algorithm in combination with the Z-buffer procedure will be applied in the future.

  16. Infrared and visible image fusion with the target marked based on multi-resolution visual attention mechanisms

    NASA Astrophysics Data System (ADS)

    Huang, Yadong; Gao, Kun; Gong, Chen; Han, Lu; Guo, Yue

    2016-03-01

    In traditional multi-resolution infrared and visible image fusion, a low-contrast target may be weakened and become inconspicuous because of opposite DN values in the source images. A novel target pseudo-color enhanced image fusion algorithm based on a modified attention model and the fast discrete curvelet transform is therefore proposed. The interesting target regions are extracted from the source images by introducing motion features gained from the modified attention model, and gray-level fusion of the source images is performed in the curvelet domain via rules based on the physical characteristics of the sensors. The final fused image is obtained by mapping the extracted targets into the gray result with a proper pseudo-color. Experiments show that the algorithm can highlight dim targets effectively and improve the SNR of the fused image.

  17. Computer generated hologram from point cloud using graphics processor.

    PubMed

    Chen, Rick H-Y; Wilkinson, Timothy D

    2009-12-20

    Computer generated holography is an extremely demanding and complex task when it comes to providing realistic reconstructions with full parallax, occlusion, and shadowing. We present an algorithm designed for data-parallel computing on modern graphics processing units to alleviate the computational burden. We apply Gaussian interpolation to create a continuous surface representation from discrete input object points. The algorithm maintains a potential occluder list for each individual hologram plane sample to keep the number of visibility tests to a minimum. We experimented with two approximations that simplify and accelerate occlusion computation. It is observed that letting several neighboring hologram plane samples share visibility information on object points leads to significantly faster computation without causing noticeable artifacts in the reconstructed images. Computing a reduced sample set via nonuniform sampling is also found to be an effective acceleration technique.

  18. On Problem of Synthesis of Control System for Quadrocopter

    NASA Astrophysics Data System (ADS)

    Larin, V. B.; Tunik, A. A.

    2017-05-01

    An algorithm for designing a control law for a quadrocopter is given. Two cases of control of the horizontal motion of the vehicle are considered: in one case the terminal location is given, and in the other the cruise speed. The results are compared with those obtained by other authors.

  19. DEVELOPING A METHOD TO IDENTIFY HORIZONTAL CURVE SEGMENTS WITH HIGH CRASH OCCURRENCES USING THE HAF ALGORITHM

    DOT National Transportation Integrated Search

    2018-04-01

    Crashes occur every day on Utah's highways. Curves can be particularly dangerous as they require driver focus due to potentially unseen hazards. Often, crashes occur on curves due to poor curve geometry, a lack of warning signs, or poor surface con...

  20. A Probability-Based Algorithm Using Image Sensors to Track the LED in a Vehicle Visible Light Communication System.

    PubMed

    Huynh, Phat; Do, Trong-Hop; Yoo, Myungsik

    2017-02-10

    This paper proposes a probability-based algorithm to track the LED in vehicle visible light communication systems using a camera. In this system, the transmitters are the vehicles' front and rear LED lights. The receivers are high-speed cameras that take a series of images of the LEDs. The data embedded in the light is extracted by first detecting the position of the LEDs in these images. Traditionally, LEDs are detected according to pixel intensity. However, when the vehicle is moving, motion blur occurs in the LED images, making it difficult to detect the LEDs. Particularly at high speeds, some frames are blurred to a high degree, which makes it impossible to detect the LED or to extract the information embedded in those frames. The proposed algorithm relies not only on the pixel intensity, but also on the optical flow of the LEDs and on statistical information obtained from previous frames. Based on this information, the conditional probability that a pixel belongs to an LED is calculated, and the position of the LED is then determined from this probability. To verify the suitability of the proposed algorithm, simulations are conducted considering incidents that can happen in a real-world situation, including a change in the position of the LEDs at each frame as well as motion blur due to the vehicle speed.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Fuyu; Collins, William D.; Wehner, Michael F.

    High-resolution climate models have been shown to improve the statistics of tropical storms and hurricanes compared to low-resolution models. The impact of increasing horizontal resolution on tropical storm simulation is investigated exclusively using a series of Atmospheric Global Climate Model (AGCM) runs with idealized aquaplanet steady-state boundary conditions and a fixed operational storm-tracking algorithm. The results show that increasing horizontal resolution helps to detect more hurricanes, simulate stronger extreme rainfall, and emulate better storm structures in the models. However, increasing model resolution does not necessarily produce stronger hurricanes in terms of maximum wind speed, minimum sea level pressure, and mean precipitation, as the increased number of storms simulated by high-resolution models is mainly associated with weaker storms. The spatial scale at which the analyses are conducted appears to exert more important control on these meteorological statistics than the horizontal resolution of the model grid. When the simulations are analyzed on common low-resolution grids, the statistics of the hurricanes, particularly the hurricane counts, show reduced sensitivity to the horizontal grid resolution and signs of scale invariance.

  2. Lithofacies classification of the Barnett Shale gas reservoir using neural network

    NASA Astrophysics Data System (ADS)

    Aliouane, Leila; Ouadfeul, Sid-Ali

    2017-04-01

    Here, we show the contribution of artificial intelligence techniques such as neural networks to predicting lithofacies in the lower Barnett Shale gas reservoir. A Multilayer Perceptron (MLP) neural network trained with the Hidden Weight Optimization algorithm is used. The input is raw well-log data recorded in a horizontal well drilled in the Lower Barnett Shale formation, while the output is the concentration of clay and quartz calculated using the ELAN model and confirmed with core rock measurements. After training, the MLP's connection weights are fixed; the raw well-log data of two other horizontal wells drilled in the same reservoir are then propagated through the network and an output is calculated. Comparison between the predicted and measured clay and quartz concentrations in these two horizontal wells shows the ability of neural networks to improve the characterization of shale gas reservoirs.

  3. Cloud retrievals from satellite data using optimal estimation: evaluation and application to ATSR

    NASA Astrophysics Data System (ADS)

    Poulsen, C. A.; Siddans, R.; Thomas, G. E.; Sayer, A. M.; Grainger, R. G.; Campmany, E.; Dean, S. M.; Arnold, C.; Watts, P. D.

    2012-08-01

    Clouds play an important role in balancing the Earth's radiation budget. Hence, it is vital that cloud climatologies are produced that quantify cloud macro- and micro-physical parameters and the associated uncertainty. In this paper, we present the algorithm ORAC (Oxford-RAL retrieval of Aerosol and Cloud), which is based on fitting a physically consistent cloud model to satellite observations simultaneously from the visible to the mid-infrared, thereby ensuring that the resulting cloud properties provide a good representation of both the short-wave and long-wave radiative effects of the observed cloud. The advantages of the optimal estimation method are that it enables rigorous error propagation and the inclusion of all measurements and any a priori information, with associated errors, in a rigorous mathematical framework. The algorithm provides a measure of the consistency between the retrieval's representation of cloud and the satellite radiances. The cloud parameters retrieved are cloud top pressure, cloud optical depth, cloud effective radius, cloud fraction and cloud phase. The algorithm can be applied to most visible/infrared satellite instruments; in this paper, we demonstrate its applicability to the Along-Track Scanning Radiometers ATSR-2 and AATSR. Examples of applying the algorithm to ATSR-2 flight data are presented and the sensitivity of the retrievals assessed; in particular, the algorithm is evaluated for a number of simulated single-layer and multi-layer conditions. The algorithm was found to perform well for single-layer cloud except when the cloud was very thin, i.e., less than one optical depth. For multi-layer cloud, the algorithm was robust except when the upper ice cloud layer is less than five optical depths thick. In these cases the retrieved cloud top pressure and cloud effective radius become a weighted average of the two layers.
The total optical depth of multi-layer cloud is retrieved well until the cloud becomes thick (greater than 50 optical depths), where the retrieval begins to saturate. The cost proved a good indicator of multi-layer scenarios. Both the retrieval cost and the error need to be considered together in order to evaluate the quality of the retrieval. The algorithm in the configuration described here has been applied to both ATSR-2 and AATSR visible and infrared measurements in the context of the GRAPE (Global Retrieval and cloud Product Evaluation) project to produce a consistent 14-year record for climate research.

  4. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of voxels in a volume observable by users during interactive volume rendering. Manipulating this visibility improves volume rendering processes, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering viewpoint. The construction of visibility histograms (VHs), which represent the visibility distribution of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume-rendered medical images have been a primary beneficiary of the VH, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of the VH to medical images that have large intensity ranges and volume dimensions and thus require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins is used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), which enables efficient computation of the histogram.
We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of VH construction and thus improved the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual degradation of the VH and with only minor numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying K (the number of clusters) and found that higher values of K resulted in better fidelity at the cost of a smaller computational gain. The AB-VH also outperformed the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
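
The adaptive binning idea described above can be sketched in a few lines: cluster the scalar intensities with a 1-D k-means so that each cluster becomes one histogram bin. This is only an illustrative sketch of the general technique, not the paper's GPU implementation; the function names and the toy data are ours.

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Cluster scalar intensities into k groups; return sorted centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            # assign each intensity to its nearest centroid
            i = min(range(k), key=lambda j: abs(v - centroids[j]))
            groups[i].append(v)
        # recompute centroids; keep the old one if a group went empty
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return sorted(centroids)

def adaptive_bin(values, centroids):
    """Build the reduced histogram: one bin per centroid."""
    counts = [0] * len(centroids)
    for v in values:
        i = min(range(len(centroids)), key=lambda j: abs(v - centroids[j]))
        counts[i] += 1
    return counts

intensities = [10, 12, 11, 200, 205, 199, 90, 95]
centers = kmeans_1d(intensities, 3)
hist = adaptive_bin(intensities, centers)
```

The point of clustering, rather than equal-width down-sampling, is that bin edges follow the data: dense intensity ranges get their own bins while empty ranges are not wasted.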

  5. Infrared and visible fusion face recognition based on NSCT domain

    NASA Astrophysics Data System (ADS)

    Xie, Zhihua; Zhang, Shuai; Liu, Guodong; Xiong, Jinquan

    2018-01-01

    Visible face recognition systems, being vulnerable to illumination, expression, and pose variations, cannot achieve robust performance in unconstrained situations. Meanwhile, near-infrared face images, being independent of illumination, can avoid or limit the drawbacks of face recognition in visible light, but their main challenges are low resolution and a low signal-to-noise ratio (SNR). Therefore, near-infrared and visible fusion face recognition has become an important direction in unconstrained face recognition research. In this paper, a novel fusion algorithm in the non-subsampled contourlet transform (NSCT) domain is proposed for infrared and visible fusion face recognition. First, NSCT is applied to the infrared and visible face images, exploiting the image information at multiple scales, orientations, and frequency bands. Then, to extract effective discriminant features and balance the contributions of the high- and low-frequency NSCT coefficients, the local Gabor binary pattern (LGBP) and local binary pattern (LBP) are applied in different frequency parts to obtain robust representations of the infrared and visible face images. Finally, score-level fusion is used to fuse all the features for final classification. The visible and near-infrared face recognition is tested on the HITSZ Lab2 visible and near-infrared face database. Experimental results show that the proposed method extracts the complementary features of near-infrared and visible-light images and improves the robustness of unconstrained face recognition.

  6. Enhanced hemispheric-scale snow mapping through the blending of optical and microwave satellite data

    NASA Astrophysics Data System (ADS)

    Armstrong, R. L.; Brodzik, M. J.; Savoie, M.; Knowles, K.

    2003-04-01

    Snow cover is an important variable for climate and hydrologic models due to its effects on energy and moisture budgets. Seasonal snow can cover more than 50% of the Northern Hemisphere land surface during the winter, making snow cover the land surface characteristic responsible for the largest annual and interannual differences in albedo. Passive microwave satellite remote sensing can augment measurements based on visible satellite data alone because of its ability to acquire data through most clouds or during darkness, as well as to provide a measure of snow depth or water equivalent. Global snow cover fluctuations can now be monitored over a 24-year period using passive microwave data (Scanning Multichannel Microwave Radiometer (SMMR), 1978-1987, and Special Sensor Microwave/Imager (SSM/I), 1987-present). An evaluation of snow extent derived from passive microwave algorithms is presented through comparison with the NOAA Northern Hemisphere weekly snow extent data. For the period 1978 to 2002, both passive microwave and visible data sets show a similar pattern of inter-annual variability, although the maximum snow extents derived from the microwave data are consistently less than those provided by the visible satellite data, and the visible data typically show higher monthly variability. Decadal trends and their significance are compared for the two data types. During the shallow snow conditions of the early winter season, microwave data consistently indicate less snow-covered area than the visible data. This underestimate of snow extent results from the fact that shallow snow cover (less than about 5.0 cm) does not provide a scattering signal of sufficient strength to be detected by the algorithms. As the snow cover continues to build during the months of January through March, as well as throughout the melt season, agreement between the two data types continually improves. 
This occurs because as the snow becomes deeper and its layered structure more complex, the negative spectral gradient driving the passive microwave algorithm is enhanced. Because the current generation of microwave snow algorithms is unable to consistently detect shallow and intermittent snow, we combine visible satellite data with the microwave data in a single blended product to overcome this problem. For the period 1978 to 2002 we combine data from the NOAA weekly snow charts with passive microwave data from the SMMR and SSM/I brightness temperature record. For the current and future time period we blend MODIS and AMSR-E data sets, both of which have greatly enhanced spatial resolution compared to the earlier data sources. Because it is not possible to determine snow depth or snow water equivalent from visible data, the regions where only the NOAA or MODIS data indicate snow are defined as "shallow snow". However, because our current blended product is being developed in the 25 km EASE-Grid and the MODIS data being used are in the Climate Modelers Grid (CMG) at approximately 5 km (0.05 deg.), the blended product also includes percent snow cover over the larger grid cell. A prototype version of the blended MODIS/AMSR-E product will be available in near real-time from NSIDC during the 2002-2003 winter season.

  7. Efficient demodulation scheme for rolling-shutter-patterning of CMOS image sensor based visible light communications.

    PubMed

    Chen, Chia-Wei; Chow, Chi-Wai; Liu, Yang; Yeh, Chien-Hung

    2017-10-02

    Recently, even low-end mobile phones have been equipped with a high-resolution complementary-metal-oxide-semiconductor (CMOS) image sensor. This motivates the use of a CMOS image sensor for visible light communication (VLC). Here we propose and demonstrate an efficient demodulation scheme to synchronize and demodulate the rolling-shutter pattern in image-sensor-based VLC. The implementation algorithm is discussed, and the bit-error-rate (BER) performance and processing latency are evaluated and compared with those of other thresholding schemes.
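
The rolling-shutter effect the abstract relies on can be illustrated with a minimal sketch (a generic on-off-keying demodulator, not the paper's scheme): each sensor row is exposed at a slightly different time, so a modulated LED appears as bright and dark stripes, and averaging each row then thresholding recovers the bit pattern. All names and the toy frame are assumptions for illustration.

```python
def demodulate_rows(frame, threshold=None):
    """frame: 2-D list of pixel intensities (rows x cols) -> list of bits."""
    row_means = [sum(row) / len(row) for row in frame]
    if threshold is None:
        # simple global threshold at the midpoint of the observed range
        threshold = (max(row_means) + min(row_means)) / 2
    return [1 if m > threshold else 0 for m in row_means]

# four bright rows then four dark rows: one LED on/off stripe pair
frame = [[200, 210, 205]] * 4 + [[20, 25, 22]] * 4
bits = demodulate_rows(frame)   # -> [1, 1, 1, 1, 0, 0, 0, 0]
```

Real schemes, including the one in this record, add synchronization and more robust (e.g. adaptive or polynomial-fit) thresholding to cope with uneven illumination across the frame.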

  8. Planning minimum-energy paths in an off-road environment with anisotropic traversal costs and motion constraints. Doctoral thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, R.S.

    1989-06-01

    For a vehicle operating across arbitrarily contoured terrain, finding the most fuel-efficient route between two points can be viewed as a high-level global path-planning problem with traversal costs and stability dependent on the direction of travel (anisotropic). The problem assumes a two-dimensional polygonal map of homogeneous-cost regions for terrain representation, constructed from elevation information. The anisotropic energy cost of vehicle motion has a non-braking component dependent on horizontal distance, a braking component dependent on vertical distance, and a constant path-independent component. The behavior of minimum-energy paths is then proved to be restricted to a small but optimal set of traversal types. An optimal-path-planning algorithm, using a heuristic search technique, reduces the infinite number of paths between the start and goal points to a finite number by generating sequences of goal-feasible window lists from analyzing the polygonal map and applying pruning criteria. The pruning criteria consist of visibility analysis, heading analysis, and region-boundary constraints. Each goal-feasible window list specifies an associated convex optimization problem, and the best of all locally optimal paths through the goal-feasible window lists is the globally optimal path. These ideas have been implemented in a computer program, with results showing considerably better performance than the exponential average-case behavior predicted.
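
The three-component energy model described above can be written down directly. The following is a hedged sketch with made-up coefficients, not the thesis's calibrated values: a non-braking term proportional to horizontal distance, a braking term charged only on descent, and a constant per-segment term.

```python
def segment_energy(dx, dy, dz, c_roll=1.0, c_brake=0.5, c_const=2.0):
    """Energy to traverse one straight segment with horizontal offset
    (dx, dy) and elevation change dz (positive = uphill).
    Coefficients are illustrative placeholders."""
    horizontal = (dx * dx + dy * dy) ** 0.5
    braking = c_brake * max(0.0, -dz)   # braking cost only on descent
    return c_roll * horizontal + braking + c_const

# a flat 3-4-5 move vs. the same move descending 2 units
flat = segment_energy(3.0, 4.0, 0.0)    # 1.0*5 + 0.0 + 2.0 = 7.0
down = segment_energy(3.0, 4.0, -2.0)   # 1.0*5 + 0.5*2 + 2.0 = 8.0
```

The asymmetry (descent costs braking energy that the reverse traversal does not) is what makes the cost anisotropic and restricts minimum-energy paths to the small set of traversal types the thesis proves optimal.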

  9. NPP VIIRS Geometric Performance Status

    NASA Technical Reports Server (NTRS)

    Lin, Guoqing; Wolfe, Robert E.; Nishihama, Masahiro

    2011-01-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument on board the National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP) satellite is scheduled for launch in October 2011. It will provide satellite-measured radiance/reflectance data for both weather and climate applications. Along with radiometric calibration, geometric characterization and calibration of Sensor Data Records (SDRs) are crucial to the VIIRS Environmental Data Record (EDR) algorithms and products, which are used in numerical weather prediction (NWP). The instrument's geometric performance includes: 1) sensor (detector) spatial response, parameterized by the dynamic field of view (DFOV) in the scan direction and instantaneous FOV (IFOV) in the track direction, the modulation transfer function (MTF) for the 17 moderate-resolution bands (M-bands), and the horizontal spatial resolution (HSR) for the five imagery bands (I-bands); 2) matrices of band-to-band co-registration (BBR) between the corresponding detectors in all band pairs; and 3) pointing knowledge and stability characteristics, which include scan-plane tilt, scan-rate and scan-start-position variations, and thermally induced variations in pointing with respect to orbital position. These have been calibrated and characterized through ground testing under ambient and thermal vacuum conditions, numerical modeling, and analysis. This paper summarizes the results, which are in general compliance with specifications, describes anomaly investigations, and outlines paths forward for characterizing on-orbit BBR and spatial response and for improving instrument on-orbit performance in pointing and geolocation.

  10. Multiview robotic microscope reveals the in-plane kinematics of amphibian neurulation.

    PubMed

    Veldhuis, Jim H; Brodland, G Wayne; Wiebe, Colin J; Bootsma, Gregory J

    2005-06-01

    A new robotic microscope system, called the Frogatron 3000, was developed to collect time-lapse images from arbitrary viewing angles over the surface of live embryos. Embryos are mounted at the center of a horizontal, fluid-filled, cylindrical glass chamber around which a camera with special optics traverses. To hold them at the center of the chamber and revolve them about a vertical axis, the embryos are placed on the end of a small vertical glass tube that is rotated under computer control. To demonstrate operation of the system, it was used to capture time-lapse images of developing axolotl (amphibian) embryos from 63 viewing angles during the process of neurulation, and the in-plane kinematics of the epithelia visible at the center of each view were calculated. The motions of points on the surface of the embryo were determined by digital tracking of their natural surface texture, and a least-squares algorithm was developed to calculate the deformation-rate tensor from the motions of these surface points. Principal strain rates and directions were extracted from this tensor using decomposition and eigenvector techniques. The highest observed principal true strain rate was 28 +/- 5% per hour, along the midline of the neural plate during developmental stage 14, while the greatest contractile true strain rate was -35 +/- 5% per hour, normal to the embryo midline during stage 15.
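
A least-squares fit of a velocity-gradient tensor from tracked points, in the spirit of the approach above, is a standard textbook computation; the sketch below is a generic 2-D version (not the paper's algorithm) that fits each velocity component as u = a*x + b*y + c via the normal equations. All names and the toy pure-stretching data are ours.

```python
def solve3(A, b):
    """Gauss-Jordan elimination for a 3x3 linear system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))  # partial pivot
        M[i], M[p] = M[p], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_gradient(points, vels):
    """Return the 2x2 velocity-gradient tensor L, where v ~= L x + c."""
    L = []
    for comp in range(2):
        A = [[0.0] * 3 for _ in range(3)]
        b = [0.0] * 3
        for (x, y), v in zip(points, vels):
            row = (x, y, 1.0)
            for i in range(3):
                for j in range(3):
                    A[i][j] += row[i] * row[j]   # normal equations
                b[i] += row[i] * v[comp]
        a, bcoef, _ = solve3(A, b)
        L.append([a, bcoef])
    return L

# pure stretching: u = 0.3*x, v = -0.2*y
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
vel = [(0.3 * x, -0.2 * y) for x, y in pts]
L = fit_gradient(pts, vel)   # ~ [[0.3, 0.0], [0.0, -0.2]]
```

The symmetric part of L gives the strain rates; its eigenvalues and eigenvectors yield the principal strain rates and directions reported in the abstract.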

  11. Improving visibility of rear surface cracks during inductive thermography of metal plates using Autoencoder

    NASA Astrophysics Data System (ADS)

    Xie, Jing; Xu, Changhang; Chen, Guoming; Huang, Weiping

    2018-06-01

    Inductive thermography is an infrared thermography (IRT) technique that is effective in detecting front-surface cracks in metal plates. However, rear-surface cracks are usually missed due to their weak indications during inductive thermography. Here we propose a novel approach (AET: AE Thermography) to improve the visibility of rear-surface cracks during inductive thermography by employing the Autoencoder (AE) algorithm, an important building block of deep learning architectures. We construct an integrated framework for processing the raw inspection data of inductive thermography using the AE algorithm. Through this framework, underlying features of rear-surface cracks are efficiently extracted and new, clearer images are constructed. Inductive thermography experiments were conducted on steel specimens to verify the efficacy of the proposed approach. We visually compare the raw thermograms, the empirical orthogonal functions (EOFs) of the principal component thermography (PCT) technique, and the results of AET. We further quantitatively evaluated AET by calculating crack contrast and the signal-to-noise ratio (SNR). The results demonstrate that the proposed AET approach can remarkably improve the visibility of rear-surface cracks and thus improve the capability of inductive thermography in detecting rear-surface cracks in metal plates.

  12. Real-time 3D human pose recognition from reconstructed volume via voxel classifiers

    NASA Astrophysics Data System (ADS)

    Yoo, ByungIn; Choi, Changkyu; Han, Jae-Joon; Lee, Changkyo; Kim, Wonjun; Suh, Sungjoo; Park, Dusik; Kim, Junmo

    2014-03-01

    This paper presents a human pose recognition method that simultaneously reconstructs a human volume based on an ensemble of voxel classifiers from a single depth image in real time. Human pose recognition is a difficult task since a single depth camera can capture only the visible surfaces of a human body. In order to recognize invisible (self-occluded) surfaces of a human body, the proposed algorithm employs voxel classifiers trained with multi-layered synthetic voxels. Specifically, ray-casting onto a volumetric human model generates synthetic voxels, where each voxel consists of a 3D position and an ID corresponding to the body part. The synthesized volumetric data, which contain both visible and invisible body voxels, are utilized to train the voxel classifiers. As a result, the voxel classifiers not only identify the visible voxels but also reconstruct the 3D positions and the IDs of the invisible voxels. The experimental results show improved performance in estimating human poses due to the capability of inferring the invisible human body voxels. It is expected that the proposed algorithm can be applied to many fields such as telepresence, gaming, virtual fitting, the wellness business, and real 3D content control on real 3D displays.

  13. LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone.

    PubMed

    Nguyen, Phong Ha; Arsalan, Muhammad; Koo, Ja Hyung; Naqvi, Rizwan Ali; Truong, Noi Quang; Park, Kang Ryoung

    2018-05-24

    Autonomous landing of an unmanned aerial vehicle, or drone, is a challenging problem for the robotics research community. Previous researchers have attempted to solve this problem by combining multiple sensors such as global positioning system (GPS) receivers, inertial measurement units, and multiple camera systems. Although these approaches successfully estimate an unmanned aerial vehicle's location during landing, many calibration processes are required to achieve good detection accuracy. In addition, cases where drones operate in heterogeneous areas with no GPS signal should be considered. To overcome these problems, we determined how to safely land a drone in a GPS-denied environment using our remote-marker-based tracking algorithm based on a single visible-light camera sensor. Instead of using hand-crafted features, our algorithm includes a convolutional neural network named lightDenseYOLO that extracts trained features from an input image to predict a marker's location from the visible-light camera sensor on the drone. Experimental results show that our method significantly outperforms state-of-the-art object trackers, both with and without convolutional neural networks, in terms of both accuracy and processing time.

  14. Efficient visibility encoding for dynamic illumination in direct volume rendering.

    PubMed

    Kronander, Joel; Jönsson, Daniel; Löw, Joakim; Ljung, Patric; Ynnerman, Anders; Unger, Jonas

    2012-03-01

    We present an algorithm that enables real-time dynamic shading in direct volume rendering using general lighting, including directional lights, point lights, and environment maps. Real-time performance is achieved by encoding local and global volumetric visibility using spherical harmonic (SH) basis functions stored in an efficient multiresolution grid over the extent of the volume. Our method enables high-frequency shadows in the spatial domain, but is limited to a low-frequency approximation of visibility and illumination in the angular domain. In a first pass, level of detail (LOD) selection in the grid is based on the current transfer function setting. This enables rapid online computation and SH projection of the local spherical distribution of visibility information. Using a piecewise integration of the SH coefficients over the local regions, the global visibility within the volume is then computed. By representing the light sources using their SH projections, the integral over lighting, visibility, and isotropic phase functions can be efficiently computed during rendering. The utility of our method is demonstrated in several examples showing the generality and interactive performance of the approach.

  15. Thermal-to-visible face recognition using partial least squares.

    PubMed

    Hu, Shuowen; Choi, Jonghyun; Chan, Alex L; Schwartz, William Robson

    2015-03-01

    Although visible face recognition has been an active area of research for several decades, cross-modal face recognition has only been explored by the biometrics community relatively recently. Thermal-to-visible face recognition is one of the most difficult cross-modal face recognition challenges, because of the difference in phenomenology between the thermal and visible imaging modalities. We address the cross-modal recognition problem using a partial least squares (PLS) regression-based approach consisting of preprocessing, feature extraction, and PLS model building. The preprocessing and feature extraction stages are designed to reduce the modality gap between the thermal and visible facial signatures, and facilitate the subsequent one-vs-all PLS-based model building. We incorporate multi-modal information into the PLS model building stage to enhance cross-modal recognition. The performance of the proposed recognition algorithm is evaluated on three challenging datasets containing visible and thermal imagery acquired under different experimental scenarios: time-lapse, physical tasks, mental tasks, and subject-to-camera range. These scenarios represent difficult challenges relevant to real-world applications. We demonstrate that the proposed method performs robustly for the examined scenarios.

  16. Long-term analysis of aerosol optical depth over Northeast Asia using a satellite-based measurement: MI Yonsei Aerosol Retrieval Algorithm (YAER)

    NASA Astrophysics Data System (ADS)

    Kim, Mijin; Kim, Jhoon; Yoon, Jongmin; Chung, Chu-Yong; Chung, Sung-Rae

    2017-04-01

    In 2010, the Korean geostationary earth orbit (GEO) satellite, the Communication, Ocean, and Meteorological Satellite (COMS), was launched with the Meteorological Imager (MI) on board. The MI measures atmospheric conditions over Northeast Asia (NEA) using a single visible channel centered at 0.675 μm and four IR channels at 3.75, 6.75, 10.8, and 12.0 μm. The visible measurement can also be utilized to retrieve aerosol optical properties (AOPs). Since GEO satellite measurements have the advantage of continuous monitoring of AOPs, we can analyze the spatiotemporal variation of aerosols using the MI observations over NEA. We therefore developed an algorithm to retrieve aerosol optical depth (AOD) from the MI visible observations, named the MI Yonsei Aerosol Retrieval Algorithm (YAER). In this study, we investigated the accuracy of the MI YAER AOD by comparing it with the long-term products of AERONET sun photometers. The results showed that the MI AODs were significantly overestimated relative to the AERONET values over bright surfaces in low-AOD cases. Because the MI visible channel is centered in the red spectral range, the contribution of the aerosol signal to the measured reflectance is relatively low compared with the surface contribution, so the AOD error in low-AOD cases over bright surfaces is a fundamental limitation of the algorithm. Meanwhile, the assumption of a background aerosol optical depth (BAOD) could also introduce retrieval uncertainty. To estimate the surface reflectance while accounting for the polluted air over the NEA, we estimated the BAOD pixel by pixel from the MODIS dark target (DT) aerosol products. Satellite-based AOD retrieval, however, depends largely on the accuracy of the surface reflectance estimation, especially in low-AOD cases, and thus the BAOD could inherit the uncertainty in the surface reflectance estimation of the satellite-based retrieval. 
Therefore, we re-estimated the BAOD using ground-based sun-photometer measurements and investigated the effects of the BAOD assumption. The satellite-based BAOD was significantly higher than the ground-based value over urban areas, which resulted in underestimation of the surface reflectance and overestimation of the AOD. The error analysis of the MI AOD also clearly showed sensitivity to cloud contamination. Therefore, future work will require improvements to the cloud-masking process in the developed single-channel MI algorithm, as well as modification of the surface reflectance estimation.

  17. Introducing two Random Forest based methods for cloud detection in remote sensing images

    NASA Astrophysics Data System (ADS)

    Ghasemian, Nafiseh; Akhoondzadeh, Mehdi

    2018-07-01

    Cloud detection is a necessary phase in satellite image processing to retrieve atmospheric and lithospheric parameters. Some cloud detection methods based on the Random Forest (RF) model have been proposed, but they do not consider both the spectral and textural characteristics of the image, and they have not been tested in the presence of snow/ice. In this paper, we introduce two RF-based algorithms, Feature Level Fusion Random Forest (FLFRF) and Decision Level Fusion Random Forest (DLFRF), which incorporate visible, infrared (IR), and thermal spectral and textural features (FLFRF), including the Gray Level Co-occurrence Matrix (GLCM) and Robust Extended Local Binary Pattern (RELBP_CI), or visible, IR, and thermal classifiers (DLFRF), for highly accurate cloud detection in remote sensing images. FLFRF first fuses the visible, IR, and thermal features; it then uses the RF model to classify pixels as cloud, snow/ice, and background, or as thick cloud, thin cloud, and background. DLFRF considers the visible, IR, and thermal features (both spectral and textural) separately and feeds each set of features into an RF model. It then retains the vote matrix of each run of the model and finally fuses the classifiers using majority voting. To demonstrate the effectiveness of the proposed algorithms, 10 Terra MODIS and 15 Landsat 8 OLI/TIRS images with different spatial resolutions are used in this paper. Quantitative analyses are based on manually selected ground truth data. Results show that adding RELBP_CI to the input feature set improves cloud detection accuracy. Also, the average cloud kappa values of FLFRF and DLFRF on MODIS images (1 and 0.99) are higher than those of other machine learning methods: Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), K Nearest Neighbor (KNN), and Support Vector Machine (SVM) (0.96). The average snow/ice kappa values of FLFRF and DLFRF on MODIS images (1 and 0.85) are higher than those of other traditional methods. 
The quantitative values on Landsat 8 images show a similar trend. Consequently, while SVM and K-nearest neighbor overestimate cloud and snow/ice pixels, our Random Forest (RF) based models achieve higher cloud and snow/ice kappa values on MODIS images and higher thin-cloud, thick-cloud, and snow/ice kappa values on Landsat 8 images. Our algorithms predict both thin and thick cloud on Landsat 8 images, whereas the existing cloud detection algorithm, Fmask, cannot discriminate them. Compared to the state-of-the-art methods, our algorithms achieve higher average cloud and snow/ice kappa values across different spatial resolutions.
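
The decision-level fusion step of DLFRF, majority voting over per-band classifiers, can be sketched as follows. This is a toy illustration: the three per-pixel label lists stand in for the visible, IR, and thermal Random Forest outputs, which the sketch does not implement.

```python
from collections import Counter

def majority_vote(*label_maps):
    """Fuse per-pixel label lists from several classifiers by majority."""
    fused = []
    for labels in zip(*label_maps):
        fused.append(Counter(labels).most_common(1)[0][0])
    return fused

visible  = ["cloud", "cloud", "snow", "background"]
infrared = ["cloud", "snow",  "snow", "background"]
thermal  = ["cloud", "cloud", "background", "background"]
fused = majority_vote(visible, infrared, thermal)
# -> ["cloud", "cloud", "snow", "background"]
```

With an odd number of classifiers over two competing labels there is always a strict majority; a full implementation would also define a tie-breaking rule for the three-label case.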

  18. Texture orientation-based algorithm for detecting infrared maritime targets.

    PubMed

    Wang, Bin; Dong, Lili; Zhao, Ming; Wu, Houde; Xu, Wenhai

    2015-05-20

    Infrared maritime target detection is a key technology for maritime target searching systems. However, in infrared maritime images (IMIs) taken under complicated sea conditions, background clutters such as ocean waves, clouds or sea fog usually have high intensity that can easily overwhelm the brightness of real targets, which is difficult for traditional target detection algorithms to deal with. To mitigate this problem, this paper proposes a novel target detection algorithm based on texture orientation. The algorithm first extracts suspected targets by analyzing the inter-subband correlation between the horizontal and vertical wavelet subbands of the original IMI on the first scale. Then self-adaptive wavelet threshold denoising and local singularity analysis of the original IMI are combined to further remove false alarms. Experiments show that, compared with traditional algorithms, this algorithm suppresses background clutter much better and achieves better single-frame detection of infrared maritime targets. In addition, to further guarantee accurate target extraction, a pipeline-filtering algorithm is adopted to eliminate residual false alarms. The high practical value and applicability of the proposed strategy are strongly supported by experimental data acquired under different environmental conditions.

  19. Model space exploration for determining landslide source history from long period seismic data

    NASA Astrophysics Data System (ADS)

    Zhao, Juan; Mangeney, Anne; Stutzmann, Eléonore; Capdeville, Yann; Moretti, Laurent; Calder, Eliza S.; Smith, Patrick J.; Cole, Paul; Le Friant, Anne

    2013-04-01

    The seismic signals generated by high-magnitude landslide events can be recorded at remote stations, which provides access to the landslide process. During the "Boxing Day" eruption at Montserrat in 1997, the long-period seismic signals generated by the debris avalanche were recorded by two stations at distances of 450 km and 1261 km. We investigate the landslide process assuming that the landslide source can be described by single forces. The period band 25-50 sec is selected, for which the landslide signal is clearly visible at both stations. We first use the transverse component of the closest station to determine the horizontal forces. We model the seismogram by normal-mode summation and explore the model space. Two horizontal forces are found that best fit the data; they have similar amplitudes but opposite directions and are separated in time by 70 sec. The radiation pattern of the transverse component does not allow the exact azimuth of these forces to be determined. We then model the vertical component of the seismograms, which enables both the vertical and horizontal forces to be retrieved. Using the parameters previously determined (the amplitude ratio and time shift of the two horizontal forces), we further explore the model space and show that a single vertical force together with the two horizontal forces fits the data. The complete source time function can be described as follows: a horizontal force directed opposite to the landslide flow is followed 40 sec later by a vertical downward force, and 30 sec later still by a horizontal force in the direction of the flow. Directly inverting the seismograms in the 25-50 sec period band retrieves a source time function consistent with the three forces determined previously. The source time function in this narrow period band alone, however, does not easily allow the corresponding single forces to be recovered. 
This method can be used to determine the source parameters using only two distant stations. It was also successfully tested on the Mount St. Helens (1980) event, which was recorded by more broadband stations.

  20. Development of visible/infrared/microwave agriculture classification and biomass estimation algorithms. [Guymon, Oklahoma and Dalhart, Texas]

    NASA Technical Reports Server (NTRS)

    Rosenthal, W. D.; Mcfarland, M. J.; Theis, S. W.; Jones, C. L. (Principal Investigator)

    1982-01-01

    Agricultural crop classification models using two or more spectral regions (visible through microwave) are considered in an effort to estimate biomass at Guymon, Oklahoma and Dalhart, Texas. Both ground truth and aerial data were used. Results indicate that including C-, L-, and P-band active microwave data, from look angles greater than 35 deg from nadir, with visible and infrared data improves crop discrimination and biomass estimates compared with results using only visible and infrared data. The microwave frequencies were sensitive to different biomass levels: the K and C bands were sensitive to differences at low biomass levels, while the P band was sensitive to differences at high biomass levels. Two indices, one using only active microwave data and the other using data from the middle and near infrared bands, were well correlated with total biomass. This implies that including active microwave sensors with visible and infrared sensors on future satellites could aid crop discrimination and biomass estimation.

  1. Visibility Equalizer Cutaway Visualization of Mesoscopic Biological Models.

    PubMed

    Le Muzic, M; Mindek, P; Sorger, J; Autin, L; Goodsell, D; Viola, I

    2016-06-01

    In scientific illustration and visualization, cutaway views are often employed as an effective technique for occlusion management in densely packed scenes. We propose a novel method for authoring cutaway illustrations of mesoscopic biological models. In contrast to existing cutaway algorithms, we take advantage of the specific nature of biological models, which consist of thousands of instances of a comparably small number of different types. Our method is a two-stage process. In the first stage, clipping objects are placed in the scene, creating a cutaway visualization of the model. During this process, a hierarchical list of stacked bars informs the user about the instance visibility distribution of each individual molecular type in the scene. In the second stage, the visibility of each molecular type is fine-tuned through these bars, which at this point act as interactive visibility equalizers. An evaluation of our technique with domain experts confirmed that our equalizer-based approach to visibility specification was valuable and effective for both scientific and educational purposes.
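
The per-type statistic that drives the equalizer bars described above can be sketched directly: given each instance's molecular type and a visible/occluded flag, report the fraction of visible instances per type. The function and data names below are illustrative assumptions, not the paper's implementation.

```python
def visibility_by_type(instances):
    """instances: list of (type_name, is_visible) -> {type: visible fraction}."""
    totals, visible = {}, {}
    for kind, vis in instances:
        totals[kind] = totals.get(kind, 0) + 1
        visible[kind] = visible.get(kind, 0) + (1 if vis else 0)
    return {kind: visible[kind] / totals[kind] for kind in totals}

# toy scene: five instances of three molecular types
scene = [("RNA", True), ("RNA", False), ("capsid", True),
         ("capsid", True), ("membrane", False)]
bars = visibility_by_type(scene)
# bars["RNA"] == 0.5, bars["capsid"] == 1.0, bars["membrane"] == 0.0
```

In the interactive setting, dragging an equalizer bar would set a target fraction per type, and the system would adjust clipping so the rendered scene matches it.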

  2. Visibility Equalizer Cutaway Visualization of Mesoscopic Biological Models

    PubMed Central

    Le Muzic, M.; Mindek, P.; Sorger, J.; Autin, L.; Goodsell, D.; Viola, I.

    2017-01-01

    In scientific illustrations and visualization, cutaway views are often employed as an effective technique for occlusion management in densely packed scenes. We propose a novel method for authoring cutaway illustrations of mesoscopic biological models. In contrast to existing cutaway algorithms, we take advantage of the specific nature of biological models, which consist of thousands of instances of a comparably small number of different types. Our method constitutes a two-stage process. In the first stage, clipping objects are placed in the scene, creating a cutaway visualization of the model. During this process, a hierarchical list of stacked bars informs the user about the instance visibility distribution of each individual molecular type in the scene. In the second stage, the visibility of each molecular type is fine-tuned through these bars, which at this point act as interactive visibility equalizers. An evaluation of our technique with domain experts confirmed that our equalizer-based approach to visibility specification was valuable and effective for both scientific and educational purposes. PMID:28344374

  3. Remote assessment of ocean color for interpretation of satellite visible imagery: A review

    NASA Technical Reports Server (NTRS)

    Gordon, H. R.; Morel, A. Y.

    1983-01-01

    An assessment is presented of the state of the art of remote (satellite-based) Coastal Zone Color Scanner (CZCS) observation of color variations in the ocean due to phytoplankton. Attention is given to physical problems associated with ocean color remote sensing, algorithms for the correction of atmospheric effects, in-water constituent retrieval algorithms, and the application of these algorithms to CZCS imagery. The applicability of CZCS to both near-coast and mid-ocean waters is considered. It is concluded that, while the differences between the two environments are complex, universal algorithms can be used for mid-ocean waters, and site-specific algorithms are adequate for CZCS imaging of the near-coast oceanic environment. A short description of CZCS and some sample photographs are provided in an appendix.
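    Constituent retrievals of the CZCS era were typically empirical band-ratio formulas relating a blue/green reflectance ratio to chlorophyll concentration. The following is only a schematic of that functional form; the coefficients are placeholders, not operational CZCS values:

    ```python
    import math

    def band_ratio_chlorophyll(r_blue, r_green, a=0.3, b=-1.3):
        """Schematic power-law band-ratio retrieval:
        chl = 10 ** (a + b * log10(R_blue / R_green)).
        Coefficients a and b are illustrative placeholders only."""
        return 10 ** (a + b * math.log10(r_blue / r_green))

    # Hypothetical water-leaving reflectances in a blue and a green band:
    chl = band_ratio_chlorophyll(0.02, 0.01)
    ```

    The negative exponent encodes the physical trend that higher chlorophyll absorbs blue light, lowering the blue/green ratio and raising the retrieved concentration.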

  4. Toward the Development of a Coupled COAMPS-ROMS Ensemble Kalman Filter and Adjoint with a focus on the Indian Ocean and the Intraseasonal Oscillation

    DTIC Science & Technology

    2015-09-30

    Approved for public release; distribution is unlimited. Toward the Development of a Coupled COAMPS-ROMS Ensemble Kalman Filter and Adjoint... system at NCAR. (2) Compare the performance of the Ensemble Kalman Filter (EnKF) using the Data Assimilation Research Testbed (DART) and 4... undercurrent is clearly visible. Figure 2 shows the horizontal temperature structure and circulation at a depth of 50 m within the surface mixed layer

  5. Method of making hollow elastomeric bodies

    NASA Technical Reports Server (NTRS)

    Broyles, H. F.; Moacanin, J.; Cuddihy, E. F. (Inventor)

    1976-01-01

    Annular elastomeric bodies having intricate shapes are cast by dipping a heated, rotating mandrel into a solution of the elastomer, permitting the elastomer to creep into sharp recesses, drying the coated mandrel, and repeating the operation until the desired thickness has been achieved. A bladder for a heart-assist pump, consisting of a cylindrical body terminating in flat, sharp horizontal flanges fabricated by this procedure, has been subjected to over 2,500 hours of simulated life conditions with no visible signs of degradation.

  6. Atmospheric Pressure During Landing

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This figure shows the variation with time of pressure (dots) measured by the Pathfinder MET instrument during the landing period shown in image PIA00797. The two diamonds indicate the times of bridle cutting and first impact. The overall trend in the data is of pressure increasing with time, almost certainly because the lander rolled downhill by roughly 10 m. The spacing of the horizontal dotted lines indicates the pressure change expected from a 10 m change in altitude. Bounces may also be visible in the data.

  7. Visible aurora in Jupiter's atmosphere

    NASA Technical Reports Server (NTRS)

    Cook, A. F., II; Jones, A. V.; Shemansky, D. E.

    1981-01-01

    The darkside limb pictures obtained by the imaging experiment on Voyager 1 have been reexamined. It is concluded that the observed luminosity is very likely due at least in part to Io torus aurora. If the effective wavelength of the emission lies in the 4000- to 5000-A region, the slant intensity is estimated to be about 20 kR. The observed double structure may be due to a number of causes such as horizontal structure in auroral emission, aurora plus twilight or photochemical airglow plus aurora.

  8. Problem Solving by 5-6 Years Old Kindergarten Children in a Computer Programming Environment: A Case Study

    ERIC Educational Resources Information Center

    Fessakis, G.; Gouli, E.; Mavroudi, E.

    2013-01-01

    Computer programming is considered an important competence for the development of higher-order thinking in addition to algorithmic problem solving skills. Its horizontal integration throughout all educational levels is considered worthwhile and attracts the attention of researchers. Towards this direction, an exploratory case study is presented…

  9. Automated Driftmeter Fused with Inertial Navigation

    DTIC Science & Technology

    2014-03-27

    IMU Inertial Measurement Unit; SLAM Simultaneous...timing lines to remain horizontal at all times, regardless of turbulence and within 20 degrees of roll, pitch, and yaw of the aircraft. It had two...introduced in 1960 [2]. The Kalman filter algorithm has been used to merge inertial navigational data from Inertial Measurement Units (IMU) with

  10. Hyperspectral image processing methods

    USDA-ARS?s Scientific Manuscript database

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  11. Imaging of Biological Tissues by Visible Light CDI

    NASA Astrophysics Data System (ADS)

    Karpov, Dmitry; Dos Santos Rolo, Tomy; Rich, Hannah; Fohtung, Edwin

    Recent advances in the use of synchrotron and X-ray free-electron laser (XFEL) based coherent diffraction imaging (CDI), with applications to materials science and medicine, have proved the technique efficient in recovering information about samples encoded in the phase domain. The current state-of-the-art reconstruction algorithms are transferable to optical frequencies, which makes laser sources a reasonable milestone both in technique development and in applications. Here we present the first results from a table-top laser CDI system for imaging of biological tissues, describe the development of reconstruction algorithms, and discuss complementary approaches to improving data quality that are applicable at visible-light frequencies due to their properties. We demonstrate the applicability of the developed methodology to a wide class of soft bio-matter and condensed-matter systems. This project is funded by DOD-AFOSR under Award No. FA9550-14-1-0363 and the LANSCE Professorship at LANL.

  12. Algorithm for fuel conservative horizontal capture trajectories

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Erzberger, H.

    1981-01-01

    A real-time algorithm for computing constant-altitude, fuel-conservative approach trajectories for aircraft is described. The characteristics of the computed trajectory were chosen to approximate the extremal trajectories obtained from the optimal control solution to the problem, with a fuel difference of only 0.5 to 2 percent in favor of the extremals. The trajectories may start at any initial position, heading, and speed and end at any other final position, heading, and speed. They consist of straight lines and a series of circular arcs of varying radius to approximate constant bank-angle decelerating turns. Throttle control is maximum thrust, nominal thrust, or zero thrust. Bank-angle control is either zero or approximately 30 deg.
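    The circular arcs approximate constant bank-angle turns; for a coordinated, constant-altitude turn, the radius follows from speed and bank angle by a standard kinematic relation (an illustration of the turn geometry, not the paper's full trajectory algorithm):

    ```python
    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def turn_radius(speed_mps, bank_deg):
        """Radius of a coordinated, constant-altitude turn:
        r = v^2 / (g * tan(phi))."""
        return speed_mps ** 2 / (G * math.tan(math.radians(bank_deg)))

    # At 100 m/s with a 30-degree bank angle:
    r = turn_radius(100.0, 30.0)  # about 1766 m
    ```

    Because the radius grows with the square of speed, a decelerating turn at fixed bank angle traces arcs of steadily shrinking radius, which is why the algorithm strings together a series of arcs rather than a single circle.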

  13. Credit BG. View looking northeast down from the tower onto ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Credit BG. View looking northeast down from the tower onto the two horizontal test stations at Test Stand "D." Station Dy is at the far left (Dy vacuum cell out of view), with in-line exhaust gas cooling sections and steam-driven "air ejector" (or evacuator) discharging engine exhausts to the east. The Dd cell is visible at the lower left, and the Dd exhaust train has the same functions as at Dy. The spherical tank is an electrically heated "accumulator" which supplies steam to the ejectors at Dv, Dd, and Dy stations. Other large piping delivered cooling water to the horizontal train cooling sections. The horizontal duct at the "Y" branch in the Dd train connects the Dd ejector to the Dv and Cv vacuum duct system (a blank can be bolted into this duct to isolate the Dd system). The shed roof for the Dpond test station appears at bottom center of this image. The open steel frame to the lower left of the image supports a hoist and crane for installing or removing test engines from the Dd test cell - Jet Propulsion Laboratory Edwards Facility, Test Stand D, Edwards Air Force Base, Boron, Kern County, CA

  14. The spectral signature of cloud spatial structure in shortwave irradiance

    PubMed Central

    Song, Shi; Schmidt, K. Sebastian; Pilewskie, Peter; King, Michael D.; Heidinger, Andrew K.; Walther, Andi; Iwabuchi, Hironobu; Wind, Gala; Coddington, Odele M.

    2017-01-01

    In this paper, we used cloud imagery from a NASA field experiment in conjunction with three-dimensional radiative transfer calculations to show that cloud spatial structure manifests itself as a spectral signature in shortwave irradiance fields – specifically in transmittance and net horizontal photon transport in the visible and near-ultraviolet wavelength range. We found a robust correlation between the magnitude of net horizontal photon transport (H) and its spectral dependence (slope), which is scale-invariant and holds for the entire pixel population of a domain. This was surprising at first given the large degree of spatial inhomogeneity. We prove that the underlying physical mechanism for this phenomenon is molecular scattering in conjunction with cloud spatial structure. On this basis, we developed a simple parameterization through a single parameter ε, which quantifies the characteristic spectral signature of spatial inhomogeneities. In the case we studied, neglecting net horizontal photon transport leads to a local transmittance bias of ±12–19 %, even at the relatively coarse spatial resolution of 20 km. Since three-dimensional effects depend on the spatial context of a given pixel in a nontrivial way, the spectral dimension of this problem may emerge as the starting point for future bias corrections. PMID:28824698

  15. The spectral signature of cloud spatial structure in shortwave irradiance.

    PubMed

    Song, Shi; Schmidt, K Sebastian; Pilewskie, Peter; King, Michael D; Heidinger, Andrew K; Walther, Andi; Iwabuchi, Hironobu; Wind, Gala; Coddington, Odele M

    2016-11-08

    In this paper, we used cloud imagery from a NASA field experiment in conjunction with three-dimensional radiative transfer calculations to show that cloud spatial structure manifests itself as a spectral signature in shortwave irradiance fields - specifically in transmittance and net horizontal photon transport in the visible and near-ultraviolet wavelength range. We found a robust correlation between the magnitude of net horizontal photon transport ( H ) and its spectral dependence (slope), which is scale-invariant and holds for the entire pixel population of a domain. This was surprising at first given the large degree of spatial inhomogeneity. We prove that the underlying physical mechanism for this phenomenon is molecular scattering in conjunction with cloud spatial structure. On this basis, we developed a simple parameterization through a single parameter ε , which quantifies the characteristic spectral signature of spatial inhomogeneities. In the case we studied, neglecting net horizontal photon transport leads to a local transmittance bias of ±12-19 %, even at the relatively coarse spatial resolution of 20 km. Since three-dimensional effects depend on the spatial context of a given pixel in a nontrivial way, the spectral dimension of this problem may emerge as the starting point for future bias corrections.

  16. Space Radar Image of Central Sumatra, Indonesia

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This is a radar image of the central part of the island of Sumatra in Indonesia that shows how the tropical rainforest typical of this country is being impacted by human activity. Native forest appears in green in this image, while prominent pink areas represent places where the native forest has been cleared. The large rectangular areas have been cleared for palm oil plantations. The bright pink zones are areas that have been cleared since 1989, while the dark pink zones are areas that were cleared before 1989. These radar data were processed as part of an effort to assist oil and gas companies working in the area to assess the environmental impact of both their drilling operations and the activities of the local population. Radar images are useful in these areas because heavy cloud cover and the persistent smoke and haze associated with deforestation have prevented usable visible-light imagery from being acquired since 1989. The dark shapes in the upper right (northeast) corner of the image are a chain of lakes in flat coastal marshes. This image was acquired in October 1994 by the Spaceborne Imaging Radar C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR) onboard the space shuttle Endeavour. Environmental changes can be easily documented by comparing this image with visible-light data that were acquired in previous years by the Landsat satellite. The image is centered at 0.9 degrees north latitude and 101.3 degrees east longitude. The area shown is 50 kilometers by 100 kilometers (31 miles by 62 miles). The colors in the image are assigned to different frequencies and polarizations of the radar as follows: red is L-band horizontally transmitted, horizontally received; green is L-band horizontally transmitted, vertically received; blue is L-band vertically transmitted, vertically received. SIR-C/X-SAR, a joint mission of the German, Italian and United States space agencies, is part of NASA's Mission to Planet Earth program.

  17. Space Radar Image of Central Sumatra, Indonesia

    NASA Image and Video Library

    1999-04-15

    This is a radar image of the central part of the island of Sumatra in Indonesia that shows how the tropical rainforest typical of this country is being impacted by human activity. Native forest appears in green in this image, while prominent pink areas represent places where the native forest has been cleared. The large rectangular areas have been cleared for palm oil plantations. The bright pink zones are areas that have been cleared since 1989, while the dark pink zones are areas that were cleared before 1989. These radar data were processed as part of an effort to assist oil and gas companies working in the area to assess the environmental impact of both their drilling operations and the activities of the local population. Radar images are useful in these areas because heavy cloud cover and the persistent smoke and haze associated with deforestation have prevented usable visible-light imagery from being acquired since 1989. The dark shapes in the upper right (northeast) corner of the image are a chain of lakes in flat coastal marshes. This image was acquired in October 1994 by the Spaceborne Imaging Radar C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR) onboard the space shuttle Endeavour. Environmental changes can be easily documented by comparing this image with visible-light data that were acquired in previous years by the Landsat satellite. The image is centered at 0.9 degrees north latitude and 101.3 degrees east longitude. The area shown is 50 kilometers by 100 kilometers (31 miles by 62 miles). The colors in the image are assigned to different frequencies and polarizations of the radar as follows: red is L-band horizontally transmitted, horizontally received; green is L-band horizontally transmitted, vertically received; blue is L-band vertically transmitted, vertically received. SIR-C/X-SAR, a joint mission of the German, Italian and United States space agencies, is part of NASA's Mission to Planet Earth program. 
http://photojournal.jpl.nasa.gov/catalog/PIA01797

  18. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    NASA Astrophysics Data System (ADS)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide new insight into time series analysis. The inheritance of the visibility graph from the original time series is further explored in this paper. We found that the degree distributions of visibility graphs extracted from pseudo-Brownian-motion series, obtained by the frequency-domain algorithm, exhibit exponential behavior, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function differ for different series and that the visibility graph inherits some important features of the original time series. Further, we convert several quarterly macroeconomic series, including the growth rates of value added of three industry series and the growth rates of the Gross Domestic Product (GDP) series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs constructed from the four macroeconomic series, namely the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis, we find that the degree distributions of the networks constructed from the growth rates of value added of the three industry series are almost exponential, while the degree distributions of the networks constructed from the growth rates of the GDP series are scale-free. We also discuss the assortativity and disassortativity of the four networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the networks suggest dynamic changes in the original macroeconomic series. We also detected the relationship among government policy changes, the community structures of the networks, and macroeconomic dynamics, and we find strong influences of government policies in China on the dynamics of GDP and the adjustment of the three industries. The work in this paper provides a new way to understand the dynamics of economic development.
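    The mapping from a time series to a network can be sketched for the horizontal visibility variant (a minimal quadratic-time illustration, not the authors' code): two samples are linked exactly when every sample between them lies strictly below both.

    ```python
    def horizontal_visibility_edges(series):
        """Edges of the horizontal visibility graph of a time series:
        nodes i < j are connected iff y_k < min(y_i, y_j) for all i < k < j."""
        edges = []
        n = len(series)
        for i in range(n - 1):
            for j in range(i + 1, n):
                if all(series[k] < min(series[i], series[j])
                       for k in range(i + 1, j)):
                    edges.append((i, j))
        return edges

    print(horizontal_visibility_edges([3, 1, 2, 4]))
    # [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3)]
    ```

    Adjacent samples are always connected, so the resulting graph is connected by construction; its degree distribution is then analyzed with the usual complex network toolkit.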

  19. Large Scale Ice Water Path and 3-D Ice Water Content

    DOE Data Explorer

    Liu, Guosheng

    2008-01-15

    Cloud ice water concentration is one of the most important, yet most poorly observed, cloud properties. Developing physical parameterizations used in general circulation models through single-column modeling is one of the key foci of the ARM program. In addition to the vertical profiles of temperature, water vapor, and condensed water at the model grids, large-scale horizontal advective tendencies of these variables are also required as forcing terms in the single-column models. Observed horizontal advection of condensed water has not been available because the radar/lidar/radiometer observations at the ARM site are single-point measurements and therefore do not provide the horizontal distribution of condensed water. The intention of this product is to provide the large-scale distribution of cloud ice water by merging available surface and satellite measurements. The satellite cloud ice water algorithm uses ARM ground-based measurements as a baseline and produces datasets of 3-D cloud ice water distributions in a 10 deg x 10 deg area near the ARM site. The approach of the study is to expand a (surface) point measurement to a (satellite) areal measurement. That is, this study takes advantage of the high-quality cloud measurements at the ARM site: the cloud characteristics derived from the point measurement guide and constrain the satellite retrieval, and the satellite algorithm then derives the cloud ice water distributions within the 10 deg x 10 deg area centered at the ARM site.

  20. Development and Application of Non-Linear Image Enhancement and Multi-Sensor Fusion Techniques for Hazy and Dark Imaging

    NASA Technical Reports Server (NTRS)

    Rahman, Zia-ur

    2005-01-01

    The purpose of this research was to develop enhancement and multi-sensor fusion algorithms and techniques to make it safer for the pilot to fly in what would normally be considered Instrument Flight Rules (IFR) conditions, where pilot visibility is severely restricted by fog, haze, or other weather phenomena. We proposed to use the non-linear Multiscale Retinex (MSR) as the basic driver for developing an integrated enhancement and fusion engine. When we started this research, the MSR was being applied primarily to grayscale imagery, such as medical images, or to three-band color imagery, such as that produced in consumer photography; it was not, however, being applied to other imagery, such as that produced by infrared image sources. We felt that, by using the MSR algorithm in conjunction with multiple imaging modalities such as long-wave infrared (LWIR), short-wave infrared (SWIR), and the visible spectrum (VIS), we could substantially improve on the then state-of-the-art enhancement algorithms, especially in poor visibility conditions. We proposed the following tasks: 1) investigate the effects of applying the MSR to LWIR and SWIR images, which consisted of optimizing the algorithm in terms of surround scales and weights for these spectral bands; 2) fuse the LWIR and SWIR images with the VIS images using the MSR framework to determine the best possible representation of the desired features; 3) evaluate different mixes of LWIR, SWIR, and VIS bands for maximum fog and haze reduction and low-light-level compensation; 4) modify the existing algorithms to work with video sequences. Over the course of the three-year research period, we accomplished these tasks and reported on them in internal presentations at NASA Langley Research Center and in presentations and publications elsewhere. A description of the work performed under these tasks is provided in Section 2, and the complete list of relevant publications during the research period is provided in Section 5. This research also resulted in the generation of intellectual property.

  1. Non-damaging laser therapy of the macula: Titration algorithm and tissue response

    NASA Astrophysics Data System (ADS)

    Palanker, Daniel; Lavinsky, Daniel; Dalal, Roopa; Huie, Philip

    2014-02-01

    Retinal photocoagulation typically results in permanent scarring and scotomata, which limit its applicability in the macula, preclude treatment in the fovea, and restrict retreatment. Non-damaging approaches to laser therapy have been tested in the past, but the lack of reliable titration and slow treatment paradigms limited their clinical use. We developed and tested a titration algorithm for sub-visible and non-damaging treatments of the retina with pulses sufficiently short to be used with pattern laser scanning. The algorithm, based on an Arrhenius model of tissue damage, optimizes the power and duration for every energy level, relative to the threshold of lesion visibility established during titration (defined as 100%). Experiments with pigmented rabbits established that lesions in the 50-75% energy range were invisible ophthalmoscopically but detectable with fluorescein angiography and OCT, while at 30% energy there was only very minor damage to the RPE, which recovered within a few days. Patients with Diabetic Macular Edema (DME) and Central Serous Retinopathy (CSR) were treated over the edematous areas at 30% energy, using 200 μm spots with 0.25-diameter spacing. No signs of laser damage were detected with any imaging modality. In CSR patients, subretinal fluid resolved within 45 days. In DME patients the edema decreased by approximately 150 μm over 60 days. After 3-4 months some patients presented with recurrence of edema; they responded well to retreatment with the same parameters, without any clinically visible damage. These pilot data indicate the possibility of effective and repeatable macular laser therapy below the tissue damage threshold.
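    An Arrhenius tissue-damage model of the kind referenced here evaluates a damage integral over the temperature history of the tissue. The sketch below shows the general form only; the rate constants are illustrative protein-denaturation-scale values, not the paper's clinical calibration:

    ```python
    import math

    R = 8.314  # universal gas constant, J/(mol*K)

    def arrhenius_damage(temps_k, dt, A=3.1e98, Ea=6.28e5):
        """Damage integral Omega = sum A * exp(-Ea / (R*T)) * dt over a
        discretized temperature history; Omega ~ 1 marks the conventional
        damage threshold. A and Ea here are illustrative constants."""
        return sum(A * math.exp(-Ea / (R * T)) * dt for T in temps_k)

    # Hypothetical 10 ms exposure at a constant elevated temperature of 330 K:
    omega = arrhenius_damage([330.0] * 10, dt=1e-3)
    ```

    The steep exponential dependence on temperature is what makes short pulses controllable: a small reduction in peak temperature or duration drops the integral far below the damage threshold, which is the regime the titration algorithm targets.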

  2. Evaluation of a metal artifacts reduction algorithm applied to postinterventional flat panel detector CT imaging.

    PubMed

    Stidd, D A; Theessen, H; Deng, Y; Li, Y; Scholz, B; Rohkohl, C; Jhaveri, M D; Moftakhar, R; Chen, M; Lopes, D K

    2014-01-01

    Flat panel detector CT images are degraded by streak artifacts caused by radiodense implanted materials such as coils or clips. A new metal artifacts reduction prototype algorithm has been used to minimize these artifacts. The application of this new metal artifacts reduction algorithm was evaluated for flat panel detector CT imaging performed in a routine clinical setting. Flat panel detector CT images were obtained from 59 patients immediately following cerebral endovascular procedures or as surveillance imaging for cerebral endovascular or surgical procedures previously performed. The images were independently evaluated by 7 physicians for metal artifacts reduction on a 3-point scale at 2 locations: immediately adjacent to the metallic implant and 3 cm away from it. The number of visible vessels before and after metal artifacts reduction correction was also evaluated within a 3-cm radius around the metallic implant. The metal artifacts reduction algorithm was applied to the 59 flat panel detector CT datasets without complications. The metal artifacts in the reduction-corrected flat panel detector CT images were significantly reduced in the area immediately adjacent to the implanted metal object (P = .05) and in the area 3 cm away from the metal object (P = .03). The average number of visible vessel segments increased from 4.07 to 5.29 (P = .1235) after application of the metal artifacts reduction algorithm to the flat panel detector CT images. Metal artifacts reduction is an effective method to improve flat panel detector CT images degraded by metal artifacts. Metal artifacts are significantly decreased by the metal artifacts reduction algorithm, and there was a trend toward increased vessel-segment visualization. © 2014 by American Journal of Neuroradiology.

  3. Evaluation of a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography of scaphoid fixation screws.

    PubMed

    Filli, Lukas; Marcon, Magda; Scholz, Bernhard; Calcagni, Maurizio; Finkenstädt, Tim; Andreisek, Gustav; Guggenberger, Roman

    2014-12-01

    The aim of this study was to evaluate a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography (FDCT) of scaphoid fixation screws. FDCT has gained interest in imaging small anatomic structures of the appendicular skeleton. Angiographic C-arm systems with flat detectors allow fluoroscopy and FDCT imaging in a one-stop procedure emphasizing their role as an ideal intraoperative imaging tool. However, FDCT imaging can be significantly impaired by artefacts induced by fixation screws. Following ethical board approval, commercially available scaphoid fixation screws were inserted into six cadaveric specimens in order to fix artificially induced scaphoid fractures. FDCT images corrected with the algorithm were compared to uncorrected images both quantitatively and qualitatively by two independent radiologists in terms of artefacts, screw contour, fracture line visibility, bone visibility, and soft tissue definition. Normal distribution of variables was evaluated using the Kolmogorov-Smirnov test. In case of normal distribution, quantitative variables were compared using paired Student's t tests. The Wilcoxon signed-rank test was used for quantitative variables without normal distribution and all qualitative variables. A p value of < 0.05 was considered to indicate statistically significant differences. Metal artefacts were significantly reduced by the correction algorithm (p < 0.001), and the fracture line was more clearly defined (p < 0.01). The inter-observer reliability was "almost perfect" (intra-class correlation coefficient 0.85, p < 0.001). The prototype correction algorithm in FDCT for metal artefacts induced by scaphoid fixation screws may facilitate intra- and postoperative follow-up imaging. Flat detector computed tomography (FDCT) is a helpful imaging tool for scaphoid fixation. The correction algorithm significantly reduces artefacts in FDCT induced by scaphoid fixation screws. 
This may facilitate intra- and postoperative follow-up imaging.

  4. RESOLVE: A new algorithm for aperture synthesis imaging of extended emission in radio astronomy

    NASA Astrophysics Data System (ADS)

    Junklewitz, H.; Bell, M. R.; Selig, M.; Enßlin, T. A.

    2016-02-01

    We present resolve, a new algorithm for radio aperture synthesis imaging of extended and diffuse emission in total intensity. The algorithm is derived using Bayesian statistical inference techniques, estimating the surface brightness in the sky assuming a priori log-normal statistics. resolve estimates the measured sky brightness in total intensity, and the spatial correlation structure in the sky, which is used to guide the algorithm to an optimal reconstruction of extended and diffuse sources. During this process, the algorithm succeeds in deconvolving the effects of the radio interferometric point spread function. Additionally, resolve provides a map with an uncertainty estimate of the reconstructed surface brightness. Furthermore, with resolve we introduce a new, optimal visibility weighting scheme that can be viewed as an extension to robust weighting. In tests using simulated observations, the algorithm shows improved performance against two standard imaging approaches for extended sources, Multiscale-CLEAN and the Maximum Entropy Method.

  5. Comparative analysis of multisensor satellite monitoring of Arctic sea-ice

    USGS Publications Warehouse

    Belchansky, G.I.; Mordvintsev, Ilia N.; Douglas, David C.

    1999-01-01

    This report presents a comparative analysis of nearly coincident Russian OKEAN-01 polar-orbiting satellite data, Special Sensor Microwave Imager (SSM/I) imagery, and Advanced Very High Resolution Radiometer (AVHRR) imagery. The OKEAN-01 ice concentration algorithms utilize active and passive microwave measurements and a linear mixture model for the measured brightness temperature and radar backscatter. SSM/I and AVHRR ice concentrations were computed with the NASA Team algorithm and with visible and thermal-infrared AVHRR data, respectively.

  6. Fusion of visible and near-infrared images based on luminance estimation by weighted luminance algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Zhun; Cheng, Feiyan; Shi, Junsheng; Huang, Xiaoqiao

    2018-01-01

In a low-light scene, capturing color images requires either a high-gain setting or a long-exposure setting to avoid a visible flash. However, such settings lead to color images with serious noise or motion blur. Several methods have been proposed to improve a noisy color image using an invisible near-infrared (NIR) flash image. One novel method estimates the luminance component and the chroma component of the improved color image from different image sources [1]: the luminance component is estimated mainly from the NIR image via spectral estimation, and the chroma component is estimated from the noisy color image by denoising. However, estimating the luminance component is challenging: this novel method requires generating learning data pairs, and its processes and algorithm are complex, making practical application difficult. To reduce the complexity of the luminance estimation, an improved luminance estimation algorithm is presented in this paper, which weights the NIR image and the denoised color image with coefficients based on the mean value and standard deviation of both images. Experimental results show that the proposed method achieves the same fusion quality as the novel method in terms of color fidelity and texture, while being simpler and more practical.
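A minimal sketch of the weighted-luminance idea, assuming a simple standard-deviation-based weighting rule with mean matching (the paper's exact coefficients are not specified here):

```python
import numpy as np

def fuse_luminance(nir, denoised_y):
    """Weight the NIR image and the denoised colour-image luminance.
    The weighting rule (std-based, mean-matched) is an assumption that
    illustrates the idea; the paper's exact coefficients may differ."""
    nir = np.asarray(nir, dtype=float)
    y = np.asarray(denoised_y, dtype=float)
    # Trust each source in proportion to its contrast (standard deviation).
    w_nir = nir.std() / (nir.std() + y.std() + 1e-12)
    # Shift the NIR image to the colour image's mean so overall brightness
    # of the fused luminance matches the colour image.
    nir_matched = nir - nir.mean() + y.mean()
    return w_nir * nir_matched + (1.0 - w_nir) * y
```

By construction the fused luminance keeps the mean brightness of the denoised color image while borrowing texture from whichever source has more contrast.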

  7. Simultaneous deblurring and iterative reconstruction of CBCT for image guided brain radiosurgery.

    PubMed

    Hashemi, SayedMasoud; Song, William Y; Sahgal, Arjun; Lee, Young; Huynh, Christopher; Grouza, Vladimir; Nordström, Håkan; Eriksson, Markus; Dorenlot, Antoine; Régis, Jean Marie; Mainprize, James G; Ruschin, Mark

    2017-04-07

    One of the limiting factors in cone-beam CT (CBCT) image quality is system blur, caused by detector response, x-ray source focal spot size, azimuthal blurring, and reconstruction algorithm. In this work, we develop a novel iterative reconstruction algorithm that improves spatial resolution by explicitly accounting for image unsharpness caused by different factors in the reconstruction formulation. While the model-based iterative reconstruction techniques use prior information about the detector response and x-ray source, our proposed technique uses a simple measurable blurring model. In our reconstruction algorithm, denoted as simultaneous deblurring and iterative reconstruction (SDIR), the blur kernel can be estimated using the modulation transfer function (MTF) slice of the CatPhan phantom or any other MTF phantom, such as wire phantoms. The proposed image reconstruction formulation includes two regularization terms: (1) total variation (TV) and (2) nonlocal regularization, solved with a split Bregman augmented Lagrangian iterative method. The SDIR formulation preserves edges, eases the parameter adjustments to achieve both high spatial resolution and low noise variances, and reduces the staircase effect caused by regular TV-penalized iterative algorithms. The proposed algorithm is optimized for a point-of-care head CBCT unit for image-guided radiosurgery and is tested with CatPhan phantom, an anthropomorphic head phantom, and 6 clinical brain stereotactic radiosurgery cases. Our experiments indicate that SDIR outperforms the conventional filtered back projection and TV penalized simultaneous algebraic reconstruction technique methods (represented by adaptive steepest-descent POCS algorithm, ASD-POCS) in terms of MTF and line pair resolution, and retains the favorable properties of the standard TV-based iterative reconstruction algorithms in improving the contrast and reducing the reconstruction artifacts. 
It improves the visibility of high-contrast details in bony areas and in brain soft tissue. For example, the results show that the ventricles and some brain folds become visible in SDIR-reconstructed images and the contrast of visible lesions is effectively improved. The line-pair resolution was improved from 12 line-pair/cm in FBP to 14 line-pair/cm in SDIR. Adjusting the parameters of ASD-POCS to achieve 14 line-pair/cm caused its noise variance to be higher than that of SDIR. Using these parameters for ASD-POCS, the MTF of FBP and ASD-POCS at half maximum were very close and equal to 0.7 mm-1, which SDIR increased to 1.2 mm-1.
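A minimal sketch of deblurring with a measured blur kernel inside an iterative reconstruction loop. This is plain Landweber gradient descent on the blur model alone; SDIR's total-variation and nonlocal regularisers and the split Bregman solver are omitted for brevity:

```python
import numpy as np

def landweber_deblur(y, kernel, n_iter=200, tau=0.5):
    """Gradient descent on ||k * x - y||^2 with a measured blur kernel k
    (1-D toy stand-in for the CBCT system blur). The adjoint of
    convolution is convolution with the flipped kernel."""
    x = y.copy()
    k_flip = kernel[::-1]
    for _ in range(n_iter):
        residual = np.convolve(x, kernel, mode='same') - y
        x -= tau * np.convolve(residual, k_flip, mode='same')
    return x
```

In SDIR the same data-fidelity gradient is combined with TV and nonlocal penalty terms, which suppress the noise this bare iteration would otherwise amplify.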

  8. Simultaneous deblurring and iterative reconstruction of CBCT for image guided brain radiosurgery

    NASA Astrophysics Data System (ADS)

    Hashemi, SayedMasoud; Song, William Y.; Sahgal, Arjun; Lee, Young; Huynh, Christopher; Grouza, Vladimir; Nordström, Håkan; Eriksson, Markus; Dorenlot, Antoine; Régis, Jean Marie; Mainprize, James G.; Ruschin, Mark

    2017-04-01

    One of the limiting factors in cone-beam CT (CBCT) image quality is system blur, caused by detector response, x-ray source focal spot size, azimuthal blurring, and reconstruction algorithm. In this work, we develop a novel iterative reconstruction algorithm that improves spatial resolution by explicitly accounting for image unsharpness caused by different factors in the reconstruction formulation. While the model-based iterative reconstruction techniques use prior information about the detector response and x-ray source, our proposed technique uses a simple measurable blurring model. In our reconstruction algorithm, denoted as simultaneous deblurring and iterative reconstruction (SDIR), the blur kernel can be estimated using the modulation transfer function (MTF) slice of the CatPhan phantom or any other MTF phantom, such as wire phantoms. The proposed image reconstruction formulation includes two regularization terms: (1) total variation (TV) and (2) nonlocal regularization, solved with a split Bregman augmented Lagrangian iterative method. The SDIR formulation preserves edges, eases the parameter adjustments to achieve both high spatial resolution and low noise variances, and reduces the staircase effect caused by regular TV-penalized iterative algorithms. The proposed algorithm is optimized for a point-of-care head CBCT unit for image-guided radiosurgery and is tested with CatPhan phantom, an anthropomorphic head phantom, and 6 clinical brain stereotactic radiosurgery cases. Our experiments indicate that SDIR outperforms the conventional filtered back projection and TV penalized simultaneous algebraic reconstruction technique methods (represented by adaptive steepest-descent POCS algorithm, ASD-POCS) in terms of MTF and line pair resolution, and retains the favorable properties of the standard TV-based iterative reconstruction algorithms in improving the contrast and reducing the reconstruction artifacts. 
It improves the visibility of high-contrast details in bony areas and in brain soft tissue. For example, the results show that the ventricles and some brain folds become visible in SDIR-reconstructed images and the contrast of visible lesions is effectively improved. The line-pair resolution was improved from 12 line-pair/cm in FBP to 14 line-pair/cm in SDIR. Adjusting the parameters of ASD-POCS to achieve 14 line-pair/cm caused its noise variance to be higher than that of SDIR. Using these parameters for ASD-POCS, the MTF of FBP and ASD-POCS at half maximum were very close and equal to 0.7 mm-1, which SDIR increased to 1.2 mm-1.

  9. Verification studies of Seasat-A satellite scatterometer /SASS/ measurements

    NASA Technical Reports Server (NTRS)

    Halberstam, I.

    1981-01-01

Two comparisons between Seasat-A satellite scatterometer (SASS) data and surface truth, obtained from the Gulf of Alaska Seasat Experiment and the Joint Air-Sea Interaction program, have been made to determine the behavior of SASS and its algorithms. The performance of SASS was first evaluated irrespective of the algorithms employed to convert the SASS data to geophysical parameters; this was done by separating the backscatter measurements into small bins of incidence angle, azimuth angle, and polarization, and regressing against wind speed measurements. The algorithms were then tested by comparing their predicted slopes and y-intercepts with those derived from the regressions, and by comparing each SASS backscatter measurement with the backscatter derived from the algorithms and the given wind velocity from the observations. It was shown that SASS was insensitive to winds at high incidence angles for horizontal polarization. Fairly high correlations were found between backscatter and wind speeds. The algorithms functioned well at mid-ranges of incidence angle and backscattering coefficient.
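The binning-and-regression step can be sketched as follows, assuming the usual log-linear scatterometer model form sigma0(dB) = slope * log10(U) + intercept; the bin edges and model form are illustrative:

```python
import numpy as np

def binned_regression(incidence, sigma0_db, log_wind, bin_edges):
    """For each incidence-angle bin, regress backscatter (dB) against
    log10 wind speed and return (slope, intercept) per bin, so the
    predicted slopes/intercepts of an algorithm can be compared
    against the data-derived ones."""
    fits = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (incidence >= lo) & (incidence < hi)
        slope, intercept = np.polyfit(log_wind[mask], sigma0_db[mask], 1)
        fits.append((slope, intercept))
    return fits
```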

  10. User's guide to the Fault Inferring Nonlinear Detection System (FINDS) computer program

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Godiwala, P. M.; Satz, H. S.

    1988-01-01

    Described are the operation and internal structure of the computer program FINDS (Fault Inferring Nonlinear Detection System). The FINDS algorithm is designed to provide reliable estimates for aircraft position, velocity, attitude, and horizontal winds to be used for guidance and control laws in the presence of possible failures in the avionics sensors. The FINDS algorithm was developed with the use of a digital simulation of a commercial transport aircraft and tested with flight recorded data. The algorithm was then modified to meet the size constraints and real-time execution requirements on a flight computer. For the real-time operation, a multi-rate implementation of the FINDS algorithm has been partitioned to execute on a dual parallel processor configuration: one based on the translational dynamics and the other on the rotational kinematics. The report presents an overview of the FINDS algorithm, the implemented equations, the flow charts for the key subprograms, the input and output files, program variable indexing convention, subprogram descriptions, and the common block descriptions used in the program.

  11. Primary analysis of the ocean color remote sensing data of the HY-1B/COCTS

    NASA Astrophysics Data System (ADS)

    He, Xianqiang; Bai, Yan; Pan, Delu; Zhu, Qiankun; Gong, Fang

    2009-01-01

China successfully launched her second ocean color satellite, HY-1B, on 11 April 2007; it was the successor of the HY-1A satellite launched on 15 May 2002. There were two sensors onboard HY-1B, the Chinese Ocean Color and Temperature Scanner (COCTS) and the Coastal Zone Imager (CZI), of which COCTS was the main sensor. COCTS had not only eight visible and near-infrared bands similar to those of SeaWiFS, but also two additional thermal infrared bands to measure sea surface temperature. COCTS therefore has broad application potential, for example in fishery resource protection and development, coastal monitoring and management, and marine pollution monitoring. In this paper, the main characteristics of COCTS are described first. Then, using the cross-calibration method, the vicarious calibration of COCTS was carried out with synchronous SeaWiFS remote sensing data; the results showed that COCTS had good linear responses in the visible bands, with correlation coefficients greater than 0.98, although the near-infrared bands did not perform as well as the visible bands. Using the vicarious calibration result, the operational atmospheric correction (AC) algorithm for COCTS was developed based on an exact Rayleigh scattering look-up table (LUT), an aerosol scattering LUT, and an atmospheric diffuse transmission LUT generated by the coupled ocean-atmosphere vector radiative transfer numerical model PCOART. The AC algorithm was validated against simulated top-of-atmosphere radiance data, and the results showed that the errors in the water-leaving reflectance retrieved by the AC algorithm were less than 0.0005, which meets the requirement for exact atmospheric correction of ocean color remote sensing. Finally, the AC algorithm was applied to the HY-1B/COCTS remote sensing data, and the corresponding ocean color remote sensing products were generated.

  12. The Goddard Profiling Algorithm (GPROF): Description and Current Applications

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Yang, Song; Stout, John E.; Grecu, Mircea

    2004-01-01

    Atmospheric scientists use different methods for interpreting satellite data. In the early days of satellite meteorology, the analysis of cloud pictures from satellites was primarily subjective. As computer technology improved, satellite pictures could be processed digitally, and mathematical algorithms were developed and applied to the digital images in different wavelength bands to extract information about the atmosphere in an objective way. The kind of mathematical algorithm one applies to satellite data may depend on the complexity of the physical processes that lead to the observed image, and how much information is contained in the satellite images both spatially and at different wavelengths. Imagery from satellite-borne passive microwave radiometers has limited horizontal resolution, and the observed microwave radiances are the result of complex physical processes that are not easily modeled. For this reason, a type of algorithm called a Bayesian estimation method is utilized to interpret passive microwave imagery in an objective, yet computationally efficient manner.
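A minimal sketch of such a Bayesian estimation retrieval, assuming a Gaussian observation-error model with diagonal covariance over a database of simulated brightness temperatures (the operational GPROF database and error model are more elaborate):

```python
import numpy as np

def bayesian_retrieval(y_obs, db_tb, db_profiles, noise_var):
    """Bayesian estimation in the GPROF spirit: the retrieved profile is
    the database average weighted by how well each entry's simulated
    brightness temperatures match the observation.
    db_tb: (N, channels) simulated brightness temperatures
    db_profiles: (N, levels) associated geophysical profiles"""
    # Squared Mahalanobis distance with a diagonal covariance (assumed).
    d2 = ((db_tb - y_obs) ** 2 / noise_var).sum(axis=1)
    w = np.exp(-0.5 * (d2 - d2.min()))   # shift by min for numerical stability
    w /= w.sum()
    return w @ db_profiles
```

Because the estimate is a weighted sum over precomputed database entries, the method is computationally cheap at retrieval time, which is the efficiency argument made in the abstract.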

  13. Open-cycle OTEC system performance analysis. [Claude cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewandowski, A.A.; Olson, D.A.; Johnson, D.H.

    1980-10-01

An algorithm developed to calculate the performance of Claude-cycle ocean thermal energy conversion (OTEC) systems is described. The algorithm treats each component of the system separately and then interfaces them to form a complete system, allowing a component to be changed without changing the rest of the algorithm. Two components that are subject to change are the evaporator and condenser. For this study we developed mathematical models of a channel-flow evaporator and of both a horizontal-jet and a spray direct-contact condenser. The algorithm was then programmed to run on SERI's CDC 7600 computer and used to calculate the effect on performance of deaerating the warm and cold water streams before they enter the evaporator and condenser, respectively. This study indicates that there is no advantage to removing air from these streams compared with removing the air from the condenser.

  14. The algorithm stitching for medical imaging

    NASA Astrophysics Data System (ADS)

    Semenishchev, E.; Marchuk, V.; Voronin, V.; Pismenskova, M.; Tolstova, I.; Svirin, I.

    2016-05-01

In this paper we propose an algorithm for stitching medical images into one. The algorithm is designed to stitch medical x-ray images, microscopic images of biological particles, medical microscopic images, and others. Such images can improve diagnosis accuracy and quality in minimally invasive studies (e.g., laparoscopy, ophthalmology, and others). The proposed algorithm is based on the following steps: searching for and selecting areas with overlapping boundaries; keypoint and feature detection; preliminary stitching of the images and transformation to reduce visible distortion; searching for a single unified border in the overlap area; brightness, contrast, and white-balance conversion; and superimposition into a single image. Experimental results demonstrate the effectiveness of the proposed method for the image stitching task.
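A toy sketch of the overlap-search and superimposition steps, assuming row-aligned images and a purely translational overlap; it stands in for the full keypoint/feature pipeline described above:

```python
import numpy as np

def find_overlap(left, right, max_overlap):
    """Find the overlap width (columns) between the right edge of `left`
    and the left edge of `right` by minimising the mean squared
    difference over candidate widths."""
    best_w, best_err = 1, np.inf
    for w in range(1, max_overlap + 1):
        err = np.mean((left[:, -w:] - right[:, :w]) ** 2)
        if err < best_err:
            best_err, best_w = err, w
    return best_w

def stitch(left, right, max_overlap=20):
    """Stitch two row-aligned images, blending the overlap 50/50
    (a crude stand-in for the unified-border search and blending)."""
    w = find_overlap(left, right, max_overlap)
    blend = 0.5 * (left[:, -w:] + right[:, :w])
    return np.hstack([left[:, :-w], blend, right[:, w:]])
```

Real medical stitching replaces the exhaustive column search with keypoint matching and a geometric transform, but the structure of the pipeline is the same.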

  15. Cirrus Horizontal Heterogeneity Effects on Cloud Optical Properties Retrieved from MODIS VNIR to TIR Channels as a Function of the Spatial Resolution

    NASA Astrophysics Data System (ADS)

    Fauchez, T.; Platnick, S. E.; Sourdeval, O.; Wang, C.; Meyer, K.; Cornet, C.; Szczap, F.

    2017-12-01

Cirrus are an important part of the Earth radiation budget, but an assessment of their role remains highly uncertain. Cirrus optical properties such as Cloud Optical Thickness (COT) and ice crystal effective particle size (Re) are often retrieved with a combination of Visible/Near-InfraRed (VNIR) and ShortWave-InfraRed (SWIR) reflectance channels. Alternatively, Thermal InfraRed (TIR) techniques, such as the Split Window Technique (SWT), have demonstrated better sensitivity to thin cirrus. However, current satellite operational products for both retrieval methods assume that cloudy pixels are horizontally homogeneous (the Plane Parallel and Homogeneous Approximation, PPHA) and independent (the Independent Pixel Approximation, IPA). The impact of these approximations on cirrus retrievals needs to be understood and, as far as possible, corrected. Horizontal heterogeneity effects can be more easily estimated and corrected in the TIR range because they are dominated mainly by the PPHA bias, which primarily depends on the COT subpixel heterogeneity. For solar reflectance channels, in addition to the PPHA bias, the IPA can lead to significant retrieval errors if there is large photon transport between cloudy columns, together with brightening and shadowing effects that are more difficult to quantify. The effects of cirrus horizontal heterogeneity are studied here on COT and Re retrievals obtained using simulated MODIS reflectances at 0.86 and 2.11 μm and radiances at 8.5, 11.0 and 12.0 μm, for spatial resolutions ranging from 50 m to 10 km. For each spatial resolution, simulated TOA reflectances and radiances are combined for cloud optical property retrievals with a research-level optimal estimation retrieval method (OEM). The impact of horizontal heterogeneity on the retrieved products is assessed for different solar geometries and various combinations of the five channels.

  16. Observational filter for limb sounders applied to convective gravity waves

    NASA Astrophysics Data System (ADS)

    Trinh, Quang Thai; Preusse, Peter; Riese, Martin; Kalisch, Silvio

Gravity waves (GWs) play a key role in the dynamics of the middle atmosphere. In the current work, the simulated spectral distribution of GW momentum flux (GWMF) in terms of horizontal and vertical wavenumber is analysed by applying an accurate observational filter that considers the sensitivity and sampling geometry of satellite instruments. For this purpose, GWs are simulated for January 2008 by coupling GROGRAT (gravity wave regional or global ray tracer) with a ray-based spectral parameterization of convective gravity wave drag (CGWD). The atmospheric background is taken from MERRA (Modern-Era Retrospective Analysis for Research and Applications) data. GW spectra of different spatial and temporal scales from the parameterization of CGWD (MF1, MF2, MF3) at 25 km altitude are considered. The observational filter contains the following elements: determination of the wavelength along the line of sight, application of the visibility filter of Preusse et al. (JGR, 2002), determination of the along-track wavelength, and aliasing correction as well as correction of GWMF for larger horizontal wavelengths along track. The sensitivity and sampling geometries of the SABER (Sounding of the Atmosphere using Broadband Emission Radiometry) and HIRDLS (High Resolution Dynamics Limb Sounder) instruments are simulated. Results show that all spectra are shifted toward longer horizontal and vertical wavelengths after applying the observational filter. Spectrum MF1 is most influenced and MF3 is least influenced by this filter. The part of the spectra related to short horizontal wavelengths is cut off and flipped to longer horizontal wavelengths by aliasing. The sampling geometry of HIRDLS reveals a larger part of the spectrum thanks to its shorter sampling-profile distance. The better vertical resolution of the HIRDLS instrument also increases its sensitivity.

  17. Observational filter for limb sounders applied to convective gravity waves

    NASA Astrophysics Data System (ADS)

    Trinh, Thai; Kalisch, Silvio; Preusse, Peter; Riese, Martin

    2014-05-01

Gravity waves (GWs) play a key role in the dynamics of the middle atmosphere. In the current work, the simulated spectral distribution of GW momentum flux (GWMF) in terms of horizontal and vertical wavenumber is analysed by applying an accurate observational filter that considers the sensitivity and sampling geometry of satellite instruments. For this purpose, GWs are simulated for January 2008 by coupling GROGRAT (gravity wave regional or global ray tracer) with a ray-based spectral parameterization of convective gravity wave drag (CGWD). The atmospheric background is taken from MERRA (Modern-Era Retrospective Analysis for Research and Applications) data. GW spectra of different spatial and temporal scales from the parameterization of CGWD (MF1, MF2, MF3) at 25 km altitude are considered. The observational filter contains the following elements: determination of the wavelength along the line of sight, application of the visibility filter of Preusse et al. (JGR, 2002), determination of the along-track wavelength, and aliasing correction as well as correction of GWMF for larger horizontal wavelengths along track. The sensitivity and sampling geometries of the SABER (Sounding of the Atmosphere using Broadband Emission Radiometry) and HIRDLS (High Resolution Dynamics Limb Sounder) instruments are simulated. Results show that all spectra are shifted toward longer horizontal and vertical wavelengths after applying the observational filter. Spectrum MF1 is most influenced and MF3 is least influenced by this filter. The part of the spectra related to short horizontal wavelengths is cut off and flipped to longer horizontal wavelengths by aliasing. The sampling geometry of HIRDLS reveals a larger part of the spectrum thanks to its shorter sampling-profile distance. The better vertical resolution of the HIRDLS instrument also increases its sensitivity.

  18. The Solution of Large Time-Dependent Problems Using Reduced Coordinates.

    DTIC Science & Technology

    1987-06-01

numerical integration schemes for dynamic problems, the algorithm known as Newmark's Method. The behavior of the Newmark scheme, as well as the basic...The horizontal displacements at the mid-height and the bottom of the building are shown in figure 4.13. The solution history illustrated is for a

  19. A New Algorithm for Detecting Cloud Height using OMPS/LP Measurements

    NASA Technical Reports Server (NTRS)

    Chen, Zhong; DeLand, Matthew; Bhartia, Pawan K.

    2016-01-01

    The Ozone Mapping and Profiler Suite Limb Profiler (OMPS/LP) ozone product requires the determination of cloud height for each event to establish the lower boundary of the profile for the retrieval algorithm. We have created a revised cloud detection algorithm for LP measurements that uses the spectral dependence of the vertical gradient in radiance between two wavelengths in the visible and near-IR spectral regions. This approach provides better discrimination between clouds and aerosols than results obtained using a single wavelength. Observed LP cloud height values show good agreement with coincident Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) measurements.
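The two-wavelength gradient idea can be sketched as follows; the thresholds and the spectral-neutrality test are illustrative assumptions, not the operational OMPS/LP criteria:

```python
import numpy as np

def cloud_height(z, rad_vis, rad_nir, grad_thresh=0.5, ratio_tol=0.3):
    """Toy two-wavelength cloud-top detector. A cloud top produces a sharp
    vertical radiance gradient that is nearly identical at visible and
    near-IR wavelengths, whereas aerosol layers are spectrally dependent.
    Returns the highest qualifying level, or None if no level qualifies."""
    g_vis = np.gradient(rad_vis, z)
    g_nir = np.gradient(rad_nir, z)
    strong = (np.abs(g_vis) > grad_thresh) & (np.abs(g_nir) > grad_thresh)
    # Spectral neutrality: the two gradients agree to within ratio_tol.
    neutral = np.abs(g_vis - g_nir) < ratio_tol * np.abs(g_vis)
    hits = np.where(strong & neutral)[0]
    return z[hits.max()] if hits.size else None
```

The neutrality test is what gives the two-wavelength method its advantage over a single-wavelength gradient threshold, which cannot separate clouds from spectrally dependent aerosol layers.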

  20. An evaluation of flight path formats head-up and head-down

    NASA Technical Reports Server (NTRS)

    Sexton, George A.; Moody, Laura E.; Evans, Joanne; Williams, Kenneth E.

    1988-01-01

    Flight path primary flight display formats were incorporated on head-up and head-down electronic displays and integrated into an Advanced Concepts Flight Simulator. Objective and subjective data were collected while ten airline pilots evaluated the formats by flying an approach and landing task under various ceiling, visibility and wind conditions. Deviations from referenced/commanded airspeed, horizontal track, vertical track and touchdown point were smaller using the head-up display (HUD) format than the head-down display (HDD) format, but not significantly smaller. Subjectively, the pilots overwhelmingly preferred (1) flight path formats over attitude formats used in current aircraft, and (2) the head-up presentation over the head-down, primarily because it eliminated the head-down to head-up transition during low visibility landing approaches. This report describes the simulator, the flight displays, the format evaluation, and the results of the objective and subjective data.

  1. Analysis of HD 73045 light curve data

    NASA Astrophysics Data System (ADS)

    Das, Mrinal Kanti; Bhatraju, Naveen Kumar; Joshi, Santosh

    2018-04-01

In this work we analyzed the Kepler light curve data of HD 73045. The raw data were smoothed using standard filters. The power spectrum, obtained using a fast Fourier transform routine, shows the presence of more than one period. To account for any non-stationary behavior, we carried out a wavelet analysis to obtain the wavelet power spectrum. In addition, to identify scale-invariant structure, the data were analyzed using multifractal detrended fluctuation analysis. Further, to characterize the diversity of embedded patterns in the HD 73045 flux time series, we computed various entropy-based complexity measures, e.g. sample entropy, spectral entropy, and permutation entropy. The presence of periodic structure in the time series was further analyzed using the visibility network and horizontal visibility network models of the time series. The degree distributions of the two network models confirm such structures.
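The horizontal visibility network used above can be built directly from the series; this sketch implements the standard horizontal visibility criterion (a quadratic brute-force scan, adequate for short series):

```python
import numpy as np

def horizontal_visibility_graph(x):
    """Horizontal visibility graph of a time series: nodes i and j (i < j)
    are linked iff every intermediate value lies strictly below both
    endpoints, i.e. x[k] < min(x[i], x[j]) for all i < k < j."""
    n = len(x)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

def degree_distribution(edges, n):
    """Node degrees, e.g. to compare against the exact i.i.d. result
    P(k) = (1/3)(2/3)^(k-2) for uncorrelated random series."""
    deg = np.zeros(n, dtype=int)
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg
```

Deviations of the empirical degree distribution from the i.i.d. baseline are what signal periodic or correlated structure in the underlying light curve.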

  2. Measurement of fog and haze extinction characteristics and availability evaluation of free space optical link under the sea surface environment.

    PubMed

    Wu, Xiaojun; Wang, Hongxing; Song, Bo

    2015-02-10

Fog and haze change the extinction characteristics of the atmosphere, so the performance of a free space optical link is highly influenced by severe weather conditions. Considering this influence, a state-of-the-art solution for the observation of fog and haze over the sea surface is presented in this paper. A Mie scattering laser radar, with a wavelength of 532 nm, is used to observe the weather conditions of the sea surface environment. The horizontal extinction coefficients and visibilities are obtained from the observation data and presented in the paper. The changes in the extinction coefficients and visibilities are analyzed based on both short-term (6-day) severe weather data and long-term (6-month) data. Finally, the availability performance of the free space optical communication link under the sea surface environment is evaluated.
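Visibility is conventionally derived from a horizontal extinction coefficient via the Koschmieder relation; a minimal sketch (the conventional 2% contrast threshold gives the familiar factor 3.912):

```python
import math

def visibility_from_extinction(sigma_per_km, contrast_threshold=0.02):
    """Meteorological visibility (km) from a horizontal extinction
    coefficient (km^-1) via the Koschmieder relation
    V = ln(1/eps) / sigma; with eps = 0.02, V ≈ 3.912 / sigma."""
    return math.log(1.0 / contrast_threshold) / sigma_per_km
```

Whether this is the exact conversion used in the cited lidar study is an assumption; some applications use a 5% threshold, which replaces 3.912 with about 3.0.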

  3. Acoustic wave propagation and intensity fluctuations in shallow water 2006 experiment

    NASA Astrophysics Data System (ADS)

    Luo, Jing

Fluctuations of low-frequency sound propagation in the presence of nonlinear internal waves during the Shallow Water 2006 experiment are analyzed. Acoustic waves and environmental data, including on-board ship radar images, were collected simultaneously before, during, and after a strong internal solitary wave packet passed through a source-receiver acoustic track. Analysis of the acoustic wave signals shows temporal intensity fluctuations. These fluctuations are affected by the passing internal wave and agree well with the theory of horizontal refraction of acoustic wave propagation in shallow water. The intensity focusing and defocusing that occur in a fixed source-receiver configuration while the internal wave packet approaches and passes the acoustic track are addressed in this thesis. Acoustic ray-mode theory is used to explain the modal evolution of broadband acoustic waves propagating in a shallow water waveguide in the presence of internal waves. Acoustic modal behavior is obtained from the data through modal decomposition algorithms applied to data collected by a vertical line array of hydrophones. Strong interference patterns are observed in the acoustic data, whose main cause is identified as horizontal refraction, referred to as the horizontal Lloyd mirror effect. To analyze this interference pattern, a combined parabolic equation model and vertical-mode, horizontal-ray model are utilized. A semi-analytic formula for estimating the horizontal Lloyd mirror effect is developed.

  4. A study of natural circulation in the evaporator of a horizontal-tube heat recovery steam generator

    NASA Astrophysics Data System (ADS)

    Roslyakov, P. V.; Pleshanov, K. A.; Sterkhov, K. V.

    2014-07-01

Results are presented from investigations of stable natural circulation in an intricate circulation circuit with a horizontal layout of the evaporating-surface tubes having a negative useful head. The possibility of shifting from multiple forced circulation, organized by means of a circulation pump, to natural circulation in a vertical heat recovery steam generator is estimated. Criteria for characterizing the performance reliability and efficiency of a horizontal evaporator with negative useful head are proposed. The influence of various design solutions on circulation robustness is considered. With due regard for the optimal parameters, the most efficient and least costly methods are proposed for achieving more stable circulation in a vertical heat recovery steam generator when a shift is made from multiple forced to natural circulation. A procedure for calculating the circulation parameters and an algorithm for checking evaporator performance reliability are developed, and recommendations are given for the design of the heat recovery steam generator, the nonheated parts of the natural circulation circuit, and the evaporating surface.

  5. Experimental validation and model development for thermal transmittances of porous window screens and horizontal louvred blind systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Robert; Goudey, Howdy; Curcija, D. Charlie

Virtually every home in the US has some form of shades, blinds, drapes, or other window attachment, but few have been designed for energy savings. In order to provide a common basis of comparison for thermal performance, it is important to have validated simulation tools. This study presents a review and validation of the ISO 15099 centre-of-glass thermal transmittance correlations for naturally ventilated cavities through measurement and detailed simulation. The focus is on the impact of room-side ventilated cavities, such as those found with solar screens and horizontal louvred blinds. The thermal transmittance of these systems is measured experimentally, simulated using computational fluid dynamics analysis, and simulated utilizing simplified correlations from ISO 15099. Finally, correlation coefficients are proposed for the ISO 15099 algorithm that reduce the mean error between measured and simulated heat flux from 16% to 3.5% for typical solar screens and from 13% to 1% for horizontal blinds.

  6. Experimental validation and model development for thermal transmittances of porous window screens and horizontal louvred blind systems

    DOE PAGES

    Hart, Robert; Goudey, Howdy; Curcija, D. Charlie

    2017-05-16

Virtually every home in the US has some form of shades, blinds, drapes, or other window attachment, but few have been designed for energy savings. In order to provide a common basis of comparison for thermal performance, it is important to have validated simulation tools. This study presents a review and validation of the ISO 15099 centre-of-glass thermal transmittance correlations for naturally ventilated cavities through measurement and detailed simulation. The focus is on the impact of room-side ventilated cavities, such as those found with solar screens and horizontal louvred blinds. The thermal transmittance of these systems is measured experimentally, simulated using computational fluid dynamics analysis, and simulated utilizing simplified correlations from ISO 15099. Finally, correlation coefficients are proposed for the ISO 15099 algorithm that reduce the mean error between measured and simulated heat flux from 16% to 3.5% for typical solar screens and from 13% to 1% for horizontal blinds.
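The coefficient re-fitting step can be sketched generically as follows; decomposing the modelled heat flux into a fixed baseline plus one adjustable correlation term is a simplification of the ISO 15099 formulation, introduced here only for illustration:

```python
import numpy as np

def fit_correlation_coefficient(model_term, baseline, q_measured):
    """Least-squares fit of a single correlation coefficient c so that
    q_model = baseline + c * model_term best matches the measured heat
    flux; returns the fitted c and the resulting mean relative error."""
    c = np.dot(model_term, q_measured - baseline) / np.dot(model_term, model_term)
    q_model = baseline + c * model_term
    mean_rel_err = np.mean(np.abs(q_model - q_measured) / np.abs(q_measured))
    return c, mean_rel_err
```

In the actual study several ISO 15099 coefficients are adjusted jointly against both measurement and CFD results; this one-coefficient version only shows the shape of the optimisation.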

  7. A new active variable stiffness suspension system using a nonlinear energy sink-based controller

    NASA Astrophysics Data System (ADS)

    Anubi, Olugbenga Moses; Crane, Carl D.

    2013-10-01

    This paper presents the active case of a variable stiffness suspension system. The central concept is based on a recently designed variable stiffness mechanism which consists of a horizontal control strut and a vertical strut. The horizontal strut is used to vary the load transfer ratio by actively controlling the location of the point of attachment of the vertical strut to the car body. The control algorithm, effected by a hydraulic actuator, uses the concept of a nonlinear energy sink (NES) to effectively transfer the vibrational energy in the sprung mass to a control mass, thereby reducing the transfer of energy from road disturbance to the car body at a relatively lower cost than a traditional active suspension using the skyhook concept. The analyses and simulation results show that better performance can be achieved by subjecting the point of attachment of the suspension system to the chassis to the influence of a horizontal NES system.

  8. Modeling the Volcanic Source at Long Valley, CA, Using a Genetic Algorithm Technique

    NASA Technical Reports Server (NTRS)

    Tiampo, Kristy F.

    1999-01-01

    In this project, we attempted to model the deformation pattern due to the magmatic source at Long Valley caldera using a real-value coded genetic algorithm (GA) inversion similar to that found in Michalewicz, 1992. The project has been both successful and rewarding. The genetic algorithm, coded in the C programming language, performs stable inversions over repeated trials, with varying initial and boundary conditions. The original model used a GA in which the geophysical information was coded into the fitness function through the computation of surface displacements for a Mogi point source in an elastic half-space. The program was designed to invert for a spherical magmatic source - its depth, horizontal location and volume - using the known surface deformations. It also included the capability of inverting for multiple sources.
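    The forward problem inside the GA fitness function described in this record has a standard closed form: the Mogi point source in an elastic half-space. The sketch below is a minimal illustration, not the project's code; the function names, the choice of Poisson's ratio nu = 0.25, and the sum-of-squares fitness are assumptions.

```python
import numpy as np

def mogi_displacements(x, y, x0, y0, depth, dV, nu=0.25):
    """Surface displacement of a Mogi point source in an elastic half-space.

    Standard closed form: u = (1 - nu) * dV / pi * (dx, dy, depth) / R^3,
    with R the source-to-station distance. nu = 0.25 is an assumption."""
    dx, dy = x - x0, y - y0
    R3 = (dx ** 2 + dy ** 2 + depth ** 2) ** 1.5
    C = (1.0 - nu) * dV / np.pi
    return C * dx / R3, C * dy / R3, C * depth / R3  # ux, uy, uz

def fitness(params, x, y, obs):
    """GA fitness: negative sum of squared residuals between modeled and
    observed (ux, uy, uz) arrays, so that larger is better."""
    ux, uy, uz = mogi_displacements(x, y, *params)   # params = (x0, y0, depth, dV)
    return -float(((ux - obs[0]) ** 2 + (uy - obs[1]) ** 2
                   + (uz - obs[2]) ** 2).sum())
```

    A real-coded GA would then evolve candidate `(x0, y0, depth, dV)` vectors toward the maximum of this fitness; extending to multiple sources amounts to summing several such point-source terms.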

  9. Contour classification in thermographic images for detection of breast cancer

    NASA Astrophysics Data System (ADS)

    Okuniewski, Rafał; Nowak, Robert M.; Cichosz, Paweł; Jagodziński, Dariusz; Matysiewicz, Mateusz; Neumann, Łukasz; Oleszkiewicz, Witold

    2016-09-01

    Thermographic images of the breast taken by the Braster device are uploaded into a web application, which uses different classification algorithms to automatically decide whether a patient should be examined more thoroughly. This article presents an approach to the task of classifying the contours visible in thermographic breast images taken by the Braster device in order to decide on the existence of cancerous tumors in the breast. It presents the results of the research conducted on the different classification algorithms.

  10. Generating soft shadows with a depth buffer algorithm

    NASA Technical Reports Server (NTRS)

    Brotman, L. S.; Badler, N. I.

    1984-01-01

    Computer-synthesized shadows have typically appeared with a sharp edge when cast onto a surface. Producing more realistic, soft shadows, however, incurs significant costs. The present investigation is concerned with a pragmatic approach that combines an existing shadowing method with a popular visible-surface rendering technique, called a 'depth buffer', to generate soft shadows resulting from light sources of finite extent. The method represents an extension of Crow's (1977) shadow volume algorithm.

  11. Cloud field classification based on textural features

    NASA Technical Reports Server (NTRS)

    Sengupta, Sailes Kumar

    1989-01-01

    An essential component in global climate research is accurate cloud cover and type determination. Of the two approaches to texture-based classification (statistical and structural), only the former is effective in the classification of natural scenes such as land, ocean, and atmosphere. In the statistical approach that was adopted, parameters characterizing the stochastic properties of the spatial distribution of grey levels in an image are estimated and then used as features for cloud classification. Two types of textural measures were used. One is based on the distribution of the grey level difference vector (GLDV), and the other on a set of textural features derived from the MaxMin cooccurrence matrix (MMCM). The GLDV method looks at the difference D of grey levels at pixels separated by a horizontal distance d and computes several statistics based on this distribution, which are then used as features in subsequent classification. The MaxMin textural features, on the other hand, are based on the MMCM, a matrix whose (I,J)th entry gives the relative frequency of occurrences of the grey level pair (I,J) that are consecutive and thresholded local extremes separated by a given pixel distance d. Textural measures are then computed from this matrix in much the same manner as in texture computation using the grey level cooccurrence matrix. The database consists of 37 cloud field scenes from LANDSAT imagery using a near-IR/visible channel. The classification algorithm used is the well-known Stepwise Discriminant Analysis. The overall accuracy was estimated by the percentage of correct classifications in each case. It turns out that both types of classifiers, at their best combination of features, and at any given spatial resolution, give approximately the same classification accuracy.
    A neural network classifier with a feed-forward architecture and a back-propagation training algorithm is used to increase the classification accuracy, using these two classes of features. Preliminary results based on the GLDV textural features alone look promising.
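    The GLDV computation described in this record can be sketched as follows. This is a minimal illustration assuming a common set of GLDV statistics (mean, contrast, entropy, angular second moment); the paper does not list its exact feature set, and the function name is hypothetical.

```python
import numpy as np

def gldv_features(img, d=1):
    """GLDV features for a horizontal pixel offset d (statistic names
    follow common GLDV usage, not necessarily the paper's exact set)."""
    img = np.asarray(img, dtype=int)
    # grey level difference vector: |g(x + d, y) - g(x, y)|
    diff = np.abs(img[:, d:] - img[:, :-d]).ravel()
    p = np.bincount(diff).astype(float)
    p /= p.sum()                        # empirical distribution of differences
    k = np.arange(p.size)
    nz = p[p > 0]
    return {
        "mean": float((k * p).sum()),
        "contrast": float((k ** 2 * p).sum()),
        "entropy": float(-(nz * np.log2(nz)).sum()),
        "asm": float((p ** 2).sum()),   # angular second moment
    }
```

    Features computed at several offsets d would then be fed to the discriminant analysis.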

  12. A Generic 1D Forward Modeling and Inversion Algorithm for TEM Sounding with an Arbitrary Horizontal Loop

    NASA Astrophysics Data System (ADS)

    Li, Zhanhui; Huang, Qinghua; Xie, Xingbing; Tang, Xingong; Chang, Liao

    2016-08-01

    We present a generic 1D forward modeling and inversion algorithm for transient electromagnetic (TEM) data with an arbitrary horizontal transmitting loop and receivers at any depth in a layered earth. Both the Hankel and sine transforms required in the forward algorithm are calculated using the filter method. The adjoint-equation method is used to derive the formulation of data sensitivity at any depth in non-permeable media. The inversion algorithm based on this forward modeling algorithm and sensitivity formulation is developed using the Gauss-Newton iteration method combined with Tikhonov regularization. We propose a new data-weighting method that minimizes the initial model dependence and enhances convergence stability. On a laptop with an i7-5700HQ CPU at 3.5 GHz, an inversion iteration for a 200-layer input model with a single receiver takes only 0.34 s, increasing to only 0.53 s for data from four receivers at the same depth. For four receivers at different depths, the iteration runtime increases to 1.3 s. Modeling the data with an irregular loop and an equal-area square loop indicates that the effect of the loop geometry is significant at early times and vanishes gradually as the TEM field diffuses. For a stratified earth, inverting data from more than one receiver helps reduce noise and yields a more credible layered-earth model. However, for a resistive layer shielded below a conductive layer, increasing the number of receivers on the ground does not significantly improve recovery of the resistive layer. Even with down-hole TEM sounding, the shielded resistive layer cannot be recovered if all receivers are above it. However, our modeling demonstrates remarkable improvement in detecting the resistive layer with receivers in or under this layer.
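    The Gauss-Newton iteration with Tikhonov regularization named in this abstract can be sketched generically. The toy exponential-decay forward model below stands in for the TEM forward problem, which is far more involved; the zeroth-order (identity) regularization and all names are assumptions, not the paper's implementation.

```python
import numpy as np

def gauss_newton_tikhonov(forward, jacobian, d_obs, m0, lam=1e-6, n_iter=20):
    """Gauss-Newton with zeroth-order Tikhonov damping: at each iteration
    solve (J^T J + lam I) dm = J^T r - lam m for the model update dm."""
    m = np.array(m0, dtype=float)
    for _ in range(n_iter):
        r = d_obs - forward(m)                      # data residual
        J = jacobian(m)
        dm = np.linalg.solve(J.T @ J + lam * np.eye(m.size),
                             J.T @ r - lam * m)
        m += dm
    return m

# toy forward model standing in for the layered-earth TEM response:
# d(t) = m1 * exp(-m0 * t)
t = np.linspace(0.0, 1.0, 50)
forward = lambda m: m[1] * np.exp(-m[0] * t)
jacobian = lambda m: np.column_stack([-m[1] * t * np.exp(-m[0] * t),
                                      np.exp(-m[0] * t)])

m_true = np.array([2.0, 1.5])
m_est = gauss_newton_tikhonov(forward, jacobian, forward(m_true), [1.0, 1.0])
```

    With noise-free data and a tiny damping factor the iteration recovers the true parameters; in practice `lam` trades data fit against model-norm penalty.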

  13. Proceedings of the Ship Control Systems Symposium (6th) Held in Ottawa, Canada on 26-30 October 1981. Volume 2.

    DTIC Science & Technology

    1981-10-30

    ...cumulative effects of wooding on visibility, as shown in Fig. 3b, it is clear that no more than five windows should lie in the same plane. Two principles... FIG. 3a. Percent of Wooding in Forward 180° Field of View... FIG. 3b. Effect of Wooding on Horizontal Field of View with Increasing Distance from Window... voice communication among bridge personnel at expected levels

  14. Wind erosion from a sagebrush steppe burned by wildfire: measurements of PM10 and total horizontal sediment flux

    USGS Publications Warehouse

    Wagenbrenner, Natalie S.; Germino, Matthew J.; Lamb, Brian K.; Robichaud, Peter R.; Foltz, Randy B.

    2013-01-01

    above the soil surface, had a maximum PM10 vertical flux of 100 mg m-2 s-1, and generated a large dust plume that was visible in satellite imagery. The peak PM10 concentration measured on-site at a height of 2 m in the downwind portion of the burned area was 690 mg m-3. Our results indicate that wildfire can convert a relatively stable landscape into one that is a major dust source.

  15. Characterization of Combustion Dynamics, Detection, and Prevention of an Unstable Combustion State Based on a Complex-Network Theory

    NASA Astrophysics Data System (ADS)

    Gotoda, Hiroshi; Kinugawa, Hikaru; Tsujimoto, Ryosuke; Domen, Shohei; Okuno, Yuta

    2017-04-01

    Complex-network theory has attracted considerable attention for nearly a decade, and it has broadened our understanding of nonlinear dynamics in complex systems across a wide range of fields, including applied physics and mechanical, chemical, and electrical engineering. We conduct an experimental study using a pragmatic online detection methodology based on complex-network theory to prevent a limiting unstable state, such as blowout, in a confined turbulent combustion system. This study introduces a modified version of the natural visibility algorithm based on the idea of a visibility limit to serve as a pragmatic online detector. The average degree of the modified natural visibility graph allows us to detect the onset of blowout, enabling online prevention.
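    The natural visibility criterion underlying the detector described above can be sketched as follows. This is a plain natural visibility graph with a simple window cutoff standing in for the paper's visibility limit, whose exact definition is not given in this abstract; the function name and the `limit` parameter are illustrative.

```python
def natural_visibility_degrees(y, limit=None):
    """Average degree of the natural visibility graph of series y.

    `limit` optionally restricts links to |b - a| <= limit, a simple
    stand-in for a visibility-limit style modification."""
    n = len(y)
    deg = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            if limit is not None and b - a > limit:
                break
            # natural visibility: every sample strictly between a and b
            # lies below the straight line joining (a, y[a]) and (b, y[b])
            if all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                deg[a] += 1
                deg[b] += 1
    return sum(deg) / n
```

    A change in this average degree over sliding windows of the pressure signal is the kind of statistic the paper monitors for blowout onset.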

  16. Multi-Sensor Person Following in Low-Visibility Scenarios

    PubMed Central

    Sales, Jorge; Marín, Raúl; Cervera, Enric; Rodríguez, Sergio; Pérez, Javier

    2010-01-01

    Person following with mobile robots has traditionally been an important research topic. It has been solved, in most cases, by the use of machine vision or laser rangefinders. In some special circumstances, such as a smoky environment, the use of optical sensors is not a good solution. This paper proposes and compares alternative sensors and methods to perform a person following in low visibility conditions, such as smoky environments in firefighting scenarios. The use of laser rangefinder and sonar sensors is proposed in combination with a vision system that can determine the amount of smoke in the environment. The smoke detection algorithm provides the robot with the ability to use a different combination of sensors to perform robot navigation and person following depending on the visibility in the environment. PMID:22163506

  17. Multi-sensor person following in low-visibility scenarios.

    PubMed

    Sales, Jorge; Marín, Raúl; Cervera, Enric; Rodríguez, Sergio; Pérez, Javier

    2010-01-01

    Person following with mobile robots has traditionally been an important research topic. It has been solved, in most cases, by the use of machine vision or laser rangefinders. In some special circumstances, such as a smoky environment, the use of optical sensors is not a good solution. This paper proposes and compares alternative sensors and methods to perform a person following in low visibility conditions, such as smoky environments in firefighting scenarios. The use of laser rangefinder and sonar sensors is proposed in combination with a vision system that can determine the amount of smoke in the environment. The smoke detection algorithm provides the robot with the ability to use a different combination of sensors to perform robot navigation and person following depending on the visibility in the environment.

  18. Active contour based segmentation of resected livers in CT images

    NASA Astrophysics Data System (ADS)

    Oelmann, Simon; Oyarzun Laura, Cristina; Drechsler, Klaus; Wesarg, Stefan

    2015-03-01

    The majority of state-of-the-art segmentation algorithms are able to give proper results in healthy organs but not in pathological ones. However, many clinical applications require an accurate segmentation of pathological organs. The determination of target boundaries for radiotherapy and liver volumetry calculations are examples of this. Volumetry measurements are of special interest after tumor resection for follow-up of liver regrowth. The segmentation of resected livers presents additional challenges that were not addressed by state-of-the-art algorithms. This paper presents a snakes-based algorithm specially developed for the segmentation of resected livers. The algorithm is enhanced with a novel dynamic smoothing technique that allows the active contour to propagate at different speeds depending on the intensities visible in its neighborhood. The algorithm is evaluated on 6 clinical CT images as well as 18 artificial datasets generated from additional clinical CT images.

  19. The development of efficient numerical time-domain modeling methods for geophysical wave propagation

    NASA Astrophysics Data System (ADS)

    Zhu, Lieyuan

    This Ph.D. dissertation focuses on the numerical simulation of geophysical wave propagation in the time domain, including elastic waves in solid media, acoustic waves in fluid media, and electromagnetic waves in dielectric media. This thesis shows that a linear system model can accurately describe the physical processes of geophysical wave propagation and can be used as a sound basis for modeling geophysical wave propagation phenomena. The generalized stability condition for numerical modeling of wave propagation is therefore discussed in the context of linear system theory. The efficiency of a series of different time-domain numerical algorithms for modeling geophysical wave propagation is discussed and compared. These algorithms include the finite-difference time-domain method, the pseudospectral time-domain method, and the alternating direction implicit (ADI) finite-difference time-domain method. The advantages and disadvantages of these numerical methods are discussed, and the specific stability condition for each modeling scheme is carefully derived in the context of linear system theory. Based on the review and discussion of these existing approaches, the split-step, ADI pseudospectral time-domain (SS-ADI-PSTD) method is developed and tested for several cases. Moreover, the state-of-the-art stretched-coordinate perfectly matched layer (SCPML) has also been implemented in the SS-ADI-PSTD algorithm as the absorbing boundary condition for truncating the computational domain and absorbing the artificial reflection from the domain boundaries. After algorithmic development, a few case studies serve as real-world examples to verify the capacities of the numerical algorithms and to understand the capabilities and limitations of geophysical methods for detection of subsurface contamination. The first case is a study using ground penetrating radar (GPR) amplitude variation with offset (AVO) for subsurface non-aqueous-phase liquid (NAPL) contamination.
    The numerical AVO study reveals that the normalized residual polarization (NRP) variation with offset does not respond to subsurface NAPL existence when the offset is close to or larger than its critical value (corresponding to the critical incident angle), because the air and head waves dominate the recorded wave field and severely interfere with reflected waves in the TEz wave field. Thus it can be concluded that the NRP AVO/GPR method is invalid when the source-receiver offset is close to or greater than its critical value, due to incomplete and severely distorted reflection information. In other words, AVO is not a promising technique for detection of subsurface NAPL, as claimed by some researchers. In addition, the robustness of the newly developed numerical algorithms is verified by the AVO study for randomly arranged layered media. This case study also demonstrates again that full-wave numerical modeling algorithms are superior to the ray-tracing method. The second case study focuses on the effect of a near-surface fault on vertically incident P- and S- plane waves. The modeling results show that both the P-wave and S-wave vertical incidence cases are viable fault indicators. For the plane S-wave vertical incidence case, the horizontal location of the upper tip of the fault (on the footwall side) can be identified without much effort, because all the parameters recorded on the surface, including the maximum velocities, the maximum accelerations, and even their H/V ratios, show dramatic changes when crossing the upper tip of the fault. The centers of the transition zones of all the parameter curves are almost directly above the fault tip (roughly the horizontal center of the model).
Compared with the case of the vertically incident P-wave source, it has been found that the S-wave vertical source is a better indicator for fault location, because the horizontal location of the tip of that fault cannot be clearly identified with the ratio of the horizontal to vertical velocity for the P-wave incident case.

  20. Model space exploration for determining landslide source history from long period seismic data

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Mangeney, A.; Stutzmann, E.; Capdeville, Y.; Moretti, L.; Calder, E. S.; Smith, P. J.; Cole, P.; Le Friant, A.

    2012-12-01

    The seismic signals generated by high-magnitude landslide events can be recorded at remote stations, providing access to the landslide process. During the "Boxing Day" eruption at Montserrat in 1997, the long-period seismic signals generated by the debris avalanche were recorded by two stations at distances of 450 km and 1261 km. We investigate the landslide process considering that the landslide source can be described by single forces. The period band 25-50 s is selected, in which the landslide signal is clearly visible at the two stations. We first use the transverse component of the closest station to determine the horizontal forces. We model the seismogram by normal-mode summation and investigate the model space. Two horizontal forces are found that best fit the data. These two horizontal forces have similar amplitude but opposite direction, and they are separated in time by 70 s. The radiation pattern of the transverse component does not allow us to determine the exact azimuth of these forces. We then model the vertical component of the seismograms, which allows retrieval of both the vertical and horizontal forces. Using the parameters previously determined (amplitude ratio and time shift of the two horizontal forces), we further investigate the model space and show that a single vertical force together with the two horizontal forces fits the data. The complete source time function can be described as follows: a horizontal force opposite to the direction of the landslide flow is followed 40 s later by a vertical downward force and 30 more seconds later by a horizontal force in the direction of the flow. The volume of the landslide estimated from the force magnitude is compatible with the volume determined by field survey. Directly inverting the seismograms in the period band 25-50 s retrieves a source time function that is consistent with the three forces determined previously.
    The source time function in this narrow period band alone does not easily allow recovery of the corresponding single forces. This method can be used to determine the source parameters using only two distant stations. It has also been successfully tested on other landslides, such as the Mount St. Helens (1980) and Mount Steller (2005) events, which are recorded by more broadband stations.

  1. Near infrared and visible face recognition based on decision fusion of LBP and DCT features

    NASA Astrophysics Data System (ADS)

    Xie, Zhihua; Zhang, Shuai; Liu, Guodong; Xiong, Jinquan

    2018-03-01

    Visible face recognition systems, being vulnerable to illumination, expression, and pose, cannot achieve robust performance in unconstrained situations. Meanwhile, near-infrared face images, being light-independent, can avoid or limit the drawbacks of face recognition in visible light, but their main challenges are low resolution and signal-to-noise ratio (SNR). Therefore, near-infrared and visible fusion face recognition has become an important direction in the field of unconstrained face recognition research. In order to extract the discriminative complementary features between near-infrared and visible images, in this paper we propose a novel near-infrared and visible face fusion recognition algorithm based on DCT and LBP features. Firstly, the effective features in the near-infrared face image are extracted from the low-frequency part of the DCT coefficients and the partition histograms of the LBP operator. Secondly, the LBP features of the visible-light face image are extracted to compensate for the missing detail features of the near-infrared face image. Then, the LBP features of the visible-light face image and the DCT and LBP features of the near-infrared face image are sent to separate classifiers for labeling. Finally, a decision-level fusion strategy is used to obtain the final recognition result. The visible and near-infrared face recognition is tested on the HITSZ Lab2 visible and near-infrared face database. The experimental results show that the proposed method extracts the complementary features of near-infrared and visible face images and improves the robustness of unconstrained face recognition. Especially for the circumstance of small training samples, the recognition rate of the proposed method reaches 96.13%, a significant improvement over the 92.75% of the method based on statistical feature fusion.
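    The LBP operator mentioned in this record assigns each pixel a code from comparisons with its neighbours; partition histograms are then histograms of these codes over image blocks. A minimal 8-neighbour sketch follows; the radius, sampling pattern, thresholding convention, and block count are assumptions, not the paper's exact settings.

```python
import numpy as np

def lbp8(img):
    """8-neighbour LBP codes for interior pixels: each neighbour with a
    value >= the centre sets one bit, ordered clockwise from top-left."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    centre = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros(centre.shape, dtype=int)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (neighbour >= centre).astype(int) << bit
    return code

def partition_histograms(codes, blocks=2):
    """Concatenated per-block LBP histograms (the 'partition histograms')."""
    hists = []
    for rows in np.array_split(codes, blocks, axis=0):
        for cell in np.array_split(rows, blocks, axis=1):
            hists.append(np.bincount(cell.ravel(), minlength=256))
    return np.concatenate(hists)
```

    The concatenated histogram vector is what would be handed to a classifier alongside the DCT features.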

  2. New methods, algorithms, and software for rapid mapping of tree positions in coordinate forest plots

    Treesearch

    A. Dan Wilson

    2000-01-01

    The theories and methodologies for two new tree mapping methods, the Sequential-target method and the Plot-origin radial method, are described. The methods accommodate the use of any conventional distance measuring device and compass to collect horizontal distance and azimuth data between source or reference positions (origins) and target trees. Conversion equations...

  3. A genetic algorithm-based optimization model for pool boiling heat transfer on horizontal rod heaters at isolated bubble regime

    NASA Astrophysics Data System (ADS)

    Alavi Fazel, S. Ali

    2017-09-01

    A new optimized model that can predict heat transfer in nucleate boiling at the isolated bubble regime is proposed for pool boiling on a horizontal rod heater. This model is developed based on the results of direct observations of the physical boiling phenomena. Boiling heat flux, wall temperature, bubble departure diameter, bubble generation frequency, and bubble nucleation site density have been experimentally measured. Water and ethanol were used as two different boiling fluids. Heating surfaces were made of several metals with various degrees of roughness. The model considers various mechanisms such as latent heat transfer due to micro-layer evaporation, transient conduction due to thermal boundary layer reformation, natural convection, heat transfer due to sliding bubbles, and bubble superheating. The fractional contributions of the individual heat transfer mechanisms have been calculated by a genetic algorithm. The results show that at wall temperature differences of more than about 3 K, bubble-sliding transient conduction, non-sliding transient conduction, micro-layer evaporation, natural convection, radial forced convection, and bubble superheating have, respectively, the highest to lowest fractional contributions. The performance of the new optimized model has been verified by comparison with existing experimental data.

  4. Tunable plasmon-induced transparency in plasmonic metamaterial composed of three identical rings

    NASA Astrophysics Data System (ADS)

    Tian, Yuchen; Ding, Pei; Fan, Chunzhen

    2017-10-01

    We numerically investigated the plasmon-induced transparency (PIT) effect in a three-dimensional plasmonic metamaterial composed of three identical rings. It is illustrated that the PIT effect appears as a result of the destructive interference between the electric dipole and the quadrupole resonance mode. By tuning the gap distance, radius, or rotation angle of the metamaterial, the required transmission spectra with a narrow, sharp transparency peak can be realized. In particular, it is found that an on-to-off amplitude modulation of the PIT transparency window can be achieved by moving or rotating the horizontal ring. Moving the horizontal ring shifts the two dips in the transmission spectra toward higher and lower frequencies, respectively; that is, the transmission peak becomes wider. With rotation of the horizontal ring, both the width and position of the transmission peak remain invariant. Our designed structure achieves a maximum group index of 352 in the visible frequency range, corresponding to a significant slow-light effect. Moreover, the PIT effect is explained with the classical two-oscillator theory, which is in good agreement with the numerical results. This indicates that our proposed structure and theoretical analysis may open up avenues for the tunable control of light in highly integrated optical circuits.

  5. MAX-DOAS measurements of NO2 column densities in Vienna

    NASA Astrophysics Data System (ADS)

    Schreier, Stefan; Weihs, Philipp; Peters, Enno; Richter, Andreas; Ostendorf, Mareike; Schönhardt, Anja; Burrows, John P.; Schmalwieser, Alois

    2017-04-01

    In the VINDOBONA (VIenna horizontal aNd vertical Distribution OBservations Of Nitrogen dioxide and Aerosols) project, two Multi AXis Differential Optical Absorption Spectroscopy (MAX-DOAS) systems will be set up at two different locations and altitudes in Vienna, Austria. After comparison measurements in Bremen, Germany, and Cabauw, The Netherlands, the first of the two MAX-DOAS instruments was set up at the University of Veterinary Medicine in the northeastern part of Vienna in December 2016. The instrument performs spectral measurements of visible scattered sunlight at defined horizontal and vertical viewing directions. From these measurements, column densities of NO2 and aerosols are derived by applying the DOAS analysis. First preliminary results are presented. The second MAX-DOAS instrument will be set up in April/May 2017 at the University of Natural Resources and Life Sciences in the northwestern part of Vienna. Once these two instruments are measuring simultaneously, small campaigns including car DOAS zenith-sky and tower DOAS off-axis measurements are planned. The main emphasis of this project will be on the installation and operation of two MAX-DOAS instruments, the improvement of tropospheric NO2 and aerosol retrieval, and the characterization of the horizontal, vertical, and temporal variations of tropospheric NO2 and aerosols in Vienna, Austria.

  6. Reconciling biases and uncertainties of AIRS and MODIS ice cloud properties

    NASA Astrophysics Data System (ADS)

    Kahn, B. H.; Gettelman, A.

    2015-12-01

    We will discuss comparisons of collocated Atmospheric Infrared Sounder (AIRS) and Moderate Resolution Imaging Spectroradiometer (MODIS) ice cloud optical thickness (COT), effective radius (CER), and cloud thermodynamic phase retrievals. The ice cloud comparisons are stratified by retrieval uncertainty estimates, horizontal inhomogeneity at the pixel scale, vertical cloud structure, and other key parameters. Although an estimated 27% of all AIRS pixels globally contain ice cloud, only 7% of them are spatially uniform ice according to MODIS. We find that the correlations of COT and CER between the two instruments are strong functions of horizontal cloud heterogeneity and vertical cloud structure. The best correlations are found in single-layer, horizontally homogeneous clouds over the low-latitude tropical oceans, with biases and scatter that increase with scene complexity. While the COT comparisons are unbiased in homogeneous ice clouds, a bias of 5-10 microns remains in CER within the most homogeneous scenes identified. This behavior is entirely consistent with known sensitivity differences between the visible and infrared bands. We will use AIRS and MODIS ice cloud properties to evaluate ice hydrometeor output from climate models such as CAM5, with comparisons sorted into different dynamical regimes. The results of the regime-dependent comparisons will be described, and implications for model evaluation and future satellite observational needs will be discussed.

  7. Tomographic inversion of satellite photometry

    NASA Technical Reports Server (NTRS)

    Solomon, S. C.; Hays, P. B.; Abreu, V. J.

    1984-01-01

    An inversion algorithm capable of reconstructing the volume emission rate of thermospheric airglow features from satellite photometry has been developed. The accuracy and resolution of this technique are investigated using simulated data, and the inversions of several sets of observations taken by the Visible Airglow Experiment are presented.

  8. Task-based quantification of measurement utility for ex vivo multi-spectral Mueller polarimetry of the uterine cervix

    NASA Astrophysics Data System (ADS)

    Kupinski, Meredith; Rehbinder, Jean; Haddad, Huda; Deby, Stanislas; Vizet, Jérémy; Teig, Benjamin; Nazac, André; Pierangelo, Angelo; Moreau, François; Novikova, Tatiana

    2017-07-01

    Significant contrast in visible wavelength Mueller matrix images for healthy and pre-cancerous regions of excised cervical tissue is shown. A novel classification algorithm is used to compute a test statistic from a small patient population.

  9. Energy-efficient constellations design and fast decoding for space-collaborative MIMO visible light communications

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Jun; Liang, Wang-Feng; Wang, Chao; Wang, Wen-Ya

    2017-01-01

    In this paper, space-collaborative constellations (SCCs) for indoor multiple-input multiple-output (MIMO) visible light communication (VLC) systems are considered. Compared with traditional VLC MIMO techniques, such as repetition coding (RC), spatial modulation (SM), and spatial multiplexing (SMP), SCC achieves the minimum average optical power for a fixed minimum Euclidean distance. We present a unified SCC structure for 2×2 MIMO VLC systems and extend it to larger MIMO VLC systems with more transceivers. Specifically, for 2×2 MIMO VLC, a fast decoding algorithm is developed with decoding complexity almost linear in the square root of the cardinality of the SCC, and expressions for the symbol error rate of SCC are presented. In addition, bit mappings similar to Gray mapping are proposed for SCC. Computer simulations are performed to verify the fast decoding algorithm and the performance of SCC, and the results demonstrate that the performance of SCC is better than that of RC, SM, and SMP for indoor channels in general.

  10. Discrete Walsh Hadamard transform based visible watermarking technique for digital color images

    NASA Astrophysics Data System (ADS)

    Santhi, V.; Thangavelu, Arunkumar

    2011-10-01

    As the Internet grows enormously, illegal manipulation of digital multimedia data has become very easy with the advancement of technology tools. To protect such multimedia data from unauthorized access, digital watermarking systems are used. In this paper a new Discrete Walsh-Hadamard Transform based visible watermarking system is proposed. As the watermark is embedded in the transform domain, the system is robust to many signal processing attacks. Moreover, in the proposed method the watermark is embedded in a tiling manner across the full range of frequencies to make it robust to compression and cropping attacks. The robustness of the algorithm is tested against noise addition, cropping, compression, histogram equalization and resizing attacks. The experimental results show that the algorithm is robust to common signal processing attacks, and the observed peak signal-to-noise ratio (PSNR) of the watermarked image varies from 20 to 30 dB depending on the size of the watermark.
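The embedding idea described above, blending block-transform coefficients of host and watermark, can be sketched as follows; the orthonormal Hadamard transform and the blending factors `alpha` and `beta` are illustrative choices, not the paper's tuned values:

```python
import numpy as np
from scipy.linalg import hadamard

def embed_visible(host, mark, alpha=0.9, beta=0.1, n=8):
    """Embed a visible watermark by blending Walsh-Hadamard coefficients of
    n x n blocks: X' = alpha*X_host + beta*X_mark (illustrative factors)."""
    H = hadamard(n) / np.sqrt(n)                # orthonormal Hadamard matrix
    out = host.astype(float).copy()
    for i in range(0, host.shape[0] - n + 1, n):
        for j in range(0, host.shape[1] - n + 1, n):
            Xh = H @ out[i:i+n, j:j+n] @ H.T                    # forward 2-D transform
            Xm = H @ mark[i:i+n, j:j+n].astype(float) @ H.T
            out[i:i+n, j:j+n] = H.T @ (alpha*Xh + beta*Xm) @ H  # inverse transform
    return np.clip(out, 0, 255)
```

Keeping `alpha` close to 1 preserves the host image, while `beta` controls how strongly the watermark shows through; tiling every block covers all spatial frequencies, as the abstract describes.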

  11. Fast Human Detection for Intelligent Monitoring Using Surveillance Visible Sensors

    PubMed Central

    Ko, Byoung Chul; Jeong, Mira; Nam, JaeYeal

    2014-01-01

    Human detection using visible surveillance sensors is an important and challenging task for intruder detection and safety management. The biggest barrier to real-time human detection is the computational time required for dense image scaling and for scanning windows extracted from an entire image. This paper proposes fast human detection by selecting optimal levels of image scale using each level's adaptive region-of-interest (ROI). To estimate the image-scaling level, we generate a Hough windows map (HWM) and select a few optimal image scales based on the strength of the HWM and a divide-and-conquer algorithm. Furthermore, adaptive ROIs are arranged per image scale to provide a different search area. We employ a cascade random forests classifier to separate candidate windows into human and nonhuman classes. The proposed algorithm has been successfully applied to real-world surveillance video sequences, and its detection accuracy and computational speed show better performance than those of other related methods. PMID:25393782

  12. Improving the Accuracy of Direct Geo-referencing of Smartphone-Based Mobile Mapping Systems Using Relative Orientation and Scene Geometric Constraints.

    PubMed

    Alsubaie, Naif M; Youssef, Ahmed A; El-Sheimy, Naser

    2017-09-30

    This paper introduces a new method which facilitates the use of smartphones as a handheld low-cost mobile mapping system (MMS). Smartphones are becoming more sophisticated and are quickly closing the gap between computers and portable tablet devices. The current generation of smartphones is equipped with low-cost GPS receivers, high-resolution digital cameras, and micro-electro-mechanical systems (MEMS)-based navigation sensors (e.g., accelerometers, gyroscopes, magnetic compasses, and barometers). These sensors are in fact the essential components of a MMS. However, smartphone navigation sensors suffer from the poor accuracy of the Global Navigation Satellite System (GNSS), accumulated drift, and high signal noise. These issues affect the accuracy of the initial Exterior Orientation Parameters (EOPs) that are input into the bundle adjustment algorithm, which then produces inaccurate 3D mapping solutions. This paper proposes new methodologies for increasing the accuracy of direct geo-referencing of smartphones using relative orientation and smartphone motion sensor measurements, as well as integrating geometric scene constraints into free network bundle adjustment. The new methodologies fuse the relative orientations of the captured images and their corresponding motion sensor measurements to improve the initial EOPs. Then, the geometric features (e.g., horizontal and vertical lines) visible in each image are extracted and used as constraints in the bundle adjustment procedure, which corrects the relative position and orientation of the 3D mapping solution.

  13. Improving the Accuracy of Direct Geo-referencing of Smartphone-Based Mobile Mapping Systems Using Relative Orientation and Scene Geometric Constraints

    PubMed Central

    Alsubaie, Naif M.; Youssef, Ahmed A.; El-Sheimy, Naser

    2017-01-01

    This paper introduces a new method which facilitates the use of smartphones as a handheld low-cost mobile mapping system (MMS). Smartphones are becoming more sophisticated and are quickly closing the gap between computers and portable tablet devices. The current generation of smartphones is equipped with low-cost GPS receivers, high-resolution digital cameras, and micro-electro-mechanical systems (MEMS)-based navigation sensors (e.g., accelerometers, gyroscopes, magnetic compasses, and barometers). These sensors are in fact the essential components of a MMS. However, smartphone navigation sensors suffer from the poor accuracy of the Global Navigation Satellite System (GNSS), accumulated drift, and high signal noise. These issues affect the accuracy of the initial Exterior Orientation Parameters (EOPs) that are input into the bundle adjustment algorithm, which then produces inaccurate 3D mapping solutions. This paper proposes new methodologies for increasing the accuracy of direct geo-referencing of smartphones using relative orientation and smartphone motion sensor measurements, as well as integrating geometric scene constraints into free network bundle adjustment. The new methodologies fuse the relative orientations of the captured images and their corresponding motion sensor measurements to improve the initial EOPs. Then, the geometric features (e.g., horizontal and vertical lines) visible in each image are extracted and used as constraints in the bundle adjustment procedure, which corrects the relative position and orientation of the 3D mapping solution. PMID:28973958

  14. Performance of lightweight large C/SiC mirror

    NASA Astrophysics Data System (ADS)

    Yui, Yukari Y.; Goto, Ken; Kaneda, Hidehiro; Katayama, Haruyoshi; Kotani, Masaki; Miyamoto, Masashi; Naitoh, Masataka; Nakagawa, Takao; Saruwatari, Hideki; Suganuma, Masahiro; Sugita, Hiroyuki; Tange, Yoshio; Utsunomiya, Shin; Yamamoto, Yasuji; Yamawaki, Toshihiko

    2017-11-01

    Very lightweight mirrors will be required in the near future for both astronomical and Earth science/observation missions. Silicon carbide is becoming one of the major materials applied especially to large and/or lightweight space-borne optics, such as Herschel, GAIA, and SPICA. On the other hand, the technology of highly accurate optical measurement of large telescopes, especially at visible wavelengths or under cryogenic conditions, is also indispensable to realize such space-borne telescopes and hence successful missions. We have manufactured a very lightweight Φ=800 mm mirror made of carbon-reinforced silicon carbide composite that can be used to evaluate the homogeneity of the mirror substrate and to master and establish ground testing methods and techniques by assembling it as the primary mirror of an optical system. All other parts of the optics model are also made of the same material as the primary mirror. The composite material was assumed to be homogeneous based on mechanical tests of samples cut from various areas of the 800 mm mirror green body and on cryogenic optical measurement of the surface deformation of a 160 mm sample mirror made from the same green body as the 800 mm mirror. The optical testing facility has been confirmed to be capable of highly precise optical measurements of large optical systems in a horizontal-light-axis configuration. A stitching measurement method and the algorithm for analyzing its measurements are also under study.

  15. Comparative Analysis of Aerosol Retrievals from MODIS, OMI and MISR Over Sahara Region

    NASA Technical Reports Server (NTRS)

    Lyapustin, A.; Wang, Y.; Hsu, C.; Terres, O.; Leptoukh, G.; Kalashnikova, O.; Korkin, S.

    2011-01-01

    MODIS is a wide field-of-view sensor providing daily global observations of the Earth. Currently, global MODIS aerosol retrievals over land are performed with the main Dark Target algorithm, complemented by the Deep Blue (DB) algorithm over bright deserts. The Dark Target algorithm relies on a surface parameterization which relates reflectance in the MODIS visible bands to that in the 2.1 µm region, whereas the Deep Blue algorithm uses an ancillary angular distribution model of surface reflectance developed from a time series of clear-sky MODIS observations. Recently, a new Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm has been developed for MODIS. MAIAC uses time-series and image-based processing to perform simultaneous retrievals of aerosol properties and surface bidirectional reflectance. It is a generic algorithm which works over both dark vegetated surfaces and bright deserts and performs retrievals at 1 km resolution. In this work, we provide a comparative analysis of the DB, MAIAC, MISR and OMI aerosol products over the bright deserts of northern Africa.

  16. Volumetric ambient occlusion for real-time rendering and games.

    PubMed

    Szirmay-Kalos, L; Umenhoffer, T; Toth, B; Szecsi, L; Sbert, M

    2010-01-01

    This new algorithm, based on GPUs, can compute ambient occlusion to inexpensively approximate global-illumination effects in real-time systems and games. The first step in deriving this algorithm is to examine how ambient occlusion relates to the physically founded rendering equation. The correspondence stems from a fuzzy membership function that defines what constitutes nearby occlusions. The next step is to develop a method to calculate ambient occlusion in real time without precomputation. The algorithm is based on a novel interpretation of ambient occlusion that measures the relative volume of the visible part of the surface's tangent sphere. The new formula's integrand has low variation and thus can be estimated accurately with a few samples.
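The tangent-sphere interpretation above can be illustrated with a CPU-side Monte Carlo sketch. The paper's method runs on the GPU with a fuzzy membership function; this simplified version just counts unoccluded sample points inside the tangent sphere, with `occupied` a hypothetical scene query:

```python
import numpy as np

def volumetric_ao(p, normal, occupied, r=0.5, n_samples=4096, rng=None):
    """Estimate ambient occlusion at surface point p as the fraction of the
    tangent sphere's volume that is free of geometry (Monte Carlo sketch).
    occupied(points) -> boolean array, True where a point lies inside geometry."""
    rng = np.random.default_rng(rng)
    center = p + r * normal                 # tangent sphere touches the surface at p
    # uniform samples inside a ball of radius r (radius drawn as u**(1/3))
    v = rng.normal(size=(n_samples, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    pts = center + v * r * rng.random(n_samples)[:, None] ** (1 / 3)
    return 1.0 - occupied(pts).mean()       # visible (unoccluded) volume fraction
```

An open point on a flat floor returns 1.0 (fully visible); a half-space occluder beside the point drops the estimate toward 0.5, matching the relative-volume reading of ambient occlusion.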

  17. Algorithmic commonalities in the parallel environment

    NASA Technical Reports Server (NTRS)

    Mcanulty, Michael A.; Wainer, Michael S.

    1987-01-01

    The ultimate aim of this project was to analyze procedures from substantially different application areas to discover what is either common or peculiar in the process of conversion to the Massively Parallel Processor (MPP). Three areas were identified: molecular dynamic simulation, production systems (rule systems), and various graphics and vision algorithms. To date, only selected graphics procedures have been investigated. They are the most readily available, and produce the most visible results. These include simple polygon patch rendering, raycasting against a constructive solid geometric model, and stochastic or fractal based textured surface algorithms. Only the simplest of conversion strategies, mapping a major loop to the array, has been investigated so far. It is not entirely satisfactory.

  18. Kinetic and dynamic Delaunay tetrahedralizations in three dimensions

    NASA Astrophysics Data System (ADS)

    Schaller, Gernot; Meyer-Hermann, Michael

    2004-09-01

    We describe algorithms to implement fully dynamic and kinetic three-dimensional unconstrained Delaunay triangulations, where the time evolution of the triangulation is governed not only by moving vertices but also by a changing number of vertices. We use three-dimensional simplex flip algorithms and a stochastic visibility-walk algorithm for point location; in addition, we propose a new, simple method of deleting vertices from an existing three-dimensional Delaunay triangulation while maintaining the Delaunay property. As an example, we analyse the performance in various cases of practical relevance. The dual Dirichlet tessellation can be used to solve differential equations on an irregular grid, to define partitions in cell-tissue simulations, for collision detection, etc.
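A two-dimensional sketch of the visibility walk used for point location (the paper works in three dimensions and chooses the crossing face stochastically; on a Delaunay triangulation the deterministic variant below already terminates):

```python
import numpy as np
from scipy.spatial import Delaunay

def visibility_walk(tri, q, start=0):
    """Locate the triangle of `tri` containing point q by a visibility walk:
    from the current triangle, step across any edge whose supporting line
    separates q from the opposite vertex. Returns -1 if q is outside the hull."""
    def orient(a, b, c):                    # twice the signed area of (a, b, c)
        return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])
    t = start
    for _ in range(len(tri.simplices) + 1):
        verts = tri.points[tri.simplices[t]]
        for i in range(3):                  # edge opposite vertex i
            a, b = verts[(i+1) % 3], verts[(i+2) % 3]
            # is q strictly on the far side of edge (a, b) from vertex i?
            if orient(a, b, verts[i]) * orient(a, b, q) < 0:
                t = tri.neighbors[t][i]     # walk into the neighbor across that edge
                if t == -1:
                    return -1               # stepped out of the convex hull
                break
        else:
            return t                        # no separating edge: q lies inside t
    return -1
```

A quick check against `scipy`'s own point location: `visibility_walk(tri, q)` agrees with `tri.find_simplex(q)` for generic interior query points.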

  19. Validation of VIIRS Cloud Base Heights at Night Using Ground and Satellite Measurements over Alaska

    NASA Astrophysics Data System (ADS)

    NOH, Y. J.; Miller, S. D.; Seaman, C.; Forsythe, J. M.; Brummer, R.; Lindsey, D. T.; Walther, A.; Heidinger, A. K.; Li, Y.

    2016-12-01

    Knowledge of Cloud Base Height (CBH) is critical to describing cloud radiative feedbacks in numerical models and is of practical significance to aviation communities. We have developed a new CBH algorithm constrained by Cloud Top Height (CTH) and Cloud Water Path (CWP) by performing a statistical analysis of A-Train satellite data; it includes an extinction-based method for thin cirrus. In the algorithm, cloud geometric thickness is derived from upstream CTH and CWP input and subtracted from CTH to generate the topmost-layer CBH. The CBH information is a key parameter for an improved Cloud Cover/Layers product. The algorithm has been applied to the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Suomi NPP spacecraft. Nighttime cloud optical properties for CWP are retrieved with the nighttime lunar cloud optical and microphysical properties (NLCOMP) algorithm, based on a lunar reflectance model for the VIIRS Day/Night Band (DNB), which measures nighttime visible light such as moonlight. The DNB's innovative capabilities fill the polar-winter and nighttime gap in cloud observations that has been an important shortfall of conventional radiometers. The CBH products have been evaluated intensively against CloudSat data, and the results show that the new algorithm yields significantly improved performance over the original VIIRS CBH algorithm. However, since CloudSat now operates during daytime only due to a battery anomaly, the nighttime performance has not been fully assessed. This presentation will show our approach to assessing the performance of the CBH algorithm at night. VIIRS CBHs are retrieved over the Alaska region from October 2015 to April 2016 using the Clouds from AVHRR Extended (CLAVR-x) processing system. Ground-based measurements from the ceilometer and micropulse lidar at the Atmospheric Radiation Measurement (ARM) site on the North Slope of Alaska are used for the analysis. Local weather conditions are checked using temperature and precipitation observations at the site. CALIPSO data with near-simultaneous collocation are added for multi-layered cloud cases, which may have high clouds aloft beyond the reach of the ground measurements. Multi-month statistics of performance and case studies will be shown, and additional efforts toward algorithm refinement will also be discussed.
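The core retrieval step, subtracting a CWP-derived geometric thickness from the cloud top height, can be sketched as below; the linear thickness model and its coefficients are placeholders, not the fitted A-Train statistics used by the VIIRS algorithm:

```python
import numpy as np

def cloud_base_height(cth_km, cwp_gm2, a=0.5, b=0.002):
    """Topmost-layer CBH sketch: derive cloud geometric thickness from cloud
    water path (g m-2) and subtract it from cloud top height (km). The linear
    thickness model and coefficients a, b are illustrative placeholders."""
    thickness = a + b * np.asarray(cwp_gm2, float)        # km, illustrative
    return np.maximum(np.asarray(cth_km, float) - thickness, 0.0)
```

For example, a 10 km cloud top with a 250 g m-2 water path gives a 1 km thickness and hence a 9 km base under these placeholder coefficients; clamping at zero keeps thick, low clouds physical.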

  20. Efficient Stochastic Rendering of Static and Animated Volumes Using Visibility Sweeps.

    PubMed

    von Radziewsky, Philipp; Kroes, Thomas; Eisemann, Martin; Eisemann, Elmar

    2017-09-01

    Stochastically solving the rendering integral (particularly visibility) is the de facto standard for physically based light transport, but it is computationally expensive, especially when displaying heterogeneous volumetric data. In this work, we present efficient techniques to speed up the rendering process via a novel visibility-estimation method in concert with unbiased importance sampling (involving environmental lighting and visibility inside the volume), filtering, and update techniques for both static and animated scenes. Our major contributions include a progressive estimate of partial occlusions based on a fast sweeping-plane algorithm. These occlusions are stored in an octahedral representation, which can be conveniently transformed into a quadtree-based hierarchy suited for joint importance sampling. Further, we propose sweep-space filtering, which suppresses the occurrence of fireflies, and we investigate different update schemes for animated scenes. Our technique is unbiased, requires little precomputation, is highly parallelizable, and is applicable to various volume data sets, dynamic transfer functions, animated volumes and changing environmental lighting.

  1. Detection of convective initiation using Meteosat SEVIRI: implementation in and verification with the tracking and nowcasting algorithm Cb-TRAM

    NASA Astrophysics Data System (ADS)

    Merk, D.; Zinner, T.

    2013-02-01

    In this paper a new detection scheme for convective initiation (CI) under day and night conditions is presented. The new algorithm combines the strengths of two existing methods for detecting CI with geostationary satellite data and uses the channels of the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard Meteosat Second Generation (MSG). For the new algorithm, five infrared criteria from the Satellite Convection Analysis and Tracking algorithm (SATCAST) and one high-resolution visible channel (HRV) criterion from Cb-TRAM were adapted. This set of criteria aims to identify the typical development of quickly developing convective cells in an early stage. The different criteria include time trends of the 10.8 IR channel and IR channel differences, as well as their time trends. To provide the trend fields an optical-flow-based method is used, the pyramidal matching algorithm, which is part of Cb-TRAM. The new detection scheme is implemented in Cb-TRAM and is verified for seven days comprising different weather situations in Central Europe. Skill scores are provided in contrast with the original early-stage detection scheme of Cb-TRAM. From the comparison against detections of later thunderstorm stages, which are also provided by Cb-TRAM, a decrease in false prior warnings (false alarm ratio) from 91 to 81% is presented, an increase of the critical success index from 7.4 to 12.7%, and a decrease of the BIAS from 320 to 146% for normal scan mode. Similar trends are found for rapid scan mode. Most obvious is the decline of false alarms found for synoptic conditions with upper cold air masses triggering convection.

  2. Detection of convective initiation using Meteosat SEVIRI: implementation in and verification with the tracking and nowcasting algorithm Cb-TRAM

    NASA Astrophysics Data System (ADS)

    Merk, D.; Zinner, T.

    2013-08-01

    In this paper a new detection scheme for convective initiation (CI) under day and night conditions is presented. The new algorithm combines the strengths of two existing methods for detecting CI with geostationary satellite data. It uses the channels of the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard Meteosat Second Generation (MSG). For the new algorithm, five infrared (IR) criteria from the Satellite Convection Analysis and Tracking algorithm (SATCAST) and one high-resolution visible channel (HRV) criterion from Cb-TRAM were adapted. This set of criteria aims to identify the typical development of quickly developing convective cells in an early stage. The different criteria include time trends of the 10.8 IR channel, and IR channel differences, as well as their time trends. To provide the trend fields an optical-flow-based method is used: the pyramidal matching algorithm, which is part of Cb-TRAM. The new detection scheme is implemented in Cb-TRAM, and is verified for seven days which comprise different weather situations in central Europe. Contrasted with the original early-stage detection scheme of Cb-TRAM, skill scores are provided. From the comparison against detections of later thunderstorm stages, which are also provided by Cb-TRAM, a decrease in false prior warnings (false alarm ratio) from 91 to 81% is presented, an increase of the critical success index from 7.4 to 12.7%, and a decrease of the BIAS from 320 to 146% for normal scan mode. Similar trends are found for rapid scan mode. Most obvious is the decline of false alarms found for the synoptic class with cold air masses aloft.

  3. Real-time Enhancement, Registration, and Fusion for an Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.

    2006-01-01

    Over the last few years NASA Langley Research Center (LaRC) has been developing an Enhanced Vision System (EVS) to aid pilots while flying in poor visibility conditions. The EVS captures imagery using two infrared video cameras. The cameras are placed in an enclosure that is mounted and flown forward-looking underneath the NASA LaRC ARIES 757 aircraft. The data streams from the cameras are processed in real-time and displayed on monitors on-board the aircraft. With proper processing the camera system can provide better-than-human-observed imagery particularly during poor visibility conditions. However, to obtain this goal requires several different stages of processing including enhancement, registration, and fusion, and specialized processing hardware for real-time performance. We are using a real-time implementation of the Retinex algorithm for image enhancement, affine transformations for registration, and weighted sums to perform fusion. All of the algorithms are executed on a single TI DM642 digital signal processor (DSP) clocked at 720 MHz. The image processing components were added to the EVS system, tested, and demonstrated during flight tests in August and September of 2005. In this paper we briefly discuss the EVS image processing hardware and algorithms. We then discuss implementation issues and show examples of the results obtained during flight tests.
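A minimal sketch of the two processing stages named above, Retinex-style enhancement and weighted-sum fusion; the single-scale form and the Gaussian surround are simplifications of the real-time multi-scale Retinex flown on the DSP:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(img, sigma=30.0):
    """Single-scale Retinex: log of the image minus log of its Gaussian
    surround estimate (a simplified stand-in for the multi-scale version)."""
    img = img.astype(float) + 1.0           # offset avoids log(0)
    return np.log(img) - np.log(gaussian_filter(img, sigma))

def fuse(images, weights):
    """Weighted-sum fusion of registered sensor images (weights normalized)."""
    w = np.asarray(weights, float) / np.sum(weights)
    return sum(wi * im for wi, im in zip(w, images))
```

With equal weights, `fuse` reduces to a plain average of the registered infrared streams; unequal weights let one sensor dominate when its imagery is more informative.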

  4. Automated Rock Identification for Future Mars Exploration Missions

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Gazis, P.; Bishop, J. L.; Alena, R.; Hart, S. D.; Horton, A.

    2003-01-01

    A key task for human or robotic explorers on the surface of Mars is choosing which particular rock or mineral samples should be selected for more intensive study. The usual challenges of such a task are compounded by the lack of sensory input available to a suited astronaut or the limited downlink bandwidth available to a rover. Additional challenges facing a human mission include limited surface time and the similarities in appearance of important minerals (e.g. carbonates, silicates, salts). Yet the choice of which sample to collect is critical. To address this challenge we are developing science analysis algorithms to interface with a Geologist's Field Assistant (GFA) device that will allow robotic or human remote explorers to better sense and explore their surroundings during limited surface excursions. We aim for our algorithms to interpret spectral and imaging data obtained by various sensors. The algorithms, for example, will identify key minerals, rocks, and sediments from mid-IR, Raman, and visible/near-IR spectra as well as from high resolution and microscopic images to help interpret data and to provide high-level advice to the remote explorer. A top-level system will consider multiple inputs from raw sensor data output by imagers and spectrometers (visible/near-IR, mid-IR, and Raman) as well as human opinion to identify rock and mineral samples.

  5. Estimation of cloud optical thickness by processing SEVIRI images and implementing a semi analytical cloud property retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Pandey, P.; De Ridder, K.; van Lipzig, N.

    2009-04-01

    Clouds play a very important role in the Earth's climate system, as they form an intermediate layer between the Sun and the Earth. Satellite remote sensing systems are the only means of providing information about clouds on large scales. The geostationary satellite Meteosat Second Generation (MSG) carries an imaging radiometer, the Spinning Enhanced Visible and Infrared Imager (SEVIRI). SEVIRI is a 12-channel imager, with 11 channels observing the Earth's full disk at a temporal resolution of 15 min and a spatial resolution of 3 km at nadir, plus a high-resolution visible (HRV) channel. The visible channels (0.6 µm and 0.81 µm) and the near-infrared channel (1.6 µm) of SEVIRI are used to retrieve the cloud optical thickness (COT). The study domain is over Europe, covering the region between 35°N-70°N and 10°W-30°E. SEVIRI level 1.5 images over this domain are acquired from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) archive. Processing this imagery involves a number of steps before estimating the COT. The pre-processing steps are as follows. First, the digital count number is read from the imagery. Image geo-coding is performed to relate pixel positions to the corresponding longitude and latitude. The solar zenith angle is determined as a function of latitude and time. Radiometric conversion is done using the offset and slope values of each band. The radiance values obtained are then used to calculate the reflectance for the visible channels using the solar zenith angle. An attempt is made to estimate the COT from the observed radiances. A semi-analytical algorithm [Kokhanovsky et al., 2003] is implemented for the estimation of cloud optical thickness from the visible-spectrum light intensity reflected by clouds. The asymptotic solution of the radiative transfer equation for clouds with large optical thickness is the basis of this algorithm. The two visible channels of SEVIRI are used to find the COT, and the near-infrared channel to estimate the effective droplet radius. Estimating COT with a semi-analytical scheme that does not involve the conventional look-up-table approach is the aim of this work; subsequently, the vertically integrated liquid water content (w) or ice water content will be retrieved. The estimated COT and w will be compared with values obtained from other approaches and validated against in situ measurements. Corresponding author address: Praveen Pandey, VITO - Flemish Institute for Technological Research, Boeretang 200, B 2400, Mol, Belgium. E-mail: praveen.pandey@vito.be
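The pre-processing chain described above (digital counts to radiance via each band's slope and offset, then reflectance using the solar zenith angle) can be sketched as follows; the calibration constants in the example are placeholders, not actual SEVIRI band values:

```python
import numpy as np

def counts_to_reflectance(dn, slope, offset, e0, sun_zenith_deg, d=1.0):
    """Pre-processing sketch: digital counts -> radiance via the band's
    slope/offset, then a bidirectional reflectance factor using the solar
    zenith angle. e0 is the band solar irradiance and d the Sun-Earth
    distance in AU (all values here are placeholders, not SEVIRI's)."""
    radiance = offset + slope * np.asarray(dn, float)
    mu0 = np.cos(np.radians(sun_zenith_deg))     # cosine of solar zenith angle
    return np.pi * radiance * d**2 / (e0 * mu0)
```

The same function applies per pixel to a whole geo-coded image array, with `sun_zenith_deg` computed from latitude and acquisition time as the abstract outlines.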

  6. Computing danger zones for provably safe closely spaced parallel approaches: Theory and experiment

    NASA Astrophysics Data System (ADS)

    Teo, Rodney

    In poor visibility, paired approaches to airports with closely spaced parallel runways are not permitted, halving the arrival rate. With Global Positioning System technology, datalinks and cockpit displays, this could be averted. One important problem is ensuring safety during a blundered approach by one aircraft; this is ongoing research. A danger zone around the blunderer is required. If the correct danger zone could be calculated, it would be possible to achieve 100% of clear-day capacity on poor-visibility days even on runways spaced 750 feet apart. The danger zones vary significantly during an approach, so calculating them in real time would be very significant; approximations (e.g., outer bounds) are not good enough. This thesis presents a way to calculate these danger zones in real time for a very broad class of blunder trajectories. The approach in this thesis differs from others in that it guarantees safety for any possible blunder trajectory as long as the speeds and turn rates of the blunder are within certain bounds. In addition, the approach considers all emergency evasive maneuvers whose speeds and turn rates are within certain bounds about a nominal emergency evasive maneuver. For all combinations of these blunder and evasive-maneuver trajectories, it guarantees that the evasive maneuver is safe. In more than 1 million simulation runs, the algorithm shows a 100% rate of successful alerts and a 0% rate of collisions given an alert. As an experimental testbed, two 10-ft-wingspan fully autonomous unmanned aerial vehicles and a ground station were developed together with J. S. Jang. The development includes the design and flight testing of automatic controllers. The testbed is used to demonstrate the algorithm implementation through an autonomous closely spaced parallel approach, with one aircraft programmed to blunder. The other aircraft responds according to the result of the algorithm running on board and evades autonomously when required. This experimental demonstration was conducted successfully, showing the implementation of the algorithm and, in particular, demonstrating that it can run in real time. Finally, with the necessary sensors and datalink, and the appropriate procedures in place, the algorithm developed in this thesis will enable 100% of clear-day capacity on poor-visibility days even on runways spaced 750 feet apart.

  7. Retrieval of tropospheric aerosol properties over land from visible and near-infrared spectral reflectance: Application over Maryland

    NASA Astrophysics Data System (ADS)

    Levy, Robert Carroll

    Aerosols are major components of the Earth's global climate system, affecting the radiation budget and cloud processes of the atmosphere. When located near the surface, high concentrations lead to lowered visibility, increased health problems and generally reduced quality of life for the human population. Over the United States mid-Atlantic region, aerosol pollution is a problem mainly during the summer. Satellites such as the MODerate resolution Imaging Spectroradiometer (MODIS), from their vantage point above the atmosphere, provide unprecedented coverage of global and regional aerosols over land. During MODIS' eight-year operation, exhaustive data validation and analyses have shown how the algorithm should be improved. This dissertation describes the development of the 'second-generation' operational algorithm for retrieval of global tropospheric aerosol properties over dark land surfaces from MODIS-observed spectral reflectance. New understanding about global aerosol properties, land surface reflectance characteristics, and radiative transfer properties was gained in the process. The new operational algorithm performs a simultaneous inversion of reflectance in two visible channels (0.47 and 0.66 µm) and one shortwave-infrared channel (2.12 µm), thereby having increased sensitivity to coarse aerosol. Inversion of the three channels retrieves the aerosol optical depth (τ) at 0.55 µm, the percentage of non-dust (fine-model) aerosol (η) and the surface reflectance. The algorithm is applied globally and retrieves τ that is highly correlated (y = 0.02 + 1.0x, R = 0.9) with ground-based sunphotometer measurements. The new algorithm estimates the global, over-land, long-term averaged τ ≈ 0.21, a 25% reduction from previous MODIS estimates. This leads to reducing estimates of the global, non-desert, over-land aerosol direct radiative effect (all aerosols) by 1.7 W·m⁻² (0.5 W·m⁻² over the entire globe), which significantly impacts assessment of aerosol direct radiative forcing (the contribution from anthropogenic aerosols only). Over the U.S. mid-Atlantic region, validated retrievals of τ (an integrated column property) can help estimate surface PM2.5 concentration, a monitored criteria air-quality property. The 3-dimensional aerosol loading in the region is characterized using aircraft measurements and the Community Multi-scale Air Quality (CMAQ) model, leading to some convergence of observed quantities and modeled processes.

  8. Implementation in an FPGA circuit of Edge detection algorithm based on the Discrete Wavelet Transforms

    NASA Astrophysics Data System (ADS)

    Bouganssa, Issam; Sbihi, Mohamed; Zaim, Mounia

    2017-07-01

    The 2D Discrete Wavelet Transform (DWT) is a computationally intensive task that is usually implemented on specific architectures in many real-time imaging systems. In this paper, a high-throughput edge (contour) detection algorithm based on the discrete wavelet transform is proposed. A technique for applying the filters in the three directions of the image (horizontal, vertical and diagonal) is used to capture the maximum number of existing contours. The proposed architectures were designed in VHDL and mapped to a Xilinx Spartan-6 FPGA. The synthesis results show that the proposed architecture has a low area cost and can operate at up to 100 MHz, performing 2D wavelet analysis for a sequence of images while maintaining the flexibility of the system to support an adaptive algorithm.
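A one-level 2-D Haar DWT with the three directional detail subbands combined into an edge map is the software analogue of the hardware scheme described above (the paper's FPGA filter bank and wavelet choice may differ):

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT: approximation plus detail subbands along the
    two axes and the diagonal (even-sized input assumed)."""
    a = img.astype(float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2      # low pass along x
    hi = (a[:, 0::2] - a[:, 1::2]) / 2      # high pass along x
    cA  = (lo[0::2] + lo[1::2]) / 2         # approximation
    dX  = (hi[0::2] + hi[1::2]) / 2         # detail along x (vertical edges)
    dY  = (lo[0::2] - lo[1::2]) / 2         # detail along y (horizontal edges)
    dXY = (hi[0::2] - hi[1::2]) / 2         # diagonal detail
    return cA, dX, dY, dXY

def edge_map(img):
    """Sum the magnitudes of the three detail subbands so that contours in
    all three directions contribute to the detected edges."""
    _, dx, dy, dxy = haar_dwt2(img)
    return np.abs(dx) + np.abs(dy) + np.abs(dxy)
```

A vertical step in the image shows up only in the x-detail subband, so the edge map is nonzero exactly along the step, at half the input resolution.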

  9. Pattern optimization of compound optical film for uniformity improvement in liquid-crystal displays

    NASA Astrophysics Data System (ADS)

    Huang, Bing-Le; Lin, Jin-tang; Ye, Yun; Xu, Sheng; Chen, En-guo; Guo, Tai-Liang

    2017-12-01

    The density dynamic adjustment algorithm (DDAA) is designed to efficiently improve the uniformity of the integrated backlight module (IBLM) by adjusting the distribution of microstructures on the compound optical film (COF); the COF is modeled in SolidWorks and simulated in TracePro. In order to demonstrate the universality of the proposed algorithm, the initial distribution is allocated by a Bezier curve instead of an empirical value. Simulation results show that the uniformity of the IBLM reaches over 90% after only four iterations. Moreover, the vertical and horizontal full widths at half maximum of the angular intensity are collimated to 24 deg and 14 deg, respectively. Compared with the current industry requirement, the IBLM has an 85% higher luminance uniformity of the emerging light, which demonstrates the feasibility and universality of the proposed algorithm.

  10. Effects of various runway lighting parameters upon the relation between runway visual range and visual range of centerline and edge lights in fog

    NASA Technical Reports Server (NTRS)

    Haines, R. F.

    1973-01-01

    Thirty-six students and 54 commercial airline pilots were tested in a fog chamber to determine the effect of runway edge and centerline light intensity and spacing, fog density, ambient luminance level, and lateral and vertical offset distance of the subject from the runway's centerline upon horizontal visual range. These data were obtained to evaluate the adequacy of a balanced lighting system to provide maximum visual range in fog when viewing both centerline and runway edge lights. The daytime system was compared against two other candidate lighting systems; the nighttime system was compared against other candidate lighting systems. The second objective was to determine whether visual range is affected by lights between the subject and the farthest light visible through the fog. The third objective was to determine whether college students differ from commercial airline pilots in their horizontal visual range through fog. Two studies were conducted.

  11. Polarization sensitivity and retinal topography of the striped pyjama squid (Sepioloidea lineolata - Quoy/Gaimard 1832).

    PubMed

    Talbot, Christopher M; Marshall, Justin

    2010-10-01

    Coleoid cephalopods (octopus, cuttlefish and squid) potentially possess polarization sensitivity (PS) based on photoreceptor structure, but this idea has rarely been tested behaviourally. Here, we use a polarized, striped optokinetic stimulus to demonstrate PS in the striped pyjama squid, Sepioloidea lineolata. This species displayed strong, consistent optokinetic nystagmic eye movements in response to a drum with stripes producing e-vectors set to 0 deg, 45 deg, 90 deg and 135 deg that would only be visible to an animal with PS. This is the first behavioural demonstration of a polarized optokinetic response in any species of cephalopod. This species, which typically sits beneath the substrate surface looking upwards for potential predators and prey, possesses a dorsally shifted horizontal pupil slit. Accordingly, it was found to possess a horizontal strip of high-density photoreceptors shifted ventrally in the retina, suggesting modifications such as a change in sensitivity or resolution to the dorsal visual field.

  12. An Improved Aerial Target Localization Method with a Single Vector Sensor

    PubMed Central

    Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin

    2017-01-01

    This paper focuses on problems encountered when applying existing aerial target localization methods to actual data, analyzes their causes, and proposes an improved algorithm. Processing of sea-experiment data shows that the existing algorithms place high demands on the accuracy of the angle estimation. The improved algorithm relaxes these accuracy requirements and obtains robust estimates. A closest-distance matching estimation algorithm and a horizontal distance estimation compensation algorithm are proposed. Post-processing the data with a forward-backward double-filtering method improves smoothing and allows the initial-stage data to be filtered, so that the filtering results retain more useful information. Aerial target height measurement methods are also studied and estimation results are given, realizing three-dimensional localization of the aerial target and increasing the underwater platform's awareness of aerial targets, giving the platform better mobility and concealment. PMID:29135956

  13. Vehicle tracking using fuzzy-based vehicle detection window with adaptive parameters

    NASA Astrophysics Data System (ADS)

    Chitsobhuk, Orachat; Kasemsiri, Watjanapong; Glomglome, Sorayut; Lapamonpinyo, Pipatphon

    2018-04-01

    In this paper, a fuzzy-based vehicle tracking system is proposed. The proposed system consists of two main processes: vehicle detection and vehicle tracking. In the first process, the Gradient-based Adaptive Threshold Estimation (GATE) algorithm is adopted to provide a suitable threshold value for Sobel edge detection. The estimated threshold adapts to changes in illumination conditions throughout the day, leading to greater vehicle detection performance compared to a fixed user-defined threshold. In the second process, this paper proposes a novel vehicle tracking algorithm, Fuzzy-Based Vehicle Analysis (FBA), in order to reduce false estimates in vehicle tracking caused by the uneven edges of large vehicles and by vehicles changing lanes. The proposed FBA algorithm employs the average edge density and the Horizontal Moving Edge Detection (HMED) algorithm, adopting fuzzy rule-based logic to rectify the vehicle tracking. The experimental results demonstrate that the proposed system achieves a high vehicle detection accuracy of about 98.22% and a low false detection rate of about 3.92%.
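The abstract does not specify how GATE derives its threshold, so the following sketch is only one plausible reading: compute the Sobel gradient magnitude over the frame and set the threshold to the mean plus k standard deviations of that magnitude, so the threshold tracks scene illumination. The statistic and the parameter k are assumptions, not the published GATE rule.

```python
def sobel_magnitude(img):
    """Sobel gradient magnitude; border pixels are left at zero."""
    KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    H, W = len(img), len(img[0])
    out = [[0.0] * W for _ in range(H)]
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            gx = sum(KX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(KY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

def adaptive_threshold(img, k=1.0):
    """Hypothetical GATE-style rule: mean + k * std of the gradient magnitude."""
    mags = [m for row in sobel_magnitude(img) for m in row]
    mean = sum(mags) / len(mags)
    std = (sum((m - mean) ** 2 for m in mags) / len(mags)) ** 0.5
    return mean + k * std
```

On a darker frame the gradient statistics shrink, and the threshold shrinks with them, which is the adaptive behavior the abstract attributes to GATE.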

  14. The life-cycle of upper-tropospheric jet streams identified with a novel data segmentation algorithm

    NASA Astrophysics Data System (ADS)

    Limbach, S.; Schömer, E.; Wernli, H.

    2010-09-01

    Jet streams are prominent features of the upper-tropospheric atmospheric flow. Through the thermal wind relationship these regions with intense horizontal wind speed (typically larger than 30 m/s) are associated with pronounced baroclinicity, i.e., with regions where extratropical cyclones develop due to baroclinic instability processes. Individual jet streams are non-stationary elongated features that can extend over more than 2000 km in the along-flow and 200-500 km in the across-flow direction. Their lifetime can vary between a few days and several weeks. In recent years, feature-based algorithms have been developed that allow compiling synoptic climatologies and typologies of upper-tropospheric jet streams based upon objective selection criteria and climatological reanalysis datasets. In this study a novel algorithm to efficiently identify jet streams using an extended region-growing segmentation approach is introduced. This algorithm iterates over a 4-dimensional field of horizontal wind speed from ECMWF analyses and decides at each grid point whether all prerequisites for a jet stream are met. In a single pass the algorithm keeps track of all adjacencies of these grid points and creates the 4-dimensional connected segments associated with each jet stream. In addition to the detection of these sets of connected grid points, the algorithm analyzes the development over time of the distinct 3-dimensional features each segment consists of. Important events in the development of these features, for example mergings and splittings, are detected and analyzed on a per-grid-point and per-feature basis. The output of the algorithm consists of the actual sets of grid-points augmented with information about the particular events, and of the so-called event graphs, which are an abstract representation of the distinct 3-dimensional features and events of each segment. 
This technique provides comprehensive information about the frequency of upper-tropospheric jet streams, their preferred regions of genesis, merging, splitting, and lysis, and statistical information about their size, amplitude and lifetime. The presentation will introduce the technique, provide example visualizations of the time evolution of the identified 3-dimensional jet stream features, and present results from a first multi-month "climatology" of upper-tropospheric jets. In the future, the technique can be applied to longer datasets, for instance reanalyses and output from global climate model simulations - and provide detailed information about key characteristics of jet stream life cycles.
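A minimal 2-D analogue of the region-growing step can be sketched as follows; the paper's algorithm works on a 4-dimensional field in a single pass and additionally tracks merge and split events, all of which this sketch omits. Grid points with wind speed at or above a threshold (30 m/s, the typical jet criterion quoted above) are grown into connected segments by flood fill.

```python
def grow_jet_segments(speed, threshold=30.0):
    """Label connected regions of a 2-D wind-speed grid where
    speed >= threshold (m/s), using 4-neighbour flood fill."""
    H, W = len(speed), len(speed[0])
    labels = [[0] * W for _ in range(H)]
    n_segments = 0
    for y in range(H):
        for x in range(W):
            if speed[y][x] >= threshold and labels[y][x] == 0:
                n_segments += 1                # start a new segment
                stack = [(y, x)]
                labels[y][x] = n_segments
                while stack:
                    cy, cx = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < H and 0 <= nx < W
                                and speed[ny][nx] >= threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = n_segments
                            stack.append((ny, nx))
    return labels, n_segments
```

Two disjoint super-threshold patches receive distinct labels, which is the per-segment bookkeeping the event-graph analysis builds on.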

  15. Super-resolution algorithm based on sparse representation and wavelet preprocessing for remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Ren, Ruizhi; Gu, Lingjia; Fu, Haoyang; Sun, Chenglin

    2017-04-01

    An effective super-resolution (SR) algorithm is proposed for actual spectral remote sensing images based on sparse representation and wavelet preprocessing. The proposed SR algorithm mainly consists of dictionary training and image reconstruction. Wavelet preprocessing is used to establish four subbands, i.e., low frequency, horizontal, vertical, and diagonal high frequency, for an input image. As compared to the traditional approaches involving the direct training of image patches, the proposed approach focuses on the training of features derived from these four subbands. The proposed algorithm is verified using different spectral remote sensing images, e.g., moderate-resolution imaging spectroradiometer (MODIS) images with different bands, and the latest Chinese Jilin-1 satellite images with high spatial resolution. According to the visual experimental results obtained from the MODIS remote sensing data, the SR images using the proposed SR algorithm are superior to those using a conventional bicubic interpolation algorithm or traditional SR algorithms without preprocessing. Fusion algorithms, e.g., standard intensity-hue-saturation, principal component analysis, wavelet transform, and the proposed SR algorithms are utilized to merge the multispectral and panchromatic images acquired by the Jilin-1 satellite. The effectiveness of the proposed SR algorithm is assessed by parameters such as peak signal-to-noise ratio, structural similarity index, correlation coefficient, root-mean-square error, relative dimensionless global error in synthesis, relative average spectral error, spectral angle mapper, and the quality index Q4, and its performance is better than that of the standard image fusion algorithms.

  16. Image reconstruction from few-view CT data by gradient-domain dictionary learning.

    PubMed

    Hu, Zhanli; Liu, Qiegen; Zhang, Na; Zhang, Yunwan; Peng, Xi; Wu, Peter Z; Zheng, Hairong; Liang, Dong

    2016-05-21

    Decreasing the number of projections is an effective way to reduce the radiation dose delivered to patients in medical computed tomography (CT) imaging. However, incomplete projection data for CT reconstruction will result in artifacts and distortions. In this paper, a novel dictionary learning algorithm operating in the gradient domain (Grad-DL) is proposed for few-view CT reconstruction. Specifically, dictionaries are trained from the horizontal and vertical gradient images, respectively, and the desired image is subsequently reconstructed from the sparse representations of both gradients by solving a least-squares problem. Since the gradient images are sparser than the image itself, the proposed approach leads to sparser representations than conventional DL methods in the image domain, and thus a better reconstruction quality is achieved. To evaluate the proposed Grad-DL algorithm, both qualitative and quantitative studies were carried out through computer simulations as well as real-data experiments on fan-beam and cone-beam geometries. The results show that the proposed algorithm can yield better images than the existing algorithms.

  17. Privacy Preserving Nearest Neighbor Search

    NASA Astrophysics Data System (ADS)

    Shaneck, Mark; Kim, Yongdae; Kumar, Vipin

    Data mining is frequently obstructed by privacy concerns. In many cases data is distributed, and bringing the data together in one place for analysis is not possible due to privacy laws (e.g. HIPAA) or policies. Privacy preserving data mining techniques have been developed to address this issue by providing mechanisms to mine the data while giving certain privacy guarantees. In this chapter we address the issue of privacy preserving nearest neighbor search, which forms the kernel of many data mining applications. To this end, we present a novel algorithm based on secure multiparty computation primitives to compute the nearest neighbors of records in horizontally distributed data. We show how this algorithm can be used in three important data mining algorithms, namely LOF outlier detection, SNN clustering, and kNN classification. We prove the security of these algorithms under the semi-honest adversarial model, and describe methods that can be used to optimize their performance. Keywords: Privacy Preserving Data Mining, Nearest Neighbor Search, Outlier Detection, Clustering, Classification, Secure Multiparty Computation
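The kernel being protected is ordinary nearest neighbor search. A plaintext sketch is shown below for reference only; in the privacy-preserving setting of the chapter, the distance computations and comparisons over the horizontally partitioned records are replaced by secure multiparty computation primitives, which are not reproduced here.

```python
def knn(query, records, k):
    """Return indices of the k records nearest to query.
    Uses squared Euclidean distance (order-preserving, so the
    ranking matches true Euclidean distance)."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(query, r)), i)
                   for i, r in enumerate(records))
    return [i for _, i in dists[:k]]
```

This same kernel, once made oblivious, drives the three applications named above: the k-distance in LOF, shared-neighbor lists in SNN clustering, and majority voting in kNN classification.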

  18. Occlusal wear and occlusal condition in a convenience sample of young adults.

    PubMed

    Van't Spijker, A; Kreulen, C M; Bronkhorst, E M; Creugers, N H J

    2015-01-01

    To study progression of tooth wear quantitatively in a convenience sample of young adults and to assess possible correlations with occlusal conditions. Twenty-eight dental students participated in a three-year follow-up study on tooth wear. Visible wear facets on full-arch gypsum casts were assessed using a flatbed scanner and measuring software. Regression analyses were used to assess possible associations between the registered occlusal conditions 'occlusal guidance scheme', 'vertical overbite', 'horizontal overbite', 'depth of sagittal curve', 'canine Angle class relation', 'history of orthodontic treatment', and 'self-reported grinding/clenching' (independent variables) and increase of wear facets (dependent variable). Mean increase in facet surface areas ranged from 1.2 mm2 (premolars, incisors) to 3.4 mm2 (molars); the relative increase ranged from 15% to 23%. Backward regression analysis showed no significant relation for 'group function', 'vertical overbite', 'depth of sagittal curve', 'history of orthodontic treatment' or 'self-reported clenching'. The final multiple linear regression model showed significant associations between 'anterior protected articulation' and 'horizontal overbite' and the increase of facet surface areas. For all teeth combined, only 'anterior protected articulation' had a significant effect. 'Self-reported grinding' did not have a significant effect (p>0.07). In this study, 'anterior protected articulation' and 'horizontal overbite' were significantly associated with the progression of tooth wear, whereas self-reported grinding was not. Occlusal conditions such as anterior protected articulation and horizontal overbite seem to have an effect on the progression of occlusal tooth wear in this convenience sample of young adults. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. An Earth-Moon Transfer Trajectory Design and Analysis Considering Spacecraft's Visibility from Daejeon Ground Station at TLI and LOI Maneuvers

    NASA Astrophysics Data System (ADS)

    Woo, Jin; Song, Young-Joo; Park, Sang-Young; Kim, Hae-Dong; Sim, Eun-Sup

    2010-09-01

    The optimal Earth-Moon transfer trajectory considering the spacecraft's visibility from the Daejeon ground station at both the trans-lunar injection (TLI) and lunar orbit insertion (LOI) maneuvers is designed. Both the TLI and LOI maneuvers are assumed to be impulsive thrusts. As the successful execution of the TLI and LOI maneuvers is a crucial factor among the various lunar mission parameters, it is necessary to design an optimal lunar transfer trajectory which guarantees visibility from a specified ground station while executing these maneuvers. The optimal Earth-Moon transfer trajectory is simulated by modifying the Korean Lunar Mission Design Software using Impulsive high Thrust Engine (KLMDS-ITE) developed in previous studies. Four different mission scenarios are established and simulated to analyze the effects of the spacecraft's visibility considerations at the TLI and LOI maneuvers. As a result, it is found that the optimal Earth-Moon transfer trajectory guaranteeing the spacecraft's visibility from the Daejeon ground station at both the TLI and LOI maneuvers can be designed with slight changes in the total delta-V. About a 1% difference is observed relative to the optimal trajectory when neither visibility condition is guaranteed, and about 0.04% when the visibility condition is guaranteed only at the time of the TLI maneuver. The spacecraft mass that can be delivered to the Moon when both visibility conditions are secured is shown to be about 534 kg, assuming a KSLV-2 on-orbit mass of about 2.6 tons. To minimize the total mission delta-V, it is strongly recommended that visibility conditions at both the TLI and LOI maneuvers be implemented simultaneously in the trajectory optimization algorithm.

  20. Algorithm for Cosmic Noise Suppression in Free Space Optical Communications

    NASA Astrophysics Data System (ADS)

    Yuvaraj, George; Himani Sharma, Goyal, Dr.

    2017-08-01

    This article describes an algorithm to reduce cosmic noise in a free space optical communication system. The method is intended to increase the communication system's performance and sustainability by means of image processing techniques. Methods employed in testing the model are also presented for a communication system that uses either a terrestrial or extraterrestrial medium to propagate messages using optics or visible light, without considering the impact of environmental effects such as turbulence, atmospheric absorption, beam dispersion and light intensity on its performance.

  1. State-Estimation Algorithm Based on Computer Vision

    NASA Technical Reports Server (NTRS)

    Bayard, David; Brugarolas, Paul

    2007-01-01

    An algorithm and software to implement the algorithm are being developed as a means to estimate the state (that is, the position and velocity) of an autonomous vehicle, relative to a visible nearby target object, to provide guidance for maneuvering the vehicle. In the original intended application, the autonomous vehicle would be a spacecraft and the nearby object would be a small astronomical body (typically, a comet or asteroid) to be explored by the spacecraft. The algorithm could also be used on Earth in analogous applications -- for example, for guiding underwater robots near such objects of interest as sunken ships, mineral deposits, or submerged mines. It is assumed that the robot would be equipped with a vision system that would include one or more electronic cameras, image-digitizing circuitry, and an image-data-processing computer that would generate feature-recognition data products.

  2. Comparison of Nimbus-7 SMMR and GOES-1 VISSR Atmospheric Liquid Water Content.

    NASA Astrophysics Data System (ADS)

    Lojou, Jean-Yves; Frouin, Robert; Bernard, René

    1991-02-01

    Vertically integrated atmospheric liquid water content derived from Nimbus-7 Scanning Multichannel Microwave Radiometer (SMMR) brightness temperatures and from GOES-1 Visible and Infrared Spin-Scan Radiometer (VISSR) radiances in the visible are compared over the Indian Ocean during MONEX (monsoon experiment). In the retrieval procedure, Wilheit and Chang's algorithm and Stephens' parameterization schemes are applied to the SMMR and VISSR data, respectively. The results indicate that in the 0-100 mg cm⁻² range of liquid water content considered, the correlation coefficient between the two types of estimates is 0.83 (0.81-0.85 at the 99 percent confidence level). The Wilheit and Chang algorithm, however, yields values lower than those obtained with Stephens' schemes by 24.5 mg cm⁻² on average, and occasionally the SMMR-based values are negative. Alternative algorithms are proposed for use with SMMR data, which eliminate the bias, augment the correlation coefficient, and reduce the rms difference. These algorithms include the Wilheit and Chang formula with modified coefficients (multilinear regression), the Wilheit and Chang formula with the same coefficients but different equivalent atmospheric temperatures for each channel (temperature bias adjustment), and a second-order polynomial in brightness temperatures at 18, 21, and 37 GHz (polynomial development). When applied to a dataset excluded from the regression dataset, the multilinear regression algorithm provides the best results, namely a 0.91 correlation coefficient, a 5.2 mg cm⁻² (residual) difference, and a 2.9 mg cm⁻² bias. Simply shifting the liquid water content predicted by the Wilheit and Chang algorithm does not yield comparison statistics as good, indicating that the occasional negative values are not due only to a bias. 
The more accurate SMMR-derived liquid water content allows one to better evaluate cloud transmittance in the solar spectrum, at least in the area and during the period analyzed. Combining this cloud transmittance with a clear-sky model would provide ocean surface insolation estimates from SMMR data alone.
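The difference between "simply shifting" and regression recalibration can be illustrated with a toy one-predictor fit; the paper's multilinear regression runs over several brightness-temperature channels, so this single-variable version is only an illustration. Fitting both slope and intercept removes a scale error as well as a bias, which a constant shift cannot.

```python
def linfit(x, y):
    """Ordinary least squares for y = a + b*x (toy stand-in for the
    paper's multi-channel regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Illustrative numbers only: a "reference" retrieval vs. a biased,
# mis-scaled estimate of the same quantity.
ref = [10.0, 20.0, 30.0, 40.0]
est = [0.0, 5.0, 10.0, 15.0]              # bias AND wrong slope
a, b = linfit(est, ref)
corrected = [a + b * e for e in est]      # regression fixes both errors
shifted = [e + (sum(ref) - sum(est)) / len(ref) for e in est]  # shift fixes bias only
```

Here `corrected` reproduces the reference exactly, while `shifted` still carries the slope error, mirroring why the shifted Wilheit and Chang estimates compare worse than the multilinear regression.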

  3. Contrast enhancement for in vivo visible reflectance imaging of tissue oxygenation.

    PubMed

    Crane, Nicole J; Schultz, Zachary D; Levin, Ira W

    2007-08-01

    Results are presented illustrating a straightforward algorithm to be used for real-time monitoring of oxygenation levels in blood cells and tissue based on the visible spectrum of hemoglobin. Absorbance images obtained from the visible reflection of white light through separate red and blue bandpass filters recorded by monochrome charge-coupled devices (CCDs) are combined to create enhanced images that suggest a quantitative correlation between the degree of oxygenated and deoxygenated hemoglobin in red blood cells. The filter bandpass regions are chosen specifically to mimic the color response of commercial 3-CCD cameras, representative of detectors with which the operating room laparoscopic tower systems are equipped. Adaptation of this filter approach is demonstrated for laparoscopic donor nephrectomies in which images are analyzed in terms of real-time in vivo monitoring of tissue oxygenation.
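The abstract does not give the exact combination rule for the two filtered images, so the sketch below shows one plausible form: convert the red- and blue-filtered reflectance images to absorbance (A = -log10 R) and take their per-pixel difference as an oxygenation index. The difference operation and the clamping floor are assumptions, not the published algorithm.

```python
import math

def absorbance(reflectance):
    """Per-pixel absorbance A = -log10(R); a small floor avoids log(0)."""
    return [[-math.log10(max(p, 1e-6)) for p in row] for row in reflectance]

def oxygenation_index(red, blue):
    """Hypothetical index: difference of red- and blue-filter absorbances."""
    return [[r - b for r, b in zip(rr, rb)]
            for rr, rb in zip(absorbance(red), absorbance(blue))]
```

Because the two CCD images are combined pixel-by-pixel with a handful of arithmetic operations, a rule of this shape is cheap enough for the real-time laparoscopic monitoring the abstract describes.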

  4. A Proposed Methodology to Classify Frontier Capital Markets

    DTIC Science & Technology

    2011-07-31

    The project involves basic machine learning: the algorithm consists of a binary classifier mechanism that combines three methods, including k-Nearest Neighbors (kNN) and ensemble classification techniques, for capital market classification based on capital flows and trading architecture.

  5. Improved visibility graph fractality with application for the diagnosis of Autism Spectrum Disorder

    NASA Astrophysics Data System (ADS)

    Ahmadlou, Mehran; Adeli, Hojjat; Adeli, Amir

    2012-10-01

    Recently, the visibility graph (VG) algorithm was proposed for mapping a time series to a graph in order to study the complexity and fractality of the time series through investigation of the complexity of its graph. The visibility graph algorithm converts a fractal time series into a scale-free graph. VG has been used for the investigation of fractality in the dynamic behavior of both artificial and natural complex systems. However, the robustness and performance of the power of scale-freeness of VG (PSVG) as a method for measuring fractality have not been investigated. Since noise is unavoidable in real-life time series, the robustness of a fractality measure is of paramount importance. To improve the accuracy and robustness to noise of PSVG for measuring the fractality of biological time series, an improved PSVG is presented in this paper. The proposed method is evaluated using two examples: a synthetic benchmark time series and a complicated real-life electroencephalogram (EEG)-based diagnostic problem, that is, distinguishing autistic children from non-autistic children. It is shown that the proposed improved PSVG is less sensitive to noise and therefore more robust than PSVG. Further, it is shown that using the improved PSVG in the wavelet-chaos neural network model of Adeli and co-workers, in place of the Katz fractal dimension, results in a more accurate diagnosis of autism, a complicated neurological and psychiatric disorder.
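The underlying natural visibility graph construction can be sketched directly: samples i and j are connected whenever every intermediate sample lies strictly below the straight line joining them. PSVG is then obtained by fitting a power law to the degree distribution of this graph; the fitting step is omitted here.

```python
def visibility_graph(series):
    """Natural visibility graph: connect samples (i, j) if every sample
    between them lies strictly below the line joining them."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < series[j]
                   + (series[i] - series[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

Adjacent samples are always mutually visible, so every VG contains the chain 0-1-...-(n-1); fractality shows up in the tail of the degree distribution, which is what PSVG quantifies.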

  6. Three dimensional indoor positioning based on visible light with Gaussian mixture sigma-point particle filter technique

    NASA Astrophysics Data System (ADS)

    Gu, Wenjun; Zhang, Weizhi; Wang, Jin; Amini Kashani, M. R.; Kavehrad, Mohsen

    2015-01-01

    Over the past decade, location based services (LBS) have found wide application in indoor environments, such as large shopping malls, hospitals, warehouses and airports. Current technologies provide a wide choice of available solutions, including radio-frequency identification (RFID), ultra-wideband (UWB), wireless local area network (WLAN) and Bluetooth. With the rapid development of light-emitting-diode (LED) technology, visible light communications (VLC) also bring a practical approach to LBS. As visible light has better immunity against multipath effects than radio waves, higher positioning accuracy is achieved. LEDs are utilized both for illumination and for positioning, realizing relatively lower infrastructure cost. In this paper, an indoor positioning system using VLC is proposed, with LEDs as transmitters and photodiodes as receivers. The estimation algorithm is based on received-signal-strength (RSS) information collected from the photodiodes and a trilateration technique. By appropriately making use of the characteristics of receiver movements and the properties of trilateration, estimation of three-dimensional (3-D) coordinates is attained. Filtering is applied to give the algorithm tracking capability, and a higher accuracy is reached compared with raw estimates. A Gaussian mixture Sigma-point particle filter (GM-SPPF) is proposed for this 3-D system, which introduces the notion of a Gaussian Mixture Model (GMM). The number of particles in the filter is reduced by approximating the probability distribution with Gaussian components.
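The trilateration step can be sketched as follows, in 2-D for brevity and assuming the ranges to three LED anchors have already been estimated from RSS via a channel model (the model itself is not given in the abstract). Subtracting the first range equation from the others linearizes the problem into a small linear system:

```python
def trilaterate(anchors, dists):
    """2-D trilateration from three anchors: subtracting the first range
    equation (x-xi)^2 + (y-yi)^2 = di^2 from the others gives a 2x2
    linear system, solved here by Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = anchors
    d0, d1, d2 = dists
    A = [[2 * (x1 - x0), 2 * (y1 - y0)],
         [2 * (x2 - x0), 2 * (y2 - y0)]]
    b = [d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2,
         d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x = (b[0] * A[1][1] - b[1] * A[0][1]) / det
    y = (A[0][0] * b[1] - A[1][0] * b[0]) / det
    return x, y
```

With noisy ranges and more than three anchors the same linearization becomes an overdetermined least-squares problem, and the raw fixes are then smoothed by the GM-SPPF tracker described above.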

  7. Enhancement tuning and control for high dynamic range images in multi-scale locally adaptive contrast enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Cvetkovic, Sascha D.; Schirris, Johan; de With, Peter H. N.

    2009-01-01

    For real-time imaging in surveillance applications, visibility of details is of primary importance to ensure customer confidence. If we display High-Dynamic-Range (HDR) scenes, whose contrast spans four or more orders of magnitude, on a conventional monitor without additional processing, the results are unacceptable. Compression of the dynamic range is therefore a compulsory part of any high-end video processing chain, because standard monitors are inherently Low-Dynamic-Range (LDR) devices with at most two orders of magnitude of display dynamic range. In real-time camera processing, many complex scenes are improved with local contrast enhancements, bringing details to the best possible visibility. In this paper, we show how a multi-scale high-frequency enhancement scheme, in which gain is a non-linear function of the detail energy, can be used for the dynamic range compression of HDR real-time video camera signals. We also show the connection of our enhancement scheme to the processing performed by the Human Visual System (HVS). Our algorithm simultaneously controls perceived sharpness, ringing ("halo") artifacts (contrast) and noise, resulting in a good balance between visibility of details and non-disturbance of artifacts. The overall quality enhancement, suitable for both HDR and LDR scenes, is based on a careful selection of the filter types for the multi-band decomposition and a detailed analysis of the signal per frequency band.
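A one-dimensional caricature of such a scheme (not the authors' actual multi-band filter bank, and the specific gain law here is an assumption) splits the signal into a local mean and a detail term, then applies a gain that decreases with detail magnitude so that strong edges are boosted less, which is one way to limit halo artifacts:

```python
def enhance(signal, radius=1, max_gain=2.0):
    """Split signal into local mean + detail; boost detail with a gain
    that decays with |detail|, so strong edges get less boost."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        base = sum(signal[lo:hi]) / (hi - lo)   # crude low-pass (local mean)
        d = signal[i] - base                    # high-frequency detail
        gain = 1.0 + (max_gain - 1.0) / (1.0 + abs(d))
        out.append(base + gain * d)
    return out
```

A flat signal passes through unchanged, while small details are amplified more strongly (in relative terms) than large edges; the published scheme applies this idea per frequency band with carefully chosen filters and an energy-dependent gain.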

  8. Exact Algorithms for Duplication-Transfer-Loss Reconciliation with Non-Binary Gene Trees.

    PubMed

    Kordi, Misagh; Bansal, Mukul S

    2017-06-01

    Duplication-Transfer-Loss (DTL) reconciliation is a powerful method for studying gene family evolution in the presence of horizontal gene transfer. DTL reconciliation seeks to reconcile gene trees with species trees by postulating speciation, duplication, transfer, and loss events. Efficient algorithms exist for finding optimal DTL reconciliations when the gene tree is binary. In practice, however, gene trees are often non-binary due to uncertainty in the gene tree topologies, and DTL reconciliation with non-binary gene trees is known to be NP-hard. In this paper, we present the first exact algorithms for DTL reconciliation with non-binary gene trees. Specifically, we (i) show that the DTL reconciliation problem for non-binary gene trees is fixed-parameter tractable in the maximum degree of the gene tree, (ii) present an exponential-time, but in-practice efficient, algorithm to track and enumerate all optimal binary resolutions of a non-binary input gene tree, and (iii) apply our algorithms to a large empirical data set of over 4700 gene trees from 100 species to study the impact of gene tree uncertainty on DTL-reconciliation and to demonstrate the applicability and utility of our algorithms. The new techniques and algorithms introduced in this paper will help biologists avoid incorrect evolutionary inferences caused by gene tree uncertainty.

  9. Performance Analysis of Web-Based PPP Services with Different Visibility Conditions

    NASA Astrophysics Data System (ADS)

    Albayrak, M.; Erkaya, H.; Ozludemir, M. T.; Ocalan, T.

    2016-12-01

    GNSS is used effectively for precise positioning in many surveying and geodetic applications today. These systems, including their post-processing tools, continue to grow in number, quality and features, and many different positioning techniques have been developed. Deriving precise positions, however, requires user experience and scientific or commercial software with costly license fees. In recent years, user-friendly, free, web-based online precise point positioning (PPP) services have become important alternatives to such software and are widely used in geodetic applications. The aim of this study is to test the performance of PPP techniques on ground control points with different visibility conditions. Within this framework, static observations were carried out for three hours a day, repeated over six days, at three different ground control points on the YTU Davutpasa Campus. The locations of these stations were selected by taking into account the impact of natural (trees, etc.) and artificial (buildings, etc.) obstacles. In order to compare the GPS observations with PPP performance, the accurate coordinates of the control points were first computed with the relative positioning technique in connection with IGS stations using the Bernese v5.0 software. Afterwards, three different web-based positioning services (CSRS-PPP, magicGNSS, GAPS) were used to analyze the GPS observations via the PPP technique. To compare the results, ITRF2008 datum coordinates at the measurement epoch were used, taking the services' result criteria into consideration. In the coordinate comparison, for the first station, located near a building and possibly subject to multipath effects, horizontal discrepancies vary between 2-14.5 cm while vertical differences are between 3.5-16 cm. 
For the second point located partly in a forestry area, the discrepancies have been obtained as 1.5-8 cm and 2-10 cm for horizontal and vertical components, respectively. For the third point located in an area with no obstacles, 1.5-7 cm horizontal and 1-7 cm vertical differences have been obtained. The results show that the PPP technique could be used effectively in several positioning applications.
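
    The horizontal and vertical discrepancies reported above reduce to simple vector differences once both solutions are expressed in a common local frame. A minimal sketch in Python, assuming coordinates are already given as local east-north-up (ENU) triples in metres; the frame choice and function name are illustrative, not part of the study:

```python
import math

def discrepancies(ref_enu, ppp_enu):
    """Horizontal and vertical discrepancy (in the units of the input,
    e.g. metres) between a reference solution and a PPP solution,
    both given as (east, north, up) tuples in a local frame."""
    de = ppp_enu[0] - ref_enu[0]
    dn = ppp_enu[1] - ref_enu[1]
    du = ppp_enu[2] - ref_enu[2]
    horizontal = math.hypot(de, dn)   # 2-D distance in the E-N plane
    vertical = abs(du)
    return horizontal, vertical

# hypothetical example: PPP solution offset 3 cm east, 4 cm north, 5 cm up
h, v = discrepancies((0.0, 0.0, 0.0), (0.03, 0.04, 0.05))
```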

  10. Tight Analysis of a Collisionless Robot Gathering Algorithm

    DTIC Science & Technology

    2015-09-28

    local-multiplicity detection. In SSS, pages 384-398, Berlin, Heidelberg, 2009. Springer-Verlag. [20] T. Izumi, Y. Katayama, N. Inuzuka, and K. Wada... T. Izumi, M. G. Potop-Butucaru, and S. Tixeuil. Connectivity-preserving scattering of mobile robots with limited visibility. In SSS, pages 319-331

  11. Cotton growth modeling and assessment using UAS visual-band imagery

    USDA-ARS?s Scientific Manuscript database

    This paper explores the potential of using unmanned aircraft system (UAS)-based visible-band images to assess cotton growth. By applying the structure-from-motion algorithm, cotton plant height (ph) and canopy cover (cc) were retrieved from the point cloud-based digital surface models (DSMs) and ort...

  12. Efficient Computation of Atmospheric Flows with Tempest: Development of Next-Generation Climate and Weather Prediction Algorithms at Non-Hydrostatic Scales

    NASA Astrophysics Data System (ADS)

    Guerra, J. E.; Ullrich, P. A.

    2015-12-01

    Tempest is a next-generation global climate and weather simulation platform designed to allow experimentation with numerical methods at very high spatial resolutions. The atmospheric fluid equations are discretized by continuous / discontinuous finite elements in the horizontal and by a staggered nodal finite element method (SNFEM) in the vertical, coupled with implicit/explicit time integration. At global horizontal resolutions below 10km, many important questions remain on optimal techniques for solving the fluid equations. We present results from a suite of meso-scale test cases to validate the performance of the SNFEM applied in the vertical. Internal gravity wave, mountain wave, convective, and Cartesian baroclinic instability tests will be shown at various vertical orders of accuracy and compared with known results.

  13. Summertime Coincident Observations of Ice Water Path in the Visible/Near-IR, Radar, and Microwave Frequencies

    NASA Technical Reports Server (NTRS)

    Pittman, Jasna V.; Robertson, Franklin R.; Atkinson, Robert J.

    2008-01-01

    Accurate representation of the physical and radiative properties of clouds in climate models continues to be a challenge. At present, both remote sensing observations and modeling of microphysical properties of clouds rely heavily on parameterizations or assumptions on particle size distribution (PSD) and cloud phase. In this study, we compare Ice Water Path (IWP), an important physical and radiative property that provides the amount of ice present in a cloud column, using measurements obtained via three different retrieval strategies. The datasets we use in this study include Visible/Near-IR IWP from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument flying aboard the Aqua satellite, Radar-only IWP from the CloudSat instrument operating at 94 GHz, and NOAA/NESDIS operational IWP from the 89 and 157 GHz channels of the Microwave Humidity Sounder (MHS) instrument flying aboard the NOAA-18 satellite. In the Visible/Near-IR, IWP is derived from observations of optical thickness and effective radius. CloudSat IWP is determined from measurements of cloud backscatter and an assumed PSD. MHS IWP retrievals depend on scattering measurements at two different, non-water-absorbing channels, 89 and 157 GHz. In order to compare IWP obtained from these different techniques and collected at different vertical and horizontal resolutions, we examine summertime cases in the tropics (30°S-30°N) when all three satellites are within 4 minutes of each other (approximately 1500 km). All measurements are then gridded to a common 15 km x 15 km box determined by MHS. In a grid box comparison, we find CloudSat to report the highest IWP, followed by MODIS and then MHS. In a statistical comparison, probability density distributions show MHS with the highest frequencies at IWP of 100-1000 g/m² and CloudSat with the longest tail, reporting IWP of several thousand g/m².
For IWP greater than 30 g/m², MODIS is consistently higher than CloudSat, and it is higher at the lower IWPs but lower at the higher IWPs that overlap with MHS. Some of these differences can be attributed to the limitations of the measuring techniques themselves, but some can result from the assumptions made in the algorithms that generate the IWP product. We investigate this issue by creating categories based on various conditions such as cloud type, precipitation presence, underlying liquid water content, and surface type (land vs. ocean) and by comparing the performance of the IWP products under each condition.

  14. 3-D Inhomogeneous Radiative Transfer Model using a Planar-stratified Forward RT Model and Horizontal Perturbation Series

    NASA Astrophysics Data System (ADS)

    Zhang, K.; Gasiewski, A. J.

    2017-12-01

    A horizontally inhomogeneous unified microwave radiative transfer (HI-UMRT) model based upon a nonspherical hydrometeor scattering model is being developed at the University of Colorado at Boulder to facilitate forward radiative simulations of 3-dimensionally inhomogeneous clouds in severe weather. The HI-UMRT 3-D analytical solution is based on incorporating a planar-stratified 1-D UMRT algorithm within a horizontally inhomogeneous iterative perturbation scheme. Single-scattering parameters are computed using the Discrete Dipole Scattering (DDSCAT v7.3) program for hundreds of carefully selected nonspherical complex frozen hydrometeors from the NASA/GSFC DDSCAT database. The required analytic factorization symmetry of the transition matrix in the normalized RT equation was proved analytically and validated numerically using the DDSCAT-based full Stokes matrix of randomly oriented hydrometeors. The HI-UMRT model thus inherits the properties of unconditional numerical stability, efficiency, and accuracy from the UMRT algorithm and provides a practical 3-D two-Stokes-parameter radiance solution with Jacobian to be used within microwave retrievals and data assimilation schemes. In addition, a fast forward radar reflectivity operator with Jacobian, based on DDSCAT backscatter efficiencies computed for large hydrometeors, is incorporated into the HI-UMRT model to provide applicability to active radar sensors. The HI-UMRT will be validated at two levels: 1) intercomparison of brightness temperature (Tb) results with those of several 1-D and 3-D RT models, including UMRT, CRTM and Monte Carlo models; and 2) intercomparison of Tb with observed data from combined passive and active spaceborne sensors (e.g., GPM GMI and DPR). A precise expression for the number of 3-D iterations required to achieve an error bound on the perturbation solution will be developed to facilitate numerical verification of the HI-UMRT code's complexity and computational performance.

  15. Determination of Large-Scale Cloud Ice Water Concentration by Combining Surface Radar and Satellite Data in Support of ARM SCM Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Guosheng

    2013-03-15

    Single-column modeling (SCM) is one of the key elements of Atmospheric Radiation Measurement (ARM) research initiatives for the development and testing of various physical parameterizations to be used in general circulation models (GCMs). The data required for use with an SCM include observed vertical profiles of temperature, water vapor, and condensed water, as well as the large-scale vertical motion and tendencies of temperature, water vapor, and condensed water due to horizontal advection. Surface-based measurements operated at ARM sites and upper-air sounding networks supply most of the required variables for model inputs, but do not provide the horizontal advection term of condensed water. Since surface cloud radar and microwave radiometer observations at ARM sites are single-point measurements, they can provide the amount of condensed water at the location of observation sites, but not a horizontal distribution of condensed water contents. Consequently, observational data for the large-scale advection tendencies of condensed water have not been available to the ARM cloud modeling community based on surface observations alone. This lack of advection data for water condensate could cause large uncertainties in SCM simulations. Additionally, to evaluate GCM cloud physical parameterizations, we need to compare GCM results with observed cloud water amounts over a scale that is large enough to be comparable to what a GCM grid represents. To this end, the point measurements at ARM surface sites are again not adequate. Therefore, cloud water observations over a large area are needed. The main goal of this project is to retrieve ice water contents over an area of 10 x 10 deg. surrounding the ARM sites by combining surface and satellite observations.
Building on the progress made during previous ARM research, we have conducted retrievals of 3-dimensional ice water content by combining surface radar/radiometer and satellite measurements, and have produced 3-D cloud ice water contents in support of cloud modeling activities. The approach of the study is to expand a (surface) point measurement to a (satellite) area measurement. That is, the study takes advantage of the high-quality cloud measurements (particularly cloud radar and microwave radiometer measurements) at the ARM sites. We use the cloud ice water characteristics derived from the point measurements to guide and constrain a satellite retrieval algorithm, then use the satellite algorithm to derive the 3-D cloud ice water distributions within a 10° (latitude) x 10° (longitude) area. During the research period, we developed, validated and improved our cloud ice water retrievals, and produced and archived on the ARM website, as a PI product, the 3-D cloud ice water contents from combined satellite high-frequency microwave and surface radar observations for the SGP March 2000 IOP and the TWP-ICE 2006 IOP over 10 deg. x 10 deg. areas centered at the ARM SGP central facility and Darwin sites. We also validated the 3-D ice water product against CloudSat data, combined it with visible/infrared cloud ice water retrievals for better results at low ice water conditions, and created a long-term (several years) ice water climatology in the 10 x 10 deg. areas of the ARM SGP and TWP sites, which was then compared with GCMs.

  16. Earth Observations taken by the Expedition 13 crew

    NASA Image and Video Library

    2006-09-06

    ISS013-E-78295 (6 Sept. 2006) --- Haze in the Po River Valley of Italy is featured in this image photographed by an Expedition 13 crewmember onboard the International Space Station. The valley is visible across the horizontal center of the frame, with the floor obscured by what NASA scientists refer to as frequent atmospheric haze, a mixture of industrial pollutants, dust and smoke. The visual texture of such haze is perceptibly different from that of bright white clouds which stretch across the top of the scene and cover part of the Alps. The Po River Valley is Italy's industrial heartland and one of the most industrialized regions on Earth, according to scientists. Northern Italy is in the foreground of this southwesterly view. The partially cloud-covered Alps are at lower right; the Adriatic Sea at lower left. Corsica is under partial cloud cover at center; and Sardinia, almost totally obscured, is to its south. The island of Elba is visible just to the west of Italy. By contrast with haze accumulation along the axis of the valley, the Alps and the Apennines are clearly visible, and Lake Garda can be seen in the foothills of the Alps. Other visible geographic features are the lagoon at Venice north of the Po River delta, and three small lakes north of Rome. The winds on the day this image was taken are mainly from the north, as shown by the flow lines in the haze near Venice. The haze typically flows south down the Adriatic Sea. Visibility in the Mediterranean basin is often reduced by hazes such as these, deriving from different sources in industrialized Europe.

  17. Cloud Forecasting and 3-D Radiative Transfer Model Validation using Citizen-Sourced Imagery

    NASA Astrophysics Data System (ADS)

    Gasiewski, A. J.; Heymsfield, A.; Newman Frey, K.; Davis, R.; Rapp, J.; Bansemer, A.; Coon, T.; Folsom, R.; Pfeufer, N.; Kalloor, J.

    2017-12-01

    Cloud radiative feedback mechanisms are one of the largest sources of uncertainty in global climate models. Variations in local 3-D cloud structure impact the interpretation of NASA CERES and MODIS data for top-of-atmosphere radiation studies over clouds. Much of this uncertainty results from lack of knowledge of cloud vertical and horizontal structure. Surface-based data on 3-D cloud structure from a multi-sensor array of low-latency ground-based cameras can be used to intercompare radiative transfer models based on MODIS and other satellite data with CERES data, improving the 3-D cloud parameterizations. Closely related, forecasting of solar insolation and associated cloud cover on time scales out to 1 hour and with a spatial resolution of 100 meters is valuable for stabilizing power grids with high solar photovoltaic penetration. Cloud-advection-based solar insolation forecasting needs data with the spatial resolution and latency required to predict high-ramp-rate events; obtained from a bottom-up perspective, such data are strongly correlated with cloud-induced fluctuations. The development of grid management practices for improved integration of renewable solar energy thus also benefits from a multi-sensor camera array. The data needs for both 3-D cloud radiation modeling and solar forecasting are being addressed using a network of low-cost, upward-looking visible-light CCD sky cameras positioned at 2 km spacing over an area 30-60 km in size, acquiring imagery at 30-second intervals. Such cameras can be manufactured in quantity and deployed by citizen volunteers at a marginal cost of 200-400 and operated unattended using existing communications infrastructure. A trial phase to understand the potential utility of up-looking multi-sensor visible imagery is underway within this NASA Citizen Science project.
To develop the initial data sets necessary to optimally design a multi-sensor cloud camera array, a team of 100 citizen scientists using self-owned PDA cameras is being organized to collect distributed cloud data sets suitable for MODIS-CERES cloud radiation science and solar forecasting algorithm development. A low-cost and robust sensor design suitable for large-scale fabrication and long-term deployment has been developed during the project prototyping phase.

  18. Pixel-level multisensor image fusion based on matrix completion and robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Zhuozheng; Deller, J. R.; Fleet, Blair D.

    2016-01-01

    Acquired digital images are often corrupted by a lack of camera focus, faulty illumination, or missing data. An algorithm is presented for fusion of multiple corrupted images of a scene using the lifting wavelet transform. The method employs adaptive fusion arithmetic based on matrix completion and self-adaptive regional variance estimation. Characteristics of the wavelet coefficients are used to adaptively select fusion rules. Robust principal component analysis is applied to low-frequency image components, and regional variance estimation is applied to high-frequency components. Experiments reveal that the method is effective for multifocus, visible-light, and infrared image fusion. Compared with traditional algorithms, the new algorithm not only increases the amount of preserved information and clarity but also improves robustness.
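
    The regional-variance rule for the high-frequency components can be sketched in a few lines of NumPy: for each pixel, keep the coefficient from whichever source image is locally "busier". This is a simplified stand-in for the paper's self-adaptive regional variance estimation; the window size and tie-breaking below are illustrative assumptions:

```python
import numpy as np

def regional_variance(img, win=3):
    """Local variance over a win x win window (zero padding at edges)."""
    pad = win // 2
    p = np.pad(img.astype(float), pad)
    h, w = img.shape
    s = np.zeros((h, w))
    s2 = np.zeros((h, w))
    for dy in range(win):          # accumulate window sums of x and x^2
        for dx in range(win):
            patch = p[dy:dy + h, dx:dx + w]
            s += patch
            s2 += patch ** 2
    n = win * win
    return s2 / n - (s / n) ** 2   # E[x^2] - E[x]^2

def fuse_highfreq(a, b, win=3):
    """Keep, pixel by pixel, the high-frequency coefficient from the
    source with the larger regional variance (the locally busier one)."""
    choose_a = regional_variance(a, win) >= regional_variance(b, win)
    return np.where(choose_a, a, b)

# toy example: b carries a bright detail at the centre, a is flat
a = np.zeros((5, 5)); b = np.zeros((5, 5)); b[2, 2] = 9.0
fused = fuse_highfreq(a, b)   # takes b's coefficient near the detail
```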

  19. A fast hidden line algorithm for plotting finite element models

    NASA Technical Reports Server (NTRS)

    Jones, G. K.

    1982-01-01

    Effective plotting of finite element models requires the use of fast hidden line plot techniques that provide interactive response. A high speed hidden line technique was developed to facilitate the plotting of NASTRAN finite element models. Based on testing using 14 different models, the new hidden line algorithm (JONES-D) appears to be very fast: its speed equals that for normal (all lines visible) plotting and when compared to other existing methods it appears to be substantially faster. It also appears to be very reliable: no plot errors were observed using the new method to plot NASTRAN models. The new algorithm was made part of the NPLOT NASTRAN plot package and was used by structural analysts for normal production tasks.

  20. An image-space parallel convolution filtering algorithm based on shadow map

    NASA Astrophysics Data System (ADS)

    Li, Hua; Yang, Huamin; Zhao, Jianping

    2017-07-01

    Shadow mapping is commonly used in real-time rendering. In this paper, we present an accurate and efficient method for generating soft shadows from planar area lights. The method first generates a depth map from the light's view and analyzes the depth-discontinuity areas as well as the shadow boundaries. These areas are then encoded as binary values in a texture map called the binary light-visibility map, and a GPU-based parallel convolution filtering algorithm smooths the boundaries with a box filter. Experiments show that our algorithm is an effective shadow-map-based method that produces perceptually accurate soft shadows in real time, with more detail at shadow boundaries than previous works.
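
    The effect of the box filter on the binary light-visibility map can be illustrated on the CPU: each output pixel is the local average of the binary visibility values, which is exactly the per-pixel work a GPU thread would perform in parallel. A minimal sketch; the filter radius and edge handling are illustrative assumptions:

```python
import numpy as np

def box_filter(mask, radius=1):
    """Smooth a binary light-visibility map with a (2r+1)x(2r+1) box
    filter, giving fractional visibility in [0, 1] at shadow boundaries."""
    pad = radius
    p = np.pad(mask.astype(float), pad, mode='edge')
    h, w = mask.shape
    out = np.zeros((h, w))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += p[pad + dy:pad + dy + h, pad + dx:pad + dx + w]
    return out / (2 * radius + 1) ** 2

# hard shadow edge: left half lit (1), right half shadowed (0)
vis = np.hstack([np.ones((4, 4)), np.zeros((4, 4))])
soft = box_filter(vis, radius=1)   # boundary columns become fractional
```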

  1. Fast and accurate face recognition based on image compression

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Blasch, Erik

    2017-05-01

    Image compression is desired for many image-related applications, especially network-based applications with bandwidth and storage constraints. Typical reports from the face recognition community concentrate on the maximal compression rate that does not decrease recognition accuracy. In general, wavelet-based face recognition methods such as EBGM (elastic bunch graph matching) and FPB (face pattern byte) achieve high performance but run slowly due to their high computational demands, while the PCA (Principal Component Analysis) and LDA (Linear Discriminant Analysis) algorithms run fast but perform poorly in face recognition. In this paper, we propose a novel face recognition method based on a standard image compression algorithm, termed compression-based (CPB) face recognition. First, all gallery images are compressed by the selected compression algorithm. Second, a mixed image is formed from the probe and a gallery image and then compressed. Third, a composite compression ratio (CCR) is computed from three compression ratios calculated from the probe, gallery, and mixed images. Finally, the CCR values are compared, and the largest CCR corresponds to the matched face. The time cost of each face matching is about the time of compressing the mixed face image. We tested the proposed CPB method on the "ASUMSS face database" (visible and thermal images) from 105 subjects. The face recognition accuracy with visible images is 94.76% when using JPEG compression. On the same face dataset, the accuracy of the FPB algorithm was reported as 91.43%. JPEG-compression-based (JPEG-CPB) face recognition is standard and fast, and may be integrated into a real-time imaging device.
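
    The core idea, matching a probe by how well it compresses together with each gallery image, can be sketched with any lossless compressor. A minimal illustration using zlib in place of JPEG; the CCR formula below is a simple stand-in of our own and may differ from the paper's exact definition:

```python
import zlib

def csize(data: bytes) -> int:
    """Compressed size in bytes (zlib at maximum compression level)."""
    return len(zlib.compress(data, 9))

def ccr(probe: bytes, gallery: bytes) -> float:
    """Composite compression ratio: how much better the concatenated
    (mixed) image compresses than the two images separately.  Similar
    images share structure, so the mixed stream compresses further and
    the score rises.  (Illustrative formula, not the paper's.)"""
    cp, cg = csize(probe), csize(gallery)
    cm = csize(probe + gallery)
    return (cp + cg) / cm

def best_match(probe, gallery_set):
    """Return the gallery key with the largest CCR against the probe."""
    return max(gallery_set, key=lambda k: ccr(probe, gallery_set[k]))

# toy "images": the probe shares its byte pattern with gallery entry A
gallery = {"A": b"abcabcabc" * 50, "B": b"xyzxyzxyz" * 50}
probe = b"abcabcabc" * 50
```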

  2. Mt Pamola, the Electromagnetic Field, EMF, Thunderbird, Mothman and Environmental Monitoring Signals Via the Southern Constellation Phoenix As Detectable In Potato Cave, Acton, MA.

    NASA Astrophysics Data System (ADS)

    Pecora, Andrea S.; Pawa Matagamon, Sagamo

    2004-03-01

    Just below the peak of Mt Pamola in ME, at the juncture with the Knife Edge, downwardly arcing segments of Earth's EMF are manifested by a faint lotus-blossom-blue, neon-like glow at 3 pm on some sunny afternoons. Similarly hued glows, with horizontal but variable-arced segmented trajectories, are somewhat periodically detectable under certain conditions in chambers at Acton, MA. These phenomena curiously have a filled-in profile that precisely matches the outline of the southern constellation Phoenix, which is never visible above the nighttime horizon locally. The stick-figure representation of the constellation Canis Major can also be detected in a chamber at America's Stonehenge, two hours before it has risen, at certain times. The sequence of phenomena visible at Acton correctly correlates with eclipses and other alignments of our solar system. Phoenix, a.k.a. Thunderbird and Mothman, is detectable elsewhere in MA.

  3. EVA 4 activity on Flight Day 7 to service the Hubble Space Telescope

    NASA Image and Video Library

    1997-02-17

    STS082-711-067 (11-21 Feb. 1997) --- Astronaut Gregory J. Harbaugh, mission specialist, floats horizontally in the cargo bay of the Earth-orbiting Space Shuttle Discovery, backdropped against its giant temporary passenger, the Hubble Space Telescope (HST). Harbaugh, sharing this space walking activity with astronaut Joseph R. Tanner (out of frame), is actually recognizable through his helmet visor in the 70mm frame. He is near the Second Axial Carrier (SAC), Axial Scientific Instrument Protection Enclosure (ASIPE). STS-82 marked the first flight of the exit airlock, partially visible at bottom edge of photo.

  4. Realizing structural color generation with aluminum plasmonic V-groove metasurfaces

    DOE PAGES

    Wang, Wei; Rosenmann, Daniel; Czaplewski, David A.; ...

    2017-08-14

    The structural color printing based on all-aluminum plasmonic V-groove metasurfaces is demonstrated under both bright field and dark field illumination conditions. A broad visible color range is realized with the plasmonic V-groove arrays etched on aluminum surface by simply varying the groove depth while keeping the groove period as a constant. Polarization dependent structural color printing is further achieved with interlaced V-groove arrays along both the horizontal and vertical directions. Furthermore, these results pave the way towards the use of all-aluminum structural color printing platform for many practical applications such as security marking and information storage.

  5. Mars Surface near Viking Lander 1 Footpad

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image, which has been flipped horizontally, was taken by Viking Lander 1 on August 1, 1976, 12 sols after landing. Much like images that have returned from Phoenix, the soil beneath Viking 1 has been exposed due to exhaust from thruster engines during descent. This is visible to the right of the struts of Viking's surface-sampler arm housing, seen on the left.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  6. Numerical simulation of the interaction of transport, diffusion and chemical reactions in an urban plume

    NASA Technical Reports Server (NTRS)

    Vogel, Bernhard; Vogel, Heike; Fiedler, Franz

    1994-01-01

    A model system is presented that takes into account the main physical and chemical processes on the regional scale, here in an area of 100 x 100 sq km. The horizontal grid size used is 2 x 2 sq km. For a case study, it is demonstrated how the model system can be used to separate the contributions of the processes of advection, turbulent diffusion, and chemical reactions to the diurnal cycle of ozone. In this way, typical features which are visible in observations and are reproduced by the numerical simulations can be interpreted.

  7. Realizing structural color generation with aluminum plasmonic V-groove metasurfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Wei; Rosenmann, Daniel; Czaplewski, David A.

    The structural color printing based on all-aluminum plasmonic V-groove metasurfaces is demonstrated under both bright field and dark field illumination conditions. A broad visible color range is realized with the plasmonic V-groove arrays etched on aluminum surface by simply varying the groove depth while keeping the groove period as a constant. Polarization dependent structural color printing is further achieved with interlaced V-groove arrays along both the horizontal and vertical directions. Furthermore, these results pave the way towards the use of all-aluminum structural color printing platform for many practical applications such as security marking and information storage.

  8. Validating precision estimates in horizontal wind measurements from a Doppler lidar

    DOE PAGES

    Newsom, Rob K.; Brewer, W. Alan; Wilczak, James M.; ...

    2017-03-30

    Results from a recent field campaign are used to assess the accuracy of wind speed and direction precision estimates produced by a Doppler lidar wind retrieval algorithm. The algorithm, which is based on the traditional velocity-azimuth-display (VAD) technique, estimates the wind speed and direction measurement precision using standard error propagation techniques, assuming the input data (i.e., radial velocities) to be contaminated by random, zero-mean errors. For this study, the lidar was configured to execute an 8-beam plan-position-indicator (PPI) scan once every 12 min during the 6-week deployment period. Several wind retrieval trials were conducted using different schemes for estimating the precision of the radial velocity measurements. The resulting wind speed and direction precision estimates were compared to differences in wind speed and direction between the VAD algorithm and sonic anemometer measurements taken on a nearby 300 m tower.
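
    The VAD retrieval with propagated precision amounts to an ordinary least-squares problem: fit (u, v, w) to the radial velocities of one PPI scan, then map an assumed radial-velocity error through the normal equations. A minimal NumPy sketch; the elevation angle, beam count, and sigma value are illustrative, not the campaign's actual configuration:

```python
import numpy as np

def vad_fit(az_deg, vr, elev_deg=60.0, sigma_vr=0.2):
    """Least-squares VAD fit of (u, v, w) from radial velocities at a
    fixed elevation, with 1-sigma precisions from standard error
    propagation assuming independent radial-velocity errors sigma_vr."""
    az = np.radians(az_deg)
    el = np.radians(elev_deg)
    # model: vr = u sin(az) cos(el) + v cos(az) cos(el) + w sin(el)
    A = np.column_stack([np.sin(az) * np.cos(el),
                         np.cos(az) * np.cos(el),
                         np.full_like(az, np.sin(el))])
    uvw, *_ = np.linalg.lstsq(A, vr, rcond=None)
    cov = sigma_vr**2 * np.linalg.inv(A.T @ A)   # error propagation
    return uvw, np.sqrt(np.diag(cov))

# synthetic 8-beam PPI scan with a known wind (u, v, w) = (3, 4, 0) m/s
az = np.arange(0.0, 360.0, 45.0)
elev = 60.0
azr, elr = np.radians(az), np.radians(elev)
vr = 3.0 * np.sin(azr) * np.cos(elr) + 4.0 * np.cos(azr) * np.cos(elr)
uvw, prec = vad_fit(az, vr, elev_deg=elev)   # recovers (3, 4, 0)
```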

  9. ID card number detection algorithm based on convolutional neural network

    NASA Astrophysics Data System (ADS)

    Zhu, Jian; Ma, Hanjie; Feng, Jie; Dai, Leiyan

    2018-04-01

    In this paper, a new detection algorithm based on a convolutional neural network is presented to enable fast and convenient ID information extraction in multiple scenarios. The algorithm uses a mobile device running the Android operating system to locate and extract the ID number. It exploits the distinctive color distribution of the ID card to select an appropriate channel component; applies threshold segmentation, noise removal, and morphological processing to binarize the image; and uses image rotation with the projection method for horizontal correction when the image is tilted. Finally, single characters are extracted by the projection method and recognized using the convolutional neural network. Tests show that processing a single ID number image, from extraction to recognition, takes about 80 ms with an accuracy of about 99%, so the method can be applied in real production and living environments.
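
    After binarization and horizontal correction, the projection-method segmentation step amounts to scanning the vertical ink profile for blank columns. A minimal sketch of that step; the toy array and the zero-gap criterion are illustrative assumptions:

```python
import numpy as np

def split_digits(binary_row):
    """Segment single characters from a binarized ID-number strip by
    vertical projection: columns with no ink separate adjacent
    characters.  Returns (start, end) column spans, end exclusive."""
    profile = binary_row.sum(axis=0)          # ink count per column
    ink = profile > 0
    spans, start = [], None
    for x, on in enumerate(ink):
        if on and start is None:              # character begins
            start = x
        elif not on and start is not None:    # blank column: character ends
            spans.append((start, x))
            start = None
    if start is not None:                     # character runs to the edge
        spans.append((start, len(ink)))
    return spans

# toy strip: two "characters" of ink separated by one blank column
strip = np.array([[1, 1, 0, 1, 0],
                  [1, 0, 0, 1, 1]])
```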

  10. A New Spin for Understanding the Peculiar Horizontal Branch Morphology of the Galactic Globular Clusters NGC 6388 and NGC 6441

    NASA Technical Reports Server (NTRS)

    Busso, G.; Piotto, G.; Cassisi, S.; Romaniello, M.; Castelli, F.; Catelan, M.; Djorgovski, S. G.; King, I. R.; Landsman, W. B.; Recio-Blanco, A.; et al.

    2006-01-01

    In this paper we present multiband optical and UV Hubble Space Telescope photometry of the two Galactic globular clusters NGC 6388 and NGC 6441. Aims. We investigate the properties of their anomalous horizontal branches (HB) in different photometric planes in order to shed light on the nature of the physical mechanism(s) responsible for the existence of an extended HB blue tail, and of a slope in the HB, visible in all the color-magnitude diagrams. Methods. New photometric data have been collected and carefully reduced. Empirical data have been compared with updated stellar models of low-mass, metal-rich, He-burning structures, transformed to the observational plane with appropriate atmosphere models. Results. We have obtained the first UV color-magnitude diagrams for NGC 6388 and NGC 6441. These diagrams confirm previous results, obtained in optical bands, about the presence of a sizeable stellar population of extremely hot horizontal branch stars. At least in NGC 6388, we find a clear indication that at the hot end of the horizontal branch the distribution of stars forms a hook-like feature, closely resembling those observed in NGC 2808 and ω Centauri. We briefly review the theoretical scenarios which have been suggested for interpreting this observational feature. We also investigate the tilt in the horizontal branch morphology, and provide further evidence supporting early suggestions that this feature cannot be interpreted as an effect of differential reddening or radiative levitation, though these effects contribute to the anomaly. We demonstrate that a possible solution of the puzzle is to assume that a small fraction (approx. 13% in NGC 6388 and approx. 8% in NGC 6441) of the stellar population in the two clusters is strongly helium enriched (Y approx. 0.40 in NGC 6388 and Y approx. 0.35 in NGC 6441). This solution necessarily implies the presence of a double generation of stars in the two clusters.

  11. Phase shifting diffraction interferometer

    DOEpatents

    Sommargren, Gary E.

    1996-01-01

    An interferometer which has the capability of measuring optical elements and systems with an accuracy of λ/1000, where λ is the wavelength of visible light. Whereas current interferometers employ a reference surface, which inherently limits the accuracy of the measurement to about λ/50, this interferometer uses an essentially perfect spherical reference wavefront generated by the fundamental process of diffraction. This interferometer is adjustable to give unity fringe visibility, which maximizes the signal-to-noise ratio, and has the means to introduce a controlled, prescribed relative phase shift between the reference wavefront and the wavefront from the optics under test, which permits analysis of the interference fringe pattern using standard phase extraction algorithms.
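
    A standard phase-extraction algorithm of the kind the patent refers to is the four-step method: with the prescribed relative phase shift stepped through 0°, 90°, 180°, and 270°, the test phase at each pixel follows from a single arctangent. A minimal sketch; the four-step variant is one common choice among several:

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase-shifting algorithm: from four intensity samples
    taken with the reference phase shifted by 0, 90, 180 and 270
    degrees, recover the test-wavefront phase at that pixel:
        I_k = a + b*cos(phi + k*pi/2)  ->  phi = atan2(I4 - I2, I1 - I3)
    """
    return math.atan2(i4 - i2, i1 - i3)

# synthetic fringe samples with known phase phi = 0.7 rad
a, b, phi = 2.0, 1.0, 0.7
samples = [a + b * math.cos(phi + k * math.pi / 2) for k in range(4)]
recovered = four_step_phase(*samples)   # = 0.7
```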

  12. Phase shifting diffraction interferometer

    DOEpatents

    Sommargren, G.E.

    1996-08-29

    An interferometer which has the capability of measuring optical elements and systems with an accuracy of λ/1000, where λ is the wavelength of visible light. Whereas current interferometers employ a reference surface, which inherently limits the accuracy of the measurement to about λ/50, this interferometer uses an essentially perfect spherical reference wavefront generated by the fundamental process of diffraction. This interferometer is adjustable to give unity fringe visibility, which maximizes the signal-to-noise ratio, and has the means to introduce a controlled, prescribed relative phase shift between the reference wavefront and the wavefront from the optics under test, which permits analysis of the interference fringe pattern using standard phase extraction algorithms. 8 figs.

  13. Characterization of macropore structure of Malan loess in NW China based on 3D pipe models constructed by using computed tomography technology

    NASA Astrophysics Data System (ADS)

    Li, Yanrong; He, Shengdi; Deng, Xiaohong; Xu, Yongxin

    2018-04-01

    Malan loess is a grayish yellow or brownish yellow, clastic, highly porous and brittle late Quaternary sediment formed by the accumulation of windblown dust. The present-day pore structure of Malan loess is crucial for understanding the historical loessification process, loess strength, and mechanical behavior. This study employed a modern computed tomography (CT) device to scan Malan loess samples obtained from the eastern part of the Loess Plateau of China. A sophisticated and efficient workflow for processing the CT images and constructing 3D pore models was established by selecting and programming relevant mathematical algorithms in MATLAB, such as the maximum entropy method, the medial axis method, and a node recognition algorithm. Individual pipes within the Malan loess were identified and constructed by partitioning and recombining links in the 3D pore model. The macropore structure of Malan loess was then depicted using quantitative parameters. The parameters derived from the 2D CT images included the equivalent radius, length and aspect ratio of pores, porosity, and pore distribution entropy, whereas those derived from the constructed 3D structure models included porosity, coordination number, node density, and pipe radius, length, length density, dip angle, and dip direction. The analysis of these parameters revealed that Malan loess is a strongly anisotropic geomaterial with a dense and complex network of pores and pipes. The pores seen in horizontal images, perpendicular to the vertical direction, were round, relatively uniform in shape and size, and evenly distributed, whereas the pores seen in vertical images varied in shape and size and were distributed in clusters. The pores showed good connectivity in the vertical direction, forming vertically aligned pipes, but displayed weak connectivity in the horizontal directions. The pipes in the vertical direction were thick, long, and straight compared with those in the horizontal directions.
These results are in good agreement with both numerical simulations and laboratory permeability tests, which indicate that Malan loess is more permeable in the vertical direction than in the horizontal directions.

  14. A joint precoding scheme for indoor downlink multi-user MIMO VLC systems

    NASA Astrophysics Data System (ADS)

    Zhao, Qiong; Fan, Yangyu; Kang, Bochao

    2017-11-01

    In this study, we aim to improve the system performance and reduce the implementation complexity of precoding schemes for visible light communication (VLC) systems. By incorporating the power-method algorithm and the block diagonalization (BD) algorithm, we propose a joint precoding scheme for indoor downlink multi-user multiple-input multiple-output (MU-MIMO) VLC systems. In this scheme, the BD algorithm is first applied to eliminate the co-channel interference (CCI) among users. Second, the power-method algorithm is used to search for the precoding weight of each user based on the criterion of maximizing the signal-to-interference-plus-noise ratio (SINR). Finally, the optical power restrictions of VLC systems are taken into account to constrain the precoding weight matrix. Comprehensive computer simulations in two scenarios indicate that the proposed scheme consistently achieves better bit error rate (BER) performance and lower computational complexity than the traditional scheme.
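The interference-elimination step of block diagonalization can be sketched in a few lines: each user's precoder is drawn from the null space of the other users' stacked channel matrices, so every other user sees zero co-channel interference. This is a generic NumPy sketch, not the authors' implementation; the matrix dimensions and the rank tolerance are assumptions, and the power-method SINR search and optical power constraints are not reproduced.

```python
import numpy as np

def bd_precoders(H_list):
    """Block diagonalization: for each user k, pick a precoder whose
    columns span the null space of the other users' stacked channels,
    so that H_j @ W_k == 0 for every j != k (CCI removed)."""
    precoders = []
    for k in range(len(H_list)):
        # Stack all channel matrices except user k's.
        H_bar = np.vstack([H for j, H in enumerate(H_list) if j != k])
        _, s, Vh = np.linalg.svd(H_bar)               # full SVD
        rank = int(np.count_nonzero(s > 1e-10))
        null_dim = Vh.shape[0] - rank                 # dimension of null space
        # Rows of Vh beyond the rank span the null space of H_bar.
        precoders.append(Vh[-null_dim:].conj().T if null_dim else None)
    return precoders

# Two users, 2 receive elements each, 6 transmit LEDs (illustrative sizes).
np.random.seed(0)
H_list = [np.random.randn(2, 6), np.random.randn(2, 6)]
W = bd_precoders(H_list)
```

After this step, user 1's channel applied to user 0's precoder (and vice versa) is numerically zero, which is exactly the CCI-elimination property the abstract describes.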

  15. An FPGA-based heterogeneous image fusion system design method

    NASA Astrophysics Data System (ADS)

    Song, Le; Lin, Yu-chi; Chen, Yan-hua; Zhao, Mei-rong

    2011-08-01

    Taking advantage of the FPGA's low cost and compact structure, an FPGA-based heterogeneous image fusion platform is established in this study. Altera's Cyclone IV series FPGA is adopted as the core processor of the platform, and a visible-light CCD camera and an infrared thermal imager are used as the image-capturing devices to obtain dual-channel heterogeneous video images. Tailor-made image fusion algorithms such as gray-scale weighted averaging, maximum selection, and minimum selection are analyzed and compared. VHDL and the synchronous design method are utilized to produce a reliable RTL-level description. Altera's Quartus II 9.0 software is applied to simulate and implement the algorithm modules. Contrast experiments with the various fusion algorithms show that favorable heterogeneous image fusion quality can be obtained with the proposed system. The applicable range of the different fusion algorithms is also discussed.

  16. Multisensor satellite data integration for sea surface wind speed and direction determination

    NASA Technical Reports Server (NTRS)

    Glackin, D. L.; Pihos, G. G.; Wheelock, S. L.

    1984-01-01

    Techniques to integrate meteorological data from various satellite sensors to yield a global measure of sea surface wind speed and direction for input to the Navy's operational weather forecast models were investigated. The sensors considered, either already launched or planned, are the GOES visible and infrared imaging sensor, the Nimbus-7 SMMR, and the DMSP SSM/I instrument. An algorithm was developed for extrapolating to the sea surface the wind directions derived from successive GOES cloud images. This wind veering algorithm is relatively simple, accounts for the major physical variables, and appears to be the best solution obtainable with existing data. An algorithm for interpolating the scattered observations to a common geographical grid was implemented. The algorithm is based on a combination of inverse distance weighting and trend surface fitting, and is suited to combining wind data from disparate sources.
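The inverse-distance-weighting half of such a gridding scheme can be illustrated with a minimal sketch. This is a generic textbook formulation in plain Python, not the paper's implementation; the sample format and the `power=2` exponent are illustrative assumptions, and the trend-surface-fitting component is not reproduced.

```python
import math

def idw_interpolate(samples, grid_point, power=2):
    """Inverse-distance-weighted estimate at grid_point from scattered
    (x, y, value) observations: nearer samples get larger weights."""
    num = den = 0.0
    for x, y, v in samples:
        d = math.hypot(grid_point[0] - x, grid_point[1] - y)
        if d == 0.0:
            return v  # exact hit: return the observation itself
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# A grid point midway between two equal-distance samples gets their mean.
obs = [(0.0, 0.0, 10.0), (2.0, 0.0, 20.0)]
mid = idw_interpolate(obs, (1.0, 0.0))  # both weights equal -> 15.0
```

In a real wind-gridding pass this estimate would be computed for every cell of the common geographical grid and then blended with the fitted trend surface.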

  17. A Fusion Algorithm for GFP Image and Phase Contrast Image of Arabidopsis Cell Based on SFL-Contourlet Transform

    PubMed Central

    Feng, Peng; Wang, Jing; Wei, Biao; Mi, Deling

    2013-01-01

    A hybrid multiscale and multilevel image fusion algorithm for the green fluorescent protein (GFP) image and phase contrast image of Arabidopsis cells is proposed in this paper. Combining the intensity-hue-saturation (IHS) transform and the sharp frequency localization Contourlet transform (SFL-CT), the algorithm uses different fusion strategies for the different detail subbands, including a neighborhood consistency measurement (NCM) that can adaptively balance the color background against the gray structure. Two kinds of neighborhood classes based on an empirical model are also taken into consideration. Visual information fidelity (VIF) is introduced as an objective criterion to evaluate the fused image. The experimental results on 117 groups of Arabidopsis cell images from the John Innes Center show that the new algorithm not only preserves the details of the original images well but also improves the visibility of the fused image, demonstrating the superiority of the novel method over traditional ones. PMID:23476716

  18. Limb-Nadir Matching for Tropospheric NO2: A New Algorithm in the SCIAMACHY Operational Level 2 Processor

    NASA Astrophysics Data System (ADS)

    Meringer, Markus; Gretschany, Sergei; Lichtenberg, Gunter; Hilboll, Andreas; Richter, Andreas; Burrows, John P.

    2015-11-01

    SCIAMACHY (SCanning Imaging Absorption spectroMeter for Atmospheric ChartographY) aboard ESA's environmental satellite ENVISAT observed the Earth's atmosphere in limb, nadir, and solar/lunar occultation geometries covering the UV-Visible to NIR spectral range. Limb and nadir geometries were the main operation modes for the retrieval of scientific data. The new version 6 of ESA's level 2 processor now provides for the first time an operational algorithm to combine measurements of these two geometries in order to generate new products. As a first instance the retrieval of tropospheric NO2 has been implemented based on IUP-Bremen's reference algorithm. We will detail the single processing steps performed by the operational limb-nadir matching algorithm and report the results of comparisons with the scientific tropospheric NO2 products of IUP and the Tropospheric Emission Monitoring Internet Service (TEMIS).

  19. Filtering method of star control points for geometric correction of remote sensing image based on RANSAC algorithm

    NASA Astrophysics Data System (ADS)

    Tan, Xiangli; Yang, Jungang; Deng, Xinpu

    2018-04-01

    In the process of geometric correction of remote sensing images, a large number of redundant control points can occasionally result in low correction accuracy. To solve this problem, a control-point filtering algorithm based on RANdom SAmple Consensus (RANSAC) is proposed. The basic idea of the RANSAC algorithm is to estimate the model parameters using the smallest possible data set and then enlarge this set with consistent data points. In this paper, unlike traditional geometric correction methods that use Ground Control Points (GCPs), simulation experiments are carried out to correct remote sensing images using visible stars as control points. In addition, the accuracy of geometric correction without Star Control Point (SCP) optimization is also shown. The experimental results show that the SCP filtering method based on the RANSAC algorithm greatly improves the accuracy of remote sensing image correction.
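The RANSAC idea summarized above — fit a minimal sample, then keep the model with the largest consensus set — can be sketched for a simple line model. This is a generic illustration, not the paper's star-control-point formulation; the 2-point minimal sample, the `tol` residual threshold, and the iteration count are assumptions.

```python
import random

def ransac_line(points, n_iter=200, tol=1.0, seed=0):
    """Fit y = a*x + b by RANSAC: repeatedly fit a minimal 2-point
    sample, score it by its consensus set (points within tol of the
    line), and keep the model with the most inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair: cannot fit y = a*x + b
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Ten collinear points on y = 2x plus two gross outliers.
pts = [(float(x), 2.0 * x) for x in range(10)] + [(3.0, 20.0), (5.0, -7.0)]
model, inliers = ransac_line(pts)
```

In the paper's setting the line model would be replaced by the geometric transformation of the image, and the data points by matched star control points; the filtering step is the same consensus test.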

  20. Detection of Lettuce Discoloration Using Hyperspectral Reflectance Imaging

    PubMed Central

    Mo, Changyeun; Kim, Giyoung; Lim, Jongguk; Kim, Moon S.; Cho, Hyunjeong; Cho, Byoung-Kwan

    2015-01-01

    Rapid visible/near-infrared (VNIR) hyperspectral imaging methods, employing both a single-waveband algorithm and multi-spectral algorithms, were developed to discriminate between sound and discolored lettuce. Reflectance spectra for sound and discolored lettuce surfaces were extracted from hyperspectral reflectance images obtained in the 400–1000 nm wavelength range. The optimal wavebands for discriminating between discolored and sound lettuce surfaces were determined using one-way analysis of variance. Multi-spectral imaging algorithms developed using ratio and subtraction functions achieved classification accuracies above 99.9% for discolored and sound areas on both adaxial and abaxial lettuce surfaces. Ratio imaging (RI) and subtraction imaging (SI) algorithms at wavelengths of 552/701 nm and 557–701 nm, respectively, exhibited better classification performance than all other possible two-waveband combinations. These results suggest that hyperspectral reflectance imaging techniques can potentially be used to discriminate between discolored and sound fresh-cut lettuce. PMID:26610510
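The two-waveband ratio-imaging step reduces to a per-pixel band ratio followed by a threshold. A minimal NumPy sketch follows; the 552/701 nm band pair comes from the abstract, but the threshold value and the guard against division by zero are illustrative assumptions, not the published calibration.

```python
import numpy as np

def ratio_image_classify(band_552, band_701, threshold=1.0):
    """Ratio imaging (RI): flag pixels whose reflectance ratio between
    two wavebands exceeds a threshold (e.g. discolored vs. sound tissue)."""
    ratio = band_552.astype(float) / np.maximum(band_701.astype(float), 1e-6)
    return ratio > threshold

# Two pixels: low ratio (sound) and high ratio (flagged as discolored).
a = np.array([[10, 50]])   # reflectance at 552 nm (hypothetical counts)
b = np.array([[100, 25]])  # reflectance at 701 nm
mask = ratio_image_classify(a, b)
```

The subtraction-imaging (SI) variant is identical except that `band_552 - band_701` replaces the ratio before thresholding.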

  1. Detection of Lettuce Discoloration Using Hyperspectral Reflectance Imaging.

    PubMed

    Mo, Changyeun; Kim, Giyoung; Lim, Jongguk; Kim, Moon S; Cho, Hyunjeong; Cho, Byoung-Kwan

    2015-11-20

    Rapid visible/near-infrared (VNIR) hyperspectral imaging methods, employing both a single-waveband algorithm and multi-spectral algorithms, were developed to discriminate between sound and discolored lettuce. Reflectance spectra for sound and discolored lettuce surfaces were extracted from hyperspectral reflectance images obtained in the 400-1000 nm wavelength range. The optimal wavebands for discriminating between discolored and sound lettuce surfaces were determined using one-way analysis of variance. Multi-spectral imaging algorithms developed using ratio and subtraction functions achieved classification accuracies above 99.9% for discolored and sound areas on both adaxial and abaxial lettuce surfaces. Ratio imaging (RI) and subtraction imaging (SI) algorithms at wavelengths of 552/701 nm and 557-701 nm, respectively, exhibited better classification performance than all other possible two-waveband combinations. These results suggest that hyperspectral reflectance imaging techniques can potentially be used to discriminate between discolored and sound fresh-cut lettuce.

  2. Towards High Resolution Numerical Algorithms for Wave Dominated Physical Phenomena

    DTIC Science & Technology

    2009-01-30

    results are scaled as floating point operations per second, obtained by counting the number of floating point additions and multiplications in the...black horizontal line. Perhaps the most striking feature at first is the fact that the memory bandwidth measured for flux lifting transcends this...theoretical peak performance values. For a suitable CPU-limited workload, this means that a single workstation equipped with multiple GPUs can do work that

  3. Single-footprint retrievals of temperature, water vapor and cloud properties from AIRS

    NASA Astrophysics Data System (ADS)

    Irion, Fredrick W.; Kahn, Brian H.; Schreier, Mathias M.; Fetzer, Eric J.; Fishbein, Evan; Fu, Dejian; Kalmus, Peter; Wilson, R. Chris; Wong, Sun; Yue, Qing

    2018-02-01

    Single-footprint Atmospheric Infrared Sounder spectra are used in an optimal estimation-based algorithm (AIRS-OE) for the simultaneous retrieval of atmospheric temperature, water vapor, surface temperature, cloud-top temperature, effective cloud optical depth, and effective cloud particle radius. In a departure from the currently operational AIRS retrievals (AIRS V6), cloud scattering and absorption are included in the radiative transfer forward model, and AIRS single-footprint thermal infrared data are used directly rather than cloud-cleared spectra (which are calculated using nine adjacent AIRS infrared footprints). Coincident MODIS cloud data are used as the cloud a priori. Using single-footprint spectra improves the horizontal resolution of the AIRS retrieval from ~45 to ~13.5 km at nadir, but as microwave data are not used, the retrieval is not made at altitudes below thick clouds. An outline of the AIRS-OE retrieval procedure and an information content analysis are presented. Initial comparisons of AIRS-OE with AIRS V6 results show increased horizontal detail in the water vapor and relative humidity fields in the free troposphere above the clouds. Initial comparisons of temperature, water vapor, and relative humidity profiles with coincident radiosondes show good agreement. Future improvements to the retrieval algorithm, and to the forward model in particular, are discussed.

  4. Observations and Measurements of Dust Transport from the Patagonia Desert into the South Atlantic Ocean in 2004 and 2005

    NASA Astrophysics Data System (ADS)

    Gasso, S.; Gaiero, D. M.; Villoslada, B.; Liske, E.

    2005-12-01

    The largest continental landmass south of the 40-degree parallel and potentially one of the largest sources of dust into the Southern Ocean (SO) is the Patagonia desert. Most of the estimates of dust outflow and deposition from this region into the South Atlantic Ocean are based on model simulations. However, there are very few measurements available that can corroborate these estimates. Satellite assessments of dust activity offer conflicting views. For example, monthly time series of satellite-derived (e.g. AVHRR and MODIS) aerosol optical depth (AOD) indicate that dust activity is minimal. However, a study with the TOMS Aerosol Index (Prospero et al., 2002) showed that the frequency of dust events was in the range of 7-14 days/month during the years 1978 through 1993. In addition, surface visibility observations along the Patagonian coast confirm that ocean-going dust events do occur during the summer and spring months. These discrepancies indicate fundamental uncertainties regarding the frequency and extent of dust activity in Patagonia. Given that the SO is the largest high-nutrient, low-chlorophyll area in the world and that the flux of nutrient-rich dust has the potential to modify biological activity with possible climatic consequences, it is of interest to better understand how frequent and how intense dust events in the Patagonia region are. We surveyed the reports of dust activity from surface weather stations in the Patagonia region during the period June 2004 to April 2005. These observations were compared with simultaneous MODIS true-color pictures and the corresponding aerosol retrievals. In addition, measurements of vertical and horizontal dust flux were collected by dust samplers at four sites along the coast. The horizontal flux measurements were compared with the same estimates derived from MODIS. According to the true-color pictures, and confirmed by the surface visibility observations, we recorded at least 16 ocean-going dust events. The scale of the events varied from small (single dust plumes along the coast) to large (a dust front extending ~600 km). Most of the large events occurred during the late summer. Due to the presence of sun glint, cloud obstruction, or coastal sediments, the MODIS automatic aerosol algorithm did not derive AODs in many instances and, as a result, many events were not recorded in the MODIS monthly database. Dust sources are numerous, and dust plumes flow out at many places along the coastline (> 1000 km), including some very active sources as far south as Tierra del Fuego Island (54S). The main sources identified are coastal saltbeds, inland deflation hollows, and the receding shores of large lakes. Although some of the major emitting points have been included as sources in dust models, there are some notable exceptions, for example most of the coastal sources. We note, in addition, that the scale and diversity of the different sources pose significant challenges with respect to parameterization in global models of dust dispersion.

  5. Sequential visibility-graph motifs

    NASA Astrophysics Data System (ADS)

    Iacovacci, Jacopo; Lacasa, Lucas

    2016-04-01

    Visibility algorithms transform time series into graphs and encode dynamical information in their topology, paving the way for graph-theoretical time series analysis as well as building a bridge between nonlinear dynamics and network science. In this work we introduce and study the concept of sequential visibility-graph motifs, smaller substructures of n consecutive nodes that appear with characteristic frequencies. We develop a theory to compute in an exact way the motif profiles associated with general classes of deterministic and stochastic dynamics. We find that this simple property is indeed a highly informative and computationally efficient feature capable of distinguishing among different dynamics and robust against noise contamination. We finally confirm that it can be used in practice to perform unsupervised learning, by extracting motif profiles from experimental heart-rate series and being able, accordingly, to disentangle meditative from other relaxation states. Applications of this general theory include the automatic classification and description of physical, biological, and financial time series.
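The horizontal visibility algorithm underlying such motif analysis has a one-line definition: nodes i and j are linked when every intermediate value lies strictly below both endpoints. A brute-force O(n²) sketch in plain Python follows (the strict inequality matches the standard definition; faster stack-based constructions exist but are not shown):

```python
def horizontal_visibility_graph(series):
    """Build the horizontal visibility graph (HVG) of a time series.

    Nodes i and j (i < j) are linked iff every intermediate value is
    strictly smaller than both endpoints: x_k < min(x_i, x_j) for all
    i < k < j. Returns the edge set as (i, j) tuples with i < j.
    """
    n = len(series)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            # Adjacent points always see each other (empty range -> True).
            if all(series[k] < min(series[i], series[j])
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

# x = [3, 1, 2, 4]: node 1 is shadowed, so (1, 3) is not an edge.
edges = horizontal_visibility_graph([3, 1, 2, 4])
```

Sequential motif profiles are then obtained by sliding a window of n consecutive nodes over the graph and counting which induced subgraph each window realizes.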

  6. Infrared and Visible Image Fusion Based on Different Constraints in the Non-Subsampled Shearlet Transform Domain.

    PubMed

    Huang, Yan; Bi, Duyan; Wu, Dongpeng

    2018-04-11

    Many artificial parameters are involved when fusing infrared and visible images. To overcome the lack of detail in the fused image caused by artifacts, a novel fusion algorithm for infrared and visible images based on different constraints in the non-subsampled shearlet transform (NSST) domain is proposed. The NSST decomposes the images into high-frequency and low-frequency bands. After analyzing the characteristics of these bands, the high-frequency bands are fused under a gradient constraint, so that the fused image retains more detail, while the low-frequency bands are fused under an image-saliency constraint, making the targets more salient. Before the inverse NSST, a Nash equilibrium is used to update the coefficients. The fused images and the quantitative results demonstrate that our method is more effective in preserving details and highlighting targets than other state-of-the-art methods.

  7. Decoding mobile-phone image sensor rolling shutter effect for visible light communications

    NASA Astrophysics Data System (ADS)

    Liu, Yang

    2016-01-01

    Optical wireless communication (OWC) using visible light, also known as visible light communication (VLC), has attracted significant attention recently. As traditional OWC and VLC receivers (Rxs) are based on PIN photodiodes or avalanche photodiodes, deploying the complementary metal-oxide-semiconductor (CMOS) image sensor as the VLC Rx is attractive, since nowadays nearly everyone has a smartphone with an embedded CMOS image sensor. However, deploying the CMOS image sensor as the VLC Rx is challenging. In this work, we propose and demonstrate two simple contrast ratio (CR) enhancement schemes to improve the contrast of the rolling shutter pattern, and describe their processing algorithms in turn. The experimental results show that both proposed CR enhancement schemes can significantly mitigate the high-intensity fluctuations of the rolling shutter pattern and improve the bit-error-rate performance.

  8. A fusion algorithm for infrared and visible based on guided filtering and phase congruency in NSST domain

    NASA Astrophysics Data System (ADS)

    Liu, Zhanwen; Feng, Yan; Chen, Hang; Jiao, Licheng

    2017-10-01

    A novel and effective image fusion method is proposed for creating a highly informative and smooth fused image by merging visible and infrared images. Firstly, a two-scale non-subsampled shearlet transform (NSST) is employed to decompose the visible and infrared images into detail layers and one base layer. Then, phase congruency is adopted to extract saliency maps from the detail layers, and guided filtering is used to compute the filtering output of the base layer and the saliency maps. Next, a novel weighted-average technique makes full use of scene consistency for fusion, yielding a coefficient map. Finally, the fused image is acquired by taking the inverse NSST of the fused coefficient map. Experiments show that the proposed approach achieves better performance than other methods in terms of both subjective visual effect and objective assessment.

  9. [Research on the measurement of flue-dust concentration in Vis, IR spectral region].

    PubMed

    Sun, Xiao-gang; Tang, Hong; Yuan, Gui-bin

    2008-10-01

    In the measurement of flue-dust concentration based on the transmission method, the dependent model algorithm was used to invert the flue-dust concentration in the visible, infrared, and combined visible-infrared spectral regions. By analyzing and comparing the accuracy, linearity, and sensitivity of the inverted flue-dust concentration, the optimal spectral region was determined. Meanwhile, the influence of water droplets with different size distributions and volume concentrations was simulated, and a method was proposed that has the advantages of simplicity, rapidity, and suitability for online measurement. Simulation experiments illustrate that the flue-dust concentration can be inverted very well in the visible-infrared spectral region, and that it is feasible to use the ratio of the constrained light extinction method to overcome the influence of water droplets. The inversion results remain satisfactory when 2% stochastic noise is added to the light extinction values.

  10. Infrared and Visible Image Fusion Based on Different Constraints in the Non-Subsampled Shearlet Transform Domain

    PubMed Central

    Huang, Yan; Bi, Duyan; Wu, Dongpeng

    2018-01-01

    Many artificial parameters are involved when fusing infrared and visible images. To overcome the lack of detail in the fused image caused by artifacts, a novel fusion algorithm for infrared and visible images based on different constraints in the non-subsampled shearlet transform (NSST) domain is proposed. The NSST decomposes the images into high-frequency and low-frequency bands. After analyzing the characteristics of these bands, the high-frequency bands are fused under a gradient constraint, so that the fused image retains more detail, while the low-frequency bands are fused under an image-saliency constraint, making the targets more salient. Before the inverse NSST, a Nash equilibrium is used to update the coefficients. The fused images and the quantitative results demonstrate that our method is more effective in preserving details and highlighting targets than other state-of-the-art methods. PMID:29641505

  11. Infrared and visible image fusion method based on saliency detection in sparse domain

    NASA Astrophysics Data System (ADS)

    Liu, C. H.; Qi, Y.; Ding, W. R.

    2017-06-01

    Infrared and visible image fusion is a key problem in the field of multi-sensor image fusion. To better preserve the significant information of the infrared and visible images in the final fused image, the saliency maps of the source images are introduced into the fusion procedure. Firstly, under the framework of the joint sparse representation (JSR) model, the global and local saliency maps of the source images are obtained from the sparse coefficients. Then, a saliency detection model is proposed that combines the global and local saliency maps to generate an integrated saliency map. Finally, a weighted fusion algorithm based on the integrated saliency map is developed to carry out the fusion process. The experimental results show that our method is superior to state-of-the-art methods in terms of several universal quality evaluation indexes as well as visual quality.

  12. Real-time Enhancement, Registration, and Fusion for a Multi-Sensor Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.

    2006-01-01

    Over the last few years NASA Langley Research Center (LaRC) has been developing an Enhanced Vision System (EVS) to aid pilots while flying in poor visibility conditions. The EVS captures imagery using two infrared video cameras. The cameras are placed in an enclosure that is mounted and flown forward-looking underneath the NASA LaRC ARIES 757 aircraft. The data streams from the cameras are processed in real time and displayed on monitors on board the aircraft. With proper processing, the camera system can provide better-than-human-observed imagery, particularly during poor visibility conditions. Achieving this goal, however, requires several stages of processing, including enhancement, registration, and fusion, as well as specialized processing hardware for real-time performance. We use a real-time implementation of the Retinex algorithm for image enhancement, affine transformations for registration, and weighted sums to perform fusion. All of the algorithms are executed on a single TI DM642 digital signal processor (DSP) clocked at 720 MHz. The image processing components were added to the EVS, tested, and demonstrated during flight tests in August and September of 2005. In this paper we briefly discuss the EVS image processing hardware and algorithms, then discuss implementation issues and show examples of the results obtained during flight tests. Keywords: enhanced vision system, image enhancement, retinex, digital signal processing, sensor fusion
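The weighted-sum fusion stage mentioned above reduces to a per-pixel blend of the two registered sensor streams. A minimal NumPy sketch follows; the fixed weight `w` is an illustrative assumption — the flight system's actual weighting, and the Retinex and registration stages that precede fusion, are not reproduced here.

```python
import numpy as np

def fuse_weighted(img_a, img_b, w=0.5):
    """Pixel-wise weighted-sum fusion of two co-registered images:
    out = w * A + (1 - w) * B, computed in float to avoid clipping."""
    return w * img_a.astype(float) + (1.0 - w) * img_b.astype(float)

# Two single-row "images"; an equal weight averages them pixel by pixel.
a = np.array([[0.0, 100.0]])
b = np.array([[100.0, 0.0]])
fused = fuse_weighted(a, b, 0.5)
```

On DSP hardware the same operation is typically done in fixed point, with the weights chosen per sensor pair; the floating-point form above is only the mathematical core.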

  13. NADH-fluorescence scattering correction for absolute concentration determination in a liquid tissue phantom using a novel multispectral magnetic-resonance-imaging-compatible needle probe

    NASA Astrophysics Data System (ADS)

    Braun, Frank; Schalk, Robert; Heintz, Annabell; Feike, Patrick; Firmowski, Sebastian; Beuermann, Thomas; Methner, Frank-Jürgen; Kränzlin, Bettina; Gretz, Norbert; Rädle, Matthias

    2017-07-01

    In this report, a quantitative nicotinamide adenine dinucleotide hydrate (NADH) fluorescence measurement algorithm in a liquid tissue phantom using a fiber-optic needle probe is presented. To determine the absolute concentrations of NADH in this phantom, the fluorescence emission spectra at 465 nm were corrected using diffuse reflectance spectroscopy between 600 nm and 940 nm. The patented autoclavable Nitinol needle probe enables the acquisition of multispectral backscattering measurements of ultraviolet, visible, near-infrared, and fluorescence spectra. As a phantom, a suspension of calcium carbonate (Calcilit) in water with physiological NADH concentrations between 0 mmol l-1 and 2.0 mmol l-1 was used to mimic human tissue. The light scattering characteristics were adjusted to match the backscattering attributes of human skin by modifying the concentration of Calcilit. To correct the scattering effects caused by the sample matrices, an algorithm based on the backscattered remission spectrum was employed to compensate for the influence of multiple scattering along the optical path through the dispersed phase. The monitored backscattered visible light was used to correct the fluorescence spectra and thereby determine the true NADH concentrations at unknown Calcilit concentrations. Despite the simplicity of the presented algorithm, the root-mean-square error of prediction (RMSEP) was 0.093 mmol l-1.

  14. Evaluation of Object Detection Algorithms for Ship Detection in the Visible Spectrum

    DTIC Science & Technology

    2013-12-01

    Kodak KAI-2093 was assumed throughout the model to be the image acquisition sensor. The sensor was assumed to have taken all of the evaluation imagery...www.ShipPhotos.co.uk. [Online]. Available: http://www.shipphotos.co.uk/hull/ [42] Kodak (2007. March 19). Kodak KAI-2093 image sensor. [Online]. Available

  15. Discrimination methods of biological contamination on fresh-cut lettuce based on VNIR and NIR hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Multispectral imaging algorithms were developed using visible-near-infrared (VNIR) and near-infrared (NIR) hyperspectral imaging (HSI) techniques to detect worms on fresh-cut lettuce. The optimal wavebands for detecting worms on fresh-cut lettuce with each type of HSI were investigated using the one-way...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cornacchia, Massimo

    The VISA (Visible to Infrared SASE Amplifier) SASE free electron laser has been successfully operated at the Accelerator Test Facility (ATF) at BNL. High gain and saturation were observed at 840 nm. We describe here the diagnostic system, experimental procedures and data reduction algorithms, as the FEL performance was measured along the length of the undulator. We also discuss selected spectral radiation measurements.

  17. Detection algorithm for cracks on the surface of tomatoes using Multispectral Vis/NIR Reflectance Imagery

    USDA-ARS?s Scientific Manuscript database

    Tomatoes, an important agricultural product in fresh-cut markets, are sometimes a source of foodborne illness, mainly Salmonella spp. Growth cracks on tomatoes can be a pathway for bacteria, so its detection prior to consumption is important for public health. In this study, multispectral Visible/Ne...

  18. Retrieval of NO2 stratospheric profiles from ground-based zenith-sky uv-visible measurements at 60°N

    NASA Astrophysics Data System (ADS)

    Hendrick, F.; van Roozendael, M.; Lambert, J.-C.; Fayt, C.; Hermans, C.; de Mazière, M.

    2003-04-01

    Nitrogen dioxide (NO_2) plays an important role in controlling ozone abundances in the stratosphere, either directly through the NOx (NO+NO_2) catalytic cycle, or indirectly by reaction with the ClO radical to form the reservoir species ClONO_2. In this presentation, NO_2 stratospheric profiles are retrieved from ground-based UV-visible NO_2 slant column abundances measured since 1998 at the complementary NDSC station of Harestua (Norway, 60°N). The retrieval algorithm is based on the Rodgers optimal estimation inversion method and a forward model consisting of the IASB-BIRA stacked box photochemical model PSCBOX coupled to the radiative transfer package UVspec/DISORT. This algorithm has been applied to a set of about 50 sunrises and sunsets for which spatially and temporally coincident NO_2 measurements made by the HALOE (Halogen Occultation Experiment) instrument on board the Upper Atmosphere Research Satellite (UARS) are available. The consistency between the retrieved and HALOE profiles is discussed in terms of the different seasonal conditions investigated: spring with and without chlorine activation, summer, and fall.

  19. Algorithm Science to Operations for the National Polar-orbiting Operational Environmental Satellite System (NPOESS) Visible/Infrared Imager/Radiometer Suite (VIIRS)

    NASA Technical Reports Server (NTRS)

    Duda, James L.; Barth, Suzanna C

    2005-01-01

    The VIIRS sensor provides measurements for 22 Environmental Data Records (EDRs) addressing the atmosphere, ocean surface temperature, ocean color, land parameters, aerosols, imaging for clouds and ice, and more. That is, VIIRS collects visible and infrared radiometric data of the Earth's atmosphere, ocean, and land surfaces. Data types include atmospheric, cloud, Earth radiation budget, land/water and sea surface temperature, ocean color, and low-light imagery. This wide scope of measurements calls for the preparation of a multiplicity of Algorithm Theoretical Basis Documents (ATBDs), and additionally for intermediate products such as the cloud mask. Furthermore, VIIRS interacts with three or more other sensors. This paper addresses selected, crucial elements of the process being used to convert and test an immense volume of maturing and changing science code into the initial operational source code in preparation for the launch of NPP. The integrity of the original science code is maintained and enhanced via baseline comparisons when re-hosted, in addition to multiple planned code performance reviews.

  20. Mineral Potential in India Using Airborne Visible/Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) Data

    NASA Astrophysics Data System (ADS)

    Oommen, T.; Chatterjee, S.

    2017-12-01

    NASA and the Indian Space Research Organization (ISRO) are generating Earth surface feature data using the Airborne Visible/Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) within the 380 to 2500 nm spectral range. This research focuses on utilizing such data to better understand the mineral potential of India and to demonstrate the application of spectral data in rock type discrimination and mapping for mineral exploration using automated mapping techniques. The primary focus area of this research is the Hutti-Maski greenstone belt, located in Karnataka, India. The AVIRIS-NG data were integrated with field-analyzed data (laboratory-scale compositional analysis, mineralogy, and a spectral library) to characterize minerals and rock types. An expert system was developed to produce mineral maps from AVIRIS-NG data automatically. Ground truth data for the study areas were obtained from the existing literature and from collaborators in India. A Bayesian spectral unmixing algorithm was applied to the AVIRIS-NG data for endmember selection. Classification maps of the minerals and rock types were developed using a support vector machine algorithm, and the ground truth data were used to verify the mineral maps.

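    The record's pipeline (Bayesian unmixing for endmember selection, then SVM classification) is not detailed in the abstract. As a much simpler stand-in, the spectral angle mapper (SAM), a standard hyperspectral matching technique, illustrates how a pixel spectrum can be assigned to its nearest endmember; the mineral names and spectra below are hypothetical, and this is not the paper's classifier.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two reflectance spectra; a small angle means
    a similar material, independent of overall brightness."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_pixel(spectrum, endmembers):
    """Assign a pixel spectrum to the endmember with the smallest angle."""
    return min(endmembers, key=lambda name: spectral_angle(spectrum, endmembers[name]))
```

    Because the angle ignores scale, a pixel that is simply a brighter or darker version of an endmember spectrum still maps to the same mineral class.
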
  1. Infrared and visible image fusion using discrete cosine transform and swarm intelligence for surveillance applications

    NASA Astrophysics Data System (ADS)

    Paramanandham, Nirmala; Rajendiran, Kishore

    2018-01-01

    A novel image fusion technique is presented for integrating infrared and visible images. Integrating images from the same or different sensing modalities can deliver information that cannot be obtained by viewing the sensor outputs individually and consecutively. In this paper, a swarm-intelligence-based image fusion technique in the discrete cosine transform (DCT) domain is proposed for surveillance applications, integrating the infrared image with the visible image to generate a single informative fused image. Particle swarm optimization (PSO) is used in the fusion process to obtain the optimized weighting factors, which are then used to fuse the DCT coefficients of the visible and infrared images. The inverse DCT is applied to obtain the initial fused image, and an enhanced fused image is obtained through adaptive histogram equalization for better visual understanding and target detection. The proposed framework is evaluated using quantitative metrics such as standard deviation, spatial frequency, entropy, and mean gradient. The experimental results demonstrate that the proposed algorithm outperforms many other state-of-the-art techniques reported in the literature.

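    The fusion step itself can be sketched as below: transform both images, blend the coefficients with a weighting factor, and invert. In the paper the weight comes out of the PSO search; here it is simply a caller-supplied number, and the orthonormal DCT is built by hand so the sketch needs only numpy.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix: C @ x computes the 1-D DCT of x."""
    k = np.arange(n).reshape(-1, 1)
    i = np.arange(n).reshape(1, -1)
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    C[0, :] = 1.0 / np.sqrt(n)
    return C

def fuse_dct(visible, infrared, w):
    """Fuse two equally sized grayscale images by a convex combination of
    their 2-D DCT coefficients; w plays the role of the PSO-optimised
    weighting factor from the paper (here supplied by the caller)."""
    vis = np.asarray(visible, dtype=float)
    ir = np.asarray(infrared, dtype=float)
    C_r = dct_matrix(vis.shape[0])
    C_c = dct_matrix(vis.shape[1])
    D_vis = C_r @ vis @ C_c.T            # 2-D DCT via separable 1-D transforms
    D_ir = C_r @ ir @ C_c.T
    D_fused = w * D_vis + (1 - w) * D_ir
    return C_r.T @ D_fused @ C_c         # inverse of an orthonormal transform
```

    With w = 1 the fused image reduces to the visible input, which makes an easy sanity check; because the transform is linear, this global blend equals the pixel-domain blend, and the benefit of the DCT domain appears only once weights vary per coefficient block.
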
  2. Near-infrared face recognition utilizing OpenCV software

    NASA Astrophysics Data System (ADS)

    Sellami, Louiza; Ngo, Hau; Fowler, Chris J.; Kearney, Liam M.

    2014-06-01

    Commercially available hardware, freely available algorithms, and software developed by the authors are synergized successfully to detect and recognize subjects in an environment without visible light. This project integrates three major components: an illumination device operating in the near-infrared (NIR) spectrum, a NIR-capable camera, and a software algorithm capable of performing image manipulation, facial detection, and recognition. Focusing on the near-infrared spectrum allows the low-budget system to operate covertly while still allowing for accurate face recognition. In doing so, a valuable capability has been developed that presents potential benefits for future civilian and military security and surveillance operations.

  3. Thrust stand evaluation of engine performance improvement algorithms in an F-15 airplane

    NASA Technical Reports Server (NTRS)

    Conners, Timothy R.

    1992-01-01

    An investigation is underway to determine the benefits of a new propulsion system optimization algorithm in an F-15 airplane. The performance seeking control (PSC) algorithm optimizes the quasi-steady-state performance of an F100 derivative turbofan engine for several modes of operation. The PSC algorithm uses an onboard software engine model that calculates thrust, stall margin, and other unmeasured variables for use in the optimization. As part of the PSC test program, the F-15 aircraft was operated on a horizontal thrust stand. Thrust was measured with highly accurate load cells. The measured thrust was compared to onboard model estimates and to results from posttest performance programs. Thrust changes using the various PSC modes were recorded. Those results were compared to benefits using the less complex highly integrated digital electronic control (HIDEC) algorithm. The PSC maximum thrust mode increased intermediate power thrust by 10 percent. The PSC engine model did very well at estimating measured thrust and closely followed the transients during optimization. Quantitative results from the evaluation of the algorithms and performance calculation models are included with emphasis on measured thrust results. The report presents a description of the PSC system and a discussion of factors affecting the accuracy of the thrust stand load measurements.

  4. Space Radar Image of Washington D.C.

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The city of Washington, D.C., is shown in this space radar image. Images like these are useful tools for urban planners and managers, who use them to map and monitor land use patterns. Downtown Washington is the bright area between the Potomac (upper center to lower left) and Anacostia (middle right) rivers. The dark cross shape formed by the National Mall, Tidal Basin, the White House and Ellipse is seen in the center of the image. Arlington National Cemetery is the dark blue area on the Virginia (left) side of the Potomac River near the center of the image. The Pentagon is visible in bright white and red, south of the cemetery. Due to the alignment of the radar and the streets, the avenues that form the boundary between Washington and Maryland appear as bright red lines in the top, right and bottom parts of the image, parallel to the image borders. This image is centered at 38.85 degrees north latitude, 77.05 degrees west longitude. North is toward the upper right. The area shown is approximately 29 km by 26 km (18 miles by 16 miles). Colors are assigned to different frequencies and polarizations of the radar as follows: red is L-band, horizontally transmitted, horizontally received; green is L-band, horizontally transmitted, vertically received; blue is C-band, horizontally transmitted, vertically received. The image was acquired by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar (SIR-C/X-SAR) imaging radar when it flew aboard the space shuttle Endeavour on April 18, 1994. SIR-C/X-SAR, a joint mission of the German, Italian and United States space agencies, is part of NASA's Mission to Planet Earth program.

  5. Space Radar Image of Florence, Italy

    NASA Image and Video Library

    1999-04-15

    This radar image shows land use patterns in and around the city of Florence, Italy, shown here in the center of the image. Florence is situated on a plain in the Chianti Hill region of Central Italy. The Arno River flows through town and is visible as the dark line running from the upper right to the bottom center of the image. The city is home to some of the world's most famous art museums. The bridges seen crossing the Arno, shown as faint red lines in the upper right portion of the image, were all destroyed during World War II with the exception of the Ponte Vecchio, which remains as Florence's only covered bridge. The large, black V-shaped feature near the center of the image is the Florence Railroad Station. This image was acquired by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar (SIR-C/X-SAR) onboard the Space Shuttle Endeavour on April 14, 1994. SIR-C/X-SAR, a joint mission of the German, Italian, and United States space agencies, is part of NASA's Mission to Planet Earth. This image is centered at 43.7 degrees north latitude and 11.15 degrees east longitude with North toward the upper left of the image. The area shown measures 20 kilometers by 17 kilometers (12.4 miles by 10.6 miles). The colors in the image are assigned to different frequencies and polarizations of the radar as follows: red is L-band horizontally transmitted, horizontally received; green is L-band horizontally transmitted, vertically received; blue is C-band horizontally transmitted, vertically received. http://photojournal.jpl.nasa.gov/catalog/PIA01795

  6. New porcine test-model reveals remarkable differences between algorithms for spectrophotometrical haemoglobin saturation measurements with VLS.

    PubMed

    Gade, John; Greisen, Gorm

    2016-09-01

    The study created an 'ex vivo' model to test different algorithms for measurement of mucosal haemoglobin saturation with visible light spectrophotometry (VLS). The model allowed comparison between algorithms, but it also allowed comparison with co-oximetry as a 'gold standard' method, which has not been described before. Seven pigs were used. They were perfused with cold Haemaxel and thus killed, chilled, and rendered bloodless. The bronchial artery was perfused with cold blood of known saturation, and spectrophotometrical measurements were made through a bronchoscope. Based on 42 spectrophotometrical measurements of porcine bronchial mucosa saturation with fully oxygenated blood and 21 with de-oxygenated blood, six algorithms were applied to the raw spectra of the measurements and compared with co-oximetry. The difference from co-oximetry in the oxygenated and de-oxygenated states ranged from -32.8 to +29.9 percentage points and from -5.0 to +9.2 percentage points, respectively. The algorithms showed remarkable in-between differences when tested on raw spectra from an 'ex vivo' model. All algorithms had bias, more marked at high oxygenation than at low oxygenation. Three algorithms were recommended against.

  7. Horizontal decomposition of data table for finding one reduct

    NASA Astrophysics Data System (ADS)

    Hońko, Piotr

    2018-04-01

    Attribute reduction, one of the most essential tasks in rough set theory, is a challenge for data that does not fit in the available memory. This paper proposes new definitions of attribute reduction using horizontal data decomposition. Algorithms for computing a superreduct and subsequently exact reducts of a data table are developed and experimentally verified. In the proposed approach, the size of the subtables obtained during the decomposition can be arbitrarily small. Reducts of the subtables are computed independently from one another using any heuristic method for finding one reduct. Compared with standard attribute reduction methods, the proposed approach can produce superreducts that usually differ only slightly from an exact reduct, and it needs comparable time and much less memory to reduce the attribute set. The method proposed for removing unnecessary attributes from superreducts executes relatively fast for bigger databases.

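    The decomposition idea can be sketched as: compute a reduct of each horizontal chunk with any greedy heuristic, union the results into a candidate superreduct, then repair and shrink it globally. This is an illustrative simplification under the usual discernibility definition; the paper's actual algorithms and their guarantees are more involved.

```python
def is_superreduct(rows, attrs, decision):
    """True when `attrs` still discern the decision: no two rows agree on
    every attribute in `attrs` yet carry different decision values."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if key in seen and seen[key] != row[decision]:
            return False
        seen.setdefault(key, row[decision])
    return True

def one_reduct(rows, attrs, decision):
    """Greedy heuristic for one reduct: try to drop each attribute in turn."""
    reduct = list(attrs)
    for a in list(reduct):
        trial = [x for x in reduct if x != a]
        if is_superreduct(rows, trial, decision):
            reduct = trial
    return reduct

def decomposed_reduct(rows, attrs, decision, chunk):
    """Horizontal decomposition: reduce each subtable independently, union
    the results, then repair and shrink globally."""
    union = set()
    for i in range(0, len(rows), chunk):
        union |= set(one_reduct(rows[i:i + chunk], attrs, decision))
    # The union can miss attributes needed only across subtable boundaries,
    # so restore attributes until the full table is discerned again.
    for a in attrs:
        if a not in union and not is_superreduct(rows, sorted(union), decision):
            union.add(a)
    # Finally shrink the superreduct to an exact reduct on the whole table.
    return one_reduct(rows, sorted(union), decision)
```

    The memory advantage comes from `one_reduct` only ever seeing one chunk of rows at a time; only the cheap repair-and-shrink pass touches the whole table.
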
  8. Sensitivity of mesoscale-model forecast skill to some initial-data characteristics, data density, data position, analysis procedure and measurement error

    NASA Technical Reports Server (NTRS)

    Warner, Thomas T.; Key, Lawrence E.; Lario, Annette M.

    1989-01-01

    The effects of horizontal and vertical data resolution, data density, data location, different objective analysis algorithms, and measurement error on mesoscale-forecast accuracy are studied with observing-system simulation experiments. Domain-averaged errors are shown to generally decrease with time. It is found that the vertical distribution of error growth depends on the initial vertical distribution of the error itself. Larger gravity-inertia wave noise is produced in forecasts with coarser vertical data resolution. The use of a low vertical resolution observing system with three data levels leads to more forecast errors than moderate and high vertical resolution observing systems with 8 and 14 data levels. Also, with poor vertical resolution in soundings, the initial and forecast errors are not affected by the horizontal data resolution.

  9. Efficient Computation of Atmospheric Flows with Tempest: Validation of Next-Generation Climate and Weather Prediction Algorithms at Non-Hydrostatic Scales

    NASA Astrophysics Data System (ADS)

    Guerra, Jorge; Ullrich, Paul

    2016-04-01

    Tempest is a next-generation global climate and weather simulation platform designed to allow experimentation with numerical methods for a wide range of spatial resolutions. The atmospheric fluid equations are discretized by continuous / discontinuous finite elements in the horizontal and by a staggered nodal finite element method (SNFEM) in the vertical, coupled with implicit/explicit time integration. At horizontal resolutions below 10 km, many important questions remain on optimal techniques for solving the fluid equations. We present results from a suite of idealized test cases to validate the performance of the SNFEM applied in the vertical, with an emphasis on flow features and dynamic behavior. Internal gravity wave, mountain wave, convective bubble, and Cartesian baroclinic instability tests will be shown at various vertical orders of accuracy and compared with known results.

  10. Development of fog detection algorithm using Himawari-8/AHI data at daytime

    NASA Astrophysics Data System (ADS)

    Han, Ji-Hye; Kim, So-Hyeong; Suh, Myoung-Seok

    2017-04-01

    Fog is defined as small water droplets or ice particles suspended in the air that reduce visibility to less than 1 km. In general, fog affects ecological systems, the radiation budget, and human activities such as airplane, ship, and car traffic. In this study, we developed a fog detection algorithm (FDA) consisting of four threshold tests on optical and textural properties of fog, using satellite and ground observation data at daytime. For the detection of fog, we used satellite data (Himawari-8/AHI) and ancillary data such as air temperature from NWP data (over land) and SST from OSTIA (over sea); for validation, we used ground-observed visibility data from KMA. The optical and textural properties of fog are the normalized albedo (NAlb) and the normalized local standard deviation (NLSD), respectively. In addition, differences between air temperature (SST) and fog top temperature (FTa(S)) are applied to discriminate fog from low clouds, and post-processing is performed to detect the fog edge based on the spatial continuity of fog. Threshold values for each test are determined by optimization processes based on ROC analysis for the selected fog cases. Fog detection is performed according to solar zenith angle (SZA) because the available satellite data differ; in this study, we define daytime as SZA less than 85˚. The result of the FDA is presented as a probability (0 ˜ 100 %) of fog through a weighted sum of each test result. Validation against ground-observed visibility data showed that POD and FAR are 0.63 ˜ 0.89 and 0.29 ˜ 0.46, respectively, according to the fog intensity and type. In general, the detection skill is better for intense fog without high clouds than for localized and weak fog. We plan to transfer this algorithm to the National Meteorological Satellite Center of KMA for the operational detection of fog using GK-2A/AMI data, which will be launched in 2018.

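    The final step, turning the threshold tests into a fog probability via a weighted sum, can be sketched as below. The test names and weights are hypothetical placeholders; the paper's actual thresholds and weights come from the ROC-based optimization.

```python
def fog_probability(test_results, weights):
    """Combine binary threshold-test outcomes into a fog probability (%):
    the weighted sum of the tests that voted 'fog'."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return 100.0 * sum(w for name, w in weights.items() if test_results[name])

# Hypothetical example: the albedo and texture tests pass, the
# temperature-difference and edge tests do not.
weights = {"NAlb": 0.4, "NLSD": 0.3, "dT": 0.2, "edge": 0.1}
results = {"NAlb": True, "NLSD": True, "dT": False, "edge": False}
```

    With these made-up weights the pixel above would be reported as 70 % fog; an operational scheme would then threshold or display this probability per pixel.
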
  11. The Middle Temporal Artery: Surgical Anatomy and Exposure for Cerebral Revascularization.

    PubMed

    Rubio, Roberto Rodriguez; Lawton, Michael T; Kola, Olivia; Tabani, Halima; Yousef, Sonia; Meybodi, Ali Tayebi; Burkhardt, Jan-Karl; El-Sayed, Ivan; Benet, Arnau

    2018-02-01

    The middle temporal artery (MTA) is the proximal medial branch of the superficial temporal artery (STA), supplying the temporalis muscle along with deep temporal arteries. Its use in vascularized flaps for reconstructive and otologic procedures has been described, yet its potential use in neurosurgery has not been studied. We report a novel technique for exposing the MTA and evaluated its characteristics for extracranial-intracranial cerebrovascular bypass. After a curvilinear frontotemporal incision in 10 cadaveric specimens, the STA was dissected from distal to proximal. The horizontal portion of MTA was found posterolateral to the posterior end of the zygomatic root and was followed proximally until its origin and distally until its 2 terminal branches. The total length, visible branches, and caliber of MTA were measured. The mean total harvested length of MTA was 31.7 ± 5.1 mm, with an average proximal caliber of 1.7 ± 0.4 mm, and distal caliber of 1.3 ± 0.5 mm. There were 4-6 terminal MTA branches. The caliber of the proximal STA trunk was 2.5 ± 0.5 mm. The origin of the MTA was visible with a mean distance of 16.9 ± 4.8 mm inferior to the PEZR. The parotid gland was traversed and a communicating auriculotemporal nerve to the temporal branch of the facial nerve crossed MTA in 2 specimens. MTA can be safely harvested with an anterolateral approach, following its horizontal portion at the level of the zygomatic root, which is constant. The length and caliber of MTA makes it a potential alternative donor vessel or interposition graft for extracranial-intracranial bypass, especially when other donors are unavailable. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Space-borne observation of mesospheric bore by Visible and near Infrared Spectral Imager onboard the International Space Station

    NASA Astrophysics Data System (ADS)

    Hozumi, Y.; Saito, A.; Sakanoi, T.; Yamazaki, A.; Hosokawa, K.

    2017-12-01

    Mesospheric bores were observed by the Visible and near Infrared Spectral Imager (VISI) of the ISS-IMAP mission (Ionosphere, Mesosphere, upper Atmosphere and Plasmasphere mapping mission from the International Space Station) in O2 airglow at 762 nm wavelength. A mesospheric bore is a moving front with a sharp jump followed by undulations or turbulence in the mesopause region. Since previous studies of mesospheric bores were mainly based on ground-based airglow imaging, which is limited in field-of-view and observing site, little is known about their horizontal extent and global behavior. Space-borne imaging by ISS-IMAP/VISI provides an opportunity to study mesospheric bores with a wide field-of-view and global coverage. A mesospheric bore was captured by VISI in two consecutive paths on 9 July 2015 over the south of the African continent (48ºS - 54ºS and 15ºE). The wave front aligned with the south-north direction and propagated to the west. The phase velocity and wavelength of the following undulation were estimated to be 100 m/s and 30 km, respectively; these parameters are similar to those reported by previous studies. A 30º anti-clockwise rotation of the wave front was recognized in 100 min. Another mesospheric bore was captured on 9 May 2013 over the south Atlantic ocean (35ºS - 43ºS and 24ºW - 1ºE), with a wave front of more than 2,200 km horizontal extent aligned with the southeast-northwest direction. Because the following undulation is recognized on the southwest side of the wave front, it is estimated to propagate toward the northeast. The wave front was modulated with a 1,000 km wavelength; this modulation implies inhomogeneity of the phase velocity.

  13. Nonlinear Dynamics of River Runoff Elucidated by Horizontal Visibility Graphs

    NASA Astrophysics Data System (ADS)

    Lange, Holger; Rosso, Osvaldo A.

    2017-04-01

    We investigate a set of long-term river runoff time series at daily resolution from Brazil, monitored by the Agencia Nacional de Aguas. A total of 150 time series was obtained, with an average length of 65 years. Both long-term trends and human influence (water management, e.g. for power production) on the dynamical behaviour are analyzed. We use Horizontal Visibility Graphs (HVGs) to determine the individual temporal networks for the time series, and extract their degree and distance (shortest path length) distributions. Statistical and information-theoretic properties of these distributions are calculated: robust estimators of skewness and kurtosis, the maximum degree occurring in the time series, the Shannon entropy, permutation complexity, and Fisher information. For the latter, we also compare the information measures obtained from the degree distributions to those computed from the original time series directly, to investigate the impact of graph construction on the dynamical properties as reflected in these measures. Focus is, on the one hand, on universal properties of the HVG common to all runoff series, and on site-specific aspects on the other. Results demonstrate that the assumption of power-law behaviour for the degree distribution does not generally hold, and that management has a significant impact on this distribution. We also show that a specific pretreatment of the time series conventional in hydrology, the elimination of seasonality by a separate z-transformation for each calendar day, is highly detrimental to the nonlinear behaviour: it changes long-term correlations and the overall dynamics towards more random behaviour. Analysis based on the transformed data easily leads to spurious results and bears a high risk of misinterpretation.

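    A horizontal visibility graph links two observations i < j whenever every sample strictly between them lies below both; the degree distribution of the resulting network is the raw material for the analysis above. A minimal O(n²) construction (faster algorithms exist; the quadratic loop keeps the definition visible):

```python
def hvg_edges(series):
    """Edges of the horizontal visibility graph: nodes i < j are connected
    when every sample strictly between them lies below both endpoints,
    i.e. x[k] < min(x[i], x[j]) for all i < k < j."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j]) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

def degree_sequence(series):
    """Node degrees of the HVG -- the input to the degree distribution."""
    deg = [0] * len(series)
    for i, j in hvg_edges(series):
        deg[i] += 1
        deg[j] += 1
    return deg
```

    For example, the series [1, 3, 2, 4] yields degrees [1, 3, 2, 2]. For an i.i.d. continuous series the HVG degree distribution is known to follow P(k) = (1/3)(2/3)^(k-2), one baseline against which deviations of the runoff networks can be judged.
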
  14. Mapped Landmark Algorithm for Precision Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew; Ansar, Adnan; Matthies, Larry

    2007-01-01

    A report discusses a computer vision algorithm for position estimation to enable precision landing during planetary descent. The Descent Image Motion Estimation System for the Mars Exploration Rovers has been used as a starting point for creating code for precision, terrain-relative navigation during planetary landing. The algorithm is designed to be general: it handles images taken at different scales and resolutions relative to the map, and can produce mapped landmark matches for any planetary terrain of sufficient texture. These matches provide a measurement of horizontal position relative to a known landing site specified on the surface map. Multiple mapped landmarks generated per image allow for automatic detection and elimination of bad matches. Attitude and position can be generated from each image; this image-based attitude measurement can be used by the onboard navigation filter to improve the attitude estimate, which in turn improves the position estimates. The algorithm uses normalized correlation of grayscale images, producing precise, sub-pixel matches. It has been broken into two sub-algorithms: (1) FFT Map Matching (see figure), which matches a single large template by correlation in the frequency domain, and (2) Mapped Landmark Refinement, which matches many small templates by correlation in the spatial domain. Each relies on feature selection, the homography transform, and 3D image correlation. The algorithm is implemented in C++ and is rated at Technology Readiness Level (TRL) 4.

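    The first sub-algorithm, matching one large template by correlation in the frequency domain, can be sketched with numpy's FFT. This is a simplified, unnormalised variant of the idea; the flight algorithm uses normalised correlation and sub-pixel refinement.

```python
import numpy as np

def fft_match(image, template):
    """Find `template` inside `image` via cross-correlation evaluated in the
    frequency domain (correlation theorem). Returns the (row, col) of the
    template's top-left corner. Unlike the flight algorithm, this sketch
    uses plain correlation with a zero-mean template, so it assumes
    roughly uniform image brightness."""
    h, w = template.shape
    t = template - template.mean()       # zero-mean so bright flat areas don't win
    pad = np.zeros(image.shape, dtype=float)
    pad[:h, :w] = t
    corr = np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(pad))).real
    return tuple(int(i) for i in np.unravel_index(np.argmax(corr), corr.shape))
```

    The FFT evaluates the correlation at every offset at once, which is why this route pays off for a single large template, while many small templates are cheaper to match directly in the spatial domain.
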
  15. On-off keying transmitter design for navigation by visible light communication

    NASA Astrophysics Data System (ADS)

    Louro, P.; Vieira, M.; Costa, J.; Vieira, M. A.

    2018-02-01

    White LEDs revolutionized the field of illumination technology, mainly due to their energy-saving effects. Besides lighting, LEDs can also be used in wireless communication when integrated in Visible Light Communication (VLC) systems. Indoor positioning for navigation in large buildings is currently under research to overcome the difficulties associated with the use of GPS in such environments. The motivation for this application is also supported by the possibility of taking advantage of existing lighting and WiFi infrastructure. In this work, an indoor navigation system based on VLC technology is proposed. The proposed system includes trichromatic white LEDs, with the red and blue chips modulated at different frequencies, and a pin-pin photodetector with selective spectral sensitivity. Optoelectronic features of both the optical sources and the photodetector device are analyzed. The photodetector consists of two p-i-n structures based on a-SiC:H and a-Si:H, with a geometrical configuration optimized for the detection of short and long wavelengths in the visible range; its sensitivity is externally tuned by steady-state optical bias. The localization algorithm uses the Fourier transform to identify the frequencies present in the photocurrent signal, and the wavelength-filtering properties of the sensor under front and back optical bias to detect the red and blue signals. The viability of the system was demonstrated through the implementation of an automatic algorithm to infer the photodetector's cardinal direction. A capacitive optoelectronic model supports the experimental results and explains the device operation.

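    The frequency-identification step of the localization algorithm, picking the modulation frequencies of the red and blue chips out of the photocurrent, can be sketched as an FFT peak search. The carrier frequencies and threshold below are made up for illustration.

```python
import numpy as np

def detect_carriers(signal, fs, threshold=0.5):
    """Return the carrier frequencies present in a photocurrent trace:
    spectral peaks at least `threshold` times the strongest non-DC peak."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    spectrum[0] = 0.0                    # discard the DC (steady light) term
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    peak = spectrum.max()
    return [float(f) for f, a in zip(freqs, spectrum) if a >= threshold * peak]
```

    Feeding in a trace sampled at 8 kHz that carries hypothetical 500 Hz and 1200 Hz modulations plus a DC offset recovers exactly those two frequencies; which of the two appears under front versus back optical bias then tells the system whether the red or the blue LED dominates at that location.
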
  16. VASIR: An Open-Source Research Platform for Advanced Iris Recognition Technologies.

    PubMed

    Lee, Yooyoung; Micheals, Ross J; Filliben, James J; Phillips, P Jonathon

    2013-01-01

    The performance of iris recognition systems is frequently affected by input image quality, which in turn is vulnerable to less-than-optimal conditions due to illuminations, environments, and subject characteristics (e.g., distance, movement, face/body visibility, blinking, etc.). VASIR (Video-based Automatic System for Iris Recognition) is a state-of-the-art NIST-developed iris recognition software platform designed to systematically address these vulnerabilities. We developed VASIR as a research tool that will not only provide a reference (to assess the relative performance of alternative algorithms) for the biometrics community, but will also advance (via this new emerging iris recognition paradigm) NIST's measurement mission. VASIR is designed to accommodate both ideal (e.g., classical still images) and less-than-ideal images (e.g., face-visible videos). VASIR has three primary modules: 1) Image Acquisition 2) Video Processing, and 3) Iris Recognition. Each module consists of several sub-components that have been optimized by use of rigorous orthogonal experiment design and analysis techniques. We evaluated VASIR performance using the MBGC (Multiple Biometric Grand Challenge) NIR (Near-Infrared) face-visible video dataset and the ICE (Iris Challenge Evaluation) 2005 still-based dataset. The results showed that even though VASIR was primarily developed and optimized for the less-constrained video case, it still achieved high verification rates for the traditional still-image case. For this reason, VASIR may be used as an effective baseline for the biometrics community to evaluate their algorithm performance, and thus serves as a valuable research platform.

  18. Onboard Algorithms for Data Prioritization and Summarization of Aerial Imagery

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Hayden, David; Thompson, David R.; Castano, Rebecca

    2013-01-01

    Many current and future NASA missions are capable of collecting enormous amounts of data, of which only a small portion can be transmitted to Earth. Communications are limited due to distance, visibility constraints, and competing mission downlinks. Long missions and high-resolution, multispectral imaging devices easily produce data exceeding the available bandwidth. To address this situation, computationally efficient algorithms were developed for analyzing science imagery onboard the spacecraft. These algorithms autonomously cluster the data into classes of similar imagery, enabling selective downlink of representatives of each class and a map classifying the imaged terrain rather than the full dataset, reducing the volume of the downlinked data. A range of approaches was examined, including k-means clustering using image features based on color, texture, and temporal and spatial arrangement.

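    A toy version of the onboard clustering idea: run k-means over per-image feature vectors and keep, for each cluster, the member nearest its centroid as the downlink representative. Feature extraction (the color/texture statistics) is assumed to have happened elsewhere; this is a sketch, not the mission code.

```python
import numpy as np

def kmeans(features, k, iters=20, seed=0):
    """Plain Lloyd's k-means over per-image feature vectors."""
    rng = np.random.default_rng(seed)
    X = np.asarray(features, dtype=float)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):          # guard against empty clusters
                centroids[c] = X[labels == c].mean(axis=0)
    return centroids, labels

def representatives(features, k):
    """One image index per cluster -- the member closest to its centroid,
    i.e. the candidate worth downlinking as the class exemplar."""
    X = np.asarray(features, dtype=float)
    centroids, labels = kmeans(X, k)
    reps = []
    for c in range(k):
        members = np.where(labels == c)[0]
        dists = np.linalg.norm(X[members] - centroids[c], axis=1)
        reps.append(int(members[dists.argmin()]))
    return reps
```

    Downlinking the k representatives plus the per-image labels conveys the gist of the whole archive in a small fraction of the bandwidth.
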
  19. Learning algorithm in restricted Boltzmann machines using Kullback-Leibler importance estimation procedure

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Sakurai, Tetsuharu; Tanaka, Kazuyuki

    Restricted Boltzmann machines (RBMs) are bipartite structured statistical neural networks consisting of two layers, a layer of visible units and a layer of hidden units; within each layer, units do not connect to each other. RBMs have high flexibility and rich structure and are expected to be applied to various applications, for example, image and pattern recognition, face detection, and so on. However, most computational models of RBMs are intractable and often belong to the class of NP-hard problems. In this paper, in order to construct a practical learning algorithm for them, we apply the Kullback-Leibler Importance Estimation Procedure (KLIEP) to RBMs, and give a new scheme of practical approximate learning for RBMs based on the KLIEP.

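    For context, a minimal Bernoulli RBM with the standard one-step contrastive divergence (CD-1) update, the common baseline that KLIEP-based learning is positioned against; this is not the paper's algorithm.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli RBM: a bipartite network of visible and hidden units with
    no intra-layer connections, trained here with CD-1."""
    def __init__(self, n_visible, n_hidden, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = 0.01 * self.rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)     # visible bias
        self.c = np.zeros(n_hidden)      # hidden bias

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0, lr=0.1):
        """One contrastive-divergence update from a batch of visible vectors."""
        ph0 = self.hidden_probs(v0)
        h0 = (self.rng.random(ph0.shape) < ph0).astype(float)   # sample hiddens
        v1 = self.visible_probs(h0)                             # reconstruction
        ph1 = self.hidden_probs(v1)
        self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        self.b += lr * (v0 - v1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)
```

    CD-1 sidesteps the intractable partition function by contrasting the data with a single Gibbs reconstruction, which is exactly the kind of approximation the KLIEP-based scheme offers an alternative to.
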
  20. A gossip based information fusion protocol for distributed frequent itemset mining

    NASA Astrophysics Data System (ADS)

    Sohrabi, Mohammad Karim

    2018-07-01

    The computational complexity, huge memory requirement, and time-consuming nature of the frequent pattern mining process are the most important motivations for distributing and parallelizing this mining process. On the other hand, the emergence of distributed computational and operational environments, which causes data to be produced and maintained on different distributed data sources, makes the parallelization and distribution of the knowledge discovery process inevitable. In this paper, a gossip-based distributed itemset mining (GDIM) algorithm is proposed to extract frequent itemsets, which are special types of frequent patterns, in a wireless sensor network environment. In this algorithm, the local frequent itemsets of each sensor are extracted using a bit-wise horizontal approach (LHPM) from nodes clustered using a LEACH-based protocol. Cluster heads use a gossip-based protocol to communicate with each other and find the patterns whose global support is equal to or greater than the specified support threshold. Experimental results show that the proposed algorithm outperforms the best existing gossip-based algorithm in terms of execution time.

  1. A comprehensive approach to evaluating and classifying sun-protective clothing.

    PubMed

    Downs, N J; Harrison, S L

    2018-04-01

    National standards for clothing designed to protect the wearer from the harmful effects of solar ultraviolet radiation (UVR) have been implemented in Australia/New Zealand, Europe and the U.S.A. Industry standards reflect the need to protect the skin by covering a considerable proportion of the potentially exposed body surface area (BSA) and by reducing UVR transmission through fabric (the Ultraviolet Protection Factor; UPF). This research aimed to develop a new index for rating sun-protective clothing that incorporates the BSA coverage of the garment in addition to the UPF of the fabric. A mannequin model was fixed to an optical bench and marked with horizontal lines at 1-cm intervals. An algorithm (the Garment Protection Factor; GPF) was developed based on the number of lines visible on the clothed vs. unclothed mannequin and the UPF of the garment textile. These data were collected in 2015/16 and analysed in 2016. The GPF weights fabric UPF by BSA coverage above the minimum required by international sun-protective clothing standards for upper-body, lower-body and full-body garments. The GPF increases with the BSA coverage of the garment and the fabric UPF. Three nominal categories are proposed for the GPF: 0 ≤ GPF < 3 for garments that 'meet' minimum standards; 3 ≤ GPF < 6 for garments providing 'good' sun protection; and GPF ≥ 6 indicating 'excellent' protection. Adoption of the proposed rating scheme should encourage manufacturers to design sun-protective garments that exceed the minimum standard for BSA coverage, with positive implications for skin cancer prevention, consumer education and sun-protection awareness. © 2017 British Association of Dermatologists.
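
    The three nominal categories map directly onto threshold checks. A minimal sketch (the category boundaries come from the abstract; the GPF value itself would be produced by the authors' weighting of UPF by BSA coverage, which is not reproduced here):

```python
def gpf_category(gpf):
    """Classify a Garment Protection Factor value into the three
    nominal categories proposed in the abstract."""
    if gpf < 0:
        raise ValueError("GPF is non-negative")
    if gpf < 3:
        return "meets minimum standard"
    if gpf < 6:
        return "good"
    return "excellent"
```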

  2. Detection of partial-thickness tears in ligaments and tendons by Stokes-polarimetry imaging

    NASA Astrophysics Data System (ADS)

    Kim, Jihoon; John, Raheel; Walsh, Joseph T.

    2008-02-01

    The Stokes polarimetry imaging (SPI) system uses an algorithm developed to construct degree-of-polarization (DoP) image maps under linearly polarized illumination. Partial-thickness tears of turkey tendons were imaged by the SPI system in order to examine the feasibility of the system for detecting partial-thickness rotator cuff tears or general tendon pathology. Rotating the incident polarization angle (IPA) of the linearly polarized light provides a way to analyze different tissue types that may be sensitive to IPA variations. Degree-of-linear-polarization (DoLP) images revealed the collagen fiber structure related to partial-thickness tears better than standard intensity images. DoLP images also revealed structural changes in tears that are related to the tendon load. DoLP images with red-wavelength-filtered incident light may show tears and the related organization of collagen fiber structure at a greater depth from the tendon surface. Degree-of-circular-polarization (DoCP) images clearly revealed horizontal fiber orientations that are not parallel to the vertically aligned collagen fibers of the tendon. The SPI system's DoLP images reveal alterations in tendons and ligaments, whose tissue matrix consists largely of collagen, better than intensity images. All polarized images showed modulated intensity as the IPA was varied, and optimal detection of partial-thickness tendon tears was observed at a particular IPA. The SPI system with varying IPA and spectral information can improve the detection of partial-thickness rotator cuff tears through higher visibility of fiber orientations, and thereby improve the diagnosis and treatment of tendon-related injuries.
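
    For reference, the DoLP and DoCP maps are pointwise functions of the Stokes parameters (I, Q, U, V). The standard definitions are sketched below (the SPI system's full reconstruction pipeline is more involved and is not reproduced here):

```python
import math

def dolp(I, Q, U):
    """Degree of linear polarization from the Stokes parameters."""
    return math.sqrt(Q * Q + U * U) / I

def docp(I, V):
    """Degree of circular polarization."""
    return abs(V) / I

def dop(I, Q, U, V):
    """Total degree of polarization."""
    return math.sqrt(Q * Q + U * U + V * V) / I
```

    Applied per pixel to Stokes-parameter images, these formulas yield the DoLP and DoCP maps discussed above.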

  3. Very-short range forecasting system for the 2018 Pyeongchang Winter Olympic and Paralympic Games

    NASA Astrophysics Data System (ADS)

    Nam, Ji-Eun; Park, Kyungjeen; Kim, Minyou; Kim, Changhwan; Joo, Sangwon

    2016-04-01

    The 23rd Olympic Winter Games and the 13th Paralympic Winter Games will be held in Pyeongchang, Republic of Korea, from 9 to 25 February 2018 and from 9 to 18 February 2018, respectively. The Korea Meteorological Administration (KMA) and the National Institute for Meteorological Science (NIMS) have the responsibility to provide weather information for the management of the Games and the safety of the public. NIMS will carry out a Forecast Demonstration Project (FDP) and a Research and Development Project (RDP), which will be called ICE-POP 2018. These projects will focus on intensive observation campaigns to understand severe winter weather over the Pyeongchang region, and the research results from the RDP will be used to improve the accuracy of the nowcasting and very-short-range forecast systems during the Games. To support these projects, NIMS developed the Very-short-range Data Assimilation and Prediction System (VDAPS), which runs in real time with a 1-hour cycling interval and forecasts up to 12 hours. The domain covers the Korean Peninsula and surrounding seas at 1.5-km horizontal resolution. AWS, wind profiler, buoy, sonde, aircraft, scatterometer wind, and radar radial wind observations are assimilated by 3DVAR on a 3-km-resolution inner domain. The rain rate is converted into latent heat and initialized via nudging. Visibility data are also assimilated with the addition of an aerosol control variable. The experimental results show improved rainfall forecasts over the sea south of the Korean Peninsula. The data assimilation algorithm was optimized to reduce the excessive rainfall during the first 2 forecast hours caused by the reduced cycling interval.

  4. Control algorithms for dynamic windows for residential buildings

    DOE PAGES

    Firlag, Szymon; Yazdanian, Mehrangiz; Curcija, Charlie; ...

    2015-09-30

    This study analyzes the influence of control algorithms for dynamic windows on energy consumption, the number of hours of retracted shades during daylight, and shade operations. Five different control algorithms (heating/cooling, simple rules, perfect citizen, heat flow, and predictive weather) were developed and compared. The performance of a typical residential building was modeled with EnergyPlus. The program WINDOW was used to generate a bidirectional scattering distribution function (BSDF) for two window configurations. The BSDF was exported to EnergyPlus using the IDF file format. The EMS feature in EnergyPlus was used to develop custom control algorithms. The calculations were made for four locations with diverse climates. The results showed that: (a) use of automated shading with the proposed control algorithms can reduce site energy by 11.6-13.0% and source (primary) energy by 20.1-21.6%; (b) the differences between algorithms in energy savings are not large; (c) the differences between algorithms in the number of hours of retracted shades are visible; (d) the control algorithms have a strong influence on shade operation, and oscillation of the shades can occur; and (e) the additional energy consumption of the motor, sensors, and a small microprocessor is very small in the analyzed case.
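
    To illustrate the flavor of a rule-based controller of the kind compared here, the sketch below implements a hypothetical "simple rules" policy with a deadband (the inputs and thresholds are invented for the sketch, not taken from the study). The deadband is one common way to suppress the shade oscillation the study observed: the shade changes state only when the signal clearly crosses a band edge.

```python
def shade_deployed(mode, solar_w_m2, currently_deployed=False,
                   deploy_above=250.0, retract_below=150.0):
    """Hypothetical rule-based shade controller with hysteresis.
    Returns True if the shade should be deployed."""
    if mode == "heating":
        return False                 # retract to admit free solar gains
    if solar_w_m2 > deploy_above:
        return True                  # block unwanted gains in cooling mode
    if solar_w_m2 < retract_below:
        return False
    return currently_deployed        # inside the deadband: hold state
```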

  5. Imaging of Stellar Surfaces with the Navy Precision Optical Interferometer

    NASA Astrophysics Data System (ADS)

    Jorgensen, A.; Schmitt, H. R.; van Belle, G. T.; Hutter, Clark; Mozurkewich, D.; Armstrong, J. T.; Baines, E. K.; Restaino, S. R.

    The Navy Precision Optical Interferometer (NPOI) has a unique layout that is particularly well suited for high-resolution interferometric imaging. By combining the NPOI layout with a new data acquisition and fringe tracking system, we are progressing toward an imaging capability that will exceed that of any other interferometer in operation. The project, funded by the National Science Foundation, combines several existing advances and infrastructure at NPOI with modest enhancements. For optimal imaging there are several requirements that should be fulfilled. The observatory should be capable of measuring visibilities on a wide range of baseline lengths and orientations, providing complete UV coverage in a short period of time. It should measure visibility amplitudes with good SNR on all baselines, as critical imaging information is often contained in low-amplitude visibilities. It should measure the visibility phase on all baselines. The technologies that can achieve this are the NPOI Y-shaped array with (nearly) equal spacing between telescopes and the ability to reconfigure rapidly. Placing six telescopes in a row makes it possible to measure visibilities into the 4th lobe of the visibility function. By arranging the available telescopes carefully, we will be able to switch, every few days, between three different 6-station chains that provide symmetric coverage in the UV (Fourier) plane without moving any telescopes, only by moving beam relay mirrors. The 6-station chains are important for achieving the highest imaging resolution, and switching rapidly between station chains provides uniform coverage. Coherent integration techniques can be used to obtain good SNR on very small visibilities, and coherently integrated visibilities can be used for imaging with standard radio imaging packages such as AIPS. The commissioning of one additional station, new data acquisition hardware, and new fringe tracking algorithms are the enhancements that make this project possible.

  6. Torsion effect of swing frame on the measurement of horizontal two-plane balancing machine

    NASA Astrophysics Data System (ADS)

    Wang, Qiuxiao; Wang, Dequan; He, Bin; Jiang, Pan; Wu, Zhaofu; Fu, Xiaoyan

    2017-03-01

    In this paper, a vibration model of the swing frame of a two-plane balancing machine is first established to locate the vibration center of the swing frame. The torsional stiffness formula for the spring plate twisting around the vibration center is then deduced using the superposition principle. Finally, dynamic balancing experiments demonstrate the shortcomings of the A-B-C algorithm, which ignores the torsion effect, and show that the torsional stiffness deduced from experiments is consistent with the torsional stiffness calculated from theory. The experimental data show the influence of the torsion effect of the swing frame on the separation ratio of such balancing machines, which reveals the sources of measurement error and delimits the application scope of the A-B-C algorithm.

  7. Simulation of Extreme Arctic Cyclones in IPCC AR5 Experiments

    DTIC Science & Technology

    2014-05-15

    atmospheric fields, including sea level pressure ( SLP ), on daily and sub-daily time scales at 2° horizontal resolution. A higher-resolution and more...its 21st-century simulation. Extreme cyclones were defined as occurrences of daily mean SLP at least 40 hPa below the climatological annual-average... SLP at a grid point. As such, no cyclone-tracking algorithm was employed, because the purpose here is to identify instances of extremely strong

  8. Difet: Distributed Feature Extraction Tool for High Spatial Resolution Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Eken, S.; Aydın, E.; Sayar, A.

    2017-11-01

    In this paper, we propose a distributed feature extraction tool for high-spatial-resolution remote sensing images. The tool is based on the Apache Hadoop framework and the Hadoop Image Processing Interface. Two corner detection algorithms (Harris and Shi-Tomasi) and five feature descriptors (SIFT, SURF, FAST, BRIEF, and ORB) are considered. The robustness of the tool in the task of feature extraction from Landsat-8 imagery is evaluated in terms of horizontal scalability.

  9. Atmosphere and climate studies of Mars using the Mars Observer pressure modulator infrared radiometer

    NASA Technical Reports Server (NTRS)

    Mccleese, D. J.; Haskins, R. D.; Schofield, J. T.; Zurek, R. W.; Leovy, C. B.; Paige, D. A.; Taylor, F. W.

    1992-01-01

    Studies of the climate and atmosphere of Mars are limited at present by a lack of meteorological data having systematic global coverage with good horizontal and vertical resolution. The Mars Observer spacecraft in a low, nearly circular, polar orbit will provide an excellent platform for acquiring the data needed to advance significantly our understanding of the Martian atmosphere and its remarkable variability. The Mars Observer pressure modulator infrared radiometer (PMIRR) is a nine-channel limb and nadir scanning atmospheric sounder which will observe the atmosphere of Mars globally from 0 to 80 km for a full Martian year. PMIRR employs narrow-band radiometric channels and two pressure modulation cells to measure atmospheric and surface emission in the thermal infrared. PMIRR infrared and visible measurements will be combined to determine the radiative balance of the polar regions, where a sizeable fraction of the global atmospheric mass annually condenses onto and sublimes from the surface. Derived meteorological fields, including diabatic heating and cooling and the vertical variation of horizontal winds, are computed from the globally mapped fields retrieved from PMIRR data.

  10. A study of satellite-derived moisture with emphasis on the Gulf of Mexico

    NASA Technical Reports Server (NTRS)

    Schreiner, Anthony J.; Hayden, Christopher M.; Paris, Cecil A.

    1992-01-01

    Visible-Infrared Spin Scan Radiometer (VISSR) Atmospheric Sounder (VAS) moisture retrievals are compared to the National Meteorological Center Regional Analysis and Forecast System (RAFS) 12-h forecast and to 1200 UTC rawinsondes over the U.S. and the Gulf of Mexico on a daily basis for nearly 1.5 years. The principal objective is to determine what information the current moisture retrievals add to that available from the RAFS and surface data. The data are examined from the climatological perspective, that is, total precipitable water over the seasons for three geographical regions, and also for synoptic applications, that is, vertical and horizontal resolution. VAS retrievals are found to be systematically too moist at higher values. The variance of the VAS soundings more closely agrees with the rawinsonde at locations around the Gulf of Mexico than the RAFS. An examination of a case (6 June 1989) over the Gulf of Mexico region comparing three layers of VAS-derived moisture to the RAFS forecast shows the former capable of outperforming the latter in both the horizontal and, to some extent, the vertical frame of reference.

  11. Combining Passive Microwave Rain Rate Retrieval with Visible and Infrared Cloud Classification.

    NASA Astrophysics Data System (ADS)

    Miller, Shawn William

    The relation between cloud type and rain rate has been investigated here from different approaches. Previous studies and intercomparisons have indicated that no single passive microwave rain rate algorithm is an optimal choice for all types of precipitating systems. Motivated by the upcoming Tropical Rainfall Measuring Mission (TRMM), an algorithm which combines visible and infrared cloud classification with passive microwave rain rate estimation was developed and analyzed in a preliminary manner using data from the Tropical Ocean Global Atmosphere-Coupled Ocean Atmosphere Response Experiment (TOGA-COARE). Overall correlation with radar rain rate measurements across five case studies showed substantial improvement in the combined algorithm approach when compared to the use of any single microwave algorithm. An automated neural network cloud classifier for use over both land and ocean was independently developed and tested on Advanced Very High Resolution Radiometer (AVHRR) data. The global classifier achieved strict accuracy for 82% of the test samples, while a more localized version achieved strict accuracy for 89% of its own test set. These numbers provide hope for the eventual development of a global automated cloud classifier for use throughout the tropics and the temperate zones. The localized classifier was used in conjunction with gridded 15-minute averaged radar rain rates at 8km resolution produced from the current operational network of National Weather Service (NWS) radars, to investigate the relation between cloud type and rain rate over three regions of the continental United States and adjacent waters. The results indicate a substantially lower amount of available moisture in the Front Range of the Rocky Mountains than in the Midwest or in the eastern Gulf of Mexico.

  12. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems

    PubMed Central

    2017-01-01

    Solar energy is considered one of the main sources of renewable energy in the near future. However, solar energy and other renewable energy sources have a drawback related to the difficulty of predicting their availability in the near future. This problem affects the optimal exploitation of solar energy, especially in connection with other resources. Therefore, reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods for the automated design of fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by a simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day GHI using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 measured meteorological and solar radiation variables to build the model. The experimentation and results of the prediction are detailed, where the root mean square error of the prediction was approximately 88% for the second model tuned by simulated annealing, compared to 79.75% accuracy using the first model. These results demonstrate good modeling accuracy for the second model, despite the fact that the training and testing of the proposed models were carried out using spatially and temporally independent data. PMID:28806754
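
    The FCM stage common to both models assigns each observation a graded membership in every cluster by alternating between weighted centroid updates and membership updates. A compact sketch of generic FCM (not the paper's subtractive-clustering initialization or SA tuning; the fuzzifier m controls how soft the memberships are):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Generic fuzzy c-means: returns (centers, membership matrix U).
    U[i, k] is the degree to which sample i belongs to cluster k."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1))            # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

    In the paper's setting, the cluster memberships seed the fuzzy rule base, which SA then tunes.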

  13. Random Forest Application for NEXRAD Radar Data Quality Control

    NASA Astrophysics Data System (ADS)

    Keem, M.; Seo, B. C.; Krajewski, W. F.

    2017-12-01

    Identification and elimination of non-meteorological radar echoes (e.g., returns from the ground, wind turbines, and biological targets) are basic data quality control steps that precede radar data use in quantitative applications (e.g., precipitation estimation). Although the WSR-88Ds' recent upgrade to dual polarization has enhanced this quality control and echo classification, there are still challenges in detecting non-meteorological echoes that show precipitation-like characteristics (e.g., wind turbine or anomalous propagation clutter embedded in rain). With this in mind, a new quality control method using Random Forest is proposed in this study. This classification algorithm is known to produce reliable results with low uncertainty. The method introduces randomness into sampling and feature selection and integrates the resulting multiple decision trees. The multidimensional structure of the trees can characterize the statistical interactions of the involved features in complex situations. The authors explore the performance of the Random Forest method for NEXRAD radar data quality control. Training datasets are selected using several clear cases of precipitation and non-precipitation (but with some non-meteorological echoes). The model is structured using available candidate features (from the NEXRAD data) such as horizontal reflectivity, differential reflectivity, differential phase shift, copolar correlation coefficient, and their horizontal textures (e.g., local standard deviation). The influence of each feature on the classification results is quantified by variable importance measures that are automatically estimated by the Random Forest algorithm, so the number and types of features in the final forest can be examined based on classification accuracy.
The authors demonstrate the capability of the proposed approach using several cases ranging from distinct to complex rain/no-rain events and compare the performance with the existing algorithms (e.g., MRMS). They also discuss operational feasibility based on the observed strength and weakness of the method.

  14. Short-term prediction of solar energy in Saudi Arabia using automated-design fuzzy logic systems.

    PubMed

    Almaraashi, Majid

    2017-01-01

    Solar energy is considered one of the main sources of renewable energy in the near future. However, solar energy and other renewable energy sources have a drawback related to the difficulty of predicting their availability in the near future. This problem affects the optimal exploitation of solar energy, especially in connection with other resources. Therefore, reliable solar energy prediction models are essential to solar energy management and economics. This paper presents work aimed at designing reliable models to predict the global horizontal irradiance (GHI) for the next day at 8 stations in Saudi Arabia. The designed models are based on computational intelligence methods for the automated design of fuzzy logic systems. The fuzzy logic systems are designed and optimized with two models using fuzzy c-means clustering (FCM) and simulated annealing (SA) algorithms. The first model uses FCM based on the subtractive clustering algorithm to automatically design the predictor fuzzy rules from data. The second model uses FCM followed by a simulated annealing algorithm to enhance the prediction accuracy of the fuzzy logic system. The objective of the predictor is to accurately predict next-day GHI using previous-day meteorological and solar radiation observations. The proposed models use observations of 10 measured meteorological and solar radiation variables to build the model. The experimentation and results of the prediction are detailed, where the root mean square error of the prediction was approximately 88% for the second model tuned by simulated annealing, compared to 79.75% accuracy using the first model. These results demonstrate good modeling accuracy for the second model, despite the fact that the training and testing of the proposed models were carried out using spatially and temporally independent data.

  15. Computer-aided visual assessment in mine planning and design

    Treesearch

    Michael Hatfield; A. J. LeRoy Balzer; Roger E. Nelson

    1979-01-01

    A computer modeling technique is described for evaluating the visual impact of a proposed surface mine located within the viewshed of a national park. A computer algorithm analyzes digitized USGS baseline topography and identifies areas subject to surface disturbance visible from the park. Preliminary mine and reclamation plan information is used to describe how the...

  16. Nitrogen dioxide observations from the Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument: Retrieval algorithm and measurements during DISCOVER-AQ Texas 2013

    EPA Science Inventory

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument is a test bed for upcoming air quality satellite instruments that will measure backscattered ultraviolet, visible and near-infrared light from geostationary orbit. GeoTASO flew on the NASA F...

  17. Integrating models to predict regional haze from wildland fire.

    Treesearch

    D. McKenzie; S.M. O' Neill; N. Larkin; R.A. Norheim

    2006-01-01

    Visibility impairment from regional haze is a significant problem throughout the continental United States. A substantial portion of regional haze is produced by smoke from prescribed and wildland fires. Here we describe the integration of four simulation models, an array of GIS raster layers, and a set of algorithms for fire-danger calculations into a modeling...

  18. Implementation of a near real-time burned area detection algorithm calibrated for VIIRS imagery

    Treesearch

    Brenna Schwert; Carl Albury; Jess Clark; Abigail Schaaf; Shawn Urbanski; Bryce Nordgren

    2016-01-01

    There is a need to implement methods for rapid burned area detection using a suitable replacement for Moderate Resolution Imaging Spectroradiometer (MODIS) imagery to meet future mapping and monitoring needs (Roy and Boschetti 2009, Tucker and Yager 2011). The Visible Infrared Imaging Radiometer Suite (VIIRS) sensor onboard the Suomi-National Polar-orbiting Partnership...

  19. Visibility graph analysis of very short-term heart rate variability during sleep

    NASA Astrophysics Data System (ADS)

    Hou, F. Z.; Li, F. W.; Wang, J.; Yan, F. R.

    2016-09-01

    Based on a visibility-graph algorithm, complex networks were constructed from very short-term heart rate variability (HRV) during different sleep stages. Network measurements progressively changed from rapid eye movement (REM) sleep to light sleep and then deep sleep, exhibiting promising ability for sleep assessment. Abnormal activation of the cardiovascular controls, with enhanced 'small-world' couplings and altered fractal organization during REM sleep, indicates that REM sleep could be a potential risk factor for adverse cardiovascular events, especially in males, older individuals, and people who are overweight. Additionally, an apparent influence of gender, aging, and obesity on sleep was demonstrated in healthy adults, which may be helpful for establishing expected sleep-HRV patterns in different populations.
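
    The visibility-graph construction underlying this kind of analysis links two samples of the series when no intermediate sample blocks the line of sight between them. A minimal sketch of the horizontal variant, where the criterion reduces to comparing both endpoints against the running maximum of the intervening values (the natural-visibility criterion, based on slopes, differs slightly):

```python
def horizontal_visibility_graph(series):
    """Edges (i, j) of the HVG: samples i < j are linked iff every sample
    strictly between them is lower than both series[i] and series[j]."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))        # adjacent samples always see each other
        blocker = series[i + 1]      # running max of the intermediate values
        for j in range(i + 2, n):
            if series[i] > blocker and series[j] > blocker:
                edges.add((i, j))
            blocker = max(blocker, series[j])
    return edges
```

    Network measures such as degree distribution, clustering, and small-world indices are then computed on these edges.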

  20. Fully coupled simulation of multiple hydraulic fractures to propagate simultaneously from a perforated horizontal wellbore

    NASA Astrophysics Data System (ADS)

    Zeng, Qinglei; Liu, Zhanli; Wang, Tao; Gao, Yue; Zhuang, Zhuo

    2018-02-01

    In the hydraulic fracturing process in shale rock, multiple fractures perpendicular to a horizontal wellbore are usually driven to propagate simultaneously by the pumping operation. In this paper, a numerical method is developed for the propagation of multiple hydraulic fractures (HFs) by fully coupling the deformation and fracturing of the solid formation, fluid flow in the fractures, fluid partitioning through the horizontal wellbore, and the perforation entry loss effect. The extended finite element method (XFEM) is adopted to model arbitrary growth of the fractures. Newton's iteration is proposed to solve the fully coupled nonlinear equations, which is more efficient than the widely adopted fixed-point iteration in the literature and avoids the need to impose a fluid pressure boundary condition when solving the flow equations. A secant iterative method based on the stress intensity factor (SIF) is proposed to capture the different propagation velocities of multiple fractures. The numerical results are compared with theoretical solutions in the literature to verify the accuracy of the method. The simultaneous propagation of multiple HFs is then simulated by the newly proposed algorithm, and the coupled influences of propagation regime, stress interaction, wellbore pressure loss, and perforation entry loss on the simultaneous propagation of multiple HFs are investigated.
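
    The secant iteration used to track propagation velocities is, in essence, a derivative-free root search: it drives an SIF residual toward zero without evaluating a derivative. A generic sketch of such an iteration (the quadratic residual in the test is a stand-in; the paper's actual residual requires the full XFEM solve):

```python
def secant_solve(residual, x0, x1, tol=1e-10, max_iter=50):
    """Find x with residual(x) ~ 0 from two starting guesses, updating
    the slope estimate from the last two residual evaluations."""
    f0, f1 = residual(x0), residual(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            break
        if f1 == f0:          # flat secant: cannot make progress
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0 = x1, f1
        x1, f1 = x2, residual(x2)
    return x1
```

    Each step costs one residual evaluation, which matters when every evaluation is an expensive coupled solve.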

  1. Integration of vertical and in-seam horizontal well production analyses with stochastic geostatistical algorithms to estimate pre-mining methane drainage efficiency from coal seams: Blue Creek seam, Alabama

    PubMed Central

    Karacan, C. Özgen

    2015-01-01

    Coal seam degasification and its efficiency are directly related to the safety of coal mining. Degasification activities in the Black Warrior basin started in the early 1980s using vertical boreholes. Although the Blue Creek seam, which is part of the Mary Lee coal group, has been the main seam of interest for coal mining, vertical wellbores have also been completed in the Pratt, Mary Lee, and Black Creek coal groups of the Upper Pottsville formation to degasify multiple seams. Currently, the Blue Creek seam is further degasified 2–3 years in advance of mining using in-seam horizontal boreholes to ensure safe mining. The study area is located between Tuscaloosa and Jefferson counties in Alabama and was degasified using 81 vertical boreholes, some of which are still active. When the current longwall mine expanded its operation into this area in 2009, horizontal boreholes were also drilled in advance of mining for further degasification of only the Blue Creek seam to ensure a safe and productive operation. This paper presents an integrated study and a methodology that combine history matching results from vertical boreholes with production modeling of horizontal boreholes using geostatistical simulation to evaluate the spatial effectiveness of in-seam boreholes in reducing gas-in-place (GIP). Results in this study showed that the in-seam boreholes had an estimated effective drainage area of 2050 acres, with cumulative production of 604 MMscf of methane during ~2 years of operation. With horizontal borehole production, GIP in the Blue Creek seam decreased from an average of 1.52 MMscf to 1.23 MMscf per acre. It was also shown that effective gas flow capacity, which was independently modeled using vertical borehole data, affected horizontal borehole production. GIP and the effective gas flow capacity of coal seam gas were also used to predict the remaining gas potential of the Blue Creek seam. PMID:26435557

  2. Integration of vertical and in-seam horizontal well production analyses with stochastic geostatistical algorithms to estimate pre-mining methane drainage efficiency from coal seams: Blue Creek seam, Alabama.

    PubMed

    Karacan, C Özgen

    2013-07-30

    Coal seam degasification and its efficiency are directly related to the safety of coal mining. Degasification activities in the Black Warrior basin started in the early 1980s using vertical boreholes. Although the Blue Creek seam, which is part of the Mary Lee coal group, has been the main seam of interest for coal mining, vertical wellbores have also been completed in the Pratt, Mary Lee, and Black Creek coal groups of the Upper Pottsville formation to degasify multiple seams. Currently, the Blue Creek seam is further degasified 2-3 years in advance of mining using in-seam horizontal boreholes to ensure safe mining. The study area is located between Tuscaloosa and Jefferson counties in Alabama and was degasified using 81 vertical boreholes, some of which are still active. When the current longwall mine expanded its operation into this area in 2009, horizontal boreholes were also drilled in advance of mining for further degasification of only the Blue Creek seam to ensure a safe and productive operation. This paper presents an integrated study and a methodology that combine history matching results from vertical boreholes with production modeling of horizontal boreholes using geostatistical simulation to evaluate the spatial effectiveness of in-seam boreholes in reducing gas-in-place (GIP). Results in this study showed that the in-seam boreholes had an estimated effective drainage area of 2050 acres, with cumulative production of 604 MMscf of methane during ~2 years of operation. With horizontal borehole production, GIP in the Blue Creek seam decreased from an average of 1.52 MMscf to 1.23 MMscf per acre. It was also shown that effective gas flow capacity, which was independently modeled using vertical borehole data, affected horizontal borehole production. GIP and the effective gas flow capacity of coal seam gas were also used to predict the remaining gas potential of the Blue Creek seam.

  3. Real-time out-of-plane artifact subtraction tomosynthesis imaging using prior CT for scanning beam digital x-ray system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Meng, E-mail: mengwu@stanford.edu; Fahrig, Rebecca

    2014-11-01

    Purpose: The scanning beam digital x-ray system (SBDX) is an inverse-geometry fluoroscopic system with high dose efficiency and the ability to perform continuous real-time tomosynthesis in multiple planes. This system could be used for image guidance during lung nodule biopsy. However, the reconstructed images suffer from strong out-of-plane artifacts due to the small tomographic angle of the system. Methods: The authors propose an out-of-plane artifact subtraction tomosynthesis (OPAST) algorithm that utilizes a prior CT volume to augment the run-time image processing. A blur-and-add (BAA) analytical model, derived from the project-to-backproject physical model, permits the generation of tomosynthesis images that are a good approximation to the shift-and-add (SAA) reconstructed image. A computationally practical algorithm is proposed to simulate images and out-of-plane artifacts from patient-specific prior CT volumes using the BAA model. A 3D image registration algorithm to align the simulated and reconstructed images is described. The accuracy of the BAA analytical model and the OPAST algorithm was evaluated using three lung cancer patients' CT data. The OPAST and image registration algorithms were also tested with added nonrigid respiratory motions. Results: Image similarity measurements, including the correlation coefficient, mean squared error, and structural similarity index, indicated that the BAA model is very accurate in simulating the SAA images from the prior CT for the SBDX system. The shift-variant effect of the BAA model can be ignored when the shifts between SBDX images and CT volumes are within ±10 mm in the x and y directions. Nodule visibility and depth resolution are improved by subtracting the simulated artifacts from the reconstructions. The image registration and OPAST are robust in the presence of added respiratory motions. The dominant artifacts in the subtraction images are caused by mismatches between the real object and the prior CT volume. Conclusions: The proposed prior-CT-augmented OPAST reconstruction algorithm improves lung nodule visibility and depth resolution for the SBDX system.
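    The core OPAST idea, simulating the out-of-plane artifact from a prior volume and subtracting it from the run-time reconstruction, can be illustrated with a toy shift-and-add model. The two-plane geometry, integer per-view shifts, and perfectly registered prior below are simplifying assumptions, not the paper's BAA model or SBDX geometry:

```python
import numpy as np

# Toy 2-plane illustration of out-of-plane artifact subtraction (OPAST).
def forward_project(volume, views):
    # Each view shifts plane z by z*v before summing (parallel-beam toy).
    return [sum(np.roll(volume[z], z * v) for z in range(len(volume)))
            for v in views]

def saa(projections, views, plane):
    # Shift-and-add: undo the focal plane's shift in each view, then average.
    return np.mean([np.roll(p, -plane * v)
                    for p, v in zip(projections, views)], axis=0)

views = [-1, 0, 1]
vol = np.zeros((2, 32))
vol[0, 10] = 1.0          # in-plane feature we want to see
vol[1, 20] = 2.0          # out-of-plane feature that smears into the image

prior = vol.copy()        # idealized, perfectly registered "prior CT"
recon = saa(forward_project(vol, views), views, plane=0)
artifact = saa(forward_project(prior, views), views, plane=0) - prior[0]
opast = recon - artifact  # artifact-subtracted image of plane 0
```

    With a perfect prior the subtraction recovers the focal plane exactly; in practice, as the abstract notes, residual artifacts come from mismatches between the real object and the prior volume.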

  4. Eddy-induced salinity pattern in the North Pacific

    NASA Astrophysics Data System (ADS)

    Abe, H.; Ebuchi, N.; Ueno, H.; Ishiyama, H.; Matsumura, Y.

    2017-12-01

    This research examines the spatio-temporal behavior of sea surface salinity (SSS) after intense rainfall events using observed data from Aquarius. Aquarius SSS in the North Pacific reveals one notable event in which SSS was locally freshened by intense rainfall. Although the SSS pattern shortly after the rainfall reflected the atmospheric pattern, its final form reflected an ocean dynamic structure: an anticyclonic eddy. Because this anticyclonic eddy was located at the SSS front created by precipitation, it stirred the water in a clockwise direction. This eddy stirring was visible for several months. Horizontal transport by mesoscale eddies is thus expected to play a significant role in determining the upper-ocean salinity structure.

  5. Strategies to Evaluate the Visibility Along AN Indoor Path in a Point Cloud Representation

    NASA Astrophysics Data System (ADS)

    Grasso, N.; Verbree, E.; Zlatanova, S.; Piras, M.

    2017-09-01

    Many research works have addressed the formulation of algorithms for estimating paths in indoor environments from three-dimensional representations of space. The architectural configuration, the actions that take place within it, and the location of objects in the space influence the paths along which it is possible to move, as they may cause visibility problems. To overcome the visibility issue, different methods have been proposed that identify the areas visible from a certain point of view, but they often do not take into account the user's visual perception of the environment and do not allow estimating how complicated it may be to follow a certain path. In the fields of space syntax and cognitive science, attempts have been made to describe the characteristics of a building or an urban environment with the isovist and visibility graph methods; some numerical properties of these representations describe the space as it is perceived by a user. However, most of these studies analyze the environment in a two-dimensional space. In this paper we propose a method to evaluate quantitatively the complexity of a certain path within an environment represented by a three-dimensional point cloud, by combining some of the previously mentioned techniques and considering the space visible from a certain point of view, depending on the moving agent (pedestrians, people in wheelchairs, UAVs, UGVs, robots).
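    A basic building block of such point-of-view analyses is an occlusion test over a point cloud. The sketch below restricts itself to 2D and uses simple angular bins, keeping only the nearest point per viewing direction as visible; the bin count and the 2D restriction are assumptions, and a real isovist computation over a 3D cloud would use solid-angle bins and density-aware occlusion handling:

```python
import numpy as np

# Depth-buffer-style visibility test from a viewpoint over a 2D point cloud:
# points are binned by azimuth and only the nearest point per bin is visible.
def visible_points(points, viewpoint, n_bins=90):
    rel = points - viewpoint
    angles = np.arctan2(rel[:, 1], rel[:, 0])           # azimuth of each point
    dists = np.hypot(rel[:, 0], rel[:, 1])
    bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    visible = np.zeros(len(points), dtype=bool)
    for b in np.unique(bins):
        idx = np.where(bins == b)[0]
        visible[idx[np.argmin(dists[idx])]] = True       # nearest wins the bin
    return visible

# Two points share a direction; only the nearer one is visible.
pts = np.array([[1.0, 0.0], [2.0, 0.0], [0.0, 1.0]])
vis = visible_points(pts, np.array([0.0, 0.0]))
print(vis)
```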

  6. How nonlinear optics can merge interferometry for high resolution imaging

    NASA Astrophysics Data System (ADS)

    Ceus, D.; Reynaud, F.; Tonello, A.; Delage, L.; Grossard, L.

    2017-11-01

    High resolution stellar interferometers are powerful and efficient instruments for gaining a better knowledge of our Universe through the spatial coherence analysis of light. For this purpose, the optical fields collected by each telescope Ti are mixed together. From the interferometric pattern, two quantities are extracted: the contrast Cij and the phase φij. These lead to the complex visibility Vij = Cij exp(jφij). For each telescope doublet TiTj, it is possible to obtain a complex visibility Vij. The Van Cittert-Zernike theorem gives a relationship between the intensity distribution of the observed object and the complex visibility. The combination of the acquired complex visibilities and a reconstruction algorithm allows image reconstruction. To avoid many technical difficulties related to infrared optics (component transmission, thermal noise, thermal cooling, ...), our team proposes to explore the possibility of using nonlinear optical techniques as a promising alternative for detecting infrared optical signals. We experimentally demonstrate that frequency conversion does not introduce additional bias in the interferometric data supplied by a stellar interferometer. In this presentation, we report on wavelength conversion of the light collected by each telescope from the infrared domain to the visible. The interferometric pattern is observed in the visible domain with our so-called upconversion interferometer. Thereby, one can benefit from mature optical components mainly used in optical telecommunications (waveguides, couplers, multiplexers, ...) and efficient low-noise detection schemes up to the single-photon counting level.
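    The relation Vij = Cij exp(jφij) means the contrast and phase can be recovered from a fringe pattern by demodulating at the fringe frequency. The sketch below does this for a simulated one-dimensional fringe; the known fringe frequency and noiseless intensity are assumptions (real instruments estimate the frequency from the baseline geometry):

```python
import numpy as np

# Recover the complex visibility V = C * exp(j*phi) from a simulated fringe
# I(x) = I0 * (1 + C cos(2*pi*f*x + phi)), with the fringe frequency f known.
N, f = 1024, 8                       # samples and fringe cycles in the window
x = np.arange(N) / N
C_true, phi_true, I0 = 0.6, 0.9, 2.0
I = I0 * (1 + C_true * np.cos(2 * np.pi * f * x + phi_true))

# Demodulate at the fringe frequency: the complex amplitude of the f-cycle
# component, normalized by the mean intensity, equals (C/2) * exp(j*phi).
V = 2 * np.mean(I * np.exp(-1j * 2 * np.pi * f * x)) / np.mean(I)

print(abs(V), np.angle(V))           # contrast ~0.6, phase ~0.9 rad
```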

  7. Modeling techniques for cross-hole seismic monitoring of CO2 injection in a deep saline aquifer

    NASA Astrophysics Data System (ADS)

    Da Col, Federico; Gei, Davide

    2017-04-01

    In this work, we present a modelling technique for a synthetic, yet realistic, 2D cross-hole seismic monitoring experiment for CO2 injection in a deep saline aquifer. We implement a synthetic (2D) geological formation consisting of a sandstone aquifer, with shaly mudstone intrusions, embedded in very low permeability shales. The aquifer has its top at about 800 m b.s.l., is approximately 200 m thick, and extends about 800 m in the horizontal direction. The formation is very heterogeneous with respect to all petrophysical and hydrological properties; furthermore, we consider the grains to be a mixture of quartz and clay. Injection of the CO2 and the propagation of the plume are modelled using the STOMP commercial software. The algorithm solves the mass balance equations for wetting and non-wetting phase fluids, as well as for the dissolved salt. It considers advection via Darcy's equation extended to two-phase flow, and molecular diffusion. Furthermore, dissolution of the CO2 in the brine is considered. We assume the aquifer to be initially in hydrostatic equilibrium and we inject pure CO2 for 2 years. We then compute phase P-wave velocities and quality factors by means of White's mesoscopic theory, which assumes that the partially saturated pore space consists of two concentric spheres: the inner saturated with gas, the outer saturated with brine. Using this P-wave velocity and quality factor map, we compute synthetic cross-hole seismograms by means of a visco-acoustic modelling code. We perform 80 shots along the left borehole, with a source spacing of 5 metres. We then pick the first arrivals (direct wave) on the seismograms and perform a tomographic inversion using the cat3d software. We invert for straight rays, updating the velocity model with a SIRT algorithm at each iteration. Due to the mainly horizontal orientation of the velocity anomalies, we choose to invert only rays having an angle lower than 30° with the horizontal direction. The algorithm converged well after 200 iterations; furthermore, the picked and computed travel times fit rather well, with residuals showing a Gaussian distribution around 0. The method looks promising, since the main velocity anomalies are well detected.
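    The SIRT update used in such straight-ray travel-time tomography can be sketched in a few lines: each cell's slowness is corrected by travel-time residuals back-distributed along the rays. The path-length matrix, relaxation factor, and starting model below are illustrative assumptions, not the cat3d implementation:

```python
import numpy as np

# Schematic SIRT for straight-ray travel-time tomography.
# A[i, j] = length of ray i inside cell j; t[i] = picked first-arrival time.
def sirt(A, t, n_iter=200, relax=0.5):
    slowness = np.full(A.shape[1], np.mean(t) / np.mean(A.sum(axis=1)))
    row_sum = A.sum(axis=1)            # total path length of each ray
    col_sum = A.sum(axis=0)            # total ray length crossing each cell
    for _ in range(n_iter):
        residual = t - A @ slowness            # travel-time misfit per ray
        update = A.T @ (residual / row_sum)    # back-project the residuals
        slowness += relax * update / np.maximum(col_sum, 1e-12)
    return 1.0 / slowness                      # velocity per cell

# Trivial two-ray, two-cell example: each ray crosses one unit-length cell.
A = np.array([[1.0, 0.0], [0.0, 1.0]])
v = sirt(A, np.array([2.0, 4.0]))
print(v)            # converges toward velocities 0.5 and 0.25
```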

  8. An Algorithm for Pedestrian Detection in Multispectral Image Sequences

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.; Fedorenko, V. V.

    2017-05-01

    The growing interest in self-driving cars creates a demand for scene understanding and obstacle detection algorithms. One of the most challenging problems in this field is pedestrian detection. The main difficulties arise from the diverse appearances of pedestrians. Poor visibility conditions such as fog and low light also significantly decrease the quality of pedestrian detection. This paper presents a new optical-flow-based algorithm, BipedDetect, that provides robust pedestrian detection on a single-board computer. The algorithm is based on the idea of simplified Kalman filtering suitable for realization on modern single-board computers. To detect a pedestrian, a synthetic optical flow of the scene without pedestrians is generated using a slanted-plane model. The estimate of the real optical flow is generated using a multispectral image sequence. The difference between the synthetic and the real optical flow yields the optical flow induced by pedestrians. The final detection of pedestrians is done by segmenting this difference. To evaluate the BipedDetect algorithm, a multispectral dataset was collected using a mobile robot.
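    The detection-by-difference step can be sketched directly: subtract the synthetic "empty scene" flow from the measured flow and threshold the residual magnitude. The flow fields and threshold below are illustrative assumptions, not the paper's slanted-plane model or Kalman filter:

```python
import numpy as np

# Segment candidate pedestrian pixels as the residual between the measured
# optical flow and a synthetic flow of the scene without pedestrians.
def detect_moving_regions(flow_measured, flow_synthetic, thresh=0.5):
    diff = np.linalg.norm(flow_measured - flow_synthetic, axis=-1)
    return diff > thresh          # boolean mask of candidate pixels

# Static toy scene: the measured flow deviates only where something moves.
H, W = 4, 6
synthetic = np.zeros((H, W, 2))                # empty-scene flow (all zero)
measured = synthetic.copy()
measured[1:3, 2:4] = [1.0, 0.0]                # 2x2 region moving 1 px/frame

mask = detect_moving_regions(measured, synthetic)
print(mask.sum())                              # 4 pixels flagged
```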

  9. Description of a dual fail operational redundant strapdown inertial measurement unit for integrated avionics systems research

    NASA Technical Reports Server (NTRS)

    Bryant, W. H.; Morrell, F. R.

    1981-01-01

    An experimental redundant strapdown inertial measurement unit (RSDIMU) is developed to satisfy safety and reliability considerations in the integrated avionics concept. The unit includes four two-degree-of-freedom tuned-rotor gyros and four accelerometers in a skewed and separable semioctahedral array. These sensors are coupled to four microprocessors which compensate for sensor errors. The microprocessors are interfaced with two flight computers which run failure detection, isolation, redundancy management, and general flight control/navigation algorithms. Since the RSDIMU is a developmental unit, it is imperative that the flight computers provide special visibility and facility for algorithm modification.

  10. Constraints as a destriping tool for Hires images

    NASA Technical Reports Server (NTRS)

    Cao, Yu; Prince, Thomas A.

    1994-01-01

    Images produced by the Maximum Correlation Method (MCM) sometimes suffer from visible striping artifacts, especially in areas of extended sources. Possible causes are differing baseline levels and calibration errors in the detectors. We incorporated these factors into the MCM algorithm and tested the effects of different constraints on the output image. The result shows a significant visual improvement over the standard MCM method. In some areas the new images show intelligible structures that are otherwise corrupted by striping artifacts, and the removal of these artifacts could enhance the performance of object classification algorithms. The constraints were also tested on low surface brightness areas and were found to be effective in reducing the noise level.

  11. Interpretation of magnetotelluric resistivity and phase soundings over horizontal layers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patella, D.

    1976-02-01

    The present paper deals with a new inverse method for quantitatively interpreting magnetotelluric apparent resistivity and phase-lag sounding curves over horizontally stratified earth sections. The recurrent character of the general formula relating the wave impedance of an (n-1)-layered medium to that of an n-layered medium suggests the use of the method of reduction to a lower boundary plane, as originally termed by Koefoed in the case of dc resistivity soundings. The layering parameters are thus derived directly by a simple iterative procedure. The method is applicable for any number of layers, but only when both apparent resistivity and phase-lag sounding curves are jointly available. Moreover, no sophisticated algorithm is required: a simple desk electronic calculator together with a sheet of two-layer apparent resistivity and phase-lag master curves is sufficient to reproduce earth sections which, within the range of equivalence, are all consistent with the field data.
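    The layered-impedance recursion that this interpretation method inverts is standard and can be written down directly: the impedance of the basal half-space is reduced layer by layer up to the surface, and the surface impedance gives the apparent resistivity and phase. The sign and time conventions below are one common choice, assumed for the sketch:

```python
import numpy as np

# 1D magnetotelluric forward response over horizontal layers: reduce the
# wave impedance from the basal half-space upward, layer by layer.
MU0 = 4e-7 * np.pi

def mt_response(rho, h, freq):
    """rho: layer resistivities (ohm*m), last entry = basal half-space;
    h: thicknesses (m) of the layers above it; freq: frequency in Hz.
    Returns surface apparent resistivity (ohm*m) and phase (degrees)."""
    omega = 2 * np.pi * freq
    Z = np.sqrt(1j * omega * MU0 * rho[-1])       # half-space impedance
    for j in range(len(h) - 1, -1, -1):           # reduce upward
        zj = np.sqrt(1j * omega * MU0 * rho[j])   # intrinsic impedance
        kj = np.sqrt(1j * omega * MU0 / rho[j])   # propagation constant
        t = np.tanh(kj * h[j])
        Z = zj * (Z + zj * t) / (zj + Z * t)
    rho_a = abs(Z) ** 2 / (omega * MU0)
    phase = np.degrees(np.angle(Z))
    return rho_a, phase

# Sanity check: a uniform earth returns its own resistivity and a 45° phase.
rho_a, phase = mt_response(np.array([100.0, 100.0]), np.array([500.0]), 1.0)
print(rho_a, phase)
```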

  12. Automatic layout of structured hierarchical reports.

    PubMed

    Bakke, Eirik; Karger, David R; Miller, Robert C

    2013-12-01

    Domain-specific database applications tend to contain a sizable number of table-, form-, and report-style views that must each be designed and maintained by a software developer. A significant part of this job is the necessary tweaking of low-level presentation details such as label placements, text field dimensions, list or table styles, and so on. In this paper, we present a horizontally constrained layout management algorithm that automates the display of structured hierarchical data using the traditional visual idioms of hand-designed database UIs: tables, multi-column forms, and outline-style indented lists. We compare our system with pure outline and nested table layouts with respect to space efficiency and readability, the latter with an online user study on 27 subjects. Our layouts are 3.9 and 1.6 times more compact on average than outline layouts and horizontally unconstrained table layouts, respectively, and are as readable as table layouts even for large datasets.

  13. Evaluation of the relationship between mandibular third molar and mandibular canal by different algorithms of cone-beam computed tomography.

    PubMed

    Mehdizadeh, Mojdeh; Ahmadi, Navid; Jamshidi, Mahsa

    2014-11-01

    The exact location of the inferior alveolar nerve (IAN) bundle is very important. The aim of this study is to evaluate the relationship between the mandibular third molar and the mandibular canal by cone-beam computed tomography (CBCT). This was a cross-sectional study with convenience sampling. 94 mandibular CBCT scans, performed with the CSANEX 3D machine (Soredex, Finland) and the 3D system, were chosen. The vertical and horizontal relationships between the mandibular canal and the third molar were depicted in the 3D view, the panoramic reformat view, and the cross-sectional view of CBCT. The cross-sectional view was our gold standard, and the other views were evaluated against it. There were significant differences between the vertical and horizontal relations of nerve and tooth in all views (p < 0.001). The results showed differences in the position of the inferior alveolar nerve across the different CBCT views, so CBCT images are not entirely reliable and carry a possibility of error.

  14. Design considerations for a real-time ocular counterroll instrument

    NASA Technical Reports Server (NTRS)

    Hatamian, M.; Anderson, D. J.

    1983-01-01

    A real-time algorithm for measuring three-dimensional movement of the human eye, especially torsional movement, is presented. As its input, the system uses images of the eyeball taken at video rate. The amount of horizontal and vertical movement is extracted using a pupil-tracking technique. The torsional movement is then measured by computing the discrete cross-correlation function between circular samples of successive images of the iris patterns and searching for the position of the peak of the function. A local least-squares interpolation around the peak of the cross-correlation function is used to produce nearly unbiased estimates of the torsion angle with an accuracy of about 3-4 arcmin. Accuracies better than 0.03 deg are achievable in torsional measurement with an SNR higher than 36 dB. Horizontal and vertical rotations of up to ±13 deg can occur simultaneously with torsion without introducing any appreciable error in the counterrolling measurement process.
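    The torsion measurement described above, circular cross-correlation of iris samples followed by a local least-squares refinement of the peak, can be sketched as follows. The FFT-based correlation and the parabolic (three-point least-squares) peak fit are common implementation choices assumed here; the sampling density is illustrative:

```python
import numpy as np

# Torsion estimate from circular iris samples: circular cross-correlation,
# peak search, and parabolic sub-sample refinement around the peak.
def torsion_deg(ref, cur):
    n = len(ref)
    # Circular cross-correlation via FFT; peak lag = rotation in samples.
    corr = np.real(np.fft.ifft(np.fft.fft(cur) * np.conj(np.fft.fft(ref))))
    k = int(np.argmax(corr))
    # Parabolic interpolation around the peak for sub-sample accuracy.
    y0, y1, y2 = corr[k - 1], corr[k], corr[(k + 1) % n]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    shift = (k + delta) % n
    if shift > n / 2:
        shift -= n                    # wrap to a signed rotation
    return shift * 360.0 / n          # samples -> degrees around the circle

theta = np.linspace(0, 2 * np.pi, 720, endpoint=False)
ref = np.cos(3 * theta) + 0.5 * np.cos(7 * theta)   # synthetic iris profile
cur = np.roll(ref, 4)                # simulate a 4-sample (2 deg) torsion
print(torsion_deg(ref, cur))         # ~2.0
```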

  15. Reconstruction of Horizontal Plasma Motions at the Photosphere from Intensitygrams: A Comparison Between DeepVel, LCT, FLCT, and CST

    NASA Astrophysics Data System (ADS)

    Tremblay, Benoit; Roudier, Thierry; Rieutord, Michel; Vincent, Alain

    2018-04-01

    Direct measurements of plasma motions in the photosphere are limited to the line-of-sight component of the velocity. Several algorithms have therefore been developed to reconstruct the transverse components from observed continuum images or magnetograms. We compare the space and time averages of horizontal velocity fields in the photosphere inferred from pairs of consecutive intensitygrams by the LCT, FLCT, and CST methods and the DeepVel neural network in order to identify the method that is best suited for generating synthetic observations to be used for data assimilation. The Stein and Nordlund ( Astrophys. J. Lett. 753, L13, 2012) magnetoconvection simulation is used to generate synthetic SDO/HMI intensitygrams and reference flows to train DeepVel. Inferred velocity fields show that DeepVel performs best at subgranular and granular scales and is second only to FLCT at mesogranular and supergranular scales.

  16. Doubling transmission capacity in optical wireless system by antenna horizontal- and vertical-polarization multiplexing.

    PubMed

    Li, Xinying; Yu, Jianjun; Zhang, Junwen; Dong, Ze; Chi, Nan

    2013-06-15

    We experimentally demonstrate 2×56 Gb/s two-channel polarization-division-multiplexing quadrature-phase-shift-keying signal delivery over 80 km of SMF-28 single-mode fiber and a 2 m Q-band (33-50 GHz) wireless link, adopting antenna horizontal- (H-) and vertical-polarization (V-polarization) multiplexing. At the wireless receiver, classic constant-modulus-algorithm equalization based on digital signal processing can realize polarization demultiplexing and remove the crosstalk at the same antenna polarization. By adopting antenna polarization multiplexing, the signal baud rate and the performance requirements for optical and wireless devices can be reduced, at the cost of doubling the antennas and devices, while the wireless transmission capacity can also be increased, at the cost of stricter requirements on V-polarization alignment. The isolation is only about 19 dB when the V-polarization deviation approaches 10°, which will affect high-speed (>50 Gb/s) wireless delivery.

  17. A study on obstacle detection method of the frontal view using a camera on highway

    NASA Astrophysics Data System (ADS)

    Nguyen, Van-Quang; Park, Jeonghyeon; Seo, Changjun; Kim, Heungseob; Boo, Kwangsuck

    2018-03-01

    In this work, we introduce an approach to detect vehicles for driver assistance or warning systems. A driver assistance system must detect both lanes (the left- and right-side lanes) and discover vehicles ahead of the test vehicle. Therefore, in this study, we use a camera installed on the windscreen of the test vehicle. Images from the camera are used to detect three lanes and to detect multiple vehicles. In lane detection, line detection and vanishing point estimation are used. For vehicle detection, we combine horizontal and vertical edge detection: horizontal edges are used to detect vehicle candidates, and vertical edge detection is then used to verify the candidates. The proposed algorithm works with a 480 × 640 image frame resolution. The system was tested on a highway in Korea.
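    The two-stage edge test can be illustrated with a toy patch classifier: strong horizontal edges (e.g. the bumper/shadow line) propose a candidate, which is kept only if it also shows vertical edges (the vehicle's sides). The difference operators and thresholds below are illustrative assumptions, not the paper's detectors:

```python
import numpy as np

# Toy two-stage edge test for a candidate image patch.
def edge_energy(img, axis):
    # Sum of absolute first differences along the given axis.
    return np.abs(np.diff(img.astype(float), axis=axis)).sum()

def looks_like_vehicle(patch, h_thresh=4.0, v_thresh=4.0):
    horiz = edge_energy(patch, axis=0)   # intensity change between rows
    vert = edge_energy(patch, axis=1)    # intensity change between columns
    return bool(horiz > h_thresh and vert > v_thresh)

road = np.zeros((8, 8))                        # featureless road: no edges
car = np.zeros((8, 8))
car[4:, 2:6] = 10.0                            # dark box: both edge types
print(looks_like_vehicle(road), looks_like_vehicle(car))
```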

  18. Visible light communication based vehicle positioning using LED street light and rolling shutter CMOS sensors

    NASA Astrophysics Data System (ADS)

    Do, Trong Hop; Yoo, Myungsik

    2018-01-01

    This paper proposes a vehicle positioning system using LED street lights and two rolling shutter CMOS sensor cameras. In this system, identification codes for the LED street lights are transmitted to camera-equipped vehicles through a visible light communication (VLC) channel. Given that the camera parameters are known, the positions of the vehicles are determined based on the geometric relationship between the coordinates of the LEDs in the images and their real world coordinates, which are obtained through the LED identification codes. The main contributions of the paper are twofold. First, the collinear arrangement of the LED street lights makes traditional camera-based positioning algorithms fail to determine the position of the vehicles. In this paper, an algorithm is proposed to fuse data received from the two cameras attached to the vehicles in order to solve the collinearity problem of the LEDs. Second, the rolling shutter mechanism of the CMOS sensors combined with the movement of the vehicles creates image artifacts that may severely degrade the positioning accuracy. This paper also proposes a method to compensate for the rolling shutter artifact, and a high positioning accuracy can be achieved even when the vehicle is moving at high speeds. The performance of the proposed positioning system corresponding to different system parameters is examined by conducting Matlab simulations. Small-scale experiments are also conducted to study the performance of the proposed algorithm in real applications.

  19. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    NASA Astrophysics Data System (ADS)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and verification of the sensor system are important tasks. To support these tasks, simulation of the sensor and its output is valuable. This enables developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates where the sensor looks by using a ray tracing algorithm. This also determines whether the observed part of the scene is shadowed or not. The second part describes the radiometry and results in the spectral at-sensor radiance from the visible spectrum to the thermal infrared, according to the simulated sensor type. In the case of Earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, such as a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.
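    The thermal-infrared end of the radiometric step rests on Planck's law for blackbody spectral radiance, which can be computed directly. The sketch below omits the atmosphere, surface emissivity, and sensor response that a full at-sensor simulation would include:

```python
import numpy as np

# Blackbody spectral radiance from Planck's law, the building block for
# simulating at-sensor radiance in the thermal infrared.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance in W / (m^2 * sr * m)."""
    a = 2 * H * C**2 / wavelength_m**5
    b = np.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

# An Earth-like 300 K surface observed near the 10 um atmospheric window:
L = planck_radiance(10e-6, 300.0)
print(L / 1e6)       # ~9.9 W / (m^2 * sr * um)
```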

  20. Alternative method for VIIRS Moon in space view process

    NASA Astrophysics Data System (ADS)

    Anderson, Samuel; Chiang, Kwofu V.; Xiong, Xiaoxiong

    2013-09-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) is a radiometric sensing instrument currently operating onboard the Suomi National Polar-orbiting Partnership (S-NPP) spacecraft. It provides high spatial-resolution images of the emitted and reflected radiation from the Earth and its atmosphere in 22 spectral bands (16 moderate resolution bands M1-M16, 5 imaging bands I1-I5, and 1 day/night pan band DNB) spanning the visible and infrared wavelengths from 412 nm to 12 μm. Just prior to each scan it makes of the Earth, the VIIRS instrument makes a measurement of deep space to serve as a background reference. These space view (SV) measurements form a crucial input to the VIIRS calibration process and are a major determinant of its accuracy. On occasion, the orientation of the Suomi NPP spacecraft coincides with the position of the moon in such a fashion that the SV measurements include light from the moon, rendering the SV measurements unusable for calibration. This paper investigates improvements to the existing baseline SV data processing algorithm of the Sensor Data Record (SDR) processing software. The proposed method makes use of a Moon-in-SV detection algorithm that identifies moon-contaminated SV data on a scan-by-scan basis. Use of this algorithm minimizes the number of SV scans that are rejected initially, so that subsequent substitution processes are always able to find alternative substitute SV scans in the near vicinity of detected moon-contaminated scans.
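    The scan-by-scan screening idea can be sketched simply: a deep-space background sample should be dark and flat, so a scan whose mean rises well above a robust baseline is flagged as moon-contaminated. The median-plus-MAD threshold and the synthetic scan means below are illustrative assumptions, not the VIIRS SDR criterion:

```python
import numpy as np

# Flag space-view (SV) scans whose mean signal rises above a robust baseline,
# indicating lunar contamination of the deep-space background measurement.
def flag_moon_scans(sv_means, k=5.0):
    med = np.median(sv_means)
    mad = np.median(np.abs(sv_means - med)) + 1e-12   # robust spread
    return sv_means > med + k * mad

# Synthetic scan means: a slowly varying clean baseline plus two bright scans.
sv = 100.0 + 0.05 * np.sin(np.arange(50))
sv[[20, 21]] += 30.0                  # two scans brightened by the Moon
print(np.flatnonzero(flag_moon_scans(sv)))
```

    Flagging contaminated scans individually, rather than rejecting whole blocks, is what lets the substitution step find clean replacement scans nearby.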
