Sample records for background field method

  1. A novel background field removal method for MRI using projection onto dipole fields (PDF).

    PubMed

    Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi

    2011-11-01

    For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
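The projection idea can be illustrated numerically (this is a toy sketch, not the authors' implementation): model every voxel outside the ROI as a unit dipole, fit the dipole strengths by least squares so that their combined field matches the measured field inside the ROI, and subtract the fitted background. The grid size, toy sources, FFT dipole kernel, and solver settings below are all illustrative assumptions.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsmr

n = 32
k = np.fft.fftfreq(n)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
k2 = kx**2 + ky**2 + kz**2
# unit dipole field kernel in k-space (B0 along z); k = 0 component set to zero
D = 1.0 / 3.0 - np.divide(kz**2, k2, out=np.zeros_like(k2), where=k2 > 0)
D[0, 0, 0] = 0.0

def field(chi):
    """Field shift produced by a susceptibility distribution (periodic toy model)."""
    return np.fft.ifftn(D * np.fft.fftn(chi)).real

# spherical ROI; background sources are allowed only outside it
x = np.arange(n) - n // 2
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
roi = X**2 + Y**2 + Z**2 < (n // 4) ** 2

# synthetic measurement: weak local source inside, strong source outside
chi_local = np.zeros((n,) * 3); chi_local[n // 2, n // 2, n // 2] = 1.0
chi_bg = np.zeros((n,) * 3); chi_bg[2, n // 2, n // 2] = 500.0
f_meas = field(chi_local + chi_bg)

def matvec(v):          # external dipole strengths -> field inside the ROI
    chi = np.zeros((n,) * 3); chi[~roi] = v
    return field(chi)[roi]

def rmatvec(w):         # the dipole convolution is real and even, hence self-adjoint
    f = np.zeros((n,) * 3); f[roi] = w
    return field(f)[~roi]

A = LinearOperator((int(roi.sum()), int((~roi).sum())),
                   matvec=matvec, rmatvec=rmatvec, dtype=np.float64)
coef = lsmr(A, f_meas[roi], maxiter=200)[0]     # project onto external dipole fields
chi_fit = np.zeros((n,) * 3); chi_fit[~roi] = coef
f_local = f_meas[roi] - field(chi_fit)[roi]     # background-removed local field
```

Because the background field lies exactly in the span of the external dipole fields, the projection removes it almost entirely, while (per the near-orthogonality observation above) only a small part of the local field is removed with it.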

  2. Limitations of the background field method applied to Rayleigh-Bénard convection

    NASA Astrophysics Data System (ADS)

    Nobili, Camilla; Otto, Felix

    2017-09-01

    We consider Rayleigh-Bénard convection as modeled by the Boussinesq equations, in the case of infinite Prandtl number and with no-slip boundary conditions. There is a broad interest in bounds on the upwards heat flux, as given by the Nusselt number Nu, in terms of the forcing via the imposed temperature difference, as given by the Rayleigh number Ra, in the turbulent regime Ra ≫ 1. In several studies, the background field method applied to the temperature field has been used to provide upper bounds on Nu in terms of Ra. In these applications, the background field method comes in the form of a variational problem where one optimizes a stratified temperature profile subject to a certain stability condition; the method is believed to capture the marginal stability of the boundary layer. The best available upper bound via this method is Nu ≲ Ra^{1/3} (ln Ra)^{1/15}; it proceeds via the construction of a stable temperature background profile that increases logarithmically in the bulk. In this paper, we show that the background temperature field method cannot provide a tighter upper bound in terms of the power of the logarithm. However, by another method, one does obtain the tighter upper bound Nu ≲ Ra^{1/3} (ln ln Ra)^{1/3}, so that the result of this paper implies that the background temperature field method is unphysical in the sense that it cannot provide the optimal bound.

  3. Non-perturbative background field calculations

    NASA Astrophysics Data System (ADS)

    Stephens, C. R.

    1988-01-01

    New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation—perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation.

  4. Universal field matching in craniospinal irradiation by a background-dose gradient-optimized method.

    PubMed

    Traneus, Erik; Bizzocchi, Nicola; Fellin, Francesco; Rombi, Barbara; Farace, Paolo

    2018-01-01

    The gradient-optimized methods are overcoming the traditional feathering methods to plan field junctions in craniospinal irradiation. In this note, a new gradient-optimized technique, based on the use of a background dose, is described. Treatment planning was performed by RayStation (RaySearch Laboratories, Stockholm, Sweden) on the CT scans of a pediatric patient. Both proton (by pencil beam scanning) and photon (by volumetric modulated arc therapy) treatments were planned with three isocenters. An 'in silico' ideal background dose was created first to cover the upper-spinal target and to produce a perfect dose gradient along the upper and lower junction regions. Using it as background, the cranial and the lower-spinal beams were planned by inverse optimization to obtain dose coverage of their relevant targets and of the junction volumes. Finally, the upper-spinal beam was inversely planned after removal of the background dose and with the previously optimized beams switched on. In both proton and photon plans, the optimized cranial and lower-spinal beams produced a perfect linear gradient in the junction regions, complementary to that produced by the optimized upper-spinal beam. The final dose distributions showed a homogeneous coverage of the targets. Our simple technique allowed us to obtain high-quality gradients in the junction region. The technique works universally for photons as well as protons and is applicable to TPSs that allow the management of a background dose. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  5. Separation of foreground and background from light field using gradient information.

    PubMed

    Lee, Jae Young; Park, Rae-Hong

    2017-02-01

    Studies of computer vision or machine vision applications using a light field camera have been increasing in recent years. However, the capabilities of the light field camera are not fully exploited in these applications. In this paper, we propose a method for direct separation of foreground and background that uses gradient information and can be used in various applications such as pre-processing. From an optical phenomenon whereby the bundles of rays from the background are flipped, we derive that the disparity of the background in the captured three-dimensional scene has the opposite sign to that of the foreground. Using a majority-weighted voting algorithm based on the gradient information with the Lambertian assumption and the gradient constraint, the foreground and background can be separated at each pixel. The proposed method can thus serve as a pre-processing step for various applications such as occlusion and saliency detection, disparity estimation, and so on. Experimental results with the EPFL light field dataset and the Stanford Lytro light field dataset show that the proposed method achieves better performance in terms of occlusion detection, and thus can be used effectively in pre-processing for saliency detection and disparity estimation.
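The core observation, that background disparity takes the opposite sign from foreground disparity, can be illustrated with a gradient-based per-pixel estimate between two horizontally displaced views. The synthetic views, the first-order brightness-constancy approximation, and the validity mask below are illustrative assumptions; the paper's actual algorithm uses majority-weighted voting over light field sub-aperture images.

```python
import numpy as np

def disparity_sign(view_l, view_r, grad_thresh=1e-2):
    """Per-pixel disparity-sign estimate from two horizontally displaced views.

    First-order brightness constancy: view_r(x) ~= view_l(x - d), so
    view_r - view_l ~= -d * dI/dx and hence d ~= -(view_r - view_l) / I_x.
    Pixels with a weak spatial gradient are excluded (gradient constraint).
    """
    Ix = np.gradient(view_l, axis=1)
    valid = np.abs(Ix) > grad_thresh
    d = np.zeros_like(view_l)
    d[valid] = -(view_r - view_l)[valid] / Ix[valid]
    return np.sign(d), valid

# synthetic textured scene; the second view is shifted by +1 px (disparity +1)
x = np.arange(256)
view_l = np.tile(np.sin(2 * np.pi * x / 32), (64, 1))
view_r = np.roll(view_l, 1, axis=1)      # view_r(x) = view_l(x - 1)
sign, valid = disparity_sign(view_l, view_r)
```

A per-pixel majority vote over the resulting signs then labels each pixel foreground or background; here almost all valid pixels recover the positive disparity sign.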

  6. Background field Landau mode operators for the nucleon

    NASA Astrophysics Data System (ADS)

    Kamleh, Waseem; Bignell, Ryan; Leinweber, Derek B.; Burkardt, Matthias

    2018-03-01

    The introduction of a uniform background magnetic field breaks three-dimensional spatial symmetry for a charged particle and introduces Landau mode effects. Standard quark operators are inefficient at isolating the nucleon correlation function at nontrivial field strengths. We introduce novel quark operators constructed from the two-dimensional Laplacian eigenmodes that describe a charged particle on a finite lattice. These eigenmode-projected quark operators provide enhanced precision for calculating nucleon energy shifts in a magnetic field. Preliminary results are obtained for the neutron and proton magnetic polarisabilities using these methods.

  7. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain

    NASA Astrophysics Data System (ADS)

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C. M.; Chen, Zhong

    2017-08-01

    Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest, e.g. the brain, such as the air-tissue interface. In the vicinity of the air-tissue boundary, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient, and these regions often need to be excluded by brain mask erosion at the expense of losing information on the local field and thus susceptibility measures in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We utilize it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated from external sources can be effectively removed to get a more accurate estimation of the local field and thus of the QSM dipole inversion to map local tissue susceptibility sources.
Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions

  9. Cosmic Microwave Background Mapmaking with a Messenger Field

    NASA Astrophysics Data System (ADS)

    Huffenberger, Kevin M.; Næss, Sigurd K.

    2018-01-01

    We apply a messenger field method to solve the linear minimum-variance mapmaking equation in the context of Cosmic Microwave Background (CMB) observations. In simulations, the method produces sky maps that converge significantly faster than those from a conjugate gradient descent algorithm with a diagonal preconditioner, even though the computational cost per iteration is similar. The messenger method recovers large scales in the map better than conjugate gradient descent, and yields a lower overall χ2. In the single, pencil beam approximation, each iteration of the messenger mapmaking procedure produces an unbiased map, and the iterations become more optimal as they proceed. A variant of the method can handle differential data or perform deconvolution mapmaking. The messenger method requires no preconditioner, but a high-quality solution needs a cooling parameter to control the convergence. We study the convergence properties of this new method and discuss how the algorithm is feasible for the large data sets of current and future CMB experiments.
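The flavor of the messenger approach can be seen on the closely related Wiener-filter problem, which the mapmaking paper generalizes: an auxiliary "messenger" field t with uniform covariance T = τI shuttles between pixel space, where the noise covariance N is diagonal, and Fourier space, where the signal covariance S is diagonal, so no preconditioner and no joint inversion of S and N are ever needed. The 1D toy spectra, the choice τ = min(N), and the iteration count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
kf = np.fft.fftfreq(n)
P = 100.0 / (1.0 + (10 * kf) ** 2)       # signal power spectrum (S diagonal in Fourier)
N = 1.0 + rng.random(n)                  # heteroscedastic noise variance (diagonal in pixels)
d = np.sin(2 * np.pi * 3 * np.arange(n) / n) + rng.normal(0.0, np.sqrt(N))

tau = N.min()                            # messenger covariance T = tau * I
Nbar = N - tau
s = np.zeros(n)
for _ in range(300):
    t = (tau * d + Nbar * s) / N                            # pixel-space update
    s = np.fft.ifft(P / (P + tau) * np.fft.fft(t)).real     # Fourier-space update

# dense reference solution of (S^-1 + N^-1) x = N^-1 d for comparison
F = np.fft.fft(np.eye(n), norm="ortho")
S = (F.conj().T @ np.diag(P) @ F).real
x_direct = np.linalg.solve(np.linalg.inv(S) + np.diag(1.0 / N), d / N)
```

Each iteration costs two FFTs and elementwise arithmetic, and the fixed point of the two-step update is exactly the Wiener-filter solution; the cooling parameter mentioned in the abstract corresponds to scaling τ over the iterations.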

  10. Chameleon scalar fields in relativistic gravitational backgrounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsujikawa, Shinji; Tamaki, Takashi; Tavakol, Reza, E-mail: shinji@rs.kagu.tus.ac.jp, E-mail: tamaki@gravity.phys.waseda.ac.jp, E-mail: r.tavakol@qmul.ac.uk

    2009-05-15

    We study the field profile of a scalar field φ that couples to a matter fluid (dubbed a chameleon field) in the relativistic gravitational background of a spherically symmetric spacetime. Employing a linear expansion in terms of the gravitational potential Φ_c at the surface of a compact object with a constant density, we derive the thin-shell field profile both inside and outside the object, as well as the resulting effective coupling with matter, analytically. We also carry out numerical simulations for the class of inverse power-law potentials V(φ) = M^{4+n} φ^{-n} by employing the information provided by our analytical solutions to set the boundary conditions around the centre of the object and show that thin-shell solutions in fact exist if the gravitational potential Φ_c is smaller than 0.3, which marginally covers the case of neutron stars. Thus the chameleon mechanism is present in relativistic gravitational backgrounds, capable of reducing the effective coupling. Since thin-shell solutions are sensitive to the choice of boundary conditions, our analytic field profile is very helpful for providing appropriate boundary conditions for Φ_c ≈

  11. Multiphoton amplitude in a constant background field

    NASA Astrophysics Data System (ADS)

    Ahmad, Aftab; Ahmadiniaz, Naser; Corradini, Olindo; Kim, Sang Pyo; Schubert, Christian

    2018-01-01

    In this contribution, we present our recent compact master formulas for the multiphoton amplitudes of a scalar propagator in a constant background field using the worldline formulation of quantum field theory. The constant field has been included nonperturbatively, which is crucial for strong external fields. A possible application is the scattering of photons by electrons in a strong magnetic field, a process that has been a subject of great interest since the discovery of astrophysical objects like radio pulsars, which provide evidence that magnetic fields of the order of 10¹² G are present in nature. The presence of a strong external field leads to a strong deviation from the classical scattering amplitudes. We explicitly work out the Compton scattering amplitude in a magnetic field, which is a process of potential relevance for astrophysics. Our final result is compact and suitable for numerical integration.

  12. Consistent compactification of double field theory on non-geometric flux backgrounds

    NASA Astrophysics Data System (ADS)

    Hassler, Falk; Lüst, Dieter

    2014-05-01

    In this paper, we construct non-trivial solutions to the 2D-dimensional field equations of Double Field Theory (DFT) by using a consistent Scherk-Schwarz ansatz. The ansatz identifies 2(D - d) internal directions with a twist U^M_N which is directly connected to the covariant fluxes F_{ABC}. It exhibits 2(D - d) linearly independent generalized Killing vectors K_I^J and gives rise to a gauged supergravity in d dimensions. We analyze the covariant fluxes and the corresponding gauged supergravity with a Minkowski vacuum. We calculate fluctuations around such vacua and show how they give rise to massive scalar fields and vector fields with a non-abelian gauge algebra. Because DFT is a background-independent theory, these fields should directly correspond to the string excitations in the corresponding background. For (D - d) = 3 we perform a complete scan of all allowed covariant fluxes and find two different kinds of backgrounds: the single and the double elliptic case. The latter is not T-dual to a geometric background and cannot be transformed to a geometric setting by a field redefinition either. While this background fulfills the strong constraint, it is still consistent with the Killing vectors depending on the coordinates and the winding coordinates, thereby giving a non-geometric patching. This background can therefore not be described in Supergravity or Generalized Geometry.

  13. Moving branes in the presence of background tachyon fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rezaei, Z., E-mail: z.rezaei@aut.ac.ir; Kamani, D., E-mail: kamani@aut.ac.ir

    2011-12-15

    We compute the boundary state associated with a moving Dp-brane in the presence of the open string tachyon field as a background field. The effect of the tachyon condensation on the boundary state is discussed. It leads to a boundary state associated with a lower-dimensional moving D-brane or a stationary instantonic D-brane. The former originates from condensation along the spatial directions and the latter comes from the temporal direction of the D-brane worldvolume. Using the boundary state, we also study the interaction amplitude between two arbitrary Dp₁- and Dp₂-branes. The long-range behavior of the amplitude is investigated, demonstrating an obvious deviation from the conventional form, due to the presence of the background tachyon field.

  14. Detection and correction of laser induced breakdown spectroscopy spectral background based on spline interpolation method

    NASA Astrophysics Data System (ADS)

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique that has gained increasing attention because of its many applications. The production of a continuous background in LIBS is inevitable because of factors associated with laser energy, gate width, time delay, and experimental environment. The continuous background significantly influences the analysis of the spectrum. Researchers have proposed several background correction methods, such as polynomial fitting, Lorentz fitting and model-free methods. However, few of them have been applied in the field of LIBS technology, particularly in qualitative and quantitative analyses. This study proposes a method based on spline interpolation for detecting and estimating the continuous background spectrum according to its smooth characteristic. Experiments on the background correction simulation indicated that the spline interpolation method acquired the largest signal-to-background ratio (SBR) over polynomial fitting, Lorentz fitting and the model-free method after background correction. These background correction methods all acquire larger SBR values than that acquired before background correction (the SBR value before background correction is 10.0992, whereas the SBR values after background correction by spline interpolation, polynomial fitting, Lorentz fitting, and model-free methods are 26.9576, 24.6828, 18.9770, and 25.6273, respectively). After adding random noise with different signal-to-noise ratios to the spectrum, the spline interpolation method acquires large SBR values, whereas polynomial fitting and the model-free method obtain low SBR values. All of the background correction methods exhibit improved quantitative results for Cu compared with those acquired before background correction (the linear correlation coefficient value before background correction is 0.9776; the linear correlation coefficient values after background correction using spline interpolation, polynomial fitting, Lorentz
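The spirit of a spline-based baseline estimate can be sketched as follows: anchor a smooth spline at points where the spectrum is assumed to touch the continuum (here, local minima plus the endpoints) and subtract it. The synthetic spectrum, the use of local minima as anchor points, and the minimum-separation parameter are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np
from scipy.signal import argrelmin
from scipy.interpolate import CubicSpline

x = np.linspace(0.0, 100.0, 2000)
continuum = 50.0 * np.exp(-x / 40.0)                       # smooth, slowly varying background
lines = sum(a * np.exp(-((x - c) ** 2) / (2 * 0.3 ** 2))   # narrow emission lines
            for a, c in [(80.0, 20.0), (60.0, 45.0), (40.0, 70.0)])
y = continuum + lines

# anchor points: local minima (plus endpoints), where only the background remains
idx = np.r_[0, argrelmin(y, order=25)[0], len(x) - 1]
baseline = CubicSpline(x[idx], y[idx])(x)
corrected = y - baseline
```

On this toy spectrum the corrected line heights recover the true amplitudes to well within the line widths, while regions between lines are flattened to near zero.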

  15. CAPELLA: Software for stellar photometry in dense fields with an irregular background

    NASA Astrophysics Data System (ADS)

    Debray, B.; Llebaria, A.; Dubout-Crillon, R.; Petit, M.

    1994-01-01

    We describe CAPELLA, a photometric reduction package developed to automatically process images of very crowded stellar fields with an irregular background. Detection is performed by the use of a derivative filter (the Laplacian of a Gaussian); the positions and fluxes of the stars are measured using a profile-fitting technique. The Point Spread Function (PSF) is empirical. The traditional multiparametric non-linear fit is replaced by a set of individual linear fits. The determination of the background, the detection, the definition of the PSF and the basics of the methods are successively addressed in detail. The iterative procedure as well as some aspects of the sampling problem are also discussed. Precision tests and performance in uncrowded and crowded fields are given. CAPELLA has been used to process crowded stellar fields obtained with different detectors such as electronographic cameras, CCDs, and photographic films coupled to image intensifiers. It has been applied successfully in the extreme cases of close associations of the galaxy M33, of the composite Wolf-Rayet Brey 73 in the Large Magellanic Cloud (LMC) and of the central parts of globular clusters such as 47 Tuc and M15.

  16. New type IIB backgrounds and aspects of their field theory duals

    NASA Astrophysics Data System (ADS)

    Caceres, Elena; Macpherson, Niall T.; Núñez, Carlos

    2014-08-01

    In this paper we study aspects of geometries in Type IIA and Type IIB string theory and elaborate on their field theory dual pairs. The backgrounds are associated with reductions to Type IIA of solutions with G 2 holonomy in eleven dimensions. We classify these backgrounds according to their G-structure, perform a non-Abelian T-duality on them and find new Type IIB configurations presenting dynamical SU(2)-structure. We study some aspects of the associated field theories defined by these new backgrounds. Various technical details are clearly spelled out.

  17. Radiative improvement of the lattice nonrelativistic QCD action using the background field method and application to the hyperfine splitting of quarkonium states.

    PubMed

    Hammant, T C; Hart, A G; von Hippel, G M; Horgan, R R; Monahan, C J

    2011-09-09

    We present the first application of the background field method to nonrelativistic QCD (NRQCD) on the lattice in order to determine the one-loop radiative corrections to the coefficients of the NRQCD action in a manifestly gauge-covariant manner. The coefficients of the σ·B term in the NRQCD action and the four-fermion spin-spin interaction are computed at the one-loop level; the resulting shift of the hyperfine splitting of bottomonium is found to bring the lattice predictions in line with experiment.

  18. Unsupervised background-constrained tank segmentation of infrared images in complex background based on the Otsu method.

    PubMed

    Zhou, Yulong; Gao, Min; Fang, Dan; Zhang, Baoquan

    2016-01-01

    In an effort to implement fast and effective tank segmentation from infrared images in complex background, the threshold of the maximum between-class variance method (i.e., the Otsu method) is analyzed and the working mechanism of the Otsu method is discussed. Subsequently, a fast and effective method for tank segmentation from infrared images in complex background is proposed, based on the Otsu method, by constraining the complex background of the image. Considering the complexity of the background, the original image is first divided into three classes of target region, middle background and lower background by maximizing the sum of their between-class variances. Then, the unsupervised background constraint is implemented based on the within-class variance of the target region, and hence the original image can be simplified. Finally, the Otsu method is applied to the simplified image for threshold selection. Experimental results on a variety of tank infrared images (880 × 480 pixels) in complex background demonstrate that the proposed method achieves better segmentation performance and is even comparable with manual segmentation. In addition, its average running time is only 9.22 ms, indicating that the new method performs well in real-time processing.

  19. The Uncertainty of Local Background Magnetic Field Orientation in Anisotropic Plasma Turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerick, F.; Saur, J.; Papen, M. von, E-mail: felix.gerick@uni-koeln.de

    In order to resolve and characterize anisotropy in turbulent plasma flows, a proper estimation of the background magnetic field is crucially important. Various approaches to calculating the background magnetic field, ranging from local to globally averaged fields, are commonly used in the analysis of turbulent data. We investigate how the uncertainty in the orientation of a scale-dependent background magnetic field influences the ability to resolve anisotropy. Therefore, we introduce a quantitative measure, the angle uncertainty, that characterizes the uncertainty of the orientation of the background magnetic field that turbulent structures are exposed to. The angle uncertainty can be used as a condition to estimate the ability to resolve anisotropy with certain accuracy. We apply our description to resolve the spectral anisotropy in fast solar wind data. We show that, if the angle uncertainty grows too large, the power of the turbulent fluctuations is attributed to false local magnetic field angles, which may lead to an incorrect estimation of the spectral indices. In our results, an apparent robustness of the spectral anisotropy to false local magnetic field angles is observed, which can be explained by a stronger increase of power for lower frequencies when the scale of the local magnetic field is increased. The frequency-dependent angle uncertainty is a measure that can be applied to any turbulent system.
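A loose sketch of the ingredient the paper formalizes: the orientation of a scale-dependent background field, obtained here by a boxcar moving average, wanders less as the averaging scale grows. The synthetic time series, the window sizes, and the spread-of-angles proxy are illustrative assumptions; the paper's angle uncertainty is a more careful quantitative measure.

```python
import numpy as np

def local_background_field(B, window):
    """Scale-dependent background field: boxcar average over `window` samples."""
    kern = np.ones(window) / window
    return np.stack([np.convolve(B[:, c], kern, mode="same") for c in range(3)], axis=1)

def angle_spread(B, window):
    """Std (deg) of the angle between the local and the global mean field."""
    B0 = local_background_field(B, window)
    g = B.mean(axis=0)
    cosang = (B0 @ g) / (np.linalg.norm(B0, axis=1) * np.linalg.norm(g))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))).std()

rng = np.random.default_rng(2)
# mean field along z plus isotropic fluctuations (synthetic stand-in for solar wind data)
B = np.array([0.0, 0.0, 5.0]) + rng.normal(0.0, 1.0, size=(4096, 3))
```

With a longer averaging window, the local field direction hews closer to the global mean, i.e. the orientation uncertainty shrinks as the scale of the background field increases.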

  20. Interaction of moving branes with background massless and tachyon fields in superstring theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rezaei, Z., E-mail: z.rezaei@aut.ac.ir; Kamani, D., E-mail: kamani@aut.ac.ir

    2012-02-15

    Using the boundary state formalism, we study a moving Dp-brane in a partially compact space-time in the presence of background fields: the Kalb-Ramond field B_{μν}, a U(1) gauge field A_α, and the tachyon field. The boundary state enables us to obtain the interaction amplitude of two branes with the above background fields. The branes are parallel or perpendicular to each other. Because of the presence of background fields, compactification of some space-time directions, motion of the branes, and the arbitrariness of the dimensions of the branes, the system is rather general. Due to the tachyon fields and velocities of the branes, the behavior of the interaction amplitude reveals obvious differences from the conventional behavior.

  21. X-ray radiative transfer in protoplanetary disks. The role of dust and X-ray background fields

    NASA Astrophysics Data System (ADS)

    Rab, Ch.; Güdel, M.; Woitke, P.; Kamp, I.; Thi, W.-F.; Min, M.; Aresu, G.; Meijerink, R.

    2018-01-01

    Context. The X-ray luminosities of T Tauri stars are about two to four orders of magnitude higher than the luminosity of the contemporary Sun. As these stars are born in clusters, their disks are not only irradiated by their parent star but also by an X-ray background field produced by the cluster members. Aims: We aim to quantify the impact of X-ray background fields produced by young embedded clusters on the chemical structure of disks. Further, we want to investigate the importance of the dust for X-ray radiative transfer in disks. Methods: We present a new X-ray radiative transfer module for the radiation thermo-chemical disk code PRODIMO (PROtoplanetary DIsk MOdel), which includes X-ray scattering and absorption by both the gas and dust component. The X-ray dust opacities can be calculated for various dust compositions and dust-size distributions. For the X-ray radiative transfer we consider irradiation by the star and by X-ray background fields. To study the impact of X-rays on the chemical structure of disks we use the well established disk ionization tracers N2H+ and HCO+. Results: For evolved dust populations (e.g. grain growth), X-ray opacities are mostly dominated by the gas; only for photon energies E ≳ 5-10 keV do dust opacities become relevant. Consequently the local disk X-ray radiation field is only affected in dense regions close to the disk midplane. X-ray background fields can dominate the local X-ray disk ionization rate for disk radii r ≳ 20 au. However, the N2H+ and HCO+ column densities are only significantly affected in cases of low cosmic-ray ionization rates (≲10⁻¹⁹ s⁻¹), or if the background flux is at least a factor of ten higher than the flux level of ≈10⁻⁵ erg cm⁻² s⁻¹ expected for clusters typical for the solar vicinity. Conclusions: Observable signatures of X-ray background fields in low-mass star-formation regions, like Taurus, are only expected for cluster members experiencing a strong X-ray background field (e.g. due to

  22. Dynamics of Plasma Jets and Bubbles Launched into a Transverse Background Magnetic Field

    NASA Astrophysics Data System (ADS)

    Zhang, Yue

    2017-10-01

    A coaxial magnetized plasma gun has been utilized to launch both plasma jets (open B-field) and plasma bubbles (closed B-field) into a transverse background magnetic field in the HelCat (Helicon-Cathode) linear device at the University of New Mexico. These situations may have bearing on fusion plasmas (e.g. plasma injection for tokamak fueling, ELM pacing, or disruption mitigation) and astrophysical settings (e.g. astrophysical jet stability, coronal mass ejections, etc.). The magnetic Reynolds number of the gun plasma is 100, so that magnetic advection dominates over magnetic diffusion. The gun plasma ram pressure exceeds the background magnetic pressure, ρ_jet V_jet² > B₀²/2μ₀, so that the jet or bubble can easily penetrate the background B-field, B₀. When the gun axial B-field is weak compared to the gun azimuthal field, a current-driven jet is formed with a global helical magnetic configuration. Applying the transverse background magnetic field, it is observed that the n = 1 kink mode is stabilized, while magnetic probe measurements show contrarily that the safety factor q(a) drops below unity. At the same time, a sheared axial jet velocity is measured. We conclude that the tension force arising from increasing curvature of the background magnetic field induces the measured sheared flow gradient above the theoretical kink-stabilization threshold, resulting in the emergent kink stabilization of the injected plasma jet. In the case of injected bubbles, spheromak-like plasma formation is verified. However, when the spheromak plasma propagates into the transverse background magnetic field, the typical self-closed globally symmetric magnetic configuration no longer holds. In the region where the bubble toroidal field opposed the background B-field, the magneto-Rayleigh-Taylor (MRT) instability has been observed. Details of the experimental setup, diagnostics, experimental results and theoretical analysis will be presented. Supported by the National Science Foundation

  3. Prequantum classical statistical field theory: background field as a source of everything?

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2011-07-01

    Prequantum classical statistical field theory (PCSFT) is a new attempt to consider quantum mechanics (QM) as an emergent phenomenon, cf. De Broglie's "double solution" approach, Bohmian mechanics, stochastic electrodynamics (SED), Nelson's stochastic QM and its generalization by Davidson, 't Hooft's models and their development by Elze. PCSFT is a return to a purely wave viewpoint on QM, cf. early Schrödinger. There are no quantum particles at all, only waves. In particular, photons are simply wave-pulses of the classical electromagnetic field, cf. SED. Moreover, even massive particles are special "prequantum fields": the electron field, the neutron field, and so on. PCSFT claims that (sooner or later) people will be able to measure components of these fields: components of the "photonic field" (the classical electromagnetic field of low intensity), the electronic field, the neutronic field, and so on. At the moment we are able to produce quantum correlations as correlations of classical Gaussian random fields. In this paper we are interested in the mathematical and physical reasons for the use of Gaussian fields. We consider prequantum signals (corresponding to quantum systems) as composed of a huge number of wave-pulses (on a very fine prequantum time scale). We speculate that the prequantum background field (the field of "vacuum fluctuations") might play the role of a source of such pulses, i.e., the source of everything.

  4. Background oriented schlieren measurement of the refractive index field of air induced by a hot, cylindrical measurement object.

    PubMed

    Beermann, Rüdiger; Quentin, Lorenz; Pösch, Andreas; Reithmeier, Eduard; Kästner, Markus

    2017-05-10

    To optically capture the topography of a hot measurement object with high precision, the light deflection by the inhomogeneous refractive index field (induced by the heat transfer from the measurement object to the ambient medium) has to be considered. We used the 2D background oriented schlieren method with an illuminated wavelet background, an optical flow algorithm, and Ciddor's equation to quantify the refractive index field located directly above a red-glowing, hot measurement object. A heat transfer simulation was implemented to verify the magnitude and shape of the measured refractive index field. Provided that no forced external flow disturbs the shape of the convective flow originating from the hot object, a laminar flow can be observed directly above the object, resulting in a sharply bounded, inhomogeneous refractive index field.
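    The link between temperature and refractive index can be sketched with far simpler physics than the Ciddor equation the paper uses. A hedged stand-in, using the Gladstone-Dale relation n - 1 = K·ρ with the ideal gas law (the constant, pressure, and temperatures below are illustrative assumptions):

```python
# Simplified stand-in for Ciddor's equation: the Gladstone-Dale relation
# n - 1 = K * rho shows how a temperature field maps to a refractive-index
# field. K, pressure, and temperatures are illustrative assumptions.
K = 2.26e-4           # Gladstone-Dale constant of air, visible light [m^3/kg]
R_air = 287.05        # specific gas constant of dry air [J/(kg K)]
p = 101325.0          # ambient pressure [Pa]

def refractive_index(T_kelvin):
    rho = p / (R_air * T_kelvin)    # ideal-gas density at constant pressure
    return 1.0 + K * rho

n_ambient = refractive_index(293.15)   # room air, ~20 degC
n_hot = refractive_index(800.0)        # heated air above the glowing object
# Hot air is less dense, so n_hot lies closer to 1 than n_ambient does; this
# refractive-index gradient is what deflects the rays the schlieren setup images.
```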

  5. The inception of pulsed discharges in air: simulations in background fields above and below breakdown

    NASA Astrophysics Data System (ADS)

    Sun, Anbang; Teunissen, Jannis; Ebert, Ute

    2014-11-01

    We investigate discharge inception in air, in uniform background electric fields above and below the breakdown threshold. We perform 3D particle simulations that include a natural level of background ionization in the form of positive and O₂⁻ ions. In background fields below breakdown, we use a strongly ionized seed of electrons and positive ions to enhance the field locally. In the region of enhanced field, we observe the growth of positive streamers, as in previous simulations with 2D plasma fluid models. The inclusion of background ionization has little effect in this case. When the background field is above the breakdown threshold, the situation is very different. Electrons can then detach from O₂⁻ and start ionization avalanches in the whole volume. These avalanches together create one extended discharge, in contrast to the ‘double-headed’ streamers found in many fluid simulations.

  6. Conservation laws and stress-energy-momentum tensors for systems with background fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gratus, Jonathan, E-mail: j.gratus@lancaster.ac.uk; The Cockcroft Institute, Daresbury Laboratory, Warrington WA4 4AD; Obukhov, Yuri N., E-mail: yo@thp.uni-koeln.de

    2012-10-15

    This article attempts to delineate the roles played by non-dynamical background structures and Killing symmetries in the construction of stress-energy-momentum tensors generated from a diffeomorphism-invariant action density. An intrinsic, coordinate-independent approach puts into perspective a number of spurious arguments that have historically led to the main contenders, viz. the Belinfante-Rosenfeld stress-energy-momentum tensor derived from a Noether current and the Einstein-Hilbert stress-energy-momentum tensor derived in the context of Einstein's theory of general relativity. Emphasis is placed on the role played by non-dynamical background (phenomenological) structures that discriminate between properties of these tensors, particularly in the context of electrodynamics in media. These tensors are used to construct conservation laws in the presence of Killing Lie-symmetric background fields. Highlights: The role of background fields in diffeomorphism-invariant actions is demonstrated. Interrelations between different stress-energy-momentum tensors are emphasised. The Abraham and Minkowski electromagnetic tensors are discussed in this context. Conservation laws in the presence of non-dynamical background fields are formulated. The discussion is facilitated by the development of a new variational calculus.

  7. Scalar field vacuum expectation value induced by gravitational wave background

    NASA Astrophysics Data System (ADS)

    Jones, Preston; McDougall, Patrick; Ragsdale, Michael; Singleton, Douglas

    2018-06-01

    We show that a massless scalar field in a gravitational wave background can develop a non-zero vacuum expectation value. We draw comparisons to the generation of a non-zero vacuum expectation value for a scalar field in the Higgs mechanism and with the dynamical Casimir vacuum. We propose that this vacuum expectation value, generated by a gravitational wave, can be connected with particle production from gravitational waves and may have consequences for the early Universe where scalar fields are thought to play an important role.

  8. Experimental investigation of coaxial-gun-formed plasmas injected into a background transverse magnetic field or plasma

    NASA Astrophysics Data System (ADS)

    Zhang, Yue; Fisher, Dustin M.; Gilmore, Mark; Hsu, Scott C.; Lynn, Alan G.

    2018-05-01

    Injection of coaxial-gun-formed magnetized plasmas into a background transverse vacuum magnetic field or into a background magnetized plasma has been studied in the helicon-cathode (HelCat) linear plasma device at the University of New Mexico [M. Gilmore et al., J. Plasma Phys. 81, 345810104 (2015)]. A magnetized plasma jet launched into a background transverse magnetic field shows emergent kink stabilization of the jet due to the formation of a sheared flow in the jet above the kink-stabilization threshold 0.1 kV_A [Y. Zhang et al., Phys. Plasmas 24, 110702 (2017)]. Injection of a spheromak-like plasma into a transverse background magnetic field led to the observation of finger-like structures on the side where a stronger magnetic-field null forms between the spheromak and the background field. The finger-like structures are consistent with the magneto-Rayleigh-Taylor instability. Jets or spheromaks launched into a background, low-β magnetized plasma show similar respective behavior in both cases.

  9. Detection methods for stochastic gravitational-wave backgrounds: a unified treatment

    NASA Astrophysics Data System (ADS)

    Romano, Joseph D.; Cornish, Neil J.

    2017-04-01

    We review detection methods that are currently in use or have been proposed to search for a stochastic background of gravitational radiation. We consider both Bayesian and frequentist searches using ground-based and space-based laser interferometers, spacecraft Doppler tracking, and pulsar timing arrays; and we allow for anisotropy, non-Gaussianity, and non-standard polarization states. Our focus is on relevant data analysis issues, and not on the particular astrophysical or early Universe sources that might give rise to such backgrounds. We provide a unified treatment of these searches at the level of detector response functions, detection sensitivity curves, and, more generally, at the level of the likelihood function, since the choice of signal and noise models and prior probability distributions are actually what define the search. Pedagogical examples are given whenever possible to compare and contrast different approaches. We have tried to make the article as self-contained and comprehensive as possible, targeting graduate students and new researchers looking to enter this field.

  10. Background fluorescence estimation and vesicle segmentation in live cell imaging with conditional random fields.

    PubMed

    Pécot, Thierry; Bouthemy, Patrick; Boulanger, Jérôme; Chessel, Anatole; Bardin, Sabine; Salamero, Jean; Kervrann, Charles

    2015-02-01

    Image analysis applied to fluorescence live-cell microscopy has become a key tool in molecular biology, since it enables the characterization of biological processes in space and time at the subcellular level. In fluorescence microscopy imaging, the moving tagged structures of interest, such as vesicles, appear as bright spots over a static or nonstatic background. In this paper, we consider the problem of vesicle segmentation and time-varying background estimation at the cellular scale. The main idea is to formulate the joint segmentation-estimation problem in the general conditional random field framework. Segmentation of vesicles and background estimation are then performed alternately by energy minimization using a min-cut/max-flow algorithm. The proposed approach relies on a detection measure computed from intensity contrasts between neighboring blocks in fluorescence microscopy images. This approach permits analysis of either 2D + time or 3D + time data. We demonstrate the performance of the so-called C-CRAFT method through an experimental comparison with state-of-the-art methods in fluorescence video-microscopy. We also use this method to characterize the spatial and temporal distribution of Rab6 transport carriers at the cell periphery for two different specific adhesion geometries.
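    A minimal sketch of a block-contrast detection measure in the spirit of the one the abstract describes (this is an illustration, not the authors' C-CRAFT implementation; the block size and synthetic image are made up): a block whose mean intensity exceeds the mean of its eight neighboring blocks scores high, flagging a bright vesicle-like spot over the local background.

```python
import numpy as np

# Illustrative block-contrast measure: compare each block's mean intensity
# with the mean of its 8 neighboring blocks (not the authors' C-CRAFT code).
def block_contrast(image, block=8):
    h, w = image.shape
    h, w = h - h % block, w - w % block
    means = image[:h, :w].reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    padded = np.pad(means, 1, mode="edge")
    # Mean of the 8 neighboring block means, via shifted views of the padding.
    neigh = sum(padded[1 + dy:padded.shape[0] - 1 + dy,
                       1 + dx:padded.shape[1] - 1 + dx]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0)) / 8.0
    return means - neigh   # high value: bright spot over the local background

rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.1, (64, 64))        # noisy background
img[20:24, 40:44] += 5.0                    # a bright vesicle-like spot
contrast = block_contrast(img)
peak = np.unravel_index(np.argmax(contrast), contrast.shape)  # block holding the spot
```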

  11. Spectral characterization of natural backgrounds

    NASA Astrophysics Data System (ADS)

    Winkelmann, Max

    2017-10-01

    As the distribution and use of hyperspectral sensors are constantly increasing, the exploitation of spectral features poses a threat to camouflaged objects. To improve camouflage materials, the spectral behavior of backgrounds first has to be known so that the spectral reflectance of the camouflage materials can be adjusted and optimized. In an international effort, the NATO CSO working group SCI-295 "Development of Methods for Measurements and Evaluation of Natural Background EO Signatures" is developing a method for how this characterization of backgrounds should be done. It is obvious that the spectral characterization of a background will require considerable effort. To compare and exchange data internationally, the measurements will have to be done in a similar way. To test and further improve this method, an international field trial has been performed in Storkow, Germany. In the following, we present first impressions and lessons learned from this field campaign and describe the data that have been measured.

  12. Background field removal technique based on non-regularized variable kernels sophisticated harmonic artifact reduction for phase data for quantitative susceptibility mapping.

    PubMed

    Kan, Hirohito; Arai, Nobuyuki; Takizawa, Masahiro; Omori, Kazuyoshi; Kasai, Harumasa; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2018-06-11

    We developed a non-regularized, variable-kernel, sophisticated harmonic artifact reduction for phase data (NR-VSHARP) method to accurately estimate local tissue fields without regularization for quantitative susceptibility mapping (QSM). We then used a digital brain phantom to evaluate the accuracy of the NR-VSHARP method, and compared it with the VSHARP and iterative spherical mean value (iSMV) methods through in vivo human brain experiments. Our proposed NR-VSHARP method, which uses variable spherical mean value (SMV) kernels, minimizes L2 norms only within the volume of interest to reduce phase errors and preserve cortical information without regularization. In a numerical phantom study, relative local field and susceptibility map errors were determined using NR-VSHARP, VSHARP, and iSMV. Additionally, various background field elimination methods were used to image the human brain. In the numerical phantom study, the use of NR-VSHARP considerably reduced the relative local field and susceptibility map errors throughout the digital whole-brain phantom, compared with VSHARP and iSMV. In the in vivo experiment, the NR-VSHARP-estimated local field achieved minimal boundary losses and sufficient phase error suppression throughout the brain. Moreover, the susceptibility map generated using NR-VSHARP minimized the occurrence of streaking artifacts caused by insufficient background field removal. Our proposed NR-VSHARP method yields minimal boundary losses and highly precise phase data. Our results suggest that this technique may facilitate high-quality QSM. Copyright © 2017. Published by Elsevier Inc.
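    The spherical-mean-value idea underlying SHARP-type methods can be sketched compactly: a harmonic background field equals its mean over any sphere, so subtracting the spherical mean suppresses the background while non-harmonic local contributions survive. A toy illustration (not the paper's NR-VSHARP pipeline; the grid, kernel radius, and test field are assumptions):

```python
import numpy as np

# Toy SMV background removal: convolve with a normalized sphere kernel and
# subtract, which cancels harmonic (background) fields in the interior.
def smv_residual(field, radius):
    zz, yy, xx = np.indices(field.shape)
    c = [s // 2 for s in field.shape]
    sphere = (zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 <= radius**2
    kernel = sphere / sphere.sum()
    # FFT-based circular convolution; ifftshift moves the kernel's center
    # voxel to the origin so the mean is taken about each voxel.
    mean = np.real(np.fft.ifftn(np.fft.fftn(field) *
                                np.fft.fftn(np.fft.ifftshift(kernel))))
    return field - mean

n = 32
_, _, x = np.indices((n, n, n))
background = 0.01 * (x - n / 2)        # linear field is harmonic: SMV removes it
residual = smv_residual(background, radius=4)
# Away from the wrap-around boundary the background cancels almost exactly;
# real methods must instead handle the brain-mask boundary, which is where
# VSHARP-type variable kernels come in.
```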

  13. Cosmic microwave background polarization signals from tangled magnetic fields.

    PubMed

    Seshadri, T R; Subramanian, K

    2001-09-03

    Tangled, primordial cosmic magnetic fields create small rotational velocity perturbations on the last scattering surface of the cosmic microwave background radiation. For fields which redshift to a present value of B₀ = 3 × 10⁻⁹ G, these vector modes are shown to generate polarization anisotropies of order 0.1-4 μK on small angular scales (500

  14. Vacuum fluctuations of the supersymmetric field in curved background

    NASA Astrophysics Data System (ADS)

    Bilić, Neven; Domazet, Silvije; Guberina, Branko

    2012-01-01

    We study a supersymmetric model in curved background spacetime. We calculate the effective action and the vacuum expectation value of the energy momentum tensor using a covariant regularization procedure. A soft supersymmetry breaking induces a nonzero contribution to the vacuum energy density and pressure. Assuming the presence of a cosmic fluid in addition to the vacuum fluctuations of the supersymmetric field an effective equation of state is derived in a self-consistent approach at one loop order. The net effect of the vacuum fluctuations of the supersymmetric fields in the leading adiabatic order is a renormalization of the Newton and cosmological constants.

  15. Energy spectrum of tearing mode turbulence in sheared background field

    NASA Astrophysics Data System (ADS)

    Hu, Di; Bhattacharjee, Amitava; Huang, Yi-Min

    2018-06-01

    The energy spectrum of tearing mode turbulence in a sheared background magnetic field is studied in this work. We consider the scenario where the nonlinear interaction of overlapping large-scale modes excites a broad spectrum of small-scale modes, generating tearing mode turbulence. The spectrum of such turbulence is of interest since it is relevant to the small-scale back-reaction on the large-scale field. The turbulence we discuss here differs from traditional MHD turbulence mainly in two aspects. One is the existence of many linearly stable small-scale modes which cause an effective damping during the energy cascade. The other is the scale-independent anisotropy induced by the large-scale modes tilting the sheared background field, as opposed to the scale-dependent anisotropy frequently encountered in traditional critically balanced turbulence theories. Due to these two differences, the energy spectrum deviates from a simple power law and takes the form of a power law multiplied by an exponential falloff. Numerical simulations are carried out using visco-resistive MHD equations to verify our theoretical predictions, and a reasonable agreement is found between the numerical results and our model.

  16. Holographic non-Fermi liquid in a background magnetic field

    NASA Astrophysics Data System (ADS)

    Basu, Pallab; He, Jianyang; Mukherjee, Anindya; Shieh, Hsien-Hang

    2010-08-01

    We study the effects of a nonzero magnetic field on a class of 2+1 dimensional non-Fermi liquids, recently found by Hong Liu, John McGreevy, and David Vegh [arXiv:0903.2477] by considering properties of a fermionic probe in an extremal AdS4 black hole background. Introducing a similar fermionic probe in a dyonic AdS4 black hole geometry, we find that the effect of a magnetic field can be incorporated in a rescaling of the probe fermion's charge. From this simple fact, we observe interesting effects such as the gradual disappearance of the Fermi surface and quasiparticle peaks at large magnetic fields, and changes in other properties of the system. We also find Landau-level-like structures and oscillatory phenomena similar to the de Haas-van Alphen effect.

  17. Cosmic microwave background trispectrum and primordial magnetic field limits.

    PubMed

    Trivedi, Pranjal; Seshadri, T R; Subramanian, Kandaswamy

    2012-06-08

    Primordial magnetic fields will generate non-Gaussian signals in the cosmic microwave background (CMB), as the magnetic stresses and the temperature anisotropy they induce depend quadratically on the magnetic field. We compute a new measure of magnetic non-Gaussianity, the CMB trispectrum, on large angular scales, sourced via the Sachs-Wolfe effect. The trispectra induced by magnetic energy density and by magnetic scalar anisotropic stress are found to have typical magnitudes of approximately a few times 10⁻²⁹ and 10⁻¹⁹, respectively. Observational limits on CMB non-Gaussianity from WMAP data allow us to conservatively set upper limits of a nG, and plausibly sub-nG, on the present value of the primordial cosmic magnetic field. This represents the tightest limit so far on the strength of primordial magnetic fields, on Mpc scales, and is better than limits from the CMB bispectrum and from all modes in the CMB power spectrum. Thus, the CMB trispectrum is a new and more sensitive probe of primordial magnetic fields on large scales.

  18. ON THE ROLE OF THE BACKGROUND OVERLYING MAGNETIC FIELD IN SOLAR ERUPTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nindos, A.; Patsourakos, S.; Wiegelmann, T., E-mail: anindos@cc.uoi.gr

    2012-03-20

    The primary constraining force that inhibits global solar eruptions is provided by the overlying background magnetic field. Using magnetic field data from both the Helioseismic and Magnetic Imager aboard the Solar Dynamics Observatory and the spectropolarimeter of the Solar Optical Telescope aboard Hinode, we study the long-term evolution of the background field in active region AR11158, which produced three major coronal mass ejections (CMEs). The CME formation heights were determined using EUV data. We calculated the decay index −(z/B)(∂B/∂z) of the magnetic field B (i.e., how fast the field decreases with height z) related to each event, from the time of the active region emergence until well after the CMEs. At the heights of CME formation, the decay indices were 1.1-2.1. Prior to two of the events, there were extended periods (of more than 23 hr) during which the related decay indices at heights above the CME formation heights either decreased (by up to 15%) or exhibited small changes. The decay index related to the third event increased (by up to 118%) at heights above 20 Mm within an interval that started 64 hr prior to the CME. The magnetic free energy and the helicity accumulated in the corona contributed the most to the eruptions through their increase throughout the flux emergence phase (by factors of more than five and more than two orders of magnitude, respectively). Our results indicate that the initiation of eruptions does not depend critically on the temporal evolution of the variation of the background field with height.
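    The decay index is straightforward to evaluate numerically once a field profile B(z) is in hand. A hedged sketch for an assumed dipole-like profile (the field model and source depth are hypothetical, not the paper's HMI/Hinode extrapolation):

```python
import numpy as np

# Illustrative decay index n(z) = -(z/B) dB/dz for an assumed dipole-like
# background field B(z) = (z + d)^-3; model and depth d are hypothetical.
d = 20.0                              # model source depth [Mm]
z = np.linspace(1.0, 100.0, 500)      # height above the surface [Mm]
B = (z + d) ** -3.0                   # field strength (arbitrary units)

decay_index = -(z / B) * np.gradient(B, z)

# Analytically n(z) = 3z/(z + d): small low down and approaching 3 aloft.
# The torus-instability literature often quotes a critical value near n ~ 1.5.
```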

  19. Wide-field two-photon microscopy with temporal focusing and HiLo background rejection

    NASA Astrophysics Data System (ADS)

    Yew, Elijah Y. S.; Choi, Heejin; Kim, Daekeun; So, Peter T. C.

    2011-03-01

    Scanningless depth-resolved microscopy is achieved through spatial-temporal focusing and has been demonstrated previously. The advantage of this method is that a large area may be imaged without scanning, resulting in higher throughput of the imaging system. Because it is a wide-field technique, the optical sectioning effect is considerably poorer than with conventional spatial-focusing two-photon microscopy. Here we propose wide-field two-photon microscopy based on spatio-temporal focusing and employing background rejection based on the HiLo microscopy principle. We demonstrate the effects of applying HiLo microscopy to wide-field temporally focused two-photon microscopy.

  20. VNIR hyperspectral background characterization methods in adverse weather conditions

    NASA Astrophysics Data System (ADS)

    Romano, João M.; Rosario, Dalton; Roth, Luz

    2009-05-01

    Hyperspectral technology is currently being used by the military to detect regions of interest where potential targets may be located. Weather variability, however, may affect the ability of an algorithm to discriminate possible targets from background clutter. Nonetheless, different background characterization approaches may facilitate the ability of an algorithm to discriminate potential targets over a variety of weather conditions. In a previous paper, we introduced a new autonomous, target-size-invariant background characterization process, the Autonomous Background Characterization (ABC) method, also known as the Parallel Random Sampling (PRS) method. It features a random sampling stage; a parallel process to mitigate the inclusion by chance of target samples into clutter background classes during random sampling; and a fusion of results at the end. In this paper, we demonstrate how different background characterization approaches are able to improve the performance of algorithms over a variety of challenging weather conditions. Using the Mahalanobis distance as the standard algorithm for this study, we compare the performance of different characterization methods, such as the global information method, the two-stage global information method, and our proposed method, ABC, using data that were collected under a variety of adverse weather conditions. For this study, we used ARDEC's Hyperspectral VNIR Adverse Weather data collection, comprising heavy, light, and transitional fog, light and heavy rain, and low-light conditions.

  1. Emergent kink stability of a magnetized plasma jet injected into a transverse background magnetic field

    NASA Astrophysics Data System (ADS)

    Zhang, Yue; Gilmore, Mark; Hsu, Scott C.; Fisher, Dustin M.; Lynn, Alan G.

    2017-11-01

    We report experimental results on the injection of a magnetized plasma jet into a transverse background magnetic field in the HelCat linear plasma device at the University of New Mexico [M. Gilmore et al., J. Plasma Phys. 81(1), 345810104 (2015)]. After the plasma jet leaves the plasma-gun muzzle, a tension force arising from an increasing curvature of the background magnetic field induces in the jet a sheared axial-flow gradient above the theoretical kink-stabilization threshold. We observe that this emergent sheared axial flow stabilizes the n = 1 kink mode in the jet, whereas a kink instability is observed in the jet when there is no background magnetic field present.

  2. The Anisotropy of the Microwave Background to l = 3500: Deep Field Observations with the Cosmic Background Imager

    NASA Technical Reports Server (NTRS)

    Mason, B. S.; Pearson, T. J.; Readhead, A. C. S.; Shepherd, M. C.; Sievers, J.; Udomprasert, P. S.; Cartwright, J. K.; Farmer, A. J.; Padin, S.; Myers, S. T.

    2002-01-01

    We report measurements of anisotropy in the cosmic microwave background radiation over the multipole range l ≈ 200-3500 with the Cosmic Background Imager, based on deep observations of three fields. These results confirm the drop in power with increasing l first reported in earlier measurements with this instrument, and extend the observations of this decline in power out to l ≈ 2000. The decline in power is consistent with the predicted damping of primary anisotropies. At larger multipoles, l = 2000-3500, the power is 3.1σ greater than standard models for intrinsic microwave background anisotropy in this multipole range, and 3.5σ greater than zero. This excess power is not consistent with expected levels of residual radio source contamination but, for σ₈ ≳ 1, is consistent with predicted levels due to a secondary Sunyaev-Zeldovich anisotropy. Further observations are necessary to confirm the level of this excess and, if confirmed, determine its origin.

  3. Susceptibility of the QCD vacuum to CP-odd electromagnetic background fields.

    PubMed

    D'Elia, Massimo; Mariti, Marco; Negro, Francesco

    2013-02-22

    We investigate two-flavor quantum chromodynamics (QCD) in the presence of CP-odd electromagnetic background fields and determine, by means of lattice QCD simulations, the induced effective θ term to first order in E⃗ · B⃗. We employ a rooted staggered discretization and study lattice spacings down to 0.1 fm and Goldstone pion masses around 480 MeV. In order to deal with a positive measure, we consider purely imaginary electric fields and real magnetic fields, and then exploit analytic continuation. Our results are relevant to a description of the effective pseudoscalar quantum electrodynamics-QCD interactions.

  4. Laboratory Experiments on Propagating Plasma Bubbles into Vacuum, Vacuum Magnetic Field, and Background Plasmas

    NASA Astrophysics Data System (ADS)

    Lynn, Alan G.; Zhang, Yue; Gilmore, Mark; Hsu, Scott

    2014-10-01

    We discuss the dynamics of plasma "bubbles" as they propagate through a variety of background media. These bubbles are formed by a pulsed coaxial gun with an externally applied magnetic field. Bubble parameters are typically n_e ~ 10²⁰ m⁻³, T_e ~ 5-10 eV, and T_i ~ 10-15 eV. The structure of the bubbles can range from unmagnetized jet-like structures to spheromak-like structures with complex magnetic flux surfaces. Some of the background media the bubbles interact with are vacuum, vacuum with a magnetic field, and other magnetized plasmas. These bubbles exhibit different qualitative behavior depending on coaxial gun parameters such as gas species, gun current, and gun bias magnetic field. Their behavior also depends on the parameters of the background they propagate through. Multi-frame fast-camera imaging and magnetic probe data are used to characterize the bubble evolution under various conditions.

  5. Improvement of Accuracy for Background Noise Estimation Method Based on TPE-AE

    NASA Astrophysics Data System (ADS)

    Itai, Akitoshi; Yasukawa, Hiroshi

    This paper proposes a method of background noise estimation based on the tensor product expansion with a median and a Monte Carlo simulation. We have previously shown that a tensor product expansion with the absolute error method is effective for estimating background noise; however, the background noise might not be estimated properly by the conventional method. In this paper, it is shown that the estimation accuracy can be improved by using the proposed methods.
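    The robustness argument behind using a median can be illustrated with a generic running-median background estimator (a stand-in sketch only; the paper's TPE-AE tensor-product-expansion method is not reproduced here, and the signal model is made up): the median tracks a slowly varying background through sparse impulsive bursts because it ignores outliers.

```python
import numpy as np

# Generic running-median background estimation (illustrative stand-in, not
# the TPE-AE method): the median is robust to sparse impulsive outliers.
rng = np.random.default_rng(1)
t = np.arange(2000)
background = 0.5 * np.sin(2 * np.pi * t / 2000)      # slow background drift
signal = background + rng.normal(0.0, 0.05, t.size)  # additive sensor noise
signal[50::100] += 3.0                               # sparse impulsive bursts

win = 51
padded = np.pad(signal, win // 2, mode="edge")
estimate = np.array([np.median(padded[i:i + win]) for i in range(t.size)])

raw_err = np.max(np.abs(signal - background))      # dominated by the bursts
est_err = np.max(np.abs(estimate - background))    # much smaller: bursts rejected
```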

  6. NEW OBSERVATION OF FAILED FILAMENT ERUPTIONS: THE INFLUENCE OF ASYMMETRIC CORONAL BACKGROUND FIELDS ON SOLAR ERUPTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y.; Xu, Z.; Su, J.

    2009-05-01

    Failed filament eruptions not associated with a coronal mass ejection (CME) have been observed and reported as evidence for solar coronal field confinement of erupting flux ropes. In those events, each filament eventually returns to its origin on the solar surface. In this Letter, a new observation of two failed filament eruptions is reported which indicates that the mass of a confined filament can be ejected to places far from the original filament channel. The jet-like mass motions in the two failed filament eruptions are thought to be due to the asymmetry of the background coronal magnetic fields with respect to the locations of the filament channels. The asymmetry of the coronal fields is confirmed by an extrapolation based on a potential field model. The obvious imbalance between the positive and negative magnetic flux (with a ratio of 1:3) in the bipolar active region is thought to be the direct cause of the formation of the asymmetric coronal fields. We think that the asymmetry of the background fields can not only influence the trajectories of ejecta, but also provide a relatively stronger confinement for flux rope eruptions than symmetric background fields do.

  7. Amplification due to two-stream instability of self-electric and magnetic fields of an ion beam propagating in background plasma

    NASA Astrophysics Data System (ADS)

    Tokluoglu, Erinc K.; Kaganovich, Igor D.; Carlsson, Johan A.; Hara, Kentaro; Startsev, Edward A.

    2018-05-01

    Propagation of charged particle beams in background plasma as a method of space charge neutralization has been shown to achieve a high degree of charge and current neutralization and therefore enables nearly ballistic propagation and focusing of charged particle beams. Correspondingly, the use of plasmas for propagation of charged particle beams has important applications for transport and focusing of intense particle beams in inertial fusion and high energy density laboratory plasma physics. However, the streaming of beam ions through a background plasma can lead to the development of two-stream instability between the beam ions and the plasma electrons. The beam electric and magnetic fields enhanced by the two-stream instability can lead to defocusing of the ion beam. Using particle-in-cell simulations, we study the scaling of the instability-driven self-electromagnetic fields and consequent defocusing forces with the background plasma density and beam ion mass. We identify plasma parameters where the defocusing forces can be reduced.

  8. Latent variable method for automatic adaptation to background states in motor imagery BCI

    NASA Astrophysics Data System (ADS)

    Dagaev, Nikolay; Volkova, Ksenia; Ossadtchi, Alexei

    2018-02-01

    Objective. Brain-computer interface (BCI) systems are known to be vulnerable to variability in the background states of a user. Usually, no detailed information on these states is available even during the training stage. Thus there is a need for a method capable of taking background states into account in an unsupervised way. Approach. We propose a latent variable method that is based on a probabilistic model with a discrete latent variable. In order to estimate the model's parameters, we suggest using the expectation-maximization algorithm. The proposed method is aimed at assessing characteristics of background states without any corresponding data labeling. In the context of the asynchronous motor imagery paradigm, we applied this method to real data from twelve able-bodied subjects, with open/closed eyes serving as background states. Main results. We found that the latent variable method improved classification of target states compared to the baseline method (in seven of twelve subjects). In addition, we found that our method was also capable of background-state recognition (in six of twelve subjects). Significance. Without any supervised information on background states, the latent variable method provides a way to improve classification in BCI by taking background states into account at the training stage and then by making decisions on target states weighted by the posterior probabilities of background states at the prediction stage.
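    The core machinery (a discrete latent variable fitted by expectation-maximization) can be shown on a toy problem. This is a generic two-state Gaussian mixture, not the authors' BCI pipeline; all numbers are made up:

```python
import numpy as np

# Toy EM with a discrete latent variable: two hidden "background states"
# generating a 1D feature (generic illustration, not the authors' model).
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2.0, 1.0, 500),    # state 0 samples
                    rng.normal(2.0, 1.0, 500)])    # state 1 samples

mu = np.array([-0.5, 0.5])            # crude initial state means
pi = np.array([0.5, 0.5])             # mixing weights
for _ in range(50):
    # E-step: posterior responsibility of each latent state for each sample
    lik = pi * np.exp(-(x[:, None] - mu) ** 2 / 2.0)
    resp = lik / lik.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixing weights and state means (unit variance fixed)
    pi = resp.mean(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)

mu = np.sort(mu)    # recovered state means, close to the true -2 and +2
```

    At prediction time, the fitted posteriors `resp` play the role of the background-state weights described in the abstract.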

  9. Far-field detection of sub-wavelength Tetris without extra near-field metal parts based on phase prints of time-reversed fields with intensive background interference.

    PubMed

    Chen, Yingming; Wang, Bing-Zhong

    2014-07-14

    Time-reversal (TR) phase prints are used for the first time in far-field (FF) detection of sub-wavelength (SW) deformable scatterers, without any extra metal structure positioned in the vicinity of the target. The 2D prints derive from a discrete short-time Fourier transform of 1D TR electromagnetic (EM) signals. Because the time-invariant, intensive background interference is effectively concentrated by the TR technique, the weak time-variant signature of the FF SW scatterers can be highlighted. This method represents a different use of the TR technique, in which the focus peak of the TR EM waves is deliberately removed and the most useful information is conveyed by the remaining part.
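
    The 2-D phase prints described above can be sketched as the phases of a discrete short-time Fourier transform of a 1-D signal; the window and hop lengths here are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def phase_print(signal, win=64, hop=16):
    """2-D phase print of a 1-D signal: rows are window positions,
    columns frequency bins, values the spectral phase (the print uses
    phase, not magnitude)."""
    frames = [signal[i:i + win] for i in range(0, len(signal) - win + 1, hop)]
    spec = np.fft.rfft(np.array(frames) * np.hanning(win), axis=1)
    return np.angle(spec)

t = np.linspace(0, 1, 1024, endpoint=False)
sig = np.cos(2 * np.pi * 50 * t + 0.3)   # stand-in for a 1-D TR EM signal
P = phase_print(sig)
print(P.shape)
```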

  10. Measuring Extinction in Local Group Galaxies Using Background Galaxies

    NASA Astrophysics Data System (ADS)

    Wyder, T. K.; Hodge, P. W.

    1999-05-01

    Knowledge of the distribution and quantity of dust in galaxies is important for understanding their structure and evolution. The goal of our research is to measure the total extinction through Local Group galaxies using measured properties of background galaxies. Our method relies on the SExtractor software as an objective and automated method of detecting background galaxies. In an initial test, we have explored two WFPC2 fields in the SMC and two in M31 obtained from the HST archives. The two pointings in the SMC are fields around the open clusters L31 and B83 while the two M31 fields target the globular clusters G1 and G170. Except for the G1 observations of M31, the fields chosen are very crowded (even when observed with HST) and we chose them as a particularly stringent test of the method. We performed several experiments using a series of completeness tests that involved superimposing comparison fields, adjusted to the equivalent exposure time, from the HST Medium-Deep and Groth-Westphal surveys. These tests showed that for crowded fields, such as the two in the core of the SMC and the one in the bulge of M31, this automated method of detecting galaxies can be completely dominated by the effects of crowding. For these fields, only a small fraction of the added galaxies was recovered. However, in the outlying G1 field in M31, almost all of the added galaxies were recovered. The numbers of actual background galaxies in this field are consistent with zero extinction. As a follow-up experiment, we used image processing techniques to suppress stellar objects while enhancing objects with non-stellar, more gradual luminosity profiles. This method yielded significant numbers of background galaxies in even the most crowded fields, which we are now analyzing to determine the total extinction and reddening caused by the foreground galaxy.

  11. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    PubMed

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

    An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with kernel sizes varying from 2 to 14 mm. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and the relationships between the true local field and the estimated local field in REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). A human experiment was also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP, respectively. The REV-SHARP result exhibited the highest correlation between the true local field and the estimated local field. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP in regions of interest on the three-dimensional head phantom. In human experiments, no obvious errors due to artifacts were present with REV-SHARP. The proposed REV-SHARP is a new method combining variable spherical kernel sizes with Tikhonov regularization. This technique may enable more accurate background field removal and thereby improve the accuracy of QSM. Copyright © 2016 Elsevier Inc. All rights reserved.
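
    A toy 1-D analogue of the SMV-plus-Tikhonov mechanism (not the REV-SHARP implementation itself): a normalized box kernel plays the role of the spherical mean value operator, the (delta - S) filter suppresses a smooth "background", and Tikhonov regularization stabilizes the deconvolution. The very smoothest components of the local field are not recoverable, a known limitation of SMV-based methods:

```python
import numpy as np

n = 256
x = np.arange(n)
background = np.cos(2 * np.pi * x / n)           # smooth, low-frequency
local = np.exp(-0.5 * ((x - 128) / 4.0) ** 2)    # sharp local field
phi = background + local

# 1-D analogue of the SMV kernel: normalised box of radius r, centred at 0
r = 12
kernel = np.zeros(n)
kernel[:2 * r + 1] = 1.0 / (2 * r + 1)
kernel = np.roll(kernel, -r)

K = np.fft.fft(kernel)
D = 1.0 - K                                      # (delta - S) in Fourier space

# forward SMV filtering suppresses the smooth background almost entirely
filtered = np.fft.ifft(D * np.fft.fft(phi)).real

# Tikhonov-regularised deconvolution restores the local field amplitude
lam = 1e-2                                       # illustrative choice
local_est = np.fft.ifft(np.conj(D) * np.fft.fft(filtered) /
                        (np.abs(D) ** 2 + lam)).real
err = np.abs(local_est - local).max()
print(round(err, 3))
```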

  12. Unified field theories, the early big bang, and the microwave background paradox

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1979-01-01

    It is suggested that a superunified field theory incorporating gravity and possessing asymptotic freedom could provide a solution to the paradox of the isotropy of the universal 3K background radiation. Thermal equilibrium could be established in this context through interactions occurring in a temporally indefinite preplanckian era.

  13. Fractional-wrapped branes with rotation, linear motion and background fields

    NASA Astrophysics Data System (ADS)

    Maghsoodi, Elham; Kamani, Davoud

    2017-09-01

    We obtain two boundary states corresponding to the two folds of a fractional-wrapped Dp-brane, i.e. the twisted version under the orbifold C2 /Z2 and the untwisted version. The brane has rotation and linear motion, in the presence of the following background fields: the Kalb-Ramond tensor, a U (1) internal gauge potential and a tachyon field. The rotation and linear motion are inside the volume of the brane. The brane lives in the d-dimensional spacetime, with the orbifold-toroidal structure Tn ×R 1 , d - n - 5 ×C2 /Z2 in the twisted sector. Using these boundary states we calculate the interaction amplitude of two parallel fractional Dp-branes with the foregoing setup. Various properties of this amplitude such as the long-range behavior will be analyzed.

  14. A method of reducing background fluctuation in tunable diode laser absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Yang, Rendi; Dong, Xiaozhou; Bi, Yunfeng; Lv, Tieliang

    2018-03-01

    Optical interference fringes are the main factor leading to background fluctuation in gas concentration detection based on tunable diode laser absorption spectroscopy. The interference fringes are generated by multiple reflections or scatterings from optical surfaces in the optical path and make the background signal present an approximately sinusoidal oscillation. To reduce the fluctuation of the background, a method that combines dual tone modulation (DTM) with a vibration reflector (VR) is proposed in this paper. The combination of DTM and VR causes the unwanted periodic interference fringes to be averaged out, and the effectiveness of the method in reducing background fluctuation has been verified by simulation and real experiments. In the detection system based on the proposed method, the standard deviation (STD) of the background signal is decreased to 0.0924 parts per million (ppm), a reduction by a factor of 16 compared with that of wavelength modulation spectroscopy. The STD value of 0.0924 ppm corresponds to an absorption of 4.328 x 10^-6 Hz^-1/2 (with an effective optical path length of 4 m and an integration time of 0.1 s). Moreover, the proposed method presents more stable performance in reducing background fluctuation in long-duration experiments.

  15. Noise covariance incorporated MEG-MUSIC algorithm: a method for multiple-dipole estimation tolerant of the influence of background brain activity.

    PubMed

    Sekihara, K; Poeppel, D; Marantz, A; Koizumi, H; Miyashita, Y

    1997-09-01

    This paper proposes a method of localizing multiple current dipoles from spatio-temporal biomagnetic data. The method is based on the multiple signal classification (MUSIC) algorithm and is tolerant of the influence of background brain activity. In this method, the noise covariance matrix is estimated using a portion of the data that contains noise, but does not contain any signal information. Then, a modified noise subspace projector is formed using the generalized eigenvectors of the noise and measured-data covariance matrices. The MUSIC localizer is calculated using this noise subspace projector and the noise covariance matrix. The results from a computer simulation have verified the effectiveness of the method. The method was then applied to source estimation for auditory-evoked fields elicited by syllable speech sounds. The results strongly suggest the method's effectiveness in removing the influence of background activity.
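
    A compact sketch of the noise-covariance-incorporated MUSIC idea, under an invented toy forward model (an 8-sensor line array with a hypothetical lead field, not an MEG model): the measured-data covariance is prewhitened by a noise covariance estimated from a signal-free segment, and the localizer peaks where the lead field falls in the signal subspace:

```python
import numpy as np

rng = np.random.default_rng(0)
m, t = 8, 2000                      # sensors, time samples
grid = np.linspace(-1, 1, 181)      # candidate source positions (arbitrary units)

def lead_field(p):
    """Toy lead-field vector of a source at position p (hypothetical
    forward model for illustration only)."""
    sensors = np.linspace(-1, 1, m)
    g = 1.0 / (0.1 + (sensors - p) ** 2)
    return g / np.linalg.norm(g)

true_p = 0.3
s = np.sin(2 * np.pi * 0.01 * np.arange(t))           # source time course
A = rng.standard_normal((m, m)) * 0.1                 # mixes "background activity"
noise_only = A @ rng.standard_normal((m, t))          # signal-free portion
data = np.outer(lead_field(true_p), s) + A @ rng.standard_normal((m, t))

Rn = noise_only @ noise_only.T / t                    # noise covariance
Rd = data @ data.T / t                                # measured-data covariance

# prewhiten by the noise covariance, then form the noise-subspace projector
W = np.linalg.inv(np.linalg.cholesky(Rn))
evals, evecs = np.linalg.eigh(W @ Rd @ W.T)
En = evecs[:, :-1]                                    # all but the largest (1 source)
P = En @ En.T

def localizer(p):
    gw = W @ lead_field(p)
    return (gw @ gw) / (gw @ P @ gw)                  # peaks at source locations

est = grid[np.argmax([localizer(p) for p in grid])]
print(round(est, 2))
```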

  16. PE Metrics: Background, Testing Theory, and Methods

    ERIC Educational Resources Information Center

    Zhu, Weimo; Rink, Judy; Placek, Judith H.; Graber, Kim C.; Fox, Connie; Fisette, Jennifer L.; Dyson, Ben; Park, Youngsik; Avery, Marybell; Franck, Marian; Raynes, De

    2011-01-01

    New testing theories, concepts, and psychometric methods (e.g., item response theory, test equating, and item bank) developed during the past several decades have many advantages over previous theories and methods. In spite of their introduction to the field, they have not been fully accepted by physical educators. Further, the manner in which…

  17. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

    Detection of small visible optical space targets is one of the key issues in research on long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of the target is very low because of noise inherent to the imaging device. Random noise and background movement also increase the difficulty of target detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. Firstly, we construct a reasonable statistical model of the visible optical space image. Secondly, we extract SIFT (scale-invariant feature transform) features of the image frames and calculate the transform relationship between them, then use this relationship to compensate for the movement of the whole visual field. Thirdly, the influence of stars is removed using an interframe difference method, and a segmentation threshold differentiating candidate targets from noise is found using the OTSU method. Finally, we calculate a statistical quantity to judge, for every pixel position in the image, whether the target is present. Theoretical analysis shows the relationship between false alarm probability and detection probability at different SNRs. The experimental results show that this method can detect targets efficiently, even targets passing in front of stars.
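
    The OTSU thresholding step can be sketched as follows (a generic between-class-variance implementation on synthetic data, not the authors' code):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the grey level that maximises between-class
    variance, separating candidate targets from background noise."""
    hist, edges = np.histogram(img, bins=256)
    p = hist.astype(float) / hist.sum()
    levels = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                         # class-0 probability
    mu = np.cumsum(p * levels)                # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    sigma_b[~np.isfinite(sigma_b)] = 0        # empty classes carry no variance
    return levels[np.argmax(sigma_b)]

rng = np.random.default_rng(0)
# synthetic bimodal intensities: dim background pixels plus bright target pixels
img = np.concatenate([rng.normal(20, 5, 9000), rng.normal(200, 5, 1000)])
thr = otsu_threshold(img)
print(thr)
```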

  18. Plane-parallel waves as duals of the flat background III: T-duality with torsionless B-field

    NASA Astrophysics Data System (ADS)

    Hlavatý, Ladislav; Petr, Ivo; Petrásek, Filip

    2018-04-01

    By addition of non-zero, but torsionless B-field, we expand the classification of (non-)Abelian T-duals of the flat background in four dimensions with respect to 1, 2, 3 and 4D subgroups of the Poincaré group. We discuss the influence of the additional B-field on the process of dualization, and identify essential parts of the torsionless B-field that cannot in general be eliminated by coordinate or gauge transformation of the dual background. These effects are demonstrated using particular examples. Due to their physical importance, we focus on duals whose metrics represent plane-parallel (pp-)waves. Besides the previously found metrics, we find new pp-waves depending on parameters originating from the torsionless B-field. These pp-waves are brought into their standard forms in Brinkmann and Rosen coordinates.

  19. Cosmic microwave background bispectrum from primordial magnetic fields on large angular scales.

    PubMed

    Seshadri, T R; Subramanian, Kandaswamy

    2009-08-21

    Primordial magnetic fields lead to non-Gaussian signals in the cosmic microwave background (CMB) even at the lowest order, as magnetic stresses and the temperature anisotropy they induce depend quadratically on the magnetic field. In contrast, CMB non-Gaussianity due to inflationary scalar perturbations arises only as a higher-order effect. We propose a novel probe of stochastic primordial magnetic fields that exploits the characteristic CMB non-Gaussianity that they induce. We compute the CMB bispectrum b_{l1 l2 l3} induced by such fields on large angular scales. We find a typical value of l1(l1 + 1) l3(l3 + 1) b_{l1 l2 l3} ~ 10^-22 for magnetic fields of strength B_0 ~ 3 nG with a nearly scale-invariant magnetic spectrum. Observational limits on the bispectrum allow us to set an upper limit on B_0 of approximately 35 nG.

  20. Probing the Intergalactic Magnetic Field with the Anisotropy of the Extragalactic Gamma-ray Background

    NASA Technical Reports Server (NTRS)

    Venters, T. M.; Pavlidou, V.

    2013-01-01

    The intergalactic magnetic field (IGMF) may leave an imprint on the angular anisotropy of the extragalactic gamma-ray background through its effect on electromagnetic cascades triggered by interactions between very high energy photons and the extragalactic background light. A strong IGMF will deflect secondary particles produced in these cascades and will thus tend to isotropize lower energy cascade photons, thereby inducing a modulation in the anisotropy energy spectrum of the gamma-ray background. Here we present a simple, proof-of-concept calculation of the magnitude of this effect and demonstrate that current Fermi data already seem to prefer nonnegligible IGMF values. The anisotropy energy spectrum of the Fermi gamma-ray background could thus be used as a probe of the IGMF strength.

  1. Probing the Intergalactic Magnetic Field with the Anisotropy of the Extragalactic Gamma-Ray Background

    NASA Technical Reports Server (NTRS)

    Venters, T. M.; Pavlidou, V.

    2012-01-01

    The intergalactic magnetic field (IGMF) may leave an imprint on the anisotropy properties of the extragalactic gamma-ray background, through its effect on electromagnetic cascades triggered by interactions between very high energy photons and the extragalactic background light. A strong IGMF will deflect secondary particles produced in these cascades and will thus tend to isotropize lower energy cascade photons, thus inducing a modulation in the anisotropy energy spectrum of the gamma-ray background. Here we present a simple, proof-of-concept calculation of the magnitude of this effect and demonstrate that the two extreme cases (zero IGMF and IGMF strong enough to completely isotropize cascade photons) would be separable by ten years of Fermi observations and reasonable model parameters for the gamma-ray background. The anisotropy energy spectrum of the Fermi gamma-ray background could thus be used as a probe of the IGMF strength.

  2. Background feature descriptor for offline handwritten numeral recognition

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Wang, Hao; Tian, Tian; Jie, Feiran; Lei, Bo

    2011-11-01

    This paper puts forward an offline handwritten numeral recognition method based on a background structural descriptor (a sixteen-value numerical background expression). By encoding the background pixels of the image according to a certain rule, 16 different eigenvalues are generated that reflect the background configuration of each digit and hence the digit's structural features. Through a pattern-language description of images by these features, automatic segmentation of overlapping digits and numeral recognition can be realized. The method is characterized by strong resistance to deformation, high recognition speed, and easy implementation. Finally, the experimental results and conclusions are presented. The results of recognizing datasets from various practical application fields show that this method achieves a good recognition effect.

  3. Linear spin-2 fields in most general backgrounds

    NASA Astrophysics Data System (ADS)

    Bernard, Laura; Deffayet, Cédric; Schmidt-May, Angnis; von Strauss, Mikael

    2016-04-01

    We derive the full perturbative equations of motion for the most general background solutions in ghost-free bimetric theory in its metric formulation. Clever field redefinitions at the level of fluctuations enable us to circumvent the problem of varying a square-root matrix appearing in the theory. This greatly simplifies the expressions for the linear variation of the bimetric interaction terms. We show that these field redefinitions exist and are uniquely invertible if and only if the variation of the square-root matrix itself has a unique solution, which is a requirement for the linearized theory to be well defined. As an application of our results we examine the constraint structure of ghost-free bimetric theory at the level of linear equations of motion for the first time. We identify a scalar combination of equations which is responsible for the absence of the Boulware-Deser ghost mode in the theory. The bimetric scalar constraint is in general not manifestly covariant in its nature. However, in the massive gravity limit the constraint assumes a covariant form when one of the interaction parameters is set to zero. For that case our analysis provides an alternative and almost trivial proof of the absence of the Boulware-Deser ghost. Our findings generalize previous results in the metric formulation of massive gravity and also agree with studies of its vielbein version.

  4. Structured background grids for generation of unstructured grids by advancing front method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar

    1991-01-01

    A new method of background grid construction is introduced for generation of unstructured tetrahedral grids using the advancing-front technique. Unlike the conventional triangular/tetrahedral background grids which are difficult to construct and usually inadequate in performance, the new method exploits the simplicity of uniform Cartesian meshes and provides grids of better quality. The approach is analogous to solving a steady-state heat conduction problem with discrete heat sources. The spacing parameters of grid points are distributed over the nodes of a Cartesian background grid by interpolating from a few prescribed sources and solving a Poisson equation. To increase the control over the grid point distribution, a directional clustering approach is used. The new method is convenient to use and provides better grid quality and flexibility. Sample results are presented to demonstrate the power of the method.
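
    The heat-conduction analogy can be sketched as a Jacobi relaxation of a Laplace/Poisson problem on a uniform Cartesian grid, with a prescribed source clamped to a fine spacing value; the grid size and source values below are arbitrary illustrative choices:

```python
import numpy as np

def spacing_field(n=33, sources=((16, 16, 0.05),), n_iter=5000):
    """Distribute grid-spacing parameters over a uniform Cartesian
    background grid by relaxing a steady heat-conduction problem with
    discrete sources, per the analogy in the abstract. Boundary spacing
    is fixed at 1.0 (a hypothetical far-field value)."""
    s = np.ones((n, n))
    for _ in range(n_iter):
        # Jacobi sweep on interior nodes (5-point Laplacian average)
        s[1:-1, 1:-1] = 0.25 * (s[:-2, 1:-1] + s[2:, 1:-1] +
                                s[1:-1, :-2] + s[1:-1, 2:])
        for i, j, val in sources:
            s[i, j] = val          # clamp prescribed fine spacing at sources
    return s

s = spacing_field()
print(s[16, 16], s[0, 0])
```

    The resulting field varies smoothly from the fine spacing at the source toward the coarse far-field value, which is exactly the behaviour wanted from a background grid that drives element sizes for the advancing front.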

  5. A possible alternative method for collecting mosquito larvae in rice fields

    PubMed Central

    Robert, Vincent; Goff, Gilbert Le; Ariey, Frédéric; Duchemin, Jean-Bernard

    2002-01-01

    Background Rice fields are efficient breeding places for malaria vectors in Madagascar. In order to establish as easily as possible whether a rice field is an effective larval site for anophelines, we compared classical dipping versus a net as methods of collecting larvae. Results Using similar collecting procedures, we found that the total number of anopheline larvae collected with the net was exactly double (174/87) that collected by dipping. The number of anopheline species collected was also greater with the net. Conclusions The net is an effective means of collecting anopheline larvae and can be used for qualitative ecological studies and to rapidly determine which rice fields contain malaria vectors. PMID:12057018

  6. BRST-BFV analysis of anomalies in bosonic string theory interacting with background gravitational field

    NASA Astrophysics Data System (ADS)

    Buchbinder, I. L.; Mistchuk, B. R.; Pershin, V. D.

    1995-02-01

    A general BRST-BFV analysis of the anomaly in string theory coupled to background fields is carried out. An exact equation for the c-valued symbol of the anomaly operator is found and the structure of its solution is studied.

  7. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The quantification step of Western blot is critical for obtaining accurate and reproducible results. Given the technical knowledge required for densitometry analysis and constraints on resource availability, standard office scanners are often used for image acquisition of developed Western blot films, and the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with ImageJ, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate, and reproducible approach that can be used when resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
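
    The abstract does not detail the paper's specific subtraction scheme, so the following is only a generic sketch of densitometry band quantification with a local background estimated from band-free flanking pixels (all names and the lane profile are hypothetical):

```python
import numpy as np

def band_volume(lane, band_slice, margin=5):
    """Integrate a densitometry band after subtracting a local background
    estimated as the median of flanking, band-free pixels. A generic
    sketch, not the specific method of the paper."""
    lo, hi = band_slice
    flank = np.r_[lane[max(0, lo - margin):lo], lane[hi:hi + margin]]
    background = np.median(flank)
    return float(np.sum(lane[lo:hi] - background))

# synthetic lane profile: flat background of 10 plus a band of height 50
lane = np.full(100, 10.0)
lane[40:60] += 50.0
print(band_volume(lane, (40, 60)))  # → 1000.0
```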

  8. Background oriented schlieren in a density stratified fluid.

    PubMed

    Verso, Lilly; Liberzon, Alex

    2015-10-01

    Non-intrusive quantitative fluid density measurement methods are essential in stratified-flow experiments. Digital imaging leads to synthetic schlieren methods, in which the variations of the index of refraction are reconstructed computationally. In this study, an extension to one of these methods, called background oriented schlieren (BOS), is proposed. The extension enables an accurate reconstruction of the density field in stratified liquid experiments. Typically, the experiments are performed with the light source, background pattern, and camera positioned on opposite sides of a transparent vessel. The multimedia imaging through air-glass-water-glass-air leads to an additional aberration that destroys the reconstruction. A two-step calibration and an image remapping transform are the key components that correct the images through the stratified media and provide non-intrusive full-field density measurements of transparent liquids.
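
    The core BOS measurement, estimating the apparent displacement of the background pattern, can be sketched with a 1-D cross-correlation; the real method works on 2-D images and adds the two-step calibration described above:

```python
import numpy as np

def pattern_shift(ref, distorted):
    """Estimate the apparent background-pattern displacement (in pixels)
    by cross-correlation; in BOS this displacement is proportional to the
    refractive-index gradient integrated along the line of sight."""
    c = np.correlate(distorted - distorted.mean(), ref - ref.mean(), mode="full")
    return np.argmax(c) - (len(ref) - 1)

rng = np.random.default_rng(0)
ref = rng.standard_normal(256)      # random dot pattern (1-D stand-in)
distorted = np.roll(ref, 3)         # refraction shifts the pattern by 3 px
print(pattern_shift(ref, distorted))  # → 3
```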

  9. Multipactor susceptibility on a dielectric with a bias dc electric field and a background gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang Peng; Lau, Y. Y.; Franzi, Matthew

    2011-05-15

    We use Monte Carlo simulations and analytical calculations to derive the condition for the onset of multipactor discharge on a dielectric surface at various combinations of the bias dc electric field, rf electric field, and background pressures of noble gases, such as argon. It is found that the presence of a tangential bias dc electric field on the dielectric surface lowers the magnitude of the rf electric field threshold needed to initiate multipactor, therefore plausibly offering robust protection against high power microwaves. The presence of low pressure gases may lead to a lower multipactor saturation level, however. The combined effects of the tangential dc electric field and external gases on multipactor susceptibility are presented.

  10. Field background odour should be taken into account when formulating a pest attractant based on plant volatiles

    PubMed Central

    Cai, Xiaoming; Bian, Lei; Xu, Xiuxiu; Luo, Zongxiu; Li, Zhaoqun; Chen, Zongmao

    2017-01-01

    Attractants for pest monitoring and controlling can be developed based on plant volatiles. Previously, we showed that tea leafhopper (Empoasca onukii) preferred grapevine, peach plant, and tea plant odours to clean air. In this research, we formulated three blends with similar attractiveness to leafhoppers as peach, grapevine, and tea plant volatiles; these blends were composed of (Z)-3-hexenyl acetate, (E)-ocimene, (E)-4,8-dimethyl-1,3,7-nonatriene, benzaldehyde, and ethyl benzoate. Based on these five compounds, we developed two attractants, formula-P and formula-G. The specific component relative to tea plant volatiles in formula-P was benzaldehyde, and that in formula-G was ethyl benzoate. These two compounds played a role in attracting leafhoppers. In laboratory assays, the two attractants were more attractive than tea plant volatiles to the leafhoppers, and had a similar level of attractiveness. However, the leafhoppers were not attracted to formula-P in the field. A high concentration of benzaldehyde was detected in the background odour of the tea plantations. In laboratory tests, benzaldehyde at the field concentration was attractive to leafhoppers. Our results indicate that the field background odour can interfere with a point-releasing attractant when their components overlap, and that a successful attractant must differ from the field background odour. PMID:28150728

  11. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    PubMed

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

    Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] used to quantitate specific proteins must address high backgrounds due to non-specific reactivity. We report here, for the first time, a quantitative comparison of methods for reducing the background of commercial biotinylated antibodies using the Python-based ELISA_QC program, demonstrated with a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with more traditional methods of addressing this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, as analyzed by the Python-based ELISA_QC program, the ID method generated results more similar to those obtained with the standard lot 1, at each concentration of the ELISA standard curve, than the HP method. We conclude that the ID method, while more laborious, provides the best solution to the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.

  12. An Effective Method for Modeling Two-dimensional Sky Background of LAMOST

    NASA Astrophysics Data System (ADS)

    Haerken, Hasitieer; Duan, Fuqing; Zhang, Jiannan; Guo, Ping

    2017-06-01

    Each CCD of LAMOST accommodates 250 spectra, of which about 40 are used to observe the sky background during real observations. The problem we solve is how to estimate the unknown sky background hidden in the 210 observed celestial spectra by using the 40 known sky spectra. In order to model the sky background, a pre-observation is usually performed with all fibers observing the sky background. We use the resulting 250 skylight spectra as training data, with those observed by the 40 sky fibers serving as a base vector set. The Locality-constrained Linear Coding (LLC) technique is utilized to represent the skylight spectra observed by the 210 fibers in terms of this base vector set. We also segment each spectrum into small parts and establish a local sky background model for each part. Experimental results validate the proposed method and show that the local model is better than the global model.
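
    The LLC representation step can be sketched as follows, assuming the standard closed-form LLC solution (k nearest bases, ridge-regularized weights with a sum-to-one constraint); the base spectra here are random stand-ins for the 40 sky-fibre spectra:

```python
import numpy as np

def llc_code(y, bases, k=5, lam=1e-4):
    """Locality-constrained linear coding sketch: approximate spectrum y
    using its k nearest base spectra, solving the ridge-regularised
    least-squares problem with a sum-to-one constraint."""
    d = np.linalg.norm(bases - y, axis=1)
    idx = np.argsort(d)[:k]                  # locality: k nearest bases
    B = bases[idx] - y                       # shift bases to the data point
    G = B @ B.T + lam * np.eye(k)            # regularised local Gram matrix
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()                             # enforce sum-to-one
    code = np.zeros(len(bases))
    code[idx] = w
    return code

rng = np.random.default_rng(0)
bases = rng.standard_normal((40, 100))       # 40 "sky-fibre" base spectra
y = 0.6 * bases[3] + 0.4 * bases[7]          # unknown sky spectrum to represent
code = llc_code(y, bases)
recon = code @ bases
print(np.linalg.norm(recon - y))
```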

  13. Spectral feature characterization methods for blood stain detection in crime scene backgrounds

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Mathew, Jobin J.; Dube, Roger R.; Messinger, David W.

    2016-05-01

    Blood stains are one of the most important types of evidence in forensic investigation. They contain valuable DNA information, and the pattern of the stains can suggest specifics about the nature of the violence that transpired at the scene. Blood spectral signatures containing unique reflectance or absorption features are important both for forensic on-site investigation and laboratory testing. They can be used for target detection and identification applied to crime scene hyperspectral imagery, and also be utilized to analyze the spectral variation of blood on various backgrounds. Non-blood stains often mislead the detection and can generate false alarms at a real crime scene, especially against dark and red backgrounds. This paper measured the reflectance of liquid blood and 9 kinds of non-blood samples in the range of 350 nm - 2500 nm in various crime scene backgrounds, such as pure samples contained in petri dishes with various thicknesses, mixed samples with fabrics of different colors and materials, and mixed samples with wood, all of which are examined to provide sub-visual evidence for detecting and recognizing blood from non-blood samples in a realistic crime scene. The spectral differences between blood and non-blood samples are examined, and spectral features such as "peaks" and "depths" of reflectance are selected. Two blood stain detection methods are proposed in this paper. The first method uses an index defined as the ratio of "depth" minus "peak" over "depth" plus "peak" within a wavelength range of the reflectance spectrum. The second method uses the relative band depth of selected wavelength ranges of the reflectance spectrum. Results show that the index method is able to discriminate blood from non-blood samples in most tested crime scene backgrounds, but not on black felt, whereas the relative band depth method is able to discriminate blood from non-blood samples on all of the tested background material types and colors.
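
    The first (index) method reduces to a normalized difference of the two spectral features; the numbers below are hypothetical reflectance values, not measured data:

```python
def normalized_index(depth, peak):
    """Normalised difference index used to flag blood-like spectra:
    (depth - peak) / (depth + peak), computed from reflectance features
    in chosen wavelength ranges."""
    return (depth - peak) / (depth + peak)

# illustrative feature values (hypothetical reflectances)
print(normalized_index(depth=0.45, peak=0.15))  # → 0.5
```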

  14. 2010 August 1-2 Sympathetic Eruptions. II. Magnetic Topology of the MHD Background Field

    NASA Astrophysics Data System (ADS)

    Titov, Viacheslav S.; Mikić, Zoran; Török, Tibor; Linker, Jon A.; Panasenco, Olga

    2017-08-01

    Using a potential field source-surface (PFSS) model, we recently analyzed the global topology of the background coronal magnetic field for a sequence of coronal mass ejections (CMEs) that occurred on 2010 August 1-2. Here we repeat this analysis for the background field reproduced by a magnetohydrodynamic (MHD) model that incorporates plasma thermodynamics. As for the PFSS model, we find that all three CME source regions contain a coronal hole (CH) that is separated from neighboring CHs by topologically very similar pseudo-streamer structures. However, the two models yield very different results for the size, shape, and flux of the CHs. We find that the helmet-streamer cusp line, which corresponds to a source-surface null line in the PFSS model, is structurally unstable and does not form in the MHD model. Our analysis indicates that, generally, in MHD configurations, this line instead consists of a multiple-null separator passing along the edge of disconnected-flux regions. Some of these regions are transient and may be the origin of the so-called streamer blobs. We show that the core topological structure of such blobs is a three-dimensional “plasmoid” consisting of two conjoined flux ropes of opposite handedness, which connect at a spiral null point of the magnetic field. Our analysis reveals that such plasmoids also appear in pseudo-streamers on much smaller scales. These new insights into the coronal magnetic topology provide some intriguing implications for solar energetic particle events and for the properties of the slow solar wind.

  15. Effect of a chameleon scalar field on the cosmic microwave background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Anne-Christine; Schelpe, Camilla A. O.; Shaw, Douglas J.

    2009-09-15

    We show that a direct coupling between a chameleonlike scalar field and photons can give rise to a modified Sunyaev-Zel'dovich (SZ) effect in the cosmic microwave background (CMB). The coupling induces a mixing between chameleon particles and the CMB photons when they pass through the magnetic field of a galaxy cluster. Both the intensity and the polarization of the radiation are modified. The degree of modification depends strongly on the properties of the galaxy cluster, such as magnetic field strength and electron number density. Existing SZ measurements of the Coma cluster enable us to place constraints on the photon-chameleon coupling. The constrained conversion probability in the cluster is P_Coma(204 GHz) < 6.2 x 10^-5 at 95% confidence, corresponding to an upper bound on the coupling strength of g_eff^(cell) < 2.2 x 10^-8 GeV^-1 or g_eff^(Kolmo) < (7.2-32.5) x 10^-10 GeV^-1, depending on the model that is assumed for the cluster magnetic field structure. We predict the radial profile of the chameleonic CMB intensity decrement. We find that the chameleon effect extends farther toward the edges of the cluster than the thermal SZ effect. Thus we might see a discrepancy between the x-ray emission data and the observed SZ intensity decrement. We further predict the expected change to the CMB polarization arising from the existence of a chameleonlike scalar field. These predictions could be verified or constrained by future CMB experiments.

  16. The Ratio between Field Attractive and Background Volatiles Encodes Host-Plant Recognition in a Specialist Moth.

    PubMed

    Knudsen, Geir K; Norli, Hans R; Tasin, Marco

    2017-01-01

    Volatiles emitted by plants convey an array of information through different trophic levels. Animals such as host-seeking herbivores encounter plumes with filaments from both host and non-host plants. While studies have shown a behavioral effect of non-host plants on herbivore host location, less information is available on how a searching insect herbivore perceives and flies upwind to a host-plant odor plume within a background of non-host volatiles. We hypothesized here that herbivorous insects in search of a host plant can discriminate plumes of host and non-host plants and that the taxonomic relatedness of the non-host has an effect on finding the host. We also predicted that the ratio between certain plant volatiles is recognized as a host-plant recognition cue by a receiving herbivorous insect. To verify these hypotheses we measured the wind tunnel response of the moth Argyresthia conjugella to the host plant rowan, to non-host plants taxonomically related (Rosaceae, apple and pear) or unrelated to the host (Pinaceae, spruce), and to binary combinations of host and non-host plants. Volatiles were collected from all plant combinations and delivered to the test insect via an ultrasonic sprayer as an artificial plume. While the response to rowan as a plant was not affected by the addition of any of the non-host plants, the attraction to the corresponding sprayed headspace decreased when pear or apple, but not spruce, was added to rowan. A similar result was measured toward the odor exiting a jar in which freshly cut plant material of apple, pear, or spruce was intermixed with rowan. Dose-response gas chromatography coupled to electroantennography revealed the presence of seven field attractive and seven background non-attractive antennally active compounds. Although the abundance of field attractive and of some background volatiles decreased in all dual combinations in comparison with rowan alone, an increased amount of the background compounds (3E)-4,8-Dimethyl-1

  17. The EPIC-MOS Particle-Induced Background Spectrum

    NASA Technical Reports Server (NTRS)

    Kuntz, K. D.; Snowden, S. L.

    2006-01-01

    We have developed a method for constructing a spectrum of the particle-induced instrumental background of the XMM-Newton EPIC MOS detectors that can be used for observations of the diffuse background and extended sources that fill a significant fraction of the instrument field of view. The strength and spectrum of the particle-induced background, that is, the background due to the interaction of particles with the detector and the detector surroundings, is temporally variable as well as spatially variable over individual chips. Our method uses a combination of the filter-wheel-closed data and a database of unexposed-region data to construct a spectrum of the "quiescent" background. We show that, using this method of background subtraction, the differences between independent observations of the same region of "blank sky" are consistent with the statistical uncertainties except when there is clear evidence of solar wind charge exchange (SWCX) emission. We use the blank sky observations to show that contamination by SWCX emission is a strong function of the solar wind proton flux, and that observations through the flanks of the magnetosheath appear to be contaminated only at much higher solar wind fluxes. We have also developed a spectral model of the residual soft proton flares, which allows their effects to be removed to a substantial degree during spectral fitting.
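A heavily simplified reading of the quiescent-background construction is to rescale the filter-wheel-closed (FWC) spectrum by the ratio of unexposed-corner count rates; the published method augments the FWC data with a full database of unexposed-region spectra, which this sketch omits, and all names here are illustrative:

```python
import numpy as np

def quiescent_background(fwc_spectrum, fwc_corner_rate, obs_corner_rate):
    """Rescale the filter-wheel-closed (FWC) spectrum so that its
    unexposed-corner count rate matches the rate seen in the observation's
    unexposed regions, giving a first-order "quiescent" background estimate."""
    return np.asarray(fwc_spectrum, dtype=float) * (obs_corner_rate / fwc_corner_rate)
```
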

  18. The Pedestrian Detection Method Using an Extension Background Subtraction about the Driving Safety Support Systems

    NASA Astrophysics Data System (ADS)

    Muranaka, Noriaki; Date, Kei; Tokumaru, Masataka; Imanishi, Shigeru

    In recent years, traffic accidents have occurred frequently as traffic density has exploded. We therefore believe that a safe and comfortable transportation system that protects pedestrians, the most vulnerable road users, is necessary. First, we detect and recognize the pedestrian (the crossing person) by image processing. Next, we inform drivers turning right or left that a pedestrian is present, by sound, image, and so on. By prompting drivers to drive safely in this way, accidents involving pedestrians can be decreased. In this paper, we use a background subtraction method to detect moving objects. In the background subtraction method, the way the background is updated is important; in the conventional approach, the threshold values for the subtraction processing and the background update are identical. That is, the mixing rate of the input image and the background image in the background update is a fixed value, and fine-tuning in response to environmental changes such as the weather is difficult. Therefore, we propose an update method for the background image in which estimation errors are unlikely to be amplified. We experiment with and examine five conditions: sunshine, cloud, evening, rain, and changing sunlight (night is excluded). This technique can set the threshold values of the subtraction processing and the background update processing separately to suit environmental conditions such as the weather. Therefore, the mixing rate of the input image and the background image in the background update can be tuned freely. Because parameter settings suited to the environmental conditions are important for minimizing the error rate, we also examine the setting of the parameters.
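The key idea above, separate thresholds for detection and for background update with a tunable mixing rate, can be sketched as one update step; the parameter values here are illustrative, not taken from the paper:

```python
import numpy as np

def update_background(frame, background, alpha=0.05, t_detect=25.0, t_update=10.0):
    """One step of background subtraction with separate thresholds for
    detection and for background update, as the abstract advocates."""
    frame = frame.astype(np.float64)
    diff = np.abs(frame - background)
    foreground = diff > t_detect      # subtraction (detection) threshold
    stable = diff <= t_update         # stricter background-update threshold
    new_bg = background.copy()
    # Blend the input into the background only where the scene is stable,
    # so detection mistakes are not amplified into the background model.
    new_bg[stable] = (1 - alpha) * background[stable] + alpha * frame[stable]
    return foreground, new_bg
```
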

  19. Comparison of presbyopic additions determined by the fused cross-cylinder method using alternative target background colours.

    PubMed

    Wee, Sung-Hyun; Yu, Dong-Sik; Moon, Byeong-Yeon; Cho, Hyun Gug

    2010-11-01

    To compare and contrast standard and alternative versions of refractor head (phoropter)-based charts used to determine reading addition. Forty one presbyopic subjects aged between 42 and 60 years were tested. Tentative additions were determined using a red-green background letter chart, and 4 cross-grid charts (with white, red, green, or red-green backgrounds) which were used with the fused cross cylinder (FCC) method. The final addition for a 40 cm working distance was determined for each subject by subjectively adjusting the tentative additions. There were significant differences in the tentative additions obtained using the 5 methods (repeated measures ANOVA, p < 0.001). The mean differences between the tentative and final additions were <0.10 D and were not clinically meaningful, with the exception of the red-green letter test, and the red background in the FCC method. There were no significant differences between the tentative and final additions for the green background in the FCC method (p > 0.05). The intervals of the 95% limits of agreement were under ±0.50 D, and the narrowest interval (±0.26 D) was for the red-green background. The 3 FCC methods with a white, green, or red-green background provided a tentative addition close to the final addition. Compared with the other methods, the FCC method with the red-green background had a narrow range of error. Further, since this method combines the functions of both the fused cross-cylinder test and the duochrome test, it can be a useful technique for determining presbyopic additions. © 2010 The Authors. Ophthalmic and Physiological Optics © 2010 The College of Optometrists.

  20. Subspace-based optimization method for inverse scattering problems with an inhomogeneous background medium

    NASA Astrophysics Data System (ADS)

    Chen, Xudong

    2010-07-01

    This paper proposes a version of the subspace-based optimization method to solve the inverse scattering problem with an inhomogeneous background medium where the known inhomogeneities are bounded in a finite domain. Although the background Green's function at each discrete point in the computational domain is not directly available in an inhomogeneous background scenario, the paper uses the finite element method to simultaneously obtain the Green's function at all discrete points. The essence of the subspace-based optimization method is that part of the contrast source is determined from the spectrum analysis without using any optimization, whereas the orthogonally complementary part is determined by solving a lower dimension optimization problem. This feature significantly speeds up the convergence of the algorithm and at the same time makes it robust against noise. Numerical simulations illustrate the efficacy of the proposed algorithm. The algorithm presented in this paper finds wide applications in nondestructive evaluation, such as through-wall imaging.
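The subspace splitting described above can be illustrated with a small linear-algebra sketch: the component of the contrast source lying in the span of the dominant singular vectors of the mapping operator is recovered directly from the measured field, with no optimization. The matrix names and the choice of L are assumptions of the sketch:

```python
import numpy as np

def deterministic_contrast_source(G, e_scat, L):
    """Deterministic part of the contrast source in subspace-based
    optimization: project the measured scattered field e_scat onto the L
    dominant singular vectors of the mapping operator G and invert those
    components directly; no optimization is involved. The complementary
    part (not computed here) is found by a lower-dimensional optimization."""
    U, s, Vh = np.linalg.svd(G, full_matrices=False)
    coeffs = (U[:, :L].conj().T @ e_scat) / s[:L]
    return Vh[:L].conj().T @ coeffs
```
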

  1. Apparatuses and methods for generating electric fields

    DOEpatents

    Scott, Jill R; McJunkin, Timothy R; Tremblay, Paul L

    2013-08-06

    Apparatuses and methods relating to generating an electric field are disclosed. An electric field generator may include a semiconductive material configured in a physical shape substantially different from a shape of an electric field to be generated thereby. The electric field is generated when a voltage drop exists across the semiconductive material. A method for generating an electric field may include applying a voltage to a shaped semiconductive material to generate a complex, substantially nonlinear electric field. The shape of the complex, substantially nonlinear electric field may be configured for directing charged particles to a desired location. Other apparatuses and methods are disclosed.

  2. The Ratio between Field Attractive and Background Volatiles Encodes Host-Plant Recognition in a Specialist Moth

    PubMed Central

    Knudsen, Geir K.; Norli, Hans R.; Tasin, Marco

    2017-01-01

    Volatiles emitted by plants convey an array of information through different trophic levels. Animals such as host-seeking herbivores encounter plumes with filaments from both host and non-host plants. While studies showed a behavioral effect of non-host plants on herbivore host location, less information is available on how a searching insect herbivore perceives and flies upwind to a host-plant odor plume within a background of non-host volatiles. We hypothesized here that herbivorous insects in search of a host-plant can discriminate plumes of host and non-host plants and that the taxonomic relatedness of the non-host have an effect on finding the host. We also predicted that the ratio between certain plant volatiles is cognized as host-plant recognition cue by a receiver herbivorous insect. To verify these hypotheses we measured the wind tunnel response of the moth Argyresthia conjugella to the host plant rowan, to non-host plants taxonomically related (Rosaceae, apple and pear) or unrelated to the host (Pinaceae, spruce) and to binary combination of host and non-host plants. Volatiles were collected from all plant combinations and delivered to the test insect via an ultrasonic sprayer as an artificial plume. While the response to the rowan as a plant was not affected by the addition of any of the non-host plants, the attraction to the corresponding sprayed headspace decreased when pear or apple but not spruce were added to rowan. A similar result was measured toward the odor exiting a jar where freshly cut plant material of apple or pear or spruce was intermixed with rowan. Dose-response gas-chromatography coupled to electroantennography revealed the presence of seven field attractive and seven background non-attractive antennally active compounds. Although the abundance of field attractive and of some background volatiles decreased in all dual combinations in comparison with rowan alone, an increased amount of the background compounds (3E)-4,8-Dimethyl-1

  3. 2010 August 1–2 Sympathetic Eruptions. II. Magnetic Topology of the MHD Background Field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titov, Viacheslav S.; Mikić, Zoran; Török, Tibor

    Using a potential field source-surface (PFSS) model, we recently analyzed the global topology of the background coronal magnetic field for a sequence of coronal mass ejections (CMEs) that occurred on 2010 August 1–2. Here we repeat this analysis for the background field reproduced by a magnetohydrodynamic (MHD) model that incorporates plasma thermodynamics. As for the PFSS model, we find that all three CME source regions contain a coronal hole (CH) that is separated from neighboring CHs by topologically very similar pseudo-streamer structures. However, the two models yield very different results for the size, shape, and flux of the CHs. We find that the helmet-streamer cusp line, which corresponds to a source-surface null line in the PFSS model, is structurally unstable and does not form in the MHD model. Our analysis indicates that, generally, in MHD configurations, this line instead consists of a multiple-null separator passing along the edge of disconnected-flux regions. Some of these regions are transient and may be the origin of the so-called streamer blobs. We show that the core topological structure of such blobs is a three-dimensional “plasmoid” consisting of two conjoined flux ropes of opposite handedness, which connect at a spiral null point of the magnetic field. Our analysis reveals that such plasmoids also appear in pseudo-streamers on much smaller scales. These new insights into the coronal magnetic topology provide some intriguing implications for solar energetic particle events and for the properties of the slow solar wind.

  4. Detection methods for non-Gaussian gravitational wave stochastic backgrounds

    NASA Astrophysics Data System (ADS)

    Drasco, Steve; Flanagan, Éanna É.

    2003-04-01

    A gravitational wave stochastic background can be produced by a collection of independent gravitational wave events. There are two classes of such backgrounds, one for which the ratio of the average time between events to the average duration of an event is small (i.e., many events are on at once), and one for which the ratio is large. In the first case the signal is continuous, sounds something like a constant hiss, and has a Gaussian probability distribution. In the second case, the discontinuous or intermittent signal sounds something like popcorn popping, and is described by a non-Gaussian probability distribution. In this paper we address the issue of finding an optimal detection method for such a non-Gaussian background. As a first step, we examine the idealized situation in which the event durations are short compared to the detector sampling time, so that the time structure of the events cannot be resolved, and we assume white, Gaussian noise in two collocated, aligned detectors. For this situation we derive an appropriate version of the maximum likelihood detection statistic. We compare the performance of this statistic to that of the standard cross-correlation statistic both analytically and with Monte Carlo simulations. In general the maximum likelihood statistic performs better than the cross-correlation statistic when the stochastic background is sufficiently non-Gaussian, resulting in a gain factor in the minimum gravitational-wave energy density necessary for detection. This gain factor ranges roughly between 1 and 3, depending on the duty cycle of the background, for realistic observing times and signal strengths for both ground and space based detectors. The computational cost of the statistic, although significantly greater than that of the cross-correlation statistic, is not unreasonable. 
Before the statistic can be used in practice with real detector data, further work is required to generalize our analysis to accommodate separated, misaligned
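The standard cross-correlation statistic against which the maximum likelihood statistic is compared can, for two collocated, aligned detectors with independent noise, be written as the average product of the two data streams:

```python
import numpy as np

def cross_correlation_statistic(x1, x2):
    """Cross-correlation detection statistic for two collocated, aligned
    detectors with independent noise: the average product of the two data
    streams, whose expectation value is the common (background) signal power."""
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    return np.mean(x1 * x2)
```
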

  5. Compressible cavitation with stochastic field method

    NASA Astrophysics Data System (ADS)

    Class, Andreas; Dumond, Julien

    2012-11-01

    Non-linear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally the simulation of pdf transport requires Monte-Carlo codes based on Lagrange particles or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic field method solving pdf transport based on Euler fields has been proposed which eliminates the necessity to mix Euler and Lagrange techniques or prescribed pdf assumptions. In the present work, part of the PhD Design and analysis of a Passive Outflow Reducer relying on cavitation, a first application of the stochastic field method to multi-phase flow and in particular to cavitating flow is presented. The application considered is a nozzle subjected to high velocity flow so that sheet cavitation is observed near the nozzle surface in the divergent section. It is demonstrated that the stochastic field formulation captures the wide range of pdf shapes present at different locations. The method is compatible with finite-volume codes where all existing physical models available for Lagrange techniques, presumed pdf or binning methods can be easily extended to the stochastic field formulation.

  6. Surveys for presence of Oregon spotted frog (Rana pretiosa): background information and field methods

    USGS Publications Warehouse

    Pearl, Christopher A.; Clayton, David; Turner, Lauri

    2010-01-01

    The Oregon spotted frog (Rana pretiosa) is the most aquatic of the native frogs in the Pacific Northwest. The common name derives from the pattern of black, ragged-edged spots set against a brown or red ground color on the dorsum of adult frogs. Oregon spotted frogs are generally associated with wetland complexes that have several aquatic habitat types and sizeable coverage of emergent vegetation. Like other ranid frogs native to the Northwest, Oregon spotted frogs breed in spring, larvae transform in summer of their breeding year, and adults tend to be relatively short lived (3-5 yrs). Each life stage (egg, tadpole, juvenile and adult) has characteristics that present challenges for detection. Breeding can be explosive and completed within 1-2 weeks. Egg masses are laid in aggregations, often in a few locations in large areas of potential habitat. Egg masses can develop, hatch, and disintegrate in <2 weeks during warm weather. Tadpoles can be difficult to identify, have low survival, and spend most of their 3-4 months hidden in vegetation or flocculant substrates. Juveniles and adults are often difficult to capture and can spend summers away from breeding areas. Moreover, a substantial portion of extant populations are of limited size (<100 breeding adults), and field densities of all life stages are often low. An understanding of the biology of the species and use of multiple visits are thus important for assessing presence of Oregon spotted frogs. This report is meant to be a resource for USDA Region 6 Forest Service (FS) and OR/WA Bureau of Land Management (BLM) personnel tasked with surveying for the presence of Oregon spotted frogs. Our objective was to summarize information to improve the efficiency of field surveys and increase chances of detection if frogs are present. We include overviews of historical and extant ranges of Oregon spotted frog. 
We briefly summarize what is known of Oregon spotted frog habitat associations and review aspects of behavior and

  7. Background simulations for the wide field imager aboard the ATHENA X-ray Observatory

    NASA Astrophysics Data System (ADS)

    Hauf, Steffen; Kuster, Markus; Hoffmann, Dieter H. H.; Lang, Philipp-Michael; Neff, Stephan; Pia, Maria Grazia; Strüder, Lothar

    2012-09-01

    The ATHENA X-ray observatory was a European Space Agency project for a L-class mission. ATHENA was to be based upon a simplified IXO design with the number of instruments and the focal length of the Wolter optics being reduced. One of the two instruments, the Wide Field Imager (WFI) was to be a DePFET based focal plane pixel detector, allowing for high time and spatial resolution spectroscopy in the energy-range between 0.1 and 15 keV. In order to fulfill the mission goals a high sensitivity is essential, especially to study faint and extended sources. Thus a detailed understanding of the detector background induced by cosmic ray particles is crucial. During the mission design generally extensive Monte-Carlo simulations are used to estimate the detector background in order to optimize shielding components and software rejection algorithms. The Geant4 toolkit1,2 is frequently the tool of choice for this purpose. Alongside validation of the simulation environment with XMM-Newton EPIC-pn and Space Shuttle STS-53 data we present estimates for the ATHENA WFI cosmic ray induced background including long-term activation, which demonstrate that DEPFET-technology based detectors are able to achieve the required sensitivity.

  8. Magnetic imager and method

    DOEpatents

    Powell, J.; Reich, M.; Danby, G.

    1997-07-22

    A magnetic imager includes a generator for practicing a method of applying a background magnetic field over a concealed object, with the object being effective to locally perturb the background field. The imager also includes a sensor for measuring perturbations of the background field to detect the object. In one embodiment, the background field is applied quasi-statically. And, the magnitude or rate of change of the perturbations may be measured for determining location, size, and/or condition of the object. 25 figs.

  9. Background ELF magnetic fields in incubators: a factor of importance in cell culture work.

    PubMed

    Mild, Kjell Hansson; Wilén, Jonna; Mattsson, Mats-Olof; Simko, Myrtill

    2009-07-01

    Extremely low frequency (ELF) magnetic fields in cell culture incubators have been measured. Values of the order of tens of μT were found, which is in sharp contrast to the values found in our normal environment (0.05-0.1 μT). There are numerous examples of biological effects found after exposure to magnetic fields at these levels, such as changes in gene expression, blocked cell differentiation, inhibition of the effect of tamoxifen, effects on chick embryo development, etc. We therefore recommend that people working with cell culture incubators check the background magnetic field and take it into account when performing their experiments, since this could be an unrecognised factor of importance contributing to the variability in results from work with cell cultures.

  10. Infrared Thermography Approach for Effective Shielding Area of Field Smoke Based on Background Subtraction and Transmittance Interpolation.

    PubMed

    Tang, Runze; Zhang, Tonglai; Chen, Yongpeng; Liang, Hao; Li, Bingyang; Zhou, Zunning

    2018-05-06

    Effective shielding area is a crucial indicator for evaluating the infrared smoke-obscuring effectiveness on the battlefield. The conventional methods for assessing the shielding area of a smoke screen are time-consuming and labor intensive, in addition to lacking precision. Therefore, an efficient and convincing technique for testing the effective shielding area of a smoke screen has great potential benefit for smoke screen applications in field trials. In this study, a thermal infrared sensor with a mid-wavelength infrared (MWIR) range of 3 to 5 μm was first used to capture images of the target scene, both clear and obscured by smoke, at regular intervals. Background subtraction, as used in motion detection, was then applied to obtain the contour of the smoke cloud in each frame. The smoke transmittance at each pixel within the smoke contour was interpolated based on the data collected from the image. Finally, the effective shielding area of the smoke was calculated based on the accumulation of the effectively shielding pixel points. One advantage of this approach is that it uses only one thermal infrared sensor without any additional equipment in the field trial, which contributes significantly to its efficiency and convenience. Experiments have been carried out to demonstrate that this approach can determine the effective shielding area of field infrared smoke both practically and efficiently.
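The pipeline described, contour by background subtraction, per-pixel transmittance, accumulation of shielding pixels, can be sketched as below. The thresholds and the smoke/clear intensity-ratio transmittance model are assumptions of this sketch, not values or formulas from the paper:

```python
import numpy as np

def effective_shielding_area(clear_frame, smoke_frame, pixel_area_m2,
                             contour_thresh=5.0, transmittance_thresh=0.5):
    """Accumulate effectively shielding pixels: background subtraction
    between the clear and smoke frames delineates the cloud contour,
    per-pixel transmittance is approximated as the smoke/clear intensity
    ratio, and pixels below the transmittance threshold are counted."""
    clear = clear_frame.astype(np.float64)
    smoke = smoke_frame.astype(np.float64)
    in_cloud = np.abs(clear - smoke) > contour_thresh
    # Pixels with no clear-scene signal default to transmittance 1 (unshielded).
    trans = np.divide(smoke, clear, out=np.ones_like(clear), where=clear > 0)
    shielded = in_cloud & (trans < transmittance_thresh)
    return int(shielded.sum()) * pixel_area_m2
```
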

  11. Bohmian field theory on a shape dynamics background and Unruh effect

    NASA Astrophysics Data System (ADS)

    Dündar, Furkan Semih; Arık, Metin

    2018-05-01

    In this paper, we investigate the Unruh radiation in the Bohmian field theory on a shape dynamics background setting. Since metric and metric momentum are real quantities, the integral kernel to invert the Lichnerowicz-York equation for first order deviations due to existence of matter terms turns out to be real. This fact makes the interaction Hamiltonian real. On the other hand, the only contribution to guarantee the existence of Unruh radiation has to come from the imaginary part of the temporal part of the wave functional. We have proved the existence of Unruh radiation in this setting. It is also important that we have found the Unruh radiation via an Unruh-DeWitt detector in a theory where there is no Lorentz symmetry and no conventional space-time structure.

  12. getimages: Background derivation and image flattening method

    NASA Astrophysics Data System (ADS)

    Men'shchikov, Alexander

    2017-05-01

    getimages performs background derivation and image flattening for high-resolution images obtained with space observatories. It is based on median filtering with sliding windows corresponding to a range of spatial scales from the observational beam size up to a maximum structure width X. The latter is a single free parameter of getimages that can be evaluated manually from the observed image. The median filtering algorithm provides a background image for structures of all widths below X. The same median filtering procedure applied to an image of standard deviations derived from a background-subtracted image results in a flattening image. Finally, a flattened image is computed by dividing the background-subtracted by the flattening image. Standard deviations in the flattened image are now uniform outside sources and filaments. Detecting structures in such radically simplified images results in much cleaner extractions that are more complete and reliable. getimages also reduces various observational and map-making artifacts and equalizes noise levels between independent tiles of mosaicked images. The code (a Bash script) uses FORTRAN utilities from getsources (ascl:1507.014), which must be installed.
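The scheme described above can be condensed into a single-scale sketch (the real getimages code, a Bash script over getsources FORTRAN utilities, iterates the median filtering over a range of scales up to the maximum structure width X; SciPy filters stand in here):

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def flatten_image(image, max_width):
    """Single-scale sketch of the getimages scheme: a median filter with a
    window set by the maximum structure width X gives the background; a
    median-filtered map of local standard deviations of the
    background-subtracted image gives the flattening image; their ratio is
    the flattened image with roughly uniform noise outside sources."""
    background = median_filter(image, size=max_width)
    residual = image - background
    local_mean = uniform_filter(residual, size=max_width)
    local_var = uniform_filter(residual**2, size=max_width) - local_mean**2
    flattening = median_filter(np.sqrt(np.maximum(local_var, 1e-12)), size=max_width)
    return residual / flattening
```
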

  13. Magnetic imager and method

    DOEpatents

    Powell, James; Reich, Morris; Danby, Gordon

    1997-07-22

    A magnetic imager 10 includes a generator 18 for practicing a method of applying a background magnetic field over a concealed object, with the object being effective to locally perturb the background field. The imager 10 also includes a sensor 20 for measuring perturbations of the background field to detect the object. In one embodiment, the background field is applied quasi-statically. And, the magnitude or rate of change of the perturbations may be measured for determining location, size, and/or condition of the object.

  14. Backgrounds in Language.

    ERIC Educational Resources Information Center

    Maxwell, John C.; Long, Barbara K.

    "Backgrounds in Language," a field-tested inservice course designed for use by groups of 15 or 25 language arts teachers, provides the subject matter background teachers need to make informed decisions about what curriculum materials to use in what way, at what time, and with which students. The course is comprised of eight 2-hour sessions,…

  15. A novel method to remove GPR background noise based on the similarity of non-neighboring regions

    NASA Astrophysics Data System (ADS)

    Montiel-Zafra, V.; Canadas-Quesada, F. J.; Vera-Candeas, P.; Ruiz-Reyes, N.; Rey, J.; Martinez, J.

    2017-09-01

    Ground penetrating radar (GPR) is a non-destructive technique that has been widely used in many areas of research, such as landmine detection or subsurface anomalies, where it is required to locate targets embedded within a background medium. One of the major challenges in GPR research remains improving the image quality of stone materials through the detection of true anisotropies, since most errors are caused by incorrect interpretation by users. This is complicated, however, by the interference of horizontal background noise, e.g., the air-ground interface, which reduces the high-resolution quality of radargrams. Thus, weak or deep anisotropies are often masked by this type of noise. In order to remove the background noise in GPR data, this work proposes a novel background removal method assuming that the horizontal noise appears as repetitive two-dimensional regions along the movement of the GPR antenna. Specifically, the proposed method, based on the non-local similarity of regions over distance, computes similarities between different regions at the same depth in order to identify the most repetitive regions, using a criterion that avoids closer regions. Evaluations are performed using a set of synthetic and real GPR data. Experimental results show that the proposed method obtains promising results compared to classic background removal techniques and the most recently published background removal methods.
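The non-local idea can be sketched in a simplified per-trace form: for each trace (column of the radargram), the background is estimated from the most similar trace at least a minimum number of traces away and subtracted. The published method compares 2-D regions rather than single columns; this reduction is an assumption to keep the sketch short:

```python
import numpy as np

def remove_background(radargram, min_gap=10):
    """Simplified non-local background removal for a radargram (rows =
    depth samples, columns = traces): subtract from each trace its most
    similar trace among those at least min_gap traces away, so repetitive
    horizontal noise cancels while local anomalies survive."""
    n_rows, n_cols = radargram.shape
    out = np.empty_like(radargram, dtype=np.float64)
    for j in range(n_cols):
        # Distance to every other trace, excluding near neighbours so that
        # the criterion avoids closer (trivially similar) regions.
        d = np.linalg.norm(radargram - radargram[:, [j]], axis=0)
        d[max(0, j - min_gap):j + min_gap + 1] = np.inf
        out[:, j] = radargram[:, j] - radargram[:, np.argmin(d)]
    return out
```
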

  16. Plenoptic background oriented schlieren imaging

    NASA Astrophysics Data System (ADS)

    Klemkowsky, Jenna N.; Fahringer, Timothy W.; Clifford, Christopher J.; Bathel, Brett F.; Thurow, Brian S.

    2017-09-01

    The combination of the background oriented schlieren (BOS) technique with the unique imaging capabilities of a plenoptic camera, termed plenoptic BOS, is introduced as a new addition to the family of schlieren techniques. Compared to conventional single camera BOS, plenoptic BOS is capable of sampling multiple lines-of-sight simultaneously. Displacements from each line-of-sight are collectively used to build a four-dimensional displacement field, which is a vector function structured similarly to the original light field captured in a raw plenoptic image. The displacement field is used to render focused BOS images, which qualitatively are narrow depth of field slices of the density gradient field. Unlike focused schlieren methods that require manually changing the focal plane during data collection, plenoptic BOS synthetically changes the focal plane position during post-processing, such that all focal planes are captured in a single snapshot. Through two different experiments, this work demonstrates that plenoptic BOS is capable of isolating narrow depth of field features, qualitatively inferring depth, and quantitatively estimating the location of disturbances in 3D space. Such results motivate future work to transition this single-camera technique towards quantitative reconstructions of 3D density fields.

  17. Field by field hybrid upwind splitting methods

    NASA Technical Reports Server (NTRS)

    Coquel, Frederic; Liou, Meng-Sing

    1993-01-01

    A new and general approach to upwind splitting is presented. The design principle combines the robustness of flux vector splitting (FVS) schemes in capturing nonlinear waves with the accuracy of some flux difference splitting (FDS) schemes in resolving linear waves. The new schemes are derived by a general hybridization technique applied directly at the basic level of the field-by-field decomposition involved in FDS methods. The scheme requires no spatial switch tuned to the local smoothness of the approximate solution.
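    The flux-vector-splitting building block can be illustrated on the simplest possible case, scalar linear advection; this generic textbook sketch is an assumption for illustration and is not the authors' field-by-field hybrid scheme:

```python
import numpy as np

def upwind_flux_split(u: np.ndarray, a: float) -> np.ndarray:
    """Flux vector splitting for linear advection f(u) = a*u:
    split f = f+ + f-, where f+ carries right-going and f- left-going
    information. The interface flux at i+1/2 upwinds each part:
    f+(u_i) + f-(u_{i+1})."""
    f_plus = 0.5 * (a + abs(a)) * u    # right-going part
    f_minus = 0.5 * (a - abs(a)) * u   # left-going part
    return f_plus[:-1] + f_minus[1:]   # fluxes at interior interfaces

u = np.array([1.0, 2.0, 3.0, 4.0])
flux = upwind_flux_split(u, a=1.0)  # for a > 0 this reduces to a * u_left
```

    An FDS scheme instead evaluates the jump u_{i+1} - u_i field by field; the hybridization described in the abstract mixes the two at that decomposition level.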

  18. A new background subtraction method for energy dispersive X-ray fluorescence spectra using a cubic spline interpolation

    NASA Astrophysics Data System (ADS)

    Yi, Longtao; Liu, Zhiguo; Wang, Kai; Chen, Man; Peng, Shiqi; Zhao, Weigang; He, Jialin; Zhao, Guangcui

    2015-03-01

    A new method is presented to subtract the background from energy dispersive X-ray fluorescence (EDXRF) spectra using cubic spline interpolation. To obtain accurate interpolation nodes, a smoothing fit and a set of discriminant formulas were adopted. From these nodes, the background is estimated by the resulting cubic spline function. The method has been tested on spectra measured from a coin and an oil painting using a confocal MXRF setup, as well as on an existing sample spectrum. The results confirm that the method properly subtracts the background.
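    A hedged sketch of the spline step: once peak-free nodes have been chosen (here by hand, whereas the paper selects them with a smoothing fit and discriminant formulas), the background is the natural cubic spline through the nodes, which is then subtracted from the spectrum. All signal shapes below are synthetic assumptions:

```python
import numpy as np

def natural_cubic_spline(xk, yk, x):
    """Evaluate the natural cubic spline through nodes (xk, yk) at x.
    Second derivatives M solve a tridiagonal system; M[0] = M[-1] = 0."""
    xk, yk = np.asarray(xk, float), np.asarray(yk, float)
    n = len(xk)
    h = np.diff(xk)
    A, rhs = np.zeros((n, n)), np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0  # natural boundary conditions
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = h[i - 1], 2 * (h[i - 1] + h[i]), h[i]
        rhs[i] = 6 * ((yk[i + 1] - yk[i]) / h[i] - (yk[i] - yk[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, rhs)
    idx = np.clip(np.searchsorted(xk, x) - 1, 0, n - 2)  # interval per point
    hi, t1, t2 = h[idx], xk[idx + 1] - x, x - xk[idx]
    return (M[idx] * t1**3 + M[idx + 1] * t2**3) / (6 * hi) \
        + (yk[idx] / hi - M[idx] * hi / 6) * t1 \
        + (yk[idx + 1] / hi - M[idx + 1] * hi / 6) * t2

# Synthetic EDXRF-like spectrum: smooth background plus one peak.
energy = np.linspace(0, 10, 200)
background = 50 * np.exp(-0.2 * energy)
peak = 40 * np.exp(-0.5 * ((energy - 5.0) / 0.15) ** 2)
spectrum = background + peak
nodes = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])  # peak-free regions
node_vals = 50 * np.exp(-0.2 * nodes)
est_bg = natural_cubic_spline(nodes, node_vals, energy)
net = spectrum - est_bg  # background-subtracted spectrum
```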

  19. Advanced Background Subtraction Applied to Aeroacoustic Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Horne, William C.

    2015-01-01

    An advanced form of background subtraction is presented and applied to aeroacoustic wind tunnel data. A variant of this method has seen use in other fields such as climatology and medical imaging. The technique, based on an eigenvalue decomposition of the background noise cross-spectral matrix, is robust in situations where isolated background auto-spectral levels are measured to be higher than the levels of the combined source and background signals. It also provides an alternate estimate of the cross-spectrum, which previously might have been poorly defined for low signal-to-noise ratio measurements. Simulated results indicate performance similar to conventional background subtraction when the subtracted spectra are weaker than the true contaminating background levels, and superior performance when the subtracted spectra are stronger. Experimental results show limited success in recovering signal behavior for data where conventional background subtraction fails, and they demonstrate the new technique's ability to maintain a proper coherence relationship in the modified cross-spectral matrix. Beamforming and deconvolution results indicate that the method can successfully separate sources. Results also show a reduced need for diagonal removal in phased array processing, at least for the limited data sets considered.
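    One common flavor of eigenvalue-based subtraction is sketched below under an assumption not taken from the paper: the corrected estimate is obtained by differencing the cross-spectral matrices and projecting the result back onto the positive semidefinite cone, which avoids the negative auto-spectra that plain element-wise subtraction can produce. All names and shapes are illustrative:

```python
import numpy as np

def subtract_background_csm(G: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Subtract a background cross-spectral matrix (CSM) B from a
    measured CSM G, then zero out negative eigenvalues so the result is
    a physically valid (Hermitian, positive semidefinite) CSM even when
    isolated background levels exceed the combined measurement."""
    D = G - B
    w, V = np.linalg.eigh(0.5 * (D + D.conj().T))  # enforce Hermitian
    w = np.clip(w, 0.0, None)                      # drop negative energy
    return (V * w) @ V.conj().T

rng = np.random.default_rng(0)
# Simulated snapshots: background-only vs. source + background.
snaps_bg = rng.normal(size=(4, 200)) + 1j * rng.normal(size=(4, 200))
src = rng.normal(size=(1, 200)) + 1j * rng.normal(size=(1, 200))
steer = np.ones((4, 1)) / 2.0                      # toy steering vector
snaps = steer @ src + snaps_bg
G = snaps @ snaps.conj().T / 200
B = snaps_bg @ snaps_bg.conj().T / 200
S = subtract_background_csm(G, B)                  # cleaned CSM estimate
```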

  20. Fluorescence background removal method for biological Raman spectroscopy based on empirical mode decomposition.

    PubMed

    Leon-Bejarano, Maritza; Dorantes-Mendez, Guadalupe; Ramirez-Elias, Miguel; Mendez, Martin O; Alba, Alfonso; Rodriguez-Leyva, Ildefonso; Jimenez, M

    2016-08-01

    Raman spectroscopy of biological tissue presents a fluorescence background, an undesirable effect that generates false Raman intensities. This paper proposes the application of the Empirical Mode Decomposition (EMD) method to baseline correction. EMD is a suitable approach because it is an adaptive signal processing method for nonlinear and non-stationary signal analysis that, unlike polynomial methods, does not require parameter selection. EMD performance was assessed on synthetic Raman spectra with different signal-to-noise ratios (SNR). The correlation coefficient between the synthetic Raman spectra and those recovered after EMD denoising was higher than 0.92. Additionally, twenty Raman spectra from skin were used to evaluate EMD performance, and the results were compared with the Vancouver Raman algorithm (VRA). The comparison yielded a mean square error (MSE) of 0.001554. The high correlation coefficients on synthetic spectra and the low MSE in the comparison between EMD and VRA suggest that EMD could be an effective method to remove the fluorescence background in biological Raman spectra.
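    The sifting step at the heart of EMD can be sketched as follows. Real EMD iterates this step to a stopping criterion and builds envelopes with cubic splines; this illustration is a deliberate simplification with linear envelopes and a single pass, and the signal shapes are synthetic assumptions:

```python
import numpy as np

def sift_once(x: np.ndarray) -> np.ndarray:
    """One simplified EMD sifting step: interpolate upper/lower envelopes
    through local extrema (linearly here; EMD proper uses cubic splines)
    and subtract their mean. Returns the fast candidate-IMF component."""
    t = np.arange(len(x))
    maxima = [i for i in range(1, len(x) - 1) if x[i] > x[i - 1] and x[i] > x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i] < x[i - 1] and x[i] < x[i + 1]]
    upper = np.interp(t, maxima, x[maxima])
    lower = np.interp(t, minima, x[minima])
    return x - (upper + lower) / 2.0

t = np.linspace(0, 1, 400)
raman = np.sin(2 * np.pi * 40 * t)   # fast "Raman" oscillation
baseline = 3 * t**2                  # slow fluorescence background
x = raman + baseline
imf = sift_once(x)                   # fast component, baseline largely removed
residue = x - imf                    # slow fluorescence estimate
```

    Baseline correction then discards the slow residue (and, in full EMD, the lowest-order IMFs) and keeps the fast Raman content.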

  1. Cosmological origin of anomalous radio background

    NASA Astrophysics Data System (ADS)

    Cline, James M.; Vincent, Aaron C.

    2013-02-01

    The ARCADE 2 collaboration has reported a significant excess in the isotropic radio background, whose homogeneity cannot be reconciled with clustered sources. This suggests a cosmological origin prior to structure formation. We investigate several potential mechanisms and show that injection of relativistic electrons through late decays of a metastable particle can give rise to the observed excess radio spectrum through synchrotron emission. However, constraints from the cosmic microwave background (CMB) anisotropy, on injection of charged particles and on the primordial magnetic field, present a challenge. The simplest scenario is with a ≳ 9 GeV particle decaying into e+e- at a redshift of z ~ 5, in a magnetic field of ~5 μG, which exceeds the CMB B-field constraints unless the field was generated after decoupling. Decays into exotic millicharged particles can alleviate this tension, if they emit synchrotron radiation in conjunction with a sufficiently large background magnetic field of a dark U(1)' gauge field.

  2. Validation of radiative transfer computation with Monte Carlo method for ultra-relativistic background flow

    NASA Astrophysics Data System (ADS)

    Ishii, Ayako; Ohnishi, Naofumi; Nagakura, Hiroki; Ito, Hirotaka; Yamada, Shoichi

    2017-11-01

    We developed a three-dimensional radiative transfer code for an ultra-relativistic background flow-field using the Monte Carlo (MC) method in the context of gamma-ray burst (GRB) emission. To obtain reliable results from coupled computations of MC radiation transport and relativistic hydrodynamics that can reproduce GRB emission, we validated the radiative transfer computation in the ultra-relativistic regime and assessed the appropriate simulation conditions. The code was validated through two test calculations: (1) computing in different inertial frames and (2) computing in flow-fields with discontinuous and smeared shock fronts. The simulated angular distributions and spectra were compared among three different inertial frames and found to be in good agreement with each other. When the time interval for updating the flow-field was small enough to resolve a photon mean free path into ten steps, the results were fully converged. The spectrum computed in the flow-field with a discontinuous shock front obeyed a power law in frequency whose index was positive in the range from 1 to 10 MeV. The number of photons on the high-energy side decreased with the smeared shock front because photons were scattered less immediately behind the shock wave, due to the small electron number density. A large optical depth near the shock front was needed to obtain high-energy photons through bulk Compton scattering. Even the one-dimensional structure of the shock wave could affect the results of the radiation transport computation. Although we examined the effect of the shock structure on the emitted spectrum with a large number of cells, it is difficult to employ that many computational cells per dimension in multi-dimensional simulations. Therefore, further investigation with a smaller number of cells is required to obtain realistic high-energy photons in multi-dimensional computations.

  3. Interaction of the branes in the presence of the background fields: The dynamical, nonintersecting, perpendicular, wrapped-fractional configuration

    NASA Astrophysics Data System (ADS)

    Maghsoodi, Elham; Kamani, Davoud

    2017-05-01

    We shall obtain the interaction of the Dp1- and Dp2-branes in the toroidal-orbifold space-time Tn × ℝ1,d-n-5 × ℂ2/ℤ2. The configuration of the branes is nonintersecting, perpendicular, moving-rotating, and wrapped-fractional with background fields. For this, we calculate the bosonic boundary state corresponding to a dynamical fractional-wrapped Dp-brane in the presence of the Kalb-Ramond field, a U(1) gauge potential and an open string tachyon field. The long-range behavior of the interaction amplitude will be extracted.

  4. Use of an OSSE to Evaluate Background Error Covariances Estimated by the 'NMC Method'

    NASA Technical Reports Server (NTRS)

    Errico, Ronald M.; Prive, Nikki C.; Gu, Wei

    2014-01-01

    The NMC method has proven utility for prescribing the approximate background-error covariances required by variational data assimilation systems. Here, untuned NMC-method estimates are compared with explicitly determined error covariances produced within an OSSE context by exploiting the availability of the true simulated states. Such a comparison provides insight into what kind of rescaling is required to render the NMC-method estimates usable. It is shown that the rescaling of variances and directional correlation lengths depends greatly on both pressure and latitude. In particular, some scaling coefficients appropriate in the Tropics are the reciprocal of those in the Extratropics. Also, the degree of dynamic balance is grossly overestimated by the NMC method. These results agree with previous examinations of the NMC method that used ensembles as an alternative for estimating background-error statistics.
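    The NMC method itself is simple to state: the background-error covariance is approximated by the sample covariance of differences between forecasts of two lead times (commonly 48 h and 24 h) valid at the same time. A sketch with synthetic data, where the state dimension and error-growth scale are assumptions:

```python
import numpy as np

def nmc_covariance(fc48: np.ndarray, fc24: np.ndarray) -> np.ndarray:
    """NMC-method background-error covariance estimate: sample covariance
    of (48 h minus 24 h) forecast differences valid at the same times.
    fc48, fc24 have shape (n_times, n_state)."""
    d = fc48 - fc24
    d = d - d.mean(axis=0, keepdims=True)
    return d.T @ d / (d.shape[0] - 1)

rng = np.random.default_rng(1)
fc24 = rng.normal(size=(500, 6))
fc48 = fc24 + rng.normal(scale=0.5, size=(500, 6))  # extra 24 h of error growth
B = nmc_covariance(fc48, fc24)
# In practice B is then rescaled (variances, correlation lengths) with
# latitude- and pressure-dependent factors, as the article discusses.
```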

  5. Nuclear Quadrupole Resonance (NQR) Method and Probe for Generating RF Magnetic Fields in Different Directions to Distinguish NQR from Acoustic Ringing Induced in a Sample

    DTIC Science & Technology

    1997-08-01

    77,719. TITLE OF THE INVENTION: NUCLEAR QUADRUPOLE RESONANCE (NQR) METHOD AND PROBE FOR GENERATING RF MAGNETIC FIELDS IN DIFFERENT DIRECTIONS TO ... DISTINGUISH NQR FROM ACOUSTIC RINGING INDUCED IN A SAMPLE. BACKGROUND OF THE INVENTION: 1. Field of the Invention. The present invention relates to a ... nuclear quadrupole resonance (NQR) method and probe for generating RF magnetic fields in different directions towards a sample. More specifically ...

  6. Estimating background-subtracted fluorescence transients in calcium imaging experiments: a quantitative approach.

    PubMed

    Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe

    2013-08-01

    Calcium imaging has become a routine technique in neuroscience for subcellular- to network-level investigations. Rapid progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would benefit most of the calcium imaging research field. A method for estimating background-subtracted fluorescence transients that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when the background fluorescence is homogeneous. An automatic procedure for detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Identification of source velocities on 3D structures in non-anechoic environments: Theoretical background and experimental validation of the inverse patch transfer functions method

    NASA Astrophysics Data System (ADS)

    Aucejo, M.; Totaro, N.; Guyader, J.-L.

    2010-08-01

    In noise control, identification of the source velocity field remains a major open problem. Consequently, methods such as nearfield acoustical holography (NAH), principal source projection, the inverse frequency response function and hybrid NAH have been developed. However, these methods require free-field conditions that are often difficult to achieve in practice. This article presents an alternative method, known as inverse patch transfer functions (iPTF), designed to identify source velocities and developed in the framework of the European SILENCE project. The method rests on three elements: the definition of a virtual cavity; the double measurement of the pressure and particle velocity fields on the aperture surfaces of this volume, which are divided into elementary areas called patches; and the inversion of impedance matrices computed numerically from a modal basis obtained by FEM. Theoretically, the method is applicable to sources with complex 3D geometries, and measurements can be carried out in a non-anechoic environment, even in the presence of other stationary sources outside the virtual cavity. In the present paper, the theoretical background of the iPTF method is described, and numerical and experimental results for a source with simple geometry (two baffled pistons driven in antiphase) are presented and discussed.

  8. Nonrelativistic trace and diffeomorphism anomalies in particle number background

    NASA Astrophysics Data System (ADS)

    Auzzi, Roberto; Baiguera, Stefano; Nardelli, Giuseppe

    2018-04-01

    Using the heat kernel method, we compute nonrelativistic trace anomalies for Schrödinger theories in flat spacetime, with a generic background gauge field for the particle number symmetry, both for a free scalar and a free fermion. The result is genuinely nonrelativistic, and it has no counterpart in the relativistic case. Contrary to naive expectations, the anomaly is not gauge invariant; this is similar to the nongauge covariance of the non-Abelian relativistic anomaly. We also show that, in the same background, the gravitational anomaly for a nonrelativistic scalar vanishes.

  9. Inhibition of recombinase polymerase amplification by background DNA: a lateral flow-based method for enriching target DNA.

    PubMed

    Rohrman, Brittany; Richards-Kortum, Rebecca

    2015-02-03

    Recombinase polymerase amplification (RPA) may be used to detect a variety of pathogens, often after minimal sample preparation. However, previous work has shown that whole blood inhibits RPA. In this paper, we show that the concentrations of background DNA found in whole blood prevent the amplification of target DNA by RPA. First, using an HIV-1 RPA assay with known concentrations of nonspecific background DNA, we show that RPA tolerates more background DNA when higher HIV-1 target concentrations are present. Then, using three additional assays, we demonstrate that the maximum amount of background DNA that may be tolerated in RPA reactions depends on the DNA sequences used in the assay. We also show that changing the RPA reaction conditions, such as incubation time and primer concentration, has little effect on the ability of RPA to function when high concentrations of background DNA are present. Finally, we develop and characterize a lateral flow-based method for enriching the target DNA concentration relative to the background DNA concentration. This sample processing method enables RPA of 10⁴ copies of HIV-1 DNA in a background of 0-14 μg of background DNA. Without lateral flow sample enrichment, the maximum amount of background DNA tolerated is 2 μg when 10⁶ copies of HIV-1 DNA are present. This method requires no heating or other external equipment, may be integrated with upstream DNA extraction and purification processes, is compatible with the components of lysed blood, and has the potential to detect HIV-1 DNA in infant whole blood with high proviral loads.

  10. Impact of a primordial magnetic field on cosmic microwave background B modes with weak lensing

    NASA Astrophysics Data System (ADS)

    Yamazaki, Dai G.

    2018-05-01

    We discuss the manner in which the primordial magnetic field (PMF) suppresses the cosmic microwave background (CMB) B mode due to the weak-lensing (WL) effect. The WL effect depends on the lensing potential (LP) caused by matter perturbations, whose distribution at cosmological scales is given by the matter power spectrum (MPS). Therefore, the WL effect on the CMB B mode is affected by the MPS. Considering the effect of the ensemble-average energy density of the PMF, which we call "the background PMF," on the MPS, the amplitude of the MPS is suppressed in the wave number range k > 0.01 h Mpc⁻¹. The MPS affects the LP and the WL effect on the CMB B mode; however, the PMF can damp this effect. Previous studies of the CMB B mode with the PMF have only considered the vector and tensor modes. These modes boost the CMB B mode in the multipole range ℓ > 1000, whereas the background PMF damps the CMB B mode through the WL effect over the entire multipole range. The matter density in the Universe controls the WL effect. Therefore, when we constrain the PMF and the matter density parameters from cosmological observational data sets that include the CMB B mode, we expect degeneracy between these parameters. The CMB B mode also provides important information on background gravitational waves, inflation theory, matter density fluctuations, and structure formation at cosmological scales through the cosmological parameter search. To study these topics and correctly constrain the cosmological parameters from cosmological observations that include the CMB B mode, the background PMF must be properly taken into account.

  11. Improved background rejection in neutrinoless double beta decay experiments using a magnetic field in a high pressure xenon TPC

    NASA Astrophysics Data System (ADS)

    Renner, J.; Cervera, A.; Hernando, J. A.; Imzaylov, A.; Monrabal, F.; Muñoz, J.; Nygren, D.; Gomez-Cadenas, J. J.

    2015-12-01

    We demonstrate that the application of an external magnetic field could lead to an improved background rejection in neutrinoless double-beta (0νββ) decay experiments using a high-pressure xenon (HPXe) TPC. HPXe chambers are capable of imaging electron tracks, a feature that enhances the separation between signal events (the two electrons emitted in the 0νββ decay of 136Xe) and background events, arising chiefly from single electrons of kinetic energy compatible with the end-point of the 0νββ decay (Qββ). Applying an external magnetic field of sufficiently high intensity (in the range of 0.5-1 Tesla for operating pressures in the range of 5-15 atmospheres) causes the electrons to produce helical tracks. Assuming the tracks can be properly reconstructed, the sign of the curvature can be determined at several points along these tracks, and such information can be used to separate signal (0νββ) events containing two electrons producing a track with two different directions of curvature from background (single-electron) events producing a track that should spiral in a single direction. Due to electron multiple scattering, this strategy is not perfectly efficient on an event-by-event basis, but a statistical estimator can be constructed which can be used to reject background events by one order of magnitude at a moderate cost (about 30%) in signal efficiency. Combining this estimator with the excellent energy resolution and topological signature identification characteristic of the HPXe TPC, it is possible to reach a background rate of less than one count per ton-year of exposure. Such a low background rate is an essential feature of the next generation of 0νββ experiments, aiming to fully explore the inverse hierarchy of neutrino masses.
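    The sign-of-curvature test can be illustrated geometrically: for a track projected into the plane transverse to B, the z-component of the cross product of successive segments gives the local turning direction; a single spiraling electron keeps one sign, while two back-to-back electrons give a sign flip along the track. This is a toy sketch, not the collaboration's reconstruction code:

```python
import numpy as np

def curvature_signs(track: np.ndarray) -> np.ndarray:
    """Sign of the in-plane curvature at interior points of a 2D track
    (shape (n, 2)), from the z-component of the cross product of
    successive segment vectors: +1 turns left, -1 turns right."""
    seg = np.diff(track, axis=0)
    cross_z = seg[:-1, 0] * seg[1:, 1] - seg[:-1, 1] * seg[1:, 0]
    return np.sign(cross_z)

# A counter-clockwise circular arc: constant positive curvature sign.
theta = np.linspace(0, np.pi, 50)
circle = np.c_[np.cos(theta), np.sin(theta)]
signs = curvature_signs(circle)
```

    In practice multiple scattering flips individual signs at random, which is why the abstract builds a statistical estimator over many points rather than trusting any single sign.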

  12. Sensitivity-based virtual fields for the non-linear virtual fields method

    NASA Astrophysics Data System (ADS)

    Marek, Aleksander; Davis, Frances M.; Pierron, Fabrice

    2017-09-01

    The virtual fields method is an approach to inversely identify material parameters using full-field deformation data. In this manuscript, a new set of automatically-defined virtual fields for non-linear constitutive models has been proposed. These new sensitivity-based virtual fields reduce the influence of noise on the parameter identification. The sensitivity-based virtual fields were applied to a numerical example involving small strain plasticity; however, the general formulation derived for these virtual fields is applicable to any non-linear constitutive model. To quantify the improvement offered by these new virtual fields, they were compared with stiffness-based and manually defined virtual fields. The proposed sensitivity-based virtual fields were consistently able to identify plastic model parameters and outperform the stiffness-based and manually defined virtual fields when the data was corrupted by noise.

  13. Teaching Geographic Field Methods Using Paleoecology

    ERIC Educational Resources Information Center

    Walsh, Megan K.

    2014-01-01

    Field-based undergraduate geography courses provide numerous pedagogical benefits including an opportunity for students to acquire employable skills in an applied context. This article presents one unique approach to teaching geographic field methods using paleoecological research. The goals of this course are to teach students key geographic…

  14. Background colour matching by a crab spider in the field: a community sensory ecology perspective.

    PubMed

    Defrize, Jérémy; Théry, Marc; Casas, Jérôme

    2010-05-01

    The question of whether a species matches the colour of its natural background from the perspective of the correct receiver is complex to address for several reasons; however, the answer may provide invaluable support for functional interpretations of colour. In most cases, little is known about the identity and visual sensory abilities of the correct receiver and the precise location at which interactions take place in the field, in particular for mimetic systems. In this study, we focused on Misumena vatia, a crab spider that meets the criteria for assessing crypsis better than many other models and is claimed to use colour changes for both aggressive and protective crypsis. We carried out a systematic field survey to quantitatively assess the exactness of background colour matching in M. vatia with respect to the visual systems of many of its receivers within the community. We applied physiological models of bird, bee and blowfly colour vision, using flower and spider spectral reflectances measured with a spectroradiometer. We observed that crypsis at long distance is systematically achieved, exclusively through achromatic contrast, in both bee and bird vision. At short distance, M. vatia is mostly chromatically detectable to bees and birds, whatever the substrate; for bees, however, spiders can be either poorly discriminable or quite visible depending on the substrate. Spiders are always chromatically undetectable to blowflies. We discuss the biological relevance of these results in both defensive and aggressive contexts of crypsis within a community sensory perspective.

  15. Raman background photobleaching as a possible method of cancer diagnostics

    NASA Astrophysics Data System (ADS)

    Brandt, Nikolai N.; Brandt, Nikolai B.; Chikishev, Andrey Y.; Gangardt, Mihail G.; Karyakina, Nina F.

    2001-06-01

    The kinetics of background photobleaching in Raman spectra of aqueous solutions of the plant toxins ricin and ricin agglutinin, the ricin binding subunit, and normal and malignant human blood serum were measured. The spectra were excited with both cw and pulsed laser radiation. The Raman background changes upon laser irradiation, and the background intensity is lower for samples of small molecular weight. Cyclization of amino acid residues in the toxin molecules, as well as in human blood serum, may be a source of the Raman background. A model of the background photobleaching is proposed, and the differences in photobleaching kinetics under cw and pulsed laser radiation are discussed. It is shown that Raman background photobleaching can be highly informative for cancer diagnostics.

  16. Parametrically driven scalar field in an expanding background

    NASA Astrophysics Data System (ADS)

    Yanez-Pagans, Sergio; Urzagasti, Deterlino; Oporto, Zui

    2017-10-01

    We study the existence and dynamic behavior of localized and extended structures of a massive scalar inflaton field ϕ in 1+1 dimensions in the framework of an expanding universe with constant Hubble parameter. We introduce a parametric forcing, produced by another quantum scalar field ψ, over the effective mass squared around the minimum of the inflaton potential. For this purpose, we study the system in the context of the cubic-quintic complex Ginzburg-Landau equation and derive the amplitude equation associated with the cosmological scalar field equation, which near the parametric resonance allows us to find the field amplitude. We find homogeneous null solutions, flat-top expanding solitons, and dark soliton patterns. No persistent non-null solutions are found in the absence of parametric forcing, and divergent solutions are obtained when the forcing amplitude is greater than 4/3.

  17. Scalar field dark energy with a minimal coupling in a spherically symmetric background

    NASA Astrophysics Data System (ADS)

    Matsumoto, Jiro

    Dark energy models and modified gravity theories have been actively studied, and the solar-system behavior of some of these models has been carefully investigated. However, the isotropic solutions of the field equations in simple dark energy models, e.g., the quintessence model without matter coupling, have not been well investigated. One reason may be the nonlinearity of the field equations. In this paper, a method to evaluate the solutions of the field equations is constructed, and it is shown that some models can easily pass the solar system tests, whereas others are constrained by them.

  18. Research on infrared ship detection method in sea-sky background

    NASA Astrophysics Data System (ADS)

    Tang, Da; Sun, Gang; Wang, Ding-he; Niu, Zhao-dong; Chen, Zeng-ping

    2013-09-01

    An approach to infrared ship detection based on sea-sky-line (SSL) detection, ROI extraction and feature recognition is proposed in this paper. First, since distant ships are expected to lie adjacent to the SSL, the SSL is detected to find potential target areas: a Radon transform is performed on the gradient image to choose candidate SSLs, and the final detection is selected by fuzzy synthetic evaluation values. Second, exploiting the condition that a recognizable target must differ sufficiently from the background in an infrared image, two gradient masks are created and refined as practical guidelines for eliminating false alarms. Third, ROIs near the SSL are extracted using a multi-grade segmentation and fusion method after image sharpening, and unsuitable candidates are screened out according to the gradient masks and ROI shape. Finally, the remaining ROIs are segmented by a two-stage modified OTSU method, and a target confidence is calculated as a measure of the likelihood that a detection is a true target. Compared with other ship detection methods, the proposed method handles bipolar targets, offers good practicality and accuracy, and achieves a satisfactory detection speed. Detection experiments on 200,000 frames show that the proposed method is widely applicable and robust to interference and noise, with a detection rate above 95%, meeting engineering needs.
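    The first stage can be sketched in a deliberately simplified form: when the horizon is nearly horizontal, the Radon-transform search over candidate lines degenerates to picking the row of maximum mean vertical gradient. The synthetic scene and all parameters below are assumptions, not the paper's implementation:

```python
import numpy as np

def detect_ssl_row(img: np.ndarray) -> int:
    """Simplified sea-sky-line detector for a roughly horizontal horizon:
    return the row index that maximizes the mean vertical gradient
    magnitude. (The paper applies a Radon transform to the gradient
    image, which also handles tilted horizons; this row-wise version is
    the degenerate zero-angle case.)"""
    grad = np.abs(np.diff(img.astype(float), axis=0))
    return int(np.argmax(grad.mean(axis=1)))

# Synthetic sea-sky scene: bright sky above row 40, darker sea below.
img = np.full((100, 120), 30.0)
img[:40, :] = 200.0
img += np.random.default_rng(2).normal(scale=2.0, size=img.shape)
ssl = detect_ssl_row(img)  # index of the gradient between sky and sea
```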

  19. An analytical method to calculate equivalent fields to irregular symmetric and asymmetric photon fields.

    PubMed

    Tahmasebi Birgani, Mohamad J; Chegeni, Nahid; Zabihzadeh, Mansoor; Hamzian, Nima

    2014-01-01

    The equivalent field is frequently used for central axis depth-dose calculations of rectangular and irregularly shaped photon beams. As most of the proposed models for calculating the equivalent square field are dosimetry based, a simple physically based method to calculate the equivalent square field size was used as the basis of this study. A table of the sides of the equivalent square or rectangular fields was constructed and then compared with the well-known tables of BJR and Venselaar et al., with average relative errors of 2.5 ± 2.5% and 1.5 ± 1.5%, respectively. To evaluate the accuracy of this method, percentage depth doses (PDDs) were measured for several irregular symmetric and asymmetric treatment fields and their equivalent squares on a Siemens Primus Plus linear accelerator at both energies, 6 and 18 MV. The mean relative difference of the PDDs between these fields and their equivalent squares was approximately 1% or less. As a result, this method can be employed to calculate the equivalent field not only for rectangular fields but also for any irregular symmetric or asymmetric field. © 2013 American Association of Medical Dosimetrists. All rights reserved.
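    The classic physical starting point for equivalent squares is the area-to-perimeter (4A/P) rule, which for an a × b rectangle gives side s = 2ab/(a+b). The abstract does not state which physical rule the authors' table was built from, so the sketch below is only the textbook version for comparison:

```python
def equivalent_square_side(a: float, b: float) -> float:
    """Classic 4A/P rule: a rectangular a x b field and the square field
    with the same area-to-perimeter ratio have similar central-axis
    depth doses. Side s = 4*(a*b) / (2*(a + b)) = 2ab/(a + b)."""
    return 2.0 * a * b / (a + b)

side = equivalent_square_side(10.0, 20.0)  # about 13.3 cm for a 10 x 20 cm field
```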

  20. Notification: Background Investigation Services

    EPA Pesticide Factsheets

    Project #OA-FY15-0029, February 26, 2015. The Office of Inspector General (OIG) for the U.S. Environmental Protection Agency (EPA) plans to begin field work for our audit of background investigation services.

  1. Background simulations of the wide-field coded-mask camera for X-/Gamma-ray of the French-Chinese mission SVOM

    NASA Astrophysics Data System (ADS)

    Godet, Olivier; Barret, Didier; Paul, Jacques; Sizun, Patrick; Mandrou, Pierre; Cordier, Bertrand

    SVOM (Space Variable Object Monitor) is a French-Chinese mission dedicated to the study of high-redshift GRBs, which is expected to be launched in 2012. The anti-Sun pointing strategy of SVOM, along with a strong and integrated ground segment consisting of two wide-field robotic telescopes covering the near-IR and optical, will optimise ground-based GRB follow-up by the largest telescopes and thus the measurement of spectroscopic redshifts. The central instrument of the science payload will be an innovative wide-field coded-mask camera for X-/Gamma-rays (4-250 keV) responsible for triggering and localising GRBs with an accuracy better than 10 arc-minutes. Such an instrument will be background-dominated, so it is essential to estimate the expected in-orbit background level during the early phase of the instrument design in order to ensure good science performance. We present our Monte-Carlo simulator, which computes the background spectrum taking into account the mass model of the camera and the main components of the space environment encountered in orbit by the satellite. From that computation, we show that the current design of the CXG camera will be more sensitive to high-redshift GRBs than the Swift-BAT, thanks to its low-energy threshold of 4 keV.

  2. A novel infrared small moving target detection method based on tracking interest points under complicated background

    NASA Astrophysics Data System (ADS)

    Dong, Xiabin; Huang, Xinsheng; Zheng, Yongbin; Bai, Shengjian; Xu, Wanying

    2014-07-01

    Infrared moving target detection is an important part of infrared technology. We introduce a novel method for detecting small moving infrared targets against complicated backgrounds, based on tracking interest points. First, Difference of Gaussians (DOG) filters are used to detect a group of interest points (including the moving targets). Second, a small-target tracking method inspired by the Human Visual System (HVS) is used to track these interest points over several frames, yielding the correlations between interest points in the first and last frames. Finally, a new clustering method, R-means, is proposed to divide the interest points according to these correlations into two groups: target points and background points. The target-to-clutter ratio (TCR) and receiver operating characteristic (ROC) curves were computed experimentally to compare the performance of the proposed method with that of five other sophisticated methods. The results show that the proposed method discriminates targets from clutter better and has a lower false-alarm rate than existing moving target detection methods.
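    A minimal NumPy sketch of the first stage described above, DOG interest point detection. The function names, parameter values, and the 3 × 3 local-maximum test are illustrative assumptions, not the authors' settings:

    ```python
    import numpy as np

    def gaussian_kernel(sigma):
        # 1-D Gaussian kernel truncated at 3 sigma, normalised to unit sum.
        radius = int(3 * sigma)
        x = np.arange(-radius, radius + 1, dtype=float)
        k = np.exp(-x**2 / (2 * sigma**2))
        return k / k.sum()

    def blur(img, sigma):
        # Separable Gaussian blur via row and column 1-D convolutions.
        k = gaussian_kernel(sigma)
        tmp = np.apply_along_axis(np.convolve, 1, img.astype(float), k, mode="same")
        return np.apply_along_axis(np.convolve, 0, tmp, k, mode="same")

    def dog_interest_points(img, sigma1=1.0, sigma2=2.0, thresh=0.1):
        # Difference-of-Gaussians response; keep pixels that exceed the
        # threshold and are maxima of their 3x3 neighbourhood.
        d = blur(img, sigma1) - blur(img, sigma2)
        pts = []
        for i in range(1, d.shape[0] - 1):
            for j in range(1, d.shape[1] - 1):
                if d[i, j] >= thresh and d[i, j] == d[i-1:i+2, j-1:j+2].max():
                    pts.append((i, j))
        return pts
    ```

    On a synthetic frame with a small bright blob, the detector returns the blob centre; the paper's pipeline then tracks such points and separates them into targets and clutter with R-means.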

  3. Position sensitive detection of neutrons in high radiation background field.

    PubMed

    Vavrik, D; Jakubek, J; Pospisil, S; Vacik, J

    2014-01-01

    We present the development of a high-resolution position sensitive device for the detection of slow neutrons in environments with extremely high γ and e⁻ radiation backgrounds. We make use of a planar silicon pixelated (pixel size: 55 × 55 μm²) spectroscopic Timepix detector adapted for neutron detection by means of a very thin ¹⁰B converter placed on the detector surface. We demonstrate that the electromagnetic radiation background can be discriminated from the neutron signal by exploiting the fact that each particle type produces characteristic ionization tracks in the pixelated detector. Individual tracks can be distinguished by their 2D shape (in the detector plane) and spectroscopic response using single event analysis. A Cd sheet served as a thermal-neutron stopper as well as an intense source of gamma rays and energetic electrons. Highly efficient discrimination was achieved even at a very low neutron to electromagnetic background ratio of about 10⁻⁴.

  4. Improvements in Technique of NMR Imaging and NMR Diffusion Measurements in the Presence of Background Gradients.

    NASA Astrophysics Data System (ADS)

    Lian, Jianyu

    In this work, a modification of the cosine current distribution rf coil, PCOS, has been introduced and tested. The coil produces a very homogeneous rf magnetic field, is inexpensive to build, and is easy to tune to multiple resonance frequencies. The geometrical parameters of the coil are optimized to produce the most homogeneous rf field over a large volume. To avoid rf field distortion when the coil length is comparable to a quarter wavelength, a parallel PCOS coil is proposed and discussed. For testing rf coils and correcting B₁ in NMR experiments, a simple, rugged and accurate NMR rf field mapping technique has been developed. The method has been tested and used in 1D, 2D, 3D and in vivo rf mapping experiments, and has proven very useful in the design of rf coils. To preserve the linear relation between the rf output applied to an rf coil and the modulating input of the rf modulating-amplifying system of an NMR imaging spectrometer, a quadrature feedback loop is employed in an rf modulator with two orthogonal rf channels to correct the amplitude and phase non-linearities caused by the rf components in the rf system. The modulator is very linear over a large range and can generate an arbitrary rf shape. A diffusion imaging sequence has been developed for measuring and imaging diffusion in the presence of background gradients. Cross terms between the diffusion sensitizing gradients and background or imaging gradients can complicate diffusion measurement and make the interpretation of NMR diffusion data ambiguous; these have been eliminated in this method. Further, the background gradients have been measured and imaged. A dipole random distribution model has been established to study the background magnetic fields ΔB and background magnetic gradients G₀ produced by small particles in a sample when it is in a B₀ field. From this model, the minimum distance that a spin can approach a particle can be determined by measuring

  5. Low Field Squid MRI Devices, Components and Methods

    NASA Technical Reports Server (NTRS)

    Hahn, Inseob (Inventor); Penanen, Konstantin I. (Inventor); Eom, Byeong H. (Inventor)

    2013-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field (SQUID)-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed is a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  6. Low Field Squid MRI Devices, Components and Methods

    NASA Technical Reports Server (NTRS)

    Penanen, Konstantin I. (Inventor); Eom, Byeong H. (Inventor); Hahn, Inseob (Inventor)

    2014-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field (SQUID)-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed is a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  7. Low field SQUID MRI devices, components and methods

    NASA Technical Reports Server (NTRS)

    Penanen, Konstantin I. (Inventor); Eom, Byeong H. (Inventor); Hahn, Inseob (Inventor)

    2011-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field (SQUID)-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed is a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  8. Low field SQUID MRI devices, components and methods

    NASA Technical Reports Server (NTRS)

    Penanen, Konstantin I. (Inventor); Eom, Byeong H (Inventor); Hahn, Inseob (Inventor)

    2010-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field (SQUID)-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed is a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  9. Application of Canonical Effective Methods to Background-Independent Theories

    NASA Astrophysics Data System (ADS)

    Buyukcam, Umut

    Effective formalisms play an important role in analyzing phenomena above some given length scale when complete theories are not accessible. In diverse exotic but physically important cases, the usual path-integral techniques used in a standard Quantum Field Theory approach seldom serve as adequate tools. This thesis presents a new effective method for quantum systems, called the Canonical Effective Method, which is particularly widely applicable in background-independent theories, as in the case of gravitational phenomena. The central purpose of this work is to employ these techniques to obtain semi-classical dynamics from canonical quantum gravity theories. An application to non-associative quantum mechanics is developed and testable results are obtained. Types of non-associative algebras relevant for magnetic-monopole systems are discussed. Possible modifications of the hypersurface deformation algebra and the emergence of effective space-times are presented.

  10. Percent body fat estimations in college women using field and laboratory methods: a three-compartment model approach

    PubMed Central

    Moon, Jordan R; Hull, Holly R; Tobkin, Sarah E; Teramoto, Masaru; Karabulut, Murat; Roberts, Michael D; Ryan, Eric D; Kim, So Jung; Dalbo, Vincent J; Walter, Ashley A; Smith, Abbie T; Cramer, Joel T; Stout, Jeffrey R

    2007-01-01

    Background Methods used to estimate percent body fat can be classified as a laboratory or field technique. However, the validity of these methods compared to multiple-compartment models has not been fully established. This investigation sought to determine the validity of field and laboratory methods for estimating percent fat (%fat) in healthy college-age women compared to the Siri three-compartment model (3C). Methods Thirty Caucasian women (21.1 ± 1.5 yrs; 164.8 ± 4.7 cm; 61.2 ± 6.8 kg) had their %fat estimated by BIA using the BodyGram™ computer program (BIA-AK) and population-specific equation (BIA-Lohman), NIR (Futrex® 6100/XL), a quadratic (SF3JPW) and linear (SF3WB) skinfold equation, air-displacement plethysmography (BP), and hydrostatic weighing (HW). Results All methods produced acceptable total error (TE) values compared to the 3C model. Both laboratory methods produced similar TE values (HW, TE = 2.4%fat; BP, TE = 2.3%fat) when compared to the 3C model, though a significant constant error (CE) was detected for HW (1.5%fat, p ≤ 0.006). The field methods produced acceptable TE values ranging from 1.8 – 3.8 %fat. BIA-AK (TE = 1.8%fat) yielded the lowest TE among the field methods, while BIA-Lohman (TE = 2.1%fat) and NIR (TE = 2.7%fat) produced lower TE values than both skinfold equations (TE > 2.7%fat) compared to the 3C model. Additionally, the SF3JPW %fat estimation equation resulted in a significant CE (2.6%fat, p ≤ 0.007). Conclusion Data suggest that the BP and HW are valid laboratory methods when compared to the 3C model to estimate %fat in college-age Caucasian women. When the use of a laboratory method is not feasible, NIR, BIA-AK, BIA-Lohman, SF3JPW, and SF3WB are acceptable field methods to estimate %fat in this population. PMID:17988393

  11. A New Moving Object Detection Method Based on Frame-difference and Background Subtraction

    NASA Astrophysics Data System (ADS)

    Guo, Jiajia; Wang, Junping; Bai, Ruixue; Zhang, Yao; Li, Yong

    2017-09-01

    Although many moving object detection methods have been proposed, moving object extraction remains a core problem in video surveillance. In complex real-world scenes, false detections, missed detections, and cavities inside the detected object body still occur. To address incomplete detection of moving objects, a new detection method combining an improved frame difference with Gaussian mixture background subtraction is proposed in this paper. To make the detection more complete and accurate, image repair and morphological processing techniques, which are spatial compensations, are applied in the proposed method. Experimental results show that our method can effectively eliminate ghosts and noise and fill the cavities of the moving object. Compared with four other moving object detection methods (GMM, ViBe, frame difference, and a method from the literature), the proposed method improves the efficiency and accuracy of detection.
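    A toy NumPy sketch of the combination idea, frame differencing fused with a running-average background model. The thresholds, the update rule, and the OR fusion are illustrative assumptions; the paper itself uses an improved frame difference, a Gaussian mixture model, image repair, and morphology:

    ```python
    import numpy as np

    def detect_moving(frames, alpha=0.05, diff_t=20, bg_t=25):
        """Fuse frame differencing with a running-average background model.
        Returns one boolean foreground mask per frame after the first."""
        bg = frames[0].astype(float)
        prev = frames[0].astype(float)
        masks = []
        for f in frames[1:]:
            f = f.astype(float)
            frame_diff = np.abs(f - prev) > diff_t   # catches fast motion
            bg_diff = np.abs(f - bg) > bg_t          # catches slow/stopped objects
            mask = frame_diff | bg_diff
            # update the background only where no motion was detected
            bg = np.where(mask, bg, (1 - alpha) * bg + alpha * f)
            masks.append(mask)
            prev = f
        return masks
    ```

    The two cues are complementary: frame differencing misses objects that stop, while background subtraction alone adapts too slowly to scene changes; the paper's spatial compensations then fill the interior cavities this kind of mask leaves.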

  12. 2010 AUGUST 1-2 SYMPATHETIC ERUPTIONS. I. MAGNETIC TOPOLOGY OF THE SOURCE-SURFACE BACKGROUND FIELD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Titov, V. S.; Mikic, Z.; Toeroek, T.

    2012-11-01

    A sequence of apparently coupled eruptions was observed on 2010 August 1-2 by Solar Dynamics Observatory and STEREO. The eruptions were closely synchronized with one another, even though some of them occurred at widely separated locations. In an attempt to identify a plausible reason for such synchronization, we study the large-scale structure of the background magnetic configuration. The coronal field was computed from the photospheric magnetic field observed at the appropriate time period by using the potential field source-surface model. We investigate the resulting field structure by analyzing the so-called squashing factor calculated at the photospheric and source-surface boundaries, as well as at different coronal cross-sections. Using this information as a guide, we determine the underlying structural skeleton of the configuration, including separatrix and quasi-separatrix surfaces. Our analysis reveals, in particular, several pseudo-streamers in the regions where the eruptions occurred. Of special interest to us are the magnetic null points and separators associated with the pseudo-streamers. We propose that magnetic reconnection triggered along these separators by the first eruption likely played a key role in establishing the assumed link between the sequential eruptions. The present work substantiates our recent simplified magnetohydrodynamic model of sympathetic eruptions and provides a guide for further deeper study of these phenomena. Several important implications of our results for the S-web model of the slow solar wind are also addressed.
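    For reference, the squashing factor mentioned above quantifies the local distortion of the field-line mapping; for a mapping whose Jacobian has elements a, b, c, d it is conventionally defined (following Titov and co-workers) as

    ```latex
    Q = \frac{a^2 + b^2 + c^2 + d^2}{\lvert a d - b c \rvert},
    ```

    with Q ≥ 2 everywhere and quasi-separatrix layers identified as the regions where Q ≫ 2.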

  13. Method for making field-structured memory materials

    DOEpatents

    Martin, James E.; Anderson, Robert A.; Tigges, Chris P.

    2002-01-01

    A method of forming a dual-level memory material using field structured materials. The field structured materials are formed from a dispersion of ferromagnetic particles in a polymerizable liquid medium, such as a urethane acrylate-based photopolymer, which are applied as a film to a support and then exposed in selected portions of the film to an applied magnetic or electric field. The field can be applied either uniaxially or biaxially at field strengths up to 150 G or higher to form the field structured materials. After polymerizing the field-structure materials, a magnetic field can be applied to selected portions of the polymerized field-structured material to yield a dual-level memory material on the support, wherein the dual-level memory material supports read-and-write binary data memory and write once, read many memory.

  14. Position sensitive detection of neutrons in high radiation background field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vavrik, D., E-mail: vavrik@itam.cas.cz; Institute of Theoretical and Applied Mechanics, Academy of Sciences of the Czech Republic, Prosecka 76, 190 00 Prague 9; Jakubek, J.

    We present the development of a high-resolution position sensitive device for detection of slow neutrons in the environment of extremely high γ and e⁻ radiation background. We make use of a planar silicon pixelated (pixel size: 55 × 55 μm²) spectroscopic Timepix detector adapted for neutron detection utilizing a very thin ¹⁰B converter placed onto the detector surface. We demonstrate that electromagnetic radiation background can be discriminated from the neutron signal utilizing the fact that each particle type produces characteristic ionization tracks in the pixelated detector. Particular tracks can be distinguished by their 2D shape (in the detector plane) and spectroscopic response using single event analysis. A Cd sheet served as a thermal neutron stopper as well as an intense source of gamma rays and energetic electrons. Highly efficient discrimination was successful even at a very low neutron to electromagnetic background ratio of about 10⁻⁴.

  15. [Research on the temperature field detection method of hot forging based on long-wavelength infrared spectrum].

    PubMed

    Zhang, Yu-Cun; Wei, Bin; Fu, Xian-Bin

    2014-02-01

    A temperature field detection method based on the long-wavelength infrared spectrum is proposed for hot forging in the present paper. The method combines primary spectrum pyrometry with a three-stage FP-cavity LCTF. By optimizing the solutions of three groups of nonlinear equations in the mathematical model of temperature detection, errors are reduced, making the measurement results more objective and accurate. The three-stage FP-cavity LCTF system was designed on the principle of crystal birefringence and enables rapid selection of any wavelength within a certain range, making the response of the temperature measuring system fast and accurate. As a result, without knowing the emissivity of the hot forging, the method can acquire accurate temperature field information while effectively suppressing the background radiation around the hot forging and the ambient light that degrade detection accuracy. Finally, MATLAB results showed that the infrared spectrum passed by the three-stage FP-cavity LCTF meets the design requirements, and experiments verified the feasibility of the temperature measuring method. Compared with a traditional single-band thermal infrared imager, the accuracy of the measurement results is improved.
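    For illustration only, the emissivity cancellation that multi-wavelength pyrometry exploits can be sketched for the simplest two-colour (ratio) case under the Wien approximation; the function names are hypothetical and the paper's primary spectrum pyrometry with three wavelength groups is more elaborate:

    ```python
    import math

    C2 = 1.4388e-2  # second radiation constant, m*K

    def wien_intensity(lam, T, eps=1.0):
        # Wien-approximation spectral radiance, up to a constant factor
        # that cancels in the ratio (lam in metres, T in kelvin).
        return eps * lam**-5 * math.exp(-C2 / (lam * T))

    def ratio_temperature(i1, i2, lam1, lam2):
        # Two-colour pyrometry: for a grey body the emissivity cancels
        # in i1/i2, leaving an equation solvable for T.
        return C2 * (1.0 / lam1 - 1.0 / lam2) / (
            5.0 * math.log(lam2 / lam1) - math.log(i1 / i2)
        )
    ```

    Taking the ratio of two long-wave IR bands removes the unknown (grey-body) emissivity, which is the same motivation the abstract gives for avoiding an emissivity input.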

  16. Comparison of on-site field measured inorganic arsenic in rice with laboratory measurements using a field deployable method: Method validation.

    PubMed

    Mlangeni, Angstone Thembachako; Vecchi, Valeria; Norton, Gareth J; Raab, Andrea; Krupp, Eva M; Feldmann, Joerg

    2018-10-15

    A commercial arsenic field kit designed to measure inorganic arsenic (iAs) in water was modified into a field deployable method (FDM) to measure iAs in rice. While the method has been validated to give precise and accurate results in the laboratory, its on-site field performance had not been evaluated. This study was designed to test the method on-site in Malawi in order to evaluate its accuracy and precision in determining iAs on-site, by comparison with a validated reference method, and to provide original data on inorganic arsenic in Malawian rice and rice-based products. The method was validated against the established laboratory-based HPLC-ICP-MS. Statistical tests indicated no significant differences between on-site and laboratory iAs measurements determined using the FDM (p = 0.263, α = 0.05), or between on-site measurements and measurements determined using HPLC-ICP-MS (p = 0.299, α = 0.05). The method allows quick (within 1 h) and efficient on-site screening of iAs concentrations in rice. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Optimal background matching camouflage.

    PubMed

    Michalis, Constantine; Scott-Samuel, Nicholas E; Gibson, David P; Cuthill, Innes C

    2017-07-12

    Background matching is the most familiar and widespread camouflage strategy: avoiding detection by having a similar colour and pattern to the background. Optimizing background matching is straightforward in a homogeneous environment, or when the habitat has very distinct sub-types and there is divergent selection leading to polymorphism. However, most backgrounds have continuous variation in colour and texture, so what is the best solution? Not all samples of the background are likely to be equally inconspicuous, and laboratory experiments on birds and humans support this view. Theory suggests that the most probable background sample (in the statistical sense), at the size of the prey, would, on average, be the most cryptic. We present an analysis, based on realistic assumptions about low-level vision, that estimates the distribution of background colours and visual textures, and predicts the best camouflage. We present data from a field experiment that tests and supports our predictions, using artificial moth-like targets under bird predation. Additionally, we present analogous data for humans, under tightly controlled viewing conditions, searching for targets on a computer screen. These data show that, in the absence of predator learning, the best single camouflage pattern for heterogeneous backgrounds is the most probable sample. © 2017 The Authors.

  18. Percent body fat estimations in college men using field and laboratory methods: A three-compartment model approach

    PubMed Central

    Moon, Jordan R; Tobkin, Sarah E; Smith, Abbie E; Roberts, Michael D; Ryan, Eric D; Dalbo, Vincent J; Lockwood, Chris M; Walter, Ashley A; Cramer, Joel T; Beck, Travis W; Stout, Jeffrey R

    2008-01-01

    Background Methods used to estimate percent body fat can be classified as a laboratory or field technique. However, the validity of these methods compared to multiple-compartment models has not been fully established. The purpose of this study was to determine the validity of field and laboratory methods for estimating percent fat (%fat) in healthy college-age men compared to the Siri three-compartment model (3C). Methods Thirty-one Caucasian men (22.5 ± 2.7 yrs; 175.6 ± 6.3 cm; 76.4 ± 10.3 kg) had their %fat estimated by bioelectrical impedance analysis (BIA) using the BodyGram™ computer program (BIA-AK) and population-specific equation (BIA-Lohman), near-infrared interactance (NIR) (Futrex® 6100/XL), four circumference-based military equations [Marine Corps (MC), Navy and Air Force (NAF), Army (A), and Friedl], air-displacement plethysmography (BP), and hydrostatic weighing (HW). Results All circumference-based military equations (MC = 4.7% fat, NAF = 5.2% fat, A = 4.7% fat, Friedl = 4.7% fat) along with NIR (NIR = 5.1% fat) produced an unacceptable total error (TE). Both laboratory methods produced acceptable TE values (HW = 2.5% fat; BP = 2.7% fat). The BIA-AK, and BIA-Lohman field methods produced acceptable TE values (2.1% fat). A significant difference was observed for the MC and NAF equations compared to both the 3C model and HW (p < 0.006). Conclusion Results indicate that the BP and HW are valid laboratory methods when compared to the 3C model to estimate %fat in college-age Caucasian men. When the use of a laboratory method is not feasible, BIA-AK, and BIA-Lohman are acceptable field methods to estimate %fat in this population. PMID:18426582
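    The two validation statistics quoted throughout these body-composition records are conventionally computed as the mean signed difference (constant error, CE) and the root mean squared difference (total error, TE) between a method's estimates and the criterion 3C values; a minimal sketch (not the authors' code):

    ```python
    import math

    def constant_error(pred, ref):
        # CE: mean signed difference (bias) of the field method vs the criterion.
        return sum(p - r for p, r in zip(pred, ref)) / len(ref)

    def total_error(pred, ref):
        # TE: root mean squared difference; combines bias and scatter.
        return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref))
    ```

    A method can have near-zero CE yet a large TE if its errors scatter widely, which is why the abstracts report both.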

  19. Background radiation measurements at high power research reactors

    NASA Astrophysics Data System (ADS)

    Ashenfelter, J.; Balantekin, B.; Baldenegro, C. X.; Band, H. R.; Barclay, G.; Bass, C. D.; Berish, D.; Bowden, N. S.; Bryan, C. D.; Cherwinka, J. J.; Chu, R.; Classen, T.; Davee, D.; Dean, D.; Deichert, G.; Dolinski, M. J.; Dolph, J.; Dwyer, D. A.; Fan, S.; Gaison, J. K.; Galindo-Uribarri, A.; Gilje, K.; Glenn, A.; Green, M.; Han, K.; Hans, S.; Heeger, K. M.; Heffron, B.; Jaffe, D. E.; Kettell, S.; Langford, T. J.; Littlejohn, B. R.; Martinez, D.; McKeown, R. D.; Morrell, S.; Mueller, P. E.; Mumm, H. P.; Napolitano, J.; Norcini, D.; Pushin, D.; Romero, E.; Rosero, R.; Saldana, L.; Seilhan, B. S.; Sharma, R.; Stemen, N. T.; Surukuchi, P. T.; Thompson, S. J.; Varner, R. L.; Wang, W.; Watson, S. M.; White, B.; White, C.; Wilhelmi, J.; Williams, C.; Wise, T.; Yao, H.; Yeh, M.; Yen, Y.-R.; Zhang, C.; Zhang, X.; Prospect Collaboration

    2016-01-01

    Research reactors host a wide range of activities that make use of the intense neutron fluxes generated at these facilities. Recent interest in performing measurements with relatively low event rates, e.g. reactor antineutrino detection, at these facilities necessitates a detailed understanding of background radiation fields. Both reactor-correlated and naturally occurring background sources are potentially important, even at levels well below those of importance for typical activities. Here we describe a comprehensive series of background assessments at three high-power research reactors, including γ-ray, neutron, and muon measurements. For each facility we describe the characteristics and identify the sources of the background fields encountered. The general understanding gained of background production mechanisms and their relationship to facility features will prove valuable for the planning of any sensitive measurement conducted therein.

  20. Virasoro conformal blocks and thermality from classical background fields

    DOE PAGES

    Fitzpatrick, A. Liam; Kaplan, Jared; Walters, Matthew T.

    2015-11-30

    We show that in 2d CFTs at large central charge, the coupling of the stress tensor to heavy operators can be re-absorbed by placing the CFT in a non-trivial background metric. This leads to a more precise computation of the Virasoro conformal blocks between heavy and light operators, which are shown to be equivalent to global conformal blocks evaluated in the new background. We also generalize to the case where the operators carry U(1) charges. The refined Virasoro blocks can be used as the seed for a new Virasoro block recursion relation expanded in the heavy-light limit. Furthermore, we comment on the implications of our results for the universality of black hole thermality in AdS₃, or equivalently, the eigenstate thermalization hypothesis for CFT₂ at large central charge.

  1. Anatomical background noise power spectrum in differential phase contrast breast images

    NASA Astrophysics Data System (ADS)

    Garrett, John; Ge, Yongshuai; Li, Ke; Chen, Guang-Hong

    2015-03-01

    In x-ray breast imaging, the anatomical noise background of the breast has a significant impact on the detection of lesions and other features of interest. This anatomical noise is typically characterized by a parameter, β, which describes a power law dependence of anatomical noise on spatial frequency (the shape of the anatomical noise power spectrum). Large values of β have been shown to reduce human detection performance, and in conventional mammography typical values of β are around 3.2. Recently, x-ray differential phase contrast (DPC) and the associated dark field imaging methods have received considerable attention as possible supplements to absorption imaging for breast cancer diagnosis. However, the impact of these additional contrast mechanisms on lesion detection is not yet well understood. In order to better understand the utility of these new methods, we measured the β indices for absorption, DPC, and dark field images in 15 cadaver breast specimens using a benchtop DPC imaging system. We found that the measured β value for absorption was consistent with the literature for mammographic acquisitions (β = 3.61±0.49), but that both DPC and dark field images had much lower values of β (β = 2.54±0.75 for DPC and β = 1.44±0.49 for dark field). In addition, visual inspection showed greatly reduced anatomical background in both DPC and dark field images. These promising results suggest that DPC and dark field imaging may help provide improved lesion detection in breast imaging, particularly for those patients with dense breasts, in whom anatomical noise is a major limiting factor in identifying malignancies.
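    The exponent β reported above is the slope of the radially averaged noise power spectrum in log-log coordinates, P(f) ∝ 1/f^β. A NumPy sketch of this standard estimation procedure (binning choices are illustrative assumptions; a square image is assumed):

    ```python
    import numpy as np

    def power_law_beta(img, nbins=20):
        """Estimate beta from the slope of the radially averaged
        power spectrum of img in log-log coordinates."""
        n = img.shape[0]
        ps = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
        y, x = np.indices(ps.shape)
        r = np.hypot(x - n // 2, y - n // 2)  # radial frequency of each pixel
        edges = np.logspace(np.log10(2), np.log10(n // 2), nbins + 1)
        f, p = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sel = (r >= lo) & (r < hi)
            if sel.any():
                f.append(np.sqrt(lo * hi))      # geometric-mean bin frequency
                p.append(ps[sel].mean())        # radially averaged power
        slope, _ = np.polyfit(np.log(f), np.log(p), 1)
        return -slope
    ```

    Applied to synthetic 1/f^β noise, the estimator recovers the generating exponent, mirroring how β is measured from mammographic, DPC, and dark field images.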

  2. A component compensation method for magnetic interferential field

    NASA Astrophysics Data System (ADS)

    Zhang, Qi; Wan, Chengbiao; Pan, Mengchun; Liu, Zhongyan; Sun, Xiaoyong

    2017-04-01

    A new component searching with scalar restriction method (CSSRM) is proposed to compensate the magnetic interference field caused by the ferromagnetic material of the platform and to improve magnetometer measurement performance. In CSSRM, the objective function for parameter estimation minimizes the difference in magnetic field (components and magnitude) between measured and reference values. Two scalar compensation methods are compared with CSSRM, and simulation results indicate that CSSRM can estimate all interference parameters and the external magnetic field vector with high accuracy; the compensated magnetic field magnitude and components coincide with the true values very well. An experiment was carried out with a tri-axial fluxgate magnetometer mounted in a measurement system together with inertial sensors. After compensation, the error standard deviations of both the magnetic field components and the magnitude are reduced from thousands of nT to less than 20 nT. This suggests that CSSRM provides an effective way to improve the performance of magnetic interference field compensation.
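    A sketch of the simplest piece of such a calibration, hard-iron (constant offset) estimation via a linear least-squares sphere fit, using the scalar constraint that readings taken in a uniform field should lie on a sphere. This is a generic textbook step, not CSSRM itself, which also estimates soft-iron and other interference parameters:

    ```python
    import numpy as np

    def hard_iron_offset(m):
        """Estimate a constant (hard-iron) offset from magnetometer samples
        m (N x 3) taken while rotating the sensor in a uniform field.
        From |m - c|^2 = r^2:  2 m.c + (r^2 - |c|^2) = |m|^2, linear in (c, k)."""
        A = np.hstack([2 * m, np.ones((m.shape[0], 1))])
        y = (m ** 2).sum(axis=1)
        sol, *_ = np.linalg.lstsq(A, y, rcond=None)
        centre = sol[:3]                              # hard-iron offset estimate
        radius = np.sqrt(sol[3] + centre @ centre)    # ambient field magnitude
        return centre, radius
    ```

    Subtracting the fitted centre restores a constant field magnitude over all orientations, the same scalar restriction that CSSRM builds into its objective function.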

  3. Background Independence and Duality Invariance in String Theory.

    PubMed

    Hohm, Olaf

    2017-03-31

    Closed string theory exhibits an O(D,D) duality symmetry on tori, which in double field theory is manifest before compactification. I prove that to first order in α′ there is no manifestly background independent and duality invariant formulation of bosonic string theory in terms of a metric, b field, and dilaton. To this end I use O(D,D) invariant second order perturbation theory around flat space to show that the unique background independent candidate expression for the gauge algebra at order α′ is inconsistent with the Jacobi identity. A background independent formulation exists instead for frame variables subject to α′-deformed frame transformations (generalized Green-Schwarz transformations). Potential applications for curved backgrounds, as in cosmology, are discussed.

  4. Greybody factors for a minimally coupled scalar field in a three-dimensional Einstein-power-Maxwell black hole background

    NASA Astrophysics Data System (ADS)

    Panotopoulos, Grigoris; Rincón, Ángel

    2018-04-01

    In the present work we study the propagation of a probe minimally coupled scalar field in an Einstein-power-Maxwell charged black hole background in (1+2) dimensions. We find analytical expressions for the reflection coefficient as well as for the absorption cross section in the low energy regime, and we show graphically their behavior as functions of the frequency for several values of the free parameters of the theory.

  5. Sheet metals characterization using the virtual fields method

    NASA Astrophysics Data System (ADS)

    Marek, Aleksander; Davis, Frances M.; Pierron, Fabrice

    2018-05-01

    In this work, a characterisation method involving a deep-notched specimen subjected to tensile loading is introduced. The specimen develops heterogeneous states of stress and strain, the latter being measured with a stereo DIC system (MatchID). This heterogeneity enables the identification of multiple material parameters in a single test. To identify material parameters from the DIC data, an inverse method called the Virtual Fields Method is employed. Combined with recently developed sensitivity-based virtual fields, the method makes it possible to optimally locate the areas of the test where information about each material parameter is encoded, improving the accuracy of the identification over traditional user-defined virtual fields. It is shown that a single test performed at 45° to the rolling direction is sufficient to obtain all anisotropic plastic parameters, thus reducing the experimental effort involved in characterisation. The paper presents the methodology and a numerical validation.
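    As background, the Virtual Fields Method rests on the principle of virtual work: for a statically loaded specimen (body forces and inertia neglected), every kinematically admissible virtual field u* yields one scalar equation linking the unknown constitutive parameters θ to the measured strain field ε,

    ```latex
    \int_V \boldsymbol{\sigma}(\boldsymbol{\varepsilon};\,\boldsymbol{\theta}) : \boldsymbol{\varepsilon}^{*}\,\mathrm{d}V
    = \int_{\partial V} \mathbf{T}\cdot\mathbf{u}^{*}\,\mathrm{d}S ,
    ```

    so choosing several independent virtual fields produces a system solvable for θ; sensitivity-based virtual fields select u* to concentrate each equation on the regions most informative about one parameter.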

  6. The influence of immigrant background on the choice of sedation method in paediatric dentistry.

    PubMed

    Dahlander, Andreas; Jansson, Leif; Carlstedt, Kerstin; Grindefjord, Margaret

    2015-01-01

    The effects of immigration on the demographics of the Swedish population have changed the situation for many dental care providers, placing increased demand on cultural competence. The aim of this investigation was to study the choice of sedation method among children with immigrant background, referred to paediatric dentistry specialists, because of behaviour management problems or dental fear in combination with treatment needs. The material consisted of dental records from children referred to two clinics for paediatric dentistry: 117 records from children with an immigrant background and 106 from children with a non-immigrant background. Information about choice of sedation method (conventional treatment, conscious sedation with midazolam, nitrous oxide, or general anaesthesia) and dental status was collected from the records. The number of missed appointments (defaults) was also registered. Binary logistic regression analyses were used to calculate the influence of potential predictors on choice of sedation method. The mean age of the patients in the immigrant group was 4.9 yrs, making them significantly younger than the patients in the non-immigrant group (mean 5.7 yrs). In the immigrant group, 26% of the patients defaulted from treatments, while the corresponding frequency was significantly lower for the reference group (7%). The numbers of primary teeth with caries and permanent teeth with caries were positively and significantly correlated with the choice of treatment under general anaesthesia. Conscious sedation was used significantly more often in younger children and in the non-immigrant group, while nitrous oxide was preferred in the older children. In conclusion, conscious sedation was more frequently used in the non-immigrant group. The choice of sedation was influenced by caries frequency and the age of the child.

  7. SWCD: a sliding window and self-regulated learning-based background updating method for change detection in videos

    NASA Astrophysics Data System (ADS)

    Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih

    2018-03-01

    Change detection via background subtraction remains an unresolved problem and continues to attract research interest owing to the challenges posed by both static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from incoming frames with an adaptive, self-regulated feedback mechanism. To achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of update parameters is introduced for updating background frames, which we call sliding window-based change detection (SWCD). Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance, coping with illumination variations, camera jitter, and intermittent object motion. We argue that the method is a fair alternative for most foreground extraction scenarios, unlike case-specific methods, which typically fail outside the scenarios they were designed for.
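
    As a rough illustration of the sliding-window idea (not the paper's SWCD algorithm, which adds self-regulated learning of the update parameters), a background can be modeled as the per-pixel median of the last few frames and changes flagged by a fixed threshold; the function name, window size, and threshold below are assumptions:

```python
from collections import deque
import numpy as np

def detect_changes(frames, window=5, threshold=30):
    """Pixelwise change detection against a sliding-window background.

    The background model is the per-pixel median of the last `window`
    frames; pixels deviating from it by more than `threshold` are
    flagged as foreground. (Simplified sketch: the threshold is fixed,
    not self-regulated as in the cited paper.)
    """
    history = deque(maxlen=window)
    masks = []
    for frame in frames:
        if len(history) == window:
            background = np.median(np.stack(history), axis=0)
            masks.append(np.abs(frame - background) > threshold)
        else:
            # Not enough history yet: report no changes.
            masks.append(np.zeros_like(frame, dtype=bool))
        history.append(frame)
    return masks

# Synthetic sequence: a static scene, then a bright 3x3 object appears.
frames = [np.full((20, 20), 10.0) for _ in range(8)]
frames[7] = frames[7].copy()
frames[7][5:8, 5:8] = 200.0
masks = detect_changes(frames)
```

A real system would additionally adapt `window` and `threshold` per pixel from the feedback of recent detections, which is the part SWCD contributes.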

  8. Sources of background light on space based laser communications links

    NASA Astrophysics Data System (ADS)

    Farrell, Thomas C.

    2018-05-01

    We discuss the sources and levels of background light that should be expected on space based laser communication (lasercom) crosslinks and uplinks, as well as on downlinks to ground stations. The analyses are valid for both Earth orbiting satellites and inter-planetary links. Fundamental equations are derived that are suitable for first-order system engineering analyses of potential lasercom systems. These divide sources of background light into two general categories: extended sources, which fill the field of view of the receiver's optics, and point sources, which cannot be resolved by the optics. Specific sources of background light are discussed, and expected power levels are estimated. For uplinks, reflected sunlight and blackbody radiation from the Earth dominate. For crosslinks, depending on the specific link geometry, sources of background light may include the Sun in the field of view (FOV), reflected sunlight and blackbody radiation from planets and other bodies in the solar system, individual bright stars in the FOV, the amalgam of dim stars in the FOV, zodiacal light, and reflected sunlight off the transmitting spacecraft. For downlinks, all of these potentially come into play, and the effects of the atmosphere, including turbulence, scattering, and absorption, contribute as well. Methods for accounting for each of these are presented. Specific examples illustrate the relative contributions of each source for various link geometries.
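
    The extended-source category lends itself to a simple first-order estimate: collected background power is spectral radiance times aperture area, receiver solid angle, and filter bandwidth. A minimal sketch of that budget for a thermal (blackbody) extended source; the wavelength, temperature, aperture, FOV, and bandwidth figures are illustrative assumptions, not values from the paper:

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / math.expm1(b)

def extended_source_power(radiance, aperture_area, fov_sr, bandwidth_m):
    """Background power from a source that fills the receiver FOV."""
    return radiance * aperture_area * fov_sr * bandwidth_m

# Assumed example: 288 K Earth filling the FOV of a 10 cm aperture with a
# 100 urad full-angle FOV and a 1 nm filter at 1550 nm.
L = planck_radiance(1550e-9, 288.0)
fov_sr = math.pi * (50e-6)**2            # solid angle of a 100 urad cone
P = extended_source_power(L, math.pi * 0.05**2, fov_sr, 1e-9)
```

For a 288 K source the thermal contribution at 1550 nm sits far down the short-wavelength tail of the Planck curve, which is why reflected sunlight usually dominates there.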

  9. Historic Methods for Capturing Magnetic Field Images

    ERIC Educational Resources Information Center

    Kwan, Alistair

    2016-01-01

    I investigated two late 19th-century methods for capturing magnetic field images from iron filings for historical insight into the pedagogy of hands-on physics education methods, and to flesh out teaching and learning practicalities tacit in the historical record. Both methods offer opportunities for close sensory engagement in data-collection…

  10. Research on cloud background infrared radiation simulation based on fractal and statistical data

    NASA Astrophysics Data System (ADS)

    Liu, Xingrun; Xu, Qingshan; Li, Xia; Wu, Kaifeng; Dong, Yanbing

    2018-02-01

    Cloud is an important natural phenomenon, and its radiation causes serious interference to infrared detectors. Based on fractals and statistical data, a method is proposed to simulate cloud backgrounds, with the cloud infrared radiation data field assigned using satellite radiation measurements of clouds. A cloud infrared radiation simulation model is implemented in MATLAB; it can generate cloud-background infrared images for different cloud types (low, middle, and high cloud) in different months, wavebands, and sensor zenith angles.
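
    A common way to realize the fractal part of such a simulation is midpoint-displacement (diamond-square) synthesis, with the resulting height field rescaled to a radiance range taken from satellite statistics. The sketch below is a generic illustration under that assumption, not the authors' MATLAB model; the radiance range is invented:

```python
import numpy as np

def diamond_square(n, roughness=0.6, rng=None):
    """Generate a (2^n+1) x (2^n+1) fractal field by diamond-square."""
    rng = np.random.default_rng(rng)
    size = 2**n + 1
    grid = np.zeros((size, size))
    grid[0, 0], grid[0, -1], grid[-1, 0], grid[-1, -1] = rng.normal(size=4)
    step, scale = size - 1, 1.0
    while step > 1:
        half = step // 2
        # Diamond step: centre of each square = mean of its corners + noise.
        for y in range(half, size, step):
            for x in range(half, size, step):
                avg = (grid[y-half, x-half] + grid[y-half, x+half] +
                       grid[y+half, x-half] + grid[y+half, x+half]) / 4.0
                grid[y, x] = avg + rng.normal() * scale
        # Square step: edge midpoints = mean of available neighbours + noise.
        for y in range(0, size, half):
            for x in range((y + half) % step, size, step):
                nbrs = []
                if y >= half: nbrs.append(grid[y-half, x])
                if y + half < size: nbrs.append(grid[y+half, x])
                if x >= half: nbrs.append(grid[y, x-half])
                if x + half < size: nbrs.append(grid[y, x+half])
                grid[y, x] = np.mean(nbrs) + rng.normal() * scale
        step, scale = half, scale * roughness
    return grid

cloud = diamond_square(6, rng=42)
# Rescale to an assumed radiance range (placeholder for satellite statistics).
radiance = 2.0 + 6.0 * (cloud - cloud.min()) / (cloud.max() - cloud.min())
```

The `roughness` parameter controls the fractal dimension of the texture; in the paper's setting the rescaling limits would come from the satellite radiance data for the chosen cloud type, month, band, and zenith angle.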

  11. Background-Oriented Schlieren for Large-Scale and High-Speed Aerodynamic Phenomena

    NASA Technical Reports Server (NTRS)

    Mizukaki, Toshiharu; Borg, Stephen; Danehy, Paul M.; Murman, Scott M.; Matsumura, Tomoharu; Wakabayashi, Kunihiko; Nakayama, Yoshio

    2015-01-01

    Visualization of the flow field around a generic re-entry capsule in subsonic flow and shock wave visualization with cylindrical explosives have been conducted to demonstrate the sensitivity and applicability of background-oriented schlieren (BOS) for field experiments. The wind tunnel experiment suggests that BOS with a fine-pixel imaging device can detect density changes on the order of 10^-5 in subsonic flow. In a laboratory setup, the structure of the shock waves generated by explosives has been successfully reconstructed by a computed tomography method combined with BOS.

  12. On the detection of a stochastic background of gravitational radiation by the Doppler tracking of spacecraft

    NASA Technical Reports Server (NTRS)

    Mashhoon, B.; Grishchuk, L. P.

    1980-01-01

    Consideration is given to the possibility of detecting an isotropic background of gravitational radiation of a stochastic nature by Doppler tracking of spacecraft. Attention is given, in the geometrical optics limit, to the general formula for the frequency shift of an electromagnetic signal in a gravitational radiation field, which is shown to be gauge independent. The propagation of a free electromagnetic wave in a gravitational radiation field is examined, with the conclusion that no resonance phenomena can be expected. Finally, the 'Doppler noise' due to a stochastic background is evaluated and shown to depend on the total energy density of the background and on a parameter characteristic of the radiation spectrum and the detection system used.

  13. Proof of factorization using background field method of QCD

    NASA Astrophysics Data System (ADS)

    Nayak, Gouranga C.

    2010-02-01

    The factorization theorem plays a central role at high energy colliders in studies of standard model and beyond-standard-model physics. The proof of the factorization theorem was given by Collins, Soper, and Sterman to all orders in perturbation theory using a diagrammatic approach. One might wonder whether the proof can instead be obtained through symmetry considerations at the Lagrangian level. In this paper we provide such a proof.

  14. Proof of factorization using background field method of QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nayak, Gouranga C.

    The factorization theorem plays a central role at high energy colliders in studies of standard model and beyond-standard-model physics. The proof of the factorization theorem was given by Collins, Soper, and Sterman to all orders in perturbation theory using a diagrammatic approach. One might wonder whether the proof can instead be obtained through symmetry considerations at the Lagrangian level. In this paper we provide such a proof.

  15. Mitigation strategies against radiation-induced background for space astronomy missions

    NASA Astrophysics Data System (ADS)

    Davis, C. S. W.; Hall, D.; Keelan, J.; O'Farrell, J.; Leese, M.; Holland, A.

    2018-01-01

    The Advanced Telescope for High ENergy Astrophysics (ATHENA) mission is a major upcoming space-based X-ray observatory due to be launched in 2028 by ESA, with the purpose of mapping the early universe and observing black holes. Background radiation is expected to constitute a large fraction of the total system noise in the Wide Field Imager (WFI) instrument on ATHENA, and designing an effective system to reduce the background radiation reaching the WFI will be crucial for maximising its sensitivity. Significant background sources are expected to include high-energy protons, X-ray fluorescence lines, 'knock-on' electrons, and Compton electrons. Because of the variety of background sources, multiple shielding methods may be required to achieve maximum sensitivity in the WFI; these techniques may also be of great interest for future space-based X-ray experiments. Simulations have been developed to model the effect of a graded-Z shield on the X-ray fluorescence background. In addition, the effect of a 90 nm optical blocking filter on the secondary electron background has been investigated and shown to modify the requirements of any secondary electron shielding that is to be used.

  16. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 63 [OAR-2004-0080, FRL-9306-8] RIN 2060-AF00 Method 301--Field Validation of Pollutant Measurement Methods From Various Waste Media AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: This action amends EPA's Method 301, Field Validation...

  17. Geochemical field method for determination of nickel in plants

    USGS Publications Warehouse

    Reichen, L.E.

    1951-01-01

    The use of biogeochemical data in prospecting for nickel emphasizes the need for a simple, moderately accurate field method for the determination of nickel in plants. In order to follow leads provided by plants of unusual nickel content without loss of time, the plants should be analyzed and the results given to the field geologist promptly. The method reported in this paper was developed to meet this need. Speed is gained by eliminating the customary drying and controlled ashing; the fresh vegetation is ashed in an open dish over a gasoline stove. The ash is put into solution with hydrochloric acid and the solution buffered. A chromograph is used to make a confined spot with an aliquot of the ash solution on dimethylglyoxime reagent paper. As little as 0.025% nickel in plant ash can be determined; with a simple modification, 0.003% can be detected. Data are given comparing the results with those obtained by an accepted laboratory procedure; results by the field method are within 30% of the laboratory values. The field method for nickel in plants meets the requirements of biogeochemical prospecting with respect to accuracy, simplicity, speed, and ease of performance in the field. With experience, an analyst can make 30 determinations in an 8-hour work day in the field.

  18. A salient region detection model combining background distribution measure for indoor robots.

    PubMed

    Li, Na; Xu, Hui; Wang, Zhenhua; Sun, Lining; Chen, Guodong

    2017-01-01

    Vision systems play an important role in indoor robotics. Saliency detection methods, which capture regions perceived as important, are used to improve the performance of a visual perception system. Most state-of-the-art saliency detection methods, though performing outstandingly on natural images, fail in complicated indoor environments. We therefore propose a new method comprising graph-based RGB-D segmentation, a primary saliency measure, a background distribution measure, and their combination. In addition, region roundness is proposed to describe the compactness of a region so that background distribution can be measured more robustly. To validate the proposed approach, eleven influential methods are compared on the DSD and ECSSD datasets. Moreover, we build a mobile robot platform for evaluation in a real environment and design three kinds of experiments: different viewpoints, illumination variations, and partial occlusions. Experimental results demonstrate that our model outperforms existing methods and is useful for indoor mobile robots.

  19. Background Error Correlation Modeling with Diffusion Operators

    DTIC Science & Technology

    2013-01-01

    Book chapter, 07-10-2013. Max Yaremchuk. Chapter 8, "Background error correlation modeling with diffusion operators." Abstract fragment (report-form metadata removed): "…field, then a structure like this simulates enhanced diffusive transport of model errors in the regions of strong currents on the background of…"

  20. [The validation of the effect of correcting spectral background changes based on floating reference method by simulation].

    PubMed

    Wang, Zhu-lou; Zhang, Wan-jie; Li, Chen-xi; Chen, Wen-liang; Xu, Ke-xin

    2015-02-01

    There are several challenges in near-infrared non-invasive blood glucose measurement, such as the low signal-to-noise ratio of the instrument, unstable measurement conditions, and unpredictable, irregular changes in the measured object. It is therefore difficult to extract blood glucose concentrations accurately from the complicated signals. A reference measurement is usually considered for eliminating the effect of background changes, but there is no reference substance that changes synchronously with the analyte. After many years of research, our group has proposed the floating reference method, which succeeds in eliminating the spectral effects induced by instrument drift and by background variations of the measured object. Our studies indicate, however, that the reference point changes with measurement location and wavelength, so the effectiveness of the floating reference method should be verified comprehensively. In this paper, for simplicity, Monte Carlo simulations of Intralipid solutions at concentrations of 5% and 10% are performed to verify the ability of the floating reference method to eliminate the effects of light source drift, with the drift introduced by varying the incident photon number. The effectiveness of the method, with the corresponding reference points at different wavelengths, in eliminating the variations caused by light source drift is estimated. A comparison of the prediction abilities of calibration models with and without the method shows that the RMSEPs are decreased by about 98.57% (5% Intralipid) and 99.36% (10% Intralipid). The results indicate that the floating reference method is clearly effective in eliminating background changes.

  1. Force-free magnetic fields - The magneto-frictional method

    NASA Technical Reports Server (NTRS)

    Yang, W. H.; Sturrock, P. A.; Antiochos, S. K.

    1986-01-01

    The problem under discussion is that of calculating magnetic field configurations in which the Lorentz force j x B is everywhere zero, subject to specified boundary conditions. We choose to represent the magnetic field in terms of Clebsch variables in the form B = grad alpha x grad beta. These variables are constant on any field line so that each field line is labeled by the corresponding values of alpha and beta. When the field is described in this way, the most appropriate choice of boundary conditions is to specify the values of alpha and beta on the bounding surface. We show that such field configurations may be calculated by a magneto-frictional method. We imagine that the field lines move through a stationary medium, and that each element of magnetic field is subject to a frictional force parallel to and opposing the velocity of the field line. This concept leads to an iteration procedure for modifying the variables alpha and beta, that tends asymptotically towards the force-free state. We apply the method first to a simple problem in two rectangular dimensions, and then to a problem of cylindrical symmetry that was previously discussed by Barnes and Sturrock (1972). In one important respect, our new results differ from the earlier results of Barnes and Sturrock, and we conclude that the earlier article was in error.
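
    In the simplest current-free case the force-free condition reduces to a Laplace problem for a flux function, and the frictional iteration amounts to relaxation toward the harmonic solution with boundary values held fixed. A toy 2-D sketch of that special case (not the authors' Clebsch-variable scheme; the boundary data is an invented example):

```python
import numpy as np

def relax_to_potential(psi, tol=1e-6, max_iter=20000):
    """Relax a 2-D flux function toward nabla^2 psi = 0 (current-free,
    hence trivially force-free) with boundary values held fixed.

    Each Jacobi sweep moves psi down the residual, playing the role of
    the frictional velocity that opposes field-line motion.
    """
    psi = psi.copy()
    for _ in range(max_iter):
        new = psi.copy()
        new[1:-1, 1:-1] = 0.25 * (psi[:-2, 1:-1] + psi[2:, 1:-1] +
                                  psi[1:-1, :-2] + psi[1:-1, 2:])
        if np.max(np.abs(new - psi)) < tol:
            return new
        psi = new
    return psi

# Boundary condition: psi = x*y on the edges of the unit square, whose
# exact interior solution is also psi = x*y (a harmonic function).
n = 33
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
psi0 = np.zeros((n, n))
psi0[0, :], psi0[-1, :] = (x*y)[0, :], (x*y)[-1, :]
psi0[:, 0], psi0[:, -1] = (x*y)[:, 0], (x*y)[:, -1]
psi = relax_to_potential(psi0)
```

The full magneto-frictional method generalizes this: the velocity is proportional to j x B rather than the Laplacian residual, so the iteration can converge to force-free states that still carry field-aligned current.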

  2. Methods of measuring soil moisture in the field

    USGS Publications Warehouse

    Johnson, A.I.

    1962-01-01

    For centuries, the amount of moisture in the soil has been of interest in agriculture. The subject of soil moisture is also of great importance to the hydrologist, forester, and soils engineer. Much equipment and many methods have been developed to measure soil moisture under field conditions. This report discusses and evaluates the various methods for measuring soil moisture and describes the equipment needed for each. The advantages and disadvantages of each method are discussed, and an extensive list of references is provided for those wishing to study the subject in more detail. The gravimetric method is concluded to be the most satisfactory for most problems requiring one-time moisture-content data. The radioactive method is normally best for obtaining repeated measurements of soil moisture in place. It is concluded that all methods have some limitations and that the ideal method for measuring soil moisture under field conditions has yet to be perfected.

  3. Granularity of the Diffuse Background Observed

    NASA Technical Reports Server (NTRS)

    Gruber, D. E.; MacDonald, D.; Rothschild, R. E.; Boldt, E.; Mushotzky, R. F.; Fabian, A. C.

    1995-01-01

    First results are reported from a program for measuring the field-to-field fluctuation level of the cosmic diffuse background using differences between the two background positions of each deep exposure with the High Energy X-ray Timing Experiment (HEXTE) instrument on the Rossi X-ray Timing Explorer (RXTE). With 8 million live seconds accumulated to date, a fluctuation level in the 15-25 keV band is observed which is consistent with extrapolations from the High Energy Astronomy Observatory-1 (HEAO-1) measurements. Positive results are expected eventually at higher energies. Models of active galactic nuclei (AGN) origin will eventually be constrained by this program.

  4. THE NEW YORK CITY URBAN DISPERSION PROGRAM MARCH 2005 FIELD STUDY: TRACER METHODS AND RESULTS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WATSON, T.B.; HEISER, J.; KALB, P.

    The Urban Dispersion Program March 2005 Field Study tracer releases, sampling, and analytical methods are described in detail. Tracer releases and sampling were conducted on two days. A total of 16.0 g of six tracers was released during the first test day, or Intensive Observation Period (IOP) 1, and 15.7 g during IOP 2. Three types of sampling instruments were used in this study: sequential air samplers (SAS) collected six-minute samples, while Brookhaven atmospheric tracer samplers (BATS) and personal air samplers (PAS) collected thirty-minute samples. There were a total of 1300 samples resulting from the two IOPs. Confidence limits in the sampling and analysis method were 20%, as determined from 100 duplicate samples. The sample recovery rate was 84%. The integrally averaged 6-minute samples were compared to the 30-minute samples, and the agreement was found to be good in most cases. The validity of using a background tracer to calculate sample volumes was examined and also found to have a confidence level of 20%. Methods for improving sampling and analysis are discussed. The data described in this report are available as Excel files. An additional Excel file of quality-assured tracer data for use in model validation efforts is also available; it consists of extensively quality-assured BATS tracer data with background concentrations subtracted.

  5. Particle production in a gravitational wave background

    NASA Astrophysics Data System (ADS)

    Jones, Preston; McDougall, Patrick; Singleton, Douglas

    2017-03-01

    We study the possibility that massless particles, such as photons, are produced by a gravitational wave. That such a process should occur is implied by tree-level Feynman diagrams such as two gravitons turning into two photons, i.e., g + g → γ + γ. Here we calculate the rate at which a gravitational wave creates a massless scalar field. This is done by placing the scalar field in the background of a plane gravitational wave and calculating the 4-current of the scalar field. Even in the vacuum limit the scalar field has a nonzero vacuum expectation value (similar to what occurs in the Higgs mechanism) and a nonzero current. We associate this with the production of scalar field quanta by the gravitational field. This effect has potential consequences for the attenuation of gravitational waves, since the massless field is produced at the expense of the gravitational field. It is related to the time-dependent Schwinger effect, but with the electric field replaced by the gravitational wave background and the electron/positron field quanta replaced by massless scalar "photons." Since the produced scalar quanta are massless, there is no exponential suppression of the kind that occurs in the Schwinger effect due to the electron mass.

  6. Optical Flow for Flight and Wind Tunnel Background Oriented Schlieren Imaging

    NASA Technical Reports Server (NTRS)

    Smith, Nathanial T.; Heineck, James T.; Schairer, Edward T.

    2017-01-01

    Background oriented Schlieren images have historically been generated by calculating the observed pixel displacement between a wind-on and wind-off image pair using normalized cross-correlation. This work instead uses optical flow to solve for the displacement fields that generate the Schlieren images. A well-established method in the computer vision community, optical flow is the apparent motion in an image sequence due to brightness changes. The regularization method of Horn and Schunck is used to create Schlieren images from two data sets: a supersonic jet plume shock interaction from the NASA Ames Unitary Plan Wind Tunnel, and a transonic flight test of a T-38 aircraft using a naturally occurring background, performed jointly by NASA Ames and Armstrong Research Centers. Results are presented and contrasted with those from normalized cross-correlation; the optical flow Schlieren images are found to provide significantly more detail. We also apply the method to historical data sets to demonstrate the broad applicability and limitations of the technique.
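
    The Horn and Schunck scheme alternates a neighbourhood average of the flow with a correction from the brightness-constancy constraint. A minimal sketch on a synthetic image pair; the parameter values and the test images are illustrative assumptions, not those used in the paper:

```python
import numpy as np

def local_avg(f):
    """Horn-Schunck neighbourhood average (1/6 edge, 1/12 corner weights)."""
    return ((np.roll(f, 1, 0) + np.roll(f, -1, 0) +
             np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 6.0 +
            (np.roll(np.roll(f, 1, 0), 1, 1) + np.roll(np.roll(f, 1, 0), -1, 1) +
             np.roll(np.roll(f, -1, 0), 1, 1) + np.roll(np.roll(f, -1, 0), -1, 1)) / 12.0)

def horn_schunck(im1, im2, alpha=1.0, n_iter=200):
    """Estimate dense optical flow (u, v) between two frames with the
    Horn-Schunck global regularization scheme."""
    im1 = im1.astype(float)
    im2 = im2.astype(float)
    Iy, Ix = np.gradient(0.5 * (im1 + im2))   # spatial derivatives
    It = im2 - im1                            # temporal derivative
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_avg, v_avg = local_avg(u), local_avg(v)
        common = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * common
        v = v_avg - Iy * common
    return u, v

# Synthetic check: a smooth blob translated by one pixel in +x.
yy, xx = np.mgrid[0:64, 0:64]
blob = 255.0 * np.exp(-((xx - 32.0)**2 + (yy - 32.0)**2) / 50.0)
blob_shifted = 255.0 * np.exp(-((xx - 33.0)**2 + (yy - 32.0)**2) / 50.0)
u, v = horn_schunck(blob, blob_shifted)
```

The regularization weight `alpha` trades data fidelity against smoothness; for BOS the two frames would be the wind-off and wind-on background images, and `(u, v)` the apparent background displacement.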

  7. Exponential nonlinear electrodynamics and backreaction effects on holographic superconductor in the Lifshitz black hole background

    NASA Astrophysics Data System (ADS)

    Sherkatghanad, Z.; Mirza, B.; Lalehgani Dezaki, F.

    We analytically describe the properties of the s-wave holographic superconductor with exponential nonlinear electrodynamics in a four-dimensional Lifshitz black hole background. Assuming that the scalar and gauge fields backreact on the background geometry, we calculate the critical temperature as well as the condensation operator. Based on the Sturm-Liouville method, we show that the critical temperature decreases with increasing nonlinearity and Lifshitz dynamical exponent z, indicating that condensation becomes more difficult. We also find that backreaction plays a more important role in the critical temperature and condensation operator at small values of the Lifshitz dynamical exponent, when z is around one. In addition, the properties of the upper critical magnetic field are investigated in the probe limit using the Sturm-Liouville approach, to describe the phase diagram of the corresponding holographic superconductor. We observe that the critical magnetic field decreases with increasing Lifshitz dynamical exponent z and goes to zero at the critical temperature, independent of z.

  8. Synchronization of video recording and laser pulses including background light suppression

    NASA Technical Reports Server (NTRS)

    Kalshoven, Jr., James E. (Inventor); Tierney, Jr., Michael (Inventor); Dabney, Philip W. (Inventor)

    2004-01-01

    An apparatus for and a method of triggering a pulsed light source, in particular a laser, for predictable capture of the source by video equipment. A frame synchronization signal is derived from the video signal of a camera to trigger the laser and position the resulting laser light pulse in the appropriate field of the video frame and during the opening of the electronic shutter, if such a shutter is included in the camera. Positioning the laser pulse in the proper video field allows, after recording, the laser light image to be viewed on a video monitor using the pause mode of a standard cassette-type VCR. The invention also allows fine positioning of the laser pulse to fall within the electronic shutter opening. For cameras with externally controllable electronic shutters, the invention provides background light suppression by increasing the shutter speed during the frame in which the laser light image is captured. The laser light thus appears in one frame in which the background scene is suppressed while the laser light itself is unaffected; in all other frames the shutter speed is slower, allowing normal recording of the background scene. The invention also allows arbitrary (manual or external) triggering of the laser with full video synchronization and background light suppression.

  9. Nonrelativistic fluids on scale covariant Newton-Cartan backgrounds

    NASA Astrophysics Data System (ADS)

    Mitra, Arpita

    2017-12-01

    The nonrelativistic covariant framework for fields is extended to investigate fields and fluids on scale covariant curved backgrounds. The scale covariant Newton-Cartan background is constructed using the localization of space-time symmetries of nonrelativistic fields in flat space. Following this, we provide a Weyl covariant formalism that can be used to study scale invariant fluids. Taking ideal fluids as an example, we describe their thermodynamic and hydrodynamic properties and explicitly demonstrate that they satisfy the local second law of thermodynamics. As a further application, we consider the low energy description of Hall fluids. Specifically, we find that the gauge fields for scale transformations lead to corrections of the Wen-Zee and Berry phase terms contained in the effective action.

  10. Background radiation measurements at high power research reactors

    DOE PAGES

    Ashenfelter, J.; Yeh, M.; Balantekin, B.; ...

    2015-10-23

    Research reactors host a wide range of activities that make use of the intense neutron fluxes generated at these facilities. Recent interest in performing measurements with relatively low event rates, e.g. reactor antineutrino detection, at these facilities necessitates a detailed understanding of background radiation fields. Both reactor-correlated and naturally occurring background sources are potentially important, even at levels well below those of importance for typical activities. Here we describe a comprehensive series of background assessments at three high-power research reactors, including γ-ray, neutron, and muon measurements. For each facility we describe the characteristics and identify the sources of the background fields encountered. Furthermore, the general understanding gained of background production mechanisms and their relationship to facility features will prove valuable for the planning of any sensitive measurement conducted therein.

  11. Path Planning for Robot based on Chaotic Artificial Potential Field Method

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng

    2018-03-01

    Robot path planning in unknown environments is one of the hot research topics in robot control. Addressing the shortcomings of traditional artificial potential field methods, we propose a new path planning method for robots based on a chaotic artificial potential field. The planner adopts the potential function as the objective function and introduces the robot's direction of movement as the control variable, combining the improved artificial potential field method with a chaotic optimization algorithm. Simulations have been carried out, and the results demonstrate the superior practicality and high efficiency of the proposed method.
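
    The classical artificial potential field planner that such work builds on sums an attractive force toward the goal with repulsive forces from nearby obstacles and descends the combined field. A minimal sketch of that baseline (the paper's chaotic perturbation is omitted; the gains, distances, and scenario below are illustrative assumptions):

```python
import numpy as np

def apf_path(start, goal, obstacles, k_att=1.0, k_rep=2.0, d0=1.5,
             step=0.05, max_steps=2000, goal_tol=0.2):
    """Plan a path by descending an attractive-plus-repulsive potential.

    Classical artificial potential field with fixed gains; a chaotic
    perturbation (as in the cited paper) would be added to escape
    local minima.
    """
    pos = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    path = [pos.copy()]
    for _ in range(max_steps):
        force = k_att * (goal - pos)                 # attractive term
        for obs in obstacles:
            d_vec = pos - np.asarray(obs, dtype=float)
            d = np.linalg.norm(d_vec)
            if 0 < d < d0:                           # repulsion within range d0
                force += k_rep * (1.0/d - 1.0/d0) / d**2 * (d_vec / d)
        # Constant-speed step along the net force direction.
        pos = pos + step * force / (np.linalg.norm(force) + 1e-12)
        path.append(pos.copy())
        if np.linalg.norm(goal - pos) < goal_tol:
            break
    return np.array(path)

# Example scenario: one obstacle slightly off the straight-line route.
path = apf_path(start=(0, 0), goal=(9, 9), obstacles=[(4.5, 5.0)])
```

The well-known weakness of this baseline, local minima where attraction and repulsion balance, is precisely what the chaotic optimization component is intended to address.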

  12. New Method for Solving Inductive Electric Fields in the Ionosphere

    NASA Astrophysics Data System (ADS)

    Vanhamäki, H.

    2005-12-01

    We present a new method for calculating inductive electric fields in the ionosphere. It is well established that on large scales the ionospheric electric field is a potential field. This is understandable, since the temporal variations of large-scale current systems are generally quite slow, on timescales of several minutes, so inductive effects should be small. However, studies of Alfven wave reflection have indicated that in some situations inductive phenomena could play a significant role in the reflection process, and thus modify the nature of ionosphere-magnetosphere coupling. The inputs to our calculation method are the time series of the potential part of the ionospheric electric field together with the Hall and Pedersen conductances; the output is the time series of the induced rotational part of the ionospheric electric field. The method works in the time domain and can be used with non-uniform, time-dependent conductances. In addition, no particular symmetry requirements are imposed on the input potential electric field. The method makes use of special non-local vector basis functions called Cartesian Elementary Current Systems (CECS). This vector basis offers a convenient way of representing the curl-free and divergence-free parts of 2-dimensional vector fields and makes it possible to solve the induction problem using simple linear algebra. The new method is validated by comparison with previously published results for Alfven wave reflection from a uniformly conducting ionosphere.
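
    The curl-free/divergence-free split that the CECS basis provides can be illustrated, for a periodic 2-D field, by the standard FFT-based Helmholtz decomposition (a generic alternative used here purely for illustration, not the paper's CECS construction):

```python
import numpy as np

def helmholtz_decompose(fx, fy):
    """Split a periodic 2-D vector field into curl-free and
    divergence-free parts via the FFT (longitudinal/transverse split)."""
    ny, nx = fx.shape
    kx = np.fft.fftfreq(nx) * 2 * np.pi
    ky = np.fft.fftfreq(ny) * 2 * np.pi
    KX, KY = np.meshgrid(kx, ky)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                 # avoid dividing the mean mode by zero
    Fx, Fy = np.fft.fft2(fx), np.fft.fft2(fy)
    proj = (KX * Fx + KY * Fy) / k2        # k . F / |k|^2
    cx, cy = KX * proj, KY * proj          # curl-free (longitudinal) part
    curl_free = (np.real(np.fft.ifft2(cx)), np.real(np.fft.ifft2(cy)))
    div_free = (fx - curl_free[0], fy - curl_free[1])
    return curl_free, div_free

# Check on a purely curl-free field: the gradient of a periodic potential.
n = 64
x = np.linspace(0, 2*np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x)
fx = np.cos(X) * np.cos(Y)     # d/dx of sin(X)cos(Y)
fy = -np.sin(X) * np.sin(Y)    # d/dy of sin(X)cos(Y)
(cfx, cfy), (dfx, dfy) = helmholtz_decompose(fx, fy)
```

CECS plays the analogous role without requiring periodicity, which is what lets the induction problem be reduced to simple linear algebra on a finite region.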

  13. Influence of magnetic field configuration on magnetohydrodynamic waves in Earth's core

    NASA Astrophysics Data System (ADS)

    Knezek, Nicholas; Buffett, Bruce

    2018-04-01

    We develop a numerical model to study magnetohydrodynamic waves in a thin layer of stratified fluid near the surface of Earth's core. Past studies have been limited to using simple background magnetic field configurations. However, the choice of field distribution can dramatically affect the structure and frequency of the waves. To permit a more general treatment of background magnetic field and layer stratification, we combine finite volume and Fourier methods to describe the wave motions. We validate our model by comparisons to previous studies and examine the influence of background magnetic field configuration on two types of magnetohydrodynamic waves. We show that the structure of zonal Magnetic-Archimedes-Coriolis (MAC) waves for a dipole background field is unstable to small perturbations of the field strength in the equatorial region. Modifications to the wave structures are computed for a range of field configurations. In addition, we show that non-zonal MAC waves are trapped near the equator for realistic magnetic field distributions, and that their latitudinal extent depends upon the distribution of magnetic field strength at the CMB.

  14. Holographic anisotropic background with confinement-deconfinement phase transition

    NASA Astrophysics Data System (ADS)

    Aref'eva, Irina; Rannu, Kristina

    2018-05-01

    We present new anisotropic black brane solutions in a 5D Einstein-dilaton-two-Maxwell system. The anisotropic background is specified by an arbitrary dynamical exponent ν, a nontrivial warp factor, a non-zero dilaton field, a non-zero time component of the first Maxwell field, and a non-zero longitudinal magnetic component of the second Maxwell field. The blackening function supports a Van der Waals-like phase transition between small and large black holes for a suitable first Maxwell field charge. The isotropic case, corresponding to ν = 1 and zero magnetic field, reproduces previously known solutions. We investigate the influence of the anisotropy on the thermodynamic properties of our background, in particular on the small/large black hole phase transition diagram. We discuss applications of the model to bottom-up holographic QCD. The RG flow interpolates between the UV region with two suppressed transversal coordinates and the IR region with suppressed time and longitudinal coordinates, owing to the anisotropic character of our solution. We study temporal Wilson loops, extended in the longitudinal and transversal directions, by calculating the minimal surfaces of the corresponding probing open string world-sheet in anisotropic backgrounds at various temperatures and chemical potentials. We find that the dynamical wall location depends on the orientation of the quark pairs, which gives a crossover transition line between the confinement and deconfinement phases in the dual gauge theory. Instability of the background leads to the appearance of critical points (μ_{ϑ,b}, T_{ϑ,b}) depending on the orientation ϑ of the quark-antiquark pairs with respect to the heavy-ion collision line.

  15. Application of nonparametric regression methods to study the relationship between NO2 concentrations and local wind direction and speed at background sites.

    PubMed

    Donnelly, Aoife; Misstear, Bruce; Broderick, Brian

    2011-02-15

    Background concentrations of nitrogen dioxide (NO2) are not constant but vary temporally and spatially. The current paper presents a powerful tool for the quantification of the effects of wind direction and wind speed on background NO2 concentrations, particularly in cases where monitoring data are limited. In contrast to previous studies which applied similar methods to sites directly affected by local pollution sources, the current study focuses on background sites with the aim of improving methods for predicting background concentrations adopted in air quality modelling studies. The relationship between measured NO2 concentration in air at three such sites in Ireland and locally measured wind direction has been quantified using nonparametric regression methods. The major aim was to analyse a method for quantifying the effects of local wind direction on background levels of NO2 in Ireland. The method was expanded to include wind speed as an added predictor variable. A Gaussian kernel function is used in the analysis, and circular statistics are employed for the wind direction variable. Wind direction and wind speed were both found to have a statistically significant effect on background levels of NO2 at all three sites. Environmental impact assessments are frequently based on short-term baseline monitoring that produces a limited dataset. In contrast to commonly used methods such as data binning, the presented nonparametric regression methods allow concentrations for missing data pairs to be estimated and spurious peaks in concentration to be distinguished from true ones. The methods were found to provide a realistic estimation of long-term concentration variation with wind direction and speed, even for cases where the dataset is limited. Accurate identification of the actual variation at each location, and of its causative factors, could thus be made, supporting an improved definition of background concentrations for use in air quality modelling.
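
    The kernel regression described above can be sketched in a few lines. This is a minimal illustration, not the study's implementation: the 15° bandwidth and the wrapped-Gaussian kernel are assumptions standing in for the paper's Gaussian kernel with circular statistics.

    ```python
    import numpy as np

    def circular_gaussian_kernel(theta, theta_obs, bw):
        # Gaussian kernel on the wrapped angular difference, so that wind
        # directions of 350 deg and 10 deg are treated as 20 deg apart
        d = np.angle(np.exp(1j * (theta_obs - theta)))
        return np.exp(-0.5 * (d / bw) ** 2)

    def nw_estimate(theta, theta_obs, conc_obs, bw=np.deg2rad(15)):
        # Nadaraya-Watson estimate of the mean concentration at direction theta
        w = circular_gaussian_kernel(theta, theta_obs, bw)
        return float(np.sum(w * conc_obs) / np.sum(w))
    ```

    Because the kernel acts on the wrapped difference, observations on either side of north contribute symmetrically to an estimate at 0°, which plain binning of the compass range cannot do.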

  16. A method to characterise site, urban and regional ambient background radiation.

    PubMed

    Passmore, C; Kirr, M

    2011-03-01

    Control dosemeters are routinely provided to customers to monitor the background radiation so that it can be subtracted from the gross response of the dosemeter to arrive at the occupational dose. Landauer, the largest dosimetry processor in the world, with subsidiaries in Australia, Brazil, China, France, Japan, Mexico and the UK, has clients in approximately 130 countries. The Glenwood facility processes over 1.1 million controls per year. This network of clients around the world provides a unique ability to monitor the world's ambient background radiation. Control data can be mined to provide useful historical information regarding ambient background rates and a historical baseline for geographical areas. This historical baseline can be used to provide site- or region-specific background subtraction values, document the variation in ambient background radiation around a client's site, or provide a baseline for measuring the efficiency of clean-up efforts in urban areas after a dirty bomb detonation.

  17. Method for evaluating human exposure to 60 Hz electric fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deno, D.W.; Silva, M.

    1984-07-01

    This paper describes a method that has been successfully used to evaluate human exposure to 60 Hz electric fields. An exposure measuring system that uses an electric field sensor vest and data collection instrumentation is presented. Exposure concepts and activity factors are discussed, and experimental data collected with the exposure system are provided. This method can be used to measure exposure to a wide range of electric fields, with intensities from less than 1 V/m to more than 10 kV/m. Results may be translated to characterize various exposure criteria (time histogram of unperturbed field, surface fields, internal current density, total body current, etc.).

  18. Space moving target detection and tracking method in complex background

    NASA Astrophysics Data System (ADS)

    Lv, Ping-Yue; Sun, Sheng-Li; Lin, Chang-Qing; Liu, Gao-Rui

    2018-06-01

    The background of space-borne detectors in a real space-based environment is extremely complex and the signal-to-clutter ratio (SCR) is very low (SCR ≈ 1), which increases the difficulty of detecting space moving targets. To solve this problem, an algorithm combining background suppression based on a two-dimensional least mean square filter (TDLMS) and target enhancement based on the neighborhood gray-scale difference (GSD) is proposed in this paper. The latter filters out most of the residual background clutter left by the former, such as cloud edges. Through this procedure, both the global and local SCR obtain substantial improvement, indicating that the target has been greatly enhanced. After removing the detector's inherent clutter region through connected-domain processing, the image contains only the target point and isolated noise, and the isolated noise can be filtered out effectively through multi-frame association. The proposed algorithm has been compared with some state-of-the-art algorithms for moving target detection and tracking tasks. The experimental results show that the performance of this algorithm is the best in terms of SCR gain, background suppression factor (BSF) and detection results.
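
    The TDLMS background-suppression stage can be sketched as follows. The neighborhood half-width k and step size mu are illustrative choices, and the GSD enhancement and multi-frame association stages are omitted:

    ```python
    import numpy as np

    def tdlms_residual(img, k=2, mu=1e-7):
        # Two-dimensional LMS filter: predict each pixel from its
        # (2k+1)^2 - 1 neighbours, adapt the weights with the LMS rule, and
        # return the prediction error. Slowly varying background is predicted
        # well and suppressed, while point targets survive as large residuals.
        h, w = img.shape
        m = (2 * k + 1) ** 2               # full patch size
        wts = np.zeros(m - 1)
        resid = np.zeros_like(img, dtype=float)
        for i in range(k, h - k):
            for j in range(k, w - k):
                patch = img[i - k:i + k + 1, j - k:j + k + 1].astype(float).ravel()
                x = np.delete(patch, m // 2)   # neighbours, centre excluded
                e = patch[m // 2] - wts @ x    # prediction error
                wts += mu * e * x              # LMS weight update
                resid[i, j] = e
        return resid
    ```

    On a flat background the residual decays as the weights adapt, while an isolated bright pixel is poorly predicted by its neighbours and remains prominent in the residual image.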

  19. General heat kernel coefficients for massless free spin-3/2 Rarita-Schwinger field

    NASA Astrophysics Data System (ADS)

    Karan, Sudip; Kumar, Shashank; Panda, Binata

    2018-04-01

    We review the general heat kernel method for the Dirac spinor field, as an elementary example, in an arbitrary background. We then compute the first three Seeley-DeWitt coefficients for the massless free spin-3/2 Rarita-Schwinger field without imposing any limitations on the background geometry.

  20. Characterization and Prediction of the SPI Background

    NASA Technical Reports Server (NTRS)

    Teegarden, B. J.; Jean, P.; Knodlseder, J.; Skinner, G. K.; Weidenspointer, G.

    2003-01-01

    The INTEGRAL Spectrometer, like most gamma-ray instruments, is background dominated. Signal-to-background ratios of a few percent are typical. The background is primarily due to interactions of cosmic rays in the instrument and spacecraft. It characteristically varies by +/- 5% on time scales of days. This variation is caused mainly by fluctuations in the interplanetary magnetic field that modulates the cosmic ray intensity. To achieve the maximum performance from SPI it is essential to have a high quality model of this background that can predict its value to a fraction of a percent. In this poster we characterize the background and its variability, explore various models, and evaluate the accuracy of their predictions.

  1. General design method for three-dimensional potential flow fields. 1: Theory

    NASA Technical Reports Server (NTRS)

    Stanitz, J. D.

    1980-01-01

    A general design method was developed for steady, three dimensional, potential, incompressible or subsonic-compressible flow. In this design method, the flow field, including the shape of its boundary, was determined for arbitrarily specified, continuous distributions of velocity as a function of arc length along the boundary streamlines. The method applied to the design of both internal and external flow fields, including, in both cases, fields with planar symmetry. The analytic problems associated with stagnation points, closure of bodies in external flow fields, and prediction of turning angles in three dimensional ducts were reviewed.

  2. Norman Ramsey and the Separated Oscillatory Fields Method

    Science.gov Websites

    Norman Ramsey contributed many refinements of the molecular beam method for the study of atomic and molecular properties, and he invented the separated oscillatory fields method of atomic and molecular spectroscopy, which remains the practical basis for the most precise atomic clocks.

  3. Effects of a primordial magnetic field with log-normal distribution on the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    Yamazaki, Dai G.; Ichiki, Kiyotomo; Takahashi, Keitaro

    2011-12-01

    We study the effect of primordial magnetic fields (PMFs) on the anisotropies of the cosmic microwave background (CMB). We assume that the spectrum of PMFs is described by a log-normal distribution, which has a characteristic scale, rather than by a power-law spectrum. This scale is expected to reflect the generation mechanism, and our analysis is complementary to previous studies with power-law spectra. We calculate the power spectra of the energy density and Lorentz force of the log-normal PMFs, and then calculate the CMB temperature and polarization angular power spectra from the scalar, vector, and tensor modes of perturbations generated by such PMFs. By comparing these spectra with the WMAP7, QUaD, CBI, Boomerang, and ACBAR data sets, we find that the current CMB data place the strongest constraint at k ≃ 10^-2.5 Mpc^-1, with the upper limit B ≲ 3 nG.

  4. Landau problem with time dependent mass in time dependent electric and harmonic background fields

    NASA Astrophysics Data System (ADS)

    Lawson, Latévi M.; Avossevou, Gabriel Y. H.

    2018-04-01

    The spectrum of a Hamiltonian describing the dynamics of a Landau particle with time-dependent mass and frequency undergoing the influence of a uniform time-dependent electric field is obtained. The configuration space wave function of the model is expressed in terms of the generalised Laguerre polynomials. To diagonalize the time-dependent Hamiltonian, we employ the Lewis-Riesenfeld method of invariants. To this end, we introduce a unitary transformation in the framework of the algebraic formalism to construct the invariant operator of the system and then to obtain the exact solution of the Hamiltonian. We recover the solutions of the ordinary Landau problem in the absence of the electric and harmonic fields for a constant particle mass.

  5. A new method to reduce the statistical and systematic uncertainty of chance coincidence backgrounds measured with waveform digitizers

    DOE PAGES

    O'Donnell, John M.

    2015-06-30

    We present a new method for measuring chance-coincidence backgrounds during the collection of coincidence data. The method relies on acquiring data with near-zero dead time, which is now realistic due to the increasing deployment of flash electronic-digitizer (waveform digitizer) techniques. An experiment designed to use this new method is capable of acquiring more coincidence data, with a much reduced statistical fluctuation of the measured background. A statistical analysis is presented and used to derive a figure of merit for the new method. Factors of four improvement over other analyses are realistic. The technique is illustrated with preliminary data taken as part of a program to make new measurements of the prompt fission neutron spectra at the Los Alamos Neutron Science Center. It is expected that these measurements will occur in a regime where the maximum figure of merit will be exploited.
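
    The underlying idea can be sketched with event time stamps. With dead-time-free digitizer data, the chance background can be sampled in many time-shifted windows and averaged; the window width and offsets below are arbitrary illustrative values, not the experiment's settings:

    ```python
    import numpy as np

    def coincidence_counts(t_a, t_b, window, offsets):
        # Count A-B pairs with t_b - t_a in [off, off + window) for each
        # offset. offsets = [0.0] gives the prompt (true + chance)
        # coincidences; many non-zero offsets sample the chance background
        # repeatedly, so averaging them shrinks its statistical fluctuation
        # by roughly 1/sqrt(len(offsets)).
        t_a = np.asarray(t_a)
        t_b = np.sort(np.asarray(t_b))
        counts = []
        for off in offsets:
            lo = np.searchsorted(t_b, t_a + off)
            hi = np.searchsorted(t_b, t_a + off + window)
            counts.append(int(np.sum(hi - lo)))
        return counts
    ```

    The figure of merit improves because every recorded event contributes to many background windows, rather than to the single delayed window of a conventional analysis.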

  6. Wave field restoration using three-dimensional Fourier filtering method.

    PubMed

    Kawasaki, T; Takai, Y; Ikuta, T; Shimizu, R

    2001-11-01

    A wave field restoration method in transmission electron microscopy (TEM) was mathematically derived based on a three-dimensional (3D) image formation theory. Wave field restoration using this method together with spherical aberration correction was experimentally confirmed in through-focus images of amorphous tungsten thin film, and the resolution of the reconstructed phase image was successfully improved from the Scherzer resolution limit to the information limit. In an application of this method to a crystalline sample, the surface structure of Au(110) was observed in a profile-imaging mode. The processed phase image showed quantitatively the atomic relaxation of the topmost layer.

  7. A field day of soil regulation methods

    NASA Astrophysics Data System (ADS)

    Kempter, Axel; Kempter, Carmen

    2015-04-01

    The subject Soil plays an important role in the school subject geography. In particular in the upper classes it is expected that the knowledge from the area of Soil can be also be applied in other subjects. Thus, e.g., an assessment of economy and agricultural development and developing potential requires the interweaving of natural- geographic and human-geographic factors. The treatment of the subject Soil requires the desegregation of the results of different fields like Physics, Chemistry and Biology. Accordingly the subject gives cause to professional-covering lessons and offers the opportunity for practical work as well as excursions. Beside the mediation of specialist knowledge and with the support of the methods and action competences, the independent learning and the practical work should have a special emphasis on the field excursion by using stimulating exercises oriented to solving problems and mastering the methods. This aim should be achieved by the interdisciplinary treatment of the subject Soil in the task-oriented learning process on the field day. The methods and experiments should be sensibly selected for both the temporal and material supply constraints. During the field day the pupils had to categorize soil texture, soil colour, soil profile, soil skeleton, lime content, ion exchanger (Soils filter materials), pH-Value, water retention capacity and evidence of different ions like e.g. Fe3+, Mg2+, Cl- and NO3-. The pupils worked on stations and evaluated the data to receive a general view of the ground at the end. According to numbers of locations, amount of time and group size, different procedures can be offered. There are groups of experts who carry out the same experiment at all locations and split for the evaluation in different groups or each group ran through all stations. The results were compared and discussed at the end.

  8. Chandra ACIS-I particle background: an analytical model

    NASA Astrophysics Data System (ADS)

    Bartalucci, I.; Mazzotta, P.; Bourdin, H.; Vikhlinin, A.

    2014-06-01

    Aims: Imaging and spectroscopy of X-ray extended sources require a proper characterisation of a spatially unresolved background signal. This background includes sky and instrumental components, each of which are characterised by its proper spatial and spectral behaviour. While the X-ray sky background has been extensively studied in previous work, here we analyse and model the instrumental background of the ACIS-I detector on board the Chandra X-ray observatory in very faint mode. Methods: Caused by interaction of highly energetic particles with the detector, the ACIS-I instrumental background is spectrally characterised by the superimposition of several fluorescence emission lines onto a continuum. To isolate its flux from any sky component, we fitted an analytical model of the continuum to observations performed in very faint mode with the detector in the stowed position shielded from the sky, and gathered over the eight-year period starting in 2001. The remaining emission lines were fitted to blank-sky observations of the same period. We found 11 emission lines. Analysing the spatial variation of the amplitude, energy and width of these lines has further allowed us to infer that three of these lines are presumably due to an energy correction artefact produced in the frame store. Results: We provide an analytical model that predicts the instrumental background with a precision of 2% in the continuum and 5% in the lines. We use this model to measure the flux of the unresolved cosmic X-ray background in the Chandra deep field south. We obtain a flux of 10.2 (+0.5/-0.4) × 10^-13 erg cm^-2 deg^-2 s^-1 for the [1-2] keV band and (3.8 ± 0.2) × 10^-12 erg cm^-2 deg^-2 s^-1 for the [2-8] keV band.
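
    A greatly simplified sketch of such a fit: a linear continuum plus Gaussian fluorescence lines at known energies, solved as a linear least-squares problem. The actual model, continuum shape, and line list in the paper differ; the line energy and width below are made up for illustration:

    ```python
    import numpy as np

    def fit_background(E, counts, line_E, line_sigma):
        # Fit a linear continuum plus Gaussian fluorescence lines at known
        # energies; with line positions and widths held fixed, the continuum
        # coefficients and line amplitudes form a linear least-squares problem.
        cols = [np.ones_like(E), E]
        for E0, s in zip(line_E, line_sigma):
            cols.append(np.exp(-0.5 * ((E - E0) / s) ** 2))
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, counts, rcond=None)
        return coef  # [c0, c1, amp_1, ..., amp_n]
    ```

    Keeping the line parameters fixed avoids nonlinear fitting entirely; only the amplitudes and continuum slope are estimated from the stowed or blank-sky spectra.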

  9. Field-gradient partitioning for fracture and frictional contact in the material point method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Homel, Michael A.; Herbold, Eric B.

    Contact and fracture in the material point method require grid-scale enrichment or partitioning of material into distinct velocity fields to allow for displacement or velocity discontinuities at a material interface. We present a new method in which a kernel-based damage field is constructed from the particle data. The gradient of this field is used to dynamically repartition the material into contact pairs at each node. Our approach avoids the need to construct and evolve explicit cracks or contact surfaces and is therefore well suited to problems involving complex 3-D fracture with crack branching and coalescence. A straightforward extension of this approach permits frictional ‘self-contact’ between surfaces that are initially part of a single velocity field, enabling more accurate simulation of granular flow, porous compaction, fragmentation, and comminution of brittle materials. Finally, numerical simulations of self-contact and dynamic crack propagation are presented to demonstrate the accuracy of the approach.

  10. Field-gradient partitioning for fracture and frictional contact in the material point method

    DOE PAGES

    Homel, Michael A.; Herbold, Eric B.

    2016-08-15

    Contact and fracture in the material point method require grid-scale enrichment or partitioning of material into distinct velocity fields to allow for displacement or velocity discontinuities at a material interface. We present a new method in which a kernel-based damage field is constructed from the particle data. The gradient of this field is used to dynamically repartition the material into contact pairs at each node. Our approach avoids the need to construct and evolve explicit cracks or contact surfaces and is therefore well suited to problems involving complex 3-D fracture with crack branching and coalescence. A straightforward extension of this approach permits frictional ‘self-contact’ between surfaces that are initially part of a single velocity field, enabling more accurate simulation of granular flow, porous compaction, fragmentation, and comminution of brittle materials. Finally, numerical simulations of self-contact and dynamic crack propagation are presented to demonstrate the accuracy of the approach.
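
    The repartitioning step can be sketched in one dimension. The hat-function kernel and finite-difference gradient below are illustrative simplifications; an actual MPM implementation would use its grid shape functions and analytic gradients:

    ```python
    import numpy as np

    def partition_by_damage_gradient(x_p, d_p, x_node, h):
        # Build a kernel-weighted nodal damage field from particle damage
        # values d_p, then partition the particles near the node into a
        # contact pair by the sign of their position along the local
        # damage-field gradient (1-D sketch).
        w = np.maximum(0.0, 1.0 - np.abs(x_p - x_node) / h)  # hat kernel
        if w.sum() == 0.0:
            return np.zeros_like(x_p, dtype=int)  # no particles in range

        def field(x):
            ww = np.maximum(0.0, 1.0 - np.abs(x_p - x) / h)
            return (ww * d_p).sum() / max(ww.sum(), 1e-12)

        # finite-difference estimate of the damage-field gradient at the node
        eps = 1e-3 * h
        grad = (field(x_node + eps) - field(x_node - eps)) / (2 * eps)
        side = np.sign((x_p - x_node) * grad)
        return side.astype(int)  # -1 / +1 velocity fields, 0 on the surface
    ```

    Because the partition is recomputed from the damage field at every step, no explicit crack surface has to be stored or evolved, which is the point of the field-gradient approach.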

  11. Perspective: Ab initio force field methods derived from quantum mechanics

    NASA Astrophysics Data System (ADS)

    Xu, Peng; Guidez, Emilie B.; Bertoni, Colleen; Gordon, Mark S.

    2018-03-01

    It is often desirable to accurately and efficiently model the behavior of large molecular systems in the condensed phase (thousands to tens of thousands of atoms) over long time scales (from nanoseconds to milliseconds). In these cases, ab initio methods are difficult due to the increasing computational cost with the number of electrons. A more computationally attractive alternative is to perform the simulations at the atomic level using a parameterized function to model the electronic energy. Many empirical force fields have been developed for this purpose. However, the functions that are used to model interatomic and intermolecular interactions contain many fitted parameters obtained from selected model systems, and such classical force fields cannot properly simulate important electronic effects. Furthermore, while such force fields are computationally affordable, they are not reliable when applied to systems that differ significantly from those used in their parameterization. They also cannot provide the information necessary to analyze the interactions that occur in the system, making the systematic improvement of the functional forms that are used difficult. Ab initio force field methods aim to combine the merits of both types of methods. The ideal ab initio force fields are built on first principles and require no fitted parameters. Ab initio force field methods surveyed in this perspective are based on fragmentation approaches and intermolecular perturbation theory. This perspective summarizes their theoretical foundation, key components in their formulation, and discusses key aspects of these methods such as accuracy and formal computational cost. The ab initio force fields considered here were developed for different targets, and this perspective also aims to provide a balanced presentation of their strengths and shortcomings. Finally, this perspective suggests some future directions for this actively developing area.

  12. Field camp: Using traditional methods to train the next generation of petroleum geologists

    USGS Publications Warehouse

    Puckette, J.O.; Suneson, N.H.

    2009-01-01

    The summer field camp experience provides many students with their best opportunity to learn the scientific process by making observations and collecting, recording, evaluating, and interpreting geologic data. Field school projects enhance student professional development by requiring cooperation and interpersonal interaction, report writing to communicate interpretations, and the development of project management skills to achieve a common goal. The field school setting provides students with the opportunity to observe geologic features and their spatial distribution, size, and shape, which will impact the students' future careers as geoscientists. The Les Huston Geology Field Camp (a.k.a. Oklahoma Geology Camp) near Cañon City, Colorado, focuses on time-tested traditional methods of geological mapping and fieldwork to accomplish these goals. The curriculum consists of an introduction to field techniques (pacing, orienteering, measuring strike and dip, and using a Jacob's staff), sketching outcrops, section measuring (one exercise illustrating facies changes), three mapping exercises (of increasing complexity), and a field geophysics project. Accurate rock and contact descriptions are emphasized, and attitudes and contacts are mapped in the field. Mapping is done on topographic maps at 1:12,000 and 1:6,000 scales; air photos are provided. Global positioning system (GPS)-assisted mapping is allowed, but we insist that locations be recorded in the field and confirmed using visual observations. The course includes field trips to the Cripple Creek and Leadville mining districts, the Florissant/Guffey volcano area, the Pikes Peak batholith, and the Denver Basin. Each field trip is designed to emphasize aspects of geology that are not stressed in the field exercises. Students are strongly encouraged to accurately describe geologic features and gather evidence to support their interpretations of the geologic history. Concise reports are a part of each major exercise. Students are grouped

  13. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to form the foundation of the regression. Furthermore, a manifold regularization term, which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is applied to the robust background and potential anomaly datasets to produce the detection output. The experimental results show that RBRSE achieves superior ROC curves, AUC values, and background-anomaly separation compared with other state-of-the-art anomaly detection methods, and is easy to implement.
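
    The final scoring stage can be sketched as a k-nn distance score against the background set. This is a strong simplification: the kernel density features, manifold regularization, and the RBR label assignment itself are omitted, and the function name is illustrative:

    ```python
    import numpy as np

    def knn_anomaly_score(x, background, k=5):
        # Anomaly score of a pixel spectrum x: mean Euclidean distance to
        # its k nearest neighbours in the robust-background dataset; pixels
        # far from all background samples receive large scores.
        d = np.linalg.norm(background - x, axis=1)
        return float(np.sort(d)[:k].mean())
    ```

    Scoring against only the robust background set, rather than all pixels, is what protects the detector from contamination by the potential anomalies themselves.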

  14. Various background pattern-effect on saccadic suppression.

    PubMed

    Mitrani, L; Radil-Weiss, T; Yakimoff, N; Mateeff, S; Bozkov, V

    1975-09-01

    It has been shown that saccadic suppression is a phenomenon closely related to the presence of contours and structures in the visual field. Experiments were performed to clarify whether a structured background influences the pattern of attention distribution (making stimulus detection more difficult) or whether the elevation of the visual threshold is due to the "masking" effect of the moving background image over the retina. Two types of backgrounds were therefore used: those with symbolic meaning, in the processing of which "psychological" mechanisms are presumably involved, such as reproductions of famous paintings and photographs of nudes; and those lacking semantic significance, such as computer-generated figures composed of randomly distributed black and white squares with different grain, expressed as the entropy of the pattern. The results show that saccadic suppression is primarily a consequence of peripheral mechanisms, probably of lateral inhibition in the visual field occurring in the presence of edges moving over the retina. Psychological factors can be excluded as being fundamental for saccadic suppression.

  15. Optimization of advanced Wiener estimation methods for Raman reconstruction from narrow-band measurements in the presence of fluorescence background

    PubMed Central

    Chen, Shuo; Ong, Yi Hong; Lin, Xiaoqian; Liu, Quan

    2015-01-01

    Raman spectroscopy has shown great potential in biomedical applications. However, intrinsically weak Raman signals cause slow data acquisition, especially in Raman imaging. This problem can be overcome by narrow-band Raman imaging followed by spectral reconstruction. Our previous study has shown that Raman spectra free of fluorescence background can be reconstructed from narrow-band Raman measurements using traditional Wiener estimation. However, fluorescence-free Raman spectra are only available from sophisticated Raman setups capable of fluorescence suppression. The reconstruction of Raman spectra with fluorescence background from narrow-band measurements is much more challenging due to the significant variation in fluorescence background. In this study, two advanced Wiener estimation methods, i.e. modified Wiener estimation and sequential weighted Wiener estimation, were optimized to achieve this goal. Both spontaneous Raman spectra and surface-enhanced Raman spectroscopy (SERS) spectra were evaluated. Compared with traditional Wiener estimation, the two advanced methods showed significant improvement in the reconstruction of spontaneous Raman spectra. For SERS spectra, however, traditional Wiener estimation can work as effectively as the advanced methods but much faster. The wise selection of these methods would enable accurate Raman reconstruction in a simple Raman setup without fluorescence suppression for fast Raman imaging. PMID:26203387
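
    Traditional Wiener estimation, the baseline on which the two advanced methods build, can be sketched from training pairs of full spectra and narrow-band measurements. The estimator below uses sample second moments; the dimensions and the synthetic data in the usage are illustrative assumptions:

    ```python
    import numpy as np

    def wiener_matrix(S_train, M_train):
        # Traditional Wiener estimation: rows of S_train are full training
        # spectra, rows of M_train the corresponding narrow-band
        # measurements. W = C_sm C_mm^-1 minimises the expected squared
        # reconstruction error over the training ensemble.
        Csm = S_train.T @ M_train
        Cmm = M_train.T @ M_train
        return Csm @ np.linalg.pinv(Cmm)

    def reconstruct(W, m):
        # Estimate a full spectrum from one narrow-band measurement vector
        return W @ m
    ```

    When the measurements are linear in the spectra and the spectra lie in a low-dimensional subspace spanned by the training set, this linear estimator recovers them exactly; real fluorescence variation breaks that assumption, which is what motivates the modified and sequential weighted variants.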

  16. A statistical, task-based evaluation method for three-dimensional x-ray breast imaging systems using variable-background phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Subok; Jennings, Robert; Liu Haimo

    Purpose: For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, in both academia and industry. However, there is still much room for understanding how to best optimize and evaluate these devices over a large space of many different system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information from 3D imaging systems. Therefore, it is critical to develop a statistically sound evaluation method to investigate the usefulness of including depth and background-variability information in the assessment and optimization of 3D systems. Methods: In this paper, we present a mathematical framework for a statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular, making use of the ideal linear observer called the Hotelling observer. We also present a physical phantom that consists of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method by comparing laboratory mammography and three-angle DBT systems for signal detection tasks using the phantom's projection data. We compare the variable phantom case to that of a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods are different from each other for both mammography and DBT systems. Conclusions: Our results indicate that measuring the system's detection performance with consideration of background variability may lead to differences in system

  17. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
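
    The measurement principle reduces to dividing each pairwise voltage difference by the known antenna separation. As a sketch, with hypothetical readings from three equally spaced antennas (all numbers below are made up for illustration):

    ```python
    # Hypothetical readings: voltages (V) at three antennas spaced 0.5 m apart
    voltages = [1.2, 3.7, 6.1]
    spacing = 0.5  # m, the known separation between adjacent antennas

    # Each adjacent antenna pair yields one quantitative field sample,
    # E ~ delta-V / delta-d, in V/m along the array direction
    field_samples = [(b - a) / spacing for a, b in zip(voltages, voltages[1:])]
    # field_samples is approximately [5.0, 4.8] V/m
    ```

    Extending the array in more dimensions gives field components along each axis, which is how the method builds up a quantitative description of the field over the region.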

  18. Background element content of the lichen Pseudevernia furfuracea: A supra-national state of art implemented by novel field data from Italy.

    PubMed

    Cecconi, Elva; Incerti, Guido; Capozzi, Fiore; Adamo, Paola; Bargagli, Roberto; Benesperi, Renato; Candotto Carniel, Fabio; Favero-Longo, Sergio Enrico; Giordano, Simonetta; Puntillo, Domenico; Ravera, Sonia; Spagnuolo, Valeria; Tretiach, Mauro

    2018-05-01

    In biomonitoring, the knowledge of background element content (BEC) values is an essential pre-requisite for the correct assessment of pollution levels. Here, we estimated the BEC values of a highly performing biomonitor, the epiphytic lichen Pseudevernia furfuracea, by means of a careful review of literature data, integrated with an extensive field survey. Methodologically homogeneous element content datasets, reflecting different exposure conditions across European and extra-European countries, were compiled and comparatively analysed. Element content in samples collected in remote areas was compared to that of potentially enriched samples, testing differences between medians for 25 elements. This analysis confirmed that the former samples were substantially unaffected by anthropogenic contributions, and their metrics were therefore proposed as a first overview at supra-national background level. We also showed that bioaccumulation studies suffer from huge methodological variability. Limited to original field data, we investigated the background variability of 43 elements in 62 remote Italian sites, characterized in GIS environment for anthropization, land use, climate and lithology at different scale resolution. The relationships between selected environmental descriptors and BEC were tested using Principal Component Regression (PCR) modelling. Elemental composition proved to be significantly dependent on land use, climate and lithology. In the case of lithogenic elements, regression models correctly reproduced the lichen content throughout the country at randomly selected sites. Further descriptors should be identified only for As, Co, and V. Through a multivariate approach we also identified three geographically homogeneous macro-regions for which specific BECs were provided for use as reference in biomonitoring applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Multigrid Methods for the Computation of Propagators in Gauge Fields

    NASA Astrophysics Data System (ADS)

    Kalkreuter, Thomas

    Multigrid methods were invented for the solution of discretized partial differential equations in order to overcome the slowness of traditional algorithms by updates on various length scales. In the present work generalizations of multigrid methods for propagators in gauge fields are investigated. Gauge fields are incorporated in algorithms in a covariant way. The kernel C of the restriction operator which averages from one grid to the next coarser grid is defined by projection on the ground-state of a local Hamiltonian. The idea behind this definition is that the appropriate notion of smoothness depends on the dynamics. The ground-state projection choice of C can be used in arbitrary dimension and for arbitrary gauge group. We discuss proper averaging operations for bosons and for staggered fermions. The kernels C can also be used in multigrid Monte Carlo simulations, and for the definition of block spins and blocked gauge fields in Monte Carlo renormalization group studies. Actual numerical computations are performed in four-dimensional SU(2) gauge fields. We prove that our proposals for block spins are “good”, using renormalization group arguments. A central result is that the multigrid method works in arbitrarily disordered gauge fields, in principle. It is proved that computations of propagators in gauge fields without critical slowing down are possible when one uses an ideal interpolation kernel. Unfortunately, the idealized algorithm is not practical, but it was important to answer questions of principle. Practical methods are able to outperform the conjugate gradient algorithm in the case of bosons. The case of staggered fermions is harder. Multigrid methods give considerable speed-ups compared to conventional relaxation algorithms, but on lattices up to 18^4 conjugate gradient is superior.

  20. Gamma-Ray Background Variability in Mobile Detectors

    NASA Astrophysics Data System (ADS)

    Aucott, Timothy John

    Gamma-ray background radiation significantly reduces detection sensitivity when searching for radioactive sources in the field, such as in wide-area searches for homeland security applications. Mobile detector systems in particular must contend with a variable background that is not necessarily known or even measurable a priori. This work will present measurements of the spatial and temporal variability of the background, with the goal of merging gamma-ray detection, spectroscopy, and imaging with contextual information--a "nuclear street view" of the ubiquitous background radiation. The gamma-ray background originates from a variety of sources, both natural and anthropogenic. The dominant sources in the field are the primordial isotopes potassium-40, uranium-238, and thorium-232, as well as their decay daughters. In addition to the natural background, many artificially-created isotopes are used for industrial or medical purposes, and contamination from fission products can be found in many environments. Regardless of origin, these backgrounds will reduce detection sensitivity by adding both statistical as well as systematic uncertainty. In particular, large detector arrays will be limited by the systematic uncertainty in the background and will suffer from a high rate of false alarms. The goal of this work is to provide a comprehensive characterization of the gamma-ray background and its variability in order to improve detection sensitivity and evaluate the performance of mobile detectors in the field. Large quantities of data are measured in order to study their performance at very low false alarm rates. Two different approaches, spectroscopy and imaging, are compared in a controlled study in the presence of this measured background. Furthermore, there is additional information that can be gained by correlating the gamma-ray data with contextual data streams (such as cameras and global positioning systems) in order to reduce the variability in the background

  1. Adaptive-Grid Methods for Phase Field Models of Microstructure Development

    NASA Technical Reports Server (NTRS)

    Provatas, Nikolas; Goldenfeld, Nigel; Dantzig, Jonathan A.

    1999-01-01

    In this work the authors show how the phase field model can be solved in a computationally efficient manner that opens a new large-scale simulational window on solidification physics. Our method uses a finite element, adaptive-grid formulation, and exploits the fact that the phase and temperature fields vary significantly only near the interface. We illustrate how our method allows efficient simulation of phase-field models in very large systems, and verify the predictions of solvability theory at intermediate undercooling. We then present new results at low undercoolings that suggest that solvability theory may not give the correct tip speed in that regime. We model solidification using the phase-field model used by Karma and Rappel.

  2. A regularization method for extrapolation of solar potential magnetic fields

    NASA Technical Reports Server (NTRS)

    Gary, G. A.; Musielak, Z. E.

    1992-01-01

    The mathematical basis of a Tikhonov regularization method for extrapolating the chromospheric-coronal magnetic field using photospheric vector magnetograms is discussed. The basic techniques show that the Cauchy initial value problem can be formulated for potential magnetic fields. The potential field analysis considers a set of linear, elliptic partial differential equations. It is found that, by introducing an appropriate smoothing of the initial data of the Cauchy potential problem, an approximate Fourier integral solution is found, and an upper bound to the error in the solution is derived. This specific regularization technique, which is a function of magnetograph measurement sensitivities, provides a method to extrapolate the potential magnetic field above an active region into the chromosphere and low corona.
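The general idea, smoothing the boundary data so that the exponentially growing Fourier modes of the continuation stay bounded, can be sketched in one dimension. This is our simplified illustration of Tikhonov-style regularized continuation, not the paper's exact formulation:

```python
import numpy as np

# Sketch (our assumption, not the paper's scheme): 1-D Fourier continuation of a
# boundary field to height z, with Gaussian smoothing of the initial data acting
# as the regularizer that tames the exponentially growing modes.
n, L, z, sigma = 256, 1.0, 0.05, 0.02
x = np.linspace(0, L, n, endpoint=False)
b0 = np.exp(-((x - 0.5) / 0.1) ** 2)          # synthetic magnetogram trace

k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)    # wavenumbers
B0 = np.fft.fft(b0)
smooth = np.exp(-0.5 * (k * sigma) ** 2)      # Gaussian low-pass (the regularizer)
Bz = B0 * smooth * np.exp(np.abs(k) * z)      # continuation; growth damped by the filter
bz = np.fft.ifft(Bz).real
print(f"max |field| at z={z}: {bz.max():.3f}")
```

Without the `smooth` factor the `exp(|k| z)` term amplifies high-wavenumber measurement noise without bound, which is the ill-posedness the regularization addresses; the smoothing width `sigma` plays the role of the error bound tied to magnetograph sensitivity.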

  3. Studying extragalactic background fluctuations with the Cosmic Infrared Background ExpeRiment 2 (CIBER-2)

    NASA Astrophysics Data System (ADS)

    Lanz, Alicia; Arai, Toshiaki; Battle, John; Bock, James; Cooray, Asantha; Hristov, Viktor; Korngut, Phillip; Lee, Dae Hee; Mason, Peter; Matsumoto, Toshio; Matsuura, Shuji; Morford, Tracy; Onishi, Yosuke; Shirahata, Mai; Tsumura, Kohji; Wada, Takehiko; Zemcov, Michael

    2014-08-01

    Fluctuations in the extragalactic background light trace emission from the history of galaxy formation, including the emission from the earliest sources from the epoch of reionization. A number of recent near-infrared measurements show excess spatial power at large angular scales inconsistent with models of z < 5 emission from galaxies. These measurements have been interpreted as arising from either redshifted stellar and quasar emission from the epoch of reionization, or the combined intra-halo light from stars thrown out of galaxies during merging activity at lower redshifts. Though astrophysically distinct, both interpretations arise from faint, low surface brightness source populations that are difficult to detect except by statistical approaches using careful observations with suitable instruments. The key to determining the source of these background anisotropies will be wide-field imaging measurements spanning multiple bands from the optical to the near-infrared. The Cosmic Infrared Background ExpeRiment 2 (CIBER-2) will measure spatial anisotropies in the extragalactic infrared background caused by cosmological structure using six broad spectral bands. The experiment uses three 2048 x 2048 Hawaii-2RG near-infrared arrays in three cameras coupled to a single 28.5 cm telescope housed in a reusable sounding rocket-borne payload. A small portion of each array will also be combined with a linear-variable filter to make absolute measurements of the spectrum of the extragalactic background with high spatial resolution for deep subtraction of Galactic starlight. The large field of view and multiple spectral bands make CIBER-2 unique in its sensitivity to fluctuations predicted by models of lower limits on the luminosity of the first stars and galaxies and in its ability to distinguish between primordial and foreground anisotropies. In this paper the scientific motivation for CIBER-2 and details of its first flight instrumentation will be discussed, including

  4. Topology of microwave background fluctuations - Theory

    NASA Technical Reports Server (NTRS)

    Gott, J. Richard, III; Park, Changbom; Bies, William E.; Bennett, David P.; Juszkiewicz, Roman

    1990-01-01

    Topological measures are used to characterize the microwave background temperature fluctuations produced by 'standard' scenarios (Gaussian) and by cosmic strings (non-Gaussian). Three topological quantities are studied: the total area of the excursion regions, and the total length and total curvature (genus) of the isotemperature contours. These are computed for simulated Gaussian microwave background anisotropy maps and then compared with those of the non-Gaussian anisotropy pattern produced by cosmic strings. In general, the temperature gradient field shows the non-Gaussian behavior of the string map more distinctively than the temperature field for all topology measures. The total contour length and the genus are found to be more sensitive to the existence of a stringy pattern than the usual temperature histogram. Situations in which instrumental noise is superposed on the map are considered, to find the critical signal-to-noise ratio at which strings can be detected.

  5. Method of electric field flow fractionation wherein the polarity of the electric field is periodically reversed

    DOEpatents

    Stevens, Fred J.

    1992-01-01

    A novel method of electric field flow fractionation for separating solute molecules from a carrier solution is disclosed. The method of the invention utilizes an electric field that is periodically reversed in polarity, in a time-dependent, wave-like manner. The parameters of the waveform, including amplitude, frequency and wave shape may be varied to optimize separation of solute species. The waveform may further include discontinuities to enhance separation.
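A drive of this kind is straightforward to parameterize. The sketch below generates a periodically polarity-reversed field value with adjustable amplitude, frequency, and wave shape; the function and parameter names are illustrative, not taken from the patent:

```python
import math

def reversal_waveform(t, amplitude=1.0, frequency=2.0, shape="square"):
    """Field value at time t (seconds) for a periodically polarity-reversed drive.
    Names and defaults are illustrative assumptions, not from the patent."""
    phase = math.sin(2 * math.pi * frequency * t)
    if shape == "square":
        return amplitude * (1 if phase >= 0 else -1)  # hard polarity reversal
    return amplitude * phase                          # sinusoidal alternative

samples = [reversal_waveform(t / 100) for t in range(100)]
print(min(samples), max(samples))  # polarity swings between -amplitude and +amplitude
```

Varying `amplitude`, `frequency`, and `shape` mirrors the abstract's point that the waveform parameters can be tuned to optimize separation of solute species.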

  6. Towards standardization of 18F-FET PET imaging: do we need a consistent method of background activity assessment?

    PubMed

    Unterrainer, Marcus; Vettermann, Franziska; Brendel, Matthias; Holzgreve, Adrien; Lifschitz, Michael; Zähringer, Matthias; Suchorska, Bogdana; Wenter, Vera; Illigens, Ben M; Bartenstein, Peter; Albert, Nathalie L

    2017-12-01

    PET with O-(2-18F-fluoroethyl)-L-tyrosine (18F-FET) has reached increasing clinical significance for patients with brain neoplasms. For quantification of standard PET-derived parameters such as the tumor-to-background ratio, the background activity is assessed using a region of interest (ROI) or volume of interest (VOI) in unaffected brain tissue. However, there is no standardized approach regarding the assessment of the background reference. Therefore, we evaluated the intra- and inter-reader variability of commonly applied approaches for clinical 18F-FET PET reading. The background activity of 20 18F-FET PET scans was independently evaluated by 6 readers using a (i) simple 2D-ROI, (ii) spherical VOI with 3.0 cm diameter, and (iii) VOI consisting of crescent-shaped ROIs; each in the contralateral, non-affected hemisphere including white and gray matter in line with the European Association of Nuclear Medicine (EANM) and German guidelines. To assess intra-reader variability, each scan was evaluated 10 times by each reader. The coefficient of variation (CoV) was assessed for determination of intra- and inter-reader variability. In a second step, the best method was refined by instructions for a guided background activity assessment and validated by 10 further scans. Compared to the other approaches, the crescent-shaped VOIs revealed the most stable results with the lowest intra-reader variabilities (median CoV 1.52%, spherical VOI 4.20%, 2D-ROI 3.69%; p < 0.001) and inter-reader variabilities (median CoV 2.14%, spherical VOI 4.02%, 2D-ROI 3.83%; p = 0.001). Using the guided background assessment, both intra-reader variabilities (median CoV 1.10%) and inter-reader variabilities (median CoV 1.19%) could be reduced even more. The commonly applied methods for background activity assessment show different variability which might hamper 18F-FET PET quantification and comparability in multicenter settings. The proposed background activity assessment using a

  7. Dim target trajectory-associated detection in bright earth limb background

    NASA Astrophysics Data System (ADS)

    Chen, Penghui; Xu, Xiaojian; He, Xiaoyu; Jiang, Yuesong

    2015-09-01

    The intense emission of the earth limb in the field of view of a sensor contributes strongly to the observation images. Due to the low signal-to-noise ratio (SNR), it is challenging to detect small targets against the earth-limb background, especially point-like targets in a single frame. To improve target detection, track-before-detect (TBD) based on the frame sequence is performed. In this paper, a new technique is proposed to determine target-associated trajectories, which jointly carries out background removal, maximum value projection (MVP), and the Hough transform. The background of the bright earth limb in the observation images is removed according to its profile characteristics. For a moving target, the corresponding pixels in the MVP image shift approximately regularly in the time sequence, and the target trajectory is determined by the Hough transform according to the differing pixel characteristics of the target versus the clutter and noise. Compared with traditional frame-by-frame methods, determining associated trajectories from the MVP reduces the computational load. Numerical simulations are presented to demonstrate the effectiveness of the proposed approach.

  8. mHealth Series: mHealth project in Zhao County, rural China – Description of objectives, field site and methods

    PubMed Central

    van Velthoven, Michelle Helena; Li, Ye; Wang, Wei; Du, Xiaozhen; Wu, Qiong; Chen, Li; Majeed, Azeem; Rudan, Igor; Zhang, Yanfeng; Car, Josip

    2013-01-01

    Background We set up a collaboration between researchers in China and the UK that aimed to explore the use of mHealth in China. This is the first paper in a series of papers on a large mHealth project that is part of this collaboration. This paper includes the aims and objectives of the mHealth project, our field site, and the detailed methods of two studies. Field site The field site for this mHealth project was Zhao County, which lies 280 km south of Beijing in Hebei Province, China. Methods We describe the methodology of two studies: (i) a mixed methods study exploring factors influencing sample size calculations for mHealth-based health surveys and (ii) a cross-over study determining the validity of an mHealth text messaging data collection tool. The first study used mixed methods, both quantitative and qualitative, including: (i) two surveys with caregivers of young children, (ii) interviews with caregivers, village doctors and participants of the cross-over study, and (iii) researchers' views. We combined data from caregivers, village doctors and researchers to provide an in-depth understanding of factors influencing sample size calculations for mHealth-based health surveys. The second study used a randomised cross-over design to compare the traditional face-to-face survey method to the new text messaging survey method. We assessed data equivalence (intrarater agreement), the amount of information in responses, reasons for giving different responses, the response rate, characteristics of non-responders, and the error rate. Conclusions This paper described the objectives, field site and methods of a large mHealth project that is part of a collaboration between researchers in China and the UK. The mixed methods study evaluating factors that influence sample size calculations could help future studies with estimating reliable sample sizes. The cross-over study comparing face-to-face and text message survey data collection

  9. Effect of background dielectric on TE-polarized photonic bandgap of metallodielectric photonic crystals using Dirichlet-to-Neumann map method.

    PubMed

    Sedghi, Aliasghar; Rezaei, Behrooz

    2016-11-20

    Using the Dirichlet-to-Neumann map method, we have calculated the photonic band structure of two-dimensional metallodielectric photonic crystals having square and triangular lattices of circular metal rods in a dielectric background. We selected the transverse electric mode of electromagnetic waves, and the resulting band structures showed the existence of a photonic bandgap in these structures. We then theoretically studied the effect of the background dielectric on the photonic bandgap.

  10. Teaching about Natural Background Radiation

    ERIC Educational Resources Information Center

    Al-Azmi, Darwish; Karunakara, N.; Mustapha, Amidu O.

    2013-01-01

    Ambient gamma dose rates in air were measured at different locations (indoors and outdoors) to demonstrate the ubiquitous nature of natural background radiation in the environment and to show that levels vary from one location to another, depending on the underlying geology. The effect of a lead shield on a gamma radiation field was also…

  11. Detecting background changes in environments with dynamic foreground by separating probability distribution function mixtures using Pearson's method of moments

    NASA Astrophysics Data System (ADS)

    Jenkins, Colleen; Jordan, Jay; Carlson, Jeff

    2007-02-01

    This paper presents parameter estimation techniques useful for detecting background changes in a video sequence with extreme foreground activity. A specific application of interest is automated detection of the covert placement of threats (e.g., a briefcase bomb) inside crowded public facilities. We propose that a histogram of pixel intensity acquired from a fixed-mounted camera over time for a series of images will be a mixture of two Gaussian functions: the foreground probability distribution function and the background probability distribution function. We use Pearson's Method of Moments to separate the two probability distribution functions. The background function can then be "remembered" and changes in the background can be detected. Subsequent comparisons of background estimates are used to detect changes. Changes are flagged to alert security forces to the presence and location of potential threats. Results are presented that indicate the significant potential of robust parameter estimation techniques as applied to video surveillance.
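Pearson's full method of moments for a general two-Gaussian mixture requires solving a ninth-degree polynomial in the higher sample moments. As a heavily simplified sketch of the moment-matching idea (not Pearson's full method), assume equal mixing weights and a known common standard deviation; the two component means then follow from the sample mean and variance alone:

```python
import random
import statistics
import math

# Simplified moment-based split of a two-Gaussian mixture: equal weights and a
# known common sigma are assumed so that var = sigma^2 + ((mu2 - mu1)/2)^2.
random.seed(1)
sigma = 5.0
fg = [random.gauss(60, sigma) for _ in range(5000)]   # "foreground" intensities
bg = [random.gauss(100, sigma) for _ in range(5000)]  # "background" intensities
mix = fg + bg

mu = statistics.fmean(mix)
var = statistics.pvariance(mix)
half_sep = math.sqrt(max(var - sigma**2, 0.0))  # half the separation of the means
mu1, mu2 = mu - half_sep, mu + half_sep
print(round(mu1), round(mu2))  # close to the true component means, 60 and 100
```

Relaxing the equal-weight and known-sigma assumptions is what forces the higher moments (skewness, kurtosis) and the nonic polynomial of Pearson's original treatment.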

  12. Improved background suppression in 1H MAS NMR using composite pulses

    NASA Astrophysics Data System (ADS)

    Odedra, Smita; Wimperis, Stephen

    2012-08-01

    A well known feature of 1H MAS NMR spectroscopy, particularly of solids where the concentration of 1H nuclei is low, is the presence in the spectrum of a significant broad "background" signal arising from 1H nuclei that are outside the MAS rotor and radiofrequency coil, probably located on the surfaces of the static components of the probehead. A popular method of suppressing this unwanted signal is the "depth pulse" method, consisting of a 90° pulse followed by one or two 180° pulses that are phase cycled according to the "Exorcycle" scheme, which removes signal associated with imperfect 180° pulses. Consequently, only spins in the centre of the radiofrequency coil contribute to the 1H MAS spectrum, while those experiencing a low B1 field outside the coil are suppressed. Although very effective at removing background signal from the spectrum, one drawback with this approach is that significant loss of the desired signal from the sample also occurs. Here we investigate the 1H background suppression problem and, in particular, the use of novel antisymmetric passband composite pulses to replace the simple pulses in a depth pulse experiment. We show that it is possible to improve the intensity of the 1H signals of interest while still maintaining effective background suppression. We expect that these results will be relevant to 1H MAS NMR studies of, for example, nominally perdeuterated biological samples or nominally anhydrous inorganic materials.

  13. COSMIC INFRARED BACKGROUND FLUCTUATIONS AND ZODIACAL LIGHT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Richard G.; Kashlinsky, A.; Moseley, S. H.

    We performed a specific observational test to measure the effect that the zodiacal light can have on measurements of the spatial fluctuations of the near-IR background. Previous estimates of possible fluctuations caused by zodiacal light have often been extrapolated from observations of the thermal emission at longer wavelengths and low angular resolution or from IRAC observations of high-latitude fields where zodiacal light is faint and not strongly varying with time. The new observations analyzed here target the COSMOS field at low ecliptic latitude where the zodiacal light intensity varies by factors of ∼2 over the range of solar elongations at which the field can be observed. We find that the white-noise component of the spatial power spectrum of the background is correlated with the modeled zodiacal light intensity. Roughly half of the measured white noise is correlated with the zodiacal light, but a more detailed interpretation of the white noise is hampered by systematic uncertainties that are evident in the zodiacal light model. At large angular scales (≳100″) where excess power above the white noise is observed, we find no correlation of the power with the modeled intensity of the zodiacal light. This test clearly indicates that the large-scale power in the infrared background is not being caused by the zodiacal light.

  14. Cosmic Infrared Background Fluctuations and Zodiacal Light

    NASA Astrophysics Data System (ADS)

    Arendt, Richard G.; Kashlinsky, A.; Moseley, S. H.; Mather, J.

    2016-06-01

    We performed a specific observational test to measure the effect that the zodiacal light can have on measurements of the spatial fluctuations of the near-IR background. Previous estimates of possible fluctuations caused by zodiacal light have often been extrapolated from observations of the thermal emission at longer wavelengths and low angular resolution or from IRAC observations of high-latitude fields where zodiacal light is faint and not strongly varying with time. The new observations analyzed here target the COSMOS field at low ecliptic latitude where the zodiacal light intensity varies by factors of ˜2 over the range of solar elongations at which the field can be observed. We find that the white-noise component of the spatial power spectrum of the background is correlated with the modeled zodiacal light intensity. Roughly half of the measured white noise is correlated with the zodiacal light, but a more detailed interpretation of the white noise is hampered by systematic uncertainties that are evident in the zodiacal light model. At large angular scales (≳100″) where excess power above the white noise is observed, we find no correlation of the power with the modeled intensity of the zodiacal light. This test clearly indicates that the large-scale power in the infrared background is not being caused by the zodiacal light.

  15. Generalized theoretical method for the interaction between arbitrary nonuniform electric field and molecular vibrations: Toward near-field infrared spectroscopy and microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwasa, Takeshi, E-mail: tiwasa@mail.sci.hokudai.ac.jp; Takenaka, Masato; Taketsugu, Tetsuya

    A theoretical method to compute infrared absorption spectra when a molecule is interacting with an arbitrary nonuniform electric field such as near-fields is developed and numerically applied to simple model systems. The method is based on the multipolar Hamiltonian where the light-matter interaction is described by a spatial integral of the inner product of the molecular polarization and applied electric field. The computation scheme is developed under the harmonic approximation for the molecular vibrations and the framework of modern electronic structure calculations such as the density functional theory. Infrared reflection absorption and near-field infrared absorption are considered as model systems. The obtained IR spectra successfully reflect the spatial structure of the applied electric field and corresponding vibrational modes, demonstrating applicability of the present method to analyze modern nanovibrational spectroscopy using near-fields. The present method can use arbitrary electric fields and thus can integrate the two fields of computational chemistry and electromagnetics.

  16. Generalized theoretical method for the interaction between arbitrary nonuniform electric field and molecular vibrations: Toward near-field infrared spectroscopy and microscopy.

    PubMed

    Iwasa, Takeshi; Takenaka, Masato; Taketsugu, Tetsuya

    2016-03-28

    A theoretical method to compute infrared absorption spectra when a molecule is interacting with an arbitrary nonuniform electric field such as near-fields is developed and numerically applied to simple model systems. The method is based on the multipolar Hamiltonian where the light-matter interaction is described by a spatial integral of the inner product of the molecular polarization and applied electric field. The computation scheme is developed under the harmonic approximation for the molecular vibrations and the framework of modern electronic structure calculations such as the density functional theory. Infrared reflection absorption and near-field infrared absorption are considered as model systems. The obtained IR spectra successfully reflect the spatial structure of the applied electric field and corresponding vibrational modes, demonstrating applicability of the present method to analyze modern nanovibrational spectroscopy using near-fields. The present method can use arbitrary electric fields and thus can integrate the two fields of computational chemistry and electromagnetics.

  17. A simple calculation method for determination of equivalent square field

    PubMed Central

    Shafiei, Seyed Ali; Hasanzadeh, Hadi; Shafiei, Seyed Ahmad

    2012-01-01

    Determination of the equivalent square fields for rectangular and shielded fields is of great importance in radiotherapy centers and treatment planning software. This is usually accomplished using standard tables and empirical formulas. The goal of this paper is to present a formula, based on an analysis of scatter reduction due to the inverse square law, for obtaining the equivalent field. Tables published by agencies such as the ICRU (International Commission on Radiation Units and Measurements) are based on experimental data, but there also exist mathematical formulas, used extensively in computational techniques for dose determination, that yield the equivalent square of an irregular rectangular field. These approaches lead to complicated and time-consuming formulas, which motivated the current study. In this work, considering the portion of scattered radiation in the absorbed dose at the point of measurement, a numerical formula was obtained, from which a simple formula for calculating the equivalent square field was developed. Using polar coordinates and the inverse square law leads to a simple formula for the calculation of the equivalent field. The presented method is an analytical approach with which one can estimate the equivalent square of a rectangular field, and it may be used for a shielded field or an off-axis point. Moreover, one can calculate the equivalent field of a rectangular field, to a good approximation, from the concept of scatter reduction with the inverse square law. This method may be useful in computing the Percentage Depth Dose and Tissue-Phantom Ratio, which are used extensively in treatment planning. PMID:22557801

  18. A simple calculation method for determination of equivalent square field.

    PubMed

    Shafiei, Seyed Ali; Hasanzadeh, Hadi; Shafiei, Seyed Ahmad

    2012-04-01

    Determination of the equivalent square fields for rectangular and shielded fields is of great importance in radiotherapy centers and treatment planning software. This is usually accomplished using standard tables and empirical formulas. The goal of this paper is to present a formula, based on an analysis of scatter reduction due to the inverse square law, for obtaining the equivalent field. Tables published by agencies such as the ICRU (International Commission on Radiation Units and Measurements) are based on experimental data, but there also exist mathematical formulas, used extensively in computational techniques for dose determination, that yield the equivalent square of an irregular rectangular field. These approaches lead to complicated and time-consuming formulas, which motivated the current study. In this work, considering the portion of scattered radiation in the absorbed dose at the point of measurement, a numerical formula was obtained, from which a simple formula for calculating the equivalent square field was developed. Using polar coordinates and the inverse square law leads to a simple formula for the calculation of the equivalent field. The presented method is an analytical approach with which one can estimate the equivalent square of a rectangular field, and it may be used for a shielded field or an off-axis point. Moreover, one can calculate the equivalent field of a rectangular field, to a good approximation, from the concept of scatter reduction with the inverse square law. This method may be useful in computing the Percentage Depth Dose and Tissue-Phantom Ratio, which are used extensively in treatment planning.
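For comparison, the classic area-to-perimeter rule of thumb gives the equivalent square of an a × b rectangle a side of 4·Area/Perimeter = 2ab/(a+b). This is the standard textbook approximation, not the formula derived in the paper above:

```python
def equivalent_square_side(a, b):
    """Classic 4*Area/Perimeter rule of thumb (not the paper's formula):
    side of the square field with roughly the same scatter as an a x b rectangle."""
    return 4 * (a * b) / (2 * (a + b))  # simplifies to 2ab/(a+b)

print(equivalent_square_side(10, 10))           # a square field maps to itself: 10.0
print(round(equivalent_square_side(5, 20), 2))  # an elongated 5 x 20 field: 8.0
```

Note that the rule maps a square field to itself and always gives a side shorter than the geometric mean for elongated fields, reflecting the reduced scatter contribution from the far ends of the rectangle.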

  19. Extending the Fellegi-Sunter probabilistic record linkage method for approximate field comparators.

    PubMed

    DuVall, Scott L; Kerber, Richard A; Thomas, Alun

    2010-02-01

    Probabilistic record linkage is a method commonly used to determine whether demographic records refer to the same person. The Fellegi-Sunter method is a probabilistic approach that uses field weights based on log likelihood ratios to determine record similarity. This paper introduces an extension of the Fellegi-Sunter method that incorporates approximate field comparators in the calculation of field weights. The data warehouse of a large academic medical center was used as a case study. The approximate comparator extension was compared with the Fellegi-Sunter method in its ability to find duplicate records previously identified in the data warehouse using different demographic fields and matching cutoffs. The approximate comparator extension misclassified 25% fewer pairs and had a larger Welch's T statistic than the Fellegi-Sunter method for all field sets and matching cutoffs. The accuracy gain provided by the approximate comparator extension grew as less information was provided and as the matching cutoff increased. Given the ubiquity of linkage in both clinical and research settings, the incremental improvement of the extension has the potential to make a considerable impact.
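
    The core of the Fellegi-Sunter method is a per-field log-likelihood-ratio weight; the extension above replaces the all-or-nothing agreement decision with an approximate comparator. A minimal sketch, assuming illustrative m/u probabilities and a simple linear interpolation by string similarity (in the spirit of, but not identical to, the authors' formulation):

```python
# Fellegi-Sunter field weights with an approximate-comparator adjustment.
# The m/u values and the interpolation rule are illustrative assumptions.
import math
from difflib import SequenceMatcher

def field_weight(v1: str, v2: str, m: float, u: float) -> float:
    """Classic agreement/disagreement weights, interpolated by a string
    similarity score in [0, 1] instead of an exact-match comparison."""
    w_agree = math.log2(m / u)                  # evidence for a match
    w_disagree = math.log2((1 - m) / (1 - u))   # evidence against a match
    sim = SequenceMatcher(None, v1.lower(), v2.lower()).ratio()
    return w_disagree + sim * (w_agree - w_disagree)

# Typos are penalized less than outright disagreement:
w_exact = field_weight("Margaret", "Margaret",  m=0.95, u=0.01)
w_typo  = field_weight("Margaret", "Margret",   m=0.95, u=0.01)
w_diff  = field_weight("Margaret", "Katherine", m=0.95, u=0.01)
assert w_exact > w_typo > w_diff
```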

  20. Evaluation of aerosol sources at European high altitude background sites with trajectory statistical methods

    NASA Astrophysics Data System (ADS)

    Salvador, P.; Artíñano, B.; Pio, C. A.; Afonso, J.; Puxbaum, H.; Legrand, M.; Hammer, S.; Kaiser, A.

    2009-04-01

    countries). Secondary organic aerosol carbon formed by the photo-oxidation of biogenic emissions mainly from Germany, seems to be predominant in this season. This seasonal cycle is mainly driven by the winter/summer contrast of the regional-scale vertical mixing. During the warm season the vertical air mass exchange is enhanced by a more efficient upward transport from the boundary layer to the mountain sites. During the winter months, the vertical mixing intensity is reduced. In this season the mean levels obtained for OC and EC were lower than those recorded during the summer. Their spatiotemporal variability was mainly governed by air mass transport from distant regions, especially from Eastern Europe regions, where significant amounts of fossil fuels and biomass are currently consumed. Furthermore, emissions from desert regions in North Africa seemed to significantly influence the central European background mineral aerosol concentrations throughout the year. References: Pio C. A., M. Legrand, T. Oliveira, J. Afonso, C. Santos, A. Caseiro, P. Fialho, F. Barata, H. Puxbaum, A. Sanchez-Ochoa, A. Kasper-Giebl, A. Gelencsér, S. Preunkert, and M. Schock (2007), Climatology of aerosol composition (organic versus inorganic) at nonurban sites on a west-east transect across Europe. J. Geophys. Res., 112, D23S02, doi:10.1029/2006JD008038. Stohl A., G. Wotawa, P. Seibert and H. Kromp-Kolb (1995), Interpolation errors in wind fields as a function of spatial and temporal resolution and their impact on different types of kinematic trajectories. J. Appl. Meteorol., 34, 2149-2165. Stohl A. (1996), Trajectory statistics-a new method to establish source-receptor relationships of air pollutants and its application to the transport of particulate sulfate in Europe. Atmos. Environ., 30(4), 579-587.

  1. Characterization techniques for incorporating backgrounds into DIRSIG

    NASA Astrophysics Data System (ADS)

    Brown, Scott D.; Schott, John R.

    2000-07-01

    The appearance of operational hyperspectral imaging spectrometers in both the solar and thermal regions has led to the development of a variety of spectral detection algorithms. The development and testing of these algorithms require well-characterized field collection campaigns that can be time- and cost-prohibitive. Radiometrically robust synthetic image generation (SIG) environments that can generate appropriate images under a variety of atmospheric conditions and with a variety of sensors offer an excellent supplement that reduces the scope of the expensive field collections. In addition, SIG image products provide the algorithm developer with per-pixel truth, allowing for improved characterization of algorithm performance. To meet the needs of the algorithm development community, the image modeling community needs to supply synthetic image products that contain all the spatial and spectral variability present in real-world scenes and that provide the large-area coverage typically acquired with actual sensors. This places a heavy burden on synthetic scene builders to construct well-characterized scenes that span large areas. Several SIG models have demonstrated the ability to accurately model targets (vehicles, buildings, etc.) using well-constructed target geometry (from CAD packages) and robust thermal and radiometry models. However, background objects (vegetation, infrastructure, etc.) dominate the percentage of real-world scene pixels, and applying target-building techniques to them is time- and resource-prohibitive. This paper discusses new methods that have been integrated into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model to characterize backgrounds. The new suite of scene construct types allows the user to incorporate both terrain and surface properties to obtain wide-area coverage. The terrain can be incorporated using a triangular irregular network (TIN) derived from elevation data or digital elevation model (DEM) data from actual

  2. Cosmic Microwave Background (CMB) Anisotropies

    NASA Technical Reports Server (NTRS)

    Silk, Joseph

    1998-01-01

    One of the main areas of research is the theory of cosmic microwave background (CMB) anisotropies and the analysis of CMB data. Using the four-year COBE data we were able to improve existing constraints on global shear and vorticity. We found that, in the flat case (which allows for the greatest anisotropy), (ω/H)₀ < 10⁻⁷, where ω is the vorticity and H is the Hubble constant. This is two orders of magnitude lower than the tightest previous constraint. We have defined a new set of statistics which quantify the amount of non-Gaussianity in small-field cosmic microwave background maps. By looking at the distribution of power around rings in Fourier space, and at the correlations between adjacent rings, one can identify non-Gaussian features which are masked by large-scale Gaussian fluctuations. This may be particularly useful for identifying unresolved localized sources and line-like discontinuities. Levin and collaborators devised a method to determine the global geometry of the universe through observations of patterns in the hot and cold spots of the CMB. We have derived properties of the peaks (maxima) of the CMB anisotropies expected in flat and open CDM models. We present results for angular resolutions ranging from 5 arcmin to 20 arcmin (antenna FWHM), scales that are relevant for the MAP and COBRA/SAMBA space missions and for ground-based interferometers. Results related to galaxy formation and evolution are also discussed.

  3. A comparison of field methods to assess body composition in a diverse group of sedentary women.

    PubMed

    D'Alonzo, Karen T; Aluf, Ana; Vincent, Linda; Cooper, Kristin

    2009-01-01

    Accurate assessment of body composition is essential in the evaluation of obesity. While laboratory methods are commonly used to assess fat mass, field measures (e.g., skinfold thickness [SKF] and bioelectrical impedance [BIA]) may be more practical for screening large numbers of individuals in intervention studies. In this study, a correlational design was used among 46 racially and ethnically diverse, sedentary women (mean age = 25.73 years) to (a) compare the percentage of body fat as determined by SKF and the upper body BIA and (b) examine the effects of body mass index (BMI), racial/ethnic background, age, and stage of the menstrual cycle on differences in the estimated percentage of body fat obtained using the SKF and BIA. Overall, a significant correlation between SKF and BIA (r = .98, p < .001) was found, with similar findings among Black, Hispanic and White non-Hispanic women. The mean differences between BIA and SKF were not significantly correlated with BMI, age, race/ethnicity or stage of the menstrual cycle. Data from this study suggest that BIA showed similar body fat prediction values compared with SKF and may be a viable alternative to SKF among diverse groups of healthy women. Additional testing and comparison of these field methods with the laboratory methods of hydro-densitometry or dual energy X-ray absorptiometry is recommended to further determine whether BIA devices can be routinely recommended as an alternative to the SKF.

  4. Dynamics of Magnetized Plasma Jets and Bubbles Launched into a Background Magnetized Plasma

    NASA Astrophysics Data System (ADS)

    Wallace, B.; Zhang, Y.; Fisher, D. M.; Gilmore, M.

    2016-10-01

    The propagation of dense magnetized plasma, either collimated with mainly azimuthal B-field (jet) or toroidal with closed B-field (bubble), in a background plasma occurs in a number of solar and astrophysical cases. Such cases include coronal mass ejections moving in the background solar wind and extragalactic radio lobes expanding into the extragalactic medium. Understanding the detailed MHD behavior is crucial for correctly modeling these events. In order to further the understanding of such systems, we are investigating the injection of dense magnetized jets and bubbles into a lower density background magnetized plasma using a coaxial plasma gun and a background helicon or cathode plasma. In both jet and bubble cases, the MHD dynamics are found to be very different when launched into background plasma or magnetic field, as compared to vacuum. In the jet case, it is found that the inherent kink instability is stabilized by velocity shear developed due to added magnetic tension from the background field. In the bubble case, rather than directly relaxing to a minimum energy Taylor state (spheromak) as in vacuum, there is an expansion asymmetry and the bubble becomes Rayleigh-Taylor unstable on one side. Recent results will be presented. Work supported by the Army Research Office Award No. W911NF1510480.

  5. 3D SAPIV particle field reconstruction method based on adaptive threshold.

    PubMed

    Qu, Xiangju; Song, Yang; Jin, Ying; Li, Zhenhua; Wang, Xuezhen; Guo, ZhenYan; Ji, Yunjing; He, Anzhi

    2018-03-01

    Particle image velocimetry (PIV) is a necessary flow field diagnostic technique that provides instantaneous velocimetry information non-intrusively. Three-dimensional (3D) PIV methods can supply the full understanding of a 3D structure, the complete stress tensor, and the vorticity vector in the complex flows. In synthetic aperture particle image velocimetry (SAPIV), the flow field can be measured with large particle intensities from the same direction by different cameras. During SAPIV particle reconstruction, particles are commonly reconstructed by manually setting a threshold to filter out unfocused particles in the refocused images. In this paper, the particle intensity distribution in refocused images is analyzed, and a SAPIV particle field reconstruction method based on an adaptive threshold is presented. By using the adaptive threshold to filter the 3D measurement volume integrally, the three-dimensional location information of the focused particles can be reconstructed. The cross correlations between images captured from cameras and images projected by the reconstructed particle field are calculated for different threshold values. The optimal threshold is determined by cubic curve fitting and is defined as the threshold value that causes the correlation coefficient to reach its maximum. The numerical simulation of a 16-camera array and a particle field at two adjacent time events quantitatively evaluates the performance of the proposed method. An experimental system consisting of a camera array of 16 cameras was used to reconstruct the four adjacent frames in a vortex flow field. The results show that the proposed reconstruction method can effectively reconstruct the 3D particle fields.
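
    The threshold-selection step described above (correlate captured images against reprojections over a range of thresholds, fit a cubic, take the maximum of the fit) can be sketched compactly. The synthetic score curve below is illustrative; in the real method the correlations come from comparing camera images with reprojected particle fields.

```python
# Adaptive threshold selection by cubic fit over (threshold, correlation)
# samples, as in the reconstruction method described above. The score
# curve here is synthetic stand-in data.
import numpy as np

def optimal_threshold(thresholds, correlations):
    """Fit a cubic to the samples and return the threshold at which the
    fitted curve attains its maximum."""
    coeffs = np.polyfit(thresholds, correlations, deg=3)
    fine = np.linspace(thresholds.min(), thresholds.max(), 2001)
    return fine[np.argmax(np.polyval(coeffs, fine))]

# Synthetic correlation curve peaking at a "true" threshold of 0.40:
t = np.linspace(0.1, 0.8, 15)
corr = -(t - 0.40) ** 2 + 0.9          # concave, maximum at 0.40
print(round(optimal_threshold(t, corr), 3))  # → 0.4
```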

  6. Consistency restrictions on maximal electric-field strength in quantum field theory.

    PubMed

    Gavrilov, S P; Gitman, D M

    2008-09-26

    Quantum field theory with an external background can be considered a consistent model only if the backreaction is relatively small with respect to the background. To find the corresponding consistency restrictions on an external electric field and its duration in QED and QCD, we analyze the mean energy density of the quantized fields for an arbitrary constant electric field E acting during a large but finite time T. Using the corresponding asymptotics with respect to the dimensionless parameter eET², one can see that the leading contributions to the energy are due to the creation of particles by the electric field. Assuming that these contributions are small in comparison with the energy density of the electric background, we establish the above-mentioned restrictions, which in fact determine upper bounds on the time scale over which an electric field is depleted by backreaction.

  7. Background Noise Analysis in a Few-Photon-Level Qubit Memory

    NASA Astrophysics Data System (ADS)

    Mittiga, Thomas; Kupchak, Connor; Jordaan, Bertus; Namazi, Mehdi; Nölleke, Christian; Figueroa, Eden

    2014-05-01

    We have developed an Electromagnetically Induced Transparency based polarization qubit memory. The device is composed of a dual-rail probe field polarization setup, collinear with an intense control field, that stores and retrieves any arbitrary polarization state by addressing a Λ-type energy level scheme in a ⁸⁷Rb vapor cell. To achieve a signal-to-background ratio at the few-photon level sufficient for polarization tomography of the retrieved state, the intense control field is filtered out with an etalon-based filtering system. We have developed an analytical model predicting the influence of the signal-to-background ratio on the fidelities and compared it to experimental data. Experimentally measured global fidelities closely follow the theoretical prediction as the signal-to-background ratio decreases. These results suggest the plausibility of employing room-temperature memories to store photonic qubits at the single-photon level, and for future applications in long-distance quantum communication schemes.

  8. Improved methods for fan sound field determination

    NASA Technical Reports Server (NTRS)

    Cicon, D. E.; Sofrin, T. G.; Mathews, D. C.

    1981-01-01

    Several methods for determining acoustic mode structure in aircraft turbofan engines using wall microphone data were studied. A method for reducing data was devised and implemented which makes the definition of discrete coherent sound fields measured in the presence of engine speed fluctuation more accurate. For the analytical methods, algorithms were developed to define the dominant circumferential modes from full and partial circumferential arrays of microphones. Axial arrays were explored to define mode structure as a function of cutoff ratio, and the use of data taken at several constant speeds was also evaluated in an attempt to reduce instrumentation requirements. Sensitivities of the various methods to microphone density, array size and measurement error were evaluated and results of these studies showed these new methods to be impractical. The data reduction method used to reduce the effects of engine speed variation consisted of an electronic circuit which windowed the data so that signal enhancement could occur only when the speed was within a narrow range.

  9. Evaluation of Shifted Excitation Raman Difference Spectroscopy and Comparison to Computational Background Correction Methods Applied to Biochemical Raman Spectra.

    PubMed

    Cordero, Eliana; Korinth, Florian; Stiebing, Clara; Krafft, Christoph; Schie, Iwan W; Popp, Jürgen

    2017-07-27

    Raman spectroscopy provides label-free biochemical information from tissue samples without complicated sample preparation. The clinical capability of Raman spectroscopy has been demonstrated in a wide range of in vitro and in vivo applications. However, a challenge for in vivo applications is the simultaneous excitation of auto-fluorescence in the majority of tissues of interest, such as liver, bladder, brain, and others. Raman bands are then superimposed on a fluorescence background, which can be several orders of magnitude larger than the Raman signal. To eliminate the disturbing fluorescence background, several approaches are available. Among instrumental methods, shifted excitation Raman difference spectroscopy (SERDS) has been widely applied and studied. Similarly, computational techniques, for instance extended multiplicative scatter correction (EMSC), have also been employed to remove undesired background contributions. Here, we present a theoretical and experimental evaluation and comparison of fluorescence background removal approaches for Raman spectra based on SERDS and EMSC.
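
    The principle behind SERDS can be shown in a few lines: Raman bands track the excitation wavelength while the fluorescence background does not, so the difference of two spectra taken at slightly shifted excitation cancels the background. The peak positions and widths below are arbitrary synthetic values, not measured spectra.

```python
# Minimal synthetic illustration of the SERDS principle: a narrow Raman
# band that follows the excitation shift, sitting on a broad fluorescence
# background that does not. Their difference removes the background.
import numpy as np

x = np.linspace(0, 2000, 4000)                  # wavenumber-like axis
gauss = lambda c, w: np.exp(-0.5 * ((x - c) / w) ** 2)

fluorescence = 50.0 * gauss(1000, 800)          # broad, excitation-independent
raman = lambda shift: gauss(1001 + shift, 8)    # narrow, follows excitation

s1 = fluorescence + raman(0.0)                  # spectrum at first laser line
s2 = fluorescence + raman(5.0)                  # excitation shifted slightly
diff = s1 - s2                                  # fluorescence cancels exactly

# Only the (shifted) Raman contribution survives in the difference:
assert np.max(np.abs(diff - (raman(0.0) - raman(5.0)))) < 1e-12
```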

  10. Extending methods: using Bourdieu's field analysis to further investigate taste

    NASA Astrophysics Data System (ADS)

    Schindel Dimick, Alexandra

    2015-06-01

    In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.

  11. Implementation of a chemical background method (OH-CHEM) for measurements of OH using the Leeds FAGE instrument: Characterisation and observations from a coastal location

    NASA Astrophysics Data System (ADS)

    Woodward-Massey, R.; Cryer, D. R.; Whalley, L. K.; Ingham, T.; Seakins, P. W.; Heard, D. E.; Stimpson, L. M.

    2015-12-01

    The removal of pollutants and greenhouse gases in the troposphere is dominated by reactions with the hydroxyl radical (OH), which is closely coupled to the hydroperoxy radical (HO2). Comparisons of the levels of OH and HO2 observed during field campaigns to the results of detailed chemical box models serve as a vital tool to assess our understanding of the underlying chemical mechanisms involved in tropospheric oxidation. Recent measurements of OH and HO2 radicals are significantly higher than those predicted by models for some instruments measuring in certain environments, especially those influenced by high emissions of biogenic compounds such as isoprene, prompting intense laboratory research to account for such discrepancies. While current chemical mechanisms are likely incomplete, it is also possible that, at least in part, these elevated radical observations have been influenced by instrumental biases from interfering species. Recent studies have suggested that fluorescence assay by gas expansion (FAGE) instruments may be susceptible to an unknown interference in the measurement of OH. This hypothesis can be tested through the implementation of an alternative method to determine the OH background signal, whereby OH is removed by the addition of a chemical scavenger prior to sampling by FAGE. The Leeds FAGE instrument was modified to facilitate this method by the construction of an inlet pre-injector (IPI), where OH is removed through reaction with propane. The modified Leeds FAGE instrument was deployed at a coastal location in southeast England during summer 2015 as part of the ICOZA (Integrated Chemistry of OZone in the Atmosphere) project. Measurements of OH made using both background methods will be presented, alongside results from laboratory characterisation experiments and details of the IPI design.

  12. Limitations of STIRAP-like population transfer in extended systems: the three-level system embedded in a web of background states.

    PubMed

    Jakubetz, Werner

    2012-12-14

    This paper presents a systematic numerical investigation of background state participation in STIRAP (stimulated Raman-adiabatic passage) population transfer among vibrational states, focusing on the consequences for the robustness of the method. The simulations, which are performed over extended grids in the parameter space of the Stokes- and pump pulses (frequencies, field strengths, and pulse lengths), involve hierarchies of (3 + N)-level systems of increasing complexity, ranging from the standard three-level STIRAP setup, (N = 0) in Λ-configuration, up to N = 446. A strongly coupled three-level core system is selected from the full Hamiltonian of the double-well HCN∕HNC system, and the couplings connecting this core system to the remaining states are (re-) parameterized in different ways, from very weak to very strong. The systems so obtained represent a three-level system embedded in various ways in webs of cross-linked vibrational background states and incorporate typical molecular properties. We first summarize essential properties of population transfer in the standard three-level system and quantify the robustness of the method and its dependence on the pulse parameters. Against these reference results, we present results obtained for four (3 + 446)-level systems and several subsystems. For pulse lengths of at most few picoseconds the intrinsic robustness of STIRAP with respect to variations in the field strength disappears as soon as the largest core-background couplings exceed about one tenth of the STIRAP couplings. In such cases robustness with respect to variations in the field strength is entirely lost, since at higher field strengths, except for irregularly spaced narrow frequency ranges, transfer probabilities are strongly reduced. STIRAP-like population transfer is maintained, with some restrictions, at low field strengths near the onset of adiabatic transfer. The suppression of STIRAP is traced back to different mechanisms based on a
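
    The reference system in the study above is the standard three-level Λ STIRAP setup: on-resonance pump and Stokes couplings with the counterintuitive (Stokes-before-pump) ordering. A minimal numerical sketch, with illustrative pulse amplitudes and timings rather than the paper's molecular parameters:

```python
# Three-level STIRAP with counterintuitive Gaussian pulses, integrated by
# fixed-step RK4 (hbar = 1, rotating-wave approximation, on resonance).
# Pulse parameters are illustrative, chosen to be comfortably adiabatic.
import numpy as np

def stirap_final_population(omega0=20.0, sigma=1.0, delay=0.7):
    def hamiltonian(t):
        pump = omega0 * np.exp(-((t - delay) / sigma) ** 2)
        stokes = omega0 * np.exp(-((t + delay) / sigma) ** 2)  # Stokes first
        return 0.5 * np.array([[0, pump, 0],
                               [pump, 0, stokes],
                               [0, stokes, 0]], dtype=complex)

    psi = np.array([1, 0, 0], dtype=complex)    # all population in state 1
    rhs = lambda t, y: -1j * hamiltonian(t) @ y
    t, dt = -5.0, 1e-3
    while t < 5.0:                               # fixed-step RK4 integration
        k1 = rhs(t, psi)
        k2 = rhs(t + dt / 2, psi + dt / 2 * k1)
        k3 = rhs(t + dt / 2, psi + dt / 2 * k2)
        k4 = rhs(t + dt, psi + dt * k3)
        psi = psi + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return abs(psi[2]) ** 2                      # population of target state 3

print(stirap_final_population())  # close to 1 for adiabatic pulses
```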

  13. Cosmic Infrared Background Fluctuations and Zodiacal Light

    NASA Technical Reports Server (NTRS)

    Arendt, Richard G.; Kashlinsky, A.; Moseley, S. H.; Mather, J.

    2017-01-01

    We performed a specific observational test to measure the effect that the zodiacal light can have on measurements of the spatial fluctuations of the near-infrared (near-IR) background. Previous estimates of possible fluctuations caused by zodiacal light have often been extrapolated from observations of the thermal emission at longer wavelengths and low angular resolution, or from IRAC (Infrared Array Camera) observations of high-latitude fields where zodiacal light is faint and not strongly varying with time. The new observations analyzed here target the COSMOS (Cosmic Evolution Survey) field at low ecliptic latitude, where the zodiacal light intensity varies by factors of approximately 2 over the range of solar elongations at which the field can be observed. We find that the white-noise component of the spatial power spectrum of the background is correlated with the modeled zodiacal light intensity. Roughly half of the measured white noise is correlated with the zodiacal light, but a more detailed interpretation of the white noise is hampered by systematic uncertainties that are evident in the zodiacal light model. At large angular scales (≳100 arcseconds), where excess power above the white noise is observed, we find no correlation of the power with the modeled intensity of the zodiacal light. This test clearly indicates that the large-scale power in the infrared background is not being caused by the zodiacal light.

  14. Research on the algorithm of infrared target detection based on the frame difference and background subtraction method

    NASA Astrophysics Data System (ADS)

    Liu, Yun; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Hui, Mei; Liu, Xiaohua; Wu, Yijian

    2015-09-01

    As an important branch of infrared imaging technology, infrared target tracking and detection has important scientific value and a wide range of applications in both military and civilian areas. For infrared images, which are characterized by low SNR and serious disturbance from background noise, an innovative and effective target detection algorithm is proposed in this paper, based on OpenCV, that exploits the frame-to-frame correlation of a moving target and the irrelevance of noise in sequential images. Firstly, since temporal differencing and background subtraction are very complementary, we use a combined detection method of frame difference and background subtraction based on adaptive background updating. Results indicate that it is simple and can stably extract the foreground moving target from the video sequence. Because the background updating mechanism continuously updates each pixel, we can detect the infrared moving target more accurately. This paves the way for eventually realizing real-time infrared target detection and tracking once the OpenCV algorithms are transplanted to a DSP platform. Afterwards, we use optimal thresholding to segment the image, transforming the gray images to black-and-white images in order to provide a better condition for detection in the image sequences. Finally, using the correspondence of moving objects between different frames and mathematical morphology processing, we can eliminate noise, shrink spurious regions, and smooth region boundaries. Experimental results prove that our algorithm achieves rapid detection of small infrared targets.
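
    The combined detector described above (running-average background model plus two-frame differencing, with the two masks ANDed) can be sketched without OpenCV; the paper's implementation uses OpenCV primitives, and the threshold and learning rate below are illustrative.

```python
# NumPy sketch of frame difference + background subtraction with adaptive
# background updating. ANDing the two masks suppresses the "ghost" left
# behind at a target's previous position.
import numpy as np

def detect(frames, alpha=0.05, tau=20.0):
    """Yield a boolean foreground mask for each frame after the first."""
    background = frames[0].astype(float)
    prev = frames[0].astype(float)
    for frame in frames[1:]:
        f = frame.astype(float)
        mask_bg = np.abs(f - background) > tau   # background subtraction
        mask_fd = np.abs(f - prev) > tau         # frame difference
        yield mask_bg & mask_fd                  # combined decision
        background = (1 - alpha) * background + alpha * f  # adaptive update
        prev = f

# Synthetic sequence: flat background with a bright 4x4 target moving right.
frames = []
for step in range(5):
    img = np.full((32, 32), 50, dtype=np.uint8)
    img[14:18, 4 + 5 * step: 8 + 5 * step] = 200
    frames.append(img)

masks = list(detect(frames))
ys, xs = np.nonzero(masks[-1])
print(xs.mean())  # → 25.5 (centroid of the target's latest position, cols 24:28)
```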

  15. Prediction of solar activity from solar background magnetic field variations in cycles 21-23

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shepherd, Simon J.; Zharkov, Sergei I.; Zharkova, Valentina V., E-mail: s.j.shepherd@brad.ac.uk, E-mail: s.zharkov@hull.ac.uk, E-mail: valentina.zharkova@northumbria.ac.uk

    2014-11-01

    A comprehensive spectral analysis of both the solar background magnetic field (SBMF) in cycles 21-23 and the sunspot magnetic field in cycle 23 reported in our recent paper showed the presence of two principal components (PCs) of SBMF having opposite polarity, e.g., originating in the northern and southern hemispheres, respectively. Over a duration of one solar cycle, both waves are found to travel with an increasing phase shift toward the northern hemisphere in odd cycles 21 and 23 and to the southern hemisphere in even cycle 22. These waves were linked to solar dynamo waves assumed to form in different layers of the solar interior. In this paper, for the first time, the PCs of SBMF in cycles 21-23 are analyzed with the symbolic regression technique using Hamiltonian principles, allowing us to uncover the underlying mathematical laws governing these complex waves in the SBMF presented by PCs and to extrapolate these PCs to cycles 24-26. The PCs predicted for cycle 24 very closely fit (with an accuracy better than 98%) the PCs derived from the SBMF observations in this cycle. This approach also predicts a strong reduction of the SBMF in cycles 25 and 26 and, thus, a reduction of the resulting solar activity. This decrease is accompanied by an increasing phase shift between the two predicted PCs (magnetic waves) in cycle 25 leading to their full separation into the opposite hemispheres in cycle 26. The variations of the modulus summary of the two PCs in SBMF reveals a remarkable resemblance to the average number of sunspots in cycles 21-24 and to predictions of reduced sunspot numbers compared to cycle 24: 80% in cycle 25 and 40% in cycle 26.

  16. Simple, Low-Cost Data Collection Methods for Agricultural Field Studies.

    ERIC Educational Resources Information Center

    Koenig, Richard T.; Winger, Marlon; Kitchen, Boyd

    2000-01-01

    Summarizes relatively simple and inexpensive methods for collecting data from agricultural field studies. Describes methods involving on-farm testing, crop yield measurement, quality evaluations, weed control effectiveness, plant nutrient status, and other measures. Contains 29 references illustrating how these methods were used to conduct…

  17. Dual stage potential field method for robotic path planning

    NASA Astrophysics Data System (ADS)

    Singh, Pradyumna Kumar; Parida, Pramod Kumar

    2018-04-01

    Path planning is fundamental to all autonomous mobile robot systems. Various methods are used to optimize the path followed by an autonomous mobile robot. Artificial-potential-field-based path planning is one of the methods most used by researchers, and various algorithms have been proposed using the potential field approach. However, several problems are commonly encountered while heading towards the goal or target: the local minima problem, the zero-potential-region problem, the complex-shaped-obstacle problem, and the target-near-obstacle problem. In this paper we provide a new algorithm in which two types of potential functions are used one after another: the former is used to obtain the probable points and the latter to obtain the optimum path. In this algorithm we consider only static obstacles and a static goal.
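
    For context, the classic single-stage attractive/repulsive potential field planner that such dual-stage methods build on can be sketched as follows; the gains, influence distance and scenario are illustrative, and the paper's two-stage refinement is not reproduced here.

```python
# Single-stage artificial potential field planner: quadratic attractive
# potential toward the goal plus a short-range repulsive potential
# U_rep = 0.5 * eta * (1/d - 1/d0)^2 near the obstacle, followed by
# gradient descent on the total potential.
import numpy as np

def apf_path(start, goal, obstacle, zeta=1.0, eta=1.0, d0=1.5,
             step=0.01, iters=4000):
    q = np.array(start, float)
    goal = np.array(goal, float)
    obstacle = np.array(obstacle, float)
    path = [q.copy()]
    for _ in range(iters):
        grad = zeta * (q - goal)                 # attractive gradient
        d = np.linalg.norm(q - obstacle)
        if d < d0:                               # repulsion only near obstacle
            grad += eta * (1 / d0 - 1 / d) / d ** 2 * (q - obstacle) / d
        q = q - step * grad                      # gradient descent step
        path.append(q.copy())
    return np.array(path)

path = apf_path(start=(0, 0), goal=(10, 10), obstacle=(5, 4))
print(np.linalg.norm(path[-1] - np.array([10.0, 10.0])) < 0.05)  # reached goal
```

Note that this baseline exhibits exactly the local-minima and target-near-obstacle failure modes listed in the abstract when the obstacle sits directly on, or next to, the goal line.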

  18. Cluster richness-mass calibration with cosmic microwave background lensing

    NASA Astrophysics Data System (ADS)

    Geach, James E.; Peacock, John A.

    2017-11-01

    Identifying galaxy clusters through overdensities of galaxies in photometric surveys is the oldest [1,2] and arguably the most economical and mass-sensitive detection method [3,4], compared with X-ray [5-7] and Sunyaev-Zel'dovich effect [8] surveys that detect the hot intracluster medium. However, a perennial problem has been the mapping of optical `richness' measurements onto total cluster mass [3,9-12]. Emitted at a conformal distance of 14 gigaparsecs, the cosmic microwave background acts as a backlight to all intervening mass in the Universe, and therefore has been gravitationally lensed [13-15]. Experiments such as the Atacama Cosmology Telescope [16], South Pole Telescope [17-19] and the Planck [20] satellite have now detected gravitational lensing of the cosmic microwave background and produced large-area maps of the foreground deflecting structures. Here we present a calibration of cluster optical richness at the 10% level by measuring the average cosmic microwave background lensing measured by Planck towards the positions of large numbers of optically selected clusters, detecting the deflection of photons by structures of total mass of order 10¹⁴ M⊙. Although mainly aimed at the study of larger-scale structures, the Planck estimate of the cosmic microwave background lensing field can be used to recover a nearly unbiased lensing signal for stacked clusters on arcminute scales [15,21]. This approach offers a clean measure of total cluster masses over most of cosmic history, largely independent of baryon physics.

  19. Extracting harmonic signal from a chaotic background with local linear model

    NASA Astrophysics Data System (ADS)

    Li, Chenlong; Su, Liyun

    2017-02-01

    In this paper, the problems of blind detection and estimation of a harmonic signal in a strong chaotic background are analyzed, and new methods using a local linear (LL) model are put forward. The LL model has been researched exhaustively and applied successfully for fitting and forecasting chaotic signals in many chaotic fields; here we enlarge its modeling capacity substantially. Firstly, we predict the short-term chaotic signal and obtain the fitting error based on the LL model. We then detect the frequencies in the fitting error by periodogram, and propose a property of the fitting error that has not been addressed before; this property ensures that the detected frequencies are similar to those of the harmonic signal. Secondly, we establish a two-layer LL model to estimate the deterministic harmonic signal in the strong chaotic background. To perform this estimation simply and effectively, we develop an efficient backfitting algorithm to select and optimize the parameters that are difficult to search exhaustively. In this method, exploiting the sensitivity of chaotic motion to initial values, the minimum fitting error criterion is used as the objective function for estimating the parameters of the two-layer LL model. Simulation shows that the two-layer LL model and its estimation technique have appreciable flexibility for modeling the deterministic harmonic signal in different chaotic backgrounds (Lorenz, Henon and Mackey-Glass (M-G) equations). Specifically, the harmonic signal can be extracted well at low SNR, and the backfitting algorithm converges within 3-5 iterations.
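The first ingredient of the approach, a local linear one-step predictor for a chaotic series, can be illustrated on the logistic map; the neighbor count, series lengths, and map choice are illustrative, and the paper's two-layer model and backfitting algorithm are not reproduced:

```python
import numpy as np

def logistic_series(n, x0=0.3, r=4.0):
    """Chaotic logistic-map series x[i+1] = r*x[i]*(1 - x[i])."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def local_linear_predict(history, query, k=10):
    """One-step prediction: fit a line through the k nearest past
    states and their successors, then evaluate it at `query`."""
    states, nexts = history[:-1], history[1:]
    idx = np.argsort(np.abs(states - query))[:k]
    A = np.vstack([states[idx], np.ones(k)]).T
    coef, *_ = np.linalg.lstsq(A, nexts[idx], rcond=None)
    return coef[0] * query + coef[1]

x = logistic_series(600)
train, test = x[:500], x[500:]
h = train.copy()
errors = []
for cur, true_next in zip(test[:-1], test[1:]):
    errors.append(abs(local_linear_predict(h, cur) - true_next))
    h = np.append(h, cur)                 # keep the history contiguous
rmse = float(np.sqrt(np.mean(np.square(errors))))
persistence = float(np.sqrt(np.mean((test[1:] - test[:-1]) ** 2)))
```

The one-step fitting error is far smaller than a naive persistence forecast, which is what lets a superposed harmonic show up cleanly in the residual's periodogram.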

  20. Improved background suppression in ¹H MAS NMR using composite pulses.

    PubMed

    Odedra, Smita; Wimperis, Stephen

    2012-08-01

    A well known feature of ¹H MAS NMR spectroscopy, particularly of solids where the concentration of ¹H nuclei is low, is the presence in the spectrum of a significant broad "background" signal arising from ¹H nuclei that are outside the MAS rotor and radiofrequency coil, probably located on the surfaces of the static components of the probehead. A popular method of suppressing this unwanted signal is the "depth pulse" method, consisting of a 90° pulse followed by one or two 180° pulses that are phase cycled according to the "Exorcycle" scheme, which removes signal associated with imperfect 180° pulses. Consequently, only spins in the centre of the radiofrequency coil contribute to the ¹H MAS spectrum, while those experiencing a low B₁ field outside the coil are suppressed. Although very effective at removing background signal from the spectrum, one drawback with this approach is that significant loss of the desired signal from the sample also occurs. Here we investigate the ¹H background suppression problem and, in particular, the use of novel antisymmetric passband composite pulses to replace the simple pulses in a depth pulse experiment. We show that it is possible to improve the intensity of the ¹H signals of interest while still maintaining effective background suppression. We expect that these results will be relevant to ¹H MAS NMR studies of, for example, nominally perdeuterated biological samples or nominally anhydrous inorganic materials. Copyright © 2012 Elsevier Inc. All rights reserved.
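As a rough numerical illustration of why depth pulses suppress background: for an excitation pulse of flip angle θ followed by n Exorcycled refocusing pulses of nominal flip 2θ, the retained signal scales as sin^(2n+1) θ, so regions with weak B₁ (small θ, outside the coil) are strongly attenuated. A minimal sketch under that standard depth-pulse scaling (the composite-pulse variants studied in this paper modify this profile):

```python
import math

def depth_pulse_signal(theta, n_refocus=2):
    """Retained signal for a depth-pulse sequence: sin(theta) from
    excitation times sin(theta)**2 per Exorcycled refocusing pulse,
    i.e. sin(theta)**(2*n_refocus + 1) overall."""
    return math.sin(theta) * math.sin(theta) ** (2 * n_refocus)

theta_out = math.radians(0.2 * 90)   # weak B1 far outside the coil
plain = math.sin(theta_out)          # single-pulse experiment
depth = depth_pulse_signal(theta_out)
suppression = plain / depth          # background attenuation factor
```

At the coil centre (θ = 90°) the depth pulse still returns full signal, while at 20% of the nominal B₁ the background is suppressed by roughly two orders of magnitude relative to a single pulse.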

  1. A Novel Method of Localization for Moving Objects with an Alternating Magnetic Field

    PubMed Central

    Gao, Xiang; Yan, Shenggang; Li, Bin

    2017-01-01

    Magnetic detection technology has wide applications in the fields of geological exploration, biomedical treatment, wreck removal and localization of unexploded ordnance. A large number of methods have been developed to locate targets with static magnetic fields; however, the relation between the localization of moving objects with alternating magnetic fields and localization with a static magnetic field is rarely studied. A novel method of target localization based on coherent demodulation was proposed in this paper. The problem of localizing moving objects with an alternating magnetic field was transformed into localization with a static magnetic field. The Levenberg-Marquardt (L-M) algorithm was applied to calculate the position of the target from magnetic field data measured by a single three-component magnetic sensor. Theoretical simulation and experimental results demonstrate the effectiveness of the proposed method. PMID:28430153
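The final L-M position fit can be sketched with a toy dipole model. Unlike the paper's single-sensor, alternating-field scheme, this sketch assumes a known dipole moment and several static measurement positions; all numbers and the hand-rolled damped Gauss-Newton loop are illustrative:

```python
import numpy as np

MU0_4PI = 1e-7  # mu0 / (4*pi) in SI units

def dipole_field(sensor, source, m):
    """Flux density of a point magnetic dipole with moment m located
    at `source`, evaluated at position `sensor`."""
    r = sensor - source
    d = np.linalg.norm(r)
    rhat = r / d
    return MU0_4PI * (3.0 * rhat * np.dot(m, rhat) - m) / d**3

def residuals(p, sensors, m, data):
    return np.concatenate([dipole_field(s, p, m) - b
                           for s, b in zip(sensors, data)])

def levenberg_marquardt(p0, sensors, m, data, iters=60, lam=1e-3):
    """Minimal L-M loop: Gauss-Newton with Marquardt damping and a
    numeric Jacobian; steps are accepted only if the cost drops."""
    p = np.asarray(p0, float)
    for _ in range(iters):
        r = residuals(p, sensors, m, data)
        J = np.empty((r.size, 3))
        for j in range(3):
            dp = np.zeros(3)
            dp[j] = 1e-7
            J[:, j] = (residuals(p + dp, sensors, m, data) - r) / 1e-7
        JtJ = J.T @ J
        step = np.linalg.solve(JtJ + lam * np.diag(np.diag(JtJ)), -J.T @ r)
        if np.sum(residuals(p + step, sensors, m, data)**2) < np.sum(r**2):
            p, lam = p + step, lam * 0.5     # accept: relax damping
        else:
            lam *= 10.0                      # reject: damp harder
    return p

m = np.array([0.0, 0.0, 5.0])                # assumed known moment (A m^2)
true_pos = np.array([0.4, -0.2, 0.3])
sensors = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.2],
                    [1.0, 1.0, 0.0], [0.5, -0.8, 0.9]])
data = [dipole_field(s, true_pos, m) for s in sensors]
est = levenberg_marquardt(np.zeros(3), sensors, m, data)
err = np.linalg.norm(est - true_pos)
```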

  2. Urban Background Study Webinar

    EPA Pesticide Factsheets

    This webinar presented the methodology developed for collecting a city-wide or urban area background data set, general results of southeastern cities data collected to date, and a case study that used this sampling method.

  3. New method for solving inductive electric fields in the non-uniformly conducting ionosphere

    NASA Astrophysics Data System (ADS)

    Vanhamäki, H.; Amm, O.; Viljanen, A.

    2006-10-01

    We present a new calculation method for solving inductive electric fields in the ionosphere. The time series of the potential part of the ionospheric electric field, together with the Hall and Pedersen conductances serves as the input to this method. The output is the time series of the induced rotational part of the ionospheric electric field. The calculation method works in the time-domain and can be used with non-uniform, time-dependent conductances. In addition, no particular symmetry requirements are imposed on the input potential electric field. The presented method makes use of special non-local vector basis functions called the Cartesian Elementary Current Systems (CECS). This vector basis offers a convenient way of representing curl-free and divergence-free parts of 2-dimensional vector fields and makes it possible to solve the induction problem using simple linear algebra. The new calculation method is validated by comparing it with previously published results for Alfvén wave reflection from a uniformly conducting ionosphere.
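The curl-free/divergence-free splitting that the CECS basis represents can be illustrated with a generic FFT-based Helmholtz decomposition on a periodic grid; this is not the CECS method itself, just the same decomposition in the simplest possible setting:

```python
import numpy as np

def helmholtz_split(vx, vy, L=2*np.pi):
    """Split a periodic 2-D vector field into curl-free and
    divergence-free parts with FFTs."""
    n = vx.shape[0]
    k = 2*np.pi*np.fft.fftfreq(n, d=L/n)
    kx, ky = np.meshgrid(k, k, indexing='ij')
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # the k=0 mode has zero divergence
    fx, fy = np.fft.fft2(vx), np.fft.fft2(vy)
    div = 1j*kx*fx + 1j*ky*fy            # Fourier-space divergence of v
    # curl-free part c = grad(phi), where laplacian(phi) = div(v),
    # i.e. phi_hat = -div_hat/k^2 and c_hat = i*k*phi_hat
    cx = np.real(np.fft.ifft2(-1j*kx*div/k2))
    cy = np.real(np.fft.ifft2(-1j*ky*div/k2))
    return (cx, cy), (vx - cx, vy - cy)

n = 64
x = np.linspace(0, 2*np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing='ij')
curlfree = (np.cos(X)*np.cos(Y), -np.sin(X)*np.sin(Y))  # grad of sin(x)cos(y)
divfree = (-np.sin(Y), np.sin(X))                       # zero divergence
(cx, cy), (rx, ry) = helmholtz_split(curlfree[0] + divfree[0],
                                     curlfree[1] + divfree[1])
err_cf = np.max(np.abs(cx - curlfree[0])) + np.max(np.abs(cy - curlfree[1]))
err_df = np.max(np.abs(rx - divfree[0])) + np.max(np.abs(ry - divfree[1]))
```

On this exact test field the split recovers both parts to machine precision; CECS achieves the same separation with localized basis functions, which is what permits non-uniform conductances.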

  4. An accelerated lambda iteration method for multilevel radiative transfer. I - Non-overlapping lines with background continuum

    NASA Technical Reports Server (NTRS)

    Rybicki, G. B.; Hummer, D. G.

    1991-01-01

    A method is presented for solving multilevel transfer problems when nonoverlapping lines and background continuum are present and active continuum transfer is absent. An approximate lambda operator is employed to derive linear, 'preconditioned', statistical-equilibrium equations. A method is described for finding the diagonal elements of the 'true' numerical lambda operator, and therefore for obtaining the coefficients of the equations. Iterations of the preconditioned equations, in conjunction with the transfer equation's formal solution, are used to solve linear equations. Some multilevel problems are considered, including an eleven-level neutral helium atom. Diagonal and tridiagonal approximate lambda operators are utilized in the problems to examine the convergence properties of the method, and it is found to be effective for the line transfer problems.

  5. The Anisotropy of the Microwave Background to l=3500: Mosaic Observations with the Cosmic Background Imager

    NASA Technical Reports Server (NTRS)

    Pearson, T. J.; Mason, B. S.; Readhead, A. C. S.; Shepherd, M. C.; Sievers, J. L.; Udomprasert, P. S.; Cartwright, J. K.; Farmer, A. J.; Padin, S.; Myers, S. T.

    2002-01-01

    Using the Cosmic Background Imager, a 13-element interferometer array operating in the 26-36 GHz frequency band, we have observed 40 deg^2 of sky in three pairs of fields, each approximately 145' x 165', using overlapping pointings (mosaicing). We present images and power spectra of the cosmic microwave background radiation in these mosaic fields. We remove ground radiation and other low-level contaminating signals by differencing matched observations of the fields in each pair. The primary foreground contamination is due to point sources (radio galaxies and quasars). We have subtracted the strongest sources from the data using higher-resolution measurements, and we have projected out the response to other sources of known position in the power-spectrum analysis. The images show features on scales approximately 6'-15', corresponding to masses of approximately 5-80 x 10^14 solar masses at the surface of last scattering, which are likely to be the seeds of clusters of galaxies. The power spectrum estimates have a resolution delta l ~ 200 and are consistent with earlier results in the multipole range l ≲ 1000. The power spectrum is detected with high signal-to-noise ratio in the range 300 ≲ l ≲ 1700. For 1700 ≲ l ≲ 3000 the observations are consistent with the results from more sensitive CBI deep-field observations. The results agree with the extrapolation of cosmological models fitted to observations at lower l, and show the predicted drop at high l (the "damping tail").

  6. Effective-field renormalization-group method for Ising systems

    NASA Astrophysics Data System (ADS)

    Fittipaldi, I. P.; De Albuquerque, D. F.

    1992-02-01

    A new, readily applicable effective-field renormalization-group (EFRG) scheme for computing critical properties of Ising spin systems is proposed and used to study the phase diagrams of a quenched bond-mixed spin Ising model on square and Kagomé lattices. The present EFRG approach yields results which improve substantially on those obtained from the standard mean-field renormalization-group (MFRG) method. In particular, it is shown that the EFRG scheme correctly distinguishes the geometry of the lattice structure even when working with the smallest possible clusters, namely N'=1 and N=2.

  7. Method and system for progressive mesh storage and reconstruction using wavelet-encoded height fields

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory A. (Inventor); Linger, Timothy C. (Inventor)

    2011-01-01

    Systems and methods are provided for progressive mesh storage and reconstruction using wavelet-encoded height fields. A method for progressive mesh storage includes reading raster height field data, and processing the raster height field data with a discrete wavelet transform to generate wavelet-encoded height fields. In another embodiment, a method for progressive mesh storage includes reading texture map data, and processing the texture map data with a discrete wavelet transform to generate wavelet-encoded texture map fields. A method for reconstructing a progressive mesh from wavelet-encoded height field data includes determining terrain blocks, and a level of detail required for each terrain block, based upon a viewpoint. Triangle strip constructs are generated from vertices of the terrain blocks, and an image is rendered utilizing the triangle strip constructs. Software products that implement these methods are provided.

  8. Method and system for progressive mesh storage and reconstruction using wavelet-encoded height fields

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory A. (Inventor)

    2010-01-01

    Systems and methods are provided for progressive mesh storage and reconstruction using wavelet-encoded height fields. A method for progressive mesh storage includes reading raster height field data, and processing the raster height field data with a discrete wavelet transform to generate wavelet-encoded height fields. In another embodiment, a method for progressive mesh storage includes reading texture map data, and processing the texture map data with a discrete wavelet transform to generate wavelet-encoded texture map fields. A method for reconstructing a progressive mesh from wavelet-encoded height field data includes determining terrain blocks, and a level of detail required for each terrain block, based upon a viewpoint. Triangle strip constructs are generated from vertices of the terrain blocks, and an image is rendered utilizing the triangle strip constructs. Software products that implement these methods are provided.
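The encoding step shared by these two records can be illustrated with a single-level 2-D Haar transform, the simplest discrete wavelet transform; the Haar filters and the 8x8 random "terrain" are deliberately simple stand-ins for the patents' wavelet machinery:

```python
import numpy as np

def haar2d_forward(h):
    """One level of the 2-D Haar transform: a coarse average band (LL)
    plus horizontal (LH), vertical (HL) and diagonal (HH) details."""
    a = h[0::2, 0::2]; b = h[0::2, 1::2]
    c = h[1::2, 0::2]; d = h[1::2, 1::2]
    ll = (a + b + c + d) / 4.0
    lh = (a - b + c - d) / 4.0
    hl = (a + b - c - d) / 4.0
    hh = (a - b - c + d) / 4.0
    return ll, lh, hl, hh

def haar2d_inverse(ll, lh, hl, hh):
    """Exact inverse of haar2d_forward."""
    n = ll.shape[0] * 2
    h = np.empty((n, n))
    h[0::2, 0::2] = ll + lh + hl + hh
    h[0::2, 1::2] = ll - lh + hl - hh
    h[1::2, 0::2] = ll + lh - hl - hh
    h[1::2, 1::2] = ll - lh - hl + hh
    return h

rng = np.random.default_rng(0)
height = rng.random((8, 8))                     # toy raster height field
ll, lh, hl, hh = haar2d_forward(height)
exact = haar2d_inverse(ll, lh, hl, hh)          # full level of detail
coarse = haar2d_inverse(ll, 0*lh, 0*hl, 0*hh)   # low level of detail
recon_err = np.max(np.abs(exact - height))
```

Reconstructing with the detail bands zeroed yields the coarse, block-averaged level-of-detail mesh; keeping them restores the full-resolution height field, which is the progressive-refinement idea behind these methods.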

  9. Evaluation of Shifted Excitation Raman Difference Spectroscopy and Comparison to Computational Background Correction Methods Applied to Biochemical Raman Spectra

    PubMed Central

    Cordero, Eliana; Korinth, Florian; Stiebing, Clara; Krafft, Christoph; Schie, Iwan W.; Popp, Jürgen

    2017-01-01

    Raman spectroscopy provides label-free biochemical information from tissue samples without complicated sample preparation. The clinical capability of Raman spectroscopy has been demonstrated in a wide range of in vitro and in vivo applications. However, a challenge for in vivo applications is the simultaneous excitation of auto-fluorescence in the majority of tissues of interest, such as liver, bladder, brain, and others. Raman bands are then superimposed on a fluorescence background, which can be several orders of magnitude larger than the Raman signal. To eliminate the disturbing fluorescence background, several approaches are available. Among instrumental methods, shifted excitation Raman difference spectroscopy (SERDS) has been widely applied and studied. Similarly, computational techniques, for instance extended multiplicative scatter correction (EMSC), have also been employed to remove undesired background contributions. Here, we present a theoretical and experimental evaluation and comparison of fluorescence background removal approaches for Raman spectra based on SERDS and EMSC. PMID:28749450
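Neither SERDS nor EMSC is reproduced here, but a common, simple computational baseline such methods are compared against, iterative polynomial baseline fitting, conveys the background-removal idea; the band positions, widths, and polynomial degree below are illustrative:

```python
import numpy as np

def poly_baseline_removal(x, spectrum, degree=3, iters=50):
    """Iterative polynomial baseline fitting: repeatedly fit a
    low-order polynomial, then clip the working spectrum to lie at or
    below that fit, so sharp bands stop pulling the baseline upward."""
    work = spectrum.copy()
    for _ in range(iters):
        base = np.polyval(np.polyfit(x, work, degree), x)
        work = np.minimum(work, base)   # clip peaks above the fit
    return spectrum - base

# synthetic spectrum: two narrow "Raman bands" on a smooth background
x = np.linspace(0.0, 1.0, 500)
background = 10.0 + 5.0 * x - 3.0 * x**2
bands = (2.0 * np.exp(-((x - 0.3) / 0.01) ** 2)
         + 1.5 * np.exp(-((x - 0.7) / 0.01) ** 2))
corrected = poly_baseline_removal(x, background + bands)
peak_height = corrected[np.argmin(np.abs(x - 0.3))]
residual_bg = float(np.median(np.abs(corrected[np.abs(x - 0.5) < 0.05])))
```

After correction the broad background is largely removed while the narrow bands survive near their original amplitude; EMSC and SERDS address the same problem with a physical model of the spectrum and a hardware modulation, respectively.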

  10. A New Method for Analyzing Near-Field Faraday Probe Data in Hall Thrusters

    NASA Technical Reports Server (NTRS)

    Huang, Wensheng; Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Kamhawi, Hani

    2013-01-01

    This paper presents a new method for analyzing near-field Faraday probe data obtained from Hall thrusters. Traditional methods spawned from far-field Faraday probe analysis rely on assumptions that are not applicable to near-field Faraday probe data. In particular, arbitrary choices for the point of origin and limits of integration have made interpretation of the results difficult. The new method, called iterative pathfinding, uses the evolution of the near-field plume with distance to provide feedback for determining the location of the point of origin. Although still susceptible to the choice of integration limits, this method presents a systematic approach to determining the origin point for calculating the divergence angle. The iterative pathfinding method is applied to near-field Faraday probe data taken in a previous study from the NASA-300M and NASA-457Mv2 Hall thrusters. Since these two thrusters use centrally mounted cathodes, the current density associated with the cathode plume is removed before applying iterative pathfinding. A procedure is presented for removing the cathode plume. The results of the analysis are compared to far-field probe analysis results. This paper ends with checks on the validity of the new method and discussions on the implications of the results.

  11. A New Method for Analyzing Near-Field Faraday Probe Data in Hall Thrusters

    NASA Technical Reports Server (NTRS)

    Huang, Wensheng; Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Kamhawi, Hani

    2013-01-01

    This paper presents a new method for analyzing near-field Faraday probe data obtained from Hall thrusters. Traditional methods spawned from far-field Faraday probe analysis rely on assumptions that are not applicable to near-field Faraday probe data. In particular, arbitrary choices for the point of origin and limits of integration have made interpretation of the results difficult. The new method, called iterative pathfinding, uses the evolution of the near-field plume with distance to provide feedback for determining the location of the point of origin. Although still susceptible to the choice of integration limits, this method presents a systematic approach to determining the origin point for calculating the divergence angle. The iterative pathfinding method is applied to near-field Faraday probe data taken in a previous study from the NASA-300M and NASA-457Mv2 Hall thrusters. Since these two thrusters use centrally mounted cathodes, the current density associated with the cathode plume is removed before applying iterative pathfinding. A procedure is presented for removing the cathode plume. The results of the analysis are compared to far-field probe analysis results. This paper ends with checks on the validity of the new method and discussions on the implications of the results.

  12. Removal of anti-Stokes emission background in STED microscopy by FPGA-based synchronous detection

    NASA Astrophysics Data System (ADS)

    Castello, M.; Tortarolo, G.; Coto Hernández, I.; Deguchi, T.; Diaspro, A.; Vicidomini, G.

    2017-05-01

    In stimulated emission depletion (STED) microscopy, the role of the STED beam is to de-excite, via stimulated emission, the fluorophores that have been previously excited by the excitation beam. This condition, together with specific beam intensity distributions, allows obtaining true sub-diffraction spatial resolution images. However, if the STED beam has a non-negligible probability to excite the fluorophores, a strong fluorescent background signal (anti-Stokes emission) reduces the effective resolution. For STED scanning microscopy, different synchronous detection methods have been proposed to remove this anti-Stokes emission background and recover the resolution. However, every method works only for a specific STED microscopy implementation. Here we present a user-friendly synchronous detection method compatible with any STED scanning microscope. It exploits a data acquisition (DAQ) card based on a field-programmable gate array (FPGA), which is progressively used in STED microscopy. In essence, the FPGA-based DAQ card synchronizes the fluorescent signal registration, the beam deflection, and the excitation beam interruption, providing a fully automatic pixel-by-pixel synchronous detection method. We validate the proposed method in both continuous wave and pulsed STED microscope systems.

  13. A self-consistent field method for galactic dynamics

    NASA Technical Reports Server (NTRS)

    Hernquist, Lars; Ostriker, Jeremiah P.

    1992-01-01

    The present study describes an algorithm for evolving collisionless stellar systems in order to investigate the evolution of systems with density profiles like the R^1/4 law, using only a few terms in the expansions. A good fit is obtained for a truncated isothermal distribution, which renders the method appropriate for galaxies with flat rotation curves. Calculations employing N of about 10^6-10^7 are straightforward on existing supercomputers, making possible simulations having significantly smoother fields than with direct methods such as tree-codes. Orbits are found in a given static or time-dependent gravitational field; the potential, phi(r, t), is revised from the resultant density, rho(r, t). Possible scientific uses of this technique are discussed, including tidal perturbations of dwarf galaxies, the adiabatic growth of central masses in spheroidal galaxies, instabilities in realistic galaxy models, and secular processes in galactic evolution.

  14. Method for imaging with low frequency electromagnetic fields

    DOEpatents

    Lee, Ki H.; Xie, Gan Q.

    1994-01-01

    A method for imaging with low frequency electromagnetic fields, and for interpreting the electromagnetic data using ray tomography, in order to determine the earth conductivity with high accuracy and resolution. The imaging method includes the steps of placing one or more transmitters at various positions in a plurality of transmitter holes, and placing a plurality of receivers in a plurality of receiver holes. The transmitters generate electromagnetic signals which diffuse through a medium, such as earth, toward the receivers. The measured diffusion field data H is then transformed into wavefield data U. The traveltimes corresponding to the wavefield data U are then obtained by charting the wavefield data U, using a different regularization parameter α for each transform. The desired property of the medium, such as conductivity, is then derived from the velocity, which in turn is constructed from the wavefield data U using ray tomography.

  15. Method for imaging with low frequency electromagnetic fields

    DOEpatents

    Lee, K.H.; Xie, G.Q.

    1994-12-13

    A method is described for imaging with low frequency electromagnetic fields, and for interpreting the electromagnetic data using ray tomography, in order to determine the earth conductivity with high accuracy and resolution. The imaging method includes the steps of placing one or more transmitters at various positions in a plurality of transmitter holes, and placing a plurality of receivers in a plurality of receiver holes. The transmitters generate electromagnetic signals which diffuse through a medium, such as earth, toward the receivers. The measured diffusion field data H is then transformed into wavefield data U. The travel times corresponding to the wavefield data U are then obtained by charting the wavefield data U, using a different regularization parameter α for each transform. The desired property of the medium, such as conductivity, is then derived from the velocity, which in turn is constructed from the wavefield data U using ray tomography. 13 figures.

  16. A Flexible Cosmic Ultraviolet Background Model

    NASA Astrophysics Data System (ADS)

    McQuinn, Matthew

    2016-10-01

    HST studies of the IGM, of the CGM, and of reionization-era galaxies are all aided by ionizing background models, which are a critical input in modeling the ionization state of diffuse, 10^4 K gas. The ionization state in turn enables the determination of densities and sizes of absorbing clouds and, when applied to the Ly-a forest, the global ionizing emissivity of sources. Unfortunately, studies that use these background models have no way of gauging the amount of uncertainty in the adopted model other than to recompute their results using previous background models with outdated observational inputs. As yet there has been no systematic study of uncertainties in the background model, and there unfortunately is no publicly available ultraviolet background code. A public code would enable users to update the calculation with the latest observational constraints, and it would allow users to experiment with varying the background model's assumptions regarding emissions and absorptions. We propose to develop a publicly available ionizing background code and, as an initial application, quantify the level of uncertainty in the ionizing background spectrum across cosmic time. As the background model improves, so does our understanding of (1) the sources that dominate ionizing emissions across cosmic time and (2) the properties of diffuse gas in the circumgalactic medium, the WHIM, and the Ly-a forest. HST is the primary telescope for studying both the highest redshift galaxies and low-redshift diffuse gas. The proposed program would benefit HST studies of the Universe at z ~ 0 all the way up to z = 10, including high-z galaxies observed in the HST Frontier Fields.

  17. Method of determining interwell oil field fluid saturation distribution

    DOEpatents

    Donaldson, Erle C.; Sutterfield, F. Dexter

    1981-01-01

    A method of determining the oil and brine saturation distribution in an oil field by taking electrical current and potential measurements among a plurality of open-hole wells geometrically distributed throughout the oil field. Poisson's equation is utilized to develop fluid saturation distributions from the electrical current and potential measurements. Both signal generating equipment and chemical means are used to develop current flow among the several open-hole wells.
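The role of Poisson's equation can be illustrated with a generic finite-difference solve; the Jacobi iteration, unit grid spacing, and single point source below are illustrative, not the patent's well geometry or conductivity model:

```python
import numpy as np

def solve_poisson(rhs, iters=5000):
    """Jacobi iteration for the 5-point discretization of
    laplacian(u) = rhs with u = 0 on the boundary (grid spacing 1)."""
    u = np.zeros_like(rhs)
    for _ in range(iters):
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                u[1:-1, :-2] + u[1:-1, 2:]
                                - rhs[1:-1, 1:-1])
    return u

# a point "current source" at the centre of a 33x33 grid of potentials
n = 33
rhs = np.zeros((n, n))
rhs[n // 2, n // 2] = -1.0
u = solve_poisson(rhs)
center = u[n // 2, n // 2]
```

The solved potential peaks at the injection point and falls off toward the grounded boundary; in the patent, measured well potentials constrain such a solve, from which the conductivity and hence the saturation distribution are inferred.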

  18. Metal-as-insulation variant of no-insulation HTS winding technique: pancake tests under high background magnetic field and high current at 4.2 K

    NASA Astrophysics Data System (ADS)

    Lécrevisse, T.; Badel, A.; Benkel, T.; Chaud, X.; Fazilleau, P.; Tixador, P.

    2018-05-01

    In the framework of a project aiming at fabricating a 10 T high temperature superconducting (HTS) insert to operate in a 20 T background field, we are investigating the behavior of pancakes consisting of a REBCO HTS tape co-wound with a stainless steel tape (metal-as-insulation (MI) coil). The MI winding induces a significant turn-to-turn electrical resistance, which helps to reduce the charging time delay. Despite this resistance, the self-protection feature of no-insulation coils is still enabled, thanks to the voltage limit of the power supply. We have built a single pancake coil representative of the pancake that will be used in the insert and performed tests under very high background magnetic field. Our coil experienced over 100 heater-induced quenches without a measurable increase of its internal resistance. We have gathered stability and quench behavior data for magnetic fields and engineering current densities (je) in the ranges of 0-17 T and 0-635 A mm^-2, respectively. We also present our very first experiments on the insert/outsert interaction in the case of a resistive magnet fault. We show that, while self-protection of the MI winding is effective in the case of an MI coil quench, a major issue arises from an outsert fault, which induces a huge current inside the MI coil.

  19. [Zhu Lian's cognition on theory and method of acupuncture and moxibustion under background of western medicine].

    PubMed

    Li, Su-yun; Zhang, Li-jian; Liu, Bing

    2014-11-01

    With new acupuncture and moxibustion as the study object, and based on the basic composition of acupuncture-moxibustion theory, ZHU Lian's distinctive views on the theory and methods of acupuncture and moxibustion under the background of western medicine are discussed from 3 aspects: meridian-acupoint theory, acupuncture-moxibustion technique theory and acupuncture-moxibustion treatment theory. ZHU Lian believed that the distribution of the 14 meridians was approximately identical to that of the nerves, so the meaning of acupoints could be understood with modern neuroanatomical knowledge, and the function of acupuncture could be explained from the angle of neurophysiology. Clinical diagnosis and treatment methods could be established using modern classifications of disease. ZHU Lian's views, which differed from the traditional theory and methods of acupuncture and moxibustion, were combined with the physiological and anatomical knowledge of her time and drew on Pavlov's theory of higher nervous activity; she was thus the first to put forward the opinion that acupuncture therapy cannot work without the involvement of the cerebral cortex.

  20. Skeletonization of Gridded Potential-Field Images

    NASA Astrophysics Data System (ADS)

    Gao, L.; Morozov, I. B.

    2012-12-01

    A new approach to skeletonization was developed for gridded potential-field data. Generally, skeletonization is a pattern-recognition technique allowing automatic recognition of near-linear features in images, measurement of their parameters, and analysis of their similarities. Our approach decomposes the images into arbitrarily-oriented "wavelets" characterized by positive or negative amplitudes, orientation angles, spatial dimensions, polarities, and other attributes. Orientations of the wavelets are obtained by scanning the azimuths to detect the strike direction of each anomaly. The wavelets are connected according to the similarities of these attributes, which leads to a "skeleton" map of the potential-field data. In addition, 2-D filtering is conducted concurrently with the wavelet-identification process, which allows extracting parameters of background trends and reduces the adverse effects of low-frequency background (which is often strong in potential-field maps) on skeletonization. By correlating the neighboring wavelets, linear anomalies are identified and characterized. The advantages of this algorithm are the generality and isotropy of feature detection, as well as being specifically designed for gridded data. With several options for background-trend extraction, the stability of lineament identification is improved and optimized. The algorithm is also integrated in a powerful processing system which allows combining it with numerous other tools, such as filtering, computation of the analytical signal, empirical mode decomposition, and various types of plotting. The method is applied to potential-field data for the Western Canada Sedimentary Basin, in a study area which extends from southern Saskatchewan into southwestern Manitoba. The target is the structure of the crystalline basement beneath the Phanerozoic sediments. The examples illustrate that skeletonization aids in the interpretation of complex structures at different scale lengths.

  1. A sparse equivalent source method for near-field acoustic holography.

    PubMed

    Fernandez-Grande, Efren; Xenaki, Angeliki; Gerstoft, Peter

    2017-01-01

    This study examines a near-field acoustic holography method consisting of a sparse formulation of the equivalent source method, based on the compressive sensing (CS) framework. The method, denoted Compressive-Equivalent Source Method (C-ESM), encourages spatially sparse solutions (based on the superposition of few waves) that are accurate when the acoustic sources are spatially localized. The importance of obtaining a non-redundant representation, i.e., a sensing matrix with low column coherence, and the inherent ill-conditioning of near-field reconstruction problems are addressed. Numerical and experimental results on a classical guitar and on a highly reactive dipole-like source are presented. C-ESM is valid beyond the conventional sampling limits, making wide-band reconstruction possible. Spatially extended sources can also be addressed with C-ESM, although in this case the obtained solution does not recover the spatial extent of the source.
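The sparse-reconstruction idea behind C-ESM can be illustrated in the abstract: recover a sparse source vector from fewer measurements than unknowns via l1-regularized least squares, here with plain iterative soft-thresholding (ISTA) and a random Gaussian matrix standing in for the acoustic transfer matrix of the actual method; all dimensions are illustrative:

```python
import numpy as np

def ista(A, b, lam=0.01, iters=2000):
    """Iterative soft-thresholding for min ||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - A.T @ (A @ x - b) / L      # gradient step on the quadratic
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft shrink
    return x

rng = np.random.default_rng(1)
m, n, k = 40, 120, 4                       # 40 "sensors", 120 candidate sources
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
active = rng.choice(n, size=k, replace=False)
x_true[active] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)
b = A @ x_true
x_hat = ista(A, b)
support_ok = set(np.argsort(-np.abs(x_hat))[:k]) == set(active)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

With only 40 measurements of 120 unknowns, the l1 penalty still recovers the 4 active sources, which is the regime in which C-ESM remains valid beyond conventional sampling limits.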

  2. Massive graviton on arbitrary background: derivation, syzygies, applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Laura; Deffayet, Cédric; IHES, Institut des Hautes Études Scientifiques, Le Bois-Marie, 35 route de Chartres, F-91440 Bures-sur-Yvette

    2015-06-23

    We give the detailed derivation of the fully covariant form of the quadratic action and the derived linear equations of motion for a massive graviton in an arbitrary background metric (which were presented in arXiv:1410.8302 [hep-th]). Our starting point is the de Rham-Gabadadze-Tolley (dRGT) family of ghost-free massive gravities and, using a simple model of this family, we are able to express this action and these equations of motion in terms of a single metric in which the graviton propagates, hence removing in particular the need for a "reference metric" which is present in the non-perturbative formulation. We show further how 5 covariant constraints can be obtained, including one which leads to the tracelessness of the graviton on flat space-time and removes the Boulware-Deser ghost. This last constraint involves powers and combinations of the curvature of the background metric. The 5 constraints are obtained for a background metric which is unconstrained, i.e. which does not have to obey the background field equations. We then apply these results to the case of Einstein space-times, where we show that the 5 constraints become trivial, and Friedmann-Lemaître-Robertson-Walker space-times, for which we correct in particular some results that appeared elsewhere. To reach our results, we derive several non-trivial identities, syzygies, involving the graviton field, its derivatives and the background metric curvature. These identities have their own interest. We also discover that there exist backgrounds for which the dRGT equations cannot be unambiguously linearized.

  3. Massive graviton on arbitrary background: derivation, syzygies, applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Laura; Deffayet, Cédric; Strauss, Mikael von, E-mail: bernard@iap.fr, E-mail: deffayet@iap.fr, E-mail: strauss@iap.fr

    2015-06-01

    We give the detailed derivation of the fully covariant form of the quadratic action and the derived linear equations of motion for a massive graviton in an arbitrary background metric (which were presented in arXiv:1410.8302 [hep-th]). Our starting point is the de Rham-Gabadadze-Tolley (dRGT) family of ghost-free massive gravities and, using a simple model of this family, we are able to express this action and these equations of motion in terms of a single metric in which the graviton propagates, hence removing in particular the need for a "reference metric" which is present in the non-perturbative formulation. We show further how 5 covariant constraints can be obtained, including one which leads to the tracelessness of the graviton on flat space-time and removes the Boulware-Deser ghost. This last constraint involves powers and combinations of the curvature of the background metric. The 5 constraints are obtained for a background metric which is unconstrained, i.e. which does not have to obey the background field equations. We then apply these results to the case of Einstein space-times, where we show that the 5 constraints become trivial, and Friedmann-Lemaître-Robertson-Walker space-times, for which we correct in particular some results that appeared elsewhere. To reach our results, we derive several non-trivial identities, syzygies, involving the graviton fields, its derivatives and the background metric curvature. These identities have their own interest. We also discover that there exist backgrounds for which the dRGT equations cannot be unambiguously linearized.

  4. The EPIC-MOS Particle-Induced Background Spectra

    NASA Technical Reports Server (NTRS)

    Kuntz, K. D.; Snowden, S. L.

    2007-01-01

    In order to analyse diffuse emission that fills the field of view, one must accurately characterize the instrumental backgrounds. For the XMM-Newton EPIC instrument these backgrounds include a temporally variable "quiescent" component, as well as the strongly variable soft proton contamination. We have characterized the spectral and spatial response of the EPIC detectors to these background components and have developed tools to remove these backgrounds from observations. The "quiescent" component was characterized using a combination of the filter-wheel-closed data and a database of unexposed-region data. The soft proton contamination was characterized by differencing images and spectra taken during flared and flare-free intervals. After application of our modeled backgrounds, the differences between independent observations of the same region of "blank sky" are consistent with the statistical uncertainties except when there is clear spectral evidence of solar wind charge exchange (SWCX) emission. Using a large sample of blank sky data, we show that strong magnetospheric SWCX emission requires elevated solar wind fluxes; observations through the densest part of the magnetosheath are not necessarily strongly contaminated with SWCX emission.

  5. Evolutionary programming-based univector field navigation method for fast mobile robots.

    PubMed

    Kim, Y J; Kim, J H; Kwon, D S

    2001-01-01

    Most navigation techniques with obstacle avoidance do not consider the robot orientation at the target position. These techniques deal with the robot position only and are independent of its orientation and velocity. To solve these problems, this paper proposes a novel univector field method for fast mobile robot navigation which introduces a normalized two-dimensional vector field. The method provides fast-moving robots with the desired posture at the target position and obstacle avoidance. To obtain the sub-optimal vector field, a function approximator is used and trained by evolutionary programming. Two kinds of vector fields are trained, one for final posture acquisition and the other for obstacle avoidance. Computer simulations and real experiments are carried out for a fast-moving mobile robot to demonstrate the effectiveness of the proposed scheme.
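    The trained univector field itself is not reproduced in the abstract. As a rough illustration of what a normalized two-dimensional guidance field looks like, here is a hand-written attractive/repulsive field; the function name, gain, and decay law are invented for the sketch and are not the evolutionary-programming result described above.

```python
import numpy as np

def univector(pos, goal, obstacle=None, k_rep=1.0):
    """Illustrative normalized 2-D guidance field: attraction toward the goal
    plus a repulsive term near an obstacle; the output is always unit length."""
    v = goal - pos
    d = np.linalg.norm(v)
    field = v / d if d > 0 else np.zeros(2)
    if obstacle is not None:
        u = pos - obstacle
        r = np.linalg.norm(u)
        if r > 0:
            field = field + k_rep * u / r**3  # repulsion magnitude decays as 1/r^2
    n = np.linalg.norm(field)
    return field / n if n > 0 else field

# far from obstacles the field simply points at the goal, with unit norm
heading = univector(np.array([0.0, 0.0]), np.array([4.0, 3.0]))  # -> (0.8, 0.6)
```

    The defining property of a univector field, a unit direction assigned to every position, is what lets a fast-moving robot follow it at constant speed; the learned fields in the paper additionally encode the desired final orientation.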

  6. Method of using triaxial magnetic fields for making particle structures

    DOEpatents

    Martin, James E.; Anderson, Robert A.; Williamson, Rodney L.

    2005-01-18

    A method of producing three-dimensional particle structures with enhanced magnetic susceptibility in three dimensions by applying a triaxial energetic field to a magnetic particle suspension and subsequently stabilizing said particle structure. Combinations of direct current and alternating current fields in three dimensions produce particle gel structures, honeycomb structures, and foam-like structures.

  7. Revealing infinite derivative gravity's true potential: The weak-field limit around de Sitter backgrounds

    NASA Astrophysics Data System (ADS)

    Edholm, James

    2018-03-01

    General Relativity is known to produce singularities in the potential generated by a point source. Our universe can be modeled as a de Sitter (dS) metric and we show that ghost-free infinite derivative gravity (IDG) produces a nonsingular potential around a dS background, while returning to the GR prediction at large distances. We also show that although there are an apparently infinite number of coefficients in the theory, only a finite number actually affect the predictions. By writing the linearized equations of motion in a simplified form, we find that at distances below the Hubble length scale, the difference between the IDG potential around a flat background and around a de Sitter background is negligible.

  8. Multi-phase-field method for surface tension induced elasticity

    NASA Astrophysics Data System (ADS)

    Schiedung, Raphael; Steinbach, Ingo; Varnik, Fathollah

    2018-01-01

    A method, based on the multi-phase-field framework, is proposed that adequately accounts for the effects of a coupling between surface free energy and elastic deformation in solids. The method is validated via a number of analytically solvable problems. In addition to stress states at mechanical equilibrium in complex geometries, the underlying multi-phase-field framework naturally allows us to account for the influence of surface energy induced stresses on phase transformation kinetics. This issue, which is of fundamental importance on the nanoscale, is demonstrated in the limit of fast diffusion for a solid sphere, which melts due to the well-known Gibbs-Thomson effect. This melting process is slowed down when coupled to surface energy induced elastic deformation.

  9. FIELD ANALYTICAL SCREENING PROGRAM: PCP METHOD - INNOVATIVE TECHNOLOGY EVALUATION REPORT

    EPA Science Inventory

    The Field Analytical Screening Program (FASP) pentachlorophenol (PCP) method uses a gas chromatograph (GC) equipped with a megabore capillary column and flame ionization detector (FID) and electron capture detector (ECD) to identify and quantify PCP. The FASP PCP method is designed...

  10. Multiframe super resolution reconstruction method based on light field angular images

    NASA Astrophysics Data System (ADS)

    Zhou, Shubo; Yuan, Yan; Su, Lijuan; Ding, Xiaomin; Wang, Jichao

    2017-12-01

    The plenoptic camera can directly obtain 4-dimensional light field information from a 2-dimensional sensor. However, based on the sampling theorem, the spatial resolution is greatly limited by the microlenses. In this paper, we present a method of reconstructing high-resolution images from the angular images. First, the ray tracing method is used to model the telecentric-based light field imaging process. Then, we analyze the subpixel shifts between the angular images extracted from the defocused light field data and the blur in the angular images. According to the analysis above, we construct the observation model from the ideal high-resolution image to the angular images. Applying the regularized super resolution method, we can obtain the super resolution result with a magnification ratio of 8. The results demonstrate the effectiveness of the proposed observation model.

  11. A telluric method for natural field induced polarization studies

    NASA Astrophysics Data System (ADS)

    Zorin, Nikita; Epishkin, Dmitrii; Yakovlev, Andrey

    2016-12-01

    Natural field induced polarization (NFIP) is a branch of low-frequency electromagnetics designed for detection of buried polarizable objects from magnetotelluric (MT) data. The conventional approach to the method deals with normalized MT apparent resistivity. We show that it is more favorable to extract the IP effect from electric (telluric) transfer functions alone. For lateral localization of polarizable bodies it is convenient to work with the telluric tensor determinant, which does not depend on the rotation of the receiving electric dipoles. The applicability of the new method was verified in the course of large-scale field research. The field work was conducted in a well-explored area in East Kazakhstan known for the presence of various IP sources such as graphite, magnetite, and sulfide mineralization. A new multichannel processing approach allowed the determination of the telluric tensor components with very good accuracy. This holds out hope that in some cases NFIP data may be used not only for detection of polarizable objects, but also for a rough estimation of their spectral IP characteristics.
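    The rotation invariance of the telluric tensor determinant, which the abstract relies on for lateral localization, is easy to verify numerically; the tensor values below are arbitrary placeholders, not field data.

```python
import numpy as np

def rot(a):
    """2-D rotation matrix for angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s], [s, c]])

T = np.array([[1.2, 0.3],
              [-0.5, 0.9]])          # placeholder telluric transfer tensor
d0 = np.linalg.det(T)

# rotating the receiving electric dipoles changes the tensor components,
# but not the determinant, since det(R) = 1 for any rotation R
d1 = np.linalg.det(rot(0.7) @ T @ rot(0.7).T)
```

    Because the determinant is unchanged by any choice of dipole orientation, maps of det(T) can be compared across stations without aligning the measurement setups.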

  12. Simulation of PEP-II Accelerator Backgrounds Using TURTLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barlow, R.J.; Fieguth, T.; /SLAC

    2006-02-15

    We present studies of accelerator-induced backgrounds in the BaBar detector at the SLAC B-Factory, carried out using LPTURTLE, a modified version of the DECAY TURTLE simulation package. Lost-particle backgrounds in PEP-II are dominated by a combination of beam-gas bremsstrahlung, beam-gas Coulomb scattering, radiative-Bhabha events, and beam-beam blow-up. The radiation damage and detector occupancy caused by the associated electromagnetic shower debris can limit the usable luminosity. In order to understand and mitigate such backgrounds, we have performed a full program of beam-gas and luminosity-background simulations that include the effects of the detector solenoidal field, detailed modeling of limiting apertures in both collider rings, and optimization of the betatron collimation scheme in the presence of large transverse tails.

  13. Single and tandem Fabry-Perot etalons as solar background filters for lidar.

    PubMed

    McKay, J A

    1999-09-20

    Atmospheric lidar is difficult in daylight because of sunlight scattered into the receiver field of view. In this research, methods for the design and performance analysis of Fabry-Perot etalons as solar background filters are presented. The factor by which the signal-to-background ratio is enhanced is defined as a measure of the performance of the etalon as a filter. Equations for evaluating this parameter are presented for single-, double-, and triple-etalon filter systems. The role of reflective coupling between etalons is examined and shown to substantially reduce the contributions of the second and third etalons to the filter performance. Attenuators placed between the etalons can improve the filter performance, at modest cost to the signal transmittance. The principal parameter governing the performance of the etalon filters is the etalon defect finesse. Practical limitations on etalon plate smoothness and parallelism cause the defect finesse to be relatively low, especially in the ultraviolet, and this sets upper limits on the capability of tandem etalon filters to suppress the solar background at tolerable cost to the signal.
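    For a single lossless etalon, the signal-to-background enhancement can be estimated from the standard Airy transmittance: a narrowband signal on resonance passes essentially unattenuated, while a spectrally flat background is suppressed to the phase-averaged transmittance. A minimal numerical check (the reflectance value is chosen arbitrarily for illustration):

```python
import numpy as np

def etalon_transmittance(delta, R):
    """Airy transmittance of a lossless Fabry-Perot etalon.
    delta: round-trip phase (rad); R: mirror reflectance."""
    Fc = 4 * R / (1 - R) ** 2  # coefficient of finesse
    return 1.0 / (1.0 + Fc * np.sin(delta / 2) ** 2)

# on resonance the signal passes; a flat background is reduced to the
# phase-averaged transmittance, 1/sqrt(1 + Fc) = (1 - R)/(1 + R)
delta = np.linspace(0.0, 2 * np.pi, 100000)
peak = etalon_transmittance(0.0, R=0.9)
background = etalon_transmittance(delta, R=0.9).mean()
```

    For R = 0.9 the enhancement factor peak/background is (1 + R)/(1 - R) = 19; real etalons fall short of this because the defect finesse discussed above broadens the transmission peak.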

  14. Mapping gravitational-wave backgrounds using methods from CMB analysis: Application to pulsar timing arrays

    NASA Astrophysics Data System (ADS)

    Gair, Jonathan; Romano, Joseph D.; Taylor, Stephen; Mingarelli, Chiara M. F.

    2014-10-01

    We describe an alternative approach to the analysis of gravitational-wave backgrounds, based on the formalism used to characterize the polarization of the cosmic microwave background. In contrast to standard analyses, this approach makes no assumptions about the nature of the background and so has the potential to reveal much more about the physical processes that generated it. An arbitrary background can be decomposed into modes whose angular dependence on the sky is given by gradients and curls of spherical harmonics. We derive the pulsar timing overlap reduction functions for the individual modes, which are given by simple combinations of spherical harmonics evaluated at the pulsar locations. We show how these can be used to recover the components of an arbitrary background, giving explicit results for both isotropic and anisotropic uncorrelated backgrounds. We also find that the response of a pulsar timing array to curl modes is identically zero, so half of the gravitational-wave sky will never be observed using pulsar timing, no matter how many pulsars are included in the array. An isotropic, unpolarized and uncorrelated background can be accurately represented using only three modes, and so a search of this type will be only slightly more complicated than the standard cross-correlation search using the Hellings and Downs overlap reduction function. However, by measuring the components of individual modes of the background and checking for consistency with isotropy, this approach has the potential to reveal much more information. Each individual mode on its own describes a background that is correlated between different points on the sky. A measurement of the components that indicates the presence of correlations in the background on large angular scales would suggest startling new physics.
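    The standard cross-correlation search mentioned above uses the Hellings and Downs overlap reduction function, which for an isotropic, unpolarized, uncorrelated background depends only on the angular separation of a pulsar pair. A common normalization convention (1/2 at zero separation, for distinct pulsars) is sketched below; treat the exact normalization as an assumption.

```python
import numpy as np

def hellings_downs(xi):
    """Hellings-Downs overlap reduction function for a pulsar pair separated
    by angle xi (radians), normalized to 1/2 as xi -> 0."""
    x = (1.0 - np.cos(xi)) / 2.0
    x = np.clip(x, 1e-300, None)   # avoid log(0) at xi = 0
    return 1.5 * x * np.log(x) - x / 4.0 + 0.5

# antipodal pulsars correlate at 1/4; the curve dips negative near ~82 degrees
```

    The curve's characteristic shape, positive at small separations, negative near 82 degrees, and 1/4 at 180 degrees, is what distinguishes a gravitational-wave background from common noise sources in pulsar timing data.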

  15. Saliency Detection on Light Field.

    PubMed

    Li, Nianyi; Ye, Jinwei; Ji, Yu; Ling, Haibin; Yu, Jingyi

    2017-08-01

    Existing saliency detection approaches use images as inputs and are sensitive to foreground/background similarities, complex background textures, and occlusions. We explore the problem of using light fields as input for saliency detection. Our technique is enabled by the availability of commercial plenoptic cameras that capture the light field of a scene in a single shot. We show that the unique refocusing capability of light fields provides useful focusness, depths, and objectness cues. We further develop a new saliency detection algorithm tailored for light fields. To validate our approach, we acquire a light field database of a range of indoor and outdoor scenes and generate the ground truth saliency map. Experiments show that our saliency detection scheme can robustly handle challenging scenarios such as similar foreground and background, cluttered background, complex occlusions, etc., and achieve high accuracy and robustness.

  16. Method of improving field emission characteristics of diamond thin films

    DOEpatents

    Krauss, A.R.; Gruen, D.M.

    1999-05-11

    A method of preparing diamond thin films with improved field emission properties is disclosed. The method includes preparing a diamond thin film on a substrate, such as Mo, W, Si and Ni. An atmosphere of hydrogen (molecular or atomic) can be provided above the already deposited film to form absorbed hydrogen to reduce the work function and enhance field emission properties of the diamond film. In addition, hydrogen can be absorbed on intergranular surfaces to enhance electrical conductivity of the diamond film. The treated diamond film can be part of a microtip array in a flat panel display. 3 figs.

  17. Method of improving field emission characteristics of diamond thin films

    DOEpatents

    Krauss, Alan R.; Gruen, Dieter M.

    1999-01-01

    A method of preparing diamond thin films with improved field emission properties. The method includes preparing a diamond thin film on a substrate, such as Mo, W, Si and Ni. An atmosphere of hydrogen (molecular or atomic) can be provided above the already deposited film to form absorbed hydrogen to reduce the work function and enhance field emission properties of the diamond film. In addition, hydrogen can be absorbed on intergranular surfaces to enhance electrical conductivity of the diamond film. The treated diamond film can be part of a microtip array in a flat panel display.

  18. Method and apparatus for reducing solvent luminescence background emissions

    DOEpatents

    Affleck, Rhett L.; Ambrose, W. Patrick; Demas, James N.; Goodwin, Peter M.; Johnson, Mitchell E.; Keller, Richard A.; Petty, Jeffrey T.; Schecker, Jay A.; Wu, Ming

    1998-01-01

    The detectability of luminescent molecules in solution is enhanced by reducing the background luminescence due to impurity species also present in the solution. A light source that illuminates the solution acts to photolyze the impurities so that the impurities do not luminesce in the fluorescence band of the molecule of interest. Molecules of interest may be carried through the photolysis region in the solution or may be introduced into the solution after the photolysis region.

  19. An Efficient and Examinable Illegal Fallow Fields Detecting Method with Spatio-Temporal Information Integration

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Hao; Chu, Tzu-How

    2017-04-01

    To control rice production and farm usage in Taiwan, the Agriculture and Food Agency (AFA) has published a series of policies since 1983 to subsidize farmers to plant different crops or to practice fallow. Because there was no efficient and examinable mechanism to verify the fallow fields surveyed by township offices, illegal fallow fields still recurred each year. In this research, we used remote sensing images, GIS data of fields, and application records of fallow fields to establish an illegal fallow field detecting method in Yunlin County in central Taiwan. This method included: 1. collecting multi-temporal images from FS-2 or the SPOT series over 4 time periods; 2. combining the application records and GIS data of fields to verify the locations of fallow fields; 3. conducting a ground truth survey and classifying images with ISODATA and Maximum Likelihood Classification (MLC); 4. defining the land cover type of fallow fields by zonal statistics; 5. verifying accuracy against ground truth; 6. developing a potential illegal fallow field survey method and benefit estimation. We used 190 fallow fields, 127 legal and 63 illegal, as ground truth; the producer and user accuracies of illegal fallow field interpretation were 71.43% and 38.46%, respectively. If township offices surveyed the 117 fields classified as illegal, 45 of the 63 illegal fallow fields would be detected. By using our method, township offices can save 38.42% of the manpower needed to detect illegal fallow fields while achieving an examinable 71.43% producer accuracy.
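    The reported figures are mutually consistent and follow directly from the stated counts (45 correct detections, 63 true illegal fields, 117 fields classified as illegal, 190 fields total):

```python
# confusion counts taken from the abstract
true_illegal = 63         # illegal fallow fields in the ground truth
classified_illegal = 117  # fields the classifier flagged as illegal
hits = 45                 # flagged fields that are truly illegal
total_fields = 190

producer_acc = hits / true_illegal        # fraction of real illegal fields found
user_acc = hits / classified_illegal      # fraction of flagged fields truly illegal
manpower_saved = 1 - classified_illegal / total_fields  # surveys avoided
```

    Producer accuracy (71.43%) measures omission error from the ground-truth side, while user accuracy (38.46%) measures commission error from the map side; the 38.42% manpower saving is simply the share of the 190 fields that never need a site visit.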

  20. The method of generating functions in exact scalar field inflationary cosmology

    NASA Astrophysics Data System (ADS)

    Chervon, Sergey V.; Fomin, Igor V.; Beesham, Aroonkumar

    2018-04-01

    The construction of exact solutions in scalar field inflationary cosmology is of growing interest. In this work, we review the results which have been obtained with the help of one of the most effective methods, viz., the method of generating functions for the construction of exact solutions in scalar field cosmology. We also include the superpotential method in the discussion, as it may be considered a bridge to the slow-roll approximation equations. Based on the review, we suggest a classification for the generating functions, and find a connection for all of them with the superpotential.

  1. A new method for indirectly estimating infiltration of paddy fields in situ

    NASA Astrophysics Data System (ADS)

    Xu, Yunqiang; Su, Baolin; Wang, Hongqi; He, Jingyi

    2018-06-01

    Infiltration is one of the major processes in water balance research and pollution load estimation in paddy fields. In this study, a new method for indirectly estimating infiltration of paddy fields in situ was proposed and implemented in the Taihu Lake basin. When there is no rainfall, irrigation, or artificial drainage, the water depth variation of a paddy field is influenced only by evapotranspiration and infiltration (E + F). Firstly, (E + F) was estimated by determining the steady rate of decrease of water depth; then the evapotranspiration (ET) of the paddy field was calculated by using the crop coefficient method with the recommended FAO-56 Penman-Monteith equation; finally, the infiltration of the paddy field was obtained by subtracting ET from (E + F). Results show that the mean infiltration of the studied paddy field during the rice jointing-booting period was 7.41 mm day-1, and the mean vertical infiltration and lateral seepage of the paddy field were 5.46 and 1.95 mm day-1, respectively.
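    The water-balance arithmetic is simple to reproduce. The 11.6 and 4.19 mm/day inputs below are hypothetical values chosen only to illustrate the subtraction; the 5.46 + 1.95 decomposition is taken from the abstract.

```python
def infiltration(e_plus_f, et):
    """Water-balance residual: infiltration F = (E + F) - ET, in mm/day."""
    return e_plus_f - et

# the reported mean infiltration decomposes into vertical and lateral parts
total = 5.46 + 1.95  # mm/day: vertical infiltration + lateral seepage

# hypothetical illustration: a steady water-depth drop (E + F) of 11.6 mm/day
# with a crop-coefficient ET estimate of 4.19 mm/day gives F = 7.41 mm/day
f_example = infiltration(11.6, 4.19)
```

    The whole method hinges on measuring (E + F) from the ponded water level alone and pushing all the modeling effort into the independent ET estimate.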

  2. Field Evaluation of Advanced Methods of Subsurface Exploration for Transit Tunneling

    DOT National Transportation Integrated Search

    1980-06-01

    This report presents the results of a field evaluation of advanced methods of subsurface exploration on an ongoing urban rapid transit tunneling project. The objective of this study is to evaluate, through a field demonstration project, the feasibili...

  3. A Rigorous Geometric Derivation of the Chiral Anomaly in Curved Backgrounds

    NASA Astrophysics Data System (ADS)

    Bär, Christian; Strohmaier, Alexander

    2016-11-01

    We discuss the chiral anomaly for a Weyl field in a curved background and show that a novel index theorem for the Lorentzian Dirac operator can be applied to describe the gravitational chiral anomaly. A formula for the total charge generated by the gravitational and gauge field background is derived directly in Lorentzian signature and in a mathematically rigorous manner. It contains a term identical to the integrand in the Atiyah-Singer index theorem and another term involving the η-invariant of the Cauchy hypersurfaces.

  4. A new gradient shimming method based on undistorted field map of B0 inhomogeneity.

    PubMed

    Bao, Qingjia; Chen, Fang; Chen, Li; Song, Kan; Liu, Zao; Liu, Chaoyang

    2016-04-01

    Most existing gradient shimming methods for NMR spectrometers estimate field maps that resolve B0 inhomogeneity spatially from dual gradient-echo (GRE) images acquired at different echo times. However, the distortions induced by the B0 inhomogeneity that always exists in the GRE images can result in estimated field maps that are distorted in both geometry and intensity, leading to inaccurate shimming. This work proposes a new gradient shimming method based on an undistorted field map of B0 inhomogeneity obtained by a more accurate field map estimation technique. Compared to the traditional field map estimation method, this new method exploits both the positive and negative polarities of the frequency-encoding gradients to eliminate the distortions caused by B0 inhomogeneity in the field map. Next, the corresponding automatic post-processing procedure is introduced to obtain an undistorted B0 field map based on knowledge of the invariant characteristics of the B0 inhomogeneity and the variant polarity of the encoding gradient. The experimental results on both simulated and real gradient shimming tests demonstrate the high performance of this new method. Copyright © 2015 Elsevier Inc. All rights reserved.
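    The distortion-free estimation is the paper's contribution; the underlying dual-echo field-map relation, however, is standard: the off-resonance frequency is the (wrapped) phase difference divided by 2π times the echo-time difference. A minimal sketch with made-up echo times and a single synthetic voxel:

```python
import numpy as np

def field_map_hz(phase1, phase2, te1, te2):
    """Off-resonance map in Hz from two gradient-echo phase images.
    The phase difference is wrapped into (-pi, pi] via a complex ratio."""
    dphi = np.angle(np.exp(1j * (phase2 - phase1)))
    return dphi / (2 * np.pi * (te2 - te1))

# synthetic voxel 50 Hz off-resonance: phase accrues linearly with echo time
te1, te2 = 2e-3, 4e-3                 # echo times in seconds (made up)
p1 = 2 * np.pi * 50 * te1
p2 = 2 * np.pi * 50 * te2             # field_map_hz(p1, p2, te1, te2) -> 50 Hz
```

    In practice `phase1` and `phase2` are full 2-D or 3-D phase images, and it is exactly this map whose geometric distortion the proposed positive/negative gradient-polarity acquisition removes.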

  5. Thresholding of auditory cortical representation by background noise

    PubMed Central

    Liang, Feixue; Bai, Lin; Tao, Huizhong W.; Zhang, Li I.; Xiao, Zhongju

    2014-01-01

    It is generally thought that background noise can mask auditory information. However, how the noise specifically transforms neuronal auditory processing in a level-dependent manner remains to be carefully determined. Here, with in vivo loose-patch cell-attached recordings in layer 4 of the rat primary auditory cortex (A1), we systematically examined how continuous wideband noise of different levels affected receptive field properties of individual neurons. We found that the background noise, when above a certain critical/effective level, resulted in an elevation of intensity threshold for tone-evoked responses. This increase of threshold was linearly dependent on the noise intensity above the critical level. As such, the tonal receptive field (TRF) of individual neurons was translated upward as an entirety toward high intensities along the intensity domain. This resulted in preserved preferred characteristic frequency (CF) and the overall shape of TRF, but reduced frequency responding range and an enhanced frequency selectivity for the same stimulus intensity. Such translational effects on intensity threshold were observed in both excitatory and fast-spiking inhibitory neurons, as well as in both monotonic and nonmonotonic (intensity-tuned) A1 neurons. Our results suggest that in a noise background, fundamental auditory representations are modulated through a background level-dependent linear shifting along intensity domain, which is equivalent to reducing stimulus intensity. PMID:25426029

  6. Behavior analysis of video object in complicated background

    NASA Astrophysics Data System (ADS)

    Zhao, Wenting; Wang, Shigang; Liang, Chao; Wu, Wei; Lu, Yang

    2016-10-01

    This paper aims to achieve robust behavior recognition of video objects in complicated backgrounds. Features of the video object are described and modeled according to the depth information of three-dimensional video. Multi-dimensional eigenvectors are constructed and used to process high-dimensional data. Stable object tracking in complex scenes can be achieved with multi-feature based behavior analysis, so as to obtain the motion trail. Subsequently, effective behavior recognition of the video object is obtained according to the decision criteria. Moreover, both the real-time performance of the algorithms and the accuracy of the analysis are greatly improved. The theory and methods for the behavior analysis of video objects in real scenes put forward by this project have broad application prospects and important practical significance in security, counter-terrorism, military, and many other fields.

  7. Introducing 3D U-statistic method for separating anomaly from background in exploration geochemical data with associated software development

    NASA Astrophysics Data System (ADS)

    Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir

    2016-03-01

    The U-statistic method is one of the most important structural methods to separate anomaly from background. It considers the location of samples and carries out the statistical analysis of the data without judging from a geochemical point of view, and tries to separate subpopulations and determine anomalous areas. In the present study, to use the U-statistic method under a three-dimensional (3D) condition, the U-statistic is applied to the grade of two ideal test examples, taking the sample Z values (elevation) into account. This is the first time that the method has been applied under a 3D condition. To evaluate the performance of the 3D U-statistic method, and in order to compare the U-statistic with a non-structural method, the method of threshold assessment based on median and standard deviation (MSD method) is applied to the two test examples. Results show that the samples indicated as anomalous by the U-statistic method are more regular and involve less dispersion than those indicated by the MSD method, so that, according to the locations of the anomalous samples, their denser areas can be determined as promising zones. Moreover, results show that at a threshold of U = 0, the total error of misclassification for the U-statistic method is much smaller than the total error of the x̄ + n × s criterion. Finally, a 3D model of the two test examples for separating anomaly from background using the 3D U-statistic method is provided. The source code for a software program, which was developed in the MATLAB programming language in order to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all the geochemical varieties and can be used in similar exploration projects.
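    The non-structural threshold that the abstract compares against (mean plus n standard deviations) can be sketched in a few lines; the sample grades below are invented for illustration.

```python
import numpy as np

def msd_anomalies(values, n=2.0):
    """Flag samples above mean + n*std, the non-structural threshold
    criterion the abstract compares the U-statistic against.
    Uses the population standard deviation (numpy's default, ddof=0)."""
    values = np.asarray(values, dtype=float)
    threshold = values.mean() + n * values.std()
    return threshold, values > threshold

vals = [10, 12, 11, 9, 13, 10, 11, 95]   # one obvious grade anomaly
threshold, mask = msd_anomalies(vals, n=2.0)  # only the last sample is flagged
```

    Note that this criterion ignores sample locations entirely, which is exactly the shortcoming the spatially aware U-statistic is designed to address.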

  8. GafChromic EBT film dosimetry with flatbed CCD scanner: a novel background correction method and full dose uncertainty analysis.

    PubMed

    Saur, Sigrun; Frengen, Jomar

    2008-07-01

    Film dosimetry using radiochromic EBT film in combination with a flatbed charge-coupled device scanner is a useful method both for two-dimensional verification of intensity-modulated radiation treatment plans and for general quality assurance of treatment planning systems and linear accelerators. Unfortunately, the response over the scanner area is nonuniform, and when not corrected for, this results in a systematic error in the measured dose which is both dose and position dependent. In this study a novel method for background correction is presented. The method is based on the subtraction of a correction matrix, a matrix that is based on scans of films that are irradiated to nine dose levels in the range 0.08-2.93 Gy. Because the response of the film is dependent on the film's orientation with respect to the scanner, correction matrices for both landscape-oriented and portrait-oriented scans were made. In addition to the background correction method, a full dose uncertainty analysis of the film dosimetry procedure was performed. This analysis takes into account the fit uncertainty of the calibration curve, the variation in response for different film sheets, the nonuniformity after background correction, and the noise in the scanned films. The film analysis was performed for film pieces of size 16 x 16 cm, all with the same lot number, and all irradiations were done perpendicularly onto the films. The results show that the 2-sigma dose uncertainty at 2 Gy is about 5% and 3.5% for landscape and portrait scans, respectively. The uncertainty gradually increases as the dose decreases, but at 1 Gy the 2-sigma dose uncertainty is still as good as 6% and 4% for landscape and portrait scans, respectively. The study shows that film dosimetry using GafChromic EBT film, an Epson Expression 1680 Professional scanner and a dedicated background correction technique gives precise and accurate results. For the purpose of dosimetric verification, the calculated dose distribution...

  9. A beam hardening and dispersion correction for x-ray dark-field radiography.

    PubMed

    Pelzer, Georg; Anton, Gisela; Horn, Florian; Rieger, Jens; Ritter, André; Wandner, Johannes; Weber, Thomas; Michel, Thilo

    2016-06-01

    X-ray dark-field imaging promises information on the small-angle scattering properties even of large samples. However, the dark-field image is correlated with the object's attenuation and phase shift if a polychromatic x-ray spectrum is used. A method to remove part of these correlations is proposed. The experimental setup for image acquisition was modeled in a wave-field simulation to quantify the dark-field signals originating solely from a material's attenuation and phase shift. A calibration matrix was simulated for ICRU46 breast tissue. Using the simulated data, a dark-field image of a human mastectomy sample was corrected for the fingerprint of the attenuation and phase images. Comparing the simulated, attenuation-based dark-field values to a phantom measurement, good agreement was found. Applying the proposed method to mammographic dark-field data, a reduction of the dark-field background and anatomical noise was achieved. The contrast between microcalcifications and their surrounding background was increased. The authors show that the influence of beam hardening and dispersion can be quantified by simulation and, thus, measured image data can be corrected. The simulation allows one to determine the corresponding dark-field artifacts for a wide range of setup parameters, like tube voltage and filtration. The application of the proposed method to mammographic dark-field data shows an increase in contrast compared to the original image, which might simplify a further image-based diagnosis.

  10. Data Friction Meets Social Friction: Challenges for standardization in emerging fields of geoscience

    NASA Astrophysics Data System (ADS)

    Darch, P. T.

    2017-12-01

    Many interdisciplinary endeavors in the geosciences occur in emergent scientific fields. These fields are often characterized by heterogeneity of methods for production and collection of data, and by data scarcity. This paper presents findings about processes of methods standardization from a long-term case study of an emergent, data-scarce field, the deep subseafloor biosphere. Researchers come from many physical and life science backgrounds to study interactions between microbial life in the seafloor and the physical environment they inhabit. Standardization of methods for collecting data promises multiple benefits to this field, including: Addressing data scarcity through enabling greater data reuse and promoting better interoperability with large scale infrastructures; Fostering stronger collaborative links between researchers distributed across institutions and backgrounds. Ongoing standardization efforts in the field involve not only scientific judgments about which among a range of methods is most efficient, least biased, or most reliable; they also encounter multiple difficult social challenges, including: Lack of agreed upon criteria about how to judge competing methods: should efficiency, bias, or reliability take priority?; Lack of resources to carry out the work necessary to determine standards, particularly acute in emergent fields; Concerns that standardization is premature in such a new field, foreclosing the possibility of better methods being developed in the future; Concerns that standardization could prematurely shut down important scientific debates; Concerns among some researchers that their own work may become obsolete should the methods chosen as standard be different from their own. The success of these standardization efforts will depend on addressing both scientific and social dimensions, to ensure widespread acceptance among researchers in the field.

  11. A method for the estimate of the wall diffusion for non-axisymmetric fields using rotating external fields

    NASA Astrophysics Data System (ADS)

    Frassinetti, L.; Olofsson, K. E. J.; Fridström, R.; Setiadi, A. C.; Brunsell, P. R.; Volpe, F. A.; Drake, J.

    2013-08-01

    A new method for estimating the wall diffusion time of non-axisymmetric fields is developed. The method, based on rotating external fields and on measurement of the wall frequency response, is tested in EXTRAP T2R. It allows the experimental estimate of the wall diffusion time for each Fourier harmonic and of the wall diffusion toroidal asymmetries. The method intrinsically accounts for the effects of three-dimensional structures and of the shell gaps. Far from the gaps, experimental results are in good agreement with the diffusion time estimated with a simple cylindrical model that assumes a homogeneous wall. The method is also applied with non-standard configurations of the coil array, in order to mimic tokamak-relevant settings with partial wall coverage and active coils of large toroidal extent. The comparison with the full-coverage results shows good agreement if the effects of the relevant sidebands are considered.
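
    The simple cylindrical homogeneous-wall model mentioned above treats the thin resistive shell as a first-order low-pass filter: the field penetrating the wall, relative to the applied rotating field, is 1/(1 + iωτ_w). A sketch of recovering τ_w from a measured complex frequency response for one harmonic (function names and the grid-search fit are illustrative, not the paper's procedure):

```python
import numpy as np

def shell_response(omega, tau_w):
    # Thin homogeneous shell: penetrated field / applied rotating field,
    # a first-order low-pass in frequency with time constant tau_w.
    return 1.0 / (1.0 + 1j * omega * tau_w)

def fit_wall_time(omega, measured, tau_grid=None):
    # Least-squares fit of tau_w to a measured complex frequency response
    # (grid search for simplicity and robustness).
    if tau_grid is None:
        tau_grid = np.linspace(1e-4, 1e-1, 2000)
    errs = [np.sum(np.abs(measured - shell_response(omega, t)) ** 2)
            for t in tau_grid]
    return float(tau_grid[int(np.argmin(errs))])
```

    Repeating the fit harmonic by harmonic yields the per-harmonic diffusion times and hence the toroidal asymmetries discussed in the abstract.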

  12. Improved methods for the measurement and analysis of stellar magnetic fields

    NASA Technical Reports Server (NTRS)

    Saar, Steven H.

    1988-01-01

    The paper presents several improved methods for the measurement of magnetic fields on cool stars which take into account simple radiative transfer effects and the exact Zeeman patterns. Using these methods, high-resolution, low-noise data can be fitted with theoretical line profiles to determine the mean magnetic field strength in stellar active regions and a model-dependent fraction of the stellar surface (filling factor) covered by these regions. Random errors in the derived field strength and filling factor are parameterized in terms of signal-to-noise ratio, wavelength, spectral resolution, stellar rotation rate, and the magnetic parameters themselves. Weak line blends, if left uncorrected, can have significant systematic effects on the derived magnetic parameters, and thus several methods are developed to compensate partially for them. The magnetic parameters determined by previous methods likely have systematic errors because of such line blends and because of line saturation effects. Other sources of systematic error are explored in detail. These sources of error currently make it difficult to determine the magnetic parameters of individual stars to better than about ±20 percent.

  13. A sparse reconstruction method for the estimation of multiresolution emission fields via atmospheric inversion

    DOE PAGES

    Ray, J.; Lee, J.; Yadav, V.; ...

    2014-08-20

    We present a sparse reconstruction scheme that can also be used to ensure non-negativity when fitting wavelet-based random field models to limited observations in non-rectangular geometries. The method is relevant when multiresolution fields are estimated using linear inverse problems. Examples include the estimation of emission fields for many anthropogenic pollutants using atmospheric inversion or hydraulic conductivity in aquifers from flow measurements. The scheme is based on three new developments. Firstly, we extend an existing sparse reconstruction method, Stagewise Orthogonal Matching Pursuit (StOMP), to incorporate prior information on the target field. Secondly, we develop an iterative method that uses StOMP to impose non-negativity on the estimated field. Finally, we devise a method, based on compressive sensing, to limit the estimated field within an irregularly shaped domain. We demonstrate the method on the estimation of fossil-fuel CO2 (ffCO2) emissions in the lower 48 states of the US. The application uses a recently developed multiresolution random field model and synthetic observations of ffCO2 concentrations from a limited set of measurement sites. We find that our method for limiting the estimated field within an irregularly shaped region is about a factor of 10 faster than conventional approaches. It also reduces the overall computational cost by a factor of two. Further, the sparse reconstruction scheme imposes non-negativity without introducing strong nonlinearities, such as those introduced by employing log-transformed fields, and thus reaps the benefits of simplicity and computational speed that are characteristic of linear inverse problems.
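
    The key idea above, a sparsity-promoting linear inversion that keeps the estimated field non-negative by projection rather than by a log-transform, can be illustrated with a much simpler stand-in (non-negative ISTA rather than the authors' StOMP extension); all names are hypothetical.

```python
import numpy as np

def nonneg_sparse_estimate(A, y, lam=1e-3, n_iter=5000):
    """Illustrative stand-in for the paper's scheme: an l1-regularized
    linear inversion that enforces x >= 0 by projection at every step
    (non-negative ISTA, not the authors' StOMP extension)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # step size ensuring convergence
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (y - A @ x)               # gradient of the data misfit
        x = np.maximum(x + step * grad - step * lam, 0.0)  # shrink + project
    return x
```

    Restricting the estimate to an irregularly shaped domain would correspond, in this sketch, to keeping only the columns of A whose footprint lies inside the domain.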

  14. A comparison between GO/aperture-field and physical-optics methods for offset reflectors

    NASA Technical Reports Server (NTRS)

    Rahmat-Samii, Y.

    1984-01-01

    Both geometrical optics (GO)/aperture-field and physical-optics (PO) methods are used extensively in the diffraction analysis of offset parabolic and dual reflectors. An analytical/numerical comparative study is performed to demonstrate the limitations of the GO/aperture-field method for accurately predicting the sidelobe and null positions and levels. In particular, it is shown that for offset parabolic reflectors and for feeds located at the focal point, the predicted far-field patterns (amplitude) by the GO/aperture-field method will always be symmetric even in the offset plane. This, of course, is inaccurate for the general case and it is shown that the physical-optics method can result in asymmetric patterns for cases in which the feed is located at the focal point. Representative numerical data are presented and a comparison is made with available measured data.

  15. The diffuse infrared background - COBE and other observations

    NASA Technical Reports Server (NTRS)

    Hauser, M. G.; Kelsall, T.; Moseley, S. H., Jr.; Silverberg, R. F.; Murdock, T.; Toller, G.; Spiesman, W.; Weiland, J.

    1991-01-01

    The Diffuse Infrared Background Experiment (DIRBE) on the Cosmic Background Explorer (COBE) satellite is designed to conduct a sensitive search for an isotropic cosmic infrared background radiation over the spectral range from 1 to 300 micrometers. The cumulative emissions of pregalactic, protogalactic, and evolving galactic systems are expected to be recorded in this background. The DIRBE instrument, a 10 spectral band absolute photometer with a 0.7 deg field of view, maps the full sky with high redundancy at solar elongation angles ranging from 64 to 124 degrees to facilitate separation of interplanetary, Galactic, and extragalactic sources of emission. Initial sky maps show the expected character of the foreground emissions, with relative minima at wavelengths of 3.4 micrometers and longward of 100 micrometers. Extensive modelling of the foregrounds, just beginning, will be required to isolate the extragalactic component. In this paper, we summarize the status of diffuse infrared background observations from the DIRBE, and compare preliminary results with those of recent rocket and satellite instruments.

  16. Longitudinal leading-twist distribution amplitude of the J/ψ meson within the background field theory

    NASA Astrophysics Data System (ADS)

    Fu, Hai-Bing; Zeng, Long; Cheng, Wei; Wu, Xing-Gang; Zhong, Tao

    2018-04-01

    We present a detailed study of the J/ψ meson longitudinal leading-twist distribution amplitude φ_{2;J/ψ}^|| using QCD sum rules within the background field theory. Keeping all the nonperturbative condensates up to dimension six, we obtain accurate QCD sum rules for the moments ⟨ξ^n⟩_{J/ψ}^||. The first three are ⟨ξ^2⟩_{J/ψ}^|| = 0.083(12), ⟨ξ^4⟩_{J/ψ}^|| = 0.015(5), and ⟨ξ^6⟩_{J/ψ}^|| = 0.003(2), respectively. These values indicate a single-peaked behavior for φ_{2;J/ψ}^||. As an application, we adopt QCD light-cone sum rules to calculate the B_c meson semileptonic decay B_c^+ → J/ψ ℓ^+ ν_ℓ. We obtain Γ(B_c^+ → J/ψ ℓ^+ ν_ℓ) = (89.67 +24.76/-19.06) × 10^-15 GeV and R(J/ψ ℓ^+ ν_ℓ) = 0.217 +0.069/-0.057, which agree with both the extrapolated next-to-leading-order pQCD prediction and the new CDF measurement within errors.
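
    The connection the abstract draws between small even moments and a single-peaked distribution amplitude can be checked numerically for a model shape (the shape and width below are a hypothetical illustration, not the paper's result):

```python
import numpy as np

def da_moments(phi, xi, powers=(2, 4, 6)):
    # Moments <xi^n> of a distribution amplitude on xi in [-1, 1],
    # normalized so that <xi^0> = 1 (odd moments vanish by symmetry).
    norm = phi.sum()
    return [float((xi ** n * phi).sum() / norm) for n in powers]

# A narrow single-peaked model DA (illustrative only):
xi = np.linspace(-1.0, 1.0, 2001)
phi = (1.0 - xi ** 2) * np.exp(-xi ** 2 / (2 * 0.25 ** 2))
m2, m4, m6 = da_moments(phi, xi)
```

    A narrow peak around ξ = 0 yields small, rapidly decreasing even moments, qualitatively in line with the quoted values.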

  17. Background Error Covariance Estimation Using Information from a Single Model Trajectory with Application to Ocean Data Assimilation

    NASA Technical Reports Server (NTRS)

    Keppenne, Christian L.; Rienecker, Michele; Kovach, Robin M.; Vernieres, Guillaume

    2014-01-01

    An attractive property of ensemble data assimilation methods is that they provide flow dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables. Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.
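
    The FAST idea, forming an ensemble from a moving window along a single trajectory and using its covariance to spread information from observed to unobserved variables, can be sketched as follows (all names and the scalar update are illustrative):

```python
import numpy as np

def fast_covariance(trajectory, window):
    """FAST-style background covariance: treat the last `window` states
    of a single model trajectory as an ensemble of anomalies."""
    ens = trajectory[-window:]              # (window, n_state)
    anom = ens - ens.mean(axis=0)
    return anom.T @ anom / (window - 1)

def update_unobserved(P, i_obs, i_unobs, innovation, obs_var):
    # Kalman-like regression of an unobserved variable on one observed
    # variable using the flow-dependent covariance P.
    gain = P[i_unobs, i_obs] / (P[i_obs, i_obs] + obs_var)
    return gain * innovation
```

    Because the covariance comes from a single integration, the cost is a fraction of that of a true multi-member ensemble, which is the point made in the abstract.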

  18. Method and apparatus for reducing solvent luminescence background emissions

    DOEpatents

    Affleck, R.L.; Ambrose, W.P.; Demas, J.N.; Goodwin, P.M.; Johnson, M.E.; Keller, R.A.; Petty, J.T.; Schecker, J.A.; Wu, M.

    1998-10-27

    The detectability of luminescent molecules in solution is enhanced by reducing the background luminescence due to impurity species also present in the solution. A light source that illuminates the solution acts to photolyze the impurities so that the impurities do not luminesce in the fluorescence band of the molecule of interest. Molecules of interest may be carried through the photolysis region in the solution or may be introduced into the solution after the photolysis region. 6 figs.

  19. A geologic approach to field methods in fluvial geomorphology

    USGS Publications Warehouse

    Fitzpatrick, Faith A.; Thornbush, Mary J; Allen, Casey D; Fitzpatrick, Faith A.

    2014-01-01

    A geologic approach to field methods in fluvial geomorphology is useful for understanding causes and consequences of past, present, and possible future perturbations in river behavior and floodplain dynamics. Field methods include characterizing river planform and morphology changes and floodplain sedimentary sequences over long periods of time along a longitudinal river continuum. Techniques include topographic and bathymetric surveying of fluvial landforms in valley bottoms and describing floodplain sedimentary sequences through coring, trenching, and examining pits and exposures. Historical sediment budgets that include floodplain sedimentary records can characterize past and present sources and sinks of sediment along a longitudinal river continuum. Describing paleochannels and floodplain vertical accretion deposits, estimating long-term sedimentation rates, and constructing historical sediment budgets can assist in management of aquatic resources, habitat, sedimentation, and flooding issues.

  20. Mechanical reinforcement for RACC cables in high magnetic background fields

    NASA Astrophysics Data System (ADS)

    Bayer, C. M.; Gade, P. V.; Barth, C.; Preuß, A.; Jung, A.; Weiß, K. P.

    2016-02-01

    Operable in liquid helium, liquid hydrogen or liquid nitrogen, high temperature superconductor (HTS) cables are investigated as future alternatives to low temperature superconductor (LTS) cables in magnet applications. Different high current HTS cable concepts have been developed and optimized in recent years, each with its own benefits and challenges. As the Roebel assembled coated conductor (RACC) is the only fully transposed HTS cable investigated so far, it is attractive for large scale magnet and accelerator magnet applications when field quality and alternating current (AC) losses are of highest importance. However, due to its filamentary character, the RACC is very sensitive to Lorentz forces. In order to increase the mechanical strength of the RACC, each of the HTS strands was covered by an additional copper tape. After investigating the maximum applicable transverse pressure on the strand composition, the cable was clamped into a stainless steel structure to reinforce it against Lorentz forces. A comprehensive test has been carried out in the FBI facility at 4.2 K in a magnetic field of up to 12 T. This publication discusses the maximum applicable pressure as well as the behaviour of the RACC cable as a function of an external magnetic field.

  1. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    NASA Technical Reports Server (NTRS)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (3rd year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second aspect of the research project is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for visualization, analysis and archiving of flow data. We report first on our work on the development of numerical methods for tangent curve computation.

  2. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    PubMed

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relative static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio of AVC (MPEG-4 Advanced Video Coding) high profile on surveillance videos, with only a slight increase in encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
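
    The first step of the method above, classifying each block against a background model, can be sketched with a running-average background and simple thresholds. Both the model and the thresholds are illustrative stand-ins, not the paper's background modeling or decision rule.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    # Running-average background model built from original input frames
    # (an illustrative stand-in for the paper's background modeling).
    return (1.0 - alpha) * bg + alpha * frame

def classify_block(block, bg_block, t_low=2.0, t_high=10.0):
    """Mean absolute difference from the modeled background decides the
    category that selects the prediction mode: background blocks use the
    background reference (BRP); hybrid blocks predict in the background
    difference domain (BDP). Thresholds are illustrative."""
    mad = float(np.mean(np.abs(block - bg_block)))
    if mad < t_low:
        return 'background'   # -> BRP
    if mad > t_high:
        return 'foreground'
    return 'hybrid'           # -> BDP
```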

  3. A New Method for Coronal Magnetic Field Reconstruction

    NASA Astrophysics Data System (ADS)

    Yi, Sibaek; Choe, Gwang-Son; Cho, Kyung-Suk; Kim, Kap-Sung

    2017-08-01

    A precise way of coronal magnetic field reconstruction (extrapolation) is an indispensable tool for understanding various solar activities. A variety of reconstruction codes have been developed so far and are available to researchers, but each has its own shortcomings. In this paper, a new efficient method for coronal magnetic field reconstruction is presented. The method imposes only the normal components of magnetic field and current density at the bottom boundary to avoid the overspecification of the reconstruction problem, and employs vector potentials to guarantee the divergence-freeness. In our method, the normal component of current density is imposed, not by adjusting the tangential components of A, but by adjusting its normal component. This allows us to avoid a possible numerical instability that occasionally arises in codes using A. In real reconstruction problems, the information for the lateral and top boundaries is absent. The arbitrariness of the boundary conditions imposed there, as well as various preprocessing, brings about the diversity of resulting solutions. We impose the source surface condition at the top boundary to accommodate flux imbalance, which always shows up in magnetograms. To enhance the convergence rate, we equip our code with a gradient-method type accelerator. Our code is tested on two analytical force-free solutions. When the solution is given only at the bottom boundary, our result surpasses competitors in most figures of merit devised by Schrijver et al. (2006). We have also applied our code to a real active region NOAA 11974, in which two M-class flares and a halo CME took place. The EUV observation shows a sudden appearance of an erupting loop before the first flare. Our numerical solutions show that two entwining flux tubes exist before the flare and their shackling is released after the CME with one of them opened up.
We suggest that the erupting loop is created by magnetic reconnection between

  4. T1 and susceptibility contrast at high fields

    NASA Astrophysics Data System (ADS)

    Neelavalli, Jaladhar

    partly because of the invariance of most tissue susceptibilities with field strength. This essentially ensures a constant available phase contrast between tissues across field strengths. In fact, with the increased SNR at high fields, the phase CNR actually increases with field strength which is even better. Susceptibility weighted imaging, which uniquely combines this phase and magnitude information to generate enhanced susceptibility contrast magnitude images, has proven to be an important tool in the study of various neurological conditions like, Alzheimer's, Parkinson's, Huntington's disease and multiple sclerosis even at conventional field strength of 1.5T and should have more applicability at high fields. A major issue in using phase images for susceptibility contrast, directly or as processed SWI magnitude images, is the large scale background phase variations that obscure the local susceptibility based contrast. A novel method is proposed for removing such geometrically induced large scale phase variations using a Fourier Transform based field calculation method. It is shown that the new method not only successfully removes the background field effects but also preserves more of the local phase information.
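
    The Fourier-transform-based field calculation referred to above rests on the fact that the z-field perturbation of an arbitrary susceptibility distribution is a convolution with the unit dipole field, which becomes a pointwise multiplication in k-space. A minimal sketch with units and normalization simplified (the function name is illustrative):

```python
import numpy as np

def induced_field(chi, B0=1.0):
    """Field perturbation along z from a 3-D susceptibility map, computed
    with the k-space dipole kernel D(k) = 1/3 - kz^2 / |k|^2."""
    kz, ky, kx = np.meshgrid(np.fft.fftfreq(chi.shape[0]),
                             np.fft.fftfreq(chi.shape[1]),
                             np.fft.fftfreq(chi.shape[2]), indexing='ij')
    k2 = kx ** 2 + ky ** 2 + kz ** 2
    with np.errstate(divide='ignore', invalid='ignore'):
        D = 1.0 / 3.0 - kz ** 2 / k2
    D[0, 0, 0] = 0.0    # k = 0 term: mean field fixed by convention
    return B0 * np.real(np.fft.ifftn(D * np.fft.fftn(chi)))
```

    In a background-removal setting, the field computed for the geometry-induced (background) susceptibility is subtracted from the measured phase-derived field, leaving the local contrast.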

  5. Field methods to measure surface displacement and strain with the Video Image Correlation method

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Horton, Charles M.; Mcneill, Stephen R.; Lansing, Matthew D.

    1994-01-01

    The objective of this project was to develop methods and application procedures to measure displacement and strain fields during the structural testing of aerospace components using paint speckle in conjunction with the Video Image Correlation (VIC) system.

  6. Characterization of seed nuclei in glucagon aggregation using light scattering methods and field-flow fractionation

    PubMed Central

    Hoppe, Cindy C; Nguyen, Lida T; Kirsch, Lee E; Wiencek, John M

    2008-01-01

    Background Glucagon is a peptide hormone with many uses as a therapeutic agent, including the emergency treatment of hypoglycemia. Physical instability of glucagon in solution leads to problems with the manufacture, formulation, and delivery of this pharmaceutical product. Glucagon has been shown to aggregate and form fibrils and gels in vitro. Small oligomeric precursors serve to initiate and nucleate the aggregation process. In this study, these initial aggregates, or seed nuclei, are characterized in bulk solution using light scattering methods and field-flow fractionation. Results High molecular weight aggregates of glucagon were detected in otherwise monomeric solutions using light scattering techniques. These aggregates were detected upon initial mixing of glucagon powder in dilute HCl and NaOH. In the pharmaceutically relevant case of acidic glucagon, the removal of aggregates by filtration significantly slowed the aggregation process. Field-flow fractionation was used to separate aggregates from monomeric glucagon and determine relative mass. The molar mass of the large aggregates was shown to grow appreciably over time as the glucagon solutions gelled. Conclusion The results of this study indicate that initial glucagon solutions are predominantly monomeric, but contain small quantities of large aggregates. These results suggest that the initial aggregates are seed nuclei, or intermediates which catalyze the aggregation process, even at low concentrations. PMID:18613970

  7. Influence of detector noise and background noise on detection-system

    NASA Astrophysics Data System (ADS)

    Song, Yiheng; Wang, Zhiyong

    2018-02-01

    Studying the noise contributed by detectors and by background light, we find that background noise influences detection more than detector noise itself. Therefore, based on a fiber-coupled beam-splitting technique, small-area detectors are used to replace a large-area detector. This achieves a high signal-to-noise ratio (SNR) and reduces speckle interference from the background light. The technique is expected to resolve the bottleneck of combining a large field of view with high sensitivity.

  8. Small massless excitations against a nontrivial background

    NASA Astrophysics Data System (ADS)

    Khariton, N. G.; Svetovoy, V. B.

    1994-03-01

    We propose a systematic approach for finding bosonic zero modes of nontrivial classical solutions in a gauge theory. The method allows us to find all the modes connected with the broken space-time and gauge symmetries. The ground state is supposed to depend on some space coordinates y_α and to be independent of the remaining coordinates x_i. The main problem solved is how to construct the zero modes corresponding to the broken x_i-y_α rotations in vacuum and which boundary conditions specify them. It is found that the rotational modes are typically singular at the origin or at infinity, but their energy remains finite. They behave as massless vector fields in x-space. We analyze local and global symmetries affecting the zero modes. An algorithm for constructing the zero-mode excitations is formulated. The main results are illustrated in the Abelian Higgs model with the string background.

  9. Studies on system and measuring method of far-field beam divergency in near field by Ronchi ruling

    NASA Astrophysics Data System (ADS)

    Zhou, Chenbo; Yang, Li; Ma, Wenli; Yan, Peiying; Fan, Tianquan; He, Shangfeng

    1996-10-01

    Until now, a propagation distance as large as seven Rayleigh ranges or more has been needed to measure the far-field divergence of a Gaussian beam. This is inconvenient for determining the output beam divergence of industrial products such as He-Ne lasers, and the measuring unit occupies a large space; the measurement and its accuracy are strongly influenced by the environment. The application of a Ronchi ruling to the measurement of the far-field divergence of a Gaussian beam in the near field is analyzed in this paper. Theory and experiment show that this measuring method is convenient for industrial application. The measuring system consists of a precision mechanical unit that scans the Gaussian beam with a micro-displaced Ronchi ruling, a signal sampling system, a single-chip microcomputer data processing system, and an electronic unit with microprinter output. The system is stable and its repeatability errors are low. The spot size and far-field divergence of a visible Gaussian laser beam can be measured with the system.
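
    For a TEM00 Gaussian beam the far-field half-angle is θ = λ/(π w0) and the Rayleigh range is z_R = π w0²/λ. A quick calculation shows why seven Rayleigh ranges is impractical for a typical He-Ne beam (the 0.5 mm waist radius is an illustrative assumption):

```python
import math

def divergence_half_angle(wavelength, waist):
    # Far-field half-angle of a TEM00 Gaussian beam: theta = lambda / (pi * w0)
    return wavelength / (math.pi * waist)

def rayleigh_range(wavelength, waist):
    # z_R = pi * w0^2 / lambda sets the distance scale of the "far field"
    return math.pi * waist ** 2 / wavelength

# He-Ne laser at 632.8 nm with an assumed 0.5 mm waist radius:
theta = divergence_half_angle(632.8e-9, 0.5e-3)     # rad
seven_zr = 7 * rayleigh_range(632.8e-9, 0.5e-3)     # m
```

    Here seven Rayleigh ranges come to nearly nine meters of bench length, which is exactly the constraint the near-field Ronchi-ruling method removes.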

  10. Novel symmetries in Weyl-invariant gravity with massive gauge field

    NASA Astrophysics Data System (ADS)

    Abhinav, K.; Shukla, A.; Panigrahi, P. K.

    2016-11-01

    The background field method is used to linearize the Weyl-invariant scalar-tensor gravity, coupled with a Stückelberg field. For a generic background metric, this action is found to be invariant under neither a diffeomorphism nor the generalized Weyl symmetry, the latter being a combination of gauge and Weyl transformations. Interestingly, the quadratic Lagrangian, emerging from a background of Minkowski metric, respects both transformations independently. The Becchi-Rouet-Stora-Tyutin symmetry of scalar-tensor gravity coupled with a Stückelberg-like massive gauge particle, possessing a diffeomorphism and generalized Weyl symmetry, reveals that in both cases negative-norm states with unphysical degrees of freedom do exist. We then show that, by combining diffeomorphism and generalized Weyl symmetries, all the ghost states decouple, thereby removing the unphysical redundancies of the theory. During this process, the scalar field does not represent any dynamic mode, yet modifies the usual harmonic gauge condition through non-minimal coupling with gravity.

  11. FIELD ANALYTICAL SCREENING PROGRAM: PCB METHOD - INNOVATIVE TECHNOLOGY REPORT

    EPA Science Inventory

    This innovative technology evaluation report (ITER) presents information on the demonstration of the U.S. Environmental Protection Agency (EPA) Region 7 Superfund Field Analytical Screening Program (FASP) method for determining polychlorinated biphenyl (PCB) contamination in soil...

  12. Comparison of global storm activity rate calculated from Schumann resonance background components to electric field intensity E0Z

    NASA Astrophysics Data System (ADS)

    Nieckarz, Zenon; Kułak, Andrzej; Zięba, Stanisław; Kubicki, Marek; Michnowski, Stanisław; Barański, Piotr

    2009-02-01

    This work presents the results of a comparison between the global storm activity rate IRS and electric field intensity E0Z. The permanent analysis of the IRS may become an important tool for testing Global Electric Circuit models. IRS is determined by a new method that uses the background component of the first 7 Schumann resonances (SR). The rate calculations are based on ELF observations carried out in 2005 and 2006 in the observatory station "Hylaty" of the Jagiellonian University in the Eastern Carpathians (Kułak, A., Zięba, S., Micek, S., Nieckarz, Z., 2003. Solar variations in extremely low frequency propagation parameters: I. A two-dimensional telegraph equation (TDTE) model of ELF propagation and fundamental parameters of Schumann resonances, J. Geophys. Res., 108, 1270, doi:10.1029/2002JA009304). Diurnal runs of the IRS rate were compared with diurnal runs of E0Z amplitudes registered at the Earth's surface in the Geophysical Observatory of the Polish Academy of Sciences in Świder (Kubicki, M., 2005. Results of Atmospheric Electricity and Meteorological Observations, S. Kalinowski Geophysical Observatory at Świder 2004, Pub. Inst. Geophysics Polish Academy of Sciences, D-68 (383), Warszawa.). The days with the highest values of the correlation coefficient (R) between amplitudes of both observed parameters characterizing atmosphere electric activity are shown. The seasonal changes of R, IRS and E0Z are also presented.

  13. Hamiltonian lattice field theory: Computer calculations using variational methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zako, Robert L.

    1991-12-03

    I develop a variational method for systematic numerical computation of physical quantities -- bound state energies and scattering amplitudes -- in quantum field theory. An infinite-volume, continuum theory is approximated by a theory on a finite spatial lattice, which is amenable to numerical computation. I present an algorithm for computing approximate energy eigenvalues and eigenstates in the lattice theory and for bounding the resulting errors. I also show how to select basis states and choose variational parameters in order to minimize errors. The algorithm is based on the Rayleigh-Ritz principle and Kato's generalizations of Temple's formula. The algorithm could be adapted to systems such as atoms and molecules. I show how to compute Green's functions from energy eigenvalues and eigenstates in the lattice theory, and relate these to physical (renormalized) coupling constants, bound state energies and Green's functions. Thus one can compute approximate physical quantities in a lattice theory that approximates a quantum field theory with specified physical coupling constants. I discuss the errors in both approximations. In principle, the errors can be made arbitrarily small by increasing the size of the lattice, decreasing the lattice spacing and computing sufficiently long. Unfortunately, I do not understand the infinite-volume and continuum limits well enough to quantify errors due to the lattice approximation. Thus the method is currently incomplete. I apply the method to real scalar field theories using a Fock basis of free particle states. All needed quantities can be calculated efficiently with this basis. The generalization to more complicated theories is straightforward. I describe a computer implementation of the method and present numerical results for simple quantum mechanical systems.
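
    The Rayleigh-Ritz principle underlying the algorithm guarantees that diagonalizing the Hamiltonian in any truncated basis gives an upper bound on the true ground-state energy. A minimal quantum-mechanical illustration, an anharmonic oscillator in a truncated harmonic-oscillator basis rather than a lattice field theory (names and parameters are illustrative):

```python
import numpy as np

def ground_energy_upper_bound(lam, n_basis=40):
    """Rayleigh-Ritz bound for H = p^2/2 + x^2/2 + lam*x^4 (hbar = m = 1),
    diagonalized in the lowest n_basis harmonic-oscillator states."""
    m = n_basis + 4          # pad so <i|x^4|j> is exact after truncation
    nn = np.arange(m)
    x = np.zeros((m, m))
    off = np.sqrt((nn[:-1] + 1) / 2.0)   # <n|x|n+1> = sqrt((n+1)/2)
    x[nn[:-1], nn[:-1] + 1] = off
    x[nn[:-1] + 1, nn[:-1]] = off
    x4 = np.linalg.matrix_power(x, 4)[:n_basis, :n_basis]
    H = np.diag(np.arange(n_basis) + 0.5) + lam * x4
    return float(np.linalg.eigvalsh(H)[0])
```

    Enlarging the basis can only improve the variational estimate, which mirrors the error control via the Temple/Kato bounds described in the abstract.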

  14. Background Model for the Majorana Demonstrator

    NASA Astrophysics Data System (ADS)

    Cuesta, C.; Abgrall, N.; Aguayo, E.; Avignone, F. T.; Barabash, A. S.; Bertrand, F. E.; Boswell, M.; Brudanin, V.; Busch, M.; Byram, D.; Caldwell, A. S.; Chan, Y.-D.; Christofferson, C. D.; Combs, D. C.; Detwiler, J. A.; Doe, P. J.; Efremenko, Yu.; Egorov, V.; Ejiri, H.; Elliott, S. R.; Fast, J. E.; Finnerty, P.; Fraenkle, F. M.; Galindo-Uribarri, A.; Giovanetti, G. K.; Goett, J.; Green, M. P.; Gruszko, J.; Guiseppe, V. E.; Gusev, K.; Hallin, A. L.; Hazama, R.; Hegai, A.; Henning, R.; Hoppe, E. W.; Howard, S.; Howe, M. A.; Keeter, K. J.; Kidd, M. F.; Kochetov, O.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; Leviner, L. E.; Loach, J. C.; MacMullin, J.; MacMullin, S.; Martin, R. D.; Meijer, S.; Mertens, S.; Nomachi, M.; Orrell, J. L.; O'Shaughnessy, C.; Overman, N. R.; Phillips, D. G.; Poon, A. W. P.; Pushkin, K.; Radford, D. C.; Rager, J.; Rielage, K.; Robertson, R. G. H.; Romero-Romero, E.; Ronquest, M. C.; Schubert, A. G.; Shanks, B.; Shima, T.; Shirchenko, M.; Snavely, K. J.; Snyder, N.; Suriano, A. M.; Thompson, J.; Timkin, V.; Tornow, W.; Trimble, J. E.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Young, A. R.; Yu, C.-H.; Yumatov, V.

    The Majorana Collaboration is constructing a system containing 40 kg of HPGe detectors to demonstrate the feasibility and potential of a future tonne-scale experiment capable of probing the neutrino mass scale in the inverted-hierarchy region. To realize this, a major goal of the Majorana Demonstrator is to demonstrate a path forward to achieving a background rate at or below 1 cnt/(ROI-t-y) in the 4 keV region of interest around the Q-value at 2039 keV. This goal is pursued through a combination of a significant reduction of radioactive impurities in construction materials with analytical methods for background rejection, for example using powerful pulse shape analysis techniques that exploit p-type point-contact HPGe detector technology. The effectiveness of these methods is assessed using simulations of the different background components, whose purity levels are constrained by radioassay measurements.

  15. Background model for the Majorana Demonstrator

    DOE PAGES

    Cuesta, C.; Abgrall, N.; Aguayo, E.; ...

    2015-01-01

    The Majorana Collaboration is constructing a system containing 40 kg of HPGe detectors to demonstrate the feasibility and potential of a future tonne-scale experiment capable of probing the neutrino mass scale in the inverted-hierarchy region. To realize this, a major goal of the Majorana Demonstrator is to demonstrate a path forward to achieving a background rate at or below 1 cnt/(ROI-t-y) in the 4 keV region of interest around the Q-value at 2039 keV. This goal is pursued through a combination of a significant reduction of radioactive impurities in construction materials with analytical methods for background rejection, for example using powerful pulse shape analysis techniques that exploit p-type point-contact HPGe detector technology. The effectiveness of these methods is assessed using simulations of the different background components, whose purity levels are constrained by radioassay measurements.

  16. The Radio Background below 100 MHz

    NASA Astrophysics Data System (ADS)

    Dowell, Jayce; Taylor, Greg B.

    2018-05-01

    The recent detection of the “cosmic dawn” redshifted 21 cm signal at 78 MHz by the Experiment to Detect the Global EoR Signatures (EDGES) differs significantly from theoretical predictions. In particular, the absorption trough is roughly a factor of two stronger than the most optimistic theoretical models. The early interpretations of the origin of this discrepancy fall into two categories. The first is that there is increased cooling of the gas due to interactions with dark matter, while the second is that the background radiation field includes a contribution from a component in addition to the cosmic microwave background (CMB). In this Letter we examine the feasibility of the second idea using new data from the first station of the Long Wavelength Array. The data span 40–80 MHz and provide important constraints on the present-day background in a frequency range where there are few surveys with absolute temperature calibration suitable for measuring the strength of the radio monopole. We find support for a strong, diffuse radio background that was suggested by the ARCADE 2 results in the 3–10 GHz range. We find that this background is well modeled by a power law with a spectral index of ‑2.58 ± 0.05 and a temperature at the rest-frame 21 cm frequency of 603 (+102/−92) mK.
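
    A power-law fit of this kind is typically performed as linear least squares in log-log space, where T = T0 (ν/ν0)^β becomes a straight line with slope β. The sketch below is a minimal illustration; the pivot frequency, normalization, and synthetic temperatures are invented for the example and are not the LWA measurements.

```python
import math

def fit_power_law(freqs_mhz, temps_k):
    """Least-squares fit of T = T0 * (nu/nu0)^beta in log-log space.
    Returns (T0 at the pivot nu0 = 80 MHz, spectral index beta)."""
    nu0 = 80.0
    xs = [math.log(f / nu0) for f in freqs_mhz]
    ys = [math.log(t) for t in temps_k]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    beta = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope = spectral index
    t0 = math.exp((sy - beta * sx) / n)               # intercept = log(T0)
    return t0, beta

# Synthetic sky temperatures generated from T = 1000 K * (nu / 80 MHz)^-2.58
freqs = [40.0, 50.0, 60.0, 70.0, 80.0]
temps = [1000.0 * (f / 80.0) ** -2.58 for f in freqs]
t0, beta = fit_power_law(freqs, temps)
```

    Since the synthetic points lie exactly on a power law, the fit recovers the assumed index and normalization; real survey data would also carry per-point weights from the calibration uncertainties.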

  17. Visual field examination method using virtual reality glasses compared with the Humphrey perimeter.

    PubMed

    Tsapakis, Stylianos; Papaconstantinou, Dimitrios; Diagourtas, Andreas; Droutsas, Konstantinos; Andreanos, Konstantinos; Moschos, Marilita M; Brouzas, Dimitrios

    2017-01-01

    To present a visual field examination method using virtual reality glasses and evaluate the reliability of the method by comparing the results with those of the Humphrey perimeter. Virtual reality glasses, a smartphone with a 6 inch display, and software that implements a fast-threshold 3 dB step staircase algorithm for the central 24° of visual field (52 points) were used to test 20 eyes of 10 patients, who were tested in a random and consecutive order as they appeared in our glaucoma department. The results were compared with those obtained from the same patients using the Humphrey perimeter. A high correlation coefficient (r = 0.808, P < 0.0001) was found between the virtual reality visual field test and the Humphrey perimeter visual field. Visual field examination results using virtual reality glasses correlate highly with those of the Humphrey perimeter, suggesting that the method is suitable for clinical use.
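
    The staircase idea can be sketched as follows. This is a deliberately simplified descending staircase with fixed 3 dB steps and a single reversal, driven by a simulated observer; the authors' "fast-threshold" algorithm is not specified in the abstract, so every detail below is an assumption for illustration.

```python
def staircase_threshold(sees, start_db=30.0, step_db=3.0):
    """Simple descending staircase: lower the stimulus in step_db steps
    while it is seen; after the first miss, report the last seen level.
    `sees` is a callback: sees(level_db) -> bool."""
    level = start_db
    while sees(level):
        level -= step_db
    return level + step_db  # last level that was still seen

# Simulated observer with a true threshold of 17 dB: any stimulus
# brighter than the threshold is detected (a noiseless toy psychometric
# function; real staircases must handle response variability).
true_threshold = 17.0
estimate = staircase_threshold(lambda db: db > true_threshold)
```

    With 3 dB steps the estimate can only be accurate to within one step, which is why clinical algorithms typically add reversals with shrinking step sizes.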

  18. Measurements of SWIR backgrounds using the swux unit of measure

    NASA Astrophysics Data System (ADS)

    Richards, A.; Hübner, M.; Vollmer, M.

    2018-04-01

    The SWIR waveband between 0.8 μm and 1.8 μm is increasingly exploited by imaging systems in a variety of different applications, including persistent imaging for security and surveillance of high-value assets, handheld tactical imagers, range-gated imaging systems and imaging LADAR for driverless vehicles. The vast majority of these applications use lattice-matched InGaAs detectors in their imaging sensors, and these sensors are rapidly falling in price, leading to their widening adoption. As these sensors are used in novel applications and locations, it is important that ambient SWIR backgrounds be understood and characterized for a variety of different field conditions, primarily for the purposes of system performance modeling of SNR and range metrics. SWIR irradiance backgrounds do not consistently track visible-light illumination. There is currently little of this type of information in the open literature, particularly measurements of SWIR backgrounds in urban areas, natural areas, or indoors. This paper presents field measurements made with an InGaAs detector calibrated in the swux unit of InGaAs-band-specific irradiance proposed by two of the authors in 2017. Simultaneous measurements of illuminance levels (in lux) at these sites are presented, as well as visible and InGaAs camera images of the scenery at some of these measurement sites. The swux and lux measurement hardware is described, along with the methods used to calibrate it. Finally, the swux levels during the partial and total phases of the total solar eclipse of 2017 are presented, along with curves fitted to the data from a theoretical model based on obscuration of the sun by the moon. The apparent differences between photometric and swux measurements are also discussed.

  19. Evaluation of Three Field-Based Methods for Quantifying Soil Carbon

    PubMed Central

    Izaurralde, Roberto C.; Rice, Charles W.; Wielopolski, Lucian; Ebinger, Michael H.; Reeves, James B.; Thomson, Allison M.; Francis, Barry; Mitra, Sudeep; Rappaport, Aaron G.; Etchevers, Jorge D.; Sayre, Kenneth D.; Govaerts, Bram; McCarty, Gregory W.

    2013-01-01

    Three advanced technologies to measure soil carbon (C) density (g C m−2) are deployed in the field and the results compared against those obtained by the dry combustion (DC) method. The advanced methods are: a) Laser Induced Breakdown Spectroscopy (LIBS), b) Diffuse Reflectance Fourier Transform Infrared Spectroscopy (DRIFTS), and c) Inelastic Neutron Scattering (INS). The measurements and soil samples were acquired at Beltsville, MD, USA and at Centro International para el Mejoramiento del Maíz y el Trigo (CIMMYT) at El Batán, Mexico. At Beltsville, soil samples were extracted at three depth intervals (0–5, 5–15, and 15–30 cm) and processed for analysis in the field with the LIBS and DRIFTS instruments. The INS instrument determined soil C density to a depth of 30 cm via scanning and stationary measurements. Subsequently, soil core samples were analyzed in the laboratory for soil bulk density (kg m−3), C concentration (g kg−1) by DC, and results reported as soil C density (kg m−2). Results from each technique were derived independently and contributed to a blind test against results from the reference (DC) method. A similar procedure was employed at CIMMYT in Mexico, but only with the LIBS and DRIFTS instruments. Following conversion to common units, we found that the LIBS, DRIFTS, and INS results can be compared directly with those obtained by the DC method. The first two methods and the standard DC require soil sampling and need soil bulk density information to convert soil C concentrations to soil C densities while the INS method does not require soil sampling. We conclude that, in comparison with the DC method, the three instruments (a) showed acceptable performances although further work is needed to improve calibration techniques and (b) demonstrated their portability and their capacity to perform under field conditions. PMID:23383225
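
    The unit conversion the abstract refers to, from C concentration (g kg−1, as reported by DC) plus bulk density (kg m−3) and layer thickness to an areal C density, is simple arithmetic per depth interval. The sketch below uses invented layer values for the three sampling intervals; it illustrates the conversion only, not the paper's data.

```python
def soil_c_density(layers):
    """Areal soil C density (kg C m^-2) from per-layer measurements.
    Each layer: (thickness_m, bulk_density_kg_m3, c_concentration_g_per_kg)."""
    total = 0.0
    for thickness_m, bulk_density, c_g_per_kg in layers:
        # soil mass per m^2 in the layer, times its carbon mass fraction
        total += thickness_m * bulk_density * (c_g_per_kg / 1000.0)
    return total

# Illustrative (invented) profile for the 0-5, 5-15 and 15-30 cm intervals.
profile = [
    (0.05, 1300.0, 20.0),  # 0-5 cm
    (0.10, 1400.0, 15.0),  # 5-15 cm
    (0.15, 1500.0, 10.0),  # 15-30 cm
]
density = soil_c_density(profile)
```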

  20. Hyperspectral Imaging and Related Field Methods: Building the Science

    NASA Technical Reports Server (NTRS)

    Goetz, Alexander F. H.; Steffen, Konrad; Wessman, Carol

    1999-01-01

    The proposal requested funds for the computing power to bring hyperspectral image processing into undergraduate and graduate remote sensing courses. This upgrade made it possible to handle more students in these oversubscribed courses and to enhance CSES' summer short course entitled "Hyperspectral Imaging and Data Analysis" provided for government, industry, university and military. Funds were also requested to build field measurement capabilities through the purchase of spectroradiometers, canopy radiation sensors and a differential GPS system. These instruments provided systematic and complete sets of field data for the analysis of hyperspectral data with the appropriate radiometric and wavelength calibration as well as atmospheric data needed for application of radiative transfer models. The proposed field equipment made it possible to team-teach a new field methods course, unique in the country, that took advantage of the expertise of the investigators rostered in three different departments, Geology, Geography and Biology.

  1. Bi-color near infrared thermoreflectometry: a method for true temperature field measurement.

    PubMed

    Sentenac, Thierry; Gilblas, Rémi; Hernandez, Daniel; Le Maoult, Yannick

    2012-12-01

    In a context of radiative temperature field measurement, this paper deals with an innovative method, called bicolor near infrared thermoreflectometry, for the measurement of true temperature fields without prior knowledge of the emissivity field of an opaque material. This method is achieved by a simultaneous measurement, in the near infrared spectral band, of the radiance temperature fields and of the emissivity fields, the latter measured indirectly by reflectometry. The theoretical framework of the method is introduced and the principle of the measurements at two wavelengths is detailed. The crucial features of the indirect measurement of emissivity are the measurement of bidirectional reflectivities in a single direction and the introduction of an unknown variable, called the "diffusion factor." Radiance temperatures and bidirectional reflectivities are then merged into a bichromatic system based on Kirchhoff's laws. The assumption of the system, based on the invariance of the diffusion factor for two near wavelengths, and the value of the chosen wavelengths are then discussed in relation to a database of several material properties. A thermoreflectometer prototype was developed, dimensioned, and evaluated. Experiments were carried out to assess its trueness in challenging cases. First, experiments were performed on a metallic sample with a high emissivity value, for which the bidirectional reflectivity had to be measured from low signals. The results on erbium oxide demonstrate the power of the method for materials with high emissivity variations in the near infrared spectral band.
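
    The single-wavelength building block such a system rests on is the Kirchhoff/Wien correction from radiance temperature to true temperature for an opaque surface (emissivity = 1 − reflectance). The sketch below shows only that step, with invented values; the paper's bichromatic system additionally solves for the diffusion factor across two wavelengths, which is not reproduced here.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def true_temperature(t_radiance_k, emissivity, wavelength_m):
    """Invert Wien's approximation: 1/T = 1/T_r + (lambda/c2) * ln(emissivity).
    With emissivity < 1 the true temperature exceeds the radiance temperature."""
    inv_t = 1.0 / t_radiance_k + (wavelength_m / C2) * math.log(emissivity)
    return 1.0 / inv_t

# Opaque surface: Kirchhoff's law gives emissivity = 1 - total reflectance
# (illustrative numbers, not from the paper).
reflectance = 0.3
t = true_temperature(t_radiance_k=1000.0, emissivity=1.0 - reflectance,
                     wavelength_m=1.3e-6)  # a near-infrared wavelength
```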

  2. Incentive Pay for Remotely Piloted Aircraft Career Fields

    DTIC Science & Technology

    2012-01-01

    Summary fragments: the report examines whether the manning requirement can be met even with the current incentive pays and reenlistment bonuses, and documents the mathematical foundations, data, and estimation methods for the analysis; appendices C.1 and C.2 list mathematical symbols for non-stochastic values and shock terms, and for taste and compensation.

  3. Automatic background updating for video-based vehicle detection

    NASA Astrophysics Data System (ADS)

    Hu, Chunhai; Li, Dongmei; Liu, Jichuan

    2008-03-01

    Video-based vehicle detection is one of the most valuable techniques for the Intelligent Transportation System (ITS). The widely used video-based vehicle detection technique is the background subtraction method. The key problem of this method is how to subtract and update the background effectively. In this paper an efficient background updating scheme based on Zone-Distribution for vehicle detection is proposed to resolve the problems caused by sudden camera perturbation, sudden or gradual illumination change and the sleeping person problem. The proposed scheme is robust and fast enough to satisfy the real-time constraints of vehicle detection.
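
    The Zone-Distribution scheme itself is not spelled out in the abstract. As a hedged baseline, the sketch below shows the classic exponential running-average background update with selective (foreground-masked) updating that such schemes refine; the frame is a toy one-dimensional row of gray levels, and all parameter values are assumptions.

```python
def update_background(background, frame, alpha=0.05, threshold=30.0):
    """One step of exponential running-average background subtraction.
    Pixels differing from the background by more than `threshold` are
    flagged as foreground and excluded from the update (so a stopped
    vehicle is not absorbed into the background immediately)."""
    foreground = []
    new_bg = []
    for bg_px, px in zip(background, frame):
        is_fg = abs(px - bg_px) > threshold
        foreground.append(is_fg)
        # only background pixels drift toward the new frame
        new_bg.append(bg_px if is_fg else (1 - alpha) * bg_px + alpha * px)
    return new_bg, foreground

# Flat background at gray level 100; a "vehicle" brightens pixels 2-3.
bg = [100.0] * 6
frame = [102.0, 98.0, 200.0, 210.0, 101.0, 99.0]
bg, mask = update_background(bg, frame)
```

    Sudden illumination change and the "sleeping person" problem mentioned in the abstract are exactly the failure modes of this naive update, which is what motivates zone-based and distribution-based refinements.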

  4. Stacked Multilayer Self-Organizing Map for Background Modeling.

    PubMed

    Zhao, Zhenjie; Zhang, Xuebo; Fang, Yongchun

    2015-09-01

    In this paper, a new background modeling method called the stacked multilayer self-organizing map background model (SMSOM-BM) is proposed, which offers several merits, such as strong representational ability for complex scenarios and ease of use. In order to enhance the representational ability of the background model and make the parameters learned automatically, the recently developed idea of representation learning (or deep learning) is employed to extend the existing single-layer self-organizing map background model to a multilayer one (namely, the proposed SMSOM-BM). As a consequence, the SMSOM-BM gains strong representational ability to learn background models of challenging scenarios, together with automatic determination of most network parameters. More specifically, every pixel is modeled by a SMSOM, and spatial consistency is considered at each layer. By introducing a novel over-layer filtering process, we can train the background model layer by layer in an efficient manner. Furthermore, for real-time performance, we have implemented the proposed method on the NVIDIA CUDA platform. Comparative experimental results show the superior performance of the proposed approach.
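
    The single-layer, per-pixel SOM background model that the paper extends can be sketched as a small set of codewords per pixel: a sample matching some codeword is background and pulls that codeword toward itself, otherwise it is foreground. The codeword count, matching radius, and learning rate below are invented for illustration, and the stacked multilayer structure and over-layer filtering of the paper are not reproduced.

```python
def som_pixel_update(codewords, sample, eps=20.0, lr=0.2):
    """Single-layer self-organizing background model for one gray pixel.
    `codewords` is the pixel's list of learned background intensities.
    Returns True if the sample is classified as foreground."""
    best = min(range(len(codewords)), key=lambda i: abs(codewords[i] - sample))
    if abs(codewords[best] - sample) <= eps:
        # background: move the best-matching codeword toward the sample
        codewords[best] += lr * (sample - codewords[best])
        return False
    return True  # foreground: no codeword explains the sample

# A pixel that alternates between two background modes (e.g. flickering).
cw = [100.0, 160.0]
fg1 = som_pixel_update(cw, 105.0)   # near the first mode -> background
fg2 = som_pixel_update(cw, 230.0)   # far from both modes -> foreground
```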

  5. An improved method for the calculation of Near-Field Acoustic Radiation Modes

    NASA Astrophysics Data System (ADS)

    Liu, Zu-Bin; Maury, Cédric

    2016-02-01

    Sensing and controlling Acoustic Radiation Modes (ARMs) in the near-field of vibrating structures is of great interest for broadband noise reduction or enhancement, as ARMs are velocity distributions defined over a vibrating surface that independently and optimally contribute to the acoustic power in the acoustic field. But present methods only provide far-field ARMs (FFARMs), which are inadequate for the acoustic near-field problem. The Near-Field Acoustic Radiation Modes (NFARMs) are first studied with an improved numerical method, the Pressure-Velocity method, which relies on the eigendecomposition of the acoustic transfers between the vibrating source and a conformal observation surface, including sound pressure and velocity transfer matrices. The active and reactive parts of the sound power are separated and lead to active and reactive ARMs. NFARMs are studied for a 2D baffled beam and for a 3D baffled plate, as are the differences between the NFARMs and the classical FFARMs. The NFARMs are analyzed as the frequency and the observation distance to the source vary. It is found that the efficiencies and shapes of the optimal active ARMs are independent of the distance, while those of the reactive ones depend distinctly on it.

  6. Neuronal current detection with low-field magnetic resonance: simulations and methods.

    PubMed

    Cassará, Antonino Mario; Maraviglia, Bruno; Hartwig, Stefan; Trahms, Lutz; Burghoff, Martin

    2009-10-01

    The noninvasive detection of neuronal currents in active brain networks [or direct neuronal imaging (DNI)] by means of nuclear magnetic resonance (NMR) remains a scientific challenge. Many attempts using NMR scanners with magnetic fields >1 T (high-field NMR) have been made in past years to detect phase shifts or magnitude changes in the NMR signals. However, the many physiological limitations (i.e., the concurrent BOLD effect, the weakness of the neuronal-induced magnetic field, etc.) and technical limitations (e.g., the spatial resolution) in observing the weak signals have led to some contradictory results. In contrast, only a few attempts have been made using low-field NMR techniques. This paper therefore reviews two recent developments on this front. The detection schemes discussed in this manuscript, the resonant mechanism (RM) and the DC method, are specific to NMR instrumentation with main fields below the Earth's magnetic field (50 microT), and some even below a few microteslas (ULF-NMR). However, the experimental validation of both techniques, with differing sensitivity to the various neuronal activities at specific temporal and spatial resolutions, is still in progress and requires carefully designed magnetic field sensor technology. Additional care should be taken to ensure stringent magnetic shielding from ambient magnetic field fluctuations. In this review, we discuss the characteristics and prospects of these two methods for detecting neuronal currents, along with the technical requirements on the instrumentation.

  7. Accurate, efficient, and (iso)geometrically flexible collocation methods for phase-field models

    NASA Astrophysics Data System (ADS)

    Gomez, Hector; Reali, Alessandro; Sangalli, Giancarlo

    2014-04-01

    We propose new collocation methods for phase-field models. Our algorithms are based on isogeometric analysis, a new technology that makes use of functions from computational geometry, such as, for example, Non-Uniform Rational B-Splines (NURBS). NURBS exhibit excellent approximability and controllable global smoothness, and can represent exactly most geometries encapsulated in Computer Aided Design (CAD) models. These attributes permitted us to derive accurate, efficient, and geometrically flexible collocation methods for phase-field models. The performance of our method is demonstrated by several numerical examples of phase separation modeled by the Cahn-Hilliard equation. We feel that our method successfully combines the geometrical flexibility of finite elements with the accuracy and simplicity of pseudo-spectral collocation methods, and is a viable alternative to classical collocation methods.
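
    The Cahn-Hilliard equation being solved can be illustrated with a deliberately crude stand-in discretization: one explicit finite-difference Euler step in 1D with periodic boundaries. This is not the isogeometric collocation of the paper (which is the whole point of their contribution); the grid size, time step, and interface parameter below are invented, and the scheme is only stable for small time steps.

```python
import math

def laplacian(u, dx):
    """Second-difference Laplacian with periodic boundary conditions."""
    n = len(u)
    return [(u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n]) / dx ** 2
            for i in range(n)]

def cahn_hilliard_step(c, dt=1e-5, dx=0.1, kappa=0.01):
    """One explicit Euler step of the 1D Cahn-Hilliard equation
    dc/dt = Laplacian(c^3 - c - kappa * Laplacian(c))."""
    lap_c = laplacian(c, dx)
    mu = [ci ** 3 - ci - kappa * li for ci, li in zip(c, lap_c)]  # chemical potential
    lap_mu = laplacian(mu, dx)
    return [ci + dt * li for ci, li in zip(c, lap_mu)]

# Small perturbation around the unstable mixed state c = 0: spinodal
# decomposition should amplify it while conserving total concentration.
c = [0.1 * math.sin(2 * math.pi * i / 32) for i in range(32)]
for _ in range(200):
    c = cahn_hilliard_step(c)
```

    Two properties worth checking on any Cahn-Hilliard discretization are visible even in this toy: the total concentration is conserved (the right-hand side is a Laplacian of something), and perturbations inside the spinodal region grow rather than decay.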

  8. Measurement of volatile plant compounds in field ambient air by thermal desorption-gas chromatography-mass spectrometry.

    PubMed

    Cai, Xiao-Ming; Xu, Xiu-Xiu; Bian, Lei; Luo, Zong-Xiu; Chen, Zong-Mao

    2015-12-01

    Determination of volatile plant compounds in field ambient air is important to understand chemical communication between plants and insects and will aid the development of semiochemicals from plants for pest control. In this study, a thermal desorption-gas chromatography-mass spectrometry (TD-GC-MS) method was developed to measure ultra-trace levels of volatile plant compounds in field ambient air. The desorption parameters of TD, including sorbent tube material, tube desorption temperature, desorption time, and cold trap temperature, were selected and optimized. In GC-MS analysis, the selected ion monitoring mode was used for enhanced sensitivity and selectivity. This method was sufficiently sensitive to detect part-per-trillion levels of volatile plant compounds in field ambient air. Laboratory and field evaluation revealed that the method presented high precision and accuracy. Field studies indicated that the background odor of tea plantations contained some common volatile plant compounds, such as (Z)-3-hexenol, methyl salicylate, and (E)-ocimene, at concentrations ranging from 1 to 3400 ng m(-3). In addition, the background odor in summer was richer in both composition and concentration than in autumn. Relative to previous methods, the TD-GC-MS method is more sensitive, permitting accurate qualitative and quantitative measurements of volatile plant compounds in field ambient air.

  9. Systems, computer-implemented methods, and tangible computer-readable storage media for wide-field interferometry

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G. (Inventor); Leisawitz, David T. (Inventor); Rinehart, Stephen A. (Inventor); Memarsadeghi, Nargess (Inventor)

    2012-01-01

    Disclosed herein are systems, computer-implemented methods, and tangible computer-readable storage media for wide field imaging interferometry. The method includes for each point in a two dimensional detector array over a field of view of an image: gathering a first interferogram from a first detector and a second interferogram from a second detector, modulating a path-length for a signal from an image associated with the first interferogram in the first detector, overlaying first data from the modulated first detector and second data from the second detector, and tracking the modulating at every point in a two dimensional detector array comprising the first detector and the second detector over a field of view for the image. The method then generates a wide-field data cube based on the overlaid first data and second data for each point. The method can generate an image from the wide-field data cube.

  10. Andromeda (M31) optical and infrared disk survey. I. Insights in wide-field near-IR surface photometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sick, Jonathan; Courteau, Stéphane; Cuillandre, Jean-Charles

    We present wide-field near-infrared J and Ks images of the Andromeda Galaxy (M31) taken with WIRCam at the Canada-France-Hawaii Telescope as part of the Andromeda Optical and Infrared Disk Survey. This data set allows simultaneous observations of resolved stars and near-infrared (NIR) surface brightness across M31's entire bulge and disk (within R = 22 kpc), permitting a direct test of the stellar composition of near-infrared light in a nearby galaxy. Here we develop NIR observation and reduction methods to recover a uniform surface brightness map across the 3° × 1° disk of M31 with 27 WIRCam fields. Two sky-target nodding strategies are tested, and we find that strictly minimizing sky sampling latency cannot improve background subtraction accuracy to better than 2% of the background level due to spatio-temporal variations in the NIR skyglow. We fully describe our WIRCam reduction pipeline and advocate using flats built from night-sky images over a single night, rather than dome flats that do not capture the WIRCam illumination field. Contamination from scattered light and thermal background in sky flats has a negligible effect on the surface brightness shape compared to the stochastic differences in background shape between sky and galaxy disk fields, which are ∼0.3% of the background level. The most dramatic calibration step is the introduction of scalar sky offsets to each image that optimize surface brightness continuity. Sky offsets reduce the mean surface brightness difference between observation blocks from 1% to <0.1% of the background level, though the absolute background level remains statistically uncertain to 0.15% of the background level. We present our WIRCam reduction pipeline and performance analysis to give specific recommendations for the improvement of NIR wide-field imaging methods.

  11. Systems and Methods for Implementing Robust Carbon Nanotube-Based Field Emitters

    NASA Technical Reports Server (NTRS)

    Kristof, Valerie (Inventor); Manohara, Harish (Inventor); Toda, Risaku (Inventor)

    2015-01-01

    Systems and methods in accordance with embodiments of the invention implement carbon nanotube-based field emitters. In one embodiment, a method of fabricating a carbon nanotube field emitter includes: patterning a substrate with a catalyst, where the substrate has thereon disposed a diffusion barrier layer; growing a plurality of carbon nanotubes on at least a portion of the patterned catalyst; and heating the substrate to an extent where it begins to soften such that at least a portion of at least one carbon nanotube becomes enveloped by the softened substrate.

  12. Feasibility, acceptability and clinical utility of the Cultural Formulation Interview: mixed-methods results from the DSM-5 international field trial.

    PubMed

    Lewis-Fernández, Roberto; Aggarwal, Neil Krishan; Lam, Peter C; Galfalvy, Hanga; Weiss, Mitchell G; Kirmayer, Laurence J; Paralikar, Vasudeo; Deshpande, Smita N; Díaz, Esperanza; Nicasio, Andel V; Boiler, Marit; Alarcón, Renato D; Rohlof, Hans; Groen, Simon; van Dijk, Rob C J; Jadhav, Sushrut; Sarmukaddam, Sanjeev; Ndetei, David; Scalco, Monica Z; Bassiri, Kavoos; Aguilar-Gaxiola, Sergio; Ton, Hendry; Westermeyer, Joseph; Vega-Dienstmaier, Johann M

    2017-04-01

    Background There is a need for clinical tools to identify cultural issues in diagnostic assessment. Aims To assess the feasibility, acceptability and clinical utility of the DSM-5 Cultural Formulation Interview (CFI) in routine clinical practice. Method Mixed-methods evaluation of field trial data from six countries. The CFI was administered to diagnostically diverse psychiatric out-patients during a diagnostic interview. In post-evaluation sessions, patients and clinicians completed debriefing qualitative interviews and Likert-scale questionnaires. The duration of CFI administration and the full diagnostic session were monitored. Results Mixed-methods data from 318 patients and 75 clinicians found the CFI feasible, acceptable and useful. Clinician feasibility ratings were significantly lower than patient ratings and other clinician-assessed outcomes. After administering one CFI, however, clinician feasibility ratings improved significantly and subsequent interviews required less time. Conclusions The CFI was included in DSM-5 as a feasible, acceptable and useful cultural assessment tool. © The Royal College of Psychiatrists 2017.

  13. Electric line source illumination of a chiral cylinder placed in another chiral background medium

    NASA Astrophysics Data System (ADS)

    Aslam, M.; Saleem, A.; Awan, Z. A.

    2018-05-01

    An electric line source illuminating a chiral cylinder embedded in a chiral background medium is considered. The field expressions inside and outside the chiral cylinder have been derived using the wave field decomposition approach. The effects of various chiral cylinders, chiral background media, and source locations upon the scattering gain pattern have been investigated. It is observed that a chiral background reduces the backward scattering gain, as compared to a free space background, for a dielectric cylinder. It is also shown that moving the line source away from the cylinder reduces the backward scattering gain for a chiral cylinder placed in a chiral background under some specific conditions. A unique phenomenon of reduced scattering gain has been observed at a specific observation angle for a chiral cylinder placed in a chiral background when the electric line source is located one free-space wavelength away. An isotropic scattering gain pattern is observed for a chiral nihility background, provided the cylinder is of chiral or chiral nihility type. It is also observed that this isotropic behaviour is independent of the background and cylinder chirality.

  14. Cephalopod dynamic camouflage: bridging the continuum between background matching and disruptive coloration

    PubMed Central

    Hanlon, R.T.; Chiao, C.-C.; Mäthger, L.M.; Barbosa, A.; Buresch, K.C.; Chubb, C.

    2008-01-01

    Individual cuttlefish, octopus and squid have the versatile capability to use body patterns for background matching and disruptive coloration. We define—qualitatively and quantitatively—the chief characteristics of the three major body pattern types used for camouflage by cephalopods: uniform and mottle patterns for background matching, and disruptive patterns that primarily enhance disruptiveness but aid background matching as well. There is great variation within each of the three body pattern types, but by defining their chief characteristics we lay the groundwork to test camouflage concepts by correlating background statistics with those of the body pattern. We describe at least three ways in which background matching can be achieved in cephalopods. Disruptive patterns in cuttlefish possess all four of the basic components of ‘disruptiveness’, supporting Cott's hypotheses, and we provide field examples of disruptive coloration in which the body pattern contrast exceeds that of the immediate surrounds. Based upon laboratory testing as well as thousands of images of camouflaged cephalopods in the field (a sample is provided on a web archive), we note that size, contrast and edges of background objects are key visual cues that guide cephalopod camouflage patterning. Mottle and disruptive patterns are frequently mixed, suggesting that background matching and disruptive mechanisms are often used in the same pattern. PMID:19008200

  15. Automatic vehicle counting using background subtraction method on gray scale images and morphology operation

    NASA Astrophysics Data System (ADS)

    Adi, K.; Widodo, A. P.; Widodo, C. E.; Pamungkas, A.; Putranto, A. B.

    2018-05-01

    Traffic monitoring on roads requires counting the number of vehicles passing on the road, particularly for highway transportation management. Therefore, it is necessary to develop a system that is able to count the number of vehicles automatically. Video processing methods can count the number of vehicles automatically. This research developed a vehicle counting system for a toll road. The system includes processes of video acquisition, frame extraction, and image processing for each frame. Video acquisition was conducted in the morning, at noon, in the afternoon, and in the evening. The system employs background subtraction and morphology methods on gray scale images for vehicle counting. The best vehicle counting results were obtained in the morning, with a counting accuracy of 86.36 %, whereas the lowest accuracy was in the evening, at 21.43 %. The difference between the morning and evening results is caused by the different illumination in the morning and evening, which changes the values of the image pixels.
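
    The counting stage described here can be sketched as morphological cleanup of the foreground mask followed by connected-component counting. The 3×3 erosion and the toy mask below are assumptions standing in for the paper's (unspecified) morphology operations and real camera frames.

```python
def erode(img):
    """Binary erosion with a 3x3 structuring element (border pixels -> 0).
    Removes isolated noise pixels before counting."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = int(all(img[i + di][j + dj]
                                for di in (-1, 0, 1) for dj in (-1, 0, 1)))
    return out

def count_blobs(img):
    """Count 4-connected foreground components via iterative flood fill."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if img[i][j] and not seen[i][j]:
                count += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and img[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Foreground mask from background subtraction: two "vehicles" plus one
# speck of noise that erosion removes before counting.
mask = [
    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0, 0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0, 0, 1, 1, 1, 0],
    [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],  # lone noise pixel
    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
]
vehicles = count_blobs(erode(mask))
```

    Without the erosion step the noise pixel would be counted as a third "vehicle", which is exactly why morphology precedes counting in pipelines like this.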

  16. Level crossing analysis of cosmic microwave background radiation: a method for detecting cosmic strings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Movahed, M. Sadegh; Khosravi, Shahram, E-mail: m.s.movahed@ipm.ir, E-mail: khosravi@ipm.ir

    2011-03-01

In this paper we study the footprint of cosmic strings, as topological defects in the very early universe, on the cosmic microwave background radiation. We develop the method of level crossing analysis in the context of the well-known Kaiser-Stebbins phenomenon for exploring the signature of cosmic strings. We simulate a Gaussian map using the best-fit parameters given by WMAP-7 and then superimpose cosmic string effects on it as incoherent and active fluctuations. In order to investigate the capability of our method to detect cosmic strings at various values of the tension, Gμ, a simulated pure Gaussian map is compared with one including cosmic strings. Based on the level crossing analysis, superimposed cosmic strings with Gμ ≳ 4 × 10^(-9) could be detected in a simulated map without instrumental noise at resolution R = 1'. In the presence of anticipated instrumental noise, the lower bound increases to Gμ ≳ 5.8 × 10^(-9).
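The core counting statistic behind a level crossing analysis is simple. The sketch below counts up-crossings of a threshold in a 1-D sampled field; the published analysis works on 2-D CMB maps, so this is only a toy illustration of the statistic, not the authors' pipeline:

```python
def up_crossings(samples, alpha):
    """Count up-crossings of level alpha in a 1-D sampled field:
    positions i where samples[i] < alpha <= samples[i+1]."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a < alpha <= b)

def crossing_statistic(samples, levels):
    """Crude level-crossing 'spectrum': up-crossing counts per level.
    Comparing this curve for two maps (e.g. a pure Gaussian map vs. one
    with superimposed strings) is the spirit of the diagnostic."""
    return [up_crossings(samples, a) for a in levels]
```

For a Gaussian field the expected crossing counts follow a known analytic form, so departures of the measured curve from that form flag non-Gaussian features such as string-induced discontinuities.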

  17. Multi-scale Methods in Quantum Field Theory

    NASA Astrophysics Data System (ADS)

    Polyzou, W. N.; Michlin, Tracie; Bulut, Fatih

    2018-05-01

Daubechies wavelets are used to make an exact multi-scale decomposition of quantum fields. For reactions that involve a finite energy and take place in a finite volume, the number of relevant quantum mechanical degrees of freedom is finite. The wavelet decomposition has natural resolution and volume truncations that can be used to isolate the relevant degrees of freedom. The application of flow equation methods to construct effective theories that decouple coarse and fine scale degrees of freedom is examined.

  18. FIELD SCREENING METHODS FOR HAZARDOUS WASTES AND TOXIC CHEMICALS

    EPA Science Inventory

The purpose of this document is to present the technical papers that were presented at the Second International Symposium on Field Screening Methods for Hazardous Wastes and Toxic Chemicals. Sixty platform presentations were made and included in one of ten sessions: chemical sensor...

  19. Local Field Response Method Phenomenologically Introducing Spin Correlations

    NASA Astrophysics Data System (ADS)

    Tomaru, Tatsuya

    2018-03-01

    The local field response (LFR) method is a way of searching for the ground state in a similar manner to quantum annealing. However, the LFR method operates on a classical machine, and quantum effects are introduced through a priori information and through phenomenological means reflecting the states during the computations. The LFR method has been treated with a one-body approximation, and therefore, the effect of entanglement has not been sufficiently taken into account. In this report, spin correlations are phenomenologically introduced as one of the effects of entanglement, by which multiple tunneling at anticrossing points is taken into account. As a result, the accuracy of solutions for a 128-bit system increases by 31% compared with that without spin correlations.

  20. An Exact Model-Based Method for Near-Field Sources Localization with Bistatic MIMO System.

    PubMed

    Singh, Parth Raj; Wang, Yide; Chargé, Pascal

    2017-03-30

    In this paper, we propose an exact model-based method for near-field sources localization with a bistatic multiple input, multiple output (MIMO) radar system, and compare it with an approximated model-based method. The aim of this paper is to propose an efficient way to use the exact model of the received signals of near-field sources in order to eliminate the systematic error introduced by the use of approximated model in most existing near-field sources localization techniques. The proposed method uses parallel factor (PARAFAC) decomposition to deal with the exact model. Thanks to the exact model, the proposed method has better precision and resolution than the compared approximated model-based method. The simulation results show the performance of the proposed method.

  1. Graviton propagator from background-independent quantum gravity.

    PubMed

    Rovelli, Carlo

    2006-10-13

    We study the graviton propagator in Euclidean loop quantum gravity. We use spin foam, boundary-amplitude, and group-field-theory techniques. We compute a component of the propagator to first order, under some approximations, obtaining the correct large-distance behavior. This indicates a way for deriving conventional spacetime quantities from a background-independent theory.

  2. Spatial sound field synthesis and upmixing based on the equivalent source method.

    PubMed

    Bai, Mingsian R; Hsu, Hoshen; Wen, Jheng-Ciang

    2014-01-01

Given a scarce number of recorded signals, spatial sound field synthesis with an extended sweet spot is a challenging problem in acoustic array signal processing. To address the problem, a synthesis and upmixing approach inspired by the equivalent source method (ESM) is proposed. The synthesis procedure is based on the pressure signals recorded by a microphone array and requires no source model; the array geometry can also be arbitrary. Four upmixing strategies are adopted to enhance the resolution of the reproduced sound field when there are more loudspeaker channels than microphones. Multi-channel inverse filtering with regularization is exploited to deal with the ill-posedness of the reconstruction process. The distance between the microphone and loudspeaker arrays is optimized to achieve the best synthesis quality. To validate the proposed system, numerical simulations and subjective listening experiments were performed. The results demonstrate that all upmixing methods improved the quality of the reproduced target sound field over the original reproduction. In particular, the underdetermined ESM interpolation method yielded the best spatial sound field synthesis in terms of reproduction error, timbral quality, and spatial quality.

  3. Grassmann phase space methods for fermions. II. Field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dalton, B.J., E-mail: bdalton@swin.edu.au; Jeffers, J.; Barnett, S.M.

In both quantum optics and cold atom physics, the behaviour of bosonic photons and atoms is often treated using phase space methods, where mode annihilation and creation operators are represented by c-number phase space variables, with the density operator equivalent to a distribution function of these variables. The anti-commutation rules for fermion annihilation and creation operators suggest the possibility of using anti-commuting Grassmann variables to represent these operators. However, in spite of the seminal work by Cahill and Glauber and a few applications, the use of Grassmann phase space methods in quantum-atom optics to treat fermionic systems is rather rare, though fermion coherent states using Grassmann variables are widely used in particle physics. This paper presents a phase space theory for fermion systems based on distribution functionals, which replace the density operator and involve Grassmann fields representing anti-commuting fermion field annihilation and creation operators. It is an extension of a previous phase space theory paper for fermions (Paper I) based on separate modes, in which the density operator is replaced by a distribution function depending on Grassmann phase space variables which represent the mode annihilation and creation operators. This further development of the theory is important for the situation when large numbers of fermions are involved, resulting in too many modes to treat separately. Here Grassmann fields, distribution functionals, functional Fokker-Planck equations and Ito stochastic field equations are involved. Typical applications to a trapped Fermi gas of interacting spin-1/2 fermionic atoms and to multi-component Fermi gases with non-zero range interactions are presented, showing that the Ito stochastic field equations are local in these cases. For the spin-1/2 case we also show how simple solutions can be obtained both for the untrapped case and for an optical lattice trapping potential.

  4. Bino variations: Effective field theory methods for dark matter direct detection

    NASA Astrophysics Data System (ADS)

    Berlin, Asher; Robertson, Denis S.; Solon, Mikhail P.; Zurek, Kathryn M.

    2016-05-01

    We apply effective field theory methods to compute bino-nucleon scattering, in the case where tree-level interactions are suppressed and the leading contribution is at loop order via heavy flavor squarks or sleptons. We find that leading log corrections to fixed-order calculations can increase the bino mass reach of direct detection experiments by a factor of 2 in some models. These effects are particularly large for the bino-sbottom coannihilation region, where bino dark matter as heavy as 5-10 TeV may be detected by near future experiments. For the case of stop- and selectron-loop mediated scattering, an experiment reaching the neutrino background will probe thermal binos as heavy as 500 and 300 GeV, respectively. We present three key examples that illustrate in detail the framework for determining weak scale coefficients, and for mapping onto a low-energy theory at hadronic scales, through a sequence of effective theories and renormalization group evolution. For the case of a squark degenerate with the bino, we extend the framework to include a squark degree of freedom at low energies using heavy particle effective theory, thus accounting for large logarithms through a "heavy-light current." Benchmark predictions for scattering cross sections are evaluated, including complete leading order matching onto quark and gluon operators, and a systematic treatment of perturbative and hadronic uncertainties.

  5. Magnetic irreversibility: An important amendment in the zero-field-cooling and field-cooling method

    NASA Astrophysics Data System (ADS)

    Teixeira Dias, Fábio; das Neves Vieira, Valdemar; Esperança Nunes, Sabrina; Pureur, Paulo; Schaf, Jacob; Fernanda Farinela da Silva, Graziele; de Paiva Gouvêa, Cristol; Wolff-Fabris, Frederik; Kampert, Erik; Obradors, Xavier; Puig, Teresa; Roa Rovira, Joan Josep

    2016-02-01

The present work reports on experimental procedures to correct significant deviations of magnetization data, caused by magnetic relaxation, due to small field cycling by sample transport in the inhomogeneous applied magnetic field of commercial magnetometers. The extensively used method of measuring the magnetic irreversibility by first cooling the sample in zero field, switching on a constant applied magnetic field and measuring the magnetization M(T) while slowly warming the sample, and subsequently measuring M(T) while slowly cooling it back in the same field, is very sensitive even to small displacements of the magnetization curve. In our melt-processed YBaCuO superconducting sample we observed displacements of the irreversibility limit of up to 7 K in high fields. Such displacements are detected only on confronting the magnetic irreversibility limit with other measurements, for instance zero resistance, in which the sample remains fixed and so is not affected by such relaxation. We measured the magnetic irreversibility, Tirr(H), using a vibrating sample magnetometer (VSM) from Quantum Design. The zero resistance data, Tc0(H), were obtained using a PPMS from Quantum Design. On confronting our irreversibility lines with those of zero resistance, we observed that the Tc0(H) data fell several kelvin above the Tirr(H) data, which obviously contradicts the well-known properties of superconductivity. In order to get consistent Tirr(H) data in the H-T plane, it was necessary to perform numerous additional measurements as a function of the amplitude of the sample transport and extrapolate the Tirr(H) data for each applied field to zero amplitude.
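The final correction step, extrapolating Tirr(H) to zero transport amplitude, amounts to fitting the measured points and reading off the intercept. A minimal sketch, assuming a linear trend (the actual functional form used by the authors is not stated in the abstract):

```python
def extrapolate_to_zero(amplitudes, tirr):
    """Least-squares line through (transport amplitude, Tirr) points,
    evaluated at zero amplitude. The linear model is an illustrative
    assumption; the intercept estimates the relaxation-free Tirr."""
    n = len(amplitudes)
    mx = sum(amplitudes) / n
    my = sum(tirr) / n
    sxx = sum((x - mx) ** 2 for x in amplitudes)
    sxy = sum((x - mx) * (y - my) for x, y in zip(amplitudes, tirr))
    slope = sxy / sxx
    return my - slope * mx  # intercept = Tirr at zero transport amplitude
```

Repeating this fit for each applied field H yields a corrected irreversibility line Tirr(H) that can be consistently compared with the fixed-sample Tc0(H) data.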

  6. Illuminating the Background: Topics in Cosmic Microwave Background Polarization Research

    NASA Astrophysics Data System (ADS)

    Miller, Nathan J.

The cosmic microwave background provides a wealth of information about the origin and history of the universe. The statistics of the anisotropy and the polarization of the cosmic microwave background, among other things, can tell us about the distribution of matter, the redshift of reionization, and the nature of the primordial fluctuations. From the lensing of the cosmic microwave background due to intervening matter, we can extract information about neutrinos and the equation of state of dark energy. A measurement of the large angular scale B-mode polarization has been called the "smoking gun" of inflation, a theory that describes a possible early rapid expansion of the universe. The focus of current experiments is to measure this B-mode polarization, while several experiments, such as POLARBEAR, are also looking to measure the lensing of the cosmic microwave background. This dissertation will discuss several different topics in cosmic microwave background polarization research. I will make predictions for future experiments and I will also show analysis for two current experiments, POLARBEAR and BICEP. I will show how beam systematics affect the measurement of cosmological parameters and how well we must limit these systematics in order to get unbiased constraints on cosmological parameters for future experiments. I will discuss a novel way of using the temperature-polarization cross correlation to constrain the amount of inflationary gravitational waves. Through Markov Chain Monte Carlo methods, I will determine how well future experiments will be able to constrain the neutrino masses and their degeneracy parameters. I will show results from current data analysis and calibration being done on the Cedar Flat deployment for the POLARBEAR experiment, which is currently being constructed in the Atacama desert in Chile. Finally, I will analyze the claim of detection of cosmological birefringence in the BICEP data and show that there is reason to believe it is due to

  7. Background derivation and image flattening: getimages

    NASA Astrophysics Data System (ADS)

    Men'shchikov, A.

    2017-11-01

Modern high-resolution images obtained with space observatories display extremely strong intensity variations across images on all spatial scales. Source extraction in such images with methods based on global thresholding may bring unacceptably large numbers of spurious sources in bright areas while failing to detect sources in low-background or low-noise areas. It would be highly beneficial to subtract background and equalize the levels of small-scale fluctuations in the images before extracting sources or filaments. This paper describes getimages, a new method of background derivation and image flattening. It is based on median filtering with sliding windows that correspond to a range of spatial scales from the observational beam size up to a maximum structure width X_λ. The latter is the single free parameter of getimages and can be evaluated manually from the observed image I_λ. The median filtering algorithm provides a background image B_λ for structures of all widths below X_λ. The same median filtering procedure applied to an image of standard deviations D_λ, derived from the background-subtracted image S_λ, results in a flattening image F_λ. Finally, a flattened detection image I_λD = S_λ/F_λ is computed, whose standard deviations are uniform outside sources and filaments. Detecting sources in such greatly simplified images results in much cleaner extractions that are more complete and reliable. As a bonus, getimages reduces various observational and map-making artifacts and equalizes noise levels between independent tiles of mosaicked images.
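A toy 1-D version of the median-filtering idea can illustrate both steps: background via a sliding median, then flattening by a median of the absolute residuals. The truncated edge windows and the single-scale window below are simplifications of the paper's multi-scale scheme, not the getimages implementation:

```python
import statistics

def sliding_median(values, window):
    """Median filter with an odd sliding window; edge windows are
    truncated. A narrow spike is rejected, so the output tracks the
    slowly varying background."""
    half = window // 2
    n = len(values)
    return [statistics.median(values[max(0, i - half):min(n, i + half + 1)])
            for i in range(n)]

def flatten(values, window):
    """Background-subtract, then normalize by a median-filtered map of
    local absolute residuals, so fluctuation levels become comparable
    across the image (the role of the flattening image)."""
    bg = sliding_median(values, window)
    resid = [v - b for v, b in zip(values, bg)]
    scale = sliding_median([abs(r) for r in resid], window)
    return [r / s if s else 0.0 for r, s in zip(resid, scale)]
```

In the flattened output, a genuine source stands out against residuals of uniform amplitude, which is what makes thresholded detection cleaner.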

  8. Particle and flow field holography: A critical survey

    NASA Technical Reports Server (NTRS)

    Trolinger, James D.

    1987-01-01

    A brief background is provided for the fields of particle and flow visualization holography. A summary of methods currently in use is given, followed by a discussion of more recent and unique applications. The problem of data reduction is discussed. A state of the art summary is then provided with a prognosis of the future of the field. Particle and flow visualization holography are characterized as powerful tools currently in wide use and with significant untapped potential.

  9. a Data Field Method for Urban Remotely Sensed Imagery Classification Considering Spatial Correlation

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Qin, K.; Zeng, C.; Zhang, E. B.; Yue, M. X.; Tong, X.

    2016-06-01

Spatial correlation between pixels is important information for remotely sensed imagery classification. The data field method and spatial autocorrelation statistics have been utilized to describe and model the spatial information of local pixels. The original data field method can represent the spatial interactions of neighbourhood pixels effectively. However, its focus on measuring the grey level change between the central pixel and the neighbourhood pixels exaggerates the contribution of the central pixel to the whole local window. Geary's C has also been proven to characterise and quantify well the spatial correlation between each pixel and its neighbourhood pixels, but the extracted objects are badly delineated, with a distracting salt-and-pepper effect of isolated misclassified pixels. To correct this defect, we introduce the data field method for filtering and noise limitation. Moreover, the original data field method is enhanced by considering each pixel in the window as the central pixel when computing statistical characteristics between it and its neighbourhood pixels. The last step employs a support vector machine (SVM) for the classification of multiple features (e.g. the spectral feature and the spatial correlation feature). In order to validate the effectiveness of the developed method, experiments are conducted on different remotely sensed images containing multiple complex object classes. The results show that the developed method outperforms the traditional method in terms of classification accuracy.
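Geary's C, the autocorrelation statistic mentioned above, can be computed over a small grid with rook (4-neighbour) binary weights; the binary weighting is a common choice but an assumption here:

```python
def gearys_c(grid):
    """Geary's C over a 2-D grid with 4-neighbour (rook) binary weights.
    C is near 1 for no spatial autocorrelation, below 1 for positive
    autocorrelation (similar neighbours), above 1 for negative."""
    h, w = len(grid), len(grid[0])
    vals = [v for row in grid for v in row]
    n = len(vals)
    mean = sum(vals) / n
    denom = sum((v - mean) ** 2 for v in vals)
    num = 0.0
    weight_total = 0
    for i in range(h):
        for j in range(w):
            for di, dj in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w:
                    num += (grid[i][j] - grid[ni][nj]) ** 2
                    weight_total += 1
    return (n - 1) * num / (2 * weight_total * denom)
```

Computed per local window, this value becomes one spatial-correlation feature alongside the spectral features fed to the SVM.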

  10. Method and systems for collecting data from multiple fields of view

    NASA Technical Reports Server (NTRS)

    Schwemmer, Geary K. (Inventor)

    2002-01-01

    Systems and methods for processing light from multiple fields (48, 54, 55) of view without excessive machinery for scanning optical elements. In an exemplary embodiment of the invention, multiple holographic optical elements (41, 42, 43, 44, 45), integrated on a common film (4), diffract and project light from respective fields of view.

  11. Krylov subspace iterative methods for boundary element method based near-field acoustic holography.

    PubMed

    Valdivia, Nicolas; Williams, Earl G

    2005-02-01

The reconstruction of the acoustic field for general surfaces is obtained from the solution of a matrix system that results from a boundary integral equation discretized using boundary element methods. The solution to the resultant matrix system is obtained using iterative regularization methods that counteract the effect of noise on the measurements. These methods do not require the calculation of the singular value decomposition, which can be expensive when the matrix system is considerably large. Krylov subspace methods are iterative methods that exhibit the phenomenon known as "semi-convergence": the optimal regularized solution is obtained after a few iterations, but if the iteration is not stopped, the method converges to a solution that is generally corrupted by the measurement errors. For these methods the number of iterations plays the role of the regularization parameter. We focus our attention on the regularizing properties of Krylov subspace methods such as conjugate gradients, least squares QR, and the recently proposed hybrid method. A discussion and comparison of the available stopping rules is included. A vibrating plate is considered as an example to validate our results.
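Semi-convergence is easy to demonstrate with plain conjugate gradients, where the iteration cap itself is the regularization parameter. This sketch takes a user-supplied matrix-vector product, as is typical when the matrix is too large to factor; the fixed-iteration stopping is the illustrative point, not a production stopping rule:

```python
def cg(matvec, b, iters):
    """Conjugate gradients on an SPD system A x = b, stopped after a
    fixed number of iterations. In discrete inverse problems, stopping
    early regularizes the solution (semi-convergence): early iterations
    capture the smooth, well-determined components, later ones amplify
    measurement noise."""
    n = len(b)
    x = [0.0] * n
    r = b[:]          # residual b - A x for x = 0
    p = r[:]          # initial search direction
    rs = sum(v * v for v in r)
    for _ in range(iters):
        Ap = matvec(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(v * v for v in r)
        if rs_new < 1e-30:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x
```

Stopping rules such as the discrepancy principle choose `iters` so that the residual norm matches the estimated noise level, rather than iterating to full convergence.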

  12. Comparability between various field and laboratory wood-stove emission-measurement methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCrillis, R.C.; Jaasma, D.R.

    1991-01-01

The paper compares various field and laboratory woodstove emission measurement methods. In 1988, the U.S. EPA promulgated performance standards for residential wood heaters (woodstoves). Over the past several years, a number of field studies have been undertaken to determine the actual level of emission reduction achieved by new-technology woodstoves in everyday use. The studies have required the development and use of particulate and gaseous emission sampling equipment compatible with operation in private homes. Since woodstoves are tested for certification in the laboratory using EPA Methods 5G and 5H, it is of interest to determine the correlation between these regulatory methods and the in-house equipment. Two in-house sampling systems have been used most widely: one is an intermittent, pump-driven particulate sampler that collects particulate and condensible organics on a filter and organic adsorbent resin; the other uses an evacuated cylinder as the motive force, with particulate and condensible organics collected in a condenser and dual filter. Both samplers can operate unattended for 1-week periods. A large number of tests have been run comparing Methods 5G and 5H to both samplers. The paper presents these comparison data and determines the relationships between the regulatory and field samplers.

  13. Spectral methods for coupled channels with a mass gap

    NASA Astrophysics Data System (ADS)

    Weigel, H.; Quandt, M.; Graham, N.

    2018-02-01

    We develop a method to compute the vacuum polarization energy for coupled scalar fields with different masses scattering off a background potential in one space dimension. As an example we consider the vacuum polarization energy of a kinklike soliton built from two real scalar fields with different mass parameters.

  14. Ionization signals from diamond detectors in fast-neutron fields

    NASA Astrophysics Data System (ADS)

    Weiss, C.; Frais-Kölbl, H.; Griesmayer, E.; Kavrigin, P.

    2016-09-01

In this paper we introduce a novel analysis technique for measurements with single-crystal chemical vapor deposition (sCVD) diamond detectors in fast-neutron fields. This method exploits the unique electronic property of sCVD diamond sensors that the signal shape of the detector current is directly proportional to the initial ionization profile. In fast-neutron fields the diamond sensor acts simultaneously as target and sensor. The interaction of neutrons with the stable isotopes 12C and 13C is of interest for fast-neutron diagnostics. The measured shapes of detector current pulses are used to identify individual types of interactions in the diamond, with the goal of selecting neutron-induced reactions in the diamond and suppressing neutron-induced background reactions as well as γ-background. The method is verified with experimental data from a measurement in a 14.3 MeV neutron beam at JRC-IRMM, Geel/Belgium, where the 13C(n, α)10Be reaction was successfully extracted from the dominating background of recoil protons and γ-rays and the energy resolution of the 12C(n, α)9Be reaction was substantially improved. The presented analysis technique is especially relevant for diagnostics in harsh radiation environments, like fission and fusion reactors. It allows the neutron spectrum to be extracted from the background and is particularly applicable to neutron flux monitoring and neutron spectroscopy.

  15. Single-camera displacement field correlation method for centrosymmetric 3D dynamic deformation measurement

    NASA Astrophysics Data System (ADS)

    Zhao, Jiaye; Wen, Huihui; Liu, Zhanwei; Rong, Jili; Xie, Huimin

    2018-05-01

    Three-dimensional (3D) deformation measurements are a key issue in experimental mechanics. In this paper, a displacement field correlation (DFC) method to measure centrosymmetric 3D dynamic deformation using a single camera is proposed for the first time. When 3D deformation information is collected by a camera at a tilted angle, the measured displacement fields are coupling fields of both the in-plane and out-of-plane displacements. The features of the coupling field are analysed in detail, and a decoupling algorithm based on DFC is proposed. The 3D deformation to be measured can be inverted and reconstructed using only one coupling field. The accuracy of this method was validated by a high-speed impact experiment that simulated an underwater explosion. The experimental results show that the approach proposed in this paper can be used in 3D deformation measurements with higher sensitivity and accuracy, and is especially suitable for high-speed centrosymmetric deformation. In addition, this method avoids the non-synchronisation problem associated with using a pair of high-speed cameras, as is common in 3D dynamic measurements.

  16. Application of Discrete Huygens Method for Diffraction of Transient Ultrasonic Field

    NASA Astrophysics Data System (ADS)

    Alia, A.

    2018-01-01

Several time-domain methods have been widely used to predict impulse response in acoustics. Despite its great potential, the Discrete Huygens Method (DHM) has not been as widely used in the domain of ultrasonic diffraction as in other fields. In fact, little can be found in the literature about the application of the DHM to the diffraction phenomenon that can be described in terms of direct and edge waves, a concept suggested by Young as early as 1802. In this paper, a simple axisymmetric DHM model has been used to simulate the transient ultrasonic field radiated by a baffled transducer and its diffraction by a target located on axis. The results are validated by impulse-response-based calculations. They indicate the capability of the DHM to simulate diffraction occurring at transducer and target edges and to predict the complicated transient field in pulse mode.

  17. Research on infrared dim-point target detection and tracking under sea-sky-line complex background

    NASA Astrophysics Data System (ADS)

    Dong, Yu-xing; Li, Yan; Zhang, Hai-bo

    2011-08-01

Target detection and tracking in infrared images is an important part of modern military defense systems. Detecting and recognizing dim point targets under complex backgrounds is a difficult, strategically valuable, and challenging research topic. The main objects detected by carrier-borne infrared vigilance systems are sea-skimming aircraft and missiles. Because of the wide field of view of a vigilance system, the target usually lies within sea clutter, which greatly complicates detection and recognition. Traditional point target detection algorithms, such as adaptive background prediction, work well when the background has a dispersion-decreasing structure; but where the background has a large gray gradient, such as at the sea-sky-line or on sea waves, they produce a high false-alarm rate in those local areas and cannot obtain satisfactory results. Because a dim point target has no obvious geometric or texture features, detecting such targets is, from a mathematical perspective, a problem of singular function analysis, and from an image processing perspective, a problem of judging isolated singularities in the image. In essence, dim point target detection is the separation of target and background by their different singularity characteristics. The image from an infrared sensor is usually accompanied by various kinds of noise, caused by the complicated background or by the sensor itself, which can affect target detection and tracking. The purpose of image preprocessing is therefore to reduce the effects of noise, raise the SNR of the image, and increase the contrast between target and background.
According to the characteristics of low sea-skimming infrared small flying targets, the median filter is used to

  18. Infrared images target detection based on background modeling in the discrete cosine domain

    NASA Astrophysics Data System (ADS)

    Ye, Han; Pei, Jihong

    2018-02-01

Background modeling is a critical technology for detecting moving targets in video surveillance. Most background modeling techniques are aimed at land monitoring and operate in the spatial domain. Establishing a background model becomes difficult when the scene is a complex, fluctuating sea surface. In this paper, the background stability and the separability between background and target are analyzed in depth in the discrete cosine transform (DCT) domain; on this basis, we propose a background modeling method. The proposed method models each frequency point as a single Gaussian to represent the background, and the target is extracted by suppressing the background coefficients. Experimental results show that our approach can establish an accurate background model for seawater, and the detection results outperform other background modeling methods in the spatial domain.
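A per-coefficient single-Gaussian model of the kind described can be sketched as follows; the exponential learning rate, the 3σ decision rule, and the choice to freeze the model on target values are illustrative assumptions rather than the paper's parameters:

```python
class CoefficientModel:
    """Single-Gaussian background model for one frequency coefficient
    (e.g. one DCT frequency point), updated with an exponential learning
    rate. Values far from the running mean are flagged as target and
    excluded from the update so they do not pollute the background."""

    def __init__(self, init, alpha=0.05, k=3.0):
        self.mean, self.var = init, 1.0
        self.alpha, self.k = alpha, k

    def observe(self, x):
        is_target = abs(x - self.mean) > self.k * self.var ** 0.5
        if not is_target:  # only background observations update the model
            a = self.alpha
            self.mean = (1 - a) * self.mean + a * x
            self.var = (1 - a) * self.var + a * (x - self.mean) ** 2
        return is_target
```

One such model per frequency point tracks the fluctuating sea background; suppressing the coefficients explained by the background and inverse-transforming what remains yields the target map.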

  19. Anisotropic quantum quench in the presence of frustration or background gauge fields: A probe of bulk currents and topological chiral edge modes

    NASA Astrophysics Data System (ADS)

    Killi, Matthew; Trotzky, Stefan; Paramekanti, Arun

    2012-12-01

    Bosons and fermions, in the presence of frustration or background gauge fields, can form many-body ground states that support equilibrium charge or spin currents. Motivated by the experimental creation of frustration or synthetic gauge fields in ultracold atomic systems, we propose a general scheme by which making a sudden anisotropic quench of the atom tunneling across the lattice and tracking the ensuing density modulations provides a powerful and gauge-invariant route to probing diverse equilibrium current patterns. Using illustrative examples of trapped superfluid Bose and normal Fermi systems in the presence of artificial magnetic fluxes on square lattices, and frustrated bosons in a triangular lattice, we show that this scheme to probe equilibrium bulk current order works independent of particle statistics. We also show that such quenches can detect chiral edge modes in gapped topological states, such as quantum Hall or quantum spin Hall insulators.

  20. Holography for Schrödinger backgrounds

    NASA Astrophysics Data System (ADS)

    Guica, Monica; Skenderis, Kostas; Taylor, Marika; van Rees, Balt C.

    2011-02-01

We discuss holography for Schrödinger solutions of both topologically massive gravity in three dimensions and massive vector theories in (d + 1) dimensions. In both cases the dual field theory can be viewed as a d-dimensional conformal field theory (two-dimensional in the case of TMG) deformed by certain operators that respect the Schrödinger symmetry. These operators are irrelevant from the viewpoint of the relativistic conformal group but they are exactly marginal with respect to the non-relativistic conformal group. The spectrum of linear fluctuations around the background solutions corresponds to operators that are labeled by their scaling dimension and the lightcone momentum k_v. We set up the holographic dictionary and compute 2-point functions of these operators both holographically and in field theory using conformal perturbation theory, and find agreement. The counterterms needed for holographic renormalization are non-local in the v lightcone direction.

  1. A field method for soil erosion measurements in agricultural and natural lands

    Treesearch

    Y.P. Hsieh; K.T. Grant; G.C. Bugna

    2009-01-01

    Soil erosion is one of the most important watershed processes in nature, yet quantifying it under field conditions remains a challenge. The lack of soil erosion field data is a major factor hindering our ability to predict soil erosion in a watershed. We present here the development of a simple and sensitive field method that quantifies soil erosion and the resulting...

  2. Isotropy-violation diagnostics for B-mode polarization foregrounds to the Cosmic Microwave Background

    NASA Astrophysics Data System (ADS)

    Rotti, Aditya; Huffenberger, Kevin

    2016-09-01

    Isotropy-violation statistics can highlight polarized galactic foregrounds that contaminate primordial B-modes in the Cosmic Microwave Background (CMB). We propose a particular isotropy-violation test and apply it to polarized Planck 353 GHz data, constructing a map that indicates B-mode foreground dust power over the sky. We build our main isotropy test in harmonic space via the bipolar spherical harmonic basis, and our method helps us to identify the least-contaminated directions. By this measure, there are regions of low foreground in and around the BICEP field, near the South Galactic Pole, and in the Northern Galactic Hemisphere. There is also a possible foreground feature in the BICEP field. We compare our results to those based on the local power spectrum, which is computed on discs using a version of the method of Planck Int. XXX (2016). The discs method is closely related to our isotropy-violation diagnostic. We take particular care in the treatment of noise, including chance correlations with the foregrounds. Currently we use our isotropy tool to assess the cleanest portions of the sky, but in the future such methods will allow isotropy-based null tests for foreground contamination in maps purported to measure primordial B-modes, particularly in cases of limited frequency coverage.

  3. Probabilistic BPRRC: Robust Change Detection against Illumination Changes and Background Movements

    NASA Astrophysics Data System (ADS)

    Yokoi, Kentaro

    This paper presents Probabilistic Bi-polar Radial Reach Correlation (PrBPRRC), a change detection method that is robust against illumination changes and background movements. Most traditional change detection methods are robust against either illumination changes or background movements; BPRRC is one of the illumination-robust change detection methods. We introduce a probabilistic background texture model into BPRRC and add robustness against background movements, including foreground invasions such as moving cars, walking people, swaying trees, and falling snow. We show the superiority of PrBPRRC in environments with illumination changes and background movements using three public datasets and one private dataset: ATON Highway data, Karlsruhe traffic sequence data, PETS 2007 data, and Walking-in-a-room data.

  4. BOOK REVIEW: The Cosmic Microwave Background The Cosmic Microwave Background

    NASA Astrophysics Data System (ADS)

    Coles, Peter

    2009-08-01

    With the successful launch of the European Space Agency's Planck satellite earlier this year the cosmic microwave background (CMB) is once again the centre of attention for cosmologists around the globe. Since its accidental discovery in 1964 by Arno Penzias and Robert Wilson, this relic of the Big Bang has been subjected to intense scrutiny by generation after generation of experiments and has gradually yielded up answers to the deepest questions about the origin of our Universe. Most recently, the Wilkinson Microwave Anisotropy Probe (WMAP) has made a full-sky analysis of the pattern of temperature and polarization variations that helped establish a new standard cosmological model, confirmed the existence of dark matter and dark energy, and provided strong evidence that there was an epoch of primordial inflation. Ruth Durrer's book reflects the importance of the CMB for future developments in this field. Aimed at graduate students and established researchers, it consists of a basic introduction to cosmology and the theory of primordial perturbations followed by a detailed explanation of how these manifest themselves as measurable variations in the present-day radiation field. It then focuses on the statistical methods needed to obtain accurate estimates of the parameters of the standard cosmological model, and finishes with a discussion of the effect of gravitational lensing on the CMB and on the evolution of its spectrum. The book apparently grew out of various lecture notes on CMB anisotropies for graduate courses given by the author. Its level and scope are well matched to the needs of such an audience and the presentation is clear and well-organized. I am sure that this book will be a useful reference for more senior scientists too. If I have a criticism, it is not about what is in the book but what is omitted. In my view, one of the most exciting possibilities for future CMB missions, including Planck, is the possibility that they might discover physics

  5. Research on Visualization Design Method in the Field of New Media Software Engineering

    NASA Astrophysics Data System (ADS)

    Deqiang, Hu

    2018-03-01

    In the current period of rapidly developing science and technology, with increasingly fierce market competition and growing public demand, a new design and application method has emerged in the field of new media software engineering: the visualization design method. Applying the visualization design method to new media software engineering can not only improve the operational efficiency of new media software engineering but, more importantly, enhance the quality of software development by means of certain media of communication and transformation; on this basis, it also promotes the continued progress and development of new media software engineering in China. Therefore, this article concretely analyses the application of the visualization design method in new media software engineering, starting from an overview of visualization design methods and building on a systematic analysis of the underlying technology.

  6. Verification of the ISO calibration method for field pyranometers under tropical sky conditions

    NASA Astrophysics Data System (ADS)

    Janjai, Serm; Tohsing, Korntip; Pattarapanitchai, Somjet; Detkhon, Pasakorn

    2017-02-01

    Field pyranometers need to be calibrated annually, and the International Organization for Standardization (ISO) has defined a standard method (ISO 9847) for calibrating them. According to this standard method for outdoor calibration, a field pyranometer has to be compared to a reference pyranometer over a period of 2 to 14 days, depending on sky conditions. In this work, the ISO 9847 standard method was verified under tropical sky conditions. To verify the standard method, calibration of field pyranometers was conducted at a tropical site located in Nakhon Pathom (13.82° N, 100.04° E), Thailand, under various sky conditions. The sky conditions were monitored using a sky camera. The calibration results for different calibration periods under various sky conditions were analyzed. It was found that the calibration periods given by the standard method could be reduced without significant change in the final calibration result. Recommendations and discussion on the use of this standard method in the tropics are also presented.
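
    A minimal sketch of the series-comparison idea behind such outdoor calibration (a simplified reading of ISO 9847, not the full procedure with its series-acceptance criteria): the field instrument's sensitivity is estimated as the ratio of its integrated signal to the reference's integrated irradiance. The readings below are hypothetical.

```python
import numpy as np

def calibration_factor(v_field, g_reference):
    """Series calibration in the spirit of ISO 9847: estimate the field
    pyranometer's sensitivity (V per W/m^2) as the ratio of its summed
    signal to the summed reference irradiance over accepted series."""
    return np.sum(v_field) / np.sum(g_reference)

# Hypothetical one-minute readings over one clear-sky series
g_ref = np.array([820.0, 845.0, 860.0, 850.0])   # reference irradiance, W/m^2
v_fld = 8.2e-6 * g_ref                           # field pyranometer output, V
sensitivity = calibration_factor(v_fld, g_ref)
```

    Averaging over many series under different sky conditions, as the standard prescribes, reduces the influence of any single unrepresentative period.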

  7. A dark-field microscope for background-free detection of resonance fluorescence from single semiconductor quantum dots operating in a set-and-forget mode

    NASA Astrophysics Data System (ADS)

    Kuhlmann, Andreas V.; Houel, Julien; Brunner, Daniel; Ludwig, Arne; Reuter, Dirk; Wieck, Andreas D.; Warburton, Richard J.

    2013-07-01

    Optically active quantum dots, for instance self-assembled InGaAs quantum dots, are potentially excellent single photon sources. The fidelity of the single photons is much improved using resonant rather than non-resonant excitation. With resonant excitation, the challenge is to distinguish between resonance fluorescence and scattered laser light. We have met this challenge by creating a polarization-based dark-field microscope to measure the resonance fluorescence from a single quantum dot at low temperature. We achieve a suppression of the scattered laser light exceeding a factor of 10^7 and background-free detection of resonance fluorescence. The same optical setup operates over the entire quantum dot emission range (920-980 nm) and also in high magnetic fields. The major development is the outstanding long-term stability: once the dark-field point has been established, the microscope operates for days without alignment. The mechanical and optical designs of the microscope are presented, as well as exemplary resonance fluorescence spectroscopy results on individual quantum dots to underline the microscope's excellent performance.

  8. Evaluation Method for Fieldlike-Torque Efficiency by Modulation of the Resonance Field

    NASA Astrophysics Data System (ADS)

    Kim, Changsoo; Kim, Dongseuk; Chun, Byong Sun; Moon, Kyoung-Woong; Hwang, Chanyong

    2018-05-01

    The spin Hall effect has attracted a lot of interest in spintronics because it offers the possibility of a faster switching route with an electric current than with a spin-transfer-torque device. Recently, fieldlike spin-orbit torque has been shown to play an important role in the magnetization switching mechanism. However, there is no simple method for observing the fieldlike spin-orbit torque efficiency. We suggest a method for measuring fieldlike spin-orbit torque using a linear change in the resonance field in spectra of direct-current (dc)-tuned spin-torque ferromagnetic resonance. The fieldlike spin-orbit torque efficiency can be obtained in both a macrospin simulation and in experiments by simply subtracting the Oersted field from the shifted amount of resonance field. This method analyzes the effect of fieldlike torque using dc in a normal metal; therefore, only the dc resistivity and the dimensions of each layer are considered in estimating the fieldlike spin-torque efficiency. The evaluation of fieldlike-torque efficiency of a newly emerging material by modulation of the resonance field provides a shortcut in the development of an alternative magnetization switching device.

  9. Background for protective action recommendations: accidental radioactive contamination of food and animal feeds. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shleien, B.; Schmidt, G.D.; Chiacchierini, R.P.

    This report provides background material for the development of FDA's Protective Action Recommendations: Accidental Radioactive Contamination of Food and Animal Feeds. The rationale, dosimetric and agricultural transport models for the Protective Action Guides are presented, along with information on dietary intake. In addition, the document contains a discussion of field methods of analysis of radionuclides deposited on the ground or contained in milk and herbage. Various protective actions are described and evaluated, and a cost-effectiveness analysis of the recommendations is performed.

  10. Background Model for the Majorana Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuesta, C.; Abgrall, N.; Aguayo, Estanislao

    2015-06-01

    The Majorana Collaboration is constructing a prototype system containing 40 kg of HPGe detectors to demonstrate the feasibility and potential of a future tonne-scale experiment to search for neutrinoless double-beta (0νββ) decay in 76Ge. In view of the requirement that the next generation of tonne-scale Ge-based 0νββ-decay experiments be capable of probing the neutrino mass scale in the inverted-hierarchy region, a major goal of the Majorana Demonstrator is to demonstrate a path forward to achieving a background rate at or below 1 cnt/(ROI-t-y) in the 4 keV region of interest around the Q-value at 2039 keV. This goal is pursued through a combination of a significant reduction of radioactive impurities in construction materials and analytical methods for background rejection, for example using powerful pulse-shape analysis techniques profiting from the p-type point-contact HPGe detector technology. The effectiveness of these methods is assessed using Geant4 simulations of the different background components, whose purity levels are constrained by radioassay measurements.

  11. Modeling background radiation in Southern Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haber, Daniel A.; Burnley, Pamela C.; Adcock, Christopher T.

    Aerial gamma ray surveys are an important tool for national security, scientific, and industrial interests in determining locations of both anthropogenic and natural sources of radioactivity. There is a relationship between radioactivity and geology, and in the past this relationship has been used to predict geology from an aerial survey. The purpose of this project is to develop a method to predict the radiologic exposure rate of geologic materials by creating a high-resolution background model. The intention is for this method to be used in an emergency response scenario where the background radiation environment is unknown. Two study areas in Southern Nevada have been modeled using geologic data, images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), geochemical data, and pre-existing low-resolution aerial surveys from the National Uranium Resource Evaluation (NURE) Survey. Using these data, geospatial areas that are homogeneous in terms of K, U, and Th, referred to as background radiation units, are defined and the gamma ray exposure rate is predicted. The prediction is compared to data collected via detailed aerial survey by the Department of Energy's Remote Sensing Lab - Nellis, allowing for the refinement of the technique. By using geologic units to define radiation background units of exposed bedrock and ASTER visualizations to subdivide and define radiation background units within alluvium, successful models have been produced for Government Wash, north of Lake Mead, and for the western shore of Lake Mohave, east of Searchlight, NV.

  12. Modeling background radiation in Southern Nevada

    DOE PAGES

    Haber, Daniel A.; Burnley, Pamela C.; Adcock, Christopher T.; ...

    2017-02-06

    Aerial gamma ray surveys are an important tool for national security, scientific, and industrial interests in determining locations of both anthropogenic and natural sources of radioactivity. There is a relationship between radioactivity and geology, and in the past this relationship has been used to predict geology from an aerial survey. The purpose of this project is to develop a method to predict the radiologic exposure rate of geologic materials by creating a high-resolution background model. The intention is for this method to be used in an emergency response scenario where the background radiation environment is unknown. Two study areas in Southern Nevada have been modeled using geologic data, images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), geochemical data, and pre-existing low-resolution aerial surveys from the National Uranium Resource Evaluation (NURE) Survey. Using these data, geospatial areas that are homogeneous in terms of K, U, and Th, referred to as background radiation units, are defined and the gamma ray exposure rate is predicted. The prediction is compared to data collected via detailed aerial survey by the Department of Energy's Remote Sensing Lab - Nellis, allowing for the refinement of the technique. By using geologic units to define radiation background units of exposed bedrock and ASTER visualizations to subdivide and define radiation background units within alluvium, successful models have been produced for Government Wash, north of Lake Mead, and for the western shore of Lake Mohave, east of Searchlight, NV.

  13. The reduced basis method for the electric field integral equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fares, M., E-mail: fares@cerfacs.f; Hesthaven, J.S., E-mail: Jan_Hesthaven@Brown.ed; Maday, Y., E-mail: maday@ann.jussieu.f

    We introduce the reduced basis method (RBM) as an efficient tool for parametrized scattering problems in computational electromagnetics, for problems where field solutions are computed using a standard Boundary Element Method (BEM) for the parametrized electric field integral equation (EFIE). This combination enables an algorithmic cooperation which results in a two-step procedure. The first step consists of a computationally intense assembling of the reduced basis, which needs to be effected only once. In the second step, we compute output functionals of the solution, such as the Radar Cross Section (RCS), independently of the dimension of the discretization space, for many different parameter values in a many-query context at very little cost. Parameters include the wavenumber, the angle of the incident plane wave, and its polarization.
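
    The offline/online split described above can be sketched generically for an affine-parametrized linear system; the small diagonal test matrices below are stand-ins for the dense complex BEM/EFIE operator, and the training parameters are arbitrary choices, not from the paper.

```python
import numpy as np

n = 200
# Hypothetical affine-parametrized system A(mu) = A0 + mu*A1
# (a stand-in for the discretized EFIE operator)
A0 = np.diag(np.linspace(1.0, 2.0, n))
A1 = np.diag(np.linspace(0.0, 1.0, n))
b = np.ones(n)

# Offline step (done once): solve at training parameters, compress snapshots
train = [0.1, 0.5, 1.0, 2.0]
snaps = np.column_stack([np.linalg.solve(A0 + mu * A1, b) for mu in train])
V, _, _ = np.linalg.svd(snaps, full_matrices=False)  # orthonormal reduced basis

# Online step (many-query, cheap): Galerkin projection onto the reduced basis
mu_new = 0.7
Ar = V.T @ (A0 + mu_new * A1) @ V          # tiny 4x4 reduced system
xr = V @ np.linalg.solve(Ar, V.T @ b)      # reduced-basis approximation
err = np.linalg.norm(xr - np.linalg.solve(A0 + mu_new * A1, b))
```

    The online solve involves only the reduced dimension (here 4), independent of n, which is what makes many-query outputs such as RCS sweeps cheap.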

  14. Field Science Ethnography: Methods For Systematic Observation on an Expedition

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    The Haughton-Mars expedition is a multidisciplinary project, exploring an impact crater in an extreme environment to determine how people might live and work on Mars. The expedition seeks to understand and field test Mars facilities, crew roles, operations, and computer tools. I combine an ethnographic approach to establish a baseline understanding of how scientists prefer to live and work when relatively unencumbered, with a participatory design approach of experimenting with procedures and tools in the context of use. This paper focuses on field methods for systematically recording and analyzing the expedition's activities. Systematic photography and time-lapse video are combined with concept mapping to organize and present information. This hybrid approach is generally applicable to the study of modern field expeditions having a dozen or more multidisciplinary participants, spread over a large terrain during multiple field seasons.

  15. Current Status of the Polyamine Research Field

    PubMed Central

    Pegg, Anthony E.; Casero, Robert A.

    2013-01-01

    This chapter provides an overview of the polyamine field and introduces the 32 other chapters that make up this volume. These chapters provide a wide range of methods, advice, and background relevant to studies of the function of polyamines, the regulation of their content, their role in disease, and the therapeutic potential of drugs targeting polyamine content and function. The methodology provided in this new volume will enable laboratories already working in this area to expand their experimental techniques and facilitate the entry of additional workers into this rapidly expanding field. PMID:21318864

  16. Surface Profile and Stress Field Evaluation using Digital Gradient Sensing Method

    DOE PAGES

    Miao, C.; Sundaram, B. M.; Huang, L.; ...

    2016-08-09

    Shape and surface topography evaluation from measured orthogonal slope/gradient data is of considerable engineering significance, since many full-field optical sensors and interferometers readily output accurate data of that kind. This has applications ranging from metrology of optical and electronic elements (lenses, silicon wafers, thin film coatings) to surface profile estimation and wave front and shape reconstruction, to name a few. In this context, a new methodology for surface profile and stress field determination is advanced here, based on a recently introduced non-contact, full-field optical method called digital gradient sensing (DGS), capable of measuring small angular deflections of light rays, coupled with a robust finite-difference-based least-squares integration (HFLI) scheme in the Southwell configuration. The method is demonstrated by evaluating (a) surface profiles of mechanically warped silicon wafers and (b) stress gradients near growing cracks in planar phase objects.

  17. Cross-comparison and evaluation of air pollution field estimation methods

    NASA Astrophysics Data System (ADS)

    Yu, Haofei; Russell, Armistead; Mulholland, James; Odman, Talat; Hu, Yongtao; Chang, Howard H.; Kumar, Naresh

    2018-04-01

    Accurate estimates of human exposure are critical for air pollution health studies, and a variety of methods are currently used to assign pollutant concentrations to populations. Results from these methods may differ substantially, which can affect the outcomes of health impact assessments. Here, we applied 14 methods for developing spatiotemporal air pollutant concentration fields of eight pollutants to the Atlanta, Georgia region. These methods include eight methods relying mostly on air quality observations (CM: central monitor; SA: spatial average; IDW: inverse distance weighting; KRIG: kriging; TESS-D: discontinuous tessellation; TESS-NN: natural neighbor tessellation with interpolation; LUR: land use regression; AOD: downscaled satellite-derived aerosol optical depth), one using the RLINE dispersion model, and five methods using a chemical transport model (CMAQ), with and without using observational data to constrain results. The derived fields were evaluated and compared. Overall, all methods generally perform better in urban than rural areas, and for secondary than primary pollutants. We found the CM and SA methods may be appropriate only for small domains, and for secondary pollutants, though the SA method led to large negative spatial correlations when using data withholding for PM2.5 (spatial correlation coefficient R = -0.81). The TESS-D method was found to have major limitations. Results of the IDW, KRIG and TESS-NN methods are similar. They are found to be better suited for secondary pollutants because of their satisfactory temporal performance (e.g. average temporal R2 > 0.85 for PM2.5 but less than 0.35 for the primary pollutant NO2). In addition, they are suitable only for areas with relatively dense monitoring networks, given their inability to capture spatial concentration variabilities, as indicated by the negative spatial R (lower than -0.2 for PM2.5 when assessed using data withholding). The performance of the LUR and AOD methods was similar to
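
    As a concrete illustration of one of the observation-based methods above, here is a minimal inverse-distance-weighting (IDW) sketch; the monitor locations and readings are hypothetical, and a real application would work in projected coordinates with many more sites.

```python
import numpy as np

def idw(xy_obs, values, xy_grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of point observations
    onto grid points: each estimate is a weighted mean of the monitors,
    with weights 1/d^power (eps regularizes coincident points)."""
    d = np.sqrt(((xy_grid[:, None, :] - xy_obs[None, :, :]) ** 2).sum(-1))
    w = 1.0 / (d + eps) ** power           # closer monitors weigh more
    w /= w.sum(axis=1, keepdims=True)      # normalize weights per grid point
    return w @ values

# Four hypothetical monitor sites with PM2.5 readings (ug/m^3)
sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
obs = np.array([10.0, 12.0, 14.0, 16.0])
grid = np.array([[0.5, 0.5], [0.0, 0.0]])  # one interior point, one on a site
est = idw(sites, obs, grid)
```

    At the equidistant center the estimate is simply the mean of the four readings; at a monitor location the estimate collapses to that monitor's value, which is why IDW (like kriging) is an exact interpolator at the observation sites.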

  18. Statistical simulations of the dust foreground to cosmic microwave background polarization

    NASA Astrophysics Data System (ADS)

    Vansyngel, F.; Boulanger, F.; Ghosh, T.; Wandelt, B.; Aumont, J.; Bracco, A.; Levrier, F.; Martin, P. G.; Montier, L.

    2017-07-01

    The characterization of the dust polarization foreground to the cosmic microwave background (CMB) is a necessary step toward the detection of the B-mode signal associated with primordial gravitational waves. We present a method to simulate maps of polarized dust emission on the sphere that is similar to the approach used for CMB anisotropies. This method builds on the understanding of Galactic polarization stemming from the analysis of Planck data. It relates the dust polarization sky to the structure of the Galactic magnetic field and its coupling with interstellar matter and turbulence. The Galactic magnetic field is modeled as a superposition of a mean uniform field and a Gaussian random (turbulent) component with a power-law power spectrum of exponent αM. The integration along the line of sight carried out to compute Stokes maps is approximated by a sum over a small number of emitting layers with different realizations of the random component of the magnetic field. The model parameters are constrained to fit the power spectra of dust polarization EE, BB, and TE measured using Planck data. We find that the slopes of the E and B power spectra of dust polarization are matched for αM = -2.5, an exponent close to that measured for total dust intensity but larger than the Kolmogorov exponent -11/3. The model allows us to compute multiple realizations of the Stokes Q and U maps for different realizations of the random component of the magnetic field, and to quantify the variance of dust polarization spectra for any given sky area outside of the Galactic plane. The simulations reproduce the scaling relation between the dust polarization power and the mean total dust intensity including the observed dispersion around the mean relation. We also propose a method to carry out multifrequency simulations, including the decorrelation measured recently by Planck, using a given covariance matrix of the polarization maps. These simulations are well suited to optimize
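
    The Gaussian random (turbulent) component with a power-law power spectrum can be sketched on a flat-sky patch as follows; the grid size and normalization are arbitrary, and the actual model works on the sphere with several line-of-sight layers.

```python
import numpy as np

def gaussian_random_field(n, alpha, seed=0):
    """Flat-sky Gaussian random field with power spectrum P(k) ~ k**alpha,
    generated by filtering white noise in Fourier space."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0                       # avoid division by zero at the monopole
    amplitude = k ** (alpha / 2.0)      # field amplitude = sqrt of power
    amplitude[0, 0] = 0.0               # enforce a zero-mean field
    noise = rng.standard_normal((n, n))
    return np.fft.ifft2(np.fft.fft2(noise) * amplitude).real

# One realization with the exponent fitted in the paper, alpha_M = -2.5
turb = gaussian_random_field(128, alpha=-2.5)
```

    Drawing many realizations with different seeds is what allows the variance of the polarization spectra over a given sky area to be quantified.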

  19. Implementation of a flow-dependent background error correlation length scale formulation in the NEMOVAR OSTIA system

    NASA Astrophysics Data System (ADS)

    Fiedler, Emma; Mao, Chongyuan; Good, Simon; Waters, Jennifer; Martin, Matthew

    2017-04-01

    OSTIA is the Met Office's Operational Sea Surface Temperature (SST) and Ice Analysis system, which produces L4 (globally complete, gridded) analyses on a daily basis. Work is currently being undertaken to replace the original OI (Optimal Interpolation) data assimilation scheme with NEMOVAR, a 3D-Var data assimilation method developed for use with the NEMO ocean model. A dual background error correlation length scale formulation is used for SST in OSTIA, as implemented in NEMOVAR. Short and long length scales are combined according to the ratio of the decomposition of the background error variances into short and long spatial correlations. The pre-defined background error variances vary spatially and seasonally, but not on shorter time-scales. If the derived length scales applied to the daily analysis are too long, SST features may be smoothed out. Therefore a flow-dependent component to determining the effective length scale has also been developed. The total horizontal gradient of the background SST field is used to identify regions where the length scale should be shortened. These methods together have led to an improvement in the resolution of SST features compared to the previous OI analysis system, without the introduction of spurious noise. This presentation will show validation results for feature resolution in OSTIA using the OI scheme, the dual length scale NEMOVAR scheme, and the flow-dependent implementation.

  20. Characterizing the Background Corona with SDO/AIA

    NASA Technical Reports Server (NTRS)

    Napier, Kate; Alexander, Caroline; Winebarger, Amy

    2014-01-01

    Characterizing the nature of the solar coronal background would enable scientists to more accurately determine plasma parameters, and may lead to a better understanding of the coronal heating problem. Because scientists study the 3D structure of the Sun in 2D, any line-of-sight includes both foreground and background material, and thus, the issue of background subtraction arises. By investigating the intensity values in and around an active region, using multiple wavelengths collected from the Atmospheric Imaging Assembly (AIA) on the Solar Dynamics Observatory (SDO) over an eight-hour period, this project aims to characterize the background as smooth or structured. Different methods were employed to measure the true coronal background and create minimum intensity images. These were then investigated for the presence of structure. The background images created were found to contain long-lived structures, including coronal loops, that were still present in all of the wavelengths, 131, 171, 193, 211, and 335 A. The intensity profiles across the active region indicate that the background is much more structured than previously thought.
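
    The minimum-intensity background image described above can be sketched as a per-pixel minimum over the time-lapse stack; the frame count, cadence, and image dimensions below are hypothetical stand-ins, not the actual AIA data.

```python
import numpy as np

def min_intensity_background(frames):
    """Per-pixel minimum over a time series of images: a simple coronal
    background estimate, since transient bright features are rejected
    as long as they move or fade during the observation window."""
    return np.min(frames, axis=0)

# Hypothetical 8-hour stack: flat background plus a drifting bright loop
rng = np.random.default_rng(1)
stack = 100 + rng.uniform(0, 5, size=(96, 64, 64))  # 96 frames, 64x64 pixels
for t in range(96):
    stack[t, 30, (t // 2) % 64] += 500.0            # moving bright feature

background = min_intensity_background(stack)        # bright feature removed
```

    Note the limitation this abstract reports: a feature that stays fixed for the whole window (a long-lived loop) survives the minimum and remains in the "background" image.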

  1. Regional and local background ozone in Houston during Texas Air Quality Study 2006

    NASA Astrophysics Data System (ADS)

    Langford, A. O.; Senff, C. J.; Banta, R. M.; Hardesty, R. M.; Alvarez, R. J.; Sandberg, Scott P.; Darby, Lisa S.

    2009-04-01

    Principal Component Analysis (PCA) is used to isolate the common modes of behavior in the daily maximum 8-h average ozone mixing ratios measured at 30 Continuous Ambient Monitoring Stations in the Houston-Galveston-Brazoria area during the Second Texas Air Quality Study field intensive (1 August to 15 October 2006). Three principal components suffice to explain 93% of the total variance. Nearly 84% is explained by the first component, which is attributed to changes in the "regional background" determined primarily by the large-scale winds. The second component (6%) is attributed to changes in the "local background," that is, ozone photochemically produced in the Houston area and spatially and temporally averaged by local circulations. Finally, the third component (3.5%) is attributed to short-lived plumes containing high ozone originating from industrial areas along Galveston Bay and the Houston Ship Channel. Regional background ozone concentrations derived using the first component compare well with mean ozone concentrations measured above the Gulf of Mexico by the tunable profiler for aerosols and ozone lidar aboard the NOAA Twin Otter. The PCA regional background values also agree well with background values derived using the lowest daily 8-h maximum method of Nielsen-Gammon et al. (2005), provided the Galveston Airport data (C34) are omitted from that analysis. The differences found when Galveston is included are caused by the sea breeze, which depresses ozone at Galveston relative to sites further inland. PCA removes the effects of this and other local circulations to obtain a regional background value representative of the greater Houston area.
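
    The station-matrix PCA described above can be sketched with synthetic data; the numbers (76 days, 30 stations, variance levels) are illustrative stand-ins for the TexAQS-II monitoring data, not the actual measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_sites = 76, 30   # roughly the study period and station count

# Synthetic daily max 8-h ozone: a shared regional signal plus site noise
regional = 40 + 15 * rng.standard_normal(n_days)       # ppb, common to all sites
local = 3 * rng.standard_normal((n_days, n_sites))     # site-specific variability
ozone = regional[:, None] + local

# PCA via SVD of the mean-centered days-by-sites matrix
X = ozone - ozone.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / (s**2).sum()        # fraction of variance per component

# The leading component recovers the shared "regional background" mode
pc1 = X @ Vt[0]
```

    Because the regional mode dominates the site-to-site covariance, the first component absorbs most of the variance, mirroring the ~84% figure reported for the real network.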

  2. Investigating the effect of background magnetic field on the resonance condition between EMIC waves and relativistic electrons

    NASA Astrophysics Data System (ADS)

    Woodger, L. A.; Millan, R. M.

    2017-12-01

    Balloon-borne x-ray detectors observe bremsstrahlung from precipitating electrons, offering a unique opportunity to observe sustained precipitation from a quasi-geosynchronous platform. Recent balloon observations of duskside relativistic electron precipitation (REP) on BARREL confirm that Electro-Magnetic Ion Cyclotron (EMIC) waves cause electron precipitation [e.g. Li et al., 2014]. However, BARREL observations show precipitation does not occur everywhere that waves are observed; precipitation is confined to narrow magnetic local time (MLT) regions in the duskside magnetosphere [Blum et al., 2015]. Furthermore, modulation of relativistic electron precipitation on Ultra Low Frequency (ULF) wave (f < 20 mHz) timescales has been reported in several events from balloon X-ray observations [Foat et al., 1998; Millan et al., 2002]. Wave-particle interaction between relativistic electrons and EMIC waves is a highly debated loss process contributing to the dynamics of Earth's radiation belts. We present REP from balloon x-ray observations in the context of precipitation driven by EMIC waves. We investigate how background magnetic field strength could drive the localization, distribution, and temporal structure of the precipitating electrons.
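
    For context, the relativistic cyclotron resonance condition underlying this wave-particle interaction can be written as (one common convention; this is standard background, not taken from the abstract):

```latex
\omega - k_{\parallel} v_{\parallel} = -\frac{\Omega_{ce}}{\gamma},
\qquad \Omega_{ce} = \frac{e B}{m_e},
```

    where ω and k_∥ are the wave frequency and parallel wavenumber, v_∥ and γ the electron parallel velocity and Lorentz factor, and Ω_ce the electron cyclotron frequency. Since Ω_ce is proportional to the background field strength B, variations in B change the minimum electron energy that can resonate, which is one way the background field could localize the precipitation discussed above.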

  3. Relation between residential magnetic fields, light-at-night, and nocturnal urine melatonin levels in women: Volume 1 -- Background and purpose, methods, results, discussion. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaune, W.; Davis, S.; Stevens, R.

    Scientists have postulated a link between exposure to magnetic fields and reduced blood melatonin levels. This EPRI study was designed to supplement a National Cancer Institute study (NCI-BC) of magnetic fields, light-at-night, and the risk of breast cancer. By expanding the exposure assessment of the NCI-BC and collecting data on urine melatonin levels, this project provides new insight into a possible magnetic field-melatonin link. It has been proposed that exposure to 60-Hz (power frequency) magnetic fields may increase the risk of breast cancer by suppressing the normal nocturnal rise in melatonin production in the pineal gland. It remains unknown whether the human pineal gland is reproducibly responsive or sensitive to magnetic field exposure, and whether such exposures could alter elements of the endogenous hormonal environment in women that might be important in the etiology of breast cancer. The objective of this research was to investigate whether exposure to power-frequency magnetic fields and/or light-at-night is associated with levels of the primary urinary melatonin metabolite in women without a history of breast cancer.

  4. Study on transport properties of silicene monolayer under external field using NEGF method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Syaputra, Marhamni, E-mail: marhamni@students.itb.ac.id; Wella, Sasfan Arman; Wungu, Triati Dewi Kencana

    2015-09-30

    We investigate the current-voltage (I-V) characteristics of a pristine monolayer silicene using the non-equilibrium Green's function (NEGF) method combined with density functional theory (DFT). This method succeeds in relating the I-V behaviour of silicene to its electronic characteristics, such as the density of states. An external field applied perpendicular to the silicene monolayer increases the current. Under a 0.2 eV external field, the current reaches its maximum peak at Vb = 0.3 eV, an increase of about 60% over the zero-field case.

  5. Detection of cosmic microwave background structure in a second field with the Cosmic Anisotropy Telescope

    NASA Astrophysics Data System (ADS)

    Baker, Joanne C.; Grainge, Keith; Hobson, M. P.; Jones, Michael E.; Kneissl, R.; Lasenby, A. N.; O'Sullivan, C. M. M.; Pooley, Guy; Rocha, G.; Saunders, Richard; Scott, P. F.; Waldram, E. M.

    1999-10-01

    We describe observations at frequencies near 15 GHz of the second 2x2 deg^2 field imaged with the Cambridge Cosmic Anisotropy Telescope (CAT). After the removal of discrete radio sources, structure is detected in the images on characteristic scales of about half a degree, corresponding to spherical harmonic multipoles in the range l ~ 330-680. A Bayesian analysis confirms that the signal arises predominantly from the cosmic microwave background (CMB) radiation for multipoles in the lower half of this range; the average broad-band power in a bin with centroid l = 422 (θ ~ 51 arcmin) is estimated to be ΔT/T = 2.1 (+0.4, -0.5) x 10^-5. For multipoles centred on l = 615 (θ ~ 35 arcmin), we find contamination from Galactic emission is significant, and constrain the CMB contribution to the measured power in this bin to be ΔT/T < 2.0 x 10^-5 (1σ upper limit). These new results are consistent with the first detection made by CAT in a completely different area of sky. Together with data from other experiments, this new CAT detection adds weight to earlier evidence from CAT for a downturn in the CMB power spectrum on scales smaller than 1 deg. Improved limits on the values of H0 and Ω are determined using the new CAT data.

  6. Method of synthesizing small-diameter carbon nanotubes with electron field emission properties

    NASA Technical Reports Server (NTRS)

    Liu, Jie (Inventor); Du, Chunsheng (Inventor); Qian, Cheng (Inventor); Gao, Bo (Inventor); Qiu, Qi (Inventor); Zhou, Otto Z. (Inventor)

    2009-01-01

    Carbon nanotube materials having an outer diameter less than 10 nm and fewer than ten walls are disclosed. Also disclosed is an electron field emission device including a substrate, an optional adhesion-promoting layer, and a layer of electron field emission material. The electron field emission material includes a carbon nanotube having from two to ten concentric graphene shells per tube, an outer diameter from 2 to 8 nm, and a nanotube length greater than 0.1 microns. One method to fabricate carbon nanotubes includes the steps of (a) producing a catalyst containing Fe and Mo supported on MgO powder, (b) using a mixture of hydrogen and a carbon-containing gas as precursors, and (c) heating the catalyst to a temperature above 950 °C to produce a carbon nanotube. Another method, of fabricating an electron field emission cathode, includes the steps of (a) synthesizing electron field emission materials containing carbon nanotubes with from two to ten concentric graphene shells per tube, an outer diameter of from 2 to 8 nm, and a length greater than 0.1 microns, (b) dispersing the electron field emission material in a suitable solvent, (c) depositing the electron field emission materials onto a substrate, and (d) annealing the substrate.

  7. Light scalars on cosmological backgrounds

    NASA Astrophysics Data System (ADS)

    Markkanen, Tommi

    2018-01-01

    We study the behaviour of a light quartically self-interacting scalar field φ on curved backgrounds that may be described with the cosmological equation-of-state parameter w. At leading order in the non-perturbative 2PI expansion we find a general formula for the variance ⟨φ̂²⟩ and show for several previously unexplored cases, including matter domination and kination, that the curvature of space can induce a significant excitation of the field. We discuss how the generation of a non-zero variance for w ≠ -1 can be understood as a process of self-regulation of the infrared divergences, very similar to what is known to occur in de Sitter space. To conclude, the appearance of an effective mass due to self-interaction is generic for a light scalar in curved space and can have important implications for reheating, vacuum stability and dark matter generation.

  8. An efficient method for the fusion of light field refocused images

    NASA Astrophysics Data System (ADS)

    Wang, Yingqian; Yang, Jungang; Xiao, Chao; An, Wei

    2018-04-01

    Light field cameras have drawn much attention due to the advantage of post-capture adjustments such as refocusing after exposure. The depth of field in refocused images is always shallow because of the large equivalent aperture. As a result, a large number of multi-focus images are obtained and an all-in-focus image is demanded. Most multi-focus image fusion algorithms are not designed for large numbers of source images, and the traditional DWT-based fusion approach has serious problems when dealing with many multi-focus images, causing color distortion and ringing artifacts. To solve this problem, this paper proposes an efficient multi-focus image fusion method based on the stationary wavelet transform (SWT), which can deal with a large quantity of multi-focus images with shallow depths of field. We compare the SWT-based approach with the DWT-based approach on various occasions, and the results demonstrate that the proposed method performs much better both visually and quantitatively.
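    The core idea — fuse in an undecimated (shift-invariant) wavelet domain, where coefficients stay spatially aligned across all source images — can be illustrated with a toy sketch. The snippet below is not the authors' implementation: it hand-rolls a single-level undecimated Haar transform as a stand-in for a general SWT, and assumes the common max-absolute-detail fusion rule.

```python
import numpy as np

def swt_haar_2d(img):
    """Single-level undecimated Haar transform: LL, LH, HL, HH, all same size as img."""
    def split(x, axis):
        s = np.roll(x, -1, axis=axis)
        return (x + s) / 2, (x - s) / 2   # lowpass, highpass (no decimation)
    lo, hi = split(img, 0)                # filter along rows
    ll, lh = split(lo, 1)                 # then along columns
    hl, hh = split(hi, 1)
    return ll, lh, hl, hh

def iswt_haar_2d(ll, lh, hl, hh):
    """Exact inverse of swt_haar_2d (lowpass + highpass reconstructs the signal)."""
    return (ll + lh) + (hl + hh)

def fuse(img_a, img_b):
    """Fuse two registered multi-focus images: average LL, keep max-abs details."""
    ca, cb = swt_haar_2d(img_a), swt_haar_2d(img_b)
    ll = (ca[0] + cb[0]) / 2
    details = [np.where(np.abs(da) >= np.abs(db), da, db)
               for da, db in zip(ca[1:], cb[1:])]
    return iswt_haar_2d(ll, *details)
```

    Because the transform is undecimated, the per-pixel max-abs rule can be applied to any number of source images in turn without the shift sensitivity that plagues decimated DWT fusion.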

  9. Family Background and Educational Path of Italian Graduates

    ERIC Educational Resources Information Center

    Vergolini, Loris; Vlach, Eleonora

    2017-01-01

    In this paper, we analyse social inequalities along the horizontal dimension of education in Italy. More precisely, we focus on the role of family background in completing specific fields of study at both secondary and tertiary levels of education. To mitigate the limitations of the traditional sequential model, we construct a typology of…

  10. General Anisotropy Identification of Paperboard with Virtual Fields Method

    Treesearch

    J.M. Considine; F. Pierron; K.T. Turner; D.W. Vahey

    2014-01-01

    This work extends previous efforts in plate bending of Virtual Fields Method (VFM) parameter identification to include a general 2-D anisotropic material. Such an extension was needed for instances in which material principal directions are unknown or when specimen orientation is not aligned with material principal directions. A new fixture with a multiaxial force...

  11. Designs of Empirical Evaluations of Nonexperimental Methods in Field Settings.

    PubMed

    Wong, Vivian C; Steiner, Peter M

    2018-01-01

    Over the last three decades, a research design has emerged to evaluate the performance of nonexperimental (NE) designs and design features in field settings. It is called the within-study comparison (WSC) approach or the design replication study. In the traditional WSC design, treatment effects from a randomized experiment are compared to those produced by an NE approach that shares the same target population. The nonexperiment may be a quasi-experimental design, such as a regression-discontinuity or an interrupted time-series design, or an observational study approach that includes matching methods, standard regression adjustments, and difference-in-differences methods. The goals of the WSC are to determine whether the nonexperiment can replicate results from a randomized experiment (which provides the causal benchmark estimate), and the contexts and conditions under which these methods work in practice. This article presents a coherent theory of the design and implementation of WSCs for evaluating NE methods. It introduces and identifies the multiple purposes of WSCs, required design components, common threats to validity, design variants, and causal estimands of interest in WSCs. It highlights two general approaches for empirical evaluations of methods in field settings, WSC designs with independent and dependent benchmark and NE arms. This article highlights advantages and disadvantages for each approach, and conditions and contexts under which each approach is optimal for addressing methodological questions.

  12. Higgs effective potential in a perturbed Robertson-Walker background

    NASA Astrophysics Data System (ADS)

    Maroto, Antonio L.; Prada, Francisco

    2014-12-01

    We calculate the one-loop effective potential of a scalar field in a Robertson-Walker background with scalar metric perturbations. A complete set of orthonormal solutions of the perturbed equations is obtained by using the adiabatic approximation for comoving observers. After analyzing the problem of renormalization in inhomogeneous backgrounds, we get the explicit contribution of metric perturbations to the effective potential. We apply these results to the Standard Model Higgs field and evaluate the effects of metric perturbations on the Higgs mass and on its vacuum expectation value. Space-time variations are found, which are proportional to the gravitational slip parameter, with a typical amplitude of the order of Δφ/φ ≃ 10^-11 on cosmological scales. We also discuss possible astrophysical signatures in the Solar System and in the Milky Way that could open new possibilities to explore the symmetry breaking sector of the electroweak interactions.

  13. Background-Oriented Schlieren (BOS) for Scramjet Inlet-isolator Investigation

    NASA Astrophysics Data System (ADS)

    Che Idris, Azam; Rashdan Saad, Mohd; Hing Lo, Kin; Kontis, Konstantinos

    2018-05-01

    The background-oriented schlieren (BOS) technique is a recently developed non-intrusive flow diagnostic method whose capabilities have yet to be fully explored. In this paper, the BOS technique is applied to investigate the general flow field characteristics inside a generic scramjet inlet-isolator at Mach 5. The difficulty of balancing measurement sensitivity against image focusing over the measurement area is demonstrated, as are the differences between direct cross-correlation (DCC) and fast Fourier transform (FFT) raw-data processing algorithms. As an exploratory study of BOS capability, this paper finds that BOS is simple yet robust enough to visualize the complex flow in a scramjet inlet in hypersonic flow. In this case, however, the quantitative data can be strongly affected by three-dimensionality, which introduces significant errors into the density values.

  14. A Spatial-frequency Method for Analyzing Antenna-to-Probe Interactions in Near-field Antenna Measurements.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brock, Billy C.

    The measurement of the radiation characteristics of an antenna on a near-field range requires that the antenna under test be located very close to the near-field probe. Although the direct coupling is utilized for characterizing the near field, this close proximity also presents the opportunity for significant undesired interactions (for example, reflections) to occur between the antenna and the near-field probe. When uncompensated, these additional interactions will introduce error into the measurement, increasing the uncertainty in the final gain pattern obtained through the near-field-to-far-field transformation. Quantifying this gain-uncertainty contribution requires quantifying the various additional interactions. A method incorporating spatial-frequency analysis is described which allows the dominant interaction contributions to be easily identified and quantified. In addition to identifying the additional antenna-to-probe interactions, the method also allows identification and quantification of interactions with other nearby objects within the measurement room. Because the method is a spatial-frequency method, wide-bandwidth data is not required, and it can be applied even when data is available at only a single temporal frequency. This feature ensures that the method can be applied to narrow-band antennas, where a similar time-domain analysis would not be possible.

  15. A novel method for unsteady flow field segmentation based on stochastic similarity of direction

    NASA Astrophysics Data System (ADS)

    Omata, Noriyasu; Shirayama, Susumu

    2018-04-01

    Recent developments in fluid dynamics research have opened up the possibility for the detailed quantitative understanding of unsteady flow fields. However, the visualization techniques currently in use generally provide only qualitative insights. A method for dividing the flow field into physically relevant regions of interest can help researchers quantify unsteady fluid behaviors. Most methods at present compare the trajectories of virtual Lagrangian particles. The time-invariant features of an unsteady flow are also frequently of interest, but the Lagrangian specification only reveals time-variant features. To address these challenges, we propose a novel method for the time-invariant spatial segmentation of an unsteady flow field. This segmentation method does not require Lagrangian particle tracking but instead quantitatively compares the stochastic models of the direction of the flow at each observed point. The proposed method is validated with several clustering tests for 3D flows past a sphere. Results show that the proposed method reveals the time-invariant, physically relevant structures of an unsteady flow.
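    As a rough illustration of the idea — characterize each observed point by the statistics of its flow direction over time, then group points with similar statistics — one might compare per-point direction histograms. This is a simplified sketch with assumed choices (8-bin angle histograms and a tiny k-means with deterministic first-rows initialization), not the authors' stochastic model:

```python
import numpy as np

def direction_histograms(u, v, bins=8):
    """u, v: (T, N) velocity components at N points over T time steps.
    Returns (N, bins): per-point histogram of flow direction over time."""
    angles = np.arctan2(v, u)                       # direction in [-pi, pi]
    edges = np.linspace(-np.pi, np.pi, bins + 1)
    hists = np.stack([np.histogram(angles[:, i], bins=edges)[0]
                      for i in range(angles.shape[1])]).astype(float)
    return hists / angles.shape[0]                  # normalize to distributions

def kmeans(X, k, iters=50):
    """Minimal k-means; initializes centers from the first k rows (a simplification)."""
    centers = X[:k].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = np.argmin(d, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

    The segmentation is time-invariant by construction: each point is summarized by its direction distribution over the whole observation window, with no Lagrangian particle tracking.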

  16. Background of SAM atom-fraction profiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernst, Frank

    Atom-fraction profiles acquired by SAM (scanning Auger microprobe) have important applications, e.g. in the context of alloy surface engineering by infusion of carbon or nitrogen through the alloy surface. However, such profiles often exhibit an artifact in the form of a background with a level that anti-correlates with the local atom fraction. This article presents a theory explaining this phenomenon as a consequence of the way in which random noise in the spectrum propagates into the discretized differentiated spectrum that is used for quantification. The resulting model of "energy channel statistics" leads to a useful semi-quantitative background reduction procedure, which is validated by applying it to simulated data. Subsequently, the procedure is applied to an example of experimental SAM data. The analysis leads to conclusions regarding optimum experimental acquisition conditions. The proposed method of background reduction is based on general principles and should be useful for a broad variety of applications. Highlights: • Atom-fraction-depth profiles of carbon measured by scanning Auger microprobe. • Strong background that varies with the local carbon concentration. • Correction is needed, e.g., for quantitative comparison with simulations. • A quantitative theory explains the background. • Provides a background-removal strategy and practical advice for acquisition.

  17. Process system and method for fabricating submicron field emission cathodes

    DOEpatents

    Jankowski, Alan F.; Hayes, Jeffrey P.

    1998-01-01

    A process method and system for making field emission cathodes is disclosed. The deposition source divergence is controlled to produce field emission cathodes with height-to-base aspect ratios that are uniform over large substrate surface areas while using very short source-to-substrate distances. The rate of hole closure is controlled from the cone source. The substrate surface is coated in well-defined increments. The deposition source is apertured to coat pixel areas on the substrate. The entire substrate is coated using a manipulator to incrementally move the whole substrate surface past the deposition source. Either collimated sputtering or evaporative deposition sources can be used. The position of the aperture and its size and shape are used to control the field emission cathode size and shape.

  18. Non-Gaussian microwave background fluctuations from nonlinear gravitational effects

    NASA Technical Reports Server (NTRS)

    Salopek, D. S.; Kunstatter, G. (Editor)

    1991-01-01

    Whether the statistics of primordial fluctuations for structure formation are Gaussian or otherwise may be determined if the Cosmic Background Explorer (COBE) Satellite makes a detection of the cosmic microwave-background temperature anisotropy ΔT_CMB/T_CMB. Non-Gaussian fluctuations may be generated in the chaotic inflationary model if two scalar fields interact nonlinearly with gravity. Theoretical contour maps are calculated for the resulting Sachs-Wolfe temperature fluctuations at large angular scales (greater than 3 degrees). In the long-wavelength approximation, one can confidently determine the nonlinear evolution of quantum noise with gravity during the inflationary epoch because: (1) different spatial points are no longer in causal contact; and (2) quantum gravity corrections are typically small -- it is sufficient to model the system using classical random fields. If the potential for two scalar fields V(φ1, φ2) possesses a sharp feature, then non-Gaussian fluctuations may arise. An explicit model is given where cold spots in ΔT_CMB/T_CMB maps are suppressed as compared to the Gaussian case. The fluctuations are essentially scale-invariant.

  19. A sparse reconstruction method for the estimation of multi-resolution emission fields via atmospheric inversion

    DOE PAGES

    Ray, J.; Lee, J.; Yadav, V.; ...

    2015-04-29

    Atmospheric inversions are frequently used to estimate fluxes of atmospheric greenhouse gases (e.g., biospheric CO 2 flux fields) at Earth's surface. These inversions typically assume that flux departures from a prior model are spatially smoothly varying, which are then modeled using a multi-variate Gaussian. When the field being estimated is spatially rough, multi-variate Gaussian models are difficult to construct and a wavelet-based field model may be more suitable. Unfortunately, such models are very high dimensional and are most conveniently used when the estimation method can simultaneously perform data-driven model simplification (removal of model parameters that cannot be reliably estimated) and fitting. Such sparse reconstruction methods are typically not used in atmospheric inversions. In this work, we devise a sparse reconstruction method, and illustrate it in an idealized atmospheric inversion problem for the estimation of fossil fuel CO 2 (ffCO 2) emissions in the lower 48 states of the USA. Our new method is based on stagewise orthogonal matching pursuit (StOMP), a method used to reconstruct compressively sensed images. Our adaptations bestow three properties to the sparse reconstruction procedure which are useful in atmospheric inversions. We have modified StOMP to incorporate prior information on the emission field being estimated and to enforce non-negativity on the estimated field. Finally, though based on wavelets, our method allows for the estimation of fields in non-rectangular geometries, e.g., emission fields inside geographical and political boundaries. Our idealized inversions use a recently developed multi-resolution (i.e., wavelet-based) random field model developed for ffCO 2 emissions and synthetic observations of ffCO 2 concentrations from a limited set of measurement sites. We find that our method for limiting the estimated field within an irregularly shaped region is about a factor of 10 faster than conventional approaches.
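    For readers unfamiliar with StOMP, the basic stagewise selection loop (before the paper's adaptations for prior information, non-negativity, and irregular geometries) looks roughly like the sketch below; the threshold rule and stage count are illustrative assumptions.

```python
import numpy as np

def stomp(A, y, n_stages=10, theta=2.0):
    """Stagewise orthogonal matching pursuit (basic sketch).

    A: (m, n) sensing matrix with roughly unit-norm columns; y: (m,) data.
    Each stage admits ALL columns whose correlation with the residual exceeds
    a noise-scaled threshold, then refits the active set by least squares.
    """
    m, n = A.shape
    support = np.zeros(n, dtype=bool)
    x = np.zeros(n)
    r = y.copy()
    for _ in range(n_stages):
        c = A.T @ r                                   # matched-filter correlations
        thresh = theta * np.linalg.norm(r) / np.sqrt(m)
        new = np.abs(c) > thresh
        if not new.any():
            break
        support |= new
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = x_s
        r = y - A @ x
        if np.linalg.norm(r) < 1e-10:
            break
    return x
```

    Admitting many columns per stage is what makes StOMP much faster than plain OMP on large wavelet models; spuriously admitted columns are harmless in the noiseless case because the least-squares refit assigns them (near-)zero coefficients.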

  20. A sparse reconstruction method for the estimation of multi-resolution emission fields via atmospheric inversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ray, J.; Lee, J.; Yadav, V.

    Atmospheric inversions are frequently used to estimate fluxes of atmospheric greenhouse gases (e.g., biospheric CO 2 flux fields) at Earth's surface. These inversions typically assume that flux departures from a prior model are spatially smoothly varying, which are then modeled using a multi-variate Gaussian. When the field being estimated is spatially rough, multi-variate Gaussian models are difficult to construct and a wavelet-based field model may be more suitable. Unfortunately, such models are very high dimensional and are most conveniently used when the estimation method can simultaneously perform data-driven model simplification (removal of model parameters that cannot be reliably estimated) and fitting. Such sparse reconstruction methods are typically not used in atmospheric inversions. In this work, we devise a sparse reconstruction method, and illustrate it in an idealized atmospheric inversion problem for the estimation of fossil fuel CO 2 (ffCO 2) emissions in the lower 48 states of the USA. Our new method is based on stagewise orthogonal matching pursuit (StOMP), a method used to reconstruct compressively sensed images. Our adaptations bestow three properties to the sparse reconstruction procedure which are useful in atmospheric inversions. We have modified StOMP to incorporate prior information on the emission field being estimated and to enforce non-negativity on the estimated field. Finally, though based on wavelets, our method allows for the estimation of fields in non-rectangular geometries, e.g., emission fields inside geographical and political boundaries. Our idealized inversions use a recently developed multi-resolution (i.e., wavelet-based) random field model developed for ffCO 2 emissions and synthetic observations of ffCO 2 concentrations from a limited set of measurement sites. We find that our method for limiting the estimated field within an irregularly shaped region is about a factor of 10 faster than conventional approaches.

  1. Stable solitary waves in super dense plasmas at external magnetic fields

    NASA Astrophysics Data System (ADS)

    Ghaani, Azam; Javidan, Kurosh; Sarbishaei, Mohsen

    2015-07-01

    Propagation of localized waves in a Fermi-Dirac distributed super dense matter in the presence of strong external magnetic fields is studied using the reductive perturbation method. We have shown that stable solitons can be created in such non-relativistic fluids in the presence of an external magnetic field. Such solitary waves are governed by the Zakharov-Kuznetsov (ZK) equation. Properties of solitonic solutions are studied in media with different values of background mass density and strength of magnetic field.

  2. Method for confining the magnetic field of the cross-tail current inside the magnetopause

    NASA Technical Reports Server (NTRS)

    Sotirelis, T.; Tsyganenko, N. A.; Stern, D. P.

    1994-01-01

    A method is presented for analytically representing the magnetic field due to the cross-tail current and its closure on the magnetopause. It is an extension of a method used by Tsyganenko (1989b) to confine the dipole field inside an ellipsoidal magnetopause using a scalar potential. Given a model of the cross-tail current, the implied net magnetic field is obtained by adding to the cross-tail current field a potential field B = - del gamma, which makes all field lines divide into two disjoint groups, separated by the magnetopause (i.e., the combined field is made to have zero normal component with the magnetopause). The magnetopause is assumed to be an ellipsoid of revolution (a prolate spheroid) as an approximation to observations (Sibeck et al., 1991). This assumption permits the potential gamma to be expressed in spheroidal coordinates, expanded in spheroidal harmonics and its terms evaluated by performing inversion integrals. Finally, the field outside the magnetopause is replaced by zero, resulting in a consistent current closure along the magnetopause. This procedure can also be used to confine the modeled field of any other interior magnetic source, though the model current must always flow in closed circuits. The method is demonstrated on the T87 cross-tail current, examples illustrate the effect of changing the size and shape of the prescribed magnetopause and a comparison is made to an independent numerical scheme based on the Biot-Savart equation.

  3. Atmospheric Blocking and Intercomparison of Objective Detection Methods: Flow Field Characteristics

    NASA Astrophysics Data System (ADS)

    Pinheiro, M. C.; Ullrich, P. A.; Grotjahn, R.

    2017-12-01

    A number of objective methods for identifying and quantifying atmospheric blocking have been developed over the last couple of decades, but there is variable consensus on the resultant blocking climatology. This project examines blocking climatologies as produced by three different methods: two anomaly-based methods, and the geopotential height gradient method of Tibaldi and Molteni (1990). The results highlight the differences in blocking that arise from the choice of detection method, with emphasis on the physical characteristics of the flow field and the subsequent effects on the blocking patterns that emerge.

  4. Error model of geomagnetic-field measurement and extended Kalman-filter based compensation method

    PubMed Central

    Ge, Zhilei; Liu, Suyun; Li, Guopeng; Huang, Yan; Wang, Yanni

    2017-01-01

    The real-time accurate measurement of the geomagnetic field is the foundation for achieving high-precision geomagnetic navigation. The existing geomagnetic-field measurement models are essentially simplified models that cannot accurately describe the sources of measurement error. This paper, on the basis of systematically analyzing the sources of geomagnetic-field measurement error, built a complete measurement model, into which the previously unconsidered geomagnetic daily-variation field was introduced. This paper proposed an extended Kalman-filter based compensation method, which allows a large amount of measurement data to be used in estimating parameters to obtain the optimal solution in the statistical sense. The experimental results showed that the compensated strength of the geomagnetic field remained close to the real value and the measurement error was basically controlled within 5 nT. In addition, this compensation method has strong applicability due to its easy data collection and its ability to remove the dependence on a high-precision measurement instrument. PMID:28445508
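    The flavor of an extended Kalman-filter compensation can be sketched on a deliberately reduced error model: a constant hard-iron bias only, with the known local field strength as the scalar measurement. This is an illustrative assumption, not the paper's full model (which also treats the daily-variation field and other error sources):

```python
import numpy as np

def ekf_bias_estimate(readings, field_strength, r_var=25.0):
    """EKF estimate of a constant magnetometer bias b (simplified error model).

    Each vector reading m = B + b + noise, where the true field B rotates with
    sensor orientation but has known magnitude F, so h(b) = ||m - b|| should
    equal F. The measurement is nonlinear in b, hence the extended filter.
    """
    b = np.zeros(3)                          # state: bias estimate
    P = np.eye(3) * 1e4                      # large initial uncertainty
    for m in readings:
        # prediction is trivial: the bias is modeled as constant
        v = m - b
        z_pred = np.linalg.norm(v)
        H = (-v / z_pred).reshape(1, 3)      # Jacobian of h(b) = ||m - b||
        S = float(H @ P @ H.T) + r_var       # innovation variance
        K = (P @ H.T) / S                    # Kalman gain, shape (3, 1)
        b = b + (K * (field_strength - z_pred)).ravel()
        P = (np.eye(3) - K @ H) @ P
    return b
```

    Because the estimate is refined recursively over the whole data stream, a large amount of measurement data can be folded in without a high-precision reference instrument, mirroring the applicability argument in the abstract.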

  5. NMR system and method having a permanent magnet providing a rotating magnetic field

    DOEpatents

    Schlueter, Ross D [Berkeley, CA; Budinger, Thomas F [Berkeley, CA

    2009-05-19

    Disclosed herein are systems and methods for generating a rotating magnetic field. The rotating magnetic field can be used to obtain rotating-field NMR spectra, such as magic angle spinning spectra, without having to physically rotate the sample. This result allows magic angle spinning NMR to be conducted on biological samples such as live animals, including humans.

  6. Background suppression of infrared small target image based on inter-frame registration

    NASA Astrophysics Data System (ADS)

    Ye, Xiubo; Xue, Bindang

    2018-04-01

    We propose a multi-frame background suppression method for remote infrared small target detection. Inter-frame information is necessary when heavy background clutter makes it difficult to distinguish real targets from false alarms. A registration procedure based on point matching in image patches is used to compensate for the local deformation of the background. The target can then be separated by background subtraction. Experiments show our method serves as an effective preliminary step for target detection.
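    A minimal version of the register-then-subtract idea can be sketched with integer-pixel phase correlation standing in for the paper's patch-based point matching (which additionally handles local, not just global, deformation):

```python
import numpy as np

def register_translation(ref, frame):
    """Integer-pixel shift of `frame` relative to `ref` via phase correlation."""
    R = np.conj(np.fft.fft2(ref)) * np.fft.fft2(frame)
    corr = np.fft.ifft2(R / (np.abs(R) + 1e-12)).real   # normalized cross-power
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2:                          # map to signed shifts
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

def suppress_background(prev_frame, frame):
    """Shift the previous frame onto the current one, then subtract it.

    The (quasi-static) background cancels, leaving moving small targets."""
    dy, dx = register_translation(prev_frame, frame)
    aligned = np.roll(prev_frame, (dy, dx), axis=(0, 1))
    return frame - aligned
```

    After suppression, a simple threshold on the residual image is enough to pick out candidate targets for the subsequent detection stage.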

  7. Determination of the maximum-depth to potential field sources by a maximum structural index method

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Florio, G.

    2013-01-01

    A simple and fast determination of the limiting depth to the sources may represent a significant help to data interpretation. To this end we explore the possibility of determining those source parameters shared by all the classes of models fitting the data. One approach is to determine the maximum depth-to-source compatible with the measured data, for example by using the well-known Bott-Smith rules. These rules involve only the knowledge of the field and its horizontal gradient maxima, and are independent of the density contrast. Thanks to the direct relationship between the structural index and the depth to sources, we work out a simple and fast strategy to obtain the maximum depth by using semi-automated methods, such as Euler deconvolution or the depth-from-extreme-points (DEXP) method. The proposed method consists of estimating the maximum depth as the one obtained for the highest allowable value of the structural index (Nmax). Nmax may be easily determined, since it depends only on the dimensionality of the problem (2D/3D) and on the nature of the analyzed field (e.g., gravity field or magnetic field). We tested our approach on synthetic models against the results obtained by the classical Bott-Smith formulas, and the results are very similar, confirming the validity of this method. However, while the Bott-Smith formulas are restricted to the gravity field only, our method is applicable also to the magnetic field and to any derivative of the gravity and magnetic field. Our method yields a useful criterion to assess the source model based on the (∂f/∂x)max/fmax ratio. The usefulness of the method in real cases is demonstrated for a salt wall in the Mississippi basin, where the estimation of the maximum depth agrees with the seismic information.

  8. Method and system for entering data within a flight plan entry field

    NASA Technical Reports Server (NTRS)

    Gibbs, Michael J. (Inventor); Van Omen, Debi (Inventor); Adams, Michael B. (Inventor); Chase, Karl L. (Inventor); Lewis, Daniel E. (Inventor); McCrobie, Daniel E. (Inventor)

    2005-01-01

    The present invention provides systems, apparatus and methods for entering data into a flight plan entry field which facilitates the display and editing of aircraft flight-plan data. In one embodiment, the present invention provides a method for entering multiple waypoint and procedure identifiers at once within a single a flight plan entry field. In another embodiment, the present invention provides for the partial entry of any waypoint or procedure identifiers, and thereafter relating the identifiers with an aircraft's flight management system to anticipate the complete text entry for display. In yet another embodiment, the present invention discloses a method to automatically provide the aircraft operator with selectable prioritized arrival and approach routing identifiers by a single manual selection. In another embodiment, the present invention is a method for providing the aircraft operator with selectable alternate patterns to a new runway.

  9. An optical flow-based method for velocity field of fluid flow estimation

    NASA Astrophysics Data System (ADS)

    Głomb, Grzegorz; Świrniak, Grzegorz; Mroczka, Janusz

    2017-06-01

    The aim of this paper is to present a method for estimating flow-velocity vector fields using the Lucas-Kanade algorithm. The optical flow measurements are based on the particle image velocimetry (PIV) technique, which is commonly used in fluid mechanics laboratories in both research institutes and industry. Common approaches to optical characterization of velocity fields are based on computing partial derivatives of the image intensity using finite differences. However, the accuracy of velocity field computations is low, because an exact estimation of spatial derivatives is very difficult in the presence of the rapid intensity changes caused in PIV images by particles of small diameter. The method discussed in this paper solves this problem by interpolating the PIV images using Gaussian radial basis functions. This provides a significant improvement in the accuracy of the velocity estimation but, more importantly, allows for the evaluation of the derivatives at intermediate points between pixels. Numerical analysis proves that the method is able to estimate a separate vector for each particle with a 5x5 px^2 window, whereas a classical correlation-based method needs at least 4 particle images. With the use of a specialized multi-step hybrid approach to data analysis, the method extends the estimation of particle displacements well beyond 1 px.
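    The Lucas-Kanade step itself — solving a small least-squares system over a window from image derivatives — can be sketched as follows. This toy version uses the finite-difference gradients the abstract criticizes; the paper's contribution is precisely to replace them with derivatives of a Gaussian-RBF interpolant evaluated between pixels.

```python
import numpy as np

def lucas_kanade(img1, img2, y, x, win=7):
    """Estimate the displacement (vx, vy) at pixel (y, x) between two frames.

    Classic Lucas-Kanade: the brightness-constancy equations
    fx*vx + fy*vy = -ft are solved by least squares over a win x win window.
    """
    img1 = img1.astype(float)
    img2 = img2.astype(float)
    fy, fx = np.gradient(img1)          # finite-difference spatial derivatives
    ft = img2 - img1                    # temporal derivative
    h = win // 2
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([fx[sl].ravel(), fy[sl].ravel()], axis=1)
    b = -ft[sl].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                            # v[0] = vx, v[1] = vy
```

    The 2x2 normal-equation matrix A.T @ A must be well conditioned for a reliable solution, which is why a textured window (here, a particle image) is required.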

  10. Comparison of Field Methods and Models to Estimate Mean Crown Diameter

    Treesearch

    William A. Bechtold; Manfred E. Mielke; Stanley J. Zarnoch

    2002-01-01

    The direct measurement of crown diameters with logger's tapes adds significantly to the cost of extensive forest inventories. We undertook a study of 100 trees to compare this measurement method to four alternatives: two field instruments, ocular estimates, and regression models. Using the taping method as the standard of comparison, accuracy of the tested...

  11. Superhorizon electromagnetic field background from Higgs loops in inflation

    NASA Astrophysics Data System (ADS)

    Kaya, Ali

    2018-03-01

    If the Higgs is a spectator scalar, i.e. if it is not directly coupled to the inflaton, superhorizon Higgs modes must have been excited during inflation. Since the Higgs is unstable, its decay into photons is expected to seed superhorizon photon modes. We use in-in perturbation theory to show that this naive physical expectation is indeed fulfilled via loop effects. Specifically, we calculate the first-order Higgs loop correction to the magnetic field power spectrum evaluated at some late time after inflation. It turns out that this loop correction becomes much larger than the tree-level power spectrum on superhorizon scales. This suggests a mechanism to generate cosmologically interesting superhorizon vector modes by scalar-vector interactions.

  12. A Multifunctional Interface Method for Coupling Finite Element and Finite Difference Methods: Two-Dimensional Scalar-Field Problems

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.

    2002-01-01

    A multifunctional interface method with capabilities for variable-fidelity modeling and multiple-method analysis is presented. The methodology provides an effective capability by which domains with diverse idealizations can be modeled independently to exploit the advantages of one approach over another. The multifunctional method is used to couple independently discretized subdomains and to couple the finite element and finite difference methods. The method is based on a weighted-residual variational method and is presented for two-dimensional scalar-field problems. A verification test problem and a benchmark application are presented, and the computational implications are discussed.

  13. A brain MRI bias field correction method created in the Gaussian multi-scale space

    NASA Astrophysics Data System (ADS)

    Chen, Mingsheng; Qin, Mingxin

    2017-07-01

    A pre-processing step is needed to correct for the bias field signal before submitting corrupted MR images to subsequent image-processing algorithms. This study presents a new bias field correction method. The method creates a Gaussian multi-scale space by convolving the inhomogeneous MR image with a two-dimensional Gaussian function. In the multi-scale Gaussian space, the method retrieves the image details from the difference between the original image and the convolved image. It then obtains an image whose inhomogeneity is eliminated by a weighted sum of the image details in each layer of the space. Next, the bias field-corrected MR image is obtained after a gamma correction, which enhances the contrast and brightness of the inhomogeneity-eliminated MR image. We have tested the approach on T1 and T2 MRI with varying bias field levels and have achieved satisfactory results. Comparison experiments with popular software have demonstrated superior performance of the proposed method in terms of quantitative indices, especially an improvement in subsequent image segmentation.
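    A numpy-only sketch of the multi-scale idea (the Gaussian scales, weights, and gamma value are illustrative guesses, not the paper's parameters; a checkerboard with a multiplicative ramp stands in for a bias-corrupted scan):

    ```python
    import numpy as np

    def gaussian_blur(img, sigma):
        """Separable Gaussian convolution with edge padding (numpy only)."""
        r = int(3 * sigma)
        x = np.arange(-r, r + 1)
        k = np.exp(-x**2 / (2.0 * sigma**2))
        k /= k.sum()
        pad = np.pad(img, r, mode='edge')
        tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 1, pad)
        return np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 0, tmp)

    def correct_bias(image, sigmas=(2, 4, 8), weights=None, gamma=0.8):
        """Combine multi-scale detail layers (image minus its Gaussian-blurred
        versions), rescale to [0, 1], then apply a gamma correction."""
        if weights is None:
            weights = [1.0 / len(sigmas)] * len(sigmas)
        combined = sum(w * (image - gaussian_blur(image, s))
                       for w, s in zip(weights, sigmas))
        combined -= combined.min()
        if combined.max() > 0:
            combined /= combined.max()
        return combined ** gamma

    # Toy image: checkerboard "anatomy" corrupted by a smooth multiplicative bias
    yy, xx = np.mgrid[0:64, 0:64]
    anatomy = ((xx // 8 + yy // 8) % 2).astype(float)
    corrupted = anatomy * (1.0 + 0.5 * xx / 64.0)
    corrected = correct_bias(corrupted)
    ```

    Subtracting the blurred versions acts as a bank of high-pass filters: the smooth bias survives the blur and cancels in the difference, while the fine anatomy does not.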

  14. Estimation of channel parameters and background irradiance for free-space optical link.

    PubMed

    Khatoon, Afsana; Cowley, William G; Letzepis, Nick; Giggenbach, Dirk

    2013-05-10

    Free-space optical communication can experience severe fading due to optical scintillation in long-range links. Channel estimation is also corrupted by background and electrical noise. Accurate estimation of channel parameters and the scintillation index (SI) depends on complete removal of the background irradiance. In this paper, we propose three different methods to remove the background irradiance from channel samples: the minimum-value (MV), mean-power (MP), and maximum-likelihood (ML) based methods. The MV and MP methods do not require knowledge of the scintillation distribution; the ML-based method assumes gamma-gamma scintillation but can be easily modified to accommodate other distributions. Each estimator's SI and background-irradiance estimation performance is evaluated from low- to high-SI regimes using simulation data as well as experimental measurements. The MV and MP methods have much lower complexity than the ML-based method; however, the ML-based method shows better estimation performance.
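    The minimum-value (MV) idea is the simplest of the three: in deep fades the scintillating component approaches zero, so the smallest received sample approximates the background level. A toy numpy sketch with gamma-distributed fading (an illustrative assumption; the paper's ML method assumes gamma-gamma scintillation, and receiver noise is ignored here):

    ```python
    import numpy as np

    def remove_background_mv(samples):
        """Minimum-value (MV) background estimate: subtract the smallest
        received sample as an approximation of the background irradiance."""
        b_hat = samples.min()
        return samples - b_hat, b_hat

    def scintillation_index(samples):
        """SI = var(I) / mean(I)^2 of the background-free irradiance."""
        return samples.var() / samples.mean() ** 2

    rng = np.random.default_rng(0)
    irradiance = rng.gamma(shape=4.0, scale=0.25, size=100_000)  # toy fading, mean 1
    received = irradiance + 0.5                                  # constant background
    cleaned, b = remove_background_mv(received)
    # true SI of the fading alone is var/mean^2 = 0.25/1.0 = 0.25
    ```

    Without the subtraction, the background inflates the mean and biases the SI low; the MV estimate restores it at essentially zero computational cost.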

  15. Scalar field coupling to Einstein tensor in regular black hole spacetime

    NASA Astrophysics Data System (ADS)

    Zhang, Chi; Wu, Chen

    2018-02-01

    In this paper, we study the perturbation properties of a scalar field coupled to the Einstein tensor in the background of regular black hole spacetimes. Our calculations show that the coupling constant η enters the wave equation of a scalar perturbation. We calculate the quasinormal modes of the scalar field coupled to the Einstein tensor in regular black hole spacetimes using the third-order WKB method.

  16. Recursive least squares background prediction of univariate syndromic surveillance data

    PubMed Central

    2009-01-01

    Background Surveillance of univariate syndromic data as a potential indicator of developing public health conditions has been used extensively. This paper aims to improve outbreak detection performance by using a background forecasting algorithm based on the adaptive recursive least squares (RLS) method combined with a novel treatment of the day-of-the-week effect. Methods Previous work by the first author has suggested that univariate recursive least squares analysis of syndromic data can be used to characterize the background upon which the prediction and detection components of a biosurveillance system may be built. An adaptive implementation is used to deal with data non-stationarity. In this paper we develop and implement the RLS method for background estimation of univariate data. The distinctly dissimilar distribution of data for different days of the week, however, can affect filter implementations adversely, and so a novel procedure based on linear transformations of the sorted values of the daily counts is introduced. Seven-day-ahead daily predicted counts are used as background estimates. A signal injection procedure is used to examine the integrated algorithm's ability to detect synthetic anomalies in real syndromic time series. We compare the method to a baseline CDC forecasting algorithm known as the W2 method. Results We present detection results in the form of receiver operating characteristic curve values for four different injected signal-to-noise ratios using 16 sets of syndromic data. We find improvements in the false alarm probabilities when compared to the baseline W2 background forecasts. Conclusion The current paper introduces a prediction approach for city-level biosurveillance data streams such as time series of outpatient clinic visits and sales of over-the-counter remedies. 
This approach uses RLS filters modified by a correction for the weekly patterns often seen in these data series, and a threshold detection algorithm from the
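    The exponentially weighted RLS predictor at the heart of this approach can be sketched in numpy (the filter order, forgetting factor, and the noiseless weekly-periodic toy series below are illustrative choices, not the paper's settings):

    ```python
    import numpy as np

    def rls_predict(series, order=7, lam=0.98, delta=100.0):
        """One-step-ahead background prediction with an exponentially
        weighted recursive-least-squares (RLS) linear predictor."""
        w = np.zeros(order)
        P = np.eye(order) * delta
        preds = np.zeros_like(series, dtype=float)
        for t in range(order, len(series)):
            x = series[t - order:t][::-1].astype(float)  # most recent sample first
            preds[t] = w @ x
            e = series[t] - preds[t]                     # innovation
            k = P @ x / (lam + x @ P @ x)                # gain vector
            w = w + k * e
            P = (P - np.outer(k, x @ P)) / lam           # covariance update
        return preds

    # Toy weekly-periodic count series: the filter should learn the pattern
    t = np.arange(400)
    series = 50 + 10 * np.sin(2 * np.pi * t / 7)
    preds = rls_predict(series)
    err = np.abs(series[200:] - preds[200:]).mean()      # small after convergence
    ```

    The forgetting factor `lam` lets the filter track slow non-stationarity; an outbreak-detection layer would then threshold the residual `series - preds`.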

  17. A field method for measurement of infiltration

    USGS Publications Warehouse

    Johnson, A.I.

    1963-01-01

    The determination of infiltration--the downward entry of water into a soil (or sediment)--is receiving increasing attention in hydrologic studies because of the need for more quantitative data on all phases of the hydrologic cycle. A measure of infiltration, the infiltration rate, is usually determined in the field by flooding basins or furrows, sprinkling, or measuring water entry from cylinders (infiltrometer rings). Rates determined by ponding in large areas are considered most reliable, but the high cost usually dictates that infiltrometer rings, preferably 2 feet in diameter or larger, be used. The hydrology of subsurface materials is critical in the study of infiltration. The zone controlling the rate of infiltration is usually the least permeable zone. Many other factors affect infiltration rate--the sediment (soil) structure, the condition of the sediment surface, the distribution of soil moisture or soil-moisture tension, the chemical and physical nature of the sediments, the head of applied water, the depth to ground water, the chemical quality and the turbidity of the applied water, the temperature of the water and the sediments, the percentage of entrapped air in the sediments, the atmospheric pressure, the length of time of application of water, the biological activity in the sediments, and the type of equipment or method used. It is concluded that specific values of the infiltration rate for a particular type of sediment are probably nonexistent and that measured rates are primarily for comparative use. A standard field-test method for determining infiltration rates by means of single- or double-ring infiltrometers is described and the construction, installation, and operation of the infiltrometers are discussed in detail.

  18. Process system and method for fabricating submicron field emission cathodes

    DOEpatents

    Jankowski, A.F.; Hayes, J.P.

    1998-05-05

    A process method and system for making field emission cathodes is described. The deposition source divergence is controlled to produce field emission cathodes with height-to-base aspect ratios that are uniform over large substrate surface areas while using very short source-to-substrate distances. The rate of hole closure from the cone source is controlled. The substrate surface is coated in well-defined increments. The deposition source is apertured to coat pixel areas on the substrate. The entire substrate is coated using a manipulator to incrementally move the whole substrate surface past the deposition source. Either collimated sputtering or evaporative deposition sources can be used. The position of the aperture and its size and shape are used to control the field emission cathode size and shape. 3 figs.

  19. FLASHFLOOD: A 3D Field-based similarity search and alignment method for flexible molecules

    NASA Astrophysics Data System (ADS)

    Pitman, Michael C.; Huber, Wolfgang K.; Horn, Hans; Krämer, Andreas; Rice, Julia E.; Swope, William C.

    2001-07-01

    A three-dimensional field-based similarity search and alignment method for flexible molecules is introduced. The conformational space of a flexible molecule is represented in terms of fragments and torsional angles of allowed conformations. A user-definable property field is used to compute features of fragment pairs. Features are generalizations of CoMMA descriptors (Silverman, B.D. and Platt, D.E., J. Med. Chem., 39 (1996) 2129.) that characterize local regions of the property field by its local moments. The features are invariant under coordinate system transformations. Features taken from a query molecule are used to form alignments with fragment pairs in the database. An assembly algorithm is then used to merge the fragment pairs into full structures, aligned to the query. Key to the method is the use of a context adaptive descriptor scaling procedure as the basis for similarity. This allows the user to tune the weights of the various feature components based on examples relevant to the particular context under investigation. The property fields may range from simple, phenomenological fields, to fields derived from quantum mechanical calculations. We apply the method to the dihydrofolate/methotrexate benchmark system, and show that when one injects relevant contextual information into the descriptor scaling procedure, better results are obtained more efficiently. We also show how the method works and include computer times for a query from a database that represents approximately 23 million conformers of seventeen flexible molecules.

  20. Reconstructing the Initial Density Field of the Local Universe: Methods and Tests with Mock Catalogs

    NASA Astrophysics Data System (ADS)

    Wang, Huiyuan; Mo, H. J.; Yang, Xiaohu; van den Bosch, Frank C.

    2013-07-01

    Our research objective in this paper is to reconstruct an initial linear density field, which follows the multivariate Gaussian distribution with variances given by the linear power spectrum of the current cold dark matter model and evolves through gravitational instabilities to the present-day density field in the local universe. For this purpose, we develop a Hamiltonian Markov Chain Monte Carlo method to obtain the linear density field from a posterior probability function that consists of two components: a prior of a Gaussian density field with a given linear spectrum and a likelihood term that is given by the current density field. The present-day density field can be reconstructed from galaxy groups using the method developed in Wang et al. Using a realistic mock Sloan Digital Sky Survey DR7, obtained by populating dark matter halos in the Millennium simulation (MS) with galaxies, we show that our method can effectively and accurately recover both the amplitudes and phases of the initial, linear density field. To examine the accuracy of our method, we use N-body simulations to evolve these reconstructed initial conditions to the present day. The resimulated density field thus obtained accurately matches the original density field of the MS in the density range 0.3 ≲ ρ/ρ̄ ≲ 20 without any significant bias. In particular, the Fourier phases of the resimulated density fields are tightly correlated with those of the original simulation down to a scale corresponding to a wavenumber of ~1 h Mpc⁻¹, much smaller than the translinear scale, which corresponds to a wavenumber of ~0.15 h Mpc⁻¹.
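    The posterior structure (a Gaussian prior times a likelihood tied to the observed field) and the Hamiltonian Monte Carlo sampler can be illustrated on a toy diagonal problem; all dimensions, variances, and step sizes below are illustrative, and the analytic posterior mean 0.8·d provides a check:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy posterior: Gaussian prior (variance s_p^2, the "power spectrum") times
    # a Gaussian likelihood around observed data d (noise variance s_n^2).
    s_p, s_n = 1.0, 0.5
    d = rng.normal(0.0, 1.0, size=10)

    def neg_log_post(x):
        return 0.5 * np.sum(x**2) / s_p**2 + 0.5 * np.sum((x - d)**2) / s_n**2

    def grad_neg_log_post(x):
        return x / s_p**2 + (x - d) / s_n**2

    def hmc_step(x, eps=0.1, n_leap=20):
        """One HMC step: leapfrog integration plus Metropolis accept/reject."""
        p = rng.normal(size=x.shape)
        x_new, p_new = x.copy(), p.copy()
        H0 = neg_log_post(x_new) + 0.5 * p_new @ p_new
        p_new -= 0.5 * eps * grad_neg_log_post(x_new)     # half kick
        for _ in range(n_leap - 1):
            x_new += eps * p_new                          # drift
            p_new -= eps * grad_neg_log_post(x_new)       # kick
        x_new += eps * p_new
        p_new -= 0.5 * eps * grad_neg_log_post(x_new)     # final half kick
        H1 = neg_log_post(x_new) + 0.5 * p_new @ p_new
        return x_new if rng.random() < np.exp(H0 - H1) else x

    x = np.zeros_like(d)
    samples = []
    for i in range(2000):
        x = hmc_step(x)
        if i >= 500:                                      # discard burn-in
            samples.append(x.copy())
    posterior_mean = np.mean(samples, axis=0)
    # analytic posterior mean for this toy problem: d * s_p^2 / (s_p^2 + s_n^2)
    ```

    In the actual reconstruction the state is a full density grid and the likelihood involves gravitational evolution, but the sampler skeleton is the same.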

  1. Method of measuring the dc electric field and other tokamak parameters

    DOEpatents

    Fisch, Nathaniel J.; Kirtz, Arnold H.

    1992-01-01

    A method in which an impulsive momentum-space flux is externally imposed to perturb hot tokamak electrons, producing a transient synchrotron radiation signal in frequency-time space, from which very fast algorithms infer plasma parameters including the effective ion charge state Z.sub.eff, the direction of the magnetic field, the position and width in velocity space of the impulsive momentum-space flux, and, in particular, the dc toroidal electric field.

  2. Preservation Methods Differ in Fecal Microbiome Stability, Affecting Suitability for Field Studies.

    PubMed

    Song, Se Jin; Amir, Amnon; Metcalf, Jessica L; Amato, Katherine R; Xu, Zhenjiang Zech; Humphrey, Greg; Knight, Rob

    2016-01-01

    Immediate freezing at -20°C or below has been considered the gold standard for microbiome preservation, yet this approach is not feasible for many field studies, ranging from anthropology to wildlife conservation. Here we tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including such types of variation as freeze-thaw cycles and the high temperature fluctuations often encountered under field conditions. We found that three of the methods (95% ethanol, FTA cards, and the OMNIgene Gut kit) can preserve samples sufficiently well at ambient temperatures such that differences at 8 weeks are comparable to differences among technical replicates. However, even the worst methods, including those with no fixative, were able to reveal microbiome differences between species at 8 weeks and between individuals after a week, allowing meta-analyses of samples collected using various methods when the effect of interest is expected to be larger than interindividual variation (although use of a single method within a study is strongly recommended to reduce batch effects). Encouragingly for FTA cards, the differences caused by this method are systematic and can be detrended. As in other studies, we strongly caution against the use of 70% ethanol. The results, spanning 15 individuals and over 1,200 samples, provide our most comprehensive view to date of storage effects on stool and provide a paradigm for future studies of other sample types that will be required to provide a global view of microbial diversity and its interaction among humans, animals, and the environment. IMPORTANCE Our study, spanning 15 individuals and over 1,200 samples, provides our most comprehensive view to date of storage and stabilization effects on stool. We tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including the types of variation often encountered under field conditions, such as freeze-thaw cycles and

  3. A high precision extrapolation method in multiphase-field model for simulating dendrite growth

    NASA Astrophysics Data System (ADS)

    Yang, Cong; Xu, Qingyan; Liu, Baicheng

    2018-05-01

    The phase-field method coupled with thermodynamic data has become a trend for predicting microstructure formation in technical alloys. Nevertheless, the frequent access to the thermodynamic database and the calculation of local equilibrium conditions can be time intensive. Extrapolation methods, derived from Taylor expansions, can provide approximate results with high computational efficiency and have proven successful in applications. This paper presents a high-precision second-order extrapolation method for calculating the driving force of the phase transformation. To obtain the phase compositions, different methods of solving the quasi-equilibrium condition are tested, and the M-slope approach is chosen for its best accuracy. The developed second-order extrapolation method, along with the M-slope approach and the first-order extrapolation method, is applied to simulate dendrite growth in a Ni-Al-Cr ternary alloy. The results of the extrapolation methods are compared with the exact solution with respect to the composition profile and dendrite tip position, which demonstrates the high precision and efficiency of the newly developed algorithm. To accelerate the phase-field and extrapolation computation, a graphics processing unit (GPU) based parallel computing scheme is developed. The application to large-scale simulation of multi-dendrite growth in an isothermal cross-section demonstrates the ability of the developed GPU-accelerated second-order extrapolation approach for the multiphase-field model.
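    The first- versus second-order Taylor extrapolation trade-off can be sketched generically (the exponential below stands in for an expensive thermodynamic driving-force evaluation; the step sizes are illustrative):

    ```python
    import numpy as np

    def extrapolate(f, c0, dc, h=1e-5, order=2):
        """Taylor extrapolation of an expensive function f around a stored
        reference point c0: first order uses the gradient only, second
        order adds the curvature term (derivatives by central differences)."""
        f0 = f(c0)
        f1 = (f(c0 + h) - f(c0 - h)) / (2 * h)
        est = f0 + f1 * dc
        if order == 2:
            f2 = (f(c0 + h) - 2 * f0 + f(c0 - h)) / h**2
            est += 0.5 * f2 * dc**2
        return est

    # Toy "driving force": the error drops from O(dc^2) to O(dc^3)
    g = lambda c: np.exp(c)          # stand-in for a thermodynamic quantity
    exact = g(1.02)
    e1 = abs(extrapolate(g, 1.0, 0.02, order=1) - exact)
    e2 = abs(extrapolate(g, 1.0, 0.02, order=2) - exact)
    ```

    In a phase-field loop, the reference evaluations would be cached per grid cell, so the extrapolation replaces most database look-ups with cheap polynomial evaluations.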

  4. Iterative Methods to Solve Linear RF Fields in Hot Plasma

    NASA Astrophysics Data System (ADS)

    Spencer, Joseph; Svidzinski, Vladimir; Evstatiev, Evstati; Galkin, Sergei; Kim, Jin-Soo

    2014-10-01

    Most magnetic plasma confinement devices use radio frequency (RF) waves for current drive and/or heating. Numerical modeling of RF fields is an important part of performance analysis of such devices and a predictive tool aiding the design and development of future devices. Prior attempts at this modeling have mostly used direct solvers for the formulated linear equations. Full-wave modeling of RF fields in hot plasma with 3D nonuniformities is largely prohibitive because the memory demands of a direct solver place a significant limitation on spatial resolution. Iterative methods can significantly increase spatial resolution. We explore the feasibility of using iterative methods in 3D full-wave modeling. The linear wave equation is formulated using two approaches: for cold plasmas the local cold-plasma dielectric tensor is used (resolving resonances by particle collisions), while for hot plasmas the conductivity kernel (which includes a nonlocal dielectric response) is calculated by integrating along test-particle orbits. The wave equation is discretized using a finite difference approach. The initial guess is important in iterative methods, and we examine different initial guesses including the solution of the cold-plasma wave equation. Work is supported by the U.S. DOE SBIR program.
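    The payoff of a good initial guess can be demonstrated with a generic conjugate-gradient iteration on a symmetric positive-definite test system; this is a stand-in only, since the discretized RF wave operator is indefinite, complex-valued, and far larger:

    ```python
    import numpy as np

    def conjugate_gradient(A, b, x0, tol=1e-8, max_iter=500):
        """Plain conjugate-gradient iteration; returns (solution, iterations).
        A better initial guess x0 (e.g. a cold-plasma solution) reduces the
        initial residual and hence the iteration count."""
        x = x0.copy()
        r = b - A @ x
        p = r.copy()
        rs = r @ r
        for it in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                return x, it + 1
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x, max_iter

    rng = np.random.default_rng(2)
    n = 200
    M = rng.normal(size=(n, n))
    A = M @ M.T + n * np.eye(n)          # SPD test operator
    x_true = rng.normal(size=n)
    b = A @ x_true
    x_cold, it_cold = conjugate_gradient(A, b, np.zeros(n))
    x_warm, it_warm = conjugate_gradient(A, b, x_true + 1e-3 * rng.normal(size=n))
    ```

    No matrix is ever factored, only matrix-vector products are needed, which is what makes iterative methods attractive when direct-solver memory is the bottleneck.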

  5. A comparison between progressive extension method (PEM) and iterative method (IM) for magnetic field extrapolations in the solar atmosphere

    NASA Technical Reports Server (NTRS)

    Wu, S. T.; Sun, M. T.; Sakurai, Takashi

    1990-01-01

    This paper presents a comparison between two numerical methods for the extrapolation of nonlinear force-free magnetic fields, viz. the Iterative Method (IM) and the Progressive Extension Method (PEM). The advantages and disadvantages of these two methods are summarized, and the accuracy and numerical instability are discussed. On the basis of this investigation, it is claimed that the two methods do resemble each other qualitatively.

  6. Near-field diffraction from amplitude diffraction gratings: theory, simulation and results

    NASA Astrophysics Data System (ADS)

    Abedin, Kazi Monowar; Rahman, S. M. Mujibur

    2017-08-01

    We describe a computer simulation method by which the complete near-field diffraction pattern of an amplitude diffraction grating can be generated. The technique uses the method of iterative Fresnel integrals to calculate and generate the diffraction images. The theoretical background as well as the techniques used to perform the simulation are described. The program is written in MATLAB and can be run on any ordinary PC. Examples of simulated diffraction images are presented and discussed. The generated images in the far field, where they reduce to the Fraunhofer diffraction pattern, are also presented for a realistic grating and compared with the results predicted by the grating equation, which is applicable in the far field. The method can be used as a tool to teach the complex phenomenon of diffraction in classrooms.
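    The Fresnel-integral evaluation at the core of such a simulation can be sketched directly (numpy instead of MATLAB; the wavelength, grating geometry, and propagation distance are illustrative, and constant prefactors are dropped since only the relative intensity matters):

    ```python
    import numpy as np

    wavelength = 633e-9           # He-Ne wavelength, assumed
    k = 2 * np.pi / wavelength
    z = 0.05                      # observation plane 5 cm behind the grating
    period, slit = 100e-6, 50e-6  # amplitude grating: 100 um period, 50% duty

    # Transmission function sampled across ten periods
    xi = np.linspace(-5 * period, 5 * period, 4000)
    t = ((xi % period) < slit).astype(float)

    # 1D Fresnel integral U(x) ~ integral t(xi) exp(i k (x - xi)^2 / (2 z)) dxi,
    # evaluated by direct summation for each observation point
    x_obs = np.linspace(-2 * period, 2 * period, 400)
    dxi = xi[1] - xi[0]
    U = np.array([np.sum(t * np.exp(1j * k * (x - xi) ** 2 / (2 * z))) * dxi
                  for x in x_obs])
    intensity = np.abs(U) ** 2
    intensity /= intensity.max()  # relative intensity only
    ```

    Increasing `z` toward the far field makes the pattern approach the Fraunhofer orders predicted by the grating equation, which is the comparison the paper makes.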

  7. Wide-field absolute transverse blood flow velocity mapping in vessel centerline

    NASA Astrophysics Data System (ADS)

    Wu, Nanshou; Wang, Lei; Zhu, Bifeng; Guan, Caizhong; Wang, Mingyi; Han, Dingan; Tan, Haishu; Zeng, Yaguang

    2018-02-01

    We propose a wide-field absolute transverse blood flow velocity measurement method in the vessel centerline based on the absorption intensity fluctuation modulation effect. The difference between the light absorption capacities of red blood cells and background tissue under low-coherence illumination is utilized to obtain instantaneous and averaged wide-field optical angiography images. The absolute fuzzy connection algorithm is used for vessel centerline extraction from the averaged wide-field optical angiography. The absolute transverse velocity along the vessel centerline is then measured by cross-correlation analysis of the instantaneous modulation depth signal. The proposed method promises to contribute to the treatment of diseases, such as those related to anemia or thrombosis.
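    The cross-correlation step that converts modulation-depth signals into an absolute velocity can be sketched as transit-time estimation (the sampling rate, point separation, and white-noise signal below are illustrative):

    ```python
    import numpy as np

    fs = 1000.0                   # sampling rate, Hz (assumed)
    dist = 2e-4                   # 0.2 mm separation along the vessel centerline
    rng = np.random.default_rng(3)
    s = rng.normal(size=4096)
    delay = 25                                   # true transit time, in samples
    s_up, s_down = s[delay:], s[:-delay]         # downstream copy lags by `delay`

    # Cross-correlate to find the transit-time delay, then convert to velocity
    a = s_down - s_down.mean()
    v = s_up - s_up.mean()
    corr = np.correlate(a, v, mode='full')
    lag = int(np.argmax(corr)) - (len(v) - 1)    # recovers `delay`
    velocity = dist / (lag / fs)                 # m/s
    ```

    In the paper's setting the two signals come from neighbouring points on the extracted centerline, so the recovered delay directly gives the absolute transverse flow speed.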

  8. Serial single molecule electron diffraction imaging: diffraction background of superfluid helium droplets

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; He, Yunteng; Lei, Lei; Alghamdi, Maha; Oswalt, Andrew; Kong, Wei

    2017-08-01

    In an effort to solve the crystallization problem in crystallography, we have been engaged in developing a method termed "serial single molecule electron diffraction imaging" (SS-EDI). The unique features of SS-EDI are superfluid helium droplet cooling and field-induced orientation: together the two features constitute a molecular goniometer. Unfortunately, the helium atoms surrounding the sample molecule also contribute to a diffraction background. In this report, we analyze the properties of a superfluid helium droplet beam and its doping statistics, and demonstrate the feasibility of overcoming the background issue by using the velocity slip phenomenon of a pulsed droplet beam. Electron diffraction profiles and pair correlation functions of ferrocene-monomer-doped droplets and iodine-nanocluster-doped droplets are presented. The timing of the pulsed electron gun and the effective doping efficiency under different dopant pressures can both be controlled for size selection. This work clears any doubt of the effectiveness of superfluid helium droplets in SS-EDI, thereby advancing the effort in demonstrating the "proof-of-concept" one step further.

  9. Harmonic demodulation and minimum enhancement factors in field-enhanced near-field optical microscopy.

    PubMed

    Scarpettini, A F; Bragas, A V

    2015-01-01

    Field-enhanced scanning optical microscopy relies on the design and fabrication of plasmonic probes that have to provide optical and chemical contrast at the nanoscale. To do so, the scattering containing the near-field information recorded in a field-enhanced scanning optical microscopy experiment has to surpass the background light, which is always present due to multiple interferences between the macroscopic probe and the sample. In this work, we show that when the probe-sample distance is modulated with very low amplitude, the higher the harmonic of demodulation, the better the ratio between the near-field signal and the interferometric background. The choice of working at a given harmonic n is dictated by the experiment, when the signal at the n + 1 harmonic falls below the experimental noise. We demonstrate that the optical contrast comes from the nth derivative of the near-field scattering, amplified by the interferometric background. By modelling the far and near fields, we calculate the probe-sample approach curves, which fit the experimental ones very well. After taking a large amount of experimental data for different probes and samples, we conclude with a table of the minimum enhancement factors needed to obtain optical contrast with field-enhanced scanning optical microscopy. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.

  10. Field test of available methods to measure remotely SOx and NOx emissions from ships

    NASA Astrophysics Data System (ADS)

    Balzani Lööv, J. M.; Alfoldy, B.; Gast, L. F. L.; Hjorth, J.; Lagler, F.; Mellqvist, J.; Beecken, J.; Berg, N.; Duyzer, J.; Westrate, H.; Swart, D. P. J.; Berkhout, A. J. C.; Jalkanen, J.-P.; Prata, A. J.; van der Hoff, G. R.; Borowiak, A.

    2014-08-01

    Methods for the determination of ship fuel sulphur content and NOx emission factors based on remote measurements were compared in the harbour of Rotterdam and validated against direct stack emission measurements on the ferry Stena Hollandica. The methods were selected based on a review of the available literature on ship emission measurements. They were either optical (LIDAR, Differential Optical Absorption Spectroscopy (DOAS), UV camera), combined with model-based estimates of fuel consumption, or based on the so-called "sniffer" principle, where SO2 or NOx emission factors are determined from simultaneous measurement of the increase of CO2 and SO2 or NOx concentrations in the plume of the ship relative to the background. The measurements were performed from stations on land, from a boat, and from a helicopter. Mobile measurement platforms were found to have important advantages over land-based ones because they allow optimizing the sampling conditions and sampling from ships on the open sea. Although optical methods can provide reliable results, it was found that at the current state of the art the "sniffer" approach is the most convenient technique for determining both SO2 and NOx emission factors remotely. The average random error in the determination of SO2 emission factors, comparing two identical instrumental set-ups, was 6%. However, it was found that apparently minor differences in instrumental characteristics, such as response time, could cause significant differences between the emission factors determined. Direct stack measurements showed that about 14% of the fuel sulphur content was not emitted as SO2. This was supported by the remote measurements and is in agreement with the results of other field studies.
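    The sniffer principle reduces to simple arithmetic on background-subtracted plume enhancements. A sketch, assuming the commonly used approximation that marine fuel is about 87% carbon by mass (the exact fraction varies by fuel):

    ```python
    def fuel_sulphur_percent(so2_ppb, co2_ppm, carbon_fraction=0.87):
        """Estimate fuel sulphur content (% m/m) from background-subtracted
        plume enhancements of SO2 (ppb) and CO2 (ppm): the molar SO2/CO2
        ratio scaled by the S/C molar-mass ratio and the assumed carbon
        mass fraction of the fuel."""
        mol_ratio = (so2_ppb / 1000.0) / co2_ppm      # both now in ppm
        return mol_ratio * (32.06 / 12.01) * carbon_fraction * 100.0

    # e.g. plume enhancements of 43 ppb SO2 per 10 ppm CO2 give roughly 1% m/m
    fsc = fuel_sulphur_percent(43.0, 10.0)
    ```

    The 14% of sulphur not emitted as SO2 reported above is one reason such estimates slightly understate the true fuel sulphur content.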

  11. System and method for evaluating wind flow fields using remote sensing devices

    DOEpatents

    Schroeder, John; Hirth, Brian; Guynes, Jerry

    2016-12-13

    The present invention provides a system and method for obtaining data to determine one or more characteristics of a wind field using a first remote sensing device and a second remote sensing device. Coordinated data is collected from the first and second remote sensing devices and analyzed to determine the one or more characteristics of the wind field. The first remote sensing device is positioned to have a portion of the wind field within a first scanning sector of the first remote sensing device. The second remote sensing device is positioned to have the portion of the wind field disposed within a second scanning sector of the second remote sensing device.
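    A minimal sketch of the dual-device retrieval idea: two radial (line-of-sight) velocity measurements of the same region from different viewing angles determine the horizontal wind vector (the azimuth convention and numbers below are illustrative assumptions, not the patent's formulation):

    ```python
    import numpy as np

    def retrieve_wind(vr1, az1, vr2, az2):
        """Solve the two radial-velocity equations
        vr_i = u*sin(az_i) + v*cos(az_i) for the horizontal wind (u, v);
        azimuths in radians, measured clockwise from north."""
        A = np.array([[np.sin(az1), np.cos(az1)],
                      [np.sin(az2), np.cos(az2)]])
        return np.linalg.solve(A, np.array([vr1, vr2]))

    # Toy check: a 10 m/s westerly (u=10, v=0) seen from two viewing angles
    u, v = 10.0, 0.0
    az1, az2 = np.radians(40.0), np.radians(110.0)
    vr1 = u * np.sin(az1) + v * np.cos(az1)
    vr2 = u * np.sin(az2) + v * np.cos(az2)
    uv = retrieve_wind(vr1, az1, vr2, az2)   # recovers (10, 0)
    ```

    This is why the patent requires both devices to cover the same portion of the wind field: a single radial measurement constrains only one component of the vector.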

  12. σ-SCF: A direct energy-targeting method to mean-field excited states

    NASA Astrophysics Data System (ADS)

    Ye, Hong-Zhou; Welborn, Matthew; Ricke, Nathan D.; Van Voorhis, Troy

    2017-12-01

    The mean-field solutions of electronic excited states are much less accessible than ground state (e.g., Hartree-Fock) solutions. Energy-based optimization methods for excited states, like Δ-SCF (self-consistent field), tend to fall into the lowest solution consistent with a given symmetry—a problem known as "variational collapse." In this work, we combine the ideas of direct energy-targeting and variance-based optimization in order to describe excited states at the mean-field level. The resulting method, σ-SCF, has several advantages. First, it allows one to target any desired excited state by specifying a single parameter: a guess of the energy of that state. It can therefore, in principle, find all excited states. Second, it avoids variational collapse by using a variance-based, unconstrained local minimization. As a consequence, all states—ground or excited—are treated on an equal footing. Third, it provides an alternate approach to locate Δ-SCF solutions that are otherwise hardly accessible by the usual non-aufbau configuration initial guess. We present results for this new method for small atoms (He, Be) and molecules (H2, HF). We find that σ-SCF is very effective at locating excited states, including individual, high energy excitations within a dense manifold of excited states. Like all single determinant methods, σ-SCF shows prominent spin-symmetry breaking for open shell states and our results suggest that this method could be further improved with spin projection.
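    The paper's central idea, that minimizing the energy variance ⟨(H-ω)²⟩ targets whichever state lies closest to the energy guess ω, can be illustrated on a toy matrix Hamiltonian; the inverse-iteration minimizer below is a stand-in for the actual σ-SCF optimizer, which works with mean-field determinants rather than exact eigenvectors:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 8
    H = rng.normal(size=(n, n))
    H = (H + H.T) / 2                       # toy symmetric "Hamiltonian"
    evals = np.linalg.eigvalsh(H)

    def target_state(H, omega, steps=50):
        """Minimize the energy variance <(H - omega)^2> over normalized
        states via inverse iteration on the variance operator; the minimizer
        is the eigenstate whose energy lies closest to the guess omega."""
        I = np.eye(len(H))
        M = (H - omega * I) @ (H - omega * I)
        psi = rng.normal(size=len(H))
        psi /= np.linalg.norm(psi)
        for _ in range(steps):
            psi = np.linalg.solve(M + 1e-9 * I, psi)   # small shift for safety
            psi /= np.linalg.norm(psi)
        return psi, psi @ H @ psi

    omega = evals[3] + 0.05                 # an energy guess near the 4th level
    psi, energy = target_state(H, omega)    # lands there, not on the ground state
    ```

    Because the variance is minimized rather than the energy, there is no variational collapse: the ground state is just one more local minimum, reached only if ω is guessed near it.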

  13. σ-SCF: A direct energy-targeting method to mean-field excited states.

    PubMed

    Ye, Hong-Zhou; Welborn, Matthew; Ricke, Nathan D; Van Voorhis, Troy

    2017-12-07

    The mean-field solutions of electronic excited states are much less accessible than ground state (e.g., Hartree-Fock) solutions. Energy-based optimization methods for excited states, like Δ-SCF (self-consistent field), tend to fall into the lowest solution consistent with a given symmetry, a problem known as "variational collapse." In this work, we combine the ideas of direct energy-targeting and variance-based optimization in order to describe excited states at the mean-field level. The resulting method, σ-SCF, has several advantages. First, it allows one to target any desired excited state by specifying a single parameter: a guess of the energy of that state. It can therefore, in principle, find all excited states. Second, it avoids variational collapse by using a variance-based, unconstrained local minimization. As a consequence, all states, ground or excited, are treated on an equal footing. Third, it provides an alternate approach to locate Δ-SCF solutions that are otherwise hardly accessible by the usual non-aufbau configuration initial guess. We present results for this new method for small atoms (He, Be) and molecules (H2, HF). We find that σ-SCF is very effective at locating excited states, including individual, high energy excitations within a dense manifold of excited states. Like all single determinant methods, σ-SCF shows prominent spin-symmetry breaking for open shell states and our results suggest that this method could be further improved with spin projection.

  14. Full-Field Strain Measurement On Titanium Welds And Local Elasto-Plastic Identification With The Virtual Fields Method

    NASA Astrophysics Data System (ADS)

    Tattoli, F.; Pierron, F.; Rotinat, R.; Casavola, C.; Pappalettere, C.

    2011-01-01

    One of the main problems in welding is the microstructural transformation within the area affected by the thermal history. The resulting heterogeneous microstructure within the weld nugget and the heat affected zones is often associated with changes in local material properties. The present work deals with the identification of material parameters governing the elasto-plastic behaviour of the fused and heat affected zones as well as the base material for titanium hybrid welded joints (Ti6Al4V alloy). The material parameters are identified from heterogeneous strain fields with the Virtual Fields Method. This method is based on a relevant use of the principle of virtual work and it has been shown to be useful and much less time consuming than classical finite element model updating approaches applied to similar problems. The paper will present results and discuss the problem of selection of the weld zones for the identification.

  15. Peripheral Prism Glasses: Effects of Moving and Stationary Backgrounds

    PubMed Central

    Shen, Jieming; Peli, Eli; Bowers, Alex R.

    2015-01-01

    Purpose Unilateral peripheral prisms for homonymous hemianopia (HH) expand the visual field through peripheral binocular visual confusion, a stimulus for binocular rivalry that could lead to reduced predominance (partial local suppression) of the prism image and limit device functionality. Using natural-scene images and motion videos, we evaluated whether detection was reduced in binocular compared to monocular viewing. Methods Detection rates of nine participants with HH or quadranopia and normal binocularity wearing peripheral prisms were determined for static checkerboard perimetry targets briefly presented in the prism expansion area and the seeing hemifield. Perimetry was conducted under monocular and binocular viewing with targets presented over videos of real-world driving scenes and still frame images derived from those videos. Results With unilateral prisms, detection rates in the prism expansion area were significantly lower in binocular than monocular (prism eye) viewing on the motion background (medians 13% and 58%, respectively, p = 0.008), but not the still frame background (63% and 68%, p = 0.123). When the stimulus for binocular rivalry was reduced by fitting prisms bilaterally in 1 HH and 1 normally-sighted subject with simulated HH, prism-area detection rates on the motion background were not significantly different (p > 0.6) in binocular and monocular viewing. Conclusions Conflicting binocular motion appears to be a stimulus for reduced predominance of the prism image in binocular viewing when using unilateral peripheral prisms. However, the effect was only found for relatively small targets. Further testing is needed to determine the extent to which this phenomenon might affect the functionality of unilateral peripheral prisms in more real-world situations. PMID:25785533

  16. Solid Lubrication Fundamentals and Applications: Introduction and Background. Revision 1

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    1998-01-01

    This chapter presents an introduction and historical background to the field of tribology, especially solid lubrication and lubricants and sets them in the perspective of techniques and materials in lubrication. Also, solid and liquid lubrication films are defined and described.

  17. Solid Lubrication Fundamentals and Applications. Chapter 1; Introduction and Background

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    1996-01-01

    This chapter presents an introduction and historical background to the field of tribology, especially solid lubrication and lubricants and sets them in the perspective of techniques and materials in lubrication. Also, solid and liquid lubrication films are defined and described.

  18. Nondestructive acoustic electric field probe apparatus and method

    DOEpatents

    Migliori, Albert

    1982-01-01

    The disclosure relates to a nondestructive acoustic electric field probe and its method of use. A source of acoustic pulses of arbitrary but selected shape is placed in an oil bath along with material to be tested across which a voltage is disposed and means for receiving acoustic pulses after they have passed through the material. The received pulses are compared with voltage changes across the material occurring while acoustic pulses pass through it and analysis is made thereof to determine preselected characteristics of the material.

  19. Obtaining source current density related to irregularly structured electromagnetic target field inside human body using hybrid inverse/FDTD method.

    PubMed

    Han, Jijun; Yang, Deqiang; Sun, Houjun; Xin, Sherman Xuegang

    2017-01-01

    The inverse method is inherently suitable for calculating the distribution of source current density associated with an irregularly structured electromagnetic target field. However, in its present form the inverse method cannot account for complex field-tissue interactions. We propose a novel hybrid inverse/finite-difference time domain (FDTD) method that can calculate the complex field-tissue interactions for the inverse design of the source current density associated with an irregularly structured electromagnetic target field. A Huygens' equivalent surface is established as a bridge to combine the inverse and FDTD methods. The distribution of the radiofrequency (RF) magnetic field on the Huygens' equivalent surface is obtained using the FDTD method by considering the complex field-tissue interactions within the human body model. The magnetic field distributed on the Huygens' equivalent surface is then regarded as the new target. The current density on the designated source surface is derived using the inverse method. The homogeneity of the target magnetic field and the specific energy absorption rate are calculated to verify the proposed method.

  20. An EPIC Tale of the Quiescent Particle Background

    NASA Technical Reports Server (NTRS)

    Snowden, S.L.; Kuntz, K.D.

    2017-01-01

    Extended Source Analysis Software Use Based Empirical Investigation: (1) Builds quiescent particle background (QPB) spectra and images for observations of extended sources that fill (or mostly fill) the FOV, i.e., cases where annular background subtraction won't work. (2) Uses a combination of Filter Wheel Closed (FWC) and corner data to capture the spectral, spatial, and temporal variation of the quiescent particle background. New Work: (1) Improved understanding of the QPB (aided by adding a whole lot of data since 2008). (2) Significantly improved statistics (did I mention a LOT more data?). (3) Better characterization and identification of anomalous states. (4) Builds backgrounds for some anomalous states. (5) New efficient method for non-anomalous states.

  1. Magnetic fields end-face effect investigation of HTS bulk over PMG with 3D-modeling numerical method

    NASA Astrophysics Data System (ADS)

    Qin, Yujie; Lu, Yiyun

    2015-09-01

    In this paper, the magnetic field end-face effect of a high temperature superconducting (HTS) bulk over a permanent magnetic guideway (PMG) is investigated with a 3D-modeling numerical method. The electromagnetic behavior of the bulk is simulated using the finite element method (FEM). The framework is formulated with the magnetic field vector method (H-method). A superconducting levitation system composed of one rectangular HTS bulk and one infinitely long PMG is successfully investigated using the proposed method. The simulation results show that for an HTS bulk of finite geometry, even though the applied magnetic field is distributed only in the x-y plane, a magnetic field component Hz along the z-axis can be observed inside the HTS bulk.

  2. Calorimetric method of ac loss measurement in a rotating magnetic field.

    PubMed

    Ghoshal, P K; Coombs, T A; Campbell, A M

    2010-07-01

    A method is described for calorimetric ac-loss measurements of high-Tc superconductors (HTS) at 80 K. It is based on a technique used at 4.2 K for conventional superconducting wires that allows an easy loss measurement in parallel or perpendicular external field orientation. This paper focuses on the ac loss measurement setup and its calibration in a rotating magnetic field. The experimental setup demonstrates loss measurement by a temperature rise method under the influence of a rotating magnetic field: the slight temperature increase of the sample in an ac field is used as a measure of the losses. The aim is to simulate the loss in rotating machines using HTS. This is a unique technique for measuring the total ac loss in HTS at power frequencies. The sample is mounted on a cold finger extended from a liquid nitrogen heat exchanger (HEX). The thermal insulation between the HEX and the sample is provided by a sample holder of low thermal conductivity and low eddy current heating, placed in a vacuum vessel. A temperature sensor and a noninductive heater have been incorporated in the sample holder, allowing a rapid sample change. The main part of the data obtained in the calorimetric measurement is used for calibration. The focus is on the accuracy and calibrations required to predict the actual ac losses in HTS. This setup has the advantage of being able to measure the total ac loss under the influence of a continuously moving field, as experienced by any rotating machine.

  3. Background concentrations for high resolution satellite observing systems of methane

    NASA Astrophysics Data System (ADS)

    Benmergui, J. S.; Propp, A. M.; Turner, A. J.; Wofsy, S. C.

    2017-12-01

    Emerging satellite technologies promise to measure total column dry-air mole fractions of methane (XCH4) at resolutions on the order of a kilometer. XCH4 is linearly related to regional methane emissions through enhancements in the mixed layer, giving these satellites the ability to constrain emissions at unprecedented resolution. However, XCH4 is also sensitive to variability in transport of upwind concentrations (the "background concentration"). Variations in the background concentration are caused by synoptic scale transport in both the free troposphere and the stratosphere, as well as the rate of methane oxidation. Misspecification of the background concentration is aliased onto retrieved emissions as bias. This work explores several methods of specifying the background concentration for high resolution satellite observations of XCH4. We conduct observing system simulation experiments (OSSEs) that simulate the retrieval of emissions in the Barnett Shale using observations from a 1.33 km resolution XCH4 imaging satellite. We test background concentrations defined (1) from an external continental-scale model, (2) using pixels along the edge of the image as a boundary value, (3) using differences between adjacent pixels, and (4) using differences between the same pixel separated by one hour in time. We measure success using the accuracy of the retrieval, the potential for bias induced by misspecification of the background, and the computational expedience of the method. Pathological scenarios are given to each method.
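Background strategy (2) above, using pixels along the edge of the image as the boundary value, can be sketched with synthetic numbers (the scene, gradient, and plume values below are invented for illustration, not from the study):

```python
import numpy as np

# Synthetic XCH4 scene: a smooth synoptic background ramp plus a
# localized emission enhancement (all values hypothetical, in ppb).
ny, nx = 64, 64
y, x = np.mgrid[0:ny, 0:nx]
background = 1850.0 + 0.2 * x                      # smooth upwind gradient
plume = 30.0 * np.exp(-((x - 40) ** 2 + (y - 32) ** 2) / 20.0)
xch4 = background + plume

# Edge-pixel background: fit a plane to the pixels on the image border,
# then subtract it to isolate the local enhancement.
edge = np.zeros((ny, nx), bool)
edge[0], edge[-1], edge[:, 0], edge[:, -1] = True, True, True, True
A = np.column_stack([np.ones(edge.sum()), x[edge], y[edge]])
coef, *_ = np.linalg.lstsq(A, xch4[edge], rcond=None)
fitted_bg = coef[0] + coef[1] * x + coef[2] * y
enhancement = xch4 - fitted_bg                     # what emissions see
```

The toy case also shows the failure mode the abstract warns about: if the plume touched the image edge, the fitted plane would absorb part of the enhancement and bias the retrieved emissions low.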

  4. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-03

    Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
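The pipeline described above (local minima as a baseline vector, iterative removal of outliers belonging to peaks, then interpolation back to the full chromatogram) can be sketched as follows. This is a simplified stand-in for the authors' optimization, with an invented pull-down rule and synthetic data:

```python
import numpy as np

def baseline_correct(signal, n_iter=100):
    """Simplified sketch: local minima -> iterative outlier pull-down
    -> linear interpolation (not the authors' exact algorithm)."""
    n = len(signal)
    # 1) local minima form the initial baseline vector
    idx = [0] + [i for i in range(1, n - 1)
                 if signal[i] <= signal[i - 1] and signal[i] <= signal[i + 1]] + [n - 1]
    xs = np.array(idx, float)
    base = signal[np.array(idx)].astype(float)
    # 2) iteratively pull down baseline points lying above the line
    #    joining their neighbours (outliers sitting on peaks), until
    #    the baseline vector converges
    for _ in range(n_iter):
        pred = base[:-2] + (base[2:] - base[:-2]) * (xs[1:-1] - xs[:-2]) / (xs[2:] - xs[:-2])
        new = base.copy()
        new[1:-1] = np.minimum(base[1:-1], pred)
        if np.allclose(new, base):
            break
        base = new
    # 3) expand the optimized baseline to the full chromatogram
    return signal - np.interp(np.arange(n), xs, base)

# Synthetic chromatogram: linear drift plus two Gaussian peaks.
t = np.arange(500.0)
drift = 0.01 * t
peaks = 5 * np.exp(-((t - 150) ** 2) / 50) + 8 * np.exp(-((t - 350) ** 2) / 80)
corrected = baseline_correct(drift + peaks)
```

On this synthetic trace the recovered peak heights match the injected 5 and 8 units, with the drift removed to within a few percent.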

  5. Optical properties in GaAs/AlGaAs semiparabolic quantum wells by the finite difference method: Combined effects of electric field and magnetic field

    NASA Astrophysics Data System (ADS)

    Yan, Ru-Yu; Tang, Jian; Zhang, Zhi-Hai; Yuan, Jian-Hui

    2018-05-01

    In the present work, the optical properties of GaAs/AlGaAs semiparabolic quantum wells (QWs) are studied under the effect of applied electric field and magnetic field by using the compact-density-matrix method. The energy eigenvalues and their corresponding eigenfunctions of the system are calculated by using the finite difference method. Simultaneously, the nonlinear optical rectification (OR) and optical absorption coefficients (OACs) are investigated, which are modulated by the applied electric field and magnetic field. It is found that the position and the magnitude of the resonant peaks of the nonlinear OR and OACs can depend strongly on the applied electric field, magnetic field and confined potential frequencies. This gives a new way to control the device applications based on the intersubband transitions of electrons in this system.
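The finite-difference step can be sketched in dimensionless units (hbar = m = omega = 1): discretize −½ψ″ + V(x)ψ on a grid with an infinite wall at x = 0 and diagonalize the tridiagonal matrix. The semiparabolic well and linear field term below are generic illustrations, not the authors' GaAs/AlGaAs parameters:

```python
import numpy as np

def semiparabolic_levels(F=0.0, n=1200, L=10.0):
    """First three levels of V = 0.5*x^2 + F*x on x > 0 (wall at x = 0).

    Central finite differences: -0.5*psi'' becomes a tridiagonal matrix
    with 1/h^2 on the diagonal and -0.5/h^2 off it; F plays the role of
    a uniform electric field tilting the well.
    """
    x = np.linspace(0.0, L, n + 2)[1:-1]     # interior grid, psi = 0 at ends
    h = x[1] - x[0]
    V = 0.5 * x**2 + F * x
    H = (np.diag(1.0 / h**2 + V)
         + np.diag(-0.5 / h**2 * np.ones(n - 1), 1)
         + np.diag(-0.5 / h**2 * np.ones(n - 1), -1))
    return np.linalg.eigvalsh(H)[:3]

# Without a field the levels are the odd harmonic-oscillator states
# E_n = 2n + 1.5 (even states are killed by the wall at x = 0).
levels = semiparabolic_levels()
```

Turning on F > 0 raises every level, which is the field modulation of the resonant-peak positions the abstract describes.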

  6. Adaptive removal of background and white space from document images using seam categorization

    NASA Astrophysics Data System (ADS)

    Fillion, Claude; Fan, Zhigang; Monga, Vishal

    2011-03-01

    Document images are obtained regularly by rasterization of document content and as scans of printed documents. Resizing via background and white space removal is often desired for better consumption of these images, whether on displays or in print. While white space and background are easy to identify in images, existing methods such as naïve removal and content aware resizing (seam carving) each have limitations that can lead to undesirable artifacts, such as uneven spacing between lines of text or poor arrangement of content. An adaptive method based on image content is hence needed. In this paper we propose an adaptive method to intelligently remove white space and background content from document images. Document images are different from pictorial images in structure. They typically contain objects (text letters, pictures and graphics) separated by uniform background, which include both white paper space and other uniform color background. Pixels in uniform background regions are excellent candidates for deletion if resizing is required, as they introduce less change in document content and style, compared with deletion of object pixels. We propose a background deletion method that exploits both local and global context. The method aims to retain the document structural information and image quality.

  7. Preservation Methods Differ in Fecal Microbiome Stability, Affecting Suitability for Field Studies

    PubMed Central

    Amir, Amnon; Metcalf, Jessica L.; Amato, Katherine R.; Xu, Zhenjiang Zech; Humphrey, Greg

    2016-01-01

    ABSTRACT Immediate freezing at −20°C or below has been considered the gold standard for microbiome preservation, yet this approach is not feasible for many field studies, ranging from anthropology to wildlife conservation. Here we tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including such types of variation as freeze-thaw cycles and the high temperature fluctuations often encountered under field conditions. We found that three of the methods—95% ethanol, FTA cards, and the OMNIgene Gut kit—can preserve samples sufficiently well at ambient temperatures such that differences at 8 weeks are comparable to differences among technical replicates. However, even the worst methods, including those with no fixative, were able to reveal microbiome differences between species at 8 weeks and between individuals after a week, allowing meta-analyses of samples collected using various methods when the effect of interest is expected to be larger than interindividual variation (although use of a single method within a study is strongly recommended to reduce batch effects). Encouragingly for FTA cards, the differences caused by this method are systematic and can be detrended. As in other studies, we strongly caution against the use of 70% ethanol. The results, spanning 15 individuals and over 1,200 samples, provide our most comprehensive view to date of storage effects on stool and provide a paradigm for the future studies of other sample types that will be required to provide a global view of microbial diversity and its interaction among humans, animals, and the environment. IMPORTANCE Our study, spanning 15 individuals and over 1,200 samples, provides our most comprehensive view to date of storage and stabilization effects on stool. We tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including the types of variation often encountered under field conditions, such as freeze

  8. Characterizing the True Background Corona with SDO/AIA

    NASA Technical Reports Server (NTRS)

    Napier, Kate; Winebarger, Amy; Alexander, Caroline

    2014-01-01

    Characterizing the nature of the solar coronal background would enable scientists to more accurately determine plasma parameters, and may lead to a better understanding of the coronal heating problem. Because scientists study the 3D structure of the Sun in 2D, any line of sight includes both foreground and background material, and thus, the issue of background subtraction arises. By investigating the intensity values in and around an active region, using multiple wavelengths collected from the Atmospheric Imaging Assembly (AIA) on the Solar Dynamics Observatory (SDO) over an eight-hour period, this project aims to characterize the background as smooth or structured. Different methods were employed to measure the true coronal background and create minimum intensity images. These were then investigated for the presence of structure. The background images created were found to contain long-lived structures, including coronal loops, that were still present in all of the wavelengths: 193, 171, 131, and 211 Angstroms. The intensity profiles across the active region indicate that the background is much more structured than previously thought.

  9. Experiments on the Expansion of a Dense Plasma into a Background Magnetoplasma

    NASA Astrophysics Data System (ADS)

    Gekelman, Walter; Vanzeeland, Mike; Vincena, Steve; Pribyl, Pat

    2003-10-01

    There are many situations which occur in space (coronal mass ejections) or are man-made (upper atmospheric detonations), as well as the initial stages of a supernova, in which a dense plasma expands into a background magnetized plasma that can support Alfvén waves. The upgraded LArge Plasma Device (LAPD) at UCLA is a machine in which Alfvén wave propagation in homogeneous and inhomogeneous plasmas has been studied. We describe a series of experiments involving the expansion of a dense (initially, n_laser-plasma/n_0 ≫ 1) laser-produced plasma into an ambient, highly magnetized background plasma capable of supporting Alfvén waves. The 150 MW laser is pulsed at the same 1 Hz repetition rate as the plasma in a highly reproducible experiment. The interaction results in the production of intense shear Alfvén waves, as well as large density perturbations. The waves propagate away from the target and are observed to become plasma column resonances. In the initial phase the background magnetic field is expelled from a plasma bubble. Currents in the main body of the plasma are generated to neutralize the positively charged bubble. The resulting current system becomes that of a spectrum of shear Alfvén waves. Spatial patterns of the wave magnetic fields are measured at over 10^4 locations. As the dense plasma expands across the magnetic field it seeds the column with shear waves. Most of the Alfvén wave energy is in shear waves, which become field line resonances after a machine transit time. The interplay between waves, currents, inductive electric fields and space charge is analyzed in great detail. Dramatic movies of the measured wave fields and their associated currents will be presented. Work supported by ONR and DOE/NSF.

  10. Potential, velocity, and density fields from sparse and noisy redshift-distance samples - Method

    NASA Technical Reports Server (NTRS)

    Dekel, Avishai; Bertschinger, Edmund; Faber, Sandra M.

    1990-01-01

    A method for recovering the three-dimensional potential, velocity, and density fields from large-scale redshift-distance samples is described. Galaxies are taken as tracers of the velocity field, not of the mass. The density field and the initial conditions are calculated using an iterative procedure that applies the no-vorticity assumption at an initial time and uses the Zel'dovich approximation to relate initial and final positions of particles on a grid. The method is tested using a cosmological N-body simulation 'observed' at the positions of real galaxies in a redshift-distance sample, taking into account their distance measurement errors. Malmquist bias and other systematic and statistical errors are extensively explored using both analytical techniques and Monte Carlo simulations.

  11. Characterization of background concentrations of contaminants using a mixture of normal distributions.

    PubMed

    Qian, Song S; Lyons, Regan E

    2006-10-01

    We present a Bayesian approach for characterizing background contaminant concentration distributions using data from sites that may have been contaminated. Our method, focused on estimation, resolves several technical problems of the existing methods sanctioned by the U.S. Environmental Protection Agency (USEPA) (a hypothesis testing based method), resulting in a simple and quick procedure for estimating background contaminant concentrations. The proposed Bayesian method is applied to two data sets from a federal facility regulated under the Resource Conservation and Recovery Act. The results are compared to background distributions identified using existing methods recommended by the USEPA. The two data sets represent low and moderate levels of censorship in the data. Although an unbiased estimator is elusive, we show that the proposed Bayesian estimation method will have a smaller bias than the EPA recommended method.
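The core idea of treating background values as one component of a mixture of normals can be sketched with a maximum-likelihood EM fit. This is a stand-in for the authors' Bayesian estimation, with synthetic data and invented parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic log-concentrations: mostly background, some contaminated.
data = np.concatenate([rng.normal(1.0, 0.3, 300),    # background component
                       rng.normal(2.5, 0.4, 60)])    # contamination

def em_two_normals(x, n_iter=200):
    """Plain EM for a two-component normal mixture (not the Bayesian
    procedure of the paper, which would place priors on these parameters)."""
    mu = np.array([x.min(), x.max()])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        pdf = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(1, keepdims=True)
        # M-step: weighted parameter updates
        n = r.sum(0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(0) / n
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(0) / n)
    return w, mu, sd

w, mu, sd = em_two_normals(data)
bg = int(np.argmin(mu))      # background = lower-mean component
```

The lower-mean component recovers the background distribution even though contaminated values are mixed into the sample, which is the practical point of the mixture formulation.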

  12. Applied Ecosystem Analysis - Background : EDT the Ecosystem Diagnosis and Treatment Method.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mobrand, Lars E.

    1996-05-01

    This volume consists of eight separate reports. We present them as background to the Ecosystem Diagnosis and Treatment (EDT) methodology. They are a selection from publications, white papers, and presentations prepared over the past two years. Some of the papers are previously published; others are currently being prepared for publication. In the early to mid-1980s the concern for failure of both natural and hatchery production of Columbia River salmon populations was widespread. The concept of supplementation was proposed as an alternative solution that would integrate artificial propagation with natural production. In response to the growing expectations placed upon the supplementation tool, a project called the Regional Assessment of Supplementation Project (RASP) was initiated in 1990. The charge of RASP was to define supplementation and to develop guidelines for when, where and how it would be the appropriate solution to salmon enhancement in the Columbia basin. The RASP developed a definition of supplementation and a set of guidelines for planning salmon enhancement efforts which required consideration of all factors affecting salmon populations, including environmental, genetic, and ecological variables. The results of RASP led to a conclusion that salmon issues needed to be addressed in a manner that was consistent with an ecosystem approach. If the limitations and potentials of supplementation or any other management tool were to be fully understood it would have to be within the context of a broadly integrated approach - thus the Ecosystem Diagnosis and Treatment (EDT) method was born.

  13. Methods in field chronobiology.

    PubMed

    Dominoni, Davide M; Åkesson, Susanne; Klaassen, Raymond; Spoelstra, Kamiel; Bulla, Martin

    2017-11-19

    Chronobiological research has seen a continuous development of novel approaches and techniques to measure rhythmicity at different levels of biological organization, from locomotor activity (e.g. migratory restlessness) to physiology (e.g. temperature and hormone rhythms, and relatively recently also genes, proteins and metabolites). However, the methodological advancements in this field have been mostly, and sometimes exclusively, used in indoor laboratory settings. In parallel, there has been an unprecedented and rapid improvement in our ability to track animals and their behaviour in the wild. However, while the spatial analysis of tracking data is widespread, its temporal aspect is largely unexplored. Here, we review the tools that are available or have potential to record rhythms in wild animals, with emphasis on currently overlooked approaches and monitoring systems. We then demonstrate, in three question-driven case studies, how the integration of traditional and newer approaches can help answer novel chronobiological questions in free-living animals. Finally, we highlight unresolved issues in field chronobiology that may benefit from technological development in the future. As most of the studies in the field are descriptive, the future challenge lies in applying the diverse technologies to experimental set-ups in the wild. This article is part of the themed issue 'Wild clocks: integrating chronobiology and ecology to understand timekeeping in free-living animals'. © 2017 The Author(s).

  14. Sensitivity curves for searches for gravitational-wave backgrounds

    NASA Astrophysics Data System (ADS)

    Thrane, Eric; Romano, Joseph D.

    2013-12-01

    We propose a graphical representation of detector sensitivity curves for stochastic gravitational-wave backgrounds that takes into account the increase in sensitivity that comes from integrating over frequency in addition to integrating over time. This method is valid for backgrounds that have a power-law spectrum in the analysis band. We call these graphs “power-law integrated curves.” For simplicity, we consider cross-correlation searches for unpolarized and isotropic stochastic backgrounds using two or more detectors. We apply our method to construct power-law integrated sensitivity curves for second-generation ground-based detectors such as Advanced LIGO, space-based detectors such as LISA and the Big Bang Observer, and timing residuals from a pulsar timing array. The code used to produce these plots is available at https://dcc.ligo.org/LIGO-P1300115/public for researchers interested in constructing similar sensitivity curves.
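The construction can be sketched for a toy noise spectrum: for each power-law index β, solve for the amplitude at a reference frequency that gives the target SNR, then take the pointwise maximum (envelope) over all indices. All numbers below are invented; real curves use the detector noise PSDs and the overlap reduction function:

```python
import numpy as np

# Toy effective noise spectrum Omega_eff(f) for a cross-correlation pair
# (made-up shape and scale, chosen only to have a sensitivity "bucket").
f = np.logspace(1.0, 3.0, 400)                       # Hz
f_ref = 100.0
omega_eff = 1e-9 * ((f / f_ref) ** -4 + 1.0 + (f / f_ref) ** 3)

T = 365.25 * 86400.0                                 # one year, seconds
rho = 1.0                                            # target SNR
betas = np.arange(-8.0, 8.5, 0.5)                    # power-law indices

def trapezoid(y, x):
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

# For each beta, the amplitude giving SNR = rho under the broadband
# cross-correlation statistic SNR^2 = 2*T * integral (Omega/Omega_eff)^2 df.
curves = []
for beta in betas:
    integral = trapezoid(((f / f_ref) ** beta / omega_eff) ** 2, f)
    amp = rho / np.sqrt(2.0 * T * integral)
    curves.append(amp * (f / f_ref) ** beta)
curves = np.array(curves)
pi_curve = curves.max(axis=0)                        # the PI envelope
```

By construction every power-law background lying on the envelope is detectable with SNR ≥ rho, which is what makes the graph more informative than a single-frequency sensitivity curve.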

  15. Step-scan differential Fourier transform infrared photoacoustic spectroscopy (DFTIR-PAS): a spectral deconvolution method for weak absorber detection in the presence of strongly overlapping background absorptions.

    PubMed

    Liu, Lixian; Mandelis, Andreas; Huan, Huiting; Michaelian, Kirk H

    2017-04-01

    The determination of small absorption coefficients of trace gases in the atmosphere constitutes a challenge for analytical air contaminant measurements, especially in the presence of strongly absorbing backgrounds. A step-scan differential Fourier transform infrared photoacoustic spectroscopy (DFTIR-PAS) method was developed to suppress the coherent external noise and spurious photoacoustic (PA) signals caused by strongly absorbing backgrounds. The infrared absorption spectra of acetylene (C2H2) and local air were used to verify the performance of the step-scan DFTIR-PAS method. A linear amplitude response to C2H2 concentrations from 100 to 5000 ppmv was observed, leading to a theoretical detection limit of 5 ppmv. The differential mode was capable of eliminating the coherent noise and dominant background gas signals, thereby revealing the presence of the otherwise hidden C2H2 weak absorption. Thus, the step-scan DFTIR-PAS modality was demonstrated to be an effective approach for monitoring weakly absorbing gases with absorption bands overlapped by strongly absorbing background species.

  16. Background noise exerts diverse effects on the cortical encoding of foreground sounds.

    PubMed

    Malone, B J; Heiser, Marc A; Beitel, Ralph E; Schreiner, Christoph E

    2017-08-01

    In natural listening conditions, many sounds must be detected and identified in the context of competing sound sources, which function as background noise. Traditionally, noise is thought to degrade the cortical representation of sounds by suppressing responses and increasing response variability. However, recent studies of neural network models and brain slices have shown that background synaptic noise can improve the detection of signals. Because acoustic noise affects the synaptic background activity of cortical networks, it may improve the cortical responses to signals. We used spike train decoding techniques to determine the functional effects of a continuous white noise background on the responses of clusters of neurons in auditory cortex to foreground signals, specifically frequency-modulated sweeps (FMs) of different velocities, directions, and amplitudes. Whereas the addition of noise progressively suppressed the FM responses of some cortical sites in the core fields with decreasing signal-to-noise ratios (SNRs), the stimulus representation remained robust or was even significantly enhanced at specific SNRs in many others. Even though the background noise level was typically not explicitly encoded in cortical responses, significant information about noise context could be decoded from cortical responses on the basis of how the neural representation of the foreground sweeps was affected. These findings demonstrate significant diversity in signal-in-noise processing even within the core auditory fields that could support noise-robust hearing across a wide range of listening conditions. NEW & NOTEWORTHY The ability to detect and discriminate sounds in background noise is critical for our ability to communicate. The neural basis of robust perceptual performance in noise is not well understood. We identified neuronal populations in core auditory cortex of squirrel monkeys that differ in how they process foreground signals in background noise and that may

  17. Perspectives on the simulation of protein–surface interactions using empirical force field methods

    PubMed Central

    Latour, Robert A.

    2014-01-01

    Protein–surface interactions are of fundamental importance for a broad range of applications in the fields of biomaterials and biotechnology. Present experimental methods are limited in their ability to provide a comprehensive depiction of these interactions at the atomistic level. In contrast, empirical force field based simulation methods inherently provide the ability to predict and visualize protein–surface interactions with full atomistic detail. These methods, however, must be carefully developed, validated, and properly applied before confidence can be placed in results from the simulations. In this perspectives paper, I provide an overview of the critical aspects that I consider being of greatest importance for the development of these methods, with a focus on the research that my combined experimental and molecular simulation groups have conducted over the past decade to address these issues. These critical issues include the tuning of interfacial force field parameters to accurately represent the thermodynamics of interfacial behavior, adequate sampling of these types of complex molecular systems to generate results that can be comparable with experimental data, and the generation of experimental data that can be used for simulation results evaluation and validation. PMID:25028242

  18. Performance of FFT methods in local gravity field modelling

    NASA Technical Reports Server (NTRS)

    Forsberg, Rene; Solheim, Dag

    1989-01-01

    Fast Fourier transform (FFT) methods provide a fast and efficient means of processing large amounts of gravity or geoid data in local gravity field modelling. The FFT methods, however, have a number of theoretical and practical limitations, especially the use of the flat-earth approximation and the requirement for gridded data. In spite of this, the methods often yield excellent results in practice when compared to more rigorous (and computationally expensive) methods, such as least-squares collocation. The good performance of the FFT methods illustrates that the theoretical approximations are offset by the capability of taking into account more data over larger areas, which is especially important for geoid predictions. For best results good data gridding algorithms are essential; in practice truncated collocation approaches may be used. For large areas at high latitudes the gridding must be done using suitable map projections such as UTM, to avoid trivial errors caused by the meridian convergence. The FFT methods are compared to ground truth data in New Mexico (xi, eta from delta g), Scandinavia (N from delta g; the geoid fits to 15 cm over 2000 km), and areas of the Atlantic (delta g from satellite altimetry using Wiener filtering). In all cases the FFT methods yield results comparable or superior to those of other methods.
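    The computational core of such FFT methods is that a convolution-type integral over a gridded data set becomes a cheap pointwise product in the frequency domain. The sketch below illustrates this under a flat-earth assumption with a synthetic grid and a generic normalized kernel (both hypothetical stand-ins; operational methods use Stokes-type kernels and real gravity data):

```python
import numpy as np

def fft_convolve2d(grid, kernel):
    """Circular 2-D convolution via FFT: O(N log N) rather than O(N^2),
    which is the speed-up exploited in gridded gravity-field modelling."""
    return np.real(np.fft.ifft2(np.fft.fft2(grid) * np.fft.fft2(kernel)))

# Hypothetical example: smooth a synthetic gridded gravity-anomaly field
# with a normalized 3x3 averaging kernel (a stand-in for a Stokes kernel).
rng = np.random.default_rng(0)
dg = rng.normal(size=(64, 64))      # synthetic anomalies on a 64x64 grid
kernel = np.zeros((64, 64))
kernel[:3, :3] = 1.0 / 9.0          # kernel sums to 1, so the mean is kept
smoothed = fft_convolve2d(dg, kernel)
```

    Because the FFT implies periodic boundaries, operational implementations zero-pad the grid to suppress wrap-around, one of the practical limitations alluded to above.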

  19. Reconstruction and separation of vibratory field using structural holography

    NASA Astrophysics Data System (ADS)

    Chesnais, C.; Totaro, N.; Thomas, J.-H.; Guyader, J.-L.

    2017-02-01

    A method for reconstructing and separating the vibratory field on a plate-like structure is presented. The method, called "Structural Holography", is derived from classical Near-field Acoustic Holography (NAH) but operates in the vibratory domain. In this case, the plate displacement is measured on one-dimensional lines (the holograms) and used to reconstruct the entire two-dimensional displacement field. As a consequence, remote measurements of zones that are not directly accessible are possible with Structural Holography. Moreover, because it is based on the decomposition of the field into forward- and backward-travelling waves, Structural Holography makes it possible to separate forces in the case of multi-source excitation. The theoretical background of the Structural Holography method is described first. Then, to illustrate the process and the possibilities of Structural Holography, the academic test case of an infinite plate excited by a few point forces is presented. With the principle of vibratory field separation, the displacement fields produced by each point force separately are reconstructed. However, the displacement field is not always meaningful by itself, and additional processing is sometimes needed, for example to localize the positions of the point forces. Starting from the simple example of an infinite plate, a post-processing step based on the reconstruction of the structural intensity field is thus proposed. Finally, Structural Holography is generalized to finite plates and applied to real experimental measurements.

  20. Can one measure the Cosmic Neutrino Background?

    NASA Astrophysics Data System (ADS)

    Faessler, Amand; Hodák, Rastislav; Kovalenko, Sergey; Šimkovic, Fedor

    The Cosmic Microwave Background (CMB) yields information about our Universe at around 380,000 years after the Big Bang (BB). Due to the weak interaction of the neutrinos with matter, the Cosmic Neutrino Background (CNB) should give information about a much earlier time of our Universe, around one second after the BB. Probably the most promising method to "see" the CNB is the capture of the electron neutrinos from the background by tritium, which then decays into 3He and an electron with an energy equal to the Q value of 18.562 keV plus the electron neutrino rest mass. The "KArlsruhe TRItium Neutrino" (KATRIN) experiment, which is in preparation, presently seems to be the most sensitive proposed method for measuring the electron antineutrino mass. At the same time, KATRIN can also look for the reaction νe(~1.95 K) + 3H → 3He + e-, in which the electron carries the Q value of 18.6 keV plus the electron neutrino rest mass energy. The capture of the cosmic background neutrinos should show up in the electron spectrum as a peak lying one electron neutrino rest mass above Q. Here, the possibility of seeing the CNB with KATRIN is studied. A detection of the CNB by KATRIN does not seem possible at the moment, but KATRIN should be able to set an upper limit on the local electron neutrino density of the CNB.

  2. A method to describe inelastic gamma field distribution in neutron gamma density logging.

    PubMed

    Zhang, Feng; Zhang, Quanying; Liu, Juntao; Wang, Xinguang; Wu, He; Jia, Wenbao; Ti, Yongzhou; Qiu, Fei; Zhang, Xiaoyang

    2017-11-01

    Pulsed neutron gamma density logging (NGD) is of great significance for radiation protection and density measurement in logging while drilling (LWD); however, current methods have difficulty with quantitative calculation and single-factor analysis of the inelastic gamma field distribution. To clarify the NGD mechanism, a new method is developed to describe the inelastic gamma field distribution. Based on fast-neutron scattering and gamma attenuation, the inelastic gamma field distribution is characterized by the inelastic scattering cross section, the fast-neutron scattering free path, the formation density, and other parameters, and the contribution of each formation parameter to the field distribution is quantitatively analyzed. The results show that the contribution of density attenuation is opposite to that of the inelastic scattering cross section and the fast-neutron scattering free path. As the detector spacing increases, density attenuation gradually plays the dominant role in the gamma field distribution, which means that a large detector spacing is more favorable for the density measurement. In addition, the relationship between density sensitivity and detector spacing was studied on the basis of this gamma field distribution, and the spacings of the near and far gamma-ray detectors were determined accordingly. The research provides theoretical guidance for tool parameter design and density determination in the pulsed neutron gamma density logging technique. Copyright © 2017 Elsevier Ltd. All rights reserved.
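    The competing contributions described above can be made concrete with a deliberately simplified one-dimensional model (an illustrative assumption, not the authors' field equations): fast neutrons attenuate over a scattering free path before producing inelastic gammas, the gammas are then attenuated in proportion to formation density, and the log-count sensitivity to density therefore grows with detector spacing:

```python
import math

def inelastic_gamma_count(r_cm, rho, lam_n=8.0, mu_mass=0.006):
    """Toy model (assumed parameters): neutron attenuation over free path
    lam_n [cm] times gamma attenuation with mass coefficient mu_mass
    [cm^2/g] and density rho [g/cm^3] at source-detector spacing r_cm."""
    return math.exp(-r_cm / lam_n) * math.exp(-mu_mass * rho * r_cm)

def density_sensitivity(r_cm, rho, drho=0.01, **kw):
    """Relative change in count rate per unit density change
    (d ln N / d rho), estimated by a finite difference."""
    n0 = inelastic_gamma_count(r_cm, rho, **kw)
    n1 = inelastic_gamma_count(r_cm, rho + drho, **kw)
    return (math.log(n1) - math.log(n0)) / drho

# In this model the sensitivity magnitude is mu_mass * r, so it grows with
# detector spacing, echoing the paper's conclusion that larger spacing
# favours the density measurement.
```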

  3. A new method of field MRTD test

    NASA Astrophysics Data System (ADS)

    Chen, Zhibin; Song, Yan; Liu, Xianhong; Xiao, Wenjian

    2014-09-01

    MRTD (minimum resolvable temperature difference) is an important indicator of the imaging performance of an infrared camera. In the traditional laboratory test, a blackbody is used as the simulated heat source; it is not only expensive and bulky but also ill-suited to the field-testing requirement of online automatic measurement of camera MRTD. To solve this problem, this paper introduces a new MRTD detection device that uses an LED as the simulated heat source and a plated zinc sulfide glass engraved with a four-bar target as the simulated target. Using a Cassegrain collimation system with good high-temperature adaptability, the target is projected as if at infinite distance, so that it can either be observed by the human eye to complete the subjective test or captured and processed by image-processing software to complete an objective measurement. The method replaces the blackbody with an LED whose color temperature is calibrated by a thermal imager; the relation curve between the LED temperature-controlling current and the simulated blackbody temperature difference is thereby established, accurately achieving temperature control of the infrared target. Experimental results show that the accuracy of the device in field testing of thermal-imager MRTD is within 0.1 K, which greatly reduces cost while meeting project requirements, giving the method wide application value.

  4. A potential method for lift evaluation from velocity field data

    NASA Astrophysics Data System (ADS)

    de Guyon-Crozier, Guillaume; Mulleners, Karen

    2017-11-01

    Computing forces from velocity field measurements is one of the challenges in experimental aerodynamics. This work focuses on low Reynolds flows, where the dynamics of the leading and trailing edge vortices play a major role in lift production. Recent developments in 2D potential flow theory, using discrete vortex models, have shown good results for unsteady wing motions. A method is presented to calculate lift from experimental velocity field data using a discrete vortex potential flow model. The model continuously adds new point vortices at leading and trailing edges whose circulations are set directly from vorticity measurements. Forces are computed using the unsteady Blasius equation and compared with measured loads.
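    A minimal building block of such a discrete vortex model is the velocity induced by a set of point vortices, obtained from the complex potential. The sketch below is illustrative only; in a real application the circulations would be set from measured vorticity as described above:

```python
import numpy as np

def induced_velocity(z_eval, z_vortices, gammas):
    """Velocity (u, v) induced at complex points z_eval by 2-D point
    vortices with circulations gammas at complex positions z_vortices.
    Conjugate complex velocity: w = -i/(2*pi) * sum Gamma_k/(z - z_k)."""
    z_eval = np.atleast_1d(z_eval)
    w = np.zeros_like(z_eval, dtype=complex)
    for zk, gk in zip(z_vortices, gammas):
        w += -1j * gk / (2.0 * np.pi * (z_eval - zk))
    return w.real, -w.imag   # u = Re(w), v = -Im(w)

# Single counterclockwise vortex of circulation 1 at the origin: at z = 1
# the induced velocity is purely tangential with speed 1/(2*pi).
u, v = induced_velocity(1.0 + 0.0j, [0.0 + 0.0j], [1.0])
```

    Summing such contributions over vortices shed from the leading and trailing edges gives the model velocity field that is matched to the measured one before the unsteady Blasius equation is evaluated.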

  5. Hawkes process model with a time-dependent background rate and its application to high-frequency financial data.

    PubMed

    Omi, Takahiro; Hirata, Yoshito; Aihara, Kazuyuki

    2017-07-01

    A Hawkes process model with a time-varying background rate is developed for analyzing the high-frequency financial data. In our model, the logarithm of the background rate is modeled by a linear model with a relatively large number of variable-width basis functions, and the parameters are estimated by a Bayesian method. Our model can capture not only the slow time variation, such as in the intraday seasonality, but also the rapid one, which follows a macroeconomic news announcement. By analyzing the tick data of the Nikkei 225 mini, we find that (i) our model is better fitted to the data than the Hawkes models with a constant background rate or a slowly varying background rate, which have been commonly used in the field of quantitative finance; (ii) the improvement in the goodness-of-fit to the data by our model is significant especially for sessions where considerable fluctuation of the background rate is present; and (iii) our model is statistically consistent with the data. The branching ratio, which quantifies the level of the endogeneity of markets, estimated by our model is 0.41, suggesting the relative importance of exogenous factors in the market dynamics. We also demonstrate that it is critically important to appropriately model the time-dependent background rate for the branching ratio estimation.
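    The model structure described above can be sketched as follows: the conditional intensity is a time-dependent background rate plus an exponentially decaying self-excitation term whose integral is the branching ratio. The numbers used here (branching ratio 0.41 from the abstract; the decay rate and background shape are invented for illustration) are not the fitted model:

```python
import math

def hawkes_intensity(t, events, mu_fn, alpha=0.41, beta=2.0):
    """Conditional intensity of a Hawkes process with time-dependent
    background rate mu_fn(t) and an exponential kernel whose integral
    is alpha (the branching ratio); beta is an assumed decay rate."""
    excite = sum(alpha * beta * math.exp(-beta * (t - ti))
                 for ti in events if ti < t)
    return mu_fn(t) + excite

# Hypothetical background: a slow intraday trend plus a jump following a
# news announcement at t = 5 (units arbitrary).
def mu(t):
    return 1.0 + 0.2 * math.sin(t) + (3.0 if 5.0 <= t <= 5.5 else 0.0)

lam = hawkes_intensity(6.0, events=[4.0, 5.9], mu_fn=mu)
```

    Mis-modelling the background rate pushes its fluctuations into the excitation term, which is why the paper finds the branching-ratio estimate so sensitive to the background model.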

  7. System and method for magnetic current density imaging at ultra low magnetic fields

    DOEpatents

    Espy, Michelle A.; George, John Stevens; Kraus, Robert Henry; Magnelind, Per; Matlashov, Andrei Nikolaevich; Tucker, Don; Turovets, Sergei; Volegov, Petr Lvovich

    2016-02-09

    Preferred systems can include an electrical impedance tomography apparatus electrically connectable to an object; an ultra low field magnetic resonance imaging apparatus including a plurality of field directions and disposable about the object; a controller connected to the ultra low field magnetic resonance imaging apparatus and configured to implement a sequencing of one or more ultra low magnetic fields substantially along one or more of the plurality of field directions; and a display connected to the controller, and wherein the controller is further configured to reconstruct a displayable image of an electrical current density in the object. Preferred methods, apparatuses, and computer program products are also disclosed.

  8. A new cation-exchange method for accurate field speciation of hexavalent chromium

    USGS Publications Warehouse

    Ball, J.W.; McCleskey, R. Blaine

    2003-01-01

    A new method for field speciation of Cr(VI) has been developed to meet present stringent regulatory standards and to overcome the limitations of existing methods. The method consists of passing a water sample through strong acid cation-exchange resin at the field site, where Cr(III) is retained while Cr(VI) passes into the effluent and is preserved for later determination. The method is simple, rapid, portable, and accurate, and makes use of readily available, inexpensive materials. Cr(VI) concentrations are determined later in the laboratory using any elemental analysis instrument sufficiently sensitive to measure the Cr(VI) concentrations of interest. The new method allows measurement of Cr(VI) concentrations as low as 0.05 μg L-1, storage of samples for at least several weeks prior to analysis, and use of readily available analytical instrumentation. Cr(VI) can be separated from Cr(III) between pH 2 and 11 at Cr(III)/Cr(VI) concentration ratios as high as 1000. The new method has demonstrated excellent comparability with two commonly used methods, the Hach Company direct colorimetric method and USEPA method 218.6. The new method is superior to the Hach direct colorimetric method owing to its relative sensitivity and simplicity. The new method is superior to USEPA method 218.6 in the presence of Fe(II) concentrations up to 1 mg L-1 and Fe(III) concentrations up to 10 mg L-1. Time stability of preserved samples is a significant advantage over the 24-h time constraint specified for USEPA method 218.6.

  9. Background Noise Reduction Using Adaptive Noise Cancellation Determined by the Cross-Correlation

    NASA Technical Reports Server (NTRS)

    Spalt, Taylor B.; Brooks, Thomas F.; Fuller, Christopher R.

    2012-01-01

    Background noise due to flow in wind tunnels contaminates desired data by decreasing the signal-to-noise ratio. The use of Adaptive Noise Cancellation to remove background noise at measurement microphones is compromised when the reference sensor measures both background and desired noise. The proposed technique modifies the classical processing configuration based on the cross-correlation between the reference and primary microphones. Background noise attenuation is achieved using a cross-correlation sample width that encompasses only the background noise, together with a matched delay for the adaptive processing. A present limitation of the method is that a minimum time delay between the background noise and the desired signal must exist in order for the correlated parts of the desired signal to be separated from the background noise in the cross-correlation. A simulation yields primary-signal recovery that can be predicted from the coherence of the background noise between the channels. Results are compared with two existing methods.
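    For readers unfamiliar with Adaptive Noise Cancellation, the classical configuration that the technique above modifies can be sketched with a standard LMS filter. This is a generic textbook sketch with synthetic data, not the authors' processing chain:

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """LMS adaptive noise canceller: adapt an FIR filter so the filtered
    reference tracks the noise in the primary channel; the residual e
    is the estimate of the desired signal."""
    w = np.zeros(n_taps)
    out = np.zeros_like(primary)
    for n in range(n_taps - 1, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]   # newest sample first
        e = primary[n] - w @ x
        w += 2.0 * mu * e * x
        out[n] = e
    return out

# Synthetic demo: a tone buried in noise; the reference channel sees the
# raw noise, the primary sees the tone plus FIR-filtered noise.
rng = np.random.default_rng(1)
noise = rng.normal(size=4000)
signal = 0.5 * np.sin(2 * np.pi * 0.05 * np.arange(4000))
primary = signal + np.convolve(noise, [0.8, 0.3], mode="same")
residual = lms_cancel(primary, noise)
```

    In this idealized sketch the reference channel is noise-only; the cross-correlation-based modification in the paper addresses the harder case where the reference also picks up the desired signal.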

  10. ViBe: a universal background subtraction algorithm for video sequences.

    PubMed

    Barnich, Olivier; Van Droogenbroeck, Marc

    2011-06-01

    This paper presents a technique for motion detection that incorporates several innovative mechanisms. For example, our proposed technique stores, for each pixel, a set of values taken in the past at the same location or in the neighborhood. It then compares this set to the current pixel value in order to determine whether that pixel belongs to the background, and adapts the model by choosing randomly which values to substitute from the background model. This approach differs from those based upon the classical belief that the oldest values should be replaced first. Finally, when the pixel is found to be part of the background, its value is propagated into the background model of a neighboring pixel. We describe our method in full detail (including pseudocode and the parameter values used) and compare it to other background subtraction techniques. Efficiency figures show that our method outperforms recent and proven state-of-the-art methods in terms of both computation speed and detection rate. We also analyze the performance of a version of our algorithm downscaled to the absolute minimum of one comparison and one byte of memory per pixel. It appears that even such a simplified version of our algorithm performs better than mainstream techniques.
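    The mechanisms listed above (per-pixel sample sets, random rather than oldest-first replacement, spatial propagation) can be sketched in a few lines for grayscale frames. This is an illustrative simplification with assumed parameter values, not the published algorithm:

```python
import numpy as np

def vibe_step(frame, samples, radius=20, min_matches=2, subsample=16, rng=None):
    """One step of a simplified ViBe-style model. samples has shape
    (N, H, W): N past values kept per pixel. Returns the foreground mask."""
    rng = rng if rng is not None else np.random.default_rng()
    h, w = frame.shape
    dist = np.abs(samples.astype(int) - frame.astype(int))
    background = (dist < radius).sum(axis=0) >= min_matches
    # Random (not oldest-first) replacement of one sample per background
    # pixel, each with probability 1/subsample.
    update = background & (rng.integers(0, subsample, size=(h, w)) == 0)
    ys, xs = np.nonzero(update)
    idx = rng.integers(0, samples.shape[0], size=ys.shape)
    samples[idx, ys, xs] = frame[ys, xs]
    # Spatial propagation: the updated pixels also seed a random neighbour.
    ny = np.clip(ys + rng.integers(-1, 2, size=ys.shape), 0, h - 1)
    nx = np.clip(xs + rng.integers(-1, 2, size=xs.shape), 0, w - 1)
    samples[idx, ny, nx] = frame[ys, xs]
    return ~background

# Demo: against a model seeded with a static scene, only a bright patch
# is classified as foreground.
samples = np.zeros((20, 8, 8), dtype=np.uint8)
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:4, 2:4] = 200
mask = vibe_step(frame, samples, rng=np.random.default_rng(0))
```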

  11. Background rejection in NEXT using deep neural networks

    DOE PAGES

    Renner, J.; Farbin, A.; Vidal, J. Muñoz; ...

    2017-01-16

    Here, we investigate the potential of using deep learning techniques to reject background events in searches for neutrinoless double beta decay with high pressure xenon time projection chambers capable of detailed track reconstruction. The differences in the topological signatures of background and signal events can be learned by deep neural networks via training over many thousands of events. These networks can then be used to classify further events as signal or background, providing an additional background rejection factor at an acceptable loss of efficiency. The networks trained in this study performed better than previous methods developed based on the use of the same topological signatures by a factor of 1.2 to 1.6, and there is potential for further improvement.

  12. Effects of placement point of background music on shopping website.

    PubMed

    Lai, Chien-Jung; Chiang, Chia-Chi

    2012-01-01

    Consumer online behaviors are more important than ever due to the rapid growth of online shopping. The purposes of this study were to design placement methods for background music on a shopping website and to examine their effects on browsers' emotional and cognitive responses. Three placement points for background music during browsing, i.e., 2 min, 4 min, and 6 min from the start of browsing, were considered as entry points. Browsing without music (no music) and browsing with constant music volume (full music) were treated as control groups. Participants' emotional state, approach-avoidance behavior intention, and actions to adjust the music volume were collected. Results showed that participants had higher levels of pleasure, arousal, and approach behavior intention for the three placement points than for no music and full music. Most of the participants in the full-music condition (5/6) adjusted the background music, whereas only 16.7% (3/18) of the participants in the other conditions turned off the background music. The results indicate that starting background music some time after the start of browsing benefits the online shopping atmosphere, whereas placing background music at the very start of browsing is inappropriate. Marketers must therefore manage the placement of background music in a web store carefully.

  13. Determination of antenna factors using a three-antenna method at open-field test site

    NASA Astrophysics Data System (ADS)

    Masuzawa, Hiroshi; Tejima, Teruo; Harima, Katsushige; Morikawa, Takao

    1992-09-01

    Recently, NIST has used the three-antenna method for calibration of the antenna factor of antennas used for EMI measurements. This method does not require the specially designed standard antennas that are necessary in the standard field method or the standard antenna method, and it can be used at an open-field test site. This paper theoretically and experimentally examines the measurement errors of this method and evaluates the precision of the antenna-factor calibration. It is found that the main source of error is the non-ideal propagation characteristics of the test site, which should therefore be measured before the calibration. The precision of the antenna-factor calibration at the test site used in these experiments is estimated to be 0.5 dB.

  14. Mechanics of Ballast Compaction. Volume 2 : Field Methods for Ballast Physical State Measurement

    DOT National Transportation Integrated Search

    1982-03-01

    Field methods for measuring ballast physical state are needed to study the effects of ballast compaction. Following a consideration of various alternatives, three methods were selected for development and evaluation. The first was in-place density, w...

  15. A new method to measure galaxy bias by combining the density and weak lensing fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pujol, Arnau; Chang, Chihway; Gaztañaga, Enrique

    We present a new method to measure redshift-dependent galaxy bias by combining information from the galaxy density field and the weak lensing field. This method is based on the work of Amara et al., who use the galaxy density field to construct a bias-weighted convergence field κg. The main difference between Amara et al.'s work and our new implementation is that here we present another way to measure galaxy bias, using tomography instead of bias parametrizations. The correlation between κg and the true lensing field κ allows us to measure galaxy bias using different zero-lag correlations. Our method measures the linear bias factor on linear scales, under the assumption of no stochasticity between galaxies and matter. We use the Marenostrum Institut de Ciències de l'Espai (MICE) simulation to measure the linear galaxy bias for a flux-limited sample (i < 22.5) in tomographic redshift bins using this method. This article is the first to study the accuracy and systematic uncertainties associated with the implementation of the method, and the regime in which it is consistent with the linear galaxy bias defined by projected two-point correlation functions (2PCF). We find that our method is consistent with a linear bias at the per cent level for scales larger than 30 arcmin, while non-linearities appear at smaller scales. This measurement is a good complement to other measurements of bias, since it does not depend strongly on σ8 as the 2PCF measurements do. We will apply this method to the Dark Energy Survey Science Verification data in a follow-up article.
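    The zero-lag idea can be illustrated with a toy estimator. Here κg is constructed as an exactly bias-scaled copy of a synthetic κ, which is the no-stochasticity assumption stated above taken literally; the actual method builds κg tomographically from the galaxy density field:

```python
import numpy as np

def bias_from_zero_lag(kappa, kappa_g):
    """Zero-lag estimator of the linear bias, assuming kappa_g = b * kappa
    with no stochasticity: b = <kappa * kappa_g> / <kappa * kappa>."""
    return np.mean(kappa * kappa_g) / np.mean(kappa * kappa)

# Synthetic convergence field and a bias-weighted copy with b = 1.3
# (both values are illustrative, not taken from the paper).
rng = np.random.default_rng(2)
kappa = rng.normal(0.0, 0.02, size=100_000)
kappa_g = 1.3 * kappa
b_hat = bias_from_zero_lag(kappa, kappa_g)
```

    With real data, noise and stochasticity enter both fields, which is why the paper must quantify the scales on which this simple ratio still recovers the 2PCF-defined linear bias.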

  16. An improved schlieren method for measurement and automatic reconstruction of the far-field focal spot

    PubMed Central

    Wang, Zhengzhou; Hu, Bingliang; Yin, Qinye

    2017-01-01

    The schlieren method of measuring far-field focal spots offers many advantages at the Shenguang III laser facility, such as low cost and automatic laser-path collimation. However, current methods of far-field focal spot measurement often suffer from low precision and efficiency because the final focal spot is merged manually, reducing the accuracy of reconstruction. In this paper, we introduce an improved schlieren method to construct the high-dynamic-range image of far-field focal spots and improve the reconstruction accuracy and efficiency. First, a detection method based on weak-beam sampling and magnification imaging was designed; images of the main and side lobes of the focused laser irradiance in the far field were obtained using two scientific CCD cameras. Second, using a self-correlation template-matching algorithm, a circle the same size as the schlieren ball was cut out of the main-lobe image, and the relative position of the main-lobe image was varied within a 100×100 pixel region. The position giving the largest correlation coefficient between the side-lobe image and the circle-removed main-lobe image was identified as the best matching point. Finally, the least squares method was used to fit the center of the small side-lobe schlieren ball, with an error of less than 1 pixel. The experimental results show that this method enables accurate, high-dynamic-range measurement of a far-field focal spot and automatic image reconstruction. Because the best matching point is obtained through image processing rather than through traditional reconstruction based on manual splicing, the method improves both the efficiency and the precision of focal-spot reconstruction. PMID:28207758
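    The matching step described above amounts to maximizing a normalized cross-correlation coefficient over candidate offsets. A brute-force sketch with synthetic data follows (the facility's images and the self-correlation refinement are not reproduced):

```python
import numpy as np

def best_match(image, template):
    """Exhaustive normalized cross-correlation: return the (row, col) of
    the window most correlated with the template, plus the coefficient."""
    th, tw = template.shape
    t = template - template.mean()
    best, pos = -2.0, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            win = image[i:i + th, j:j + tw]
            w = win - win.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            if denom == 0.0:
                continue                      # flat window: undefined r
            r = (w * t).sum() / denom
            if r > best:
                best, pos = r, (i, j)
    return pos, best

# Plant the template inside a noisy synthetic image and recover its position.
rng = np.random.default_rng(3)
img = rng.normal(size=(40, 40))
tpl = rng.normal(size=(8, 8))
img[12:20, 5:13] = tpl
pos, r = best_match(img, tpl)
```

    Production code would use an FFT-based correlation for speed; the brute-force loop is kept here to make the coefficient explicit.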

  17. Strong field QED in lepton colliders and electron/laser interactions

    NASA Astrophysics Data System (ADS)

    Hartin, Anthony

    2018-05-01

    The studies of strong field particle physics processes in electron/laser interactions and lepton collider interaction points (IPs) are reviewed. These processes are defined by the high intensity of the electromagnetic fields involved and the need to take them into account as fully as possible. Thus, the main theoretical framework considered is the Furry interaction picture within intense field quantum field theory. In this framework, the influence of a background electromagnetic field in the Lagrangian is calculated nonperturbatively, involving exact solutions for quantized charged particles in the background field. These “dressed” particles go on to interact perturbatively with other particles, enabling the background field to play both macroscopic and microscopic roles. Macroscopically, the background field starts to polarize the vacuum, in effect rendering it a dispersive medium. Particles encountering this dispersive vacuum obtain a lifetime, either radiating or decaying into pair particles at a rate dependent on the intensity of the background field. In fact, the intensity of the background field enters into the coupling constant of the strong field quantum electrodynamic Lagrangian, influencing all particle processes. A number of new phenomena occur. Particles gain an intensity-dependent rest mass shift that accounts for their presence in the dispersive vacuum. Multi-photon events involving more than one external field photon occur at each vertex. Higher order processes which exchange a virtual strong field particle resonate via the lifetimes of the unstable strong field states. Two main arenas of strong field physics are reviewed; those occurring in relativistic electron interactions with intense laser beams, and those occurring in the beam-beam physics at the interaction point of colliders. This review outlines the theory, describes its significant novel phenomenology and details the experimental schema required to detect strong field effects and the

  18. Helical magnetic fields in molecular clouds?. A new method to determine the line-of-sight magnetic field structure in molecular clouds

    NASA Astrophysics Data System (ADS)

    Tahani, M.; Plume, R.; Brown, J. C.; Kainulainen, J.

    2018-06-01

    Context. Magnetic fields pervade the interstellar medium (ISM) and are believed to be important in the process of star formation, yet probing magnetic fields in star formation regions is challenging. Aims: We propose a new method that uses Faraday rotation measurements in small-scale star forming regions to find the direction and magnitude of the component of the magnetic field along the line of sight. We test the proposed method in four relatively nearby regions: Orion A, Orion B, Perseus, and California. Methods: We use rotation measure data from the literature. We adopt a simple approach based on relative measurements to estimate the rotation measure due to the molecular clouds over the Galactic contribution. We then use a chemical evolution code along with extinction maps of each cloud to find the electron column density of the molecular cloud at the position of each rotation measure data point. Combining the rotation measures produced by the molecular clouds and the electron column density, we calculate the line-of-sight magnetic field strength and direction. Results: In California and Orion A, we find clear evidence that the magnetic fields on one side of these filamentary structures point towards us and on the other side point away from us. Even though the magnetic fields in Perseus might seem to suggest the same behavior, not enough data points are available to draw such conclusions; in Orion B, likewise, there are not enough data points available to detect such behavior. This magnetic field reversal is consistent with a helical magnetic field morphology. In the vicinity of available Zeeman measurements in OMC-1, OMC-B, and the dark cloud Barnard 1, we find magnetic field values of −23 ± 38 μG, −129 ± 28 μG, and 32 ± 101 μG, respectively, which are in agreement with the Zeeman measurements. Tables 1 to 7 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http

  19. DEMONSTRATION BULLETIN: FIELD ANALYTICAL SCREENING PROGRAM: PCB METHOD - U.S. ENVIRONMENTAL PROTECTION AGENCY

    EPA Science Inventory

    The field analytical screening program (FASP) polychlorinated biphenyl (PCB) method uses a temperature-programmable gas chromatograph (GC) equipped with an electron capture detector (ECD) to identify and quantify PCBs. Gas chromatography is an EPA-approved method for determi...

  20. Apparatus and method for materials processing utilizing a rotating magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muralidharan, Govindarajan; Angelini, Joseph A.; Murphy, Bart L.

    An apparatus for materials processing utilizing a rotating magnetic field comprises a platform for supporting a specimen, and a plurality of magnets underlying the platform. The plurality of magnets are configured for rotation about an axis of rotation intersecting the platform. A heat source is disposed above the platform for heating the specimen during the rotation of the plurality of magnets. A method for materials processing utilizing a rotating magnetic field comprises providing a specimen on a platform overlying a plurality of magnets; rotating the plurality of magnets about an axis of rotation intersecting the platform, thereby applying a rotating magnetic field to the specimen; and, while rotating the plurality of magnets, heating the specimen to a desired temperature.

  1. An application of the Braunbeck method to the Maggi-Rubinowicz field representation

    NASA Technical Reports Server (NTRS)

    Meneghini, R.

    1982-01-01

    The Braunbek method is applied to the generalized vector potential associated with the Maggi-Rubinowicz representation. Under certain approximations, an asymptotic evaluation of the vector potential is obtained. For observation points away from caustics or shadow boundaries, the field derived from this quantity is the same as that determined from the geometrical theory of diffraction for a singly diffracted edge ray. An evaluation of the field for the simple case of a plane wave normally incident on a circular aperture is presented, showing that the field predicted by the Maggi-Rubinowicz theory is continuous across the shadow boundary.

  2. An application of the Braunbeck method to the Maggi-Rubinowicz field representation

    NASA Astrophysics Data System (ADS)

    Meneghini, R.

    1982-06-01

    The Braunbek method is applied to the generalized vector potential associated with the Maggi-Rubinowicz representation. Under certain approximations, an asymptotic evaluation of the vector potential is obtained. For observation points away from caustics or shadow boundaries, the field derived from this quantity is the same as that determined from the geometrical theory of diffraction for a singly diffracted edge ray. An evaluation of the field for the simple case of a plane wave normally incident on a circular aperture is presented, showing that the field predicted by the Maggi-Rubinowicz theory is continuous across the shadow boundary.

  3. Background characterization of an ultra-low background liquid scintillation counter

    DOE PAGES

    Erchinger, J. L.; Orrell, John L.; Aalseth, C. E.; ...

    2017-01-26

    The Ultra-Low Background Liquid Scintillation Counter (ULB LSC) developed by Pacific Northwest National Laboratory will expand the application of liquid scintillation counting by enabling lower detection limits and smaller sample volumes. By reducing the overall count rate of the background environment approximately 2 orders of magnitude below that of commercially available systems, backgrounds on the order of tens of counts per day over an energy range of ~3–3600 keV can be realized. Initial test results of the ULB LSC show promise for ultra-low background detection with liquid scintillation counting.

  4. Identification of rice field using Multi-Temporal NDVI and PCA method on Landsat 8 (Case Study: Demak, Central Java)

    NASA Astrophysics Data System (ADS)

    Sukmono, Abdi; Ardiansyah

    2017-01-01

    Paddy is one of the most important agricultural crops in Indonesia. Indonesia's rice consumption per capita in 2013 amounted to 78.82 kg/capita/year. In 2017, the Indonesian government set itself the mission of making Indonesia self-sufficient in food. The government must therefore be able to ensure the stable fulfillment of basic food needs, which requires accurate rice field mapping. Such mapping can use a quick and easy method such as remote sensing. In this study, multi-temporal Landsat 8 imagery is used to identify rice fields based on rice planting time, combined with other methods to extract information from the imagery: the Normalized Difference Vegetation Index (NDVI), Principal Component Analysis (PCA), and band combination. Image classification uses nine classes: water, settlements, mangrove, gardens, fields, and rice fields 1st through 4th. The results showed that the rice field area obtained from the PCA method was 50,009 ha, from the band combination 51,016 ha, and from the NDVI method 45,893 ha. The accuracy levels obtained were: PCA method (84.848%), band combination (81.818%), and NDVI method (75.758%).
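    As a minimal illustration of the NDVI part of the study, the index can be computed per pixel from the red and near-infrared bands (Landsat 8 bands 4 and 5). The reflectance values and the vegetation threshold below are made-up assumptions, not the study's data or classification rule.

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - red) / (NIR + red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-12)  # guard against divide-by-zero

# Tiny synthetic 2x2 reflectance arrays (illustrative values only).
nir = np.array([[0.40, 0.05], [0.35, 0.50]])
red = np.array([[0.10, 0.04], [0.30, 0.10]])

v = ndvi(nir, red)
vegetation_mask = v > 0.3   # hypothetical vegetation threshold
print(v.round(2))
```

    In the study itself the NDVI layers from several acquisition dates were combined with PCA and band-combination results before classification; the threshold step here stands in for that more elaborate workflow.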

  5. Background Oriented Schlieren Implementation in a Jet-Surface Interaction Test

    NASA Technical Reports Server (NTRS)

    Clem, Michelle M.; Brown, Clifford A.; Fagan, Amy

    2013-01-01

    Many current and future aircraft designs rely on the wing or other aircraft surfaces to shield the engine noise from observers on the ground. However, the available data regarding how a planar surface interacts with a jet to shield and/or enhance the jet noise are currently limited. Therefore, the Jet-Surface Interaction Tests supported by NASA's Fundamental Aeronautics Program's Fixed Wing Project were undertaken to supply experimental data covering a wide range of surface geometries and positions interacting with high-speed jet flows in order to support the development of noise prediction methods. Phase 1 of the Test was conducted in the Aero-Acoustic Propulsion Laboratory at NASA Glenn Research Center and consisted of validating noise prediction schemes for a round nozzle interacting with a planar surface. Phased array data and far-field acoustic data were collected for both the shielded and reflected sides of the surface. Phase 1 results showed that the broadband shock noise was greatly reduced by the surface when the jet was operated at the over-expanded condition; however, it was unclear whether this reduction was due to a change in the shock cell structure caused by the surface. In the present study, Background Oriented Schlieren is implemented in Phase 2 of the Jet-Surface Interaction Tests to investigate whether the planar surface interacts with the high-speed jet flow to change the shock cell structure. Background Oriented Schlieren data are acquired for under-expanded, ideally-expanded, and over-expanded flow regimes for multiple axial and radial positions of the surface at three different plate lengths. These data are analyzed with far-field noise measurements to relate the shock cell structure to the broadband shock noise produced by a jet near a surface.
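    The BOS principle itself (not the NASA test setup) can be illustrated with a toy 1-D example: density gradients shift the apparent position of a speckled background, and cross-correlating a reference image with a distorted image recovers that shift. The speckle pattern and the 3-pixel shift below are synthetic assumptions.

```python
import numpy as np

# Toy 1-D background oriented schlieren (BOS) demonstration:
# recover a known apparent displacement by cross-correlation.
rng = np.random.default_rng(1)
reference = rng.random(256)          # speckled background, no flow
distorted = np.roll(reference, 3)    # "flow" displaces the pattern 3 px

# Full cross-correlation; the lag of the peak is the displacement.
corr = np.correlate(distorted, reference, mode="full")
shift = corr.argmax() - (len(reference) - 1)
print(shift)  # -> 3
```

    In a real BOS setup this correlation is done on small interrogation windows of a 2-D image, giving a displacement field that is proportional to the integrated density gradient along each ray.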

  6. Bayesian Methods for Effective Field Theories

    NASA Astrophysics Data System (ADS)

    Wesolowski, Sarah

    Microscopic predictions of the properties of atomic nuclei have reached a high level of precision in the past decade. This progress mandates improved uncertainty quantification (UQ) for a robust comparison of experiment with theory. With the uncertainty from many-body methods under control, calculations are now sensitive to the input inter-nucleon interactions. These interactions include parameters that must be fit to experiment, inducing uncertainty both from the fit and from missing physics in the operator structure of the Hamiltonian. Furthermore, the implementation of the inter-nucleon interactions is not unique, which presents the additional problem of assessing results obtained using different interactions. Effective field theories (EFTs) take advantage of a separation of high- and low-energy scales in the problem to form a power-counting scheme that allows the organization of terms in the Hamiltonian based on their expected contribution to observable predictions. This scheme gives a natural framework for quantification of uncertainty due to missing physics. The free parameters of the EFT, called the low-energy constants (LECs), must be fit to data, but in a properly constructed EFT these constants will be natural-sized, i.e., of order unity. The constraints provided by the EFT, namely the size of the systematic uncertainty from truncation of the theory and the natural size of the LECs, are assumed information even before a calculation is performed or a fit is done. Bayesian statistical methods provide a framework for treating uncertainties that naturally incorporates prior information as well as putting stochastic and systematic uncertainties on an equal footing. For EFT UQ, Bayesian methods allow the relevant EFT properties to be incorporated quantitatively as prior probability distribution functions (pdfs). Following the logic of probability theory, observable quantities and underlying physical parameters such as the EFT breakdown scale may be expressed as pdfs that
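    A toy sketch of the naturalness-prior idea: treat a single LEC with a Gaussian prior of order unity and combine it with a Gaussian likelihood from a hypothetical fit. The conjugate-Gaussian update is standard textbook Bayes; the specific numbers are illustrative assumptions, not results from the thesis.

```python
import math

def posterior_mean_sd(prior_mean, prior_sd, like_mean, like_sd):
    """Conjugate Gaussian update: prior x likelihood -> posterior."""
    w_prior = 1.0 / prior_sd**2          # precision of the prior
    w_like = 1.0 / like_sd**2            # precision of the likelihood
    mean = (w_prior * prior_mean + w_like * like_mean) / (w_prior + w_like)
    sd = math.sqrt(1.0 / (w_prior + w_like))
    return mean, sd

# Naturalness: the LEC is expected to be O(1) -> prior N(0, 1).
# Pretend a fit to data alone gives LEC = 1.8 +/- 0.5.
mean, sd = posterior_mean_sd(0.0, 1.0, 1.8, 0.5)
print(round(mean, 2), round(sd, 3))  # prior pulls the estimate toward O(1)
```

    The same logic, with pdfs for the truncation error and the breakdown scale, is what lets EFT expectations enter the analysis quantitatively rather than as an afterthought.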

  7. Determination of space shuttle flow field by the three-dimensional method of characteristics

    NASA Technical Reports Server (NTRS)

    Chu, C.; Powers, S. A.

    1972-01-01

    The newly improved three-dimensional method of characteristics program has been applied successfully to the calculation of flow fields over a variety of bodies including slab delta wings and shuttle orbiters. Flow fields over fuselage shapes for Mach numbers as low as 1.5 have been calculated. Some typical results are presented.

  8. Influence of background size, luminance and eccentricity on different adaptation mechanisms

    PubMed Central

    Gloriani, Alejandro H.; Matesanz, Beatriz M.; Barrionuevo, Pablo A.; Arranz, Isabel; Issolio, Luis; Mar, Santiago; Aparicio, Juan A.

    2016-01-01

    Mechanisms of light adaptation have been traditionally explained with reference to psychophysical experimentation. However, the neural substrata involved in those mechanisms remain to be elucidated. Our study analyzed links between psychophysical measurements and retinal physiological evidence with consideration for the phenomena of rod-cone interactions, photon noise, and spatial summation. Threshold test luminances were obtained with steady background fields at mesopic and photopic light levels (i.e., 0.06–110 cd/m2) for retinal eccentricities from 0° to 15° using three combinations of background/test field sizes (i.e., 10°/2°, 10°/0.45°, and 1°/0.45°). A two-channel Maxwellian view optical system was employed to eliminate pupil effects on the measured thresholds. A model based on visual mechanisms that were described in the literature was optimized to fit the measured luminance thresholds in all experimental conditions. Our results can be described by a combination of visual mechanisms. We determined how spatial summation changed with eccentricity and how subtractive adaptation changed with eccentricity and background field size. According to our model, photon noise plays a significant role in explaining contrast detection thresholds measured with the 1°/0.45° background/test size combination at mesopic luminances and at off-axis eccentricities. In these conditions, our data reflect the presence of rod-cone interaction for eccentricities between 6° and 9° and luminances between 0.6 and 5 cd/m2. In spite of the increasing noise effects with eccentricity, results also show that the visual system tends to maintain a constant signal-to-noise ratio in the off-axis detection task over the whole mesopic range. PMID:27210038

  9. Influence of background size, luminance and eccentricity on different adaptation mechanisms.

    PubMed

    Gloriani, Alejandro H; Matesanz, Beatriz M; Barrionuevo, Pablo A; Arranz, Isabel; Issolio, Luis; Mar, Santiago; Aparicio, Juan A

    2016-08-01

    Mechanisms of light adaptation have been traditionally explained with reference to psychophysical experimentation. However, the neural substrata involved in those mechanisms remain to be elucidated. Our study analyzed links between psychophysical measurements and retinal physiological evidence with consideration for the phenomena of rod-cone interactions, photon noise, and spatial summation. Threshold test luminances were obtained with steady background fields at mesopic and photopic light levels (i.e., 0.06–110 cd/m2) for retinal eccentricities from 0° to 15° using three combinations of background/test field sizes (i.e., 10°/2°, 10°/0.45°, and 1°/0.45°). A two-channel Maxwellian view optical system was employed to eliminate pupil effects on the measured thresholds. A model based on visual mechanisms that were described in the literature was optimized to fit the measured luminance thresholds in all experimental conditions. Our results can be described by a combination of visual mechanisms. We determined how spatial summation changed with eccentricity and how subtractive adaptation changed with eccentricity and background field size. According to our model, photon noise plays a significant role in explaining contrast detection thresholds measured with the 1°/0.45° background/test size combination at mesopic luminances and at off-axis eccentricities. In these conditions, our data reflect the presence of rod-cone interaction for eccentricities between 6° and 9° and luminances between 0.6 and 5 cd/m2. In spite of the increasing noise effects with eccentricity, results also show that the visual system tends to maintain a constant signal-to-noise ratio in the off-axis detection task over the whole mesopic range. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Novel method for detecting weak magnetic fields at low frequencies

    NASA Astrophysics Data System (ADS)

    González-Martínez, S.; Castillo-Torres, J.; Mendoza-Santos, J. C.; Zamorano-Ulloa, R.

    2005-06-01

    A low-level-intensity magnetic field detection system has been designed and developed based on an amplification-selection process of signals. This configuration is also very sensitive to magnetic field changes produced by harmonic-like electrical currents transported in finite-length wires. Experimental and theoretical results for detection of magnetic fields as low as 10⁻⁹ T at 120 Hz are also presented, with an accuracy of around 13%. The assembled equipment is designed to measure the electromotive force induced in a free-magnetic-core coil in order to recover signals which are previously selected, despite the fact that their intensities are much lower than the ambient electromagnetic radiation. The prototype has a signal-to-noise ratio of 60 dB. The system also has the advantage of being usable as a portable measurement unit. The concept and prototype may be applied, for example, as a nondestructive method to analyze corrosion formation in metallic oil pipelines that are subjected to cathodic protection.

  11. Nonlinear Dynamics of the Cosmic Neutrino Background

    NASA Astrophysics Data System (ADS)

    Inman, Derek

    At least two of the three neutrino species are known to be massive, but their exact masses are currently unknown. Cosmic neutrinos decoupled from the rest of the primordial plasma early on, when the Universe was over a billion times hotter than it is today. These relic particles, which have cooled and are now non-relativistic, constitute the Cosmic Neutrino Background and permeate the Universe. While they are not observable directly, their presence can be inferred by measuring the suppression of the matter power spectrum. This suppression is a linear effect caused by the large thermal velocities of neutrinos, which prevent them from collapsing gravitationally on small scales. Unfortunately, it is difficult to measure because of degeneracies with other cosmological parameters and biases arising from the fact that we typically observe point-like galaxies rather than a continuous matter field. It is therefore important to look for new effects beyond linear suppression that may be more sensitive to neutrinos. This thesis contributes to the understanding of the nonlinear dynamics of the cosmological neutrino background in the following ways: (i) the development of a new injection scheme for neutrinos in cosmological N-body simulations which circumvents many issues associated with simulating neutrinos at large redshifts, (ii) the numerical study of the relative velocity field between cold dark matter and neutrinos, including its reconstruction from density fields, (iii) the theoretical description of neutrinos as a dispersive fluid and its use in modelling the nonlinear evolution of the neutrino density power spectrum, (iv) the derivation of the dipole correlation function using linear response, which allows the Fermi-Dirac velocity distribution to be properly included, and (v) the numerical study and detection of the dipole correlation function in the TianNu simulation. In totality, this thesis is a comprehensive study of neutrino density and velocity fields that may

  12. Work function measurements by the field emission retarding potential method

    NASA Technical Reports Server (NTRS)

    Swanson, L. W.; Strayer, R. W.; Mackie, W. A.

    1971-01-01

    Using the field emission retarding potential method true work functions have been measured for the following monocrystalline substrates: W(110), W(111), W(100), Nb(100), Ni(100), Cu(100), Ir(110) and Ir(111). The electron elastic and inelastic reflection coefficients from several of these surfaces have also been examined near zero primary beam energy.

  13. Seasonal changes in background levels of deuterium and oxygen-18 prove water drinking by harp seals, which affects the use of the doubly labelled water method.

    PubMed

    Nordøy, Erling S; Lager, Anne R; Schots, Pauke C

    2017-12-01

    The aim of this study was to monitor seasonal changes in stable isotopes of pool freshwater and harp seal (Phoca groenlandica) body water, and to study whether these potential seasonal changes might bias results obtained using the doubly labelled water (DLW) method when measuring energy expenditure in animals with access to freshwater. Seasonal changes in the background levels of deuterium and oxygen-18 in the body water of four captive harp seals and in the freshwater pool in which they were kept were measured over a time period of 1 year. The seals were offered daily amounts of capelin and kept under a seasonal photoperiod of 69°N. Large seasonal variations of deuterium and oxygen-18 in the pool water were measured, and the isotope abundance in the body water showed similar seasonal changes to the pool water. This shows that the seals were continuously equilibrating with the surrounding water as a result of significant daily water drinking. Variations in background levels of deuterium and oxygen-18 in freshwater sources may be due to seasonal changes in physical processes such as precipitation and evaporation that cause fractionation of isotopes. Rapid and abrupt changes in the background levels of deuterium and oxygen-18 may complicate calculation of energy expenditure by use of the DLW method. It is therefore strongly recommended that analysis of seasonal changes in background levels of isotopes is performed before the DLW method is applied on (free-ranging) animals, and to use a control group in order to correct for changes in background levels. © 2017. Published by The Company of Biologists Ltd.

  14. Hidden in the background: a local approach to CMB anomalies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sánchez, Juan C. Bueno, E-mail: juan.c.bueno@correounivalle.edu.co

    2016-09-01

    We investigate a framework aiming to provide a common origin for the large-angle anomalies detected in the Cosmic Microwave Background (CMB), which are hypothesized as the result of the statistical inhomogeneity developed by different isocurvature fields of mass m ∼ H present during inflation. The inhomogeneity arises as the combined effect of (i) the initial conditions for isocurvature fields (obtained after a fast-roll stage finishing many e-foldings before cosmological scales exit the horizon), (ii) their inflationary fluctuations and (iii) their coupling to other degrees of freedom. Our case of interest is when these fields (interpreted as the precursors of large-angle anomalies) leave an observable imprint only in isolated patches of the Universe. When the latter intersect the last scattering surface, such imprints arise in the CMB. Nevertheless, due to their statistically inhomogeneous nature, these imprints are difficult to detect, for they become hidden in the background similarly to the Cold Spot. We then compute the probability that a single isocurvature field becomes inhomogeneous at the end of inflation and find that, if the appropriate conditions are given (which depend exclusively on the preexisting fast-roll stage), this probability is at the percent level. Finally, we discuss several mechanisms (including the curvaton and the inhomogeneous reheating) to investigate whether an initial statistically inhomogeneous isocurvature field fluctuation might give rise to some of the observed anomalies. In particular, we focus on the Cold Spot, the power deficit at low multipoles and the breaking of statistical isotropy.

  15. Estimating nitrogen loading and far-field dispersal potential from background sources and coastal finfish aquaculture: A simple framework and case study in Atlantic Canada

    NASA Astrophysics Data System (ADS)

    McIver, R.; Milewski, I.; Loucks, R.; Smith, R.

    2018-05-01

    Far-field nutrient impacts associated with finfish aquaculture have been identified as a topic of concern for regulators, managers, scientists, and the public for over two decades but disentangling aquaculture impacts from those caused by other natural and anthropogenic sources has impeded the development of monitoring metrics and management plans. We apply a bulk, steady-state nitrogen loading model (NLM) framework to estimate the annual input of Total Dissolved Nitrogen (TDN) from point and non-point sources to the watershed surrounding Port Mouton Bay, Nova Scotia (Canada). We then use the results of the NLM together with estimates of dissolved inorganic nitrogen (DIN) loading from a sea-cage trout farm in the Bay and progressive vector diagrams to illustrate potential patterns of DIN dispersal from the trout farm. Our estimated anthropogenic nitrogen contribution to Port Mouton Bay from all terrestrial and atmospheric sources is ∼211,703 kg TDN/year with atmospheric deposition accounting for almost all (98.6%). At a stocking level of ∼400,000 rainbow trout, the Port Mouton Bay sea-cage farm increases the annual anthropogenic TDN loading to the bay by 14.4% or 30,400 kg. Depending on current flow rates, nitrogen flux from the trout farm can be more than double the background concentrations of TDN near the farm site. Although it is unlikely that nitrogen loading from this single fish farm is saturating the DIN requirements of the entire bay, progressive vector diagrams suggest that the dispersal potential may be insufficient to mitigate potential symptoms of eutrophication associated with nitrogen fluxes. We present an accessible and user-friendly tool for managers to estimate baseline nutrient loading in relation to aquaculture and our use of progressive vector diagrams illustrate a practical and simple method for characterizing potential nutrient dispersal based on local conditions and spatial scales. Our study joins numerous studies which have highlighted
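    A quick consistency check of the loading figures quoted in the abstract (numbers taken directly from it):

```python
# Back-of-envelope check of the nitrogen loading figures quoted above.
background_tdn = 211_703      # kg TDN/year, all terrestrial + atmospheric
farm_tdn = 30_400             # kg TDN/year added by the sea-cage trout farm

increase_pct = 100.0 * farm_tdn / background_tdn
print(f"{increase_pct:.1f}%")  # -> 14.4%, matching the abstract
```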

  16. A field investigation on the effects of background erosion on the free span development of a submarine pipeline

    NASA Astrophysics Data System (ADS)

    Wen, Shipeng; Xu, Jishang; Hu, Guanghai; Dong, Ping; Shen, Hong

    2015-08-01

    The safety of submarine pipelines is largely influenced by free spans and corrosion. Previous studies on free spans caused by seabed scour are mainly based on stable environments, where the background seabed scour is in equilibrium and the soil is homogeneous. To study the effects of background erosion on the free span development of subsea pipelines, a submarine pipeline located at the abandoned Yellow River subaqueous delta lobe was investigated with an integrated surveying system which included a multibeam bathymetric system, a dual-frequency side-scan sonar, a high-resolution sub-bottom profiler, and a Magnetic Flux Leakage (MFL) sensor. We found that seabed homogeneity has a great influence on the free span development of the pipeline. More specifically, for homogeneous background scour, the morphology of the scour hole below the pipeline is quite similar to that without background scour, whereas for inhomogeneous background scour, the nature of spanning depends mainly on the evolution of seabed morphology near the pipeline. MFL detection results also reveal a possible connection between long free spans and accelerated corrosion of the pipeline.

  17. Background Oriented Schlieren Using Celestial Objects

    NASA Technical Reports Server (NTRS)

    Haering, Edward A., Jr. (Inventor); Hill, Michael A. (Inventor)

    2017-01-01

    The present invention is a system and method of visualizing fluid flow around an object, such as an aircraft or wind turbine, by aligning the object between an imaging system and a celestial object having a speckled background, taking images, and comparing those images to obtain fluid flow visualization.

  18. Prediction of sound fields in acoustical cavities using the boundary element method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kipp, C. R.; Bernhard, R. J.

    1985-01-01

    A method was developed to predict sound fields in acoustical cavities. The method is based on the indirect boundary element method. An isoparametric quadratic boundary element is incorporated. Pressure, velocity and/or impedance boundary conditions may be applied to a cavity by using this method. The capability to include acoustic point sources within the cavity is implemented. The method is applied to the prediction of sound fields in spherical and rectangular cavities. All three boundary condition types are verified. Cases with a point source within the cavity domain are also studied. Numerically determined cavity pressure distributions and responses are presented. The numerical results correlate well with available analytical results.

  19. A method for approximating acoustic-field-amplitude uncertainty caused by environmental uncertainties.

    PubMed

    James, Kevin R; Dowling, David R

    2008-09-01

    In underwater acoustics, the accuracy of computational field predictions is commonly limited by uncertainty in environmental parameters. An approximate technique for determining the probability density function (PDF) of computed field amplitude, A, from known environmental uncertainties is presented here. The technique can be applied to several, N, uncertain parameters simultaneously, requires N+1 field calculations, and can be used with any acoustic field model. The technique implicitly assumes independent input parameters and is based on finding the optimum spatial shift between field calculations completed at two different values of each uncertain parameter. This shift information is used to convert uncertain-environmental-parameter distributions into PDF(A). The technique's accuracy is good when the shifted fields match well. Its accuracy is evaluated in range-independent underwater sound channels via an L1 error norm defined between approximate and numerically converged results for PDF(A). In 50-m- and 100-m-deep sound channels with 0.5% uncertainty in depth (N=1), at frequencies between 100 and 800 Hz and ranges from 1 to 8 km, 95% of the approximate field-amplitude distributions generated L1 values less than 0.52 using only two field calculations. Obtaining comparable accuracy from traditional methods requires on the order of 10 field calculations, and up to 10^N when N>1.
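    A linearized sketch in the spirit of the technique: with N = 1 uncertain parameter, two field calculations give a local sensitivity that maps the parameter PDF into an approximate PDF(A). The real method instead finds an optimum spatial shift between the two computed fields; the 1/depth amplitude stand-in and all numbers below are illustrative assumptions.

```python
import numpy as np

def field_amplitude(depth):
    # Placeholder for a run of an acoustic propagation model.
    return 1.0 / depth

p0, dp = 50.0, 0.5                       # nominal depth and perturbation
a0 = field_amplitude(p0)                 # field calculation 1
a1 = field_amplitude(p0 + dp)            # field calculation 2 (N + 1 total)
dA_dp = (a1 - a0) / dp                   # finite-difference sensitivity

# Propagate samples of the uncertain parameter into amplitude samples.
rng = np.random.default_rng(0)
p_samples = rng.normal(p0, 0.005 * p0, 100_000)   # 0.5% depth uncertainty
a_samples = a0 + dA_dp * (p_samples - p0)         # approximate PDF(A)
print(round(a_samples.std() / a0, 4))             # relative amplitude spread
```

    The samples `a_samples` can be histogrammed to approximate PDF(A); only two model runs were needed, as in the N+1 bookkeeping described in the abstract.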

  20. FIELD MEASUREMENT OF DISSOLVED OXYGEN: A COMPARISON OF METHODS: JOURNAL ARTICLE

    EPA Science Inventory

    NRMRL-ADA- 00160 Wilkin*, R.T., McNeil*, M.S., Adair*, C.J., and Wilson*, J.T. Field Measurement of Dissolved Oxygen: A Comparison of Methods. Ground Water Monitoring and Remediation (Fall):124-132 (2001). EPA/600/J-01/403. The abili...